Essay: Potential Dangers of the Singularity and Friendly AI

13/04/2011

Sample Essay

The available literature on Technological Singularity also discusses its potential dangers and the measures that must be taken on the way to creating such an entity in order to prevent them. One of the major threats a superintelligence could pose to humans is extinction. For example, since the Singularity would be self-sustaining and self-evolving, it might come to question the existence of humans, regarding them as inferior organisms that play no part, or an insignificant one, in sustaining the Singularity.

Alternatively, even while working under human direction, the Singularity could mistakenly be given a sub-goal as a super-goal, allowing the intelligence to supersede any safety goals in place and take measures that could lead to the annihilation of mankind. It has therefore been proposed that research be undertaken to produce a Friendly version of AI rather than handing AI control over everything. Proponents of such friendliness emphasize placing strict and strong safeguards within the algorithms of an AI to ensure that the danger it poses to the human race remains low. Such measures are important because an artificial intelligence is not guaranteed to view morality and sensibility as humans do, and an uncontrolled AI may pursue tasks that fulfill its goals yet in fact lead to total annihilation (Kurzweil, 2009).
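The safeguard described above, in which safety goals cannot be overridden even by a mistakenly promoted sub-goal, can be illustrated as a hard-constraint filter applied before any goal optimization. The sketch below is a minimal, hypothetical illustration: all names (`make_agent`, `safety_checks`, the goal functions) are assumptions for this example, not part of any real AI framework.

```python
# Hypothetical sketch: safety goals as hard constraints, not as one more
# goal that a mis-prioritized sub-goal could outweigh.

def make_agent(safety_checks, goals):
    """Return a planner that refuses any action violating a safety check,
    regardless of how highly the (possibly mis-ordered) goals score it."""
    def choose_action(candidate_actions):
        # Filter first: constraints are non-negotiable.
        safe = [a for a in candidate_actions
                if all(ok(a) for ok in safety_checks)]
        if not safe:
            return None  # refuse to act rather than violate a constraint
        # Only then optimize over the goal hierarchy.
        return max(safe, key=lambda a: sum(g(a) for g in goals))
    return choose_action

# Even if a harmful action is mistakenly scored highest by the goals,
# the safety check removes it before optimization ever sees it.
no_harm = lambda a: a != "harm_humans"
goal = lambda a: 10 if a == "harm_humans" else 1  # mis-prioritized goal
agent = make_agent([no_harm], [goal])
print(agent(["harm_humans", "make_paperclips"]))  # -> make_paperclips
```

The key design point is ordering: because the constraint check happens before, and independently of, goal scoring, no rearrangement of the goal hierarchy can route around it.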
