Thursday, October 4, 2007
The Technological Singularity is the hypothesized creation, usually via AI or brain-computer interfaces, of smarter-than-human entities who rapidly accelerate technological progress beyond the capacity of human beings to participate meaningfully in it. Futurists hold varying opinions regarding the timing, consequences, and plausibility of such an event.
I. J. Good first explored the idea of an "intelligence explosion", arguing that machines surpassing human intellect should be capable of recursively augmenting their own mental abilities until they vastly exceed those of their creators. Vernor Vinge later popularized the Singularity in the 1980s with lectures, essays, and science fiction. More recently, some AI researchers have voiced concern over the Singularity's potential dangers.
Some futurists, such as Ray Kurzweil, consider it part of a long-term pattern of accelerating change that generalizes Moore's law to technologies predating the integrated circuit. Critics of this interpretation consider it an example of static analysis.
The Singularity has also been featured prominently in science fiction works by a plethora of authors.
Potential dangers
Some speculate superhuman intelligences may have goals inconsistent with human survival and prosperity. AI researcher Hugo de Garis suggests AIs may simply eliminate the human race, and humans would be powerless to stop them. Other oft-cited dangers include those commonly associated with molecular nanotechnology and genetic engineering. These threats are major issues for both Singularity advocates and critics, and were the subject of a Wired magazine article by Bill Joy, "Why the future doesn't need us" (2000).
In an essay on human extinction scenarios, Oxford philosopher Nick Bostrom (2002) lists superintelligence as a possible cause.
Some AI researchers have made efforts to diminish what they view as potential dangers associated with the Singularity. The Singularity Institute for Artificial Intelligence is a nonprofit research institute for the study and advancement of Friendly Artificial Intelligence, a method proposed by SIAI research fellow Eliezer Yudkowsky for ensuring the stability and safety of AIs that experience Good's "intelligence explosion". AI researcher Bill Hibbard also addresses issues of AI safety and morality in his book Super-Intelligent Machines.
Isaac Asimov's Three Laws of Robotics are one of the earliest examples of proposed safety measures for AI. The laws are intended to prevent artificially intelligent robots from harming humans. In Asimov's stories, any perceived problems with the laws tend to arise as a result of a misunderstanding on the part of some human operator; the robots themselves shut down in the case of a real conflict. On the other hand, in works such as the film I, Robot, which was based very loosely on Asimov's stories, a possibility is explored in which AI take complete control over humanity for the purpose of protecting humanity from itself. In 2004, the Singularity Institute launched an Internet campaign called 3 Laws Unsafe to raise awareness of AI safety issues and the inadequacy of Asimov's laws in particular.
Many Singularitarians consider nanotechnology to be one of the greatest dangers facing humanity. For this reason, they often believe seed AI should precede nanotechnology. Others, such as the Foresight Institute, advocate efforts to create molecular nanotechnology, claiming nanotechnology can be made safe for pre-Singularity use or can expedite the arrival of a beneficial Singularity.