Singularitarianism
From Wikipedia, the free encyclopedia
Singularitarianism is a moral philosophy based on the belief that a technological singularity, the technological creation of smarter-than-human intelligence, is possible, and that deliberate action should be taken to bring it about and ensure its safety.
While many futurists and transhumanists speculate on the possibility and nature of this technological development (often referred to as the Singularity), Singularitarians believe it is not only possible but desirable if, and only if, it is guided safely. Accordingly, some dedicate their lives to acting in ways they believe will contribute to its safe implementation.
The term "singularitarian" was originally defined by Extropian Mark Plus in 1991 to mean "one who believes the concept of a Singularity". This term has since been redefined to mean "Singularity activist" or "friend of the Singularity"; that is, one who acts so as to bring about the Singularity.[1]
Ray Kurzweil, the author of the book The Singularity Is Near, defines a Singularitarian as someone "who understands the Singularity and who has reflected on its implications for his or her own life".[2]
Beliefs
In the 1980s and 1990s, before Singularitarianism had been articulated as a coherent ideology, belief in the coming of the Singularity was adopted and expressed by a growing minority of computer scientists and technology journalists:
It feels like something big is about to happen: graphs show us the yearly growth of populations, atmospheric concentrations of carbon dioxide, Net addresses, and Mbytes per dollar. They all soar up to form an asymptote just beyond the turn of the century: The Singularity. The end of everything we know. The beginning of something we may never understand.
—Danny Hillis, The Millennium Clock (1995, Wired magazine)
In his 2000 essay, "Singularitarian Principles", Eliezer Yudkowsky writes that there are four qualities that define a Singularitarian:[3]
- A Singularitarian believes that the Singularity is possible and desirable.
- A Singularitarian actually works to bring about the Singularity.
- A Singularitarian views the Singularity as an entirely secular, non-mystical process — not the culmination of any form of religious prophecy or destiny.
- A Singularitarian believes the Singularity should benefit the entire world, and should not be a means to benefit any specific individual or group.
In June 2000 Eliezer Yudkowsky, Brian Atkins and Sabine Atkins founded the Singularity Institute for Artificial Intelligence to work towards the creation of self-improving Friendly AI. The Singularity Institute's writings argue that an AI able to improve upon its own design (a Seed AI) would rapidly lead to superintelligence. Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk.
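The intuition behind such runaway self-improvement can be sketched with a toy model (an illustration only, not drawn from the Singularity Institute's writings): suppose a system's capability x grows at a rate proportional to the square of its current level, so that each improvement makes the next arrive faster. Solving the resulting differential equation gives

\[
\frac{dx}{dt} = k\,x^{2} \quad\Longrightarrow\quad x(t) = \frac{x_{0}}{1 - k\,x_{0}\,t},
\]

which diverges as t approaches 1/(k x_0). Under this assumption, growth does not merely accelerate but escapes every finite bound in finite time, one way of formalizing the "asymptote" Hillis describes above.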
Many people believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although its exact size is difficult to estimate, Singularitarianism is presently a small movement. Other prominent Singularitarians include Ray Kurzweil and Nick Bostrom.
Criticism
Many critics, often ridiculing the Singularity as "the Rapture for nerds", have dismissed Singularitarianism as a pseudoreligion of fringe science.[4] However, some green anarchist militants have taken Singularitarian rhetoric seriously enough to call for violent direct action to stop the Singularity.[5]
See also
- Extropianism
- Seed AI — a theory closely associated with Singularitarianism
- Simulated reality — the hypothesis that reality could be a technologically based simulation
References
- ^ "Neologisms of Extropy". http://www.extropy.org/neologo.htm#s.
- ^ Kurzweil, Ray (2005). The Singularity Is Near, Chapter One (The Six Epochs).
- ^ Yudkowsky, Eliezer (2000). "Singularitarian Principles".
- ^ Horgan, John (2008). "The Consciousness Conundrum". IEEE Spectrum. http://www.spectrum.ieee.org/jun08/6280. Retrieved 2008-12-17.
- ^ mosh@terran hacker corps (2005). "A Singular Rapture". Green Anarchy. http://www.greenanarchy.org/index.php?action=viewwritingdetail&writingId=182. Retrieved 2008-12-11.
External links
- Why Work Towards the Singularity? by Eliezer Yudkowsky
- Ethical Issues in Advanced Artificial Intelligence by Nick Bostrom