Our Mission

On 29 June, 2010, in Info, by Adam A. Ford

In the coming decades, humanity will likely create a powerful artificial intelligence. The Singularity Institute for Artificial Intelligence (SIAI) exists to confront this urgent challenge, both the opportunity and the risk. Our objectives as an organization are:

  • To ensure the development of friendly Artificial Intelligence, for the benefit of all mankind;
  • To prevent unsafe Artificial Intelligence from causing harm;
  • To encourage rational thought about our future as a species.

To learn more about SIAI’s charitable purpose, begin with What is the Singularity? and Why Work Toward the Singularity?, followed by Artificial Intelligence as a Positive and Negative Factor in Global Risk.

Marquis de Condorcet

“Nature has set no term to the perfection of human faculties; that the perfectibility of man is truly indefinite; and that the progress of this perfectibility, from now onwards independent of any power that might wish to halt it, has no other limit than the duration of the globe upon which nature has cast us. This progress will doubtless vary in speed, but it will never be reversed as long as the earth occupies its present place in the system of the universe, and as long as the general laws of this system produce neither a general cataclysm nor such changes as will deprive the human race of its present faculties and its present resources.” – Marquis de Condorcet | Mathematician, philosopher

Technological Singularity

Technological singularity refers to a prediction in futurology that technological progress will become extremely fast, making the future (after the technological singularity) unpredictable and qualitatively different from today. It is most often associated with the ideas of the futurist Ray Kurzweil.

Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not changed significantly for millennia. However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine that is more intelligent than humanity.

Theoretically, if a machine built by humans could bring to bear greater problem-solving and inventive skills than humans, then it could design a yet more capable machine. That more capable machine could then design a machine of even greater capability. These iterations could accelerate, leading to recursive self-improvement, which I. J. Good described as an “intelligence explosion”. This is quite different from normal technological progress: because the underlying driving force is itself increasing, growth becomes exponential rather than steady. The term Technological Singularity reflects the idea that the change may happen suddenly, and that it is very difficult to predict how such a new world would operate. It is also unclear whether there would be any place for humans in a world containing very intelligent machines.
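The contrast between ordinary progress and an intelligence explosion can be made concrete with a toy numerical sketch. This is purely my own illustration, not a model from the text: it assumes each machine improves its successor in proportion to its own capability (giving exponential growth), versus a fixed human-level improvement each generation (giving only linear growth). The function name and constants are hypothetical.

```python
def intelligence_explosion(c0=1.0, k=0.1, generations=50):
    """Toy illustration (assumed model, not from the source text).

    Recursive case: each machine's improvement scales with its own
    capability, c_next = c * (1 + k), so growth compounds.
    Constant case: the driving force stays at human level, adding a
    fixed increment k * c0 per generation, so growth is linear.
    """
    recursive, constant = [c0], [c0]
    for _ in range(generations):
        recursive.append(recursive[-1] * (1 + k))  # driving force grows with capability
        constant.append(constant[-1] + k * c0)     # driving force fixed at human level
    return recursive, constant

rec, con = intelligence_explosion()
print(f"after 50 generations: recursive={rec[-1]:.1f}, constant={con[-1]:.1f}")
```

Under these assumed numbers, fifty compounding generations end up roughly twenty times further along than fifty fixed-increment ones; the point is only the qualitative gap between compounding and constant improvement, not the specific values.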

It is alternatively suggested that a singularity could come about through amplification of human intelligence, to the point that the resulting transhumans would be incomprehensible to their purely biological counterparts. The term can also be applied to the general increase of technological capability over time.
