Technological singularity

The technological singularity is a hypothetical point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.

 

This article is for knowledge purposes only. I think I should start with psychologist Steven Pinker's statement from 2008:

... There is not the slightest reason to believe in a coming singularity. The fact that you can visualize a future in your imagination is not evidence that it is likely or even possible. Look at domed cities, jet-pack commuting, underwater cities, mile-high buildings, and nuclear-powered automobiles—all staples of futuristic fantasies when I was a child that have never arrived. Sheer processing power is not a pixie dust that magically solves all your problems.

 

Intelligence Explosion

According to the most popular version of the singularity hypothesis, called intelligence explosion, an upgradable intelligent agent will eventually enter a "runaway reaction" of self-improvement cycles, each new and more intelligent generation appearing more and more rapidly, causing an "explosion" in intelligence and resulting in a powerful superintelligence that qualitatively far surpasses all human intelligence.  


The first to use the concept of a "singularity" in the technological context was John von Neumann. Stanislaw Ulam reports a discussion with von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". Subsequent authors have echoed this viewpoint.

I. J. Good's "intelligence explosion" model predicts that a future superintelligence will trigger a singularity.
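To make the "runaway reaction" idea concrete, here is a minimal toy simulation in Python. It is a sketch under made-up assumptions, not Good's actual argument or anyone's real model: each self-improvement cycle multiplies the agent's capability by a fixed boost and shortens the next cycle. The function name and every constant are hypothetical.

# Toy sketch of an "intelligence explosion" in the spirit of the
# runaway-reaction description above. All constants are illustrative
# assumptions, not claims about any real AI system.

def intelligence_explosion(generations=20,
                           intelligence=1.0,   # starting capability (arbitrary units)
                           cycle_time=10.0,    # years the first redesign cycle takes
                           boost=1.5,          # capability multiplier per cycle
                           speedup=0.7):       # each cycle takes 70% of the previous one
    """Print capability and elapsed time for each self-improvement cycle."""
    elapsed = 0.0
    for gen in range(1, generations + 1):
        elapsed += cycle_time          # this generation finishes its redesign
        intelligence *= boost          # the redesign makes the next agent smarter
        cycle_time *= speedup          # a smarter agent redesigns itself faster
        print(f"gen {gen:2d}: capability ~ {intelligence:8.1f}, "
              f"elapsed ~ {elapsed:5.2f} years")

intelligence_explosion()

Because each cycle takes speedup times the previous one, total elapsed time is capped by the geometric-series limit cycle_time / (1 - speedup), about 33 years with these numbers, while capability grows without bound. That finite-time divergence is the "explosion" in the metaphor.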

 

(Warning: negative opinion ahead; reader discretion advised.)

Public figures such as Stephen Hawking and Elon Musk have expressed concern that full artificial intelligence (AI) could result in human extinction. (This is an extreme opinion; exactly how the future will unfold is always unpredictable.) The consequences of the singularity and its potential benefit or harm to the human race have been intensely debated.


Four polls of AI researchers, conducted in 2012 and 2013 by Nick Bostrom and Vincent C. Müller, suggested a median probability estimate of 50% that artificial general intelligence (AGI) would be developed by 2040–2050.

 

Criticism

Some critics, like philosopher Hubert Dreyfus, assert that computers or machines cannot achieve human intelligence, while others, like physicist Stephen Hawking, hold that the definition of intelligence is irrelevant if the net result is the same.

 
