Is it all bullshit, or could it actually be possible?
http://en.wikipedia.org/wiki/Technological_singularity
Quote:
The technological singularity is a theoretical future point of unprecedented technological progress, typically associated with advancements in computer hardware or the ability of machines to improve themselves using artificial intelligence.
Statistician I. J. Good first wrote of an "intelligence explosion", suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences. The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to an exponential and quite sudden growth in intelligence.
Humans should make sure they can always just pull the plug if necessary.
