"The Singularity", or the "Technological Singularity" is a term that was popularized by Vernor Vinge (see reference 1 below). It is a term from science fiction, considered by many to be in the realm of "possible, and maybe probable". One of the earliest "serious" references I know to it is from Vernor Vinge's essay, "The Coming Technological Singularity: How to Survive in the Post-Human Era". It can be found on page 11 of NASA Conference Publication 10129, "Vision-21", and is reproduced exactly at reference 2.
It is an interesting concept, and one explored in science fiction. Examples can be found in the works of Vernor Vinge, Charles Stross, Karl Schroeder, and many others.
The basic idea of the singularity is that humanity is currently getting a better understanding of the mind, and that there is no obvious physical effect that forbids the artificial augmentation of human intelligence (or the creation of artificial human-like intelligences, but let's assume, for argument's sake, that this is fundamentally impossible). People will figure out how to make themselves smarter. I'd say that even the WWW is a start, because I can now research in minutes what once took me hours in a library. It's becoming easier for people to learn from one another. Vinge's point is that once people gain the ability to enhance their own intelligence, they will enter an effective positive feedback loop, becoming more intelligent and better able to enhance their intelligence. The pace of technological change will, one expects, track that of intelligence. Technological change will enter a positive feedback loop, and a graph of technological prowess against time will suddenly shoot sharply upwards. What happens then will be unfathomable to those of us with merely normal human intelligence.
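A toy model (my own illustration, not something from Vinge's essay) shows why "singularity" is an apt word for this feedback loop. Suppose the rate at which intelligence I improves is proportional to the square of I, so that smarter agents improve themselves faster:

\[ \frac{dI}{dt} = k I^2 \quad\Longrightarrow\quad I(t) = \frac{I_0}{1 - k I_0 t} \]

The solution diverges at the finite time t* = 1/(k I_0): the curve shoots upwards towards infinity, and the model has nothing to say about what lies beyond t*.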
A singularity is a place where the rules break down, and where you cannot extrapolate future behaviour given knowledge of the past conditions. Vinge argues that we will not be able to guess or understand our own future once the artificial enhancement of intelligence starts feeding back.
Will we upload our intelligences into computers? We cannot predict what will happen if we become extraordinarily more intelligent than we are now. "Uploading into a computer" may, by then, sound as likely as "Uploading into a flint arrowhead".
2007-06-25 14:06:04 · answer #1 · answered by Anonymous · 0⤊ 0⤋
I believe the term singularity refers to a discontinuity in a differential equation: in this case, the field equations of general relativity. It's a point where values become infinite, so the equations basically crater at the singularity. Fortunately, the event horizon isolates the little monster from the rest of the universe, so astrophysicists can sleep a little better.
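To make that concrete (my addition, using the standard Schwarzschild black hole rather than anything specific to this answer), the metric outside a non-rotating mass M is

\[ ds^2 = -\left(1 - \frac{2GM}{c^2 r}\right) c^2\,dt^2 + \left(1 - \frac{2GM}{c^2 r}\right)^{-1} dr^2 + r^2\,d\Omega^2 \]

The apparent blow-up at the event horizon r_s = 2GM/c^2 is only a coordinate artifact; the genuine singularity is at r = 0, where curvature invariants such as the Kretschmann scalar

\[ K = R_{\mu\nu\rho\sigma}R^{\mu\nu\rho\sigma} = \frac{48\,G^2 M^2}{c^4 r^6} \]

really do go to infinity, safely hidden behind the horizon.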
If creating an artificial intelligence is possible, I would expect we will null and void ourselves in the not too distant future. As a species, I believe our best hope is that it isn't possible. I have to wonder if one reason the galaxy is so quiet is something that isn't in the Drake equation. Maybe an AI has already distributed itself, and simply shuts down noisy new worlds shortly after they start broadcasting.
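For reference (an addition of mine, not part of the original answer), the Drake equation estimates the number N of currently detectable civilizations in the galaxy as

\[ N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L \]

where R_* is the rate of star formation, f_p the fraction of stars with planets, n_e the number of habitable planets per such system, f_l, f_i and f_c the fractions that go on to develop life, intelligence, and detectable communication, and L the length of time a civilization remains detectable. None of those factors has a term for an already-established AI that silences newcomers, which is the point of the speculation above.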
Wondering about what constitutes "me", here's a question. If a tricky god came into your bedroom tonight and replaced every fundamental particle in your body with a like particle, then you would be a completely new person in the morning, but completely unaware of what happened, since any two like fundamental particles are identical. Would it be you? Its memory says yes. What constitutes "me" starts to break down when you look at it up close.
2007-06-25 20:10:35 · answer #2 · answered by SAN 5 · 0⤊ 0⤋
I think this question is in the wrong section - it's not about the physics singularity but about the intellectual singularity arising from accelerated technological change with positive feedback, leading to massive, unimaginable progress in a very short period of time. I think this is complete rubbish. This type of scenario has been predicted many times in the past, and every prediction has been wrong. As with many other situations, I suspect there will be some "friction" which prevents the process from going into meltdown and therefore limits the rate of progress.

However, I don't believe there is anything "magic" about humanity; I don't believe in "souls" or other religious concepts based on bigotry and out-of-date nonsense. So the possibility of constructing machines which "think" has to be considered - nature and natural selection came up with a version, so why couldn't we? I believe this is simply a matter of complexity, with maybe a bit of quantum uncertainty thrown in. Are we just the sum of our experience, plus the bits handed down in our genetic code? Yes. So it may become possible to download all of this, and apart from our bodies it would be us. I sincerely hope it doesn't run on Windows!!!
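The "friction" argument can be sketched numerically (a hypothetical toy model of mine, not anything this answer specifies): pure positive feedback blows up in finite time, while the same feedback with a saturating term just levels off.

# Toy model. Assumptions are mine: quadratic feedback, Euler steps, arbitrary constants k and friction_limit.
# dx/dt = k*x^2 is runaway positive feedback; multiplying by (1 - x/friction_limit) adds "friction".

def simulate(friction_limit=None, k=1.0, x=1.0, dt=0.001, t_max=1.2):
    t = 0.0
    while t < t_max:
        rate = k * x * x                                    # pure positive feedback
        if friction_limit is not None:
            rate *= max(0.0, 1.0 - x / friction_limit)      # friction: growth stalls near the limit
        x += rate * dt
        t += dt
        if x > 1e9:                                         # treat this as "blown up" (the runaway scenario)
            return t, float("inf")
    return t, x

print(simulate())                     # runaway: diverges shortly after t = 1
print(simulate(friction_limit=50.0))  # with friction: levels off just below 50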
2007-06-25 22:33:21 · answer #3 · answered by john796893 2 · 0⤊ 0⤋
Hi. Questions within questions. The singularity is simply a concept of what might (or must) lie within an intense gravity field. The average IQ is 100. The spread is what matters. Every once in a while an intellect appears that moves humans forward. At this point computers can only mimic intellect. What you ask about 'live forever' can be accomplished with a photograph. A computer could 'know' your date of birth, your first kiss, your last words... but that would not make it 'you'. My opinion.
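As a rough illustration of "the spread is what matters" (my own numbers: I'm assuming the conventional standard deviation of 15, which the answer doesn't state):

from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)   # assumed bell curve: mean 100, SD 15
p = 1 - iq.cdf(145)                 # fraction of people three SDs above the mean
print(f"P(IQ >= 145) ~ {p:.3%}")    # about 0.135%, roughly 1 in 740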
2007-06-25 18:44:37 · answer #4 · answered by Cirric 7 · 1⤊ 1⤋
One day we may be able to download our thoughts and memories into a computer, which would preserve them, but it would not be us; we will still die.
In astronomy (in general relativity), a singularity is the point of infinite density predicted at the centre of a black hole.
They seem to mention it quite a lot in Star Trek.
2007-06-25 18:48:58 · answer #5 · answered by nettyone2003 6 · 0⤊ 0⤋