Currently, the path to a singularity is a matter of speculation, and that speculation tends to follow a few lines of thought. One influential idea is Ray Kurzweil's law of accelerating returns, which claims that technological progress accelerates at an exponential rate. Some growth is merely linear, but much of it compounds exponentially; technologies like the World Wide Web seemed to come out of nowhere and continue to grow exponentially. Cognitive science is advancing too: neuroscientists are closer than ever to fully reverse engineering the human brain. What could that do for us? Imagine making multiple copies, perhaps a million, of a human brain, then running them millions of times faster than normal, all while observing everything that happens and sharing it with the rest of the world. Imagine how fast superhuman intelligence would accelerate from there.
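To make the linear-versus-exponential contrast concrete, here is a toy sketch in Python. The growth rates are numbers I invented purely for illustration, not Kurzweil's figures.

```python
# Toy comparison of linear vs. exponential growth (invented numbers).
# Steady additive progress is quickly dwarfed by repeated doubling.

linear = 1.0
exponential = 1.0
for step in range(1, 11):
    linear += 10.0       # steady additive progress each step
    exponential *= 2.0   # capability doubles each step
    print(f"step {step:2d}: linear {linear:7.1f}   exponential {exponential:8.1f}")
```

By step 10 the linear track has reached 101 while the doubling track has passed 1,000, and the gap only widens from there.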
Another proposed path is known as the event horizon, an idea described by Vernor Vinge. Vinge predicted a time frame between 2005 and 2030 in which humanity will create superhuman intelligence. Once this happens, life will experience strong AI, and the black hole analogy kicks in: just as nothing beyond a black hole's event horizon can be observed from outside, nothing beyond the arrival of superhuman intelligence can be predicted from here. The difference is that this singularity involves information rather than matter being crushed into a gravitational singularity. Life after a Vinge-style event horizon could very well be unpredictable.
I. J. Good's idea of the intelligence explosion argues that once the first mind smarter than a human is created, a superhuman mind in possession of a formal description of itself would be capable of incremental, additive improvements to its own intelligence ad infinitum. I see a faint version of this in life today: I learn much faster because search engines such as Google keep information as close as my fingertips, nearly at the speed of thought. Advanced search engine technology may one day involve superhuman intelligence; in some ways it already feels like it does.
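As a rough illustration of Good's argument (my own toy model, not his formulation), here is a sketch in which each generation of mind improves itself in proportion to how smart it already is. The starting level and improvement rate k are made-up placeholders.

```python
# Toy model of an intelligence explosion (illustrative only).
# Assumption: each generation improves itself in proportion to its
# current intelligence, with a made-up improvement rate k.

def intelligence_explosion(i0=1.0, k=0.1, generations=12):
    """Return intelligence levels of successive self-improving minds."""
    levels = [i0]
    for _ in range(generations):
        current = levels[-1]
        # A smarter mind designs a proportionally better successor.
        levels.append(current * (1.0 + k * current))
    return levels

for gen, level in enumerate(intelligence_explosion()):
    print(f"generation {gen:2d}: intelligence {level:.2f}")
```

The interesting part is that the gains themselves grow each generation, which is the whole point of the "explosion."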
So what could life be like if there is a singularity? This is a question with a million different answers. Most of the issues are ethically fraught, and while speculating about life after the singularity can be fun, too much of it is downright mentally exhausting. From my own experience, thinking about the future of technology is a lot of fun, and I believe it is important to think about these issues even if what we imagine never becomes reality. If we don't speculate about it, we could be missing out on key inventions to come.
Aubrey de Grey, a biomedical gerontologist from Cambridge, UK, argues that with the exponential growth of technology, lifespan is growing rapidly as well. In de Grey's theory, lifespans will reach around 1,000 years by 2100, and he claims that with today's technology a typical human in good health has a 50% chance of surviving to 2100. I feel this is a fair judgment, but the problem is that people don't usually buy into this kind of claim. Even when they're presented with supporting models, it isn't something most people will believe right away. In my opinion, we should focus more on staying alive today and worry about immortality when the signs are clearer.
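The arithmetic behind this kind of claim is often called longevity escape velocity: if medicine adds more than one year of remaining life expectancy per calendar year, your remaining expectancy never runs out. Here is a minimal sketch of that logic with hypothetical numbers, not de Grey's data.

```python
# Back-of-envelope sketch of "longevity escape velocity".
# All numbers are hypothetical placeholders for illustration.

def remaining_expectancy(start=40.0, gain_per_year=1.2, years=60):
    """Remaining life expectancy if medicine adds gain_per_year annually."""
    remaining = start
    for _ in range(years):
        remaining -= 1.0            # one calendar year passes
        remaining += gain_per_year  # medical progress adds expectancy
    return remaining

# Gains above one year per year mean expectancy rises instead of falling.
print(remaining_expectancy())                   # 52.0: growing over time
print(remaining_expectancy(gain_per_year=0.5))  # 10.0: still shrinking
```

Whether real medical progress ever crosses that one-year-per-year threshold is, of course, exactly what is in dispute.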
So, as you can see, there are many views on what the singularity is and when it is going to happen. It is also possible, though I think unlikely, that the singularity won't happen at all, at least not by Vinge's predicted circa 2030. In that case it is important that we as humans plan for what is practically important. There are many different scenarios for what our future might look like if a phenomenon like the singularity doesn't happen; here is mine.
With all these advances in technological growth, why wouldn't we reach the point of singularity? The bottleneck in what humans have created so far is the design of intelligence itself. With today's processing and clustering technologies, we already seem to have the raw processing power to run neural networks on the scale of a human brain. So why aren't computers already more intelligent than humans? Because human intelligence hasn't yet produced software that can simulate human thinking or remodel the human brain.
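A quick back-of-envelope calculation shows why the hardware side looks within reach. These are commonly cited rough estimates for the brain's scale, not precise figures, and the true requirements for simulation are debated.

```python
# Back-of-envelope estimate of raw compute for a synapse-level brain
# simulation. Rough, commonly cited figures; real requirements could
# differ by orders of magnitude in either direction.

NEURONS = 1e11              # roughly 100 billion neurons
SYNAPSES_PER_NEURON = 1e4   # roughly 10,000 synapses per neuron
UPDATE_RATE_HZ = 1e2        # roughly 100 signal events per second

ops_per_second = NEURONS * SYNAPSES_PER_NEURON * UPDATE_RATE_HZ
print(f"~{ops_per_second:.0e} synaptic operations per second")  # ~1e+17
```

That figure sits within a few orders of magnitude of the largest computing clusters, which is why the missing piece looks like software rather than raw horsepower.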
Kurzweil's scenario seems to me the most plausible one. Yet even though we are so close to tapping the fountain of superhuman intelligence, it is that same intelligence that could get the best of us. With the global struggle over nuclear power, terrorism, and the depletion of livable resources, a future that never reaches a transcendent condition will likely end in human extinction before transcendence.