One Singular Sensation: The Geek Rapture
As a writer working intermittently in the information technology industry and as a science enthusiast, I have long been aware of the ever more rapid advancements taking place in some areas of technological endeavor. I first heard of Moore's Law — which describes the computer hardware trend whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles roughly every two years — in the mid-1990s while working for the leading PC manufacturer at the time, Compaq. And like a lot of people, I've watched that trend unfold with remarkable speed in the ever-expanding marketplace of machines and gadgets we use to work and communicate. In addition, I have access to people involved in the Human Genome Project, and I maintain close friendships with a handful of engineers currently contracting with NASA. Plus, simply because they interest me, I keep peripheral tabs on the biotechnology, nanotechnology, and artificial intelligence fields.
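Moore's Law is easy to state but its compounding effect is easy to underestimate. Here's a quick back-of-the-envelope sketch of the idealized trend; the 1971 starting figure (roughly 2,300 transistors, the Intel 4004) is just a commonly cited illustration, not actual chip data.

```python
# Idealized Moore's Law projection: transistor counts double
# roughly every two years. Starting figure (~2,300, the 1971
# Intel 4004) is illustrative, not real chip data.
def projected_transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward assuming one doubling per period."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Twenty years means ten doublings, i.e. a factor of 2**10 = 1024.
print(round(projected_transistors(2300, 1971, 1991)))  # 2355200
```

The point isn't the specific numbers; it's that a modest-sounding rule ("doubles every two years") multiplies a count a thousandfold in just two decades.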
So, if Iâ€™m not a little more informed than the average person when it comes to the progress of technology and the processes that control it, I at least like to think I am.
I must admit, however, that until about a month ago, I was completely ignorant of something known as the technological Singularity. I was likewise ignorant of the people and institutions that espouse it. But recently, a friend of mine mentioned the Singularity to me, almost in passing, and being that I'm blessed/cursed with a penchant for letting my curiosity guide me, I began to look into it a bit more deeply, and soon discovered it to be one of the most interesting targets on which to turn a critical eye that I've ever come across.
Now, if you are in the same spot I was a month ago, the technological Singularity is defined as an "event horizon" in human technological development beyond which humans (at least in our current incarnation) will cease to be the drivers of technological progress; the technology itself will take over that task. Basically, an ultra-intelligent machine will surpass the intellect of any human, no matter how clever. And since the design of machines is a product of intellect, such an ultra-intelligent machine could design even better machines. The intelligence of man would be left behind. The first ultra-intelligent machine would be the last invention that man need ever make, and its creation would mark the Singularity.
Not only that, but people like inventor Ray Kurzweil, one of the leading thinkers championing the coming Singularity, believe it will entail a radical transformation of our minds and bodies, thanks to accelerating advances in AI, nanotech, biotech, computer science, and neuroscience, and their integration with humankind on a mass scale. We're talking about cyborgs, digitized psyches, uploading human consciousness onto machines, and immortality here.
Of course, this is the point at which pop culture favorites like the Terminator and Matrix movies come to mind, not to mention hundreds of science fiction novels that deal with similar themes. And in fact, there are other supporters of the Singularity, aside from Kurzweil, who see the event resulting in the dystopian or utopian landscapes found in fiction. Brilliant men and women in cutting-edge scientific and technological fields demonstrate a passionate belief that this "event horizon" will be reached, and that the world after will look like nothing we can imagine.
But before we get to the question of what our world will be like, we have to put on our critical thinking shoes and ask, "Is the Singularity going to take place at all?"
One thing about the idea of the Singularity that I found very disturbing when I first started reading papers, blogs, and various other materials on the subject was the fervency with which many very rational, scientific people believed in its certainty. There are religious overtones to what some of the more staunch supporters say online, and the disharmony of hearing language of faith come from otherwise rational minds was shocking.
And Kurzweil does nothing to eschew that image. In fact, throughout a documentary detailing his life and beliefs called "Transcendent Man," he is presented almost as a mystic, sitting in a chair with a shimmering, circular light floating around his head as he explains his philosophy's basic tenets. (See the trailer for "Transcendent Man" below.)
Images and attitudes like these are why science journalist John Horgan and others refer to the Singularity as the "rapture of the geeks".
Now, I actually think "rapture of the geeks" is a great line, and I recognize why some people would use that terminology, but I wanted to avoid having the conclusions of my research tainted by my impressions of the people involved and their behavior. I felt it more appropriate to examine instead the nuts and bolts behind the idea, and to let that examination inform my conclusions free of any preconception about the players.
After all, this is a different type of problem than most claims and phenomena skeptics and critical thinkers address in this context. It's one thing to examine a phenomenon that has already occurred (or that is occurring) and determine its validity or formulate an explanation for it, but it's something else entirely to try to determine whether something predicted will indeed come to pass.
Depending on the subject matter and the available information, it may very well be impossible to proceed with any measure of confidence, but if we're going to try, one way to do it might be to look at the major implications for the prediction coming true, and weigh those against other similar implications that failed.
In other words:
What might one point to as an indicator that the Singularity will take place, and what might one point to as an indicator that it will not?
And as it turns out, there are a lot of indicators on both sides of the question. To keep this post from becoming even longer, I’ll just mention a few of them.
Kurzweil goes to great lengths in his talks to point out exponential progress in information technology (doubling, as in 1 to 2, 2 to 4, 4 to 8, 8 to 16, and so on). So, for example, we can reach a billion bits of information in merely 30 doublings, as opposed to a billion single steps. And it is a fact that advances in information technology are taking place at an exponential rate rather than a linear one.
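The arithmetic behind that "30 steps" claim is simple to check by counting the doublings needed to pass a billion:

```python
# Count how many doublings it takes to go from 1 past a billion.
steps = 0
value = 1
while value < 1_000_000_000:
    value *= 2
    steps += 1
print(steps, value)  # 30 doublings land on 2**30 = 1,073,741,824
```

Twenty-nine doublings gets you to about 537 million; the thirtieth blows past a billion. That's the intuition-defying nature of exponential growth in a nutshell.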
The human genome has been mapped. The brain is being mapped. All of human biology is being deciphered and recorded as bits of information, and the stockpiles of information we have in all these areas are constantly doubling. The more information we have about something, the better we understand it, and the better we can manipulate it. It's this kind of speed in the progress of understanding that leads many to think the Singularity will happen very soon.
But will data collection and analysis allow us to do things like reverse engineer nervous systems or the brain, so that we can build one that will be more intelligent and more efficient than our own?
The information technology advancements also impact areas like AI and robotics. Machines are becoming more intricate and more capable, able to do exponentially more calculations in the same amount of time. In some cases, engineers are actually finding it challenging to design and manufacture products that keep pace with those capabilities.
If nanotechnology follows this trend, and we achieve a complete understanding of life on an informational level in a short time based on the exponential data collection, it is not inconceivable that we could use machines to fix every little thing that plagues our mortal bodies, thereby extending our lives to however long we see fit. We could ourselves become the machines that mark the Singularity.
But will that be the case?
Of course it remains to be seen, but these are but a few of the indicators that something extraordinary in technological advancement could actually occur.
But are there areas of advancement that have leveled off or collapsed completely?
Well, there were predictions in the 1970s by epidemiologists that infectious diseases would soon be completely eliminated. Yet we still fight colds and STDs, and the occasional new strain of the flu virus never fails to strike fear into the hearts of the general public.
In the late 1990s and early 2000s, several publications began reporting that diseases like Alzheimer's were on the verge of being eradicated. Advances in our understanding of the disorder and in treatment options were about to explode, and there was a great deal of fanfare that soon a diagnosis of Alzheimer's would be no more alarming than a diagnosis of the flu. Yet here we are more than a decade later, and hardly any advances in treatment for the disease have been made.
As early as the 1980s, there was great excitement over nuclear fusion. The PR at the time was that by the 2000s, fusion energy would be "too cheap to meter" (basically free). Yet the fusion energy industry in the US has all but collapsed.
And one need not say more than the word "cancer" when talking of problems we just can't seem to get a handle on.
(See a video of Horgan and Kurzweil debating these points here)
So, clearly there are indicators that something like a Singularity could take place. But there are also many points in history that show our best guesses about how things will turn out don't always come to fruition.
And it seems another obstacle preventing the critical thinker from getting aboard the Singularity train lies in the nature of the futurist, or the predictors of things to come. I don't know if there is a default psychological make-up amongst futurists, but there is a long history of futurists letting their ideology get the better of their reason. How often have we read about and commented on the exploits of doomsday cults on blogs such as this one? We've pitied and ridiculed those people who are so sure the end of the world is coming that they give away their material possessions, only to be standing in a field or on a mountain somewhere (or worse, lying dead in a heap) wondering what they're going to eat and what they're going to wear the next day because their prophet somehow picked the wrong day.
And if you're wondering whether I'm comparing adherents of the Singularity to the likes of the Millerites, I absolutely am, at least in the sense that some Singularitarians don't seem to want to exercise due caution about what are some very bold claims.
Hey, if you think you can fly, donâ€™t jump off a 20-story rooftop. Try taking off from the ground first.
Of course, making precise predictions is nearly always problematic. There are so many unknowns that even having an enormous set of strong indicators about a possible event happening doesn't necessarily mean it will.
When I ask you whether the sun will rise tomorrow, or more precisely whether the Earth will rotate on its axis in its orbit around a medium-sized star that continues to burn, such that the inhabitants of the Earth experience what we call daylight, even though there are extremely strong indicators that it absolutely will, the correct answer is still "probably".
And the answer is "probably" and not "I don't know" only because there is precedent. It has happened before.
In the case of the Singularity, there is no precedent.
There are a few final thoughts I'll relate to you on the subject of the Singularity:
Will it happen?
Well, I'm going to go with "I don't know". I know it's not a very sexy answer, but it's one that thinking about the subject critically leads me to, and it's one that I'm okay with. And "I don't know" doesn't mean something won't come to light that would make me change my mind. Indeed the subject fascinates me, and I will continue to look into it, and keep an open mind about what the future holds. But I'm not going to make any hard and fast predictions, nor am I going to adjust my life or my thinking as though the Singularity is coming.
I'll also add that I had the great pleasure of attending the Singularity Summit in San Francisco this past weekend. And while I went into the conference honestly expecting to be awash in the cultish fervency I had seen demonstrated by some Singularitarians on the Internet, I was pleasantly surprised to have those fears allayed. The overly enthusiastic types seem to be the exception, not the rule. Everyone I met was brilliant, sober, and extremely rational in their thinking. It was an awesome intellectual and philosophical experience.
On top of that, the subject matter for each presentation was just so cool. It was AI, and nanotech, and biotech, and robotics, and animal intelligence, etc., etc. And with the ardent futurist approach mostly absent from the program, it was a science enthusiast's/technophile's dream.
And ultimately, I came away from it thinking that it might just be best for all the folks involved in developing these amazing technologies to continue their work as always, at whatever pace their abilities and funding dictate, and let the future unfold as it will.
And when people ask what the world will be like after so much technological advancement, be comfortable saying, "Well, I think it's going to be really, really cool, but I don't know for sure".