Books: Foreshadow of the Singularity?


“The enlightenment has to be conceived as an attitude, an ethos, a philosophical life in which the critique of what we are is at one and the same time the historical analysis of the limits that are imposed on us and an experiment of possibly going beyond them.”

- Michel Foucault



After ten minutes of sitting in the bookstore with three daydreamers and a keyed-up old crone, I decided to stop watching a new age author give verbal fellatio to the works of Deepak Chopra, purchase my books, and retreat to my home where I could selfishly ignore the world. With Bonnie "Prince" Billy's I See a Darkness providing the soundtrack, I hit the rain-soaked road, dodging potholes and the occasional listless wanderer. A short while after sunset I cautiously pulled up my slanted driveway, braking momentarily for my neighbor's suicidal cat, and set the vehicle to rest. With the absence of headlights the night was so tenebrous I could not even locate the best cliché to describe it. This particular evening was especially dark due to two things: a heavy northwest cloud cover and a burnt-out porch light. I hopped out of my car and started in the direction of the stairway that climbs up to my front gate. As I was about to brave the uneven, slick brick steps with only muscle memory to rely on, it dawned on me -- it's 2011. I have an app for this. I tucked my newly purchased books under my arm, reached into my pocket, and pulled out my phone. As I slid my finger across the touchscreen, the brightly lit New York Times page I had been reading a couple of hours prior sprang to life. I aimed it at my feet and safely made my way up the steps. I had opened my phone to the New York Times to light my path. Had I spoken that very sentence to nearly anyone besides Ray Kurzweil a decade ago, they surely would've thought I'd lost it.

Many tech-savvy people reading this will think I've lost it for another reason -- I still purchase physical books. "The book is a dead medium," they'll decree. "Didn't the author go the way of the polar bear long before ereaders and iPads began to duke it out at prime time? For the few surviving writers who've been marginally successful in a market dominated by dribblers like James Patterson and Stephenie Meyer, haven't publishers left them vulnerable as kittens, fleeting as dandelions, purely latent vestiges of genius susceptible to a deathblow at any moment, like a sleeping Stephen Hawking?" It saddens me to say, in many ways: yes.

For as much as I love those tangible, notoriously musty-scented, easily-prone-to-coffee-staining parallelograms of pleasure called books, they may very well become antiquated. And although my keenness for the medium is strong, above all it's the written word that garners my affection. I'm not opposed to reading digitally formatted text. I'm constantly reading the news, blogs, and articles -- as well as viewing a bunch of other trivial nonsense online. Many mornings I find myself reading on my iPhone while waiting in line for coffee. Ereaders have some undeniable advantages. In academia, for instance, being able to easily bookmark passages, watch embedded videos, and search for key terms simultaneously -- throughout thousands of sources -- is quite handy.

However, these new technologies are so young that people often exaggerate their virtues. For example: although ereaders aren't made out of paper, they're not as green as books. Through all stages of its existence a single ereader creates waste equal to that of roughly 75 to 100 books. The average ereader marketing generation is about eleven months. Granted, most people will use their ereader for longer than a generation, but not usually longer than two or three. So unless you're Rain Man, it's not the ecological choice. The best ecological choice is still the library. For the same reasons, ebooks are not the cost-effective choice. Considering that in some cases you're really just borrowing the title due to the short-term contract the publisher has made with the ereader company, you never really own them. And let's not forget the Orwell debacle that, with almost divine irony, illuminated the problems with Amazon's Kindle and its totalitarian publishing ways.

Nevertheless, as arguments of compare and contrast abound in our ever-evolving technosphere, they often marginalize legitimate concerns. What we should be asking ourselves is what effects an instrument of expression will have on future expressions. And when it's something as essential as the first physical extension of our minds -- the written word -- how will it in turn shape the future of our species? Are books foreshadowing the singularity?





Tools have altered human perception and forecasts for the future ever since our species discovered the merits of the stick. It has been our nature to explain life and how the mind works through what we can observe. In many ways it's been a positive approach. It beats the alternative of giving credence to a celestial string-puller. But we've made a lot of mistakes in the process. Ever since the beginning of the industrial revolution many have feared machines will replace the worker, while simultaneously comparing the workings of the system to a machine. Freud took to misunderstanding humans by making analogous comparisons to the steam engine. For Freud, sexual expression was a need to "release steam," if you will. Early in the twentieth century Americans took the automobile and turned it not only into a symbol of individual liberty, but also into a nationalistic metaphor. In our contemporary mindset we now misunderstand ourselves, particularly our brains, through computer analogies. Although there are similarities in how synapses and circuit boards transmit data, the way that data is interpreted in the human mind is still a mystery. A computer may be able to deliver The New York Times to my hand, but its purpose is for me to read it -- not to use it as a light.

Our brains have a marvelous ability to turn by-products into features. This is something that, as of today, computers can't do. Computers are designed for a specific purpose. Supercomputers like Watson may be able to weave adaptive algorithms and beat humans at Jeopardy, but they can't ask intuitive questions or comfort a loved one. Despite their shortcomings, we claim the pinnacle of human intellect is when one can do what Watson does. This unwise philosophy has diminished the truly astonishing feats of the human mind and carved a deep groove in our culture.

As Jaron Lanier has pointed out on numerous occasions, nothing demonstrates a shift towards human mechanization more than the act of retweeting. If we just pass information down the line, rarely marinating in it or conversing about it, it has little relevance. We become synapses firing within an enormous brain-like entity -- the Internet. Actually having conversations, writing about what we read, letting it sink in for a bit can be profound. But how often do we actually do that, especially considering the hundreds of tweets many of us read in a day? There has always been -- and will always be -- more information available than we can retain. But what are we to do now that we've sped the flow of information up so fast that our retention levels are declining?

When early Homo sapiens discovered written language it had astonishing effects on the development of the human brain. Vocabulary increased, math was created, and for some, memories were expanded by etching them onto pieces of earth. For the common person, reading materials were rarely obtainable, and thus so was literacy. And so it was for hundreds of years.

Books existed for quite some time before supply could begin to meet demand. Until 1750, reading was done "intensively": people with above-average means tended to own a small number of books and, if they felt generous, they would read them out loud to small audiences. After 1750, people began to read "extensively." They started collecting books and increasingly read them alone. It took the ability to copy and distribute books swiftly and cost-effectively before they caused the first intellectual boom -- the Enlightenment.

As books abound, their presence is often worth something beyond their intended function. When entering the home of a new acquaintance, a moment often arises when you find yourself perusing their bookshelves and music collections. Books, records, CDs, DVDs -- these are almost always displayed in open view. They're exhibited as badges of honor and accomplishment, a source of pride and personality. It's generally not impolite to browse other people's shelves -- it's encouraged. A walk through someone's media collection is a chance to be alone with partial contents of their mind. As I reflect on some of my past romantic relationships, they could also be summed up by the contents of those shelves. A few scattered classics, Chuck Palahniuk, David Foster Wallace, Irvine Welsh, David Sedaris, and Dave Eggers -- these authors were expected during an epoch not long ago. If I saw Zadie Smith, Murakami, Lethem, Didion, Pynchon, Nabokov, Zizek, Orwell, Amis, Hitchens, Vidal -- or anything philosophically or historically inclined -- I was confident that the owner of the shelf and I would pair like pinot gris and pan-seared escolar. But if I walked in to find either a meager collection or books on astrology resting next to a copy of the dreaded Secret, I would turn and flee, never once looking back. On one shelf in particular I came across an abundance of Danielle Steel novels. Although it caused me to recoil, I dismissed them, perhaps to the detriment of that particular relationship. However, if seeking a healthy partnership, exceptions must be made. Not everyone has the same palate; guilty pleasures, gifts, nostalgic heirlooms -- these things make us human and can often carry a Titan case full of charm.

People dating in the future are not going to have access to unfiltered bookshelves. Even today many people keep most of their media collections tucked away in digital receptacles. Computers, iPods, ereaders, and the like are typically private. They're viewed only with the consent of the owner -- something not always granted to acquaintances. The consequence is that instead of perusing someone's shelves we're now left with what they wish to share in conversation, or more realistically, through social media. Social media sites such as Facebook have created a platform for people to narcissistically control how they're perceived. Twenty-two-year-old Sarah from Boise may claim to love anything directed by Fellini, or to have read Infinite Jest in its entirety -- but has she? These claims could very well be accessories spicing up her digital appeal. If she had a bookshelf she could stock it with unread copies of brilliant novels, but purchasing books costs money and requires more effort than most people are willing to put in to create an intellectual façade; besides, most façades are located outside of the home. And all those "guilty pleasures," or books that defined a youthful moment in her life, can be easily kept from public view.

As the book makes the digital transformation, it's commonly perceived that individuality is being expanded due to greater access to information. However, that's not necessarily why books are being digitized. A while back Google uploaded millions of books to its servers. Many authors rejected the idea, a lawsuit followed, and a great debate was born. But there's a problem with the way we as consumers and creators viewed the issue. Google Books didn't upload all of those titles for you to read; it uploaded them for a computer to read. It put enormous amounts of data on the net to be retrieved via an algorithmic system (Google). As Googling takes its place among verbs like walking, we should be reminded that Google doesn't work by knowing who you are or what you really need. Google works by comparing your current key term to what you've searched for in the past. It then measures that against key terms entered by other people. This loop then feeds back onto what we're looking for. Google is not a form of artificial intelligence -- it's an equation. As information becomes digitized, such equations alter our choices and behaviors in profound ways.
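That feedback loop can be sketched in a few lines. This is only a toy illustration of the idea -- rank by overlap with past behavior rather than by understanding -- and not Google's actual system; every query, weight, and title below is invented for the example.

```python
from collections import Counter

def rank_results(query, user_history, global_queries, candidates):
    """Rank candidate titles by overlap with past behavior, not by meaning."""
    q_words = set(query.lower().split())
    personal = Counter(w for q in user_history for w in q.split())
    crowd = Counter(w for q in global_queries for w in q.split())

    def score(title):
        # Weigh your own past key terms, everyone else's key terms, and the
        # current query -- no knowledge of "who you are" required.
        return sum(personal[w] * 2 + crowd[w] + (5 if w in q_words else 0)
                   for w in title.lower().split())

    return sorted(candidates, key=score, reverse=True)

history = ["singularity books", "ray kurzweil singularity"]
crowd = ["kindle deals", "kindle vs books", "kurzweil singularity"]
results = rank_results("singularity", history, crowd,
                       ["singularity is near", "kindle deals today", "gardening tips"])
# → ["singularity is near", "kindle deals today", "gardening tips"]
```

Each new search would then be appended to the history, so tomorrow's rankings tilt further toward today's choices -- the loop feeding back on itself.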

Take Pandora, for example. Pandora is interfaced with Facebook. It knows the music interests you've listed on your "info" page and the artists you've mentioned in comments. When you use the site you input a few artists you like and click "thumbs up" or "thumbs down" on the songs it plays; Pandora then filters music through the algorithm you've created. It tries to predict your behavior. For many people it doesn't just predict it -- it dictates it. And the media it draws on is set up through an algorithm based on other people's playlists and advertising dollars. As media moves into digital formats, the individuality of human discovery and expression gets marginalized. Whether that is good or bad depends on what kind of species we want to be in the future. Our choices now affect those who aren't yet able to make a choice on their own. Although that's how the world goes round, the repercussions of the choices we make now affect the next generation more rapidly than ever before.
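The thumbs-up/thumbs-down loop works the same way, and can be sketched as a toy weight-update. The trait names and numbers below are invented for illustration; this is a sketch of the general idea, not Pandora's actual Music Genome system.

```python
def update_weights(weights, song_traits, thumbs_up):
    """Nudge trait weights toward (or away from) a rated song's traits."""
    delta = 1 if thumbs_up else -1
    for trait in song_traits:
        weights[trait] = weights.get(trait, 0) + delta
    return weights

def pick_next(weights, catalog):
    """Serve whichever song best matches the weights learned so far."""
    return max(catalog, key=lambda s: sum(weights.get(t, 0) for t in catalog[s]))

catalog = {
    "song_a": ["acoustic", "melancholy"],
    "song_b": ["electronic", "upbeat"],
    "song_c": ["acoustic", "upbeat"],
}
w = {}
w = update_weights(w, catalog["song_a"], thumbs_up=True)   # liked the sad acoustic song
w = update_weights(w, catalog["song_b"], thumbs_up=False)  # skipped the upbeat electronic one
```

After just two votes, `pick_next` will keep serving variations on the song you already approved -- prediction sliding into dictation.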

I have a friend whose kid loves to play games on his iPad. One day we were sitting at the same table, he multitouching his way around a vibrant world while I flailed away on my laptop. After his game ended, he reached out and, with a pulling motion between his thumb and index finger, tried to enlarge a window on my screen. He was instantly puzzled as to why it wasn't responding. All I could tell him was that, soon, when he's reading and writing -- it will.

A few weeks prior, I was at the airport and a little girl shouted, "Daddy, don't forget your book," and reached for the Kindle he'd left on his seat. This younger generation is not going to perceive information devices, or even information itself, as we do. And it's due to choices our slowly evolving brains have rashly made regarding technology as it progresses at exponential rates.


There are many engineers and scientists who feel the only way we can keep up with the accelerating flow of technology and respond to it intelligently is to merge with it. Ray Kurzweil is perhaps the leading advocate for this new futurism.

Ray Kurzweil is a technological guru and inventor extraordinaire. He's responsible for creating optical character recognition (OCR), text-to-speech synthesis, speech recognition technology, the flatbed scanner, a series of keyboard synthesizers, and a line of health supplements. He owns twelve companies and he's a bestselling author tackling the subjects of artificial intelligence and the technological singularity. If you're not familiar with the singularity, it's essentially a hypothetical event horizon when biology and technology merge via the Internet, causing an intelligence explosion and beginning humankind's next evolution into the unknown. From that point in time the Internet could do away with us or turn us into digital, cell-like bits of information fitting snugly into its algorithms. Kurzweil thinks that we'll somehow merge with the Internet yet still be in control of our humanity. We'll utilize it to become immortal and billions of times more intelligent than we are today. Preaching the singularity is such a passion for Kurzweil that he founded Singularity University, a college for the technologically elite and wealthy located in Silicon Valley.


In Barry Ptolemy's Transcendent Man, a new documentary about the life and times of Kurzweil, the argument over which is superior, books or ereaders, is fodder from the past. Instead he lays out what changes we're going to see leading up to his inevitable singularity. According to Kurzweil, information will soon be transferring at rates so fast that humans won't be able to keep up unless we merge with technology. Kurzweil predicts that in the near future we will have the technological capability to back up the contents of our brains or upload them into new biotech bodies. We will inject nanobots the size of white blood cells into our bloodstream. These nanobots will allow us to run for twenty minutes on a single breath or stay underwater for hours at a time. We will be able to download knowledge directly into our feeble grey matter (you may even be able to learn kung fu in a day). Not only will we become "superhuman," we will be adept at creating AI versions of our deceased loved ones, something Kurzweil is personally determined to do so he can once again interact with his father (a version of his father, that is). According to this modern-day Thomas Edison, the inevitable shift into the accelerated unknown is supposed to begin in 2045. Yep, if you can make it till then, immortality shall be yours. According to Kurzweil the sick will be healed, the dead will be raised, and eternal life will be granted. (I swear I've heard this before.)

Now why isn't this man being treated like a pseudo-religious crackpot who should be selling pencils from a cup, you ask? Predicting the future is something Kurzweil has spent the last twenty-five years perfecting, and thus far his track record has been exceptional. In the early eighties he accurately predicted when the Internet would go mainstream (although I don't think he knew Al Gore was going to invent it). He predicted the fall of the Soviet Union, when computers would beat humans at chess, and when the human genome would be mapped.

Kurzweil's timeline uncannily fits into his life expectancy. He takes over 200 pills a day in an attempt to live long enough to live forever. This calls into question whether his predictions are legit or simply wishful thinking. His theory, though it may be feasible, leaves out the cap the market places on the rate of technological progress and ignores the social repercussions that could arise as the singularity approaches. Many critics of Kurzweil also fear a potential fallout when mankind is divided between those who have access to the Brave New World and those who get left behind. Wars could very well break out. Not necessarily the wars between man and machine popularized by films like Terminator, but wars over resources.

Kurzweil isn't alone in his prospects for the future of humanity. Transcendent Man also features Kevin Warwick, the world's first cyborg. Warwick is perhaps the most frightening and extraordinary person alive today. He is a professor of cybernetics at the University of Reading, Berkshire, UK. "Captain Cyborg," as he's often called, started his path towards becoming the Bionic Man in 1998 by embedding chips in his body that he used to control doors, heaters, lights, and other computer-controlled devices based on his proximity. Once he knew his body could safely handle the electronics, he developed a sonar-transmitting device that allowed him to navigate obstacles in the dark. Shortly after that he interfaced with a sensory-transmitting robotic arm through the Internet, which he could control from anywhere in the world. If that's not enough, he developed it so he could feel objects that he "touched" with the robotic arm. He did this via a radio chip inserted into his nervous system.


His next project was perhaps the most amazing. Warwick put a radio chip in his wife's arm and interfaced with her through the Internet. Using the same technology as the robotic arm, he was able to feel when she moved her arm. This was the breakthrough he was searching for. The next step for Captain Cyborg is to further develop digital telepathy, although it's uncertain whether that will bring us closer together.

The written word is a marvelous extension of ourselves, but it wasn't the first. Humans are social animals, and as such we've long been extensions of one another. Communication links us no matter how fast it is, no matter how literally we're wired to each other. Perhaps in the future we'll extend our emotional reach and hone empathy beyond imagination. Perhaps each and every one of us will have infinite time to do so. But with omnipotence, immortality, and life on the circuit board, the purpose of such emotional extension is presently incomprehensible. As we watch the book take the digital plunge, perhaps we're getting a glimpse into our future. We must pay attention. The evolution of humanity may very well be brought about by a conscious choice you are making now. That choice could be as simple as how you choose to read.