Jaron Lanier and the Case for Moonshot Thinking

Perhaps you have a coworker with a tendency toward oddball musings that everyone tolerates because he’s exceptionally intelligent and generally confines his weirdness to the lunch hour. Now let’s imagine he embraces his ruminative streak, and after spending a month at an Indonesian ashram he gives a speech to the whole office that contains the sentence “Don’t you see what’s happening here?” Jaron Lanier is that coworker. Only he’s operating on a larger scale. 

In his new book, Who Owns the Future?, Lanier is foretelling the slow collapse not of one company but of the entire edifice of capitalism in the technological era. It’s not entirely clear whether he should be redirecting the course of society or handing out pamphlets on a street corner—and he does not leave a lot of middle ground. But we should at least stop to hear him out.

Lanier’s career as a computer scientist is entwined with the central economic story of our time, the rapid advance of computation and networking. A devotee of speculative projects, Lanier was a pioneer of virtual reality, a term he popularized, and he worked on the development of Microsoft’s Kinect, which allows a person to control software by moving in space. He has now ascended to the level of full-time theorist (currently for the R&D arm of Microsoft), but along the way he has become probably the foremost apostate of the digital age. “I am not writing about some remote ‘them,’” he remarks, “but about a world I have helped to create.” And he is not happy with it.

In Lanier’s previous book, You Are Not a Gadget, he argued that the leading figures in Internet culture are dishonoring and effacing the specialness of individual personhood, elevating the powers of the “hive mind” to insupportable levels and forecasting a merger of man and machine—the so-called Singularity—that is fundamentally impossible. My response to You Are Not a Gadget was much like Zadie Smith’s in The New York Review of Books: the book “chimes with my own discomfort” in a deep way, and its break with orthodoxy provokes a thrill. 

Now Lanier is warning us that digital capitalism is not only based on an incorrect philosophy but is also wreaking real damage. Consider one of his simpler examples: When Google translates a foreign-language Web page, it appears as if a flawed but nonetheless impressive robot has done all the work. What has in fact happened, however, is that Google’s servers have gathered a huge number of handmade translations and auto-correlated them with the text in question. Google Translate is nothing but the robo-sum of human intelligence and skilled labor. “Digital information,” Lanier writes, “is really just people in disguise.”  
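
To make Lanier’s point concrete, here is a minimal sketch of corpus-based translation in the spirit he describes. It is a toy in Python, not Google’s actual pipeline: the phrase table, the French example, and the greedy matching rule are all invented for illustration, but they show how the “translation” is nothing more than a recombination of phrases that human translators already produced.

```python
# A toy illustration (not Google's system): the "machine translation" below can
# only recombine phrases that human translators already produced.
# The phrase table and the example sentence are invented for illustration.

human_translations = {  # hypothetical corpus of human-made French -> English pairs
    "le chat": "the cat",
    "est sur": "is on",
    "la table": "the table",
}

def translate(phrase: str) -> str:
    """Stitch together a translation from phrases found in the human-built corpus."""
    words = phrase.lower().split()
    output = []
    i = 0
    while i < len(words):
        # Greedily match the longest known human-translated phrase starting at i.
        for j in range(len(words), i, -1):
            chunk = " ".join(words[i:j])
            if chunk in human_translations:
                output.append(human_translations[chunk])
                i = j
                break
        else:
            output.append(words[i])  # unknown word: pass it through untranslated
            i += 1
    return " ".join(output)

print(translate("Le chat est sur la table"))  # -> "the cat is on the table"
```

The toy also makes Lanier’s accounting complaint legible: all of the value flows out of the entries in human_translations, yet nothing in the loop remembers, let alone pays, the people who wrote them.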

Lanier’s concern is that the translators, the people in disguise, are going unpaid. And this, he argues, is beginning to happen to all of us. Our collective efforts are enriching the giants of technology at our own expense. Facebook and Google are only worth something because billions of us have entered our data into their computers, without compensation. More than 100,000 jobs disappeared as massive Kodak collapsed into bankruptcy and gave way to the likes of tiny Instagram, though Instagram is partly built on the past ingenuity of Kodak’s employees. Truck drivers and surgeons have accumulated the knowledge and physical skill that computers will learn to simulate, until most of those workers are no longer needed.

For Lanier, the issue is not that machines are getting so good that human beings are becoming useless—he angrily rejects that idea—but that the machines are made of human beings who are being impoverished by dishonest accounting. As Lanier sees it, musicians, journalists, and photographers have suffered financially in the digital age not because their work is less valuable, but because their work is not properly valued; everyone else, he suggests, is headed down the same ruinous road. Nurses could have their own Napster moment. Although he doesn’t mention it, Lanier’s vision of a middle-class death-by-digitization gains some credence from the current “jobless recovery,” wherein companies have returned to profitability with a smaller pool of workers than were employed before the crash, even in the white-collar professions. I find Lanier’s vision of a coming mess too persuasive for comfort. The vaguely defined conventional wisdom seems to be that as computers take away certain jobs, people will simply need to find other jobs that can’t be done by machines. But Lanier knows only too well that the machines are too good for that; we’re playing a game of musical chairs with fewer and fewer chairs. 

Lanier’s proposed solution to this mess is both intuitive and, on further reflection, a little bit insane. What if those translators got a few pennies, he asks, every time Google Translate drew on their work? That sounds fair, but notice how much follows from this premise. Lanier envisions a universal “nanopayment” system that would compensate people for useful information gleaned from them. Tweaking a piece of software by adding a nice line of code would mean a lifetime flow of royalties. You would be paid to use Facebook and Google and Twitter, but you would also pay to use them because all accounting would now be honest, and the people who built those services are worth something too. The complications widen the eyes. Because we are always both generating and exploiting data, economic activity would be ambient and ubiquitous. Any distinction between work and free time collapses. Instead of becoming serfs, we all become hustlers.
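
A rough sketch of what such a ledger might have to do helps show what the scheme demands. This is my reading of the idea, not Lanier’s specification; the contributor names, the per-use rate, and the provenance records below are all hypothetical.

```python
# Toy sketch of a "nanopayment" ledger (my reading of the idea, not a spec from
# the book). Contributor names, rates, and provenance records are invented.
from collections import defaultdict

royalty_ledger = defaultdict(int)  # contributor -> credit, in thousandths of a cent
NANOPAYMENT = 1                    # hypothetical flat rate: 1/1000 of a cent per use

# Provenance: which human contributions each derived output drew on.
provenance = {
    "translated_page": ["translator_alice", "translator_bob"],
    "autocomplete_fix": ["coder_carol"],
}

def record_use(output_id: str) -> None:
    """Credit every contributor whose work the derived output was built from."""
    for contributor in provenance.get(output_id, []):
        royalty_ledger[contributor] += NANOPAYMENT

# Each time the service serves a derived result, the original humans get paid.
for _ in range(1000):
    record_use("translated_page")

print(dict(royalty_ledger))
# {'translator_alice': 1000, 'translator_bob': 1000}: a penny each after 1,000 uses
```

Small as it is, the toy makes the scale of the bookkeeping plain: every derived artifact needs an auditable record of the people behind it, which is exactly where the complications begin.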

Even if we accept that today’s situation is so dire that the bar is low for alternative ideas, Lanier does not do enough reckoning with the implications of his scheme. He calls his book “a work of nonnarrative science fiction, or what could be called speculative advocacy,” but this admirably frank admission only goes so far. He lays out the core concepts of Who Owns the Future? in the first 25 pages. The subsequent 300-plus pages do not follow a clear progression. Like the friend who e-mails you too often because some mundane event reminded him of his latest big idea, Lanier is too easily led down the side streets of his own digressions.

Instead, Lanier might have grappled more fully with foreseeable difficulties. As he acknowledges, a nanopayment system would require everyone to have a universal online ID. Despite appearing to harbor profound objections to the “spying” of Google and Facebook, Lanier merely does some hand-waving when confronted with the privacy issues that a universal ID would raise. He posits that in Lanier World the middle class would make more money and have more to spend, creating broad-based growth—the economy would not be “zero-sum.” But Lanier does not sufficiently explain why this is so. If we were all properly paid for collectively building Facebook, wouldn’t we all have to spend a commensurate amount to use it? And why would we pay in more than we were paid out? If “digital information is really just people in disguise,” where does the excess value that constitutes economic growth come from, and who collects it? This question may have a reasonable answer, but Lanier makes no real attempt to address it.

Perhaps more seriously, Lanier doesn’t adequately contend with how exhausting and depressing it sounds to hustle for and hemorrhage cash more constantly and visibly than we already do. Is your friend e-mailing you only to collect a dime from the data he supplied to Gmail? The economic sphere becomes essentially the only sphere. While elsewhere Lanier seems delighted by the spirit of the amateur and the volunteer, here he is too untroubled by a world where everyone is always working the room like a pro. 

The failings of Lanier’s vision, however, should not obscure his achievements. His book not only makes a convincing diagnosis of a widespread problem, but also answers a need for moonshot thinking. If the alternate society he proposes looks too flawed, he deserves to be applauded for at least putting one forward, no matter who might look at him funny while he gives his office-wide speech. Lanier hopes that one possible outcome of the book is that “the deep freeze of convention will have been thawed a little by this exercise,” making way for necessary new ideas. I hope so too.