
Selling Your Bulk Online Data Really Means Selling Your Autonomy

Big tech's war on the meaning of life

In March, a Dutch student called Shawn Buckles placed his personal data on the market. He offered to hand over all of his most intimate electronic matter—e-mails, health records, calendars, geolocational data—to the highest bidder. By mid-April, Buckles had received 53 offers. The winner of the auction was The Next Web, a popular site for technology news. It shelled out $480 for his data soul.

Buckles—who is more of an activist than an entrepreneur—succeeded in raising awareness about the strange economics of data. As the suave futurologists at the World Economic Forum in Davos noted in a 2011 report, our personal data is rapidly becoming a new “asset class.” Companies like Google and Facebook have mastered the art of monetizing their store of information about our personal tastes and traits. But the subjects of that information—namely, you and me—don’t share any of the profits from these transactions. It was only a matter of time, therefore, before individuals like Buckles began to revolt against this system, selling their own data rather than just letting Silicon Valley exploit it.

Technology companies have a response to this rebellious logic: Sure, users relinquish their data to be harnessed and sold by giant corporations, but look at all the wonderful and nominally free services that they get in return. But is that trade even fair? What if the real value of our data is much greater than the utility of those services? And far more profoundly, Buckles’s clever stunt raises a philosophical question: Should we even be allowed to sell our most intimate data in the first place? Or should governments discourage or even prohibit such transactions, perhaps on moral grounds?

On one level, the case for permitting individuals to sell their data is clear cut. It’s only fair: Since companies can already trade this data, it would be absurd to ban individuals from selling it. Furthermore, there’s immense social value in this data—and we should encourage its altruistic use. Why stop people who want to give away their health records to universities or hospitals to contribute to scientific discoveries? Ideally, we might want them to do so for humanitarian reasons, but one can think of moments when the promise of immediate monetary compensation might get the job done faster.

Illustration by Niv Bavarsky

There is, however, a vast difference between donating one’s data for humanitarian purposes and selling it to marketers, advertisers, and data brokers. Most of the data we surrender to private companies can be used to shape our lives. We allow our smartphones to access our locations—and ads, in turn, become more relevant. We search for a nutritional supplement online, and ads for weight loss soon follow us everywhere. We look up certain products online, and companies make immediate inferences about our health or our plans for the future. The tight, real-time integration of this data with commercial outlets that structure our mundane, everyday existence is responsible not just for the particular choices that we make (e.g., Coke or Pepsi?) but also for the kinds of anxieties and aspirations that determine what it is that we do and want in the first place (e.g., my smartphone senses that I might be thirsty, shows me an ad, and I do find myself very thirsty—but was I actually thirsty in the first place?).

Suppose that Buckles, having sold his personal data, decides to change his life in some profound way. Perhaps he turns to Google and searches, “Should I become a vegetarian?” It doesn’t matter what sites he discovers: He has revealed that one previously stable part of his lifestyle is now up for grabs. This triggers numerous events that might appear random but are actually well engineered by competing companies. Buckles’s supermarket offers him personalized discounts on vegetables, while his local steakhouse entices him with coupons for a tasty BBQ dinner.

Whether Buckles decides to become a vegetarian or remain a carnivore makes no difference: His decision has been shaped by factors that he has failed to perceive, let alone discount or counteract. One can imagine the kinds of nudges Buckles would get if the government joined the fray and acted on its own fears of obesity, seeking to steer Buckles—once again via his smartphone—toward vegetables rather than meat.

It’s true that, thanks to real-time information, our world is irreversibly more interactive and individualized than it was decades ago. Nowadays we expect personalized treatment, personalized advertising, personalized entertainment. There’s much to celebrate here. But there are also reasons to worry. If we had well-formed, eternal preferences, such real-time adjustments to our desires would be most welcome. But that’s not how we are—and probably not how we want to be. We should want to preserve the space to make our own life plans, reconsider our values, abandon old projects, and embark on new ones.

Such soul-searching can be a very slow process. But once we reveal that we are entering this process—via a search query, a slip in an e-mail, some random emotional outburst detected by our smart glasses—our autonomy is hijacked, as alluring messages pop up on our smartphones, seeking to shift us in a direction favorable to advertisers and to government bureaucrats obsessed with regulating how we eat, exercise, or consume energy. What makes the new data-heavy personalized advertising so cunning is that it leaves us with the illusion that we can make autonomous choices.

But to sell our intimate data in bulk is to fully surrender our quest for autonomy, accepting a life where the most existential choices are shaped either by the forces of the market or by whatever war—be it on climate change or obesity—the government has enlisted us (rather than corporations) to fight. In this world, whether we become vegetarians, and even whether we end up thinking about it, might ultimately hinge on which player (the steakhouses, the supermarkets, the bureaucrats) has the most to gain from this switch. Our data constitutes our very humanity. To voluntarily treat it as an “asset class” is to accept the fate of an interactive billboard. We shouldn’t unquestioningly accept the argument that personal data is just like any other commodity and that most of our digital problems would disappear if only, instead of gigantic data monopolists like Google and Facebook, we had an army of smaller data entrepreneurs. We don’t let people exercise their right to autonomy in order to surrender that very right by selling themselves into slavery. Why make an exception for those who want to sell a slice of their intellect and privacy rather than their bodies?

Of course, the scenario I have just described is already happening, to a large extent. We’ve made a mass decision to sell our data to Silicon Valley, in exchange for free apps and services—an unthinking decision, to be sure. The time has arrived to more rigorously consider this momentous choice; we must place a value on our autonomy and privacy—is it worth $480 or a price that can never be tabulated?

Evgeny Morozov is a senior editor at The New Republic.