
Understanding Facebook’s Algorithm Could Change How You See Yourself

The new “Why am I seeing this post?” feature could have unintended consequences—for users and advertisers alike.


When we go online these days, we know we’re not alone: The internet is looking back at us. Our clicks give us the information and products we ask for, but at the same time they provide information about us. Algorithms then make use of that data to curate our search results, our social media feeds, and the advertisements we see. The internet ascribes an identity to each of us and projects it back.

The research behind the political campaigns run by Cambridge Analytica in 2016 suggested that a few Facebook likes are enough for an algorithm to identify our gender, personality traits, sexual orientation, and religious and political beliefs. “Computer-based personality judgments,” two psychologists and a computer scientist claimed in a research paper in 2015, can “be more accurate than those made by humans.” The algorithms can end up knowing us better than our spouses do. If that’s true—not all researchers think it is—what does that mean for our own understanding of who we are?
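
To make the claim concrete: the kind of model those researchers describe can be sketched as an ordinary classifier trained on a user-by-page matrix of likes. The toy below is purely illustrative, not the study’s actual method; the pages, users, labels, and the choice of scikit-learn’s LogisticRegression are all assumptions made for demonstration.

```python
# A toy sketch: predicting a personality trait from page "likes".
# Rows are users, columns are pages (1 = the user liked that page).
# All data here is invented; real studies used vastly larger datasets.
import numpy as np
from sklearn.linear_model import LogisticRegression

pages = ["hiking_club", "poetry_daily", "esports_news", "opera_house"]
likes = np.array([
    [1, 1, 0, 1],   # user 0
    [0, 0, 1, 0],   # user 1
    [1, 0, 1, 0],   # user 2
    [0, 1, 0, 1],   # user 3
])
# Hypothetical self-reported labels: 1 = "extroverted", 0 = "introverted".
extroverted = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(likes, extroverted)

# Guess a trait for a new user from nothing but four page likes.
new_user = np.array([[1, 0, 1, 1]])
print(model.predict(new_user), model.predict_proba(new_user))
```

The point is not the particular model but the shape of the pipeline: likes go in, a labeled trait comes out, and the prediction sharpens as the matrix grows.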

The question is particularly relevant in the wake of Facebook’s April 1 announcement that it will soon boost “algorithmic transparency” for users with a “Why am I seeing this post?” button—an update of a feature that already exists for ads, and a new feature for all other posts on a person’s timeline. Most of us have little understanding of how the algorithms in charge of our internet experience work, and have no control over them beyond the information we provide, often involuntarily. The button would be one of a few small windows into how the internet decides what to show us: Amazon’s book recommendations, for example, are supposedly based on the purchases of people who ordered the same books as us. Netflix says its personalized recommendations are based on what we’ve already watched: “The more you watch, the better Netflix gets at recommending TV shows and movies you’ll love,” the company claims. Facebook’s new feature promises to uncover part of the identity that Facebook’s algorithms have assigned us, essentially letting us in on the personality quiz we unwittingly participate in each day—a good thing, right?
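
Amazon doesn’t publish its actual code, but the co-purchase logic it describes can be sketched in a few lines. Everything below—the titles, the purchase histories, the scoring rule—is a hypothetical illustration of the general idea, not the real system.

```python
# A minimal sketch of co-purchase recommendation: suggest books bought
# by people who bought the same books you did, weighted by overlap.
# All purchase data here is invented for illustration.
from collections import Counter

purchases = {
    "alice": {"Dune", "Neuromancer", "Hyperion"},
    "bob":   {"Dune", "Neuromancer", "Snow Crash"},
    "carol": {"Dune", "Foundation"},
}

def recommend(user, k=2):
    mine = purchases[user]
    scores = Counter()
    for other, theirs in purchases.items():
        if other == user:
            continue
        overlap = len(mine & theirs)   # shared purchases = similarity
        for book in theirs - mine:     # candidates the user doesn't own
            scores[book] += overlap
    return [book for book, _ in scores.most_common(k)]

print(recommend("alice"))  # ['Snow Crash', 'Foundation']
```

Simple as it is, a scheme like this already ascribes an identity: the system’s picture of “alice” is nothing more or less than the company she keeps in the purchase logs.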

People tend to love taking online personality quizzes. “Personality tests offer a structured way of self-reflection. They’re a little bit like therapy in that way,” says Simine Vazire, a professor of psychology at the University of California, Davis. And while the tests are all based on self-reporting, “self-reporting is actually quite reliable,” Vazire adds. (She is skeptical, on the other hand, about the accuracy of algorithms judging personality solely from online behavior.)

But personality quizzes can not only reveal but also shape a personality. In 2002, a Japanese study found that even when subjects took what were—unbeknownst to them—bogus pop psychology tests and were then told they exhibited a particular personality trait (extroversion, for example), they tended to believe the result and proceeded to act in ways that matched that description. “While the feedback is originally inaccurate,” the researchers wrote, “it could change the self-image of people who receive the feedback, and may eventually be perceived as an accurate description of their personality or psychological state.” Believing the tests seemed to lead people to internalize them.

So what does that mean for Facebook algorithms? The general idea that social media can affect our psychology is familiar at this point: Seeing other people’s perfectly curated lives on Instagram or a series of others’ achievements on Facebook can dent our self-esteem. And some feel that different platforms tend to accentuate certain personality traits: Twitter brings out an adversarial streak in many of its users, while Instagram encourages vanity.

Ad-targeting algorithms, it turns out, can function very similarly to personality quizzes, right down to affecting our sense of self. In 2016, three researchers tested the effect and published the results in the Journal of Consumer Research. “The data show that participants evaluated themselves as more sophisticated after receiving an ad [for a high-end watch brand] that they thought was individually targeted to them, compared to when they thought the same ad was not targeted,” they wrote. “In other words, participants saw the targeted ads as reflective of their own characteristics.”

That, in turn, alters behavior—and not just by raising the probability that individuals will purchase the advertised product, as advertisers want. In another experiment the researchers ran, participants who were shown ads for an environmentally friendly product they thought was targeted at them were not only more likely to buy the product, but also more likely to donate to an environmental charity soon after. Again, it didn’t matter whether the targeted ads were actually based on participants’ past behavior—just that participants believed they were.

According to these findings, then, the actual effect of Facebook’s “Why am I seeing this?” button could be rather unnerving: a placating show of transparency that only enhances the hold the behemoth has on our minds. The glimpse into who Facebook thinks we are could affect how we see ourselves.

The idea that personhood is something fragile, dependent on others for confirmation and reinforcement, goes back to at least the nineteenth century. German philosopher Georg Wilhelm Friedrich Hegel saw identity as something formed through social relationships of recognition with others whom we invest with the authority to grant us that identity. Many features of our personality exhibit that social dependency: One cannot continue to think oneself particularly gregarious, for example, when everyone else claims otherwise.

It is this recognition of one’s identity by someone else that people look for in personality tests; being couched in the language of psychology gives the tests the necessary authority.

Similarly, and crucially, it’s our belief that the algorithms know us that makes us susceptible to them. And therein lies one way out of the trap: Seeing why Facebook thinks a particular post is relevant to us could also make it easier to dismiss it, just as it’s easy to dismiss Amazon recommendations based on purchases we regret.

A degree of skepticism might be the best way to avoid ceding control over our identities. Ultimately, we need to remind ourselves that the platforms analyzing our online behavior are interested only in the aspects of ourselves that they can monetize. We should treat their depictions of us with the same wariness and suspicion we’d show any human salesperson trying to manipulate us.