The Pew Research Center today released the results of its latest survey of Facebook users, this time about the ad categories they’ve been assigned by the site’s algorithms. The users’ surprising ignorance of the information Facebook accrues highlights the disconnect between the platform and its users.
According to the results, most users weren’t even aware of the list of ad categories Facebook keeps on them: 74 percent said they “didn’t know the categories existed.” So just to get everyone up to speed, Facebook allows you to browse the ad categories to which it’s assigned you. You’ll be able to see everything from which political leaning you’ve been classified under to which entertainment options you enjoy. You can view yours here.
Not everyone gets a list — the study found 11 percent weren’t assigned categories. But that means that, of the people who did get a list, 88 percent weren’t aware of it until they were directed to the page.
When asked about the accuracy of the categories, users admitted Facebook was right more often than not. 59 percent said the categories “represented them somewhat or very accurately,” while 73 percent said the political label assigned to them “accurately describes their views.” And 60 percent felt the racial or ethnic affinity assigned to them was accurate.
That said, even when Facebook is right, it still feels wrong. At least, that seems to be the takeaway from the rest of the survey. Of the people whom Facebook assigned categories, 58 percent said they weren’t comfortable with the lists. And why would they be? Facebook is notoriously cagey about how it commodifies user information, even using details that you may not have listed on your profile.
Facebook in general, and Mark Zuckerberg in particular, have always defended their collection of ad-tailoring data by saying that users prefer relevant ads. During his Congressional testimony last year, Zuckerberg responded to a question from Florida Senator Bill Nelson — “Are you actually considering having Facebook users pay for you not to use the information?” — by saying:
What we found is that even though some people don’t like ads, people really don’t like ads that aren’t relevant. And while there is some discomfort for sure with using information in making ads more relevant, the overwhelming feedback that we get from our community is that people would rather have us show relevant content there than not.
Perhaps Zuckerberg means that. But if this study’s results are anything to go by, the “community” might appreciate relevant ads only because it doesn’t know how they’re generated. When actually confronted with the data behind those ads, a large portion of users appear far less comfortable with it than Facebook would have you believe.
We’ve contacted Facebook for comment on the Pew study’s findings and will update if we receive a reply.
Facebook Algorithms and Personal Data on Pew Research Center