Recent Study Reveals Most Facebook Users Are Still In Dark About Its Ad Practices

Dhir Acharya - Jan 18, 2019


Pew Research Center has investigated and found an unsettling fact: most users have little or no understanding of what Facebook is doing with their data.

Pew Research Center conducted a study revealing that most Facebook users don’t know how the company tracks and profiles them for targeted advertising.

From September 4 to October 1, 2018, Pew carried out a survey on GfK’s KnowledgePanel, with 963 respondents aged 18 or older.

Facebook surveillance

The study gives us some critical numbers

74 percent of Facebook users were not aware that the social network keeps a list of their interests and traits which it uses for ad-targeting purposes. The issue only came to light for these users when researchers helped them view their ad preferences page on the platform.

51 percent of users said they found Facebook’s data compilation uncomfortable.

27 percent said Facebook’s lists of ad preferences represented them not very or not at all accurately.

Notably, for 88 percent of Facebook users, the platform had generated some material on their ad preferences page.

How do your users have “complete control,” Facebook, when they don’t even know their data are being collected?


Last year, Mark Zuckerberg testified that Facebook users have complete control over what they upload to the platform and which of their data are collected for targeting ads.

But as the question above asks, how can users exercise the complete control Zuckerberg promised? Perhaps US policymakers should also ask themselves this question when drafting a comprehensive federal privacy law.

Pew’s findings show that users exercise little control over how Facebook’s algorithms link data to their identities, because they barely know how the company operates.

The company essentially markets the social network as a service for maintaining connections among people rather than as a tool for stalking and profiling them. So it’s hard for an average user outside the industry to understand how it actually works.

It’s really hard for average users to understand


Yes, and this is not a bug Facebook will fix; it’s a feature of its business. Think about it: Facebook knows what you like and shows you ads that interest you, which in turn brings it profit. If you knew about the process, you would most likely limit the data collection Facebook conducts, hurting its business. No, Facebook doesn’t want that at all.

So when the public wants to regulate privacy, Zuckerberg requested a...

That means something that lures you into a feeling of security but is actually just another attempt to hide the real business.

Even though Facebook does offer a section for managing ads, a normal user can hardly locate and understand it, given the complex process and the multiple interactions required.

Additionally, an average user is unlikely to go through a series of tedious steps, from searching for “ad management” to figuring out what each toggle and click does.

In another attempt to win users over, Facebook’s explanation of self-serve ads says targeted advertisements are useful for users. What it doesn’t mention is the huge trade-off in privacy, nor does it offer users a complete opt-out of its surveillance and data collection.

It only offers limited settings to “influence what ads you see,” which is a far cry from control. No matter what Zuckerberg told Congress, Facebook still doesn’t give its users a simple way to understand their ad options.

Its algorithm puts users in categories for political purposes

Facebook profiles its users to the point of labeling them by political views and racial, multicultural, or ethnic affinities.

According to Pew researchers, Facebook labels 51 percent of its users as having a political affinity and 21 percent with a multicultural affinity.

Of that 51 percent, 73 percent said they were accurately labeled, while 27 percent said the opposite.

This kind of classification is used for political purposes, which has erupted into major privacy scandals in the past few years, like the massive Cambridge Analytica affair, in which user data were exploited for campaign purposes.

Facebook ads have also been used to evade campaign-spending rules in elections. For example, during the UK’s 2016 EU referendum, a great deal of advertising was targeted non-transparently with the help of social networks. Also in 2016, during the US presidential election, large amounts of political misinformation were targeted at Facebook users as part of Kremlin-backed propaganda to interfere with the election.

Last year, the UK’s data watchdog, citing concerns over Facebook’s data practices, urged a halt to the use of social networks for political campaigning.

Regarding the 21 percent labeled with a multicultural affinity, Pew found that 60 percent said they do have a strong affinity for the group Facebook assigned them, while 37 percent said otherwise. In addition, 43 percent of those users were assigned an affinity for African American culture, the same percentage for Hispanic culture, and 10 percent for Asian American culture.

Facebook offers no other affinity categories in the US, such as Caucasian, which points to the system’s inherent bias.

Facebook’s use of algorithms to categorize users has caused controversy in recent years over discriminatory ads. So in late 2016, the company said it would stop using the ethnic affinity label for ad targeting related to housing, credit, and employment.

Clearly, user classification poses risks, and users are not okay with it.

Regulators step in, putting Facebook’s business at risk

After several data breaches were discovered, regulators started looking into Facebook and drafting privacy laws to protect users from malicious data collection.

Discussing the matter, Michael Veale, a researcher in machine learning and data rights at University College London, suggested that by calling it “manifestly excessive” for users to want to see how their data are collected and profiled, Facebook was really worried that they would be terrified by what they see.


From May 29 to June 11, 2018, Pew conducted a second survey using its American Trends Panel and a sample of US adults who use social media platforms such as Facebook, Instagram, and Twitter. From this survey, Pew discovered that most users believe social networks can easily determine key facts about them from the data collected on their behavior.

Accordingly, 84 percent said social media can determine their ethnicity or race, 79 percent said the same for interests and hobbies, and 65 percent for religious beliefs. And while 71 percent said the same for political affiliation, fewer than 28 percent said the opposite.

So it appears that even when users don’t know exactly what social platforms are doing with their information, most of them believe it’s easy for those companies to profile them.
