Facebook’s Australia division uses sophisticated algorithms to identify users as young as 14 years old who feel “worthless” and “insecure” to target advertising to them, a leaked internal document revealed.
The Australian released the 23-page document, marked “Confidential: Internal Only” and dated 2017, on Monday. In it, Facebook executives describe how they can identify “moments when young people need a confidence boost” by monitoring users’ posts, pictures, interactions, and internet activity in real time. The social media giant can tell when young people in particular feel “stressed,” “defeated,” “overwhelmed,” “anxious,” “nervous,” “stupid,” “silly,” “useless,” and like a “failure,” the document stated.
This research appears to breach the Australian Code for Advertising & Marketing Communications to Children, whose guidelines state that a child age 14 or younger “must obtain a parent or guardian’s express consent prior to engaging in any activity that will result in the collection or disclosure of … personal information,” according to The Australian.
“We have opened an investigation to understand the process failure and improve our oversight. We will undertake disciplinary and other processes as appropriate,” a Facebook spokesperson said in a statement sent to The Australian when the paper contacted the company with its findings.
The document also describes how Facebook gathers psychological information on 6.4 million high schoolers, college students, and young professionals to sell targeted advertising. Information available to advertisers includes a young person’s relationship status, location, number of Facebook friends, and how often they access the platform via mobile or desktop, The Australian reported.
Facebook also looks to sell ads to young people based on when they are interested in “looking good and body confidence,” and “working out and losing weight.” The platform uses image recognition tools to see what kinds of foods users are eating, including via Instagram, which Facebook owns.
In its statement to The Australian, the Facebook spokesperson did not say whether this practice occurs in other countries.
“While the data on which this research is based was aggregated and presented consistent with applicable privacy and legal protections, including the removal of any personally identifiable information, our internal process sets a standard higher than required by law,” the statement read. “Facebook only permits research following a rigorous procedure of review whenever sensitive data, particularly data involving young people or their emotional behaviour, is involved. This research does not appear to have followed this process.”
The information raises ethical concerns about the ubiquitous targeted advertising that users face on many websites. It also raises questions about what kind of personal data should or should not be used to tailor ads to users, especially those younger than 18.
This isn’t the first time Facebook’s practices have come into question. In 2014, the company published the results of an experiment on manipulating the emotional content of users’ News Feeds, which it performed without their knowledge or consent. The social media giant has also guessed users’ “ethnic affinity” to target advertising to them, and has allowed advertisers to exclude users by race. Despite these concerns, the platform still has 1.86 billion monthly active users, making it one of the largest social media properties in the world.
The 3 big takeaways for TechRepublic readers
1. According to a leaked internal document reported by The Australian, Facebook’s Australia division uses algorithms to identify young users who are feeling vulnerable in order to target advertising to them.
2. Facebook has been criticized for privacy and advertising policies in the past, including its policy that allows advertisers to exclude users by race.
3. The news from Australia raises ethical concerns about targeted advertising and privacy, especially when it comes to young people.