Facebook’s ad-targeting practices came under scrutiny last spring after a report in The Australian revealed a confidential document. In a presentation for a potential advertiser, Facebook boasted that it can pinpoint when its young users—including 1.9 million high school students—are feeling “stressed,” “overwhelmed,” “anxious,” and a “failure,” the report said.
Facebook has said that the data was anonymous and was never used to target ads.
Though a federal law, the Children’s Online Privacy Protection Act, bars companies from collecting online information from kids under 13 without parental consent, 13- to 18-year-olds have fewer protections, which vary from state to state.
Facebook also got into hot water in 2016, when a ProPublica report showed that advertisers could exclude users in certain “ethnic affinity” categories from seeing their ads. Federal laws prohibit housing and job ads that discriminate based on race, gender, or other factors. Facebook doesn’t ask users their race, but it may place them in categories such as Asian-American or Hispanic based on their activity on the site.
Facebook initially defended the practice but later said it would block ethnic-affinity targeting for housing, employment, and credit ads and require marketers to pledge not to discriminate. In November, however, ProPublica reported that the practice was continuing.
“We passed civil rights laws in the 1960s and 1970s to prevent people from being excluded from core economic opportunities based on race, gender, age, and some other factors,” says Rachel Goodman, a staff attorney at the ACLU (American Civil Liberties Union). “Ad targeting has the potential to undo a lot of that progress.”
Goodman adds that many consumers don’t realize how much information they’re giving away, because privacy policies are often buried deep in complex user agreements that “nobody reads.”
“People need to have a better understanding of what the data is and where it’s going.”