Facebook flags thousands of kids as interested in gambling, booze

Credit to Author: Lisa Vaas | Date: Fri, 11 Oct 2019 11:12:38 +0000

We know that Facebook tracks what we do to flag our interests for use in targeted advertising.

But the algorithm it uses to do so is marking hundreds of thousands of kids as being interested in booze and gambling, which could lead to them being targeted with ads that aren’t appropriate to show to minors, according to a joint investigation by the Guardian and the Danish Broadcasting Corporation (DBC).

The investigation found that Facebook’s ad tools flag 740,000 children under the age of 18 as being interested in gambling. Another 940,000 kids are marked as interested in alcoholic beverages.

As the Guardian points out, such interests are automatically generated, based on what the platform observes of a user’s activity. That data then gets fed to advertisers, who can use it to target specific subgroups that show signs of interest in whatever they’re pushing.

Facebook said in a statement that advertising alcohol or gambling to minors on the social network is forbidden:

We don’t allow ads that promote the sale of alcohol or gambling to minors on Facebook and we enforce against this activity when we find it. We also work closely with regulators to provide guidance for marketers to help them reach their audiences effectively and responsibly.

But there are reportedly instances where Facebook will, in fact, let advertisers target kids based on these age-inappropriate interests. A Facebook insider told the investigation’s reporters that, for example, an anti-gambling service might use the category to reach out and offer support to children who might have a gambling problem.

The Guardian also highlights a more insidious example of how such targeting might be used: young people addicted to video games such as Fortnite, Candy Crush and Call of Duty – addicts whom the UK’s National Health Service (NHS) recently opened a clinic to treat.

Developers of such games, with their profitable loot boxes of consumable virtual items that can be redeemed for yet more virtual loot, could target their ads at children who’ve been flagged as having an interest in gambling – all without breaching Facebook’s rules against marketing gambling to kids.

Facebook actually got in trouble earlier this year for knowingly refusing refunds to parents whose kids didn’t realize that the money they were spending in games like Ninja Saga was real. That naivete led to kids unknowingly racking up thousands of dollars in charges.

Advertisers who decide to skirt Facebook’s rules and push prohibited content to children thus have preselected audiences, thanks to Facebook’s flagging of users by interest. Nor does the platform have a proactive way to stop them: it relies primarily on automated review to flag prohibited ads, and those reviews don’t necessarily stop the ads from running in the first place.

The Guardian points to a recent lawsuit Facebook settled over this failing. In January, UK financial celeb Martin Lewis sued it into creating a scam ads reporting tool and donating £3m to a consumer advocacy group.

Lewis’s name and face had been slathered on all sorts of financial scams that he’d never endorse – scams that Facebook’s detection tools repeatedly failed to block. In fact, Lewis’s suit claimed that Facebook had published over 50 fake advertisements that used his face and name without his permission.

The boozy, gambling kids story isn’t the first time Facebook’s been called out for improperly tagging users, the Guardian points out. In May 2018, it was found to be letting advertisers target its users based on sensitive categorizations that are supposed to be off-limits under data protection laws.

Just one month later, the Guardian and the DBC found that Facebook had algorithmically tagged 65,000 Russians as being “interested in treason,” potentially putting them at risk of retribution in their homeland. Following inquiries from journalists, Facebook removed the “treason” category.

Will the algorithmically armed beast do the same for “alcohol” and “gambling,” or figure out some way to keep such terms from being available to advertisers who might use them to target minors?

Right now, Facebook doesn’t seem inclined to admit there’s a problem, but if it does, and if it does anything about it, we’ll let you know.
