John Naughton 

Don’t post on Facebook unless you are prepared to face the consequences

Facebook’s advertising software is beautifully engineered but it often produces ugly results
  
  

Facebook is not the best place to share news of a cancer diagnosis. Photograph: Yui Mok/PA

Earlier this month Anne Borden King posted news on her Facebook page that she had been diagnosed with breast cancer. Since then, she reports, “my Facebook feed has featured ads for ‘alternative cancer care’. The ads, which were new to my timeline, promote everything from cumin seeds to colloidal silver as cancer treatments. Some ads promise luxury clinics – or even ‘nontoxic cancer therapies’ on a beach in Mexico.”

The irony is that King is the last person likely to fall for this crap. She’s a consultant for the watchdog group Bad Science Watch and a co-founder of the Campaign Against Phony Autism Cures. So she effortlessly recognised the telltale indicators of pseudoscience marketing: unproven and sometimes dangerous treatments promising simplistic solutions and support. In that sense she is the polar opposite of, say, Donald Trump.

But one sentence in her thoughtful article brought me up short. “When I saw the ads,” she writes, “I knew that Facebook had probably tagged me to receive them.” At which point I began to wonder whether even such a sophisticated user of social media really understood how it works. Of course Facebook had tagged her: as a cancer sufferer, as someone interested in cancer, or perhaps both. The platform’s algorithms log everything a user does in order to build a profile that helps advertisers reach users who meet particular demographic or other criteria.
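
To make the tagging concrete, here is a minimal sketch, in Python, of what such profiling amounts to. Everything in it, from the keyword table to the tag labels, is invented for illustration: Facebook’s real pipeline is proprietary and vastly more sophisticated.

```python
from collections import Counter

# Hypothetical keyword-to-interest mapping, invented for illustration.
# Facebook's actual taxonomy and profiling pipeline are not public.
KEYWORD_TO_INTEREST = {
    "breast cancer": "health: cancer",
    "chemotherapy": "health: cancer",
    "colloidal silver": "alternative medicine",
    "cumin seeds": "alternative medicine",
}

def build_profile(activity_log):
    """Turn a stream of user actions (posts, likes, searches)
    into a weighted bag of targetable interest tags."""
    profile = Counter()
    for action in activity_log:
        text = action["text"].lower()
        for keyword, interest in KEYWORD_TO_INTEREST.items():
            if keyword in text:
                profile[interest] += 1
    return profile

# A single post is enough to acquire a targetable tag.
log = [{"type": "post", "text": "I have been diagnosed with breast cancer."}]
print(build_profile(log))  # Counter({'health: cancer': 1})
```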

What this means is that the only way to understand how Facebook works is to go into it as an advertiser – ie as a customer – rather than as a mere user. What you then encounter is an utterly brilliant automated system that guides you through the process of creating a “custom audience” for whatever advertising message(s) you wish to propagate. And as you work through it, the system will come up with ideas for categories of users you may not have thought (or indeed even dreamed) of but which might be relevant for your campaign.
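
A toy version of that audience-building flow might look like the sketch below. Again, the profiles and the `match_audience` function are made up; the real system operates at a scale and granularity no sketch can capture.

```python
# Hypothetical user profiles of the kind the earlier sketch would produce.
USERS = {
    "alice": {"health: cancer", "running"},
    "bob": {"alternative medicine"},
    "carol": {"health: cancer", "alternative medicine"},
}

def match_audience(profiles, requested_tags):
    """Return every user whose profile overlaps the advertiser's
    requested tags: a crude 'custom audience' selection."""
    return {user for user, tags in profiles.items() if tags & requested_tags}

print(sorted(match_audience(USERS, {"health: cancer"})))  # ['alice', 'carol']
```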

In King’s case, for example, vendors touting cumin seeds will probably have been asked if they would like to include users who have recently been diagnosed with breast cancer in their “custom audience” as well as those who have expressed interest in “non-toxic cancer therapies” or vitamin D. For advertisers, the object of the exercise is to widen the range of people who might be receptive to their messages. For Facebook, the objective is to increase the advertising spend. So it’s a win-win scenario for both parties. For users like King, though, it may be anything but.

The whole point of the Facebook advertising machine is precisely that – it’s a machine. It knows only correlations between advertisers’ requirements and users’ profiles. It neither knows nor cares about cancer or anything else.
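
That machine-like indifference is easy to reproduce. The sketch below suggests extra targeting categories from nothing more than co-occurrence counts across profiles; all names and data are invented, and nothing in the code knows or cares what any tag means.

```python
from collections import Counter

# Hypothetical profiles; invented for illustration only.
USERS = {
    "alice": {"health: cancer", "vitamin D"},
    "bob": {"alternative medicine", "vitamin D"},
    "carol": {"health: cancer", "alternative medicine"},
    "dave": {"health: cancer", "alternative medicine", "vitamin D"},
}

def suggest_categories(profiles, seed_tag, top_n=3):
    """Suggest additional targeting categories by counting which tags
    co-occur with the seed tag across profiles. Pure correlation:
    the code has no notion of what 'cancer' or 'vitamin D' mean."""
    co_counts = Counter()
    for tags in profiles.values():
        if seed_tag in tags:
            for other in tags - {seed_tag}:
                co_counts[other] += 1
    return [tag for tag, _ in co_counts.most_common(top_n)]

print(suggest_categories(USERS, "health: cancer"))
# e.g. ['vitamin D', 'alternative medicine']
```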

This has been dramatically demonstrated in experiments conducted by imaginative journalists. In September 2017, for example, researchers from ProPublica did a test to see if the machine would help them to promote three posts to antisemitic users.

It did. At one point in the process, for example, the automated system asked the researchers if they wished to “INCLUDE people who match at least ONE of the following: German Schutzstaffel, history of ‘why Jews ruin the world’, how to burn Jews, Jew hater”. “Your potential audience selection is great!” it told the researchers. “Potential audience size: 108,000 people.” And all for $30. After ProPublica contacted Facebook, the company removed the antisemitic categories and said it would explore ways to fix the problem, such as limiting the number of categories available or scrutinising them before they are displayed to buyers.

There’s no point in trying to anthropomorphise this. Facebook is clearly not run by Nazis. But what its software engineers have built is an incredibly powerful, beautifully engineered machine for matching advertisers with people who might be receptive to their messages. And it’s clear that advertisers love that machine because it gives them a warm feeling that their advertising budgets may be spent more effectively on Facebook than on billboards or TV ads. Which, sadly, also means that the much-hyped advertising boycott spurred by the #blacklivesmatter protests will have little impact on Facebook’s bottom line. Morals matter, but money talks.

Which brings us back to a woman wrestling with a cancer diagnosis and confiding it to her Facebook page. One of the great things about social media in their infancy was the way they enabled people to reach out to others and find support and help when they needed it most. And maybe that still happens. In those early days, though, communications were not curated by algorithms designed to extract value from them. But the business model that we now call surveillance capitalism put paid to that, which is why you should never post anything on Facebook without being prepared to face the algorithmic consequences.

What I’ve been reading

Efficiently unnerving
The dark underbelly of our worship of “efficiency”. Thoughtful essay by Tim Bray on his blog.

Driven to succeed
How Tesla cracked the code of automobile innovation. Fascinating insider account by Philippe Chain on the Monday Note blog. I wouldn’t like to work there.

An AI for everyone
Don’t ask if artificial intelligence is good or fair, ask how it shifts power. Spot on. Great article by Pratyusha Kalluri on the Nature site.

 
