
It shouldn’t take a young girl’s death to shame social media into action

Opinion: 100% Media 0% Nonsense

Why does it take the death of a young girl to get answers from Big Tech in public and on the record?


Sometimes you need an outsider to come onto your turf and show how inept we have become.

Oliver Sanders, a lawyer representing dead teenager Molly Russell’s family at her inquest last week, asked a Meta (Facebook) executive questions that I doubt many advertisers or media agencies have asked often enough, if ever, in recent years.

He asked why Instagram allowed children on the platform when it was “allowing people to put potentially harmful content on it”. He suggested Meta “could just restrict it to adults”.

Molly, 14, from Harrow in north-west London, killed herself in November 2017 after viewing extensive amounts of content online related to suicide, self-harm, depression and anxiety on Instagram (Facebook’s sister brand, also owned by Meta), YouTube, Twitter and Pinterest.

Elizabeth Lagone, Meta’s head of health and wellbeing, acknowledged at the inquest that some of the posts and videos had broken Instagram guidelines which prohibit the glorification, encouragement and promotion of suicide and self-harm.

“We are sorry that Molly saw content that violated our policies, and we don’t want that on the platform,” she said.

Lagone was taken through Instagram posts that were saved, liked and shared by Molly six months before she took her own life. The first batch included content that Molly’s family believe encouraged suicide and self-harm, which would have been against Instagram guidelines at the time.

The Meta exec said they were “by and large” permissible under the platform’s guidelines because they represented an attempt to raise awareness of a user’s mental state and share their feelings. However, she conceded that at least two of the posts shown would have violated Instagram’s policies.

So there we have it. But at this point, you have to wonder: why does it take the death of a young girl to get answers from Facebook in public and on the record in this way?

‘Just a business in America’

Then came the challenges that any concerned human being, let alone a bereaved parent or their lawyer, would naturally give in response.

“[Instagram] is an inherently unsafe environment and it is dangerous and toxic to have 13- and 14-year-olds alone in their bedrooms scrolling through this rubbish on their phones,” said Sanders.

Lagone replied: “I respectfully disagree.”

Then, raising his voice, Sanders said: “Why on earth are you doing this?” He pointed out that Instagram was choosing to put content “in the bedrooms of depressed children”.

“You have no right to. You are not their parent. You are just a business in America,” Sanders said.

Why is it, then, that our society has just accepted that a 13-year-old child is allowed to use a social media app like Instagram, or Facebook, or TikTok, or YouTube, or Twitter, or Reddit, where other users upload content instead of professional media companies?

It wasn’t because a team of scientists, sociologists, and anthropologists told us that 13-year-olds are ‘old enough’ to handle it. If anything, they would say the opposite: 13-year-old kids are still more than a decade from having a fully developed prefrontal cortex, the part of the brain involved in decision-making and impulse control.

It’s because of a US law written in 1998. The Children’s Online Privacy Protection Act (COPPA) was intended to prevent online platforms from collecting the personal data of children under the age of 13 for ad targeting and tracking. However, COPPA has become an excuse for Big Tech companies, which are indeed usually “just businesses in America”, to use 13 as a minimum age limit. Like some sort of absent parent, other countries like the UK seem to have just accepted this standard without anyone ever stopping to ask, as Sanders would put it, “why on earth are we allowing this?”

I was the same age as Molly in 1998, and also spent a lot of time online. But the internet experience was completely different to today. For a start, internet speeds and computers just weren’t fast enough to deliver the sheer amount of high-resolution video content that is at our fingertips today. The first banner ad was only published on Wired’s website in 1994, when ads were essentially bought and created as primitive online classifieds, before the days of hyperfast programmatic trading and dynamic creative.

The laws governing internet content are quite simply out of date. Some of them were written at a time when we thought the internet was little more than an online magazine. We could not imagine how quickly it would turn into a supersonic video machine that not only spits out content at a lightning pace but threatens to record our behaviour forever.

And it’s not like Meta, in particular, could not have predicted this, according to the highly credible evidence given by former employee Frances Haugen to the US Senate last year. Recalling her time at Facebook, Haugen said the company “repeatedly encountered conflicts between its own profits and our safety. Facebook consistently resolved those conflicts in favor of its own profits.” Meta has denied this claim.

She added: “[T]heir profit optimizing machine is generating self-harm and self-hate — especially for vulnerable groups, like teenage girls. These problems have been confirmed repeatedly by Facebook’s own internal research.”

Media agency founder Nick Manning, who has long argued in his Media Leader column for advertisers to put more pressure on internet platforms to clean up the online environment, tells me: “[T]he social media companies don’t moderate their content sufficiently because they chase scale. Facebook can’t moderate 2 billion active daily users.

“While scale drives the business model, they are effectively leaving themselves open. Only governments can stop this.”

We’ll see whether UK Prime Minister Liz Truss, who has had the worst start to a new job since the captain of the Titanic, will maintain support for the Online Safety Bill proposed by her predecessor Boris Johnson.

Silence from across the industry

At last week’s inquest, coroner Andrew Walker said Instagram created “risk and danger” for users. “You create the risk, you create the danger, and then you take steps to lessen the risk and danger,” he said.

Lagone replied: “We take our responsibilities seriously to have right policies and procedures in place.”

There is always a risk of unintended consequences whenever any new product, including a media product, is launched into the world. We simply don’t know what full impact the new behaviours that come with tech and media innovation will have on young people, whether it’s violent computer games, ubiquitous pornography, chat rooms, texting, or online dating.

But we now have enough reason to demand much tougher restrictions on social media, including much higher age limits, content restrictions, and data-collection restrictions for anyone under the age of 18.

“Things have to change now,” a media agency boss tells me in confidence after the inquest concluded on Friday. Why not speak out, I ask. “I get paid to put ads on there. So it’s a bit hypocritical, and also a bit of a risk.”

If I had a penny for every time I heard that… I still wouldn’t be as rich as Mark Zuckerberg, a man so inured to criticism and condemnation that you really do wonder whether anyone on planet Earth is able to reach him.

I hope my suspicions are wrong and that advertisers and media agencies have been talking tough in private with Meta and other social media platforms, even if too few say anything publicly. Many will remember the rare example of then Initiative CEO Mat Baxter announcing in 2018 that he would advise clients to stay off Facebook “entirely” after the company was found to share users’ private messages with advertisers. The silence from across the industry that met Baxter’s rallying cry was telling.

Maybe things will begin to change. But it’s too late for Molly and her family.



100% Media 0% Nonsense is a weekly column by the editor. Feedback is welcome in the comments or by emailing [email protected]

This column features in the Monday edition of our daily UK newsletter and the weekly US round-up. Sign up for our newsletters here. 


Carole Lydon, Freelance Writer & Editor, on 06 Oct 2022
“Great piece Omar, devastating for Molly's family and being repeated in families all over the world. I side with the great Bob Hoffman on his stance that tracking and surveillance for the false dream of effective advertising is at the ugly core of the problem. An entire ecosystem addicted to the profit of the transaction, not caring about the catastrophic end result. Sadly I'm not sure anyone is 'talking tough in private'. Yet.”
Tim Bleakley, CEO, Ocean, on 04 Oct 2022
“Very good piece Omar and the industry should unite around this topic and be noisy about applying pressure to ensure the Online Safety Bill becomes law. For years and years I have been flabbergasted by the lack of accountability & legislation around online content and I find the lack of diligence and care for those who are supposedly custodians of brands and brand reputations staggering. I really hope the worm is turning.”
Ian Redman, Media & data privacy consultant, on 03 Oct 2022
“A forceful and timely piece. If this shocking case does not break the uncomfortable silence from agencies and brand owners, nothing will.”
Claire Foss, Owner, Waterfall, on 03 Oct 2022
“Fantastic piece. As someone who's been working in/writing about the sector for years, and now with a 7 year old girl, this issue is hugely on my mind and under discussion by other parents I know. The content, the OS, the very nature of what social media apps ask of the user is hugely unsuitable for any kid - and frankly, for most adults too. Many people of all ages struggle with what these apps do to your brain and sense of self - as evidenced by the regular social media 'quits' or 'breaks' that even my 39-year old peers take. They're addictive, crazy spaces that are seen as a party we somehow have to be at - and as you say, somehow, society has just accepted that this is OK.”
