Role of social platforms and parents amid ‘biggest social experiment on children in history’

Social media companies are fully aware that young teenagers are lying about their ages when using their platforms and this could expose children to content meant for adults, industry leaders heard at a private event in London this week.

“We know that, on all platforms, there might be 13- or 14-year-olds who might be representing as 18- or 19-year-olds,” a policy executive with deep knowledge of social media platforms said at the event.

The delegate later told The Media Leader that, despite social media CEOs regularly stating that no under-13s are allowed on their platforms, none can actually guarantee the complete absence of underage users, given the lack of privacy-conscious and non-discriminatory tools for verifying users’ ages.

The Media Leader was invited to this discussion as an observer, on the condition that participants would not be identified.

Social media companies do attempt to use signals in user data, such as information contained in publicly shared posts, to ban users suspected of being under the age of 13, although this is an imprecise solution.

Providing more tools to parents has therefore also become a necessary development for social media companies.

Meta has previously argued that vetting teens’ ages should not fall on the apps themselves, but on app stores run by companies such as Apple and Google.

Facebook’s owner has called for legislation that would require parents to approve teens’ app downloads at the app store level, rather than after accounts are created within social apps themselves.

The same policy expert told The Media Leader that this idea has received some traction among governing authorities in France and the UK, although it is still early in its development.

Parents left to navigate alone

The Conscious Advertising Network (CAN), a not-for-profit advocating for “human rights-based advertising”, has previously highlighted the challenges of online safety for children and adults. At its Conscious Thinking Live event last autumn, Baroness Beeban Kidron, a leading advocate for children’s rights to online safety, said social media companies “talk a good talk” on online safety, but stridently fight attempts at regulations in court.

“The problem is that in the pursuit of the advertiser’s audience, the design of products and services is entirely geared to hold attention, snap up data and amplify networks. The pursuit of that holy grail creates unacceptable outcomes for kids,” Kidron said.

She championed last year’s passage of the Online Safety Act, which created a new duty of care for online platforms operating in the UK, requiring them to police both content that is illegal and content that is “harmful” to teens, even if not explicitly illegal. It also requires platforms to prevent children from accessing age-inappropriate content and provide parents and children with accessible ways to report problems when they arise.

Campaigners against the act have argued that it is a “recipe for censorship”, given it orders social media platforms to take down some speech even if it is not unlawful. Other critics have referred to broader concerns around online harms and social media use as a moral panic.

CAN co-founder and co-chair Jake Dubbins told The Media Leader: “The key point for me is that these platforms collect so much data on us that it is unimaginable that they do not know 13- and 14-year-olds are misrepresenting their age on the platforms due to their user behaviour, search history, images uploaded and content produced.

“Without any sort of capacity for age verification or enforcement, much younger children will also be on the platforms and misrepresenting their age. Not only that, content and advertising will be served according to the child’s behaviour on the platform.

“More eyeballs, more advertising inventory to sell.”

Dubbins suggested more regulation of online services is urgently needed to protect children from online harms, including a “beefing up” of the Online Safety Act.

He continued: “If a 13-year-old walks into a shop and asks to buy a bottle of vodka and the shop owner sells it to that child, then the shop owner can be prosecuted. How is it different if a 13-year-old, or even a nine-year-old, pretends to be 18 and then is served porn or violent content? The platform should be responsible.”

Dubbins added that much more education is needed to equip parents with knowledge of how social media works and how algorithms prioritise user attention and engagement.

“Parents have largely been left to navigate the biggest social experiment on children in history on their own. This is not acceptable.”

Parental best practice? ‘Be curious, not furious’

The London event sought to address that knowledge gap, with much of the discussion centred on best practice for parents looking to exercise greater awareness of, and control over, their children’s online use.

One delegate advised that kids don’t want to be “judged or shamed or have their tech taken away”, so approaching conversations about online safety from an open rather than punitive perspective is most likely to be well received.

“Be curious, not furious”, they added.

According to another representative, parental oversight should be “tailored” to the needs of the individual child, with their online use becoming progressively more independent as they grow into adulthood.

They noted that “good digital etiquette starts right at the start”, emphasising the importance of setting boundaries early in order to maintain enforcement throughout the teen years. Parents should also monitor their own smartphone use and online behaviour, they recommended, given that children often model the behaviour of their parents.

In line with regulatory requirements, most social media companies offer tools that give parents a degree of control over, and visibility into, their teens’ app usage.

Meta announced updated parental controls earlier this year for Instagram and Facebook, while TikTok has a family pairing feature to allow parents to customise safety settings for their teens. Snapchat has a Family Centre that allows parents to view who their child has messaged over the past seven days, although not the content of those messages.

Sexual abuse material major area of concern

Online safety experts at the industry event identified a number of additional issues on which social media companies need to make progress. These include online blackmail over sexually explicit images, known as sextortion. Indeed, tackling sextortion was recognised as “a huge area of focus” for stakeholders.

There was also an acknowledgement of scams and of the solicitation of nude pictures of minors on social platforms, sometimes by older individuals who themselves lie about their age to reach out to teenage users.

One delegate, when outlining the importance of parental supervision on social media, highlighted that “safety is really about risk mitigation and management”, suggesting that teens’ use of social media platforms can remain reasonably safe provided there is vigilance and open conversation between parent and child.

Social media companies have recently come under increased fire from politicians and child-safety advocates for allowing child sexual abuse material to run rampant on their platforms.

Last month, Telegram CEO Pavel Durov was arrested in France over his alleged complicity in the “distribution of images of minors presenting child pornography” on the app, with Telegram’s comparatively lax moderation policies alleged to have allowed illegal behaviour to go unchecked. Unlike other social media companies, Telegram has previously refused to join child-protection schemes.

Some social media companies have recently sought to distance themselves from the toxicity and harms associated with social media. Snap in particular has attempted to position itself as distinct from “traditional” social media competitors, given that it does not have any “likes” or public post reactions. While Snap does serve content to users, it emphasises the app’s chat and camera functions as the main draw for consumers.

Similarly, TikTok has also sought to define itself not as a social media company but as an “entertainment platform”.
