
Meta has not done enough to safeguard children, whistleblower says

Technology Desk | banglanews24.com
Update: 2024-01-25 10:35:53

Mark Zuckerberg’s Meta has not done enough to safeguard children after Molly Russell’s death, according to a whistleblower who said the social media company already has the infrastructure in place to shield teenagers from harmful content.

Arturo Béjar, a former senior engineer and consultant at the Instagram and Facebook owner, said that if the company had learned the lessons of Molly’s death and the subsequent inquest, it would have created a safer experience for young users. According to research conducted by Béjar on Instagram users, 8.4% of 13- to 15-year-olds had seen someone harm themselves or threaten to harm themselves in the past week.

“If they had learned the lessons from Molly Russell, they would create a product safe for 13-15-year-olds where in the last week one in 12 don’t see someone harm themselves, or threaten to do so. And where the vast majority of them feel supported when they do come across self-harm content,” Béjar told the Guardian.

Russell, a 14-year-old girl from Harrow, north-west London, took her own life in 2017 after viewing harmful content related to suicide, self-harm, depression and anxiety on Instagram and Pinterest. In a landmark ruling in 2022, an inquest into her death found that Molly “died from an act of self-harm while suffering from depression and the negative effects of online content”.

Béjar said Zuckerberg had the tools at his disposal to make Instagram, in particular, safer for teenagers, but that the company had chosen not to make those changes.

“They either need a different chief executive or they need him to wake up tomorrow morning and say: ‘This kind of content is not allowed on the platform’, because they already have the infrastructure and the tools for that [content] to be impossible to find.”

Béjar’s research at Instagram, and attempts to get the company to act on it, feature in a lawsuit brought against Meta by Raúl Torrez, the New Mexico attorney general, that claims Meta fails to protect children from sexual abuse, predatory approaches and human trafficking. Unredacted documents from the lawsuit show that Meta employees warned the company was “defending the status quo” in the wake of Molly’s death when “the status quo is clearly unacceptable to media, many impacted families and … will be unacceptable to the wider public”.

Béjar’s responsibilities as an engineering director included working on child safety tools and helping children cope with harmful content such as bullying material. Having left the business as a senior engineer in 2015, he returned as a consultant in 2019 for a two-year period during which he conducted research showing that one in eight children aged 13 to 15 on Instagram had received unwanted sexual advances, while one in five had been victims of bullying on the platform and 8% had viewed self-harm content.

The former Meta employee has called on the company to set goals around reducing harmful content. “That creates the incentive structure for them to work on these things over a long period of time,” he said.

Béjar has urged Meta to undertake a series of changes, including: making it easier for users to flag unwanted content and state why they don’t want to see it; regularly surveying users about their experiences on Meta platforms; and making it easier for users to submit reports about their experiences on Meta services.

Béjar continues to monitor the Instagram platform and says harmful content – including self-harm material – remains on the app as well as clear evidence of underage users. Instagram has a minimum age limit of 13.

Béjar has been meeting politicians, regulators and campaigners in the UK this week, including Molly’s father, Ian Russell, whose Molly Rose Foundation facilitated his visit. Béjar testified before Congress last year, detailing his experience at the company and the “awful experiences” of his teenage daughter and her friends on Instagram, including unwanted sexual advances and harassment.

It would take Meta three months to carry out an effective crackdown on self-harm content, Béjar added. “They have all the machinery necessary to do that. What it requires is the will and the policy decision to say, for teenagers, we’re going to create a truly safe environment that we’re going to measure and report on publicly.”

A Meta spokesperson said: “Every day countless people inside and outside of Meta are working on how to help keep young people safe online. Working with parents and experts, we have introduced over 30 tools and resources to support teens and their families in having safe, positive experiences online. All of this work continues.”

Meta points to numerous safety initiatives, including automatically setting the accounts of under-16s to private mode when they join Instagram, restricting adults from sending private messages to teenagers who don’t follow them, and allowing Instagram users to report bullying, harassment and sexual activity.

Source: The Guardian 

