Teenager's Suicide Linked to Online Bullying

Social media urged to take 'moment to reflect' after girl's death

Concern

The Children's Commissioner for England has accused social media giants such as Facebook and Snapchat of losing control over the content they distribute, telling them that recent adolescent deaths should serve as a "moment of reflection" on how they operate. Anne Longfield sent an open letter to Facebook, Instagram, WhatsApp, YouTube, Pinterest, and Snapchat, saying that the death of 14-year-old Molly Russell has drawn attention to the "horrific" content that young people can readily access online.

Incident Brief 

"The recent terrible incidents of young people who accessed sites containing very troubling material about suicide and self-harm, and who then took their own lives, should be a moment of pause," Longfield warned the companies. "I would implore you to acknowledge that there are issues and commit to addressing them – or openly admit that you are unable to do so."

Victim's Family Statement

After Molly's Instagram account was found to contain distressing material about depression and suicide, her father, Ian, said that social media was partly to blame for his daughter's death. Longfield wrote that "the risk of disrupting all user experiences should no longer be a deterrent to making young people's safety and well-being a top priority," and that "hiding behind servers and equipment in other countries" should not be an acceptable way to evade accountability. She reaffirmed her proposal for an independent "digital ombudsman" to ensure that firms safeguard children and remove troubling content as quickly as possible. "I don't believe it's a stretch to wonder if even you, the proprietors, have any influence over their material anymore," Longfield wrote.

Companies' Response

"If that's the case, children shouldn't use your services at all, and parents should be informed that any authority over algorithms and material is a fantasy," Longfield continued. A spokeswoman for Facebook, which also owns Instagram and WhatsApp, said: "We have a significant responsibility to make sure young people are secure on our platforms. Working collaboratively with the government, the children's commissioner, and other companies is the only way to ensure we get this right. Our hearts go out to Molly's family, as well as the families of those who have lost loved ones to suicide or self-harm. We're doing a comprehensive assessment of our policies, enforcement, and technology, as well as speaking with mental health professionals to see what more we can do." Longfield's letter included a list of questions she wanted the firms to answer, such as how many self-harm sites or posts are hosted on their platforms, and how many are visited by those under the age of 18.

Role of Institutions

The firms were also asked to share the findings of their own research on the impact of self-harm websites on children's mental health, as well as what support options are available to users searching for self-harm imagery. Longfield told the firms: "It is your job to support measures that give children the information and skills they need to grow up in this digital environment – or to recognise that you cannot control what anyone sees on your platforms." The intervention comes after Ofcom said this week that the proportion of 12- to 15-year-olds who have been bullied on social media has risen from 6% in 2016 to 11% last year.
Snapchat said its public content was curated and "extremely filtered," featuring only material from major media companies and celebrities, with content from other users vetted by its in-house news team or professional partners. "We work hard to ensure that Snapchat is a safe and welcoming environment for everyone. From the beginning, we've tried to connect our community with authoritative and reputable material while also protecting them from dangerous content and misinformation," a spokeswoman said.
