Mother of Australian teen who died by suicide urges social media age restrictions

At a United Nations General Assembly event, Emma Mason, an Australian mother from Sydney whose 15-year-old daughter, Matilda “Tilly” Rosewarne, died by suicide after severe cyberbullying on social media, called for global social media reform. She proposed banning children under 16 from social media and holding tech companies financially responsible.

Mason shared the tragic details of her daughter’s last moments, recounting her twelfth and final suicide attempt on Feb. 16, 2022.

Mason said her daughter had meticulously planned the act and, wanting to look her best, carefully applied makeup before ending her life. She said that Tilly was exhausted, broken and could no longer fight. She climbed onto the backyard tree house, put a noose around her neck, and jumped.

Tilly’s father and 13-year-old sister found her in the backyard.

Mason stated that while Tilly had faced bullying since elementary school, it worsened with the rise of social media.

In November 2020, a fabricated nude image of Tilly, created by a male classmate, was rapidly shared on Snapchat, reaching over 3,000 children within hours.

Mason described the immediate impact, saying Tilly became hysterical and spiraled. When Mason contacted the school, staff said they could not intervene because the boy and his mother denied he had used his phone that day. Tilly then attempted suicide by cutting her arms; she lost a great deal of blood and never truly recovered.

The family was allegedly told by authorities that preventing such incidents was difficult, as obtaining information from Snapchat could take months.

As Tilly struggled, Mason said the cyberbullying persisted, with Tilly receiving messages encouraging her suicide.

The grieving mother identified Snapchat and TikTok as directly contributing to her daughter’s death, arguing these platforms fail to protect young users and contribute to worsened mental health, poorer concentration and social skills, body image issues, sleep problems, and social isolation.

She argued that just as car manufacturers are held liable for the safety of drivers, social media companies must be held accountable for the protection of children, who she said are suffering and dying around the world because of social media. Parents, she added, need help.

In 2024, Snap Inc.’s Asia-Pacific head of public policy, Henry Turnbull, told a parliamentary inquiry that the company is committed to ensuring user safety on Snapchat.

Turnbull acknowledged that bullying exists both online and offline and that Snap Inc. is working to combat it, recognizing its damaging impact. He emphasized the company’s focus on proactive measures to address these risks, though “this work is never done.”

During the 2024 inquiry, Google’s then-director of government affairs and public policy for Australia and New Zealand, Lucinda Longcroft, affirmed that user safety is Google’s top priority.

Longcroft said that Google is receptive to all methods of ensuring the safety of Australian users. She stated that Google continuously strives to improve its safety measures, because the safety of all users, particularly children, is a paramount concern and responsibility, and that the company invests significant resources and expertise in keeping its systems, services, and products safe in the areas of mental health and suicide.

Although Australia recently enacted a law requiring social media platforms to prevent Australians under 16 from creating accounts or face substantial fines, Mason called for a global ban to hold tech companies financially responsible.

She said that for parents of deceased children, life is measured by milestones and anniversaries that serve as reminders of what has been lost. Since Tilly’s death, she said, she has met others with similar experiences, and she asked how many more children must die.

European Commission President Ursula von der Leyen, speaking after Mason, accused apps of using manipulative algorithms to attract and addict children in order to maximize company profits.

Von der Leyen stated that while these ventures are profit-driven, parents bear the burden of the associated risks and harms, including cyberbullying, encouragement of self-harm, online predators, and addictive algorithms. She urged action to protect the next generation.

She said that Europe is testing an age verification prototype in France, Spain, Greece, Denmark and Italy.

Von der Leyen stated that it is common sense that young people must reach a certain age before they smoke, drink, or access adult content, and that the same logic applies to social media. She expressed optimism about the future of technology, noting that technology has already improved our lives and will continue to do so. However, she also emphasized the importance of defining our relationship with technology so that it serves humanity, not the other way around.
