Suicide, self-harm, trolling, bullying. Social media has been in the news a lot over the past few weeks, and for all the wrong reasons. The likes of Facebook, Twitter and Instagram can influence positive change, acting as a forum for informative discussion and debate. But there is no ignoring the fact that these platforms can, for some people and regardless of age, be an all-consuming force of danger, fuelling feelings of low self-worth and inadequacy, among much else. And it is the influence that social media content has on the younger generation, children in their first years of secondary education, where we have most to fear and most to lose, unless we force some changes.

The story of 14-year-old Molly Russell, who took her own life in 2017, is heartbreaking: that of a girl who became immersed in a loop of harmful content on Instagram and Pinterest.

Molly’s family said she had not shown any obvious signs of severe mental health problems. They knew her to be happy; she was doing well and had recently got a lead role in her school’s production of Fantastic Mr Fox. It was only in the weeks and months following her tragic death that they found her Instagram account, filled with distressing material about anxiety, depression and suicide. Her father, Ian, has said he believed the images and information that Molly had been viewing on social media helped to kill his daughter.

In my view, there’s no doubt that Molly’s story is yet another stark example of social media companies failing to act responsibly to protect impressionable young people from dangerous images and messages.

Before this story, I will admit I had no idea that users of Instagram could access such frankly horrifying material glorifying self-harm and suicide. But a very quick search shows just how easy it is. A pop-up from Instagram asking, “Can we help?” can be bypassed with ease, opening the gateway to thousands upon thousands of shocking and heartbreaking images, which are being accessed freely by people, many of whom are depressed and turning to the Internet for information.

Anyone who’s gone down an Instagram rabbit hole knows how easy it is to get sucked into the endless stream of content perfectly matched to your interests at any given time. It is compelling – perhaps even compulsive – and has been designed that way to keep you engaged. That may be harmless when it brings you closer to a lifestyle coach or your most sought-after bedroom renovation, but when it keeps you engaged with negative, dangerous content encouraging self-harm and suicide, it’s a whole other ball game.

Ian Russell has shown real bravery in speaking with such honesty about his daughter’s struggles at the time of her untimely death. Few people who’ve endured such a devastating loss would have the strength to take that kind of powerful stand against a social media giant such as Facebook, which owns Instagram.

Indeed, his bravery has prompted the government to step in and write to technology and social media providers such as Apple, Google, Pinterest, Snapchat and Facebook, urging them to purge dangerous content or face big fines. And while it’s no longer a David and Goliath situation, one still has to ask whether this intervention by Health Secretary Matt Hancock will be enough. After all, we’ve heard about various companies’ commitment to tackling harmful content before, and yet here we are again talking about the prevalence and pushing of harmful content.

I’m the last person to advocate for a nanny state, with a government that constantly interferes in our lives, patrols our behaviour and legislates on every matter. In a utopia, big technology companies would see that their inaction on dangerous content is morally wrong, and they would monitor and remove it in a flash, without the threat of big fines or bans. But we’re far from utopia, and it’s clear that self-regulation isn’t working. So, as far as I’m concerned, when it could mean the difference between a life saved and a life lost, state legislation is the only way forward.