
Social media firms pledge cash for Samaritans in bid to tackle harmful content

Social media companies and the Government have been under pressure to act (Nick Ansell/PA)

Social media giants have pledged hundreds of thousands of pounds to the Samaritans charity in a bid to rid the internet of self-harm videos and other damaging material, Health Secretary Matt Hancock said.

Representatives from Facebook, Google, Snapchat and Instagram were summoned by the Government to meet with the charity to identify and tackle harmful content, including that promoting suicide.

The summit in Whitehall came three weeks after the Government announced plans to make tech giants and social networks more accountable for harmful material online.

The first such summit, held in February, resulted in Instagram agreeing to ban graphic images of self-harm from its platform.

Speaking after the behind-closed-doors meeting, Mr Hancock said: “I met the main social media companies today and they have agreed to fund the Samaritans to identify what is harmful and then to put in place the technology on their platforms to find harmful material and make sure it is either removed or others can’t see it.

“The amount of support is in the hundreds of thousands.

“The crucial thing is that we have an independent body, the Samaritans, being able to be the arbiter of what is damaging content that needs taking down so all tech companies can follow the new rules that have been set out.”

Social media companies and the Government have been under pressure to act following the death of 14-year-old Molly Russell in 2017.

The schoolgirl’s family found material relating to depression and suicide when they looked at her Instagram account following her death.

Mr Hancock went on: “I feel the tech companies are starting to get the message, they’re starting to take action.

“But there’s much more to do … we also spoke about tackling eating disorders and some anti-vaccination messages which are so important to tackle to ensure they do not get prevalence online.”

In a statement, a spokesman for Facebook, which also owns Instagram, said: “The safety of people, especially young people, using our platforms is our top priority and we are continually investing in ways to ensure everyone on Facebook and Instagram has a positive experience.

“Most recently, as part of an ongoing review with experts, we have updated our policies around suicide, self-harm and eating disorder content so that more will be removed.

“We also continue to invest in our team of 30,000 people working in safety and security, as well as technology, to tackle harmful content.

“We support the new initiative from the Government and the Samaritans, and look forward to our ongoing work with industry to find more ways to keep people safe online.”

Samaritans chief executive Ruth Sutherland said: “We look forward to building a strategic partnership with government and the world’s leading technology companies that will help us all tackle the issue of dangerous online content relating to self harm and suicide together.

“An innovative research programme will be the foundation for building our shared knowledge on this complex issue.

“We need to know more about how certain content affects different people.

“We all have a role to play in suicide prevention and, by working together, we believe this hub of online excellence will drive meaningful change on an issue that needs urgent attention.”

The online harms white paper sets out a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.

Compliance with this duty of care will be overseen and enforced by an independent regulator.

Failure to fulfil this duty of care will result in enforcement action, such as fines for companies or individual liability for senior managers.