
What safety measures do social media platforms have in place?


Social media platforms have come under increasing scrutiny as instances of inappropriate content have emerged.

And with kids spending more time than ever online during the pandemic, parents and carers are rightly worried about how to keep them safe when using social media.

But what measures do these companies put in place to stop such material getting onto the platforms? And could they be doing more to keep children and young people safe when they are online?

We contacted Snapchat and TikTok to get their response to these concerns.

‘An alternative to traditional social media’

A Snapchat representative detailed the safety precautions the platform has put in place to stop inappropriate content finding its way onto the app.

They also stressed that the app was different to other social media platforms in that there is “no way” for rogue accounts to broadcast content to all users on the platform.

The rep said: “We believe we have a responsibility to provide our users with a safe, positive and personal experience on our platform.

Snapchat is a popular messaging application.

“Snapchat is designed as an alternative to traditional social media – a place where close friends can connect, strangers cannot broadcast to everyone on the platform, and popularity and content are not measured by virality metrics.

“We offer no way for unvetted accounts to broadcast to our entire user base, which means Snapchat is not a platform where anyone can distribute anything to anyone.”

“Our content platform, Discover, is closed, and individual Snapchat users cannot share content to a wide public audience. Like a television network, only media brands and content creators we have chosen to work with have the ability to distribute content to large groups of Snapchatters.


“All new features go through an intense privacy review process – where our privacy engineers and privacy lawyers vet all features that touch a user before they are released. We have always used this ‘privacy-by-design’ approach and won’t release a feature that doesn’t pass this vet.”

Snapchat, which was launched in 2011, allows users to communicate in private groups – a feature which the platform insists keeps those who use the app safe.

The rep added: “In private (one-to-one or small group) communication it is difficult for one single account or post to gain traction or ‘go viral’ because by default Snapchat accounts are set to friends only. There are no likes, shares or comments on Snapchat and group chats are limited to 31 people.

“User content on Snapchat is designed to delete by default, meaning that the majority of snaps and stories will automatically be deleted once opened by the intended recipient(s) or within 24 hours of being posted.”


The spokeswoman also pointed to rules and regulations that users of the app have to adhere to and what action they themselves can take if they spot inappropriate content.

She said: “We have clear community guidelines and terms of service that tell Snapchatters what type of content is acceptable to post on Snapchat.

“We encourage anyone who sees any violating content to report it immediately using our in-app reporting tools, so our Trust and Safety team can take action.

“They work around the clock to review abuse reports and take action when they become aware of a violation. In the vast majority of cases, they act on reported in-app content well within two hours.

“When we are notified that a Snapchatter is violating our rules, we promptly investigate and remove the offending content and, if appropriate, may terminate the account.”

TikTok

The video sharing platform came under fire recently after a video of a man taking his own life went viral on the site.

But representatives for the company defended the safety measures it has in place to deal with such content, outlining what action is taken when it is flagged up.

A TikTok spokesman said: “Recently, clips of a suicide that had originally been live-streamed on Facebook circulated on other platforms, including TikTok.

“Our systems, together with our moderation teams, detected and removed these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.

“We banned accounts that repeatedly tried to upload clips, and we appreciate our community members who reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”

The spokesman also outlined what actions parents can take to keep their children safe when they are using TikTok and where they can find more information on them.

He added: “We have built several tools to help parents manage their child’s experience on TikTok. This includes controls on what content they can see, and how long they can spend online. Parents can read about these tools in our safety center.

“We also wrote to the leaders of nine other platforms, offering to work together to further protect our users from harmful content.

“If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app and in our safety center.”