Last Updated on 3rd March 2023

Read the script below

Hello and welcome to Safeguarding Soundbites. As always, we’ll be checking out the week’s digital and safeguarding news, plus giving you the latest updates and advice from our Online Safety Experts.

Have you heard of or ever used Twitch, the online livestreaming platform where people can broadcast themselves playing video games, going for walks, cooking and just about anything else you can think of?! You might be wondering how it works. People – called streamers – livestream themselves while viewers watch and interact with the streamer via chat. Twitch users can subscribe to their favourite streamers and donate money to them. It’s a lot like following an influencer or celebrity on social media, but the ‘live’ element is what really hooks viewers in.

The livestreaming platform has an average of a whopping 103,000 livestreams happening at any given time…that’s a lot of people sharing their lives. Although it is mostly used to watch gamers playing, streamers also use the platform to just chat to their viewers or ‘take them along’ as they carry out activities or go about their daily lives. But unfortunately, the live aspect of Twitch has brought problems, and some safeguarding risks. Streamers have broadcast sexually explicit and inappropriate content, despite that being against Twitch’s Community Guidelines. Our Online Safety Experts have put together a Guide to Twitch, which explains what the platform is and what the risks are, and gives great top tips on how you can help the young people in your care stay safer on Twitch and other livestreaming platforms.

Visit saferschoolsni.co.uk to find our Guide to Twitch, as well as a recent Safeguarding Alert about an explicit trend on TikTok and a Safeguarding Update about a new type of game on Roblox.
You can also find all of this content on the Safer Schools NI App, which you can download for free right now! Just visit your device’s app store.

In the news this week, Meta has announced that it is co-launching a new platform that aims to stop intimate images of young people being posted online. Called ‘Take It Down’, it has been created in conjunction with the National Center for Missing and Exploited Children. Young people will be able to ask participating apps to search for intimate images of themselves – content which, depending on the age of the person depicted, is referred to as youth-produced sexual imagery. Meta, the company behind Facebook and Instagram, has faced strong criticism in the past over its lack of action on child sex abuse images hosted on its platforms. In 2019, the National Center for Missing and Exploited Children received 16.9 million referrals of child sex abuse images from US tech firms, 94% of which came from Facebook.

As the Online Safety Bill continues its journey through parliament, the president of messaging app Signal has spoken out about the bill – and she’s not happy. In fact, she’s threatened to pull the app from the UK altogether should the bill go through. Signal prides itself on its commitment to privacy, with no screenshotting of messages, no tracking and no marketing. However, it’s the app’s end-to-end encrypted messaging that has the president of Signal promising to “100% walk” away if the bill comes into law. The Online Safety Bill could mean that tech companies like Signal would be required to allow Ofcom to scan encrypted messages for child sexual abuse material and terrorism content. End-to-end encryption has long been at the centre of debate between online privacy campaigners and safeguarding organisations. With the bill’s details still being finalised as it passes through the House of Lords, it remains to be seen whether it will signal the end of Signal in the UK.

And it’s a bit of a change of tune for Twitter, which has announced a tightening of its rules around violent content. The social media platform has been in the spotlight recently due to concerning behaviour when it comes to safeguarding, such as firing large numbers of its content moderation staff. But the newly announced policy will see a ban on wishing harm on others – from traffic accidents and illnesses to death – along with threats against homes and infrastructure. The zero-tolerance approach won’t extend to speech related to sporting events or video games, however, nor to satire or artistic expression used to express a viewpoint.

The Irish Football Association has unveiled its new safeguarding policy for all football clubs in Northern Ireland. The document is designed to help clubs protect their young players and promote a sense of shared responsibility within the IFA. Koulla Yiasouma, the Northern Ireland Commissioner for Children and Young People, unveiled the policy and said, “As someone who has worked with children and young people in a paid and a voluntary capacity I recognise the importance of clear guidance, standards and procedures. Added to training and support, these procedures should make sure that everyone working in Irish FA-affiliated football clubs is confident in identifying and acting on concerns when a child may be at risk.” The IFA’s Safeguarding Manager Kevin Doyle has high hopes that the policy will “highlight the layers of responsibility” to the “affiliated bodies, leagues, and clubs” in order to create a “fun, safe, and inclusive environment.”

Meanwhile, some of the biggest platforms have announced new safety features for their users in the past week. TikTok has planned improvements to its screen time tool, with additional options for users to trial. There will also be new default settings for teen accounts and plans to include more parental controls on the platform, such as customisable daily screen time limits and a schedule for muting app notifications.

Snapchat has announced that it is trialling the ability for users to pause their Snapstreaks. This is a big shift in one of the app’s most popular features: a streak counts the consecutive days on which two friends have sent each other at least one snap within every 24-hour window. It is thought that being able to ‘pause streaks’ will help users manage their screen time and reduce the stress a user might feel if they are unable to continue their streak for any reason.

That’s everything for this week’s episode of Safeguarding Soundbites. We’ll be back again next week but, in the meantime, follow us on social media to stay up-to-date on what our Online Safeguarding Experts are up to – just search social media for Safer Schools NI. We’d love you to share this podcast with your friends, family and colleagues so we can help keep the children and young people in our care safer online. And don’t forget, you can download the Safer Schools NI App right now for free from your device’s app store. Speak to you next time!


Who are your Trusted Adults?

The Trusted Adult video explains who young people might speak to and includes examples of trusted adults, charities and organisations.

Pause, Think and Plan

Use our video for guidance and advice around constructing conversations about the online world with the children in your care.
