
Last Updated on 16th February 2024

Read the script below

Natalie: Hello and welcome to Safeguarding Soundbites.

Colin: This is the weekly podcast that keeps you in the know with all the week’s important safeguarding updates and news.

Natalie: It sure is! I’m Natalie and he’s Colin and this week, we’re talking about the removal of political content on Threads and Instagram…

Colin: Cyber-attacks on schools.

Natalie: And much more. Colin, do you want to start us off?

Colin: This week, a deeply concerning story has emerged regarding child sexual abuse material. Reports allege that school pupils have been accessing and sharing child abuse images on Snapchat. The claims came to light during events organised by the school itself to promote safe internet practices among students. Police are investigating and working with the school, and Snapchat has said that this type of material has no place on its platform. Now, we don’t know any more details on this story, such as whether the images were found online or self-generated.

Colin: But I want to signpost listeners to our resources on self-generated sexual imagery, which you can find on the Safer Schools NI App. There, you’ll find really useful information on the subject, plus practical advice on how you can effectively respond if a child or young person in your care has created, shared or lost control of an image.

Natalie: A very alarming story. And unfortunately, our next story is also related to child sexual abuse imagery. The Lucy Faithfull Foundation, a UK-wide child protection charity dedicated solely to preventing child sexual abuse, has reported that it is receiving calls from people who are confused about the ethics of viewing AI-created child abuse imagery. The charity says callers to its helpline think AI images are blurring the boundaries between what is illegal and what is morally wrong, and it is warning them that creating or viewing this material is still illegal, even if it is generated by AI.

Natalie: The PSNI has issued a warning about the prevalence of romance scams, highlighting the emotional and financial harm they can cause.

Since April 2023, the PSNI has received over 70 reports of such scams, resulting in a staggering collective loss of £713,133. These scams target individuals seeking love online, manipulating their emotions and ultimately draining them financially.

The report also details a particularly distressing case in which a victim lost £130,000 over time to a woman they met online. This stark example demonstrates how severe these scams can be and the devastating impact they can have.

Detective Chief Inspector Ian Wilson emphasises the importance of awareness in protecting oneself and one’s finances. He urges individuals to be cautious when engaging with online romantic interests, verifying information and avoiding sending money or sharing personal details prematurely.

Colin: Absolutely. I wanted to also share with our listeners an update from Meta. That’s the parent company of Instagram and Facebook, but also Threads, which is sort of Meta’s version of Twitter/X. They’ve announced plans to remove political content from recommendations on both Threads and Instagram.

This move aims to steer away from replicating the often heated and potentially harmful political discourse found on other platforms like X. However, it’s crucial to clarify that political content won’t vanish entirely. Users will still be able to follow and engage with accounts focused on politics, accessing their updates as usual. It’s purely the proactive recommendation of such content that’s being removed.

Meta emphasises that this will be a gradual rollout, allowing the company to carefully monitor and tweak the changes to ensure they achieve the desired effect.

Natalie: Interesting. I’ve not used Threads myself, but I’ll definitely be keeping an eye out over on Instagram to see if I notice any changes. With this being an election year in America, along with many other countries (possibly even here?), this seems like a timely move on Meta’s part.

In other news, a school in Buckinghamshire recently suffered a cyber-attack, resulting in sensitive personal data linked to its community being leaked on the dark web. Hackers exploited vulnerabilities to steal the information, despite the school’s attempts to contain the attack.

The nature of the leaked information remains unclear, but the stolen data was published on the dark web, a hidden part of the internet often used for illegal activity. This incident highlights the importance of robust cybersecurity measures in schools to protect sensitive student and staff information…and I’d like to remind our listeners about our upcoming training in cyber security. Here’s Ryan to tell us more:

[Ad Break] In today’s digital age, cyber security plays a crucial role. But it’s not just about protecting your own information – it’s about protecting our schools, our organisations and the children and young people in our care. Our cyber security webinars will help improve your understanding of what cyber security is and why it matters. Gain confidence by learning how to better protect against vulnerabilities and improve your response to potential dangers. Plus, our cyber security training is CPD certified and available at both beginner and advanced levels. With webinars coming up soon, make sure you visit ineqe.com and head to the webinars page to book your place today.

Colin: And we’re back now with our safeguarding success story of the week. It’s about a new partnership between the Internet Watch Foundation and the Public Interest Registry that will give registries the tools to disrupt sexual abuse material online. It’s a bit tech-y, but basically this is going to stop criminals hopping from one domain (like a website address) to another. Currently, if a site hosting this type of content gets taken offline, it often comes back under a slightly different domain. For example, imagine ineqe.com changing to ineqe1.com. This new programme will alert participating registries in real time, allowing faster action to be taken.

Natalie: Definitely tech-y but definitely a fantastic initiative! Well that’s all from us for this week. Join us next time for more news and alerts – and don’t forget to visit ineqe.com to find out more about our upcoming cyber security training.

Colin: And if you and the child or young person in your care haven’t checked out The Online Safety Show yet…well, why not?! Head over to theonlinesafetyshow.com to find all our episodes, including our latest Safer Internet Day episode in which you can learn more about the Online Safety Act with our special guest Bimpe Archer from Ofcom. And while you’re there, why not take part in our survey? Until next time…

Both: Stay safe!


