Last Updated on 9th February 2024

Read the script below

Tyla: Hello and welcome back to another episode of Safeguarding Soundbites, with me, Tyla.

Natalie: And me, Natalie. As always, we’ll be bringing this week’s need-to-know news on all things safeguarding.

Tyla: This week we’re talking AI, vaping, esports and more.

Natalie: Shall we dive in then?

Tyla: Yes! What’s grabbed your attention in the news this week, Natalie?

Natalie: So the first thing I want to talk about is the news that Facebook and Instagram – so, Meta, basically – are going to be introducing tech to help detect and label fake AI images that appear on their platforms. Now, Meta already labels AI images that have been generated by its own AI systems, but not images that were created elsewhere and uploaded. So the aim looks to be to cover that sort of content, too. Meta have also asked users to label their own AI-generated audio and video content, because their tool doesn’t work for this. They’ve added that if users don’t do that, there might be penalties.

Tyla: This links in, actually, with the next news story I was going to bring up, which is about a video that was circulating on Meta of Joe Biden. So the video was a fake and the Oversight Board, which is the board that oversees Meta –

Natalie: imaginatively named!

Tyla: Yep, does exactly what it says on the tin! But yes, the board reviewed this particular incident and video and said that Meta made the right call in deciding not to remove it, as it didn’t violate its ‘manipulated media’ policy. Instead, the board said this type of content needs to be labelled.

Natalie: Ah ha! So this is possibly the catalyst.

Tyla: Or a very large part of the decision, anyway! But the board also added that Meta’s policy on manipulated media is “incoherent”!

Natalie: Not a great description.

Tyla: No, so they are now reviewing their policy.

Natalie: Sounds like a good idea! Actually, while we’re on the subject of AI, I want to mention that this week ministers in the UK were warned against waiting for an AI-involved scandal before taking steps to regulate. In fact, it was put to them as ‘don’t wait for a Post Office-style scandal’. This comes after the government announced late last year that there would be a global AI safety summit, with major tech companies agreeing to work together on testing their most sophisticated AI models. They are also providing regulators with 10 million pounds to tackle AI risks and come up with an approach by the end of April.

Next news story now, and it’s about a 14-year-old girl who has reported that she was groomed by an older man who posed as a teenager and offered to buy her vapes. This then developed into a sexually exploitative relationship. Girls Out Loud, a charity that works with vulnerable young people, said that promising vapes has become an increasingly common tactic to lure children.

Tyla: And that’s, again, very related to my story, actually. A 12-year-old boy in Derry/Londonderry lost consciousness after being pressured to smoke a vape which turned out to contain spice, a synthetic drug. His father has warned other parents to be vigilant and is calling for more awareness about the dangers of spice. Thankfully the boy is recovering, but this could have been a very different story.

And remember, you can find out more about vaping on our Safer Schools NI App or on our website saferschools.uk.

Quick ad break and then we’ll be right back.

[Ad Break]
Do you know your streamers from your scrims? COD from Counter-Strike? A clan from a LAN? And no, I’m not just making up words now…I’m talking about esports! Our upcoming Safeguarding in Esports course is for parents and safeguarding professionals who want a better understanding of competitive online videogaming. The live webinar will give you the knowledge you need to help create a safer environment for young gamers, learn about the unique challenges of safeguarding in esports and explore parallels with traditional sports safeguarding. Sign up today to secure your spot by visiting the webinar page at ineqe.com.

Natalie: And speaking of esports…guess what our theme for today’s safeguarding success story is?!

Tyla: Hmmm….could it be…esports?!

Natalie: You’re so smart! It is. The NSPCC are hosting two festivals this year with the aim of promoting safeguarding in gaming. Their Game Safe festival is taking place right now and runs until the 11th of February. And on the 9th of February they held a ‘Safeguarding in Esports’ conference.

Tyla: Much needed. We know that esports come with some concerning risks for children and young people – we won’t go through all of them! But take the use of loot boxes in games, for example, which can encourage unhealthy spending habits and, essentially, introduces children to the premise of gambling.

Natalie: That’s right. There’s also the potential that children and young people watching might view inappropriate or distressing content – a player or streamer could be playing a game that’s not age-appropriate for a child who’s watching along. And as you said, Tyla, we won’t go through all of the risks and areas of concern with esports, but all of those are outlined on our course, which I’d personally highly recommend for any parent or safeguarding professional to attend.

Tyla: I agree. But awesome stuff from the NSPCC, too – great work – and we hope everyone who went along had a great experience. Now, that’s all from us. As ever, you can follow us on social media by searching for Safer Schools NI. And remember to visit saferschools.co.uk to find out more about the esports course as well as our full range of available training and webinars.

Natalie: And if you haven’t checked out the latest episode of The Online Safety Show, the new safeguarding show for children and young people, make sure to visit theonlinesafetyshow.com and while you’re there, take part in our survey about the Online Safety Act!

Until next time…goodbye and –

Both: Stay safe.
