
Last Updated on 19th January 2024

Read the script below

Natalie: Hello, everyone, and welcome back to another Safeguarding Soundbites. I’m Natalie, and this week, I’m joined by Tyla. We’ve got quite a line-up of stories today, don’t we?

Tyla: Absolutely, Natalie. Buckle up, everyone, because we’re diving headfirst into some shocking revelations and concerns in the tech world.

Natalie: Our first story sheds light on the darker side of online marketplaces. A recent study has uncovered that over a third of ads on Facebook Marketplace may be scams. The fraud team at retail bank TSB went a step further and contacted 100 sellers on the platform. The shocking findings indicate that UK buyers are losing over £160,000 every day to fraudulent listings. TSB found that 34% of the ads were scams, with sellers employing tactics commonly used by fraudsters. These include directing the experts (who the sellers believed were genuine buyers) to fake websites, refusing in-person viewings, and demanding advance fees.

Tyla: It’s disheartening to see such a widespread issue, Natalie. It’s important we all keep an eye out and report scams immediately. And highlight to the young people in your care the potential pitfalls of using online marketplaces like Facebook.

Natalie: Absolutely. Switching gears now: internal Meta documents about child safety have come to light. The documents reveal that Meta was not only aware of inappropriate and sexually explicit content being shared between adults and young people but also marketed its platforms to children. Disturbingly, Meta employees raised concerns internally about the exploitation of children on its private messaging platforms. The documents indicate that Meta intentionally recruited children to Messenger while limiting safety features in the process. This has raised significant concerns about Meta’s commitment to protecting children online.

Tyla: This is deeply troubling, Natalie. And the Internet Watch Foundation’s report adds another layer to those concerns. They found a 66% increase in self-generated child sexual abuse material (CSAM) featuring children under 10, which appeared on more than 100,000 webpages in the last year. With Meta’s move towards end-to-end encryption, there are fears that reporting such material could become even more challenging.

Natalie: Absolutely, Tyla. Let’s hope there’s a serious re-evaluation of priorities to ensure the safety of our youngest internet users.

Tyla: Now, on a different note, gamers, listen up. A recent study has shown that prolonged exposure to high-intensity sound in gaming may put players at risk of irreversible hearing loss and tinnitus. The study emphasises the need for education and awareness to promote safe listening habits among gamers. Guidelines set out permissible exposure limits, and the data indicates that average sound levels in gaming often exceed them.

Natalie: It’s a wake-up call for the gaming community to prioritise their auditory health. Awareness and education can go a long way in preventing irreversible damage. And make sure the young people in your care are aware of the potential problems, especially as they may be using headphones.

Tyla: Definitely. Next up, a study by Fuse (the Centre for Translational Research in Public Health), Teesside University and Newcastle University reveals a concerning link between young people’s consumption of energy drinks and mental health risks. Despite most supermarkets banning sales to those under 16, availability in local shops remains a concern. Teachers have expressed worries about the impact on behaviour and concentration in students who consume these drinks.

Natalie: This is a call to action for regulatory bodies to address the sale of energy drinks to young people. Their health and well-being should always be a top priority.

Tyla: 100%. And we often see advertising and marketing for energy drinks that seems to be geared towards young people.

Natalie: That’s true. Energy drinks…and vaping too!

Tyla: Absolutely. A quick ad break now and then we’ll be back with more news.

[Advert]

Natalie: A primary school in Co Antrim, Straidhavern Primary School, is the latest to face the threat of closure, joining nearly 200 Northern Ireland primary schools in a similar predicament due to low pupil numbers.

Tyla: This is indeed a concerning trend. The Education Authority’s consultation on the discontinuance of the school ended recently, putting it at risk of closure. In response to the growing challenges, the Rural Schools Closure Group was formed and, following a legal victory for St Mary’s PS, is actively opposing closures. The group is seeking an urgent meeting with the Department of Education’s Permanent Secretary, Mark Browne.

Natalie: It’s heartening to see communities coming together to fight for the survival of their schools. Principal Judith Meekin highlighted the positive side, mentioning that applications have already started pouring in for places in their P1 class for September. She expressed excitement about the potential growth of the Straidhavern Primary School family for many years to come.

Tyla: Local councillor Paul Michael voiced the shock and horror felt by locals and parents in response to the closure proposals. He emphasised the need for Stormont to resume operations promptly, enabling proper representation and dialogue among elected representatives to address these critical issues.

Natalie: It’s a reminder that education is not just about numbers and budgets; it’s about fostering communities and providing opportunities for the younger generation. Let’s hope for a positive outcome for Straidhavern Primary School and others facing similar challenges.

Tyla: Absolutely, Natalie.

Natalie: And now over to our Safeguarding Success story of the week! The British Standards Institution (BSI) has published the first international guidelines on AI safety, with a focus on safeguards. This comes amidst the ongoing debate around AI and how to regulate the safety of such a fast-moving technology.

BSI says the new guidelines are designed to enable the safe, secure and responsible use of artificial intelligence across society.

Tyla: Well, that’s positive. We hear a lot about concerns around AI and how quickly it’s developing.

Natalie: We do, and in fact research by BSI shows that around 61% of people, both in the UK and globally, want international guidelines on the use of AI. It also shows that nearly two-fifths of people around the world use AI in their jobs every day!

Tyla: Wow, that’s a lot! Listeners, if you want to learn more about AI, in particular chatbots, you can find resources by downloading our free Safer Schools NI App.

Natalie: That’s all from us this week. Join us next time for more safeguarding news and alerts.

Tyla: Thanks for listening and –

Both: Stay safe.

