Colin: Hello, and welcome to Safeguarding Soundbites, the podcast that gives you a snapshot look about all things safeguarding from the news this past week. I’m Colin –
Danielle: – and I’m Danielle, and we’ve got lots to talk about this week! Let’s start off with the latest in social media news.
Colin: As we’ve talked about in earlier episodes, TikTok has been having a bit of an issue adjusting to the EU Digital Services Act, or DSA, which has recently come into effect. You might remember, they received a fine only a few weeks ago for breaching children’s privacy in the EU, with claims the platform has done “very little” to verify the ages of its users and remove underage accounts. Since then, TikTok have taken steps to try and remedy this, recently announcing that they will start to provide regular reports on how they are improving user safety, aligning with requirements set by the DSA, and giving their users insights into app use.
Colin: One of the DSA’s requirements is that large digital platforms must enhance platform reporting and transparency. Has TikTok said how they will provide this?
Danielle: Yes, they have! They will publish a quarterly EU Safety Report that summarises the work they have been putting in to safeguard their users. This report is part of the “expanded efforts” TikTok has claimed they are undertaking with ‘Project Clover’, their programme to create a “specially-reinforced protective environment” around their European user database.
Colin: And when will the report be available?
Danielle: Actually, the report has already been released! You can find it online on their website, along with further details and updates to Project Clover.
Colin: Now Danielle, what you just shared brought to my mind the recent discourse between social media companies and the UK government around end-to-end encryption. Many of the big tech companies have said that there is no technology that will bypass end-to-end encryption without compromising user privacy. A few weeks ago it seemed the government was recognising this and taking a step back, however Technology Secretary Michelle Donelan has just denied that anything had changed in the government’s expectations.
Danielle: Just one thing after another, isn’t it? Especially as the Online Safety Bill has just returned to the House of Commons for its final stages.
Colin: Well, a slightly different approach may be taken in the EU. WhatsApp, one of Meta’s platforms, has recently launched beta testing for cross-platform communication. This would allow users of other apps, such as Snapchat and Telegram, to message a WhatsApp user without either person having to download the other app. This ‘third party chats’ feature is still in testing and has issues such as encryption and privacy to work through before it is made public.
Danielle: As ever, our online safety experts will continue to monitor this development, and we will let you know of any progress as and when it happens.
Colin: UK MPs have recently launched an inquiry into the impact of screentime on children and young people. The newly formed Education Committee will reportedly reach out to experts about how apps, mobile phones, and tablets relate to education and well-being for under-18s. The chair of the committee, Robin Walker, said that while the use of technology had aided children with “communication and accessing information” – most notably through the pandemic – it also presents concerns about the online content children are consuming, harmful interactions such as grooming and exploitation, and excessive screentime that can impact their offline lives.
Danielle: This all sounds…familiar?
Colin: It does a little bit, doesn’t it? Here at INEQE Safeguarding Group and Safer Schools NI, alongside our partners, we are always encouraged to see things like this developing at government levels.
Danielle: Very true! It’s something that we work at daily, so it’s good to see movement at a government level.
Colin: According to the committee, the inquiry will inspect how schools across the UK are dealing with the online activities of their students, what guidance is being given to families, how reliant classrooms are on using online tools such as Google, what is being taught to students about online harms, and how policies are being enforced regarding mobile phones.
Danielle: As we know, these questions are exceptionally important, especially as Ofcom recently reported that by the age of eight a child in the UK will typically spend just under 3 hours online per day. This jumps to 4 hours per day by the time they are eleven years old.
Colin: And in that same study, it was discovered that over half of five-to-seven-year-olds in the UK have their own device, which means it’s become far more common for children to access the internet on a regular basis using a personal device like a mobile phone or a tablet.
Danielle: While giving a child their own device might ease some arguments or nagging requests, it’s always important to remember that having your own device is a big responsibility – one that might be too big for a five-year-old child!
Colin: Well, this is one of the things the inquiry is hoping to address. The education committee is asking academics and people who work in children’s health, education, and technology to submit evidence via their website. The submission deadline is still a while away on Monday, 16 October, and we would urge any professionals in this capacity to submit their opinions. Greater understanding of both individual and shared experiences can only help broaden the scope of the online safeguarding world.
Danielle: You can also find more information around screentime and setting up your child’s first device on our Online Safety Centre, which you can find directly in your Safer Schools NI App.
Colin: You’ll find lots of great articles, shareable resources, and expert advice – right in the palm of your hand!
Danielle: A ban on single-use vapes has recently been announced by ministers in the UK, in an attempt to mitigate the appeal of vaping to children and young people and to decrease the amount of waste the products produce. This proposed ban has been applauded by many leading doctors and councils, but some are concerned that a ban might drive an increase in illegal products on the market, sadly furthering the risk to children and young people.
Colin: Never mind that the illegal vape market is already a problem, with shops and dealers selling to underage users or bringing in higher-dosage vapes from America, which are illegal to sell in the UK.
Danielle: Exactly. It was recently revealed that 5 million vapes are discarded in the UK every week, a fourfold increase on the number recorded in 2022. And this is with current recycling procedures in place at some retailers. Some professionals have even claimed this ban could spell bad news for all smokers, especially as vaping is seen primarily as a tool to aid people who are quitting. Instead, their advice would be to introduce further controls and limits around how children and young people can access disposable vapes, as well as a “proper licensing scheme” to outline who can legally sell vapes.
Colin: As of yet, there has been no concrete ban put in place, and ministers are still discussing how to identify opportunities that will reduce the number of children and young people accessing vapes.
Danielle: While you may be thinking this is just an offline issue, healthcare officials have actually warned that social media might be a big influence in children and young people deciding to vape. TikTok, Instagram, and Snapchat have all been mentioned as platforms where teenagers create videos talking about their vaping products and recommending them to others. There have even been vape dealers and shops running from social media accounts!
Colin: This kind of online behaviour can have an extremely negative offline effect on children and young people, which is why the government have been looking for ways to protect them from being targeted in the first place.
Danielle: For a deeper look at what youth vaping is, the risks and red flags to watch out for, and signposts to help in your area, head to your SSNI App or our website to read our article.
Danielle: Leading figures in the world of AI – or Artificial Intelligence – met in Derry/Londonderry this past week for a discussion around the impact of AI on education. This meeting included representatives from big tech companies like Microsoft and the National Centre for AI, and workshops were hosted to examine how AI can generate educational materials.
Colin: Which has been a big problem for safeguarding professionals since the rise of AI at the beginning of this year.
Danielle: It has, and that’s partly what this conference was focused on. As these were leaders in the AI field, they looked at both the benefits and the challenges that AI brings. One of the leaders said the purpose of the summit was to have “honest conversation[s] about the challenges, about the ethical considerations, and making sure we get the benefit for our students”.
Colin: It’s refreshing seeing this issue looked at from all sides, especially as we know that AI is a tool just like anything else.
Danielle: Exactly. These conversations are important for those in the tech industry to have, especially when it comes to measuring the impact something like AI might have on children and young people, as well as the global education sector.
Colin: This also happens to come at the same time that top tech leaders, including Elon Musk, Mark Zuckerberg, and Bill Gates, met with lawmakers in the United States to talk about AI governance moving forward. Musk was quoted afterwards saying there is an “overwhelming consensus to regulate AI” and even claimed that particular meeting would “go down in history as being very important for the future of civilisation”.
Danielle: Big words for a big issue!
Colin: Indeed they are, and not entirely far off the mark. This conversation is one of many that have been happening recently across the world as AI technology grows rapidly. The decisions made now will greatly affect the way AI technology is regulated and designed going forward, so it’s important that industries and governments work together to make those decisions.
Danielle: While it’s encouraging to see meetings like these happening, let’s hope we see action happen to ensure everyone’s online safety when it comes to AI. To find out more about AI and its effect, check out our latest content in the brand new version of the Safer Schools NI App rolling out from Monday at 7AM.
Colin: If you have your app set to update automatically, you’ll simply need to log back in! If not, head to the relevant app store and manually update the app by searching for ‘Safer Schools NI’. We would encourage all SSNI app users to update their app for exciting new features, content, and courses. Don’t miss out!
Danielle: Now you mention it, is there something else happening on Monday, Colin?
Colin: Oh yes, there is!
Danielle: We’ll be heading out for in-person Regional Training across Northern Ireland. This is a free half-day of training for schools, and all you have to do is register!
Colin: There are still spaces, and we would love to see you! When we spoke to Department of Education Permanent Secretary Dr Mark Browne, he said: “While the online world can be a beneficial tool for learners, we welcome the offer from INEQE to roll out training to schools particularly in light of the rising number of children and young people engaging in online platforms and the potential risks this can present. The Department of Education funds INEQE’s Safer Schools App to allow school communities to access support and up-to-date advice in relation to safeguarding and online safety and we continue to encourage schools, parents and most importantly pupils, to access the App.”
Danielle: We hope to see you there!
Colin: A school in Wales has recently released a new report that shows a significant rise in its pupils being excluded from school in the last year. These exclusions were a result of bullying, physical assaults against pupils and adults, and filming others without consent, to name a few.
Danielle: While it is unknown whether this rise is reflective of in-school behaviour across the UK as a whole, we can confirm that we have received reports in the last year of a similar nature across different UK schools. Some of these reports have included something we refer to as ‘Teacher Targeted Bullying’.
Colin: Teacher Targeted Bullying is more than just a few mean comments or rumours on the playground or whispered at the back of a classroom. In today’s digital world, this form of bullying manifests as grossly offensive and distressing commentary carried out online, impacting the offline lives of teachers, school staff, and students. When reported and investigated, it can actually lead to a serious ‘communications’ offence or even criminal harassment charges.
Danielle: To help teachers, school staff, parents, carers, and other safeguarding professionals – as well as pupils – come to terms with the effects of this type of bullying, we’ve created a specialised training course. This course features an interactive dramatisation of Teacher Targeted Bullying to help illustrate the ways this situation can affect teachers, students, and their families.
Colin: We’ll be releasing this training course shortly, so make sure you switch on your app notifications or sign up to our Safeguarding Hub to be the first to find out when it launches.
Danielle: To wrap things up, why don’t we turn to our Safeguarding Success story of the week?
Colin: I love that idea! This week’s story is taking us to Norfolk and Suffolk, where a sexual abuse charity is aiming to help encourage child victims to speak out earlier. The charity, ‘Brave Futures’, is run by Chris Sargisson, who was a victim of one of his teachers while he was in school. His teacher has since been jailed, in part as a result of him speaking out. Now, he wants to help child victims speak out sooner and “stop the abuse earlier”.
Danielle: That is such a brave thing to do. It can be so hard for any victim to report their abuser, especially for child victims who may have been groomed or don’t fully understand what has happened to them.
Colin: It is, and that is why Mr Sargisson also wants to help support those victims as well. He has spoken about the lasting harms of grooming specifically, saying that it could often be “the worst aspect of the experience”. It can lead to things like suicidal thoughts, depression, addictions, and can encourage feelings of shame, blame, and guilt, especially if the victims did not realise they had been groomed.
Danielle: That is often the sort of harm that can persist into adulthood as well, and create further damaging behaviours.
Colin: Precisely. When Mr Sargisson spoke out about his abuse in his ex-teacher’s trial, he realised that two of the other boys who had also been abused had taken their own lives. This is what motivated him to create Brave Futures, as he hopes to help “save lives and…remake lives” by resetting the narrative and offering help at an early stage through free counselling services and family inclusion.
Danielle: What an admirable action to take after experiencing such a difficult time in his life. Hopefully this will inspire other charities to take similar approaches across the whole of the UK.
Colin: I hope so too! You can find out more about ‘Brave Futures’ by visiting their website.
Danielle: If any of the topics in today’s episode have affected you, you can reach out to the NSPCC Helpline to speak to someone.
Colin: Thank you for joining us! We hope your week is a great one. Remember to tune in next week to get all the latest safeguarding news and updates.
Danielle: We’ll chat to you then!
Colin: And remember –
Both: Stay safe!