Last Updated on 29th April 2022

While tech companies are constantly releasing new updates and features for mobile phones, laptops, and tablets, one of Apple’s upcoming releases is getting a lot of attention from safeguarding professionals as well as parents and carers.  

We’ve taken a look at the ‘Communication Safety in Messages’ feature to tell you everything you need to know before it launches in the UK.

What is it?

This new safety feature is referred to as ‘Communication Safety in Messages’ – and pretty much does what it says on the tin. It will allow parents and carers to activate alerts and warnings on their children’s phones, specifically for images that contain nudity and are sent or received over the Messages app on iOS devices.  

Why is it being introduced?

The aim is to help children and young people make better choices when it comes to receiving and sending sexually explicit images. It also allows parents and carers to play a more active role in the online interactions of their children and young people, while also combatting the spread of self-generated images and Child Sexual Abuse Material (CSAM).  

Though announced in summer 2021 as part of a range of updates, Communication Safety in Messages is only being released this year, after controversy over its original design. As first proposed, parents and carers would be automatically alerted if a child under 13 sent or received sexually explicit images, but there were concerns that this would undermine user privacy and put LGBTQ+ children at risk of being outed.

Apple have removed the automatic alert and instead created an intervention system that gives the child the chance to stop before opening the image, and to reach out for help if they need it. There are hopes this will reduce the amount of self-generated imagery and CSAM being sent and received by children and young people. They are also releasing a safer search function within Spotlight, Siri, and Safari that signposts users to help if they perform searches for topics relating to child sexual abuse.

It is unclear when or how these updates will appear in the UK, and which help services would be referenced for UK users.

“Self-generated sexual imagery of children aged 7-10 years old has increased three-fold, making it the fastest growing age group. In 2020 there were 8,000 instances. In 2021 there were 27,000 – a 235% increase. [Meanwhile] self-generated content of children aged 11-13 remains the biggest age group for this kind of material. In 2021, 147,900 reports contained self-generated material involving children aged between 11 and 13. This is a 167% increase [from 2020].”

It’s important to note that Apple will not have access to these images. All processing is done by on-device machine learning, and messages retain end-to-end encryption. This feature is due to be rolled out soon in the UK (as well as Canada, New Zealand, and Australia), and has already been released in the US.

End-to-End Encryption: a term used to describe blocking or preventing any third-party recipient from viewing, reading or becoming aware of information that one individual has sent to another. (National Center for Missing & Exploited Children)

Essentially? This is a type of privacy technology that ensures only you and the person you’re communicating with can read, listen, or view your messages.
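For those curious about the mechanics, below is a minimal sketch of that principle using Apple’s CryptoKit framework. It illustrates the general idea only (it is not how iMessage itself is implemented): the two devices derive a shared key that never travels over the network, so anything encrypted with it is unreadable to anyone in between.

```swift
import CryptoKit
import Foundation

// A minimal sketch of the end-to-end encryption principle using Apple's
// CryptoKit. Illustration only: this is not Apple's iMessage implementation.
func demoEndToEndEncryption() throws {
    // Each person generates a key pair and shares only the public half.
    let alice = Curve25519.KeyAgreement.PrivateKey()
    let bob = Curve25519.KeyAgreement.PrivateKey()

    // Both sides independently derive the same symmetric key. The key itself
    // never travels over the network, so no third party can obtain it.
    let aliceKey = try alice.sharedSecretFromKeyAgreement(with: bob.publicKey)
        .hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                 sharedInfo: Data(), outputByteCount: 32)
    let bobKey = try bob.sharedSecretFromKeyAgreement(with: alice.publicKey)
        .hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                 sharedInfo: Data(), outputByteCount: 32)

    // The sender encrypts; anyone intercepting the message in transit sees
    // only unreadable ciphertext.
    let sealed = try AES.GCM.seal(Data("See you at 4pm".utf8), using: aliceKey)

    // Only the intended recipient, holding the matching key, can read it.
    let message = try AES.GCM.open(sealed, using: bobKey)
    print(String(decoding: message, as: UTF8.self)) // "See you at 4pm"
}
```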

How does it work?

This will primarily work by using on-device machine learning (technology built into the device that can detect certain objects or content in images without human involvement) to scan photos and attachments sent over Messages. The Communication Safety feature functions in two different ways, depending on whether the child or young person in your care is the sender or the receiver.
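Apple has not published details of the classifier itself, but the sketch below shows the general shape of on-device image classification on iOS, using the real Core ML and Vision frameworks. The ‘NudityClassifier’ model is a hypothetical stand-in for Apple’s private one; the point is that the whole check runs on the handset, which is why the photo never has to leave the device.

```swift
import Vision
import CoreML
import UIKit

// General shape of on-device image classification. "NudityClassifier" is a
// hypothetical stand-in model; Apple has not published the classifier that
// Communication Safety actually uses.
func imageLooksSensitive(_ image: UIImage,
                         completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage,
          let classifier = try? NudityClassifier(configuration: MLModelConfiguration()),
          let model = try? VNCoreMLModel(for: classifier.model) else {
        completion(false)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // The model returns labels with confidence scores; flag the image
        // only if the "explicit" label crosses a high threshold.
        let labels = request.results as? [VNClassificationObservation] ?? []
        completion(labels.contains { $0.identifier == "explicit" && $0.confidence > 0.9 })
    }
    // The request runs entirely on the device; the photo is never uploaded.
    try? VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```

In the live feature, a positive result triggers the blur-and-warn flow described below rather than any report back to Apple.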

If an explicit photo is received by a child…

  • The image is blurred out with a hazard sign and a warning that the image ‘may be sensitive’.
  • They can choose to block the contact immediately.
  • If they select ‘View Photo’, they are warned via a pop-up that the image could be “sensitive to view”, alongside other supportive advice, and will be asked if they are sure they want to view it.
  • If they choose to view the photo, they are given a second chance not to. They are given signposts to help, are advised that it is their decision, and prompted to speak to a parent or trusted adult if they need help.

If a child attempts to send an explicit photo…

  • A pop-up will warn them that the image may contain sensitive material and will ask them if they want to send it, alongside other supportive advice.
  • If they choose to send the image, they are given a second chance not to. They are given signposts to help, are advised that it is their decision, and told that their parents want to know they are safe. If they send the photo anyway, they will be prompted to speak to a parent or trusted adult if they need help.

This is an opt-in feature, which means parents will have to enable it through Family Sharing for it to be active on their family’s devices. Children and young people under the age of 18 must also have a connected Apple ID (set up with their correct date of birth) for the feature to work properly.

Communication Safety in Messages is not currently live in the UK, but our online safety experts will continue to monitor and update our resources when it is rolled out.

Is this the same as the CSAM (Child Sexual Abuse Material) detection feature? 

The ‘Communication Safety’ feature was first announced alongside another update aimed at tackling the growing volume of digital child sexual abuse material. The CSAM detection feature would use technology to scan photos for recognised CSAM images and report them to the appropriate services (in the US, the National Center for Missing & Exploited Children). This feature faced an overwhelming backlash over user privacy concerns, and Apple has yet to provide an update on its release.

It’s important to note that the Communication Safety in Messages feature is not the same as the CSAM detection feature.
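The key technical difference is that CSAM detection would not classify new images at all: it would match photos against fingerprints of already-known abuse material. Below is a minimal sketch of that hash-matching idea. The fingerprint function is a stand-in (Apple’s actual algorithm, NeuralHash, is private), and SHA-256 is used purely to keep the sketch runnable; a real system uses a perceptual hash so that resized or re-encoded copies of an image still match.

```swift
import CryptoKit
import Foundation

// Fingerprints of already-known abuse imagery, supplied by child protection
// bodies. Matching only ever happens against this list; the approach cannot
// recognise new or self-generated images. (Empty here: illustration only.)
let knownFingerprints: Set<String> = []

// Stand-in fingerprint function. Apple's private NeuralHash is a *perceptual*
// hash, so altered copies of the same image still match; SHA-256 is used
// here only so the sketch runs.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```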

At this time, Apple has not released Communication Safety in Messages or other mentioned updates in the UK, and there is no planned release date. Our online safety experts will continue to monitor the progress of these features.

Are there any risks?

There has been some concern around how Communication Safety in Messages and the other updates will affect a user’s right to privacy on their device.

  • Some users have voiced unease about how Apple would store user data if flagged images are sent or received, especially in relation to younger users.
  • Worries have arisen over the safety of young people who may be exploring their sexual identity without their parents’ knowledge, though Apple have since modified the feature to provide more protection.
  • If a family does not have Family Sharing activated (or if they use devices from different brands), the Communication Safety feature will not work.
  • All of these features are specific to Apple devices and applications, and do not cover interactions that happen across other brands or platforms.
  • Sexualised images may not directly feature nudity, or may be obscured to ‘hide’ explicit content, and so may not be detected. There are also open questions about what redress exists when photos are wrongly flagged.

Top Tips for Parents and Carers

  • Use the release of this feature as a conversation starter about what the child or young person in your care shares over messages. It’s important to remain calm and non-judgemental, and to try and have these conversations in comfortable settings (such as the car ride home from school or around the dinner table).
  • Go through reporting features for messaging apps and social media platforms. You can use our Online Safety Centre to help you explore and learn this together. This will help your child feel more confident in their online interactions.
  • Enable Family Sharing on your family devices if possible, and have a family discussion about the reasons why you are choosing to do this. If a child or young person becomes upset, remind them that owning devices is a responsibility that involves trust.
  • Point out the Trusted Adults in your young person’s life. Remind them that, if they ever feel they can’t talk to you about something important, they should turn to one of these people, who can help them process what they are struggling with.

To stay informed of the latest Apple updates and safeguarding news alerts, make sure you are signed up to our Safeguarding Hub.
