Last Updated on 27th June 2022
As we get closer to the summer holidays, many platforms have begun to release features and updates that claim to help parents and carers protect younger users. Instagram is the most recent to do this, with several features that build on its promise to improve the platform's impact following significant negative press.
According to Ofcom, over 80% of young people use social media apps and platforms every single day. With Instagram being one of the most popular, it’s important to be mindful of every update. We’ve taken a look at Instagram’s latest updates to help parents, carers, and safeguarding professionals be aware of the safety features available to them on the app – and to know how effective these settings actually are.
1. Parental Controls
What is it?
Instagram has been releasing different types of parental controls since the start of the year, and has finally released its promised ‘Supervision’ tools. This new set of safety settings was launched in the UK on June 14th, 2022.
How it works
Before parents can implement Instagram's Supervision feature, an 'invitation link' must be sent to the young person's account. It is the young person's decision to accept – a parent or carer cannot enforce this without their permission. To maintain a healthy relationship and keep boundaries in place, discuss this option with the young person in your care first.
Instagram's Supervision features include:
- Seeing how much time the young person spends on Instagram and setting daily time limits
- Seeing which accounts the young person follows and which accounts follow them
- Receiving a notification when the young person reports an account or post
2. Sensitive Content Control
What is it?
The Sensitive Content Control feature gives users more power over the type of content they see on Instagram. While this was previously limited to the 'Explore' page, it has now been extended to wherever Instagram makes recommendations (such as the 'Search' and 'Hashtag' pages, as well as a user's personal feed). Instagram is planning to add a shortcut to this feature on the 'Explore' page, along with a 'Not Interested' shortcut that lets users quickly tell the platform's algorithms which content they do not wish to see. Neither shortcut has been given a release date.
How it works
Sensitive content is defined by Instagram as “posts that don’t necessarily break our rules, but could potentially be upsetting to some people.” This includes content that is violent, sexually explicit/suggestive, or promoting regulated drug/alcohol products.
There are three options for the ‘level’ of sensitive content that can appear. These options have been renamed to help users better understand how this filtering system works.
- More (previously ‘Allow’) – Users may see more photos and videos that could be inappropriate and/or offensive for younger audiences. This option is not available to users under the age of 18.
- Standard (previously ‘Limit’) – Users may see some photos and videos that could be inappropriate and/or offensive for younger audiences. This option is the default setting for all users.
- Less (previously ‘Limit Even More’) – Users may see fewer photos and videos that could be inappropriate and/or offensive for younger audiences. This option is recommended for users under the age of 18.
Remember: Instagram’s minimum age requirement is 13. Some children may lie about their age and, if they say they are over 18, could be exposed to sensitive content. Talk to the children in your care about age restrictions and why it’s important to follow them.
3. In-App Nudges
What is it?
This new feature is designed to encourage young people using the platform to 'discover something new' while excluding topics that may be associated with appearance comparison or fixation. The 'nudge' uses a new type of notification to interrupt users who are spending too much time browsing posts with themes that might make them anxious, upset, or self-conscious. They are then redirected to an array of 'positive' options to choose from to 'explore next'.
4. ‘Take a break’
What is it?
This updated feature allows users to enable a notification system that interrupts their in-app activity with a 'Time to take a break?' message after a set period: 10, 20, or 30 minutes. A list of possible alternative activities (such as 'Go for a walk', 'Listen to music', or 'Write down what you're thinking') may also appear. Initial testing showed that 90% of users kept this feature enabled.
New ‘break’ reminders will now feature well-known Instagram creators to increase screentime awareness and encourage users to take a break from the online environment.
5. Partnership with Yoti
What is it?
On June 23rd, 2022, Instagram announced a partnership with Yoti – an age verification system approved by the Home Office. Instagram claims Yoti's 'privacy-preserving' technology will provide options for users to verify their age on the platform, allowing it to offer more 'age-appropriate experiences'. Yoti uses artificial intelligence to estimate a user's age and verify their identity against a provided image, piece of ID, and/or short video selfie. This helps detect whether a user has given a false age or submitted a photo taken from the internet. Yoti is already used by platforms such as Yubo and is GDPR compliant.
Testing is currently only available to selected users in the US. We will continue to monitor its progress and will update our information once it is released in the UK.