
Last Updated on 30th September 2022


Many of today’s most popular platforms have begun releasing more and more features and updates that claim to help parents and carers protect younger users. Instagram has been releasing updates almost monthly, with many features building on its promise to improve the platform’s impact on young people following significant negative press.

According to Ofcom, 99% of UK children went online in 2021, with 62% saying they had more than one online profile. With Instagram being one of the most popular online platforms, it’s important to be mindful of every update they release. We’ve taken a look at the latest Instagram updates to help parents, carers, and safeguarding professionals be aware of the safety features available to them on the app – and to know how effective these settings actually are. 

 

What’s New: Meta has expanded Instagram’s existing parental control tools, as well as the Sensitive Content Control feature, to include new search restrictions and filters for inappropriate content. It has reported that it is working on features to filter nudity and sexualised content out of in-platform messaging, but these are still in development. We will continue to monitor any testing or developments as they progress.

1. Family Center

What is it?

Instagram has been rolling out different types of parental controls since the start of 2022, and has now released a comprehensive parents’ guide and education hub within its ‘Supervision’ tools, launched in the UK on 14th June 2022. Parent company Meta calls this collection of parental controls ‘Family Center’.

How it works

Before parents can implement Family Center, an ‘invitation link’ must be sent to the young person’s account. It is the young person’s decision to accept – a parent or carer cannot enforce this without their permission. To maintain a healthy relationship and keep boundaries in place, discuss this option with the young person in your care first.

You can find all of these tools under ‘Supervision’ in the Settings tab of an individual Instagram account.

[Screenshots: Supervisee screen and Supervisor screen]

Instagram’s Supervision features include: 

  • Parents will be able to see the daily time their young person spends on their linked Instagram account. They will also be able to set time limits of between 15 minutes and 2 hours; once this limit is reached, a black screen appears on the account for the rest of the day saying they can ‘come back tomorrow’.
  • Parents can set ‘take a break’ reminders at intervals of 10, 20, or 30 minutes (depending on parental preference) to remind the young person to have a break from their screen.
  • If a young person reports an account or post, a notification will be sent to the supervising account with details of the report (including the reported account/post and why it was reported).
  • All the accounts connected to the young person will be visible to their parent or carer, starting with the most recent. Parents will be able to see how many accounts their young person is following and how many accounts are following them, as well as view any individual profile.
  • Adult accounts that have been repeatedly reported or blocked by young people will be flagged and restricted automatically. These accounts will not be shown any young person’s account or content in Explore, Reels, or suggested accounts. They will also not be able to follow young people, see comments a young person posts, or comment on their posts.

Instagram’s Supervision features do not allow parents or carers to:

  • Access sent or received private messages.
  • Block or restrict access to specific accounts or topics.
  • View search, browsing, or in-app activity history.
  • Edit or delete posts made by other accounts.


2. Sensitive Content Control

What is it?

The Sensitive Content Control feature allows users to have more power over the type of content they see on Instagram. While this was previously limited to the ‘Explore’ page, it has now been extended to wherever Instagram makes recommendations (such as the ‘Search’ and ‘Hashtag’ pages, as well as a user’s personal feed). Instagram is also planning a shortcut to this feature on the ‘Explore’ page, a ‘Not Interested’ button to let users quickly tell the platform’s algorithms which content they do not wish to see, and a nudity filter for chats that is still in the early stages of development; none of these features has been given a release date.

A ‘hidden words’ feature has also been added to privacy settings, allowing parents, carers, and young people to restrict specific words, phrases, numbers, or emojis that might be upsetting, offensive, or triggering to them. DMs, posts, and comments containing these words will be automatically filtered from view to protect the user.

How it works

Sensitive content is defined by Instagram as “posts that don’t necessarily break our rules, but could potentially be upsetting to some people.” This includes content that is violent or sexually explicit/suggestive, or that features regulated products such as drugs or alcohol.

There are three options for the ‘level’ of sensitive content that can appear. These options have been renamed to help users better understand how this filtering system works. 

  • More (previously ‘Allow’) – Users may see more photos and videos that could be inappropriate and/or offensive for younger audiences. This option is not available to users under the age of 18.
  • Standard (previously ‘Limit’) – Users may see some photos and videos that could be inappropriate and/or offensive for younger audiences. This option is the default setting for all users.
  • Less (previously ‘Limit Even More’) – Users may see fewer photos and videos that could be inappropriate and/or offensive for younger audiences. This option is recommended for users under the age of 18.

Sensitive Content Control and Hidden Words can both be found within the ‘Settings’ of an individual Instagram account: Sensitive Content Control sits under ‘Account’ settings, while Hidden Words can be added and customised under ‘Privacy’ settings.


3. In-App Nudges

What is it?

This feature is designed to encourage young people using the platform to ‘discover something new’ while excluding topics that may be associated with appearance comparison or fixation. The ‘nudge’ uses a new type of notification to interrupt a user who is spending too much time browsing posts with themes that might make them anxious, upset, or self-conscious. They will then be redirected to an array of ‘positive’ options to choose from to ‘explore next’.


4. ‘Take a break’

What is it?

This updated feature allows users to enable a notification system that will interrupt their in-app activity with a ‘Time to take a break?’ message after a set period of 10, 20, or 30 minutes. A list of possible alternative activities (such as ‘Go for a walk’, ‘Listen to music’, or ‘Write down what you’re thinking’) may also appear. Initial testing showed that 90% of users kept this feature enabled.

‘Break’ reminders will also feature well-known Instagram creators to increase screentime awareness and encourage users to take a break from the online environment.  


5. Partnership with Yoti

What is it?

On 23rd June 2022, Instagram announced a partnership with Yoti, the age verification system approved by the Home Office. Instagram claims Yoti’s ‘privacy-preserving’ technology will help provide options for users to verify their age on the platform, allowing it to offer more ‘age-appropriate experiences’. Yoti uses artificial intelligence to estimate a user’s age and verify their identity against a provided image, piece of ID, and/or short video selfie. This helps detect whether a user has given a false age or submitted a photo taken from the internet. Yoti is currently used by platforms such as Yubo and is GDPR compliant.

This testing is only available to selected users in the US. We will continue to monitor its progress and will update our information once it is released in the UK.



