Instagram’s Latest Updates – An Online Safety Guide to the Latest Features
Last Updated on 30th September 2022
Many of today’s most popular platforms have begun to release more and more features and updates that claim to help parents and carers protect younger users. Instagram has been releasing updates almost monthly, with many features building on its promise to improve the impact the platform has after significant negative press.
According to Ofcom, 99% of UK children went online in 2021, with 62% saying they had more than one online profile. With Instagram being one of the most popular online platforms, it’s important to be mindful of every update they release. We’ve taken a look at the latest Instagram updates to help parents, carers, and safeguarding professionals be aware of the safety features available to them on the app – and to know how effective these settings actually are.
What’s New: Meta has expanded Instagram’s current parental control tools as well as the sensitive content control feature to include new search restrictions and inappropriate content filters. It has reported that it is working on features that will help filter out nudity and sexualised content from in-platform messaging, but these are still in development. We will continue to monitor any testing or developments as they progress.
1. Family Center
What is it?
Instagram has been releasing different types of parental controls since the start of 2022, and has finally released a comprehensive parents’ guide and education hub within their ‘Supervision’ tools, launched in the UK on 14th June, 2022. Parent company Meta calls this collection of parental controls ‘Family Center’.
How it works
Before parents can implement Family Center, an ‘invitation link’ must be sent to the young person’s account. It is the young person’s decision to accept – a parent or carer cannot enforce this without their permission. To maintain a healthy relationship and keep boundaries in place, discuss this option with the young person in your care first.
You can find all of these tools available under ‘Supervision’ in the individual account Settings tab on Instagram.
These features are only available to parents and carers with an Instagram account. Find out how to register for an Instagram account here.
The features also only apply to accounts belonging to users with a registered age of 13 to 17. If your young person has signed up to Instagram with a fake date of birth, they will need to submit a request to have their age changed.
Parents will be able to see their young person’s daily time spent on their linked Instagram account. They will also be able to set time limits between 15 minutes and 2 hours. After this limit is reached, a black screen appears on the account for the rest of the day saying they can ‘come back tomorrow’.
All the accounts connected to the young person will be visible to their parent or carer, starting with the most recent. They will be able to see how many accounts their young person is following and how many accounts are following them, as well as being able to view any individual profiles.
If an adult account has been repeatedly reported or blocked by young people, these accounts will be flagged and restricted automatically. They will not be shown any young person’s account or content in Explore, Reels, or suggested accounts. They will also not be able to follow any young people, see comments a young person posts, or comment on their posts.
Instagram’s Supervision features do not allow parents or carers to:
Access sent or received private messages.
Block or restrict access to specific accounts or topics.
View search, browsing, or in-app activity history.
Edit or delete posts made by other accounts.
Areas of Concern
‘Finstas’ – secondary, ‘fake’ Instagram accounts kept hidden from parents – are currently popular with young people, and added involvement from parents and carers may encourage a young person to create one. This means that the supervision features may not show an accurate picture of a young person’s real in-app activity.
A young person may be frustrated by the amount of time their parent/carer allows them to be on the platform, especially if their friends are not experiencing similar restrictions. This may cause a strain on family relationships, as well as negatively impacting the young person’s mood or mental health.
There is no way to verify the relationship between the supervision account holder and the young person. This means this feature could potentially be used as a grooming tactic to ‘prove trust’ or within abusive relationships as a way to exert control over a victim. It’s important the young person in your care knows that no one should have access to this feature unless they are their parent or carer.
This feature does not alert parents or carers to potentially harmful or triggering content that a young person could be accessing via Instagram. Even if you use the supervision tool, remember to have regular conversations about how to react to harmful content with those in your care.
2. Sensitive Content Control
What is it?
The Sensitive Content Control feature allows users to have more power over the type of content they see on Instagram. While this was previously limited to the ‘Explore’ page, it has now been extended to wherever Instagram makes recommendations (such as the ‘Search’ and ‘Hashtag’ pages, as well as a user’s personal feed). Instagram is also planning a shortcut to this feature on the ‘Explore’ page, a ‘Not Interested’ button to let users quickly tell the platform’s algorithms what content they do not wish to see, and a ‘Nudity filter’ for chats that is in the early stages of development – but none of these features has been given a release date.
A ‘hidden words’ feature has also been added to privacy settings, which will allow parents, carers, and young people to set a restriction on specific words, phrases, numbers, or emojis that might be upsetting, offensive, or triggering to them. DMs, posts, and comments containing these words will be automatically filtered from view in order to protect the user.
How it works
Sensitive content is defined by Instagram as “posts that don’t necessarily break our rules, but could potentially be upsetting to some people.” This includes content that is violent, sexually explicit/suggestive, or regulated drug/alcohol products.
There are three options for the ‘level’ of sensitive content that can appear. These options have been renamed to help users better understand how this filtering system works.
More (previously ‘Allow’) – Users may see more photos and videos that could be inappropriate and/or offensive for younger audiences. This option is not available to users under the age of 18.
Standard (previously ‘Limit’) – Users may see some photos and videos that could be inappropriate and/or offensive for younger audiences. This option is the default setting for all users.
Less (previously ‘Limit Even More’) – Users may see fewer photos and videos that could be inappropriate and/or offensive for younger audiences. This option is recommended for users under the age of 18.
Sensitive Content Control and Hidden Words can both be found within ‘Settings’ on an individual Instagram account: Sensitive Content Control sits under the ‘Account’ settings, while Hidden Words can be added and customised under the ‘Privacy’ settings.
Remember: Instagram’s minimum age requirement is 13. Some children may lie about their age and, if they say they are over 18, could be exposed to sensitive content. Talk to the children in your care about age restrictions and why it’s important to follow them.
Areas of Concern
Young people may try to ‘break’ a search term by adding different letters (such as å, ë, and ô) or symbols (like $ or @) to surface results that might otherwise be blocked (e.g. $åf£).
Instagram has previously faced accusations of inaccurate or inappropriate censorship, such as removing images of breastfeeding mothers as sexualised content.
No filter of this type is 100% foolproof. Young people may still be exposed to inappropriate, upsetting, or harmful content even with this filter enabled.
3. In-App Nudges
What is it?
This feature is designed to help encourage young people using the platform to ‘discover something new’ while also excluding topics that may be associated with appearance comparison or fixation. The ‘nudge’ will use a new type of notification to interrupt a user if they are spending too much time browsing posts with themes that might make them anxious, upset, or self-conscious. They will then be redirected to an array of ‘positive’ options to choose from to ‘explore next’.
Instagram’s parent company Meta has faced significant criticism surrounding the harmful impact the platform has on mental health (especially for teenage girls), with claims that being on Instagram led to an increase in eating disorders and suicidal thoughts. Since these claims came to light, Instagram has begun releasing multiple features that attempt to support a user’s mental health and self-perception while on the platform.
Areas of Concern
There is no information on how the platform’s algorithms will decide what is potentially upsetting or triggering to a user, or how long they are allowed to browse before being notified.
This does not eliminate the presence of these types of posts on Instagram or the residual feelings a user may experience after seeing them.
As with other ‘interrupting’ features on the app, a user may be able to easily ignore the notification and continue to browse the topics.
There are no signposts to help for users dealing with significant issues like mental health struggles or eating disorders. Simply redirecting to ‘happier’ topics may actually increase feelings of helplessness or isolation.
4. ‘Take a break’
What is it?
This updated feature allows users to enable a notification system that will interrupt their in-app activity with a ‘Time to take a break?’ message after a selected period: 10, 20, or 30 minutes. A list of possible alternative activities (such as ‘Go for a walk’, ‘Listen to music’, or ‘Write down what you’re thinking’) may also appear. Initial testing showed that 90% of users kept this feature enabled.
‘Break’ reminders will also feature well-known Instagram creators to increase screentime awareness and encourage users to take a break from the online environment.
Areas of Concern
Settings like this may create an overreliance on technology for a young person’s digital wellbeing and screentime awareness.
There is no further education or signposting to resources on the impact of screentime overuse. Take some time to talk to the young person in your care about why break times are important!
Our testers found this feature extremely easy to ignore and noticed that it does not restrict users from in-app activity. While it creates a brief moment of awareness, this is soon forgotten once browsing resumes.
Using popular platform creators might annoy or frustrate young people. It may seem like an inauthentic (or even hypocritical) message from someone who makes their living by encouraging users to view and engage with their content.
5. Age Verification
On 23rd June 2022, Instagram announced a partnership with Yoti – the age verification system approved by the Home Office. They claim Yoti’s ‘privacy-preserving’ technology will help them provide options for users to verify their age on the platform, allowing them to offer more ‘age-appropriate experiences’. Yoti uses Artificial Intelligence to estimate the age of a user and verify their identity against a provided image, piece of ID, and/or short video selfie. This detects whether a user has given the wrong age or has taken a photo from the internet. Yoti is currently used by platforms like Yubo and is GDPR compliant.
This testing is only available to selected users in the US. We will continue to monitor its progress and will update our information once it is released in the UK.