Last Updated on 19th November 2021

Every day, we see the release of new apps and new features that promise to help keep you more connected to others. One of the most popular online spaces for children and young people to connect is Discord. Keep reading to find out everything you need to know about this platform and why it presents a risk to those in your care.

What is Discord?

Discord is a free online platform that hosts voice, video, and text chat. It was founded in 2015 by two friends who wanted a better way for gamers to connect with each other while playing the games they love.

The global COVID-19 pandemic saw a surge in active users on the platform, especially after a temporary increase in the number of people who could participate in a video chat to help users through lockdown isolation. Now, Discord calls itself a “space for everyone to find belonging.”

How Does it Work?

While this platform maintains it is “not social media” due to its lack of algorithms, news feeds, and other familiar functions, it does focus on user interaction. It uses a simple design layout and is split up into online communities called “servers”. All users can create their own server for free. You can purchase premium memberships with perks and enhanced features, but this is not necessary to fully experience Discord. Servers are based on individual topics or interests (such as Among Us, reading, or sports teams). They can be public (anyone can request to join) or private (requiring an invitation from an admin or moderator).

Once granted access to a server, users can participate in an open chat with other users from all over the world. Text, video, and voice chat options are available, with limits on how many people can join in. Many young people use the ‘screenshare’ option to communicate with each other while watching films, playing multiplayer video games, or watching sports matches. There are private chat options available as well.

©Discord: Blurred screen showing the Discord desktop view

Age Restrictions & Safety Settings

Discord has relatively ineffective age verification measures. Its terms require users to be at least 13 to use the platform, but it only requests a date of birth on registration without asking for verified ID. User accounts also cannot be made fully private. This means that any user can see another user’s profile and contact them, even if they are under 18.

The platform says it has an automatic privacy setting for users under 18 called Keep Me Safe. This scans all direct messages to block explicit content and restricts access to NSFW servers. These often have pornographic content, with some belonging to extremist or predatory groups.

NSFW: Not safe for work. This acronym serves as a warning for explicit content, implying that the viewer should use discretion.

Day-to-day moderation on Discord is done by individual server moderators (ordinary users who are not paid by the company). It’s important to note that some content is visible to non-members if the server is set to public.

Our online safety experts signed up as a 13-year-old and were able to switch off the Keep Me Safe filter in settings. They received a pop-up warning that the NSFW server was not suitable for users under the age of 18, but they were able to click OK and proceed.

What are the risks?

Discord’s simple design and special interest categories are especially appealing to children and young people. However, this creates a prime environment for someone with harmful intentions to easily build rapport with a young person based on shared interests. That rapport fosters an illusion of friendship and trust, and can lead to more serious consequences.

  • Sexual exploitation – There have been several cases of predators grooming children through chatting on Discord, resulting in sexual abuse and/or kidnapping after the predator arranged an in-person meet up with the child.
  • Inappropriate or illegal content – It is not difficult to find NSFW material on many Discord servers, even ones that are meant to be safe for under 18s. There have been reports of non-consensual imagery and videos being shared, with one server in 2020 found to have over 140,000 images being widely shared and distributed between members. Our online testers were easily able to find multiple servers dedicated to specific types of pornography and sexual interests without any age verification.
  • Lack of centralised moderation – Discord relies on its server moderators and administrators to report any bad or dangerous behaviour. This lack of corporate moderation can mean situations are allowed to progress before they are shut down, which can leave young people open to harm.
  • Harsh or inappropriate language and bullying – As Discord operates on a live chat basis, video and voice chat are popular contact options. We have received reports from several concerned parents shocked at the racist, misogynistic, and sexual language they overheard while their young person was on the platform.

Our Top Tips

  • Talk to the young person in your care about the importance of privacy settings. Encourage them to visit Discord servers that are age-appropriate and only communicate with people they know.
  • If your child is under 18, make sure they have the correct age on their profile so that Discord’s age protection features can work correctly. Remember – this setting is not foolproof.
  • Make sure they know who to turn to if they see something that upsets them online. Read our blog on Trusted Adults to find out more.
  • Make sure the young person in your care knows about healthy online relationships and how to set safe boundaries.
  • Outline why it’s important not to share any personal information, identification, or photos with anyone online.
  • Learn together how to block users and control who can send you friend requests through our easy-to-follow video guides on our Safety Centre.

  • You can also use our one-page Safety Card on Discord, which covers reporting, blocking, and privacy.


Who are your Trusted Adults?

The Trusted Adult video explains who young people might speak to and includes examples of trusted adults, charities and organisations.

Pause, Think and Plan

Use our video for guidance and advice around constructing conversations about the online world with the children in your care.
