Discord Security Settings – A Guide for Parents and Carers
In this blog we take a look at Discord and the privacy and security settings available for children and young people, so that parents and carers can help them navigate the platform more safely.
How Do People Use Discord?
Discord has been around for several years but has gained considerable popularity more recently, with an active user base in the millions.
Similar to other social media apps, Discord offers features that will be familiar to anyone used to navigating these platforms. The app is free to use and, while it was originally aimed primarily at the gaming community, it has since expanded into a wider social network. It now accommodates more general interests through forums called ‘servers’. People can chat on Discord through messaging, video calling or audio. Users can also send direct messages between two people, which aren’t visible to others.
Servers vary in terms of the topic or interest being discussed. Fan bases can come together to discuss a particular film, game or hobby. Some servers require an invitation to join, whilst others are public and open to anyone. If they choose, users can also limit chats to just friends and family.
Discord hosts communities covering a wide range of topics, some of which may be inappropriate for certain age groups. This can extend to mature content, including offensive language or graphic imagery. Because chats in servers are publicly visible, Discord’s chat features can also sometimes result in unwanted contact from strangers.
Similar to other social media apps, Discord offers a lot of freedom in how users can socialise with others online. It is useful for parents and carers to become familiar with the app so that they are aware of what privacy and security measures are available for children and young people.
Discord Security Features
Users must be at least 13 to create a Discord account. As this relies on honesty at sign-up, it is up to the individual (and, for younger users, family members) to ensure that correct information is given. Some servers on Discord also require a minimum age of 18 to join.
The platform provides a variety of security features that can help set boundaries around exposure to harmful content, unwanted contact and friend requests. These include:
- Direct Messaging – Media such as images and videos sent via direct message can be scanned for explicit content. If explicit content is found, it can be deleted by Discord. Users can choose their preferred option: scanning all messages (Keep me safe), scanning only those from strangers (My friends are nice) or no scanning (Do not scan). For users under 18, all messages are scanned by default. To review the scanning options, go to Privacy and Safety in the Settings.
- Unwanted Direct Messaging – Users can set their privacy settings to restrict direct messages from other server members. This feature can be toggled on or off at any time in the settings and can also be applied to servers that have already been joined. Direct messages are allowed by default when an account is created, so these restrictions need to be set manually in Privacy and Safety.
- Unwanted Friend Requests – Users can choose who is allowed to send them a friend request in the Privacy and Safety section: Everyone, Friends of Friends or Server Members.
- Blocking – Users can block accounts that are being harmful or offensive. This restricts contact from them through direct messages and hides their content on a server. To block someone, click on their profile and select the Block option.
Considerations for Parents and Carers
The privacy features available on Discord can help set some boundaries; however, children may still encounter explicit language and harmful content written by other users. It is important to use the blocking and reporting features on the platform if harmful content is experienced and is causing upset.
Parents and carers can also have an open discussion with their family about the potential risks of socialising online, so that family members feel confident coming forward and seeking the correct support if problems arise. This can also give children and young people the confidence to ignore unwanted friend requests, speak out if something upsets them online, and leave server chats that make them feel uncomfortable.
If you are looking to report legal but harmful material online and have reported content to Discord with no action taken, you can find further support at Report Harmful Content. Anyone over the age of 13 can use this service, which is provided by the UK Safer Internet Centre and operated by SWGfL. Find out more here: