The biggest boon the internet has brought, besides the humongous amount of data, is easier communication. Through the internet, people from around the world can connect and exchange information regardless of physical distance.
Technological advancement has benefited not only adults but also the younger generation. Not long ago, gadgets like computers and cellphones were far less advanced and much more expensive, not to mention the eyebrow-raising internet bills. Now, these devices are commonplace, and new brands keep releasing cheaper models with features comparable to high-end ones.
As a result of this exposure, more and more young people are using the internet. One can't deny the necessity, either, as it gives children more ways to stay in touch with their parents. Schools, too, now use certain applications for information dissemination and collaboration. This was especially evident during quarantine, when health and safety protocols required students to learn from home.
Discord is one of these applications often used for communication and collaboration. So the million-dollar question is: is the Discord app safe for kids? Here is a brief Discord review for parents to help you decide.
What parents need to know about Discord first of all is that it is a free social platform primarily used for communication, which technically makes it a social media application. A more apt description, though, is that it is a cross between messaging apps like Facebook Messenger and Skype and forum sites like Reddit.
It can also be called an instant messaging (IM) application or a Voice over Internet Protocol (VoIP) app. If you're unfamiliar with the latter, VoIP is a technology that uses the internet instead of analog phone lines to make voice calls.
Discord was originally meant to be a way for people in the gaming community to interact with each other, and to this day it remains a popular communication channel among gamers. Over time, however, it also caught on with a much broader audience and grew to more than 100 million users, making it a huge social network.
To understand it better, here are some of the basic terms you need to know when using Discord:
Administrators, or admins for short, are the people who created the Discord server. They run the server as a whole, from making the rules and letting new people join to assigning moderators and handling other administrative tasks.
Moderators play an important role in maintaining the community. They assist the admins in making sure the server is a safe and healthy environment for its members. Their tasks include screening messages and members, deleting messages, and banning members who violate the rules.
A server is the overall space that houses a Discord community. Servers can be either public or private. You can join a public server anytime with an invite link, while private servers require the approval of the moderators before admitting a new member.
Channels are subsections of a server. They are like rooms where members gather and talk. Besides the default channels created along with the server, members can add as many channels as they need, rename them, and use them to organize topics of conversation.
Since Discord is free, all you need is an email address and some personal information to make an account. According to its guidelines, users must be at least 13 years old to create an account and use the app.
Even so, this rule is easy to get around because there is no formal way to check. It is up to the user to self-report.
The anonymity and diverse ages of the users make it risky for kids to use the app, even more so without guidance. How safe is Discord chat for kids? Let's discuss the two main types of chat and the risks involved.
Text chat channels work much like regular chat applications. You can share text, images, videos, links, and other file types within the server.
Moderators keep an eye out for messages containing banned content and delete them. Repeated or serious violations can even lead to a member being suspended or kicked off the server.
Discord's AutoMod keyword filters help by automatically flagging messages that contain any of these banned words. Besides the commonly flagged words, moderators can add custom keywords, and there's even a spam filter in case spam starts appearing in the channels.
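For technically inclined readers curious about what keyword filtering looks like under the hood, here is a minimal sketch of a custom moderation bot written with the discord.py library. This is an illustration only, not Discord's built-in AutoMod: the keyword list and bot token are placeholder assumptions, and running a real bot requires registering one in the Discord Developer Portal.

```python
# Illustrative sketch of keyword filtering with discord.py (not Discord's AutoMod).
# BANNED_KEYWORDS and "YOUR_BOT_TOKEN" are placeholders for this example.
import discord

BANNED_KEYWORDS = {"badword1", "badword2"}  # hypothetical banned-word list

intents = discord.Intents.default()
intents.message_content = True  # needed so the bot can read message text
client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message):
    # Ignore the bot's own messages
    if message.author == client.user:
        return

    content = message.content.lower()
    if any(word in content for word in BANNED_KEYWORDS):
        # Delete the flagged message and post a short notice in the channel
        await message.delete()
        await message.channel.send(
            f"{message.author.mention}, your message was removed for violating the server rules."
        )

client.run("YOUR_BOT_TOKEN")  # placeholder token
```

In practice, Discord's own AutoMod covers this kind of filtering without any custom code; the sketch simply shows how such a filter works in principle.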
While users may still be allowed to discuss NSFW (not safe for work) topics in regular channels, depending on the server guidelines, dedicated NSFW text channels keep that content away from members who aren't comfortable seeing it and from users who might accidentally wander in.
Voice channels, meanwhile, are where members talk over voice or video calls. They also host Discord's integration features, which let members share their screens, stream games, or simply watch others do so.
Even there, users can still type in the text chat if they can't or don't want to talk through a microphone.
Because people talk more freely in this type of chat, it is harder to keep things under control. This necessitates manual moderation, which means keeping a close eye on all manner of interaction.
There is no exact answer, as it depends on the situation, much like many other corners of the internet. What is certain is that parents need to be careful and critical when it comes to keeping kids safe online.
While children under the age of 13 should not have a Discord account, there are times when it is necessary or the account was created without parental supervision. So, given all this, is Discord dangerous? If you're still on the fence, here is a summary of the pros and cons of children using the app.
It's easy to use, thanks to its clear labels and user-friendly interface.
You can send messages in varying formats in the text chat, call via audio or video, stream games, and share your screen. It's customizable as well, allowing you to modify the server as needed.
Most admins and moderators screen users before or after admitting them to the server, making sure that kids don’t end up in places they shouldn’t be.
Users can also adjust their settings to control who can send them friend requests. The default is everyone, but it can be limited to friends of friends or server members only.
As for private messaging, Discord has a Safe Direct Messaging setting that scans direct messages for explicit content.
AutoMod is an extra safety net that makes the moderators' jobs easier by automatically filtering and flagging content that violates the server's rules and policies. It keeps working even when no moderators are online.
Discord's user-friendliness is a double-edged sword: it also makes it easy for kids to learn how to create an account and use the app without supervision.
As previously mentioned, there's no formal way of verifying a user's age. Users also don't have to display their age on their profile, which makes it hard for moderators to figure out whether someone is old enough for the server.
Public servers, as the name suggests, allow any user to join. While moderation still exists, the moderators and admins may not screen members, or the server may be too large for them to manage. In some cases, the server simply doesn't care.
Yes, the NSFW chats are labeled and NSFW content can be censored, but all a kid has to do is click past the age-restriction warning to get an eyeful of the content.
There's currently no other way around it, short of baby-proofing the whole application and alienating the majority of the user base.
Yes, it's inevitable for kids to be active on the internet. While there are safe spaces made specifically for kids, it's not fair to expect every platform to adhere to kid-safety guidelines.
This is why parents need to at least know enough about the internet, or specific applications, in order to help guide their kids to appropriate spaces. You don't have to micromanage their every move, but you should advise them on what to do in certain situations and teach them about the risks and dangers of using Discord.
Meanwhile, server owners and moderators carry part of the responsibility as well. Despite its capabilities, AutoMod is not infallible and can't catch everything; it cannot accurately discern a user's context and intent. Managing a server, especially a very large one, is also difficult.
Chekkee, with its human and AI-powered moderation, can be your partner. Instead of vetting each potential moderator and manually configuring AutoMod, you can employ a company that specializes in moderating websites for children.
It provides 24/7 service, monitoring and flagging inappropriate content around the clock. As an added safety measure, Chekkee works with the right people who can help handle emergencies or threats to children's online safety.
Build a plan that works for you. Contact us!