Safety Library

Everything you could ever want to know about safety on Discord. Whether you’re a user, a moderator, or a parent, discover all of our tools and resources and how to use them.

Safety

Four steps to a super safe account

1. Secure your account

First, we need to ensure that your account credentials and login information are as secure as possible.

Choose a secure password

  • Having a strong password is key to protecting your account. Choose a long password with a mix of uppercase letters, lowercase letters, and special characters that is hard to guess and that you don’t use for anything else.
  • We recommend checking out password managers like 1Password or Dashlane, which make creating and storing secure passwords extremely easy.
  • Discord will require your password to be at least 8 characters long. Certain regions in the world may have additional requirements.

Consider enabling two-factor authentication (2FA)

  • Two-Factor Authentication (2FA) is the most secure way to protect your account. You can use Google Authenticator, Authy, or another authenticator app on a mobile device to authorize access to your account (see the sketch after this list for how these apps generate codes). Once 2FA is enabled, you’ll have the option to further increase your account’s security with SMS Authentication by adding your phone number to your Discord account.
  • You can enable 2FA in your User Settings. You can also refer to this article for more information.
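
Authenticator apps generate these codes using the time-based one-time password (TOTP) standard: the app and the service share a secret, and each derives a short code from it that changes every 30 seconds. The sketch below illustrates the idea with the third-party pyotp Python library; it is only an illustration of how authenticator apps work, not Discord's implementation.

    # Minimal TOTP sketch using the third-party pyotp library (illustration only).
    import pyotp

    # The shared secret is what the QR code encodes when you enable 2FA.
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    code = totp.now()                 # 6-digit code that rotates every 30 seconds
    print("Current code:", code)
    print("Accepted?", totp.verify(code))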

2. Set your privacy & safety settings

Your settings are very important. They give you control over who can contact you and what they can send you. You can access your privacy and safety settings in the Privacy & Safety section of your User Settings.

We know it’s important for users to understand what controls they have over their experience on Discord and how to be safer. Part of delivering a better, safer experience is making sure people don’t see content they don’t want to – whether that’s intrusive spam or unwanted sensitive media. This article covers settings that can help reduce the amount of unwanted content you see on Discord and promote a safer environment for everyone.

Sensitive media

At any time, you can further configure personal settings to blur or block content in DMs that may be sensitive. The “blur” option in our sensitive content filters applies to all historic and new media. For teen users, by default, Discord will blur media that may be sensitive in direct messages (DMs) and group direct messages (GDMs) from friends, as well as in servers. Adults can opt into these filters by changing their sensitive media preferences in Privacy & Safety Settings. Learn more about how to do this here. For all users, the default configuration is “block” for direct messages (DMs) and group direct messages (GDMs) from non-friends.

At any time, you can also block the user responsible and report the content that violates our Community Guidelines or Terms of Service.

DM spam filter

This setting automatically sends direct messages that may contain spam to a separate spam inbox.

These filters are customizable and you can choose to turn them off. By default, these filters are set to “Filter direct messages from non-friends.” Choose “Filter all direct messages” if you want all direct messages that you receive to be filtered, or select “Do not filter direct messages” to turn these filters off.

Direct messages (DM) settings

  • You might only want certain people to contact you. By default, whenever you’re in a server with someone else, they can send you a direct message (DM).
  • You can toggle off the "Allow direct messages from server members" setting to block DMs from users in a server who aren’t on your friends list. When you toggle this setting off, you will be prompted to choose if you would like to apply this change to all of your existing servers. If you click "No," you’ll need to adjust your DM settings individually for each server that you have joined prior to toggling this setting off.
  • To change this setting for a specific server, select Privacy Settings on the server’s dropdown list and toggle off the "Allow direct messages from server members" setting.

Friend request settings

The last thing to do in your security settings is determine who can send you a friend request. You can find these settings in the Friend Requests section of your User Settings.

  • Everyone - Selecting this means that anyone who knows your Discord Tag or is in a mutual server with you can send you a friend request. This is handy if you don’t share servers with someone and you want to let them friend you with just your Discord Tag.
  • Friends of Friends - Selecting only this option means that for anyone to send you a friend request, they must have at least one mutual friend with you. You can view this in their user profile by clicking the Mutual Friends tab next to the Mutual Servers tab.
  • Server Members - Selecting this means users who share a server with you can send you a friend request. Deselecting this while "Friends of Friends" is selected means that you can only be added by someone with a mutual friend.

If you don’t want to receive ANY friend requests, you can deselect all three options. However, you can still send out friend requests to other people.

You should only accept friend requests from users that you know and trust — if you aren’t sure, there’s no harm in rejecting the friend request. You can always add them later if it’s a mistake.

3. Follow safe account practices

As with any online interaction, we recommend following some simple rules while you’re on Discord:

Be wary of suspicious links and files

  • DON'T click on links that look suspicious or appear to have been shortened or altered. Discord will try to warn you about questionable links, but that’s no substitute for thinking before you click.
  • DON'T download files or applications from users you don't know or trust. Were you expecting a file from someone? If not, don’t click the file!
  • DON'T open a file that your browser or computer has flagged as potentially malicious without knowing it’s safe.

Never give away your account information

  • DON'T give away your Discord account login or password information to anyone. We’ll never ask for your password. We also won’t ask for your token, and you should never give that to anyone.
  • DON'T give away account information for any account you own on any platform to other users on Discord. Malicious individuals might ask for this information and use it to take over your accounts.
  • DO report any accounts that claim to be Discord staff or that ask for account information to the Trust & Safety team.

Again, Discord will never ask you for your password either by email or by Discord direct message. If you believe your account has been compromised, submit a report to Trust & Safety here.

4. Block other users when needed

We understand that there are times when you might not want to interact with someone. We want everyone to have a positive experience on Discord and have you covered in this case.

How blocking works

  • When you block someone on Discord, they will be removed from your friends list (if they were on it) and will no longer be able to send you DMs.
  • Any message history you have with the user will remain, but any new messages the user posts in a shared server will be hidden from you, though you can see them if you wish.

How to block a user

On desktop:

  • Right-click the user's @Username to bring up a menu.
  • Select Block in the menu.

On mobile:

  • Tap the user's @Username to bring up the user's profile.
  • Tap the three dots in the upper right corner to bring up a menu.
  • Select Block in the menu.

If you have blocked a user but they create a new account to try and contact you, please report the user to the Trust & Safety team. You can learn more about how to do this at this link.

Safety

Tips against spam and hacking

General Tips to Protect Against Spam and Hacking

  • Never click on unfamiliar or unexpected links. If you leave Discord by clicking on a link that takes you elsewhere, it's possible that the external site can access your personal information. We recommend scanning any unfamiliar links through a site checker like Sucuri or VirusTotal before clicking on them. You may also consider running all shortened URLs through a URL expander so you know exactly where you will be directed (see the sketch after this list).
  • Never download unfamiliar files from anyone you don't know or trust.
  • Be careful about sharing personal information. Discord is a great way to meet new friends and join new communities, but as with any online interaction, protect yourself by only sharing personal information with people you know and trust.
  • Discord will only make announcements through our official channels. We do not distribute information secondhand through users or chainmail messages.
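
A URL expander simply follows a shortened link's redirects without rendering the destination page. Below is a minimal sketch of the idea in Python using the third-party requests library; this is an illustration, not a Discord feature, and some shorteners only reveal the destination on a full GET request.

    # Follow redirects of a shortened URL and print the final destination.
    # Illustration only; assumes the third-party "requests" library.
    import requests

    def expand_url(short_url: str) -> str:
        response = requests.head(short_url, allow_redirects=True, timeout=10)
        return response.url          # the URL after all redirects

    print(expand_url("https://bit.ly/example"))  # hypothetical shortened link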

If you believe your account has been compromised, submit a report to Trust & Safety here.

If you’re getting unsolicited messages or friend requests, this article explains how to change your settings.

Spam

Discord uses a proactive spam filter to protect the experience of our users and the health of the platform. Sending spam is against our Terms of Service and Community Guidelines. We may take action against any account, bot, or server using the tactics described below or similar behavior.

Direct Message (DM) spam

Receiving unsolicited messages or ads is a bad experience for users. These are some examples of DM spam for both users and bots:

  • unsolicited messages and advertisements
  • mass server invites
  • multiple messages with the same content over a short period of time

Join 4 Join

Join 4 Join is the process of advertising for others to join your server with the promise to join their server in return. This might seem like a quick and fun way to introduce people to your server and to join new communities, but there’s a thin line between Join 4 Join and spam.

Even if these invitations are not unsolicited, they might be flagged by our spam filter. Sending a large number of messages in a short period of time creates a strain on our service. That may result in action being taken on your account.

Joining many servers, sending many friend requests

While we do want you to find new communities and friends on Discord, we enforce rate limits against spammers who might take advantage of this through bulk joins or bulk friend requests. Joining a lot of servers simultaneously or sending a large number of friend requests might be considered spam. In order to shut down spambots, we take action against accounts that join servers too frequently or send out too many friend requests at once. The majority of Discord users will never encounter our proactive spam filter, but if, for example, you send friend requests to everyone you see in a thousand-person server within just a few minutes, we may take action on your account.

Instead of joining too many servers at once, we recommend using Server Discovery to find active public communities on topics you’re passionate about.
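
To illustrate in general terms how a rate limit like the ones described above works, here is a generic sliding-window sketch in Python. The limit and window values are hypothetical; this is not Discord's actual threshold or implementation.

    # Generic sliding-window rate limiter (illustration only).
    import time
    from collections import deque

    class RateLimiter:
        def __init__(self, limit: int, window_seconds: float):
            self.limit = limit
            self.window = window_seconds
            self.events = deque()

        def allow(self) -> bool:
            now = time.monotonic()
            # Discard events that have aged out of the window.
            while self.events and now - self.events[0] > self.window:
                self.events.popleft()
            if len(self.events) >= self.limit:
                return False          # over the limit: treat as suspicious
            self.events.append(now)
            return True

    friend_requests = RateLimiter(limit=10, window_seconds=60)  # hypothetical values
    print(friend_requests.allow())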

Servers dedicated to spamming actions

Servers dedicated to mass copy-paste messaging, or encouraging DM advertising, are considered dedicated spam servers.

Many servers have popular bots that reward active messaging. We don’t consider these to be spambots, but spamming messages to trigger these bot rewards is considered abuse of our API and may result in our taking action on the server and/or the users who participate in mass messaging. Besides cheating those systems, sending a large number of messages in a short period of time harms the platform.

Invite rewards servers

Invite reward servers promise some form of perk, often financial, for inviting other users to join the server. We strongly discourage this activity, as it often results in spamming users with unsolicited messages. If it leads to spam or another form of abuse, we may take action, including removing the users and the server.

Bots and Selfbots

If a bot contacts you to be added to your server, or asks you to click on a suspicious link, please report it to our Trust & Safety team for investigation.

We don’t create bots to offer you free products. This is a scam. If you receive a DM from a bot offering you something, or asking you to click on a link, report it.

We understand the allure of free stuff. But we’re sorry to say these bots are not real. Do not add them to your server in hopes of receiving something in return as they likely will compromise your server. If anything gets deleted, we have no way of restoring what was lost.

Using a user token in any application (known as a selfbot), or any automation of your account, may result in account suspension or termination. Our automated system will flag bots it suspects are being used for spam or any other suspicious activity. The bot, as well as the bot owner’s account, may be disabled as a result of our investigation. If your bot’s code is publicly available, remove your bot’s token from the code so it cannot be compromised (see the sketch below for one way to do this).
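
For legitimate bots, one common way to keep the token out of published code is to load it from an environment variable instead of hardcoding it. The sketch below assumes the third-party discord.py library; adapt it to whatever framework your bot uses, and if a token has already been exposed, regenerate it in the Discord Developer Portal so the leaked value stops working.

    # Minimal sketch: read the bot token from an environment variable so it
    # never appears in source code you publish. Assumes discord.py 2.x.
    import os
    import discord
    from discord.ext import commands

    bot = commands.Bot(command_prefix="!", intents=discord.Intents.default())

    @bot.event
    async def on_ready():
        print(f"Logged in as {bot.user}")

    # Set DISCORD_TOKEN in your environment; never commit it to a repository.
    bot.run(os.environ["DISCORD_TOKEN"])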

Hacking incidents, DDoS attacks

If you believe your account has been compromised through hacking, here are some steps you can take to regain access and protect yourself in the future.

1. Reset your password.

  • Choose a long password with a mix of uppercase letters, lowercase letters, and special characters that is hard to guess and isn’t used for anything else. We recommend using a password manager which can make creating and storing secure passwords extremely easy.
  • If your account’s token has been compromised, reset your password to generate a new token. You should never give your account password or token to anyone. Discord will never ask for this information.

2. Turn on Two-Factor Authentication (2FA)

Two-factor authentication (2FA) strengthens your account to protect against intruders by requiring you to provide a second form of confirmation that you are the rightful account owner. Here’s how to set up 2FA on your Discord account. If for some reason you’re having trouble logging in with 2FA, here’s our help article.

3. DDoS (Distributed Denial of Service) attacks

A distributed denial of service (DDoS) attack floods an IP address with useless requests, resulting in the attacked modem or router no longer being able to successfully connect to the internet. If you believe your IP address has been targeted in a DDoS attack, here are some steps you can take:

  • Reset your router via its manufacturer instructions.
  • Unplug your modem for 5-10 minutes and then plug it back in. This can cycle your IP address to a new one.
  • Contact your internet service provider (ISP) for assistance. Your ISP might also be able to tell you where the attack came from. Please note that Discord does not have this information.
  • If you believe this attack is coming from a user on Discord, please file a report with Trust & Safety.
  • Please note: Discord never shares your IP address with other users. Bad actors might send malicious links such as IP grabbers or other scams in an attempt to get your IP address. Never click on unfamiliar links and be wary about sharing your IP address with anyone.

Safety

Age-Restricted Content on Discord

Labeling age-restricted content properly on Discord

To help keep age-restricted content in a clearly labeled, dedicated spot, we’ve added a channel setting that allows you to designate one or more text channels in your server as age-restricted.

Anyone that opens the channel will be greeted with a notification letting them know that it might contain age-restricted material and asking them to confirm that they are over 18.

Content that cannot be placed in an age-gated channel, such as avatars, server banners, and invite splashes, must not contain age-restricted material.

Age-restricted content that is not placed in an age-gated channel will be deleted by moderators, and the user posting that content may be banned from the server.

Partnered servers on Discord should not contain age-restricted content.

It's worth mentioning that while having a dedicated place for your age-restricted content is permitted, there is still some material that isn't appropriate anywhere on Discord. Content that sexualizes minors is never allowed anywhere on Discord. If you're unsure of what is allowed on Discord, check out our Community Guidelines.

If you do not want to see age-restricted content on Discord

At any time, you can further configure personal settings to blur or block content in DMs that may be sensitive. The “blur” option in our sensitive content filters applies to all historic and new media. For teen users, by default, Discord will blur media that may be sensitive in direct messages (DMs) and group direct messages (GDMs) from friends, as well as in servers. Adults can opt into these filters by changing their sensitive media preferences in Privacy & Safety Settings. Learn more about how to do this here. For all users, the default configuration is “block” for direct messages (DMs) and group direct messages (GDMs) from non-friends.

At any time, you can block the user responsible and report the content that violates our Community Guidelines or Terms of Service.

You can control these settings by going into User Settings and selecting the Privacy & Safety section.

Safety

Discord's commitment to a safe and trusted experience

Our Approach to Delivering a Positive User Experience 

The core of our mission is to give everyone the power to find and create belonging in their lives. Creating a safe environment on Discord is essential to achieve this, and is one of the ways we prevent misuse of our platform. Safety is at the core of everything we do and a primary area of investment as a business: 

  1. We invest talent and resources towards safety efforts. From Safety and Policy to Engineering, Data, and Product teams, about 15 percent of all Discord employees are dedicated to working on safety. Creating a safer internet is at the heart of our collective mission.
  2. We continue to innovate how we scale safety mechanisms, with a focus on proactive detection. Millions of people around the world use Discord every day, and while the vast majority are engaged in positive ways, we take action on multiple fronts to address bad behavior and harmful content. For example, we use PhotoDNA image hashing to identify inappropriate images; we use advanced technology like machine learning models to identify and remedy offending content; and we empower and equip community moderators with tools and training to uphold our policies in their communities. You can read more about our safety initiatives and priorities below.
  3. Our ongoing work to protect users is conducted in collaboration and partnership with experts who share our mission to create a safer internet. We partner with a number of organizations to jointly confront challenges impacting internet users at large. For example, we partner with the Family Online Safety Institute, an international non-profit that endeavors to make the online world safer for children and families. We also cooperate with the National Center for Missing & Exploited Children (NCMEC), the Tech Coalition, and the Global Internet Forum to Counter Terrorism.

The fight against bad actors on communications platforms is unlikely to end soon, and our approach to safety is guided by the following principles:

  • Design for Safety: We make our products safe spaces by design and by default. Safety is and will remain part of the core product experience at Discord.
  • Prioritize the Highest Harms: We prioritize issues that present the highest harm to our platform and our users. This includes harm to our users and society (e.g. sexual exploitation, violence, sharing of illegal content) and platform integrity harm (e.g. spam, account take-over, malware). 
  • Design for Privacy: We carefully balance privacy and safety on the platform. We believe that users should be able to tailor their Discord experience to their preferences, including privacy.
  • Embrace Transparency & Knowledge Sharing: We continue to educate users, join coalitions, build relationships with experts, and publish our safety learnings including our Transparency Reports.

Underpinning all of this are two important considerations: our overall approach towards content moderation and our investments in technology solutions to keep our users safe. 

Our Technology Solutions 

We believe that in the long term, machine learning will be an essential component of safety solutions. In 2021, we acquired Sentropy, a leader in AI-powered moderation systems, to advance our work in this domain. We will continue to balance technology with the judgment and contextual assessment of highly trained employees, as well as continuing to maintain our strong stance on user privacy.

Here is an overview of some of our key investments in technology:

  • Safety Rules Engine: The rules engine allows our teams to evaluate user activities such as registrations, server joins, and other metadata. We can then analyze patterns of problematic behavior to make informed decisions and take uniform actions like user challenges or bans.
  • AutoMod: AutoMod allows community moderators to block messages with certain keywords, automatically block dangerous links, and identify harmful messages using machine learning. This technology empowers community moderators to keep their communities safe. 
  • Visual Safety Platform: This is a service that can identify hashes of objectionable images such as child sexual abuse material (CSAM) and check all image uploads to Discord against databases of known objectionable images (a simplified illustration of hash matching follows this list).
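
As a rough illustration of the hash-matching idea, the sketch below checks an upload's hash against a set of known bad hashes. Real systems such as PhotoDNA use perceptual hashes that tolerate resizing and re-encoding; the exact SHA-256 hash here is only a stand-in to show the lookup flow, and nothing in this sketch is Discord's actual code.

    # Toy sketch of hash matching against a blocklist of known hashes.
    import hashlib

    KNOWN_BAD_HASHES = {
        # hashes supplied by hash-sharing programs would go here
    }

    def is_known_bad(file_bytes: bytes) -> bool:
        digest = hashlib.sha256(file_bytes).hexdigest()
        return digest in KNOWN_BAD_HASHES

    with open("upload.png", "rb") as f:   # hypothetical uploaded file
        if is_known_bad(f.read()):
            print("Block the upload and escalate for review")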

Our Partnerships

In the field of online safety, we are inspired by the spirit of cooperation across companies and civil society groups. We are proud to engage and learn from a wide range of companies and organizations including:

  • National Center for Missing & Exploited Children
  • Family Online Safety Institute
  • Tech Coalition
  • Crisis Text Line
  • Digital Trust & Safety Partnership
  • Trust & Safety Professionals Association 
  • Global Internet Forum to Counter Terrorism

This cooperation extends to our work with law enforcement agencies. When appropriate, Discord complies with information requests from law enforcement agencies while respecting the privacy and rights of our users. Discord also may disclose information to authorities in emergency situations when we possess a good faith belief that there is imminent risk of serious physical injury or death. You can read more about how Discord works with law enforcement here.

Our Policy and Safety Resources

If you would like to learn more about our approach to Safety, we welcome you to visit the links below. 

Safety

Talking about online safety with your teen

1. Let your teen know that the same rules apply online as offline.

  • The same common sense and critical thinking they use offline should be used online too. For example, only accept friend requests from people you know. If something doesn’t seem right, tell a trusted adult. Behavior that’s not okay at school is also not okay online.
  • Review Discord’s Community Guidelines with your teen and help them understand what is and isn’t permissible on Discord.
  • Review their security and privacy settings and the servers they belong to with them so they fully understand who can interact with them on Discord.

2. Talk about what content they see and share online

  • Much of our teens’ lives take place online today, including romances.
  • Sharing personal images online can have long-term consequences and it’s important for teens to understand these consequences. Help them think about what it might feel like to have intimate photos of themselves forwarded to any number of peers by someone they thought they liked or trusted. Make them aware of the risk of sharing intimate pictures with anyone.
  • Review your teen’s content filters on Discord. Have a discussion about who is on their friends list, who they’re messaging, and what information they are sharing about themselves.

3. Set limits on screen time with your teen.

  • iOS and Android operating systems offer parental controls that can help you manage your teen's phone usage if needed. Apple and Microsoft offer similar controls for computers.
  • If you are worried about how much time your teen spends online, set ground rules - for example, “no social media after 10 PM,” or “no phone at the dinner table.”

4. Try Discord for yourself

  • Most of the services that your kids use online aren’t limited to teens. You might find that you also enjoy using them, and might even find new and fun ways to communicate as a family.
  • You will also understand how these apps work from the inside, and will be more easily able to talk to your teens about staying safe online.
  • You can download Discord and create a Discord account right here.
  • We suggest asking your teen to give you a tour of Discord using your new account! Some prompts you can use to get started are:
  • Show me how to add you as a friend!
  • Show me how to create a server together.
  • Show me your favorite features.
  • Show me how you stay in touch with your friends.
  • Show me your favorite emojis or gifs.

5. Third-party resources

Many online safety experts provide resources for parents to navigate their kids’ online lives.

ConnectSafely published their Parent’s Guide to Discord which gives a holistic overview of how your teen uses Discord, our safety settings, and ways to start conversations with your teen about their safety.

For more information from other organizations, please go directly to their websites:

Safety

Helping your teen stay safe on Discord

We know it’s important for users to understand what controls they have over their experience on Discord and how to be safer. Part of delivering a better, safer experience is making sure people don’t see content they don’t want to – whether that’s intrusive spam or unwanted sensitive media. This article covers settings that can help reduce the amount of unwanted content you see on Discord and promote a safer environment for everyone.

These settings can be controlled by going into User Settings and selecting the Privacy & Safety section.

Sensitive media

At any time, you can further configure personal settings to blur or block content in DMs that may be sensitive. The “blur” option in our sensitive content filters applies to all historic and new media. For teen users, Discord will blur media that may be sensitive in direct messages (DMs) and group direct messages (GDMs) from friends, as well as in servers. Adults can opt into these filters by changing their sensitive media preferences in Privacy & Safety Settings. Learn more about how to do this here. For all users, the default configuration is “block” for direct messages (DMs) and group direct messages (GDMs) from non-friends.

At any time, you can also block the user responsible and report the content that violates our Community Guidelines or Terms of Service.

DM spam filter

This setting automatically sends direct messages that may contain spam to a separate spam inbox.

These filters are customizable and you can choose to turn them off. By default, these filters are set to “Filter direct messages from non-friends.” Choose “Filter all direct messages” if you want all direct messages that you receive to be filtered, or select “Do not filter direct messages” to turn these filters off.

Direct message (DM) settings

  • This menu lets users determine who can contact them in a DM. You can access this setting by going into User Settings, selecting the Privacy & Safety section, and finding the "Server Privacy Defaults" heading.
  • By default, whenever your teen is in a server, anyone in that server can send them a DM.
  • You can disable the ability for anyone in a server with your teen to send your teen a DM by toggling “Allow direct messages from server members” to off. When you toggle this setting off, you will be prompted to choose if you would like to apply this change to all of your teen’s existing servers. If you click "No," this change will only affect new servers your teen joins, and you will need to adjust the DM settings individually for each server that they have previously joined (in Privacy Settings on the server’s dropdown list).

You can also control these settings on a server-by-server basis.

Friend request settings

  • This menu lets you determine who can send your teen a friend request on Discord. You can access this setting by going into User Settings and selecting the Friend Requests section.
  • Users should only accept friend requests from users that they know and trust. If your teen isn’t sure, there’s no harm in rejecting the friend request. They can always add that user later if it’s a mistake.

You can choose from the following options when deciding who can send your teen a friend request.

  • Everyone - Selecting this means that anyone who knows your teen's Discord Tag or is in a mutual server with your teen can send your teen a friend request. This is handy if your teen doesn’t share a server with someone, such as a friend who introduced them to Discord, but wants to let that person send a friend request using just their Discord Tag.
  • Friends of Friends - Selecting only this option means that for anyone to send your teen a friend request, they must have at least one mutual friend with your teen. You can view this in their user profile by clicking the Mutual Friends tab next to the Mutual Servers tab.
  • Server Members - Selecting this means users who share a server with your teen can send your teen a friend request. Deselecting this while "Friends of Friends" is selected means that your teen can only be sent a friend request by someone with a mutual friend.

If you don’t want your teen to receive ANY friend requests, you can deselect all three options. However, your teen can still send out friend requests to other people.

Blocking

If someone is bothering your teen, you always have the option to block the user. Blocking on Discord removes the user from your teen's Friends List, prevents them from messaging your teen directly, and hides their messages in any shared servers.

To block someone, your teen can simply right-click the user's @Username and select Block.

If your teen has blocked a user but that user creates a new account to try and contact them, please report the user to the Trust & Safety team. You can learn more about how to do this at this link.

Deleting an account

If you or your teen would like to delete your teen’s Discord account, please follow the steps described in this article. Please note that we are unable to delete an account by request from someone other than the account owner.

Safety

Role of administrators and moderators on Discord

Administrators are the people who create Discord servers around specific interests. They establish the rules for participating, can invite people to join, and oversee the health and well-being of their community. They have broad administrative control, and can bring in moderators to manage community members. They can also ban or remove members and, if necessary, remove and replace moderators.

Administrators also choose moderators to play a vital role in Discord communities. The responsibilities of a moderator might vary, but their overall role is to ensure that their Discord server is a safe, healthy environment for everyone. They can do things like moderate or delete messages, as well as invite, ban, or suspend people who violate the server’s rules. The best moderators typically are seasoned and enthusiastic participants in one or more communities.

Admins and moderators are your first go-to when you encounter an issue in a server. They may be able to respond immediately and help resolve your concerns.

Each Discord server should have written rules for behavior to alleviate confusion or misunderstanding about the guidelines for that particular community. These rules, which supplement our Community Guidelines, are your tools to moderate efficiently and transparently. As communities grow, moderators can add more mods to keep their server a fun and welcoming place to be.

Policy

Answering parents' and educators' top questions

1. Is Discord safe for my teens to use?

There are a few things that make Discord a great and safe place for teens:

  • Users are in control of their experience on Discord. Conversations on Discord are driven by the people you choose and the topics you choose to talk about. Each user gets to choose who can message them, who can send them a friend request, and what kind of content they can receive.
  • Privacy is incredibly important to us. We do not ask for a real name when a user signs up, and we do not sell our user data. We do not monitor every server or every conversation.
  • Private, invite-only groups. Most Discord servers are small, private, invite-only groups where your teen can spend time with their friends or schoolmates.

To help your teen use Discord safely, it’s important to understand how Discord works and how you can best control your teen’s experience on it. We have listed a number of tips to do so here.

Just like with every other online service, the best way to ensure your teen stays safe online is to have clear guidelines on what they should and shouldn’t be looking at or posting online, and make sure that you keep clear lines of communication with them.

2. What is the minimum age to be on Discord? How do you ensure teens under 13 can’t create an account?

Discord's Terms of Service require people to be over a minimum age to access our app or website. The minimum age to access Discord is 13, unless local legislation mandates an older age.

To ensure that users satisfy that minimum age requirement, users are asked to confirm their date of birth upon creating an account. Learn more about how we use this age information here. If a user is reported as being under 13, we delete their account unless they can verify that they are at least 13 years old using an official ID document.

3. What personal information does Discord collect about my teen? Do you sell it?

Discord does not ask for a real name when a user signs up, and we do not sell user data. To learn more about what information we collect and how we use this information, please see our Privacy Policy.

4. Can my teen be exposed to inappropriate content on Discord?

Like on every internet platform, there is age-restricted content on Discord. Each user chooses which servers they want to join and who they want to interact with.

In servers, age-restricted content must be posted in a channel marked as age-restricted, which cannot be accessed by users under 18. For Direct Messages, we recommend that every user under 18 activates the explicit media content filter by selecting "Keep Me Safe" under the "Safe Direct Messaging" heading in the Privacy & Safety section of their User Settings. When a user chooses the "Keep Me Safe" setting, images and videos in all direct messages are scanned by Discord and explicit media content is blocked.

We believe that the best way to make sure that your teenagers are only accessing content that they should is to set clear guidelines on what they should and shouldn’t be looking at or posting online, and make sure that you keep clear lines of communication with them.

5. Can my teen interact with people they don’t know on Discord?

Unlike other platforms where someone might be able to message you as soon as you sign up for an account (before you have added any friends or joined any servers), this isn’t the case on Discord. In order for another user to send a direct message (DM) to your teen, your teen must either (1) accept the other user as a friend or (2) decide to join a server that the other user is a member of.

Each user has control over the following:

  • Who can DM them and who can send them a friend request
  • Which servers they want to join and be a part of
  • Who can join the servers that they create
  • Whether they want their direct messages to be scanned for explicit media content automatically

Users should only accept friend requests from users that they know and trust. If your teen isn’t sure, there’s no harm in rejecting the friend request. They can always add that user later if it’s a mistake.

If your teen is ever uncomfortable interacting with someone on Discord, they can always block that specific user. Blocking a user removes them from your teen's Friends List, prevents them from messaging your teen directly, and hides their messages in any shared servers.

6. Are there parental controls on Discord I can use to control my teen's account?

We have detailed all the controls you have to help make your teen’s account safer here. We recommend going through these settings together with your teen and having an open conversation about why you are choosing certain settings.

iOS and Android operating systems offer parental controls that can help you manage your teen's phone usage, including Discord, if needed. Apple and Microsoft offer similar controls for computers.

7. How can I monitor what my teen is doing on Discord?

Privacy is incredibly important to us, including your teen’s privacy. We can’t share their login information with you, but we encourage you to discuss how to use Discord safely directly with your teen.

8. What are Student Hubs?

Discord Hubs for Students allow students to verify their Discord account with their official student email, and unlock access to an exclusive hub for students at their school. Within the hub, they can connect with other verified students, discover servers for study groups or classes, and share their own servers for fellow students to join. Hubs are not affiliated with or managed by a school or school staff. Servers in a Hub are student-run but may include non-students.

For more information on Student Hubs, please check out our Student Hubs FAQs.

9. What safety measures are in place for the students participating in Student Hubs?

Only those with a school-assigned email address can access a school’s specific hub. Click here to dive into our Student Hubs guidelines. All servers and users in the hub must also comply with our Community Guidelines, Terms of Service, and Privacy Policy. We also have all of our existing moderator controls for servers, such as setting specific roles and access to #channels in a server, blocking, banning, mod bots, and more. Students can also submit a report from within their hub to bring attention to any inappropriate behavior.

Safety

If your teen encounters an issue

Even though the majority of Discord usage is in small, private, invite-only groups, we understand that there may be times when people in these groups behave in ways that make others uncomfortable or post content that isn’t allowed. Our Community Guidelines outline how all users should act on Discord and what we allow and do not allow. We recommend reviewing these with your teen so that you both know what behavior is and isn’t okay on the platform. Among other things, we do not allow:

  • Bullying and harassment
  • Sharing content glorifying or promoting self-harm
  • Making violent threats against people
  • Evading Discord-imposed bans and blocks
  • Content that sexualizes or threatens minors

If your teen encounters a violation of our Community Guidelines, such as harassment or inappropriate content, please file a report with details that you can gather. Our Trust & Safety team strives to ensure bad users don't disrupt your teen’s experience on Discord. We also provide a number of tools to ensure that teens (and everyone else) have control over their Discord experience.

Safety

How You Can Appeal Our Actions

Every user can appeal actions taken against their account.  While we do our best to ensure that we’re only taking action when it’s warranted, we’re not perfect and mistakes might happen. We recognize appeals are an important part of the process.

Just as you deserve a chance to be heard when action is taken against you offline, you should have the same chance when we take action against your Discord account. The process reflects Discord's commitment to internationally recognized human rights set out in the United Nations Guiding Principles on Business and Human Rights (UNGPs), privacy, and free expression.

If you think we took unwarranted action against your account, you can reach out to us so we can review your case. 

You may appeal the decision directly in the Discord app by going to User Settings > Privacy & Safety > selecting the violation you want to appeal > pressing on Let us know to begin submitting a review of your violation and account. For users in the EU, you may also choose an alternative resolution option when applicable.

What happens when we grant an appeal?

If our reviewers find that the violation was issued in error, we will remove it from your account and restore your account standing. Any actions that were the result of that violation will be removed. If there are other active violations on your account, you may still not regain full access to Discord. You can read more about violations in the Discord Warning System.

What if my appeal was denied?

If our reviewers find that the action taken was correct, we will send a message to let you know. The violation will remain on your account until it expires, if applicable. You can view any active or expired violations in Privacy & Safety Settings > Account Standing.

What if my appeal was deemed ineligible for review?

In some cases, we are unable to review certain appeals due to the nature of the violation or incomplete information provided. Even if we can’t review the appeal, your feedback helps us improve these systems over time.

Safety

What actions we take

When our Trust & Safety team confirms that there has been a violation of our Community Guidelines, the team takes immediate steps to mitigate the harm and, wherever possible, help users avoid breaking the rules in the future. The following are actions that we might take against users, servers, or both:

  • Removing the violating content
  • Temporarily limiting access to posting or other Discord features
  • Warning users about what rule they broke
  • Temporarily suspending users as a “cool-down” period
  • Disabling a server’s ability to invite new users
  • Removing a server from Discord
  • Permanently suspending a user from Discord due to severe or repeated violations

Discord also works with law enforcement agencies in cases of immediate danger and/or self-harm. We swiftly report child sexual abuse material and the users responsible to the National Center for Missing and Exploited Children.

We’re all about helping millions of communities, small and big, find a home online to talk, hang out, and have meaningful conversations. That means we need to find the right balance between giving people a place to express themselves and promoting a welcoming and safe environment for everyone.

Our Approach to Content Moderation 

We currently employ three levers to moderate user content on Discord, while being mindful of user privacy: 

  • User Controls: Our product architecture provides each user with fundamental control over their experience on Discord including who they communicate with, what content they see, and what communities they join or create. 
  • Platform Moderation: Our universal Community Guidelines apply to all content and every interaction on the platform. These fundamental rules are enforced by Discord on an ongoing basis through a mix of proactive and reactive work. In larger spaces, we take a more proactive and automated approach to safety as described in our Terms of Service and Privacy Policy. We may use automated means to detect violations of our policies.
  • Community Moderation: Server owners and volunteer community moderators define and enforce norms of behavior for their communities that can go beyond the Discord Community Guidelines. We enable our community moderators with technology (tools like AutoMod) as well as training and peer support (Discord Admin Community). 

Our Community Guidelines define what is and isn't okay to do on Discord. Every person on Discord should feel like their voice can be heard, but not at the expense of someone else.

If you come across a message that appears to break these rules, please report it to your server moderator or to us. If the content or behavior violates our Guidelines we may take a number of enforcement actions including but not limited to issuing warnings; removing content; disabling or removing the accounts and/or servers responsible; and reporting them to law enforcement.

Learn more about our Community Guidelines here and Terms of Service here.

Safety

How Discord Works with Law Enforcement

What is Discord?

Discord is a voice, video, and text chat app that's used by tens of millions of people ages 13+ to talk and hang out with their communities and friends.

The vast majority of servers are private, invite-only spaces for groups of friends and communities to stay in touch and spend time together. There are also larger, more open communities, generally centered around specific topics. Users have control over whom they interact with and what their experience on Discord is.

More information about Discord and our community goals can be found here.

Contact Information and Service of Process

Please submit your inquiry in our Government Request Portal at: https://app.kodex.us/discord/signin.

Our online portal will guide you through how to submit a request. You will need to:

1. Verify your email address via the link sent to your email address (only valid law enforcement or government domains will be accepted);
2. Fill in the required fields in the webform;
3. Upload a copy of any relevant documents in PDF format (for example, a copy of the subpoena or search warrant, as well as any non-disclosure order you may have).

Our online portal also allows you to add information, ask us questions, and download the information when it is available.

  • All legal correspondence should be sent on company or agency letterhead with requisite signatures included.
  • We will not comply with overly broad or vague requests.
  • The legal process should identify the user for whom information is being requested.
  • Users should be identified by their 17-18 digit user identification number (see the sketch after this list for how these IDs are structured).
  • If the user identification number is unattainable, a username and four-digit discriminator should be provided.
  • Note: Usernames are changing on Discord. New usernames are lowercase, alphanumeric, limited to certain special characters, and do not have discriminators. During the transition from old usernames to new usernames, some users will still have old usernames with discriminators (#0000) while other users will have new usernames. For more information see our Help Center article here. You can also read the blog post about this change from our co-founder here.
  • Instructions on how to identify a user can be found here.
  • Be advised, users are allowed to change their usernames and four-digit discriminators, thus responses rendered by Discord may list different usernames and four-digit discriminators than those originally requested.
  • All legal processes should comply with 18 U.S.C. §§ 2701-2712.
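
Discord IDs are "snowflakes": per Discord's public developer documentation, the upper bits of the ID encode the account's creation time in milliseconds since the Discord epoch (January 1, 2015). The sketch below shows how that timestamp can be extracted; the example ID is hypothetical.

    # Extract the creation timestamp embedded in a Discord snowflake ID,
    # following the layout in Discord's public developer documentation.
    from datetime import datetime, timezone

    DISCORD_EPOCH_MS = 1420070400000  # 2015-01-01T00:00:00Z in Unix milliseconds

    def snowflake_created_at(snowflake: int) -> datetime:
        ms = (snowflake >> 22) + DISCORD_EPOCH_MS
        return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

    print(snowflake_created_at(175928847299117063))  # hypothetical example ID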

Emergency Requests

  • If you believe that an emergency involving danger of death or serious physical injury to any person requires disclosure without delay, please submit an emergency disclosure request via our Government Request Portal at: https://app.kodex.us/discord/signin. You will be asked to provide the factual basis for the request so that we may evaluate it under 18 U.S.C. § 2702.
  • We provide emergency disclosure responses only when enough information is provided for Discord to, in good faith, believe that the exigent situation requires disclosure of user information, as outlined in 18 U.S.C. § 2702.
  • We do not disclose information for emergency requests unless they are from law enforcement.

Preservation Request

  • Discord accepts requests to preserve records pursuant to 18 U.S.C. § 2703(f).
  • We take steps to preserve account records once proper requests have been received and processed.
  • Content and non-content information is preserved for 90 days from the time of processing and, pursuant to 18 U.S.C. § 2703, can be extended only for one additional 90-day period. In certain cases, that additional 90 days may be requested.
  • Please reference the initial preservation request by providing the assigned ticket number and clear date range when sending legal process to obtain the preserved information.
  • We are not permitted to disclose user information when solely in receipt of a preservation request.

International Legal Requests

  • For requests of user data other than EEA or UK user data: Discord Inc. is headquartered in the U.S. and subject to U.S. law which may prevent the production of certain information. Requests should be addressed to Discord Inc. Discord Inc. will respond to valid legal process issued by a U.S. court and properly served on it in the U.S. To achieve this, you will need to work through the applicable process for international legal assistance. See 28 U.S.C. § 1782. For more information, you may wish to contact the Office of International Judicial Assistance at the U.S. Department of Justice. https://www.justice.gov/civil/evidence-requests
  • For requests of EEA or UK user data: Discord Netherlands BV is the data controller for users in the European Economic Area and the United Kingdom. Requests for information should be addressed to Discord Netherlands BV and requested via the European Investigation Order under Directive 2014/41/EU where possible. Where not possible, requests for information should be directed through the appropriate Dutch authorities by way of other mutual legal assistance agreements.
  • Orders against illegal content under Article 10 of the EU Digital Services Act should be submitted through our Government Request Portal (Kodex) under the DSA - Illegal Takedown section: https://app.kodex.us/discord/signin
  • If international legal process includes reports of child exploitation, Discord will investigate and take steps to archive these materials, remove them from our platform, and report any exploitative content.

Child Safety Policy

  • When we are made aware of potential Child Safety concerns on our platform, our Trust & Safety team reviews the content and reports the content to the National Center for Missing and Exploited Children (NCMEC) or law enforcement where appropriate as required by 18 U.S.C. § 2258A. NCMEC will then work with local and international law enforcement as necessary.
  • User accounts reported to NCMEC are archived and banned from our platform at the time of reporting.
  • When submitting legal process prompted by NCMEC reporting, include the CyberTip number and all of the user information provided.
  • Law enforcement officials should indicate if a request pertains to a NCMEC report in the body of the email.

Data Retention Policy

  • Discord generally does not require or possess names, addresses, or other personal information, as we do not require that information at sign-up. The subscriber information that we possess is limited to an email address, and, if the user is a paid user, we might have limited billing information. In certain circumstances, we might also have a phone number the user has verified.
  • Due to privacy concerns, Discord removes most user identifiers once a user is deleted, including username and discriminator.
  • Users can always be located by their 17-18 digit UID number.
  • More information regarding Discord's data retention and user privacy policy can be found here.
  • Questions regarding Discord's privacy policy can be sent to [email protected].

User Notification

It is our policy to notify Discord users and provide them with a copy of the legal process seeking their account information. We do not provide notice if we are prohibited by law or under exceptional circumstances such as child sexual exploitation investigations or threat to life emergencies. We ask that your non-disclosure order (18 U.S.C. § 2705(b)) or similar legal authority include a time limitation, as we provide delayed notice upon expiration of this limitation and when we believe the exceptional circumstance no longer exists. If no non-disclosure provision or exception is included with your legal request, Discord will confirm with law enforcement before providing user notice. In order to expedite your legal request for account information, we ask that you include or inform us initially of one of the following:

  • Clearly state you have no objection to Discord providing user notice.
  • Include a time limited non-disclosure order (18 U.S.C. § 2705(b)) or similar legal provision with your legal request.
  • Provide us with details on why an exceptional circumstance exists (e.g., child sexual exploitation, threat to life emergency) in providing user notice.

Note:

Discord reserves the right to not notify users if doing so would pose a risk to Discord or its users' general welfare.

If your data request places Discord on notice of an ongoing or prior violation of our Terms of Service, Community Guidelines or other policies, we may take action to prevent further abuse, including account termination and other actions that may notify the user that we are aware of their misconduct.

Cost Reimbursement

  • Pursuant to 18 U.S.C. § 2706, Discord may require cost reimbursement if the legal process is unduly burdensome or has associated cost.

EU Contact Points

  • Under EU Regulation 2022/2065 (Digital Services Act), Discord’s Article 11 contact point is our Government Request Portal (Kodex): https://app.kodex.us/discord/signin. English or Dutch may be used to communicate through our contact point.
  • Under EU Regulation 2021/784 (Terrorist Content Online Regulation), Discord's Article 15 contact point is [email protected] and Discord's Article 17 legal representative is Discord Netherlands BV.

Still have questions?

If needed for mail service, our physical address is as follows:

For requests relating to all users other than those in the EEA or UK:
Discord, Inc.
444 De Haro St, Suite 200
San Francisco, CA, 94107
United States of America

For requests relating to users in the EEA or UK:
Discord Netherlands BV
Schiphol Boulevard 195,
1118 BG Schiphol,
Netherlands

If serving process by mail, please direct the mail to the attention of the Legal Department.

Policy

Mental health on Discord

Contact your server admins

If someone has posted comments about harming themselves in a server, you may consider reaching out to your server administrators or owner to let them know about the situation, so they can moderate their server as needed and provide support to the server member.

Provide some support resources

If you are still in touch with the user, you may wish to provide them with one of the help hotlines listed below.

Talk to a parent or trusted adult

You may not feel qualified to help a friend who expresses their desire to hurt themselves, and it may be helpful to ask a parent or another trusted adult for help in handling the situation.

Report to Trust & Safety

All Discord users can report policy violations right in the app by following the instructions here.

When we receive reports of self-harm threats, we investigate the situation and may contact authorities, but in the event of an emergency, we encourage you to contact law enforcement in addition to contacting us.

Please note that for privacy and security reasons we are unable to provide personal information such as contact information or location to someone who is not the account holder. If you are concerned that someone is in immediate danger, please contact law enforcement.

Contact law enforcement

If you or another user you know is in urgent trouble, please contact authorities right away, even if you can only provide limited information. Law enforcement has investigative resources, can contact Discord Trust & Safety for information that we aren't allowed to disclose otherwise, and can identify those users to get them help.

Mental health support vs. self-harm encouragement

Support networks and online communities can play a key role in helping people who are experiencing mental health issues. We support mental health communities on Discord where people can come together, and we want these spaces to remain positive and healthy.

When we receive reports of users or communities discussing or encouraging self-harm, we review such content carefully, and we take into account the context in which comments are posted. We will take action on communities or users that promote, encourage, or glorify suicide or self-harm. This includes content that encourages others to cut or injure themselves or content that encourages or glorifies eating disorders.

Mental Health Resources

Suicide Prevention

988 Suicide & Crisis Lifeline (U.S.):
Call: 1-800-273-8255, available 24/7 for emotional support
https://988lifeline.org/chat

Crisis Text Line (U.S.):
Text: DISCORD to 741741
https://www.crisistextline.org/
Outside the U.S.: Find a supportive resource on this Wikipedia list of worldwide crisis hotlines

Substance Abuse Support

Substance Abuse and Mental Health Services Hotline
Call: 1-800-662-HELP
https://www.samhsa.gov/
Please note that this is not a crisis hotline and should be used for referrals only.

Eating Disorder Support

National Eating Disorder Association (NEDA) Helpline
Call or Text: 1-800-931-2237
https://www.nationaleatingdisorders.org/help-support/contact-helpline
National Association of Anorexia Nervosa and Associated Disorders
Call: 1-888-375-7767
https://anad.org/get-help/eating-disorders-helpline/

Child Abuse and Domestic Violence

National Child Abuse Hotline
Call or Text: 1-800-422-4453
https://childhelphotline.org/
National Sexual Assault Hotline
Call: 1-800-656-HOPE (4673)
https://hotline.rainn.org/online
National Domestic Violence Hotline
Call: 1-800-799-7233, 1-800-787-3224 (TTY)
https://www.thehotline.org/

LGBTQ+ Support

The Trevor Project
Call: 1-866-488-7386
Text: START to 678-678
https://www.thetrevorproject.org/get-help/
Trans Lifeline
Call: 1-877-565-8860
https://translifeline.org/
Safety

Reporting Abusive Behavior to Discord

Reporting a Message

  1. Select the Message you wish to report. On mobile, hold down on the Message; on desktop, right-click.
  2. Select “Report Message.”
  3. Select the type of abuse you’re seeing.
  4. The next screen will allow you to further specify the abuse that’s occurring. You can always click back and change your first answer, so you can select the most relevant category.

If the violation happened in a server, you can also reach out to the server’s moderators, who may be able to respond immediately and help resolve your concerns. In addition, please remember that you always have the ability to block any users that you don’t want to interact with anymore.

Do not mislead Discord’s support teams. Do not make false or malicious reports to our Trust & Safety or other customer support teams, send multiple reports about the same issue, or ask a group of users to report the same content or issue. Repeated violations of this guideline may result in loss of access to our reporting functions.

What to do if you receive a violent threat, or someone is at risk of self-harm

If a credible threat of violence has been made and you or someone else are in immediate danger, or if someone is considering self-harm and is in immediate danger, please contact your local law enforcement agency.

Additionally, if you are in the United States, you can contact Crisis Text Line to speak with a volunteer crisis counselor who can help you or a friend through any mental health crisis by texting DISCORD to 741741. You can learn more about Discord’s partnership with Crisis Text Line here.

You can find more resources about mental health here.

Reporting a User Profile

  1. Select the User Profile you wish to report by clicking on the three-dot menu.
  2. Select “Report User Profile.”
  3. Select the specific elements of the profile you are reporting - you can report multiple aspects of a profile at once.
  4. Select the type of abuse you’re seeing.
  5. The next screen will allow you to further specify the abuse that’s occurring. You can always click back and change your first answer, so you can select the most relevant category.

Reports Under the EU Digital Services Act

EU users can report illegal content under the EU Digital Services Act by clicking here. EU government entities reporting illegal content should follow the process outlined here.

EU users will be required to go through a verification process and follow the prompts to provide details and descriptions for their report.

When reporting a message under the EU Digital Services Act, a message URL is required. Here’s how to find the message URL for the desktop and mobile apps.

Desktop App

  1. Navigate to the message that you would like to report.
  2. Right-click on the message or press on the ellipses icon when hovering over the message.
  3. Select Copy Message Link.
  4. The message URL will be copied to your device’s clipboard.

Mobile App

  1. Navigate to the message that you would like to report.
  2. Hold down on the message to reveal a pull-up menu.
  3. Select Copy Message Link.
  4. The message URL will be copied to your device’s clipboard.

What Happens After I Submit a Report?

When we become aware of content that violates our Community Guidelines or Terms of Service, our Safety team reviews and takes the necessary enforcement actions, including: disabling accounts, removing servers, and when appropriate, engaging with the proper authorities. We may not review your report manually or respond to you directly, but we’ll use your report to improve Discord.

You can read more about the reports we receive and the actions we take on violations of our Community Guidelines or Terms of Service in our quarterly Transparency Report.

Safety

What is Discord?

Discord Glossary

Discord has its own vocabulary. You might hear your teen or students using these words when talking about Discord.

Server: Servers are the spaces on Discord. They are made by specific communities and friend groups. The vast majority of servers are small and invitation-only. Some larger servers are public. Any user can start a new server for free and invite their friends to it.

Channel: Discord servers are organized into text and voice channels, which are usually dedicated to specific topics and can have different rules.

  • In text channels, users can post messages, upload files, and share images for others to see at any time.
  • In voice channels, users can connect through a voice or video call in real time, and can share their screen with their friends - we call this Go Live.

DMs and GDMs: Users can send private messages to other users as a direct message (DM), as well as start a voice or video call. Most DMs are one-on-one conversations, but users have the option to invite up to nine others to the conversation to create a private group DM (GDM), with a maximum size of ten people. Group DMs are not public and require an invite from someone in the group to join.

Go Live: users can share their screen with other people who are in a server or a DM with them.

Nitro: Nitro is Discord’s premium subscription service. Nitro offers special perks for subscribers, such as the option to customize your Discord Tag, the ability to use custom emotes in every server, a higher file upload cap, and discounted Server Boosts.

Server Boosts: If your teen is a big fan of a community, they might want to boost the community’s server (or their own). Like Nitro, Server Boosts give servers special perks like more custom emotes, better video and voice quality, and the ability to set a custom invite link. Server Boosts can be bought with Nitro or purchased separately.

Student Hubs: Discord Hubs for Students allow students to verify their Discord account with their official student email, and unlock access to an exclusive hub for students at their school. Within the hub, they can connect with other verified students, discover servers for study groups or classes, and share their own servers for fellow students to join. Hubs are not affiliated with or managed by a school or school staff. Servers in a Hub are student-run but may include non-students. For more information on Student Hubs, please check out our Student Hubs FAQs.

Why people love Discord

Below, you can see just a few of our favorite stories about what people are doing on Discord and why they love it. You can find even more stories about how people use Discord right here.

Cyndie, a parent of two from North Carolina, reflects on how her family uses Discord:

There are four of us and we all have Discord installed on both our computers and phones. My oldest son is in an apartment, and the younger one is on campus, so we use Discord to make family plans. Everything gets dropped into that server. From dinner’s ready to internships and job offers. Usually it’s the silly, stupid stuff we just drop in that makes us all laugh, like when there’s a Weird Al question on Jeopardy. I can’t imagine life without it.

Genavieve, a high-school student from California, talks about how her classes use Discord:

"I've been using Discord for the last two years as my main communication with my friends. We had too many people in our group chat and wanted a platform where we could all communicate with each other. Discord is a great way for a friend group of thirty people to stay in touch! Also, with distance learning in place, I’ve started using it with my AP Physics class too. It's been so important to feel connected to our teachers and each other when we are so isolated and in such a difficult class. Using Discord brought us closer together as a class — we are already a small class of 22 students, so being able to joke around and send memes helps us not feel so alone during the distance learning. The different channels and @mentions make it much easier to keep information straight. Screenshare makes it even easier, so we can show each other documents or problems we are working on to get feedback or troubleshooting advice.

David, a physics and math tutor from New Jersey, talks about how he teaches students and connects with other teachers over Discord:

"I use Discord to tutor one of my students and to stay up to date with conversations and announcements in a group of physics teachers interested in physics education research. It's nice to see a side-by-side camera view of my desk with the student's work. I also really like that the audio through the OPUS codec which sounds very clean."

Safety

Four steps to a super safe server

1. Set up roles and permissions

Roles are one of the building blocks of managing a Discord server. They give your members a fancy color, but more importantly, each role comes with a set of permissions that control what your members can and cannot do in the server. With roles, you can give members and bots administrative permissions like kicking or banning members, adding or removing channels, and pinging @everyone.

You can find these options in the Roles section of your Server Settings.

Assign permissions with care! Certain permissions allow members to make changes to your server and channels. These permissions are a great moderation tool, but be wary of who you grant this power to. Changes made to your server can’t be undone.

You can learn more about implementing roles and permissions in Role Management 101 and our Setting Up Permissions FAQ article.
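
For readers who manage their server with a bot as well as the UI, the same idea can be expressed in code: a role is simply a named bundle of permission flags. Below is a minimal sketch assuming the discord.py library; the role name, color, and permission set are hypothetical examples, and most owners will do all of this through Server Settings instead.

```python
# Minimal sketch, assuming discord.py: create a moderator role that grants
# only the permissions it actually needs (hypothetical example values).
import discord

async def create_moderator_role(guild: discord.Guild) -> discord.Role:
    # Grant only the moderation powers this role actually needs.
    perms = discord.Permissions(
        kick_members=True,
        manage_messages=True,
        moderate_members=True,   # timeouts
        mention_everyone=False,  # withhold @everyone pings
    )
    return await guild.create_role(
        name="Moderator",                  # hypothetical role name
        permissions=perms,
        colour=discord.Colour.blue(),      # the "fancy color" members see
        hoist=True,                        # show the role separately in the member list
        reason="Setting up moderation roles",
    )
```

The design choice mirrors the advice above: grant each role only the permissions it needs, since powerful permissions are hard to walk back once misused.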

2. Set a verification level

Server verification levels allow you to control who can send messages in your server. Setting a high verification level is a great way to protect your server from spammers or raids. You can find this option in the Safety Setup section of your Server Settings.

Verification levels

  • None - All new members in the server can start chatting immediately with no restrictions.
  • Low - Members in the server must have a verified email on their Discord account to begin chatting. We recommend this setting for any server where you’re putting an invite link on the internet!
  • Medium - Members in the server must have a verified email and their Discord account must be at least 5 minutes old to begin chatting.
  • High - Users must meet all of the previous requirements and also must be a member of the server for at least 10 minutes to begin chatting. This is a good way to stall raids. Most raiders don’t have the patience to wait ten minutes before they spam the channel.
  • Highest - Members must have a verified phone number on their Discord account. This is the highest verification level.

3. Enable server-wide 2FA

When enabled, server-wide two-factor authentication (2FA) requires all of your moderators and administrators to have 2FA enabled on their accounts in order to take administrative actions, like deleting messages. You can read more about 2FA here.

By requiring all admin accounts to have 2FA turned on, you protect your server from malicious users who might try to compromise one of your moderators' or administrators' accounts and then make unwanted changes to your server. If you are the server owner, you can enable the 2FA requirement for moderation in the Safety Setup section of your Server Settings.

You must have 2FA enabled on your own account before you can enable this option!

4. Explicit image filter

The explicit image filter automatically blocks messages in a server that may contain explicit images in channels not marked as Age-restricted.

You can control this feature so that it applies to all server members, or just to members without roles — or you can turn this feature off altogether.

Age-restricted channels are exempt from the explicit image filter. Turning this filter on allows your server members to share content while reducing the risk of explicit images being posted in channels that are not age-restricted.

You can access this setting by navigating to Server Settings, selecting Safety Setup under Moderation, and then finding the Explicit image filter heading.

Safety

How we investigate

When we receive a report from a Discord user, the Trust & Safety team looks through the available evidence and gathers as much information as possible. This investigation is centered around the reported messages, but can expand if the evidence shows that there’s a bigger violation. For example, we may investigate whether the entire server is dedicated to bad behavior, or if the behavior appears to be part of a wider pattern.

We spend a lot of time on this process because we believe the context in which something is posted is important and can change the meaning entirely. We might ask the reporting user for more information to help our investigation.

Responding to user reports is an important part of our Trust & Safety team’s work, but we know there is also violating content on Discord that might go unreported. This is where we get proactive. Our goal is to stop bad actors and their activity before anyone else encounters it. We prioritize getting rid of the worst-of-the-worst content because it has absolutely no place on Discord, and because the risk of harm is high. We focus our efforts on exploitative content, in particular non-consensual pornography and sexual content related to minors, as well as violent extremism.

Please note: We do not monitor every server or every conversation. Privacy is incredibly important to us and we try to balance it thoughtfully with our duty to prevent harm. However, we scan images uploaded to our platform to detect child sexual abuse material. When we have data suggesting that a user is engaging in illegal activity or violating our policies, we investigate their networks, activity on Discord, and their messages to proactively detect accomplices and determine whether violations have occurred.

Safety

How Trust & Safety Addresses Violent Extremism on Discord

Our History with Violent Extremism

We’ve been paying close attention to violent extremist groups and movements ever since we learned how the organizers of the 2017 Unite the Right Rally in Charlottesville, Virginia utilized Discord to plan their hateful activities.

Back then, Discord Trust & Safety was a team of one, just beginning to make difficult decisions about how to properly moderate the platform. Almost four years later, our Trust & Safety team makes up 15% of Discord’s nearly 400 employees and splits its time between responding to user reports and proactively finding and removing servers and users engaging in high-harm activity like violent extremist organizing.

Trust & Safety has spent a lot of time since 2017 trying to ensure that another event like Charlottesville isn’t planned on our platform. Our team developed frameworks based on academic research on violent extremist radicalization and behavior to better identify extremist users who try to use Discord to recruit or organize. We keep up-to-date on research that can lend insight into how to evaluate and understand extremist behavior online, and our recent partnerships with organizations like the Global Internet Forum for Countering Terrorism (GIFCT) and Tech Against Terrorism (TAT) are intended to support this effort.

How We Identify Violent Extremism

Categorizing violent extremism itself is difficult because not all extremists have the same motives or believe in the same ideas. Some individuals who adopt violent ideologies act on their beliefs by joining organized hate, terrorist, or violent extremist groups.

Others don’t want to officially identify themselves as belonging to a particular movement, and may instead form looser connections with others who have adopted the same worldview. Different cultural contexts also influence belief systems and behaviors, so violent extremist ideologies in one country will naturally be different from those on the other side of the world.

Violent extremism is nuanced, and the ideologies and tactics behind it evolve fast. We don’t try to apply our own labels or identify a certain “type” of extremism.

Instead, we evaluate user accounts, servers, and content that is flagged to us based on common characteristics and patterns of behavior, such as:

  • Individual accounts, servers, or organized hate groups promote or embrace radical and dangerous ideas that are intended to cause or lead to real-world violence
  • These accounts, servers, or groups target other groups or individuals who they perceive as enemies of their community, usually based on a sensitive attribute.
  • They don’t allow opinions or ideas opposing their ideologies to be expressed or accepted.
  • They express a desire to recruit others who are like them or believe in the same things to their communities and cause.

It’s important to note that the presence of one or two of these signals doesn’t automatically mean that we would classify a server as “violent extremist.” While we might use these signs to help us determine a user or space’s intent or purpose, we always want to understand the context in which user content is posted before taking any action.

How Did We Do on January 6?

On the day of the Insurrection, our Trust & Safety agents were reviewing reports of hate speech, glorification of violence, and misinformation about what was transpiring. We feel very fortunate that our team was able to locate and remove many of the most harmful servers dedicated to coordinating violence on January 6.

Our ability to move proactively on servers advocating for violence was thanks to two main factors: first, we were able to surface reports from users on these spaces quickly; and second, our Trust & Safety agents dedicated to countering violent extremism had been tracking these spaces ever since allegations of election fraud regarding the 2020 U.S. presidential election had begun to spread.

We believe it’s important to talk about the line we walk with Discord users who discuss politics or organize political activities like protests. Many people are frustrated with how society works and how some governmental or societal systems are structured. Naturally, people have strong opinions on how things should or shouldn’t change.

Now more than ever, it’s important for meaningful conversations, debates, and exchanges of ideas to take place. We’re glad that users across the world have turned to Discord to discuss their opinions and beliefs, to organize, and to advocate for the change they want to see.

Discord Trust & Safety’s objective is to ensure that no harm comes to our users, or to society at large, because of actions taken on Discord, which is why we don’t tolerate activities that promote or advocate for violence. When we’re reviewing reports for violent extremism, while it’s sometimes clear when users or servers have crossed that line, in many cases there’s a lot more context to consider. One of the most difficult responsibilities of our work is balancing the mitigation of potential harm without appearing as if we are overstepping any boundaries or censoring meaningful conversation.

Looking to the Future

Because of these values, we plan to continue standing firm against ideologies of hate that violent extremist communities espouse, and we are excited to work with other platforms and organizations that seek to do the same. Stay tuned for more updates.

We know that we can’t solve violent extremism alone, but we’ll continue to do our best to make sure that the communities on Discord reflect our company values. We want Discord — and the internet as a whole — to be a space for positive interactions and creating belonging.

If you would like to report dangerous or harmful activity to the Trust & Safety team, please do so using our report form. If you’re unsure how to report a user or server, take a look at dis.gd/HowToReport.

Safety

Scams and What to Look Out For

Fake Games, Programs, Videos or Downloads

In this situation, a user pretending to be your friend, or using a friend’s compromised account, reaches out asking you to check out their video, test a game they made, or practice running code they wrote. No matter the backstory, they’ll always ask you to download a program or click a link they provide, resulting in a malicious program entering your computer and/or compromising your account.

[Image: a suspicious Discord user sends a message that says “gi, hi,” then sends two files called “Not a Virus dot Zip.”]

Another variation of this scheme involves a user asking you to “test” something for them, directing you to open the developer tools on your internet browser while logged into the web app. They’ll then ask you to show them your token — do not do this. With your token, malicious users can sign in and take over your account.

Discord will never ask you for your token, and you should never have any reason to open Discord’s Developer Console in the first place. Note that this is only applicable to Discord on your internet browser, and not the desktop or mobile application.

Fake Giveaways/NFT Drops

This is similar to the previous scheme in that the DM usually appears, again, to come from a trusted individual. Sometimes it’s in the form of a well-known bot, or under the facade that they are an administrator for a server that you’re active in. It may involve very genuine-looking links to websites as well. Like we said, if it sounds too good to be true, it likely is.

Discord Impersonation for Partner/Verified/HypeSquad

Discord impersonation involves a hacker pretending to message you from an “official Discord account” and offering entry to one of our community initiatives, such as the HypeSquad or Partner programs.

This is nearly always fake. Below are two screenshots, both of which present themselves as official Discord-sent messages. However, of these two conversations, only the right screenshot is actually from Discord.

[Image: two Discord messages side by side. On the left, a regular user is attempting to act like a Discord employee to try and obtain personal information. On the right, a real Discord message has a verification checkmark that says “System” next to it, proving it’s a real message. Under the official message is a banner that reads: “This thread is reserved for official Discord notifications. Discord will never ask you for your password or account token.”]

On the right, you can note the blurple “System” tag next to the sender’s name, as well as the Reply space being replaced with a unique banner that only official system messages come with.

The DM on the left does its best to be convincing though. It even sends an invite link to a real Discord-run server called Discord Testers and a somewhat-real-looking link to the supposed Discord Hypesquad form. Scammers will use a technique of mixing real Discord invite links (to public Discord servers usually) with their malicious links in order to portray legitimacy and lull you into a false sense of security.

[Image: a Discord message window; a button that says “Report Spam” is highlighted next to the user’s name.]

If you suss out that a DM is a fake, report it as Spam using the red “Report Spam” button at the top of the DM.

This feature is one of many improvements that we’re working on to help identify and remove bad actors as soon as we’re aware of them.

Free Nitro Scams

One of the oldest scams is the temptation of “free Nitro.” While we can’t discount people who may be truly full of generosity and believe in gifting Nitro, getting a random DM from a stranger claiming to have chosen *you* for a Nitro giveaway is incredibly sus, and most likely a scam.

Discord will never ask you to scan a QR code in order to redeem a Nitro code. Do NOT scan any QR codes from people you don’t know or those you can’t verify as legitimate.

If you ever use QR Code Login to sign in to Discord, make sure you’re using the desktop app, or if you’re on the web app, that your URL bar says “https://discord.com/login” exactly as it's written.

The above tactics are some of the ways that scammers may attempt to socially-engineer you into giving up your information. Even if you don’t click any of their links, it's best to simply block and report them to us, rather than engage further.

We encourage you to share this article with friends who may not be as informed as you — when everyone’s aware, our communities are safer than ever. Here’s a quick link back to Protecting Against Scams on Discord too.

Stay safe out there!

Safety

Protecting Against Scams on Discord

Internet Safety Checklist

Internet Safety doesn’t have to be exhausting. Below are some simple but effective ways to make sure you’re on guard against any potential ne’er-do-wells in your DMs, and even outside of Discord.

Only Open Trusted Links from Those You Know

This may feel like a given, but a surprising amount of security issues stem from people clicking on links before checking if they’re the real deal. Always double-check a link you’re clicking — link shortening services can easily mask unsafe websites or programs. We recommend getting it checked against a resource like VirusTotal to see if someone has already flagged it as potentially dangerous.

In addition, Discord has its own systems in place to remove malicious links and we’re constantly evolving those systems.

[Image: note the misspelling in the URL, “dliscordnltro.com.”]

Don’t Download Programs or Run Code You Don’t Recognize

It’s not advised to download and run software that doesn’t come from a reputable source. Downloading and running programs that someone sends you unprompted is almost always a bad idea.

If a person claiming to have “special access to features” or new software says they need you to run something on your own computer, they’re misleading you in order to get your personal info with their shady programs. If it sounds too good to be true, it probably is.

Never Give Your Password to Anyone

There’s no reason to give it up, ever. Sharing your password not only gives away access to your account but also exposes any personal information you have tied to that account — and potentially any website where you use that password — making you vulnerable to more than just a single account takeover.

Discord Safety Checklist

The above tips can be applied anywhere on the internet! Next, we'll share some Discord-specific tips to ensure you can be vigilant against baddies targeting your account or community:

Decide Who Can and Can’t Send You DMs

Disabling DMs for a particular server is one of the best ways to prevent bad apples hiding inside larger communities from contacting you.

To adjust who can and can’t DM you, head into User Settings > Privacy & Safety, then scroll down to “Server Privacy Defaults.” From there, you’ll find the option to “Allow direct messages from server members.”

Feel free to adjust it as you wish, but do note that this new state only applies to servers joined after changing the toggle; it won’t retroactively affect your existing servers.

If you turn this option off, members of newly-joined servers can’t contact you via DM unless you’re friends with them beforehand. Receiving mail might be nice, but receiving suspicious messages from people you don’t know is less nice.

If you're in a server you trust and don’t mind being messaged by those in it, you can toggle the privacy setting on an individual basis. Head to that server on desktop or mobile and select its name to open the server's settings, and choose “Privacy Settings.” Once there, you’ll find the “Allow direct messages from server members" option. Turn that on, and you’re free to receive all sorts of DMs from everyone in that server, regardless of if you’re friends or not!

If you’ve joined a lot of communities, consider auditing the list and see if you’re comfortable with letting non-friends message you from that server, or if opening up is inviting unnecessary risk into your inbox.

Audit your Server’s Permissions

Understanding which permissions your mods and members have access to is key to keeping everyone in your server safe. If you’re a server owner, have you checked your permissions list lately? Who has what perms? Did you know they had that access, and for how long?

If the answer to any of these questions was a resounding :shrug_emoji:, it’s time to do a review of your server setup to ensure that only those who really need powerful permissions have them.

Specifically, make sure that only moderators you trust have access to permissions that can change powerful server tools, including any bots or webhooks you might add to the server. Be vigilant for bots that are impersonating larger well-known bots.

In almost every case, any large and reputable moderation bot will never need admin permissions to work properly. Only give a bot the permissions required for the tasks you need and no more — look for a Verified checkmark on a well-known bot before adding it.
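
As a rough illustration of that least-privilege advice, here is a minimal sketch assuming the discord.py library. The client ID and permission set are hypothetical; the point is that a bot’s invite link can grant only the specific permissions it needs rather than Administrator.

```python
# Minimal sketch, assuming discord.py: build an invite URL for a moderation
# bot that grants only the permissions it needs (hypothetical example values).
import discord

BOT_CLIENT_ID = 123456789012345678  # hypothetical: your bot's application ID

# Only the permissions the bot actually needs for its moderation tasks.
needed = discord.Permissions(
    kick_members=True,
    ban_members=True,
    manage_messages=True,
    moderate_members=True,  # timeouts
)

# Print an OAuth2 invite link that pre-selects exactly these permissions.
print(discord.utils.oauth_url(BOT_CLIENT_ID, permissions=needed))
```

Opening the printed URL starts the normal bot-authorization flow, with only those permissions requested, so it is easy to review exactly what the bot will be able to do before adding it.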

If you need a refresher on how permissions on Discord work, you can check out the Help Center article here. If you have a basic understanding of the permissions system and want a more comprehensive look at what they mean in a moderation sense, we also have this article from the Discord Moderation Academy.

Keep Invite Links Updated

If you update your server’s links, make sure that your community and potential newcomers are aware of the changes and update any social media pages where you shared them. If possible, delete references to old invite links and make it known that those links have been updated.

This is doubly so for Partnered, Verified, or Level 3-boosted servers that utilize a vanity URL: if your server loses or changes its custom invite link, nefarious communities may swoop in and claim your old one. If this happens before you update your public-facing invites, people trying to join your community may instead join a server that’s looking to cause trouble.

In addition to updating existing links, consider implementing easy-to-follow community rules around invite sharing and encouraging members to always verify where a server invite leads and who it is coming from before clicking it.

Pro Tip: Try pasting one of those invites into a Discord message to preview where it leads before opening it! (But of course, don’t make your invite testing look like a spam message by pasting a random invite in #general.)

Why Would Someone Want Access to My Account?

If someone gains control of your Discord account, they will have as much control over it as you do: they’ll be able to change your username, password, the email tied to the account, and any other information associated with your account.

They’ll also be able to see any personal info associated with your account once they’re in it. While most consider “personal info” to be payment info or email, it can also contain your private conversations and messages in DMs and servers alike. If you can see it, they’ll be able to see it.

They’ll effectively be the new owner of any servers you own and will be able to make any changes they want: server layout, server permissions, bots and webhooks, even kicking everyone out of the server - you name it. If your account moderates a server that a hacker is targeting, they might even use you as a stepping stone to cause further damage within the community, or impersonate you to trick unsuspecting members.

Some users may also target Discord accounts that have unique profile badges that are no longer available, such as the Early Supporter or Early Verified Bot Developer badges. If you have one of these unique badges, you should be extra-vigilant with your account.

If your account is taken over and the hacker changes the password, there isn’t much you can immediately do to stop them. However, if you have 2-Factor Authentication enabled on your account, the hacker will also be required to provide a 2FA code to change your password. We strongly recommend enabling 2FA, and you can learn how in our 2FA blog post.

Reporting what happened to Discord can help you regain ownership of your account, which can be done here — let us help you!

Additional Reading: Common Scams

We’ve created an additional blog post describing the types of scams floating around on Discord, which you can check out to familiarize yourself with signs that you may be dealing with a hacker who needs to be blocked:

Common Scams and What to Look Out For

We recommend sharing it with larger communities that may benefit from this knowledge.

With these recommendations in your pocket, you’ll be better able to foil any potential digital threats. Just like with keeping your IRL-self healthy and whole, taking preventative measures can keep your virtual self safe and secure.

Stay safe out there!

Moderation

Managing Moderation Teams

Getting to Know Your Team

Above all else, the foundation of a good moderation team is familiarity. By knowing your fellow moderators better - what they do, or where they live - you’ll relate to them better. You’ll get to know people’s strengths and weaknesses, learn to understand them better, and get a feeling for when they need help, and vice versa. Though you may all be in different time zones and have diverse backgrounds, you’re all working towards the same goal: keeping the community you all care for safe. Who knows - you might find that you share a lot of the same interests along the way, make great new friends, or deepen existing friendships during your time together as moderators.

Here are a few basic things you should do to familiarize yourselves with each other:

  • Introduce yourselves. Let moderators post a small introduction about themselves for existing and new moderators. This will give everyone on the team a basic understanding of where they’re coming from.
  • Engage new moderators. Just like in any relationship, talking with each other is a part of the process. Engage new moderators with conversations, and make them feel welcome in the team.
  • Keep in touch with moderators. Offer help to your moderators, or check on them if they haven’t been seen for a while. Sharing positive feedback is also a great way to stay in touch with your moderators! Positive, specific feedback is one of the best ways to recognize all of the hard, often thankless work that your moderators do.
  • Maintain a channel for off topic discussions. Keep a channel for casual conversations for your moderators, letting them talk about their interests and getting to know each other better. Just because they’re here in a position of power does not mean that they’re incapable of having fun or talking about things outside of the immediate state of the server.
  • Hold Voice Chat Sessions. While talking through text is great - especially when trying to get more long-form thoughts across - encouraging people to hop into voice to speak more casually can help keep your mods personable and engaged with one another, since they can put a voice to their fellow mods’ usernames.

Allocating Resources

A moderation team needs a clear structure and a unified understanding of server moderation, which has already been covered in Developing moderator guidelines. Now we’re going to expand on how to better utilize each individual moderator’s abilities. A moderation team can range from a few members in a small server to a huge team of 30 or more staff, depending on the server size. The bigger your community gets, the more organized the team needs to be. While they are all moderators, that doesn’t mean they all do the same job.

Some of your moderators, especially experienced moderators, are likely to be in more administrative positions. They usually stay further away from general day-to-day channel moderation while newer moderators are focused on watching conversations and enforcing the server rules.

If you do have one of these larger mod teams, consider delegating certain moderators to tasks and responsibilities that they’d be best suited for, rather than having a jack of all trades, master of none situation. This allows you to divide the team into smaller sub-teams that talk to each other more frequently in designated channels regarding their specific mod duties.

Here are a few examples of sub-teams that are common within larger communities:

Example 1: Chat Moderators

Moderators that primarily contribute to the community by enforcing rules, watching conversations, engaging members, solving member-to-member conflicts, and showing moderation presence. The same type of moderators could also exist for Voice Channels, but that is mostly for very large communities.

Example 2: Bot Managers

Moderators that are extremely familiar with permissions, bots, programming, etc. Ideally, they aren’t just able to operate bots you’re using, but also maintain them. A custom bot tailored to your community is always something to consider. Having a bot hosted “in-house” by a moderator within your team adds an additional layer of security. The Bot Team is very valuable in making new and creative ideas possible and especially in automating day-to-day processes.

Example 3: Event Supervisors

Most servers host events, from community-run events to events run by staff members. Event Supervisors watch over the community members hosting events, keep an eye out for new ones, and act as the general point of communication for hosting and announcing them.

These are a few ways to better utilize your moderators: give them designated jobs depending on your team’s size, along with the ability to dive more deeply into certain areas of moderation within your community, which overall makes managing and coordinating the team as a whole easier.

Decision Making

As server size and the number of moderators increase, it becomes harder to hear every voice and opinion. As a team, decisions need to be made together; they need to be consistent and equitable, and take into account as many different opinions as possible.

It’s important to establish a system, especially when making big decisions. Often, there are decisions that need to be made right in the moment - for example, when someone posts offensive content. In most cases, a moderator will act on their perception of the rules and punish offenders accordingly. At that very moment, the offending content has to be removed, leaving little to no time to gather a few staff members and make a decision together. This is where consistency comes into play. The more your moderators share equal knowledge and the same mindset, the more consistent moderation actions get. This is why it’s important to have moderator guidelines and a clear structure.

It’s very important to give every moderator freedom so they don’t have to ask every time before they can take action, but it’s also important to hear out as many opinions on any major server changes as possible, if time allows it.

Growth

Over time, a moderation team grows. They grow in many ways, in their abilities, in the number of moderators, but also grow together as a team. Every new moderation team will face challenges they need to overcome together and every already established team will face new situations that they have to adapt to and deal with. It’s important to give a moderator something to work towards. Mods should look forward to opportunities that will strengthen their capabilities as a moderator and strengthen the team’s performance as a whole.

You should let moderators know that they have the potential for growth in their future as a moderator. It can be something like specializing in specific areas of moderation, such as learning to manage or set up bots. Perhaps over time they will progress to a senior moderator role and focus more on the administrative side of things.

The Discord Mod Academy can be a valuable resource in encouraging moderator growth as well. While they may be familiar with some of the concepts in the Discord Moderation Academy, no moderator can know everything and these articles have the potential to further refine their moderation knowledge and enhance their abilities.

Professionalism

Moderators are direct representatives of your community and as such should be a reflection of what an ideal community member looks like. Many things tie into showing this professionalism, ranging from how moderators chat with members in public to consistency in moderator actions.

The presence of a moderator should never make people uncomfortable - there needs to be a balance between “I can chat with this moderator like with any other member” and “This is a moderator and I need to take what they’re saying seriously.”

Here are a few attributes that make a moderation team look and act professional:

  • Competency. A moderator needs to be able to express moderation knowledge and articulate themselves well, so they can explain any actions taken on members with competence.
  • Reliability. The team should plan ahead of time, and a moderator should commit to what they say they will deliver; if they can’t, they should take responsibility for it.
  • Integrity. Moderators also help set the standard for morality, ethics, and equality, and should model that standard for their community.
  • Self-control. A moderator will often face situations where members don’t agree with actions taken against them. In those often heated conversations with an irate member, moderators need to keep their self-control by staying calm and resolving the issue.
  • Professional Image. Moderators need to maintain a professional image - this includes choosing your profile picture, status, nickname, and any other highly visible identifiers wisely. While typing quirks and a friendly tone can go a long way to engender familiarity among your server members, when taking moderator actions it’s also important to say things in a way that is definitive and official. It’s hard to take a moderator seriously when they’re typing with tons of grammar mistakes and sporting a borderline NSFW profile picture.

Your team should share positivity, engage in conversation, show respect for others, and maintain a professional look. Make it clear to your moderation team that they are a highly visible and perhaps intensely scrutinized group within your community - and that they should conduct themselves appropriately with those aspects in mind.

Problem Solving

As with all group efforts, there is a possibility for friction to occur within a moderation team. This is undoubtedly something that you’re all going to have to address at some point in your mod team’s lifespan. It’s very important to deal with these situations as soon as they come up. Many people avoid dealing with conflict directly because it can be uncomfortable. However, getting to potential problems before they become serious can prevent more severe issues from cropping up in the future. The goal of a problem-solving process is to make a moderation team more comfortable handling conflict.

As a general problem solving process, you should:

  • Identify the issue a moderator/a group of moderators/or a member has
  • Understand the opinions and interests of everyone
  • List and evaluate possible solutions
  • Settle on a satisfactory solution

With that in mind, there are also situations where you’d want to exercise more discretion. Something that might prompt this is when a moderator makes a mistake.

It can be both uncomfortable and embarrassing to be “called out” for something, so often enough the best option is to speak to someone privately if you think they’ve made a mistake. After the moderator understands the situation and acknowledges the mistake, the problem can be talked about with the rest of the team, mainly to prevent situations like these in the future.

Still, sometimes there are situations where problems can’t be solved and things get more and more heated, and in the end parting ways is unavoidable. Moderators may leave on their own or have to be removed. Your team members should always be given the option to take a break from moderation - especially to avoid burnout. You can learn more about how to deal with moderator burnout here.

Reviewing

Taking a moment to look back at the history and progression of your community and your mod team can be a useful way to evaluate where your team is at and where it needs to be. The time frame is yours to choose, but common intervals are monthly, quarterly, or every six months.

You don’t always get to talk with your moderators every day - most of us volunteer, moderating as a hobby, having our own lives while living scattered all over the world. With all of that going on it can be hard to find the time or space to discuss things that you might feel are lacking or could be changed regarding the server, and that’s why reviewing is important.

A Community Review can be done in many ways. For most, a channel asking a few basic questions, like how the community has developed since the last review and how activity and growth have changed, is a good way to start. Most importantly, you want to hear how the moderation team is doing. Talk about mistakes that have happened recently and how they can be prevented, and review some of the moderator actions that have been taken. A review allows everyone to share their thoughts, see what everyone is up to, and deal with more long-term issues. It also lets you give your moderators feedback on their performance.

Summary

The essence of managing a moderation team is to be open-minded, to communicate, and to listen to each other. Endeavor to manage decisions and confront problems in an understanding, calm, and professional way. Give moderators the opportunity to grow and designated tasks, but also time to review, take breaks, and rest. Remember, you’re in this together!

Moderation

Using Modmail Bots

What are Modmail Bots?

Before we discuss how to use a modmail bot, we should explore what they are and what purpose they serve. Simply put, modmail bots are bots that allow users to contact the moderators of a server via a direct message with the bot. The bot then relays these messages to the entire moderation team by creating a private channel on the server visible only to the server moderators (also referred to as a “ticket”). Moderators can then use this private channel to have the bot relay their own messages back to the original user, all while keeping a record of the entire conversation within the server. Moderators can even discuss the ticket together in that channel without the user seeing, since sending a message back to the user requires a specific command.
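
To make that relay pattern concrete, here is a minimal sketch assuming the discord.py library. The guild ID, category name, channel-naming scheme, and the =reply command are all hypothetical examples, not how any particular modmail bot actually works.

```python
# Minimal sketch, assuming discord.py: relay DMs into private ticket channels
# and relay staff replies back to the user. All IDs and names are hypothetical.
import discord

intents = discord.Intents.default()
intents.message_content = True  # needed to read message text in ticket channels

client = discord.Client(intents=intents)

GUILD_ID = 123456789012345678          # hypothetical: your server's ID
TICKET_CATEGORY = "Modmail Tickets"    # hypothetical: a staff-only category


@client.event
async def on_message(message: discord.Message):
    if message.author.bot:
        return

    # A DM to the bot opens (or reuses) a private ticket channel in the server.
    if message.guild is None:
        guild = client.get_guild(GUILD_ID)  # assumes the bot is in this guild
        category = discord.utils.get(guild.categories, name=TICKET_CATEGORY)
        name = f"ticket-{message.author.id}"
        channel = discord.utils.get(guild.text_channels, name=name)
        if channel is None:
            # Channels created under a staff-only category inherit its privacy.
            channel = await guild.create_text_channel(name, category=category)
        await channel.send(f"**{message.author}**: {message.content}")
        return

    # Inside a ticket channel, only messages prefixed with "=reply " are sent
    # back to the user; everything else stays as private moderator discussion.
    if message.channel.name.startswith("ticket-") and message.content.startswith("=reply "):
        user_id = int(message.channel.name.removeprefix("ticket-"))
        user = await client.fetch_user(user_id)
        await user.send(f"Moderation team: {message.content.removeprefix('=reply ')}")


client.run("YOUR_BOT_TOKEN")  # hypothetical token placeholder
```

Because the ticket channel lives under a staff-only category, moderators can talk freely inside it; only messages sent with the relay command ever reach the user.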


There are numerous modmail bots online, each with their own styles and features. Some modmail bots can be self-hosted, while others are hosted publicly online! Here are some examples of modmail bots you can use:

It is important to do research into readily available bots to understand what your server needs and to choose whichever one best meets those needs. Please note, the content of this article is not endorsed by any developer or company related to the bots.

What Can Modmail Contribute to my Server?

Before adding a modmail bot into your server, you should determine if you need one by considering the benefits and drawbacks of modmail bots. Let’s review the pros and cons of modmail bots to help with this decision.

The Benefits of Modmail Bots

One of the biggest benefits of having a modmail bot is better organization. Anytime a user has a question or concern about the server, instead of trying to figure out which moderator is online and available, they can quickly DM the modmail bot and get in contact with the entire moderation team, while also notifying the moderators who are actively online. This also cuts down on over-pinging the entire moderation team in public channels.


The next benefit of modmail bots is team discussions. In the above example, we’re using the ModMail bot by CHamburr. As you can see, a user has opened up a ticket that can be viewed by the entire moderation team. The team can then decide who wants to respond to the ticket, as well as discuss the ticket in question among themselves and more efficiently handle the issue. Team discussions are great for trickier questions that not everyone on the team might know the answer to.

In the image below, you can see an example of what moderators see from their side. The team can split their focus and knock out tickets depending on their priority, urgency, or even work together to pick up a ticket if another mod has signed off. The ability to read through moderator discussions and work together is a great team building exercise for moderator teams!

The next benefit of modmail bots is information logs. Most modmail bots keep a modmail log that shows all of the tickets that have been opened or closed, along with useful receipts of the conversations that happened in those channels. This is great for administrators or lead moderators who want to see how cases were handled or want to catch up on what they’ve missed while offline. It also helps moderators look back on reporting history if an old report is picked up again later or if they suspect someone is abusing modmail.

Another benefit of having a modmail bot is anonymity. Some modmail bots allow you to reply to the user anonymously which can decrease the chances of someone harassing a moderator, or other unpleasant situations that may come from having to action someone. Users can sometimes get upset or want to seek revenge after being banned, muted, or kicked from a server, so being able to communicate with someone and action them without revealing your identity is extremely valuable.

Moderators no longer need to act as the “middleman” between a user and the rest of the team. The moderator can speak on behalf of the team and use language such as “we” instead of “I” which can communicate to the user that it isn’t just one person handling their case, but rather the entire team while keeping their identity safe.

Something that you will notice when moderating is that there are usually some questions or concerns that come up frequently. Having a modmail bot can help you better understand what your community needs are and better visualize what the most frequent moderator tickets are about. With this knowledge, you can create custom quick-replies that you and your team can use to more efficiently resolve tickets! Having the same response to a scam DM that is used by all the moderators on your server makes responding to those types of tickets quick, easy, and consistent. Moderators can copy and paste replies, or use custom commands to reply with pre-made messages. Once you have set up quick replies, you can answer many tickets without needing to think of how to reply, making your replies consistent no matter who is responding to a ticket.

Here are a few examples of situations you can make tailored custom-replies for ease, speed, and consistency:

  • Someone sent a user a scam DM
  • A member has a suggestion for the server
  • A member is having technical issues
  • A member is asking something through modmail when they should be going through a better form of communication, such as an official bug report forum or a support page that is not run by the moderation staff
  • A user has a frequently asked question that comes up a lot in the community

The Drawbacks of Modmail Bots

It is important to know the limitations and drawbacks of modmail bots before deciding if your community needs one. Let’s explore some of the ways these bots can be abused, as well as other negatives you should know about. Please keep in mind that every modmail bot is different, so you can work to choose, or even code, a modmail bot that may eliminate some of these drawbacks if you are concerned about them.

Modmail bots are vulnerable to spam, since anyone on the server can DM the bot to open up a ticket - be it a targeted spam attack or multiple users reporting the same thing. Some modmail bots allow you to block a member from opening future tickets.

Depending on what kind of modmail bot you use, one drawback is that some modmail bots might not save the conversations between you and the user who made the ticket, only showing the reason the ticket was closed. Some modmail bots will save the conversation, but not any comments made by other moderators regarding the inquiry inside the ticket channel which can erase a lot of discussion and context from replies that are made. If you want to be able to track actions, you have to prioritize a bot that does save information logs.

Another drawback is the learning curve from using a new bot. Members that are new to Discord or that aren’t familiar with modmail bots can have trouble interacting and using the bot to report issues. Almost all modmail bots require a user to DM the bot, which can go against initial instincts and be challenging to new users. Some users have their DMs turned off which makes using a modmail bot more complicated if the user can’t receive DM replies from the bot. One solution to this is to ping the user in a public channel asking for them to change their DM settings so the bot can communicate with them.

Members figuring out a modmail bot for the first time can lead to “test” messages which will spam your modmail tickets. While these aren’t a big concern and are mostly harmless, depending on the size of your server, multiple users “testing” how the bot works can lead to a lot of time cleaning up the tickets.

Similarly to how a modmail bot can be a learning curve for your members, a modmail bot can also be a learning curve for your moderation team. Learning how a modmail bot works can take some time to get used to if your team has never used one before. Depending on how familiar your team is with using bots, the onboarding time may vary, so be sure to set aside time for learning and teaching how to use a modmail bot. We have an article on Training and Onboarding New Moderators that should help with this!

Uses for Modmail Bots

Modmail bots can be used in a variety of ways. They not only serve as a way to communicate with moderators, but also help with organization and structure and can be used for other purposes. Let’s look at the different uses of a modmail bot, and how it can make life on the server easier.

Questions. Modmail bots help users who might have questions that they don’t feel comfortable asking in public. This can be questions related to the server, or more private and sensitive matters. By using a modmail bot, the moderation team can efficiently answer the question depending on who knows the answer, and the user gets a sense of privacy by not having to ask in public. As mentioned previously, having a set of quick-replies to frequently asked questions makes responding to questions a breeze. Some modmail bots even have custom command replies where you can create commands to reply with a specific text, depending on the command. For example, replying with the command “=hello” can reply to the ticket with a premade message meant to greet the user and open a conversation, such as: “Hello. Thank you for contacting the moderation team. How can we help you today?”
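
As a rough sketch of how such quick-reply commands might be wired up, the snippet below uses discord.py’s commands extension. The =hello and =scamdm commands and their canned text are hypothetical examples; a real modmail bot would relay the reply to the ticket’s user rather than just posting it in the channel.

```python
# Minimal sketch, assuming discord.py's commands extension: canned quick-reply
# commands so every moderator answers common tickets the same way.
import discord
from discord.ext import commands

intents = discord.Intents.default()
intents.message_content = True  # needed to read prefix commands

bot = commands.Bot(command_prefix="=", intents=intents)

# Canned responses your team agrees on (hypothetical example text).
QUICK_REPLIES = {
    "hello": "Hello. Thank you for contacting the moderation team. How can we help you today?",
    "scamdm": "Thanks for the report. Please block the account and avoid clicking any links it sent; we are taking a look now.",
}


@bot.command(name="hello")
async def hello(ctx: commands.Context):
    """Post the premade greeting into the current ticket channel."""
    await ctx.send(QUICK_REPLIES["hello"])


@bot.command(name="scamdm")
async def scamdm(ctx: commands.Context):
    """Post the premade scam-DM response."""
    await ctx.send(QUICK_REPLIES["scamdm"])


bot.run("YOUR_BOT_TOKEN")  # hypothetical token placeholder
```

Keeping the canned text in one shared place means that updating a response once updates it for the whole team, which supports the consistency goal described above.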

Bug Reports. You can implement a modmail bot in a support server that deals with bug reports. Some larger communities run two servers - one for the main community and one for support - depending on the community or product. Organizing bug reports using a modmail bot is an easy way to keep track of issues in one place. With the ability to give a closing reason, you can later search the modmail logs to look back at what sort of bugs were reported by different users.

Ban Appeals. Anonymity is important when dealing with troublemakers online. One way to use a modmail bot is to have a ban appeal server for users who would like to re-enter your community after being banned. With this setup, appeals go through the modmail system, so banned users can’t talk in any channels and can only interact with the staff team through the bot. Moderators can then review the appeal together and reply to the banned user anonymously through the bot.

Community Reports. Sometimes a member of your community needs to report something that isn’t quite rule-breaking, or has information about another user that the team should know. A modmail bot is great for relaying these reports to the team so everyone is notified at the same time without needing an emergency ping.

Server Feedback. Members of your server can provide feedback using a modmail bot. Using modmail for feedback removes the need for a feedback channel and guarantees that the team will see the feedback when it is sent.

Staff Applications. There are different ways to run staff applications for a server. Depending on who needs to see and evaluate the applications, it can make sense to receive them through a modmail bot in smaller communities. When an application is submitted, a modmail ticket opens conveniently for the moderators or admins to review. You can even comment within the ticket to evaluate each application, case by case! Keep in mind that there is a limited number of channels (in this case, tickets) that can exist in a channel category, so this will only work for smaller servers.

The list above covers just some of the many common ways modmail bots can be used to strengthen communication with your server. The use cases for a modmail bot are limited only by your imagination and the type of bot you’re using!

Modmail Best Practices

When using a modmail bot, it is important that your moderation team is on the same page. Make sure there are clear protocols for how tickets are handled. An easy way to ensure that the end user doesn’t get messages from multiple moderators is to have one moderator claim the ticket inside the ticket channel itself. This way, moderators aren’t stepping on each other’s toes and the ticket can be handled efficiently.

In the image below, we’re using a modmail bot that sends all messages sent in the ticket to the user, unless it has a specific command. Other modmail bots might only send a message to the user if a reply command is used, so be sure to read the bot documentation to know how your modmail bot handles communication between the moderators and the user sending a ticket.

Another good practice is to close any old or resolved tickets as quickly as possible. If the user has no further questions, it can be helpful to send a final message stating that if there are no further questions, the ticket will be closed soon. This way, you reduce the amount of open tickets and maintain a clean and organized modmail.

As mentioned before, anonymity in responding to tickets can help reduce the amount of harassment you and your moderators may receive from rule-breakers. When using a modmail bot you can use discretion when deciding whether to respond anonymously or with a more personalized response, depending on the situation. Here are a few scenarios where it’s okay to respond regularly:

  • A member has a basic, harmless question and you want to be more personable
  • A member is requesting to speak to an admin and needs to see who is dealing with their ticket
  • A member wants to thank a specific moderator for their help with something on the server
  • A member is sending in a note praising a specific moderator for how they handled a situation

In most cases, however, it is better to respond anonymously especially when dealing with any modmail tickets related to actioning or punishing another user. This applies to handling ban appeals if you have such a system set up. It isn’t a big deal if you reply to a question without showing your name. The same is not true if you show your name to someone who has been punished on your server.

Each server has its own tone and style when it comes to moderation. Some servers are more formal and may have an official tone, while others might have a more relaxed and easygoing tone. It is important to keep this in mind when responding to modmail tickets so that your tone in the modmail responses is consistent with the tone set by the server.

This article has explored the ways modmail bots can strengthen how your team communicates with your community as well as the direct advantages and disadvantages that exist when introducing such a bot to your server. In some ways moderation becomes significantly easier, but in other ways, negatives like spam from these bots can be overwhelming and introduce new problems to solve. Moderating a server can involve a lot of different moving parts and it is important to analyze your community’s needs as it grows, taking into consideration whether a modmail bot is included in that growth. Use what you’ve learned in this article to make a decision that’s right for you and your community.
If you decide that a modmail bot isn’t a good fit for your community, but you still want some bot help on your server, check out our Auto Moderation on Discord article!

Moderation

Internal Conflict Resolution

Causes of Internal Conflicts

There can be many causes for conflicts between your moderators, some occurring more often than others. Some can easily be prevented, while others need a more deliberate approach to stop them from escalating into a full conflict.

Personality Clashes and Differences

The main reason conflicts arise is a difference in personalities within your team. Everyone perceives, analyzes, and responds to situations differently. While this should be a learning opportunity for everyone involved, it can lead to arguments when moderators can’t get along. Emphasize that there are different approaches to the same issue, and teach moderators to learn from each other instead of letting disagreements end in conflict. When a moderator constantly disagrees with someone else, or ends up in a personal argument, try taking them aside and letting them know they should work through their personal issues. You can help them as an objective observer - it can be very valuable to let members of your team know that you understand their individual perspectives and are working to balance them.

Try to avoid the “Halo/Horns effect” as much as possible. This is a type of cognitive bias where your view of someone’s actions can influence how you think and feel about that person in general, both positively and negatively.

Poor Communication

Misunderstandings between your moderators can lead to conflict as well, especially when there are different understandings of how a situation should be handled. Transparent, consistent communication where everyone is on the same page is very important in an online setting such as Discord. Text is often perceived differently than talking over voice or in person, and a language barrier can also lead to a message being misinterpreted. Try to prevent misunderstandings with clear communication, and when they do happen, step in where possible and explain how the situation should be handled. Foster a culture where it is not only allowed but encouraged to ask for clarification on someone’s statement if you have any doubts about their intent.

Bullying and Harassment

Making jokes is beneficial for maintaining a healthy and welcoming environment, and can even be a way to deal with the tough situations moderators may face. However, it is important to make sure that jokes don’t push the boundaries of what is appropriate and lead to perceived bullying of server members behind closed doors, or even of other members of your own team.

What is perceived as a harmless in-joke in context could, out of context, seem like a malicious and unwarranted comment. When it is clear someone is not comfortable with certain jokes or feels targeted, you need to step in and clear the air, telling the moderators involved that you want a positive atmosphere where everyone feels welcome and comfortable, while clearly identifying what line was crossed so it does not happen again. That’s not to say that all close-knit groups trend towards exclusionary jokes, but it is something that can happen without a team even being cognizant that it’s a problem, which is why it’s pertinent to keep inclusivity and good-faith intent in mind in situations like this.

Unfair Treatment

A main cause of conflict is unfair treatment among members of your team. This might be because there is a lack of equal opportunities, or because expectations are unrealistic. Make sure everyone on your team is treated equally, regardless of your personal relationships, as everyone deserves the same treatment. This also means you and other moderators should have reasonable expectations of each other. Moderators often volunteer to help out in your community, so don’t expect them to be available at all times; they have other responsibilities as well. Any mod team’s north star should be that “real life comes first,” and to be lenient with one another in the face of that.

Unclear Responsibilities

Sometimes it is not clear what responsibilities everyone has on the team. This can lead to misunderstandings and eventually to conflicts. Make sure everyone on the team receives adequate training and information on what different roles and positions should be doing. You might want to divide different responsibilities amongst members of your team or between different roles, such as moderators and administrators. Be as transparent and clear as possible on what everyone should be doing.

Competition

Sometimes moderators feel they are competing against each other, especially if statistics or promotions are involved. Be clear that moderation is not a competition; it is about helping your community in the best way possible. Not every moderator has the same amount of time available to help, so there will naturally be differences across your team. Moderation is about quality, not quantity. If a negatively skewed competitive atmosphere is a frequent issue, it may be worth looking at whether there are mechanisms within your community that encourage competition and could be removed or revised, such as quotas, public moderation statistics, or leaderboards.

Different Values and Standards

Moderators usually come from all over the world and might have different values. Sometimes, these might result in a conflict. Remind your moderators that they should always respect each other's values, regardless of their own opinion. Moderators should be able to explain why they feel a certain way in order for others to be more understanding of the situation. When someone has values that do not align with the server’s moderation philosophy, you might need to remove them from your team.

Unresolved Underlying Issues

It is not always possible to immediately see the reason for a conflict; this might be due to unresolved underlying issues, which are hard to uncover without having a conversation with everyone involved. There can be many more causes for a dispute to arise, but it is always important to find the underlying issue behind a conflict - only then are you able to resolve it.

Preventing Internal Conflicts

To minimize and prevent conflicts from happening, try as best as you can to get a comprehensive view of why and how conflicts occur in your server. It is very important to develop interpersonal relationships with all of your moderators and value their contributions to the server. Encourage moderators to take initiative in projects they are enthusiastic about, especially in collaboration with other moderators. Just as moderators set a positive example in your community, you as the leader of your moderation team should set a positive example within your team.

When you propose major changes to the server, listen to your team’s viewpoints before deploying and explain why a decision was made. Good communication is very important within a team, not only to prevent conflicts, but also to keep all moderators on the same page. All moderators should feel involved and informed when you are making major changes.

You can give your moderators regular feedback on how they are doing and what they can improve. Make sure moderators always feel welcome to provide feedback constructively and positively to each other and that they can always contact you or someone in charge in case a conflict does arise.

Try to discourage gossip within your team, both internal and external. When moderators start to talk about each other behind their backs, it becomes personal and can distort the relationship your moderators have and how they see themselves within the server. Instead, encourage moderators to form friendships with each other by organizing social events for your staff. During those events, you can learn about the different personalities in your team.

Last, but not least: don’t lash out over mistakes. Give feedback where it is appropriate and move on. Be quick to forgive and forget. Avoid belittling your moderators, and create a culture of de-escalating situations in private.

Resolving an Internal Conflict

There are many ways to resolve conflict internally, and most of them depend on the cause of the conflict. A good way of resolving conflicts is using the Thomas-Kilmann model. It should be stated that conflicts are not necessarily bad and shouldn’t be avoided at all costs. When you are working in a team, it is important to be able to challenge the status quo and question each other, to keep everyone aligned and up to date.

In the Thomas-Kilmann model, there are two dimensions to consider when choosing a course of action: assertiveness and cooperativeness. Assertiveness is the degree to which you try to satisfy your own needs, while cooperativeness is the degree to which you try to satisfy the other person’s concerns.

Based on these dimensions, there are five ways to handle a conflict:

  • Avoiding: Ignoring the conflict
  • Accommodating: Satisfying the other person’s concerns at your own expense
  • Compromising: Finding an acceptable compromise that partially satisfies everyone
  • Competing: Satisfying your own concerns at the expense of others
  • Collaborating: Finding a solution which satisfies the concerns of everyone



These five approaches describe your intention when resolving a conflict, but situations often unfold differently than you expect at first. Don’t jump to conclusions when you are dealing with a conflict, as the reasons behind it are often more complex than they first appear. Everyone tends to lean towards one way of resolving conflicts, so try to balance the approaches so you don’t end up overwhelmed or overwhelm others.

The first step is to identify the cause of the conflict. You might already know it based on previously sent messages, but sometimes you need to contact everyone involved separately and in private to determine it. It is your responsibility to determine how to handle the situation. While collaborating to resolve a conflict might look the most appealing, it will not always be possible. Try not to avoid a conflict entirely: if you feel uneasy dealing with conflicts or don’t want to give moderators feedback, you might not want to be in a leading position.

If you need to resolve a conflict, choose a neutral place to work it out. This might be a separate server or a private group. None of the people involved in the conflict should have power over the others, so you or someone else should act as an objective observer. If the conflict is between other moderators, you should offer guidance, but don’t offer solutions: ultimately it is up to them to resolve their conflict, and you are not taking sides.

Remember that not all conflicts require consequences. Most conflicts are sparked by the passion of your moderators who are simply in disagreement on how to deal with situations. Try to turn a conflict into a learning opportunity for everyone involved. Let them explain how they view the situation and how they would have handled it or behaved differently. Afterwards, you should be able to identify specific disagreements that you can solve. Listen to everyone involved and give everyone an equal opportunity to express themselves. Remind everyone to stay respectful at all times, even if they disagree with each other.

Giving and Receiving Feedback

As said before, you should discourage gossip within your team and encourage each other to give constructive feedback. It is important that everyone knows how to give and receive feedback to prevent a conflict from happening in future situations.

Giving Feedback

When you notice a moderator displaying behavior or taking an action that could be improved, give them feedback on how to improve in the future. Don’t rush your feedback; everyone needs time to process. When a situation is heated, it is not the best time to give feedback. Remember to always give feedback in private!

If you are in doubt about whether you should give feedback, see if you can recognize a pattern in the person’s behavior or actions. Everyone makes mistakes, and that is perfectly fine - mistakes serve as a learning opportunity - so you should only give feedback if it becomes a pattern. Additionally, it’s recommended that you ask for consent to give feedback. While this may seem a little counterintuitive to helping the moderator improve or change, blindsiding someone when they’re not ready can result in backlash rather than progress.

When you do decide to give someone feedback, don’t focus on the person who made the mistake: focus on their behavior or action instead. By making feedback personal and equating it to an issue with the individual rather than their choices, you can come across as argumentative, unconstructive, and biased. Never exaggerate their behavior; be clear and specific. When you make generalized or vague comments about someone, they will not be able to improve. Feedback should be actionable - there should be a suggestion or a change that the individual can work on as a result of the feedback. Otherwise it’s just airing your grievances, which is unproductive for all parties involved.

The next time you see someone doing something well after you have given them feedback in the past, give them a compliment! Positive, specific feedback is especially effective in encouraging those good actions to be repeated. Giving each other positive feedback is just as important as giving each other constructive or corrective feedback.

Remember to always treat others as you would like to be treated yourself. Negative feedback is most often given when people are experiencing negative states like hunger, anger, loneliness, and tiredness - states summarized by the acronym HALT - so try to avoid giving feedback when you find yourself in one of them.

Receiving Feedback

When someone is giving you feedback, it is very important to listen to what they have to say. Don’t jump to conclusions, react defensively, or disagree immediately. Take a moment to summarize the feedback as you understand it to make sure you have understood their feedback correctly.

You should always thank someone for giving you feedback, even if you disagree. It has to be clear that you welcome feedback in order to improve. If you do not thank them, they might not give you valuable feedback in the future.

Ideally feedback is clear and specific, in which case you can end the conversation by thanking them and reflecting back what you are going to do with the feedback in the future. Other times it might not be understandable to you, making it unclear what you should do differently. In that case, ask questions to understand the feedback better. To get a better sense of what they want to achieve, it is a good idea to ask the other person how they would have handled the situation or behaved. After this, thank them and reflect back on what you are going to do with their feedback.

It is okay to let someone know why you disagree with their feedback, but remember to stay respectful at all times and explain clearly why you disagree. Everyone should feel heard, even if you are not acting on their feedback.

Handling Demotions

There are many reasons why you might want to remove a moderator from your team, including internal conflicts you are unable to resolve. Removing or demoting a moderator is not easy to do, but can be a necessary evil. Removing someone from your team should be in the best interest of your community or team and can often be in the best interest of that person as well. Be sure to give someone an opportunity to learn from their mistakes before removing them. Give a warning first and have a conversation in private with them, following the principles of giving and receiving feedback outlined above.

While it is not easy to deliver bad news, here are some tips to keep in mind when you do want to remove someone from your team.

Don’t delay the conversation

Once you have made the decision to remove someone from your team, don’t delay the conversation. When someone is causing more issues than they are solving, they need to be removed sooner rather than later, but make sure they have had enough opportunities to fix their mistakes and resolve their issues. Make sure you have this conversation in private, for example in direct messages or a voice call.

Keep it short and clear

When you are talking to someone you are removing, keep your message short and clear. Tell them they are being removed from the team, why you have made this decision, and when it will take effect. Be transparent about your reasons for removing them, but do not go into too much detail with specific examples, as this may turn the conversation into an argument about those examples rather than informing them of their removal. You’re aiming for a polite dismissal, not a lambasting of their character.

Stick to your decision

Despite your message being short, clear, and transparent, you might still receive counterarguments as to why they should not be removed and should be given another chance. It is important that they can express themselves, but it should be clear at all times that the decision to remove them has already been made and is not up for debate. Always listen carefully to what someone has to say and appreciate their feedback, but unless they have substantial evidence that a mistake was made, repeat your decision and make it clear that it is final.

Be helpful and compassionate

Even though you are delivering bad news, take a supportive approach in the conversation. Remember that while it might be difficult for you to deliver bad news, it is most definitely difficult to be on the receiving end. Show empathy, especially when they have done good work in the past.


In some cases, the moderator you’re removing may wish to receive feedback on what they can improve - giving this feedback in a constructive fashion is important, and will help them to avoid future problems. This feedback should include reflecting on the positive contributions they made to the team, helping them understand what the causes of conflict might have been while they were on your team, or simply trying to give them something positive to take forward and work on as a result of your conversation. This conversation shouldn’t reflect a reversal of your decision, but can be a useful point of reference if they want to join other servers or work on improving their skills down the line, or perhaps even re-apply to join the team in the future.

Inform your team

When someone has been removed from your team, make sure to inform the team of the decision. It is up to you whether or not to share the reason for their removal, but refrain from sharing details, as these are confidential and could be harmful to someone. You also want to prevent anyone, including yourself, from speaking badly of them, as this sets the wrong precedent for your team. As with your decision, be straightforward and clear with your team. An example could be: “Today we have decided to remove Sarah from our team. I will not go into too much detail about why this decision was made, but we believe it is the best decision for the community and the entire team.”

In case someone was removed because of misbehavior, you might want to include that in the message, as this gives the team assurance you are not removing someone because of a personal conflict, and it shows a strong message about how you want moderators to conduct themselves.

Wrapping Up

In every group, eventually a conflict or dispute will arise which needs to be resolved. There are common causes, such as personality clashes and differences, poor communication, unclear responsibilities, and harassment. To minimize and prevent conflicts from happening, try and understand as much as possible why conflicts occur in your server. It is very important to develop interpersonal relationships with all of your moderators and value their contributions to the server. Try to discourage gossip within your team. When moderators start to talk about each other behind their backs, it becomes personal and can distort the relationship your moderators have and how they see themselves within the server.

You can resolve conflicts using the Thomas-Kilmann model. It is important to know conflicts are not necessarily bad and shouldn’t be avoided at all costs. When you are working in a team, it is important to be able to challenge the status quo and question each other, to keep everyone on the same page and sharp.

When you want to give feedback to someone, talk to them in private and be clear and specific. Avoid giving feedback when you feel hungry, angry, lonely, or tired. Focus on their behavior or action, and never exaggerate it. Conversely, when someone is giving you feedback, it is very important to listen to what they have to say and to avoid jumping to conclusions, reacting defensively, or disagreeing immediately. Take a moment to summarize the feedback to make sure you have understood it correctly.

Sometimes you need to remove someone from your staff team. Don’t delay this conversation, keep it short and clear, and stick to your decision. When the conversation is taking place, be helpful and compassionate and do not forget to inform your team as well, but make it clear that certain things remain private for the safety and privacy of those involved.

Keeping all of those points in mind when managing a team environment will be integral to maintaining a healthy atmosphere full of various viewpoints that are united in the shared desire to keep your community safe.


Moderation

Using Webhooks and Embeds

What is a Webhook?

A webhook is sort of a “reverse API”: rather than an application you own (like a bot) calling another application to receive data when it wants it, a webhook is something you can give to someone else's application to send data to you directly as soon as there’s something to share. This makes the process very convenient and efficient for both providers and users.

Now let’s apply this definition to a more tangible example. Let’s say you aren’t around to reply to new e-mails for a while. Through the power of webhooks you could have a setup that replies automatically! When an event happens on a service (like receiving an email), that service activates your webhook and sends you the data relevant to the event that happened. That way, you have exactly the information you need to automatically reply to each new e-mail you receive.

So what does this mean for your server? Basically, Discord provides you with the ability to have a webhook that sends a message to your server when it’s activated, with the option to send the message to any channel along with having a cool name and avatar of your choice.

For example, let’s say you’re waiting for the latest chapter of your favorite webcomic series to release. You can set up a webhook that’s connected with an RSS feed. The feed will activate the webhook when the chapter is released and a message will be posted on your server to notify you about it. This can be applied to many different things - the potential is limitless!

If the service is capable of sending JSON Webhooks, Discord can often use these to create visually appealing embeds when it sends a message out. However, it can also accept raw text messages to pass along as well.
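
As a concrete illustration, here’s a minimal sketch (using Python and the requests library) of activating a webhook yourself by sending an HTTP POST request to its URL. The webhook URL, name, and message text below are placeholders; Discord reads the “content” field of the JSON body as a plain text message.

    import requests

    # Placeholder URL - paste the webhook URL copied from your channel's integration settings.
    WEBHOOK_URL = "https://discord.com/api/webhooks/your-webhook-id/your-webhook-token"

    payload = {
        "content": "A new chapter of the webcomic just went live!",  # plain text message
        "username": "Webcomic Updates",  # optional: overrides the webhook's configured name
    }

    response = requests.post(WEBHOOK_URL, json=payload)
    response.raise_for_status()  # raises an error if Discord rejected the request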

Usage of Webhooks

Webhooks are a great tool for services that have events that you’re interested in latching onto. This can be incredibly useful for Content Creator communities where their latest YouTube videos or Twitter posts can be funneled into channels for followers to have easy viewing access in your Discord community. You could also use this for a community that’s oriented towards an open-source technology by having events from GitHub be submitted directly to your server for people to keep track of.

There are a lot of platforms out there that provide you with the ability to handle everything yourself - from creating the webhook through your Discord Server Settings to plugging it into the platform you want updates from. However, this often requires you to be more familiar with at least some level of programming, as many of these platforms require developer accounts. There can be a few hoops to jump through to get everything set up just right, which can turn people off from using these types of webhooks.

However, times are changing and the barrier to entry for dabbling in the world of webhooks is much lower. These days, multiple third-party services exist that have handled all the hard techy stuff for you. They can interface with the platforms that would normally require you to do extra work to get the data you want. IFTTT and Zapier are two such services that let you plug-and-play several useful platforms directly into your Discord server through a clean web interface, allowing you to customize the type of data your webhook receives and thus what messages are sent to your server. Just keep in mind that some services restrict how many events can be sent to one webhook, so you may have to create multiple distinct webhooks for multiple functionalities. Note that these options and restrictions are entirely platform-dependent. Discord has no problem handling whatever data is thrown at its webhooks when activated, even if it’s from different sources.

There’s another, more distinctive use case for webhooks. While they’re very useful for automated messages, they’re also great for one-off embed messages with a very polished look! You can customize the name and the avatar of the webhook, and if you’re technologically inclined, you can create your own JSON data to activate the webhook with. This allows you to fully define the aesthetics of an embed in your message, giving a clean look to whatever information you’re sending. One possible application is nice formatting for your server rules and information, rather than just sending multiple plain text messages.

Of course, there are other possible uses, the only limit is your creativity! If you’re not too keen on developing JSON data by hand, or have no idea what a “JSON” is, services like Discohook also make it easy for anyone to take advantage of this use case. As a note, the websites linked here are not endorsed by Discord and are only a suggestion from the authors of the article.

How do I set up a webhook?

The process of setting up webhooks can be roughly explained in five simple steps!

Step 1: In your Discord server, you will need to create a webhook and copy the webhook URL. This URL is the path for your webhook to receive an HTTP POST request from a service when an event occurs.

Step 2: From this menu, you have the ability to style your webhook with a name and avatar. We recommend at least a name to distinguish its purpose.

Step 3: If you’re using a third-party service to handle events or send messages, plug your webhook URL into the service and configure the type of events that the service should trigger for.

Step 4: If your third-party service allows you to, configure additional options that will change the look and content of your message to the way you like.

Step 5: Now that you’ve created the connection, when the event happens, the data will be sent directly to Discord and your webhook will post a message in your preferred channel!

What is an Embed?

We’ve been talking about embeds a lot, but we haven’t quite explained what they are yet! In simple terms, embedding refers to the integration of links, images, GIFs, videos and other forms of content into messages on social media. An embed itself is a part of the message that visually represents content in order to increase engagement.

As a user of Discord, you have probably seen embeds more than a few times. They often are created automatically by Discord when a user sends a link, such as allowing you to view a YouTube video within Discord when a YouTube URL is shared. But did you know that Discord Bots and Webhooks can create their own embeds with all kinds of data? They’re pretty flexible! They can have colored borders, images, text fields, and other kinds of fancy properties.

If you're a developer, the options available to you to create a good-looking embed are powerful and versatile. For non-developers, platforms may give you visualizations that help you plug-and-play data into a website for customization purposes. There are also online tools to help you do this too!

Here are several of those properties that can be modified in an embed in order to style it, to give you an idea of what’s possible when you have the ability to style your messages:

  • Title - The text that is placed above the description, usually highlighted. Also directs to a URL, if given.
  • Description - The part of the embed where most of the text is contained.
  • Content - The message content outside the embed.
  • URL - The link to the address of the webpage. Mostly used with the thumbnail, icon and author elements in order to link to an image.
  • Color - Color of your embed’s border, usually in hexadecimal or decimal.
  • Timestamp - Time that the embed was posted. Located next to the footer.
  • Footer - Text at the bottom of the embed.
  • Thumbnail - A medium-sized image in the top right corner of the embed.
  • Image - A large-sized image located below the “Description” element.
  • Author - Adds the author block to the embed, always located at the top of the embed.
  • Icon - An icon-sized image in the top left corner of the embed, next to the “Author” element. This is usually used to represent an Author icon.
  • Fields - Allows you to add multiple subtitles with additional content underneath them below the main “Title” & “Description” blocks.
  • Inline - Allows you to put multiple fields in the same row, rather than having one per row.

Markdown is also supported in an embed. Here is an image to showcase an example of these properties:

Example image to showcase the elements of an embed

An important thing to note is that embeds also have their limitations, which are set by the API. Here are some of the most important ones you need to know:

  • Embed titles are limited to 256 characters
  • Embed descriptions are limited to 2048 characters
  • There can be up to 25 fields
  • The name of a field is limited to 256 characters and its value to 1024 characters
  • The footer text is limited to 2048 characters
  • The author name is limited to 256 characters
  • In addition, the sum of all characters in an embed structure must not exceed 6000 characters
  • A webhook can have 10 embeds per message
  • A webhook can only send 30 messages per minute

If you feel like experimenting even further you should take a look at the full list of limitations provided by Discord here.

It’s very important to keep in mind that when you are writing an embed, it should be in JSON format. Some bots even provide an embed visualizer within their dashboards. You can also use this embed visualizer tool which provides visualization for bot and webhook embeds.
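
To give a rough idea of what that JSON looks like, here’s a small sketch of a webhook payload containing a single embed that uses a few of the elements listed above. The webhook URL, color value, and text are placeholders for illustration only.

    import requests

    WEBHOOK_URL = "https://discord.com/api/webhooks/your-webhook-id/your-webhook-token"  # placeholder

    payload = {
        "embeds": [
            {
                "title": "Server Rules",
                "description": "Be kind and respectful. The full rules are below.",
                "color": 5793266,  # border color, given as a decimal number
                "footer": {"text": "Last updated by the mod team"},
                "fields": [
                    {"name": "1. No spam", "value": "Keep channels on topic.", "inline": True},
                    {"name": "2. No harassment", "value": "Treat everyone with respect.", "inline": True},
                ],
            }
        ]
    }

    requests.post(WEBHOOK_URL, json=payload).raise_for_status()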


Benefits of using Webhooks

There are several benefits to using webhooks in your community including simplicity, versatility, automation, customization, and accessibility.

Simplicity. Webhooks are straightforward and lacking in complexity. They have a single function, which is to send a designated message when they are activated.

Versatility. Webhooks have many different use cases, all stemming from being able to send a message upon activation. This allows for many types of automation powered by other services, or allows you to send clean looking messages manually. Both of these setups have their own various uses ranging from posting updates and logging to posting custom messages for an aesthetic purpose.

Automation. With webhooks, there isn’t a need to constantly look for updates and post them yourself. Once set up, a webhook will automatically post any updates you need.

Customization. Each webhook can have a unique name and avatar, and each message it sends can be unique as well. You can have multiple webhooks with different looks in different channels and use each of them however you like.

Accessibility. Once you create a webhook, all you need is its URL and a service that will push messages to it. Since webhooks are hosted on Discord, they don’t need to be hosted like a normal bot, often saving moderation teams from a financial investment and making them more easily available to all users.

Webhooks vs. Bots

Webhooks and bots, while having slight similarities, are actually very different from each other. There are several aspects we have to look at when comparing the two:

Function

Webhooks:
  • Can only send messages to a set channel.
  • They can only send messages, not view any.
  • Can send up to 10 embeds per message.

Bots:
  • Much more flexible as they can do more complex actions similar to what a regular user can do.
  • Bots are able to view and send messages.
  • Only one embed per message is allowed.

Customization

Webhooks:
  • Can create 10 webhooks per server with the ability to customize each avatar and name.
  • Able to hyperlink any text outside of an embed.

Bots:
  • Public bots often have a preset avatar and name which cannot be modified by end users.
  • Cannot hyperlink any text in a normal message, must use an embed.

Load and security

Webhooks:
  • Just an endpoint to send data to, no actual hosting is required.
  • No authentication that data sent to the webhook is from a trusted source.
  • If the webhook URL is leaked, only non-permanent problems may occur (e.g. spamming).
  • Easy to change the webhook URL if needed.

Bots:
  • Bots have to be hosted in a secure environment that will need to be kept online all the time, which costs more resources.
  • Bots are authenticated via a token; a compromised token can cause severe damage due to the permissions granted to the bot by the server owner.
  • However, you can reset the bot token if needed.
Even though this comparison is important for better understanding of both bots and webhooks, it does not mean you should limit yourself to only picking one or the other. Sometimes, bots and webhooks work their best when working together. It’s not uncommon for bots to use webhooks for logging purposes or to distinguish notable messages with a custom avatar and name for that message. Both tools are essential for a server to function properly and make for a powerful combination.


Summary

Thanks to their simplicity and accessibility, webhooks have become a staple in many communities. Despite their limitations and lack of functions compared to bots, they’re still very useful and play an important role in the automation and decoration of your server. Overall, webhooks are a great tool for pushing various messages, with limitless customization opportunities. Ultimately, the choice of using them depends solely on your needs and preferences.

Moderation

Auto Moderation in Discord

Why is Auto Moderation important?

Auto Moderation is integral to many communities on Discord, especially those of any notable size. There are many valid reasons for this, some of which may apply to your community as well. The security that auto moderation provides can give your users a much better experience in your community, make the lives of your moderators easier, and prevent malicious users from doing damage to your community or even joining it in the first place.

Auto Moderation vs Manual Moderation

If you’re a well established community, you’ll likely have a moderation team in place. You may wonder, why should I use auto moderation? I already have moderators! Auto moderation isn’t a replacement for manual moderation, rather, it serves to enrich it. Your moderation team can continue to make informed decisions within your community while auto moderation serves to make that process easier for them by responding to common issues at any time more quickly than a real-life moderator can.

Knowing what’s right for your community

Different communities will warrant varying levels of auto moderation. It’s important to be able to classify your community and consider what level of auto moderation is most suitable to your community’s needs. Keep in mind that Discord does impose some additional guidelines depending on how you designate your community. Below are different kinds of communities and their recommended auto moderation systems:

Private communities

If you run a Discord community with limited invites where every new member is known, auto moderation won’t be a critical function unless you have a significantly larger member count. It’s recommended to have at least some auto moderation however, namely text filters, anti-spam, or Discord’s AutoMod keyword filters.

Public communities

If you run a Discord community that is Discoverable or has public invites where new members can come from just about anywhere, it’s strongly recommended to have anti-spam and text filters or Discord’s AutoMod keyword filters in place. Additionally, you should be implementing some level of member verification to facilitate the server onboarding process. If your community is large, with several thousand members, anti-raid functionality may become necessary. Remember, auto moderation is configurable to your rules, as strict or loose as they may be, so keep this principle in mind when deciding what level of automation works best for you.

Verified and Partnered communities

If your Discord community is Verified or Partnered, you will need to adhere to additional guidelines to maintain that status. Auto moderation is recommended for these communities so you can be confident that these guidelines are being enforced succinctly and effectively at all times; consider using anti-spam and text filters or Discord’s AutoMod keyword filters. If you have a Vanity URL or your community is Discoverable, anti-raid is a must-have in order to protect your community from malicious actors.

Built-in moderation features

Some of the most powerful tools in auto moderation come with your community and are built directly into Discord. Located under the Server Settings tab, you will find the Moderation settings. This page houses some of the strongest safety features that Discord has to natively offer. These settings can help secure your Discord community without the elaborate setup of a third party bot involved. The individual settings will be detailed below.

AutoMod

AutoMod is a new content moderation feature as of 2022, allowing those with the “Manage Server” and “Administrator” permissions to set up keyword and spam filters that can automatically trigger moderation actions such as blocking messages that contain specific keywords or spam from being posted, and logging flagged messages as alerts for you to review.

This feature has a wide variety of uses within the realm of auto moderation, allowing mods to automatically log malicious messages and protect community members from harm and exposure to undesirable spam and words like slurs or severe profanity. AutoMod’s abilities also extend to messages within threads, text-in-voice channels, and Forum channels, giving moderation teams peace of mind that they have AutoMod’s coverage across these message surfaces without having to worry about adding more manual moderation work.

Setting up AutoMod is very straightforward. First, make sure your server has the Community feature enabled. Then, navigate to your server’s settings and click the AutoMod tab. From there, you can start setting up keyword and spam filters.

Keyword Filters

Keyword filters allow you to flag and block messages containing specific words, characters, and symbols from being posted. You can set up one “Commonly Flagged Words” filter, along with up to 3 custom keyword filters that allow you to enter a maximum of 1,000 keywords each, for a total of four keyword filters.

When inserting keywords, you should separate each word with a comma, like so: Bad, words, go, here. Matches for keywords are exact and aware of whitespace. For example, the keyword “Test Filter” will be triggered by “test filter” but not “testfilter” or “test”. Do note that keywords also ignore capitalization.

To have AutoMod filter messages containing words that partially match your keywords, which is helpful for preventing users from circumventing your filters, you can modify your keywords with the asterisk (*) wildcard character. This works as follows:

  • *cat - flags “bobcat” or “copycat”.
  • cat* - flags “catching” or “caterpillar”.
  • *cat* - flags “scathing” or “locate”.

Be careful with wildcards so as to not have AutoMod incorrectly flag words that are acceptable and commonly used!
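
To make the matching rules above easier to picture, here’s a rough sketch that approximates them in Python. This is an illustration of the behavior described in this article, not Discord’s actual implementation, and it approximates “whole word” matching with simple word boundaries.

    import re

    def keyword_matches(keyword: str, message: str) -> bool:
        """Approximation of the AutoMod keyword rules described above (illustrative only)."""
        text = message.lower()
        word = keyword.lower()
        if word.startswith("*") and word.endswith("*"):
            return word.strip("*") in text  # *cat* - match anywhere, even inside other words
        if word.endswith("*"):
            return re.search(r"\b" + re.escape(word[:-1]), text) is not None  # cat* - prefix match
        if word.startswith("*"):
            return re.search(re.escape(word[1:]) + r"\b", text) is not None   # *cat - suffix match
        return re.search(r"\b" + re.escape(word) + r"\b", text) is not None   # exact whole-word match

    print(keyword_matches("*cat", "I saw a bobcat"))     # True
    print(keyword_matches("cat*", "caterpillar facts"))  # True
    print(keyword_matches("*cat*", "please locate it"))  # True
    print(keyword_matches("cat", "concatenate these"))   # False - exact keywords need the whole word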

Commonly Flagged Words

AutoMod’s Commonly Flagged Words keyword filter comes equipped with three predefined wordlists that provide communities with convenient protection against commonly flagged words. There are three predefined categories of words available: Insults and Slurs, Sexual Content, and Severe Profanity. These wordlists will all share one rule, meaning they’ll all have the same response configured. These lists are maintained by Discord and can help keep conversations in your Community consistent with Discord's Community Guidelines. This can be particularly helpful for Partnered and Verified communities.

Exemptions

Both AutoMod’s commonly flagged word filters and custom filters allow for exemptions in the form of roles and channels, with the commonly flagged word filter also allowing for the exemption of words from Discord’s predefined wordlists. Anyone with these defined roles, or sending messages within defined channels or containing keywords from Discord’s wordlists, will not trigger responses from AutoMod.

This is notably useful for allowing moderators to bypass filters, allowing higher trusted users to send more unrestricted messages, and tailoring the commonly flagged wordlists to your community’s needs. As an example, you could prevent new users from sending Discord invites with a keyword filter of: *discord.gg/*, *discord.com/invites/* and then give an exemption to moderators or users who have a certain role, allowing them to send Discord invites. This could also be used to only allow sharing Discord invites in a specific channel. There’s a lot of potential use cases for exemptions! Members with the Manage Server and Administrator permissions will always be exempt from all AutoMod filters. Bots and webhooks are also exempt.
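
For teams that prefer scripting their configuration instead of using the server settings UI, AutoMod rules can also be created through Discord’s auto moderation API. The sketch below is a rough example, assuming a bot token for a bot with the Manage Server permission; the IDs are placeholders, and you should double-check the current API documentation before relying on the exact field names and values.

    import requests

    API_BASE = "https://discord.com/api/v10"
    GUILD_ID = "your-guild-id"                    # placeholder
    BOT_TOKEN = "your-bot-token"                  # placeholder; bot needs Manage Server
    ALERT_CHANNEL_ID = "your-mod-log-channel-id"  # placeholder
    MOD_ROLE_ID = "your-moderator-role-id"        # placeholder

    rule = {
        "name": "Block server invites",
        "event_type": 1,    # MESSAGE_SEND
        "trigger_type": 1,  # KEYWORD
        "trigger_metadata": {"keyword_filter": ["*discord.gg/*", "*discord.com/invites/*"]},
        "actions": [
            {"type": 1},                                                # Block message
            {"type": 2, "metadata": {"channel_id": ALERT_CHANNEL_ID}},  # Send an alert
        ],
        "enabled": True,
        "exempt_roles": [MOD_ROLE_ID],  # e.g. allow moderators to share invites
    }

    response = requests.post(
        f"{API_BASE}/guilds/{GUILD_ID}/auto-moderation/rules",
        headers={"Authorization": f"Bot {BOT_TOKEN}"},
        json=rule,
    )
    response.raise_for_status()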

Spam Filters

Spam, by definition, is irrelevant or unsolicited messages. AutoMod comes equipped with two spam filters that allow you to flag messages containing mention spam and content spam.

Mention Spam

Mention spam is when users post messages containing excessive mentions for the purpose of disrupting your server and unnecessarily pinging others.

AutoMod’s mention spam filter lets you flag and block messages containing an excessive number of unique @role and @user mentions. You define what is “excessive” by setting a limit on the number of unique mentions that a message may contain, up to 50.

It is recommended to select "Block message" as an AutoMod response when it detects a message containing excessive mentions as this prevents notifications from being sent out to tagged users and roles. This helps prevent your channels from being clogged up by disruptive messages containing mention spam during mass mention attempts and mention raids, and saves your members from the annoyance of getting unnecessary notifications and ghost pings.

Spam Content

This filter flags spammy text content that has been widely reported by other users as spam, such as unsolicited messages, free Nitro scams and advertisements, and invite spam.

This filter identifies spam at large by using a model that has been trained by messages that users have reported as spam to Discord. Enabling this filter is an effective way to block out a variety of messages that resemble spammy content reported by Discord users, and identify spammers in your community that should be weeded out. However, this filter isn’t perfect and might not catch all forms of spam, such as DM spam, copy/pasta or repeat messages.

Automatic Responses

You can configure AutoMod’s keyword and spam filters with the following automatic responses when a message is flagged:

Block message

This response will prevent a message containing a keyword or spam from being sent entirely. Users will be notified with an ephemeral message when this happens, informing them the community has blocked the message from being sent.

Discord will seamlessly block all messages containing matching keywords, spam content, and excessive mentions from your filters from being sent entirely regardless of the volume of messages, making this response especially effective for preventing or de-escalating raids where raiders try to spam your channels with repeated messages and excessive mentions.


Send an alert

This response will send an alert containing who-what-where information of a flagged message to a logging channel of your choice.

This message will preview what the full caught message would’ve looked like, including the full content. It also shows a pair of buttons at the bottom of the message, ⛨ Actions and Report Issues. These action buttons will bring up a user context menu, allowing you to use any permissions you have to kick, ban, or time out the member. The message also displays the channel the message was attempted to be sent in and the filter that was triggered by the message. In the future, some auto-moderation bots may be able to detect these messages and action users accordingly.

Time out user

This response will automatically apply a time out penalty to a user, preventing them from interacting in the server for the duration of the penalty. Affected users are unable to send messages, react to messages, join voice channels or video calls during their timeout period. Keep in mind that they are able to see messages being sent during this period.

To remove a timeout penalty, Moderators and Admins can right-click on any offending user’s name to bring up their Profile Context Menu and select “Remove Timeout.”

Recommended Configuration

AutoMod is a very powerful tool that you can set up easily to reduce moderation work and keep your community's channels and conversations clean and welcoming during all hours of the day. For example, you may want to use three keyword filters; one to just block messages, one to just send alerts for messages, and one to do both.

Overall, it’s recommended to have AutoMod block messages you wouldn’t want community members to see. For example, high-harm keywords such as slurs and other extreme language should have AutoMod’s “block message” and “send alerts” responses enabled. This will allow your moderation team to take action against undesirable messages and the users behind them while preventing the rest of your community from exposure. Low-harm keywords or commonly spammed phrases can have AutoMod’s “block message” response enabled without the need to set up alerts. This will still prevent undesirable messages from being sent without spamming your logs with alerts. You can also quickly reconfigure AutoMod’s keyword and spam filters in real time to prevent and de-escalate raids - for example, by adding spammed keywords or adjusting your mention limit in the event of a mention raid - so that raids can’t cause lasting damage.

It's also recommended to have AutoMod send you alerts for more subjective content that requires a closer look from your moderation team, rather than having them being blocked entirely. This will allow your moderation team to investigate flagged messages with additional context to ensure there’s nothing malicious going on. This is useful for keywords that can be commonly misrepresented, or sent in a non-malicious context.

Verification Level

None - This turns off verification for your community, meaning anyone can join and immediately interact with your community. This is typically not recommended for public communities as anyone with malicious intent can immediately join and wreak havoc.

Low - This requires people joining your community to have a verified email which can help protect your community from the laziest of malicious users while keeping everything simple for well-meaning users. This would be a good setting for a small, private community.

Medium - This requires the user to have a verified email address and for their account to be at least 5 minutes old. This further protects your community by introducing a blocker for people creating accounts solely to cause problems. This would be a good setting for a moderately sized community or small public community.

High - This includes the same protections as both medium and low verification levels but also adds a 10 minute barrier between someone joining your community and being able to interact. This can give you and anyone else responsible for keeping things clean in your community time to respond to ‘raids’, or large numbers of malicious users joining at once. For legitimate users, you can encourage them to do something with this 10 minute time period such as read the rules and familiarize themselves with informational channels to pass the time until the waiting period is over. This would be a good setting for a large public community.

Highest - This requires a joining user to have a verified phone number in addition to the above requirements. This setting can still be bypassed by determined ‘raiders’, but it takes additional effort. This would be a good setting for a private community where security is paramount, or a public community with custom verification. This is a requirement many normal Discord users won’t meet, whether by choice or inability. It’s worth noting that Discord’s phone verification disallows VoIP numbers, to prevent abuse.

Explicit media content filter

Not everyone on the internet is sharing content with the best intentions in mind. Discord provides a robust system to scan images and embeds to make sure inappropriate images don’t end up in your community. There are varying levels of scrutiny to the explicit media content filter which are:

Don’t scan any media content - Nothing sent in your community will go through Discord’s automagical image filter. This would be a good setting for a small, private community where only people you trust can post images, videos etc.

Scan media content from users without a role - Self-explanatory: this works well to stop new users from filling your community with unsavory imagery. When combined with the proper verification methods, this would be a good setting for a moderately sized private or public community.

Scan media content from all members - This setting makes sure everyone, regardless of their roles, isn't posting unsavory things in your community. In general, we recommend this setting for ALL public-facing communities.

Once you’ve decided on the base level of auto moderation you want for your community, it’s time to look at the extra levels of auto moderation bots can bring to the table! The next few sections are going to detail the ways in which a bot can moderate.

Bot-controlled Auto Moderation

If you want to keep your chats clean and clear of certain words, phrases, spam, mentions, and anything else that can be misused by malicious users, you're going to need a little help from a robotic friend or two. Examples of freely available bots are referenced below. If you decide to use several bots, you may need to juggle several moderation systems.

When choosing a bot for auto moderation, you should also consider their capabilities for manual moderation (things like managing mutes, warns etc.). Find a bot with an infraction/punishment system you and the rest of your moderator team find to be the most appropriate. All of the bots listed in this article have a manual moderation system.

The main and most pivotal forms of auto moderation are:

  • Anti-Spam
  • Text Filters
  • Anti-Raid
  • User Filters

Each of these subsets of auto moderation will be detailed below along with recommended configurations depending on your community.

Bots seen in this guide:

  • Mee6 - https://mee6.xyz/
  • Dyno - https://dyno.gg/
  • Giselle - https://docs.gisellebot.com/bot-invite.html
  • AutoModerator - https://automoderator.app/
  • Fire - https://getfire.bot/
  • Bulbbot - https://bulbbot.rocks/
  • Gearbot - https://gearbot.rocks/

Support of Discord API features

It's important that your auto moderation bot(s) of choice adopt the cutting edge of Discord API features, as this allows them to provide better capabilities and integrate more powerfully with Discord. Slash commands are especially important, as you're able to configure which commands are usable, and by whom, on a case-by-case basis for each slash command. This allows you to maintain very detailed moderation permissions for your moderation team. Bots that support more recent API features are generally also more actively developed, and thus more reliable when it comes to reacting to new threat vectors and adapting to new features on Discord. A severely outdated bot could react insufficiently to a high-harm situation.

Slash Command Permissions

As mentioned above, one of the more recent features is slash commands. Slash commands are configurable per-command, per-role, and per-channel. This allows you to restrict moderation commands solely to your moderation team without relying on the bot's own permission checks working perfectly. This is relevant because there have been documented cases in the past of a moderation bot's permission checking being bypassed, allowing normal users to execute moderation commands.
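
As a sketch of what this looks like on the bot developer's side, here's a minimal example assuming the discord.py library (version 2.x) and a hypothetical /warn command. The default_permissions decorator only sets a sensible default; server admins can still override which roles and channels may use the command under Server Settings → Integrations:

    import discord
    from discord import app_commands

    intents = discord.Intents.default()
    client = discord.Client(intents=intents)
    tree = app_commands.CommandTree(client)

    @tree.command(name="warn", description="Warn a member (moderation team only)")
    @app_commands.default_permissions(moderate_members=True)
    async def warn(interaction: discord.Interaction, member: discord.Member, reason: str):
        # Record the infraction in your own logging system here.
        await interaction.response.send_message(
            f"{member.mention} was warned for: {reason}", ephemeral=True
        )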

Anti-Spam

One of the most common forms of auto moderation is anti-spam, a type of filter that can detect and prevent various kinds of spam. Depending on what bot(s) you’re using, this comes with various levels of configurability.

Anti-spam is integral to running a large private community, or a public community. There are multiple types of spam a user can engage in, with some of the most common forms listed in the table above. These types of spam messages are also very typical of raids, especially Fast Messages and Repeated Text. While spam can largely be defined as irrelevant or unsolicited messages, the nature of spam can vary greatly. However, the vast majority of instances involve a user or users sending lots of messages with the same content with the intent of disrupting your community.

There are subsets of this spam that many anti-spam filters will be able to catch. For example, if any of the following: Mentions, Links, Invites, Emoji, and Newline Text are spammed repeatedly in one message, or spammed repeatedly across several messages, they will trigger most Repeated Text and Fast Messages filters. Subset filters are still a good thing for your anti-spam filter to have, as you may wish to punish more or less harshly depending on the type of spam. Notably, Emoji and Links may warrant separate punishments: spamming 10 links in a single message is inherently worse than having 10 emoji in a message.

Anti-spam will only act on these things contextually, usually in an X in Y fashion where if a user sends, for example, ten links in five seconds, they will be punished to some degree. This could be ten links in one message, or one link in ten messages. In this respect, some anti-spam filters can act simultaneously as Fast Messages and Repeated Text filters.
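
A minimal sketch of such an "X in Y" check, written in plain Python with no particular bot library and using the hypothetical thresholds above (ten links in five seconds):

    import time
    from collections import defaultdict, deque

    MAX_LINKS = 10        # hypothetical threshold: punish at 10 links...
    WINDOW_SECONDS = 5    # ...within any 5-second window

    link_timestamps = defaultdict(deque)  # user_id -> timestamps of links they sent

    def register_links(user_id: int, links_in_message: int) -> bool:
        """Record the links from one message; return True if the user tripped the filter."""
        now = time.monotonic()
        timestamps = link_timestamps[user_id]
        timestamps.extend([now] * links_in_message)
        # Drop anything that falls outside the sliding window.
        while timestamps and now - timestamps[0] > WINDOW_SECONDS:
            timestamps.popleft()
        return len(timestamps) >= MAX_LINKS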

Sometimes, spam may happen too quickly and a bot can fall behind. Discord imposes rate limits on bots (to stop bots from harming communities), and these limits can prevent a bot from deleting individual messages quickly enough when they are sent in rapid succession. This can often happen in raids. As such, Fast Messages filters should prevent offenders from sending further messages; this can be done via a mute, kick, or ban. If you want to protect your community from raids, please read on to the Anti-Raid section of this article.

Text Filters

Text filters allow you to control the types of words and/or links that people are allowed to put in your community. Different bots will provide various ways to filter these things, keeping your chat nice and clean.

A text filter is a must for a well-moderated community. It's strongly recommended you use a bot that can filter text based on a banlist. A banned words filter can catch links and invites, provided http:// and https:// are added to the word banlist (to catch all links) or specific full site URLs are added to block individual websites. In addition, discord.gg can be added to a banlist to block ALL Discord invites.

A banned words filter is integral to running a public community, especially Partnered, Community, or Verified servers, which have additional content guidelines they must meet that a banned words filter can help enforce.

Before configuring a filter, it's a good idea to work out what is and isn't okay to say in your community, regardless of context. For example, racial slurs are generally unacceptable in almost all communities. Banned word filters with an explicit banlist often won't account for context, so it's also important that a robust filter contains allowlisting options. For example, if you add ‘cat’ to your filter and someone says ‘catch’, they could get in trouble for using an otherwise acceptable word.
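
To make the ‘cat’/‘catch’ problem concrete, here's a small, hypothetical Python sketch: a naive substring banlist would flag ‘catch’ because it contains ‘cat’, so the filter consults an allowlist before flagging anything (both word lists are placeholders):

    import re

    BANNED_WORDS = {"cat"}       # hypothetical banlist entry
    ALLOWED_WORDS = {"catch"}    # allowlisted words that happen to contain a banned word

    def message_violates_filter(message_text: str) -> bool:
        words = re.findall(r"[a-z']+", message_text.lower())
        for word in words:
            if word in ALLOWED_WORDS:
                continue  # explicitly allowlisted, skip it
            if any(banned in word for banned in BANNED_WORDS):
                return True
        return False

    print(message_violates_filter("nice catch!"))    # False - 'catch' is allowlisted
    print(message_violates_filter("silly catfish"))  # True - contains 'cat' and isn't allowlisted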

Filter immunity may also be important to your community, as there may be individuals who need to discuss the use of banned words, namely members of a moderation team. There may also be channels that allow the usage of otherwise banned words. For example, a serious channel dedicated to discussion of real-world issues may require discussions about slurs or other demeaning language; in cases like this, channel-based immunity is integral to allowing those conversations.

Link filtering is important to communities where sharing links in ‘general’ chats isn't allowed, or where there are specific channels dedicated to sharing that content. This allows a community to remove links with an appropriate reprimand without treating that misstep with the same gravity as one would treat someone who used a slur.

Allowlisting, banlisting, and templates for links are also good features to have. While many communities will use catch-all filters to make sure links stay in specific channels, some links will always be inherently unsavory. Being able to filter specific links is a good feature, with preset filters (like the Google filter provided by YAGPDB) coming in very handy for protecting your user base without requiring intricate setup on your behalf. However, it is recommended you configure a custom filter as a supplement, to ensure specific slurs, words, etc. that break the rules of your community aren't being said.

Invite filtering is equally important in large or public communities, where users may attempt to raid, scam, or otherwise assault your community with links intended to manipulate your user base, or where unsolicited self-promotion is potentially fruitful. Filtering allows these invites to be recognized instantly and dealt with more harshly. Some bots may also allow per-community allowlisting and banlisting, letting you control which communities are approved to have their invites shared and which aren't. A good example of invite filtering usage would be something like a partners channel, where invites to other, closely linked communities are shared. These communities should be added to an invite allowlist to prevent their deletion.
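
As an illustration, here's a minimal Python sketch of invite filtering with an allowlist. The regular expression covers the common Discord invite URL formats, and the allowed invite codes are hypothetical placeholders:

    import re

    # Hypothetical allowlist of partner-server invite codes.
    ALLOWED_INVITE_CODES = {"partner-server", "abc123"}

    INVITE_PATTERN = re.compile(
        r"(?:discord\.gg|discord(?:app)?\.com/invite)/([A-Za-z0-9-]+)", re.IGNORECASE
    )

    def find_unapproved_invites(message_text: str) -> list:
        codes = INVITE_PATTERN.findall(message_text)
        return [code for code in codes if code not in ALLOWED_INVITE_CODES]

    print(find_unapproved_invites(
        "join discord.gg/evil-raid or discord.gg/partner-server"
    ))
    # ['evil-raid'] - only the unapproved invite gets flagged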

Built-in suspicious link and file detection

Discord also implements a native filter on links and files, though this filter is entirely client-side and doesn't prevent malicious links or files from being sent. It does, however, warn users who attempt to click suspicious links or download suspicious files (executables, archives, etc.) and prevents known malicious links from being clicked at all. While this doesn't remove offending content, and shouldn't be relied on as auto moderation, it does prevent some cracks in your auto moderation from harming users.

Anti-Raid

Raids, as defined earlier in this article, are mass-joins of users (often selfbots) with the intent of damaging your community. Protecting your community from these raids can come in various forms. One method involves gating your server using a method detailed elsewhere in the Discord Moderator Academy (DMA).

Raid detection means a bot can detect the large number of users joining that’s typical of a raid, usually in an X in Y format. This feature is usually chained with Raid Prevention or Damage Prevention to prevent the detected raid from being effective, wherein raiding users will typically spam channels with unsavory messages.

Raid-user detection is a system designed to detect users who are likely to be participating in a raid, independently of the quantity or frequency of new user joins. These systems typically look for accounts that were created recently or have no profile picture, among other triggers depending on how elaborate the system is.
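
As an example of the kind of heuristic these systems use, here's a rough sketch assuming the discord.py library (version 2.x). The seven-day account-age threshold is an arbitrary placeholder, and real raid-user detection combines many more signals:

    import datetime
    import discord  # assumes discord.py 2.x

    MIN_ACCOUNT_AGE = datetime.timedelta(days=7)  # hypothetical threshold

    def looks_like_raid_account(member: discord.Member) -> bool:
        """Very rough heuristic: a brand-new account with no custom avatar."""
        account_age = discord.utils.utcnow() - member.created_at
        has_default_avatar = member.avatar is None
        return account_age < MIN_ACCOUNT_AGE and has_default_avatar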

Raid prevention stops a raid from happening, either by Raid detection or Raid-user detection. These countermeasures stop participants of a raid specifically from harming your community by preventing raiding users from accessing your community in the first place, such as through kicks, bans, or mutes of the users that triggered the detection.

Damage prevention stops raiding users from causing any disruption via spam to your community by closing off certain aspects of it either from all new users, or from everyone. These functions usually prevent messages from being sent or read in public channels that new users will have access to. This differs from Raid Prevention as it doesn’t specifically target or remove new users in the community.

Raid anti-spam is an anti-spam system robust enough to prevent raiding users’ messages from disrupting channels via the typical spam found in a raid. For an anti-spam system to fit this dynamic, it should be able to prevent Fast Messages and Repeated Text. This is a subset of Damage Prevention.

Raid cleanup commands are typically mass-message removal commands to clean up channels affected by spam as part of a raid, often aliased to ‘Purge’ or ‘Prune’.

Built-in anti-raid

It should be noted that Discord features built-in raid and user bot detection, which is rather effective at preventing raids as or before they happen. If you are logging member joins and leaves, you can infer that Discord has taken action against shady accounts if the time difference between the join and the leave times is extremely small (such as between 0-5 seconds). However, you shouldn’t rely solely on these systems if you run a large or public community.
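
If you log joins and leaves with your own bot, a rough sketch of this inference, assuming the discord.py library (version 2.x) with the privileged members intent enabled, could look like the following; the five-second threshold is illustrative:

    import discord  # assumes discord.py 2.x

    intents = discord.Intents.default()
    intents.members = True  # privileged intent; enable it in the Developer Portal too
    client = discord.Client(intents=intents)

    @client.event
    async def on_member_remove(member: discord.Member):
        if member.joined_at is None:  # join time may not be cached
            return
        seconds_in_server = (discord.utils.utcnow() - member.joined_at).total_seconds()
        if seconds_in_server <= 5:
            print(f"{member} left {seconds_in_server:.1f}s after joining - "
                  "possibly removed by Discord's built-in anti-raid measures.")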

User filters

Messages aren’t the only way potential evildoers can introduce unwanted content to your community. They can also manipulate their Discord username or Nickname to be abusive. There are a few different ways a username can be abusive and different bots offer different filters to prevent this.

Username filtering is less important than other forms of auto moderation. When choosing which bot(s) to use for your auto moderation needs, this should typically be a later priority, since users with malicious usernames can just be nicknamed in order to hide their actual username.
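
If you still want a lightweight safety net, a hypothetical sketch assuming the discord.py library (version 2.x) could normalize a new member's display name against your banned words and nickname them automatically; the word list and replacement nickname are placeholders, and the bot needs the Manage Nicknames permission:

    import discord  # assumes discord.py 2.x

    BANNED_NAME_WORDS = {"slur1", "free-nitro"}  # hypothetical list

    async def sanitize_username(member: discord.Member):
        """Call this from your on_member_join handler."""
        name = member.display_name.lower()
        if any(word in name for word in BANNED_NAME_WORDS):
            await member.edit(nick="Moderated Nickname", reason="Username filter match")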

Specialized Auto Moderation Bots

So far, we’ve covered general auto moderation bots with a wide toolset. However, there are some specialized bots that only cover one specific facet of auto moderation and execute it especially well. A few examples and descriptions are below:

  • Beemo - Bot raid detection and prevention

This bot detects raids as they happen globally, banning raiders from your community. This is especially notable because it will ban users it has detected raiding other communities it's in as soon as they join your community, making it significantly more effective than other anti-raid solutions that only pay attention to your own community.

  • Fish - Malicious link and DM raider detection

Fish is designed to counter scamming links and accounts, targeting patterns in joining users to prevent DM raids (like normal raids, but members are directly messaged instead). These DM raids are typically phishing scams, which Fish also filters, deleting known phishing sites.

  • Safelink and Crosslink - Link automoderation

Both of these bots are highly specialized link and file moderation bots, effectively filtering adult sites, scamming sites and other categories of sites as defined by your moderation team.

Which bot do I use?

When choosing a bot for auto moderation you should ensure it has an infraction/punishment system you and your mod team are comfortable with as well as its features being what’s best suited for your community. Consider testing out several bots and their compatibility with Discord’s built-in auto moderation features to find what works best for your server’s needs. You should also keep in mind that the list of bots in this article is not comprehensive - you can consider bots not listed here. The world of Discord moderation bots is vast and fascinating, and we encourage you to do your own research!

For super-large public communities (>100,000)

For the largest of communities, it’s recommended you employ everything Discord has to offer. You should use the High or Highest Verification level, all of Discord’s AutoMod keyword filters and a robust moderation bot like Gearbot or Gaius. You should seriously consider additional bots like Fish, Beemo and Safelink/Crosslink to aid in keeping your users safe and have detailed Content Moderation filters. At this scale, you should seriously consider premium, self hosted, or custom moderation bots to meet the unique demands of your community.

For large public communities (>10,000)

It's recommended you use a bot with a robust and diverse toolset, while simultaneously utilizing AutoMod's commonly flagged word filters. You should use the High Verification level to aid in preventing raids. If raiding isn't a large concern for your community, Gearbot and Giselle are viable options. Your largest concerns in a community of this size are going to be anti-spam and text filters, meaning robust keyword filters are also highly recommended, with user filters as a good bonus. Beemo is generally recommended for any servers of this size. At this scale, a self-hosted, custom, or premium bot may also be a viable option, but such bots aren't covered in this article.

For midsized public communities (>1,000)

It's recommended you use Fire, Gearbot, Bulbbot, AutoModerator, or Giselle. Mee6 and Dyno are also viable options; however, as they're very large bots, they have been known to experience outages, which can leave your community unprotected for long stretches of time. At this community size, you're likely not going to be very concerned about anti-raid, with anti-spam and text filters being your main focus. You'll likely be able to get by just using AutoMod's keyword filters and the commonly flagged word lists provided by Discord. User filters, at this size, are largely unneeded, and your Verification Level shouldn't need to be any higher than Medium.

For small public communities and private communities

If your community is small or private, the likelihood of malicious users joining to wreak havoc is rather low. As such, you can choose a bot whose general moderation features you like the most and use that for auto moderation. Any of the bots listed in this article should serve this purpose. At this scale, you should be able to rely solely on AutoMod's keyword filters. Your Verification Level is largely up to you, depending on where you anticipate member growth coming from, with Medium as the recommended default.

Configuring Auto Moderation for listed bots

Mee6

First, make sure Mee6 is in the communities you wish to configure it for. Then log into its online dashboard (https://mee6.xyz/dashboard/), navigate to the community you want to configure, then to Plugins, and enable the ‘Moderator’ plugin. Within the settings of this plugin are all the auto moderation options.

Dyno

First, make sure Dyno is in the communities you wish to configure it for. Then log into its online dashboard (https://dyno.gg/account), navigate to the community you want to configure, then to the ‘Modules’ tab. Within this tab, navigate to ‘Automod’ and you will find all the auto moderation options.

Giselle

First, make sure Giselle is in the communities you wish to configure it for. Then, look at its documentation (https://docs.gisellebot.com/) for full details on how to configure auto moderation for your community.

AutoModerator

First, make sure AutoModerator is in the communities you wish to configure it for. Then, look at its documentation (https://automoderator.app/docs/setup/) for full details on how to configure auto moderation for your community.

Fire

First, make sure Fire is in the communities you wish to configure it for. Then, look at its documentation (https://getfire.bot/commands) for full details on how to configure auto moderation for your community.

Bulbbot

First, make sure Bulbbot is in the communities you wish to configure it for. Then, look at its documentation (https://docs.bulbbot.rocks/getting-started/) for full details on how to configure auto moderation for your community.

Gearbot

First, make sure Gearbot is in the communities you wish to configure it for. Then, look at its documentation (https://gearbot.rocks/docs) for full details on how to configure auto moderation for your community.

Moderation

Transparency in Moderation

Pros and Cons of Transparency

While there are certain “best practices” when it comes to moderation transparency, there is no single system that is right for everyone. The amount of transparency you need for your moderation system ultimately depends on your server rules, culture, and vision. This article will explain the pros and cons of transparency and ways that you can apply transparency to your moderation system.

Though the idea of moderation transparency is generally considered to be a good thing, it is important to understand that there are both pros and cons to transparency in moderation. Some of these pros and cons are described below.

To help you understand how the pros and cons apply to transparency, consider an example in which a moderator publicly warns another user not to call someone a “retard” because it violates an existing “No Slurs Allowed” rule.

Pros

  • Accountability: A transparent moderation system holds moderators accountable to their own rules. For example, if a moderator were to call someone the same word, other users would know that behavior isn’t acceptable and could report it to an admin.
  • Community: Allowing the community to see when someone gets warned and why helps foster dialogue between moderators and regular users regarding server culture and rule enforcement and encourages cooperation. Users that may not understand the reason why calling someone that word is prohibited can become educated on the moderators’ position. Moderators may also be able to clear up any misunderstandings community members may have about what slurs are included in the rule and can update the rule accordingly if need be.
  • Comprehension: Providing users with practical examples helps them understand the difference between right and wrong. Users that were previously unfamiliar with what an “ableist slur” looks like now have a practical example to reference.
  • Compliance: Users can proactively and correctly encourage good behavior themselves without moderator intervention. Once users know that calling someone that word is unacceptable, they can echo that message throughout the server and let others know that that behavior isn’t tolerated.

Cons

  • Testing the Limits: Malicious users may take advantage of transparency to skirt the rules without punishment, or to manage their infractions to just barely avoid being banned. In the example above, users may try to censor or alter the word to have it slip under the radar of any watching moderators.
  • Rules lawyers: Transparency may encourage “rules lawyers”  in which users will attempt to use the letter of the rules without reference to the spirit of the rules to appeal their warnings and punishments in bad faith. Explaining that the slur in question was used to insult those diagnosed with mental disability and is thus prohibited may prompt a bad faith counterargument that “fa****t” should be allowed because it used to mean “a bundle of sticks.”
  • Harassment: Moderators taking action or users that were punished may be subject to harassment by server members. Users may also feel harassed if they are publicly warned by a moderator. The person who was warned may carry a stigma with them in future interactions or be made fun of. Even if no one treats them differently in the future, they may feel embarrassed at being publicly criticized for their behavior. Conversely, those who sympathize with the warned user may start harassing the mod for being “too sensitive.”
  • Privacy: Transparency may cause moderators or users to feel that their privacy is insufficiently protected in relation to moderation issues. The lack of privacy can result in harassment or embarrassment as mentioned above. Furthermore, if the evidence for the case is preserved in public view, then additional messages and usernames may be visible even if the original messages are later deleted by their authors.

The Moderation System

Now that you are aware of some of the pros and cons of transparency in moderation, you must next understand the components of the moderation system so that you can consider ways in which these components can be made more or less transparent. Broadly speaking, a moderation system can be split into the following components:

  • Server rules and penalties for breaking them
  • Guidelines for the moderation team to ensure consistent enforcement of the rules and penalties
  • Logging and communication of user infractions and applying the appropriate penalty
  • Processing appeals from users related to their logged infractions

Transparency and communication go hand-in-hand. The more you communicate these components to relevant users and the server as a whole, the more transparent your moderation system is.

Implementing Transparency

There are several ways to implement transparency in each of these components, each with their own pros and cons. Each section here will establish ways in which a component can be made more or less transparent and a recommendation of the appropriate level of transparency for each. However, please keep in mind that every server’s needs are different and some of the pros and cons discussed may not apply to your server. It is always important to consider your specific community when it comes to implementing transparency.

Server Rules and Penalties

Your server rules are the backbone of your moderation system. They describe how your members should conduct themselves and what happens if they don’t meet those expectations. In general though, your rules should be specific enough to ensure comprehension and compliance without being overly wordy or attempting to provide an exhaustive description of prohibited behaviors.

For example, giving a couple of examples of NSFW content for a “no NSFW content rule” may help people understand what you interpret as being NSFW, compared to other servers or Discord itself. However, too many examples may make the list seem fully comprehensive, and people will assume that items not on the list are fair game. Disclaiming that examples of rule-breaking content are non-exhaustive and that the moderators have the final say in interpreting if someone is breaking the rules can help to address users that are interested in testing the limits of the rules or being rules lawyers to escape punishment on a technicality.

Moderation Guidelines

Developing moderator guidelines is another important part of your moderation system. Similar to your rules guiding the conduct of your server members, your moderator guidelines help guide the conduct of your moderators.

Keeping your moderator guidelines visible to the rest of the server will encourage compliance from members, and enable them to defuse incidents without moderator intervention. Furthermore, providing basic standards of moderator conduct will help users know when it’s appropriate to report moderators to the server owner for misconduct and hold them accountable. However, you should avoid putting too much of your moderator guidelines out in the public in order to avoid rules lawyers deliberately misinterpreting the spirit of the guidelines to their advantage. After developing your moderator guidelines, balancing these pros and cons will help you determine how much of your guidelines you should present to the public.

Infraction Logging

Logging user infractions is key to ensuring that the entire moderation team has the same understanding of how often a user has broken the rules. Transparency between the mod team and the user in question is important for the user to understand when they have received a warning that brings them closer to being banned from the server. Informing the user of which moderator warned them is important for holding moderators accountable for the warnings they issue, but may leave moderators open to harassment by warned users. Having a procedure to deal with harassment that stems from this is one way to achieve accountability while still protecting your moderators from bad actors in your server.

Although the communication of infractions is vital to ensure understanding among your server members, it may be prudent to withhold information about exactly how close a user is to being banned so that they do not attempt to toe the line by staying just under the threshold for being banned. Furthermore, even though a public infraction log may be a good way to promote cohesion and transparency by showing examples of unacceptable behavior to the rest of the server and fostering discussion between the mod team and community, others may think that such a log infringes on user privacy or that these logs may constitute a “witch hunt.” It may also leave mods and users open to harassment over warnings given or received.

If you want to encourage a sense of community and understanding without taking away user privacy or inadvertently encouraging harassment, a better option may be to encourage users to bring up criticisms of rules or enforcement in a feedback channel if they wish to. Provided that the mod team ensures these conversations remain constructive and civil, creating a public medium for these conversations will help others understand how the mod team operates and allow them to provide feedback on how the server is run.

Managing Appeals

Everyone makes mistakes, and moderators are no exception. It is important to have a process for users to appeal their warnings and punishments if they feel that they were issued unfairly. If you decide to have a public infractions log, you may receive appeals on behalf of warned users from people who were uninvolved in the situation if they feel the warning was issued unfairly. While this can help with accountability if a user is too nervous to try to appeal their warning, it can also waste the time of your mod team by involving someone that does not have a complete understanding of the situation. In general, it is better to keep the appeal process private between the moderation team and the punished user, primarily via mediums such as direct messages with an administrator or through a mod mail bot. During the appeal process, it is best to ensure that you clearly and calmly walk through the situation with the appealing user to help them better understand the rules while maintaining moderator accountability.

Summary

In the end, there is not a single “correct” way to manage transparency in your moderation system. The appropriate level of transparency will vary based on the size of the server and the rules that you implement. However, walking through the steps of your moderation system one by one and considering the various pros and cons of transparency will help you determine for yourself how to incorporate transparency into your moderation system. This will help you build trust between moderators and non-moderators while preventing abuse on both ends of the system.

Moderation

Parasocial Relationships

Definition

A parasocial relationship is a one-sided relationship in which a spectator develops a personal attachment, through various influences, to a performer who is not aware of the spectator's existence. It is strengthened by continuous positive exposure to its source, which mainly happens on social platforms.

Process

In this section we’ll take a look at how parasocial relationships are developed and how to establish the severity of the level of parasocial relationships you are encountering from a moderation standpoint.

Development of Parasocial Relationships

The establishment of parasocial relationships can be portrayed as such:

User A, in this example a popular content creator, uploads regular content on a big platform. User B, a member of User A's audience, takes an interest in their content. User B reacts to User A's content and observes them. While User A may know that people are enjoying their content, they are unlikely to be aware of every viewer's existence. This total awareness becomes more unlikely the bigger the audience gets.

User B, on the other hand, is regularly exposed to User A's content and takes a liking to them. The interest is usually defined by User A's online persona: content, visual appeal, likeability, and even their voice can all be influencing aspects. User B perceives User A as very relatable through common interests or behaviors and starts to develop a feeling of loyalty, or even responsibility, during that phase of one-sided bonding. This behavior can be attributed to personal reflection in User A, as well as psychological factors like loneliness, empathy, or even low self-esteem. As a consequence, User B can easily be influenced by User A.

At this point User B might feel like they understand User A in a way nobody else does and may even begin to view them on a personal level as some sort of friend or close relative. They see this individual every day, hear their voice on a regular basis, and believe that they are connecting to them on a deep level. They develop an emotional attachment, and the stronger the parasocial relationship gets, the more attention User B pays to User A’s behavior and mannerisms. While User A most likely doesn’t know User B personally, User B will seek out interaction with and recognition from User A. That behavior is typically represented through donations on stream, where User A either reads out their personal message and name or publishes a “thank you” message on certain websites.

Additionally, User B tries to follow and engage with their idol on as many platforms as possible aside from their main source of content creation. These social media platforms are usually Instagram or Twitter, but can also include User A’s Discord server.

Levels of Parasocial Relationships

While that type of relationship is natural and sometimes even desired, it is important to define the level of a parasocial relationship and distinguish between its intensities for the safety of the community, the staff members, and the performer. In their article in the psychology-focused academic journal ‘The Psychologist’, researchers Giles and Maltby designed three levels of severity of parasocial relationships based on the Celebrity Attitude Scale.

Entertainment-Social

“Fans are attracted to a favourite celebrity because of their perceived ability to entertain and to become a source of social interaction and gossip. Items include ‘My friends and I like to discuss what my favourite celebrity has done’ and ‘Learning the life story of my favourite celebrity is a lot of fun’.”

The least harmful level is the general public and social presence. The targeted celebrity is subjected to gossip and mostly provides a source of entertainment. Their presence is mostly found in talks with friends, talk shows, on magazine covers, and similar public-facing media. Discord users on this level usually interact with the community in a relaxed, harmless way.

Intense-Personal

The next level is parasocial interaction. The characteristics of this level are the development of an emotional attachment of a spectator with a performer, resulting in intense feelings. This behavior is characterized by the spectator wanting to get to know the performer, followed by the desire to be part of their life as well as considering them as part of their own life. A result of that can be addictive or even obsessive behavior, which can be noticed in Discord servers, too.

“The intense-personal aspect of celebrity worship reflects intensive and compulsive feelings about the celebrity, akin to the obsessional tendencies of fans often referred to in the literature. Items include ‘My favourite celebrity is practically perfect in every way’ and ‘I consider my favourite celebrity to be my soulmate’.”

Spectators of that level usually ping the performer or message them privately in an attempt to be recognized. While that behavior is natural, anything that endangers safe interactions between themselves, the community, or the performer needs to be supervised carefully. Unrestrained abusive behavior, which can be found in unwanted intimate, borderline NSFW questions or comments, needs to be addressed and corrected accordingly.

Borderline-Pathological

The final level is considered the most intense level and also the most dangerous. It contains severe, harmful obsessions that can extend all the way to stalking and real-world consequences. Parasocial relationships to this degree will rarely be found on Discord, but have to immediately be reported if present.

“This dimension is typified by uncontrollable behaviours and fantasies about their celebrities. Items include ‘I would gladly die in order to save the life of my favourite celebrity’ and ‘If I walked through the door of my favourite celebrity’s house she or he would be happy to see me’.”

The vast majority of users won't reach the levels past seeing the performer as a source of entertainment, but moderators should be aware of the potential consequences of anything beyond that, as they can be harmful to the spectator, the performer, and the safe environment you are working to maintain for all.

Parasocial Relationships on Discord

Parasocial relationships on Discord can pertain to anyone who is perceived as being popular or influential, making them “celebrities” of Discord. Some examples of parasocial relationships on Discord can be found between a user and a moderator, a user and a content creator you are moderating for, or even a member of your moderation team and the content creator you are working for.

Between a User and Yourself

But what does all that mean for you, the mod? While Discord moderators are not nearly as popular and influential as big content creators or celebrities, they are still observed by Discord users. While being a moderator puts you into a position of power and responsibility over the wellbeing of the server, some users perceive it as you climbing the social ladder in the Discord server. In their eyes, becoming a moderator changes your overall social status within your Discord community.

Being hoisted higher in the server's hierarchy results in members quickly recognizing you and potentially treating you differently due to your influence, even becoming “fans” of you as a person. Some users will soak up any information they can get about you, especially if they realize that you have common interests. This may lead to the development of a parasocial relationship between users and you. Users you have never interacted with before might see you as a person they would get along with and seek out your attention, leading to a one-sided relationship on their part.

Having such an audience can be overwhelming at first. People will start to look up to you, and younger users especially can easily be influenced by online personas. They might adapt to your behavior or even copy your mannerisms. Knowing that, you should always be self-aware of your actions and etiquette in public to promote a healthy, sustainable relationship with the users. Receiving special attention from users can quickly spiral into an arrogant or entitled attitude. There is nothing wrong with being proud of your position and accomplishments, but being overly arrogant will influence members' behavior towards you.

The mindset of one user deciding a moderator is not being responsible can spread through the community in negative ways. They might belittle you in front of new members and give them the feeling that you won’t be there to help them or might not inform you of ongoing problems on the server during a temporary absence of moderators in the chat. A healthy user-moderator relationship is important to prevent or stop ongoing raids as well as make moderators aware of a user misbehaving in chat.

Additionally, it’s important to be mindful that your perceived fame does not start to negatively influence your judgment. For example, you may find yourself giving special attention to those who seem to appreciate you while treating users that are indifferent towards your position as a moderator more harshly. It also causes the dynamics within the staff team to change as fellow moderators might start to perceive you differently if you begin to allow bias to seep into moderation. They may start to second guess your decisions, feel the need to check up on your moderator actions, or even lose trust in your capabilities.

If you ever notice that you experience said effect, or notice one of your fellow moderators is experiencing it and letting it consume them, be supportive and sort out the negative changes. When confronting another moderator about it, make sure to do it through constructive criticism that doesn’t seem like a personal attack.

In spite of that, the effects of parasocial relationships don’t always have to be negative in nature. If users manage to build such a connection to a moderation team, the general server atmosphere can grow positively. Users know what moderators like and don’t enjoy, which will lead them to behave in a way that appeals to staff and usually abides by the server rules. They will also be able to predict a moderators’ reaction to certain behavior or messages new people might use. As a result, they will attempt to correct users that are mildly misbehaving themselves without getting staff involved immediately in hopes of receiving positive feedback from staff. Naturally, moderators won’t be able to know of every single person that tries to appeal to them through those actions, but once they are aware that such things happen in certain text channels, it will give them the opportunity to focus on other channels and provide their assistance there.

Between a User and a Content Creator

As mentioned before, in the case of content creators who frequently upload videos, streams, and other forms of media for their followers, the chance of a parasocial relationship developing can be even greater. This will only intensify when a fan joins the creator's Discord server, often under the false assumption that there will be a higher chance of their messages being read and noticed. Your responsibility as a moderator is to neither weaken that bond nor encourage it while providing security for users, staff, and the content creator. Let the users interact in a controlled environment while maintaining the privacy of the content creator.

Some users might even feel like the content creator owes them some sort of recognition after long-term support, whether through engagement or donations. Such a demand can be intensified when they're shown as “higher” in the hierarchy through dedicated Discord roles, such as Patreon/Donator or simple activity roles. When multiple people build a parasocial relationship with the same content creator, they may see other active users or even moderators as “rivals.” They see the content creator as a close friend and feel threatened that others, especially those who financially support the content creator, are perceived the same way or are even closer to them. During such moments, it is recommended to keep the peace between users and let them know that the content creator appreciates every fan they have. While the ones providing financial support are appreciated, every viewer played a part in making the creator as big as they are and in getting them to where they are today.

Between a Moderator and a Content Creator

Maybe it won't only be the user that feels closer to them by joining the server. Many beginner moderators may also find themselves feeling as though they are above the rest of the community because their idol has entrusted them with power on their server. Being closer to them than most of the users can easily fog your judgement; it is essential to prioritize being friendly and respectful to the users over these personal convictions. When adding moderators to your moderation team, it is important to keep an eye out for this kind of behavior to combat it, and to hold your teammates accountable should you see this behavior begin to surface in one of them. Making sure your entire team is on the same page regarding your duties and standing in the community is essential to maintaining a healthy moderation environment. As a moderator for a content creator, this individual you may admire deeply has put their trust in you to keep their community safe. Falling into the trap of developing an unhealthy parasocial relationship with them directly interferes with your ability to do that, failing not only the community but also the creator.

In Conclusion

Despite the potential dangers from parasocial relationships, the fact that they develop at all may indicate that you are doing a good job as a moderator. While positive attention and appreciation are key factors to a healthy development, not everyone may like that sort of attention and it is completely acceptable to tell your fellow moderators or even the users themselves about it. At one point you might feel like you reached your limit and need a break from moderation and managing parasocial relationships aimed at you and those around you. Moderator burnout is very real, and you should not hesitate to take a break when you need it.

Users will view you, as a moderator, as a leader that helps guide your designated Discord server in the right direction. As such, you will be a target for rude comments by users that have personal issues with the server while simultaneously getting showered with affection by other users who are thankful for what you do for the server. Never be afraid to ask for help and rely on the moderation team if things go too far for your personal boundaries or comfort level, even if you are an experienced moderator. Establishing a healthy relationship with the community is important, but being able to trust your fellow staff members is even more so. Nobody expects you to build an intimate relationship with every member, but knowing you can count on them and their support is essential for the team to function correctly.

Safety

Securing Your Discord Account

Account Security

The first step towards securing the server you moderate is securing your own Discord account. Your first line of defense is a strong and unique password. Some characteristics of strong passwords include:

  • Length - Longer passwords are harder to guess
  • A mix of character types - Including numbers, symbols, lowercase, and uppercase letters make the password harder to guess
  • Uniqueness - Avoid reusing passwords you are using on other sites. If those sites are compromised, it could also compromise your Discord password

You can also use a random password generator or a password manager to create a completely random password that will be nearly impossible to guess, but difficult to remember. Another option is to combine several random words together. The key, though, is that the words need to be completely random. Using a tool to help select words at random from the dictionary is a good way to help ensure their randomness.
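
As a sketch of the random-word approach, here's a short Python example using the standard library's secrets module. The word list is a tiny placeholder; a real passphrase generator should draw from a dictionary of thousands of words:

    import secrets

    # Placeholder word list - use a large dictionary file in practice.
    WORDS = ["copper", "nebula", "walrus", "violet", "orchid", "basalt", "lantern", "meadow"]

    def random_passphrase(word_count: int = 4) -> str:
        # secrets.choice draws from a cryptographically secure source, unlike random.choice.
        return "-".join(secrets.choice(WORDS) for _ in range(word_count))

    print(random_passphrase())  # e.g. "walrus-violet-nebula-copper"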

Once you have a strong password, you should also enable two-factor authentication, also known as 2FA. 2FA ensures that even if someone manages to guess your password, they won’t be able to get into your account without access to the device where the 2FA app is. You can also enable 2FA via SMS and receive your authentication code via text message. However, SMS 2FA is less secure than application-based 2FA because text messages can be intercepted or your phone number could be stolen. Although the chance of this is still low, you should still avoid enabling the SMS backup for this reason if possible.
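
For the curious, this is roughly how authenticator apps derive those codes: a time-based one-time password (TOTP) computed from a secret shared between your device and Discord, so the code can't be produced without the device. The sketch below uses the third-party pyotp package purely as an illustration:

    # Illustration only: pip install pyotp
    import pyotp

    secret = pyotp.random_base32()  # the shared secret your authenticator app stores
    totp = pyotp.TOTP(secret)
    print(totp.now())               # a 6-digit code that changes every 30 seconds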

You also need to make sure the devices where your Discord account is logged in and the device that has your 2FA app are physically secure. Make sure your computer is password protected and locked when you are physically away from it. If you use a public computer, make sure that you use incognito mode on the web browser to ensure that your Discord information is removed when you close the browser. For a phone or tablet, require a PIN code to unlock it so that it can’t be used by strangers.

Now that your account is nice and secure, there is one more thing you must closely monitor to ensure it doesn’t fall into the wrong hands: yourself.

Avoiding Social Engineering Attacks

The weakest link in any cybersecurity system is usually a human, and the security of your Discord account is no exception. Social engineering is the use of deception to manipulate individuals into divulging confidential or personal information that may be used for fraudulent purposes. People attempting to gain access to your Discord account may attempt to get you to log into a fake site, download a malicious file, or click on a suspicious link. Being able to identify these actions and avoiding potential pitfalls is an important part of keeping your account (and the servers you moderate) safe.

One of the most common and dangerous scams on Discord is a user or a bot sending out a direct message with a QR code, saying that you should scan the QR code with Discord's QR code scanner for free Nitro. This will generally be combined with instructions on how to access and use Discord's QR code scanner. However, it is important to remember that Discord's QR code scanner is only used to log in to Discord. Scanning the given QR code will allow the attacker to directly log into your account, bypassing your password and any 2FA you may have configured. If you accidentally scan a suspicious QR code, you should immediately change your password, as this will invalidate your current account token and log you out of all devices. You can also report any such scams directly to Discord Trust and Safety for further action. For more information on making reports, check out this article.

Another common attack is to encourage you to click on a link that redirects to a fake Discord website. Before clicking on any links from a user, ask yourself the following questions:

  • Is the sender a stranger?
  • Is this message unexpected?
  • Does the message imply urgency or promise something as a reward (e.g., “If you don’t do this in the next five days, your Discord account will be deleted”)?
  • Are they asking me to perform a suspicious or sensitive action (e.g., download a file, log in to a website)?

If you find that the answer to many of the above questions is “yes”, you should avoid performing whatever action they are requesting. You can also check any suspicious-looking URLs with various URL checkers, such as this one.

If the user is specifically asking you to click on a link that prompts you to log in to Discord, another option you have is to navigate directly to https://discord.com in your web browser and log in from there. If clicking on the user's link still takes you to a login page, double check the URL of the website. One thing you'll want to check is whether the website starts with https:// instead of http://, or that there is a lock next to the beginning of the URL. Although some fake sites may still have an https:// designation, many of them will not. Other signs may be slight misspellings of the URL or visual tricks such as diiscrd.com or dlscord.com with a lowercase “l” instead of an “i”. If you notice any of these signs, it is highly likely that it is not actually Discord's website, but instead a fake website intended to trick you into entering your login credentials so that it can steal your account.
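
Those checks can be summed up in a small Python sketch. The set of hosts treated as official below is an assumption for the sake of illustration, and no script replaces careful reading of the URL:

    from urllib.parse import urlparse

    # Assumption for illustration: treat only these hosts as the real Discord login.
    LEGITIMATE_DISCORD_HOSTS = {"discord.com", "www.discord.com"}

    def looks_like_real_discord_login(url: str) -> bool:
        parsed = urlparse(url)
        return parsed.scheme == "https" and parsed.hostname in LEGITIMATE_DISCORD_HOSTS

    print(looks_like_real_discord_login("https://discord.com/login"))  # True
    print(looks_like_real_discord_login("http://discord.com/login"))   # False - not HTTPS
    print(looks_like_real_discord_login("https://dlscord.com/login"))  # False - lookalike domain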

Most modern browsers display a lock icon to indicate that the connection is secure instead of showing https:// before the URL. If the icon is shown as an unlocked lock, or you see http:// rather than https:// before the URL, your connection is not secure.

Conclusion

Creating a strong password, enabling 2FA, and following best practices for physical device security are the first steps towards keeping your Discord account secure. However, there may be people that try to trick you into giving access to your Discord account through various scams or other social engineering attacks. Being able to spot suspicious messages and users and being cautious when encountering strange links or files is another important part of keeping your account safe. Of course, anyone that is able to illicitly gain access to a moderator account on your server still has the potential to do great harm, such as banning users and deleting messages, channels, and roles. Be sure to share this information with the other moderators on your server so that you can each do your part to keep your community safe by keeping your accounts secure.

Moderation

Reddit X Discord

Does Your Subreddit Need a Discord?

Discord communities are distinct from subreddits and attract different audiences. While there are often overlaps between those audiences, it will not always be the case, and it’s important to determine whether your community will benefit from having a Discord before you attempt to start one.

Successful Discord communities revolve around human connections and conversations and not just content. For a Reddit community to translate well into a Discord community, it should be centered around a topic its members are passionate about and are highly engaged with.

To start, ask yourself the following questions:

  • Can there be in-depth discussions around the topic of your community that will benefit from a real-time chat setting? For subreddits mainly focused around content without an avenue for discussion, the answer is usually no. Examples could include /r/aww and /r/eyebleach.
  • Does your subreddit have a large amount of retained users that visit it directly on a regular basis? Reaching the members of your Reddit community is often challenging, and requires them to go out of their way to take the first step and move to a different platform. If your community sees very sharp declines in traffic when no trending posts are present, it might indicate that most of your members “follow and forget”: they only visit your community when it pops up on their home feed, which often means they will not be willing to follow you to Discord.
  • Are there other benefits relevant to your community that having a Discord could provide? Examples include LFG (helping players find groups) in video game subreddits, the ability to host events such as game nights for community engagement, and topics that include lengthy back-and-forth conversations that are more suited for a real time chat environment rather than threaded long-form conversations (such as providing a space for personalized help around topics like programming, tech support, or mental health).

If the answers to those questions come out negative for your community, it might be helpful to take a step back and reconsider whether a Discord server would benefit it.

Moderation Teams

Starting a Moderator Team

Typically, it’s best to keep the Reddit and Discord moderation teams separate. Your Discord is a separate ecosystem with its own needs - and it’s important to find users from within it that will help you develop and maintain it, and make it flourish.

When starting off, adding your existing subreddit moderator team usually works. However, it’s important to note that those mods might not always be as dedicated to this new platform as they are to the one that they came from. Looking into the future for your Discord server, things might change and the subreddit mods that helped it in the early days might end up having to take a backseat in favor of users who are brought in from within the server.

Server Owner and Administrators

The owner of the server should be a dedicated mod from the subreddit who knows both the community and the inner-workings of Discord. Decisions made by the owner will be critical to the development of your server, so take a moment to review all of the potential candidates within your team to choose the best one for the task.

Make sure to bring in at least one user or subreddit mod who is also as knowledgeable with the Discord ecosystem and familiar with your existing community to help with the setup and administration of the server early on.

Reddit and Discord Moderator Teams Coexisting

You now have a bunch of mods! Mods on your subreddit, mods dedicated to your Discord server, and mods that are both. But what do you do with all these mods, how do you tell them apart, what perms should they have? Communication between your different teams is key to the success of both of your communities.

Outside of the owner, you should ideally have at least two other moderators who are present on both teams. These shared mods will be able to efficiently relay information between the teams, coordinate collaborations between the Reddit and Discord communities, take action in emergency situations, and mediate conflicts if they occur.

Here are a few best practices:

  • Make a private channel where your subreddit mods will be able to chat privately with your Discord team. It can be used primarily as an off-topic chat, while also being available for use as a liaison for your team to discuss cross-platform issues if and when they arise.
  • Give the subreddit moderators a special role. This will show the association of your server with the subreddit to your members.
  • Avoid giving subreddit moderators moderation privileges within your server, unless they’re interested in becoming moderators within your Discord as well - and are ready to accept the additional responsibilities and time commitments that come with the role.
  • As your subreddit moderators are still trusted contributors, you may grant them privileges that might be gated off for regular users by default, such as link embeds and file upload permissions.

It’s important to be upfront with your community about the fact that your subreddit and Discord server are run and moderated by completely separate mod teams after the server starts taking shape. Set up escalation paths for both of your teams: direct issues related to your subreddit to Reddit modmail, and direct users experiencing issues within the server to its team.

All staff positions (except the owner and lead admins) should be independent of a user’s status on other moderation teams, and you should remind your staff that becoming a mod on one platform does not guarantee a mod position on the other. Mods who participate in multiple teams must still uphold your activity requirements and meet the same expectations as the rest of your team.

Hosting Events and Cross Promotions

While your communities are linked together, they’re separate entities with different groups of regular visitors and contributors. When considering cross-platform promotions, assess their relevance to each of your audiences and determine whether each audience will find them helpful. A few best practices around this topic are:

  • Avoid cross-promoting micro events within one of your community’s platforms, such as announcing a specific discussion thread on Reddit to your Discord members, when it does not provide them value that can’t be achieved otherwise. Focus on platform-specific events that your users will be able to participate in locally.
  • For major events that are relevant to all of the sub-communities you operate, try to include a space for participation in each one of them. It is also a good idea to provide information or links to the other event pages/channels on every platform. This makes it easy for people to browse the discussions on each platform and join in on multiple conversations in different places if they’re willing to.
  • Avoid promoting platform-specific announcements on other platforms. For example, don’t link your moderator application form for Discord on your subreddit, and vice versa.

Closing notes

Creating a Discord server can be a great way to broaden your subreddit-based community's horizons, giving your users a whole new way to interact with each other. However, it's important to remember that maintaining a Discord community can be a whole lot of work that some of your existing team members might not be interested in taking on. Finding the right person to lead your Discord and ensuring your community's new outpost is in good hands early on will ensure a lasting and smooth relationship between your subreddit and Discord, to everyone’s benefit.

Moderation

Training and Onboarding New Moderators

Training Methods

Each server has to decide for itself what the most effective training method looks like. This article will present and explain some recommended methods that can simplify moderation training.

Buddy System

The “Buddy System” approach describes working in pairs or groups, in which two or more “buddies” work together on one task. Requiring newer moderators to work with each other or with an assigned team member allows you to better monitor progress and ensures help is available when required.

It promotes active communication and trust between moderators and also allows them to keep an objective view of everything. Having multiple opinions about certain matters and receiving assistance from a reliable source prevents moderators from feeling pressured by moderation tasks and can even open your eyes to new viewpoints. Additionally, it allows its members to effectively share their moderation skills with each other. It also aids personal safety: if a moderator feels personally attacked by a user, they get immediate support from their “buddies” without feeling the need to tackle the issue alone.

Using this system allows less experienced moderators to quickly and effectively catch up to your moderation standards as they learn how to deal with specific matters first hand. It is important to remind the more experienced moderators on your team to allow new moderators to learn instead of handling every task themselves. With time, certain actions will become second nature for newer moderators, too!

Mentoring System

This approach is most commonly used when onboarding new moderators. Here experienced moderators or the head of staff introduce each new member of the team personally and guide them through the most important aspects.

The difference between this system and the “Buddy System” is that each new moderator will be acquainted with the moderation tasks and responsibilities by a higher-up, and usually only once. After the walkthrough, most recruits are expected to manage certain moderation duties on their own while being supervised. It is crucial to support and reassure them so they are able to grow confident in their actions. Recruits can display and appropriately train their soft skills and be informed about moderation standards in a controlled environment without fear of causing too much irreparable damage.

Regular Exercises

Another method of efficiently training both experienced and inexperienced moderators is by letting them regularly test their knowledge. This involves designing example scenarios based on everyday issues within your Discord server and letting the trial moderators explain how they would handle them. Such instances can include how to handle issues in audio channels, user disputes, DM Discord invitations, off-topic discussion in the incorrect channel, and potential issues that may be encountered with bots. At the end, you should provide some sort of “model” or “example” answer to let the recruits know where they need to improve.

One negative effect of this system could be the fear of failure that some inexperienced moderators might be exposed to. Reassure them that making mistakes is okay as long as you take responsibility for your actions and are willing to learn from them. Moderation is an ever-shifting and learned art, and mistakes are not to be punished when they happen every once in a while.

It may be tempting to conduct these regular exercises incognito (acting on an alt to see how they do) or to test them without warning. While this may yield more “everyday,” unbiased results, it has a high probability of backfiring. Blindsiding your moderation team with tests and exercises has the potential to do more harm than good, especially in terms of team trust and morale. It’s recommended that you don’t do this and instead practice transparency when conducting regular exercises in order to avoid a potentially inequitable situation.

Presentations and Demonstrations

Another form of training is to demonstrate how situations or scenarios are handled in your community via presentation or an actual demonstrative walk-through with moderation alt accounts. This is very useful for training where moderators have to use a wide range of commands, such as explaining moderation and Modmail bots. Ideally, this should take place in an audio channel or group call where you can share your screen. Not everyone is able or comfortable joining a voice chat and unmuting themselves, which is something to be considered beforehand.

Training Program Content

Moderator Documents

An essential part of onboarding new moderators is to have an easily accessible document outlining the basic responsibilities and details on moderator tasks and different moderation teams. Such documents need to be designed for each server individually, but they usually contain general rules for staff, conduct expectations, and a punishment outline to set a standard and unity for moderation actions.

Recommended additions to such a moderation handbook are:

  • Elaborations on your rules to prevent users from finding loopholes in an attempt to “outplay” moderators.
  • Explanations of certain Discord Terms of Service matters, such as copyright infringement, underage users, implied racism, and more.
  • An overview of external rules to be applied within the server and a description of your moderation culture and expectations internally to help new moderators easily adjust to their new team.
  • A bot command overview so new moderators can become acquainted with commands instead of feeling pressured to remember them all immediately upon entry to a new team.
  • Commonly used “hacks” on Discord, such as phishing links, malicious files, and other harmful software.

Be sure to recognize anything that is prevalent in your moderation culture or community that is also worth mentioning here! The more thorough the guidelines, the easier the document is to refer back to for any questions. For example, gaming servers should have a brief description of the featured gaming company’s Terms of Service and discussion about how to handle cheaters or users mentioning account sharing/selling in accordance with those rules. Another example would be bot support servers, where a brief description of commonly encountered issues and how to solve them, as well as an FAQ section, can help users with simpler answers.

Moderation Commands

Another important resource new moderators should have easy access to is a list of commands for your moderation bots. Having a simple reference in a separate text channel covering each bot’s prefix and command format (command, user-id, [time], reason) will aid moderators in quickly responding to ongoing issues on the server. It lets them react fast without having to pause to look up the necessary command. Try to use the same prefix for your main moderation bots to avoid unnecessary confusion, particularly for users who are new to moderating.
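
To illustrate that (command, user-id, [time], reason) format, here is a minimal sketch of what two such commands could look like if your server ran its own bot using the discord.py commands extension. The prefix, command names, and defaults below are hypothetical placeholders rather than the syntax of any particular existing moderation bot.

```python
# Hypothetical moderation commands following the "command, user, [time], reason" format.
# Sketch only: assumes discord.py 2.x and a bot invited with permission to time out
# and ban members.
import datetime

import discord
from discord.ext import commands

intents = discord.Intents.default()
intents.members = True          # needed so member arguments can be resolved
intents.message_content = True  # needed so prefix commands can read messages

bot = commands.Bot(command_prefix="!", intents=intents)

@bot.command()
@commands.has_permissions(moderate_members=True)
async def mute(ctx, member: discord.Member, minutes: int = 10, *, reason: str = "No reason given"):
    """Usage: !mute @user [minutes] reason"""
    until = discord.utils.utcnow() + datetime.timedelta(minutes=minutes)
    await member.timeout(until, reason=reason)  # uses Discord's built-in timeout feature
    await ctx.send(f"{member.mention} muted for {minutes} minutes: {reason}")

@bot.command()
@commands.has_permissions(ban_members=True)
async def ban(ctx, member: discord.Member, *, reason: str = "No reason given"):
    """Usage: !ban @user reason"""
    await member.ban(reason=reason)
    await ctx.send(f"{member} was banned: {reason}")

bot.run("YOUR_BOT_TOKEN")  # placeholder token
```

In practice, your reference channel would simply document the equivalent commands of whichever bot your server actually uses, laid out in that same consistent format.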

Modmail or Ticket Service

Each server uses a differently designed service to assist moderators with helping out users. Bigger servers tend to rely on a ticket or Modmail system, so properly introducing it with sample conversations or example problem cases is essential for both the recruit and the future user. Confident moderators are more willing to aid users in need than those who are still unsure of how the system works.

This kind of training should include commonly used commands and procedures. There should be conversation about how to redirect users to senior staff and how to close tickets appropriately so they don’t stack up and cause confusion. As long as the ticket system remains organized and those guidelines are established across the full team, this will prove to be an easy-to-use communication system that is much less daunting than it may seem!

Staff Safety

No user should ever feel unsafe or threatened on a Discord server. Staff members are often exposed to harmful or disrespectful messages, some of them targeted at moderators directly.

An important aspect of onboarding moderators, therefore, is to make them aware of how to react in such situations. They need to be able to create an environment in which they are comfortable working, and they should not be afraid to ask for help if they feel threatened by users or, in extreme cases, other staff members. Staff members supporting each other and being able to communicate in such moments is crucial for an effectively working team.

Creating and Managing a Training Program

Effective management promotes a feeling of professionalism and greatly simplifies the process of training new moderators. Important aspects that should always be covered when creating a training program are time management, deciding which staff are involved in training and why, effective communication, detailed content, and flexibility.

Choosing an appropriate approach. How do you want to introduce new members? What system did you conclude will work best for your server and staff team? Never be afraid to evolve an existing plan into something that better suits your current situation.

Deciding what topics to cover. What topics do you prioritize over others and think require more time investment? How are your moderators handling the learning process? Might there be anything in need of adjustment?

Selecting voluntary participants. Having a reliable team behind you to assist with training is of the utmost importance. Everyone involved should be aware of what exactly they need to teach and how to approach the training within a reasonable time frame.

Developing a timetable. Find common ground between everyone involved, including both the mentors and trainees, and settle on when to teach each topic. Recommended tools for managing bigger teams are Google Docs and similar, but bots and self-made timetables will suffice, too. Try to keep it simple and easily accessible for everyone involved.

Communication. Who responds to whom? Decide on how and when mentors report to the head of staff or other higher ups. Documenting helps with ensuring everyone is aware of their part and planning how to proceed further.

Flexibility. No matter how carefully you plan everything out, it can always happen that something doesn’t work according to plan. In such situations you need to be able to react spontaneously and be flexible, such as pitching in for someone who is unable to attend due to a last minute shift in schedule.

How to Deliver Effective Training

Before conducting training, you should evaluate what can make your training most effective. Below you will find some best practices to set up a training program and be able to “train the trainers”.


Planning

When you are giving a training session, it is important to plan it properly. What are the most important topics to cover? What do you want new moderators to know after the training? How much time are you planning to spend on training? Some people struggle to focus for a longer period of time, so if your training takes over two hours, you should consider breaking it up into multiple shorter sessions. Make sure to accommodate any accessibility needs and provide notes for those who were unable to be present.

Goals

Each training should have objectives. You can create an outline of your training where you write down the topics you will cover and the objective of each topic. For example, if you are training moderators to use Modmail, your objective is that new moderators are able to handle Modmail tickets and use the commands available to them. For each topic, also write down how you want to give this training: are you going to give a demonstration? Will you use illustrations? Will you offer new moderators the chance to practice during your training and get hands-on experience? Publicly sharing this outline with trainees can help keep you on topic and allow them to come prepared with questions.

Practice

After creating your training, make sure to practice it at least once. First, go through the entire training yourself to see if everything is covered that you think is important. You can then give your training to someone on your team, in case you missed something important and to check whether or not your estimated time is accurate. If it is possible, go through the material with someone who is unfamiliar with moderation. Adjust your training appropriately and you are good to go!

Interaction

For every training session, it is very important to have interaction with your trainees. People lose their attention after 15-30 minutes, so your training should include a discussion, practice, or some sort of interaction. Some of these interactive sessions are covered down below. Interaction with your trainees is also important because it acts as a way for you to verify whether or not your objectives are properly conveyed.

Make sure there is a break once every one to two hours so trainees can focus on your training without feeling overwhelmed.

Methods

Every training can consist of a combination of different training methods. These include, but are not limited to, lectures, quick exercises, group discussions, practice runs, quizzes, videos, demonstrations, and more. When planning your training, you can write down what method would work best to convey the objectives to your trainees. A group discussion is better suited to discussing moderation cases, whereas a demonstration and hands-on practice work better for showing how Modmail bots work.

Try mixing up multiple methods within a training to keep it fresh and keep people from losing their attention.

Preparation

Preparation is key when you are giving any sort of training. Have quick notes ready with keywords that you are going to use during your training, so you do not lose track of where you are and what objective you are trying to convey. If you need any material such as a presentation, paper and pencil, or example cases, prepare it beforehand so you do not lose time setting it up during your training. If you use any material, test it beforehand.

Don’t forget to clear your schedule beforehand and have some water ready if your training is mostly conducted over a voice chat or call. It is important to sleep well the night before and feel confident. This will let you remain focused and have an uninterrupted training session.

Make sure you are ready at least five minutes before the training starts to welcome arriving trainees and as a final check whether or not everything is ready and good to go. If your training takes place in a voice chat or call, test your microphone and if you have a wireless headset, make sure it is charged up beforehand!

Wrapping up your training

When your training is done, finish with an exercise, group discussion, or something else that is interactive and fun to do. Ideally, this should summarize the entire training. Have some room at the end to answer any questions your trainees might have. Don’t forget to thank everyone who participated for their active contributions!

Evaluation

After each training, write down what went well and what could be improved. After your trainees have had some experience as moderators, you can ask them what they missed during training that they think should be covered next time, as well as what information they did not find useful. You can then adjust your training for next time!

Summary

Having a training or onboarding process in place is very important for getting new moderators accustomed to your moderation culture, bot commands, the server rules, and your moderation guidelines. There are several training methods, including buddy or mentor systems as well as exercises and demonstrations. A training document should outline the most important information new moderators need without overwhelming them with information they are unable to absorb all at once.

To have efficient training in place, there are some very important aspects to consider, such as your preparation, goals, how you will carry out the training of your new moderators, and planning. Considering all of these things when creating your training process will make onboarding as informational and effective as possible for you and your new recruits!

Moderation

Internationalization of a Community

Expanding from one Language

Which languages to choose depends entirely on the type of server you run. For a community server, it is recommended to adjust it to the community’s needs. Usually, the most commonly featured languages outside of English are German, French, Spanish, Turkish, and Russian. You can determine your featured language through community analysis, server insights (if your server has them), and demand. What nationalities are represented the most in your server? Which users struggle the most with English upon joining the server, and if they’re struggling, what language do they usually communicate in? In addition to that, you can consider running a survey every time you and your staff feel ready to expand into a new language.

On the other hand, if you run a server for a brand or company, it is recommended to go by your target audience. Introducing features like internationalization requires carefully planned but steady steps.

If your server was originally English only, it’s not recommended to expand into too many languages at once for a variety of reasons. Agree on one language (preferably the most popular one in your server) and slowly add on from there, if the need presents itself. Once you figure out a pattern and a stable structure on how to approach the expansion, you can add more sections for different languages.

Server Design

As your server grows, so does Discord. It is vital to keep in mind that your server can get a lot of incoming users who have never really used Discord before, so make sure your server’s design is user-friendly, easily accessible, and has a clear structure. Also, remember that the more languages you feature on your server, the more text and voice channels your staff will have to manage.

It is recommended to hoist the most important text channels to the top of your server. That may include announcements, server info, rules, and optionally giveaways and more. To ensure global accessibility, translate important parts like the rules and server information into every language you feature alongside your main language.

Additionally, you can add a server guide for easier coordination. Both new users and your staff will benefit from that. It should cover a list of accessible text and voice channels as well as a short but detailed description of each.

Furthermore, try to integrate every additional language in a way that utilizes it to its utmost potential. You can do so in the form of a feedback and suggestion channel in each language’s category. This lets you hear directly from the users that would benefit from that additional language chat the most. Consider having staff that are dedicated to interacting with each section instead of focusing all their attention on the global chat. Having these additional language chats makes it so that your international users don’t feel less validated only because they don’t want to or can’t communicate in the global chat.

You can implement that same line of thinking when organizing server events. Consider alternating between server-wide and international events as demand allows. For the latter, analyze your server’s activity for each region and host it at the time with the most community engagement for those users for the best results. You can apply the same system to the voice channels by creating a few global voice chats and then adding language specific channels. Your staff should only allow one common language to be spoken in the global chats and redirect every other language into their appropriate voice channels.

Creating a Server with One or More Languages From Scratch

If you want to feature multiple languages with equal weight, you can effectively combine multiple Discord servers into one. For the first half, it will be the same as before: have at least one international general chat before you split the server into each language. That way every user is able to communicate with the entirety of the server instead of just their language-specific section. Let users choose their language via a reaction or verification, with the option to opt out and change regions. You can summarize the most important aspects in a common text channel, with the text translated into every featured language.

The simplest way to manage the split server is by having the same channels and categories, but in different languages. This will heavily depend on the type of server you moderate, but recommended channels for every section are the ones you would usually include in a monolingual server. These channels should include things like rules, announcements, general-chat, bot-commands, media, optionally a looking-for-game channel, and more. Take care to ensure you have appropriate translation for each channel.

Moreover, community events can be a lot more diverse. You can host a variety of events for each language as well as let all of them participate in shared events. Take the time to analyze each nation’s activity and also take holidays or national days into account.

Staff Arrangement

Internationalizing your server may mean that your moderation team grows to be larger than expected, so to make it easy for users to contact the appropriate staff you can separate moderators by nicknames, colors, or role hoist.

Nicknames: If you want to have the moderation team equally displayed throughout the server, you can arrange the moderators by nicknames. Simply add the language they moderate at the beginning of their nickname so they are alphabetically sorted below each other. Users will have an easier time recognizing the appropriate moderators, but with a large staff, this might end up looking a little busy. Some moderators tend to change their nicknames frequently and may not be big fans of having the tag in front of their name, either.

Colors: It is advised to have the colors of each moderator role in the same shade with a slight but still visible difference. This helps to differentiate moderators and regular users will be able to tell the difference immediately. Remember that newer users might not check each role and just contact the first seemingly online moderator they see.

Role hoist: If you display only the accompanying moderator role for each language, ensure that you have a common, non-displayed role for all moderators so one team is not less validated than the other. Users should definitely be able to contact the appropriate staff and ping the correct moderators to help them out with issues in chat. However, when displaying a lot of teams with many members each, some of them might be pushed off the screen. You run the risk that moderators displayed below the other teams will seem less “important and validated” to the community, but the shared staff role should fix that problem easily.

Moderation

Recruitment

While moderating a multilingual server, it is crucial to have moderators who are native speakers of, or at least fluent in, each language that you decide to expand to. Having moderators who are native speakers means they not only understand the textbook definitions of what is being said but also the cultural contexts that may come with international chats. However, these moderators should also be capable of handling the other responsibilities of being a mod that lie beyond language fluency.

Responsibilities and Tasks

While community moderators should follow the basic tasks (moderate the chat, be accessible for questions, guide users), things in multilingual servers can be a bit more advanced. Moderators need to identify and address inappropriate behavior towards other cultures in an authoritative, but instructive way. Furthermore, they need to free the chat from toxic behavior and control discussions about sensitive topics.

In addition to that, they have to be open to cultural questions from users who have chosen a language they are not yet fluent in. Both the user and staff need to work their way around language barriers together. Moderators will be required to help users as well as resolve disputes, often at the same time, and they’ll have to rely on their communication skills in order to resolve both situations and any other issue that may require it. Without good communication skills, users won’t understand what moderators are trying to convey, which should be assertive but tolerant.

Moderators can under no circumstances discriminate against users of other cultures or beliefs. It is their responsibility to create a civil, welcoming, and comfortable environment for all users. While you are always entitled to your own opinion, make sure to keep a neutral presence in the public parts of the server and try not to let your opinion cloud your judgement.

Managing International Moderator Teams

Managing several moderators from all around the world can turn out to be quite tricky. Organizing meetings and waiting for everyone’s approval for one date may be a real challenge to deal with. Getting the majority of the team on board could take days, especially with the difference in time zones.

One simple solution is to delegate captains or representative moderators for each team. They will gather feedback, opinions, and suggestions from their team and discuss them with the rest of staff during the meeting. That way it’s easier to organize staff meetings, and they can also directly inform members of their team who could not make it due to time conflicts. Depending on the size of the staff teams, you can appoint more than one moderator to that position.

When creating more than one staff role, make sure that everyone is comfortable with their position and their responsibilities. Misalignment can lead to misunderstanding later on, and it might create needless tension amongst the staff. Consider appointing one representative in each region that your server is branching out to. That will open the possibility to have regional staff meetings where language and culture specific issues and suggestions can be discussed without the need for an international meeting with the whole team.

While not every moderator will get the chance to get to know every other moderator from different regions in these larger moderation rosters, it is important that the team feels united regardless. Don’t leave it up to one team to tackle an issue; let them know that they can always ask for help from the rest of the staff.

Bots and ModMail

Global Accessibility: Make sure that you have the bots available for every part of your server. That could either mean that you include them in the global section of your server, or you translate the commands and definitions in every language featured.

Moderation: Since not all moderators will be able to understand every single language, bots turn out to be very helpful for auto-moderation. Inform your fellow moderators about inappropriate words or phrases and ensure they are added to the bot filter for easier moderation. Your banned or filtered word lists should contain common slurs and commonly banned words not only in English, but in every language that you offer (a minimal sketch of such a per-language filter follows this section).

Language Barriers: It may occur that users message the ModMail in poor English or in their native language. Find out the user’s language and reply with a message in that language that lets them know their request is being routed to an appropriately fluent mod. To keep these replies on hand, simply create an extra text channel in the ModMail server and let moderators store translations of important phrases that might come in handy.
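
As a rough illustration of the per-language filter idea mentioned above, the sketch below shows the general shape such auto-moderation could take if you wrote it yourself with discord.py. The word lists and language keys are placeholder examples only; most servers will instead configure an existing moderation bot or Discord’s built-in AutoMod keyword rules to do this.

```python
# Minimal sketch of a per-language word filter, assuming discord.py 2.x.
# BANNED_WORDS is a placeholder structure; fill it with the terms your team agrees on
# for each language you feature.
import discord

BANNED_WORDS = {
    "en": {"examplebadword"},
    "de": {"beispielwort"},
    "es": {"palabraejemplo"},
}

intents = discord.Intents.default()
intents.message_content = True  # required to read message text

client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message):
    if message.author.bot:
        return
    content = message.content.lower()
    for language, words in BANNED_WORDS.items():
        if any(word in content for word in words):
            await message.delete()
            await message.channel.send(
                f"{message.author.mention}, that message was removed by the {language} filter.",
                delete_after=10,
            )
            return

client.run("YOUR_BOT_TOKEN")  # placeholder token
```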

Cultural Awareness

Sensitive Topics

It's easy for people to underestimate the impact cultural differences can have. Culture influences values, rules, thought patterns, and perception.

That means that events happening in other countries may be viewed differently in each nation. News and social media don’t always portray the truth, which lets misinformation spread easily. Make sure to keep your staff updated about current situations so it is easier for them to deal with discussions about sensitive topics and with trolls. Communication is key, and the users of the server will benefit massively from it. While people are entitled to their own opinions and to ask for further information, make sure this happens in a calm and civil atmosphere under a moderator’s watch. If you as a moderator find that you’re ill-equipped to talk about the topic, you should refrain from publicly voicing your opinion until you’re better informed. Conversely, you may want to refrain from allowing contentious topics like current events or politics in your server at all, which is something you should decide with your moderation team as a whole.

Final Words

Making a server internationally available is a great idea and can be a boon to your community’s retention. It can be utilized by both communities and companies alike. But you have to be careful with your approach: internationalization is a deliberate and complicated process, and it should be treated as such. If it’s gone about in the wrong way, left incomplete, or rushed, these international spaces could backfire. That may result in negative feedback, a disappointed community and staff, or deserted channels.

Make sure to inform your staff and community about every step you’re about to take, and give them a chance to voice their input. Feedback and suggestions from both your mods and your community will be essential to making sure this is the right fit for your server. Internationalization requires a lot of effort and prioritization in order to properly take care of many factors simultaneously, but if done right, it’s an unparalleled way to enrich your community.

Moderation

The Application of Metaphors in Moderation

The Five Types of Community Moderation Approaches

An academic study interviewed dozens of moderators across multiple platforms and grouped moderation approaches into five different categories:

  • Nurturing and Supporting Communities
  • Overseeing and Facilitating Communities
  • Governing and Regulating Communities
  • Fighting for Communities
  • Managing Communities

Being able to understand the perspective behind each of these approaches and then applying them to your own community as needed is a powerful ability as a community manager. This article will discuss in more detail what these five categories mean and how you can apply them within your own communities.

Nurturing and Supporting Communities

Moderators that nurture and support communities (nurturing-type moderators) focus on shaping the community and the conversations that occur among members in the server to match their vision. The foundation for their moderation actions stems from their desire to keep the community positive and welcoming for everyone, not just long-time members. They seek to create a community with a good understanding of the rules that can then develop itself in a positive way over time.

These types of moderators may implement pre-screening of members or content in their communities by implementing a verification gate or using an automoderator to filter out low quality members or content and curate the conversations of the server to be better suited to their vision.

Although this passive behind-the-scenes guidance is one type of nurturing moderator, these types of moderators also often actively engage with the community as a “regular member.” For nurturing-type moderators, this engagement isn’t meant specifically to provide an example of rule-following behavior, but rather to encourage high-quality conversations on the server where members will naturally enjoy engaging with each other and the moderators as equals. They are leading by example.

Overseeing and Facilitating Communities

While nurturing- and supporting-type moderators operate based upon their long-term vision for a community, moderators that are focused on overseeing and facilitating communities focus on short-term needs and the day-to-day interactions of community members. They are often involved in handling difficult scenarios and fostering a healthy community.

For example, these types of moderators will step in when there is conflict within the community and attempt to mediate between parties to resolve any misunderstandings and restore friendliness to the server. Depending on the issue, they may also refer to specific rules or community knowledge to assign validity to one viewpoint or to respectfully discredit the behavior of another. In both situations, moderators will attempt to elicit agreement from those involved about their judgment and resolve the conflict to earn the respect of their community members and restore order to the server.

Those in the overseeing and facilitating communities category may also take less involved approaches towards maintaining healthy day-to-day interaction among members, such as quickly making decisions to mute, kick, or ban someone that is causing an excessive amount of trouble rather than attempting to talk them down. They may also watch for bad behavior and report it to other moderators to step in and handle, or allow the community to self-regulate when possible rather than attempting to directly influence the conversation.

Fighting for Communities

Where overseeing and facilitating community moderators emphasize interactive and communicative approaches to solving situations with community members, moderators who see themselves as fighting for communities heavily emphasize taking action and content removal rather than moderating via a two-way interaction. They may see advocating for their community members as part of their job and want to defend the community from those who would try to harm it. Oftentimes, the moderators themselves may have been on the receiving end of the problematic behavior in the past and desire to keep others in their community from having to deal with the same thing. This attitude is often the driver behind their no-nonsense approach to moderation while strictly enforcing the community’s rules and values, quickly working to remove hateful content and users acting in bad faith.

Moderators in this category are similar to the subset of moderators that view moderation from the overseeing and facilitating communities, specifically the ones that quickly remove those who are causing trouble. However, compared to the perspective that misbehavior stems from immaturity, moderators that fight for communities have a stronger focus on the content being posted in the community, rather than the intent behind it. In contrast to moderators in the overseeing and facilitating communities category, these moderators take a firmer stance in their moderation style and do not worry about complaints from users who have broken rules. Instead they accept that pushback on the difficult decisions they make is part of the moderation process.

Governing and Regulating Communities

Those that see themselves as governing and regulating communities see the moderation team as a form of governance and place great emphasis on the appropriate and desirable application of the community rules, often seeing the process for making moderation decisions as similar to a court system making decisions based on a set of community “laws.” They may also see themselves as representatives of the community or the moderation team and emphasize the need to create policies or enforce rules that benefit the community as a whole.

Moderators in this category may consciously run the community according to specific governing principles, such as having a vote on community changes. However, they may also achieve consensus within the team about changes to the server without involving the community at large, or even have one moderator make the final determination about community changes. This “final decision” power is usually exercised in terms of vetoing a proposed policy or issuing a ruling on an issue that is particularly contentious within the mod team or community. Very rarely would this form of decision-making be exercised, and it would be granted only to very specific members of a team hierarchy, such as the server owner or administrative lead. Even so, moderators in this category find following procedure to be important and tend to involve others to some extent in making decisions about the community rather than acting on their own.

This tendency is also seen in the way that they approach rule enforcement. Moderators that see themselves as governing and regulating communities view the rules as if they were the laws of a country. They meticulously review situations that involve moderator intervention to determine which rule was broken and how it was broken while referring to similar past cases to see how those were handled. These moderators also tend to interpret the rules more strictly, according to the “letter of the law,” and attempt to leave no room for argument while building their “case” against potential offending users.

Managing Communities

Moderators that see themselves as managing communities view moderation as a second job to be approached in a professional way. They pay particular attention to the way they interact with other members of the community moderation team as well as the moderation teams of other communities, and strive to represent the team positively to their community members. This type of moderator may appear more often as communities become very large and as there becomes a need for clearer, standard processes and division of responsibility between moderators in order to handle the workload.

Though this metaphor focuses more on moderator team dynamics than relationships between moderators and users, it can also shape the way moderators approach interactions with users. Managing-type moderators are more likely to be able to point users toward written rules, guidelines, or processes when they have questions. Managing-type moderators are also much less likely to make “on-the-fly” decisions about new issues that come up. Instead, they will document the issue and post about it in the proper place, such as a private moderator channel, so it can be discussed and a new process can be created if needed. This approach also makes it easier to be transparent with users about decision making. When there are established, consistent processes in place for handling issues, users are less likely to feel that decisions are random or arbitrary.

Another strength of this approach is evident in efficient on-boarding processes. When a community has clear processes for documenting, discussing, and handling different situations, adding new moderators to the team is much easier because there is already a set of written instructions for how they should do their job. This professional approach to moderation can also help moderators when they are attempting to form partnerships or make connections with other servers. An organized moderation team is much more likely to make a good impression with potential partners. If you want to learn more about managing moderation teams, click here.

Conclusion

As you read through this article, you may have found that some moderation category descriptions resonated with you more than others. The more experience you have moderating, the wider the variety of moderation approaches you’ll implement. Rather than trying to find a single “best” approach from among these categories, it’s better to consider your overall balance in using them and how often you consider moderation issues from each perspective. For example, you can nurture and support a community by controlling how members arrive at your server and curating the content of your informational channels to guide conversation, while also managing and overseeing the interactions of honest, well-intentioned community members and quickly banning those who seek to actively harm your community.

It’s perfectly natural that each person on your moderation team will have an approach that comes easier to them than the others and no category is superior to another. Making sure all moderation categories are represented in your moderation team helps to ensure a well-rounded staff that values differing opinions. Even just understanding each of these frameworks is an important component of maintaining a successful community. Now that you understand these different approaches, you can consciously apply them as needed so that your community can continue to thrive!

Moderation

Twitch X Discord

Is Discord for Twitch Necessary?

At some point during their Twitch careers, most streamers will ask the question, “Is having a Discord for my community necessary?” Most streamers are already struggling enough with configuring their overlays, cultivating relationships with other streamers, and finding the time to edit or design future content. Will working with yet another platform really help move my streaming career forward, or is it just another item on the endless list of distractions?

Most of the streamers who ask this question fail to understand how Discord fits into their high-level plan for scaling their brand and community. One of the biggest misconceptions is that Twitch can be used for every step of growing your channel, brand, and community. Twitch is only one of many platforms required for success. While Twitch is well-known for engaging and monetizing your active community base, it struggles with content discovery, data collection, and long term retention. Twitch should instead be viewed as a tool to engage and connect with an already-existing community.

Content platforms can be categorized into three primary groups: Discovery, Engagement, and Activation. Content platforms that focus on Discovery are specifically designed to spread new, persistent content that can be viewed later. These platforms are ideal for finding new community members and fans to engage with by inviting them to join another, more personalized platform, such as a fansite, email list, or Discord community. It is through this persistent community that you can advertise ways to support and engage directly with the primary content creators, such as supporting through Patreon or viewing/subscribing on Twitch.

These platforms can obviously be used for different purposes, but they’re segmented this way because of how their primary business models work. Discovery platforms run on ads and focus on getting you to consume more content. Engagement platforms are more personalized, with content not designed for larger audiences, and they frequently encourage users to take action. Activation platforms are designed around monthly or direct recurring payment models that provide an enhanced or personalized version of Engagement content.

Going Live and Back Again

With a better understanding of the primary purposes of each platform, you can begin to design your user experience in a way that provides new community members an easy and direct way to participate in and show dedication to the community. Regardless of how your Twitch viewers discover your channel or content, directing them to a centralized community hub will allow them to share their appreciation with others, and go from being an individual fan to being part of a community.

To get the most out of Discord and encourage members to join your stream, spend time with your community before and after each livestream. Before each stream, spend 15 minutes chatting with your active Discord members to ask them about their day or receive feedback about your content. As you begin your stream, post announcements on your social media platforms and use a bot to alert your Discord community that you’ve gone live. Moderators should continue to oversee discussions in your Discord, but encourage members to watch the livestream as it occurs. Bots and scripts can also be used to remind users to join the Discord community if they haven’t already, as well as to follow you on social media for updates and alerts. Finally, as your stream is ending, remind the community that you’ll be available in Discord for a short time to discuss the stream and answer any questions. Some streamers will also include special events for high-ranking members or subscribers, which can help encourage collaboration and active use of the server.
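
Many off-the-shelf bots handle go-live alerts for you; if you want to see how such an alert could work under the hood, here is a rough sketch using discord.py’s presence events. The IDs, channel, and message wording are placeholders, and the approach assumes the streamer’s Discord account shows its Twitch streaming activity and that the bot has the privileged presences and members intents enabled.

```python
# Rough sketch of a go-live announcement, assuming discord.py 2.x with the
# "presences" and "server members" privileged intents enabled for the bot.
import discord

STREAMER_ID = 123456789012345678          # placeholder: the streamer's Discord user ID
ANNOUNCE_CHANNEL_ID = 987654321098765432  # placeholder: your announcements channel ID

intents = discord.Intents.default()
intents.presences = True
intents.members = True

client = discord.Client(intents=intents)

def is_streaming(member: discord.Member) -> bool:
    # True if any of the member's current activities is a "Streaming" activity.
    return any(isinstance(activity, discord.Streaming) for activity in member.activities)

@client.event
async def on_presence_update(before: discord.Member, after: discord.Member):
    # Announce only when the streamer transitions from "not streaming" to "streaming".
    if after.id != STREAMER_ID:
        return
    if not is_streaming(before) and is_streaming(after):
        stream = next(a for a in after.activities if isinstance(a, discord.Streaming))
        channel = client.get_channel(ANNOUNCE_CHANNEL_ID)
        if channel:
            await channel.send(f"{after.display_name} just went live: {stream.url}")

client.run("YOUR_BOT_TOKEN")  # placeholder token
```

A more robust setup would query the Twitch API directly instead of relying on Discord presence data, which is what most dedicated announcement bots do.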

Playing with Viewers via Discord

One of the most effective ways to engage directly with your community is to create content with them. This can either be done while livestreaming or as a scheduled event that the community can participate in. Events need not necessarily be games, but can also include Q&A sessions or discussions about a topic of interest. Special precautions must be taken while playing with viewers as you’re live to prevent “stream sniping,” where community members will attempt to disrupt your stream or game while you’re live. Moderators should take note and be ready to handle disruptions while you engage directly with the community, but most situations can be prevented by assigning restrictions or requirements for being on stream with the host. For example, requiring participants to be subscribers or to apply for a spot if the game has a limited number of slots can help ensure everybody understands and follows the rules.

The Discord for Twitch Checklist

As you grow your Discord community, members will have different expectations of what your server offers, how it’s managed and what the server is used for. For example, a community with 500 members may not be expected to have a full events calendar, but they would definitely expect that roles have been properly configured so that subscribers appear as a different username color with more permissions and access to restricted channels. Below is a simple checklist for the functionality your server should have:

Milestones for 50 Members

  • Establish basic Discord functionality on your server. Early on your Discord community may only have a few channels, but setting the correct discussion channels and role permissions early will help to prevent rule violations and problems as your community grows. More articles in the Support Center can be found on Role Management Fundamentals, Setting up Permissions, and Designing Private Discussion Channels.
  • Configure your Twitch Integration. Once you reach Affiliate level on Twitch with 50 followers and an average of 3 viewers per stream, your channel will unlock community subscriptions and custom emotes. The subscription status and emotes can be synced to your Discord automatically through an OAuth integration, allowing you to provide custom subscriber-only channels to reward loyalty. Learn more in the support center about Twitch Integration.
  • Configure your Announcement Bots. While the use of bots for such a small community may seem unnecessary, having your stream announcement bots configured as soon as possible will help your Twitch channel grow. You can have two bots that fulfill this role: a bot in Discord to announce go-live streams and a bot on Twitch to offer invitation links to Discord.

Milestones for 500 Members

  • Regular News and Announcements. Once you have a larger and more active community, consider doing regular posts featuring content released on other platforms like YouTube or information about changes and updates to the community. Many popular Discord bots offer custom alert notifications to encourage your community to load your livestream as you begin streaming. Several Twitch bots also allow for commands you can use to drive traffic to your Discord community.
  • Bot Automation for Modding and Common Questions. While small communities can be easily modded by a few individuals, larger communities require automated tools to delete posts and enforce the rules. Bots can be used to detect and remove spam, assign roles, kick and ban members, or answer frequently asked questions using keyword detection (a rough sketch of one such anti-spam tool follows this list).
  • Diversified Discussion and Call Channels. As your community grows, new discussion and call channels should be created to allow multiple discussions to occur simultaneously. For example, assigning custom channels for promotion, media, or other discussion topics helps make content in your community easier to find and filter. Create two or more voice channels to allow for serious versus casual discussion, and limit the number of active chatters to ensure the call doesn’t become too noisy.
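
To make the bot automation point above concrete, here is one minimal sketch of how a simple flood/spam check could work in discord.py. The five-messages-in-ten-seconds threshold is an arbitrary placeholder, and in practice most communities rely on an established moderation bot or Discord AutoMod rather than hand-rolled code like this.

```python
# Minimal sketch of per-user flood detection, assuming discord.py 2.x.
# Thresholds are arbitrary placeholders; tune them for your own server.
import time
from collections import defaultdict, deque

import discord

MAX_MESSAGES = 5      # placeholder: messages allowed per window
WINDOW_SECONDS = 10   # placeholder: sliding window length in seconds

intents = discord.Intents.default()
intents.message_content = True

client = discord.Client(intents=intents)
recent = defaultdict(deque)  # author id -> timestamps of that user's recent messages

@client.event
async def on_message(message: discord.Message):
    if message.author.bot:
        return
    now = time.monotonic()
    timestamps = recent[message.author.id]
    timestamps.append(now)
    # Drop timestamps that have fallen outside the sliding window.
    while timestamps and now - timestamps[0] > WINDOW_SECONDS:
        timestamps.popleft()
    if len(timestamps) > MAX_MESSAGES:
        await message.delete()
        await message.channel.send(
            f"{message.author.mention}, please slow down.", delete_after=5
        )

client.run("YOUR_BOT_TOKEN")  # placeholder token
```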

Milestones for 5000 Members

  • Scheduled Events and Community Activities. As your community grows into the thousands, enough members will be active that you can begin hosting events or activities related to the core theme or value proposition your community offers. These events would be posted as part of your community announcements and may offer a custom role to keep regulars active and informed about similar events. A system that allows community members to create their own events can assist in offloading work and planning from the moderation team, and empower your members to build a stronger culture within the community.
  • Advanced Functionality Scripts. Finding a skilled developer to create custom bots and scripts for the community will open up new opportunities for what your members can accomplish on your platform. For example, modular discussion and voice channels can allow community members to temporarily create their own call or discussion channels for an activity. These channels can last a certain amount of time or automatically delete themselves once everyone has left, ensuring the total number of channels and messages in the community isn’t overwhelming (a rough sketch of this pattern follows below).
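
As a rough sketch of what such a script might look like (the command name, intents, and cleanup logic are illustrative assumptions, not a specific bot’s implementation), a discord.py bot can create a voice channel on request and remove it once the last person leaves:

```python
# Rough sketch of self-deleting temporary voice channels, assuming discord.py 2.x.
# The "!voice" command name and the in-memory tracking set are illustrative only.
import discord
from discord.ext import commands

intents = discord.Intents.default()
intents.message_content = True
intents.voice_states = True

bot = commands.Bot(command_prefix="!", intents=intents)
temp_channels = set()  # IDs of channels this bot created and should clean up

@bot.command()
async def voice(ctx, *, name: str = "temp-room"):
    """Create a temporary voice channel in the same category as the command channel."""
    channel = await ctx.guild.create_voice_channel(name, category=ctx.channel.category)
    temp_channels.add(channel.id)
    await ctx.send(f"Created {channel.mention}; it will be deleted once it is empty.")

@bot.event
async def on_voice_state_update(member, before, after):
    # When someone leaves a tracked channel and nobody is left in it, delete it.
    if before.channel and before.channel.id in temp_channels and not before.channel.members:
        temp_channels.discard(before.channel.id)
        await before.channel.delete(reason="Temporary channel emptied")

bot.run("YOUR_BOT_TOKEN")  # placeholder token
```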

Conclusion

We hope these guidelines have helped you to determine whether creating a Discord server for your Twitch audience is the right fit for you. The most essential thing to keep in mind is that your Discord server is not just an appendix to your Twitch presence. It serves as its own entity and community that needs to be fostered with the same, if not more, personal attention and care in order for it to blossom. With the right mindset and effort, your Discord community can bolster your Twitch presence and bring your community closer together, for longer.

Moderation

Considering Mental Health in Your Community

The Importance of Mental Health Awareness

Before we discuss what to consider when introducing mental health channels to a community, it is important to understand the difference between mental health and a mental illness, and to be aware of the realities of what these terms entail.

Mental health and mental illness are strongly intertwined, but very different. People with good mental health can develop a mental illness, while those with no mental illness can have poor mental health. Mental health reflects our emotional, psychological, and social well-being. Though the topic of mental health is still quite taboo to talk about in certain circles, there is help available for those who wish to seek it. Mental health is a crucial aspect of a person’s existence as it affects our actions, emotions, and thoughts. A healthy mental state enhances effectiveness and productivity in work, education, and interpersonal relationships.

A mental illness is a disorder of the mind that affects not just our thinking but also our energy, mood, and occasionally our conduct or behavior. Such a diagnosis may make it difficult to cope with the many obligations of life. Common mental illness diagnoses include anxiety disorders such as panic disorder, post-traumatic stress disorder, obsessive-compulsive disorder, and specific phobias; mood disorders like depression and bipolar disorder; and psychotic disorders such as schizophrenia. An excellent guide for identifying mental health challenges and pathways to care can be found here.

Being informed and aware about mental health issues is a great start when you’re considering how it may affect or apply to your community and team. As an illness can manifest in a variety of different ways, it is vital to be ready and have clear guidelines on these complex topics when dealing with online communities, both for your staff internally and for your community externally. An often overlooked result of community moderation is how it can affect the mental health of your team. Moderator burnout stems from situations that harm your team’s mental health, and understanding the signs and how to deal with it is essential to building a strong online community led by a moderation team that can act as a strong support system not only for your community, but for each other.

Adding a Mental Health Channel in Your Server

There is a lot to consider when deciding whether you want to add a channel that is dedicated to talking about topics related to mental health within your Discord server. Firstly, you have to determine whether or not your community needs a mental health discussion channel or thread. Consider whether your team has seen much mental health-related conversation in your community and whether such conversations have remained respectful even without your team’s active intervention. If these conversations have devolved into arguments or negativity, re-evaluate whether this type of channel will really benefit those in your server and think carefully about whether you have enough resources to guarantee that it remains a safe space for all users.

An easy way to gauge whether a mental health channel would fit into your server is by looking at the general topic of your server. If your Discord server is more focused on gaming, it may not make sense to allow something that drastically differs from your server’s purpose. Conversely, a server focused on community building might be a safer space for a mental health channel. Ultimately, it’s very subjective and all about listening not just to the needs of your community but to the tone of your server.

A lot goes into developing an excellent community and introducing a mental health channel is not easy as you have to consider how to best moderate it. This includes how you’ll be utilizing auto moderation and text filters in the channel as accidental flags in highly emotional situations can be more harmful than helpful. If you do decide to create a mental health channel or thread, it is important to be aware of all possible situations and to be flexible and prepared for unexpected scenarios.

Understanding that some moderators aren’t comfortable in these situations is important, and if you decide to allow such a channel on your server, you need to be able to vet your moderators and future helpers. As moderators aren’t mental health professionals, they shouldn’t be treated as such and should be presented with the option to opt out of spaces such as this. Therefore, it is vitally important that you establish clear internal guidelines and procedures to set expectations for the channel at a manageable level.

Facilitating Healthy Discussions about Mental Health

Discussing mental health in a positive way can be challenging as it is a sensitive topic with a variety of experiences attached to it. However, there are some ways to make this an easier experience from a moderation standpoint, the most important of which is having clear server rules and moderator guidelines for how to act when these discussions are taking place and having clear escalation protocols for if things go south. Talking with your moderators about mental health and challenging times in life as well as facilitating breaks for them demonstrates that you care about their well-being. This is an integral part of establishing a healthy environment for your moderators and solid internal relationships.

The creation of specific mental health-related guidelines ensures that users remain respectful of each other and that conversations stay within the designated guidelines of your community.

Some suggestions for creating these guidelines include:

  • Make it clear that moderators (unless otherwise stated) are not mental health professionals.
  • Have a clear rule about respect with the emphasis that intentionally disrespectful actions will lead to moderator intervention. Respecting others and exhibiting empathy towards people’s situations can go a long way to helping them feel validated and supported.
  • Do not tolerate harassment of any kind. It is imperative that mental health channels remain safe, positive, and welcoming to all. Nothing in the conversation should make others feel uncomfortable or attacked.
  • Have a rule or two about taking a break if necessary. This can be true for both moderators who are at risk of burnout and the regular users participating in these channels. Taking breaks is incredibly important and should be encouraged, especially in the context of mental health.

Escalation protocols are another important foundation for when mental health discussions take place. Some ideas to consider when implementing them:

  • Know when they should go into effect. A user just venting about their day is probably not going to need escalation, while a user who is talking about self-harm will need to be escalated.
  • Have clear and understandable guidelines on when and how to escalate things. Some ideas for these guidelines:
  • Have a role that can be pinged for when / if a user is suicidal.
  • Have a link to the T&S report form on hand. This is the best way to escalate things as Discord Trust & Safety can do more than the average user. For more information on reporting and how to report, see this article.
  • Know that de-escalation is also an important step. Reporting problematic content to Discord is important, but reports are often not actioned immediately. Having a guideline for de-escalation can help alleviate situations that are time sensitive.

How to Moderate a Mental Health Channel

Moderation of mental health channels can be a touchy subject. On one hand, moderators should be firm in removing harassers and users who are disrespecting others. On the other hand, moderators should try their best not to over-moderate. There must be a balance to ensure that these channels that have very sensitive subject matters within them are moderated with care. Some ways to do this are:

  • Always go into a situation with the assumption of good faith. Coming into an argument or conversation with the assumption of bad faith, especially in channels dedicated to mental health, can muddy the waters and skew perceptions.
  • Consider a situation from all angles, especially within the context of mental health. A person may be lashing out due to poor mental health, and thus it may be a good time to de-escalate the situation before dishing out any punishments. That said, do not let users use mental health as an excuse for bad behavior, especially if it becomes a pattern or other users are suffering.
  • Be firm, but fair. The best way to do this would be to have punishments that escalate for each offense in these channels. The first offense may be a warning, while the second or third may be a mute. There could also be a “strike” system, whereby a mute or tempban may be applied by the third strike (a minimal sketch of such a system follows this list).
  • Education is key. Try to use moderation not as a tool to make an example, but as a way to educate. Users will respond better to education and it will be an opportunity for growth not just for the user that broke a rule or guideline, but also for other users in the channel. This also humanizes mods as people who want to do the best for their community and not robots that serve to punish users.
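To make the escalating-punishment idea above more concrete, here is a minimal, hypothetical sketch of a strike tracker. It is not tied to any particular moderation bot; the thresholds, function name, and suggested actions are assumptions for illustration and should be adapted to your own rules.

```python
# Minimal, hypothetical sketch of an escalating "strike" system.
# The thresholds and action names are illustrative assumptions only.
from collections import defaultdict

# Maps a user ID to the number of strikes recorded against them.
strikes: defaultdict[int, int] = defaultdict(int)

def record_strike(user_id: int) -> str:
    """Record one strike for a user and return the suggested moderator action."""
    strikes[user_id] += 1
    count = strikes[user_id]
    if count == 1:
        return "warn"      # first offense: remind them of the channel rules
    if count == 2:
        return "mute"      # second offense: a temporary mute
    return "tempban"       # third offense and beyond: escalate further

# Example: the third recorded offense for the same user suggests a tempban.
for _ in range(3):
    action = record_strike(1234567890)
print(action)  # -> "tempban"
```

In practice, you would likely also want strikes to expire over time so that one bad week does not follow a member forever.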

How to Access Support

There is an incredible number of resources available for mental health support, from emergency phone numbers to suicide hotlines and LGBTQ+ helplines. The Find a Helpline resource consists of a list of global helplines and hotlines. We also recommend looking at the TWLOHA resource, a non-profit movement dedicated to presenting hope and finding help for people struggling with their mental health.

Discord has also partnered with Crisis Text Line, a non-profit that provides text-based volunteer support for people in crisis. You can learn more about our integration and partnership and how to use it on Discord here.

Not all communities are prepared to host a mental health channel or thread, and there is a lot to consider before adding one to your server. While allowing talk about mental health can be incredibly beneficial, it is just as important to realize that moderators aren’t professionals and should not be put in those positions if they aren’t comfortable with it. That’s why we advise that you carefully consider your current environment, your community’s purpose, and the implications if you choose to add one.

Whatever you decide is best for your community, we believe that education is key. It may be helpful to think about adding a channel or post that gives people access to professional support lines, such as Find a Helpline and the TWLOHA resource, as well as other resources laid out in this article. Generally, facilitating healthy discussion around important topics of any kind is essential in helping to develop stronger communities online. Make sure that your community is a safe space for everyone who chooses to call it home!

Moderation

Managing Interpersonal Relationships

Degrees of Interpersonal Relationships

Any relationship between two members of a community can be described as an interpersonal relationship. These relationships exist on a wide spectrum. As you participate in a community, you are most likely going to develop connections to varying degrees with other members of the community. As a moderator, this may even be expected as part of your duties to promote community engagement and healthy conversations. That’s perfectly normal, as it’s very natural for people who spend a lot of time communicating to develop closer ties to one another.

Every kind of relationship, from mere acquaintances to romantic partners, can occur in a Discord community, and every relationship you form as a moderator will carry its own unique challenges and responsibilities in order to ensure you are performing your duties to the best of your ability. Any kind of interpersonal relationship can create difficulty in moderation, but as the nature of the relationship changes, so too does the unconscious bias you may experience.

Friendships

A friendship between a moderator and a member of the community is the least problematic type of interpersonal relationship, but as these friendships form it is still important to take notice and be aware of them. As a moderator it is your duty to be available to everyone in the community, even people who you may never see as a friend, so you must resist the temptation to devote more time and attention to the people you more easily connect with. If your biases toward your friends begin to show up in your moderation efforts, many more serious and harder-to-diagnose problems can arise. Feelings of "elitism" or “favoritism” can start to take hold, and disgruntled members may take advantage of your friendships to excuse or justify their own behavior, so take care to make sure that you are remaining impartial.

Close Relationships

A friendship between a moderator and a member of the community that persists for a long period can evolve into a closer and more open relationship. These relationships are built on trust or shared experience, and can be more difficult to impartially manage than regular friendships or acquaintances. This kind of relationship could come from the fact that this person is someone you know from another server, in real life, or possibly even a family member. No matter what the scenario, the closeness of this kind of relationship makes it very difficult, sometimes impossible, to remove your own partiality from the equation. Special care must be taken to ensure you engage and listen to other moderators on your team when someone you are closely involved with is in question. When in doubt, it may be best to remove yourself from the situation entirely, which we will discuss in more detail later in the article.

Romantic Relationships

A romantic relationship between a moderator and a member of the community can (and does!) happen. As is natural, if you meet someone who shares common interests and has an attractive personality, over time your relationship may progress into something more profound. Romantic relationships are certainly the most difficult to manage as a moderator. The saying holds true, especially in new romantic relationships, that you will see your significant other through “rose-tinted glasses,” which tend to blind you to their potential flaws or wrongdoings.

Additionally, other members can very quickly see a budding relationship as an opportunity for a fellow member to grab power through the moderator they are romantically involved with. As a best practice, you should remove yourself from any moderation decisions involving a user that you are in a romantic relationship with. Failing to do so can and has directly caused the death of some communities, especially when the romantic partners are both on the same moderator team.

Parasocial Relationships

This type of one-sided interpersonal relationship is rare among moderators because of the connection a moderation team usually has to the content creator or personality that they moderate for. More commonly, a user witnessing a friendly moderator carrying out their daily duties to interact with their server can develop such a relationship. However, this type of relationship requires an extra level of care and awareness, as they can quickly become toxic if not managed appropriately. Always be aware of them, and consider their existence when making certain moderation decisions. The DMA has an article exclusively dedicated to parasocial relationships for further reading.

On Visibility

One thing to keep in mind when evaluating your relationships in your communities, regardless of their nature, is that your relationships and connections, if played out in the server, are most likely visible to other members of the community. When interacting with your friends, close friends, or even your partner in a space with other people such as your server, members of the community may pick up on the fact that you have these relationships. As with any kind of community, feelings of exclusion or the perception of “in-groups” can arise, especially when it comes to relationships between a “regular” server member and a highly public and visible one like a moderator. A responsibility you have as a moderator is to take this dynamic into account, along with the effects it can have on your members and how they view you and your friendships. Make sure that your friendships and relationships are not creating an exclusionary atmosphere for other community members, where they feel like it’s unwanted or difficult for them to contribute.

On the subject of “visibility,” a moderator, whether they are consistently conscious of it or not, is someone in the server who has power over other users in that space. It is not always easy to balance being part of a community and cultivating relationships and friendships with being conscious of your role within that community as a moderator and what that imbalance may mean. This difference in responsibility and position can make relationships and connections with other users in the server more complicated. You may not be directly aware of it when you’re chatting with fellow server members, but there will be users in your community who are keenly aware of your status as a moderator. This scrutiny can affect how they approach becoming friends with you as well as how they view your own relationships with other server members. Always keep this dynamic in mind and be aware of how your position may affect not just how users interact with you but also how they interpret your relationships and conversations with other members.

A Higher Standard

Just as it is natural for these relationships to form, it is also human nature to unconsciously develop and act on a bias toward the people closest to you. As a moderator, that natural bias is something you must actively resist, and take conscious steps to avoid. What happens when the friend of a moderator has a bad day and doesn’t act in the spirit of the rules of the community? In an ideal scenario, the moderator’s response would be the same reasonable response that would be expected if the offending member were anyone else. Your response to these situations will have a profound impact on your community’s attitude toward you as a moderator, as showing favoritism will quickly evaporate the community’s trust in your ability to be impartial. Moderators are human, and for inexperienced and seasoned moderators alike, this kind of scenario can prove to be one of the most significant tests of their ability to manage conflict.

Setting Yourself Up For Success

In preparing for this scenario, the most important tool in a moderator’s arsenal is self-awareness. It is the burden of a moderator that their commitment to the community comes above any interpersonal relationships that may form during time spent engaging with it. Being ever-mindful of your responsibility and role in a community can help temper the depth of the relationships that you build.

As a recommended best practice, moderators should be careful about building interpersonal relationships of depth (close or romantic relationships) in the communities they moderate, including with other moderators. The only guaranteed way for a moderator to remain impartial in upholding the rules for all members is to exclusively maintain such friendships outside of their community, but this isn’t always reasonable for communities that you are closely involved in. Should you find yourself in a difficult scenario involving a member with whom you have a close interpersonal relationship, here are some best practices for managing the situation:

Self-Evaluate

The first step in successfully managing a scenario that involves someone you have an interpersonal relationship with is to take stock of your own investment. How are you feeling? Are you calm and capable of making a rational judgment? Is your gut reaction to jump to the defense of the member? Or is the opposite true: do you feel the need to be overly harsh in order to compensate for potential bias? Carefully self-evaluate before proceeding with any action. The wrong type of moderator response in a scenario like this can often exacerbate or distract from the actual issue at hand, and potentially weaken your community’s trust in your capabilities as a moderator.

If in the course of your self-evaluation you realize that you cannot positively answer any or all of these questions, it may be necessary for you to more seriously evaluate whether or not you need to make difficult decisions regarding your position as a moderator. If your interpersonal relationship is preventing you from fulfilling your duties as a moderator, you may need to consider either abdicating your role as a moderator or ending the relationship until circumstances improve. Neither option is easy or ideal, but making tough decisions for the health of the community is your primary responsibility as a moderator.

Evaluate the Scenario

Once you’ve determined that you’re capable of proceeding with moderation, evaluate the scenario to identify what the problem is and whether it immediately needs to be addressed. If there is no immediate need to step in, as a best practice it is usually better to defer to another moderator whenever your personal relationships are involved. Contact another member of your moderation team to get a second opinion and some backup if necessary.

If immediate action is required, a concise and direct reference to the rules is usually sufficient to defuse the situation. Use your best judgment, but be aware that the likelihood of “rules lawyering” is higher with someone who trusts you or sees you as a friend in these scenarios because moderation action can be seen as a violation of that trust or relationship. Clearly and fairly indicating the grounds for you speaking up is crucial to prevent further issues from arising.

Additionally, be careful about what is discussed in private with the person involved in this scenario following any action. There is a higher likelihood of them contacting you via DM to talk about your decisions because of the level of trust that exists between you. As a best practice, it is usually best to avoid litigating the rules of the server with any member, especially a member with whom you have an interpersonal relationship. Politely excuse yourself, or if prudent, redirect the conversation by giving the member a place to productively resolve their own issue.

Log Action and Self-Evaluate (Again)

As with any moderation action, once taken it is best practice to leave a note for your team about what action was taken and why. Another period of self-evaluation is a good idea after any action is taken. Ask yourself, was the action taken in alignment with the rules of your community? Was it fair to both the offending member, as well as the other members of your community? Was your decision affected by your bias towards the offending member? If necessary or unclear, ask your teammates for their outside perspective.

Summary

Taking moderation action when the offending member is one with whom a moderator has an interpersonal relationship can be one of the most difficult scenarios that a moderator can find themselves in. Set yourself up for success as a moderator by tempering the type of relationships you build within your community and cultivating the ability to self-evaluate. The best tool available to a moderator in these scenarios is self-awareness and the ability to recognize when their own biases prevent them from acting fairly. Remember that moderation is a team sport, and that team is your most valuable resource in impartially upholding the rules and values of your community.

Moderation

Best Practices for Moderating Content Creation

Creating an Area for Content Creators

Content creation is one of the coolest aspects of a community! Even those that do not create themselves can celebrate the passion and excitement that comes with sharing art. Artists shouldn’t be relegated to just a generalized #media channel where all users are posting photos; consider instead giving them their own designated area in the server. This shows these users that moderators see their contributions to the community and appreciate what they are doing. This area can be a channel dedicated to sharing art and content, or even an entire channel category depending on how your moderation team wishes to interact with your community’s creators and how active this part of your community may be. Listen to their needs and expand and modify this category as necessary.

Unique Rules of the Road

When building out a content creation realm in a server, it is important to keep in mind that your moderation team may encounter some new situations that don't apply to the rest of the server. Of course, content creators are subject to the same laws of the land in place for the entire community, but there are some unique rules of the road to consider, including:

Plagiarism. This is the practice of taking someone else’s work and claiming that it is your own. Plagiarizing another content creator should not be tolerated within any creative space. It should be highly discouraged and acted upon with moderator intervention if your community brings an accusation of plagiarism to your moderation team. As moderators, it is important to understand the difference between plagiarism and finding inspiration in someone else’s work. Tracing another creator’s artwork is the most common form of plagiarism, whereas being inspired by an original character to try out a new pose, color scheme, or scene featuring them is inspiration. While creators are often looking out for each other and willing to bring concerns about plagiarism to moderation teams, it is important to be able to look for it yourself by familiarizing yourself with your artists’ styles and reverse image searching images of concern to your team. Be sure to be able to explain to your community why plagiarizing is harmful when these situations arise.

Managing Constructive Criticism vs. Hate. Your content creation channels are going to be accessible to your entire server. This is so that the entire fandom can celebrate together, but also to drive interest from users to support your content creators. This means that the average user can come in and comment on content. There is a line between constructive criticism and hate. Watch out for it as moderators and be prepared to intervene should anything cross the line into attacks or hate-filled commentary that would create an unwelcoming atmosphere for content creation. Oftentimes in creative communities it is an unspoken rule that you should not give constructive criticism unless it is specifically asked for. The average user may not realize this and could accidentally offend an artist. As a moderator, it’s important to help artists understand constructive criticism when they ask for it while shielding them from trolls or baseless hate. Sharing content can be intimidating, so it is especially important to ensure that content creation channels remain positive and respectful environments. One way you can mitigate this issue is by making it clear in your rules that unless the artist specifically asks for constructive criticism, feedback of that nature is not allowed.

Bumping. Art bumping may occur in an art channel where artists feel their content isn’t easily viewed by enough people. This is essentially the act of media getting bumped up in chat from other people sharing their media at the same time or from chatter about other works. An accusation of bumping usually comes up when a creator feels their art isn’t being noticed, or if they believe someone they do not have a good relationship with is intentionally bumping their work. In this case, it’s important to defuse the situation and not allow any forms of bullying by de-escalating the conflict. Maintaining an environment where users respect everyone's work is necessary for the peace of mind of creators and consumers alike. You can also consider building out a channel category instead of a single channel which would allow for a channel dedicated to posting art and a separate one for discussion. You may contemplate a rule of not posting art within a certain time frame of another creator posting, but be cautioned that this can lead to over-moderation by your community.

NSFW content. If your server allows Not Safe for Work content, it is important that you create a specific channel for it that can be marked as an NSFW channel, separate from your regular content creation channels. In line with Discord’s policies, users will not be able to see this channel without agreeing to a prompt confirming that they are not a minor. It is also important to consider that the implementation of an NSFW channel disqualifies you from being a Partnered or Verified Discord server. Make sure to keep the expectations around SFW and NSFW content creation in line with those of your entire server, and offer to answer any questions in DMs if a creator thinks a piece may toe the boundaries you enforce.

Advertising Commissions. If you have a blanket ban on advertisement in your community, you may not want to make an exception to the rule here. However, if you decide to allow advertising commissions in your server, you are allowing more commissions to flow to your creators. Do not allow other users to beg for free art or try to guilt creators with open commissions into providing free content to them. It may be the case that your moderation team will have to enforce boundaries if someone who commissions a creator within your community doesn’t pay them or revokes payment. Conversely, if a creator requires payment up front and then does not deliver the work or refund the commissioner, moderators should intervene and no longer allow them to accept commissions from other members.

To be clear, you are not responsible for their financial disputes or business transactions. Ultimately, creators should look into their specific payment provider website for policy information on fraud and filing disputes, both of which are out of your control. Your job as a moderation team is protecting creators from scammers who make themselves known within your community. You’ve created this space to cater to creators and need them to know that users who take advantage of them and creators who take advantage of users are not welcome here.

Low Quality/Low Effort Art. Something your moderation team should consider is whether or not you will be moderating low quality or low effort art. Lower quality art has the ability to potentially create a divide with more experienced artists or diminish the overall quality of your artistic channels. Expectedly, this is a very subjective and divisive topic. Moderating “low quality” or “low effort” art can run the risk of upsetting younger users or creators that are at the very beginning of learning how to create. When considering moderating low quality art, be sure to display empathy and compassion to avoid coming off as inconsiderate or rude. Be honest and realistic in your descriptions and requirements for these art spaces so that users may have a better idea as to what is and isn’t acceptable both content and quality wise. Other ways to healthily promote higher quality artists are via potential role systems, automatic pins, or weekly artist highlights, which will be discussed in further detail below.

Off Topic Art. As a team, think about whether you want your artist channels to be dedicated to the purpose of your server or if you want to also allow off topic content. Once this rule is decided, check that your moderation team is on the same page for enforcement and nudging should you decide not to allow off topic art.

Engaging Content Creators

There are several ways to keep your community’s content creators engaged, which helps to showcase how much your moderation team values their contributions to the server. Discord has several native features that can showcase your community’s talent in emojis, stickers, banners, and server icons. While a banner and a server icon are important branding elements and thus rarely changed, generating emojis and stickers (especially from within your community) is a good way to bond, celebrate inside jokes with your community, and show some love for your creators. Oftentimes communities will hold certain yearly opportunities like emoji elections where creators can submit emojis for consideration and allow your community to vote as a whole.

Continued engagement with your content creators is also important. If you are engaging your community with generalized game or server events, examine whether you can engage your content creators in the same way with art events or monthly prompts as this promotes community bonding. If your community has a system to reward winners for their work or participation in events, work to instill the same kind of system for art adjacent events or prompts.

Finally, some communities may want to introduce a special role for content creators, especially those that are active and constantly contributing quality work. This showcases artists to the rest of the server. Do keep in mind, however, that unique role colors can lead to inadvertent exclusivity and a social hierarchy within the server. This can also have the effect of alienating artists that do not yet have the role, which is why you should be careful when thinking about whether you want to introduce this role to your community. If you decide to bring a specialized role into your server, take the step to have clear criteria for users to qualify for it as well as easy rules for moderators to grant it to users. This ensures that your moderation team can avoid accidentally leaving someone out and hurting someone’s feelings. Avoid bringing a role into your server if your server has had problems with role-related hierarchies in the past. Listen to your community’s needs and anticipate potential problems!

Conclusion

Content creators are an exciting subset of fandom that should be welcomed to your community! Ensure that they have their own area to share all forms of content in, whether it be a channel or an entire channel category. Be aware that content creation arenas often come with unique rule considerations that you may not have encountered previously in the daily moderation of your server. Talk to your moderation team about everything before launching this channel or category so that you are all on the same wavelength with enforcement before jumping in. Continuously engage your creators and involve them in the artistic aspects of the server, such as emoji and sticker creation. Art can bring people together, and having a healthy artistic space within your community will provide a new way for your community to bond and celebrate your fandom!

Moderation

Patreon X Discord

What, Exactly, is Patreon?

Let’s start with the obvious question: what is Patreon? In case you are unfamiliar with Patreon, it is a subscription service for content creators similar to YouTube Memberships and Twitch Subscriptions. Supporters (hereafter referred to as “patrons”) can pay a monthly fee to access content from creators they are supporting. This private content can cover a wide range of formats, including text posts, polls, videos, images, merchandise, and more. The possibilities are truly endless and you, as a content creator, have control over how much value you place on the content you produce and distribute over Patreon.

Patreon also uses tiered rewards based upon monetary commitment. For example, let’s say you are a comic book creator that can generate an entire comic in a month’s time. You could charge patrons $5 to access works in progress, maybe snippets of the story you’re writing, or behind-the-scenes streams of you working on an anticipated page or panel. At the next $10 tier, patrons can gain access to the whole comic as a reward, and maybe even access polls to help you decide what to do next. Further, a $25 reward could be that patrons have complete access to your entire library, or get a physical edition of your comic book sent to them! The ideas are endless when it comes to what kind of rewards you can offer your patrons.

In fact, one of the new ways to explore rewards systems for patrons is via Discord.

Why Create Discord Rewards?

For content creators that use platforms like Patreon, providing rewards to patrons is an absolute necessity because it directly benefits those who choose to support you. One of the simpler rewards creators can provide patrons is a Discord reward system automatically managed by the Patreon Bot. An example of such a reward allows you to assign one or more roles to patrons in your Discord server based upon their subscription tier on Patreon. Once they have their designated roles, you can utilize channel permissions to provide your supporters with exclusive access to aspects of your server like hidden channels, or perhaps even hoist them in the sidebar of your Discord server as an added bonus. This gives them closer access to you as a creator as well as your content and your community. This, in turn, can make them feel appreciated as supporters and perhaps allow them to see a direct impact of their support on your content.

Installing a Discord Rewards System on Patreon

The first step to setting up Discord Rewards on Patreon is to have a Patreon account. Once you’ve started building your Patreon page (or have one completed), you’ll need to head into your “Tiers” section of your page.


Once in the “Tiers” section, pick the tier you’d like to grant access to your rewards in your Discord server. We recommend that this be the lowest tier at which you’d allow patrons to receive a special role in a Discord server that we are presuming is public. Once you’ve selected your chosen tier, you’ll need to connect your Patreon to Discord.

Now, let’s make this reward a reality with the following steps:

  1. Select the “Advanced” option within the “Benefits” section.
  2. Click that great, big, blurple button to connect to Discord.
  3. Allow the pop-up window to authenticate with Discord. You may be asked to sign-in to prove that you’re using the correct account.
  4. Grant the Patreon Bot access to your server by selecting the correct server in your dropdown list.
  5. Make sure to grant the bot Manage Roles and Create Invite permissions.
  6. Hit “authorize” to finalize everything.

Please note that if you created the server you plan on adding to your Patreon, you’ll have no trouble finding it in the server list mentioned in Step 4. If you’ve had a friend, moderator, or maybe even a community manager create your server, you might not see it in this list. Make sure you have the Manage Server permission in order to see it. If you don’t, double check with the server owner and have them add it to one of your roles in order to successfully connect to Patreon.

Customizing Your Discord Rewards

Roles, Roles, and more Roles

With your Patreon page successfully linked with your Discord server, you may notice a warning message in Patreon telling you that roles still need to be created.

To finalize reward set-up, you need to create the actual roles. We recommend creating a different role for each of the Patreon tiers you have and distinguishing the roles by naming them after the tiers’ names or the amount of money pledged by patrons of each tier. For example, if you have a $50 per month Patreon tier named Top Contributor, you could name the corresponding reward role either “$50” or “Top Contributor.” This setup usually works best for future rewards, because you can easily distinguish your followers who send $5 a month from those sending $50 a month, and change role permissions accordingly if you are offering them different levels of Discord access.

Remember that it is important to keep the Patreon Bot’s role above all the other roles you make for this purpose so it can help manage them.

Now that your tier roles are in the server, there should be no more red warning text on the Patreon Tier Creation page. If there is, give it a quick refresh. The disappearance of this text means that you can now check the box “Gives patrons access to selected Discord roles.” If you are using multiple roles for each tier of your Patreon for organization in your server, make sure they’re all added. After setting your channels and permissions spoken about in the next step, double check that the correct role in your Discord server is associated with the proper tier on your Patreon page.

Success! You’ve completed the Patreon and Discord integration! Now let’s establish access levels.

Channels and Permissions

At this point, you’ll want to build out a structure in your server for these new tier roles. Head back to the channel list and decide how you’d like to reward your different tiers. Do you want each tier to have its own channel category, or do you want one channel category with different channels designated for different tiers? Make sure to give each category and/or channel the permissions associated with each role that you want to have access to that area.

We have provided an example based upon the channel list image below so you can see some of our recommendations in action. This example utilizes one channel category for all patrons, but gives certain patron tiers different levels of access.

You’ll notice Patreon allows you to assign users multiple roles when they subscribe to a tier. This makes it easier to manage your private channels and categories across the different role tiers. In our provided example you’ll see that all patrons have access to an exclusive general text and voice channel to talk to each other regardless of tier. However, different tier levels are given access to different additional perks such as sharing social media links for the creator to follow back, polls for future content, and a VIP text and voice chat to better talk to the creator they’re supporting.
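If your server is managed with a bot, the same idea can be expressed in code. Below is a rough sketch using the discord.py library (2.x) that creates a patron-only category visible to two hypothetical tier roles, plus a VIP channel restricted to the higher tier. The role names, channel names, and tier structure are assumptions for the example; the Patreon Bot itself handles assigning the roles, so this only illustrates the permission side.

```python
# Rough sketch (discord.py 2.x) of role-gated patron channels.
# Role and channel names are assumptions; adapt them to your own tiers.
import discord

async def set_up_patron_channels(guild: discord.Guild) -> None:
    # Assumes you have already created one role per Patreon tier.
    tier_5 = discord.utils.get(guild.roles, name="$5")
    tier_50 = discord.utils.get(guild.roles, name="$50")

    # Hide the category from everyone by default, then open it to patrons.
    patron_overwrites = {
        guild.default_role: discord.PermissionOverwrite(view_channel=False),
        tier_5: discord.PermissionOverwrite(view_channel=True),
        tier_50: discord.PermissionOverwrite(view_channel=True),
    }
    category = await guild.create_category("Patrons", overwrites=patron_overwrites)

    # A general channel that every patron tier can see and use.
    await guild.create_text_channel(
        "patron-general", category=category, overwrites=patron_overwrites
    )

    # A VIP channel restricted to the higher tier only.
    vip_overwrites = {
        guild.default_role: discord.PermissionOverwrite(view_channel=False),
        tier_50: discord.PermissionOverwrite(view_channel=True),
    }
    await guild.create_text_channel(
        "vip-chat", category=category, overwrites=vip_overwrites
    )
```

Whether you set this up by hand in the client or through a bot, the key point is the same: the everyone role is denied access and each tier role is explicitly granted it.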

Conclusion

Congratulations on launching your new and/or updated Patreon page! The beauty of this system is that you can continue to create new roles and channel permissions to best serve your community as you continue to grow. Now that you’re done with the basic steps outlined in this guide, all you have to do is edit/add/drop/shift roles around as needed to ensure you are making the most of your Discord Rewards patron program.

Patreon rewards give your fans and supporters exclusives in exchange for supporting your work. Having this automated system to handle and manage your supporters means you can spend more time making new content, and less time worrying about a bot or a function in Discord whenever new patrons sign up. It also makes managing permissions much simpler because you have a lot more control over which roles a patron is assigned based on their subscription tier.

Moderation

Understanding and Avoiding Moderator Burnout

Moderator Burnout: What is it?

Starting a new job, position, or activity is always exciting. Most moderators are eager to help a community that they love and that they are an active part of to grow and prosper. You spend a lot of time there already, so why would you not want to do your part in helping that community be successful? However, as time passes, interests can change and initial enthusiasm can wane. As a moderator, you might find yourself spending time in other communities, or you may realize your real-world schooling and work take priority over moderating, and anxiety can build as you try to juggle all of your responsibilities. Maybe your mental health is being affected by spending many hours a day on the internet, dealing with trolls and people who simply want to cause trouble. Dealing with negativity and conflict day in and day out can mentally drain you. When moderating begins to feel like a chore, as opposed to a hobby, that’s when you might feel like moderator burnout has set in.

Burnout is the emotional, mental, and physical exhaustion you feel after a prolonged period of stress brought on by certain activities. Spending too much time in a stressful environment can easily lead to feelings of exhaustion, feeling distant from the activity or task at hand, or simply just having negative feelings when thinking about doing the activity or task. These negative effects on your body and mind eat away at you until you feel like you are at your limit and just do not have enough energy and motivation in you anymore to continue moderating. Sometimes burnout makes it feel like the only thing that can make you happy is to stop being a moderator completely.

Signs of Moderator Burnout

One of the most common signs of moderator burnout is when you have been noticing yourself being less and less active when it comes to moderating, as well as being less active in the server itself. As a moderator, you have begun to feel like it is a chore for you to be moderating the server; something you feel forced to do, knowing it needs to be done with no joy attached to the task at hand. Time seems to move so slowly when you are moderating as you are constantly checking the clock, just hoping that an hour of your time is enough on the server. Your time spent on your server becomes less and less as the days go by until you have either stepped down or completely withdrawn from any activity in the server, moderation or otherwise.

A growing lack of participation is another sign. If you know yourself well enough, you can probably tell when something is bothering you. Perhaps on any normal given day, you are social and engage with your other team members as well as with regular server members, but recently you’re only chatting in public channels, giving out the occasional public warning. You notice that you only really check the staff channels when there is a ping. Your account may be in the server and your name on the member list, but you’re no longer an active community member. At this point burnout has set in. There may be the urge to come in every day and give 100%, but then you run the risk of giving too much, too quickly. You start to dread the amount of work necessary to do your part and eventually start to taper off.

You might notice yourself making more mistakes than normal. Frustration is another part of burnout that can affect the mental side of moderating. Feeling like you are making too many mistakes, or are not doing as much as another moderator, is a hard thing to pin a reason on. Feelings of inadequacy may lead to reprimanding yourself internally, being your own worst critic, finding yourself in a rut, or thinking everyone in the server is being difficult. You know you are making mistakes, and others on your team see it too, thinking they are helping by giving you constructive criticism. That lack of accomplishment makes you frustrated, especially in moderating. Nothing you do feels like it is being done right, and when you add the difficulties of dealing with members of the server, frustration kicks in. When you tell members of the server to do something and they continue to post against the rules, you begin to wonder if you matter or are even making a difference.

Burnout can easily affect your attitude. Moderating tests your patience as some members of the server will purposefully push the line, seeing what they can get away with. As a new moderator, you might not be as strict, as you are wanting the community to not only respect you but to like you. As time passes, your patience may begin to run thin and you put up with a lot less while you get more irritated. You have a shift in your attitude and can become bitter at what you have to deal with when moderating a server.

Avoiding Moderator Burnout

After taking time to look at the signs of moderator burnout, it is important to know how to avoid it so you and your team can find your groove again and remember what it feels like to enjoy being not only a moderator but a community member.

Everything is healthy in moderation, including, well, moderation. Knowing when you need breaks and encouraging yourself, as well as other moderators, to take these breaks can really help with mounting stress levels. Moderating takes a toll on your mental health, so being able to step away and catch your breath can really help reset your focus. Reach out to other members on your team who can help with moderator duties and shoulder the workload so those that are experiencing burnout can feel that it is okay to step away. A big part of working on a team is being honest with each other when things are good and when things are not so good. You may be surprised at how eager your team members are to help prop you up when you’re feeling low. Remember: you're not alone in this, so feel free to take those breaks and be assured that the server is not on fire and that there are eyes other than yours to share the workload. Offline life and your overall health should always come before any aspect of online life.

Make sure you are not taking on too much, but also have enough to do. This can be a tricky balance. Depending on your moderation experience and skills, it can be hard to determine what your workload capabilities are, especially as offline life changes. Be upfront and honest about what your team expects from you, and let them know when you are feeling overwhelmed so you can talk it through. Always have an open line of communication so you can find ways to help yourself, as well as the others you moderate with. Something that has worked on larger servers, and that you might be able to incorporate into your own, is keeping a summary of the channels. With the summary, moderators on your team can sign up for the channels they enjoy moderating and not feel like they have to be in too many channels at one time. Delegating work makes it seem more manageable and less daunting. Encourage yourself and your fellow moderators to try new channels after a couple of weeks to change everyone’s scenery, as well as allowing various team members the chance to interact with server members that might only hang out in channels they don’t normally moderate. As a moderator, you might feel yourself wanting to do more and seek to add to your responsibilities. Suggesting community events and helping organize and run them makes for a great change of pace compared to the normal moderating duty of watching chat. Events are great for bringing regular members together with moderators. It is a fun task that can bring activity levels up and spread some excitement.

Create an environment that is fun to moderate in. Having or suggesting a staff channel where you and your fellow moderators can be yourselves or vent is a great way to relieve some stress and have your team get to know one another. Ask them questions every day to encourage discussion and communication between your entire team. Building a team that works well with one another helps communication between fellow moderators, so they can express how everyone is feeling and seek advice on stresses in the community. You can get to know each other’s personal lives and what other things in life might be causing outside stress. Getting to know each other’s personalities can help determine if someone might easily burn out or if they are just more introverted than other moderators on your team. Holding events, such as game nights, where you all play a game online together can really help you and other moderators feel at ease, bring enjoyment to everyone, and help everyone reset for the next day of moderating.

One of the biggest, and perhaps simplest, things you can do to help with moderator burnout is just being thankful. Typing the two little words of “thank you” can go a long way. As a moderator, you spend your free time helping the server, so let your fellow moderators know you appreciate that they chose this as their hobby.

To that end, positive specific feedback is one of the best ways to let someone know that they did a good job and what exactly it was that they did well. By being specific about what you’re thanking your moderator for, you’re letting them know that their hard work is recognized, valued, and seen. Recognize when they put in a lot of hours on the server and are here on a day-to-day basis. Finding ways to reward them, whether through gifting Nitro or a special recognition in the server, can be really fulfilling to them. It reassures them that they are an important part of the server and are making a difference when helping. Be gracious with your words and remind them that they are here with you. It starts a chain reaction and you will see moderators thanking other moderators for their hard work.

Summary

Moderator burnout can happen to anyone at any time. It is important to understand what it is and how you can help. Whether you are an owner, administrator, or another moderator, it is important to support your team and look for the signs of burnout so you can suggest ways that might help them in how they are feeling. Always have an open line of communication with your team that fosters honesty. Encourage them or yourself to step away when feeling stressed or overwhelmed, and thank each other for helping in the server. Servers are a lot harder to run when doing it alone; moderators that are excited to be there are an important part of making sure things operate smoothly.

Moderation

Fundamentals of Family-Friendly Servers

How Family-Friendly Servers Differ From Other Servers

Because you are dealing with young people, family-friendly servers may become what many refer to as Home Servers. These are online communities that are perceived to be safe spaces that people come back to time and again, like a home. Thus, such servers may attract users for different purposes than a community meant for college students might, for example. The moderators of family-friendly servers can be viewed both as authority figures and as older siblings or friends that their community may look up to in unique parasocial relationships. They will also have the unique experience of seeing many of their young users mature and grow along with the server.

When forming this kind of environment, it’s important to make sure your moderators are naturally empathetic to the unique situations that will arise within member reports as a result of interpersonal relationships amongst younger users.

During the foundational stages of server building, make sure to keep ideas such as privacy concerns, text filters, appropriate topics of discussion, and rule continuity at the forefront of your mind. Consider how you will handle users who are above the traditional age range of family-friendly servers and who may not wish to abide by these rules.

Privacy Concerns

While privacy concerns exist on any platform, they are something to be especially aware of when moderating family-friendly spaces. Rules that cover privacy and internet safety may be more strictly enforced to protect younger server members from any harmful actors looking to take advantage of less experienced internet users. Oftentimes, these concerns can culminate in a blanket rule stating that sharing any personally identifying information in the server is not allowed. This goes beyond exact locations and can include full names, ages, and even face reveals.

Revealing exact locations and full names is generally frowned upon in most online communities because that information can be used to find more about a potential doxxing target. Forbidding users from sharing their age adds a level of privacy that protects younger members seeking community on the Internet from being taken advantage of by those with malicious intentions. Younger users may not understand how risky age differences can be to navigate. Face reveals can be reverse-image searched to learn further information about users who aren’t protecting their online privacy, or even be stolen and used in various forms of bullying. As a result, many family-friendly servers find it best to limit all personally identifying information beyond first name and country if users aren’t choosing to utilize an online persona to protect themselves.

As moderators, it’s important to help educate young users about how to maintain confidentiality and use discretion when revealing personal information. This starts with recognizing and not clicking malicious links and steers into conversations about reporting things that make them uncomfortable in any way and protecting their private information. Keep an eye on how others interact with younger users and act accordingly, as a victim may not realize they are a victim in some situations.

Text Filters and Topics of Discussion

Text filter implementation and determining which topics of discussion are appropriate for the server may be stricter than in the average community as a result of building a safe space for younger users. There is generally no room for crudeness in any form. This goes beyond using text filters to monitor hate speech, which is against Discord’s Community Guidelines, and extends further to include curse words, innuendo, illegal substances and activities, gore, dark humor making light of serious situations, any negative and harmful rhetoric from within the fandom your community supports, and more.

Be sure to adapt your text filters as new situations and filter evasions arise within your community. It is often recommended to keep your blacklist private to avoid exposing harmful terminology to users. Additionally, make sure your text filters mirror the topics of discussion you aren’t allowing in your server and clearly outline those preferences in your server rules by noting that your environment is strictly SFW and any NSFW content is not allowed.

Automoderation can be utilized to mute or ban users who are engaging in problematic discussion around hate speech, and moderators can use their judgment on other blacklisted words as long as your moderation bot is set to log filtered words in moderation channels.
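As a rough illustration of what a bot-side text filter can look like, the sketch below uses the discord.py library (2.x) to delete messages containing blacklisted words and log them to a moderation channel for human review. The word list, channel ID, and chosen actions are assumptions for the example; most established moderation bots, as well as Discord’s own AutoMod, offer equivalent functionality out of the box.

```python
# Minimal sketch (discord.py 2.x) of a private keyword filter with logging.
# The blacklist, log channel ID, and actions are illustrative assumptions.
import discord

BLACKLIST = {"exampleword1", "exampleword2"}   # keep the real list private
LOG_CHANNEL_ID = 123456789012345678            # hypothetical moderation channel

intents = discord.Intents.default()
intents.message_content = True
client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message) -> None:
    if message.author.bot or message.guild is None:
        return
    if BLACKLIST & set(message.content.lower().split()):
        # Remove the message, then log it so moderators can use their judgment.
        await message.delete()
        log_channel = message.guild.get_channel(LOG_CHANNEL_ID)
        if log_channel is not None:
            await log_channel.send(
                f"Filtered a message from {message.author} in {message.channel.mention}."
            )

# client.run("YOUR_BOT_TOKEN")  # supply your own bot token to run this
```

Whatever tool you use, the important part is the logging: filtered words should end up somewhere your moderators can review them and apply their own judgment.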

Rule Continuity

As mentioned in the previous section, rule continuity is a big deal in spaces meant for young people to ensure equal enforcement across everything. If you’re filtering certain content, discussing that topic should also not be allowed in your server. This goes for both general chatting and for various other forms of server usage: any place where there can be user-generated content. One such example is meme or art channels. While it’s a great idea to foster and encourage a healthy and positive artistic environment, content that is NSFW, hate speech or politically motivated, and creations that may lean into age-restricted content like alcohol consumption or gore should be monitored and discouraged. If you have an active music channel set up, it might be worth keeping an eye on playlists to make sure the content being monitored elsewhere doesn’t slip by here.

With the release of server avatars, it is less likely that rule-breaking content can find its way into your server via this route, but consider moderating statuses of active users, user profiles, and profile pictures if it does. It also helps to have your moderators lead by example and follow the same rules in their profiles that you are looking to enforce elsewhere. Continuity is key for equal and unbiased rule enforcement.

Considerations for Users Above Your Target Age Range

While family-friendly environments exist to be safe spaces for young people, you may also find yourself with users who are above your target age range. Their presence should not be discouraged, but it’s important to ensure that your rules and welcome messaging clearly define your space as family-friendly so that they are aware of the expectations for ANY user who joins the server. Most will likely accept the rules you’ve put in place and the space you are trying to cultivate. However, others may challenge restrictions that they find childish or assume don’t apply to them. Establishing language around why your community is meant to be family-friendly, and why it’s important to keep it safe for everyone as a result, will be helpful if these conversations arise. Users who rebel against rules they do not agree with should be actioned accordingly.

If you notice a large portion of your userbase seeking out your community but growing frustrated with its family-friendly restrictions, you can consider opening locked channels for mature users who either opt in or show proof of their age. If you decide to do this, make sure you have a separate set of rules in place for these channels, as well as an identification vetting process, so that younger users are not exposed to content you don’t want them to see. This would require moderation that is different from the rest of your server, so it’s important to consider whether your moderation team can handle these locked channels and whether they align with your server’s purpose.

To Recap

Family-friendly communities exist to welcome Discord’s youngest users into safe spaces with the protection of moderation teams. Home Servers are SFW communities that limit topics of discussion more than a generalized server would in order to cultivate that safe environment. As a result, there’s special consideration to be taken around user privacy and educating users about their privacy, expanded text filters, appropriate topics of discussion, and rule continuity when building out the rules and structure of such an environment. Make sure your team is prepared to handle older users who may find these rules constricting by explaining their importance to your community’s purpose.

Moderation

Facilitating Positive Environments

Introduction

The core foundation of a server on Discord is the community that populates it. Your community is what you engage with, protect, and grow over time. Engagement is important to focus on, but it’s just as important to make sure you are facilitating positive and welcoming engagement.

Why Positive Environments are Important

Positive engagement can mean a lot of things, but in this article, we will be referring to the way in which moderation can affect the culture of the server you are moderating. As moderators your policies, knowledge of your community, and deductive skills influence the way in which your community engages with each other and with your team.

When you establish and nurture your community, you are growing a collective group of people who all enjoy at least some of the same things. Regardless of your server topic, you are undoubtedly going to have members of a variety of ethnicities, sexual orientations, and identities from across the world. Ensuring that your space on Discord is a space where they belong means making it safe for them to be themselves, wholly and without reservation. Your members are all humans, all community members, all people who deserve respect and deserve to be welcomed.

Establishing Community Boundaries in Moderation

When you are establishing your community, it’s important to have a basic understanding of what kind of environment you would like your server to be. It’s good to break down the general moderation philosophy on what content and discussion you’d like your community to engage in and what content would be inappropriate given the space. Depending on the topic of your server these goals may be different, but some common questions you can ask to establish general boundaries are:

  • What is the main topic of my server? When you’re thinking about the community and their impact on the growth of your server, it’s important to decide what kind of server you want to build on a basic conceptual level. If, for example, you are creating a politically-driven server, you might have different limits and expectations for content and conversation than a server based on Tetris or pets.
  • What topics do I expect users to engage in? Some servers will expect members to be allowed to discuss more sensitive, controversial, or thought-provoking topics, while others may feel that these kinds of heavy debates are out of place. Video game servers tend to have a no-politics rule to avoid negative debates and personal attacks that are beyond the scope of the video game(s) in question. Servers centered around memes, real-life topics, or socializing can allow a wider range of topics and have looser rules, while servers centered around mental health or marginalized communities can lean towards a stricter, on-topic-only community policy.
  • What would I like to foster in my community? While knowing what to avoid and moderate is very useful, having an idea of what kind of atmosphere you’d like the server to have goes far in setting the mood for the rest of the community at large. If users notice moderators are engaging in good-faith and positive conversations and condemning toxic or hateful discussion, it is more likely that your users will join in and participate in that positive conversation. If they see you and your mod team have taken the initiative to preserve the good atmosphere of the community, they are moved to put in the effort to reciprocate.

Moderating Hateful Content

When it comes to the content you allow or moderate in your server, it’s important to, again, reflect on what type of community you are. It’s also important that you act quickly and precisely on this type of harmful behavior. Some users will slowly push boundaries on what type of language they can ‘get away with’ before being moderated.

When discussing moderation, a popular theory that circulates is the broken windows theory. This theory holds that visible signs of antisocial behavior, civil unrest, disorder, and crime in an area encourage further antisocial behavior and crime. Similarly, if you allow an environment in which toxic and hateful behavior is common, the cycle will perpetuate into further toxicity and hatefulness.

What is Bad-Faith Content vs. Good-Faith Content?

‘Bad-faith’ content is a term that describes behavior done intentionally to cause mischief, drama, or toxicity in a community. The people behind it are commonly referred to as bad actors, and they are the type of people who should be swiftly dealt with and addressed directly.

‘Good-faith’ content is a term that describes user behavior with good intentions. When these users form a positive foundation in your community, the members who join and interact with the established community will grow to adapt and speak in a way that continues the positive environment that has been fostered. It’s important to note that while ‘good-faith’ users are generally positive people, it is still possible for them to say something wrong or even harmful. The importance of this distinction is that these users can learn from their mistakes and adapt to the behavior you expect of them.

When users deliberately push that line, they are not acting in good faith. As moderators, you should be involved enough to determine what is bad-faith content and remove it. On the other hand, education is important in the community sphere for long-term growth. While you can focus on removing bad behavior from bad-faith users, reforming good-faith community members who are simply uneducated about harmful rhetoric should also be a primary goal when crafting your community. When interacting in your community, if you see harmful rhetoric or a harmful stereotype, step back and meaningfully think about the implications of leaving content that uses this kind of language up in your channels. Does it:

  • Enforce a negative stereotype?
  • Cause discomfort to users and the community at large?
  • Create a negative space for users to feel included in the community?

Ideas to Help Prioritize Inclusivity

  • Allowing users to have pronouns on their profile. Depending on your server, you may choose to have pronoun roles that members can directly pick from to display on their profile. This is a way to allow users to express their pronouns in a way that doesn’t isolate them. When pronoun roles are part of a larger, more welcoming system, it becomes much harder to tell whether someone picked them because they are LGBTQ+, because they’re an ally, or simply because it was part of setting up their roles. When servers have pronoun systems built into them, this can also allow for community-wide acceptance of pronouns and respect for other users’ identities, and can deter transphobic rhetoric.
  • Discourage the use of harmful terms. It’s no secret that terms such as ‘retard’ and ‘trap’ are used in certain social circles commonly. As moderators, you can discourage the use of these words in your community’s lexicon.
  • Create strong bot filters. Automated moderation of slurs and other forms of hate speech is probably your strongest tool for minimizing the damage bad actors can create in your server. Add the variant spellings people commonly use to slip past the filter as well (for example, a slur with an added or subtracted letter); a sketch of this idea appears after this list.
  • A good document to follow for bot filter and auto moderation as a whole can be found here!
  • Educating your community. Building a community without toxicity takes a lot of time and energy. The core of all moderation efforts should be in educating your communities, rewarding good behavior, and making others aware of the content they are perpetuating.
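
To illustrate the filter-evasion point from the list above, here is a rough sketch of normalizing a message before matching it against a blocklist. The substitution table and the blocked term are made-up examples; note that collapsing repeated letters also increases false positives, which is one more reason to keep a censor log that moderators review.

```python
import re

# Illustrative substitution table for common character swaps (not exhaustive).
SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                               "5": "s", "7": "t", "@": "a", "$": "s"})
BLOCKED = {"exampleslur"}  # hypothetical blocked term

def normalize(text: str) -> str:
    text = text.lower().translate(SUBSTITUTIONS)
    text = re.sub(r"[^a-z]", "", text)     # strip spacers like . _ - and spaces
    text = re.sub(r"(.)\1+", r"\1", text)  # collapse repeated letters
    return text

def is_blocked(message: str) -> bool:
    normalized = normalize(message)
    return any(term in normalized for term in BLOCKED)

print(is_blocked("ex4mple_sluuur"))  # True, despite the swapped letter and padding
```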

A core part of all de-escalation is your approach. Users who are heated up during a frustrating or toxic discussion are easy to set off or to accidentally escalate into more toxicity. The key is to type calmly and to make sure that, however you approach someone to de-escalate, you do it in a way that is clearly for the benefit of everyone involved.

Closing

Creating a healthy community that leaves a lasting, positive impact on its members is difficult. Moderators have to be aware, educated, and always on the lookout for things they can improve on. By taking the initiative on this front, your community can grow into a positive, welcoming place for all people, regardless of their race, gender, gender identity, or sexual orientation.

Moderation

Ban Evasion and Advanced Harassment

Understanding High Conflict Persons (HCPs)

The vast majority of community members are interested and willing to participate according to the platform rules, even if they might not agree with every one of them. Sometimes people break rules or disagree, but their behavior can be quickly corrected and they can learn from their mistakes. If users continue to break the rules, they may be given longer-term or even permanent bans from the community or platform. Most users will accept their ban, but a small fraction will not.

A 2018 study by Stanford University estimated that 1% of subreddit communities on Reddit initiate 74% of all conflict on the platform. The individuals driving this conflict rank extremely low in the agreeableness personality trait and have little interest in getting along with others. Only a trained clinical psychologist can diagnose someone with a disorder, but a term commonly used to describe this pattern of behavior is HCP (high-conflict person). There are four primary characteristics of high-conflict personalities; this is not a diagnosis, but a description of specific conflict behavior:

  • Preoccupation with blaming others (their "targets of blame")
  • All-or-nothing thinking
  • Intense or unmanaged emotions
  • Extreme behaviors (often what 90% of people would never do)

If you fail to use tact in your moderation technique and communication approaches, you may find that you or your community become the target of a high-conflict person. They may spam your community and you may delete their posts and ban their accounts, but more accounts can be created. Discord uses IP bans to prevent users from creating new accounts, but VPNs can be used to circumvent these bans. If truly motivated, armies of bot accounts can be created and used for mass-spamming, members of your community can be doxed, and ISPs or platforms can be DDoS’d to create fear in your community. If a high-conflict person gains access to money, they can pay somebody else to do the work for them.

Most moderators choose to simply wait out the harassment. Advanced harassment like this may go on for several days, or even weeks, but then stop abruptly as the individual turns their attention to something new in their life. In some cases the harassment can go on for months, continuing to escalate in new ways that may put the lives of your team or community members in danger.

What can you do to protect your community from High Conflict Persons? What motivates a person to behave like this? This article will help to explain the motivations behind this persistent, destructive behavior, and provide actionable steps to reduce or resolve their harassment.

The Virtue of Battling a Nemesis

A “nemesis” is an enemy or rival that pursues you relentlessly in the search for vengeance. A nemesis typically holds some degree of fascination for a protagonist, and vice versa. They’re an antagonist who’s bent on revenge, who doesn’t go away, and who seems to haunt the mind of the protagonist. They’ve moved past being an enemy to become something much more personal.

You might assume that a high-conflict person harassing your community is your nemesis, but this would be incorrect. You’re not going out of your way to obstruct their behavior; your primary focus is to engage and moderate your community. If the harassment stopped, you would move on and forget about their behavior. You resist their behavior only as long as it falls under your realm of influence.

In their mind, you have become their nemesis, and you must be punished for your insolence.

To them, you are the Architect of an oppressive Matrix, the President Snow of an authoritarian Hunger Games, the tyrannical Norsefire government in V for Vendetta. You or your community represent the opposite of what they believe. In one way or another, either by your direct actions or through your association with your community, you have wronged them and deserve to suffer for your behavior. It’s clear that you will never learn or understand what they see. You not only participate in creating the corrupt and unjust system that they are oppressed by and fight against, but as a moderator, you are the very lynchpin that maintains the corrupt system.

You may believe this sounds outlandish, and you would be correct. Most people don’t believe that the world is out to get them, and that they’ll be hunted down and persecuted for what they believe. These individuals have an overactive threat detection system that makes them believe that you or your community are actively plotting their downfall. They take your opposing stance as a direct challenge to their competence, authority and autonomy. They harass you and your community because they believe that you’re out to get them, or want to replace them and their way of life. The truth is, all you really want them to do is follow the rules and maintain a civil conversation.

Understanding Tactical Empathy

Now that you have a better understanding of how somebody like this thinks, we’ll discuss the strategies that you can employ to solve this problem. The goal is NOT to get them to seek help or change their mind; we aren’t attempting to solve people. Instead, our goal is to prevent or stop certain negative behaviors that keep happening so that you can protect your community and focus your energy elsewhere.

The key to getting an individual like this to change their behavior is through utilizing “tactical empathy”. Tactical empathy is the use of emotional intelligence and empathy to influence another’s behavior and establish a deal or relationship. It is not agreeing with them, but just grasping and recognizing their emotions and positions. This recognition allows us to act appropriately in order to respond to our counterpart’s position in a proactive and deliberate manner.

The premise behind tactical empathy is that no meaningful dialogue takes place when we are not trusted or we are perceived as a threat. In order to get someone to stop harassing your community, you need to shift yourself from being the villain of their story to being just another random person in their lives. You must work to shatter the persona that they have projected onto you and show that you are not the enemy working to destroy them. You’re just a mod trying to keep your community safe.

Demonstrating that you understand and respect them as an individual will disarm them and allow them to focus their energy elsewhere. It will not change their opinion, but at least their behavior will change.

A Mod’s Guide to Situation Defusal

When somebody continues to harass or disrupt your community, they’re essentially holding your community hostage. If someone truly is holding your community “hostage”, they’re often doing so because they’re looking to open a dialogue for negotiation. Frequently, people take hostages because they need somebody to listen. They aren’t getting the attention that they believe they deserve, and attempt to cause as much disruption as possible in order to make their case.

You are a community moderator negotiating the peace of your community, not their lives, but these tactics can still apply.

Situation defusal can generally be broken into three primary processes, each designed to collect information and use it to dissuade the high-conflict person from seeing you as an enemy or threat. These processes are called The Accusations Audit, Mirroring to Understand, and Getting to “That’s Right”.

The Accusations Audit

An accusations audit is where you focus not just on the things that they believe, but on the things that they believe you did wrong. An accusations audit is not based on logic; it’s based on the unfiltered emotions of the other person.

It’s important that you go through their early comments and messages to understand what prompted this behavior in the first place. This might have been banning them for breaking a rule or not properly punishing another community member that they got into an argument with. They might believe “I feel like you didn’t give me a chance to explain myself” or “I feel like you’re discriminating against me”.

Your understanding of their beliefs will be flawed and inaccurate, but you must do your best to piece it together into a coherent argument on their behalf. If possible, learn more about the other communities they’re a part of. Identify if they’re harassing any other communities, and the reasons for doing so. Are there any commonalities of note?

Mirroring to Understand

Once you believe you’ve figured out why they’re upset with you or your community, mirror their language to verify it. At this point, opening a dialogue might be incredibly difficult if they’re using throwaway accounts regularly. Chances are they do have a primary account they continue to use in other communities, which can help greatly with starting your dialogue. At this stage, you’re still working to collect information about what they believe, directly from the source. Examples of questions you can use to verify their opinions include, “It seems like you believe that I’m being unfair because I didn’t give you a chance to explain yourself.” or “If I understand correctly, you believe I’ve been discriminating against you instead of taking your opinion seriously, is that right?”

Chances are, the responses you receive will be filled with aggression, profanity, and insults. You must ignore all of this and continue working to understand their position and the events that resulted in them targeting your community. Negotiations like this are difficult in voice-to-voice communication, and nearly impossible via instant or private messaging. They will be incredibly resistant at first, perhaps thinking that you’re attempting to trap them into admitting their guilt or ignorance.

When you get them talking to you, mirror that language to get them to elaborate further on their beliefs. An example of dialogue might go something like the following:

Spammer: “It’s bullshit that mods ban strawberry jam lovers because the blueberry jam lovers are afraid of being exposed for who they really are.”

Mod: “Afraid of being exposed?”

Spammer: “Yeah, the blueberry jam lovers are secretly running the world and plotting against anyone who doesn’t believe in the same jam flavor preferences as they do.”

Realistically, blueberry jam lovers are not actually running the world or plotting anything nefarious, but in the mind of the spammer this is undeniably true. And while this example was intentionally mild, you can infer more severe types of conversations that would follow a similar format.

Regardless, as you dig further into what they believe, you’ll notice that the rabbit hole will go very deep and be filled with logical fallacies and obviously disprovable biases that make no sense. Remember that the truth or reality behind what they believe is completely irrelevant, and attempts to correct them will undermine your goals. Your job is to help them explain their beliefs to you to the best of their ability, and for you to understand their position to the best of your ability. Once you believe you’ve collected enough information, you can move to the final step, getting to “That’s Right.”

Getting to “That’s Right”

Once you believe you’ve completely understood their position and what they believe, you can repeat their entire position back to them. Demonstrate your understanding by summarizing it concisely and accurately, regardless of how much you disagree with the position. Don’t focus on their behavior or the actions that resulted in them getting banned. Instead, focus exclusively on the ideology that drove their behavior. Do this until you’re able to get them to say “Yes, that’s right” at least three times, and ask if there’s anything you left out of your summary. If you did miss anything, repeat the entire position again while including the extra information. When reiterating their points, be very careful about restating things that are not true. Do your best to remove personal bias from the statements to focus them back on “absolute truths.”

Their actions are about trying to make a point, but what you’re doing is getting them to make their point without taking action, because you have heard what they are trying to say. If you do this well enough, if you put enough effort into doing this correctly (and it doesn’t need to be perfect), they will know that you finally understand where they’re coming from, that they’ve been heard by you, and that their opinion has been validated. By demonstrating you understand their position, you go from being part of the problem to being a real person. They might not like you, but they will at least (if begrudgingly so) respect you.

End on a High Note

When you successfully reach this state of your discussion, it’s essential that you be careful with your choice of words. There’s a good chance that the spammer will leave your community alone now that they know that their opinion has been recognized. At the very least, you should see an immediate reduction in the number of times they attempt to cause harm.

If they do continue to harass you or your community, it’s possible that you failed to address the primary reason that they’re upset. Open dialogue with them again and follow the steps above from the beginning, or check to see that you haven’t fallen into a common pitfall or mistake.


Common Pitfalls and Mistakes

Below is a list of common examples of mistakes people make during negotiations:

Getting to “You’re Right” instead of “That’s Right”

When using tactical empathy, remember that the purpose of the exercise is to bring their beliefs to the surface and demonstrate that you understand them. If you attempt to tell them what they should believe, you may instead get a “you’re right” and fail to see any change. The difference is subtle, but important. Make sure that the other side actually feels heard, and that you’ve fully understood their position.

Don’t Try to Correct their Opinion

As a reminder: do not attempt to correct or modify their opinion. Remember the purpose of this process. It is not to modify their position or opinion; it’s only to mirror their opinion so that they stop identifying you and your community as a threat.

Be Careful with Tone and Wording

The methodology outlined in this article is designed for conversations in real-life, especially over the phone. It’s unlikely that you’ll be able to get the spammer on an audio call, so it’s essential to be patient with the process and careful with your wording. Formal grammar like punctuation can make a sentence feel more serious or threatening. Use casual phrasing and make an occasional spelling mistake to show you’re human. If you’re uncertain about tone, read the sentence out loud while sounding as angry as you can, and adjust accordingly.

Remember to Ask for Help

The process outlined here can be easily undermined by others who aren’t involved in the process. If you’re working to negotiate with a spammer but another moderator is threatening them in a different conversation, you won’t see any changes in their behavior. Communicate with your team on the strategy you plan to use, and remember to ask for emotional support or step away if it becomes too taxing.

Can High-Conflict People be Helped?

There will be some of you who believe that after getting this far, you may be on the path to rehabilitating a person like this. The mistake is believing that you are further along than you really are, or that you’re qualified to help someone struggling to control their emotions. The truth is, getting to “that’s right” is only 1% of the process.

Even if you’re a clinical psychologist, you wouldn’t be getting paid for your work, at least not this work. Attempting to provide support via text chat will have diminishing returns. Attempting to show somebody like this the “error of their ways” may result in all of the work you have done being reversed.

Instead, you must focus on the people who want your help and who need it: the people in your community. Empower the people who are truly deserving of your time and energy. At the end of the day, you’re a human and a moderator. Your primary focus in this realm is to make sure your community is safe and stays safe, and if you’ve managed to get the persistent spammer to stop, then you’ve accomplished what you aimed to do.

Moderation

Confidentiality in Moderation

Considerations Regarding Leadership

Whichever moderation roles a server may have, there should always be an authority role that can make calls at their discretion if they believe it is the best thing for the community. A good example of how to do just that can be found here. Moderation administrators, leaders, managers, etc. should always be prepared to make judgment calls on the information provided to them, whether by mods or users. A very common misconception among moderation teams is that they should share all information amongst the team for transparency. This can be a double-edged sword: disclosing private information that is not essential for a moderator to know opens more routes for that information to be distributed without authorization. If this occurs, it will compromise the privacy and trust of the users the information applies to. In sensitive situations involving very volatile information, consider whether it may be beneficial to have it handled directly by a team leader or even the owner of the community.

Personally Identifiable Information

Personally identifiable information (or PII) is any information that can identify a user, such as an email address, full name, phone number, IP address, exact location, or even their Discord user ID and username.

People should never disclose someone else’s personal information, and should only share their own in an appropriate environment. Disclosing others’ info can be treated as doxxing, the disclosure of personal info by a third party (for example, someone posting another user’s address), and can in some instances be actioned on by Trust & Safety as it may violate Discord’s Terms of Service or Community Guidelines. User IDs and usernames are acceptable to share as long as there is a justifiable need to disclose them, but always consider whether there may be repercussions for that user if they are disclosed.

PII is very sensitive as it removes a user’s privacy and can result in them being targeted online or even in real life. Thus, this information should always be protected with the utmost discretion. Moderators may come into contact with it in a message they have to delete, in someone maliciously doxxing another person, in a user accidentally sharing it without realizing the harm they are exposing themselves to, or in information included in a report. This information typically should not be disclosed to anyone, and community leaders should consider removing it from bot logging channels to protect a user’s identity.

Also consider encouraging members of your community to learn how to safeguard their own information. You can include rules within your communities that discourage the sharing of even one’s own personal information. As important as it is to protect other users, it is just as important to help them protect themselves. Users may sometimes share their information out of good will or as a way of attempting to bond with others, but bad actors can use that information maliciously.

Personal Matters

Personal matters can refer to a huge range of information, but some common examples include relationships, interpersonal conflicts, previous history, or things as simple as a DM or private conversation. As a moderator you will very likely come across information like this as part of reports, concerns, or even someone breaching trust by screenshotting and sharing private messages. This information is extremely important to protect, as people trust you to keep it private and use it only to take care of the issue at hand. Exposure of this information can be very harmful and can result in targeted harassment, bullying, or other negative consequences. Stories of such exposure can make people worried about reporting something for fear of it happening to them, which in the end makes it very difficult for moderators not only to reassure users, but also to set things right.

Moderation Information

Most public communities have ways of protecting their server with moderation tools, actions, and procedures. This includes moderator actions such as warnings, kicks, mutes, bans, etc. Moderation actions can be especially sensitive when they involve a specific user. Moderation info can even include internal details such as protocol, procedure, censor lists, or bot details.

Moderation information is something that can vary from server to server, and thus it is largely up to the discretion of each moderation team to establish their own server rules to enforce. Some may have full transparency with an open log channel, and some may take a more confidential approach and only speak with those involved. Both have their pros and cons, but be sure to weigh what could happen if people know who receives which penalties. For protocol, always remember to carefully decide what to share publicly, as disclosing a procedure can lead to someone using that information to evade moderators or even exploit the server. This also stands true with bots, as disclosing bot details such as configuration or censor lists can result in users evading the protections put in place by your team.

Handling Information with Users vs. Mods

There are many different forms of information that must be considered heavily before disclosing to different people, whether they be users or other mods. Information can range from sensitive personal information such as emails, names, phone numbers, location, IP addresses, etc. to community-related information such as mod actions, previous incidents, and user history. Regarding users, very little should be shared with people who are not involved. When it comes to fellow mods, it is best to share as much information as is reasonable aside from personal information to ensure everyone has a well-informed mindset.

Some questions to consider when speaking with users include:

  • Is the user in question involved in the situation?
  • If disclosing mod actions, is the user the one that was penalized?
  • What reason does the user have to need to know this information?
  • Were they the victim? The perpetrator? Just a bystander?
  • Would it compromise someone’s privacy to disclose something?

As for mods and members of the internal team, mods should of course be “in the loop” and know the story of a situation; it’s never recommended to keep mod teams in the dark. That being said, even with other moderators, be careful about sharing unnecessary information, especially personally identifying information, not only because there is often little benefit to it, but primarily because it compromises a user’s privacy even behind closed doors. While there are fewer factors to consider, they are still just as important as the ones you would weigh for another user.

Some things to consider when disclosing to moderators include:

  • Is the mod involved in the issue directly?
  • Is the mod an “NTK (Need-to-know)” team member? These members include team leadership for serious cases.
  • Does the mod have a reason why knowing this information would be beneficial?
  • Would it be a conflict of interest for the mod? (If the mod has a personal history with people the information relates to.)

Remember that if you aren’t sure if you should disclose something related to moderation, always ask an administrator/leader on your server for guidance, and always dispose of private information if it is not needed.

Benefits of Confidentiality

It may be easier to be fully transparent and not have to check every sentence before it is said or sent. That being said, there are many benefits to upholding a consistent, confidential environment where staff act with discretion when assisting with a variety of matters. There are many consequences if confidentiality is not upheld properly. Below are some examples of the benefits of protecting information as well as the consequences that can come with being overly transparent.

Keeping Pseudonymous. As stated in Discord’s Safety Principles, Discord is pseudonymous, which means that your account on Discord doesn’t need to be tied back to your identity. Users who provide information as evidence or otherwise may sometimes expose who they are, and protecting this information reassures them that their personal life won’t be compromised by socializing with or confiding in a server’s staff.

Trust. Users will hold high trust in a staff team if they are confident that their expectations of privacy will be respected by the team they confide in. If that trust is not upheld, users will find it difficult to trust the team, and may heavily contemplate or even refrain from contacting a moderation team again in the future.

User Safety. Diligent protection of user data and information helps protect users by preventing unwanted data from getting into the wrong hands. If information is not guarded and falls into the wrong hands, it can result in targeted harassment or bullying, as many private details can reveal information to malicious individuals.

Moderator Safety. Keeping moderation actions confidential and only disclosing information to people who need to know helps preserve moderator anonymity and reinforces the idea of a team decision. Disclosing moderation actions and who performed them can put a target on the mod, as people may hold them personally responsible for an action, which can result in harassment or disrespect from users who may not understand the decision.

Personally identifiable information being shared outside of need-to-know groups can compromise users and make them feel as if they may need to sacrifice their Discord account to retain personal privacy. This leads to a loss of trust from the member, and perhaps even the loss of them as a member of your community.

Designing the Server for Privacy

There are multiple things to be mindful of when considering privacy and confidentiality, and it extends well beyond standard moderation. Often, privacy will come down to the way the server is configured. Some things to consider include:

Server Discoverability. If an LGBTQ+ server is in Server Discovery, a user may use an emote from that server in another one, and if someone clicks on the emote, it may accidentally expose the user as they may identify as LGBTQ+ privately but not publicly.

Public Join Messages. Some servers may have “welcome bots” or even Discord’s welcome feature that greets new users publicly upon joining. Server staff should take into account the type of community that they stand for, and consider if users may perhaps feel uncomfortable or exposed by being mentioned immediately upon joining.

Security. Automated security and “gatekeeper bots” may be used to prevent malicious users from joining a server on alt accounts or as part of malicious groups. While this seems perfectly normal, the part that has to be considered is what data you are requesting. Some of these bots may collect IP addresses, browser data, and various other forms of information. Users may not be comfortable in supplying information that could compromise who they are. Always make sure to read through the privacy statement of any bot that you add to ensure that you are not asking for too much information from regular members.

Bot Logging. Many servers have private log channels maintained by one or more bots. These track joins, leaves, deleted or edited messages, and more. The main thing to be wary of here is that if personal information is posted for any reason, be it accidentally by misclick or maliciously to dox a user, it will usually appear in a moderator logging channel when deleted. After the situation has been dealt with, owners or admins should consider deleting the log message to prevent personal information from persisting within that channel.

Keeping the Balance

There are pros and cons to any level of disclosure that is offered by a server to its community and its staff. It is not black and white and there are gray areas in both transparency and revealing select information with moderator discretion. There must always be a balance of both that may shift depending on the situation at hand and the type of community that is present. Just as complete confidentiality will lead to distrust, total transparency will lead to users feeling unprotected due to a lack of privacy.

Moderation

Schools X Discord

Benefits of Using Discord in Classrooms

Lots of new opportunities are created for schools that make use of a private Discord server. One of the biggest is a cultural shift away from formality and toward general bonding. Gone are the days of formal communications over e-mail; instead we have open chat discussions with teachers, fellow classmates, and group project members, and even private one-on-ones (similar to office hours!) when necessary.

In this section we seek to discuss the many ways that Discord can help virtual learning. Discord can function as a virtual classroom setting in ways that can generate new opportunities for continued and reliable teamwork while enabling new and exciting ways for students to connect with each other.

Utilizing Discord as a Virtual Classroom

While certain permissions are essential to creating an environment that separates teachers’ and students’ private chats and subjects, a classroom server acts as the overall hub for everything necessary, from a full school, to a specific classroom, to separate groups working on projects, and even a variety of subjects being taught.

Simultaneously, a server can be set up to serve a variety of school-related functions including, but not limited to:

  • A Q&A forum for questions about a shared lesson where students can help each other, but teachers can also jump in to assist
  • Private student-teacher meeting spaces for additional tutoring or conversations regarding behavior
  • Voice channels where students can talk or video call to build out projects in designated groups
  • Break-out channels accessible only by assigned groups for group work that teachers can monitor to ensure equal participation
  • The utilization of screen-sharing to have a live online class in a voice channel for up to fifty students

Discord’s Real Time Communication Abilities

Discord naturally lends itself to communication in ways that allow students to help each other. If one student is stuck on an assignment, any student can answer their question or perhaps join them in a study session to work out the problem together. Schoolwork can feel less solitary, there is less reliance on teachers and aides, and students who may be taking the same subject in different classes can meet and help each other out. This also allows teachers to monitor such a space to gain insight into where their students are struggling so they can adjust their lesson plans accordingly.

Interaction doesn’t have to be limited to just typing in designated channels. Students can interact with each other and teachers in subject-specific audio and/or video channels as well, allowing them to have more personal interactions and cater to different learning styles. However, if you are using these channels to broadcast a lesson rather than promote group work, make sure that you disable voice activity for students to avoid accidental noise from being broadcast throughout your online classroom. This helps create a controlled environment for everyone involved in the virtual classroom space. A sketch of how a bot could configure this is included below.
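
As a rough illustration, the sketch below uses the discord.py library (assuming that is the bot framework in use, and that the bot may manage the channel) to deny the Use Voice Activity permission for a placeholder "Students" role in a placeholder "Lecture Hall" voice channel, which forces push-to-talk. The same change can be made by hand in the channel’s permission settings.

```python
import discord

async def enable_lecture_mode(guild: discord.Guild) -> None:
    # Placeholder role and channel names; adjust to your own server.
    students = discord.utils.get(guild.roles, name="Students")
    lecture_vc = discord.utils.get(guild.voice_channels, name="Lecture Hall")
    if students is None or lecture_vc is None:
        return  # nothing to configure if the placeholders don't exist

    overwrite = discord.PermissionOverwrite(
        use_voice_activation=False,  # students must use push-to-talk
        speak=True,                  # they can still answer questions
    )
    await lecture_vc.set_permissions(students, overwrite=overwrite)
```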

Discord’s Easy to Use Interface Will Make Students Feel at Home

Discord was created for housing online communities, which sets it apart from other e-learning platforms. Your students and staff will feel at home in the Discord user interface because it is easy to use and made for quick adaptation. The natural familiarity that many students bring from their personal usage of the platform, and of other similar platforms, will reduce the friction of adopting a new virtual learning environment and help Discord transition more easily into their daily educational routines.

A common complaint in larger classroom settings, both in person and virtual, is that the environment is impersonal. In a sea of many faces, it is understandable that a student may feel like just another number on the attendance roster. However, Discord’s premise of providing an easily built space meant for easy and informal communication can reduce this lack of personal connection and instead help students feel welcome, included, and easily engaged.

There is Always Fun to be Had in a Healthy Community

When creating your virtual classroom it is important to keep in mind what you want the purpose of your server to be. A server for a specific math classroom taught by Mr. Wumpus at noon every day is going to be geared entirely towards education, whereas a more open-concept Algebra server that welcomes all students currently enrolled in the course under various teachers to a common meeting space, while also being split into their respective virtual classrooms via permissions, allows for antics in addition to education.

Educational settings can also be fun if you choose to build them that way. There is always fun to be had in a healthy community when the school day finishes, and providing this hangout space for students will allow them not only to bond but to associate positive feelings with school. There can be spaces designated to help students study together after school hours or come together for a study break to play one of their favorite games. A music bot can be used to listen to music during an essay-writing sprint, and a leveling system can be used to reward students for participation. A sense of community leads to happiness, which can impact mental health and grades.

Discord is an Extension of your Physical Campus

Discord also brings many advantages to educational arenas outside of the classroom. In an e-learning environment, a server can be viewed as an extension of your physical campus onto the Internet. Similar to a real-life campus, Discord is a meeting place for students and staff to chat, get to know each other, and build a stronger community in places they feel safe. Shy students may even find it easier to bond in an online environment than a physical one, which can lead to new friendships while also improving pre-existing ones. Students who perceive a welcoming environment and have positive feelings about their school and community will often get better grades because of higher motivation levels.

Interestingly, a further extension of a physical campus would be an alumni association. While school-related Discord servers are often for study tools or virtual classrooms, they can also serve as special places for alumni to connect after graduation. They can not only catch up with their old classmates, but also mentor current students. Such a situation can be brought about via unique ideas like career fairs oriented towards advising current students about future career options by making use of a school’s alumni network.

The Disadvantages of Using Discord as a School Platform

As mentioned above, Discord’s widespread usage for personal gaming and its easy-to-use interface mean that a lot of students may already use Discord in their free time or begin to adopt it into their lives after being introduced to it in a classroom setting. The person someone presents themselves as online can differ from who they are in real life. When students are given the opportunity to set a profile picture, usernames and nicknames, and even statuses, things can get out of hand quickly. It may even be difficult for some teachers to identify which student is behind certain accounts if these expectations aren’t established immediately upon server creation.

Given the likely chance that a student might not want to use a picture of themselves as their profile picture, it is important to establish server guidelines about what is appropriate to be in that photo. While some users may not be comfortable using their full name or even their first name in their username, set the expectation that everyone has to change their nickname in your private school-related Discord servers to their real name. On a related note, if a user is subscribed to Discord Nitro, they may also change their profile picture on an individual server basis; however, Nitro is not required in order to use Discord as a whole. If a student is uncomfortable using their real picture for their platform-wide account, our recommendation is to have them make a secondary account specifically for school where they can use a photo of themselves. With Discord’s account switcher feature, having a second account for this purpose is less of a hassle. Finally, when students first enter the server, we recommend enabling Developer Mode to privately connect each individual User ID to a student should someone alter or change their identity or account information in the future. This could be done through the school’s information management system, or just by using a spreadsheet; a minimal sketch of the spreadsheet approach is below.
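
For example, a very small roster lookup along the lines of the sketch below could answer "who owns this account?". The file name and columns are hypothetical; a school information system or a shared spreadsheet serves the same purpose.

```python
import csv

def load_roster(path: str = "roster.csv") -> dict[int, str]:
    """Expects rows like: 80351110224678912,Jane Doe,Algebra Period 3"""
    roster = {}
    with open(path, newline="") as f:
        for user_id, student_name, class_name in csv.reader(f):
            roster[int(user_id)] = f"{student_name} ({class_name})"
    return roster

roster = load_roster()
# The ID comes from right-clicking a member with Developer Mode enabled.
print(roster.get(80351110224678912, "Unknown account, follow up with the class"))
```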

It is also imperative to reiterate the boundary that you expect everything present in the virtual school ecosystem to be school appropriate. Features like profile pictures, usernames, and statuses are platform-wide and appear in every server a user is in. What is appropriate in some spaces may not be appropriate in school, so consider this ahead of time and establish rules, as well as consequences should those rules not be respected. Keep in mind that Discord does have strict Terms of Service and Community Guidelines that prohibit users from using NSFW content and other forms of illegal content as their username, profile picture, and/or status. This ensures the content on their profiles is acceptable to a degree. However, you may not be comfortable if a student is cursing in your virtual classroom, and that can be dealt with accordingly by you.

Conclusion

Discord’s very purpose is to foster communication by bringing users an easy to understand interface that is adaptable to a variety of needs, thus making it a really useful modern tool for e-learning and virtual classrooms. With the right permissions, students can be brought into environments where they can not only learn, but also have fun in a virtual extension of their physical campus. Through utilization of a variety of ways to teach classrooms and communicate amongst groups, Discord naturally fosters teamwork and open forum discussions between students and teachers alike.

However, Discord is still a platform with millions of users exploring a variety of interests. It’s important to enter Discord prepared to handle students who may be using media that is inappropriate for school via platform-wide features like profile pictures, statuses, and usernames, even if it is acceptable elsewhere. If you consider the logistics of creating a safe, school-appropriate environment and set expectations and guidelines upon entry, Discord is a great tool for your future educational needs!

Moderation

Creating Moderation Team Channels

Moderation Channel Setup

Once you are confident in the channel permissions that lock access to your private moderation channels, it’s important to think about what you want your moderation channels to look like. The larger the server, the more channels you may need to accommodate everything.

An important rule of thumb is to make sure that your moderation channels are an exception to any auto-moderated actions your moderation bot may be taking, or that the automoderator is configured not to act on messages from moderators. No blacklists should be in effect in these channels to ensure proper discussion of punishments and happenings in the server can be discussed truthfully and respectfully. It’s also important to ensure that any message logging is configured to ignore channels that not all moderators have access to. For example, if all of your moderators can see a general action log channel, but you have a separate channel for a lead chat, deleted and edited messages from that channel should not be logged for privacy reasons. More about this can be found below.
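
If your team runs its own bot, these exemptions can be expressed as simply as the sketch below. The channel IDs are placeholders, and hosted moderation bots usually expose the same idea as an "ignored channels" setting in their dashboards.

```python
# Placeholder channel IDs for private moderation channels.
ALL_MOD_CHANNEL_IDS = {111111111111111111, 222222222222222222}  # visible to all mods
LEAD_ONLY_CHANNEL_IDS = {333333333333333333}                    # lead chat, etc.

def should_automod(channel_id: int) -> bool:
    # Never run blacklists against moderator discussion channels.
    return channel_id not in ALL_MOD_CHANNEL_IDS | LEAD_ONLY_CHANNEL_IDS

def should_log_message_events(channel_id: int) -> bool:
    # Don't log edits/deletes from channels that not every moderator can see.
    return channel_id not in LEAD_ONLY_CHANNEL_IDS
```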

Structurally, it is recommended to have informational channels at the top of your moderation channel category to make sure everyone sees them. Anything that should be easily viewable by the team follows, such as an update channel and moderation/action logging channels. Channels restricted to smaller groups of moderators should be closer to the bottom, as fewer people have access to them, and partnership channels meant to maintain relations rather than handle direct moderation can go at the very bottom.

We’ll now begin to outline a variety of channels that are often useful for moderation purposes. The below list is for your consideration when building your own moderation channel category, but by no means should you feel obligated to add every channel discussed below to your server. This is all about recognizing your needs and making sure they are met!

Internal Rules

Every moderation team should have a developed moderation handbook that is easily accessible to all team members and updated regularly. However, some moderation teams like to have an overview channel for rule enforcement for quick referencing. This channel can contain information such as how you categorize punishments, an overview of popular commands for easy referencing after returning from a moderation break, and links to all guidelines and moderation forms for easy coordination.

We recommend that this channel be viewable by all moderators, but the Send Messages permission for informational channels like this should be limited to the server owner or administrators to ensure only important and select information is fed into the channel. Additionally, denying the Manage Messages permission to everyone but the administrators is also recommended to ensure that these informational messages aren’t deleted. A sketch of one way a bot could apply these overwrites follows below.
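
One possible way to apply those overwrites with a self-hosted discord.py bot is sketched below; the role and channel names are placeholders, and the same setup can be done by hand through the channel’s permission settings.

```python
import discord

async def lock_internal_rules(guild: discord.Guild) -> None:
    # Placeholder role and channel names; adjust to your own server.
    mods = discord.utils.get(guild.roles, name="Moderator")
    admins = discord.utils.get(guild.roles, name="Admin")
    channel = discord.utils.get(guild.text_channels, name="internal-rules")
    if None in (mods, admins, channel):
        return  # nothing to configure if the placeholders don't exist

    # Hide the channel from everyone else, let moderators read it,
    # and let only administrators post or manage messages.
    await channel.set_permissions(guild.default_role, view_channel=False)
    await channel.set_permissions(mods, view_channel=True,
                                  send_messages=False, manage_messages=False)
    await channel.set_permissions(admins, view_channel=True,
                                  send_messages=True, manage_messages=True)
```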

Moderation and Action Logs

Action logs are the most important moderation channels out there, but also the busiest. Moderation action logs exist for a variety of purposes, and you can configure them however you see fit. Some recommended actioning channels include, but are not limited to:

  • Moderator Actions - This is a channel specifically for using bot commands. Having a separate channel for commands allows other moderators to see what you’re doing and better offer opinions or ask questions about actions you may not agree with.
  • Censor Logs - Sometimes blacklists lead to false positives, but other times they do catch really problematic messages. Having a censor log separate from all other bot logging makes this information stand out so you can easily measure what is caught in your filter correctly and action accordingly, but not have auto-moderation techniques punish people incorrectly. If your blacklist is malfunctioning, this will become obvious for you to alter as needed if you keep an eye on this channel.
  • Moderation Log/Bot Spam Channel - General moderation bot logging can get a bit spammy. While it’s important to have all this info logged to look back on for reports or when looking up involved user profiles, this channel is often muted so you don’t get notified every time something is added to it. Username and nickname changes, edited messages, deleted messages, and auto-moderation actions can all get logged here in addition to whatever you configure a bot to send to this channel.
  • Comings and Goings - These can easily be sent to a moderation log or bot spam channel, but some teams may find it useful to track server joins and leaves, as well as voice channel entries and exits, separately. This can help detect incoming raids when there’s an influx of similar-looking or suspicious accounts arriving from the same invite in a short period of time; a minimal sketch of that kind of join-rate check follows below. This also helps with voice channel moderation: if you receive user reports, you can gauge their accuracy based on who was actually present in the voice channel at the time.
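
As a rough illustration of that raid signal, the sketch below keeps a sliding window of recent join times and flags when the rate spikes. The window and threshold are arbitrary examples that should be tuned to your server’s normal join rate.

```python
import time
from collections import deque

JOIN_WINDOW_SECONDS = 60
JOIN_THRESHOLD = 10  # joins inside the window that start to look like a raid

recent_joins: deque[float] = deque()

def record_join(now: float | None = None) -> bool:
    """Record one member join; return True if the recent join rate looks raid-like."""
    now = time.time() if now is None else now
    recent_joins.append(now)
    # Drop joins that have aged out of the window.
    while recent_joins and now - recent_joins[0] > JOIN_WINDOW_SECONDS:
        recent_joins.popleft()
    return len(recent_joins) >= JOIN_THRESHOLD
```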

General Team Channel

Like your server, your moderation team can have a general channel. While it is important for your moderators to moderate, it’s also important for your team to bond. This is best achieved by having a space that is not dedicated to moderation; it exists to talk to each other, get to know each other, and build rapport in your team environment. Moderation can be stressful, and this is where you can go to take a break with your teammates. However, it is important to maintain the same set of moderation expectations here as you would in public channels. An occasional vent is understood and acceptable, but you should avoid speaking negatively of server members in ways that can taint a moderation experience. While it’s important to bond with your teammates, it is also important to bond with your server members as well. Chatting in the server itself is just as encouraged as getting to know your fellow moderators.

Update Channel

If you enable the community server option for your server, you’ll have updates fed to a chosen channel. As these will be major community-based updates for moderation purposes, it is often recommended to have them feed to a moderation update or memo based channel for ease of viewing. Channels that serve this purpose can be used for a variety of reasons including announcing extended absences from the team for vacations or mental health purposes to avoid burnout, taking team-wide votes, and making announcements for moderator removals, departures, promotions, and initiatives the team is pursuing. Many servers use a single channel as a catch-all update arena as it serves a very specific purpose and will not be used daily.

Leading and Training Channels

Although there are a variety of ways to organize your team hierarchy, in addition to regular moderators most mod teams also have administrators who are responsible for overseeing the regular moderators. Some mod teams also make a distinction between regular moderators and moderators-in-training, who may have different permissions compared to regular moderators or otherwise be subject to additional scrutiny during their training period. If these distinctions exist within your own mod team, it may be wise to create separate administrator channels and training channels.

Administrator Channels - An administrator chat is necessary for discussing private team matters. It is where leads can evaluate general moderation performance, handle punishments for problematic moderator actions internally, and address any reports against moderators while ensuring privacy. Please remember, it is important that these messages are not caught up in bot logging, so moderators are not made aware that they are being discussed privately before leads connect with them.

Training Channels - Moderator trainees will not yet have all the permissions granted to full moderators, and some teams therefore restrict their access to select channels. There will need to be a space for all moderators and leads to privately discuss the growth of the trainees, one the trainees do not gain access to even after promotion. Some teams go as far as establishing a separate action log channel used during training periods, only giving trainees access to the full history of the moderation team once they prove their ability to be unbiased moderators. Again, it is important to ensure that discussion channels for trainees are excluded from basic bot logging to avoid the awkwardness of trainees seeing commentary about them that they should not see yet.

Miscellaneous Channels

There are a plethora of moderation channels that can benefit a team in unique circumstances that don’t fit into the above categories. Consider the following when thinking about what fits best for how your community is run:

  • Partnership Channels - If your community has partnerships of any form, it may be worth adding spaces for your team to communicate privately with your partners. This type of channel might also be something you want to restrict from trainees until they graduate to full moderators who know what it means to represent your team. Partnership channels can be used to coordinate with elevated server members who have been granted special permissions, like event managers. They also serve as areas to coordinate with community partners, and even as places where your volunteer team can interface with the people who work for the game or organization that you support.
  • Meeting Channels - If your team has regular meetings, this can be the place to discuss them, chat if you don’t want to use a mic, set meeting agendas, and keep notes for moderators who cannot attend. If you do not have regular meetings, this channel may be useful to archive and reinstate as needed.
  • Appeal Channels - Smaller servers may not use a ban appeals bot, since those often require the creation of a second server. In that case, it can be useful to have a designated space to discuss appeals coming in via moderator DMs, Reddit, Google Forms, or any other way your team chooses to organize ban appeals.
  • Modmail Channels - If you use a modmail bot for general server interaction, a designated category would be useful to track threads opened by members to ask questions, share feedback, or appeal punishments that did not remove their access to the server.

Channels for Assisting in Moderating Related External Communities

If your community is linked to an external community such as Reddit or Twitch, it would be useful to have separate moderation channels dedicated to this external community in addition to your Discord moderation channels. Reddit moderation channels specifically can be created by utilizing webhooks. Reddit moderation spaces housed on Discord often have r/modnews updates feeding into a special update area and channel sets unique to their external community needs. You can also have new posts and comments logged into a designated Reddit action log for easy reference without opening Reddit.
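
As a rough illustration of the webhook approach, the sketch below posts a short update into a moderation channel through a Discord webhook. The webhook URL, the relay name, and the example message are placeholders; in practice, a bot or feed service would call the webhook rather than a hand-run script.

```python
import requests  # third-party HTTP client (pip install requests)

# Placeholder webhook URL, created via Server Settings > Integrations > Webhooks.
WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"

def post_mod_update(message):
    """Post a short update into the moderation channel behind the webhook."""
    payload = {
        "username": "r/modnews relay",  # display name shown on the message
        "content": message,             # plain-text body of the message
    }
    response = requests.post(WEBHOOK_URL, json=payload, timeout=10)
    response.raise_for_status()         # raise on HTTP errors instead of failing silently

# Example usage: relay an entry fetched from an external feed (fetching not shown here).
post_mod_update("New r/modnews post: subreddit moderation update available.")
```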

Teams that have a separate Reddit or Twitch moderation team in addition to their Discord moderation team may have designated hangout spaces for all teams to get to know each other casually. But, most importantly, they may have shared moderation spaces to discuss troublemakers that can span multiple platforms to flag problematic users for the other team. Easy and specialized communication across teams will help to keep all facets of your community safe!

Conclusion

There is no right or wrong way to set up a moderation channel category beyond ensuring you use the correct permissions. It’s important to consider the needs of your community and how they translate to the needs of your team when creating your moderation space. Having flexibility and a willingness to grow as your server grows and requires change is imperative. While action logs are the most useful kind of moderation channel from a punishment perspective, hangout spaces are important for establishing team cohesion and rapport. Identifying the needs of your team and making sure they are adequately met will help you create the strongest moderation environment possible.

Moderation

Using XP Systems

What Are XP Systems And How Do They Work?

One of the most important factors of keeping a community alive and healthy is activity. To maintain activity, moderators can use a few different methods which generally can be separated into two groups: active and passive. Active methods are those which require the presence and active participation of a moderator. Passive methods, on the other hand, do not require a constant presence from an individual and are often automated by using bots. Keep in mind that even passive methods will require occasional maintenance from a moderator.

One of the more popular passive methods is the XP system. XP systems, otherwise known as experience or leveling systems, grant users experience points (XP) and levels based on their activity in a server. Their main purpose is to reward member activity in the community. These systems exist in the form of bots. Usually they are just one function of a multi-purpose bot, but there are cases where the sole function of the bot is the leveling system.

The basic way these systems work is:

  • Granting experience. A certain number of XP points is awarded per message sent in text chat. For voice chat activity, experience is granted based on time spent talking in a voice channel.
  • Preventing spam. Leveling systems have built-in cooldowns. This means that not every message sent will contribute towards gaining XP points. Only the messages sent after the cooldown expires will grant additional XP.
  • Leveling up. After a certain number of XP points is acquired, the user reaches a new level.
  • Granting roles. When a certain level is hit, the bot automatically assigns a role to the user.
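
To make these mechanics concrete, here is a minimal sketch of the loop most leveling bots run internally. The XP per message, cooldown, level size, and reward roles shown here are illustrative assumptions; real bots expose them as configuration.

```python
import time

XP_PER_MESSAGE = 15        # assumed XP awarded per counted message
COOLDOWN_SECONDS = 60      # assumed cooldown before another message counts
LEVEL_SIZE = 100           # assumed XP required per level
ROLE_REWARDS = {5: "Regular", 10: "Veteran"}  # hypothetical level -> reward role

xp = {}            # user_id -> total XP
last_counted = {}  # user_id -> timestamp of the last message that granted XP

def on_message(user_id, now=None):
    """Grant XP for one message, respecting the cooldown; return a reward role on level-up."""
    now = time.time() if now is None else now
    if now - last_counted.get(user_id, 0.0) < COOLDOWN_SECONDS:
        return None  # still on cooldown: this message does not grant XP (anti-spam)
    last_counted[user_id] = now
    old_level = xp.get(user_id, 0) // LEVEL_SIZE
    xp[user_id] = xp.get(user_id, 0) + XP_PER_MESSAGE
    new_level = xp[user_id] // LEVEL_SIZE
    if new_level > old_level:
        return ROLE_REWARDS.get(new_level)  # role to assign, if this level has one
    return None
```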

These four steps are only a simplification of the process, and there are many options to consider while using these systems. Depending on the bot you choose, the available configuration options vary, but several are commonly available on most bots. Some of these options are:

  • XP and level customization. Changing the XP gain per message and cooldown will allow you to have better control over the system, as well as fine tune it towards the level of activity in your community. This can be extended to role based reward systems as well. It’s important to adjust the levels at which certain roles are acquired, as the roles need to be reachable in order for the reward to have a purpose. Keep in mind that not all bots will allow changing the cooldown or XP gain per message, but they will always allow for setting roles at a level of your choice.
  • XP Awareness. Being aware of the current XP a user possesses is also a valuable aspect of XP systems. The two ways to see these stats are leaderboard and rank commands (see the sketch after this list). Leaderboard commands typically show the current top 10 users based on their level or experience, while rank commands show this info for a specific user. Usually, XP systems have built-in announcement messages for level-ups to keep members posted about their level automatically, some even with a ping to the user in question. These notifications can be turned off.
  • Channel Control. XP gain can be disabled or enabled per channel. While helping prevent needless spam, it also allows you to remove XP gain in channels where spam-like activity is encouraged, such as channels with bots.
  • Manual XP Control. The last two basic options are adding and removing XP manually. Adding XP is commonly used if XP is automatically removed upon leaving the server. This allows users who had to re-join to gain all their former XP back. Alternatively, manually adding XP could also be considered a reward for certain actions, while removing would be used for punishments. Finally, resetting XP could be used for all users when you have a cyclical leveling system, while resetting XP for users who have been banned or left allows you to prevent clutter on the leaderboard.
  • No XP Roles. The No XP role allows you to completely block XP gain for any user who has this role. It can be used in two possible ways:
    • Use this role on obvious spammers. Anyone who you can tell is attempting to abuse the system via spamming should be put under this role in order to further hinder meaningless activity.
    • Grant the role per request. In the case where your system does not grant permissions via leveling, users should be allowed to request this role. Not everyone likes being seen on a leaderboard or the sidebar.
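
Continuing the illustrative sketch above, leaderboard and rank lookups are simple queries over the stored XP totals. The function names here are hypothetical, not commands from any particular bot.

```python
def leaderboard(xp, top_n=10):
    """Return the top N (user_id, total_xp) pairs, highest XP first."""
    return sorted(xp.items(), key=lambda pair: pair[1], reverse=True)[:top_n]

def rank(xp, user_id):
    """Return a user's 1-based leaderboard position, or None if they have no XP."""
    ordered = [uid for uid, _ in leaderboard(xp, top_n=len(xp))]
    return ordered.index(user_id) + 1 if user_id in ordered else None
```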

Usage of XP Systems

The main function of XP systems is to reward user activity. Their ability to passively lead users towards being active allows moderators to occasionally take a moment to step back from their usual activities of engaging with their community. The existence of a leaderboard appeals to the competitive nature of humans and pulls them to be more active. Rewards additionally add to this appeal. This is applicable in small, medium and some large communities.

Alternatively, XP systems can be used as a measure of security. By locking certain permissions behind levels, you can make sure that inactive and malicious users are prevented from committing certain offenses. This is mostly applicable in very large servers.

Both of these routes will require the utilization of roles. There are two main reasons for this:

  • When it comes to rewards, not all moderators are willing to invest money into their rewards. In those cases, roles are the simplest way to grant rewards within the community itself.
  • When it comes to security, permissions are tied to roles, so roles are necessary: they allow you to control which permissions are granted to large groups of people.

Regulation of Activity

As the goal of XP systems is to boost activity, it is important to note that they will also lure in users who believe any type of activity is acceptable. This is not the case. While the problem of spam is already resolved by the cooldown ability of most XP systems, there are still behavioral issues that need to be addressed.

Members need to be aware that rules still exist in the community and they cannot simply do as they please. It is important to moderate those who blatantly misbehave in order to level up. Other contributions to the community that are not measured by activity in text or voice chat, such as artwork or stories, should also be rewarded.

Furthermore, activity that comes from channels which encourage spam-like behavior, such as bot channels, should not count towards the total. For that reason, XP gain should be disabled in channels of that sort.

Types of XP Systems

Knowing that we have the ability to reset the XP in the server, we can use this option to create different types of XP systems. We can divide XP systems into three different types: cyclical, permanent and combined.

Cyclical Systems

Cyclical systems reset XP points on a regular cycle. Cycle duration should be set based on activity, though it is not advisable to use this system in communities with very low activity. The regular leaderboard resets allow new members to climb the charts quickly, but they also mean the system can realistically only be used for rewards, not security. It is common to give out special rewards to the most active users at the end of the cycle, such as custom roles for the duration of the following cycle, which encourages continued activity to retain those rewards. This also gives moderators the opportunity to post announcements regularly at the end of each cycle. The biggest downside of this system is that not all bots can also remove leveled roles when the cycle ends.
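
As a hedged illustration of the cyclical approach (using the same in-memory XP store sketched earlier), an end-of-cycle routine might record the winners and clear the counters; removing leveled roles would still depend on your bot supporting it.

```python
def end_of_cycle(xp, top_n=3):
    """Close out a cycle: record the winners, then clear everyone's XP for the next cycle."""
    winners = sorted(xp.items(), key=lambda pair: pair[1], reverse=True)[:top_n]
    xp.clear()      # reset all XP; leveled roles may still need to be removed separately
    return winners  # e.g. announce these users and grant temporary custom roles
```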

Permanent Systems

Permanent systems do not reset at any point in time. Occasionally, users that are banned or have left are removed from the leaderboards. They give a good look into who the most active and dedicated users in the community are. Permanent systems can be used both for rewards and security. Their biggest downside is that they are not very friendly towards new members, especially in older communities.

Combined Systems

Combined systems are a combination of cyclical and permanent systems. They usually require two separate bots to keep track of rankings on both leaderboards. They take the best aspects of both systems, meaning they can be used both for rewards, which would be connected to the cyclical system, and security, which would be connected to the permanent one. This also allows the cyclical leaderboard to help involve new users, while the permanent one still gives good insight into who the most active members of all time are. The only big downside that remains is the issue of removing roles from a large number of members.

Negative Aspects

As with any system, there are several negative aspects to consider when it comes to using XP systems. All of these should be taken into consideration before you make a decision on whether you want to use one. Some of the most commonly voiced concerns would be the following:

Spam

The fact that these systems encourage sending a larger number of messages naturally raises the concern of spam. Since spam is a violation of most server rules, as well as Discord’s Terms of Service, this would be a serious problem. The issue is easily resolvable thanks to the built-in anti-spam measures XP systems have, primarily the message cooldown. This, in combination with good moderation, should ensure that no spam is generated by the presence of the system.

Role Bloating

Since most XP systems rely on roles to be truly useful, role bloat becomes a concern. It is important to manage and space the leveled roles properly in order to avoid creating an excessive number of roles with no real use in your server.

Low-Quality Content

It is often argued that rewards should be granted manually, through qualitative judgement of content rather than quantitative measures. Bots cannot judge the quality of the messages sent; because all types of conversation grant XP, there is no measure of how valuable that conversation actually is. Granting privileges on that basis can send the wrong message about what sort of activity is encouraged. Only humans can truly judge content subjectively to determine quality. By combining proper moderation, to handle judgements of quality, with the built-in preventive mechanisms of XP systems for quantity control, this issue can be kept to a minimum.

Inorganic Communication

This would be one of the most difficult problems of the system to resolve. Many members, with the intent of increasing their XP count and level, will attempt to hold conversation in the community. This part of it is perfectly fine. The problem arises when they start forcing conversation at any point in time solely to increase their message count. The result would be communication that is completely unnatural and unhealthy.

This type of abuse of the system can’t really be stopped with the cooldown system since most of the time it is not spam. The only real way to prevent this is to use the No XP role, but the difficulty of telling organic and inorganic communication apart raises the question: Was there really any violation of the system?

Difficulty with Community Integration

An XP system is not useful for every community. In many cases, moderators can’t figure out how to add this sort of system and make it feel like a natural part of the server, or how it can fit the theme of the community. Considering the theme and purpose of your server is an important part of deciding whether to add a leveling system. Before you make the decision, ask yourself: How can I make this system a natural part of my community?

Which Bot Do I Use?

In the following section, several publicly available bots are presented as options for an XP system in your server. These bots were chosen based on data collected from a survey of various moderators, administrators, and owners in Discord communities.

This list is not exhaustive. There are plenty of alternatives available online. All of the listed bots are free and public. Some features may be limited to paid additions or private versions of the bots. The content of this article is not endorsed by any bot or company related to the bots.

Gaius Play

Gaius Play is an entertainment bot that also hosts an XP system. It takes both text and voice chat activity into account. The basic version comes with a preset configuration for XP gain and leveling, as well as commands for adding up to six reward roles, fully customizing and toggling level-up messages, manually controlling XP, ignoring activity in certain channels, and boosting XP gain within certain parameters (roles, channels, time periods). It also has the ability to remove leveled roles from all users, making it ideal for use in a cyclical system. Additionally, there are several premium features, such as unlimited reward roles and a tree leveling system. Users can also reset their own reward roles in order to change paths on the tree leveling system.

AmariBot

Amari is a bot that is solely focused on leveling systems. It only looks at activity in text channels. This is a very simple bot, containing commands for setting rewards, manually controlling XP, customizing and toggling level-up messages, as well as ignoring activity in some channels. It has the ability to have 2 leaderboards active at the same time, both of which can be reset at any point. Donor features allow for modification of the cooldown between messages, as well as modification of the XP gain per message.

Nadeko

Nadeko is a multi-purpose bot with a leveling system module. It detects activity exclusively from text chat and has a preset configuration for XP gain. It contains commands for setting up reward roles, toggling level-up messages, ignoring activity in channels and manually controlling XP. A big upside of this bot is that you have the option to host it yourself. This also adds the ability to set up in-bot currency rewards, as well as better overall control of the bot.

Tatsu

Tatsu is another multi-purpose bot with an XP system module. It has several basic commands, allowing creation of reward roles, toggling level-up announcements, modifying XP gain per message and cooldown, manual XP control and ignoring channel activity. It also features a dashboard and a global leaderboard alongside the local one.

Summary

With the available selection of bots and documentation explaining the setup of XP systems, using leveling modules is simpler than ever. The configuration options that exist on these modules allow for creative usage of leveling systems with the goal of passively increasing activity within the community.

Naturally, you have to consider several factors prior to deciding on using XP systems. If you are considering using one, think of how you can best integrate it into your community. In which way will you use it? Which type of system would suit your community best? Which bot would be the best for the task? Of course, there are negative aspects to consider as well, meaning you’d have to figure out how to control and minimize them. Carefully weigh all the pros and cons prior to making a final decision.

Moderation

Developing Moderator Guidelines

Lead by Example

The moderators are the people that your community members look to, not just for enforcement of server rules and maintaining the peace, but also as role models for what behavior is appropriate for the server. If your users see moderators ignoring or bending certain rules, they will learn that it is okay for them to do so as well, and they will call you out if you attempt to hypocritically enforce rules against them. As such, moderators should hold themselves to a higher standard than other users, especially in regards to civility and more subjective rules such as what is considered NSFW content. This applies to private interactions among the mod team as well.

For example, if a moderator is talking in chat and shares a suggestive picture, users will understand that other pictures that are equally suggestive are okay to post. Not only will this encourage borderline rule-breaking behavior, it makes it more difficult for moderators to peacefully moderate NSFW content because users will say, “Well, you posted this picture and the picture I posted is basically the same.” The same holds true in the way moderators respond to questions. If someone asks for help on something and moderators respond to them rudely or condescendingly, others will treat new users the same way and create a hostile environment.

That’s not to say that moderators can’t have fun, of course. Moderators can and should participate in chat regularly and engage with members as normal users. If a moderator entering chat is disruptive in and of itself, it usually means that moderators are not active enough in the server.

Ultimately, moderators should strive to be seen fondly by server members, yet respected in their positions of authority. Moderators that fail to enforce rules will be seen as unprofessional or “pushovers” by the server members, while moderators that enforce rules too strictly and/or do not participate in chat will be seen as aloof, aggressive, or out of touch.

The Spirit of the Rules

One of the things you may often hear is that the “spirit of the rules” is more important than the “letter.” In other words, it is more important that people follow the intent of the rules, rather than adhering to a literal or technical definition. As a result, moderators should focus on managing the problems of chat, including addressing unhealthy behavior that may not directly break a rule. It is appropriate to moderate people that are deliberately toeing the line to see what they can get away with (i.e., trolling), but what many moderator teams forget is that the rules are not infallible. Moderators should use their judgment to enforce rules only when it makes sense, rather than blindly following the letter of the law.

There may be instances where the wording or specifics of the rules end up disallowing behavior that, in practice, does not go against the main principle of moderation. In these cases, moderators should refrain from warning the user without consulting the rest of their mod team and also seriously consider modifying the rules to more accurately reflect the expectations of the mod team in regards to server conduct.

For example: let’s say you have a rule that prevents users from cropping images to focus on sexual body parts in order to prevent NSFW conversations from occurring in chat. However, someone ends up cropping an image of an in-game character to focus on her skirt from behind, discussing the outfit. In this case, it may not be appropriate to warn the user since they are using this image to start an appropriate conversation, even if it technically breaks the rule about cropping pictures. So, the mod team should discuss ways that the rule can be rewritten to cover scenarios like these, rather than resign themselves to warning the user “because the rules say so.”

Remember: the rules exist to serve the community, not the other way around. Moderators should conduct themselves in accordance with the rules, and potentially even better, but they ultimately have the power to change them for the better of the server if need be. Treat your rules as a living document and remember that they are there to improve your community, not stunt it.

When to Forgo Progressive Discipline

While certain rules (such as the one against doxxing) readily offer an “instant ban” option, in some cases a user’s conduct may reveal that they are only in chat to troll or otherwise cause trouble in a way that does not break one of the instant-ban rules.

Just as the rules exist to serve the community, so too does the progressive discipline system. The purpose of the progressive discipline system is to allow your members to understand their bad behavior and rectify it in the future without unduly punishing them for occasional small mistakes. Conversely, this means that users who are clearly acting in bad faith on the server may not be afforded the same leniency and should be muted or banned depending on the circumstances, especially if the user in question does not have any previously normal chat history. While users that instantly break rules without message history could all potentially be banned, some behavior you may want to consider in particular includes:

  • Predatory comments (e.g., “Are there any girls on this server? Send me a DM.”)
  • Racial slurs
  • Posting “cursed” images or otherwise spamming images in the wrong channels
  • Advertising other servers/their own social media

It’s important that disruptive users be addressed quickly before they sour the mood of the other server members (which could lead to additional infractions from users that were incited by the original bad behavior in the first place). Just as moderators should not use the rules to punish users that don’t practically deserve it, moderators should also be sure not to allow disruptive users to remain on the principle of following policy.

Other Considerations

While the principles above are the most generally important moderation principles, there may be other things you want to include in your moderator guidelines channel as well, such as:

  • How to evaluate more subjective rules (e.g., “what is NSFW content?”)
  • Technical instructions such as how to log warnings or other procedural information
  • The purpose of various moderation related channels

Always keep in mind any peculiarities of your server and questions your moderators might have so that you can proactively address them before they become issues.

Summary

A moderation guidelines channel is an important channel for helping your moderators get acquainted with both the procedural aspects of moderation and the more subjective aspects. Moderators should be aware that they are leading by example and should hold themselves to a higher standard so that other users will be encouraged to follow their example. This will help them perform their duties smoothly as well as allow them to readily de-escalate conflicts before they become an issue, encouraging a positive server culture. Finally, by encouraging your moderators to evaluate situations critically, you have mods that can understand both when users should be swiftly punished and when rules may need to be adjusted or clarified to allow greater flexibility.

If you’re interested in seeing example moderation guidelines, you can check out the link here. Hopefully these help you in developing your own moderation guidelines. Happy moderating!

Safety

How To Report Content To Discord

Submitting a Report to Discord

  1. Select the Message you wish to report. On mobile, hold down on the Message, and on desktop, “right-click.”
  2. Select “Report Message.”
  3. Select the type of abuse you’re seeing:

The next screen will allow you to further specify the specific abuse that’s occurring. You can always click back and change your first answer, so you can select the most relevant category.

Alternatively, you may select “Delete Message,” which, as a mod, will enable you to report the message to our Safety team as well, while removing the content from your server.

What to do if you receive a violent threat, or someone is at risk of Self-harm

If a credible threat of violence has been made and you or someone else are in immediate danger, or if someone is considering self-harm and is in immediate danger, please contact your local law enforcement agency.

Additionally, if you are in the United States, you can contact Crisis Text Line to speak with a volunteer crisis counselor to help you or a friend through any mental health crisis by texting DISCORD to 741741. You can learn more about Discord’s partnership with Crisis Text Line here.

You can find more resources about mental health here.

What Happens After I Submit a Report?

When we become aware of content that violates our Community Guidelines or Terms of Service, our Safety team reviews it and takes the necessary enforcement actions, including disabling accounts, removing servers, and, when appropriate, engaging with the proper authorities. We may not review your report manually or respond to you directly, but we’ll use your report to improve Discord.

You can read more about the reports we receive and the actions we take on violations of our Community Guidelines or Terms of Service in our quarterly Transparency Report.

Safety

Navigating Online Boundaries With Your Teen

We’re committed to making Discord a safe place for teens to hang out with their friends online. While they’re doing their thing and we’re doing our part to keep them safe, sometimes it’s hard for parents to know what’s actually going on in their teens’ online lives.

Teens navigate the online world with a level of expertise that is often underestimated by the adults in their lives. For parents, it may be a hard lesson to fathom—that their teens know best. But why wouldn’t they? Every teen is their own leading expert in their life experiences (as we all are!). But growing up online, this generation is particularly adept at what it means to make new friends, find community, express their authentic selves, and test boundaries—all online.

But that doesn’t mean teens don’t need adults’ help when it comes to setting healthy digital boundaries. And it doesn’t mean parents can’t be a guide for cultivating safe, age-appropriate spaces. It’s about finding the right balance between giving teens agency while creating the right moments to check in with them.

One of the best ways to do that is to encourage more regular, open and honest conversations with your teen about staying safe online. Here at Discord, we’ve developed tools to help that process, like our Family Center: an opt-in program that makes it easy for parents and guardians to be informed about their teen’s Discord activity while respecting their autonomy.

Here are a few more ways to kick off online safety discussions.

Let your curiosity—not your judgment—guide you

If a teen feels like they could get in trouble for something, they won’t be honest with you. So go into these conversations from a place of curiosity first, rather than judgment.

Here are a few conversation-starters:

  • Give me a tour of your Discord. Tell me what you're seeing. 
  • What kind of communities are you in? Where are you having the most fun? 
  • What are your settings like? Do you feel like people can randomly reach out to you? 
  • Have there been times when you've been confused by how someone interacted with you? Were you worried about some of the messages that you've seen in the past?

Teens will be less likely to share if they feel like parents just don’t get it, so asking open questions like these will foster more conversation. Questions rooted in blame can also backfire: the teen may not be as forthcoming because they feel like the adult is already gearing up to punish them.

Read more helpful prompts for talking with your teen about online safety in our Discord Safety Center.

View your teen as an expert of their experience online

Our goal at Discord is to make it a place where teens can talk and hang out with their friends in a safe, fun way. It’s a place where teens have power and agency, where they get to feel like they own something.

Just because your teen is having fun online doesn't mean you have to give up your parental role. Parents and trusted adults in a teen’s life are here to coach and guide them, enabling them to explore themselves and find out who they are—while giving them the parameters by which to do so.

On Discord, some of those boundaries could include:

  • Turning on Direct Message Requests to make sure strangers can’t message them on larger servers without going to a separate inbox to be approved first.
  • Creating different server profiles for servers that aren’t just their close friends. You and your teen can discuss how much information should be disclosed on different servers, for example by using different nicknames or disclosing pronouns on a server-by-server basis.

  • Using Discord’s Family Center feature so you can be more informed and involved in your teens’ online life without prying.

Create more excuses to have safety conversations

At Discord, we’ve created several tools to help parents stay informed and in touch with their teens online, including this Parent’s Guide to Discord and the Family Center.

In the spirit of meeting teens where they are, we’ve also introduced a lighthearted way to spur conversations through a set of digital safety tarot cards. Popular with Gen Z, tarot cards are a fun way for teens to self-reflect and find meaning in a world that can feel out of control.

The messages shown in the cards encourage teens to be kind and to use their intuition and trust their instincts. They remind teens to fire up their empathy, while also noting that it’s OK to block those who bring them down.

And no, these cards will (unfortunately) not tell you your future! But they’re a fun way to initiate discussions about online safety and establish a neutral, welcoming space for your teen to share their concerns. They encourage teens to share real-life experiences and stories of online encounters, both positive and negative. The idea is to get young people talking, and parents listening.

Know that help is always available

Sometimes, even as adults, it's easy to get in over your head online. Through our research with parents and teens, we found that while 30% of parents said their Gen Zer’s emotional and mental health had taken a turn for the worse in the past few years, 55% of Gen Z said the same about themselves. And while some teens acknowledged that being extremely online can contribute to that, more reported that online communications platforms, including social media, play a positive role in their life by providing meaningful community connection. Understanding healthy digital boundaries and how they can impact mental wellbeing is important, no matter if you’re a teen, parent, or any age in-between.

When it comes to addressing the unique safety needs of each individual, there are resources, such as Crisis Text Line. Trained volunteer Crisis Counselors are available to support anyone in their time of need 24/7. Just text DISCORD to 741741 to receive free and confidential mental health support in English and Spanish.

Safety

Building a Safer Place for Teens to Hang Out

Choose what you see and who you want to hang with

Introducing our new Teen Safety Assist initiative: Teen Safety Assist is focused on further protecting teens by introducing a new series of features, including multiple safety alerts and sensitive content filters that will be enabled by default for teen users.

Starting next week, we’re excited to release the first two features of the Teen Safety Assist initiative. Teens have shared that they like tools that help them avoid unwanted direct messages (DMs) and media. To support this, we’re rolling out safety alerts on senders and sensitive content filters that automatically blur media that may be sensitive — even if the sender doesn’t initially blur it with a spoiler tag. 

  • Safety alerts on senders: When a teen receives a DM from a user for the first time, Discord will detect if a safety alert should be sent to the teen for this DM. The safety alert will encourage them to double check if they want to reply, and will provide links to block the user or view more safety tips to safeguard themselves.
  • Sensitive content filters: For teens, Discord will automatically blur media that may be sensitive in direct messages and group direct messages with friends, as well as in servers. The blur creates an extra step to encourage teens to use caution when viewing the media. In User Settings > Privacy & Safety, teens will be able to change their sensitive media preferences at any time. Anyone can opt into these filters by going to their Privacy & Safety settings page and changing their sensitive media preferences.

We’ve partnered with technology non-profit Thorn to design these features together based on their guidance on teen online safety behaviors and what works best to help protect teens. Our partnership has focused on empowering teens to take control over safety and how best to guide teens to helpful tips and resources.

We’re grateful for the support of Thorn to ensure we build the right protections and empower teens with the tools and resources they need to have a safer online experience.

The first two features of the Teen Safety Assist initiative will start rolling out globally in November 2023 and will be turned on by default for teen users. Stay tuned for many more features from this initiative as we look ahead to the next year.

Work together to keep each other safe

We know mistakes happen and rules are accidentally broken. Our new Warning System includes multiple touchpoints so users can easily understand rule violations and the consequences. These touchpoints provide more transparency into Discord interventions, letting users know how their violation may impact their overall account standing and giving them information to do better in the future.

We built this system so users can learn how to do better in the future and help keep all hangouts safe. 

  • It starts with a DM - Users who break the rules will receive an in-app message directly from Discord letting them know they received either a warning or a violation, based on the severity of what happened and whether or not Discord has taken action. 
  • Details are one click away - From that message, users will be guided to a detailed modal that, in many cases, will give details of the post that broke our rules, outline actions taken and/or account restrictions, and more info regarding the specific Discord policy or Community Guideline that was violated.
  • All info is streamlined in your account standing - In Privacy & Safety settings, all information about past violations can be seen in the new “Account Standing” tab. A user’s account standing is determined based on any active violations and the severity of those violations.

Some violations are more serious than others, and we’ll take appropriate action depending on the severity of the violation. For example, we have and will continue to have a zero-tolerance policy towards violent extremism and content that sexualizes children.

The Warning System is built based on our teen-centric policies. This is a multi-year, focused effort to invest more deeply and holistically into our teen safety efforts. We incorporate teen-centric philosophies in how we enforce our policies, assess risks of new products with Safety by Design, and communicate our policies in an age-appropriate manner.

For the Warning System, this means that we consider how teens are growing, learning, and taking age-appropriate risks as they mature, and we give them more opportunities to learn from mistakes rather than punish them harshly.

The Warning System starts rolling out today in select regions. Stay tuned for many more updates to our Warning System in the future.

For our server moderators and admins, we’ve recently expanded their safety toolkit with new features that allow them to proactively identify potentially unsafe activity, such as server raids and DM spam, and take swift actions as needed. 

  • Activity Alerts and Security Actions: Activity Alerts notify a server’s moderators of abnormal server behavior, particularly what could be massive raids or unusual DM activity. From there, they can investigate further and take action to protect their community as necessary. If it doesn’t seem like an issue, they can easily resolve the alert. Security Actions enable swift server lockdowns by surfacing useful tools quicker, such as temporarily halting new member joins or closing off DMs between non-friend server members.
  • Members Page: We’ve reimagined the existing Members page, letting moderators view their members in a more organized way. The Members page now displays relevant safety information about their members like join date, account age, and safety-related flags that we call Signals (such as unusual DM activity or timed out users). We plan on continuing to develop and surface additional relevant Signals. This redesigned Members page appears in a more easily accessible spot above their server’s channel list. 


We began rolling these tools out in September to Community Servers, and now we’re rolling them out to all servers.

You choose what stays private

While you might be an open book with your friends, having your personal business out there for everyone to see can feel a bit weird. That’s why you get to decide, from server to server, how much you want to share and how easy it is to contact you.

With Server Profiles, you can adjust how you present yourself in different spaces, allowing you to be as much of the real “you” as you feel comfortable being in any particular group.

You can also always try turning on Message Requests: a feature that sends any DM from someone new into a separate inbox and requires your approval before that user can DM you again. Or, go the extra mile and turn off DMs entirely from server members unless you’ve explicitly added them to your friends list.

We’ve got you

PHEW, okay… that was a lot. Thanks for stickin’ around and jamming through our most recent updates with us. Everything you could ever want to know about safety and privacy on Discord, and even more if you’re the learning type of person, lives in our Safety Library — check it out for more tips on making Discord safe and comfortable for you.

This article was first published on 10/24/2023.

The Teachable Moments Hidden in Plain Sight

Using research to help teens build their 'safety muscle'

To do that, Discord works with organizations like the Digital Wellness Lab at Boston Children’s Hospital and the National PTA to understand the needs of one of our largest user bases.

“We are experts in technology, not experts in teenagers. So we need to make sure we are pulling in expert research when we're thinking about what it means to be a safe, healthy and ultimately, empowering place for young people,” said Liz Hegarty, Discord’s global teen safety policy manager.  

Hegarty and her colleagues are guided by the research, which is empathetic to teens' lived experiences and cautions against being overly paternalistic.

For example, Digital Wellness Lab advises parents to resist the urge to step in and solve every problem for their teen, online or off. Instead, they recommend asking probing questions and talking through the decision-making process. This can clear the way to have more conversations about digital boundaries.

“These conversations can help teens build their safety muscle,” said Hegarty.

We built our Family Center with this idea in mind. Launched in 2023, the Family Center is an opt-in tool that enables guardians, trusted adults, and teens to “link” their accounts. While it allows parents to know who their teen is connected with, the tool does not allow the adults to see the contents of their teen’s conversations with friends.

“The key is to find balance between respecting teen autonomy and agency while also giving parents ways to check in with their teens,” said Savannah Badalich, Discord’s senior director of policy.

“Parents and trusted adults in a teen's life are here to coach them,” said Badalich, who leads the team responsible for key safety areas, such as teen safety and mental health. “Let them explore themselves, let them find who they are. But give them parameters in which to do so.”

There is a risk that a teen may choose not to talk at all if they feel they are being constantly monitored. This is why it's important to follow the research.

“Leading experts say that sort of parental surveillance does not always help generate safety or create a trustworthy environment for teens to talk to their caregivers,” Hegarty said.

Hegarty notes that when building the Family Center tool, Discord decided to add an additional layer of support. As a teen connects with their parents, Discord surfaces information for the Crisis Text Line (note that this is for users in the U.S. only). This is a support line that provides free, 24/7 mental health support via text, WhatsApp, and web chat.

The idea is to provide available resources to teens and to encourage them to seek help if they need it. It’s another way we embed subtle lessons throughout the experience.

Weaving in teachable moments

Discord’s Teen Safety Assist includes two recently released safety features: one for when a teen receives a direct message (DM) from a user for the first time, and another that blurs potentially sensitive content in DMs and group DMs with friends.

When an alert is triggered, the teen might see a pop-up that says, “Unwanted message? If you don’t want to chat with this person, you can block or mute them.” They can then click “learn more” to see a set of tips for handling the situation.

“It's this idea of educating as opposed to just punishing teenagers when they make mistakes,” said Hegarty. “In the moment, it’s important to give teens resources, autonomy, agency, and help.”

Hegarty points to research showing that teens are nervous about blocking other people. They worry that if the person finds out they’ve been blocked, the teen will look uncool. Meanwhile, if they report, they aren’t sure what will happen next. They are anxious about getting somebody in trouble.

Research from Thorn, a leading child welfare nonprofit, also suggests that when teens encounter a potentially harmful experience online, they are less likely to seek support offline, such as talking to a trusted adult. So Discord is trying to meet teens where they are.

“It's not a paternalistic approach that protects teens from everything. Being overly restrictive risks them lacking the necessary skills to move within the world once they turn 18,” said Badalich. “Instead, let's educate them throughout their experience. Let's give them tools to control their experience.”

Giving teens opportunities to learn

Risk-taking is part of growing up. It’s part of being a teenager.

“Because their brain is still developing, they need options, time and education,” said Badalich, whose team is also working to sharpen Discord’s enforcement system with teens in mind.

When it comes to developing the policies and rules that govern the platform, Badalich and her team are rethinking what that looks like, with the understanding that teens need extra opportunities to learn.

“Keep up, take down, or ban have been the main levers. And we're continuing to explore even more ways to take necessary action,” said Badalich. For example, “When you think about someone talking about self-harm, do we really want to penalize them? Or do we want to figure out some other form of intervention?”

To be clear, when it comes to the most severe violations—such as those involving violent extremism and content that sexualizes children—Discord will continue to have a zero-tolerance policy.

But when it comes to building out more nuanced, research-driven policies, the policy team looks to lessons from restorative justice, a concept that recognizes that people are capable of change.

“Instead of just being purely punitive all of the time,” said Badalich, “Discord wants to move in the direction of where we are giving people opportunities to change, grow, and learn.”

Safety

How Do ‘Teens’ Rights’ Show Up in Discord’s Enforcement Policies?

What is a teens’ rights approach?

A teens’ rights approach honors young people’s experience, including areas where they could use a little help—say, when they push boundaries just a little too far or succumb to peer pressure and engage in inappropriate behavior online.

The idea comes from the United Nations’ guidance on the impact of digital technologies on young people’s lives. For the report, committee members consulted with governments, child experts, and other human rights organizations to formulate a set of principles designed to “protect and fulfill children’s rights” in online spaces.

The committee also engaged those at the center of the report—young people themselves. The UN’s guidance was not only rooted in research, it treated teens with respect. It listened to them.

It's this kind of holistic and human rights approach that makes this body of work so influential, especially for child rights advocates. But it’s also become a guiding force for Discord.

“Teens are going to try things out, figure out who they are, and make mistakes online. Discord is focused on guiding them through that process without being overly punitive or cutting them off from their support networks and their friends,” said Laney Cloyd, a senior product policy specialist.

Cloyd and her colleagues set out to construct a new warning system for when a teen breaks Discord’s rules. They started with the idea that young people are going to do regrettable things, like send their friend a link that says, “Fortnite cheats here” or “Cute kitten pics” and it’s really an IP grabber. They may think it's funny and not harmful, but it’s still against the rules.

“We can't be super punitive on the first violation, because they don't know. We have to teach them,” said Cloyd.

Of course, when someone violates the rules, there should be consequences. But a teens’ rights approach considers a person’s potential to change—call it a teachable moment.

Teens need extra opportunities to figure out what the boundaries are. It’s true when they’re at home with their parents, when they’re at school with their teachers, or when they’re on platforms like Discord with its own set of community guidelines. Parents, teachers, and coaches are there to tell young people what’s what. Discord saw an opportunity to play that role online: We built interactive educational moments that explain to a person what they did wrong.

“As opposed to just saying, ‘you're kicked off, go read our guidelines,’ we offer more guidance so they can do better,” said Liz Hegarty, Discord’s global teen policy manager. Hegarty pointed to research that shows teens really aren't given great advice about how to handle themselves online. Platforms have a role to play in helping teens build their digital citizenship skills.

All of this is part of Discord’s philosophy for designing policies that hangs on the belief that people are capable of change if you give them a chance, especially when those people are young and still learning. They need a chance at rehabilitation. (An important caveat is that some violations are more serious than others, and Discord takes appropriate action depending on the severity of the violation. For example, we have and will continue to have a zero-tolerance policy towards violent extremism and content that sexualizes children.)

It might sound odd that a tech company would borrow ideas from the preeminent human rights organization. But consider the most foundational tenet of human rights: the right to live freely and safely. Platforms like Discord are most successful when they enable people to be themselves, have fun, and make meaningful connections—all without fear or threats to their safety or well-being.

How do teens’ rights actually show up on Discord?

To illustrate how teens’ rights are woven into our warning system, take for example Discord’s updated approach to enforcing our Bullying and Harassment policy.

First, it’s helpful to understand that bullying is broken down into a number of different dimensions. One of those includes imminent threats of physical harm—for example, extortion or a death threat. For violations of that severity, there are no second chances. Users don’t get to do that on Discord.

But for non-imminent or non-physical threats, Discord has built in more opportunities for teens to get the message that bullying isn’t allowed. These show up in the form of a series of time-outs.

Say a teen has been flagged for harassing a user—maybe they’re degrading or mocking someone. For their first violation, they would get a notice and Discord would remove the content.

Then the second time, they’ll get a notice, the content will get removed, and the teen will be placed in “read only” mode for a set number of hours. If it keeps happening, Discord will apply longer and longer time-outs.

Ultimately if the teen doesn’t change their behavior and continues to engage in harassment or sustained bullying, they can receive a year’s suspension.

This approach is in contrast to a traditional “three strikes and you’re out” rule, where most violations, no matter how severe, would count against you, and then after three dings, you’re banned for good. Discord’s more rehabilitative approach takes into account the too-real probability that teens are going to make mistakes.

“Imagine a 14-year-old who falls in with a bad crowd and engages in a bunch of bullying and gets suspended for a year,” said Cloyd. “If they return, a 15-year-old is going to be a different person in a lot of fundamental ways.”

The new warning system is also geared toward addressing the specific thing a teen did wrong. For example, let’s say a teen posts an image that violates Discord’s rules. An appropriate measure would be to turn off their ability to post images for a while, but they can still talk to their friends, join voice chats, or play games.

“We're taking very targeted interventions and not just slamming a big ban hammer down,” said Cloyd. “Because you posted one wrong image you didn't know was wrong, we’d block you out of the app forever? That's not fair. That's not a learning experience.”

Why do teens need special rights in digital spaces?

“We intuitively recognize that as children get a little bit older, they are autonomous humans. They have agency,” said Hegarty. “They have the right to access online spaces safely.”

This goes back to the UN document, which Hegarty first encountered before she came to Discord when she worked for a children’s digital advocacy organization.

“The whole world is fairly new to thinking about how we handle teens in online spaces, especially when more and more of their lives are taking place online. It has an impact across everything from telemedicine to education,” said Hegarty.

Teens are at a transformative and exciting stage of life. They get to explore the world in new ways. They use platforms like Discord to find themselves and make meaningful connections, but that can't be separated from the potential that they'll make mistakes along the way.

Recognizing that—and helping them grow from it—helps keep everyone safer.

Safety

Culture of Safety

What do we mean by safety?

When we talk about protecting users, we’re talking about protecting them from anything that could cause harm. That includes things like harassment, hateful conduct, unwanted interactions from others, inappropriate contact, violent and abusive imagery, violent extremism, misinformation, spam, fraud, scams, and other illegal behavior.

It’s a broad set of experiences, but each of them can affect someone’s life negatively in real ways. They also degrade the overall experience of the community where they happen.

In addition to keeping users safe, we also need them to feel safe. That’s why we talk openly about our work, publish safety metrics each quarter, and give everyone access to resources in our Safety Center.

A culture of safety starts with the people building the technology

Discord’s commitment to safety is reflected in its staff and the way we build our products.

Over 15% of our staff works directly on the team that ensures our users have a safe experience. We invest time and resources here because keeping users safe is a core responsibility: It’s central to our mission to create the best place to hang out online and talk to friends.

Our safety work includes experts from all parts of the company, with a wide range of backgrounds. These experts are engineers, product managers, designers, legal experts, policy experts, and more. Some come from technology, and others from social work, teen media, human rights, and international law. Building a diverse team like this helps ensure we get a 360-degree view of threats and risks, and the best ways to protect against them.

We also invest in proactively detecting and removing harmful content before it is viewed or experienced by others. During the fourth quarter of 2023, 94% of the servers we removed were removed proactively. We’ve built specialized teams, formed external partnerships with industry experts, and integrated advanced technology and machine learning to keep us at the cutting edge of providing a safe experience for users.

Getting users engaged in safety

While rules and technical capabilities lay the foundation, users play a central part in making servers safe for themselves and others.

Our Community Guidelines clearly communicate what activities aren’t allowed on Discord. We warn, restrict, or even ban users and servers if they violate those rules. But we don’t want to spend our days chastising and punishing people, and we don’t want people to worry that they’re always at risk of being reported. That’s no way to have fun with friends.

Instead, we help users understand when they’ve done something wrong and nudge them to change their behavior. When people internalize the rules that way, they recognize when they, or someone else, might be breaking them. They discuss the rules organically, and in the process build a shared sense of what keeps their community safe and in good standing.

This has a multiplier effect on our work to build safer spaces: Users act with greater intention and they more proactively moderate themselves and their communities against unsafe actions. When communities have this shared commitment to the rules, they’re also more likely to report harmful activity that could put an otherwise healthy community at risk.

Changing the safety narrative

It’s sometimes said in technology that a platform can’t really reduce harm or block bad actors from doing bad things. Instead, those actions have to be addressed after they happen, when they’re reported or detected.

At Discord, we have loftier goals than that.

We actually want to reduce harm. We want to make it so hard for bad actors to use our platform that they don’t even try. For everyone else, we want to build products and provide guidance that make safety not just a technical accomplishment, but a cultural value throughout our company and the communities we support.

Our Approach to Content Moderation

Moderation across Discord

All users have the ability to report behavior to Discord. Our Safety team reviews user reports for violative behavior so we can take enforcement action where appropriate.

Discord also works hard to proactively identify harmful content and conduct so we can remove it and therefore expose fewer users to it. We work to prioritize the highest-harm issues, such as child safety or violent extremism.

We use a variety of techniques and technology to identify this behavior, including:

  • Image hashing and machine-learning powered technologies that help us identify known and unknown child sexual abuse material. We report child sexual abuse and other related content and perpetrators to the National Center for Missing & Exploited Children (NCMEC), which works with law enforcement to take appropriate action. 
  • Machine learning models that use metadata and network patterns to identify bad actors or spaces with harmful content and activity.
  • Human-led investigations based on flags from our detection tools or reports.
  • Automated moderation (AutoMod), a server-level feature that empowers community moderation through features like keyword and spam filters that can automatically trigger moderation actions (a simplified sketch of this kind of filter follows below).
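
As a simplified illustration of that last item, here is a minimal keyword-filter sketch in Python. It is not Discord’s actual AutoMod implementation; the blocked terms, function name, and returned actions are all hypothetical.

# Illustrative only: a toy server-level keyword filter. The terms and
# actions below are hypothetical, not Discord's real configuration.
BLOCKED_TERMS = {"free nitro giveaway", "example blocked phrase"}

def moderate(message: str) -> str:
    """Return a hypothetical moderation action for an incoming message."""
    lowered = message.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        # A real system might remove the message, time the author out,
        # and alert the server's moderators.
        return "block_and_alert_moderators"
    return "allow"

# Example: moderate("Click here for a FREE NITRO GIVEAWAY!") -> "block_and_alert_moderators"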

In larger communities, we may use automated means to proactively identify harmful content on the platform and enforce our Terms of Service and Community Guidelines.

If we identify a violation, we then determine the appropriate intervention based on the content or conduct and take action, such as removing the content or suspending a user account. The results of our actions are conveyed to users through our Warning System.

Creating a safe teen experience

In addition to all the measures we've outlined above, we add layers of protection for teen users.

Our proactive approach for teens includes additional safeguards because we believe in protecting the most vulnerable populations on Discord.

For teens, we may monitor the content or activity associated with teen accounts to identify potentially unwanted messages through Teen Safety Assist. For example, if we detect potentially unwanted DM activity directed at a teen account, we may send a safety alert in chat to give the teen information about how they might handle the situation. (For messages between adults, we do not proactively scan or read the content by default.)

Together we are building a safer place for everyone to hang out. To stay up to date on our latest policies, check out our Policy Hub.

Bringing Policies to Life

First, we do our research.

Discord’s policy team invests time and resources into understanding the dynamics of the type of behavior they’re writing policies about. “The team does a significant amount of research to understand: What is the phenomenon? What does this issue look like on other platforms? What does the issue look like in society?” said Bri Riggio, the senior platform policy manager who oversees this process.

The policy team reviews scholarly research, data, and nonprofit and industry guidance, among other resources. They bring in external subject matter experts from around the globe, like academics who study online behavior or legal scholars who focus on sexual harassment.

They also do internal research. They interviewed Discord’s Trust & Safety team—the ones who review content for violations—to see examples of what they consider bullying and harassment. 

One thing the policy team keeps in mind during this stage is how this research pertains to teens. “Teens are at the forefront of my research—thinking about how we protect young people who are online and making sure they feel safe and comfortable using Discord,” said Edith Gonzalez, a platform policy specialist. She added that she looks for research that addresses parents' concerns when it comes to issues such as cyberbullying.

We identify the problem we’re trying to solve for.

At the heart of each policy, Riggio and her team start by asking the question: What is the problem that we are trying to solve on Discord? She calls this idea the “spirit” of the policy.

“If we are trying to stop users from feeling psychological or emotional harm from other users, if we want them to have a good experience, then we write the policy to try and drive towards that north star,” she said. “It can be a guiding force for us, especially when we get escalations or edge cases where the policy may not be so clear cut.”

For example, if the Trust & Safety team gets a report of content that was flagged for bullying, but the content falls somewhere in a gray area or it’s not clear if the content was intentional, the team discusses if taking action under the policy is actually going to solve the problem or not. Another way of putting it: Does it go against the spirit of the policy? These discussions inform how the policy team can sharpen and better articulate policies over time.

We define the terms and the parameters for enforcement.

Policy specialists like Gonzalez begin listing all the different behaviors that describe the issue. For bullying and harassment, examples might include: sexual harassment, server raids, sending shock content, and mocking a person's death. “Once I write down all the criteria, I expand each behavior,” said Gonzalez. “How do you define a violent tragedy? How do you define mocking? How do you define unwanted sexual harassment? Slowly, the policy starts off as a list, and then I keep expanding it.”

For this policy, we define bullying as unwanted behavior purposefully conducted by an individual or group of people to cause distress or intimidate, and harassment as abusive behavior towards an individual or group of people that contributes to psychological or reputational harm that continues over time.

Riggio points out the significance of sustained behavior. It’s an acknowledgment that context matters. For example, friendly banter and joking between individuals could be misconstrued as harassment if you don’t take into account that friends sometimes joke around with each other. “We would be potentially over-enforcing the policy with users who are just having fun with each other.”

We index on educating users who we know are going to make mistakes.

Research and experience tell us that people don’t always know the rules, and when appropriate, they need an opportunity to learn from their mistakes. Discord built a warning system that helps educate users, especially teens. (Important note: The warning system does not apply to illegal or more harmful violations, such as violent extremism or endangering children. Discord has a zero-tolerance policy against those activities.)

“There's a perception that teens will roll their eyes at a rule, or they don't like rules, but the reality is, teens really like it when there are clear rules that resonate with them,” said Liz Hegarty, Discord’s global teen safety policy manager.

Once they learn the rule, they might slip up again, so the policy team built in several opportunities to educate teens that bullying isn’t allowed. These show up in the form of a series of time-outs, versus a flat-out ban after a few strikes, as is common on other platforms.

“It’s teen-centered,” said Hegarty. “It's this idea of: how can you educate as opposed to just punishing teenagers when they're making mistakes?”

This, too, reflects the spirit of the Bullying and Harassment policy, because ultimately, Discord wants to empower people to build meaningful relationships on the platform.

“The spirit is intended to get at the various behaviors that we know make users feel uncomfortable,” said Riggio. “That’s the behavior that impacts feelings of safety and the culture of a community.”

Remediation: Why you can’t ban your way to a safe platform

Discord’s remediation philosophy

Our approach to remediation is inspired by restorative justice, a concept from criminal justice that prioritizes repairing harm over punishment. Discord’s Community Guidelines exist to keep people and communities safe, and it’s important to make sure everyone knows how and why to follow them. And still, when people break the rules, most of the time they deserve a chance to correct themselves and show that they’ve learned.

When users break an online platform’s rules, they don’t always do so intentionally. Consider how many different platforms someone might use in a day, each with their own rules and expectations. Discord needs to enforce its own Community Guidelines, while understanding that a user who violates the Guidelines might not even know they had broken a rule in the first place.

“A lot of users have their own ‘terms of service’ that may be different from ours,” said Jenna Passmore, Discord’s senior staff product designer for safety. Admins and moderators set rules, norms, and expectations for their own servers on top of Discord’s Community Guidelines and Terms of Service, and users may not know when they’re breaching the rules of a server or the platform. “If there's a mismatch there, and it’s not a severe violation, we don't want to kick them off the platform.”

Instead, Discord wants to teach people how to stay in bounds and keep themselves, their friends, and their communities safe and fun for everyone. And, to be clear, when it comes to the most severe violations—such as those involving violent extremism and anything that endangers or exploits children—Discord will continue to have a zero-tolerance policy.

Putting users in charge of the outcomes

In most cases, breaking Discord’s Community Guidelines triggers a series of events that are aimed at returning the user to their community with a better understanding of how they should behave.

Education comes first. When Discord spots a violation, it lets a user know what specific thing they did wrong. They’re prompted to learn more, in plain language, about that rule and why it matters. Nobody expects someone to memorize the full Community Guidelines or Terms of Service, but this is the best moment to help them learn what they need to know.

Consequences come next. Any penalties we impose are transparent and match the severity of the violation. A first-time, low-level violation, for example, could prompt a warning, removal of the offending content, or a cooling-off period with temporary restrictions on account activity.

When the action Discord takes to enforce its Community Guidelines takes proper measure of the violation and provides users a path back to good standing, users can often return to their community with a greater understanding of the rules and the need to change their own behavior. If users stay on good terms, similar violations in the future will play out roughly the same way. If they continue to break the rules frequently, the violations stack up, and so do the consequences. The idea is to provide users an opportunity to learn from their mistakes, change their behavior, and therefore stay in communities they know and trust on Discord, rather than pushing users who violate Discord’s Community Guidelines off the platform and to other spaces that tolerate or encourage bad behavior.

“We want users to feel like they have some agency over what's going to happen next,” Passmore said. “We want an avenue for folks to see our line of thinking: You violated this policy, and this is what we're doing to your account. Learn more about this policy, and please don't do this again. If you do it again, here's what's going to happen to your account.”

Banning is reserved for the most extreme cases, and the highest-harm violations result in immediate bans. Otherwise, if someone repeatedly demonstrates that they won’t play by the rules, the consequences become more severe, from losing access to their account for a month, to a year, and eventually—if they continue to violate the Community Guidelines—for good.

Everyone knows where they stand

Remediation depends on users who learn and change their behavior being able to return to good standing. Discord’s account standing meter lets users see how their past activity affects what they’re able to do on the platform.

The standing meter has five levels: good, limited, very limited, at risk, or banned.

“For most folks, it will always be ‘good’,” Passmore said. When it’s not, the hope is that someone will act more thoughtfully so they can return to that good status and stay there. She compared it to road signs that show the risk level for wildfires. “It changes your behavior when you’re driving through a national park and you see that there’s a moderate fire risk,” she said.

Discord’s internal policies outline the severity of each type of violation, the appropriate consequence, and how one type of violation compares to others. These policies help ensure that the employees who review violations of Discord’s Community Guidelines enforce the rules fairly and consistently. That means when someone breaks the rules, employees aren’t deciding punishment on an ad hoc basis. Instead, the outcome is determined by the severity and number of violations.
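
As a purely hypothetical sketch of that idea (Discord’s internal policies are not public, so every violation name, threshold, and duration below is invented for illustration), an escalation lookup might look roughly like this:

# Hypothetical escalation ladder; none of these values come from Discord.
ZERO_TOLERANCE = {"violent_extremism", "child_endangerment"}  # immediate, permanent ban

ESCALATION = [
    "warning",              # first low-harm violation
    "hours_long_timeout",   # repeat violation
    "month_suspension",
    "year_suspension",
    "permanent_ban",
]

def consequence(violation_type: str, prior_violations: int) -> str:
    """Map a violation's severity and the user's history to an outcome."""
    if violation_type in ZERO_TOLERANCE:
        return "permanent_ban"
    step = min(prior_violations, len(ESCALATION) - 1)
    return ESCALATION[step]

# Example: consequence("harassment", prior_violations=0) -> "warning"
#          consequence("harassment", prior_violations=3) -> "year_suspension"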

“Our goal is to help people act better,” said Ben Shanken, Discord’s vice president of product, who oversees teams that work on growth, safety, and the user experience. “People don't always know that they're doing bad things, and giving them a warning can help them to improve.”

Ultimately, we believe that guiding users toward better behavior—and giving them the tools they need to learn—results in a better experience for everyone.

How do you define troll-y?

Defining the terms

One of the main challenges for the Platform Policy team is writing the rules for content that isn’t so clearly defined. Riggio notes that for the most severe violations, such as posting illegal material or making an imminent threat of physical harm, the line between what’s allowed and what’s prohibited is bright red. Discord has a strict policy for those types of violations and clear processes in place for removing content and banning the user.

But for content where there aren’t legal definitions or even clear societal ones, like for bullying and harassment, the line is more gray.

Landing on a definition starts with posing a fundamental question, said Riggio: “What is the problem that we’re trying to solve here on Discord?” How the team goes about answering that challenge becomes the guiding force behind each policy.

For bullying and harassment, the team looks at the user experience. We don’t want people to experience psychological or emotional harm from other users. 

“The spirit of the policy is focused on individual and community experience and building healthy interactions,” Riggio said. Because ultimately, Discord is a place where everyone can find belonging, and experiencing these harms hinders people from doing that.

The team did rigorous research, including reviewing other platforms and guidance from industry groups such as the Global Internet Forum to Counter Terrorism and the Tech Coalition, which was formed so companies could share the best intelligence and resources when it comes to child safety online. They also read the academic literature on what bullying and harassment look like in society.

They also brought in subject matter experts from different countries, such as academics who study harassment and legal scholars who can offer additional context for how platforms should be thinking about policies and enforcement. These experts provided feedback on early drafts of the policies.

They defined bullying as unwanted behavior purposefully conducted by an individual or group of people to cause distress or intimidate particular individuals, and harassment as abusive behavior towards individuals or groups of people that contributes to psychological or reputational harm that continues over time.

Policies like this one are only useful if they are enforceable. So the team identified two dimensions that must be taken into account when assessing content.

First, the intent of the instigator. These are signals that the user or the server is posting content deliberately or is intending to cause harm. Intent isn’t always easy to spot, but Riggio said just as in real life, there are signs. For example, a user is directly harassed by someone, or there have been multiple instances of harassment in the past.

“What happens in society happens online. What happens in your high school can sometimes happen on Discord,” she said.

And second, is the content targeted? Is it targeting a specific individual by name? Is it targeting a specific group? Here, there could also be dimensions of targeting users or groups because of certain protected characteristics, such as race or gender, which would then be considered a violation of Discord’s Hate Speech policy.

How this policy covers trolling

The term trolling has many dimensions. It’s been used to normalize “edgy behavior,” Riggio said. “Trolling in and of itself is a complicated term, because some communities that intend to cause harm downplay their behavior with, ‘We’re just trolling.’”

Trolling on Discord takes on different dimensions. From “server raids,” when a group of users joins a server at the same time to cause chaos, to “grief trolling,” when someone mocks or degrades the deceased or their next of kin, this type of negative behavior is rooted in disruption and harassment and is not tolerated on our platform, no matter how harmless the instigators may think it is.

“I hate the saying ‘sticks and stones may break my bones, but words will never hurt me,’ because words hurt,” said Patricia Noel, Discord’s mental health policy manager who is a licensed social worker and has a background in youth mental health. She collaborates cross-functionally within Discord and with external partners, such as the Crisis Text Line, with the goal of providing tools and resources to teens who may be experiencing mental health challenges.

Why education is as important as enforcement

Part of the enforcement strategy for bullying and harassment is to include a series of warnings, with escalating enforcement actions aimed at educating users, particularly teens, that certain behaviors are not OK on Discord.

“This generation is really empathetic. But I also know that there are people who go online just for the sake of going online and being a troll,” Noel said. “Those might be the very same folks who are having problems at home, who are dealing with their own mental health and well being issues. They might not immediately recognize that they are doing harm to someone else.”

Understanding that young people are going to make mistakes is reflected in how the new warning system handles suspensions. For lower harm violations falling under the Bullying and Harassment policy, Discord will test suspensions rather than permanent bans. The idea here is that some teens may need a long pause before coming back to the platform, given how much growth and development occurs during each year of adolescence.

And while there will always be a zero-tolerance policy for the most serious violations, Discord will focus on remediation, when appropriate, for topics like bullying and harassment. This is so teens can have an opportunity to pause, acknowledge that they did something wrong, and hopefully change their behavior.

The challenge is that the behavior is so wide-ranging, said Ben Shanken, Discord’s Vice President of Product. He pointed to a recent analysis by Discord that looked at reports of bullying in large, discoverable servers.

“If you put these examples on an SAT test and asked, ‘Is this bullying?’ many people probably would answer the questions incorrectly,” he said, noting how Discord’s large teen user base is inclined to “troll-y” behavior.

In one example, Shanken shared how teens were trolling each other by deceptively reporting their friends to Discord for supposedly being underage when in fact they were not.

It’s trolling. While it’s not nice, it’s also part of how teens joke around. But instead of simply banning the offender, we build in an opportunity for them to learn from their mistake.

“If we ban a person, and they don't know what they did wrong, they likely won't change their behavior,” said Shanken. “Giving them a warning and saying, here's what you did wrong, flips the script in a more constructive way.”

At its core, Discord is a place where people can come together to build genuine friendships. Educating users, especially young people, through our warning system is one part of Discord’s holistic strategy to cultivate safe spaces where users can make meaningful connections.

“The nature of being a teenager and navigating relationships can be tricky. There are going to be situations where you hurt people's feelings. There may be moments where users lose their cool and say something really mean or even vaguely threatening,” Riggio said. “We want to build in more of a runway for them to make mistakes online but also learn from them.”

1. Keep it simple.

When designing safety alerts, Discord uses plain language, avoids idioms and metaphors, and keeps sentences and words short when possible. This makes alerts easier to understand in the moment, without overloading users with information. This can be particularly useful for users who may be in a heightened emotional state, for example, if they just got a notification that they violated Discord’s anti-harassment policy or they want to report someone else for harassing them.

Our design team tries to put themselves in the teens’ shoes in those moments and imagine what their emotional state might be at the time. The simpler we can make options for them, the better. We try to make it easier for them to move through the flow so they are clear about what's supposed to happen.

Simple language is timeless. It isn’t trendy or metaphorical. And it translates better across different languages and cultures.

Take, for example, when a teen receives a DM from another user for the first time; Discord may provide some helpful education about online safety.

2. Guide, don’t reprimand.

For features like Teen Safety Alerts, Discord partnered with Thorn, a leading child safety nonprofit, to design features with teens’ online behavior in mind. The goal is to empower teens to build their own online safety muscle, not make them feel like they've done something wrong. So Discord builds teachable moments and guidance into our products, using language that aims to restore the user’s sense of control without shaming them.

“With teens, because they can't opt out of some of these alerts, we want it to feel more like a guide—a guardrail versus a speed bump,” said Passmore, the lead product designer for the Safety team. “A speed bump is saying, ‘Hey, you're doing something wrong, you need to check yourself.’ But we built Teen Safety Alerts to feel more like guidance.”

This alert encourages them to double-check whether they want to reply, and provides links to block the user or view more tips for staying safe. The goal is to empower teens by providing information that helps them make important decisions to protect themselves.

3. Show that you’re listening and can help them regain control.

When a teen encounters potentially harmful or sensitive content or has a run-in with another user who is acting hostile, they, too, may be in an activated emotional state. They may be angry at the other person, or even at themselves. This is common with cyberbullying, which researchers say can lead to depression and feelings of low self-esteem and isolation.

Our approach is to meet them in that mindset. This means reminding them that they have options, like blocking a person or reporting a message. We create a space to acknowledge what they’re experiencing.

It’s also important not to dwell on the upsetting incident. Instead, Discord uses language that aims to move the teen out of that emotional state and into a place of self-awareness. This can help them to calm down and feel more in control.

A user’s mindset matters to Discord. That is why we choose our words so carefully, to offer support in the best possible way.

How teens find and explore community on Discord

Young people find community and identity online

Online connections are a fixture of modern life. One in three US adults report they are online “almost constantly,” and a similar share of teens say they use social media just as much, according to the Surgeon General’s advisory.

It’s probably true that everyone spends too much time on their phones — but that doesn’t mean there aren’t important things happening there. For young people in particular, online spaces can be great for hanging out with friends, building social circles, and forming identity at a key juncture in life.

Technology also removes geographical limits placed on friendships and allows people to find like-minded peers much more freely.

It’s important to have opportunities to connect with people who are into the same things, whether those are passions, causes, or an identity. Coming together online can help people find community, and help build genuine connections.

On Discord, that may happen in servers focused on a favorite sports team, video game, or TV show. Servers can be small and intimate, with just your core friend group who might use it for hanging out and chatting. They can also be larger spaces for more people to share ideas, or organized around identities, including neurodivergence, disabilities, or LGBTQ communities; as well as special interests like music, art, or tech.

For some people, spaces like these offer support and social connection that might be hard to come by where they live.

“Especially as an LGBTQ young person, being able to find a space, be it online or in person, can literally be life-saving if they come from an unsupportive community,” said Patricia Noel, Discord’s mental health policy manager and a licensed social worker. “Just being able to find people who accept you, who use your pronouns — all I can say is that it can be a life-saving experience for young people to be able to come on Discord and to find community.”

More people are talking about mental health

Many young people use Discord servers to support each other as they navigate a difficult stage in life. In servers focused on peer support with mental health, members gather to discuss what they’re struggling with and build community around helping one another.

These spaces created and managed by users aren’t replacements for mental health professionals, but they offer an accessible and reliable space to be heard, Noel said. Young people look to communities of friends for validation and support, and they’re at an age in life when that has a significant impact on their sense of self.

“Young people are really invested in their mental health. They want to be having these conversations,” Noel said. “They're so open about what they're experiencing and feeling, and even the way that they support each other is amazing. They're asking the right questions, they’re so in tune with one another, and they’re such an empathetic group.”

In spaces dedicated to discussing mental health, people may feel more free to share things they aren’t yet comfortable sharing elsewhere. They can show up in a space that’s designed to offer words of validation and affirmation, and offer that same kind of support back to others.

“There’s a lot of research coming out around the benefits of peer support and just being able to talk to somebody who's gone through what you’re going through, or who's just within your same age bracket,” Noel said. “It ends up being beneficial for the person seeking support as well as the person who is the supporter.”

Building positive spaces for online communities

The experience of hanging out with friends is central to Discord.

Servers are all about conversation — text, voice, and video — because that’s how people build genuine connections, in the real world and online. Conversations happen at both the server level and also one-on-one or with smaller groups in Direct Messages, or DMs. We think of this in contrast to social media apps that are built for broadcasting and likes, and driven by algorithms that monopolize users’ attention. When all your time is spent on shallow interactions and endless feeds of other people seemingly living their best lives, it’s hard to feel better or more connected.

Often, the opposite is true: research published in the Journal of Medical Internet Research found that social comparison on digital platforms, where people judge themselves against others they see online, is “a significant risk factor for depression and anxiety.” Experiences of quality interactions and social support online, on the other hand, are related to lower levels of depression and anxiety.

The good news is we know how online communities can bring people together. The internet can still provide new solutions to people’s need for social connection, and Discord is committed to fostering safe, healthy, and supportive spaces on our platform.

“The internet isn’t going anywhere. It's a tool that young people use, and it's going to continue to be a part of their lives,” Noel said. “They want to be part of the solutions. They want to be included when it comes to deciding what happens next.”

Understanding and Avoiding Common Scams

Social Engineering

Social engineering is a manipulation tactic used by bad actors to trick individuals into divulging sensitive or personal information. The bad actor often poses as a trustworthy entity, offering a seemingly beneficial exchange of information. In its most basic form on our platform, social engineering is manipulating people to give their login credentials to an attacker.

Discord Staff Impersonation

Sometimes attackers try to impersonate Discord staff to gather information. To use this tactic, they hack into Discord accounts, then tell the people on the compromised account’s friends list that they’ve “accidentally reported them” and encourage them to reach out to “Discord Employees” to resolve the issue.

These impersonators often copy social media profiles onto Discord accounts, produce fake resumes, and may even claim their staff badges are hidden for safety reasons. The end goal is to trick you into surrendering your account information or paying for their fraudulent services to “undo the report,” ultimately acquiring your financial assets.

Discord Staff will never directly message users on the app for support or account-related inquiries. If someone claiming to be staff asks for personal information, payment, or changes to your login credentials, we recommend that you do not engage further. All Discord users can report policy violations in the app by following the instructions here.

Discord Staff are just one of many groups that may be impersonated. Similar scams target other companies as well, so be wary of accounts impersonating Support or Safety staff at other companies, too. In general, if you need support from any company, it is wise to go to the official source.

You can always verify your account standing directly from Discord by going into User Settings > Privacy & Safety > Standing. Learn more about account standing here.

Impersonated Discord DMs

Attackers may also resort to impersonating official Discord responses through user accounts or bots. Typically, these messages include threats to your account standing if you do not comply with their demands. An official Discord DM will never ask for your password or account token, and it will always display a staff badge on the profile, as well as a system badge that says “Official.”

Malware Tricks

Malware often finds its way onto a device through downloads of malicious files. These files may appear harmless or even enticing—like a game from a friend. But once downloaded and run, they can give bad actors access to your login credentials, email addresses, and even your entire device.

Malicious Links and Fake Nitro Giveaways

Always exercise caution when clicking on links that will take you off of Discord, even when they appear to come from friends or promise rewards like free Nitro.

When you click on a link someone sends you, a pop-up will show that you are leaving Discord and display the website you are being redirected to. Check the link to make sure it is taking you where you intend to go.

Reporting Scams

Reporting safety violations is critically important to keeping you and the broader Discord community safe. All Discord users can report policy violations in the app by following the instructions here. Stay vigilant and informed to protect yourself and your digital assets.

Scams go against Discord’s Community Guidelines, and when we see this kind of activity, we take action, which can include banning users, shutting down servers, and engaging with authorities. We are committed to reducing scams through technical interventions, and we continuously invest in safety enhancements and partner with third parties to accelerate our work.

For more information you can read our Deceptive Practices Policy Explainer as well as our Identity and Authenticity Policy Explainer.
