Highlights of the complaint’s allegations
Discord’s Platform is Structured to Encourage Unchecked and Unmoderated Engagement Among Its Users
Discord designed its app to appeal to children’s desire for personalization and play by offering custom emojis, stickers, and soundboard effects, all of which are intended to make chats more engaging and kid-friendly. And it has created or facilitated “student hubs” as well as communities focused on popular kids’ games, like Roblox.
Once engaged, Discord encourages and facilitates free interaction and engagement between its users. Specifically, Discord’s default settings allow users to receive friend requests from anyone on the app—and to receive private direct messages from friends and anyone using the same server or virtual “community”—enabling child users to connect easily and become “friends” with hundreds of other users. Then, because Discord’s default safety settings disable message scanning between “friends,” child users can be—and are—inundated with explicit content. This explicit content can include user-created child sexual abuse material, messages intended to sexually exploit or coerce a child to engage in self-harm, internet links to sexually explicit content, images, and videos depicting violence, and videos containing sexually explicit content. In short, the app’s design makes it easy for children to connect with other users, but also allows predators to lurk and target them, undeterred by the safety features Discord touts as reasons that parents and users should trust its app.
Discord Misled Users About its “Safe Direct Messaging” Feature
From March 28, 2017 until April 22, 2023, Discord included “Safe Direct Messaging” settings in the “Privacy & Safety” menu of Discord’s “User Settings.” The settings purported to control how direct messages from other users would be scanned and deleted before receipt by the intended user. The Safe Direct Messaging setting contained three options:
- Keep me safe. Scan direct messages from everyone.
- My friends are nice. Scan direct messages from everyone unless they are a friend.
- Do not scan. Direct messages will not be scanned for explicit content.
For most of the feature’s existence, Discord made the “My friends are nice” option the default setting for every new user on the app. This option scanned incoming direct messages only if the sender was not on the user’s friends list. For both the “Keep me safe” and “My friends are nice” settings, Discord represented that it would “[a]utomatically scan and delete direct messages you receive that contain explicit media content.” But this was not true. Despite its claims, Discord knew that not all explicit content was being detected or deleted.
Discord’s Design Decisions Exacerbated the Risk to Children on the App
Combined with Discord’s deception about its Safe Direct Messaging features, Discord’s other design choices worked together to virtually ensure that children were harmed or placed at risk of harm on its app. For example:
- By default, Discord allows users to exchange DMs if they belong to a common server. Therefore, a malicious user—adult or child—need only join a community server, which could contain over a million users, to exchange DMs with an unsuspecting child user.
- DMs among “friends” are even more dangerous. Discord’s default settings not only allow any user to send a friend request to a child, they also then permit those users, once “friends,” to exchange totally unscanned DMs through the default “My friends are nice” setting. Children can receive and accept friend requests from users whom they do not know and with whom they have no connection, and then engage privately on the platform without any oversight—all by design.
- Users may also create multiple accounts to hide their activities and circumvent being banned from servers, or from facing other repercussions. And even if users are banned from a server, or from Discord itself, Discord’s design allows them to simply re-engage using a brand new, easily created account.
Discord Misrepresented That Users Under the Age of 13 Are Not Permitted to Create Accounts and Are Banned from Discord Upon Discovery
At all relevant times, Discord’s Terms of Service have stated that users must be “at least 13 years old and meet the minimum age required by the laws in [the users’] country.” To this day, however, Discord only requires individuals to enter their date of birth to establish their age when creating an account—nothing more. Discord does not require users to verify their age or identity in any other way. Simple verification measures could have prevented predators from creating false accounts and kept children under 13 off the app more effectively.
Nevertheless, Discord actively chose not to bolster its age verification process for years and has allowed children under the age of 13 to operate freely on the app, despite their vulnerability to sexual predators.
Simply put, Discord has promised parents safety while simultaneously making deliberate choices about its app’s design and default settings, including Safe Direct Messaging and age verification systems, that broke those promises. As a result of Discord’s decisions, thousands of users were misled into signing up, believing they or their children would be safe, when they were really anything but.
Discord knew its safety features and policies could not and did not protect its young user base, but refused to do better, the complaint alleges. In particular, Discord misled parents and kids about its safety settings for direct messages (“DMs”).
Everybody is so quick to blame the parents in these situations. Maybe there is some truth to that, but people also need to reckon with the fact that kids (and adults) are being constantly inundated by Skinner-box apps and “platforms” full of engagement bait, designed to be as addictive and attractive as possible. All run by corporations with functionally no regard for the safety of their users.
Yeah, sure, if you’re giving advice to an individual parent, they should probably be keeping a closer eye on what their kids are doing.
But there are systemic problems here that can’t be fixed with individual action. By laying the blame solely at the feet of the parents here, you are in effect putting parents up against dozens of huge corporations, each with armies of expert advertisers, designers, and psychologists working to build these products. It’s hardly a fair fight.