Discord updates its policy to explicitly prohibit AI-generated CSAM and any text or media content that sexualizes children

TL;DR:

  • Discord updates its policy to explicitly prohibit AI-generated CSAM and any text or media content that sexualizes children.
  • Teen dating servers are banned, and actions will be taken against users engaged in inappropriate behavior.
  • Discord launches a Family Center tool for parents to monitor their children’s activity on the platform.
  • The platform aims to enhance safety for underage users with proactive scanning and innovative measures.

Main AI News:

In a move to safeguard its young user base, Discord, the popular communication platform, has implemented a comprehensive policy update. The revision responds to alarming reports that predators have misused the app to create and distribute child sexual abuse material (CSAM) and to groom vulnerable teenagers.

An investigation by The Washington Post detailed how the rise of generative artificial intelligence (AI) has led to a proliferation of lifelike explicit images of minors. The publication uncovered conversations about using Midjourney, a text-to-image generative AI tool accessed through Discord, to generate inappropriate images of children.

Acknowledging the gravity of the issue, Discord has moved to confront the problem head-on. The platform now explicitly bans AI-generated photorealistic CSAM, and has extended the prohibition to any text or media content that sexualizes minors, taking an uncompromising stance on protecting children and adolescents from exploitation.

In a bold move to eliminate potential threats, Discord has decided to ban teen dating servers, acknowledging the inherent risks associated with such platforms. This critical action is a direct response to a previous investigation conducted by NBC News, which unearthed the presence of Discord servers explicitly advertised as spaces for teen dating. Disturbingly, these servers facilitated the solicitation of explicit imagery from minors by certain participants.

Discord has also committed to taking swift and decisive action against any user found engaging in such behavior. In previous cases, adult users have been prosecuted for grooming minors on the platform, and criminal networks have even extorted underage individuals to obtain sexually explicit images. Notably, the updated policy specifies that older teenagers who groom younger teens will be subject to review and appropriate action under Discord’s Inappropriate Sexual Conduct with Children and Grooming Policy.

In addition to these policy updates, Discord has launched a new tool called Family Center. Designed for parents, it lets them monitor their children’s activity on the chat service: while the contents of their kids’ messages remain inaccessible, parents can opt in to see who their children connect and communicate with on the platform. These measures complement Discord’s existing safeguards, which include proactively scanning images uploaded to the platform using the industry-standard PhotoDNA technology.

Conclusion:

Discord’s policy update and reinforced child protection measures demonstrate its commitment to combating CSAM and teen grooming. By explicitly banning AI-generated CSAM and content that sexualizes children, and by acting against users involved in inappropriate behavior, Discord sets a strong precedent for safeguarding its young user base. The Family Center tool further empowers parents to monitor their children’s activity, fostering a safer environment. These steps not only prioritize child protection but also set a standard that may encourage other platforms to implement similarly robust measures for the safety and well-being of their users.
