08/12/2024 | Press release | Distributed by Public on 08/12/2024 13:15
WASHINGTON - Today, U.S. Sen. Mark R. Warner (D-VA) pressed Discord, an instant messaging and social media platform, about the company's failure to safeguard minors and stop the proliferation of violent predatory groups who target children with the goal of forcing them to end their own lives and livestream the act online.
This letter follows a September warning from the FBI alerting Americans to the existence of these violent online groups, which exist on messaging platforms and deliberately extort children into producing child sexual abuse material (CSAM) or sharing acts of self-harm online. According to the warning, issued by the FBI's Internet Crime Complaint Center, these groups target minors between the ages of 8 and 17 years old and focus on racial and ethnic minorities, LGBTQ+ youth, and those who struggle with a variety of mental health issues.
"I am extremely concerned about this abuse, and I am profoundly saddened that it has affected Virginia families, including the daughter of a military family who was coerced into self-harm and to attempt suicide," Sen. Warner wrote. "I recognize that Discord's Trust & Safety team is aware of this type of activity and has taken some actions to detect and remove some of these violent groups from their platforms. However, despite increased moderation, predators continue to target minors on your platform."
"As a teenager, I fell victim to the cruel manipulation of violent predatory groups on Discord. During a period in my life where I struggled with anxiety, depression, and eating disorders, they took advantage of my feelings of isolation, and encouraged me to self-harm and even end my life. While I'm deeply grateful to have escaped their abuse, I'm heartbroken to know that this violent, dangerous behavior persists on Discord," said Abrielle, the Virginia teenager who was coerced by "King" into attempting suicide before being found by first responders in time to save her life. "Enough is enough - tech companies need to do more to crack down on the predatory groups that nearly took my life. Discord owes it to a generation of kids and teens to eliminate the extremely harmful content that abounds on their platforms."
Sen. Warner continued, "I urge you to devote more resources to this problem, including dedicating a greater number of content moderators, investigators, engineers, and legal professionals to it. It is my understanding that Discord currently enforces its policies through actions like suspending policy-violating users' accounts and servers, as well as banning their Internet Protocol (IP) addresses and email addresses. I also understand that there are far more sophisticated measures, such as device-based or cookie-based bans, that could be taken to prevent identified malign users from returning to your platform. Further, I am aware of measures that could be used to proactively detect harmful activity and initiate an early intervention to prevent harm and loss of life."
In the letter, Sen. Warner demands answers to a series of questions about the company's efforts to address these predatory groups. Specifically, he asks that Discord outline its policies and procedures around content that violates Discord's Terms of Service, and that it share more information on its detection mechanisms, enforcement actions, measures to prevent the re-entry of malicious actors, and more. He also requests answers on the number of accounts that have been removed over the last four years, and on the volume of content depicting or promoting suicidal ideation.
Today's letter also follows recommendations issued in July by the Biden-Harris Administration's Kids Online Health and Safety Task Force to address online health and safety for children and youth, with specific recommendations made to industry. It also comes on the heels of the Senate passage of the Kids Online Safety Act ("KOSA") and the Children and Teens' Online Privacy Protection Act ("COPPA 2.0"), which would require online platforms to take specific measures to protect the safety and privacy of children using their platforms.
A copy of the letter is available here and below.
Dear Mr. Citron:
I write today regarding disturbing reports that Discord is being used by violent predatory groups to coerce children into self-harm. The failure of your company to stop this activity is deeply troubling, and the lack of adequate safeguards to protect vulnerable individuals, especially teens and children, from this degrading and violent form of abuse is of grave concern. I urge you to quickly take steps to remove malicious actors from your platform, prevent their future access, and collaborate with law enforcement officials to bring safety and justice to the victims.
On September 12, 2023, the FBI's Internet Crime Complaint Center (IC3) issued a warning to the public that violent online groups are deliberately targeting minor victims on messaging platforms to extort them into recording or live-streaming acts of self-harm and producing child sexual abuse material (CSAM). IC3 noted that these groups are targeting minors between the ages of 8 and 17 years old, especially LGBTQ+ youth, racial and ethnic minorities, and those who struggle with a variety of mental health issues. The warning further noted that these groups often control their victims through inflicting extreme fear, extorting them through threats of sharing sexually explicit videos or photos of the minor victims with their friends and families, and many have an end-goal of forcing these minors into completing suicide on live-stream to view and record for their own entertainment or sense of fame.
I am extremely concerned about this abuse, and I am profoundly saddened that it has affected Virginia families, including the daughter of a military family who was coerced into self-harm and to attempt suicide. The severe harm that the family's daughter faced from a predatory user going by the name "King" closely mirrored a story published in the Washington Post. This report detailed how one of these violent online groups misused your platform, engaging in pervasive harassing conduct that resulted in the deaths of several minors. It further described how Discord's Trust & Safety team has struggled to keep this specific group off the platform despite knowing of its existence. "King" ultimately coerced the Virginia minor into attempting suicide. Fortunately, first responders were able to reach her in time to save her life.
I recognize that Discord's Trust & Safety team is aware of this type of activity and has taken some actions to detect and remove some of these violent groups from their platforms. However, despite increased moderation, predators continue to target minors on your platform. I urge you to devote more resources to this problem, including dedicating a greater number of content moderators, investigators, engineers, and legal professionals to it. It is my understanding that Discord currently enforces its policies through actions like suspending policy-violating users' accounts and servers, as well as banning their Internet Protocol (IP) addresses and email addresses. I also understand that there are far more sophisticated measures, such as device-based or cookie-based bans, that could be taken to prevent identified malign users from returning to your platform. Further, I am aware of measures that could be used to proactively detect harmful activity and initiate an early intervention to prevent harm and loss of life.
On July 22, 2024, the Biden-Harris Administration's Kids Online Health and Safety Task Force issued a report providing guidance to address online health and safety for children and youth, with specific recommendations made to industry. Several recommendations address the harm detailed in this letter, including developing and deploying mechanisms and strategies to counter child sexual exploitation and abuse, using data-driven methods to detect and prevent online harassment and abuse, and providing age-appropriate parental control tools. The findings and recommendations of this task force underscore the need for platforms like Discord to act on the self-harm extortion of minors. I urge Discord to review the detailed recommendations made in the report and to take them seriously.
I respectfully request that you respond to this letter with detailed answers to the following questions:
Thank you for your prompt attention to this letter, and I look forward to reviewing your response.
Sincerely,
###