Raleigh Man Arrested; Discord-Roblox Lawsuits Claim Platforms Aid Child Predators

Lawsuits representing more than 1,400 victims claim Discord and Roblox are designed in ways that enable predators. The filings follow an arrest in Raleigh, NC.

Federal agents in Raleigh recently searched the home of Andrew Swenson, 42, finding bags containing rope, zip ties, and duct tape. Swenson faces charges of coercion and enticement of a 13-year-old and production of child pornography involving an infant. His arrest is a single data point in a growing map of litigation claiming that the design of Discord and Roblox functions as a pipeline for predators.

"The FBI searched Swenson’s home where agents found bags containing rope, zip ties and duct tape… [and] photos/videos of him abusing an infant." — U.S. Attorney’s Office.

The Mechanics of Exploitation

The lawsuits, involving more than 1,400 victims, describe a repeating pattern of behavior enabled by platform architecture. Predators initiate contact within the gaming environment of Roblox, often using the in-game currency Robux as bait, before migrating the child to the less-monitored private servers of Discord.

| Platform | Role in Pipeline | Known Vulnerability |
| --- | --- | --- |
| Roblox | Initial contact | Age-bypass via self-reported birthdates |
| Discord | Grooming/CSAM | Encrypted or private channels; file sharing |
| Legal shield | Section 230 | Immunizes platforms from user-generated content |

Lawsuits allege these companies prioritize user growth over friction-heavy safety measures.


Legal firms including Gould Grieco & Hensley and Milberg are currently testing the limits of Section 230 of the Communications Decency Act, the law that generally shields tech companies from liability for what their users do or say on their platforms. The new wave of litigation argues that the design of the platforms, not just the content they host, is what is defective and dangerous.


  • Andrew Swenson allegedly used Discord to plan an abduction in Nebraska.

  • Roblox recently restricted direct messaging for users under 13, but the system still relies on "trust-based" birthdate entry.

  • Families report lasting trauma, including PTSD and depression, as children move from gaming avatars to real-world threats.

Background: The Growth Mandate

Both Discord and Roblox have marketed themselves as "digital third places": safe spots for kids to hang out when physical parks are empty. Roblox (NYSE: RBLX) shares have seen volatility as these lawsuits emerge, highlighting the tension between a "safe for kids" brand image and the reality of poorly moderated corners of each platform. While the companies say they use automated tools to scrub CSAM (child sexual abuse material), the Swenson case suggests that determined predators still find clear channels to exploit.


Reflective Note: The infrastructure of the modern internet is built on the assumption of 'good' traffic, yet it provides the same efficiency to those carrying zip ties and duct tape.

Frequently Asked Questions

Q: What happened in Raleigh that relates to Discord and Roblox?
Federal agents arrested Andrew Swenson, 42, in Raleigh, NC, on charges of child enticement and child pornography. This arrest is linked to lawsuits against Discord and Roblox.
Q: What do the lawsuits claim about Discord and Roblox?
Lawsuits claim that the way Discord and Roblox are designed helps predators find and groom children. Predators reportedly use Roblox to contact children and then move them to Discord.
Q: How do predators allegedly use Roblox and Discord?
Predators are said to start contact on Roblox, sometimes using game currency like Robux to attract children. They then move the conversations to Discord's private servers for grooming.
Q: Who is affected by these alleged platform vulnerabilities?
More than 1,400 victims are represented in the lawsuits. Families report lasting trauma, including PTSD and depression, as children face real-world threats after online contact.
Q: What legal protection do Discord and Roblox have?
Discord and Roblox are protected by Section 230 of the Communications Decency Act, which usually shields tech companies from lawsuits over user-generated content. New lawsuits argue the platform design itself is the problem.
Q: What are Discord and Roblox doing about safety?
Both companies say they use tools to find and remove child abuse material. Roblox recently limited direct messaging for users under 13, but relies on users to enter their correct birthdate.