Roblox, Discord sued after 15-year-old boy was allegedly groomed online before he died by suicide

The mother of a 15-year-old California boy who took his own life is now suing Roblox and Discord over his death, claiming her son was groomed and coerced into sending explicit images on the apps.

Rebecca Dallas filed the lawsuit Friday in San Francisco County Superior Court, accusing the companies of "recklessly and deceptively operating their businesses in ways that led to the sexual exploitation and suicide" of Ethan Dallas.

Ethan was a "bright and imaginative child who loved gaming, streaming and connecting with friends online," the lawsuit says.

He began playing on the online gaming platform Roblox around age 9, with his parents' approval and with parental controls in place. When he was 12, he was targeted by "an adult sex predator" who posed as a child on Roblox and befriended Ethan, Rebecca Dallas' lawyers said in a statement.

What began as innocent conversation "gradually escalated to sexual topics and explicit exchanges," the complaint says.

After a while, the man encouraged Ethan to disable parental controls and move their conversations to Discord, the lawyers said.

On Discord, the man "demanded increasingly explicit photographs and videos" and threatened Ethan that he would publish or share the images. Ethan complied out of fear, the complaint says.

"Tragically, Ethan was permanently harmed and haunted by these experiences, and he died by suicide at age 15," the complaint said.

The lawsuit accuses Roblox and Discord of wrongful death, fraudulent concealment and misrepresentation, negligent misrepresentation, and strict liability.

It argues that if Roblox and Discord had taken steps to screen users before allowing them onto the apps, or had implemented age and identity verification and other safety measures, "Ethan would never have encountered this predator, never would have suffered the harm he did, and never would have died by suicide."

Apps not safe for children, suit says

Dallas, of San Diego County, believed both platforms were safe for her son to use to communicate with friends while gaming, given how the apps were marketed and the parental controls she had set up, according to the lawsuit.

Roblox, which offers a variety of games, obstacle courses and the ability to chat with other users, is used by 111 million people daily, according to its website. Creating an account is free, and there is no minimum age or required age verification.

Discord, launched in 2015, is a communications platform commonly used by gamers who want to text or video chat while playing video games. The lawsuit said the app does not verify age or identity.

The lawsuit states that Roblox allowed Ethan to turn off parental controls and that Discord allowed him to create an account and communicate with adults without any parental oversight. It said that while Roblox states children must have a parent's permission to sign up, "nothing stops them from creating their own accounts and playing on Roblox."

The lawsuit alleges the two apps misrepresented the safety of their platforms, saying the apps' design "makes children easy prey for pedophiles" because of the lack of safeguards and predator screening.

After Ethan's death, his family learned from police that the man who groomed him had been arrested in Florida "for sexually exploiting other children through defendants' apps," the complaint said.

Today, Roblox's default settings do not allow adults to directly message children under 13, but children can still create accounts with false birth dates that grant full access to direct messaging features, according to the complaint.

"We are deeply saddened by this tragic loss. While we cannot comment on claims raised in litigation, we continually strive to hold ourselves to the highest safety standards," a Roblox spokesperson told NBC News.

Roblox said it is designed with "rigorous built-in safety features" and is "continually innovating new safety features, more than 100 this year alone, that protect our users and empower parents and caregivers with greater control and visibility."

Its safety efforts include processes to detect and act on problematic behavior, as well as 24/7 human moderation. Roblox added that it partners with law enforcement and leading child safety and mental health organizations worldwide to combat child sexual exploitation.

While Discord has settings intended to keep minors safe, such as automatically scanning messages for explicit images and videos, the lawsuit said Discord is "overflowing with sexually explicit images and videos involving children, including anime and child sexual abuse material."

Discord said it does not comment on legal matters but said the platform is "deeply committed to safety."

"We require all users to be at least 13 years old to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies," a spokesperson said. "We maintain robust systems to prevent the spread of sexual exploitation and grooming on our platform, and we also work with other technology companies and safety organizations to improve safety across the internet."

Other accusations against Roblox, Discord

Anapol Weiss, the firm that filed Dallas' lawsuit, said this is the ninth suit it has filed over allegations that children were groomed, exploited or assaulted after contact on Roblox or related platforms.

The National Center on Sexual Exploitation in 2024 compiled a "Dirty Dozen" list of mainstream entities it says facilitate, enable and profit from sexual abuse and exploitation. It included Discord, saying "this platform is popular among predators seeking to groom children and among those seeking to create, trade or find sexual content of children and adults," and Roblox, saying children there are exposed to sexually themed games and to predators.

A 2023 NBC News investigation identified 35 cases over the previous six years in which adults were prosecuted on charges of kidnapping, grooming or sexual assault that allegedly involved communications on Discord.

In August, Louisiana's attorney general sued Roblox, claiming its failure to implement strong child safety protocols has made it "the perfect place for pedophiles."

"This case exposes the devastating consequences when billion-dollar platforms knowingly design environments that allow predators to prey on vulnerable children," said Alexandra Walsh, an Anapol Weiss partner. "These companies are raking in billions. Children are paying the price."

Dallas is seeking a jury trial and compensatory damages.

If you or someone you know is in crisis, call or text 988, or go to 988lifeline.org to reach the Suicide and Crisis Lifeline. You can also call the network, previously known as the National Suicide Prevention Lifeline, at 800-273-8255, or visit SpeakingOfSuicide.com/resources.
