Dad claims daughter, 16, took her own life after meeting a predator on Roblox



Penelope Sokolowski was just 16 years old when she took her own life last February.

Her father, Jason, believes her suicide was the culmination of a grooming process that began on Roblox, the game platform beloved by kids — with some 170,000 users under the age of 13, according to company data from 2023.

“We sort of thought we were covering all the bases,” Jason told The Post, noting that his family had used a third-party app to monitor Penelope’s online activity.

Jason alleges that his only child was contacted by a predator on Roblox who coerced her into cutting his name into her chest and sending videos of herself bloodied from self-harm — and who, ultimately, sent Penelope down a spiral that culminated in her death.

Jason Sokolowski believes his daughter Penelope’s suicide last year, at age 16, was the culmination of a grooming process that began on Roblox. Courtesy of Jason Sokolowski

The girl was 7 or 8 years old when she first signed up for Roblox, where players roam online worlds and can chat with other users.

“I’d come in and sit in the room with her and see what she was doing, ask who those people were,” Jason said, recalling Penelope drawing an anime-style sketch for a friend she’d made on Roblox.

“As a dad I thought, oh, that’s good, she’s artistic, and she’s made artistic friends,” he added. “But I didn’t understand what Roblox was and its effect on her.”

The dad, who works in the film industry in Vancouver, British Columbia, separated from Penelope’s mom and moved out of the family home when the girl was 13.

“They’re grooming girls to do whatever it is they can get a girl to do, whether it’s nudes or cuts or gore or violence,” Jason said of predators like the one who allegedly groomed his daughter, Penelope. Courtesy of Jason Sokolowski

He recalls how Penelope’s grades began to tumble and, when she was 14, he noticed scars from self-inflicted cuts on her arms, which she had been covering with bracelets and his oversized hockey jerseys.

Penelope confided that she had been recruited into a self-harm group via Roblox, but assured her father she had moved on.

But not long after her 16th birthday, she took her own life.

Later, when Jason opened up his daughter’s cellphone, he found what he describes as a “crime scene.”

According to the dad, there were messages spanning two years with a person who egged on her self-destruction. Jason believes Penelope met this person on Roblox and then began privately conversing with them over Discord — often for hours.

Some 45% of Roblox’s users — 170,000 of them — are under the age of 13, according to company data from 2023. wachiwit – stock.adobe.com
Dozens of lawsuits accusing the Roblox Corporation of failing to protect minors have been consolidated into one federal case. REUTERS

In one exchange, Penelope sent a photo of her chest, offering to cut herself there but worrying she couldn’t go “too deep.” Minutes later, she followed up with an image of the predator’s Discord username written across her chest in bloodied letters.

In other photos, she had carved the numbers “764” into her body. Jason believes Penelope had been contacted by a member of 764, described by the FBI as a “violent online network” that targets minors and grooms them into committing egregious acts of self-harm and violence.

Members of 764 reportedly troll platforms like Roblox looking for victims they can persuade — via grooming or sextortion — into hurting themselves.

“They’re grooming girls to do whatever it is they can get a girl to do, whether it’s nudes or cuts or gore or violence,” Jason said. “[Penelope] was brainwashed throughout.”

Jason believes Penelope met a predator from the “violent online network” 764 on Roblox and then began privately conversing with them over Discord — often for hours. Courtesy of Jason Sokolowski

Twenty years after Roblox launched, many families are claiming the platform makes it too easy for predators to contact children.

Dozens of lawsuits accusing the Roblox Corporation of failing to protect minors have been consolidated into one federal case. The first hearing took place in the Northern District of California on January 31.

“Nowhere in this world is it normal for adults to speak to children unrelated to them, but that goes on [on Roblox],” Matt Dolman, an attorney for the plaintiffs, told The Post.

He alleges, on behalf of the plaintiffs, that the company designed its products in a way that allows “adults to speak to children unabated on the platform without [necessary] safety features to prevent that from happening.”

After Penelope’s death in February 2025, her father says he found disturbing messages between her and a “friend” she met online. Penelope sent a photo of her chest, offering to cut herself there but worrying she couldn’t go “too deep.” Courtesy of Jason Sokolowski

According to Dolman, the typical case begins with predators offering kids Robux — a form of in-game currency that can be purchased with real money.

Dolman’s firm is aware of at least 119 lawsuits filed against Roblox since 2025. The cases, which span 34 states, include a shocking array of predatory behavior.

Case summaries provided by Dolman’s firm detail how an autistic child was allegedly coerced into sending explicit photos. A girl sent videos of herself doing cartwheels shirtless. Kids like Penelope have sent predators videos of themselves cutting.

One predator is accused of disseminating explicit photos of a child in retribution for being blocked. Several allegedly threatened to kill the families of minors unless they sent sexual images.

A plaintiff family claims that a predator recorded himself having sex with their underage daughter, while sex toys have allegedly arrived at the homes of children. There are multiple accusations of minors being kidnapped, sometimes across state lines. One child was allegedly raped by five men.

Twenty years after Roblox launched, many families are claiming the platform makes it too easy for predators to contact children. REUTERS

“There is no making these kids whole again,” Dolman said.

According to Dolman, though initial contact takes place on Roblox, often the predator will move the chat to a third-party app. At least 51 lawsuits name Discord as a co-defendant, with Snapchat cited 20 times and Meta five.

Discord told The Post that the company is “deeply committed to safety” and “maintain[s] robust systems to prevent the spread of sexual exploitation and grooming.”

Snapchat and Meta did not respond to requests for comment.

In late 2025, Roblox rolled out additional safety features to prevent children from encountering predatory adults, including AI age-estimation software for all users who use communication features. Users are then segregated into age-based subgroups.

Matthew Dolman’s law firm represents a number of families who allege their children were groomed on Roblox. dolmanlaw.com

Roblox told The Post that the company limits chats for younger users, doesn’t allow user-to-user image sharing, and has filters designed to block the sharing of personal information.

“We are deeply troubled by any incident that endangers any user,” the company said. “We also understand that no system is perfect, and that is why we are constantly working to further improve our safety tools and platform restrictions to ensure parents can trust us to help keep their children safe online.”

Dolman remains skeptical of the improvements. “Is it going to get better on the platform? Sure, I assume it can’t get worse,” he said. “But is it a safe platform? Absolutely not.”

A year after his daughter took her own life, Jason Sokolowski firmly points his finger at Big Tech.

“Social media companies are protecting these predators because they’re practically one and the same,” he said. “They’re operating with no conscience or moral compass, just for power or money. The platforms are the ones that could mitigate this overnight.”


