Specifically, they added new parental control options that block direct messaging to children outside of games, automatically blocking direct messages both in and outside of experiences for users under 13 by default. In 2024, “social hangout” games were restricted to players over 13 years old, and the platform implemented parental controls that automatically block direct messages to users under 13. The safety of children on Roblox, a multiplayer game platform managed by the American company Roblox Corporation, has been the subject of much debate and controversy. That means almost 8 in 10 gamers regularly encounter insults, identity-based harassment, unwanted sexual attention, threats, or offensive content such as usernames and player skins.
Teens who pass the age check would be able to add anyone aged 13 through 17 as a trusted connection without restriction, though users over 18 years old would need to use a QR code or phone number to add verified teen users. In October 2020, the Roblox Corporation reported employing approximately 1,600 full-time employees dedicated to content moderation. In-game mechanics for self-expression such as user avatars, item decorations, and spray paint tools were also identified as being used to spread extremist and hateful imagery. One high-profile group in Roblox, called “The Senate and People of Rome”, outlawed “race-mixing, feminism” and homosexuality, while also requiring another player, who was Jewish, to wear a “Judea tunic or be arrested on sight”. The investigative journalism YouTube channel People Make Games accused the platform of “exploiting” younger game developers by promising them large amounts of money from creating games, only to apply high revenue cuts, leaving creators with little to no income. In April 2022, Truth in Advertising filed a complaint against Roblox with the Federal Trade Commission for false advertising, mainly for failing to disclose when advertising is present, such as with advergames and brand ambassadors.
First, ethically, it’s alarming that online games often have teen or even “Everyone” ratings – environments where children face such hostility. The kinds of slurs, sexual comments, and hate imagery that proliferate in some games are absolutely not what most parents want their kids to witness or endure. Recent studies show that three-quarters of teens and pre-teens (ages 10–17) experienced harassment in online games, a sharp rise from the previous year. When even seasoned adult gamers are wary of exposing children to the standard multiplayer experience, it’s a damning indictment of the status quo. We surveyed 2,000 gamers in the United States to understand how toxicity impacts online multiplayer games. As they get older, introduce more interactive games gradually, always with parental controls enabled to manage content, communication, and time spent gaming.
She also requested that the game’s PG rating be reviewed again by the Australian Classification Board, where it was noted that the platform hosted games intended for adults. The system requires users to either upload an ID to verify their exact age or perform a facial scan via Persona to place a person into an estimated age group. On November 18, 2025, Roblox announced that all forms of in-game chat and experiences rated “18+” on the platform would be locked behind an age verification system. In October 2025, Roblox planned revenue changes in order to expand child safety and moderation, which caused the company’s stock to fall by 15 percent, though the company stated it expected capital to rise in the third quarter once the changes were made.
Statements like this are important – they set the tone from the top that toxic behavior is not welcome on major platforms. Leading figures in gaming have started to openly acknowledge this responsibility. Players aren’t asking for censorship, they’re asking for safety. For platform owners, this is yet another signal that cleaning up online spaces isn’t just a moral duty – it’s necessary to secure the trust of families and the market’s longevity. The gaming industry has long focused on cultivating the next generation of fans, but toxicity is blocking the on-ramp. If most adults wouldn’t let their kids into your online ecosystem, you’re seeding a future audience problem.
Certain platforms let you receive alerts when your child gets direct messages from other players, allowing you to catch potential issues early. Knowing the ins and outs of the games your child plays gives you a clearer sense of the content, the people they’re interacting with, and potential risks. Most major gaming platforms have player and family or parental controls that make it easy to create and manage family accounts. When using EA’s online services, you can play, chat and, in some cases, share other content with players including friends. Los Angeles County filed a lawsuit against Roblox two days later, claiming the platform “makes children easy prey for pedophiles” and “fails to implement reasonable and readily available safety measures”. On December 18, 2025, Tennessee Attorney General Jonathan Skrmetti sued Roblox for misleading parents about child safety, saying that “Roblox is the digital equivalent of a creepy cargo van lingering at the edge of a playground”.
Throughout the 2020s, the cyber police divisions of several Mexican states have reported numerous cases of sexual harassment of minors through Roblox, including the federal government of Chihuahua, which from 2021 to 2025 reported an increase in cases of grooming using the platform. In December 2025, the Moroccan government took the first steps to moderate the platform and other online games after repeated warnings from several members of parliament, such as Fatima Zahra Afif, who voiced concern that the platform could be a danger to minors. In October 2025, the Lebanese Association for Statistics, Training, and Development urged the Lebanese government to ban the video game following reports that 30% of minors in the country could be exposed to inappropriate content. This announcement was met with major backlash from parents, creators, safety advocates, and many online communities who feared it would put children at greater risk. A 2023 study by researchers from Pennsylvania State University analyzing discussion among users of the community noted that roleplaying games aligned with prejudiced or extremist values seemed to hide or only imply their values to players, occurrences of which usually took place in military roleplaying games.
Since 1998, NCMEC has operated the CyberTipline, a place where the public and electronic service providers can report suspected online and offline child sexual exploitation. On the same day, Iowa Attorney General Brenna Bird filed a lawsuit against Roblox Corporation for allegedly failing to protect children from exploitation. On November 6, 2025, Texas attorney general Ken Paxton filed a lawsuit against the Roblox Corporation, alleging that the company misleadingly promoted its platform as a safe environment for children.
GameSafe monitors your child’s game chats, alerting you to threats in real-time. The new Besedo report highlights the effects of fraud and poor content quality on online marketplaces. Every unchecked bigoted rant or sexist attack might drive away dozens of other players, shrinking the total user pool and revenue potential in the long run. Players spend 54% more on games they perceive as safe and non-toxic. “We’ve built solutions that help platforms detect and prevent this kind of content before it becomes problematic.” User-generated content is a huge competitive advantage.
Of course, implementing this kind of system isn’t plug-and-play. “A safer user experience leads to more enjoyment, stronger communities, and ultimately, more engagement and revenue,” he adds. Apostolos explains that smarter workflows, automated and tiered by severity, allow platforms to respond faster and more fairly. “AI can also adapt to evolving language, subcultures, and slang much better than rule-based systems.” However, words and isolated features aren’t enough if enforcement is inconsistent or a company’s stance isn’t communicated to the player base.
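The severity-tiered workflow described above can be sketched in a few lines. The tiers, thresholds, and actions below are illustrative assumptions for the sake of the example, not any vendor’s actual pipeline:

```python
# Illustrative sketch of a severity-tiered moderation workflow.
# The thresholds and action names are hypothetical, chosen only
# to show the routing idea: automate the clear-cut cases and
# reserve human reviewers for the ambiguous middle band.

def route_message(severity_score: float) -> str:
    """Map a classifier's severity score (0.0-1.0) to a moderation action."""
    if severity_score >= 0.9:   # e.g. slurs, explicit threats: remove instantly
        return "auto_remove"
    if severity_score >= 0.5:   # ambiguous: sarcasm, slang, context-dependent
        return "human_review"
    return "allow"              # benign chat passes through untouched

actions = [route_message(s) for s in (0.95, 0.6, 0.1)]
# actions is ["auto_remove", "human_review", "allow"]
```

The point of the tiering is operational: moderators never see the obvious cases at either extreme, so their time concentrates on the middle band where context actually matters.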
Roblox developers are required to fill out a questionnaire which determines the game’s maturity rating, ranging from “minimal” to “restricted”; “restricted” experiences are only available to users who have verified they are at least 17 years old through government-issued ID. According to a 2022 report by The Weekend Australian, “dozens” of forums exist to show Roblox players how to make Nazi-inspired content without being banned, such as rearranging the colors of the Nazi flag and altering the swastika. After Roblox requested the channel to take down the video, People Make Games released several more accusations towards Roblox, focused on an alleged lack of oversight of developers and of any method for addressing developer abuse, which it said led to child developers being exploited for labor on third-party platforms. Some critics found that the platform made it very easy to purchase microtransactions, leading to numerous instances where children have spent large sums of money on the platform without their parents’ knowledge.
From content filters to limiting chat options, parental controls give you more oversight on who your child can communicate with and what they can play. Learn more about our Positive Play Charter and how to report players. FC Playtime was designed to help FC players understand and control how they play. NetSmartz is NCMEC’s online safety education program. But there’s still more to do—join us in protecting children and supporting our mission. On September 15, 2025, Oklahoma attorney general Gentner Drummond announced that he was seeking outside law firms to investigate Roblox over alleged child exploitation and safety failures.
“GameSafe is a must-have for any parent with gaming kids.” Dating app users are frustrated with in-app messaging. We’re Besedo and we provide content moderation tools and services to companies all over the world. Healthy communities drive deeper engagement, session length, and, yes, monetization, while toxic communities stifle all of the above. “You need compute-efficient transformer models, smart latency management, and retraining pipelines that minimize false positives and negatives in an ever-changing context.”
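The trade-off named in that quote, minimizing false positives and false negatives in a shifting linguistic context, ultimately comes down to where the decision threshold sits. A toy illustration (the scores and labels here are invented for the example; a real pipeline would sweep thresholds over a held-out set that is refreshed as slang evolves):

```python
# Toy illustration of threshold tuning for a toxicity classifier.
# Scores and ground-truth labels below are invented example data.

def error_counts(scores, labels, threshold):
    """Count false positives (clean chat flagged) and false
    negatives (toxic chat missed) at a given decision threshold."""
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    return fp, fn

scores = [0.95, 0.80, 0.40, 0.30, 0.10]   # model confidence that chat is toxic
labels = [1,    1,    1,    0,    0]      # ground truth (1 = toxic, 0 = clean)

# A strict threshold misses toxic chat; a lax one flags clean chat.
strict = error_counts(scores, labels, 0.9)   # (0 FP, 2 FN)
lax    = error_counts(scores, labels, 0.2)   # (1 FP, 0 FN)
```

Retraining pipelines exist precisely because the score distribution drifts: yesterday’s well-tuned threshold produces tomorrow’s false negatives once the community invents new euphemisms.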
Nearly 59% of players mute or block toxic users, 30% actively avoid certain communities, and 28% quit mid-game. Alarmingly, more than half (52%) of women said they stopped playing games because of harassment or toxic communities. According to the Anti-Defamation League’s 2023 report, 76% of adult gamers reported similar harassment experiences. Our survey revealed that nearly 78% of gamers have experienced some form of harassment online. Our research highlights gamers’ critical challenges and demonstrates why gaming platforms need stronger moderation efforts. We surveyed 2,000 American gamers to understand how toxicity impacts gaming.
According to Bloomberg Businessweek, some Roblox users have become “vigilante gamers” in response to Roblox’s perceived poor moderation and failure to protect children. In a Twitter thread reported on by Polygon, a woman stated that when she was investigating the Roblox games that her five-year-old child was playing, she joined a game named Public Bathroom Simulator. While not explicitly sexual, some games have been reported to feature suggestive environments that facilitate sexual or fetishistic roleplay such as vore or feet fetishes. Roblox is an online game platform and game creation system developed and managed by Roblox Corporation, built around user-generated content and games, officially referred to as “experiences”.
In 2024, Bloomberg Businessweek reported that, since 2018, at least 24 people had been arrested in the United States on charges of abducting or sexually abusing children they had groomed on Roblox. According to the company in 2020, the monthly player base included half of all American children under the age of 16. A December 2017 study found that children ages 5 to 9 spent more of their time on a PC playing Roblox than on any other activity. Additionally, in 2025, social hangout games featuring private locations such as bedrooms and bathrooms were restricted to users aged 17 and above. Around 40% of Roblox players are under 13 years old. Customize alert settings for peace of mind and full control over your child’s online gaming safety.
Every child is different, so consider their maturity and understanding of online safety when making this decision. How can I monitor my child’s messages in gaming chats? What are the most important parental control settings for safe online gaming? Without limits, gaming can become all-consuming, especially for children prone to hyper-focus. If your child uses voice chat, encourage them to play using speakers rather than headphones. Encourage your child to use unique, anonymous usernames and avoid using the same one across multiple platforms to prevent linking their identity across different services.
If the user’s age could not be determined with high confidence after a facial scan, the platform would ask for the ID of the user or their parent. By default, users under 9 were only allowed to access “minimal” and “mild” experiences, whereas “moderate” experiences would require parental consent. They had met on Roblox and had been friends for around six years, communicating through various social media platforms and meeting in person four or five times. The perpetrators in both cases were in close contact with each other and communicated online. Online child exploitation groups which operate on the platform, such as 764 and CVLT, are also known to hold neo-Nazi and other extremist ideologies.