Roblox Checked the Ages. The Crisis Checked Back In.

Roblox's age-verification rollout was supposed to fix child safety. Months later, lawsuits, settlements, new kid accounts, and Roblox's own filings tell a messier story.

I am picking up where I left off in January's deep dive into Roblox's age-verification fiasco, mostly because I made the mistake of asking a simple follow-up question: how is that going?

The answer, in the most SiliconSnark possible phrasing, is: exactly well enough for Roblox to call it progress, exactly badly enough for attorneys general to keep treating the company like a homework assignment that came in late, sticky, and somehow monetized.

Roblox's official story is that it is building the "gold standard" for online safety. The company rolled out facial age checks globally for chat on January 7, 2026, saying it was the first large online gaming platform to require facial age checks for users of all ages to access chat and that the system would limit communication between adults and children under 16. Then, on April 13, Roblox announced Roblox Kids accounts for ages 5 to 8 and Roblox Select accounts for ages 9 to 15, with age-based content access, communication defaults, and expanded parental controls rolling out in early June.

That sounds great until you remember the context. Roblox did not wake up one morning, gaze lovingly upon the safety needs of children, and invent responsible platform governance as a lifestyle choice. This all arrived after a stack of lawsuits, investigations, settlements, and public reporting alleging that Roblox had spent years marketing itself as a safe place for kids while predators, explicit experiences, Robux incentives, weak age claims, and brittle moderation kept making the opposite case.

So yes, Roblox is finally checking ages. Congratulations to the world's child-friendly metaverse for discovering that children have ages. The applause will be delayed while parents, regulators, and everyone with a working memory asks why this was not table stakes before the company became a multibillion-dollar playground with 132 million average daily active users and 31 billion hours engaged in Q1 2026.

The Age Check Became the Product

Back in January, the age-check pitch was simple: if users want chat, they need to complete an age check. Roblox said the process uses Facial Age Estimation through Persona, that images and video are deleted after secure processing, and that users 13 and older can appeal with ID verification or parental controls if the system gets their age wrong. The company also said its vendor's models had achieved a mean absolute error of 1.4 years for users under 18 in U.K. testing.
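For anyone wondering what that 1.4-year figure actually measures: mean absolute error is just the average size of the miss between estimated and real ages. A minimal sketch, with made-up ages rather than anything from Persona's actual test data:

```python
# Illustrative only: how a mean absolute error of 1.4 years is computed.
# These ages are invented for the example, not Persona's U.K. test data.
predicted = [13.2, 15.9, 11.4, 16.8, 15.1]  # model's age estimates
actual    = [12,   17,   10,   19,   14]    # true ages

# MAE = average of |predicted - actual| across all test subjects.
mae = sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)
print(round(mae, 2))  # 1.4
```

Note what the metric hides: an average miss of 1.4 years is one thing at age 25 and quite another at the 15/16 boundary, which is exactly where Roblox's adult-minor chat limits live.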

That is the shiny version. The lived version, as TechCrunch summarized at launch, was that age checks were "optional" in the same way a steering wheel is optional if you do not plan to drive. You could use Roblox without verifying, but chat required verification. If Roblox is a social platform, and Roblox very much wants investors to believe it is a social platform, that is not a small feature. That is the bloodstream.

Now the age check is expanding from a communications gate into an account architecture. The new Roblox Kids and Roblox Select plan will sort younger users into age-based account types, limit which games they can access, and restrict unverified users to Minimal or Mild games with all communication unavailable until they complete an age check. For kids ages 5 to 8, communication is off by default. For ages 9 to 15, direct chat controls remain under parental management through age 15.
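If you squint, the announced rules reduce to a small decision table. Here is a sketch of the gating as described in Roblox's announcement; the function and field names are my own invention, not Roblox's API, and the real system presumably has many more states than this:

```python
# A sketch of the announced age-tier gating, as described in Roblox's
# April post. Names and structure are mine; this is not Roblox code.

def account_tier(age, age_checked):
    """Map a user's (claimed or verified) age to the announced defaults."""
    if not age_checked:
        # Unverified users: Minimal or Mild content only, no communication.
        return {"tier": "Unverified", "content": "Minimal/Mild", "chat": "unavailable"}
    if age <= 8:
        # Roblox Kids (ages 5 to 8): communication off by default.
        return {"tier": "Roblox Kids", "content": "age-based", "chat": "off by default"}
    if age <= 15:
        # Roblox Select (ages 9 to 15): direct chat under parental management.
        return {"tier": "Roblox Select", "content": "age-based", "chat": "parent-managed"}
    # 16 and up: standard accounts; adult-minor limits are handled elsewhere.
    return {"tier": "Standard", "content": "age-based", "chat": "available"}

print(account_tier(7, True)["tier"])    # Roblox Kids
print(account_tier(30, False)["chat"])  # unavailable
```

The point of writing it out is that every branch depends on two inputs, age and age_checked, and both are exactly the values the lawsuits say can be wrong.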

This is the part where Roblox wants credit for finally replacing "enter any birthday you feel emotionally aligned with" with something closer to actual age assurance. Fine. Credit granted, in the same spirit one might credit a restaurant for installing a fire extinguisher after the third small kitchen fire.

The deeper issue is that Roblox is now using age checks as the master key for everything: chat access, account category, game availability, creator eligibility, parental controls, matchmaking, future content ratings, and eventually even investor-facing demographic claims. In its April 30 post about courting older players with a 42% DevEx bump for qualifying U.S. 18+ spend, Roblox had to disclose that its Q1 2026 age-checked metrics are estimates based on limited information and evolving methods, and that extrapolated results may not fairly represent the actual demographic split or engagement of non-age-checked DAUs.

Translation: the company is building a new safety-and-monetization machine on age data it is still learning how to measure. Nothing says "gold standard" like a footnote whispering, please do not compare this to history, reality, or itself too aggressively.

Settlements Are Not a Product Roadmap, But Here We Are

The most revealing document is not Roblox's blog. It is Alabama's settlement paperwork. On April 21, Alabama Attorney General Steve Marshall announced a $12.2 million settlement with Roblox, saying platforms must give parents "a fighting chance" and that parents need "a partner, not a black box." That phrase lands because it accidentally describes the whole Roblox safety problem better than most tech commentary: parents have been handed settings panels, reassuring slogans, and a product their kids' social lives orbit, while the real risk model sits somewhere inside Roblox's servers wearing a laminated badge that says trust us.

The underlying settlement term sheet is even more blunt. It says that on or before May 1, 2026, Roblox will require all users to undergo age assurance beyond self-reported age to access chat, and will prohibit chat for all users until they do so. It also says Roblox must limit chat between adult users and users under 16 unless the adult is a trusted friend, keep minor communications unencrypted, add first-time private-chat warnings by September 1, submit annual safety reports to Alabama, and implement privacy controls around age-assurance data.

Read that list slowly. Those are not exotic demands. They are the sort of controls a reasonable person might have expected before a platform with millions of children built a financial civilization out of social play, user-generated worlds, and Robux. But because tech prefers to innovate backward from consequences, we get the 2026 version: parental controls by settlement, warnings by deadline, transparency reports by attorney general.

Roblox can say, correctly, that settlements do not equal admissions of wrongdoing. The company can also say, correctly, that no safety system is perfect. But the pattern is difficult to spin into triumph. The company's new safety framework is being shaped not only by internal wisdom, but by legal pressure from states that have decided "we are working on it" is no longer a governance strategy.

This is the same company posture I roasted in Roblox's Face-Scan Safety Theater Is Even Worse When You Hear the CEO Explain It: a platform facing child-safety allegations keeps trying to turn emergency repairs into visionary product language. It is not a crisis, it is an opportunity. It is not overdue safety plumbing, it is the gold standard. It is not "we let the house fill with smoke," it is "look at our exciting new ventilation roadmap."

The Lawsuit Parade Did Not End in January

If January was supposed to mark the end of the safety crisis, February and March did not get the memo. Los Angeles County sued Roblox on February 19, with reporting from Malwarebytes noting the county alleged Roblox gave predators "powerful tools" to target children and misled parents about platform safety. Nebraska followed on March 4, when Attorney General Mike Hilgers filed a lawsuit accusing Roblox of enabling child exploitation and deceptive safety practices.

Nebraska's press release is not subtle. It says Roblox built a multibillion-dollar business on the trust of families while creating "a playground for predators" and exposing children to dangerous content. Those are allegations, not adjudicated facts, and Roblox has denied similar claims in prior suits. Still, when multiple states keep producing variations of the same sentence, the company does not get to answer with a product tour and expect everyone to clap.

The uncomfortable thing for Roblox is that age verification is now both its defense and part of the critique. The company says age checks are the foundation for safer communication. Nebraska, according to Forbes' coverage of the lawsuit, argued the January update was still insufficient because age checks were not required at sign-up, leaving users able to self-report fake birthdays for some purposes. Malwarebytes' April follow-up also noted Nebraska's complaint that misclassifications could place adults in child chat groups and minors in adult categories, while age-verified accounts for young children could be traded on third-party marketplaces.

That is the core problem with Roblox's age-check era: the company is trying to solve a trust crisis with a classification system, then asking everyone to trust the classification system. If the system works, Roblox gets to say it pioneered safer communication. If the system misfires, is bypassed, is socially gamed by parents, or becomes an account black market, the company has simply moved the weak point to a fancier location.

SiliconSnark's earlier Roblox: The Lottery That Trains Kids to Spend, Not Win argued that Roblox is not just a game but a behavioral economy for children. That context matters. When the platform is this monetized, safety features are not neutral. They decide who can talk, who can play, who can publish, who can spend, who can earn, and who gets routed toward which content. Roblox is no longer just patching a chat feature. It is rebuilding the traffic laws of a child-heavy economy after years of letting the roads function like a dare.

The Investor Deck Says the Quiet Part in Spreadsheet

Roblox's Q1 2026 supplemental materials are a beautiful little artifact because they contain both the safety story and the business story sitting next to each other like coworkers pretending not to date. The company reported $1.4 billion in revenue, $1.7 billion in bookings, 132 million average DAUs, and 31 billion hours engaged for Q1, with DAUs up 35% year over year and hours engaged up 43% year over year.

Then, deeper in the same materials, Roblox warns that its user metrics are based on internal systems that have not been validated by an independent third party, that age data now increasingly relies on age-check systems, and that if users provide incorrect information or age-check systems misrepresent user ages, estimates may be inaccurate. It also says age-checked metrics are not comparable to historical periods that relied on self-reported data.

I appreciate the candor. I also appreciate that it makes the whole situation faintly absurd. Roblox is telling investors that age data unlocks strategic upside, improves matchmaking, enables age-appropriate experiences, and creates a path back to, and beyond, pre-rollout engagement levels. It is also telling investors, in securities-law dialect, that the measurement scaffolding underneath the new age era is still shifting.

Even better, the company's annual-report language describes safety and civility as a "strategic advantage" and says the initial age-check rollout created a mid-single-digit headwind to engagement growth and a low-single-digit headwind to bookings growth, while accurate age data should drive long-term engagement. That is an astonishingly Roblox sentence. Child safety, but please note the booking headwind.

This is not to say Roblox should ignore business metrics. It is a public company. Investors exist. Revenue exists. But when the company frames its scale with young users as a valuable strategic asset while regulators are alleging that same scale has enabled serious harm, the snark writes itself and then asks for legal review.

The Older-Player Pivot Is Not a Moral Escape Hatch

On April 30, Roblox announced a 42% DevEx increase for eligible U.S. in-game spend from age-checked 18-and-older players, part of a push to support high-fidelity games for older audiences. The company said U.S. 18-to-34 users monetize over 50% higher than under-18 users and that the 18-to-34 audience is growing over 50% year over year, with all the age-data caveats mentioned above.

Strategically, this makes sense. If Roblox can age up its audience, it can reduce some kid-platform stigma, lure developers building deeper games, and reassure Wall Street that it is not forever trapped in the allowance economy. It also lets Roblox say, implicitly, "Look, we are not only a kids' platform."

But that argument has limits. You do not get to spend years benefiting from children as the platform's cultural center, then wave at older-player monetization when child-safety scrutiny gets expensive. The older-player pivot may be commercially smart. It may even make Roblox healthier over time if it reduces the pressure to squeeze younger users. But it does not erase the duties created by the children already there.

Roblox's own new structure proves the point. Roblox Kids. Roblox Select. Parental approvals. Age-based chat. Under-16 content selection. Trusted friends. Warnings. Settlement reports. None of that exists because Roblox is secretly Steam with blockier avatars. It exists because Roblox remains one of the most important digital spaces in children's lives, and it is only now being forced to behave like that responsibility is real.

What Would Actually Look Better

The charitable version is that Roblox is finally doing hard things at scale. Age assurance across a global platform is messy. Moderating user-generated 3D spaces is hard. Letting kids play, create, chat, learn, and socialize without turning the whole thing into either a locked-down museum or a predatory free-for-all is genuinely difficult. I do not envy the safety team. I do, however, reserve the right to side-eye the executive mythology.

A better Roblox response would be less triumphant and more accountable. Publish clearer error rates from real-world age checks, not just lab-test averages. Disclose misclassification categories and appeal outcomes. Explain how often adult-minor chat attempts are blocked. Report how many accounts are rechecked after behavioral signals suggest an age mismatch. Show parents what "trusted friend" actually means in practice. Say plainly how Roblox detects account resale and age-verification fraud. Let independent researchers audit meaningful parts of the system. Make the safety dashboard public enough that parents do not have to choose between marketing copy and subpoenas.

Also, maybe stop treating every safety repair as brand elevation. The vibe should be less "we are defining the future of civility" and more "we recognize the trust deficit and are publishing receipts." SiliconSnark has made this point across digital identity, AI companions, and personal AI memory: when a product asks users to hand over more intimate data for safety, convenience, or personalization, transparency cannot be a decorative footer.

The Sharp Takeaway

Roblox's age-verification era is not nothing. The company has made real changes: global chat age checks, age-based accounts, expanded parental controls, content filtering for younger users, creator verification requirements, settlement commitments, and a broader move away from self-reported birthdays. Some of those changes may genuinely reduce risk. Some may make parents' lives easier. Some may create better defaults for younger users who should never have been dropped into a massive social-commerce engine with "good luck" as the safety model.

But the bigger story is still ugly. Roblox is not being celebrated because it led. It is being pressured because it lagged. It is not merely rolling out safety innovation. It is retrofitting governance around a platform that became socially indispensable to children before it had the adult supervision to match. The company's own filings now treat age data as both a safety foundation and a growth lever. Regulators are turning product features into enforceable obligations. Parents are being asked to trust systems whose failures they have little visibility into. Developers are being asked to adapt to a new regime. Kids, as usual, are the ones living inside the experiment.

So, how is Roblox's big safety fix going? Better than doing nothing. Worse than the marketing suggests. Late enough to be infuriating. Big enough to matter. Fragile enough to deserve scrutiny.

Roblox checked the ages. Good. Now the rest of us get to check the company.