Europe’s latest crackdown on Big Tech is targeting Roblox, with the Netherlands’ consumer watchdog launching a formal probe focused on risks to underage users, including exposure to harmful content, predatory targeting, and deceptive in-game purchases. Tied to the EU’s Digital Services Act (DSA), the investigation will test the platform’s compliance and could foreshadow a new era of sweeping platform regulation with consequences for parents and platforms worldwide.
Story Highlights
- The Netherlands’ consumer watchdog (ACM) launched a formal probe into Roblox on Jan. 30, 2026, focused on risks to underage users.
- Regulators are examining exposure to violent or sexually explicit content, predatory targeting, and “deceptive” prompts for in-game purchases.
- The investigation is tied to the EU Digital Services Act, with potential penalties or directives if violations are found.
- Dutch officials began reviewing Roblox in late 2025 after similar assessments of TikTok and Instagram.
What the Dutch watchdog is investigating—and why it matters
The Netherlands Authority for Consumers and Markets (ACM) announced it is opening an investigation into Roblox to determine whether the platform sufficiently protects minors in the European Union. Dutch authorities cited reported concerns about violent or sexually explicit content, predatory targeting, and practices that nudge children toward spending money in-game. The probe is expected to take about a year and may lead to sanctions if regulators find failures under EU rules.
The legal framework driving the case is the EU Digital Services Act, which requires online platforms to take “appropriate and proportionate” measures to protect children and safeguard privacy. While U.S. families are used to hearing about lawsuits and state investigations, the EU model leans heavily on proactive compliance obligations backed by enforcement power. For Americans watching from afar, the key issue is the precedent: regulators treating child protection as a justification for broad platform oversight.
“Dutch consumer watchdog launches probe into Roblox and children”
— DutchNews.NL (@DutchNewsNL) January 30, 2026
The DSA enforcement model: child safety goals paired with heavy regulatory leverage
ACM’s move follows Dutch government activity that began months earlier. In October 2025, the Netherlands started an initial review of Roblox amid concerns about bullying, privacy, and harmful contact between users. Dutch officials have pushed for Children’s Rights Impact Assessments and have framed responsibility as resting primarily with platform providers rather than families alone. That approach can produce real pressure for better guardrails, but it also expands the role of government in policing digital speech and behavior.
ACM’s previous enforcement provides context for how aggressive the Netherlands can be when children are involved. In 2024, the watchdog fined Epic Games about €1.1 million over allegations that Fortnite pressured children into making purchases. That case signaled that “dark patterns” and high-pressure monetization mechanics are a top priority for regulators, not a side issue. Roblox, which relies heavily on microtransactions through Robux, now faces scrutiny under a similar consumer-protection lens.
Roblox’s scale, monetization, and moderation problems collide
Roblox is not a niche platform; it draws tens of millions of daily users globally, and a significant portion are minors. That scale complicates moderation because the platform hosts vast amounts of user-generated content, including games and social spaces. Regulators say the central question is whether Roblox’s systems actually reduce foreseeable risks for kids, rather than merely offering tools that determined bad actors can bypass. The investigation’s long timeline suggests a deep review of processes, not a quick public-relations dispute.
Roblox has pointed to enforcement steps such as warnings, bans, abuse reporting, parental controls, and age-related features, but criticism has persisted internationally. Reports and legal actions in the broader record describe concerns ranging from exposure to sexual content to predatory behavior and financial exploitation through in-game spending. Reporting has also noted controversy around a 2025 facial age-verification update, with critics arguing that misidentifications and workarounds could undermine its purpose. The Dutch probe will test what regulators consider “reasonable” protection.
How U.S. families should read this: safety concerns are real, but the policy tool matters
For American parents, the underlying fears described by European and U.S. investigators are easy to recognize: kids spending too much money, being exposed to age-inappropriate material, or encountering predators online. The unresolved question is how far governments should go when they translate those fears into policy. The EU model can force rapid changes in platform design and moderation, but it can also normalize sweeping compliance regimes that affect lawful speech, privacy practices, and the ability of users to create and share content.
The most solid, verifiable takeaway from the available reporting is procedural: the Netherlands has opened a formal, DSA-linked investigation with potential penalties, and it may take about a year to conclude. What cannot be responsibly claimed at this stage is the outcome—whether Roblox will be found in violation, what specific measures regulators will order, or whether the probe will materially reduce harm to children. Until findings are published, families are left with the practical reality: oversight is increasing, and parental involvement still matters.
Watch the report: Roblox safety measures: Company rolls out online safeguarding tool
Sources:
- Dutch watchdog launches Roblox probe over ‘risks to children’
- Dutch government to assess child safety risks on Roblox after review of TikTok, Instagram
- Concerns about child safety: ACM launches investigation into gaming platform Roblox