Bill would establish guardrails on internet platforms used by kids

STATE HOUSE – Senate Majority Leader Valarie J. Lawson and Rep. Megan L. Cotter have introduced legislation that would require social media platforms, apps and other online services and products likely to be used by children to be designed to protect them.

“Right now, the burden is on parents to protect kids from online harms. This bill shifts some of that responsibility to the platforms that profit from children’s engagement,” said Representative Cotter (D-Dist. 39, Exeter, Richmond, Hopkinton). “Unless we pass laws requiring change, Big Tech will continue to put profits over protections, leaving children vulnerable to harmful content, data exploitation and addictive design practices. Rhode Island should defend our kids and join the growing number of nations and states that are successfully demanding responsibility from the industry.”

Said Majority Leader Lawson (D-Dist. 14, East Providence), “Although kids, especially teens, are avid users of them, social media apps and sites are designed for adults. This bill is not a panacea, but it establishes basic guardrails to protect kids from the most egregious threats to their safety, such as location tracking, having their personal information sold, or being profiled for commercial purposes. The goal is to prevent tech companies from exploiting children’s vulnerability for profit, or enabling others to do so.”

The bill (2025-H 5830, 2025-S 0903) places a responsibility on Big Tech companies to take reasonable care to avoid any heightened risk of harm to children caused by their online services, products or features. It would require them to use design features that prevent children from facing unfair or deceptive treatment, unlawful disparate impact, financial or reputational injury, discrimination, or offensive intrusions into their private affairs. The bill also discourages features that are designed to increase, sustain or extend use of the product by children in a way that might result in certain harms.

The requirements in the bill would apply to tech companies operating in Rhode Island that collect and control personal data and either gross more than $25 million annually; receive, buy or sell data from a total of at least 50,000 individuals, households or devices annually; or derive at least 50% of their annual revenues from selling individuals’ personal data.

Companies subject to the requirements would have to assess their online products, services and features that are likely to be accessed by children to determine whether they comply with the requirements to protect children from heightened risks, and create plans to ensure compliance. The attorney general’s office would be charged with enforcement, which could result in fines of up to $7,500 per affected child.

The legislation, which would take effect Jan. 1, 2026, follows global standards set by the United Kingdom’s Age-Appropriate Design Code and incorporates lessons learned from legal challenges elsewhere to ensure its durability.

Some 97% of teens use the internet every day. About 95% report using social media platforms, and more than one-third of them report using social media “almost constantly.” Teens report spending an average of 4.8 hours a day on seven popular social media apps, and among those with the highest use, 41% rate their own mental health as poor or very poor.