Last year, former Facebook employee Frances Haugen leaked a trove of internal company documents, which illuminated just how harmful apps like Instagram can be for teens. These bombshell revelations sparked five Senate subcommittee hearings on children’s internet safety, featuring testimony from executives at TikTok, Snap, YouTube, Instagram and Facebook.

As a result of these hearings, Senator Richard Blumenthal (D-CT) and Senator Marsha Blackburn (R-TN) introduced the Kids Online Safety Act (KOSA) today. The bill would require social media companies to give users under 16 the option to protect their information, disable addictive product features, and opt out of algorithmic recommendations; give parents more control over their child’s social media usage; require platforms to undergo a yearly independent audit assessing their risk to minors; and allow academics and public interest organizations to use company data to inform their research on children’s internet safety.

“In hearings over the last year, Senator Blumenthal and I have heard countless stories of physical and emotional damage affecting young users, and Big Tech’s unwillingness to change,” Sen. Blackburn said in a press release. “The Kids Online Safety Act will address those harms by setting necessary safety guide rails for online platforms to follow that will require transparency and give parents more peace of mind.”

There is a lot of overlap between KOSA and other legislation floated over the past year by members of the Senate Subcommittee on Consumer Protection, on which Blackburn and Blumenthal both serve.

Back in October, representatives from YouTube, TikTok and Snap all agreed at a hearing that parents should have the ability to erase online data for their children or teens, an idea that appears in proposed updates to the historic Children’s Online Privacy Protection Act (COPPA) from Senators Ed Markey (D-MA) and Bill Cassidy (R-LA). Plus, Blumenthal and Markey reintroduced their Kids Internet Design and Safety (KIDS) Act in September, which would protect online users under 16 from attention-grabbing features like autoplay and push alerts, ban influencer marketing targeted toward children and young teens, and even prohibit interface features that quantify popularity, like follower counts and like buttons.

Meanwhile, the bipartisan Filter Bubble Transparency Act, introduced in both the House and the Senate, addresses concerns about the secrecy around algorithms and how they influence users. That bill would require social networks to let users choose a standard reverse-chronological feed instead of the platform’s opaque, sometimes proprietary algorithm. The newly introduced KOSA bill doesn’t require this toggle, but it would force tech companies to turn over “critical datasets from social media platforms” to the government, the bill’s summary says. Then, academics and nonprofit employees could apply for access to the data for research purposes.

In California, bipartisan state lawmakers plan to introduce a bill tomorrow that’s modeled on the UK’s Age Appropriate Design Code. This bill would require companies like Meta and YouTube, which are headquartered in the state, to limit data collection from children on their platforms. Given the prevalence of Big Tech in California, even state laws could force platforms to make their products safer for young users.

It’s too soon to say which of these proposals, if any, will gain enough momentum to change how social media platforms operate. But lawmakers have shown they are committed to asserting more control over Big Tech.