Written by: Sarah Johnson | August 14, 2024

In recent legislative sessions, there has been a notable increase in bills aimed at protecting children online. These measures include stricter age gating for certain websites, restricting minors' access to social media platforms, and regulating the tracking of children's data. This trend signifies a shift in how children's online presence is being scrutinized and managed. Project 2025 even focused on "reining in Big Tech" with its proposals related to the FCC. This week, we'll dive into two highly discussed bills: the Kids Online Safety Act and the Children's Online Privacy Protection Act 2.0.

What are the General Concerns with Kids Online?

It probably comes as no surprise, but there are significant concerns about children under 18 being online, especially regarding data collection and exposure to various online content.

Privacy and Data Security: Websites and apps often collect and track personal data without users' knowledge or consent, using it for targeted advertising or selling it to third parties. This is particularly problematic for minors, who may not fully grasp the privacy implications of sharing their information.

Mental Health and Bullying: Online bullying and harassment can severely impact children's mental health, with 53% of teens in the US reporting online harassment as a major problem according to a study released by the Pew Research Center. Excessive time online (primarily social media and video games) can lead to addiction, depression, anxiety, and poor academic performance.

Predatory Behavior: Children online can be easy targets for predators. The National Center for Missing & Exploited Children reported an increase of more than 300% in online enticement cases between 2021 and 2023, with 36.2 million reports of suspected child sexual exploitation in 2023 alone.

Inappropriate Content: Children can access adult content, including pornography and violent media, online. A report from Common Sense Media found that 54% of teens saw online pornography before age 13. Check out this post looking at other legislation specifically trying to limit kids' exposure to porn.

Misinformation Influence: Children can be susceptible to misinformation, which can influence their beliefs and behaviors, and potentially radicalize young minds. If untrue or harmful content does not have moderation controls, and algorithms pick up on a child's viewership/interest, this can get out of hand quite quickly.

What is the KOSA Act?

The Kids Online Safety Act (KOSA), or US S. 1409, is a bipartisan bill that just passed the Senate and is currently in the House. A previous version of the bill failed last year, and since then, states have sued platforms (like Meta) to advocate for enhanced online safety measures for children.

Generally, the bill holds platforms liable if their designs and services do not “prevent and mitigate” harm to children from online activities like cyberbullying, promotion of self-harm, and exploitation. The bill does this by mandating a "duty of care", requiring platforms to act in the "best interests of minors" by taking steps to prevent and mitigate harm.

The bill tasks state attorneys general with deciding what content is dangerous to young people, a determination that could vary from person to person.

The bill would:

  • Require companies to undergo regular external audits of the risks their platforms create for minors
  • Require platforms to implement age verification systems
  • Mandate algorithm transparency, compelling platforms to disclose how their algorithms work, particularly in recommending content to users, to prevent harmful content from being promoted to minors (this is something that also influenced the TikTok Ban)
  • Strengthen data privacy protections for children by limiting the collection, storage, and use of their personal data. Platforms would be required to allow parents and children to access, delete, or modify the data collected about them.
  • Necessitate the provision of parental controls to give parents tools to manage their children's online activities (like controlling screen time, monitoring usage, and blocking harmful content)
  • Obligate platforms to report any incidents of harm or safety risks to relevant authorities and maintain records of such incidents.

What do supporters say?

Supporters argue KOSA is essential for enhancing child protection in the digital age. Many believe the Act addresses key issues (especially cyberbullying, exposure to harmful content, and data privacy) which currently have a huge impact on kids. By mandating age verification and better parental controls, they argue the Act will create a safer online environment for minors. Additionally, many believe the focus on algorithmic transparency will help prevent harmful or false content from being shown to children.

What do opponents say?

The biggest objection to KOSA is whether it is constitutional. Some opponents have referenced four prior decisions in which courts struck down similar state laws. Carl Szabo, NetChoice's vice president and general counsel, said in a statement, "KOSA fails to meet basic constitutional principles and fails parents because it won't make a single child safer online or address their concerns."

Opponents of KOSA argue it could threaten young people’s privacy, limit access to vital resources, and silence important online conversations. A particular concern to many is that stringent age verification would increase data collection and undermine anonymity. Critics also highlight the disproportionate impact on small businesses due to compliance costs, which would ultimately solidify the power of Big Tech giants. Additionally, many fear KOSA may lead to over-censorship, restricting access to information on important issues like substance abuse, sexuality, and depression.

What is the COPPA 2.0 Act?

COPPA 2.0 aims to amend the Children's Online Privacy Protection Act of 1998 to strengthen protections relating to the online collection, use, and disclosure of personal information of children and teens. S. 1418, which also passed the Senate with bipartisan support in late July, expands the age range of protections to include teenagers up to 15 years old (the original law focused on children under 13). COPPA 2.0 also enhances data privacy by requiring online services to obtain explicit consent from parents or guardians before collecting personal information from children under 13, and consent from teenagers themselves if they are 13-15.

The bill also expands the definition of "personal information" to include more types of data and mandates clear and accessible opt-out mechanisms for parents and teenagers, allowing them to refuse data collection and request the deletion of previously collected data.

Another large data focus in COPPA 2.0 is prohibiting targeted advertising to children and teenagers based on their personal data, including behavioral and location data. To ensure compliance, the legislation includes provisions for stronger enforcement and higher penalties for violations.

What do supporters say?

Supporters argue the legislation is a critical update to the original COPPA, extending crucial protections to teenagers up to 15 years old and addressing modern privacy concerns. Many emphasize that the expanded age range ensures greater protection for a wider group of young users who are increasingly active online. Advocates believe COPPA 2.0 will prevent exploitation and misuse of minors' personal data overall, while also supporting provisions for clearer privacy policies and opt-out mechanisms when it comes to kids.

What do opponents say?

Opponents' concerns primarily center on potential drawbacks: they argue that the expanded protections will lead to increased data collection and privacy risks, as platforms will need to gather more information to comply with new consent requirements. Critics also worry the prohibitions on targeted advertising could impact the viability of free online services and disrupt business models that rely on ad revenue.

Other Related Legislation

There are a number of legislative efforts that reflect ongoing concerns about online privacy and safety for children. Most of these aim to address various aspects of data protection, content moderation, and user rights in the digital age.

Combating Harmful Actions with Transparency on Social Media Act

The CHATS Act, HR 4689, aimed to increase transparency and accountability on social media platforms. CHATS would have required platforms to disclose their content moderation policies, detailing how they handle harmful content such as misinformation, harassment, and hate speech. Platforms would have been required to report data on the prevalence and handling of harmful content, including statistics on flagged and removed posts.

To ensure compliance, the Act mandated independent audits of social media platforms to assess moderation effectiveness and fairness. It also required platforms to notify users when content was flagged or removed, with clear explanations and appeal opportunities. Additionally, the Act demanded transparency in how algorithms prioritize and recommend content and emphasized the protection of user data, particularly for minors. It was referred to the Subcommittee on Crime, Terrorism, and Homeland Security, but died in early 2023.

Protecting Kids on Social Media Act

The Protecting Kids on Social Media Act is legislation focused on regulating social media platforms to enhance the safety of young users. It aims to address the risks associated with social media use by implementing stricter content moderation standards and requiring platforms to develop robust mechanisms to prevent exposure to harmful material. The Act emphasizes the need for improved safeguards against cyberbullying, online exploitation, and inappropriate content, with a particular focus on creating safer online environments for children.

Conclusion

The growing legislative focus on protecting children online reflects a significant shift in how lawmakers are approaching the digital landscape. Bills like the KOSA and COPPA 2.0 aim to address critical concerns, from data privacy to the mental health impacts of social media, recognizing that the current legal framework, including Section 230, is outdated and insufficient. However, while these initiatives are a step in the right direction, they have sparked debate over the balance between protecting young users and preserving free speech, privacy, and access to information. The challenge now is ensuring these laws effectively safeguard children without unintended consequences that could stifle important online conversations or infringe on civil liberties.

An intriguing aspect of this debate is that many "Big Tech" companies, including X, Snap, Pinterest, Meta, Google, Amazon, Netflix, and PayPal, have endorsed or at least expressed support for the "spirit" of KOSA. Given that Big Tech has ramped up its lobbying efforts for legislation it deems favorable, one has to ask: is there an underlying advantage in these bills that large tech companies recognize, but the rest of us might be missing?

Cover Photo by Thomas Park on Unsplash

About BillTrack50 – BillTrack50 offers free tools for citizens to easily research legislators and bills across all 50 states and Congress. BillTrack50 also offers professional tools to help organizations with ongoing legislative and regulatory tracking, as well as easy ways to share information both internally and with the public.