The Federal Trade Commission on Wednesday proposed sweeping changes to a key federal rule that protects children's privacy online, in one of the U.S. government's most significant attempts to strengthen consumer privacy in more than a decade.
The changes are intended to strengthen the rules underlying the Children's Online Privacy Protection Act of 1998, a law that restricts online tracking of young people by services such as social media apps, video game platforms, toy retailers and digital advertising networks. Regulators said the measures would “shift the burden” of online safety from parents to apps and other digital services, while limiting how platforms can use and monetize children's data.
The proposed changes would require certain online services to turn off targeted advertising by default for children under 13. They would also prohibit online services from using personal data, such as a child's cellphone number, to induce young people to stay on their platforms longer. That means online services would no longer be able to use personal data to bombard young children with push notifications.
The proposed updates would also strengthen security requirements for online services that collect data from children and limit how long online services could retain that information. And they would limit the collection of student data by learning apps and other educational technology providers by allowing schools to consent to the collection of children's personal data only for educational, non-commercial purposes.
“Children should be able to play and learn online without being endlessly tracked by companies seeking to hoard and monetize their personal data,” Federal Trade Commission Chairwoman Lina M. Khan said in a statement Wednesday. She added: “By requiring companies to better protect children's data, our proposal imposes affirmative obligations on service providers and prohibits them from outsourcing their responsibilities to parents.”
COPPA is the central federal law protecting children online in the United States, although members of Congress have since attempted to introduce broader online safety bills for children and teens.
Under COPPA, online services aimed at children, or those that know they have children on their platforms, must obtain parental permission before collecting, using or sharing the personal data, such as first and last names, addresses and phone numbers, of a child younger than 13.
To comply with the law, popular apps like Instagram and TikTok have terms of service that prohibit children under 13 from creating accounts. Gaming and social media apps often ask new users to provide their dates of birth.
Still, regulators have filed numerous complaints against big tech companies, accusing them of failing to establish effective age-screening systems, displaying targeted ads to children based on their online behavior without parental permission, allowing strangers to contact children online, or retaining children's data even after parents asked for it to be deleted. Amazon; Microsoft; Google and its YouTube platform; Epic Games, the creator of Fortnite; and Musical.ly, the social app now known as TikTok, have all paid multimillion-dollar fines to resolve charges of violating the law.
Furthermore, a coalition of 33 state attorneys general filed a joint federal lawsuit in October against Meta, the parent company of Facebook and Instagram, saying that the company had violated the children's privacy law. In particular, the states criticized Meta's age verification system, saying the company had allowed millions of underage users to create accounts without parental consent. Meta has said it spent a decade working to make online experiences safe and age-appropriate for teens and that the states' complaint “mischaracterizes” the company's work.
The FTC proposed stronger children's privacy protections amid heightened public concern about the potential mental health and physical safety risks that popular online services may pose to young people. Parents, pediatricians and children's groups warn that social media recommendation systems have routinely shown inappropriate content promoting self-harm, eating disorders and plastic surgery to girls. And some school officials worry that social media platforms distract students from their schoolwork in class.
This year, states have passed more than a dozen laws that restrict minors' access to social networks or pornography sites. Industry trade groups have successfully sued to temporarily block several of those laws.
The FTC began reviewing the children's privacy rule in 2019 and received more than 175,000 comments from technology and advertising industry trade groups, video content developers, consumer advocacy groups, and members of Congress. The resulting proposal runs more than 150 pages.
The proposed changes include narrowing an exception that allows online services to collect persistent identification codes from children for certain internal operations, such as product improvement, consumer personalization or fraud prevention, without parental consent.
The proposed changes would prohibit online operators from employing such user tracking codes to maximize the amount of time children spend on their platforms. That means online services would not be able to use techniques such as sending notifications to mobile phones “to prompt the child to interact with the site or service, without verifiable parental consent,” according to the proposal.
It is not yet known how online services would comply with the changes. The public has 60 days to comment on the proposals, after which the commission will vote.
Initial reactions from industry trade groups were mixed.
The Software and Information Industry Association, whose members include Amazon, Apple, Google and Meta, said it was “grateful” for the FTC's efforts to consider outside input and noted that the agency's proposal had cited the group's recommendations.
“We are interested in participating in the next phase of the effort and hope that the FTC will take a similarly thoughtful approach,” Paul Lekas, the group's head of global public policy, said in an email.
NetChoice, whose members include TikTok, Snap, Amazon, Google and Meta, by contrast, said the agency's proposed changes went too far by setting defaults that parents might not want. The group has sued several states to block new laws that would limit minors' access to online services.
“With this new rule, the FTC is overriding parents' wishes,” Carl Szabo, the group's general counsel, said in a statement. “It will make it even more difficult for websites to provide necessary services to children approved by their parents.”