TikTok Denies Violating Texas Child Safety Law


TikTok denies Texas lawsuit allegations that it is not doing enough to protect minors’ privacy and give parents sufficient control over their children’s use of the platform.

Texas Attorney General Ken Paxton filed a suit against TikTok on Thursday, October 3, alleging the social media giant violated a new state law, the Securing Children Online through Parental Empowerment (SCOPE) Act, designed to protect children online from improper handling of their personal data.

At the heart of Paxton’s complaint is the assertion that TikTok’s current parental control features fall short of the SCOPE Act’s requirements.

“We strongly disagree with these allegations,” a TikTok spokesperson said in an emailed statement to Newsweek.

“In fact, we offer robust safeguards for teens and parents, including Family Pairing, all of which are publicly available,” the statement added. “We stand by the protections we provide families.”

Newsweek contacted the office of Attorney General Ken Paxton for comment via email.

Paxton argues that the Family Pairing system is inadequate. The Texas lawsuit claims that TikTok does not use “commercially reasonable methods” to verify parents’ identities, as the act stipulates. The suit also criticizes the requirement that minors consent to the pairing, which it says can create a barrier to parental oversight.

Paxton further claims that TikTok engages in the “unlawful sharing, disclosing, and selling [of] known minors’ personal identifying information” to various third parties, including advertisers and search engines. This data sharing, the lawsuit claims, occurs without obtaining proper consent from verified parents, directly contravening the SCOPE Act’s provisions.

The SCOPE Act, which was passed in 2023, partially came into effect on September 1, 2024. The act introduces requirements for “digital service providers”—a broad term encompassing websites, apps, and software that collect personal data—particularly those that enable social interaction and content sharing.

[Stock photo: A group of children using smartphones in a school corridor. TikTok has denied Texas allegations that it fails to properly protect minors on its platform. lakschmiprasad S/Getty Images]

Paxton’s lawsuit claims TikTok “collect[s], store[s], and process[es] personal identifying information about minors whenever a minor interacts with TikTok,” adding that the data “from users includes date of birth, email, phone number, and device settings, such as device type, language preference, and country setting, as well as data about a user’s interaction with TikTok, such as videos viewed, ‘liked’ or shared, accounts followed, comments, content created, video captions, sounds, and hashtags.”

The law also mandates that these platforms obtain verifiable parental consent before sharing, disclosing, or selling a minor’s personal information. Additionally, it requires these platforms to provide parents with tools to manage their children’s privacy and account settings, which Paxton claims TikTok fails to do.

“I will continue to hold TikTok and other Big Tech companies accountable for exploiting Texas children and failing to prioritize minors’ online safety and privacy,” said Attorney General Ken Paxton in an official statement on October 3.

“Texas law requires social media companies to take steps to protect kids online and requires them to provide parents with tools to do the same. TikTok and other social media companies cannot ignore their duties under Texas law.”

Paxton’s lawsuit comes after a separate legal challenge to the SCOPE Act. In August, just days before the law was set to take effect, U.S. District Judge Robert Pitman issued a last-minute, partial block, preventing enforcement of the act’s “monitoring and filtering” requirements, which compel platforms to filter harmful content from minors’ feeds.

The judge cited concerns over potential violations of online free speech rights, particularly given the broad language used in defining harmful content, including terms like “promoting,” “glorifying,” and “grooming.” Nonetheless, other aspects of the SCOPE Act, such as those pertaining to data sharing and parental control tools, remain in effect.


