Teens in Texas will likely soon lose some of their online privileges after the state last week became the third to require parental consent before minors can use social media. Utah adopted similar legislation in March, and Louisiana did the same earlier this month.
On Wednesday, Texas Governor Greg Abbott signed HB 18 into law. Once it takes effect on September 1, platforms will need to verify the ages of all minor users and obtain parental consent before allowing them to sign up, or face legal action from the state attorney general or complaints from parents.
“Social media companies have been collecting children’s data and manipulating their online behavior,” the Texas House Republican Caucus tweeted Thursday after the bill, sponsored by Representative Shelby Slawson (R), was signed into law. Slawson also tweeted her thanks to Abbott and Texas House Speaker Dade Phelan for “prioritizing this issue.”
“Texas is leading the way to empower parents to protect our kids online,” Slawson tweeted.
Slawson did not immediately respond to Ars’ request for comment.
Texas Bans Social Media for Children Without Parental Consent
The expansive law imposes significant requirements on web platforms. In short, it mandates that any digital service provider that collects an email address at sign-up verify users’ ages to identify all minors, confirm the identity of each minor’s parent or guardian, and secure parental consent for a range of account activity.
Platforms serving children are expected to go above and beyond to safeguard them from harmful, deceptive, or unfair trade practices.
In addition to the burden of verifying minors and their parents or guardians, the law makes platforms accountable for new parental controls, including building a portal to communicate with parents about a minor’s activity and making it easier for parents to monitor minors’ behavior and control their activity on the platform.
Platforms must also prohibit harmful content that “promotes, glorifies, or facilitates” self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child sexual abuse material, or other forms of sexual exploitation or abuse. That effort includes creating a strategy to maintain “a comprehensive list of harmful material” to “block from display to a known minor” and employing actual people to evaluate and confirm that filters are working, rather than relying solely on automated content moderation.
Any failures could trigger additional requirements, such as annual independent audits of platforms to ensure that content filters are working properly to protect minors.
Additionally, the Texas law requires internet companies to increase transparency by clearly describing in their terms of service or privacy policies how their algorithms organize and filter content.