WASHINGTON – Taking aim at hugely popular social media platforms and their impact on children, leaders of a Senate panel have called executives from YouTube, TikTok and Snapchat to answer questions about what their businesses are doing to ensure the safety of young users.
The Senate Commerce Committee's consumer protection subcommittee recently held a widely watched hearing with a former Facebook data scientist, who presented internal company research showing that the company's Instagram photo-sharing service appears to be seriously harming some teens.
The panel is expanding its inquiry to other tech platforms with millions or billions of users that are also vying for the attention and loyalty of young people.
The three executives – Michael Beckerman, TikTok's vice president and head of public policy for the Americas; Leslie Miller, vice president of government affairs and public policy at YouTube's owner, Google; and Jennifer Stout, vice president of global public policy for Snapchat's parent company, Snap Inc. – are scheduled to appear at a subcommittee hearing Tuesday.
The three platforms are woven into the fabric of young people's lives, often influencing their dress, dance moves and diet, sometimes to the point of obsession. Peer pressure to use the apps is strong. Social media can provide entertainment and education, but the platforms have also been misused to harm children and promote bullying, vandalism in schools, eating disorders and manipulative marketing, lawmakers say.
“We need to understand the impact of popular platforms like Snapchat, TikTok and YouTube on children and what companies can do better to keep them safe,” Senator Richard Blumenthal, D-Conn., the subcommittee's chairman, said in a statement.
The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and invade privacy, Blumenthal said. The aim is to develop legislation to protect young people and give parents tools to protect their children.
Very popular with teens and young children, the TikTok video platform is owned by the Chinese company ByteDance. In just five years since its launch, it has gained around 1 billion monthly users.
TikTok denies claims, including from conservative Republican lawmakers, that it operates at the behest of the Chinese government and provides it with users’ personal data. The company says it stores all TikTok US data in the United States. The company also rejects criticism of promoting content harmful to children.
TikTok says it has implemented tools, such as screen time management, to help kids and parents moderate how much time kids spend on the app and what they see. The company says it is focusing on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users.
Earlier this year, after federal regulators ordered TikTok to disclose how its practices affect children and teens, the platform tightened its privacy practices for those under 18.
A separate House committee investigated the YouTube Kids video service this year. Lawmakers said the YouTube offshoot feeds children inappropriate material in “a wasteland of tasteless and consumerist content” so that it can show them ads. The app, which offers both video hosting and original shows, is available in around 70 countries.
A House Oversight and Reform Committee panel told YouTube CEO Susan Wojcicki that the service is not doing enough to protect children from potentially harmful content. Instead, it relies on artificial intelligence and self-monitoring by content creators to decide which videos will be shown on the platform, the panel chair said in a letter to Wojcicki.
Parent company Google agreed in 2019 to pay $170 million to settle allegations by the Federal Trade Commission and the state of New York that YouTube collected personal data from children without their parents' consent.
Despite changes made after the settlement, the lawmakers' letter said, YouTube Kids still shows ads to children.
YouTube says it has worked to give children and families protections and parental controls, such as time limits, to restrict viewing to age-appropriate content. The company points out that the 2019 settlement involved the main YouTube platform, not the kids' version.
“We took action on over 7 million accounts in the first three quarters of 2021 when we learned they could be owned by a user under the age of 13 – 3 million of those in the third quarter alone – as we have accelerated our automated removal efforts,” Miller, the Google vice president, said in written testimony prepared for the hearing.
Snap Inc.’s Snapchat service lets people send photos, videos and messages that are designed to disappear quickly, an enticement for young users seeking to evade the prying eyes of parents and teachers. Hence its faceless (and wordless) white “Ghostface Chillah” logo.
The 10-year-old service says that 90% of 13- to 24-year-olds in the United States use it. It reported 306 million daily users in the July-September quarter.
The company agreed in 2014 to settle FTC allegations that it deceived users about how effectively shared material disappeared and that it collected users’ contacts without telling them or asking for permission. The messages, known as “snaps,” could be saved using third-party apps or other methods, regulators said.
Snapchat was not fined but agreed to establish a privacy program to be monitored by an outside expert for 20 years – similar to the oversight imposed on Facebook, Google and Myspace in privacy settlements in recent years.