Tech executives, including Meta CEO Mark Zuckerberg, are set to face another round of questioning from Congress this week over the harm their products may pose to teenagers. Despite previous assurances that they would help teens and families make informed decisions, social media platforms are under increasing scrutiny amid concerns that their services could contribute to depression and even suicide among young users.
With a presidential election looming and state lawmakers taking a more prominent role, Congress is poised to push tech companies beyond their previous efforts to address these concerns. The Senate Judiciary Committee hearing scheduled for Wednesday will feature testimony from the chief executives of TikTok, Snap, Discord, and X, some of whom will be appearing before Congress for the first time.
While many tech CEOs are expected to highlight tools and policies aimed at protecting children and giving parents greater control over their online experiences, critics argue that these measures fall short. Online safety advocates insist that relying solely on parents and young users to safeguard against potential harms is inadequate and that tech platforms must be held accountable for implementing more robust safety features.
Concerns have also been raised about the role of generative artificial intelligence tools, which could facilitate the creation and dissemination of harmful content on social media. This has intensified calls for tech platforms to prioritize safety features by default.
Several major platforms, including Meta, Snapchat, Discord, and TikTok, have introduced oversight tools that allow parents to monitor their teens’ online activities and exert some control over their experiences. Platforms such as Instagram and TikTok have also added features like “take a break” reminders and screen time limits to curb excessive use.
In response to mounting pressure, Meta recently called for federal legislation that would require app stores to verify users’ ages and enforce age restrictions. The company also unveiled new safety initiatives, such as hiding age-inappropriate content from teens and prompting them to tighten their privacy settings.
Snapchat has expanded its parental oversight tool, Family Center, giving parents more control over their teens’ interactions on the platform. Despite these efforts, critics argue that tech companies have been slow to implement necessary safety updates and cannot be relied upon to self-regulate effectively.
As efforts to regulate tech platforms stall in Congress, momentum for addressing social media issues has shifted to state legislatures and the courts. Several states have passed laws aimed at restricting social media use among teens, though many of these laws are facing legal challenges from the tech industry.
Wednesday’s hearing will also provide lawmakers with an opportunity to question smaller industry players, such as X and Discord, about their youth safety efforts. Discord, in particular, has faced scrutiny over its role in hosting controversial content and has been engaging with lawmakers to address concerns.
With tech CEOs once again under the spotlight, there is growing recognition that industry-wide solutions are needed to address the complex challenges posed by social media platforms and their impact on young users.