The CEOs of Discord, Snap, TikTok, X and Meta prepare to testify before the Senate Judiciary Committee on Jan. 31, 2024. Alex Wong/Getty Images
“You have blood on your hands.”
“I’m sorry for everything you have all been through.”
These quotes, the first from Sen. Lindsey Graham, R-S.C., speaking to Meta CEO Mark Zuckerberg, and the second from Zuckerberg to families of victims of online child abuse in the audience, are highlights from an extraordinary day of testimony before the Senate Judiciary Committee about protecting children online.
But perhaps the most telling quote from the Jan. 31, 2024, hearing came not from the CEOs of Meta, TikTok, X, Discord or Snap but from Sen. Graham in his opening statement: Social media platforms “as they are currently designed and operate are dangerous products.”
We are university researchers who study how social media organizes news, information and communities. Whether or not social media apps meet the legal definition of “unreasonably dangerous products,” the social media companies’ business models do rely on having millions of young users. At the same time, we believe that the companies have not invested sufficient resources to effectively protect those users.
Mobile device use by children and teens skyrocketed during the pandemic and has stayed high. Naturally, teens want to be where their friends are, whether that’s the skate park or social media. In 2022, there were an estimated 49.8 million users age 17 and under on YouTube, 19 million on TikTok, 18 million on Snapchat, 16.7 million on Instagram, 9.9 million on Facebook and 7 million on Twitter, according to a recent study by researchers at Harvard’s Chan School of Public Health.
Teens are a significant source of revenue for social media companies. Social media revenue from users age 17 and under was US$11 billion in 2022, according to the Chan School study. Instagram netted nearly $5 billion, while TikTok and YouTube each accrued over $2 billion. Teens mean green.
Social media poses a range of risks for teens, from exposing them to harassment, bullying and sexual exploitation to encouraging eating disorders and suicidal ideation. For Congress to take meaningful action on protecting children online, we identify three issues that need to be addressed: age, business model and content moderation.
Following vigorous prompting from Sen. Josh Hawley, R-Mo., Meta CEO Mark Zuckerberg apologized to families of victims of online child abuse.
How old are you?
Social media companies have an incentive to look the other way when it comes to their users’ ages. Otherwise, they would have to spend the resources to moderate their content appropriately. Millions of underage users – those under 13 – are an “open secret” at Meta. Meta has described some possible ways to verify users’ ages, such as requiring identification or video selfies, and using AI to estimate their age based on “Happy Birthday” messages.
However, the accuracy of these methods is not open to public scrutiny, so it is difficult to audit them independently.
Meta has said that online teen safety legislation is needed to prevent harm, but the company points to app stores, currently dominated by Apple and Google, as the place where age verification should happen. However, those guardrails are easily circumvented by accessing a social media platform’s website rather than its app.
New generations of consumers
Teen adoption is essential for the continued growth of all social media platforms. The Facebook Files, an investigation based on a review of company documents, showed that Instagram’s growth strategy relies on teens helping family members, particularly younger siblings, get onto the platform. Meta claims it optimizes for “meaningful social interaction,” prioritizing content from family and friends over other interests. However, Instagram allows pseudonymity and multiple accounts, which makes parental oversight much more difficult.
On Nov. 7, 2023, Arturo Bejar, a former senior engineer at Facebook, testified before Congress. At Meta he had surveyed teen Instagram users and found that 24% of 13- to 15-year-olds said they had received unwanted advances within the previous seven days, a fact he characterized as “likely the largest-scale sexual harassment of teens to have ever occurred.” Meta has since implemented restrictions on direct messaging in its products for underage users.
But to be clear, widespread harassment, bullying and solicitation are part of the landscape of social media, and it is going to take more than parents and app stores to rein them in.
Meta recently announced that it is aiming to provide teens with “age-appropriate experiences,” in part by prohibiting searches for terms related to suicide, self-harm and eating disorders. However, these steps do not stop online communities that promote these harmful behaviors from flourishing on the company’s social media platforms. It takes a carefully trained team of human moderators to monitor dangerous groups and enforce terms of service violations.
Content moderation
Social media companies point to the promise of artificial intelligence to moderate content and provide safety on their platforms, but AI is not a silver bullet for managing human behavior. Communities adapt quickly to AI moderation, augmenting banned words with purposeful misspellings and creating backup accounts to avoid getting kicked off a platform.
Human content moderation is also problematic, given social media companies’ business models and practices. Since 2022, social media companies have carried out massive layoffs that struck at the heart of their trust and safety operations and weakened content moderation across the industry.
Congress will need hard data from the social media companies – data the companies have not provided to date – to assess the appropriate ratio of moderators to users.
The way forward
In health care, professionals have a duty to warn if they believe something dangerous might happen. When such uncomfortable truths surface in corporate research, little is done to inform the public about threats to safety. Congress could mandate reporting when internal studies reveal harmful outcomes.
Helping teens today would require social media companies to invest in human content moderation and meaningful age verification. But even that is not likely to fix the problem. The challenge is confronting the reality that social media as it exists today thrives on having legions of young users spending significant time in environments that put them at risk. These dangers for young users are baked into the design of contemporary social media, which calls for much clearer statutes about who polices social media and when intervention is required.
One of the motives for tech companies not to segment their user base by age, which would better protect children, is how doing so would affect advertising revenue. Congress has limited tools available to enact change, such as enforcing laws on advertising transparency, including “know your customer” rules. Especially as AI accelerates targeted marketing, social media companies are going to keep making it easy for advertisers to reach users of any age. But if advertisers knew what proportion of ads were seen by children, rather than adults, they might think twice about where they place ads in the future.
Despite numerous high-profile hearings on the harms of social media, Congress has not yet passed legislation to protect children or to hold social media platforms liable for the content published on their platforms. But with so many young people online post-pandemic, it is up to Congress to enact guardrails that ultimately put privacy and community safety at the heart of social media design.

Joan Donovan is on the board of Free Press and is the founder of the Critical Internet Studies Institute.
Sara Parker works for the Media Ecosystem Observatory at McGill University. Their work is largely funded by the Government of Canada.