Ross: Social media companies might actually be held responsible for the content they host
Oct 5, 2022, 7:14 a.m. | Updated: Oct 6, 2022, 6:15 a.m.

(Photo Illustration by Justin Sullivan/Getty Images)
The Supreme Court could finally decide whether social media is legally responsible for what it broadcasts. The justices have agreed to take the case of Gonzalez v. Google.
Nohemi Gonzalez was a California student studying abroad in Paris in 2015 – when ISIS terrorists attacked a stadium, a concert hall, and the restaurant where she was eating.
Nohemi was among the 130 people killed that night. And when her family discovered that YouTube – which is owned by Google – had hosted ISIS recruitment videos, they sued.
And they lost.
Because under Section 230 of the Communications Decency Act, YouTube is not responsible for what users post on its platform. That rule was passed in 1996, when online platforms were just getting started, to make sure they weren’t strangled by regulation.
But social media is all grown up now. And Google doesn’t merely host videos, it also uses algorithms that recommend them. The Gonzalez family argues that those algorithms actively steered terrorist videos to people with terrorist tendencies, and that the company therefore bears some responsibility for what happened to their daughter.
And the Supreme Court has now decided it’s time to sort this out.
The case has social media companies really worried, because all of them – YouTube, Facebook, Twitter, TikTok – use software to automatically figure out what you like so they can feed you more of it. But very little of that content is actually previewed by human eyes.
Just last month, Congress again asked social media executives to explain why they can’t seem to control content like QAnon conspiracies and other calls to violence:
“We do not allow that content on our platform. We’ve been removing that type of content from our platform for years,” one YouTube executive said.
But the issue here isn’t about removal after the damage has been done. It’s about making sure the bad stuff doesn’t get broadcast to begin with. It’s about whether the blanket protection of Section 230 still makes sense now that social media has become the mainstream media.
The Supreme Court could conceivably take away that protection, which would require YouTube and its peers to preview everything they broadcast. Which terrifies them.
But I’m here to say there is nothing to be afraid of. It’s not impossible, and it’s not the end of the world.
I know that because we broadcasters have been doing it for about a hundred years.
Listen to Seattle’s Morning News with Dave Ross and Colleen O’Brien weekday mornings from 5 – 9 a.m. on ³ÉÈËXÕ¾ Newsradio, 97.3 FM.