Bad News: We’ve Lost Control of Our Social Media Feeds. Good News: Courts Are Noticing.

Updated: 2024-12-11

During a recent rebranding tour, the Meta chief Mark Zuckerberg, sporting Gen Z-approved tousled hair, streetwear and a gold chain, let the truth slip: Consumers no longer control their social-media feeds. Meta’s algorithm, he boasted, has improved to the point that it is showing users “a lot of stuff” not posted by people they had connected with and he sees a future in which feeds show you “content that’s generated by an A.I. system.”

Spare me. There’s nothing I want less than a bunch of memes of Jesus-as-a-shrimp, pie-eating cartoon cats and other A.I. slop added to all the clickbait already clogging my feed. But there is a silver lining: Our legal system is starting to recognize this shift and hold tech giants responsible for the effects of their algorithms — a significant, and even possibly transformative, development that over the next few years could finally force social media platforms to be answerable for the societal consequences of their choices.

Let’s back up and start with the problem. Section 230, a snippet of law embedded in the 1996 Communications Decency Act, was initially intended to protect tech companies from defamation claims related to posts made by users. That protection made sense in the early days of social media, when we largely chose the content we saw, based on whom we “friended” on sites such as Facebook. Since we selected those relationships, it was relatively easy for the companies to argue they should not be blamed if your Uncle Bob insulted your strawberry pie on Instagram.

Then, of course, things got a little darker. Not everything Uncle Bob shared was accurate, and the platforms’ algorithms prioritized outrageous, provocative content from anyone with internet access over more neutral, fact-based reporting. Despite this, the tech companies’ lawyers continued to successfully argue that they were not responsible for the content shared on their platforms — no matter how misleading or dangerous.

Section 230 has since been used to shield tech companies from consequences for facilitating deadly drug sales, sexual harassment, illegal arms sales and human trafficking. In the meantime, those companies grew to be some of the most valuable in the world.

Then came TikTok. After the wild popularity of TikTok’s “For You” algorithm, which selects bite-size videos to feed to the passive viewer, social networks increasingly have us watch whatever content their algorithms have chosen, often pushing to the sidelines the posts of accounts we actually chose to follow.
