In the US, we have this thing called "Section 230" that says a social site is not legally liable for the content of posts made by users. A court has now put a limitation on this:
...Several kids died taking part in the "Blackout Challenge," which Third Circuit Judge Patty Shwartz described in her opinion as encouraging users "to choke themselves with belts, purse strings, or anything similar until passing out."...
Because TikTok's For You Page (FYP) algorithm decides which third-party speech to include or exclude and organizes content, TikTok's algorithm counts as TikTok's own "expressive activity." That "expressive activity" is not protected by Section 230, which only shields platforms from liability for third-party speech, not platforms' own speech, Shwartz wrote. ...
According to Shwartz, if Nylah had discovered the "Blackout Challenge" video by searching on TikTok, the platform would not be liable, but because she found it on her FYP, TikTok transformed into "an affirmative promoter of such content."...
Ars Technica wrote the following post on Wed, 28 Aug 2024:

Court: Section 230 doesn’t shield TikTok from Blackout Challenge death suit
TikTok must face claim over For You Page recommending content that killed kids.
https://arstechnica.com/tech-policy/2024/08/court-section-230-doesnt-shield-tiktok-from-blackout-challenge-death-suit/