
Netflix Targeted Children for Suicide Show Class Action

Did Netflix, Inc. cause a wave of child suicides? That’s the contention of this class action. The complaint alleges that the company had been warned of the risks of airing the show Thirteen Reasons Why, yet targeted it at “impressionable youths” without warning them of the risks of viewing it. “Instead,” the complaint claims, “it used its sophisticated, targeted recommendation systems to push the Show on unsuspecting and vulnerable children, using its cutting-edge technology.”

The plaintiffs in this case are the estate of IBH, a child who killed herself; her father, John Herndon; and her two younger brothers, JMH and TPH.

A spike in child suicides occurred in April 2017. According to the complaint, the cause was the release of Thirteen Reasons Why on Netflix’s streaming service and the targeting of the show at children. The complaint claims that experts were later able to see this connection: For example, years later, “the National Institute of Mental Health associated the 28.9% increase in the child-suicide rate during the month of April 2017 with Netflix’s Show…”

The complaint alleges that Netflix had been warned that this could happen. “Yet Netflix proceeded anyway, prioritizing its own strategy goals of market dominance in the youth demographic over the lives and well-being of vulnerable populations it knew would suffer and die if it did not provide greater warnings and take reasonable, common-sense steps to avoid using its data in a reckless manner that harmed children.”

The show was based on a novel of the same title that became very popular when it was published in 2007. The story was told in the form of fictional audiotape transcripts, supposedly from tapes made by the main character, Hannah Baker, each addressed to a person she blamed for her suicide.

Netflix bought the rights to make the novel into a show. The complaint alleges, “Part of the business case for adapting the Novel into the Show was that the Novel already had a ‘huge following’ and ‘huge fan base’ so the Show was expected to attract younger audiences.”

But there were at least two differences between the show and the novel. First, the complaint says, the novel was fast-paced, whereas the show played like a thirteen-hour suicide note, with the suicide as its “grand climax.” Second, the method of suicide was changed from an overdose of pills to a scene in which, as one article describes it, “she saws vertically at her forearms with razor blades, sobbing and screaming in an overflowing, pinkish tub.”

The complaint says that “Netflix is not being sued because it created a Show of questionable morality that arguably glorifies teenage suicide.” Instead, it claims, it is being sued for two things: “(1) Netflix’s failure to adequately warn of its Show’s … dangerous features and (2) Netflix’s use of its trove of individualized data about its users to specifically target vulnerable children and manipulate them into watching content that was deeply harmful to them…”

The complaint provides reasons why a class action is suitable for resolving the claims, but it does not define a proposed class.

Article Type: Lawsuit
Topic: Wrongful Death

Most Recent Case Event

Netflix Targeted Children for Suicide Show Complaint

August 25, 2021

Tags: Negligence, Strict Product Liability—Failure to Warn, Wrongful Death