
A teen who was bullied on Snapchat died. His mom is suing to hold social media liable

The Snapchat app on a mobile device screen.
(Associated Press)

One of the last things that Carson Bride did before taking his own life was look to his phone for help.

The 16-year-old had been receiving anonymous messages for months through Yolo, a popular anonymous messaging app built for Snapchat, according to a federal lawsuit filed Monday in California. The messages included sexual comments and taunts over specific incidents, such as the time he’d fainted in biology class at his Portland, Ore., school.

The messages had to be coming from people he knew, but the app’s design made it impossible for him to tell who was behind them. If he replied to the taunts, Yolo would automatically make the original message public, revealing his humiliation to the world.


His family found him dead June 23, 2020. The history on his phone showed that he had been searching for “Reveal YOLO Username Online” earlier that morning.

Now Kristin Bride, Carson’s mother, is leading a lawsuit against Snap, Yolo and LMK, another anonymous messaging app built for Snapchat that the teenager used before his death. Her complaint alleges that the companies violated consumer protection law by failing to live up to their own terms of service and policies, and that anonymous messaging apps facilitate bullying to such a degree that they should be considered dangerous products.

The suit, filed Monday in federal court in the Northern District of California, seeks to form a class on behalf of the approximately 93 million U.S. users of Snapchat — a number that the company says includes 90% of all Americans ages 13 to 24 — along with the 10 million users of Yolo and 1 million users of LMK. Bride’s co-plaintiff in the case is the Tyler Clementi Foundation, a nonprofit formed to prevent bullying by the family of Tyler Clementi, who took his own life at age 18 in 2010 after experiencing cyber harassment by a dorm mate at Rutgers University.


Snap declined to comment on active litigation. Yolo and LMK did not respond to requests for comment.

Suicide prevention and crisis counseling resources

If you or someone you know is struggling with suicidal thoughts, seek help from a professional and call 9-8-8, the United States’ first nationwide three-digit mental health crisis hotline, which connects callers with trained mental health counselors. Text “HOME” to 741741 in the U.S. and Canada to reach the Crisis Text Line.

The lawsuit seeks to have Yolo and LMK immediately banned from Snap’s platform, along with any other apps that have failed to set up safeguards against cyberbullying, and seeks damages for the alleged harms and misrepresentations.

“The high school students who anonymously cyberbullied Carson will live with this tragedy for the rest of their lives,” Kristin Bride said in a statement provided by Eisenberg & Baum, the law firm representing the plaintiffs. “However, it is the executives at Snapchat, Yolo, and LMK irresponsibly putting profits over the mental health of young people who ultimately need to be held accountable.”


To date, those attempting to sue social media companies over their users’ words and actions have met with little success. Most cases against tech companies over content posted by their users are dismissed out of hand under Section 230 of the 1996 Communications Decency Act, which states that no “interactive computer service” can be held liable for information posted by a user on that service.

But changes in the legal landscape and a novel legal argument may set this case apart.


In a ruling last week, the U.S. 9th Circuit Court of Appeals opened the door to the idea that social media companies — and Snap in particular — can be held responsible for building or enabling features so clearly dangerous to their users that the product is essentially defective.

That case centered on a Snapchat filter that automatically detected how fast the user was moving and let them add that number to a post on the platform. Plaintiffs in the suit argued that the feature incentivized driving at high speeds, leading to a fatal 2017 car crash in Wisconsin in which a 17-year-old passenger pulled up Snapchat moments before the car hit 123 mph, ran off the road and crashed into a tree.

The 9th Circuit reversed a lower court’s decision to dismiss the case over Section 230 protections, with Judge Kim McLane Wardlaw writing that “this type of claim rests on the premise that manufacturers have a ‘duty to exercise due care in supplying products that do not present unreasonable risk of injury or harm to the public.’”

In the case filed Monday, Bride and the Tyler Clementi Foundation argue that anonymous messaging apps such as Yolo and LMK present a similarly unreasonable risk of harm. To bolster this argument, the suit points to multiple generations of anonymous messaging apps targeted at teen users, such as Yik Yak, Secret and Sarahah, that have risen and collapsed in recent years, each brought down under the weight of the abuse and harassment it enabled.

The suit also cites research linking anonymous harassment and teen suicide, including a 2007 study that found that students who experience bullying, online or in real life, are nearly twice as likely to attempt suicide. A subsequent study, in 2014, found that cyberbullying may be even more dangerous, with online bullying tripling the risk of suicidal ideation.



But the suit also pursues a line of argument drawn from consumer protection law: that Snap, Yolo and LMK failed to live up to their terms of service and other commitments to users.

A recent decision by the 2nd Circuit Court of Appeals in New York suggested that this consumer protection argument might have legs. In that case, the plaintiff argued that the gay dating app Grindr should be held responsible for harassment he received through the app from a former boyfriend, who set up a fake account impersonating him and, over the course of 10 months, sent a stream of 1,400 strangers to his home looking for sex.

The court dismissed the case, citing Section 230 protections, and found that, for a number of reasons, it was unclear whether Grindr had violated its terms of service. But the ruling indicated that a case showing a more clear-cut violation of those terms and agreements could have merit on consumer protection grounds.

A suit filed in early April against Facebook relies on a similar strategy. The civil rights group Muslim Advocates sued the social media giant, alleging that it has failed to live up to its promises to users by allowing hate speech against Muslims to proliferate unchecked on the platform.

An approach based on consumer protection laws might find traction in the court, said Jeff Kosseff, a law professor at the U.S. Naval Academy and the author of a book about Section 230, “The Twenty-Six Words That Created the Internet.”

“Five years ago, I would have predicted that this case would almost definitely be dismissed” because of the internet shield law, Kosseff said, “but now I think it could really go either way.


“This is a really horrible, tragic case, and judges who are reading these cases are not completely blind to that. No matter how much case law there is, they’re clearly going to be influenced by the fact these are some really terrible allegations in this complaint.”

A prompt shown to first-time Yolo users warns that the app has “no tolerance for objectionable content or abusive users,” and that users will be banned for inappropriate behavior. Its privacy policy warns users that the company collects user information in order to enforce that zero tolerance policy.

But after discovering the bullying Yolo messages and search history on her son’s phone, Bride contacted the company through its website, sending a message about her son’s cyberbullying and resulting death. Yolo never responded, according to the suit.

Months later, Bride and her husband, Tom, tried again to contact Yolo through its website and via an email address the company provided for reporting emergencies, demanding that it remove the bullying users from its platform. The email carried the subject line “Our Son’s Suicide — Request for Help.” It bounced back with an automated message saying the recipient could not be reached because the address was invalid. Two subsequent attempts to contact the company also went unanswered, according to the suit.

LMK, the other third-party app in the suit, made even stronger claims about user safety, writing in its terms of service that it would “go to great lengths to protect our community” from inappropriate usage, including using “artificial intelligence technology” and human moderation to police its content.

And Snap, the suit alleges, has failed to live up to its own promises to users by allowing these apps to run on its service.


Snap allows other companies to build apps for its platform with a set of tools called Snap Kit and makes clear in its guidelines that it actively reviews all third-party apps, noting that “encouraging or promoting violence, harassment, bullying, hate speech, threats and/or self-harm” is unacceptable, as is having “inadequate safeguards in place to prevent this type of behavior.”

Those guidelines also state that any app with anonymous user-generated content is subject to a longer review process, and if a third-party app is noncompliant or violates guidelines, Snap reserves the right to remove the app from its platform.

Snap has not removed Yolo or LMK from its platform, “even though it knows or has reason to know through numerous reports that YOLO and LMK lack adequate safeguards to prevent teen users from being victimized by harassment, sexually explicit materials, and other harm,” according to the suit.
