Meta/Snap Wrongful Death Lawsuit
UNITED STATES DISTRICT COURT
EASTERN DISTRICT OF WISCONSIN

DONNA DAWLEY, individually and as Personal Representative of the ESTATE OF CHRISTOPHER J. DAWLEY,

Plaintiff,

v.

META PLATFORMS, INC., formerly known as FACEBOOK, INC.; SNAP, INC.,

Defendants.

NO. ___

COMPLAINT FOR WRONGFUL DEATH AND SURVIVORSHIP

JURY DEMAND
In these digital public spaces, which are privately owned and tend to be run for profit, there can be tension between what's best for the technology company and what's best for the individual user or for society. Business models are often built around maximizing user engagement as opposed to safeguarding users' health and ensuring that users engage with one another in safe and healthy ways. . . . Technology companies must step up and take responsibility for creating a safe digital environment for children and youth. Today, most companies are not transparent about the impact of their products, which prevents parents and young people from making informed decisions and researchers from identifying problems and solutions.

U.S. Surgeon General's Advisory, Protecting Youth Mental Health (2021)
Plaintiff Donna Dawley is the mother of Christopher J. ("CJ") Dawley, who died on January 4, 2015, and brings this action for wrongful death and survivorship against Meta Platforms, Inc., formerly known as Facebook, Inc. ("Facebook"), doing business as Instagram ("Instagram"), and Snap, Inc., doing business as Snapchat, and alleges as follows:
I. INTRODUCTION
1. This product liability action seeks to hold Defendants responsible for causing and contributing to the burgeoning mental health crisis perpetrated upon the children and teenagers of the United States by Defendants' products and, specifically, for the death by suicide of Christopher J. ("CJ") Dawley.
2. Researchers and clinicians have been cataloging a dramatic increase in teen mental health crises including suicides, attempted suicides, eating disorders, anxiety, depression, self-harm and inpatient admissions. Between 2007 and 2018, for example, suicide rates among youth ages twelve to sixteen in the U.S. increased sharply.
3. The most significant and far-reaching change in the lives of young people during this period was the widespread adoption of mobile social media platforms, prominently the Instagram, Snapchat and Facebook products designed and distributed by Defendants. By 2014, 80 percent of high-school students said they used a social-media platform daily, and 24 percent said that they were online "almost constantly." Many children and teenagers spend hours each day engaged with these products.
4. Peer-reviewed studies and the available medical science have identified a particular type of social media and electronic device use associated with major mental health harms, including dissatisfaction with life, depression and sleep deprivation. Both large observational studies and experimental results point to the heavy use of Defendants' social media products as a cause of increased depression, suicidal ideation, and sleep deprivation among teenagers, particularly teenage girls.
5. Defendants' own research also points to the use of Defendants' social media products as a cause of increased depression, suicidal ideation, sleep deprivation, and other serious harms. Meta researchers, for example, found that Instagram is "worse" than many competitor products and "is seen as having the highest impact [on negative body and appearance comparisons]."
6. Nevertheless, Defendants develop their products to encourage, enable, and push content to teens and children that Defendants know to be problematic and highly detrimental to their minor users' mental health.
7. Meta's own research likewise shows that large numbers of its users are engaging in problematic use of its products. Indeed, the problematic use identified in the medical literature is precisely the type of use Defendants have designed their products to encourage.
8. Likewise, each of Defendants' products contains unique product features which are intended to and do encourage addiction, and unlawful content and use of said products, to the detriment of Defendants' minor users.
9. Plaintiff brings claims of strict liability based upon Defendants' defective design of their social media products that renders such products not reasonably safe for ordinary consumers and minor users. It was feasible for Defendants to design social media products that substantially decrease both the incidence and magnitude of harm to ordinary consumers and minors arising from their foreseeable use of Defendants' products, with a negligible increase in production cost.
10. Plaintiffs also bring claims for strict liability based on Defendants' failure to provide adequate warnings to minor users and their parents of the danger of mental, physical, and emotional harms arising from foreseeable use of their social media products. The addictive quality of Defendants' products and their harmful algorithms are unknown to minor users and their parents.
11. Plaintiffs also bring claims for common law negligence arising from Defendants' unreasonably dangerous social media products and their failure to warn of such dangers. Defendants knew, or in the exercise of ordinary care should have known, that their social media products were harmful to a significant percentage of their minor users and failed to re-design their products to ameliorate these harms or warn minor users and their parents of dangers arising from the foreseeable use of their products.
12. Plaintiff also brings claims under Wis. Stat. § 100.18 based on Defendants' unfair and deceptive trade practices in the marketing of addictive social media products to unsuspecting minors.
II. PARTIES
13. Plaintiff DONNA DAWLEY is the mother of Decedent Christopher J. Dawley and will soon be appointed Personal Representative of his Estate.
14. Plaintiff DONNA DAWLEY has not entered into a User Agreement or other contractual relationship with any of the Defendants herein. As such, in prosecuting this action she is not subject to any arbitration, forum selection, choice of law, or class action waiver provisions set forth in said User Agreements. Further, as Personal Representative of the Estate of CJ Dawley, Plaintiff expressly disaffirms any User Agreements with Defendants which CJ Dawley may have accepted, which would have been entered into prior to his reaching the age of majority.
15. Defendant Meta Platforms, Inc., formerly known as Facebook, Inc., is a Delaware corporation with its principal place of business in Menlo Park, CA. Defendant Meta Platforms owns and operates the Facebook and Instagram social media platforms, which are widely available to users throughout the United States.
16. Defendant Snap, Inc. is a Delaware corporation with its principal place of business in Venice Beach, CA. Defendant Snap Inc. owns and operates the Snapchat social media platform, an application that is widely marketed by Snap Inc. and available to users throughout the United States.
III. JURISDICTION AND VENUE
17. This Court has subject-matter jurisdiction over this case under 28 U.S.C. § 1332(a) because the amount in controversy exceeds $75,000 and Plaintiff and Defendants are residents of different states. Venue is proper in this District under 28 U.S.C. § 1391(b)(2) because a substantial part of the events giving rise to the claim occurred in the Eastern District of Wisconsin.
18. This Court has specific personal jurisdiction over Defendants Meta and Snap because these Defendants transact business in the State of Wisconsin and purposefully avail themselves of the benefits of transacting business with Wisconsin residents; Plaintiff's claims set forth herein arise out of and/or relate to Defendants' activities in the State of Wisconsin; and the exercise of jurisdiction by this Court comports with traditional notions of fair play and substantial justice.
19. Defendants have contracts with a significant percentage of the population of the State of Wisconsin relating to use of the products at issue in this case, and interact extensively with, send messages, notifications, and communications to, and provide a myriad of other interactive services and recommendations to users whom Defendants expect and know to be in the State of Wisconsin.
20. Defendants also contract with third party "partners" who advertise on their behalf via electronic platforms and devices. Defendant Meta also has agreements with cell phone manufacturers and/or providers and/or retailers, who often pre-install its products on mobile devices prior to sale and without regard to the age of the intended user of each such device. That is, even though Defendants are prohibited from providing their products to users under the age of 13, by encouraging and allowing its products to be pre-installed on such devices, Meta provides access to its product to the underage users in Wisconsin for whom those devices are intended.
21. Defendants have earned millions of dollars in annual revenue from their Wisconsin-related activities over the last several years arising from their defective and inherently dangerous social media products, including the products used by CJ Dawley.
IV. DEFENDANT-SPECIFIC ALLEGATIONS
A. Facebook and Instagram Background
22. Facebook is an American online social network service that is part of the Defendant Meta Platforms. Facebook was founded in 2004 and became the largest social network in the world, with a substantial share of its billions of users using Facebook every day. The company's headquarters are in Menlo Park, California.
23. Instagram is a photo sharing social media application that originally enabled users to post and share photos that could be seen by other users who "follow" the user. A user's followers could "like" and post comments on the photos. Instagram was purchased by Facebook in 2012.
24. Facebook recently changed its name to Meta Platforms, Inc., and Facebook and Instagram are referred to herein collectively as "Meta."
25. A user's "feed" is comprised of a series of photos and videos posted by accounts that the user follows, along with advertising and content specifically selected and promoted by Instagram. Meta exerts control over a user's Instagram "feed," including through certain ranking mechanisms, escalation loops, and/or promotion of advertising and content specifically selected and promoted by Meta based on, among other things, its ongoing planning, assessment, and prioritization of the types of information most likely to increase engagement. In the case of certain groups of users, including teens, this translates to Meta's deliberate and repeated promotion of harmful and unhealthy content, which Meta knows or has reason to know is harmful to those users.
26. Instagram also features a "discover" feature where a user is shown an endless feed of content that is selected by an algorithm designed by Meta based upon the user's demographics and prior activity in the application. Again, Meta has designed its product in a manner such that it promotes harmful and/or unhealthy content. Meta is aware of these inherently dangerous product features and has repeatedly decided against changing them and/or implementing readily available safer alternatives.
27. Users' profiles on Instagram may be public or private. On public profiles, any user is able to view the photos, videos, and other content posted by the user. On private profiles, the user's content may only be viewed by the user's followers, whom the user is able to approve. During the relevant period, Instagram profiles were public by default and Instagram allowed all users to message and send follow requests to underage users, including Decedent CJ Dawley.
28. This public-by-default setting added nothing to the product's functionality and/or a user's ability to access content. Instead, this product feature increased user engagement during onboarding (when a user first starts using Instagram) by increasing user connections. However, Meta also knew that harmful and/or undesirable, even dangerous, contacts could be made through this public setting feature, particularly for minor users.
29. During the last five years, Instagram has added features and promoted the use of short videos and temporary posts. The former are referred to as "Reels," while the latter are referred to as "Stories."
30. Based on individualized data collected from their users' social media habits, Instagram independently selects content for its users and notifies them of such content through text and email. Instagram's notifications to individual users are specifically designed to and do prompt them to open Instagram and view the content selected by Instagram, which increases the time users spend engaged with the product.
23
2 platform amongst teenagers and young adults in the United States with over 57 million users
3 below the age of eighteen, meaning that 72 percent of America’s youth use Instagram.
B. Snapchat Background
32. Snapchat is a photo and short video sharing social media application that allows users to form groups and share posts or "Snaps" that disappear after being viewed by the recipients. The Snapchat product is well-known for its self-destructing content feature.
33. This self-destructing content feature is designed to appeal to minor users by evading parents' ability to monitor their children's social media activity. It also permits minor users to exchange illegal and sexually explicit images with adults and provides a false assurance that such exchanges will disappear without record.
34. Snapchat also features a series of rewards including trophies, streaks, and other signals of social recognition similar to the "likes" metrics available across other platforms. These features are designed to encourage users to share their videos and posts with the public. Snapchat designed these features to be addictive, and they are. Users also have an "explore" feed that displays content created by other users around the world. The trophies, streaks, and other signals of social recognition that users receive are content neutral; users receive the same amount of rewards regardless of the content of the posts and videos they exchange.
35. These reward features are designed to keep users engaged with Defendants' products for as long as possible each day, and have led many people, from psychologists to government officials, to compare Defendants' products to slot machines and other addictive devices.
36. Snapchat was founded in 2011 by current president and CEO Evan Spiegel and Bobby Murphy.
37. In 2014, Snapchat added "Stories" and "Chat" features that allowed users to post longer stories that could be viewed by users outside the user's friends. In 2014, Snapchat also released a feature called Snapcash that allowed users to send money to other users without regard to the age of the sender or recipient.
38. Snapchat also allows users to enable the sharing of their location, through a tool called Snap Map, which allows the user's followers (and the public, for Snaps submitted by the user) to see the user's location on a map. This feature is available to all users, including minors.
39. By 2015, Snapchat had over 75 million monthly active users and was considered to be the most popular social media application amongst American teenagers in terms of the number of its daily users.
40. Instagram, Facebook and Snapchat are products that are designed and manufactured by Meta and Snap, respectively. These products are designed to be used by minors and are actively marketed to minors across the United States, including the State of Wisconsin. Further, Defendants are aware that large numbers of children under the age of 13 use their products despite user terms or "community standards" that purport to restrict use to individuals who are 13 and older.
41. Defendants' products are actively marketed to minors across the United States. Defendants market to minors through their own marketing efforts and design. But also, Defendants work with and actively encourage advertisers to create ads targeted at and appealing to teens, and even to children under the age of 13. Defendants spend millions researching, analyzing, and experimenting with young children to find ways to make their products more appealing and addictive to these age groups, as these age groups are seen as the key to Defendants' long-term profitability and market dominance.
42. Defendants are aware that large numbers of children under the age of 18 use their products without parental consent. They design their products in a manner that allows and/or does not prevent such use to increase user engagement and, thereby, their own profits.
43. Defendants are likewise aware that large numbers of children under the age of 13 use their products despite user terms or "community standards" that purport to restrict use to individuals who are 13 and older. They have designed their products in a manner that allows and/or does not prevent such use to increase user engagement and, thereby, their own profits.
44. Defendants advertise their products as "free," because they do not charge their users for downloading or using their products. What many users do not know is that, in fact, Defendants make a profit by finding unique and increasingly dangerous ways to capture user attention and target advertisements to their users. Defendants receive money from advertisers who pay a premium to target advertisements to specific demographic groups of users in the applications including, and specifically, users in Wisconsin under the age of 18.
45. Defendants generate revenue based upon the total time spent on the application, which directly correlates with the number of advertisements that can be shown to each user.
46. Snap derives substantial revenue from users who consume Snapchat in excessive and dangerous ways. Snap knows or should know that its design has created extreme and addictive behaviors by its largely teenage and young-adult users. Indeed, Snap knowingly or purposefully designed its products to encourage such behaviors.
47. All the achievements and trophies in Snapchat are hidden from users until they are unlocked. The Company has stated that "[y]ou don't even know about the achievement until you unlock it." This is a form of intermittent variable reward, and behavioral research has shown that variable reinforcement provides the most reliable tool to maintain a desired behavior over time.
48. This design is akin to a slot machine but marketed toward teenage users, who are even more susceptible than gambling addicts to the variable reward and reminder system designed by Snap. The system is designed to reward increasingly extreme behavior because users are not actually aware of what stunt will unlock the next award.
49. Instagram and Facebook, like Snapchat, are designed around a series of design features that do not add to the communication utility of the application, but instead seek to exploit users' susceptibility to persuasive design and the unlimited accumulation of unpredictable and uncertain rewards, including "likes" and "followers." In the hands of children, this design is unreasonably dangerous to the mental well-being of underage users' developing minds.
50. Defendants employ teams of psychologists and engineers to help make their products maximally addicting. For example, Instagram's "pull to refresh" is based on how slot machines operate. It creates an endless feed, designed to manipulate brain chemistry and prevent natural end points that would otherwise encourage users to disengage.
51. Defendants do not disclose these design strategies. To the contrary, Defendants actively try to conceal the dangerous and addictive nature of their products, lulling users and parents into a false sense of security. This includes consistently playing down their products' negative effects on teens in public statements and advertising, making false or materially misleading statements concerning product safety, and refusing to make their internal research public.
52. For example, in or around July 2018, Meta told BBC News that "at no stage does wanting something to be addictive factor into" its product design process. Similarly, Meta told U.S. Senators in November of 2020 that "We certainly do not want our products to be addictive." Yet, Meta product managers and designers attend, and even present at, an annual conference held in Silicon Valley called the Habit Summit, the primary purpose of which is to learn how to make technology products habit-forming.
53. Defendants engineer their products to keep users, and particularly young users, engaged longer and coming back for more. This is referred to as "engineered addiction," and examples include features like bottomless scrolling, tagging, notifications, and live stories.
54. Internal Meta documents identify a potential reduction in usage by minor users as an "existential threat" to Meta's business, and Defendants spend billions of dollars per year marketing their products to minors. Defendants have deliberately traded in user harm to protect and grow their profits.
55. Defendants have intentionally designed their products to maximize users' screen time, using complex algorithms designed to exploit human psychology and driven by some of the most sophisticated data analytics available.
56. Defendants designed and have progressively modified their products to promote problematic and excessive use that they know is indicative of addictive and self-destructive use.
57. One of these features—present in both Snapchat and Instagram—is the use of complex algorithms to select and promote content that is provided to users in an unlimited and never-ending "feed." Defendants are well aware that algorithm-controlled feeds promote unlimited "scrolling"—a type of use that studies have identified as detrimental to users' mental health; however, this type of use allows Defendants to display more advertisements and obtain more revenue.
58. Defendants' algorithms promote the content most likely to increase user engagement, which often means content that Defendants know to be harmful to their users. This is content that users might otherwise never see but for Defendants' promotion of it.
59. The addictive nature of Defendants' products and the complex and psychologically manipulative design of their algorithms are unknown to ordinary users.
60. Defendants obscure the true nature of their business by marketing a "free" social media platform, burying advertisements in personalized content, and making public statements about the safety of their products that simply are not true.
61. Defendants also have developed unique product features designed to limit, and have in other ways limited, parents' ability to monitor and prevent problematic use by their children.
62. Defendants' algorithms are designed to be content neutral. They adapt to the social media activity of individual users to promote whatever content will trigger a particular user's interest and maximize their screen time.
63. In other words, Defendants do not set out to promote particular types of content on their social media platforms. If User One is triggered by elephants and User Two is triggered by moonbeams, Defendants' algorithm design will promote elephant content to User One and moonbeam content to User Two.
64. Defendants' above-described algorithms are solely quantitative devices and make no qualitative distinctions between the harmfulness or harmlessness of the content they promote.
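The content-neutral, engagement-maximizing ranking alleged in the preceding paragraphs can be illustrated with a short Python sketch. This is a hypothetical toy, not Defendants' actual ranking code, which is proprietary and far more complex; the data, topics, and function names are invented for illustration.

```python
from collections import Counter

# Hypothetical sketch of a purely quantitative, engagement-based ranker.
def rank_feed(candidates: list[dict], engagement_history: Counter) -> list[dict]:
    """Order candidate posts solely by the user's past engagement with each topic.

    The ranker promotes whatever triggered this user before -- "elephants"
    for User One, "moonbeams" for User Two -- and makes no qualitative
    distinction between innocuous and harmful topics.
    """
    return sorted(candidates,
                  key=lambda post: engagement_history[post["topic"]],
                  reverse=True)

# Example: a user whose history tilts toward one topic is shown more of it,
# whether that topic is harmless or harmful.
posts = [{"topic": "healthy recipes"}, {"topic": "extreme dieting"}]
history = Counter({"extreme dieting": 40, "healthy recipes": 3})
print(rank_feed(posts, history))  # "extreme dieting" ranks first
```

Nothing in such a scorer inspects what the content is; it optimizes a single engagement number, which is the design choice the complaint characterizes as "solely quantitative."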
65. During adolescence, the brain undergoes myelination of its neural pathways and the pruning of unused synapses, leading to more efficient neural connections. Through myelination and pruning, the brain's frontal lobes change to help the brain work faster and more efficiently, improving the "executive" functions of the frontal lobes, including impulse control and risk evaluation. This shift in the brain's composition continues throughout adolescence and into early adulthood.
66. The frontal lobes are among the last brain regions to mature, particularly those involving the brain's executive functions and the coordinated activity of regions involved in emotion and cognition. As such, the part of the brain that is critical for control of impulses and emotions, and for mature, considered decision-making, is still developing during adolescence, consistent with the demonstrated behavioral and psychosocial immaturity of juveniles.
67. The algorithms in Defendants' social media products exploit minor users' diminished decision-making capacity, impulse control, and psychological resiliency caused by users' incomplete brain development. Defendants know, or in the exercise of reasonable care should know, that because their minor users' frontal lobes are not fully developed, such users are much more likely to sustain serious physical and psychological harm through their social media use than adult users. Nevertheless, Defendants have failed to design their products with any protections to account for and ameliorate the psychosocial immaturity of their minor users.
73. Defendants' product features are designed to be and are addictive and harmful in themselves, without regard to any content that may exist on Defendants' platforms, for example the variable reward and endless feed features described above.
74. Defendants have designed other product features for the purpose of encouraging and assisting children in evasion of parental oversight, protection, and consent, which features include disappearing messages and the ability to maintain multiple or secret accounts.
75. Defendants know that their products push minor users toward ever more extreme and harmful content. For example, during an October 2021 Senate Hearing, members of the U.S. Congress made statements based on tens of thousands of Meta documents provided to them by a whistleblower, among which were several statements that support Plaintiff's allegations that Defendants select and promote harmful content, including:
a. Defendants approve of ads that contain harmful content, for example, ads "designed . . . ." Again, Defendants specifically select and push this harmful content, which drives "teens into darker and darker places." (Senator Blumenthal, October 5, 2021).
b. "[E]ngagement-based ranking . . . can lead children from very innocuous topics like healthy recipes . . . all the way from just something innocent like healthy recipes to anorexia promoting content over a very short period of time." Defendants have knowledge that their products and the content they are encouraging and helping to distribute are harmful to minor users.
76. Defendants have information and knowledge from which they can determine with reasonable certainty each user's age, habits, and other personal information, regardless of what information the user provides upon registration.
77. In short, none of Plaintiff's claims rely on treating Defendants as the publisher or speaker of any third party's words or content. Plaintiff's claims seek to hold Defendants accountable for their own allegedly wrongful acts and omissions, not for the speech of others or for Defendants' role as a publisher of that speech.
78. Plaintiff is not alleging that Defendants are liable for what third parties said, but for Defendants' own conduct.
79. None of Plaintiff's Claims for Relief set forth herein treat Defendants as a speaker or publisher of content posted by third parties. Rather, Plaintiff seeks to hold Defendants liable for their own speech and their own silence in failing to warn of foreseeable dangers arising from anticipated use of their products. Defendants could manifestly fulfill their legal duty to design reasonably safe social media products and furnish adequate warnings of foreseeable dangers arising out of the use of their products without altering, deleting, or modifying any third-party content.
V. PLAINTIFF-SPECIFIC ALLEGATIONS
80. Christopher James ("CJ") Dawley was born on February 13, 1997, in Burlington, Wisconsin. CJ was a senior at Central High School in Paddock Lake. CJ worked as a busboy at Texas Roadhouse in Kenosha. He was enrolled in Advanced Placement and Honors courses.
81. CJ was a member of St. Alphonsus Catholic Church in New Munster. He was active at Central High School and loved golfing, volleyball, and woods and metal class, and was enrolled in Advanced Placement and Honors classes. He formerly played baseball with the Lakeland Little League and played quarterback for the Bulldogs football team. His greatest loves were his car, his tractor, and fixing and taking things apart. He enjoyed boating and riding ATVs.
82. While still a minor, CJ signed up for Defendants' social media products. Because he was a minor at the time, CJ lacked contractual capacity to bind himself and his Estate to the terms of any User Agreements he may have clicked when signing up for Defendants' social media products. Neither CJ nor his Estate is therefore subject to any arbitration, forum selection, choice of law, or class action waiver set forth in said User Agreements.
83. After he acquired Defendants' social media products, CJ spent progressively more time communicating on social media through his smartphone and laptop computer. By 2014, CJ had developed an addiction to Defendants' social media products; he never left his smartphone, and everything he did was absorbed by his phone. Through Facebook, Snapchat, and Instagram, he became increasingly preoccupied with his body image and would communicate through social media at all hours of the night, resulting in sleep deprivation.
84. CJ never showed outward signs of depression or mental injury but became addicted to Defendants' social media products, progressively sleep deprived, and increasingly obsessed with his body image.
85. On January 4, 2015, while his family was cleaning up Christmas decorations and dismantling their Christmas tree, CJ retreated into his room. He sent a text message to his best friend reading "God's speed" and posted the message "Who turned out the light?" on his Facebook page. CJ held a .22-caliber rifle in one hand and his smartphone in the other, and shot himself to death. Nobody heard the shot, and his parents assumed that CJ was sleeping. Five hours later, CJ's family found him.
86. CJ hand-wrote the following message to his family on an envelope:

I don't want you to think this is at all your fault. It's not. I'm f****d up. You showed me love and family. I wish I didn't have to do this to you guys. I love you all more than the world. It's hard to be a person right now. And I wish I believed in God. If God does exist, he will have to beg for my forgiveness. There are a lot of things you don't know about me. What goes on inside my head scares me. I tried to be a good person. It's just as I am yelling in a dark tunnel running after the light so I can be happy. But my legs are tired and what's a man to do when the lights go out. Tell my friends thank you for the friendship and support and I love them with all my being. I tried.
87. CJ's death by suicide was the proximate result of the unreasonably dangerous Instagram, Snapchat, and Facebook products he used. As set forth in detail below, these products were not reasonably safe due to their defective design and inadequate warnings.
88. Defendants have long known these facts and have "sought to stonewall and block this information [information about the dangerousness of their products, especially to young users] from becoming public." Defendants have concealed information in their possession from the public, the US government, and foreign governments, including information relating to the safety of children and the role of their algorithms in causing addiction and other harms.
89. Defendants made false statements to the press and public, designed to cover up the inherent dangers of their products and, even when asked direct questions as to how those products "impact the health and safety of our children, they choose to mislead and misdirect."
90. Plaintiff did not discover, and in the exercise of reasonable diligence could not have discovered, that CJ's death by suicide was caused by Defendants' unreasonably dangerous products until shortly before filing this action.
VI. PLAINTIFFS' CLAIMS
COUNT I: STRICT LIABILITY (DESIGN DEFECT)
91. Plaintiffs reallege each and every allegation contained in paragraphs 1 through 90 above, as if fully set forth herein.
92. Defendant Meta's product is defective because the foreseeable risks of harm posed by the product's design could have been reduced or avoided by the adoption of a reasonable alternative design by Meta, and the omission of the alternative design renders the product not reasonably safe. This defective condition rendered the product unreasonably dangerous to persons or property, existed at the time the product left Meta's control, and reached the user or consumer without substantial change in its condition, and this defective condition was a cause of the injuries and death complained of herein.
93. Defendant Snap's product is defective because the foreseeable risks of harm posed by the product's design could have been reduced or avoided by the adoption of a reasonable alternative design by Snap, and the omission of the alternative design renders the product not reasonably safe. This defective condition rendered the product unreasonably dangerous to persons or property, existed at the time the product left Snap's control, and reached the user or consumer without substantial change in its condition, and this defective condition was a cause of the injuries and death complained of herein.
94. Defendants designed, manufactured, marketed, and sold social media products that were unreasonably dangerous because they were designed to be addictive to the minor users to whom Defendants actively marketed them, and because the foreseeable use of Defendants' products causes mental and physical harm to those minor users.
95. Defendants' products include numerous design characteristics that are not necessary for the utility provided to the user but are unreasonably dangerous and implemented by Defendants solely to increase the profits they derive from each additional user and the length of time they can keep each user dependent on their product.
96. As designed, the Snapchat, Instagram and Facebook algorithms are not reasonably safe because they affirmatively direct minor users to harmful and exploitative content while failing to deploy feasible safeguards to protect vulnerable teens from such harmful exposures. It is feasible to design an algorithm that substantially distinguishes between harmful and innocuous content and protects minor users from being exposed to harmful content, without altering, modifying, or deleting any third-party content posted on Defendants' social media products. The benefit would be high in terms of reducing the quantum of mental and physical harm sustained by minor users of Defendants' products.
97. Snap and Meta also engage in conduct, outside of the algorithms themselves, which is designed to promote harmful and exploitative content as a means of increasing their revenue from advertisements. This includes but is not limited to efforts to encourage advertisers to design ads that appeal to minors, including children under the age of 13, and product design features intended to attract and engage minor users in these virtual spaces where harmful ad content is then pushed to those users in a manner intended to increase user engagement, thereby increasing Defendants' advertising revenues.
98. Reasonable users (and their parents) would not expect that Defendants would knowingly expose them to such harmful content and/or that Defendants' products would direct them to harmful content at all, much less in the manipulative and coercive manner that they do. Defendants have knowingly used, and continue to knowingly use, their algorithms on users in a manner designed to affirmatively change user behavior, which methods are particularly effective on (and harmful to) minor users.
99. As designed, Defendants' products are not reasonably safe because they do not provide for adequate age verification by requiring users to document and verify their age and identity.
100. Adults frequently set up user accounts on Defendants' social media products posing as minors to groom unsuspecting minors into exchanging sexually explicit content and images.
101. Defendants know or should know that prurient adults set up fraudulent accounts on Defendants' social media products and pose as minors for these purposes.
102. Likewise, minor users who are under the age of 13 and/or whose parents have taken affirmative steps to keep them away from Defendants' products often open multiple accounts, such that Defendants know or have reason to know that the user is underage and/or does not have parental permission to use their product. Defendants already have the information and means they need to ascertain with reasonable certainty each user's actual age and, at least in some cases, Defendants utilize these tools to investigate, assess, and report on percentages and totals of underage users for internal assessment purposes. They then simply choose to do nothing about it.
103. Moreover, reasonably accurate age and identity verification is not only feasible but readily available.
104. The cost of incorporating age and identity verification into Defendants' products would be negligible, whereas the benefit of age and identity verification would be a substantial reduction in severe mental health harms, sexual exploitation, and abuse among minor users of Defendants' products.
105. Defendants' products are also defective for lack of parental controls, permission, and monitoring capabilities.
106. Defendants' products are designed with specific product features intended to prevent and/or interfere with parents' reasonable and lawful exercise of parental control, permission, and monitoring capability available on many other devices and applications.
108. Ad content pushed to new teenage users, including CJ Dawley, because of their age and vulnerability, purposefully steers those users toward content Defendants know to be harmful.
109. Defendants' products are not reasonably safe because they do not protect minor users from sexually explicit content and images or report sex offenders to law enforcement or to the parents of minor users.
110. Parents do not expect that their children will use Defendants' products to exchange sexually explicit content and images, and minor users do not expect that prurient adults pose as minors for malign purposes or that the exchange of such content will be deleterious to their personal health and wellbeing.
111. Minor users of Defendants' products lack the cognitive ability and life experience to identify on-line grooming behaviors by prurient adults, and the psychosocial maturity to decline such advances.
112. Defendants' products are not reasonably safe because they allow minor children to use "public" profiles, in many cases default "public" profiles, that can be mass messaged by anonymous and semi-anonymous adult users for the purposes of sexual exploitation and grooming, including the sending of encrypted, disappearing messages.
113. As designed, Defendants' social media products are addictive to minor users as follows: When minors use design features such as "likes," their brains release dopamine, which creates short-term euphoria. However, as soon as dopamine is released, minor users' brains adapt by reducing or "downregulating" the number of dopamine receptors that are stimulated, and their euphoria is countered by dejection. In normal stimulatory environments, this dejection abates and neutrality is restored. However, Defendants' algorithms are designed to exploit users' natural tendency to counteract dejection by going back to the source of pleasure for another dose of euphoria. As this pattern continues over a period of months and the neurological baseline needed to trigger minor users' dopamine responses increases, they continue to use Instagram, not for enjoyment, but simply to feel normal. Once they stop using Instagram, minor users experience the universal symptoms of withdrawal from any addictive substance, including anxiety, irritability, and dysphoria.
114. Social media addiction is not yet included in the American Psychiatric Association's 2013 Diagnostic and Statistical Manual of Mental Disorders (DSM-5), which is the standard classification of mental disorders. However, excessive gaming has been recognized as a mental health disorder by the World Health Organization and the International Classification of Diseases (ICD-11), and the symptoms of social media addiction mirror those of gaming addiction.
115. The diagnostic symptoms of social media addiction among minors are the same as those of other recognized behavioral addictions, including:
a. Withdrawal symptoms when social media is taken away (such as sadness, anxiety, and irritability).
b. Tolerance, the need to spend more time using social media to satisfy the urge.
f. Deceiving family members or others about the amount of time spent on social media.
g. The use of social media to relieve negative moods, such as guilt or hopelessness; and
h. Jeopardizing school performance or relationships because of social media usage.
116. Defendants' advertising profits are directly tied to the amount of time that their users spend online, and their algorithms and other product features are designed to maximize the time users spend using the product by directing them to content that is progressively more and more stimulative. Defendants enhance advertising revenue by maximizing users' time online through a product design that addicts them to the platform. However, reasonable minor users and their parents do not expect that on-line social media platforms are psychologically and neurologically addictive.
117. It is feasible for Defendants to design products that give parents meaningful control over minor users' access, including by limiting the frequency and duration of access and suspending service during sleeping hours.
118. A product design that limits the frequency and duration of minor users' access and suspends service during sleeping hours could be accomplished at negligible cost, whereas the benefit of minor users maintaining healthy sleep patterns would be a significant reduction in depression, attempted and completed suicide, and other forms of self-harm among this vulnerable age cohort.
121. It is reasonable for parents to expect that platforms such as Instagram, which actively promote their services to minors, will undertake reasonable efforts to identify users suffering from mental injury, self-harm, or sexual abuse and implement technological safeguards to notify parents by text, email, or other reasonable means that their child is in danger.
122. As a proximate result of the defective design of Defendants' products, Decedent CJ Dawley suffered severe mental harm leading to his suicide on January 4, 2015. Plaintiff did not know, and in the exercise of reasonable diligence could not have known, the cause of his injuries.
123. As a proximate result of the defective design of Defendants' products, Plaintiff DONNA DAWLEY, her husband and their surviving children have suffered the loss of CJ's society and companionship, emotional distress, and other damages.
124. Defendants are further liable to Plaintiffs for punitive damages based upon the willful and wanton design of their products that were intentionally marketed and sold to underage users, whom they knew would be seriously harmed through their use of Instagram, Snapchat, and Facebook.
COUNT II: STRICT LIABILITY (FAILURE TO WARN)
125. Plaintiffs reallege each and every allegation contained in paragraphs 1 through 124 above, as if fully set forth herein.
126. Defendant Meta's product is defective because the foreseeable risks of harm posed by the product could have been reduced or avoided by the provision of reasonable instructions or warnings by the manufacturer, and the omission of the instructions or warnings renders the product not reasonably safe. This defective condition rendered the product unreasonably dangerous to persons or property, existed at the time the product left defendant's control, and reached the user or consumer without substantial change in its condition.
127. Defendant Snap's product is defective because the foreseeable risks of harm posed by the product could have been reduced or avoided by the provision of reasonable instructions or warnings by the manufacturer, and the omission of the instructions or warnings renders the product not reasonably safe. This defective condition rendered the product unreasonably dangerous to persons or property, existed at the time the product left defendant's control, and reached the user or consumer without substantial change in its condition.
128. Defendants' products are unreasonably dangerous and defective because they contain no warning to users or parents regarding the addictive design and effects of Snapchat, Instagram, and Facebook.
129. Defendants' social media products rely on highly complex and proprietary algorithms that are both undisclosed and unfathomable to ordinary consumers, who do not expect such products to be addictive or dangerous.
130. The magnitude of harm from addiction to Defendants' products is horrific, ranging from simple diversion from academic, athletic, and face-to-face socialization to sleep loss, severe depression, self-harm, and death by suicide.
131. The harms resulting from minors' addictive use of social media platforms have not only been well-documented in the professional and scientific literature; Meta had actual knowledge of such harms. On information and belief, Snap also has conducted internal studies identifying similar harms to its minor users.
132. Defendants' products are unreasonably dangerous because they lack any warnings that foreseeable product use can disrupt healthy sleep patterns, or specific warnings to parents when their child's product usage exceeds healthy levels or occurs during sleep hours. Excessive screen time is harmful to adolescents' mental health, sleep patterns, and emotional well-being. Reasonable and responsible parents are not able to accurately monitor their child's screen time because most adolescents own or can obtain access to mobile devices and engage in social media use outside their parents' presence.
133. It is feasible for Defendants' products to report the frequency and duration of their minor users' screen time to their parents, without disclosing the content of communications, at negligible cost; parents equipped to track the frequency, time, and duration of their minor child's social media use are better situated to identify and address problems arising from such use.
134. Defendants knew about these harms, knew that users and parents would not be able to safely use their products without warnings, and failed to provide warnings that were adequate to make the products reasonably safe during ordinary and foreseeable use by children.
135. As a proximate result of Defendants' failure to warn, Decedent CJ Dawley suffered severe mental harm, leading to physical injury and death, from his use of Snapchat, Instagram and Facebook.
136. As a proximate result of Defendants' failure to warn, Plaintiff DONNA DAWLEY, her husband and their surviving children suffered loss of consortium, emotional distress, and past and future pain and suffering.
137. Defendants are further liable to Plaintiffs for punitive damages based upon their willful and wanton failure to warn of known dangers of their products that were intentionally marketed and sold to teenage users, whom they knew would be seriously harmed through their use of Snapchat, Instagram, and Facebook.
COUNT III: NEGLIGENCE
138. Plaintiffs reallege each and every allegation contained in paragraphs 1 through 137 above, as if fully set forth herein.
139. At all relevant times, Defendants had a duty to exercise reasonable care and caution for the safety of individuals using their products, such as Decedent CJ Dawley.
140. Defendants owe a heightened duty of care to minor users of their social media products because adolescents' brains are not fully developed, which results in a diminished capacity to make good decisions regarding their social media usage, eschew self-destructive behaviors, and overcome emotional and psychological harm from negative and destructive social media encounters.
141. Having marketed and sold their products to minor users in the State of Wisconsin, Defendants owed a duty to exercise ordinary care in the manufacture, marketing, and sale of their products, including a duty to warn minor users and their parents of hazards that Defendants knew to be present, but not obvious, to underage users and their parents.
142. As business owners, Defendants owe their users, who visit Defendants' social media platforms and from whom Defendants derive billions of dollars per year in advertising revenue, a duty of ordinary care substantially similar to that owed by physical business owners to their business invitees.
143. Defendants breached their duties and failed to exercise ordinary care and caution for the safety of underage users, like CJ Dawley, who used their products.
144. Defendants were negligent in failing to conduct adequate testing and in failing to allow independent academic researchers to adequately study the effects of their products and levels of problematic use amongst teenage users. Defendants have extensive internal research indicating that their products are harmful, cause extensive mental harm, and that minor users are engaging in problematic and addictive use that their parents are helpless to monitor and prevent.
145. Defendants were negligent in failing to provide adequate warnings about the dangers associated with the use of social media products and in failing to advise users and their parents about how and when to safely use their social media platforms and features.
146. Defendants were negligent in failing to fully assess, investigate, and restrict the use of Snapchat, Instagram and Facebook by adults to sexually solicit, abuse, manipulate, and exploit minor users.
147. Defendants were negligent in failing to provide users and parents the tools to ensure their social media products were used in a limited and safe manner by underage users.
148. As a proximate result of Defendants' negligence, Decedent CJ Dawley suffered severe mental harm, leading to death by suicide, from his use of Snapchat, Instagram and Facebook.
149. As a proximate result of Defendants' negligence, Plaintiff DONNA DAWLEY, her husband and surviving children have suffered loss of consortium, emotional distress, and pain and suffering.
150. Defendants are further liable to Plaintiffs for punitive damages based upon their willful and wanton conduct toward underage users, including CJ Dawley, whom they knew would be seriously harmed through the use of their social media products.
COUNT IV: VIOLATION OF WIS. STAT. § 100.18
151. Plaintiffs reallege each and every allegation contained in paragraphs 1 through 150 above, as if fully set forth herein.
152. Meta actively represents to the public that its social media products are not addictive and are safe for use by minors, with the intent to induce use of its products by minors. This representation is untrue, deceptive, or misleading and caused the plaintiff a pecuniary loss.
153. Snap actively represents to the public that its social media products are not addictive and are safe for use by minors, with the intent to induce use of its products by minors. This representation was untrue, deceptive, or misleading and caused the plaintiff a pecuniary loss.
WHEREFORE, Plaintiff prays for judgment against Defendants for monetary damages as follows:
1. Past physical and mental pain and suffering of CJ Dawley, in an amount to be more fully proven at trial;
3. Plaintiffs' pecuniary loss and loss of Christopher Dawley's services, comfort, care, and society;
5. Punitive damages; and
6. The reasonable costs and attorney and expert/consultant fees incurred in prosecuting this action.