Big Tech Companies Are Intentionally Making Products Addictive
According to a Pew Research Center survey conducted in February 2021, 84% of American adults aged 18 to 29 use some form of social media. Across all U.S. adults, approximately 71% reported using social media, with the 65-and-older group pulling the average down at a usage rate of just 45%.
Meta Lawsuit
April 16, 2024 – Courts dismiss Mark Zuckerberg from personal liability in the Meta addiction lawsuits. Although Zuckerberg was warned of the harm Facebook and Instagram cause to kids, the courts found he was not personally required to disclose this information to users. The plaintiffs are still pursuing other avenues to hold him liable for his failure to warn about the effects these products have on children.
TikTok Lawsuit For Teenage Harm
The harm caused to teenagers and children by TikTok’s product design and algorithm has been linked to several deaths in the United States, and the app has been connected to a range of dangerous mental and physical health effects. Parents and young people harmed by TikTok have legal options to hold the platform accountable.
TikTok is one of the most widely used social media platforms among children and teenagers in the United States. Unfortunately, the platform is uniquely dangerous to young people: its use has been linked to several physical and mental health harms, including depression, eating disorders, loneliness, and even suicide and death.
TikTok’s failure to protect its young users from the harms associated with using the platform has led to several lawsuits against the company.
Snapchat Lawsuit
The social media platform Snapchat is wildly popular with youth, but its addictive algorithm and inherently harmful features have led to serious mental health problems and suicide among children and teens. Parents have filed Snapchat lawsuits accusing the company of failing to warn users about the dangers of its product, defective design, and gender discrimination, among other complaints.
Discord Lawsuit
According to Discord’s own transparency report, during the second quarter of 2022, 532,498 accounts were disabled for child safety violations, with 497,267 of those accounts involving sexualized content depicting minors. An additional 106,645 accounts were disabled for exploitative and unsolicited content. Beyond sexually abusive and exploitative content, young users of Discord may also be exposed to content depicting self-harm, harassment, and graphic violence. Discord routinely removes these accounts, but the number of accounts removed in just one three-month period reached the hundreds of thousands.