A newly released video has opened a private window into TikTok employees' worries, and the footage is now a crucial part of a major court case. It shows company staff arguing that the app's powerful recommendation system could be putting young users' mental health at real risk, directly contradicting TikTok's public claims that the app is a safe environment for minors.
The clip highlights the widening gap between what companies know about user safety and what they choose to tell the public. The employees' remarks reveal unease about how the app is designed and the consequences that may follow.
A North Carolina lawsuit sits at the center of the case, and a key court decision has framed the current debate. Following the video's release, a North Carolina court ordered TikTok to produce additional internal documents and communications the company had previously tried to keep hidden, reasoning that deeper discovery is necessary to assess the potential harm the app could be causing.
A North Carolina Superior Court judge handed TikTok a major setback this week by ordering the release of a damaging internal video along with the state's complaint. The judge also rejected the app's request to throw out the case, keeping the lawsuit alive. Judge Adam Conrad determined that any embarrassment the footage might cause TikTok employees does not outweigh the public's right to see the material.
“You Never Want to Leave”: Inside the Unsealed Video
The public now has access to a video stitched together from internal TikTok meetings. The exact dates of the sessions aren't included, but the employees speak openly about what troubles them.
Nicholas Chng, a former risk-detection team member, said, “Unfortunately, the things that are popular aren’t always the healthiest… The platform’s setup does push some of this content up, and that worries me.”
Elsewhere in the footage, Brett Peters, TikTok's head of creator advocacy and reputation, shared the company's ambition: "We want people to stay on the app longer." His words were clear: "We're here to broaden the content ecosystem so TikTok feels so varied and alive that you never want to click away." That statement echoes critics' complaints that the platform is built to keep users scrolling.
Equally revealing, Alexandra Evans, a former TikTok safety policy lead in Europe, warned that “compulsive use is hardwired into the app.” She urged colleagues to recognize the effect on basic life rhythms: “Sleep, meals, physical movement, and even eye contact all suffer.” Such comments show a growing concern that the technology may push people away from real-life moments.
A Nationwide Legal Reckoning
North Carolina isn’t the only state taking TikTok to court. Just this Tuesday, Minnesota Attorney General Keith Ellison joined the fight, pushing the total to about 24 states now suing the app. Ellison put it plainly: “This isn’t about free speech… It’s actually about deception, manipulation, misrepresentation. This is about a company knowing the dangers, and the dangerous effects of its product.” The message is clear: officials believe TikTok is hiding key dangers from parents and kids.
These state suits aren’t the only concern for TikTok. There’s a huge, ongoing federal court case—called MDL 3047—where over 620 families from across the country allege the app is causing major mental health issues for their kids. That case is gathering all the separate lawsuits into one courtroom so everybody can see the same evidence.
TikTok’s Response and Safety Features
After the Minnesota lawsuit dropped, a TikTok spokesperson insisted the complaint is “based on misleading and inaccurate claims.” The rep went on to list the “robust safety measures” TikTok says it has put in place. So what are those measures? The company says it has rolled out more than 50 tools and settings just for teen accounts. Some of the biggest ones are screen time limits, content filters to hide risky videos, and a family pairing feature that lets parents set controls from their own phones.
Whether these tools are enough, and whether they actually protect kids from the most serious harms alleged in the lawsuits, is what the courts will now examine closely.
TikTok has repeatedly insisted its app is safe for kids, which makes the unsealed footage of its own employees questioning that idea hit much harder. The clip shows that while the company publicly promotes the app as a safe environment, some TikTok employees privately questioned the ethics of their day-to-day work.
The Cost in Sleep: What TikTok Employees Worry About at 3 A.M.
When you watch the new footage, you see real worry. Instead of reciting company talking points, TikTok employees speak about their daily fears and the unsettled conscience behind their screens.
Take Ashlen Sepulveda, who used to review videos for rule violations. What ruins her sleep, she said, is knowing that the algorithm makes decisions for kids based in part on what it thinks they like. She goes on to describe a path that starts with a search for mental-health tips. Add a conversation about fitness, and soon a kid could arrive at a full "soft disordered eating" pipeline, a tunnel so hard to leave that it fills her with dread.
Inside the company, employees are pulled between the duty to keep engagement climbing and the images of kids struggling after the app is done with them. That tension shows a firm caught in the coils of an algorithm it built but cannot fully control.
The Road Ahead: Bans, Sales, and Accountability
This newly released TikTok video could change everything. The app faces a possible U.S. ban just weeks away unless ByteDance, its parent company, agrees to sell its stake by September 17. As a result, the clock is ticking.
The tape isn’t just leaked footage; it captures TikTok employees expressing alarm about the company’s algorithm and its push to lock in young users. Officials are already listening. North Carolina Attorney General Jeff Jackson pointed to the video in court, arguing, “The clips back up our claim that platforms are trading kids’ safety for profits.” The TikTok employees’ frank admissions now sit at the center of that claim.
Conclusion: A Question of Responsibility
This video is more than court evidence; it's a confession delivered from inside the code. The employees' uneasy candor reminds us that the dangers of hyper-addictive design are recognized by the design teams themselves, not only by outside critics. The footage shines a light on a decision point: either change the model or accept responsibility for its effects.
Source: https://edition.cnn.com/2025/08/20/tech/tiktok-north-carolina-lawsuit-unsealed-employee-video