The Same Playbook, Different Audience
A parent’s read of an investor lawsuit, and why it rhymes with everything else.
The complaint alleges that management instructed employees not to quantify or document the expected revenue impact.
That sentence isn’t from a child safety lawsuit.
It’s from an investor securities complaint. A lawsuit about what executives allegedly told Wall Street versus what was happening behind the scenes.
Different audience. Same instinct.
The Paragraph That Stopped Me
According to the complaint, management instructed employees not to quantify or document the expected revenue impact of Apple’s privacy changes, specifically to avoid creating a paper trail that could trigger disclosure obligations.
It alleges that when someone began calculating the impact, they were told to stop.
Maybe that gets fought. Maybe it gets explained. Maybe it gets denied.
But as a pattern, it matters.
Because when you are a parent trying to get answers after your child dies, everything turns on the same questions:
What was written down? What was escalated? What was measured? What was retained? What mysteriously does not exist?
Why I’m Reading This
I am not an investor. I am a father.
My son Avery died in December 2024. An impulsive decision that Snapchat made easy.
Since then, I have learned something ugly: when you want to understand a platform, you don’t start with the safety video. You start with the revenue engine. You start with what happens when that engine is threatened.
This document is a securities class action complaint. It does not litigate what happened to my child. It litigates what executives said to investors, and what the plaintiffs allege was happening internally.
But the alleged pattern is the same one parents encounter:
Reassure the public. Use careful words. Celebrate “tools.” Avoid the hard questions. And when the stakes get real, manage the paper trail.
The Origin Story
Snapchat’s early brand was built around one idea: ephemerality. Messages that disappear. Fewer receipts. Less permanence.
That wasn’t a side feature. It was the identity.
Whether the original intent was playful or predatory, the effect is the same: a platform designed for vanishing evidence.
So when an investor complaint alleges an internal instinct of “don’t document,” I don’t read it as an aberration. I read it as muscle memory.
And when I try to imagine what internal mechanics, what institutional practices, what organizational habits cause multi-week delays when law enforcement serves a warrant for the records of a drug dealer selling to children, I feel sick.
Because I know what those weeks cost.
According to law enforcement in our case, a warrant for the dealer’s Snapchat records existed roughly two months before Avery died. They waited. And waited. By the time they had what they needed, it was too late.
A company built around disappearing evidence. An investor complaint alleging an instruction not to document a looming problem. And a response time to law enforcement that, in practice, often means the evidence has already vanished and the child is already gone.
What the Complaint Alleges
The lawsuit is straightforward:
Who is suing: Investors who bought Snap stock during a specific period.
What they claim: Snap told the market it was ready for Apple’s privacy changes. The plaintiffs allege it was not—and that the company’s public statements concealed how severe the problem was.
This is a disclosure case. Not a morality play.
What did they know? When did they know it? What did they say while they knew it?
The Business Problem
Snap’s growth engine ran on direct response advertising—ads that live or die on one question: Did the user do the thing?
Install. Purchase. Subscribe.
If you can’t measure that action, advertisers cut spend. Or demand lower prices. Or leave.
Before Apple’s 2021 privacy changes, every iPhone exposed an advertising identifier called the IDFA (Identifier for Advertisers), a tag that let ad systems recognize the same device across apps and websites. The complaint describes this as the ability to track user activity “even outside of the Snapchat app” and optimize ads using real-time and historical behavior.
Speed. Certainty. That’s the engine.
Apple introduced App Tracking Transparency (ATT). Under ATT, that cross-app tracking required the user to opt in. Most users did not.
Apple offered a substitute system called SKAdNetwork (SKAN). More privacy-forward. More constrained. Less granular. Harder to use for rapid optimization.
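If you want to see why advertisers cared, here is a toy sketch of the difference. Nothing below is Apple’s or Snap’s actual code; every name and data shape is hypothetical, invented only to show the shape of the problem.

```python
# Hypothetical illustration of pre-ATT attribution vs. SKAN-style measurement.
# All identifiers, campaign names, and data shapes here are made up.

from collections import Counter

# Before ATT: ad clicks and app installs carried the same device ID (IDFA),
# so attribution was a per-device join, available almost immediately.
clicks = [
    {"idfa": "device-A", "campaign": "summer_sale"},
    {"idfa": "device-B", "campaign": "summer_sale"},
]
installs = [{"idfa": "device-A"}]

clicked = {c["idfa"]: c["campaign"] for c in clicks}
attributed = [clicked[i["idfa"]] for i in installs if i["idfa"] in clicked]
print(attributed)  # ['summer_sale'] -- exact, per-device, near real time

# After ATT, under SKAN-style measurement: no device ID, only delayed,
# campaign-level postbacks. You learn *that* a campaign drove installs,
# not *who* installed or what they did next.
skan_postbacks = ["summer_sale", "summer_sale"]  # arrives hours or days later
print(Counter(skan_postbacks))  # aggregate counts only -- no per-user optimization
```

Per-device joins in real time versus delayed aggregate counts. That, compressed, is the whole fight.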
Here is where the complaint sharpens:
The plaintiffs allege Snap repeatedly told investors that advertisers representing a “majority” of direct response revenue had “successfully implemented” the new system.
Later, the complaint points to Snap acknowledging that advertisers found SKAN unreliable and limiting—especially for real-time targeting and measurement.
Same problem. Different phrasing.
“We’re ready.” “We’ve built the tools.” “It’s working.”
And then: actually, it’s not.
Why I’m Writing This Now
This week, Snap reportedly settled a social media addiction lawsuit shortly before trial.
Around the same time, Evan Spiegel posted a LinkedIn video about new Family Center updates—more visibility into “time spent,” how teens use the app, additional parental controls.
If the allegations in this investor complaint are true, then advertisers were given a reassurance loop. A message of readiness. A message of control. A message that the tools were working.
And as a grieving parent, watching safety messaging roll out, I feel that same structure. Not the same facts. The same structure.
More settings. More dashboards. More “visibility.”
In both cases, the message is the same: “Trust us. We have tools.” And in both cases, the unanswered questions are the ones that would show whether the tools change outcomes.
The Questions That Require Numbers, Not Videos
I don’t want another safety update. I want answers that can be measured.
Aggregate numbers. No user data. Just the operational truth.
On law enforcement response: What is the median time between warrant service and data production in cases involving minors? What percentage of warrants receive a response within 72 hours? Within 30 days?
On dealer accounts: When Snapchat produces records to law enforcement identifying an account as involved in drug sales, what happens to that account? How many were still active 7 days later? 30 days?
On stranger connections: What percentage of Quick Add suggestions connect adults with minors who share zero mutual friends? What is the false-positive rate for “proactive detection” of drug content?
On evidence preservation: When a user is reported for a serious violation, how long is message content retained? If a parent requests data after a child’s death, what is actually available?
On default ephemerality: What percentage of conversations between teens and non-contacts are set to disappear by default? Who chose that default—the user, or the platform?
On the response loop: When an account is flagged by law enforcement as under active investigation, does that account retain the ability to message minors? For how long?
These are not rhetorical questions. They can be answered with data or with silence.
A LinkedIn video about Family Center features answers none of them.
The Parallel I Cannot Unsee
Advertisers wanted proof their ads worked. They needed attribution. Measurement. Reliability.
Parents want proof their children are protected. They need responsiveness. Evidence. Accountability.
In both cases, the ask is the same:
Show us the system. Show us the logs. Show us the real metrics. Show us what happens when it fails.
One group gets quarterly earnings calls with executives.
The other gets a video about parental controls.
What This Proves and What It Doesn’t
This complaint does not prove anything about my child. It does not prove anything about safety.
It shows alleged habits.
A dependence on granular tracking to make the business work. A public narrative of preparedness when problems loomed. And, most chilling, an alleged preference for less documentation when consequences threatened.
A company built around disappearing evidence. An investor complaint alleging an instruction not to document a looming problem. And a safety narrative that still avoids the questions that would show whether kids are actually protected.
I’m not saying these are the same thing legally.
I’m saying they rhyme. And the rhyme is part of a bigger verse.
The Bigger Picture
I recently testified before a state legislature about platform accountability. The hearing was about AI chatbots. My testimony was about corporate manipulation of the First Amendment.
Also testifying: representatives from FIRE, the Foundation for Individual Rights and Expression. They were there to defend AI companion chatbots—the kind that form parasocial relationships with lonely teenagers, the kind that have been linked to at least one child’s suicide.
Their argument: speech. Free expression. Constitutional protection.
I sat there watching the First Amendment get purchased in real time.
This is the system. Not one company. Not one complaint. Not one dead child.
It’s an architecture where corporations claim constitutional protections designed for citizens, where “free speech” becomes a shield for products that would be regulated if they were sold as anything other than communication, where truth gets distorted, hidden, and bought—and the health consequences land on families who never saw it coming.
The investor lawsuit alleges Snap misled shareholders about business risks.
The fentanyl lawsuits allege Snap’s design enabled dealers to kill children.
Different plaintiffs. Different courts. Same company. Same question underneath it all:
Who will America stand up for? Its investors or its children?
I’ll be following both cases to find out.
And I’m done being marketed to.

