Mark Klein's Room 641A: NSA Whistleblower Story
How AT&T technician Mark Klein blew the whistle on the NSA's secret surveillance facility Room 641A, and why his story matters today for privacy and democracy.
You've likely heard of Edward Snowden, but before him there was Mark Klein—an AT&T technician who, in 2006, walked into the Electronic Frontier Foundation's office with documents exposing a secret NSA surveillance facility inside his company's San Francisco building. Known as Room 641A, this site was a key node in the agency's warrantless wiretapping program. A book excerpt recently shared on Hacker News retells Klein's story, and it's generating intense discussion.
What's the story?
The excerpt, from Cohn's book The Return of the Moguls, recounts how Klein contacted the EFF after discovering that AT&T had installed a fiber-optic splitter that copied internet traffic—emails, web searches, everything—and routed it to a locked room controlled by the NSA. Klein had helped install related equipment years earlier, and his internal memos and diagrams became the foundation of the landmark lawsuit Hepting v. AT&T. That case helped drag the NSA's mass surveillance program into public view, though Congress granted the telecoms retroactive immunity in 2008 and many technical details remain classified.
The excerpt ends on a cliffhanger, as one HN commenter noted: "Beware, this is a book excerpt rather than a standalone blog post, so it ends on a cliffhanger. Still a fun read." The full book promises a deeper dive into Klein's decade-long struggle.
Why it's blowing up on HN
The HN crowd is reacting strongly because this story exposes the myth of a clear wall between foreign and domestic surveillance. One commenter, a former government employee, wrote:
I was aware of this rule at the time (early 90's), and was very surprised to find that it had been routinely violated for at least a decade.
This sentiment—that the system was broken long before 9/11—resonates. Another commenter framed it broadly:
Instances like this is a powerful statement that truly free and democratic governance is not sustainable in the long run with technological advancements. We are basically trading marginal comforts from new technology in the short run for political freedom in the long run and the latency is decreasing.
The thread also praises Klein's integrity: "A true American hero who never tried to turn his whistle-blowing into becoming a celebrity," one wrote. But others cautioned: "If the documents are classified... I would never hand them over," reflecting the tension between legal obedience and moral duty.
The builder's take
Klein's story is a sobering reminder that mass surveillance isn't theoretical—it has been operational for decades inside the very companies we trust with our data. AT&T willingly gave the NSA unfettered access, proving that telecom incentives aren't aligned with user privacy. The government's secrecy only compounds the problem; even when Klein acted legally, he was treated as a threat.
The chilling reality is how little has changed since 2004. Data collection is more pervasive than ever—through cloud providers, social media, and IoT devices—and legal frameworks like Section 702 of FISA still authorize massive warrantless surveillance. The difference is that surveillance is often commercial now, but the result is the same: our digital lives are mined by both corporations and governments.
The HN comment about trading comfort for freedom is spot-on. Every time we accept a free service that monetizes our data, we erode privacy at scale. But the blame isn't just on users—it's on builders who design systems without privacy as a first principle.
What this means for builders
If you're building technology today, consider three concrete implications from Klein's story:
1. Know the legal landscape. Laws like the Patriot Act and the FISA Amendments Act can compel companies to cooperate with surveillance. If you handle communications data, have a clear plan for responding to government requests, and consider warrant canaries or tamper-evident designs that make covert compelled access detectable.
2. Design for end-to-end encryption. Had AT&T's traffic been strongly encrypted, the NSA splitter would have captured only ciphertext. Default to encryption in transit and at rest, ideally with client-side key management. Signal's protocol is a strong model.
3. Embrace transparency. Klein only discovered surveillance because he had access to internal systems. Make your infrastructure auditable, publish transparency reports, and adopt open-source principles. If users can't verify what you're doing with their data, you invite suspicion.
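A warrant canary (point 1) only works if someone actually checks it. As a minimal sketch—the canary wording, the "As of" date format, and the 30-day window are all assumptions, and real canaries should also be cryptographically signed—a freshness check might look like:

```python
from datetime import date, timedelta

# Hypothetical canary text: a dated statement a service republishes
# on a schedule. Silence (a stale date) is the signal.
CANARY = """\
As of 2024-06-01, Example Corp has received
no national security letters and no FISA court orders.
"""

def canary_is_fresh(canary: str, today: date, max_age_days: int = 30) -> bool:
    """Return True if the canary's 'As of' date is within max_age_days."""
    for line in canary.splitlines():
        if line.startswith("As of "):
            # ISO date occupies the 10 characters after the prefix
            stated = date.fromisoformat(line[6:16])
            return (today - stated) <= timedelta(days=max_age_days)
    return False  # no dated statement found: treat the canary as tripped

print(canary_is_fresh(CANARY, date(2024, 6, 20)))  # → True (19 days old)
print(canary_is_fresh(CANARY, date(2024, 8, 1)))   # → False (stale)
```

The key design choice is that absence of a fresh statement, not presence of a warning, carries the signal—gag orders can forbid speaking, but generally cannot compel a false statement.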
Here's a trivial example of enforcing encryption in transit:
```nginx
# Enforce HTTPS with modern protocols and strong ciphers
server {
    listen 443 ssl http2;

    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    # TLS 1.2+ only; ssl_ciphers applies to TLS 1.2 and below
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_ciphers   HIGH:!aNULL:!MD5;
}
```
But encryption alone isn't enough. Architect your systems so there are no plaintext keys to hand over under subpoena—that's why services like ProtonMail and Tresorit emphasize zero-access architecture.
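The zero-access idea reduces to one rule: derive the content key on the client from something only the user knows, so the server stores only salt and ciphertext. A minimal key-derivation sketch (the function name and parameters are illustrative; pair this with a real AEAD cipher for the actual encryption):

```python
import hashlib
import secrets

def derive_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Client-side: derive a 32-byte content key from the user's passphrase.
    The server never sees the passphrase or this key, only ciphertext."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = secrets.token_bytes(16)  # random, stored alongside the ciphertext
k1 = derive_key("correct horse battery staple", salt)
k2 = derive_key("correct horse battery staple", salt)
k3 = derive_key("wrong passphrase", salt)

print(k1 == k2)  # → True: same passphrase reproduces the same key
print(k1 == k3)  # → False: without the passphrase, the key is unrecoverable
```

Because the server holds only the salt and ciphertext, a subpoena can compel it to hand over data it is mathematically unable to decrypt—the property Klein's story shows you should want.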
Should you care?
If you build software that handles user communications, data storage, or core internet infrastructure—yes, care deeply. Klein's story proves surveillance can be invisible, embedded in the network itself. For the average citizen, support strong privacy products and push for legal reform. But if you think surveillance only affects "criminals" or you have nothing to hide, consider the indiscriminate scope of data collection. The threat to democracy is real. You don't need to become a whistleblower, but you do need to care about the systems you build.