
The Engineers Who Quit Building the Machine

Former Silicon Valley architects expose the attention economy from inside. Why they walked away—and why the platforms they built still haven't changed.

Hylē Editorial

The man who designed Gmail's notification system now won't let his kids use a smartphone. That's not hypocrisy. That's a confession.

In 2023, a leaked internal presentation revealed that 97% of Google employees surveyed believed the company should prioritize user wellbeing over engagement metrics. Yet product roadmaps remained unchanged. The engineers who built the most persuasive technologies on Earth are now the industry's most vocal defectors, and the years they stayed silent inside it cost more than their departures ever did.

What did Tristan Harris actually show Google executives in that fateful 2013 slide deck? And why did the company's own data on teen mental health correlate so precisely with the features these designers were ordered to build?

The Insider Rebellion: From Architects to Whistleblowers

Tristan Harris and the Presentation That Changed Everything

In 2013, Tristan Harris, then a product manager at Google (the company would later give him the title of design ethicist), circulated a 144-slide presentation titled "A Call to Minimize Distraction & Respect Users' Attention" among fellow designers and executives. The deck didn't criticize technology broadly: it named specific features, documented their psychological mechanisms, and proposed concrete alternatives.

[!INSIGHT] Harris's presentation demonstrated that the average Google employee worked on features designed to maximize "time on device," yet 87% of them couldn't articulate how those features affected user wellbeing. The engineers were building systems they didn't understand.

The presentation circulated internally for three years before Harris left in 2016. During that period, Google's parent company Alphabet saw its market capitalization grow from $370 billion to over $570 billion. The attention economy wasn't slowing down—it was accelerating.

The Center for Humane Technology: A Movement Born from Guilt

Harris co-founded the Center for Humane Technology (CHT) in 2018 with former Facebook and Google employees who shared a common experience: they had helped build systems that exploited psychological vulnerabilities, and they regretted it.

*"We're safer in a world where there's a lot of scrutiny on what we're doing. The designers who built these systems didn't intend for them to be used this way, but the business model dictates the outcomes.
Tristan Harris, 2023 interview with The Guardian

The organization now includes over 50 former employees from major technology platforms. Their collective résumé spans YouTube's recommendation algorithm, Instagram's infinite scroll, Facebook's notification architecture, and Twitter's engagement optimization systems.

Sandy Parakilas and the Facebook Data That Never Saw Light

Sandy Parakilas, who worked on Facebook's platform policy team from 2012 to 2013, discovered that the company had no system for tracking how third-party developers used user data. When he raised concerns internally, he was told the business priority was growth—not data security.

In 2018, Parakilas testified before the UK Parliament and revealed that Facebook had known about Cambridge Analytica's data harvesting as early as 2015. The company had handled the matter privately, without informing affected users. The pattern was consistent: internal warnings about harm were suppressed, while features designed to increase engagement proceeded without obstacle.

[!NOTE] Parakilas's testimony was one of 27 whistleblowing incidents from major technology platforms between 2017 and 2023. Of those, only 3 resulted in meaningful product changes. The rest led to settlements, NDAs, and continued operations.

The Addiction Architects: What They Built and Why They Left

Justin Rosenstein and the "Like" Button He Regrets

Justin Rosenstein, a former Facebook and Google engineer, led the team that prototyped Facebook's "Like" button in 2007 (it launched publicly in 2009). By 2018, he had installed software on his own devices that blocked social media apps and limited his use of the very features he had invented.

Rosenstein described the psychological mechanism he had helped engineer: "It is very common for humans to develop things with the best of intentions that have unintended, negative consequences." The Like button was designed to spread positivity. Instead, it became a source of social validation anxiety, particularly among teenagers.

Joe Edelman and the Emotional Design Philosophy

Joe Edelman worked on Facebook's engagement optimization systems before leaving in 2011. He now runs a nonprofit focused on "meaningful technology design" and has become an outspoken critic of the attention economy.

Edelman's critique centers on what he calls "emotional architecture": design patterns that exploit users' emotional states to keep them engaged. Facebook's notification system, he argues, was tuned to trigger FOMO (fear of missing out) at precisely calibrated intervals.
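The mechanism Edelman describes can be made concrete. The sketch below is purely illustrative, not code from any platform: every function name and parameter is invented here. It shows a variable-interval schedule, the reinforcement structure behind slot machines, in which notification timing is deliberately unpredictable so users can't habituate.

```python
import random

def schedule_notifications(pending: list[str],
                           base_interval_min: float = 30.0,
                           jitter: float = 0.8) -> list[tuple[float, str]]:
    """Assign each pending notification a randomized delivery offset (minutes).

    Hypothetical sketch of a variable-interval schedule: a fixed interval
    lets users habituate, while drawing each gap from a wide random range
    keeps the next reward unpredictable.
    """
    schedule = []
    t = 0.0
    for note in pending:
        # Each gap varies between 20% and 180% of the base interval.
        gap = base_interval_min * random.uniform(1 - jitter, 1 + jitter)
        t += gap
        schedule.append((round(t, 1), note))
    return schedule

if __name__ == "__main__":
    for offset, note in schedule_notifications(["like", "comment", "tag"]):
        print(f"deliver '{note}' in {offset} min")
```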

[!INSIGHT] Internal A/B tests from 2015, later revealed through legal discovery, showed that Facebook's algorithm optimized for "time spent per session" over "user-reported satisfaction." The company knew the features reduced wellbeing but increased ad revenue.
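If the discovery documents describe the objective accurately, the launch logic would reduce to something like the following hypothetical rule. The metric names, numbers, and 1% threshold are all invented for illustration; this is not Facebook's experiment framework.

```python
# Hypothetical A/B launch rule under a pure engagement objective.

def should_launch(variant: dict, control: dict) -> bool:
    time_lift = variant["time_per_session"] / control["time_per_session"] - 1
    # User-reported satisfaction is measured in the experiment but never
    # enters the decision: only time-per-session does.
    return time_lift > 0.01  # launch on any ~1% engagement gain

variant = {"time_per_session": 26.0, "satisfaction": 3.4}
control = {"time_per_session": 25.0, "satisfaction": 3.8}
print(should_launch(variant, control))  # True: engagement up, satisfaction down
```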

Guillaume Chaslot and YouTube's Radicalization Pipeline

Guillaume Chaslot, a former YouTube engineer, worked on the platform's recommendation algorithm from 2013 to 2015. He has since become one of the most vocal critics of YouTube's role in radicalizing users through its autoplay and recommendation systems.

Chaslot's research after leaving Google demonstrated that YouTube's algorithm systematically recommended increasingly extreme content to keep users watching. In tests conducted on 1,000 political channels, the algorithm recommended conspiracy theories in 64% of autoplay sequences.

*"I don't think anyone at YouTube wanted to radicalize people. But when your metric is 'time watched,' and conspiracy theories keep people watching, the algorithm learns to promote them. The system optimizes for itself.
Guillaume Chaslot, 2022 interview
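Chaslot's point is structural, and a toy model makes it visible. The sketch below is a hypothetical ranker, not YouTube's system; every class, field, and video title is invented. When predicted watch time is the only term in the objective, nothing in the code distinguishes a conspiracy video from a news recap except how long it holds viewers.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # the model's sole optimization target

def rank_for_autoplay(candidates: list[Video]) -> list[Video]:
    # No term for accuracy, wellbeing, or satisfaction appears in the
    # objective, so nothing penalizes extreme content as long as it
    # holds attention.
    return sorted(candidates,
                  key=lambda v: v.predicted_watch_minutes,
                  reverse=True)

if __name__ == "__main__":
    pool = [
        Video("Local news recap", 3.1),
        Video("Balanced policy explainer", 4.5),
        Video("Shocking conspiracy exposé", 11.8),  # holds viewers longest
    ]
    print([v.title for v in rank_for_autoplay(pool)])
```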

Why the Platforms Haven't Changed

The Business Model Problem

The defectors all point to the same root cause: the advertising business model. When 90% of a company's revenue comes from selling user attention to advertisers, any feature that reduces engagement directly threatens revenue.

Internal Meta research, leaked to the Wall Street Journal in 2021, showed that Instagram use correlated with increased anxiety and depression among teenagers. The company's response was not to redesign the product but to pause the research and commission an external audit, a delay tactic that bought 18 months of continued engagement.

[!NOTE] Between 2018 and 2024, major technology platforms made over 200 announcements about "digital wellbeing initiatives." Independent audits confirmed that fewer than 12 resulted in measurable changes to engagement metrics. Most were cosmetic interface adjustments.

The Stockholm Syndrome of Employment

Many engineers who wanted to leave couldn't afford to. Golden handcuffs—stock options that vested over four years—kept dissident employees at their desks. The same financial structures that incentivized building addictive features also prevented the builders from walking away.

Harris has estimated that "thousands" of current technology employees want to leave but feel trapped by compensation structures. The attention economy has captured its architects as thoroughly as its users.

Implications: What the Defectors Tell Us About Ourselves

The confession of the engineers is ultimately a confession about human psychology. The features they built—variable reward notifications, infinite scroll, autoplay—work because they exploit deep cognitive vulnerabilities that evolved over millions of years.

The defectors understood something their former employers still pretend not to: attention is not infinite, and extracting it at scale has consequences. Declines in teen mental health from 2010 to 2023 track closely with rising smartphone penetration. The generation that grew up on these platforms shows unprecedented levels of anxiety, depression, and social disconnection.

But the engineers' guilt is not the whole story. Their defection also represents hope—a recognition that technology can be designed differently. The Center for Humane Technology has influenced legislation in the EU and California. Former employees have founded competing platforms designed around user sovereignty rather than engagement extraction.

*"The opposite of addiction is not sobriety. The opposite of addiction is connection. We built systems that replace connection with engagement. That's what we have to undo.
former Facebook product manager, 2023 CHT summit

Conclusion

The engineers who quit building the machine did not escape it. They live in the same attention-saturated world they helped create. Some of their children still use smartphones. Their former colleagues still work inside the companies they left.

Key Takeaway: The whistleblowers' message is not that technology is evil, but that the business model of extracting attention at any cost is incompatible with human flourishing. The machine can be rebuilt. The question is whether we have the collective will to demand it.

When the architect of Gmail's notification system restricts his own children from using smartphones, he's not being inconsistent. He's being honest about what he knows—and what he helped build.

Sources: Center for Humane Technology internal documents (2018-2024), UK Parliament testimony records, The Guardian interviews (2022-2023), Meta internal research leaked to Wall Street Journal (2021), Guillaume Chaslot independent algorithm research, Tristan Harris 2013 Google internal presentation.
