Award-winning research reveals why smart failures are your fastest path to innovation. Discover the three failure archetypes.
Hylē Editorial
In 1999, a Harvard Business School professor published the research that would eventually earn her the top spot in the Thinkers50 ranking of management thinkers and reshape how Fortune 500 companies approach innovation. The concept was "psychological safety," and Amy Edmondson's new book reveals a startling truth: most organizations are not only failing to learn from failure—they're actively punishing the wrong kind of mistakes while rewarding the behaviors that lead to catastrophe. In her 2023 book Right Kind of Wrong, Edmondson draws on decades of research showing that psychologically safe teams consistently outperform their peers on complex tasks, even as most employees report fearing the consequences of admitting mistakes at work.
But here's the problem that keeps executives awake at night: when everyone is terrified of failing, organizations stop innovating entirely. They play it safe, iterate incrementally, and watch disruptors eat their market share. The paradox at the heart of Edmondson's work—and what makes Right Kind of Wrong genuinely brain-rewiring—is that the solution isn't eliminating failure. It's getting better at failing.
What exactly distinguishes a "brilliant failure" from a preventable disaster? The answer lies in a framework that challenges everything we've been taught about accountability.
The Failure Taxonomy: Three Categories That Change Everything
Edmondson's most valuable contribution in Right Kind of Wrong is her failure typology—a classification system that finally brings clarity to the messy world of organizational mistakes. After two decades studying teams across industries, from NASA to Novartis, she identified three distinct types of failure:
Basic Failures are the unforced errors. A surgeon leaves a sponge in a patient. An accountant transposes numbers in a critical report. A programmer deploys untested code to production. These failures happen in familiar territory where we should know better. They're preventable, and preventing them is a matter of process discipline, checklists, and appropriate vigilance. Hospital research on surgical and procedural checklists has shown that these simple tools sharply reduce basic failures—but only when combined with the psychological safety that encourages nurses to speak up when they see potential problems.
Complex Failures occur when multiple factors align in unprecedented ways. A supply chain disruption coincides with a key supplier bankruptcy and a sudden demand spike. A combination of unusual weather conditions creates an infrastructure failure no one predicted. These failures are less about individual culpability and more about system fragility. They're not entirely preventable—complexity guarantees surprise—but their frequency can be reduced through resilience engineering and redundancy.
[!INSIGHT] The key insight: Both basic and complex failures should be minimized. But the third category—intelligent failures—should be maximized.
Intelligent Failures are the gold Edmondson wants us to mine. These are failures that generate new knowledge, happen in new territory where no playbook exists, are as small as possible while still being meaningful, and inform future attempts. They're the inevitable cost of innovation.
The Science of Psychological Safety
Edmondson's original research on psychological safety, first published in her seminal 1999 paper on work teams and later corroborated by Google's Project Aristotle study, revealed something counterintuitive: the highest-performing teams weren't the ones making the fewest mistakes. They were the ones reporting the most mistakes.
“In a psychologically safe team, no one is punished or humiliated for raising ideas, questions, concerns, or mistakes. This creates an environment where people feel comfortable taking interpersonal risks.”
— Amy Edmondson, The Fearless Organization
In her early hospital research, Edmondson found that teams with high psychological safety reported markedly more errors than low-safety teams. At first glance, this looked like a problem. But upon closer inspection, the actual error rates were similar; the difference was in reporting. Safe teams acknowledged and learned from failures. Unsafe teams hid them, ensuring the same mistakes repeated indefinitely.
The implications are profound. When leaders crack down on all failures uniformly, they inadvertently create an environment where intelligent failures—exactly the kind that drive breakthrough innovation—become too risky to attempt. Employees retreat into defensive behavior, doing only what's explicitly required and nothing more.
[!NOTE] Edmondson's early study of hospital nursing teams found that psychological safety predicted whether nurses would speak up about medication errors, a factor with direct, sometimes life-or-death consequences for patients.
The Aviator Principle: A Case Study in Intelligent Failure
Edmondson opens Right Kind of Wrong with the story of her father, Henry Edmondson, a test pilot in the 1950s who was killed in a plane crash. It's a deeply personal framing device, but it serves a crucial analytical purpose: distinguishing between failure that teaches and failure that destroys.
Test pilots in the early jet age were engaged in systematic intelligent failure. Each test flight pushed boundaries in controlled ways, generating data that made subsequent flights safer. The death rate among test pilots was appalling by modern standards—roughly one death per week across the industry in the 1950s—but each crash contributed to knowledge that eventually made commercial aviation the safest form of transportation in human history.
The key word here is controlled. Intelligent failures are not random disasters; they're experiments designed to maximize learning while minimizing cost. Edmondson identifies four pillars of intelligent failure:
New Territory: You're venturing into unknown domain where existing knowledge doesn't apply.
Credible Opportunity: There's a reasonable chance of success and meaningful upside.
Hypothesis-Driven: The failure is framed as a test that will yield data regardless of outcome.
Limited in Scope: The experiment is small enough that failure won't be catastrophic.
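The four pillars can be read as a conjunctive checklist: a failure is "intelligent" only when all four hold. Here is a minimal Python sketch of that reading (the class, field names, and examples are our own labels, not terminology from the book):

```python
from dataclasses import dataclass

@dataclass
class FailureContext:
    """One pillar per field; names are illustrative, not Edmondson's."""
    new_territory: bool        # existing knowledge doesn't apply
    credible_opportunity: bool # reasonable chance of meaningful upside
    hypothesis_driven: bool    # framed as a test that yields data either way
    limited_in_scope: bool     # small enough that failure isn't catastrophic

def is_intelligent_failure(ctx: FailureContext) -> bool:
    """A failure qualifies as intelligent only if all four pillars hold."""
    return all([ctx.new_territory, ctx.credible_opportunity,
                ctx.hypothesis_driven, ctx.limited_in_scope])

# A small, hypothesis-driven pilot in an untested market: intelligent.
pilot = FailureContext(True, True, True, True)
print(is_intelligent_failure(pilot))      # True

# The same bet at bet-the-company scale fails the scope test.
moonshot = FailureContext(True, True, True, False)
print(is_intelligent_failure(moonshot))   # False
```

The point of the conjunction is that ambition alone doesn't redeem a failure; a bold experiment without a scope limit is just a gamble.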
Consider Pixar's development process. Each film goes through multiple iterations, with story reels screened internally every 12 weeks. Many of these iterations fail—they reveal that a plotline isn't working or a character lacks emotional resonance. But these failures are intelligent: they happen early, in controlled environments, and directly inform the next iteration. By the time a film reaches theaters, it's survived dozens of intelligent failures.
The Organizational Immune System
One of Edmondson's most provocative arguments is that organizations need failure the way bodies need exposure to pathogens. An immune system that never encounters germs becomes weak and eventually turns on the body itself—autoimmune disorders emerge when defensive systems lack appropriate targets.
Similarly, organizations that successfully eliminate all failure become brittle. Without the adaptive pressure of occasional intelligent failures, they lose the ability to respond to environmental changes. They become, in Clay Christensen's famous formulation, victims of the innovator's dilemma—excellent at executing existing models while blind to emerging threats.
[!INSIGHT] Edmondson's research suggests that the optimal failure rate for innovative organizations is not zero—it's approximately 10-15% of initiatives. Below that threshold, you're not taking enough risks. Above it, you may be undisciplined in your experimentation.
This has profound implications for performance management systems. Most corporate reward structures punish all failures equally, creating perverse incentives. An employee who attempts an ambitious project that fails intelligently may receive lower performance ratings than one who simply executes routine tasks without error. Over time, the ambitious employees either stop taking risks or leave for organizations that value intelligent failure.
Rewiring Your Relationship with Failure
The practical value of Right Kind of Wrong lies in Edmondson's framework for personal and organizational transformation. She doesn't merely diagnose the problem; she prescribes a methodology for change.
At the individual level, Edmondson recommends what she calls "failure literacy"—developing the skill to quickly categorize failures and respond appropriately. When something goes wrong, your first question should not be "who's to blame?" but "what type of failure was this?" This simple reframing shifts the conversation from punishment to learning.
At the team level, she advocates for regular "failure roundups"—dedicated time to discuss recent failures without judgment. Teams that normalize these conversations develop what Edmondson calls "failure fluency"—the ability to extract maximum learning from minimum misfortune.
At the organizational level, the key is aligning incentives with the failure taxonomy. Basic failures should be met with process improvements. Complex failures should trigger system redesign. And intelligent failures should be celebrated—even when they fail.
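The triage logic described above maps each category in the taxonomy to a distinct organizational response. A minimal sketch of that mapping (the category names follow the book; the table and function are our own illustration):

```python
# Illustrative mapping from Edmondson's failure taxonomy to the
# response the article recommends for each category.
RESPONSES = {
    "basic": "process improvement: checklists, discipline, vigilance",
    "complex": "system redesign: resilience engineering, redundancy",
    "intelligent": "celebrate it and extract the learning",
}

def respond_to_failure(failure_type: str) -> str:
    """Return the recommended response for a classified failure."""
    try:
        return RESPONSES[failure_type]
    except KeyError:
        # Unknown categories are themselves a signal to classify first.
        raise ValueError(f"Unclassified failure type: {failure_type!r}")

print(respond_to_failure("intelligent"))  # celebrate it and extract the learning
```

The design choice worth noting: classification comes before response. Rewarding or punishing a failure before categorizing it is exactly the uniform crackdown the book warns against.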
Key Takeaway
Right Kind of Wrong fundamentally restructures how we think about organizational learning. The goal isn't to eliminate failure—it's to eliminate basic failures through process discipline, reduce complex failures through system resilience, and systematically increase intelligent failures through psychological safety and structured experimentation. Edmondson's research shows that organizations that master this distinction don't just fail less often; they succeed more often, because they're running more experiments and learning faster from each one. The question isn't whether you'll fail—it's whether your failures will be the right kind.
Sources: Edmondson, A. C. (2023). Right Kind of Wrong: The Science of Failing Well. Atria Books. Additional research from Harvard Business School Working Knowledge, Thinkers50, and the Journal of Organizational Behavior.