I’m deep in the writing process for my new book, Shift: The Playbook for Event-Driven Architecture Advocacy.
An important part of the book is psychological safety. Why? Because I’ve seen it time and again: most big tech transformations don’t fail on the tech, they fail on the people.
To make this point crystal clear, I needed a powerful, real-world story. So I hit up my friend, Fran Arismendi.
Fran’s a world-class Chilean psychologist who, in a cool turn of events, is now my neighbor here in Badajoz, Spain. I asked him to write something showing why this stuff is non-negotiable.
He sent back the text below. It uses the Chernobyl disaster to show how a lack of psychological safety isn’t just a “team-building” issue—it’s a catastrophe-level failure. It’s a perfect excerpt of what the book is all about.
Read this.
The nature of a company’s tasks should be the major factor guiding its processes. Whether the task is an online sale or the successful launch of a rocket with people inside, a mature (not merely old) company culture approaches the recognition, design, and implementation of its processes professionally and rationally. As processes grow in complexity, however, companies do not always mature with them. For that reason, providing psychological safety is a precondition for fostering an environment of professional development.
The key lies in how explicit and intentional the goal of creating a culture of psychological safety needs to be, so that questioning and reporting become integrated as daily practice. To illustrate this, we can look to other domains where errors in technical, event-driven processes can literally end in a “post-mortem” analysis.
The story of the Chernobyl disaster begins on April 26, 1986, at Reactor No. 4 of the nuclear power plant near Pripyat, in the Ukrainian SSR. Known for its horrific scale, it started with a routine but very risky experiment: a test during a planned shutdown to see whether the reactor’s still-spinning turbines could power internal operations while the diesel generators came online.
In a catastrophic procedural failure, operators shut down critical safety systems. They lost control of the reactor, causing a massive steam explosion and a nuclear meltdown. The reactor’s own design flaws made the disaster worse. The resulting radiation cloud spread across Northern Europe. The explosion and acute radiation exposure killed roughly 31 people in the immediate aftermath, and the disaster would indirectly kill tens of thousands more. The USSR responded with secrecy and denial: the government wanted to preserve national pride and hide any flaws in Soviet engineering. This secrecy slowed the emergency response and made the disaster’s effects worse. In the end, officials blamed the catastrophe solely on “human error,” focusing only on the operators’ mistakes rather than the deeper system failures. Three men were imprisoned for their roles: Viktor Bryukhanov, Nikolai Fomin, and Anatoly Dyatlov.
The catastrophic meltdown was not just a technical failure. It was a crisis caused by a complete lack of psychological safety. The organization’s culture was sick: it silenced concerns and made a healthy workplace impossible. Truth-telling was sacrificed for party and profit. This created a deep organizational discontentment, a feeling of unease employees lived with every day, in which ethics and excellence were blocked by the pursuit of gain and a false sense of camaraderie.
This organizational discontentment was rooted in a workplace defined by tension, stress, and fear. The wider Soviet culture demanded absolute compliance, fostering a powerful fear of authority, and the state prioritized political mandates over technical expertise. In such an environment, where people feel their security is threatened, the mind focuses entirely on survival, leaving no mental space for thriving, creating, or working to one’s full ethical and professional potential.
The Chernobyl disaster underscored a critical failure: operators identified safety violations but were powerless to address them. This case highlights how a culture that values secrecy and punitive measures over transparency and learning is intrinsically vulnerable to systemic breakdown. Such conditions cultivate an organizational discontentment, rooted in the deprivation of fundamental psychological requirements—namely, the need for safety, significance, and a sense of belonging.
The specific dynamics that drive a culture can be identified and used as predictors. Adlerian Theory and Transactional Analysis offer effective lenses through which to observe these signs. From a general perspective, the Soviet system provided a textbook example of a pathological organization in which fundamental needs were systematically frustrated. The unsafe working environment bred fear, triggering counterproductive fight, flight, or freeze responses and preventing optimal working conditions. Workers were denied dignity and respect, confirming that their work life was deeply unsatisfactory. When the authorities blamed the accident solely on “human error,” they inflicted a deep psychological loss on the workers: it suggested that the only value derived from their job was a paycheck, robbing them of their sense of contribution. The highly hierarchical decision-making process at Chernobyl was an authoritarian model in which orders flowed only from above. Such a system encourages vertical striving, a mode of goal-striving in which individuals strive against others, treating colleagues as “ladder rungs” to step on in the climb toward greater power. This desire to dominate stems from a lack of belonging or perceived importance. The greater power distance suffocated the horizontal effort (cooperation towards a common, collective goal) that was necessary for security.
Given the fear of retaliation, the predictable coping mechanism was avoidance. Individuals in an organization where the priority was to avoid stress, rejection, or humiliation resorted to defensive mechanisms such as hedging, qualifying, and softening their language about concerns and mistakes, ensuring that vital safety information was suppressed.
Transactional Analysis, on the other hand, reveals how the political and corporate response functioned as a destructive game, one designed to provide a psychological payoff that absolved management of guilt.
The post-mortem response, which valued punishment over learning, set up a classic “blame culture” game. The government, acting from the Parent ego state, focused on individual operators’ “violations” rather than on objective, systemic analysis (the Adult ego state). The jailing of the plant managers delivered the psychological payoff of a classic Transactional Analysis game such as “Now I’ve Got You.” The authorities framed the disaster as a profound “provocation,” permitting them to unleash suppressed anger and validate a core belief in the untrustworthiness of others. This pattern is further evidenced by their use of a trivial slip, the test failure, as a pretext for rage, a distinctive maneuver in such blame-shifting games. The official declaration of “human error” allowed the authorities to claim vindication and justification for their rage against the operators. This punishment, however, did not reduce mistakes; it achieved the opposite, forcing problems to be suppressed until they were too big to hide. The lack of psychological safety ensured that people facing interpersonal threat would continue to use “defensive routines” that thwarted organizational learning.

Ultimately, the Chernobyl disaster demonstrated that technical solutions alone were inadequate for prevention, compelling a profound cultural and operational shift in nuclear safety. In response, the industry adapted frameworks from aviation, which had long emphasized the critical role of human factors.
Thus, the primary conclusion of multiple studies, including analyses of the Chernobyl disaster, has been that technical improvements alone do not reduce accidents or operational errors. These are often caused by human factors: poor communication, deficient team interactions, and rigid hierarchies that inhibit the expression of concerns and the reporting of issues that could threaten the status quo of the system.
In response to this, a “safety culture” was established in the nuclear industry, adopting aviation practices such as Crew Resource Management.
This methodology, known as CRM, or as Human and Organisational Performance (HOP) training when adapted for non-aeronautical teams, is designed to create psychologically safe environments, promote effective communication, and reduce power gradients in order to prevent fatal consequences. At the same time, the International Reporting System (IRS) was created to break with the culture of secrecy and punishment and replace it with a mechanism for collective learning. Integrating these systems remains a challenge, however, as seen in the IRS limitation that allows only “authorized” users to issue reports, demonstrating the difficulty of completely overcoming the culture of opacity that catalyzed the Chernobyl disaster.
Pretty powerful, right?
This is exactly the kind of thinking I’m packing into Shift. The book isn’t just about schemas, brokers, and protocols. It’s about the hardest part: the human-side strategies, the advocacy, and the cultural changes you need to actually make an Event-Driven Architecture work in the real world.
If this story made you think, the rest of the book will give you the playbook.
It’s on pre-sale right now. I’d be super grateful if you checked it out and grabbed your copy.