I just finished reading "Midnight in Chernobyl", a chilling and magnificent book on the world's worst nuclear disaster. But as much as it is a book about Chernobyl, it is above all a book on behavioural design. It offers a fascinating peek into how totalitarian regimes like the Soviet Union profoundly shape the behaviour of their constituents.

The leadership of the Soviet Union created a magnificent illusion that the ultimate workers' paradise of True Communism was on the horizon, if only everyone stuck to the plans of the Party. The best way to understand the behaviour of the people inside the Party is to look at it through the lens of a sadistic game: the higher you climbed in the ranks, the more effort you had to put into defending your position. Conversely, the most dreadful thing that could happen to you was to fall out of the Party's grace.

The disastrous behaviour that triggered Chernobyl


It's therefore no surprise that this Communist game design ensured that only ruthless, sociopathic men made it to the top. It also created a culture of lies, secrecy, and false reporting, for the simple reason that no one in the communist hierarchy wanted – or could afford – to hear bad news.

Why no one wanted to hear bad news had to do with the two obsessions of the former Soviet Union: ludicrous targets and the competition with the West. Goals that were not going to be met triggered fury and outrage at the top, and creative corner-cutting at the bottom of the pyramid. The Soviet leadership always had to prove that it could pull off the impossible, in order to maintain the illusion that it was far more capable than those bloody Western capitalists. Failing to meet the impossible was the equivalent of treason.

Being loyal beats being right

In a culture like this, rational decision-making becomes impossible. Every fact gets interpreted through the same obsessive lens: "Could this fact hurt my reputation?" or "Could the Party lose face if this fact were true?" You don't want this decision-making process when you're building a nuclear reactor, operating one, or trying to contain it after an explosion. Reactors were built with shoddy materials and prototype tests were skipped, because deadlines were sacred. Known critical flaws in the reactor's design were brushed aside, because acknowledging them could hurt the reputation of the Soviet nuclear community, and safety protocols were overruled, because deadlines had to be met. And at every stage there was a bullying chief who made sure people didn't speak up.

Adam Higginbotham writes:

“The accident and the government’s inability to protect the population from its consequences finally shattered the illusion that the USSR was a global superpower armed with technology that led the world. And, as the state’s attempts to conceal the truth of what had happened came to light, even the most faithful citizens of the Soviet Union faced the realization that their leaders were corrupt and that the Communist dream was a sham.” (p. 276)

Societies shape behaviour

Midnight in Chernobyl is a fascinating story of how the behaviour of an entire nation is shaped by the invisible rules of the game. Everyone is trapped inside those rules, not least those at the top.

When you approach it from this angle, the book pairs surprisingly well with Life Inc. In that book, Douglas Rushkoff argues that we in the West have entirely internalized corporate thinking: we've become obsessed with success, we've started to see everything as an asset, and we divide the world into winners and losers.

If you want to bring two books on holiday that will teach you a lot without being boring business books, these are the ones.

Update February 17th 2020:

It's also quite interesting to observe a similar pattern playing out in China in the context of the coronavirus outbreak. The Chinese government is trying to contain the virus, but is now experiencing unpleasant surprises due to overly optimistic reporting by local governments. People simply didn't want to report bad news, because it could hamper their career prospects. This is a perfect illustration of Goodhart's Law:

“When a measure becomes a target, it ceases to be a good measure.”

In other words: When your promotion depends on the things you measure, you start cooking the numbers.

Want to read more?

More blog posts on the design of citizen behaviour.

