Okay, I’ll admit it. I’m a bit biased toward crows.
In one of my previous workshops I asked people to identify their power animal. Inevitably I got tigers, lions, and bears (oh my!), along with some hawks and eagles. I have yet to have anyone respond with crow. But I’m hoping that by the end of this blog post some of you will reconsider your choices.
The crow is one of the most adaptable animals on the planet, with representative members scattered across the globe. In fact, the only places you can’t find crows are the poles and New Zealand. In other words, as soon as they develop opposable thumbs, the human species might want to watch its back.
Charles Darwin is often credited with saying:
“It’s not the smartest, nor the strongest of the species that survive, but those that are most adaptable to change.”
The crow seems to be the master of this. Before 1970, crows were found only in the rural areas of America. Now you can’t pass a McDonald's dumpster or walk through a suburban neighborhood without hearing their distinctive voices. When crows see an opportunity (like the ready supply of food humans leave behind in cities), you can bet they will capitalize.
Businesses could learn a lot from these feathered friends (calling Blockbuster and Kodak!). Being responsive and prepared for change in shifting environments may be the most valuable trait in today’s volatile markets.
So how does one (or one's organization) become adaptable?
Answering that question requires a conscious effort to look for your blind spots, something both individuals and organizations resist doing. Blind spots are kept in the dark for a reason.
When people have a vested interest in seeing a situation in a certain light, they can no longer be objective or clear about how to approach or resolve the problem.
While much of the business buzz has centered on positive psychology (strengths testing, happiness indices, and so on), all of which I think is helpful and beneficial for organizations, that doesn’t mean we can ignore the aspects of our companies and ourselves that simply feel too uncomfortable to explore. These kinds of blind spots can come back to bite us, and ultimately lead to a massive breakdown.
The Challenger explosion in 1986 is a classic, and tragic, example of just such a blind spot. The contracted engineering firm Morton Thiokol had warned NASA that its O-rings (a component critical to the integrity of the spacecraft) had not demonstrated the ability to seal properly at the temperatures the ship would face on that unusually cold launch morning. NASA, facing a giant blind spot of pride after accusations and expressed disappointment from the US government over the many delays leading up to the launch date, reacted to the engineers’ recommendation not to launch with outright hostility. The management team at Thiokol met privately for a total of five minutes the night before the scheduled launch to discuss NASA's dismay.
Three of the four executives were in favor of reversing the “no-launch” decision. The fourth resisted until he was apparently told to “remove his engineering hat, and put on his management hat.” Viewed through that lens, displeasing their customer (NASA) and risking the devastating loss of potential future contracts by refusing to launch did not look like a wise managerial decision. In other words, his blind spot fell back into the shadows, and an unethical decision was made that cost seven human lives.
NASA never bothered to ask what had prompted the full reversal. They heard the answer they needed to hear and were content not to dig deeper, even though the data demonstrating a 99% probability of failure at launch was readily available.
Here, arguably the best-trained analytical and statistical minds failed to properly evaluate their own data because it sat in a blind spot they were too fearful to assess and too eager to ignore. The result was one of the most preventable and horrific breakdowns in American history.
Luckily, most of us are not dealing with matters of life and death in our daily decisions, but the example above does raise a question: if rocket scientists fail to assess their engineering blind spots when human lives are at stake, what traits or flaws in your own character or company are you willing to overlook?
Ignoring a blind spot might simply mean missing an opportunity, like the crows’ expansion into cities, but the potential exists for a much more sinister outcome.
What blind spots are you unwilling to check?