Posts Tagged ‘Risk management’

Acknowledging risks…ooh scary!

May 15, 2013
"John T. Raymond as the insurance agent i...

Random dude…I’ll bet you started to create a story behind this guy. (Photo credit: Wikipedia)

We are all irrational to some extent; that truth is undeniable. Fear affects our behavior whether we like to admit it or not. How we deal with that fear, however, differs widely among organizations, and it is shaped largely by their leadership.

For a more thorough treatise on this topic, check out the McKinsey article on managing the people side of risk here.

I’ll take the lazy way out and add a couple of random thoughts and experiences of my own. I have worked closely with two different organizations that are in similar situations but have wildly different standard operating procedures when it comes to risk. If risks were openly discussed at Organization A, the conversation would end in one of the following scenarios:

  1. The risk would be greatly exaggerated, and there would be a mad scramble to ‘fix’ the issue without regard to costs or resources. Blame would be assigned in a structured, post-event analysis. 
  2. The risk would be quickly dismissed as not valid if it was judged to be something requiring a top-level, systematic fix to which there was no readily apparent solution. Business would carry on as usual until the problem came to a head. There are no worse headaches than those caused by cognitive dissonance.
  3. Whoever raised the risk would be blamed for not having fixed it already as it fell under their area of responsibility. It would then be classified into either scenario 1 or 2.

In Organization B, there is a more rational approach to risk. Risks and assumptions are brought to the forefront. Sometimes those risks are acknowledged and accepted. Other times, an immediate fix is agreed upon. In still other instances, the current way of doing business is accepted because the cost of change would be too great, with a view to adapting in future decisions.

The important thing to remember is that in business, there is no such thing as a free lunch. Decisions that impact profitability are complex and do not involve easy solutions. The test? If they were easy, someone would have already figured out the way forward.

So what can we do? The first step is developing the mental discipline to overcome the gut reaction to hand out blame. Where issues are complex, our minds tend to distill the world’s randomness by creating stories, often assigning malevolent motives to people who, in fact, had none. Understanding this tendency provides some perspective, so that next time, risks can be openly discussed.

The next step is to choose which risks are truly the most important, and be relentless in finding and implementing the answer. If we swing wildly from one worry to the next, based on the randomness of one person’s perspective, we’ll be stuck in an eternal loop. However, if we pool our mental resources, talented people working together can do extraordinary things.


Possibly the best blog post I have read to date…

June 14, 2011

And no, this is not a link to an earlier TPS report post. Click here to read about what makes a successful organization. Point by point, John Persico challenges conventional wisdom and inspires us to cast a critical eye on how we manage. I know I took a hard look at myself after this one. Congratulations, John, you have earned yourself a place on the coveted right panel of the It’s more fun than a TPS report blog. Unless I forget.

The article can best be summarized by its five pillars of success for organizations: dissent, openness to criticism, mutant employees, examining the unknowable and unthinkable, and looking for our blind spots. All of these are extremely important, but the two I would highlight here are examining the unknowable and unthinkable and actively looking for our blind spots. How often, when challenged, are people quick to gloss over a potential weakness by saying ‘no, we’ve covered that because of x,’ at which point the debate ceases? Everything Persico mentions fits in very nicely with the human proclivity for status quo bias.

He takes on a number of things that are considered standard practice. For example, he challenges, albeit indirectly, traditional job interviews and performance assessment/management. Plus, he unleashes my personal favorite quote:

“Do you surround yourself with people who like and agree with you? How open are you to criticism and argument? Do you always get the final say? Can you change your mind? When was the last time you gave in to someone else and changed a decision you had previously made? Are you always right?” (my emphasis).

I think this also fits nicely into the debate over professional certifications, including this post by T. Cummins (along with a comment from yours truly). People can become very complacent, and very happy that they know a lot of stuff. However, business conditions are always changing, and what you knew last year will only help you a little this year. More important is the fluid intelligence I mentioned earlier, which allows you to solve new problems as they arise. Check back over the next few days, when all of this will be tied to other thoughts on leadership.

A good example of supply chain risk management

June 9, 2011

I know, I know…a bit of a yawner for some members of TPS planet. Other than the use of the word “robust” (which middle-brows have beaten to a pulp and left completely devoid of meaning), this case study in supply chain risk management is a worthwhile read. When we facilitate workshops, this material does not represent new, breakthrough-type knowledge. It is, however, a good reminder that business is not normally complicated; for example, simply getting the supply chain to a point where everything runs smoothly and risks are minimal is a great place to start.

Larry Page’s bias toward action

January 24, 2011

Risk is normally thought to stem from taking action before thinking something through. If only we would always run lots of information through the rational (as opposed to emotional) parts of our brain, we’d always make better decisions, right? Surprisingly, no.


Suit Larry? Nah...I'm good.

When there is no way to know what the outcome of a decision will be, the rational side of our brain will want to wait, delay, and seek new information. The problem is that our brains will absorb and process only information that does not conflict with previously formed thoughts and opinions, and the new info wouldn’t hold any predictive power even if that weren’t the case. The decision is made…we just don’t know it yet!

There is another side of risk, one that Google CEO Larry Page understands very well…the risk of standing still. We applaud your efforts, Larry, especially given the field in which you operate.

Mental biases post-Deepwater

January 6, 2011

I don’t know about you, but when I hear the words “long-string well design” as it relates to building a massive pipeline for oil harvested from below the ocean floor to be pumped up to a big tank on the surface, my first thought is “what could possibly go wrong?”

It’s official, folks. In the wake of the disastrous oil spill, during which President Obama spent significant time publicly calling out BP and former CEO Tony Hayward for poor risk management, a presidential commission has found that BP was guilty of risky behavior that contributed to the disaster. The consistency of it all is overwhelming.

I won’t delve into the political angle of the disaster, as I’ve already taken one shower today; however, this extremely unfortunate event does expose the mental biases that allow the reaction to a disaster to become so predictable…and unhelpful.

Let’s start with the motives of the presidential commission. You are heading up the investigation into the oil spill. Your boss has been publicly critical of BP for its role in the disaster. What would you guess is the probability of you saying, “you know, after careful consideration, all the decisions that BP took, at the time they were taken and with the information available at the time, were actually sound, and most of them were taken in isolation from the others since it was not apparent that any of them would affect the others”? I’ll venture a guess – less than 0%.

So, predictably, the commission went looking for causes and, sure enough, found them. This is just a guess, but if you ran some of these causes by some of the most knowledgeable, technically oriented engineers responsible for the oil rig, they would come up with very compelling reasons why the causes given in the report were absolute nonsense and had nothing to do with the real problem – which will most likely remain a mystery forever. Our brains need to connect events through a cause-and-effect relationship even where none may exist.

Second, history is created by a series of individual decisions made by individuals doing what they believe is best for them. There was no over-arching strategy to ignore risk, but rather a series of individual decisions that unknowingly added up to an increased risk of disaster. Risk managers are put in the position of either telling a company that what it is doing is very risky – something not easily mitigated by a small investment – or justifying the current position by telling senior execs that they have thought of a risk but have it covered. The latter is what the execs, deep down, want to hear. A risk manager who says something like “we are exposed to massive risk,” unless they are new to the job, will face the question: “Well, why have you only thought of this now? What have you been doing for the past x months/years/decades?”

Or, in a more extreme case, they may be in the position of saying, “what we are doing is inherently very risky – the only way to mitigate it is to not perform this activity.” Those at the top, including Hayward, could hardly announce one day that they would stop drilling for oil in the ocean – just imagine the reaction of the shareholders! There is an illusion that those at the top are in control of historical events. In reality, they are swept along by a powerful wave consisting of all the individual decisions, emotions, and motives of those doing the work, fighting the war, or writing presidential commission reports.

The themes (not the details) of the presidential commission report were decided well before the report was even begun – they were settled as soon as a hurricane of causes combined to send the first drop of oil into the Gulf of Mexico.

We are not without hope, however. We can be forward-looking to a degree when performing risk management. That topic has been addressed in previous posts, and will continue to be a theme of this blog.

