When to Listen to a Dire Warning
Richard Clarke, former counterterrorism adviser to U.S. presidents Bill Clinton and George W. Bush, has made a career of investigating disaster warnings. The way he sees it, catastrophes can happen at any time, so why should decision makers ignore a Cassandra? Now a cybersecurity firm CEO, Clarke is an expert at figuring out who is a conspiracy theorist and who is a credible source. He explains his method through a few case studies—on the Bernie Madoff Ponzi scheme, the Fukushima nuclear plant meltdown, and others—from his new book, Warnings: Finding Cassandras to Stop Catastrophes.
SARAH GREEN CARMICHAEL: Welcome to the HBR IdeaCast, from Harvard Business Review. I’m Sarah Green Carmichael.
We’re rational creatures. We know that just because something unbelievably bad hasn’t happened to us doesn’t mean it will never happen. Practically though, we often act like everything’s going to be just fine. And we don’t like to listen to the people who warn us otherwise.
There’s this scene in The Big Short, the movie about the months leading up to the 2007 financial crisis. An investor storms into a hedge fund manager’s office to tell him he’s wrong: there is no real estate bubble.
SOUND FROM THE BIG SHORT: “So Mike Burry, a guy who gets his hair cut at Supercuts and doesn’t wear shoes knows more than Alan Greenspan and Hank Paulson?” “Dr. Mike Burry. Yes, he does.”
SARAH GREEN CARMICHAEL: Mike Burry, the hedge fund manager here, is the type of eccentric prophet of doom who fascinates Richard Clarke.
The former White House counterterrorism adviser calls these characters Cassandras. That’s from the Greek myth about the cursed princess, Cassandra. She could see disaster coming but couldn’t make anyone believe her.
Clarke should know. He tried to convince the George W. Bush administration that Osama bin Laden was the biggest threat to the United States and that it was only a matter of time before al-Qaida would attack. He was not successful. You might remember his apology in front of the 9/11 Commission.
SOUND FROM CLARKE’S 9/11 COMMISSION TESTIMONY: “To those who are watching on television, your government failed you. Those entrusted with protecting you failed you. And I failed you.”
SARAH GREEN CARMICHAEL: But Clarke doesn’t want organizations to brush off credible predictions anymore. Today he’s the CEO of a cyber-security consulting firm and the co-author of a new book called Warnings. And he joins us now to talk about what companies can do to recognize Cassandras and protect themselves from catastrophes.
Dick, thanks for talking with the HBR IdeaCast.
RICHARD CLARKE: Oh, it’s good to be with you.
SARAH GREEN CARMICHAEL: So, how do you think you would feel now—would you be even writing this book now—if the government had taken your warning on al-Qaida seriously and acted on it?
RICHARD CLARKE: You know, I think so. I’ve always been fascinated by this phenomenon. And, you know, it goes back to Churchill, who was the great Cassandra of World War II. And the interesting thing is, his colleagues at the time called him a Cassandra, and he knew he was a Cassandra. If people had paid attention to him, could we have nipped World War II in the bud by doing something in the early 1930s as Churchill suggested against the Nazis? These kinds of questions have always been running around in my head. This book is not about my personal experience; it’s about the phenomenon that I think deserves recognition.
SARAH GREEN CARMICHAEL: What’s it like to be someone in that position?
RICHARD CLARKE: It’s very frustrating, obviously, and this is why I—you know—I feel some sympathy for the 14 characters in the book because they are all experts in their field. They have data which they think is pretty convincing. They say to people, “Hey, look, I want to be wrong. I’d really like for this to not happen. Here’s my data. Tell me what’s wrong with my data.” We heard that same line from almost everybody we interviewed.
SARAH GREEN CARMICHAEL: Maybe the Bernie Madoff Ponzi scheme example is a good place to start.
RICHARD CLARKE: Bernie Madoff, as everyone now knows, was a very well-known and very well-respected figure on Wall Street, and chairman of one of the exchanges. And he ran a scheme that was making lots of people lots of money. And there was an accountant who looked at the results of these investments and said, “That can’t happen. This can’t be real.”
And so he did a very formal analysis. He took it to the Securities and Exchange Commission fraud people, and said, “Look, this is a fraud.” And they told him to go home. And he did again a year later, and he did it again a year later. And every time he was essentially given the brush-off.
His data, of course we now know, was absolutely right. But there were lots of reasons why he wasn’t listened to. One was, he was a nobody; the guy he was reporting on was a very prominent person. Two, his data was really complicated, and he was reporting to law enforcement types, lawyers, who really didn’t understand the complicated regression analysis. But I think the real issue was one of magnitude, which we find in a lot of our case studies: the magnitude of the problem was so big, if it was real, that people just didn’t think it could be real.
SARAH GREEN CARMICHAEL: So, in a way, those sorts of criteria remind me of another case study, which is the financial crisis, where the heads of those investment banks didn’t themselves understand the derivatives they were trading.
RICHARD CLARKE: Absolutely. When you’re managing a program that is so complicated you do not understand it, you run a real risk that when someone comes to you and says, “Hey, there’s a problem,” you won’t be able to tell whether they’re accurate or not.
SARAH GREEN CARMICHAEL: Is there a particular company or industry where you’ve really seen that kind of thing as well?
RICHARD CLARKE: Well, there’s another case study in the book about Tokyo Electric. Tokyo Electric wanted to increase the number of nuclear power plants they had. And they sited them well outside of Tokyo, in a little town called Fukushima. An expert, a civil engineer, went to the company and said, “This is a very bad place to put them, because it’s a potential earthquake zone, and it’s right on the water. So, if there’s an earthquake here, it’ll create a tsunami. That tsunami will overtop the seawall that you’re planning. You could have a meltdown.”
They didn’t believe him. They said there hadn’t been earthquakes there, on any record that they knew about. And so he started going to all these public hearings that you have to have for environmental clearance. And he told the story that 400 years before, there had been a big earthquake and a big tsunami. And the company decided that was silly; and therefore they made a decision to run the risk.
Now you can say that’s a calculated risk, but it’s a big risk. And the cost was in the hundreds of billions of dollars and, of course, also in the leadership of TEPCO, all of whom got fired because there was this incredible disaster, which they could have prevented had they only spent a few tens of millions of dollars more and made the seawall maybe three or four times as high.
SARAH GREEN CARMICHAEL: So, you sort of point there to, not just, like, what’s the risk, but what is the potential cost if something goes wrong. What are the other things like that that organizations should really be thinking about as they’re trying to get better at listening to their Cassandras?
RICHARD CLARKE: Well, one of the things we suggest in the book is that when you have somebody in your organization who may be a Cassandra, who has predicted something that’s never happened before, and you don’t really know whether or not to trust their data, don’t be dismissive.
Take what we call a surveillance and hedging strategy. Surveillance meaning, put the issue under watch; see if the data begins to change. And if so, in what direction. Meanwhile, spend a little bit of money planning what you would do if you became convinced that the Cassandra was right. It doesn’t cost a lot of money to plan. And then if the data starts shifting in their direction, increase the amount of planning; increase the preventative measures—you don’t have to make a decision all at once. You can experiment. You can keep it under watch.
SARAH GREEN CARMICHAEL: Do you think that there are things Cassandras themselves could do to help raise these issues more effectively?
RICHARD CLARKE: I think they need to give an appearance of being calm and not overreact when they’re not taken seriously. They don’t think, “Well, you know, my obligation was satisfied because I published that journal article, or I presented my paper at that academic conference, and therefore I’m done,” and just walk away.
No, when they publish their journal article or they present their academic paper to the conference and nothing happens, they get excited; they get agitated. Sometimes they act in ways that are detrimental to their cause. The accountant in Boston that we talked about with the Bernie Madoff scheme, he was convinced that Bernie Madoff would know that he was reporting him and would hire a hit man to rub him out. And so the accountant got a permit to carry a gun. That made many of his colleagues think he was crazy. So there is this feedback loop when they’re not listened to and they think they should be.
SARAH GREEN CARMICHAEL: That kind of frustration of not being listened to, which is of course the curse of Cassandra, right? I mean, that is the curse that the gods levy on her. Is that something that you can sort of see in your own career, looking back?
RICHARD CLARKE: Well I think certainly with regard to 9/11, when I look back at the way the Bush administration reacted to my predicting that we were going to have a major al-Qaida attack. Yeah, I think so. I was used to the Clinton administration, where I had been for eight years in the White House, paying attention to me and trusting me and believing me, because we had been through a lot together. And I thought the new Bush administration coming in in 2001 knew me well enough. I had worked with many of them in the first Bush administration, his father’s administration. But in retrospect, I think when I said the No. 1 national security issue is bin Laden and al-Qaida, I didn’t read their reaction properly. In retrospect, I think they probably thought I was nuts.
SARAH GREEN CARMICHAEL: I wonder if there is a particular type of leader that you have, based on your experience and the research for the book, come to see as particularly at risk of making that kind of mistake.
RICHARD CLARKE: Well, first of all, a lot of these issues don’t have a decision maker. But when you do have a clear authority, they often have what we call in the book agenda inertia. They come into office—whether it’s president of a university or president of a country or CEO of a company—they’ve come into that job with an agenda, and they’ve moved resources around and they’ve moved people around, to do that one thing or that series of things.
And now you, the Cassandra, come and say, “Nope, don’t work on that. My pet peeve, my pet issue, which you’ve never even heard about before, is the most important thing that you can do. And you have to stop working on your agenda in order to prevent something that has never happened before.” It’s almost always something that has never happened before, because if it has happened before, it’s easier to persuade a decision maker, like—this is going to happen again; look at the indicators. But if it’s a first occurrence syndrome, then the decision maker, somehow, irrationally, says to themselves, “Well, you know, if it’s never happened before, it probably won’t; or, if it does happen, no one can blame me for not reacting to it, because, what? It never happened before.”
SARAH GREEN CARMICHAEL: What about the people on the team? I mean, if the Cassandra is sort of managing a team, you know, have you seen situations where they’re getting kind of pushback from the team like, “Hey, boss, you have us working on something that’s probably not ever going to happen”?
RICHARD CLARKE: No, actually, we haven’t seen that. It’s kind of the opposite. What’s interesting is, these Cassandras are, in their own quirky way, good leaders. They’re kind of charismatic to the people in their immediate team in their field. Because those people have watched the leader say, “OK, let’s test this. Let’s make sure we’re right. Let’s do peer review. I really don’t want this to be true.” And they also know the reputation of the Cassandra from her previous work in whatever the field is.
You take, for example, David Morrison, who is our Cassandra on asteroid impact. And asteroid impact sounds like a silly thing. It sounds like a movie from Hollywood. But David Morrison says we have to take this seriously; we have to spend money; we have to build systems to identify incoming asteroids; and we need to build systems to push them away before they hit the earth. And people have tried to join his team because they want to be with this guy who’s a recognized leader in his field.
SARAH GREEN CARMICHAEL: We have spent a lot of time talking about super scary things, some of which have happened, some of which might happen soon. Are there examples from your research, from your own career, of crises that have been averted?
RICHARD CLARKE: Yeah. The one we talk about most in the book, and the one that I was involved in, is Y2K.
SARAH GREEN CARMICHAEL: Yes. I think most of our listeners will remember the Y2K hullabaloo. I remember my dad, who worked in software, going off with a sleeping bag to spend the night in his office actually, just in case anything should go wrong. I’m sure you have sort of exciting memories from that evening too.
RICHARD CLARKE: I spent the evening in a command post that we had built in the White House compound that had direct lines into every industry group and direct lines into countries around the world. And we followed the beginning of the new century around, beginning in Australia and moving across the earth. And there were computer failures as a result of software that hadn’t been fixed. But most of the critical systems had been fixed, some just in the nick of time. And so we didn’t have a big system crash.
SARAH GREEN CARMICHAEL: And yet now it is kind of remembered as this big, big kerfuffle over nothing.
RICHARD CLARKE: When you have a Cassandra come in to a government and convince the leader of that government to do something, and the leader acts in time, then the crisis never happens. Then the reaction that you get later is, “Oh, that was a waste of time. The Cassandra was probably wrong, and you wasted all of our time and all of our money.”
The government spent an enormous amount of money and a great deal of time and forced industry to as well, on a global basis, and therefore, nothing much happened. And people say, “Well, you see, that was just a scheme by the computer industry to get everybody to buy new software.” Well, I think it actually was a crisis averted by a lot of hard work and a lot of resources.
SARAH GREEN CARMICHAEL: And I think that’s probably how most leaders would want things to turn out, most of the time.
RICHARD CLARKE: That’s right. Most leaders do want that result—most responsible leaders. Well, let me say something that’s terribly cynical: that if you’re a leader, and you are posed with this problem—do I spend a lot of time and money and change my agenda to work on this Cassandra issue or not—you might do the following calculation: If it happens, and it had never happened before, I can’t really be blamed. But then I can be the crisis manager, and I can whip everybody into shape and I can ride to the rescue and I can get the money I need and the people I need. Whereas, if I deal with it in a calm, cool, collected, prophylactic kind of way now, I’ll never get any credit. And people will say that I wasted all the money.
SARAH GREEN CARMICHAEL: Yeah, it’s interesting to hear you, kind of, talk about a crisis as that kind of thing. Because when I started working at HBR, it was actually right on the eve of the financial crisis, and I remember after that happened we published a lot of articles about, kind of, how to take advantage of a crisis.
RICHARD CLARKE: Never let a good crisis go to waste.
SARAH GREEN CARMICHAEL: Exactly.
RICHARD CLARKE: Well, I don’t think George W. Bush ever made that kind of calculus about 9/11. But think about it for a minute. If he had attacked Afghanistan before 9/11, as I was recommending, people would have said he was a warmonger and he was overreacting. He didn’t. Therefore, he did things that pretty much, I think, let al-Qaida grow and attack us. When that happened, nobody blamed him. He stood on the ruins of the World Trade Center with a fireman, yelling through a megaphone, and became a national hero.
SARAH GREEN CARMICHAEL: Is it easier or harder now that there are so many conspiracy theories on the internet and so much bad data out there? Is it easier or harder for Cassandras really to be heard?
RICHARD CLARKE: It’s harder, because there are so many truly fake news stories. And when you come in to a decision maker with something, they tend to react like, “Oh, you’re one of those people.” If you can get by that by saying, “I’m an expert. Here are my credentials. Here are all of these other experts who recognize me as an expert,” then I think you get over it.
But, you know, the thing about conspiracy theories—when I was in the government, I was hit with a lot of them all the time, I didn’t dismiss them just because they sounded crazy. Because, in my experience, crazy things were happening all the time, and things that had never happened before were happening all the time, and unlikely things turned out to be real. So, I always said, look, it’s our obligation to disprove it. And if you can’t disprove it, then you’ve got to keep an eye on it.
SARAH GREEN CARMICHAEL: You know, it’s interesting to me that you’re even, sort of, reclaiming this word, “Cassandra,” right, because it could be seen as a very sort of negative way to portray someone. But it sounds like it’s something that you’ve just sort of come to embrace.
RICHARD CLARKE: Well, Cassandra was right.
SARAH GREEN CARMICHAEL: Fair enough! Fair enough. Well, Dick, this has really been a pleasure.
RICHARD CLARKE: All right. I enjoyed it. Thank you very much.
SARAH GREEN CARMICHAEL: That’s Richard Clarke. He led counterterrorism efforts for presidents Bill Clinton and George W. Bush. Now he runs the consulting firm Good Harbor Security Risk Management. And he’s the coauthor of the new book Warnings: Finding Cassandras to Stop Catastrophes.
You can follow HBR on Twitter @HarvardBiz, and on Facebook and LinkedIn, too.
Thanks for listening to the HBR IdeaCast. I’m Sarah Green Carmichael.