
A Cognitive Neuroscientist Shares How to Avoid Business Disasters

March 12, 2020

During a recent training I conducted on how to plan better meetings using neuroscience, one of the many association executives there shared a harrowing, true story. Let’s call him Mark — though it could have been Mary, Merle, or Michael — and his experience wasn’t a surprise to me.

“Our last annual conference was a real disaster,” Mark said.

Mark is the Events Director of a 17,000-member association that I won’t name. He clearly felt vulnerable but was willing to share the situation so everyone could learn from it. As he explained, his team had followed their typical plan in preparing for the annual conference: they secured the usual sponsors, booked a venue with a good reputation, lined up quality speakers, and marketed the conference to their membership. As he noted, small problems had happened in past years, but never major disruptions; as usual, his on-site staff and volunteers addressed the minor issues. Unfortunately, this time the conference did not go according to plan.

This had all come up as we were talking about why failures happen: we hit blind spots, or dangerous judgment errors, that cause us to assume things will keep happening as they always have before. There are over 100 mental blind spots that cognitive neuroscientists and behavioral economists like me call cognitive biases, and any one of them can cause the kind of poor decisions that lead to disasters. Mark’s underestimating the likelihood of something truly going wrong this time is a perfect example of normalcy bias. Since past events had produced only minor issues, he and his team assumed that’s how it would go this time. That’s not what happened.

As Mark recounted, the first sign of real trouble involved registration: the new registration software and its smartphone app were a breeze for the team, who, as professional meeting planners, were used to such tools. They assumed everyone would be fine navigating the new technology. That is an example of another cognitive bias, the false consensus effect, where we underestimate the extent to which other people’s values, perspectives, and understanding differ from our own. In reality, the software and app were extremely confusing to older members, many of whom simply chose not to struggle with them. Attendance dropped by 20 percent. Even attendees who had installed the app couldn’t use it, which cast a pall over their enthusiasm for the conference.

There were other problems as well: the venue, while well recommended, was hosting several conferences at once, which overburdened the staff and compromised their ability to set up rooms on time. The menu lacked options. The AV techs were harried. Then the keynote speaker came down with laryngitis days before the event. Without time to find a replacement, Mark tapped the association’s own Executive Director for the job. Unfair or not, attendees came away with the impression that the whole event was ill-planned and a waste of time.

It could have been worse, we all reasoned; no one had a medical emergency. With hundreds of thousands of conferences taking place in the US every year, perhaps it was simply Mark’s turn for bad luck. But the disaster could have been prevented, and Mark knew that. He just knew it in hindsight, and he hoped the conference hadn’t permanently marked his career. “Next time,” he said, “if there is one.” We all agreed we hoped there would be.

Failure-Proofing

Event planning is a highly competitive arena, but so are countless other fields, and we’re all trying to avoid mistakes that could damage our careers. As the group I was leading reassessed what Mark could have done to prevent his meeting catastrophe, I introduced a fail-safe method for avoiding disasters. This Failure-Proofing exercise can help ensure a major endeavor’s success. The strategy is grounded in neuroscience and tackles the cognitive biases that bring down your efforts. Here are the steps in a nutshell:

1. Envision the disaster. 

To avoid a disaster, you first have to accept that it could happen. You and, ideally, your team should imagine that the project has failed, picture what that disaster looks like, and ask what went wrong.

2. List all the reasons that disaster happened. 

Next, brainstorm all the reasons the project failed. Include reasons that could be seen as rude or politically problematic; those are often the ones we don’t talk about but should. In Mark’s case, he didn’t want to seem disrespectful to the well-known keynote speaker, so he never discussed an understudy. Because this is an exercise, nothing has actually happened yet. Use that as license to call for complete honesty, but have the team share their reasons anonymously, through a Google Forms survey, for instance. You need to surface the sensitive issues.

3. Discuss, assess, and find solutions. 

As a team, discuss the reasons everyone gave. Decide which should be addressed, based on an assessment of their likelihood and their impact. Then brainstorm ways to prevent these problems, or to address them if they arise.

4. Revise the project plan. 

With this new knowledge in hand, integrate the solutions into your project plan.

This whole exercise can also be done in retrospect. When Mark ran it, he realized how he and his team could have prevented countless mishaps. They could have offered multiple registration options, from traditional to new, to fit the diverse needs of the membership. They could have checked whether the venue was running too many conferences at once and moved theirs. They could have lined up a backup keynote speaker by keeping a shortlist and vetting candidates in breakout slots for the following year.

But here’s the fun part: the last phase of Failure-Proofing is about maximizing your success. Now imagine the project was a success and run the exercise in the positive. Brainstorm all the reasons it succeeded, and use that information to revise your plans so it will. Defeating cognitive biases such as the normalcy bias and the false consensus effect is key to heading off disasters before they happen. Next time, you won’t need to do it in hindsight.
