Like most suburban Washington, DC neighborhoods, ours is off a small street that has become a shortcut for commuters looking to avoid a congested intersection, most of whom fly down the street well above the speed limit. In this area, everybody’s neighborhood is somebody’s super secret traffic hack (DC traffic is the worst, well… almost).

Last summer we received a letter in the mail notifying us that the city was giving us options to help reduce speeding. Our options, put to an informal mail-in vote, were:

  1. Speed bumps, or
  2. Paying for additional police enforcement.

My husband and I groaned at the option of speed bumps. Anyone who has ever had a sleeping infant or toddler in a car knows that speed bumps are sleeping baby landmines. We both supported paying for increased police enforcement. Besides, we reasoned, speed bumps don’t slow people down; people just speed up between them, right?

Being a data nerd, I dug into the peer-reviewed research before we voted, and it turns out we were absolutely wrong.

Speed bumps were far more effective at reducing risk than increased police enforcement. I discovered that decades of studies from around the world showed that speed bumps not only slow drivers down, they also reduce both car accidents and pedestrian deaths. So we changed our vote, and the speed bumps were added a few months ago.

But this isn’t about speed bumps or traffic. It’s about not being afraid to be wrong and changing your mind with new information, even if it’s inconvenient.

Our generation of parents has been trained to view actions as synonymous with character. If a parent makes a less-than-ideal decision (or hell, a benign decision that we wouldn’t make), we’ve been trained to believe they must be a bad parent. My husband and I weren’t bad parents because we initially thought speed bumps wouldn’t reduce the traffic issue. We were just wrong about that issue. When we had better information, we changed our opinion.

So why do many of us persist in our current beliefs even when we’re confronted with evidence that those beliefs are incorrect? Our brains don’t help matters. In fact, the brain is hard-wired to reassure us that we’re right, even when we’re wrong. Even the smartest among us are guilty of persisting in beliefs when confronted with contradictory information. The ways in which we convince ourselves that we’re right when we’re actually wrong are called confirmation biases, and there are a few different types.

Leslie’s son itching to make a run for freedom (and danger).

I wanted to escape the confirmation bias trap with the speed bump issue and do what was truly safest for our toddler – who, incidentally, wasn’t always consistent when told that the street wasn’t for playing – so I did what I usually do in these situations: try to prove myself wrong.

The first thing I did was voice my opinion to people I didn’t think would necessarily agree with me. This is a way of avoiding the confirmation bias known as the echo chamber, wherein we seek information from sources we think are likely to agree with us.

While chatting with neighbors I mentioned that we were thinking of voting for increased enforcement, to which my neighbor replied that he’d been at a meeting with a city planner who explained that speed bumps were much more effective at reducing pedestrian deaths. I was a little skeptical of what the city planner had apparently said. “Of course he’d say that,” I thought, “increased enforcement probably costs the county a ton in police overtime and he’s looking to save money.”

My dismissal of the city planner’s statement is an example of a phenomenon called motivated reasoning, first identified in the 1950s. Motivated reasoning is often subconscious, meaning we don’t do it on purpose. In this process we discard, or put less emphasis on, evidence that contradicts our current point of view, and include, or put heavy emphasis on, evidence that supports it.

Motivated reasoning happens in a few ways, including looking only for information that we know supports our beliefs. My neighbor’s statement was enough to make me investigate how wrong I might actually be. So I turned to the peer-reviewed literature, but I knew I had to search smartly if I wanted an accurate answer.

Had I been looking simply to confirm my hypothesis, I could have searched JSTOR for “speed bumps” ineffective traffic deaths, in which case the results would have focused more on citizens’ individual freedom and less on the statistical data. Instead I searched for traffic deaths “speed bumps”. I deliberately kept the search terms neutral because I wanted to know what the data actually said, and the resulting studies showed me that I was wrong.

When I brought the evidence to my husband and told him we were going to be changing our vote, he groaned and protested. I gave him a little while to think it over, and he eventually came around, acknowledging that the improvements in safety outweighed the inconvenience of the speed bumps. Not that he had any say in it; I’d already changed our vote and mailed in our ballot.
