If you haven’t read A Bit of a Rant on Data and Data Rant Continued, they’ll provide additional context for this.
You wouldn’t think a place would vie for the title “Most Dangerous Intersection in the US.” But somehow a place in Hawaii took that spot away from Pennsylvania. Or did it?
You’d sure think so if you read the headline “Honolulu has the nation’s deadliest intersection” — about an intersection that didn’t even make the top 10 in the original Time article. Granted, a couple of years had passed, but it takes time to verify data for official sources — that is, if you’re actually using them.
Let’s retrace the tangled web that brought us to this point:
The 2014 Time magazine analysis looked at NHTSA data on fatalities from 2003 to 2012 — although I still don’t know for sure whether this covered only motorist/pedestrian fatalities or also included bicyclists and motorcyclists. Ranking solely by number of fatalities, it identified the worst intersection in the US as one in Bensalem, PA (seven deaths). Eleven other places tied for the #2 spot with six fatalities each. None of these were in Hawaii.
This (plus some crowdsourced data you wouldn’t know was there from reading most of the pieces) got turned into a catchy map infographic that really made the rounds, and continues to get picked up. That map listed an intersection for every state, including a particular block in Hawaii, and it’s what set me off. I started looking at some of the articles that had picked it up to see whether they had examined the data at all, and that’s when I spotted the headline saying Hawaii had the worst intersection.
Enter The Cheat Sheet, referred to as a media company on the Hawaii news site that used their analysis and as a lifestyle site on their own About page. According to their Twitter bio they “help you save time and live more with the most up-to-date guides, reviews, lists, and advice.” Their sportswriter “dug into the numbers” (this is how the Hawaii news story described it) to come up with their own analysis.
What does “dug into the numbers” mean? Here’s where the sources spiral out of control and I’m not chasing down every one — I’ll just give you a taste.
Now if you want to rank anything you generally have one scale. Think of it as holding a ruler up to something to see how long it is, then holding it up to another item, then telling us which one is longer. We have enough confusion between the metric and British imperial systems — we don’t go introducing Chinese or ancient Mayan or other ways of defining units of length.
At least, not in my world. Or if we need to use two different rulers for some reason we tell you what and why.
At the Cheat Sheet they used quite a variety of rulers. They linked to all of them, at least. So I know they used some media analyses and lists produced by personal-injury law firms (like that first map infographic that started me down this rabbit hole), each with varying definitions and time frames. Some sources used a clear and specific methodology relying on government data, which works great if you apply the same methodology to all the places you’re comparing. In at least one case the state DOT had put out one set of information but the Cheat Sheet writer decided to use another one (without saying why). In these various sources, “danger” sometimes means all crashes, sometimes serious injuries and fatalities, sometimes fatalities alone, sometimes only one type of road user. I saw one mention of using Facebook to take nominations for the most dangerous intersections.
To pick two examples:
- For #1 Hawaii they used a law firm’s analysis covering just 2015, which at least does appear to be based on a government data source linked at the bottom of the page. It found 312 “accidents” (sic — #CrashNotAccident). For which modes? What locations? Because the place identified isn’t one intersection: it’s now an entire shopping mall and all its parking lots as well as actual street intersections.
- The #15 place on the list is based on a professional firm’s analysis of all pedestrian crashes in that city from 2008 to 2012, with extra weight given to serious injury/fatality crashes. Not a bad methodology — and not the same criteria as the Time analysis or the Hawaii one or any other city on the list.
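To see how much the choice of “ruler” matters, here’s a minimal sketch with entirely made-up numbers for two hypothetical intersections — nothing here comes from NHTSA or any of the sources above. The same data produces a different “most dangerous” answer depending on which metric you rank by:

```python
# Hypothetical crash records; all numbers invented for illustration.
crashes = {
    "Intersection A": {"total_crashes": 300, "fatalities": 2, "ped_crashes": 5},
    "Intersection B": {"total_crashes": 120, "fatalities": 6, "ped_crashes": 40},
}

def rank_by(metric):
    """Rank intersections by a single metric, worst first."""
    return sorted(crashes, key=lambda name: crashes[name][metric], reverse=True)

# Three different "rulers" crown different winners:
print(rank_by("total_crashes")[0])  # Intersection A (most crashes overall)
print(rank_by("fatalities")[0])     # Intersection B (most deaths)
print(rank_by("ped_crashes")[0])    # Intersection B (most pedestrian crashes)
```

That’s the whole problem with the Cheat Sheet list in two dictionaries: until everyone agrees on one metric, one time frame, and one data source, the ranking is an artifact of the measuring stick, not the intersections.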
“So what?” you might ask. Listicles abound. Who cares if they used 15 different rulers to measure these intersections?
But here’s the deal: when a public agency presents safety data bounded by specific sources, definitions, time frames, and federal standards, this is the climate in which they do so. “You’re wrong! I just read a list of the worst intersections in America!”
Which list did they see? The original Time analysis? One of the pieces illustrated with the map that was described repeatedly as relying on NHTSA data but actually also includes crowdsourced unknowns? Or the one that used a variety of time frames, sources, methods, and opinions? (I imagine there are yet other lists with different results.)
I stopped pulling on the threads but I’m sure they unravel further. All of this is to say that if you’re looking at something that purports to be based on data, ask some questions. Don’t let the type turned sideways turn your head, too, and don’t assume #1 is really #1.