I like a good infographic as much as the next person. They’re eye-catching, they get passed around a lot, they’re more readable than dense academic papers full of footnotes (sorry, academics). So when I saw one highlighting “the most dangerous intersections in the US” I fell for the clickbait.
My first reaction on reading the very short piece that accompanied it: catchy title, but the piece provides zero basis for the claim that these truly are the most dangerous intersections (however "danger" is defined).
Data-driven safety improvements are essential — but where are the actual data? This could have been so much more. So I did an easy, basic thing: I Googled the law firm credited on the map. Nice long list of results.
One toward the top told me the map came from NHTSA "accident" (sic — #CrashNotAccident) data for automobiles/pedestrians. It's entirely possible to slice and dice that data so you only look at collisions or fatalities involving certain types of road users, so I wondered whether they had actually left out bicyclists and motorcyclists or whether the writer had just made some assumptions.
At least this article — unlike the one I first saw — linked to the short Time Magazine piece that had originally inspired the map's creation. That told me their analysis counted "fatalities." So maybe all road users, not just drivers and pedestrians? Hard to be sure. But wait, there's more.
Back to that list of Google results: maybe some other writer using this map got a little more specific about the data fields. Toward the bottom of page one of the results, I found this from Infographics Journal: "Using crowd sourced data from Badintersections.com and 9 years of data from the National Highway Traffic Safety Administration…based on automobile and pedestrian accident occurrences" (emphasis mine; note too that 2003-12 actually includes ten years of data, not nine).
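The off-by-one is easy to verify: an inclusive span from 2003 through 2012 covers ten reporting years, not nine. A one-line check (the year values here are just the range quoted above):

```python
# Count reporting years in an inclusive range: 2003 through 2012.
start_year, end_year = 2003, 2012
years = list(range(start_year, end_year + 1))  # +1 makes the upper bound inclusive
print(len(years))  # 10 reporting years, not 9
```

It's the classic fencepost mistake: subtracting the endpoints (2012 − 2003 = 9) misses that both endpoint years are included.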
So we now have a map based on two data sources: crowdsourced reports with no specified time frame, definitions, or quality controls, apparently covering all road user types and "intersections that are hazardous or annoying" (per a statement on the Bad Intersections site), plus ten years of NHTSA data, possibly limited to motorists and pedestrians.
Annoying? Let's pause to note that "annoying" and "unsafe" are not the same thing. In fact, if a driver is annoyed because the crosswalk light lasts long enough for people to actually clear the intersection, that's an intersection designed with pedestrian safety in mind.
If people reported as annoying/hazardous the same intersections that show up in the NHTSA collision data, then some locations are being double-counted. If multiple people are annoyed by an intersection and each report it, that deepens the overweighting. But I don't know, since this isn't a government source with clearly specified criteria.
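The double-counting problem is worth making concrete. A minimal sketch, using entirely hypothetical intersection names (nothing here comes from either actual data source), of what happens when you naively pool crowd reports with crash records:

```python
# Illustrative only: hypothetical data showing how pooling crowdsourced
# reports with crash records inflates the score of any location in both.
from collections import Counter

# Hypothetical crowdsourced "annoying/hazardous" reports (one entry per report,
# so a much-complained-about intersection appears multiple times)
crowd_reports = ["5th & Main", "5th & Main", "Oak & 12th"]
# Hypothetical collision records from an official dataset
crash_records = ["5th & Main", "Elm & 3rd"]

# Naive pooling: every crowd report AND every crash record adds to the tally
naive_scores = Counter(crowd_reports) + Counter(crash_records)
print(naive_scores["5th & Main"])  # 3: two crowd reports plus one crash record

# A deduplicated view flags each location at most once per source
flagged = sorted(set(crowd_reports) | set(crash_records))
print(flagged)  # ['5th & Main', 'Elm & 3rd', 'Oak & 12th']
```

Under naive pooling, "5th & Main" looks three times as dangerous as "Elm & 3rd" even though each has exactly one crash record; the gap is driven by repeat complaints, not collisions. That's the overweighting in miniature.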
And this map is getting picked up by media all over. After all, we have a splashy infographic. Big words! Turned sideways! So authoritative.
Don't get me wrong: as someone who works with real crash data and wants to do real work to improve conditions for people on our streets and roads, I do want attention-getting tactics that put safety in the spotlight.
But this is not helpful. This map circulated extensively when it was first developed after the initial Time article, and it's catchy enough that it gets picked up again and again. To see it written up yet again with no discussion of the underlying source material, none of what I found in about seven minutes with Google without even going past the first page of results… that's frustrating.