What the Automated Vehicle Crash in Tempe Says About Road Safety

America let out a collective shriek on Monday as news broke that a woman had been killed by a car in Tempe, Arizona.

It was the first time a driverless car had ever killed a pedestrian, at least to our knowledge. Usually, it’s the people inside the cars doing the killing. This time, the car—an automated vehicle Uber was developing and testing on the streets—was at fault for the crash.

Traveling about 40 miles per hour in a 35 mile-per-hour zone, according to the San Francisco Chronicle, the vehicle killed 49-year-old Elaine Herzberg as she reportedly walked her bicycle across the street. A person was behind the wheel, observing the vehicle, but didn’t take control in time to save Ms. Herzberg. Uber has since removed its automated vehicles from the road, and good on the company for doing it.

This crash is different for the reasons mentioned above: The killer was a robot, not a human being, and there are lots of dystopian conclusions we could draw from that. And people have.

But it’s also remarkably similar.

First off, unfortunately — and predictably — news outlets began blaming Herzberg almost immediately after the crash. News reports, and comments from those tasked with investigating the crash, noted that she was crossing outside a marked crosswalk and that she “came out of the shadows.” We also learned that the victim may have been homeless and may have had a criminal record.

This regularly happens when pedestrians and cyclists are killed by people driving motor vehicles. If the victim was a cyclist, the first question is whether they were wearing a helmet. If a pedestrian, it’s whether they were looking at their phone while walking, or whether they were in the crosswalk.

But don’t dare question the technology. In fact, very little is known, or is being reported, about the technology that presumably failed in the death of Herzberg.

What we do know is that the car did not stop, swerve or slow down to avoid Herzberg the way a human being might have. It seems not to have registered the human being walking on the side of the road.

It’s similar to other incidents of automated vehicle testing gone wrong on the streets in recent years, as noted by PC Magazine:

Around this time last year, a Volvo SUV fitted with Uber self-driving tech collided with another vehicle that failed to yield, causing significant damage but no serious injuries. Following that collision, Uber temporarily suspended its US self-driving programs in Tempe, Pittsburgh, and San Francisco.

Meanwhile, Google’s autonomous test vehicles have been involved in several crashes over the years, including an incident in September 2016 when another car ran a red light and collided with a Google SUV. That same year, a more serious crash left a Tesla driver dead after his car, operating in “Autopilot,” collided head-on with a truck.

Second, part of the problem with testing automated vehicles in Arizona is the lack of regulations the state requires these companies to comply with. “In 2015,” notes the New York Times, “Arizona officials declared the state a regulation-free zone in order to attract testing operations from companies like Uber, Waymo and Lyft.”

Unfortunately, more states are looking to loosen regulations on automated vehicle testing in U.S. towns and cities. And to be fair, the rate at which automated vehicles kill people does appear to be lower than the rate at which human drivers kill people.

But, lastly, and perhaps most importantly, nearly every news outlet has covered, or at least mentioned, the Tempe crash. The same cannot be said of the thousands of crashes that occur throughout the U.S. each day. Here in Philadelphia, throughout 2017, the Bicycle Coalition found that roughly a quarter of motor vehicle crashes that resulted in a person’s death did not make the news.

The only good thing about the collective shriek over the death of Elaine Herzberg was that there was one. It indicated that as a society we don’t think it’s normal for a robot to fail to see a human being and kill that person.

Unfortunately, we as a society do think it’s normal for humans to see other humans and kill them with their automobiles, even when such collisions could have been prevented.

We need to make every collision in which a motor vehicle, whether driven by a human or a robot, strikes a person walking, bicycling or riding in another car worthy of a collective shriek — and not acceptable.

One hundred people are killed in traffic in Philadelphia every year. There are 10,000 reportable crashes annually, and those crashes cost the city and its people $1 billion.

Pedestrian deaths and crashes are on the rise. And yet car crashes, still often referred to as “accidents” in the media and elsewhere, are met with a shrug in our society. As Philadelphia and many other cities around the country continue to adopt Vision Zero, that will hopefully change.

As many know, the Bicycle Coalition is working to raise awareness of every single crash that happens throughout 2018, not only by writing about it, but by placing a physical Vision Zero memorial at the intersection where the violence occurs.

We believe this, among other advocacy efforts and tactics, will help our city pay more attention to these needless, avoidable deaths, and do something about them.

But until it does, Herzberg, like the thousands of others killed in traffic throughout the U.S., is too often reduced to a statistic — whether killed by an inattentive driver or an inattentive robot.



