About two years ago, I wrote a column in Roundel titled “Mobility, Shmobility.” I’d just attended a conference at MIT on mobility (the buzzword that encompasses both self-driving cars and ride-sharing services like Uber and Lyft), and while I was impressed by autonomous vehicle (AV) technology, my take-away was that auto manufacturers were falling all over themselves to spin it into big-thinking “mobility strategies” so that they wouldn’t be accused by shareholders of being caught flat-footed by a disruptive technology the way Polaroid and Kodak were swamped by digital photography.
The chair of the conference was an automotive writer named Craig Fitzgerald. Craig and I have since become friends, sharing a love of old guitars, twangy rock ‘n’ roll, and of course cars. And, in a strange twist, Craig’s wife, Lisa, used to own the E39 525i wagon that I almost bought, even after it tried to kill me when it pitched off a floor jack that sank into soft asphalt while I was checking its front-end play. (I wasn’t under it at the time, but when they say, “Never work on a car held up only by a floor jack,” they mean it. Set it down on jack stands and leave the jack in place as a back-up. Live and learn—with the emphasis on live.)
Craig has his hands in a number of pies, including being editor-in-chief at BestRide.com and, like me, being a Facebook addict. So it was from Craig’s Facebook post that I first learned the news that a self-driving Uber vehicle undergoing testing in Arizona had killed a pedestrian. I did what many people did and watched the in-car video, which was widely distributed by media outlets and is now on YouTube. [Editor’s note: While the video does not show the impact, viewers are advised that it does show the actions and reactions surrounding a graphic, fatal incident. Proceed with care.]
If you haven’t seen the video, it is jarring, to say the least. The dashcam shows the view out of the windshield of the vehicle, an Uber-prepared Volvo XC90. It also shows the vehicle’s driver. (Wait—a driver in a self-driving car? We’ll get to that.) It’s nighttime, the road is straight and clear, traffic is light, and there is no inclement weather. A woman emerges from the left, wheeling a bicycle, and crosses the road. The driver notices at the last moment, has a horrified expression on his face, and is unable to do anything as the vehicle hits her.
Having watched the video, I did what many people did, which was to render judgment. Even when I was in my twenties, I never had the twitch reflexes of a fighter pilot, and as I close in on the Big Six-Oh, my night vision isn’t what it used to be. So, yes, it was tragic—but had I been the driver, I probably would’ve hit her, too.
Numerous Facebook threads went up, linking to the video. Occasionally some young Turk would chest-beat and claim that he could’ve avoided the pedestrian, but most folks agreed that given the circumstances, the tragedy was probably unavoidable. The “probably unavoidable” judgment was also apparently the opinion of the local police chief after reviewing the dashcam video in the preliminary investigation, as a number of media outlets, including USA Today, reported the day after the accident.
Having rendered judgment, I moved on.
Now, while I didn’t think that I could’ve missed the pedestrian, I did, it turns out, totally miss the point—and so did everyone else who watched the video and rendered judgment on the question of whether they could’ve done better than the Uber driver.
Let me circle back to the question of why a self-driving car needs a driver. When I went to the mobility conference two years ago, I learned that the Society of Automotive Engineers (SAE) has levels that describe vehicle automation. There was a lot of debate over the jump from SAE Level 3, where the car can operate autonomously but still requires a driver to be ready to take back control, to SAE Level 4, where the car can operate completely autonomously in certain situations without driver intervention (SAE Level 5 is where the car can operate autonomously in all situations).
If you think about it, in SAE Level 3, there’s an inherently contradictory level of responsibility placed on the driver. He or she is expected to sit behind the wheel of an autonomously driving vehicle and do nothing except maintain concentration so rapt and flawless that he or she can intervene at the exact moment that the AV systems fail to do what they were designed to do, and override them. The better the AV systems work, the less often this kind of intervention is necessary. It’s reasonable to expect that, as a result, the autonomous systems will lull the operator into a state of complacency. So if, when you watch the video of the fatal Uber accident, it appears that the driver isn’t paying attention, I’m not sure why anyone would expect anything different.
And yet, according to Craig Fitzgerald, that still misses the point. The real issue, he says, is that the self-driving car in the Uber fatality didn’t even try to slow down, and it should have, because the vehicle was a commercial off-the-shelf Volvo equipped with technology that already detects pedestrians and brakes when necessary.
You can read Craig’s excellent article here on BestRide, but in a nutshell, the vehicle in the Uber accident was a Volvo XC90 with Volvo’s “City Safety” pedestrian-detection technology. That technology uses a combination of visual, radar, and infrared sensors not only to detect cars, pedestrians, and cyclists, but to track pedestrians and cyclists as they enter and leave the car’s field of view, keep track of them as the car passes them, alert the driver of their presence via lights and beeps if they get too close, and automatically apply full braking if necessary, reducing relative speed by as much as 30 mph.
The vehicle in the Uber accident was apparently traveling at 38 mph. The stock Volvo technology should’ve detected the pedestrian and her bicycle and slowed the vehicle to as little as 8 mph before the impact. The point that Craig has been shouting from the rooftops is that the dashcam video appears to show no attempt whatsoever by the car to slow itself, and none of Volvo’s visible and audible “City Safety” warnings being activated. This raises the possibility that, in Uber’s attempt to layer its own self-driving technology on top of Volvo’s, the already-functional “City Safety” pedestrian-detection and automatic braking technology was tampered with or defeated.
To put this another way, when we look at the dashcam video and render judgment that we probably would’ve hit the pedestrian as well, we’re looking at the wrong thing. We’re putting ourselves in the driver’s shoes, not in the AV technology’s position. If we do that, it’s fine to let the driver off the hook, but we shouldn’t let the AV technology off the hook. The car’s AV system isn’t using what you see in the dashcam video; it’s using multiple sensors that detect pedestrians off to the side of the road. Not hitting pedestrians should be Job One for any AV system.
And this didn’t appear to be a highly cluttered environment. The traffic appeared to be light, the road appeared to be straight, and the weather appeared to be good. The car appeared to be traveling at a modest speed of 38 mph. The automatic braking should’ve worked. There’s a good chance the pedestrian would still be alive. You can read what you like into the fact that Uber has already quickly settled with the pedestrian’s family and discontinued testing of its self-driving cars on public roads in Arizona and California.
I need to think about our tendency to read a story or watch a video, measure ourselves against it, and render judgment on whether someone’s performance is reasonable or flawed compared with our own. This tendency is very deep-rooted; after all, the expression “man is the measure of all things” dates back to the Greeks. Going forward, I’ll try to measure myself against the right things.
The National Transportation Safety Board (NTSB) is examining the Uber crash. In spite of the initial incorrect reaction to the dashcam video, many stories are now being written about how the crash was a failure of AV technology, and how the testing of autonomous vehicles may require more regulation. Fitzgerald is not the only one trumpeting the possibility of tampering, but he may have been the first.
Follow Craig. The guy’s got insight.—Rob Siegel
Got a question for Rob Siegel, the Hack Mechanic? You can find him in the BMW CCA Forums here!
Rob’s new book, Ran When Parked: How I Resurrected a Decade-Dead 1972 BMW 2002tii and Road-Tripped it a Thousand Miles Back Home, and How You Can, Too, is now available on Amazon. Or you can order personally inscribed copies through Rob’s website: www.robsiegel.com. His new book, Just Needs a Recharge: The Hack Mechanic™ Guide to Vintage Air Conditioning, will be out in the spring.