A dashcam video that was posted last spring, reportedly showing a Tesla in dense fog heading directly toward a train rumbling beyond the lowered arms of a railroad crossing, gives new meaning to the word “frightening.”
It made me think twice about the self-driving revolution, which always seems to be just over the next horizon. Maybe the future isn’t what it used to be.
In the video, the car’s owner, Craig Doty II, is using the car’s “full self-driving” mode, which is a bit of a misnomer, considering the mode still requires the driver to keep both hands on the wheel and remain ready to take over. It’s a foggy day. The car is going about 60 mph when the train and the crossing come into view. The owner realizes what is about to happen, brakes hard and steers off the road just in time, hitting a crossing arm and narrowly missing the train. I found the video on the NBC News website, which reported that the car’s owner blamed the car for not seeing what was ahead.
“I was like there’s no way it doesn’t see the train,” NBC quoted him saying. “There’s no way it doesn’t see the flashing lights. Yes, it was foggy, but you can still see the lights.”
No, apparently, it couldn’t.
In one month, it will be 2025. Soon it will be nine years since Business Insider predicted 10 million self-driving cars would be on the roads in 2020 — a prediction that proved to be off by several miles, at least.
I used to eagerly await the day it finally would happen. But while some driverless taxi services are currently in operation in select cities, I’m no longer sure I’m looking forward to it. Some of our assumptions about that day may not be true.
Earlier this year, the Brookings Institution quoted the Association for Computing Machinery, which told regulators it would be wrong to assume self-driving cars would reduce injuries and deaths on the highways. The reason is simple. Computers are programmed by fallible humans. Therefore, they make mistakes.
Brookings quoted Mary “Missy” Cummings, a safety engineer at George Mason University, who said we may just be replacing the errors of human drivers with “human coding errors.”
In California three years ago, an unusual combination of circumstances triggered a software glitch that sent a self-driving car onto the raised median of a city street.
The Brookings report, written by senior fellow Mark MacCarthy, goes on to compare the driving records of humans and computers. Humans, when they are not drunk, overly tired or otherwise distracted, are good drivers, experiencing perhaps one fatality for every 200 million miles traveled. Even counting those impaired and distracted drivers, the rate rises only to about one fatality per 100 million miles.
Meanwhile, “Computer vision systems have been shown to misperceive a stop sign as a 45-mph speed limit sign, under adversarial engineered conditions that mimic many real-world situations,” Brookings said. “Computer perception systems are notoriously brittle and malfunction in unpredictable ways.”
A recent post on Daily Kos described self-driving cars as having a social problem. The author described coming to a non-functioning stoplight, where drivers used “a combination of gestures, car maneuvers, and eye contact to communicate our way through an unexpected obstacle.”
Technologists, he said, “are underestimating how social a problem driving actually is.”
As I’ve written before, the cars of the future will have to interpret the hand signals of a traffic cop and distinguish them from those of a pedestrian in a crosswalk who happens to wave or gesture to a friend.
As MotorTrend wrote in September, the problem is simply that “affordable mass-produced onboard computers still can’t match the human brain’s processing power.” But then, humans can’t see things in a continuous 360-degree sweep. Maybe the best solution is to combine humans and computers, if we can keep the humans from nodding off under a false sense of security.
Yes, humans make errors, to the tune of more than 40,000 traffic fatalities in the United States in 2023, most of which, if history holds, were caused by human mistakes. Those errors are horrific. But I don’t know that they are much worse than sitting in the passenger seat, watching a computer misperceive important clues in real time as a railroad crossing grows ever larger in the windshield.
The truth is we may never get a completely safe driving system, which makes the future a lot less fun than I thought.