The Self-Driving Car Industry’s Biggest Turning Point Yet

Mar 30, 2018 · 19 comments
Mike B (Ridgewood, NJ)
It’s easier to automatically land a 747 in a thunderstorm. Easier to safely bring a space shuttle, orbiting at 17,000+ mph, to a full stop in Florida. Sensor-based auto-driving will never work. There are too many incursions and unpredictables, too many ways to inhibit or occlude sensors with rain, dirt, grime, and other road nasties. There will always be unknown objects and scenarios to confound the computer. I am surprised the sensors did not have the peripheral capacity to prevent the most recent death.

Only one scenario will allow the technology to thrive: investors stand to gain more in sales than they lose in casualty payouts. But the payouts won't last for long. As corporate persons, they will write new legislation absolving themselves of all liability; for the rights of the corporation shall not be infringed!!! Thank you, SCOTUS.

We should concentrate on a form of positive highway control. Once on the highway, a car can lock onto an embedded "track" of signal wire that will auto-space and auto-speed the vehicle, and only with ample fail-safes, meaning that when something fails, it fails in a safe mode, impossible to be dangerous. A variation of what's used with trains. And I'm not kidding myself; it's not that simple with cars. Probably ten to fifteen years out, but far simpler than full autonomy. Do you really want to be anywhere near a road with 140,000 lb autonomous tandem tractor-trailer traffic? In all types of weather?
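Mike B's "fail-safe" criterion has a precise engineering meaning: a fault must leave the system in its safest state, never an undefined one. A minimal Python sketch of that principle; all names and the two health checks are our own illustrative assumptions, not any real system's design:

```python
from enum import Enum, auto

class Mode(Enum):
    TRACK_FOLLOWING = auto()  # locked onto the embedded guide wire
    SAFE_STOP = auto()        # the default whenever anything goes wrong

class GuidedLaneController:
    """Illustrative fail-safe controller: a fault never leaves the
    vehicle in an undefined state; it forces a controlled stop."""

    def __init__(self):
        self.mode = Mode.SAFE_STOP  # start in the safe state, not the active one

    def update(self, track_signal_ok: bool, spacing_ok: bool) -> Mode:
        # Fail-safe rule: every check must pass to stay (or become) active.
        # Losing the track signal or safe spacing degrades to SAFE_STOP;
        # there is no mode in which a fault leaves the vehicle driving blind.
        if track_signal_ok and spacing_ok:
            self.mode = Mode.TRACK_FOLLOWING
        else:
            self.mode = Mode.SAFE_STOP
        return self.mode
```

The design choice worth noticing is the default: safety is the state you fall into for free, and the active state is the one that must be continuously earned.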
Mr. Bill (Albuquerque)
This big announcement doesn't convince me that they will actually be providing this service in this timeframe. There are serious safety and ethical questions around self-driving cars. Even in aviation, automation is a two-edged sword that enhances safety but also degrades basic airmanship skills, with occasionally disastrous results (Air France 447, Asiana 214). But occasional inattention by pilots flying long distances on autopilot, in controlled airspace, is not particularly unsafe. Operating in the chaotic environment of surface roads, traffic, and random hazards, with a safety driver who is, inevitably, lulled into staring at their phone, is completely different. I will never take an automated car ride.
John (CO)
You have the potential to fix the software on a self-driving car so a particular accident never happens again. The same cannot be said for human drivers.
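John's point is essentially regression testing at fleet scale: once a crash scenario is reconstructed in simulation, every future software build can be required to pass it before deployment. A sketch of that idea; the scenario-replay interface here is entirely hypothetical, not any real vendor's API:

```python
class Result:
    def __init__(self, collision: bool):
        self.collision = collision

class ScenarioSimulator:
    """Stub standing in for a simulator that replays a recorded
    accident scenario against a candidate software build."""
    def run(self, scenario_name: str, build: str) -> Result:
        # A real implementation would re-drive the logged sensor data
        # through the new driving stack and report the outcome.
        return Result(collision=False)

def test_fixed_scenario_never_regresses():
    """John's point as a test: once the software is fixed, this exact
    accident scenario becomes a permanent gate on every future release."""
    result = ScenarioSimulator().run("midblock_pedestrian_at_night", "build-2018.04")
    assert not result.collision

test_fixed_scenario_never_regresses()
```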
Ed Andrews (Los Angeles)
There is a big leap of faith here in expecting the general public to "climb aboard" and pay for a ride in a vehicle with no driver. "No driver?" "No way!" I'll stick with the professional cab driver.
CCC (NoVa)
I've worked in auto insurance claims my entire career, including covering Arizona for a time. The fault for the ped/bicycle accident involving the Uber SD car lies mostly with the ped. You just can't safely cross a street midblock in the dark. If this had not involved an SD car, there would be no story at all, just a sad loss of life resulting from a woman negligently trying to cross a street, midblock, in full darkness. One day we can expect these cars to be far 'smarter' than humans, but we shouldn't fault them today for an accident where we wouldn't blame a human driver.
Jay (Pa)
A million trips per day using 20,000 Jags is 50 trips per car per day. During what hours? Seems wildly optimistic. Or will there be more vehicles than that available to Waymo?
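Jay's arithmetic checks out, and pushing it one step further shows why the hours matter. A back-of-envelope version, where the 16-hour service day is our assumption, not Waymo's figure:

```python
trips_per_day = 1_000_000
fleet_size = 20_000
service_hours = 16  # assumption: a 16-hour operating day, not Waymo's figure

trips_per_car = trips_per_day / fleet_size              # 50 trips per car per day
trips_per_car_per_hour = trips_per_car / service_hours  # ~3.1

print(f"{trips_per_car:.0f} trips/car/day, "
      f"~{trips_per_car_per_hour:.1f} trips/car/hour over {service_hours} hours")
```

At roughly three trips per car per hour, each trip, including repositioning to the next rider, would have to average under 20 minutes with essentially no idle time, which supports Jay's skepticism.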
Dee (Out West)
A recent experience with a self-driving vehicle: while exiting a parking lot to turn right onto a busy 5-lane street, I waited behind a Waymo SD vehicle for several minutes, more than long enough to notice the spinning coil where a trailer hitch would be, in addition to the spinning cylinder on the roof. This is a right turn that I have made many times in the past, usually with trepidation because of constant but spaced traffic coming from the left. This time, though there was no traffic coming, the SD car just sat there with its right-turn signal blinking. What was going on? A malfunction with the traffic camera that would leave everyone in line to turn waiting? And what recourse does an SD car 'pilot' have in such a situation? How can an SD car be perfect when it is designed by imperfect humans?

I suspect the driver was doing something (e.g., texting) that he should have done before entering an active traffic lane. (Grateful that he was not in traffic, but a little consideration, please.) And there's the problem. Self-driving cars will not eliminate the biggest problem on the roads: the self-absorbed drivers who have no regard for their selfish actions' effects on others. If an SD car avoids them, these inattentive drivers will just plow into the next car, with the car behind possibly plowing into the SD car: a "dodge 'em" bumper-car derby. How is that better? The real solution is for the self-absorbed to take public transit, or taxis.
Costantino Volpe (Wrentham Ma)
So the score for killing people with cars is: self-driving cars: 2. Every other moron on the road in a car: 40,000.
Alan Dean Foster (Prescott, Arizona)
As an example, I'd far rather trust my journey to an SDV than to an 85-year-old with slowed reflexes and weakening eyesight. But in the near future I should never have to make that choice, since that same 85-year-old will enjoy a comfortable, safe ride in an SDV.
SmartenUp (US)
Here is the thing: you can be a poor driver at any age. If, as a society, we are truly interested in road safety, we need to retest after the initial license. (Not just "renew" every four years for seniors, instead of six; what does that solve?)

Why not do it on a periodic schedule that is easy to remember: ten-year anniversaries of your very first license? Thus, start at age 16, for example. Then, at age 26, you get another written test, vision test (BTW, a REAL vision test, by a licensed optometrist!), and road test. Same at ages 36, 46, 56, 66, 76, 86, 96, 106, etc., or 10-year anniversaries of whatever age you started. If you fail, you have 30 days to study up, get retrained, or get new eyeglasses so you might pass the tests again.

People can be a menace on the road at any age: drinking/drug problems, arrogance, inexperience, simple lack of knowledge. Expensive, you say? Factor in the cost of hospital bills, rehab, police, ambulance, fire, road workers, etc., not to mention deaths. What is THAT cost? Within a generation or two, the culture behind the wheel would change, and people would stop thinking they could drive *just fine after a few drinks...* and other poor presumptions. If we are truly serious about road safety... and not just grandstanding.
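SmartenUp's schedule is simple enough to state as code. A small sketch of the rule; the function name and the 106-year cutoff are ours, for illustration only:

```python
def retest_years(first_license_age: int, birth_year: int, last_age: int = 106) -> list[int]:
    """Calendar years of the initial license plus every 10-year anniversary:
    ages 16, 26, 36, ... for someone first licensed at 16. The 106-year
    cutoff is arbitrary, matching the comment's example."""
    return [birth_year + age for age in range(first_license_age, last_age + 1, 10)]

# Someone born in 1990 and first licensed at 16 would retest in:
print(retest_years(16, 1990))
# [2006, 2016, 2026, 2036, 2046, 2056, 2066, 2076, 2086, 2096]
```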
Alex (New Orleans)
This article makes such an important point, one that I think is too often overlooked in the public imagination about self-driving cars. Another story in the Times had vivid stats on how far apart these companies are: Uber's self-driving cars needed human help every 13 miles, while Waymo's cars have been able to travel, on average, 5,600 miles between interventions from human operators. That's a truly immense gulf. To the other commenters who are more skeptical of the potential of autonomous vehicles, ask yourself how frequently human drivers obey signs reducing speed limits "when children are present." I see people driving in aggressive, unsafe, distracted ways every single day, and I have no problem imagining that robots will do a much better job in the not-too-distant future. In the meantime, though, we need regulators who can distinguish between the Ubers of the world and the Waymos.
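The gulf Alex cites is easy to quantify from the two reported figures:

```python
uber_miles_per_intervention = 13
waymo_miles_per_intervention = 5_600

ratio = waymo_miles_per_intervention / uber_miles_per_intervention
print(f"Waymo travels roughly {ratio:.0f}x farther between interventions")  # ~431x
```

By this metric, Waymo's cars went more than 400 times farther between interventions, which is exactly the kind of distinction Alex wants regulators to be able to draw.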
Ben (Santa Barbara, Ca)
Alex, I completely agree. Uber's program looks half-baked; Waymo is really leading the pack with this tech. I'm excited to start taking Waymo's self-driving cars and to go down to a single car in our home.
David (California)
There's a traffic sign near my home: "Speed Limit 25 When Children Are Present." Similar signs are everywhere. Can a self-driving car determine "when children are present"? Likewise, can a self-driving car obey hand signals from a traffic cop? These guys are pushing too fast.
Steve L (Chestnut Ridge, NY)
Perhaps self-driving cars wouldn't need to drive 25 mph when children are present.
David (California)
So they get a special exemption from the motor vehicle code? What other laws can they ignore?
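David's question is an engineering problem, not just a rhetorical one. A minimal sketch of how a conditional limit like "when children are present" might be encoded, assuming a hypothetical perception output that estimates pedestrian height; whether perception can reliably make that call is precisely David's objection:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str            # e.g. "pedestrian", "cyclist" (hypothetical labels)
    est_height_m: float  # height estimated by the perception stack

def school_zone_limit_mph(detections: list[Detection]) -> int:
    """Conditional sign logic: drop from the base limit to 25 mph when any
    detected pedestrian is short enough to plausibly be a child. The 1.4 m
    cutoff is an illustrative assumption, and its fragility (a short adult,
    a child carried on a parent's shoulders) is David's point."""
    BASE_LIMIT, REDUCED_LIMIT = 40, 25
    children_present = any(
        d.kind == "pedestrian" and d.est_height_m < 1.4 for d in detections
    )
    return REDUCED_LIMIT if children_present else BASE_LIMIT

print(school_zone_limit_mph([Detection("pedestrian", 1.2)]))  # -> 25
print(school_zone_limit_mph([Detection("pedestrian", 1.8)]))  # -> 40
```

Note that the hard part is not the rule itself, which is a one-line condition, but producing detections trustworthy enough to hang a legal speed limit on.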
RC (MN)
The profit-based concept of having cars driven by glitch-prone and hackable computers is dangerous. Computers cannot be programmed to carry out the life-saving reactions humans perform based on observing the nuanced behavior of other drivers, bicyclists, or pedestrians. Nor can they enable cars to avoid damage from unexpected road hazards. Under laboratory-like conditions they may perform well, but in the real world safety would suffer: for example, when two vehicles are approaching each other at high speed only a few feet apart, in the dark, in snow, in ever-changing road conditions, etc. A false sense of security would ensure that back-up human drivers are unable to perform in an emergency. And people are not going to accept giving control over who will die to tech companies. It is premature to consider allowing "self-driving" cars on public roads. Tax dollars would be better invested in enhancing driver skills.
DaveB (Boston, MA)
How about this as a rejoinder to your comment: "The profit-based concept of having cars driven by glitch-prone and mentally, emotionally, and drug-compromised humans is dangerous. People cannot be programmed to not use drugs, not text while driving, not drive drunk, or to observe the nuanced behavior of other drivers, bicyclists, or pedestrians." Based on the few articles describing the efficiency of Waymo vehicles, if every vehicle were a Waymo, my drive to work each day would be much safer, and so would yours.
JFP (NYC)
I'm surprised how little space has been devoted to the cause and nature of the accident in which a self-driving car killed a person. Understanding it is as important for determining culpability as it would be in any accident involving automobiles. Was the pedestrian reckless or negligent? Did the car in question violate any traffic laws or rules of safe driving? These are very important questions that must be answered if self-driving cars are to be condoned or allowed to proceed in development.
C Taylor (Los Angeles, CA)
I completely agree. It seems as though very few people have watched the video of this particular tragic accident. The pedestrian was not in a crosswalk. It was dark. I didn’t see any lights on the bike, but maybe there were. The pedestrian comes into view so suddenly, I do not know whether I could’ve stopped the car in time if I had been driving. It is a more complicated scenario than self-driving cars are “bad,” pedestrians are “good.” The final results of the investigation have not yet been released, so I guess we’ll see.