Lessons From the Tesla Crash

Jul 11, 2016 · 184 comments
Bill W. (San Marcos, CA)
For decades now, we have been mandating technology that makes crashes more survivable (e.g., seat belts, air bags, crumple zones, side-impact beams). We are only just now beginning to see the deployment of technology that makes crashing less likely (and Tesla's Autopilot is among them, with the number of deaths per million miles traveled well below the average for all vehicles).

But with around 30,000 people dying in car crashes every year, why have we continued to fail to do the most obvious thing to reduce the number of crashes? I'm talking about requiring better driver training. We have the most lax driver licensing of any OECD country. Per a recent CDC report, compared with 19 other high-income countries, the United States had the most motor vehicle crash deaths per 100,000 population and per 10,000 registered vehicles; the second highest percentage of deaths related to alcohol impairment; the third lowest national front seat belt use; and the lowest percentage decline in the rate of motor vehicle crash deaths between 2000 and 2013 (a 31% decline in the US, compared to an average of 56% for all 19 countries). It ranks fifth in deaths per 100 million miles traveled.

Anybody who has spent time on American roads knows that a significant fraction of our drivers are simply terrible at it. We need better training, stricter licensing, and better enforcement.
vulcanalex (Tennessee)
The lesson is that these systems are not "autopilots," are not self-driving cars, and are not described as such. You are the driver; these features are meant to assist you, not replace you. No regulation will force you to actually drive rather than just letting the car drive. Perhaps mandatory, intensive education that you are still expected to drive these cars might help, though I doubt it. These technologies will prevent far more accidents than they will cause through deficiencies like the one here.
John Brown (Idaho)
We embrace new technology too quickly and too fully.

I remember my first flight, age 6, on an airplane and wondering where my parachute was in case we had to abandon the aircraft.
I was shown the little floating pad that was also my seat cushion.

My reply of "A Fat Lot of Good that is Going to Do if we are going down
at 35,000 feet over Kansas!"

got me a reprimand from my Mother.

Why anyone drives over 35 mph baffles me save that you will be rear-ended
and honked at and ticketed if you don't go the speed limit.

When I drive in a city or suburb I always look out for pedestrians who might step into the street, children or animals who might dart out, cars backing into the street, people opening their driver's doors, etc. I don't see how any computer can scan the street and catch not only all those possibilities but also the look on the faces of the people whose sudden movements may cause an accident.

Perhaps the "Automatic Driving Ability" should be shut off in cities and suburbs
and other areas where traffic is too dense and too many people are present to
allow the computer to carry out its functions properly.
William Stewart (Charlotte, NC)
Human drivers: 1.9 fatalities per 100 million miles driven
Tesla Auto-pilot: 0.8 fatalities per 100 million miles driven

What exactly is everyone talking about when they state with firm certainty the supremacy of a human pilot?

Lastly, you can't only compare auto-pilot to an alert, experienced, sober driver. You have to weigh the benefits and risks against ALL drivers, many of whom are drunk, on their smartphones, in a fit of road rage, or hunting Pokémon.
Alaink (Princeton NJ)
A very good editorial except for a few significant lapses:
1. This crash is NOT about the failure of the Tesla’s AutoPilot, but its underlying Automated Emergency Braking (AEB) system. Maybe the AEB system can’t discern a white trailer, but there is no excuse for it to fail to discern the tractor that led the trailer across the highway. Why didn’t the Tesla’s AEB begin braking when the tractor crossed its lane?
2. If the issue is driver distraction, why isn’t there more discussion about the possible distraction of the truck driver? Did he see the Tesla? Why didn’t he yield?
3. V2V can help in a two-vehicle crash only if both vehicles are so equipped (and have it turned on and operating). Because the chance that both are equipped is the product of the individual probabilities, roughly 70% of all operating vehicles would need to be equipped before that chance beats a coin flip. Even starting today, by equipping some new cars, then all new cars, and pursuing a compelling aftermarket strategy, it will take many years to get there, with little gain along the way.
4. Relying on "automakers" to minimize driver distraction is challenging, because they are the ones promoting CarPlay, WiFi, big screens and other "travel-tainment" equipment to help sell the cars they make.
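The 70% figure in point 3 falls out of a simple probability product; a minimal sketch of the arithmetic, assuming V2V equipage is independent and uniform across the fleet:

```python
def both_equipped_prob(penetration: float) -> float:
    """Chance that both vehicles in a random two-vehicle crash
    carry working V2V, assuming independent, uniform equipage."""
    return penetration ** 2

# Sweep fleet penetration until a two-vehicle crash has better
# than a coin-flip chance of involving two V2V-capable vehicles.
p = 0.0
while both_equipped_prob(p) <= 0.5:
    p += 0.001

print(round(p, 2))  # roughly 0.71, i.e. ~70% of the fleet
```

Until penetration clears that threshold, most two-vehicle crashes involve at least one unequipped car, which is the "little gain along the way."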
Carol (Los Angeles)
Tesla immediately noted that these cars had over 100,000,000 miles on them, and this was the first fatality. If you look at "normal" car statistics, the fatality rate is about 1.08 per 100,000,000 miles (for 2014). Sounds like all the calls for testing, regulations, etc., are unfounded, as these cars are no different from traditional cars.
So, is there any reason to have them? I can think of many advantages of a "self-driving car", and if there is no added danger, why not?
Dean Fox (California)
If they ever become a reality, "self-driving" cars will be a goldmine for the lawyers, and a textbook example of hype and hubris over common sense.
basauer (Kentucky)
Driverless vehicles are not far off now and will have a huge impact on employment. The transportation job sector is #1 in this country and it will see massive unemployment within the next decade.
Distracted driving is a growing problem, but so is poor to non-existent driver training. It's no longer taught in high school for most students. My state, Kentucky, has not required a written test to issue a driver's license in the 14 years I have lived here. I would advocate requiring passage of a written exam for every renewal, regardless of age. The test should be tough enough that if you didn't re-read the manual (do they even exist any more?) before taking the exam, you probably wouldn't pass. This wouldn't have to be prohibitively expensive. In times past, we considered it worth the expense to have drivers who knew the rules of the road.
Marc LaPine (Cottage Grove, OR)
With over 260 million cars registered in the US, the death toll on our highways rising, driver distraction growing (primarily from cell phone talking and texting), and little enforcement against speeding and distraction, it seems inane to develop a technology that even in the testing phase is previewing greater driver distraction! Where is this headed? If all a person has to do is ride, why not take public transportation wherever possible? Insanity in this case is defined as 260 million automatically driven cars on the road, each with its own CO2-belching engine, when global warming is threatening our continued existence.
The old "joy of driving" on the open road is gone. There are too many trucks and cars for our limited road space. Open road is limited to areas of the plains and the west.
Occupy Government (Oakland)
good lord. i hadn't realized it before, but Tesla owners have become the new Beemer drivers. spare us all.
Eben Spinoza (SF)
Don't name something "autopilot" that isn't really an autopilot. Robots don't kill people. Hype kills people.
Beartooth Bronsky (Collingswood, NJ)
Let's see. In auto deaths this year, the score is 35,200 to 1 between human drivers and auto-driving cars. We will never (or not for a long time) develop an auto-driving system that is 100% accident- or fatality-free. But if we merely develop one that is safer than human drivers, we will still have come a long way. Let's not read too much into one accident in an admittedly beta-test assisted-driving (NOT auto-driving) car whose driver had been warned to keep hands on or near the wheel to take over manual control if needed, but was immersed in Pokémon instead.

Most Americans are statistically illiterate, but are overly affected by any anecdotal story (like Reagan's "Welfare Cadillac Queens"). Media reports have to go to considerable lengths to put their stories into perspective.

We see this all the time in news reports like "Eating carrots will double your chances of catching Fulminating Bilious Keratoma!!!" What they invariably lack is the baseline statistic needed to make sense of the sensationalist claim. If 1 in 10 people are susceptible to Fulminating Bilious Keratoma and eating carrots increases your chances to 2 in 10, I'd stay the heck away from carrots. But if only 1 in 100,000 are susceptible and eating carrots doubles that risk to 2 in 100,000, er, hand me that carrot, please. The baseline, the total incidence the percentage change is compared to, is almost never part of the story.
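The carrot example is the standard relative-versus-absolute-risk point; a minimal sketch using the comment's hypothetical numbers:

```python
def absolute_increase(baseline: float, relative_risk: float) -> float:
    """Extra cases per person implied by a relative risk on a given baseline."""
    return baseline * (relative_risk - 1)

# "Doubling" the risk (relative risk = 2) means very different things
# depending on the baseline incidence.
common = absolute_increase(1 / 10, 2.0)       # 1 extra case per 10 people
rare = absolute_increase(1 / 100_000, 2.0)    # 1 extra case per 100,000 people

print(common)  # 0.1
print(rare)    # 1e-05
```

Same headline ("doubles your risk!"), a 10,000-fold difference in actual added risk; that is exactly the missing-baseline problem the comment describes.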
SpartanFan (Carlisle, PA)
This country has a $189 million backlog in needed federal highway repairs. Until that is addressed, either through raising gas taxes or more tolls, attempting to integrate automated cars to any large degree is a waste of time and money. Secondly, will Jay Leno and the growing hordes of classic car collectors be willing to share the road and insurance costs with the Jetsons?
Finally, we already have driverless cars--they are called trains.
Andy W (Chicago, Il)
I have yet to see an article that fully explains the details surrounding this accident. Was it a red light? Were the Tesla and its driver doing anything illegal? Was the truck making a proper turn? How much warning did the Tesla and its driver have (how sudden was the truck driver's maneuver)? There is an awful lot of hype and pure speculation around this incident. The fact that this driver left a set of videos behind showing a bit of overzealous behavior doesn't help. Autonomous driving technology does eventually need conformity, consistency and oversight. The agencies that need to regulate it must also be properly funded and expertly staffed. Beyond all the hype and speculation, people also need to remember where the most danger on our highways is really created. Eighty mile-per-hour speed limits and four-hundred horsepower engines are responsible for far more deaths in a single year, than automated vehicles are likely to be in an entire century.
Lisa Rivera (Pittsburgh)
I looked at a Tesla. The salesperson told me I had to keep an eye on the road and my hands on the wheel even if the car was in autopilot. I want a fully autopilot car when I'm too old to drive myself, so get off the moral high horse and push all car makers to make that dream come true. Decades ago, the U.S. decided to let dangerous individual drivers pilot their own cars. Let's move toward mass transport or autopilot now and reverse that stupid decision.
JK (BOS)
Some people claim that Tesla drivers who don't use this feature as intended will be to blame when the system fails to keep them safe. This is partly true--but what is the intended use of this feature, anyway? Apparently, it's to drive the car and pay attention to its surroundings, while the driver drives the car and pays attention to its surroundings. It's marketed as a fallback for when a driver fails to react, but the driver is required to be the fallback for when the system fails to react.

If it wasn't intended to be used as an "autopilot" function, then it shouldn't have been named so; if it wasn't intended to allow drivers to abdicate control of their vehicle, it shouldn't have been designed to do exactly that. Tesla bears responsibility for selling a product that is not ready for the road, and that they knew (beyond any doubt) would be widely misused.
1420.405751786 MHz (everywhere)
when your computer sftwre crashes, you reboot

when your autocar sftwre crashes, you get decapitated

see th difference ?
Mary (Atlanta, GA)
"The National Highway Traffic Safety Administration should study how automakers can minimize driver distraction. This will become more urgent as advanced systems become available in cars made for the mass market."

We want to spend government tax dollars to study how automakers can minimize driver distraction? Now those that make cars are somehow to minimize distraction? If someone is putting make-up on, what would you suggest? Use the rear view mirror, and if someone is holding mascara, pull the car over to the side of the road? If they're on the phone and Bluetooth is detected, pull over to the side of the road? If they are texting, pull over to the side of the road?

What in the world are we proposing? We have tons of regulations today, but guess what? You can't regulate good behavior, only against bad behavior. I see people texting all the time when I'm driving, and that's already against the law. How can we really expect manufacturers to eliminate all risk?
1420.405751786 MHz (everywhere)
what happens when th autopilot doesnt recognize a troupe of girl scouts crossing th street
Steve Donato (Ben Lomond, CA)
If you want to talk seriously about decreasing driver distraction, how about making cars with windows that block cell phone transmissions? Such windows are in use on trains in the UK. But talk like this will get all the cell phone companies, and others, up in arms about decreasing their profits, so we're back to square one in this radically capitalist system of ours: what's more important, human life or dollars?
Stage 12 (Long Island)
It's good and necessary that the NHTSA establish guidelines for driverless cars. HOWEVER, I hope they do it fast before republicans start whining about government interference and then try to de-fund the agency.
Joe (Vegas)
"These risks, however, could be minimized with better testing and regulations." One would think that such testing and regulations were already in place. Musk's capricious business enterprise has placed "Green" stardom and business needs ahead of safety.
mjohns (Bay Area CA)
I remember using the "auto-pilot" of my generation (speed control) as a game while still a teenager. Set the speed at 75 on a two lane winding road through hilly terrain and see how long you can go without touching the accelerator or brakes.
While "auto-pilots" are much better today, brains are not (well, most anyway; I have not used any speed control in decades, but I do leave the ABS and traction controls enabled and leave the cell phone alone, except for Google Maps voice driving directions). Any driver's aid will be treated as a game at least some of the time by a significant fraction of the owners of such features. Thinking otherwise will generate more unsafe driving. Yes, the self-driving controls on a Tesla will lead to better driving, but only if the driver is actively engaged in the task of driving.
Tesla needs to focus on keeping the driver and passengers safe. Paradoxically, this means disabling, or threatening to disable, the automation when driver inattention is detected. I suggest that any in-car cell phone use disable the automation after a (loud) verbal and visual warning, and the same for no weight in the driver's seat and other indications of driver inattention. The car may need to read speed limit signs (and know state laws, based on location) and disconnect if the requested speed is too high. Perhaps a gentle request to move the wheel or touch the accelerator intermittently (and fairly frequently) would also be required.
casual observer (Los angeles)
"...It’s not surprising that technology that helps drivers can lull them into thinking they need not pay attention at all. Chris Urmson, who heads Google’s driverless car project, said in a TED talk last year that when his company tested a driver assistance system some drivers became so dangerously distracted that Google pulled back on that concept. It has decided to focus its efforts on fully self-driving cars instead..."

This statement reveals how little Google understands the problems of self-driving vehicles operating in an environment that is well known to be unpredictable. Hubris and superficial consideration of the subject are controlling the decision-making process at Google. The technical challenge, and the notion of providing the means, at a profit, for automobiles to operate without human drivers, have mesmerized them into failing to consider that such systems must know everything in advance to work well. Every driver on a road or highway is uncertain about all that might occur during any particular journey and all that might be needed to safely reach the destination. That is why we have accidents and why we will always have accidents on roadways.
Objective Opinion (NYC)
Interesting. I don't think I'll ever use the driverless feature of an automobile....if I ever happen to own one with the feature.

I believe driverless cars are yet another reason for people to stare at their iPhones and tablets.

We're creating a society of people who are fixated on their devices... As for Mr. Brown, he can keep watching from up in heaven.
Kathleen (Anywhere)
Funny, I thought the NYT Picks were supposed to reflect the spectrum of viewpoints, not simply advocate the viewpoint that the NYT has clearly taken, as noted by other commenters, in previous pieces. Is there a conflict of interest to be disclosed?
Kathleen (Anywhere)
Automation will save lives, and that is reason enough to encourage and support it, but it will ultimately also spare so many the pain of severe and possibly life-altering injuries as are now incurred in car crashes. In addition, it will save an enormous amount of money, as so much is now spent on liability awards, medical bill payments, and car repair/replacement. Even those who think they could never be involved in a crash will benefit, as auto insurance rates will decline dramatically.

Beyond that, think of being able to remain in one's own home long after one's driving ability declines, knowing that you could go out anytime you pleased courtesy of your robot driver, or of having an auto-driven car pick up your children at school or from the bus stop, delivering them safely home. Or of increased speed of deliveries, since auto-piloted trucks will not need the down time now mandated for driver rest. Or of fuel savings, since vehicles that will not crash except in the most extreme of circumstances will not have to weigh tons. Or of decreased automobile-related expenses, since car ownership may no longer be necessary, once we get to the point at which a self-driving car may be summoned just as easily as Uber cars are in many places today. This will happen, of course, only if we insist that the selfish, i.e., those who care nothing about others' safety, instead preferring to pilot their own vehicles for the pure enjoyment of that power, relinquish their "right" to drive.
Greg (Houston, TX)
How many people died last week, or even just yesterday, in fatal crashes involving regular vehicles? I'm sure it's many more than one outlying event in a Tesla. I'd say autopilot and self-driving cars are a step on the road toward fatality-free (human-control-free) roadways. Although there will be errors and missteps along the way, the goal and presumed end result are well worth the risk, I believe.
PK2NYT (Sacramento, CA)
Mr. Musk’s aggressive marketing of Tesla has helped create awareness and interest in electric cars, and Tesla has also provided ample proof that electric cars are commercial, roadworthy products and not science experiments. Yet calling Tesla’s aggregation of several oft-desired and sorely needed safety functions “autopilot” is marketing mendacity. Unfortunately, an otherwise well-designed car will get a black eye, and the negative image may spill over onto electric cars and the advanced safety functions of all cars. Mr. Musk needs to put the brakes on over-hyping a product when people take his words at face value. At this point the average Tesla buyer is financially above average, but let us not assume the same is true in the area of common sense. The National Highway Traffic Safety Administration needs to regulate what can be labeled “autopilot.”
Blue state (Here)
We do have a National Transportation Safety Board. We have quite a number of test states, and state regulations. The plain fact is that drivers now cause more accidents than even the introduction of self driving cars into a mixed driver environment will. Let us not delay; that is the lesson of the lone Tesla crash fatality.
M. (California)
Does anyone seriously believe that in, say, twenty or thirty years' time, most cars will still be driven by humans? The writing is on the wall, folks.

Take a step back and imagine how this transition might look, and for the early phase you'll see something rather like what we're seeing now. The first "self-driving" systems are more of an augmented cruise control, still requiring driver attention. There will be occasional crashes as the technology is refined and as humans let their attention slip more than they should. These accidents will occur at a lower rate than when humans drive themselves, but will cause all manner of consternation in the press. No matter. The systems will gradually get better and better until, at some point, driver controls are phased out completely. There will be less and less need for individuals to own cars, and people will spend much less time in traffic as the machines work together more efficiently.

And, of course, the new systems will be much safer for everyone.

We'll get there, but it's going to take a little while.
casual observer (Los angeles)
I do not know why both the driving software and the human driver failed to stop the car. What I do know from studying computer science is that artificial intelligence has never been achieved using digital computer systems, and nobody knows how to build such a system. There are many excellent expert systems that have been in use for decades, but they are very limited in the things they address and the ways they address them. There have been interesting experiments in systems that can learn, but the ability to be dropped into a situation without preparation covering all possibilities, as humans are all the time, is beyond science fiction; it's purely magical thinking. There should be no self-driving vehicles on the roads used by everybody; it's reckless and will result in a lot of harm.
atozdbf (Bronx)
This op-ed reminds me of the punch line of the old joke about automated systems: "This programming and redundant controls make this system totally free of being affected by human error, human error, human error, human error . . . ."
elpdriver (Minneapolis)
Has anyone commenting actually ever been in a Tesla with Autodrive? I recently was, with my son driving (he is an engineer there). He showed us exactly how it works and what the limitations are. On a highway with well-marked lines you can relax a bit, as the car keeps you at a proper distance behind the vehicle in front of you and senses the lines on the pavement. Going between Autodrive and manual is seamless. It is perfect for a long highway trip, just as cruise control relieves you of some of the pressure of driving (true, you would never completely rely on cruise control to keep you at a proper distance from the vehicle in front of you, and you are always monitoring it, but it does help with the tedium of a long trip). However, you NEVER take your hands off the wheel; if you do, you get a gentle beeping reminder to put them back. Having experienced this for myself, I can see how useful this technology is; really, it is an extension of cruise control in a way. This accident certainly sounds like human error, with the driver relying on the technology beyond what it is designed to do.
Nico (San Francisco, CA)
While this is true, there are currently ways of bypassing this system, and Tesla engineers will be forced to tighten up their monitoring of a driver's attentiveness, as well as improve their traffic detection systems.

Having those monitoring systems enabled with beeps and such is fine and well -- they work as designed when used within their parameters -- but when you market to the general consumer (in this case a driver who has only had one road test several years/decades ago, and virtually zero ongoing training), you are expected to design systems that enforce certain required behavior, especially where it pertains to critical safety. Human factors engineering is wholly dedicated to the ways in which the interface between humans and machines is weak and breaks down.

I am sure everyone over at Tesla means well, but their systems and approach so far aren't exactly "safety first." The need for speed that comes directly from their target market is driving safety further down the list of priorities.
lastcard jb (westport ct)
So you are saying that as long as you pay attention and drive the car, it won't get in an accident. Great, so basically you drive the car... and use cruise control, like on a Volvo or another car with distance control. What a farce.
Andrew (Colesville, MD)
I'd say it is cruise control plus, and Tesla should say so without using the misleading term "autopilot." A full-scale autopilot will have to wait until the system has gone through several more iterations of improvement. Self-driving should not be confused with assisted driving based on an improved cruise-control extension.
Tommy (Clovis, CA)
So, a truck driver pulls in front of a vehicle that has auto assist, and that vehicle's system is unable to avoid an imminent collision. You really don't know whether the driver was engaged in driving the vehicle, yet you assert that somehow he and/or his vehicle were factors in the accident. Then a Times reader is subjected to many articles by your staff writers and your editorial board in which they wring their hands about the safety of a Tesla (new technology). I can tell you as the owner of a Tesla that your articles about its safety are misguided. It is most probably, if not certainly, one of the safest vehicles on the planet.

Please tell me why you continue to extol the above position when you know full well that Tesla owners are quite capable of understanding the autopilot system, and that to ignore driving a vehicle properly is to invite an accident that could result in a fatality. Perhaps you forgot that the Tesla is an all-electric vehicle forged to help save the environment by using cleaner, alternative energy sources other than gasoline. I know you want to sell newspapers, but please refrain from making something out of nothing!
lastcard jb (westport ct)
Yeah, gas, oil and coal generated electricity are fantastic... oh, it's clean because I plug it in... Who made the tires? How about the resins in the body and frame? One man's poison...
Bill Delamain (San Francisco)
If you buy a knife and stab yourself, you'll get injured, but would you say the knife's manufacturer is at fault?

There are always people who do not care for instructions and misuse equipment. I think those people should be reprimanded, not Tesla.
lastcard jb (westport ct)
So if you have to stay attentive with your hands on the wheel ready for any emergency or incident, tell me, what does self driving mean?
David (California)
By far the most dangerous technology on the road today is the smartphone. If you care about traffic deaths then deal with that. Obviously passing a law banning phone use by drivers doesn't work and is not enough. It is technically possible to disable phone use by drivers.
Beartooth Bronsky (Collingswood, NJ)
By far, the most dangerous factors on the road today are drinking drivers, inattentive drivers, and distracted drivers (smartphones are only the most visible form of distraction, so we focus on them). The son of a friend of mine smashed up his very expensive BMW 7 Series because he took his eyes off the road to get a CD out of a holder and insert it into the player. Teens have twice the rate of accidents when they have a friend in the passenger seat than when driving alone. That figure goes up dramatically with more friends in the back seat. I know another friend whose son had a fender bender in slow traffic when he got distracted looking at a pretty girl on the sidewalk.

Full attention, intelligent safe driving, and effective defensive driving techniques are required at all times. Smartphones are only one of a myriad of distractions that can cause accidents.
David (California)
The Tesla autopilot is safer than human drivers. Period. It also makes driving a lot easier and less tiring. It is not perfect, and any reasonable driver would realize that after a short while, meaning you'd be a fool not to pay attention while hurtling down the highway at 60 mph. Yes, there are lots of fools out there, the same type of people who text and drive. But to condemn the Tesla because of foolish drivers is silly, as is condemning Tesla for enabling foolishness.

Unlike 99% of the people who have opinions on the matter, I actually own a Tesla and have considerable real world experience with autopilot.
Ryan Bingham (Up there)
No one wants a self-driven car. Are you going to find one to pull a boat or a camper? Think we'll end up with self-driving pick-ups or motorcycles? This is the biggest stock price fraud since 1929.
NaJaCar (New York)
We own a Tesla S (the sedan) as our only car. We absolutely love our Tesla; it is a thrill to drive, incredibly roomy and versatile. We hope the company thrives because the product is truly excellent. My husband was very nervous buying it due to range anxiety, articles about glitches and all the negative press on Tesla. The car has run perfectly with no issues. We have taken long road trips that somehow fit 4 adults (including two mothers-in-law), 2 kids, our pet and luggage (we have the jump seat in the back, which accommodates 7 people in the sedan). We feel very safe in the car, and we completely understand that Autopilot, when engaged, still requires two hands on the wheel and the driver paying attention. The car warns you of this when it is engaged. When used properly, the autopilot makes an already incredibly safe car even safer.
Hilary Hopkins (Cambridge MA)
It appears that the Editorial Board has adopted the very optimistic view of autonomous cars that the technology world has fostered. I suggest you read the excellent column by Lee Gomes in this very issue, "Silicon Valley-Driven Hype for Self-Driving Cars," or Steven Shladover's article in the June Scientific American, "The Truth About 'Self-Driving' Cars." Both note that off the Interstate System, the challenge of dealing with the enormous variety of potentially hazardous situations in normal urban and rural driving is far greater than generally recognized. Both articles note the incredible, very often understated or even unrecognized, challenges in normal driving that humans generally handle very effectively: potholes, roads under construction, misleading markings or signage, or unexpected maneuvers by other cars.

Actually, humans are very good drivers, and automobile safety has improved dramatically. Do the math. Americans account for about 3 trillion vehicle-miles per year, yielding about 30,000 deaths: 1 death per 100 million miles of driving, about 200 times what the average person will drive in a lifetime. The chance of merely being in a reportable accident, which may or may not involve injury, is only about 50% in the same period. (The fatality rate per vehicle mile traveled is now one seventh of what it was in 1950.)
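The comment's arithmetic can be checked directly; a quick sketch using its round numbers (the lifetime-miles figure below is an assumption for illustration, not from the comment):

```python
# Figures from the comment: ~3 trillion vehicle-miles and
# ~30,000 deaths per year in the US.
vehicle_miles_per_year = 3e12
deaths_per_year = 30_000

deaths_per_100m_miles = deaths_per_year / (vehicle_miles_per_year / 1e8)
print(deaths_per_100m_miles)  # 1.0

# Assumed figures: ~13,000 miles/year over ~50 years of driving
# gives a lifetime total of ~650,000 miles.
lifetime_miles = 13_000 * 50
lifetimes_per_fatality = 1e8 / lifetime_miles
print(round(lifetimes_per_fatality))  # ~154, in the comment's "about 200" ballpark
```

This is also the baseline against which the 0.8-per-100-million Autopilot figure quoted earlier in the thread would have to be judged, given enough miles for the comparison to be statistically meaningful.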
Rob (East Bay, CA)
The only reason I can see is economic. Ultimately, no drivers means you don't have to pay them. Why else would anyone want a driverless car, or truck?
Parker O'Brien (Augusta, Maine)
I'm not sure why it's not being discussed, but the connectedness of autonomous vehicles will be the key to their safety and success. The crazy "trolley problem" hypotheticals that get thrown around in every article about AVs just won't exist. A car doesn't exist in a vacuum -- there are other cars next to it, behind it, across the intersection. When these cars are equipped with V2V (vehicle to vehicle) technology, they will be able to communicate all kinds of data, ten times a second. This includes speed, heading, foreign objects, etc. Couple this with V2I (vehicle to infrastructure) traffic signals and even V2P (vehicle to pedestrian) connected mobile phones, and a person won't even be able to purposely run out into an intersection without all the surrounding vehicles already knowing and preparing for it.

A human can't even see what's on both sides of the vehicle at the same time. A fleet of connected and autonomous vehicles will know the position, speed, direction, state, etc. of every other surrounding vehicle and traffic signal at all times.
Beartooth Bronsky (Collingswood, NJ)
Most airliners have a collision avoidance system (ACAS/TCAS) that warns converging airplanes of an imminent collision. The systems coordinate with each other to negotiate which pilot should be told to climb and which to descend.

True auto-drive cars DO, as Parker O'Brien states, need some way to coordinate with each other. One day in the near future, there will be RFID (radio-frequency ID) chips in all roadway lanes and receivers in all of our cars to maintain not only vehicle-to-vehicle communication, but coordination across the entire highway grid. Systems like this are already in use at many railroad companies (unfortunately, the U.S. is lagging behind other countries in installing them).
Tom (Earth)
The Tesla that crashed in Florida had one fatal defect: it needed a new head for the operator.
Chris Carmichael (Alabama)
This is the old nitrates-in-meat problem. Nitrates and nitrites are known to cause cancer. But knowing that, they are still added to processed meats to retard spoilage. A few people die each year from attributable cancer, but many, many more are saved who don't die from spoiled meat.

Perfect is the enemy of excellent. If Tesla waits on perfect, nothing will ever happen. A sense of scale is necessary. The outrage over one death is being pushed while the hundreds of deaths and injuries from defective airbags continues to grow.

And the "pundits" never seem to want to talk about the missing parts of the puzzle. Our infrastructure (which includes such things as keeping the highway stripes visible) is in shambles. And car-to-car communication has not yet happened. But part of making autonomous or assisted driving work will be adapting our infrastructure by keeping it in good repair and even improving markings to make them easier to identify. For example, in the crash in Florida, maybe it would be a good idea to require tractor-trailers to have a distinctive pattern on the side to help identify them.
RWilsker (Boston)
Such panic from the press. We like to knock down anyone who's done something innovative as soon as they have the least stumble.

I expect most Tesla drivers, like me, use the driver assist to help with their driving. It watches the lane I'm in and surrounding traffic, patiently moderates my speed based on what's happening around me, and makes sure I'm paying attention. Tesla makes it abundantly clear that I'm responsible for what happens.

It's done nothing but make my driving safer.

Are there Tesla drivers who don't pay enough attention? Sure. Are they more dangerous than the people I see every day driving while putting on their eyeliner, eating breakfast, reading the newspaper, texting their buddies, etc., all in cars without any automation? Not even close.

People are walking away from accidents in Teslas that they would have died in with other cars. (It's built like a tank.) Will a Tesla guard against any possible accident? Of course not. If a truck suddenly cuts in front of you and shears off the passenger canopy, you're not likely to walk away. But, in fact, there have been multiple videos of Teslas reacting to such situations and protecting their drivers. (Funny how that doesn't show up in these articles. I guess it doesn't attract as many clicks.)

Assisted driving is here to stay. And manufacturers like Tesla are learning from every recorded drive and constantly improving the assisted driving capabilities.
Aniket Saha (Greenville, SC)
A careless driver who doesn't follow Tesla's very clear instructions is not the responsibility of the company. The Times has had it in for Tesla ever since its 'review' of the Model S a few years ago. Get over it.
RC (MN)
The profit-based concept of letting crash-prone computers drive cars under real-world conditions (road hazards, snow, rain, darkness, etc.) is absurd. Even when they aren't malfunctioning, computers will not detect and interpret the subtle behavioral cues that humans routinely use to avoid accidents. And why should humans let companies that have circumscribed their ability to drive defensively decide whether they or their families will be selected to die in an avoidable accident?
David (California)
Crash prone? That may fit your narrative but it has no support in reality.
Dave (Cleveland)
Your argument is based on a major mistake, namely the idea that humans are good drivers. As thousands of people learn very painfully each year, we aren't.

Some of the big advantages of computers over human drivers:
- Computers don't get drunk.
- Computers don't get sleepy.
- Computers don't send text messages while driving.
- Computers don't get distracted by the cute person driving the car next to them.
- Computers don't get road rage.
- Computers don't try to drive when their eyesight and reaction time isn't what it used to be.
And so forth.

My standard is very simply accident damage to people, followed by accident damage to equipment, per mile: If computers are better, then we should use them. If humans are better, then we shouldn't. There will probably be a couple of decades when we'll have plenty of both computer-controlled and human-controlled vehicles on the road, but it sure would be nice for driving to consist of entering an address into a GPS and then sitting back with a cup of coffee.
Brez (West Palm Beach)
I am a retired airline pilot. The first commercial aircraft autoland was available in 1968. It has been vastly improved since then, to the point where it can now safely autoland in conditions of zero ceiling and near-zero visibility. Nonetheless, there are limits more restrictive than those that apply to good-weather landings. For example, most commercial aircraft have a crosswind limit of around 30 knots, depending on the aircraft type, and a tailwind limit of 10 knots. In an autoland operation, the crosswind limit is reduced to 10 knots (some newer aircraft are slightly higher), because the incredibly sophisticated and expensive high-tech equipment can't figure out gusts and crosswinds as well as an experienced pilot.

Even after over 40 years of development, an autolanding is one of the highest alertness situations in the cockpit, both pilots monitoring the instruments, ready to punch off the autopilot and either go around or continue the approach depending on conditions. A nose down hardover at 200 feet would require instantaneous reaction to avert a disaster. I assure you, no one is chatting or texting on their cellphone.

So, on these brand new autodrive cars, we have a bunch of idiots (not all, but enough) doing all manner of stupid actions in disregard of the label warning them to PAY ATTENTION and then crashing. Darwin strikes again!

Autodrive will never be viable (literally) as long as the average IQ stays stuck at 100.
David (California)
This is not 1968. Before you condemn something you should try it. But yes, always pay attention.
Brez (West Palm Beach)
Sigh... My point was that autoland has been around a looong time, not that this is 1968, but still requires constant attention from the operator. Based on your reply, you should not try it. Darwin and such.
SusanK (San Francisco)
I and another driver were nearly hit by a self-driving car this past week down in Silicon Valley. The car had someone behind the wheel - which seems to be about the only rule right now for them - but clearly her role was to do nothing but stare straight ahead and ignore all the other cars honking at her. The self-driving car was going way under the speed limit and then cut across several lanes. It was up to us "human" drivers to anticipate and avoid this car.

Yes, there are many ways for autos to improve but to allow auto manufacturers to bring these technologies to city streets and highways at the current state they're in - or to act as Tesla does and claim no responsibility when they throw these new capabilities out there - is irresponsible and a recipe for more accidents.
David (California)
How do you know the Tesla was in autopilot mode? From the behavior of the car it probably wasn't. The Tesla will not cut across several lanes when in autopilot. Probably just another driver reading her text messages instead of paying attention.
SusanK (San Francisco)
It wasn't a Tesla, nor did I say it was. After all, according to Tesla, they don't make a self-driving car. The point is that this technology is being put out on the open road before its time.
RWilsker (Boston)
So you know this was a self-driving car, how? You know the driver didn't, as many people do, have her hands in her lap while holding the wheel?

And Tesla did none of what you accuse them of. What they did do is to chide the press for acting as if one unusual accident indicated some terrible issue with the driver assist software. It doesn't - one does not equal many.

As they have always done - I know, since I get updates to the software all the time - they will analyze and learn from the occurrence and improve the software. And they will reinforce their message that this is a driver assist function, not auto drive, and that the driver is responsible for paying attention and being ready to take over at any point.
New Haven CT (New Haven)
The weak link in assisted-driving systems unfortunately is the human. As Google has noted, most of the crashes of their test cars have been caused by humans. A slow transition with a mix of human/computer control may be asking for trouble. The computer wants to take over most control but you still want the driver to pay attention? That's not going to work. Systems that allow cars to transmit their location and talk to other cars will be a big step in the right direction. When the transition to automation comes (and it's coming faster than you think), it will need to be "all in". Humans are not reliable enough to drive without creating carnage.
Terry McKenna (Dover, N.J.)
Actually, humans are very reliable. Considering the variability in humans, if we took out all the drunk/high drivers and inexperienced kids, we would show excellent safety in the US considering days x drivers x miles.

That does not mean we should not do more with technology but let's understand that the data suggests cars and adult drivers are very safe.
David (California)
The comparison made between Google and Tesla is apples v oranges. Indeed, Google has an axe to grind having made its choice.
wlieu (dallas)
Think about the tens of billions of decisions and actions that we, as drivers, do everyday that do *not* result in crashes. If and only if it can be proven that the automation can match (let alone exceed) those skills can we state that driverless cars could help reduce traffic deaths.
David (California)
Think about the high percentage of human drivers who hurtle down the freeway at 70 mph reading their text messages. Computers don't get distracted, and have instantaneous reaction times. I love my autopilot but I know it has limitations and I need to pay attention.
David Gregory (Deep Red South)
All the Tesla crash shows is that cars cannot be made foolproof.

When you are behind the wheel you are supposed to drive the car- hands on the wheel, eyes on the road, paying attention.

It is sad when anyone dies, but Tesla never said you could watch a video instead of drive.
Marty (Milwaukee)
From my days in engineering, I remember a basic rule: "Computers are stupid." They cannot do anything they are not programmed to do. It is impossible to imagine being able to program a computer, no matter how sophisticated its sensors may be, to recognize every possible situation that you may run into on the road. Maybe the Tesla system would have worked fine in a northern state, but was incapable of recognizing the situation under the very different lighting conditions in the Florida sun. A human would have been much harder to fool, had he been paying attention.
Steve (Arlington, VA)
It's hard to drive a mile without seeing a driver gazing into a cell phone. So already the relevant question is whether, for the average person, driving using a system like Tesla's is safer than driving without it. Unfortunately, gathering the data to answer that question is going to take a long time.
David (California)
The data gathered to date, over 100,000,000 miles of autopilot, suggest it is safer than humans.
John Brews (Reno, NV)
The role of testing and regulation is to ensure standards across the industry. A patchwork of approaches is confusing to drivers. However, it will prove difficult to implement test scenarios that mimic all real-life scenarios. That is more complex than running cars into obstacles, of course.

Also, successful technology will be patented and technology advances will be proprietary, and that makes uniform standards hard to impose. Patent rights for the best automation could exclude manufacturers that come to the table a bit too late in the game unless mandatory limits on license fees are implemented. That would in turn kill innovation.
alexander hamilton (new york)
The lesson from the Tesla crash is obvious: put someone with a smart phone behind the wheel of a car, tell them the car is "self-driving," and wait for the crash. We have plenty of that occurring right now with ordinary cars. Think the situation will improve if people believe they have to pay less attention, not more, while traveling in an automobile?

Cars already have seat belts, air bags, antilock brakes, etc. These features don't actually reduce crashes; they just make them more survivable. And then we have back up cameras, lane departure warnings, and some cars even promise to parallel park for you. Here's my theory: if you can't park your own car, or back it up by turning your head and looking behind you, you don't belong behind the wheel. Driving is a privilege, not a right. Master the basics or stay off the road.

Until we tackle the epidemic of drivers texting, like we tackled drivers drinking, the annual carnage will continue. To add so-called "driverless cars" to the mix has to be a cure worse than the disease.
Mark Schaffer (Las Vegas)
You have to wait much longer for a fatal crash to occur in a Tesla than in any normal vehicle.
ACW (New Jersey)
A while back, I read some interesting research regarding what ethicists and the insurance industry sometimes call 'moral hazard'. For example, we eat more of 'low-calorie' foods, or smoke more 'low-tar' cigarettes. When we believe we will not incur the consequences of our actions, we take more risks.
I've seen that just from watching SUV drivers in my town. They're in vehicles the size of tanks and barrel ahead. After all, whatever they may hit is going to get the worst of it, not they; and they carry no-fault insurance up to the eyeballs anyway. So outta my way! (If I had space I could do a whole digression on the implications of mandatory health insurance.)
It has been suggested, only half-jokingly, that the real solution to the carnage on the roads would be not to build safer cars, but to strip out all the safety provisions we have incorporated - essentially, to have people driving around in eggshells. It would force the responsibility for their own safety back onto them, and have them paying attention all the time, or else.
We keep trying to make these contraptions foolproof, but it cannot be done, because there will always be a bigger fool; and the only solution to that is frankly Darwinian and involves the bigger fool taking a lot of innocent bystanders with him in the process of collecting his Darwin Award.
The Poet McTeagle (California)
A lesson not mentioned is a danger regulated and addressed in the EU: the ability, in an accident, of a smaller vehicle to slide under a large truck, decapitating the occupants of the smaller vehicle. In the EU, skirts on the backs and sides of trailers are required to prevent that type of traffic death. Here in the US, the trucking lobby successfully kept that regulation from being imposed. If it had been in place, Mr. Brown might have walked away from the crash.

The real lesson is that there are many deaths that could be prevented but for the overwhelming power of money, which chooses who will die and who will survive. The corporate world is clamoring for automated driving because it will eliminate the annoyance of employing drivers, currently the number one job of men without college educations. Think of the increase in profits! So it will happen. Oh brave new world!
Richard Green (San Francisco)
I am someone who spent a large part of his working life playing very small roles in the development of computer technologies that are now, or have only recently become, relatively mainstream. This included some very early work, more than 20 years ago, on technologies that would aid the very earliest autonomous vehicles.

If I were told today that a vehicle I was considering had an "autopilot" or "self-driving" mode, the question I would ask is: "How do I turn it off or disable it?" It's not quite ready for large-scale prime-time roll-out. Certainly some day, perhaps sooner than I believe, the technology will become appropriate for wide distribution, but not today.

But then, even though I spent a lot of time at the cutting edge of developing technologies, I have always been a relatively late adopter of same in my own life.
David (California)
Autopilot is off by default. Once it is activated you can turn it off 4 different ways.
DaveD (Wisconsin)
Mr Brown died from technology worship, the latest secular religion. Not that the sectarian variety is without risk, but the worship of technology is the worship of branding and profit. Both highly addictive inventions.
Joe G (Houston)
Major auto companies, American, European and Japanese, are working on auto-piloted vehicles. With all their resources, how did Tesla put one out before them? Is Tesla's software like the next version of Windows: poorly planned, poorly executed and not really worth the price? Couldn't it tell the difference between a tractor trailer and the sky?

Let's not get ahead of ourselves. If this technology is released, it should work 100%. After a long day or trip, the only thing keeping me awake is controlling the car. If I had an auto-piloted vehicle, I'd Velcro my hands to the steering wheel, rig a harness to hold my head up and wear a pair of sunglasses so no one could tell. I don't know about you, but I could use the extra sleep.
David (California)
Nothing works 100%. Nothing. Isn't it sufficient to work twice as well as humans do?
Ray Johansson (NYC)
Tesla can't advertise autopilot and expect people to remain just as attentive. Of course they are going to be doing something else - otherwise, what's the point of autopilot?

Say autopilot is driving you at 65mph on the highway, and road construction completely messes up the lane markings. Of course the auto-pilot is going to mess up. And of course the driver is likely to be doing something else and may not respond in time.

Sensing features and alarms are great as a complement, but there's no way I'd trust full autopilot - there is too much chaos on the streets.
David (California)
"Tesla can't advertise autopilot and expect people to remain just as attentive."

Why? I think my autopilot is absolutely wonderful. But I realize I need to pay attention.
CA (key west, Fla & wash twp, NJ)
Self-driving cars are certainly the future, and we need to hone these technologies. The only way is to manufacture and improve. I now demand an automobile that has all the safety features, including blind-spot monitoring, all-around sensors, safe braking, backup cameras, etc.
The population is aging; these technologies can assist our independence into the future.
Stephen Rinsler (Arden, NC)
Many manufacturers are providing safety systems in their cars, not only luxury ones. Subaru and Honda have these packages as options on many models of their sedans.

These are an aid to the driver, not a replacement.

A truly automatic system with each vehicle "talking" to each other would be associated with fewer accidents, obviously (otherwise, it wouldn't be implemented).

Worthwhile editorial, but it doesn't have much to do with Tesla's Autopilot; the crash seems to be a failure of the drivers involved.
Joe Capowski (Chapel Hill, NC)
I am a long-time computer-design engineer in neuroscience and ophthalmology.

Humans and animals recognize patterns very well, even in challenging visual situations; our visual system is designed for this. Computers don't; they lack enough parallel processing. There will always be autonomous-driving situations where the autopilot will fail when a person would not. The car companies' response, to call on the human driver to resolve an urgent situation, will fail because it takes too long for the human to reach situational awareness.

Computer scientists believe that the answer to every technical problem is more technology. This is an infinite fool's chase. The safest solution is to keep the human in the loop, to combine his/her pattern recognition skills with the computer's ability to control the car.

Tesla Motors especially puts its profits ahead of safety. How, except for profits, can Elon Musk justify a 17-inch interactive computer screen in front of a driver?

Traffic laws are made by politicians, and car company lobbyists have convinced politicians that as long as drivers' hands are on the wheel and their eyes are on the road, they are safe drivers, even though accident and neuropsychology studies clearly demonstrate that a phone call or Facebook post, even made with hands-free technology, remains a serious distraction that reduces the driver's effective vision.
T. W. Smith (Livingston, Texas)
The lack of situational awareness on the part of pilots using autopilots has been a known problem for decades. Although flying an airplane is more complex and demanding than driving an automobile in MOST respects, pilots, thanks to traffic spacing regulations and positive control in congested areas, do not face potential collision situations every few minutes throughout their flights. Unless the system controlling the cars can operate without driver input in all situations, it will ultimately prove a failure. Expecting a distracted or napping driver to assume control immediately whenever the going gets tough due to factors external to the vehicle is probably not going to work on a large scale.
H. Paul Mazer (Miami, FL)
It would seem that if a semi turned directly across the path of a car traveling at speed, there would be nothing that either a computerized car or a human driver could do to avoid it; even if the car or driver were to hit the brakes, physics would have kept the car moving into the trailer directly in its path. Why has this not been discussed?
PAN (NC)
Ironic how the safer the cars become, the more reckless the driving public becomes. Sorry, but anti-lock brakes cannot stop a vehicle on a dime on icy roads; stability control cannot violate the laws of physics; GPS will not prevent you from driving off a cliff or into a lake.

I wonder how much more distracted a driver becomes - with texting, etc. - with driver assisted features. It might even encourage drunks to think they can now drive.

Just as auto-pilots require a pilot's attention on a plane, any auto-pilot on a car needs a driver's attention. The weakest link still remains the human operator.
David (CA)
Tesla insists that drivers keep their hands on the steering wheel when using auto-pilot. But "hands on" is the opposite of "auto-pilot" in the original use of the phrase in airplanes. Sure, pilots have to pay attention when using an auto-pilot, but most people are not aware of how auto-pilot is used in planes and commonly think hands-on auto-pilot is an oxymoron. Tesla should use a different phrase, like "active assistive driving."
qed (Manila)
Here you are, roaring along at 60 mph or more while watching Harry Potter. Something is wrong here. It is fine to have IT-assisted cars, but people should still be responsible enough to realize that they are driving a deadly piece of equipment.
Sinan Baskan (New York)
I don't expect driver-assisted or 100% self-driving cars to come anywhere close to consumer acceptance until the insurance industry has spoken. And so far it really has not. I might consider a Google car when Google carries the insurance and full liability. And people who buy cars because they like to drive are not interested in sharing the joy with a computer.
Sam (Boston)
There's a nice, more extensive write-up here arguing that it is far too early to be judging this technology:
http://www.saistent.com/tesla-flirt-with-hand-off-valley
mj (MI)
As a person who spends a fair amount of time on the road I would venture 80% of cars are driverless.

Which is to say, the person behind the wheel pays so little attention to the 3,000-pound hurtling killing machine they are supposed to be navigating as to be effectively non-existent.

It seems as a species we get dumber and more cavalier and arrogant with each passing decade. Perhaps it is a freaky quirk of evolution that the more we learn the more determined we are to kill ourselves.
ClearedtoLand (WDC)
Lots of journalists, most without scientific and technical backgrounds and experience, swallowed Theranos's hype, and the same misstep may be at work here. Navigation for these cars relies on GPS, an easily defeated and tricked technology, as evidenced by the use of jammers near airports, which have resulted in arrests, and the Iranians' tricking a drone into landing in their territory. Radar and ultrasound are similarly subject to interference by hackers, and cameras can rapidly become useless in snow, storms, mud hits, etc. Couple this with the high likelihood of Internet-connected cars being routinely hacked and you have a recipe for disaster.

And if a Tesla missed a tractor trailer, how would it do at seeing a turtle or cat crossing the road?

The government has provided billions for this work, a questionable investment in these deep-pocketed and politically connected firms, and it needs to step back and hear from independent voices.
J Singh (Boston)
We could use a system of highways that will provide electronic guidance to driverless cars and make them (nearly) risk free.

The Eisenhower administration began the concept of limited-access interstate highways and made commerce within the country safe and convenient.

Maybe the Clinton administration should visualize a system of highways to truly take our infrastructure to the next level?
Tom In Maine (Maine)
The body of this editorial broke the promise of its title: There was no substantive discussion of lessons learned from the Tesla crash.

My take on the broader lessons learned is that this event reinforced what engineering and science already know about complex systems in which humans are a component. Machines are good at repetitive tasks and tend to do poorly at reacting to novel situations; in general, the converse is true for humans fully engaged in an activity. For example, driver-assisted cars are excellent at perfectly tracking along well-marked roads and at slowing and braking for clearly distinguishable objects in front of them. They are poor at distinguishing between a shopping cart and a baby carriage, and even worse at choosing which to avoid, especially if there is a baby in the shopping cart and nothing in the baby carriage. Humans, for their part, are poor at “monitoring tasks” that may require intervention or input randomly and infrequently.

True self driving cars are years if not decades away given current technology. Autopilot cars are here, but operate in a nexus where the respective human and machine capabilities are challenged most when truly needed to avoid an unwanted outcome.
lastcard jb (westport ct)
Time and again we are led by the nose to a place that we do not need to go. Spend the money, billions and billions, on mass transit so there is no need for cars. You will add countless jobs developing that infrastructure, decrease cars and therefore car deaths, and let people text, work, watch Harry Potter, whatever, in total safety, rather than develop an industry designed to make a company richer by addressing a need no one wants. Just because you can make a driverless car doesn't mean you should. I would say if you really want to test them out, put them in Boston on a busy day, or in any state in New England in the dead of winter, then see how ready the tech is for mass consumption. Oh yeah, make the cars available for 2 to 5000 or lease for $250.00. Look at what's on the road and what the majority of people can afford. Driverless cars are like Donald Trump: big talk, big promises, but when you look hard at it, worthless.
its time (NYC)
The idea that a person is going to pay attention and focus when not "completely" engaged and responsible for the activity 100% of the time is foolish at best.

So who has the liability when the car hits a school bus with 30 kids - the person or the manufacturer?
David (California)
Why? Obviously you don't have any experience and are just parroting others.
KarlosTJ (Bostonia)
Distracted Driver = Accident

Doesn't matter if the car is on Autopilot or not.
XY (NYC)
You can't tell people the car has autopilot and then expect them to act as a backup in emergency situations. That is nonsense thought up by lawyers.

Tesla should be sued out of business and the engineers and lawyers who signed off on this should be prosecuted for criminal negligence.
David (California)
Nonsense. I have autopilot and find it makes driving hugely easier. But I have used it enough to know it has limitations, and that I'd be a fool not to pay attention. Believe me, paying attention is a lot easier than doing all the driving.
Mark Schaffer (Las Vegas)
It would have helped if you had discussed relative risk and interviewed Tesla Motors directly before going off half cocked.
https://www.teslamotors.com/blog/tragic-loss

This Vanity Fair article will help any commentators understand this issue better than this opinion piece:
http://www.vanityfair.com/news/2016/07/how-the-media-screwed-up-the-fata...
paul (st louis)
The truck driver was clearly at fault: he pulled out directly in front of the car. My wife had the same thing happen to her on a side street, and she swerved to avoid the girl driving. Unfortunately, to avoid hitting the person breaking the law by pulling in front of her, my wife drove off the road and crashed into a pole. Luckily, my wife was only going 25 mph and was not injured.
If she had been cut off on a highway, she might well have been killed, autopilot or not.
PAN (NC)
The truck driver may be at fault, but the Tesla driver - I mean, the "occupant" - is dead. Had he been paying attention, he might still be alive.

I may have the green light, but I still check left and right for careless drivers blowing through a red light. I may be in the right to proceed, but I prefer to be healthy and alive and not get hit - or drive into a truck if "I" can help it.
paul (st louis)
Yes. Clearly the Tesla driver should have been paying attention and MAYBE could be alive if he had been. However, we're assuming that there would not have been a crash anyway, which I don't think is a reasonable assumption given highway speeds. You're expecting a miracle if you think there won't be an accident when someone pulls in front of you while you're driving 60-70 mph on a divided highway.
Tom (New Jersey)
This is not necessarily true without knowing more about the dynamics of this accident. Reports have indicated the Tesla was going 85 mph, which clearly exceeds the speed limit. Highways are designed with certain sight distances and speed limitations in mind. A truck turns very slowly. Perhaps the roadway was properly designed so that a truck could turn left assuming an approaching vehicle was traveling the speed limit, and could be spotted. Or perhaps the design of the roadway was substandard and had poor sight lines. Before you indict the truck driver, it is entirely possible that the roadway was clear when he safely began the turn, but the Tesla was driving too fast, and closed the distance after the trucker had entered the intersection but before he could get out of the way.
Thomas Green (Texas)
Reminds me of electric football. It was all the buzz when I was a child. And utterly ridiculous.
LT (Boston)
Why no mention of the driver of the truck? If this had been an accident without an assisted driving system it would be a clear case of the other car being at fault and causing the accident. Perhaps there was a failure with the assisted driving system, but it's not clear that it would not have been a fatal accident even if the system had seen the truck. Tesla cannot control the dangerous actions of other drivers and sometimes it is impossible to avoid them.
Alan Dean Foster (Prescott, AZ)
Charges against the truck driver are pending.
VJR (North America)
The Department of Transportation should require driverless systems on cars to be certified just as it does with aircraft systems. For example, before an aircraft can fly, each of its electronic systems must satisfy two rigorous standards that the FAA uses:

RTCA/DO-178C Software Considerations in Airborne Systems and Equipment Certification

RTCA/DO-254 Design Assurance Guidance for Airborne Electronic Hardware

The DOT should encourage the development of, and then use, similar standards before certifying that an automated car may be permitted to drive on our nation's roads.
Brez (West Palm Beach)
How about the DOT certifies drivers to the same rigorous standards it uses for pilots, even private pilots?

Hurrah! No more rush hour traffic!

ATP & CDL holder.
ACJ (Chicago)
As a liberal, I do believe in regulation, but I draw the line at claims that the government can make driving on autopilot safer. No regulation will neutralize the kinds of nutty drivers I meet up with every day on the expressway. My recommendation: turn off the autopilot, turn off the DVD player, put on your safety belt, stop texting, stop talking on the phone, and drive your car.
Nico (San Francisco, CA)
I am an engineer, and I find it stunning that these cars follow a very dangerous operating protocol: they allow a driver to completely check out if they so wish, while relying on tech that has many flaws and, most ludicrously, permitting speeds above posted limits.

If a car requires driver engagement, it had better have eye trackers, occupancy sensors, etc. Right now, a water bottle jammed just so into the steering wheel can fool the system easily.

Allowing speed to be set (or default to) even 1 mph above posted limits while in autopilot mode (a very bad choice of name) is an indication of very poorly thought-through design. Speed and comfort cannot be given higher priority than safety.

Furthermore, there are many very well known limitations of the visual processing tech (cameras, lidar) used on the front of these cars. Radar/sonar can cover some of those blind spots quite well, but even then it will have flaws.

Other car makers are much more judicious in how they implement these systems. Google in particular has gotten this right. Until we have a completely redesigned traffic system where those dependencies are built in, the best use of driver-assistance tech is coupled with ensuring the driver is more engaged with the driving task, not less. That can bring the number of road fatalities down.
PAN (NC)
Perhaps the solution to getting the driver to pay attention is to tell the driver the autopilot will disengage at random points and intervals - so they better pay attention at all times.

I know, not a practical idea. But short of that, perhaps a video-game-style display showing the real world in synthetic or virtual reality would keep their attention. Ah, maybe not; they would forget any accidents would be real-world.

Anyone have better ideas for keeping someone's attention in this hyper-distracted world of multi-taskers?
Mark Schaffer (Las Vegas)
Are you an engineer who has studied what Tesla's systems do in detail? Do you have real expertise in this, or are you just claiming expertise you don't have?
http://www.vanityfair.com/news/2016/07/how-the-media-screwed-up-the-fata...
Nico (San Francisco, CA)
I am a Computer Scientist, in AI, I have worked with lidar and autonomous tech in Silicon Valley, and usually get to see tech that is some time away from being ready for prime time.

I don't work for Tesla, and don't have access to their source code. I have worked for some very big tech firms, some of which are intimately involved in autonomous-vehicle technology. Most of my engineering colleagues agree with the issues exposed here, and with the criticism of how Tesla is naively trusting the average driver to self-regulate their attention span.

The VF article doesn't go into any detail on how these systems work (and in all fairness, these issues are hashed out in academic journals where the algorithms get published). It makes only cursory mention of the statistics involved (roughly one death per 100 million miles vs. one per 130 million), which is simply push-back PR by the automaker. I usually cite stats in support of an argument, but the details here make that one stat irrelevant.

I am not against the tech. I am all for it, with bullet-proof oversight and enforcement of its major constraints. If the car requires attendance, make the driver monitoring systems fool-proof.

It is true that humans and their weaknesses are a root cause in most accidents, but releasing this nascent tech into the wild with all its weaknesses changes the ways in which accidents happen. Being distracted while operating machinery will cause fewer accidents only when we use more tech to decrease distractions.
sherm (lee ny)
Such a huge technological and operational change in one of the most commonly used and complex conveniences. Is there any groundswell of demand or any overarching community necessity for driverless cars? Is this just a case of the free market creating an artificial imperative to create a new revenue stream?

Of course one way of looking at it is, that by the time these cars are ubiquitous, AI will have eliminated most of the need for anyone to go anywhere, especially to the workerless workplace. So the roads won't be very crowded.

I know it's Big Brother thinking, but maybe the government can coax the genius industries into shelving driverless cars for the time being and concentrating on ways to sustain the water supply as global warming dries everything up.
C. V. Danes (New York)
I would argue that the increase in traffic fatalities can be traced directly to increased usage of cell phones while driving. Just the other day, while waiting for a stoplight to turn green, I noticed that the person in front of me was texting and the person behind me was talking on a cell phone. Indeed, I have been passed by police who were talking on cell phones!

The issue is not that people cannot drive safely. The issue is that they are increasingly distracted by their devices. Treat cell phone usage on the same level as a DUI. Or better yet, add a cell phone blanketing system as part of a car's safety features.
Charlie B (USA)
Your argument about the "increase" in traffic fatalities suffers from the fact that there has been no increase. Here are the fatalities per 100,000 of population for several decades:

1960: 20.147
1970: 25.665
1980: 22.485
1990: 17.878
2000: 14.867
2010: 10.668
2014 (latest): 10.25

In other words, the rise in cell phone usage and texting is associated with a dramatic DECREASE in fatal accidents. That doesn't imply causality, of course, but if you're going to blame phones you're going to have to come up with some evidence.
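The decline Charlie B describes can be checked directly from the figures in the comment (a quick sketch; the per-100,000 rates are taken from the comment itself, not independently verified):

```python
# Traffic fatalities per 100,000 population, as listed in the comment above.
rates = {1960: 20.147, 1970: 25.665, 1980: 22.485, 1990: 17.878,
         2000: 14.867, 2010: 10.668, 2014: 10.25}

# Find the worst year and the proportional drop from there to 2014.
peak_year = max(rates, key=rates.get)
decline = (rates[peak_year] - rates[2014]) / rates[peak_year]
print(f"Peak year: {peak_year}; decline from peak to 2014: {decline:.0%}")
# → Peak year: 1970; decline from peak to 2014: 60%
```

So the per-capita rate in 2014 was roughly 60% below its 1970 peak, consistent with the long-term downward trend the comment describes.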
pat (chi)
There need to be standards for these systems, so that when one drives a different car one is not confronted by a system that behaves differently. Also, I'm not sure who is responsible for the term "self-driving" cars, the media or the car manufacturers. However, that term mistakenly tells consumers that the car can drive itself and that their intervention is not needed.
scott (New York)
The instructions say pay attention at all times and do not take your hands off the wheel. The driver was watching a movie. How is this the fault of the technology and not the driver?
ACW (New Jersey)
'pay attention at all times and do not take your hands off the wheel'
These are the same instructions I received on my first driving lesson in 1979. A week before that lesson, which I'd scheduled in advance, I was the front-seat passenger in a car that was T-boned by a brand-new Mercedes that shot a red light because the woman driving it was yelling at her kid instead of watching the road. So you can bet I took those words to heart and have been aware at all times since of what can happen when you are careless with more than 2 tons of steel and glass. (Both cars totaled; no injuries, miraculously, given that the woman and her kids were not wearing seatbelts.)
I am not sure what improvement so-called self-driving cars are, given that the most important equipment, and the part most subject to failure, will continue to be the nut behind the wheel.
Bob (NYC)
What on Earth is the purpose of an "autopilot" that requires you to pay just as much attention as if you were driving yourself? It is unreasonable to expect anyone who is not actually driving to maintain enough attention and readiness, for any significant length of time, to regain control of the car within the fraction of a second it would take to prevent a crash.

Regardless of their weaselly legal disclaimers, this is very much Tesla's fault for foisting onto consumers a dangerous technology which can't do what it pretends to do. What astonishes me is that this is not forbidden by regulation. We need an "FDA" for vehicles to establish that vehicles in the market are actually roadworthy.
SM (Portland, OR)
So claims the truck driver who caused the crash by not yielding to oncoming traffic. The Tesla driver was known to listen to audio books, so that is more than likely what was going on at the time of the crash. Why that driver didn't see the truck is unknown, but so is how far the Tesla was from the intersection when the truck pulled in front of it.
Bill Sprague (on the planet)
A kid rear-ended me (after I'd driven across the country and up into Canada for 8,000 miles r/t accident-free) in his mom's jeep and he was distracted by the computer on wheels that he was supposedly driving. And Silly Valley thinks that we're going to be safer in our cars when they drive themselves? I don't think so. Driving is about paying attention.
Girish Kotwal (Louisville, KY)
Time and time again we find limits to human endeavor and human innovation. The Tesla crash is one such example. We have driverless trains operated remotely on tracks confined to limited areas, e.g., connecting airport terminals, and locally in the city of Lyon, France. Driverless cars were an overambitious endeavor destined for disaster and essential only to boost Tesla stock. A better advancement would have been a car that senses the driver losing control due to sleep, alcohol consumption, a heart attack, or a stroke and forces the driver to pull over. Conventional wisdom and knowledge of human limitation should stop others from setting out on a daring, impractical, costly mission that will not bear fruit. As a person trained as a virologist, I have long had serious doubts about whether a cure or a vaccine against HIV was a mission impossible, and my doubts have been validated by others who, after spending billions, have failed at this venture, even as HIV treatment has worked but remains out of reach for many in our world. Why do we keep financially supporting attempts that are impractical, impossible, or nonessential? A driverless car on an open highway or in a crowded city will never be 100% safe and worthwhile. A person wanting that kind of luxury can buy his or her own island to try out a Tesla driverless car, or could ride in a car with a hired driver and not endanger anyone else's life or limb.
underhill (ann arbor, michigan)
You haven't heard about them, because the press doesn't cover news about the domestic automakers unless it's negative, but the domestic automakers are doing just this: my son's new Malibu will gently nudge the car back into the lane if it starts to cross the line obliquely. There are many new safety features on cars, meant to save lives, while moving incrementally toward the self-driving automobile (which will not be ready for quite a while, no matter what Elon Musk says).
VKG (Boston)
The fact that driver deaths are high and going higher, against a backdrop of safer cars, can in large part be attributed to a very noticeable lack of enforcement. On highways where once you risked near certainty of being pulled over if you were speeding, or at least flagrantly so, I can go hours and see not a single traffic officer. States have cut back, and those that still patrol have loosened their criteria, such that in many places the old rule of letting slide anything 10 miles an hour and less over the speed limit is now 15 mph. Most disturbing, in addition to drivers with little apparent training, is the sheer level of unsafe speeding by big rig trucks. While truckers may claim that most accidents involving trucks originate with a mistake by a car, their sheer weight makes stopping in time if you're going 80 mph a virtual impossibility. In CA the highways are still marked as 55 mph max for trucks, yet I have routinely clocked trucks going 75 that were passed by the highway patrol.

Spend less time and energy on tech fixes for political problems, such as driverless cars. Ignore the lobbyists for the trucking industry, who are also foursquare behind the move for driverless vehicles (nothing is cheaper than a driverless truck that can go 24 hours a day), and enforce laws to get us back to saner speeds. There already exists a hands-free solution for people who want one: it's called mass transit. That's what we should be focusing on.
underhill (ann arbor, michigan)
interesting how, now that the wealthy pay less tax than they used to, and we have cut and cut and cut in every area of our lives, life has gotten cheaper in the United States.
Jim (North Carolina)
These are great cars. We own one and love it. But Tesla's instructions are clear that Autopilot is still in testing and does not allow driverless operation except in low-speed parking, which we have not used. Drivers are instructed to keep their hands on the wheel at all times while driving, Autopilot notwithstanding.
underhill (ann arbor, michigan)
what do you think would happen to GM or Ford if they put out a car with a "beta test" auto drive feature? They'd get sued out of existence for the inevitable mayhem. Elon Musk needs to take his hubris down a few notches, and learn a few things about legal liability, and human nature.
Mister X (NY)
FORGET the ethics: does a car "decide" which person to kill if a decision must be made?

For in the US, the obstacle to widespread use of such cars will derive from a less ethical perspective.

When there is an accident, who do you sue? The driver? Or the computer programmer?
ACW (New Jersey)
There is a whole philosophical subdiscipline called 'trolleyology' that considers the ethics of decision-making in situations in which there is no good choice. (The hypothetical presumes you have the power to divert an out-of-control trolley that will strike and kill five people on the tracks - but only at the cost of killing one on the other track; and then posits many variants.)
How will your self-driving car make such decisions - say, if it has a choice of running into a sudden obstacle, such as a rockslide, killing you; or onto a field full of children, killing many of them but preserving its passenger? Would it choose altruism or follow an imperative to cause the least feasible damage to itself and its owner? What if it has to choose between bending a fender in a minor accident or running over someone's dog - will it value its own integrity, and avoid minor property damage, over preserving non-human life? I'm sure it'd be programmed not to crimp its bumper; but I'd hit the fencepost to save the dog, because I'm not a computer. Can you program a conscience?
thomas (Washington DC)
There is a fundamental lie at the heart the self-driving car technology: "Driver must be ready to assume control of the car at any time."
This simple statement completely ignores the reality of what most people are going to be doing while the car is driving. Are the companies really that stupid, or do they just think we are?
I know, they are just trying to push the legal liability for an accident off on someone else.
But I'd say, so long as they want to hide behind that statement, the cars aren't ready for the general public.
RWilsker (Boston)
Have you ever driven a Tesla? If you take your hands off the wheel too long, the assisted-driving mode beeps at you, then turns off the radio, then starts slowing down the car. It's pretty hard-core about your need to pay attention. And if it sees a situation it can't handle (usually before an inattentive driver would), it beeps even louder and makes you take over.

They mean what they say about being responsible for the driving of the car.

Can Tesla prevent someone from being an idiot? Of course not. As Schiller said, "Against stupidity, the gods themselves contend in vain." As does Tesla.

But I'd rather have a driver being normally inattentive in a driver assisted Tesla than in a regular car (where the drivers still text, talk on the phone, read the newspaper, spill their hot coffee, put on their makeup, etc.)
Oliver Jones (Newburyport, MA)
New technology certainly needs careful scrutiny. But here's the thing: so does old technology.

A 90kWh battery can certainly catch fire if it's damaged. So can a ten-gallon gas tank.

When the operator of a machine behaves like a passenger, by watching a movie rather than monitoring the machine, crashes can happen. Competent airplane pilots pay attention when their autopilots are engaged; they don't watch movies. Distracted drivers crash, and it has always been so. The old song "Jingle Bells" mentions a distracted-driver crash in a one-horse open sleigh.

So far in history, regulations against unwise human behavior have never been 100% effective. It would be a shame if a regulation like that served to prevent innovation. It would be like requiring a horseless carriage always to follow a man carrying a flag by day and a lantern by night.

That being said, I wish there were a way to turn off the screen in my Tesla Model S when I'm under way.
J (New York, N.Y.)
Though not easy, it is easier to hack a driverless car than a human brain.
John Brews (Reno, NV)
Any evidence? The methods differ, but which is harder?
benjamin (NYC)
I own a Tesla with the Autopilot feature. The very first time I used it at night, glaring lights and a low median caused it to misapprehend traffic headed in the opposite direction as a car traveling toward mine. It swerved dramatically. Fortunately, I was nervous and had heeded the strict instructions to keep my hands near the wheel while using Autopilot, and I easily took control and prevented a problem. Had I been distracted or not paying attention, confident the car could do a better job than me, there could have been fatal consequences. Instructions and warnings are there for a reason: to be heeded. No system is foolproof or error-free, including humans. Pay attention at all times when you operate a vehicle, and be cautious and alert, or be prepared to bear the consequences.
Todd R. Lockwood (Burlington, VT)
Once again, Tesla gets put under the magnifying glass for a single event, while the same technology has likely prevented numerous accidents—some of them documented by Tesla owners' dash cameras.

While Tesla's Autopilot system can be used hands-free, it does require the driver to grasp the steering wheel every few minutes to ensure that the driver is still engaged. It is no longer possible to leave the driver's seat while Autopilot is in use. Tesla issued a firmware update immediately after the irresponsible stunt referred to in this article appeared online.

Mercedes-Benz offers a nearly identical system to Autopilot for its S-Class vehicles. Would the media have shown as much interest if a fatal accident had occurred while someone was using their assisted-driving system?

No safety technology is 100% effective. But given that Tesla's vehicle fleet has logged over 130 million miles with Autopilot turned on, the stats look promising, even with one tragic fatality. The biggest risk lies in over-confidence on the part of some Tesla drivers—the assumption that Autopilot will save them from any situation. While it's cool to think of Autopilot as "autonomous," it is really just a driver's aid. A pilot is still required.
Marty (Milwaukee)
"the stats look promising, even with one tragic fatality"? Tell it to the dead guy's family.
Mark Thomason (Clawson, Mich)
"But when officials do put rules in place, they will have to update them regularly as they learn about how the technology works in practice."

I believe in safety regulations. That said, there is a problem.

Regulation can only follow technology, not lead it, not push it. Minimum standards are only identified once options become available, and are evaluated. Regulation too early or with too little information can actually slow down improvements, requiring the less safe or effective.

Regulation too early can impede the learning curve and inhibit useful innovation.

Revisions in regulations just don't come quick or easy. That is not the nature of regulations.

We are just going to have to live with a learning curve for a while. We don't know better yet.
Ericka (New York)
A driverless car. Total control by a central authority. How utterly insane.
RWilsker (Boston)
What are you talking about? There was a driver in the car. He was a bad driver.

There is no central authority. The car's own systems do the monitoring and driver assistance.

Don't let that tin hat get too tight.
Nikolai (NYC)
A minuscule number of self-driving cars are on the roads, and out of that minuscule number there has already been a fatal crash. The car failed to "see" an oncoming truck. Apparently the car is unable, as a human being would be, also to hear the truck. Apparently, unlike a human driver, the car doesn't adjust its sun shade, put on sunglasses, or otherwise wait until it can see what's ahead before driving. We're not talking here about performing calculations at a rapid pace and beating someone at chess. We're accustomed to thinking of computers as highly sophisticated, but here we are talking about a new arena for computers: sensory perception. There are so many senses at work as we drive that I doubt this self-driving car can hold a candle to us.
Nikolai (NYC)
Correction, not self-driving but driver assisted.
Architect (NYC)
Indeed, the facts of this crash are disturbing and point much of the blame at Tesla's failure to detect what any human driver would easily have seen and reacted to. It strongly suggests that the Tesla autopilot simply stopped working (disengaged? shut off?), since it not only failed to "see" any of the trailer's lower trim or the shadows beneath, but after impact (the windshield and roof of the car were struck) the car continued without stopping. The autopilot didn't even recognize there had been an impact!
Carrie (Albuquerque)
There is one human fatality for every 93 million miles of driving overall. For vehicles in Autopilot mode, there has been one fatality in 130 million miles. It's a small sample size; however, it does appear that the fatality rate under Autopilot is roughly 30% lower.
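Taking the figures in the comment above at face value (one death per 93 million miles overall vs. one per 130 million with Autopilot engaged), the implied comparison works out as follows. This is a back-of-the-envelope sketch, not an official statistic:

```python
# Figures from the comment above: miles driven per fatality.
miles_per_death_all = 93e6    # all US driving
miles_per_death_ap = 130e6    # with Autopilot engaged

# Deaths per mile is the reciprocal of miles per death,
# so the proportional reduction in the fatality rate is:
reduction = 1 - miles_per_death_all / miles_per_death_ap
# Equivalently, Autopilot logs more miles per fatality:
extra_miles = miles_per_death_ap / miles_per_death_all - 1
print(f"Fatality rate reduction: {reduction:.0%}; "
      f"extra miles per death: {extra_miles:.0%}")
# → Fatality rate reduction: 28%; extra miles per death: 40%
```

So "nearly 50% safer" overstates the case: these numbers imply a fatality rate roughly 28% lower, or about 40% more miles driven per death.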
John Smith (Cherry Hill NJ)
PILOTS of commercial jets report that they often fall asleep when the plane is on autopilot, which poses less of a threat than with motor vehicles because there are so many fewer commercial jets. But they'd best keep awake and active with the legalization of drones, because drones will soon outnumber planes the way insects outnumber all other species on Earth. To wit, the model used in trains, which requires the conductor to keep a foot on the operational pedal, has great advantages. Such a feature should be required for autos, even when using cruise control. Drivers distracted by electronic devices have accidents at rates rivaling those under the influence of intoxicants. Often I see drivers behind me look down at their texting before coming to a full stop behind me. Then they fail to start driving until beeped at, since they're not looking at traffic lights. I've even had cars follow me off the driving lanes onto the shoulders when I try to get away from being recklessly tailgated. Distractions, whether from intoxicants, electronic devices, or robotic drivers, must be minimized, whether by law enforcement or by safety devices in cars driven without adequate driver control. The person who died in the Tesla crash was, tragically, the canary in the coal mine of the next stage of automation. We must be proactive in making automated devices even safer than those operated with constant, active human control. The intent is to make things safer and easier, not more dangerous.
Fredda Weinberg (Brooklyn)
Sorry, but driving in the city takes skills no machine can possess. No computer can replace an alert driver at the wheel. You think you have a supercomputer under the hood?

In rural areas, where you can't hurt someone, it makes sense, but here? You don't understand the limitations of logic well enough to form an informed opinion.
Marty (Milwaukee)
I agree with your first paragraph, but must take issue with the second. I've spent many hours driving rural roads, and have seen countless situations I never could have predicted, or programmed a computer to recognize and avoid.
Overton Window (Lower East Side)
From what I've read and learned elsewhere (not covered in the Times), the driver of the Tesla in the crash was egregiously misusing and pushing the limits of the software. That doesn't mean there aren't serious concerns and issues with so-called driverless technology, but it does call into question the accuracy and fairness of the Times's coverage. Is Tesla the Bernie Sanders of car companies now?
underhill (ann arbor, michigan)
Tesla gets plenty of positive news coverage and positive attention generally. Just look at their stock price, and compare it to how many cars they turn out. Then look at GM and Ford: look at how many cars they turn out, look at their profits, and then look at their share prices. Tesla gets worshipful publicity. Did you mean to say that when something bad happens to Tesla, the press shouldn't cover it?
DoNotResuscitate (Geneva NY)
I bet the 7.7% increase in traffic deaths since 2014 was mostly caused by drivers distracted by their phones. The solution? More technology, of course.
This is like the National Rifle Association saying the answer to gun violence is more guns.

So now we have a self-driving car that really isn't, a Wallace & Gromit-like contraption for rich people that malfunctions with lethal results. We do need safer cars; what we don't need is technology that encourages yet more irresponsible behavior, like watching a movie while your car slams into a truck.

If we're serious about improving auto safety, why not go after the other big cause of fatal accidents--drunk driving? Technology already exists that prevents a car from starting if the operator is drunk. Why not install an app that does this in every new vehicle?
Rob (VA)
"I bet the 7.7% increase in traffic deaths since 2014 was mostly caused by drivers distracted by their phones."

Cell phones have been a major distraction since long before 2014. If there is a single cause, that is not it.

"If we're serious about improving auto safety, why not go after the other big cause of fatal accidents--drunk driving? Technology already exists that prevents a car from starting if the operator is drunk. Why not install an app that does this in every new vehicle?"

Because that is an unreasonable burden on the vast majority of people who do not need a nanny state to stop them from driving under the influence. I'm very much against driving under the influence, but I do not want to have to blow into a tube every time I start my car to prove that I'm a responsible driver.
Blue state (Here)
How about if I get into my self-driving car stone drunk, texting and Snapchatting selfies to my BFFs, and it just gets me back home safely? What if I am blind, young, handicapped, or old, and a self-driving car can give me mobility? Really the problem is doing those things while driving, not doing them at all. If we can really get fully self-driving cars, then once a majority of the cars on the road are self-driving, the problem will not be drunks and iPhones, and the solution will be the self-driving car.
Christine McMorrow (Waltham, MA)
When I read about the Tesla death and the circumstances under which it happened, I vowed I would never drive a "driverless" car. I couldn't relax in a car system where any distraction on my part could be lethal.

I drive a Subaru with DriveAssist, a pretty good system that warns of inadvertent lane swervings, and puts on the brakes when it senses forward obstruction and it's clear the driver isn't responding. In fact, the system is so good, I actually feel safer in this car than in many I've driven--but that doesn't absolve me from paying attention to the roads and drivers all around me.

That the driver/passenger of the Tesla, a car enthusiast, was making videos every time he hit the road, and clearly was distracted--not to mention the other temptations, like climbing into the back seat--simply speaks to the daredevil in many. I think it's great that technology has advanced to protect us, but in the same instant it gives us so many ways to get distracted: multiple-screen media consoles that I'm sure lead to crashes when a driver is staring at his radio or Sirius channel selection.

But cars, the great American treasure and obsession, have always made it tempting to skirt safe driving techniques. Frankly, with some drivers total idiots, I can't imagine that putting more self-driving cars on the road will actually increase road safety.

For all their engineering marvels, cars will always require a conscious, sane, sober driver behind the wheel.
Maureen (New York)
Would you book a seat in an aircraft that uses autopilot?
Thomas Green (Texas)
Ah Christine, I had you pegged for a Saturn driver. But Subaru, I'm shocked.
Mark Schaffer (Las Vegas)
Did you realize that by the time your senses report and your brain sorts out the signals, you are ALWAYS a few tenths of a second behind reality? Did you realize that Tesla's Autopilot has a better safety record than engaged drivers do?
I finally got it also! (South Jersey)
I own a Tesla and only just recently attempted to use the autopilot. Only when cars can communicate with one another in real time, process the information in real time, and then react--like a human brain, but faster--will the autopilot create a safer driving experience. When a driver does not have his hands on the wheel and foot near the gas or brake pedal, the reaction time needed to avoid an accident (on average 2.3 seconds) will at least double! In other words, more dangerous in the short term!! When 5G wifi is introduced in phones and cars, together with interactive sensors and transponders that communicate, process, and react for the driver, the only thing left to make our roads finally safer will be the full rollout of the technology. Then the autopilot will work--but only when EVERY OTHER car on the road has the same technology, like seat belts and airbags!
Maureen (New York)
Frankly, and as you reluctantly admit, Tesla did not cause this accident. It was the operator of the vehicle, who was "distracted" -- the cause of most auto accidents these days. If auto "safety" is the sole reason for this and all the media hoopla surrounding this tragedy, why isn't the media out there clamoring for the abandonment of "smart" phones -- which in reality have been the cause of far more fatal accidents than the Tesla technology?
wlieu (dallas)
You don't realize the rich, sad irony of your argument, do you?
underhill (ann arbor, michigan)
Considering how many orders of magnitude more smartphones than Teslas there are in the world, it's inevitable that they would cause more accidents. That doesn't change the fact that Elon Musk said his autopilot was better at driving than most drivers. They named the feature "Autopilot." What do you think people will do with it? This technology is not ready for public use, and Musk's disregard for his customers' safety is breathtaking. There is a price to be paid for such towering hubris.
Architect (NYC)
Of course, Tesla did not cause the accident, but what "distracted" Tesla's autopilot and prevented it from taking evasive action? In its failure to act and take any kind of evasive maneuver, the Tesla failed to protect its driver who had come to trust the car that it would do so. And in this failure the Tesla broke Asimov's first law of robotics.
West Coaster (California)
Who ever said driverless cars would be risk-free? I believe the proponents and developers of these systems have claimed they can be far safer than cars driven by human beings. One fatal crash certainly doesn't negate that claim.

When these automated systems are on the road in larger numbers (hopefully sooner rather than later) the bigger debates will be about who is liable for damages on those rare occasions when mishaps do occur.
Salim Akrabawi (Indiana)
I have a driver-assistance car (NOT a Tesla). Many times in the six months I've had it, the car screamed at me to brake just before I almost plowed into the rear of another car that suddenly stopped in front of me. That sure was very helpful, to say the least. While watching it like a hawk, I challenged the vehicle a few times to drive me the 15 miles to my office without my touching the accelerator or the brakes, and almost without helping with the steering, and it did perfectly well. But it also occasionally does not detect vehicles in front of me while the radar is engaged, and that is very scary. So these cars are helpful but NOT perfect, and they never will be. On balance, and at my age of 75, I am glad I purchased this driver-assistance vehicle.
Charlie B (USA)
With respect, sir, if your car has saved you from crashing into others "many times", it's time to turn in your license and let someone else do the driving.
Bill Corcoran (Windsor, CT)
Joshua Brown Tesla Model S Autopilot Crash 2016

Topics Begging to be Explored:

Joshua Brown’s Driving History, e.g., accidents and infractions
Generalized Peter Principle
False Sense of Security
Mindless trust in technology
Gleeful Risk Taking
Violating Manufacturer’s Instructions
Tesla informing drivers of the Safe Operating Envelope (SOE) for the Autopilot
The extent to which the software detects and acts on violations of the SOE
Need for documented formal training in use of autopilot
Need for autopilot designation on license
Mindlessly Clicking “Agree” or “OK” (See also Therac-25)
Design of Highway
Ability of Tractor Trailer driver to see Tesla
Violating Speed Limit
Design of Trailer (No Side Underride Barrier)
Design of Tesla Model S (Strength of Roof Supports)
Need for Simulator reenactment of event
Need for Computer reconstruction/ reenactment of event
Legality of Tractor Trailer Left Hand Turn under the circumstances
Testimony of other witnesses
Ability of Autopilot Software to detect grossly unsafe autopilot use, e.g., certain types of highway situations and conditions

The Law of Unintended Consequences
The Law of Conservation of Wretchedness
qed (Manila)
Darwin's principle of survival of the fittest and selection of the gene pool.
mary lou spencer (ann arbor, michigan)
With so many distracted drivers using the road, why not mandate that all new cars stop instead of rear-ending the car in front of them?
Rodrick Wallace (Manhattan)
Arguments that 'drivers cause 90% of accidents, so removing them will cut deaths by the same amount' remind one of the fallacy that 'if a woman can gestate a baby in 9 months, 9 women should be able to do it in a month'. In reality, focus on the atomistic problem of individual driverless vehicles obscures larger problems regarding the canonical instability of vehicle-to-vehicle/vehicle-to-infrastructure (V2V/V2I) systems, problems that mirror other well-recognized, if not well-understood, network instabilities. Here's some homework:
https://peerj.com/preprints/1714/
Autonomous weapons and drones are a political disaster and V2V/V2I systems will be a transit disaster.
Matt (America)
What are you even talking about? Nowhere in the article does it claim removing drivers would reduce collisions by 90%...
Cathy (Hopewell Junction NY)
If you are hurtling around in a tin can at 50 or 60 or 70 miles an hour, you have automatically left "risk free" behind. People are made from not very durable material. We don't do well when forced to occupy the same physical space as another object.

Driving is an activity that, even if the car does a lot for you, requires a commitment to responsibility. Ultimately, the driver is responsible for the car's overall safety, and will remain so even after cars can drive on autopilot. We don't allow actual pilots to take a nap or stream a movie just because the plane is automated.

If we want to read our papers, hold business meetings, eat a five course meal when we are traveling, we need to take public transport, hire a driver, or carpool.

I see a place for driverless cars to extend the driving years of people who otherwise could not drive once the technology is perfected. But I never see a time when you can get in the car, change into PJs and nap while it takes you home from the bar.
Terry McKenna (Dover, N.J.)
Technology never gets to 100% of any change. For example, the truly paperless office has yet to arrive, even though it was predicted decades ago. We have most of the elements, but paper remains. Similarly, driverless cars will never take over in a big way. To begin with, roads present variables that are not like chess but like three-dimensional chess. The technologies can save lives if used as backup. But going all the way on free-access roads simply will not work. That doesn't change the thrust of the research or the benefits, but this crash was a reminder of just how difficult it is for a machine to figure out what is going on.
NRroad (Northport, NY)
So the fatal flaw of driver-assistance software is egregious human error. What else is new? The question is whether such software should be held to a higher standard than human drivers. Tesla has pointed out that the rate of deaths per million miles driven using its system is substantially lower than the national average. Seems to me you can make a better case for more regulation of drivers than for more regulation of assistance software.
Rlanni (Princeton NJ)
It's too soon to speculate. When someone cuts you off, not even a computer may react fast enough.

As for driver assistance leading to driver distraction, we have seen a similar problem in commercial aviation, where pilots put the plane on auto and then take a nap, or play with their iPads.
Here (There)
I know one thing: The Times will never be invited to test drive a Tesla again. We haven't forgotten the fake column. Good thing Tesla was monitoring! I remember how the monitoring, not the reporter faking things, upset the former public editor.