
When I recently wrote about concerns being voiced, mine included, about self-driven (robotic or driverless) cars, I did not imagine I would get to say, “I told you so” quite so quickly. But the recent accident caused by a Google robotic vehicle that assumed a bus would stop for it is a perfect example of why these new ‘smart’ self-driven cars are not yet close to being ready for widespread use on our highways.

Here is the thing about this crash:  “The vehicle and the test driver ‘believed the bus would slow or allow the Google (autonomous vehicle) to continue,’ it [Google] said.”  The company went on to say, “We clearly bear some responsibility, because if our car hadn’t moved, there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.”

It is not clear from the story whether the ‘test driver’ mentioned is the vehicle’s computer or an actual person. Either way, the self-driven car made an error in judgment that could have been serious if either vehicle had been going faster. A computer cannot make ‘assumptions’ or use sight and other cues to reach a more informed judgment. In this case, the car’s computer could not answer the question, “Is this bus driver going to stop for me or not?” As most human drivers know, when a large vehicle has the right of way and a smaller vehicle is approaching, the driver of the larger vehicle assumes the smaller vehicle’s driver is aware enough (and has the proper judgment) not to risk the crash and property damage that come with hitting a large moving vehicle like a city bus.

I have to agree with the statement from John M. Simpson, the privacy project director for advocacy group Consumer Watchdog, who said the crash “is more proof that robot car technology is not ready for auto pilot.” (Reuters.com, 2/29/16)

A week before the accident occurred, a reporter for National Public Radio interviewed Transportation Secretary Anthony Foxx about driverless cars.

As part of the story, random drivers were asked, “Would you rather have a computer do the driving for you?” Of the 16 drivers polled, 15 said no to the idea of self-driven cars. You can listen to the story here. The advent of driverless cars is being driven by technology and by the whims of big companies like Google that want to make these cars simply because they can. There are many issues yet to be resolved: some of them logistical, some legal, and some involving personal safety. Whether the American public is ready for cars that operate themselves remains to be seen. This driver likes technology, and the safety that technology can bring to the car I drive, but I am not ready to give up total control to a computer.
