
The first reported death involving an autonomous – or, in this case, a semi-autonomous – vehicle occurred this week.  The victim was driving a Tesla Model S in Florida when the vehicle’s Autopilot mode failed to detect a tractor-trailer turning in front of it.

“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.  The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.”

Tesla Statement

Tesla has been an innovator – and that, generally, is a good thing.  Tesla has acknowledged that its Autopilot feature is new (in beta), that drivers must manually enable it, and that drivers “must maintain control and responsibility” for the vehicle even when using the Autopilot system.  “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” Tesla said in its statement.

Autonomous vehicles – whether partially autonomous or fully self-driving vehicles – will eventually experience failure that results in injury or death.

“For years people have been saying that the technology is ready, and it’s one of my pet peeves, because no it’s not,” said Bryant Walker Smith, a law professor at the University of South Carolina and an expert on autonomous driving issues.

CBS News

This tragedy again raises the issue of just who is responsible when autonomous vehicles crash.

Many have already tried to tackle this question.  See coverage from the BBC and from Bloomberg.  Bloomberg, late last year, answered “no one knows.”

In Missouri – and, I would suspect, most states – product defect laws would govern liability in these crashes.  Missouri’s product defect statute permits a victim to hold a manufacturer or seller liable for injuries caused when the product was in a defective condition and unreasonably dangerous when used as reasonably anticipated.  If the autonomous system failed to properly control the vehicle, then the vehicle was in a defective and unreasonably dangerous condition and the manufacturer or seller should be held liable.

Some car companies have already said they would accept responsibility… if their autonomous cars are involved in a crash.  Both Google and Mercedes told 60 Minutes last year that the companies would accept responsibility and liability “if their technology is at fault once it becomes commercially available.”  Volvo has also said it would accept responsibility if its autonomous vehicle causes a collision.

But what does that mean?  Does that mean they would pay the full value of the damages that were caused?  Or does that mean they would fight the victim’s family tooth and nail, forcing the family to spend hundreds of thousands of dollars to prove their case in court?

The fact is that every person, business, corporation, and even car maker is responsible when they or their product injures or kills someone.  As someone who has tried auto defect cases across the country, I have found that every car maker I have litigated against ultimately says it stands behind its car and will accept responsibility if there’s something wrong with it.  Then it says there’s nothing wrong with it.

What about the times when it’s not so clear?

“There is going to be a moment in time when there’s going to be a crash and it’s going to be undetermined who or what was at fault,” said David Strickland, former head of the National Highway Traffic Safety Administration and now a partner at Venable LLP law firm in Washington.  “That’s where the difficulty begins.”

Bloomberg Technology

Sorting out fault in such cases may ultimately require a detailed and complex forensic computer analysis.  Was the code the problem?  Was it an algorithm?  Faulty hardware?  Was the mapping software defective?  Did a camera (or radar) fail?  Was the highway improperly marked?
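To make that forensic task concrete, here is a minimal sketch, in Python, of the kind of replay analysis an expert might run over a vehicle’s recorded data to find the first broken link in the perceive-decide-actuate chain.  Everything in it – the `LogFrame` fields, the `isolate_fault` function, the sample log – is hypothetical and invented purely for illustration; it does not reflect Tesla’s actual logging format or any real forensic tool.

```python
# Hypothetical illustration only: field names and log structure are
# invented, not Tesla's. The point is the structure of the inquiry,
# not the code itself.

from dataclasses import dataclass

@dataclass
class LogFrame:
    timestamp: float        # seconds into the drive
    camera_detected: bool   # did the vision system flag the obstacle?
    radar_detected: bool    # did radar flag the obstacle?
    brake_commanded: bool   # did the planner request braking?
    brake_applied: bool     # did the actuator actually brake?

def isolate_fault(frames: list[LogFrame]) -> str:
    """Replay recorded frames and report the first broken link in the
    perceive -> decide -> actuate chain."""
    for f in frames:
        if not (f.camera_detected or f.radar_detected):
            continue  # nothing perceived yet; keep scanning
        if not f.brake_commanded:
            return (f"t={f.timestamp}s: obstacle perceived but braking never "
                    "commanded (points at decision software / algorithm)")
        if not f.brake_applied:
            return (f"t={f.timestamp}s: braking commanded but never applied "
                    "(points at actuator hardware)")
        return (f"t={f.timestamp}s: system perceived and responded; "
                "look elsewhere (e.g., road markings, another driver)")
    return ("no frame shows a detection: perception failed "
            "(camera, radar, or mapping) or the obstacle was not detectable")

# A replay consistent with the reported facts: neither camera nor radar
# registers the trailer, so the analysis points at perception.
crash_log = [
    LogFrame(0.0, False, False, False, False),
    LogFrame(0.5, False, False, False, False),
]
print(isolate_fault(crash_log))
```

Each branch of that analysis points toward a different potential defendant: a perception failure implicates the sensor or mapping supplier, a planning failure the software maker, an actuation failure the component manufacturer, and a properly functioning system points back toward road design or other drivers.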

The Tesla crash is the first, but there will be more crashes involving autonomous vehicles.  Will car makers accept responsibility for failures of their systems?  Only time will tell.


© Copyright 2016 Brett A. Emison

Follow @BrettEmison on Twitter.