According to Google:
Who is responsible when a self-driving car kills someone?
With fully autonomous vehicles, the software and vehicle manufacturers are expected to be liable for any at-fault collisions (under existing automobile products liability laws), rather than the human occupants, the owner, or the owner’s insurance company. One reason they are dangerous:
Authorities told KPRC that it took four hours, 32,000 gallons of water and a call to Tesla to extinguish the flames because the vehicle batteries kept reigniting.
Same here. Though from what I understand, the Tesla manual says that even in autonomous mode your hands must still be on the wheel. I don’t get that, but it’s true, that’s what it says.
"While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car. … Subsequently, every time the driver engages Autopilot, they are shown a visual reminder to 'keep your hands on the wheel.'"
The problem is people putting too much faith in an underdeveloped system, IMHO.
I have heard that autonomous cars are notoriously unreliable at driving themselves, often making mistakes. The truly self-driving car is way in the future, as so much needs to change to make it possible. Legislation needs to change as to who is responsible in an accident involving an autonomous vehicle; manufacturers and insurance companies won’t want to be the ones held to blame in the event of a crash.
We don’t even know who is responsible for allowing this to happen. Governments? Who? By that I mean allowing technology free rein to infiltrate our lives. No consultation, no choice by the population as to whether they want this. It has pretty much been imposed with two fingers up at the population.