New UK legislation will put the blame where it should be – on the vehicles’ manufacturers.
In August 2022, the UK government announced a £100m plan to speed up the development and deployment of self-driving vehicles. The plan also calls for new safety regulation, including a bold objective to hold car manufacturers accountable. This would mean that when a vehicle is self-driving, the person behind the wheel will not be responsible for any driving errors.
So who programs the robots that make the cars, then? There's a human involved somewhere.
That works for me, unless humans can take control and do something that could cause harm. The examples he gives are ones where the human driver did nothing. But could there be a case where the driver does something to cause an accident even while they're in a driverless vehicle?
A study carried out at Stanford Law School in 2013 found that, with traditional cars, more than 90 per cent of road accidents are due to human error, so one thing is clear: in the future, streets filled with autonomous vehicles will be much safer. The only question is how we handle the long and winding road to get there.
Yes, there are still a lot of unanswered questions. The software for driverless vehicles has to choose what to hit in situations where there are only two bad options. How does it do that? Hit the car with four passengers, or hit a pedestrian on the sidewalk?
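Nobody outside the manufacturers really knows, but in principle it could come down to minimising some predicted-harm score over the available manoeuvres. Here is a purely hypothetical sketch of that idea; the `Outcome` structure, the harm weights, and the `choose_action` helper are all invented for illustration and don't come from any real autonomy stack:

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """One possible result of an emergency manoeuvre (hypothetical model)."""
    description: str
    expected_injuries: float    # estimated number of people harmed
    collision_speed_mps: float  # predicted impact speed, metres per second

def harm_score(o: Outcome) -> float:
    # Naive cost: injuries scaled up by impact speed as a severity proxy.
    # Real systems would use far richer models; these weights are made up.
    return o.expected_injuries * (1.0 + o.collision_speed_mps / 10.0)

def choose_action(options: list[Outcome]) -> Outcome:
    # Pick whichever manoeuvre has the lowest predicted harm.
    return min(options, key=harm_score)

if __name__ == "__main__":
    options = [
        Outcome("brake hard, hit the car with 4 passengers", 4.0, 5.0),
        Outcome("swerve, hit the pedestrian on the sidewalk", 1.0, 12.0),
    ]
    print(choose_action(options).description)
```

Even this toy version exposes the real issue: the "answer" depends entirely on who picks the weights and what counts as harm, which is exactly why pinning down who is liable matters so much.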