Posted by Mark Hambleton, Partner
Self-driving Uber fatality – technological progress meets the same old excuses
As the news broke of a 49-year-old woman being struck and killed by an Uber driverless car in Arizona, it dawned on the world that this was the first pedestrian fatality involving a driverless car. But whilst this moment is a tragic landmark in both technology and the law, the reaction to Ms Herzberg’s death is sadly familiar.
Following the fatal accident in Arizona, many will be asking: how did the driverless technology fail to stop the car from hitting a woman crossing the road? And is it safe to test driverless cars on public roads?
Many people far more qualified than I am in the field of automation will, I’m sure, be looking into those important questions. My question is a different one: what does this mean for people’s safety on the road?
The reaction of many to Ms Herzberg’s death has been very disappointing, and sadly very familiar. It reinforces an apparent acceptance that road traffic collisions are an unavoidable part of life.
Some of the most shocking talking points I’ve come across:
- Speeding wasn’t the issue – The fact that the car was (apparently) speeding at 38mph in a 35mph zone is hardly mentioned. Speed limits are not targets. The car should have been travelling at an appropriate speed to react to hazards.
- Victim-blaming – Ms Herzberg just “came out of nowhere”. This type of unacceptable excuse shouldn’t wash, especially for driverless cars. Their USP is safety.
- Unpredictable behaviour – Ms Herzberg is being criticised for not using a pedestrian crossing. It is worth having a look at the layout of the road. Pedestrians often cross the road away from a crossing, cyclists have to swerve to avoid potholes etc. These are all everyday things that driverless cars should deal with.
- No response from the car – Barely mentioned is the fact that the car, whether through its human “driver” or its driverless technology, didn’t even slow down or otherwise react to Ms Herzberg’s presence in the road.
Driverless technology needs to be tested in a way that respects pedestrians, cyclists and other vulnerable road users. The systems operating the cars need to cope with human behaviour, which can, of course, be unpredictable. Hopefully, the law will change to reflect this too, so that when accidents do happen people are protected.
Who is to blame under the law? The question the world is asking
I will be following this matter to see if Ms Herzberg’s family bring a claim for compensation. It will be interesting to see who they pursue. Will it be Uber, the manufacturer of the car’s technology or the human monitor who was in the driver’s seat?
And what about laws closer to home? Our current law doesn’t deal with insurance for driverless vehicles. The Government proposes to extend the existing rule of compulsory insurance (as required by the Road Traffic Act) to driverless vehicles, i.e. motorists who have a driverless car must be insured.
Under the new scheme, the same insurer will cover both the driver and the driverless vehicle technology. There will be exceptions, though, and they include particularly vague references to when it is ‘appropriate’ to have your vehicle in self-driving mode, which could leave cyclists in a tricky position when it comes to claiming compensation.
If you have a question about road safety law, contact Mark today
01225 730 214 Email us