When Leonardo da Vinci first sketched a rough blueprint for a self-propelled cart 500 years ago, could he have imagined that it would lead to an influx of driverless vehicles on our roads?
Possibly not. In fact, I am sure we are all amazed at how quickly the trickle has become a tide, as the insurance industry now supports the marching drum beat of automated technology.
So why is this overly cautious body supporting such a radical shift in the way we drive? Mainly for reasons of safety, but also because of a head-scratching juxtaposition: more autonomy behind the wheel will actually increase a driver’s accountability.
And this is something that has already been seen. Whether it’s a bus unable to go backwards or a more serious human error, as demonstrated in the Tesla tragedy, I’m going to try to shed some light on what the rise of the autonomous vehicle really means.
Are the robots taking over?
The Bloomberg Aspen Initiative on Cities and Autonomous Vehicles predicts that a staggering 25% of all new cars sold by 2035 will have this technology, but luckily for us, there are still a few years to go before we as drivers become redundant.
The UK Government, which is very supportive of this technology, introduced regulations announced in the Queen’s Speech in June 2017. The new Automated and Electric Vehicles Bill is seen as an important first step in guiding the manufacturing and insurance industries.
Meanwhile, the Association of British Insurers (ABI) has confirmed the insurance industry’s 100% commitment to the development of the automated car. However, for the foreseeable future, whilst cars may be automated, they will not be driverless and the insurance industry’s expectation is that drivers will still need to be able to take back control of the vehicle at short notice.
It was something an unfortunate driver of a Tesla vehicle failed to do while using Autopilot in Florida in May 2016. Alarmingly, the Model S was unable to distinguish a large white 18-wheeler tractor-trailer from the bright sky as it crossed the highway. While on Autopilot, the car’s sensors failed to detect the vehicle, causing a collision at high speed. The driver died at the scene.
Tesla was quick to point out that this was the first fatality in some 130 million driving miles, compared to America’s national average of a death every 94 million miles. The company makes it clear that drivers are still responsible for their vehicles even on Autopilot, and the car comes with functionality such as a vibrating steering wheel when there is a delay in movement to prompt a response. The company has been cleared of any wrongdoing.
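To put those two figures on a common footing, a quick back-of-the-envelope calculation (my own illustration, using only the numbers quoted above) normalises each to fatalities per 100 million miles:

```python
# Rough comparison of the fatality rates quoted above:
# Tesla's claim of 1 fatality in ~130 million Autopilot miles
# versus the US national average of 1 death per 94 million miles.

def fatalities_per_100m_miles(fatalities: int, miles: int) -> float:
    """Normalise a fatality count to a rate per 100 million miles driven."""
    return fatalities / miles * 100_000_000

autopilot_rate = fatalities_per_100m_miles(1, 130_000_000)
national_rate = fatalities_per_100m_miles(1, 94_000_000)

print(f"Autopilot: {autopilot_rate:.2f} fatalities per 100m miles")  # ~0.77
print(f"National:  {national_rate:.2f} fatalities per 100m miles")   # ~1.06
```

On this crude measure the Autopilot rate looks lower, though a single event is far too little data to draw firm statistical conclusions from.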
Whose fault is it anyway?
It’s hard to argue with the benefits of using artificial intelligence and digital intervention to reduce human error, when human error is shown to cause over 90% of accidents, according to a recent study by the US National Highway Traffic Safety Administration.
Take the introduction of autonomous emergency braking (AEB), where brakes are applied automatically if the driver does not respond in time. A study in 2015 by the vehicle safety bodies for Europe and Australasia found that low-speed AEB technology leads to a 38% reduction in real-world rear-end crashes. It is therefore unsurprising that many insurers see the new automated technology as a positive step forward.
Removing human error could significantly reduce many types of injury and non-injury accidents, but the burden of liability could shift when autonomy is present, leading to some complicated legal cases.
How clear is the consumer on what is automated and what is autonomous?
The UK wants to be a global leader in autonomous cars, and there is clear support from the ABI for the innovation of insurance products to meet the car manufacturers’ innovation timeline.
But it will be interesting to see how the terms and conditions of car manufacturers match the insurers’ liability clauses and the aims of the Thatcham research. Thatcham has stated that it is all about clarity, so it is vital that consumers are not misled about the capabilities of advanced driver assistance systems (ADAS) and fully automated driving technology (ADT).
The big question needs to be: if an autonomous car has an accident, is the driver absolved of responsibility? As demonstrated by the Tesla fatality, probably not. What if, for example, the same level of control is needed as when one engages cruise control? This would require a driver to be physically connected to the vehicle’s controls and capable of taking immediate emergency action. If this is not the case, what will be the acceptable timeframes for action: could it be 2, 5 or even 10 seconds?
Whether a car has Pilot Assist – a system that allows the driver to drive for around 20 seconds with no hands on the wheel – or Autopilot, where the driver can take their hands off the wheel on the motorway for 2 minutes before the system disconnects, drivers need to be aware of how the system works and their liability in relation to insurance should anything go wrong.
‘Black Box’ data sharing
The insurance industry has stated that it is critical that a set of standards for data sharing is agreed at an international level and that this is clearly laid out before any fully automated vehicles are commercially available. Telematics is the use of technology to gather and analyse information about the operation of a vehicle and has been widely adopted on a voluntary basis, particularly by younger drivers, to help manage insurance premiums. However, such technology might need to be mandatory in order to monitor vehicles and possibly to establish liability in the event of an accident (the black box for evidential purposes). This raises further issues relating to data and privacy laws.
So who could potentially see your data? Definitely your insurance company, possibly the company behind the car, and if you were in a collision it is likely that the emergency services and police would be interested in the driving conditions behind the accident.
This could go on to inform larger audiences: imagine submitting telematics data showing a council officer that the road layout on your morning commute is a harsh-braking nightmare, or your boss pulling you in to have a word about last Thursday’s speeding, as confirmed by your weekly data report. But if safety on the roads can be improved with a little analysis, shouldn’t our driving habits become common knowledge?
Certainly, from a company point of view, this data is welcome, and it comes as no surprise that commercial vehicles are being fitted with these little black boxes. Agnes Miller, the Fleet Manager at BAM Construction, decided to fit trackers on a quarter of her light goods vehicles six months ago and has already seen a reduction in the average number of reported accidents. She says, “From a safety point of view when someone is driving safely that reduces accidents, lowers emissions and saves fuel. It’s a natural progression. When we introduced the weekly data reviews we immediately saw driving styles become much safer although our drivers do adjust over time.”
Telematics is definitely a growing market; in 2016 Fleet Industry News reported that almost 40% of UK businesses used the software, with the figure rising to 45% in London. The research, provided by RAC Business, found that over half (58%) saw a reduction in speeding and fines, a similar figure claimed a reduction in accidents, and 1 in 10 reported insurance premiums decreasing as a direct result.
Not everyone will be happy sharing driving styles, car routes and locations, but in the event of an accident involving a fully automated vehicle, a reluctance to share data with involved parties could lead to protracted and costly litigation. This could increase insurance premiums and ultimately make the cost of buying, or the risk of using, an automated car prohibitive for consumers.
The small issue of cybersecurity and data protection
The ABI issued guidance to help promote compliance with the Data Protection Act 1998 amongst insurers regarding telematics. The guidance emphasises that consumers must be given clear and comprehensive information to ensure they fully understand how their personal data will be collected. But such guidance only applies to insurers, so will similar rules need to be applied to vehicle manufacturers?
And what about that super scary tech revolution? After all, technology is fallible, and cybercrime is a significant issue. Where would liability lie if a vehicle’s control system was hacked and the vehicle used for criminal or terrorist purposes? You may have already seen something similar in The Fate of the Furious, the most recent of The Fast and the Furious franchise. In this film, Charlize Theron’s cyberterrorist Cipher and her team hack the high-tech electronics of cars in New York City. Operated remotely and able to do nothing but their master’s bidding, the cars are sent swarming after one target in particular.
Could this really happen? Well, it would depend on the degree of security built into the system, and it would take a huge team to pull off, but if it did happen, arguments could be raised that the vehicle manufacturer did not take sufficient steps to prevent the security breach and should therefore be liable to injured parties.
The cost of human impact
An altogether more depressing thought is that new technology could actually play the part of God when it comes to reacting in a collision. New Scientist reported that cars will soon come with ethical settings, where you can choose who would survive in a crash.
But really, could a computer ever take into consideration the damage a collision could do to a small child versus its owner? Or the guilt and powerful human emotions that factor into every minute decision we make behind the wheel?
Despite recent assurances that we are heading in a driverless direction, I think there is more that needs to be explicitly said about how and when this technology is used.
Legally, the ABI may have answered some of the simple questions around how the insurance will work, and the rise of the autonomous vehicle is an exciting proposition, but we still have many unanswered questions and, in reality, many will remain unresolved until these scenarios actually arise.
Driver assist or driverless? ABI Video: https://youtu.be/LSzTCjIRkY0