More clarity needed for driverless decision making, finds report

The report was compiled on behalf of UK Autodrive by legal experts at Gowling WLG.

Driverless vehicles will need to be programmed with a clear and agreed set of rules for decision-making, according to new research published on Tuesday by international law firm Gowling WLG.

In its report on “The Moral Algorithm”, Gowling WLG finds that concerns over the so-called “trolley problem” – where a vehicle must choose which of two individuals or groups to hit – may have been exaggerated, with most of the experts interviewed agreeing that autonomous vehicles (AVs) will never be programmed to make such distinctions.

Nevertheless, the paper argues that harmonised safety regulations will be needed for other kinds of decision, such as when it is permissible for a car to break the rules of the road, or how ‘assertive’ a vehicle should be when interacting with other road users.
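To make the idea concrete, such rules could in principle be expressed as an explicit, auditable policy. The sketch below is purely illustrative and assumes everything it shows: the report contains no code, and the `RoadContext` fields, the `may_cross_solid_line` check and the `assertiveness` parameter are hypothetical names invented for this example.

```python
from dataclasses import dataclass

# Hypothetical sketch only: neither the report nor UK Autodrive specifies
# any such rules. It simply shows what an explicit, auditable decision
# policy for "permissible rule-breaking" could look like in principle.

@dataclass
class RoadContext:
    lane_blocked: bool       # e.g. a parked delivery van ahead
    oncoming_gap_s: float    # seconds until the next oncoming vehicle
    pedestrians_nearby: bool

def may_cross_solid_line(ctx: RoadContext, assertiveness: float) -> bool:
    """Decide whether briefly crossing a solid line is permissible.

    `assertiveness` (0.0-1.0) is an assumed tuning parameter: a more
    assertive setting accepts a smaller gap in oncoming traffic.
    """
    if not ctx.lane_blocked:
        return False  # no reason to break the rule of the road
    if ctx.pedestrians_nearby:
        return False  # never trade rule-breaking for risk to people
    required_gap_s = 10.0 - 4.0 * assertiveness  # assumed thresholds
    return ctx.oncoming_gap_s >= required_gap_s

# A cautious setting waits; a more assertive one proceeds.
ctx = RoadContext(lane_blocked=True, oncoming_gap_s=7.0, pedestrians_nearby=False)
print(may_cross_solid_line(ctx, assertiveness=0.2))  # False: gap too small
print(may_cross_solid_line(ctx, assertiveness=0.8))  # True: gap acceptable
```

The substance of the report’s argument is that thresholds of this kind should be set by harmonised, agreed regulation rather than left to vary between individual manufacturers.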

The report, which is available to download free of charge, concludes with a series of eight recommendations. These include the creation of an independent regulator to balance the legality, safety and commerciality issues surrounding autonomous vehicles; the development of a policy on how the moral algorithm should operate in safety-critical situations; and a programme of public education and consultation.

Commenting on the outcome of the research, Stuart Young, a partner at Gowling WLG, said:

“It is important not to equate regulation with a burden. It can, in fact, facilitate new markets and important developments. Unless completely new legislation is put in place that accommodates new products before they are produced, companies are likely to face huge additional risks as a result of regulatory uncertainty.”

The “Moral Algorithm” study took the form of interviews with industry specialists and representatives from the UK Autodrive consortium during September and October 2016, together with desk research and analysis of publicly available information.

Speaking about the dilemmas that could be posed once cars are required to make complex decisions, Tim Armitage, Arup’s UK Autodrive Project Director, said:

“As with any complex new technology, AVs cannot be specifically programmed to respond to every possible scenario. This simply isn’t practical when a machine is expected to interact with humans, in a complex environment, on a day-to-day basis. AVs will drive to the speed limits and will not be distracted from the task of safe driving; they will make practical decisions based on their programming, but they cannot be expected to make moral decisions around which society provides no agreed guidance. To allow AVs to demonstrate their capacity for practical decision-making in complex environments, and to begin to establish public trust through contact, the first step is to allow testing in relatively simple and well-defined environments. Of course, regulation will need to keep up, so, echoing Stuart’s sentiments, it is vital that the legal industry acts now to help create a realistic and viable route to market for AVs.”