When cars operate themselves, law should be in driver’s seat

By Wang Guanglu / 08-11-2016 / (Chinese Social Sciences Today)

A Tesla Model S was involved in a fatal crash with a tractor-trailer truck on a highway near Williston, Florida, U.S., on May 7, 2016. The NTSB confirmed that the driver of the Tesla was using Autosteer lane-keeping assistance at the time of the accident.

 

A string of recent accidents involving self-driving Tesla vehicles has raised red flags about the risks inherent in this rapidly developing field of technology, underscoring the urgent need for regulations.


The focus of automated driving research should be on traffic safety, said Gu Dasong, an associate professor from the School of Law at Southeast University in Nanjing.


Although various ethical and legal controversies remain unresolved, intelligent driving should be seen as progress, so tests should be allowed, said Wang Lulu, a professor of ethics from the School of Public Administration at Nanjing Normal University. However, we should also recognize potential risks and find timely solutions, she added.
 

An intelligent driving system is not a conscious entity acting of its own free will. Therefore, its relationship with human drivers and pedestrians falls outside the purview of traditional ethics, Wang Lulu said.


The autonomous system makes decisions based on preset programs, which cannot account for all possible situations. At the same time, a system built on preset technical parameters is unable to make ethical choices when necessary, so it becomes difficult to establish legal liability should an accident occur, she said.
 

The biggest regulatory challenge designers face is the gap between controlled tests under ideal conditions and the reality of adapting to complex situations on the road, Gu said. Self-driving technology has become a frontier of international competition in transportation, and it could also affect the structure of society.
 

Public security, transportation management and other government bodies that regulate vehicle testing and licensing on public roads should draft new rules and regulations to keep pace with the times, Gu said.
 

The application of this technology in high-risk industries and occupations may also reduce accidents caused by unavoidable human error and fatigue, said Wang Guoyu, a professor of philosophy of science and technology in the Department of Philosophy at Dalian University of Technology. But Wang said laws and regulations have to stay in step with rapid technological change.


One of the main outstanding issues is how to determine responsibility should intelligent technology operate in a way that violates human ethics or causes harm, Wang Guoyu said. The intelligent system may be able to replace human drivers, but machines cannot be held accountable. Therefore, civil liability in the event of an accident falls on the manufacturers, designers and users, she said.


Wang Lulu pointed out that accidents are unavoidable even when humans are driving, but the safety of self-driving cars can be assured through extensive testing, which is the only path to widespread adoption. Based on the principles of risk management, designers can strike a balance between technological progress and proper allocation of liability by reining in new technology with a rational framework of ethics and law, she said.
 

Researchers in ethics and law should join forces with technology experts to develop a coherent set of regulations that will lay the legal foundation for widespread use of autonomous driving technology, Wang Guoyu said. Providing civil restitution to victims and recalling defective products are both costly, and in some cases the problems could prove even more intractable, she added.
 

Users of automated car systems have a moral and legal obligation to intervene and take manual control of vehicles in certain circumstances, Gu said, but when accidents are caused by defects in vehicles, the manufacturer should bear the burden of proof. However, if it is determined that the manufacturer is neither at fault nor negligent, then a judgment should be made about whether the user acted negligently. In circumstances where accidents are highly probable, the "no-fault" principle should be adopted, he said.

Wang Guanglu is a reporter at Chinese Social Sciences Today.