
Can AI have legal rights and assume liability?

NI YI | 2018-05-09
(Chinese Social Sciences Today)

A medical care robot was displayed at an AI conference held in Beijing in November 2017.

Sophia, a social humanoid robot, became the first non-human to be granted citizenship when Saudi Arabia recognized her in 2017. As artificial intelligence continues to drastically reshape production and daily life as we know them, the new technology poses novel challenges to China’s legal system.


“Amid the faint, bright shadows of the lamp, I came to know her adorable ‘soil,’ and my heart was captured.” This poem was written by the AI chatbot Xiaoice. Sunlight Loses the Glass Window, a collection of Xiaoice’s poems, was published in 2017, raising a new question: Can AI products claim intellectual property rights?


According to existing laws, works of intellectual property refer to products created by human beings, and AI programs are not subjects of IP rights, said Cao Xinming, director of the Center for Studies of Intellectual Property Rights at the Zhongnan University of Economics and Law. However, although works created by AI programs have yet to be recognized as intellectual products, they do have certain attributes of IP products, he said.


Even if AI works are recognized as subject matter of IP rights, the question of who can claim these rights remains controversial, Cao said. If AI products are considered instruments, then the developer of the program, the owner, the holder of its use rights, or several of these rights holders jointly could claim rights over AI works, he said.


On the other hand, if AI were considered an “artificial person,” its works would be fruits in the civil-law sense. In other words, AI would be treated like a hen and its works like eggs: just as eggs belong to the owner of the hen, AI works would belong to the owners of the AI products, Cao said.


In addition, AI programs are also capable of deep learning, which involves the collection and storage of information that might infringe on others’ IP rights. The application of AI in industries like automated driving could also cause accidents, which raises the issue of tort liability.


From the perspective of existing law, only civil subjects can bear tort liability, and AI products have not yet been recognized as subjects that can assume liability, said Cheng Xiao, a professor of law at Tsinghua University. It seems uncontroversial that the owner of an AI product should be liable for torts it commits, he said. However, considering that the concrete actions of AI are controlled by its programs, it is open to debate whether the owner or the developer of the program should be liable for tortious acts, Cheng said.


In terms of AI products, holding someone liable in tort is more likely to involve the principle of danger liability, or no-fault liability, Cheng said. In a case of torts caused by an automated driving vehicle, whether the case is framed as product liability or as liability for a motor vehicle accident, the no-fault principle would apply, meaning the defendant would be liable even if not at fault, he said. In the future, the use of AI products such as unmanned aerial vehicles may be treated as a high-risk activity, and the rules governing inherently dangerous activities may be applied to cases involving AI technology, he said.


(edited by MA YUHONG)