27 January 2017    PE 582.443v03-00    A8-0005/2017

REPORT

with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL))
Committee on Legal Affairs
Rapporteur: Mady Delvaux
(Initiative – Rule 46 of the Rules of Procedure)
Rapporteurs for the opinions (*):
Georg Mayer, Committee on Transport and Tourism
Michał Boni, Committee on Civil Liberties, Justice and Home Affairs
(*) Associated committees – Rule 54 of the Rules of Procedure
AMENDMENTS: 001-001, 002-003/r1, 004-008, 009-010/r1, 009-013, 012-013/r1
CONTENTS
MOTION FOR A EUROPEAN PARLIAMENT RESOLUTION
ANNEX TO THE MOTION FOR A RESOLUTION: DETAILED RECOMMENDATIONS AS TO THE CONTENT OF THE PROPOSAL REQUESTED
EXPLANATORY STATEMENT
OPINION of the Committee on Transport and Tourism (*)
OPINION of the Committee on Civil Liberties, Justice and Home Affairs (*)
OPINION of the Committee on Employment and Social Affairs
OPINION of the Committee on the Environment, Public Health and Food Safety
OPINION of the Committee on Industry, Research and Energy
OPINION of the Committee on the Internal Market and Consumer Protection
RESULT OF FINAL VOTE IN COMMITTEE RESPONSIBLE
MOTION FOR A EUROPEAN PARLIAMENT RESOLUTION
with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL))

The European Parliament,

–  having regard to Article 225 of the Treaty on the Functioning of the European Union,
–  having regard to the Product Liability Directive 85/374/EEC,
–  having regard to Rules 46 and 52 of its Rules of Procedure,
–  having regard to the report of the Committee on Legal Affairs and the opinions of the Committee on Transport and Tourism, the Committee on Civil Liberties, Justice and Home Affairs, the Committee on Employment and Social Affairs, the Committee on the Environment, Public Health and Food Safety, the Committee on Industry, Research and Energy and the Committee on the Internal Market and Consumer Protection (A8-0005/2017),
(1) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
(2) (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. (2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws (see: I. Asimov, Runaround, 1943); and (0) A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
(3) Cf. the Schuman Declaration (1950): "Europe will not be made all at once, or according to a single plan. It will be built through concrete achievements which first create a de facto solidarity."
(4) Council Directive 85/374/EEC of 25 July 1985 on the approximation of the laws, regulations and administrative provisions of the Member States concerning liability for defective products (OJ L 210, 7.8.1985, p. 29).
(5) Texts adopted, P8_TA(2015)0390.
(6) Regulation (EC) No 216/2008 of the European Parliament and of the Council of 20 February 2008 on common rules in the field of civil aviation and establishing a European Aviation Safety Agency, and repealing Council Directive 91/670/EEC, Regulation (EC) No 1592/2002 and Directive 2004/36/EC (OJ L 79, 19.3.2008, p. 1).
(7) See the European Parliament legislative resolution of 2 April 2014 on the proposal for a regulation of the European Parliament and of the Council on medical devices, and amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 (COM(2012)0542 – C7-0318/2012 – 2012/0266(COD)).
(8) Regulation (EC) No 864/2007 of the European Parliament and of the Council of 11 July 2007 on the law applicable to non-contractual obligations (Rome II) (OJ L 199, 31.7.2007, p. 40).
(9) Council Regulation (EC) No 428/2009 setting up a Community regime for the control of exports, transfer, brokering and transit of dual-use items (OJ L 134, 29.5.2009, p. 1).
ANNEX TO THE MOTION FOR A RESOLUTION: DETAILED RECOMMENDATIONS AS TO THE CONTENT OF THE PROPOSAL REQUESTED
Definition and classification of 'smart robots'

A common European definition for smart autonomous robots should be established, including, where appropriate, definitions of its subcategories, taking into consideration the following characteristics:
–  the capacity to acquire autonomy through sensors and/or by exchanging data with its environment (inter-connectivity) and the analysis of those data;
–  the capacity to learn through experience and interaction;
–  the form of the robot's physical support;
–  the capacity to adapt its behaviour and actions to the environment.

Registration of smart robots

For the purposes of traceability, and in order to facilitate the implementation of further recommendations, a system of registration of advanced robots should be introduced, based on the criteria established for the classification of robots. The system of registration and the register should be Union-wide, covering the internal market, and could be managed by a designated EU Agency for Robotics and Artificial Intelligence, should such an Agency be created.

Civil law liability

Any legal solution chosen to apply to the liability of robots and of artificial intelligence, in cases other than those of damage to property, should in no way restrict the type or the extent of the damages which may be recovered, nor limit the forms of compensation which may be offered to the aggrieved party, on the sole ground that the damage was caused by a non-human agent.

The future legislative instrument should be based on an in-depth evaluation by the Commission determining whether the strict liability or the risk management approach should be applied.

An obligatory insurance scheme, which could be based on the obligation of the producer to take out insurance for the autonomous robots it produces, should be established. The insurance system should be supplemented by a fund in order to ensure that damages can be compensated for in cases where no insurance cover exists.

Any policy decision on the civil liability rules applicable to robots and artificial intelligence should be taken in due consultation with a European-wide research and development project dedicated to robotics and neuroscience, with scientists and experts able to assess all related risks and consequences.

Interoperability, access to code and intellectual property rights

The interoperability of network-connected autonomous robots that interact with each other should be ensured. Access to the source code, input data and construction details should be available when needed, in order to investigate accidents and damage caused by smart robots, as well as to ensure their continued operation, availability, reliability, safety and security.

Charter on Robotics

The Commission, when proposing legal acts relating to robotics, should take into account the principles enshrined in the following Charter on Robotics.

CHARTER ON ROBOTICS

The proposed code of ethical conduct in the field of robotics will lay the groundwork for the identification of, oversight of and compliance with fundamental ethical principles from the design and development phase onwards.

The framework, drafted in consultation with a European-wide research and development project dedicated to robotics and neuroscience, must be designed in a reflective manner that allows individual adjustments to be made on a case-by-case basis, in order to assess whether a given behaviour is right or wrong in a given situation and to take decisions in accordance with a pre-set hierarchy of values.

The code should not replace the need to tackle all major legal challenges in this field, but should have a complementary function: it should facilitate the ethical categorisation of robotics, strengthen the responsible innovation efforts in this field and address public concerns.

Special emphasis should be placed on the research and development phases of the relevant technological trajectory (design process, ethics review, audit controls, etc.).
It should aim to address the need for compliance by researchers, practitioners, users and designers with ethical standards, and to introduce a procedure for resolving the relevant ethical dilemmas, allowing these systems to function in an ethically responsible manner.

CODE OF ETHICAL CONDUCT FOR ROBOTICS ENGINEERS

PREAMBLE

The Code of Conduct invites all researchers and designers to act responsibly and with absolute consideration for the need to respect the dignity, privacy and safety of humans.

The Code asks for close cooperation among all disciplines in order to ensure that robotics research is undertaken in the European Union in a safe, ethical and effective manner.

The Code of Conduct covers all research and development activities in the field of robotics. It is voluntary and offers a set of general principles and guidelines for actions to be taken by all stakeholders.

Robotics research funding bodies, research organisations, researchers and ethics committees are encouraged to consider, at the earliest stages, the future implications of the technologies or objects being researched, and to develop a culture of responsibility with a view to the challenges and opportunities that may arise in the future.

Public and private robotics research funding bodies should request that a risk assessment be performed and presented along with each submission of a proposal for funding for robotics research.
Such a code should consider humans, not robots, as the responsible agents.

Researchers in the field of robotics should commit themselves to the highest ethical and professional conduct and abide by the following principles:
–  Beneficence: robots should act in the best interests of humans;
–  Non-maleficence: the doctrine of 'first, do no harm', whereby robots should not harm a human;
–  Autonomy: the capacity to make an informed, un-coerced decision about the terms of interaction with robots;
–  Justice: the fair distribution of the benefits associated with robotics, and the affordability of homecare and healthcare robots in particular.

Fundamental rights

Robotics research activities should respect fundamental rights and be conducted in the interests of the well-being and self-determination of the individual and of society at large, in their design, implementation, dissemination and use. Human dignity and autonomy, both physical and psychological, are always to be respected.

Precaution

Robotics research activities should be conducted in accordance with the precautionary principle, anticipating the potential safety impacts of outcomes and taking due precautions, proportional to the level of protection, while encouraging progress for the benefit of society and the environment.

Inclusiveness

Robotics engineers should guarantee transparency and respect for the legitimate right of access to information by all stakeholders. Inclusiveness allows for participation in decision-making processes by all stakeholders involved in or concerned by robotics research activities.

Accountability

Robotics engineers should remain accountable for the social, environmental and human-health impacts that robotics may impose on present and future generations.

Safety

Robot designers should consider and respect people's physical wellbeing, safety, health and rights.
A robotics engineer must preserve human wellbeing, while also respecting human rights, and promptly disclose any factors that might endanger the public or the environment.

Reversibility

Reversibility, being a necessary condition of controllability, is a fundamental concept when programming robots to behave safely and reliably. A reversibility model tells the robot which actions are reversible and, if they are, how to reverse them. The ability to undo the last action, or a sequence of actions, allows users to undo undesired actions and get back to the 'good' stage of their work.

Privacy

The right to privacy must always be respected. A robotics engineer should ensure that private information is kept secure and used only appropriately. Moreover, a robotics engineer should guarantee that individuals are not personally identifiable, other than in exceptional circumstances, and then only with clear, unambiguous informed consent. Human informed consent should be sought and obtained prior to any man-machine interaction. Robotics designers therefore have a responsibility to develop and follow procedures for valid consent, confidentiality, anonymity, fair treatment and due process. Designers should comply with any request that related data be destroyed and removed from any dataset.

Maximising benefit and minimising harm

Researchers should seek to maximise the benefits of their work at all stages, from inception through to dissemination. Harm to the participants or human subjects of an experiment, trial or study must be avoided. Where risks arise as an unavoidable and integral element of the research, robust risk-assessment and risk-management protocols should be developed and complied with. Normally, the risk of harm should be no greater than that encountered in ordinary life, i.e. people should not be exposed to risks greater than, or additional to, those to which they are exposed in their normal lifestyles.
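The reversibility model described under 'Reversibility' above is, in software terms, an undo mechanism. A minimal sketch (all class and action names here are hypothetical, chosen purely for illustration) records each reversible action together with its inverse, so that the last action, or a whole sequence, can be rolled back to the 'good' state:

```python
# Illustrative sketch of a reversibility model (hypothetical names):
# each reversible action is recorded together with its inverse, so the
# last action, or a whole sequence, can be undone.

class ReversibilityModel:
    def __init__(self):
        self._undo_stack = []  # (description, inverse_fn) pairs

    def perform(self, description, action_fn, inverse_fn=None):
        """Run an action; record its inverse if it is reversible."""
        result = action_fn()
        if inverse_fn is not None:
            self._undo_stack.append((description, inverse_fn))
        else:
            # An irreversible action: no earlier state can be restored
            # past this point, so the recorded history is discarded.
            self._undo_stack.clear()
        return result

    def undo_last(self):
        """Undo the most recent reversible action, if any."""
        if not self._undo_stack:
            return None
        description, inverse_fn = self._undo_stack.pop()
        inverse_fn()
        return description

    def undo_all(self):
        """Return to the last 'good' stage by undoing in reverse order."""
        while self._undo_stack:
            self.undo_last()

# Illustrative use: a robot arm position tracked as a single number.
position = [0]
model = ReversibilityModel()
model.perform("move +5", lambda: position.__setitem__(0, position[0] + 5),
              lambda: position.__setitem__(0, position[0] - 5))
model.perform("move +2", lambda: position.__setitem__(0, position[0] + 2),
              lambda: position.__setitem__(0, position[0] - 2))
model.undo_all()
print(position[0])  # back to 0
```

The design choice sketched here, clearing the history when an irreversible action occurs, reflects the Charter's point that reversibility is a condition of controllability: once an action cannot be reversed, earlier states are no longer reachable.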
The operation of a robotics system should always be based on a thorough risk-assessment process, informed by the precautionary and proportionality principles.

CODE FOR RESEARCH ETHICS COMMITTEES (RECs)

Principles

Independence

The ethics review process should be independent of the research itself. This principle highlights the need to avoid conflicts of interest between researchers and those reviewing the ethics protocol, and between reviewers and organisational governance structures.

Competence

The ethics review process should be conducted by reviewers with appropriate expertise, taking into account the need for careful consideration of the range of membership and the ethics-specific training of RECs.

Transparency and accountability

The review process should be accountable and open to scrutiny. RECs need to recognise their responsibilities and to be appropriately located within organisational structures that give transparency to REC operations and procedures to maintain and review standards.

The role of a Research Ethics Committee

A REC is normally responsible for: reviewing all research involving human participants conducted by individuals employed within or by the institution concerned; ensuring that ethics review is independent, competent and timely; protecting the dignity, rights and welfare of research participants; considering the safety of the researcher(s); considering the legitimate interests of other stakeholders; making informed judgements of the scientific merit of proposals; and making informed recommendations to the researcher if the proposal is found to be wanting in some respect.

The constitution of a Research Ethics Committee

A REC should normally be multidisciplinary, include both men and women, and comprise members with broad experience of and expertise in the area of robotics research.
The appointment mechanism should ensure that the committee members provide an appropriate balance of scientific expertise, philosophical, legal or ethical backgrounds, and lay views, and that they include at least one member with specialist knowledge of ethics, users of specialist health, education or social services where these are the focus of the research activities, and individuals with specific methodological expertise relevant to the research they review; the committee must also be so constituted that conflicts of interest are avoided.

Monitoring

All research organisations should establish appropriate procedures to monitor the conduct of research which has received ethics approval until it is completed, and to ensure continuing review where the research design anticipates possible changes over time that might need to be addressed. Monitoring should be proportionate to the nature and degree of risk associated with the research. Where a REC considers that a monitoring report raises significant concerns about the ethical conduct of a study, it should request a full and detailed account of the research for full ethics review.
Where it is judged that a study is being conducted unethically, the withdrawal of its approval should be considered and the research should be suspended or discontinued.

LICENCE FOR DESIGNERS

–  You should take into account the European values of dignity, autonomy and self-determination, freedom and justice before, during and after the process of design, development and delivery of such technologies, including the need not to harm, injure, deceive or exploit (vulnerable) users.
–  You should introduce trustworthy system-design principles across all aspects of a robot's operation, for both hardware and software design, and for any data processing on or off the platform for security purposes.
–  You should introduce privacy-by-design features so as to ensure that private information is kept secure and used only appropriately.
–  You should integrate obvious opt-out mechanisms (kill switches) that are consistent with reasonable design objectives.
–  You should ensure that a robot operates in a way that is in accordance with local, national and international ethical and legal principles.
–  You should ensure that the robot's decision-making steps are amenable to reconstruction and traceability.
–  You should ensure that maximal transparency is required in the programming of robotic systems, as well as predictability of robotic behaviour.
–  You should analyse the predictability of a human-robot system by considering uncertainty in interpretation and action, and possible robotic or human failures.
–  You should develop tracing tools at the robot's design stage.
These tools will facilitate accounting for and explanation of robotic behaviour, even if limited, at the various levels intended for experts, operators and users.

–  You should draw up design and evaluation protocols and join with potential users and stakeholders when evaluating the benefits and risks of robotics, including cognitive, psychological and environmental ones.
–  You should ensure that robots are identifiable as robots when interacting with humans.
–  You should safeguard the safety and health of those interacting and coming into contact with robots, given that robots as products should be designed using processes which ensure their safety and security. A robotics engineer must preserve human wellbeing while also respecting human rights, and may not deploy a robot without safeguarding the safety, efficacy and reversibility of the operation of the system.
–  You should obtain a positive opinion from a Research Ethics Committee before testing a robot in a real environment or involving humans in its design and development procedures.

LICENCE FOR USERS

–  You are permitted to make use of a robot without risk or fear of physical or psychological harm.
–  You should have the right to expect a robot to perform any task for which it has been explicitly designed.
–  You should be aware that any robot may have perceptual, cognitive and actuation limitations.
–  You should respect human frailty, both physical and psychological, and the emotional needs of humans.
–  You should take the privacy rights of individuals into consideration, including by deactivating video monitors during intimate procedures.
–  You are not permitted to collect, use or disclose personal information without the explicit consent of the data subject.
–  You are not permitted to use a robot in any way that contravenes ethical or legal principles and standards.
–  You are not permitted to modify any robot to enable it to function as a weapon.
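The traceability obligations in the licence for designers above (decision-making steps amenable to reconstruction, tracing tools built in at the design stage) could be realised, in one possible form, as an append-only decision log. The sketch below is illustrative only; the class and field names are assumptions, not part of the recommendations:

```python
# Illustrative sketch of a decision-trace log (hypothetical names):
# every decision step is recorded with its inputs and outcome, so the
# robot's behaviour can later be reconstructed and explained.

import json
import time

class DecisionTrace:
    def __init__(self):
        self._records = []  # append-only list of decision records

    def record(self, step, inputs, outcome):
        """Log one decision point with what was sensed and what was decided."""
        self._records.append({
            "timestamp": time.time(),
            "step": step,        # which decision point was reached
            "inputs": inputs,    # the data the decision was based on
            "outcome": outcome,  # the action the robot chose
        })

    def reconstruct(self):
        """Return the decision steps in order, for experts, operators and users."""
        return [(r["step"], r["outcome"]) for r in self._records]

    def export(self):
        """Serialise the full trace, e.g. for an accident investigation."""
        return json.dumps(self._records, indent=2)

# Illustrative use with invented decision points.
trace = DecisionTrace()
trace.record("obstacle_check", {"distance_m": 0.4}, "stop")
trace.record("replan", {"blocked": True}, "route_B")
print(trace.reconstruct())  # [('obstacle_check', 'stop'), ('replan', 'route_B')]
```

An append-only structure matters here: a trace that can be rewritten after the fact cannot support the accounting and explanation of behaviour that the licence calls for.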

 http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+REPORT+A8-2017-0005+0+DOC+XML+V0//EN