LIVING WITH ONLINE ROBOTS

Rafael Capurro




Contribution to the
Friedrich-Ebert-Foundation and University of Tsukuba Joint Symposium: Robo-Ethics and "Mind-Body-Schema" of Human and Robot - Challenges for a Better Quality of Life, University of Tsukuba (Japan), Keynote: Robo-Ethics (PP), January 23, 2015. This text was reshaped for publication as: Intercultural Roboethics for a Robot Age. In: Makoto Nakada, Rafael Capurro and Koetsu Sato (Eds.): Critical Review of Information Ethics and Roboethics in East and West. Master's and Doctoral Program in International and Advanced Japanese Studies, Research Group for "Ethics and Technology in the Information Era", University of Tsukuba 2017 (ISSN 2432-5414), 13-18.
See also The Quest for Roboethics. A Survey.





CONTENTS

What is a robot?
Robots in everyday life
What kind of in-built rules of behaviour should robots have?
What are the social risks and opportunities?
Conclusion
Acknowledgements
References



WHAT IS A ROBOT?

The philosophical reflection on robots in the Western tradition goes back more than two and a half millennia. Aristotle writes in his Politics (Pol. I, 1253 b 22-39) that if all lifeless (apsycha) instruments (organa) necessary for the household – as distinct from living 'instruments' such as the slave (doulos) or the assistant of the steersman (kybernetes) – could accomplish their work (ergon), master builders and household administrators would no longer need slaves. Their work would be done by such instruments moving by themselves (automatos), obeying an order (keleusthen) or foreseeing what to do in advance (proaisthanómenon). Although Aristotle lived in a slave-based society, he did not regard being a slave as a natural condition, a common bias at that time and for centuries afterwards. The Aristotelian definition of the automatos as a machine designed to accomplish different tasks autonomously and heteronomously according to its owner's orders is still valid when placed in today's Internet context.

Of course, there is a long history between Aristotle and today's concept(s) of robots. I would like to highlight the Arabic-Islamic tradition by referring to the exhibition "Allah's Automata" organized by the Zentrum für Kunst und Medientechnologie (ZKM Karlsruhe; curator: Siegfried Zielinski; co-curators: Eckhard Fürlus and Daniel Irrgang). I quote:

"The first Renaissance did not take place in Europe, but in Mesopotamia. Arabic-Islamic culture functioned – from a media-archaeological point of view – as a mediator between classical antiquity and the early Modern age in Europe. As part of the exhibition »Exo-Evolution« and on the basis of outstanding examples, the exhibition explores the rich and fascinating world of the automata that were developed and built during the golden age of the Arabic-Islamic cultures, the period from the early 9th to the 13th century.
The machines to glorify God Almighty draw mainly on the traditions of Greek Alexandria and Byzantium. They introduced spectacular innovations, which did not emerge in Europe until the Modern era: permanent energy supply, universalism, and programmability. For the first time, four of the master manuscripts of automata construction from Baghdad, Kurdistan, and Andalusia are on show together: the »Kitab al-Hiyal« [Book of Ingenious Devices] (ca. 850 CE) by the Banū Mūsā Ibn-Šākir; the »Kitab al-urghanun« [Book of the Organ] from the same period, a masterpiece of all modern programmable music automata; the »Kitab fi ma’rifat al-hiyal al-handasiyya« [Compendium on the Theory and Practice of the Mechanical Arts] (1206 CE) by the Kurdish engineer Al-Jazarī; and the »Kitab al-Asrar fi Nataij al Afkar« [Book of Secrets] by the Andalusian engineer Alī Ibn Khalaf al-Murādī.
Furthermore, the exhibition shows three reconstructions of legendary artifacts: Al-Jazarī’s masterpiece among his audiovisual automata, the so-called Elephant Clock – a spectacular object for hearing and seeing time –, and the programmable music automaton by the Banū Mūsā as a functioning mechatronic model." (GLOBALE 2015)


I also want to mention the Jewish story of the golem attributed to Judah Loew ben Bezalel, a rabbi of Prague in the 16th century, which goes back to earlier medieval stories and to the Bible (Psalm 139:16). Golem means – I quote from the Wikipedia article "Golem" – "'my unshaped form' connoting the unfinished human being before God's eyes. [...] In Modern Hebrew, golem is used to mean 'dumb' or 'helpless'." See: Cathy S. Gelbin: The Golem Returns: From German Romantic Literature to Global Jewish Culture, 1808-2008. The University of Michigan Press 2001.

Polish science fiction author Stanisław Lem (1921-2006) wrote the novel "Golem XIV", published in 1981. The golem story also inspired the Russian-American science fiction author Isaac Asimov (1920-1992). His collection of stories "I, Robot" was published in 1950; one of its stories lists Asimov's three laws of robotics (Wikipedia: Three Laws of Robotics).

"The word robot" – I quote from the Wikipedia article "Robot" – "was introduced to the public by the Czech interwar writer Karel Čapek in his play R.U.R. (Rossum's Universal Robots), published in 1920. [...] In an article in the Czech journal Lidové noviny in 1933, he explained that he had originally wanted to call the creatures laboři ("workers", from Latin labor). However, he did not like the word, and sought advice from his brother Josef, who suggested "roboti". The word robota means literally "corvée", "serf labor", and figuratively "drudgery" or "hard work" in Czeck and also "more general" "work, "labor" in many Slavic languages."

The history of robots is closely related to the history of puppets and marionettes, i.e., puppets "controlled from above using wires or strings depending on regional variations." (Wikipedia article "Marionette").

"Puppetry is a very ancient art form, thought to have originated about 3000 years ago. Puppets have been used since the earliest times to animate and communicate the ideas and needs of human societies. Some historians claim that they pre-date actors in theatre. There is evidence that they were used in Egypt as early as 2000 BC when string-operated figures of wood were manipulated to perform the action of kneading bread. Wire controlled, articulated puppets made of clay and ivory have also been found in Egyptian tombs Hieroglyphs also describe "walking statues" being used in Ancient Egyptian religious dramas. Puppetry was practiced in Ancient Greece and the oldest written records of puppetry can be found in the works of Herodotus and Xenophon, dating from the 5th century BC. [...]  
Sub-Saharan Africa may have inherited some of the puppet traditions of Ancient Egypt. Certainly, secret societies in many African ethnic groups still use puppets (and masks) in ritual dramas as well as in their healing and hunting ceremonies.[...]
The epic Mahabharata, Tamil literature from the Sangam Era, and various literary works dating from the late centuries BC to the early centuries AD, including Ashokan edicts, describe puppets. [...]

Some scholars trace the origin of puppets to India 4000 years ago, where the main character in Sanskrit plays was known as "Sutradhara", "the holder of strings". China has a history of puppetry dating back 2000 years, originally in "pi-ying xi", the "theatre of the lantern shadows", or, as it is more commonly known today, Chinese shadow theatre. [...]

Japan has many forms of puppetry, including the bunraku. Bunraku developed out of Shinto temple rites and gradually became a highly sophisticated form of puppetry. Chikamatsu Monzaemon, considered by many to be Japan's greatest playwright, gave up writing Kabuki plays and focused exclusively on the puppet-only Bunraku plays. Initially consisting of one puppeteer, by 1730 three puppeteers were used to operate each puppet in full view of the audience. The puppeteers, who dressed all in black, would become invisible when standing against a black background, while the torches illuminated only the carved, painted and costumed wooden puppets."

A last quote from the Wikipedia article "Bunraku":

"Bunraku (文楽), also known as Ningyō jōruri (人形浄瑠璃), is a form of traditional Japanese puppet theatre, founded in Osaka in 1684. Three kinds of performers take part in a bunraku performance: the Ningyōtsukai or Ningyōzukai (puppeteers), the Tayū (chanters) and shamisen musicians. Occasionally other instruments such as taiko drums will be used. The most accurate term for the traditional puppet theater in Japan is ningyō jōruri (人形浄瑠璃). The combination of chanting and shamisen playing is called jōruri and the Japanese word for puppet (or dolls, generally) is ningyō. It is used in many plays. Bunraku puppetry has been a documented traditional activity for Japanese for hundreds of years."

ROBOTS IN EVERYDAY LIFE


Will robots become widespread in the 21st century, much as cars, TV sets and washing machines did in the last century? If so, how quickly will this happen? Or is this already the case, not only in industry but also in everyday life? Are computers robots? Is a smartphone a robot? Robots used in health care as well as for household tasks such as cleaning or gardening are some examples. There might be people who say: "I don't like robots in my kitchen", while others argue: "Of course I like robots in my kitchen because I live alone and I don't have the time to do the housework every day myself." Aimee van Wynsberghe writes:

"In my thesis, entitled “Designing Robots with Care: Creating an ethical framework for the future design and implementation of care robots”, I addressed robots intended to be designed for nurses in their role as care giver. These robots are hoped to help with the increase in care demands of society on healthcare systems across the globe. Alongside the foreseen benefits there are a variety of ethical concerns related to this emerging technology. Such issues include; how the standard or quality of care might change when human nurses are no longer the sole care providers or how this technology might displace care workers from their role as the stewards of care? I do not claim that care robots (robots in healthcare) should be made and used for any care purpose but I also do not claim that care robots should never be made or used. Instead, my goal has been to explore the ethical limits within which these robots can be made and used. To do this I have created a novel framework for their design and implementation (Care Centered Value Sensitive Design) that relies on the care ethics tradition along with the Value-Sensitive Design approach. The hope is that by steering the design of this technology in a manner that incorporates care values into the technical content of the care robot, robot designers can avoid the majority of negative ethical concerns or risks.” (Wynsberghe 2015: Homepage; Wynsberghe 2016)

The ethical and legal discussion about self-driving cars is still in its infancy. People may develop feelings towards robots similar to those they have towards dogs and cats, attributing a kind of self-awareness to them. Children project the idea of life onto their toys, a kind of animism (Kaplan 2005). Robots can be seen from a religious point of view; the Shinto tradition in Japan, for instance, includes animism. Robots can also be seen from the point of view of social justice: workers may lose their jobs because robots take them over, as we know from other kinds of technologies. Following Buddhist ethics, we should not build robots capable of suffering. Ethical issues of cyber warfare are closely related to robots (Altmann and Vidal 2013; Capurro and Marsiske 2012). Gender issues are an important topic in roboethics (Weber 2007).

There is speculation, particularly in science fiction films and novels, about robots some day becoming autonomous moral beings towards which we would have a relation of mutual respect instead of merely giving them orders to accomplish useful tasks for us. If this were the case, the concept of robot would no longer be appropriate, as it implies a relation of dependency on its owner as its master, whose orders it follows in accordance with Isaac Asimov's "laws of robotics" and with societal customs (Capurro 1995).

The acceptance of robots in Japanese culture is closely related to the influence of mangas. As the German historian of technology Stefan Krebs remarks:

"The Japanese manga author, Osamu Tezuka, paints a quite technically euphoric, optimistic picture of the 21st century ―"robot society." For him, the actual conflicts are between the developers and users of robot technology, and not between robots and humans. Robots appear as neutral tools or as humans‘ partners. In the Japanese reception of the Tetsuwan Atomu mangas, the ethical conflicts are the burden of human agents alone (Leis 2006: S-2). [...] Tezuka offers no real attempts at a solution for the ethical conflicts between humans and robots in his stories. To expect this would hardly do justice to the manga‘s humble pretences. Still, at the end there remains an uncritical attitude toward technology. Here a widespread ideology of a value neutrality of science and technology shines through which can also easily be found in the West (Hornyak 2006: 47- 51).[...] pop culture often perpetuates and pronounces stereotypes and simplified ideas of science and technology. The Tesuwan Atomu mangas were intended to buttress the techno-euphoria of the years of recuperation from the lost Second World War, thus contributing to the country‘s recovery (Schodt 1988: 75-79). For this, their current effect ought to be examined all the more critically." (Krebs 2006, 67)

In her book on the acceptance of robots in Japan, Cosima Wagner writes:

"[...] on the one hand, as a Japanese Studies research topic "social robots" illustrate the "negotiation character of the creation and use of technological artefacts" (Hörning), which for example includes the rejection of military applications of robot technology in Japan. On the other hand, as a cultural topos, they mirror dreams, desires and needs of human beings at a certain time and therefore have to be interpreted as political objects as. As a source for a Japanese history of objects ‘social’ robots exemplify the cultural meaning of robots, the expectations of the Japanese state and economy, the mentality of Japanese engineers and scientists, and last but not least the socio-cultural change, which the ageing Japanese society is about to face.” (Wagner 2013, English abstract)


WHAT KIND OF IN-BUILT RULES OF BEHAVIOUR
SHOULD ROBOTS HAVE?

A robot should accomplish a task according to programmed rules. This is not only a technical but also an ethical issue. Ethics (from Greek ethiké) or moral philosophy is a philosophical discipline dealing with morality (from Latin mores), i.e., the implicit or explicit customs and rules of behaviour that form the core of the culture of a society. The editors of the International Review of Information Ethics, special issue on Ethics in Robotics, write:

"Our main values are embedded into all our technological devices. Therefore, the question is: which values are we trying to realize through them? Artificial creatures are a mirror of shared cultural values. Humans redefine themselves in comparison with robots. This redefinition of living organisms in technological terms has far-reaching implications. Long-term reflections need to be developed and plausible scenarios need to be anticipated." (Capurro, Hausmanninger, Weber, Weil 2006).

It is therefore important to analyze technology in general and robotics in particular in different cultural contexts. I call this kind of analysis intercultural robo-ethics, taking into consideration that the concept of ethics itself is understood differently in the Western tradition and in other traditions. With regard to Japan, Naho Kitano from Waseda University writes:

"'Rinri', the Japanese Ethics.
When discussing the ethics of using a robot, I have been using the term "Roboethic" generally in my research, but it is used in very particular ways especially at international conferences. The word for "Ethics" in Japanese is Rinri. However, the Japanese concept of ethics differs from the Western concept of ethics, and this can lead to misunderstandings. In Japan, ethics is the study of the community, or of the way of achieving harmony in human relationships, while in the West, ethics has a more subjective and individualistic basis. The contrast can be observed, for example, in the concept of social responsibility. In Japan, responsibility in the sense of moral accountability for one's action already existed in the classical period, but the individual was inseparable from status (or social role) in the community. Each individual had a responsibility toward the universe and the community. Thus in Japan, social virtue lay in carrying out this responsibility." (Kitano 2006, 80)

In the Aristotelian tradition the concept of ethics (philosophia ethiké) is, indeed, related to moulding or 'in-forming' the individual character, but ethics belongs, together with issues concerning the household (philosophia oikonomiké, from Greek oikos) and the city-state (philosophia politiké, from Greek polis), to what Aristotle calls 'practical philosophy' (philosophia praktiké). In Modernity there is, for instance, the tradition of utilitarianism, which is both individually and socially oriented, and there is the Kantian tradition, which is prima facie oriented towards the individual but under the perspective that her practical maxims should be universalizable. Ethics or moral philosophy understood as "problematization" of morality, as I understand it following Michel Foucault's paths of thought (Foucault 1999), deals with the customs and values of a society. There is no such thing as a worldless, isolated subject (Heidegger 1976; Capurro, Eldred, Nagel 2013).

The difference between ethics or moral philosophy and its object of study, namely morality or social rules, is crucial with regard to the question about in-built rules of behaviour for robots. Such rules are moral rules, i.e., robots are supposed to follow them, not to problematize them, even if they might be able to 'choose' between different rules. Such rules are human rules; they do not concern the robot in its being. There is a morality of robots (genitivus obiectivus), i.e., moral programs for robots. Only in this sense can we speak of robots as "moral machines" (Wallach and Allen 2009). And there is an ethics of robots (or 'robo-ethics') (genitivus obiectivus), i.e., our reflection on how to deal with robots. There is a difference between a program and an agent (Capurro 2012 and 2012a). Michael Nagenborg writes:

"One major difference between a 'program' and an 'agent' is, that programs are designed as tools to be used by human beings, while 'agents' are designed to interact as partners with human beings. […] An AMA [artificial moral agent, RC] is an AA [artificial agent, RC] guided by norms which we as human beings consider to have a moral content. […] Agents may be guided by a set of moral norms, which the agent itself may not change, or they are capable of creating and modifying rules by themselves. […] Thus, there must be questioning about what kind of 'morality' will be fostered by AMAs, especially since now norms and values are to be embedded consciously into the 'ethical subroutines'. Will they be guided by 'universal values', or will they be guided by specific Western or African concepts?" (Nagenborg 2007, 2-3)
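To make this distinction more tangible, here is a minimal sketch in Python. It is a purely hypothetical illustration: the class names, rules and decision procedure are invented for this text and do not reproduce any actual robot software or any of the cited authors' proposals. It contrasts a 'moral program' that applies human-given rules it cannot change with an 'agent' in Nagenborg's sense that can revise its own rule set:

# Hypothetical sketch: a "moral program" follows human-given rules it cannot
# change; an "artificial moral agent" may rewrite its own rule set.
# All rule names and the decision procedure are invented for illustration.

class MoralProgram:
    """Follows a fixed, human-given list of rules in priority order."""
    def __init__(self, rules):
        self.rules = list(rules)             # e.g. ["do_not_harm", "obey_order"]

    def decide(self, options):
        # Return the first proposed action that violates none of the rules.
        for option in options:
            if not any(rule in option["violates"] for rule in self.rules):
                return option["action"]
        return "refuse"                      # no admissible option: do nothing

class ArtificialMoralAgent(MoralProgram):
    """Same decision procedure, but able to modify its own rules."""
    def revise_rules(self, new_rules):
        self.rules = list(new_rules)

if __name__ == "__main__":
    program = MoralProgram(["do_not_harm", "obey_order"])
    print(program.decide([{"action": "fetch_tool", "violates": []}]))                 # fetch_tool

    agent = ArtificialMoralAgent(["do_not_harm", "obey_order"])
    agent.revise_rules(["obey_order"])       # the agent drops a human-given rule
    print(agent.decide([{"action": "push_person", "violates": ["do_not_harm"]}]))     # push_person

The ethical point lies in the last lines: once the machine can rewrite its own rules, the question of whose morality the resulting rules embody can no longer be answered by pointing to its programmers alone, which is exactly Nagenborg's question about the 'morality' fostered by artificial moral agents.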

Naho Kitano makes an important remark concerning intercultural research in general and robo-ethics in the "West" and "Japan" in particular:

"I believe that the positive acceptance of robots in the contemporary Japan is possible to explain from the indigenous idea of how human relations work, as well as the customs and psychology of the Japanese. Such factors are intangible from inside, for it is taken for granted. In this paper, I attempt to identify these factors and provide a theoretical explanation by means of, first, the Japanese culture of Anima and, secondly, the idea of Japanese Ethics, ―"Rinri", which are, I believe, urging the Japanese robotization.[...] Before starting my argument, I should note my awareness that Japan cannot be considered a uniform and single traditional entity. At the same way, although I use the terms ―the "West" without giving firm definitions, I do not characterize the West as a uni-cultural entity. To the international readers of this paper, I would like to clarify that I use the term ―the "West" in order to set it as ―a "mirror" to reflect ―"Japan"." (Kitano 2006, 79-80)

In other words, cultures are not closed and fixed entities but are in a permanent process of transformation and interrelation. They are also not incommensurable, i.e., they can be compared. The worst we can do is to 'argue' with clichés instead of digging into the cultural past, which in the case of Japan is intimately related to Buddhism, Confucianism and Shintoism, but also to Western traditions, particularly since the Meiji Era. As Edward Said pointed out, there is the problem of 'orientalism', i.e., the inaccurate way in which 'the West' perceives Middle Eastern cultures (Said 1978). This can be extended to cultures in the "Far East". If we reverse the view and look at the "Far West" from the perspective of the "Far East", we must take care to develop an intercultural dialogue, as in the case of, for instance, the French sinologist François Jullien (Jullien 2003); otherwise what we get is a reversed cliché, namely 'occidentalism'.


WHAT ARE THE SOCIAL RISKS AND OPPORTUNITIES?
 

Roboethics and particularly intercultural roboethics are young fields of research (Wallach and Asaro 2015; Capurro 2009; Decker and Gutmann 2011; Beavers 2010; Veruggio 2007). See also the EU projects ETICA and ETHICBOTS as well as the website roboethics.org. The dialogue between the "West" and "Japan" on roboethics started some ten years ago. Naho Kitano, Makoto Nakada and Toru Nishigaki have made important contributions to the field (Capurro and Nakada 2013; Nishigaki 2012). On the relation between robots and Shintoism, as contrasted with Western notions of body and soul, Jared Bielby writes:

"A significant difference exists between traditional western and Japanese presumptions regarding the possibility of humans co-existing with robots. Traditional western culture prejudices such a scenario through engrained suspicion, their fears situated in their religious and cultural biases of life and death as grounded in ancient Greek and biblical notions of dualism of body and soul, and thus good and evil. The Japanese likewise look to a co-existence with robots through engrained religious and cultural lenses as well, but in the case of the Japanese, they do so in terms of harmony, a presumption that arises from Shinto notions of animistic life energies. These energies, known as kami in Shintoism, not only infuse all objects that exist, whether tree, rock, water, animal, human, and even human created inanimate objects such as robots and puppets, but also possess and give incarnation to ideas, such as love, passion, or fear. While the nature of the kami can be understood in terms similar to western notions of spirits or gods, kami do not constitute the same moral metaphysical qualities that allow for a dualism of good and evil. Instead, while certain kami may exhibit what in western notions could be deemed either beneficent or maleficent qualities, they are not separated by terms of good and evil, but only by ideas of purity and pollution. No matter the qualities exhibited by various kami, all kami constitute vital life energies, and death in Shintoism is not understood in terms of endings or evil in the way that it is in the west." (Bielby 2012)

Nishigaki takes a critical view of this Japanese tradition when he writes:

"Technology may have been for the Japanese people of old, as Kaplan indicates, a method of protecting their spiritual essence. However, it is undeniable that the spiritual essence has long been undermined through Japanese ignorance of philosophical inquiry into technology. The insensibility of Japanese people to modernity can also be very dangerous. If everything has its own spirit, and if all things are of equal value, why are we not allowed to reconstruct our bodies as freely as we like? In response to the problem of robots or cyborgs, the simplistic division using the words "the Japanese spirit" and "Western learning" is no longer sufficient." (Nishigaki 2012, 19-20)

Nishigaki's critical view of "all things are of equal value", as seen from the perspective of everything having "its own spirit", can also be applied to the view that everything is nothing but a bunch of digital data: we get the same ethical problem.

Concerning the issue of the autonomous subject Nishigaki writes:

"It was Kant, as is well known, who scrutinized this problem profoundly and discussed human free will, which led to the moral argument in his book "Kritik der praktischen Vernunft (The Critique of Practical Reason)".
The argument is often referred to in Western countries when talking about the autonomy of a robot. In Japan, on the other hand, the consideration of human free will and autonomy tends to be neglected. Instead, cooperation with one's surrounding people is most valued. Therefore, autonomy is easily ignored in community life. The ordinary life of the average Japanese is mostly carried out with a clock-like, mechanical rhythm.[...] This tendency ― to harmonize with one's surroundings and to behave in a manner similar to that of others around oneself ― can be dangerous in a highly developed information society dependent on the WWW. Measures should be taken against the suppression of minority opinions in popular social movements. Japanese people must deliberate more deeply about autonomous subjects." (Nishigaki 2012, 22)

The Japanese robot tradition is grounded not in questions of human autonomy, dignity and subjectivity but in the harmonious interplay of humans with other living and non-living beings, as experienced in puppetry. Both sides can learn from each other if we in the 'West' retrieve our own puppetry tradition without giving up the autonomy/heteronomy issue, and vice versa in the case of "Japan". In other words, if robots go online in the Internet of Things, we get reversed problems in Japan and in Western countries. The ethical issue of robot autonomy is no longer the same. Also, the modern Western and particularly Kantian view of ourselves as autonomous subjects changes when we begin to live our lives online. While Japanese people must deliberate about being autonomous subjects in the information society, Western people must learn what it means to be heteronomous beings not only in the WWW but also in physical life with regard to the Internet of Things. In other words, the online robot's dependency on its creator might be turned on its head, increasing our dependency on robots. We have to explore the interplay of autonomy and heteronomy, taking into consideration that these terms do not mean the same for robots as for humans, given, for instance, that humans are contingent free beings. There is an interaction between robots themselves as well as between humans with and without the mediation of robots. In the Internet of Things, stand-alone autonomous robots belong to the past (Haarkötter and Weil 2015; Balkam 2015). An online robot is not an oxymoron; it is obvious. Online robots are the next generation. "Robotics and Internet of Things Playing Together" is the title of a plenary session of the conference RoboBusiness Europe, Milan, 29-30 April 2015 (RoboBusiness Europe 2015).

Democratic societies ruled and controlled by parliaments and mass media might be shaped by bureaucratic robotic rules, increasing the heteronomous dependency of humans on robots beyond present surveillance systems as well as beyond this dependency in cyberspace (Capurro 1993; Lessig 1999). Who are we who live in this bureaucratic digital capitalism? (Weber 1973, 379ff.)

Robots are already taking over many tasks in the field of producing material things (Greek poiesis) but also in the field of human action (Greek praxis) (Arendt 1958). A dystopian scenario may be a kind of robot divide between people who can afford to buy robots and those who cannot. Robots might soon become job killers leading to mass unemployment, similar to what happened during the Industrial Revolution in the 19th century, but now concerning not only physical but also intellectual work (Bernau 2015). From the perspective of Critical Political Economy, the question to be asked is whether robots will lead to a transformation of "digital labor" within an "exploitation economy" into "playful digital work" (Fuchs and Sevignani 2013).

A less dystopian scenario may be that robots will soon be so cheap that everybody who wants them will be able to buy them, similar to computers and smartphones, which are prototypes of the kind of robot and robot-human interaction implied above. The "second machine age" (Brynjolfsson and McAfee 2014) could lead to prosperity if we are ready to invest in education, empowering humans for creative tasks that they can do better than robots. We must also develop societal moral and legal rules for the robot age that should be internationally accepted. Changing a tool means changing a form of life, which is an ethical issue (Spinosa, Flores, Dreyfus 1997). As in the case of other machines, robots will break down. What are the consequences? (Flores Morador 2009) For a critical epistemological view on robots see Negrotti 2012; Capurro 1995 and 2009. The Henn na Hotel in Nagasaki is the first hotel whose main staff consists of robots (Henn na Hotel "Evolve Hotel" 2015).

Globally operating software companies may produce standardized software for robots. Hackers may break into a robot's software and use it for purposes other than those intended by its owner. This may have serious consequences, and not only for the owner. Issues of safety and security belong to the core of robo-ethics. There is an ongoing ethical and legal debate about autonomous cars being taken over by, for instance, cyber-carjackers (Lin 2013). Some fifty years ago we started designing cities adapted to cars. Today we have car-free areas in the city. In the near future we will have to think about robot-free zones.


CONCLUSION

It is crucial to be proactive concerning which aspects of human life should not be robot-oriented. Producers and users of robots should take care that robots are appropriately recycled, as is or should be the case with other electronic technologies (e-waste). This is a key issue of a future ecological ethics in the era of online robots (Capurro 2010). The awareness and ethical analysis of cultural traditions with regard to robots is a relevant issue not only for users but also for inventors and producers of robots. One point where policy, academia, and industry meet is regulation (Nagenborg et al. 2007). In industrial history, regulation in the context of cars is a good example. A hundred years ago nobody thought about car regulation.

Who are we in the robot era? According to the German philosopher Peter Sloterdijk, morality and law are "symbolic immune systems" necessary for an individual and a society to survive (Sloterdijk 2009). As in the case of biological immune systems, it is necessary to think critically about them when changes in the living and cultural environment take place. This is the reason why we need intercultural robo-ethics. It should problematize the current "symbolic immune systems" with regard to the changes that take place in societies due to the development and use of robots. Robots belong to the core of today's "anthropotechnologies" (Sloterdijk), i.e., technologies that we use not only to perform tasks but also to change ourselves. Human-robot interactions are embedded in literary and artistic traditions. Robots are masks of human desire (Brun 1981, 1992). Engineers, scientists, politicians, users and, last but not least, writers project their desires, nightmares, and dreams onto them, according to the society in which they live, its history and culture. This is the reason why research in intercultural robo-ethics is an important task for society, not only in Japan.


ACKNOWLEDGEMENTS

The author thanks Prof. Makoto Nakada (University of Tsukuba, Japan), Prof. Martin Pohl (University of Tsukuba, Japan), Jared Bielby (University of Alberta, Canada), Dr. Felix Weil (CEO, Quibiq, Stuttgart, Germany), Joseph E. Brenner (International Society for Information Studies, Vienna, Austria), Prof. Juliet Lodge (University of Leeds, UK), and Prof. Francesca Vidal (University of Koblenz-Landau, Germany) for critical remarks.


REFERENCES

Altmann, Jürgen and Vidal, Francesca (Guest Editors): Cyber Warfare. International Review of Information Ethics (IRIE), 20, 2013. 

Arendt, Hannah: The Human Condition. Chicago 1998 (1st. edition 1958).

Aristotle: The Politics. Ed. W.L. Newman. Oxford University Press 1950.

Balkam, Stephen: What will happen when the internet of things becomes artificially intelligent? In: The Guardian, February 20, 2015.

Beavers, Anthony  (Guest Editor): Special Issue: Robot Ethics and Human Ethics. In: Ethics and Information Technology, Vol. 12, Nr.3, Sept. 2010.

Bernau, Varinia: Die Roboter kommen. Maschinen übernehmen mehr Jobs. Aber steigern sie damit auch den Wohlstand aller? In: Süddeutsche Zeitung, Nr. 43, February 21-23, 2015, 25. 

Bielby, Jared: Kami: Shinto Influences on Japanese Video Game Culture, 2012.

Brun, Jean: Les Masques du désir. Paris 1981.

Brun, Jean: Le rêve et la machine. Technique et Existence, Paris 1992.

Brynjolfsson, Erik and McAfee, Andrew: The Second Machine Age, New York 2014.

Capurro, Rafael: The Quest for Roboethics. A Survey. Contribution to the Workshop organized by Cybernics, University of Tsukuba (Japan) September 30, 2009. In: Cybernics Technical Reports. Special Issue on Roboethics. University of Tsukuba 2011a, 39-59 (CYB-2011-001 -CYB-2011-008 March 2011). Updated version 2015.

Capurro, Rafael: Toward a Comparative Theory of Agents. In: AI & Society, Vol. 27, 4 (2012), 479-488.
 
Capurro, Rafael: Wer ist der Mensch? Überlegungen zu einer vergleichenden Theorie der Agenten. In: Hans-Arthur Marsiske (Ed.): Kriegsmaschinen - Roboter im Militäreinsatz. Hannover 2012a, 231-238.

Capurro, Rafael: Netz.Ökologien. Zur Ethik des Abfalls im Zeitalter digitaler Medialisierung. Saarbrücken 2010.

Capurro, Rafael: Ethical Aspects of Biometrics. EU Project Tabula Rasa, 2011.

Capurro, Rafael: Ethics and Robotics. In: Rafael Capurro and Michael Nagenborg: Ethics and Robotics. Heidelberg 2009, 117-123.

Capurro, Rafael: On Artificiality. IMES (Istituto Metodologico Economico Statistico) Laboratory for the Culture of the Artificial, Università di Urbino, Dir. Massimo Negrotti (IMES-LCA WP-15) Urbino 1995.

Capurro, Rafael: Zur Frage der professionellen Ethik. In: Peter Schefe, Heiner Hastedt, Yvonne Dittrich and Geert Keil (Eds.): Informatik und Philosophie, Mannheim 1993, 121-140.

Capurro, Rafael: La chose à penser. Mannheim 1988. 

Capurro, Rafael; Hausmanninger, Thomas; Weber, Karsten; Weil, Felix (Editors): Ethics in Robotics. In: International Review of Information Ethics, Vol. 6, 2006.

Capurro, Rafael; Eldred, Michael; Nagel, Daniel: Digital Whoness: Identity, Privacy and Freedom in the Cyberworld. Frankfurt 2013.

Capurro, Rafael and Marsiske, Hans-Arthur: Der Moment des Triumphs. E-Mail-Dialog über ein Bild. In: Hans-Arthur Marsiske (Ed.): Kriegsmaschinen - Roboter im Militäreinsatz. Hannover 2012, 11-30.

Capurro, Rafael and Nagenborg, Michael (Eds.): Ethics and Robotics. Heidelberg 2009.

Capurro, Rafael and Nakada, Makoto: An Intercultural Dialogue on Roboethics. In: ibid. (Eds.): The Quest for Information Ethics and Roboethics in East and West. Research Report on trends in information ethics and roboethics in Japan and the West. ReGIS and ICIE, March 31, 2013, pp. 13-22 (ISSN 2187-6061).

Cerqui, Daniela; Weber, Jutta; Weber, Karsten (Guest Editors): Ethics in Robotics. International Review of Information Ethics, Vol. 6, 2006.

Decker, Michael and Gutmann, Matthias (eds.): Robo- and Information Ethics. Zürich, Berlin 2011.

ETHICBOTS: Emerging Technoethics of Human Interaction with Communication, Bionics and Robotic Systems (FP 6, 2005-2007):
R. Capurro, M. Nagenborg, J. Weber, Chr. Pingel: Ethical Regulations on Robotics in Europe. In: AI & Society, 22 (2008), 349-366.
Deliverable 5 Technoethical Case Studies in Robotics, Bionics, and Related AI Agent Technologies (R. Capurro, G. Tamburrini, J. Weber, eds.).

ETICA: Ethical Issues of Emerging ICT Applications (FP 7, 2009-2011)
Deliverable 2.2 Normative Issues (R. Heersmink, J. van den Hoven, J. Timmermans)
Deliverable 3.2 Evaluation Report (M. Rader et al.)
Deliverable 3.2.2 Ethical Evaluation (M. Nagenborg, R. Capurro)

Flores Morador, Fernando: Broken Technologies. The Humanist as Engineer. Lund University 2009.

Foucault, Michel: Discourse and Truth: The Problematization of Parrhesia, 1999. http://foucault.info/documents/parrhesia

Fuchs, Christian and Sevignani, Sebastian: What is Digital Labour? What is Digital Work? What’s their Difference? And why do these Questions Matter for Understanding Social Media? In: tripleC 11 (2), 2013, 237-293.

Gelbin, Cathy S.: The Golem Returns: From German Romantic Literature to Global Jewish Culture, 1808-2008. The University of Michigan Press 2001.

GLOBALE 2015: Allah's Automata. Zentrum für Kunst und Medientechnologie (ZKM, Karlsruhe)
http://zkm.de/en/event/2015/10/globale-allahs-automata

Haarkötter, Hektor and Weil, Felix (Guest Editors): Ethics for the Internet of Things. In: International Review of Information Ethics (IRIE), Vol. 22, Feb. 2015. http://www.i-r-i-e.net/current_issue.htm

Heidegger, Martin: Sein und Zeit. Tübingen 1976. (Engl. transl. J. Macquarrie & E. Robinson: Being and Time, Basil Blackwell 1987).

Henn na Hotel ("Evolve Hotel") (2015)
http://www.h-n-h.jp/en/

Jullien, François. La valeur allusive. Paris 2003 (1st. ed. 1985).

Kaplan, Frédéric: Les machines apprivoisées. Comprendre les robots de loisir. Paris 2005.

Kitano, Naho: 'Rinri'. An Incitement towards the Existence of Robots in the Japanese Society. In: International Review of Information Ethics, Vol. 6, 2006, 78-83. http://www.i-r-i-e.net/issue6.htm

Krebs, Stefan: On the Anticipation of Ethical Conflicts Between Humans and Robots in Japanese Mangas. In: International Review of Information Ethics, Vol. 6, 2006, 63-68.

Lessig, Lawrence: Code and other laws of cyberspace. New York 1999.

Lin, Patrick: The Ethics of Autonomous Cars. In: The Atlantic, Oct. 8, 2013. 

Lin, Patrick; Abney, Keith; Bekey, George A.(eds.): Robot Ethics. The Ethical and Social Implications of Robotics. The MIT Press 2012.

Nagenborg, Michael: Artificial moral agents: an intercultural perspective. In: International Review of Information Ethics, Vol.7, 2007. 

Nagenborg, Michael; Capurro, Rafael; Weber, Jutta; Pingel, Christopher: Ethical regulations on robotics in Europe. In: AI & Society, 2008, Vol. 22, n. 2, 349-366.

Negrotti, Massimo: The Reality of the Artificial. Nature, Technology and Naturoids. Heidelberg and Berlin 2012.

Nishigaki, Toru: Is a Society of Cohabitation with Robots Possible? In: Toru Nishigaki and Tadashi Takenouchi (Eds.): Information Ethics. The Future of Humanities. Nagoya City 2012, 1-25. 

RoboBusiness Europe Conference: Plenary Session: Robotics and Internet of Things Playing Together. Milan, 29-30 April, 2015.

Said, Edward W.: Orientalism. New York 1978.

Sloterdijk, Peter: Du musst dein Leben ändern. Über Anthropotechnik. Frankfurt a.M. 2009.

Spinosa, Charles; Flores, Fernando; Dreyfus, Hubert L.: Disclosing New Worlds. Entrepreneurship, Democratic Action, and the Cultivation of Solidarity. MIT Press 1997.

Veruggio, Gianmarco: EURON. Roboethics Roadmap, 2007. 

Wallach, Wendell, Asaro, Peter Mario (eds.): Machine Ethics and Robot Ethics. (forthcoming).

Wallach, Wendell and Allen, Colin: Moral Machines: Teaching Robots Right from Wrong. Oxford University Press 2009.

Wagner, Cosima: Robotopia Nipponica – Recherchen zur Akzeptanz von Robotern in Japan. Marburg: Tectum Verlag 2013, English abstract.

Weber, Jutta: "Social" Robots & "Emotional" Software Agents: Gendering Processes and De-Gendering Strategies for "Technologies in the Making". In: Isabel Zorn, Susanne Maass, Carola Schirmer, Els Rommes, Heidi Schelhowe (eds.): Gender Designs IT. Construction and Deconstruction of Information Society Technology. Wiesbaden 2007, 53-63.

Weber, Max: Soziologie, Universalgeschichtliche Analysen, Politik. Stuttgart 1973.

Wikipedia: Bunraku 

Wikipedia: I, Robot 

Wikipedia: Isaac Asimov

Wikipedia: Golem 

Wikipedia: Marionette 

Wikipedia: Puppetry

Wikipedia: Robot 

Wikipedia: Three Laws of Robotics

Wynsberghe, Aimee van:  Homepage, University of Twente, The Netherlands, 2015.

Wynsberghe, Aimee van: Healthcare Robots. Ethics, Design and Implementation.  London and New York: Routledge 2016.


Last update: June 26, 2017




 
    

Copyright © 2015 by Rafael Capurro, all rights reserved. This text may be used and shared in accordance with the fair-use provisions of U.S. and international copyright law, and it may be archived and redistributed in electronic form, provided that the author is notified and no fee is charged for access. Archiving, redistribution, or republication of this text on other terms, in any medium, requires the consent of the author.



 