Chapter 15. Empathizing with Robots: Animistic and Performative Methods to Anticipate a Robot's Impact

Bibliographic Details
Main Author: Dörrenbächer, Judith (auth)
Other Authors: Hassenzahl, Marc (auth)
Format: Electronic Book Chapter
Language: English
Published: Taylor & Francis, 2023
Online Access: DOAB: download the publication
DOAB: description of the publication
Description
Summary: Typically, social robots are supposed to empathize with humans, understand human emotions, and anticipate human needs. With this chapter, the authors turn the tables: What can humans learn through empathizing with technology? How might the design of robots change if developers adopted the perspective of a robot, walking in its shoes to perceive and understand the world from its point of view through sensors and actuators? Is the technomorphization of human bodies a mind-expanding complement to the anthropomorphization of technology? The authors present a range of innovative methods, all based on empathy, for use by robot designers. For example, Thing Ethnography works by attaching cameras to access the perspective of an object. Object Personas is about imagining the personality of an object. When applying Enacting Utopia, designers perform like an object in a positive future. With Techno-Mimesis, they are able to perceive a use scenario as an object does. The authors clarify that such kinds of empathy do not happen out of naïveté (Old Animism). When applied consciously, they generate knowledge about, and reflexive distance from, technological objects such as robots.
Physical Description: 1 electronic resource (16 p.)
ISBN: 9781003287445-15
9781032262673
9781032246482
Access: Open Access