This post is by Henry Taylor, a philosopher at the University of Birmingham. He is interested in the philosophy of mind. His main areas of research are attention, consciousness, peripheral vision and robotics.
You wake up and listen for the familiar sound of your household robot making you your morning porridge. On the way to work, you pop into a supermarket, and a robot helps you to find the products you need. You’re a mental health professional, and you spend the day working alongside the robots that support people with post-traumatic stress disorder. On your way home, you call into the care home where your parents are being looked after by both humans and robots.
The use of robots in all of the above contexts is currently being investigated. In healthcare, for example, researchers are exploring how robots can support humans with autism, cancer, dementia, diabetes, social anxiety, and more.
These applications raise questions that straddle robotics and philosophy. One of them concerns how robots should respond to differing cultural norms and expectations. For example, different cultures seem to have different norms about personal space, which matters for deciding how close to a human a robot should stand. Different cultures also have different expectations about facial expressions, hand gestures, physical greetings, and so on. How should we take these on board when designing robots?
Cultural robotics is the study of how robots can fit into this world of varying (and constantly shifting) cultural expectations and practices. The most fundamental question in cultural robotics is: what do we mean by ‘culture’? One popular approach in robotics is to equate culture with nationality. On this approach, ‘culture’ just means things like British, Canadian, Indian, Iranian, Italian, Japanese, Nigerian, etc. However, this approach has raised concerns in the robotics community. Equating culture with nationality runs the risk of propagating an over-simplistic approach, where whole cultures are reduced to a few stereotyped patterns of behaviour associated with particular countries. It also marginalises those who do not fit into the dominant patterns of behaviour in a particular country, such as refugees, immigrants, religious minorities, or members of subcultures.
In our recent work, my co-author, Masoumeh Mansouri, and I have addressed this issue by arguing for a more nuanced definition of culture in robotics. Rather than looking for ‘the correct’ definition of culture, we argue for a conceptually fragmented approach. This involves accepting that there are many different ways of approaching culture in sociology and the humanities, and recognising that different approaches to culture might be appropriate for different areas of robotics. For example, a robot designed only to give directions to humans in a shopping centre may only require norms of politeness and helpfulness. Conversely, a robot designed for long-term use by the same group of people in a factory or hospital may need to grow and change its behaviour over time, in response to changes in the social dynamics of that environment.
It is inevitable that robots will come to occupy a more prominent role in our everyday lives. This raises fundamental questions about how these robots can behave appropriately, and also which social interactions should be kept human.