Background
Robotics may provide a technological aid in meeting the increasing demand on health and social care [1], caused in part by increasing life expectancy [1–3], as human function deteriorates with age [4,5]. Companion robots, such as robot pets designed to be congruent with animal aesthetics and behaviours, have particular potential in aged care [6,7]. The most thoroughly researched example is Paro, the robot seal [8]. Research has suggested numerous benefits of interacting with Paro, including reduced agitation and depression in dementia [9,10], a more adaptive stress response [11], reduced care provider burden [11], and significantly improved affect and communication between dementia patients and day care staff [12]. Furthermore, Paro may reduce psychoactive and analgesic medication use [13], and even decrease blood pressure [14]. Alternatives to Paro include, amongst others, Miro, Pleo, and the Joy for All devices, some of which have been used in previous research [15]. Although research with these alternatives is limited (owing to an apparent selection bias towards Paro and a shortage of comparison studies [8,16]), we previously found evidence that more affordable, less sophisticated devices may offer acceptable alternatives [17], with potential to reproduce the cited benefits of Paro [18].
That said, these reported benefits must be considered in the context of ethical concerns around robot implementation with older people [19]. In the following, we review some of the relevant literature on the most commonly discussed concerns, including infantilisation, deception, reduced human contact and intrusions on privacy [19–21]. Sparrow and Sparrow [22] assessed the reported capacity of robots to meet older people’s needs, with particular attention to social and ethical implications. The authors claim to provide “a much-needed dose of reality” [p:143], suggesting that robots are unable to meet social and emotional needs in almost all aspects of care. They raise the potential for harm, with technological restrictions and physical dangers (e.g., trip hazards) removing hopes of robots aiding with personal care, mobility or daily tasks. The potential for harm raises the further issue of accountability, should harm result from robot implementation [23]. However, the most ethically controversial proposed role for robots appears to be that of companions for older people, a concept sometimes described as “positively bizarre” [p:308] [21], unethical, and “akin to deception” [p:148] [22].
Regarding deception, some authors feel that companion robot benefits rely on delusions as to the real nature of the interaction, described by Sparrow [21] as “sentimentality of a morally deplorable sort” [p:306], with this deceit making robot use misguided and unethical. Sparrow [21] argued that robot behaviour is merely imitation: robots do not possess human frailties, and thus cannot ‘understand’ human experience and mortality, rendering them incapable of appropriate, genuine emotional response [22]. The extent to which a person feels cared for therefore depends on delusions about robot capabilities. In contrast, Wachsmuth [24] discussed the necessity of ‘true’ care for older people, suggesting that the illusion of responding to the care recipient’s feelings and suffering would suffice, even though a robot, lacking the neurophysiological basis for consciousness and qualitative experience, cannot be a ‘true’ caregiver. Sparrow and Sparrow [22] would likely disagree, reporting that “the desire to place [robots] in such roles is itself morally reprehensible” [p:154], as placing robots in roles requiring care, compassion and affection expresses a “gross lack of respect for older persons” [p:156].
Sparrow [21] further suggested that if an older person treats a robot pet as living, thus engaging in the delusion, we have done them a disservice. This appears likely to occur: Robinson et al. [25] noted that participants interacted with Paro as a live pet, with some perceiving Paro as having agency despite awareness that the device was robotic. The issue of deceit, particularly concerning the distinction between robot and live pet, becomes even more problematic in the presence of dementia [26]. Deception is therefore a common ethical concern specific to companion robots that can also undermine acceptability among older people’s relatives. Sharkey [19] suggested that, even if a vulnerable older person enjoys a robot pet, and perhaps does not distinguish between living and not, relatives may feel the person is suffering humiliation and loss of dignity through deception (although it is also possible this tension would ease upon witnessing potential quality of life benefits [27]).
A further commonly discussed ethical issue is reduced human contact. The substantial economic pressures within aged care may result in the substitution of human staff with robotic alternatives, which is problematic because human social contact provides significant wellbeing benefits, autonomy and communication opportunities [22]. However, given the regrettably low standard of care occasionally provided by human carers, possibly resulting from high demands including large workloads and low pay [22], there is well-documented and increasing concern that older people can suffer abuse and mistreatment [19]. Dignified treatment by human carers is therefore not a given. In contrast, robots cannot get angry, abuse an older person, or become tired and stressed. A small reduction in human contact may therefore be an acceptable compromise for improved quality of care and interaction, if robotics could ease the strain on human care providers. Support comes from research suggesting reduced carer stress following Paro implementation [11,28]. Furthermore, robots may mediate social interaction [25], providing a conversation topic between staff, family and older people, and more opportunities to engage socially [19]. Sharkey [19] suggests, however, that although robots avoid the negatives of human behaviour, they also lack its true positives: compassion, empathy and understanding. Sparrow and Sparrow [22] argue that, given the crucial role of emotional labour and meaningful conversation for wellbeing, any reduction in human contact would be indefensible.
A further ethical concern is infantilisation, an issue also raised for doll therapy, which some see as congruent with the idea of a ‘second childhood’, being dispiriting and deficit-based [26,29]. Infantilisation may damage acceptability for family members, as supported by Robinson et al. [30], who reported that a care resident’s son conveyed that his father was not the type to cuddle a soft toy. Another concern is equality of access, as the current cost of companion robots may be prohibitive for people of lower socioeconomic status, who would be denied a potentially therapeutic tool [20,31].
Whilst the literature is rich with commentary on potential ethical issues, we have been researching real-world robot pet implementation with older people in care homes and, to date, have seen limited evidence of ethical concerns amongst older people themselves. We have, however, noted occasions where family members have reported such concerns. Family members are key stakeholders in the care of older relatives, and the views of relevant stakeholders are fundamental for real-world use [32]. Presenting these views is the core contribution we seek to make with this paper. Successful real-world use of companion robots depends on skilled and careful deployment by relatives and carers [19]; negative ethical perceptions would therefore likely impair implementation, forming a barrier to adoption [33].
Some previous research has assessed the perceptions of older people themselves, including Wu et al. [34], whose results suggested that ethical and societal issues, namely privacy and reduced social contact, presented a potential barrier to robot use. Pino et al. [32] also conducted a survey and focus group with 25 older people and informal carers, who discussed stigmatisation, privacy issues, dignity, infantilisation, replacement of human carers, and prohibitively high cost. Although this exploratory study provided initial insight, only seven informal carers were surveyed, and more research specific to family member perceptions is required. A larger sample would additionally allow comparison between the highlighted concerns to identify the most significant potential barriers. Furthermore, the study involved demonstration of only one robot (RobuLAB 10), with PowerPoint presentations of other available socially assistive robots, limiting participants’ ability to assess robot capabilities [35]. In contrast, we surveyed opinions based on real-world interaction with companion robots, providing informed perceptions with increased validity.
The views of health and social care professionals have also been reported. For example, questionnaire results from 2365 trainee care professionals suggested that participants felt companion robots were more beneficial than monitoring or assistive robots, and provided low ratings for maleficence [36]. Nonetheless, research directly surveying ethical perceptions among older people’s family members appears limited. Although much of the literature debates ethics philosophically, providing a strong overview of potential issues [37], fewer studies specifically assess stakeholder perceptions. Stahl and Coeckelbergh [37] argued that, beyond philosophical speculation, we need dialogue and experimentation closer to the context of use. The authors suggest that academic reflection on ethics is divorced from the context of practice, with the literature mainly addressing what the robot ethics community “think are important ethical issues” [p:154] whilst stakeholder voices remain unheard.
Here, we therefore explore the perceptions and prevalence of ethical concerns among younger adults as family members of potential end-users of companion robots, and compare the importance of various ethical concerns for this significant stakeholder category, contributing to an understanding of robot ethics for real-world implementation and of potential barriers to successful use. This study addresses a timely topic, with real-world and research use of social robot pets increasing, and their use in dementia care being explored, both in the UK and elsewhere [6–18].
Results
Sixty-seven people interacted with the robots and then agreed to complete a questionnaire. Their average age was 28 years (range 18–65, SD 10.99). Most (53/67, 79%) reported having older adult relatives, and 11/67 (16%) had a relative with diagnosed dementia.
Section A of the survey first gauged participants’ device preferences, likes and dislikes, available in Supplementary File 1. Notably, only one reported dislike referred to a potential ethical concern (reducing human contact).
Most participants would purchase a device for an older relative (Table 1). Many suggested more than one device, and the most popular option was the Joy for All cat. Notably, of the 10 participants who reported they would purchase a Paro, four added a comment such as “if cheaper or more affordable.” Price was also a common reason for participants reporting that they would not buy their relative a device, or a deciding factor in selecting a device other than Paro. This indicates that financial cost is a key deciding factor, with no ethical concerns reported as a reason for not purchasing a device.
Table 1
Responses to purchasing a device for an older relative (Q3)
Response | n (%) | Details
Yes | 39 (58) | Devices chosen: Paro 10, Pleo 4, Cat 14, Dog 10
No | 21 (31) | Example reasons: “Too expensive”; “They can decide themselves”; “I don’t think they’d like it”; “Not into animals”; “Not yet”; “They have real animals”
None/Unsure | 7 (10) |
Table 2 demonstrates that the majority of participants felt positively when surveyed on their general feelings towards companion robots for older people. Among participants with a mixed response, negative feelings were often weighed against potential benefits. A very small minority provided a completely negative response. Further example evidence can be found in Supplementary File 1.
Table 2
Responses to open question on general feelings towards companion robots for older people (Q4)
Response | n (%) | Example quotations
Positive | 44 (66) | “it would be very therapeutic for them”; “I think it would be very successful in providing comfort to my relative with dementia, particularly the dog, for nostalgic purposes”
Mixed | 10 (15) | “I struggle with the concept of replacing care with robotics but in neurodegenerative diseases such as AZ dementia it can be harder on family members sometimes and if it stimulates/soothes them then maybe”; “A good idea, the problem would be making the robot responsive enough without it being too expensive”
Negative | 5 (7) | “I would have thought it was a bit ridiculous”; “I would be slightly worried of infantilising the person, the person may get upset or see it as a trick”
None | 8 (12) |
Most participants (40/67) reported having no ethical concerns (Table 3). A further five left the box empty, perhaps also indicating a lack of concerns to report, or alternatively reflecting a lack of understanding. This suggests that the prevalence of instinctive ethical concerns is low. The concerns raised by 20 of the 67 participants are summarised in Table 3, demonstrating that deception and reduced human contact were the most prevalent concerns noted upon unprompted questioning about ethical issues. While prevalence was low, the examples provide some support for the ethical issues reported in previous literature. However, the concerns around battery life, malfunctioning and robustness relate to the performance of the robot rather than to ethics. Some further examples are available in Supplementary File 1.
Table 3
Responses to open question on ethical concerns of companion robot use with older people (Q5)
Response | n (%) | Concern raised | n | Example quotation
Concern | 20 (30) | Batteries | 2 | “Emotional distress if the batteries ran out”
 | | Malfunction | 1 | “What happens if they malfunction?”
 | | Human Contact | 7 | “Might encourage people to be distant from the elderly”
 | | Robustness | 1 | “Toughness, can they withstand a fall?”
 | | Deception | 4 | “They could become confused as to whether the robot was real or not”
 | | Privacy | 1 | “Should not be connected to net (privacy)”
 | | Danger | 2 | “Tripping/falling”
 | | Dignity | 2 | “They may try to feed or walk them, potential embarrassment”
 | | Infantilisation | 1 | “May feel patronised, belittled with a fluffy toy”
No Concern | 40 (60) | | | “No”; “None”; “No, it seems very safe”
Unsure | 2 (3) | | | “I don’t know”; “Not sure”
No Response | 5 (7) | | |
Table 4 demonstrates that participants felt the most concerning issue was equality of access to devices across socioeconomic status. This concern received the highest mean score, and also the highest median and mode, meaning it was most commonly rated as of greater concern. The second most concerning issue was robots being used for carer convenience. The least concern was seen for reduced human contact, privacy issues, and potential for injury or harm, which all received means, medians and modes below the scale midpoint of 4. The infantilising and deception mean scores sit below the midpoint, whilst their medians and modes sit at it, demonstrating some concern.
Table 4
Potential ethical issues scored on Likert scales by level of concern (1 = not at all a concern, 7 = very much a concern)
Issue | Median | Mode | Mean | SD
Socioeconomic Status – Equality of Access | 5 | 6 | 4.72 | 1.75
Robots for Carer Convenience | 4 | 5 | 3.98 | 1.58
Infantilising | 4 | 4 | 3.45 | 1.70
Deception | 4 | 4 | 3.44 | 1.61
Reduced Human Contact | 3 | 2 | 3.06 | 1.68
Injury or Harm | 1 | 2 | 2.38 | 1.67
Privacy | 2 | 1 | 2.17 | 1.54
Finally, we acknowledge a possible concern with our participant sample. Despite the obvious participant interest in robotics, given attendance at this exhibition, we recognise that 14 of the 67 participants did not report having an older relative. We therefore analysed (using crosstabs and Fisher exact tests) our three key reported outcomes for statistical differences between participants without an older relative, those with an older relative, and those with a relative with dementia. We found no difference between the three groups on any of the three outcomes: decision to buy/not buy (Table 1) (.320, n = 60, p = .925), general perceptions (Table 2) (1.390, n = 59, p = .618), and ethical concerns (Table 3) (5.897, n = 62, p = .051). This suggests the default views of potential future stakeholders are congruent with those of actual stakeholders.
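The group comparisons above rest on exact tests for contingency tables, which are preferable to chi-square approximations when expected cell counts are small, as here. As a minimal sketch of the underlying computation (using hypothetical counts, since the raw per-group frequencies are not reproduced in this section, and restricted to the 2 × 2 case for brevity), a two-sided Fisher exact p-value can be obtained directly from the hypergeometric distribution:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact test on the 2x2 contingency table
    [[a, b], [c, d]], computed from the hypergeometric distribution."""
    r1, r2 = a + b, c + d          # row totals
    c1, n = a + c, a + b + c + d   # first column total, grand total

    def prob(k):
        # Hypergeometric probability of a table with cell (1,1) = k,
        # holding all margins fixed
        return comb(r1, k) * comb(r2, c1 - k) / comb(n, c1)

    p_obs = prob(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)  # feasible values of cell (1,1)
    # Two-sided p-value: sum probabilities of all tables no more
    # probable than the observed one (with a small float tolerance)
    return sum(prob(k) for k in range(lo, hi + 1)
               if prob(k) <= p_obs * (1 + 1e-9))

# Hypothetical counts: concern reported (yes/no) by relative status
p_value = fisher_exact_2x2(8, 12, 5, 35)
```

For the r × c tables actually analysed here (three groups, several outcome categories), statistical packages extend the same idea by enumerating all tables with the observed margins; the 2 × 2 sketch above shows the principle.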
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.