Socially assistive robots, aging in place, and problems of privacy and data governance

By Teresa Goff

July 16, 2024


Once the stuff of science fiction, robots designed to resemble and interact with humans have progressed and expanded across various sectors from entertainment to education, hospitality, and healthcare.

In their article “Bringing older people’s perspectives on consumer socially assistive robots into debates about the future of privacy protection and AI governance,” in AI & Society, Andrea Slane & Isabel Pedersen (2024) foreground older people’s perspectives on privacy and data governance related to socially assistive robots. Specifically, Slane & Pedersen (2024) claim privacy is central to ethical discourse around the use of socially assistive robots (SARs) not only in aged care, but also when older people are considering a SAR as a new consumer product. Drawing on two qualitative studies, the authors place the views of older people in the context of digital technologies, highlighting their experiences with and anticipations for future AI systems. What is striking about this study is the engagement of older people in broader debates on privacy protection and AI governance, given their experiences as technology consumers and users.

By including older adults’ views on near-future technologies like SARs, Slane & Pedersen (2024) aim to disrupt problematic power relationships at both the individual user level and at collective levels: that is, older people as individuals, as members of a diverse demographic, and as consumers more generally. This collaborative approach helps to democratize the technology development process and fosters a more user-centered approach to innovation. Existing criticisms of consumer health monitoring technologies make similar points. Slane & Pedersen state: “Consumer health monitoring technologies have been criticized for their shift of power away not only from users to health professionals, but also to far less accountable software and platform service providers.” This shifting of power away from users can lead to a loss of control and protection from data misuse, not only in relation to one’s own health data, but also in relation to what is done with aggregated data at a large-scale, community level.

Funding for the first study included in the article came from the Office of the Privacy Commissioner of Canada, while the Social Sciences and Humanities Research Council (SSHRC) funded the second study. Study participants emphasized trust in companies handling personal data and proposed integrating SARs with trusted businesses and financial institutions to mitigate risks for vulnerable groups. They advocated for programming SARs to detect scams and stressed the need for strong legal frameworks and AI governance to protect user interests, as well as initial legal safeguards against potential corporate data misuse, amid deep mistrust of tech firms’ commitments to users. As Slane & Pedersen (2024) state: “Imposition of legal limits were sometimes called upon by participants to help get beyond their common perception that private entities would always pose a risk to users and not have their best interests in mind” (p. 11).

Trust is crucial in both care ethics and privacy law, and so will need to be foregrounded if SARs are to be adopted, whether for aged care or as consumer home assistant devices. But Slane and Pedersen also note that consumer technology use is rife with simultaneously positive and negative experiences, leading to endemic feelings of ambivalence. While ambivalence can be a barrier to willingness to adopt a new technology, it can also be seen positively as creating an opening for change. By emphasizing the need for inclusive, participatory SARs design, privacy protection, and data governance policy development, the complexities of older adults’ concerns as well as their optimism can be taken into account. As Slane & Pedersen (2024) conclude, heeding calls for flexible governance models, iterative regulation, and democratic engagement will help safeguard a future that respects privacy and promotes beneficial data use.

Part of the paper’s method included use of The Fabric of Digital Life, a public research repository that helps researchers track technology development through curated collections like Aging, Culture, And Technology (2018-2023), which explores robotic integration with Artificial Intelligence (AI). Researchers use the database to assign keywords as metadata, which helps them identify patterns and correlations, leading to insights about how AI systems perform, are marketed, or are otherwise framed for use in different contexts and with different users. Analyzing social and technical aspects can reveal socio-technical trade-offs, such as those between system performance and energy consumption or between user convenience and data privacy. Assessing how these trade-offs affect users and society can foster informed decision-making on a variety of levels, and so may inform design, policy, and usage decisions that advance the beneficial aims of technologies intended for use by older people.

In conclusion, the intersection of humanoid robots and Artificial Intelligence (AI) marks a profound evolution from speculative fiction to practical application. As we continue to navigate the ethical complexities and societal implications of these advancements, it is crucial to integrate diverse perspectives, such as those of older adults, into the ongoing discourse on privacy, governance, and the responsible development of AI-driven technologies. Slane & Pedersen (2024) illustrate that fostering inclusive dialogue and democratic development of regulatory frameworks is a requirement for a future in which innovation aligns with ethical values and enhances the well-being of all individuals and communities, especially those being targeted as users and consumers of advanced technologies.

This research contributes to the Digital Life Institute AI and Advocacy Challenge.