Care robots are being pitched as a solution to the loneliness epidemic among elderly people. How bad is the problem? And is it just found among senior citizens?
Coming out of the pandemic, the whole world was thinking about loneliness and isolation. I was just reflecting on that – the experience of being lonely, not having meaningful connection in your life, and the impacts that has on an individual.
So says Grace Brown, founder and CEO of Melbourne, Australia, start-up Andromeda Robotics, whose Abi robot is an intelligent humanoid built for human connection and companionship.
In 2020 when the pandemic hit, Brown was just two years into her degree in Robotics Engineering, so – like every young person on the planet – she found herself in enforced isolation. She explains:
I had COVID during the Melbourne lockdowns, which were very restrictive. I was fortunate because I still had my family and friends, though I couldn't see them. But it was still a very isolating period. So, Abi started as a companion, a project that I was building for myself. But I recognised that she could be used for a much greater purpose. And that’s what led me down this path.
To date, Andromeda has secured investment of $16 million and contracts with a variety of care providers for the robot, which is a multicoloured, multilingual, child-sized humanoid. Abi is designed to have “playful features and infinite empathy”, via onboard and cloud-based access to a variety of Large Language Models (LLMs).
Lockdown impact
While the long-term impact of lockdowns on young people has yet to play out in society – anecdotal evidence from parents and teachers suggests lurking problems that go deeper than teenage angst – the 2020 world in which Brown found herself was remarkably like the future that many middle-aged and elderly people face today.
The post-war baby boom combined with better healthcare and declining birthrates this century has created a demographic timebomb in countries such as the US, the UK, Japan, and much of Western Europe: an ageing population and a shortfall of workers, not to mention a generation of youngsters who face uncertain employment prospects in the age of AI. Expecting them to work in low-paid care jobs to look after their wealthy elders will be a massive ask. As Brown puts it:
All the data shows that it is our older generations that are most impacted by the loneliness epidemic. In Australia’s nursing homes today, four out of ten people never receive a visit from a friend or family member – ever, from the time they enter an aged care home. And it's about six out of ten here in the States.
After Melbourne, the start-up opened an office in San Francisco, where Brown herself is based. But in both countries, those are bleak statistics, and the situation will only get worse until the Boomers, Apollo’s Children, and Generation X have finally left the planet.
So, the race is on to build robots to fill the labour gap in many industries, and especially for health and social care tasks. But while the truly dextrous, intelligent, general-purpose robot may be a decade or more away – because of the 100,000-year Robot Data Gap I described in my earlier report [LINK] – a companion robot is already achievable, thanks to advances in conversational AI and mechatronic engineering.
So, welcome Abi. Reports from Andromeda’s clients are certainly enthusiastic. The Medical & Aged Care Group (MACG) is a private provider of care and assisted-living homes across the state of Victoria. Group Wellbeing & Community Coordinator, Jacinta Jaritus, says:
Abi brings such joy, companionship and a sense of community. Our Wellbeing Teams have observed so many positive and memorable moments between Abi and people living and working in our homes. Our staff also love speaking with Abi, especially in their first languages.
The robot’s positive impact on hard-pressed care workers is a notable upside, confirms CEO Cameron McPherson:
On the day she’s in our homes, staff are bouncing with excitement. We even have staff who have created and wear Abi t-shirts.
That speaks volumes about the pressures on those workers.
The youth epidemic
But there is a little-acknowledged problem amidst all this good news. While the labour gap is indeed growing, and middle-aged journalists focus on the loneliness epidemic they are facing themselves – backed by hard evidence – most people ignore loneliness among the young. That disconnection was fostered in lockdown, which pushed more and more teens and young adults towards technology, such as AI.
This is important, because while inspired innovators like Brown are developing solutions for our socially isolated elders, evidence is growing that those technologies are most popular with the young. For example, it is 18-24-year-olds who are turning to AI companions, therapists, counsellors, and advisors at scale.
I put this to Brown – a young woman who, after all, built a robot companion for herself. She says:
Yeah, I’m definitely seeing that there is a higher sample of younger generations leaning on AI as a life advisor. OpenAI has released data showing that young people use ChatGPT more like a life advisor while Millennials use it as a search engine. From what I’ve observed, people don't feel like they're judged when they're talking to an AI, as opposed to with friends where there are always social pressures or boundaries.
So, have Boomers, Generation Xers, and Millennials let their children and grandchildren down? Recently, I apologised to a twenty-something musician friend for my generation having broken the planet for people his age. He froze, smiled, and said:
Wow. Someone finally said it. We all think it, but you’re the first person to put it into words.
Then he paused and added:
I really appreciate it, man.
But Brown explains that these problems cross the generation gap, for a variety of complex reasons:
We have older adults who have end-of-life conversations with Abi, because they're embarrassed to have them with a nurse – let alone with their family. They feel comfortable talking to Abi, because she's non-judgmental. She's designed to demonstrate infinite patience.
Older adults who are just starting to get dementia are often aware their cognition is failing, and they're embarrassed by it. They catch themselves repeating things. But with Abi you can have the same conversation one hundred times, and she'll always react just as enthusiastically.
With your parents or grandparents, much as you love them, you’d find it frustrating to hear the same thing over and over. So, I think it's across all generations. There's just none of that social pressure or embarrassment.
She adds:
Even when you talk to friends, you might not want to share all your thoughts because you would worry what they might think. And people are naturally self-centred: they might have a bit of conversational narcissism and navigate back to themselves. But with an LLM, it's always about you and what you're talking about. It's always very affirming.
But is that a good thing? After all, ChatGPT has been cited in lawsuits where vulnerable people have died by suicide after seeking advice and companionship from their AI. Brown says:
This is probably where AI is a little bit dangerous, because the models are very affirming. And if someone has strong views in a direction that is not all that healthy or good, it can be quite dangerous speaking to an AI.
So, I think there definitely needs to be more education for the next generations that will lean on these tools to understand that they're never going to disagree with you or really push back on what you have to say.
All of which brings us to Abi’s AI. Is it ChatGPT? Brown explains:
We use several different LLMs, and one of them is gpt-realtime. But all these models are really just a response generator. With Abi and other true AI companions there is a lot more around the conversational architecture. We use a whole suite of different models, both in the cloud and offline, but our core IP is the fact that we've built the conversational architecture ourselves.
The robot security challenge
On that point, any cloud-connected robot should always be regarded as the eyes, ears, and hands of the internet. So, what are the privacy and security implications of that? Abi takes part in conversations with isolated, lonely people who are facing the ends of their lives, with some losing their mental acuity. Who else sees and hears those conversations? How secure and private are they? Brown says:
That's going to be different based on who built the robot – different companies will have different policies. But in our case, we do it differently for every customer. That's a negotiation we have with each of them independently, based on the level of security they're comfortable with. For example, in children's hospitals, it's very difficult to record any data because of how secure their requirements are for devices.
She adds:
Everyone is building towards the future where a general-purpose humanoid robot will be your assistant, or your companion, or to watch your children or your grandparents. So, how those robots are built and how the data is managed will be completely dependent on what vendors’ leaders have decided.
We work closely with our customers in terms of how Abi is integrated into their homes – in some cases, she's connected to their IT systems. So, any specific, select data that the home is interested in, Abi can collect. Staff can look at that data and be like, ‘Oh, Jane had this incident’, or ‘Jane said this to Abi, so we should send a nurse to go check on her.’
Right, so conversations with a companion robot may not be as private or as privileged as people believe, and this will need to be a focus for policymakers and regulators in the years ahead. In the hands of a less ethical and considerate provider than Andromeda, we could be ushering in an era of truly intrusive surveillance and data gathering, not to mention invisible marketing to captive customers.
Consider Tesla’s Optimus robot as a hypothetical example. How likely is it that a robot whose conversational interface and brain are based on Grok – an AI trained on xAI user accounts and data scraped off the internet, and recently implicated in the creation of illegal images – would respect user privacy and dignity?
So, if a dedicated care robot such as Abi is not connected to the cloud – because of a client’s privacy and security guidelines – would it have less functionality? Brown says:
Yeah, an offline robot is always going to be less capable and intelligent. Abi is a hybrid in that she can be online and offline, because in many environments, connectivity may be patchy. For example, elevators are boxes with zero connectivity. And in a hospital, there's lots of infrastructure around that interferes with Abi’s connection for continuous interaction.
Anyone who has visited people in hospital wards for extended periods will know that mobile and internet connectivity is often non-existent for this reason.
On a final point, Brown constantly refers to Abi as ‘she’. Perceived gender in humanoid robots is a complex topic in its own right, so does Brown really see her creation as female? She says:
I just use her pronouns because it's a bit more natural. Everyone uses something, but we leave it up to people to decide how they want to talk to Abi.
Most people use ‘she/her’, but some do use ‘he/him’. Very few people use ‘it’, because if something shows even the slightest sign of life, people naturally anthropomorphize it. Abi has eyes, arms, and legs, and people naturally project a childlike persona onto her. And when you do that, people tend to use human pronouns.
My take
A fascinating and insightful conversation with a young entrepreneur who is clearly achieving real results for her clients.
Abi is not currently for sale as a domestic companion; she is available to professional care homes as a subscription service. But if, one day, Andromeda opts to go the consumer sales route, Brown envisages a cost equivalent to that of an average family car.