How will artificial intelligence help aging boomers?
10th April 2017
By Randy Rieland
Contributing Writer
Part 1
(Special from Smithsonian.com/New America Media) – The relationship between humans and robots is a tricky thing. If the latter looks too much like the former, but is still clearly a machine, people think it’s creepy, even repulsive—a feeling that’s become known as the “uncanny valley.”
Or, as is sometimes the case, the human, with Star Wars or “The Jetsons” as his or her reference points, is disappointed by all the things the robot can’t yet do. Then, there is the matter of job insecurity — the fear of one day being replaced by a tireless, unflappable, unfailingly consistent device.
Human-robot interactions can be even more complicated for one group in particular—older adults. Many are not that comfortable with new technology, even less so if they feel it’s invading their privacy or a constant reminder of their own slipping cognitive skills.
And yet, it’s widely believed that with the first surge of baby boomers hitting their 70s — with a huge wave to follow — technology in some form will play a growing role in enabling older adults to live in their homes longer.
But will it be robot companions? Talking digital assistants? Strategically placed sensors? Or maybe some combination of devices? And what unexpected impact could they have on how people age and whether they stay connected to family and friends?
“You have to walk this balance on where you are starting to impinge on somebody’s privacy versus tracking their safety and social engagement,” says David Lindeman, co-director of Health Care at the Center for Information Technology Research in the Interest of Society (CITRIS) at the University of California, Berkeley. “That’s the compelling challenge of the next decade. How do we maximize the use of this technology without having unintended consequences?”
The Right Moves
Recently, a small group of older adults in San Francisco have been learning to engage with a talking device named ElliQ. It’s more desk lamp than archetypal robot—think of the hopping lamp at the beginning of Pixar movies. But while ElliQ is meant to sit on a table or nightstand, it’s all about movement, or more accurately, body language.
Like Siri or Amazon’s Alexa, ElliQ talks. But it also moves, leaning toward the person with whom it’s speaking. It lights up, too, as another means of engagement, and uses volume and sound effects to distinguish its messages.
“If ElliQ is shy, she will look down and talk softly, and her lights will be soft,” explained Dor Skuler, CEO and founder of Intuition Robotics, the Israeli company behind the device. “If she tries to get you to go for a walk, she will lean forward and take a more aggressive tone, and her lights will be bright.”
Skuler added, “Most of the way we communicate as humans is nonverbal. It’s our body language, our use of silence and tone, the way we hold ourselves. But when it comes to working with a computer, we’ve adapted to the technology instead of the other way around. We felt that a machine having a physical presence, versus a digital presence, would go a long way in having what we call natural communication.”
In a typical interaction, Skuler described, the grandchildren of an ElliQ owner send her photos through a “chatbot” using Facebook Messenger. When ElliQ sees new pictures have come in, it tells the grandmother and asks if she wants to look at them. If she says yes, ElliQ brings them up on its separate screen component. As the woman looks at the photos, so does ElliQ, tilting its “head” toward the screen, and turning the moment into more of a shared experience. With the help of its image recognition software, it might add, “Aren’t those girls cute?”
“It’s not the same as your adult child coming over to you and showing you photos of your grandchildren on her phone,” said Skuler. “But it’s also very different from you just looking at the photos on a screen by yourself. You weren’t with another person, but you weren’t really alone, either. We call that an in-between stage.”
He continued, “What we like about this is that without the family sending the content, there is no content. ElliQ isn’t there to replace the family. I don’t think we want to live in a world where people have meaningful relationships with machines. What it can do, though, is make that content more accessible and allow you to share the experience.”
Not Too Cutesy
A lot of research went into how ElliQ looks and behaves, said Yves Béhar, founder of fuseproject (fuseproject.com), the San Francisco industrial design firm that worked with Intuition Robotics on the project. That included getting input from experts on aging. (“Our first hire was a gerontologist,” said Skuler.)
“One of the key premises behind ElliQ is that technology is complicated and perhaps too complex for aging people to use,” Béhar said. “But artificial intelligence [AI] can be used to engage with a person in a much simpler way. It can remind a person to take their meds, or connect with their family or just tell them, ‘Hey, why not go outside. It’s nice out.’
“And we felt that ElliQ should be a table object, rather than a creepy robot that follows you around,” he said. “By keeping it in one room, a person can interact with it like they would a familiar appliance in a familiar context.”
There was another important consideration, noted Béhar: it had to look appropriate. “We didn’t want it to look childish or cartoonish,” he said. “We didn’t feel that was right. We wanted it to be friendly, but not too cutesy in a way that diminished the intelligence of the user.”
It’s also critical that ElliQ keeps learning. As Skuler explained it, one of the first steps in establishing a relationship with this particular robot is to set some goals, such as how many times a week a person wants to go out for a walk or be reminded to see friends. Then, it’s up to ElliQ to determine the most effective way to do its job. In other words, it will learn that one person responds better to, “It’s nice out, why don’t you go for a walk,” while another needs to be prodded more aggressively with “You’ve been on the couch watching TV for four hours. Time to get up and take a walk.”
“That’s where the emotive side kicks in,” he said. “ElliQ can set a whole different tone, and use different body language and gestures based on what works and what doesn’t work. The machine fine-tunes itself.”
Although he described ElliQ as a “good listener,” Béhar sees the device more as a coach than a companion. He acknowledged the risk of making machines too engaging, and thereby encouraging more social isolation, not less.
“We don’t want to create the kind of emotional dependency that social media sometimes does,” he said. “We need to make sure it complements their human relationships. It’s very important that we keep that in mind as we develop these interactions between humans and machines with artificial intelligence.”
This article originally published in the April 10, 2017 print edition of The Louisiana Weekly newspaper.