But the extra years are not always golden. A recent report by the International Longevity Centre-UK noted that although today's seniors are healthier overall, there are rising levels of illness among the oldest of them. The report showed that in 2012, 14 percent of those over the age of 50 had a serious illness, down from 16 percent in 2002. However, illness among the 80-to-84 age group is rising, with 31 percent having a serious illness, up from 26 percent only a decade ago.
Caring for ailing seniors is expensive, and healthcare costs for the elderly promise to outstrip tax revenues – a problem compounded by a diminishing workforce. Worse yet, traditional support mechanisms are becoming outdated as grown-up children lack the time (and often the inclination) to look after aging parents. Recent developments in technology promise to provide a solution in the form of a new generation of "cyberconscious" robots – automatons that not only take over the care of seniors from human caregivers, but are also able to improve health outcomes by detecting their charges' moods and cheering them up when unhappiness descends.
Here Come the Robots
The Organization for Economic Cooperation and Development (OECD), a group comprising many of the world's wealthiest countries, predicts that the number of people over the age of 65 in its member nations will expand from 85 million in 1970 to 350 million by 2050. Unfortunately, the same can't be said of the workforce paying the taxes traditionally used to fund the care of seniors. The result is that the ratio of employees to retirees, known as the support (or dependency) ratio, is deteriorating steadily in all wealthy nations. According to The Economist, in 1970 in the U.S., 5.3 workers supported every pensioner; by 2050, that number will drop to 2.6. The situation looks even worse in Japan, where in the same time period, the ratio is forecast to drop from 8.6 to 1.2.
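The arithmetic behind the support ratio is simple: workers paying taxes divided by retirees drawing benefits. A minimal sketch (the population figures below are purely illustrative, not census data):

```python
def support_ratio(workers: float, retirees: float) -> float:
    """Number of workers supporting each pensioner."""
    if retirees <= 0:
        raise ValueError("retiree count must be positive")
    return workers / retirees

# Illustrative figures only: a country with 130 million workers and
# 50 million retirees has 2.6 workers supporting each pensioner --
# the level The Economist projects for the U.S. by 2050.
print(round(support_ratio(130e6, 50e6), 1))  # 2.6
```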
Worse still, medical expenditures for seniors in the U.S. are forecast to balloon to over $638 billion by 2015 (up from $492 billion in 2012), accounting for more than half of the total healthcare budget, according to recently released U.S. spending figures.
One strategy to rein in the healthcare budget is to encourage seniors to accept routine treatment at home by trained caregivers rather than in hospitals by expensive medical professionals. Anecdotal evidence shows that the elderly prefer to maintain this degree of independence, and this view is often backed by their families as well. The problem is an acute shortage of caregivers willing to visit seniors at home. The U.S. Bureau of Labor Statistics forecasts that demand for such people will far outstrip supply in the years ahead. Demand is expected to jump 48 percent, while the number of caregivers will rise just 1 percent.
As some researchers are eager to suggest, robots could provide the answer by stepping (or rolling) in to replace human caregivers. Should such an option become practical, concerned families are likely to choose a robot that reminds parents to take their medication and provides company over the emotional and financial cost of moving their loved ones into an assisted-living facility. In practice, the robots, ranging from glorified lunch carts to humanoid companions that wouldn't be out of place on a science-fiction film set, would assist the elderly with their day-to-day tasks and facilitate communications with family members via videoconferencing and the Internet. For this to work, the interface with the robot must be intuitive, and manufacturers must alleviate any fears that the elderly might have about relying on new technology.
The Electronic Nurse
The first generation of robots is already roaming the home. For example, more than 8 million of iRobot's home robots have been sold worldwide. The Massachusetts Institute of Technology (MIT) spin-off's vacuuming robot was among the first to make practical home robots a reality. But caring for a senior is a little more complex than keeping carpets clean, so robots that can provide meaningful assistance and companionship for an aging population are taking a little longer to evolve. However, the first generation of caregiving robots has been under development for some years. Two examples are the French company Robosoft's Kompaï and the U.S.-based GeckoSystems' CareBot.
Kompaï, which first became available in 2010, features a touch-screen display and a spherical white head. Remote family members call the robot via the Internet and Kompaï uses ultrasonic sensors to detect the location of its charge and navigate to that person, who responds via the display and a webcam. Similarly, CareBot uses a combination of sensors to maintain awareness of its surroundings. The robot’s GeckoTrak software senses body heat, identifies colors, and uses sonar and infrared (IR) range finders to monitor a person’s movements in the home. CareBot’s software can be used to remind the person in its care to walk the dog or turn on the TV to catch a particular program.
Kompaï and CareBot, together with other first-generation home-care robots such as Giraff (Figure 1), created by Swedish company Giraff Technologies AB, are primarily designed to make it easy for remote relatives to keep in contact with loved ones through Internet videoconferencing. But all perform the equally vital task of monitoring their charges’ health via wireless connection to third-party medical monitoring systems that measure vital signs and forward the information to physicians via the Internet.
Figure 1: Giraff makes it easy for relatives to keep in touch via videoconferencing. (Source: Giraff)
However, while these early home-care robots perform a useful task in keeping relatively healthy seniors in touch with families and medical staff, they fall far short of what’s needed to look after infirm elderly who are suffering from the ailments of advanced age such as brain degeneration, diabetes, heart disease, and cancer.
Tackling Loneliness
To assume the tasks of a live-in nurse, for example, requires a sophisticated robot with advanced sensors, learning capabilities, decision-making software, and a personality.
Research shows that seniors who spend most of their time alone live shorter and unhealthier lives than those with partners or live-in companions. Loneliness leads to an increased prevalence of mental illness such as depression, as well as physical symptoms like lack of appetite and restricted movement. For example, a large nationally representative survey of 3,000 older adults in the U.S. found that a lack of social relationships was associated with worse physical health, whether or not loneliness was actually experienced. Seniors who felt the most isolated reported 65 percent more depressive symptoms than those with more social interaction. Another study, conducted in 2010, observed 200 people over the age of 65, of whom 45 percent lived alone and 55 percent lived with partners. The study concluded that the former group prepared fewer daily meals and had a significantly lower daily intake of protein, fruits, and vegetables compared to people who lived with partners.
Home-care robot designers have been quick to acknowledge such research, and the next generation of robots will benefit from systems that allow them to act as companions as well as assistants. The concept is not new. In Japan, for example, a cuddly social robot called Paro has been available for a decade. Paro is an interactive robot developed by AIST, a Japanese industrial automation firm, allowing lonely elderly people to enjoy the documented benefits of animal therapy, which has been shown to reduce patient stress. Paro employs tactile, light, temperature, and posture sensors, with which it can perceive people and its environment. For example, the tactile sensor allows the robot to detect when it is being stroked, while the posture sensor indicates when Paro is being cuddled. The robot can also recognize the direction of voices and words such as its name via an audio sensor.
Making an Emotional Connection
Tomorrow’s home care robots are more likely to adopt a human-like appearance rather than Paro’s Pinnipedian form factor. Even the first-generation products include rudimentary attempts to offer emotional comfort. Kompaï’s head features a “face,” for example, and Giraff displays a screen image of a human head and shoulders that speaks to the elderly person.
The Personal Robots Group at MIT's Media Lab has taken things a step further with research that focuses on robots that can make stronger social and emotional connections with people. The lab has developed a range of robots including one called Nexi that can blink, shrug, and make facial expressions (Figure 2).
Figure 2: The Personal Robots Group at MIT's Media Lab has designed Nexi to make social and emotional connections with people. (Source: MIT)
Challenges remain because there is only so far this approach can be taken before the robot enters the so-called uncanny valley, where the machine's face is very humanlike, but not quite enough to be mistaken for a real person. The result is typically creepy or even the stuff of nightmares (in the same way that zombies are frightening). To develop a meaningful "relationship" with a human, a robot needs to be able to detect a person's mood and react accordingly. This is particularly important in the home-care sector, where a senior may be frequently unhappy due to discomfort or isolation. The robot will need to pick up on this negative demeanor through visual cues such as tears and a downturned mouth, audio signals such as extended silence or barely audible speech, and even physiological signs such as lowered temperature and raised blood pressure.
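Combining such cues is, at its core, a data-fusion problem. The sketch below is a purely hypothetical illustration of the idea, not any shipping product's algorithm; the field names, thresholds, and weights are assumptions chosen only to mirror the cues listed above.

```python
from dataclasses import dataclass

@dataclass
class Cues:
    tears_detected: bool      # from the vision system
    mouth_downturned: bool    # from facial-feature analysis
    silence_seconds: float    # from the audio front end
    speech_volume_db: float   # average speech level
    skin_temp_c: float        # from an IR or contact sensor
    systolic_bp: float        # from a wireless medical sensor

def distress_score(c: Cues) -> float:
    """Fuse visual, audio, and physiological cues into a 0..1 score;
    higher means the person is more likely to be unhappy."""
    score = 0.0
    if c.tears_detected:
        score += 0.3
    if c.mouth_downturned:
        score += 0.2
    if c.silence_seconds > 600:      # over 10 minutes of silence
        score += 0.2
    if c.speech_volume_db < 40:      # barely audible speech
        score += 0.1
    if c.skin_temp_c < 35.5:         # lowered temperature
        score += 0.1
    if c.systolic_bp > 140:          # raised blood pressure
        score += 0.1
    return min(score, 1.0)
```

A real system would replace these hand-set thresholds with learned models, but the principle — many weak signals combined into one actionable estimate — is the same.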
Christopher G. Atkeson, a professor in the Robotics Institute and Human-Computer Interaction Institute at Carnegie Mellon University, is conducting pioneering work in adapting robotic technology to meet the needs of the elderly. Atkeson’s key contribution is to suggest that robots should be soft, not made of metal. He argues that people are more likely to accept a soft robot (with an inflatable body constructed from materials similar to a bounce house) as a companion rather than a conventional “metal monster.” In fact, his vision partly influenced the making of Big Hero 6, a Disney movie about a boy whose closest companion is Baymax, an endearing inflatable robot that comforts the sick. The film showed how a health-care robot could interact with humans.
Aldebaran, a French company primarily owned by Japanese telecom firm SoftBank, is among the first to develop a cyberconscious robot. The firm created Pepper (Figure 3), an interactive robot that, according to the manufacturer, is able to detect human emotions and choose the ideal way to communicate with the person. The robot listens to voices and analyzes body language and can modify language and gestures to adapt to a given situation. Pepper includes a screen on his (apparently the robot is male) chest, where he “shows his emotions and what’s happening in his inner world.”
Figure 3: Pepper analyzes body language and adapts to the given situation. (Source: Aldebaran)
Enhancing Robot Sensors
Robots such as Pepper and its descendants will need to take advantage of the full range of modern sensors in order to pick up the gamut of cues that indicate a human’s state of mind. Like humans themselves, robots will use vision to take in much of this information.
OMRON's global research and development group has developed the OKAO Vision system, which can recognize different faces through analysis of facial features (Figure 4). These products are already finding a niche in robot-vision applications. OKAO Vision comprises Human Vision Components (HVC) and advanced software that the company says will help robotic machines "understand people visually in much the same way as humans do." The software can recognize faces and facial attributes to estimate a person's gender, age, and ethnicity, as well as determine whether a person is happy, surprised, angry, sad, or neutral.
Figure 4: OKAO Vision system is being adopted for robotic applications. (Source: OMRON)
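OMRON does not publish OKAO Vision's internals, but the general approach — measuring geometric facial features and mapping them to a small set of expression labels — can be sketched with a hypothetical rule-based classifier. Every measurement name and threshold below is an assumption for illustration only.

```python
def classify_expression(mouth_curve: float, eye_openness: float,
                        brow_angle: float) -> str:
    """Map illustrative facial measurements to the five labels the
    article mentions.
    mouth_curve:  +1 = broad smile .. -1 = deep frown
    eye_openness:  0 = closed .. 1 = wide open
    brow_angle:   +1 = raised .. -1 = furrowed
    """
    if eye_openness > 0.9 and brow_angle > 0.5:
        return "surprised"          # wide eyes plus raised brows
    if brow_angle < -0.5:
        return "angry"              # strongly furrowed brows
    if mouth_curve > 0.4:
        return "happy"
    if mouth_curve < -0.4:
        return "sad"
    return "neutral"
```

Production systems learn these mappings from large labeled image sets rather than hand-coding them, but the input/output contract is similar.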
In addition to charge-coupled device (CCD)-based vision systems such as OKAO's HVC, future robots will benefit from enhanced vision through the use of sensors that extend sensitivity to a wider range of electromagnetic radiation than just visible light. For example, IR sensors such as Amphenol's ZTP thermopile IR sensors and Lumex's MicronSensIR™ devices will enable robots to "see" in the dark and locate their charges, as well as determine whether a person's temperature is high or low.
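A thermopile outputs a voltage proportional to the difference in radiated power between the object and the sensor itself (the Stefan-Boltzmann law), so recovering the object's temperature is a matter of inverting that relationship. The sensitivity constant below is an illustrative assumption, not a datasheet figure for the parts named above.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W·m⁻²·K⁻⁴
SENSITIVITY = 2.0e-5    # illustrative sensor constant, V per W·m⁻²

def object_temperature_k(v_out: float, sensor_temp_k: float) -> float:
    """Invert V = S·σ·(T_obj⁴ − T_sens⁴) to recover the object's
    temperature in kelvin from the thermopile output voltage."""
    t4 = v_out / (SENSITIVITY * SIGMA) + sensor_temp_k ** 4
    return t4 ** 0.25
```

In practice a calibration table and an ambient-temperature reference (often an on-chip thermistor) replace the single idealized constant used here.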
Complementing vision sensors, future robots will also employ audio sensors based on microphones converting the vibrations from sound waves into voltages. Multiple microphones situated around the robot's head will be used to determine the direction and intensity of a person's voice. But that's the simple part; in order for a home-care robot to use the sound of its charge's voice to determine emotional status, the machine will need to be equipped with analog-to-digital conversion (ADC) and digital-signal processing (DSP) electronics allied to a powerful microprocessor and some clever software. Companies such as Analog Devices, Cirrus Logic, STMicroelectronics, and Texas Instruments provide the electronics that will underpin robot audio-analysis applications.
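Direction finding with multiple microphones rests on the time-difference-of-arrival (TDOA) of the sound at each microphone. A real robot head would use several microphones and cross-correlation on the ADC samples; the sketch below shows only the core two-microphone geometry, with illustrative values.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def bearing_degrees(tdoa_s: float, mic_spacing_m: float) -> float:
    """Estimate the angle of a sound source from the array's broadside.
    tdoa_s: arrival-time difference between the two microphones (s);
    the source sits at asin(c·Δt / d) from straight ahead."""
    ratio = SPEED_OF_SOUND * tdoa_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))   # clamp numerical noise
    return math.degrees(math.asin(ratio))
```

A zero time difference means the voice is straight ahead; the larger the lag between the two channels, the further off-axis the speaker.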
Touch sensors will supplement a next-generation home-care robot's vision system in determining a person's emotional state. Today, developers can choose from a range of touch sensors that use technologies such as a change in capacitance to detect touch. Although the technology is currently used for products such as touch-screen displays, it's not too difficult to envision how the process could be reversed such that a robot could use a capacitive touch sensor to "feel" human skin and gain some insight into the person's emotional state by measuring the changes in capacitance that occur as the skin changes temperature. Companies such as Freescale Semiconductor, Maxim Integrated, Cypress Semiconductor, and ON Semiconductor are strong in touch-sensor technology.
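One common way to read a capacitive sensor is the RC charge-time method: charge the electrode through a known resistor and time how long the voltage takes to reach a threshold; a touch raises the capacitance and therefore the charge time. The component values and detection margin below are illustrative assumptions, not figures from any vendor's controller.

```python
import math

def charge_time_s(r_ohms: float, c_farads: float,
                  threshold_fraction: float = 0.63) -> float:
    """Time for an RC circuit to charge to the given fraction of Vcc:
    t = -R·C·ln(1 - threshold_fraction)."""
    return -r_ohms * c_farads * math.log(1.0 - threshold_fraction)

def touch_detected(baseline_c: float, measured_c: float,
                   margin: float = 1.2) -> bool:
    """Flag a touch when capacitance rises noticeably above the
    untouched baseline (a 20 percent margin is assumed here)."""
    return measured_c > baseline_c * margin
```

Commercial touch controllers from the companies named above automate this measure-and-compare loop in hardware, adding filtering and automatic baseline drift compensation.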
Tomorrow’s developments include a touch sensor that uses gold nanoparticles supplemented with organic connector molecules. When the sensor touches skin and slightly flexes, the distance between the nanoparticles changes and the electrical characteristics of the sensor alter, allowing not only pressure, but also the temperature and skin humidity of the person being touched to be measured.
Robots will likely leverage technology beyond their own in their role as caregivers. For example, wearable wireless medical sensors enabled by technologies such as Bluetooth Low Energy could be employed to relay information such as heart rate, blood pressure, and temperature to the robot in order to complement its own sensors in determining a person’s mood. Companies that specialize in Bluetooth technology include Broadcom, Nordic Semiconductor, Panasonic, and Texas Instruments.
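As a concrete example of such a relay, the Bluetooth SIG defines a standard Heart Rate Measurement characteristic (UUID 0x2A37) that a wearable monitor notifies to a connected device. Decoding its payload is straightforward; the sketch below handles only the heart-rate field and ignores the characteristic's optional energy-expended and RR-interval fields.

```python
def parse_heart_rate(payload: bytes) -> int:
    """Extract beats-per-minute from a Heart Rate Measurement (0x2A37)
    notification. Bit 0 of the leading flags byte selects whether the
    value is an 8-bit or a 16-bit little-endian integer."""
    flags = payload[0]
    if flags & 0x01:                          # 16-bit value
        return int.from_bytes(payload[1:3], "little")
    return payload[1]                         # 8-bit value
```

The robot could feed the decoded rate, alongside blood pressure and temperature from similar characteristics, into its mood-estimation software.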
Robots have some way to go before they can match the quality of home care routinely provided by humans. But the lucrative rewards promised to manufacturers targeting the rapidly expanding over-65 age group are encouraging rapid development. When asked by Not Impossible Now, a technology website, whether robots such as Disney's Baymax would ever be possible, Professor Atkeson replied "not only will they be possible, but it will happen very soon."
By Steven Keeping, Mouser Electronics
Steven Keeping gained a BEng (Hons.) degree at Brighton University, U.K., before working in the electronics divisions of Eurotherm and BOC for seven years. He then joined Electronic Production magazine and subsequently spent 13 years in senior editorial and publishing roles on electronics manufacturing, test, and design titles including What’s New in Electronics and Australian Electronics Engineering for Trinity Mirror, CMP and RBI in the U.K. and Australia. In 2006, Steven became a freelance journalist specializing in electronics. He is based in Sydney.