Capturing the Heartbeats of Daily Living to Improve Eldercare

Every room in your home reveals unique insights about your daily activities.

By leveraging a network of connected devices, sensors, and AI-based systems, we have answered the old adage, “If those walls could talk.” Now they can’t stop talking.

We can learn an individual’s patterns for sleeping, eating, exercising, cooking, and bathing, and how often they connect with family, friends, and the community – all balanced with security and privacy.

IBM believes it is these patterns that will help improve care for our aging population. By leveraging cognitive eldercare solutions, we can transform the way seniors age in place and prolong their independence.

Aging Dilemma

The aging of society will have unprecedented effects on healthcare, the economy, and individual quality of life.

For the first time in history, people over 65 will outnumber children under five. Nearly 40 percent of older adults develop disabilities, and many more suffer from chronic physical and mental diseases that require intensive monitoring and care.

Technology and Eldercare

The Internet of Things and artificial intelligence will help us improve proactive care, reduce risk of injury, and increase the ability for elders to remain where they are most familiar and content – in their homes.

With inexpensive, consumer-grade sensors placed throughout a home, we can now capture the heartbeats of daily living. For instance, if you could see or hear an elder running the water, opening the refrigerator, lighting the stove, and walking repeatedly across the kitchen, what would you conclude?

They’re cooking, of course. That’s an obvious deduction, but with important implications.

Knowing an elder is cooking can say a lot about their nutrition and social life, and lets us draw conclusions about whether they’re entertaining or feeling isolated.
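
To make that deduction concrete, here is a minimal sketch of a rule-based inference. The sensor names (kitchen_faucet, refrigerator_door, stove_on, kitchen_motion) and thresholds are illustrative assumptions, not our actual device identifiers, and a real deployment would learn these patterns from each home’s data rather than hard-code them.

from datetime import timedelta

def infer_cooking(events, window=timedelta(minutes=30)):
    """events: time-sorted list of (timestamp, sensor_id) pairs for one home."""
    required = {"kitchen_faucet", "refrigerator_door", "stove_on"}
    for i, (start, _) in enumerate(events):
        seen, kitchen_walks = set(), 0
        for ts, sensor in events[i:]:
            if ts - start > window:
                break
            if sensor == "kitchen_motion":
                kitchen_walks += 1
            elif sensor in required:
                seen.add(sensor)
        # Water, refrigerator, stove, and repeated walking in one window
        # is a reasonable cue that a meal is being prepared.
        if seen == required and kitchen_walks >= 3:
            return True
    return False

A hand-written rule like this is only a starting point; in practice the window and thresholds are tuned, or learned, per household.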

Keeping Our Loved Ones Safe

The challenge comes from trying to scale such insights to make similar observations for every room throughout the home.

At IBM, my job is to determine which sensors to deploy and where to place them, then analyze the sensor data, correlate it with any other available dataset, search for patterns, and identify warning signals. If I am successful, we’ll enable families and professionals to monitor elders more effectively from afar and keep them safe and independent.

Senior care providers, such as our partner Avamere Family of Companies, want a solution that generates new insights that could reduce risk, lower cost of care, and significantly improve the elder’s quality of life.

To meet these objectives, research tells us there are six Activities of Daily Living (ADL) patterns – toileting, bathing, eating/cooking, sleeping, “transferring”/mobility, and dressing – that are key to understanding how well an elder is managing to age in place and what level of care is required.
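
As a rough illustration, those six patterns can be rolled up into a simple per-resident daily record. The field names below are my own shorthand for this post, not an actual IBM or Avamere schema:

from dataclasses import dataclass

@dataclass
class DailyADLSummary:
    resident_id: str
    date: str                 # ISO date, e.g. "2018-03-14"
    toileting_events: int
    bathing_minutes: float
    meals_prepared: int       # eating/cooking
    sleep_hours: float
    transfers: int            # "transferring"/mobility events, e.g. bed to chair
    dressing_events: int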

Toileting Revelations

For instance, it’s amazing what we have been able to learn from an elder’s toileting.

Understanding toileting behavior and routines isn’t just about diagnosing incontinence or dehydration; it’s about connecting certain behaviors to an increased risk of developing afflictions such as urinary tract infections. Early detection may reduce emergency room visits and keep residents out of costlier skilled nursing facilities.

By using the information from flush and motion sensors, we can now detect when an event happened, predict when we expect it to happen again – per individual – and look for anomalies and outliers that give us warning signals.
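
To illustrate the idea – the function, thresholds, and statistics here are simplifying assumptions, not the production system – a per-resident baseline of the gaps between toileting events is enough to surface both kinds of warning signals:

from statistics import mean, stdev

def flag_toileting_anomalies(event_times, z_threshold=2.5):
    """event_times: chronologically sorted datetimes of one resident's flush/motion events."""
    gaps = [(b - a).total_seconds() / 3600.0        # hours between consecutive events
            for a, b in zip(event_times, event_times[1:])]
    if len(gaps) < 3:
        return []                                   # not enough history yet
    mu, sigma = mean(gaps), stdev(gaps)             # mu also serves as a naive
                                                    # prediction of the next gap
    # Flag gaps far outside this resident's own baseline: an unusually long
    # stretch may hint at dehydration, while a burst of short gaps may hint
    # at a urinary tract infection.
    return [(event_times[i + 1], gap)
            for i, gap in enumerate(gaps)
            if sigma > 0 and abs(gap - mu) / sigma > z_threshold]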

Internet of “Caring” Things

Realizing what we can model and analyze with consumer-grade sensors – even those placed in a bathroom – illustrates the power of “listening to the walls” for elders, caregivers, and families.

Unfortunately, these sensors are not going to help anyone stay young, but they can help keep our loved ones safer and more independent for as long as possible.
