Ideation: Can Artificial Intelligence and Sensors Be Used for Alzheimer's Patients and the Families That Care for Them?

In 2015, I participated in a local 24-hour hackathon, where I joined a developer and a designer to ideate and launch a prototype concept called Safe Senior. This is a write-up on how we tackled our hackathon project. Our aim was to build a project around the issues facing the aging Alzheimer's population and the families that care for them. Our ingredients of choice were sensors, mobile, and user-centered design. I recently wrote about the top-performing public digital health companies. From that observation, one may infer that the top software companies focus on niche-specific issues to invent and develop more efficient solutions. Our main focus was the aging population suffering from Alzheimer's disease.


Source: Alz.org

If we look at the chart above, the cost of caring for individuals with Alzheimer's is $160 billion for the Centers for Medicare & Medicaid. The total direct cost projected in this report was $236 billion. Our hypothesis was that families who have loved ones with this disease worry daily about how their loved one is doing. Our target users were families of seniors who lived in assisted living homes or were cared for at home by a caregiver. The way we saw it, sensors could detect and monitor seniors' activities in the background and notify families.



Source: SafeSenior.Club. Design was led by Christopher Torres.

To handle this, we decided to design a tool with the goal of keeping families informed by tracking their loved one's movements at home and notifying them. At its core, our problem was leveraging technology to lower spending on seniors' Alzheimer's care by tracking their movement. We incorporated the color blue and a shield shape into the logo to reflect the values of trust and accountability.

The interface on the right was conceived by the designer, who wanted a custom design that broke away from expectations based on Material Design. The shield appeared on the homepage, and the four sections were: check the status of a senior, add a family/senior member, send a message, or get location details for a particular senior. A simple login and interface ensured people could benefit from the service immediately.


Artificial Intelligence (AI) and its subsets, like machine learning, made us ask the question, "What if you could let families know their loved one was fine and, over time, use that data to determine the need for a caregiver?" One of the hurdles for this idea was finding a way to prototype collecting the data and sending it out. We ideally wanted a DropCam, Alexa, or similar smart home device. The developer on the team brought in an Xbox Kinect. We would use its camera to detect motion and a face/emotion recognition API to detect the expressions of the person being monitored at a given point.
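To give a sense of the motion-detection half of that idea, here is a minimal sketch of frame differencing, the classic technique behind camera-based motion detection. The Kinect capture and the emotion API call are not shown; the toy frames, function names, and threshold below are all illustrative assumptions, not our hackathon code.

```python
# Hypothetical sketch: motion detection by frame differencing.
# Real frame capture from the Kinect is stubbed out with toy grayscale data.

def frame_delta(prev, curr):
    """Sum of absolute pixel differences between two grayscale frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr))

def detect_motion(prev, curr, threshold=50):
    """Flag motion when the total pixel change exceeds a threshold."""
    return frame_delta(prev, curr) > threshold

# Toy 3x3 grayscale frames (flattened): the senior moves between captures.
still = [10, 10, 10, 10, 10, 10, 10, 10, 10]
moved = [10, 10, 10, 90, 90, 10, 10, 10, 10]

print(detect_motion(still, still))  # False: no pixels changed
print(detect_motion(still, moved))  # True: enough pixels changed
```

A real pipeline would run this per frame pair on the Kinect feed and only call the (comparatively expensive) face/emotion recognition API when motion fires.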

The tech stack was a browser (web service), a mobile app, and a sensor capturing information that would be sent first to the web service and then to the mobile app. Our main objective was to use the sensor to monitor a person's movement, capture the necessary data, send it to our web app, and finally notify family members through our mobile app.
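The hand-off between those layers can be sketched as a small event payload: the sensor reports an event, and the web service shapes it into the JSON the mobile app would display as a notification. The field names and payload shape below are assumptions for illustration only.

```python
# Hypothetical sketch of the sensor -> web -> mobile hand-off.
import json
from dataclasses import dataclass

@dataclass
class SensorEvent:
    senior_id: str
    event: str        # e.g. "motion", "emotion"
    detail: str       # e.g. "walking", "happy"
    timestamp: str    # ISO 8601

def to_mobile_payload(evt: SensorEvent) -> str:
    """Shape a sensor event into the JSON the mobile app would receive."""
    return json.dumps({
        "title": f"Update on senior {evt.senior_id}",
        "body": f"Detected {evt.event}: {evt.detail}",
        "sent_at": evt.timestamp,
    })

evt = SensorEvent("s-001", "emotion", "happy", "2015-06-13T14:02:00Z")
print(to_mobile_payload(evt))
```

In a fuller build, the web service would persist these events before pushing them, so the family history view and the later machine-learning step could read from the same store.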



The use of sensors and remote monitoring is hardly new. Our objective wasn't to reinvent the wheel; it was to simplify and test whether our idea had merit. Iteration could lead to designing a better tool. The project mission for this hackathon: how can we make families more connected to their aging loved ones who face Alzheimer's?

The Xbox sensor bar has a limited range, which meant observation couldn't be left to this device alone. Perhaps an interface that allows for more than one tracking method would make this service better: for instance, more of a Matrix device. Also, we weren't clinicians. We would have to pilot the tool with a user group in order to see if we were moving in the proper direction.

By using the Xbox Kinect we could identify when a senior was walking around the house. We could also detect the senior's emotion through the sensor. We would then collect that data, send it to our web app, and notify the family member of the senior's current status. For instance: are they happy? Have they left the house? Do they appear confused? Over time, we assumed machine learning could use that data to tell family or caregivers when they should visit.
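Those status questions could start as a simple rule table long before any learned model exists. The sketch below maps detected states to family alerts, with a naive frequency heuristic standing in for the eventual machine-learning step; all state names and thresholds are illustrative assumptions.

```python
# Hypothetical rules mapping detected states to family notifications.
ALERTS = {
    "happy": None,                                   # routine, no alert
    "confused": "Senior appears confused. Consider checking in.",
    "left_house": "Senior has left the house.",
}

def alert_for(state):
    """Return an alert message for a detected state, or None if routine."""
    return ALERTS.get(state)

def suggest_visit(confused_events_this_week, threshold=3):
    """Naive stand-in for the learned model: frequent confusion -> visit."""
    return confused_events_this_week >= threshold

print(alert_for("confused"))
print(suggest_visit(4))  # True: above the weekly threshold
```

A trained model would eventually replace `suggest_visit`, learning each senior's baseline instead of using one fixed threshold.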

In an extended version of this proof of concept, you could add more smart devices, sensors, and cameras to build a smart home that handles remote monitoring. Through piloting and iteration, this could be refined to determine how to decrease the need for a caregiver, or the hours per week a care professional is needed. Another good example of face recognition for Alzheimer's patients comes from a 13-year-old girl who built an iOS app called Timeless.

Here is a Sway Presentation of what we presented at the hackathon.