Daily tasks that seem insignificant to many of us can present major challenges for people living with dementia. Getting dressed is a particularly poignant source of frustration because of its complexity and the loss of privacy that comes with relying on a caregiver. Now, researchers at Arizona State University, the New York University Rory Meyers College of Nursing, and the Massachusetts General Hospital Institute of Health Professions have developed, with the help of caregiver focus groups, a smart dressing system called DRESS that could help people with dementia.
“The intent of the DRESS prototype is to integrate typical routines and humanized interactions, promote normalcy and safety, and allow for customization to guide people with dementia through the dressing process,” the study’s lead author, Winslow Burleson, Ph.D., said in a press release.
The smart dresser prototype combines sensors, a camera, a motion sensor, a tablet, and a mobile app. A caregiver uses the app to start the system and monitor the user’s progress. The process begins when the user hears a prompt, previously recorded by the caregiver, telling them to open the top drawer, which lights up.
The top drawer, like each of the others, contains a single article of clothing. The camera uses bar codes on the clothes to identify each item and determine whether the user puts it on correctly. If they do, the system prompts them to move on to the next drawer.
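As a rough illustration of how a barcode-based check like this might work, the sketch below treats a garment as correctly worn when a front-facing tag is seen right-side up and the corresponding back tag is hidden. The tag names, detector output, and decision rule are assumptions made for the example, not details of the researchers’ implementation.

```python
# Hypothetical sketch of the kind of check a DRESS-style system could run for
# one garment. The tag layout and "front visible / back hidden" rule are
# assumptions for illustration, not details from the published system.

from dataclasses import dataclass


@dataclass
class TagReading:
    tag_id: str     # which barcode the camera saw, e.g. "shirt_front"
    visible: bool   # is the tag currently in view?
    upright: bool   # is the tag right-side up in the camera frame?


def garment_worn_correctly(readings: dict, front_tag: str, back_tag: str) -> bool:
    """Treat a garment as 'on correctly' when its front tag faces the camera
    right-side up and its back tag is hidden behind the wearer."""
    front = readings.get(front_tag)
    back = readings.get(back_tag)
    front_ok = front is not None and front.visible and front.upright
    back_hidden = back is None or not back.visible
    return front_ok and back_hidden


# A shirt worn backwards exposes the back tag, so the check fails.
readings = {"shirt_back": TagReading("shirt_back", visible=True, upright=True)}
print(garment_worn_correctly(readings, "shirt_front", "shirt_back"))  # False
```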
If the system detects an error or a lack of activity, the audio recordings provide redirection and encouragement. All the while, a bracelet containing a skin-conductance sensor monitors the user’s stress level. If DRESS detects repeated difficulties or rising stress, it alerts the caregiver to step in and assist.
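The monitoring side can likewise be pictured as a small supervisory loop. The sketch below is illustrative only: the timeout, error limit, skin-conductance threshold, and the sensor, audio, and app interfaces are all placeholders rather than values or APIs from the DRESS study.

```python
import time

# Illustrative thresholds only -- the study does not report these values.
INACTIVITY_TIMEOUT_S = 90       # seconds without progress before re-prompting
MAX_ERRORS = 3                  # mistakes on one step before escalating
STRESS_THRESHOLD_US = 8.0       # skin conductance (microsiemens) treated as "stressed"


def monitor_step(sensors, audio, app, step):
    """Supervise one dressing step: replay recorded guidance on errors or
    inactivity, and alert the caregiver if difficulties or stress mount.

    `sensors`, `audio`, `app`, and `step` stand in for the camera/bracelet
    interface, the audio player, the caregiver's mobile app, and the current
    step's configuration; they are placeholders, not real DRESS APIs."""
    errors = 0
    last_progress = time.monotonic()

    while not sensors.step_completed(step):
        if sensors.dressing_error(step):
            errors += 1
            audio.play(step.redirection_clip)      # caregiver-recorded correction
            last_progress = time.monotonic()
        elif time.monotonic() - last_progress > INACTIVITY_TIMEOUT_S:
            audio.play(step.encouragement_clip)    # gentle nudge to keep going
            last_progress = time.monotonic()

        # Repeated trouble or a rising stress reading hands control back to a human.
        if errors >= MAX_ERRORS or sensors.skin_conductance() > STRESS_THRESHOLD_US:
            app.notify_caregiver(f"Assistance needed: {step.name}")
            return

        time.sleep(1)   # poll roughly once per second

    app.mark_step_done(step)    # completion shows up in the caregiver's app
```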
A study the researchers published in JMIR Medical Informatics describes lab tests of the DRESS system in which 11 healthy participants acted out scenarios ranging from standard dressing to putting a shirt on backwards.
The tests found that the system accurately detected clothing orientation and position in 288 of 384 trials, or 75 percent of the time. It had more trouble recognizing when users had finished putting on an item, however, missing the completion of shirts in 10 of 22 trials and of pants in five of 22.
Based on their tests, the researchers identified several areas for improvement, including using larger bar codes, minimizing the folding of clothing, and positioning users more optimally in front of the DRESS system.
Once those refinements are in place, the researchers believe, the DRESS system could improve the quality of life of the approximately 1.5 million U.S. seniors who need help completing everyday activities.
The research also illustrates the positive impact that smart technologies can have on people’s lives.
“With improvements identified by this study,” Burleson said, “the DRESS prototype has the potential to provide automated dressing support to assist people with dementia in maintaining their independence and privacy, while alleviating the burden on caregivers.”