Bracelets and amulets are in the works at Dartmouth's Institute for Security, Technology, and Society (ISTS). These are not items of mere adornment, however: the institute's scientists and engineers are constructing personal mobile health (mHealth) devices, highly functional jewelry, as it were.
mHealth is a rapidly growing field in which mobile devices help you or your physician monitor your health. The approach can offer more accurate and timely diagnoses as well as lower healthcare costs. However, smartphones are often used to transmit the collected medical information, and those transmissions are vulnerable to hacking.
David Kotz '86 guides a research group whose focus is mHealth. Kotz, professor of computer science and associate dean of faculty for the sciences, works with a diverse team whose members include graduate students (Shrirang Mare and Cory Cornelius), a postdoctoral associate in computer science (Jacob Sorber, now an assistant professor of computer science at Clemson University), a computer programmer (Ronald Peterson), faculty and technical staff from Thayer School of Engineering and the Geisel School of Medicine (Ryan Halter and Joe Skinner), and others. Their wide-ranging skills are being brought to bear in a field that is redefining the relationship between patient and doctor.
Collection and communication of medical information via mHealth systems can help a physician monitor patients with chronic diseases or other medical concerns more frequently. The ability to review the data remotely and assess a patient's condition might also mean fewer trips to the hospital or the doctor's office.
One of the Institute's intriguing wearable devices under development has, fittingly, been christened "Amulet"; it was first announced at a conference in February 2012. Its foremost feature would be wireless communication, possibly using something like Bluetooth. As envisioned, Amulet would function as a communications hub for the mHealth devices on a person's body, akin to a local area network, ultimately connecting them to an electronic medical records system.
"We see our Amulet concept as a means to collect body-area sensor data," Kotz explains. "The device could collect electrocardiogram signals from a heart monitor, obtain glucose readings from a glucose meter, and even talk to your insulin pump to control the insulin injections."
Amulet would look like a fancy digital watch, incorporating a display and computing capabilities, but its ultimate utility would hinge on its ability to communicate and correlate the collected sensor data accurately and securely.
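To make the hub idea concrete, here is a minimal sketch in Python of what such a collector might look like in software. Everything in it is hypothetical: the device names, the reading format, and the batching step are assumptions for illustration, not details of the actual Amulet design.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Reading:
    device_id: str   # e.g., "ecg-monitor" (invented name)
    kind: str        # e.g., "ecg", "glucose"
    value: float
    timestamp: float

@dataclass
class AmuletHub:
    """Illustrative body-area hub: accepts readings only from paired
    sensors and batches them for a medical-records backend."""
    paired_devices: List[str] = field(default_factory=list)
    buffer: List[Reading] = field(default_factory=list)

    def collect(self, reading: Reading) -> None:
        # Only accept data from devices known to be on this body.
        if reading.device_id in self.paired_devices:
            self.buffer.append(reading)

    def flush(self) -> List[Reading]:
        # In a real system this batch would be encrypted and sent
        # on to an electronic medical records system.
        batch, self.buffer = self.buffer, []
        return batch
```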
In a related development led by Dartmouth graduate student Cory Cornelius '07, the group presented a paper in August 2012 describing a prototype biometric bracelet that adds security to the system.
Another jewelry-like mHealth device, the "bracelet" could be functionally integrated with the Amulet. The bracelet applies a tiny alternating current to the wearer's skin at several frequencies, and each person's body appears to respond to these currents uniquely. The responses, which stem from individual variations in the shape and thickness of body tissue, could serve as a biometric "fingerprint," expressed in terms of bioimpedance, a measure of how the body's tissues resist electric current.
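A rough sketch suggests how such a bioimpedance fingerprint might be enrolled and matched. The probe frequencies, the use of impedance magnitude as the feature, and the distance threshold are all invented for illustration; the group's actual recognition method may differ.

```python
import numpy as np

# Assumed probe frequencies; the real device's choices are not given here.
FREQUENCIES_HZ = [1e3, 5e3, 10e3, 50e3, 100e3]

def enroll(measurements: np.ndarray) -> np.ndarray:
    """Average several |Z| readings into a wearer profile.
    measurements: shape (n_samples, n_frequencies)."""
    return measurements.mean(axis=0)

def matches(profile: np.ndarray, sample: np.ndarray,
            threshold: float = 0.05) -> bool:
    # Relative Euclidean distance; the 5% threshold is arbitrary.
    dist = np.linalg.norm(sample - profile) / np.linalg.norm(profile)
    return dist < threshold
```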
"We imagine a device [the bracelet] that can be worn on the wrist and unobtrusively recognize its wearer," writes Cornelius and his co-authors. "Without any other action on the part of the user, the [mHealth] devices discover each other's presence [and] recognize that they are on the same body."
The devices in this network learn, from the unique electrical signature measured by the wrist device, whose body they are on. The network's own configuration could then be used as a basis for encryption, establishing reliable and secure external communications.
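The key-establishment details are not spelled out here, but a minimal sketch can illustrate the general idea of turning a same-body agreement into a fresh session key. It assumes a pre-shared pairing secret (established at setup, not the bioimpedance signal itself) and uses a textbook HKDF-style derivation; none of this is the group's published protocol.

```python
import hmac, hashlib, os

def derive_session_key(pairing_secret: bytes, context: bytes,
                       salt: bytes) -> bytes:
    # Extract: concentrate the secret's entropy with a salted HMAC.
    prk = hmac.new(salt, pairing_secret, hashlib.sha256).digest()
    # Expand: bind the key to this particular link ("context").
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()

salt = os.urandom(16)  # fresh per session, sent alongside messages
key = derive_session_key(b"secret-from-pairing",   # placeholder secret
                         b"bracelet->amulet", salt)
```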
"The Amulet and the bracelet are complementary ideas, still under development," says Kotz. "We're still building an Amulet prototype and we have more to do to validate bioimpedance as a biometric method. We think these concepts, once fully proven, offer great potential for mHealth."
mHealth for Stress Management
In the May 2011 issue of Pervasive Computing Magazine, Dartmouth computer scientist Andrew Campbell and his colleagues introduced "BeWell," touted on Google Play as "the next generation in mobile health apps." The authors describe it as an automated well-being app for Android smartphones that tracks activities such as sleep, exercise, and social interaction and provides direct feedback to the user.
"By providing a more complete picture of health, BeWell has the potential to empower individuals to improve their overall well-being," says Campbell. The app's main job is keeping people informed about their health status in real time and giving them a chance to do something about it.
Now Campbell and colleague Tanzeem Choudhury at Cornell University are about to add another dimension to BeWell. As reported in the August 18, 2012, issue of New Scientist, "StressSense" is scheduled for release in September 2012. A plug-in for the BeWell app, it will continuously monitor, assess, and report how much psychological stress the user is experiencing.
"We know that stress exists in our lives but we don't know necessarily what causes it or when it occurs," Campbell says. "If you can expose it to people somehow, then they can start to deal with it."
Like the other BeWell dimensions, StressSense displays its readings on the phone's background. Its assessment of stress is based on voice analysis: not conversational content, but voice pitch, patterns, and speaking rate. "Using sophisticated signal processing, you can dig down into speech and find some features or patterns that correlate with stress," Campbell says.
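A toy version of that kind of signal processing might look like the following, which estimates pitch by autocorrelation and approximates speaking activity from the fraction of high-energy frames. The frame sizes, thresholds, and features are invented; StressSense's real feature set is surely richer.

```python
import numpy as np

def frame(signal, rate, ms=25, hop_ms=10):
    """Slice audio into overlapping analysis frames."""
    n, hop = int(rate * ms / 1000), int(rate * hop_ms / 1000)
    return np.array([signal[i:i + n]
                     for i in range(0, len(signal) - n, hop)])

def pitch_hz(fr, rate, fmin=75, fmax=400):
    """Crude autocorrelation pitch estimate for one voiced frame."""
    ac = np.correlate(fr, fr, mode="full")[len(fr) - 1:]
    lo, hi = int(rate / fmax), int(rate / fmin)
    lag = lo + np.argmax(ac[lo:hi])
    return rate / lag

def features(signal, rate):
    frames = frame(signal, rate)
    energies = (frames ** 2).mean(axis=1)
    # Treat high-energy frames as voiced speech (arbitrary cutoff).
    voiced = frames[energies > 0.1 * energies.max()]
    pitches = [pitch_hz(f, rate) for f in voiced]
    return {"mean_pitch": float(np.mean(pitches)),
            "pitch_var": float(np.var(pitches)),
            "voiced_fraction": len(voiced) / len(frames)}
```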
According to its developers, StressSense is the first smartphone app that can detect stress patterns directly from speech across different acoustic environments.
With students from Dartmouth and École Polytechnique Fédérale de Lausanne in Switzerland as study subjects, data were collected in stressful exercises (simulated job interviews) and non-stressful situations (reading aloud). The same participants repeated these exercises both indoors and outdoors to provide data on different acoustic environments. The results of these studies informed the construction of the computational model at the heart of StressSense.
Campbell regards the approach as innovative: it is the first time a speech-based computational model has been used as an indicator of stress, with the resulting assessment displayed on a phone. He notes that the app must adapt itself to each individual because reactions to stress, like speech itself, vary widely from person to person. "You have to sort of train the app," he says, "adapting to both acoustic environments and the individual."
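That training step might resemble the following sketch, in which a generic stressed-versus-calm classifier is nudged with a few labeled samples from a new wearer. The features, labels, and model choice are placeholders, not StressSense internals.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
# Stand-in voice features (e.g., pitch statistics); entirely synthetic.
generic_X = rng.normal(size=(200, 3))
generic_y = (generic_X[:, 0] > 0).astype(int)  # 1 = "stressed"

model = SGDClassifier()
model.fit(generic_X, generic_y)        # population-level model

# A handful of calibration samples from this particular user:
user_X = rng.normal(loc=0.3, size=(10, 3))
user_y = (user_X[:, 0] > 0.3).astype(int)
model.partial_fit(user_X, user_y)      # adapt toward the wearer
```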