New Center Seeks to Bring Context to Sensor Data

Santosh Kumar, an associate professor in the computer science department at the University of Memphis, knows the challenges of attaching a sensor to a human. If it’s obtrusive or uncomfortable, a person might just take it off or move it to another part of the body where the sensor’s capabilities are limited.

Depending on the situation, the implications could be huge. Take, for example, a sensor that monitors the fluid level in a person’s lungs. When the subject stands, gravity causes the fluid to spread out to the limbs, leading the sensor to produce a lower reading. But when that person lies down, fluid moves back into the lungs, generating a higher reading. A computer collecting the raw data may see a spike and raise a red flag, not knowing that the person did little more than change his or her body position.

“The data is not collected from a hospital, and no one is making sure sensors are working properly…and yet we’re supposed to make health decisions,” said Mr. Kumar, explaining the potential limitations of drawing conclusions from remote readings. “Unless you combine and contextualize, it’s hard to make decisions out of that.” To tackle the posture issue, researchers have designed a sensor that helps them estimate body orientation. …
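The kind of contextualization Mr. Kumar describes can be sketched in a few lines of code: pair the raw fluid reading with the person’s posture before deciding whether to raise a flag. The function, thresholds, and posture labels below are hypothetical and purely illustrative of the general idea, not the researchers’ actual system.

```python
# Illustrative sketch only: the function name, thresholds, and posture labels
# are hypothetical, not taken from the article or the researchers' system.

def contextualized_alert(fluid_reading, posture,
                         supine_threshold=0.8, upright_threshold=0.6):
    """Flag a lung-fluid reading only after accounting for body orientation.

    A raw spike that coincides with lying down may just reflect normal fluid
    redistribution, so the alert threshold depends on the reported posture.
    """
    threshold = supine_threshold if posture == "lying_down" else upright_threshold
    return fluid_reading > threshold

# Example: the same raw value trips an alert when upright but not when supine.
print(contextualized_alert(0.7, "standing"))    # True  -> worth a closer look
print(contextualized_alert(0.7, "lying_down"))  # False -> likely just posture
```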

Source: WSJ.com