ZDNet, August 18, 2008
Note to Ary: Look at the mentioned presentation!
The Intel Developer Forum kicks off Tuesday, and the company has started with a few “day 0” presentations that are worth a look, including one on research into mobile devices that can sense and adapt to you.
Intel released three presentations, and the most interesting one (and potentially the most problematic) comes from Mary Smiley, director of Intel’s emerging platforms labs. The general concept goes like this: mobile devices will learn about you, know what situation you’re in, gauge the environment and infer what you need. Smiley’s primary example looked at how these smart devices would apply to the health care industry.
Here’s the prototype use case:
All of those sensors may have a big health care benefit, but you could argue that they also have “nag” written all over them. Devices will tell you you’re too fat, shouldn’t eat that steak and need to exercise more. I have a lot of rugby and offensive line pals who may just throw this “proactive wellness” prototype out the window (or worse). Luckily, my girthy pals don’t have to worry about this scenario anytime soon: there’s a lot of architecture work ahead.
Here’s a look at the moving parts for these pervasive sensor-driven devices of the future. This architecture could have ramifications for everything from social networking to wellness to mobile augmented reality.