Researchers testing in-ear computer in Japan

Rumors indicate that Apple is currently developing an iWatch as its first foray into wearable technology, and the release of the Google Glass device has been percolating for a while now, but according to a recent report from Japan Daily Press, there's another new innovation out there that's worth watching for.

At first glance, the new device, a smartphone and earpiece pairing designed by Hiroshima City University engineer Kazuhiro Taniguchi, doesn't look like much to write home about. We've seen mobile phones with Bluetooth earpieces, and while they certainly make talking on the phone in the car much more convenient (and in some states, legal), they hardly break new ground in the crowded smartphone market. Taniguchi's earpiece, however, is a bit different. While the smartphone portion of the device is fairly standard issue, the earpiece supposedly has the capability to navigate and control the entire system, a capability that puts the device thoroughly in the wearable technology category.

So how does it work? Frankly, we're not quite sure. Supposedly, the earpiece will be able to read users' facial expressions, connect those to a control command for the phone, and then elicit some sort of response from the device. In the Japan Daily Press article, the examples given are things like sticking out your tongue, raising an eyebrow, clenching your teeth, or wrinkling your nose. However, we're not quite sure whether users of the device will configure a customized control scheme for the phone based on their unique facial movements, or if the phone will have factory settings that dictate what each facial action triggers.
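Purely for the sake of illustration, here's a rough Python sketch of what such a control layer might look like under either scheme. Every name, gesture, and mapping below is our own guess, not anything confirmed about Taniguchi's design.

```python
# Hypothetical sketch: mapping recognized facial gestures to phone commands.
# The report doesn't say whether mappings are user-configured or fixed at the
# factory, so this shows both possibilities.

# Factory-style preset: each recognized facial action triggers a fixed command.
FACTORY_PRESETS = {
    "tongue_out":    "answer_call",
    "eyebrow_raise": "next_track",
    "teeth_clench":  "pause_music",
    "nose_wrinkle":  "open_voice_search",
}


class GestureController:
    """Maps detected facial gestures to smartphone commands."""

    def __init__(self, mapping=None):
        # Fall back to the factory presets if the user supplies no custom scheme.
        self.mapping = dict(mapping or FACTORY_PRESETS)

    def remap(self, gesture, command):
        """Let the user assign their own command to a gesture."""
        self.mapping[gesture] = command

    def handle(self, gesture):
        """Return the command to send to the phone, or None if unmapped."""
        return self.mapping.get(gesture)


# Usage: a user who prefers an eyebrow raise to answer calls.
controller = GestureController()
controller.remap("eyebrow_raise", "answer_call")
print(controller.handle("eyebrow_raise"))  # -> "answer_call"
```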

Taniguchi, for his part, gave the example of looking up at the sky and wondering which stars or constellations are visible. The engineer says that, based on the angle and direction of a glance, the earpiece would be able to surmise what its user was thinking and would look up information about those stars or constellations on the Internet. By using infrared sensors to track every minute movement of the ear, Taniguchi says the device could catch just about every subtle expression and movement of the face.
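Again purely as a guess at how the sensing side might work, the sketch below matches a set of hypothetical infrared readings against gesture templates. The actual sensor count, signal format, and classification method are all assumptions on our part; the report only says the earpiece tracks ear movement with infrared sensors.

```python
import math

# Hypothetical templates: expected normalized deflection pattern across
# three imagined IR sensors inside the ear canal.
GESTURE_TEMPLATES = {
    "teeth_clench":  (0.9, 0.2, 0.1),
    "eyebrow_raise": (0.1, 0.8, 0.3),
    "nose_wrinkle":  (0.3, 0.3, 0.9),
}


def classify(reading, threshold=0.25):
    """Match a 3-sensor IR reading to the nearest gesture template.

    Returns the gesture name, or None if nothing is close enough
    (e.g. an unintentional facial movement).
    """
    best_gesture, best_dist = None, float("inf")
    for gesture, template in GESTURE_TEMPLATES.items():
        dist = math.dist(reading, template)  # Euclidean distance
        if dist < best_dist:
            best_gesture, best_dist = gesture, dist
    return best_gesture if best_dist <= threshold else None


print(classify((0.85, 0.25, 0.12)))  # close to a template -> "teeth_clench"
print(classify((0.5, 0.5, 0.5)))     # ambiguous reading   -> None
```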

Of course, it seems as if a stray glance or an unintentional facial movement could elicit some pretty random and unwanted responses from the device, but Taniguchi must, at the very least, be credited for trying to do something different. The Japanese engineer's new gadget could see a widespread release in 2015, if all goes well with testing.