Midas Touch: Finger-Worn Interaction Device for Connecting the Connected Things

PI: Liwei Chan (National Chiao Tung University); Co-PIs: Rong-Hao Liang (Eindhoven University of Technology), Robin Bing-Yu Chen (National Taiwan University)

Champion: Giuseppe Raffa (Intel)

Project Objective:

In the MidasTouch project, we proposed three wearable systems for recognizing gestures on everyday objects and providing immediate semantic haptic feedback: a fingerstall-like device that uses an RFID reader to reliably recognize gestures performed on different tagged everyday objects, a nail-mounted tactor array that displays spatial cues and character information to users in an eyes-free manner, and a set of spatiotemporal vibration patterns and design guidelines for delivering alphanumeric characters on wrist-worn vibrotactile displays. These results were published at the international conferences MobileHCI 2016 and UIST 2016.

To explore more expressive tactile feedback, we started the FingerTactor project, which uses two vibrotactile rings worn on the user's finger to create animated vibrotactile motion that helps users make sense of, and gain better awareness of, invisible information flows. Furthermore, our previous work focused mainly on tactile haptic feedback. To enable haptic output with motion, we plan to extend the scope toward haptic force feedback. Current progress includes work on simulating remote physical operations by jointly stimulating the proprioceptive senses of the arm and the tactile senses of the fingers and arms.
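
As an illustration only, one common way to produce such apparent motion between two actuators is to cross-fade their drive amplitudes over time. The sketch below assumes a generic two-channel vibration driver; the set_amplitude() interface is a hypothetical placeholder rather than the FingerTactor API.

    # Illustrative sketch only: apparent tactile motion between two vibrotactile
    # rings by cross-fading their drive amplitudes. set_amplitude() is a
    # hypothetical placeholder driver call, not the FingerTactor API.
    import time

    def set_amplitude(ring_id, amplitude):
        """Hypothetical driver call: set a ring's vibration amplitude in [0, 1]."""
        print("ring %d: amplitude %.2f" % (ring_id, amplitude))

    def animate_motion(duration_s=1.0, steps=20):
        """Sweep a vibration sensation from ring 0 toward ring 1."""
        for i in range(steps + 1):
            t = i / steps              # normalized position of the "moving" sensation
            set_amplitude(0, 1.0 - t)  # fade out the ring the sensation leaves
            set_amplitude(1, t)        # fade in the ring the sensation approaches
            time.sleep(duration_s / steps)

    animate_motion()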

Furthermore, we foresee that haptic-integrated applications will keep increasing. The main difference between haptic feedback and other feedback types (e.g., visual and audio) is that haptic feedback is usually non-observable and non-shareable (i.e., only the person experiencing it is aware of it). These distinct features make haptic designs exceptionally difficult to debug and evaluate, particularly in an IoT environment where multiple haptic outputs may appear everywhere. We plan to explore this problem by visualizing various kinds of haptic feedback using, for example, colors, turning haptic output into visual cues that are visible and shareable between designers and users. The results will lay a good foundation for effective development tools for haptic design and evaluation.
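
As a minimal sketch of what such a visualization could look like, the code below maps one haptic event's frequency to hue and its amplitude to brightness; this particular mapping is an assumption made for illustration, not a design decided by the project.

    # Illustrative sketch only: map one haptic event's parameters to a color so
    # it can be rendered as a visual cue shared by designers and users. The
    # mapping (frequency -> hue, amplitude -> brightness) is an assumption.
    import colorsys

    def haptic_to_rgb(frequency_hz, amplitude, f_min=50.0, f_max=300.0):
        """Return an (R, G, B) tuple in 0-255 for one haptic event."""
        # Normalize frequency into [0, 1] and use it as hue (low = red, high = blue).
        hue = min(max((frequency_hz - f_min) / (f_max - f_min), 0.0), 1.0) * 0.66
        # Use amplitude as brightness so stronger vibrations appear brighter.
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, min(max(amplitude, 0.0), 1.0))
        return int(r * 255), int(g * 255), int(b * 255)

    # Example: a strong 200 Hz vibration renders as a bright blue-ish color.
    print(haptic_to_rgb(200.0, 0.9))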


In addition to haptic interaction, we plan to explore using personal multi-view video to capture and recognize users' activities in an eye-hand coordinated manner, using multiple wearable cameras worn by the user, e.g., on glasses and on finger rings. These multi-perspective views contain visual information with different focuses and levels of detail from the wearer's perspective, and they offer good potential for delivering real-time guidance, which can later be enhanced with the expressive haptic feedback produced by our haptic project and is useful for education and learning. We will also conduct a user study to identify potential issues beyond haptic signal delivery, such as the limitations of human haptic sensation. (Updated in February 2017)
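
As an illustration of the kind of multi-view fusion we have in mind, the sketch below takes one synchronized frame from each camera and concatenates simple per-view features into a single descriptor; the feature extractor and the downstream classifier are placeholders, not the project's actual recognition pipeline.

    # Illustrative sketch only: fuse synchronized frames from two wearable
    # cameras (eye-level and finger-worn) into one descriptor for activity
    # recognition. The feature extractor and classifier are placeholders.
    import numpy as np

    def extract_features(frame):
        """Placeholder feature extractor: a coarse grayscale intensity histogram."""
        gray = frame.mean(axis=2)
        hist, _ = np.histogram(gray, bins=32, range=(0, 255), density=True)
        return hist

    def fuse_views(eye_frame, hand_frame):
        """Concatenate per-view features into one descriptor for a classifier."""
        return np.concatenate([extract_features(eye_frame), extract_features(hand_frame)])

    # Example with dummy frames standing in for synchronized camera captures.
    eye_view = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    hand_view = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
    descriptor = fuse_views(eye_view, hand_view)   # feed this to any classifier
    print(descriptor.shape)                        # (64,)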