Journal Article
Gesture Recognition Method Utilizing Ultrasonic Active Acoustic Sensing

Hiroki Watanabe, Tsutomu Terada, Masahiko Tsukamoto

Vol. 58, No. 4, 2017-04-15
ISSN:1882-7764
NII Bibliographic ID (NCID): AN00116647
Description
We propose a method for gesture recognition that utilizes active acoustic sensing, which transmits acoustic signals to a target and recognizes the target's state by analyzing the response. In this study, the user wore a contact speaker that transmitted ultrasonic sweep signals to the user's body and a contact microphone that detected the ultrasound propagated through the body. The propagation characteristics of the ultrasound changed depending on the user's movements, and we utilized these changes to recognize the user's gestures. One important novel feature of our method is that the user's gestures can be captured not only from physical movement but also from the user's internal state, such as muscle activity, since the ultrasound is transmitted both through the inside of the body and along the body surface. Moreover, because we utilize ultrasound, our method is not adversely affected by audible-range sounds generated by the environment and by body movements. We implemented a device that uses active acoustic sensing to effectively transmit/detect the ultrasound to/from the body and investigated the performance of the proposed method in 21 contexts with 10 subjects. The evaluation results confirmed that the precision and recall were 93.1% and 91.6%, respectively, when we set 10% of the data as training data and used the rest as testing data within the same data set. When we used one data set for training and another data set recorded on the same day for testing, the precision and recall were 51.6% and 51.3%, respectively.

This is a preprint of an article intended for publication in the Journal of Information Processing (JIP). This preprint should not be cited. This article should be cited as: Journal of Information Processing Vol.25 (2017) (online). DOI: http://dx.doi.org/10.2197/ipsjjip.25.331
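To make the pipeline described in the abstract concrete, the following Python sketch generates an ultrasonic sweep, extracts frequency-response features from a recording, and trains a classifier using a 10% training split, as in the abstract's within-data-set evaluation. The sampling rate, band limits, sweep length, SVM classifier, and the synthetic stand-in recordings are illustrative assumptions only and are not the parameters or data used in the paper.

# Minimal sketch of an active-acoustic-sensing gesture pipeline.
# All parameters below are assumptions for illustration, not the paper's values.
import numpy as np
from scipy.signal import chirp
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

FS = 96_000                  # sampling rate (Hz), assumed
SWEEP_SEC = 0.1              # duration of one ultrasonic sweep (s), assumed
F_LO, F_HI = 20_000, 40_000  # ultrasonic band of the sweep (Hz), assumed

def make_sweep():
    """Linear ultrasonic sweep that the contact speaker would emit."""
    t = np.linspace(0, SWEEP_SEC, int(FS * SWEEP_SEC), endpoint=False)
    return chirp(t, f0=F_LO, t1=SWEEP_SEC, f1=F_HI)

def band_features(recording):
    """Log-magnitude spectrum restricted to the sweep band.
    Gestures change how the body transmits the sweep, which shows up as
    changes in this frequency response; the classifier learns those changes."""
    spec = np.abs(np.fft.rfft(recording))
    freqs = np.fft.rfftfreq(recording.size, d=1.0 / FS)
    band = (freqs >= F_LO) & (freqs <= F_HI)
    return np.log1p(spec[band])

# Synthetic stand-in for contact-microphone recordings: each "gesture" is a
# different frequency-dependent attenuation of the sweep plus noise, purely
# so the sketch runs end to end without real sensor data.
rng = np.random.default_rng(0)
sweep = make_sweep()
n_per_class, n_classes = 40, 5
X, y = [], []
for label in range(n_classes):
    atten = 0.5 + 0.5 * np.sin(np.linspace(0, np.pi * (label + 1), sweep.size))
    for _ in range(n_per_class):
        rec = sweep * atten + 0.05 * rng.standard_normal(sweep.size)
        X.append(band_features(rec))
        y.append(label)
X, y = np.array(X), np.array(y)

# 10% training / 90% testing split, mirroring the evaluation in the abstract.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, train_size=0.1, stratify=y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("precision:", precision_score(y_te, pred, average="macro"))
print("recall:   ", recall_score(y_te, pred, average="macro"))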
Full text:

https://ipsj.ixsq.nii.ac.jp/ej/?action=repository_action_common_download&item_id=178667&item_no=1&attribute_id=1&file_no=1

