Articles

A review on the recognition of mid-air gestures

  • YU Hanchao,
  • YANG Xiaodong,
  • ZHANG Yingwei,
  • ZHONG Xi,
  • CHEN Yiqiang
  • 1. Institute of Computing Technology, Chinese Academy of Sciences; Beijing Key Laboratory of Mobile Computing and Pervasive Device, Beijing 100190, China
    2. University of Chinese Academy of Sciences, Beijing 100049, China

Received date: 2017-06-02

Revised date: 2017-08-01

Online published: 2017-08-26

Abstract

The rapid development of ubiquitous computing and wearable devices poses a new challenge for natural hand gesture recognition: freeing users from the constraints of the environment and of dedicated devices, and helping them interact with their surroundings in a natural and effective way. Mid-air gesture recognition is one effective approach to this challenge. This paper first defines the mid-air gesture, then analyzes and summarizes existing recognition methods based on computer vision, ultrasonic signals, and electromagnetic waves. Finally, it discusses applications of mid-air gesture recognition, open problems, and directions for future development.
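As a toy illustration (not taken from the paper) of the Doppler principle that underlies the ultrasonic sensing methods the review surveys: a hand moving toward an ultrasonic emitter/receiver pair compresses the reflected wave and raises its frequency, while a retreating hand lowers it. The function names, threshold, and 40 kHz carrier below are illustrative assumptions, not values from the reviewed systems.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def doppler_shift(f_emit_hz, hand_speed_mps):
    """Frequency shift of an ultrasonic reflection off a hand moving at
    hand_speed_mps (positive = toward the sensor). Uses the two-way
    Doppler approximation Δf ≈ 2·v·f/c, valid when v << c."""
    return 2.0 * hand_speed_mps * f_emit_hz / SPEED_OF_SOUND

def classify_motion(delta_f_hz, threshold_hz=1.0):
    """Map a measured frequency shift to a coarse gesture direction."""
    if delta_f_hz > threshold_hz:
        return "approach"
    if delta_f_hz < -threshold_hz:
        return "retreat"
    return "still"

# A hand moving toward the sensor at 0.5 m/s with a 40 kHz carrier
# produces a shift of roughly 117 Hz:
shift = doppler_shift(40_000.0, 0.5)
print(classify_motion(shift))  # prints "approach"
```

Real systems such as those cited in the review extract such shifts from the spectrum of the received signal rather than from a known hand speed; this sketch only shows why the shift carries directional information.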

Cite this article

YU Hanchao, YANG Xiaodong, ZHANG Yingwei, ZHONG Xi, CHEN Yiqiang. A review on the recognition of mid-air gestures[J]. Science & Technology Review, 2017, 35(16): 64-73. DOI: 10.3981/j.issn.1000-7857.2017.16.009
