Date Awarded

2020

Document Type

Dissertation

Degree Name

Doctor of Philosophy (Ph.D.)

Department

Computer Science

Advisor

Gang Zhou

Committee Member

Qun Li

Committee Member

Xu Liu

Committee Member

Weizhen Mao

Abstract

Human behavior recognition and analysis have been considered a core technology that can facilitate a variety of applications. However, accurate detection and recognition of human behavior remains a major challenge that attracts substantial research effort. Among these efforts, motion-sensor-based human behavior recognition is promising because motion sensors are low-cost, low-power, and easy to carry. In this dissertation, we use motion sensors to study human behaviors. First, we present the Ultigesture (UG) wristband, a hardware platform for detecting and analyzing human behavior. The hardware platform integrates an accelerometer, gyroscope, and compass sensor, providing a combination of (1) a fully open Application Programming Interface (API) for various application development, (2) an appropriate form factor for comfortable daily wear, and (3) affordable cost for large-scale adoption. Second, we study the hand gesture recognition problem when a user performs gestures continuously. We propose a novel continuous gesture recognition algorithm. It accurately and automatically separates hand movements into segments, and merges adjacent segments when needed, so that each gesture falls in exactly one segment. Then, we apply a Hidden Markov Model to classify each segment as one of the predefined hand gestures. Experiments with human subjects show that the recognition accuracy is 99.4% when users perform gestures discretely, and 94.6% when users perform gestures continuously. Third, we study the hand gesture recognition problem when a user is moving. We propose a novel mobility-aware hand gesture segmentation algorithm to detect and segment hand gestures. We also propose a Convolutional Neural Network to classify hand gestures under mobility noise. Under the leave-one-subject-out cross-validation test, experiments with human subjects show that the proposed segmentation algorithm achieves 94.0% precision and 91.2% recall when the user is moving.
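The segment-then-merge idea behind the continuous recognition pipeline can be illustrated with a minimal sketch. The threshold values, the function name, and the use of a 1-D acceleration-magnitude stream are all assumptions for illustration, not the dissertation's actual algorithm:

```python
def segment_motion(samples, threshold=1.5, min_gap=5):
    """Split a stream of acceleration magnitudes into motion segments.

    Hypothetical illustration of threshold-based segmentation with
    merging: samples above `threshold` count as motion, and a segment
    is closed only after `min_gap` consecutive idle samples, so brief
    pauses inside one gesture do not split it into two segments.
    Returns a list of (start_index, end_index) tuples.
    """
    segments = []
    start = end = None
    for i, value in enumerate(samples):
        if value > threshold:
            if start is None:
                start = i          # a new motion segment begins
            end = i                # extend the current segment
        elif start is not None and i - end >= min_gap:
            segments.append((start, end))   # enough idle samples: close it
            start = None
    if start is not None:
        segments.append((start, end))       # stream ended mid-segment
    return segments
```

In the dissertation's pipeline, each resulting segment would then be passed to a Hidden Markov Model classifier to be labeled as one of the predefined gestures.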
The proposed hand gesture classification algorithm is 16.1%, 15.3%, and 14.4% more accurate than state-of-the-art work when the user is standing, walking, and jogging, respectively. Finally, we present a tennis ball speed estimation system, TennisEye, which uses a racket-mounted motion sensor to estimate ball speed. We divide tennis shots into three categories: serve, groundstroke, and volley. For a serve, we propose a regression model to estimate the ball speed. In addition, we propose a physical model and a regression model for both groundstroke and volley shots. Under the leave-one-subject-out cross-validation test, evaluation results show that TennisEye is 10.8% more accurate than the state-of-the-art work.
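The serve-speed regression model is described only at a high level here. The following least-squares sketch is a hypothetical stand-in: the feature choices (e.g. peak gyroscope magnitude and peak acceleration at impact), the function names, and the data are invented for illustration and are not TennisEye's actual model:

```python
import numpy as np

def fit_speed_model(features, speeds):
    """Fit a linear model speed ~ w . features + b by least squares.

    `features` is an (n_shots, n_features) array of per-shot sensor
    features; `speeds` is the vector of ground-truth ball speeds.
    Returns the stacked coefficient vector [w..., b].
    """
    X = np.hstack([features, np.ones((len(features), 1))])  # append bias column
    coef, *_ = np.linalg.lstsq(X, speeds, rcond=None)
    return coef

def predict_speed(coef, feature_row):
    """Predict ball speed for one shot's feature vector."""
    return float(np.dot(np.append(feature_row, 1.0), coef))
```

A per-shot-type model of roughly this shape, trained on labeled shots, would mirror the abstract's description of separate regression models for serves, groundstrokes, and volleys.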

DOI

http://dx.doi.org/10.21220/s2-5xdw-zx81

Rights

© The Author
