Mobile Real-time Detection of Full-body Gestures

January 02, 2019

In recent years, the widespread availability of mobile sensor devices has spurred the development of many mobile applications based on human activity recognition, primarily in the form of fitness applications. These utilize sensors integrated into modern smartphones, as well as additional sensor devices such as smartwatches or fitness trackers, to detect a user's current activity (e.g. running). Individual executions of full-body gestures are typically only tracked as repetitions within a continuous activity (e.g. steps) or for very specific types of gestures (e.g. dumbbell exercises). To utilize human activity recognition in more diverse applications, such as controlling a mobile game, a system is required that can detect a wide variety of full-body gestures in real time.

The goal of this thesis is to assess existing methods of time series segmentation and classification for their applicability to real-time mobile sensor data, and to develop a system that can reliably detect individual executions of a wide variety of full-body gestures. The developed system should be implemented to run on mobile (Android) devices and evaluated for its real-world performance.
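To make the segmentation step concrete, the sketch below illustrates one common baseline: sliding a fixed-size window over an acceleration-magnitude stream and flagging high-variance windows as candidate gesture segments, which would then be passed to a trained classifier. This is not the method prescribed by the thesis; the window size, stride, and energy threshold are illustrative assumptions, and the variance check stands in for a real model.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: sliding-window segmentation of a 1-D acceleration
// magnitude stream. A simple variance (energy) threshold stands in for a
// trained classifier; all constants are illustrative assumptions.
public class SlidingWindowSegmenter {
    static final int WINDOW = 8;                 // samples per window (assumption)
    static final int STRIDE = 4;                 // 50% overlap (assumption)
    static final double ENERGY_THRESHOLD = 1.5;  // variance cutoff (assumption)

    // Returns start indices of windows whose variance exceeds the threshold,
    // i.e. candidate gesture segments for a downstream classifier.
    static List<Integer> activeWindows(double[] signal) {
        List<Integer> hits = new ArrayList<>();
        for (int start = 0; start + WINDOW <= signal.length; start += STRIDE) {
            double mean = 0;
            for (int i = start; i < start + WINDOW; i++) mean += signal[i];
            mean /= WINDOW;
            double var = 0;
            for (int i = start; i < start + WINDOW; i++) {
                double d = signal[i] - mean;
                var += d * d;
            }
            var /= WINDOW;
            if (var > ENERGY_THRESHOLD) hits.add(start);
        }
        return hits;
    }

    public static void main(String[] args) {
        // 16 quiet samples followed by 8 high-motion samples.
        double[] signal = new double[24];
        for (int i = 16; i < 24; i++) signal[i] = (i % 2 == 0) ? 3.0 : -3.0;
        System.out.println(activeWindows(signal)); // windows overlapping the motion
    }
}
```

On a real device the stream would come from the Android `SensorManager` accelerometer callbacks rather than an array, and the per-window features would typically be richer than variance alone.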

Minimum requirements are a basic understanding of classifiers and machine learning in general, as well as familiarity with an Android-capable programming language (Java / Kotlin).

Keywords: activity recognition, human activity recognition, time series, classification, segmentation, mobile, android

Research Area(s): Multimedia Technologies & Serious Games

Tutor: Müller
