At the Cognitive Systems Lab we use Inertial Measurement Units (IMUs) as well as Electromyography (EMG) to recognize gestures for innovative user interfaces. IMUs measure movement by combining accelerometers and gyroscopes, while EMG measures the electrical potential on the skin surface during muscular activity. Both technologies are of interest for mobile interfaces because they are fully wearable.
Public data corpora
Various groups work on gesture recognition with IMUs and EMG. Comparing their results is often difficult due to the multitude of different tasks and gesture sets. In addition, session- and person-independent recognition performance is often not evaluated, although both are of special importance for practical, usable interfaces.
We publish the data corpus we collected during our experiments on gesture recognition using IMUs and EMG. It contains both IMU and EMG recordings from 5 subjects in 5 sessions each.
If you want to use this data, please cite:
Recognizing Hand and Finger Gestures with IMU based Motion and EMG based Muscle Activity Sensing, In International Conference on Bio-inspired Systems and Signal Processing (BIOSIGNALS 2015), 2015.
The complete data can be found here.
This is the csl-hdemg dataset containing high-density EMG recordings of finger motions. The dataset is described together with a baseline recognition system in the paper:
Advancing Muscle-Computer Interfaces with High-Density Electromyography, In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, ACM, 2015.
Please cite this paper if you publish any work based on this dataset. The data is contained in a zip file over 2 GB in size. The archive includes a readme.txt that describes how the data is structured and how to access it from Matlab or Python.
You can download the dataset here.
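Once the recordings are loaded (the archive's readme.txt describes the actual file layout), a common first step for EMG-based gesture recognition is extracting windowed features such as the root mean square (RMS) of each channel. The following is a minimal sketch of that step; the window and step sizes are illustrative assumptions, and synthetic noise stands in for a real EMG channel:

```python
import numpy as np

def window_rms(signal, window_size, step):
    """RMS over sliding windows -- a standard EMG amplitude feature."""
    n_windows = 1 + (len(signal) - window_size) // step
    return np.array([
        np.sqrt(np.mean(signal[i * step : i * step + window_size] ** 2))
        for i in range(n_windows)
    ])

# Synthetic stand-in for one EMG channel; replace with data loaded as
# described in the dataset's readme.txt.
rng = np.random.default_rng(0)
emg = rng.standard_normal(2048)

# 256-sample windows with 50% overlap (illustrative choice).
features = window_rms(emg, window_size=256, step=128)
print(features.shape)
```

Such per-window features (RMS, zero crossings, etc.) are what a gesture classifier is typically trained on, rather than the raw samples.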