This page includes two datasets of gesture input collected with smartwatches, glasses, and rings. The first contains 7,290 stroke-gesture samples from 28 participants, 14 of whom have upper-body motor impairments. The second contains 3,809 motion-gesture samples from the same participants.
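The exact file format is described in the documentation accompanying the datasets. As a minimal sketch only, assuming a hypothetical CSV layout with one touch point per line (x, y, timestamp in milliseconds), a stroke-gesture sample could be loaded in C# (the language of our released code) as follows:

using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;

// One touch point of a stroke-gesture sample.
// NOTE: the (x, y, t) CSV layout assumed here is a hypothetical
// illustration, not necessarily the format of the released files.
public record TouchPoint(double X, double Y, long TimestampMs);

public static class GestureLoader
{
    // Reads one stroke-gesture sample from a CSV file with lines "x,y,timestampMs".
    public static List<TouchPoint> LoadStrokeGesture(string path) =>
        File.ReadLines(path)
            .Where(line => !string.IsNullOrWhiteSpace(line))
            .Select(line => line.Split(','))
            .Select(p => new TouchPoint(
                double.Parse(p[0], CultureInfo.InvariantCulture),
                double.Parse(p[1], CultureInfo.InvariantCulture),
                long.Parse(p[2], CultureInfo.InvariantCulture)))
            .ToList();
}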
These datasets are companions to our CHI 2022 paper (Vatavu and Ungurean, 2022), in which we report experimental results on stroke-gestures and motion-gestures performed by people with upper-body motor impairments using smartwatches, glasses, and rings.
For example, we found that people with upper-body motor impairments take twice as long as users without impairments to produce stroke-gestures on wearable touchscreens, but articulate motion-gestures equally fast and with similar kinematic characteristics of acceleration and jerk.
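Here, jerk denotes the rate of change of acceleration over time (the third time derivative of position). On sampled sensor data, it can be estimated with first-order finite differences; the C# sketch below illustrates this estimation on a series of acceleration magnitudes with timestamps (an illustrative computation, not the analysis code used in the paper):

using System;

public static class Kinematics
{
    // Estimates mean absolute jerk (rate of change of acceleration) from
    // sampled acceleration magnitudes using first-order finite differences.
    // accel[i] is the acceleration magnitude (m/s^2) at time t[i] (seconds).
    public static double MeanAbsoluteJerk(double[] accel, double[] t)
    {
        if (accel.Length < 2 || accel.Length != t.Length)
            throw new ArgumentException("Need at least two matching samples.");

        double sum = 0.0;
        for (int i = 1; i < accel.Length; i++)
        {
            double dt = t[i] - t[i - 1];                     // sampling interval (s)
            sum += Math.Abs((accel[i] - accel[i - 1]) / dt); // |delta a / delta t|
        }
        return sum / (accel.Length - 1);                     // average over intervals
    }
}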
Resources
We release these two gesture datasets to encourage further studies and the development of wearable devices that are more accessible to users with motor impairments.
Our datasets and C# code are freely available to download and use for research purposes under this license.
If you find our datasets and/or code useful in your work, please let us know.
If you use our datasets and/or code in scientific publications, please reference the paper (Vatavu and Ungurean, 2022) that introduced these resources.
Publication
Contact
For questions or suggestions regarding this web page, the datasets, or the source code, please contact Prof. Radu-Daniel Vatavu.