This work outlines a recording setup for capturing the hand and finger movements of musicians. The focus is on a series of baseline experiments on the detectability of coloured markers under different lighting conditions. Starting from this goal, requirements for such a system are derived, and existing approaches are analysed and compared. The results of the experiments and the analysis of related work show that the envisioned setup is suited for the intended scenario.
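Since the abstract only names the technique, the following is a minimal sketch of what such a marker-detectability test could look like, assuming HSV colour thresholding with OpenCV. The function name `detect_marker`, the HSV bounds, and the `min_area` filter are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

def detect_marker(image_bgr, hsv_lower, hsv_upper, min_area=50):
    """Return centroids of blobs whose colour falls inside the HSV range."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, hsv_lower, hsv_upper)
    # Morphological opening suppresses isolated noise pixels.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:
            m = cv2.moments(c)
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

# Synthetic test frame: a green marker on a grey background.
frame = np.full((120, 160, 3), 128, np.uint8)
cv2.circle(frame, (80, 60), 10, (0, 255, 0), -1)  # BGR green
print(detect_marker(frame,
                    np.array([40, 80, 80]), np.array([80, 255, 255])))
```

Varying the lighting in such a test changes the value and saturation channels of the marker pixels, so the detectability question reduces to whether a fixed HSV range still captures the marker blob under each condition.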
This paper introduces a novel and efficient segmentation method designed for articulated hand motion. The method is based on a graph representation of temporal structures in human hand-object interaction. Along with the temporal segmentation method, we provide an extensive new database of hand motions. The experiments performed on this dataset show that our method is capable of fully automatic hand motion segmentation that largely coincides with human user annotations.
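The abstract does not specify the graph representation, so the sketch below shows one plausible reading under stated assumptions: frames are graph nodes, temporally adjacent frames with similar pose descriptors are connected, and connected components become motion segments. The function `segment_motion`, the descriptor format, and the threshold are hypothetical, not the authors' method.

```python
import numpy as np
import networkx as nx

def segment_motion(features, sim_threshold=0.5):
    """Split a motion sequence into segments: link consecutive frames
    whose descriptors are similar; connected components = segments."""
    g = nx.Graph()
    g.add_nodes_from(range(len(features)))
    for t in range(len(features) - 1):
        # Euclidean distance between consecutive frame descriptors.
        if np.linalg.norm(features[t + 1] - features[t]) < sim_threshold:
            g.add_edge(t, t + 1)
    # Sort components by their first frame to recover temporal order.
    return sorted((sorted(c) for c in nx.connected_components(g)),
                  key=lambda s: s[0])

# Toy sequence: two stationary poses with an abrupt change at frame 5.
feats = np.vstack([np.zeros((5, 3)), np.ones((5, 3))])
print(segment_motion(feats))  # -> [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]]
```

Segment boundaries then fall exactly where the descriptor changes abruptly between neighbouring frames, which is the behaviour an automatic segmentation would need to reproduce human annotations.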