In this repo, I document the development process and present M.A.S.K. (Machine-Learning Assisted Skeleton Kinect Tracking), an approach for threshold-based pose recognition in TouchDesigner using a Kinect V2, MediaPipe, and their connection via data synchronization and a Kalman filter. The system uses machine learning to improve Kinect skeleton tracking and aims at precise pose recognition. The workflow integrates the TouchDesigner Kinect node with a MediaPipe pipeline for extracting and processing skeleton values. These values are visualized for troubleshooting and used to set thresholds on relationships between skeleton nodes; the skeleton points are then queried in parallel for pattern matching. This allows dance patterns to be recognized and real-time visualizations to be triggered via a Boolean node. The system aims to show improvements in accuracy and responsiveness and serves as a tool for applications in interactive media and performance art. It was created as the technical component of a student project under the direction of the Filmakademie Ludwigsburg.
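To give a rough idea of the smoothing and threshold steps described above, here is a minimal Python sketch. It is not taken from the .toe: the class `KalmanFilter1D`, the rule `hands_above_head`, the joint names, and the sample values are all placeholders; in the actual project the raw values come from the synchronized Kinect CHOP and MediaPipe pipeline inside TouchDesigner.

```python
# Minimal sketch: one 1D Kalman filter per joint coordinate plus a simple
# threshold rule on the smoothed values. All names and numbers are
# placeholders, not values from the project.

class KalmanFilter1D:
    """Constant-value Kalman filter for a single joint coordinate."""

    def __init__(self, process_var=1e-3, measurement_var=1e-2):
        self.q = process_var      # process noise variance
        self.r = measurement_var  # measurement noise variance
        self.x = 0.0              # state estimate
        self.p = 1.0              # estimate variance

    def update(self, z):
        # Predict: the state is assumed constant, uncertainty grows.
        self.p += self.q
        # Correct: blend the prediction with the new measurement.
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x


def hands_above_head(joints, margin=0.05):
    """Threshold rule: both hands higher (larger y) than the head."""
    return (joints["hand_l_y"] > joints["head_y"] + margin and
            joints["hand_r_y"] > joints["head_y"] + margin)


if __name__ == "__main__":
    filters = {name: KalmanFilter1D() for name in ("head_y", "hand_l_y", "hand_r_y")}

    # Fake noisy frames standing in for synchronized Kinect/MediaPipe samples.
    frames = [
        {"head_y": 1.60, "hand_l_y": 1.10, "hand_r_y": 1.15},
        {"head_y": 1.61, "hand_l_y": 1.70, "hand_r_y": 1.72},
        {"head_y": 1.59, "hand_l_y": 1.75, "hand_r_y": 1.78},
    ]

    for frame in frames:
        smoothed = {name: filters[name].update(value) for name, value in frame.items()}
        print(smoothed, "-> pose matched:", hands_above_head(smoothed))
```

In the real setup, the result of such a rule would feed the Boolean node that triggers the visualization.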
Please be aware that the documentation found inside the LaTeX folder is written in German, as this project is an academic endeavour at a German university. For questions in English, please contact me via the email address that can be found in this repo's parent.
The .toe file is built only for TouchDesigner version 2023.11220, since this was a requirement of the stakeholders. Compatibility with other versions is not assured, but the overall project structure and the Python scripts should remain the same. Please check that the op() calls are correct and compatible with your version, since the call style used here is outdated in newer versions.
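As an illustration of what to check, a typical access pattern in a TouchDesigner Python script looks like the sketch below. The operator name 'kinect1' and the channel name are placeholders and may not match the .toe; verify that op() and the Kinect CHOP channel naming behave the same way in your build.

```python
# Sketch of the kind of op() access to verify; this only runs inside
# TouchDesigner. Operator and channel names here are placeholders.
kinect = op('kinect1')                 # reference the Kinect CHOP by name
if kinect is not None:
    hand_l_y = kinect['p1/hand_l:ty']  # Channel object for the left-hand height
    if hand_l_y is not None:
        print(hand_l_y.eval())         # current sample value
```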