Looking for a friendly UI to interpret raw data from a number of webcams: the PlayStation Eye camera, the Xbox Kinect camera, and the built-in MacBook Pro webcam. I would like the software to recognize my arms independently (if possible) and assign XYZ coordinates to their movements. I would then like to map this movement along each axis and interpret it as a custom MIDI signal that I get to pick. I would also like this data to be assigned either to a virtual device that any DAW can pick up, or routed to an internal MIDI bus. If tracking two arms is less attainable, then one arm/hand on three axes is enough. I would also like the PlayStation Eye and Kinect to work on both Mac and PC.
There is already something similar out there, but it has become abandonware and is no longer supported on newer operating systems. I would like to replicate that project, or build a more advanced version with more options:
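To make the mapping part of the request concrete, here is a minimal sketch of the axis-to-MIDI step only, independent of any particular camera or tracker. It assumes an upstream vision library (e.g. OpenCV, MediaPipe, or a Kinect SDK) has already produced a hand position normalized to the range 0.0-1.0 per axis; the CC numbers 20-22 below are arbitrary placeholders standing in for the user-pickable assignments the post asks for.

```python
def axis_to_cc(value, cc_number, channel=0):
    """Map a normalized axis value in [0.0, 1.0] to a raw MIDI Control
    Change message: three bytes (status, controller, value 0-127)."""
    clamped = min(max(value, 0.0), 1.0)          # guard against tracker jitter
    midi_value = round(clamped * 127)            # scale to the 7-bit MIDI range
    status = 0xB0 | (channel & 0x0F)             # 0xB0 = Control Change, ch. 0-15
    return bytes([status, cc_number & 0x7F, midi_value])

def hand_to_midi(x, y, z, cc_map=(20, 21, 22)):
    """Turn one normalized (x, y, z) hand sample into three CC messages,
    one per axis. A real tool would let the user choose cc_map."""
    return [axis_to_cc(v, cc) for v, cc in zip((x, y, z), cc_map)]

# Example: hand at the center of the frame, halfway along the depth axis.
messages = hand_to_midi(0.5, 0.5, 0.5)
```

Sending these bytes out through a virtual MIDI port, for example via the python-rtmidi library to the IAC Driver on macOS or a loopMIDI port on Windows, is one plausible way to expose them to any DAW, as the post requests.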