Papers:
- Robert Jacobs, Mark Feldmeier, Joseph A. Paradiso. "A Wireless Sensor-based Mobile Music Environment Compiled from a Graphical Language." In Proceedings of the 2008 Conference on New Interfaces for Musical Expression (NIME-08), pages 193-196, Genoa, Italy, 2008.
- The poster for the above paper
- R. Jacobs. A wireless sensor-based mobile music environment compiled from a graphical language. Master's thesis, MIT EECS Department, Cambridge, Mass., September 2007.
Video Demonstrations:
A Video (50 MB) illustrating how this system is used for interactive exercise. The exercise mapping here is very simple: it builds up a synchronized tempo when it detects a rhythm, and automatically changes musical mode when the wearer moves from a jog into calisthenics. The dancer in this example was wearing wireless sensors (3-axis acceleration and 2-axis gyro) on one leg and one foot.
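The tempo-tracking part of the exercise mapping described above can be sketched roughly as follows. This is an illustrative Python sketch, not code from the actual system; the footfall detector, sampling rate, and threshold are all assumptions.

```python
# Hypothetical sketch of the exercise mapping: detect footfalls from
# accelerometer magnitude, then lock a tempo to the detected rhythm.
# All names and parameters here are illustrative, not from the real system.

def detect_footfalls(samples, rate_hz=100.0, threshold=1.8):
    """Return timestamps (s) where acceleration magnitude crosses the
    threshold upward -- a crude footfall/impact detector."""
    times = []
    above = False
    for i, a in enumerate(samples):
        if a > threshold and not above:
            times.append(i / rate_hz)
            above = True
        elif a <= threshold:
            above = False
    return times

def tempo_bpm(footfall_times):
    """Convert the median inter-footfall interval to beats per minute."""
    if len(footfall_times) < 2:
        return None
    intervals = sorted(b - a for a, b in zip(footfall_times, footfall_times[1:]))
    median = intervals[len(intervals) // 2]
    return 60.0 / median
```

A steady 2 Hz jog (one footfall every 0.5 s) would map to a 120 BPM musical tempo under this sketch; the mode change between jogging and calisthenics would sit in a separate classifier layered on the same sensor stream.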
A Video (21 MB) showing a simple demo running under PuDAC on a Nokia 800, taken at NIME 2008 in Genoa in June 2008. Again, the mapping is simple: a particular motion of one sensor triggers a particular sound sample (and introduces limited vibrato when it is moved post-trigger), while movement of the other sensor introduces chorusing and detune effects once the sound is launched. The wireless sensors used are the same as above (we have made much smaller sensors of this sort under other projects in the Responsive Environments Group, such as Spinner and SportSemble).
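The trigger-and-modulate mapping in this demo can be sketched as a small state machine: one sensor's motion launches the sample and then scales vibrato depth, while the second sensor's motion scales a chorus/detune amount. This is a hypothetical Python sketch under assumed thresholds and scalings, not the actual PuDAC patch.

```python
# Hypothetical sketch of the demo mapping. TRIGGER_THRESHOLD, the scale
# factors, and the Voice class are all illustrative assumptions.

TRIGGER_THRESHOLD = 1.5   # assumed acceleration threshold for the trigger gesture
VIBRATO_SCALE = 0.2       # assumed motion-to-vibrato-depth scaling
DETUNE_SCALE = 8.0        # assumed cents of detune per unit of motion

class Voice:
    def __init__(self):
        self.playing = False
        self.vibrato_depth = 0.0
        self.detune_cents = 0.0

    def update(self, trigger_accel, modulation_accel):
        """Feed one frame of acceleration magnitude from each sensor."""
        if not self.playing and trigger_accel > TRIGGER_THRESHOLD:
            self.playing = True  # particular motion launches the sample
        if self.playing:
            # post-trigger motion of sensor 1 adds limited vibrato (clamped)
            self.vibrato_depth = min(1.0, trigger_accel * VIBRATO_SCALE)
            # motion of sensor 2 introduces chorusing/detune
            self.detune_cents = modulation_accel * DETUNE_SCALE
```

In the real system this mapping would be authored in the graphical language and compiled to run on the handheld; the sketch only illustrates the control flow implied by the description.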