The brain’s intuitive ability to distribute information across the auditory and visual sensory channels can inform the design of a productive multichannel framework for data analysis and monitoring.
‘Rotator’ is a web-based multisensory analysis interface that lets users shift dimensions of scientific data between the auditory and visual sensory channels in order to better discern structure and anomalies in the data. GitHub repository here.
The Rotator framework is described in Chapters 3 through 8 of Juliana Cherston's Master's thesis, as well as in a paper presented at the International Conference on Auditory Display (ICAD).
The Rotator tool was built by Juliana Cherston in the Responsive Environments Group at the MIT Media Lab (Principal Investigator: Joseph A. Paradiso). Please contact the Responsive Environments Group for additional information.
In this sample, the participant is particularly active: an oscillator sonifies the accelerometer data, and steps extracted from the data are sonified as clicks. Body temperature is far from average, triggering a clearly perceptible high-frequency beat synthesizer.
Conversely, in this sample the participant is very still: the accelerometer data is nearly flat, and very few steps are extracted from it. The participant's body temperature is near average, triggering a very low-frequency beat synthesizer.
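As a rough illustration only (this sketch is not taken from the Rotator source; the function names, node topology, and scaling constants are all assumptions), the accelerometer/step/temperature mapping heard in the two samples above could be expressed with the Web Audio API along these lines:

```typescript
// Hypothetical sketch of the accelerometer/step/temperature sonification
// described above. All constants and mappings are illustrative assumptions.
const ctx = new AudioContext();

// Continuous oscillator whose pitch and level track accelerometer magnitude,
// so an active participant sounds higher and louder than a still one.
const accelOsc = ctx.createOscillator();
const accelGain = ctx.createGain();
accelOsc.connect(accelGain).connect(ctx.destination);
accelOsc.start();

function sonifyAccel(magnitude: number): void {
  // Assumed scaling: 200 Hz baseline, rising with movement intensity.
  accelOsc.frequency.setTargetAtTime(200 + magnitude * 400, ctx.currentTime, 0.05);
  accelGain.gain.setTargetAtTime(Math.min(magnitude, 1), ctx.currentTime, 0.05);
}

// Each step extracted from the data triggers a short click: a brief,
// sharply enveloped tone burst.
function sonifyStep(): void {
  const click = ctx.createOscillator();
  const env = ctx.createGain();
  click.frequency.value = 1000;
  env.gain.setValueAtTime(1, ctx.currentTime);
  env.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 0.03);
  click.connect(env).connect(ctx.destination);
  click.start();
  click.stop(ctx.currentTime + 0.03);
}

// The beat synthesizer's rate grows with |temperature - average|: a
// near-average temperature yields a slow, subtle beat, while an outlier
// produces a fast, attention-grabbing one.
function beatRate(tempC: number, avgC = 36.6): number {
  return 0.5 + Math.abs(tempC - avgC) * 4; // beats per second (assumed scaling)
}
```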
In this third sample, the participant may be exhibiting a stress response: there are peaks in the electrodermal activity (EDA) stream, audible as windy white noise, and the many electrodermal responses (EDRs) extracted from that stream are sonified with a pulse synth. Body temperature is high and activity is low.
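The EDA mapping could be sketched in the same hypothetical style (again, node names and constants below are assumptions for illustration, not Rotator's actual implementation):

```typescript
// Hypothetical sketch of the EDA/EDR sonification described above.
const ctxEda = new AudioContext();

// Looped white-noise buffer whose gain follows the normalized EDA level,
// so skin-conductance peaks become audible as a "windy" swell.
const noiseBuf = ctxEda.createBuffer(1, ctxEda.sampleRate, ctxEda.sampleRate);
const samples = noiseBuf.getChannelData(0);
for (let i = 0; i < samples.length; i++) samples[i] = Math.random() * 2 - 1;

const noise = ctxEda.createBufferSource();
noise.buffer = noiseBuf;
noise.loop = true;
const noiseGain = ctxEda.createGain();
noiseGain.gain.value = 0;
noise.connect(noiseGain).connect(ctxEda.destination);
noise.start();

function sonifyEda(normalizedLevel: number): void {
  // Smoothly track the EDA envelope rather than jumping sample to sample.
  noiseGain.gain.setTargetAtTime(normalizedLevel, ctxEda.currentTime, 0.1);
}

// Each extracted EDR triggers a short square-wave pulse, so a cluster of
// EDRs (as in the stress-response sample) is heard as a rapid pulse train.
function sonifyEdr(): void {
  const pulse = ctxEda.createOscillator();
  pulse.type = "square";
  pulse.frequency.value = 440;
  const env = ctxEda.createGain();
  env.gain.setValueAtTime(0.4, ctxEda.currentTime);
  env.gain.exponentialRampToValueAtTime(0.001, ctxEda.currentTime + 0.1);
  pulse.connect(env).connect(ctxEda.destination);
  pulse.start();
  pulse.stop(ctxEda.currentTime + 0.1);
}
```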