DoppelLab



Homes and offices are being filled with sensor networks to answer specific queries and solve predetermined problems, but there are no comprehensive visualization or sonification tools for fusing these disparate data to examine relationships across spaces and sensing modalities. DoppelLab is an immersive, cross-reality virtual environment that serves as an active repository of the multimodal sensor data produced by a building and its inhabitants. We transform architectural models into browsing environments for real-time sensor data visualizations, as well as open-ended platforms for building audiovisual applications atop those data. These applications in turn become sensor-driven interfaces to physical-world actuation and control. As an interface tool designed to enable rapid parsing, visualization, sonification, and application development, DoppelLab proposes to organize these data by the space from which they originate, thereby providing a platform for both broad and specific queries about the activities, systems, and relationships in a complex, sensor-rich environment.
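
To make the organizing idea concrete, the following is a minimal sketch, not the DoppelLab implementation: it only illustrates how heterogeneous sensor readings might be indexed by the physical space they originate from, so that a visualization or sonification layer can ask for everything currently known about one room in a single query. All names in it (SpaceIndex, Reading, the example room label) are hypothetical.

# Minimal sketch, not the DoppelLab implementation: heterogeneous sensor
# readings are indexed by the physical space they originate from, so a
# visualization layer can query one room across all modalities at once.
# All names here (SpaceIndex, Reading, the room label) are illustrative.

from dataclasses import dataclass, field
from collections import defaultdict
from time import time


@dataclass
class Reading:
    modality: str    # e.g. "temperature", "humidity", "rfid"
    value: object    # a scalar measurement or an identifier such as an RFID tag
    timestamp: float = field(default_factory=time)


class SpaceIndex:
    """Keeps the latest reading of each modality, grouped by space."""

    def __init__(self) -> None:
        self._by_space: dict = defaultdict(dict)

    def ingest(self, space: str, reading: Reading) -> None:
        # A newer reading of the same modality replaces the older one for that space.
        self._by_space[space][reading.modality] = reading

    def snapshot(self, space: str) -> dict:
        # Broad query: everything currently known about one space,
        # across all sensing modalities, ready to drive a visualization.
        return dict(self._by_space.get(space, {}))


if __name__ == "__main__":
    index = SpaceIndex()
    index.ingest("atrium", Reading("temperature", 22.5))
    index.ingest("atrium", Reading("humidity", 41.0))
    index.ingest("atrium", Reading("rfid", "badge:3f2a"))
    for modality, reading in index.snapshot("atrium").items():
        print(f"{modality}: {reading.value} (t={reading.timestamp:.0f})")

In DoppelLab itself, per-space data like these drive the real-time visualizations and sonifications placed at the corresponding locations in the building model.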

Visit Tidmarsh to see our more recent work with sensor data from an outdoor site.


Images: Lab Overview, Solid Walls; Lab Overview, Transparent Walls; Temperature and Humidity Sensors; RFID Identification

Responsive Environments Group | MIT Media Lab