Paradiso directs the MIT Media Lab's Responsive Environments Group, which investigates the enhancement of human experience through external sensors. In his lecture, he outlined how the group's research is bringing us closer to a world where humans will be connected to a ubiquitous electronic 'nervous system'—a sensor network that will extend across objects, people and places, challenging our notion of physical presence.
Many of us are already connected to the Internet of Things, through digital assistants like Siri, or smartphone apps of the sort that can control the thermostat. According to Paradiso, this is a mere transition phase.
“We’re living in the world of the digital butler,” he said. “We talk to our devices, or in some cases we can gesture to them. We’re going to move through this pretty quickly and get to the point where we see them as an extension of ourselves.”
How to become a cyborg
In the near future, sensors will capture our needs and desires without us having to tell them, and the built environment will respond accordingly, Paradiso says. For example, his group has developed a bracelet which measures the wearer’s comfort in terms of heat and humidity. These data, and the user’s location, are relayed to a central system in the building, which activates the air conditioning or heating in the relevant office. “The wall thermostat is obsolete,” Paradiso said.
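The control loop described above, a wearable reading plus the wearer's location in, a heating or cooling command out, can be sketched roughly as follows. Every name and threshold here (`ComfortReading`, `decide_hvac`, the 22 °C target) is an illustrative assumption, not the Media Lab group's actual implementation:

```python
# Toy sketch of a bracelet-to-building comfort loop.
# All names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ComfortReading:
    wearer_id: str
    office: str           # location relayed along with the reading
    temperature_c: float  # temperature measured near the wearer
    humidity_pct: float

def decide_hvac(reading: ComfortReading,
                target_c: float = 22.0,
                deadband_c: float = 1.5) -> str:
    """Return the action a central building system might take for this office."""
    if reading.temperature_c > target_c + deadband_c:
        return f"cool {reading.office}"
    if reading.temperature_c < target_c - deadband_c:
        return f"heat {reading.office}"
    return "hold"
```

For a wearer in a warm office, `decide_hvac(ComfortReading("w1", "E14-548", 26.0, 40.0))` returns `"cool E14-548"`; the wall thermostat plays no part in the decision.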
This is just one of the many examples of ‘sensory prosthetics’ that will become available. MIT’s Media Lab has a long history of research into wearable sensors, including shoes, glasses, bracelets, and rings. For example, Paradiso’s group has developed a headset which controls lighting in dedicated rooms, by interpreting visual data and sensing what kind of task the user is performing. The key to a good wearable is that it must be unobtrusive and consume little power. Many sensors piggyback on commercial products that people would wear anyway.
For the “nervous system” to be truly ubiquitous, however, it must extend beyond wearables, to include sensory information about far-away places. MIT’s Networked Sensory Landscape project has created virtual models of remote locations by dotting those places with sensors measuring all sorts of information, from the weather to the wildlife.
One such location is Tidmarsh, a 600-acre wetland in Massachusetts, USA, where the Media Lab’s researchers have planted sensory units including soil moisture samplers, microphones, infra-red motion sensors, cameras and weather stations. The result is a virtual copy of the marsh, informed by all the real-life data from the sensor network, enabling you to experience weather, ground cover, light, temperature and animals from your computer, just as they are on location.
The same principle has been applied to buildings, allowing visualisation of the people, temperature, working appliances, and many other features of their interior. “You can go there virtually,” said Paradiso.
On-location sensors like the ones at Tidmarsh can also augment a visit to the actual place, revealing things that you simply could not perceive through your own senses. For example, real-time data from the sensors already in the field could be fed to a headset to give information about the weather, or to identify a bird species using a deep-learning algorithm that analyses recordings of its call.
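To illustrate the recording-to-identification pipeline in miniature: the sketch below replaces the deep-learning model the article mentions with a crude rule on the call's dominant frequency. The species names, frequency bands, and function names are all hypothetical; a real classifier would operate on spectrograms with a trained network.

```python
# Toy stand-in for call identification: find the strongest frequency in a
# recording via a naive DFT scan, then match it against hypothetical bands.
# Real systems use deep learning on spectrograms; this only shows the data flow.
import math

def dominant_frequency(samples, sample_rate):
    """Return the frequency (Hz) of the strongest DFT bin in the recording."""
    n = len(samples)
    best_freq, best_power = 0.0, 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        power = re * re + im * im
        if power > best_power:
            best_freq, best_power = k * sample_rate / n, power
    return best_freq

# Hypothetical call-frequency bands in Hz (illustrative only).
SPECIES_BANDS = {
    "mourning dove": (300, 600),
    "black-capped chickadee": (3000, 4500),
}

def identify_call(samples, sample_rate):
    """Map a recording to a species label, or 'unknown' if no band matches."""
    freq = dominant_frequency(samples, sample_rate)
    for species, (lo, hi) in SPECIES_BANDS.items():
        if lo <= freq <= hi:
            return species
    return "unknown"
```

Fed a synthetic 400 Hz tone sampled at 8 kHz, `identify_call` returns `"mourning dove"`; the same interface could push its result to a headset overlay alongside the weather data.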
Paradiso’s networked vision goes beyond interacting with discrete devices like the smartphone. As our presence becomes increasingly generalised through these sensor networks, he questions whether humans will retain the notion of a personal boundary: “In this new world, where does the ‘self’ end, and ‘other’ begin?”
Source: Imperial College London