How do we perceive reality? It is a philosophical question, but the most common answer comes to mind immediately: through the five senses!
This is why, in recent years, several companies have been trying to make virtual reality ever more immersive, adding new technologies to our headsets.
Compared to the visual experience, spatial audio lags far behind, and from this point of view it is still difficult to find a truly immersive virtual reality today.
It is essential to develop the spatiality of audio, because it is not the same thing to enter, say, a safety simulation and hear sounds in plain stereo all around us as it is to clearly perceive an overhead load crashing down behind us or a car braking sharply to our right.
The spatiality of audio is already well developed on other devices and implemented in many theme parks. The problem with applying it to immersive virtual reality is that those systems rely on bulky, hard-to-transport speaker arrays.
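Headsets sidestep the speaker problem by rendering spatial audio binaurally, over ordinary headphones. As a rough illustration of the basic principle (not any particular headset's implementation), the sketch below pans a mono signal by applying an interaural time difference, using Woodworth's spherical-head approximation, and a simple interaural level difference. The head radius, gain law, and function names are illustrative assumptions; real engines use full HRTF filtering.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, air at room temperature
HEAD_RADIUS = 0.0875     # m, a commonly assumed average head radius
SAMPLE_RATE = 44100      # Hz

def binaural_pan(mono, azimuth_deg):
    """Crudely spatialize a mono signal (a list of samples).

    azimuth_deg: 0 = straight ahead, +90 = hard right, -90 = hard left.
    Returns (left, right) sample lists.
    """
    theta = math.radians(azimuth_deg)
    # Woodworth ITD approximation: (r / c) * (theta + sin(theta))
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))
    delay = round(abs(itd) * SAMPLE_RATE)  # delay in whole samples

    # Equal-power gain law stands in for the interaural level difference
    gain_l = math.cos((theta + math.pi / 2) / 2)
    gain_r = math.sin((theta + math.pi / 2) / 2)

    left = [gain_l * s for s in mono]
    right = [gain_r * s for s in mono]
    pad = [0.0] * delay
    if itd > 0:   # source on the right: the left ear hears it later
        left, right = pad + left, right + pad
    else:         # source on the left (or centered): delay the right ear
        left, right = left + pad, pad + right
    return left, right

# A click hard to the right: the right channel is loud and early,
# the left channel is delayed by roughly 0.66 ms (~29 samples at 44.1 kHz)
left, right = binaural_pan([1.0] * 100, 90.0)
```

Even this crude cue pair is enough for a listener to tell left from right; conveying front/back and elevation is what requires the full head-related transfer functions that modern headsets ship with.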
Far more interesting, and far more challenging, is the simulation of smell. FeelReal, for example, is developing a mask that fits under the VR headset and lets you smell, say, flowers in spring or a fire in the woods.
Several scent-delivery techniques are on the market today, but they share a common obstacle. As Robert Stone, director of the Human Interface Technologies Team at the University of Birmingham, explains:
“But the one problem we’ve always had with olfactory displays is the delivery mechanism. Take, for example, a project we started for the British Army. We wanted to recreate the smells of a Middle Eastern village, because we’d learned that when army personnel were on patrol, certain smells – or the absence of smells – could warn them that something was about to happen. We could synthesize the smells – cooking, tobacco, rotting, hanging meat and so on – and release them, but the noise from the electromechanical and pneumatic components of the hardware would warn users long before the smells, which made the simulation useless.”
For touch, virtual reality can become immersive only if it reproduces the mechanical stimulation that touch naturally generates. For example, the Teslasuit is already on the market (no connection with Tesla): a full-body suit capable of making you feel a blow you receive in the simulation. HaptX, meanwhile, makes a glove that works as a sort of exoskeleton, providing force feedback that simulates what you feel when you close your hands around a steering wheel or an apple.
For home use, but also for surgical robotics, electric motion platforms are already available; they are still quite expensive, but obtainable.
Flight simulators have long used this kind of physical simulation to create immersive virtual reality: for years, hydraulics beneath the seats of pilots in training have reproduced the sensations of movement and acceleration.
Taste simulation is currently the most invasive: the Digital Lollipop, created by Nimesha Ranasinghe at the National University of Singapore, is a device placed in direct contact with the tongue that stimulates different tastes directly.
These direct stimulation techniques have been known since the 1700s (as "galvanic stimulation of the tongue") and rely on the distribution of taste stimuli across the tongue: perturbing these regions thermally and electrically elicits a direct taste response from the brain.
For direct brain stimulation (à la The Matrix, so to speak) we may still have to wait... in the meantime, we can watch how the dynamic, ever-changing world of immersive virtual reality evolves.
Are you interested in the topic of AR/VR? Read our article on augmented reality for business.
