VR Lullaby Machine: An Artwork Controlled by Brainwaves
RMIT University, Media Art Installation and Research Project, Australia, 2018
INTER-DREAM is a virtual reality artwork by PluginHUMAN. This artwork places the human body at the centre of a digital experience involving all five senses. Participants wear EEG and VR headsets – their neural function (brainwaves) controls the sights, sounds and aromas in an immersive performance space. Their brainwaves also provide haptic (touch) and gustatory (taste) feedback.
Poor sleep is an increasingly prevalent global health concern; however, how to design for promoting sleep remains relatively underexplored. We propose that neurofeedback technology may facilitate restfulness and sleep onset, and we explore this through the creation and study of Inter-Dream, a novel multi-sensory interactive artistic experience driven by neurofeedback.
This artwork/research project investigates human-computer integration in relation to pre-sleep. We created a feedback loop: a cycle of information shaped by the interplay between neural function and real-time art. The feedback loop involves sight, sound, touch, smell and taste.
THE INTER-DREAM EXPERIENCE
An Inter-Dream participant reclines on an interactive resting platform while wearing EEG and VR headsets. The EEG headset reads the electrical activity of the brain. The participant’s brain function animates abstract visuals that are projected onto media surfaces at the rear of the space. These visuals are also displayed in VR so that the participant can view the imagery with minimal physical effort. Their neural function also triggers audio loops, and they receive aromatic and gustatory feedback.
When a participant sees and hears the audio-visuals, and feels, tastes and smells the sensory stimuli, these stimuli influence their neural function. This new brain activity in turn changes the audio, visuals, aromas, and the haptic and gustatory effects. A cyclical feedback loop is therefore created between the human body and a real-time reactive multi-sensory environment. Our studies show that engaging with this system may assist people to prepare for sleep.
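The project does not publish its mapping from brainwaves to audio-visuals, but the loop described above can be sketched in general terms: estimate the power of standard EEG frequency bands from a window of samples, derive a calmness index, and map it onto a visual parameter. Everything below is an illustrative assumption (the sampling rate, the alpha/beta band ratio, and the mapping function), not the artwork's actual implementation:

```python
import math

SAMPLE_RATE = 128  # Hz; an assumed EEG headset sampling rate, not Inter-Dream's

def band_power(samples, low_hz, high_hz, rate=SAMPLE_RATE):
    """Naive DFT-based estimate of signal power within a frequency band."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * rate / n
        if low_hz <= freq <= high_hz:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def relaxation_index(samples):
    """Ratio of alpha (8-12 Hz) to beta (13-30 Hz) power; higher suggests calmer."""
    alpha = band_power(samples, 8, 12)
    beta = band_power(samples, 13, 30)
    return alpha / (beta + 1e-9)  # small constant avoids division by zero

def map_to_visuals(index):
    """Squash the unbounded index into a 0-1 brightness/tempo parameter."""
    return index / (index + 1.0)

# Synthetic one-second window: a dominant 10 Hz alpha wave plus weak 20 Hz beta.
window = [math.sin(2 * math.pi * 10 * t / SAMPLE_RATE)
          + 0.2 * math.sin(2 * math.pi * 20 * t / SAMPLE_RATE)
          for t in range(SAMPLE_RATE)]
print(map_to_visuals(relaxation_index(window)))
```

In a closed-loop installation this parameter would drive the projections and audio, the participant would perceive the change, and the next window of EEG samples would reflect their response, closing the cycle the text describes.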
Justin Dwyer ~ Artist (projection mapping), VR programmer
Betty Sargeant ~ Artist (multi-sensory design), producer
Andrew Ogburn ~ Music Composer
Nathan Semertzidis ~ Lead Researcher
Thanks to: Florian ‘Floyd’ Mueller and the Exertion Games Lab, RMIT University
PluginHUMAN is a multi-award-winning art duo led by Dr Betty Sargeant and Justin Dwyer. PluginHUMAN is at the progressive edge of its field, providing audiences with new cultural, environmental and scientific perspectives. Their artworks address the leading questions and concerns of our times.
PluginHUMAN’s real-time art centres on the art of illumination. They create projection mapping, video artworks, and LED-based immersive multi-sensory environments. They use the medium of light to translate complex data into meaningful audience experiences.
PluginHUMAN has created commissioned artworks for institutions such as the National Taiwan Museum of Fine Arts (Taiwan), the Asia Culture Centre (South Korea), Questacon (Australia’s National Science and Technology Centre), the Melbourne Museum (White Night Festival, Australia) and Experimenta (Australian triennial touring media art exhibition). They have exhibited in Europe, North America, Asia and Australia. PluginHUMAN was awarded the Rupert Bunny Foundation Visual Art Fellowship (2019/20); they’re developing new carbon-neutral and carbon-negative materials and working methods. They have also won Good Design Awards (2020 and 2018) and a Victorian Premier’s Design Award (2017).
PluginHUMAN has an acute understanding of the role that technology plays in contemporary society. They reimagine new technologies to produce artistic innovations, creating meaningful large- and small-scale audience experiences for indoor spaces and outdoor public arenas. They specialise in immersive and interactive art environments.