DREAM 2.2 [2018]

An Interactive Art Experience Controlled by Brainwaves. By Betty Sargeant and Justin Dwyer

National Taiwan Museum of Fine Arts, TAIWAN, February-June 2018

DREAM 2.2 is an immersive, interactive art installation and performance by PluginHUMAN. This projection mapping artwork gives form to our mind’s ephemeral data. Audiences wear an EEG headset and use their neural activity (brainwaves) to alter the installation’s audio and visuals. In doing so, they see their dreams come alive and can, in a poetic sense, walk in the dreams of others. DREAM 2.2 performances feature two sleeping teenagers whose brainwaves control the exhibition’s projection-mapped visuals and trigger audio effects.

The immersive artwork DREAM 2.2 by PluginHUMAN.

This interactive art experience features PluginHUMAN’s original brain-computer interface. They designed the system using the software suites TouchDesigner and Ableton Live, together with commercial EEG (electroencephalogram) headsets that track the electrical activity of the brain. The result is an immersive, projection-mapped art experience and a personalised connection to each audience member’s neural data.
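The sketch below illustrates, in broad strokes, how a pipeline of this kind can be wired together in Python: band values from the headset are forwarded as OSC messages to TouchDesigner (for the visuals) and Ableton Live (for the audio). The read_band_powers() helper, the OSC addresses and the port numbers are placeholder assumptions for illustration, not the project’s documented routing.

# Minimal sketch: forwarding EEG band powers to TouchDesigner and Ableton Live via OSC.
# Assumptions: a hypothetical read_band_powers() supplies normalised band values from
# the headset (or a relay app); addresses and ports are placeholders, not the project's.
import time
import random
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

BANDS = ["alpha", "beta", "delta", "theta", "gamma"]

touchdesigner = SimpleUDPClient("127.0.0.1", 7000)  # visuals
ableton = SimpleUDPClient("127.0.0.1", 9000)        # audio (e.g. via an OSC bridge)

def read_band_powers():
    """Hypothetical stand-in for the headset data source: values in 0..1 per band."""
    return {band: random.random() for band in BANDS}

while True:
    powers = read_band_powers()
    for band, value in powers.items():
        # One OSC address per band; the visual software maps these to projection
        # parameters, the audio software maps them to effect parameters.
        touchdesigner.send_message(f"/eeg/{band}", value)
        ableton.send_message(f"/eeg/{band}", value)
    time.sleep(0.1)  # roughly 10 updates per second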

DREAM 2.2, an interactive art experience controlled by brainwaves. By PluginHUMAN.

THE INSTALLATION SPACE

The DREAM 2.2 immersive art installation features a ‘brain forest’: a 13m x 6m rectangular maze made from 100 hand-painted, 4-metre-high PVC panels that hang from the ceiling. When people move through the maze, they are surrounded by abstract projection-mapped visuals. The outer walls are covered in reflective mirrors. Together, the mirrored walls and the projection-mapped maze create a mesmerising, uncanny immersive art experience shaped by neural data, light projections, reflections and quadraphonic sound.

This artwork also features a poem that is projection-mapped onto the front of the installation’s resting platform. DREAM 2.2 performer Coco Disco wrote the poem, drawing inspiration from their own dreams. The poem operates as a linguistic tool that helps audiences immerse themselves further in the otherworldly, dreamlike environment of the installation.

The DREAM 2.2 projection mapping artwork.

THE AUDIENCE’S INTERACTIVE ART EXPERIENCE

During the installation, audience members can sit or lie down on the DREAM 2.2 resting platform. Each person wipes their forehead with a disposable wet cloth before interacting with the system. This removes any oils and residue from the skin and allows the EEG’s sensors to form a reliable connection with their forehead. One person at a time can use the EEG system, for approximately 5 minutes each. If there is no queue of people waiting, they can use the system for longer.

Audiences can also view a tablet displaying graphs of the neural data that is controlling the installation. The five graphs on our display track changes in Alpha, Beta, Delta, Theta and Gamma neural signals. When no one is wearing the EEG headset, the graphs’ lines are flat. As soon as the EEG is securely placed on someone’s forehead, the lines move, tracking changes in neural activity. This provides audiences with a traditional scientific display of their neural data, and it is a clear indication that the headset is working. While they wear the headset, the audience member’s neural activity controls the audio and visuals in the installation space: it forms abstract visuals that are projected onto the surrounding maze and triggers audio effects. Different neural activity creates different audio-visual effects, so everyone’s experience is unique.
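As an illustration of the display logic described above, the following Python sketch draws five rolling traces that stay flat until the headset reports skin contact. The has_contact() and read_band_powers() helpers are hypothetical stand-ins for the headset’s data stream; the actual tablet display was not necessarily built this way.

# Minimal sketch of a tablet-style band display: five rolling traces that stay flat
# until the headset makes contact. Both helper functions are hypothetical stand-ins.
from collections import deque
import random
import matplotlib.pyplot as plt
import matplotlib.animation as animation

BANDS = ["alpha", "beta", "delta", "theta", "gamma"]
HISTORY = 200
traces = {band: deque([0.0] * HISTORY, maxlen=HISTORY) for band in BANDS}

def has_contact():
    """Hypothetical: True once the sensors form a reliable connection with the forehead."""
    return random.random() > 0.3

def read_band_powers():
    """Hypothetical: normalised band powers streamed from the headset."""
    return {band: random.random() for band in BANDS}

fig, axes = plt.subplots(len(BANDS), 1, sharex=True)
lines = {}
for ax, band in zip(axes, BANDS):
    (lines[band],) = ax.plot(range(HISTORY), traces[band])
    ax.set_ylim(0, 1)
    ax.set_ylabel(band.capitalize())

def update(_frame):
    # Flat lines when no one wears the headset; live traces once contact is made.
    powers = read_band_powers() if has_contact() else {b: 0.0 for b in BANDS}
    for band in BANDS:
        traces[band].append(powers[band])
        lines[band].set_ydata(traces[band])
    return list(lines.values())

ani = animation.FuncAnimation(fig, update, interval=100, blit=False)
plt.show()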

The DREAM 2.2 interactive art experience.

THE PERFORMANCES

DREAM 2.2 performances feature two sleeping performers and PluginHUMAN. PluginHUMAN assign visual and audio effects to the performers’ neural data in real time. This data is projection-mapped onto the exhibition’s maze in the form of abstract visualisations. The audio consists of a 30-minute electronic soundtrack. When the performers generate specific neural signals, new sounds are automatically triggered and layered over the soundtrack. The performers’ neural data is delivered via Bluetooth from the EEG headsets to a third-party, non-proprietary app. The app divides the neural signals into five signal bands: Delta, Beta, Gamma, Alpha and Theta.
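The following Python sketch illustrates one way such trigger logic can work: when a band’s power crosses a threshold, a one-shot OSC message is sent so an extra sound is layered over the soundtrack, with simple hysteresis so a sustained signal fires only once. The threshold values, OSC addresses and read_band_powers() helper are illustrative assumptions, and Ableton Live would need an OSC receiver (such as a Max for Live device) or an OSC-to-MIDI bridge to act on these messages.

# Minimal sketch of threshold-based audio triggering layered over a fixed soundtrack.
# Thresholds, addresses and read_band_powers() are illustrative assumptions.
import time
import random
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

BANDS = ["delta", "beta", "gamma", "alpha", "theta"]
THRESHOLD = 0.8                            # illustrative trigger level per band
live = SimpleUDPClient("127.0.0.1", 9000)
armed = {band: True for band in BANDS}

def read_band_powers():
    """Hypothetical stand-in for the app that splits the Bluetooth EEG stream into bands."""
    return {band: random.random() for band in BANDS}

while True:
    powers = read_band_powers()
    for band, value in powers.items():
        if armed[band] and value >= THRESHOLD:
            live.send_message(f"/trigger/{band}", 1)  # launch the sound assigned to this band
            armed[band] = False
        elif value < THRESHOLD * 0.5:
            armed[band] = True  # hysteresis: wait for the band to settle before re-arming
    time.sleep(0.1)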

The DREAM 2.2 immersive art environment.

RELATED ACADEMIC PUBLICATION

Investigating Novel BCI Displays that Support Personalised Engagement and Interpersonal Connections

CREDITS

Betty Sargeant ~ Artist (maze design), producer

Justin Dwyer ~ Artist (projection mapping), programmer

Coco Disco ~ Performer, writer

Levi Dwyer ~ Performer

Andrew Ogburn ~ Composer

PRESS

Customised Reality: The Lure and Enchantment of Digital Art

RELATED BRAINWAVE ARTWORKS

PluginHUMAN have created a series of neural-controlled artworks, and DREAM 2.2 is part of this wider art investigation. Betty Sargeant and Justin Dwyer have created five interactive art installations and immersive art experiences that involve their custom brain-computer interface. These include the immersive art experience Inter-Dream and the multi-sensory performance The Loop.

PluginHUMAN: Justin Dwyer and Betty Sargeant

ABOUT PluginHUMAN

PluginHUMAN is a multi-award-winning art duo led by Dr Betty Sargeant and Justin Dwyer. PluginHUMAN are at the progressive edge of their field, providing audiences with new cultural, environmental and scientific perspectives. Their artworks address the leading questions and concerns of our times.

PluginHUMAN’s work centres on the art of illumination. They create projection mapping, video artworks and LED-based immersive multi-sensory environments. They use the medium of light to translate complex data into meaningful audience experiences.

PluginHUMAN has created commissioned artworks for institutions such as the National Taiwan Museum of Fine Arts (Taiwan), the Asia Culture Centre (South Korea), Questacon (Australia’s National Science and Technology Centre), the Melbourne Museum (White Night Festival, Australia) and Experimenta (Australian triennial touring media art exhibition). They have exhibited in Europe, North America, Asia and Australia. PluginHUMAN were awarded the Rupert Bunny Foundation Visual Art Fellowship (2019/20); they are developing new carbon-neutral and carbon-negative materials and working methods. They have also won Good Design Awards (2020 and 2018) and a Victorian Premier’s Design Award (2017).

PluginHUMAN has an acute understanding of the role that technology plays in contemporary society. They reimagine new technologies to produce artistic innovations, creating meaningful large- and small-scale audience experiences for indoor spaces and outdoor public arenas.

RELATED PROJECTS

BREATHE
[ i miss your touch ]
INTER-DREAM