With digital art by João Martinho Moura and musical composition by Miguel Pedro, the performance embodies a collaboration between art and technology. After its premiere at the FrameArt festival (Guimarães 2012 European Capital of Culture), it was revised and expanded, and a new version was presented at Theatro Circo.
Video1 (initial setup stage at Theatro Circo):
The moment when a nurse turns on the system, captured live in front of the audience at Theatro Circo, Braga.
Video2: teaser of the event at the European Capital of Culture – Guimarães 2012:
Video3: teaser – Theatro Circo:
João Martinho Moura, Adolfo Luxúria Canibal, Miguel Pedro Guimarães & Pedro Branco (2013). “Câmara Neuronal, a Neuro / Visual / Audio Performance”. Proceedings of xCoAx 2013: Computation, Communication, Aesthetics and X, Bergamo, Italy, pp. 309–311. e-ISBN: 978-989-746-017-3; Link
Sandra Bettencourt (2015). “Ele Canta o Corpo Elétrico: Câmara Neuronal como Performance Pós-humanista e Pós-digital”. CLP | Universidade de Coimbra. DOI: http://dx.doi.org/10.14195/2182-8830_3-1_4; Link
Lasting 30 minutes, the performance involves a single performer on stage, his body connected to the visual and audio system. The connections comprise 18 physical sensors: 16 EEG electrodes on the scalp and two electrodes on the chest. Body movement is also captured with a 3D depth camera. Vital signals are acquired in real time, analyzed, and transformed into image and sound. The headset signals are transmitted wirelessly by radio to custom audio/visual software trained specifically on the performer's mental states.
Research institution: engageLab
Created by: João Martinho Moura, Adolfo Luxúria Canibal and Miguel Pedro
Supervision: Pedro Branco
The main software:
Custom software developed specifically for the project, in C++, using Xcode and the openFrameworks toolkit. All graphics are rendered in real time during the performance. Part of the graphics engine is also developed in Processing (Java), and visual interpolation between the two is done via the Syphon video protocol.
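The actual C++/openFrameworks engine is not published; as a minimal illustration of the real-time mapping the text describes (vital signals transformed into image), the following Java sketch maps a normalized biosignal sample to two visual parameters. The parameter names and ranges here are illustrative assumptions, not the project's actual mapping.

```java
// Minimal sketch (hypothetical mapping, not the project's real engine) of
// driving visual parameters from a normalized biosignal sample in [0, 1].
public class SignalToVisual {
    // Linearly remap a value from one range to another
    // (same idea as Processing's built-in map()).
    static double map(double v, double inLo, double inHi, double outLo, double outHi) {
        return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
    }

    // Hypothetical mapping: the signal drives hue (0-360) and size (10-200 px).
    static double[] toVisual(double signal) {
        double hue  = map(signal, 0.0, 1.0, 0.0, 360.0);
        double size = map(signal, 0.0, 1.0, 10.0, 200.0);
        return new double[] { hue, size };
    }

    public static void main(String[] args) {
        double[] v = toVisual(0.5);
        System.out.println(v[0] + " " + v[1]);  // prints: 180.0 105.0
    }
}
```

In a Processing or openFrameworks draw loop, such a mapping would run once per frame, with the signal smoothed over a short window to avoid visual jitter.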
The brain signals are acquired via an EEG headset from Emotiv Inc. and transmitted via radio to the main visual software.
The heartbeat is acquired via a Polar Inc. device and analyzed with custom software developed in Processing (Java).
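The custom heartbeat analysis software is not published; as a sketch of the underlying arithmetic, assuming the Polar device reports beat-to-beat RR intervals in milliseconds (a common output of such sensors), instantaneous and mean heart rate follow directly:

```java
// Sketch of heart-rate arithmetic, assuming the sensor reports RR intervals
// (time between consecutive beats) in milliseconds. Names are illustrative.
public class HeartRate {
    // Instantaneous BPM from a single RR interval: 60 000 ms per minute.
    static double bpm(double rrMillis) {
        return 60000.0 / rrMillis;
    }

    // Mean BPM over a window of RR intervals (smoother, less jittery signal).
    static double meanBpm(double[] rrMillis) {
        double sum = 0;
        for (double rr : rrMillis) sum += rr;
        return 60000.0 / (sum / rrMillis.length);
    }

    public static void main(String[] args) {
        System.out.println(bpm(1000.0));                                  // 60.0
        System.out.println(meanBpm(new double[] {800.0, 1000.0, 1200.0})); // 60.0
    }
}
```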
The body shape is acquired via a 3D depth camera from Microsoft Inc., and all image transformations and analysis are done with the OpenCV framework.
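The actual image pipeline relies on OpenCV; the core step of extracting the performer's silhouette from a depth frame can be sketched in plain Java as a simple depth threshold. The names, the 2 m cutoff, and the tiny frame are all illustrative assumptions:

```java
// Sketch of silhouette extraction from a depth frame by thresholding.
// The real project uses OpenCV; this only illustrates the core idea.
public class DepthSilhouette {
    // Binary mask: pixels that are valid (> 0) and closer than maxDepth
    // are treated as the performer's body.
    static boolean[] mask(int[] depthMillis, int maxDepth) {
        boolean[] out = new boolean[depthMillis.length];
        for (int i = 0; i < depthMillis.length; i++) {
            out[i] = depthMillis[i] > 0 && depthMillis[i] < maxDepth;
        }
        return out;
    }

    // Fraction of foreground pixels, a cheap proxy for body presence/size.
    static double coverage(boolean[] mask) {
        int n = 0;
        for (boolean b : mask) if (b) n++;
        return (double) n / mask.length;
    }

    public static void main(String[] args) {
        int[] frame = {0, 1500, 3200, 1800};   // depths in mm; 0 = invalid
        boolean[] m = mask(frame, 2000);       // keep the body within 2 m
        System.out.println(coverage(m));       // prints: 0.5
    }
}
```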
Communication of the various signals between the three computers (central, brain, and heartbeat) is done via the OSC protocol, using Processing (Java) for the communications interface.
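The project's actual OSC addresses and library are not specified here (Processing sketches typically use a library for this); as a sketch of what travels over the wire, the following encodes a single-float OSC 1.0 message by hand. The /heartbeat address is a hypothetical example:

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Hand-rolled encoder for a minimal OSC 1.0 message carrying one float.
// The address "/heartbeat" is a hypothetical example, not the project's actual one.
public class OscEncoder {
    // Pad a string with NUL bytes to a multiple of 4 (OSC 1.0 alignment rule;
    // there is always at least one terminating NUL).
    static byte[] padded(String s) {
        byte[] raw = s.getBytes(StandardCharsets.US_ASCII);
        byte[] out = new byte[(raw.length / 4 + 1) * 4];
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;
    }

    // Message layout: padded address, padded type-tag string, big-endian args.
    static byte[] message(String address, float value) {
        ByteArrayOutputStream b = new ByteArrayOutputStream();
        byte[] a = padded(address);
        b.write(a, 0, a.length);
        byte[] t = padded(",f");                                   // one float arg
        b.write(t, 0, t.length);
        byte[] v = ByteBuffer.allocate(4).putFloat(value).array(); // big-endian
        b.write(v, 0, v.length);
        return b.toByteArray();
    }

    public static void main(String[] args) {
        byte[] msg = message("/heartbeat", 72.0f);
        // 12 (padded address) + 4 (type tags) + 4 (float) = 20 bytes
        System.out.println(msg.length);  // prints: 20
    }
}
```

Each computer would send such packets over UDP; the central machine dispatches them to the graphics engine by address pattern.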
Video4: excerpts of the exhibition at Theatro Circo:
openFrameworks and Processing communities
Pictures of the Exhibition:
> Flickr Pictures of the exhibition