One of the most innovative aspects explored is the close connection between the narrative and the performer's emotional states, captured with a neurophysiological device (an EEG, or electroencephalogram) and synchronized with the audiovisual aesthetic. Although the EEG signals are transmitted wirelessly (and the software is specifically trained on the performer's states of mind), the EEG helmet is adorned with cables connecting it to the ceiling, evoking a cerebral connection to the system and the idea of human guinea pigs in police laboratories. With digital art by João Martinho Moura and musical composition by Miguel Pedro, the performance exudes cooperation between Art and Technology. After its premiere at the FrameArt festival (European Capital of Culture 2012), it has been revised and expanded, and the new version will now be presented at Theatro Circo.
Video teaser of the event at the European Capital of Culture – Guimarães 2012:
Video teaser – Theatro Circo:
João Martinho Moura, Adolfo Luxúria Canibal, Miguel Pedro Guimarães & Pedro Branco (2013). "Câmara Neuronal, a Neuro/Visual/Audio Performance". Proceedings of xCoAx 2013: Computation, Communication, Aesthetics and X, Bergamo, Italy, pp. 309–311. e-ISBN: 978-989-746-017-3.
Sandra Bettencourt (2015). "Ele Canta o Corpo Elétrico: Câmara Neuronal como Performance Pós-humanista e Pós-digital". CLP | Universidade de Coimbra.
The performance lasts 30 minutes and involves a single performer on stage, his body connected to the visual and audio system. The connections include 18 physical sensors: 16 EEG electrodes on the scalp and 2 electrodes on the chest. Body movement is also captured with a 3D depth camera. Vital signals are acquired in real time, analysed, and transformed into image and sound. The helmet signals are transmitted wirelessly to custom audio/visual software trained specifically on the performer's mental states.
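The core idea of the pipeline above is that a vital signal is normalized and mapped onto an audiovisual parameter in real time. A minimal sketch of that mapping step, with hypothetical names and ranges (the actual software's mappings are not documented here):

```java
// Hypothetical sketch: map a raw sensor reading to a normalized value,
// then to a visual parameter, as the performance maps vital signals
// to image and sound in real time.
public class SignalMapping {
    // Clamp and normalize a raw reading into [0, 1] given its expected range.
    static double normalize(double raw, double min, double max) {
        double v = (raw - min) / (max - min);
        return Math.max(0.0, Math.min(1.0, v));
    }

    // Map the normalized signal to an 8-bit brightness for the visuals.
    static int toBrightness(double normalized) {
        return (int) Math.round(normalized * 255);
    }

    public static void main(String[] args) {
        double n = normalize(0.75, 0.0, 1.5); // mid-range reading -> 0.5
        System.out.println(toBrightness(n));  // 128
    }
}
```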
Research institution: engageLab
Supervisor: Prof Pedro Branco
The main software:
Custom software developed specifically for the project in C++, using Xcode and the openFrameworks framework. All graphics are rendered in real time during the exhibition. Part of the graphics engine is also developed in Processing (Java), and the visual streams are exchanged between the two via the Syphon video protocol.
The brain signals are acquired with an EEG headset from Emotiv Inc and transmitted via radio to the main visual software.
The heartbeat is acquired with a Polar Inc device and analysed with custom software developed in Processing (Java).
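A typical heartbeat analysis of this kind derives beats per minute from the intervals between beats. A minimal sketch, assuming the Polar device reports R-R intervals in milliseconds (the project's actual analysis code is not shown here):

```java
// Hypothetical sketch: average heart rate from R-R intervals,
// as one plausible step in heartbeat analysis.
public class HeartRate {
    // Average beats-per-minute from a series of R-R intervals in milliseconds.
    static double bpmFromRR(double[] rrMillis) {
        double sum = 0;
        for (double rr : rrMillis) sum += rr;
        double meanMs = sum / rrMillis.length;
        return 60000.0 / meanMs; // 60,000 ms per minute / mean beat interval
    }

    public static void main(String[] args) {
        // 1000 ms between beats corresponds to 60 BPM.
        System.out.println(bpmFromRR(new double[]{1000, 1000, 1000})); // 60.0
    }
}
```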
The body shape is acquired with a 3D depth camera from Microsoft Inc, and all image transformations and analysis are done with the OpenCV framework.
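Extracting a body shape from a depth camera usually starts by thresholding each pixel's depth to isolate the performer from the background. A dependency-free sketch of that idea, with hypothetical distances (the project does this with OpenCV):

```java
// Hypothetical sketch: build a body silhouette mask from a depth frame by
// keeping only pixels within the performer's distance band.
public class SilhouetteSketch {
    // Mark pixels whose depth (mm) falls inside [nearMm, farMm] as foreground.
    static boolean[] silhouette(int[] depthMm, int nearMm, int farMm) {
        boolean[] mask = new boolean[depthMm.length];
        for (int i = 0; i < depthMm.length; i++) {
            mask[i] = depthMm[i] >= nearMm && depthMm[i] <= farMm;
        }
        return mask;
    }

    public static void main(String[] args) {
        int[] frame = {500, 1500, 2500, 1800}; // pixel depths in millimetres
        boolean[] mask = silhouette(frame, 1000, 2000);
        // Only the 1500 mm and 1800 mm pixels fall inside the performer's band.
        System.out.println(mask[0] + " " + mask[1] + " " + mask[2] + " " + mask[3]);
    }
}
```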
All signal communication is done via the OSC protocol between three computers (central, brain, and heartbeat), using Processing (Java) for the communications interface.
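An OSC message like the ones exchanged between the three machines is just an address string, a type-tag string, and the argument bytes, each padded to a multiple of four bytes per the OSC 1.0 specification. A minimal encoder sketch; the address `/brain/alpha` is a hypothetical example, not the project's actual address scheme:

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch: encode a single-float OSC message by hand,
// illustrating the wire format used between the three computers.
public class OscSketch {
    // NUL-terminate and pad an OSC string to a multiple of 4 bytes (OSC 1.0).
    static byte[] oscString(String s) {
        byte[] raw = s.getBytes(StandardCharsets.US_ASCII);
        int padded = (raw.length / 4 + 1) * 4; // always at least one NUL
        byte[] out = new byte[padded];
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;
    }

    // Encode an OSC message carrying one float argument.
    static byte[] encode(String address, float value) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.writeBytes(oscString(address));   // padded address
        out.writeBytes(oscString(",f"));      // type tag: one float argument
        out.writeBytes(ByteBuffer.allocate(4).putFloat(value).array()); // big-endian
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] msg = encode("/brain/alpha", 0.42f);
        System.out.println(msg.length); // 16 (address) + 4 (",f") + 4 (float) = 24
    }
}
```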
Video excerpts of the exhibition:
Pictures of the Exhibition
> Flickr Pictures of the exhibition