Heartbeats: an audiovisual performance driven by human heartbeats

I created unique hardware and custom software that let Piotr Bejnar play a concert driven by human heartbeats.

We built 12 bracelets that read pulse and movement data in real time, custom software that controlled the tempo of Piotr's music, and another application for live generative visualisations.

The bracelets were powered by an Arduino Pro Mini with a pulse sensor, an accelerometer, and an 868 MHz radio. They communicated with each other and with the "main" radio transceiver over a mesh network: even if a bracelet couldn't "see" the master radio directly, it could still reach it through the other nodes.
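
The post doesn't name the radio stack, but the behaviour described matches a routed mesh such as RadioHead's RHMesh. Below is a minimal node sketch under that assumption; the radio module (RFM95), the addresses, and the payload format are all illustrative, not the project's actual firmware.

```cpp
// Minimal mesh-node sketch, assuming RadioHead's RHMesh on top of an
// RFM95-style 868 MHz radio. Addresses, pins, and payload layout are
// hypothetical.
#include <SPI.h>
#include <RH_RF95.h>
#include <RHMesh.h>

const uint8_t NODE_ADDRESS   = 3;   // unique per bracelet
const uint8_t MASTER_ADDRESS = 1;   // the "main" transceiver
const int     PULSE_PIN      = A0;  // pulse sensor analog input

RH_RF95 rf95;                       // radio driver on default SPI pins
RHMesh  mesh(rf95, NODE_ADDRESS);   // mesh routing layer on top of it

void setup() {
  mesh.init();
  rf95.setFrequency(868.0);         // the 868 MHz band the bracelets used
}

void loop() {
  // Ship the raw pulse reading towards the master node; the mesh layer
  // finds a multi-hop route when the master isn't in direct range.
  uint16_t pulse = analogRead(PULSE_PIN);
  uint8_t payload[2] = { (uint8_t)(pulse >> 8), (uint8_t)(pulse & 0xFF) };
  mesh.sendtoWait(payload, sizeof(payload), MASTER_ADDRESS);
  delay(100);
}
```

The mesh layer is what gives the "route around a missing link" behaviour described above: `sendtoWait` discovers and uses intermediate bracelets as relays whenever the master is out of direct radio range.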

[photo: bracelet Arduino internals]
[photo: bracelets worn during the live show]

Each bracelet was equipped with four RGB LEDs that changed color based on the bracelet's status: blue indicated that the bracelet was working, and red meant a connection to the DJ, i.e. that bracelet's pulse was currently controlling the tempo of the song.
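
As a rough illustration of that status logic, here is a sketch assuming the four LEDs are chainable NeoPixel-style pixels on a single data pin; the real bracelets may have driven plain RGB LEDs differently, and the pin number is made up.

```cpp
// Status-LED sketch, assuming four NeoPixel-style RGB LEDs; the pin
// and LED type are assumptions, not the project's actual wiring.
#include <Adafruit_NeoPixel.h>

const int LED_PIN   = 6;  // data pin (hypothetical)
const int LED_COUNT = 4;  // four RGB LEDs per bracelet

Adafruit_NeoPixel leds(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

enum Status { WORKING, CONTROLLING_TEMPO };

void showStatus(Status s) {
  // blue = bracelet is working, red = this bracelet drives the song tempo
  uint32_t color = (s == CONTROLLING_TEMPO)
                     ? leds.Color(255, 0, 0)
                     : leds.Color(0, 0, 255);
  for (int i = 0; i < LED_COUNT; i++) leds.setPixelColor(i, color);
  leds.show();
}

void setup() {
  leds.begin();
  showStatus(WORKING);
}

void loop() {
  // in the real firmware the status would presumably flip to red when
  // a radio message announces that the DJ selected this bracelet
}
```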

The casings were 3D printed, and for the straps we used Apple Watch ones.

We wrote a lot of software for this project; it can be divided into three main parts:

  • main transceiver - handles communication between all three computers and all the bracelets through the radio transceiver
  • audio controller - receives data from the main transceiver and controls the Ableton Live tempo
  • generative visuals - receives the same data from the transceiver and generates visuals on the fly

Communication between all the machines was handled with OSC over an ad-hoc network.
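
The post doesn't say how the OSC messages were built, so purely as a sketch of what a forwarding step on the main-transceiver machine could look like, here is one using the oscpack C++ library. The address pattern `/bracelet/bpm`, the target IP, and the port are invented for illustration.

```cpp
// Hypothetical forwarding step on the main-transceiver machine,
// sketched with oscpack; address pattern, IP, and port are invented.
#include "osc/OscOutboundPacketStream.h"
#include "ip/UdpSocket.h"

void sendBpm(int braceletId, float bpm) {
  // one of the other machines on the ad-hoc network
  UdpTransmitSocket socket(IpEndpointName("192.168.2.10", 7000));

  char buffer[256];
  osc::OutboundPacketStream packet(buffer, sizeof(buffer));
  packet << osc::BeginMessage("/bracelet/bpm")
         << braceletId << bpm
         << osc::EndMessage;

  // both the audio controller and the visuals app would listen for
  // messages like this one
  socket.Send(packet.Data(), packet.Size());
}

int main() {
  sendBpm(3, 124.0f);  // e.g. bracelet 3 currently beats at 124 BPM
  return 0;
}
```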

[screenshot: the max4live audio controller]

The audio controller, created in max4live, ran on Piotr's machine. It let him easily choose which bracelet currently controls the tempo, and it showed all the incoming data in real time. While playing, Piotr switched between active bracelets, and the selected one drove the tempo of the song.
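
The patch itself is visual and runs inside Ableton Live, but its core behaviour, accepting BPM messages, ignoring all but the selected bracelet, and keeping the value in a sane range, can be sketched in plain C++ with oscpack's listener. Everything here (address pattern, port, clamp range) is hypothetical.

```cpp
// Conceptual C++ sketch of the controller's selection logic; the real
// implementation was a max4live patch, and all values here are made up.
#include <algorithm>
#include <cstdio>
#include <cstring>
#include "osc/OscReceivedElements.h"
#include "osc/OscPacketListener.h"
#include "ip/UdpSocket.h"

class TempoListener : public osc::OscPacketListener {
public:
  int activeBracelet = 3;  // the bracelet the DJ picked (red LEDs)

protected:
  void ProcessMessage(const osc::ReceivedMessage& msg,
                      const IpEndpointName&) override {
    if (std::strcmp(msg.AddressPattern(), "/bracelet/bpm") != 0) return;

    osc::ReceivedMessage::const_iterator arg = msg.ArgumentsBegin();
    int id    = (arg++)->AsInt32();
    float bpm = (arg++)->AsFloat();

    if (id != activeBracelet) return;              // one bracelet drives tempo
    bpm = std::min(std::max(bpm, 60.0f), 180.0f);  // keep outliers off the set

    // here the max4live patch would set Ableton Live's tempo
    std::printf("tempo -> %.1f BPM\n", bpm);
  }
};

int main() {
  TempoListener listener;
  UdpListeningReceiveSocket socket(
      IpEndpointName(IpEndpointName::ANY_ADDRESS, 7000), &listener);
  socket.RunUntilSigInt();  // blocks, handling packets as they arrive
}
```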

Live visuals were an important part of the whole project: they were meant to show the audience that the tempo data really was being controlled by the bracelets. We collaborated closely with the amazing VJ and graphic/motion designer Michał Mierzwa to build half-generative visuals.

Part of them was fully generative code written in Clojure with quil (a Processing wrapper), and another part was Michał's video samples, created specially for the occasion. The live visuals application rendered its frames through Syphon, which let Michał receive and display them easily as part of his Modul8 workflow. In addition, Modul8's tempo was also synced to the current bracelet tempo, so the whole concert ran in sync with live heartbeat data.