Archive – the live setup of ElectroBluesSociety

ElectroBluesSociety’s live setup combined live instruments, real-time looping, backing tracks, and synchronized visuals into one integrated performance system. The setup was built around two laptops: one for audio and musical control, and one for video playback and visual synchronization.

It is worth mentioning that this system did not start as a fixed technical masterplan. It developed gradually, as a work in progress, through trial and error, practical limitations, experimentation, and quite a bit of improvisation: trying things out, making mistakes, adjusting, and building from there. There may well be easier, smarter, or more modern solutions by now, but this was the way ElectroBluesSociety approached it and made it work live.

The instrumental setup included several guitars, electric bass, double bass, and a drum kit. Depending on the venue and stage size, guitar and bass were played through their own amplifiers, while other audio signals and certain effects were routed to a compact PA system or to the front-of-house mixer.

Because the show moved between smaller clubs, alternative spaces, and larger festival stages, flexibility was essential. The setup therefore had to work in different situations, from relatively compact self-contained performances to larger productions with full FOH support. That practical adaptability was an important part of the technical concept.

The main performance computer was a MacBook running Ableton Live. This laptop handled the musical backbone of the show: samples, loops, backing elements, MIDI control, and synchronization. Ableton was controlled in real time via an AKAI APC40 and an Apogee GiO foot controller, allowing hands-on operation during the performance.

A Focusrite Saffire Pro 40 audio interface was used as the central hub for audio routing. It managed the different input and output paths required for instruments, loops, effects, and playback. This made it possible to route signals independently to amplifiers, PA, or recording paths, depending on the needs of the show.
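The idea of independent per-source routing can be sketched as a simple routing matrix. Note that all source and destination names below are hypothetical and only illustrate the concept; they are not the actual Saffire Pro 40 mixer configuration.

```python
# Illustrative routing matrix: each input source maps to the output
# paths it is sent to. Names are invented for this sketch.
ROUTING = {
    "guitar_in": ["guitar_amp", "loop_record"],
    "bass_in":   ["bass_amp", "loop_record"],
    "loop_play": ["guitar_amp", "pa"],
    "samples":   ["pa"],
}

def destinations(source: str) -> list[str]:
    """Return the output paths a given input source is routed to."""
    return ROUTING.get(source, [])
```

The point of the matrix is that each signal can take a different path depending on the needs of the show, exactly as described above.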

In practice, this meant the setup functioned less like a standard band rig and more like a modular live production system. It had to remain playable and immediate on stage, while at the same time acting as a routing environment for loops, backing tracks, click-based structures, and synchronized cues.

One of the key elements in the setup was the ability to capture and replay live guitar and bass loops in real time. Instrument signals could be recorded directly into Ableton, looped live, and then routed back out to the guitar amplifier or bass amplifier. This kept the looped sound integrated with the stage sound of the live instruments, rather than treating it only as a PA signal.
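The record-then-replay behaviour at the heart of live looping can be reduced to a small buffer loop. This is only a sketch of that logic for a single mono channel; actual audio I/O and the Ableton integration are out of scope, and the class below is invented for illustration.

```python
class LiveLooper:
    """Minimal sketch of record-then-loop playback (one mono channel)."""

    def __init__(self):
        self.buffer = []       # recorded samples
        self.recording = False
        self.pos = 0           # playback position within the loop

    def start_record(self):
        self.buffer = []
        self.recording = True

    def stop_record(self):
        self.recording = False
        self.pos = 0

    def process(self, sample: float) -> float:
        """Process one incoming sample, returning the output sample."""
        if self.recording:
            self.buffer.append(sample)
            return sample                  # pass the live signal through
        if self.buffer:
            out = self.buffer[self.pos]    # replay the captured loop
            self.pos = (self.pos + 1) % len(self.buffer)
            return sample + out            # live signal plus loop layer
        return sample
```

Because the looped output is just another signal, it can be routed back to the instrument amplifiers rather than only to the PA, which is what kept the loops integrated with the stage sound.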

This routing approach also allowed processed sounds or special effects on guitar or bass to be sent either back to the instrument amplifiers or directly to the PA. As a result, the system offered a wide range of sonic options, from traditional amp-based tones to more produced or spatial effects in the main mix.

Most additional audio elements — such as samples, backing layers, and other non-amp-based signals — were played through the PA system.

An important part of this concept was that technology was not used to replace the live performance, but to extend it. The loops were created from played material on stage, often in the moment, and then fed back into the performance as an extra musical layer. That made the setup feel dynamic and alive, even when electronic elements were tightly integrated.

Because the system evolved over time, the routing was not designed from theory alone. It was shaped by rehearsal, live experience, technical problems, and the question: what works on stage without killing the flow of the performance? That hands-on process strongly influenced the final setup.

A second MacBook ran Resolume Avenue for live visuals. This machine was connected to an AKAI APC20, which was used to control and launch visual content during the set. A projector and screen were used to display the visuals on stage.

The visual system was linked to the music system through a Wi-Fi network and MIDI synchronization. The Ableton MacBook functioned as the MIDI master, sending MIDI clock and related trigger information to the Resolume MacBook, which ran as the MIDI slave.

Because of this setup, visual clips could be triggered in relation to musical events in Ableton. Images and film loops were assigned to MIDI-controlled actions, so individual audio samples or loops could be matched with specific visuals. MIDI synchronization also ensured that visual playback followed the same tempo structure as the music, keeping movement, rhythm, and BPM aligned during the show.
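Two small calculations capture how this synchronization works. MIDI clock runs at 24 pulses per quarter note (fixed by the MIDI specification), so the pulse interval follows directly from the tempo; and the note-to-clip pairing is just a lookup table. The note numbers and clip names below are invented for illustration, not the actual mapping used in the show.

```python
def clock_interval(bpm: float) -> float:
    """Seconds between MIDI clock pulses at a given tempo.

    MIDI clock is defined as 24 pulses per quarter note.
    """
    return 60.0 / (bpm * 24)

# Hypothetical mapping from MIDI note numbers (sent by Ableton)
# to visual clips in Resolume.
CLIP_MAP = {
    60: "film_loop_A",
    61: "film_loop_B",
    62: "strobe_layer",
}

def clip_for_note(note: int):
    """Return the visual clip triggered by a given MIDI note, if any."""
    return CLIP_MAP.get(note)
```

At 120 BPM this gives a pulse every 1/48 of a second, which is how the visual playback stays locked to the musical tempo regardless of which clips are triggered.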

This part of the show was also developed experimentally. The aim was not simply to play video behind the band, but to make visuals part of the performance language itself. The moving image had to react to structure, rhythm, and atmosphere in a way that felt connected to the music rather than decorative.

For smaller venues, much of the setup could run through the band’s own compact PA system, with guitar and bass remaining on their dedicated amplifiers.

For larger venues and festival stages, the system was expanded by sending the complete audio setup to the FOH mixer. This included direct outputs from the live system as well as microphone signals from the guitar amp, bass amp, and drum kit. In that situation, the live rig effectively functioned as a submix and playback system feeding the house engineer, who could then balance the full stage sound for the audience.

That scalability was important. The same core rig had to survive very different real-world situations, and the solution was never purely theoretical or studio-based. It was built to be usable, transportable, and adaptable under live conditions.

Technically, the ElectroBluesSociety setup was designed as a hybrid live system. It did not rely only on playback, and it was not just a conventional band setup with visuals added afterwards. Instead, instruments, loops, samples, effects, and visuals were integrated into one synchronized performance environment, combining raw live blues energy with controlled electronic production.

At the same time, the system should be understood as the result of an ongoing process rather than a final technical blueprint. It was built step by step, through experimentation, problem-solving, and, at times, simply by messing around until something useful emerged. There are probably more elegant solutions available now, but this setup reflects how ElectroBluesSociety developed its own working method: practical, hybrid, and shaped by live experience.