At Open Culture Tech, we are developing our own open-source, easy-to-use AI, Augmented Reality and Avatar creation tools for live music. In the spring of 2023, Open Culture Tech issued an Open Call inviting musicians in the Netherlands to join our development programme.
From a large number of applications, ten diverse artists were selected, ranging from punk to R&B and from folk to EDM, each with a unique research question that could be answered by applying new technology. Questions such as: “I’ve put my heart and soul into creating an EP that’s 15 minutes long. I want to perform this new work live on stage, but I need at least 30 minutes of music to create a full show.” Or: “How can I enhance the interaction between me and the audience when I play guitar on stage?”
Together with these artists, we (the Open Culture Tech team) come up with original answers to their questions, which we translate into open-source AI, AR or Avatar technology with easy-to-use interfaces. Each prototype is then tested during a live pilot show. After 10 shows, we will have tested a range of prototypes that we will combine into one toolkit. In this way, we aim to make the latest technology more accessible to every artist. Below, we share the results of our first two prototypes and live pilot shows.
OATS × Avatars
The first Open Culture Tech pilot show was created in close collaboration with OATS. Merging punk and jazz, OATS are establishing themselves through compelling and powerful live shows. Their question was: “How can we translate the lead singer’s expression into real-time visuals on stage?” To answer it, we decided to build and test the first version of our own Avatar creation tool, with the help of Reblika.
Unreal Engine is the global industry standard for real-time 3D production, used by many major companies in the music, gaming and film industries. Its learning curve is steep, however, and expert rates are high. Reblika is a Rotterdam-based 3D company with years of experience creating high-res avatars for the creative industry, and is currently developing its own avatar creation tool on top of Unreal Engine, called Reblium. For Open Culture Tech, Reblika is developing a free, stand-alone, open-source edition with an easy-to-use interface, aimed at live musicians.
The master plan was to capture the body movements of lead singer Jacob Clausen with a motion capture suit and link the signal to a 3D avatar in a 3D environment that could be projected live on stage. This way, we could experience what it’s like to use avatars on stage and find out which functionalities our Avatar creation tool would need. In this case, the aesthetic had to be dystopian, alienating and glitchy.
Our first step was to create a workflow for finding the right 3D avatar and 3D environment. OATS preferred a gloomy character in a hazmat suit, moving through an abandoned factory building. For the avatar, we used the Unreal Engine Marketplace, a website that offers ready-made 3D models. To create the 3D environment, Jacob Clausen used a tool called Polycam to scan an abandoned industrial area. Polycam is an easy-to-use 3D-scanning app built on LiDAR, a laser-based depth-sensing technology: it lets you scan any physical object or space and render it into a 3D model.
The 3D scan (factory) and avatar (hazmat suit) were imported into Unreal Engine and the avatar was connected to a motion capture suit. This allowed Jacob Clausen to become the main character on screen and test the experience live on stage at Popronde in EKKO in Utrecht, on 19 October at 23:30. What followed was a show that taught us a lot.
The venue provided us with a standard projector and a white screen behind the stage. Due to an overactive smoke machine, an unstable internet connection and the projector’s low resolution, the avatar was not always visible on screen. Nevertheless, there were certainly moments where everything came together. At those moments, the synchronization between Jacob and his avatar was super interesting, the storytelling was amazing and the technology showed a lot of potential.
The motion capture suit was very expensive and we had to borrow it from Reblika, which is neither sustainable, accessible nor inclusive. For our next prototype, we will look at camera-based motion capture AI, such as Rokoko Vision, instead of suits.
The 3D avatar and environment were shown from different camera angles. To make this possible, someone had to keep changing the camera angle in real time within the Unreal Engine software. Going forward, we should add predefined camera angles, so that no extra person is needed to control the visuals.
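The idea of predefined camera angles can be sketched in a few lines. This is a hypothetical illustration, not our actual Unreal Engine setup: the preset names, positions and timing are invented, and the logic simply cycles through a list of presets on a fixed interval so no operator is needed.

```javascript
// Hypothetical sketch: cycle through predefined camera angles on a timer,
// so no extra person is needed to control the visuals during the show.
// Preset names, positions and hold time are illustrative assumptions.

const ANGLES = [
  { name: "wide",     position: [0, 2.0, 8] }, // full-stage view
  { name: "close-up", position: [0, 1.6, 2] }, // the avatar's face
  { name: "low",      position: [0, 0.3, 4] }, // dramatic low angle
];

// Given the elapsed show time (seconds) and how long each angle is held,
// return the preset that should currently be active.
function pickAngle(elapsedSeconds, holdSeconds = 10) {
  const index = Math.floor(elapsedSeconds / holdSeconds) % ANGLES.length;
  return ANGLES[index];
}
```

A render loop would call `pickAngle` each frame and move the virtual camera to the returned position; a fancier version could switch angles on the beat instead of on a fixed timer.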
Ineffekt × AR
The second use case of Open Culture Tech was provided by Ineffekt. Through a blend of glistening vocal pieces, strings of dreamy melodies and distinctive rhythms, Ineffekt cleverly shapes a sound that feels both comfortable and elusive. Ineffekt’s question was: “How can I translate my album artwork into a virtual experience that could transform any location into an immersive video clip?” To answer it, we decided to build and test the first version of our own AR creation tool, with the help of Superposition, an innovative design studio for interactive experiences.
For his latest album artwork and music video, Ineffekt decided to use a 3D model of a greenhouse in which yellow organisms are growing. This 3D model formed the basis for the AR experience we tested during the Amsterdam Dance Event.
Our main goal was to create and test an intimate mobile AR experience built with open-source 3D technologies. This meant we couldn’t use popular proprietary AR tools like Spark AR (Meta), Snap AR (Snapchat) or ARCore (Google).
In our first experiment, Blender was used to create a high-res 3D model and WebXR was used to turn this model into a mobile Augmented Reality experience that runs in the browser. Superposition also experimented with App Clips on iOS and Play Instant on Android, techniques that let you launch a lightweight part of an app – after scanning a QR code – without installing the full app.
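The routing step behind such a QR code could look roughly like this. This is a hedged sketch, not Superposition's actual code: it assumes the caller has already awaited the real WebXR capability check, `navigator.xr.isSessionSupported("immersive-ar")`, and passes the result in, with a plain video as a last-resort fallback (an assumption on our part).

```javascript
// Hypothetical sketch of routing a visitor after they scan the QR code:
// prefer the open-source WebXR path, fall back to App Clips (iOS) or
// Play Instant (Android). The caller is assumed to have already awaited
// navigator.xr.isSessionSupported("immersive-ar") for the first flag.

function chooseExperience(webXrArSupported, userAgent) {
  if (webXrArSupported) return "webxr-ar";              // open-source path
  if (/iPhone|iPad/.test(userAgent)) return "app-clip"; // lightweight iOS app
  if (/Android/.test(userAgent)) return "play-instant"; // lightweight Android app
  return "video-fallback"; // e.g. a plain video of the AR scene
}
```

In practice the App Clip and Play Instant variants are invoked directly from the QR code by the OS, so a dispatcher like this would only run on the web side; it is shown here to make the decision tree explicit.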
On October 20 and 21, we tested our first AR prototype in front of Black & Gold in Amsterdam, during ADE. After scanning the QR code on a poster, the audience was taken to a mobile website that explained the project. Then your phone’s camera switched on and you’d see the yellow plants and fungi grow around you. In the back, a silent avatar sat quietly. The overall experience was poetic and intimate. As with OATS, we learned a lot.
It is possible to create an intimate and valuable experience with mobile Augmented Reality technology.
It is possible to create a mobile AR experience with open-source technology.
The experience was static and did not react to the presence of the viewer. Going forward, we should look into the possibilities of adding interactive elements.
Our ultimate goal is to develop accessible AI, AR and Avatar creation tools that any musician can use without our support. In the examples above, this was not yet the case: we have mainly tested workflows with existing tools rather than building our own. Going forward, we will start building and testing our own software interfaces and let artists create their own AI, AR and Avatar experiences from scratch. In this way, we hope to ensure that up-and-coming musicians also get the opportunities and resources to work with the latest technology.
This article was also published on MUSIC x on Thursday, 16 November 2023.