
By Eveline Vervliet

Volumetric Audio Capture

Abstract: Description of Zylia's 6DoF recording system for HOA (Higher-Order Ambisonics) and its software, with streaming and Binauralix applications.

Responsible: Prof. Dr. Marlon Schumacher, Eveline Vervliet

Introduction

The Zylia microphone is a 19-capsule microphone array for 3D/360° audio recording in third-order Ambisonics. It connects to the computer via a USB cable and is compact for transport.


Software

For the Zylia ZM-1 to function properly, you must install a driver. Download the driver for your operating system here. Zylia provides the following applications:

  • Zylia 6DoF Recording Application for recording with multiple Zylia microphones
  • Zylia Ambisonics Converter for converting from A- to B-format
  • Zylia Control Panel with information on the connected microphone
  • Zylia Streaming Application for setting up live audio streaming with the Zylia microphone
  • Zylia Studio for recording with one Zylia microphone

Download the software here. Note that licenses are required.


Workflow

Recording

Recording with the Zylia microphone can be done either in the standalone application Zylia Studio or in a DAW with the Zylia Studio Pro audio plugin. Reaper is the recommended DAW.

 

Conversion

To use the recordings on other platforms or in applications such as videos, they have to be converted to Ambisonics B-format. You can use either the standalone Zylia Ambisonics Converter application or the plugin version.

Zylia conversion software

There are several standards in the Ambisonics world for channel ordering and normalization. The most widely used is the ambiX standard, for which you choose the settings channel ordering 'ACN' and normalization 'SN3D'. The following video from ZYLIA explains the workflow for converting a recording.
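As a quick numerical reference (an illustration in Python, not part of the Zylia tools), the sketch below lists the 16 channels of a third-order ambiX file with their ACN indices and SN3D scaling factors, using the standard definitions ACN = l² + l + m and SN3D = sqrt((2 − δ(m,0)) · (l − |m|)! / (l + |m|)!):

```python
from math import factorial, sqrt

def acn_index(l, m):
    """ACN channel index of the spherical harmonic with order l, degree m."""
    return l * l + l + m

def sn3d_norm(l, m):
    """SN3D normalization factor for order l, degree m."""
    delta = 1 if m == 0 else 0
    return sqrt((2 - delta) * factorial(l - abs(m)) / factorial(l + abs(m)))

# A third-order ambiX file has (3 + 1)**2 = 16 channels, ordered by ACN.
for l in range(4):
    for m in range(-l, l + 1):
        print(f"ACN {acn_index(l, m):2d}: order {l}, degree {m:+d}, "
              f"SN3D factor {sn3d_norm(l, m):.3f}")
```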


Stream on Zoom with the Zylia Microphone


Stream on Zoom with multiple speakers

 

Download Reaper session template


Use recording with Binauralix + BITalino R-IoT

The raw recording from the Zylia microphone contains 19 channels, one per capsule. The converted third-order B-format file has 16 channels ((3 + 1)² = 16). First play back the B-format file in software such as MultiPlayer-mini before routing it into the open-source software Binauralix.

In the following demonstration video, I open the third-order B-format of a Zylia recording in MultiPlayer-mini and send it to Binauralix over BlackHole. Then I communicate with Binauralix over OSC from Max. This way, I can use the BITalino R-IoT sensor to control the listening orientation in Binauralix in real time.
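The same kind of OSC control can also be scripted outside Max. The Python sketch below (using the python-osc package) is only an illustration: the host, port and the '/orientation' address are assumptions, so check Binauralix's actual OSC namespace before using it.

```python
from pythonosc.udp_client import SimpleUDPClient

# Host and port are assumptions; set them to wherever Binauralix listens for OSC.
client = SimpleUDPClient("127.0.0.1", 9000)

def send_orientation(yaw, pitch, roll):
    """Send a head orientation in degrees to the binaural renderer.
    The address "/orientation" is a placeholder, not the documented namespace."""
    client.send_message("/orientation", [yaw, pitch, roll])

send_orientation(45.0, 0.0, 0.0)  # turn the listening orientation by 45 degrees
```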

 

 

Download the Max patch

Read this blog article for more information on the BITalino R-IoT sensor.


Research

Zylia's white paper contains the most important information on recording and post-processing with the Zylia microphone: download

The same folder contains two more papers, on Ambisonics recordings and on A- to B-format conversion.


Links to documentation

Zylia documentation
Zylia software

By Eveline Vervliet

Conductor Gesture Recognition via ML techniques

Abstract: Description of the inertial motion tracking system BITalino R-IoT and its software

Responsible: Prof. Dr. Marlon Schumacher, Eveline Vervliet

Introduction

In this blog post, I explain how machine learning techniques can be used to recognize specific conductor gestures sensed via the BITalino R-IoT platform in Max. The goal of this article is to enable you to create an interactive electronic composition for a conductor in Max.

For more information on the BITalino R-IoT, check the previous blog article.

 

This project is based on research by Tommi Ilmonen and Tapio Takala. Their article 'Conductor Following with Artificial Neural Networks' can be downloaded here. It can serve as an important lead for further development of this project.


Demonstration Patches

In the following demonstration patches, I build on the example patches from the previous blog post, which are based on Ircam's examples. To detect a conductor's gestures, we need two sensors, one for each hand. You can then either train the gestures with both hands combined or train a model for each hand separately.

Detect static gestures with 2 hands combined

When training both hands combined, there are only a few changes we need to make to the patches for one hand.

First of all, we need a second [bitalino-riot] object. You can double-click on the object to change its ID. Most likely, you will have set sensor 1 to ID 0 and sensor 2 to ID 1. The data from both sensors are joined into one list.

In the [p mubu.gmm] subpatch, you have to change the @matrixcols parameter of the [mubu.record] object depending on the number of values in the list. In the example, two accelerometer data lists of three values each are joined, so we need six columns, as illustrated below.
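As a plain-data illustration (independent of Max), joining one accelerometer frame from each hand gives a six-value vector per frame, which is why the recording matrix needs six columns:

```python
import numpy as np

# One accelerometer frame (x, y, z) per hand; the values are arbitrary examples.
left_acc = np.array([0.02, -0.98, 0.10])
right_acc = np.array([0.15, 0.05, -0.99])

# Joined, each frame becomes a single row of six values, hence @matrixcols 6.
frame = np.concatenate([left_acc, right_acc])
print(frame.shape)  # -> (6,)
```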

The rest of the process is exactly the same as in previous patches: we need to record two or more different static postures, train the model, and then click play to start the gesture detection.

Download Max patch

Download Max patch with training example
Download training data

Detect static gestures with 2 hands separately

When training both hands separately, the training process becomes a bit more complex, although most steps remain the same. There is now a separate model for each hand, which has to be trained on its own. You can see the models in the [p mubu.gmm-left] and [p mubu.gmm-right] subpatches. A switch object routes the training data to the correct model.
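This structure can be sketched in Python, with scikit-learn models standing in for the two MuBu subpatches and a routing function playing the role of the switch object. This is only an outline of the idea under those assumptions, not the patch's implementation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# One dictionary of posture models per hand; each posture gets its own small GMM.
models = {"left": {}, "right": {}}

def train(hand, posture, frames):
    """Fit a model for one posture of one hand; frames is an (n_samples, 3) array."""
    models[hand][posture] = GaussianMixture(n_components=1).fit(np.asarray(frames))

def classify(hand, frame):
    """Route a live frame to the models of the given hand (the 'switch') and
    return the posture whose model assigns it the highest likelihood."""
    x = np.asarray(frame).reshape(1, -1)
    return max(models[hand], key=lambda p: models[hand][p].score(x))
```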

Download Max patch

Download Max patch with training example
Download training data

In the above example, I personally found training both hands separately to be the most efficient approach: even though the training process took slightly longer, the programming afterwards was much easier. Depending on your situation, you will have to decide which patch makes the most sense to use. Experimentation can help in determining this.

Detect dynamic gestures with 2 hands

Detecting dynamic gestures with both hands follows the same principles as the examples above. You can download the two Max patches here:

Download Max patch mubu.hhmm with two hands combined
Download Max patch mubu.hhmm with two hands separate


Research

The tools mentioned above can be used to detect ancillary gestures of musicians in real time, which in turn could influence a musical composition or improvisation. Ancillary gestures are "musician's performance movements which are not directly related to the production or sustain of the sound" (Lähdeoja et al.) but are believed to have an impact both on the sound production and on the perceived performative aspects. Wanderley also refers to these as 'non-obvious performer gestures'.

In a subsequent article, Marlon Schumacher worked with Wanderley on a framework for integrating gesture data in computer-aided composition. The result is the Open Music library OM-Geste. This article is a helpful example of how the data can be used artistically.

Links to articles:

  • Marcelo M. Wanderley – Non-obvious Performer Gestures in Instrumental Music download
  • O. Lähdeoja, M. M. Wanderley, J. Malloch – Instrument Augmentation using Ancillary Gestures for Subtle Sonic Effects download
  • M. Schumacher, M. Wanderley – Integrating gesture data in computer-aided composition: A framework for representation, processing and mapping download

Detecting gestures in musicians has been a much-researched topic in recent decades. This folder holds several other articles on the topic that may be of interest.


Links to documentation

Demonstration videos and Max patches made by Eveline Vervliet

Official R-IoT documentation

Max patches by Ircam and other software

The folder with all the assembled information regarding the Bitalino R-IoT sensor can be found here.

This link leads to the official Data Sheet from Bitalino.

 

By Eveline Vervliet

Inertial Motion Tracking with BITalino R-IoT

Abstract: Description of the inertial motion tracking system BITalino R-IoT and its software

Responsible: Prof. Dr. Marlon Schumacher, Eveline Vervliet

Introduction to the BITalino R-IoT sensor

The R-IoT module (IoT stands for Internet of Things) from BITalino includes several sensors for calculating the position and orientation of the board in space. It can be used for a wide range of artistic applications, most notably gesture capturing in the performing arts. The sensor's data is sent over WiFi and can be received via the OSC protocol.

The R-IoT sensor outputs the following data:

  • Accelerometer data (3-axis)
  • Gyroscope data (3-axis)
  • Magnetometer data (3-axis)
  • Temperature of the sensor
  • Quaternions (4-axis)
  • Euler angles (3-axis)
  • Switch button (0/1)
  • Battery voltage
  • Sampling period

The accelerometer measures the sensor's acceleration along the x, y and z axes. The gyroscope measures the sensor's rate of rotation around each axis. The magnetometer measures the sensor's orientation relative to the earth's magnetic field. Euler angles and quaternions describe the sensor's orientation in space.
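The relation between the two orientation representations can be made concrete with the standard conversion from a unit quaternion to yaw, pitch and roll. Note that the rotation convention used below (ZYX) is an assumption and may differ from the Euler angles the R-IoT firmware itself outputs:

```python
from math import atan2, asin

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to (yaw, pitch, roll) in radians, ZYX convention."""
    roll = atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return yaw, pitch, roll

print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # identity rotation -> (0.0, 0.0, 0.0)
```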

The sensor has been explored and used by the Sound Music Movement Interaction (ISMM) team at Ircam. They have distributed several example patches for receiving and using data from the R-IoT sensor in Max. The example patches mentioned in this article are based on these.

The sensor can be used with all programs that can receive OSC data, like Max and Open Music.
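Because the data arrives as plain OSC messages, it can be captured in any such environment. A minimal Python receiver using the python-osc package could look like the sketch below; the listening port is a placeholder and has to match the port configured on the R-IoT module:

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def handle_frame(address, *values):
    # Each message carries one frame of sensor data (accelerometer, gyroscope, ...).
    print(address, values)

dispatcher = Dispatcher()
dispatcher.set_default_handler(handle_frame)  # react to every incoming OSC address

# Port 8888 is an assumption; use the port the R-IoT is configured to send to.
server = BlockingOSCUDPServer(("0.0.0.0", 8888), dispatcher)
server.serve_forever()
```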

 

Max patches by Ircam and other software
software/
  motion-analysis-max-master/
    max-bitalino-riot/
      bitalino-riot-analysis-example.maxpat
    max-motion-features/
      freefall.maxpat
      intensity.maxpat
      kick.maxpat
      shake.maxpat
      spin.maxpat
      still.maxpat
    README.md


Demonstration Videos

In the following demonstration videos and example patches, we use Ircam's Mubu library in Max to record gestures with the sensor, visualise the data and train a machine learning algorithm to detect distinct postures. The 'Mubu for Max' library can be downloaded via the Max Package Manager.

Mubu.gmm example patch

Detect static gestures with mubu.gmm

First, we use the GMM (Gaussian mixture model) with the [mubu.gmm] object. This model is used to detect static gestures. We use the accelerometer data to record three different hand postures.
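The idea behind this step can also be sketched outside Max. In the Python fragment below, scikit-learn's GaussianMixture stands in for [mubu.gmm]: one small model is fitted per recorded posture, and an incoming accelerometer frame is assigned to the posture whose model gives the highest likelihood. The recordings are synthetic placeholders, and this is only an illustration of the technique, not the MuBu implementation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Placeholder accelerometer recordings (x, y, z) for three hand postures.
recordings = {
    "palm_up":   np.random.normal([0.0,  1.0, 0.0], 0.05, size=(200, 3)),
    "palm_down": np.random.normal([0.0, -1.0, 0.0], 0.05, size=(200, 3)),
    "fist_side": np.random.normal([1.0,  0.0, 0.0], 0.05, size=(200, 3)),
}

# Train one Gaussian mixture model per posture.
models = {name: GaussianMixture(n_components=2).fit(data)
          for name, data in recordings.items()}

def classify(frame):
    """Return the posture whose model assigns the highest likelihood to the frame."""
    x = np.asarray(frame).reshape(1, -1)
    return max(models, key=lambda name: models[name].score(x))

print(classify([0.02, 0.97, -0.01]))  # -> "palm_up"
```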

Download Max patch

 

Detect dynamic gestures with mubu.hhmm

The HHMM (hierarchical hidden Markov model) can be used through the [mubu.hhmm] object to detect dynamic (i.e. moving) hand gestures.
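The principle can again be sketched in Python, here with the hmmlearn package standing in for the Max object: one HMM is trained per gesture from example movement sequences, and an unknown movement is assigned to the model that explains it best. This uses a plain (non-hierarchical) HMM, so it only illustrates the general approach, not mubu.hhmm itself:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

models = {}  # one HMM per dynamic gesture

def train(gesture, sequences):
    """sequences: list of (n_frames, 3) accelerometer arrays recorded for one gesture."""
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    models[gesture] = GaussianHMM(n_components=5).fit(X, lengths)

def classify(sequence):
    """Return the gesture whose model best explains the recorded movement."""
    return max(models, key=lambda g: models[g].score(np.asarray(sequence)))
```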

Download Max patch

 

Detect dynamic gestures with Mubu Gesture Follower

The Gesture Follower (GF) is a separate tool from the Mubu library that can be used in gesture recognition applications. In the following video, the same movements are trained as in the Mubu.hhmm demonstration so we can easily compare both methods.
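To give a rough feel for the template-following idea without the MuBu tools, the sketch below compares a performed movement against stored template gestures with dynamic time warping and returns the closest one. The real Gesture Follower aligns the gesture while it is still being performed, so this offline comparison only illustrates the underlying principle:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two (n_frames, n_dims) sequences."""
    a, b = np.asarray(a), np.asarray(b)
    cost = np.full((len(a) + 1, len(b) + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[len(a), len(b)]

def follow(sequence, templates):
    """Return the name of the stored template closest to the performed movement."""
    return min(templates, key=lambda name: dtw_distance(sequence, templates[name]))
```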

Download Max patch

 

Gesture detection and vocalization with Mubu in Max for the Bitalino R-IoT

The [mubu.xmm] object uses hierarchical multimodal hidden Markov models for gesture recognition. In the following demonstration video, gestures and audio are recorded simultaneously. After training, a gesture triggers its accompanying audio recording. The sound is played back via granular or concatenative synthesis.
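As a conceptual sketch only (the real processing happens inside Max through mubu.xmm and its granular or concatenative players), the Python fragment below associates each gesture label with an audio buffer and renders a simple granular version of it when that gesture is recognized. The buffer contents and grain parameters are arbitrary placeholders:

```python
import numpy as np

sr = 44100
# Placeholder audio buffer associated with one trained gesture label.
buffers = {"wave": np.sin(2 * np.pi * 440 * np.arange(sr) / sr)}

def granular_render(buf, n_grains=200, grain_len=2048):
    """Overlap-add randomly chosen, windowed grains from buf into a new signal."""
    out = np.zeros(n_grains * grain_len // 2 + grain_len)
    window = np.hanning(grain_len)
    for i in range(n_grains):
        start = np.random.randint(0, len(buf) - grain_len)
        pos = i * grain_len // 2
        out[pos:pos + grain_len] += buf[start:start + grain_len] * window
    return out

def on_gesture(label):
    """Called when the classifier recognizes a gesture; returns the rendered audio."""
    return granular_render(buffers[label])
```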

Download Max patch with granulator
Download Max patch with concatenative synthesis

Download static training data
Download dynamic training data

 


Links to documentation

Demonstration videos and Max patches made by Eveline Vervliet

Official R-IoT documentation

The folder with all the assembled information regarding the Bitalino R-IoT sensor can be found here.

This link leads to the official Data Sheet from Bitalino.


Videos from Ircam

An example of an artistic application from Ircam on YouTube