Charles Martin

Blog - page 2

Aug 13, 2019

Commune 6 at Sideway

Christina and I are playing as Ensemble Metatone again next week (it’s been a while!) at Hellosquare Recordings’ Commune #6 alongside Soft Hollow (Ben Harb - CBR) and Thomas Meadowcroft, an Australian composer visiting from Berlin.

read more

Jul 17, 2018

Bela, Myo, and Standstill at NIME2018

In June I travelled to NIME2018 at Virginia Tech to present some of the work from the RITMO centre and EPEC project at the University of Oslo. This year, our NIME presentations were focussed on “standstill performance”—where participants have to stand as still as possible to create sound. In previous years, our group had created standstill performances using motion capture in the lab, but our new work was on ways to do this at live events, and even in installations, using the Bela single-board computer.
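
The core mapping can be sketched in a few lines: estimate how much the performer is moving, and make the sound louder the stiller they stand. The snippet below is only a schematic Python illustration with assumed values (the smoothing factor, threshold, and fake accelerometer data are mine); the actual performances run as real-time audio code on the Bela board with Myo and other motion sensors.

```python
import numpy as np

def quantity_of_motion(accel, alpha=0.1):
    """Exponentially smoothed magnitude of frame-to-frame acceleration change."""
    level, prev = 0.0, accel[0]
    qom = np.zeros(len(accel))
    for i, sample in enumerate(accel):
        level = (1 - alpha) * level + alpha * np.linalg.norm(sample - prev)
        prev = sample
        qom[i] = level
    return qom

def stillness_to_amplitude(qom, threshold=0.05):
    """The quieter the movement, the louder the sound: invert and clip."""
    return np.clip(1.0 - qom / threshold, 0.0, 1.0)

# 100 fake accelerometer frames (x, y, z): mostly still, with one sudden jolt.
accel = np.random.normal(0.0, 0.005, size=(100, 3))
accel[50] += 0.5  # the jolt briefly silences the sound
amp = stillness_to_amplitude(quantity_of_motion(accel))
```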

read more

Sep 12, 2017

Neural Network Ensembles in London and Representing Collaborative Interaction

I recently had the chance to present a paper about my “Neural iPad Ensemble” at the Audio Mostly conference in London. The paper discusses how machine learning can help to model and create free-improvised music on new interfaces, where the rules of music theory may not fit. I described the Recurrent Neural Network (RNN) design that I used to produce an AI iPad ensemble that responds to a “lead” human performer. In the demonstration session, I set up the iPads and RNN and had lots of fun jamming with the conference attendees.
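
As a rough illustration of the kind of model involved (this is not the published architecture; the gesture vocabulary size, window length, and layer sizes below are all assumptions), a recurrent network can read a window of the lead performer’s per-second gesture classifications and predict a following gesture for an “AI” ensemble member:

```python
import tensorflow as tf

NUM_GESTURES = 9   # assumed gesture vocabulary size
SEQ_LEN = 30       # assumed window: 30 one-second gesture classifications

# A small sequence-classification LSTM: given the lead performer's recent
# gesture sequence, predict the next gesture for one ensemble member.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN,)),
    tf.keras.layers.Embedding(NUM_GESTURES, 16),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```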

read more

Jun 22, 2017

MicroJam at Boost

We presented MicroJam this week at the Boost Technology and Equality in Music Conference at Sentralen, Oslo. The conference arranged a Tech Showcase session in Hvelvet, Sentralen’s old bank vault, with developers of music apps, synthesisers, robots and education software.

read more

May 29, 2017

Music Tech at IFI

We recently hosted a music technology event at the Department of Informatics to gather together researchers and students from the University of Oslo to see performances and demonstrations of current research.

read more

May 3, 2017

Performing with a Neural Touch-Screen Ensemble

Since about 2011, I’ve been performing music with various kinds of touch-screen devices in percussion ensembles, new music groups, improvisation workshops, and installations, as well as with my dedicated iPad group, Ensemble Metatone. Most of these events were recorded, and detailed touch and gestural information was collected, including classifications of each ensemble member’s gestures every second of each performance. Since moving to Oslo, however, I don’t have an iPad band! This leads to the question: Given all this performance data, can I make an artificial touch-screen ensemble using deep neural networks?
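
To give a sense of the shape of this data (with invented numbers; the real corpus, gesture labels, and ensemble sizes differ), each performance can be viewed as a table of per-second gesture classifications, one column per player, which slices naturally into training examples for a sequence model:

```python
import numpy as np

# Illustrative gesture codes: one integer label per performer per second.
# Rows: seconds of a performance; columns: lead performer + three others.
performance = np.random.randint(0, 9, size=(600, 4))  # a fake 10-minute set

def make_training_windows(perf, window=30):
    """Slice a performance into (lead-gesture window, ensemble response) pairs."""
    X, y = [], []
    for t in range(window, len(perf)):
        X.append(perf[t - window:t, 0])  # lead performer's last `window` gestures
        y.append(perf[t, 1:])            # the other players' gestures at time t
    return np.array(X), np.array(y)

X, y = make_training_windows(performance)
print(X.shape, y.shape)  # (570, 30) (570, 3)
```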

read more