Centre for Digital Music

Adam Stark

Contact Details

Title: Research Student
Tel: Internal: [13] 5528 / National: 020 7882 5528 / International: +44 20 7882 5528
Fax: National: 020 7882 7997 / International: +44 20 7882 7997
Email: adam.stark@elec.qmul.ac.uk
Room: Eng. 112

Research Group: Centre for Digital Music

Supervisor: Mark Plumbley

Research Topic: Musical Audio Analysis for Real-Time Interaction

In recent years, much research has been carried out into designing musical systems that can 'interact' with their users in a useful and meaningful way. This research has included systems for intelligent autonomous composition from some human input; examples include Experiments in Musical Intelligence (Cope, 1996) and The Continuator (Pachet, 2002).

Much of this past research has been into systems that do not operate in real-time, or that operate on symbolic musical representations such as MIDI. However, recent developments in audio signal processing algorithms, including pitch detection, beat tracking, chord analysis and key signature detection, combined with improvements in processing power, make it possible to build interactive systems that operate on audio in real-time, live performance environments.
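
To make the frame-based nature of such algorithms concrete, the following is a minimal sketch of real-time monophonic pitch detection by autocorrelation, written in Python with NumPy. The frame size, sample rate and function names are illustrative assumptions, not taken from any system described on this page.

import numpy as np

SAMPLE_RATE = 44100
FRAME_SIZE = 2048  # roughly 46 ms of audio at 44.1 kHz

def detect_pitch(frame, sr=SAMPLE_RATE, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental frequency (Hz) of one audio frame."""
    frame = frame - np.mean(frame)            # remove any DC offset
    corr = np.correlate(frame, frame, mode="full")
    corr = corr[len(corr) // 2:]              # keep non-negative lags only
    lag_min = int(sr / fmax)                  # shortest period of interest
    lag_max = int(sr / fmin)                  # longest period of interest
    lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sr / lag

In a live setting, successive frames would be read from the audio input buffer and passed to this function; the frame size trades latency against the lowest detectable pitch.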

Some recent work in this area includes the B-Keeper automatic accompaniment system (Robertson & Plumbley, 2007) and audio effects that employ beat tracking to allow semantic control of parameters (Stark et al., 2007). Recent research has also focused on the ability of a machine to extract perceptual features from a signal that reflect the properties heard by a human listener (Scheirer, 2000). Similar techniques have been used to build a framework for machine musicianship (Jehan, 2005).

This project aims to examine broadly what information can be extracted from a musical audio signal in real-time, and how that information can be used to build interactive systems. It also aims to determine how much information must be extracted to allow meaningful musical interaction. Features proposed for real-time extraction include pitch (monophonic and polyphonic), amplitude, timbre, rhythm (beat and time signature), chords, key signature and melody. Using this information, an attempt will be made to model some of the stylistic features of a musical performance.
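
As a rough illustration of per-frame feature extraction, the sketch below computes three of the features named above: amplitude (RMS), a crude timbre descriptor (spectral centroid) and a 12-bin chroma vector suitable as input to chord and key analysis, again in Python with NumPy. The exact features and parameters are illustrative assumptions.

import numpy as np

def frame_features(frame, sr=44100):
    """Return RMS amplitude, spectral centroid and a 12-bin chroma vector."""
    rms = np.sqrt(np.mean(frame ** 2))                      # amplitude
    mag = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    centroid = np.sum(freqs * mag) / (np.sum(mag) + 1e-12)  # timbre proxy
    chroma = np.zeros(12)                                   # pitch-class energies
    audible = freqs > 20.0                                  # skip sub-audio bins
    midi = 69 + 12 * np.log2(freqs[audible] / 440.0)        # bin freq -> MIDI note
    np.add.at(chroma, np.round(midi).astype(int) % 12, mag[audible])
    return rms, centroid, chroma / (chroma.max() + 1e-12)

A chord or key estimator would then compare successive chroma vectors against templates for each chord type or key profile.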

This framework will allow further exploration of the possibilities for musical interaction when a rich variety of meaningful musical and perceptual information is available to a real-time system. Proposed applications to be built upon this framework include: automatic accompaniment systems that adapt a prepared piece of music or a sample to a human performer in real-time, based upon their performance; and systems for live generative musical accompaniment, based upon machine learning algorithms, in which music is generated autonomously by a machine in response to a human performance. Other potential applications include the easier integration of recorded music and audio samples into live performance, and interactive audio effects (a sketch of one such effect follows).
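
As one concrete example of an interactive audio effect, the sketch below applies a delay whose time is locked to the beat period reported by a beat tracker. This is a generic illustration of the beat-synchronous idea, not the published method of Stark et al. (2007); the beat period is assumed to be supplied externally by a tracker.

import numpy as np

def beat_synced_delay(x, sr, beat_period_s, note_fraction=0.5, mix=0.4):
    """Mix in a copy of x delayed by a fraction of the beat period."""
    delay = max(1, int(beat_period_s * note_fraction * sr))
    y = np.copy(x)
    y[delay:] += mix * x[:-delay]   # e.g. an eighth-note echo at 0.5
    return y

# A 120 BPM performance has a 0.5 s beat period:
# wet = beat_synced_delay(dry_audio, 44100, beat_period_s=0.5)

As the tracked tempo changes, the delay time would be recomputed so that the effect stays locked to the performance.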

Publications

A. M. Stark and M. D. Plumbley. Performance following: Tracking a performance without a score. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2010), Dallas, TX, USA, pp. 2482–2485, March 2010.

A. M. Stark, M. E. P. Davies and M. D. Plumbley. Real-Time Beat-Synchronous Analysis of Musical Audio. In Proceedings of the 12th International Conference on Digital Audio Effects (DAFx-09), Como, Italy, September 1–4, 2009.

A. M. Stark and M. D. Plumbley. Real-time chord recognition for live performance. In Proceedings of the 2009 International Computer Music Conference (ICMC 2009), Montreal, Canada, August 16–21, 2009.

A. M. Stark, M. E. P. Davies and M. D. Plumbley. Rhythmic analysis for real-time audio effects. In Proceedings of the 2008 International Computer Music Conference (ICMC 2008).

A. M. Stark, M. D. Plumbley and M. E. P. Davies. Audio effects for real-time performance using beat tracking. In Proceedings of the 122nd AES Convention, Convention Paper 7156, Vienna, Austria, May 5–8, 2007.

A. M. Stark, M. D. Plumbley and M. E. P. Davies. Real-time beat-synchronous audio effects. In Proceedings of New Interfaces for Musical Expression (NIME 2007), New York, NY, USA, June 6–10, 2007, pp. 344–345.