Centre for Digital Music

 

Reducing Artifacts Between Source and Microphone in Live Sound

Alice Clifford

This project investigates the reduction of artifacts between source and microphone in live sound. For the purpose of this research, artifacts are defined as unexpected and undesired audible sounds introduced into the microphone output signal. Examples include comb filtering, microphone bleed, microphone delays, reverberation and the proximity effect. Previous work on reducing microphone artifacts has been aimed only at voice applications; in contrast, this research focuses on live musical applications. Work so far includes finding the delays of multiple active sources and an investigation into the effect of signal bandwidth on the accuracy of time delay estimation.
Further research is proposed into the proximity effect and undesired reverberation artifacts. The proximity effect will be researched by understanding its causes and exploring the possibility of reducing it, an aspect yet to be investigated. Undesired reverberation artifacts are proposed to be tackled using echo cancellation, a technique traditionally applied to mobile phone technology.
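Echo cancellation of the kind mentioned above is typically built on an adaptive filter that learns the path from a reference signal to the echo it produces, then subtracts the estimated echo. As a minimal sketch (not the project's implementation), the following normalised least mean squares (NLMS) filter identifies a hypothetical echo path on synthetic signals; the filter length, step size and echo impulse response are illustrative assumptions:

```python
import numpy as np

def nlms_cancel(d, x, order=32, mu=0.5, eps=1e-6):
    """Subtract from d the component explained by x through an
    adaptively identified FIR "echo path" (NLMS algorithm)."""
    w = np.zeros(order)          # adaptive filter coefficients
    e = np.zeros(len(d))         # echo-cancelled output
    for n in range(order, len(d)):
        u = x[n - order + 1:n + 1][::-1]    # newest-first reference window
        y = w @ u                           # current echo estimate
        e[n] = d[n] - y                     # residual after cancellation
        w += mu * e[n] * u / (u @ u + eps)  # normalised coefficient update
    return e

# Synthetic demo: d is x passed through a short, hypothetical echo path
rng = np.random.default_rng(1)
x = rng.standard_normal(20000)
path = np.array([0.0, 0.5, 0.0, -0.3])      # illustrative impulse response
d = np.convolve(x, path)[:len(x)]
residual = nlms_cancel(d, x)
# After convergence the residual is far quieter than the raw echo signal d
```

In a real live-sound setting the reference would be the loudspeaker feed and d the microphone signal; the sketch above only demonstrates the adaptation mechanism.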

Comb filtering

Comb filtering occurs when a signal is summed with a delayed version of itself. Cancellation and reinforcement occur periodically across frequency, producing a comb-shaped frequency response. The resultant sound can be described as 'thin' and 'phasey' and is the basis of flanging and phasing effects. Common practice in both live sound and studio recording is to record a single sound source with multiple microphones. Sound radiates from an instrument in all directions, but the sound differs depending on the microphone's position around the instrument. For this reason, multiple microphones can be used to pick up different qualities of an instrument and mixed together to create the desired sound. The sound from the instrument arrives at each microphone with a different delay. If these microphone signals are then mixed, a signal is summed with a delayed version of itself, causing comb filtering.
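The comb shape can be seen directly from the transfer function of summing a signal with a copy delayed by tau: H(f) = 1 + e^(-j2*pi*f*tau), which has nulls at odd multiples of 1/(2*tau) and 6 dB peaks at multiples of 1/tau. A small numeric sketch (the 1 ms delay is an arbitrary example, not a figure from this research):

```python
import numpy as np

fs = 48000        # sample rate, Hz
tau = 0.001       # assumed 1 ms path difference between the two signals

# y(t) = x(t) + x(t - tau)  =>  H(f) = 1 + exp(-j*2*pi*f*tau)
f = np.linspace(0, fs / 2, 4801)          # 5 Hz grid up to Nyquist
mag = np.abs(1 + np.exp(-2j * np.pi * f * tau))

# Nulls (full cancellation) at odd multiples of 1/(2*tau): 500, 1500 Hz, ...
# Peaks (6 dB reinforcement) at multiples of 1/tau: 0, 1000, 2000 Hz, ...
```

Plotting `mag` against `f` reproduces the characteristic comb: a null every 1000 Hz starting at 500 Hz for a 1 ms delay.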
Delayed signals can also arise from microphone bleed. In a simple microphone configuration, a number of microphones are placed facing the desired sound sources. Each microphone may also pick up an undesired sound source, which arrives delayed relative to the microphone aimed at it. Again, summing these microphone signals sums a signal with a delayed copy of itself, causing comb filtering.
When multiple microphones are used to reproduce a single source, the delay between the source arriving at each microphone can be estimated by analysis of the microphone signals. Our research so far has looked into ways to improve this estimation so that it can work in real-time in a noisy, reverberant environment. That is, we aim to prevent comb filtering in live music events.
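A common family of techniques for this kind of delay estimation is generalized cross-correlation; the phase transform (GCC-PHAT) weighting whitens the cross-spectrum, which tends to help in reverberant conditions. The sketch below is an illustration of the general approach on synthetic signals, not the project's own algorithm, and the 40-sample delay is an invented test value:

```python
import numpy as np

def gcc_phat(sig, ref):
    """Estimate the delay (in samples) of sig relative to ref via GCC-PHAT."""
    n = len(sig) + len(ref)
    R = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
    R /= np.abs(R) + 1e-12           # phase transform: whiten cross-spectrum
    cc = np.fft.irfft(R, n=n)
    max_lag = n // 2
    cc = np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))  # centre zero lag
    return int(np.argmax(np.abs(cc))) - max_lag

# Synthetic example: one source arriving at two microphones 40 samples apart
rng = np.random.default_rng(0)
src = rng.standard_normal(4096)
mic1 = src
mic2 = np.concatenate((np.zeros(40), src))[:4096]   # delayed copy

lag = gcc_phat(mic2, mic1)                          # recovers the 40 samples
# Advancing mic2 by the estimated lag before mixing avoids comb filtering
aligned = np.concatenate((mic2[lag:], np.zeros(lag)))
```

Summing `aligned` with `mic1` then reinforces rather than cancels, which is the goal of the comb filter reduction described above.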

Demonstration videos

Comb filter reduction of real recordings using time delay estimation

Comb filter reduction of moving sources using time delay estimation

The effect of window shape on the accuracy of time delay estimation for comb filter reduction

Publications

Alice Clifford and Josh Reiss, "Effects of bandwidth limited signals on delay estimation for reduction of comb filtering," Art of Record Production Conference, Leeds, 2010

Alice Clifford and Josh Reiss, "Calculating time delays of multiple active sources in live sound," 129th AES Convention, San Francisco, 2010

See also

Enrique Perez Gonzalez and Josh Reiss, "Determination and correction of individual channel time offsets for signals involved in an audio mixture," 125th AES Convention, San Francisco, USA, October 2008