For Timps and Tape v1.1

A visualisation of Jan Bradley’s For Timps and Tape based on drum skin
vibrational modes, as performed by Gravity Percussion


Jan Bradley’s For Timps and Tape is a challenging, contemporary accompanied solo for five timpani – an original musical score performed against a pre-recorded backing track. An interpretation of the piece integrating visualisation was first performed live by Gravity Percussion and Monomatic at a Royal Northern College of Music (RNCM) Spotlight Concert in March 2014 – https://vimeo.com/90468260. This latest version uses audio recordings and multi-camera footage from a studio session at RNCM in December 2014.

The visualisation, overlaid on the performance video edit, uses Bessel functions from the GNU Scientific Library (GSL) to create virtual models of five drum skins and display a series of their vibrational modes – emphasised periodically through the use of Moiré patterns. Essentially the visualisation is a construct – illustrating yet abstracting the real-world behaviour of timpani by mapping each note value within the score to a discrete pattern within the first 13 vibrational modes of an ideal drum skin. In reality, all of these modal vibrational states (and more) would be present simultaneously on each drum skin and contribute to its pitch and timbre – this technique spotlights just a single mode at a time. Despite being an ideal – no real-world drum skin would behave as perfectly as this – it proves effective in visualising the dynamics, pace and physicality of the percussion performance.
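
As a rough illustration of the underlying model – a minimal C sketch, assuming only that GSL is installed, of how a single (m, n) mode of an ideal circular membrane can be evaluated: m counts nodal diameters, n counts nodal circles, and clamping the skin at the rim means the radial profile is the Bessel function J_m scaled so that its n-th zero falls at the edge. The function name mode_displacement, the normalised radius R and the sampled (2, 1) mode are illustrative choices here, not the project’s actual rendering code.

```c
#include <stdio.h>
#include <math.h>
#include <gsl/gsl_sf_bessel.h>

/* Displacement of the ideal-membrane mode (m, n) at polar position
 * (r, theta) and oscillation phase `phase`. lambda_mn – the n-th
 * positive zero of J_m – scales the radial profile so that the rim
 * (r = R) stays fixed. Illustrative sketch only. */
static double mode_displacement(int m, unsigned int n,
                                double r, double theta,
                                double phase, double R)
{
    double lambda_mn = gsl_sf_bessel_zero_Jnu((double) m, n);
    return gsl_sf_bessel_Jn(m, lambda_mn * r / R)
         * cos(m * theta)
         * cos(phase);
}

int main(void)
{
    double R = 1.0; /* normalised skin radius */

    /* Sample the (2, 1) mode – two nodal diameters, one nodal
     * circle (the rim) – along a single radius at phase 0. */
    for (int i = 0; i <= 10; i++) {
        double r = R * i / 10.0;
        printf("r = %.2f  z = % .4f\n", r,
               mode_displacement(2, 1, r, 0.0, 0.0, R));
    }
    return 0;
}
```

Built with, for example, gcc modes.c -lgsl -lgslcblas -lm, this prints one radial slice of the mode shape; animating the phase argument and rendering the field over the full disc gives the kind of pattern each note value is mapped to.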

The piece is a next-stage adaptation of Moiré Modes – vimeo.com/92383476 – a short, abstract audiovisual work created as part of Lewis Sykes’ Practice as Research Ph.D. It extends the approach of using “virtual systems” within audiovisualisation – drawing on derivations and formulas from physics and mathematics, along with key aspects of Music Theory and particularly the Pythagorean laws of harmony, to code computer models of a variety of naturally occurring oscillatory and harmonic systems. The real-time animations generated from these models are then used to visualise the music.
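
For a flavour of how those Pythagorean ratios translate into code – an illustrative C fragment, not drawn from the thesis itself, that builds pitches by stacking perfect fifths (frequency ratio 3:2) and folding each result back into a single octave. The 220 Hz reference pitch and the seven steps are arbitrary demo values.

```c
#include <stdio.h>

int main(void)
{
    double base = 220.0; /* arbitrary reference pitch (A3) */
    double freq = base;

    /* Stack perfect fifths (x 3/2), folding back into one octave
     * (/ 2) whenever the pitch climbs past 2 x base – the classic
     * Pythagorean construction of a scale. */
    for (int step = 0; step < 7; step++) {
        printf("step %d: %7.2f Hz\n", step, freq);
        freq *= 1.5;
        while (freq >= 2.0 * base)
            freq /= 2.0;
    }
    return 0;
}
```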