I am a designer, musician and technologist. Currently I am a PhD student researching craft in digital musical instrument design, supervised by Dr. Andrew McPherson in the Augmented Instruments Lab, Centre for Digital Music, Queen Mary University of London. I am also part of the open source Bela project for making interactive audio projects.
From August to December 2017 I was a Visiting Scholar at the Center for Music Technology, Georgia Institute of Technology, working with Prof. Jason Freeman. Before my PhD I was a research engineer at ROLI and FXpansion, and before that I completed a BSc in Music, Multimedia and Electronics at the University of Leeds.
To get in touch:
Research mailing list · GitHub · Google Scholar

Bibtex of all references
Jack Armitage & Andrew McPherson. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2018.
In digital musical instrument design, different tools and methods offer a variety of approaches for constraining the exploration of musical gestures and sounds. Toolkits made of modular components usefully constrain exploration towards simple, quick and functional combinations, while methods such as sketching and model-making instead allow imagination and narrative to guide exploration. In this work we sought to investigate a context where these approaches to exploration were combined. We designed a craft workshop for 20 musical instrument designers, where groups were given the same partly-finished instrument to craft for one hour with raw materials, and though the task was open-ended, they were prompted to focus on subtle details that might distinguish their instruments.
Fabio Morreale, Jack Armitage & Andrew McPherson. Frontiers in Psychology, 2018.
Extensive training with a musical instrument results in the automatization of the bodily operations needed to manipulate the instrument: the performer no longer has to consciously think about the instrument while playing. The ability of the performer to automate operations on the instrument is due to sensorimotor mechanisms that can predict changes in the state of the body and the instrument in response to motor commands. But how strong are these mechanisms? To what extent can we alter the structure of the instrument before they disappear? We performed an exploratory study to understand whether and how sensorimotor predictions survive instrument modification. We asked seven professional violinists to perform repertoire pieces and sight-reading exercises on four different violins: their own, a cheap violin, a small violin, and a violin whose strings had been put on in reverse order. We performed a series of quantitative investigations on performance intonation and duration, and on bowing gestures and errors. The analysis revealed that participants struggled to adapt to the altered instruments, suggesting that prediction mechanisms are a function of instrument configuration. In particular, the analysis of bowing errors, intonation, and performance duration suggested that the performance with the reverse violin was much less fluent and precise than with the performer’s own instrument; the performance with the small violin was also sub-standard, though to a lesser extent. We also observed that violinists were differently affected by instrument modifications, suggesting that the capability to adapt to a new instrument is highly personal.
Spencer Salazar & Jack Armitage. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2018.
At first glance, the practice of musical live coding seems distanced from the gestures and sense of embodiment common in musical performance, electronic or otherwise. This workshop seeks to explore the extent to which this assertion is justified, to re-examine notions of gesture and embodiment in musical live coding performance, to consider historical approaches for integrating musical programming and gesture, and to look to the future for new ways of fusing the two. The workshop will consist firstly of a critical discussion of these issues and related literature. This will be followed by applied practical experiments involving ideas generated during these discussions. The workshop will conclude with a recapitulation and examination of these experiments in the context of previous research and proposed future directions.
Avneesh Sarwate, Ryan Rose, Jack Armitage & Jason Freeman. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2018.
Though laptop live coders are known to use other devices and instruments and to play with other musicians, laptop live coding generally shares the common physical interface of the QWERTY keyboard. This project seeks to provide a means to explore alternatives to the QWERTY keyboard as a physical interface to laptop live coding. We present a live coding keyboard which is also a digital musical instrument, called the Stenophone. The Stenophone is an augmented stenotype or chorded keyboard, which permits continuous gestural control of keys and features an ergonomic design. These capabilities are exploited to enable the manipulation of algorithms and their parameterisation simultaneously.
Jack Armitage & Andrew McPherson. Proceedings of the International Conference on Live Coding, 2017.
Liam Donovan, S. Astrid Bin, Jack Armitage & Andrew McPherson. Proceedings of the Web Audio Conference, 2017.
Many digital musical instrument design frameworks have been proposed that are well suited for analysis and comparison. However, not all provide applicable design suggestions, especially where subtle, important details are concerned. Using traditional lutherie as a model, we conducted a series of interviews to explore how violin makers “go beyond the obvious”, and how players perceive and describe subtle details of instrumental quality. We find that lutherie frameworks provide clear design methods, but are not enough to make a fine violin. Success comes after acquiring sufficient tacit knowledge, which enables detailed craft through subjective, empirical methods. Testing instruments for subtle qualities was suggested to be a different skill to playing. Whilst players are able to identify some specific details about instrumental quality by comparison, these are often not actionable, and important aspects of “sound and feeling” are much more difficult to describe. In the DMI domain, we introduce the term NIMEcraft to describe subtle differences between otherwise identical instruments and their underlying design processes, and consider how to improve the dissemination of NIMEcraft.
Jack Armitage & Andrew McPherson. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2017.
Andrew McPherson, Jack Armitage, S. Astrid Bin, Fabio Morreale & Robert Jack. Workshop held at the International Conference on New Interfaces for Musical Expression (NIME), 2017.
Jack Armitage, Kyle Molleson, Michael Battcock, Chris Earnshaw, David Moore & Kia Ng. Proceedings of the Electronic Visualisation and the Arts Conference, 2012.