Events

December 19-20 2005, Goldsmiths College

Monday 19 December

13.00 Registration
13.45 Tim Blackwell, Michael Young (Goldsmiths College) Welcome and overview

Evolutionary Music I:

14.00 Al Biles (Rochester Institute of Technology, New York) GenJam - Genetic Algorithms and Jazz improvisation abstract


15.00 Break


15.30 David Plans Casals (Electroacoustic Studios, University of East Anglia): Frank: an Open
Source framework for evolutionary music composition abstract


16.15 Joao Martins (Future Music Lab, University of Plymouth) Evolution of Rhythms in Artificial
Worlds abstract

 

19.30-21.30 Evening Concert

Great Hall, Goldsmiths College. Programme to include:

Habitation Ron Herrema

Lipsync1 (cello and live electronics). Thomas Gardner

Stem Cells Eric Lyon

Swarm Granulator with Mette Bilde, Tim Blackwell & Michael Young

improvisations with Mike Casey, Chris Redgate, Sebastian Lexer, Thanos Chrysakis, Ollie Bown.

 


Tuesday 20 December

Interactivity I:

09.00 Christopher McClelland, Michael Alcorn (Sonic Arts Research Centre, Queen's University of Belfast): eScore - Real-time notation in interactive and live electronic performance environments abstract


09.45 Alain Renaud, Pedro Rebelo (Sonic Arts Research Centre, Queen's University of Belfast): Distributed Cues in Networked Improvisation abstract

10.30 Break

11.00 Simon Emmerson (De Montfort University, Leicester): 'Testing relating'. Can Alan Turing's 'test criteria' for machine-intelligent behaviour be applied to interactive and live electronic music? If so, how does this square with Christopher Small's declaration that "it is the relationships that it brings into existence in which the meaning of a musical performance lies"?


12.00 Tom Davis, Pedro Rebelo (Sonic Arts Research Centre, Queen's University of Belfast): Emergence in Sound abstract


12.45 Lunch


Evolutionary Music II:

14.00 Luke Harrald (Elder School of Music, Adelaide University): The Iterated Prisoner's Dilemma abstract


14.45 Ollie Bown and Alice Eldridge (Centre for Cognition, Computation and Culture, Goldsmiths
College): Live (Dynamical Systems) Algorithms for Musical Instruments abstract


15.30 Break


Interactivity II:

16.00 David Muth (Ravensbourne College of Design and Communication) Sodaconductor abstract


17.00 Michael Young, Tim Blackwell (Goldsmiths College) Project Groups. Closing remarks

 

Abstracts

Al Biles (Rochester Institute of Technology, New York)
Improvising with Evolutionary Computation: Lessons from the GenJam Project

This talk will present a 10-year perspective on GenJam, its ongoing development and growth as an improviser and performer. I'll trace a chronology of functional and technical enhancements, focusing on efforts to gain greater spontaneity, interactivity, and musicality. I'll also discuss the evolution of my conception of GenJam from an exploratory application of evolutionary computation to a true musical partner. Along the way, I'll address what GenJam may have to say about improvisation, what GenJam has to say about evolutionary computation, and whether GenJam is a "live algorithm." Finally, I'll make some general remarks on the mutual influences of application domains and technology on one another.

David Plans Casals (Electroacoustic Studios, University of East Anglia)
Frank: an Open Source framework for evolutionary music composition

We aim to create an Open Source system for composition and improvisation which addresses recent approaches in evolutionary algorithm systems, making direct, live musical use of usually closed academic approaches available to users of the Puredata framework.

Problem statement: Intelligent algorithmic music composition and improvisation systems have typically focused on three methods: machine learning of musical patterns, rule-following, or evolution based on musical criteria. Most implementations face a musical fitness bottleneck in which human interaction is needed and slows the creative process down. Recent efforts have focused on co-evolutionary methods which address this problem, but do not do so in a sufficiently scalable way to be used interactively by live musicians in the context of improvisation.

Approach and results: We outline a series of Puredata external objects; while implementing Todd & Werner's (2001) co-evolutionary method, we further exemplify an approach whereby the concepts of musical novelty and surprise are managed by internal agents, which broker musical statements using genetic algorithms, and we describe a general musical form manager which constitutes the main musical output agent. We present a framework which will further the notion of musical criticism and composition within intelligent agent behaviour by generalising the technical approach and introducing form management in a way which can be used practically by an improvising musician.
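To make the co-evolutionary loop concrete, here is a minimal Python sketch of a composer/critic co-evolution of the general kind the abstract describes. It is purely illustrative: the melody encoding, the transition-table critics, the fitness measures and all parameters are invented for the example, and it is not the Frank/Puredata implementation.

```python
"""Minimal composer/critic co-evolution (illustrative sketch only)."""
import random

PITCHES, LENGTH, POP = 12, 16, 20

def random_melody():
    return [random.randrange(PITCHES) for _ in range(LENGTH)]

def random_critic():
    # A critic is a table of preferences over pitch transitions.
    return [[random.random() for _ in range(PITCHES)] for _ in range(PITCHES)]

def score(critic, melody):
    # A critic rewards the pitch transitions it "likes".
    return sum(critic[a][b] for a, b in zip(melody, melody[1:]))

def mutate_melody(melody):
    melody = list(melody)
    melody[random.randrange(LENGTH)] = random.randrange(PITCHES)
    return melody

def mutate_critic(critic):
    critic = [row[:] for row in critic]
    critic[random.randrange(PITCHES)][random.randrange(PITCHES)] = random.random()
    return critic

composers = [random_melody() for _ in range(POP)]
critics = [random_critic() for _ in range(POP)]

for generation in range(100):
    # Composer fitness: average approval across all critics.
    comp_fit = [sum(score(c, m) for c in critics) / POP for m in composers]
    # Critic fitness: how sharply the critic separates best from worst composer.
    crit_fit = [max(score(c, m) for m in composers) -
                min(score(c, m) for m in composers) for c in critics]
    keep = POP // 2
    composers = [m for _, m in sorted(zip(comp_fit, composers),
                                      key=lambda p: -p[0])[:keep]]
    critics = [c for _, c in sorted(zip(crit_fit, critics),
                                    key=lambda p: -p[0])[:keep]]
    composers += [mutate_melody(random.choice(composers)) for _ in range(POP - keep)]
    critics += [mutate_critic(random.choice(critics)) for _ in range(POP - keep)]

print(composers[0])  # fittest surviving melody
```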



Joao Martins (Future Music Lab, University of Plymouth)
Evolution of Rhythms in Artificial Worlds

The research at the Interdisciplinary Centre for Computer Music Research is looking for computational tools to study the evolution of music. The approach presented here focuses specifically on the evolution of music grammars. The aim is to show to what extent new rhythms emerge from the interaction between autonomous agents and the self-organization of internal rhythm representations. The proposed model explores the evolution of rhythms in a society of artificial agents based upon imitation games inspired by research on language. The agents make use of connectionist models to process rhythmic information, to extract their compositional patterns, and to evaluate compositions from other agents. In this way we can apply the principle of co-evolution of music creators and critics, avoiding the fitness bottleneck.
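As a rough illustration of an imitation game between rhythm agents (an analogy to the approach described above, not the Plymouth model itself, and without its connectionist components), consider the Python sketch below; the pattern length, distance threshold and update rules are all assumptions made for the example.

```python
"""Toy imitation game between rhythm agents (illustrative sketch only)."""
import random

STEPS = 16          # a rhythm is a 16-step binary pattern
THRESHOLD = 3       # imitation counts as successful within this Hamming distance

def random_rhythm():
    return [random.randint(0, 1) for _ in range(STEPS)]

def distance(a, b):
    return sum(x != y for x, y in zip(a, b))

class Agent:
    def __init__(self):
        self.repertoire = [random_rhythm()]

    def imitate(self, heard):
        # Answer with the stored rhythm closest to what was heard.
        return min(self.repertoire, key=lambda r: distance(r, heard))

    def update(self, heard, success):
        closest = self.imitate(heard)
        if success:
            # Nudge the matching rhythm towards the heard pattern.
            i = random.randrange(STEPS)
            closest[i] = heard[i]
        else:
            # Adopt the unfamiliar rhythm as a new repertoire item.
            self.repertoire.append(list(heard))

agents = [Agent() for _ in range(10)]
for game in range(5000):
    speaker, listener = random.sample(agents, 2)
    rhythm = random.choice(speaker.repertoire)
    reply = listener.imitate(rhythm)
    success = distance(rhythm, reply) <= THRESHOLD
    listener.update(rhythm, success)

print(sum(len(a.repertoire) for a in agents) / len(agents))  # mean repertoire size
```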


Alain Renaud, Pedro Rebelo (Sonic Arts Research Centre, Queen's University of Belfast)
Distributed Cues in Networked Improvisation


The development of high-speed networks in recent years has led to the implementation of a stable infrastructure suitable for most types of networked performance - a musical performance involving two or more separated entities over a high-speed network. Most multi-site network performance systems have so far concentrated on providing high-quality audio connectivity, but have not explored the musical and performative potential offered by a network of musicians and machines. We address technical and philosophical issues and propose a number of strategies which aim to identify possibilities for a new performance practice in the context of multi-site networking. We refer to an on-going collaborative project between the Sonic Arts Research Centre and Stanford University, and to the use of algorithms that act as distributed cues in the structuring of live-electronic improvisation in the project BLISS.


Tom Davis, Pedro Rebelo (Sonic Arts Research Centre, Queen's University of Belfast)
Emergence in Sound

Emergence can be defined as a global property of a complex system that results from the interactions and relationships among its agents, and between the agents and their environment. This definition however does not take into account how these emergent global properties are to be perceived. We propose that when setting up systems that exhibit emergent structures, consideration should be given to the medium of presentation and to the perceptual systems of the user. We suggest a shift away from a Cartesian top-down perspective of the world in order to design systems that exhibit emergence specifically in the sonic domain. As elaborated in phenomenological literature (Merleau-Ponty, Husserl) the observer/listener perceives the world only in relation to his or her own body. Such an understanding of perception suggests strategies for the design of emergence in the sound domain in which the listener him/herself becomes an agent and perceives/modulates the world from “within” rather than from the privileged but distanced position of the observer. This paper refers to prototypes that explore notions of emergence from the point of view of the perception of sound environments.





Christopher McClelland, Michael Alcorn (Sonic Arts Research Centre, Queen's University of Belfast)
eScore - Real-time notation in interactive and live electronic performance environments.


Work in the area of computer-assisted composition has enabled the development and application of algorithmic and generative processes in instrumental and electroacoustic composition. Specific software tools have been developed to aid composers in the process of exploring musical
ideas and in defining complex relationships between materials. This paper describes a system which extends the algorithmic process to include procedures that happen during live performance. A prototype environment (eScore) is described which replaces paper-based notation with a screen-based display. The system allows for flexible relationships predicated on previous and future events in the performance and allows for an open interaction with other performers using rules defined by the composer/musician. Of particular importance is the application of this work to interactive and live electronic performance environments.


Luke Harrald (Elder School of Music, Adelaide University)
The Iterated Prisoner's Dilemma

The presentation will outline the ongoing development of the ‘Ensemble’ system, a musical implementation of the Prisoner’s Dilemma Game. The goals for this system have been wide-ranging, and include the development of an interactive improvisation environment that allows live performers to interact with a virtual ensemble of agents manipulating visual and sonic media in real time. The main premise behind the system is the modelling of the social dynamics of music performance, particularly in indeterminate or improvised situations. The Iterated Prisoner’s Dilemma is used as a basis for this model. Currently, the system is not interactive, but has been used to generate a number of compositions in both real and non-real time. These offer insight into the musical suitability of the agent model and it is hoped that the addition of interactive elements will allow for collaborative music making.
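For readers unfamiliar with the underlying game, the following Python sketch shows a bare Iterated Prisoner's Dilemma between two agents, with a hypothetical mapping from cooperate/defect moves to musical gestures. It illustrates the general idea only; the 'Ensemble' system's own agents, strategies and mappings are not documented here.

```python
"""Minimal Iterated Prisoner's Dilemma between two musical agents (sketch)."""
import random

# Standard IPD payoffs: (my move, their move) -> my score.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def tit_for_tat(history):
    # Cooperate first, then copy the partner's previous move.
    return history[-1][1] if history else 'C'

def random_player(history):
    return random.choice(['C', 'D'])

def musical_gesture(move):
    # Hypothetical mapping: cooperation imitates, defection contrasts.
    return 'imitate partner material' if move == 'C' else 'play contrasting material'

history_a, history_b = [], []
score_a = score_b = 0
for round_number in range(16):
    a = tit_for_tat(history_a)
    b = random_player(history_b)
    score_a += PAYOFF[(a, b)]
    score_b += PAYOFF[(b, a)]
    history_a.append((a, b))
    history_b.append((b, a))
    print(round_number, musical_gesture(a), '|', musical_gesture(b))

print('totals:', score_a, score_b)
```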





Alice Eldridge (University of Sussex), Ollie Bown (Centre for Cognition, Computation and Culture,
Goldsmiths College)
Live (Dynamical Systems) Algorithms for Musical Instruments

We propose a body of research into how systems borrowed from artificial life, computational neuroscience, adaptive/autonomous systems studies and complex systems studies can be used as tools for performing computer musicians. We argue that such systems can be presented in a way that focuses on their general behavioural properties rather than on how they should be embodied and situated in a musical context, the latter being left to the musician, facilitated by the design of the system. We contextualise our argument within the present musical-technological creative climate, and with respect to existing ideas formed by the Live Algorithms for Music Group. Finally, we give examples of the systems that we have explored so far and their uses.


David Muth (Ravensbourne College)

Sodaconductor

Sodaconductor allows the user to create and animate complex audiovisual structures by assembling minimalist ingredients or 'building blocks': simple sonic waveform generators controlled by black lines on a white background. Once created, these audiovisual constructs can be choreographed in various fashions: several instances of them can be put into rhythmical sequences, whilst their shapes and movements obey the rules of their simulated environment. As the Sodaconductor framework is networked, it also offers the possibility of automatically conducting a little orchestra of Sodaconstructor creations running synchronised on different computers at the same time.



Previous events

LAM 1 Introductions. Goldsmiths College, December 2004

Presentations

Robert Rowe with Michael Casey MARCEL panel

Tim Blackwell, Michael Young (Goldsmiths) Live Algorithms
Eduardo Miranda (Future Music Lab Plymouth) Music as Emergent Behaviour: A Discussion on the Activities of the Plymouth Group
Pedro Rebelo (SARC, Belfast) Research Culture at SARC
Geraint Wiggins (Goldsmiths) Computational Creativity
Juan Bello Introduction to the Centre for Digital Music, QM
Norbert Schnell ATR Group at IRCAM
Jonathan Impett, John Bowers (UEA) Redefining a Live Algorithm
Kia Ng (ICSRiM) Interactive Multimedia

LAM1 Concert

 

John Bowers, For al-Khwarizmi: experiments with a mix of devices (laptop, monochord, analogue electronics, loudspeakers) and transducers (air and contact microphones, position and pressure sensors) to improvise a 'desktop sound ecology'.


Jonathan Impett (meta-trumpet), Sebastian Lexer (piano, electronics) and John Tilbury (piano). Musical instruments expanded with technologies: the meta-trumpet expands timbre with a spectrum-tilting program, and analysis parameters are sent to an emergence-modelling program (Santa Fe Institute). Lexer uses audio and video analysis to generate an 'image' of the performance which is employed to mediate and control sound processing. Plus John Tilbury's acclaimed and ingenious use of 19th-century technology.

 

Paul Archbold, a little night music (Chris Redgate, oboe). An intimate work for oboe and live electronics. The solo oboe inhabits a swarm of simulacra, its sonorities in turn dissected and metamorphosed.


Eduardo Miranda, Robotaphitecos. Inspired by the paradox of the origins of music as proposed by the philosopher Jean-Jacques Rousseau, who described the earliest languages as composed of vocal inflections (warnings). Re-synthesis techniques create various hybrid voices, imposing synthesised human-like formants onto the spectrum of monkey sounds; the piece culminates with a choir of virtual singing digital creatures.

Ollie Bown (laptop) and Tom Arthurs (trumpet). Simple generative systems extend the context of expressive musical interaction beyond that of the participants alone. Tom plays through a number of Max/MSP objects; Ollie adds layers of drum patterns by blending the complex sequences drawn from a number of randomly generated Boolean networks.
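A randomly generated Boolean network of the kind mentioned here can be sketched in a few lines of Python; the node count, connectivity and drum mapping below are invented for illustration, and this is not Bown's actual patch.

```python
"""Random Boolean network read as a drum pattern (illustrative sketch)."""
import random

N, K = 8, 2   # 8 nodes, each reading 2 other nodes

# Each node gets K input nodes and a random truth table over 2**K input states.
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]
state = [random.randint(0, 1) for _ in range(N)]

def step(state):
    new = []
    for node in range(N):
        idx = 0
        for bit in (state[i] for i in inputs[node]):
            idx = (idx << 1) | bit
        new.append(tables[node][idx])
    return new

# Read node 0 as kick, node 1 as snare, node 2 as hi-hat, over 16 steps.
for beat in range(16):
    kick, snare, hat = state[0], state[1], state[2]
    print('x' if kick else '.', 'x' if snare else '.', 'x' if hat else '.')
    state = step(state)
```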

Tim Blackwell, Swarm Music. Swarm Music produces improvisations from the behaviour of particle swarms, whose dynamics derive from the mathematics of actual swarms. The improvisations are developed through stigmergy: just as insect swarms revisit food sources, the musical swarm is drawn towards promising musical input.
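The stigmergetic mechanism described here can be illustrated with a small particle-system sketch in Python: particles are attracted towards markers deposited at recent musical events. The dimensions, constants and update rule are assumptions for the example and do not reproduce the Swarm Music source.

```python
"""Particles attracted to recent musical events (illustrative sketch)."""
import random

DIMS = 3            # e.g. pitch, loudness, inter-onset interval
SPEED_LIMIT = 4.0

particles = [{'pos': [random.uniform(0, 127) for _ in range(DIMS)],
              'vel': [0.0] * DIMS} for _ in range(10)]
attractors = []     # would be filled from analysed live input; faked below

def update(particle, attractors):
    if attractors:
        # Accelerate towards the nearest attractor (the "food source").
        target = min(attractors,
                     key=lambda a: sum((p - q) ** 2 for p, q in zip(a, particle['pos'])))
        for d in range(DIMS):
            particle['vel'][d] += 0.1 * (target[d] - particle['pos'][d])
            particle['vel'][d] = max(-SPEED_LIMIT, min(SPEED_LIMIT, particle['vel'][d]))
    for d in range(DIMS):
        particle['pos'][d] += particle['vel'][d]

for tick in range(100):
    if tick % 10 == 0:
        # Pretend a new musical event arrived and deposit an attractor.
        attractors.append([random.uniform(0, 127) for _ in range(DIMS)])
    for p in particles:
        update(p, attractors)
    # Each particle position could now be interpreted as a note event.
```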

Michael Young, Argrophylax (Chris Redgate, oboe). A magical stone (Plutarch) that produces piercing, trumpet-like sounds. The oboe negotiates a deterministic score, to which the electronics respond probabilistically, empathetic or provocative: live spectral shifting and distortion, live recording/granulation/wave-shaping and the real-time sample-based generation of material.



First Research Workshop. Goldsmiths College, April 2005

Presentations

Owen Holland (Dept Computer Science, University of Essex) Machine Consciousness and Creativity download pdf
Paul Brown (School of History of Art, Film and Visual Media, Birkbeck College) Robotic Art
Tim Blackwell and Michael Young A Framework for Live Algorithms download ppt
Eduardo Miranda and Bram Boskamp Generative Grammars download pdf

Michael Casey Audio Similarity download ppt

Chris and Roger Redgate Textural Improvisation: Practitioners' View

Nick Bryan-Kinns and Pat Healey Digital Reciprocity

The Live Algorithms Group performed at the Sonic Arts Network Expo in Scarborough, June 2005: Tim Blackwell (sax), Sebastian Lexer (piano+), Michael Young (trumpet) with the Swarm Granulator and other electronics.


Second Research Workshop. September 2005, Goldsmiths College

in association with ISMIR 2005

Jon McCormack (Monash University, Australia): Practical strategies and ideas for how generative techniques can be used in artworks/live algorithms

Francois Pachet (Sony CSL): Research at CSL, including Ringomatic: A Real-Time Interactive Drummer


Roger Dannenberg (School of Computer Science and School of Art, Carnegie Mellon University): Music Understanding for Interactive Music Performance

Andrew Brown and Rene Wooller (Queensland University of Technology): Generative Scores, Impromptu, a live programming environment, generative dance music


Chris Raphael (School of Informatics, Indiana University): Music Plus One.


Nick Collins (University of Cambridge) Machine Enhanced Improvisation


Owen Holland and Tim Blackwell: Conscious Algorithms


Michael Young, Chris Redgate, Roger Redgate, Pat Healey: Codification of performance (including recent experimental results)

Concert programme 15 Sept 05, Goldsmiths College:

Alice Eldridge, fond punctions
fond punctions is my first stab at integrating generative and improvisation practices for live AV performance. The performance is an improvisation between two distinct but interacting adaptive generative processes and me. A homeostatic network is embedded in a simple physics simulation which describes the movement of the various floating cell-like aggregations seen in the visuals. The algorithmic processes parameterise a granular synthesis engine which processes live samples, creating re-compositions of earlier improvisations and inspiring new ones.
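A speculative sketch of the kind of architecture described - an Ashby-style homeostatic network whose unit states are mapped onto granular-synthesis controls - might look like the Python below; the unit count, bounds and parameter mappings are invented for illustration and are not taken from the piece itself.

```python
"""Homeostatic network driving grain parameters (speculative sketch)."""
import random

N = 4                       # homeostat units
BOUND = 1.0                 # each unit tries to stay within [-BOUND, BOUND]
weights = [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)]
state = [random.uniform(-0.5, 0.5) for _ in range(N)]

def step():
    global state
    new = []
    for i in range(N):
        value = state[i] + 0.1 * sum(weights[i][j] * state[j] for j in range(N))
        if abs(value) > BOUND:
            # Essential variable out of range: re-randomise this unit's inputs.
            weights[i] = [random.uniform(-1, 1) for _ in range(N)]
            value = max(-BOUND, min(BOUND, value))
        new.append(value)
    state = new

for frame in range(200):
    step()
    # Hypothetical mapping of unit states onto granular-synthesis controls.
    grain_size_ms = 20 + 80 * (state[0] + 1) / 2
    grain_density = 5 + 45 * (state[1] + 1) / 2
    playback_rate = 0.5 + 1.5 * (state[2] + 1) / 2
    if frame % 50 == 0:
        print(round(grain_size_ms), round(grain_density), round(playback_rate, 2))
```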

Chris Redgate (oboe) and Roger Redgate (violin) improvisation without electronics

John Lely and Sebastian Lexer improvisation with electronics

Nick Collins Concerto for Accompaniment. Chris Raphael, oboe

klipp av (Nick Collins, Fredrik Olofsson) audiovisual capture experiment
klipp av is Swedish for "cut apart" and the adopted name of an audiovisual laptop duo who specialise in modality synchronised source manipulation. Algorithms twist and turn in music and graphics, each informing the other. Everything is generated live with program code and mappings being manipulated on-the-fly.
klipp av are committed to live laptop music, and demonstrate on-the-spot manipulation of sound and visuals wherever possible.


Michael Casey, Roger Dannenberg (tpt) Sueme No. 1
Audio mosaicing is a technique of sound construction that matches a live sound stream, say from a performer, to a large database of pre-recorded sound material. The 'mosaic' is thereby composed of small fragments of sound from the source database. In this piece, the sonic materials consist of the UK No. 1 singles from 1960 to the present day. The goal is to explore the hidden worlds that exist within these archived instants, re-shaping the material and re-organizing it in the moment.
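The matching step at the heart of audio mosaicing can be sketched simply: analyse each incoming frame, then pick the database fragment whose features are nearest. The Python below is an illustrative skeleton only; the feature extractor is stubbed with random values and the fragment names are invented.

```python
"""Skeleton of a frame-by-frame audio mosaic (matching step only)."""
import random

def features(frame):
    # Stand-in for a real analysis stage (e.g. spectral features); here random.
    return [random.random() for _ in range(12)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Pre-analysed database: (feature vector, reference to the source fragment).
database = [(features(None), f'single_{year}_fragment_{i}')
            for year in range(1960, 2006) for i in range(4)]

def mosaic_step(live_frame):
    """Return the database fragment whose features best match the live frame."""
    live = features(live_frame)
    best = min(database, key=lambda entry: distance(entry[0], live))
    return best[1]   # in performance this fragment would be played back

for _ in range(8):
    print(mosaic_step(live_frame=None))
```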