Gen.AV 2

Hackathon and performance on Generative Audiovisuals, July 2015

Performance: 30/7/2015, Goldsmiths, University of London, PSHB-LG01

About the projects:

Butterfly
A general-purpose scope viewer that can be mapped arbitrarily to the internals of SuperCollider synths. It is intended as an interactive instrument that can be manipulated in real time for generative audiovisual performances. The visual tool can be used with any SuperCollider synth definitions, including your own, as long as they follow some naming conventions.
Project link: https://github.com/AVUIs/Butterfly

Cantor Dust
Cantor Dust is an audiovisual instrument that generates, displays, and sonifies Cantor-set-like fractals. It allows the user to control the starting seed and number of iterations for several fractals at once, either through the graphical interface or through a MIDI controller (a short sketch of the seed-and-iterate idea follows below). The project is written in plain JavaScript and runs entirely in a (modern) web browser.
Project link: https://github.com/AVUIs/cantor-dust
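
The repository itself is JavaScript; purely as a language-neutral illustration of the seed-and-iterate idea described above, a minimal Python sketch might look like this (the function name and seed encoding are mine, not the project's):

    def cantor_rows(seed, iterations):
        """Expand a binary seed into a Cantor-set-like pattern, row by row.

        Each iteration replaces every filled cell with a copy of the seed and
        every empty cell with an equal run of empty cells; the classic
        middle-thirds Cantor set comes from the seed [1, 0, 1].
        """
        row = [1]
        rows = [row]
        for _ in range(iterations):
            row = [s if cell else 0 for cell in row for s in seed]
            rows.append(row)
        return rows

    if __name__ == "__main__":
        # Render four iterations of the middle-thirds set as ASCII "dust".
        for row in cantor_rows([1, 0, 1], 4):
            print("".join("#" if c else " " for c in row))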

Esoterion Universe Gestenkrach
Esoterion Universe Gestenkrach is a fork of Esoterion Universe that adds support for the SuperCollider sound engine, with a much wider variety of sounds, and LeapMotion sensor integration for more playful universe navigation and planet sculpting. The UI was also adapted and made more directly responsive to LeapMotion input. Original description: under Gen.AV 1.
Project link: https://github.com/AVUIs/EsoterionUniverseGestenkrach

OnTheTap
A tap-reactive audiovisual system. The system plays with the tactile, analog feel of tapping surfaces as a digital input device. This input and its gestures in turn drive sound and visuals expressively (a sketch of turning taps into events follows below).
Project link: https://github.com/AVUIs/OnTheTap
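
How the project actually captures taps is not described above; purely as an illustration of turning taps on a surface into discrete events that could drive sound and visuals, a simple amplitude-threshold onset detector (my assumption, not necessarily the repo's method) could look like:

    import numpy as np

    def detect_taps(samples, rate, threshold=0.3, refractory=0.05):
        """Return tap times (in seconds) where the signal exceeds a threshold.

        `samples` is a mono float array, e.g. from a contact mic on the tapped
        surface; `refractory` suppresses re-triggering within a single tap.
        """
        taps = []
        last = -refractory
        for i, level in enumerate(np.abs(samples)):
            t = i / rate
            if level > threshold and t - last >= refractory:
                taps.append(t)
                last = t
        return taps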

residUUm
residUUm is an attempt to sonify a particle system whose inhabitants exchange and discard their sonic characteristics as they collide, leaving remnants that contribute a din of noise as their larger bodies fade. The sound engine and graphics are done, but the exchange of characteristics has yet to be implemented (a sketch of the intended behaviour follows below). The project uses Processing to send visual characteristics of particle bodies to be sonified in Pd.
Project link: https://github.com/AVUIs/residUUm
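
Since the exchange of characteristics is described above as not yet implemented, here is a minimal sketch of that intended behaviour, with made-up trait names rather than the project's own:

    import random

    class Particle:
        """A body carrying sonic characteristics (trait names are illustrative)."""
        def __init__(self, pitch, timbre):
            self.pitch = pitch
            self.timbre = timbre
            self.energy = 1.0   # fades as the body sheds remnants

    def collide(a, b, loss=0.2):
        """Exchange a characteristic between two colliding particles.

        Each collision swaps one trait, bleeds energy off both bodies, and
        returns the discarded amount, which a sound engine could feed into a
        shared noise bed (the 'din' of remnants).
        """
        if random.random() < 0.5:
            a.pitch, b.pitch = b.pitch, a.pitch
        else:
            a.timbre, b.timbre = b.timbre, a.timbre
        shed = (a.energy + b.energy) * loss
        a.energy *= 1 - loss
        b.energy *= 1 - loss
        return shed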

wat
An Audio-Visual Exploration of Chaotic 2-Dimensional Dynamical Systems (or 'wat'). We would like to develop a 2D or 3D visualization based on continuous cellular automata with various evolving rulesets, and sonify the result in a musical way. The core principle involves applying a matrix of mathematical operations (generally non-linear functions) to an image specifying the starting conditions. We apply the operation matrix by sliding it across the image (as in convolution), applying the operation in each element of the matrix to the corresponding element in the image matrix (a sketch follows below). The result is complex, evolving, unpredictable textures which can be sonified with the right method. Furthermore, the rules of the system can be 'performed' by varying the operation set and coefficients in real time through some form of input.
Project link: https://github.com/AVUIs/wat
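
A minimal numpy sketch of the sliding operation matrix described above (the specific non-linear functions and the averaging step are illustrative choices, not the project's actual ruleset):

    import numpy as np

    # A 3x3 "operation matrix": each cell holds a (mostly non-linear) function
    # applied to the corresponding neighbour of every pixel.
    OPS = [
        [np.sin,  np.cos,           np.sin],
        [np.tanh, lambda v: v * v,  np.tanh],
        [np.sin,  np.cos,           np.sin],
    ]

    def step(img):
        """One update: slide the operation matrix across the image (as in
        convolution), apply each element's operation to the matching
        neighbour, and average the results into the new pixel value."""
        h, w = img.shape
        out = np.zeros_like(img)
        padded = np.pad(img, 1, mode="wrap")   # wrap edges so the grid is toroidal
        for dy in range(3):
            for dx in range(3):
                out += OPS[dy][dx](padded[dy:dy + h, dx:dx + w])
        return out / 9.0

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        img = rng.random((64, 64))             # image of starting conditions
        for _ in range(100):
            img = step(img)                    # evolving texture to display/sonify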

Projects developed during the second Gen.AV – Hackathon on Generative Audiovisuals (25-26 July 2015, London Music Hackspace and Goldsmiths, University of London).