For my final project, I wanted to explore how to create interactive audio-visual applications and how the two mediums can interact with each other.
I built on what I did for my midterm and decided to add some visuals to the sampler.
I used the visuals from http://www.generative-gestaltung.de/code to get a head start.
The main obstacle for me was audio control in Processing. I tried various libraries, like Minim and Beads, but each has its own shortcomings: Minim didn't have any pitch control, and Beads was old and incompatible with the latest version of Processing.
The SoundChannel class represents a single audio loop, with the ability to play, stop, and change pitch.
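A minimal sketch of what a class like this could look like. All names and details here are assumptions for illustration; the real version would wrap a Processing sound library instead of just tracking state in fields:

```java
// Hypothetical sketch of a SoundChannel wrapping one audio loop.
// In the actual project this would delegate to a sound library;
// here playback state and pitch are plain fields for clarity.
public class SoundChannel {
    private boolean playing = false;
    private float pitch = 1.0f; // 1.0 = original speed

    public void play() { playing = true; }
    public void stop() { playing = false; }

    public void setPitch(float p) {
        // clamp to a sensible range so the loop stays audible
        pitch = Math.max(0.25f, Math.min(4.0f, p));
    }

    public boolean isPlaying() { return playing; }
    public float getPitch() { return pitch; }
}
```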
At first I set out to create a single class for controlling both the sound and the slider. But then I needed the sliders to also control the agents and the parameters of their behavior, so it seemed to make more sense to separate the sliders into their own class.
The Particles class creates the agents and controls their behavior. A slider instance controls this by calling a function directly on the Particles instance, which required passing the Particles instance to each slider in its constructor.
This doesn't feel like great design, but it worked out fine.
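The coupling described above might be sketched like this. The class and method names are illustrative, not the project's actual code:

```java
// Sketch of the direct coupling: each Slider receives the Particles
// instance in its constructor and calls it when the value changes.
class Particles {
    float speed = 1.0f;
    void setSpeed(float s) { speed = s; }
}

class Slider {
    private final Particles particles;
    private float value;

    Slider(Particles p, float initial) {
        particles = p;
        value = initial;
    }

    // Dragging the slider pushes the new value straight into Particles,
    // so the slider must know about Particles at construction time.
    void onDrag(float v) {
        value = v;
        particles.setSpeed(value);
    }

    float getValue() { return value; }
}
```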
I was also hoping to add more variation to the visuals, like colors, and maybe even switch between different variations, but time didn't permit it.
The Slider class also held a reference to a SoundChannel object in order to modify the speed of the sound, and so the application became centered around the sliders. I guess that makes sense for a VJ or digital audio tool, where the control interface is the main focus, but I would like to have more separation between classes and use something like events.
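One way to get that separation is a simple listener pattern: the slider publishes value changes, and anything interested (particles, sound channels) subscribes. The slider then no longer needs a reference to its consumers. This is a hypothetical sketch, not code from the project:

```java
import java.util.ArrayList;
import java.util.List;

// The slider only knows about this small interface,
// not about Particles or SoundChannel directly.
interface SliderListener {
    void onValueChanged(float value);
}

class EventSlider {
    private final List<SliderListener> listeners = new ArrayList<>();
    private float value;

    void addListener(SliderListener l) { listeners.add(l); }

    // Setting the value notifies every subscriber; the slider
    // never calls into the subscribers' own classes by name.
    void setValue(float v) {
        value = v;
        for (SliderListener l : listeners) l.onValueChanged(v);
    }

    float getValue() { return value; }
}
```

With this in place, hooking a sound channel and the particles to one slider is just two `addListener` calls, and either side can be removed without touching the slider code.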