Yesterday we had the sound workshop (part 2) with Dr Ed Kelly. Due to work commitments I could not travel to London that morning, but I was able to take part in the Skype chat session. Earlier that morning I had received an email instructing me to download an application called Pure Data-extended. Below are some research notes on Pure Data-extended made prior to the session, as well as material from sound design websites and notes from the Skype transcript:
What is Pure Data?
Pure Data lets you process audio and MIDI within a powerful modular environment, and lets you combine it easily with video, custom hardware controllers, light shows, and even robotics. source: soundonsound.com
To give a better idea of the way Pure Data works, and what it can do, I’ve created a couple of Pure Data demonstration patches, the first of which is a MIDI synth editor. (You can download this from http://www.soundonsound.com/sos/jul06/patches/SynthEditor.pd.) It can be used to edit VST instruments as well as hardware instruments, and would come in handy for controlling a rackmount synth with a somewhat minimal front panel. Because of Pure Data’s totally open-ended nature, you can build a custom editor for your setup. For instance, if you typically patch your VST synth through a number of effects, Pure Data allows you to edit all the synth and effects parameters at once — just place some sliders on screen and configure them to send parameters on the MIDI channels/ports of your choice.
This wavetable synthesizer shows something of what Pure Data can do when it comes to audio processing, and it also includes a built-in step sequencer. The sequence can be edited with a MIDI keyboard, while wavetables can be drawn on with a mouse.
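To make the wavetable idea concrete, here is a minimal Python sketch of how a wavetable oscillator reads a stored single-cycle waveform at different speeds to produce different pitches. This is an illustration of the general technique, not the article's actual patch; in Pd the equivalent would be an array read by an object such as tabosc4~ (which additionally interpolates between table points).

```python
import math

def make_wavetable(size=512):
    # One cycle of a sine wave; in Pd this would be an array
    # you could also draw into with the mouse
    return [math.sin(2 * math.pi * i / size) for i in range(size)]

def wavetable_osc(table, freq, sample_rate=44100, n_samples=100):
    # Step through the table at a rate proportional to the desired
    # frequency, wrapping the phase when it passes the end of the table
    # (no interpolation here, for simplicity)
    out = []
    phase = 0.0
    step = len(table) * freq / sample_rate
    for _ in range(n_samples):
        out.append(table[int(phase) % len(table)])
        phase += step
    return out

samples = wavetable_osc(make_wavetable(), freq=440.0)
```

Redrawing the table contents changes the timbre immediately, which is exactly what makes drawing on wavetables with the mouse so playable.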
More interestingly, you can create relationships between parameters. You might want to limit the level of resonance on a filter (to avoid damaging your speakers), or to only limit the resonance when the cut-off is above a certain level. Or how about setting the rate of one LFO to remain exactly half the rate of another? All this can be done with basic maths in Pure Data.
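The "basic maths" behind those parameter relationships can be sketched in a few lines of Python. The thresholds and value ranges below are arbitrary placeholders, not taken from any real patch; in Pd you would build the same logic graphically from comparison and arithmetic objects (e.g. moses and clip).

```python
def constrain_resonance(cutoff, resonance, cutoff_limit=0.8, res_max=0.6):
    # Cap resonance only when the cutoff is above a chosen threshold,
    # so the filter can't scream (and damage speakers) at high cutoffs.
    # All values here are illustrative, normalised to 0..1.
    if cutoff > cutoff_limit:
        return min(resonance, res_max)
    return resonance

def slave_lfo_rate(master_rate_hz):
    # Keep one LFO at exactly half the rate of another
    return master_rate_hz / 2.0
```

In Pd the same relationships update continuously as you move a slider, because every connection carries live data.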
In my example I have set up the editor to control a free VST instrument called Cheeze Machine. I’ve decided that I want the attack and release values to be the same, so I have created one slider that controls both parameters at once. I have also created a button to instantly randomise all parameters, but I have chosen to limit the range of this randomisation on certain parameters in order to control the results a little — totally random settings can sometimes produce disappointing results, for instance when the envelope’s attack setting ends up so long that notes don’t sound properly.
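The constrained randomisation described above could be sketched like this in Python. The parameter names and range limits are hypothetical, invented for illustration; the point is simply that each parameter gets its own allowed range, with attack deliberately narrowed so random patches still articulate notes.

```python
import random

# Hypothetical parameter ranges: most controls span the full 0..127
# MIDI CC range, but attack is capped so notes still sound promptly.
PARAM_RANGES = {
    "cutoff": (0, 127),
    "resonance": (0, 100),
    "attack": (0, 40),
    "release": (0, 90),
}

def randomise_patch(ranges=PARAM_RANGES):
    # Pick a random value within each parameter's own permitted range
    return {name: random.randint(lo, hi) for name, (lo, hi) in ranges.items()}

patch = randomise_patch()
```

Tightening or widening individual ranges is how you steer the randomiser towards usable results rather than chaos.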
MIDI changes can easily be triggered from an audio signal, and there’s a simple example of this in my patch — a mic input which will trigger parameter randomisation whenever the audio signal reaches above a certain threshold level. However, there’s lots more scope for more complex creative control, especially because of Pure Data’s nifty pitch-detection object.
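The audio-to-MIDI trigger idea reduces to comparing a signal's level against a threshold and firing once per crossing. Here is a hedged Python sketch of that logic; in Pd you would typically use an envelope follower such as env~ (which reports amplitude in dB) feeding a comparison, whereas this sketch works on raw sample values for simplicity.

```python
def threshold_trigger(samples, threshold=0.5):
    # Fire once each time the signal rises above the threshold,
    # then wait for it to fall back below before re-arming
    triggers = []
    above = False
    for i, s in enumerate(samples):
        if abs(s) >= threshold and not above:
            triggers.append(i)   # here you would send the randomise message
            above = True
        elif abs(s) < threshold:
            above = False
    return triggers
```

The re-arming step matters: without it, a single loud note would fire the randomiser on every sample while the level stayed high.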
Adam Armfield (soundonsound.com)
An extract from a scholarly article by a university teacher, describing his methodology for teaching Pure Data-extended.
Teach through Inquiry
To satisfy the primary objective, I teach students to learn how to learn Pd-extended, instead of teaching them Pd-extended itself. Therefore, I avoid giving the students direct information about Pd-extended whenever possible. For example, when creating tutorials I do not give answers to what Pd-extended will do in this or that case. Instead, I ask the students to create patches to discover these answers themselves. Moreover, I do not give students the code to the patches we use in our tutorials, but instead provide screen shots. This way, students become more familiar with the Pd-extended environment as they recreate the patches. Moreover, during this process of rebuilding the patches, students are more likely to individually modify them, driven by their own curiosity. During class time, if a student asks how Pd-extended will behave in a particular situation, I ask them how they might test for that behavior. If they are unsure, we work together as a class. I may also ask directed questions to facilitate the process.
Teaching Pd-extended/GEM within an artistic engineering, cooperative learning model, by John Harrison, Wichita State University
patch (audio term)
When used as a verb, to patch means to route signal to/from an external device, usually with patchcords and possibly with a patchbay. In synthesizers, a patch is a configured sound that can be recalled for playback. In computer terms, a patch is a correction that fixes functionality problems in a software program.
Skype Chat Transcript Notes
Hopefully we have all grasped the concept of foreground (events) and background (ambience).
Of course, the more we listen, the more we can distinguish events, even in ambient settings.
So in real life the line between the two is necessarily blurred.
This is the difference between listening and hearing. (Dr Ed Kelly)
The basic idea is that there are objects, messages and numbers, arrays of numbers and some graphical objects
The thing to remember is that it’s a completely open environment in which audio happens live. Some things may seem overly complex (the patch I showed there basically loads and plays back a sound file) but they are the same every time, and can be made to respond to the physical environment (via Arduino boards and sensors) or sound input, or a webcam, or data sent over a network from another program (like Processing)
By flat, I mean there is little graphical distinction between one object and another, other than its name.
A good place to start is in the Help Browser…Pure Data -> 03.audio.examples
Dr Ed Kelly