Wednesday, 15 December 2010

View of the Final Project

Below I have included a screenshot (click to maximise) of the arrange page for my project. I have tried to include as much information about the process in one window as possible; there is a lot more going on behind the scenes, especially in the mixer, but I wanted to give a broad sense of what went into creating the final piece.


Visible are the synth settings I tuned to create the atmospheric bass, one of the reverb settings, an example of some equalisation for a track, the matrix editor showing the attention to velocities and note changes, and the automation for the pitch (in cents).


I feel that this sort of production requires attention to detail: something that may not be obvious on first listen but that, at the same time, 'makes' the track. I hope the screenshot gives a little insight into this.

Friday, 29 October 2010

Final Composition Decision

I have finalised the concept for my piece: it is to be a minimal orchestral piece fused with electronic sub-bass to add some depth. The inspiration is discussed in the previous post.

My intention is to create an atmosphere using the organic textures of acoustic instruments. To do this I have acquired the starter pack for East West's Symphonic Orchestra and am in the process of playing in the appropriate chords via MIDI.

To cover the whole frequency spectrum I will be layering the various instruments, using a full range of strings and horns. To give space to the mix I will be fine-tuning the reverb across the instruments.

Due to the minimalist nature of the production, a lot of attention needs to be paid to the details: the length of the reverb, the velocities of individual notes (to give as authentic a feel as possible), equalisation (to prevent frequency clashes) and bus compression to help mesh the layers together.

Using these techniques and others I hope to achieve a spacious and meaningful cohesion of sound.

Wednesday, 27 October 2010

Research into Minimalism

For some background information and inspiration for the type of music I intend to compose and arrange for my Task 3 project, I have looked into the work of minimalist and modern classical composers such as Jóhann Jóhannsson, Ólöf Arnalds and Arvo Pärt. My trawling led me to a very interesting documentary commissioned by the BBC and presented by Björk in 1997. It went into some of the thought processes behind the music and really helped me to envision the sound I wanted to create. Link below.

Saturday, 23 October 2010

Task 3

For the third task, to create a song of 3 mins 50 secs, I have chosen to play with orchestral sounds, namely strings and horns. This is due to an interest in film music and the emotion that can be conveyed in a scene through the subtlety of strings.

Being interested in electronic music, I am contemplating using techniques such as re-sampling and cutting and altering the audio to create a new twist on film music.

Wednesday, 20 October 2010

Pure Data - Task 2

Two Simultaneous Random Melodies:

To create this a bang is needed; when pressed it generates a random number between 0 and 800. This randomly generated number is then increased by 200 by running it into an object containing '+ 200'.

The next number box holds the result of the random output with the 200 added; this is then run into the oscillator (which has a tilde after its name to signify that it processes an audio signal) to produce the sound.

This chain is duplicated so as to produce two random melodies simultaneously.

The final object before routing it all out multiplies the signal by 0.5, halving the amplitude so the two summed oscillators do not clip.
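The number chain above can be sketched outside Pure Data as well. This is a minimal Python model of the control flow only (the function names are mine, not Pd objects), assuming the random object produces whole numbers and the result is used directly as a frequency in Hz:

```python
import random

def random_pitch():
    """Mimic the Pd chain: a random number 0-800, then '+ 200'."""
    return random.randint(0, 800) + 200  # frequency in Hz, 200-1000

# Two independent chains, as in the duplicated patch
voice_a = random_pitch()
voice_b = random_pitch()

# Each oscillator renders its frequency; summing two full-scale
# oscillators can clip, hence the final * 0.5 gain stage
def mix(sample_a, sample_b):
    return (sample_a + sample_b) * 0.5
```

The * 0.5 stage is the software equivalent of pulling a master fader down by half so the combined signal stays within range.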



A Quarter Tone Scale:

This sequence starts off with a toggle. A toggle is a switch triggered by a click that can be either on or off; to the computer that is 1 or 0. It is routed into a metronome, which sets off a series of bangs at a set tempo. The tempo here is set at 700, which translates to 0.7 of a second.

The metronome is then routed into the left inlet of a float (a float being a way of storing numbers), which is then sent to an object that adds 0.5 to whatever has been input. Meanwhile, the right inlet of the float is supplied by a message box whose number is sent on by a click. To avoid having to click, a bang is sent as the patch is opened by routing an object containing 'loadbang' to the message box.

The result of the float-plus-modifier is routed back to the original toggle. A limit is set using a 'sel' object, which only produces an output (a bang) when its input equals a certain value, in this case 72.

Before reaching the oscillator, the signal is run through an 'mtof' object to convert MIDI note numbers into frequencies.
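The counter-plus-mtof idea can be sketched in Python. The mtof formula below is the standard MIDI-to-frequency conversion that Pd's object implements; the starting note of 60 is an assumption (the post does not give the message box's value), which together with the 'sel 72' limit yields a one-octave quarter-tone scale:

```python
import math

def mtof(midi_note):
    """Convert a MIDI note number to a frequency in Hz
    (the standard formula used by Pd's [mtof])."""
    return 440.0 * 2 ** ((midi_note - 69) / 12)

# Counter: start at 60 (assumed), add 0.5 per metro bang,
# stop when the value reaches the 'sel' limit of 72.
# 0.5 MIDI steps are quarter tones, so this is a quarter-tone scale.
note = 60.0
scale = []
while note != 72.0:
    scale.append(mtof(note))
    note += 0.5
```

Each 0.5 increment is exactly representable in floating point, so the equality test against 72.0 is safe here; with arbitrary step sizes a tolerance comparison would be needed.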



Intervals Using Two Bangs:

The first bang routes to two message boxes containing 400 and 600 respectively (the numbers translate to frequencies); these then go into separate oscillators and are combined at the end to produce one sound. When the second bang is clicked, the signal runs through another two message boxes containing 600 and 700. The sound produced differs from that of the first bang; this difference is an interval.
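A small sketch makes the interval concrete. The `ratio` helper is mine, not part of the patch: an interval is determined by the ratio between the two frequencies, and the two bangs above produce noticeably different ratios:

```python
# Frequencies (Hz) from the message boxes in the patch
bang_one = (400.0, 600.0)   # ratio 3:2 -- a perfect fifth
bang_two = (600.0, 700.0)   # ratio 7:6 -- a much narrower interval

def ratio(pair):
    """Interval between two frequencies as a high/low ratio."""
    low, high = sorted(pair)
    return high / low
```

So although both bangs share the 600 Hz tone, the second pair sits much closer together, which is why the combined sound changes character.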



Glissando Linear

The process for this patch is started with a bang; another bang is then triggered every half a second by a metronome, whose output generates a new random number at each interval. The 'line' object is the key to the glissando, as it allows ramping between the randomly generated pitches, producing the glissando effect.
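The ramping behaviour of 'line' can be modelled as simple linear interpolation. This Python sketch (function name and step size are my own choices, not Pd's) shows how a new random target every 500 ms is approached gradually rather than jumped to:

```python
import random

def line_ramp(start, target, ramp_ms, step_ms=1):
    """Rough model of Pd's [line]: interpolate linearly from
    start to target over ramp_ms, one value per step_ms."""
    steps = max(1, ramp_ms // step_ms)
    return [start + (target - start) * i / steps
            for i in range(1, steps + 1)]

# Every 500 ms the metro fires and a new random pitch is chosen;
# [line] ramps smoothly from the previous value -- the glissando
current = 440.0
target = random.uniform(200.0, 1000.0)
trajectory = line_ramp(current, target, ramp_ms=500)
```

Without the ramp the pitch would step instantly, giving a random bleep sequence; the interpolation is what turns it into a slide.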


Linear vs. Logarithmic

The two chains in this patch start with message boxes which, when clicked, send their data into a 'line' object, which has the effect of ramping the pitches. To output the signal, the result is routed to an oscillator; the linear chain, however, has an object to convert MIDI note numbers to frequency.
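The difference between the two chains can be demonstrated numerically. Assuming (my example values) a ramp from MIDI note 60 to 72, ramping in MIDI and then converting gives equal musical steps, whereas ramping raw frequency gives equal Hz steps, which sound bunched towards the top of the slide:

```python
def mtof(m):
    """Standard MIDI-to-frequency conversion."""
    return 440.0 * 2 ** ((m - 69) / 12)

def ramp(start, end, steps):
    """Evenly spaced values from start to end, inclusive."""
    return [start + (end - start) * i / (steps - 1)
            for i in range(steps)]

# Ramp in MIDI, then convert: equal musical (perceptual) steps
midi_gliss = [mtof(m) for m in ramp(60, 72, 5)]

# Ramp in raw Hz between the same endpoints: equal frequency steps
hz_gliss = ramp(mtof(60), mtof(72), 5)
```

Both glissandi start and end on the same frequencies, but the midpoints differ: the MIDI-ramped version sits lower in Hz at the middle because pitch perception is logarithmic in frequency.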





Friday, 15 October 2010

Pure Data vs. Chuck


Pure Data is the open-source creation of Miller Puckette (also the author of Max/MSP). It is a visual (GUI) form of programming audio using objects that are patched together in an attempt to recreate a patchable analogue synth; this type of coding is often called a dataflow programming language. Pure Data is an object-based language, that is to say it is centred around manipulating entities such as variables, functions or values to produce the desired outcome.
ChucK is a far less intuitive program, mainly because it is presented as a command-line interface (CLI), although efforts have been made to increase the readability of the code. Being a CLI, the correct syntax has to be learnt, making it slower (potentially only initially) than PD to operate. However, it differs from normal programming in that it runs in real time; this means the program does not need to stop running to be altered.

Similarly to PD, it is an object-based language, taking elements of C++ and Java to form its own language; it still uses classes, arrays and types.

Offering as much control over the sound as Pure Data, if not more, it is a powerful program for synthesis.


Overall the two programs are very similar in the way they run; it is the way they are presented that differs. Pure Data is designed to hide the majority of its workings and present a clear and instantly recognisable interface, whereas ChucK exposes those workings directly and offers a deeper form of synthesis and composition.


Apparently there is a ChucK theme song too!

Wednesday, 6 October 2010

Basic Programming Techniques: Definitions

Class: A class is like a blueprint for a group of similar objects. A real-world example would be the bicycle: a bicycle can change its characteristics but would still be classed as a bicycle.

An Object: An object in programming is similar to a real-world object; it has both state (e.g. on/off) and behaviour (e.g. the process of turning on and off).

An Array: An array is a fixed-size collection of elements that share the same data type, accessed by index.

Types: This refers to the type of data being used in code. The most common are integer (a whole number), real number (a number with a decimal point), boolean (true or false) and text.

Values: All variables must have a value. The value is what the variable is set to.

Variables: Named containers for values; they help to define an object's characteristics and change accordingly.

Concurrency: When parts of a program run in parallel, or when multiple tasks are in progress within the same time frame.
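Most of the definitions above can be illustrated in a few lines of Python. The `Bicycle` class is my own toy example, echoing the blueprint analogy:

```python
# A class (the blueprint) and an object (an instance of it),
# with state and behaviour
class Bicycle:
    def __init__(self):
        self.moving = False      # state

    def toggle(self):            # behaviour: turn on/off
        self.moving = not self.moving

bike = Bicycle()                 # an object of the Bicycle class
bike.toggle()                    # the object changes state

# An array: elements of the same type (fixed length in many languages)
gears = [1, 2, 3, 4, 5]

# Types and values: each variable holds a value of some type
tempo = 120        # integer
swing = 0.5        # real number
playing = True     # boolean
title = "loop"     # text
```

Python's lists are more flexible than a classic fixed-size array, but the idea of a sequence of same-typed values carries over.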

Monday, 4 October 2010

Electronic Music Notation

Differences between conventional musical notation and the derived notation for electronic music.

John Milton Cage, an influential twentieth-century American composer, did a great deal of experimental work on notation. Part of the attraction of electronic music is creating new sounds that are impossible to achieve through conventional instrumentation; to help represent these pitches and rhythmic differences, Cage experimented with new forms of notation, mainly graphical.



John Cage was by no means the only artist to experiment with graphical notation; Karlheinz Stockhausen too saw the possibilities available in a more open form of notation. Traditional notation allows for only one outcome; Stockhausen wanted to bring interpretation into the performance. In his piece Zyklus he created written music that can be read from any starting point and in any direction. This meant the performance would always be different, dependent not just on the music written down but on the performers themselves and their interpretations.

The most basic and common form of electronic music notation would be the MIDI matrix editor, found as standard in most music creation software. It has a piano roll up the left side to indicate which note is played, and the screen is then divided into bars, with a note represented as a block. The difference between this and standard notation is that the length of a note can be more precisely calibrated. Inflections such as vibrato and pitch bend can also be added, and it is a lot more visual than standard notation; colour can be used to represent the velocity of a note, for example.


A very modern and interesting form of musical notation is the use of graphics to create sound. Using spectrograms, where the x axis is time and the y axis is frequency, a visual representation of music can be seen. By reversing the process, an image can be given its corresponding sound; Aphex Twin created such a track on his Windowlicker EP, where a creepy image of his face shows up when watching the spectrogram.




Wednesday, 29 September 2010





A lot of people complain about laptop DJs these days, staring into their screens from atop the stage, having their shit 'beatmatched for them', but this evolution of it that Baths presents brings uniqueness (no one will ever hear that precise coordination of sounds again) and performance back into a live electronic set, something you don't get with mixing vinyl. So yeah, I've heard that Katy B song seven times this festival already; so what if it's a different face behind the decks playing it?

With people like Baths and his Akai and Daedelus with his Monome, I'd love to see a new type of live performance become a trend in the electronic world. It personalises the music for the crowd, something live acoustic music has always had, and I believe this use of new technology should be embraced by the electronic music scene.
