Friday, June 29, 2012

Granular Synthesis With ChucK

I found this wiki page on MultiGrain Granular Synthesis with ChucK. It includes code; however, the example audio files aren't working on my computer right now:

https://ccrma.stanford.edu/wiki/MultiGrain_Granular_Synthesis_in_Chuck

This might be a good place to start exploring granular synthesis in ChucK. It seems to me that we could very easily add to or alter the code to manipulate the synthesis.
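For instance, here is a minimal one-voice grain player sketched from scratch (my own illustration, not the wiki's code); the sample path, grain length, and rate range are all assumptions to adjust:

```chuck
// minimal one-voice granulator sketch (an illustration, not the wiki's code)
SndBuf buf => ADSR env => dac;
// assumed sample path; change to a .wav on your machine
"data/snare.wav" => buf.read;

// short attack/release so each grain fades in and out
env.set( 5::ms, 0::ms, 1.0, 10::ms );

while( true )
{
    // jump to a random start point and pick a random playback rate
    Std.rand2( 0, buf.samples() - 2000 ) => buf.pos;
    Std.rand2f( .5, 2.0 ) => buf.rate;
    env.keyOn();
    30::ms => now;   // grain length
    env.keyOff();
    10::ms => now;   // release tail before the next grain
}
```

Changing the grain length, rate range, or position range is exactly the kind of easy manipulation I mean.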

Judy, let me know what you think, and if this is on the right track...

-Julia

Thursday, June 28, 2012

ICAD June 2012 ChucK Workshop

SonifyingInChucKICAD2012.pdf

This is the printout from the ChucK Workshop held recently with Perry Cook.  ICAD was held in Atlanta on June 18-21, 2012.



Foundations of On-The-Fly Programming in the ChucK Programming Language

This paper was written by Rebecca Fiebrink, Ge Wang, and Perry Cook in 2008.
Abstract excerpt:
"We present three case studies of applying learning in real-time to performance tasks in ChucK, and we propose that fusing learning abilities with ChucK's real-time, on-the-fly aesthetic suggests exciting new ways of using and interacting with learning algorithms in live computer music performance."

More papers at http://smirk.cs.princeton.edu/

Wednesday, June 27, 2012

EXERCISE - On the Fly

I uploaded a PDF of an "evolving" exercise I created for the On-The-Fly Synchronization project, using the example files OTF_01.ck through OTF_07.ck in the examples directory.

Click here to see a PDF of the On-The-Fly Exercise

A mapping to the keyboard tutorial

// Not a full "lab" per se, but it could be developed into one. This is based on my experience working with the KIns.

To get ChucK working with the keyboard, there are a few essential things to include in your code:

1.
// keyboard
HidIn kb;
// hid message
HidMsg msg;

In the above code, "Hid" stands for "Human Interface Device": a device that you use to interact with your computer, like the keyboard. The first line declares the variable that will represent the HID, and the second declares the variable that will hold the information we receive from the HID.

2. 
// If the keyboard is not available, just exit the program right away
if( !kb.openKeyboard( 0 ) ) me.exit();
//Otherwise, assuming the program didn't exit on the last line, keep going
<<< "Ready?", "" >>>;

These lines are not strictly critical, but they are helpful to include in a file, especially for debugging purposes. You can of course change them around to suit your tastes, such as only printing a message if the keyboard *doesn't* open instead of when it does.

3.
// wait for event
    kb => now;

The above lines go inside the event loop, which is the (usually infinite) loop that makes the program run. They'll probably be the first thing inside the loop, unless your program has a reason for them not to be. Essentially, they pause the loop until the HID has a message to send.

4.
kb.recv( msg )

When this function returns true, a message has been received, such as a key press or release. It's probably best used in a while loop, such as while( kb.recv( msg ) ){....}, which will execute the body of the loop once for each pending message.

5. msg.which

The value of this variable is an int that corresponds to the key that triggered the message. Operators like ==, >, <, and != can be used with it to assign different actions to different keys.


These are the most important things to understand when working with keyboard input. Using them, you can construct an array that maps the keys on your keyboard to meaningful values. For example, to make a program that plays notes when the keyboard is pressed, you could make an array in which the key-mapping integer is the index, and the value stored at that index is the frequency of the note to play.

But how can we find out the key-mapping integers if we don't know them or have a reference? An easy, if tedious, way is to simply print out msg.which whenever a keypress event is received. You can then write your own table by going through each key on your keyboard.
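Putting steps 1 through 5 together, here is a minimal sketch that prints the key-mapping integer for every key press (assuming keyboard device 0):

```chuck
// key-code printer: combines the pieces above
HidIn kb;
HidMsg msg;

// if the keyboard is not available, exit right away
if( !kb.openKeyboard( 0 ) ) me.exit();
<<< "Ready?", "" >>>;

while( true )
{
    // wait for a keyboard event
    kb => now;
    // handle every pending message
    while( kb.recv( msg ) )
    {
        if( msg.isButtonDown() )
            <<< "key code:", msg.which >>>;
    }
}
```

Run it, press each key in turn, and copy the printed codes into your table.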

Oscillators, Harmonics and Wavelength Resource




Oscillators, Harmonics and Wavelength Explained Simply

The Synth School website is now closed and "under construction," but this video survived on YouTube. It's a very basic and clear explanation, with great animations showing how saw and square waves can be built up from the original sine wave. Video: 10 minutes.
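As a companion to the video, here is a small sketch of the idea in ChucK (my own illustration, not from the video): summing odd sine harmonics at 1/n amplitude approximates a square wave.

```chuck
// additive synthesis: odd harmonics at 1/n amplitude approximate a square wave
Gain mix => dac;
.2 => mix.gain;

110.0 => float fund;   // fundamental frequency
SinOsc partials[5];

for( 0 => int n; n < partials.cap(); n++ )
{
    2*n + 1 => int harmonic;   // odd harmonics: 1, 3, 5, 7, 9
    partials[n] => mix;
    fund * harmonic => partials[n].freq;
    1.0 / harmonic => partials[n].gain;
}

5::second => now;
```

Adding more partials sharpens the corners of the wave; using every harmonic (odd and even) at 1/n amplitude heads toward a saw wave instead.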

Tuesday, June 19, 2012

Thursday, June 14, 2012

Kins Project Progress

//piano patch

//clarinet patch
Clarinet clarin => JCRev c => dac;
0 => c.gain;
.1 => c.mix;  

//flute patch

//organ patch
BeeThree org => JCRev o => Echo e => Echo e2 => dac;
o => dac;
   
// set delays
240::ms => e.max => e.delay;
480::ms => e2.max => e2.delay;
// set gains
//.6 => e.gain;
//.3 => e2.gain;
0 => o.gain;
0 => e.gain;
0 => e2.gain;
.05 => o.mix;

//brass patch


MAUI_View view;
view.name("Kins 2012");

MAUI_Button piano, clarinet, flute, organ, brass;
MAUI_LED lpiano, lclarinet, lflute, lorgan, lbrass;
MAUI_Slider volume;

view.size(500,250);

piano.pushType();
piano.size(100,100);
piano.position(0,0);
piano.name("piano");
view.addElement(piano);

clarinet.pushType();
clarinet.size(100,100);
clarinet.position(piano.x()+piano.width(),piano.y());
clarinet.name("clarinet");
view.addElement(clarinet);

flute.pushType();
flute.size(100,100);
flute.position(clarinet.x()+clarinet.width(),clarinet.y());
flute.name("flute");
view.addElement(flute);

organ.pushType();
organ.size(100,100);
organ.position(flute.x()+flute.width(),flute.y());
organ.name("organ");
view.addElement(organ);

brass.pushType();
brass.size(100,100);
brass.position(organ.x()+organ.width(),organ.y());
brass.name("brass");
view.addElement(brass);

lpiano.color(lpiano.blue);
lpiano.size(50,50);
lpiano.position(25,75);
lpiano.light();
view.addElement(lpiano);

lclarinet.color(lclarinet.blue);
lclarinet.size(50,50);
lclarinet.position(lpiano.x()+100, lpiano.y());
lclarinet.light();
view.addElement(lclarinet);

lflute.color(lflute.blue);
lflute.size(50,50);
lflute.position(lclarinet.x()+100,lclarinet.y());
lflute.light();
view.addElement(lflute);

lorgan.color(lorgan.blue);
lorgan.size(50,50);
lorgan.position(lflute.x()+100,lflute.y());
lorgan.light();
view.addElement(lorgan);

lbrass.color(lbrass.blue);
lbrass.size(50,50);
lbrass.position(lorgan.x()+100,lorgan.y());
lbrass.light();
view.addElement(lbrass);

volume.range(0,5);
volume.position (0,125);
volume.size(500,volume.height());
volume.name("Volume");
view.addElement(volume);

view.display();

[lpiano, lclarinet, lflute, lorgan, lbrass] @=> MAUI_LED leds[];
[622, 659, 698, 739, 783, 830, 880, 932, 987, 1046, 1479, 1567, 1244,
349, 493, 174, 369, 415, 554, 440, 220, 311, 329, 1661, 1760, 1864,
138, 391, 164, 184, 195, 207, 233, 246, 261, 2093, 2217, 1108, 587, 523, 155, 466, 146, 293, 277, 2489, 2637, 2793,
61101016] @=> int keys[];

[131, 139, 156, 165, 185, 208, 233, 247, 277, 311, 330, 370, 415,
131, 147, 165, 175, 196, 220, 247, 262, 294, 330, 349, 392, 440,
523, 554, 622, 659, 740, 831, 932, 988, 69, 78, 82, 93,
523, 587, 659, 698, 784, 880, 988, 65, 73, 82, 87] @=> int pitch[];

int ip, ic, ifl, io, ib;

function void volumeControl(){
    while (true){
       volume => now;
       if (ip == 1){
           <<<"volume">>>;
        }
        else if (ic == 1){
            volume.value() => c.gain;
        }
        else if (ifl == 1){
        }
        else if (io == 1){
            volume.value() => org.gain;
        }
        else if (ib == 1){
        }
        else{
        }
    }
}

function void ledControl(MAUI_LED led){
    for (0=>int i; i<leds.cap(); i++){
        leds[i].color(leds[i].blue);
        leds[i].light();        
    }
    led.color(led.green);
    led.light();
}

function void allinstrumentsControl(){
    while (true){
        if (ic == 1){
            clarinetSounds();
        }
        if (io == 1){
            organSounds();
        }
       
    }
  
   /* if (piano.state() == 0 || clarinet.state() == 0){
        while (true){
    <<< piano.state(), clarinet.state(), flute.state(),
    organ.state(), brass.state()>>>;
    if (piano.state() == 1){
        pianoControl();
    }
    else if (clarinet.state() == 1){
        clarinetControl();
    }
    else if (flute.state() == 1){
        fluteControl();
    }
    else if (organ.state() == 1){
        organControl();
    }
    else if (brass.state() == 1){
        brassControl();
    }
    else {
    }
}
}*/
}

function void pianoControl(){
    while (true){
        piano => now;
        if (piano.state() == 1){
            ledControl(lpiano);
            1=>ip;
            0=>ic=>ifl=>io=>ib;
            <<< ip, ic, ifl, io, ib>>>;
        }
    }
}

function void clarinetControl(){
    while (true){
        clarinet => now;
        if (clarinet.state() == 1){
            ledControl(lclarinet);
            1=>ic;
            0=>ip=>ifl=>io=>ib;
            <<< ip, ic, ifl, io, ib>>>;
            //clarinetSounds();
        }
    }
}

function void fluteControl(){
    while (true){
        flute => now;
        if (flute.state() == 1){
            ledControl(lflute);
            1=>ifl;
            0=>ip=>ic=>io=>ib;
            <<< ip, ic, ifl, io, ib>>>;
        }
    }
}

function void organControl(){
    while (true){
        organ => now;
        if (organ.state() == 1){
            ledControl(lorgan);
            1=>io;
            0=>ip=>ic=>ifl=>ib;
            <<< ip, ic, ifl, io, ib>>>;
        }
    }
}

function void brassControl(){
    while (true){
        brass => now;
        if (brass.state() == 1){
            ledControl(lbrass);
            1=>ib;
            0=>ip=>ic=>ifl=>io;
            <<< ip, ic, ifl, io, ib>>>;
        }
    }
}
function void pianoSounds(){
}
function void clarinetSounds(){
// HID
Hid hi;
HidMsg msg;
   
// which keyboard
0 => int device;
// get from command line
if( me.args() ) me.arg(0) => Std.atoi => device;
   
// open keyboard (get device number from command line)
if( !hi.openKeyboard( device ) ) me.exit();
<<< "keyboard '" + hi.name() + "' ready", "" >>>;
 
 
    // infinite event loop (the if was a while)
while (true){       
 // wait for event
 hi => now;
 // get message
 while ( hi.recv( msg ) ) {
     if( msg.isButtonDown() && ic == 1) {
         //0.75 => c.gain;
         volume.value() => c.gain;
         clarin.clear(1.0);
         1 => clarin.reed;
         Std.rand2f(0,1) => clarin.noiseGain;
         Std.rand2f(0,12) => clarin.vibratoFreq;
         Std.rand2f(0,1) => clarin.vibratoGain;
         Std.rand2f(0,1)=> clarin.pressure;                    
         // Std.mtof( msg.which + 45 ) => float freq;
         Std.mtof( msg.which+45) => float freq;
         freq $ int => int ifreq;
         <<< "Frequency: " + freq + "   " + ifreq>>>;
         if( ifreq > 20000 ) continue;
            for (0=>int i; i<keys.cap(); i++){
                if (ifreq == keys[i]){
                    pitch[i] => clarin.freq;
                    1 => clarin.noteOn;
                    //<<< "in: noteOn">>>;
                    300::ms=>now;
                }
                0 =>clarin.noteOff;
            }
        }
        0=>c.gain;
    }
 }
}

/**********************************************************************************************/
function void fluteSounds(){
}
/**********************************************************************************************/
function void organSounds(){
// HID
Hid hi;
HidMsg msg;
   
// which keyboard
0 => int device;
// get from command line
if( me.args() ) me.arg(0) => Std.atoi => device;
   
// open keyboard (get device number from command line)
if( !hi.openKeyboard( device ) ) me.exit();
<<< "keyboard '" + hi.name() + "' ready", "" >>>;

//0 => organ.gain;
   
// infinite event loop
while (true){
    // wait for event
    hi => now;
    // get message
    while ( hi.recv( msg) && io == 1) {
        // check
        //.75 => o.gain;
        volume.value() => o.gain;
        .6 => e.gain;
        .3 => e2.gain;
        if( msg.isButtonDown() ) {
            // Std.mtof( msg.which + 45 ) => float freq;
            Std.mtof( msg.which+45) => float freq;
            freq $ int => int ifreq;
            <<< "Frequency: " + freq + "   " + ifreq>>>;
            if( ifreq > 20000 ) continue;
           
            //.5 => organ.gain;
            for (0=>int i; i<keys.cap(); i++){
                if (ifreq == keys[i]){
                    pitch[i] => org.freq;
                    1 => org.noteOn;
                    80::ms=>now;
                }
                0 => org.noteOff;
             }
            
         }
         0=>e.gain;
         0=>e2.gain;
         0=>o.gain;
     }
 }
}
           
/**********************************************************************************************/
function void brassSounds(){
}

function void keyBoard(){
}
spork ~ pianoControl();
spork ~ clarinetControl();
spork ~ fluteControl();
spork ~ organControl();
spork ~ brassControl();
//spork ~ allinstrumentsControl();
spork ~ clarinetSounds();
spork ~ organSounds();

while (true){
    1::day=>now;
}

KIns progress: 6/14/12

New features:
-Sustained notes! When you release the key the note doesn't cut off immediately. Instead it fades out.
-Finer control over what keys do what. This is hopefully paving the way for the user to change the key mappings from a GUI.

Next step is to combine Lucy's GUI with this code below.

Wednesday, June 13, 2012

REFERENCES for ChucK Code

Clickable ChucK Manual
An online reference with clickable links to the ChucK manual.


LICK Library for ChucK
Many code examples.



SAMPLE COMPOSITION: To listen, Click Here, or go to the Dropbox/ChuckIt/Code Samples with WAV file folder and listen to the file "otf_01-07Combined...wav"

Try this sample composition, which uses a combination of .ck files that include .wav samples and ChucK-generated frequencies.  The /examples/data folder holds the sample .wav files used.  [Update the path in otf_01, _02, _03, _04 and 07.ck from /data/------.wav to reflect the correct path on your computer for this folder, e.g. change data/snare.wav to read /Users/localhost/chuck/examples/data/snare.wav]


/examples/
Open all otf_01.ck through otf_07.ck files
Update path to the .wav files
Add Shreds to the VM in any order

In miniAudicle:
                        ChucK, Start Virtual Machine [Hotkey Cmd-period]

In command line:
                        Type chuck otf_01.ck otf_02.ck … to chain all the files to the shreduler





OSC Update

Update on The Hallway Project - 6/13/12 - Name: RainfallPlayer


Tuesday, June 12, 2012

Kins Project Update

I have been working on the Kins project, and it is going very well so far. I am making use of the MAUI elements in miniAudicle, and they are very helpful. I prefer to post my code when I am done with everything (probably tomorrow), but you may view my GUI. The idea is to have individuals choose instruments they would like to play, and to allow them to control the volume of the instruments as well. Keyboard keys are mapped to make the instruments easier to play. I am done with my organ (built on from my last one), and I tried to make the code look like it was written by someone who has been programming for a while (using arrays instead of a bunch of if statements). Please feel free to critique the whole thing so we come up with the best project.


KIns progress: 06/12/12

Some significant progress on the KIns has been made! Here's a summary of my process so far:

I started using keyinmulti2.ck (http://smelt.cs.princeton.edu/code/keyboard/keyinmulti2.ck) from the S.M.E.L.T. website as my base. It already supported polyphony (playing more than one note at once) and variable note length, so those things can be crossed off the KIns project summary (I've updated that post too to reflect the work done so far). The way the program implements polyphony is very elegant: the actual noise-making is handled by one function, and when the program receives a message that a key has been pressed on the keyboard, it sporks a new shred of the keysound function; when a key-release message is received, it unsporks the proper shred to stop the sound. Therefore, pressing three keys at once causes three shreds to be sporked, which sit there playing their respective notes until they are unsporked. The number of notes you can have playing at one time seems to be limited only by the number of simultaneous key presses the computer is capable of registering.

I've made two specific modifications to keyinmulti2.ck.
1. I rearranged the key mappings to make a more linear progression. They were originally arranged in "frets" like on a guitar, so that there were a lot of overlapping notes between the key rows. That arrangement actually makes somewhat more sense for real music making, but for the beginner-level user that the KIns is aimed at, I think it just makes things a lot more confusing. There's probably a more optimal arrangement for the key mapping than the way I've got it arranged now, but I think that will be figured out through experimentation much later on.
2. I added the capability to switch between different instrument sounds by pressing the number keys. In this version, 1 switches to a basic sine wave, 2 to a saw wave, and 3 to a "Rhodey" instrument (from the STK instrument kit that's built into ChucK). It's extremely easy to edit or add instruments in the code, but I'm hoping eventually to do this via a GUI, and also to have GUI sliders for the individual properties of the instruments. For example, a sine wave has very few built-in controls, but the STK instruments have lots of different controls that vary with the instrument. Having a dynamically changing GUI would be great for exposing these controls.

One thing that would be good to work on next is the capability to sustain notes after the key is released, so that they fade out rather than being cut off abruptly by the shred being unsporked the moment the key is released. Another is of course the GUI, which could be built in MAUI (miniAudicle's built-in, but very simple, GUI toolkit). Processing also seems a promising choice for the GUI, though. Julia found some demos that show two-way interaction between Processing and ChucK, I think via OSC, so I might look into those to see if that would be possible.
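The fade-out idea could be sketched with an ADSR release stage, e.g. (a standalone illustration, not the KIns code itself):

```chuck
// let a released note fade out over the ADSR release time
SinOsc s => ADSR env => dac;
440 => s.freq;
// the 500 ms release stage is the fade-out after key release
env.set( 10::ms, 20::ms, .8, 500::ms );

env.keyOn();
1::second => now;   // "key held down"
env.keyOff();       // key released: the note now fades instead of cutting off
500::ms => now;     // let the release finish before the shred ends
```

In the KIns, the keysound shred would wait out the release time after the noteOff event before exiting.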

Here is the code for the current iteration of the KIns. It's very heavily commented, probably more so than is actually necessary.


/*----------------------------------------------------------------------------
S.M.E.L.T. : Small Musically Expressive Laptop Toolkit

Copyright (c) 2008 Dan Trueman.  All rights reserved.
http://smelt.cs.princeton.edu/
http://soundlab.cs.princeton.edu/

This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307
U.S.A.
-----------------------------------------------------------------------------*/

//-----------------------------------------------------------------------------
// name: keyinmulti2.ck
// desc: this program creates an array filled with the keyboard code for
// each key on the keyboard, indexed by the keyboard row and column. Then,
// it treats each row as a string, which is "tuned" in the code. Pressing 
// a key will play the note.
//
// This version supports polyphony, and the note ends when you release the key!
// Warning: Due to hardware (not our fault), you may not be able to play all chords.
//
// to run (in command line chuck):
//     %> chuck keyinmulti2.ck
//
// to run (in miniAudicle):
//     (make sure VM is started, add the thing)

//
//-----------------------------------------------------------------------------

//Hid = human interface device. It's a variable to hold whatever HID the program
//needs to use
Hid hi;
//Hidmsg contains data about what the HID is doing at any given moment
HidMsg msg;
//sound determines what sound the keyboard is making at any given time
0 => int sound;

//initializes the HID as the keyboard, and exits if there's no keyboard available
0 => int deviceNum;
hi.openKeyboard( deviceNum ) => int deviceAvailable;
if ( deviceAvailable == 0 ) me.exit();
<<< "keyboard '", hi.name(), "' ready" >>>;


//array with key codes, for MacBook anyhow
[
[30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 45, 46, 42], //1234... row
[20, 26, 8, 21, 23, 28, 24, 12, 18, 19, 47, 48, 49], //qwer... row
[4, 22, 7, 9, 10, 11, 13, 14, 15, 51, 52], //asdf... row
[29, 27, 6, 25, 5, 17, 16, 54, 55, 56]   //zxcv... row
] @=> int row[][];

//our big array of pitch values, indexed by HID key code
int keyToPitch_table[256];

//this function takes each row and tunes it in half steps, based
//on whatever fundamental pitch note specified
fun void tuneString(int whichString, int basepitch) {
    
    for (0 => int i; i < row[whichString].cap(); i++) {
        
        basepitch + i => keyToPitch_table[row[whichString][i]];
        
        <<<row[whichString][i], keyToPitch_table[row[whichString][i]]>>>;
        
    }
    
}

//tune the strings!! This starts at (I think) A1, and then continues up by halftones
//each key. To hear the progression, go from left to right z->?, then up to a->",
//then q->]

tuneString(3, 55);
tuneString(2, 65);
tuneString(1, 76);
//tuneString(0, 185);


//makes the key sounds!
//currently configured to let the top row (number keys) control the type of sound
//the KIns makes. The code below should be pretty self-explanatory.
fun void keysound(float freq, Event noteOff) {
    
    if( sound == 0 ){
        SinOsc sine => ADSR envelope => dac;
        envelope.set(80::ms, 25::ms, 0.1, 150::ms);
        
        freq => sine.freq;
        
        envelope.keyOn();
        noteOff => now;
        envelope.keyOff();
        150::ms => now;
        
        envelope =< dac;
    }
    else if( sound == 1 ){
        SawOsc saw => ADSR envelope => dac;
        envelope.set(10::ms, 25::ms, 0.1, 150::ms);
        
        freq => saw.freq;
        
        envelope.keyOn();
        noteOff => now;
        envelope.keyOff();
        150::ms => now;
        
        envelope =< dac;
    }
    else if( sound == 2 ){
        Rhodey voc => JCRev r => dac;
        freq => voc.freq;
        0.8 => voc.gain;
        .8 => r.gain;
        .2 => r.mix;
        
        voc.noteOn(1);
        noteOff => now;
        voc.noteOff(1);
        150::ms => now;
    }
}

Event noteOffs[256];

//infinite time loop
while( true )
{
    
    hi => now;
    
    //only does things when there's a message coming in from the HID
    while( hi.recv( msg ) )
    {
        //only if the message from the HID is that a button was pressed...
        if( msg.isButtonDown() )
        {
            //the following if/elseif statements check to see if the button press
            //should cause the type of sound to change. Only the value
            //of sound needs to change, as the keysound function
            //handles actually producing the appropriate sound.
            if(msg.which==30){
                0 => sound;
            }
            else if(msg.which==31){
                1=>sound;
            }
            else if(msg.which==32){
                2=>sound;
            }
            else{
                keyToPitch_table[ msg.which ] => Std.mtof => float freq;
                spork ~ keysound( freq, noteOffs[ msg.which ] );
            }
            
        }
        //if the message was not that a button was pressed...
        else
        {   
            noteOffs[ msg.which ].signal();
        }
    }
}


Monday, June 11, 2012

Processing and OSC and ChucK

I just found this web site:
http://visiblearea.com/blog/Processing_and_ChucK

that looks helpful in connecting code written in ChucK with code written in Processing, which makes animations, etc. Processing is open source, written in Java, and friendlier.

UPDATE:  Julia and I spent today working on Processing => OSC => ChucK.  We ran into some problems we needed to figure out:

ChuckFilePlayDemo and ChuckHelloWorld would run successfully once, but the second time we lost sound.  There was an if statement in the monitor.ck file that included kill commands for three different systems.  After commenting out everything but the killall -c chuck command for Mac OS, the ChuckFilePlayDemo file worked fine on each successive play, with sound, and killed all running instances.  However, the ChuckHelloWorld demo did not terminate after hitting the stop button in Processing.

The error message was TCP IP cannot bind to port 8888.  We learned this meant there was another socket trying to access port 8888, and was preventing the second play.  We typed "netstat -tan | grep 8888" into the command line of terminal and it showed several lines followed with CLOSE_WAIT.  This meant that the command to finish that was sent from Processing did not completely close out the connection.  We could not figure out what was different between the ChuckFilePlayDemo.pde that was now running successfully and ChuckHelloWorld.pde, because we made the same changes to monitor.ck.

Tonight I tried again, and ChuckHelloWorld.pde does stop and clears out any TCP/IP connections in Terminal so there is only one LISTEN; however, one residual tone plays continuously.  When I run ChuckHelloWorld.pde again, it does not override the tone.  The tone stays in the background of the subsequent call to ChuckHelloWorld.

We then met to work on a design based off of this code to create a visualization of rain based on the amount of amplitude received through the adc connected to an external microphone.


OSC for Java

So I'm working on creating the interactive/responsive animation for the hallway, and I've hit a few roadblocks. Can anyone help me figure this out?

javaOSC:

I downloaded the javaOSC folder from this site: http://www.illposed.com/software/javaosc.html. I added it to the package I'm working on, but the program isn't recognizing any of the classes. Anyone have any idea how to make it recognize the class names from the OSC download?

Making ChucK send the amplitude:

Jan sent me a great link which has some code, but I'm just kind of sitting here scratching my head as I try to figure out how it works. Help? (Here is the code: http://electro-music.com/forum/post-260184.html)
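My reading of the general idea (a sketch under assumptions, not the linked code; the OSC address and port here are made up): track the input amplitude with an envelope follower, then send it out via OSC at a regular interval.

```chuck
// envelope follower: rectify the mic input and smooth it
adc => FullRect rect => OnePole env => blackhole;
.99 => env.pole;   // closer to 1 = smoother envelope

// OSC sender (hypothetical host, port, and address)
OscSend xmit;
xmit.setHost( "localhost", 6449 );

while( true )
{
    // send the current amplitude as a float every 100 ms
    xmit.startMsg( "/hallway/amp, f" );
    env.last() => xmit.addFloat;
    100::ms => now;
}
```

The Java/Processing side would then listen on the same port and map the received float onto the animation.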

Java animation:

Is it just like animation in Python? Have a dy/dx and update the canvas with a small shift in location for each object?

Music theory for musicians

I decided not to make a powerpoint about music theory for musicians because it's such a widely covered topic already. Instead, I found some easy to use online resources that probably explain things better than I could. Most of these resources cover the systems that western music traditionally uses to write and play music (musical staves, types of notes, chords, harmony, etc).

http://www.musictheory.net/lessons
This site has excellent coverage of basic through intermediate/lower advanced topics of music theory. It lets you go through the lessons at your own pace and has exercises to test your knowledge too. However, if you're looking for a very quick summary, it may not be the best.

Here's a much more in-depth document that explains how notes relate to each other and how scales are constructed:
http://www.gospelmusic.org.uk/resources/cowpielessons.pdf

A very simply written summary of the basics:
http://datadragon.com/education/reading/

The Wikipedia page on music theory. Not very good as a learning tool, but not bad as a reference.
http://en.wikipedia.org/wiki/Music_theory


Digital to Audio Conversion and Intro to Synthesis Powerpoint

This is the powerpoint I spent most of last week researching for. I think it can be improved upon/added to, so please leave any thoughts/comments you have!!

I was planning on doing Wavetables in this powerpoint, but I quickly realized that wavetables make much more sense when one knows the basics of music synthesis. So the next powerpoint will be on wavetables... or at least, that is once again the plan.