Thursday, August 2, 2012

KIns Update

The Dropbox includes KIns_August1_JBH.ck.

Updates:
-- Clarinet: modified code to achieve a clarinet sound
-- Organ: modified code to achieve an organ sound

Future Work:
-- Play Loop: After several unsuccessful attempts to get the Play Loop button to work, it suddenly began playing back. I did have both Record Loop and Play Loop lit simultaneously for a while. I have been unable to identify exactly what happened, but the Console Window printed "here : string" repeatedly while it did.

Friday, July 27, 2012

Open Source Tools Using ChucK

ROB.VOX http://www.cacheflowe.com/?page=rob.vox "Rob.Vox is a piece of vocal processing software that you can run on your own computer. It runs in ChucK, a great audio-programming language and environment." - website

Playing with Rob.Vox - custom vocal processing software from CacheFlowe on Vimeo.

WEKINATOR "The Wekinator is a free software package to facilitate rapid development of and experimentation with machine learning in live music performance and other real-time domains. The Wekinator allows users to build interactive systems by demonstrating human actions and computer responses, rather than by programming." - website Demo video by Rebecca Fiebrink - Click Here

000000swan, Kinect, Unity 3D + Wekinator from phoenix perry on Vimeo.

Summary on Projects

Today’s great success concluded with the KIns Player automatically recording a loop for playback. Julia is still tinkering with playing back a loop while recording another instrument, but the toggle function on the Record Loop button is working. The KIns (Keyboard Instrument) is fully functional with a GUI on a Mac; it is not functional on Windows. With miniAudicle installed, the KIns is easy to install and use for simple play of the provided instruments. Future development includes keeping an eye out for the cross-platform Qt rewrite of miniAudicle being developed by its author, S. Salazar, and the ability to play a loop while recording a new one remains on the wish list. Also on the list: additional instruments, additional slider bars built with MAUI elements to enable even more live synthesis, and the development of scores.

The Rainfall Player works for users with Processing and ChucK installed; the installation is a bit more involved. The player works with any adc, whether an installed or external mic. Future development includes packaging it as a single download and adding different schemes, i.e. background images and objects, to rain something other than cats and dogs or shhhhh!

We discussed the overall learning process and agreed unanimously that the tools we used supported deeper learning of the principles. Our approaches varied in the early weeks, but in retrospect we felt a guided tour through the material should weave tool features together with in-depth explanation. For instance, very early on we read about sound and how sinusoidal waves combine to make more complex sounds. We feel that an introduction to making a simple SinOsc in ChucK would be an appropriate and helpful part of that discussion (a short sketch to that effect appears at the end of this post). Also, a visual such as TAPESTREA offers, showing how the generated wave appears, helps in understanding this fundamental concept.

By combining the building blocks of signal processing, synthesis, and programming, a stronger foundation of understanding can be achieved: simple sine waves, frequencies, classes, objects, variables, and parameters first; then complex waves, unit generators, and envelopes. This next layer of programming gives students more control, and understanding these basics will bring them greater success in creating the sounds they envision rather than relying on random experimentation. Once they have built up a small library of files, introduce functions so they can take blocks of code they have already written and learn how to reuse them and change parameters. This would also be a good time to introduce more complex timing and synchronization methods, sporking several files, and possibly event-driven options.

Filters made more sense to us once we also saw a visual display of how envelopes change sound. The Circle audio software gave a good overview of ADSR envelopes and how various other filters operate on sound. The examples folders of ChucK, miniAudicle, and TAPESTREA provide a great introductory resource for becoming familiar with how code is laid out; it is very easy to change values to learn how they affect the sound, and to discover and work through simple debugging errors.

TAPESTREA is an interesting program for creating a "musical tapestry" of pre-recorded sounds. It is experimental software with several known and unknown bugs, so a beginner may find it somewhat frustrating to work with on any significant level. But used for a clear purpose, like separating the frequencies of a pre-recorded wave file or mixing pre-recorded sounds with ChucK scripts, it makes it very easy to create a soundscape and record it to a file.
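As a concrete illustration of the kind of first example we have in mind, a minimal ChucK sketch (not taken from any of our project files) that plays a single sine wave could look like this:

// minimal SinOsc sketch: one sine wave, straight to the dac
SinOsc s => dac;
// set its frequency and loudness
440 => s.freq;
0.5 => s.gain;
// let time pass so we can hear it
2::second => now;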

Thursday, July 26, 2012

Update on MAUI for Windows

I received the following reply to my inquiry about whether MAUI will ever work on Windows for the KIns. The bad news is that it will not; the good news is: "The current MAUI elements will probably never work on Windows. The good news is that Spencer (chief author of the mini) is busy re-writing it and will use a cross-platform toolkit (Qt) which should mean functionality will be the same across all platforms. That should be nice (I don't use OSX either)." From Kassen, ChucK Users Forum

Courses Using ChucK Programming

Ge Wang Course List - Music 128 and CS 170 (beginning course for SLOrk). Link here to full list

California Institute of the Arts
Georgia Institute of Technology
McGill University
University of Victoria
UC Santa Barbara
Stanford Classes:  Fundamentals of Computer Generated Sound; Computational Algorithms; Composing, Coding and Performance w/Laptop Orchestra

Wednesday, July 25, 2012

Great ChucK Resource

Hey guys,

This is a great resource which seems to discuss all things ChucK and even provides some great libraries!

http://en.flossmanuals.net/chuck/index/


-Julia

Check out our June 13 entry for more info on this too. The online version has more than the manual and is searchable.

Monday, July 23, 2012

Rhodey Example

http://chuck.cs.princeton.edu/doc/examples/stk/rhodey.ck

I just wanted to share this because it actually sounds good. Motivation to one day get our ChucK programs to be pleasing to the ear, you know?
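For anyone who wants to poke at a similar sound without downloading the file, here is a stripped-down patch in the same spirit (my own minimal sketch, not the linked example itself):

// minimal Rhodey sketch (not the Princeton example file): STK Rhodey into a
// touch of reverb, playing a short ascending line
Rhodey voc => JCRev rev => dac;
0.8 => voc.gain;
0.2 => rev.mix;

[60, 64, 67, 72] @=> int notes[];
for( 0 => int i; i < notes.cap(); i++ )
{
    Std.mtof( notes[i] ) => voc.freq;
    1 => voc.noteOn;
    400::ms => now;
    1 => voc.noteOff;
    100::ms => now;
}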

-Julia

Wednesday, July 18, 2012


TAPESTREA is another software tool for learning about frequencies, filters, synthesis, and composition, and it can read ChucK scripts. The reason I find it so useful is that it gives visual context to signal processing. This 5-minute demo video gives a quick tour of the software's features. For example, a pre-recorded .wav file is played and its frequencies appear in the spectrogram display. The "separate" button creates individual files for each frequency in the .wav file, dividing the sound into stable foreground sinusoidal frequencies, bursts or transient events, and background noise (stochastic events). Extracting a frequency is done easily by clicking and dragging a window across the segment you want. Sliders make it easy to stretch, loop, synthesize, etc., and give visual/manual control over processing new sounds. However, the Lion roars on this one: although TAPESTREA incorporates ChucK programming features, the software does not run on Mac OS X Lion. Windows, Linux, or an earlier version of Mac OS X is necessary. I emailed Princeton and received confirmation of this by email.

Monday, July 2, 2012

Granular Synthesis Part 2

I'm making this a new post because the comments section acts up when I try to post there...

Jan pointed out that the use of MIDI devices is, in a sense, deprecated in ChucK. So I'm currently researching and playing with LiSa to do granular synthesis.

The nice thing about LiSa is that you can hook up anything that is a sound source - like an oscillator or SndBuf - and use that to make the grains. In addition, you can mathematically synthesize grains. This allows much more freedom in synthesis techniques than a MIDI controller.
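As a concrete starting point, here is a rough sketch of that workflow. Treat it as an assumption to verify against the examples/special files rather than a finished recipe: record a couple of seconds of a sine tone into LiSa's buffer, then scatter short grains from it at random positions and rates.

// rough LiSa sketch (typical usage as I understand it, not one of the example files)
SinOsc s => LiSa lisa => dac;
440 => s.freq;
0.25 => s.gain;

// allocate a 2-second internal buffer and record into it
2::second => lisa.duration;
1 => lisa.record;
2::second => now;
0 => lisa.record;
// disconnect the input once recording is done
s =< lisa;

// grain loop: grab a free voice, point it somewhere in the buffer, play a short burst
while( true )
{
    lisa.getVoice() => int voice;
    if( voice > -1 )
    {
        lisa.rate( voice, Std.rand2f( 0.5, 2.0 ) );           // random pitch/speed
        lisa.playPos( voice, Std.rand2f( 0, 1.9 )::second );  // random start point
        lisa.rampUp( voice, 10::ms );                         // quick fade in
        40::ms => now;
        lisa.rampDown( voice, 20::ms );                       // fade out and free the voice
    }
    60::ms => now;
}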

There are several LiSa example files in the examples/special folder. For some reason, however, a few of the files aren't working for me at the moment... But for those that do work, the sound is very unique...

This thread (also where I got the "garbage" MP3 and the "DigitalTape" code) has to do with granular synthesis. I will post any other threads I find in a comment.

-Julia

Using ChucK for Granular Synthesis

http://electro-music.com/forum/viewtopic.php?t=14196

Starting from the May 14 post, there is a good exchange between two users explaining how to create granular synthesis code from scratch or with LiSa. Kassen explains it well.

Friday, June 29, 2012

Granular Synthesis With ChucK

I found this wiki page on MultiGrain Granular Synthesis with ChucK. It includes code; however, the example audio files aren't working on my computer right now:

https://ccrma.stanford.edu/wiki/MultiGrain_Granular_Synthesis_in_Chuck

This might be a good place to start exploring with Granular Synthesis in ChucK. It seems to me that we could very easily add to/alter the code to manipulate the synthesis.
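While we look at that code, here is a rough grain sketch of my own to show the general idea (it is not the wiki code; the built-in "special:dope" sample is used only so no .wav path is needed): play short, enveloped slices of a SndBuf from random positions.

// rough granular sketch (not the CCRMA wiki code): short enveloped slices
// of a SndBuf taken from random positions at random rates
SndBuf buf => ADSR env => dac;
// "special:dope" is one of ChucK's built-in samples
"special:dope" => buf.read;
env.set( 5::ms, 0::ms, 1.0, 15::ms );

while( true )
{
    // jump to a random position, leaving room for one grain
    Std.rand2( 0, buf.samples() - 4000 ) => buf.pos;
    // vary the playback rate for pitch variation
    Std.rand2f( 0.8, 1.2 ) => buf.rate;
    // open and close a short envelope around the slice
    env.keyOn();
    30::ms => now;
    env.keyOff();
    20::ms => now;
}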

Judy, let me know what you think, and if this is on the right track...

-Julia

Thursday, June 28, 2012

ICAD June 2012 ChucK Workshop

SonifyingInChucKICAD2012.pdf

This is the printout from the ChucK Workshop held recently with Perry Cook.  ICAD was held in Atlanta on June 18-21, 2012.



Foundations for On-the-Fly Learning in the ChucK Programming Language

This paper was written by Rebecca Fiebrink, Ge Wang, and Perry Cook in 2008.
Abstract excerpt:
"We present three case studies of applying learning in real-time to performance tasks in ChucK, and we propose that fusing learning abilities with ChucK's real-time, on-the-fly aesthetic suggests exciting new ways of using and interacting with learning algorithms in live computer music performance."

More papers at http://smirk.cs.princeton.edu/

Wednesday, June 27, 2012

EXERCISE - On the Fly

I uploaded a PDF of an "evolving" exercise I created for the On-The-Fly Synchronization project, using the example files otf_01.ck through otf_07.ck in the examples directory.

Click here to see a PDF of the On-The-Fly Exercise

A keyboard mapping tutorial

// not a full "lab" per se but it does have the possibility to be developed into one. This is based on my experience working with the KIns.

To get ChucK working with the keyboard, there are a few essential things to include in your code:

1.
// keyboard
HidIn kb;
// hid message
HidMsg msg;

In the above code, "Hid" stands for "Human Interface Device," i.e. a device that you use to interact with your computer, like the keyboard. The first line declares the variable that will represent the HID, and the second declares the variable that will hold the information we receive from the HID.

2. 
// If the keyboard is not available, just exit the program right away
if( !kb.openKeyboard( 0 ) ) me.exit();
//Otherwise, assuming the program didn't exit on the last line, keep going
<<< "Ready?", "" >>>;

These lines are not critical, but helpful to include in a file, especially for debugging purposes. Obviously you can change them around to suit your tastes, such as only printing out a message if the keyboard *doesn't* open instead of if it does.

3.
// wait for event
    kb => now;

The above lines go inside the event loop, which is the (usually infinite) loop that makes the program run. They'll probably be the first thing inside the loop, unless your program has a reason for them not to be. Basically they pause the loop until the HID has a message to send.

4.
kb.recv( msg )

When this function returns true, a message has been received from the keyboard (for example, a key press or release). It's probably best used in a while loop or if statement, such as while( kb.recv( msg ) ){....}, which executes the body of the loop once for each message waiting in the queue.

5. msg.which

This variable holds an int that corresponds to the key the message refers to. Operators like ==, >, <, and != can be used with it to assign different actions to different keys.


These are the most important things to understand when working with keyboard input. Using them, you can construct an array that maps the keys on your keyboard to meaningful values. For example, to make a program that plays notes when keys are pressed, you could make an array indexed by the key mapping integer, where the value stored at each index is the pitch of the note to play.

How can we find out the key mapping integers if we don't know them or have a reference though? An easy, if tedious, way to do it is to simply print out msg.which whenever a keypress event is received. You can then write your own table by going through each key on your keyboard.
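Here is a small sketch pulling steps 1 through 5 together. The specific key codes in the table are assumptions from one laptop (they vary by keyboard), so print msg.which yourself and adjust:

// minimal keyboard-to-note sketch: prints every key code and plays a sine tone
// for a few mapped keys
HidIn kb;
HidMsg msg;

// exit if no keyboard is available
if( !kb.openKeyboard( 0 ) ) me.exit();
<<< "Ready?", "" >>>;

// table of MIDI notes indexed by key code (only a few filled in; codes are assumed)
int keyToNote[256];
60 => keyToNote[4];   // 'a' on this keyboard
62 => keyToNote[22];  // 's'
64 => keyToNote[7];   // 'd'

SinOsc s => dac;
0.0 => s.gain;

while( true )
{
    // wait for a keyboard event
    kb => now;
    while( kb.recv( msg ) )
    {
        // always print the code so you can build your own table
        <<< "key code:", msg.which >>>;
        if( msg.isButtonDown() && keyToNote[msg.which] > 0 )
        {
            Std.mtof( keyToNote[msg.which] ) => s.freq;
            0.5 => s.gain;
        }
        else if( msg.isButtonUp() )
        {
            0.0 => s.gain;
        }
    }
}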

Oscillators, Harmonics and Wavelength Resource




Oscillators, Harmonics and Wavelength Explained Simply

The Synth School website is now closed and "under construction" but this video survived on YouTube.  It's a very basic and clear explanation with great animations to show how to create saw and square waves from the original sine wave.  Video: 10 minutes.
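In ChucK terms, the video's idea of building waves out of sine harmonics can be sketched like this (a rough additive example of mine, not from the video): summing odd harmonics at 1/n amplitude roughly approximates a square wave.

// rough additive-synthesis sketch: sum the first few odd harmonics of 110 Hz
// at 1/n amplitude, which approximates a square wave
110.0 => float fundamental;
SinOsc partials[8];

for( 0 => int n; n < partials.cap(); n++ )
{
    // odd harmonics only: 1, 3, 5, ...
    2*n + 1 => int harmonic;
    partials[n] => dac;
    fundamental * harmonic => partials[n].freq;
    0.4 / harmonic => partials[n].gain;
}

// listen for a few seconds
4::second => now;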

Tuesday, June 19, 2012

Thursday, June 14, 2012

Kins Project Progress

//piano patch

//clarinet patch
Clarinet clarin => JCRev c => dac;
0 => c.gain;
.1 => c.mix;  

//flute patch

//organ patch
BeeThree org => JCRev o => Echo e => Echo e2 => dac;
o => dac;
   
// set delays
240::ms => e.max => e.delay;
480::ms => e2.max => e2.delay;
// set gains
//.6 => e.gain;
//.3 => e2.gain;
0 => o.gain;
0 => e.gain;
0 => e2.gain;
.05 => o.mix;

//brass patch


MAUI_View view;
view.name("Kins 2012");

MAUI_Button piano, clarinet, flute, organ, brass;
MAUI_LED lpiano, lclarinet, lflute, lorgan, lbrass;
MAUI_Slider volume;

view.size(500,250);

piano.pushType();
piano.size(100,100);
piano.position(0,0);
piano.name("piano");
view.addElement(piano);

clarinet.pushType();
clarinet.size(100,100);
clarinet.position(piano.x()+piano.width(),piano.y());
clarinet.name("clarinet");
view.addElement(clarinet);

flute.pushType();
flute.size(100,100);
flute.position(clarinet.x()+clarinet.width(),clarinet.y());
flute.name("flute");
view.addElement(flute);

organ.pushType();
organ.size(100,100);
organ.position(flute.x()+flute.width(),flute.y());
organ.name("organ");
view.addElement(organ);

brass.pushType();
brass.size(100,100);
brass.position(organ.x()+organ.width(),organ.y());
brass.name("brass");
view.addElement(brass);

lpiano.color(lpiano.blue);
lpiano.size(50,50);
lpiano.position(25,75);
lpiano.light();
view.addElement(lpiano);

lclarinet.color(lclarinet.blue);
lclarinet.size(50,50);
lclarinet.position(lpiano.x()+100, lpiano.y());
lclarinet.light();
view.addElement(lclarinet);

lflute.color(lflute.blue);
lflute.size(50,50);
lflute.position(lclarinet.x()+100,lclarinet.y());
lflute.light();
view.addElement(lflute);

lorgan.color(lorgan.blue);
lorgan.size(50,50);
lorgan.position(lflute.x()+100,lflute.y());
lorgan.light();
view.addElement(lorgan);

lbrass.color(lbrass.blue);
lbrass.size(50,50);
lbrass.position(lorgan.x()+100,lorgan.y());
lbrass.light();
view.addElement(lbrass);

volume.range(0,5);
volume.position (0,125);
volume.size(500,volume.height());
volume.name("Volume");
view.addElement(volume);

view.display();

[lpiano, lclarinet, lflute, lorgan, lbrass] @=>MAUI_LED leds[];
[622, 659, 698, 739, 783, 830,880, 932, 987, 1046, 1479,1567,1244,
349, 493, 174, 369, 415, 554, 440, 220, 311, 329, 1661, 1760,1864,
138,391,164,184,195,207,233,246,261,2093,2217,1108,587,523,155,466,146,293,277,2489,2637,2793,
61101016] @=>int keys[];
   
[131,139,156,165,185,208,233,247,277,311,330,370,415,
131,147,165,175,196,220,247,262,294,330,349,392,440,
523,554,622,659,740,831,932,988,69,78,82,93,
523,587,659,698,784,880,988,65,73,82,87] @=>int pitch[];

int ip, ic, ifl, io, ib;

function void volumeControl(){
    while (true){
       volume => now;
       if (ip == 1){
           <<<"volume">>>;
        }
        else if (ic == 1){
            volume.value() => c.gain;
        }
        else if (ifl == 1){
        }
        else if (io == 1){
            volume.value() => org.gain;
        }
        else if (ib == 1){
        }
        else{
        }
    }
}

function void ledControl(MAUI_LED led){
    for (0=>int i; i<leds.cap(); i++){
        leds[i].color(leds[i].blue);
        leds[i].light();        
    }
    led.color(led.green);
    led.light();
}

function void allinstrumentsControl(){
    while (true){
        if (ic == 1){
            clarinetSounds();
        }
        if (io == 1){
            organSounds();
        }
       
    }
  
   /* if (piano.state() == 0 || clarinet.state() == 0){
        while (true){
    <<< piano.state(), clarinet.state(), flute.state(),
    organ.state(), brass.state()>>>;
    if (piano.state() == 1){
        pianoControl();
    }
    else if (clarinet.state() == 1){
        clarinetControl();
    }
    else if (flute.state() == 1){
        fluteControl();
    }
    else if (organ.state() == 1){
        organControl();
    }
    else if (brass.state() == 1){
        brassControl();
    }
    else {
    }
}
}*/
}

function void pianoControl(){
    while (true){
        piano => now;
        if (piano.state() == 1){
            ledControl(lpiano);
            1=>ip;
            0=>ic=>ifl=>io=>ib;
            <<< ip, ic, ifl, io, ib>>>;
        }
    }
}

function void clarinetControl(){
    while (true){
        clarinet => now;
        if (clarinet.state() == 1){
            ledControl(lclarinet);
            1=>ic;
            0=>ip=>ifl=>io=>ib;
            <<< ip, ic, ifl, io, ib>>>;
            //clarinetSounds();
        }
    }
}

function void fluteControl(){
    while (true){
        flute => now;
        if (flute.state() == 1){
            ledControl(lflute);
            1=>ifl;
            0=>ip=>ic=>io=>ib;
            <<< ip, ic, ifl, io, ib>>>;
        }
    }
}

function void organControl(){
    while (true){
        organ => now;
        if (organ.state() == 1){
            ledControl(lorgan);
            1=>io;
            0=>ip=>ic=>ifl=>ib;
            <<< ip, ic, ifl, io, ib>>>;
        }
    }
}

function void brassControl(){
    while (true){
        brass => now;
        if (brass.state() == 1){
            ledControl(lbrass);
            1=>ib;
            0=>ip=>ic=>ifl=>io;
            <<< ip, ic, ifl, io, ib>>>;
        }
    }
}
function void pianoSounds(){
}
function void clarinetSounds(){
// HID
Hid hi;
HidMsg msg;
   
// which keyboard
0 => int device;
// get from command line
if( me.args() ) me.arg(0) => Std.atoi => device;
   
// open keyboard (get device number from command line)
if( !hi.openKeyboard( device ) ) me.exit();
<<< "keyboard '" + hi.name() + "' ready", "" >>>;
 
 
    // infinite event loop (the if was a while)
while (true){       
 // wait for event
 hi => now;
 // get message
 while ( hi.recv( msg ) ) {
     if( msg.isButtonDown() && ic == 1) {
         //0.75 => c.gain;
         volume.value() => c.gain;
         clarin.clear(1.0);
         1 => clarin.reed;
         Std.rand2f(0,1) => clarin.noiseGain;
         Std.rand2f(0,12) => clarin.vibratoFreq;
         Std.rand2f(0,1) => clarin.vibratoGain;
         Std.rand2f(0,1)=> clarin.pressure;                    
         // Std.mtof( msg.which + 45 ) => float freq;
         Std.mtof( msg.which+45) => float freq;
         freq $ int => int ifreq;
         <<< "Frequency: " + freq + "   " + ifreq>>>;
         if( ifreq > 20000 ) continue;
            for (0=>int i; i<keys.cap(); i++){
                if (ifreq == keys[i]){
                    pitch[i] => clarin.freq;
                    1 => clarin.noteOn;
                    //<<< "in: noteOn">>>;
                    300::ms=>now;
                }
                0 =>clarin.noteOff;
            }
        }
        0=>c.gain;
    }
 }
}

/**********************************************************************************************/
function void fluteSounds(){
}
/**********************************************************************************************/
function void organSounds(){
// HID
Hid hi;
HidMsg msg;
   
// which keyboard
0 => int device;
// get from command line
if( me.args() ) me.arg(0) => Std.atoi => device;
   
// open keyboard (get device number from command line)
if( !hi.openKeyboard( device ) ) me.exit();
<<< "keyboard '" + hi.name() + "' ready", "" >>>;

//0 => organ.gain;
   
// infinite event loop
while (true){
    // wait for event
    hi => now;
    // get message
    while ( hi.recv( msg) && io == 1) {
        // check
        //.75 => o.gain;
        volume.value() => o.gain;
        .6 => e.gain;
        .3 => e2.gain;
        if( msg.isButtonDown() ) {
            // Std.mtof( msg.which + 45 ) => float freq;
            Std.mtof( msg.which+45) => float freq;
            freq $ int => int ifreq;
            <<< "Frequency: " + freq + "   " + ifreq>>>;
            if( ifreq > 20000 ) continue;
           
            //.5 => organ.gain;
            for (0=>int i; i<keys.cap(); i++){
                if (ifreq == keys[i]){
                    pitch[i] => org.freq;
                    1 => org.noteOn;
                    80::ms=>now;
                }
                0 => org.noteOff;
             }
            
         }
         0=>e.gain;
         0=>e2.gain;
         0=>o.gain;
     }
 }
}
           
/**********************************************************************************************/
function void brassSounds(){
}

function void keyBoard(){
}
spork ~ pianoControl();
spork ~ clarinetControl();
spork ~ fluteControl();
spork ~ organControl();
spork ~ brassControl();
//spork ~ allinstrumentsControl();
spork ~ clarinetSounds();
spork ~ organSounds();

while (true){
    1::day=>now;
}

KIns progress: 6/14/12

New features:
-Sustained notes! When you release the key the note doesn't cut off immediately. Instead it fades out.
-Finer control over what keys do what. This is hopefully paving the way for the user to change the key mappings from a GUI.

Next step is to combine Lucy's GUI with this code below.

Wednesday, June 13, 2012

REFERENCES for ChucK Code

Clickable ChucK Manual
An online reference with clickable links to the ChucK manual.


LICK Library for ChucK
Many code examples.



SAMPLE COMPOSITION:   To Listen Click Here  or go to Dropbox/ChuckIt/Code Samples with WAV file folder and listen to file "otf_01-07Combined...wav"

Try this sample composition, which uses a combination of .ck files that include .wav samples and ChucK-generated frequencies. The /examples/data folder holds the sample .wav files used. [Update the path in otf_01, _02, _03, _04 and 07.ck from /data/------.wav to reflect the correct path to this folder on your computer, e.g. change data/snare.wav to read /Users/localhost/chuck/examples/data/snare.wav]


In the /examples/ folder:
Open all otf_01.ck through otf_07.ck files
Update the path to the .wav files
Add the shreds to the VM in any order

In miniAudicle:
                        ChucK > Start Virtual Machine [hotkey: Cmd-period]

In the command line:
                        Type chuck otf_01.ck otf_02.ck …. to chain all the files to the shreduler





OSC Update

Update on The Hallway Project - 6/13/12 - Name: RainfallPlayer


Tuesday, June 12, 2012

Kins Project Update

I have been working on the Kins project, and it is going very well so far. I am making use of the MAUI elements in miniAudicle and they are very helpful. I prefer to post my code when I am done with everything--probably tomorrow--but you may view my GUI. The idea is to let individuals choose the instruments they would like to play and allow them to control the volume of the instruments as well. Keyboard keys are mapped to make the instruments easier to play. I am done with my organ (built on from my last one), and I tried to make the code look like it was written by someone who has been programming for a while (I used arrays instead of a bunch of if statements). Please feel free to critique the whole thing so we come up with the best project.


KIns progress: 06/12/12

Some significant progress on the KIns has been made! Here's a summary of my process so far:

I started using keyinmulti2.ck (http://smelt.cs.princeton.edu/code/keyboard/keyinmulti2.ck) from the S.M.E.L.T. website as my base. It already supported polyphony (playing more than one note at once) and variable note length, so those things can be crossed off the KIns project summary (I've updated that post too to reflect the work done so far). The way the program implements polyphony is very elegant--the making of actual noise is handled by one function, and when the program receives a message that a key has been pressed on the keyboard, it sporks a new shred of the keysound function, and when a key release message is received, it unsporks the proper shred to stop the sound. Therefore, pressing three keys at once will cause three shreds to get sporked, which will sit there playing their respective notes until they are unsporked. The number of notes you can have playing at one time seems to be limited only by the number of simultaneous key presses the computer is capable of registering.

I've made two specific modifications to keyinmulti2.ck.
1. I rearranged the key mappings to make a more linear progression. They were originally arranged in "frets" like on a guitar, so that there were a lot of overlapping notes between the key rows. This arrangement actually makes somewhat more sense for real music making, but for the beginner level user that the KIns is aimed at, I think it just makes it a lot more confusing. I feel like there's probably a somewhat optimal arrangement for the key mapping that makes more sense the way I've got it arranged now, but I think that will be figured out through experimentation much later on.
2. I added the capability to switch between different instrument sounds by pressing the number keys. In this version, 1 switches to a basic sine wave, 2 is a saw wave, and 3 is a "Rhodey" instrument (from the STK instrument kit that's built into ChucK). It's extremely easy to edit the instruments/add new instruments in the code, but I'm hoping eventually to be able to do this via a GUI and to also have slider controls on the GUI for the individual properties of the instruments. For example, a sine wave has very few built-in controls, but the STK instruments have lots of different controls that vary with the instrument. Having a dynamically changing GUI would be great for giving more control over these instruments.

One thing that I think would be good to work on next is the capability to sustain notes after the key is released, so that they fade out rather than being cut off abruptly by the shred being unsporked right as the key is released. Another is of course the GUI, which could be executed in MAUI (miniAudicle's built-in, but very simple, GUI). Processing also seems a promising choice for the GUI, though. Julia found some demos that show two-way interaction between Processing and ChucK, I think via OSC, so I might look into those to see if that would be possible.

Here is the code for the current iteration of the KIns. It's very heavily commented, probably more so than is actually necessary.


/*----------------------------------------------------------------------------
S.M.E.L.T. : Small Musically Expressive Laptop Toolkit

Copyright (c) 2008 Dan Trueman.  All rights reserved.
http://smelt.cs.princeton.edu/
http://soundlab.cs.princeton.edu/

This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307
U.S.A.
-----------------------------------------------------------------------------*/

//-----------------------------------------------------------------------------
// name: keyinmulti2.ck
// desc: this program creates an array filled with the keyboard code for
// each key on the keyboard, indexed by the keyboard row and column. Then,
// it treats each row as a string, which is "tuned" in the code. Pressing 
// a key will play the note.
//
// This version supports polyphony, and the note ends when you release the key!
// Warning: Due to hardware (not our fault), you may not be able to play all chords.
//
// to run (in command line chuck):
//     %> chuck keyinmulti2.ck
//
// to run (in miniAudicle):
//     (make sure VM is started, add the thing)

//
//-----------------------------------------------------------------------------

//Hid = human input device. It's a variable to hold whatever HID the program
//needs to use
Hid hi;
//Hidmsg contains data about what the HID is doing at any given moment
HidMsg msg;
//sound determines what sound the keyboard is making at any given time
0 => int sound;

//initializes the HID as the keyboard, and exits if there's no keyboard available
0 => int deviceNum;
hi.openKeyboard( deviceNum ) => int deviceAvailable;
if ( deviceAvailable == 0 ) me.exit();
<<< "keyboard '", hi.name(), "' ready" >>>;


//array with key codes, for MacBook anyhow
[
[30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 45, 46, 42], //1234... row
[20, 26, 8, 21, 23, 28, 24, 12, 18, 19, 47, 48, 49], //qwer... row
[4, 22, 7, 9, 10, 11, 13, 14, 15, 51, 52], //asdf... row
[29, 27, 6, 25, 5, 17, 16, 54, 55, 56]   //zxcv... row

]   @=> int row[][];

//our big array of pitch values, indexed by ASCII value
int keyToPitch_table[256];

//this function takes each row and tunes it in half steps, based
//on whatever fundamental pitch note specified
fun void tuneString(int whichString, int basepitch) {
    
    for (0 => int i; i < row[whichString].cap(); i++) {
        
        basepitch + i => keyToPitch_table[row[whichString][i]];
        
        <<<row[whichString][i], keyToPitch_table[row[whichString][i]]>>>;
        
    }
    
}

//tune the strings!! This starts at (I think) A1, and then continues up by halftones
//each key. To hear the progression, go from left to right z->?, then up to a->",
//then q->]

tuneString(3, 55);
tuneString(2, 65);
tuneString(1, 76);
//tuneString(0, 185);


//makes the key sounds!
//currently configured to let the top row (number keys) control the type of sound
//the KIns makes. The code below should be pretty self-explanatory.
fun void keysound( float freq, Event noteOff ) {
    
    if( sound == 0 ) {
        
        SinOsc sine => ADSR envelope => dac;
        envelope.set( 80::ms, 25::ms, 0.1, 150::ms );
        
        freq => sine.freq;
        
        envelope.keyOn();
        noteOff => now;
        envelope.keyOff();
        150::ms => now;
        
        envelope =< dac;
    }
    else if( sound == 1 ) {
        
        SawOsc saw => ADSR envelope => dac;
        envelope.set( 10::ms, 25::ms, 0.1, 150::ms );
        
        freq => saw.freq;
        
        envelope.keyOn();
        noteOff => now;
        envelope.keyOff();
        150::ms => now;
        
        envelope =< dac;
    }
    else if( sound == 2 ) {
        
        Rhodey voc => JCRev r => dac;
        freq => voc.freq;
        0.8 => voc.gain;
        .8 => r.gain;
        .2 => r.mix;
        
        voc.noteOn(1);
        noteOff => now;
        voc.noteOff(1);
        150::ms => now;
    }
}

Event noteOffs[256];

//infinite time loop
while( true )
{
    
    hi => now;
    
    //only does things when there's a message coming in from the HID
    while( hi.recv( msg ) )
    {
        //only if the message from the HID is that a button was pressed...
        if( msg.isButtonDown() )
        {
            //the following if/elseif statements check to see if the button press
            //should cause the type of sound to change. Only the value
            //of sound needs to change, as the keysound function
            //handles actually producing the appropriate sound.
            if(msg.which==30){
                0 => sound;
            }
            else if(msg.which==31){
                1=>sound;
            }
            else if(msg.which==32){
                2=>sound;
            }
            else {
                keyToPitch_table[ msg.which ] => Std.mtof => float freq;
                spork ~ keysound( freq, noteOffs[ msg.which ] );
            }
            
        }
        //if the message was not that a button was pressed...
        else
        {   
            noteOffs[ msg.which ].signal();
        }
    }
}