OSC and JavaScript

Having my mouth cut open took a bit more out of me than I expected. I have nothing new to share about rendering graphics with font symbols in JavaScript, but here are some JS libraries that can do OSC, which will be needed for a later stage of this project. I was looking these up for another project that needs to both send and receive OSC. I don’t want to have to learn two different OSC libraries, so even though this project only needs to receive, I’m only interested in libraries that do both. All of these seem to be built on top of Node.js.

  • Kievii does a lot more than I need, but there’s OSC in it too.
  • OSC-web has some proxy stuff which could be useful for later projects
  • OSC-min is a minimal implementation that does everything I need
  • The website for Node.js has a list of popular OSC implementations
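
To get a feel for the shape of these libraries, here is roughly what sending and receiving look like with osc-min over UDP in Node.js. A minimal sketch, assuming made-up port numbers:

// minimal osc-min sketch: listen on one UDP port, send to another
// (the port numbers are arbitrary placeholders)
var osc = require('osc-min');
var dgram = require('dgram');

var sock = dgram.createSocket('udp4', function (msg) {
  try {
    console.log(osc.fromBuffer(msg));   // decode incoming OSC packets
  } catch (e) {
    console.log('not a valid OSC packet');
  }
});
sock.bind(9000);

// send a message with one float argument
var buf = osc.toBuffer({ address: '/pitch', args: [440.0] });
sock.send(buf, 0, buf.length, 57120, 'localhost');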

Back to rendering graphics soon. I’ve been reading an introduction to JavaScript, which is way too basic, but it’s a place to start. After that, I think what I need to know about is probably the canvas element, so I’ll be reading about that.
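For example, assuming a <canvas id="score"> element and a loaded music font (both names are mine, just for illustration), drawing a single glyph should only take a couple of lines:

// sketch: draw one music glyph on a canvas
var ctx = document.getElementById('score').getContext('2d');
ctx.font = '64px "Bravura Regular"';    // assumes the font has loaded
ctx.fillText('\uD834\uDD1E', 40, 100);  // U+1D11E, musical G clef
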
Also, obviously, I need to think about symbols that are not just bits of music notation font. If I also want to use text, I either need to write some or find something that I can use, such as a philosophical text or poem. If I want the text to be related to the mechanics of the piece, it should be about sounding and listening. There’s some odd text on this theme that I remember from the very start of Noise, Water, Meat, so I might go looking for that in translation. I really do need to make some paper sketches soon or there’s a risk that I’ll just be making a real-time, machine-listening pastiche of Redhead, and while that would be great fun, I do want to make this my own.
I’m also sort of wondering how I want page changing to work. Do I want the boxes to just fade in and then fade out? Do I want them to move their location on the screen? Do I want them to change size? Should the elements within them stay fixed? I’m thinking that if the box is moving or changing size, the stuff in it can’t also move or it’s too much.
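One cheap way to prototype the fading option would be to redraw every frame with a changing globalAlpha. A rough sketch, again assuming a hypothetical <canvas id="score">:

// sketch: fade a box (and its fixed contents) in and out on a canvas
var ctx = document.getElementById('score').getContext('2d');
ctx.font = '48px "Bravura Regular"';
var alpha = 0;
var dir = 1;
function frame() {
  ctx.clearRect(0, 0, 400, 300);
  alpha += dir * 0.01;
  if (alpha >= 1 || alpha <= 0) { dir = -dir; }  // reverse at either end
  ctx.globalAlpha = Math.min(1, Math.max(0, alpha));
  ctx.strokeRect(40, 40, 220, 120);              // the box
  ctx.fillText('\uD834\uDD1E', 60, 130);         // the symbol stays put inside it
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);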

Creative Pact Day 3

So I’m doing the Creative Pact, but I’m a day behind because I didn’t hear about it until September 2nd. My project is to write a vocal piece with a computer-generated score rendered in real time.
I don’t have a lot to report today. I learned some facts:

  • There is a new release of SCMIR
  • It is totally possible to serve fonts across the web for viewing web pages. You use a thing in CSS called @font-face
  • There is a new initiative for standardisation in music fonts called SMuFL (pronounced ‘smooful’). Compliant fonts implement the Unicode musical symbols block and then add a bunch of additional musical symbols.
  • Bravura, the reference font for SMuFL, implements it and looks alright.
  • (The tattoo on my arm is bass clef in the font that shipped with Sibelius 3. What font is that?)

I can’t do a proper demo of this because fonts included in a web page need to be served from the same server as the web page (at least for Firefox users), but this style.css bit sets the font to Bravura for the whole of a web page:

@font-face {
  font-family: "Bravura Regular";
  src: local("Bravura Regular"),
       local("BravuraRegular"),
       url(Bravura.otf) format("opentype");
}

body {
  font-family: "Bravura Regular", Sonata, sans-serif;
}

That uses the local copy of Bravura, if it’s already on your machine, or downloads it if not. If somehow that goes wrong, it uses the Sonata font instead. When I get a chance, I’ll upload a proper demo to my website, doing a table of all the musical symbols. But probably not tomorrow, as I’m going to have dental surgery in the morning and will probably feel rotten later in the day.
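
When I do, the core of the symbol table will presumably be a loop over the Unicode musical symbols block (U+1D100 to U+1D1FF). A quick, untested sketch of my own; note that Bravura’s extra SMuFL glyphs live elsewhere (in the Private Use Area), so this only covers the standard block:

// sketch: dump the Unicode musical symbols block into the page
var list = document.createElement('div');
for (var cp = 0x1D100; cp <= 0x1D1FF; cp++) {
  var row = document.createElement('div');
  // build the surrogate pair by hand (String.fromCodePoint also works)
  var hi = 0xD800 + ((cp - 0x10000) >> 10);
  var lo = 0xDC00 + ((cp - 0x10000) & 0x3FF);
  row.textContent = 'U+' + cp.toString(16).toUpperCase() + ' ' + String.fromCharCode(hi, lo);
  list.appendChild(row);
}
document.body.appendChild(list);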

Anxiety and Musical Form

My gf asked me why I don’t blog about having anxiety. There are a few reasons I haven’t mentioned it much of late. Partly it’s that while it shapes a certain amount of my experience, I absolutely do not want it to become a point of identity. Partly, I’d rather talk about things that are more interesting, like music. (There’s nothing quite so boring as other people’s health problems.) And maybe most importantly, I’m embarrassed about it.
But yeah, something about my fight-or-flight response is not working correctly and I get panic attacks.
I’m going to get a referral for CBT – cognitive behavioural therapy, which has been shown to be effective. I kind of know what leads to my panic attacks. If I’m stressed or not eating right or especially if I’m short on sleep, I’m much more likely to get them. I read somewhere that they normally last about half an hour and I don’t know if this is true or not, but I do know that if I wait and take deep breaths, they tend to go away.
In case you’ve never had a panic attack (lucky you), I shall attempt to describe it. For me, it is what it sounds like: a sudden stab of panic. You know when you go out and you’re nearly to the tube station and think, ‘oh my god, did I leave the stove on?!’ It’s like that, but I tend to think maybe I have cancer or something. While one can turn around and go home and check whether one has forgotten a pan of beans bubbling away, one can’t do that with a sudden fear of a dread disease. So I try to talk myself out of it, but can get into a loop: ‘I’m fine because of X. But what about Y?’ Repeat for several minutes. This is very annoying. Especially when I know I’m in a loop. I know I’m being irrational. But I can’t seem to shake it. Because what about Y??! And maybe it just goes away quietly, or maybe I start shaking and phone NHS Direct.
What sets these off? Sometimes it’s because something is actually wrong with me. And, indeed, when I said I phone NHS Direct, this is actually not entirely true – I have a stone in one of my spit glands. It hurts occasionally and once in a great while, it blocks something and swells a bit. Eventually, somebody is going to remove it. But anyway, one day, before it was x-rayed, my neck swelled slightly and I saw it in the mirror and went into a panic. Sure, the dentist said he thought it was probably a stone, but what if it’s actually a terrible infection that’s spread to something important in my neck before it turns into blood poisoning or gets into my brain and might kill me by morning. (‘I’m fine because if I had a horrible infection, I would have a fever. But what about my swollen neck?’ Repeat for several minutes.) It was a Friday evening, so I phoned NHS Direct and they said it sounded like something that might happen with a spit gland stone and if it hadn’t gone down by Monday morning, I should ring my dentist. This was exactly the calm reassurance I wanted from the NHS. (God bless the NHS.) Then, unexpectedly, they started reading a list of diseases that might be associated with swollen necks. Glandular fever. (Mononucleosis to Americans.) ‘No, I already had that.’ Mumps. ‘I’m vaccinated.’ Meningitis. ‘Doesn’t that usually have a fever?’ I was starting to get alarmed. The woman on the phone was starting to get annoyed. She just had a script to read through (I guess so nobody can say they weren’t warned?) and didn’t want to be interrupted. Lupus. ‘Wait, what? How do I know if I have lupus??’ I hadn’t even considered this possibility. The woman sighed in an irritated manner. I have not rung the NHS Direct since.
Most often what sets off a panic attack is that I’m trying to ignore some emotion I’m having. Something has upset me. I don’t want to deal with it. I tell myself I’m fine and carry on. And then: ‘Wait, what if that cat I just pet has rabies?’ Sometimes, I can identify what emotion or thought I’m trying to nullify and go deal with it and be fine. Often it’s just some really small erasure. Very often, I’m not even aware that I’m doing it. Some part of my brain has intercepted my experience and tried to overwrite it and I haven’t even noticed. This is what I hope CBT can help with, since I really want to stop doing that. I mean, I don’t like having negative emotions, but they’re vastly preferable to panic attacks.
Also, these erased emotions are sometimes important ones, but often are fairly small. So let’s say I’ve read something that reminds me of LGBT-phobia in schools and it’s affecting, and I just don’t feel like thinking about that at the moment, so there’s an erasure. ‘What if that friendly cat is actually rabid?’ I try to ignore the thought, so there’s another avoidance. ‘I am feeling panicked about this cat and this is stupid, so I must try to hide it or else everyone will know I’m crazy.’ And thus something small builds.

Musical Forms

I did my undergraduate music education sort of the wrong way around. I didn’t expect to want to get a degree in it, so I started by taking all the upper division classes and seminars because they were the most interesting. So I learned about how John Cage rejected all the old ways of doing things that were no longer useful in the 20th century. Music theory is old and unneeded! I accepted this at face value. (Never mind how a lot of minimalists, whom I loved, worked a lot with harmony.) Then, after I decided I wanted to get a music degree, I had to take all the first year classes in counterpoint, history and all of this stuff I had already rejected. I insisted that John Cage had said it was useless. My teachers disagreed, but since they were the same ones who had told me a few months earlier that this was the stuff of the past, I felt their position was somewhat weakened. Anyway, I graduated, having learned as little traditional stuff as possible.
It wasn’t long after graduation that I became aware that perhaps I had been overzealous in my embrace of Cagean values. I wanted to write something harmonic, but all I actually knew how to do were chorales and I didn’t even know most of the rules for them. How to structure anything was a mystery. I knew that forms existed and that they had names, but what those names meant, I had no idea.
I got to grad school (a decade ago) and somebody commented that everything I wrote was in sonata form. I had no idea. Maybe I should mix this up a bit more? I didn’t know any other forms.
Obviously the thing to do about this gap in my knowledge was to feel deep shame and attempt to hide it. So rather than read a book or ask a teacher, I just hoped nobody noticed how I was nowhere near good enough to actually be qualified to be in an MA program. (To be fair, I was REALLY busy trying to learn every other thing that everybody around me already seemed to know.)
At some point, I finally learned that while there are named forms that exist, form is arbitrary. It’s just any structure you can use to make sense of things. It can sometimes be implied by the material, or it can be decided in advance. Some Cage pieces are all about form. They are vessels into which you can pour any material and the structure somehow causes the material to sound better. Forms are like that. I don’t feel worried about them any more and while my classical vocabulary is still a bit lacking, I’m not overly concerned about this. Anyway, since I moved to England, I don’t even know note names any more, so not knowing how to organise a minuet is somewhat less important than not being able to remember the duration of a semi-quaver or a minim.
If you are hoping for some insight: The important thing to remember about structures is that they fix musical problems. This is how to write a piece of music: Put your material into a structure, then get the ending perfect, then get the beginning nice, then do the middle bits. Then write some glue to hold everything together. If your middle section is supposed to be 5 minutes long but is getting dull by minute three, you can either make it shorter or subdivide it – so instead of being one long thing, it’s got an ABA or ABC structure within it, so it moves between related ideas. That is to say, add some stuff.
As I recall, Cage didn’t talk much about the importance of structure directly, but he implied the hell out of it. Lecture on Nothing is nothing but structure. This is much more vital than the ‘ignore harmony’ that I first got from my youthful introduction. I was over-eager to be freed from a prison I’d never even been in. But aren’t we all? Especially when we’re young.

What’s the Connection?

This post is in AB form, with a small coda.

A useful script

The best way to remember to do something when you’re going to run some important program is to put it in the program itself. Or at the very least, put it in a script that you use to invoke the program.
I have a few things I need to remember for the performance I’m preparing for. One has to do with a projector. I’m using a stylus to draw cloud shapes on my screen. And one way I can do this is to mirror my screen to a projector so the audience can see my GUI. However, doing this usually changes the geometry of my laptop screen, so that instead of extending all the way to the edges, there are empty black bars on either side of the used portion of my display. That’s fine, except the stylus doesn’t know and doesn’t adjust. So to get the pointer to the far right edge of the drawn portion of the screen, I need to touch the stylus to the far right edge of the physical screen, which puts over a centimetre between the stylus tip and the arrow pointer. Suboptimal!
Ideally, I’d like to have any change in screen geometry trigger a script that changes the settings for the stylus (and I have ideas about how that may or may not work, using upstart, xrandr and xsetwacom), but in the absence of that, I just want to launch a manual calibration program. If I launch the settings panel, there’s a button on it that launches one. So the top part of my script checks whether the screen geometry is different from normal and launches the settings panel if it is.
The next things I need to remember are audio related. I need to kill pulseaudio. If my soundcard (a Fast Track Ultra) is attached, I need to change the amplitude settings internally so it doesn’t send the input straight to the output. Then I need to start jack using it. Or if it’s not attached, I need to start jack using a default device. Then, because it’s useful, I should start up Jack Control, so I can do some routing, should I need it. (Note: in 12.04 if you start qjackctl after starting jackd, it doesn’t work properly. This is fixed by 13.04.) Finally, I should see if SuperCollider is already running and if not, I should start it.
That’s a bit too much to remember for a performance, so I wrote a script. The one thing I need to remember with this script is that if I want to kill jack, it won’t die from Jack Control, so I’ll need to do a kill -9 from the prompt. Hopefully, this will not be an issue on stage.
This is my script:

#!/bin/bash

# first check the screen geometry

LINE=$(xrandr -q | grep Screen)
WIDTH=$(echo ${LINE} | awk '{ print $8 }')
HEIGHT=$(echo ${LINE} | awk '{ print $10 }' | awk -F"," '{ print $1 }')

if [[ ${WIDTH} != 1366 || ${HEIGHT} != 768 ]]
then
  # geometry has changed, so launch the settings panel to recalibrate the stylus
  gnome-control-center wacom
else
  echo normal resolution
fi

# now set up the audio

pulseaudio --kill

# is the Ultra attached?
if aplay -l | grep -qi ultra
then
  echo ultra

  # adjust amplitude: route each digital input to its own output only,
  # and silence all direct analogue input monitoring
  for i in $(seq 8); do
    for j in $(seq 8); do
      if [ "$i" != "$j" ]; then
        amixer -c Ultra set "DIn$i - Out$j" 0% > /dev/null
      else
        amixer -c Ultra set "DIn$i - Out$j" 100% > /dev/null
      fi
      amixer -c Ultra set "AIn$i - Out$j" 0% > /dev/null
    done
  done

  #for i in $(seq 4); do
  # amixer -c Ultra set "Effects return $i" 0% > /dev/null
  #done

  # start jack with the Ultra
  jackd -d alsa -d hw:Ultra &
else
  # start jack with default hardware
  jackd -d alsa -d hw:0 &
fi

sleep 2

# jack control, for routing
qjackctl &

sleep 1

# is SuperCollider already running? if not, start it with my performance file
if ps aux | grep -vi grep | grep -q scide
then
  echo already running
else
  scide test.scd &
fi

Live code, code based interfaces and live patching – theory and practice

Some theory

Not every use of code interaction on stage is an instance of live coding. When I first started working with SuperCollider a decade ago, I didn’t create GUIs. I started and stopped code on stage. I had comments in the code giving me instructions on how to do this. One piece instructed me to count to ten between evaluating code blocks. Another told me to take a deep breath.
Part of this was because I hadn’t yet learned how to invoke callback methods or use tasks. Some of it was to create a musical timing – a deep breath is not the same as a two second pause. This was undoubtedly using code interactions to make music. But in no sense were these programs examples of live coding. Once a block was started, there was no further intervention possible aside from halting execution or turning down a fader to slowly mute the output. These pieces were live realisations of generative music, which means they had virtually no interactivity once started, whether by code or by other means.
There is no bright line separating code-based interfaces from live coding, but instead a continuum between pieces like the ones I used to write and blank-slate live coding. The more interactive the code, the farther along this continuum something falls. Levels of interaction could be thought to include starting and stopping, changing parameters on the fly, and changing the logic or signal graph on the fly. Changing logic or the signal graph puts one closer to blank-slate coding than does just changing numbers on something while it plays.
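To make these levels concrete, here is a toy illustration (my own, in JavaScript with the Web Audio API, not taken from any particular live coding system). The first intervention only changes a number while everything keeps running; the second rewires the signal graph itself:

// toy illustration of levels of live interaction
var ac = new AudioContext();
var osc = ac.createOscillator();
var gain = ac.createGain();
osc.connect(gain);
gain.connect(ac.destination);
osc.start();                      // starting and stopping: the lowest level

// changing a parameter on the fly: one step up the continuum
osc.frequency.value = 220;

// changing the signal graph on the fly: closer to blank-slate coding
var filter = ac.createBiquadFilter();
osc.disconnect();
osc.connect(filter);
filter.connect(gain);
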
This argument does imply a value judgement about authenticity; however, that is not my purpose. Different types of code interaction are better suited to different circumstances. A piece that is more live coded isn’t necessarily sonically or objectively better. However, this kind of value judgement is useful in applying the metaphor of live coding to other interactions.
I have been pondering for a while whether or not live synthesiser patching is an analogue form of live coding, a question first posed by Julian Rohrhuber (2011) on the live coding email list. On the one hand, the kind of analogue modules used for modular synthesisers were originally developed for analogue computers. The synthesiser itself is a general purpose tool for sound, although obviously limited to whatever modules are available. (Thus putting it somewhere between a chainsaw and an idea. (TopLap 2010)) Both computer programs and live patches can quickly grow in complexity to where the performer can no longer comprehend exactly what’s happening. (Collins 2007)
On the other hand, there is no code. However, I’m not sure how much that matters. A Pd or Max patch created on the fly that creates a signal graph is clearly an example of live coding. If for some reason the patching language had hard limits on what unit generators were available and in what quantity, this would still count. Therefore the transition from virtual to physical seems small. Instead of focussing on the code itself, then, let’s look at the metaphor.
Knob twirling is an example of changing numbers on the fly. Modular synthesisers do contain logic in the forms of gates and switches. This logic and the musical signal routing can be changed on the fly via re-patching. Therefore, a live patching performance that contained all of these elements would be an example of analogue live coding.

Gig Report

I very recently did some live patching at the Live Code Festival in Karlsruhe. Alas, this reasoning about what is or is not live coding did not become clear to me until I was reflecting on my performance afterwards. This was the first time I did patching with goals other than just making nice sounds, which meant I was pushing against the places the sounds wanted to settle, and I realised on stage that I was inadequately prepared. Both live coding and live patching share the problem of how to prepare for a show, something they have in common with other forms of improvised or partially improvised music.
I had a conversation with Scott Wilson about how to practise improvised music that has an agenda. I should have spent the few days before the show building patches that use gates to control timbral or graph changes. I should have also practised making graph changes in the middle of playing. Instead, I spent the days ahead wrestling with problems with Jack on Linux. I use SuperCollider to manage some panning and recording for me and was having tech problems with it. Mixing analogue and digital systems in this way exposes one to the greater inherent instability of computers. I could make my own stereo autopanner with some envelope followers, a comparator and a panner, so I’ll be looking into putting something together out of guitar pedals or seeing if there is an off-the-shelf solution available.
For this performance, I decided to colour code my cables, following the colour conventions of Sonology in The Hague: black for audio, blue for control voltages and red for triggers. This was so audience members with synthesiser knowledge might be able to at least partly decode my patches. However, this caused a few problems. Normally, I play with cables around my neck and I’ve never before paid attention to cable colours while playing live. This time, every time I looked down, I saw only red cables but never actually wanted to use them. For the audience, I tried to make up for the difficulty of seeing a distant synth by using a web cam to project live images of it, but the colour was lost in the low resolution of the web cam. People who tried to just look at the synth directly would have trouble perceiving black cables on a black synth. If I do colour coding again, I need to swap the colours I use and not wear them around my neck. A better webcam might also help.
Aside from the low resolution, the web cam part was successful. I also set the program up so that if I pressed certain buttons on my midi device, slides of modules would be displayed. So when I patched an oscillator, I pushed the oscillator button on the midi controller and a labelled picture of an oscillator appeared in the upper left corner. I didn’t always remember to push the buttons, but the audience appreciated the slides and I may extend this in future with more of my modules (I forgot to do the ring mod module) and also extend it to more combinations, so I would have a slide of two oscillators showing FM and one of three showing chaos.
Sonically, the patching seems to have been a success, although it was not fun to do, because I had an agenda I was trying to push towards but had not rehearsed adequately. I want to spend a lot of time now working this out and getting it right and doing another show, but that was it. My next presentation will be all SuperCollider and I need to work on that. I am thinking a lot, though, about what I might do for the Other Minds festival next spring. I wonder if live patching would be adequately ambitious for such a high profile gig….

Citations

Collins, Nick. ‘Live Coding Practice’. 2007. Proceedings of NIME 2007. [E-JOURNAL]
Available at: <http://www.nime.org/2007/proceedings.php> [Accessed 3 March 2012].
Rohrhuber, Julian. ‘[livecode] analogue live coding?’ 19 February 2011. [Email to livecode list].
Available at: <http://lists.lurk.org/mailman/private/livecode/2011-February/001176.html>
[Accessed 1 March 2012].
TopLap. ‘ManifestoDraft’. 14 November 2010. TopLap. [ONLINE] Available at:
<http://toplap.org/index.php/ManifestoDraft> [Accessed 12 September 2011].

Republic

It’s time for everybody’s favourite collaborative real time network live coding tool for SuperCollider.
Invented by PowerBooks UnPlugged – granular synthesis playing across a bunch of unplugged laptops.
Then some of them started Republic111, which is named for the room number of the workshop where they taught.
Code reading is interesting in network music, partly because of stealing, but also to understand somebody else’s code quickly, or to actively understand it by changing it. Live coding is a public or collective thinking action.
If you evaluate code, it shows up in a history file and gets sent to everybody else in the Republic. You can stop the sound of everybody on the network. All the SynthDefs are saved. People play ‘really equally’ on everybody’s computer. Users don’t feel obligated to act, but rather to respond. Participants spend most of their time listening.
Republic is mainly one big class, which is a weakness; it should be broken up into smaller classes that can be used separately. Scott Wilson is working on a newer version, which is on GitHub. Look up ‘The Way Things May Go’ on Vimeo.
Graham and Jonas have done a system which allows you to see a map of who is emitting what sound and you can click on it and get the Tdef that made it.
Scott is putting out a call for participation and discussion about how it should be.

David Ogborn: EspGrid

In Canada, laptop orchestras get tons of gigs.
Naive sync methods: do redundant packet transmission – so send the same value several times in a row. This actually increases the chance of collision, but probably one will get through. Or you can schedule further in advance and schedule larger chunks – so send a measure instead of just sending a beat.
Download it from esp.mcmasters.ca. Mac only.
5 design principles

  • Immediacy – launch it and you’ve got stuff going right away
  • Decentralisation – everything is peer to peer
  • Neutrality – works with ChucK, SuperCollider, whatever
  • Hybridity – they can even use different software on the same computer at the same time
  • Extensibility – it can schedule arbitrary stuff

The grid has public and private parts. EspGrid communicates with other apps via localhost OSC. Your copy of SuperCollider does not talk to the larger network; EspGrid handles all that.
The “private protocol” is not OSC. It’s going to use a binary format for transmission. Interoperability is thus based only on the client software, not on the middleware.
Because the Grid thing speaks OSC to clients, it can run on a neighbour’s computer and send the OSC messages to Linux users or other unsupported OSes.
The program is largely meant to be run in the background. You can turn a beat on or off, and this is shared across the network. You can chat. You can share clipboards. Also, ChucK will dump stuff directly.
Arbitrary OSC messages will be echoed out with a time stamp. You can schedule them for the future.
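Since it all goes over localhost OSC, sending such a message from Node.js should look something like this with osc-min. The address and port here are placeholders I made up, not EspGrid’s documented API:

// hypothetical sketch: send an arbitrary OSC message to middleware on localhost
// ('/esp/beat/on' and port 5510 are guesses, not documented EspGrid API)
var osc = require('osc-min');
var dgram = require('dgram');

var sock = dgram.createSocket('udp4');
var buf = osc.toBuffer({ address: '/esp/beat/on', args: [1] });
sock.send(buf, 0, buf.length, 5510, '127.0.0.1');
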
You can publish papers on this stuff or use it to test shit for papers. Like swap sync methods and test which works best.
Reference Beacon does triangulation to figure out latencies.
He wants to add WAN stuff, but not change the UI, so the users won’t notice.

Questions

Have they considered client/server topology for time sync? No. A server is a point of failure.
Security implications? He has not considered the possibility of sending naughty messages or how to stop them.
Licence? Some open source one… maybe GPL2. It’s on Google Code.

Chad McKinney – Lich.js – A Networked Audio / Visual Live Coding Language

They started with SuperCollider and have gone on from there. He’s into updates in browser technologies.
He decided to write a language first as a way to start live coding.
Uses Web Audio and WebGL
This language is GPL2 and is on GitHub
If you’re on Chrome, go mess with http://www.chadmckinneyaudio.com/Lich.js/Lich.html

Battery dying

Alex McLean

He did command line scripts 2001–2004, then started working in Perl, then Haskell.
slub – writing code to make music to drink beer to.
Feedback.pl – writing code to write code to make music to drink beer to
He read Laurie Spiegel’s paper on manipulations of musical patterns, so he got into pattern languages (e.g. HMSL, Common Music, SuperCollider)
Tidal is embedded in Haskell for pattern manipulation for music. Complexity through combination of simplicity.
Structures could be trees of structures…
So he moved to functions of time. Time is an integer. Give a Pattern a time and it gives you an integer and the periodicity. This is limited because time is an integer.
Now, he thinks of time as cyclic, with repetition. So make time a floating point instead of an int. But this makes lookups hard.
So he thought of having patterns be either a sequence of discrete stuff or a signal, which is indexed by a rational and is non-discrete. However, mixing analogue and digital in one data type is a problem.
So he made separate types, but then this caused massive code inflation.
He’s gone back to one type again.
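My own sketch of the idea, in JavaScript rather than Haskell, as I understood it from the talk: a pattern is a pure function of cyclic, floating-point time, so it needs no stored state.

// sketch of a pattern as a pure function of cyclic, floating-point time
function pattern(events) {
  return function (t) {
    var phase = t - Math.floor(t);               // position within the current cycle
    return events[Math.floor(phase * events.length)];
  };
}
var p = pattern(['bd', 'sn', 'hh', 'sn']);
p(0.25);  // 'sn'
p(3.25);  // 'sn' – same answer every cycle, no state involved
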
Haskell is intense…..
and REALLY concise

Questions

Does the system have state? No, which means he can’t even have random numbers.
Is time still a loop in the current version? Notionally, yes. But representationally it’s a number; it’s just in the functions, so it’s what you make it.