Kinect and OSC Human Interface Devices

To make up for the boring title of this post, let’s start off with a video:

XYZ with Kinect, a video by celesteh on Flickr.

This is a sneak preview of the system I wrote to play XYZ by Shelly Knotts. Her score calls for every player to make a drone that’s controllable by the x, y, and z parameters of a gestural controller. For my controller, I’m using a Kinect.

I’m using a little C++ program based on OpenNI and NITE to find my hand position and then sending out OSC messages with those coordinates. I’ve written a class for OSCHIDs in SuperCollider, which automatically scales the values for me, based on the largest and smallest inputs it’s seen so far. In an actual performance, I would need to calibrate it by waving my arms around a bit before starting to play.

You can see that I’m selecting myself in a drop-down menu as I start using those x, y and z values. If this had been a real performance, other players’ names would have been there also, and there is a mechanism wherein we duel for control of each other’s sounds!

We’re doing a sneak preview of this piece on campus on Wednesday, which I’m not allowed to invite the public to (something about fire regulations), but the proper premiere will be at NIME in Oslo, on Tuesday 31st May @ 9.00pm at Chateau Neuf (street address: Slemdalsveien 15). More information about the performance is available via BiLE’s blog.

The SuperCollider Code

I’ve blogged about this earlier, but have since updated WiiOSCClient.sc to be more immediately useful to people working with TouchOSC or OSCeleton or other weird OSC devices. I’ve also generated several helpfiles!
OSCHID allows one to describe single OSC devices and define “slots” for them.
Those are called OscSlots and are meant to be quite a lot like GeneralHIDSlots, except that OSCHIDs and their slots do not call actions while they are calibrating.
The OSC WiiMote class that uses DarwiinRemote OSC is still called WiiOSCClient and, as far as I recall, has not changed its API since I last posted.
Note that, except for people using smart devices like iPhones or whatever, OSCHIDs require helper apps to actually talk to the WiiMote or the Kinect. Speaking of which…

The Kinect Code

Compiling / Installing

This code is, frankly, a complete mess and this should be considered pre-alpha. I’m only sharing it because I’m hoping somebody knows how to add support to change the tilt or how to package this as a proper Mac Application. And because I like to share. As far as I know, this code should be cross-platform, but I make no promises at all.
First, there are dependencies. You have to install a lot of crap: SensorKinect, OpenNI and NITE. Find instructions here or here.
Then you need to install the OSC library. Everybody normally uses oscpack because it’s easy and stuff… except it was segfaulting for me, so bugger that. Go install libOSC++.
Ok, now you can download my source code: OscHand.zip. (Isn’t that a clever name? Anyway…) Go to your NITE folder and look for a subfolder called Samples. You need to put this into that folder. Then, go to the terminal and get into the directory and type: make. God willing and the floodwaters don’t rise, it should compile and put an executable file into the ../Bin directory.
You need to invoke the program from the terminal, so cd over to Bin and type ./OscHand and it should work.

Using

This program needs an XML file, which is lurking a couple of directories away at ../../Data/Sample-Tracking.xml. If you leave everything where it is and run the program from Bin, you don’t need to specify anything, but if you want to move stuff around, you need to provide the path to this XML file as the first argument on the command line.
The program generates some OSC messages, which are /hand/x, /hand/y and /hand/z, each followed by a single floating point number. It does not bundle things together because I couldn’t get oscpack to work, so this is what it is. By default, it sends these to port 57120, because that is the port I most want to use. Theoretically, if you give it a -p followed by a number for the second and third arguments, it will send to the port that you want. Because I have not made this as lovely as possible, you MUST specify the XML file path before you specify the port number. (As this is an easy fix, it’s high on my todo list, but it’s not happening this week.)
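If you just want to watch those messages arrive in SuperCollider, independent of the OSCHID class below, a quick sketch using the old OSCresponderNode API (and assuming OscHand is sending to SuperCollider’s default port, 57120) might look like this:

```supercollider
// A quick sketch for watching the raw hand messages.
(
['/hand/x', '/hand/y', '/hand/z'].do { |path|
	OSCresponderNode(nil, path, { |time, resp, msg|
		// msg[0] is the OSC path, msg[1] is the floating point coordinate
		(path ++ ": " ++ msg[1]).postln;
	}).add;
};
)
```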
There are some keyboard options you can do in the window while the program is running. Typing s turns smoothing on or off. Unless you’re doing very small gestures, you probably want smoothing on.
If you want to adjust the tilt, you’re SOL, as I have been unable to solve this problem. If you also download libfreenect, you can write a little program to aim the thing, which you will then have to quit before you can use this program. Which is just awesome. There are some Processing sketches which can also be used for aiming.
You should be able to figure out how to use this in SuperCollider with the classes above, but here’s a wee bit of example code to get you started:




 k = OSCHID.new.spec_((
  ax: OscSlot(relative, '/hand/x'),
  ay: OscSlot(relative, '/hand/y'),
  az: OscSlot(relative, '/hand/z')
  ));

 // wave your arms a bit to calibrate

 k.calibrate = false;

 k.setAction(ax, { |val|  val.value.postln});
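To actually make sound from those values, you would set actions that drive a synth. Here’s a purely hypothetical sketch (not the part I wrote for XYZ) that maps the three axes to the pitch, filter and amplitude of a drone, following the usage above:

```supercollider
// Hypothetical drone control, assuming k is the calibrated OSCHID from above.
(
d = { |freq = 60, cutoff = 800, amp = 0.1|
	// a simple filtered sawtooth drone
	RLPF.ar(Saw.ar(freq), cutoff, 0.3, amp).dup
}.play;

k.setAction(ax, { |val| d.set(\freq, 40 + (val.value * 80)) });
k.setAction(ay, { |val| d.set(\cutoff, 200 + (val.value * 2000)) });
k.setAction(az, { |val| d.set(\amp, val.value * 0.3) });
)
```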

And more teasers

You can see the GUIs of a few other BiLE Tools in the video at the top, including the Chat client and a shared stopwatch. There’s also a network API. I’m going to do a big code release in the fall, so stay tuned.

Strategies for using tuba in live solo computer music

I had the idea of live-sampling my tuba for an upcoming gig. I’ve had this idea before but never used it, due to two major factors. The first is the difficulty of controlling a computer and a tuba at the same time. One obvious solution is foot pedals, which I’ve yet to explore; the other is a one-handed, freely moving controller such as the wiimote.
The other major issue with doing tuba live-sampling is sound quality. Most dynamic mics (including the SM57, which is the mic I own) make a tuba sound like either a bass kazoo or something disturbingly flatulent. I did some tests with the Zoom H4 positioned inside the bell and it appeared to sound ok, so I was going to do my gig this way and started working on my chops.
Unfortunately, the sound quality turns out not to be consistent. The mic is prone to distortion even when it seems not to be peaking. Low frequencies are especially likely to contain distortion or a rattle which seems to be caused by the mic itself vibrating from the tuba.
There are a few possible workarounds. One is to embrace the distortion as an aesthetic choice and possibly emphasise it through the use of further distortion FX such as clipping, dropping the bit rate or ring modulation. I did a trial of ring-modulating a recorded buffer with another part of the same buffer. This was not successful, as it created a sound lurking around the uncanny valley of bad brass sounds; however, a more regular waveform may work better.
At the SuperCollider symposium at Wesleyan, I saw a tubist (I seem to recall it was Sam Pluta, but I could be mistaken) deliberately sampling tuba-based rattle. The performer put a cardboard box over the bell of the tuba. Attached to the box was a piezo buzzer in a plastic encasing. The composer put a ball bearing inside the plastic enclosure and attached it to the cardboard box. The vibration of the tuba shook the box which rattled the bearing. The piezo element recorded the bearing’s rattle, which roughly followed the amplitude of the tuba, along with other factors. I thought this was a very interesting way to record a sound caused by the tuba rather than the tuba itself.
Similarly, one could use the tuba signal for feature extraction, recognising that errors in miccing the tuba will be correlated with errors in the feature extraction. Two obvious things to attempt to extract are pitch and amplitude, the latter being somewhat more error-resistant. I’ve described before an algorithm for time-domain frequency detection for tuba. As this method relies on RMS, it also calculates amplitude. Other interesting features may be findable via FFT-based analysis, such as onset detection or spectral centroid, using the MLCD UGens. These features could be used to control the playing of pre-prepared sounds or live software synthesis. I have not yet experimented with this method.
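The basic shape of that idea in SuperCollider might look something like the following sketch, using the built-in Amplitude and Pitch UGens rather than my time-domain method (untested against a real tuba, so treat it as a starting point):

```supercollider
// Rough sketch: track the amplitude and pitch of a live input and use
// them to drive a synthesised tone instead of the direct mic signal.
(
x = {
	var in, amp, freq, hasFreq;
	in = SoundIn.ar(0);
	amp = Amplitude.kr(in, 0.05, 0.2);
	# freq, hasFreq = Pitch.kr(in, minFreq: 30, maxFreq: 500);
	// only sound while the pitch tracker is confident
	SinOsc.ar(freq, 0, amp * hasFreq).dup
}.play;
)
```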
Of course, a very obvious solution is to buy a better microphone. It may also be that the poor sound quality stemmed from my speakers, which are a bit small for low frequencies. The advantages of exploring other approaches include cost (although a tuba is not usually cheap either) and durability: cheaper solutions are often more durable, or at least I’d be more willing to take cheaper gear to bar gigs (see previous note about tuba cost). As I have an interest in playing in bars and making my music accessible through ‘gigability,’ a bar-ready solution is most appealing.
Finally, the last obvious solution is to not interact with the tuba’s sounds at all, thus creating a piece for tuba and tape. This has less that can go wrong, but it loses quite a lot of spontaneity and requires a great deal of advance preparation. A related possibility is that the tubist controls real-time processes via the wiimote or other controller. This would also require a great deal of advance preparation: making the wiimote into its own instrument requires the performer to learn to play it and the tuba at the same time, which is rather a lot to ask, especially for an avant-garde tubist who is already dealing with more performance parameters (such as voice, etc.) than a typical tubist. This approach also abandons the dream of a computer-extended tuba and loses whatever possibilities for integration exist with more interactive methods. However, a controller that can somehow be integrated into the act of tuba playing may work quite well. This could include sensors mounted directly on the horn: for example, something squeezable in a convenient location, extra buttons near the valves, etc.
I’m bummed that I won’t be playing tuba on Thursday, but I will have something that’s 20 minutes long and involves tuba by September.

WiiOSCClient.sc

Because there are problems with the wiimote support in SuperCollider, I wrote a class for talking to Darwiin OSC. This class has the same methods as the official wiimote classes, so, should those ever get fixed, you can just switch to them with minimal impact on your code.
Because this class takes an OSC stream from a controller and treats it like input from a joystick, this code may potentially be useful to people using TouchOSC on their iPhones.
There is no helpfile, but there is some usage information at the bottom of the file:


 // First, you create a new instance of WiiOSCClient, 
 // which starts in calibration mode
 
 
 w = WiiOSCClient.new;

 // If you have not already done so, open up DarwiinRemote OSC and get it talking to your wii.
 // Then go to the preferences of that application and set the OSC port to the language port
 // of SuperCollider.  You will see a message in the post window telling you what port
 // that is .... or you will see a lot of min and max messages, which lets you know it's
 // already calibrating
 
 // move your wiimote about as if you were playing it.  It will scale its output accordingly
 
 
 // now that you're done callibrating, turn callibration mode off
 
 w.calibrate = false;
 
 // The WiiOSCClient is set up to behave very much like a HID client and is furthermore
 // designed for drop-in-place compatibility if anybody ever sorts out the WiiMote code
 // that SuperCollider pretends to support.
 
 // To get at a particular aspect of the data, you set an action per slot:
 
 w.setAction(ax, {|val|
  
  val.value; // is the scaled data from ax - the X axis of the accelerometer.
  // It should be between 0-1, scaled according to how you waved your arms during
  // the calibration period
 });
 
 
 
 // You can use a WiiRamp to provide some lag
 (
  r = WiiRamp(20, 200, 15);
 
  w.setAction(ax, {|val|
   var scaled, lagged;
  
   scaled = ((val.value * 2) - 1).abs;
   lagged = r.next(scaled);
  
   // now do something with lagged
  });
 )

Calibration

This class is self-calibrating. It scales the wiimote input against the largest and smallest numbers that it’s seen thus far. While calibration is set to true, it does not call any of its action methods, as it assumes the numbers seen during calibration are bogus. After you set calibration to false, it does start calling the actions, but it still changes the scale if it sees a bigger or smaller number than previously.
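The scaling idea itself is simple; in miniature it’s just a running min and max with a linlin mapping, something like this toy version (not the actual class code):

```supercollider
// Toy version of the self-calibrating scaler.
(
var min = inf, max = -inf, scale;
scale = { |raw|
	min = min.min(raw);
	max = max.max(raw);
	// map the raw value into 0-1 against the range seen so far
	if(max > min, { raw.linlin(min, max, 0, 1) }, { 0 });
};
[0.2, 0.9, 0.5, 1.3].collect(scale).postln;
)
```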

WiiRamp

The WiiRamp class attempts to deal with the oddness of using accelerometers, but it does not just do a differentiation, as that would be too easy. The accelerometers give you major peaks and valleys, all centred around a middle, so just using the raw values often is a bit boring. In the example, you see that we scale the incoming data first: ((val.value * 2) - 1) changes the data range from 0 to 1 into -1 to 1. That puts the centre at 0. Then, because we care more about the height of peaks and depth of valleys than about whether they’re positive or negative, we take the absolute value, moving the scale back to 0 to 1.
When you shake your wiimote, the ramp keeps track of your largest gesture. It takes N steps to reach that max (updating if a larger max is found before it gets there), then holds at the number for M steps and then scoots back down towards the current input level. You can change those rates with upslope, hold and downslope.

OscSlot

This class is the one that might be useful to iPhone users. It creates an OSCResponderNode and then calls an action function when it gets something. It also optionally sends data to a Bus and has JIT support with a .kr method. It is modelled after some of the HID code. It also supports calibration. How to deploy it with TouchOSC is an exercise left to the reader.
http://www.berkeleynoise.com/celesteh/code/WiiOSCClient.sc

First BiLE Performance

BiLE, the Birmingham Laptop Ensemble, had its first gig on Thursday, just six or eight weeks after being formed. We played at the Hare and Hounds in Birmingham, which is a well-known venue for rock bands, as a part of the Sound Kitchen series. There were two pieces on the bill: one called 15 Minutes for BiLE by BiLE member Jorge Garcia Moncada, and a cover of Stucknote by Scot Gresham-Lancaster, a piece originally played by The Hub.
As a first performance, I thought it went rather well. There were the usual issues where everything sounds completely different on stage and the few minutes of sound checking does not give anybody enough time to get used to the monitor speakers. And time moves completely differently in front of an audience, where suddenly every minute gets much longer. But there were also the performing-with-a-computer issues: computers get terrible stage fright and are much more prone to crash. A few people did have their sound engines crash, so the first piece had a high-pitched squeal for a few minutes, while messages flew on the chat window, reminding people to be quiet during the quiet parts. Actually, there was quite a lot of panic in the chat window and I wish I’d kept a log of it. (Later the audience said we all looked panicked from time to time. I always look panicked on stage, but it’s not cool.) In the second piece, I forgot to tell my programme to commence sound-making for about the first three minutes. I haven’t heard the recording yet, but I bet things sounded ok. Considering that most of us had never done live laptop performance at all before and how quickly we went from our first planning meeting to our first gig, I think we got a good result.
Jorge’s piece was complicated but Stucknote seems deceptively simple, so we did not try running through it until the day before the gig. In retrospect, this was clearly an error, because the piece, like all structured improvisation, does require some practice to get the flow down. Of course, we’d all spent the requisite time working on our sound generation and I’d coded up some faders for me and the other SuperCollider user, with Ron Kuivila’s Conductor quark, which is a very quick and dirty way of making useful GUIs. I’d tried out my part at home and it worked well and the sound I got was interesting, so I felt confident in it until I got to the practice and it crashed very quickly. I restarted SuperCollider and it crashed again. And again. And again. Half the time, it brought down the other SC user’s computer also. And it was clobbering the network, causing the MAX users a bunch of error messages and a few moments of network congestion. MAX, usefully, just throws away network messages when there are too many of them, whereas SC does not seem to.
I could not figure out where the bug was and so, after the practice, I sat down to sort it out. And there was no sign of it. Everything was fine again.
Fortunately, this provided enough of a clue that I was able to figure out that I had created an infinite loop between the two SuperCollider programmes. When I moved a slider in the GUI, that sent a message to the network which affected the sound on the target machine and also caused Shelly’s programme to update the GUI. However, the Conductor class always informs listeners when it’s updated, no matter who updated it or how, so it sent a message back to the network informing everybody of its new value, which caused my GUI to update, which sent a message to the network, ad infinitum, until I crashed.
I came up with a fix using a flag and semaphores:

 Task({
  semaphore.wait;
  should_call_action = false; // suppress the action so no message echoes back to the network
  cv = con[contag];
  cv.input = input;
  should_call_action = true;
  semaphore.signal;
 }).play;
 

While this fix mostly works, it does bring up some interesting questions about data management across this kind of network. If we’re all updating the data at once, is there a master copy of it somewhere? Who owns the master copy if one exists? In this case, as one person is making sound from it, that person would seem to be the owner of the data. But what if we were all sharing and using the sliders? Then we all own it and may all have different ideas of what it might actually be.
I’m writing a class for managing shared resources which holds a value and notifies listeners when it changes. The object that’s changing it passes itself along to the method, so when listeners are notified, the changer is not. I haven’t finished the class yet, so I don’t have sample code, but I’m pondering some related issues.
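As a sketch of the idea (the names here are made up, not the final class, and this would need to live in a class library file):

```supercollider
// Sketch: a shared value that notifies every listener except the changer.
SharedValue {
	var <value, listeners;

	*new { |initial| ^super.new.init(initial) }
	init { |initial|
		value = initial;
		listeners = IdentitySet.new;
	}
	addListener { |listener| listeners.add(listener) }
	set { |newValue, changer|
		value = newValue;
		// the changer already knows the new value, so skip it
		listeners.do { |l|
			if(l !== changer, { l.update(this, newValue) });
		};
	}
}
```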
Like, should there be a client version of this class for a local copy held on the local machine and a master version for the canonical copy on the network that everybody else is updating? Should a master copy of some data advertise itself on the network via the API and automatically listen for updates? Should it specify a way to scale values, so it can also accept changed inputs from 0-1 and scale them appropriately? If it does accept inputs/values in a specified range, should there be a switch for the clients to automagically build a GUI containing sliders for every master variable on the network? I think that would be quite cool, but I may not have time to code it soon, as our next gig, where we’ll be playing a piece of mine, is coming up very soon on the 29th of April. Then there’s a gig in May, then probably one in June and one in July (although not scheduled yet), and in August we’re going to NIME in Oslo, which is very exciting. Bright days ahead.

Semaphores are awesomesauce

Imagine, if you will, that you are a programmer and somebody has asked you to write an application that counts cars in an intersection. You have webcams mounted on top of the traffic lights, and each sends you a message when it sees a car. You have saved somewhere a count of all the cars so far. So, when a camera tells you it sees a car, you go find that number, add one to it and write down the new number. There is more than one camera in the intersection, though, and while some cars are travelling north, others are travelling south at the same time. What if two cameras see different cars at the same time?

  1. Camera one sees a car and the programme listening to it goes and finds the count of cars so far, which is 75.
  2. Camera two sees a car and the programme listening to it goes and finds the count of cars so far, which is 75.
  3. Camera one’s programme adds one to the total and gets 76.
  4. Camera two’s programme adds one to the total and gets 76.
  5. Camera one’s programme saves its new total.
  6. Camera two’s programme saves its new total.
  7. You go to look how many cars have been through the intersection and the number recorded is 76.

Camera one and camera two are operating separately from each other at the same time. They are separate threads. When two of them are trying to change the same resource at the same time, you get something called a race condition. Will the first thread finish before the second thread clobbers its changes? The race is on!
Fortunately, there is a solution to this problem: semaphores! Let’s say you are trying to update your traffic count with SuperCollider:

(
 var traffic_count, camera1, camera2, semaphore, action;

 traffic_count = 0;
 semaphore = Semaphore.new(1);

 camera1 = TrafficCounterCamera(\north);
 camera2 = TrafficCounterCamera(\south);

 action = { // this will be called when a car is seen
  Task({
   semaphore.wait; // only one thread can get past this point at a time
   traffic_count = traffic_count +1;
   semaphore.signal; // relinquish control of the semaphore
   traffic_count.postln;
  }).play;
 };

 camera1.action = action;
 camera2.action = action;
)

You need to make a new Semaphore before you use it. By default, they allow one thread through at a time, but you can change the argument to 2 or 3 or whatever number you want.
When your code encounters a semaphore.wait, the thread will pause and wait until it is its turn to proceed. Only one thread will be allowed past that line at a time. If both cameras update at the exact same time, one of them will have to wait until the other says it’s good to go ahead.
semaphore.signal is how that thread says it’s good to go ahead. The code in between those lines can only be accessed by a single thread at a time. The traffic_count.postln line is outside the semaphore because it’s not making a change to anything, so it’s safe to read it outside of the semaphore.
So when you have two separate threads trying to change something at the same time, semaphores can help you out! This sort of situation can arise with GUI objects, OSC messages, HID objects or anything with an action method.
Be thread-safe!

My life lately (is tl;dr)

Tuesday and Wednesday Last Week

A week ago Tuesday, I taught my module in Cambridge. The next morning, I got on a train to Birmingham for BiLE practice. I’m a co-founder of BiLE, the Birmingham Laptop Ensemble. We formed in February and we have a gig next week. The technical hurdles to getting a laptop ensemble going are not minor, so there has been a lot of energy going into this from everybody. We have got group messaging going, thanks to OSCGroups and I wrote some SuperCollider infrastructure based on the API quark and a small chat GUI and a stopwatch sort of timer, which is controlled with OSC, so there’s been a lot of that sort of tool writing. And much less successful coding of sound-making items, which will eventually be joystick controllable if I ever get them to work. All my code is written for mono samples and all of the shared samples people are using are in stereo, so I spent a lot of time trying to stereo-ise my code before finally mixing the samples down to mono.
I’m a big believer in mono, actually, in shared playing environments. If I am playing with other people, I’m playing my computer as an instrument, and instruments have set sound-radiation patterns. I could go with a PLOrk-style six-speaker hemisphere, if I wanted to spend a boatload of money on a single-use speaker to get an instrumental radiation pattern from my laptop, or I could just use a single Genelec 1029 that I already own.
Anyway, after the BiLE rehearsal, a couple students gave a group presentation on Reaper, which is a shareware, cheap, powerful DAW. I’m quite impressed and am pondering switching. My main hesitation is that I expect my next computer will be linux, so I don’t know if I want to get heavily involved with a program that won’t run on that OS. On the other hand, I don’t actually like Ardour very much, truth be told. I haven’t liked any of them since I walked away from ProTools.
After that we went out for socialising and instead of catching a train home, I went to stay on the floor of Julien’s studio. He lives way out in the country, up a lane (British for a single-track country road). It’s quite lovely. I would not be a fan of that commute, but I might do it for that cottage.

Thursday

The next morning, Juju and I headed back to campus quite early so he could meet his supervisor. I ran a couple of errands and got a uni-branded hoodie. I haven’t worn such a garment for years, because fabric clinging to my chest in the bad old days was not a good thing. But now I can wear snug woven fabrics, like T-shirts, hoodies and jumpers! It’s amazing! Also, I remember the major student protests about university-branded clothing made by child labour, but this was actually fairtrade, according to the label, which is fairly impressive.
Then all the postgrads met in the basement of the Barber Institute to start loading speakers into a truck for a gig. We were moving a relatively small system, only 70 speakers, but that’s still a fair amount of gear to muscle around. Then we went to the Midlands Arts Centre to move all the gear into the venue and set it up. The gear is all in heavy flight cases, which needed to be pushed up and down ramps and down hallways and then the speakers inside needed to be carried to where they would be set up, as did the stands to which they would be attached and the cables that connect them. It’s a lot of gear. We worked until 6 or 7 pm and then went back to the studios at uni to get a 2 hour long presentation from Hans Tutchku about how he does music stuff. I tried desperately to stay awake because it was interesting and I wanted to hear what he was saying, but I did not entirely succeed in my quest.

Friday

Then, Juju and I went back to his place, 45 minutes away, and came back to the MAC early the next morning to finish rigging the system. We put up the remainder of the system and then the people who were playing in that evening’s concert began to rehearse. I hung around for the afternoon, trying to get my BiLE code working. Kees Tazelaar, who played the next evening, came along to see how things were going and recognised me from Sonology and greeted me by my old name. I like Kees quite a lot, but it was a very awkward moment for me and I wasn’t sure what to do, so I spoke to him only briefly and then mostly avoided him later. This was not the best way to handle it.
There were two concerts in the evening. The second of them was organised by Sound Kitchen and was a continuous hour with no break between pieces. The people diffusing the stereo audio to the 70 speakers took turns, but changed places without interrupting the sound flow. It was extremely successful, I thought. The hour was made up of the work of many different composers, each of whom had contributed only 5 minutes, but somehow this was arranged into a much larger whole that held together quite well, partly because many of the different composers had used similar sound material. A lot of them used bird sounds, for example, so that was a repeating motif throughout the concert.

Saturday

We hung around the bar for a bit afterwards. The next morning was not so early, thank goodness, when we went back to the MAC and then back to the uni for the BiLE hack day. The idea was that we would do a long group coding session, where people could write code around each other and ask for clarification or feedback or help or whatever from band mates. However, it started really late and everybody was really tired, so it was not entirely successful in its goals.
Then we went back to the MAC for the concerts. I was sitting in the hallway, trying to figure out why my BiLE code had failed so completely when I got drafted into being in charge of the comp tickets. It turns out that this is actually somewhat stressful, because it requires knowing who is supposed to get comped in, getting tickets for them and then distributing them. Which means approaching Francis Dhomont and speaking to him.
The first concert was curated by Kees Tazelaar and started with a reconstruction of the sounds played in the Philips Pavilion at the Brussels World’s Fair in 1958. He found the source tapes and remixed them. Concrete PH sounded much more raw and rougher than other mixes I’ve heard. It had a gritty quality that seemed much more grounded in a physical process. I was surprised by how different it sounded. Then he played Poème électronique and his own work, Voyage dans l’espace. I hope he plays these again on large multi-channel systems, because it was pretty cool.
I was feeling fairly overwhelmed by the lack of sleep, my lack of success with BiLE and getting stuck with all the comp tickets, so I was not happy between concerts. The next one was all pieces by Anette Vande Gorne, a Belgian woman who runs the Espace du son festival in Brussels and who has very definite theories about how to diffuse sounds in space. Some of them are quite sensible, however, she thinks that sound can start at the front of the hall and be panned towards the back of the hall, but sound cannot originate at the back of the hall and travel to the front. Hearing about this had prejudiced me against her, as it seems rather silly.
She always diffuses standing up, so they had raised the faders for her, with one bank slightly higher than the other, like organ manuals. She started to play her pieces… and it was amazing. It was like being transported to another place. All of my stress was lifted from my shoulders. It was just awe-inspiring. The second piece was even better. I was sitting in the back half, so I could see her standing at the mixers, her hands flying across the faders dramatically, like an organist, full of intensity as her music swelled and travelled around the room. Then I understood why people listen to her, even when some of her theories sound silly. She might not be right about everything, but there’s quite a lot she is right about. This was one of the best concerts that I’ve ever been to.
The last concert was a surprise booking, so it wasn’t well publicised. It was Jonty Harrison, Francis Dhomont and Hans Tutchku. It was also quite good, but I wouldn’t want to play after Vande Gorne. Tutchku’s piece had several pauses in it that went on just a few moments too long. Its major climax came quite early. It worked as a piece, but seemed like it could be experienced in another order as easily as the way it was actually constructed. I talked to him at the party afterwards and he said that the pauses were climaxes for him and ways of building tension, and that he had carried them out for too long in order to build suspense. I’m not entirely positive they functioned in this way, but the idea is quite interesting and I may look into it. He also asked me what I thought of his presentation from two days earlier, so I was hoping he hadn’t noticed me dozing off, but I think he did.
After the final concert, there was a large party at Jonty’s house. I got a lift from Jonty, so I was squeezed in the back of a car with Anette Vande Gorne on one side of me and Hans Tutchku on the other, with Francis Dhomont in the front. They all spoke French the whole way. I’ve been filling out job applications, and one of them wants to know about my foreign language skills, and now I can say with certainty that if I’m stuck in a car with several famous composers speaking French, I can follow their conversation fairly well, but would be way too starstruck to contribute anything.
Apparently, the party went on until 4:30 in the morning, but I didn’t stay so late. I talked a lot to Jean-François Denis, the director of empreintes DIGITALes, a Canadian record label. He flew from Canada just for the weekend and showed up without anyone expecting him. He is extraordinarily charming.

Sunday

The next morning, we went back again to the MAC and then there was a long concert with an intermission in the early afternoon. Amazingly, none of the concerts over the entire weekend featured overhead water drops. There were barely any dripping sounds at all.
After the concert, we de-rigged the system and packed all the gear back into cases and loaded it onto the two rented trucks. Then we went for curry in Moseley, which we seem to do after every gig. Shelly was talking about how it was her last BEAST gig and I wasn't paying much attention until I realised this meant it was my last gig too. I really should have signed up to play something. I thought there was another gig coming later in the year, but it was cancelled. I'm seriously going to graduate from Brum having played a piece at a BEAST gig only once and never having diffused a stereo piece. That is extremely lame on my part.

Monday

Juju was completely exhausted, so we left the curry early, so he could go home and catch up on sleep. The next morning, we all went back to the Barber Institute to unload the trucks and put everything away. Then we, as usual, went to the senior common room to have cups of terrible coffee. Their tea is alright, so that’s what I had, but most people go for the coffee, which could double as diesel fuel. I guess this was my last time of that also.
Normally, I would then gather my things and go home, but I did not. I worked on code and faffed and worried about my lecture the next day and then in the evening, we had another seminar. Howard Skempton came and talked for two hours about Cardew and Morton Feldman and his own music. It was quite good. We all went to the pub afterwards, but that dissipated quickly as people left to sleep it off.

Tuesday

I finally got the train home and got in after midnight. There's a large stack of mail inside my door. I woke up early the next morning to assemble my presentation for my module. As luck would have it, the topic was acousmatic music, so I talked about BEAST and played them some of the music from the weekend. I also pointed them at some tools. I was supposed to have them start their task during the class time, but a surprising number of them wanted to show their works in progress, so that didn't happen.
As I was on the train back to London from Cambridge, I was wondering whether I should go out to a bar that night to socialise, when I fell completely asleep. Asleep and drooling on my backpack. I completely crashed. I woke myself up enough to get the tube home and then thought I would sort out my BiLE code instead of going out, but I couldn't concentrate, so I just faffed around on the internet instead of sleeping or going out. Meh to me.

Wednesday

Then, the next day, which was Wednesday, a week and a day after all of this started, I got on the train for Birmingham to go to a BiLE rehearsal and to go to a seminar. I got my code working on the train and was feeling somewhat happy about that, but when I got to the rehearsal, it just gave up completely. I managed to make sounds twice during the entire rehearsal, one of which was during a grand pause. When I tried repeating the sound later, it wouldn't play. Also, Shelly found a crash bug in my chat application, triggered when Juju typed a French character. On the bright side, however, all of the Max users got all the way through one of the pieces we're playing next Thursday, which is quite encouraging. Antonio, our graphics guy, got the projector sort of working, so I was able to glance at what he was doing a couple of times and it looked good.
We took a break and a bunch of the postgrads were dissing live coding, so I guess that might not be a good goal for the ensemble. They thought projected code was self-indulgent and only programmers would care. I need to link them to the TOPLAP manifesto. Actually, they were more dissing the idea of live coding, having barely witnessed any themselves. Non-programmers do seem to care and, while it is a movement that does require some thoughtful understanding to fully appreciate it, the same could certainly be said of acousmatic music. I like the danger of live coding, something that I think a laptop ensemble ought to appreciate. It's a bit like a high wire act.
The presentations at the seminar were interesting and then we went to the pub. I was so tired biking home from the train station that I got confused about which side of the street I was supposed to be on.

Thursday

I slept until 2 this afternoon and I was supposed to sort out my BiLE code and fix up my CV and write my research portfolio, but all I did was send out email about Monday's SuperCollider meetup and fix the crash bug in the chat thing. SuperCollider strings are 7-bit ASCII and fuck up if you give them Unicode, which is really quite shocking and not documented anywhere.
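I never wrote up the actual fix, but the general defensive move is to sanitise incoming chat text down to 7-bit ASCII before it reaches the fragile string layer. Here's a rough sketch of the idea in Python (the function name and replacement character are mine, not from the real BiLE code):

```python
def ascii_safe(text, replacement="?"):
    """Replace any non-ASCII character so a 7-bit-only string layer
    (like the SuperCollider chat client described above) never sees
    characters it can't handle."""
    return "".join(ch if ord(ch) < 128 else replacement for ch in text)

# A French character like "é" gets swapped out before it can crash anything:
print(ascii_safe("caf\u00e9"))  # -> "caf?"
```

In the SuperCollider version, the same filtering would have to happen on the receiving side too, since you can't control what other players' clients send over the wire.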
Then I went to Sam's to get Xena back and I wired up part of the 5.1 system she got for her daughter and sorted out her daughter's Mac mini so that she could connect to it with VNC, so it was wired to the sound system and the projector, and so it quit asking for the keychain password every 5 seconds. Then I came home and spent ages typing this up. Tomorrow, I will do my CV stuff for real, because I have to get it done, and then work on my BiLE code. Saturday I'm going back to Brum again for a 5 hour rehearsal in which we sort out the rest of our music for the gig. Sunday, I need to finish job application related stuff and write my presentation for Tuesday. Monday is the job application deadline and a SuperCollider meetup. Tuesday, I teach. Wednesday, I need to get Xena back to Sam's and then go to Brum again for a rehearsal and will be there overnight to practice the next day and then play the gig and then get stonkingly drunk. Friday, I go home. And then start sorting out the tech stuff for the next two pieces, which at least are by me and count towards my portfolio. And I need to sort out my stretched piece, which is a disorganised mess, and start writing a 20 minute piece, which I haven't done at all and needs to be done very soon because I need to graduate and I have not spent all this busy time working on my own music, although the tools I've written should be kind of valuable. All I can think about now, going over and over in my head, is all the stuff I have to do. And snogging. That thing about men thinking about sex every 7 seconds has never been true for me before, but it is now. And it's actually quite annoying, except that as the alternative is thinking about everything that I have to do, I actually prefer it.

Attention: Single Ladies

Are you a straight or bi woman between 29 and 40 who has given up on the singles scene? Feel like all the good men are taken?

Despair Not!

Meet eligible postgraduate men near you!
Yes, your area may be teeming with unpaired postgrad men. Men with exciting and interesting hobbies such as:

  • Working on their dissertations
  • Doing fake-work
  • Facebooking
  • Procrastinating
  • Feeling guilty about facebooking and procrastinating
  • Deconstructing re-runs of The Simpsons
  • And More!

Yes, you too can be let into the life and the flat of a man who has stacks of books everywhere and mutters to himself about conference submission deadlines. You can experience the joy and wonder of hearing him say, "I really should be working right now." You can go to fun parties with academics where your date shows up exceedingly late and then drinks only lemonade in case he decides to do more work at 1:30 AM.
Do you find gaudy material tokens of success like nice haircuts and shoes without holes to be shallow and off-putting? Have you always wondered about the finer points of spectromorphology carried out with open source software and the communities that produce those artefacts? Does your heart go pitter-patter for somebody reading theory textbooks on the beach on holidays? Yes, postgraduate men are waiting to meet you!

Or

If you’re a bloke and you’re still reading this far, why not meet postgraduate women who are pretty much like the men described above, but with the added bonus of being female.

Act NOW

While supplies last! Yes, nab them quickly before they finish their writing-up year or drop out of uni or meet somebody else (haha, just kidding on the last one).

Why not date a postgraduate near you TODAY?

You’re going to have to make the first move here, and be as blatant as possible about it, or else they might not notice. But give it a go. Soon.

A dream I had

I had a friend who had an inverse fairy godmother. She would always grant the opposite of whatever he wished for. If he wished to go see a concert somewhere, she would wave her wand and it would guarantee that we wouldn’t get to the concert, but something else really good would happen, always better than whatever wish he had. His life was always full of unexpected and amazing events.
This made him very popular, because when you were hanging around with him, you never knew what was going to happen, but that it was going to be good. His friends thought this was awesome and it seemed like his fairy godmother was benefitting them more than him. It’s not that he didn’t also benefit from his inverse wishes, but he was constantly frustrated that none of his wishes ever came true. He seemed unaware that he lived a charmed life. He wasn’t even aware that his fairy godmother existed, but I had seen her.
Still, he was relatively happy despite himself and had a good sense of humour. He never learned to stop wishing for things, though, or to wish aloud for the opposite of what he actually wanted. He just smiled and carried on, surrounded by his friends.
And then I woke up and all the specifics of this dream faded from my mind, as dreams do, but the general plot remained. Weird, innit?

Adventures in American Healthcare

A few days before I left England, my ear began to itch, in the spot where I used to have a cartilage piercing. I didn't worry about it, but scratched at it absentmindedly, thinking I really should do something about it but then forgetting. Then I got on a 10 hour flight, followed immediately by a 16 hour train ride. I got to my dad's house feeling exhausted and my ear was irritated to heck. I caught a glimpse of it in the mirror and my entire ear was red enough that I could step in for Rudolph and save Christmas, in case a holiday movie suddenly formed around me.
My dad took me to see a doctor at an "urgent care clinic." This is American for a walk-in clinic. First, a nurse took my contact details and then told me to wait in the lobby. The primary feature of this was a large flatscreen TV showing adverts for prescription drugs. "Feeling stressed? Ask your doctor about Damitol. Damitol can help with bursts of impotent rage. Do not take Damitol if you are already taking Fukitol. Side effects of Damitol may include becoming red faced, excessive sputtering and fatigue. Damitol works best when combined with diet and exercise. . . ." Blah blah blah. They had a 5 or 10 minute advert for a diabetes drug. Then they had a minute or two of random health-related information, then another advert. It was all branded as CNN Health.
"This is weird," I said to my dad.
“I think it’s just general information about insulin . . . oh. That is weird.”
A nurse took me back, weighed me, took my blood pressure, pulse and temperature and asked about allergies. All interactions with healthcare providers in the US start with weight, blood pressure, etc. I explained about my ear, which was significantly less red by then. She took notes and left.
A moment later, the doctor came in and I repeated my story. He looked at my ear for 5 seconds and prescribed sulfa antibiotics. "They're cheap," he explained. I asked something about my ear and he said it was probably a staph infection and they tend to respond to sulfa.
“Staph?!” I thought.
“Unless it’s MRSA,” he continued.
I quit listening to his list of dire diseases. I asked about side effects and he started talking about possible allergic reactions. "In the worst case your mouth and tongue will swell up and . . ."
"I just wanted to know if it was ok to drink or not," I interrupted.
“If you drink, it will make the allergic reaction hit more quickly . . .”
I stopped listening again. Then I went out to the front to pay. Actually, my dad paid. It was over $100. Then we went to a pharmacy, where the drugs were only $14. They really were quite cheap.
The pharmacist explained that they might upset my stomach, etc. I had forgotten that in the States, you get this information from pharmacists and not doctors. Probably because we were in Washington state, she didn't mention that I should stay out of the sun.
So I started taking antibiotics, wondering if my British GP would have prescribed them. He certainly would have poked my ear several times first. I also started putting hot compresses on it. It hurt if anything touched it, so no wearing headphones or hats or sleeping on that side.
Last night, on the 8th day, it was bright red again. It was still hurting and warm to the touch this morning, so I resolved to go to a clinic. I called the one closest to my house. They weren't answering, so I called another, which was closed for the holiday, and then another and another. Every clinic seemed to be closed today, except for one 3 miles away, which said it was open, but the receptionist was busy. I cycled over. It was closed.
Finally, I tried the Berkeley Free Clinic and was startled when a person answered. I described my woes. “You need to be seen,” he said, but they couldn’t see me before Monday. “Do you have money or insurance?” The person asked. Money, yes. Insurance, no. He suggested that I go to Highland Hospital. “They have an urgent care clinic. Go to the emergency room and they’ll direct you.”
I faffed around for a bit and finally got on a bus. Highland is an emergency-only hospital with a reputation for highly organised, professional, helpful staff in the midst of complete chaos.
I asked for the urgent care clinic and was told it had closed down. They said they just do it all in emergency now. The intake person said it was fine that I wasn’t having an emergency and took my ID and told me to sit.
I got called up to a triage desk and a nurse took my temperature, pulse and blood pressure and asked about allergies and past illnesses. "When was your last tetanus shot?" Then she asked what the problem was and gave me a red wristband to indicate that I have allergies. She told me to wait in a different room.
I got called back to a different desk where I was asked for ID again, address, emergency contact information, mother’s maiden name, social security number, whether I had a job and a GP and many other questions. “Did you come by car or bus?” Then, she told me to wait again.
A nurse called me and walked me over to a bunch of cubicles. "Wait here for a moment," he said, and then vanished. A while later, a woman introduced herself as a doctor and I repeated my entire tale of woe. She looked in my ears and then prodded my ill one a bit. She said it was a minor infection and would probably go away on its own, but decided to prescribe me new antibiotics. She told me to keep sitting there and a nurse would come.
The nurse had the prescription forms. "You have to take these every 6 hours, which is a pain in the ass." She looked at my warm, but no longer red, ear and wondered why I had been given a prescription at all. She led me to wait for a financial advisor. While waiting, I heard an announcement calling the trauma team to assemble, saying a type 2 trauma would be arriving in 8 minutes.
The financial person asked if I had a job and for ID. I said I worked in England. "So you're not a resident of California?" Well, I kind of am, I'm just studying abroad. I gave her my expired driver's license. It has the wrong name on it. This did not help clarify matters. She said I would need to provide pay stubs to prove my income. I said they were in England. She sent me to wait to talk to her supervisor.
I looked at the information provided to me while I waited. "Cellulitis usually clears up on its own." No mention of staph or MRSA. The financial person called me back.
“You’re not a resident here.” We began again. I finally gave up. She asked what had happened during my visit. “Oh, that won’t cost much anyway.”
“How much will it be?”
They don’t tally it up for a couple of weeks. In my experience, a trip to an emergency room is at least $400, so I really hope this will be billed as if their clinic still existed.
I took the prescription to a Walgreens pharmacy, despite knowing that they have a 1000% markup on some drugs, including ones I got from them in the past. 7 days of the new antibiotic cost $60, but if I spent $20 to enroll in their discount program, I could get it for $30. Obviously, they have a large markup on antibiotics also. Charming. I enrolled in the program. The form I got explained that it was not health insurance. No kidding.

Post Script

The bill from Highland came out to $283, which is a lot less than I’d anticipated.

Ignorance is Strength; Socialism is Slavery

It has recently been put to me that socialism is slavery, because under socialism, the person belongs to the state and not themselves. The speaker’s inspiration was likely the not-yet-implemented health reform, because he wrongly fears he will be compelled to get regular checkups from a physician.
By his logic (and I use that term loosely) the entire first world and part of the third world are currently enslaved. France? Enslaved. Germany? Enslaved.
Of course, America is no stranger to slavery. There are people alive today who grew up hearing their own grandparents' accounts of what it was like to be born into bondage: to have no rights, to be considered less than human, to fear random violence.
Slaves were compelled to work at jobs that were often dangerous or life-shortening. The method of compulsion was systemic and gruesome violence. People were often brutally whipped. The owner had a "right" to sexual abuse. In the normal course of trade, families could be broken up and torn apart: spouses sold to separate locations, children taken from their parents. And it took just an accusation of sex to cause a lynching, something which did not just include hanging, but also torture.
Slavery in America has never been equaled in its brutality, its violence and its injustice. One race of people systematically beat, exploited and oppressed another race of people, taking the output of their labour for themselves and providing violence in return. And this system of violence and terror is exactly like . . . universal access to healthcare?
Stay sane, America.