Soliloquy [NOC]

A non-visual world created with wind and sound. A collaboration with Alex Dodge.

Most of us know our world through visual means; our other senses are often overshadowed or taken for granted, but they give us a fuller sense of the world around us. So in Soliloquy, we wanted to explore the way humans interact with non-visual feedback by depriving the user of any visual feedback. In addition, it's a way to create a world that exists entirely in the user's own mind, a world in which a person fills in the gaps with the visual associations they already carry for these sounds.


The installation is a circular rig with eight fans hanging off the side at about head height. The user sits on a seat in the middle of these fans, then puts on headphones and a blindfold. After a quick calibration, the user can hear sounds from the world and explore it by leaning in whichever direction they wish to move. As they move in a certain direction, they can feel the fans, i.e. wind, blowing in their face, as if they are flying through the world. The faster they move, the stronger the wind. The user can also hear wind rushing past their ears, and they can hear sounds in 3D space and attempt to chase them.

Image by Alex Dodge. His initial post is here.

The rig

The physical rig, designed by Alex, is a ring of eight fans hung in a circular fashion. We are using four 200 mm PC fans because they run off of DC power, which allows us to manipulate their speed. There are also four smaller fans to close the gaps between the four big fans. The ring of fans hangs off of a stand for audio speakers. Much credit to Alex for this incredible engineering feat. The rig is easily collapsible and can be stored in small spaces, which was much needed since we had no staging space to work with.

The circuit

We were essentially talking out from Processing to Arduino: sending four values in, and writing those values to the eight fans (six for the front and back, two for the left and right). Alex designed power supplies out of old PC power units.

The circuit is wired through a TIP120 Darlington transistor, since we are working with a high-power load. We simply used the example from the Physical Computing lab on high-current loads.

Now, one of the troubles we ran into was writing values from Processing to Arduino. We were taught how to write from Arduino to Processing — and that’s simple, because Processing has functions that parse the data coming in from Arduino. (Here’s a lab demonstrating that.) However, writing from Processing to Arduino meant we had to parse the data on the Arduino side using arrays. With the help of Tom Igoe, we figured it out.

We wrote out from Processing with the following code:

void fans() {
  // Run the fans depending on whether we're going right/left, forward/backward.
  if (vel.y < 0) {
    fan1 = int(map(vel.y, 0, -20, 920, 1023)); // Straight ahead, pin 9
  } else {
    fan4 = int(map(vel.y, 0, 20, 900, 1023));  // Backward, pin 3
  }
  if (fan1 < 1000 && fan4 < 1000) {
    if (vel.x < 0) {
      fan3 = int(map(vel.x, 0, -20, 900, 1023)); // To the left, pin 6
    } else {
      fan2 = int(map(vel.x, 0, 20, 900, 1023));  // To the right, pin 5
    }
  }
  // We're getting the fans going at full blast for 5 seconds
  if (millis() < 5000) {
    port.write(1023); // FORWARD
    port.write(1023); // RIGHT
    port.write(1023); // LEFT
    port.write(1023); // BACKWARD
  } else {
    // We're writing to the fans here
    port.write(fan1); // FORWARD
    port.write(fan2); // RIGHT
    port.write(fan3); // LEFT
    port.write(fan4); // BACKWARD
  }
}
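Those map() calls do the heavy lifting: they rescale a lean velocity into the range the fans respond to. As a sanity check, here is Processing's map() formula reimplemented in plain Java (this is my own reimplementation for illustration, not the sketch's code):

```java
public class MapDemo {
    // Processing-style map(): linearly rescale value from [a1, b1] to [a2, b2]
    static float map(float value, float a1, float b1, float a2, float b2) {
        return a2 + (value - a1) * (b2 - a2) / (b1 - a1);
    }

    public static void main(String[] args) {
        // A half-strength forward lean (vel.y = -10) lands midway in the fan range
        System.out.println(map(-10, 0, -20, 920, 1023)); // 971.5
    }
}
```

Note that the output range starts around 900 rather than 0: below that, the fans don't spin at all.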

So visually, we’re writing out to Arduino like this:

R, 1023, G, 900, B, 900, Q, 900
R, 1000, G, 900, B, 800, Q, 800

Our delimiter is a comma; that's how we know a new array value is coming. And we know to reset the array when we hit a newline, represented by "\n".
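To make the framing concrete, here is how one of those frames could be assembled in plain Java. The build() helper and its exact spacing are hypothetical, for illustration only; the real sketch writes to the serial port directly:

```java
public class FanMessage {
    // Assemble one frame: marker, value, marker, value..., ending in a newline
    // so the Arduino side knows when to reset its buffer.
    static String build(int fan1, int fan2, int fan3, int fan4) {
        return "R, " + fan1 + ", G, " + fan2 + ", B, " + fan3 + ", Q, " + fan4 + "\n";
    }

    public static void main(String[] args) {
        System.out.print(build(1023, 900, 900, 900));
        // R, 1023, G, 900, B, 900, Q, 900
    }
}
```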

Now, to read values in Arduino, we had to store the incoming values in a buffer array. Here’s the code:

char buffer[9];
int counter = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    char thisByte = Serial.read();
    buffer[counter] = thisByte;
    counter++;
    if (thisByte == '\n') {
      parseBuffer();
      counter = 0;
    }
  }
}

void parseBuffer() {
  if (buffer[0] == 'R') {
    analogWrite(9, buffer[1]);
  }
  if (buffer[2] == 'G') {
    analogWrite(6, buffer[3]);
  }
  if (buffer[4] == 'B') {
    analogWrite(5, buffer[5]);
  }
  if (buffer[6] == 'Q') {
    analogWrite(3, buffer[7]);
  }
}

First, Arduino reads a byte from the serial port and puts it in the thisByte variable. Then it puts that character into the buffer. So if the counter starts at 0, it writes to buffer[0]. Then it increases the counter by one, so we write to buffer[1] on the next loop.

It keeps doing that until it finds the new-line character, "\n". That's when it knows to begin parsing the buffer, which now looks something like:

R, 1023, G, 900, B, 900, Q, 900
First, the program asks whether the first buffer slot, buffer[0], holds the character "R". If it doesn't, we know the frame isn't aligned and our values will be off. But if it does, we know the following number in buffer[1] is the first value we want to read. We do the same for the remaining three markers to get all four values. And once all the numbers are parsed, we reset the counter to 0, so the next set of fan values starts writing at buffer[0] again.
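The counter-and-buffer dance is easy to get wrong, so here is the same framing logic as standalone Java. The feed() function is a hypothetical stand-in for one pass of the Arduino loop:

```java
public class BufferParser {
    static char[] buffer = new char[16];
    static int counter = 0;
    static String lastFrame = null;

    // Store each incoming byte; on a newline, capture the frame and reset
    // the counter so the next frame starts writing at buffer[0] again.
    static void feed(char b) {
        buffer[counter] = b;
        counter++;
        if (b == '\n') {
            lastFrame = new String(buffer, 0, counter - 1);
            counter = 0; // ready for the next set of fan values
        }
    }

    public static void main(String[] args) {
        for (char c : "RaGbBcQd\n".toCharArray()) feed(c);
        System.out.println(lastFrame); // RaGbBcQd
    }
}
```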

In all, we are writing nine values from Processing to Arduino: a character to determine which port we want to write to (in this case R, G, B and Q), the value for each of those ports, and lastly a newline character.

The body tracking

In order to track the body, we used the Microsoft Kinect. Using the OSCeleton library, which is developed by Sensebloom, we are able to track the joints of a human being. (I recommend using Tohm Judson’s guide to installation.) It returns each joint as an array of x, y and z coordinates. So for the head, you would get a variable that looks like: head[100,300,900], indicating where the head is.

So in order to track how a person’s body is moving, we found the angle between a horizontal line and the shoulders. That determined left-to-right movement. For forward-to-back, we looked at the angle between a vertical line and the neck/torso. Simple trigonometry did the trick here.
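As an illustration of that trigonometry, here is a lean angle computed in plain Java using atan2 (a simpler formulation than the cos/tan math in the actual sketch; the coordinates are invented values in OSCeleton's normalized range):

```java
public class LeanAngle {
    // Angle (radians) between a horizontal line and the line through the
    // shoulders. Level shoulders give 0; a dropped right shoulder gives a
    // positive angle, i.e. a lean to the right.
    static double shoulderAngle(double lx, double ly, double rx, double ry) {
        return Math.atan2(ry - ly, rx - lx);
    }

    public static void main(String[] args) {
        System.out.println(shoulderAngle(0.4, 0.5, 0.6, 0.5));     // level: 0.0
        System.out.println(shoulderAngle(0.4, 0.5, 0.6, 0.7) > 0); // leaning: true
    }
}
```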

Now, this is a great library, except for one problem: it requires calibration, which means someone has to put their hands up in the air, bent at a 90-degree angle at the elbows. While it worked for some people in our rig, it didn't work for taller people. It just wouldn't calibrate, and that was unacceptable. So in the coming days, we will work on tracking the user from atop the rig with color tracking and a normal camera. It's a much simpler approach, but it seems to be the best one. We may even resort to IR tracking, if it comes to it.

The visuals

Even though this isn’t a visual system, we had to create some type of visual indication for a few reasons. First off, debugging would be impossible without visual feedback. Secondly, we are visually-oriented people, and we know space as a mainly visual thing.

So I created a world that visualizes the person and the sounds around them. Here it is below:

The white dot is the person. The blue dots are the sounds. The average range of the sound is indicated by the translucent circle surrounding the dot.

Now, the model we are using for movement is flying. When you fly, I assume you can't stop on a dime, much like swimming; there's momentum. So we used the location, velocity and acceleration model we learned from Dan Shiffman's Vectors lesson to achieve this. When we got values in from the Kinect camera about how far the person is leaning, we fed that number into the acceleration variable. So a person speeds up gradually, not instantly.

Now, with fans, there’s already real-world physics there. We don’t need to program in acceleration. But we do need to program it in for the computer world. To marry those two worlds together, I set the friction quite high and the acceleration high, too. This way, you can accelerate quickly and decelerate quickly. It makes it a little more responsive.
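A one-dimensional sketch of that update loop, with placeholder constants rather than the sketch's tuned values:

```java
public class Mover {
    double loc = 0, vel = 0;

    // One frame: apply the lean-driven acceleration, then bleed off a
    // fraction of the velocity as friction. High acc plus high friction
    // means quick acceleration and quick deceleration.
    void update(double acc, double friction) {
        vel += acc;
        vel *= (1 - friction);
        loc += vel;
    }

    public static void main(String[] args) {
        Mover m = new Mover();
        for (int i = 0; i < 10; i++) m.update(1.0, 0.3); // leaning forward
        double cruising = m.vel;
        for (int i = 0; i < 10; i++) m.update(0.0, 0.3); // sitting upright again
        System.out.println(m.vel < cruising); // true: we coast back toward zero
    }
}
```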

Lastly, there's the zooming and tracking functionality of the visuals. Now, OpenGL doesn't play nicely with ToxicLibs, so I kind of hacked together a fake 3D. For the tracking, I used the translate function in two dimensions and translated the whole visual world as we moved the screen. For the zooming, I faked 3D by using the scale() function and scaling the entire world. This type of 3D wouldn't work if we rotated anything… but we're not rotating anything, so this was perfect!
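Under the hood, the fake 3D is just an affine transform: shift the world by (transX, transY), then multiply by the zoom factor transZ. A minimal model, assuming translate is applied before scale (the sketch's initial values of 440, 460 and 0.1 are used here):

```java
public class FakeZoom {
    // Where a world-space point lands on screen after translate(tx, ty)
    // followed by scale(s): both the point and the pan offset get zoomed.
    static double[] toScreen(double wx, double wy, double tx, double ty, double s) {
        return new double[] { s * (wx + tx), s * (wy + ty) };
    }

    public static void main(String[] args) {
        double[] p = toScreen(100, 50, 440, 460, 0.1);
        System.out.println(p[0] + ", " + p[1]); // 54.0, 51.0
    }
}
```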

The sounds

We used ToxicLibs’ audio library, which can be downloaded here. The documentation for that can be found here.

In order to set the location of the listener, we can use the SoundListener class and use the setPosition() function, which takes an x, y and z value. Now, we’re working in a 2D space so we always set the z-value to 0. In order to set the location of the sounds, we used the AudioSource class, which has a setPosition() function as well, and also takes x, y and z values.

This all seems easy — until you want to place these sounds in a space, and have them increase and decrease in volume, as well as have a doppler effect. There are a few steps to this:

1. I found that only mono wav files would work with this. Otherwise, the sounds did not have a location.

2. In order to set the sounds, you have to use the function setReferenceDistance(). This function takes a number which determines how far away you can hear a sound, and how loudly you can hear it. It basically determines the falloff.

3. That seems easy enough… except that falloff doesn’t always work. Remember: When working with a non-visual world, everything is relative. On a computer screen, the proportions are determined by the size of your screen and your resolution. But in a non-visual world, it can be infinite. So if the user can move faster, but the world is bigger, then that’s the same as the user moving slower and the world being smaller. Eeek.

So we had a problem with the falloff not working properly. In order to fix that, I made a little if-statement that makes the falloff more of an exponential function. Keep in mind that you[0] and you[1] are the x- and y-coordinates of the listener. The position[0] and position[1] are the x- and y-coordinates of the sound. Those values can be found by using the getPosition() function, which returns an array with three values of x, y and z:

you = listener.getPosition();
for (int i = 0; i < sound.length; i++) {
  position = sound[i].getPosition();
  if (dist(position[0], position[1], you[0], you[1]) != 0) {
    // make the falloff more of an exponential function of the distance
  } else {
    // listener is exactly on top of the sound
  }
}
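I'll spare the exact numbers, but the shape of the fix looks like this hypothetical exponential falloff, where the gain halves every fixed number of world units (HALF_DIST is an invented constant, not the sketch's value):

```java
public class Falloff {
    // Gain halves every HALF_DIST world units, so distant sounds fade
    // quickly but never pop in and out the way a linear cutoff would.
    static final double HALF_DIST = 200.0;

    static double gain(double distance) {
        return Math.pow(0.5, distance / HALF_DIST);
    }

    public static void main(String[] args) {
        System.out.println(gain(0));   // 1.0
        System.out.println(gain(200)); // 0.5
        System.out.println(gain(400)); // 0.25
    }
}
```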

That pretty much did it for the sounds. It was just a matter of playing with what types of sounds worked the best from there on out.

Last thoughts

We’ll continue to work on this for the show, but I’ve learned a lot working on this project. First off, the power of relativity: We don’t often know this because we work with limited visual space, but when nothing is definite — and there are no limits — everything is relative. I think that’s a profound realization.

Technically speaking, I’ve learned a lot from Alex about these physical rigs. Also, I learned a bit from Tom Igoe about how to talk from Processing to Arduino and, of course, learned a massive amount from Dan Shiffman’s class about how to emulate nature in computer programming.

The project itself has a powerful appeal to it. Once you “plug in” to the world, you’re in a completely different universe. It has philosophical implications that are quite interesting. It brings to the forefront exactly how much humans construct their own version of reality with the senses we know, with certain weights put on each sense — visuals being the strongest. When you’re in Soliloquy, you are in a completely different world where you are disoriented from the world you are accustomed to.

Going forward, I’m going to program in the functionality that allows for blob tracking with a camera. I hope it will be more effective than using the Kinect.

The code for the entire program can be found below. I have not included the data files for the sake of size. Also, the Coords and Skeleton classes are almost entirely from the OSCeleton example called Stickmanetic. Hopefully we can strip down the code some more once we begin using camera tracking:

by Alvin Chang and Alex Dodge

This is the code for an installation in which a user sits in the middle of a ring of wind-creating fans
and leans his or her body in a certain direction to move through this world. We are using the Xbox Kinect
as the sensor to detect the angle of the user's shoulders, and the angle of the forward/backward lean.
We are using the OSCeleton library from Sensebloom, as well as wind sounds from the user ERH at
We are using Toxiclibs' sound library to create the non-visual sound space.

This project was created in Daniel Shiffman's course, "The Nature of Code" at NYU's Interactive Telecommunications Program.

For more information, e-mail

import oscP5.*;
import netP5.*;
import processing.serial.*;
import toxi.geom.*;
import toxi.audio.*;

OscP5 oscP5;
Shoulders shoulders;
Coords coords;
Skeleton s;
Sounds sound;

//Zoom and pan
int transX = 440;
int transY = 460;
float transZ = .1;

int ballSize = 30;
Hashtable<Integer, Skeleton> skels = new Hashtable<Integer, Skeleton>();

int fan1;
int fan2;
int fan3;
int fan4;
Serial port;

PVector loc;
PVector vel;
PVector acc ;

PVector X;
PVector Y;

boolean calibrated = false;

void setup() {
    shoulders = new Shoulders();
    coords = new Coords();
    sound = new Sounds();
    s = new Skeleton(1);
    loc = new PVector(width/2,height/2);
    vel = new PVector(0,0);
    acc = new PVector(0,0);
    println("Available serial ports:");
    port = new Serial(this, Serial.list()[1], 9600);

void draw() {
  //Drawing the body shape
  for (Skeleton s: skels.values()) {   
  fill(100); //draw skeleton
  ellipse(s.headCoords[0]*width, s.headCoords[1]*height + 30, ballSize*2.5, ballSize*2.5);
  ellipse(s.headCoords[0]*width, s.headCoords[1]*height + 23, ballSize*1.8, ballSize*.8);
  ellipse(s.headCoords[0]*width+ballSize*.3, s.headCoords[1]*height +23, ballSize*.3, ballSize*.3);
  ellipse(s.headCoords[0]*width-ballSize*.3, s.headCoords[1]*height +23, ballSize*.3, ballSize*.3);

  float c = .7;
  PVector friction = vel.get(); 
  if (vel.x < 0.1 && vel.x > -0.1) {
   vel.x = 0; 
  if (vel.y < 0.1 && vel.y > -0.1) {
   vel.y = 0; 
  //Calibrated? Mousepressed sets this to true, and also calibrates
  if (keyPressed && key == ' ') {
   calibrated = true; 
  //If it isn't calibrated, don't apply any forces
  if (calibrated == true) {
  X = new PVector(shoulders.angle*-12,0);
  Y = new PVector(0,shoulders.angle2*1.5);
  } else {
  X = new PVector(0,0);
  Y = new PVector(0,0); 
  text("Loc X: " + int(loc.x-450),10,30);
  text("Loc Y: " + int(loc.y-450),10,45);
  text("Speed-X: " + int(vel.x*3),10,70);
  text("Speed-Y: " + int(vel.y*3),10,85);
  text("World X: " + (transX-440),10,110);
  text("World Y: " + (transY-460),10,125);
  text("Zoom: " + int(transZ*1000) + "%",10,140);
  text("Calibrated (press SPACEBAR): " + calibrated,10,165);
  text("CONTROLS:", 10, 195);
  text("Press 'R' to re-place sounds",10,210);
  text("arrow keys move world",10,225);
  text("'a' zooms in, 'z' zooms out ",10,240);
  text("(when debugging, j/i/k/l moves listener)",10,255);
  //Forces: X is left-right movement. Y is forward/backward. Friction is, well, friction.
  //Shoulders calculates the shoulder angles for the force
  //Initiates the Serial stuff;
  //This allows us to move around the screen with the arrow keys
  //We're running sound in here because the sounds class draws the location of the listener;

//We're adding up all the values for movement
void update() {

//We're applying a force here
void applyForce(PVector f) {

//Our serial data
void fans() {
  //We're running fans, depending on whether we're going right/left, forward/backward. 
  if (vel.y <  0) {
   fan1 = int(map(vel.y,0,-20,920,1023)); //Straight ahead, 9 
  } else {
  fan4 = int(map(vel.y,0,20,900,1023)); //Backward, 3
  if (fan1 < 1000 && fan4 < 1000) {
  if (vel.x < 0) {
   fan3 = int(map(vel.x,0,-20,900,1023)); //To the left, 6
  } else {
  fan2 = int(map(vel.x,0,20,900,1023)); //To the right, 5
  if (millis() < 5000) {
  port.write(1023); //FORWARD
  port.write(1023); //RIGHT
  port.write(1023); //LEFT
  port.write(1023); //BACKWARD
  ///We're writing to the fans here
  port.write(fan1); //FORWARD
  port.write(fan2); //RIGHT
  port.write(fan3); //LEFT
  port.write(fan4); //BACKWARD

void zoom() {
 if (keyPressed && keyCode == UP) {
  transY += 4;
 if (keyPressed && keyCode == DOWN) {
  transY -= 4;
 if (keyPressed && keyCode == LEFT) {
  transX += 4;
 if (keyPressed && keyCode == RIGHT) {
  transX -= 4;

 if (keyPressed && key == 'a') {
  transZ += 0.0006;
 if (keyPressed && key == 'z') {
  transZ -= 0.0006;

class Coords {
float ballsize = 20;

Coords() {
  oscP5 = new OscP5(this, "", 7110);
void run() {
  for (Skeleton s: skels.values()) {


/* incoming osc message are forwarded to the oscEvent method. */
// Here you can easily see the format of the OSC messages sent. For each user, the joints are named with 
// the joint named followed by user ID (head0, neck0 .... r_foot0; head1, neck1.....)
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/joint") && msg.checkTypetag("sifff")) {
    // We have received joint coordinates, let's find out which skeleton/joint and save the values 😉
    Integer id = msg.get(1).intValue();
    Skeleton s = skels.get(id);
    if (s == null) {
      s = new Skeleton(id);
      skels.put(id, s);
    if (msg.get(0).stringValue().equals("head")) {
      s.headCoords[0] = msg.get(2).floatValue();
      s.headCoords[1] = msg.get(3).floatValue();
      s.headCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("neck")) {
      s.neckCoords[0] = msg.get(2).floatValue();
      s.neckCoords[1] = msg.get(3).floatValue();
      s.neckCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_collar")) {
      s.rCollarCoords[0] = msg.get(2).floatValue();
      s.rCollarCoords[1] = msg.get(3).floatValue();
      s.rCollarCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_shoulder")) {
      s.rShoulderCoords[0] = msg.get(2).floatValue();
      s.rShoulderCoords[1] = msg.get(3).floatValue();
      s.rShoulderCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_elbow")) {
      s.rElbowCoords[0] = msg.get(2).floatValue();
      s.rElbowCoords[1] = msg.get(3).floatValue();
      s.rElbowCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_wrist")) {
      s.rWristCoords[0] = msg.get(2).floatValue();
      s.rWristCoords[1] = msg.get(3).floatValue();
      s.rWristCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_hand")) {
      s.rHandCoords[0] = msg.get(2).floatValue();
      s.rHandCoords[1] = msg.get(3).floatValue();
      s.rHandCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_finger")) {
      s.rFingerCoords[0] = msg.get(2).floatValue();
      s.rFingerCoords[1] = msg.get(3).floatValue();
      s.rFingerCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_collar")) {
      s.lCollarCoords[0] = msg.get(2).floatValue();
      s.lCollarCoords[1] = msg.get(3).floatValue();
      s.lCollarCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_shoulder")) {
      s.lShoulderCoords[0] = msg.get(2).floatValue();
      s.lShoulderCoords[1] = msg.get(3).floatValue();
      s.lShoulderCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_elbow")) {
      s.lElbowCoords[0] = msg.get(2).floatValue();
      s.lElbowCoords[1] = msg.get(3).floatValue();
      s.lElbowCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_wrist")) {
      s.lWristCoords[0] = msg.get(2).floatValue();
      s.lWristCoords[1] = msg.get(3).floatValue();
      s.lWristCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_hand")) {
      s.lHandCoords[0] = msg.get(2).floatValue();
      s.lHandCoords[1] = msg.get(3).floatValue();
      s.lHandCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_finger")) {
      s.lFingerCoords[0] = msg.get(2).floatValue();
      s.lFingerCoords[1] = msg.get(3).floatValue();
      s.lFingerCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("torso")) {
      s.torsoCoords[0] = msg.get(2).floatValue();
      s.torsoCoords[1] = msg.get(3).floatValue();
      s.torsoCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_hip")) {
      s.rHipCoords[0] = msg.get(2).floatValue();
      s.rHipCoords[1] = msg.get(3).floatValue();
      s.rHipCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_knee")) {
      s.rKneeCoords[0] = msg.get(2).floatValue();
      s.rKneeCoords[1] = msg.get(3).floatValue();
      s.rKneeCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_ankle")) {
      s.rAnkleCoords[0] = msg.get(2).floatValue();
      s.rAnkleCoords[1] = msg.get(3).floatValue();
      s.rAnkleCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("r_foot")) {
      s.rFootCoords[0] = msg.get(2).floatValue();
      s.rFootCoords[1] = msg.get(3).floatValue();
      s.rFootCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_hip")) {
      s.lHipCoords[0] = msg.get(2).floatValue();
      s.lHipCoords[1] = msg.get(3).floatValue();
      s.lHipCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_knee")) {
      s.lKneeCoords[0] = msg.get(2).floatValue();
      s.lKneeCoords[1] = msg.get(3).floatValue();
      s.lKneeCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_ankle")) {
      s.lAnkleCoords[0] = msg.get(2).floatValue();
      s.lAnkleCoords[1] = msg.get(3).floatValue();
      s.lAnkleCoords[2] = msg.get(4).floatValue();
    else if (msg.get(0).stringValue().equals("l_foot")) {
      s.lFootCoords[0] = msg.get(2).floatValue();
      s.lFootCoords[1] = msg.get(3).floatValue();
      s.lFootCoords[2] = msg.get(4).floatValue();
  else if (msg.checkAddrPattern("/new_user") && msg.checkTypetag("i")) {
    // A new user is in front of the kinect... Tell him to do the calibration pose!
//    println("New user with ID = " + msg.get(0).intValue());
  else if(msg.checkAddrPattern("/new_skel") && msg.checkTypetag("i")) {
    //New skeleton calibrated! Lets create it!
    Integer id = msg.get(0).intValue();
    Skeleton s = new Skeleton(id);
    skels.put(id, s);
  else if(msg.checkAddrPattern("/lost_user") && msg.checkTypetag("i")) {
    //Lost user/skeleton
    Integer id = msg.get(0).intValue();
//    println("Lost user " + id);


class Shoulders {
  float angle; 
  float angle2;
  float initial1 = 0;
  float initial2 = 0;

  Shoulders() {

  void pan() {
    //Using trigonometry to calculate the angle between a straight line and the alignment of the shoulders. Basically, we're doing cos = adjacent/hypotenuse. And we're using the shoulder coordinates to do it.
    //Also notice that we're taking the inital value, which is the calibrate value set by this same calculation at mousePressed. This calibrates to 0.
    for (Skeleton s: skels.values()) {
      angle = initial1 - ((cos(dist(s.headCoords[0], s.headCoords[1], s.headCoords[0], s.rShoulderCoords[1])/dist(s.headCoords[0], s.headCoords[1], s.rShoulderCoords[0], s.rShoulderCoords[1]))));

    text("Side angle: " + int((initial1-angle)*1000), 10, 15);

  void zoom() {
    //Calculates the angle between a vertical line and the line between the torso and the neck.
    for (Skeleton s: skels.values()) {
      if (s.neckCoords[2] < s.torsoCoords[2]) {
        angle2 = initial2 - (1*(tan(dist(s.torsoCoords[2], s.torsoCoords[1], s.neckCoords[2], s.torsoCoords[1])/dist(s.torsoCoords[2], s.torsoCoords[1], s.neckCoords[2], s.neckCoords[1]))));
      else { 
        angle2 = initial2 - (-2*(tan(dist(s.torsoCoords[2], s.torsoCoords[1], s.neckCoords[2], s.torsoCoords[1])/dist(s.torsoCoords[2], s.torsoCoords[1], s.neckCoords[2], s.neckCoords[1]))));

    text("Forward angle: " + int(1000*(initial2-angle2)), 100, 15);

    if (keyPressed && key == ' ') {
      for (Skeleton s: skels.values()) {
        initial1 = (cos(dist(s.headCoords[0], s.headCoords[1], s.headCoords[0], s.rShoulderCoords[1])/dist(s.headCoords[0], s.headCoords[1], s.rShoulderCoords[0], s.rShoulderCoords[1])));
        if (s.neckCoords[2] < s.torsoCoords[2]) {
          initial2 = 1*(tan(dist(s.torsoCoords[2], s.torsoCoords[1], s.neckCoords[2], s.torsoCoords[1])/dist(s.torsoCoords[2], s.torsoCoords[1], s.neckCoords[2], s.neckCoords[1])));
        else { 
          initial2 = -2*(tan(dist(s.torsoCoords[2], s.torsoCoords[1], s.neckCoords[2], s.torsoCoords[1])/dist(s.torsoCoords[2], s.torsoCoords[1], s.neckCoords[2], s.neckCoords[1])));

  void run() {

class Skeleton {
  // We just use this class as a structure to store the joint coordinates sent by OSC.
  // The format is {x, y, z}, where x and y are in the [0.0, 1.0] interval, 
  // and z is in the [0.0, 7.0] interval.
  float headCoords[] = new float[3];
  float neckCoords[] = new float[3];
  float rCollarCoords[] = new float[3];
  float rShoulderCoords[] = new float[3];
  float rElbowCoords[] = new float[3];
  float rWristCoords[] = new float[3];
  float rHandCoords[] = new float[3];
  float rFingerCoords[] = new float[3];
  float lCollarCoords[] = new float[3];
  float lShoulderCoords[] = new float[3];
  float lElbowCoords[] = new float[3];
  float lWristCoords[] = new float[3];
  float lHandCoords[] = new float[3];
  float lFingerCoords[] = new float[3];
  float torsoCoords[] = new float[3];
  float rHipCoords[] = new float[3];
  float rKneeCoords[] = new float[3];
  float rAnkleCoords[] = new float[3];
  float rFootCoords[] = new float[3];
  float lHipCoords[] = new float[3];
  float lKneeCoords[] = new float[3];
  float lAnkleCoords[] = new float[3];
  float lFootCoords[] = new float[3];
  float[] allCoords[] = {headCoords, neckCoords, rCollarCoords, rShoulderCoords, rElbowCoords, rWristCoords,
                       rHandCoords, rFingerCoords, lCollarCoords, lShoulderCoords, lElbowCoords, lWristCoords,
                       lHandCoords, lFingerCoords, torsoCoords, rHipCoords, rKneeCoords, rAnkleCoords,
                       rFootCoords, lHipCoords, lKneeCoords, lAnkleCoords, lFootCoords};
  int id; //here we store the skeleton's ID as assigned by OpenNI and sent through OSC.

  Skeleton(int id) { this.id = id; }
  void drawBone(float joint1[], float joint2[]) {
    if ((joint1[0] == -1 && joint1[1] == -1) || (joint2[0] == -1 && joint2[1] == -1))
      return;
  float dx = (joint2[0] - joint1[0]) * width;
  float dy = (joint2[1] - joint1[1]) * height;
  float steps = 4 * sqrt(pow(dx,2) + pow(dy,2)) / ballSize;
  float step_x = dx / steps / width;
  float step_y = dy / steps / height;
  for (int i=0; i<=steps; i++) {
    ellipse((joint1[0] + (i*step_x))*width, 
            (joint1[1] + (i*step_y))*height, 
            ballSize, ballSize);

void run() {
  //Head to neck
    //Center upper body
    drawBone(rShoulderCoords, neckCoords);
    drawBone(lShoulderCoords, neckCoords);
    drawBone(neckCoords, torsoCoords);
    //Right upper body
    drawBone(rShoulderCoords, rElbowCoords);
    drawBone(rElbowCoords, rHandCoords);
    //Left upper body
    drawBone(lShoulderCoords, lElbowCoords);
    drawBone(lElbowCoords, lHandCoords);
    //drawBone(rShoulderCoords, rHipCoords);
    //drawBone(lShoulderCoords, lHipCoords);
    drawBone(rHipCoords, torsoCoords);
    drawBone(lHipCoords, torsoCoords);
    //drawBone(lHipCoords, rHipCoords);
    //Right leg
  //  drawBone(rHipCoords, rKneeCoords);
  //  drawBone(rKneeCoords, rFootCoords);
  //  drawBone(rFootCoords, lHipCoords);
    //Left leg
  //  drawBone(lHipCoords, lKneeCoords);
  //  drawBone(lKneeCoords, lFootCoords);
  //  drawBone(lFootCoords, rHipCoords); 

class Sounds {
    JOALUtil audioSys;
    AudioSource[] sound = new AudioSource[20];
    AudioSource backnoise;
    SoundListener listener;
    float position[];
    float you[];
    boolean useFalloff=true;
    float gain;
    float offsetx;
    float offsety;
  Sounds() {
  audioSys = JOALUtil.getInstance();

   //People scene
    //Sparrow and water
    //Panflute and gong
    //Bell and flute
    //Frog and waterflow
    //Jay and wind
    //birds follow
  //baby and zip
    //Keyboard and crunch
    //steps and clong

  for (int i = 0; i < sound.length; i++) {  
  //Background noise
  backnoise = audioSys.generateSourceFromFile(dataPath("backnoise.wav"));

  void run() {
  you = listener.getPosition();
  for (int i = 0; i < sound.length; i++) {
  position = sound[i].getPosition();
  if (dist(position[0],position[1],you[0],you[1]) != 0) {
  } else {
  you = listener.getPosition();


void reset() {
 if (keyPressed && key == 'r') { 
   for (int i = 0; i < sound.length; i++) {

void debug() {


  if (keyPressed && key == 'l') {
   acc.x += 1; 
  if (keyPressed  && key == 'j') {
   acc.x -= 1; 
  if (keyPressed  && key == 'k') {
   acc.y += 1; 
  if (keyPressed  && key == 'i') {
   acc.y -= 1; 

void backnoise() {
  gain = map(abs(vel.x+vel.y),0,40,.005,1);
  if (vel.x > 1) {
  offsetx = 1;  
  } else if (vel.x < -1) {
  offsetx = -1; 
  } else {
  offsetx = 0;
  if (vel.y > 1) {
    offsety = 3;
  } else if (vel.y < -1) {
    offsety = -3;
  } else {
    offsety = 0;

public void stop() {

