Tag Archives: Design and Digital Media

[DDM] Angklung with Two Tubes


Today I cleaned up my abstractions and added help files. I have already finished three abstractions for my project: ak_speaker, ak_player, and ak_pitchshifter. These are just basic abstractions. Besides the cleanup, today I also managed to create another abstraction, which I called ak_angklung_2tubes. Why Angklung 2 (two) tubes? I tried to structure my patch so that it is easy to understand. "Angklung 2 tubes" means that in this abstraction I use two samples, one for the big tube and another for the small tube. As I wrote in an earlier blog post, the Angklung has many variants: the 2 (two)-tube kind is called melody Angklung, while the 3 (three)- or 4 (four)-tube kinds are called accompaniment Angklung, so this new abstraction was created specifically for the melody Angklung.
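The abstraction itself is a Pure Data patch, so there is no code to paste here, but the idea behind it can be pictured as plain data: each melody-Angklung note pairs a big-tube sample with a small-tube sample. A rough sketch (the file names are made up for illustration):

```javascript
// Illustrative only: one melody-Angklung note wraps two samples,
// one for the big tube and one for the small tube.
function angklung2Tubes(note) {
  return {
    note,
    samples: [`${note}_big.wav`, `${note}_small.wav`], // hypothetical file names
  };
}
```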

[Image: Capture1]

This abstraction represents one Angklung note, so I need to create another abstraction on top of it so it can be used for any Angklung note. I hope I can finish it tomorrow.

[DDM] Node JS and AIM Client


It wasn’t easy. It was frustrating. But at least I can see the result. It wasn’t bad; well, it’s working, but I’m not easily satisfied with the result.

Okay, these past two days I’ve been working on the client side of my project. At first I wanted to do it with just simple HTML and JavaScript, but it turns out I can’t send OSC messages directly from a web browser. After hours of searching about this problem, I found that most of the solutions that work involve Node JS as a bridge server. I had never used Node JS before, but since this is the only (easy) way to solve my problem, I had no choice.

Learning Node JS wasn’t easy, but it wasn’t that hard either, well, at least for me :p. Using the Express web framework, I managed to create both the client (i.e. the thing that runs in a web browser) and the bridge server entirely as one Node JS application. The client itself is just a basic interface for the application settings. The image below is a screenshot of the client.

AIM Client

I also managed to integrate this with my Pure Data patch and, of course, it works. I still need to make the client more responsive and also need to clean up my patch. It’s still a long way to go, but I’ll manage ^_^.

By the way, maybe you noticed that I mention “AIM” in this blog title, and you’re right, it is my project’s name: Angklung In Motion. Basically, Angklung In Motion means Angklung using Leap Motion :D.

[DDM] Lucky


Today I found something that can be very helpful for my project. In my last blog post I wrote that I needed to find the velocity of a certain point. Luckily, after reading the documentation and the developer forum, I found out that the library already provides the finger velocity; with this attribute I don’t need to calculate it myself and can just use it. Well, it doesn’t mean I don’t need to calculate anything else, it just makes my life easier :).

With the Leap Motion data plotter plugin I can see that I have to find some thresholds for this velocity so I can somehow use it to determine the amplitude of the sound output: whether it is faint, normal, or loud.
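As a sketch of what that thresholding could look like: LeapJS gives the finger velocity as a `[vx, vy, vz]` vector in mm/s, so the natural thing is to threshold its magnitude. The threshold numbers below are placeholders that would have to be tuned with the data plotter:

```javascript
// Magnitude of a LeapJS velocity vector ([vx, vy, vz] in mm/s).
function speed(tipVelocity) {
  const [vx, vy, vz] = tipVelocity;
  return Math.sqrt(vx * vx + vy * vy + vz * vz);
}

// Map the speed to a loudness category; the 300/800 mm/s thresholds
// are made-up placeholders, not tuned values.
function loudness(tipVelocity, faintMax = 300, normalMax = 800) {
  const s = speed(tipVelocity);
  if (s < faintMax) return 'faint';
  if (s < normalMax) return 'normal';
  return 'loud';
}
```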

[Images: Capture3, Capture4]

[DDM] Hello Leap Motion


For my project I will use the Leap Motion controller as my motion sensor. For this purpose I have to learn about the Leap Motion itself and, of course, how to create applications using it. The Leap Motion SDK is available for a lot of programming languages. For my project I chose LeapJS, the Leap Motion JavaScript library. Why did I choose JavaScript? Because it can run in almost all modern browsers, so it will be easy for the client.

I have to choose which hand and fingers I will use, what information I need, what data I can get from the Leap Motion, etc. After some time poking around its documentation, I found some useful information on how to use it for my needs. The image below is what the Leap Motion sees from its perspective.

Hand image from Leap Motion

I make this hand shape because it is the closest one to how you actually hold an Angklung. From this image you can see that there are three fingers I can use as a reference. These fingers are the most visible from the Leap Motion’s point of view, and after some trials I chose the thumb as my reference. Why the thumb? Because the pinky finger can be mistaken for other fingers, and sometimes the index finger is not really visible when I shake.

I can’t stop here; I still have to choose which point/position on the thumb I will use. According to the image below from the Leap Motion documentation, there are 5 (five) positions I can use, plus another 3 (three) bone types I can use too (because I can get their center positions).

[Image: Capture2]

I wanted to use “btipPosition”, but I think it can be unclear, so I looked for the closest point to “btipPosition”, which is the center point of the distal phalanx bone. So I decided to use the center point of the distal phalanx.
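As a sketch of what this choice means in code: in LeapJS a finger carries a type code (0 is the thumb) and bone objects whose `center()` is just the midpoint of the bone’s two joints. The standalone helpers below mirror that on plain frame data; the data shapes are illustrative:

```javascript
// Midpoint of a bone's two joints, which is what LeapJS's Bone.center()
// computes (positions are [x, y, z] in mm).
function boneCenter(prevJoint, nextJoint) {
  return prevJoint.map((p, i) => (p + nextJoint[i]) / 2);
}

// Pick the thumb out of a hand's finger list by its LeapJS type code (0 = thumb).
function thumbOf(fingers) {
  return fingers.find((f) => f.type === 0);
}
```

In a live `Leap.loop` callback, the same point would come from the thumb finger’s distal bone on each frame.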

Getting this point/position is not that difficult in code; the challenge now is how to get the movement velocity of this point/position. Using this velocity I can then define some shaking parameters for my project.
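One straightforward way to estimate that velocity, sketched here purely as an illustration, is a finite difference of the tracked position between consecutive frames:

```javascript
// Finite-difference velocity estimate between two frames:
// positions are [x, y, z] in mm, dt is the frame interval in seconds,
// so the result is in mm/s.
function velocity(prevPos, currPos, dt) {
  return currPos.map((c, i) => (c - prevPos[i]) / dt);
}
```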

[DDM] Poly and OSC


These past few days I have been searching for how to do multithreading in Pure Data. My supervisor told me about the Poly object in Max MSP, and Pure Data has the same object. After doing my research and trying some patches, it seems the Poly object isn’t what I need. I will try to learn more about it next time, or maybe I can find a workaround for this problem.

I also tried to learn more about OSC in Pure Data. After following some tutorials, I think OSC is not that complicated: it is just sending and receiving data in a certain format. The thing I need to think more about is the data structure itself. I can create any data structure I want, but I have to decide what data I need for my project, and to do that I have to finish the client part of my project (the Leap Motion controller part).
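The data-structure decision really comes down to choosing an address scheme and an argument list per event. Purely as an illustration (the /aim/* addresses and arguments below are made up, not my project’s schema), the messages could take a shape like this:

```javascript
// Hypothetical OSC message shapes: one address per event type, with the
// note and loudness carried as arguments.
function shakeMessage(note, loudness) {
  return { address: '/aim/shake', args: [note, loudness] };
}

function stopMessage(note) {
  return { address: '/aim/stop', args: [note] };
}
```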
