Here is some video I shot shortly after my project presentation; it demonstrates the AirDeck virtual theremin application I designed and walks through some of its features. The AirDeck uses the Wii remote as an input mechanism, tracking hand motion with infrared LEDs. It is written in Java, with the WiiuseJ API handling Wii remote events and the JSyn API handling internal synthesis. It can also drive MIDI out and offers a simple DJ scratch interface for real-time manipulation of sound samples, much like a DJ scratching vinyl records. After about a year of working on this project, I am very relieved that it is finished. I’ve learned a lot and am looking forward to possibly working on other similar concepts in the not too distant future.
Category: Java
Converting MP3s in Java
I was surprised to find that there was very little information on the internets about decoding MP3s for manipulation in Java, which is one of the things I would like to add to my project. After spending almost two weeks on this problem, I have finally cobbled together a solution that, while not perfect, comes close. I would like to share my findings in the interest of helping any other struggling souls out there who are stuck. At this point, the main problem I am having is that certain large files are not played in their entirety – only snippets come through. I suspect the culprit is somewhere in the logic of my array assignment; perhaps another pair of eyes can spot something and post a tip in the comments.
In any event, to get started you need a decoder class. Two APIs provide one: JLayer and Tritonus, the latter of which offers a plugin that works with Java Sound. I used JLayer together with MP3SPI, as they have some documentation that serves as a good starting point.
Here is some code that illustrates the process. The method takes a file name string as a parameter, defines an output format, reads the file through the decoder object, and collects the resulting bytes in a ByteArrayOutputStream. The stream is then returned as a byte array for further manipulation.
// Requires java.io.*, javax.sound.sampled.*, and MP3SPI's DecodedMpegAudioInputStream.
public byte[] testPlay(String filename) throws UnsupportedAudioFileException, IOException {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    File file = new File(filename);
    AudioInputStream in = AudioSystem.getAudioInputStream(file);
    AudioFormat baseFormat = in.getFormat();
    // Decode to 16-bit signed PCM, big-endian, keeping the source sample rate and channel count.
    AudioFormat decodedFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
            baseFormat.getSampleRate(), 16,
            baseFormat.getChannels(), baseFormat.getChannels() * 2,
            baseFormat.getSampleRate(), true);
    DecodedMpegAudioInputStream decoder = new DecodedMpegAudioInputStream(decodedFormat, in);
    try {
        byte[] byteData = new byte[4096];
        int nBytesRead;
        while ((nBytesRead = decoder.read(byteData, 0, byteData.length)) != -1) {
            // Write every byte actually read; writing only part of the buffer
            // truncates the decoded audio.
            out.write(byteData, 0, nBytesRead);
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        decoder.close();
    }
    return out.toByteArray();
}
The above method is called like so; I then repack the byte array into a short array, which is the format the synthesis API I use, JSyn, prefers.
stream = new FileInputStream(fileName);
String ext = fileName.substring(fileName.lastIndexOf('.') + 1);
if (ext.equalsIgnoreCase("mp3")) {
    byte[] buffer = testPlay(fileName);
    // Two bytes per 16-bit sample; the decoded format above is big-endian,
    // so the first byte of each pair is the high byte.
    short[] shrtData = new short[buffer.length / 2];
    for (int i = 0, j = 0; i + 1 < buffer.length; i += 2, j++) {
        shrtData[j] = (short) (((buffer[i] & 0xFF) << 8) | (buffer[i + 1] & 0xFF));
    }
    sampleTable.allocate(shrtData.length);
    sampleTable.write(shrtData);
}
Hope this is helpful to someone out there…
JSyn Java Synthesis API
When I started working on my project, I began with C#, because I had found a Wii remote library that would get my idea off the ground. And it did, and for that I am grateful. Unfortunately, C# does not have a whole lot to offer in the musical arena as far as synth or MIDI functions go.
So, I had to look a bit further. I discovered WiiuseJ, which I wrote about previously; it handles all of the Wii remote manipulation in Java. Not having worked in Java before, I was a bit skeptical about embarking down this path. But after further research I found JSyn, written by Phil Burk. This API offers a very deep and robust set of synthesizer and digital sound processing functions and is relatively easy to pick up and understand, which was a blessing since I am a complete Java noob.
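To give a sense of how little code a basic patch takes, here is a rough sketch of a sine oscillator routed straight to the audio output. I am writing this from memory against the current JSyn distribution, so the exact class and port names may differ between releases:

import com.jsyn.JSyn;
import com.jsyn.Synthesizer;
import com.jsyn.unitgen.LineOut;
import com.jsyn.unitgen.SineOscillator;

public class SineTest {
    public static void main(String[] args) throws InterruptedException {
        Synthesizer synth = JSyn.createSynthesizer();
        synth.start();

        // Build a tiny patch: one sine oscillator feeding both stereo outputs.
        SineOscillator osc = new SineOscillator();
        LineOut lineOut = new LineOut();
        synth.add(osc);
        synth.add(lineOut);
        osc.output.connect(0, lineOut.input, 0);
        osc.output.connect(0, lineOut.input, 1);

        osc.frequency.set(440.0); // A4
        osc.amplitude.set(0.5);
        lineOut.start();

        synth.sleepFor(2.0);      // let it sound for two seconds
        synth.stop();
    }
}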
Check out the JSyn website for more information.
AirDeck Project Update 2.0
The Fall Quarter starts up this week, so I am making a final push on my project. My goal over the summer was to complete as much of the application as possible so I wouldn’t be scrambling during the quarter to wrap everything up. The good news is that I have made significant progress on many fronts, but I am still not where I wanted to be when the quarter started.
Here is where I am at so far:
- I have redesigned the GUI somewhat, adding elements for some of the new features. For example, I added tabs that let the user select which mode the AirDeck is in. Right now, my main priority is the “SynthDeck,” which is the theremin app. I also added tabs for the ScratchDeck, a DJ scratching utility, and the MixDeck, a DJ mixing utility. These last two components are outside the scope of my project, although I would like to fit them in, time and resources permitting. Even when the project is complete, I still plan to develop these elements… I just had to limit the scope of my project work to something I am 100% confident I can deliver – which is the theremin/synth side.
- I have the MIDI out working now. The user can select a MIDI out device, which can be either the built-in General MIDI instrument library (from the sound card) or an external sound source such as a VST instrument, a third-party synth application, or MIDI-capable keyboard/gear. One issue I am still working through: because I am using the MIDI pitch bend control, whose range varies depending on which MIDI device is being triggered, the notes on the keyboard grid do not line up with the grid I currently use for internal synthesis. I may have to dynamically reconfigure the keyboard grid based on the pitch bend range, although that depends on the external application. I did include a dropdown menu for choosing the General MIDI patch, so the user can change the instrument they are playing. Currently the patches are shown as numbers; if I can figure out how to get the list of instrument names programmatically (see the sketch just after this list), I will add that as well.
- The amplitude control issue has been resolved, although I still need to work it out for MIDI control. My understanding is that MIDI offers aftertouch and velocity messages that can be modified, so I need to figure out how to control these dynamically.
- The GUI has been tightened up in terms of event logic. I have individual methods for each of the GUI components, so that selecting an item triggers the appropriate calls.
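On the MIDI front, here is a rough sketch of what I have in mind for pulling instrument names and switching patches, using the standard javax.sound.midi classes. The names come from the default Java synthesizer’s soundbank, so this only covers the built-in instruments, not external gear; the channel pressure message at the end is just one way the loudness of a held note might be varied:

import javax.sound.midi.*;

public class MidiPatchSketch {
    public static void main(String[] args) throws Exception {
        // List the instrument names from the default Java synthesizer's soundbank.
        Synthesizer synth = MidiSystem.getSynthesizer();
        synth.open();
        Soundbank bank = synth.getDefaultSoundbank();
        if (bank != null) {
            Instrument[] instruments = bank.getInstruments();
            for (int i = 0; i < instruments.length; i++) {
                System.out.println(i + ": " + instruments[i].getName());
            }
        }

        // Switch to a patch and play middle C on channel 0.
        Receiver receiver = synth.getReceiver();
        ShortMessage programChange = new ShortMessage();
        programChange.setMessage(ShortMessage.PROGRAM_CHANGE, 0, 24, 0); // patch 24 as an example
        receiver.send(programChange, -1);

        ShortMessage noteOn = new ShortMessage();
        noteOn.setMessage(ShortMessage.NOTE_ON, 0, 60, 100); // note 60 = middle C, velocity 100
        receiver.send(noteOn, -1);

        // Channel pressure (aftertouch); how it shapes the sound depends on the receiving synth.
        ShortMessage pressure = new ShortMessage();
        pressure.setMessage(ShortMessage.CHANNEL_PRESSURE, 0, 64, 0);
        receiver.send(pressure, -1);

        Thread.sleep(1000);
        synth.close();
    }
}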
Here is what is currently outstanding:
- While I have added synth parameter sliders such as Attack, Decay, Resonance, etc., I still need to figure out how to wire them up so they actually apply the effect.
- Same thing with the effects sliders, which will add global effects such as Reverb and Delay.
- I need to program the preset sounds. I want these to sound as realistic as possible, and since I am still learning a lot about analogue synthesis, getting them right will be a challenge.
- I want to incorporate patch-saving capabilities so that a user can tweak certain parameters of a sound and then recall them later. That is why I have added a File menu; a rough sketch of how the saving might work follows this list.
- I want to add status indicators that show three things: a) the Wii remote is connected, b) the battery level of the Wii remote, and c) MIDI out is active.
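For the patch saving mentioned above, I am leaning toward something simple like java.util.Properties: write the current control values out to a small file and read them back in later. The parameter names here are just placeholders for whatever the final set of synth controls ends up being:

import java.io.*;
import java.util.Properties;

public class PatchFile {
    // Save a handful of (placeholder) synth parameters to a patch file.
    public static void save(File file, double attack, double decay, double resonance) throws IOException {
        Properties props = new Properties();
        props.setProperty("attack", Double.toString(attack));
        props.setProperty("decay", Double.toString(decay));
        props.setProperty("resonance", Double.toString(resonance));
        FileOutputStream out = new FileOutputStream(file);
        try {
            props.store(out, "AirDeck patch");
        } finally {
            out.close();
        }
    }

    // Load one parameter back; a missing entry falls back to the supplied default.
    public static double load(File file, String name, double defaultValue) throws IOException {
        Properties props = new Properties();
        FileInputStream in = new FileInputStream(file);
        try {
            props.load(in);
        } finally {
            in.close();
        }
        return Double.parseDouble(props.getProperty(name, Double.toString(defaultValue)));
    }
}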
Here is the GUI as it currently stands:
WiiuseJ
The Java API that I am using in my application to communicate with the Wii remote is called WiiuseJ and was written by Guilhem Duche. Guilhem maintains his code and a user forum at the WiiuseJ project. He’s a friendly guy; very helpful and responsive. Anyone interested in Java Wii hacking should check it out.
AirDeck Wii Theremin Project Update 1
As promised, here is some video of my project work completed so far. I still have quite a ways to go; there are a bunch of ideas I still want to add, bugs to check for, and a GUI to make more sophisticated. But this video captures the main functionality of the application, and I am pleased with the initial progress I have made over the summer.
The Interface
This is just the initial interface to get things going. I intend to have a drop-down menu with several preset sounds. Additionally, I want to offer the user the ability to customize their sounds, which is what the waveform radio buttons are for. I am still debating how much sound customization I will offer, but choosing a waveform is a good basic start. Further, there will be an option to send the notes out through MIDI. The red dot represents the right hand and controls pitch by tracking movement along the X-axis. The blue dot represents volume and is controlled by the left hand moving along the Y-axis. The Wii remote communicates with the host PC over Bluetooth. More on this process in upcoming posts.
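To give a rough idea of the mapping itself, here is a sketch of how a tracked IR position could be converted to pitch and volume. The frequency range and the exponential pitch curve are placeholder choices, not the final tuning:

public class ThereminMapping {
    // Map a normalized X position (0.0 to 1.0) onto an exponential pitch range,
    // so equal hand movements correspond to equal musical intervals.
    public static double xToFrequency(double x, double minHz, double maxHz) {
        return minHz * Math.pow(maxHz / minHz, x);
    }

    // Map a normalized Y position (0.0 at the bottom, 1.0 at the top) to amplitude.
    public static double yToAmplitude(double y) {
        return Math.max(0.0, Math.min(1.0, y));
    }

    public static void main(String[] args) {
        // Example: right hand halfway across the camera's field of view,
        // left hand three quarters of the way up.
        double freq = xToFrequency(0.5, 110.0, 1760.0); // halfway between A2 and A6 gives 440 Hz
        double amp = yToAmplitude(0.75);
        System.out.println("frequency = " + freq + " Hz, amplitude = " + amp);
    }
}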

And so it was written…
I am currently working on a Java application that emulates a theremin for my undergraduate Senior Project. It uses a Wii remote to track infrared LEDs on one’s hands and converts the coordinate data to pitch and volume. This blog is intended to document changes and ideas as I work through the development process. I will also be presenting other cool and interesting music- and technology-related ideas here that catch my attention. My application is at a functional stage, and I hope to post video of it in action within the next few weeks. Additionally, time and resources permitting, I intend to make a DJ version that would allow mixing and scratching by moving one’s hands in the air.