Ken Moore has been working on theremin emulation for some time now. He developed a Wii-remote-based theremin and was quite helpful to me while I was developing AirDeck, my own Wii-based theremin emulation application. Now Ken has done it again, this time with the Kinect motion-detection device used with the Xbox 360. Perhaps, if I can find the time, I can work on something similar with the PlayStation Move, which I have; then we will have effectively converted all three motion-based video game systems into theremins. Check this out:
Here’s a project by YouTube user svenisnumb that uses the Microsoft Kinect as a MIDI controller. His project was coded in C#. This is similar to my AirDeck Wiimote theremin. I don’t have an Xbox, so I haven’t been able to fool around with the Kinect yet… but this intrigues me.
Reactable is an object-based musical platform that uses the shapes of objects on a multi-touch-like surface to create musical patterns and effects. They have now made a mobile version for the iPod touch, iPhone, and iPad. How cool is that?
A Guy Called Tom (yes, that’s his name on Vimeo) is using the TouchOSC app on his iPhone to control a modular synthesizer. Pure Data does the heavy lifting, converting the OSC data to MIDI. Here is what he says about it:
TouchOsc iPhone app sending osc data into PureData Extended, where it is converted to midi and sent to the Doepfer MCV24 which converts it to voltage and controls the modular synth.
TouchOsc XY Pad controls the pitch of two Thomas Henry VCO-1 which also crossmodulate each other.
TouchOsc Sliders control Elby Steiner filter cutoff, Plan B Model 10 env cycle speed, Doepfer BBD feedback and delay. Thomas White LPG used in both mode for amplification. Delay is a Stereo Memory Man with Hazarai. Sorry for the video quality, its done using a photo cam.
Using TouchOsc is fun, there is a lot of control right at your fingertips. Actually it can control way more than i have to control 🙂 Really cool app. Disadvantage is the steping in the control voltages that you can hear quite well, especially when controlling the pitch of oscillators. Not sure if its the midi resolution, the mcv24 or the application itself.
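The stepping Tom describes is consistent with plain 7-bit MIDI resolution: a standard CC quantizes its whole range into 128 values, while pitch bend gets 14 bits. A rough back-of-the-envelope calculation (the five-octave pitch range here is my assumption, not a detail of Tom's rig) shows why 128 steps is audible on pitch:

```java
// Rough estimate of pitch step size when a MIDI control with `steps`
// discrete values is stretched across `octaves` octaves of pitch.
public class MidiStepSize {
    // Cents of pitch per control step (100 cents = 1 semitone).
    static double centsPerStep(int steps, double octaves) {
        return (octaves * 1200.0) / (steps - 1);
    }

    public static void main(String[] args) {
        // 7-bit CC (128 values) over 5 octaves: ~47 cents per step,
        // nearly half a semitone -- clearly audible as stepping.
        System.out.printf("7-bit:  %.1f cents/step%n", centsPerStep(128, 5.0));
        // 14-bit pitch bend (16384 values) over the same range:
        // well under a cent per step, effectively smooth.
        System.out.printf("14-bit: %.2f cents/step%n", centsPerStep(16384, 5.0));
    }
}
```

So if the stepping is coming from the MIDI leg of the chain rather than the MCV24, routing pitch through 14-bit bend (or a converter that supports high-resolution CCs) would be the first thing to try.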
This is very cool. I experimented with TouchOSC during my Special Projects in Music Technology class last quarter, but I was just using my iPod to control some sounds in Pure Data; this takes it to the next level. Maybe even the level after that.
Welcome to the future. This is a concept I have been playing around with in my head and on paper for some time, and now Pablo Martin has brought this vision to fruition. Basically, Pablo has created a software interface called Emulator that allows Traktor to receive multi-touch data, freeing it from the confines of the mouse and allowing it to be used on a multi-touch surface. That right there is cool enough.
Rodrigo from Chile is developing the multi-touch surface you see in the video, called Töken. Between the two of them, they have put together one hell of an incredible DJ rig.
Shout out to simfonik for turning me on to this.
This video highlights the development of Skinput, a project that uses the body’s surface as an input mechanism. A bio-acoustic sensor is used to capture and decode taps made on the body with a high degree of accuracy. Pretty wild, and there are a number of applications across a wide variety of industries where such an interface would be useful. Obviously, I see music and performance capabilities…
As a fan of the original DS-10 synthesizer software for the Nintendo DS, I have to say the new features look pretty cool. This is a great tool for portable composition of ideas as well as for learning some of the basics of synthesis. The touchpad offers an X-Y input mechanism, which was very helpful to me as I worked on my virtual theremin project. New features include effects, an expanded number of tracks, improvements to the sequencer and song modes, and better performance.
The Kaossilator Pro is the successor to Korg’s Kaossilator X-Y touchpad synth, and it has a number of noteworthy upgrades over the previous version. The Pro features an SD card slot to save one’s work as well as a USB port for importing and exporting to your computer. This little guy would be great for live performance as well as a portable workstation for generating ideas. It can now also control other instruments externally via MIDI. Check out the demo.
Here is some video I shot shortly after my project presentation; it demonstrates the AirDeck virtual theremin application I designed and explains some of its features. It uses the Wii remote as an input mechanism, tracking motion with infrared LEDs. The AirDeck is written in Java, with the WiiUseJ API handling Wii remote events and the JSyn API handling internal synthesis. It can control MIDI out, and it also offers a simple scratch interface for real-time manipulation of sound samples, similar to a DJ scratching vinyl records. After about a year of working on this project, I am very relieved that it is finished. I’ve learned a lot and look forward to possibly working on similar concepts in the not-too-distant future.
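For the curious, the core of a theremin-style mapping is simple. This is an illustrative sketch, not the actual AirDeck source: the class and constant names are mine, and the frequency bounds are assumptions. WiiUseJ delivers IR dot coordinates per event; here I just show the mapping math on normalized doubles, with an exponential pitch curve so equal hand movement covers equal musical intervals:

```java
// Illustrative sketch (not actual AirDeck code) of mapping normalized
// Wii remote IR dot positions to theremin-style pitch and volume.
public class ThereminMapping {
    static final double F_MIN = 110.0;   // A2 -- assumed lower pitch bound
    static final double F_MAX = 1760.0;  // A6 -- assumed upper pitch bound

    // x in [0,1] from the right hand's IR dot -> exponential pitch sweep.
    static double frequency(double x) {
        return F_MIN * Math.pow(F_MAX / F_MIN, x);
    }

    // y in [0,1] from the left hand's IR dot -> linear amplitude, clamped.
    static double amplitude(double y) {
        return Math.max(0.0, Math.min(1.0, y));
    }

    public static void main(String[] args) {
        System.out.println(frequency(0.0)); // 110 Hz at one edge
        System.out.println(frequency(0.5)); // 440 Hz at the geometric midpoint
        System.out.println(frequency(1.0)); // 1760 Hz at the other edge
    }
}
```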
The Fall Quarter starts this week, so I am making a final push to complete as much of my project as possible. My goal over the summer was to finish as much of the application as I could, so that I wouldn’t be scrambling during the quarter to wrap everything up. The good news is that I have made significant progress on many fronts, but I am still not where I wanted to be when the quarter started.
Here is where I am at so far:
- I have redesigned the GUI somewhat, adding elements for some of the features I am adding. For example, I added tabs to select the mode the AirDeck will use. Right now, my main priority is the “SynthDeck,” which is the theremin app. I also added tabs for ScratchDeck, a DJ scratching utility, and MixDeck, a DJ mixing utility. These last two components are outside the scope of my project, although I would like to fit them in, time and resources permitting. Even after the project is complete, I still plan to develop these elements… I just had to limit the scope of my project work to something I am 100% confident I can deliver, which is the theremin/synth side.
- I have MIDI out working now. This allows the user to select a MIDI out device, which can either use the built-in General MIDI instrument library (from the sound card) or control an external sound source such as a VST instrument, a third-party synth application, or MIDI-capable keyboard/gear. One issue I am working through: because I am using the MIDI pitch bend control, whose range varies depending on which MIDI-capable device is being triggered, the notes on the keyboard grid do not line up with the grid I currently have set up for internal synthesis. I might have to dynamically reconfigure the keyboard grid depending on the range of notes the pitch bend is set for, although this depends on the external application. I did include a dropdown menu for choosing the General MIDI patch, so the user can change the instrument they are playing. Currently these are represented by numbers; if I can figure out how to get the list of instrument names programmatically, I will add that as well.
- The amplitude control issue has been resolved, although I still need to work this out for MIDI control. It is my understanding that MIDI has aftertouch and velocity messages that can be modified, so I need to figure out how to control these dynamically.
- The GUI has been tightened up in terms of event logic. Each GUI component has its own handler method, so that selecting an item makes the appropriate call.
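The grid-alignment problem above comes down to pitch-bend arithmetic: a 14-bit bend value means a different interval depending on the receiver's bend range, so the keyboard grid has to be scaled by that range. A minimal sketch of the math, assuming a symmetric bend range in semitones (the class and method names here are mine, for illustration):

```java
// Sketch of the pitch-bend math behind the keyboard-grid alignment
// issue: the same bend value produces different intervals depending
// on the receiving device's bend range setting.
public class PitchBendGrid {
    static final int CENTER = 8192; // 14-bit midpoint = no bend

    // Desired offset in semitones -> 14-bit bend value, clamped to range.
    static int bendValue(double semitones, double bendRangeSemitones) {
        int raw = CENTER + (int) Math.round(semitones / bendRangeSemitones * 8192);
        return Math.max(0, Math.min(16383, raw));
    }

    public static void main(String[] args) {
        // +1 semitone with the common +/-2 semitone range -> 12288
        System.out.println(bendValue(1.0, 2.0));
        // +1 semitone with a +/-12 semitone range -> a much smaller offset;
        // same interval, different wire value, hence the misaligned grid.
        System.out.println(bendValue(1.0, 12.0));
    }
}
```

As for listing instrument names programmatically: if the internal Java synthesizer is the target, `javax.sound.midi` exposes this via `MidiSystem.getSynthesizer().getDefaultSoundbank().getInstruments()`, where each `Instrument` has a `getName()`.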
Here is what is currently outstanding:
- While I have added synth parameter sliders such as Attack, Decay, and Resonance, I still need to figure out how to make them apply the actual effect.
- Same thing with the effects sliders, which will add global effects such as Reverb and Delay.
- I need to program the preset sounds. I want these to sound as realistic as possible, and since I am still learning a lot about analogue synthesis, programming them correctly will remain a challenge.
- I want to incorporate patch saving capabilities so that a user can tweak certain parameters of a sound and then be able to recall these later. That is why I have added a File Menu.
- I want to add status indicators that show three things: a) the Wii remote is connected, b) the battery level of the Wii remote, and c) MIDI out is active.
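On the patch-saving item, one lightweight approach is a `java.util.Properties` file per patch, which the File Menu can write and reload. This is just one possible sketch, and the parameter names in it are hypothetical, not the AirDeck's actual ones:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Properties;

// One lightweight way to save/recall patches: a Properties file per
// patch. Parameter names below are hypothetical examples.
public class PatchFile {
    static void save(File file, Properties patch) {
        try (FileOutputStream out = new FileOutputStream(file)) {
            patch.store(out, "AirDeck patch"); // key=value lines plus header
        } catch (IOException e) {
            throw new RuntimeException("Could not save patch", e);
        }
    }

    static Properties load(File file) {
        Properties patch = new Properties();
        try (FileInputStream in = new FileInputStream(file)) {
            patch.load(in);
        } catch (IOException e) {
            throw new RuntimeException("Could not load patch", e);
        }
        return patch;
    }

    public static void main(String[] args) {
        Properties p = new Properties();
        p.setProperty("osc1.wave", "saw");       // hypothetical parameter
        p.setProperty("filter.cutoff", "0.75");  // hypothetical parameter
        File f = new File(System.getProperty("java.io.tmpdir"), "demo.patch");
        save(f, p);
        System.out.println(load(f).getProperty("filter.cutoff")); // 0.75
    }
}
```

A plain-text format like this also makes patches easy to share and hand-edit, which is handy while the parameter set is still in flux.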
Here is the GUI as it currently stands:
I imagine we are going to start seeing many more apps like this for mobile platforms like the iPhone. This one actually looks pretty cool. It’s called Fingerbeat. Here’s the promo…
I haven’t posted anything new for a couple of days because I’ve been busy working on some new elements for the AirDeck. For starters, I am designing some infrared LED gloves. I know virtually nothing about electronics, so I had to research this, but I think I have a decent design in mind now. I actually had a working glove this weekend, but I didn’t know that you needed resistors, so I have to order some and start over. Also, my dad shared some information about Molex solderless connectors, so I am gonna try those for some of the connections, because my soldering skills need some work. Hopefully I’ll be able to post more info on the gloves by the weekend.
As for coding, I have been incorporating some new features. I now have two oscillators available that get mixed together, so if a user picks a “Custom” preset, the radio button panels for oscillators 1 and 2 become available for selecting different waveforms. I also have a panel with several faders, but they don’t do anything yet; the idea is to offer some filtering and modulation of the waveforms. Additionally, because the left hand (Y-axis) is only used for volume… I thought any left-hand movement on the X-axis should be available to control another parameter, such as a filter, if the user desires.
I still need to program the preset sounds, program the filters, and figure out how to get MIDI out to work. Also, since I added the two mixed waveforms, for some reason the amplitude control sounds a little glitchy when you try to raise or lower the volume with the left hand, so I have to figure out what that is all about.
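I can't diagnose the glitch from here, but one common cause of exactly this symptom is "zipper noise": the gain jumps once per control update (once per Wii remote event) instead of moving smoothly per audio sample, and the steps become audible clicks. The usual fix is to smooth the gain toward its target every sample. A generic sketch of the idea, not AirDeck/JSyn code (JSyn provides ramp/lag units that serve the same purpose):

```java
// A one-pole smoother applied to the mix gain: the controller sets a
// target, and the gain actually applied glides toward it per sample,
// removing the audible steps ("zipper noise") of abrupt gain changes.
public class SmoothedGain {
    private double current;     // gain actually applied each sample
    private double target;      // gain most recently requested
    private final double coeff; // per-sample smoothing factor, 0 < coeff <= 1

    SmoothedGain(double coeff) { this.coeff = coeff; }

    // Called from the (slow) control path, e.g. on each Wiimote event.
    void setTarget(double gain) { target = gain; }

    // Called from the (fast) audio path: mix two oscillator samples
    // while easing the gain toward its target.
    double mix(double osc1, double osc2) {
        current += coeff * (target - current); // glide toward target
        return 0.5 * (osc1 + osc2) * current;
    }

    public static void main(String[] args) {
        SmoothedGain g = new SmoothedGain(0.01);
        g.setTarget(1.0);
        // The applied gain rises gradually instead of jumping to 1.0.
        for (int i = 0; i < 5; i++) System.out.println(g.mix(1.0, 1.0));
    }
}
```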
Finally, I’ve been working on making the GUI look a little nicer. I still have to work some kinks out, but this is one of the ideas I had. Going old school. Although it might be cool to offer several skins that the user can select from. I will give that some thought as well.
The Haken Continuum is a musical interface that responds to touch like a keyboard, but lets you control pitch, velocity, and other parameters continuously by sliding along or pressing into the playing surface.
Yesterday, I posted about the emerging field of Augmented Reality. The folks at 5 Gum are already working on turning it into a music interface. The graphic pattern shown determines the beat that is played, and its proximity to a given point determines the volume.
Augmented Reality refers to the field of computer science in which real-world data and computer-generated data are combined to accomplish certain objectives. GE has a demo of this you can try yourself, which they use to present information on their SmartGrid. The following video shows their AR application in action.