Monday, May 28, 2007

X3D and MIDI: Using A 3D World as a MIDI Controller

While some X3D world builders are focused on the large multi-user applications of real-time 3D such as immersive world building, others are focused on discrete applications that harness 3D as an application interface. Among these is John A. Stewart, Team Leader of Networked Virtual Reality in Network Systems and Technologies at the Communications Research Centre in Ottawa, Ontario, Canada. John is the force behind FreeWRL, an open source VRML/X3D browser maintained by volunteers and employees of the Canadian Government.

John’s team has implemented MIDI interfaces for FreeWRL. Most musicians recognize MIDI as the protocol that connects digital instrument controllers and synthesizers. At the FreeWRL homepage, this feature is described thus:

Interactivity in our Shared Virtual Worlds has been controlled and displayed by magnetic tracking systems, joysticks, a "space ball", and various headsets. Although these worked, configuration was hard to do, and the expense for some of the devices was way beyond the means of the budget-conscious.

We noticed that musicians were able to use computers with ease. These musicians, intent on creating music, were using computers to perform real-time interactive tasks, with apparent ease.

This observation led us to question why the music industry could create easy to use computer interfaces, while we had difficulty; and to investigate how we could leverage the successes of the music industry to enable "plug and play" interactivity with our shared virtual worlds.


Because the topic of using real-time 3D for musical applications has been one of my favorite hobby horses since I first began to model with VRML, I asked John to share some ideas about using FreeWRL’s MIDI features. Technically, the FreeWRL MIDI node is an extension and not part of the X3D standard. The way this works is that browser implementers and their customers and partners are encouraged to experiment with extensions to the language. When a node shows promise by being used for unique, useful or simply neat applications of 3D, it can be submitted to the X3D Working Group and the Web3D Consortium for inclusion in a future version of the standard.

This keeps the language evolving through tested, community-accepted nodes rather than the older way of having committees dream up extensions that clutter the language but don’t get used. This is the elite cultivar technique: a core standard is bred from features that grow in the wild and establish a healthy ecosystem edge relationship prior to their admission to the core system.

While MIDI files can already be played from within a VRML/X3D file through the AudioClip node, they cannot be controlled by that node: only started and stopped. The following interview presents John’s thoughts on how a MIDI node can improve on this through properties that control the MIDI sequence itself.
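For context, this is roughly what the current AudioClip route looks like. It is a minimal sketch, not code from John’s examples; the file name is hypothetical, and whether a particular browser actually renders MIDI through AudioClip is implementation-dependent:

Group {
  children [
    DEF CLICK TouchSensor {}
    Shape { geometry Box {} }
    Sound {
      source DEF CLIP AudioClip {
        url "loop.mid"   # hypothetical MIDI file
        loop FALSE
      }
    }
  ]
}
# Clicking the box starts playback; nothing inside the sequence
# (tempo, notes, controller values) is reachable from the scene graph.
ROUTE CLICK.touchTime TO CLIP.set_startTime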

Interview



Len: Why would a musician want to have a MIDI interface in a real-time 3D system? MIDI is a controller interface designed to enable a musician with a good ear for sound but only mediocre notation skill to assemble medium-complexity orchestration. Over time, it has become the way almost all composers work because of the many different interfaces that can produce the controller codes. Some of us use notation programs, some prefer to play live and record in MIDI, and all of us edit in MIDI because we can tweak a piece into a much better composition now that we have a means of playing back the composition in as large or small an increment as we wish to work with. It enables us to borrow or outright steal phrases from any other MIDI file, so MIDI has become the musician's analog of open source. It leads to the melting of styles into one another and sometimes to the devolution of music into very sparse and almost comical compositions.

John: OK – let's play with the equation first. As a musician, I see MIDI in use almost every time I play. I also hear MIDI ring tones on cell phones. I went to see Jim Bryson playing recently at the Black Sheep Inn; the music was certainly not what one would call “electronic”. But, from my viewpoint standing just beside the drummer, I could see two Apple laptops in use, connected to keyboards. This stuff is everywhere.

As an engineer, I looked at these people using complex networked computers, in real time. I also looked at our collection of 3D graphics I/O devices sitting around collecting dust, because they are just too hard to use and their interfaces are now obsolete.

So, the question was “what can the music industry give to the 3D world?” We are currently looking, and experimenting, so the jury is still out on that one.

Now, from the musician's side, there are many areas in which 3D graphics can enable creativity. I'm interfacing FreeWRL with Propellerheads' Reason software, and I'll admit that I find the complexity of the Reason software daunting. The Reason UI mimics a real-life “rack”, complete with knobs and buttons and little displays and lights.

What we can do is represent your view of the functionality of a Reason module, or group of modules, with a 3D representation. You select which Reason controls you want exposed. You can manipulate the module by proximity; you can touch and poke it, and all of these things can affect the sound. For instance, I took one of Roland Smeenk's VRML files and adapted it. It has three sliders arranged in a cube; moving these sliders changes the sound. Clicking on the box in the middle toggles a switch. I have attached this to a Reason sequencer, feeding a synthesizer, using a sequence that most people will recognize as being Pink Floydish. So the functionality of a sequencer and synthesizer has been distilled down, for this example, to a very simple 3D object consisting of three Spheres and a Box. This example is part of the FreeWRL/ReWire Tutorials.
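To give a feel for the wiring, here is a minimal sketch of a single such slider. It is not John's Smeenk-derived example: the geometry and the intermediate Script are illustrative, and it assumes that set_floatValue on a "Slider"-type MidiControl drives the external control, using the device and channel names that appear later in this interview.

# One draggable sphere whose vertical position (0..1) is sent to a
# Reason module through the MidiControl extension node.
DEF FILTER MidiControl {
  controllerType "Slider"
  deviceName "SubTractor 1"   # module name, as in the later example
  channel "Filter Freq"       # controller name, as in the later example
}
DEF GRIP Transform {
  children [
    DEF DRAG PlaneSensor {
      minPosition 0 0
      maxPosition 0 1         # constrain the drag to one vertical unit
    }
    Shape { geometry Sphere { radius 0.1 } }
  ]
}
DEF MAP Script {
  eventIn  SFVec3f set_translation
  eventOut SFFloat value_changed
  url "javascript:
    function set_translation(v) {
      value_changed = v.y;    // already 0..1 because of the sensor limits
    }"
}
ROUTE DRAG.translation_changed TO GRIP.set_translation
ROUTE DRAG.translation_changed TO MAP.set_translation
ROUTE MAP.value_changed TO FILTER.set_floatValue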



That's a very simple example. What about generating sounds based on a user's location in a 3D world? The sounds can be simple samples, or actual sounds, or sound reflections. What about generating a 3D representation of your music? Much like the visualizer in iTunes, but with the ability to navigate through the soundscape?

Len: There are aspects of music that are intensely haptic: the tone of a sound played on the same guitar by two musicians sounds distinctive and different. In the early years of MIDI, prior to touch-sensitive keyboards, it all sounded mechanical, but today, with skill, even a well-trained ear has a problem distinguishing a MIDI performance from a live performance.

But the recurring theme is the interface. MIDI itself is empowering, but the interface is enervating. So the first question is: what is possible given a 3D world and real-time 3D objects as interfaces to MIDI controllers?

Proximity, certainly; the Theremin comes to mind. Velocity, certainly. The combination of easeInEaseOut in X3D or Stocker's follower objects is very similar to the ADSR curve of a note well played. Doppler effects and motion effects can be created, but these are all just variations of things we can do without 3D. Simulating a band playing live while a band using other controllers plays live is obvious. A jam application in 3D would certainly be cheaper than using video. Musicians watch each other when they play just as actors listen to each other. How can we tie the extra-reality or beyond-reality aspects possible in VR to the MIDI interface as feedback to live or virtual musicians, perhaps tying some AI into the circuit? VR/X3D is a network of objects. Musicians jamming are also. How do we make these synergistic, and will that enable a new kind of music or just enhance the old?

There is a story about the early use of wah pedals. Musicians couldn't figure out what it was good for because it is actually a very limited filter. Then they saw Muddy Waters tapping on it with his foot in rhythm and they got it. This combination of interface, physical motion and basic musical sensibility created a new sound and then within these simple parameters the very recognizable crybaby wah sound emerged that is unique to that interface.

What potentials for midi are unique to 3D?

John: OK – first, let me put my engineer's hat on again, simply because the aim of the work is to push the boundaries of 3D graphics interfaces and to hook that back into our peer-to-peer shared VR protocol (MVIP-II) that we created quite a few years ago now. Theremins for navigation are a certainty: just point, and you'll move. PAIA will have another order, once our budget here is settled for the year. The Korg PadKontrol is another one; it has 16 large buttons that can be lit under computer control. Imagine an X3D designer being able to flash a light on an I/O device. Our MIDI interface code already uses a text-to-speech synthesizer for some messages; why not let the X3D designer use all of this functionality?

These MIDI devices are simple and inexpensive. I mentioned PAIA above; we currently have two workstations configured with MIDI, MIDI-to-Control-Voltage (CV), and CV-to-MIDI boards, and I have some PAIA analog synth modules being completed. We are fitting one of our labs with ultrasonic sensors. We can hook almost anything up and communicate with it via MIDI or CV. We can implement Herbert Stocker's Follower nodes in hardware, if we want, with just a physical patch cord or two.

MVIP-II was designed to push the envelope in terms of responsiveness across large distance networks; MIDI is designed to do this across local devices. The marriage of the two protocols seems to make sense.

So, from the “interface to 3D” perspective, the work is underway, and we'll see what results come out at the end.

That did not answer your question, though, because you are coming at this from a musician's perspective.

What I see for musicians is a simplification of control. A musician can logically create whatever functionality is required. They can see, in real time, the status of their music. They can use color, shading, texturing, object morphing, viewpoints, time sensors, whatever, to help them in the art of creation.

Second Life seems to be the rage right now. My personal view is that we have been there and done that. Let's look and try to see what will be in use 5 to 10 years from now. So, to me, the question is not “can we adapt this to today's knowledge” but “can we use this to change tomorrow's interaction?”

Like you, I'm a musician. As a bass player, I'm always in the middle sending and receiving signals from percussionists and guitarists, and, well, everyone, to indicate where music is going. I don't see avatars and distance networks helping for performing. Instruction, as found in CRC's virtual classroom, yes, but not for performing.

I'm going to wait and see what applications come back from this. As with your wah-pedal example, one never knows what some creative soul will do with this technology to propel it into the “it's so obvious” category.

Len: OK – so, what kind of information are you passing between MIDI and X3D? How is this information passed?

John: First, this system is designed to be plug and play, and it is designed to be modified at run time. So, all bindings can come and go – for instance, if your X3D model is using a Reason module, it can see when that module exists, and, if it disappears, it will see that, too.

This means that someone can potentially plug a MIDI device in, try it, then plug another device in, and try that. There are currently some limitations with physical MIDI support; I'm investigating how to resolve these. (see more details below)

Let's go through a simple example (reasonAutoKey.wrl in the examples archive).

The main X3D node is the MidiControl node. This lets us assign MIDI Information to an X3D node:


DEF RW1 MidiControl {
  controllerType "ButtonPress"
  minVal 40
  maxVal 90
  useIntValue FALSE
  autoButtonPress TRUE
  pressLength 0.6
  deviceName "SubTractor 1"
}


This X3D node:
 sends keyboard (note) values, because controllerType is "ButtonPress";
 keeps the notes in the MIDI range of 40 to 90 (E2 to F#6);
 expects a floating point value sent to it (useIntValue is FALSE);
 sends a MIDI Note On automatically (autoButtonPress) when the note changes;
 sends a MIDI Note Off 0.6 seconds (pressLength) after the Note On is sent;
 sends these Note On/Off messages to the device “SubTractor 1”.

If we add two other X3D nodes:

DEF TS TimeSensor {loop TRUE cycleInterval 50}
ROUTE TS.fraction_changed TO RW1.set_floatValue

We have a timer that continually sends floating point values, going from 0.0 to 1.0 every 50 seconds. That floating point value is sent to the MIDI node.

What we will hear is the chromatic scale, going from E2 to F#6 (roughly one semitone per second), cycling through the progression every 50 seconds.

Want another example? What if you want to get the value of a MIDI controller, and use it to manipulate a value in an X3D program?

DEF RW1 MidiControl { deviceName "SubTractor 1" channel "Filter Freq" }
...
ROUTE RW1.floatValue_changed TO PO2.set_fraction

This will take the MIDI value of the “Filter Freq” slider on the device “SubTractor 1”, and ROUTE it to a node called “PO2”.

Len: This shows routing MIDI Note messages to a MIDI device, and getting MIDI controller values into an X3D program. Can you do the opposite – i.e., get MIDI Note messages, and send controller values?

John: Yes. There are, however, some missing links that are still being resolved. We are looking at how best to synchronize MIDI time and X3D time, utilize pitch bend messages, implement some kind of MIDI learning mode, push X3D AudioClip nodes to external devices, and, last but most certainly not least, probe USB for MIDI devices. For the probing, we'll use Propellerheads' Remote protocol, as it looks more complete than the equivalent Apple protocol.

So, for now, if you connect external devices, they are currently known as “External 0” through “External 15” (MIDI channels 1-16), and they only have hooks for controllers 1 and 31. This will change in a few weeks, though, as I now have some MIDI USB devices to code and test with. It's best to keep your eyes on the URLs shown below.
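As a rough illustration of the outgoing-controller half of that answer, here is a sketch only, which assumes that routing into set_floatValue on a “Slider”-type MidiControl pushes controller values out to the named device; the incoming-note direction is left to the tutorials linked below.

# Hypothetical: a 10-second triangle sweep of the Filter Freq controller,
# driven entirely by the X3D scene clock.
DEF FREQ MidiControl {
  controllerType "Slider"
  deviceName "SubTractor 1"
  channel "Filter Freq"
}
DEF CLOCK TimeSensor { loop TRUE cycleInterval 10 }
DEF SWEEP ScalarInterpolator {
  key      [ 0.0 0.5 1.0 ]
  keyValue [ 0.0 1.0 0.0 ]   # up, then back down
}
ROUTE CLOCK.fraction_changed TO SWEEP.set_fraction
ROUTE SWEEP.value_changed TO FREQ.set_floatValue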

Len: I notice that you use the words “ButtonPress” and “Slider” rather than the MIDI standard “keyPress” and “Controller”. Why?

John: The protocol was designed to extend control to and from X3D programs. A key on your MIDI keyboard is in reality just a button, and your MIDI controller is just like a slider. If you think more along the lines of, say, elevator buttons and light dimmers, you'll see why the names reflect more everyday usage.

Len: Where can people go and find more information about these nodes? Can they get some examples? Can they give you advice, or examples?

John: First, feedback of any kind is really important. In many ways, I'm feeling my way along with this code; advice from anyone, especially active musicians, would be fantastic. If we are going to do this, we might as well do it well.

I have some pages set up with tutorials, more examples, and a more complete description of the nodes added to X3D.

Have a look at:

http://freewrl.sourceforge.net/rewire/tutorials.html

I have an overview page at:

http://freewrl.sourceforge.net/midi.html

And, of course, the home page and code is available at:

http://freewrl.sourceforge.net/

Len: Are these nodes in the X3D standard?

John: No, not yet, for two reasons. First, I'd like to ensure that what is eventually proposed for inclusion in the X3D standard is usable in the real world. Second, the Web3D Consortium, which holds the X3D standard, is currently not accepting any new nodes, and for good reason. This is all fine with me; I'd like to go to the Consortium with a complete, documented set of X3D nodes and examples, so the moratorium on new node additions suits me.

Closing



John’s team has provided a vital missing link in the marriage of X3D and music, a marriage some have predicted will be one of the most productive and exhilarating relationships in the emerging field of real-time 3D systems. If you have suggestions or code, or are applying this node in FreeWRL in your worlds or your live performances, please write to John or the VRML/X3D lists, or leave comments on this blog. The more we share, the more we have. Many thanks to John Stewart for contributing this piece to our library of first-class and CHEAP means to take 3D to the streets!
