a) That’s trivial and easy. Anyone can do that.
b) That’s hard. I don’t get it. I’ll never be able to do that.
Both camps are right and both camps are wrong. It does take practice to learn the basics of VRML. Then there is X3D, which is a step up but also a bit more complex in terms of multiple encodings and some nasty syntax surprises. On the other hand, VRML/X3D sit in the sweet spot between lower-level languages like OpenGL, which are very powerful but much closer to the metal and require a considerable technical grasp of graphics programming, and higher-level languages such as 3DML, which are designed to do a few things well but at the cost of freedom of expression, so not very powerful.
In other camps are the closed-system proprietary languages used in platform implementations such as Second Life. These provide tools for building in-world or a virtual economy, so once in, you can make some money. On the other hand, you get to do what Linden Labs wants you to do, and regardless of the intellectual property policy that says you own what you create there, you can’t take it with you anywhere else. It is like owning your own baggage in the Greyhound lines terminal: it may go from bus stop to bus stop, but you can’t take it home.
VRML/X3D is designed not to be the fastest language or the easiest to learn, but to sit somewhere in between as the open language for royalty-free (say CHEAP!) 3D on The Web. If you are in the second camp, I assure you that I am not a graphics programmer, I am not a graduate student, and I don’t own a company trying to sell you graphics games or standards. I have a degree in English and Music, am a performing musician, have been a technical writer and self-taught programmer for various companies, am a technical consultant now, and have done some standards work in a former life.
If I can do this, you can do this. It ain’t rocket science. I live in the town where the original rocket scientists live. I say with complete certainty this ain’t THAT hard or that expensive.
If you are in the first camp, then show me yours. I’m showing you mine. Some people want to go to Valhalla and demand to be recognized. But you can only cross the Rainbow Bridge with heads in your pelt or your head in someone else’s pelt.
Build and show or shut up.
So on to the examples.
Begin At The Beginning
To begin at the beginning is to create an entry point in your world or story. In VRML, this is the first Viewpoint on the stack, the first thing the user will see when the world opens, and the first opportunity you have to acquaint the user with your intentions. Get this right and what follows is easy. Get this wrong and what follows is unpredictable.
"The Sims" creator Will Wright makes an excellent point about building games that applies to our journey and is similar to Glinda’s advice to Dorothy in The Wizard of Oz.
Wright’s most excellent advice is that the first view should entail a task or puzzle that teaches the user how to navigate. Then increase the problems to solve slowly so that by the time they have done the first three or four, they are masters of the user interface.
Those of us who like games or 3D worlds are so proficient with the mouse that we automatically press the left button and push. Not surprisingly, this simple motion confuses the heck out of lots of otherwise well-educated adults if not most children. Couple this to the need to have that first motion result in an experience that explains the topic of this story and you realize that you have to combine your storytelling artistry with your technical VRML chops and come up with an object that does both easily.
In other words, don’t underrate obviousness and don’t overrate cleverness. An early game pioneer once remarked, “We consistently overestimated their intelligence and underestimated their manual dexterity.” People aren’t really stupid. Don’t assume that. They sometimes aren’t patient. That’s reliable. Entertain them.
The code in these examples comes from my VRML world, The River of Life (ROL after this), a story of a devadasi or temple dancer in a mythical version of old Bharata (India). Many a river journey begins with a boat at a dock by the river.
That’s obvious. The instructions for ROL can be summarized as:
ROL opens with a title screen with an animated psychedelic background and scrolling credits.
As I said in the last tutorial, library code is the key to productivity. The success of VRML is based on the fact that code written ten years ago still works. The title 3D text was created with Flux Studio. The background is a piece of code from Robert St John distributed with the old V-Realm Builder VRML editor.
After the credits complete, a single line of red text asks the user to click on the title to open the world. Most of us can do that. If they watch the credits, good. The next world is loading in the background. If they don’t, ok. The world will just appear to take longer to load. The credits are a side show. The opening view of ROL is shown here.
The boat and the dock were created with Parallel Graphics Internet Scene Builder (ISB) free demo. CHEAP!!
Note that this world has an automatic sky proto provided from an online PROTO library. It checks the time on your computer and more or less matches the virtual time of day of your local machine minus the weather conditions. In theory, one could do both but that is beyond this tutorial. However, it points out one reason to become a proficient scripter. Some of the sequences have to work in day or night conditions, and a changing background has surprising effects on renderings of objects such as text. I’ll demonstrate this.
Some people believe an author should put text instructions at every step as if the user were building a kitchen cabinet from a kit. If possible, I prefer to minimize imperative instructions and big overview maps. They destroy immersion. A user should be able to get lost. That’s part of the fun. Remember, this is a computer medium but not of necessity, a computer application like a spreadsheet. This is for fun and some cerebral exercise maybe, but not to keep your books or your photos or your FaveFive in. Use your cell phone for that, ok?
On the other hand, VRML worlds are sometimes referred to as ‘free roamers’. Unlike games that are maze-driven, a VRML world usually enables you to go wherever you want once you enter the world. This is much more like Second Life than Myst. So it is up to the author to use their story telling skills, graphics skills, and programming skills to encourage the user to follow a pre-defined path of discovery. As we go through this example, I will point out some design decisions I made to encourage the user to choose to follow the story.
At the dock, the user can either get on the boat or wander off to roam. If they skip the boat, they won’t see some information that provides clues. There will be other entry points into the story, but if they miss the boat, some of those clues will be missed. To get on the boat, they have to get through the dock and to do that, they have to learn to push the mouse and turn.
Storytelling in a free-roamer can be cinematic (single path) or game-like (multi-path). The cost of the cinematic extreme is boredom once it has been played; the cost of the multi-path extreme, creating the separate converging and diverging paths to get to the end state of the story/game (IF THERE IS ONE), is your time and maybe the loss of the user’s attention.
We’ll come back to this in another tutorial and discuss really abstract make-your-eyes-glaze-over (MYEGO) topics such as path metrics for complexity ratings that differentiate genre of 3D applications (Give me a grant for that one, Mr. MacArthur!). For now, remember that every multiple choice situation exponentially raises the costs of completing the world for you and the user. In English,
The more you offer, the longer it takes to choose, the more it takes to provide the choices.
As Rita Turkowski says, “Baby steps, baby steps, baby steps.”
Take the Ride
If the user gets through the dock and steps on the boat, the boat takes over and a little show begins consisting of a ghost floating up out of the river, flying eyes that “flash at the sound of lies”, some excellent Hindi movie music and flaming text. This is our first sequence. It is completely automatic. The user should relax, read the text, and enjoy the ride.
As I said, to make this easy for the user, you have to write some code. In this sequence, all of that code is located in the boat.wrl file.
Did I tell you that VRML files conventionally have wrl file extensions unless compressed? Well, I am telling you now. Typically, a scene will be made up of several world (wrl) files, each with some VRML object or several objects and the code that makes them behave. One technique we will be learning later in this tutorial is how to enable these objects in different files to send and receive events to and from each other, but our first example of a sequence is initiated and controlled completely from within one object: the boat.
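The usual glue for a scene built from several wrl files is the Inline node. Here is a minimal sketch of the pattern; the filenames and the translation are made up for illustration, not taken from ROL:

```vrml
#VRML V2.0 utf8
# main.wrl -- a sketch of a multi-file scene.
# "dock.wrl" and "boat.wrl" are assumed filenames; any .wrl file works.
Inline { url "dock.wrl" }
Transform {
  translation 245 -3.1 582   # place the inlined object in the world
  children Inline { url "boat.wrl" }
}
```

Note that DEF names are scoped per file: a ROUTE in main.wrl cannot reach a node defined inside an Inlined file by name, which is exactly why sending events between objects in different files takes the extra machinery we will get to later.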
“Baby steps…” sayeth the Turk.
As soon as the user gets on the boat, a VRML object called a Proximity Sensor fires. The boat’s Proximity Sensor kicks off the entire show. All of it is initiated by that single sensor firing. Let’s look at the code that makes that happen.
Code!! Thank God, I thought he would never shut up!
Viewpoints, Navigation Nodes and Defined Names
First, we have the two VRML objects for the entry point view and the navigation:
DEF ShoreView Viewpoint {
orientation 0 1 0 1.70
position 342 4.0 -80
description "Temple Dock"
}
A Viewpoint named ShoreView is placed at position (342, 4.0, -80): 342 meters from the world origin on the X axis, 4 meters above the origin on the Y axis, and negative 80 meters on the Z axis. It is turned to the left and given the description “Temple Dock”.
Did I tell you the unit of measurement in VRML is meters? Well, I’m telling you now. Like radians for angles, it is one of those geeky things that is inconvenient for some people and downright baffling to Americans, but we cope by not giving a frik about proper scaling and blaming it all on the French who reply by blaming the XML encoding of X3D on us and of course, they are right about that. Yanno, when they’re right, they’re right. Of course when they are left, they are still right. That’s what it is to be French.
The description value is displayed in the pop-up menu of the browser when the user right-clicks, if the browser is the Bitmanagement Contact browser. Different browsers put these descriptions in different parts of the GUI, but a right-click pop-up menu is common.
Note the DEF keyword assigning the name ShoreView. When referring to the Viewpoint from ROUTEs or Scripts or USE statements, etc., it is the DEF name that is used for reference, not the description value. The description value is simply a label for the user interface. If you leave this blank, the user cannot find this viewpoint using the GUI and that can be quite useful because sometimes you want to hide viewpoints from the user but use them in scripts.
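As a sketch of the difference, here are two made-up viewpoints, one for the user and one for scripts only:

```vrml
# A viewpoint the user can pick from the browser's viewpoint menu:
DEF DockCam Viewpoint {
  position 0 2 10
  description "The Dock"    # label shown in the GUI
}
# No description: invisible to the GUI, but a ROUTE or Script
# can still bind it through its DEF name.
DEF HiddenCam Viewpoint {
  position 0 50 0
  orientation 1 0 0 -1.57   # looking straight down
}
```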
DEF names are used by ALL VRML objects as the names for referring to them. A DEF name is a ‘defined’ name. This is similar to the concept of SGML/XML id (identifier) values. The difference is that in XML, an id value has to be declared in a DTD or Schema; in VRML, the DEF keyword is used in the data itself. So if you don’t define a name, you can’t refer to the object later.
Define the name BEFORE you need to use it in a file. Just do it, or the browser may complain about nodes not existing or “INVALID REFERENCES” even though you can see the node right there in the file. Just do it.
Don’t define names that you don’t need. It wastes resources. You may not notice it, but it does. On the other hand, you may name it just so you can find it easily while you are building the world. That’s ok. There are utilities for cleaning up later, in fact, FREE utilities. Say mondo CHEAP!
DEF DefaultNav NavigationInfo {
type "NONE"
}
A default Navigation node is provided that sets the navigation type to NONE. This keeps the user from getting off the boat midstream. A captive audience is a wonderful karma. There are other important properties of Navigation nodes, but we don’t need them here.
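Navigation nodes are bindable, so you can trap the user for a ride and quietly hand control back afterward. ROL simply defaults to NONE in the boat file, but here is a hedged sketch of the hand-back technique with made-up DEF names:

```vrml
# The first NavigationInfo in a file is bound by default:
DEF WalkNav   NavigationInfo { type [ "WALK" "ANY" ] }
DEF FrozenNav NavigationInfo { type "NONE" }
DEF RideTimer TimeSensor { cycleInterval 180 }
# isActive sends TRUE when the timer starts and FALSE when it stops.
# set_bind TRUE pushes FrozenNav onto the binding stack (user frozen);
# set_bind FALSE pops it, restoring WalkNav at the ride's end.
ROUTE RideTimer.isActive TO FrozenNav.set_bind
```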
Next we see the transforms and other objects that make up the boat itself. They begin as follows:
DEF Boat Transform {
translation 245 -3.1 582
rotation 0 1 0 -1.57
scale 1 1.3 1.5
children [
Transform { rotation 0 0 1 -0.15 children [
DEF BoatView Viewpoint {
orientation 0 1 0 1.55
position 1.8888 2.9 -3
}
]}
Note the nesting for these two transforms (children are positioned relative to parents) and that the second transform contains the Viewpoint for the boat itself. As mentioned above, when motion is applied to the boat transform, the Viewpoint moves with it. There are various geometric objects that make up the visible structure of the boat, but we aren’t looking at those here.
Proximity Sensor
The next object is very necessary. It is the proximity sensor that starts the journey as described above:
DEF OnBoat ProximitySensor { size 12 12 12 }
Proximity Sensors are surprisingly simple looking for what they can do. This is an example of the language’s pre-built objects provided by the browser. The browser tracks the current position of the user. Proximity sensors enable the author to declare regions in the virtual space of interest. When a user enters or exits the area defined in the sensor declaration, in this case an area of a 12 meter cube (X = 12, Y = 12, Z = 12), the sensor sends an enterTime event. When the user exits, it sends an exitTime event. These can be used, of course, to start other behaviors.
In fact, a Proximity Sensor continuously sends position and orientation events for the user as long as they are in that sensor’s area, allowing you to track the user through the sensor area continuously and with considerable precision. This is important, but we don’t need it here. There are other important events and properties of Proximity Sensors, but for now, we only need the enterTime and exitTime eventOuts. Yet the simple-looking Proximity Sensor is one of the most powerful objects in the VRML language for many purposes.
Remember, events have types, and types must match when routed to other objects. The enterTime and exitTime events (both eventOuts, because the sensor emits them to other objects) are both SFTime events, that is, timestamps. By sending a timestamp event to another object with an eventIn of type SFTime, one can use these events to start a behavior, or even several. It’s all a matter of routing.
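Putting the sensor to work is a one-line ROUTE. In ROL the enterTime actually goes through a Script first, as described in the next section, but the direct route shows the shape of it:

```vrml
DEF OnBoat ProximitySensor { size 12 12 12 }
DEF BoatRideTimer TimeSensor { cycleInterval 180 }
# SFTime out to SFTime in: the timestamp of entry becomes the
# timer's start time, so the ride begins the moment you step aboard.
ROUTE OnBoat.enterTime TO BoatRideTimer.set_startTime
```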
Time Sensor
The behaviors we need for the boat trip are: capture the user on the boat, move the boat along the river, and turn the boat as it follows the bends.
To create the sequence of events for these behaviors, we need: the Proximity Sensor, a Script, a TimeSensor, a Position Interpolator, and an Orientation Interpolator.
To capture the user on the boat, the boat proximity sensor sends an enterTime event to a script that binds the boat viewpoint as the current viewpoint. The script then sets the boat in motion by sending the enterTime to the startTime event of a VRML engine called a TimeSensor. The TimeSensor sends time events to other engines, specifically a Position Interpolator for moving the boat in the XYZ coordinate system.
To turn the boat as it follows the river, the same TimeSensor’s timing events are also sent to another engine called an Orientation Interpolator. These could be controlled by different timers, but it is easier to synchronize the movements by using the same timer, so in the example I use one timer for both. There can be reasons for separating them, but we’ll look at that in the next section as we dissect the interpolator declarations.
TimeSensors are the verbs of VRML in the sense that they cause actions. When naming them, it helps to use verbs in the DEF name or at least to indicate the kind of behavior they control. This is the TimeSensor that controls the motion of the boat by sending timing events to the movement engines, the Position and Orientation Interpolators.
DEF BoatRideTimer TimeSensor {cycleInterval 180}
The cycleInterval property specifies that the BoatRideTimer TimeSensor runs for 180 seconds, one time. If I want it to repeat, I can set the loop property to TRUE; it defaults to FALSE. There are other property values for TimeSensors, and because we will use them later, let’s look at them now. Here is the complete node specification for a TimeSensor object:
TimeSensor {
cycleInterval 1 # exposed field SFTime
enabled TRUE # exposed field SFBool
loop FALSE # exposed field SFBool
startTime 0 # exposed field SFTime
stopTime 0 # exposed field SFTime
}
When reading a node specification, the syntax looks like the syntax used in an instance. It provides the node type, the properties of the node, the default values that you can rely on being there if you don’t change them, and the data type of each field. If these are exposed fields, they can be set by other nodes in the world.
Not shown in the node specification but shown in the documentation are the event interfaces. A TimeSensor has the following event interfaces:
isActive (SFBool): sent when the sensor starts to run (TRUE) and when it stops (FALSE).
cycleTime (SFTime): the current time, sent at the start of each cycle. If the loop property value for BoatRideTimer were TRUE, this interface would send a timestamp every 180 seconds.
fraction_changed (SFFloat): the fraction of the current cycle that has completed. The cycle is divided from 0 to 1, where 0 is the beginning of the cycle and 1 is the completion of the cycle. All divisions between 0 and 1 are floating point numbers, for example, 0.0, 0.1, 0.18372 and so on.
time (SFTime): the current time, in seconds since 12 midnight GMT, January 1, 1970.
Remember that event interfaces are the destinations for ROUTEd events. Exposed fields can be set by the author, by default, or by scripts. To understand VRML objects, learn to read the node specifications.
This is important:
The restriction on what nodes can be routed to other nodes is the type compatibility of the events, not the nodes themselves. If a node exposes, for example, an SFTime eventIn and another node exposes an SFTime eventOut, these nodes can be connected. The same is true for other event data types such as SFFloats. The power of this becomes apparent as you build more VRML worlds.
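A quick sketch with made-up DEF names makes the rule concrete: the nodes on either end of a ROUTE can be anything, as long as the event types agree.

```vrml
DEF Prox  ProximitySensor { size 10 10 10 }
DEF Timer TimeSensor { cycleInterval 5 }
DEF Fader ScalarInterpolator { key [0, 1] keyValue [0, 1] }

# Legal: SFTime eventOut -> SFTime eventIn
ROUTE Prox.enterTime TO Timer.set_startTime
# Legal: SFFloat eventOut -> SFFloat eventIn
ROUTE Timer.fraction_changed TO Fader.set_fraction
# ILLEGAL (the browser will reject it): SFTime -> SFFloat
# ROUTE Prox.enterTime TO Fader.set_fraction
```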
Interpolators
This is out of order in the code, but first let’s look at the ROUTEs for the boat movement, because they can be considered a single movement given that they are controlled by the same TimeSensor: BoatRideTimer.
ROUTE BoatRideTimer.fraction_changed TO BoatRidePosition.set_fraction
ROUTE BoatRidePosition.value_changed TO Boat.set_translation
ROUTE BoatRideTimer.fraction_changed TO BoatRideOrientation.set_fraction
ROUTE BoatRideOrientation.value_changed TO Boat.set_rotation
Note that the only way you can tell that these are initiated by the same sensor is that the same DEF name, BoatRideTimer, appears as the timing source in both pairs of ROUTEs. The timer is sending SFFloat events (the fraction of the timer cycle) to the SFFloat eventIns of the Position and Orientation interpolators. The interpolators’ eventOuts then send two different kinds of events to the Boat transform: a vector (XYZ) and a rotation (the axis to rotate around and the angle of rotation). By driving both with the same timer, I am ensuring that the rotations of the boat occur at specific positions in the journey. It makes sense. You really don’t want to run the boat up on the banks, do you?
To understand precisely how this works, we must look at the Position and Orientation interpolators themselves.
DEF BoatRidePosition PositionInterpolator {
key [0, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]
keyValue [ 243 -3.1 582, 225 -3.1 450, 225 -3.1 445, 225 -3.1 438, 370 -3.1 365, 375 -3.1 355, 375 -3.1 300, 375 -3.1 200, 375 -3.1 100, 346 -3.1 -80 ] }
DEF BoatRideOrientation OrientationInterpolator {
key [ 0 .1 .2 .3 .5 .6 1 ]
keyValue [ 0 1 0 -1.21, 0 1 0 -1.57, 0 1 0 -1.57, 0 1 0 -2, 0 1 0 -3.07, 0 1 0 -1.57, 0 1 0 -1.57 ] }
Note that both have two parts: a set of keys and a set of key values. Keys and key values are paired. At key 0, the position is XYZ (243 -3.1 582). At key 1, the position is XYZ(346 -3.1 -80). These are the start and stop positions of the boat on the trip respectively. The same pairing relationship is used in the orientation, but the values are different.
Here is how it works: the TimeSensor sends fractions of its cycle to the interpolators, and each interpolator maps the incoming fraction onto its keys, which index the key values. The series of position and orientation moves starts and stops at the same times, but the independent moves happen at different fractions of that time. It takes some experimentation to set the values so turns happen at the correct bends in the river and positions go in the directions pointed to by the turns, thus keeping the prow pointing forward, but that is how it works.
Note that by setting keys to different intervals, you can speed up or slow down certain parts of the movements. To keep movement natural, objects under the influence of gravity or water or any other force of resistance are slower when they start, faster once in motion, and slow down as they stop or they go kerflop. Natural motion has these elements of force. Since currently there is no physics engine in VRML browsers for simulating this, you can use interval distance to approximate it. There is a physics engine specification in the works for X3D engines, and it should be there Real Soon Now. However, if you want to create natural motion, you should get into the habit of observing different kinds of objects in motion to see how they behave and mentally try to model that using intervals. It is one way of preparing for a trance dance. Breathe.
The technical term for this is keyframe animation. In theory, there are an infinite number of positions along any path. In practice, that is GeekyCaca. Remember the old gag in math class where they prove that a chicken crossing the road has to cross half of it first, then half of that, then half of that, so the chicken is run down by a car before it can cross because there is an infinite number of steps. This drives you nuts until you realize that there is not an infinite number of steps. The chicken takes approximately the same length of step each time, even if it speeds up to avoid the car. There is an infinite number of divisions because the function for dividing the steps loops (really recurses, but let’s not go mathGeeky). The chicken may indeed get run over if it isn’t fast enough, but not because it has to cross an infinite space with tiny legs.
The author sets the keys for specific moves and positions along the path. The browser engine determines the in-between positions through a mathematical process called interpolation (thus Position and Orientation Interpolators). So you can think of the key value positions as the minimum and maximum steps the chicken takes, the orientation angles as the minimum and maximum angles it turns as it runs, and the keys as the length of time each one of those takes, given a time sensor sending fractions (SFFloats) of some cycle interval. Really, the key is a division of the fraction, but the engine automatically maps that to time. Don’t worry. Given type compatibility, it just works.
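Here is the whole keyframe machinery in one minimal, self-contained world (all names made up, not ROL code). Uneven key spacing gives the ease-in for free: the box covers 2 meters in the first half of the cycle and 8 in the second, so it visibly accelerates.

```vrml
#VRML V2.0 utf8
DEF Box1 Transform {
  children Shape {
    appearance Appearance { material Material { } }
    geometry Box { }
  }
}
DEF Clock TimeSensor { cycleInterval 4 loop TRUE }
# At fraction 0.5 the box has only reached x = 2: slow start, fast finish.
DEF Mover PositionInterpolator {
  key      [ 0, 0.5, 1 ]
  keyValue [ 0 0 0, 2 0 0, 10 0 0 ]
}
ROUTE Clock.fraction_changed TO Mover.set_fraction
ROUTE Mover.value_changed TO Box1.set_translation
```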
Let’s look at another timer/interpolator pair that turns the sounds up and down in volume/intensity to better understand the principles of type compatibility and intervals.
DEF TurnDownTimer TimeSensor { cycleInterval 3}
DEF TurnDownSound ScalarInterpolator {
key [0,1]
keyValue [1,0] }
DEF TurnUpTimer TimeSensor { cycleInterval 4 }
DEF TurnUpSound ScalarInterpolator { key [0,1] keyValue [0,1] }
This is simpler, but the principle is the same. A ScalarInterpolator takes a single floating point value, such as the intensity (volume) of a sound playing, and changes it by the scale in the keyValue. So if a volume starts at 0 (no sound), a scale of 1 turns it up all the way (not to Eleven, just 1). A timer with a cycleInterval of 4 seconds routed to a ScalarInterpolator with only the keyValues 0 and 1, routed in turn to a Sound node, will turn the sound up to full volume in four seconds. One can add intervening fractions to make the sound go up evenly or unevenly, just as one can speed up the boat when going in a straight line and slow it down during turns by adjusting the gaps between the key values.
Note in the example above that I used two different timers. Why? Because otherwise initiating a single timer would attempt to turn it up and down at the same time and that doesn’t work, does it? Also note that I use two different lengths for the cycleIntervals. Why? Because I wanted to. That’s all. Observing people, they usually turn sound up slower than they turn it down. It simply feels natural.
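Wiring the fades up uses the same routing pattern as the boat, with the Sound node’s set_intensity as the final destination. A sketch using the DEF names above; Travel_Sound is the looped Sound node we meet in the next section, and the triggers that start the timers (a Script in ROL) are omitted here:

```vrml
ROUTE TurnUpTimer.fraction_changed   TO TurnUpSound.set_fraction
ROUTE TurnUpSound.value_changed      TO Travel_Sound.set_intensity
ROUTE TurnDownTimer.fraction_changed TO TurnDownSound.set_fraction
ROUTE TurnDownSound.value_changed    TO Travel_Sound.set_intensity
```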
To live well in real time or virtual time:
Learn to watch. Learn to listen.
Listening and Seeing are everything. Timing is everything else.
Practice makes it real. Skill makes it virtually real.
Breathe. Remember.
Sound Nodes and the Switch Proto
Before getting to the Script node that ties the sequences together, let’s look at two objects being animated by the Timers and Script that are not the boat or the user’s viewpoint. These are the sound nodes that provide the background music, and a series of three gif graphics that are the ghost and the eyes of the soul of the river. Spooky stuff.
Sound Node
Here is the first sound node used in the journey:
DEF Soprano_Sound Sound {
maxBack 1300
maxFront 1300
source DEF SopranoGreeting AudioClip {
url "audio/sopranoGreeting.wav"
}
spatialize FALSE
}
A sound node is actually two nodes: the Sound node itself, which places the sound in the 3D space and controls its intensity and range, and a source node, usually an AudioClip, which points to the audio file and controls playback.
There are two sound nodes in the boat journey. The first, shown here, is an a cappella soprano (big gal whining alone). The sound plays once and stops. The second is a synth bass drone with a repeating riff. This is a loop that plays until the end of the boat ride. These sounds are timed and interpolated so that as the soprano finishes, the looped bass riff begins. They could easily be one wav file, but I chose to keep them separate because I reuse different sounds in different parts of the world, and looping the second sound makes it smaller. Some sounds serve as themes for characters, emotions, events, and so on. Keeping them small and discrete keeps them available when I need them.
HINT: a sound loop library is just as important as a texture library for exactly the same reason. Make them in your spare time when waiting for Doctor Who to return to the Sci-Fi channel. Decorate audio space as carefully and passionately as you decorate visual space and remember, they are BOTH 3D+Time and in combination, used well, the soul of virtual reality.
DEF Travel_Sound Sound {
intensity 0
maxBack 1300
maxFront 1300
source DEF Travel AudioClip {
loop TRUE
startTime -1
url "audio/travel02.wav"
}
spatialize FALSE }
Note that the first node leaves intensity and loop at their defaults. The second is set with the intensity at zero (silent) and loop TRUE. I use the combination of timers and scalar interpolators to turn it up as the first sound completes its one cycle. Thus, I get the effect of a single sequenced music background from two sounds and use the intensity to blend them. The browser I am using supports up to 16 channels of sound, meaning you can theoretically keep sixteen sounds going at one time. Your mileage may vary given your sound card, but most handle that with varying degrees of fidelity.
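How do the two sounds hand off? One way to cue the fade-up when the soprano finishes is a tiny Script watching the AudioClip’s isActive eventOut. This is a hedged sketch, not the exact ROL code; it assumes the DEF names from the two Sound nodes above:

```vrml
DEF BlendCue Script {
  eventIn  SFBool sopranoActive
  eventOut SFTime startFade
  url "vrmlscript:
    function sopranoActive(active, ts) {
      // isActive goes FALSE when the clip finishes playing
      if (!active) startFade = ts;
    }"
}
ROUTE SopranoGreeting.isActive TO BlendCue.sopranoActive
ROUTE BlendCue.startFade TO TurnUpTimer.set_startTime
```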
If you were running ROL right now, you would notice that two sounds run continuously in the background: wind and water. Near the temples, some music loops run automatically. Together, these comprise the ambient background sounds over which all other discrete sounds must run, be heard, and blend. I will show you later how these can be controlled by the objects in the world, using events sent among the objects in different files.
The size of good sounding wave files is one of the bigger obstacles to decent download times for virtual worlds. Graphic texture sizes are nothing when compared to the size of a sound file; yet sound is one of the single best ways to make a virtual world come alive until Smell nodes are implemented.
I could use MIDI which is considerably smaller but depends too much on the local machine for fidelity (say too many cheap SoundBlaster cards out there and they aren’t the worst) and because MIDI cards don’t usually have whining soprano sounds or wind and water on them. There are sound cards with wav samples on them, but these don’t exist too frequently outside of recording studios or the homes of serious musicians, which are the same thing these days.
I could combine MIDI and wave files but that is even more unpredictable. At this point, mp3 sound support is only in the MovieTexture nodes of most browsers and streaming sound support is spotty but improving.
What’s a guitar hero to do?
I will dedicate a whole tutorial to sounds in VRML, but for now, know that these are 22kHz mono wav files, and that is as much compromise in fidelity as I, as a musician, will make with the web for bandwidth. Go below that and sound sucks. Tell the local bandwidth nazis to go pick on YouTube.
The Switch Node and Switch Interpolator (A PROTO)
To give the effect of a ghost rising from the water, floating toward the boat, then transforming into a pair of flashing sultry eyes, I use some really…. Cheap… tricks. If you want to play, you have to pay. There are some sophisticated ways to do this with morphing of mesh surfaces, but hey, this isn’t Disney or Shrek and you and I aren’t paid per vertex yet.
The effect is achieved with three textures, a timer, position and orientation interpolators to move the textures, and a PROTO node to switch them out in series as they move toward the boat. We’ve already covered the first nodes, and I explained PROTOs in the first part of the tutorial. Let’s look at the Switch proto briefly, then how these combine.
The Switch Interpolator proto was provided to the VRML community by Jed Hartman and Josie Wernecke in “The VRML 2.0 Handbook”. It was originally provided for flipbook animation, where a series of images is changed in rapid succession to emulate motion as in a flipbook; I adapted it to drive text displays similar to karaoke text displays and for tricks like the GhostEyes animation. I’m not including the PROTO itself here because I am unsure of the publisher’s copyright position on that.
However, the instance of the proto looks like this:
DEF GhostEyes_Animator SwitchInterpolator {
key [ 0, 0.1, 0.5, 1 ]
keyValue [ 0, 1, 2, 3 ]}
As you can see, like the earlier interpolators, this is a set of keys that index key values. The key values represent the number of nodes or choices in the Switch. So if I want to have 27 lines of text, I would have 29 values so that the first and last nodes are blank. If I want one to remain on the screen at the end, I would decrease the keyValue by one member. Actually, you don’t need the intervening values for a simple switch. You can, as I do in other places where I use the Switch Interpolator, do this:
DEF mySwitch SwitchInterpolator { key [0,1] keyValue [0,28]}
The timer and the browser will see to it that the 29 nodes (choices 0 through 28) are displayed at equal intervals of time.
What the interpolator does is send the key values to a VRML Switch node to change the whichChoice value of the Switch, thus changing the text node or the graphic node, or any node inside the switch node to the next node in succession. A timer drives the SwitchInterpolator and is routed just as we have seen before:
ROUTE TextBackTimer.fraction_changed TO GhostEyes_Animator.set_fraction
ROUTE GhostEyes_Animator.value_changed TO GhostEyes.whichChoice
The position and orientation engines create the movement. The graphics spin as they approach. To create the illusion that there is one graphic, they are switched as they turn. The tricky bit here is to time the switch so they change as they are edge-on toward the viewer. This works because, from the viewer’s perspective, a graphic on edge is a line, so the switch to the next graphic is not perceptible, except for the last switch, when it is close enough that the actual detail can be seen changing from the image of the spirit into the flashing eyes. As I said, cheap tricks. They aren’t perfect, but they are cheap.
Compare the values in the position and orientations to see how this works. Distance matters to illusions:
DEF EyesPosition PositionInterpolator {
key [0, 0.35, 0.5, 0.65, 1]
keyValue [ 1 -10 -50, 1 5 -50, 1 5 -40, 1 6 -35, 0 0.36 -0.2]}
DEF EyesOrientation OrientationInterpolator {
key [0, 0.2, 0.3, 0.5, 0.95, 0.99999999, 1]
keyValue [ 0 1 0 -3.14, 0 1 0 -1.57, 0 1 0 3.14, 0 1 0 -1.57, 0 1 0 3.14, 0 1 0 -1.57, 0 1 0 3.14]}
The secret is that the movement of the graphics, the switch, and the texts are all controlled by a single timer that synchronizes when the switching occurs relative to the position and orientation and the display of the text that is the first part of the story.
ROUTE TextBackTimer.fraction_changed TO EyesPosition.set_fraction
ROUTE EyesPosition.value_changed TO TextBackground.set_translation
ROUTE TextBackTimer.fraction_changed TO EyesOrientation.set_fraction
ROUTE EyesOrientation.value_changed TO TextBackground.set_rotation
ROUTE TextBackTimer.fraction_changed TO GhostEyes_Animator.set_fraction
ROUTE GhostEyes_Animator.value_changed TO GhostEyes.whichChoice
If this seems insanely complex, it isn’t really. You build and test it one piece at a time until it works as you think it should. Then tweak.
A Switch node is a group with a whichChoice value that can be changed by sending values to it, as the SwitchInterpolator illustrates: it is basically a proto that takes a series of values, converts them to integers, and sends them as eventOuts. Here is the first part of the text node switch. The choices count from zero to n, and choice 0 is displayed by default. To give the appearance of the text materializing in empty space, I set up the first node with the values I will use for the other nodes but omit the string property. That ensures the first text node is blank but useful.
DEF RiverPoem Switch {
whichChoice 0
choice [
Transform {
children Shape {
appearance Appearance {
material DEF WhiteFont Material {
diffuseColor 1 0 0
shininess 1}}
geometry Text {
fontStyle DEF Poem FontStyle {
justify "MIDDLE"
size 0.22
style "ITALIC"}}}}
Transform {
children [
Transform {
children Shape {
appearance Appearance {
material USE WhiteFont
texture DEF Flame MovieTexture {
url "movies/flame.gif"
}}
geometry Text {
string "The River of Life"
fontStyle DEF Title FontStyle {
justify "MIDDLE"
size 0.3
style "ITALIC"}}}}]}
By the way, you can route values into the nodes in a switch for cheap tricks such as changing the color of each line of text as it appears, setting the transparency of each texture to make images appear and disappear or look … ghostly, and so on. I do that in ROL. I won’t illustrate it here because it is a combination of cheap tricks we’ve already looked at. (Hint: send scalar values to transparency fields to fade the image in and out, just as you turn sounds up and down.)
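As a sketch of that hint in plain JavaScript (the fade boundaries here are made up): a ScalarInterpolator routed to a Material’s transparency is just a function from the timer fraction to a transparency value, where 1 is invisible and 0 is opaque.

```javascript
// Hypothetical fade curve: invisible until the fraction reaches 0.2,
// fully visible through 0.8, then fading back out -- the same shape
// you would give a ScalarInterpolator with
//   key [0, 0.2, 0.8, 1]  keyValue [1, 0, 0, 1]
function fadeTransparency(fraction) {
  var fadeIn = 0.2, fadeOut = 0.8;          // assumed boundaries
  if (fraction < fadeIn) return 1 - fraction / fadeIn;              // fading in
  if (fraction > fadeOut) return (fraction - fadeOut) / (1 - fadeOut); // fading out
  return 0;                                 // fully opaque (visible)
}
```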
You will also note in the example that I use a MovieTexture node, in this case a gif animation, to give the text in the switch the appearance of being on fire. Such 2D animation on 3D surfaces is extremely useful and extremely cheap compared to creating such effects in 3D for a one-shot deal. Try to do fire in 3D. Just try. Then come back here another time and I’ll show you a way to make a 2D fire look as if it were 3D WITHOUT Billboard nodes. ;-)
The Sequencing Script
Now we come to programming over declared nodes. Without scripts, VRML worlds are just elaborate cuckoo clocks: they do the same thing every time, more or less monotonically. With clever timing in loops, they can casually appear more complex than they are, but randomness only goes so far toward emulating life. Eventually, an organizing, decision-making structure must be used to give flight to fancy. I’m not advocating intelligent design as the principle of the universe, but for intelligent VR, you need scripts to make choices and to elaborate on choices. That life emerges from random accidents or intelligence is a matter of philosophy; that life directs its own evolution as a matter of intelligent choice is scientific fact.
The script serves as a place to route events from objects, then change the values of these or other objects based on the values of the events received. In the last tutorial, I illustrated the structure of a script, how events are declared, how these also rely on type compatibility, and how nodes can be USEd in fields so that their property values can be set within the script event handlers or in functions called from those handlers.
With that said, I give you the script here without too much comment. You will want to note how different event types are used to activate the behaviors, how these are chained to start and stop behaviors to create the sequence, and nuances such as using the same SFTime event and adding a value to it to enable behaviors to start later than the time the timestamp is received.
DEF SetViewPoint Script {
eventIn SFBool jumpToBoat
eventIn SFTime boatRide
eventIn SFTime getOnDock
eventIn SFBool setViews
eventIn SFBool stopSound
field SFNode boatnav USE BoatNav
field SFNode defnav USE DefaultNav
field SFNode shoreview USE ShoreView
field SFNode boatview USE BoatView
field SFNode boattimer USE BoatRideTimer
field SFNode onboat USE OnBoat
field SFNode greeting USE SopranoGreeting
field SFNode travel USE Travel
field SFNode templedock USE TempleDock
field SFNode dockstop USE DockStop
field SFNode turndown USE TurnDownTimer
field SFNode turnup USE TurnUpTimer
field SFNode poemtimer USE PoemTimer
field SFNode textbacktimer USE TextBackTimer
field SFNode textback USE TextBack
field SFNode eyes USE Eyes
url "javascript:
function jumpToBoat (value) {
if (value == TRUE ) {
boatview.set_bind = TRUE;
boatnav.set_bind = TRUE;
}
}
function boatRide (value) {
if (value > 0 ) {
boattimer.startTime = value + 2;
greeting.startTime = value + 2;
travel.startTime = value + 21;
turnup.startTime = value + 21;
poemtimer.startTime = value + 7;
textbacktimer.startTime = value + 2.5;
}
}
function getOnDock (value) {
if (value > 0 ) {
turndown.startTime = value;
dockstop.startTime = value + 4;
templedock.startTime = value + 18;
}
}
function stopSound (value) {
if (value == FALSE ) {
travel.loop = FALSE;
}
}
function setViews (value) {
if (value == FALSE ) {
shoreview.set_bind = TRUE;
boatnav.set_bind = FALSE;
onboat.enabled = FALSE;
textback.transparency = 1;
eyes.loop = FALSE;
}
}"
}
ROUTE OnBoat.isActive TO SetViewPoint.jumpToBoat
ROUTE OnBoat.isActive TO EyeLight.on
ROUTE OnBoat.enterTime TO SetViewPoint.boatRide
ROUTE OnBoat.exitTime TO SetViewPoint.getOnDock
ROUTE BoatRideTimer.isActive TO SetViewPoint.setViews
ROUTE TurnDownTimer.isActive TO SetViewPoint.stopSound
ROUTE BoatRideTimer.fraction_changed TO BoatRidePosition.set_fraction
ROUTE BoatRidePosition.value_changed TO Boat.set_translation
ROUTE BoatRideTimer.fraction_changed TO BoatRideOrientation.set_fraction
ROUTE BoatRideOrientation.value_changed TO Boat.set_rotation
ROUTE TextBackTimer.fraction_changed TO EyesPosition.set_fraction
ROUTE EyesPosition.value_changed TO TextBackground.set_translation
ROUTE TextBackTimer.fraction_changed TO EyesOrientation.set_fraction
ROUTE EyesOrientation.value_changed TO TextBackground.set_rotation
ROUTE TextBackTimer.fraction_changed TO GhostEyes_Animator.set_fraction
ROUTE GhostEyes_Animator.value_changed TO GhostEyes.whichChoice
The Babe On The Bridge
When the river journey ends, the script automatically takes you out of the boat and plops you on the dock with an ominous sound. The boat trip timer’s isActive property is routed to the setViews function. When isActive sends an SFBool value of FALSE, meaning the timer (which has one long cycleInterval) has stopped sending timing events, this function binds the view to the Temple Dock, resets the Navigation node to the default, disables the Proximity sensor, and makes the Material node with the ghostly eyes disappear by setting it to transparent.
function setViews (value) {
if (value == FALSE ) {
shoreview.set_bind = TRUE;
boatnav.set_bind = FALSE;
onboat.enabled = FALSE;
textback.transparency = 1;
eyes.loop = FALSE;
}
}
isActive events are useful for initiating sequencing transitions. Notice that AudioClips also send isActive events, so they can be used for this as well, and I do that in other parts of the world. In fact, sequencing is as much about finding useful event combinations as about timing. One can also route fraction_changed events and use comparison statements (e.g., if statements) in the code to initiate events based on the approximate value of the current fraction, thus having a series of seemingly unrelated events keyed by one timer without the timer actually being routed to them. Using a script this way is also a means of getting around type compatibility: the handler has to receive type-compatible events, but setting USEd node values within the handler does not rely on that.
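Here is a minimal sketch of that comparison trick in plain JavaScript (the node name in the comment is hypothetical). Because fraction_changed rarely lands on an exact round value, test for crossing a threshold rather than for equality:

```javascript
// Keying an event off one timer's fraction without routing the timer
// to the target node: fire once when the fraction crosses a threshold.
var lastFraction = 0;

function crossed(prev, curr, threshold) {
  return prev < threshold && curr >= threshold;
}

// eventIn handler receiving something like TextBackTimer.fraction_changed
function set_fraction(value) {
  if (crossed(lastFraction, value, 0.25)) {
    // e.g. dockSound.startTime = value;   // hypothetical USEd node
  }
  lastFraction = value;
}
```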
A hint on using Javascript: = and == are two different operators. The first is an assignment operator; the second compares two values. Javascript interpreters are sometimes lax, so if you mix these up in a statement where you are trying to compare a value such as a changed fraction, e.g.,
if (value == .999) { ... }
and instead you enter
if (value = .999) { ... }
the interpreter accepts it, but the assignment evaluates to .999, which counts as true, so the branch runs every time and things may not work as you expect. Just a hint…
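To see the trap in isolation (plain JavaScript, hypothetical function names):

```javascript
// BUG: the assignment evaluates to 0.999, which is truthy,
// so this branch fires on every event regardless of value.
function buggyCheck(value) {
  if (value = 0.999) { return "fired"; }
  return "idle";
}

// Correct: compare. (>= is also safer than == here, since a
// changed fraction rarely lands on an exact value.)
function correctCheck(value) {
  if (value >= 0.999) { return "fired"; }
  return "idle";
}
```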
The ominous sound is just for drama. When you move toward the babe, she sings with an accompanying low note that makes a nice cadence out of the dock sound. The real reason for plopping you down, though, is that getting out of the boat is harder than getting in: the user needs more maneuvering skill, and most won’t manage it without practice. That struggle would slow the pace of the story. Also, it’s a nice dramatic effect to suddenly be looking from the temple dock at the temples in the distance, and between you and the temples, that sad-eyed babe on the bridge. We’ll meet her soon enough.
At Journey's End
There are lots of tricks and variations on these combinations of nodes, events, routes, and scripting, but they are all just the same tricks used differently. However, this sequence works inside one object. In the next installment, I will explain, as it was explained to me by the brilliant and often dry-humored Sergey Bederov at Parallel Graphics, how to use PROTOs and EXTERNPROTOs to enable objects in discrete files to communicate. This enables very powerful cheap tricks for reactive characters and, if you are very clever and have the time, non-linear worlds.