Dragging Sensors


Dragging sensors are a special kind of sensor that not only track the user's motion but also can be used to move the shapes within the same group as the sensor. There are three types of dragging sensors:
  • PlaneSensor: lets the user move objects in the XY plane.
  • CylinderSensor: maps the movement to the surface of a conceptual cylinder.
  • SphereSensor: maps the movement to the surface of a conceptual sphere.
All three sensors share the following fields:
  • enabled specifies whether the sensor is active.
  • offset indicates the initial position of the shapes within the group: a zero offset means the shapes are dragged from their original position, whereas a non-zero offset means dragging starts at the original position plus the specified offset. Note that the type of the offset field varies with the type of the sensor.
  • autoOffset specifies whether the browser should keep track of the current position or perform every dragging operation relative to the original position. It only matters for the second and subsequent drags: if autoOffset is TRUE the second drag starts where the first one ended (the browser updates offset itself when a drag finishes), whereas if it is FALSE the shapes return to their original position each time a new dragging operation begins. A sketch of these fields follows this list.
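The fragment below is a minimal sketch of these fields, not taken from the tutorial examples: a PlaneSensor shares a group with a box, its offset of 1 0 0 makes the reported translation start one unit to the right of the box's authored position, and autoOffset TRUE makes the browser update offset at the end of each drag so the next drag resumes from there. The ROUTE needed to actually move the box is shown further down.

  #VRML V2.0 utf8
  # A PlaneSensor sharing a group with a box; the field values are
  # only illustrative.
  Group {
    children [
      PlaneSensor {
        enabled    TRUE
        offset     1 0 0     # SFVec3f for a PlaneSensor
        autoOffset TRUE      # resume each new drag where the last one ended
      }
      Shape {
        appearance Appearance { material Material { } }
        geometry Box { }
      }
    ]
  }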
The following events are common to all three sensors:
  • isActive indicates whether a dragging operation is in progress: it sends TRUE when the user presses the mouse button over a shape within the same group as the sensor, and FALSE when the button is released.
  • trackPoint_changed provides the actual coordinates on the surface defined by the sensor.
  • rotation_changed (SphereSensor and CylinderSensor) and translation_changed (PlaneSensor) provide the relative rotation or translation being performed.
In order to actually move the shapes you should place them inside a Transform node, in the same group as the sensor, and route these events to the corresponding fields of that Transform, as in the sketch below. See also the examples provided for each sensor.
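As an illustration of this routing, the sketch below (the node names DRAGGER and MOVER are just made up for the example) sends a PlaneSensor's translation_changed events to a Transform's set_translation field, so the box follows the pointer while it is dragged.

  #VRML V2.0 utf8
  # Sensor and Transform live in the same group; the ROUTE makes the
  # dragged translation drive the Transform holding the box.
  Group {
    children [
      DEF DRAGGER PlaneSensor { }
      DEF MOVER Transform {
        children [
          Shape {
            appearance Appearance { material Material { } }
            geometry Box { size 1 1 1 }
          }
        ]
      }
    ]
  }
  ROUTE DRAGGER.translation_changed TO MOVER.set_translation

For a SphereSensor or CylinderSensor the route would instead go from rotation_changed to the Transform's set_rotation field.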

    If you use multiple sensors in the same group it is up to you to decide which one does what: they will all generate events when any of the shapes within the group is dragged.

    If you use multiple drag sensors in nested groups, the sensors in the inner group grab the user's action and the sensors in the outer group ignore it.
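    For example, in the hypothetical file below each group has its own PlaneSensor: dragging the sphere activates only INNER_SENSOR, while OUTER_SENSOR only reacts to drags that start on the box.

    #VRML V2.0 utf8
    # Nested groups, each with its own drag sensor; the inner sensor
    # takes precedence for the shapes in the inner group.
    Group {
      children [
        DEF OUTER_SENSOR PlaneSensor { }
        DEF OUTER Transform {
          children [ Shape { geometry Box { } } ]
        }
        Group {
          children [
            DEF INNER_SENSOR PlaneSensor { }
            DEF INNER Transform {
              translation 3 0 0
              children [ Shape { geometry Sphere { } } ]
            }
          ]
        }
      ]
    }
    ROUTE OUTER_SENSOR.translation_changed TO OUTER.set_translation
    ROUTE INNER_SENSOR.translation_changed TO INNER.set_translation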