HowTos
This page contains suggestions, code snippets and tips that are not elaborate enough to require their own page.
Added: ver. 4.2 - 22 August 2013
Question:
I know I can use the isButtonDown and isButtonUp event methods to check for button presses and releases, but what if I want to check the current button state for my wand / game controller / mouse?
If you want your code to work both with a mouse and a wand, the best way is to set a flag and process it in the update function:
button1Pressed = False

def onEvent(e):
    global button1Pressed
    if(e.isButtonDown(Event.Button1)): button1Pressed = True
    elif(e.isButtonUp(Event.Button1)): button1Pressed = False

def onUpdate(frame, time, dt):
    if(button1Pressed): print("Button 1 is currently pressed")
You can also use the isFlagSet method of an event to see if a button or key is currently in the pressed state:
def onEvent(e):
    if(e.isFlagSet(Event.Button1)): print("button1Pressed")
But this is more device-dependent, because it depends on the frequency at which the input devices send out events. The first approach is the most reliable.
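The flag-latching approach can be sketched outside of omegalib with a stub event class and a simulated event/update loop (StubEvent and the loop below are illustrative stand-ins, not the omegalib API):

```python
# Minimal stand-in for the event-flag pattern: the event callback latches
# the button state, and the per-frame update callback polls the flag.
class StubEvent:
    def __init__(self, down=False, up=False):
        self._down = down
        self._up = up
    def isButtonDown(self):
        return self._down
    def isButtonUp(self):
        return self._up

button1Pressed = False

def onEvent(e):
    # Latch the state so onUpdate can poll it on every frame,
    # regardless of how often the device sends events.
    global button1Pressed
    if e.isButtonDown():
        button1Pressed = True
    elif e.isButtonUp():
        button1Pressed = False

def onUpdate():
    return button1Pressed

# Simulate: press, two idle frames, then release.
onEvent(StubEvent(down=True))
states = [onUpdate(), onUpdate()]
onEvent(StubEvent(up=True))
states.append(onUpdate())
print(states)  # [True, True, False]
```

Note how the flag stays True across frames even though only one "down" event arrived; this is exactly what polling isFlagSet cannot guarantee when events are infrequent.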
Added: ver. 4.2 - 22 August 2013
Question:
What is a good way to refer back to an Actor given a picked object? Let's say I write an Actor that encapsulates a finite state machine and controls a scene node, e.g. moves it around the scene. Now let's say I pick that scene node and decide I want to move it somewhere. It would be nice if I could alert the Actor that it's being picked, so it can act accordingly.
I would keep the actors in a python dictionary, giving each actor the same name as the sceneNode they control:
# Actor dictionary
actors = {}

# create a custom actor class
class Pickable(Actor):
    node = None
    def __init__(self):
        # create something
        self.node = SphereNode.create(1, 2)
        self.node.setSelectable(True)
        # NOTE: each node has a unique name by default. If you want, you
        # can re-name the node before this line doing self.node.setName('blahblah')
        actors[self.node.getName()] = self
    def picked(self):
        print("Can't touch this!")
#[...]
def onObjectSelected(node, distance):
    if(node != None and node.getName() in actors):
        actors[node.getName()].picked()

def onEvent():
    e = getEvent()
    # When the confirm button is pressed:
    if(e.isButtonDown(Event.Button5)):
        r = getRayFromEvent(e)
        if(r[0]): querySceneRay(r[1], r[2], onObjectSelected)

setEventFunction(onEvent)
Another way is to override the onEvent method of Actor, and check there whether the sceneNode the actor owns has been picked, as in this example: https://github.com/uic-evl/omegalib/blob/master/examples/python/actor.py
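The dictionary lookup above can be sketched with plain Python stand-ins (StubNode and PickableActor below are simplified placeholders for the omegalib classes, not the real API):

```python
# Sketch of the actor-registry pattern: each actor registers itself under
# the unique name of the node it controls, so a picked node can be routed
# back to its owning actor with a dictionary lookup.
actors = {}

class StubNode:
    _counter = 0
    def __init__(self):
        # Give each node a unique default name, as omegalib nodes have.
        StubNode._counter += 1
        self._name = "node%d" % StubNode._counter
    def getName(self):
        return self._name

class PickableActor:
    def __init__(self):
        self.node = StubNode()
        # Register this actor under its node's name.
        actors[self.node.getName()] = self
        self.pickCount = 0
    def picked(self):
        self.pickCount += 1

def onObjectSelected(node):
    # Route the picked node back to the actor that owns it.
    if node is not None and node.getName() in actors:
        actors[node.getName()].picked()

a = PickableActor()
onObjectSelected(a.node)
print(a.pickCount)  # 1
```

The design choice here is to use the node name as the dictionary key because omegalib guarantees it is unique per node; any other stable identifier would work equally well.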
Added: ver. 3.3 - 29 January 2013
In some applications you want to have a light attached to the main camera: when the camera moves and rotates, the light should follow. This is very easy to implement, given that the Camera class derives from SceneNode and can have children attached to it:
# Create a full white headlight
headlight = Light.create()
headlight.setColor(Color("white"))
headlight.setEnabled(True)
getDefaultCamera().addChild(headlight)
Remember the light will be attached by default at the camera origin: inside a VR system this does not correspond to the head position of the tracked user. If you want to take the head offset into account, you can make the light follow the head trackable object. Add the following line:
headlight.followTrackable(headTrackableId)
Where headTrackableId is the integer Id of the head trackable (check your VR system config file or tracker configuration for this).
Note that you need to attach the light to the camera AND to the head trackable to have a full headlight. Without attaching the light to the camera, the light will not move when navigating in the scene.
Added: 2 September 2013
On Windows and OSX, make sure you are launching orun from the bin directory, or that you have the path to the bin directory added to your PATH environment variable.
For instance, do this:
> cd ~/omegalib/build/bin
> orun -s ~/omegalib/core/examples/python/planes.py
Instead of this:
> cd ~/omegalib/core/examples/python
> ~/omegalib/build/bin/orun -s planes.py
Added: ver. 4.1-alpha4 - 8 Jul 2013
There can be lots of reasons for this: osgEarth plugins not being in the library path, misconfigured proxies, etc. One particularly tricky reason is that osgEarth does not like to open .earth files when no graphics context is available. So, if you are running on a cluster with a headless master, your script will stall during model loading. A simple way to solve this is to modify your program or script to run only on cluster slave nodes, for instance using isMaster in python:
if(not isMaster()):
    # put osgEarth model loading code here
This trick may be helpful in other scenarios: running all your application code within a not isMaster() block will basically set up the master instance to be just a lightweight synchronization controller for all the display nodes that run the actual application.
Added: ver. 4.0 - 20 June 2013
doing git clone https://github.com/febret/omegalib.git omegalib --recursive
gives this in reply:
Cloning into omegalib...
error: SSL certificate problem, verify that the CA cert is OK. Details:
error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed while accessing https://github.com/febret/omegalib.git/info/refs
A complete answer to this problem can be found here: http://stackoverflow.com/questions/3777075/ssl-certificate-rejected-trying-to-access-github-over-https-behind-firewall
A quick solution is to ignore ssl certificate verification just for this repository clone:
env GIT_SSL_NO_VERIFY=true git clone https://github.com/febret/omegalib.git omegalib --recursive
Added: ver. 3.2 - 20 December 2012
Cameras in omegalib (and in other VR toolkits) work a little differently compared to classic graphic engines. There are two things that influence the final camera view:
- The camera position and orientation, which influence the view transform. They determine the conversion between the physical space of your system and the space of your virtual world.
- The head position and orientation, which influence both the view and the projection transforms. They represent the position of the observer in the physical space. Head transform, camera transform and display geometry all contribute to determining what will appear on the screens.
"In a VR system, how do I convert my trackable object position / orientation into world coordinates?"
Added: ver. 3.2 - 20 December 2012
Tracked objects like stereo glasses, wands etc. generate events with positions and orientations in sensor space. This space usually corresponds to the physical space used to describe the VR system structure. For instance, in CAVE2 both sensor space and VR system space consider the origin to be the center of the sphere (on the ground), with Z pointing forward.
When navigating within the scene, you move the camera to a different position and orientation. Consequently, tracked object positions do not make sense anymore, until you convert them to this new, "virtual world" space defined by the camera transformations. Luckily, the Camera class comes with methods that do exactly this.
Given a Vector3f trackerPosition, you can convert it to the world reference frame doing (in C++):

Camera* cam = Engine::instance()->getDefaultCamera(); // equivalent to the getDefaultCamera python call.
Vector3f worldPosition = cam->localToWorldPosition(trackerPosition);
// use cam->localToWorldOrientation(orientation) to convert an orientation instead.
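Conceptually, localToWorldPosition rotates the tracker-space position by the camera orientation and then translates it by the camera position. A simplified sketch of that math, using a yaw-only rotation instead of omegalib's full quaternion transform (function and parameter names here are illustrative):

```python
import math

# Simplified local-to-world position conversion: rotate the tracker-space
# position by the camera orientation, then translate by the camera position.
# The orientation here is a yaw angle only; omegalib uses full quaternions.
def local_to_world_position(local_pos, cam_pos, cam_yaw_radians):
    x, y, z = local_pos
    c, s = math.cos(cam_yaw_radians), math.sin(cam_yaw_radians)
    # Rotate around the vertical (Y) axis.
    rx = c * x + s * z
    rz = -s * x + c * z
    return (cam_pos[0] + rx, cam_pos[1] + y, cam_pos[2] + rz)

# A tracker 2 meters along -Z in sensor space, with the camera navigated
# to (10, 0, 0) and rotated 90 degrees:
world = local_to_world_position((0, 0, -2), (10, 0, 0), math.pi / 2)
print([round(v, 6) + 0.0 for v in world])  # [8.0, 0.0, 0.0]
```

The same two-step recipe (rotate, then translate) is what the Camera methods apply, just with the camera's actual quaternion orientation.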
"I'm trying to load an .obj file. The model does not show up, and the console gets spammed with a lot of CullVisitor errors"
Added: ver. 3.7 - 29 April 2013
This is likely due to your obj model using per-vertex colors, which the OpenSceneGraph obj loader does not handle well. The typical solution is to use per-face colors instead. An easy way to do this is to open your mesh in a tool like MeshLab and use Filters > Color Creation and Processing > Vertex Colors to Face Colors. When saving the new obj, make sure you disable colors in the vertex attributes section.
"My model disappears when I get close / far from it, or it just shows up on a few tiles of my tiled display"
Updated: ver. 3.7 - 3 May 2013
This is likely a near/far clipping plane issue. It usually shows up when you are drawing big models (i.e. when using osgEarth).
Omegalib tries to set up good clipping planes for you but does not handle every situation well. You can manually specify the near and far clipping planes using the setNearFarZ function in python (or the equivalent DisplaySystem method in C++). A good test call is something like setNearFarZ(0.1, 1000000). Note that if you have a big near-to-far z ratio, you may get z-fighting issues for objects close in depth. Omegalib uses a 24-bit z buffer by default.
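To see why a big near-to-far ratio causes z fighting, you can estimate the smallest depth difference a 24-bit buffer can resolve at a given eye distance (a back-of-the-envelope approximation for a standard perspective depth buffer, not an omegalib function):

```python
# Rough estimate of depth-buffer resolution at eye distance z for a
# perspective projection with near plane n, far plane f and a b-bit
# depth buffer. Derived from the hyperbolic depth mapping:
#   dz ~= z^2 * (f - n) / (f * n * 2^b)
def depth_resolution(z, near, far, bits=24):
    steps = 2 ** bits
    return z * z * (far - near) / (far * near * steps)

# With setNearFarZ(0.1, 1000000): at 100 m from the eye, adjacent depth
# values are roughly 6 mm apart, so surfaces closer than that in depth
# will z-fight.
print(depth_resolution(100.0, 0.1, 1e6))
```

Because the resolution degrades with the square of the distance but improves linearly with the near plane, pushing the near plane out (or using depth partitioning, below) is far more effective than shrinking the far plane.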
If your application is based on OpenSceneGraph (using cyclops for instance) you can also turn on depth partitioning to handle scenes with large Z intervals (i.e drawing planetary-scale data at a distance, plus small objects close to the camera).
For example, to enable depth partitioning and create two partitions, one for near objects (z from 0.1 to 1000) and one for far objects (z from 1000 to 100000000):
# Specify the overall depth range
setNearFarZ(0.1, 100000000)
# Give a render hint to the OpenSceneGraph module (if loaded): create a depth partition at 1000.
queueCommand(':depthpart on 1000')
Omegalib supports builds in Visual Studio 2008 and 2010. Both 32 and 64-bit builds should work, but currently only Visual Studio 2010 32-bit builds are tested regularly.
"the omegalib build fails with error C3859: virtual memory range for PCH exceeded; please recompile with a command line option of '-Zm120' or greater"
Updated: ver. 4.2 - 16 August 2013
Just add the suggested option in the Visual Studio project property page: using /Zm500 should fix the problem.
"When building omegalib from scratch with equalizer enabled, the build fails with the compiler complaining about not finding lexer.cpp"
This error is due to some incorrect dependencies in the equalizer project, combined with Visual Studio spawning multiple compiler processes on multiprocessor machines. Simply ignore the error and build the solution again. After one or two additional builds the problem should go away, and it won't come back unless you rebuild omegalib from scratch.
For DirectInput to work, you have to make sure the Visual Studio compiler searches for DirectInput headers from the windows DDK.
*^Make sure the WinDDK include directory is the first one in the include directories section under Project Options >> Projects and Solutions >> VC++ Directories^*
Similar to the 'lexer.cpp' error. Sometimes building again is all that is needed to solve the problem. Otherwise, to make sure it goes away, build equalizer separately (see the picture), then do the same for the omega project.
*^How to build equalizer separately to solve this error.^*
Some of the libraries used internally by omegalib need the Visual Studio 2008 Debug CRT to run. To solve this issue install Visual C++ Express 2008 (post link)
The cause of this error is unclear, and so far it has happened on only one virtual machine install. Equalizer and omegalib actually build fine despite this error, but Visual Studio will keep complaining about it.
A workaround is to exclude the equalizer project from the build after building the entire project the first time (see the Speeding up builds tip in the general tips & tricks section).
Make sure that the path to your source and build directories does not contain spaces.
For example:
C:/Users/Me/omegalib/build
is fine
C:/Users/Alessandro Febretti/omegalib/build
is not.
After the first build, it is possible to speed up successive builds by unloading projects in the 3rdparty folder. These dependencies need to be built only once per configuration, so they can be disabled after the first build. This way, the build system won't have to go through them to check if they are up to date.
*^To unload third party projects, right-click on the 3rdparty folder and select "Unload Projects in Solution Folder"^*