‘I have electrocuted my finger,’ Arkwood said casually, ‘make us a cup of tea, would ya.’
No problem. I told him to put on the Oculus Rift virtual reality headset. I started my Microsoft Visual Studio C++ application (built with the OpenGL graphics library and the Oculus SDK for Windows) and he took a slurp of virtual cha.
Here is the video of Arkwood consuming his refreshment:
But how to hold a cup in VR?
First we need the cup. The tutorial Another Coffee Cup (Easy Handle) helped me mould a vessel in Blender (the 3D creation suite).
Next we need to detect collision with the cup (which has been imported from Blender into the C++ app). My post Zip wire with OpenGL provides a C++ struct for basic Axis-Aligned Bounding Box (AABB) collision detection, which serves for reaching the mug’s handle on the -z axis.
To put the cup into our virtual hands we need to track the Oculus Touch controllers. Here’s how we get the position and orientation of the left hand:
double displayMidpointSeconds = ovr_GetPredictedDisplayTime(session, frameIndex);
ovrTrackingState trackState = ovr_GetTrackingState(session, displayMidpointSeconds, ovrTrue);
ovrPosef leftHandPose = trackState.HandPoses[ovrHand_Left].ThePose;

Vector3f leftHandPosition = pos + leftHandPose.Position;
Quatf leftHandOrientation = leftHandPose.Orientation;
Notice that we add our hand offset position to the camera position in world space. The orientation we take from the hand as is.
So once our code detects one of our hands colliding with the cup, we simply update the cup’s mesh position and rotation, placing it in the appropriate hand:
MyModels[i]->MyMeshes->Pos = leftHandPosition;
MyModels[i]->MyMeshes->Rot = leftHandOrientation;
But one thing stumped me. The cup rotates and positions itself lovely when I stand facing it. I can flex my arms and bob my head and the cup stays perfectly in my paw. But if I start to rotate my body, the cup does not move with me.
The fix ended up being simple. One thing you read with OpenGL is that the camera is not moving about the room, rather the room is moving about the camera. So when it came to rendering the cup I created a special view for it, devoid of any yaw. The cup handle remained in my grip as I rotated about on the spot and then went for a wander.
Arkwood took the Rift off his head. ‘My, that was a braw cup of tea!’ he exclaimed.
And he left, clutching two fizzing wires.
One other thing. Detecting a hand colliding with the cup handle works great while I am facing the -z axis. But what about when I grab the cup from the side, or turn around and snatch at it from behind? The detection code needs the following help:
Matrix4f rollPitchYaw = Matrix4f::RotationY(yaw);

Vector3f leftHandPosition = pos + rollPitchYaw.Transform(leftHandPose.Position);
Vector3f rightHandPosition = pos + rollPitchYaw.Transform(rightHandPose.Position);

MyCups[i]->Detect(leftHandPosition, rightHandPosition);
Our camera yaw is baked into a rollPitchYaw rotation matrix, so we can transform each hand offset into world space before adding it to the camera position. Now we can detect our hands colliding with the cup from all directions.
Indeed, the rollPitchYaw matrix can provide an alternative to the special yaw-free view I created earlier. We simply update the cup’s mesh position and rotation like so, and the cup handle remains in my grip as I rotate on the spot:
MyModels[j]->MyMeshes->Pos = pos + rollPitchYaw.Transform(leftHandPose.Position);
MyModels[j]->MyMeshes->Rot = Quatf(rollPitchYaw * Matrix4f(leftHandPose.Orientation));