
In my last post Arkwood got all gushy over his new Oculus Touch controllers. The controllers grant my buddy virtual hands when he’s playing games on the Oculus Rift virtual reality headset. And the Touch controllers look and feel damn swell too:

[Image: the Oculus Touch controllers]

But how easy is it to program the left and right hand controllers with C++ code? Dead easy! Let’s put a few scraps together… it will be a first step towards rendering our own pair of virtual hands and manipulating objects with virtual fingers.

I’ll amend a virtual world from a previous post. Now when Arkwood holds the left Touch controller at a certain height and presses the hand trigger, a cone will display. Not only that, the left controller will also vibrate, providing haptic feedback.

The Oculus PC SDK Developer Guide has detail on the Oculus Touch controllers, including code samples on hand tracking, button state and haptic feedback.

Getting the hand pose of our left Touch controller is a cinch:

// determine whether left hand high
double displayMidpointSeconds = ovr_GetPredictedDisplayTime(session, frameIndex);
ovrTrackingState trackState = ovr_GetTrackingState(session, displayMidpointSeconds, ovrTrue);

ovrPosef leftHandPose = trackState.HandPoses[ovrHand_Left].ThePose;

bool leftHandHigh = false;
if (leftHandPose.Position.y > 1.0f) { // position is in metres, relative to the tracking origin
	leftHandHigh = true;
}

The y component of the pose's position tells us how high the controller is being held; here, anything above a metre counts as high.

It is also straightforward to check whether we are pressing the hand trigger of the left Touch controller:

// determine whether left hand trigger pressed
ovrInputState inputState;
bool leftHandTriggerPressed = false;
				
if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &inputState))) {

	if (inputState.HandTrigger[ovrHand_Left] > 0.5f) { // analog value from 0.0 to 1.0
		leftHandTriggerPressed = true;
	}
}
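Since HandTrigger is an analog value between 0.0 and 1.0, a single 0.5f threshold can flicker on and off while the trigger hovers near the midpoint. A small hysteresis latch smooths that out; this is just a sketch of mine, not part of the SDK, and the 0.6f/0.4f thresholds are my own choice:

```cpp
// Hysteresis latch for an analog trigger: engage above 0.6, release below 0.4,
// so a trigger hovering around a single midpoint threshold cannot flicker.
class TriggerLatch {
public:
	bool update(float value) {
		if (!pressed && value > 0.6f) pressed = true;
		else if (pressed && value < 0.4f) pressed = false;
		return pressed;
	}
private:
	bool pressed = false;
};
```

Feeding inputState.HandTrigger[ovrHand_Left] through update each frame would give us a stable leftHandTriggerPressed.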

So if the left Touch controller is held high and its hand trigger is pressed, we can vibrate the left controller (providing haptic feedback):

// set haptic feedback vibrations
if (leftHandHigh && leftHandTriggerPressed) {
	ovr_SetControllerVibration(session, ovrControllerType_LTouch, 0.0f, 1.0f);
}
else {
	ovr_SetControllerVibration(session, ovrControllerType_LTouch, 0.0f, 0.0f);
}

Notice that the fourth parameter of ovr_SetControllerVibration, the amplitude, is set to 1.0f to vibrate the controller (and back to 0.0f to stop it).
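A constant amplitude of 1.0f buzzes at full strength for as long as the condition holds. If we wanted a gentler pulse, we could fade the amplitude out over time and pass the result as that fourth parameter. A little sketch of my own, assuming a 90-frame fade (one second at the Rift's 90 Hz):

```cpp
// Linearly fade haptic amplitude from 1.0 down to 0.0 over fadeFrames frames.
float pulseAmplitude(int framesSincePress, int fadeFrames = 90) {
	if (framesSincePress >= fadeFrames) return 0.0f;
	return 1.0f - static_cast<float>(framesSincePress) / fadeFrames;
}
```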

And the cone can be rendered:

// render cone
if (leftHandHigh && leftHandTriggerPressed) {
	Meshes[i]->Render(view, proj);
}

Here’s the green cone in the virtual world:

[Image: the green cone rendered in the virtual world]

Every time I hold the left Touch controller high and press its hand trigger, the cone displays and the controller vibrates.

As if Arkwood was not already delirious enough with his Touch controllers, his mind spun with all the virtual possibilities when programming them with C++. Make them vibrate, make them vibrate! he screamed, sticking both controllers in his underpants (thoughtfully, one at the groin and one against his bottom cheeks).

I shut down my PC and bade him farewell. It will all end in disaster.

Ciao!

P.S.

We can also program the thumbsticks to let us stroll about our virtual world:

// set position and orientation via thumbsticks
Vector2f leftStick = inputState.Thumbstick[ovrHand_Left];
Vector2f rightStick = inputState.Thumbstick[ovrHand_Right];

// move in the direction we are facing; the deflection is squared for finer control near centre
Pos2 += Matrix4f::RotationY(Yaw).Transform(
	Vector3f(leftStick.x * leftStick.x * (leftStick.x > 0 ? 0.1f : -0.1f), 0, leftStick.y * leftStick.y * (leftStick.y > 0 ? -0.1f : 0.1f)));

if (rightStick.x > 0) Yaw -= 0.01f; // push right to turn right
if (rightStick.x < 0) Yaw += 0.01f; // push left to turn left

Here the left Touch controller thumbstick updates our position, letting us move forward, backward, left and right (squaring the deflection means small pushes move us slowly). And the right Touch controller thumbstick updates our orientation, letting us rotate left and right.
