Arkwood – my scrawny and feverishly sordid Belgian buddy – said it again:

‘Build me a synthetic world! I must escape this natural hell!’ Strangely poetic for him.

You see, Daphne – the plump spotty girl who works down the chippy – had dumped him again.

Now, for many a month and year, I have been building virtual reality worlds for Arkwood to escape into. I have built augmented reality apps too. In truth, it’s all mixed reality as it all contains elements of the natural and synthetic worlds.

But how exactly do we go about creating a virtual world? Hmm. We’ll need some hardware. And some input and output (and maybe some stuff in between).

So with the C++ programming language and a Windows 10 PC at my disposal, let’s go!

Hardware

I will be using the Oculus Rift virtual reality headset, with built-in microphone and on-ear headphones:

[Image: Oculus Rift headset]

I will be using the Oculus Touch Controllers for my hands:

[Image: Oculus Touch Controllers]

The Rift and Touch movement will be tracked by a pair of Oculus Sensors:

[Image: Oculus Sensors]

And let’s chuck in a Logitech HD 720p webcam:

[Image: Logitech HD 720p webcam]

Input

A human has 5 traditional senses: sight, hearing, touch, taste, smell.

Here’s how we can get them into our virtual world (but not taste and smell, as I’ve not got the hardware for that yet!)…

Sight

The Oculus Rift does not have a stereo camera mounted on the headset, so we are going to have to provide sight to our virtual world with a basic webcam.

Oculus Rift: watching you watching me shows how OpenCV computer vision can capture images from a webcam and put them on a cube in the virtual world.

[Image: webcam feed textured onto a cube in the Rift]
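
A minimal OpenCV sketch of the capture side (the cube rendering itself lives in the referenced post) might look like this:

```cpp
#include <opencv2/opencv.hpp>

// grab webcam frames with OpenCV, ready to be uploaded as a texture
int main()
{
    cv::VideoCapture webcam(0); // first attached camera
    if (!webcam.isOpened()) return 1;

    cv::Mat frame;
    while (webcam.read(frame))
    {
        // frame.data now holds BGR pixels, e.g. for glTexImage2D
        cv::imshow("webcam", frame);
        if (cv::waitKey(30) >= 0) break;
    }
    return 0;
}
```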

Next up, in OpenCV motion detection in Oculus Rift the webcam alerts us to motion detected in the natural world by displaying a cone in the virtual world.

[Image: a cone appears in the Rift when the webcam detects motion]

Finally, in OpenCV motion detection in Oculus Rift (Mark II) I animate the cone. The more motion detected in the webcam, the higher the cone rises into the air.
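Frame differencing is one way to measure that motion. Here's a hedged sketch – the threshold values are illustrative, not the post's exact numbers:

```cpp
#include <opencv2/opencv.hpp>

// measure motion between consecutive webcam frames; the resulting
// ratio could drive the height of the cone in the virtual world
int main()
{
    cv::VideoCapture webcam(0);
    cv::Mat current, previous, grey, diff;

    webcam.read(previous);
    cv::cvtColor(previous, previous, cv::COLOR_BGR2GRAY);

    while (webcam.read(current))
    {
        cv::cvtColor(current, grey, cv::COLOR_BGR2GRAY);
        cv::absdiff(grey, previous, diff);              // pixel changes
        cv::threshold(diff, diff, 25, 255, cv::THRESH_BINARY);

        // fraction of pixels that moved: 0.0 (still) to 1.0 (chaos)
        double motion = cv::countNonZero(diff) / double(diff.total());
        // e.g. coneHeight = motion * maxHeight;

        grey.copyTo(previous);
    }
    return 0;
}
```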

Our virtual reality is now able to see the natural world!

Hearing

The Oculus Rift headset has a built-in microphone, to provide hearing to our virtual world.

Microsoft Speech API (SAPI) with C++ shows how Microsoft Speech API (SAPI) can capture our voice, granting speech recognition to our virtual world.
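Here's a minimal dictation sketch using the standard SAPI shared-recognizer pattern (the post's own grammar set-up may differ):

```cpp
#include <atlbase.h>
#include <sapi.h>
#include <sphelper.h>
#include <iostream>

// the shared recognizer listens on the default microphone
// (e.g. the Rift's) and prints the first phrase it hears
int main()
{
    ::CoInitialize(NULL);
    {
        CComPtr<ISpRecognizer> recognizer;
        CComPtr<ISpRecoContext> context;
        CComPtr<ISpRecoGrammar> grammar;

        recognizer.CoCreateInstance(CLSID_SpSharedRecognizer);
        recognizer->CreateRecoContext(&context);
        context->SetNotifyWin32Event();
        context->SetInterest(SPFEI(SPEI_RECOGNITION), SPFEI(SPEI_RECOGNITION));

        context->CreateGrammar(0, &grammar);
        grammar->LoadDictation(NULL, SPLO_STATIC);
        grammar->SetDictationState(SPRS_ACTIVE);

        // block until the engine recognises a phrase
        context->WaitForNotifyEvent(INFINITE);

        CSpEvent event;
        if (SUCCEEDED(event.GetFrom(context)) && event.eEventId == SPEI_RECOGNITION)
        {
            wchar_t* text = NULL;
            event.RecoResult()->GetText(SP_GETWHOLEPHRASE, SP_GETWHOLEPHRASE,
                                        TRUE, &text, NULL);
            std::wcout << L"Heard: " << text << std::endl;
            ::CoTaskMemFree(text);
        }
    }
    ::CoUninitialize();
    return 0;
}
```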

Of course, not all audio from the natural world will be spoken words – it may be the sound of passing cars or our favourite concerto on the radio. We can inspect the raw data and let it wash into our virtual world.

Our virtual reality is now able to hear the natural world!

Touch

The Oculus Touch Controllers provide touch to our virtual world.

Launch of Oculus Touch controllers shows just how stupendously the Touch can capture our hands and put them into the virtual world.

[Image: hands in Dead and Buried, tracked by the Touch controllers]

Oh, our hands have turned skeletal!

Next, in Oculus Touch controllers with C++ I show how the height of our hands and the flexing of our fingers in the natural world can mould our virtual world. We can also use the Touch to move about in virtual landscapes.
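The LibOVR calls for that sort of polling look roughly like this (a sketch, not the post's full code):

```cpp
#include <OVR_CAPI.h>

// poll hand poses and trigger state each frame (LibOVR, PC SDK 1.x)
void PollTouch(ovrSession session)
{
    // hand position in tracking space; 0.0 requests the most recent state
    ovrTrackingState ts = ovr_GetTrackingState(session, 0.0, ovrTrue);
    float leftHandHeight = ts.HandPoses[ovrHand_Left].ThePose.Position.y;

    // finger flex: grip and index triggers each report 0.0 to 1.0
    ovrInputState input;
    if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &input)))
    {
        float leftGrip  = input.HandTrigger[ovrHand_Left];
        float leftIndex = input.IndexTrigger[ovrHand_Left];
        // e.g. raise terrain with leftHandHeight, squash a cube with leftGrip
    }
}
```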

Our virtual reality is now able to feel the touch of the natural world!

Analysis (the stuff in the middle)

So, we have discussed the input of sight, hearing and touch to a virtual world.

And we are just about to discuss the output of sight, hearing and touch from a virtual world.

But what options do we have to make sense of the input, before we create the output? Well, in truth some of the input and output technologies have their own built-in intelligence – for example, OpenCV computer vision has many algorithms to understand the objects and behaviour in a set of webcam images. And, of course, the C++ language we are using can construct logic to oil our virtual world.

In HTTP request and HTML parsing with C++ I show how we can retrieve and inspect online data, letting it pepper our virtual world. Online data is a sixth sense, a pipe into the up-to-the-moment shenanigans of planet Earth and beyond.
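One way to fetch a page in C++ on Windows is WinHTTP – a sketch, with example.com as a placeholder host (the post may use a different library):

```cpp
#include <windows.h>
#include <winhttp.h>
#include <iostream>
#include <string>
#pragma comment(lib, "winhttp.lib")

// fetch a page over HTTP and return the raw HTML
std::string Fetch()
{
    std::string body;
    HINTERNET session = WinHttpOpen(L"VRDemo/1.0", WINHTTP_ACCESS_TYPE_DEFAULT_PROXY,
                                    WINHTTP_NO_PROXY_NAME, WINHTTP_NO_PROXY_BYPASS, 0);
    HINTERNET connect = WinHttpConnect(session, L"example.com",
                                       INTERNET_DEFAULT_HTTP_PORT, 0);
    HINTERNET request = WinHttpOpenRequest(connect, L"GET", L"/", NULL,
                                           WINHTTP_NO_REFERER,
                                           WINHTTP_DEFAULT_ACCEPT_TYPES, 0);

    if (WinHttpSendRequest(request, WINHTTP_NO_ADDITIONAL_HEADERS, 0,
                           WINHTTP_NO_REQUEST_DATA, 0, 0, 0) &&
        WinHttpReceiveResponse(request, NULL))
    {
        DWORD available = 0;
        while (WinHttpQueryDataAvailable(request, &available) && available > 0)
        {
            std::string chunk(available, '\0');
            DWORD read = 0;
            WinHttpReadData(request, &chunk[0], available, &read);
            body.append(chunk, 0, read);
        }
    }
    WinHttpCloseHandle(request);
    WinHttpCloseHandle(connect);
    WinHttpCloseHandle(session);
    return body; // parse the HTML here, e.g. hunt for a tag of interest
}

int main()
{
    std::cout << Fetch().substr(0, 200) << std::endl; // first taste of the page
}
```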

There is also artificial intelligence to consider, where we can use neural networks, fuzzy logic or reinforcement learning to influence our virtual world.

Our virtual reality is now wise!

Output

We’re back with our 5 traditional senses again: sight, hearing, touch, taste, smell.

Here’s how we can get them out of our virtual world (but not taste and smell, that food just ain’t on the table yet!)…

Sight

The Oculus Rift headset has two eye lenses (most people have two eyes, few have three), to let us see our virtual world.

Oculus Rift PC SDK using OpenGL shows how we can start developing virtual reality with the Oculus Rift, using OpenGL graphics library to render a simple triangle in front of our eyes.

[Image: a simple triangle rendered in the Rift (OculusRoomTiny GL sample)]
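
Getting that far starts with a LibOVR session. A minimal start-up sketch (error handling trimmed; the swap chain and render loop live in the referenced post):

```cpp
#include <OVR_CAPI.h>

// minimal LibOVR session start-up
int main()
{
    ovr_Initialize(nullptr);

    ovrSession session;
    ovrGraphicsLuid luid;
    ovr_Create(&session, &luid);

    // per-eye render target size for the connected headset
    ovrHmdDesc hmd = ovr_GetHmdDesc(session);
    ovrSizei eyeSize = ovr_GetFovTextureSize(session, ovrEye_Left,
                                             hmd.DefaultEyeFov[ovrEye_Left], 1.0f);

    // ...create a GL context and an ovrTextureSwapChain of eyeSize, then each
    //    frame render both eye views and hand them over with ovr_SubmitFrame...

    ovr_Destroy(session);
    ovr_Shutdown();
    return 0;
}
```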

In Add OpenGL Texture to Rift SDK I texture the triangle, adding an image to it.

[Image: the triangle textured with an image of Rodger Saltwash]
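
The texturing itself is standard OpenGL. A sketch, assuming the pixel data has already been loaded by some image loader:

```cpp
#include <windows.h>
#include <GL/gl.h>

// upload RGBA pixels as an OpenGL texture to paint on the triangle
GLuint CreateTexture(int width, int height, const unsigned char* rgba)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    return tex; // bind this texture when drawing the triangle
}
```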

But what if we want to put more complex objects in front of our eyes, objects that we have modelled in a 3D creation suite? Blender shapes in Oculus Rift shows how spheres, cylinders and a giant man can infiltrate our natural world.

[Image: a sphere and cylinder rendered in the Rift]

Finally, in Blender textures in Oculus Rift we add some texture to those shapes and the giant man.

[Image: textured shapes and the giant man in the Rift]

Our virtual reality is now able to give the natural world some vision!

Hearing

The Oculus Rift headset has on-ear headphones, to let us hear our virtual world.

3D Audio with Oculus Rift shows how we can fill our ears with 3D audio, sounds that can come from any direction and distance in our virtual world.

In Oculus spatializer plugin for FMOD a virtual room pumps creaky and squishy 3D sounds into our head, startling us and making us cry. A few dabs of C++ code and some audio middleware is all it took.
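The core FMOD calls for a 3D sound look something like this – creak.wav is a placeholder asset, and the Oculus spatializer plugin set-up from the post is omitted:

```cpp
#include <fmod.hpp>
#include <thread>
#include <chrono>

// play a 3D-positioned sound with FMOD's low-level API
int main()
{
    FMOD::System* system = nullptr;
    FMOD::System_Create(&system);
    system->init(32, FMOD_INIT_NORMAL, nullptr);

    FMOD::Sound* sound = nullptr;
    system->createSound("creak.wav", FMOD_3D, nullptr, &sound); // placeholder asset

    FMOD::Channel* channel = nullptr;
    system->playSound(sound, nullptr, false, &channel);

    // place the creak behind and above the listener's head
    FMOD_VECTOR pos = { 0.0f, 1.0f, -2.0f };
    FMOD_VECTOR vel = { 0.0f, 0.0f, 0.0f };
    channel->set3DAttributes(&pos, &vel);

    // pump the mixer for ten seconds; in a real app this runs once per frame
    for (int i = 0; i < 600; ++i)
    {
        system->update();
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    system->release();
    return 0;
}
```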

Our virtual world can also talk to us. In Microsoft Speech API (SAPI) with C++ we use the text-to-speech capabilities of SAPI to say “The cone has been moved, sir”.
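The text-to-speech side of SAPI is pleasingly terse. A minimal sketch:

```cpp
#include <atlbase.h>
#include <sapi.h>

// speak a line from the virtual world through SAPI text-to-speech
int main()
{
    ::CoInitialize(NULL);
    {
        CComPtr<ISpVoice> voice;
        if (SUCCEEDED(voice.CoCreateInstance(CLSID_SpVoice)))
            voice->Speak(L"The cone has been moved, sir", SPF_DEFAULT, NULL);
    }
    ::CoUninitialize();
    return 0;
}
```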

Our virtual reality is now able to give the natural world some sounds!

Touch

The Oculus Touch Controllers let us feel the touch of our virtual world.

Oculus Touch controllers with C++ shows how the Touch can vibrate in our hands – the virtual world is dispensing haptic feedback.
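A haptics sketch with LibOVR (PC SDK 1.x; early SDKs only accept vibration frequencies of 0.0, 0.5 or 1.0):

```cpp
#include <OVR_CAPI.h>

// buzz the left Touch controller, then stop it
void Buzz(ovrSession session)
{
    // full frequency, half amplitude
    ovr_SetControllerVibration(session, ovrControllerType_LTouch, 1.0f, 0.5f);

    // ...and later, silence it again
    ovr_SetControllerVibration(session, ovrControllerType_LTouch, 0.0f, 0.0f);
}
```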

Our virtual reality is now able to give the natural world a sense of touch!

Summary

So there you have it. Armed with an Oculus Rift, Touch, Sensors and a webcam, I can push the three senses of sight, hearing and touch in and out of a virtual world (I’ll worry about taste and smell later, goddamn you!). And we’ve even got space in the middle for stuff like online data and AI.

These are exciting times. The 2.0 of our species, where we can add a new synthetic layer to our natural world. And maybe slip into virtual reality, to leave much of the horrid, cruel globe behind. Pathos.

‘Your new world is ready!’ I announced to Arkwood, but he was not there. He had legged it down the chippy to make up with Daphne. Some men will never be free of the chaos. Just ask Rodger Saltwash.

Ciao!

P.S.

If you want to follow the referenced posts in order, try:
Oculus Rift PC SDK using OpenGL
Add OpenGL Texture to Rift SDK
Blender shapes in Oculus Rift
Blender textures in Oculus Rift
Oculus Rift: watching you watching me
OpenCV motion detection in Oculus Rift
OpenCV motion detection in Oculus Rift (Mark II)
3D Audio with Oculus Rift
Oculus spatializer plugin for FMOD
HTTP request and HTML parsing with C++
Microsoft Speech API (SAPI) with C++
Launch of Oculus Touch controllers
Oculus Touch controllers with C++
