The doorbell rang. Ring!!! Ring!!! It was Daphne, the plump spotty girl who works down the chippy. She had a battered cod and chips in hand.
‘Is Arkwood in?’ she asked, flakes of potato falling from her greasy lips.
I told her that he had gone ice skating. ‘Oh well,’ she said, and then she spied my computer and the augmented reality upon its screen.
In my previous post, Augmented Reality using OpenCV, OpenGL and Blender, I projected a cone and sphere from pieces of paper (optical glyphs, to be more precise). I was even able to rotate a cube in my post Augmented Reality with 3D object rotation.
‘What’s that?’ Daphne exclaimed, prodding an oily fat finger at my monitor.
‘Oh, I am using my webcam to display 3D book reviews.’ Indeed, it be the Amazon UK customer reviews of my books, The Inscrutable Diaries Of Rodger Saltwash and Plastic Halo (a Hong Kong adventure novel).
Here’s what Daphne saw:
The 3D book reviews are projected from a glyph, which has been integrated into the cover design of the book (notice how the reviews are scaled to the size and angle of the glyph). Each glyph is unique, meaning we can match the correct reviews to the correct book.
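The matching step can be sketched in plain Python. This is a hypothetical illustration, not the actual code behind the demo: the 3x3 patterns and the review snippets are invented, and the rotation-normalisation trick is one common way to make the same glyph match however the book is held up to the webcam.

```python
# Hypothetical glyph-to-book lookup. The patterns and review text are
# invented for illustration; they are not the real cover glyphs.
GLYPH_REVIEWS = {
    ((1, 0, 1),
     (0, 1, 0),
     (1, 1, 0)): "Reviews: The Inscrutable Diaries Of Rodger Saltwash",
    ((1, 1, 0),
     (0, 0, 1),
     (0, 1, 1)): "Reviews: Plastic Halo",
}

def rotate(pattern):
    """Rotate a square binary pattern 90 degrees clockwise."""
    return tuple(zip(*pattern[::-1]))

def canonical(pattern):
    """Normalise a pattern so the same glyph matches at any rotation."""
    rotations = [pattern]
    for _ in range(3):
        rotations.append(rotate(rotations[-1]))
    return min(rotations)

# Store keys in canonical form, so a rotated detection still matches.
LOOKUP = {canonical(p): text for p, text in GLYPH_REVIEWS.items()}

def reviews_for(detected):
    """Return the reviews for a detected pattern, or None if unknown."""
    return LOOKUP.get(canonical(detected))
```

Because each key is reduced to a canonical rotation, the webcam can see the book cover at any of the four orientations and the lookup still lands on the right set of reviews.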
‘Why do you want to do that?’ she sneered. I told her it was something to do. Keeps me busy.
‘Hmm. Well, if you see Arkwood, send him round to my flat. I want a fuck.’
And with that, she took her enormous arse down the street. I sighed. That’s the problem with our species. There’s no reward for being anything other than animal.
I ran the code on my Windows 7 PC using Python Tools for Visual Studio.
Take a peek at the previous posts for the lowdown on using:
- Blender to create 3D objects
- OpenCV computer vision to detect optical glyphs
- OpenGL graphics library to render 3D objects
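To give a flavour of the OpenCV end of things, here is a minimal sketch of reading a glyph's binary pattern from the webcam frame. It assumes the glyph has already been located and perspective-warped to a square, thresholded image (values 0 or 1); the `read_glyph` helper and the 5x5 grid size are my own assumptions for illustration, not necessarily what the previous posts use.

```python
import numpy as np

def read_glyph(warped, grid=5):
    """Sample a grid x grid binary pattern from a square, thresholded
    glyph image that has already been perspective-corrected."""
    h, w = warped.shape
    pattern = []
    for row in range(grid):
        cells = []
        for col in range(grid):
            cell = warped[row * h // grid:(row + 1) * h // grid,
                          col * w // grid:(col + 1) * w // grid]
            # A cell counts as 'on' if most of its pixels are white.
            cells.append(1 if cell.mean() > 0.5 else 0)
        pattern.append(tuple(cells))
    return tuple(pattern)

# Build a synthetic 100x100 glyph image from a known 5x5 pattern...
known = np.array([[1, 0, 1, 0, 1],
                  [0, 1, 0, 1, 0],
                  [1, 1, 0, 0, 1],
                  [0, 0, 1, 1, 0],
                  [1, 0, 0, 1, 1]])
image = np.kron(known, np.ones((20, 20)))

# ...and check the pattern reads back correctly.
print(read_glyph(image) == tuple(map(tuple, known)))  # → True
```

Once the pattern is read, it can be matched against the known glyphs for each book cover, and the corresponding 3D reviews rendered with OpenGL at the glyph's position and scale.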