One devil is plenty. But now there be devil of red fury besides. Lay me down on soft pillow and still those bulbous eyes burn into the skin. Those bulbous eyes!

Let me explain. My last post, Glyph recognition using OpenCV and Python, produced the following damnation:

But now I have recoded my 2D Augmented Reality application to do the following to boot:

  • store the glyph rotation patterns in a database (which negates the need for rotating the glyph image; see the sketch after this list)
  • rotate the substitute image (so we can see the devil upside down and shake out his illicit coins)
  • recognise more than one glyph (hence the coming of a second, more zealous beast of sin)
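
On that first point, here's a minimal sketch of how the four stored rotations can be pre-computed from a single 3x3 glyph pattern (the helper name is mine, not part of the application):

# Hypothetical helper: pre-compute the four rotation patterns for one glyph,
# so matching becomes a table lookup instead of rotating the glyph image
def build_rotation_patterns(pattern):
    # pattern is the 3x3 glyph grid flattened row by row into 9 values
    rotations = []
    current = pattern

    for _ in range(4):
        rotations.append(current)
        # rotate the grid 90 degrees: the new top row is the old right-hand column
        current = [current[2], current[5], current[8],
                   current[1], current[4], current[7],
                   current[0], current[3], current[6]]

    return rotations

# reproduces the four lists stored for the "devil" record in the glyph database below
print(build_rotation_patterns([0, 1, 0, 1, 0, 0, 0, 1, 1]))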

So let’s see the Prince of Darkness beside his fiery brother:

It be too much! I ward them off with the only bible I know.

P.S.

Here’s the main program:

import cv2
from glyphdatabase import *
from glyphfunctions import *
from webcam import Webcam

webcam = Webcam()
webcam.start()

QUADRILATERAL_POINTS = 4
SHAPE_RESIZE = 100.0
BLACK_THRESHOLD = 100
WHITE_THRESHOLD = 155

while True:

    # Stage 1: Read an image from our webcam
    image = webcam.get_current_frame()

    # Stage 2: Detect edges in image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5,5), 0)
    edges = cv2.Canny(gray, 100, 200)

    # Stage 3: Find contours (OpenCV 2.x/4.x return signature; 3.x returns an extra value first)
    contours, _ = cv2.findContours(edges, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
    contours = sorted(contours, key=cv2.contourArea, reverse=True)[:10]

    for contour in contours:

        # Stage 4: Shape check
        perimeter = cv2.arcLength(contour, True)
        approx = cv2.approxPolyDP(contour, 0.01*perimeter, True)

        if len(approx) == QUADRILATERAL_POINTS:

            # Stage 5: Perspective warping
            topdown_quad = get_topdown_quad(gray, approx.reshape(4, 2))

            # Stage 6: Border check (skip the quad if the pixel just inside its
            # top-left corner is not dark, i.e. there is no black glyph border)
            resized_shape = resize_image(topdown_quad, SHAPE_RESIZE)
            if resized_shape[5, 5] > BLACK_THRESHOLD: continue

            # Stage 7: Glyph pattern
            glyph_pattern = get_glyph_pattern(resized_shape, BLACK_THRESHOLD, WHITE_THRESHOLD)
            glyph_found, glyph_rotation, glyph_substitute = match_glyph_pattern(glyph_pattern)

            if glyph_found:

                # Stage 8: Substitute glyph
                substitute_image = cv2.imread('glyphs/images/{}.jpg'.format(glyph_substitute))
                
                for _ in range(glyph_rotation):
                    substitute_image = rotate_image(substitute_image, 90)
                
                image = add_substitute_quad(image, substitute_image, approx.reshape(4, 2))

    # Stage 9: Show augmented reality
    cv2.imshow('2D Augmented Reality using Glyphs', image)
    cv2.waitKey(10)

Out-of-bounds errors need addressing (but everyone knows that exception handling is for squares).
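
In the meantime, a hedged sketch of one possible guard (the helper name is mine; where exactly the errors bite depends on the supporting functions):

import numpy as np

# Hypothetical guard: True only if every corner of the detected quad lies
# inside the frame, so later pixel lookups cannot wander off the edge
def quad_in_frame(points, frame_shape, margin=0):
    height, width = frame_shape[:2]
    points = np.asarray(points)

    return (points[:, 0].min() >= margin and
            points[:, 0].max() < width - margin and
            points[:, 1].min() >= margin and
            points[:, 1].max() < height - margin)

# usage in the main loop, just after the shape check:
#     if not quad_in_frame(approx.reshape(4, 2), image.shape): continue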

Here’s the glyph database:

# Glyph table: each record holds the glyph's 3x3 pattern in its four
# rotations, plus the name of the substitute image to draw over it
GLYPH_TABLE = [
    [[[0, 1, 0, 1, 0, 0, 0, 1, 1],
      [0, 0, 1, 1, 0, 1, 0, 1, 0],
      [1, 1, 0, 0, 0, 1, 0, 1, 0],
      [0, 1, 0, 1, 0, 1, 1, 0, 0]], "devil"],
    [[[1, 0, 0, 0, 1, 0, 1, 0, 1],
      [0, 0, 1, 0, 1, 0, 1, 0, 1],
      [1, 0, 1, 0, 1, 0, 0, 0, 1],
      [1, 0, 1, 0, 1, 0, 1, 0, 0]], "devil_red"]]

# Match glyph pattern to database record
def match_glyph_pattern(glyph_pattern):
    glyph_found = False
    glyph_rotation = None
    glyph_substitute = None
    
    for glyph_record in GLYPH_TABLE:
        for idx, val in enumerate(glyph_record[0]):    
            if glyph_pattern == val: 
                glyph_found = True
                glyph_rotation = idx
                glyph_substitute = glyph_record[1]
                break
        if glyph_found: break

    return (glyph_found, glyph_rotation, glyph_substitute)
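
A quick way to exercise the matcher from a Python prompt (the first pattern below is the third "devil" rotation, so it reports rotation index 2):

print(match_glyph_pattern([1, 1, 0, 0, 0, 1, 0, 1, 0]))   # (True, 2, 'devil')
print(match_glyph_pattern([1, 1, 1, 1, 1, 1, 1, 1, 1]))   # (False, None, None)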

And the supporting functions and Webcam class can be sourced from my previous post.
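
For those who would rather not click through, here's roughly the shape of two of them, rotate_image and get_topdown_quad (a sketch of the standard OpenCV approach, not the exact code from that post):

import cv2
import numpy as np

def rotate_image(image, angle):
    # rotate about the image centre, keeping the original dimensions
    height, width = image.shape[:2]
    rotation_matrix = cv2.getRotationMatrix2D((width / 2, height / 2), angle, 1.0)
    return cv2.warpAffine(image, rotation_matrix, (width, height))

def get_topdown_quad(image, src):
    # warp the detected quadrilateral to a top-down view
    # (assumes the corner points arrive in a consistent order; the output
    # size is arbitrary here since the main loop resizes the result anyway)
    src = np.float32(src)
    dst = np.float32([[0, 0], [100, 0], [100, 100], [0, 100]])
    transform = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, transform, (100, 100))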
