Description as a Tweet:

My project is three augmented reality interactions built with Python and OpenCV. It uses motion tracking from any camera on a computer that can run Python, letting you interact with a virtual world that you can see on your monitor.


Inspiration:

A friend suggested that I make a game that doesn't use a controller. My first few thoughts were a bicycle or some other unusual object, but then I thought about using a camera as a controller. Eventually I decided I wanted to learn motion tracking.

What it does:

My project uses Python and OpenCV to create three augmented reality interactions. The first is "Boxing", where you try to hit as many blue circles as you can, as fast as you can, while dodging red circles that come at you quickly. The second is "Survival", where red and green circles fall toward you and you have to avoid the red circles while hitting the green ones. Finally, there is "Music", where you can play virtual drums on your screen.

How we built it:

I created it entirely in Python 3.7 and used OpenCV for almost everything, starting with reading the camera data. On startup, the program takes a picture of an empty wall; an algorithm then separates new objects from that background, which is how it tracks motion. All background pixels are converted to black and all object pixels to white.

I also used OpenCV to draw the circles. To detect collisions, I used an algorithm to get all the X,Y points inside a circle based on its radius and center position; whenever the person's motion-tracking pixels overlapped the circle's pixels, a function would be triggered. To make the menu circles slowly change from red to green, I used the same equation to get an array of all the circle's X,Y points; each frame, the program updated the pixels one row at a time, starting from the lowest Y position and incrementing upward until all of them were changed.

To make the circles move, I just updated the X or Y position every frame.
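The circle collision check described above can be sketched with a NumPy mask over the circle's interior points. The function names are illustrative, not from the original project:

```python
import numpy as np

def circle_mask(cx, cy, r, shape):
    """Boolean mask of all (y, x) pixels inside a circle of radius r
    centered at (cx, cy), clipped to an image of the given shape."""
    h, w = shape
    ys, xs = np.ogrid[:h, :w]
    return (xs - cx) ** 2 + (ys - cy) ** 2 <= r ** 2

def collides(motion, cx, cy, r):
    """True if any white motion pixel falls inside the circle."""
    inside = circle_mask(cx, cy, r, motion.shape)
    return bool((motion[inside] > 0).any())

# Moving a circle is then just incrementing its position each frame:
#   cy += speed
```

Checking the circle's whole interior (rather than just its outline) means a hand passing anywhere through the circle registers a hit.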

Technologies we used:

  • Python
  • OpenCV

Challenges we ran into:

My biggest challenge at first was the motion tracking. It took a long time and didn't work well until I came up with the idea of adding a threshold slider, which lets you change the sensitivity and helps prevent shadows and different room lighting from affecting it.

Accomplishments we're proud of:

I'm very proud of how accurate it is. It doesn't look pretty, but it works very well.

What we've learned:

I already knew a lot about OpenCV for manipulating images, but this was the first time I tried anything with motion tracking, so I learned a lot about that. I also learned how to get all the points inside a circle on a graph, which is what I used to detect when the person collided with a circle.

What's next:

Next would be to make it look better with some images and effects, because it is very bland with only colored circles and words. After that, I would like to add better games and interactions because, even though I had fun making them, in my opinion they're not very fun to play.

Built with:

I used my laptop and its webcam, along with Python 3.7 and OpenCV.

Prizes we're going for:

  • Best AR/VR Hack

Team Members

Brandon Halig

Table Number

Table 16