Description as a Tweet:

A modern take on a retro classic - the game of Pong! Take a step away from the computer and immerse yourself in a real-life experience: utilize computer vision to simulate user paddles, and physically feel the joys of winning and the anguish of defeat.


Inspiration:

Our group is composed of enthusiastic and competitive gamers, so we wanted to do something that aligned with our interests yet was different from the status quo. As a result, we decided to put a spin on the classic game of Pong - more specifically, we used computer vision and physical paddles to make the gameplay happen. We have all used Xbox Kinect and Wii before, but what if something similar could be done on a much lower budget and with far less equipment? We wanted to take on that challenge and make this possible for anyone through augmented reality and motion sensing.

What it does:

Our project uses computer vision and motion sensing to simulate user paddles on screen with virtually any brightly colored object. Users can then play Pong on any screen connected to a webcam, simply by using their paddles to hit a simulated virtual ball.

How we built it:

We used Processing, an open-source graphics library and IDE, to build the project. We used computer vision and motion tracking to follow each user's paddle based on its color, then simulated a Pong ball bouncing back and forth between user hits.
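The color-based tracking described above can be sketched in plain Java (no Processing dependency, so it runs anywhere): scan a frame's pixels and return the location whose color is closest to a target paddle color. The class and method names, and the distance threshold, are illustrative assumptions, not our actual game code.

```java
public class ColorTracker {
    // Squared Euclidean distance between two packed 0xRRGGBB colors.
    static int colorDistSq(int c1, int c2) {
        int dr = ((c1 >> 16) & 0xFF) - ((c2 >> 16) & 0xFF);
        int dg = ((c1 >> 8) & 0xFF) - ((c2 >> 8) & 0xFF);
        int db = (c1 & 0xFF) - (c2 & 0xFF);
        return dr * dr + dg * dg + db * db;
    }

    // Returns {x, y} of the pixel whose color is closest to target, or null
    // if no pixel is within maxDistSq (i.e., the paddle is not in frame).
    static int[] trackPaddle(int[] pixels, int width, int target, int maxDistSq) {
        int bestIdx = -1;
        int bestDist = maxDistSq;
        for (int i = 0; i < pixels.length; i++) {
            int d = colorDistSq(pixels[i], target);
            if (d < bestDist) {
                bestDist = d;
                bestIdx = i;
            }
        }
        if (bestIdx < 0) return null;
        return new int[] { bestIdx % width, bestIdx / width };
    }
}
```

In a Processing sketch, `pixels` would come from the webcam frame's pixel array each draw loop; a brightly colored paddle keeps `target` distinct from the background.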

Technologies we used:

  • Java
  • Misc

Challenges we ran into:

We faced a major challenge with processing speed - the gameplay was not nearly as fast as we wanted it to be, so we had to restructure our methods and functions to optimize the code for speed as much as possible. Another challenge was making the gameplay completely correct - the game could not afford missed hits or false hits - so it took a lot of trial and error to figure out where things were going wrong.
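One standard way to avoid missed hits when a fast ball moves several pixels per frame is to test whether the ball's path crossed the paddle's plane during the frame, rather than only testing its new position. This is a hedged sketch of that technique under assumed names, not necessarily how our game implements it:

```java
public class HitCheck {
    // Returns true if a ball moving from (x0, y0) to (x1, y1) in one frame
    // crossed the vertical line x = paddleX while its (linearly interpolated)
    // y was within the paddle's span [paddleTop, paddleBottom].
    static boolean crossedPaddle(float x0, float y0, float x1, float y1,
                                 float paddleX, float paddleTop, float paddleBottom) {
        if ((x0 - paddleX) * (x1 - paddleX) > 0) return false; // same side: no crossing
        if (x0 == x1) return false;                            // moving parallel to paddle
        float t = (paddleX - x0) / (x1 - x0);                  // fraction of frame at crossing
        float yAtCross = y0 + t * (y1 - y0);
        return yAtCross >= paddleTop && yAtCross <= paddleBottom;
    }
}
```

Checking the swept path this way keeps hit detection correct even when a frame-rate drop makes the ball jump past the paddle in a single update.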

Accomplishments we're proud of:

We are proud of being able to use computer vision and motion tracking precisely and of achieving an almost-instant reaction to user inputs on a live feed. We are also proud of making this project scalable - it can potentially be played on anything that has a camera. Finally, we are proud of how quickly we built it - we finished the framework relatively quickly and spent the remaining time working on enhancements.

What we've learned:

We learned the ins and outs of computer vision, how motion tracking works, and generally how to make a good game. We also learned to be patient and work through the debugging process in order to make the best, most feature-rich project possible.

What's next:

We would like to add a "Play vs. CPU" option, where the user can play against the computer at a variety of difficulty levels. Additionally, we would like to implement smooth X-position tracking, so the user can play in more of an air-hockey style. We would also like to add different game modes, for example "points-per-rally" and a "penalty kick" mode. Eventually, this could become a standalone game that runs in any browser or on any mobile device - all that is needed is a camera and a plain background.

Built with:

We built it using the Processing IDE in Java, along with makeshift paddles made of wood and foam.

Prizes we're going for:

  • Best AR/VR Hack
  • Best Documentation

Team Members

Vyom Rathod
Christopher Caron
Nnaji Obinelo
Nathaniel Lamptey
Jonah O'Brien Weiss

Table Number

Table 59

View on GitHub