Ahanaf Z. | BlueStamp Engineering
A robot that is able to use computer vision to track a ball. After tracking the ball, the robot will move towards the ball.


Ball Tracking Robot

Engineer: Ahanaf Z.
School: The Browning School
Area of Interest: Astronautical Engineering
Grade: Rising Junior



My first milestone mainly involved working on the software side of my ball tracking robot. First, I downloaded all the required libraries and dependencies, such as PyQt, OpenCV, Picamera2, NumPy, and a few others. Afterwards, I used the Picamera2 documentation, along with help from my instructor, to build an image masking/color filtering program for my Pi Camera through VNC Viewer. The image masker works by collecting BGR values from a Pi Camera frame (or RGB values if specified in the configuration parameters) and then using OpenCV (imported as cv2 in the code) to convert the BGR/RGB values into HSV (hue, saturation, value). This way, when the program searches for color through the Pi Camera, it can match a range of colors instead of one very specific RGB or BGR value.


My second milestone was much more hands-on than my first, as I got to work with a lot of the hardware for my project, such as the Raspberry Pi pins, jumper cables, a breadboard, ultrasonic sensors, encoders, and H-bridges. I quickly learned how the different pins on a Raspberry Pi function and got right to building the chassis for my robot, attaching the motors and wheels, and finding a reliable power source. I used tape as my main adhesive for keeping most of the hardware on the chassis itself. I then wrote code to drive my motors using PWM, as well as code for measuring distance with the ultrasonic sensors. I used a breadboard to keep the ultrasonic sensor wiring organized, as the cables on my Raspberry Pi were getting very messy. Finally, after a day of troubleshooting, I realized that the AA batteries I had originally been using for my H-bridge could not power the H-bridge, Raspberry Pi, breadboard, ultrasonic sensors, and motors all at the same time. Hence, I replaced my primary power source with a phone battery bank, and for the time being moved my code from VNC Viewer to Visual Studio Code, where I could view it more easily. With all of this working, I coded the logic for my obstacle-avoiding robot (the second step toward the full ball tracking robot, since I didn't want my robot to run into walls). One big obstacle was the battery and its uneven voltages, but luckily the battery bank solved this issue.
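The distance math and avoidance logic can be sketched as pure functions, shown below; on the actual robot the echo pulse is timed and the motors driven through RPi.GPIO PWM, which only runs on the Pi itself. The threshold and duty-cycle values here are assumptions for illustration.

```python
SPEED_OF_SOUND_CM_S = 34300  # speed of sound in air, in cm/s

def echo_to_cm(pulse_seconds):
    """Convert the ultrasonic sensor's echo pulse width to distance.
    The pulse times the round trip, so divide by two."""
    return pulse_seconds * SPEED_OF_SOUND_CM_S / 2

def avoid_step(distance_cm, stop_at_cm=20):
    """Pick a motor command from the measured distance: drive forward
    when the path is clear, turn when something is too close.
    Returns (direction, PWM duty cycle %); values are illustrative."""
    if distance_cm > stop_at_cm:
        return ("forward", 100)
    return ("turn_right", 60)

# Example: a 1 ms echo pulse is about 17.15 cm -- close enough to turn.
d = echo_to_cm(0.001)
print(d, avoid_step(d))
```

Keeping the decision logic separate from the GPIO calls like this also makes it easy to test without the robot attached.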


My third milestone marks the end of the base robot build for my project and opens up a lot of new doors for modifications to make my robot even cooler than it already is. The robot hasn't changed drastically since the second milestone, but it does have one key difference from its previous version: object tracking. More specifically, the robot can now use bounding rectangle calculations, contours, and image masking to detect objects through the Pi Camera and move toward them through a series of conditionals and PWM operations on the motors. I look forward to slowly turning my simple "ball tracking robot" into a more rover-like robot over the next three weeks of BlueStamp I plan on attending, and I am really excited about it!
