Hello, my name is Cesar, and I am a rising junior at YES Prep Southeast. I became very interested in BlueStamp Engineering the moment I heard about it, because being able to choose and construct my own project while learning valuable engineering skills seemed like something that could take me a step closer to an engineering career. This summer at BlueStamp I will be building the Minty Boost as my starter project, which is a portable, battery-powered phone charger. For my main project, I will be creating a hand-gesture-controlled three-wheeled robot that is manipulated with flex sensors attached to a glove. This project leaves lots of room for customization, since I can program multiple hand gestures to send instructions that control the robot's movement.

Here are pictures capturing how my hand-gesture-controlled robot looked once it was finished, showing both the glove and the robot.
IMG_4824

IMG_4825

I’d also like to credit the creator of the hand-gesture-controlled robot that I based my project on: www.instructables.com/id/Handgesture-controlled-robot-with-robotic-arm/.

I have reached my final milestone for my hand-gesture-controlled robot, and I can now say that I have finished my main project. I added a robot claw to the front of the robot and screwed it down to make sure it is stable. The claw is driven by a VEX motor, which I connected to a motor controller and wired to the receiving Arduino. I also fixed my delay problem by making two changes to my code. First, I increased the baud rate between the two XBees, which let data travel faster between them. This helped with the delay, but I also made sure the Arduino on the robot didn’t have as much code to process. My initial code sent the individual flex sensor characters to the receiving Arduino, which then had to recognize which hand gesture I was making before turning it into a robot instruction. My new code sends the entire hand gesture as a single character, so the robot immediately knows the gesture I’m making and simply turns it into a motor instruction. Below is a video of me controlling the robot with zero delay and also opening and closing the robot claw.

Attached is my bill of materials for my hand-gesture-controlled robot.
https://docs.google.com/document/d/1R5gaVHWd49KL2dnGZHM9t6d4hH3hv4N_vH5slzsn0OI/edit?usp=sharing

Below is the code that I wrote for the transmitting Arduino.
https://drive.google.com/file/d/0B6iKVcmGo1c8YjdlZHFXMHdGRXM/view?usp=sharing

And attached below is my code for the receiving Arduino.
https://drive.google.com/file/d/0B6iKVcmGo1c8ZWhFeXNJa0tHMFU/view?usp=sharing

I have attached the schematics for both the glove and the robot.
Here is the schematic containing the flex sensors wired to the Arduino on the glove.

Schematic for TX 001

Here is the schematic for the motors and Arduino on the robot.
Schematic for RX 001

I have also attached multiple pictures of my CAD design for the robot's frame down below.
CAD file for Robot

CAD file for Robot2

CAD file for Robot3

I have progressed even further on my hand-gesture-controlled robot by reaching my second milestone. I had already controlled the robot's movements with the flex sensors, but now I can use the flex sensors with the XBees, which make the entire project wireless from the glove to the robot. After configuring and pairing the XBees, they can send data between each other. I attached one XBee to the Arduino on the glove, which reads the flex sensors and sends their data to the other XBee, connected to the Arduino on the robot, which receives the flex readings and turns them into motor instructions. I attached the flex sensors, the Arduino, and the XBee to the glove by sewing them on. Each flex sensor simply reports whether or not it is bent, but in order for the XBee on the receiving side to know which flex sensor is which, I converted each flex sensor reading into a specific character. This lets the robot recognize which fingers are bent and therefore which hand gesture I am making. In my code I wrote five functions that make the robot's motors stop, go forward, go backward, turn left, and turn right. Below is a video of me controlling the robot with the glove. There is still a delay in my instructions, which is why the robot keeps moving after I tell it to stop; this is a problem that I solve and explain in my final milestone.

I have reached the first major milestone of my main project by completing not only my robot frame but also communicating instructions to it with the flex sensors. I started the process for my hand-gesture-controlled robot by building the frame. I designed the robot to have three wheels: two in the rear, which are powered by VEX motors, and one omni-directional wheel in the front. This wheel can roll in every direction, which helps the robot turn left and right. Next I wired all the motors to the Arduino; the VEX motors were connected to a motor controller that has servo leads. I powered the motors with a 7.4V Li-ion battery, and both the battery and the motors shared a common ground with the Arduino. Each motor also had a signal lead connected to one of the PWM pins on the Arduino. Once I completed the robot frame I started to learn how to program the Arduino, which eventually led me to write a sketch that controls all of the robot's movements. Now that I could control the robot, my new challenge was learning how to control it with the flex sensors. A flex sensor is basically a resistor, but as you bend it more its resistance increases, and that resistance can be turned into a value that I can read on the Arduino's serial monitor. Before I could find these flex sensor values I had to wire the sensors to the Arduino, which I accomplished by connecting a signal, 5V, and ground lead to each flex sensor along with a 22K ohm resistor. By testing each flex sensor I learned its value when bent, and I used that value in a conditional statement in my code. To sum it up, the condition says that whenever the flex sensor is bent, the motor turns on and spins in whichever direction and at whatever speed I input. After this I was able to wire more flex sensors to control both motors of the robot, which is demonstrated in the video below.

Today marked my first success at BlueStamp Engineering, as I finished my Minty Boost starter project. The main components of the project are a 2AA battery holder, which provides the power, and a PCB whose components connect and convert the power from the batteries to the phone. I learned lots of new things, especially the purpose of certain electronic parts attached to the PCB. Most importantly, I learned how to solder, a skill I will need to build the robot for my main project and one I could use in almost any electronics project. During the starter project I did encounter a setback: I accidentally soldered one of the holes in the PCB shut and melted off the tip of a wire. However, I soon learned that I could reheat the hole until the solder covering it melted, which then let me easily slide the wire into place. Now I have a fully functional battery-powered phone charger that I can neatly put in my pocket and use wherever I go!

Attached is a YouTube video in which I explain in more depth what the Minty Boost is composed of.

