My name is Dylan and I am a rising junior at Regis High School. I found out about BlueStamp last year when they came to do a presentation at my school. As a result, I joined BlueStamp in 2013 to build an autonomous BugBot, and I was so impressed with the program that I came back this year. Since my last project was fully autonomous, I wanted a robot I had full control over this time around. I had also been intrigued by my colleague Rain’s Omni-Directional Robot last year (which is what I based my own project on and can be found here), so I wanted to make one of my own this year in addition to my starter project, a proximity sensor. This also had the added perk of using a PlayStation controller, something I use recreationally at home, so this project had a lot going for it, and it proved to be fun, exciting, and difficult to create.
A detailed mechanical drawing can be downloaded in the form of a SketchUp file here: Mechanical Drawing
All of my code can be found in my Github repository here: OmniBot-Code
My Bill of Materials: OMNIBOM
My Final Omni-Bot
After six weeks at BlueStamp, I have finished my Omni-Directional Robot and added my own personal modification. I faced randomly breaking electronic parts, buggy code, and worn-away screws combined with flimsy tape. But I persevered, and the final Omni-Directional Robot is controlled by a PlayStation controller and has a phone slot for live-feed capabilities.
The basic way it works is simple. I use the PlayStation controller to send data to the wireless receiver on the Omni-Bot. The data comes in the form of x and y values for either the left or the right analog stick. This data is passed to the onboard Arduino, which is wired to the receiver. The Arduino uses these values to calculate both an angle and an “intensity” (the distance of the analog stick from its resting position). It then uses those to calculate a value from 1250 to 1750 for each motor, which is fed to the writeMicroseconds() function. In other words, the Arduino sends a pulse of a specific width (determined by this value) to each servo, which turns in a direction and at a speed set by that pulse: 1500 is stationary, 1250 is full-speed counter-clockwise, 1750 is full-speed clockwise, and values in between scale the speed (so 1300 is counter-clockwise but slower than 1250). So input is handled by the controller, computation by the Arduino, and output by the motors.
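The pipeline above can be sketched in plain C++ so the math is visible and testable off the robot. Everything here is my assumption, not the code from the repo: I assume three omni wheels at 0, 120, and 240 degrees, stick input already scaled to -100..100, and the 1250–1750 pulse band described above.

```cpp
#include <cmath>

const double kPi = 3.14159265358979323846;

struct WheelPulses { int w[3]; };

// Map a wheel speed in [-1, 1] to a writeMicroseconds() value in the
// 1250..1750 band (1500 = stationary).
int speedToMicros(double speed) {
    return 1500 + (int)std::lround(250.0 * speed);
}

// Turn one analog-stick reading into a pulse width per wheel.
WheelPulses stickToPulses(double x, double y) {
    double angle = std::atan2(y, x);              // direction of travel
    double intensity = std::hypot(x, y) / 100.0;  // distance from rest
    if (intensity > 1.0) intensity = 1.0;
    const double wheelDeg[3] = {0.0, 120.0, 240.0};  // assumed layout
    WheelPulses out;
    for (int i = 0; i < 3; ++i) {
        double theta = wheelDeg[i] * kPi / 180.0;
        // Each omni wheel contributes the component of the motion along
        // its rolling direction: intensity * sin(travel angle - wheel angle).
        out.w[i] = speedToMicros(intensity * std::sin(angle - theta));
    }
    return out;
}
```

With the stick at rest, every wheel gets 1500 (stationary); pushing the stick hard right leaves the wheel facing that direction at 1500 while the other two spin in opposite directions.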
Since my last milestone, I also added a 3D-printed case for my phone that slides snugly onto the robot. I designed it in SketchUp, converted it to an STL, and Kristen helpfully printed it for me. This actually took numerous tries, as the 3D printer kept filling in the slot where my phone was to go. Once I finally got it, I could set up my phone to Skype with my laptop and then place the phone into the slot. I get a feed of what’s in front of my robot streamed to my laptop, which enables me to control it while sitting in front of my computer, even if the robot is in another room. I can also communicate with people via Skype, and they can see my face since the case is made of clear plastic. There are limitations: the range between the controller and the receiver is somewhat short, so I cannot communicate more than one room over. I was tempted to build my own camera and live feed, and I found previous projects that made one using a Raspberry Pi. However, that would have cost upwards of $150 more, would have been too much work for my one allotted week, would have had an atrocious frame rate, awful lag, and no audio, and would have limited the communication to one way. This was simpler, cheaper, and better – everything engineering should try to be.
The SketchUp File can be found here: Full Case Model 4
The STL file can be found here: Full Case Model 4
I have finally reached the third milestone for my Omni-Directional Robot. I have made five essential fixes to my robot since the last milestone: rotation, intensity, square-to-circle warping, proper omni-movement, and more secure wheels.
The first three issues all go hand in hand. Rotation was a fairly easy beast to tackle. If the analog stick is pushed to the left, all wheels turn counter-clockwise, rotating the robot as a whole counter-clockwise; pushed to the right, everything is clockwise instead. However, the speed of rotation should depend on how far the stick is being pushed: a value I’ve referred to as intensity. Intensity is fairly easy to calculate for rotation, as I only had to worry about the x value (the robot cannot rotate up or down), so I just took its absolute value. It became more difficult for movement, which has to account for both x and y. To find intensity here, I had to find how far the stick is from its resting center position, which can be done with the Pythagorean theorem: the x and y values are the legs a and b, and the distance is c. However, the analog sticks, despite sitting in a circular housing, do not actually report data over a circular range. The controller can supply any point on a coordinate grid from 0 to 255 (which I scaled to -100 to 100 for convenience), meaning it reports a point from a square, not a circle. Because of this, pushing the stick all the way into the top-right corner is read as the point (100, 100), which counts as farther from center than pushing all the way up, (0, 100). If I used distance from the center to calculate speed, my robot would be able to move faster diagonally than straight. I fixed that by warping the square into a circle in code (this is explained in much greater depth in my actual code). Now pushing the stick to the top right returns coordinates of about (71, 71), which has the same distance, and therefore intensity, as pushing it straight up.
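One standard way to do the square-to-circle warp is the elliptical mapping below. This is a sketch of the idea, not necessarily the exact formula in my repo, and the function names are mine; it sends the square’s corners onto the circle, so (100, 100) comes out as roughly (71, 71), matching the numbers above.

```cpp
#include <cmath>

// Warp a stick reading from the square [-100,100] x [-100,100] onto the
// disc of radius 100, so diagonal pushes are no "farther" than straight ones.
void squareToCircle(double x, double y, double &cx, double &cy) {
    double u = x / 100.0, v = y / 100.0;            // normalize to [-1, 1]
    cx = 100.0 * u * std::sqrt(1.0 - v * v / 2.0);  // shrink each axis by
    cy = 100.0 * v * std::sqrt(1.0 - u * u / 2.0);  // the other's extent
}
```

After warping, the Pythagorean distance from center never exceeds 100 in any direction, so it can be used directly as intensity.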
This next issue was one that did not even occur to me in my previous milestone. After a bit of testing with my robot, I realized that it excelled at moving in six directions (at 0, 60, 120, 180, 240, or 300 degrees) but moved imperfectly in other directions, like left or right (which would be 90 or 270 degrees). This stemmed from an incorrect assumption of mine: that servo motor speed reacts linearly to microseconds. A bit of background first. My servos are controlled with the servo.writeMicroseconds() command. If the value between the parentheses is 1000, the servo moves counter-clockwise; 1500, stationary; 2000, clockwise. A value like 1250 moves counter-clockwise more slowly than 1000 does. I assumed it would move at half speed; in reality it is quite a bit faster than half speed. I found a graph showing the relation between microseconds written and servo speed, and what it illustrated was that the relation is only linear within a certain range. Of course, Parallax servos are different from VEX servos, so though the premise was correct, I had to run my own trials to find that range for my motors. Narrowing the range improves directional accuracy but also slows the robot slightly, a trade-off worth making. The Omni-Bot still cannot move perfectly (that is also affected by wheel placement and orientation, friction from dirt and dust on the wheels, and how weight sits on the robot itself), but I mostly fixed the inaccuracies that came from the programming side, which substantially improved performance.
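The fix amounts to only ever commanding pulse widths inside the experimentally found linear band. A minimal sketch, assuming the band turned out to be 1250–1750 (the numbers I use elsewhere; your own trials would produce your own endpoints):

```cpp
#include <cmath>

// Convert a desired speed fraction into a writeMicroseconds() value,
// clamped so we never leave the band where speed responds linearly.
int microsForSpeed(double speed) {  // speed in [-1, 1]; positive = clockwise
    const int kCenter = 1500;          // stationary
    const int kLinearHalfRange = 250;  // assumed calibration result
    if (speed > 1.0) speed = 1.0;      // clamp out-of-range requests
    if (speed < -1.0) speed = -1.0;
    return kCenter + (int)std::lround(kLinearHalfRange * speed);
}
```

Because the band is linear, asking for half speed (0.5) now genuinely produces about half the wheel speed, which is what makes off-axis directions come out straight.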
Finally I turned to a more mechanical problem: the wheels kept falling out. This was a bigger issue when the robot moved faster and jerked more (due to the lack of intensity-sensitive speed), but the wheels would still occasionally pop out after a bit of use. I initially wanted a non-permanent solution. I tried double-sided tape, which proved completely ineffective: the surface area of the drive shaft is too small for the amount of tape required to hold. Still trying to stick with removable solutions, I attempted to use the tack used for hanging up posters. This actually made the problem worse, as the wheels could not go quite as far into the motors as they could before. I eventually decided to commit to a difficult-to-undo solution and glued the shafts into the motors with Crazy Glue.
The code for this milestone can be found here:
I have just reached the second milestone for my Omni-Bot. Initially, the plan was that the second milestone would come once I had the wheels moving in the proper way to facilitate omni-directional movement, and that I would build the robot after that step. However, I quickly realized that I had no way of telling whether the wheels were actually moving properly unless I could see them move my actual robot. So I had to build the robot first. Building was something I had never really done before; my previous experience at BlueStamp had taught me about coding and electrical parts but didn’t require me to construct anything like this.
The initial plan was to wing it by duct-taping the motors to the underside of the wooden base that Kyle, my instructor, laser cut for me from a diagram I made in AutoCAD. Though it almost worked and did hold together, it was very weak and constantly about to fall apart. Additionally, the motors sagged, angling the wheels off as a result. I realized I would need something more structurally sound.
After discussing the issue with my BlueStamp colleague Ben, we came up with the idea of sandwiching the motors between my base and a thinner but otherwise identical hexagon of wood. Kyle helpfully cut that out for me too. I then borrowed some VEX rods, screwed my motors into them, and screwed the rods into the side of the main base. I twisted the rods to clamp around the thinner piece of wood and screwed them in from the bottom, as it was too thin to be screwed into through the side. With that, my base was constructed!
I then paired it with my code, and after a few (okay, a lot of) fixes it was working almost properly. It cannot yet rotate or vary its speed, which leaves it still a little clunky. For my next milestone I hope to tackle these issues.
The dxf file used in laser cutting for my base can be found here: Hexagon
The code for this step can be found here: Milestone 2 Code
My main project at BlueStamp Engineering is the Omni-Directional Robot. The robot will be controlled through a PlayStation controller, so first I needed to get the controller sending data to and from the Arduino and see whether I could use the controller to affect an electrical setup. The first step was to connect the Arduino to the wireless PlayStation receiver. I wired it up and then used code (slightly modified to utilize the new pins) from Bill Porter. In addition to this code, which allows the computer to read inputs from the controller, he provides the library that makes interacting with the controller significantly easier. However, I didn’t want to merely see “Circle has been pressed” on my computer when I pressed Circle; I wanted to make sure I could use that button to trigger an electrical or mechanical response. The best way to test this was with LEDs. I coded a program that turned a blue LED on when I pressed the blue button and off when I pressed it again, and repeated this until I had done it for all four colors (blue, pink, green, red). The analog sticks will also be incredibly important in moving the Omni-Directional Robot, so I needed to make sure I could use the analog controls. I set it up so that I could dim the pink LED (the brightest one) by adjusting the right analog stick. Finally, I wanted to utilize the vibrate function, so if the pink LED is made brighter than usual, the controller vibrates, and the vibration increases as the light increases. The next step for me is to start working on the motors.
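The on/off-per-press behavior needs edge detection: the LED should flip only when the button goes from up to down, not on every loop pass while it is held. Here is that logic as a small plain-C++ state machine (names are mine, and I am assuming the button sample would come from something like the PS2X library’s ButtonPressed() check on the Arduino):

```cpp
// Toggle an LED on each new button press, ignoring the button being held.
struct LedToggle {
    bool on = false;
    // Feed one sample of the button per loop; returns the LED state.
    bool update(bool pressedNow) {
        if (pressedNow && !wasPressed) on = !on;  // flip only on the edge
        wasPressed = pressedNow;
        return on;
    }
private:
    bool wasPressed = false;
};
```

Holding the button for many samples flips the LED only once; releasing and pressing again flips it back.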
The code for this step can be found here: Milestone 1 Code
Ultra Sonic Parking Sensor
For my starter project I am doing the Ultra Sonic Parking Sensor. This device detects whether an object is within 70 cm of it, and if one is present, it beeps. This capability makes it ideal for installation in a car as a parking sensor to avoid minor bumps and bruises with other cars. Generally speaking, though, it can be used as any kind of proximity sensor. It works through an ultrasonic transmitter and an ultrasonic receiver. The transmitter constantly emits a 40 kHz tone, a frequency far above the human hearing range of approximately 20 Hz – 20 kHz (though it is a frequency flies and mosquitoes can hear). The ultrasonic receiver waits for that sound: the only way it will pick up the audio waves is if something bounces back the ultrasonic waves sent out by the transmitter. If the receiver picks up these waves, something is in the way, and it alters the current of the circuit in response. Once that alteration happens, the buzzer goes off and alerts the user that there is indeed an obstruction. Though those pieces are the core of this device, all of the other parts play crucial roles too. The resistors, of course, decrease the current to prevent the device from burning out. Diodes scattered throughout the device ensure that current flows in the right direction. Capacitors store up energy to be released in quick bursts (when the buzzer goes off). Two of the integrated circuits act as input devices, while another is used for output. Another interprets the signals from the ultrasonic sensor, and two more regulate the “clock” of the circuit. This keeps all of the ICs operating together, and this regulation is done in part not just by the ICs themselves but also by a 5 MHz quartz crystal.
Finally there is a trim potentiometer which can be adjusted with a screwdriver to change its resistance. Changing the resistance in turn changes the device’s sensitivity, allowing for the tool to be customized for alternative purposes or preferences.
This is my second year at BlueStamp, and it proved to be a very different experience. I started this year far more informed and capable than last year, so there was less absolute confusion and fewer instances where I was completely paralyzed after finding myself in over my head. However, I challenged myself a lot more this year. While my last project revolved around building a kit, this one had much more room for creativity, and much more room for error. I even made a point of making this project more difficult by denying myself the ease of using Rain’s code, and instead wrote my own code with my own methods and ideas. I feel far more empowered now, since I really had to hack everything together myself. Last time I followed instructions and then learned why it worked; this time I did my own thing and then learned why it didn’t work (and then, of course, kept trying until it did). This project was also almost entirely devoid of electrical engineering, which proved to be awesome. It has really made me realize that programming and mechanical work are my preferred and stronger fields.