Hi, my name is Zachary, and I am a rising senior at SAR High School. I came to BlueStamp because I wanted to foster a greater interest in engineering. In the past, I have done basic experiments with solar panels and learned about basic electronics and coding. I have also developed a strong interest in advanced mathematics and physics. This is my second time at BlueStamp; during my first year I completed the omnidirectional robot. This time, I am making an Amazon Alexa with the ability to control devices.
Main Project -- Alexa for Home Automation (Automated Cat Toy)
Reflection: This project was fun and I learned so much. Despite the shortened time frame, I worked my hardest and did as much as I could. As a three-week student, I was able to make progress on the cat toy, but not enough to sufficiently debug and modify the project. If I had more time, I would have chosen a project more sophisticated than a basic robotic arm. Perhaps a robot controlled by Alexa would have been more appropriate. However, given the time constraints, this project fulfilled some of my basic expectations.
As a second year student, I analyzed whether an automated cat toy can be sold at a competitive price. I found that by building the box out of wood, and by using a cheaper alternative to the Particle Photon (the ESP8266), I could build a single cat toy for about $11.75. This price would likely go down if a bulk order were made. Most automated cat toys currently on the market cost between $12 and $20. Thus, this toy could be sold competitively for a profit at around $15. Cat toys are a small market, only amounting to about $250 million each year. However, automated cat toys are rare, and thus there is tremendous opportunity in the automated cat toy market.
My final milestone was building the robotic arm and configuring the ultrasonic sensors with Alexa. I can issue Alexa a command to play with the cat, which (following the process in the diagram under Milestone One) is ultimately relayed to the Particle Photon. The Photon then calls a function which moves the arm to play with the cat.
I modified a project box from Amazon, making it compatible with the ultrasonic sensors and the three micro servos. Thus, these components can be mounted outside the box while maintaining connections with the circuitry. Additionally, one of the servos sits in a hole I drilled into the box.
The robotic arm sits on top of the project box. It consists of a circular base plate and two rectangular components. It has three degrees of motion, and the servos provide enough torque to move the entire arm. Additionally, the arm can be configured with the sensors to move based on the location of the cat.
A CAD drawing of the robotic arm (which is slightly different from the arm as built on the project box) can be found here.
The main difficulty I encountered was loose and/or improper connections; all connections must be correct for the sensors to work. Additionally, the Photon is buggy and has difficulty connecting to Wi-Fi, which at times makes it highly difficult to upload code and debug. This has been very frustrating, and in the future I will explore using an Arduino with a Bluetooth/Wi-Fi shield instead of the Particle.
Once I am able to configure the sensors by fixing the connections, I can program the arm to move away from the cat. This would be a tremendous modification to add to my project.
My second milestone is the completion of a number of miscellaneous but important tasks that are vital to the creation of an automated cat toy. It is not one large accomplishment, but rather a group of small achievements which compound to form a milestone.
- The first milestone only involved tasks that required a single command (set a pin value to HIGH, return a temperature, etc.). For a robot to continuously play with a cat, the Alexa skill must fire one trigger which then keeps running so that it can play with the cat. I was able to achieve this in this milestone using a coding trick.
- The first time Alexa sends the command to play with a cat, a function is called which sets a boolean value to true. Then, in the loop, a separate function moving the servo is called only if this boolean is true. If I tell Alexa to stop playing with the cat, it sets the previously mentioned boolean to false. Thus, the function stops running.
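The start/stop trick described above can be sketched as a flag checked on every pass of the firmware's main loop. Here is a simplified Python model of that pattern; the real version lives in the Photon's `loop()`, and the names (`playing`, `start_play`, `stop_play`) are illustrative, not the project's actual code:

```python
playing = False        # flipped by the Alexa-triggered handlers
servo_moves = 0        # stands in for actual servo motion

def start_play():
    """Handler for 'play with the cat': only sets the flag."""
    global playing
    playing = True

def stop_play():
    """Handler for 'stop playing with the cat': clears the flag."""
    global playing
    playing = False

def loop_once():
    """One pass of the main loop: move the servo only while the flag is set."""
    global servo_moves
    if playing:
        servo_moves += 1   # the firmware would call its servo-moving function here

loop_once()                # flag is False: nothing happens
start_play()
loop_once(); loop_once()   # flag is True: servo moves on each pass
stop_play()
loop_once()                # flag is False again: no movement
print(servo_moves)         # prints 2
```

The key point is that the Alexa command itself runs once and returns immediately; the continuous motion comes from the loop re-checking the flag on every pass.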
- Secondly, I calibrated the ultrasonic sensor. It has four pins: power, ground, trigger, and echo. The trigger pin, which is an output pin, causes the sensor to send out ultrasonic waves which hit an object and bounce back. The reflected waves produce a signal on the echo pin, which tells the Particle the time required for the ultrasonic wave to make a round trip. This time is divided by a constant (from the ultrasonic sensor library) to convert it to a distance in centimeters. The process by which the ultrasonic sensor works is demonstrated by the green LED, which only turns on when my hand is within 5 centimeters of it.
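The time-to-distance conversion described above can be written out explicitly. This sketch uses the standard speed-of-sound figure (~0.0343 cm/µs) rather than the exact constant from the Particle library, and the function names and 5 cm threshold for the LED check are taken from the description above:

```python
SOUND_CM_PER_US = 0.0343   # speed of sound, in cm per microsecond

def echo_to_cm(duration_us):
    """Convert the echo pulse width (microseconds) to a distance in cm.

    The pulse measures the round trip, so the result is halved.
    """
    return duration_us * SOUND_CM_PER_US / 2

def led_should_be_on(duration_us, threshold_cm=5):
    """Mimic the green-LED check: on only when the object is within 5 cm."""
    return echo_to_cm(duration_us) < threshold_cm

print(round(echo_to_cm(580)))   # about 10 cm
print(led_should_be_on(200))    # True  (object ~3.4 cm away)
print(led_should_be_on(580))    # False (object ~10 cm away)
```

Dividing the microsecond reading by about 58 gives the same result in one step, which is why many ultrasonic libraries expose that constant directly.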
- Essentially, this milestone consists of creating a unique Alexa skill with the ability to loop, and calibrating an ultrasonic sensor.
First Milestone Diagram (This is a diagram which describes the processes explained below)
I reached my first milestone by having Alexa effectively communicate with a temperature and humidity sensor (DHT11). I used this link to guide my first steps in completing this project. There are four main components to the communication between the Alexa and the various circuit components used:
- Amazon Skills Kit -- This is where the custom Alexa skill is defined. The definition includes the various phrases which can activate certain commands, and it designates the identifiers for each unique command. In other words, it converts words (through voice recognition) into strings, which are then analyzed and converted into specific variables. These variables are sent to AWS (Amazon Web Services) Lambda (refer below for AWS Lambda), where they are sorted and data is sent to the Particle Cloud API. The skill also has a unique Application ID which allows AWS to directly reference it.
- AWS Lambda -- A compute service provided by Amazon Web Services which integrates various custom Alexa skills with one another. The Lambda function is programmed to understand the outputs of the custom Alexa skill. It receives this information from the skill and sends it to the Particle Cloud API, and eventually to the Photon itself (see below).
- Particle Cloud API -- The Particle Photon -- which has a microprocessor and pins, similar to an Arduino -- is connected via the internet to the cloud API. The Photon provides the API with its unique device ID as well as an access key. The API, in turn, connects to AWS Lambda via the internet. It receives commands from AWS Lambda which correspond to verbal commands from the skill. These commands are interpreted digitally by the Photon, and then implemented in the analog circuitry. For example, a command to turn the green light on is converted into a function which sets a pin to HIGH, giving it a voltage and turning on the LED. Essentially, the Particle Cloud API is the means through which AWS Lambda sends data to the Particle Photon.
- Particle Photon -- A physical device which has pins, a microprocessor, and a microUSB slot through which code is uploaded. It converts the commands from the API into digital outputs from its pins. It also receives data from various circuit components, such as the temperature/humidity sensor (DHT11). It can take data from the sensor and send it to the API and all the way to the Amazon Skills Kit, where Alexa can return the temperature.
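At the HTTP level, the Lambda-to-Photon hand-off described above is a POST to Particle's cloud API at `/v1/devices/{device_id}/{function}`, authenticated by the access token. This sketch only builds the request (no network call is made); the device ID, token, and function name are placeholders, not the project's real credentials:

```python
def build_particle_call(device_id, access_token, function_name, arg=""):
    """Return (url, form_data) for invoking a cloud function on a Photon."""
    url = f"https://api.particle.io/v1/devices/{device_id}/{function_name}"
    data = {"access_token": access_token, "args": arg}
    return url, data

# Hypothetical example: ask a Photon to run its "ledControl" function.
url, data = build_particle_call("123abc", "SECRET_TOKEN", "ledControl", "green_on")
print(url)   # https://api.particle.io/v1/devices/123abc/ledControl
```

An actual request would POST `data` to `url` (for example with `requests.post(url, data=data)`); the Photon-side function registered under that name then runs and returns an integer status.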
My first milestone was successfully linking each of these components. The LEDs can now be activated via a voice command and temperature values can be sent to Alexa from the circuit board.
My starter project is a Theremin, which is a motion-activated musical instrument. My main intensive project is an omnidirectional robot.
Main Project: Omnidirectional Robot
After six full weeks at Bluestamp Engineering, I have gained a great wealth of knowledge while having a tremendous amount of fun. I came to Bluestamp with nothing more than a little programming experience and some basic electrical knowledge. I now feel like I have much more experience. I am proud to have made a semi-autonomous robot with a visual and sound system. I can even move the robot using the sensors, which was something I had not planned for.
Final Project -- OmniDirectional Robot:
Picture of the Completed Robot
After building the basic frame and coding the robot to move, I began to add many features. I added a Wi-Fi camera which livestreams directly to my phone, so I am able to see my robot in real time. I can also take videos and pictures to save for later. I also installed a walkie talkie on the robot. The walkie talkie has a servo glued to it to press the button, so I can hear what happens around the robot. I can also speak into the paired walkie talkie and communicate with people and objects around the robot. I then added a sensor system to prevent crashing. The sensors send sound waves out and receive the reflected waves. They send the Arduino the amount of time it took for the sound waves to make the round trip, and the Arduino converts that to a distance. The robot then moves in the direction opposite the obstacle so as to avoid it. I am now able to move my robot by manipulating the sensors. Finally, I added a button switch to turn the motors on and off more easily than unplugging the battery each time.
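The avoidance behavior described above can be sketched as: find the closest reading, and if it is under a threshold, drive the opposite way. The sensor layout (one reading per facing direction) and the 10 cm threshold here are assumptions for illustration, not the robot's exact code:

```python
def avoid_direction(readings, threshold_cm=10):
    """Return the direction opposite the nearest too-close obstacle, else None.

    readings: dict mapping the direction a sensor faces to its distance in cm,
    e.g. {"front": 6, "left": 40, "right": 35}.
    """
    opposite = {"front": "back", "back": "front",
                "left": "right", "right": "left"}
    direction, dist = min(readings.items(), key=lambda kv: kv[1])
    if dist < threshold_cm:
        return opposite[direction]   # back away from the obstacle
    return None                      # nothing close: no evasive move needed

print(avoid_direction({"front": 6, "left": 40, "right": 35}))   # back
print(avoid_direction({"front": 50, "left": 40, "right": 35}))  # None
```

This is also why waving a hand in front of one sensor "drives" the robot: a close reading on one side produces motion toward the opposite side.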
I encountered some problems soldering the sensors, but I was eventually able to do it correctly. I also encountered some wiring issues, but I was able to rewire my project neatly. I also ran out of pins on my Arduino board, but I got an Arduino Mega to gain those additional pins. Finally, the servo on the walkie talkie proved difficult to mount, but with the help of mounting tack and hot glue, I was able to mount it correctly.
I would like to thank my wonderful instructors for helping me complete the robot. I would also like to thank Rain for her excellent documentation which I took advantage of.
Schematic download: Omnibotschem
CAD Drawing of robot:
The CAD drawing (3D design) of my robot, made with Onshape.
Code for the Servos, Walkie Talkies, and sensors.
Second Milestone: Physical structure of Robot with working code.
The platform of the robot is a regular hexagon with four-inch sides. The platform itself is one eighth of an inch thick; it is very light but strong enough to support the servos and other parts. The Arduino board is powered by a portable USB charger, and the servos are super-glued to the bottom of the wood. There is potential to add a second layer of wood on top in order to carry a camera, a microphone, and possibly speakers.
I encountered a major problem with the connectivity of the receiver to the Arduino. After doing diagnostics for many hours, I discovered that it was a combination of problems with power, the code, and wiring. After fixing these problems, and writing the code, I was able to control my robot.
Potential additions, in addition to a sound system and camera, include adding LEDs, a second layer, and a bar where an iPad could be mounted.
Picture of robot with Camera but without Sensors and Walkie Talkie installed.
First Milestone: Spinning Servos using a PS2 controller
Today I reached my first milestone, which was being able to use the PS2 controller to spin the three servo motors. I connected the wireless receiver to the Arduino with specific wires in specific pins, which I then programmed to receive the signal from the wireless receiver. The Arduino, through its digital output pins, sends signals to the motor controllers which allow the servo motors to spin. I programmed each button to spin one motor and a fourth button to spin all the motors.
Some challenges that I encountered while reaching this milestone were learning the syntax of the Arduino language and keeping the controller connected to the Arduino. I read documentation from different places to learn the syntax, and I adjusted the controller so it would remain connected to the Arduino.
My next step is to build the physical structure of the robot. I will figure out the proper dimensions and the material with which my robot will be built.
Starter Project: Mini Theremin
The Theremin has an antenna which can detect the electrical field around it, and thus it can detect motion within that field. This creates a direct current that enters the board and then the 555 IC. This chip converts the DC current into AC current, which is able to create an electromagnetic field. The AC current creates that field in the piezo speaker, which is made of two disks that require a magnetic field to make noise. The other chip, the 12C508, controls the LED lights that light up as the notes sound, as well as the different modes. The two modes are continuous and discrete. When turned on, this theremin is in continuous mode, which makes more continuous sounds. When the two buttons are pressed simultaneously, it switches to discrete mode. In discrete mode, the buttons can raise or lower the key. The resistors control the amount of electricity going to each part of the circuit. The capacitors store electricity that is used when needed. The regulator also controls how much electricity goes to each part of the circuit. The 9V battery, of which 5V is used, supplies power to the LEDs, the speaker, and the chips.