Hi, my name is Frederick F., and I am a rising senior at Trinity High School in New York City. For my main project, I am constructing a motion-sensor glove that accurately tracks the position and orientation of my hand and displays it on my computer screen in real time. Because of my previous experience as a Bluestamp student, I did not build a starter project this year.
As a second-year student at Bluestamp, I was able to further extend my knowledge of robotics, especially coding and computer science. This project helped me become more familiar with Processing, especially its 3D modeling features. In addition, I learned a lot about I2C by implementing the I2C expander, which allowed me to communicate with all four IMUs over a single bus. Overall, I had a lot of fun building my motion capture glove, and I plan to continue working on it in the future.
For my final milestone, I accomplished the goal of my project: to accurately map any and all movements of my hand and fingers (relative to my forearm) onto my computer screen. I did this by refining the glove and adding two more 9 DOF sensors to better track the positions of my thumb and forearm. I also traced my own hand and used that data to render the hand more realistically in Processing, replacing the rectangle I had previously used for the palm with a custom shape and making the fingers and joints proportionate lengths. This allowed the modeled hand to interact with itself realistically, including consistent finger-thumb interactions (such as when I touch my thumb to my index finger). In addition, I applied my makeshift filter to all four of the IMUs (9 DOF sensors) in order to get more accurate data. However, because the hand would still drift out of alignment over time, I added a button on the inside of my thumb that, when clicked against the side of my palm, serves two functions: a short click resets the thumb's position (relative to the hand), while a longer click resets the wrist's position (relative to the forearm). This button allows me to display my hand accurately for longer periods of time.
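The short-click versus long-click distinction can be implemented by recording the time when the button is pressed and classifying the press when it is released. The sketch below illustrates that logic in plain C++; the 500 ms threshold and the names are my own illustrative choices, not necessarily what runs on the glove.

```cpp
#include <cstdint>

enum ResetAction { NONE, RESET_THUMB, RESET_WRIST };

// Hypothetical threshold separating a "short" click from a "long" one.
const uint32_t LONG_PRESS_MS = 500;

struct ButtonClassifier {
    bool wasPressed = false;
    uint32_t pressStart = 0;

    // Call once per loop with the current time and the (debounced)
    // button state. Returns an action on the release edge, NONE otherwise.
    ResetAction update(uint32_t nowMs, bool pressed) {
        if (pressed && !wasPressed) {
            pressStart = nowMs;                 // press edge: start timing
        }
        ResetAction action = NONE;
        if (!pressed && wasPressed) {           // release edge: classify
            uint32_t held = nowMs - pressStart;
            action = (held >= LONG_PRESS_MS) ? RESET_WRIST : RESET_THUMB;
        }
        wasPressed = pressed;
        return action;
    }
};
```

In an Arduino loop, `nowMs` would come from `millis()` and `pressed` from a debounced `digitalRead` of the button pin.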
For my third milestone, I assembled the glove and got a basic visualization of my hand in Processing. The data from my glove is fairly accurate; however, I plan to achieve further accuracy in my next milestone, where I will refine the IMU and flex sensor setup and code. The most significant remaining challenge is properly representing the thumb, which is very hard to model and orient accurately using only one IMU; therefore, next milestone I plan to add a second IMU in order to receive clearer data from all the joints in my thumb.
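As one example of the kind of refinement involved: each flex sensor produces a raw analog reading that has to be mapped to a joint bend angle before the finger can be drawn. A common approach is linear interpolation between two calibrated readings (finger straight versus fully bent). The sketch below shows that mapping; the calibration constants are placeholders, not measurements from my glove.

```cpp
// Hypothetical calibration values; real ones come from reading each
// flex sensor's ADC output with the finger straight and fully bent.
const int FLAT_READING = 300;   // ADC value, finger straight
const int BENT_READING = 700;   // ADC value, finger fully bent
const float MAX_BEND_DEG = 90.0f;

// Linearly interpolate a raw ADC reading to a bend angle in degrees,
// clamping readings that fall outside the calibrated range.
float flexToAngle(int raw) {
    float t = (float)(raw - FLAT_READING) / (float)(BENT_READING - FLAT_READING);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return t * MAX_BEND_DEG;
}
```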
For my second milestone, I incorporated an I2C expander from Adafruit in order to receive data from both IMU sensors. I needed the expander because both sensors share the same, unchangeable I2C address, so I could not talk to both on one bus without it. The expander has its own I2C address, and I can command it to switch between the sensors. I used this data in Processing to model a basic palm and thumb in real time. After completing this milestone, I still have some problems with the gyroscopic drift of the sensors. While I managed to eliminate the drift, I sacrificed some positional accuracy in doing so. Further modifications and testing will be necessary to retain accurate sensor orientation while still accounting for drift.
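As a sketch of how such an expander is commanded: a TCA9548A-style multiplexer (the part I am assuming here) routes the main I2C bus to whichever downstream channels are set in a one-byte bitmask written to the expander's own address. The helper below captures that routing logic; the `writeByte` callback stands in for the actual Arduino `Wire.beginTransmission`/`Wire.write`/`Wire.endTransmission` sequence so the logic can run without hardware.

```cpp
#include <cstdint>
#include <functional>

// Default TCA9548A address; the actual address depends on wiring.
const uint8_t MUX_ADDR = 0x70;

// Control byte enabling exactly one downstream channel (0-7):
// bit N high routes the main bus to channel N.
uint8_t muxControlByte(uint8_t channel) {
    return (uint8_t)(1u << (channel & 0x07));
}

// Select one channel through a caller-supplied bus write. On the real
// glove, writeByte would wrap the Wire library calls; here it is
// injected so the selection logic can be exercised in isolation.
void selectChannel(uint8_t channel,
                   const std::function<void(uint8_t addr, uint8_t data)>& writeByte) {
    writeByte(MUX_ADDR, muxControlByte(channel));
}
```

After each `selectChannel` call, reads at the sensors' shared address reach only the IMU behind the enabled channel, which is what lets identical-address sensors coexist.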
For my first milestone, I applied a Madgwick filter to the data from my Inertial Measurement Unit (IMU) in order to obtain accurate orientation data. The IMU contains three sensors: a gyroscope, an accelerometer, and a magnetometer. The Madgwick filter fuses the data from these three sensors into a single orientation estimate, which can be output as Euler angles. In addition, the filter uses the magnetic field data to correct heading drift that the gyroscope and accelerometer alone cannot account for. I applied the Euler angles to a cube in the Processing IDE to represent the orientation of the IMU in real time on my computer screen.
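The Madgwick filter internally represents orientation as a quaternion, so a conversion step is needed before Euler angles can be applied to the cube. Below is a minimal C++ sketch of the standard quaternion-to-Euler conversion; it illustrates the math, not the exact code from my glove.

```cpp
#include <cmath>

const float RAD_TO_DEG_F = 180.0f / 3.14159265358979f;

struct Euler { float roll, pitch, yaw; };  // degrees

// Convert a unit quaternion (w, x, y, z) to aerospace-convention
// Euler angles: roll about x, pitch about y, yaw about z.
Euler quatToEuler(float w, float x, float y, float z) {
    Euler e;
    e.roll = atan2f(2.0f * (w * x + y * z),
                    1.0f - 2.0f * (x * x + y * y)) * RAD_TO_DEG_F;

    float sinp = 2.0f * (w * y - z * x);
    if (sinp > 1.0f)  sinp = 1.0f;   // clamp to avoid NaN from
    if (sinp < -1.0f) sinp = -1.0f;  // rounding near +/-90 degrees
    e.pitch = asinf(sinp) * RAD_TO_DEG_F;

    e.yaw = atan2f(2.0f * (w * z + x * y),
                   1.0f - 2.0f * (y * y + z * z)) * RAD_TO_DEG_F;
    return e;
}
```

In Processing, the three resulting angles map naturally onto `rotateX`, `rotateY`, and `rotateZ` calls around the drawn cube.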