Tom and Jerry
Tom and Jerry are a pair of robotic cars that run side by side along a test track. Jerry carries a green marker, and Tom carries a Pixy camera. Jerry is commanded to follow a specific velocity profile, while Tom tracks Jerry and attempts to match his speed and position. The cars are named after the popular kids' TV show "Tom and Jerry". This project is pertinent to up-and-coming technology for self-driving cars, which need to sense vehicles beside them as well as ahead of and behind them in order to operate safely.
Other than the green marker and the Pixy camera, the two cars are constructed identically. The mechanical components include an aluminum chassis, a plastic mounting board, stock wheels and axles, a timing belt, and a motor. The electronic components include an encoder, an Arduino Uno, an XBee radio transmitter, a motor driver, and a battery pack. The XBee allows two-way communication between the cars and a computer. The computer tells the cars to start and gives Jerry a velocity profile to follow, while the cars transmit encoder data back to the computer to be logged. This data is used to track the positions and speeds of both cars.
Our goal for this project was not simply to build Tom and Jerry and show that they work; it was also to test the feasibility of PID algorithms for controlling the cars. Jerry's PID algorithm uses data from its encoder to match its given velocity profile. Tom's PID algorithm uses its position relative to Jerry, as seen through the Pixy camera, and drives that distance to zero.
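The control law both cars use can be sketched as a minimal discrete PID loop in Python. The class, gains, and comments below are illustrative, not the project's actual Arduino code:

```python
class PID:
    """Minimal discrete PID controller (illustrative sketch, not the project's code)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        """Return the control output for the current error sample."""
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Jerry: error = target velocity - measured velocity (from the encoder)
# Tom:   error = Jerry's position - Tom's position (from the Pixy camera)
```

Each car runs the same loop; only the error signal differs, which is why a single controller structure serves both roles.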
The design of our cars was deliberately simple, and we borrowed methods we had used frequently in other projects to buy parts and assemble the cars quickly. The majority of the work involved creating a computational model of the system, programming the control algorithms and data collection, and testing the cars across many PID gains and velocity profiles.
Our computational model was based on traditional rigid-body dynamics and transfer function analysis. A Simulink model was created in MATLAB to represent the system. A velocity input is fed into Jerry's subsystem, which outputs Jerry's position and velocity. The error between Jerry's and Tom's positions is fed into Tom's subsystem, which outputs Tom's position and velocity. We also used additional blocks to simulate the power limitations of the hardware, the limited field of view of the Pixy camera, and the discrete sampling time of the sensors.
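The Simulink model itself is not reproduced here, but its signal flow can be sketched as a toy discrete-time simulation in Python. The gains, time step, and saturation limit are placeholders, and this sketch omits the Pixy's field-of-view limit that the real model included:

```python
def simulate(v_profile, dt, kp, ki, kd, v_max):
    """Toy model of the two-car loop: Jerry follows its velocity profile exactly;
    Tom runs PID on the position error, with the command clamped to +/- v_max
    as a stand-in for the motor power limit. Returns (x_jerry, x_tom) per step."""
    x_j = x_t = 0.0
    integral = prev_err = 0.0
    log = []
    for v_j in v_profile:
        x_j += v_j * dt                      # Jerry tracks its profile exactly
        err = x_j - x_t                      # position error seen by Tom
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        v_t = kp * err + ki * integral + kd * deriv
        v_t = max(-v_max, min(v_max, v_t))   # actuator saturation block
        x_t += v_t * dt
        log.append((x_j, x_t))
    return log

# Example: 10 s of a constant 0.5 m/s profile at a 50 Hz step
trace = simulate([0.5] * 500, dt=0.02, kp=4.0, ki=0.5, kd=0.1, v_max=1.0)
```

With these placeholder gains, Tom's position converges toward Jerry's over the run, which mirrors the trend-level agreement we saw between the model and the hardware.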
Each car's control system was programmed on its Arduino, using open-source libraries to gather data from the sensors, compute the PID control, and send data through the XBees. A Python script running on a laptop was used to start the cars, send a velocity profile and PID constants, and log incoming data to a file. A MATLAB script was used to read the data files and plot each car's position and velocity for each test, as well as calculate the error between Tom and Jerry.
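The post-processing step can be sketched in Python (the project's actual script was MATLAB). The tick-to-distance conversion assumes a given encoder resolution and wheel circumference; all names below are illustrative:

```python
def positions_from_ticks(ticks, ticks_per_rev, wheel_circumference_m):
    """Convert cumulative encoder ticks to distance traveled along the track (m)."""
    return [t / ticks_per_rev * wheel_circumference_m for t in ticks]

def velocities(positions, dt):
    """Finite-difference velocity estimate between consecutive position samples."""
    return [(b - a) / dt for a, b in zip(positions, positions[1:])]

def tracking_error(jerry_pos, tom_pos):
    """Per-sample position error between the cars (the quantity Tom tries to zero)."""
    return [j - t for j, t in zip(jerry_pos, tom_pos)]
```

Running these over the logged files yields the position, velocity, and error traces that we plotted for each test.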
Our tests involved driving Jerry straight and back-and-forth on the track at a slow and a fast speed. For each test, we ran Tom with a set of PID values. Our selection of PID values was based on our computational model and some initial tests. From our results, we found that Jerry was able to consistently match its velocity profile. Tom was also able to match Jerry's position when using specific PID values. However, we could not find a single set of PID values that worked well across all velocity profiles, a result that PID theory predicts. Furthermore, our computational model could not predict Tom's exact behavior, although it did predict trends. We attributed these errors to wheel slip and sensor limitations. We believe that a few modifications to the track and the code could reduce some of this error, but a more robust controller model might be desirable.
Team Members
- Ryan Feng
- Douglas Hutchings
- Anish Khare
- Aonan Li
- Ahlad Reddy
- Forrest Wang
- Barber Christian Waters
- Mingyi Zheng
- Professor Tony Keaveny (Advisor)