Pool Test 07/13/2024

Today we made significant progress towards completing the gate task.

As DepthAI color correction is still a work in progress, we temporarily added a YOLOv7 package to onboard so we could run our gate model on the main computer and perform detections on the front mono camera’s feed. This achieved a detection rate of 2.66 Hz, much slower than the 22 Hz we typically get from the DepthAI camera, but the slower detections did not prove to be much of an obstacle to completing the gate task. Running a neural net on the main computer (which doesn’t seem to have a GPU, so inference runs on the CPU) roughly doubled our CPU usage, from about 25% to 50%.
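For reference, here is a minimal sketch of this setup, assuming the model loads through the YOLOv7 repo’s torch.hub interface; the topic name and weight path are placeholders, not our actual configuration:

```python
# Minimal sketch: run a YOLOv7 gate model on the front mono camera feed.
# The topic name and weight path are placeholders.
import rospy
import torch
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

# Load custom weights via the YOLOv7 repo's torch.hub interface; with no
# GPU available, inference falls back to the CPU.
model = torch.hub.load('WongKinYiu/yolov7', 'custom', 'gate_model.pt')
bridge = CvBridge()

def on_frame(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='rgb8')
    results = model(frame)  # forward pass on the CPU
    for x1, y1, x2, y2, conf, cls in results.xyxy[0].tolist():
        rospy.loginfo('class %d at (%.0f, %.0f), conf %.2f', int(cls), x1, y1, conf)

rospy.init_node('gate_yolov7')
rospy.Subscriber('/camera/front/image_raw', Image, on_frame, queue_size=1)
rospy.spin()
```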

We had reserved two lanes today, so we could put the 10 ft gate across both. Initially, we got many false positive detections, but increasing the confidence threshold to 0.9 filtered out most of them. The detections for the two images on the gate were very accurate and reliable. The detection for the gate tick was mostly accurate, although it would sometimes confuse the tick’s reflection, or one of the legs of the gate, for the tick itself. The detection for the entire gate wasn’t very reliable; the gate would be detected in some frames but not others. Neither of the latter two mattered, however, as task planning used only the two gate images.
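Continuing the sketch above, the confidence filtering amounts to keeping only detections at or above the threshold (0.9 is the value we settled on; `results` is illustrative and comes from the sketch above):

```python
# Keep only high-confidence detections; 0.9 filtered out most of the
# false positives we saw today.
CONF_THRESHOLD = 0.9
detections = [d for d in results.xyxy[0].tolist() if d[4] >= CONF_THRESHOLD]
```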

Recently, I obtained the focal length and image sensor size from the manufacturer of our mono cameras. We also know the actual heights and widths of the competition props. With that information, we can compute the distance between the robot and any prop using the following formula:

$$\text{distance to object (m)} = \frac{\text{focal length (mm)} \times \text{real height/width (m)} \times \text{frame height/width (px)}} {\text{bounding box height/width (px)} \times \text{image sensor height/width (mm)}}$$
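As a worked example of this formula (the camera specs and measurements below are made-up placeholders, not our cameras’ actual numbers):

```python
def distance_to_object(focal_length_mm, real_size_m, frame_size_px,
                       bbox_size_px, sensor_size_mm):
    """Pinhole-camera distance estimate. Use heights for every argument,
    or widths for every argument, but don't mix the two."""
    return (focal_length_mm * real_size_m * frame_size_px) / \
           (bbox_size_px * sensor_size_mm)

# Hypothetical example: a 1.5 m tall prop whose bounding box spans 120 px
# in a 480 px tall frame, seen through a 2.75 mm lens on a 3.6 mm sensor.
print(distance_to_object(2.75, 1.5, 480, 120, 3.6))  # ~4.58 m
```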

This method proved to be quite accurate at computing the distance to the gate. Combined with the strategy proven in the buoy task, computing a meters-to-pixels ratio to obtain the Y and Z positions of the object relative to the robot, we can now fully localize, in 3D, any object whose actual dimensions are known (which is the case for all competition props).
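A sketch of the combined localization, assuming the buoy-task strategy works by scaling the bounding box center’s pixel offset from the image center by the meters-per-pixel ratio (axis and sign conventions here are illustrative):

```python
def localize(real_height_m, bbox, frame_w_px, frame_h_px,
             focal_length_mm, sensor_height_mm):
    """3D position of a prop relative to the camera. bbox is
    (x_min, y_min, x_max, y_max) in pixels; axes are assumed to be
    X forward, Y left, Z up."""
    bbox_h_px = bbox[3] - bbox[1]

    # X (forward distance) from the pinhole formula above.
    x = (focal_length_mm * real_height_m * frame_h_px) / (bbox_h_px * sensor_height_mm)

    # Meters-per-pixel ratio at the prop's distance.
    m_per_px = real_height_m / bbox_h_px

    # Y and Z from the bbox center's offset from the image center.
    cx = (bbox[0] + bbox[2]) / 2.0
    cy = (bbox[1] + bbox[3]) / 2.0
    y = (frame_w_px / 2.0 - cx) * m_per_px  # positive: object left of center
    z = (frame_h_px / 2.0 - cy) * m_per_px  # positive: object above center
    return x, y, z
```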

Using this 3D localization, we were able to make the robot move through the red (clockwise) side of the gate. The robot would start facing the gate. It would submerge, then correct its Y position so that it was in line with the red side of the gate. Then, it would begin moving forward in 1-meter increments. After each step, it would again correct its Y position and correct its Z position to maintain a constant depth. When it was within 3 meters of the gate, it would slow down and move only in 0.5-meter increments. Once within 2 meters of the gate, the gate image would be almost out of frame, so it would dead reckon the rest of the way through.
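In rough pseudocode, the approach loop looks like this; every helper here is a hypothetical stand-in for our task-planning primitives, not actual code from the repo:

```python
def approach_gate():
    submerge()
    correct_y_to_gate()               # line up with the red side of the gate
    while distance_to_gate() > 2.0:
        step = 1.0 if distance_to_gate() > 3.0 else 0.5  # slow down inside 3 m
        move_forward(step)
        correct_y_to_gate()           # re-center after each step
        hold_depth()                  # correct Z to maintain constant depth
    dead_reckon_through()             # gate image is nearly out of frame inside 2 m
```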

Thus, the robot can move through the gate most of the time, although some more work is required to make it more reliable.

We then shifted our attention to the coin flip. Our strategy is to initially point the robot at the gate, record this yaw angle, turn away from the gate as dictated by the coin flip, submerge, and turn back to the recorded yaw angle.

Initially, we attempted to do this by starting motion.launch with the robot pointed at the gate, then returning to zero yaw. However, this wasn’t a great strategy, as the state estimate’s yaw drifted significantly while the robot turned, causing it to turn much farther than we wanted. We fixed this by instead relying on the IMU’s yaw angle, which remained accurate even when the robot spun. Thus, when task planning was initialized, it would save the first message it received from the IMU. Task planning would then wait for user input before actually running any of the tasks. During this time, the swimmer would rotate the robot for the coin flip (we rotated 90 degrees counterclockwise in our testing today). Then, task planning would be commanded to start running the tasks, and controls would be enabled. The robot would submerge, then return to the initial IMU orientation.
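A sketch of this sequence, assuming a standard sensor_msgs/Imu topic; the topic name and the submerge and set_yaw_setpoint helpers are hypothetical:

```python
import rospy
from sensor_msgs.msg import Imu
from tf.transformations import euler_from_quaternion

initial_yaw = None

def on_imu(msg):
    """Save the yaw from the very first IMU message, then ignore the rest."""
    global initial_yaw
    if initial_yaw is None:
        q = msg.orientation
        _, _, initial_yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])

rospy.init_node('coin_flip')
rospy.Subscriber('/imu/data', Imu, on_imu, queue_size=1)

# Wait while the swimmer rotates the robot for the coin flip.
input('Press Enter to start tasks...')

submerge()                      # hypothetical task-planning call
set_yaw_setpoint(initial_yaw)   # turn back to the recorded heading
```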

The other issue we faced during the coin flip was that the robot would experience significant roll and pitch oscillations when it was only supposed to yaw. This was because even slight perturbations in the robot’s roll and pitch angles would cause the PID loops to overreact, applying too much corrective effort and ultimately worsening the perturbations. This was fixed by tuning the roll, pitch, and yaw PID constants, reducing both \(K_p\) and \(K_d\) significantly to prevent the overreactions. While doing this, we also tuned the Z PID to make it converge faster, as the Z error had been taking longer than expected to converge when the robot submerged.
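For context, here is a single PID update step; with \(K_p\) and \(K_d\) too large, even a small error or error derivative maps to a large corrective effort, which overshoots and feeds the oscillation:

```python
def pid_step(error, prev_error, integral, dt, kp, ki, kd):
    """One PID update: returns the control effort and the updated integral."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    # With oversized kp/kd, small errors produce outsized efforts here.
    effort = kp * error + ki * integral + kd * derivative
    return effort, integral
```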

With these issues fixed, the robot would start its motion facing away from the gate, but then turn itself back smoothly to face the gate again, and it would start getting detections from YOLOv7.

Tomorrow, we’ll work on combining the coin flip, barrel roll, and moving through the gate to complete the gate task.

Twice during the test, the tether disconnected unexpectedly and reconnected at a slower speed. After the second time, we switched to the fiber tether to avoid further issues. We plan to grease the SubConn end of the tether before tomorrow’s pool test, which will hopefully prevent this issue.

Today we also switched the thrusters to use pure serial (no ROS). Halfway through the test, after we had written the required CV and task planning code, on our first attempt at moving through the gate, the robot went out of control, with some thrusters spinning and others not. We tried a few more times, restarting motion.launch and even the entire robot, but that didn’t solve the problem. Re-uploading the sketch to the thruster Arduino finally fixed the issue, and the thrusters behaved normally for the rest of the test. In short, even after switching to serial, the thruster Arduino still seems to have occasional problems. Moreover, we don’t know when it is experiencing those problems; unlike the previous ROS version, which printed a disconnect message in the motion.launch terminal, we only know something is wrong when the robot moves unexpectedly. It seems that ROS was not the true cause of the thruster Arduino issues; something else is causing the Arduino’s unexpected behavior.