Finally! Today was the first time I got to write code toward solving the tasks put forth by NASA. I concentrated on the first of two tasks, which is a computer vision task: lights on a console turn on, and your program must determine what color each light is and where it is located relative to the robot’s head. The code I completed today detects the color of each light as it turns on and locates it within the incoming video feed from the robot’s camera. The next steps are to determine each button’s location in 3D space and to refine the accuracy of those locations, since higher accuracy yields a higher score. Once that is done, I simply need to publish ROS messages with the appropriate information. A logging mechanism subscribes to these messages and stores them in a file, which is submitted to NASA as my final score for Task 1 of the Qualification round.
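The first step described above (finding the light and its color in the image) can be sketched with a toy example. This is a minimal, hypothetical illustration in plain NumPy, not the actual contest code: the function name, the brightness threshold, and the dominant-channel color rule are all my own assumptions; a real pipeline would run on each camera frame (likely via OpenCV) and handle noise and mixed colors.

```python
import numpy as np

def detect_light(frame, brightness_thresh=200):
    """Find lit pixels in an RGB frame.

    Returns (color_name, (row, col)) for the brightest blob's centroid,
    or None if nothing is lit. Hypothetical helper, not the contest code.
    """
    # A pixel counts as "lit" if any channel exceeds the threshold.
    lit = frame.max(axis=2) >= brightness_thresh
    if not lit.any():
        return None
    # Centroid of the lit pixels gives the light's image-plane location.
    rows, cols = np.nonzero(lit)
    centroid = (float(rows.mean()), float(cols.mean()))
    # Classify color by the dominant channel averaged over lit pixels.
    mean_rgb = frame[lit].mean(axis=0)
    color = ["red", "green", "blue"][int(np.argmax(mean_rgb))]
    return color, centroid

# Synthetic 100x100 frame with a green "light" centered near (30, 70).
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[28:33, 68:73, 1] = 255
print(detect_light(frame))  # ('green', (30.0, 70.0))
```

Going from that image-plane centroid to a 3D position (the "next step" above) would additionally need the camera intrinsics and a depth estimate, e.g. by back-projecting the pixel through a pinhole camera model.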
I’m off to a good start, if a late one. I installed all the required software, which meant reinstalling Linux, and ran through a walking tutorial to verify the setup. I hit an error here and there, but overall everything seems to be working. January 21st (the Qualification deadline) is approaching fast!
I got word a few days ago that the paperwork for team Rhubarb has been approved! It looks like it slipped through the cracks on their end, but once I inquired about the status of the application they found it and rushed the approval through. With a little less than a month to go, we have our work cut out for us!
- ROS Indigo
- Gazebo 7.0
- R5 simulation will be provided with walking/balancing control from IHMC
We’ll probably need, or at least want, this as a viewer into the Gazebo sim.