Not exactly a morning-news sort of read, but it’s fun to look through.
With exactly 5 hours to spare, I was able to complete Task 2. I had a bit of trouble completing it in C++, so I switched over to Python and things ran smoothly. I'm currently fixing a few things that should be relatively easy and will result in a better score. I'll make another post later with pictures, videos, and an after-action report.
Finally! Today was the first time I got to write code toward solving the tasks put forth by NASA. I concentrated on the first of two tasks, a computer vision task: lights on a console turn on, and your program needs to determine what color the lights are and where they are located relative to the robot's head. The code I completed today detects the colors as the lights come on, as well as where they appear within the incoming video feed from the robot's camera. The next steps are to determine each light's location in 3D space and to refine the accuracy of those locations, since that will result in a higher score. Once those pieces are complete, I simply need to publish ROS messages with the appropriate information. A logging mechanism subscribes to these messages and stores them in a file, which is submitted to NASA as my final score for Task 1 of the Qualification round.
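The 2D detection step described above can be sketched roughly like this. This is a pure-Python stand-in, not the actual competition code: the frame format, color thresholds, and function names are all illustrative assumptions (a real pipeline would more likely use OpenCV):

```python
# Rough sketch of the 2D light-detection step: given a video frame as a
# grid of (r, g, b) pixels, find lit pixels, classify their color, and
# report the centroid of each lit region in image coordinates.
# All names and thresholds here are illustrative assumptions.

def classify_color(r, g, b):
    """Very naive color classification for a bright console light."""
    if r > 200 and g < 100 and b < 100:
        return "red"
    if g > 200 and r < 100 and b < 100:
        return "green"
    if b > 200 and r < 100 and g < 100:
        return "blue"
    return None  # pixel is not a lit light

def detect_lights(frame):
    """Return {color: (x, y)} centroids of lit pixels in the frame."""
    sums = {}  # color -> (sum_x, sum_y, count)
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            color = classify_color(r, g, b)
            if color is not None:
                sx, sy, n = sums.get(color, (0, 0, 0))
                sums[color] = (sx + x, sy + y, n + 1)
    # Average pixel coordinates per color to get each light's centroid.
    return {c: (sx / n, sy / n) for c, (sx, sy, n) in sums.items()}

# Tiny synthetic 3x3 frame with a red light in the top-left corner.
frame = [
    [(255, 0, 0), (255, 0, 0), (0, 0, 0)],
    [(0, 0, 0),   (0, 0, 0),   (0, 0, 0)],
    [(0, 0, 0),   (0, 0, 0),   (0, 0, 0)],
]
print(detect_lights(frame))  # → {'red': (0.5, 0.0)}
```

From there, the image-space centroids would be combined with the camera model to recover 3D positions before being published as ROS messages for the logger to record.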
“Programming for artists”
- ROS Indigo
- Gazebo 7.0
- R5 simulation will be provided with walking/balancing control from IHMC
We’ll probably need or want this as a viewer into Gazebo Sim.
Indent size does not matter as long as it is consistent.