University of Central Florida — College of Engineering and Computer Science
The goal of our project is to work with the Florida Space Institute (FSI) and its EZ-RASSOR rover simulation / RE-RASSOR real-world rover to integrate a robotic arm that places small rectangular bricks (pavers) together to build infrastructure, such as roads and landing pads, on other planets and moons. The arm relies on a visual network powered by a stereo/depth camera, along with autonomous driving and path-planning algorithms, to build efficiently while remaining aware of its surroundings. While FSI will use the physical arm for many activities, such as NASA competitions, the primary use of this solution is as a teaching tool that helps students learn about space and how a rover interacts with its environment on the Moon or another extraterrestrial body.
To accomplish this goal, we first created a simulation solution in the Gazebo simulation software. This involved adding the arm model and its drivers to the existing rover model, as well as adding the paver model to the simulation's model database. Once the arm model was integrated, we mounted a camera on the rover next to the arm to feed images to a visual network. Before each placement, this network analyzes an image to determine the previous paver's position in Cartesian space in the form of a 3D bounding box. The center of the previous paver's 3D bounding box is sent to the arm's autonomous controller, which is built with the MoveIt ROS package. This controller uses inverse kinematics, together with the information from the visual network, to direct the arm to the proper placement location. Finally, our modified driving algorithm, which builds on the rover's existing autonomous driving functions, returns the rover to its spawn point to generate a new paver and then drives to the next placement location. We also added arm functionality to the existing gamepad mobile app, which connects to the simulation or real-world rover through a local HTTP server; this gamepad support was added primarily for the learning aspect of the project, since it lets students interact with the arm in a more hands-on manner.
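As a minimal sketch of the hand-off between the visual network and the arm controller, the next placement target can be derived by offsetting the previous paver's bounding-box center along the row axis. The paver dimensions, gap size, and axis convention below are illustrative assumptions, not the project's actual values:

```python
# Sketch: derive the next placement target from the previous paver's
# detected 3D bounding box. Dimensions and the choice of x as the row
# axis are assumptions for illustration only.

PAVER_LENGTH = 0.20  # meters along the row axis (assumed)
GAP = 0.005          # small gap left between pavers (assumed)

def bbox_center(bbox_min, bbox_max):
    """Center of an axis-aligned 3D bounding box given min/max corners."""
    return tuple((lo + hi) / 2.0 for lo, hi in zip(bbox_min, bbox_max))

def next_placement_target(bbox_min, bbox_max):
    """Offset the previous paver's center by one paver length (plus a gap)
    along x to get the Cartesian goal passed to the arm controller."""
    cx, cy, cz = bbox_center(bbox_min, bbox_max)
    return (cx + PAVER_LENGTH + GAP, cy, cz)

# Example: previous paver detected with corners (0, 0, 0) and (0.2, 0.1, 0.05)
target = next_placement_target((0.0, 0.0, 0.0), (0.2, 0.1, 0.05))
```

In the actual system this Cartesian goal would be sent to the MoveIt-based controller, which solves the inverse kinematics to move the arm there.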
- ROS – The Robot Operating System (ROS) is a framework for interacting with and controlling robotic components, both real-world and simulated
- Gazebo – Simulation software used to generate a simulated environment; it includes many useful tools for interacting with ROS
- Blender – Modeling software used to create the meshes for the arm model added to Gazebo, as well as to generate training images for the first version of our visual network
- TensorFlow – A Python library that facilitates the creation and use of deep neural networks; used for object detection in our visual network
- OpenCV – A Python library that facilitates manipulating and converting images
- React Native – A React JS library for developing mobile applications, similar to the standard React JS library for web development
- MoveIt – A ROS package for creating controllers for robotic manipulators such as our arm; it maps a kinematics controller onto the arm's Unified Robot Description Format (URDF) model
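To illustrate how a stereo/depth camera yields the Cartesian positions described above, the standard pinhole camera model back-projects a detected pixel plus its depth reading into a 3D point in the camera frame. The intrinsics below (`fx`, `fy`, `cx`, `cy`) are made-up values; a real setup would read them from the camera's calibration data:

```python
# Sketch: back-project a pixel and its depth value to a 3D point in the
# camera frame using the pinhole camera model. The intrinsics here are
# illustrative; real values come from camera calibration.

def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Return (X, Y, Z) in meters, camera frame, for pixel (u, v) at `depth`."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Example with assumed intrinsics for a 640x480 depth image
point = pixel_to_point(u=400, v=300, depth=0.5,
                       fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```

Points recovered this way (for example, the corners of a detected paver's bounding box) would then be transformed from the camera frame into the arm's frame before being handed to the controller.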
Mark Heinrich – UCF
Robert Forristall (Project Manager, ROS Control/Integration, Autonomous/Manual Arm Control): email@example.com
Luca Gigliobianco (Arduino Software Development, Jetson Nano Integration): firstname.lastname@example.org
Cooper Urich (React Native Developer): Cooperurich@knights.ucf.edu
Christopher Jackson (Visual Network): email@example.com
Austin Dunlop (Simulation): firstname.lastname@example.org