W2023 SAR Push - Software Team Member Split

Members

In-person:

  • Colin

  • Keyon

  • Orson

  • Soumya

  • Cheng

  • Nico

Remote:

  • Andrew

Training:

  • Edward (in-person)

  • Nafiz (remote)

  • Suraj (in-person)

Project Split

Hardware interface → Colin

  1. Create a generic UWRT CAN wrapper class (under uwrt_mars_rover_utilities) for use by all hardware interfaces; a rough sketch follows this list. GitHub Issue

  2. Implement CAN wrapper for Drivetrain hardware interface. Stub CAN IDs and data until given by firmware team. GitHub Issue

  3. Implement CAN wrapper for Arm hardware interface. Stub CAN IDs and data until given by firmware team. GitHub Issue
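
A rough sketch of the wrapper’s shape, in Python with python-can for brevity; the real class under uwrt_mars_rover_utilities will presumably be C++ (ros2_control hardware interfaces are C++), and every name and CAN ID below is a placeholder rather than agreed API:

  # Sketch only: class name, method names, and CAN IDs are placeholders.
  import can  # python-can

  # Stubbed CAN IDs until the firmware team provides the real ones.
  DRIVETRAIN_LEFT_SPEED_ID = 0x000   # placeholder
  DRIVETRAIN_RIGHT_SPEED_ID = 0x001  # placeholder

  class UwrtCanWrapper:
      """Thin wrapper around a SocketCAN bus, shared by all hardware interfaces."""

      def __init__(self, channel: str = "can0"):
          self._bus = can.interface.Bus(channel=channel, bustype="socketcan")

      def write(self, arbitration_id: int, data: bytes) -> None:
          self._bus.send(can.Message(arbitration_id=arbitration_id,
                                     data=data, is_extended_id=False))

      def read(self, timeout: float = 0.01):
          # Returns None on timeout so a hardware interface's read() never blocks.
          return self._bus.recv(timeout=timeout)

  if __name__ == "__main__":
      bus = UwrtCanWrapper()
      # Stubbed data: command both sides of the drivetrain to stop.
      bus.write(DRIVETRAIN_LEFT_SPEED_ID, bytes(4))
      bus.write(DRIVETRAIN_RIGHT_SPEED_ID, bytes(4))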

Comms → Soumya, Nico

Nico is only partially on Comms, since some of the nodes he has in the works for Autonomy are also useful to Comms.

Drivetrain control:

  1. Nico: Create a ROS2 node that reads the Xbox joystick and publishes its state over the ROS2 network. GitHub Issue

  2. Soumya: Create a node for controlling the drivetrain via the Xbox controller, i.e. a ROS2 node that pipes data from the joystick over to ros2_control (see the sketch after this list). GitHub Issue

    1. The first part of this task is to boot the drivetrain in Gazebo and check out what topics ros2_control exposes for it. You can use rqt or ros2 topic list for this; you’re looking for something like a /cmd_vel or /position_commands to figure out what you need to publish to in order to get the drivetrain to move.

    2. Then, you’ll set up a node that subscribes to the joystick topic from the Xbox controller node that Nico wrote and forwards (publishes) the data to the ros2_control topics. With this done, the drivetrain will move whenever the joystick moves!

  3. Once this is done, we need to test tele-operation end to end. Depending on how far along the robot’s firmware is, we’ll do one of two things:

    1. Simulated: Run Gazebo on the Jetson to simulate the robot. Connect the Jetson and your laptop to the same network via David Choi’s router setup. Boot RViz on your laptop and move the robot around using your joystick controller node.

    2. Real: Run the actual robot via the Jetson. Connect the Jetson and your laptop to the same network. Run the joystick controller node on the robot and watch the drivetrain move around.
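
For item 1, note that the stock ROS2 joy package already provides a node that publishes sensor_msgs/Joy, so Nico’s node can likely build on it rather than reading the controller by hand. For item 2, here is a minimal sketch of the piping node, assuming the joystick data arrives on /joy and the drivetrain’s diff-drive controller listens on /diff_drive_controller/cmd_vel_unstamped; both topic names and the axis mapping are assumptions to be confirmed with rqt or ros2 topic list as described above:

  # Sketch only: topic names and the axis mapping are assumptions, not confirmed config.
  import rclpy
  from rclpy.node import Node
  from sensor_msgs.msg import Joy
  from geometry_msgs.msg import Twist

  class JoyToDrivetrain(Node):
      """Pipes Xbox joystick axes into the drivetrain's ros2_control command topic."""

      def __init__(self):
          super().__init__("joy_to_drivetrain")
          # Assumed topic exposed by the diff drive controller; verify with ros2 topic list.
          self._cmd_pub = self.create_publisher(
              Twist, "/diff_drive_controller/cmd_vel_unstamped", 10)
          self.create_subscription(Joy, "/joy", self._on_joy, 10)

      def _on_joy(self, msg: Joy) -> None:
          cmd = Twist()
          # Assumed mapping: left stick Y -> forward speed, left stick X -> turn rate.
          cmd.linear.x = msg.axes[1] * 1.0   # m/s scale, tune during testing
          cmd.angular.z = msg.axes[0] * 1.0  # rad/s scale, tune during testing
          self._cmd_pub.publish(cmd)

  def main():
      rclpy.init()
      rclpy.spin(JoyToDrivetrain())
      rclpy.shutdown()

  if __name__ == "__main__":
      main()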

Video stream:

  1. Nico: Create a node that reads from the zed_ros2 camera stream and publishes it over the ROS2 network.

  2. Soumya: Plug the ZED2 camera into the Jetson and run Nico’s zed_ros2_camera stream node. Connect the Jetson and your laptop to the same network using David Choi’s router setup. Run RViz on your laptop and test the video stream coming from the Jetson; does it look okay? Is there a lot of latency? (A latency-check sketch follows this list.)
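
A minimal latency-check sketch for item 2, assuming the ZED node publishes sensor_msgs/Image on /zed2/zed_node/rgb/image_rect_color (an assumed topic name; check ros2 topic list) and that the Jetson and laptop clocks are roughly synchronized:

  # Sketch only: the image topic is an assumption; clock skew between machines
  # will show up directly in the reported frame age.
  import rclpy
  from rclpy.node import Node
  from rclpy.time import Time
  from sensor_msgs.msg import Image

  class StreamLatencyCheck(Node):
      """Logs the age of each incoming ZED frame to gauge stream latency."""

      def __init__(self):
          super().__init__("stream_latency_check")
          self.create_subscription(
              Image, "/zed2/zed_node/rgb/image_rect_color", self._on_image, 10)

      def _on_image(self, msg: Image) -> None:
          age = self.get_clock().now() - Time.from_msg(msg.header.stamp)
          self.get_logger().info(f"frame age: {age.nanoseconds / 1e6:.1f} ms")

  def main():
      rclpy.init()
      rclpy.spin(StreamLatencyCheck())
      rclpy.shutdown()

  if __name__ == "__main__":
      main()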

Autonomy → Cheng, Nico

Nico: Create an ArUco code reader. This should be split into at least two nodes:

  1. Camera stream node from the ZED2 camera (Soumya will need this too) - GitHub Issue

  2. ArUco code parser node - GitHub Issue
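
A sketch of the parser node, assuming the camera stream node publishes sensor_msgs/Image on /zed2/zed_node/rgb/image_rect_color and that the SAR markers come from the 4x4_50 dictionary; both are assumptions, and the call shown is the pre-OpenCV-4.7 aruco API (newer OpenCV uses cv2.aruco.ArucoDetector):

  # Sketch only: topic name, marker dictionary, and output message type are assumptions.
  import rclpy
  from rclpy.node import Node
  from sensor_msgs.msg import Image
  from std_msgs.msg import Int32MultiArray
  from cv_bridge import CvBridge
  import cv2

  class ArucoParser(Node):
      """Detects ArUco markers in the incoming image stream and publishes their IDs."""

      def __init__(self):
          super().__init__("aruco_parser")
          self._bridge = CvBridge()
          self._dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
          self._ids_pub = self.create_publisher(Int32MultiArray, "aruco_ids", 10)
          self.create_subscription(
              Image, "/zed2/zed_node/rgb/image_rect_color", self._on_image, 10)

      def _on_image(self, msg: Image) -> None:
          frame = self._bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
          _corners, ids, _rejected = cv2.aruco.detectMarkers(frame, self._dict)
          if ids is not None:
              out = Int32MultiArray()
              out.data = [int(i) for i in ids.flatten()]
              self._ids_pub.publish(out)

  def main():
      rclpy.init()
      rclpy.spin(ArucoParser())
      rclpy.shutdown()

  if __name__ == "__main__":
      main()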

Cheng: ROS2 Navigation First Time Robot Setup. This will involve a lot of alterations to our drivetrain URDF.

  1. https://navigation.ros.org/setup_guides/odom/setup_odom.html

  2. https://navigation.ros.org/setup_guides/sensors/setup_sensors.html

  3. https://navigation.ros.org/setup_guides/footprint/setup_footprint.html

After these two tasks are done:

  1. Create a node to read from the Vectornav, plus a launch file (see the launch-file sketch after this list).

  2. We’ll look into the behaviour trees/further setup we need to get the robot moving in nav2. Specifically, for SAR, we want something like https://navigation.ros.org/behavior_trees/trees/nav_to_pose_and_pause_near_goal_obstacle.html
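
A sketch of the launch file for item 1, assuming an off-the-shelf ROS2 Vectornav driver; the package, executable, and parameter names below are assumptions to be checked against the driver we actually use:

  # Sketch only: package, executable, and parameter names are assumptions.
  from launch import LaunchDescription
  from launch_ros.actions import Node

  def generate_launch_description():
      return LaunchDescription([
          Node(
              package="vectornav",                    # assumed package name
              executable="vectornav",                 # assumed executable name
              name="vectornav",
              parameters=[{"port": "/dev/ttyUSB0"}],  # assumed serial port parameter
              output="screen",
          ),
      ])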

Arm Controller → Meshva

  1. Add the MoveIt arm package, with the URDF and ros2_control enabled, to the repository.

  2. Create a service server node for moving the arm from one pose to another, plus a launch file.

  3. Create a node for controlling the arm via the Xbox controller, i.e. a ROS2 node that pipes data from the joystick over to ros2_control, plus a launch file (see the sketch after this list). GitHub Issue

    1. The first part of this task is to boot the arm in Gazebo and check out what topics ros2_control exposes for it. You can use rqt or ros2 topic list for this; you’re looking for something like a /cmd_vel or /position_commands to figure out what you need to publish to in order to get the arm to move.

    2. Then, you’ll set up a node that subscribes to the Xbox controller node that Nico wrote and forwards (publishes) the data to the ros2_control topics. This could be “if you press button A, joint 1 moves at a speed of 0.1 m/s”, or whatever mapping you think makes sense for controlling the arm during SAR.
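
A sketch of the arm piping node for item 3, assuming the arm’s ros2_control setup exposes a forward velocity controller on /forward_velocity_controller/commands (std_msgs/Float64MultiArray) and that the joystick node publishes sensor_msgs/Joy on /joy; the controller topic, joint count, and button mapping are all assumptions:

  # Sketch only: controller topic, joint count, and button mapping are assumptions;
  # confirm what the arm's ros2_control setup actually exposes with ros2 topic list.
  import rclpy
  from rclpy.node import Node
  from sensor_msgs.msg import Joy
  from std_msgs.msg import Float64MultiArray

  NUM_JOINTS = 6     # assumed joint count
  JOINT_SPEED = 0.1  # jog speed from the example in the task description

  class JoyToArm(Node):
      """Pipes Xbox controller buttons into per-joint arm velocity commands."""

      def __init__(self):
          super().__init__("joy_to_arm")
          self._cmd_pub = self.create_publisher(
              Float64MultiArray, "/forward_velocity_controller/commands", 10)  # assumed
          self.create_subscription(Joy, "/joy", self._on_joy, 10)

      def _on_joy(self, msg: Joy) -> None:
          cmd = Float64MultiArray()
          cmd.data = [0.0] * NUM_JOINTS
          # Assumed mapping: button A (index 0) jogs joint 1 forward, B (index 1) backward.
          if msg.buttons[0]:
              cmd.data[0] = JOINT_SPEED
          elif msg.buttons[1]:
              cmd.data[0] = -JOINT_SPEED
          self._cmd_pub.publish(cmd)

  def main():
      rclpy.init()
      rclpy.spin(JoyToArm())
      rclpy.shutdown()

  if __name__ == "__main__":
      main()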

Gimbal Controller → Suraj, Edward