Step 5: Navigation Full Stack
Introduction to the Navigation Full Stack
The Edge Insights for Autonomous Mobile Robots navigation full stack contains multiple components that help the robot navigate, avoid obstacles, and map an area. For example:
Intel® RealSense™ camera node: receives input from the camera and publishes topics used by the vSLAM algorithm
Robot base node: receives input from the motor controller (for example, from wheel encoders) and sends commands to the motor controller to move the robot
ros-base-camera-tf: uses static_transform_publisher to create transforms between base_link and camera_link (see the example command after this list)
static_transform_publisher publishes a static coordinate transform to tf using an x/y/z offset in meters and yaw/pitch/roll in radians. The period, in milliseconds, specifies how often to send the transform.
yaw = rotation around the z axis
pitch = rotation around the y axis
roll = rotation around the x axis
collab-slam: a collaborative visual SLAM framework for service robots, based on the paper A Collaborative Visual SLAM Framework for Service Robots
FastMapping: an algorithm that creates a 3D voxel map of the robot’s surroundings from the Intel® RealSense™ camera’s depth sensor data and provides the 2D map needed by the Navigation 2 stack
nav2: the navigation package
Wandering: demonstrates the combination of middleware, algorithms, and the ROS 2 navigation stack to move a robot around a room without hitting obstacles
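As a concrete illustration of what ros-base-camera-tf runs, the following is a minimal sketch of the underlying static_transform_publisher command. The offset values here are hypothetical placeholders, not the values shipped with the tutorial; use measurements that match where the camera is mounted on your robot.
# x y z (meters), yaw pitch roll (radians), parent frame, child frame.
# Hypothetical example: camera mounted 0.09 m forward of and 0.16 m above base_link.
ros2 run tf2_ros static_transform_publisher 0.09 0 0.16 0 0 0 base_link camera_link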
In Edge Insights for Autonomous Mobile Robots, there are two examples:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
ls 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml
ls 01_docker_sdk_env/docker_compose/05_tutorials/pengo_wandering__kobuki_realsense_collab_slam_fm_nav2.tutorial.yml
One is for AAEON’s UP Xtreme i11 Robotic Kit, and the other is for Cogniteam’s Pengo robot.
Create a Parameter File for Your Robotic Kit
To create a parameter file for your robotic kit:
cp 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml
# Replace generic_robot with a name that makes sense for your robotic kit.
gedit 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml
# You can use any other preferred editor, but it is important to keep the path.
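After each round of edits, you can check that the file is still valid YAML that docker-compose can render. This is a minimal sketch, not part of the tutorial files; it assumes you have sourced docker_compose.source and exported CONTAINER_BASE_PATH as shown in the mapping step below, so that the variables in the file resolve.
# Render the compose file with variables resolved; errors indicate a broken edit.
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml config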
Make all of the changes that are specific to your robotic kit:
Replace the aaeon-amr-interface target with the generic robot node you created in Step 3: Robot Base Node ROS 2 Node.
Remove the ros-base-teleop target because this is specific to AAEON’s UP Xtreme i11 Robotic Kit.
In the ROS 2 command of the Navigation 2 target, change params_file so that it points to the parameter file you created in Step 4: Robot Base Node ROS 2 Navigation Parameter File:
from: params_file:=${CONTAINER_BASE_PATH}/01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_nav.param.yaml
to: params_file:=${CONTAINER_BASE_PATH}/01_docker_sdk_env/artifacts/01_amr/amr_generic/param/generic_robot_nav.param.yaml
In the ros-base-camera-tf target, change the transform values passed to static_transform_publisher. The values for x, y, and z depend on where your Intel® RealSense™ camera is mounted; you can verify the resulting transform as sketched below.
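Once the containers are up, a quick sanity check of the transform is possible. This is a minimal sketch, assuming a shell with a sourced ROS 2 environment and the same ROS_DOMAIN_ID (for example, inside one of the tutorial containers):
# Print the current transform between the two frames; it should match
# the x/y/z and yaw/pitch/roll values set in the ros-base-camera-tf target.
ros2 run tf2_ros tf2_echo base_link camera_link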
Start Mapping an Area with Your Robot
Place the robot in an area with multiple objects in it.
Go to the installation folder of Edge_Insights_for_Autonomous_Mobile_Robots:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
Start mapping the area:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml up
Expected result: The robot starts wandering around the room and mapping the entire area.
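To confirm that a map is actually being produced, you can check the publish rate of the map topic. This is a minimal sketch, assuming a shell with a sourced ROS 2 environment and the same ROS_DOMAIN_ID; the /map topic name is the Navigation 2 default and may differ in your setup.
# Show the rate at which the 2D occupancy map is being published.
ros2 topic hz /map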
In a different terminal, prepare the environment to visualize the mapping and the robot using rviz2.
NOTE: If available, use a different development machine because rviz2 consumes a lot of resources that may interfere with the robot.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/rviz_robot_wandering.yml up
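If rviz2 starts but shows no data, a quick check is whether the development machine can discover the robot's topics at all. This is a hypothetical check, not part of the tutorial files, assuming a sourced ROS 2 environment on the development machine:
# The robot's topics should appear in this list; if not, verify that both
# machines are on the same network and use the same ROS_DOMAIN_ID (27 here).
ros2 topic list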
To see the map in 3D, check the MarkerArray:
NOTE: Displaying in 3D consumes a lot of system resources. Intel recommends opening rviz2 on a development system. The development system needs to be on the same network and have the same ROS_DOMAIN_ID set.
To stop the robot from mapping the area, do one of the following:
Type Ctrl-c in the terminal where generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml was run.
(Preferred method because it cleans up the workspace) On the system where you ran docker-compose up, use docker-compose down:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml down --remove-orphans
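To confirm that everything was shut down, you can list the remaining containers; this check is a small addition, not part of the tutorial files:
# No tutorial containers should remain after docker-compose down completes.
docker ps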