UP Squared 6000 and UP Xtreme i11 Robot Kits
Hardware Prerequisite
You have one of these AAEON* robot kits:
- UP Squared 6000 Robotic Kit
- UP Xtreme i11 Robotic Kit
This tutorial uses the UP Xtreme i11 Robotic Kit.
If you need help assembling your robot, see AAEON* Resources.
You can use one of these teleop methods to validate that the robot kit hardware setup was done correctly (a minimal keyboard-teleop sketch follows this list):
- Start at step 2 (insert the USB dongle in the robot); and, for step 5, run the yml file exactly as shown in the example (ignore the instruction to replace it with your own generic yml file).
- For step 2, instead of customizing your file, use the exact command in the example.
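As a quick reference, the following is a minimal keyboard-teleop sketch rather than the exact flow of the linked tutorials. It assumes that the teleop_twist_keyboard package is available in the amr-ros-base container (the image tag shown is an assumption; use the tag installed on your system) and that the robot's AMR interface node is running and subscribed to /cmd_vel, as set up in the calibration steps later in this tutorial:
# Open an interactive shell in the ROS 2 base container.
./run_interactive_docker.sh amr-ros-base:2022.3 eiforamr
export ROS_DOMAIN_ID=27
# Publishes geometry_msgs/Twist on /cmd_vel; drive the robot with the keys shown on screen, and press Ctrl-c to stop.
ros2 run teleop_twist_keyboard teleop_twist_keyboard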
Check Your Installation
Check if your installation has the required Docker* images.
docker images | egrep "amr-aaeon-amr-interface|amr-ros-base|amr-imu-tools|amr-robot-localization|amr-nav2|amr-wandering|amr-realsense|amr-collab-slam|amr-collab-slam-gpu"
# If you have them installed, the result is:
# amr-aaeon-amr-interface
# amr-ros-base
# amr-imu-tools
# amr-robot-localization
# amr-realsense
# amr-collab-slam
# amr-collab-slam-gpu
# amr-nav2
# wandering
NOTE: If these images are not installed, continuing with these steps triggers a build that takes longer than an hour (sometimes a lot longer, depending on the system resources and internet connection). If these images are not installed, Intel recommends checking your installation with FastMapping Algorithm or installing the Robot Complete Kit with the Get Started Guide for Robots.
Calibrate Your Robot’s Inertial Measurement Unit (IMU) Sensor
The IMU sensor is used to determine the robot’s orientation. Moving the robot interferes with calibration, so do not move the robot while performing these steps.
Prepare the environment:
mkdir ~/imu_cal
sudo chmod 0777 ~/imu_cal
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
Start the ros2_amr_interface:
chmod a+x run_interactive_docker.sh
./run_interactive_docker.sh amr-aaeon-amr-interface:2022.3 eiforamr -c imu_cal
export ROS_DOMAIN_ID=27
ros2 run ros2_amr_interface amr_interface_node --ros-args -p try_reconnect:=true -p publishTF:=true --remap /amr/cmd_vel:=/cmd_vel -p port_name:=/dev/ttyUSB0
Expected output example:
[INFO] [1655311130.138413471] [IoContext::IoContext]: Thread(s) Created: 2
[INFO] [1655311130.139098979] [AMR_node]: Serial opened on /dev/ttyUSB0 at 115200
[INFO] [1655311131.144572706] [AMR_node]: Hardware is now online
If your output is not similar to this, the motor controller may not be on /dev/ttyUSB0. Find out which serial port it is attached to, and adapt the command accordingly.
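One way to find the right port is to list the serial devices on the host. This is a quick check outside the original steps; device names can differ on your system:
# List USB serial devices; the motor controller is usually one of these.
ls -l /dev/ttyUSB*
# The kernel log shows which ttyUSB number was assigned to each USB-serial adapter.
dmesg | grep -i ttyUSB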
In another terminal, attach to the opened Docker* image, and get the offsets needed for calibration. Each awk command below averages one IMU column over a few seconds of samples and negates the result to produce the corresponding offset.
docker exec -it imu_cal bash
source ros_entrypoint.sh
export ROS_DOMAIN_ID=27
awk -F',' '{sum+=$17; ++n} END { print "Ang_vel_x: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
awk -F',' '{sum+=$18; ++n} END { print "Ang_vel_y: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
awk -F',' '{sum+=$19; ++n} END { print "Ang_vel_z: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
awk -F',' '{sum+=$29; ++n} END { print "linear_accel_x: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
awk -F',' '{sum+=$30; ++n} END { print "linear_accel_y: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
Expected output example:
eiforamr@glaic3tglaaeon1:~/workspace$ awk -F',' '{sum+=$17; ++n} END { print "Ang_vel_x: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
Ang_vel_x: 0.873369/290=-0.00301162
eiforamr@glaic3tglaaeon1:~/workspace$ awk -F',' '{sum+=$18; ++n} END { print "Ang_vel_y: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
Ang_vel_y: 1.16431/261=-0.00446097
eiforamr@glaic3tglaaeon1:~/workspace$ awk -F',' '{sum+=$19; ++n} END { print "Ang_vel_z: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
Ang_vel_z: -0.487217/290=0.00168006
eiforamr@glaic3tglaaeon1:~/workspace$ awk -F',' '{sum+=$29; ++n} END { print "linear_accel_x: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
linear_accel_x: -104.311/261=0.399657
eiforamr@glaic3tglaaeon1:~/workspace$ awk -F',' '{sum+=$30; ++n} END { print "linear_accel_y: "sum"/"n"="0-sum/n }' <( timeout 4s ros2 topic echo /amr/imu/raw --csv )
linear_accel_y: 120.091/261=-0.460118
After some time, the aaeon node stops publishing data on /amr/imu/raw. When this happens, you get results similar to:
"linear_accel_z: /=-nan"
To fix this, restart the aaeon node:
# Go to the terminal where you started the ros2_amr_interface.
# Stop the AAEON node:
ctrl-c
# Restart the AAEON node:
ros2 run ros2_amr_interface amr_interface_node --ros-args -p try_reconnect:=true -p publishTF:=true --remap /amr/cmd_vel:=/cmd_vel -p port_name:=/dev/ttyUSB0
# Go back to the terminal where you get the data for calibration and continue with the commands.
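Before re-running the awk commands, you can confirm that the IMU topic is publishing again. This is an optional sanity check using the standard ROS 2 CLI, run from the terminal attached to the imu_cal container:
# Should report a steady publishing rate once the node is back online; press Ctrl-c to stop the check.
ros2 topic hz /amr/imu/raw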
Put these values in aaeon_node_params.yaml:
Ang_vel_x in gyro: x
Ang_vel_y in gyro: y
Ang_vel_z in gyro: z
linear_accel_x in accelerometer: x
linear_accel_y in accelerometer: y
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
gedit 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_node_params.yaml
# Replace the values in this file with the ones you got in the previous step:
imu:
  frame_id: imu_link
  offsets:
    accelerometer:
      x: 0.399657
      y: -0.460118
      z: 0.0
    gyro:
      x: -0.00301162
      y: -0.00446097
      z: 0.00168006
NOTE: Indentation is important in yaml files, so make sure to align offsets with frame_id. If the indentation is incorrect, the container reports an error when started.
Verify that the changes are correctly aligned and that the aaeon-amr-interface node can start:
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml up aaeon-amr-interface
Expected results:
Creating amr-aaeon-amr-interface ... done
Attaching to amr-aaeon-amr-interface
amr-aaeon-amr-interface | ros_distro = foxy
amr-aaeon-amr-interface | USER = eiforamr
amr-aaeon-amr-interface | User's HOME = /home/eiforamr
amr-aaeon-amr-interface | ROS_HOME = /home/eiforamr/.ros
amr-aaeon-amr-interface | ROS_LOG_DIR = /home/eiforamr/.ros/ros_log
amr-aaeon-amr-interface | ROS_WORKSPACE = /home/eiforamr/ros2_ws
amr-aaeon-amr-interface | [INFO] [1659008958.761251488] [IoContext::IoContext]: Thread(s) Created: 2
amr-aaeon-amr-interface | [INFO] [1659008958.761926192] [AMR_node]: Serial opened on /dev/ttyUSB0 at 115200
amr-aaeon-amr-interface | [INFO] [1659008959.779575257] [AMR_node]: Hardware is now online
If you do not see the “Hardware is now online” message but instead see a message similar to the following, check the alignment of the yaml file again.
what(): failed to initialize rcl: Couldn't parse params file:
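A quick way to catch indentation problems before restarting the container is to check that the file still parses and that offsets ends up nested under imu, next to frame_id. This assumes python3 with the PyYAML module is available on the host; any YAML linter works equally well:
# Prints the parsed structure; a misaligned block usually fails to parse at all.
python3 -c "import yaml; print(yaml.safe_load(open('01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_node_params.yaml')))"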
To close it, press Ctrl-c in the terminal where you ran the docker-compose command, or run the following in a different terminal:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml down --remove-orphans
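Optionally, confirm that the tutorial containers were stopped and removed (a quick check outside the original steps):
# No amr containers from this tutorial should be listed.
docker ps --filter "name=amr"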
Map an Area with the Wandering Application and UP Xtreme i11 Robotic Kit
The goal of the wandering application is to map an area and avoid hitting objects.
Place the robot in an area with multiple objects in it.
Go to the installation folder of Edge_Insights_for_Autonomous_Mobile_Robots:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
# The following commands make sure you have correct permissions so that collaborative SLAM can save the map:
sudo find . -type d -exec chmod 775 {} +
sudo chown $USER:$USER * -R
Start mapping the area:
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_mapping_realsense_collab_slam_nav2_ukf.tutorial.yml up
Expected result: The robot starts wandering around the room and mapping the entire area.
In a different terminal, prepare the environment to visualize the mapping and the robot using rviz2.
NOTE: If available, use a different development machine because rviz2 consumes a lot of resources that may interfere with the robot.
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/rviz_robot_wandering.yml up
To see the map in 3D, you can check the MarkerArray in rviz2.
NOTE: Displaying in 3D consumes a lot of the system resources. Intel recommends opening rviz2 on a development system. The development system needs to be in the same network and have the same ROS_DOMAIN_ID set.
To stop the robot from mapping the area, do one of the following:
Type Ctrl-c in the terminal where the aaeon_mapping_realsense_collab_slam_nav2_ukf.tutorial.yml was run.
(Preferred method because this option cleans up the workspace) On the system where you ran docker-compose up in step 2, use docker-compose down:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_mapping_realsense_collab_slam_nav2_ukf.tutorial.yml down --remove-orphans
If the robot moves in an unpredictable way and hits objects easily, there may be some hardware configuration issues. See the Troubleshooting section for suggestions.
Start the UP Xtreme i11 Robotic Kit in Localization Mode
Prerequisites:
Collaborative visual SLAM is based on visual keypoints, so use a room with multiple obstacles in it.
If the room has bland walls, consider adding pictures to it so that visual SLAM has enough keypoints to localize itself.
Localization mode only works if a map was already generated. To generate a map, go to Map an Area with the Wandering Application and UP Xtreme i11 Robotic Kit. (A quick check that a saved map exists is sketched after this list.)
The pre-generated file contains enough information to allow the robot to navigate without having to map the entire area.
With this pre-generated map, the robot only has to localize itself relative to the pre-generated map.
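Before starting localization, you can verify that a saved map is present. The path below is the one referenced in the Troubleshooting section; adapt it if your configuration stores the map elsewhere:
# The map saved by collaborative SLAM during the mapping tutorial.
ls -lh /tmp/aaeon/maps/map.msg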
Go to the installation folder of Edge_Insights_for_Autonomous_Mobile_Robots:
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_localization_realsense_collab_slam_nav2_ukf.tutorial.yml up
Expected result: The robot starts moving in the already mapped area and reports if it is able to localize itself or not.
If the robot is able to localize itself, the amr-collab-slam node increases the tracking success number, and the “relocal fail number” stays constant:
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 06:59:58.720] [info] tracking success number: 102, fail number: 0, relocal number: 1, relocal fail number: 51
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 06:59:58.766] [info] Started Localization! current frame id is 154
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 06:59:58.779] [info] valid tracked server landmarks num: 590
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 06:59:58.779] [info] Successfully tracked server map!
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 06:59:58.785] [info] tracking success number: 103, fail number: 0, relocal number: 1, relocal fail number: 51
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 06:59:58.835] [info] Started Localization! current frame id is 155
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 06:59:58.848] [info] valid tracked server landmarks num: 571
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 06:59:58.848] [info] Successfully tracked server map!
If the robot is not able to localize itself, the amr-collab-slam node keeps the tracking success number constant and increases the “relocal fail number”:
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 07:06:49.475] [info] tracking success number: 5740, fail number: 6, relocal number: 6, relocal fail number: 119
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 07:06:49.584] [info] tracking success number: 5740, fail number: 6, relocal number: 6, relocal fail number: 120
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 07:06:49.667] [info] tracking success number: 5740, fail number: 6, relocal number: 6, relocal fail number: 121
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 07:06:49.806] [info] tracking success number: 5740, fail number: 6, relocal number: 6, relocal fail number: 122
amr-collab-slam | [univloc_tracker_ros-2] [2022-09-05 07:06:49.859] [info] tracking success number: 5740, fail number: 6, relocal number: 6, relocal fail number: 123
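If the full compose output is too noisy, you can filter it down to these tracking statistics, mirroring the grep approach used in the object detection section below (the filter string is an assumption based on the log lines above):
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_localization_realsense_collab_slam_nav2_ukf.tutorial.yml up | grep "tracking success"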
To visualize the robot localizing itself and updating its pose, run in a different terminal:
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/rviz_robot_localization.yml up
NOTE: If the robot is not able to localize itself, the robot does not start navigating the room, and rviz2 reports map not found. To avoid this, move the robot to the room where the map was created and face it towards a keypoint. It also helps if the room you mapped has a lot of keypoints.
Perform Object Detection While Mapping an Area with the UP Xtreme i11 Robotic Kit
Place the robot in an area with multiple objects in it.
Go to the installation folder of Edge_Insights_for_Autonomous_Mobile_Robots:
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
Start mapping the area and listing the detected objects:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering_local_inference.yml up |grep Label
Expected result: The robot starts wandering around the room and listing the objects it sees.
AAEON* Resources
Development Kit: https://github.com/up-board/up-community/wiki/UP-Robotic-Development-Kit-QSG
Hardware Assembly: https://github.com/up-board/up-community/wiki/UP-Robotic-Development-Kit-HW-Assembly-Guide
Power Management: https://github.com/up-board/up-community/wiki/UP-Robotic-Development-Kit-Power-Management-Guide
Troubleshooting
If the server fails to load the map:
amr-collab-slam | [univloc_server-1] [2022-09-19 06:45:08.520] [critical] cannot load the file at /tmp/aaeon/maps/map.msg
amr-collab-slam | [univloc_server-1] terminate called after throwing an instance of 'std::runtime_error'
amr-collab-slam | [univloc_server-1]   what(): cannot load the file at /tmp/aaeon/maps/map.msg
amr-collab-slam | [ERROR] [univloc_server-1]: process has died [pid 72, exit code -6, cmd '/home/eiforamr/workspace/CollabSLAM/prebuilt_collab_slam_core/univloc_server/lib/univloc_server/univloc_server --ros-args -r __node:=univloc_server --params-file /tmp/launch_params_iiuu5pfp'].
Change your folder permissions so that the Docker* user is able to write the map on your system:
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
sudo find . -type d -exec chmod 775 {} +
sudo chown $USER:$USER * -R
If the tracker (univloc_tracker_ros) fails to start, giving the following error, see Collaborative Visual SLAM on Intel® Atom® Processor-Based Systems.
amr-collab-slam | [ERROR] [univloc_tracker_ros-2]: process has died [pid 140, exit code -4, cmd '/home/eiforamr/workspace/CollabSLAM/prebuilt_collab_slam_core/univloc_tracker/lib/univloc_tracker/univloc_tracker_ros --ros-args -r __node:=univloc_tracker_0 -r __ns:=/ --params-file /tmp/launch_params_zfr70odz -r /tf:=tf -r /tf_static:=tf_static -r /univloc_tracker_0/map:=map'].
If the robot does not start moving, the firmware might be stuck. To make it work again:
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
./run_interactive_docker.sh amr-aaeon-amr-interface:2022.3 eiforamr -c aaeon_robot
ros2 run ros2_amr_interface amr_interface_node --ros-args -p try_reconnect:=true -p publishTF:=true --remap /amr/cmd_vel:=/cmd_vel -p port_name:=/dev/ttyUSB0
ctrl-c
ros2 run ros2_amr_interface amr_interface_node --ros-args -p try_reconnect:=true -p publishTF:=true --remap /amr/cmd_vel:=/cmd_vel -p port_name:=/dev/ttyUSB0
# Look for the text: [INFO] [1655311131.144572706] [AMR_node]: Hardware is now online
# If you don't get this, repeat the commands from the Docker image and check whether the motor controller is attached to /dev/ttyUSB0.
# If it is not attached to /dev/ttyUSB0, find out which port it is on and adapt the commands accordingly.
# When you get the "Hardware is now online" message, exit the Docker image:
exit
If the robot is not behaving as instructed when using the teleop_twist_keyboard, try the following steps.
Check the direction of the wheels. The way they are facing is very important, as shown in the following picture.
Each wheel says R (Right) or L (Left). Intel had to use the following wheel setup:
R (wheel) <<<>>> L (wheel)
L (wheel) <<<>>> R (wheel)
Check the connection between the wheels (left in the following picture) and the motor controller.
It is very important to have the hardware setup correctly configured. If it is not correct, it is evident when testing with the teleop_twist_keyboard.
If the wheels do not turn at all, there may be something wrong with the wheel motor control. The board’s datasheet states that it takes a 12 V input. Intel found that a 12.5 V input did not work, but 5 V, 8 V, and 10 V inputs do work.
If the IMU gives errors and you did not install the librealsense udev rules when you configured the host, install the librealsense udev rules now:
git clone https://github.com/IntelRealSense/librealsense
# Copy the 99-realsense-libusb.rules file to the rules.d folder:
cd librealsense
sudo cp config/99-realsense-libusb.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules
sudo udevadm trigger
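After reloading the rules, you can check that the camera is detected. rs-enumerate-devices is only available if the librealsense tools are installed on the host; lsusb works on any system:
# Lists connected Intel RealSense devices (librealsense tools).
rs-enumerate-devices
# Alternatively, confirm that the camera shows up on the USB bus.
lsusb | grep -i "Intel"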
For general robot issues, go to: Troubleshooting for Robot Tutorials.