Release Notes

  • 2022.3.0
  • 09/26/2022
  • Public Content

Edge Insights for Autonomous Mobile Robots Release Notes

EI for AMR Features

EI for AMR 2022.3 is deployed via prebuilt Docker* images, grouped into six use cases: Robot Base Kit, Server Complete Kit, UP Xtreme i11 Robotic Kit, Robot and Server Complete Kit, Robot Complete Kit, and PCL Optimization. The last option distributes PCL as a standalone library.
You can customize your Edge Insights for Autonomous Mobile Robots installation and only download the Docker* images you need from the SDK.
VDA Navigator
  • This is the first release of the VDA navigator, a basic Navigation 2 based waypoint follower. This release includes support for the updateRoute and cancelRoute functionalities.
Optimized ORB Feature Extractor for an Intel® GPU
  • The ORB feature extractor, at version 1.0, is optimized for 12th Generation Intel® GPUs.
  • The ORB feature extractor improves the frame processing speed compared to a CPU implementation and reduces CPU utilization.
  • It includes CV kernels optimized for the GPU using the C for Metal language, running on the Intel® oneAPI Base Toolkit Level Zero runtime: resize, gaussian, fast, compute descriptor, and compute orientation.
  • The per-frame latency can be adjusted via the GPU_CHECK_SLEEP_TIME environment variable (the default is 50 µs).
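As a sketch of how such a knob is typically consumed (the variable name and the 50 µs default come from this release; the parsing logic below is an assumption, not the shipped implementation):

```cpp
#include <cstdlib>

// Illustrative only: read the GPU_CHECK_SLEEP_TIME environment variable
// (microseconds) and fall back to the documented 50 µs default when it is
// unset. The real ORB feature extractor may parse and validate differently.
int gpu_check_sleep_time_us() {
    const char* value = std::getenv("GPU_CHECK_SLEEP_TIME");
    return value ? std::atoi(value) : 50;
}
```

With this reading, launching the extractor with GPU_CHECK_SLEEP_TIME=100 in the environment would yield a 100 µs per-frame sleep.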
PCL Optimization
  • Added the march=native flag in the Intel® oneAPI Base Toolkit CMakeLists.txt to fix an MLS segmentation fault
  • Changed the namespace for Intel® oneAPI Base Toolkit functions (for the octree and convex hull functions, use the pcl::oneapi::device namespace instead of pcl::device)
  • Added octree approximate nearest search with the following modifications:
    • Included code converted with Intel® DPC++ Compatibility Tool
    • Removed improper Intel® DPC++ Compatibility Tool warnings
    • Used a LOG_WARP_SIZE of 4 instead of 5 (a warp size of 16 instead of 32) for better performance
    • Used sycl::range<1> instead of sycl::range<3> when submitting the kernel
  • Added radius search
  • Added the wait_for(sycl::event) API to reduce CPU utilization
  • Replaced q.wait() with wait_for(event) for octree and convex hull
  • Added octree knn_search with the following modifications:
    • Included code converted with Intel® DPC++ Compatibility Tool
    • Changed the sub-group size from 32 to 16 for better performance
    • Used wait_for(event) instead of queue wait
    • Added support for k > 1
  • Fixed Eigen v. 3.3 compilation issues
  • Enabled support for end users to build Intel® oneAPI Base Toolkit independently
  • Implemented point cloud passthrough and voxel grid filters
  • Added script to deploy PCL binary as a standalone, outside of EI for AMR Docker* images
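The passthrough filter mentioned above keeps only the points whose chosen coordinate falls inside a given interval. A minimal plain-C++ sketch of that semantics (illustrative names only; the actual PCL/oneAPI filter is more general, with a configurable field name, negative filtering, and so on):

```cpp
#include <vector>

// Minimal point type standing in for pcl::PointXYZ.
struct PointXYZ { float x, y, z; };

// Illustrative passthrough semantics: keep points whose z coordinate lies
// within [min_z, max_z]. Not the shipped PCL implementation.
std::vector<PointXYZ> passthrough_z(const std::vector<PointXYZ>& cloud,
                                    float min_z, float max_z) {
    std::vector<PointXYZ> out;
    for (const auto& p : cloud)
        if (p.z >= min_z && p.z <= max_z)
            out.push_back(p);
    return out;
}
```

The voxel grid filter is the complementary operation: instead of cropping by coordinate range, it downsamples by replacing all points in each voxel with a single representative.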
FLANN Optimization
  • Added the wait_for(sycl::event) API to reduce CPU utilization
Collaborative Visual SLAM
  • Added support for the ORB feature extractor with C for Metal using the Intel® oneAPI Base Toolkit Level Zero interface
  • Added support for stereo input in the EuRoC dataset
  • Added support for both monocular and stereo input in the KITTI dataset
  • Added re-localization mode
  • Fixed the issue with memory consumption in localization mode when re-localizing
  • Integrated FastMapping features: basic, loop closure, and map merge
  • Updated the rviz2 configuration for FastMapping
  • Reduced the tracker-to-server communication bandwidth in both mapping and localization modes
VDA5050-ros2-bridge
  • Added support for Order and Update Order messages
  • Initial release of the AGV state machine feature
  • Added a custom instant action for pausing and resuming the robot’s wandering
  • Enabled the forwarding of the AGV state to the ThingsBoard* server
  • Added custom Host and Port MQTT connections based on environment variables
Wandering AI Algorithm
  • Added a pause-resume option for the robots
  • Fixed the loop indices used when a robot searches for the next free cell to explore (the map’s width and height can change while the search is in progress)
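The fix above boils down to not re-reading map dimensions that may change mid-search. A minimal sketch of that pattern with entirely hypothetical names (the actual wandering code is not reproduced here; in the real stack the map would be a ROS 2 nav_msgs/OccupancyGrid):

```cpp
#include <cstdint>
#include <vector>

// Hypothetical occupancy grid: -1 = unknown, 0 = free, 100 = occupied.
struct Grid {
    int width = 0, height = 0;
    std::vector<int8_t> cells;  // row-major, width * height entries
};

// Find the index of the first free cell. The dimensions are snapshotted
// into locals once, so the loop bounds and the row stride stay consistent
// for the whole search even if the map is resized concurrently.
int first_free_cell(const Grid& map) {
    const int w = map.width;   // snapshot the dimensions up front
    const int h = map.height;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if (map.cells[y * w + x] == 0)
                return y * w + x;
    return -1;  // no free cell found
}
```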
Remote Inference
  • Updated the copyright header in the source files
Object Detection Node
  • Updated the copyright header in the source files
EI for AMR Containers (Robot and Edge Server)
  • Minor fixes and cleanups on Docker* containers
  • Updated dependencies in the collaborative visual SLAM Docker* file
  • Added PCL and FLANN libraries to the Navigation 2 and rtabmap Docker* images
  • Enabled the option to install either OpenVINO™ developer package or runtime package, reducing the size of the Docker* containers that include OpenVINO™
  • Added support for offloading tasks to the GPU over the Intel® oneAPI Base Toolkit Level Zero interface when using C for Metal
  • Included the ORB feature extractor in the Intel® oneAPI Base Toolkit Docker* image
  • Created Docker* images for the WWAN 5G modem and the VDA navigator
  • Integrated Kudan visual SLAM with the ORB feature extractor into a Docker* image
  • Added wandering, navigator, and common-dependencies playbooks
  • Created a localization mode launch configuration for AAEON’s UP Xtreme i11 Robotic Kit
  • Enabled Intel® Smart Edge Open device registration
  • Added secure file server support for downloading additional files to EI for AMR during the onboarding process
  • Included all EI for AMR Docker* images into a single Kubernetes* pod
  • Added support for running a ROS 2 node on a Kubernetes* cluster

Known Issues and Limitations

  • The server setup deploys collaborative visual SLAM on Cogniteam’s Pengo robot without checking the type of CPU it has. (The Intel® Atom® CPU in Cogniteam’s Pengo robot is not supported by default.)
  • The POTA implementation in the onboarding flow requires manual input for Product Name and Manufacturer for each type of robot added to the flow.
  • After about 20 minutes of run time, an AAEON* robot can get too close to obstacles; because Intel® RealSense™ depth data does not cover distances under 20 cm, the robot may collide with them.
  • Kudan visual SLAM with an ORB feature extractor on the GPU stops mapping when using the robot_moving_15fps ROS 2 bag.
  • The Inertial Measurement Unit (IMU) cannot be started on Intel® RealSense™ cameras.
  • AAEON’s UP Xtreme i11 Robotic Kit loses its orientation and redraws the walls in the test area multiple times. Due to this improper mapping, the robot cannot correctly identify the position of the obstacles and might collide with them.
  • Due to the end of life (EOL) of the Intel® RealSense™ LiDAR Camera L515, the L515 is no longer supported by EI for AMR.
  • Gazebo* simulation does not work on the Intel® Atom® 3000 processor family like the Apollo Lake-based UP2 (UP Squared). Intel recommends creating the Gazebo* simulation environment on more powerful systems that have, for example, 11th generation Intel® Core™ or Intel® Xeon® Scalable processors.
  • It currently takes up to 100 minutes to install the EI for AMR package. The time varies depending on the speed of your internet connection and system’s capabilities.
  • The installed TensorFlow* version in EI for AMR contains Intel® Advanced Vector Extensions (Intel® AVX) instructions. These Intel® AVX instructions are not supported by Intel® Atom® CPUs like the CPU in Elkhart Lake platform. Any action, including the OpenVINO™ sample application, fails on a platform with an Intel® Atom® CPU. To be able to run TensorFlow* on an Intel® Atom® CPU, it must be re-compiled without the Intel® AVX instructions using the steps from: How to Build and Install the Latest TensorFlow* Without CUDA GPU.
Wandering AI Application
  • RTAB-Map is not well suited for indoor navigation; some obstacles may not be detected with high accuracy due to reflections and similar effects.
ADBSCAN
  • ADBSCAN is configured to work with low resolutions (360). Using a higher resolution, like 1440, makes ADBSCAN report inconsistent findings on each run.
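One common workaround for a resolution mismatch like this is to decimate the scan before feeding it to the clustering step. A hedged sketch, assuming the scan is a flat array of range readings (this is not part of the shipped ADBSCAN code, and the function name is hypothetical):

```cpp
#include <cstddef>
#include <vector>

// Decimate a high-resolution scan (e.g., 1440 readings) down to a target
// count (e.g., the 360 readings ADBSCAN is configured for) by keeping
// every n-th sample. Illustrative only.
std::vector<float> decimate_scan(const std::vector<float>& scan,
                                 std::size_t target) {
    if (target == 0 || scan.size() <= target) return scan;
    std::size_t step = scan.size() / target;
    std::vector<float> out;
    out.reserve(target);
    for (std::size_t i = 0; i < scan.size() && out.size() < target; i += step)
        out.push_back(scan[i]);
    return out;
}
```

Note that naive decimation discards angular detail; it trades fidelity for the consistent behavior ADBSCAN shows at its configured resolution.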
Collaborative Visual SLAM
  • For long runs, where the edge server accumulates more than 80,000 key frames, the shutdown process takes more time to exit cleanly.
  • For visual-odometry fusion with monocular input, after visual tracking is lost, the system relies on odometry input (if enabled) to sustain tracking and never switches back to visual tracking.
  • The visual-inertial fusion is not supported in localization mode.
  • Fusion of visual, inertial, and odometry data at the same time is not supported.
  • Map merge does not happen if robots are moving in opposite directions through a common area.
  • When both the server and tracker are launched, the robot loses tracking while rotating in place.
  • The collaborative visual SLAM tracker crashes when running in localization mode at 30 FPS.
VDA5050-ros2-bridge
  • VDA5050-ros2-bridge receives fewer MQTT generated messages than the defined number of messages.
Intel® Edge Software Device Qualification (Intel® ESDQ) for Edge Insights for Autonomous Mobile Robots (EI for AMR)
  • The Intel® RealSense™ camera test fails if there is no Intel® RealSense™ camera attached to the target system, reporting this error message: [ERROR]: No RealSense devices were found.
  • Object detection on Intel® Movidius™ Myriad™ X VPU fails if there is no Intel® Movidius™ Myriad™ X connected to the target system.
  • If the internet connection is not stable or GitHub is blocked by a firewall, some Intel® ESDQ tests fail.

Where to Find the Release

You can find the release on the Product Download page.

Release Content

Subproject (component)                                      Revision
amr-aaeon-amr-interface                                     2022.3
amr-adbscan                                                 2022.3
amr-battery-bridge                                          2022.3
amr-cartographer                                            2022.3
amr-collab-slam                                             2022.3
amr-collab-slam-gpu                                         2022.3
amr-fastmapping                                             2022.3
amr-fdo-client                                              2022.3
amr-fleet-management                                        2022.3
amr-gazebo                                                  2022.3
amr-gstreamer                                               2022.3
amr-imu-tools                                               2022.3
amr-kobuki                                                  2022.3
amr-kudan-slam                                              2022.3
amr-nav2                                                    2022.3
amr-object-detection                                        2022.3
amr-realsense                                               2022.3
amr-ros2-openvino                                           2022.3
amr-ros-arduino                                             2022.3
amr-ros-base                                                2022.3
amr-ros1-bridge                                             2022.3
amr-robot-localization                                      2022.3
amr-rplidar                                                 2022.3
amr-rtabmap                                                 2022.3
amr-sick-nanoscan                                           2022.3
amr-slam-toolbox                                            2022.3
amr-turtlebot3                                              2022.3
amr-turtlesim                                               2022.3
amr-vda-navigator                                           2022.3
amr-vda5050-ros2-bridge                                     2022.3
amr-wandering                                               2022.3
amr-wwan-5g-modem                                           2022.3
eiforamr-base-sdk                                           2022.3
eiforamr-full-flavour-sdk                                   2022.3
eiforamr-openvino-sdk                                       2022.3
edge-server-base                                            2022.3
edge-server-fdo-manufacturer                                2022.3
edge-server-fdo-owner                                       2022.3
edge-server-fdo-rendezvous                                  2022.3
edge-server-fleet-management                                2022.3
edge-server-ovms-tls                                        2022.3
EI for AMR containers                                       2022.3
EI for AMR Edge Server containers                           2022.3
EI for AMR Test Module                                      2022.3
EI for AMR Bag Files                                        2022.3
Intel® Edge Software Device Qualification (Intel® ESDQ)     9.0
Docker Compose*                                             1.29.0
Docker* Community Edition (CE)                              20.10.5
Source Code Distribution under GPL                          2022.3

Notices and Disclaimers

You may not use or facilitate the use of this document in connection with any infringement or other legal analysis concerning Intel products described herein. You agree to grant Intel a non-exclusive, royalty-free license to any patent claim thereafter drafted which includes subject matter disclosed herein.
No license (express or implied, by estoppel or otherwise) to any intellectual property rights is granted by this document.
All product plans and roadmaps are subject to change without notice.
The products described may contain design defects or errors known as errata which may cause the product to deviate from published specifications. Current characterized errata are available on request.
Intel disclaims all express and implied warranties, including without limitation, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement, as well as any warranty arising from course of performance, course of dealing, or usage in trade.
Intel technologies may require enabled hardware, software or service activation.
No product or component can be absolutely secure.
Your costs and results may vary.
Code names are used by Intel to identify products, technologies, or services that are in development and not publicly available. These are not “commercial” names and not intended to function as trademarks.
Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.
© Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.
This software and the related documents are Intel copyrighted materials, and your use of them is governed by the express license under which they were provided to you (License). Unless the License provides otherwise, you may not use, modify, copy, publish, distribute, disclose or transmit this software or the related documents without Intel’s prior written permission.
This software and the related documents are provided as is, with no express or implied warranties, other than those that are expressly stated in the License.

Product and Performance Information

1. Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.