Nvidia Orin/Jetson + GMSL/RLINC/VbyOne/FPDLink Coaxial AI Multi-Camera Synchronized Automotive Vision Solution
This presentation introduces the application of multi-camera synchronization technology in autonomous machines. It explains how to deliver efficient multi-camera synchronization solutions to customers, focusing on three typical use cases: autonomous delivery robots, vision-sensor upgrades for industrial robot controllers, and humanoid robots. It also details how innovative multi-camera synchronization technology meets the demands of autonomous machines for combined multi-camera perception and precise synchronization of data from different cameras.

Multi-Camera Application in Autonomous Delivery Robots
Autonomous delivery robots are a typical L4 autonomous-driving application and commonly use the NVIDIA Jetson Orin platform as their core compute processor. They must process information from multiple sensors simultaneously, such as lidar, cameras, GPS, and IMU, to achieve autonomous navigation, traffic-light recognition, and obstacle avoidance.
In autonomous vehicle applications, cameras serving different functions have different requirements. Traffic-light cameras must reliably identify traffic lights, which demands very high resolution, HDR, and AWB performance so that the contours and colors of traffic lights are reproduced accurately under varied lighting conditions. Perception cameras recognize the environment around the vehicle and therefore place higher demands on HDR, contrast, and clarity. Surround-view cameras are used for maneuvering: they must not only give a clear view of the vehicle's surroundings but also image clearly at night to support 24-hour operation, so their low-light performance requirements are very high.
To improve the accuracy of perception algorithms, the NVIDIA Jetson autonomous driving controller utilizes precise timing technology to achieve system-wide time synchronization and synchronized exposure for multiple cameras.
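A practical consequence of system-wide time synchronization is that every frame carries a timestamp on a common clock, so frames from different cameras can be paired before fusion. The sketch below shows one way such timestamp-based pairing could look; the function name, microsecond units, and tolerance value are illustrative assumptions, not an NVIDIA API:

```python
from bisect import bisect_left

def match_frames(ts_a, ts_b, tol_us=100):
    """Pair frames from two cameras by capture timestamp (microseconds).

    ts_a, ts_b: sorted timestamp lists on a common clock. A pair is
    accepted only when the nearest timestamps differ by at most tol_us.
    """
    pairs = []
    for t in ts_a:
        i = bisect_left(ts_b, t)
        # candidates: the timestamps just below and just above t
        candidates = ts_b[max(i - 1, 0):i + 1]
        if not candidates:
            continue
        nearest = min(candidates, key=lambda c: abs(c - t))
        if abs(nearest - t) <= tol_us:
            pairs.append((t, nearest))
    return pairs
```

With hardware-synchronized exposure, nearly every frame finds a partner inside the tolerance; without it, free-running cameras drift apart and pairs are silently dropped, which is exactly the failure mode precise timing avoids.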

Upgrading Industrial Robot Controller Solutions
As AI develops and industrial robots upgrade their algorithms, they need to connect to multiple cameras simultaneously, with synchronized exposure across those cameras, to complete tasks such as image capture, object recognition, and grasping.
Traditional industrial robots mostly use USB or Ethernet cameras, which cannot achieve synchronized exposure. To address this, a complete multi-camera synchronization solution is provided based on the NVIDIA Jetson Orin platform.
First, multiple cameras are connected to the Jetson Orin controller over coaxial interfaces (e.g., GMSL/RLINC/VbyOne/FPDLink). The Jetson Orin achieves precise synchronized exposure across the cameras through a triggering mechanism. Then, after the multi-camera data is preprocessed, for example by stitching, the result is fed into the industrial robot's main controller over the same interface.
This not only gives the multiple cameras edge computing capability but also, through the Jetson Orin's multi-camera synchronization processing, solves the problem of precise synchronized exposure that USB cameras could not previously achieve.
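One common way a shared trigger pulse yields aligned captures is to delay each camera's trigger so that the exposure midpoints coincide even when exposure times differ. The sketch below shows that offset calculation; the function name and units are illustrative assumptions, and real deserializer trigger programming is hardware-specific:

```python
def trigger_offsets_us(exposures_us):
    """Per-camera trigger delays (microseconds) on a shared trigger pulse
    so that all cameras' exposure midpoints coincide.

    The camera with the longest exposure starts immediately (offset 0);
    shorter exposures are delayed by half the exposure-time difference.
    """
    longest = max(exposures_us)
    return [(longest - e) / 2 for e in exposures_us]
```

For example, with a 10 ms and a 4 ms exposure, the shorter one is delayed by 3 ms, so both exposures are centered 5 ms after the pulse and a moving object is sampled at the same instant by both cameras.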

Multi-Sensor Application in Humanoid Robots
Humanoid robots, currently a major industry focus, are also typical autonomous machines that rely heavily on cameras for perception and recognition. To operate autonomously, a robot must precisely perceive its surrounding environment and targets, and that perception depends largely on cameras.
A common camera deployment scheme for humanoid robots includes: a stereo camera on the head, similar to human eyes; two cameras each on the chest and back, covering the robot's front and rear fields of view; and cameras on the joints that compensate for blind spots, better meeting the requirements of tasks such as object recognition.
Furthermore, humanoid robots need to perceive complex scenes during operation. To achieve precise target localization, depth information needs to be calculated, thus requiring strict synchronization among different cameras.
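The depth calculation itself shows why strict synchronization matters: rectified stereo depth follows the pinhole relation Z = f·B/d, and if the two frames are captured at different instants, a moving target shifts between views and corrupts the disparity d. A minimal sketch of the formula (parameter names are illustrative):

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth from rectified stereo via the pinhole model: Z = f * B / d.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two camera centers in meters
    disparity_px: horizontal pixel shift of the target between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

With a 700 px focal length and a 12 cm baseline, a 42 px disparity gives 2.0 m of depth; a single pixel of disparity error, such as one caused by unsynchronized capture of a moving target, already shifts that estimate by roughly 5 cm.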
In summary, cameras for humanoid robots need long-distance transmission, highly stable links, and multi-camera time synchronization. Synchronization technology based on the NVIDIA Jetson platform, combined with multiple digital coaxial cameras (e.g., GMSL/RLINC/VbyOne/FPDLink), effectively meets the demands of these machine-vision applications.

Series of Products Launched Based on NVIDIA Jetson Orin
As a leading global provider of imaging and perception products and services, the company has focused on imaging and vision technology since its founding. Through excellent camera modules and solutions, it provides first-class services for intelligent driving, autonomous driving, autonomous machines, vehicle-road collaboration, and other fields.
Its cooperation with NVIDIA goes back many years. A series of development kits has already been launched on the NVIDIA Jetson Orin platform, including coaxial cameras (e.g., GMSL/RLINC/VbyOne/FPDLink), MIPI cameras, and EVS bionic event cameras.