2024 Mar. 21

Embedded World 2024 (Apr 9-11)

N.A.T. presents FPGA-based vision platform for real-time analytics of imaging data from multiple high-end cameras at stand 1-211 in hall 1

 

Bonn/Germany, 20 March 2024 - At Embedded World 2024, N.A.T. is presenting NATvision, its new real-time-capable, FPGA-based vision platform for high-resolution, high-end cameras based on the MicroTCA standard. NATvision is the world's first platform of its kind to fully utilize AMD's (formerly Xilinx) ZYNQ UltraScale+ MPSoC FPGA technology for processing and analyzing imaging data streams from high-performance video cameras.

NATvision enables developers and system integrators to process imaging data streams from multiple high-resolution, high-end cameras in real time on a single consolidated platform. It stands out from conventional PC-based vision platforms by using faster and more energy-efficient FPGA technology, which can be modularly adapted to the individual processing tasks and to a wide variety of transmission protocols as required. Thanks to its scalability, up to twelve FPGA cards can be combined in a single NATvision system, and cameras with link speeds of up to 100 GbE can be connected.

 

Demo setup for Embedded World 2024, stand 1-211 in hall 1

 

Wide range of applications

NATvision is based on the open MicroTCA standard, which provides a very high degree of modularity while allowing enormous amounts of data to be transferred flexibly between all of the system's plug-in cards via its high-speed backplane. Thanks to this modularity, every NATvision system can be supplemented with additional processors and AI accelerators. NATvision also supports IEEE 1588 (PTP) for high-precision time synchronization of all connected cameras. This makes NATvision an ideal solution for real-time image processing at high frame rates and high image resolutions.
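As an illustration of how IEEE 1588 is typically switched on for a GigE Vision camera through standard GenICam feature nodes, the following C++ sketch uses the GenApi reference implementation. This is a minimal sketch and not N.A.T. sample code: how the camera's node map is obtained is SDK-specific, and the exact feature name (GevIEEE1588 per the SFNC, or PtpEnable on newer devices) depends on the camera.

    #include <GenApi/GenApi.h>

    // Minimal sketch: enable IEEE 1588 (PTP) on one camera via its GenICam node map.
    // How the node map is obtained is SDK-specific and assumed here.
    void enablePtp(GenApi::INodeMap& nodeMap)
    {
        // "GevIEEE1588" is the SFNC feature name for GigE Vision devices;
        // newer cameras may expose "PtpEnable" instead.
        GenApi::CBooleanPtr ptp = nodeMap.GetNode("GevIEEE1588");
        if (ptp.IsValid() && GenApi::IsWritable(ptp))
            ptp->SetValue(true);
    }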

NATvision enables the use of high-resolution, high-end cameras for inspection, quality control and monitoring across the entire value chain of industrial companies - from development and production to packaging, as well as in safety and security applications. Vertical markets include industrial production (e.g. inspection of roll goods and web materials, printed products, and bulk materials as well as semiconductors and electronic assemblies) and aerospace and automotive testing (e.g. material stress tests, crash tests, airbag dynamics, aerodynamics, combustion processes in engines, thermal measurement). Other applications include accident prevention in intelligent transport systems (e.g. railway technology), medical imaging (e.g. X-ray) and science/research (physics, chemistry, biology/biomechanics).

 

The technology in detail

The FPGA-based NATvision system from N.A.T., which is being presented for the first time at Embedded World, can be equipped with one to twelve FPGA cards to suit specific requirements and applications. The capacity of the AMD ZYNQ UltraScale+ MPSoC devices can be scaled from 103,000 to 1.143 million logic cells, depending on requirements. N.A.T.'s own GigEVision firmware package for the FPGA cards is also designed to be modular, so that other camera interfaces such as CoaXPress can easily be integrated instead of GigEVision. The sample applications supplied with NATvision and the GenICam-compatible SDK enable users to quickly commission the system and integrate it into their own software. N.A.T. is currently working with a partner to add AI functionality to the system. The system chassis comes from partner nVent, which is also presenting the demo system at its Embedded World stand 1-211 in hall 1.
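Since the SDK is described as GenICam-compatible, camera configuration can be expected to go through standard SFNC feature nodes. The C++ sketch below, again based on the GenApi reference implementation, shows what setting up the 4K/180-fps acquisition used in the demo might look like; it is an assumption-based illustration, and obtaining the node map as well as the actual streaming and buffer handling are SDK-specific and omitted.

    #include <GenApi/GenApi.h>

    // Sketch: configure a 4K / 180 fps acquisition through standard SFNC features.
    // The node map comes from the (SDK-specific) transport layer; streaming and
    // buffer handling are likewise SDK-specific and not shown here.
    void configureCamera(GenApi::INodeMap& nodeMap)
    {
        GenApi::CIntegerPtr width  = nodeMap.GetNode("Width");
        GenApi::CIntegerPtr height = nodeMap.GetNode("Height");
        GenApi::CFloatPtr   fps    = nodeMap.GetNode("AcquisitionFrameRate");

        if (GenApi::IsWritable(width))  width->SetValue(3840);
        if (GenApi::IsWritable(height)) height->SetValue(2160);
        if (GenApi::IsWritable(fps))    fps->SetValue(180.0);

        // Start acquisition via the standard SFNC command feature.
        GenApi::CCommandPtr start = nodeMap.GetNode("AcquisitionStart");
        if (GenApi::IsWritable(start))
            start->Execute();
    }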

 

Live demo

The performance of the new real-time, FPGA-based NATvision platform in high-speed, high-resolution vision applications will be demonstrated with a live demo that captures and visualizes the trajectory of a ball in real time. NATvision calculates trajectory predictions, on the basis of which corresponding trigger signals can be initiated. Motion control systems can use such calculations to precisely identify goods recognized as faulty, e.g. to exclude them from the further production process at the appropriate point using a generated trigger.

In the live demo, the image data is recorded with a 4K Ultra HD GigEVision camera at a resolution of 3,840 x 2,160 pixels and a frame rate of up to 180 frames/s. Digital video data streams of up to 35 Gbit/s are transmitted and processed in the FPGA using N.A.T.'s own GigEVision firmware package. The data is then available for further FPGA-based processing as an AXI stream. Finally, the video data is written via DMA to the RAM of the FPGA-integrated ARM cores, which run Linux. From there, data can be transferred to external nodes at up to 100 Gbit/s via the system uplinks, for example. GPUs and AI accelerators can also be integrated as additional cards. Currently, each FPGA card can receive and process up to 40 Gbit/s of image data from up to four cameras; this will increase to 100 Gbit/s per card in the future. The measured processing latency in the FPGA is less than 500 ns, and the processing speed is deterministic in every configuration.
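The press release does not detail the prediction algorithm, so the following C++ sketch is only one plausible reading of the ball demo: a constant-acceleration (ballistic) least-squares fit over the most recent tracked ball positions, used to estimate when the ball will cross a trigger line so that a trigger signal could be fired in advance. All names, units and the model choice are assumptions for illustration, not N.A.T.'s implementation.

    #include <cmath>
    #include <cstdio>
    #include <utility>
    #include <vector>

    struct Sample { double t; double y; };   // timestamp [s], tracked ball position (image or world coordinate)

    // Least-squares fit of y(t) = a*t^2 + b*t + c over the recent samples
    // (constant-acceleration model), solved via the 3x3 normal equations.
    static bool fitParabola(const std::vector<Sample>& s, double& a, double& b, double& c)
    {
        if (s.size() < 3) return false;
        double A[3][4] = {};                                  // augmented normal-equation matrix
        for (const Sample& p : s) {
            const double basis[3] = { p.t * p.t, p.t, 1.0 };
            for (int i = 0; i < 3; ++i) {
                for (int j = 0; j < 3; ++j) A[i][j] += basis[i] * basis[j];
                A[i][3] += basis[i] * p.y;
            }
        }
        for (int i = 0; i < 3; ++i) {                         // Gauss-Jordan with partial pivoting
            int piv = i;
            for (int r = i + 1; r < 3; ++r)
                if (std::fabs(A[r][i]) > std::fabs(A[piv][i])) piv = r;
            for (int j = 0; j < 4; ++j) std::swap(A[i][j], A[piv][j]);
            if (std::fabs(A[i][i]) < 1e-12) return false;
            for (int r = 0; r < 3; ++r) {
                if (r == i) continue;
                const double f = A[r][i] / A[i][i];
                for (int j = i; j < 4; ++j) A[r][j] -= f * A[i][j];
            }
        }
        a = A[0][3] / A[0][0];
        b = A[1][3] / A[1][1];
        c = A[2][3] / A[2][2];
        return true;
    }

    // Earliest time after 'now' at which the fitted trajectory reaches yTrigger,
    // or a negative value if it never does.
    static double predictCrossing(double a, double b, double c, double yTrigger, double now)
    {
        const double C = c - yTrigger;
        if (std::fabs(a) < 1e-12) {                           // (almost) linear motion
            if (std::fabs(b) < 1e-12) return -1.0;
            const double t = -C / b;
            return t > now ? t : -1.0;
        }
        const double disc = b * b - 4.0 * a * C;
        if (disc < 0.0) return -1.0;
        const double r1 = (-b - std::sqrt(disc)) / (2.0 * a);
        const double r2 = (-b + std::sqrt(disc)) / (2.0 * a);
        double best = -1.0;
        if (r1 > now) best = r1;
        if (r2 > now && (best < 0.0 || r2 < best)) best = r2;
        return best;
    }

    int main()
    {
        // Synthetic 180 fps track of a falling ball: y = 0.5 * g * t^2.
        std::vector<Sample> track;
        for (int k = 0; k < 30; ++k) {
            const double t = k / 180.0;
            track.push_back({ t, 0.5 * 9.81 * t * t });
        }
        double a, b, c;
        if (fitParabola(track, a, b, c)) {
            const double tCross = predictCrossing(a, b, c, /*yTrigger=*/1.0, track.back().t);
            if (tCross > 0.0)
                std::printf("fire trigger at t = %.4f s\n", tCross);  // e.g. drive a hardware output
        }
        return 0;
    }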

 

Camera manufacturers, inspection system developers and system integrators can find more information about NATvision and N.A.T.'s related services at https://nateurope.com/solution/natvision/