jetson-inference is a training guide for inference and deep learning on Jetson platforms. It uses NVIDIA TensorRT to deploy neural networks efficiently. The "dev" branch of the repository is specifically oriented toward the Jetson Xavier, since it uses the Deep Learning Accelerator (DLA) integration introduced with TensorRT 5.
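To give a feel for what the DLA integration looks like at the TensorRT level, here is a minimal sketch of the builder settings involved. This is not taken from the jetson-inference sources; the calls (setFp16Mode, setDefaultDeviceType, setDLACore, allowGPUFallback) come from the public TensorRT 5 C++ builder API, and their exact form may differ slightly between TensorRT releases.

```cpp
// Minimal sketch: steering a TensorRT 5 engine build toward the DLA on Xavier.
// The builder would come from nvinfer1::createInferBuilder(logger); only the
// DLA-related configuration is shown here.
#include <NvInfer.h>

void configureForDLA(nvinfer1::IBuilder* builder)
{
    builder->setFp16Mode(true);                                // the DLA runs reduced precision (FP16/INT8), not FP32
    builder->setDefaultDeviceType(nvinfer1::DeviceType::kDLA); // prefer the DLA for layers it supports
    builder->setDLACore(0);                                    // Xavier exposes two DLA cores, 0 and 1
    builder->allowGPUFallback(true);                           // unsupported layers fall back to the GPU
}
```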
With jetson-inference you can deploy deep learning examples on the Xavier in a matter of minutes. Some of the example applications are shown below.
ImageNet is a classification network trained on a database of 1000 object classes. The input is an image, and the output is the most likely class along with the probability that the image belongs to that class.
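As a rough sketch of how the classification example drives the library, the snippet below is modeled on the imagenet-console sample in jetson-inference. The calls shown (imageNet::Create(), loadImageRGBA(), Classify(), GetClassDesc()) exist in the library, but their exact signatures and header paths have shifted between releases, so treat this as an outline rather than copy-paste code.

```cpp
// Sketch of image classification with jetson-inference, modeled on the
// imagenet-console sample.  Signatures vary between library versions.
#include <jetson-inference/imageNet.h>
#include <jetson-utils/loadImage.h>
#include <cstdio>

int main(int argc, char** argv)
{
    // load a classification network (a 1000-class ImageNet model by default)
    imageNet* net = imageNet::Create(argc, argv);
    if (!net)
        return 1;

    // load the input image into shared CPU/GPU memory
    float* imgCPU  = NULL;
    float* imgCUDA = NULL;
    int width = 0, height = 0;

    if (!loadImageRGBA(argv[1], (float4**)&imgCPU, (float4**)&imgCUDA, &width, &height))
        return 1;

    // classify the image; returns the most likely class index and its confidence
    float confidence = 0.0f;
    const int classIndex = net->Classify(imgCUDA, width, height, &confidence);

    if (classIndex >= 0)
        printf("class '%s' with %.2f%% confidence\n",
               net->GetClassDesc(classIndex), confidence * 100.0f);

    delete net;
    return 0;
}
```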
Image recognition networks output class probabilities corresponding to the entire input image. Detection networks, on the other hand, find where in the image those objects are located. DetectNet accepts an input image and outputs the class and coordinates of the detected bounding boxes.
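The sketch below follows the same pattern for detection, loosely based on the detectnet-console sample. The detection API has changed shape over the project's history; this version uses the Detection-struct form found in newer releases of the library, so the field and method names may not match the dev branch exactly.

```cpp
// Sketch of object detection with jetson-inference, loosely following the
// detectnet-console sample.  The detection API has changed across releases;
// this uses the Detection-struct form found in newer versions.
#include <jetson-inference/detectNet.h>
#include <jetson-utils/loadImage.h>
#include <cstdio>

int main(int argc, char** argv)
{
    // load a detection network (a pedestrian DetectNet model by default)
    detectNet* net = detectNet::Create(argc, argv);
    if (!net)
        return 1;

    // load the input image into shared CPU/GPU memory
    float* imgCPU  = NULL;
    float* imgCUDA = NULL;
    int width = 0, height = 0;

    if (!loadImageRGBA(argv[1], (float4**)&imgCPU, (float4**)&imgCUDA, &width, &height))
        return 1;

    // run detection; each result carries a class ID, confidence, and bounding box
    detectNet::Detection* detections = NULL;
    const int numDetections = net->Detect(imgCUDA, width, height, &detections);

    for (int n = 0; n < numDetections; n++)
    {
        printf("detected '%s' (%.1f%%) at (%.0f, %.0f) - (%.0f, %.0f)\n",
               net->GetClassDesc(detections[n].ClassID),
               detections[n].Confidence * 100.0f,
               detections[n].Left, detections[n].Top,
               detections[n].Right, detections[n].Bottom);
    }

    delete net;
    return 0;
}
```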
For more examples and a tutorial on how to get jetson-inference running on your Xavier, please visit our jetson-inference wiki page.
If you are new to the Xavier, or are planning on getting one, please visit our Jetson Xavier wiki page.