Thursday, December 27, 2018

RidgeRun's Sony IMX219 CMOS Image Sensor Linux Driver for NVidia Jetson Xavier and Jetson TX1/TX2

This blog highlights RidgeRun's work developing a CMOS image sensor Linux driver for the Sony IMX219 on the NVIDIA Jetson Xavier and Jetson TX1/TX2 platforms.

Driver Features:

  • L4T 31.1 and JetPack 4.1
  • V4L2 media controller driver
  • Single-camera capture (TODO: expand to 6 cameras)
  • Tested resolution: 3280x2464 @ 15 fps
  • Tested resolution: 1280x720 @ 78 fps
  • Tested resolution: 1640x1232 @ 30 fps
  • Tested resolution: 820x616 @ 30 fps
  • Tested with the Auvidea J20 board
  • Capture with v4l2src and also with nvarguscamerasrc using the ISP
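As a sketch of the two capture paths listed above, the pipelines might look like the following. The device node, sensor modes, and sinks are assumptions; adjust them for your board and the modes your driver exposes.

```shell
# Assumed raw capture path: v4l2src reads Bayer data straight from the
# sensor node (assumed /dev/video0), bypassing the ISP.
V4L2_PIPE="gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=100 ! \
  'video/x-bayer,width=3280,height=2464,framerate=15/1' ! fakesink"

# Assumed ISP path: nvarguscamerasrc debayers on the ISP and outputs
# YUV frames in NVMM memory for display.
ARGUS_PIPE="gst-launch-1.0 nvarguscamerasrc num-buffers=100 ! \
  'video/x-raw(memory:NVMM),width=1640,height=1232,framerate=30/1' ! \
  nvoverlaysink"

# Print the commands; run them on a Jetson with the IMX219 attached.
echo "$V4L2_PIPE"
echo "$ARGUS_PIPE"
```

The raw path is useful for validating the driver itself; the ISP path is what you would normally use for display or encoding.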
The images attached here were taken during development of the driver for the Sony IMX219 image sensor, while testing various image capture and display options and measuring performance and latency in our R&D lab in Costa Rica. The various tests were carried out using GStreamer pipelines.

Details on enabling and building the driver with the Auvidea J20 expansion board, example GStreamer pipelines, performance, and latency measurements are shared in the RidgeRun developer wikis mentioned at the end of this blog.

RidgeRun's Sony IMX219 Linux driver for Jetson Xavier

Output image of the IMX219 camera sensor captured at 1640x1232 resolution with the nvarguscamerasrc GStreamer element on the Jetson Xavier platform:




RidgeRun's Sony IMX219 Linux driver latency measurements on Jetson Xavier

Frames showing how our team set up the glass-to-glass latency measurement while testing image capture with the IMX219 camera mode at 3280x2464@16fps on the Xavier platform.


The measured glass-to-glass latency is 215 ms (07.772 minus 07.557); the time readings can be seen in the displays.
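The subtraction above can be reproduced directly from the on-screen timestamps. A minimal sketch, with the timestamps converted to milliseconds:

```shell
# Glass-to-glass latency from the two display timestamps (seconds.milliseconds).
T_DISPLAY=7772   # 07.772 s on the monitor showing the captured stream
T_SOURCE=7557    # 07.557 s on the source clock display
LATENCY_MS=$((T_DISPLAY - T_SOURCE))
echo "${LATENCY_MS} ms"   # prints: 215 ms
```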

RidgeRun's Sony IMX219 Linux driver for Jetson TX1

The image below shows the IMX219 capture at 1640x1232 resolution with nvcamerasrc on the Jetson TX1 platform. The camera is aimed at the computer monitor on the left, which reflects the wall and ceiling shown on the right:



RidgeRun's Sony IMX219 Linux driver latency measurements on Jetson TX1

The image below was captured while measuring the Jetson TX1 glass-to-glass latency for the 1080p 30fps IMX219 camera mode:


The measured glass-to-glass latency is 130 ms (13.586 minus 13.456); the time readings can be seen in the displays.

You can find more information about the driver in these RidgeRun developer wikis:


For technical information, please email us at support@ridgerun.com, or for purchase-related questions post your inquiry on our Contact Us page.

2 comments:

  1. Hi,

    how does using v4l impact performance over using the nvargus-pipeline and raw memory access?

    1. There are 3 main ways to capture in Jetson:

      a) Plain V4L2: Using V4L2 IOCTL calls you can create an application that captures the data available on the MIPI CSI inputs and puts it in memory without any additional post-processing. This approach bypasses the built-in ISP, so no debayering is done. Latency is typically around 2 frames to get the frame into user space, and the ARM load is lower than with the other methods because V4L2 just puts the data in memory. Example applications using this approach are YAVTA and GStreamer's v4l2src.
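      A minimal sketch of approach a) using v4l2-ctl (the device node and sensor mode are assumptions; list the real modes with `v4l2-ctl --list-formats-ext`):

      ```shell
      # Capture 10 raw Bayer frames straight from the V4L2 device, no ISP involved.
      # /dev/video0 and the 10-bit Bayer mode (RG10) are assumptions for this sketch.
      CMD="v4l2-ctl -d /dev/video0 \
        --set-fmt-video=width=3280,height=2464,pixelformat=RG10 \
        --stream-mmap --stream-count=10 --stream-to=frames.raw"
      echo "$CMD"   # run this on the Jetson; frames.raw will hold raw Bayer data
      ```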

      b) nvargus library: This is the Argus library provided by NVIDIA. Besides capturing, it passes the data through the ISP for post-processing (mainly debayering) to finally produce a YUV frame. It uses NVIDIA's Argus daemon, which loads one core at around 40-50%. It also increases latency (by perhaps a couple of frames) depending on the algorithms enabled.

      c) nvarguscamerasrc: This is a GStreamer element provided by NVIDIA. It also uses the Argus camera daemon, so its performance is similar to b).

      If you want the lowest possible latency and ARM load, I recommend a), but try using a sensor that outputs YUV data so you don't need a color space conversion later, since the encoder and display will need YUV data.
