Camera-ISP Driver


Introduction

The OMAP3 Camera-ISP driver is a Video4Linux2 (V4L2) driver that uses the v4l2-int-device framework, which has since been marked obsolete by the V4L2 community.

Overall hardware structure

The Camera-ISP contains the following blocks: CSI2 and CCP2 receivers, CCDC, Preview, Resizer, H3A (AEWB and AF) and Histogram.

The way these blocks are interconnected is described in the public Technical Reference Manual, found here (public TRM)

NOTE: Please check Chapter #6: Camera Image Signal Processor


Overall driver structure and functionality

Current driver structure

The supported HW blocks are grouped into the VPFE, the VPBE and the SCM (Statistics Collection Module), described below.

The VPFE consists of the CCDC module and performs signal processing operations on RAW image input data. Depending on the pipeline used, the input RAW data can either be sent to the video port or saved into memory. The CCDC is also responsible for LSC (Lens Shading Compensation).

The VPBE consists of the Preview and Resizer modules. The RAW input data is converted into YUV422 by the Preview module, which also performs operations such as white balance, luma enhancement and black adjustment. The Resizer module resizes the image to the resolution required by user space. The OMAP3 ISP resizer can only scale (up or down) by a factor of up to 4x; for example, a 640x480 image can be scaled up to at most 2560x1920 or down to at most 160x120 under that ratio limit.

The SCM, or Statistics Collection Module, consists of the H3A and Histogram blocks. Once the H3A statistics (AEWB and AF) are collected, they are sent to the 3A engine, which processes them and returns values used to reprogram the ISP and improve the quality of the captured images.

Camera ISP driver files and locations

The files which contain the implementation of Camera ISP modules reside in drivers/media/video/isp/


Camera States

Camera Device State Diagram

From a power management perspective, the Camera Driver supports active and suspended states, entered by the execution of the resume and suspend callbacks respectively. While in the active state, the Camera Driver manages the camera device's state machine through the states depicted in the adjoining figure: Configuration, Buffer Allocation & Mapping, Buffer Prepare and Streaming Capture.
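As a rough illustration of how such suspend/resume callbacks are wired up, the sketch below shows a generic platform driver of that kernel era; the function and driver names are hypothetical, and this is not the actual Camera-ISP implementation.

/* Illustrative only: generic platform-driver suspend/resume hooks.
 * All names here are hypothetical, not the real Camera-ISP driver. */
#include <linux/platform_device.h>

static int example_cam_suspend(struct platform_device *pdev, pm_message_t state)
{
        /* Stop streaming, save ISP context, gate clocks (driver-specific). */
        return 0;
}

static int example_cam_resume(struct platform_device *pdev)
{
        /* Restore context, re-enable clocks, resume streaming if it was active. */
        return 0;
}

static struct platform_driver example_cam_driver = {
        .suspend = example_cam_suspend,
        .resume  = example_cam_resume,
        .driver  = { .name = "example-cam" },
};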


An application can issue any sensor image, preview window, image cropping or control IOCTL in the Configuration state. Once the configuration is done, the application can move the state machine to an intermediate state that allocates and/or maps the streaming buffers. It then moves to the Buffer Prepare state through the V4L2 QBUF IOCTL call, and reaches the Streaming Capture state through the V4L2 STREAMON IOCTL call. V4L2 control IOCTLs can still be issued in the Streaming Capture state. Any state can be suspended or resumed by the Linux power management framework.
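A minimal user-space sketch of this IOCTL sequence is shown below. The device node, resolution and pixel format are only examples, error handling is omitted, and the usual VIDIOC_QUERYBUF/mmap step between REQBUFS and QBUF is skipped for brevity.

#include <fcntl.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
        int fd = open("/dev/video0", O_RDWR);   /* example capture node */

        /* Configuration state: set the capture format. */
        struct v4l2_format fmt;
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 640;
        fmt.fmt.pix.height = 480;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        ioctl(fd, VIDIOC_S_FMT, &fmt);

        /* Buffer Allocation & Mapping state: request driver buffers
         * (each buffer would normally be queried and mmap'ed here). */
        struct v4l2_requestbuffers req;
        memset(&req, 0, sizeof(req));
        req.count = 4;
        req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_REQBUFS, &req);

        /* Buffer Prepare state: queue a buffer with QBUF. */
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = 0;
        ioctl(fd, VIDIOC_QBUF, &buf);

        /* Streaming Capture state: start streaming. */
        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        ioctl(fd, VIDIOC_STREAMON, &type);
        return 0;
}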


Imaging Pipelines

The camera driver supports different data paths, or pipelines. The pipes are classified as High Performance or High Quality, based on system performance and the quality of the captured images.

The supported pipes are described below.


High Performance

In the HP pipe, (for all cases except 720p) the data path used is SEN-CSI2-CCDC-PREV-RESZ-MEM. The resized buffers are then sent to the Display subsystem. For the 720p use case, the path used is SEN-CSI2-CCDC-PREV-MEM. The buffers stored in memory are then sent to the DSS; the DSS then scales those buffers using the ISP resizer. There is an ongoing discussion to remove this '720p hack'.

High Quality

Additional pipelines were added to support HQ capture. These pipes are selected when the Android Camera app requests an HQ image capture. The pipe SEN-CSI2-MEM-CCP2-CCDC-PREV-RESZ-MEM is used for HQ capture. To select the SEN-CSI2-MEM data path, a private IOCTL must be used, i.e. V4L2_CID_PRIVATE_OMAP3ISP_CSI2MEM.
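A hedged sketch of issuing that private control from user space is shown below; the header that actually defines V4L2_CID_PRIVATE_OMAP3ISP_CSI2MEM is driver-specific, so its include location (and the value written) are assumptions here.

#include <sys/ioctl.h>
#include <linux/videodev2.h>
/* V4L2_CID_PRIVATE_OMAP3ISP_CSI2MEM comes from a driver-private header in
 * this kernel tree (its exact location is assumed, not shown here). */

static int select_csi2mem_path(int fd)
{
        struct v4l2_control ctrl;

        ctrl.id = V4L2_CID_PRIVATE_OMAP3ISP_CSI2MEM;
        ctrl.value = 1;         /* enable the SEN-CSI2-MEM data path (assumed) */
        return ioctl(fd, VIDIOC_S_CTRL, &ctrl);
}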


The SEN-CSI2-MEM data path is used to directly receive a raw image with the most appropriate exposure and gain parameters. This path is selected during raw capture; for this purpose, /dev/video5 is used with the configuration for BAYER format. The Sensor-MEM-CCP2-CCDC-H3A-MEM path is used to gather H3A statistics from the RAW image. For this purpose, /dev/video10 is used as a virtual sensor that provides the input image from memory, and /dev/video6 is used to configure the CCDC and H3A engine and to receive the H3A statistics buffers.
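For the raw-capture node, a minimal sketch of the Bayer format configuration follows; /dev/video5 comes from the paragraph above, while the exact fourcc (SRGGB10 here, chosen to match the imx046 examples later on this page) is an assumption.

#include <fcntl.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Configure the raw-capture node for 10-bit Bayer data before streaming. */
static int configure_raw_node(unsigned int width, unsigned int height)
{
        int fd = open("/dev/video5", O_RDWR);
        struct v4l2_format fmt;

        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = width;
        fmt.fmt.pix.height = height;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB10;
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
                return -1;
        return fd;
}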

Camera Driver Kernel Source

The OMAP3 ISP driver is part of the Linux Android Kernel. The source can be found here (android kernel source)

To build the kernel, refer to this page


Camera Driver Feature Trees

Akwasi

Christina

Camera Driver Testing

The camera driver test code can be found here (test code source)

Refer to the README for build instructions.

$ git clone git://dev.omapzoom.org/pub/scm/richo/device_driver_test.git
$ cd device_driver_test/
$ git checkout master
$ git checkout --track -b ddt_master origin/master

Set the following environment variables (adding them to your .bashrc works well):

$ export TESTROOT=<output dir>
$ export CROSS_COMPILE=arm-none-linux-gnueabi-
$ export KDIR=<path to kernel>
$ export HOST=arm-none-linux-gnueabi-
$ export TESTSUITES="camera"

Then build with:

$ make

Note: If you are using the latest version of the git tree, the default make target only builds the "APPLICABLE_TESTS" set, so the camera tests will not be built. To build them, run make TESTSUITES=camera.

The <output dir> is the location where the driver test suites will be copied. Typically, users point <output dir> to a location on their NFS file system.

If you want to build the test apps statically linked (for use in an Android FS), export the following before building:

$ export FSTYPE=android


Execute a specific Camera Test Group:

$ cd camera/test_code/scripts
$ ./test_runner.sh  -p L_DD_CAMERA_0102


Execute a specific Camera Test Case ID (within a Group):

$ cd camera/test_code/scripts
$ ./test_runner.sh  -T 0002  -p L_DD_CAMERA_0102

Media Controller

Background for Media Controller Framework

As the hardware in multimedia embedded devices has become more complex and can be configured in many different ways, the current Video4Linux specification was no longer enough to give applications the flexibility required to truly exploit the hardware.

As a consequence, the driver ended up being too complex and used a lot of non-standard, "hackish" paths to export that functionality to user space.

Since many other multimedia embedded silicon vendors have had very similar experiences, the Media Controller concept was formulated.

A little more background on it can be seen here.

Also, it's very helpful to look at one of the presentations done by Laurent Pinchart in the last V4L2 mini summit, here.

The OMAP3 Camera-ISP driver is a work in progress towards being the first driver to use the Media Controller framework (see above), and contains entities for these HW blocks: Sensor, CSI2, CCP2, CCDC, Preview, Resizer, H3A AEWB, H3A AF and Histogram.

The way those entities are interconnected is intended to be handled by a user-space library.

In a few words, the kernel will just provide the entities and the basic validations needed to configure and link them.

Userspace will then decide, in an abstracted fashion, which pipelines to build (for example, SEN-CSI2-CCDC-PREV-RESZ-MEM), etc.
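To illustrate the split between "kernel provides entities" and "userspace builds pipelines", the sketch below enumerates the entities exposed through the Media Controller API; /dev/media0 is only an example node and error handling is omitted.

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/media.h>

/* Print every entity (CCDC, Preview, Resizer, ...) exposed by the media device. */
int main(void)
{
        int fd = open("/dev/media0", O_RDWR);
        struct media_entity_desc entity;

        memset(&entity, 0, sizeof(entity));
        entity.id = MEDIA_ENT_ID_FLAG_NEXT;             /* start from the first entity */
        while (ioctl(fd, MEDIA_IOC_ENUM_ENTITIES, &entity) == 0) {
                printf("entity %u: %s (%u pads, %u links)\n",
                       entity.id, entity.name, entity.pads, entity.links);
                entity.id |= MEDIA_ENT_ID_FLAG_NEXT;    /* request the next entity */
        }
        return 0;
}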

Also, the main camera driver associates a sensor driver with a flash device and a lens motor to provide a fully integrated camera solution.


To-do list for driver submission

Mandatory items

  1. Migrate Camera-ISP driver from v4l2-int-device to v4l2_subdev framework:
    Already done by Laurent Pinchart, and published on 6 Nov

  2. Migrate Sensor drivers from v4l2-int-device to v4l2_subdev framework
    Done for Nokia N900, Zoom2/3 & 3630SDP platforms

  3. Migrate Camera-ISP driver to Media Controller framework
    Already started, and N900 is the first functional device. The driver can be found in the 'Source' section below.

  4. Submit v4l2_subdev preparations for Media controller for review.
    This is in progress, and these are the revisions so far:
    • V1
    • V2 (included as part of the latest OMAP3 ISP RFC): 1, 2, 3

  5. Submit Media controller patches for review.
    This is in progress, and these are the revisions so far:
  6. After the above is completed, submit the Camera-ISP driver, adapted to the Media Controller, for review. Meanwhile, Laurent has been posting snapshots of the driver to serve as a reference for the Media Controller patches:
    • For V3 of Media Controller: refer to this
    • For V5 of Media Controller: 1, 2, 3.


Desirable items

  1. Migrate other platform's sensor drivers to v4l2_subdev:
    • Aptina MT9P012 5MP RAW10 parallel sensor (3430SDP)
    • Tps61059 Flash (3430SDP)
    • DW9710 Absolute Lens (3430SDP)
    • OmniVision OV3640 3MP YUYV/RAW10 CSI2 sensor (3430SDP, LDP)
    • Aptina MT9T111 3MP YUYV/JPEG parallel sensor (Beagleboard xM)
    • Aptina MT9V113 VGA YUYV parallel sensor (Beagleboard xM)

  2. Prepare Documentation on steps to adapt driver to any custom platform.


People involved in migration to Media Controller Framework

The people involved in these plans are mainly:


Source

The development repository ('devel' branch) is maintained by Laurent Pinchart.

Also, Laurent keeps a clean tree meant only for consolidated and rebased patches for submission located here

Please base all your contributions on the above trees only.


Testing

Media controller can be tested with Laurent's 'media-ctl' tool, located here

Build it with:

$ KDIR=<path to your omap kernel src dir> make

and copy media-ctl binary in your filesystem.

Add "CFLAGS=-static" to compile statically, in case your target filesystem lacks libraries (Like a busybox based filesystem).

For help, run it with -h option.

Some sample instructions:

Show topology of subdevices (-p) for /dev/media0 media controller device (-d):

./media-ctl -d /dev/media0 -p

Reset links (-r) and create a link (-l) for CSI2 subdev with CSI2 output video node:

./media-ctl -d /dev/media0 -r -l '"OMAP3 ISP CSI2a":1 -> "OMAP3 ISP CSI2a output":0 [1]'

Configure imx046 sensor output pad to 3280x2464 SRGGB10, and CSI2a input as the same size (CSI2a output size is propagated automatically):

./media-ctl -d /dev/media0 -f '"imx046 2-001a":0 [SRGGB10 3280x2464]','"OMAP3 ISP CSI2a":0 [SRGGB10 3280x2464]'
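The same pad format can also be set programmatically through the subdev device node; the sketch below is only illustrative (the /dev/v4l-subdevX number varies per board, and the media bus code is an assumption matching the command above).

#include <fcntl.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/v4l2-subdev.h>

/* Set a 3280x2464 SRGGB10 format on pad 0 of a sensor subdev node. */
static int set_sensor_pad_format(const char *subdev_node)
{
        int fd = open(subdev_node, O_RDWR);     /* e.g. "/dev/v4l-subdev8" (board-specific) */
        struct v4l2_subdev_format fmt;

        memset(&fmt, 0, sizeof(fmt));
        fmt.which = V4L2_SUBDEV_FORMAT_ACTIVE;
        fmt.pad = 0;
        fmt.format.width = 3280;
        fmt.format.height = 2464;
        fmt.format.code = V4L2_MBUS_FMT_SRGGB10_1X10;
        fmt.format.field = V4L2_FIELD_NONE;
        return ioctl(fd, VIDIOC_SUBDEV_S_FMT, &fmt);
}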

To actually capture the resulting images, you should be able to use any V4L2-compatible capture app (mplayer with the v4l2 plugin, Laurent's yavta app, etc.).

Appendix A: Sergio Aguirre's tree

This tree's main development focus is Zoom3 boards, and this small guide is intended to help you try the code easily.


Source

You can find the tree here

It contains the following branches:


Compiling

After cloning, and while on the master branch, do:

$ make ARCH=arm CROSS_COMPILE=arm-none-linux-gnueabi- omap2plus_defconfig
$ make ARCH=arm CROSS_COMPILE=arm-none-linux-gnueabi- menuconfig

1. Select "Device Drivers->Multimedia support" (CONFIG_MEDIA_SUPPORT) and enter the submenu.
2. Select "Video For Linux" (CONFIG_VIDEO_DEV).
3. Go back to the main kernel configuration page.
4. Select "System Type->TI OMAP2/3/4 Specific Features->Zoom 2/3 & 3630SDP camera support" (CONFIG_VIDEO_MACH_OMAP_ZOOM).
This will also select "Sony IMX046 sensor driver (8MP)" (CONFIG_VIDEO_IMX046) and "Piezo Actuator Lens driver for LV8093" (CONFIG_VIDEO_LV8093). The resulting options are sketched below.
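For reference, the selections above roughly correspond to the following .config fragment (a sketch; enabling them through menuconfig may pull in additional dependencies):

CONFIG_MEDIA_SUPPORT=y
CONFIG_VIDEO_DEV=y
CONFIG_VIDEO_MACH_OMAP_ZOOM=y
CONFIG_VIDEO_IMX046=y
CONFIG_VIDEO_LV8093=y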

NOTE: If you want to start adding patches, I suggest you follow the topic-branches philosophy.

Compile as usual, with:

$ make ARCH=arm CROSS_COMPILE=arm-none-linux-gnueabi- uImage

Patches in queue for Laurent's tree

WIP not yet submitted

Appendix B: TI MC migration tree

This tree's main development focus is Zoom3 boards and, most importantly, it is based on the TI omap3 integration tree (found here).

IMPORTANT: This tree is only helpful if you want a stable Zoom3 & 3630SDP support tree, but it is NOT meant for upstreaming.

Links

Please visit its dedicated Gitorious project page here.

For help, visit the project's Wiki here.
