Message ID | 20240806144517.1903397-2-stefan.klug@ideasonboard.com |
---|---|
State | Superseded |
Headers | show |
Series |
|
Related | show |
Quoting Stefan Klug (2024-08-06 15:44:41) > This patch adds a initial version of the tuning guide for libcamera. The > process is established based on the imx8mp and will be improved when we > do more tuning for different ISPs with our own tooling. > > Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com> > --- > Documentation/conf.py | 4 +- > Documentation/guides/tuning/tuning.rst | 276 +++++++++++++++++++++++++ > Documentation/index.rst | 1 + > Documentation/meson.build | 1 + > 4 files changed, 280 insertions(+), 2 deletions(-) > create mode 100644 Documentation/guides/tuning/tuning.rst > > diff --git a/Documentation/conf.py b/Documentation/conf.py > index 7eeea7f3865b..5387942b9af5 100644 > --- a/Documentation/conf.py > +++ b/Documentation/conf.py > @@ -21,8 +21,8 @@ > # -- Project information ----------------------------------------------------- > > project = 'libcamera' > -copyright = '2018-2019, The libcamera documentation authors' > -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund' > +copyright = '2018-2024, The libcamera documentation authors' > +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug' > > # Version information is provided by the build environment, through the > # sphinx command line. > diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst > new file mode 100644 > index 000000000000..1c30f3f20b4e > --- /dev/null > +++ b/Documentation/guides/tuning/tuning.rst > @@ -0,0 +1,276 @@ > +.. SPDX-License-Identifier: CC-BY-SA-4.0 > + > +Sensor Tuning Guide > +=================== > + > +To create visually good images out of the raw data provided by the camera s/out of the/from the/ > +sensor, a lot of image processing takes place. This is usually done inside an > +ISP (Image Signal Processor). To be able to do the necessary processing, the > +corresponding algorithms need to be parameterized according to the used hardware s/to the used hardware/to the hardware in use/ > +(typically sensor, light and lens). Creating these parameters is called tuning. s/Creating/Calculating/ ? s/is called/is a process called/ > +The tuning process results in a tuning file which is then used by libcamera to > +parameterize the algorithms at runtime. s/parameterize the/provide calibrated parameters to the/ > + > +The processing block of an ISP vary from vendor to vendor and can be s/block/blocks > +arbitrarily complex. Never the less a diagram of the most common and important s/Never the less/Nevertheless/ ? - based on https://dictionary.cambridge.org/dictionary/english/nevertheless Is it correct to say these are the most common and important ? That sounds too subjective and dependent upon use cases... Perhaps: "a diagram of common blocks frequently found in an ISP design is shown below:" But I don't mind your version. > +blocks is shown below: > + > + :: > + > + +--------+ > + | Light | > + +--------+ > + | > + v > + +--------+ +-----+ +-----+ +-----+ +-----+ +-------+ > + | Sensor | -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma | > + +--------+ +-----+ +-----+ +-----+ +-----+ +-------+ > + > +**Light** The light used to light the scene has a crucial influence on the > +resulting image. The human eye and brain are specialized in automatically > +adapting to different lights. In a camera this has to be done by an algorithm. > +Light is a complex topic and to correctly describe light sources you need to > +describe the spectrum of the light. To simplify things, lights are categorized > +according to their `color temperature (K) > +<https://en.wikipedia.org/wiki/Color_temperature>`_. This is important to > +keep in mind as it means that calibrations may differ between light sources even > +though they have the same nominal color temperature. For best results the tuning > +images need to be taken for a complete range of light sources. I would put that last statement on it's own line as it's a distinct/important point ("For best results...") > + > +**Sensor** The sensor captures the incoming light and produces the raw images. "and converts a measured analogue signal to a digital representation which conveys the raw images." > +Data is commonly filtered by a color filter array into red, green and blue Data? or light ... I think the light is filtered by a colour filter... > +channels. As these filters are not perfect some postprocessing needs to be done > +to recreate the correct color. > + > +**BLC** Black level correction. Even without incoming light a sensor produces > +pixel values above zero. This is due to a artificially added pedestal value s/due to a/due to an/ > +(Which is added to get a evenly distributed noise level around zero instead of > +only the upper half) and other effects in the electronics light dark currents. Rather than bracket that, I'd suggest: " This is due to two system artifacts that impact the measured light levels on the sensor. Firstly, a deliberate and artificially added pedestal value is added to get an evenly distributed noise level around zero to avoid negative values and clipping the data. Secondly, additional underlying electrical noise can be caused by various external factors including thermal noise and electrical interferences. > +To get good images with real black, that black level needs to be subtracted. As > +that level is typically known for a sensor it is hardcoded in libcamera and does > +not need any calibration at the moment. "As the pedestal value is typically known" (we don't know the electrical noise component) Is it hardcoded though? Can't it be overridden by the tuning file still? > + > +**AWB** Auto white balance. For a proper image the color channels need to be I'm not sure if 'proper image' is defined well. But I haven't got a better option in my head yet... > +adjusted, to get correct white balance. This means that monochrome objects in > +the scene appear monochrome in the output image (white is white and gray is > +gray). In libcamera this is done based on a gray world model. No tuning is > +necessary for this step. "Presently in the libipa implementation of libcamera, this is managed by a grey world model." ? > + > +**LSC** Lens shading correction. The lens in use has a big influence on the > +resulting image. The typical effects on the image are lens-shading (also called > +vignetting). This means that due to the physical properties of a lens the > +transmission of light falls of to the corners of the lens. To make things even 'falls off towards the corners' ? > +harder, this falloff can be different for different colors/wave lengths. LSC is s/wave lengths/wavelengths/ > +therefore tuned for a set of light sources and for each color channel > +individually. > + > +**CCM** Color correction matrix. After the previous processing blocks the grays > +are preserved, but colors are still not correct. This is mostly due to the > +color temperature of the light source and the imperfections in the color filter > +array of the sensor. To correct for this a so called color correction matrix is s/a so called color correction matrix/a 'color correction matrix'/ ? > +calculated. This is a 3x3 matrix, that is used to optimize the captured colors > +regarding the perceived color error. To do this a chart of known and precisely > +measured colors (macbeth chart) is captured. Then the matrix is calculated s/(macbeth chart)/commonly called a macbeth chart/ > +using linear optimization to best map each measured colour of the chart to it's > +known value. The color error is measured in `deltaE > +<https://en.wikipedia.org/wiki/Color_difference>`_. > + > +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for > +display. For images to be perceived correctly by the human eye, they need to be > +encoded with the corresponding inverse gamma. See also > +<https://en.wikipedia.org/wiki/Gamma_correction>. This block doesn't need > +tuning, but is crucial for correct visual display. > + > +Materials needed for the tuning process > +--------------------------------------- > + > +Precisely calibrated optical equipment is very expensive and out of the scope of > +this document. Still it is possible to get reasonably good calibration results > +at little costs. The most important devices needed are: > + > + - A light box with the ability to produce defined light of different color > + temperatures. Typical temperatures used for calibration are 2400K > + (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K > + (daylight). As a first test, keylights for webcam streaming can be used. > + These are available with support for color temperatures ranging from 2500K > + to 9000K. For better results professional light boxes are needed. > + - A ColorChecker chart. These are sold from calibrite and it makes sense to > + get the original one. > + - A integration sphere. This is used to create completely homogenious light > + for the lens shading calibration. We had good results with the use of > + a homogeneous area light (the above mentioned keylight). I would perhaps call this "a large light panel with sufficient diffusion to get an even distribution of light." > + - An environment without external light sources. Ideally calibration is done > + without any external lights in a fully dark room. A black curtain is one > + solution, working by night is the cheap alternative. > + Indeed, I think I'm happier with a dark curtain so far so I don't have to stay up so late. And I've hit 50% through the document, but run out of time so I'll snip and pause here! <snip> -- Kieran
Hi Stefan, Resuming at ~50% Quoting Stefan Klug (2024-08-06 15:44:41) > This patch adds a initial version of the tuning guide for libcamera. The > process is established based on the imx8mp and will be improved when we > do more tuning for different ISPs with our own tooling. > > Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com> > --- > Documentation/conf.py | 4 +- > Documentation/guides/tuning/tuning.rst | 276 +++++++++++++++++++++++++ > Documentation/index.rst | 1 + > Documentation/meson.build | 1 + > 4 files changed, 280 insertions(+), 2 deletions(-) > create mode 100644 Documentation/guides/tuning/tuning.rst > <snip> > +Overview over the process > +------------------------- > + > +The tuning process of libcamera consists of the following steps: > + > + 1. Take images for the tuning steps with different color temperatures > + 2. Configure the tuning process > + 3. Run the tuning script to create a corresponding tuning file > + > + > +Taking raw images with defined exposure/gain > +-------------------------------------------- > + > +For all the tuning files you need to capture a raw image with a defined exposure > +time and gain. It is crucial that no pixels are saturated on any channel. At the > +same time the images should not be too dark. Strive for a exposure time, so that > +the brightest pixel is roughly 90% saturated. Finding the correct exposure time > +is currently a manual process. We are working on solutions to aid in that > +process. After finding the correct exposure time settings, the easiest way to > +capture a raw image is with the ``cam`` application that gets installed with the > +regular libcamera install. > + > +Create a ``constant-exposure.yaml`` file with the following content: > + > +.. code:: yaml > + > + frames: > + - 0: > + AeEnable: false > + ExposureTime: 1000 AnalogueGain? Or should it always be expected to be set to 1 ? Even if set to 1 I'd put that in the yaml example I think so it's "there" and tunable/changeable. > + > +Then capture one or more images using the following command: > + ``cam -c 1 --script constant-exposure.yaml -C -s role=raw --file=image_#.dng`` One issue I see here is that different lights from different colour temperatures will likely need different exposure/gain value. It's possibly worth stating that somewhere so there's not an assumption that one exposure value will all captures. My light box for instance definitely has different brightnesses across the different light sources. I see you've already stated we're working on improvements here above though so I think we can get started with this. Should we specify the stream size on the raw stream? I expect there will often be multiple 'modes' which may or may not be cropped. We should probably state that the captures should ideally be from a mode with the full sensor area exposed. > + > + > +Capture images for lens shading correction > +------------------------------------------ > + > +To be able to correct lens shading artifacts, images of a homogeneously lit > +neutral gray background are needed. Ideally a integration sphere is used for > +that task. > + > +As a fallback images of a area light or a white wall can be used. If there are 'of an area light' ? I can't tell if you are talking about taking pictures of a flat panel led 'light' there, or if you are referring to a light or white wall. I assume you were meaning 'a light or white' wall. I have found that really challenging when I've tried that in fact - and found it far easier to get captures of a large flat panel LED with diffuser. Might be another method to recommend or highlight. <edit, now I see the images I realise you call an 'area light' what I would call a flat panel light> > +multiple images for the same color temperature, the tuning scripts will average > +the tuning results which can further improve the tuning quality. > + > +.. Note:: Currently lsc is ignored in the ccm calculation. This can lead to > + slight color deviations and will be improved in the future. > + > +Images shall be taken for multiple color temperatures and named > +``alsc_<temperature>k_#.dng``. ``#`` can be any number, if multiple images are > +taken for the same color temperature. > + > +.. figure:: img/setup_lens_shading.jpg > + :width: 50% > + > + Sample setup using an area light. Due to the wide angle lens the camera has > + to moved closely in front of the light. This is not a perfect solution as > + angle to the light might get way too large. I wouldn't say 'due to the wide angle lens' as not all cameras have wide angle lenses, but indeed - a draw back to this method is that cameras with a wide angle lens may have a field of view that captures beyond the light. > + > +.. figure:: img/camshark-alsc.png > + :width: 100% > + > + A calibration image for lens shading correction. This shows using 'camshark' to view the image, which isn't too unreasonable as a viewer. I'm really looking forward to when that can be the capture and realtime adjustment tool too ... but I think it's fine to continue with cam as the capture tool for the moment, even if we are highlighting that the preview/determining the correct exposure/gains might be worth while using camshark for as it has fast feedback and viewing of the scene. > + > +- Ensure the full sensor is lit. Especially with wide angle lenses it may be > + difficult to get the full field of view lit homogeneously. > +- Ensure no color channel is saturated (can be checked with camshark) Can you be more specific here perhaps? Ensure that the brightest area of the image does not contain any pixels that reach 100% on any of the colors. Maybe we should move the % values in camshark on the color info line to a saturation line to be explicit (with just a 'max' of any of the channels perhaps) > + > +Take a raw dng image for multiple color temperatures: > + ``cam -c 1 --script constant-exposure.yaml -C -s role=raw --file=alsc_3200k_#.dng`` > + > + I'm afraid again, I think it's important here that the constant-exposure.yaml should be specific to each light source! I look forward to when the 'tuning tool' can analyse the image and do some 'automatic' AE loop on the RAW with manual controls to handle this. Maybe that's something we should already throw together in (q)cam as well... doesn't even have to run everyframe ... but I imagine this will be better handled in camshark first. > +Capture images for color calibration > +------------------------------------ > + > +To do the color calibration, raw images of the color checker need to be taken > +for different light sources. These need to be named > +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``. > + > +For best results the following hints should be taken into account: > + - Ensure that the 18% gray patch (third from the right in bottom line) is > + roughly at 18% saturation for all channels (a mean of 16-20% should be fine) > + - Ensure the color checker is homogeneously lit (ideally from 45 degree above) > + - No straylight from other light sources is present > + - The color checker is not too small and not too big (so that neither lens > + shading artifacts nor lens distortions are prevailing in the area of the > + color chart) > + > +If no lux meter is at hand to precisely measure the lux level, a lux meter app > +on a mobile phone can provide a sufficient estimation. Or ... hopefully in the future - another libcamera calibrated device ;-) (I'm imagining an RPi zero or such with a camera in our tuning boxes in the future) > +.. figure:: img/setup_calibration2.jpg > + :width: 50% > + > + > +Run the tuning scripts > +---------------------- > + > +After taking the calibration images, you should have a directory with all the > +tuning files. It should look something like this: > + > + :: > + > + ../tuning-data/ > + ├── alsc_2500k_0.dng > + ├── alsc_2500k_1.dng > + ├── alsc_2500k_2.dng > + ├── alsc_6500k_0.dng > + ├── imx335_1000l_2500k_0.dng > + ├── imx335_1200l_4000k_0.dng > + ├── imx335_1600l_6000k_0.dng Why do the colour charts have imx335 but the lsc not ? Should we be more consistent? > + > + > +The tuning scripts are part of the libcamera source tree. After cloning the > +libcamera sources the necessary steps to create a tuning file are: > + > + :: > + > + # install the necessary python packages > + cd libcamera > + python -m venv venv > + source ./venv/bin/activate > + pip install -r utils/tuning/requirements.txt > + > + # run the tuning script > + utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml > + > +After the tuning script has run, the tuning file can be tested with any > +libcamera based application like `qcam`. To quickly switch to a specific tuning > +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful. > +E.g.: > + > + :: > + > + LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1 > + > + > +Sample images > +------------- > + > + > +.. figure:: img/image-no-blc.png > + :width: 800 > + > + Image without black level correction (and completely invalid color estimation). > + > +.. figure:: img/image-no-lsc-3000.png > + :width: 800 > + > + Image without lens shading correction @ 3000k. The vignetting artifacts can > + be clearly seen > + > +.. figure:: img/image-no-lsc-6000.png > + :width: 800 > + > + Image without lens shading correction @ 6000k. The vignetting artifacts can > + be clearly seen > + > +.. figure:: img/image-fully-tuned-3000.png > + :width: 800 > + > + Fully tuned image @ 3000k > + > +.. figure:: img/image-fully-tuned-6000.png > + :width: 800 > + > + Fully tuned image @ 6000k I think we'll add more sample images and descriptions on what to look for to an untrained eye in the future too, so I like this section ;-) I wonder if we should highlight or 'circle' the areas of interest like the vignetting? (or other interesting points in future sample images). But I don't think we need to do this now. > + > diff --git a/Documentation/index.rst b/Documentation/index.rst > index 5442ae75dde7..991dcf2b66fb 100644 > --- a/Documentation/index.rst > +++ b/Documentation/index.rst > @@ -16,6 +16,7 @@ > > Developer Guide <guides/introduction> > Application Writer's Guide <guides/application-developer> > + Sensor Tuning Guide <guides/tuning/tuning> > Pipeline Handler Writer's Guide <guides/pipeline-handler> > IPA Writer's guide <guides/ipa> > Tracing guide <guides/tracing> > diff --git a/Documentation/meson.build b/Documentation/meson.build > index 30d395234952..471eabcac344 100644 > --- a/Documentation/meson.build > +++ b/Documentation/meson.build > @@ -77,6 +77,7 @@ if sphinx.found() > 'guides/ipa.rst', > 'guides/pipeline-handler.rst', > 'guides/tracing.rst', > + 'guides/tuning/tuning.rst', > 'index.rst', > 'lens_driver_requirements.rst', > 'python-bindings.rst', > -- > 2.43.0 Oh I got there. Looking good though! Thanks! -- Kieran
Hi Kieran, Thank you for the review. I marked most comments as done. They will be included in the next version. There was only one comment with a response from my side. On Tue, Aug 06, 2024 at 05:55:06PM +0100, Kieran Bingham wrote: > Quoting Stefan Klug (2024-08-06 15:44:41) > > This patch adds a initial version of the tuning guide for libcamera. The > > process is established based on the imx8mp and will be improved when we > > do more tuning for different ISPs with our own tooling. > > > > Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com> > > --- > > Documentation/conf.py | 4 +- > > Documentation/guides/tuning/tuning.rst | 276 +++++++++++++++++++++++++ > > Documentation/index.rst | 1 + > > Documentation/meson.build | 1 + > > 4 files changed, 280 insertions(+), 2 deletions(-) > > create mode 100644 Documentation/guides/tuning/tuning.rst > > > > diff --git a/Documentation/conf.py b/Documentation/conf.py > > index 7eeea7f3865b..5387942b9af5 100644 > > --- a/Documentation/conf.py > > +++ b/Documentation/conf.py > > @@ -21,8 +21,8 @@ > > # -- Project information ----------------------------------------------------- > > > > project = 'libcamera' > > -copyright = '2018-2019, The libcamera documentation authors' > > -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund' > > +copyright = '2018-2024, The libcamera documentation authors' > > +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug' > > > > # Version information is provided by the build environment, through the > > # sphinx command line. > > diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst > > new file mode 100644 > > index 000000000000..1c30f3f20b4e > > --- /dev/null > > +++ b/Documentation/guides/tuning/tuning.rst > > @@ -0,0 +1,276 @@ > > +.. SPDX-License-Identifier: CC-BY-SA-4.0 > > + > > +Sensor Tuning Guide > > +=================== > > + > > +To create visually good images out of the raw data provided by the camera > > s/out of the/from the/ done > > > +sensor, a lot of image processing takes place. This is usually done inside an > > +ISP (Image Signal Processor). To be able to do the necessary processing, the > > +corresponding algorithms need to be parameterized according to the used hardware > > s/to the used hardware/to the hardware in use/ done > > > +(typically sensor, light and lens). Creating these parameters is called tuning. > > s/Creating/Calculating/ ? done > > s/is called/is a process called/ done > > > +The tuning process results in a tuning file which is then used by libcamera to > > +parameterize the algorithms at runtime. > > s/parameterize the/provide calibrated parameters to the/ done > > > + > > +The processing block of an ISP vary from vendor to vendor and can be > > s/block/blocks done > > > +arbitrarily complex. Never the less a diagram of the most common and important > > s/Never the less/Nevertheless/ ? Oh yes it get's written as one word... > > - based on https://dictionary.cambridge.org/dictionary/english/nevertheless I get "500 Internal server error" :-) > > Is it correct to say these are the most common and important ? That > sounds too subjective and dependent upon use cases... > > Perhaps: > > "a diagram of common blocks frequently found in an ISP design is shown > below:" Definitely sounds better. Taken. > > But I don't mind your version. > > > > +blocks is shown below: > > + > > + :: > > + > > + +--------+ > > + | Light | > > + +--------+ > > + | > > + v > > + +--------+ +-----+ +-----+ +-----+ +-----+ +-------+ > > + | Sensor | -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma | > > + +--------+ +-----+ +-----+ +-----+ +-----+ +-------+ > > + > > +**Light** The light used to light the scene has a crucial influence on the > > +resulting image. The human eye and brain are specialized in automatically > > +adapting to different lights. In a camera this has to be done by an algorithm. > > +Light is a complex topic and to correctly describe light sources you need to > > +describe the spectrum of the light. To simplify things, lights are categorized > > +according to their `color temperature (K) > > +<https://en.wikipedia.org/wiki/Color_temperature>`_. This is important to > > +keep in mind as it means that calibrations may differ between light sources even > > +though they have the same nominal color temperature. For best results the tuning > > +images need to be taken for a complete range of light sources. > > I would put that last statement on it's own line as it's a > distinct/important point ("For best results...") > done > > + > > +**Sensor** The sensor captures the incoming light and produces the raw images. > > "and converts a measured analogue signal to a digital representation which conveys > the raw images." done > > > +Data is commonly filtered by a color filter array into red, green and blue > > Data? or light ... I think the light is filtered by a colour filter... right :) > > > +channels. As these filters are not perfect some postprocessing needs to be done > > +to recreate the correct color. > > + > > +**BLC** Black level correction. Even without incoming light a sensor produces > > +pixel values above zero. This is due to a artificially added pedestal value > > s/due to a/due to an/ done > > > +(Which is added to get a evenly distributed noise level around zero instead of > > +only the upper half) and other effects in the electronics light dark currents. > > Rather than bracket that, I'd suggest: > > " > > This is due to two system artifacts that impact the measured light > levels on the sensor. Firstly, a deliberate and artificially added > pedestal value is added to get an evenly distributed noise level around > zero to avoid negative values and clipping the data. > > Secondly, additional underlying electrical noise can be caused by > various external factors including thermal noise and electrical > interferences. taken > > > +To get good images with real black, that black level needs to be subtracted. As > > +that level is typically known for a sensor it is hardcoded in libcamera and does > > +not need any calibration at the moment. > > "As the pedestal value is typically known" (we don't know the electrical noise > component) I'm not sure here. To my understanding some sensors (e.g. AR0521) regularly measure the optical black and adjust the internal regulation. This would then the electrical noise component also. > > Is it hardcoded though? Can't it be overridden by the tuning file still? Yes, that is still possible. I added a sentence for that. > > > + > > +**AWB** Auto white balance. For a proper image the color channels need to be > > I'm not sure if 'proper image' is defined well. But I haven't got a > better option in my head yet... > > > +adjusted, to get correct white balance. This means that monochrome objects in > > +the scene appear monochrome in the output image (white is white and gray is > > +gray). In libcamera this is done based on a gray world model. No tuning is > > +necessary for this step. > > "Presently in the libipa implementation of libcamera, this is managed by > a grey world model." ? > done > > > + > > +**LSC** Lens shading correction. The lens in use has a big influence on the > > +resulting image. The typical effects on the image are lens-shading (also called > > +vignetting). This means that due to the physical properties of a lens the > > +transmission of light falls of to the corners of the lens. To make things even > > 'falls off towards the corners' ? done > > > +harder, this falloff can be different for different colors/wave lengths. LSC is > > s/wave lengths/wavelengths/ done > > > > +therefore tuned for a set of light sources and for each color channel > > +individually. > > + > > +**CCM** Color correction matrix. After the previous processing blocks the grays > > +are preserved, but colors are still not correct. This is mostly due to the > > +color temperature of the light source and the imperfections in the color filter > > +array of the sensor. To correct for this a so called color correction matrix is > > s/a so called color correction matrix/a 'color correction matrix'/ ? done > > > +calculated. This is a 3x3 matrix, that is used to optimize the captured colors > > +regarding the perceived color error. To do this a chart of known and precisely > > +measured colors (macbeth chart) is captured. Then the matrix is calculated > > s/(macbeth chart)/commonly called a macbeth chart/ done > > > +using linear optimization to best map each measured colour of the chart to it's > > +known value. The color error is measured in `deltaE > > +<https://en.wikipedia.org/wiki/Color_difference>`_. > > + > > +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for > > +display. For images to be perceived correctly by the human eye, they need to be > > +encoded with the corresponding inverse gamma. See also > > +<https://en.wikipedia.org/wiki/Gamma_correction>. This block doesn't need > > +tuning, but is crucial for correct visual display. > > + > > +Materials needed for the tuning process > > +--------------------------------------- > > + > > +Precisely calibrated optical equipment is very expensive and out of the scope of > > +this document. Still it is possible to get reasonably good calibration results > > +at little costs. The most important devices needed are: > > + > > + - A light box with the ability to produce defined light of different color > > + temperatures. Typical temperatures used for calibration are 2400K > > + (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K > > + (daylight). As a first test, keylights for webcam streaming can be used. > > + These are available with support for color temperatures ranging from 2500K > > + to 9000K. For better results professional light boxes are needed. > > + - A ColorChecker chart. These are sold from calibrite and it makes sense to > > + get the original one. > > + - A integration sphere. This is used to create completely homogenious light > > + for the lens shading calibration. We had good results with the use of > > + a homogeneous area light (the above mentioned keylight). > > I would perhaps call this "a large light panel with sufficient diffusion > to get an even distribution of light." done > > > > + - An environment without external light sources. Ideally calibration is done > > + without any external lights in a fully dark room. A black curtain is one > > + solution, working by night is the cheap alternative. > > + > > Indeed, I think I'm happier with a dark curtain so far so I don't have > to stay up so late. done > > > > And I've hit 50% through the document, but run out of time so I'll snip > and pause here! Me too :-) Thank you. Cheers, Stefan > > <snip> > -- > Kieran
Quoting Stefan Klug (2024-08-13 15:50:19) > Hi Kieran, > > Thank you for the review. I marked most comments as done. They will be > included in the next version. There was only one comment with a response > from my side. > > On Tue, Aug 06, 2024 at 05:55:06PM +0100, Kieran Bingham wrote: > > Quoting Stefan Klug (2024-08-06 15:44:41) > > > > > +To get good images with real black, that black level needs to be subtracted. As > > > +that level is typically known for a sensor it is hardcoded in libcamera and does > > > +not need any calibration at the moment. > > > > "As the pedestal value is typically known" (we don't know the electrical noise > > component) > > I'm not sure here. To my understanding some sensors (e.g. AR0521) > regularly measure the optical black and adjust the internal regulation. > This would then the electrical noise component also. > > > > > Is it hardcoded though? Can't it be overridden by the tuning file still? > > Yes, that is still possible. I added a sentence for that. Does that regulation happen in a way that we have to read the value and account for it on the ISP ? Anyway, I suspect you can handle any update here better than me ;-) -- Kieran
Hi Kieran, Thanks again. On Wed, Aug 07, 2024 at 11:52:10AM +0100, Kieran Bingham wrote: > Hi Stefan, > > Resuming at ~50% > > > Quoting Stefan Klug (2024-08-06 15:44:41) > > This patch adds a initial version of the tuning guide for libcamera. The > > process is established based on the imx8mp and will be improved when we > > do more tuning for different ISPs with our own tooling. > > > > Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com> > > --- > > Documentation/conf.py | 4 +- > > Documentation/guides/tuning/tuning.rst | 276 +++++++++++++++++++++++++ > > Documentation/index.rst | 1 + > > Documentation/meson.build | 1 + > > 4 files changed, 280 insertions(+), 2 deletions(-) > > create mode 100644 Documentation/guides/tuning/tuning.rst > > > > <snip> > > > +Overview over the process > > +------------------------- > > + > > +The tuning process of libcamera consists of the following steps: > > + > > + 1. Take images for the tuning steps with different color temperatures > > + 2. Configure the tuning process > > + 3. Run the tuning script to create a corresponding tuning file > > + > > + > > +Taking raw images with defined exposure/gain > > +-------------------------------------------- > > + > > +For all the tuning files you need to capture a raw image with a defined exposure > > +time and gain. It is crucial that no pixels are saturated on any channel. At the > > +same time the images should not be too dark. Strive for a exposure time, so that > > +the brightest pixel is roughly 90% saturated. Finding the correct exposure time > > +is currently a manual process. We are working on solutions to aid in that > > +process. After finding the correct exposure time settings, the easiest way to > > +capture a raw image is with the ``cam`` application that gets installed with the > > +regular libcamera install. > > + > > +Create a ``constant-exposure.yaml`` file with the following content: > > + > > +.. code:: yaml > > + > > + frames: > > + - 0: > > + AeEnable: false > > + ExposureTime: 1000 > > AnalogueGain? Or should it always be expected to be set to 1 ? > > Even if set to 1 I'd put that in the yaml example I think so it's > "there" and tunable/changeable. done > > > + > > +Then capture one or more images using the following command: > > + ``cam -c 1 --script constant-exposure.yaml -C -s role=raw --file=image_#.dng`` > > One issue I see here is that different lights from different colour > temperatures will likely need different exposure/gain value. It's > possibly worth stating that somewhere so there's not an assumption that > one exposure value will all captures. > > My light box for instance definitely has different brightnesses across > the different light sources. > > I see you've already stated we're working on improvements here above > though so I think we can get started with this. > > Should we specify the stream size on the raw stream? I expect there will > often be multiple 'modes' which may or may not be cropped. We should > probably state that the captures should ideally be from a mode with the > full sensor area exposed. We enter undefined space here :). At the moment we can't adjust the LSC for different modes. For CCM and WB the area shouldn't matter. For LSC it does and the user should do the calibration for the mode he uses in his application. Shall we mention that, or just fix it as soon as we can adjust the LSC? > > > + > > + > > +Capture images for lens shading correction > > +------------------------------------------ > > + > > +To be able to correct lens shading artifacts, images of a homogeneously lit > > +neutral gray background are needed. Ideally a integration sphere is used for > > +that task. > > + > > +As a fallback images of a area light or a white wall can be used. If there are > > 'of an area light' ? I can't tell if you are talking about taking > pictures of a flat panel led 'light' there, or if you are referring to a light > or white wall. > > I assume you were meaning 'a light or white' wall. > > I have found that really challenging when I've tried that in fact - and > found it far easier to get captures of a large flat panel LED with > diffuser. Might be another method to recommend or highlight. > > <edit, now I see the images I realise you call an 'area light' what I > would call a flat panel light> I was only told that a white wall works, but in my house I don't have such a beautifully homogeneous white wall :-). As led lights are so cheap, I'll remove the wall and replace the area light by a flat panel light. I think I stuck to area light due to my interest in 3D graphics: https://en.wikipedia.org/wiki/Computer_graphics_lighting#Area > > > > > +multiple images for the same color temperature, the tuning scripts will average > > +the tuning results which can further improve the tuning quality. > > + > > +.. Note:: Currently lsc is ignored in the ccm calculation. This can lead to > > + slight color deviations and will be improved in the future. > > + > > +Images shall be taken for multiple color temperatures and named > > +``alsc_<temperature>k_#.dng``. ``#`` can be any number, if multiple images are > > +taken for the same color temperature. > > + > > +.. figure:: img/setup_lens_shading.jpg > > + :width: 50% > > + > > + Sample setup using an area light. Due to the wide angle lens the camera has > > + to moved closely in front of the light. This is not a perfect solution as > > + angle to the light might get way too large. > > I wouldn't say 'due to the wide angle lens' as not all cameras have wide > angle lenses, but indeed - a draw back to this method is that cameras > with a wide angle lens may have a field of view that captures beyond the > light. I'd like to express, that it is not necessarily good to move the camera that close to the light. What about: Sample setup using a flat panel light. In this specific setup, the camera has to be moved close to the light due to the wide angle lens. This has the downside that the angle to the light might get way too steep. I also replaced 'large' with 'steep'. Sounds more natural to me. > > > > + > > +.. figure:: img/camshark-alsc.png > > + :width: 100% > > + > > + A calibration image for lens shading correction. > > This shows using 'camshark' to view the image, which isn't too > unreasonable as a viewer. > > I'm really looking forward to when that can be the capture and realtime > adjustment tool too ... but I think it's fine to continue with cam as > the capture tool for the moment, even if we are highlighting that the > preview/determining the correct exposure/gains might be worth while > using camshark for as it has fast feedback and viewing of the scene. > > > + > > +- Ensure the full sensor is lit. Especially with wide angle lenses it may be > > + difficult to get the full field of view lit homogeneously. > > +- Ensure no color channel is saturated (can be checked with camshark) > > Can you be more specific here perhaps? Ensure that the brightest area of > the image does not contain any pixels that reach 100% on any of the > colors. > > Maybe we should move the % values in camshark on the color info line to > a saturation line to be explicit (with just a 'max' of any of the > channels perhaps) Yes, makes sense. Max is a bit expensive though. Maybe in the histogram when we do the calculations anyways. > > > + > > +Take a raw dng image for multiple color temperatures: > > + ``cam -c 1 --script constant-exposure.yaml -C -s role=raw --file=alsc_3200k_#.dng`` > > + > > + > > I'm afraid again, I think it's important here that the > constant-exposure.yaml should be specific to each light source! > I added a note and renamed the yaml to constant-exposure-<temperature>.yaml > I look forward to when the 'tuning tool' can analyse the image and do > some 'automatic' AE loop on the RAW with manual controls to handle this. > > Maybe that's something we should already throw together in (q)cam as well... > doesn't even have to run everyframe ... but I imagine this will be > better handled in camshark first. Yes, you will need some tool to either detect the macbeth chart, or to be able to specify a position where to measure.... > > > > +Capture images for color calibration > > +------------------------------------ > > + > > +To do the color calibration, raw images of the color checker need to be taken > > +for different light sources. These need to be named > > +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``. > > + > > +For best results the following hints should be taken into account: > > + - Ensure that the 18% gray patch (third from the right in bottom line) is > > + roughly at 18% saturation for all channels (a mean of 16-20% should be fine) > > + - Ensure the color checker is homogeneously lit (ideally from 45 degree above) > > + - No straylight from other light sources is present > > + - The color checker is not too small and not too big (so that neither lens > > + shading artifacts nor lens distortions are prevailing in the area of the > > + color chart) > > + > > +If no lux meter is at hand to precisely measure the lux level, a lux meter app > > +on a mobile phone can provide a sufficient estimation. > > Or ... hopefully in the future - another libcamera calibrated device ;-) > > (I'm imagining an RPi zero or such with a camera in our tuning boxes in > the future) Sure... might be risky at the same time :-) https://deepgram.com/learn/when-ai-eats-itself > > > > +.. figure:: img/setup_calibration2.jpg > > + :width: 50% > > + > > + > > +Run the tuning scripts > > +---------------------- > > + > > +After taking the calibration images, you should have a directory with all the > > +tuning files. It should look something like this: > > + > > + :: > > + > > + ../tuning-data/ > > + ├── alsc_2500k_0.dng > > + ├── alsc_2500k_1.dng > > + ├── alsc_2500k_2.dng > > + ├── alsc_6500k_0.dng > > + ├── imx335_1000l_2500k_0.dng > > + ├── imx335_1200l_4000k_0.dng > > + ├── imx335_1600l_6000k_0.dng > > Why do the colour charts have imx335 but the lsc not ? Should we be more > consistent? Funny. The first sample files had that pattern and I didn't really question it. We should enforce a sensor name and that it's the same on every image. I'll put that on my todo list. > > > + > > + > > +The tuning scripts are part of the libcamera source tree. After cloning the > > +libcamera sources the necessary steps to create a tuning file are: > > + > > + :: > > + > > + # install the necessary python packages > > + cd libcamera > > + python -m venv venv > > + source ./venv/bin/activate > > + pip install -r utils/tuning/requirements.txt > > + > > + # run the tuning script > > + utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml > > + > > +After the tuning script has run, the tuning file can be tested with any > > +libcamera based application like `qcam`. To quickly switch to a specific tuning > > +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful. > > +E.g.: > > + > > + :: > > + > > + LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1 > > + > > + > > +Sample images > > +------------- > > + > > + > > +.. figure:: img/image-no-blc.png > > + :width: 800 > > + > > + Image without black level correction (and completely invalid color estimation). > > + > > +.. figure:: img/image-no-lsc-3000.png > > + :width: 800 > > + > > + Image without lens shading correction @ 3000k. The vignetting artifacts can > > + be clearly seen > > + > > +.. figure:: img/image-no-lsc-6000.png > > + :width: 800 > > + > > + Image without lens shading correction @ 6000k. The vignetting artifacts can > > + be clearly seen > > + > > +.. figure:: img/image-fully-tuned-3000.png > > + :width: 800 > > + > > + Fully tuned image @ 3000k > > + > > +.. figure:: img/image-fully-tuned-6000.png > > + :width: 800 > > + > > + Fully tuned image @ 6000k > > I think we'll add more sample images and descriptions on what to look > for to an untrained eye in the future too, so I like this section ;-) > > I wonder if we should highlight or 'circle' the areas of interest like > the vignetting? (or other interesting points in future sample images). > But I don't think we need to do this now. > > > + > > diff --git a/Documentation/index.rst b/Documentation/index.rst > > index 5442ae75dde7..991dcf2b66fb 100644 > > --- a/Documentation/index.rst > > +++ b/Documentation/index.rst > > @@ -16,6 +16,7 @@ > > > > Developer Guide <guides/introduction> > > Application Writer's Guide <guides/application-developer> > > + Sensor Tuning Guide <guides/tuning/tuning> > > Pipeline Handler Writer's Guide <guides/pipeline-handler> > > IPA Writer's guide <guides/ipa> > > Tracing guide <guides/tracing> > > diff --git a/Documentation/meson.build b/Documentation/meson.build > > index 30d395234952..471eabcac344 100644 > > --- a/Documentation/meson.build > > +++ b/Documentation/meson.build > > @@ -77,6 +77,7 @@ if sphinx.found() > > 'guides/ipa.rst', > > 'guides/pipeline-handler.rst', > > 'guides/tracing.rst', > > + 'guides/tuning/tuning.rst', > > 'index.rst', > > 'lens_driver_requirements.rst', > > 'python-bindings.rst', > > -- > > 2.43.0 > > Oh I got there. Looking good though! Thanks! Phew, thank you. I'll post a v2 soon. > > -- > Kieran
diff --git a/Documentation/conf.py b/Documentation/conf.py index 7eeea7f3865b..5387942b9af5 100644 --- a/Documentation/conf.py +++ b/Documentation/conf.py @@ -21,8 +21,8 @@ # -- Project information ----------------------------------------------------- project = 'libcamera' -copyright = '2018-2019, The libcamera documentation authors' -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund' +copyright = '2018-2024, The libcamera documentation authors' +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug' # Version information is provided by the build environment, through the # sphinx command line. diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst new file mode 100644 index 000000000000..1c30f3f20b4e --- /dev/null +++ b/Documentation/guides/tuning/tuning.rst @@ -0,0 +1,276 @@ +.. SPDX-License-Identifier: CC-BY-SA-4.0 + +Sensor Tuning Guide +=================== + +To create visually good images out of the raw data provided by the camera +sensor, a lot of image processing takes place. This is usually done inside an +ISP (Image Signal Processor). To be able to do the necessary processing, the +corresponding algorithms need to be parameterized according to the used hardware +(typically sensor, light and lens). Creating these parameters is called tuning. +The tuning process results in a tuning file which is then used by libcamera to +parameterize the algorithms at runtime. + +The processing block of an ISP vary from vendor to vendor and can be +arbitrarily complex. Never the less a diagram of the most common and important +blocks is shown below: + + :: + + +--------+ + | Light | + +--------+ + | + v + +--------+ +-----+ +-----+ +-----+ +-----+ +-------+ + | Sensor | -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma | + +--------+ +-----+ +-----+ +-----+ +-----+ +-------+ + +**Light** The light used to light the scene has a crucial influence on the +resulting image. The human eye and brain are specialized in automatically +adapting to different lights. In a camera this has to be done by an algorithm. +Light is a complex topic and to correctly describe light sources you need to +describe the spectrum of the light. To simplify things, lights are categorized +according to their `color temperature (K) +<https://en.wikipedia.org/wiki/Color_temperature>`_. This is important to +keep in mind as it means that calibrations may differ between light sources even +though they have the same nominal color temperature. For best results the tuning +images need to be taken for a complete range of light sources. + +**Sensor** The sensor captures the incoming light and produces the raw images. +Data is commonly filtered by a color filter array into red, green and blue +channels. As these filters are not perfect some postprocessing needs to be done +to recreate the correct color. + +**BLC** Black level correction. Even without incoming light a sensor produces +pixel values above zero. This is due to a artificially added pedestal value +(Which is added to get a evenly distributed noise level around zero instead of +only the upper half) and other effects in the electronics light dark currents. +To get good images with real black, that black level needs to be subtracted. As +that level is typically known for a sensor it is hardcoded in libcamera and does +not need any calibration at the moment. + +**AWB** Auto white balance. For a proper image the color channels need to be +adjusted, to get correct white balance. This means that monochrome objects in +the scene appear monochrome in the output image (white is white and gray is +gray). In libcamera this is done based on a gray world model. No tuning is +necessary for this step. + +**LSC** Lens shading correction. The lens in use has a big influence on the +resulting image. The typical effects on the image are lens-shading (also called +vignetting). This means that due to the physical properties of a lens the +transmission of light falls of to the corners of the lens. To make things even +harder, this falloff can be different for different colors/wave lengths. LSC is +therefore tuned for a set of light sources and for each color channel +individually. + +**CCM** Color correction matrix. After the previous processing blocks the grays +are preserved, but colors are still not correct. This is mostly due to the +color temperature of the light source and the imperfections in the color filter +array of the sensor. To correct for this a so called color correction matrix is +calculated. This is a 3x3 matrix, that is used to optimize the captured colors +regarding the perceived color error. To do this a chart of known and precisely +measured colors (macbeth chart) is captured. Then the matrix is calculated +using linear optimization to best map each measured colour of the chart to it's +known value. The color error is measured in `deltaE +<https://en.wikipedia.org/wiki/Color_difference>`_. + +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for +display. For images to be perceived correctly by the human eye, they need to be +encoded with the corresponding inverse gamma. See also +<https://en.wikipedia.org/wiki/Gamma_correction>. This block doesn't need +tuning, but is crucial for correct visual display. + +Materials needed for the tuning process +--------------------------------------- + +Precisely calibrated optical equipment is very expensive and out of the scope of +this document. Still it is possible to get reasonably good calibration results +at little costs. The most important devices needed are: + + - A light box with the ability to produce defined light of different color + temperatures. Typical temperatures used for calibration are 2400K + (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K + (daylight). As a first test, keylights for webcam streaming can be used. + These are available with support for color temperatures ranging from 2500K + to 9000K. For better results professional light boxes are needed. + - A ColorChecker chart. These are sold from calibrite and it makes sense to + get the original one. + - A integration sphere. This is used to create completely homogenious light + for the lens shading calibration. We had good results with the use of + a homogeneous area light (the above mentioned keylight). + - An environment without external light sources. Ideally calibration is done + without any external lights in a fully dark room. A black curtain is one + solution, working by night is the cheap alternative. + + +Overview over the process +------------------------- + +The tuning process of libcamera consists of the following steps: + + 1. Take images for the tuning steps with different color temperatures + 2. Configure the tuning process + 3. Run the tuning script to create a corresponding tuning file + + +Taking raw images with defined exposure/gain +-------------------------------------------- + +For all the tuning files you need to capture a raw image with a defined exposure +time and gain. It is crucial that no pixels are saturated on any channel. At the +same time the images should not be too dark. Strive for a exposure time, so that +the brightest pixel is roughly 90% saturated. Finding the correct exposure time +is currently a manual process. We are working on solutions to aid in that +process. After finding the correct exposure time settings, the easiest way to +capture a raw image is with the ``cam`` application that gets installed with the +regular libcamera install. + +Create a ``constant-exposure.yaml`` file with the following content: + +.. code:: yaml + + frames: + - 0: + AeEnable: false + ExposureTime: 1000 + +Then capture one or more images using the following command: + ``cam -c 1 --script constant-exposure.yaml -C -s role=raw --file=image_#.dng`` + + +Capture images for lens shading correction +------------------------------------------ + +To be able to correct lens shading artifacts, images of a homogeneously lit +neutral gray background are needed. Ideally a integration sphere is used for +that task. + +As a fallback images of a area light or a white wall can be used. If there are +multiple images for the same color temperature, the tuning scripts will average +the tuning results which can further improve the tuning quality. + +.. Note:: Currently lsc is ignored in the ccm calculation. This can lead to + slight color deviations and will be improved in the future. + +Images shall be taken for multiple color temperatures and named +``alsc_<temperature>k_#.dng``. ``#`` can be any number, if multiple images are +taken for the same color temperature. + +.. figure:: img/setup_lens_shading.jpg + :width: 50% + + Sample setup using an area light. Due to the wide angle lens the camera has + to moved closely in front of the light. This is not a perfect solution as + angle to the light might get way too large. + +.. figure:: img/camshark-alsc.png + :width: 100% + + A calibration image for lens shading correction. + +- Ensure the full sensor is lit. Especially with wide angle lenses it may be + difficult to get the full field of view lit homogeneously. +- Ensure no color channel is saturated (can be checked with camshark) + +Take a raw dng image for multiple color temperatures: + ``cam -c 1 --script constant-exposure.yaml -C -s role=raw --file=alsc_3200k_#.dng`` + + + +Capture images for color calibration +------------------------------------ + +To do the color calibration, raw images of the color checker need to be taken +for different light sources. These need to be named +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``. + +For best results the following hints should be taken into account: + - Ensure that the 18% gray patch (third from the right in bottom line) is + roughly at 18% saturation for all channels (a mean of 16-20% should be fine) + - Ensure the color checker is homogeneously lit (ideally from 45 degree above) + - No straylight from other light sources is present + - The color checker is not too small and not too big (so that neither lens + shading artifacts nor lens distortions are prevailing in the area of the + color chart) + +If no lux meter is at hand to precisely measure the lux level, a lux meter app +on a mobile phone can provide a sufficient estimation. + +.. figure:: img/setup_calibration2.jpg + :width: 50% + + +Run the tuning scripts +---------------------- + +After taking the calibration images, you should have a directory with all the +tuning files. It should look something like this: + + :: + + ../tuning-data/ + ├── alsc_2500k_0.dng + ├── alsc_2500k_1.dng + ├── alsc_2500k_2.dng + ├── alsc_6500k_0.dng + ├── imx335_1000l_2500k_0.dng + ├── imx335_1200l_4000k_0.dng + ├── imx335_1600l_6000k_0.dng + + +The tuning scripts are part of the libcamera source tree. After cloning the +libcamera sources the necessary steps to create a tuning file are: + + :: + + # install the necessary python packages + cd libcamera + python -m venv venv + source ./venv/bin/activate + pip install -r utils/tuning/requirements.txt + + # run the tuning script + utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml + +After the tuning script has run, the tuning file can be tested with any +libcamera based application like `qcam`. To quickly switch to a specific tuning +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful. +E.g.: + + :: + + LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1 + + +Sample images +------------- + + +.. figure:: img/image-no-blc.png + :width: 800 + + Image without black level correction (and completely invalid color estimation). + +.. figure:: img/image-no-lsc-3000.png + :width: 800 + + Image without lens shading correction @ 3000k. The vignetting artifacts can + be clearly seen + +.. figure:: img/image-no-lsc-6000.png + :width: 800 + + Image without lens shading correction @ 6000k. The vignetting artifacts can + be clearly seen + +.. figure:: img/image-fully-tuned-3000.png + :width: 800 + + Fully tuned image @ 3000k + +.. figure:: img/image-fully-tuned-6000.png + :width: 800 + + Fully tuned image @ 6000k + diff --git a/Documentation/index.rst b/Documentation/index.rst index 5442ae75dde7..991dcf2b66fb 100644 --- a/Documentation/index.rst +++ b/Documentation/index.rst @@ -16,6 +16,7 @@ Developer Guide <guides/introduction> Application Writer's Guide <guides/application-developer> + Sensor Tuning Guide <guides/tuning/tuning> Pipeline Handler Writer's Guide <guides/pipeline-handler> IPA Writer's guide <guides/ipa> Tracing guide <guides/tracing> diff --git a/Documentation/meson.build b/Documentation/meson.build index 30d395234952..471eabcac344 100644 --- a/Documentation/meson.build +++ b/Documentation/meson.build @@ -77,6 +77,7 @@ if sphinx.found() 'guides/ipa.rst', 'guides/pipeline-handler.rst', 'guides/tracing.rst', + 'guides/tuning/tuning.rst', 'index.rst', 'lens_driver_requirements.rst', 'python-bindings.rst',
This patch adds a initial version of the tuning guide for libcamera. The process is established based on the imx8mp and will be improved when we do more tuning for different ISPs with our own tooling. Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com> --- Documentation/conf.py | 4 +- Documentation/guides/tuning/tuning.rst | 276 +++++++++++++++++++++++++ Documentation/index.rst | 1 + Documentation/meson.build | 1 + 4 files changed, 280 insertions(+), 2 deletions(-) create mode 100644 Documentation/guides/tuning/tuning.rst