[{"id":30806,"web_url":"https://patchwork.libcamera.org/comment/30806/","msgid":"<8052d1f5-24be-4a60-8d04-e3a888d3957b@ideasonboard.com>","date":"2024-08-14T08:29:01","subject":"Re: [PATCH v2 1/1] Documentation: Add first version of the\n\ttuning-guide","submitter":{"id":156,"url":"https://patchwork.libcamera.org/api/people/156/","name":"Dan Scally","email":"dan.scally@ideasonboard.com"},"content":"Hi Stefan, thanks for the patch. This is very good!\n\nOn 14/08/2024 08:40, Stefan Klug wrote:\n> This patch adds a initial version of the tuning guide for libcamera. The\n> process is established based on the imx8mp and will be improved when we\n> do more tuning for different ISPs with our own tooling.\n>\n> Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>\n> ---\n>   Documentation/conf.py                  |   4 +-\n>   Documentation/guides/tuning/tuning.rst | 291 +++++++++++++++++++++++++\n>   Documentation/index.rst                |   1 +\n>   Documentation/meson.build              |   1 +\n>   4 files changed, 295 insertions(+), 2 deletions(-)\n>   create mode 100644 Documentation/guides/tuning/tuning.rst\n>\n> diff --git a/Documentation/conf.py b/Documentation/conf.py\n> index 7eeea7f3865b..5387942b9af5 100644\n> --- a/Documentation/conf.py\n> +++ b/Documentation/conf.py\n> @@ -21,8 +21,8 @@\n>   # -- Project information -----------------------------------------------------\n>   \n>   project = 'libcamera'\n> -copyright = '2018-2019, The libcamera documentation authors'\n> -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund'\n> +copyright = '2018-2024, The libcamera documentation authors'\n> +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug'\n>   \n>   # Version information is provided by the build environment, through the\n>   # sphinx command line.\n> diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst\n> new file mode 100644\n> index 
000000000000..a58dda350556\n> --- /dev/null\n> +++ b/Documentation/guides/tuning/tuning.rst\n> @@ -0,0 +1,291 @@\n> +.. SPDX-License-Identifier: CC-BY-SA-4.0\n> +\n> +Sensor Tuning Guide\n> +===================\n> +\n> +To create visually good images from the raw data provided by the camera sensor,\n> +a lot of image processing takes place. This is usually done inside an ISP (Image\n> +Signal Processor). To be able to do the necessary processing, the corresponding\n> +algorithms need to be parameterized according to the hardware in use (typically\n> +sensor, light and lens). Calculating these parameters is a process called\n> +tuning. The tuning process results in a tuning file which is then used by\n> +libcamera to provide calibrated parameters to the algorithms at runtime.\n> +\n> +The processing blocks of an ISP vary from vendor to vendor and can be\n> +arbitrarily complex. Nevertheless a diagram of common blocks frequently found in\n> +an ISP design is shown below:\n> +\n> + ::\n> +\n> +  +--------+\n> +  |  Light |\n> +  +--------+\n> +      |\n> +      v\n> +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+\n> +  | Sensor |  -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma |\n> +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+\n> +\n> +**Light** The light used to light the scene has a crucial influence on the\n\"light used to light\" is a bit funny. Perhaps \"light used to illuminate\"?\n> +resulting image. The human eye and brain are specialized in automatically\n> +adapting to different lights. In a camera this has to be done by an algorithm.\n> +Light is a complex topic and to correctly describe light sources you need to\n> +describe the spectrum of the light. To simplify things, lights are categorized\n> +according to their `color temperature (K)\n> +<https://en.wikipedia.org/wiki/Color_temperature>`_. 
This is important to\n> +keep in mind as it means that calibrations may differ between light sources even\n> +though they have the same nominal color temperature.\n> +\n> +For best results the tuning images need to be taken for a complete range of\n> +light sources.\n> +\n> +**Sensor** The sensor captures the incoming light and converts a measured\n> +analogue signal to a digital representation which conveys the raw images. The\n> +light is commonly filtered by a color filter array into red, green and blue\n> +channels. As these filters are not perfect some postprocessing needs to be done\n> +to recreate the correct color.\n> +\n> +**BLC** Black level correction. Even without incoming light a sensor produces\n> +pixel values above zero. This is due to two system artifacts that impact the\n> +measured light levels on the sensor. Firstly, a deliberate and artificially\n> +added pedestal value is added to get an evenly distributed noise level around\n> +zero to avoid negative values and clipping the data.\n> +\n> +Secondly, additional underlying electrical noise can be caused by various\n> +external factors including thermal noise and electrical interferences.\n> +\n> +To get good images with real black, that black level needs to be subtracted. As\n> +that level is typically known for a sensor it is hardcoded in libcamera and does\n> +not need any calibration at the moment. If needed, that value can be manually\n> +overwritten in the tuning configuration.\nCan it? For all IPA modules?\n> +\n> +**AWB** Auto white balance. For a proper image the color channels need to be\n> +adjusted, to get correct white balance. This means that monochrome objects in\n> +the scene appear monochrome in the output image (white is white and gray is\n> +gray). Presently in the libipa implementation of libcamera, this is managed by a\n> +grey world model. No tuning is necessary for this step.\n> +\n> +**LSC** Lens shading correction. 
The lens in use has a big influence on the\n> +resulting image. The typical effects on the image are lens-shading (also called\n> +vignetting). This means that due to the physical properties of a lens the\n> +transmission of light falls of towards the corners of the lens. To make things\ns/falls of/falls off. I would perhaps link vignetting to the Wikipedia page.\n> +even harder, this falloff can be different for different colors/wavelengths. LSC\n> +is therefore tuned for a set of light sources and for each color channel\n> +individually.\n> +\n> +**CCM** Color correction matrix. After the previous processing blocks the grays\n> +are preserved, but colors are still not correct. This is mostly due to the color\n> +temperature of the light source and the imperfections in the color filter array\n> +of the sensor. To correct for this a 'color correction matrix' is calculated.\n> +This is a 3x3 matrix, that is used to optimize the captured colors regarding the\n> +perceived color error.\n\n\nPerhaps \"used to optimize the captured colours by correcting the perceived color error\"?\n\n> To do this a chart of known and precisely measured colors\n> +commonly called a macbeth chart is captured. Then the matrix is calculated using\n> +linear optimization to best map each measured colour of the chart to it's known\n> +value. The color error is measured in `deltaE\n> +<https://en.wikipedia.org/wiki/Color_difference>`_.\n> +\n> +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for\n> +display.\n\nThe double \"display\" feels a bit off. Hmmmm...maybe \"Today's displays usually apply a gamma of 2.2 \nto the image they show\"?\n\n\n>   For images to be perceived correctly by the human eye, they need to be\n> +encoded with the corresponding inverse gamma. See also\n> +<https://en.wikipedia.org/wiki/Gamma_correction>. 
This block doesn't need\n> +tuning, but is crucial for correct visual display.\n> +\n> +Materials needed for the tuning process\n> +---------------------------------------\n> +\n> +Precisely calibrated optical equipment is very expensive and out of the scope of\n> +this document. Still it is possible to get reasonably good calibration results\n> +at little costs. The most important devices needed are:\ns/costs/cost.\n> +\n> +   - A light box with the ability to produce defined light of different color\n> +     temperatures. Typical temperatures used for calibration are 2400K\n> +     (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K\n> +     (daylight). As a first test, keylights for webcam streaming can be used.\n> +     These are available with support for color temperatures ranging from 2500K\n> +     to 9000K.\nIs it worth mentioning that you're using those in combination with a DIY lightbox rather than just \nopenly in a room?\n> For better results professional light boxes are needed.\n> +   - A ColorChecker chart. These are sold from calibrite and it makes sense to\n> +     get the original one.\nI think that the reasons for getting the original one might be worth expanding on, as it's probably \nthe most expensive thing to acquire on this list.\n> +   - A integration sphere. This is used to create completely homogenious light\n> +     for the lens shading calibration. We had good results with the use of a\n> +     large light panel with sufficient diffusion to get an even distribution of\n> +     light (the above mentioned keylight).\n> +   - An environment without external light sources. Ideally calibration is done\n> +     without any external lights in a fully dark room. A black curtain is one\n> +     solution, working by night is the cheap alternative.\n> +\n> +\n> +Overview over the process\ns/over/of\n> +-------------------------\n> +\n> +The tuning process of libcamera consists of the following steps:\n> +\n> +   1. 
Take images for the tuning steps with different color temperatures\ns/with/at\n> +   2. Configure the tuning process\n> +   3. Run the tuning script to create a corresponding tuning file\n> +\n> +\n> +Taking raw images with defined exposure/gain\n> +--------------------------------------------\n> +\n> +For all the tuning files you need to capture a raw image with a defined exposure\n> +time and gain. It is crucial that no pixels are saturated on any channel. At the\n> +same time the images should not be too dark. Strive for a exposure time, so that\n\"Strive for an exposure time such that\"\n> +the brightest pixel is roughly 90% saturated. Finding the correct exposure time\n> +is currently a manual process. We are working on solutions to aid in that\n> +process.\nAre we? cool!\n>   After finding the correct exposure time settings, the easiest way to\n> +capture a raw image is with the ``cam`` application that gets installed with the\n> +regular libcamera install.\n> +\n> +Create a ``constant-exposure-<temperature>.yaml`` file with the following content:\n> +\n> +.. code:: yaml\n> +\n> +   frames:\n> +     - 0:\n> +         AeEnable: false\n> +         AnalogueGain: 1.0\n> +         ExposureTime: 1000\n> +\nThe IPU3 IPA wouldn't handle these...one for my todo list, but perhaps we need to formally define \nthe requirements an IPA would need to meet to support our tuning process somewhere or something...\n> +Then capture one or more images using the following command:\n> +  ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=image_#.dng``\n> +\n> +.. Note:: Typically the brightness changes for different colour temperatures. So\n> +    this has to be readjusted for every colour temperature.\n> +\n> +\n> +Capture images for lens shading correction\n> +------------------------------------------\n> +\n> +To be able to correct lens shading artifacts, images of a homogeneously lit\n> +neutral gray background are needed. 
Ideally a integration sphere is used for\n> +that task.\n> +\n> +As a fallback images of a flat panel light can be used. If there are multiple\n> +images for the same color temperature, the tuning scripts will average the\n> +tuning results which can further improve the tuning quality.\n> +\n> +.. Note:: Currently lsc is ignored in the ccm calculation. This can lead to\n> +    slight color deviations and will be improved in the future.\n> +\n> +Images shall be taken for multiple color temperatures and named\n> +``alsc_<temperature>k_#.dng``. ``#`` can be any number, if multiple images are\n> +taken for the same color temperature.\n> +\n> +.. figure:: img/setup_lens_shading.jpg\n> +    :width: 50%\n> +\n> +    Sample setup using a flat panel light. In this specific setup, the camera\n> +    has to be moved close to the light due to the wide angle lens. This has the\n> +    downside that the angle to the light might get way too steep.\nI _think_ that the effect of that would be to increase the severity of the vignetting and so the \ntuning process would increase the gains to counter it more aggressively than would really be needed \nin a natural scene; is that right? Maybe that's worth a note box (or warning? Or whatever the orange \none is)\n> +\n> +.. figure:: img/camshark-alsc.png\n> +    :width: 100%\n> +\n> +    A calibration image for lens shading correction.\n> +\n> +- Ensure the full sensor is lit. 
Especially with wide angle lenses it may be\n> +  difficult to get the full field of view lit homogeneously.\n> +- Ensure that the brightest area of the image does not contain any pixels that\n> +  reach more than 95% on any of the colors (can be checked with camshark).\n\n\nOoh where does camshark show that?\n\n> +\n> +Take a raw dng image for multiple color temperatures:\n> +    ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=alsc_3200k_#.dng``\n> +\n> +\n> +\n> +Capture images for color calibration\n> +------------------------------------\n> +\n> +To do the color calibration, raw images of the color checker need to be taken\ns/color checker/ColorChecker chart.\n> +for different light sources. These need to be named\n\n> +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``.\n> +\n> +For best results the following hints should be taken into account:\n> +  - Ensure that the 18% gray patch (third from the right in bottom line) is\n> +    roughly at 18% saturation for all channels (a mean of 16-20% should be fine)\nSo I think that means you need to set the color temperature on your lamp and then tune the intensity \nuntil the saturation is met - is that right?\n> +  - Ensure the color checker is homogeneously lit (ideally from 45 degree above)\n> +  - No straylight from other light sources is present\n> +  - The color checker is not too small and not too big (so that neither lens\n> +    shading artifacts nor lens distortions are prevailing in the area of the\n> +    color chart)\n> +\n> +If no lux meter is at hand to precisely measure the lux level, a lux meter app\n> +on a mobile phone can provide a sufficient estimation.\n> +\n> +.. figure:: img/setup_calibration2.jpg\n> +    :width: 50%\n> +\n> +\n> +Run the tuning scripts\n> +----------------------\n> +\n> +After taking the calibration images, you should have a directory with all the\n> +tuning files. 
It should look something like this:\n> +\n> + ::\n> +\n> +   ../tuning-data/\n> +   ├── alsc_2500k_0.dng\n> +   ├── alsc_2500k_1.dng\n> +   ├── alsc_2500k_2.dng\n> +   ├── alsc_6500k_0.dng\n> +   ├── imx335_1000l_2500k_0.dng\n> +   ├── imx335_1200l_4000k_0.dng\n> +   ├── imx335_1600l_6000k_0.dng\n> +\n> +\n> +The tuning scripts are part of the libcamera source tree. After cloning the\n> +libcamera sources the necessary steps to create a tuning file are:\n> +\n> + ::\n> +\n> +  # install the necessary python packages\n> +  cd libcamera\n> +  python -m venv venv\n> +  source ./venv/bin/activate\n> +  pip install -r utils/tuning/requirements.txt\n> +\n> +  # run the tuning script\n> +  utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml\n\nThere needs to be an explanation of the creation of config.yaml here I think.\n\n\nReally good work in my opinion.\n\n> +\n> +After the tuning script has run, the tuning file can be tested with any\n> +libcamera based application like `qcam`. To quickly switch to a specific tuning\n> +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful.\n> +E.g.:\n> +\n> + ::\n> +\n> +    LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1\n> +\n> +\n> +Sample images\n> +-------------\n> +\n> +\n> +.. figure:: img/image-no-blc.png\n> +    :width: 800\n> +\n> +    Image without black level correction (and completely invalid color estimation).\n> +\n> +.. figure:: img/image-no-lsc-3000.png\n> +    :width: 800\n> +\n> +    Image without lens shading correction @ 3000k. The vignetting artifacts can\n> +    be clearly seen\n> +\n> +.. figure:: img/image-no-lsc-6000.png\n> +    :width: 800\n> +\n> +    Image without lens shading correction @ 6000k. The vignetting artifacts can\n> +    be clearly seen\n> +\n> +.. figure:: img/image-fully-tuned-3000.png\n> +    :width: 800\n> +\n> +    Fully tuned image @ 3000k\n> +\n> +.. 
figure:: img/image-fully-tuned-6000.png\n> +    :width: 800\n> +\n> +    Fully tuned image @ 6000k\n> +\n> diff --git a/Documentation/index.rst b/Documentation/index.rst\n> index 5442ae75dde7..991dcf2b66fb 100644\n> --- a/Documentation/index.rst\n> +++ b/Documentation/index.rst\n> @@ -16,6 +16,7 @@\n>   \n>      Developer Guide <guides/introduction>\n>      Application Writer's Guide <guides/application-developer>\n> +   Sensor Tuning Guide <guides/tuning/tuning>\n>      Pipeline Handler Writer's Guide <guides/pipeline-handler>\n>      IPA Writer's guide <guides/ipa>\n>      Tracing guide <guides/tracing>\n> diff --git a/Documentation/meson.build b/Documentation/meson.build\n> index 1ba40fdf67ac..8bf09f31afa0 100644\n> --- a/Documentation/meson.build\n> +++ b/Documentation/meson.build\n> @@ -135,6 +135,7 @@ if sphinx.found()\n>           'guides/ipa.rst',\n>           'guides/pipeline-handler.rst',\n>           'guides/tracing.rst',\n> +        'guides/tuning/tuning.rst',\n>           'index.rst',\n>           'lens_driver_requirements.rst',\n>           'python-bindings.rst',","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 02CB7C323E\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 14 Aug 2024 08:29:07 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 3F683633B5;\n\tWed, 14 Aug 2024 10:29:07 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 13AF261946\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 14 Aug 2024 10:29:05 +0200 (CEST)","from 
[192.168.0.43]\n\t(cpc141996-chfd3-2-0-cust928.12-3.cable.virginm.net [86.13.91.161])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 5311C2E0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 14 Aug 2024 10:28:07 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"qlaIhTld\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1723624087;\n\tbh=b/mEODjxWNBgoILEpJNsnWXLew3oXc+n98V3M5zvU0g=;\n\th=Date:Subject:To:References:From:In-Reply-To:From;\n\tb=qlaIhTldaWSWl1Lkzap4EjUXCA1pUNxFGbPGDYiztpHBHjCbUkiDpHoDf2wTzC6jg\n\tELvRb30o2BUdRy9ZwuvWq2oRBYzZ+yVJdl9/yCq858zTGeEZKG+klL3MAVV1W0XbsP\n\tBNSME4ASR/Gs7He0NtW8p5bm6JQpn7WublvO306E=","Message-ID":"<8052d1f5-24be-4a60-8d04-e3a888d3957b@ideasonboard.com>","Date":"Wed, 14 Aug 2024 09:29:01 +0100","MIME-Version":"1.0","User-Agent":"Mozilla Thunderbird","Subject":"Re: [PATCH v2 1/1] Documentation: Add first version of the\n\ttuning-guide","To":"libcamera-devel@lists.libcamera.org","References":"<20240814074013.52693-1-stefan.klug@ideasonboard.com>\n\t<20240814074013.52693-2-stefan.klug@ideasonboard.com>","Content-Language":"en-US","From":"Dan Scally <dan.scally@ideasonboard.com>","Autocrypt":"addr=dan.scally@ideasonboard.com; 
keydata=\n\txsFNBGLydlEBEADa5O2s0AbUguprfvXOQun/0a8y2Vk6BqkQALgeD6KnXSWwaoCULp18etYW\n\tB31bfgrdphXQ5kUQibB0ADK8DERB4wrzrUb5CMxLBFE7mQty+v5NsP0OFNK9XTaAOcmD+Ove\n\teIjYvqurAaro91jrRVrS1gBRxIFqyPgNvwwL+alMZhn3/2jU2uvBmuRrgnc/e9cHKiuT3Dtq\n\tMHGPKL2m+plk+7tjMoQFfexoQ1JKugHAjxAhJfrkXh6uS6rc01bYCyo7ybzg53m1HLFJdNGX\n\tsUKR+dQpBs3SY4s66tc1sREJqdYyTsSZf80HjIeJjU/hRunRo4NjRIJwhvnK1GyjOvvuCKVU\n\tRWpY8dNjNu5OeAfdrlvFJOxIE9M8JuYCQTMULqd1NuzbpFMjc9524U3Cngs589T7qUMPb1H1\n\tNTA81LmtJ6Y+IV5/kiTUANflpzBwhu18Ok7kGyCq2a2jsOcVmk8gZNs04gyjuj8JziYwwLbf\n\tvzABwpFVcS8aR+nHIZV1HtOzyw8CsL8OySc3K9y+Y0NRpziMRvutrppzgyMb9V+N31mK9Mxl\n\t1YkgaTl4ciNWpdfUe0yxH03OCuHi3922qhPLF4XX5LN+NaVw5Xz2o3eeWklXdouxwV7QlN33\n\tu4+u2FWzKxDqO6WLQGjxPE0mVB4Gh5Pa1Vb0ct9Ctg0qElvtGQARAQABzShEYW4gU2NhbGx5\n\tIDxkYW4uc2NhbGx5QGlkZWFzb25ib2FyZC5jb20+wsGNBBMBCAA3FiEEsdtt8OWP7+8SNfQe\n\tkiQuh/L+GMQFAmLydlIFCQWjmoACGwMECwkIBwUVCAkKCwUWAgMBAAAKCRCSJC6H8v4YxDI2\n\tEAC2Gz0iyaXJkPInyshrREEWbo0CA6v5KKf3I/HlMPqkZ48bmGoYm4mEQGFWZJAT3K4ir8bg\n\tcEfs9V54gpbrZvdwS4abXbUK4WjKwEs8HK3XJv1WXUN2bsz5oEJWZUImh9gD3naiLLI9QMMm\n\tw/aZkT+NbN5/2KvChRWhdcha7+2Te4foOY66nIM+pw2FZM6zIkInLLUik2zXOhaZtqdeJZQi\n\tHSPU9xu7TRYN4cvdZAnSpG7gQqmLm5/uGZN1/sB3kHTustQtSXKMaIcD/DMNI3JN/t+RJVS7\n\tc0Jh/ThzTmhHyhxx3DRnDIy7kwMI4CFvmhkVC2uNs9kWsj1DuX5kt8513mvfw2OcX9UnNKmZ\n\tnhNCuF6DxVrL8wjOPuIpiEj3V+K7DFF1Cxw1/yrLs8dYdYh8T8vCY2CHBMsqpESROnTazboh\n\tAiQ2xMN1cyXtX11Qwqm5U3sykpLbx2BcmUUUEAKNsM//Zn81QXKG8vOx0ZdMfnzsCaCzt8f6\n\t9dcDBBI3tJ0BI9ByiocqUoL6759LM8qm18x3FYlxvuOs4wSGPfRVaA4yh0pgI+ModVC2Pu3y\n\tejE/IxeatGqJHh6Y+iJzskdi27uFkRixl7YJZvPJAbEn7kzSi98u/5ReEA8Qhc8KO/B7wprj\n\txjNMZNYd0Eth8+WkixHYj752NT5qshKJXcyUU87BTQRi8nZSARAAx0BJayh1Fhwbf4zoY56x\n\txHEpT6DwdTAYAetd3yiKClLVJadYxOpuqyWa1bdfQWPb+h4MeXbWw/53PBgn7gI2EA7ebIRC\n\tPJJhAIkeym7hHZoxqDQTGDJjxFEL11qF+U3rhWiL2Zt0Pl+zFq0eWYYVNiXjsIS4FI2+4m16\n\ttPbDWZFJnSZ828VGtRDQdhXfx3zyVX21lVx1bX4/OZvIET7sVUufkE4hrbqrrufre7wsjD1t\n\t8MQKSapVrr1RltpzPpScdoxknOSBRwOvpp57pJJe5A0L7+WxJ+vQoQXj0j+5tmIWOAV1qBQp\n\thyoyUk9JpPfn
tk2EKnZHWaApFp5TcL6c5LhUvV7F6XwOjGPuGlZQCWXee9dr7zym8iR3irWT\n\t+49bIh5PMlqSLXJDYbuyFQHFxoiNdVvvf7etvGfqFYVMPVjipqfEQ38ST2nkzx+KBICz7uwj\n\tJwLBdTXzGFKHQNckGMl7F5QdO/35An/QcxBnHVMXqaSd12tkJmoRVWduwuuoFfkTY5mUV3uX\n\txGj3iVCK4V+ezOYA7c2YolfRCNMTza6vcK/P4tDjjsyBBZrCCzhBvd4VVsnnlZhVaIxoky4K\n\taL+AP+zcQrUZmXmgZjXOLryGnsaeoVrIFyrU6ly90s1y3KLoPsDaTBMtnOdwxPmo1xisH8oL\n\ta/VRgpFBfojLPxMAEQEAAcLBfAQYAQgAJhYhBLHbbfDlj+/vEjX0HpIkLofy/hjEBQJi8nZT\n\tBQkFo5qAAhsMAAoJEJIkLofy/hjEXPcQAMIPNqiWiz/HKu9W4QIf1OMUpKn3YkVIj3p3gvfM\n\tRes4fGX94Ji599uLNrPoxKyaytC4R6BTxVriTJjWK8mbo9jZIRM4vkwkZZ2bu98EweSucxbp\n\tvjESsvMXGgxniqV/RQ/3T7LABYRoIUutARYq58p5HwSP0frF0fdFHYdTa2g7MYZl1ur2JzOC\n\tFHRpGadlNzKDE3fEdoMobxHB3Lm6FDml5GyBAA8+dQYVI0oDwJ3gpZPZ0J5Vx9RbqXe8RDuR\n\tdu90hvCJkq7/tzSQ0GeD3BwXb9/R/A4dVXhaDd91Q1qQXidI+2jwhx8iqiYxbT+DoAUkQRQy\n\txBtoCM1CxH7u45URUgD//fxYr3D4B1SlonA6vdaEdHZOGwECnDpTxecENMbz/Bx7qfrmd901\n\tD+N9SjIwrbVhhSyUXYnSUb8F+9g2RDY42Sk7GcYxIeON4VzKqWM7hpkXZ47pkK0YodO+dRKM\n\tyMcoUWrTK0Uz6UzUGKoJVbxmSW/EJLEGoI5p3NWxWtScEVv8mO49gqQdrRIOheZycDmHnItt\n\t9Qjv00uFhEwv2YfiyGk6iGF2W40s2pH2t6oeuGgmiZ7g6d0MEK8Ql/4zPItvr1c1rpwpXUC1\n\tu1kQWgtnNjFHX3KiYdqjcZeRBiry1X0zY+4Y24wUU0KsEewJwjhmCKAsju1RpdlPg2kC","In-Reply-To":"<20240814074013.52693-2-stefan.klug@ideasonboard.com>","Content-Type":"text/plain; charset=UTF-8; 
format=flowed","Content-Transfer-Encoding":"8bit","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":30916,"web_url":"https://patchwork.libcamera.org/comment/30916/","msgid":"<dxuo7x4c6cqs5dpezn3fb7kruvtx4wdljue6fc7taqif7eirlg@zanihaqkhq42>","date":"2024-08-27T09:56:44","subject":"Re: [PATCH v2 1/1] Documentation: Add first version of the\n\ttuning-guide","submitter":{"id":184,"url":"https://patchwork.libcamera.org/api/people/184/","name":"Stefan Klug","email":"stefan.klug@ideasonboard.com"},"content":"Hi Dan,\n\nThank you for your review.\n\nOn Wed, Aug 14, 2024 at 09:29:01AM +0100, Dan Scally wrote:\n> Hi Stefan, thanks for the patch. This is very good!\n> \n> On 14/08/2024 08:40, Stefan Klug wrote:\n> > This patch adds a initial version of the tuning guide for libcamera. 
The\n> > process is established based on the imx8mp and will be improved when we\n> > do more tuning for different ISPs with our own tooling.\n> > \n> > Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>\n> > ---\n> >   Documentation/conf.py                  |   4 +-\n> >   Documentation/guides/tuning/tuning.rst | 291 +++++++++++++++++++++++++\n> >   Documentation/index.rst                |   1 +\n> >   Documentation/meson.build              |   1 +\n> >   4 files changed, 295 insertions(+), 2 deletions(-)\n> >   create mode 100644 Documentation/guides/tuning/tuning.rst\n> > \n> > diff --git a/Documentation/conf.py b/Documentation/conf.py\n> > index 7eeea7f3865b..5387942b9af5 100644\n> > --- a/Documentation/conf.py\n> > +++ b/Documentation/conf.py\n> > @@ -21,8 +21,8 @@\n> >   # -- Project information -----------------------------------------------------\n> >   project = 'libcamera'\n> > -copyright = '2018-2019, The libcamera documentation authors'\n> > -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund'\n> > +copyright = '2018-2024, The libcamera documentation authors'\n> > +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug'\n> >   # Version information is provided by the build environment, through the\n> >   # sphinx command line.\n> > diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst\n> > new file mode 100644\n> > index 000000000000..a58dda350556\n> > --- /dev/null\n> > +++ b/Documentation/guides/tuning/tuning.rst\n> > @@ -0,0 +1,291 @@\n> > +.. SPDX-License-Identifier: CC-BY-SA-4.0\n> > +\n> > +Sensor Tuning Guide\n> > +===================\n> > +\n> > +To create visually good images from the raw data provided by the camera sensor,\n> > +a lot of image processing takes place. This is usually done inside an ISP (Image\n> > +Signal Processor). 
To be able to do the necessary processing, the corresponding\n> > +algorithms need to be parameterized according to the hardware in use (typically\n> > +sensor, light and lens). Calculating these parameters is a process called\n> > +tuning. The tuning process results in a tuning file which is then used by\n> > +libcamera to provide calibrated parameters to the algorithms at runtime.\n> > +\n> > +The processing blocks of an ISP vary from vendor to vendor and can be\n> > +arbitrarily complex. Nevertheless a diagram of common blocks frequently found in\n> > +an ISP design is shown below:\n> > +\n> > + ::\n> > +\n> > +  +--------+\n> > +  |  Light |\n> > +  +--------+\n> > +      |\n> > +      v\n> > +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+\n> > +  | Sensor |  -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma |\n> > +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+\n> > +\n> > +**Light** The light used to light the scene has a crucial influence on the\n> \"light used to light\" is a bit funny. Perhaps \"light used to illuminate\"?\n> > +resulting image. The human eye and brain are specialized in automatically\n> > +adapting to different lights. In a camera this has to be done by an algorithm.\n> > +Light is a complex topic and to correctly describe light sources you need to\n> > +describe the spectrum of the light. To simplify things, lights are categorized\n> > +according to their `color temperature (K)\n> > +<https://en.wikipedia.org/wiki/Color_temperature>`_. This is important to\n> > +keep in mind as it means that calibrations may differ between light sources even\n> > +though they have the same nominal color temperature.\n> > +\n> > +For best results the tuning images need to be taken for a complete range of\n> > +light sources.\n> > +\n> > +**Sensor** The sensor captures the incoming light and converts a measured\n> > +analogue signal to a digital representation which conveys the raw images. 
The\n> > +light is commonly filtered by a color filter array into red, green and blue\n> > +channels. As these filters are not perfect some postprocessing needs to be done\n> > +to recreate the correct color.\n> > +\n> > +**BLC** Black level correction. Even without incoming light a sensor produces\n> > +pixel values above zero. This is due to two system artifacts that impact the\n> > +measured light levels on the sensor. Firstly, a deliberate and artificially\n> > +added pedestal value is added to get an evenly distributed noise level around\n> > +zero to avoid negative values and clipping the data.\n> > +\n> > +Secondly, additional underlying electrical noise can be caused by various\n> > +external factors including thermal noise and electrical interferences.\n> > +\n> > +To get good images with real black, that black level needs to be subtracted. As\n> > +that level is typically known for a sensor it is hardcoded in libcamera and does\n> > +not need any calibration at the moment. If needed, that value can be manually\n> > +overwritten in the tuning configuration.\n> Can it? For all IPA modules?\n> > +\n> > +**AWB** Auto white balance. For a proper image the color channels need to be\n> > +adjusted, to get correct white balance. This means that monochrome objects in\n> > +the scene appear monochrome in the output image (white is white and gray is\n> > +gray). Presently in the libipa implementation of libcamera, this is managed by a\n> > +grey world model. No tuning is necessary for this step.\n> > +\n> > +**LSC** Lens shading correction. The lens in use has a big influence on the\n> > +resulting image. The typical effects on the image are lens-shading (also called\n> > +vignetting). This means that due to the physical properties of a lens the\n> > +transmission of light falls of towards the corners of the lens. To make things\n> s/falls of/falls off. 
I would perhaps link vignetting to the Wikipedia page.\n> > +even harder, this falloff can be different for different colors/wavelengths. LSC\n> > +is therefore tuned for a set of light sources and for each color channel\n> > +individually.\n> > +\n> > +**CCM** Color correction matrix. After the previous processing blocks the grays\n> > +are preserved, but colors are still not correct. This is mostly due to the color\n> > +temperature of the light source and the imperfections in the color filter array\n> > +of the sensor. To correct for this a 'color correction matrix' is calculated.\n> > +This is a 3x3 matrix, that is used to optimize the captured colors regarding the\n> > +perceived color error.\n> \n> \n> Perhaps \"used to optimize the captured colours by correcting the perceived color error\"?\n\nTo me \"correcting the error\" also doesn't really cut it. Would you be ok with \"used to\noptimize the captured colours by minimizing the perceived color error.\"?\n\n> \n> > To do this a chart of known and precisely measured colors\n> > +commonly called a macbeth chart is captured. Then the matrix is calculated using\n> > +linear optimization to best map each measured colour of the chart to it's known\n> > +value. The color error is measured in `deltaE\n> > +<https://en.wikipedia.org/wiki/Color_difference>`_.\n> > +\n> > +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for\n> > +display.\n> \n> The double \"display\" feels a bit off. Hmmmm...maybe \"Today's displays\n> usually apply a gamma of 2.2 to the image they show\"?\n> \n\ndone\n\n> \n> >   For images to be perceived correctly by the human eye, they need to be\n> > +encoded with the corresponding inverse gamma. See also\n> > +<https://en.wikipedia.org/wiki/Gamma_correction>. 
This block doesn't need\n> > +tuning, but is crucial for correct visual display.\n> > +\n> > +Materials needed for the tuning process\n> > +---------------------------------------\n> > +\n> > +Precisely calibrated optical equipment is very expensive and out of the scope of\n> > +this document. Still it is possible to get reasonably good calibration results\n> > +at little costs. The most important devices needed are:\n> s/costs/cost.\n\ndone\n\n> > +\n> > +   - A light box with the ability to produce defined light of different color\n> > +     temperatures. Typical temperatures used for calibration are 2400K\n> > +     (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K\n> > +     (daylight). As a first test, keylights for webcam streaming can be used.\n> > +     These are available with support for color temperatures ranging from 2500K\n> > +     to 9000K.\n> Is it worth mentioning that you're using those in combination with a DIY\n> lightbox rather than just openly in a room?\n\nI added: \"9000k and can be combined with a simple white or grey card\nbox. It is important, that the box is of neutral grey color, so that it\ndoesn't influence the measurements.\"\n\n> > For better results professional light boxes are needed.\n> > +   - A ColorChecker chart. These are sold from calibrite and it makes sense to\n> > +     get the original one.\n> I think that the reasons for getting the original one might be worth\n> expanding on, as it's probably the most expensive thing to acquire on this\n> list.\n\nI changed that to: It makes sense to get the original one, as there is\nno easy way to recreate this with similar quality.\n\n> > +   - A integration sphere. This is used to create completely homogenious light\n> > +     for the lens shading calibration. 
We had good results with the use of a\n> > +     large light panel with sufficient diffusion to get an even distribution of\n> > +     light (the above mentioned keylight).\n> > +   - An environment without external light sources. Ideally calibration is done\n> > +     without any external lights in a fully dark room. A black curtain is one\n> > +     solution, working by night is the cheap alternative.\n> > +\n> > +\n> > +Overview over the process\n> s/over/of\n\ndone\n\n> > +-------------------------\n> > +\n> > +The tuning process of libcamera consists of the following steps:\n> > +\n> > +   1. Take images for the tuning steps with different color temperatures\n> s/with/at\n\ndone\n\n> > +   2. Configure the tuning process\n> > +   3. Run the tuning script to create a corresponding tuning file\n> > +\n> > +\n> > +Taking raw images with defined exposure/gain\n> > +--------------------------------------------\n> > +\n> > +For all the tuning files you need to capture a raw image with a defined exposure\n> > +time and gain. It is crucial that no pixels are saturated on any channel. At the\n> > +same time the images should not be too dark. Strive for a exposure time, so that\n> \"Strive for an exposure time such that\"\n\ndone\n\n> > +the brightest pixel is roughly 90% saturated. Finding the correct exposure time\n> > +is currently a manual process. We are working on solutions to aid in that\n> > +process.\n> Are we? cool!\n\nIn my mind it's mostly done :-)... then reality hits.\n\n> >   After finding the correct exposure time settings, the easiest way to\n> > +capture a raw image is with the ``cam`` application that gets installed with the\n> > +regular libcamera install.\n> > +\n> > +Create a ``constant-exposure-<temperature>.yaml`` file with the following content:\n> > +\n> > +.. 
code:: yaml\n> > +\n> > +   frames:\n> > +     - 0:\n> > +         AeEnable: false\n> > +         AnalogueGain: 1.0\n> > +         ExposureTime: 1000\n> > +\n> The IPU3 IPA wouldn't handle these...one for my todo list, but perhaps we\n> need to formally define the requirements an IPA would need to meet to\n> support our tuning process somewhere or something...\n\nOh what is supported by the ipu3? Yes, a feature table somewhere would\nbe nice. Someday...\n\n> > +Then capture one or more images using the following command:\n> > +  ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=image_#.dng``\n> > +\n> > +.. Note:: Typically the brightness changes for different colour temperatures. So\n> > +    this has to be readjusted for every colour temperature.\n> > +\n> > +\n> > +Capture images for lens shading correction\n> > +------------------------------------------\n> > +\n> > +To be able to correct lens shading artifacts, images of a homogeneously lit\n> > +neutral gray background are needed. Ideally a integration sphere is used for\n> > +that task.\n> > +\n> > +As a fallback images of a flat panel light can be used. If there are multiple\n> > +images for the same color temperature, the tuning scripts will average the\n> > +tuning results which can further improve the tuning quality.\n> > +\n> > +.. Note:: Currently lsc is ignored in the ccm calculation. This can lead to\n> > +    slight color deviations and will be improved in the future.\n> > +\n> > +Images shall be taken for multiple color temperatures and named\n> > +``alsc_<temperature>k_#.dng``. ``#`` can be any number, if multiple images are\n> > +taken for the same color temperature.\n> > +\n> > +.. figure:: img/setup_lens_shading.jpg\n> > +    :width: 50%\n> > +\n> > +    Sample setup using a flat panel light. In this specific setup, the camera\n> > +    has to be moved close to the light due to the wide angle lens. 
This has the\n> > +    downside that the angle to the light might get way too steep.\n> I _think_ that the effect of that would be to increase the severity of the\n> vignetting and so the tuning process would increase the gains to counter it\n> more aggressively than would really be needed in a natural scene; is that\n> right? Maybe that's worth a note box (or warning? Or whatever the orange one\n> is)\n\nYes, that is my expectation. But as I really don't know the actual\nimpact of the angle and I don't know the impact of the increase in\ndistance to the light source I kept it quite vaguely. Now that I say it,\nmaybe we should actually compensate for the light decay due to distance\nchange. See https://en.wikipedia.org/wiki/Inverse-square_law . So in the\nend I don't know exactly what to put in that warning.\n\n> > +\n> > +.. figure:: img/camshark-alsc.png\n> > +    :width: 100%\n> > +\n> > +    A calibration image for lens shading correction.\n> > +\n> > +- Ensure the full sensor is lit. Especially with wide angle lenses it may be\n> > +  difficult to get the full field of view lit homogeneously.\n> > +- Ensure that the brightest area of the image does not contain any pixels that\n> > +  reach more than 95% on any of the colors (can be checked with camshark).\n> \n> \n> Ooh where does camshark show that?\n\nThere is a \"Info\" group, showing RGB and Mean values when you hover over\nthe image.\n\n> \n> > +\n> > +Take a raw dng image for multiple color temperatures:\n> > +    ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=alsc_3200k_#.dng``\n> > +\n> > +\n> > +\n> > +Capture images for color calibration\n> > +------------------------------------\n> > +\n> > +To do the color calibration, raw images of the color checker need to be taken\n> s/color checker/ColorChecker chart.\n\ndone\n\n> > +for different light sources. 
These need to be named\n> \n> > +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``.\n> > +\n> > +For best results the following hints should be taken into account:\n> > +  - Ensure that the 18% gray patch (third from the right in bottom line) is\n> > +    roughly at 18% saturation for all channels (a mean of 16-20% should be fine)\n> So I think that means you need to set the color temperature on your lamp and\n> then tune the intensity until the saturation is met - is that right?\n\nYes, either by tuning the intensity or changing the exposure time. I\nadded a sentence for that.\n\n> > +  - Ensure the color checker is homogeneously lit (ideally from 45 degree above)\n> > +  - No straylight from other light sources is present\n> > +  - The color checker is not too small and not too big (so that neither lens\n> > +    shading artifacts nor lens distortions are prevailing in the area of the\n> > +    color chart)\n> > +\n> > +If no lux meter is at hand to precisely measure the lux level, a lux meter app\n> > +on a mobile phone can provide a sufficient estimation.\n> > +\n> > +.. figure:: img/setup_calibration2.jpg\n> > +    :width: 50%\n> > +\n> > +\n> > +Run the tuning scripts\n> > +----------------------\n> > +\n> > +After taking the calibration images, you should have a directory with all the\n> > +tuning files. It should look something like this:\n> > +\n> > + ::\n> > +\n> > +   ../tuning-data/\n> > +   ├── alsc_2500k_0.dng\n> > +   ├── alsc_2500k_1.dng\n> > +   ├── alsc_2500k_2.dng\n> > +   ├── alsc_6500k_0.dng\n> > +   ├── imx335_1000l_2500k_0.dng\n> > +   ├── imx335_1200l_4000k_0.dng\n> > +   ├── imx335_1600l_6000k_0.dng\n> > +\n> > +\n> > +The tuning scripts are part of the libcamera source tree. 
After cloning the\n> > +libcamera sources the necessary steps to create a tuning file are:\n> > +\n> > + ::\n> > +\n> > +  # install the necessary python packages\n> > +  cd libcamera\n> > +  python -m venv venv\n> > +  source ./venv/bin/activate\n> > +  pip install -r utils/tuning/requirements.txt\n> > +\n> > +  # run the tuning script\n> > +  utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml\n> \n> There needs to be an explanation of the creation of config.yaml here I think.\n> \n\nArgh. I knew someone will spot that :-). The config file and code is\nsomething that really needs some love. But you are soo right. I added a\ncomment that just copies the config-example.yaml.\n\n> \n> Really good work in my opinion.\n\nThanks!\n\nBest regards,\nStefan\n\n> \n> > +\n> > +After the tuning script has run, the tuning file can be tested with any\n> > +libcamera based application like `qcam`. To quickly switch to a specific tuning\n> > +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful.\n> > +E.g.:\n> > +\n> > + ::\n> > +\n> > +    LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1\n> > +\n> > +\n> > +Sample images\n> > +-------------\n> > +\n> > +\n> > +.. figure:: img/image-no-blc.png\n> > +    :width: 800\n> > +\n> > +    Image without black level correction (and completely invalid color estimation).\n> > +\n> > +.. figure:: img/image-no-lsc-3000.png\n> > +    :width: 800\n> > +\n> > +    Image without lens shading correction @ 3000k. The vignetting artifacts can\n> > +    be clearly seen\n> > +\n> > +.. figure:: img/image-no-lsc-6000.png\n> > +    :width: 800\n> > +\n> > +    Image without lens shading correction @ 6000k. The vignetting artifacts can\n> > +    be clearly seen\n> > +\n> > +.. figure:: img/image-fully-tuned-3000.png\n> > +    :width: 800\n> > +\n> > +    Fully tuned image @ 3000k\n> > +\n> > +.. 
figure:: img/image-fully-tuned-6000.png\n> > +    :width: 800\n> > +\n> > +    Fully tuned image @ 6000k\n> > +\n> > diff --git a/Documentation/index.rst b/Documentation/index.rst\n> > index 5442ae75dde7..991dcf2b66fb 100644\n> > --- a/Documentation/index.rst\n> > +++ b/Documentation/index.rst\n> > @@ -16,6 +16,7 @@\n> >      Developer Guide <guides/introduction>\n> >      Application Writer's Guide <guides/application-developer>\n> > +   Sensor Tuning Guide <guides/tuning/tuning>\n> >      Pipeline Handler Writer's Guide <guides/pipeline-handler>\n> >      IPA Writer's guide <guides/ipa>\n> >      Tracing guide <guides/tracing>\n> > diff --git a/Documentation/meson.build b/Documentation/meson.build\n> > index 1ba40fdf67ac..8bf09f31afa0 100644\n> > --- a/Documentation/meson.build\n> > +++ b/Documentation/meson.build\n> > @@ -135,6 +135,7 @@ if sphinx.found()\n> >           'guides/ipa.rst',\n> >           'guides/pipeline-handler.rst',\n> >           'guides/tracing.rst',\n> > +        'guides/tuning/tuning.rst',\n> >           'index.rst',\n> >           'lens_driver_requirements.rst',\n> >           'python-bindings.rst',","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 1E750C324C\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue, 27 Aug 2024 09:56:50 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id A060963457;\n\tTue, 27 Aug 2024 11:56:48 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 7797F6344A\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 27 Aug 2024 11:56:47 +0200 (CEST)","from 
ideasonboard.com (unknown\n\t[IPv6:2a00:6020:448c:6c00:58b7:f3d:c9d4:defa])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 8206D9FF;\n\tTue, 27 Aug 2024 11:55:40 +0200 (CEST)"],"Date":"Tue, 27 Aug 2024 11:56:44 +0200","From":"Stefan Klug <stefan.klug@ideasonboard.com>","To":"Dan Scally <dan.scally@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Subject":"Re: [PATCH v2 1/1] Documentation: Add first version of the\n\ttuning-guide","Message-ID":"<dxuo7x4c6cqs5dpezn3fb7kruvtx4wdljue6fc7taqif7eirlg@zanihaqkhq42>","References":"<20240814074013.52693-1-stefan.klug@ideasonboard.com>\n\t<20240814074013.52693-2-stefan.klug@ideasonboard.com>\n\t<8052d1f5-24be-4a60-8d04-e3a888d3957b@ideasonboard.com>","MIME-Version":"1.0","Content-Type":"text/plain; 
charset=utf-8","In-Reply-To":"<8052d1f5-24be-4a60-8d04-e3a888d3957b@ideasonboard.com>"}},{"id":31042,"web_url":"https://patchwork.libcamera.org/comment/31042/","msgid":"<4643e192-437c-494c-8f75-46159f839bdf@ideasonboard.com>","date":"2024-09-02T11:12:58","subject":"Re: [PATCH v2 1/1] Documentation: Add first version of the\n\ttuning-guide","submitter":{"id":156,"url":"https://patchwork.libcamera.org/api/people/156/","name":"Dan Scally","email":"dan.scally@ideasonboard.com"},"content":"Hi Stefan\n\nOn 27/08/2024 10:56, Stefan Klug wrote:\n> Hi Dan,\n>\n> Thank you for your review.\n>\n> On Wed, Aug 14, 2024 at 09:29:01AM +0100, Dan Scally wrote:\n>> Hi Stefan, thanks for the patch. This is very good!\n>>\n>> On 14/08/2024 08:40, Stefan Klug wrote:\n>>> This patch adds a initial version of the tuning guide for libcamera. 
The\n>>> process is established based on the imx8mp and will be improved when we\n>>> do more tuning for different ISPs with our own tooling.\n>>>\n>>> Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>\n>>> ---\n>>>    Documentation/conf.py                  |   4 +-\n>>>    Documentation/guides/tuning/tuning.rst | 291 +++++++++++++++++++++++++\n>>>    Documentation/index.rst                |   1 +\n>>>    Documentation/meson.build              |   1 +\n>>>    4 files changed, 295 insertions(+), 2 deletions(-)\n>>>    create mode 100644 Documentation/guides/tuning/tuning.rst\n>>>\n>>> diff --git a/Documentation/conf.py b/Documentation/conf.py\n>>> index 7eeea7f3865b..5387942b9af5 100644\n>>> --- a/Documentation/conf.py\n>>> +++ b/Documentation/conf.py\n>>> @@ -21,8 +21,8 @@\n>>>    # -- Project information -----------------------------------------------------\n>>>    project = 'libcamera'\n>>> -copyright = '2018-2019, The libcamera documentation authors'\n>>> -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund'\n>>> +copyright = '2018-2024, The libcamera documentation authors'\n>>> +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug'\n>>>    # Version information is provided by the build environment, through the\n>>>    # sphinx command line.\n>>> diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst\n>>> new file mode 100644\n>>> index 000000000000..a58dda350556\n>>> --- /dev/null\n>>> +++ b/Documentation/guides/tuning/tuning.rst\n>>> @@ -0,0 +1,291 @@\n>>> +.. SPDX-License-Identifier: CC-BY-SA-4.0\n>>> +\n>>> +Sensor Tuning Guide\n>>> +===================\n>>> +\n>>> +To create visually good images from the raw data provided by the camera sensor,\n>>> +a lot of image processing takes place. This is usually done inside an ISP (Image\n>>> +Signal Processor). 
To be able to do the necessary processing, the corresponding\n>>> +algorithms need to be parameterized according to the hardware in use (typically\n>>> +sensor, light and lens). Calculating these parameters is a process called\n>>> +tuning. The tuning process results in a tuning file which is then used by\n>>> +libcamera to provide calibrated parameters to the algorithms at runtime.\n>>> +\n>>> +The processing blocks of an ISP vary from vendor to vendor and can be\n>>> +arbitrarily complex. Nevertheless a diagram of common blocks frequently found in\n>>> +an ISP design is shown below:\n>>> +\n>>> + ::\n>>> +\n>>> +  +--------+\n>>> +  |  Light |\n>>> +  +--------+\n>>> +      |\n>>> +      v\n>>> +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+\n>>> +  | Sensor |  -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma |\n>>> +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+\n>>> +\n>>> +**Light** The light used to light the scene has a crucial influence on the\n>> \"light used to light\" is a bit funny. Perhaps \"light used to illuminate\"?\n>>> +resulting image. The human eye and brain are specialized in automatically\n>>> +adapting to different lights. In a camera this has to be done by an algorithm.\n>>> +Light is a complex topic and to correctly describe light sources you need to\n>>> +describe the spectrum of the light. To simplify things, lights are categorized\n>>> +according to their `color temperature (K)\n>>> +<https://en.wikipedia.org/wiki/Color_temperature>`_. This is important to\n>>> +keep in mind as it means that calibrations may differ between light sources even\n>>> +though they have the same nominal color temperature.\n>>> +\n>>> +For best results the tuning images need to be taken for a complete range of\n>>> +light sources.\n>>> +\n>>> +**Sensor** The sensor captures the incoming light and converts a measured\n>>> +analogue signal to a digital representation which conveys the raw images. 
The\n>>> +light is commonly filtered by a color filter array into red, green and blue\n>>> +channels. As these filters are not perfect some postprocessing needs to be done\n>>> +to recreate the correct color.\n>>> +\n>>> +**BLC** Black level correction. Even without incoming light a sensor produces\n>>> +pixel values above zero. This is due to two system artifacts that impact the\n>>> +measured light levels on the sensor. Firstly, a deliberate and artificially\n>>> +added pedestal value is added to get an evenly distributed noise level around\n>>> +zero to avoid negative values and clipping the data.\n>>> +\n>>> +Secondly, additional underlying electrical noise can be caused by various\n>>> +external factors including thermal noise and electrical interferences.\n>>> +\n>>> +To get good images with real black, that black level needs to be subtracted. As\n>>> +that level is typically known for a sensor it is hardcoded in libcamera and does\n>>> +not need any calibration at the moment. If needed, that value can be manually\n>>> +overwritten in the tuning configuration.\n>> Can it? For all IPA modules?\n>>> +\n>>> +**AWB** Auto white balance. For a proper image the color channels need to be\n>>> +adjusted, to get correct white balance. This means that monochrome objects in\n>>> +the scene appear monochrome in the output image (white is white and gray is\n>>> +gray). Presently in the libipa implementation of libcamera, this is managed by a\n>>> +grey world model. No tuning is necessary for this step.\n>>> +\n>>> +**LSC** Lens shading correction. The lens in use has a big influence on the\n>>> +resulting image. The typical effects on the image are lens-shading (also called\n>>> +vignetting). This means that due to the physical properties of a lens the\n>>> +transmission of light falls of towards the corners of the lens. To make things\n>> s/falls of/falls off. 
I would perhaps link vignetting to the Wikipedia page.\n>>> +even harder, this falloff can be different for different colors/wavelengths. LSC\n>>> +is therefore tuned for a set of light sources and for each color channel\n>>> +individually.\n>>> +\n>>> +**CCM** Color correction matrix. After the previous processing blocks the grays\n>>> +are preserved, but colors are still not correct. This is mostly due to the color\n>>> +temperature of the light source and the imperfections in the color filter array\n>>> +of the sensor. To correct for this a 'color correction matrix' is calculated.\n>>> +This is a 3x3 matrix, that is used to optimize the captured colors regarding the\n>>> +perceived color error.\n>>\n>> Perhaps \"used to optimize the captured colours by correcting the perceived color error\"?\n> To me \"correcting the error\" also doesn't really cut it. Would you be ok with \"used to\n> optimize the captured colours by minimizing the perceived color error.\"?\nYep sounds ok to me.\n>\n>>> To do this a chart of known and precisely measured colors\n>>> +commonly called a macbeth chart is captured. Then the matrix is calculated using\n>>> +linear optimization to best map each measured colour of the chart to it's known\n>>> +value. The color error is measured in `deltaE\n>>> +<https://en.wikipedia.org/wiki/Color_difference>`_.\n>>> +\n>>> +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for\n>>> +display.\n>> The double \"display\" feels a bit off. Hmmmm...maybe \"Today's displays\n>> usually apply a gamma of 2.2 to the image they show\"?\n>>\n> done\n>\n>>>    For images to be perceived correctly by the human eye, they need to be\n>>> +encoded with the corresponding inverse gamma. See also\n>>> +<https://en.wikipedia.org/wiki/Gamma_correction>. 
This block doesn't need\n>>> +tuning, but is crucial for correct visual display.\n>>> +\n>>> +Materials needed for the tuning process\n>>> +---------------------------------------\n>>> +\n>>> +Precisely calibrated optical equipment is very expensive and out of the scope of\n>>> +this document. Still it is possible to get reasonably good calibration results\n>>> +at little costs. The most important devices needed are:\n>> s/costs/cost.\n> done\n>\n>>> +\n>>> +   - A light box with the ability to produce defined light of different color\n>>> +     temperatures. Typical temperatures used for calibration are 2400K\n>>> +     (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K\n>>> +     (daylight). As a first test, keylights for webcam streaming can be used.\n>>> +     These are available with support for color temperatures ranging from 2500K\n>>> +     to 9000K.\n>> Is it worth mentioning that you're using those in combination with a DIY\n>> lightbox rather than just openly in a room?\n> I added: \"9000k and can be combined with a simple white or grey card\n> box. It is important, that the box is of neutral grey color, so that it\n> doesn't influence the measurements.\"\n>\n>>> For better results professional light boxes are needed.\n>>> +   - A ColorChecker chart. These are sold from calibrite and it makes sense to\n>>> +     get the original one.\n>> I think that the reasons for getting the original one might be worth\n>> expanding on, as it's probably the most expensive thing to acquire on this\n>> list.\n> I changed that to: It makes sense to get the original one, as there is\n> no easy way to recreate this with similar quality.\n>\n>>> +   - A integration sphere. This is used to create completely homogenious light\n>>> +     for the lens shading calibration. 
We had good results with the use of a\n>>> +     large light panel with sufficient diffusion to get an even distribution of\n>>> +     light (the above mentioned keylight).\n>>> +   - An environment without external light sources. Ideally calibration is done\n>>> +     without any external lights in a fully dark room. A black curtain is one\n>>> +     solution, working by night is the cheap alternative.\n>>> +\n>>> +\n>>> +Overview over the process\n>> s/over/of\n> done\n>\n>>> +-------------------------\n>>> +\n>>> +The tuning process of libcamera consists of the following steps:\n>>> +\n>>> +   1. Take images for the tuning steps with different color temperatures\n>> s/with/at\n> done\n>\n>>> +   2. Configure the tuning process\n>>> +   3. Run the tuning script to create a corresponding tuning file\n>>> +\n>>> +\n>>> +Taking raw images with defined exposure/gain\n>>> +--------------------------------------------\n>>> +\n>>> +For all the tuning files you need to capture a raw image with a defined exposure\n>>> +time and gain. It is crucial that no pixels are saturated on any channel. At the\n>>> +same time the images should not be too dark. Strive for a exposure time, so that\n>> \"Strive for an exposure time such that\"\n> done\n>\n>>> +the brightest pixel is roughly 90% saturated. Finding the correct exposure time\n>>> +is currently a manual process. We are working on solutions to aid in that\n>>> +process.\n>> Are we? cool!\n> In my mind it's mostly done :-)... then reality hits.\nHah\n>\n>>>    After finding the correct exposure time settings, the easiest way to\n>>> +capture a raw image is with the ``cam`` application that gets installed with the\n>>> +regular libcamera install.\n>>> +\n>>> +Create a ``constant-exposure-<temperature>.yaml`` file with the following content:\n>>> +\n>>> +.. 
code:: yaml\n>>> +\n>>> +   frames:\n>>> +     - 0:\n>>> +         AeEnable: false\n>>> +         AnalogueGain: 1.0\n>>> +         ExposureTime: 1000\n>>> +\n>> The IPU3 IPA wouldn't handle these...one for my todo list, but perhaps we\n>> need to formally define the requirements an IPA would need to meet to\n>> support our tuning process somewhere or something...\n> Oh what is supported by the ipu3?\n\n\nI can't remember offhand the controls it does handle, but it doesn't handle those controls at the \nmoment. It really should though, as I say one for my to-do list.\n\n> Yes, a feature table somewhere would\n> be nice. Someday...\n>\n>>> +Then capture one or more images using the following command:\n>>> +  ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=image_#.dng``\n>>> +\n>>> +.. Note:: Typically the brightness changes for different colour temperatures. So\n>>> +    this has to be readjusted for every colour temperature.\n>>> +\n>>> +\n>>> +Capture images for lens shading correction\n>>> +------------------------------------------\n>>> +\n>>> +To be able to correct lens shading artifacts, images of a homogeneously lit\n>>> +neutral gray background are needed. Ideally a integration sphere is used for\n>>> +that task.\n>>> +\n>>> +As a fallback images of a flat panel light can be used. If there are multiple\n>>> +images for the same color temperature, the tuning scripts will average the\n>>> +tuning results which can further improve the tuning quality.\n>>> +\n>>> +.. Note:: Currently lsc is ignored in the ccm calculation. This can lead to\n>>> +    slight color deviations and will be improved in the future.\n>>> +\n>>> +Images shall be taken for multiple color temperatures and named\n>>> +``alsc_<temperature>k_#.dng``. ``#`` can be any number, if multiple images are\n>>> +taken for the same color temperature.\n>>> +\n>>> +.. 
figure:: img/setup_lens_shading.jpg\n>>> +    :width: 50%\n>>> +\n>>> +    Sample setup using a flat panel light. In this specific setup, the camera\n>>> +    has to be moved close to the light due to the wide angle lens. This has the\n>>> +    downside that the angle to the light might get way too steep.\n>> I _think_ that the effect of that would be to increase the severity of the\n>> vignetting and so the tuning process would increase the gains to counter it\n>> more aggressively than would really be needed in a natural scene; is that\n>> right? Maybe that's worth a note box (or warning? Or whatever the orange one\n>> is)\n> Yes, that is my expectation. But as I really don't know the actual\n> impact of the angle and I don't know the impact of the increase in\n> distance to the light source I kept it quite vaguely. Now that I say it,\n> maybe we should actually compensate for the light decay due to distance\n> change. See https://en.wikipedia.org/wiki/Inverse-square_law . So in the\n> end I don't know exactly what to put in that warning.\nFair enough, no point adding one if we're unsure.\n>\n>>> +\n>>> +.. figure:: img/camshark-alsc.png\n>>> +    :width: 100%\n>>> +\n>>> +    A calibration image for lens shading correction.\n>>> +\n>>> +- Ensure the full sensor is lit. 
Especially with wide angle lenses it may be\n>>> +  difficult to get the full field of view lit homogeneously.\n>>> +- Ensure that the brightest area of the image does not contain any pixels that\n>>> +  reach more than 95% on any of the colors (can be checked with camshark).\n>>\n>> Ooh where does camshark show that?\n> There is a \"Info\" group, showing RGB and Mean values when you hover over\n> the image.\n\n\nThanks!\n\n>\n>>> +\n>>> +Take a raw dng image for multiple color temperatures:\n>>> +    ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=alsc_3200k_#.dng``\n>>> +\n>>> +\n>>> +\n>>> +Capture images for color calibration\n>>> +------------------------------------\n>>> +\n>>> +To do the color calibration, raw images of the color checker need to be taken\n>> s/color checker/ColorChecker chart.\n> done\n>\n>>> +for different light sources. These need to be named\n>>> +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``.\n>>> +\n>>> +For best results the following hints should be taken into account:\n>>> +  - Ensure that the 18% gray patch (third from the right in bottom line) is\n>>> +    roughly at 18% saturation for all channels (a mean of 16-20% should be fine)\n>> So I think that means you need to set the color temperature on your lamp and\n>> then tune the intensity until the saturation is met - is that right?\n> Yes, either by tuning the intensity or changing the exposure time. I\n> added a sentence for that.\n>\n>>> +  - Ensure the color checker is homogeneously lit (ideally from 45 degree above)\n>>> +  - No straylight from other light sources is present\n>>> +  - The color checker is not too small and not too big (so that neither lens\n>>> +    shading artifacts nor lens distortions are prevailing in the area of the\n>>> +    color chart)\n>>> +\n>>> +If no lux meter is at hand to precisely measure the lux level, a lux meter app\n>>> +on a mobile phone can provide a sufficient estimation.\n>>> +\n>>> +.. 
figure:: img/setup_calibration2.jpg\n>>> +    :width: 50%\n>>> +\n>>> +\n>>> +Run the tuning scripts\n>>> +----------------------\n>>> +\n>>> +After taking the calibration images, you should have a directory with all the\n>>> +tuning files. It should look something like this:\n>>> +\n>>> + ::\n>>> +\n>>> +   ../tuning-data/\n>>> +   ├── alsc_2500k_0.dng\n>>> +   ├── alsc_2500k_1.dng\n>>> +   ├── alsc_2500k_2.dng\n>>> +   ├── alsc_6500k_0.dng\n>>> +   ├── imx335_1000l_2500k_0.dng\n>>> +   ├── imx335_1200l_4000k_0.dng\n>>> +   ├── imx335_1600l_6000k_0.dng\n>>> +\n>>> +\n>>> +The tuning scripts are part of the libcamera source tree. After cloning the\n>>> +libcamera sources the necessary steps to create a tuning file are:\n>>> +\n>>> + ::\n>>> +\n>>> +  # install the necessary python packages\n>>> +  cd libcamera\n>>> +  python -m venv venv\n>>> +  source ./venv/bin/activate\n>>> +  pip install -r utils/tuning/requirements.txt\n>>> +\n>>> +  # run the tuning script\n>>> +  utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml\n>> There needs to be an explanation of the creation of config.yaml here I think.\n>>\n> Argh. I knew someone would spot that :-). The config file and code are\n> something that really needs some love. But you are soo right. I added a\n> comment that just copies the config-example.yaml.\n>\n>> Really good work in my opinion.\n> Thanks!\n>\n> Best regards,\n> Stefan\n>\n>>> +\n>>> +After the tuning script has run, the tuning file can be tested with any\n>>> +libcamera based application like `qcam`. To quickly switch to a specific tuning\n>>> +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful.\n>>> +E.g.:\n>>> +\n>>> + ::\n>>> +\n>>> +    LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1\n>>> +\n>>> +\n>>> +Sample images\n>>> +-------------\n>>> +\n>>> +\n>>> +.. 
figure:: img/image-no-blc.png\n>>> +    :width: 800\n>>> +\n>>> +    Image without black level correction (and completely invalid color estimation).\n>>> +\n>>> +.. figure:: img/image-no-lsc-3000.png\n>>> +    :width: 800\n>>> +\n>>> +    Image without lens shading correction @ 3000k. The vignetting artifacts can\n>>> +    be clearly seen\n>>> +\n>>> +.. figure:: img/image-no-lsc-6000.png\n>>> +    :width: 800\n>>> +\n>>> +    Image without lens shading correction @ 6000k. The vignetting artifacts can\n>>> +    be clearly seen\n>>> +\n>>> +.. figure:: img/image-fully-tuned-3000.png\n>>> +    :width: 800\n>>> +\n>>> +    Fully tuned image @ 3000k\n>>> +\n>>> +.. figure:: img/image-fully-tuned-6000.png\n>>> +    :width: 800\n>>> +\n>>> +    Fully tuned image @ 6000k\n>>> +\n>>> diff --git a/Documentation/index.rst b/Documentation/index.rst\n>>> index 5442ae75dde7..991dcf2b66fb 100644\n>>> --- a/Documentation/index.rst\n>>> +++ b/Documentation/index.rst\n>>> @@ -16,6 +16,7 @@\n>>>       Developer Guide <guides/introduction>\n>>>       Application Writer's Guide <guides/application-developer>\n>>> +   Sensor Tuning Guide <guides/tuning/tuning>\n>>>       Pipeline Handler Writer's Guide <guides/pipeline-handler>\n>>>       IPA Writer's guide <guides/ipa>\n>>>       Tracing guide <guides/tracing>\n>>> diff --git a/Documentation/meson.build b/Documentation/meson.build\n>>> index 1ba40fdf67ac..8bf09f31afa0 100644\n>>> --- a/Documentation/meson.build\n>>> +++ b/Documentation/meson.build\n>>> @@ -135,6 +135,7 @@ if sphinx.found()\n>>>            'guides/ipa.rst',\n>>>            'guides/pipeline-handler.rst',\n>>>            'guides/tracing.rst',\n>>> +        'guides/tuning/tuning.rst',\n>>>            'index.rst',\n>>>            'lens_driver_requirements.rst',\n>>>            
'python-bindings.rst',","headers":{"Message-ID":"<4643e192-437c-494c-8f75-46159f839bdf@ideasonboard.com>","Date":"Mon, 2 Sep 2024 12:12:58 +0100","MIME-Version":"1.0","User-Agent":"Mozilla Thunderbird","Subject":"Re: [PATCH v2 1/1] Documentation: Add first version of the\n\ttuning-guide","To":"Stefan Klug 
<stefan.klug@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","References":"<20240814074013.52693-1-stefan.klug@ideasonboard.com>\n\t<20240814074013.52693-2-stefan.klug@ideasonboard.com>\n\t<8052d1f5-24be-4a60-8d04-e3a888d3957b@ideasonboard.com>\n\t<dxuo7x4c6cqs5dpezn3fb7kruvtx4wdljue6fc7taqif7eirlg@zanihaqkhq42>","Content-Language":"en-US","From":"Dan Scally <dan.scally@ideasonboard.com>","In-Reply-To":"<dxuo7x4c6cqs5dpezn3fb7kruvtx4wdljue6fc7taqif7eirlg@zanihaqkhq42>","Content-Type":"text/plain; charset=UTF-8; 
format=flowed","Content-Transfer-Encoding":"8bit"}},{"id":34278,"web_url":"https://patchwork.libcamera.org/comment/34278/","msgid":"<007a06bf-6967-4ab8-b5a9-6cc23125ca73@cherry.de>","date":"2025-05-19T14:38:03","subject":"Re: [PATCH v2 1/1] Documentation: Add first version of the\n\ttuning-guide","submitter":{"id":202,"url":"https://patchwork.libcamera.org/api/people/202/","name":"Quentin Schulz","email":"quentin.schulz@cherry.de"},"content":"Hi Stefan,\n\nI've been pointed at this when asking where I could find information on \nhow to write a configuration file for a new camera sensor.\n\nOn 8/14/24 9:40 AM, Stefan Klug wrote:\n> This patch adds a initial version of the tuning guide for libcamera. 
The\n> process is established based on the imx8mp and will be improved when we\n> do more tuning for different ISPs with our own tooling.\n> \n> Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>\n> ---\n>   Documentation/conf.py                  |   4 +-\n>   Documentation/guides/tuning/tuning.rst | 291 +++++++++++++++++++++++++\n>   Documentation/index.rst                |   1 +\n>   Documentation/meson.build              |   1 +\n>   4 files changed, 295 insertions(+), 2 deletions(-)\n>   create mode 100644 Documentation/guides/tuning/tuning.rst\n> \n> diff --git a/Documentation/conf.py b/Documentation/conf.py\n> index 7eeea7f3865b..5387942b9af5 100644\n> --- a/Documentation/conf.py\n> +++ b/Documentation/conf.py\n> @@ -21,8 +21,8 @@\n>   # -- Project information -----------------------------------------------------\n>   \n>   project = 'libcamera'\n> -copyright = '2018-2019, The libcamera documentation authors'\n> -author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund'\n> +copyright = '2018-2024, The libcamera documentation authors'\n> +author = u'Kieran Bingham, Jacopo Mondi, Laurent Pinchart, Niklas Söderlund, Stefan Klug'\n>   \n>   # Version information is provided by the build environment, through the\n>   # sphinx command line.\n> diff --git a/Documentation/guides/tuning/tuning.rst b/Documentation/guides/tuning/tuning.rst\n> new file mode 100644\n> index 000000000000..a58dda350556\n> --- /dev/null\n> +++ b/Documentation/guides/tuning/tuning.rst\n> @@ -0,0 +1,291 @@\n> +.. SPDX-License-Identifier: CC-BY-SA-4.0\n> +\n> +Sensor Tuning Guide\n> +===================\n> +\n> +To create visually good images from the raw data provided by the camera sensor,\n> +a lot of image processing takes place. This is usually done inside an ISP (Image\n> +Signal Processor). To be able to do the necessary processing, the corresponding\n> +algorithms need to be parameterized according to the hardware in use (typically\n> +sensor, light and lens). 
Calculating these parameters is a process called\n> +tuning. The tuning process results in a tuning file which is then used by\n> +libcamera to provide calibrated parameters to the algorithms at runtime.\n> +\n> +The processing blocks of an ISP vary from vendor to vendor and can be\n> +arbitrarily complex. Nevertheless a diagram of common blocks frequently found in\n> +an ISP design is shown below:\n> +\n> + ::\n> +\n> +  +--------+\n> +  |  Light |\n> +  +--------+\n> +      |\n> +      v\n> +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+\n> +  | Sensor |  -> | BLC | -> | AWB | -> | LSC | -> | CCM | -> | Gamma |\n> +  +--------+     +-----+    +-----+    +-----+    +-----+    +-------+\n> +\n> +**Light** The light used to light the scene has a crucial influence on the\n> +resulting image. The human eye and brain are specialized in automatically\n> +adapting to different lights. In a camera this has to be done by an algorithm.\n> +Light is a complex topic and to correctly describe light sources you need to\n> +describe the spectrum of the light. To simplify things, lights are categorized\n> +according to their `color temperature (K)\n> +<https://en.wikipedia.org/wiki/Color_temperature>`_. This is important to\n> +keep in mind as it means that calibrations may differ between light sources even\n> +though they have the same nominal color temperature.\n> +\n> +For best results the tuning images need to be taken for a complete range of\n> +light sources.\n> +\n> +**Sensor** The sensor captures the incoming light and converts a measured\n> +analogue signal to a digital representation which conveys the raw images. The\n> +light is commonly filtered by a color filter array into red, green and blue\n> +channels. As these filters are not perfect some postprocessing needs to be done\n> +to recreate the correct color.\n> +\n> +**BLC** Black level correction. Even without incoming light a sensor produces\n> +pixel values above zero. 
This is due to two system artifacts that impact the\n> +measured light levels on the sensor. Firstly, a deliberate and artificially\n> +added pedestal value is added to get an evenly distributed noise level around\n> +zero to avoid negative values and clipping the data.\n> +\n> +Secondly, additional underlying electrical noise can be caused by various\n> +external factors including thermal noise and electrical interferences.\n> +\n> +To get good images with real black, that black level needs to be subtracted. As\n> +that level is typically known for a sensor it is hardcoded in libcamera and does\n> +not need any calibration at the moment. If needed, that value can be manually\n> +overwritten in the tuning configuration.\n> +\n\nI see BLC registers in OV5675 and OV13850 datasheets, but considering \nthere's no BLC info exposed by v4l2 UAPI (only V4L2_CID_BLACK_LEVEL \nwhich is marked as deprecated (for a long time already)), I assume the \ncamera sensor is actually doing all the maths for us already and we \ncould have 0 here?\n\nIf that's correct, could we specify that here too? The datasheet for the \nOV13850 can be easily found on the Internet though I have WAY more \nregisters on the OV5675 datasheet.\n\n> +**AWB** Auto white balance. For a proper image the color channels need to be\n> +adjusted, to get correct white balance. This means that monochrome objects in\n> +the scene appear monochrome in the output image (white is white and gray is\n> +gray). Presently in the libipa implementation of libcamera, this is managed by a\n> +grey world model. No tuning is necessary for this step.\n> +\n> +**LSC** Lens shading correction. The lens in use has a big influence on the\n> +resulting image. The typical effects on the image are lens-shading (also called\n> +vignetting). This means that due to the physical properties of a lens the\n> +transmission of light falls of towards the corners of the lens. 
To make things\n> +even harder, this falloff can be different for different colors/wavelengths. LSC\n> +is therefore tuned for a set of light sources and for each color channel\n> +individually.\n> +\n> +**CCM** Color correction matrix. After the previous processing blocks the grays\n> +are preserved, but colors are still not correct. This is mostly due to the color\n> +temperature of the light source and the imperfections in the color filter array\n> +of the sensor. To correct for this a 'color correction matrix' is calculated.\n> +This is a 3x3 matrix, that is used to optimize the captured colors regarding the\n> +perceived color error. To do this a chart of known and precisely measured colors\n> +commonly called a macbeth chart is captured. Then the matrix is calculated using\n\nFirst time you talk about the Macbeth chart but you used ColorChecker \nbefore, so maybe stick with that?\n\n> +linear optimization to best map each measured colour of the chart to it's known\n\ns/it's/its/\n\n> +value. The color error is measured in `deltaE\n> +<https://en.wikipedia.org/wiki/Color_difference>`_.\n> +\n> +**Gamma** Gamma correction. Today's display usually apply a gamma of 2.2 for\n\ns/display/displays/ I assume here? Otherwise conjugation of \"apply\" \nisn't right.\n\n> +display. For images to be perceived correctly by the human eye, they need to be\n> +encoded with the corresponding inverse gamma. See also\n> +<https://en.wikipedia.org/wiki/Gamma_correction>. This block doesn't need\n> +tuning, but is crucial for correct visual display.\n> +\n> +Materials needed for the tuning process\n> +---------------------------------------\n> +\n> +Precisely calibrated optical equipment is very expensive and out of the scope of\n> +this document. Still it is possible to get reasonably good calibration results\n> +at little costs. 
The most important devices needed are:\n> +\n> +   - A light box with the ability to produce defined light of different color\n\nDo not indent, somehow Sphinx sees this as a quote?\n\n> +     temperatures. Typical temperatures used for calibration are 2400K\n> +     (incandescent), 2700K (fluorescent), 5000K (daylight fluorescent), 6500K\n> +     (daylight). As a first test, keylights for webcam streaming can be used.\n> +     These are available with support for color temperatures ranging from 2500K\n> +     to 9000K. For better results professional light boxes are needed.\n> +   - A ColorChecker chart. These are sold from calibrite and it makes sense to\n> +     get the original one.\n\nMaybe provide a link to the wikipedia page so people know what's being \ntalked about? https://en.wikipedia.org/wiki/ColorChecker\n\n> +   - A integration sphere. This is used to create completely homogenious light\n\ns/A/An/\n\ns/homogenious/homogeneous/\n\nIs it supposed to be an integrating sphere rather than integration \nsphere maybe? c.f. https://en.wikipedia.org/wiki/Integrating_sphere\n\n> +     for the lens shading calibration. We had good results with the use of a\n> +     large light panel with sufficient diffusion to get an even distribution of\n> +     light (the above mentioned keylight).\n\nWhat kind of diffusion contraption did you use? It's not entirely clear \nhow one is supposed to source an integrating sphere and the few I saw \nare prohibitively expensive :)\n\n> +   - An environment without external light sources. Ideally calibration is done\n> +     without any external lights in a fully dark room. 
A black curtain is one\n\ns/lights/light/\n\n> +     solution, working by night is the cheap alternative.\n> +\n\nIt's never pitch black by night though, you typically have light \npollution in the city or moon light otherwise?\n\n> +\n> +Overview over the process\n\ns/Overview over the process/Process overview/\n\nor Overview of the process?\n\n> +-------------------------\n> +\n> +The tuning process of libcamera consists of the following steps:\n> +\n> +   1. Take images for the tuning steps with different color temperatures\n> +   2. Configure the tuning process\n> +   3. Run the tuning script to create a corresponding tuning file\n> +\n\nDo not indent, somehow Sphinx sees this as a quote?\n\n> +\n> +Taking raw images with defined exposure/gain\n> +--------------------------------------------\n> +\n> +For all the tuning files you need to capture a raw image with a defined exposure\n> +time and gain. It is crucial that no pixels are saturated on any channel. At the\n\ns/no pixels are/no pixel is/ ?\n\n> +same time the images should not be too dark. Strive for a exposure time, so that\n\nWhat is \"too dark\"? We have some info on the max saturation, but not on \nwhat would be too dark.\n\n> +the brightest pixel is roughly 90% saturated. Finding the correct exposure time\n> +is currently a manual process. We are working on solutions to aid in that\n> +process. After finding the correct exposure time settings, the easiest way to\n> +capture a raw image is with the ``cam`` application that gets installed with the\n> +regular libcamera install.\n\ns/that gets installed with the regular libcamera install./that typically \ngets installed by libcamera./ ?\n\n> +Create a ``constant-exposure-<temperature>.yaml`` file with the following content:\n> +\n> +.. 
code:: yaml\n> +\n> +   frames:\n> +     - 0:\n> +         AeEnable: false\n> +         AnalogueGain: 1.0\n> +         ExposureTime: 1000\n> +\n> +Then capture one or more images using the following command:\n> +  ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=image_#.dng``\n> +\n> +.. Note:: Typically the brightness changes for different colour temperatures. So\n> +    this has to be readjusted for every colour temperature.\n> +\n\nBy \"this\" I assume you mean exposure time?\n\nI see that some camera helpers for the Raspberry Pi ISP \n(RPi, not PiSP) skip 2 frames because they are too under-exposed. Do we \nneed to take care of that/warn about that here?\n\nAdditionally, I believe this won't work with new sensors as libcamera \nalways expects a YAML tuning file for each detected sensor? I had to add \na dummy one to get libcamera to capture anything on my OV5675 with \nRockchip's ISP. So we probably should document that as well?\n\n> +\n> +Capture images for lens shading correction\n> +------------------------------------------\n> +\n> +To be able to correct lens shading artifacts, images of a homogeneously lit\n> +neutral gray background are needed. Ideally a integration sphere is used for\n> +that task.\n> +\n> +As a fallback images of a flat panel light can be used. If there are multiple\n\nAre keylights and flat panel lights the same thing? It's the first time \nyou speak about flat panel lights, hence the question.\n\n> +images for the same color temperature, the tuning scripts will average the\n> +tuning results which can further improve the tuning quality.\n> +\n> +.. Note:: Currently lsc is ignored in the ccm calculation. 
This can lead to\n> +    slight color deviations and will be improved in the future.\n> +\n\nCapitalize LSC and CCM and maybe have a link to the earlier section \nwhere each is defined/explained?\n\n> +Images shall be taken for multiple color temperatures and named\n> +``alsc_<temperature>k_#.dng``. ``#`` can be any number, if multiple images are\n> +taken for the same color temperature.\n> +\n\nIsn't libcamera capable of replacing this # character with the proper \ndigit whenever you pass --capture=X? Should we NOT use that but really \nhave multiple \"human\" delays between captures?\n\n> +.. figure:: img/setup_lens_shading.jpg\n> +    :width: 50%\n> +\n> +    Sample setup using a flat panel light. In this specific setup, the camera\n> +    has to be moved close to the light due to the wide angle lens. This has the\n> +    downside that the angle to the light might get way too steep.\n\nNo clue what you wanted to convey with the last sentence, can you reword?\n\n> +\n> +.. figure:: img/camshark-alsc.png\n> +    :width: 100%\n> +\n> +    A calibration image for lens shading correction.\n\n+using camshark\n\n> +\n> +- Ensure the full sensor is lit. 
Especially with wide angle lenses it may be\n> +  difficult to get the full field of view lit homogeneously.\n\nHow does one make sure of that?\n\n> +- Ensure that the brightest area of the image does not contain any pixels that\n> +  reach more than 95% on any of the colors (can be checked with camshark).\n\ns/any pixels that reach/any pixel that reaches/\n\nExplain how to check with camshark :)\n\n> +\n> +Take a raw dng image for multiple color temperatures:\n> +    ``cam -c 1 --script constant-exposure-<temperature>.yaml -C -s role=raw --file=alsc_3200k_#.dng``\n\nalsc_<temperature>_#.dng I assume?\n\nAlso, isn't -C without an argument essentially \"forever\" (I don't \nremember tbh :) ) or is it only one because of having only a \nconfiguration file for one frame?\n\nMaybe use a code block there instead of double tick quoting it?\n\n> +\n> +\n> +\n> +Capture images for color calibration\n> +------------------------------------\n> +\n> +To do the color calibration, raw images of the color checker need to be taken\n> +for different light sources. These need to be named\n> +``<sensor_name>_<lux-level>l_<temperature>k_0.dng``.\n> +\n> +For best results the following hints should be taken into account:\n> +  - Ensure that the 18% gray patch (third from the right in bottom line) is\n\n+in the ColorChecker\n\n> +    roughly at 18% saturation for all channels (a mean of 16-20% should be fine)\n\nI assume you used camshark for that? Can you explain or point at the \nprevious explanation from LSC chapter?\n\n> +  - Ensure the color checker is homogeneously lit (ideally from 45 degree above)\n> +  - No straylight from other light sources is present\n> +  - The color checker is not too small and not too big (so that neither lens\n> +    shading artifacts nor lens distortions are prevailing in the area of the\n> +    color chart)\n> +\n\nWikipedia says \"ColorChecker\" and not color checker :) c.f. 
\nhttps://en.wikipedia.org/wiki/ColorChecker\n\nHow do I check it's not too small or too big or for lens distortions?\n\nI assume we also want to talk about focal length so that things are in \nfocus? E.g. I have a camera module which has a focus lens driver, so we \nprobably should indicate to be careful about that (e.g. no autofocus \nwhatsoever, same focus lens for all pictures, etc.). Question: would \nthe focus lens impact all those results? I believe on my DSLR 12-24mm wide \nangle I have different distortions at 12mm than at 24mm? I see that the RPi \nand IPU3 ISPs have AF algorithms; how are we planning on handling that?\n\n> +If no lux meter is at hand to precisely measure the lux level, a lux meter app\n> +on a mobile phone can provide a sufficient estimation.\n> +\n> +.. figure:: img/setup_calibration2.jpg\n> +    :width: 50%\n> +\n\nPlease provide a caption here :)\n\n> +\n> +Run the tuning scripts\n> +----------------------\n> +\n> +After taking the calibration images, you should have a directory with all the\n> +tuning files. It should look something like this:\n> +\n> + ::\n\ns/like this:\\n\\n::/like this::/\n\n> +\n> +   ../tuning-data/\n> +   ├── alsc_2500k_0.dng\n> +   ├── alsc_2500k_1.dng\n> +   ├── alsc_2500k_2.dng\n> +   ├── alsc_6500k_0.dng\n> +   ├── imx335_1000l_2500k_0.dng\n> +   ├── imx335_1200l_4000k_0.dng\n> +   ├── imx335_1600l_6000k_0.dng\n> +\n\nSomething's odd here, Sphinx renders this as a code-block inside a quote?\n\n> +\n> +The tuning scripts are part of the libcamera source tree. 
After cloning the\n> +libcamera sources the necessary steps to create a tuning file are:\n> +\n> + ::\n> +\n\nDitto.\n\n> +  # install the necessary python packages\n> +  cd libcamera\n> +  python -m venv venv\n> +  source ./venv/bin/activate\n> +  pip install -r utils/tuning/requirements.txt\n> +\n> +  # run the tuning script\n> +  utils/tuning/rkisp1.py -c config.yaml -i ../tuning-data/ -o tuning-file.yaml\n> +\n\nSomething's odd here, Sphinx renders this as a code-block inside a quote?\n\nShould we provide some meson.build with options to install this script? \nI assume it could be useful in some debian, Yocto, buildroot package?\n\n> +After the tuning script has run, the tuning file can be tested with any\n> +libcamera based application like `qcam`. To quickly switch to a specific tuning\n> +file, the environment variable ``LIBCAMERA_<pipeline>_TUNING_FILE`` is helpful.\n> +E.g.:\n> +\n> + ::\n\ns/:\\n\\n::/::/\n\n> +\n> +    LIBCAMERA_RKISP1_TUNING_FILE=/path/to/tuning-file.yaml qcam -c1\n> +\n\nSomething's odd here, Sphinx renders this as a code-block inside a quote?\n\nAh, would that be a way to avoid having to define a configuration file \nfor a new sensor for cam to capture images? e.g. \nLIBCAMERA_RKISP1_TUNING_FILE=uncalibrated.yaml cam...\n?\n\n> +\n> +Sample images\n> +-------------\n> +\n> +\n> +.. figure:: img/image-no-blc.png\n> +    :width: 800\n> +\n> +    Image without black level correction (and completely invalid color estimation).\n> +\n> +.. figure:: img/image-no-lsc-3000.png\n> +    :width: 800\n> +\n> +    Image without lens shading correction @ 3000k. The vignetting artifacts can\n> +    be clearly seen\n\nHonestly, to the untrained eye... not really until you see the last \nimages you can compare it to.\n\nThanks for the example figures, I was wondering how much of the picture \nthe ColorChecker should take.\n\nI've been told that we don't want pictures in the libcamera git repo and \nthat is one of the reasons this wasn't merged. 
Can we host the photos \nsomewhere (better on something managed by the libcamera community, so it \ndoesn't disappear :) ) and link to them or have the compilation script \ndownload them locally before?\n\nCheers,\nQuentin","headers":{"Message-ID":"<007a06bf-6967-4ab8-b5a9-6cc23125ca73@cherry.de>","Date":"Mon, 19 May 2025 16:38:03 +0200","User-Agent":"Mozilla Thunderbird","Subject":"Re: [PATCH v2 1/1] Documentation: Add first version of the\n\ttuning-guide","To":"Stefan Klug 
<stefan.klug@ideasonboard.com>,\n\tlibcamera-devel@lists.libcamera.org","References":"<20240814074013.52693-1-stefan.klug@ideasonboard.com>\n\t<20240814074013.52693-2-stefan.klug@ideasonboard.com>","Content-Language":"en-US","From":"Quentin Schulz <quentin.schulz@cherry.de>","In-Reply-To":"<20240814074013.52693-2-stefan.klug@ideasonboard.com>","Content-Type":"text/plain; charset=UTF-8; format=flowed","Content-Transfer-Encoding":"8bit","X-ClientProxiedBy":"FR4P281CA0109.DEUP281.PROD.OUTLOOK.COM\n\t(2603:10a6:d10:bb::13) To AS8PR04MB8897.eurprd04.prod.outlook.com\n\t(2603:10a6:20b:42c::20)","MIME-Version":"1.0","X-MS-PublicTrafficType":"Email","X-MS-TrafficTypeDiagnostic":"AS8PR04MB8897:EE_|PA2PR04MB10311:EE_","X-MS-Office365-Filtering-Correlation-Id":"a895f746-b7ff-4ab2-cfe6-08dd96e2c3b0","X-MS-Exchange-SenderADCheck":"1","X-MS-Exchange-AntiSpam-Relay":"0","X-Microsoft-Antispam":"BCL:0;\n\tARA:13230040|376014|1800799024|366016|7053199007|13003099007; ","X-Microsoft-Antispam-Message-Info":"=?utf-8?q?ddoVcUm0bk/M71LVQ+9w4cSKDJya?=\n\t=?utf-8?q?tTZ5aacHK3YL+H6B/B3dShlA7JyAR4bdcydXCuCnAnMvxsEDRW+6VdR9?=\n\t=?utf-8?q?aYfyc9h2FB7XF1UxSWz9Z5SU5TCEEmTrMROiZNyouKbhjXXP68wnqpo8?=\n\t=?utf-8?q?IEORahbvSV/BW3dP5R7H0VH2BKBNBD/arexAGsKq0EIMYyRByH7ZWz9A?=\n\t=?utf-8?q?hE49OrQw7yJhk8lcvwlp11e2uD1BLLkdgAVb6CuUJrQbi2w6W6944EXE?=\n\t=?utf-8?q?uCDxF1iLgu3KnhA7ToAfxUDlGlPsBHJpnmK7Pmn3XSulSdOVUFC1x1XG?=\n\t=?utf-8?q?eDcNPTT+DVIz65/amFJbhLGA4u9tAUooDwVyi7udFz+FkHtlD3vBBEy+?=\n\t=?utf-8?q?mrsqbPqYyltLdSgrFnXpHJXF0f5MdbxI0UfSaCzMz58vEvZOTUvnR02T?=\n\t=?utf-8?q?Cn8iPRhdUnUH+43j/dFEnOXNCJKZNJi4bhgD+fg/S3NaUEocSq9xTmUH?=\n\t=?utf-8?q?mT36sjN9M+wrqZkv+sdtpO9+nJFLkd289WWZuCGZaHJv84yPl56GPGOj?=\n\t=?utf-8?q?gjEhaNYVI/eqbMJM5SmKWxXGLbmfYeVjuO+Mlk4PY4lUfuEb7SkWndIa?=\n\t=?utf-8?q?eM0IY5qYHv9lbs+DDmaVzA2xPkdiguyaU8+pAws/0VKgHvuR4CrZNDWJ?=\n\t=?utf-8?q?F+7LsFBB9zsGxLGqlLUeV3Rd1QBgA/5lVGtm4wcdClvQHvMsD7nD/cBN?=\n\t=?utf-8?q?ODvRIP/dUKYBv3W5pRlYCwWg3WOZCcjBZr+Ft7J/Ebjoumucjp8B6nTU?=\n\t=?utf-8?q?AJYzz6x
aIe/tEbmL8Qk0gsXL1fNTM7JRqYxifHY5pfj0lv+DYfSS+hUv?=\n\t=?utf-8?q?BRnZXNykqY0SX2EBy56Hj7V2pWTtsBqjEuFtxZD+VD7IP9UM5toHeabh?=\n\t=?utf-8?q?VBWxIgGUB9/hojERFOgEkGUw1mECON0VWU3L0qOeRMtUMVzBF8xWX8c/?=\n\t=?utf-8?q?fRHaGwJmwIDV9dzUoF1MwwMW2A4sXm0bkhw0V/l6PqvRctcfN6WFK1yT?=\n\t=?utf-8?q?MjFNNM8zBKDmsr56sW3UINjN2pHAsddBtGhQo4L8tpUaA7Vd6D75qYCc?=\n\t=?utf-8?q?70qrh2h0UgVFf0ihfQFyxNG9CAFZuF2DLIocCrDZveqL6gkjffqvEaGy?=\n\t=?utf-8?q?GQtzsyaxebotUSHrVDjvW1RiKvs9Aie9jRMnNC+bg57szMluc+webKrb?=\n\t=?utf-8?q?xyky8H6SQpXjJzrO+ak/8Cv4zGP7TbDhR3R70gY7EYKz8d9QizHvUGU5?=\n\t=?utf-8?q?nJrmRJtUlOIK/dWDPe7TbX78rvM/rKYCwXX4Hq0a9eYmPLnufjTa/wAK?=\n\t=?utf-8?q?G6HbZb5NDzmJjtH1Mc0HcyBOWfJWiOYHfCYnY5TcQxlCJVF92+V+bd3a?=\n\t=?utf-8?q?2uX6zX1THh1lMfOrLpCdgbpj9T4oMRHuH9lpQtboTFMyyzc9O9Z+TcAo?=\n\t=?utf-8?q?Z4KRmXYSKTJii3s=3D?=","X-Forefront-Antispam-Report":"CIP:255.255.255.255; CTRY:; LANG:en; SCL:1; SRV:;\n\tIPV:NLI; SFV:NSPM; H:AS8PR04MB8897.eurprd04.prod.outlook.com; PTR:;\n\tCAT:NONE; \n\tSFS:(13230040)(376014)(1800799024)(366016)(7053199007)(13003099007);\n\tDIR:OUT; SFP:1101; 
","X-MS-Exchange-AntiSpam-MessageData-ChunkCount":"1","X-MS-Exchange-AntiSpam-MessageData-0":"=?utf-8?q?/vMPfXXtkT+kMk6GIjFF01e2D?=\n\t=?utf-8?q?nu7ftlHOLU0slIbPIhg/6JBpAytpr7urMX+XX+aIr5H9OYBbQvNKPIFm?=\n\t=?utf-8?q?yteaDPPS2ZeIiBINkevWcj3JQ6IsI6Vm3nPa5S8SJozGJ5QqN/aD8rJU?=\n\t=?utf-8?q?W63iVIIkpuEw8nfAWKfQ5Lwrf92lszfbaO2LPWktiyQaMwRWjCklESK3?=\n\t=?utf-8?q?H83mMID9O/dbH9J2RRtFiW7o2ft3bbpMOpabhBw1ZzMUA3NfSxPVrHbr?=\n\t=?utf-8?q?t8N8ndXdef2DH7oof7If5Bhu2PUjD+koMVE9nSs+fdkLKeZrH3yLNdJk?=\n\t=?utf-8?q?Mduj7jKsZye6UMtoL9/lzrT/yaBreh8lVpPzxsXPydLROXSP9xI6mCBk?=\n\t=?utf-8?q?B8tyYLyTfamphuzgB2pNyj23iK2vWSNBcIUyT5LOThSocNHW0vU3/w/7?=\n\t=?utf-8?q?bQ209A0Be+inmycrw12Ly7/FpO8DZQx8RSqQVldD4k5lAZzs5YintDGe?=\n\t=?utf-8?q?lwXt+tlcGjHJRkxmolZPzDwQ29ZVvzOxH8kBthN3MDfNPyEQhxpCS3lF?=\n\t=?utf-8?q?PaXOmavkC6u0DzEy1XKm8ai5An5yzqpN1rDnGabNFsviZB8WPpgx0V3L?=\n\t=?utf-8?q?LFG2kpHL7+INwrmkGJGkaZg8bytaYzuC0f2K5CGRbEozotwuQ6lqLBxy?=\n\t=?utf-8?q?KkvdLKZLfuGx/6OL2A6wQMqduBZsqOxkgLK2LMavkGkUvlVfM5eoS+CB?=\n\t=?utf-8?q?QgBibjhSHgJLZzr79Cx6dyLcWfESqRo2UfkN0dpr6RWkGJ11SSTrYDzw?=\n\t=?utf-8?q?Tjg6fl2tfreW+MCTehyxg5fFxPRLIsJ6tcBJssuHl5BqRyaF2A6e13Yi?=\n\t=?utf-8?q?+SBoS3OKhTgwFKH9RvXb8L2KGJ6TGtT4zVAFlPd5Dmf70Bl1m/y/3p+Z?=\n\t=?utf-8?q?9f4zlMoxNxGyEDympAb3IovzI859HejMMqZAivNcy1OomLjkKSlzm7Lv?=\n\t=?utf-8?q?t3NVwXPtlex2InpGtc9UjqGShpl1+P6bX4lI7Z8MzwPaoa1rrDxexCtT?=\n\t=?utf-8?q?q9AUYrtpbNANLAbN7WxTFFaq0winkI6EVXdSe/+5pf2XbDruFvAlfjPd?=\n\t=?utf-8?q?+xlXyfgLBWosjpNxKEseY6clRB82BcT9R6PC3e1+r+ngazTKlLDXnSVn?=\n\t=?utf-8?q?YmZkhshA0QQKUJqq4l9zvw4tbUld+fX3vmo7x0ZzJCCnPL3sO6ESWwTn?=\n\t=?utf-8?q?ua0Q2x3X+amWe3FZV4iBwJdZhwl+Z+8wPRHZFO1p5bDt442lBxlmYyfD?=\n\t=?utf-8?q?r1vheZKiyE7wX0PiRN4rQPxnxao3lUonQjrfFOwRXk2a1GTDk62WGOeE?=\n\t=?utf-8?q?TaLmlfkGWHgOXkICTsI6bByg5ZPI8V8aJ9C07nQUI52I6W2KaX4s+Ini?=\n\t=?utf-8?q?2qXjGVzLkzOOnpoQS530iDX6S/Qmd8MtDqhWNz5ASXLAa4Eym01MDkbT?=\n\t=?utf-8?q?2AnamAMpcFqW1xmdvyjgDxd7nwfBdPKENQTyOP4lzNnB7FReovodqKCH?=\n\t=?utf-8?q?+iHbgGbNScbfUh5swE4Tss2bhBaBhGp7fsENzP+xPR3/gNbCzlfxP39
J?=\n\t=?utf-8?q?WbShZY7zkSRN8PGEsoQp0s50Nvb2RHaBh2F7Sq4Ml4ITLQ48IQRuMv1R?=\n\t=?utf-8?q?+k5iDMQNbuojc8vHQNKvntGBHCAopF0iq2t6Bo4IcFqa3xahe+5gi03c?=\n\t=?utf-8?q?P/lUqChN0j9H+T06mK9W5K6SDzRlw=3D=3D?=","X-OriginatorOrg":"cherry.de","X-MS-Exchange-CrossTenant-Network-Message-Id":"a895f746-b7ff-4ab2-cfe6-08dd96e2c3b0","X-MS-Exchange-CrossTenant-AuthSource":"AS8PR04MB8897.eurprd04.prod.outlook.com","X-MS-Exchange-CrossTenant-AuthAs":"Internal","X-MS-Exchange-CrossTenant-OriginalArrivalTime":"19 May 2025 14:38:04.8025\n\t(UTC)","X-MS-Exchange-CrossTenant-FromEntityHeader":"Hosted","X-MS-Exchange-CrossTenant-Id":"5e0e1b52-21b5-4e7b-83bb-514ec460677e","X-MS-Exchange-CrossTenant-MailboxType":"HOSTED","X-MS-Exchange-CrossTenant-UserPrincipalName":"yLyNOEJtbFNr3YXR11a0u7eqaU7Ai4PI+9rzjMXOdiBs2qrkGlcWmgBxjVKyghUyxxnVWhbF7t3smYiMotYqXPkC7cG9CvadUxz/zuweKNQ=","X-MS-Exchange-Transport-CrossTenantHeadersStamped":"PA2PR04MB10311","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}}]