[{"id":27828,"web_url":"https://patchwork.libcamera.org/comment/27828/","msgid":"<20230921120440.GD27722@pendragon.ideasonboard.com>","date":"2023-09-21T12:04:40","subject":"Re: [libcamera-devel] [PATCH v4 01/12] documentation: Introduce\n\tCamera Sensor Model","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Jacopo,\n\nThank you for the patch.\n\nOn Sat, Sep 16, 2023 at 02:19:19PM +0200, Jacopo Mondi via libcamera-devel wrote:\n> Introduce a documentation page about the 'camera sensor model'\n> implemented by libcamera.\n> \n> The camera sensor model serves to provide to applications a reference\n> description of the processing steps that take place in a camera sensor\n> in order to precisely control the sensor configuration through the\n> forthcoming SensorConfiguration class.\n> \n> Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>\n> Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n> ---\n>  Documentation/binning.svg             | 5053 +++++++++++++++++++++++++\n>  Documentation/camera-sensor-model.rst |  170 +\n>  Documentation/index.rst               |    1 +\n>  Documentation/meson.build             |    1 +\n>  Documentation/sensor_model.svg        | 4866 ++++++++++++++++++++++++\n>  Documentation/skipping.svg            | 1720 +++++++++\n>  6 files changed, 11811 insertions(+)\n>  create mode 100644 Documentation/binning.svg\n>  create mode 100644 Documentation/camera-sensor-model.rst\n>  create mode 100644 Documentation/sensor_model.svg\n>  create mode 100644 Documentation/skipping.svg\n> \n> diff --git a/Documentation/binning.svg b/Documentation/binning.svg\n> new file mode 100644\n> index 000000000000..c6a3b6394acd\n> --- /dev/null\n> +++ b/Documentation/binning.svg\n> @@ -0,0 +1,5053 @@\n> +<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>\n\n<!-- 
SPDX-License-Identifier: CC-BY-SA-4.0 -->\n\nSame for the other SVG files below.\n\nThis will likely need manual merge when updating the diagrams :-S\nAnother option is to specify the license in .reuse/dep5.\n\n> +<!-- Created with Inkscape (http://www.inkscape.org/) -->\n\nI won't review the rest of the SVG here :-) However, neither Firefox nor\nChrome render the arrow heads :-S I wonder if inkscape does something\nwrong. This isn't a blocker, we can fix it on top.\n\n[snip]\n\n> diff --git a/Documentation/camera-sensor-model.rst b/Documentation/camera-sensor-model.rst\n> new file mode 100644\n> index 000000000000..7ff0bdc50564\n> --- /dev/null\n> +++ b/Documentation/camera-sensor-model.rst\n> @@ -0,0 +1,170 @@\n> +.. SPDX-License-Identifier: CC-BY-SA-4.0\n> +\n> +.. _camera-sensor-model:\n\nCould you add a comment here to indicate this should be moved to the\ndoxygen-generated documentation ?\n\n.. todo: Move to Doxygen-generated documentation\n\n> +\n> +The libcamera camera sensor model\n> +=================================\n> +\n> +libcamera defines an abstract camera sensor model in order to provide\n> +a description of each of the processing steps that result in image data being\n> +sent on the media bus and that form the image stream delivered to applications.\n> +\n> +Applications should use the abstracted camera sensor model defined here to\n\ns/abstracted/abstract/\n\n> +precisely control the operations of the camera sensor.\n> +\n> +The libcamera camera sensor model targets image sensors producing frames in\n> +RAW format, delivered through a MIPI CSI-2 compliant bus implementation.\n> +\n> +The abstract sensor model maps libcamera components to the characteristics and\n> +operations of an image sensor, and serves as a reference to model the libcamera\n> +CameraSensor class and SensorConfiguration classes and operations.\n> +\n> +In order to control the configuration of the camera sensor through the\n> +SensorConfiguration class, applications should 
understand this model and map it\n> +to the combination of image sensor and kernel driver in use.\n> +\n> +The camera sensor model defined here is based on the *MIPI CCS specification*,\n> +particularly on *Section 8.2 - Image readout* of *Chapter 8 - Video Timings*.\n> +\n> +Glossary\n> +---------\n> +\n> +- *Pixel array*: The full grid of pixels, active and inactive ones\n> +\n> +- *Pixel array active area*: The portion(s) of the pixel array that\n> +  contains valid and readable pixels; corresponds to the libcamera\n> +  properties::PixelArrayActiveAreas\n> +\n> +- *Analog crop rectangle*: The portion of the *pixel array active area* which\n> +  is read out and passed to further processing stages\n> +\n> +- *Subsampling*: Pixel processing techniques that reduce the image size by\n> +  binning or by skipping adjacent pixels\n> +\n> +- *Digital crop*: Crop of the sub-sampled image data before scaling\n> +\n> +- *Frame output*: The frame (image) as output on the media bus by the\n> +  camera sensor\n\nSphinx has a glossary directive:\n\n.. glossary::\n\n   Pixel array\n      The full grid of pixels, active and inactive ones\n\n   Pixel array active area\n      The portion(s) of the pixel array that contains valid and readable\n      pixels; corresponds to the libcamera properties::PixelArrayActiveAreas\n\n   Analog crop rectangle\n      The portion of the *pixel array active area* which is read out and passed\n      to further processing stages\n\n   Subsampling\n      Pixel processing techniques that reduce the image size by binning or by\n      skipping adjacent pixels\n\n   Digital crop\n      Crop of the sub-sampled image data before scaling\n\n   Frame output\n      The frame (image) as output on the media bus by the camera sensor\n\n> +\n> +Camera sensor model\n> +-------------------\n> +\n> +The abstract sensor model is described in the following diagram\n\ns/$/./\n\n> +\n> +.. figure:: sensor_model.svg\n> +\n> +\n> +1. 
The sensor reads pixels from the *pixel array*. The pixels being read out are\n> +   selected by the *analog crop rectangle*.\n> +\n> +2. The pixels can be subsampled to reduce the image size without affecting the\n> +   field of view. Two subsampling techniques can be used:\n> +\n> +   - Binning: combines adjacent pixels of the same colour by averaging or\n> +     summing their values, in the analog domain and/or the digital domain.\n> +\n> +      .. figure:: binning.svg\n> +\n> +\n> +   - Skipping: skips the read out of a number of adjacent pixels.\n> +\n> +      .. figure:: skipping.svg\n> +\n> +\n> +3. The output of the optional sub-sampling stage is then cropped after the\n> +conversion of the analogue pixel values in the digital domain.\n\ns/^/   /\n\n> +\n> +4. The resulting output frame is sent on the media bus by the sensor.\n> +\n> +Camera Sensor configuration parameters\n> +--------------------------------------\n> +\n> +The libcamera camera sensor model defines parameters that allow users to\n> +control:\n> +\n> +1. The image format bit depth\n> +\n> +2. The size and position of the  *Analog crop rectangle*\n> +\n> +3. The subsampling factors used to downscale the pixel array readout data to a\n> +   smaller frame size without reducing the image *field of view*. Two\n> +   configuration parameters are made available to control the downscaling factor\n\ns/^/:/\n\n> +\n> +   - binning\n> +      - a vertical and horizontal binning factor can be specified, the image\n\ns/- a/A/\n\n> +        will be downscaled in its vertical and horizontal sizes by the specified\n> +        factor\n\ns/$/./\n\n> +\n> +      .. 
code-block::\n> +         :caption: Definition: The horizontal and vertical binning factors\n> +\n> +         horizontal_binning = xBin;\n> +         vertical_binning = yBin;\n> +\n> +\n> +   - skipping\n> +      - reduces the image resolution by skipping the read-out of a\n> +        number of adjacent pixels\n> +      - the skipping factor is specified by the 'increment' number (number of\n> +        pixels to 'skip') in the vertical and horizontal directions and for\n> +        even and odd rows and columns\n\n      Skipping reduces the image resolution by skipping the read-out of a number\n      of adjacent pixels. The skipping factor is specified by the 'increment'\n      number (number of pixels to 'skip') in the vertical and horizontal\n      directions and for even and odd rows and columns.\n\n> +\n> +      .. code-block::\n> +         :caption: Definition: The horizontal and vertical skipping factors\n> +\n> +         horizontal_skipping = (xOddInc + xEvenInc) / 2\n> +         vertical_skipping = (yOddInc + yEvenInc) / 2\n> +\n> +\n> +   - different sensors perform the binning and skipping stages in different\n> +     orders\n> +   - for the sake of computing the final output image size the order of\n> +     execution is not relevant.\n> +\n> +   - the overall down-scaling factor is obtained by combining the binning and\n> +     skipping factors\n\nAs the text above introduces a list of two scaling factors, let's turn\nthe rest into text, not bullet points:\n\n   Different sensors perform the binning and skipping stages in different\n   orders. For the sake of computing the final output image size the order of\n   execution is not relevant. The overall down-scaling factor is obtained by\n   combining the binning and skipping factors.\n\n> +\n> +   .. 
code-block::\n> +      :caption: Definition: The total scaling factor (binning + sub-sampling)\n> +\n> +      total_horizontal_downscale = horizontal_binning + horizontal_skipping\n> +      total_vertical_downscale = vertical_binning + vertical_skipping\n> +\n> +\n> +4. The output data frame size\n> +    - the output size is used to specify any additional cropping on the\n> +      sub-sampled frame\n\nDitto, drop the bullet, start with a capital letter, end with a period.\nSame below.\n\n> +\n> +5. The total line length and frame height (*visibile* pixels + *blankings*) as\n> +   sent on the MIPI CSI-2 bus\n> +\n> +6. The pixel transmission rate on the MIPI CSI-2 bus\n> +\n> +The above parameters are combined to obtain the following high-level\n> +configurations:\n> +\n> +- **frame output size**\n> +\n> +   Obtained by applying a crop to the physical pixel array size in the analog\n> +   domain, followed by optional binning and sub-sampling (in any order),\n> +   followed by an optional crop step in the output digital domain.\n> +\n> +- **frame rate**\n> +\n> +   The combination of the *total frame size*, the image format *bit depth* and\n> +   the *pixel rate* of the data sent on the MIPI CSI-2 bus allows to compute the\n> +   image stream frame rate. The equation is the well known:\n> +\n> +   .. code-block::\n> +\n> +      frame_duration = total_frame_size / pixel_rate\n> +      frame_rate = 1 / frame_duration\n> +\n> +\n> +   where the *pixel_rate* parameter is the result of the sensor's configuration\n> +   of the MIPI CSI-2 bus *(the following formula applies to MIPI CSI-2 when\n> +   used on MIPI D-PHY physical protocol layer only)*\n> +\n> +   .. 
code-block::\n> +\n> +      pixel_rate = CSI-2_link_freq * 2 * nr_of_lanes / bits_per_sample\n> diff --git a/Documentation/index.rst b/Documentation/index.rst\n> index 43d8b017d3b4..63fac72d11ed 100644\n> --- a/Documentation/index.rst\n> +++ b/Documentation/index.rst\n> @@ -23,3 +23,4 @@\n>     Sensor driver requirements <sensor_driver_requirements>\n>     Lens driver requirements <lens_driver_requirements>\n>     Python Bindings <python-bindings>\n> +   Camera Sensor Model <camera-sensor-model>\n> diff --git a/Documentation/meson.build b/Documentation/meson.build\n> index b2a5bf15e6ea..7c1502592baa 100644\n> --- a/Documentation/meson.build\n> +++ b/Documentation/meson.build\n> @@ -63,6 +63,7 @@ endif\n>  \n>  if sphinx.found()\n>      docs_sources = [\n> +        'camera-sensor-model.rst',\n>          'coding-style.rst',\n>          'conf.py',\n>          'contributing.rst',\n> diff --git a/Documentation/sensor_model.svg b/Documentation/sensor_model.svg\n> new file mode 100644\n> index 000000000000..1f76d41cf6a5\n> --- /dev/null\n> +++ b/Documentation/sensor_model.svg\n> @@ -0,0 +1,4866 @@\n\nIt looks quite neat. The only improvement I can see could be nice is to\nscale the image in 4 to have the same pixel size as in 3, to avoid\nimplying there's any scaling. 
It can be done on top, or you can make the\nchange before pushing.\n\n[snip]\n\n> diff --git a/Documentation/skipping.svg b/Documentation/skipping.svg\n> new file mode 100644\n> index 000000000000..7bef37cfcc64\n> --- /dev/null\n> +++ b/Documentation/skipping.svg\n> @@ -0,0 +1,1720 @@\n\n[snip]\n\nReviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id AC011C326B\n\tfor <parsemail@patchwork.libcamera.org>;\n\tThu, 21 Sep 2023 12:04:31 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id E06DE62944;\n\tThu, 21 Sep 2023 14:04:30 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id CBA9F628D8\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu, 21 Sep 2023 14:04:28 +0200 (CEST)","from pendragon.ideasonboard.com (213-243-189-158.bb.dnainternet.fi\n\t[213.243.189.158])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 2A6AA10FE;\n\tThu, 21 Sep 2023 14:02:51 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; 
t=1695297870;\n\tbh=yulaXZZpY2vemj8EV+kaJpmBZ0/4r7y63yp8Gs8JR9k=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=4Se993ga8APbrj247d+HF8DuPLedQt0Zch/6DrfZHVq+km9b6n/BufcU7lt4gJSMF\n\t3mLgv0pPzEPctTRQUZFDuzESICmwjaDkYWNPGriJJZYJMW8nr37crC1TpooRDf3hFF\n\tiyvj/QIEEt/q85bid4wfIvNKS+I8kY6nyJ5T+7KaG2DS66+Ze5ikxo/PW56GAq7XB2\n\tHJizD7QJZFGEhh7dJCM98e0x54uVvzMe9RZfejY+7HkfzTfWja/zZ+YDpxQ+EKAaK7\n\t2j6uOOM3z8uxrLiTJZJCWz7JcwCFYhecaWmm5roKXbDVGh/Xr4F3XIl922caUaIh0G\n\tzrESnRedbSTtQ==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1695297771;\n\tbh=yulaXZZpY2vemj8EV+kaJpmBZ0/4r7y63yp8Gs8JR9k=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=UNTBl/rAGNO0Q30IT4Z8C5aButR34LBRkCkY34aNdVz6UU39a0AgC8oL+EJ4vjrjv\n\txVC0N1+P9aEM84SLklzHzF85FKWqtg74Ywc/R8zzVuTYDqKGMOsFrcHN/DZ/xxz1wX\n\tnYow0unL18vvlATfLLTS/wYCX1LH6I3ALJrmDqOg="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"UNTBl/rA\"; dkim-atps=neutral","Date":"Thu, 21 Sep 2023 15:04:40 +0300","To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Message-ID":"<20230921120440.GD27722@pendragon.ideasonboard.com>","References":"<20230916121930.29490-1-jacopo.mondi@ideasonboard.com>\n\t<20230916121930.29490-2-jacopo.mondi@ideasonboard.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<20230916121930.29490-2-jacopo.mondi@ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v4 01/12] documentation: Introduce\n\tCamera Sensor 
Model","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Laurent Pinchart via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":27833,"web_url":"https://patchwork.libcamera.org/comment/27833/","msgid":"<fbf7ysdf3bdmexb4eskxjhulowyhfxac42h62rmduti7ijtq2g@uob7wq2j5h72>","date":"2023-09-21T13:48:22","subject":"Re: [libcamera-devel] [PATCH v4 01/12] documentation: Introduce\n\tCamera Sensor Model","submitter":{"id":143,"url":"https://patchwork.libcamera.org/api/people/143/","name":"Jacopo Mondi","email":"jacopo.mondi@ideasonboard.com"},"content":"Hi Laurent\n\nOn Thu, Sep 21, 2023 at 03:04:40PM +0300, Laurent Pinchart wrote:\n> Hi Jacopo,\n>\n> Thank you for the patch.\n>\n> On Sat, Sep 16, 2023 at 02:19:19PM +0200, Jacopo Mondi via libcamera-devel wrote:\n> > Introduce a documentation page about the 'camera sensor model'\n> > implemented by libcamera.\n> >\n> > The camera sensor model serves to provide to applications a reference\n> > description of the processing steps that take place in a camera sensor\n> > in order to precisely control the sensor configuration through the\n> > forthcoming 
SensorConfiguration class.\n> >\n> > Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>\n> > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n> > ---\n> >  Documentation/binning.svg             | 5053 +++++++++++++++++++++++++\n> >  Documentation/camera-sensor-model.rst |  170 +\n> >  Documentation/index.rst               |    1 +\n> >  Documentation/meson.build             |    1 +\n> >  Documentation/sensor_model.svg        | 4866 ++++++++++++++++++++++++\n> >  Documentation/skipping.svg            | 1720 +++++++++\n> >  6 files changed, 11811 insertions(+)\n> >  create mode 100644 Documentation/binning.svg\n> >  create mode 100644 Documentation/camera-sensor-model.rst\n> >  create mode 100644 Documentation/sensor_model.svg\n> >  create mode 100644 Documentation/skipping.svg\n> >\n> > diff --git a/Documentation/binning.svg b/Documentation/binning.svg\n> > new file mode 100644\n> > index 000000000000..c6a3b6394acd\n> > --- /dev/null\n> > +++ b/Documentation/binning.svg\n> > @@ -0,0 +1,5053 @@\n> > +<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>\n>\n> <!-- SPDX-License-Identifier: CC-BY-SA-4.0 -->\n>\n> Same for the other SVG files below.\n>\n> This will likely need manual merge when updating the diagrams :-S\n> Another option is to specify the license in .reuse/dep5.\n\nGood point, I'll add to .reuse/dep5\n\nFiles: Documentation/binning.svg\n       Documentation/camera-sensor-model.rst\n       Documentation/sensor_model.svg\nCopyright: Copyright 2023 Ideas On Board Oy\nLicense: CC-BY-SA-4.0\n\n>\n> > +<!-- Created with Inkscape (http://www.inkscape.org/) -->\n>\n> I won't review the rest of the SVG here :-) However, neither Firefox nor\n> Chrome render the arrow heads :-S I wonder if inkscape does something\n> wrong. This isn't a blocker, we can fix it on top.\n>\n\nYes, I noticed it with Chromium and did blame the browser, as the arrows\nrender correctly in Inkscape... 
But if Firefox fails too, I wonder if\nsomething is wrong in Inkscape\n\n> [snip]\n>\n> > diff --git a/Documentation/camera-sensor-model.rst b/Documentation/camera-sensor-model.rst\n> > new file mode 100644\n> > index 000000000000..7ff0bdc50564\n> > --- /dev/null\n> > +++ b/Documentation/camera-sensor-model.rst\n> > @@ -0,0 +1,170 @@\n> > +.. SPDX-License-Identifier: CC-BY-SA-4.0\n> > +\n> > +.. _camera-sensor-model:\n>\n> Could you add a comment here to indicate this should be moved to the\n> doxygen-generated documentation ?\n>\n> .. todo: Move to Doxygen-generated documentation\n>\n\nAck\n\n> > +\n> > +The libcamera camera sensor model\n> > +=================================\n> > +\n> > +libcamera defines an abstract camera sensor model in order to provide\n> > +a description of each of the processing steps that result in image data being\n> > +sent on the media bus and that form the image stream delivered to applications.\n> > +\n> > +Applications should use the abstracted camera sensor model defined here to\n>\n> s/abstracted/abstract/\n>\n> > +precisely control the operations of the camera sensor.\n> > +\n> > +The libcamera camera sensor model targets image sensors producing frames in\n> > +RAW format, delivered through a MIPI CSI-2 compliant bus implementation.\n> > +\n> > +The abstract sensor model maps libcamera components to the characteristics and\n> > +operations of an image sensor, and serves as a reference to model the libcamera\n> > +CameraSensor class and SensorConfiguration classes and operations.\n> > +\n> > +In order to control the configuration of the camera sensor through the\n> > +SensorConfiguration class, applications should understand this model and map it\n> > +to the combination of image sensor and kernel driver in use.\n> > +\n> > +The camera sensor model defined here is based on the *MIPI CCS specification*,\n> > +particularly on *Section 8.2 - Image readout* of *Chapter 8 - Video Timings*.\n> > +\n> > +Glossary\n> > +---------\n> > 
+\n> > +- *Pixel array*: The full grid of pixels, active and inactive ones\n> > +\n> > +- *Pixel array active area*: The portion(s) of the pixel array that\n> > +  contains valid and readable pixels; corresponds to the libcamera\n> > +  properties::PixelArrayActiveAreas\n> > +\n> > +- *Analog crop rectangle*: The portion of the *pixel array active area* which\n> > +  is read out and passed to further processing stages\n> > +\n> > +- *Subsampling*: Pixel processing techniques that reduce the image size by\n> > +  binning or by skipping adjacent pixels\n> > +\n> > +- *Digital crop*: Crop of the sub-sampled image data before scaling\n> > +\n> > +- *Frame output*: The frame (image) as output on the media bus by the\n> > +  camera sensor\n>\n> Sphinx has a glossary directive:\n>\n> .. glossary::\n>\n>    Pixel array\n>       The full grid of pixels, active and inactive ones\n>\n>    Pixel array active area\n>       The portion(s) of the pixel array that contains valid and readable\n>       pixels; corresponds to the libcamera properties::PixelArrayActiveAreas\n>\n>    Analog crop rectangle\n>       The portion of the *pixel array active area* which is read out and passed\n>       to further processing stages\n>\n>    Subsampling\n>       Pixel processing techniques that reduce the image size by binning or by\n>       skipping adjacent pixels\n>\n>    Digital crop\n>       Crop of the sub-sampled image data before scaling\n>\n>    Frame output\n>       The frame (image) as output on the media bus by the camera sensor\n>\n> > +\n> > +Camera sensor model\n> > +-------------------\n> > +\n> > +The abstract sensor model is described in the following diagram\n>\n> s/$/./\n>\n> > +\n> > +.. figure:: sensor_model.svg\n> > +\n> > +\n> > +1. The sensor reads pixels from the *pixel array*. The pixels being read out are\n> > +   selected by the *analog crop rectangle*.\n> > +\n> > +2. 
The pixels can be subsampled to reduce the image size without affecting the\n> > +   field of view. Two subsampling techniques can be used:\n> > +\n> > +   - Binning: combines adjacent pixels of the same colour by averaging or\n> > +     summing their values, in the analog domain and/or the digital domain.\n> > +\n> > +      .. figure:: binning.svg\n> > +\n> > +\n> > +   - Skipping: skips the read out of a number of adjacent pixels.\n> > +\n> > +      .. figure:: skipping.svg\n> > +\n> > +\n> > +3. The output of the optional sub-sampling stage is then cropped after the\n> > +conversion of the analogue pixel values in the digital domain.\n>\n> s/^/   /\n>\n> > +\n> > +4. The resulting output frame is sent on the media bus by the sensor.\n> > +\n> > +Camera Sensor configuration parameters\n> > +--------------------------------------\n> > +\n> > +The libcamera camera sensor model defines parameters that allow users to\n> > +control:\n> > +\n> > +1. The image format bit depth\n> > +\n> > +2. The size and position of the  *Analog crop rectangle*\n> > +\n> > +3. The subsampling factors used to downscale the pixel array readout data to a\n> > +   smaller frame size without reducing the image *field of view*. Two\n> > +   configuration parameters are made available to control the downscaling factor\n>\n> s/^/:/\n>\n> > +\n> > +   - binning\n> > +      - a vertical and horizontal binning factor can be specified, the image\n>\n> s/- a/A/\n>\n> > +        will be downscaled in its vertical and horizontal sizes by the specified\n> > +        factor\n>\n> s/$/./\n>\n> > +\n> > +      .. 
code-block::\n> > +         :caption: Definition: The horizontal and vertical binning factors\n> > +\n> > +         horizontal_binning = xBin;\n> > +         vertical_binning = yBin;\n> > +\n> > +\n> > +   - skipping\n> > +      - reduces the image resolution by skipping the read-out of a\n> > +        number of adjacent pixels\n> > +      - the skipping factor is specified by the 'increment' number (number of\n> > +        pixels to 'skip') in the vertical and horizontal directions and for\n> > +        even and odd rows and columns\n>\n>       Skipping reduces the image resolution by skipping the read-out of a number\n>       of adjacent pixels. The skipping factor is specified by the 'increment'\n>       number (number of pixels to 'skip') in the vertical and horizontal\n>       directions and for even and odd rows and columns.\n>\n> > +\n> > +      .. code-block::\n> > +         :caption: Definition: The horizontal and vertical skipping factors\n> > +\n> > +         horizontal_skipping = (xOddInc + xEvenInc) / 2\n> > +         vertical_skipping = (yOddInc + yEvenInc) / 2\n> > +\n> > +\n> > +   - different sensors perform the binning and skipping stages in different\n> > +     orders\n> > +   - for the sake of computing the final output image size the order of\n> > +     execution is not relevant.\n> > +\n> > +   - the overall down-scaling factor is obtained by combining the binning and\n> > +     skipping factors\n>\n> As the text above introduces a list of two scaling factors, let's turn\n> the rest into text, not bullet points:\n>\n>    Different sensors perform the binning and skipping stages in different\n>    orders. For the sake of computing the final output image size the order of\n>    execution is not relevant. The overall down-scaling factor is obtained by\n>    combining the binning and skipping factors.\n>\n\nAck\n\n> > +\n> > +   .. 
code-block::\n> > +      :caption: Definition: The total scaling factor (binning + sub-sampling)\n> > +\n> > +      total_horizontal_downscale = horizontal_binning + horizontal_skipping\n> > +      total_vertical_downscale = vertical_binning + vertical_skipping\n> > +\n> > +\n> > +4. The output data frame size\n> > +    - the output size is used to specify any additional cropping on the\n> > +      sub-sampled frame\n>\n> Ditto, drop the bullet, start with a capital letter, end with a period.\n> Same below.\n>\n> > +\n> > +5. The total line length and frame height (*visibile* pixels + *blankings*) as\n> > +   sent on the MIPI CSI-2 bus\n> > +\n> > +6. The pixel transmission rate on the MIPI CSI-2 bus\n> > +\n> > +The above parameters are combined to obtain the following high-level\n> > +configurations:\n> > +\n> > +- **frame output size**\n> > +\n> > +   Obtained by applying a crop to the physical pixel array size in the analog\n> > +   domain, followed by optional binning and sub-sampling (in any order),\n> > +   followed by an optional crop step in the output digital domain.\n> > +\n> > +- **frame rate**\n> > +\n> > +   The combination of the *total frame size*, the image format *bit depth* and\n> > +   the *pixel rate* of the data sent on the MIPI CSI-2 bus allows to compute the\n> > +   image stream frame rate. The equation is the well known:\n> > +\n> > +   .. code-block::\n> > +\n> > +      frame_duration = total_frame_size / pixel_rate\n> > +      frame_rate = 1 / frame_duration\n> > +\n> > +\n> > +   where the *pixel_rate* parameter is the result of the sensor's configuration\n> > +   of the MIPI CSI-2 bus *(the following formula applies to MIPI CSI-2 when\n> > +   used on MIPI D-PHY physical protocol layer only)*\n> > +\n> > +   .. 
code-block::\n> > +\n> > +      pixel_rate = CSI-2_link_freq * 2 * nr_of_lanes / bits_per_sample\n> > diff --git a/Documentation/index.rst b/Documentation/index.rst\n> > index 43d8b017d3b4..63fac72d11ed 100644\n> > --- a/Documentation/index.rst\n> > +++ b/Documentation/index.rst\n> > @@ -23,3 +23,4 @@\n> >     Sensor driver requirements <sensor_driver_requirements>\n> >     Lens driver requirements <lens_driver_requirements>\n> >     Python Bindings <python-bindings>\n> > +   Camera Sensor Model <camera-sensor-model>\n> > diff --git a/Documentation/meson.build b/Documentation/meson.build\n> > index b2a5bf15e6ea..7c1502592baa 100644\n> > --- a/Documentation/meson.build\n> > +++ b/Documentation/meson.build\n> > @@ -63,6 +63,7 @@ endif\n> >\n> >  if sphinx.found()\n> >      docs_sources = [\n> > +        'camera-sensor-model.rst',\n> >          'coding-style.rst',\n> >          'conf.py',\n> >          'contributing.rst',\n> > diff --git a/Documentation/sensor_model.svg b/Documentation/sensor_model.svg\n> > new file mode 100644\n> > index 000000000000..1f76d41cf6a5\n> > --- /dev/null\n> > +++ b/Documentation/sensor_model.svg\n> > @@ -0,0 +1,4866 @@\n>\n> It looks quite neat. The only improvement I can see could be nice is to\n> scale the image in 4 to have the same pixel size as in 3, to avoid\n> implying there's any scaling. 
It can be done on top, or you can make the\nchange before pushing.\n>\n\nAh, good point. I think it's a leftover from when I had the digital\nscaling block here.\n\n\n> [snip]\n>\n> > diff --git a/Documentation/skipping.svg b/Documentation/skipping.svg\n> > new file mode 100644\n> > index 000000000000..7bef37cfcc64\n> > --- /dev/null\n> > +++ b/Documentation/skipping.svg\n> > @@ -0,0 +1,1720 @@\n>\n> [snip]\n>\n> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n\nThanks\n\n>\n> --\n> Regards,\n>\n> Laurent Pinchart","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 5C87CC326B\n\tfor <parsemail@patchwork.libcamera.org>;\n\tThu, 21 Sep 2023 13:48:29 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id AF30362931;\n\tThu, 21 Sep 2023 15:48:28 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 71892628D8\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu, 21 Sep 2023 15:48:26 +0200 (CEST)","from ideasonboard.com (93-46-82-201.ip106.fastwebnet.it\n\t[93.46.82.201])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id CBC571102;\n\tThu, 21 Sep 2023 15:46:48 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; 
t=1695304108;\n\tbh=ECFbUyOVo7c2kT+KSSClH4MZG1jSVVxl3lI0BGhHrjo=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=nbhCA/bnR01RoWuqqkuykv9/XGLgo1gHcJDuPzvWi7CmjmUTps6M4QSud8MtuHm1d\n\tfpEcfJ5QErecwgg5mriGaa2Dab7Y9Myr1NYsDL/LjAsI4GOYW3G+oNQQo7b6pBmzHF\n\tbxLDYzGmjx0o5TTt7sGjY8yJljObVM8Cjr0h+/f9mg0IsrbXNchrXPTPN4jA3D5BoE\n\tXrweAGU+sL3UH6mOMbgTzIyahPVBDUheHFMdufiA3lkt0kZsd4dE8KrXbat25JGW9p\n\toNb8/S48oSkgBBQrUKeOIPJZZ1EnGCyMh0orctJLDYOGfX2Mrh9LqVSuoht8ioU5iC\n\tvaWMBZw9HMAcg==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1695304008;\n\tbh=ECFbUyOVo7c2kT+KSSClH4MZG1jSVVxl3lI0BGhHrjo=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=UnyRMSTxYlXU+YOVYLPmV/ELhgvo3bn0hXmb5VFjedMMktfscCBnsHGh29Yrr+qL6\n\t68Tea6HLz3TG2zof1/KaGo/am8qQY1ebAkK3gFw++cwiyuavfIqL3fH1J0oaju7eb1\n\tQ+rM5nqIZxiOI9xU4o5aeKpf37KG73kqst/UU5zw="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"UnyRMSTx\"; dkim-atps=neutral","Date":"Thu, 21 Sep 2023 15:48:22 +0200","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Message-ID":"<fbf7ysdf3bdmexb4eskxjhulowyhfxac42h62rmduti7ijtq2g@uob7wq2j5h72>","References":"<20230916121930.29490-1-jacopo.mondi@ideasonboard.com>\n\t<20230916121930.29490-2-jacopo.mondi@ideasonboard.com>\n\t<20230921120440.GD27722@pendragon.ideasonboard.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<20230921120440.GD27722@pendragon.ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v4 01/12] documentation: Introduce\n\tCamera Sensor 
Model","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Jacopo Mondi via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Cc":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>,\n\tlibcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}}]