[{"id":19642,"web_url":"https://patchwork.libcamera.org/comment/19642/","msgid":"<d985d46a-b9f4-01de-b0f9-9dc4fd1c15c9@ideasonboard.com>","date":"2021-09-13T14:29:15","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","submitter":{"id":86,"url":"https://patchwork.libcamera.org/api/people/86/","name":"Umang Jain","email":"umang.jain@ideasonboard.com"},"content":"Hi Kieran,\n\nOn 9/13/21 3:00 AM, Kieran Bingham wrote:\n> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n3a or 3A?\n>\n> Provide an overview document to describe it's operations, and provide a\n> block diagram to help visualise how the components are put together to\n> assist any new developers exploring the code.\n>\n> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n>\n> ---\n> This is really just an RFC: Particularly:\n>\n>   - Should this content actually live in Documentation?\n>     or is it more suited to a page under src/ipa/ipu3/...\n\n\nI would probably keep it under Documentation/ but there is already a IPA \nwriting guide. 
So maybe we need a new dir, to document overviews of \nexisting IPAs?\n\n>\n>   - This is not complete to document all functionality\n>     but gives an overview of the current implementation\n>\n>   - Is there anything anyone would find helpful for me to\n>     extend/write about on this?\n>\n>   - It could likely get merged in with some of the work\n>     that Jean-Michel is doing, and might duplicate some of\n>     the parts he is documenting, but this was an aim to\n>     write a single page overview as a getting started with\n>     the IPU3 IPA ....\n>\n>\n>   .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n>   1 file changed, 144 insertions(+)\n>   create mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n>\n> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst b/Documentation/guides/ipu3-ipa-design-guide.rst\n> new file mode 100644\n> index 000000000000..a1d0f13fbb00\n> --- /dev/null\n> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n> @@ -0,0 +1,144 @@\n> +IPU3 IPA Architecture Design and Overview\n> +=========================================\n> +\n> +The IPU3 IPA is built as a modular and extensible framework with an\n> +upper layer to manage the interactions with the pipeline handler, and\n> +the image processing algorithms split to compartmentalise the processing\n> +required for each accellerator cluster provided by the ImgU ISP.\n> +\n> +The core IPU3 class is responsible for initialisation and construction\ns/IPU3/IPAIPU3 maybe?\n> +of the algorithm components, processing controls set by the requests\n> +from applications, and managing events from the Pipeline handler.\n> +\n> +::\n> +\n> +        ┌───────────────────────────────────────────┐\n> +        │      IPU3 Pipeline Handler                │\n> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n> +        │   │        │    │        │    │        │  │\n> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n> +        │   │        │    │     
   │    │        │  │\n> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter Buffer\n> +        │                                 │P   │    │    S: Statistics Buffer\n> +        │                                 │    │S   │\n> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3: mapBuffers(), start()\n> +          ▼   ▼   ▼      ▼ │  ▼ │  ▼ │  ▼ │      ▼       4: processEvent()\n> +        ┌──────────────────┴────┴────┴────┴─────────┐    5: stop(), unmapBuffers()\n> +        │ IPU3 IPA                                  │\n> +        │                 ┌───────────────────────┐ │\n> +        │ ┌───────────┐   │ Algorithms            │ │\n> +        │ │IPAContext │   │          ┌─────────┐  │ │\n> +        │ │ ┌───────┐ │   │          │ ...     │  │ │\n> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n> +        │ │ │       │ │   │    │         │        │ │\n> +        │ │ └───────┘ │   │    └─────────┘        │ │\n> +        │ └───────────┘   └───────────────────────┘ │\n> +        └───────────────────────────────────────────┘\n> +         SC: IPASessionConfiguration\n> +         FC: IPAFrameContext(s)\n> +\n> +The IPA instance is constructed and initialised at the point a Camera is\n> +created by the IPU3 pipeline handler. The initialisation call provides\n> +details about which Camera Sensor is being used, and the controls that\n> +it has available, along with their defaults and ranges.\n\n\nBrief line about how sensor controls are used in IPA will be useful? 
\nThis will explain \"why\" sensor controls are passed to IPA and what IPA \ndoes with it (for e.g gather necessary info to run algorithms).\n\n> +\n> +Buffers\n> +~~~~~~~\n> +\n> +The IPA will have Parameter and Statistics buffers shared with it from\n> +the IPU3 Pipeline handler. These buffers will be passed to the IPA\n> +before the ``start()`` operation occurs.\n> +\n> +The IPA will map the buffers into CPU accessible memory, associated with\n> +a buffer ID, and further events for sending or receiving parameter and\n> +statistics buffers will reference the ID to avoid expensive memory\n> +mapping operations, or the passing of file handles during streaming.\n> +\n> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n> +and no further access to the buffers is permitted.\n> +\n> +Context\n> +~~~~~~~\n> +\n> +Algorithm calls will always have the ``IPAContext`` available to them.\n> +This context comprises of two parts:\n> +\n> +-  IPA Session Configuration\n> +-  IPA Frame Context\n> +\n> +The session configuration structure ``IPASessionConfiguration``\n> +represents constant parameters determined from either before streaming\n> +commenced during ``configure()`` or updated parameters when processing\n> +controls.\n> +\n> +The IPA Frame Context provides the storage for algorithms for a single\n> +frame operation.\n> +\n> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n> +queue to store historical state for each frame, allowing algorithms to\n> +obtain and reference results of calculations which are deeply pipelined.\n> +This may only be done if an algorithm needs to know the context that was\n> +applied at the frame the statistics were produced for, rather than the\n> +previous or current frame.\n> +\n> +Presently there is a single ``IPAFrameContext`` without historical data,\n> +and the context is maintained and updated through successive processing\n> +operations.\n> +\n> +Operating\n> +~~~~~~~~~\n> +\n> +There are 
three main parts of interactions with the algorithms for the\n> +IPU3 IPA to operate when running:\n> +\n> +-  configure()\n> +-  processEvent(``EventFillParams``)\n\n\nOne liner explanation for EventFillParams is missing. It should be \npresent in Pre-frame preparation section, no? Can you please include a \nbrief line analogous to \"EventStatReady\" event in the Post-frame \ncompletion section?\n\nOther than that, looks fine to me:\n\nReviewed-by: Umang Jain <umang.jain@ideasonboard.com>\n\n> +-  processEvent(``EventStatReady``)\n> +\n> +The configuration phase allows the pipeline-handler to inform the IPA of\n> +the current stream configurations, which is then passed into each\n> +algorithm to provide an opportunity to identify and track state of the\n> +hardware, such as image size or ImgU pipeline configurations.\n> +\n> +Pre-frame preparation\n> +~~~~~~~~~~~~~~~~~~~~~\n> +\n> +When configured, the IPA is notified by the pipeline handler of the\n> +Camera ``start()`` event, after which incoming requests will be queued\n> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) to\n> +be populated for the ImgU. 
This is passed directly to each algorithm\n> +through the ``prepare()`` call allowing the ISP configuration to be\n> +updated for the needs of each component that the algorithm is\n> +responsible for.\n> +\n> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n> +structure that it modifies, and it should take care to ensure that any\n> +structure set by a use flag is fully initialised to suitable values.\n> +\n> +The parameter buffer is returned to the pipeline handler through the\n> +``ActionParamFilled`` event, and from there queued to the ImgU along\n> +with a raw frame captured with the CIO2.\n> +\n> +Post-frame completion\n> +~~~~~~~~~~~~~~~~~~~~~\n> +\n> +When the capture of an image is completed, and successfully processed\n> +through the ImgU, the generated statistics buffer\n> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n> +``EventStatReady`` event. This provides the IPA with an opportunity to\n> +examine the results of the ISP and run the calculations required by each\n> +algorithm on the new data. The algorithms may require context from the\n> +operations of other algorithms, for example, the AWB might choose to use\n> +a scene brightness determined by the AGC. 
It is important that the\n> +algorithms are ordered to ensure that required results are determined\n> +before they are needed.\n> +\n> +The ordering of the algorithm processing is determined by their\n> +placement in the ``IPU3::algorithms_`` ordered list.","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id CEABABDC71\n\tfor <parsemail@patchwork.libcamera.org>;\n\tMon, 13 Sep 2021 14:29:23 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 5655F6916F;\n\tMon, 13 Sep 2021 16:29:23 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 847B26024C\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 13 Sep 2021 16:29:21 +0200 (CEST)","from [192.168.1.104] (unknown [103.251.226.152])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id A24E2499;\n\tMon, 13 Sep 2021 16:29:20 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"PgRWEy/q\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1631543361;\n\tbh=cxzfEeAZmCr+Lxk2uOrszwgGYNbUrtGm8vP3KQzhEgg=;\n\th=Subject:To:References:From:Date:In-Reply-To:From;\n\tb=PgRWEy/q2yqTcHnX+n0GGYPWgoH1USz4RXTMCuKx+ABZp16ZONh29BL6mFrwHhZFK\n\t+lhzBIl5oFhp463fmrcPrbUg+7q31EBu/tnJg7uwLNuTmpyWc1NBor7HQmlMq3TZKT\n\tVlcz+yjpqcDntwd6gA0iTK29z7DuduLEbQn4Pfbs=","To":"Kieran Bingham <kieran.bingham@ideasonboard.com>,\n\tlibcamera devel 
<libcamera-devel@lists.libcamera.org>","References":"<20210912213017.702492-1-kieran.bingham@ideasonboard.com>","From":"Umang Jain <umang.jain@ideasonboard.com>","Message-ID":"<d985d46a-b9f4-01de-b0f9-9dc4fd1c15c9@ideasonboard.com>","Date":"Mon, 13 Sep 2021 19:59:15 +0530","User-Agent":"Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101\n\tThunderbird/78.10.2","MIME-Version":"1.0","In-Reply-To":"<20210912213017.702492-1-kieran.bingham@ideasonboard.com>","Content-Type":"text/plain; charset=utf-8; format=flowed","Content-Transfer-Encoding":"8bit","Content-Language":"en-US","Subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":19667,"web_url":"https://patchwork.libcamera.org/comment/19667/","msgid":"<YUA1AzpA4K3fNiat@pendragon.ideasonboard.com>","date":"2021-09-14T05:37:07","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Kieran,\n\nThank you for the patch.\n\nOn Sun, Sep 12, 2021 at 10:30:17PM +0100, Kieran Bingham wrote:\n> The IPU3 IPA 
implements the basic 3a using the ImgU ISP.\n\ns/3a/3A/\n\nTechnically speaking we're only implementing 2 out of 3 at this point\n:-)\n\n> Provide an overview document to describe it's operations, and provide a\n\ns/it's/its/\n\n> block diagram to help visualise how the components are put together to\n> assist any new developers exploring the code.\n> \n> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n> \n> ---\n> This is really just an RFC: Particularly:\n> \n>  - Should this content actually live in Documentation?\n>    or is it more suited to a page under src/ipa/ipu3/...\n\nGood question. We tend to document things in the code where possible.\nWould there be any drawback from doing so in this case ?\n\n>  - This is not complete to document all functionality\n>    but gives an overview of the current implementation\n\nSounds good to me.\n\n>  - Is there anything anyone would find helpful for me to\n>    extend/write about on this?\n> \n>  - It could likely get merged in with some of the work\n>    that Jean-Michel is doing, and might duplicate some of\n>    the parts he is documenting, but this was an aim to\n>    write a single page overview as a getting started with\n>    the IPU3 IPA ....\n\nI think it's *very* useful to have an overview.\n\n>  .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n>  1 file changed, 144 insertions(+)\n>  create mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n> \n> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst b/Documentation/guides/ipu3-ipa-design-guide.rst\n> new file mode 100644\n> index 000000000000..a1d0f13fbb00\n> --- /dev/null\n> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n> @@ -0,0 +1,144 @@\n> +IPU3 IPA Architecture Design and Overview\n> +=========================================\n> +\n> +The IPU3 IPA is built as a modular and extensible framework with an\n> +upper layer to manage the interactions with the pipeline handler, and\n> +the image 
processing algorithms split to compartmentalise the processing\n> +required for each accellerator cluster provided by the ImgU ISP.\n\nI've commented on the expression \"accelerator cluster\" when reviewing\nJean-Michel's series. I've then seen that it's mentioned in the IPU3\nkernel documentation in one place, I suppose that's where it comes from.\nFrom what I understand, the expression is used to refer to a processing\nblock (or sub-block) that is fully implemented in hardware, as opposed\nas being implemented partly or fully in the ImgU firmware. I don't think\nthat's relevant from an IPA point of view, it's an internal\nimplementation detail of the ImgU. I'd thus use \"processing block\" or a\nsimilar term instead.\n\n> +\n> +The core IPU3 class is responsible for initialisation and construction\n> +of the algorithm components, processing controls set by the requests\n> +from applications, and managing events from the Pipeline handler.\n\ns/Pipeline/pipeline/\n\n> +\n> +::\n> +\n> +        ┌───────────────────────────────────────────┐\n> +        │      IPU3 Pipeline Handler                │\n> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n> +        │   │        │    │        │    │        │  │\n> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n> +        │   │        │    │        │    │        │  │\n> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter Buffer\n> +        │                                 │P   │    │    S: Statistics Buffer\n> +        │                                 │    │S   │\n> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3: mapBuffers(), start()\n> +          ▼   ▼   ▼      ▼ │  ▼ │  ▼ │  ▼ │      ▼       4: processEvent()\n> +        ┌──────────────────┴────┴────┴────┴─────────┐    5: stop(), unmapBuffers()\n> +        │ IPU3 IPA             
                     │\n> +        │                 ┌───────────────────────┐ │\n> +        │ ┌───────────┐   │ Algorithms            │ │\n> +        │ │IPAContext │   │          ┌─────────┐  │ │\n> +        │ │ ┌───────┐ │   │          │ ...     │  │ │\n> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n> +        │ │ │       │ │   │    │         │        │ │\n> +        │ │ └───────┘ │   │    └─────────┘        │ │\n> +        │ └───────────┘   └───────────────────────┘ │\n> +        └───────────────────────────────────────────┘\n> +         SC: IPASessionConfiguration\n> +         FC: IPAFrameContext(s)\n> +\n> +The IPA instance is constructed and initialised at the point a Camera is\n> +created by the IPU3 pipeline handler. The initialisation call provides\n> +details about which Camera Sensor is being used, and the controls that\n\ns/Camera Sensor/camera sensor/ ?\n\n> +it has available, along with their defaults and ranges.\n\ns/defaults/default values/\n\n> +\n> +Buffers\n> +~~~~~~~\n> +\n> +The IPA will have Parameter and Statistics buffers shared with it from\n> +the IPU3 Pipeline handler. 
These buffers will be passed to the IPA\n> +before the ``start()`` operation occurs.\n> +\n> +The IPA will map the buffers into CPU accessible memory, associated with\n\ns/CPU accessible/CPU-accessible/\n\n> +a buffer ID, and further events for sending or receiving parameter and\n> +statistics buffers will reference the ID to avoid expensive memory\n> +mapping operations, or the passing of file handles during streaming.\n> +\n> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n> +and no further access to the buffers is permitted.\n> +\n> +Context\n> +~~~~~~~\n> +\n> +Algorithm calls will always have the ``IPAContext`` available to them.\n> +This context comprises of two parts:\n> +\n> +-  IPA Session Configuration\n> +-  IPA Frame Context\n> +\n> +The session configuration structure ``IPASessionConfiguration``\n> +represents constant parameters determined from either before streaming\n> +commenced during ``configure()`` or updated parameters when processing\n> +controls.\n\nI'm not sure about the last part, I thought the session configuration\nwas meant to be constant after configure() ?\n\n> +The IPA Frame Context provides the storage for algorithms for a single\n> +frame operation.\n> +\n> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n> +queue to store historical state for each frame, allowing algorithms to\n> +obtain and reference results of calculations which are deeply pipelined.\n> +This may only be done if an algorithm needs to know the context that was\n> +applied at the frame the statistics were produced for, rather than the\n> +previous or current frame.\n> +\n> +Presently there is a single ``IPAFrameContext`` without historical data,\n> +and the context is maintained and updated through successive processing\n> +operations.\n> +\n> +Operating\n> +~~~~~~~~~\n> +\n> +There are three main parts of interactions with the algorithms for the\n> +IPU3 IPA to operate when running:\n> +\n> +-  configure()\n> +-  
processEvent(``EventFillParams``)\n> +-  processEvent(``EventStatReady``)\n> +\n> +The configuration phase allows the pipeline-handler to inform the IPA of\n> +the current stream configurations, which is then passed into each\n> +algorithm to provide an opportunity to identify and track state of the\n> +hardware, such as image size or ImgU pipeline configurations.\n> +\n> +Pre-frame preparation\n> +~~~~~~~~~~~~~~~~~~~~~\n> +\n> +When configured, the IPA is notified by the pipeline handler of the\n> +Camera ``start()`` event, after which incoming requests will be queued\n> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) to\n> +be populated for the ImgU. This is passed directly to each algorithm\n> +through the ``prepare()`` call allowing the ISP configuration to be\n> +updated for the needs of each component that the algorithm is\n> +responsible for.\n> +\n> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n> +structure that it modifies, and it should take care to ensure that any\n> +structure set by a use flag is fully initialised to suitable values.\n> +\n> +The parameter buffer is returned to the pipeline handler through the\n> +``ActionParamFilled`` event, and from there queued to the ImgU along\n> +with a raw frame captured with the CIO2.\n\nThis reminds me that we should turn the operations (actions and events)\ninto separate functions in the IPAIPU3Interface. Out of scope for this\npatch of course.\n\n> +\n> +Post-frame completion\n> +~~~~~~~~~~~~~~~~~~~~~\n> +\n> +When the capture of an image is completed, and successfully processed\n> +through the ImgU, the generated statistics buffer\n> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n> +``EventStatReady`` event. This provides the IPA with an opportunity to\n> +examine the results of the ISP and run the calculations required by each\n> +algorithm on the new data. 
The algorithms may require context from the\n> +operations of other algorithms, for example, the AWB might choose to use\n> +a scene brightness determined by the AGC. It is important that the\n> +algorithms are ordered to ensure that required results are determined\n> +before they are needed.\n> +\n> +The ordering of the algorithm processing is determined by their\n> +placement in the ``IPU3::algorithms_`` ordered list.\n\nI expect interesting research to happen in this area :-)","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 5BC4CBDB1D\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue, 14 Sep 2021 05:37:35 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id BA28E69188;\n\tTue, 14 Sep 2021 07:37:34 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 4603269181\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 14 Sep 2021 07:37:33 +0200 (CEST)","from pendragon.ideasonboard.com (62-78-145-57.bb.dnainternet.fi\n\t[62.78.145.57])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id B77862A5;\n\tTue, 14 Sep 2021 07:37:32 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"oiUW9gTY\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; 
t=1631597852;\n\tbh=hXHQUhJfdFGgf21sY9w0IsSHE1AGjX4yenZCEawDPwE=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=oiUW9gTYdUSJlmpffWKLOX+99FQ98I63Oh34KtRjjq16zbWqMojau50Ay4M5cBDBI\n\tU6FWlxgSUrY+8Cf+kbHcyos4/k20TjI2bQoy16YMpKjuajO4nn4tOy29x5NBEqbsBc\n\tO2zRHhMSki9hf4XqRbEq5cVkQfJ+5yjqhyVq++QE=","Date":"Tue, 14 Sep 2021 08:37:07 +0300","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"Kieran Bingham <kieran.bingham@ideasonboard.com>","Message-ID":"<YUA1AzpA4K3fNiat@pendragon.ideasonboard.com>","References":"<20210912213017.702492-1-kieran.bingham@ideasonboard.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","Content-Transfer-Encoding":"8bit","In-Reply-To":"<20210912213017.702492-1-kieran.bingham@ideasonboard.com>","Subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":19672,"web_url":"https://patchwork.libcamera.org/comment/19672/","msgid":"<9f2e1956-542c-a3d6-46eb-ee615850ce8d@ideasonboard.com>","date":"2021-09-14T10:13:39","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA 
Design\n\tguide","submitter":{"id":4,"url":"https://patchwork.libcamera.org/api/people/4/","name":"Kieran Bingham","email":"kieran.bingham@ideasonboard.com"},"content":"Hi Laurent,\n\nOn 14/09/2021 06:37, Laurent Pinchart wrote:\n> Hi Kieran,\n> \n> Thank you for the patch.\n> \n> On Sun, Sep 12, 2021 at 10:30:17PM +0100, Kieran Bingham wrote:\n>> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n> \n> s/3a/3A/\n> \n> Technically speaking we're only implementing 2 out of 3 at this point\n> :-)\n> \n>> Provide an overview document to describe it's operations, and provide a\n> \n> s/it's/its/\n> \n>> block diagram to help visualise how the components are put together to\n>> assist any new developers exploring the code.\n>>\n>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n>>\n>> ---\n>> This is really just an RFC: Particularly:\n>>\n>>  - Should this content actually live in Documentation?\n>>    or is it more suited to a page under src/ipa/ipu3/...\n> \n> Good question. 
We tend to document things in the code where possible.\n> Would there be any drawback from doing so in this case ?\n\nI don't think so - I think the question mostly comes from\n'discoverability' of the documentation.\n\n\nWhich is also a bigger question that we need to make it easier for\nreaders to find the documentation that they desire.\n\nI've come across a great talk on this recently:\n  - What nobody tells you about documentation - Daniele Procida\n  - PyConAu\n  - https://2017.pycon-au.org/schedule/presentation/15/\n  - https://youtu.be/t4vKPhjcMZg\n\nAnd that's been spun into a website to document how to document ;-)\n  - https://diataxis.fr/\n\n\nSo, I think this could indeed live under src/ipa/ipu3 but it needs to be\nclear that this should be the first thing to read when looking into the IPA.\n\n\n\n> \n>>  - This is not complete to document all functionality\n>>    but gives an overview of the current implementation\n> \n> Sounds good to me.\n> \n>>  - Is there anything anyone would find helpful for me to\n>>    extend/write about on this?\n>>\n>>  - It could likely get merged in with some of the work\n>>    that Jean-Michel is doing, and might duplicate some of\n>>    the parts he is documenting, but this was an aim to\n>>    write a single page overview as a getting started with\n>>    the IPU3 IPA ....\n> \n> I think it's *very* useful to have an overview.\n> \n>>  .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n>>  1 file changed, 144 insertions(+)\n>>  create mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n>>\n>> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst b/Documentation/guides/ipu3-ipa-design-guide.rst\n>> new file mode 100644\n>> index 000000000000..a1d0f13fbb00\n>> --- /dev/null\n>> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n>> @@ -0,0 +1,144 @@\n>> +IPU3 IPA Architecture Design and Overview\n>> +=========================================\n>> +\n>> +The IPU3 IPA is built as a modular 
and extensible framework with an\n>> +upper layer to manage the interactions with the pipeline handler, and\n>> +the image processing algorithms split to compartmentalise the processing\n>> +required for each accellerator cluster provided by the ImgU ISP.\n> \n> I've commented on the expression \"accelerator cluster\" when reviewing\n> Jean-Michel's series. I've then seen that it's mentioned in the IPU3\n> kernel documentation in one place, I suppose that's where it comes from.\n> From what I understand, the expression is used to refer to a processing\n> block (or sub-block) that is fully implemented in hardware, as opposed\n> as being implemented partly or fully in the ImgU firmware. I don't think\n> that's relevant from an IPA point of view, it's an internal\n> implementation detail of the ImgU. I'd thus use \"processing block\" or a\n> similar term instead.\n\nhttps://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/tree/drivers/staging/media/ipu3/include/uapi/intel-ipu3.h#n2428\n\nDescribes it as:\n\n\"\"\"\nACC refers to the HW cluster containing all Fixed Functions (FFs). 
Each\nFF implements a specific algorithm.\n"""\n\nSo in fact, yes - the term Accelerator Cluster is a 'group' of all the\nhardware accelerator functions, rather than each individual one.\n\nEach individual function (fixed function) can indeed be better named as\na processing block.\n\n\nLet's make this more like\n\n"""\nThe IPU3 IPA is built as a modular and extensible framework with an\nupper layer to manage the interactions with the pipeline handler, and\nthe image processing algorithms split to compartmentalise the processing\nrequired for each processing block, making use of the fixed-function\naccelerators provided by the ImgU ISP.\n"""\n\n\n\n>> +\n>> +The core IPU3 class is responsible for initialisation and construction\n>> +of the algorithm components, processing controls set by the requests\n>> +from applications, and managing events from the Pipeline handler.\n> \n> s/Pipeline/pipeline/\n> \n>> +\n>> +::\n>> +\n>> +        ┌───────────────────────────────────────────┐\n>> +        │      IPU3 Pipeline Handler                │\n>> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n>> +        │   │        │    │        │    │        │  │\n>> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n>> +        │   │        │    │        │    │        │  │\n>> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter Buffer\n>> +        │                                 │P   │    │    S: Statistics Buffer\n>> +        │                                 │    │S   │\n>> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n>> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n>> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3: mapBuffers(), start()\n>> +          ▼   ▼   ▼      ▼ │  ▼ │  ▼ │  ▼ │      ▼       4: processEvent()\n>> +        ┌──────────────────┴────┴────┴────┴─────────┐    5: stop(), unmapBuffers()\n>> +        │ IPU3 IPA                                  │\n>> +        │  
               ┌───────────────────────┐ │\n>> +        │ ┌───────────┐   │ Algorithms            │ │\n>> +        │ │IPAContext │   │          ┌─────────┐  │ │\n>> +        │ │ ┌───────┐ │   │          │ ...     │  │ │\n>> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n>> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n>> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n>> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n>> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n>> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n>> +        │ │ │       │ │   │    │         │        │ │\n>> +        │ │ └───────┘ │   │    └─────────┘        │ │\n>> +        │ └───────────┘   └───────────────────────┘ │\n>> +        └───────────────────────────────────────────┘\n>> +         SC: IPASessionConfiguration\n>> +         FC: IPAFrameContext(s)\n>> +\n>> +The IPA instance is constructed and initialised at the point a Camera is\n>> +created by the IPU3 pipeline handler. The initialisation call provides\n\n\n\n"Camera" is capitalised as a libcamera noun. Shouldn't Pipeline Handler\nalso be capitalised as it's a libcamera named thing?\n\n\n(From your comment above about s/Pipeline/pipeline/, I wonder if instead\nit should be capitalised there and here too.)\n\n\n>> +details about which Camera Sensor is being used, and the controls that\n> \n> s/Camera Sensor/camera sensor/ ?\n\nNow that one is awkward, as it's both a libcamera noun, and a real world\nobject.\n\nIn this context, I believe camera sensor is correct (as you've\nsuggested), as it's the external world camera sensor details.\n\n> \n>> +it has available, along with their defaults and ranges.\n> \n> s/defaults/default values/\n> \n>> +\n>> +Buffers\n>> +~~~~~~~\n>> +\n>> +The IPA will have Parameter and Statistics buffers shared with it from\n>> +the IPU3 Pipeline handler. 
These buffers will be passed to the IPA\n>> +before the ``start()`` operation occurs.\n>> +\n>> +The IPA will map the buffers into CPU accessible memory, associated with\n> \n> s/CPU accessible/CPU-accessible/\n> \n>> +a buffer ID, and further events for sending or receiving parameter and\n>> +statistics buffers will reference the ID to avoid expensive memory\n>> +mapping operations, or the passing of file handles during streaming.\n>> +\n>> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n>> +and no further access to the buffers is permitted.\n>> +\n>> +Context\n>> +~~~~~~~\n>> +\n>> +Algorithm calls will always have the ``IPAContext`` available to them.\n>> +This context comprises of two parts:\n>> +\n>> +-  IPA Session Configuration\n>> +-  IPA Frame Context\n>> +\n>> +The session configuration structure ``IPASessionConfiguration``\n>> +represents constant parameters determined from either before streaming\n>> +commenced during ``configure()`` or updated parameters when processing\n>> +controls.\n> \n> I'm not sure about the last part, I thought the session configuration\n> was meant to be constant after configure() ?\n\nI would expect updates from controls to update the 'active session\nconfiguration'.\n\nOf course some of that session configuration will be constant, and not\nchange (I don't think we'll change the resolution for instance) - but I\nexpect that some parameters will be set such as whether AE is enabled or\nlocked.\n\nThat to me is a session configuration parameter. 
But it can change at\nruntime - and while it remains the same, every new frame will have that\nconfiguration applied.\n\n\nWhereas the FrameContexts are about the context that is handled for each\nframe, and any context shared between algorithms, and is much more dynamic.\n\n\n>> +The IPA Frame Context provides the storage for algorithms for a single\n>> +frame operation.\n>> +\n>> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n>> +queue to store historical state for each frame, allowing algorithms to\n>> +obtain and reference results of calculations which are deeply pipelined.\n>> +This may only be done if an algorithm needs to know the context that was\n>> +applied at the frame the statistics were produced for, rather than the\n>> +previous or current frame.\n>> +\n>> +Presently there is a single ``IPAFrameContext`` without historical data,\n>> +and the context is maintained and updated through successive processing\n>> +operations.\n>> +\n>> +Operating\n>> +~~~~~~~~~\n>> +\n>> +There are three main parts of interactions with the algorithms for the\n>> +IPU3 IPA to operate when running:\n>> +\n>> +-  configure()\n>> +-  processEvent(``EventFillParams``)\n>> +-  processEvent(``EventStatReady``)\n>> +\n>> +The configuration phase allows the pipeline-handler to inform the IPA of\n>> +the current stream configurations, which is then passed into each\n>> +algorithm to provide an opportunity to identify and track state of the\n>> +hardware, such as image size or ImgU pipeline configurations.\n>> +\n>> +Pre-frame preparation\n>> +~~~~~~~~~~~~~~~~~~~~~\n>> +\n>> +When configured, the IPA is notified by the pipeline handler of the\n>> +Camera ``start()`` event, after which incoming requests will be queued\n>> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) to\n>> +be populated for the ImgU. 
This is passed directly to each algorithm\n>> +through the ``prepare()`` call allowing the ISP configuration to be\n>> +updated for the needs of each component that the algorithm is\n>> +responsible for.\n>> +\n>> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n>> +structure that it modifies, and it should take care to ensure that any\n>> +structure set by a use flag is fully initialised to suitable values.\n>> +\n>> +The parameter buffer is returned to the pipeline handler through the\n>> +``ActionParamFilled`` event, and from there queued to the ImgU along\n>> +with a raw frame captured with the CIO2.\n> \n> This reminds me that we should turn the operations (actions and events)\n> into separate functions in the IPAIPU3Interface. Out of scope for this\n> patch of course.\n\nYou mean directly defining them in Mojo instead of passing them through\nprocessEvent with an event flag?\n\nI think that would probably be better indeed; I don't see much added\nvalue to having processEvent() ... (with the small exception that it\npartially shows which functions are called while streaming, but we could\nalso document the async/streaming expectations anyway).\n\n\n> \n>> +\n>> +Post-frame completion\n>> +~~~~~~~~~~~~~~~~~~~~~\n>> +\n>> +When the capture of an image is completed, and successfully processed\n>> +through the ImgU, the generated statistics buffer\n>> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n>> +``EventStatReady`` event. This provides the IPA with an opportunity to\n>> +examine the results of the ISP and run the calculations required by each\n>> +algorithm on the new data. The algorithms may require context from the\n>> +operations of other algorithms, for example, the AWB might choose to use\n>> +a scene brightness determined by the AGC. 
It is important that the\n>> +algorithms are ordered to ensure that required results are determined\n>> +before they are needed.\n>> +\n>> +The ordering of the algorithm processing is determined by their\n>> +placement in the ``IPU3::algorithms_`` ordered list.\n> \n> I expect interesting research to happen in this area :-)\n\nYes, well it's too early to state any more than that currently I think.\n\n--\nKieran"},
{"id":19676,"web_url":"https://patchwork.libcamera.org/comment/19676/","msgid":"<YUB6sMziBTpEToDB@pendragon.ideasonboard.com>","date":"2021-09-14T10:34:24","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design guide","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Kieran,\n\nOn Tue, Sep 14, 2021 at 11:13:39AM +0100, Kieran Bingham wrote:\n> On 14/09/2021 06:37, Laurent Pinchart wrote:\n> > On Sun, Sep 12, 2021 at 10:30:17PM +0100, Kieran Bingham wrote:\n> >> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n> > \n> > s/3a/3A/\n> > \n> > Technically speaking we're only implementing 2 out of 3 at this point\n> > :-)\n> > \n> >> Provide an overview document to describe it's operations, and provide a\n> > \n> > s/it's/its/\n> > \n> >> block diagram to help visualise how the components are put together to\n> >> assist any new developers exploring the code.\n> >>\n> >> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n> >>\n> >> ---\n> >> This is really just an RFC: Particularly:\n> >>\n> >>  - Should this content actually live in Documentation?\n> >>    or is it more suited to a page under src/ipa/ipu3/...\n> > \n> > Good question. 
We tend to document things in the code where possible.\n> > Would there be any drawback from doing so in this case ?\n> \n> I don't think so - I think the question mostly comes from\n> 'discoverability' of the documentation.\n> \n> \n> Which is also a bigger question that we need to make it easier for\n> readers to find the documentation that they desire.\n> \n> I've come across a great talk on this recently:\n>   - What nobody tells you about documentation - Daniele Procida\n>   - PyConAu\n>   - https://2017.pycon-au.org/schedule/presentation/15/\n>   - https://youtu.be/t4vKPhjcMZg\n> \n> And that's been spun into a website to document how to document ;-)\n>   - https://diataxis.fr/\n\nThanks, I'll have a look.\n\n> So, I think this could indeed live under src/ipa/ipu3 but it needs to be\n> clear that this should be the first thing to read when looking into the IPA.\n\nAgreed. I could envision moving it to Documentation/ later, as part of a\nlonger document that uses the IPU3 IPA to teach how to write algorithms,\nbut at that point we'll start writing a book about algorithm design :-)\n\n> >>  - This is not complete to document all functionality\n> >>    but gives an overview of the current implementation\n> > \n> > Sounds good to me.\n> > \n> >>  - Is there anything anyone would find helpful for me to\n> >>    extend/write about on this?\n> >>\n> >>  - It could likely get merged in with some of the work\n> >>    that Jean-Michel is doing, and might duplicate some of\n> >>    the parts he is documenting, but this was an aim to\n> >>    write a single page overview as a getting started with\n> >>    the IPU3 IPA ....\n> > \n> > I think it's *very* useful to have an overview.\n> > \n> >>  .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n> >>  1 file changed, 144 insertions(+)\n> >>  create mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n> >>\n> >> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst 
b/Documentation/guides/ipu3-ipa-design-guide.rst\n> >> new file mode 100644\n> >> index 000000000000..a1d0f13fbb00\n> >> --- /dev/null\n> >> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n> >> @@ -0,0 +1,144 @@\n> >> +IPU3 IPA Architecture Design and Overview\n> >> +=========================================\n> >> +\n> >> +The IPU3 IPA is built as a modular and extensible framework with an\n> >> +upper layer to manage the interactions with the pipeline handler, and\n> >> +the image processing algorithms split to compartmentalise the processing\n> >> +required for each accellerator cluster provided by the ImgU ISP.\n> > \n> > I've commented on the expression \"accelerator cluster\" when reviewing\n> > Jean-Michel's series. I've then seen that it's mentioned in the IPU3\n> > kernel documentation in one place, I suppose that's where it comes from.\n> > From what I understand, the expression is used to refer to a processing\n> > block (or sub-block) that is fully implemented in hardware, as opposed\n> > as being implemented partly or fully in the ImgU firmware. I don't think\n> > that's relevant from an IPA point of view, it's an internal\n> > implementation detail of the ImgU. I'd thus use \"processing block\" or a\n> > similar term instead.\n> \n> https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/tree/drivers/staging/media/ipu3/include/uapi/intel-ipu3.h#n2428\n> \n> Describes it as:\n> \n> \"\"\"\n> ACC refers to the HW cluster containing all Fixed Functions (FFs). 
Each\n> FF implements a specific algorithm.\n> \"\"\"\n> \n> So in fact, yes - the term Accelerator Cluster is a 'group' of all the\n> hardware accelerator functions, rather than each individual one.\n> \n> Each individual function (fixed function) can indeed be better named as\n> a processing block.\n> \n> \n> Lets make this more like\n> \n> \"\"\"\n> The IPU3 IPA is built as a modular and extensible framework with an\n> upper layer to manage the interactions with the pipeline handler, and\n> the image processing algorithms split to compartmentalise the processing\n> required for each processing block, making use of the fixed-function\n> accelerators provided by the ImgU ISP.\n> \"\"\"\n\nSounds good to me. Let's standardize on the same \"processing block\"\nterminology in the IPU3 IPA documentation series too.\n\n> >> +\n> >> +The core IPU3 class is responsible for initialisation and construction\n> >> +of the algorithm components, processing controls set by the requests\n> >> +from applications, and managing events from the Pipeline handler.\n> > \n> > s/Pipeline/pipeline/\n> > \n> >> +\n> >> +::\n> >> +\n> >> +        ┌───────────────────────────────────────────┐\n> >> +        │      IPU3 Pipeline Handler                │\n> >> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n> >> +        │   │        │    │        │    │        │  │\n> >> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n> >> +        │   │        │    │        │    │        │  │\n> >> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter Buffer\n> >> +        │                                 │P   │    │    S: Statistics Buffer\n> >> +        │                                 │    │S   │\n> >> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n> >> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n> >> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3: mapBuffers(), start()\n> >> +          ▼   ▼   ▼      ▼ │  
▼ │  ▼ │  ▼ │      ▼       4: processEvent()\n> >> +        ┌──────────────────┴────┴────┴────┴─────────┐    5: stop(), unmapBuffers()\n> >> +        │ IPU3 IPA                                  │\n> >> +        │                 ┌───────────────────────┐ │\n> >> +        │ ┌───────────┐   │ Algorithms            │ │\n> >> +        │ │IPAContext │   │          ┌─────────┐  │ │\n> >> +        │ │ ┌───────┐ │   │          │ ...     │  │ │\n> >> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n> >> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n> >> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n> >> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n> >> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n> >> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n> >> +        │ │ │       │ │   │    │         │        │ │\n> >> +        │ │ └───────┘ │   │    └─────────┘        │ │\n> >> +        │ └───────────┘   └───────────────────────┘ │\n> >> +        └───────────────────────────────────────────┘\n> >> +         SC: IPASessionConfiguration\n> >> +         FC: IPAFrameContext(s)\n> >> +\n> >> +The IPA instance is constructed and initialised at the point a Camera is\n> >> +created by the IPU3 pipeline handler. The initialisation call provides\n> \n> \"Camera\" is capitalised as a libcamera noun. Shouldn't Pipeline Handler\n> also be capitalised as it's a libcamera named thing?\n> \n> (From your comment above about s/Pipeline/pipeline/. I wonder if instead\n> it should be capitalised there and here too.\n\nIn Doxygen capitalization generates links to classes, so we have to be\ncareful about writing \"Camera\" for instance. When it comes to pipeline\nhandlers, \"Pipeline Handler\" won't generate a link, and I wouldn't write\n\"PipelineHandler\" here anyway.\n\nI'm sure there are conventions for all this in documentation. 
I usually\ncapitalize words that have special meanings the first time they're\nintroduced only, but I don't know if that's a good thing to do.\n\n> >> +details about which Camera Sensor is being used, and the controls that\n> > \n> > s/Camera Sensor/camera sensor/ ?\n> \n> Now that one is awkward, as it's both a libcamera noun, and a real world\n> object.\n> \n> In this context, I believe camera sensor is correct, (as you've\n> suggested) as it's the external world camera sensor details.\n> \n> >> +it has available, along with their defaults and ranges.\n> > \n> > s/defaults/default values/\n> > \n> >> +\n> >> +Buffers\n> >> +~~~~~~~\n> >> +\n> >> +The IPA will have Parameter and Statistics buffers shared with it from\n> >> +the IPU3 Pipeline handler. These buffers will be passed to the IPA\n> >> +before the ``start()`` operation occurs.\n> >> +\n> >> +The IPA will map the buffers into CPU accessible memory, associated with\n> > \n> > s/CPU accessible/CPU-accessible/\n> > \n> >> +a buffer ID, and further events for sending or receiving parameter and\n> >> +statistics buffers will reference the ID to avoid expensive memory\n> >> +mapping operations, or the passing of file handles during streaming.\n> >> +\n> >> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n> >> +and no further access to the buffers is permitted.\n> >> +\n> >> +Context\n> >> +~~~~~~~\n> >> +\n> >> +Algorithm calls will always have the ``IPAContext`` available to them.\n> >> +This context comprises of two parts:\n> >> +\n> >> +-  IPA Session Configuration\n> >> +-  IPA Frame Context\n> >> +\n> >> +The session configuration structure ``IPASessionConfiguration``\n> >> +represents constant parameters determined from either before streaming\n> >> +commenced during ``configure()`` or updated parameters when processing\n> >> +controls.\n> > \n> > I'm not sure about the last part, I thought the session configuration\n> > was meant to be constant after configure() ?\n> \n> I would 
expect updates from controls to update the 'active session\n> configuration'.\n> \n> Of course some of that session configuration will be constant, and not\n> change (I don't think we'll change the resolution for instance) - but I\n> expect that some parameters will be set such as whether AE is enabled or\n> locked.\n> \n> That to me is a session configuration parameter. But it can change at\n> runtime - and while it remains the same, every new frame will have that\n> configuration applied.\n> \n> \n> Whereas the FrameContexts are about the context that is handled for each\n> frame, and any context shared between algorithms , and is much more dynamic.\n\nThis makes me think that we may want to introduce a third structure to\ngroup state that isn't per-frame. I recall designing\nIPASessionConfiguration as something that will not change during the\nsession. Maybe I recall it wrong, but I like the idea of keeping it\nstatic. Separating configuration that is set at the beginning and\ndoesn't change from controls that are updated dynamically seems a good\ndesign principle to me.\n\n> >> +The IPA Frame Context provides the storage for algorithms for a single\n> >> +frame operation.\n> >> +\n> >> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n> >> +queue to store historical state for each frame, allowing algorithms to\n> >> +obtain and reference results of calculations which are deeply pipelined.\n> >> +This may only be done if an algorithm needs to know the context that was\n> >> +applied at the frame the statistics were produced for, rather than the\n> >> +previous or current frame.\n> >> +\n> >> +Presently there is a single ``IPAFrameContext`` without historical data,\n> >> +and the context is maintained and updated through successive processing\n> >> +operations.\n> >> +\n> >> +Operating\n> >> +~~~~~~~~~\n> >> +\n> >> +There are three main parts of interactions with the algorithms for the\n> >> +IPU3 IPA to operate when running:\n> >> +\n> 
>> +-  configure()\n> >> +-  processEvent(``EventFillParams``)\n> >> +-  processEvent(``EventStatReady``)\n> >> +\n> >> +The configuration phase allows the pipeline-handler to inform the IPA of\n> >> +the current stream configurations, which is then passed into each\n> >> +algorithm to provide an opportunity to identify and track state of the\n> >> +hardware, such as image size or ImgU pipeline configurations.\n> >> +\n> >> +Pre-frame preparation\n> >> +~~~~~~~~~~~~~~~~~~~~~\n> >> +\n> >> +When configured, the IPA is notified by the pipeline handler of the\n> >> +Camera ``start()`` event, after which incoming requests will be queued\n> >> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) to\n> >> +be populated for the ImgU. This is passed directly to each algorithm\n> >> +through the ``prepare()`` call allowing the ISP configuration to be\n> >> +updated for the needs of each component that the algorithm is\n> >> +responsible for.\n> >> +\n> >> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n> >> +structure that it modifies, and it should take care to ensure that any\n> >> +structure set by a use flag is fully initialised to suitable values.\n> >> +\n> >> +The parameter buffer is returned to the pipeline handler through the\n> >> +``ActionParamFilled`` event, and from there queued to the ImgU along\n> >> +with a raw frame captured with the CIO2.\n> > \n> > This reminds me that we should turn the operations (actions and events)\n> > into separate functions in the IPAIPU3Interface. Out of scope for this\n> > patch of course.\n> \n> You mean directly defining them in Mojo instead of passing them through\n> processEvent with an event flag?\n\nYes.\n\n> I think that would probably be better indeed, I don't see much added\n> value to having processEvent() ... 
(with the small exception that it\n> partially shows which functions are called while streaming, but we could\n> also document the async/streaming expectations anyway).\n\nprocessEvent() dates back from when we tried to have a single generic\nAPI for IPAs.\n\n> >> +\n> >> +Post-frame completion\n> >> +~~~~~~~~~~~~~~~~~~~~~\n> >> +\n> >> +When the capture of an image is completed, and successfully processed\n> >> +through the ImgU, the generated statistics buffer\n> >> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n> >> +``EventStatReady`` event. This provides the IPA with an opportunity to\n> >> +examine the results of the ISP and run the calculations required by each\n> >> +algorithm on the new data. The algorithms may require context from the\n> >> +operations of other algorithms, for example, the AWB might choose to use\n> >> +a scene brightness determined by the AGC. It is important that the\n> >> +algorithms are ordered to ensure that required results are determined\n> >> +before they are needed.\n> >> +\n> >> +The ordering of the algorithm processing is determined by their\n> >> +placement in the ``IPU3::algorithms_`` ordered list.\n> > \n> > I expect interesting research to happen in this area :-)\n> \n> Yes, well it's too early to state any more than that currently I think."},
{"id":19706,"web_url":"https://patchwork.libcamera.org/comment/19706/","msgid":"<e876f75a-ab01-b3c0-2e91-6a9b2c12f1c2@ideasonboard.com>","date":"2021-09-16T12:36:43","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design guide","submitter":{"id":4,"url":"https://patchwork.libcamera.org/api/people/4/","name":"Kieran Bingham","email":"kieran.bingham@ideasonboard.com"},"content":"On 13/09/2021 15:29, Umang Jain wrote:\n> Hi Kieran,\n> \n> On 9/13/21 3:00 AM, Kieran Bingham wrote:\n>> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n> 3a or 3A?\n>>\n>> Provide an overview document to describe it's operations, and provide a\n>> block diagram to help visualise how the components are put together to\n>> assist any new developers exploring the code.\n>>\n>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n>>\n>> ---\n>> This is really just an RFC: Particularly:\n>>\n>>   - Should this content actually live in Documentation?\n>>     or is it more suited to a page under src/ipa/ipu3/...\n> \n> \n> I would probably keep it under Documentation/ but there is already 
a IPA\n> writing guide. So maybe we need a new dir, to document overviews of\n> existing IPAs?\n\nIndeed, the existing document really describes the IPA IPC interfaces; I\nwonder if it should be renamed ...\n\nThe aim of this is to provide some overview on how the IPU3 IPA operates\nfrom a higher level.\n\nAlong with the discussions with Laurent, I'm going to move this to\nsrc/ipa/ipu3/ipu3-ipa-design-guide.rst for now.\n\nThat will keep the overview document in the actual component for now.\n\nIt may be that we write further 'generic' IPA design guides, which may\ntake inspiration from this, and could live in Documentation, but I think\nI/we need to spend some dedicated cycles on how we clean up our\ndocumentation anyway, so I'd rather keep this file close to its\nimplementation for now.\n\n\n\n\n>>\n>>   - This is not complete to document all functionality\n>>     but gives an overview of the current implementation\n>>\n>>   - Is there anything anyone would find helpful for me to\n>>     extend/write about on this?\n>>\n>>   - It could likely get merged in with some of the work\n>>     that Jean-Michel is doing, and might duplicate some of\n>>     the parts he is documenting, but this was an aim to\n>>     write a single page overview as a getting started with\n>>     the IPU3 IPA ....\n>>\n>>\n>>   .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n>>   1 file changed, 144 insertions(+)\n>>   create mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n>>\n>> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst\n>> b/Documentation/guides/ipu3-ipa-design-guide.rst\n>> new file mode 100644\n>> index 000000000000..a1d0f13fbb00\n>> --- /dev/null\n>> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n>> @@ -0,0 +1,144 @@\n>> +IPU3 IPA Architecture Design and Overview\n>> +=========================================\n>> +\n>> +The IPU3 IPA is built as a modular and extensible framework with an\n>> +upper layer to manage the 
interactions with the pipeline handler, and\n>> +the image processing algorithms split to compartmentalise the processing\n>> +required for each accellerator cluster provided by the ImgU ISP.\n>> +\n>> +The core IPU3 class is responsible for initialisation and construction\n> s/IPU3/IPAIPU3 maybe?\n>> +of the algorithm components, processing controls set by the requests\n>> +from applications, and managing events from the Pipeline handler.\n>> +\n>> +::\n>> +\n>> +        ┌───────────────────────────────────────────┐\n>> +        │      IPU3 Pipeline Handler                │\n>> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n>> +        │   │        │    │        │    │        │  │\n>> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n>> +        │   │        │    │        │    │        │  │\n>> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter\n>> Buffer\n>> +        │                                 │P   │    │    S:\n>> Statistics Buffer\n>> +        │                                 │    │S   │\n>> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n>> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n>> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3:\n>> mapBuffers(), start()\n>> +          ▼   ▼   ▼      ▼ │  ▼ │  ▼ │  ▼ │      ▼       4:\n>> processEvent()\n>> +        ┌──────────────────┴────┴────┴────┴─────────┐    5: stop(),\n>> unmapBuffers()\n>> +        │ IPU3 IPA                                  │\n>> +        │                 ┌───────────────────────┐ │\n>> +        │ ┌───────────┐   │ Algorithms            │ │\n>> +        │ │IPAContext │   │          ┌─────────┐  │ │\n>> +        │ │ ┌───────┐ │   │          │ ...     
│  │ │\n>> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n>> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n>> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n>> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n>> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n>> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n>> +        │ │ │       │ │   │    │         │        │ │\n>> +        │ │ └───────┘ │   │    └─────────┘        │ │\n>> +        │ └───────────┘   └───────────────────────┘ │\n>> +        └───────────────────────────────────────────┘\n>> +         SC: IPASessionConfiguration\n>> +         FC: IPAFrameContext(s)\n>> +\n>> +The IPA instance is constructed and initialised at the point a Camera is\n>> +created by the IPU3 pipeline handler. The initialisation call provides\n>> +details about which Camera Sensor is being used, and the controls that\n>> +it has available, along with their defaults and ranges.\n> \n> \n> Brief line about how sensor controls are used in IPA will be useful?\n> This will explain \"why\" sensor controls are passed to IPA and what IPA\n> does with it (for e.g gather necessary info to run algorithms).\n\nI have wondered about putting a CameraSensorHelper box in the IPA too,\nbut I thought that would be too cluttered, and I don't think it adds\nmuch to convey the context of the running IPA (at least for the moment) ...\n\n\nBut it might certainly be worth explaining about how controls are also\nsent to the pipeline handler. I'll see if I can add a section:\n\n\nControls\n~~~~~~~~\n\nThe AutoExposure and AutoGain (AGC) algorithm differs slightly from the\nothers as it requires operating directly on the sensor, as opposed to\nthrough the ImgU ISP. 
To support this, there is a dedicated action\n`ActionSetSensorControls` to allow the IPA to request controls to be set\non the camera sensor through the pipeline handler.\n\n\n\n>> +\n>> +Buffers\n>> +~~~~~~~\n>> +\n>> +The IPA will have Parameter and Statistics buffers shared with it from\n>> +the IPU3 Pipeline handler. These buffers will be passed to the IPA\n>> +before the ``start()`` operation occurs.\n>> +\n>> +The IPA will map the buffers into CPU accessible memory, associated with\n>> +a buffer ID, and further events for sending or receiving parameter and\n>> +statistics buffers will reference the ID to avoid expensive memory\n>> +mapping operations, or the passing of file handles during streaming.\n>> +\n>> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n>> +and no further access to the buffers is permitted.\n>> +\n>> +Context\n>> +~~~~~~~\n>> +\n>> +Algorithm calls will always have the ``IPAContext`` available to them.\n>> +This context comprises of two parts:\n>> +\n>> +-  IPA Session Configuration\n>> +-  IPA Frame Context\n>> +\n>> +The session configuration structure ``IPASessionConfiguration``\n>> +represents constant parameters determined from either before streaming\n>> +commenced during ``configure()`` or updated parameters when processing\n>> +controls.\n>> +\n>> +The IPA Frame Context provides the storage for algorithms for a single\n>> +frame operation.\n>> +\n>> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n>> +queue to store historical state for each frame, allowing algorithms to\n>> +obtain and reference results of calculations which are deeply pipelined.\n>> +This may only be done if an algorithm needs to know the context that was\n>> +applied at the frame the statistics were produced for, rather than the\n>> +previous or current frame.\n>> +\n>> +Presently there is a single ``IPAFrameContext`` without historical data,\n>> +and the context is maintained and updated through successive 
processing\n>> +operations.\n>> +\n>> +Operating\n>> +~~~~~~~~~\n>> +\n>> +There are three main parts of interactions with the algorithms for the\n>> +IPU3 IPA to operate when running:\n>> +\n>> +-  configure()\n>> +-  processEvent(``EventFillParams``)\n> \n> \n> One liner explanation for EventFillParams is missing. It should be\n> present in Pre-frame preparation section, no? Can you please include a\n> brief line analogous to \"EventStatReady\" event in the Post-frame\n> completion section?\n\nAh yes, indeed they should both be covered in the same way, so that's\nmissing, thanks.\n\nI've updated it.\n\n\n\n> \n> Other than that, looks fine to me:\n> \n> Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>\n> \n>> +-  processEvent(``EventStatReady``)\n>> +\n>> +The configuration phase allows the pipeline-handler to inform the IPA of\n>> +the current stream configurations, which is then passed into each\n>> +algorithm to provide an opportunity to identify and track state of the\n>> +hardware, such as image size or ImgU pipeline configurations.\n>> +\n>> +Pre-frame preparation\n>> +~~~~~~~~~~~~~~~~~~~~~\n>> +\n>> +When configured, the IPA is notified by the pipeline handler of the\n>> +Camera ``start()`` event, after which incoming requests will be queued\n>> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) to\n>> +be populated for the ImgU. 
This is passed directly to each algorithm\n>> +through the ``prepare()`` call allowing the ISP configuration to be\n>> +updated for the needs of each component that the algorithm is\n>> +responsible for.\n>> +\n>> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n>> +structure that it modifies, and it should take care to ensure that any\n>> +structure set by a use flag is fully initialised to suitable values.\n>> +\n>> +The parameter buffer is returned to the pipeline handler through the\n>> +``ActionParamFilled`` event, and from there queued to the ImgU along\n>> +with a raw frame captured with the CIO2.\n>> +\n>> +Post-frame completion\n>> +~~~~~~~~~~~~~~~~~~~~~\n>> +\n>> +When the capture of an image is completed, and successfully processed\n>> +through the ImgU, the generated statistics buffer\n>> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n>> +``EventStatReady`` event. This provides the IPA with an opportunity to\n>> +examine the results of the ISP and run the calculations required by each\n>> +algorithm on the new data. The algorithms may require context from the\n>> +operations of other algorithms, for example, the AWB might choose to use\n>> +a scene brightness determined by the AGC. 
It is important that the\n>> +algorithms are ordered to ensure that required results are determined\n>> +before they are needed.\n>> +\n>> +The ordering of the algorithm processing is determined by their\n>> +placement in the ``IPU3::algorithms_`` ordered list.","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id ADB30BDC71\n\tfor <parsemail@patchwork.libcamera.org>;\n\tThu, 16 Sep 2021 12:36:49 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id BD59269188;\n\tThu, 16 Sep 2021 14:36:48 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 1450C6916F\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu, 16 Sep 2021 14:36:47 +0200 (CEST)","from [192.168.0.20]\n\t(cpc89244-aztw30-2-0-cust3082.18-1.cable.virginm.net [86.31.172.11])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 8A6382A5;\n\tThu, 16 Sep 2021 14:36:46 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"DogurN/N\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1631795806;\n\tbh=xAKEXlHaS2pplbkpUPl9U1wE0c/mMKFt111p2tyOVsw=;\n\th=From:To:References:Subject:Date:In-Reply-To:From;\n\tb=DogurN/Nc5TBT9bGGrHgPWd1zA3KHXkH+Wo4GkJgzaaDHjns0dtzCjEzlyzc76Jni\n\tVEJIgQkoaYrTVVilIp+nNulb/iQmuiPokBIbFmKTWlrIKOuNoEQYAPAC/5BlOlxo+r\n\tAY3gP9nx+E8GUGFZykNo5v+SSL3auM2mHsEvl4mg=","From":"Kieran Bingham <kieran.bingham@ideasonboard.com>","To":"Umang Jain <umang.jain@ideasonboard.com>,\n\tlibcamera 
devel <libcamera-devel@lists.libcamera.org>","References":"<20210912213017.702492-1-kieran.bingham@ideasonboard.com>\n\t<d985d46a-b9f4-01de-b0f9-9dc4fd1c15c9@ideasonboard.com>","Message-ID":"<e876f75a-ab01-b3c0-2e91-6a9b2c12f1c2@ideasonboard.com>","Date":"Thu, 16 Sep 2021 13:36:43 +0100","User-Agent":"Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101\n\tThunderbird/78.11.0","MIME-Version":"1.0","In-Reply-To":"<d985d46a-b9f4-01de-b0f9-9dc4fd1c15c9@ideasonboard.com>","Content-Type":"text/plain; charset=utf-8","Content-Language":"en-GB","Content-Transfer-Encoding":"8bit","Subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":19707,"web_url":"https://patchwork.libcamera.org/comment/19707/","msgid":"<7de7d46c-ec06-b45f-d7d0-6ac9b4eb7e9b@ideasonboard.com>","date":"2021-09-16T12:37:28","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","submitter":{"id":4,"url":"https://patchwork.libcamera.org/api/people/4/","name":"Kieran Bingham","email":"kieran.bingham@ideasonboard.com"},"content":"Hi Laurent,\n\n+cc JM for discussion point below...\n\nOn 14/09/2021 11:34, Laurent Pinchart wrote:\n> Hi Kieran,\n> \n> On 
Tue, Sep 14, 2021 at 11:13:39AM +0100, Kieran Bingham wrote:\n>> On 14/09/2021 06:37, Laurent Pinchart wrote:\n>>> On Sun, Sep 12, 2021 at 10:30:17PM +0100, Kieran Bingham wrote:\n>>>> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n>>>\n>>> s/3a/3A/\n>>>\n>>> Technically speaking we're only implementing 2 out of 3 at this point\n>>> :-)\n>>>\n>>>> Provide an overview document to describe it's operations, and provide a\n>>>\n>>> s/it's/its/\n>>>\n>>>> block diagram to help visualise how the components are put together to\n>>>> assist any new developers exploring the code.\n>>>>\n>>>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n>>>>\n>>>> ---\n>>>> This is really just an RFC: Particularly:\n>>>>\n>>>>  - Should this content actually live in Documentation?\n>>>>    or is it more suited to a page under src/ipa/ipu3/...\n>>>\n>>> Good question. We tend to document things in the code where possible.\n>>> Would there be any drawback from doing so in this case ?\n>>\n>> I don't think so - I think the question mostly comes from\n>> 'discoverability' of the documentation.\n>>\n>>\n>> Which is also a bigger question that we need to make it easier for\n>> readers to find the documentation that they desire.\n>>\n>> I've come across a great talk on this recently:\n>>   - What nobody tells you about documentation - Daniele Procida\n>>   - PyConAu\n>>   - https://2017.pycon-au.org/schedule/presentation/15/\n>>   - https://youtu.be/t4vKPhjcMZg\n>>\n>> And that's been spun into a website to document how to document ;-)\n>>   - https://diataxis.fr/\n> \n> Thanks, I'll have a look.\n> \n>> So, I think this could indeed live under src/ipa/ipu3 but it needs to be\n>> clear that this should be the first thing to read when looking into the IPA.\n> \n> Agreed. 
I could envision moving it to Documentation/ later, as part of a\n> longer document that uses the IPU3 IPA to teach how to write algorithms,\n> but at that point we'll start writing a book about algorithm design :-)\n\nFor now, for the v2 I will keep this as an RST document, but place it in\nsrc/ipa/ipu3/.\n\n\n>>>>  - This is not complete to document all functionality\n>>>>    but gives an overview of the current implementation\n>>>\n>>> Sounds good to me.\n>>>\n>>>>  - Is there anything anyone would find helpful for me to\n>>>>    extend/write about on this?\n>>>>\n>>>>  - It could likely get merged in with some of the work\n>>>>    that Jean-Michel is doing, and might duplicate some of\n>>>>    the parts he is documenting, but this was an aim to\n>>>>    write a single page overview as a getting started with\n>>>>    the IPU3 IPA ....\n>>>\n>>> I think it's *very* useful to have an overview.\n\nI'll keep this separate for now then.\n\n>>>\n>>>>  .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n>>>>  1 file changed, 144 insertions(+)\n>>>>  create mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n>>>>\n>>>> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst b/Documentation/guides/ipu3-ipa-design-guide.rst\n>>>> new file mode 100644\n>>>> index 000000000000..a1d0f13fbb00\n>>>> --- /dev/null\n>>>> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n>>>> @@ -0,0 +1,144 @@\n>>>> +IPU3 IPA Architecture Design and Overview\n>>>> +=========================================\n>>>> +\n>>>> +The IPU3 IPA is built as a modular and extensible framework with an\n>>>> +upper layer to manage the interactions with the pipeline handler, and\n>>>> +the image processing algorithms split to compartmentalise the processing\n>>>> +required for each accellerator cluster provided by the ImgU ISP.\n>>>\n>>> I've commented on the expression \"accelerator cluster\" when reviewing\n>>> Jean-Michel's series. 
I've then seen that it's mentioned in the IPU3\n>>> kernel documentation in one place, I suppose that's where it comes from.\n>>> From what I understand, the expression is used to refer to a processing\n>>> block (or sub-block) that is fully implemented in hardware, as opposed\n>>> as being implemented partly or fully in the ImgU firmware. I don't think\n>>> that's relevant from an IPA point of view, it's an internal\n>>> implementation detail of the ImgU. I'd thus use \"processing block\" or a\n>>> similar term instead.\n>>\n>> https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/tree/drivers/staging/media/ipu3/include/uapi/intel-ipu3.h#n2428\n>>\n>> Describes it as:\n>>\n>> \"\"\"\n>> ACC refers to the HW cluster containing all Fixed Functions (FFs). Each\n>> FF implements a specific algorithm.\n>> \"\"\"\n>>\n>> So in fact, yes - the term Accelerator Cluster is a 'group' of all the\n>> hardware accelerator functions, rather than each individual one.\n>>\n>> Each individual function (fixed function) can indeed be better named as\n>> a processing block.\n>>\n>>\n>> Lets make this more like\n>>\n>> \"\"\"\n>> The IPU3 IPA is built as a modular and extensible framework with an\n>> upper layer to manage the interactions with the pipeline handler, and\n>> the image processing algorithms split to compartmentalise the processing\n>> required for each processing block, making use of the fixed-function\n>> accelerators provided by the ImgU ISP.\n>> \"\"\"\n> \n> Sounds good to me. 
Let's standardize on the same \"processing block\"\n> terminology in the IPU3 IPA documentation series too.\n\nOk,\n\n+Cc: Jean Michel to highlight this discussion point.\n\n> \n>>>> +\n>>>> +The core IPU3 class is responsible for initialisation and construction\n>>>> +of the algorithm components, processing controls set by the requests\n>>>> +from applications, and managing events from the Pipeline handler.\n>>>\n>>> s/Pipeline/pipeline/\n>>>\n>>>> +\n>>>> +::\n>>>> +\n>>>> +        ┌───────────────────────────────────────────┐\n>>>> +        │      IPU3 Pipeline Handler                │\n>>>> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n>>>> +        │   │        │    │        │    │        │  │\n>>>> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n>>>> +        │   │        │    │        │    │        │  │\n>>>> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter Buffer\n>>>> +        │                                 │P   │    │    S: Statistics Buffer\n>>>> +        │                                 │    │S   │\n>>>> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n>>>> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n>>>> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3: mapBuffers(), start()\n>>>> +          ▼   ▼   ▼      ▼ │  ▼ │  ▼ │  ▼ │      ▼       4: processEvent()\n>>>> +        ┌──────────────────┴────┴────┴────┴─────────┐    5: stop(), unmapBuffers()\n>>>> +        │ IPU3 IPA                                  │\n>>>> +        │                 ┌───────────────────────┐ │\n>>>> +        │ ┌───────────┐   │ Algorithms            │ │\n>>>> +        │ │IPAContext │   │          ┌─────────┐  │ │\n>>>> +        │ │ ┌───────┐ │   │          │ ...     
│  │ │\n>>>> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n>>>> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n>>>> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n>>>> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n>>>> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n>>>> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n>>>> +        │ │ │       │ │   │    │         │        │ │\n>>>> +        │ │ └───────┘ │   │    └─────────┘        │ │\n>>>> +        │ └───────────┘   └───────────────────────┘ │\n>>>> +        └───────────────────────────────────────────┘\n>>>> +         SC: IPASessionConfiguration\n>>>> +         FC: IPAFrameContext(s)\n>>>> +\n>>>> +The IPA instance is constructed and initialised at the point a Camera is\n>>>> +created by the IPU3 pipeline handler. The initialisation call provides\n>>\n>> \"Camera\" is capitalised as a libcamera noun. Shouldn't Pipeline Handler\n>> also be capitalised as it's a libcamera named thing?\n>>\n>> (From your comment above about s/Pipeline/pipeline/. I wonder if instead\n>> it should be capitalised there and here too.\n> \n> In Doxygen capitalization generates links to classes, so we have to be\n> careful about writing \"Camera\" for instance. When it comes to pipeline\n> handlers, \"Pipeline Handler\" won't generate a link, and I wouldn't write\n> \"PipelineHandler\" here anyway.\n> \n> I'm sure there are conventions for all this in documentation. 
I usually\n> capitalize words that have special meanings the first time they're\n> introduced only, but I don't know if that's a good thing to do.\n> \n>>>> +details about which Camera Sensor is being used, and the controls that\n>>>\n>>> s/Camera Sensor/camera sensor/ ?\n>>\n>> Now that one is awkward, as it's both a libcamera noun, and a real world\n>> object.\n>>\n>> In this context, I believe camera sensor is correct, (as you've\n>> suggested) as it's the external world camera sensor details.\n>>\n>>>> +it has available, along with their defaults and ranges.\n>>>\n>>> s/defaults/default values/\n>>>\n>>>> +\n>>>> +Buffers\n>>>> +~~~~~~~\n>>>> +\n>>>> +The IPA will have Parameter and Statistics buffers shared with it from\n>>>> +the IPU3 Pipeline handler. These buffers will be passed to the IPA\n>>>> +before the ``start()`` operation occurs.\n>>>> +\n>>>> +The IPA will map the buffers into CPU accessible memory, associated with\n>>>\n>>> s/CPU accessible/CPU-accessible/\n>>>\n>>>> +a buffer ID, and further events for sending or receiving parameter and\n>>>> +statistics buffers will reference the ID to avoid expensive memory\n>>>> +mapping operations, or the passing of file handles during streaming.\n>>>> +\n>>>> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n>>>> +and no further access to the buffers is permitted.\n>>>> +\n>>>> +Context\n>>>> +~~~~~~~\n>>>> +\n>>>> +Algorithm calls will always have the ``IPAContext`` available to them.\n>>>> +This context comprises of two parts:\n>>>> +\n>>>> +-  IPA Session Configuration\n>>>> +-  IPA Frame Context\n>>>> +\n>>>> +The session configuration structure ``IPASessionConfiguration``\n>>>> +represents constant parameters determined from either before streaming\n>>>> +commenced during ``configure()`` or updated parameters when processing\n>>>> +controls.\n>>>\n>>> I'm not sure about the last part, I thought the session configuration\n>>> was meant to be constant after configure() ?\n>>\n>> I 
would expect updates from controls to update the 'active session\n>> configuration'.\n>>\n>> Of course some of that session configuration will be constant, and not\n>> change (I don't think we'll change the resolution for instance) - but I\n>> expect that some parameters will be set such as whether AE is enabled or\n>> locked.\n>>\n>> That to me is a session configuration parameter. But it can change at\n>> runtime - and while it remains the same, every new frame will have that\n>> configuration applied.\n>>\n>>\n>> Whereas the FrameContexts are about the context that is handled for each\n>> frame, and any context shared between algorithms , and is much more dynamic.\n> \n> This makes me think that we may want to introduce a third structure to\n> group state that isn't per-frame. I recall designing\n> IPASessionConfiguration as something that will not change during the\n> session. Maybe I recall it wrong, but I like the idea of keeping it\n\nBut ... state that isn't per frame is the ... session ? and thus that\nstate is stored as the SessionConfiguration ...\n\nAt the moment I anticipate we'll need to store things like control\nstate. I.e. is AE enabled or manual for instance. And if manual - what\nis the current fixed manual value.\n\nI expected that would be the session configuration, as in it's the\ncurrent active settings ... but maybe a third group would come up, I\nthink we'll find out when we actually add the controls.\n\n\n> static. 
Separating configuration that is set at the beginning and\n> doesn't change from controls that are updated dynamically seems a good\n> design principle to me.\n\n\nI'll delete the last part:\n\"or updated parameters when processing controls.\"\n\n\nWe can revisit this when we add controls.\n\n\n\n>>>> +The IPA Frame Context provides the storage for algorithms for a single\n>>>> +frame operation.\n>>>> +\n>>>> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n>>>> +queue to store historical state for each frame, allowing algorithms to\n>>>> +obtain and reference results of calculations which are deeply pipelined.\n>>>> +This may only be done if an algorithm needs to know the context that was\n>>>> +applied at the frame the statistics were produced for, rather than the\n>>>> +previous or current frame.\n>>>> +\n>>>> +Presently there is a single ``IPAFrameContext`` without historical data,\n>>>> +and the context is maintained and updated through successive processing\n>>>> +operations.\n>>>> +\n>>>> +Operating\n>>>> +~~~~~~~~~\n>>>> +\n>>>> +There are three main parts of interactions with the algorithms for the\n>>>> +IPU3 IPA to operate when running:\n>>>> +\n>>>> +-  configure()\n>>>> +-  processEvent(``EventFillParams``)\n>>>> +-  processEvent(``EventStatReady``)\n>>>> +\n>>>> +The configuration phase allows the pipeline-handler to inform the IPA of\n>>>> +the current stream configurations, which is then passed into each\n>>>> +algorithm to provide an opportunity to identify and track state of the\n>>>> +hardware, such as image size or ImgU pipeline configurations.\n>>>> +\n>>>> +Pre-frame preparation\n>>>> +~~~~~~~~~~~~~~~~~~~~~\n>>>> +\n>>>> +When configured, the IPA is notified by the pipeline handler of the\n>>>> +Camera ``start()`` event, after which incoming requests will be queued\n>>>> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) to\n>>>> +be populated for the ImgU. 
This is passed directly to each algorithm\n>>>> +through the ``prepare()`` call allowing the ISP configuration to be\n>>>> +updated for the needs of each component that the algorithm is\n>>>> +responsible for.\n>>>> +\n>>>> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n>>>> +structure that it modifies, and it should take care to ensure that any\n>>>> +structure set by a use flag is fully initialised to suitable values.\n>>>> +\n>>>> +The parameter buffer is returned to the pipeline handler through the\n>>>> +``ActionParamFilled`` event, and from there queued to the ImgU along\n>>>> +with a raw frame captured with the CIO2.\n>>>\n>>> This reminds me that we should turn the operations (actions and events)\n>>> into separate functions in the IPAIPU3Interface. Out of scope for this\n>>> patch of course.\n>>\n>> You mean directly defining them in Mojo instead of passing them through\n>> processEvent with an event flag?\n> \n> Yes.\n\nThat sounds like a good thing to do, and may indeed make the interfaces\nclearer I think.\n\nWe may have to further document the expectations of those calls for both\ndirections. I.e. EventFillParams (or its new direct call equivalent) is\nexpected to initiate the return of the buffer\nthrough ActionParamFilled (or its renamed direct call equivalent).\n\n\n\n>> I think that would probably be better indeed, I don't see much added\n>> value to having processEvent() ... 
(with the small exception that it\n>> partially shows which functions are called while streaming, but we could\n>> also document the async/streaming expectations anyway).\n> \n> processEvent() dates back from when we tried to have a single generic\n> API for IPAs.\n\nAnd we clearly have per-pipeline-handler IPAs now.\n\n\n>>>> +\n>>>> +Post-frame completion\n>>>> +~~~~~~~~~~~~~~~~~~~~~\n>>>> +\n>>>> +When the capture of an image is completed, and successfully processed\n>>>> +through the ImgU, the generated statistics buffer\n>>>> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n>>>> +``EventStatReady`` event. This provides the IPA with an opportunity to\n>>>> +examine the results of the ISP and run the calculations required by each\n>>>> +algorithm on the new data. The algorithms may require context from the\n>>>> +operations of other algorithms, for example, the AWB might choose to use\n>>>> +a scene brightness determined by the AGC. It is important that the\n>>>> +algorithms are ordered to ensure that required results are determined\n>>>> +before they are needed.\n>>>> +\n>>>> +The ordering of the algorithm processing is determined by their\n>>>> +placement in the ``IPU3::algorithms_`` ordered list.\n>>>\n>>> I expect interesting research to happen in this area :-)\n>>\n>> Yes, well it's too early to state any more than that currently I think.\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id D9A45BDC71\n\tfor <parsemail@patchwork.libcamera.org>;\n\tThu, 16 Sep 2021 12:37:34 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id A2DFE6918A;\n\tThu, 16 Sep 2021 14:37:34 +0200 (CEST)","from 
perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id A99026916F\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu, 16 Sep 2021 14:37:33 +0200 (CEST)","from [192.168.0.20]\n\t(cpc89244-aztw30-2-0-cust3082.18-1.cable.virginm.net [86.31.172.11])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 31B752A5;\n\tThu, 16 Sep 2021 14:37:33 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"S9sM5ZNX\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1631795853;\n\tbh=aoXjJ11/rnZp69HC3XFf/7xq/VM7y6Gw4JnUhlhGt+8=;\n\th=From:To:Cc:References:Subject:Date:In-Reply-To:From;\n\tb=S9sM5ZNXRAszO1rI0VSK56WZTU3YiYU/aIn59Rty9TzPbiN4ZkWwotAeqscSFCJWK\n\tK10FSAkvtc/YZcG6A4GPDixOj7cgOyyuYySp3aMCRJqfuGgLrkHnE4F9xJz1oGZZgX\n\tFWarRnkckKFMgtphrXNOVrbEQakEdE/Mtf6LlBeg=","From":"Kieran Bingham <kieran.bingham@ideasonboard.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","References":"<20210912213017.702492-1-kieran.bingham@ideasonboard.com>\n\t<YUA1AzpA4K3fNiat@pendragon.ideasonboard.com>\n\t<9f2e1956-542c-a3d6-46eb-ee615850ce8d@ideasonboard.com>\n\t<YUB6sMziBTpEToDB@pendragon.ideasonboard.com>","Message-ID":"<7de7d46c-ec06-b45f-d7d0-6ac9b4eb7e9b@ideasonboard.com>","Date":"Thu, 16 Sep 2021 13:37:28 +0100","User-Agent":"Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101\n\tThunderbird/78.11.0","MIME-Version":"1.0","In-Reply-To":"<YUB6sMziBTpEToDB@pendragon.ideasonboard.com>","Content-Type":"text/plain; charset=utf-8","Content-Language":"en-GB","Content-Transfer-Encoding":"8bit","Subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA 
Design\n\tguide","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":19708,"web_url":"https://patchwork.libcamera.org/comment/19708/","msgid":"<c976db30-c07b-d706-ca6f-8c25b37f467b@ideasonboard.com>","date":"2021-09-16T12:46:23","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","submitter":{"id":75,"url":"https://patchwork.libcamera.org/api/people/75/","name":"Jean-Michel Hautbois","email":"jeanmichel.hautbois@ideasonboard.com"},"content":"Hi Kieran,\n\nOn 16/09/2021 14:37, Kieran Bingham wrote:\n> Hi Laurent,\n> \n> +cc JM for discussion point below...\n> \n> On 14/09/2021 11:34, Laurent Pinchart wrote:\n>> Hi Kieran,\n>>\n>> On Tue, Sep 14, 2021 at 11:13:39AM +0100, Kieran Bingham wrote:\n>>> On 14/09/2021 06:37, Laurent Pinchart wrote:\n>>>> On Sun, Sep 12, 2021 at 10:30:17PM +0100, Kieran Bingham wrote:\n>>>>> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n>>>>\n>>>> s/3a/3A/\n>>>>\n>>>> Technically speaking we're only implementing 2 out of 3 at this point\n>>>> :-)\n>>>>\n>>>>> Provide an overview document to describe it's operations, and provide a\n>>>>\n>>>> s/it's/its/\n>>>>\n>>>>> block diagram to help 
visualise how the components are put together to\n>>>>> assist any new developers exploring the code.\n>>>>>\n>>>>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n>>>>>\n>>>>> ---\n>>>>> This is really just an RFC: Particularly:\n>>>>>\n>>>>>  - Should this content actually live in Documentation?\n>>>>>    or is it more suited to a page under src/ipa/ipu3/...\n>>>>\n>>>> Good question. We tend to document things in the code where possible.\n>>>> Would there be any drawback from doing so in this case ?\n>>>\n>>> I don't think so - I think the question mostly comes from\n>>> 'discoverability' of the documentation.\n>>>\n>>>\n>>> Which is also a bigger question that we need to make it easier for\n>>> readers to find the documentation that they desire.\n>>>\n>>> I've come across a great talk on this recently:\n>>>   - What nobody tells you about documentation - Daniele Procida\n>>>   - PyConAu\n>>>   - https://2017.pycon-au.org/schedule/presentation/15/\n>>>   - https://youtu.be/t4vKPhjcMZg\n>>>\n>>> And that's been spun into a website to document how to document ;-)\n>>>   - https://diataxis.fr/\n>>\n>> Thanks, I'll have a look.\n>>\n>>> So, I think this could indeed live under src/ipa/ipu3 but it needs to be\n>>> clear that this should be the first thing to read when looking into the IPA.\n>>\n>> Agreed. 
I could envision moving it to Documentation/ later, as part of a\n>> longer document that uses the IPU3 IPA to teach how to write algorithms,\n>> but at that point we'll start writing a book about algorithm design :-)\n> \n> For now, for the v2 I will keep this as an RST document, but place it in\n> src/ipa/ipu3/.\n> \n> \n>>>>>  - This is not complete to document all functionality\n>>>>>    but gives an overview of the current implementation\n>>>>\n>>>> Sounds good to me.\n>>>>\n>>>>>  - Is there anything anyone would find helpful for me to\n>>>>>    extend/write about on this?\n>>>>>\n>>>>>  - It could likely get merged in with some of the work\n>>>>>    that Jean-Michel is doing, and might duplicate some of\n>>>>>    the parts he is documenting, but this was an aim to\n>>>>>    write a single page overview as a getting started with\n>>>>>    the IPU3 IPA ....\n>>>>\n>>>> I think it's *very* useful to have an overview.\n> \n> I'll keep this separate for now then.\n> \n>>>>\n>>>>>  .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n>>>>>  1 file changed, 144 insertions(+)\n>>>>>  create mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n>>>>>\n>>>>> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst b/Documentation/guides/ipu3-ipa-design-guide.rst\n>>>>> new file mode 100644\n>>>>> index 000000000000..a1d0f13fbb00\n>>>>> --- /dev/null\n>>>>> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n>>>>> @@ -0,0 +1,144 @@\n>>>>> +IPU3 IPA Architecture Design and Overview\n>>>>> +=========================================\n>>>>> +\n>>>>> +The IPU3 IPA is built as a modular and extensible framework with an\n>>>>> +upper layer to manage the interactions with the pipeline handler, and\n>>>>> +the image processing algorithms split to compartmentalise the processing\n>>>>> +required for each accellerator cluster provided by the ImgU ISP.\n>>>>\n>>>> I've commented on the expression \"accelerator cluster\" when reviewing\n>>>> 
Jean-Michel's series. I've then seen that it's mentioned in the IPU3\n>>>> kernel documentation in one place, I suppose that's where it comes from.\n>>>> From what I understand, the expression is used to refer to a processing\n>>>> block (or sub-block) that is fully implemented in hardware, as opposed\n>>>> as being implemented partly or fully in the ImgU firmware. I don't think\n>>>> that's relevant from an IPA point of view, it's an internal\n>>>> implementation detail of the ImgU. I'd thus use \"processing block\" or a\n>>>> similar term instead.\n>>>\n>>> https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/tree/drivers/staging/media/ipu3/include/uapi/intel-ipu3.h#n2428\n>>>\n>>> Describes it as:\n>>>\n>>> \"\"\"\n>>> ACC refers to the HW cluster containing all Fixed Functions (FFs). Each\n>>> FF implements a specific algorithm.\n>>> \"\"\"\n>>>\n>>> So in fact, yes - the term Accelerator Cluster is a 'group' of all the\n>>> hardware accelerator functions, rather than each individual one.\n>>>\n>>> Each individual function (fixed function) can indeed be better named as\n>>> a processing block.\n>>>\n>>>\n>>> Lets make this more like\n>>>\n>>> \"\"\"\n>>> The IPU3 IPA is built as a modular and extensible framework with an\n>>> upper layer to manage the interactions with the pipeline handler, and\n>>> the image processing algorithms split to compartmentalise the processing\n>>> required for each processing block, making use of the fixed-function\n>>> accelerators provided by the ImgU ISP.\n>>> \"\"\"\n>>\n>> Sounds good to me. Let's standardize on the same \"processing block\"\n>> terminology in the IPU3 IPA documentation series too.\n> \n> Ok,\n> \n> +Cc: Jean Michel to highlight this discussion point.\n> \n\nThis terminology is indeed coming from the kernel. 
I agree with the\n\"processing block\" term ;-).\n\n>>\n>>>>> +\n>>>>> +The core IPU3 class is responsible for initialisation and construction\n>>>>> +of the algorithm components, processing controls set by the requests\n>>>>> +from applications, and managing events from the Pipeline handler.\n>>>>\n>>>> s/Pipeline/pipeline/\n>>>>\n>>>>> +\n>>>>> +::\n>>>>> +\n>>>>> +        ┌───────────────────────────────────────────┐\n>>>>> +        │      IPU3 Pipeline Handler                │\n>>>>> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n>>>>> +        │   │        │    │        │    │        │  │\n>>>>> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n>>>>> +        │   │        │    │        │    │        │  │\n>>>>> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter Buffer\n>>>>> +        │                                 │P   │    │    S: Statistics Buffer\n>>>>> +        │                                 │    │S   │\n>>>>> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n>>>>> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n>>>>> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3: mapBuffers(), start()\n>>>>> +          ▼   ▼   ▼      ▼ │  ▼ │  ▼ │  ▼ │      ▼       4: processEvent()\n>>>>> +        ┌──────────────────┴────┴────┴────┴─────────┐    5: stop(), unmapBuffers()\n>>>>> +        │ IPU3 IPA                                  │\n>>>>> +        │                 ┌───────────────────────┐ │\n>>>>> +        │ ┌───────────┐   │ Algorithms            │ │\n>>>>> +        │ │IPAContext │   │          ┌─────────┐  │ │\n>>>>> +        │ │ ┌───────┐ │   │          │ ...     
│  │ │\n>>>>> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n>>>>> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n>>>>> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n>>>>> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n>>>>> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n>>>>> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n>>>>> +        │ │ │       │ │   │    │         │        │ │\n>>>>> +        │ │ └───────┘ │   │    └─────────┘        │ │\n>>>>> +        │ └───────────┘   └───────────────────────┘ │\n>>>>> +        └───────────────────────────────────────────┘\n>>>>> +         SC: IPASessionConfiguration\n>>>>> +         FC: IPAFrameContext(s)\n>>>>> +\n>>>>> +The IPA instance is constructed and initialised at the point a Camera is\n>>>>> +created by the IPU3 pipeline handler. The initialisation call provides\n>>>\n>>> \"Camera\" is capitalised as a libcamera noun. Shouldn't Pipeline Handler\n>>> also be capitalised as it's a libcamera named thing?\n>>>\n>>> (From your comment above about s/Pipeline/pipeline/. I wonder if instead\n>>> it should be capitalised there and here too.\n>>\n>> In Doxygen capitalization generates links to classes, so we have to be\n>> careful about writing \"Camera\" for instance. When it comes to pipeline\n>> handlers, \"Pipeline Handler\" won't generate a link, and I wouldn't write\n>> \"PipelineHandler\" here anyway.\n>>\n>> I'm sure there are conventions for all this in documentation. 
I usually\n>> capitalize words that have special meanings the first time they're\n>> introduced only, but I don't know if that's a good thing to do.\n>>\n>>>>> +details about which Camera Sensor is being used, and the controls that\n>>>>\n>>>> s/Camera Sensor/camera sensor/ ?\n>>>\n>>> Now that one is awkward, as it's both a libcamera noun, and a real world\n>>> object.\n>>>\n>>> In this context, I believe camera sensor is correct, (as you've\n>>> suggested) as it's the external world camera sensor details.\n>>>\n>>>>> +it has available, along with their defaults and ranges.\n>>>>\n>>>> s/defaults/default values/\n>>>>\n>>>>> +\n>>>>> +Buffers\n>>>>> +~~~~~~~\n>>>>> +\n>>>>> +The IPA will have Parameter and Statistics buffers shared with it from\n>>>>> +the IPU3 Pipeline handler. These buffers will be passed to the IPA\n>>>>> +before the ``start()`` operation occurs.\n>>>>> +\n>>>>> +The IPA will map the buffers into CPU accessible memory, associated with\n>>>>\n>>>> s/CPU accessible/CPU-accessible/\n>>>>\n>>>>> +a buffer ID, and further events for sending or receiving parameter and\n>>>>> +statistics buffers will reference the ID to avoid expensive memory\n>>>>> +mapping operations, or the passing of file handles during streaming.\n>>>>> +\n>>>>> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n>>>>> +and no further access to the buffers is permitted.\n>>>>> +\n>>>>> +Context\n>>>>> +~~~~~~~\n>>>>> +\n>>>>> +Algorithm calls will always have the ``IPAContext`` available to them.\n>>>>> +This context comprises of two parts:\n>>>>> +\n>>>>> +-  IPA Session Configuration\n>>>>> +-  IPA Frame Context\n>>>>> +\n>>>>> +The session configuration structure ``IPASessionConfiguration``\n>>>>> +represents constant parameters determined from either before streaming\n>>>>> +commenced during ``configure()`` or updated parameters when processing\n>>>>> +controls.\n>>>>\n>>>> I'm not sure about the last part, I thought the session configuration\n>>>> was 
meant to be constant after configure() ?\n>>>\n>>> I would expect updates from controls to update the 'active session\n>>> configuration'.\n>>>\n>>> Of course some of that session configuration will be constant, and not\n>>> change (I don't think we'll change the resolution for instance) - but I\n>>> expect that some parameters will be set such as whether AE is enabled or\n>>> locked.\n>>>\n>>> That to me is a session configuration parameter. But it can change at\n>>> runtime - and while it remains the same, every new frame will have that\n>>> configuration applied.\n>>>\n>>>\n>>> Whereas the FrameContexts are about the context that is handled for each\n>>> frame, and any context shared between algorithms , and is much more dynamic.\n>>\n>> This makes me think that we may want to introduce a third structure to\n>> group state that isn't per-frame. I recall designing\n>> IPASessionConfiguration as something that will not change during the\n>> session. Maybe I recall it wrong, but I like the idea of keeping it\n> \n> But ... state that isn't per frame is the ... session ? and thus that\n> state is stored as the SessionConfiguration ...\n> \n> At the moment I anticipate we'll need to store things like control\n> state. I.e. is AE enabled or manual for instance. And if manual - what\n> is the current fixed manual value.\n> \n> I expected that would be the session configuration, as in it's the\n> current active settings ... but maybe a third group would come up, I\n> think we'll find out when we actually add the controls.\n> \n> \n>> static. 
Separating configuration that is set at the beginning and\n>> doesn't change from controls that are updated dynamically seems a good\n>> design principle to me.\n> \n> \n> I'll delete the last part:\n> \"or updated parameters when processing controls.\"\n> \n> \n> We can revisit this when we add controls.\n> \n> \n> \n>>>>> +The IPA Frame Context provides the storage for algorithms for a single\n>>>>> +frame operation.\n>>>>> +\n>>>>> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n>>>>> +queue to store historical state for each frame, allowing algorithms to\n>>>>> +obtain and reference results of calculations which are deeply pipelined.\n>>>>> +This may only be done if an algorithm needs to know the context that was\n>>>>> +applied at the frame the statistics were produced for, rather than the\n>>>>> +previous or current frame.\n>>>>> +\n>>>>> +Presently there is a single ``IPAFrameContext`` without historical data,\n>>>>> +and the context is maintained and updated through successive processing\n>>>>> +operations.\n>>>>> +\n>>>>> +Operating\n>>>>> +~~~~~~~~~\n>>>>> +\n>>>>> +There are three main parts of interactions with the algorithms for the\n>>>>> +IPU3 IPA to operate when running:\n>>>>> +\n>>>>> +-  configure()\n>>>>> +-  processEvent(``EventFillParams``)\n>>>>> +-  processEvent(``EventStatReady``)\n>>>>> +\n>>>>> +The configuration phase allows the pipeline-handler to inform the IPA of\n>>>>> +the current stream configurations, which is then passed into each\n>>>>> +algorithm to provide an opportunity to identify and track state of the\n>>>>> +hardware, such as image size or ImgU pipeline configurations.\n>>>>> +\n>>>>> +Pre-frame preparation\n>>>>> +~~~~~~~~~~~~~~~~~~~~~\n>>>>> +\n>>>>> +When configured, the IPA is notified by the pipeline handler of the\n>>>>> +Camera ``start()`` event, after which incoming requests will be queued\n>>>>> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) to\n>>>>> +be populated 
for the ImgU. This is passed directly to each algorithm\n>>>>> +through the ``prepare()`` call allowing the ISP configuration to be\n>>>>> +updated for the needs of each component that the algorithm is\n>>>>> +responsible for.\n>>>>> +\n>>>>> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n>>>>> +structure that it modifies, and it should take care to ensure that any\n>>>>> +structure set by a use flag is fully initialised to suitable values.\n>>>>> +\n>>>>> +The parameter buffer is returned to the pipeline handler through the\n>>>>> +``ActionParamFilled`` event, and from there queued to the ImgU along\n>>>>> +with a raw frame captured with the CIO2.\n>>>>\n>>>> This reminds me that we should turn the operations (actions and events)\n>>>> into separate functions in the IPAIPU3Interface. Out of scope for this\n>>>> patch of course.\n>>>\n>>> You mean directly defining them in Mojo instead of passing them through\n>>> processEvent with an event flag?\n>>\n>> Yes.\n> \n> That sounds like a good thing to do, and may indeed make the interfaces\n> clearer I think.\n> \n> We may have to further document the expectations of those calls for both\n> directions. I.e. EventFillParams (or its new direct call equivalent) is\n> expected to call or initiate the calling of a return of the buffer\n> through ActionParamFilled (or its renamed direct call equivalent).\n> \n> \n> \n>>> I think that would probably be better indeed, I don't see much added\n>>> value to having processEvent() ... 
(with the small exception that it\n>>> partially shows which functions are called while streaming, but we could\n>>> also document the async/streaming expectations anyway).\n>>\n>> processEvent() dates back from when we tried to have a single generic\n>> API for IPAs.\n> \n> And we clearly have per-pipeline-handler IPAs now.\n> \n> \n>>>>> +\n>>>>> +Post-frame completion\n>>>>> +~~~~~~~~~~~~~~~~~~~~~\n>>>>> +\n>>>>> +When the capture of an image is completed, and successfully processed\n>>>>> +through the ImgU, the generated statistics buffer\n>>>>> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n>>>>> +``EventStatReady`` event. This provides the IPA with an opportunity to\n>>>>> +examine the results of the ISP and run the calculations required by each\n>>>>> +algorithm on the new data. The algorithms may require context from the\n>>>>> +operations of other algorithms, for example, the AWB might choose to use\n>>>>> +a scene brightness determined by the AGC. It is important that the\n>>>>> +algorithms are ordered to ensure that required results are determined\n>>>>> +before they are needed.\n>>>>> +\n>>>>> +The ordering of the algorithm processing is determined by their\n>>>>> +placement in the ``IPU3::algorithms_`` ordered list.\n>>>>\n>>>> I expect interesting research to happen in this area :-)\n>>>\n>>> Yes, well it's too early to state any more than that currently I think.\n>>"},{"id":19710,"web_url":"https://patchwork.libcamera.org/comment/19710/","msgid":"<YUNJgCzkUhi8c8Jm@pendragon.ideasonboard.com>","date":"2021-09-16T13:41:20","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Kieran,\n\nOn Thu, Sep 16, 2021 at 01:36:43PM +0100, Kieran Bingham wrote:\n> On 13/09/2021 15:29, Umang Jain wrote:\n> > On 9/13/21 3:00 AM, Kieran Bingham wrote:\n> >> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n> > 3a or 3A?\n> >>\n> >> Provide an overview document to describe it's operations, and provide a\n> >> block diagram to help visualise how the components are put together to\n> >> assist any new developers exploring the code.\n> >>\n> >> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n> >>\n> >> ---\n> >> This is really just an RFC: Particularly:\n> >>\n> >>   - Should this content actually live in Documentation?\n> >>     or is it more suited to a page under 
src/ipa/ipu3/...\n> > \n> > \n> > I would probably keep it under Documentation/ but there is already an IPA\n> > writing guide. So maybe we need a new dir, to document overviews of\n> > existing IPAs?\n> \n> Indeed, the existing document really describes the IPA IPC interfaces, I\n> wonder if it should be renamed ...\n> \n> The aim of this is to provide some overview on how the IPU3 IPA operates\n> from a higher level.\n> \n> Along with the discussions with Laurent, I'm going to move this to\n> src/ipa/ipu3/ipu3-ipa-design-guide.rst for now.\n\nI had assumed it would be moved to ipu3.cpp. Seems like explicit\ncommunication would work better than assumptions.\n\nI'm not sure I want to set a precedent of storing .rst documentation\nsomewhere else than in the Documentation/ directory. If you want to keep\nit as .rst, I think it should stay there, and get compiled to catch\nerrors. My preference would be ipu3.cpp though.\n\n> That will keep the overview document in the actual component for now.\n> \n> It may be that we write further 'generic' IPA design guides, which may\n> take inspiration from this, and could live in Documentation, but I think\n> I/we need to spend some dedicated cycles on how we clean up our\n> documentation anyway, so I'd rather keep this file close to its\n> implementation for now.\n> \n> >>\n> >>   - This is not complete to document all functionality\n> >>     but gives an overview of the current implementation\n> >>\n> >>   - Is there anything anyone would find helpful for me to\n> >>     extend/write about on this?\n> >>\n> >>   - It could likely get merged in with some of the work\n> >>     that Jean-Michel is doing, and might duplicate some of\n> >>     the parts he is documenting, but this was an aim to\n> >>     write a single page overview as a getting started with\n> >>     the IPU3 IPA ....\n> >>\n> >>\n> >>   .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n> >>   1 file changed, 144 insertions(+)\n> >>   create 
mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n> >>\n> >> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst\n> >> b/Documentation/guides/ipu3-ipa-design-guide.rst\n> >> new file mode 100644\n> >> index 000000000000..a1d0f13fbb00\n> >> --- /dev/null\n> >> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n> >> @@ -0,0 +1,144 @@\n> >> +IPU3 IPA Architecture Design and Overview\n> >> +=========================================\n> >> +\n> >> +The IPU3 IPA is built as a modular and extensible framework with an\n> >> +upper layer to manage the interactions with the pipeline handler, and\n> >> +the image processing algorithms split to compartmentalise the processing\n> >> +required for each accellerator cluster provided by the ImgU ISP.\n> >> +\n> >> +The core IPU3 class is responsible for initialisation and construction\n> >\n> > s/IPU3/IPAIPU3 maybe?\n> >\n> >> +of the algorithm components, processing controls set by the requests\n> >> +from applications, and managing events from the Pipeline handler.\n> >> +\n> >> +::\n> >> +\n> >> +        ┌───────────────────────────────────────────┐\n> >> +        │      IPU3 Pipeline Handler                │\n> >> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n> >> +        │   │        │    │        │    │        │  │\n> >> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n> >> +        │   │        │    │        │    │        │  │\n> >> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter Buffer\n> >> +        │                                 │P   │    │    S: Statistics Buffer\n> >> +        │                                 │    │S   │\n> >> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n> >> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n> >> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3: mapBuffers(), start()\n> >> +          ▼   ▼   ▼      ▼ │  ▼ │  ▼ │  ▼ │      ▼       4: processEvent()\n> >> +        
┌──────────────────┴────┴────┴────┴─────────┐    5: stop(), unmapBuffers()\n> >> +        │ IPU3 IPA                                  │\n> >> +        │                 ┌───────────────────────┐ │\n> >> +        │ ┌───────────┐   │ Algorithms            │ │\n> >> +        │ │IPAContext │   │          ┌─────────┐  │ │\n> >> +        │ │ ┌───────┐ │   │          │ ...     │  │ │\n> >> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n> >> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n> >> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n> >> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n> >> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n> >> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n> >> +        │ │ │       │ │   │    │         │        │ │\n> >> +        │ │ └───────┘ │   │    └─────────┘        │ │\n> >> +        │ └───────────┘   └───────────────────────┘ │\n> >> +        └───────────────────────────────────────────┘\n> >> +         SC: IPASessionConfiguration\n> >> +         FC: IPAFrameContext(s)\n> >> +\n> >> +The IPA instance is constructed and initialised at the point a Camera is\n> >> +created by the IPU3 pipeline handler. The initialisation call provides\n> >> +details about which Camera Sensor is being used, and the controls that\n> >> +it has available, along with their defaults and ranges.\n> > \n> > Brief line about how sensor controls are used in IPA will be useful?\n> > This will explain \"why\" sensor controls are passed to IPA and what IPA\n> > does with it (for e.g gather necessary info to run algorithms).\n> \n> I have wondered about putting a CameraSensorHelper box in the IPA too,\n> but I thought that would be too cluttered, and I don't think it adds\n> much to convey the context of the running IPA (at least for the moment) ...\n> \n> But it might certainly be worth explaining about how controls are also\n> sent to the pipeline handler. 
I'll see if I can add a section:\n> \n> Controls\n> ~~~~~~~~\n> \n> The AutoExposure and AutoGain (AGC) algorithm differs slightly from the\n> others as it requires operating directly on the sensor, as opposed to\n> through the ImgU ISP. To support this, there is a dedicated action\n> `ActionSetSensorControls` to allow the IPA to request controls to be set\n> on the camera sensor through the pipeline handler.\n> \n> >> +\n> >> +Buffers\n> >> +~~~~~~~\n> >> +\n> >> +The IPA will have Parameter and Statistics buffers shared with it from\n> >> +the IPU3 Pipeline handler. These buffers will be passed to the IPA\n> >> +before the ``start()`` operation occurs.\n> >> +\n> >> +The IPA will map the buffers into CPU accessible memory, associated with\n> >> +a buffer ID, and further events for sending or receiving parameter and\n> >> +statistics buffers will reference the ID to avoid expensive memory\n> >> +mapping operations, or the passing of file handles during streaming.\n> >> +\n> >> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n> >> +and no further access to the buffers is permitted.\n> >> +\n> >> +Context\n> >> +~~~~~~~\n> >> +\n> >> +Algorithm calls will always have the ``IPAContext`` available to them.\n> >> +This context comprises of two parts:\n> >> +\n> >> +-  IPA Session Configuration\n> >> +-  IPA Frame Context\n> >> +\n> >> +The session configuration structure ``IPASessionConfiguration``\n> >> +represents constant parameters determined from either before streaming\n> >> +commenced during ``configure()`` or updated parameters when processing\n> >> +controls.\n> >> +\n> >> +The IPA Frame Context provides the storage for algorithms for a single\n> >> +frame operation.\n> >> +\n> >> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n> >> +queue to store historical state for each frame, allowing algorithms to\n> >> +obtain and reference results of calculations which are deeply pipelined.\n> >> +This may only be done 
if an algorithm needs to know the context that was\n> >> +applied at the frame the statistics were produced for, rather than the\n> >> +previous or current frame.\n> >> +\n> >> +Presently there is a single ``IPAFrameContext`` without historical data,\n> >> +and the context is maintained and updated through successive processing\n> >> +operations.\n> >> +\n> >> +Operating\n> >> +~~~~~~~~~\n> >> +\n> >> +There are three main parts of interactions with the algorithms for the\n> >> +IPU3 IPA to operate when running:\n> >> +\n> >> +-  configure()\n> >> +-  processEvent(``EventFillParams``)\n> > \n> > \n> > One liner explanation for EventFillParams is missing. It should be\n> > present in Pre-frame preparation section, no? Can you please include a\n> > brief line analogous to \"EventStatReady\" event in the Post-frame\n> > completion section?\n> \n> Ah yes, indeed they should both be covered in the same way, so that's\n> missing thanks.\n> \n> I've updated it.\n> \n> > Other than that, looks fine to me:\n> > \n> > Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>\n> > \n> >> +-  processEvent(``EventStatReady``)\n> >> +\n> >> +The configuration phase allows the pipeline-handler to inform the IPA of\n> >> +the current stream configurations, which is then passed into each\n> >> +algorithm to provide an opportunity to identify and track state of the\n> >> +hardware, such as image size or ImgU pipeline configurations.\n> >> +\n> >> +Pre-frame preparation\n> >> +~~~~~~~~~~~~~~~~~~~~~\n> >> +\n> >> +When configured, the IPA is notified by the pipeline handler of the\n> >> +Camera ``start()`` event, after which incoming requests will be queued\n> >> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) to\n> >> +be populated for the ImgU. 
This is passed directly to each algorithm\n> >> +through the ``prepare()`` call allowing the ISP configuration to be\n> >> +updated for the needs of each component that the algorithm is\n> >> +responsible for.\n> >> +\n> >> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n> >> +structure that it modifies, and it should take care to ensure that any\n> >> +structure set by a use flag is fully initialised to suitable values.\n> >> +\n> >> +The parameter buffer is returned to the pipeline handler through the\n> >> +``ActionParamFilled`` event, and from there queued to the ImgU along\n> >> +with a raw frame captured with the CIO2.\n> >> +\n> >> +Post-frame completion\n> >> +~~~~~~~~~~~~~~~~~~~~~\n> >> +\n> >> +When the capture of an image is completed, and successfully processed\n> >> +through the ImgU, the generated statistics buffer\n> >> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n> >> +``EventStatReady`` event. This provides the IPA with an opportunity to\n> >> +examine the results of the ISP and run the calculations required by each\n> >> +algorithm on the new data. The algorithms may require context from the\n> >> +operations of other algorithms, for example, the AWB might choose to use\n> >> +a scene brightness determined by the AGC. 
It is important that the\n> >> +algorithms are ordered to ensure that required results are determined\n> >> +before they are needed.\n> >> +\n> >> +The ordering of the algorithm processing is determined by their\n> >> +placement in the ``IPU3::algorithms_`` ordered list."},{"id":19711,"web_url":"https://patchwork.libcamera.org/comment/19711/","msgid":"<YUNKbWgrdv+q5nOU@pendragon.ideasonboard.com>","date":"2021-09-16T13:45:17","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Kieran,\n\nOn Thu, Sep 16, 2021 at 01:37:28PM +0100, Kieran Bingham wrote:\n> On 14/09/2021 11:34, 
Laurent Pinchart wrote:\n> > On Tue, Sep 14, 2021 at 11:13:39AM +0100, Kieran Bingham wrote:\n> >> On 14/09/2021 06:37, Laurent Pinchart wrote:\n> >>> On Sun, Sep 12, 2021 at 10:30:17PM +0100, Kieran Bingham wrote:\n> >>>> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n> >>>\n> >>> s/3a/3A/\n> >>>\n> >>> Technically speaking we're only implementing 2 out of 3 at this point\n> >>> :-)\n> >>>\n> >>>> Provide an overview document to describe it's operations, and provide a\n> >>>\n> >>> s/it's/its/\n> >>>\n> >>>> block diagram to help visualise how the components are put together to\n> >>>> assist any new developers exploring the code.\n> >>>>\n> >>>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n> >>>>\n> >>>> ---\n> >>>> This is really just an RFC: Particularly:\n> >>>>\n> >>>>  - Should this content actually live in Documentation?\n> >>>>    or is it more suited to a page under src/ipa/ipu3/...\n> >>>\n> >>> Good question. We tend to document things in the code where possible.\n> >>> Would there be any drawback from doing so in this case ?\n> >>\n> >> I don't think so - I think the question mostly comes from\n> >> 'discoverability' of the documentation.\n> >>\n> >>\n> >> Which is also a bigger question that we need to make it easier for\n> >> readers to find the documentation that they desire.\n> >>\n> >> I've come across a great talk on this recently:\n> >>   - What nobody tells you about documentation - Daniele Procida\n> >>   - PyConAu\n> >>   - https://2017.pycon-au.org/schedule/presentation/15/\n> >>   - https://youtu.be/t4vKPhjcMZg\n> >>\n> >> And that's been spun into a website to document how to document ;-)\n> >>   - https://diataxis.fr/\n> > \n> > Thanks, I'll have a look.\n> > \n> >> So, I think this could indeed live under src/ipa/ipu3 but it needs to be\n> >> clear that this should be the first thing to read when looking into the IPA.\n> > \n> > Agreed. 
I could envision moving it to Documentation/ later, as part of a\n> > longer document that uses the IPU3 IPA to teach how to write algorithms,\n> > but at that point we'll start writing a book about algorithm design :-)\n> \n> For now, for the v2 I will keep this as an RST document, but place it in\n> src/ipa/ipu3/.\n> \n> \n> >>>>  - This is not complete to document all functionality\n> >>>>    but gives an overview of the current implementation\n> >>>\n> >>> Sounds good to me.\n> >>>\n> >>>>  - Is there anything anyone would find helpful for me to\n> >>>>    extend/write about on this?\n> >>>>\n> >>>>  - It could likely get merged in with some of the work\n> >>>>    that Jean-Michel is doing, and might duplicate some of\n> >>>>    the parts he is documenting, but this was an aim to\n> >>>>    write a single page overview as a getting started with\n> >>>>    the IPU3 IPA ....\n> >>>\n> >>> I think it's *very* useful to have an overview.\n> \n> I'll keep this separate for now then.\n> \n> >>>\n> >>>>  .../guides/ipu3-ipa-design-guide.rst          | 144 ++++++++++++++++++\n> >>>>  1 file changed, 144 insertions(+)\n> >>>>  create mode 100644 Documentation/guides/ipu3-ipa-design-guide.rst\n> >>>>\n> >>>> diff --git a/Documentation/guides/ipu3-ipa-design-guide.rst b/Documentation/guides/ipu3-ipa-design-guide.rst\n> >>>> new file mode 100644\n> >>>> index 000000000000..a1d0f13fbb00\n> >>>> --- /dev/null\n> >>>> +++ b/Documentation/guides/ipu3-ipa-design-guide.rst\n> >>>> @@ -0,0 +1,144 @@\n> >>>> +IPU3 IPA Architecture Design and Overview\n> >>>> +=========================================\n> >>>> +\n> >>>> +The IPU3 IPA is built as a modular and extensible framework with an\n> >>>> +upper layer to manage the interactions with the pipeline handler, and\n> >>>> +the image processing algorithms split to compartmentalise the processing\n> >>>> +required for each accellerator cluster provided by the ImgU ISP.\n> >>>\n> >>> I've commented on the expression \"accelerator 
cluster\" when reviewing\n> >>> Jean-Michel's series. I've then seen that it's mentioned in the IPU3\n> >>> kernel documentation in one place, I suppose that's where it comes from.\n> >>> From what I understand, the expression is used to refer to a processing\n> >>> block (or sub-block) that is fully implemented in hardware, as opposed\n> >>> as being implemented partly or fully in the ImgU firmware. I don't think\n> >>> that's relevant from an IPA point of view, it's an internal\n> >>> implementation detail of the ImgU. I'd thus use \"processing block\" or a\n> >>> similar term instead.\n> >>\n> >> https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/tree/drivers/staging/media/ipu3/include/uapi/intel-ipu3.h#n2428\n> >>\n> >> Describes it as:\n> >>\n> >> \"\"\"\n> >> ACC refers to the HW cluster containing all Fixed Functions (FFs). Each\n> >> FF implements a specific algorithm.\n> >> \"\"\"\n> >>\n> >> So in fact, yes - the term Accelerator Cluster is a 'group' of all the\n> >> hardware accelerator functions, rather than each individual one.\n> >>\n> >> Each individual function (fixed function) can indeed be better named as\n> >> a processing block.\n> >>\n> >>\n> >> Lets make this more like\n> >>\n> >> \"\"\"\n> >> The IPU3 IPA is built as a modular and extensible framework with an\n> >> upper layer to manage the interactions with the pipeline handler, and\n> >> the image processing algorithms split to compartmentalise the processing\n> >> required for each processing block, making use of the fixed-function\n> >> accelerators provided by the ImgU ISP.\n> >> \"\"\"\n> > \n> > Sounds good to me. 
Let's standardize on the same \"processing block\"\n> > terminology in the IPU3 IPA documentation series too.\n> \n> Ok,\n> \n> +Cc: Jean Michel to highlight this discussion point.\n> \n> > \n> >>>> +\n> >>>> +The core IPU3 class is responsible for initialisation and construction\n> >>>> +of the algorithm components, processing controls set by the requests\n> >>>> +from applications, and managing events from the Pipeline handler.\n> >>>\n> >>> s/Pipeline/pipeline/\n> >>>\n> >>>> +\n> >>>> +::\n> >>>> +\n> >>>> +        ┌───────────────────────────────────────────┐\n> >>>> +        │      IPU3 Pipeline Handler                │\n> >>>> +        │   ┌────────┐    ┌────────┐    ┌────────┐  │\n> >>>> +        │   │        │    │        │    │        │  │\n> >>>> +        │   │ Sensor ├───►│  CIO2  ├───►│  ImgU  ├──►\n> >>>> +        │   │        │    │        │    │        │  │\n> >>>> +        │   └────────┘    └────────┘    └─▲────┬─┘  │    P: Parameter Buffer\n> >>>> +        │                                 │P   │    │    S: Statistics Buffer\n> >>>> +        │                                 │    │S   │\n> >>>> +        └─┬───┬───┬──────┬────┬────┬────┬─┴────▼─┬──┘    1: init()\n> >>>> +          │   │   │      │ ▲  │ ▲  │ ▲  │ ▲      │       2: configure()\n> >>>> +          │1  │2  │3     │4│  │4│  │4│  │4│      │5      3: mapBuffers(), start()\n> >>>> +          ▼   ▼   ▼      ▼ │  ▼ │  ▼ │  ▼ │      ▼       4: processEvent()\n> >>>> +        ┌──────────────────┴────┴────┴────┴─────────┐    5: stop(), unmapBuffers()\n> >>>> +        │ IPU3 IPA                                  │\n> >>>> +        │                 ┌───────────────────────┐ │\n> >>>> +        │ ┌───────────┐   │ Algorithms            │ │\n> >>>> +        │ │IPAContext │   │          ┌─────────┐  │ │\n> >>>> +        │ │ ┌───────┐ │   │          │ ...     
│  │ │\n> >>>> +        │ │ │       │ │   │        ┌─┴───────┐ │  │ │\n> >>>> +        │ │ │  SC   │ │   │        │ Tonemap ├─┘  │ │\n> >>>> +        │ │ │       │ ◄───►      ┌─┴───────┐ │    │ │\n> >>>> +        │ │ ├───────┤ │   │      │ AWB     ├─┘    │ │\n> >>>> +        │ │ │       │ │   │    ┌─┴───────┐ │      │ │\n> >>>> +        │ │ │  FC   │ │   │    │ AGC     ├─┘      │ │\n> >>>> +        │ │ │       │ │   │    │         │        │ │\n> >>>> +        │ │ └───────┘ │   │    └─────────┘        │ │\n> >>>> +        │ └───────────┘   └───────────────────────┘ │\n> >>>> +        └───────────────────────────────────────────┘\n> >>>> +         SC: IPASessionConfiguration\n> >>>> +         FC: IPAFrameContext(s)\n> >>>> +\n> >>>> +The IPA instance is constructed and initialised at the point a Camera is\n> >>>> +created by the IPU3 pipeline handler. The initialisation call provides\n> >>\n> >> \"Camera\" is capitalised as a libcamera noun. Shouldn't Pipeline Handler\n> >> also be capitalised as it's a libcamera named thing?\n> >>\n> >> (From your comment above about s/Pipeline/pipeline/. I wonder if instead\n> >> it should be capitalised there and here too.\n> > \n> > In Doxygen capitalization generates links to classes, so we have to be\n> > careful about writing \"Camera\" for instance. When it comes to pipeline\n> > handlers, \"Pipeline Handler\" won't generate a link, and I wouldn't write\n> > \"PipelineHandler\" here anyway.\n> > \n> > I'm sure there are conventions for all this in documentation. 
I usually\n> > capitalize words that have special meanings the first time they're\n> > introduced only, but I don't know if that's a good thing to do.\n> > \n> >>>> +details about which Camera Sensor is being used, and the controls that\n> >>>\n> >>> s/Camera Sensor/camera sensor/ ?\n> >>\n> >> Now that one is awkward, as it's both a libcamera noun, and a real world\n> >> object.\n> >>\n> >> In this context, I believe camera sensor is correct, (as you've\n> >> suggested) as it's the external world camera sensor details.\n> >>\n> >>>> +it has available, along with their defaults and ranges.\n> >>>\n> >>> s/defaults/default values/\n> >>>\n> >>>> +\n> >>>> +Buffers\n> >>>> +~~~~~~~\n> >>>> +\n> >>>> +The IPA will have Parameter and Statistics buffers shared with it from\n> >>>> +the IPU3 Pipeline handler. These buffers will be passed to the IPA\n> >>>> +before the ``start()`` operation occurs.\n> >>>> +\n> >>>> +The IPA will map the buffers into CPU accessible memory, associated with\n> >>>\n> >>> s/CPU accessible/CPU-accessible/\n> >>>\n> >>>> +a buffer ID, and further events for sending or receiving parameter and\n> >>>> +statistics buffers will reference the ID to avoid expensive memory\n> >>>> +mapping operations, or the passing of file handles during streaming.\n> >>>> +\n> >>>> +After the ``stop()`` operation occurs, these buffers will be unmapped,\n> >>>> +and no further access to the buffers is permitted.\n> >>>> +\n> >>>> +Context\n> >>>> +~~~~~~~\n> >>>> +\n> >>>> +Algorithm calls will always have the ``IPAContext`` available to them.\n> >>>> +This context comprises of two parts:\n> >>>> +\n> >>>> +-  IPA Session Configuration\n> >>>> +-  IPA Frame Context\n> >>>> +\n> >>>> +The session configuration structure ``IPASessionConfiguration``\n> >>>> +represents constant parameters determined from either before streaming\n> >>>> +commenced during ``configure()`` or updated parameters when processing\n> >>>> +controls.\n> >>>\n> >>> I'm not sure about the last 
part, I thought the session configuration\n> >>> was meant to be constant after configure() ?\n> >>\n> >> I would expect updates from controls to update the 'active session\n> >> configuration'.\n> >>\n> >> Of course some of that session configuration will be constant, and not\n> >> change (I don't think we'll change the resolution for instance) - but I\n> >> expect that some parameters will be set such as whether AE is enabled or\n> >> locked.\n> >>\n> >> That to me is a session configuration parameter. But it can change at\n> >> runtime - and while it remains the same, every new frame will have that\n> >> configuration applied.\n> >>\n> >>\n> >> Whereas the FrameContexts are about the context that is handled for each\n> >> frame, and any context shared between algorithms , and is much more dynamic.\n> > \n> > This makes me think that we may want to introduce a third structure to\n> > group state that isn't per-frame. I recall designing\n> > IPASessionConfiguration as something that will not change during the\n> > session. Maybe I recall it wrong, but I like the idea of keeping it\n> \n> But ... state that isn't per frame is the ... session ? and thus that\n> state is stored as the SessionConfiguration ...\n\nI agree that it's the session, but in my mind we'd have a session\nconfiguration that groups the parameters set at configure() time and\nthat never vary during the session, and a session state for the state.\n\n> At the moment I anticipate we'll need to store things like control\n> state. I.e. is AE enabled or manual for instance. And if manual - what\n> is the current fixed manual value.\n> \n> I expected that would be the session configuration, as in it's the\n> current active settings ... but maybe a third group would come up, I\n> think we'll find out when we actually add the controls.\n> \n> > static. 
Separating configuration that is set at the beginning and\n> > doesn't change from controls that are updated dynamically seems a good\n> > design principle to me.\n> \n> I'll delete the last part:\n> \"or updated parameters when processing controls.\"\n> \n> We can revisit this when we add controls.\n> \n> >>>> +The IPA Frame Context provides the storage for algorithms for a single\n> >>>> +frame operation.\n> >>>> +\n> >>>> +The ``IPAFrameContext`` structure may be extended to an array, list, or\n> >>>> +queue to store historical state for each frame, allowing algorithms to\n> >>>> +obtain and reference results of calculations which are deeply pipelined.\n> >>>> +This may only be done if an algorithm needs to know the context that was\n> >>>> +applied at the frame the statistics were produced for, rather than the\n> >>>> +previous or current frame.\n> >>>> +\n> >>>> +Presently there is a single ``IPAFrameContext`` without historical data,\n> >>>> +and the context is maintained and updated through successive processing\n> >>>> +operations.\n> >>>> +\n> >>>> +Operating\n> >>>> +~~~~~~~~~\n> >>>> +\n> >>>> +There are three main parts of interactions with the algorithms for the\n> >>>> +IPU3 IPA to operate when running:\n> >>>> +\n> >>>> +-  configure()\n> >>>> +-  processEvent(``EventFillParams``)\n> >>>> +-  processEvent(``EventStatReady``)\n> >>>> +\n> >>>> +The configuration phase allows the pipeline-handler to inform the IPA of\n> >>>> +the current stream configurations, which is then passed into each\n> >>>> +algorithm to provide an opportunity to identify and track state of the\n> >>>> +hardware, such as image size or ImgU pipeline configurations.\n> >>>> +\n> >>>> +Pre-frame preparation\n> >>>> +~~~~~~~~~~~~~~~~~~~~~\n> >>>> +\n> >>>> +When configured, the IPA is notified by the pipeline handler of the\n> >>>> +Camera ``start()`` event, after which incoming requests will be queued\n> >>>> +for processing, requiring a parameter buffer (``ipu3_uapi_params``) 
to\n> >>>> +be populated for the ImgU. This is passed directly to each algorithm\n> >>>> +through the ``prepare()`` call allowing the ISP configuration to be\n> >>>> +updated for the needs of each component that the algorithm is\n> >>>> +responsible for.\n> >>>> +\n> >>>> +The algorithm should set the use flag (``ipu3_uapi_flags``) for any\n> >>>> +structure that it modifies, and it should take care to ensure that any\n> >>>> +structure set by a use flag is fully initialised to suitable values.\n> >>>> +\n> >>>> +The parameter buffer is returned to the pipeline handler through the\n> >>>> +``ActionParamFilled`` event, and from there queued to the ImgU along\n> >>>> +with a raw frame captured with the CIO2.\n> >>>\n> >>> This reminds me that we should turn the operations (actions and events)\n> >>> into separate functions in the IPAIPU3Interface. Out of scope for this\n> >>> patch of course.\n> >>\n> >> You mean directly defining them in Mojo instead of passing them through\n> >> processEvent with an event flag?\n> > \n> > Yes.\n> \n> That sounds like a good thing to do, and may indeed make the interfaces\n> clearer I think.\n> \n> We may have to further document the expectations of those calls for both\n> directions. I.e. EventFillParams (or its new direct call equivalent) is\n> expected to call or initiate the calling of a return of the buffer\n> through ActionParamFilled (or its renamed direct call equivalent).\n> \n> >> I think that would probably be better indeed, I don't see much added\n> >> value to having processEvent() ... 
(with the small exception that it\n> >> partially shows which functions are called while streaming, but we could\n> >> also document the async/streaming expectations anyway).\n> > \n> > processEvent() dates back from when we tried to have a single generic\n> > API for IPAs.\n> \n> And we clearly have per - pipeline handler IPA's now.\n> \n> >>>> +\n> >>>> +Post-frame completion\n> >>>> +~~~~~~~~~~~~~~~~~~~~~\n> >>>> +\n> >>>> +When the capture of an image is completed, and successfully processed\n> >>>> +through the ImgU, the generated statistics buffer\n> >>>> +(``ipu3_uapi_stats_3a``) is given to the IPA through the\n> >>>> +``EventStatReady`` event. This provides the IPA with an opportunity to\n> >>>> +examine the results of the ISP and run the calculations required by each\n> >>>> +algorithm on the new data. The algorithms may require context from the\n> >>>> +operations of other algorithms, for example, the AWB might choose to use\n> >>>> +a scene brightness determined by the AGC. It is important that the\n> >>>> +algorithms are ordered to ensure that required results are determined\n> >>>> +before they are needed.\n> >>>> +\n> >>>> +The ordering of the algorithm processing is determined by their\n> >>>> +placement in the ``IPU3::algorithms_`` ordered list.\n> >>>\n> >>> I expect interesting research to happen in this area :-)\n> >>\n> >> Yes, well it's too early to state any more than that currently I think."},{"id":19712,"web_url":"https://patchwork.libcamera.org/comment/19712/","msgid":"<a37bda96-e69d-cf03-627c-7413f5589b59@ideasonboard.com>","date":"2021-09-16T15:22:35","subject":"Re: [libcamera-devel] [RFC PATCH] Documentation: IPU3 IPA Design\n\tguide","submitter":{"id":4,"url":"https://patchwork.libcamera.org/api/people/4/","name":"Kieran Bingham","email":"kieran.bingham@ideasonboard.com"},"content":"On 16/09/2021 14:41, Laurent Pinchart wrote:\n> Hi Kieran,\n> \n> On Thu, Sep 16, 2021 at 01:36:43PM +0100, Kieran Bingham wrote:\n>> On 13/09/2021 15:29, Umang Jain wrote:\n>>> On 9/13/21 3:00 AM, Kieran Bingham wrote:\n>>>> The IPU3 IPA implements the basic 3a using the ImgU ISP.\n>>> 3a or 3A?\n>>>>\n>>>> Provide an overview document to describe it's operations, and provide a\n>>>> block diagram to help visualise how the components are put together to\n>>>> assist any new developers exploring the code.\n>>>>\n>>>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n>>>>\n>>>> ---\n>>>> This is really just an RFC: Particularly:\n>>>>\n>>>>   - Should this content actually live in 
Documentation?\n>>>>     or is it more suited to a page under src/ipa/ipu3/...\n>>>\n>>>\n>>> I would probably keep it under Documentation/ but there is already a IPA\n>>> writing guide. So maybe we need a new dir, to document overviews of\n>>> existing IPAs?\n>>\n>> Indeed, the existing document really describes the IPA IPC interfaces, I\n>> wonder if it should be renamed ...\n>>\n>> The aim of this is to provide some overview on how the IPU3 IPA operates\n>> from a higher level.\n>>\n>> Along with the discussions with Laurent, I'm going to move this to\n>> src/ipa/ipu3/ipu3-ipa-design-guide.rst for now.\n> \n> I had assumed it would be moved to ipu3.cpp. Seems like explicit\n> communication would work better than assumptions.\n\nIsn't it always ;-)\n\n> I'm not sure I want to set a precedent of storing .rst documentation\n> somewhere else than in the Documentation/ directory. If you want to keep\n> it as .rst, I think it should stay there, and get compiled to catch\n> errors. My preference would be ipu3.cpp though.\n\nI don't mind it being in the 'doxygen' in ipu3.cpp - but then I /really/\nconflict with what Jean-Michel has been doing, while this was aiming to\nbe separate.\n\nI'll try to tackle that for v3, and incorporate src/ipa/ipu3 into the\nDoxygen build ... 
and try to figure out if I can generate a distinct\n'page' for this overview.\n\n\n>> That will keep the overview document in the actual component for now.\n\n--\nKieran"}]