[{"id":28009,"web_url":"https://patchwork.libcamera.org/comment/28009/","msgid":"<20231019205235.GO14832@pendragon.ideasonboard.com>","date":"2023-10-19T20:52:35","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nThank you for the patch.\n\nOn Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> We add an HdrMode control (to enable and disable HDR processing)\n> and an HdrChannel, which indicates what kind of HDR frame (short, long\n> or medium) has just arrived.\n> \n> Currently the HdrMode supports the following values:\n> \n> * Off - no HDR processing at all.\n> * MultiExposureUnmerged - frames at multiple different exposures are\n>   produced, but not merged together. They are returned \"as is\".\n> * MultiExposure - frames at multiple different exposures are merged\n>   to create HDR images.\n> * SingleExposure - multiple frames all at the same exposure are\n>   merged to create HDR images.\n> * Night - multiple frames will be combined to create \"night mode\"\n>   images.\n> \n> Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> ---\n>  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n>  1 file changed, 75 insertions(+)\n> \n> diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> index f2e542f4..c3232abf 100644\n> --- a/src/libcamera/control_ids.yaml\n> +++ b/src/libcamera/control_ids.yaml\n> @@ -774,6 +774,81 @@ controls:\n>              Continuous AF is paused. No further state changes or lens movements\n>              will occur until the AfPauseResume control is sent.\n>  \n> +  - HdrMode:\n> +      type: int32_t\n> +      description: |\n> +        Control to set the mode to be used for High Dynamic Range (HDR)\n> +        imaging. HDR techniques typically include multiple exposure, image\n> +        fusion and tone mapping techniques to improve the dynamic range of the\n> +        resulting images.\n> +\n> +        When using an HDR mode, images are tagged to indicate which HDR channel\n> +        (long, medium or short) they come from.\n> +\n> +        \\sa HdrChannel\n> +\n> +      enum:\n> +        - name: HdrModeOff\n> +          value: 0\n> +          description: |\n> +            HDR is disabled. The HDR channel, if present, will report\n> +            HdrChannelNone.\n\nStating what HDR channel is used is an improvement compared to the\nprevious version, but there's still an option left to implementors here:\nreporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\nsee a reason to allow both, I would pick the latter:\n\n            HDR is disabled. Metadata for this frame will not include the\n\t    HdrChannel control.\n\n> +        - name: HdrModeMultiExposureUnmerged\n> +          value: 1\n> +          description: |\n> +            Multiple exposures will be generated in an alternating fashion.\n> +            However, they will not be merged together and will be returned to\n> +            the application as they are. Each image will be tagged with the\n> +            correct HDR channel, indicating what kind of exposure (long, medium\n> +            or short) it is.  The expectation is that, if necessary, the\n> +            application can merge them to create HDR images for itself.\n\nYou mention here long, medium and short. Does this mean there will\nalways be three channels ?\n\n> +        - name: HdrModeMultiExposure\n> +          value: 2\n> +          description: |\n> +            Multiple exposures will be generated and merged to create HDR\n> +            images. Each image will be tagged with the HDR channel (long, medium\n> +            or short) that arrived and which caused this image to be output.\n> +        - name: HdrModeSingleExposure\n> +          value: 3\n> +          description: |\n> +            Multiple frames all at a single exposure will be used to create HDR\n> +            images. These images should be reported as all corresponding to the\n> +            HDR short channel.\n> +        - name: HdrModeNight\n> +          value: 4\n> +          description: |\n> +            Multiple frames will be combined to produce \"night mode\"\n> +            images. Images will be tagged as belonging either to the long,\n> +            medium or short HDR channel according to the implementation.\n\nDoes this mean that night more will always use multi-exposure, or that\nit is implementation-defined ?\n\n> +\n> +  - HdrChannel:\n> +      type: int32_t\n> +      description: |\n> +        This value is reported back to the application so that it can discover\n> +        whether this capture corresponds to the short or long exposure image (or\n> +        any other image used by the HDR procedure). An application can monitor\n> +        the HDR channel to discover when the differently exposed images have\n> +        arrived.\n> +\n> +      enum:\n> +        - name: HdrChannelNone\n> +          value: 0\n> +          description: |\n> +            This image does not correspond to any of the captures used to create\n> +            an HDR image.\n\nAs indicated above, do we need this, or should we not report HdrChannel\nwhen HDR is disabled ?\n\n> +        - name: HdrChannelShort\n> +          value: 1\n> +          description: |\n> +            This is a short exposure image.\n> +        - name: HdrChannelMedium\n> +          value: 2\n> +          description: |\n> +            This is a medium exposure image.\n> +        - name: HdrChannelLong\n> +          value: 3\n> +          description: |\n> +            This is a long exposure image.\n> +\n>    # ----------------------------------------------------------------------------\n>    # Draft controls section\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 3BA23BDCBD\n\tfor <parsemail@patchwork.libcamera.org>;\n\tThu, 19 Oct 2023 20:52:32 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 7EA496297F;\n\tThu, 19 Oct 2023 22:52:31 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 6B1DF6055B\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu, 19 Oct 2023 22:52:29 +0200 (CEST)","from pendragon.ideasonboard.com (213-243-189-158.bb.dnainternet.fi\n\t[213.243.189.158])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 056C58D;\n\tThu, 19 Oct 2023 22:52:20 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1697748751;\n\tbh=2SJb7vcjQqdZN2n4T2xoCCFG4dqH573jKK1yVpmkbzU=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=tFAsV06OqUJltshhk6gbzWsLVahzGvkVOY2IOVuz7J0R4v9E4xnV41K30JwelXWRs\n\t/tVAlmWMs18D0cWC9S8FQUlFzTCsSoAnfVAaSjeI2mU0ssxsq5lQM9Ao7rDHW6eUbh\n\tS2MD+e/DcfVbGEcSwAtTBgUvx+6tZDAFy9XkOZNoBkbiy1fULWHu6P3oOAfpnZQhw8\n\t057gpAo7pUc6hPYdDJ6M4GD0vEx4rUaH23BE+THuWOz25BX3IqQQN9VYxzfmUCa1nk\n\tH++kwns7oJwLxSzM8iLGqc3rTq1L0d2Zt5x+iH9rfzneE8D6EwTp6BtVlAAM/1WJi+\n\tMfy7FLc1roEhg==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1697748741;\n\tbh=2SJb7vcjQqdZN2n4T2xoCCFG4dqH573jKK1yVpmkbzU=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=qkpsEi33kwaV0J2cxFVKrpNONe18atwAgrkYUM04P61+SYtQPFTRuaMGa6y/fGNLV\n\tQQMP7MG+/iOvuTT+1CpVy4Y8eekY3ZSI2yGjaTVk2Y7sa/Y+P2xxWAgyyX7R8FSgFN\n\tAnoC0+qNjFCRHY/+DaJl9SdgNzY1qqtJR8mjNmQk="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"qkpsEi33\"; dkim-atps=neutral","Date":"Thu, 19 Oct 2023 23:52:35 +0300","To":"David Plowman <david.plowman@raspberrypi.com>","Message-ID":"<20231019205235.GO14832@pendragon.ideasonboard.com>","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<20231019115220.4473-1-david.plowman@raspberrypi.com>","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Laurent Pinchart via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":28015,"web_url":"https://patchwork.libcamera.org/comment/28015/","msgid":"<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>","date":"2023-10-20T08:37:28","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Laurent\n\nThanks for the suggestions.\n\nOn Thu, 19 Oct 2023 at 21:52, Laurent Pinchart\n<laurent.pinchart@ideasonboard.com> wrote:\n>\n> Hi David,\n>\n> Thank you for the patch.\n>\n> On Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> > We add an HdrMode control (to enable and disable HDR processing)\n> > and an HdrChannel, which indicates what kind of HDR frame (short, long\n> > or medium) has just arrived.\n> >\n> > Currently the HdrMode supports the following values:\n> >\n> > * Off - no HDR processing at all.\n> > * MultiExposureUnmerged - frames at multiple different exposures are\n> >   produced, but not merged together. They are returned \"as is\".\n> > * MultiExposure - frames at multiple different exposures are merged\n> >   to create HDR images.\n> > * SingleExposure - multiple frames all at the same exposure are\n> >   merged to create HDR images.\n> > * Night - multiple frames will be combined to create \"night mode\"\n> >   images.\n> >\n> > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > ---\n> >  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n> >  1 file changed, 75 insertions(+)\n> >\n> > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > index f2e542f4..c3232abf 100644\n> > --- a/src/libcamera/control_ids.yaml\n> > +++ b/src/libcamera/control_ids.yaml\n> > @@ -774,6 +774,81 @@ controls:\n> >              Continuous AF is paused. No further state changes or lens movements\n> >              will occur until the AfPauseResume control is sent.\n> >\n> > +  - HdrMode:\n> > +      type: int32_t\n> > +      description: |\n> > +        Control to set the mode to be used for High Dynamic Range (HDR)\n> > +        imaging. HDR techniques typically include multiple exposure, image\n> > +        fusion and tone mapping techniques to improve the dynamic range of the\n> > +        resulting images.\n> > +\n> > +        When using an HDR mode, images are tagged to indicate which HDR channel\n> > +        (long, medium or short) they come from.\n> > +\n> > +        \\sa HdrChannel\n> > +\n> > +      enum:\n> > +        - name: HdrModeOff\n> > +          value: 0\n> > +          description: |\n> > +            HDR is disabled. The HDR channel, if present, will report\n> > +            HdrChannelNone.\n>\n> Stating what HDR channel is used is an improvement compared to the\n> previous version, but there's still an option left to implementors here:\n> reporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\n> see a reason to allow both, I would pick the latter:\n>\n>             HDR is disabled. Metadata for this frame will not include the\n>             HdrChannel control.\n\nYes, I think that's fine.\n\n>\n> > +        - name: HdrModeMultiExposureUnmerged\n> > +          value: 1\n> > +          description: |\n> > +            Multiple exposures will be generated in an alternating fashion.\n> > +            However, they will not be merged together and will be returned to\n> > +            the application as they are. Each image will be tagged with the\n> > +            correct HDR channel, indicating what kind of exposure (long, medium\n> > +            or short) it is.  The expectation is that, if necessary, the\n> > +            application can merge them to create HDR images for itself.\n>\n> You mention here long, medium and short. Does this mean there will\n> always be three channels ?\n\nNo - it's whatever the implementation wants to do. We don't use\nmedium, for example. I think it's quite likely that some vendors would\nwant other channels, such as \"very short\" and \"very long\". Maybe the\ndescription can avoid implying that all channels will appear.\n\n>\n> > +        - name: HdrModeMultiExposure\n> > +          value: 2\n> > +          description: |\n> > +            Multiple exposures will be generated and merged to create HDR\n> > +            images. Each image will be tagged with the HDR channel (long, medium\n> > +            or short) that arrived and which caused this image to be output.\n> > +        - name: HdrModeSingleExposure\n> > +          value: 3\n> > +          description: |\n> > +            Multiple frames all at a single exposure will be used to create HDR\n> > +            images. These images should be reported as all corresponding to the\n> > +            HDR short channel.\n> > +        - name: HdrModeNight\n> > +          value: 4\n> > +          description: |\n> > +            Multiple frames will be combined to produce \"night mode\"\n> > +            images. Images will be tagged as belonging either to the long,\n> > +            medium or short HDR channel according to the implementation.\n>\n> Does this mean that night more will always use multi-exposure, or that\n> it is implementation-defined ?\n\nI really think that needs to be implementation defined. Our night mode\nis single-exposure, but I can't possibly predict what anyone else\nwould want to do.\n\n>\n> > +\n> > +  - HdrChannel:\n> > +      type: int32_t\n> > +      description: |\n> > +        This value is reported back to the application so that it can discover\n> > +        whether this capture corresponds to the short or long exposure image (or\n> > +        any other image used by the HDR procedure). An application can monitor\n> > +        the HDR channel to discover when the differently exposed images have\n> > +        arrived.\n> > +\n> > +      enum:\n> > +        - name: HdrChannelNone\n> > +          value: 0\n> > +          description: |\n> > +            This image does not correspond to any of the captures used to create\n> > +            an HDR image.\n>\n> As indicated above, do we need this, or should we not report HdrChannel\n> when HDR is disabled ?\n\nActually I'd quite like to keep this, even if it's not reported when HDR is off.\n\nOne use case is multi-exposure HDR on a Pi 4. You can't merge images\nso how would you get a viewfinder? You could intersperse some\n\"ordinary AGC\" frames with your long/short/whatever frames. You might\nwant to label these \"HdrChannelNone\". I suppose you could label them\n\"medium\", but then maybe you're using \"medium\" for other frames that\nyour HDR implementation requires. Like everything else, it all falls a\nbit into the \"who knows what anyone will do\" category.\n\nI guess there's also an outside chance that some implementations can't\nflick exposures instantaneously and accurately like we can, so maybe\nit would suit them too. Or we could have an\n\"HdrChannelCouldntMakeThisWorkProperly\" value? Having wrestled with\nall this for a while I can see the attraction! :)\n\nI'll send out another version today with the changes in line with all\nthe above. Obviously everyone please shout if we want to do anything\ndifferent.\n\nThanks!\nDavid\n\n>\n> > +        - name: HdrChannelShort\n> > +          value: 1\n> > +          description: |\n> > +            This is a short exposure image.\n> > +        - name: HdrChannelMedium\n> > +          value: 2\n> > +          description: |\n> > +            This is a medium exposure image.\n> > +        - name: HdrChannelLong\n> > +          value: 3\n> > +          description: |\n> > +            This is a long exposure image.\n> > +\n> >    # ----------------------------------------------------------------------------\n> >    # Draft controls section\n> >\n>\n> --\n> Regards,\n>\n> Laurent Pinchart","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 5F53FBDCBD\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 20 Oct 2023 08:37:43 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 99D426297F;\n\tFri, 20 Oct 2023 10:37:42 +0200 (CEST)","from mail-vs1-xe34.google.com (mail-vs1-xe34.google.com\n\t[IPv6:2607:f8b0:4864:20::e34])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id CFC3261DCF\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 10:37:40 +0200 (CEST)","by mail-vs1-xe34.google.com with SMTP id\n\tada2fe7eead31-457eebf8e01so218112137.2\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 01:37:40 -0700 (PDT)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1697791062;\n\tbh=BQiSu3nMi0myXovi9c1gmMdiWlaN9K632AnYS53kIXo=;\n\th=References:In-Reply-To:Date:To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=QXMBSH99vWIi39cFekxbTtTGolKn2P9yIlZ1v2yNAnPLxWHQTHPVq7o6XwANDKxVo\n\tMqoCIVYSOH7tKEd/KSm623q9MaSR8ngsibQ/Pyj5QA2TTrkFKTE1M/tMbBKi1OK30d\n\t+zFb/8dsWWZ3hq3ZpNjleG7f7azOphT+BNccWQF2/+FNLxN3MrKyKn99zl96vQswZU\n\tTqwUs7PaRK4q6hHzOuRX7tqgyeHVrotIa7LeoWV6pITd6UJzxrF4YRV7npSATBt8/h\n\t6une4IDWcw/3a1ms84UpE/rgNd3n/bFOrQtgqj//xpPcN/z3FEL14e4Pz4VZwz0d51\n\tTD/Zgg5KZvhMg==","v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google; t=1697791059; x=1698395859;\n\tdarn=lists.libcamera.org; \n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:from:to:cc:subject:date:message-id:reply-to;\n\tbh=CHmrZucPGQ4YZ3C5POki6aSDVW+zPsHxw7Z5vEAQKtQ=;\n\tb=FgDXeWSAHwoL9zay3Xq5Fvu8DhNHjxkJ/4NzVtziduFe3p6wGFJYBZwAudlIjbC2Zs\n\ts8qhi9Ll5OKTpVrPZhoJLFdire98JfpxheKuRATWc3MgLfBwAUloeMyHr50W6Gs43Crj\n\tsPNKLnHbE6XEc5wIm45NVKG1oQONukqcUJBFc0LRF9eHhvbHEwSlvFhdNfd5HKysYGUh\n\tYqY94/wQbw5NO5ndKQT9z88xZUW1ypbI7MBdrc36IomgTIPg310rayyWazzcOP3TaiyY\n\tDCa8U3RFrZX2Cfimv32m3tmdBwu9wSGiENfzPHUHf+iG82gWLHlck4BDTYwTExP2DRfB\n\tMpvQ=="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key; \n\tunprotected) header.d=raspberrypi.com\n\theader.i=@raspberrypi.com\n\theader.b=\"FgDXeWSA\"; dkim-atps=neutral","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20230601; t=1697791059; x=1698395859;\n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:x-gm-message-state:from:to:cc:subject:date:message-id\n\t:reply-to;\n\tbh=CHmrZucPGQ4YZ3C5POki6aSDVW+zPsHxw7Z5vEAQKtQ=;\n\tb=uFTpdGzyqCHpeeQxj06PQ2RHMcRxWEavRItJBgoihLQL65OK7/zR0l+Z8FS1+rgUMK\n\tSmdayuwQeizUYGQppy1/Qhsa/+U/rszSKYUfXTfJiX3DRdrZ/X4kzUqAgPJMk3QXq0Be\n\tblsXsQlC0r7fkCXeEUV5KJaO+duU1ltOhl9rGU7Osv+KonFXQNZaIhklyJWSHYyREGuV\n\tH4TL6W2JU6CbrglNOQ9/hzbCUS6Q3ZGqCJgOtytIQydnJsL5R4mQEzhQetxZIRn8M9EY\n\tZwFud8zqrAPowW9O0fn6gd+lXLEqiPvlPa26BTcfLJHdUVkgpdRY2qvwHfG6spPHXz96\n\thEJA==","X-Gm-Message-State":"AOJu0YxdyGeHVf2MbIJu9sAmQ3gFBrpiAPmwjFQmplzyIaR3FnE4w+U5\n\tBJDLTUUPGn/3fc9Y6wBXICk7MlqCEQaivM15kyBgyQ==","X-Google-Smtp-Source":"AGHT+IFOWUKVCokMb4fN9XIDPD3h6s6BuC6l7sY85DwogjDbL5GnPtsX28PDrfOU1j4dfbEiFw5XQT2/6BiNw6sEgoE=","X-Received":"by 2002:a05:6102:4719:b0:457:cc32:39c6 with SMTP id\n\tei25-20020a056102471900b00457cc3239c6mr1412127vsb.16.1697791059475;\n\tFri, 20 Oct 2023 01:37:39 -0700 (PDT)","MIME-Version":"1.0","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>\n\t<20231019205235.GO14832@pendragon.ideasonboard.com>","In-Reply-To":"<20231019205235.GO14832@pendragon.ideasonboard.com>","Date":"Fri, 20 Oct 2023 09:37:28 +0100","Message-ID":"<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"David Plowman via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":28016,"web_url":"https://patchwork.libcamera.org/comment/28016/","msgid":"<20231020095535.GB20914@pendragon.ideasonboard.com>","date":"2023-10-20T09:55:35","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nOn Fri, Oct 20, 2023 at 09:37:28AM +0100, David Plowman wrote:\n> On Thu, 19 Oct 2023 at 21:52, Laurent Pinchart wrote:\n> > On Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> > > We add an HdrMode control (to enable and disable HDR processing)\n> > > and an HdrChannel, which indicates what kind of HDR frame (short, long\n> > > or medium) has just arrived.\n> > >\n> > > Currently the HdrMode supports the following values:\n> > >\n> > > * Off - no HDR processing at all.\n> > > * MultiExposureUnmerged - frames at multiple different exposures are\n> > >   produced, but not merged together. They are returned \"as is\".\n> > > * MultiExposure - frames at multiple different exposures are merged\n> > >   to create HDR images.\n> > > * SingleExposure - multiple frames all at the same exposure are\n> > >   merged to create HDR images.\n> > > * Night - multiple frames will be combined to create \"night mode\"\n> > >   images.\n> > >\n> > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > ---\n> > >  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n> > >  1 file changed, 75 insertions(+)\n> > >\n> > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > index f2e542f4..c3232abf 100644\n> > > --- a/src/libcamera/control_ids.yaml\n> > > +++ b/src/libcamera/control_ids.yaml\n> > > @@ -774,6 +774,81 @@ controls:\n> > >              Continuous AF is paused. No further state changes or lens movements\n> > >              will occur until the AfPauseResume control is sent.\n> > >\n> > > +  - HdrMode:\n> > > +      type: int32_t\n> > > +      description: |\n> > > +        Control to set the mode to be used for High Dynamic Range (HDR)\n> > > +        imaging. HDR techniques typically include multiple exposure, image\n> > > +        fusion and tone mapping techniques to improve the dynamic range of the\n> > > +        resulting images.\n> > > +\n> > > +        When using an HDR mode, images are tagged to indicate which HDR channel\n> > > +        (long, medium or short) they come from.\n> > > +\n> > > +        \\sa HdrChannel\n> > > +\n> > > +      enum:\n> > > +        - name: HdrModeOff\n> > > +          value: 0\n> > > +          description: |\n> > > +            HDR is disabled. The HDR channel, if present, will report\n> > > +            HdrChannelNone.\n> >\n> > Stating what HDR channel is used is an improvement compared to the\n> > previous version, but there's still an option left to implementors here:\n> > reporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\n> > see a reason to allow both, I would pick the latter:\n> >\n> >             HDR is disabled. Metadata for this frame will not include the\n> >             HdrChannel control.\n> \n> Yes, I think that's fine.\n> \n> > > +        - name: HdrModeMultiExposureUnmerged\n> > > +          value: 1\n> > > +          description: |\n> > > +            Multiple exposures will be generated in an alternating fashion.\n> > > +            However, they will not be merged together and will be returned to\n> > > +            the application as they are. Each image will be tagged with the\n> > > +            correct HDR channel, indicating what kind of exposure (long, medium\n> > > +            or short) it is.  The expectation is that, if necessary, the\n> > > +            application can merge them to create HDR images for itself.\n> >\n> > You mention here long, medium and short. Does this mean there will\n> > always be three channels ?\n> \n> No - it's whatever the implementation wants to do. We don't use\n> medium, for example. I think it's quite likely that some vendors would\n> want other channels, such as \"very short\" and \"very long\". Maybe the\n> description can avoid implying that all channels will appear.\n\nShouldn't this be something that the application should control ? I'm\nincreasingly thinking that this shouldn't be an HDR mode, but should\ninstead be controlled through a per-frame control mechanism. Is there\nany reason this can't be done, not right now (I don't want to delay this\npatch unnecessarily), but at some point in the future ?\n\n> > > +        - name: HdrModeMultiExposure\n> > > +          value: 2\n> > > +          description: |\n> > > +            Multiple exposures will be generated and merged to create HDR\n> > > +            images. Each image will be tagged with the HDR channel (long, medium\n> > > +            or short) that arrived and which caused this image to be output.\n> > > +        - name: HdrModeSingleExposure\n> > > +          value: 3\n> > > +          description: |\n> > > +            Multiple frames all at a single exposure will be used to create HDR\n> > > +            images. These images should be reported as all corresponding to the\n> > > +            HDR short channel.\n> > > +        - name: HdrModeNight\n> > > +          value: 4\n> > > +          description: |\n> > > +            Multiple frames will be combined to produce \"night mode\"\n> > > +            images. Images will be tagged as belonging either to the long,\n> > > +            medium or short HDR channel according to the implementation.\n> >\n> > Does this mean that night more will always use multi-exposure, or that\n> > it is implementation-defined ?\n> \n> I really think that needs to be implementation defined. Our night mode\n> is single-exposure, but I can't possibly predict what anyone else\n> would want to do.\n\nWon't it be problematic for applications if they don't know what\nHdrChannel values they will get ? How do you expect HdrChannel to be\nused in the HDR modes where the camera combines images (all but the\nUnmerged mode) ?\n\n> > > +\n> > > +  - HdrChannel:\n> > > +      type: int32_t\n> > > +      description: |\n> > > +        This value is reported back to the application so that it can discover\n> > > +        whether this capture corresponds to the short or long exposure image (or\n> > > +        any other image used by the HDR procedure). An application can monitor\n> > > +        the HDR channel to discover when the differently exposed images have\n> > > +        arrived.\n> > > +\n> > > +      enum:\n> > > +        - name: HdrChannelNone\n> > > +          value: 0\n> > > +          description: |\n> > > +            This image does not correspond to any of the captures used to create\n> > > +            an HDR image.\n> >\n> > As indicated above, do we need this, or should we not report HdrChannel\n> > when HDR is disabled ?\n> \n> Actually I'd quite like to keep this, even if it's not reported when HDR is off.\n> \n> One use case is multi-exposure HDR on a Pi 4. You can't merge images\n> so how would you get a viewfinder? You could intersperse some\n> \"ordinary AGC\" frames with your long/short/whatever frames. You might\n> want to label these \"HdrChannelNone\". I suppose you could label them\n> \"medium\", but then maybe you're using \"medium\" for other frames that\n> your HDR implementation requires. Like everything else, it all falls a\n> bit into the \"who knows what anyone will do\" category.\n\nThis makes me believe even more strongly that the unmerged mode\nshouldn't be an HDR mode. We're trying to cram too many assumptions\nabout the application needs here.\n\nIf you want to keep HdrChannelNone for the time being until we drop\nHdrMultiExposureUnmerged, then this needs better documentation. Until I\nread your reply I had no idea that the unmerged mode would produce\nimages for the \"None\" channel. If this isn't documented properly, it\nwon't be usable for applications.\n\nDepending on the priorities (I assume the Pi 5 HDR modes will be more\ninteresting than the unmerged operation on Pi 4), we could also drop the\nunmerged mode for now, and design a solution better for application-side\nHDR.\n\n> I guess there's also an outside chance that some implementations can't\n> flick exposures instantaneously and accurately like we can, so maybe\n> it would suit them too. Or we could have an\n> \"HdrChannelCouldntMakeThisWorkProperly\" value? Having wrestled with\n> all this for a while I can see the attraction! :)\n> \n> I'll send out another version today with the changes in line with all\n> the above. Obviously everyone please shout if we want to do anything\n> different.\n> \n> > > +        - name: HdrChannelShort\n> > > +          value: 1\n> > > +          description: |\n> > > +            This is a short exposure image.\n> > > +        - name: HdrChannelMedium\n> > > +          value: 2\n> > > +          description: |\n> > > +            This is a medium exposure image.\n> > > +        - name: HdrChannelLong\n> > > +          value: 3\n> > > +          description: |\n> > > +            This is a long exposure image.\n> > > +\n> > >    # ----------------------------------------------------------------------------\n> > >    # Draft controls section\n> > >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 934F3C3272\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 20 Oct 2023 09:55:31 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 145186297F;\n\tFri, 20 Oct 2023 11:55:31 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 7E24D61DCF\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 11:55:29 +0200 (CEST)","from pendragon.ideasonboard.com (213-243-189-158.bb.dnainternet.fi\n\t[213.243.189.158])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id A6ECA814;\n\tFri, 20 Oct 2023 11:55:20 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1697795731;\n\tbh=7KCy8hL7Ap56B4GMEnm6rarU3/iSTqc3+cFvRnfFSV8=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=qXbXwYl2zRNd5AUwYP3lCF1uSmx94MIaJciSLm+0SvbKM2wD+MgE2mbyBNcQB2Fa6\n\ttEfhvIaAlJGUBwhmSPQ3onJegDktesPdfM2kpkcBhy1vF3ud5kKMyhTGVSU4NVmORx\n\tPNJsAL6JF5n96hDmg/FBe14qGqU1Mu5tgTc8bmYduDrtM/tkF4py+nDwcB9fHl0mMV\n\tJaKCwSNo+thCyVByE1YhyNw1TlMfSrP+gwFTQtbn0SzGqB/a3J+Cdl5Omz6DDgMxWL\n\tm91WlD7DxJool2N3tQe1jV30hG6CaZVzLLfW55oNf4ZZu0nCIUnvPjak6PIp7YFQ1A\n\t19eb26Gk5Rnxw==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1697795720;\n\tbh=7KCy8hL7Ap56B4GMEnm6rarU3/iSTqc3+cFvRnfFSV8=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=T1ndVJc0X7TwddYBSlYSvoTfdlF9qCq2iT3G8FvGmH2sBOcWiwjf/UhQB0uAh9mo9\n\trM+fZR/8rDKrAwzX6wpVY4/y2tKKBPSqCmJE80W5oXMHSRh1EbYapJFTC0PTXUWpzt\n\twcHZbtZaxvq68t6s+HBMzm6bW1wsxIL89qOclyiw="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"T1ndVJc0\"; dkim-atps=neutral","Date":"Fri, 20 Oct 2023 12:55:35 +0300","To":"David Plowman <david.plowman@raspberrypi.com>","Message-ID":"<20231020095535.GB20914@pendragon.ideasonboard.com>","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>\n\t<20231019205235.GO14832@pendragon.ideasonboard.com>\n\t<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Laurent Pinchart via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":28017,"web_url":"https://patchwork.libcamera.org/comment/28017/","msgid":"<CAHW6GY+s8zL-5GeHDS50Ch6fE=3cuCkp3WXqY=QB89w2V7DLXg@mail.gmail.com>","date":"2023-10-20T11:09:36","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"HI Laurent\n\nOn Fri, 20 Oct 2023 at 10:55, Laurent Pinchart\n<laurent.pinchart@ideasonboard.com> wrote:\n>\n> Hi David,\n>\n> On Fri, Oct 20, 2023 at 09:37:28AM +0100, David Plowman wrote:\n> > On Thu, 19 Oct 2023 at 21:52, Laurent Pinchart wrote:\n> > > On Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> > > > We add an HdrMode control (to enable and disable HDR processing)\n> > > > and an HdrChannel, which indicates what kind of HDR frame (short, long\n> > > > or medium) has just arrived.\n> > > >\n> > > > Currently the HdrMode supports the following values:\n> > > >\n> > > > * Off - no HDR processing at all.\n> > > > * MultiExposureUnmerged - frames at multiple different exposures are\n> > > >   produced, but not merged together. They are returned \"as is\".\n> > > > * MultiExposure - frames at multiple different exposures are merged\n> > > >   to create HDR images.\n> > > > * SingleExposure - multiple frames all at the same exposure are\n> > > >   merged to create HDR images.\n> > > > * Night - multiple frames will be combined to create \"night mode\"\n> > > >   images.\n> > > >\n> > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > > ---\n> > > >  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n> > > >  1 file changed, 75 insertions(+)\n> > > >\n> > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > > index f2e542f4..c3232abf 100644\n> > > > --- a/src/libcamera/control_ids.yaml\n> > > > +++ b/src/libcamera/control_ids.yaml\n> > > > @@ -774,6 +774,81 @@ controls:\n> > > >              Continuous AF is paused. No further state changes or lens movements\n> > > >              will occur until the AfPauseResume control is sent.\n> > > >\n> > > > +  - HdrMode:\n> > > > +      type: int32_t\n> > > > +      description: |\n> > > > +        Control to set the mode to be used for High Dynamic Range (HDR)\n> > > > +        imaging. HDR techniques typically include multiple exposure, image\n> > > > +        fusion and tone mapping techniques to improve the dynamic range of the\n> > > > +        resulting images.\n> > > > +\n> > > > +        When using an HDR mode, images are tagged to indicate which HDR channel\n> > > > +        (long, medium or short) they come from.\n> > > > +\n> > > > +        \\sa HdrChannel\n> > > > +\n> > > > +      enum:\n> > > > +        - name: HdrModeOff\n> > > > +          value: 0\n> > > > +          description: |\n> > > > +            HDR is disabled. The HDR channel, if present, will report\n> > > > +            HdrChannelNone.\n> > >\n> > > Stating what HDR channel is used is an improvement compared to the\n> > > previous version, but there's still an option left to implementors here:\n> > > reporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\n> > > see a reason to allow both, I would pick the latter:\n> > >\n> > >             HDR is disabled. Metadata for this frame will not include the\n> > >             HdrChannel control.\n> >\n> > Yes, I think that's fine.\n> >\n> > > > +        - name: HdrModeMultiExposureUnmerged\n> > > > +          value: 1\n> > > > +          description: |\n> > > > +            Multiple exposures will be generated in an alternating fashion.\n> > > > +            However, they will not be merged together and will be returned to\n> > > > +            the application as they are. Each image will be tagged with the\n> > > > +            correct HDR channel, indicating what kind of exposure (long, medium\n> > > > +            or short) it is.  The expectation is that, if necessary, the\n> > > > +            application can merge them to create HDR images for itself.\n> > >\n> > > You mention here long, medium and short. Does this mean there will\n> > > always be three channels ?\n> >\n> > No - it's whatever the implementation wants to do. We don't use\n> > medium, for example. I think it's quite likely that some vendors would\n> > want other channels, such as \"very short\" and \"very long\". Maybe the\n> > description can avoid implying that all channels will appear.\n>\n> Shouldn't this be something that the application should control ? I'm\n> increasingly thinking that this shouldn't be an HDR mode, but should\n> instead be controlled through a per-frame control mechanism. Is there\n> any reason this can't be done, not right now (I don't want to delay this\n> patch unnecessarily), but at some point in the future ?\n\nYes, I'd be happy with an API for multi-channel AGC, which is what\nthis really is. This was actually where I started, though it seemed\nsomewhat controversial so I thought it might be more palatable\npresented like this instead. But either way works for me.\n\nI'm very keen to handle this inside our algorithms, so that\napplications don't explicitly have to switch AGC channels all the\ntime. That's a more reliable approach (there will be fewer failures to\nswitch on every frame), and saves the application from a whole world\nof pain and complication.\n\nThough I'm all in favour of per-frame controls for lots of other\nreasons. We've been very keen to see that move forward for quite a\nlong time now!\n\n>\n> > > > +        - name: HdrModeMultiExposure\n> > > > +          value: 2\n> > > > +          description: |\n> > > > +            Multiple exposures will be generated and merged to create HDR\n> > > > +            images. Each image will be tagged with the HDR channel (long, medium\n> > > > +            or short) that arrived and which caused this image to be output.\n> > > > +        - name: HdrModeSingleExposure\n> > > > +          value: 3\n> > > > +          description: |\n> > > > +            Multiple frames all at a single exposure will be used to create HDR\n> > > > +            images. These images should be reported as all corresponding to the\n> > > > +            HDR short channel.\n> > > > +        - name: HdrModeNight\n> > > > +          value: 4\n> > > > +          description: |\n> > > > +            Multiple frames will be combined to produce \"night mode\"\n> > > > +            images. Images will be tagged as belonging either to the long,\n> > > > +            medium or short HDR channel according to the implementation.\n> > >\n> > > Does this mean that night more will always use multi-exposure, or that\n> > > it is implementation-defined ?\n> >\n> > I really think that needs to be implementation defined. Our night mode\n> > is single-exposure, but I can't possibly predict what anyone else\n> > would want to do.\n>\n> Won't it be problematic for applications if they don't know what\n> HdrChannel values they will get ? How do you expect HdrChannel to be\n> used in the HDR modes where the camera combines images (all but the\n> Unmerged mode) ?\n\nReally, what I'm hoping is that the IQ stability control will indicate\nwhen it's OK to capture the frame, and this will be set when all the\nexposures that the implementation wants to use have arrived. So at\nthat point the channel information will be of interest to some folks,\nbut not essential to general applications. But obviously the IQ\nstability control discussion needs to progress.\n\n>\n> > > > +\n> > > > +  - HdrChannel:\n> > > > +      type: int32_t\n> > > > +      description: |\n> > > > +        This value is reported back to the application so that it can discover\n> > > > +        whether this capture corresponds to the short or long exposure image (or\n> > > > +        any other image used by the HDR procedure). An application can monitor\n> > > > +        the HDR channel to discover when the differently exposed images have\n> > > > +        arrived.\n> > > > +\n> > > > +      enum:\n> > > > +        - name: HdrChannelNone\n> > > > +          value: 0\n> > > > +          description: |\n> > > > +            This image does not correspond to any of the captures used to create\n> > > > +            an HDR image.\n> > >\n> > > As indicated above, do we need this, or should we not report HdrChannel\n> > > when HDR is disabled ?\n> >\n> > Actually I'd quite like to keep this, even if it's not reported when HDR is off.\n> >\n> > One use case is multi-exposure HDR on a Pi 4. You can't merge images\n> > so how would you get a viewfinder? You could intersperse some\n> > \"ordinary AGC\" frames with your long/short/whatever frames. You might\n> > want to label these \"HdrChannelNone\". I suppose you could label them\n> > \"medium\", but then maybe you're using \"medium\" for other frames that\n> > your HDR implementation requires. Like everything else, it all falls a\n> > bit into the \"who knows what anyone will do\" category.\n>\n> This makes me believe even more strongly that the unmerged mode\n> shouldn't be an HDR mode. We're trying to cram too many assumptions\n> about the application needs here.\n>\n> If you want to keep HdrChannelNone for the time being until we drop\n> HdrMultiExposureUnmerged, then this needs better documentation. Until I\n> read your reply I had no idea that the unmerged mode would produce\n> images for the \"None\" channel. If this isn't documented properly, it\n> won't be usable for applications.\n>\n> Depending on the priorities (I assume the Pi 5 HDR modes will be more\n> interesting than the unmerged operation on Pi 4), we could also drop the\n> unmerged mode for now, and design a solution better for application-side\n> HDR.\n\nI'm definitely very keen to move forward all the discussions on\nmulti-channel AGC, IQ stability, per-frame controls and HDR. I am\naware that I want to be very flexible about many aspects of HDR, and I\nexplicitly do not want to tie our users down with \"it will work this\nway\", I want them to be able to implement the strategies that work\nbest for them. So maybe vendor-specific controls are another topic\nthat needs to be added to this list.\n\nFor the moment, however, Pi 5 is upon us and realistically, I think we\nhave no ability to change anything at this point. I'm definitely open\nto suggestions on how to rationalise things after our release.\n\nDavid\n\n>\n> > I guess there's also an outside chance that some implementations can't\n> > flick exposures instantaneously and accurately like we can, so maybe\n> > it would suit them too. Or we could have an\n> > \"HdrChannelCouldntMakeThisWorkProperly\" value? Having wrestled with\n> > all this for a while I can see the attraction! :)\n> >\n> > I'll send out another version today with the changes in line with all\n> > the above. Obviously everyone please shout if we want to do anything\n> > different.\n> >\n> > > > +        - name: HdrChannelShort\n> > > > +          value: 1\n> > > > +          description: |\n> > > > +            This is a short exposure image.\n> > > > +        - name: HdrChannelMedium\n> > > > +          value: 2\n> > > > +          description: |\n> > > > +            This is a medium exposure image.\n> > > > +        - name: HdrChannelLong\n> > > > +          value: 3\n> > > > +          description: |\n> > > > +            This is a long exposure image.\n> > > > +\n> > > >    # ----------------------------------------------------------------------------\n> > > >    # Draft controls section\n> > > >\n>\n> --\n> Regards,\n>\n> Laurent Pinchart","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id C83ABBDCBD\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 20 Oct 2023 11:09:50 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 23C896297F;\n\tFri, 20 Oct 2023 13:09:50 +0200 (CEST)","from mail-qt1-x82b.google.com (mail-qt1-x82b.google.com\n\t[IPv6:2607:f8b0:4864:20::82b])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id B8D2E60557\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 13:09:48 +0200 (CEST)","by mail-qt1-x82b.google.com with SMTP id\n\td75a77b69052e-41cbd2cf3bbso15054551cf.0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 04:09:48 -0700 (PDT)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1697800190;\n\tbh=6+/TpTF9bnqWZPpdZsU3FVMUSFMUZr8Qm63RPM/pkmY=;\n\th=References:In-Reply-To:Date:To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=MA0xULdTx1ohjafMJwclfE0O2xBUG4+Jgp/qnp79SF9F2W5Vz/Ae/PIIutKnaGiMW\n\tYFhzebEXCbkbtKoNR1uNtBFoGrljCXjqnGJA0HKNiljF4X0PcYBydjxBmf1FeLB3Lm\n\th7of0ZzFnDiJjgjDzetUoNf8Jwpc44NnvBLMYpcCa85iqzpnBRiVr7NhRISf1qdwpt\n\tk70dxaBuEo8uZC03rMgnHoGI2JfhOesOKCcopKWlAqlpA2BFcnKjySgbNclI2HoI5y\n\tOtTdGpbx7O5jOGXdg5JldB+WLnlLhko/YRGC3Vq0RshbILPkohvoejCW9tQVIUhaVC\n\tuxvSsHIhzmcJg==","v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google; t=1697800187; x=1698404987;\n\tdarn=lists.libcamera.org; \n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:from:to:cc:subject:date:message-id:reply-to;\n\tbh=cKXk+KPKD/gBqg225bsmoBF0TRoOziQAJc4okSFCEcc=;\n\tb=FmFu910corvccz8SCPpE2odmA/doUMeI/wcLxTB3hU3rGzCEg1RxPVn6yGp3OdxpuZ\n\tlHV0pP+yAYuFuYnihLQwM6w3ed+/pSEs4ASF9W9erCmnRg68txav4HyZl+X47fgk5BKu\n\t6bs/gDT8nrdQykfIyB7qVCF4pkGz4f0qSgtpHW7NfF6cG0NW+M0d5AjBPMwkkHpzEzSZ\n\tT0JVC+eRZZjvtWPvu00cgTxMvm735QJ1lbxPDRR4o9Y125T2SlQNLN5jxnCq1TXWdXfv\n\tkQGRS4l73Mv/NlT5as2caSs05NsiV46/XAJuuvpd6isQZV0XLHBekFYwj7OQDlVNIsNr\n\toeYw=="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key; \n\tunprotected) header.d=raspberrypi.com\n\theader.i=@raspberrypi.com\n\theader.b=\"FmFu910c\"; dkim-atps=neutral","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20230601; t=1697800187; x=1698404987;\n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:x-gm-message-state:from:to:cc:subject:date:message-id\n\t:reply-to;\n\tbh=cKXk+KPKD/gBqg225bsmoBF0TRoOziQAJc4okSFCEcc=;\n\tb=BdRXF9BFwzR+hgqHow8P20qCtCcfK0uiY79MCIponpCaeTIE1FYOjSF5QLeyVOYtlu\n\tiBzPBVzowH4C6GRnZVcmyqEAFXMV9KvypuYhts4jA6ekuoTP4EbtREbjPorBugw60lI5\n\ts1OHtRg8bUwYf178RNfBLvUhGaEvmXmrSkJUL1t6Y3ZYI2JVV2gWmrrNFnLKPS2O32MZ\n\teu/iQ1UOuIqOovhT4kjZs7pyc00rL6CdNOh7hDW21DWXrcJooVVN7JyAfk2AvSdcpVWN\n\tDthGcqp+xI+dIUhLUYzFTSKQXvmRtirqPIF+Wu++8dFgyRXjqRrXwAye4OiqmLFv3Gbx\n\t0dxg==","X-Gm-Message-State":"AOJu0YwbPz7JWsHrgIN1NnhfVCUz62iTwQFbXssLqaAlmF66uYU8ZAJQ\n\tr2Yke0NZ05mlzYr+U4yHOB9HZuLjDeAmTVfVLoKY5w==","X-Google-Smtp-Source":"AGHT+IHglUbxvdyV/+JepMFAux0pH9CEU5ImB8ULbXeXapDGrbymN6OlBBf5elhDnlhZtIPoKxplJRByfHDw/cl30dk=","X-Received":"by 2002:a0c:cb83:0:b0:66d:13c2:1c31 with SMTP id\n\tp3-20020a0ccb83000000b0066d13c21c31mr1613376qvk.24.1697800187064;\n\tFri, 20 Oct 2023 04:09:47 -0700 (PDT)","MIME-Version":"1.0","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>\n\t<20231019205235.GO14832@pendragon.ideasonboard.com>\n\t<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>\n\t<20231020095535.GB20914@pendragon.ideasonboard.com>","In-Reply-To":"<20231020095535.GB20914@pendragon.ideasonboard.com>","Date":"Fri, 20 Oct 2023 12:09:36 +0100","Message-ID":"<CAHW6GY+s8zL-5GeHDS50Ch6fE=3cuCkp3WXqY=QB89w2V7DLXg@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"David Plowman via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":28018,"web_url":"https://patchwork.libcamera.org/comment/28018/","msgid":"<20231020112605.GD20914@pendragon.ideasonboard.com>","date":"2023-10-20T11:26:05","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nOn Fri, Oct 20, 2023 at 12:09:36PM +0100, David Plowman wrote:\n> On Fri, 20 Oct 2023 at 10:55, Laurent Pinchart wrote:\n> > On Fri, Oct 20, 2023 at 09:37:28AM +0100, David Plowman wrote:\n> > > On Thu, 19 Oct 2023 at 21:52, Laurent Pinchart wrote:\n> > > > On Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> > > > > We add an HdrMode control (to enable and disable HDR processing)\n> > > > > and an HdrChannel, which indicates what kind of HDR frame (short, long\n> > > > > or medium) has just arrived.\n> > > > >\n> > > > > Currently the HdrMode supports the following values:\n> > > > >\n> > > > > * Off - no HDR processing at all.\n> > > > > * MultiExposureUnmerged - frames at multiple different exposures are\n> > > > >   produced, but not merged together. They are returned \"as is\".\n> > > > > * MultiExposure - frames at multiple different exposures are merged\n> > > > >   to create HDR images.\n> > > > > * SingleExposure - multiple frames all at the same exposure are\n> > > > >   merged to create HDR images.\n> > > > > * Night - multiple frames will be combined to create \"night mode\"\n> > > > >   images.\n> > > > >\n> > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > > > ---\n> > > > >  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n> > > > >  1 file changed, 75 insertions(+)\n> > > > >\n> > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > > > index f2e542f4..c3232abf 100644\n> > > > > --- a/src/libcamera/control_ids.yaml\n> > > > > +++ b/src/libcamera/control_ids.yaml\n> > > > > @@ -774,6 +774,81 @@ controls:\n> > > > >              Continuous AF is paused. No further state changes or lens movements\n> > > > >              will occur until the AfPauseResume control is sent.\n> > > > >\n> > > > > +  - HdrMode:\n> > > > > +      type: int32_t\n> > > > > +      description: |\n> > > > > +        Control to set the mode to be used for High Dynamic Range (HDR)\n> > > > > +        imaging. HDR techniques typically include multiple exposure, image\n> > > > > +        fusion and tone mapping techniques to improve the dynamic range of the\n> > > > > +        resulting images.\n> > > > > +\n> > > > > +        When using an HDR mode, images are tagged to indicate which HDR channel\n> > > > > +        (long, medium or short) they come from.\n> > > > > +\n> > > > > +        \\sa HdrChannel\n> > > > > +\n> > > > > +      enum:\n> > > > > +        - name: HdrModeOff\n> > > > > +          value: 0\n> > > > > +          description: |\n> > > > > +            HDR is disabled. The HDR channel, if present, will report\n> > > > > +            HdrChannelNone.\n> > > >\n> > > > Stating what HDR channel is used is an improvement compared to the\n> > > > previous version, but there's still an option left to implementors here:\n> > > > reporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\n> > > > see a reason to allow both, I would pick the latter:\n> > > >\n> > > >             HDR is disabled. Metadata for this frame will not include the\n> > > >             HdrChannel control.\n> > >\n> > > Yes, I think that's fine.\n> > >\n> > > > > +        - name: HdrModeMultiExposureUnmerged\n> > > > > +          value: 1\n> > > > > +          description: |\n> > > > > +            Multiple exposures will be generated in an alternating fashion.\n> > > > > +            However, they will not be merged together and will be returned to\n> > > > > +            the application as they are. Each image will be tagged with the\n> > > > > +            correct HDR channel, indicating what kind of exposure (long, medium\n> > > > > +            or short) it is.  The expectation is that, if necessary, the\n> > > > > +            application can merge them to create HDR images for itself.\n> > > >\n> > > > You mention here long, medium and short. Does this mean there will\n> > > > always be three channels ?\n> > >\n> > > No - it's whatever the implementation wants to do. We don't use\n> > > medium, for example. I think it's quite likely that some vendors would\n> > > want other channels, such as \"very short\" and \"very long\". Maybe the\n> > > description can avoid implying that all channels will appear.\n> >\n> > Shouldn't this be something that the application should control ? I'm\n> > increasingly thinking that this shouldn't be an HDR mode, but should\n> > instead be controlled through a per-frame control mechanism. Is there\n> > any reason this can't be done, not right now (I don't want to delay this\n> > patch unnecessarily), but at some point in the future ?\n> \n> Yes, I'd be happy with an API for multi-channel AGC, which is what\n> this really is. This was actually where I started, though it seemed\n> somewhat controversial so I thought it might be more palatable\n> presented like this instead. But either way works for me.\n>\n> I'm very keen to handle this inside our algorithms, so that\n> applications don't explicitly have to switch AGC channels all the\n> time. That's a more reliable approach (there will be fewer failures to\n> switch on every frame), and saves the application from a whole world\n> of pain and complication.\n>\n> Though I'm all in favour of per-frame controls for lots of other\n> reasons. We've been very keen to see that move forward for quite a\n> long time now!\n\nBoth make sense. I'm thinking that we may need some kind of \"controls\nscheduling\" API to help applications alternate between different\nsettings, in addition to the full manual mode offered by per-frame\ncontrols. I don't know yet how that would look like though.\n\n> > > > > +        - name: HdrModeMultiExposure\n> > > > > +          value: 2\n> > > > > +          description: |\n> > > > > +            Multiple exposures will be generated and merged to create HDR\n> > > > > +            images. Each image will be tagged with the HDR channel (long, medium\n> > > > > +            or short) that arrived and which caused this image to be output.\n> > > > > +        - name: HdrModeSingleExposure\n> > > > > +          value: 3\n> > > > > +          description: |\n> > > > > +            Multiple frames all at a single exposure will be used to create HDR\n> > > > > +            images. These images should be reported as all corresponding to the\n> > > > > +            HDR short channel.\n> > > > > +        - name: HdrModeNight\n> > > > > +          value: 4\n> > > > > +          description: |\n> > > > > +            Multiple frames will be combined to produce \"night mode\"\n> > > > > +            images. Images will be tagged as belonging either to the long,\n> > > > > +            medium or short HDR channel according to the implementation.\n> > > >\n> > > > Does this mean that night more will always use multi-exposure, or that\n> > > > it is implementation-defined ?\n> > >\n> > > I really think that needs to be implementation defined. Our night mode\n> > > is single-exposure, but I can't possibly predict what anyone else\n> > > would want to do.\n> >\n> > Won't it be problematic for applications if they don't know what\n> > HdrChannel values they will get ? How do you expect HdrChannel to be\n> > used in the HDR modes where the camera combines images (all but the\n> > Unmerged mode) ?\n> \n> Really, what I'm hoping is that the IQ stability control will indicate\n> when it's OK to capture the frame, and this will be set when all the\n> exposures that the implementation wants to use have arrived. So at\n> that point the channel information will be of interest to some folks,\n> but not essential to general applications. But obviously the IQ\n> stability control discussion needs to progress.\n\nThose are two different issues, aren't they ? The point I was trying to\nmake is that I think the HdrChannel metadata will not be useful for\napplications if we don't standardize at least a bit what it will report.\nUou gave an example where there would be a HdrChannelNone once per\n\"group\" of exposures, to be used as a viewfinder. If applications need\nthat, they need to be able to rely on it being present, or at least know\nif it will be, or have a way to select it.\n\nWhat I'm missing here is a design, it seems to be more of a prototype to\nsee what will happen. It's fine to prototype things, but I would also\nlike to think about a longer term design. At the very least, how can we\nget feedback on what will be useful to users, and when will we revisit\nthis ?\n\n> > > > > +\n> > > > > +  - HdrChannel:\n> > > > > +      type: int32_t\n> > > > > +      description: |\n> > > > > +        This value is reported back to the application so that it can discover\n> > > > > +        whether this capture corresponds to the short or long exposure image (or\n> > > > > +        any other image used by the HDR procedure). An application can monitor\n> > > > > +        the HDR channel to discover when the differently exposed images have\n> > > > > +        arrived.\n> > > > > +\n> > > > > +      enum:\n> > > > > +        - name: HdrChannelNone\n> > > > > +          value: 0\n> > > > > +          description: |\n> > > > > +            This image does not correspond to any of the captures used to create\n> > > > > +            an HDR image.\n> > > >\n> > > > As indicated above, do we need this, or should we not report HdrChannel\n> > > > when HDR is disabled ?\n> > >\n> > > Actually I'd quite like to keep this, even if it's not reported when HDR is off.\n> > >\n> > > One use case is multi-exposure HDR on a Pi 4. You can't merge images\n> > > so how would you get a viewfinder? You could intersperse some\n> > > \"ordinary AGC\" frames with your long/short/whatever frames. You might\n> > > want to label these \"HdrChannelNone\". I suppose you could label them\n> > > \"medium\", but then maybe you're using \"medium\" for other frames that\n> > > your HDR implementation requires. Like everything else, it all falls a\n> > > bit into the \"who knows what anyone will do\" category.\n> >\n> > This makes me believe even more strongly that the unmerged mode\n> > shouldn't be an HDR mode. We're trying to cram too many assumptions\n> > about the application needs here.\n> >\n> > If you want to keep HdrChannelNone for the time being until we drop\n> > HdrMultiExposureUnmerged, then this needs better documentation. Until I\n> > read your reply I had no idea that the unmerged mode would produce\n> > images for the \"None\" channel. If this isn't documented properly, it\n> > won't be usable for applications.\n> >\n> > Depending on the priorities (I assume the Pi 5 HDR modes will be more\n> > interesting than the unmerged operation on Pi 4), we could also drop the\n> > unmerged mode for now, and design a solution better for application-side\n> > HDR.\n> \n> I'm definitely very keen to move forward all the discussions on\n> multi-channel AGC, IQ stability, per-frame controls and HDR. I am\n> aware that I want to be very flexible about many aspects of HDR, and I\n> explicitly do not want to tie our users down with \"it will work this\n> way\", I want them to be able to implement the strategies that work\n> best for them. So maybe vendor-specific controls are another topic\n> that needs to be added to this list.\n> \n> For the moment, however, Pi 5 is upon us and realistically, I think we\n> have no ability to change anything at this point. I'm definitely open\n> to suggestions on how to rationalise things after our release.\n\nWould you prefer focussing on Pi 5 first and postponing\nHdrModeMultiExposureUnmerged, or bundling them all together ? If your\nfocus is in Pi 5 and you won't have time to release an implementation of\nHdrModeMultiExposureUnmerged with corresponding application-side support\nin the short term, maybe it would be better to merge the camera-side HDR\nsupport right now, and revisit HdrModeMultiExposureUnmerged when you'll\nhave time to work on its implementation ? To be perfectly clear here, I\nthink HdrModeMultiExposureUnmerged is less well-defined than the other\nmodes, and thus needs more work, but I'm also not opposed to merging it\nearly if we can come up with a plan to then improve it.\n\n> > > I guess there's also an outside chance that some implementations can't\n> > > flick exposures instantaneously and accurately like we can, so maybe\n> > > it would suit them too. Or we could have an\n> > > \"HdrChannelCouldntMakeThisWorkProperly\" value? Having wrestled with\n> > > all this for a while I can see the attraction! :)\n> > >\n> > > I'll send out another version today with the changes in line with all\n> > > the above. Obviously everyone please shout if we want to do anything\n> > > different.\n> > >\n> > > > > +        - name: HdrChannelShort\n> > > > > +          value: 1\n> > > > > +          description: |\n> > > > > +            This is a short exposure image.\n> > > > > +        - name: HdrChannelMedium\n> > > > > +          value: 2\n> > > > > +          description: |\n> > > > > +            This is a medium exposure image.\n> > > > > +        - name: HdrChannelLong\n> > > > > +          value: 3\n> > > > > +          description: |\n> > > > > +            This is a long exposure image.\n> > > > > +\n> > > > >    # ----------------------------------------------------------------------------\n> > > > >    # Draft controls section\n> > > > >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id DBF5BC3272\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 20 Oct 2023 11:26:01 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 40F0B6297F;\n\tFri, 20 Oct 2023 13:26:01 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id A41AF60557\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 13:25:59 +0200 (CEST)","from pendragon.ideasonboard.com (213-243-189-158.bb.dnainternet.fi\n\t[213.243.189.158])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id C9ECE2B3;\n\tFri, 20 Oct 2023 13:25:50 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1697801161;\n\tbh=PIU1t6osPlY13m0QfyPyDPAXU0+ySfywkIA+QL1qCRU=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=pYYANMB4RoIOpI5mMAcbH+cH+Zur+kaOMtR4d+OoWcZLXo3dQX7nXbDcO7XpIjFJ5\n\tfg2BZM3YLRY9+8Imdshi74JXxSvOZhrOyJw+3WxgoPtZSgGZMfkxeR1kJEW48GTPbE\n\tQonEEnvmn4rgga7WvSIVcsYNJjd72VhTBPteZ8bp8k04yQiTplFOH89nJub7fH6LEU\n\tPeE3pcMpc9L76QRJo7elJo0AE08YL180u+ppG8NiZQUoTcrcIa6HNiGjnHa96yYTfb\n\tyhrWZo6OCDMV/HeoysBJT3pjwt6VCSRynb/Ry9qOdZjwTvtL04V9G/UXLa3+IC4TTF\n\ttjJdiRQFfAEvA==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1697801151;\n\tbh=PIU1t6osPlY13m0QfyPyDPAXU0+ySfywkIA+QL1qCRU=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=AGfV2kbvCfjI3po3dn4M5Zdw0HWbLNwXNvAOnl/s1iUBmpAKs+YZWCuihFGtmFyQJ\n\t3ce/ffzjioMgPlxEFZHCc7AwsuMNqSvJC1zf/4QMDTv1HuNgTiCTaCfXBADHhYOfTV\n\t2iO5fLYJjfMb+xLb+kwGbANszg4F3hMAKT5tXHYI="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"AGfV2kbv\"; dkim-atps=neutral","Date":"Fri, 20 Oct 2023 14:26:05 +0300","To":"David Plowman <david.plowman@raspberrypi.com>","Message-ID":"<20231020112605.GD20914@pendragon.ideasonboard.com>","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>\n\t<20231019205235.GO14832@pendragon.ideasonboard.com>\n\t<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>\n\t<20231020095535.GB20914@pendragon.ideasonboard.com>\n\t<CAHW6GY+s8zL-5GeHDS50Ch6fE=3cuCkp3WXqY=QB89w2V7DLXg@mail.gmail.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<CAHW6GY+s8zL-5GeHDS50Ch6fE=3cuCkp3WXqY=QB89w2V7DLXg@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Laurent Pinchart via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":28019,"web_url":"https://patchwork.libcamera.org/comment/28019/","msgid":"<CAHW6GYKULifqq70bG1JGBDM3T7QAyfJ-uCYrDT2by4P4NZ1oNg@mail.gmail.com>","date":"2023-10-20T12:11:04","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Laurent\n\n\nOn Fri, 20 Oct 2023 at 12:26, Laurent Pinchart\n<laurent.pinchart@ideasonboard.com> wrote:\n>\n> Hi David,\n>\n> On Fri, Oct 20, 2023 at 12:09:36PM +0100, David Plowman wrote:\n> > On Fri, 20 Oct 2023 at 10:55, Laurent Pinchart wrote:\n> > > On Fri, Oct 20, 2023 at 09:37:28AM +0100, David Plowman wrote:\n> > > > On Thu, 19 Oct 2023 at 21:52, Laurent Pinchart wrote:\n> > > > > On Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> > > > > > We add an HdrMode control (to enable and disable HDR processing)\n> > > > > > and an HdrChannel, which indicates what kind of HDR frame (short, long\n> > > > > > or medium) has just arrived.\n> > > > > >\n> > > > > > Currently the HdrMode supports the following values:\n> > > > > >\n> > > > > > * Off - no HDR processing at all.\n> > > > > > * MultiExposureUnmerged - frames at multiple different exposures are\n> > > > > >   produced, but not merged together. They are returned \"as is\".\n> > > > > > * MultiExposure - frames at multiple different exposures are merged\n> > > > > >   to create HDR images.\n> > > > > > * SingleExposure - multiple frames all at the same exposure are\n> > > > > >   merged to create HDR images.\n> > > > > > * Night - multiple frames will be combined to create \"night mode\"\n> > > > > >   images.\n> > > > > >\n> > > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > > > > ---\n> > > > > >  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n> > > > > >  1 file changed, 75 insertions(+)\n> > > > > >\n> > > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > > > > index f2e542f4..c3232abf 100644\n> > > > > > --- a/src/libcamera/control_ids.yaml\n> > > > > > +++ b/src/libcamera/control_ids.yaml\n> > > > > > @@ -774,6 +774,81 @@ controls:\n> > > > > >              Continuous AF is paused. No further state changes or lens movements\n> > > > > >              will occur until the AfPauseResume control is sent.\n> > > > > >\n> > > > > > +  - HdrMode:\n> > > > > > +      type: int32_t\n> > > > > > +      description: |\n> > > > > > +        Control to set the mode to be used for High Dynamic Range (HDR)\n> > > > > > +        imaging. HDR techniques typically include multiple exposure, image\n> > > > > > +        fusion and tone mapping techniques to improve the dynamic range of the\n> > > > > > +        resulting images.\n> > > > > > +\n> > > > > > +        When using an HDR mode, images are tagged to indicate which HDR channel\n> > > > > > +        (long, medium or short) they come from.\n> > > > > > +\n> > > > > > +        \\sa HdrChannel\n> > > > > > +\n> > > > > > +      enum:\n> > > > > > +        - name: HdrModeOff\n> > > > > > +          value: 0\n> > > > > > +          description: |\n> > > > > > +            HDR is disabled. The HDR channel, if present, will report\n> > > > > > +            HdrChannelNone.\n> > > > >\n> > > > > Stating what HDR channel is used is an improvement compared to the\n> > > > > previous version, but there's still an option left to implementors here:\n> > > > > reporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\n> > > > > see a reason to allow both, I would pick the latter:\n> > > > >\n> > > > >             HDR is disabled. Metadata for this frame will not include the\n> > > > >             HdrChannel control.\n> > > >\n> > > > Yes, I think that's fine.\n> > > >\n> > > > > > +        - name: HdrModeMultiExposureUnmerged\n> > > > > > +          value: 1\n> > > > > > +          description: |\n> > > > > > +            Multiple exposures will be generated in an alternating fashion.\n> > > > > > +            However, they will not be merged together and will be returned to\n> > > > > > +            the application as they are. Each image will be tagged with the\n> > > > > > +            correct HDR channel, indicating what kind of exposure (long, medium\n> > > > > > +            or short) it is.  The expectation is that, if necessary, the\n> > > > > > +            application can merge them to create HDR images for itself.\n> > > > >\n> > > > > You mention here long, medium and short. Does this mean there will\n> > > > > always be three channels ?\n> > > >\n> > > > No - it's whatever the implementation wants to do. We don't use\n> > > > medium, for example. I think it's quite likely that some vendors would\n> > > > want other channels, such as \"very short\" and \"very long\". Maybe the\n> > > > description can avoid implying that all channels will appear.\n> > >\n> > > Shouldn't this be something that the application should control ? I'm\n> > > increasingly thinking that this shouldn't be an HDR mode, but should\n> > > instead be controlled through a per-frame control mechanism. Is there\n> > > any reason this can't be done, not right now (I don't want to delay this\n> > > patch unnecessarily), but at some point in the future ?\n> >\n> > Yes, I'd be happy with an API for multi-channel AGC, which is what\n> > this really is. This was actually where I started, though it seemed\n> > somewhat controversial so I thought it might be more palatable\n> > presented like this instead. But either way works for me.\n> >\n> > I'm very keen to handle this inside our algorithms, so that\n> > applications don't explicitly have to switch AGC channels all the\n> > time. That's a more reliable approach (there will be fewer failures to\n> > switch on every frame), and saves the application from a whole world\n> > of pain and complication.\n> >\n> > Though I'm all in favour of per-frame controls for lots of other\n> > reasons. We've been very keen to see that move forward for quite a\n> > long time now!\n>\n> Both make sense. I'm thinking that we may need some kind of \"controls\n> scheduling\" API to help applications alternate between different\n> settings, in addition to the full manual mode offered by per-frame\n> controls. I don't know yet how that would look like though.\n\nControl scheduling is an interesting idea. I'm not sure how it relates\nto AGC because AGC/AE controls are always a bit special because of the\nframe delays that are incurred in the sensor. Controls also tend to\nwind up at the back of the request queue which is another debatable\nfeature for us, and not a place we'd be wanting exposure/gain updates\nto languish. So certainly things to think about.\n\n>\n> > > > > > +        - name: HdrModeMultiExposure\n> > > > > > +          value: 2\n> > > > > > +          description: |\n> > > > > > +            Multiple exposures will be generated and merged to create HDR\n> > > > > > +            images. Each image will be tagged with the HDR channel (long, medium\n> > > > > > +            or short) that arrived and which caused this image to be output.\n> > > > > > +        - name: HdrModeSingleExposure\n> > > > > > +          value: 3\n> > > > > > +          description: |\n> > > > > > +            Multiple frames all at a single exposure will be used to create HDR\n> > > > > > +            images. These images should be reported as all corresponding to the\n> > > > > > +            HDR short channel.\n> > > > > > +        - name: HdrModeNight\n> > > > > > +          value: 4\n> > > > > > +          description: |\n> > > > > > +            Multiple frames will be combined to produce \"night mode\"\n> > > > > > +            images. Images will be tagged as belonging either to the long,\n> > > > > > +            medium or short HDR channel according to the implementation.\n> > > > >\n> > > > > Does this mean that night more will always use multi-exposure, or that\n> > > > > it is implementation-defined ?\n> > > >\n> > > > I really think that needs to be implementation defined. Our night mode\n> > > > is single-exposure, but I can't possibly predict what anyone else\n> > > > would want to do.\n> > >\n> > > Won't it be problematic for applications if they don't know what\n> > > HdrChannel values they will get ? How do you expect HdrChannel to be\n> > > used in the HDR modes where the camera combines images (all but the\n> > > Unmerged mode) ?\n> >\n> > Really, what I'm hoping is that the IQ stability control will indicate\n> > when it's OK to capture the frame, and this will be set when all the\n> > exposures that the implementation wants to use have arrived. So at\n> > that point the channel information will be of interest to some folks,\n> > but not essential to general applications. But obviously the IQ\n> > stability control discussion needs to progress.\n>\n> Those are two different issues, aren't they ? The point I was trying to\n> make is that I think the HdrChannel metadata will not be useful for\n> applications if we don't standardize at least a bit what it will report.\n> Uou gave an example where there would be a HdrChannelNone once per\n> \"group\" of exposures, to be used as a viewfinder. If applications need\n> that, they need to be able to rely on it being present, or at least know\n> if it will be, or have a way to select it.\n\nI agree standardisation is good here. I also think that beyond some\nobvious cases (long/medium/short) it will be difficult. I'm already\nexpecting \"very short\" and \"very long\" at some point. I see a need for\n\"this isn't HDR at all but is for preview\", maybe that's just \"this is\na normal AGC image\". I think some platforms might just want a long run\nof exposures where some might be the same, some might be different,\nand the only way to name them will be \"0\", \"1\" and so on.\n\nWe've had this problem to some extent elsewhere and have adopted a\n\"custom\" value. But it's not a great solution, and I don't think it\nreally flies here when there could be many \"custom\" values.\n\n>\n> What I'm missing here is a design, it seems to be more of a prototype to\n> see what will happen. It's fine to prototype things, but I would also\n> like to think about a longer term design. At the very least, how can we\n> get feedback on what will be useful to users, and when will we revisit\n> this ?\n\nI wouldn't disagree with any of that, though I think experience tells\nus that the only real way to get feedback is to put stuff out there\nand see how people react. We aren't short of wider camera forums where\ngetting meaningful engagement from other parties is difficult! To be\nfair, Pi users are very good in this respect, though I think most\nwould engage more actively when there's stuff to play with. But hard\nto say!\n\n>\n> > > > > > +\n> > > > > > +  - HdrChannel:\n> > > > > > +      type: int32_t\n> > > > > > +      description: |\n> > > > > > +        This value is reported back to the application so that it can discover\n> > > > > > +        whether this capture corresponds to the short or long exposure image (or\n> > > > > > +        any other image used by the HDR procedure). An application can monitor\n> > > > > > +        the HDR channel to discover when the differently exposed images have\n> > > > > > +        arrived.\n> > > > > > +\n> > > > > > +      enum:\n> > > > > > +        - name: HdrChannelNone\n> > > > > > +          value: 0\n> > > > > > +          description: |\n> > > > > > +            This image does not correspond to any of the captures used to create\n> > > > > > +            an HDR image.\n> > > > >\n> > > > > As indicated above, do we need this, or should we not report HdrChannel\n> > > > > when HDR is disabled ?\n> > > >\n> > > > Actually I'd quite like to keep this, even if it's not reported when HDR is off.\n> > > >\n> > > > One use case is multi-exposure HDR on a Pi 4. You can't merge images\n> > > > so how would you get a viewfinder? You could intersperse some\n> > > > \"ordinary AGC\" frames with your long/short/whatever frames. You might\n> > > > want to label these \"HdrChannelNone\". I suppose you could label them\n> > > > \"medium\", but then maybe you're using \"medium\" for other frames that\n> > > > your HDR implementation requires. Like everything else, it all falls a\n> > > > bit into the \"who knows what anyone will do\" category.\n> > >\n> > > This makes me believe even more strongly that the unmerged mode\n> > > shouldn't be an HDR mode. We're trying to cram too many assumptions\n> > > about the application needs here.\n> > >\n> > > If you want to keep HdrChannelNone for the time being until we drop\n> > > HdrMultiExposureUnmerged, then this needs better documentation. Until I\n> > > read your reply I had no idea that the unmerged mode would produce\n> > > images for the \"None\" channel. If this isn't documented properly, it\n> > > won't be usable for applications.\n> > >\n> > > Depending on the priorities (I assume the Pi 5 HDR modes will be more\n> > > interesting than the unmerged operation on Pi 4), we could also drop the\n> > > unmerged mode for now, and design a solution better for application-side\n> > > HDR.\n> >\n> > I'm definitely very keen to move forward all the discussions on\n> > multi-channel AGC, IQ stability, per-frame controls and HDR. I am\n> > aware that I want to be very flexible about many aspects of HDR, and I\n> > explicitly do not want to tie our users down with \"it will work this\n> > way\", I want them to be able to implement the strategies that work\n> > best for them. So maybe vendor-specific controls are another topic\n> > that needs to be added to this list.\n> >\n> > For the moment, however, Pi 5 is upon us and realistically, I think we\n> > have no ability to change anything at this point. I'm definitely open\n> > to suggestions on how to rationalise things after our release.\n>\n> Would you prefer focussing on Pi 5 first and postponing\n> HdrModeMultiExposureUnmerged, or bundling them all together ? If your\n> focus is in Pi 5 and you won't have time to release an implementation of\n> HdrModeMultiExposureUnmerged with corresponding application-side support\n> in the short term, maybe it would be better to merge the camera-side HDR\n> support right now, and revisit HdrModeMultiExposureUnmerged when you'll\n> have time to work on its implementation ? To be perfectly clear here, I\n> think HdrModeMultiExposureUnmerged is less well-defined than the other\n> modes, and thus needs more work, but I'm also not opposed to merging it\n> early if we can come up with a plan to then improve it.\n\nFor us, the release we're making in a day or two is a fait accompli at\nthis point (Pi 5s are being shipped, I hear), and all this stuff is in\nit. All these modes are implemented and work. So from our point of\nview, bundling everything that we've done together would minimse\ndivergence. Though we can live with it too, at least temporarily.\n\nBut as I said, I'm more than happy to revisit stuff. I think\nmulti-channel AGC is probably a better answer than \"unmerged\" HDR, but\nwe need to agree an approach.\n\nI think many platforms would react with horror to multi-channel AGC.\nThey'd take the view that they'll just drive multiple different\nexposures explicitly, and not worry about multiple channels. Though\nyou still have to decide if they need to adjust dynamically and how\nyou switch between them, how they don't get too far apart, and how to\nmake it jump from one exposure to another without adapting slowly. So\nit's all the same problems, just without exposing a general solution\nthat enables it. But it's definitely more viable if you're only doing\nstill image HDR. I can't escape wondering whether the dawn of vendor\nextensions is drawing closer.\n\nSo lots to think about!\n\nDavid\n\n>\n> > > > I guess there's also an outside chance that some implementations can't\n> > > > flick exposures instantaneously and accurately like we can, so maybe\n> > > > it would suit them too. Or we could have an\n> > > > \"HdrChannelCouldntMakeThisWorkProperly\" value? Having wrestled with\n> > > > all this for a while I can see the attraction! :)\n> > > >\n> > > > I'll send out another version today with the changes in line with all\n> > > > the above. Obviously everyone please shout if we want to do anything\n> > > > different.\n> > > >\n> > > > > > +        - name: HdrChannelShort\n> > > > > > +          value: 1\n> > > > > > +          description: |\n> > > > > > +            This is a short exposure image.\n> > > > > > +        - name: HdrChannelMedium\n> > > > > > +          value: 2\n> > > > > > +          description: |\n> > > > > > +            This is a medium exposure image.\n> > > > > > +        - name: HdrChannelLong\n> > > > > > +          value: 3\n> > > > > > +          description: |\n> > > > > > +            This is a long exposure image.\n> > > > > > +\n> > > > > >    # ----------------------------------------------------------------------------\n> > > > > >    # Draft controls section\n> > > > > >\n>\n> --\n> Regards,\n>\n> Laurent Pinchart","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id C2656BDCBD\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 20 Oct 2023 12:11:19 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 221B06297F;\n\tFri, 20 Oct 2023 14:11:19 +0200 (CEST)","from mail-yw1-x112a.google.com (mail-yw1-x112a.google.com\n\t[IPv6:2607:f8b0:4864:20::112a])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 9450360557\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 14:11:17 +0200 (CEST)","by mail-yw1-x112a.google.com with SMTP id\n\t00721157ae682-5a7af20c488so7917237b3.1\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 05:11:17 -0700 (PDT)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1697803879;\n\tbh=Plmh4hM9xi3EERne8wgXCt8MQwGxy5seybtIAhK7+gg=;\n\th=References:In-Reply-To:Date:To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=gvjnem4QoZKJlmoC5yS5mWzqi7DldP2wv0riK/Dq95WUNlOC+wTvNiRkI+387GfUT\n\tAx4DwHAszo+xQmtTxGqtFk0zq/7122i1iRU7davioLBXNBE2uqyZne40oWbgy3zNTs\n\tr6OdvJfUXAOh+ONUAwJR5lkPtTEz22o090HKgVo5h/y9v0J9NRCswm6e/TvkClWHzH\n\t1jeyJsLNZLucEZNw4pxmlwb7ISS18iA2YW4swy7BfoJPF5xw+jH68idgtDwvjgw+py\n\tXPdAmmNW4ggLPdl3Jx9XLQRP9qdBWgh0CZHt1uwJq6WQ70r87nhLkJeO+JgZI3EZhe\n\tCrjODKtx+kHjw==","v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google; t=1697803876; x=1698408676;\n\tdarn=lists.libcamera.org; \n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:from:to:cc:subject:date:message-id:reply-to;\n\tbh=+11XCEztAoglJZy9gry5DPNCVwh1FYCKWyNbG6t45MA=;\n\tb=d8Gmuism/OC3zqivCX7qni39eIFnFM948n4FkbzJXrYfj2/NWHi6rfBaWkiKm7gNgL\n\t3AsrDDrIAKCUPPi0SPmrH8YEnpyCT7VhJpWGmVOXYPtor0yBrUX1sJ6UBosGvl6oS6KU\n\tN6/nhlzIcX9EvT02HJyc9ZYcrOQQx8yItbF9VslgdIZlGX0j2vr84MH9SoR5Wnp8Ci3x\n\tSKJrk7JK07usA93n7Iy+/eacZUEIfQymqN4On9MSMu415s//mV0Th/brQwejONHIMsjl\n\t/4oKkOq0d4EgkRRlCkWySfSQxI3ps8+1SOIaUDDp3n83jXvI7imDi+EnylYDwReRD+FV\n\t48gw=="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key; \n\tunprotected) header.d=raspberrypi.com\n\theader.i=@raspberrypi.com\n\theader.b=\"d8Gmuism\"; dkim-atps=neutral","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20230601; t=1697803876; x=1698408676;\n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:x-gm-message-state:from:to:cc:subject:date:message-id\n\t:reply-to;\n\tbh=+11XCEztAoglJZy9gry5DPNCVwh1FYCKWyNbG6t45MA=;\n\tb=rJ2AQpd3/CH94eRYVO7oVVtxMNswjX6XKxH/HZ9eRhkkN4d9dgQKG+86iAjLGylnb3\n\tA09Vu4fVi6E20Fw1Cx0ALIkoleMLql057T7aIG/89uLsyyo2njh00/DPdb+o3+wwlUuu\n\tm5AZ040vyJEDduVsyb+Fhjp1IT7hqi/N4YtA2P0l91mxDrKI3jGsHG9LrtkH3Se5fEsX\n\tVYMqF+wen8lwhMxeJxTa9nMD1cOhPs9t9l8jKlUuBPCBaBGyCyzFYBrltqSMpuK6HdK+\n\tOZtBc3rMY6pzuPp78eroW61jY84SfDS8ATJPrwKIHErSn7ztjInw26gvsviCLkqkqhaS\n\trWJg==","X-Gm-Message-State":"AOJu0YyBIUOXdGE7hDQlYp9obYdD522yw7XDyHP/ZpjY+1fM6wfGNyqk\n\tulSHU4eAakYTMkIG04ITp9Rs2YtIZlYZnIU5K0hM4A==","X-Google-Smtp-Source":"AGHT+IHMQuB4fP7lx5AvMV0oi2u15cqVEQ1upOh6gfAIg5jNoaxAUqZf68PlZK0U/yzHf0XkG5XLJyQXqU69+UHZt5Q=","X-Received":"by 2002:a05:6902:188f:b0:d9a:c946:4187 with SMTP id\n\tcj15-20020a056902188f00b00d9ac9464187mr1763634ybb.22.1697803875904;\n\tFri, 20 Oct 2023 05:11:15 -0700 (PDT)","MIME-Version":"1.0","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>\n\t<20231019205235.GO14832@pendragon.ideasonboard.com>\n\t<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>\n\t<20231020095535.GB20914@pendragon.ideasonboard.com>\n\t<CAHW6GY+s8zL-5GeHDS50Ch6fE=3cuCkp3WXqY=QB89w2V7DLXg@mail.gmail.com>\n\t<20231020112605.GD20914@pendragon.ideasonboard.com>","In-Reply-To":"<20231020112605.GD20914@pendragon.ideasonboard.com>","Date":"Fri, 20 Oct 2023 13:11:04 +0100","Message-ID":"<CAHW6GYKULifqq70bG1JGBDM3T7QAyfJ-uCYrDT2by4P4NZ1oNg@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"David Plowman via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":28020,"web_url":"https://patchwork.libcamera.org/comment/28020/","msgid":"<20231020131845.GE20914@pendragon.ideasonboard.com>","date":"2023-10-20T13:18:45","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nOn Fri, Oct 20, 2023 at 01:11:04PM +0100, David Plowman wrote:\n> On Fri, 20 Oct 2023 at 12:26, Laurent Pinchart wrote:\n> > On Fri, Oct 20, 2023 at 12:09:36PM +0100, David Plowman wrote:\n> > > On Fri, 20 Oct 2023 at 10:55, Laurent Pinchart wrote:\n> > > > On Fri, Oct 20, 2023 at 09:37:28AM +0100, David Plowman wrote:\n> > > > > On Thu, 19 Oct 2023 at 21:52, Laurent Pinchart wrote:\n> > > > > > On Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> > > > > > > We add an HdrMode control (to enable and disable HDR processing)\n> > > > > > > and an HdrChannel, which indicates what kind of HDR frame (short, long\n> > > > > > > or medium) has just arrived.\n> > > > > > >\n> > > > > > > Currently the HdrMode supports the following values:\n> > > > > > >\n> > > > > > > * Off - no HDR processing at all.\n> > > > > > > * MultiExposureUnmerged - frames at multiple different exposures are\n> > > > > > >   produced, but not merged together. They are returned \"as is\".\n> > > > > > > * MultiExposure - frames at multiple different exposures are merged\n> > > > > > >   to create HDR images.\n> > > > > > > * SingleExposure - multiple frames all at the same exposure are\n> > > > > > >   merged to create HDR images.\n> > > > > > > * Night - multiple frames will be combined to create \"night mode\"\n> > > > > > >   images.\n> > > > > > >\n> > > > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > > > > > ---\n> > > > > > >  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n> > > > > > >  1 file changed, 75 insertions(+)\n> > > > > > >\n> > > > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > > > > > index f2e542f4..c3232abf 100644\n> > > > > > > --- a/src/libcamera/control_ids.yaml\n> > > > > > > +++ b/src/libcamera/control_ids.yaml\n> > > > > > > @@ -774,6 +774,81 @@ controls:\n> > > > > > >              Continuous AF is paused. No further state changes or lens movements\n> > > > > > >              will occur until the AfPauseResume control is sent.\n> > > > > > >\n> > > > > > > +  - HdrMode:\n> > > > > > > +      type: int32_t\n> > > > > > > +      description: |\n> > > > > > > +        Control to set the mode to be used for High Dynamic Range (HDR)\n> > > > > > > +        imaging. HDR techniques typically include multiple exposure, image\n> > > > > > > +        fusion and tone mapping techniques to improve the dynamic range of the\n> > > > > > > +        resulting images.\n> > > > > > > +\n> > > > > > > +        When using an HDR mode, images are tagged to indicate which HDR channel\n> > > > > > > +        (long, medium or short) they come from.\n> > > > > > > +\n> > > > > > > +        \\sa HdrChannel\n> > > > > > > +\n> > > > > > > +      enum:\n> > > > > > > +        - name: HdrModeOff\n> > > > > > > +          value: 0\n> > > > > > > +          description: |\n> > > > > > > +            HDR is disabled. The HDR channel, if present, will report\n> > > > > > > +            HdrChannelNone.\n> > > > > >\n> > > > > > Stating what HDR channel is used is an improvement compared to the\n> > > > > > previous version, but there's still an option left to implementors here:\n> > > > > > reporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\n> > > > > > see a reason to allow both, I would pick the latter:\n> > > > > >\n> > > > > >             HDR is disabled. Metadata for this frame will not include the\n> > > > > >             HdrChannel control.\n> > > > >\n> > > > > Yes, I think that's fine.\n> > > > >\n> > > > > > > +        - name: HdrModeMultiExposureUnmerged\n> > > > > > > +          value: 1\n> > > > > > > +          description: |\n> > > > > > > +            Multiple exposures will be generated in an alternating fashion.\n> > > > > > > +            However, they will not be merged together and will be returned to\n> > > > > > > +            the application as they are. Each image will be tagged with the\n> > > > > > > +            correct HDR channel, indicating what kind of exposure (long, medium\n> > > > > > > +            or short) it is.  The expectation is that, if necessary, the\n> > > > > > > +            application can merge them to create HDR images for itself.\n> > > > > >\n> > > > > > You mention here long, medium and short. Does this mean there will\n> > > > > > always be three channels ?\n> > > > >\n> > > > > No - it's whatever the implementation wants to do. We don't use\n> > > > > medium, for example. I think it's quite likely that some vendors would\n> > > > > want other channels, such as \"very short\" and \"very long\". Maybe the\n> > > > > description can avoid implying that all channels will appear.\n> > > >\n> > > > Shouldn't this be something that the application should control ? I'm\n> > > > increasingly thinking that this shouldn't be an HDR mode, but should\n> > > > instead be controlled through a per-frame control mechanism. Is there\n> > > > any reason this can't be done, not right now (I don't want to delay this\n> > > > patch unnecessarily), but at some point in the future ?\n> > >\n> > > Yes, I'd be happy with an API for multi-channel AGC, which is what\n> > > this really is. This was actually where I started, though it seemed\n> > > somewhat controversial so I thought it might be more palatable\n> > > presented like this instead. But either way works for me.\n> > >\n> > > I'm very keen to handle this inside our algorithms, so that\n> > > applications don't explicitly have to switch AGC channels all the\n> > > time. That's a more reliable approach (there will be fewer failures to\n> > > switch on every frame), and saves the application from a whole world\n> > > of pain and complication.\n> > >\n> > > Though I'm all in favour of per-frame controls for lots of other\n> > > reasons. We've been very keen to see that move forward for quite a\n> > > long time now!\n> >\n> > Both make sense. I'm thinking that we may need some kind of \"controls\n> > scheduling\" API to help applications alternate between different\n> > settings, in addition to the full manual mode offered by per-frame\n> > controls. I don't know yet how that would look like though.\n> \n> Control scheduling is an interesting idea. I'm not sure how it relates\n> to AGC because AGC/AE controls are always a bit special because of the\n> frame delays that are incurred in the sensor. Controls also tend to\n> wind up at the back of the request queue which is another debatable\n> feature for us, and not a place we'd be wanting exposure/gain updates\n> to languish. So certainly things to think about.\n\nAnother point to keep in mind is that, while application-side HDR with\nstaggered exposures in its simplest form from a libcamera point of view\nwould see the application specify per-frame manual exposures, I think we\nneed to provide a way to still run the libcamera-side AEC/AGC. Maybe\nsetting the ExposureValue per frame would be such a way, to enable\ngeneration of darker and lighter frames based on the \"normal\" exposure\ncalculated by the AEC/AGC algorithm. Maybe we will also need more/other\ncontrols for this.\n\n> > > > > > > +        - name: HdrModeMultiExposure\n> > > > > > > +          value: 2\n> > > > > > > +          description: |\n> > > > > > > +            Multiple exposures will be generated and merged to create HDR\n> > > > > > > +            images. Each image will be tagged with the HDR channel (long, medium\n> > > > > > > +            or short) that arrived and which caused this image to be output.\n> > > > > > > +        - name: HdrModeSingleExposure\n> > > > > > > +          value: 3\n> > > > > > > +          description: |\n> > > > > > > +            Multiple frames all at a single exposure will be used to create HDR\n> > > > > > > +            images. These images should be reported as all corresponding to the\n> > > > > > > +            HDR short channel.\n> > > > > > > +        - name: HdrModeNight\n> > > > > > > +          value: 4\n> > > > > > > +          description: |\n> > > > > > > +            Multiple frames will be combined to produce \"night mode\"\n> > > > > > > +            images. Images will be tagged as belonging either to the long,\n> > > > > > > +            medium or short HDR channel according to the implementation.\n> > > > > >\n> > > > > > Does this mean that night more will always use multi-exposure, or that\n> > > > > > it is implementation-defined ?\n> > > > >\n> > > > > I really think that needs to be implementation defined. Our night mode\n> > > > > is single-exposure, but I can't possibly predict what anyone else\n> > > > > would want to do.\n> > > >\n> > > > Won't it be problematic for applications if they don't know what\n> > > > HdrChannel values they will get ? How do you expect HdrChannel to be\n> > > > used in the HDR modes where the camera combines images (all but the\n> > > > Unmerged mode) ?\n> > >\n> > > Really, what I'm hoping is that the IQ stability control will indicate\n> > > when it's OK to capture the frame, and this will be set when all the\n> > > exposures that the implementation wants to use have arrived. So at\n> > > that point the channel information will be of interest to some folks,\n> > > but not essential to general applications. But obviously the IQ\n> > > stability control discussion needs to progress.\n> >\n> > Those are two different issues, aren't they ? The point I was trying to\n> > make is that I think the HdrChannel metadata will not be useful for\n> > applications if we don't standardize at least a bit what it will report.\n> > Uou gave an example where there would be a HdrChannelNone once per\n> > \"group\" of exposures, to be used as a viewfinder. If applications need\n> > that, they need to be able to rely on it being present, or at least know\n> > if it will be, or have a way to select it.\n> \n> I agree standardisation is good here. I also think that beyond some\n> obvious cases (long/medium/short) it will be difficult. I'm already\n> expecting \"very short\" and \"very long\" at some point. I see a need for\n> \"this isn't HDR at all but is for preview\", maybe that's just \"this is\n> a normal AGC image\". I think some platforms might just want a long run\n> of exposures where some might be the same, some might be different,\n> and the only way to name them will be \"0\", \"1\" and so on.\n\nCould we at least decide on basic rules that govern what channels should\nbe used, depending on the number of channels ? For instance, if we were\nto define 5 channels (very short, short, medium, long, very long), and a\ncamera used 3 channels, it would be horrible to ask applications\ndevelopers to be prepared to handle all possible combinations. Neither\nwould it make much sense, in my opinion, to allow cameras to use \"very\nshort, long, very long\" in that case. The exposures are relative to each\nother, there's no rule to govern the delta between them, so I think we\ncan at least set some rules on the ordering.\n\nFor the single-exposure case, does it even make sense to report a\nchannel, given that a single one is used ?\n\nFor the multi-exposure cases, can we tell which channels will be used\nbased on the number of channels ? If you want some images to be reported\nwith channel \"None\", can we document this, and explain what those images\nwill be ?\n\n> We've had this problem to some extent elsewhere and have adopted a\n> \"custom\" value. But it's not a great solution, and I don't think it\n> really flies here when there could be many \"custom\" values.\n> \n> > What I'm missing here is a design, it seems to be more of a prototype to\n> > see what will happen. It's fine to prototype things, but I would also\n> > like to think about a longer term design. At the very least, how can we\n> > get feedback on what will be useful to users, and when will we revisit\n> > this ?\n> \n> I wouldn't disagree with any of that, though I think experience tells\n> us that the only real way to get feedback is to put stuff out there\n> and see how people react. We aren't short of wider camera forums where\n> getting meaningful engagement from other parties is difficult! To be\n> fair, Pi users are very good in this respect, though I think most\n> would engage more actively when there's stuff to play with. But hard\n> to say!\n\nI agree, getting it in the hands of users is certainly a way to get\nfeedback (how constructive that feedback is is a different question\n:-)).\n\n> > > > > > > +\n> > > > > > > +  - HdrChannel:\n> > > > > > > +      type: int32_t\n> > > > > > > +      description: |\n> > > > > > > +        This value is reported back to the application so that it can discover\n> > > > > > > +        whether this capture corresponds to the short or long exposure image (or\n> > > > > > > +        any other image used by the HDR procedure). An application can monitor\n> > > > > > > +        the HDR channel to discover when the differently exposed images have\n> > > > > > > +        arrived.\n> > > > > > > +\n> > > > > > > +      enum:\n> > > > > > > +        - name: HdrChannelNone\n> > > > > > > +          value: 0\n> > > > > > > +          description: |\n> > > > > > > +            This image does not correspond to any of the captures used to create\n> > > > > > > +            an HDR image.\n> > > > > >\n> > > > > > As indicated above, do we need this, or should we not report HdrChannel\n> > > > > > when HDR is disabled ?\n> > > > >\n> > > > > Actually I'd quite like to keep this, even if it's not reported when HDR is off.\n> > > > >\n> > > > > One use case is multi-exposure HDR on a Pi 4. You can't merge images\n> > > > > so how would you get a viewfinder? You could intersperse some\n> > > > > \"ordinary AGC\" frames with your long/short/whatever frames. You might\n> > > > > want to label these \"HdrChannelNone\". I suppose you could label them\n> > > > > \"medium\", but then maybe you're using \"medium\" for other frames that\n> > > > > your HDR implementation requires. Like everything else, it all falls a\n> > > > > bit into the \"who knows what anyone will do\" category.\n> > > >\n> > > > This makes me believe even more strongly that the unmerged mode\n> > > > shouldn't be an HDR mode. We're trying to cram too many assumptions\n> > > > about the application needs here.\n> > > >\n> > > > If you want to keep HdrChannelNone for the time being until we drop\n> > > > HdrMultiExposureUnmerged, then this needs better documentation. Until I\n> > > > read your reply I had no idea that the unmerged mode would produce\n> > > > images for the \"None\" channel. If this isn't documented properly, it\n> > > > won't be usable for applications.\n> > > >\n> > > > Depending on the priorities (I assume the Pi 5 HDR modes will be more\n> > > > interesting than the unmerged operation on Pi 4), we could also drop the\n> > > > unmerged mode for now, and design a solution better for application-side\n> > > > HDR.\n> > >\n> > > I'm definitely very keen to move forward all the discussions on\n> > > multi-channel AGC, IQ stability, per-frame controls and HDR. I am\n> > > aware that I want to be very flexible about many aspects of HDR, and I\n> > > explicitly do not want to tie our users down with \"it will work this\n> > > way\", I want them to be able to implement the strategies that work\n> > > best for them. So maybe vendor-specific controls are another topic\n> > > that needs to be added to this list.\n> > >\n> > > For the moment, however, Pi 5 is upon us and realistically, I think we\n> > > have no ability to change anything at this point. I'm definitely open\n> > > to suggestions on how to rationalise things after our release.\n> >\n> > Would you prefer focussing on Pi 5 first and postponing\n> > HdrModeMultiExposureUnmerged, or bundling them all together ? If your\n> > focus is in Pi 5 and you won't have time to release an implementation of\n> > HdrModeMultiExposureUnmerged with corresponding application-side support\n> > in the short term, maybe it would be better to merge the camera-side HDR\n> > support right now, and revisit HdrModeMultiExposureUnmerged when you'll\n> > have time to work on its implementation ? To be perfectly clear here, I\n> > think HdrModeMultiExposureUnmerged is less well-defined than the other\n> > modes, and thus needs more work, but I'm also not opposed to merging it\n> > early if we can come up with a plan to then improve it.\n> \n> For us, the release we're making in a day or two is a fait accompli at\n> this point (Pi 5s are being shipped, I hear), and all this stuff is in\n> it. All these modes are implemented and work. So from our point of\n> view, bundling everything that we've done together would minimse\n> divergence. Though we can live with it too, at least temporarily.\n\nOK, fine with me.\n\n> But as I said, I'm more than happy to revisit stuff. I think\n> multi-channel AGC is probably a better answer than \"unmerged\" HDR, but\n> we need to agree an approach.\n> \n> I think many platforms would react with horror to multi-channel AGC.\n> They'd take the view that they'll just drive multiple different\n> exposures explicitly, and not worry about multiple channels. Though\n> you still have to decide if they need to adjust dynamically and how\n> you switch between them, how they don't get too far apart, and how to\n> make it jump from one exposure to another without adapting slowly. So\n> it's all the same problems, just without exposing a general solution\n> that enables it. But it's definitely more viable if you're only doing\n> still image HDR. I can't escape wondering whether the dawn of vendor\n> extensions is drawing closer.\n\nIt's always drawing closer, but I think we're still far away from it for\nHDR specifically. This is just the starting point.\n\n> So lots to think about!\n> \n> > > > > I guess there's also an outside chance that some implementations can't\n> > > > > flick exposures instantaneously and accurately like we can, so maybe\n> > > > > it would suit them too. Or we could have an\n> > > > > \"HdrChannelCouldntMakeThisWorkProperly\" value? Having wrestled with\n> > > > > all this for a while I can see the attraction! :)\n> > > > >\n> > > > > I'll send out another version today with the changes in line with all\n> > > > > the above. Obviously everyone please shout if we want to do anything\n> > > > > different.\n> > > > >\n> > > > > > > +        - name: HdrChannelShort\n> > > > > > > +          value: 1\n> > > > > > > +          description: |\n> > > > > > > +            This is a short exposure image.\n> > > > > > > +        - name: HdrChannelMedium\n> > > > > > > +          value: 2\n> > > > > > > +          description: |\n> > > > > > > +            This is a medium exposure image.\n> > > > > > > +        - name: HdrChannelLong\n> > > > > > > +          value: 3\n> > > > > > > +          description: |\n> > > > > > > +            This is a long exposure image.\n> > > > > > > +\n> > > > > > >    # ----------------------------------------------------------------------------\n> > > > > > >    # Draft controls section\n> > > > > > >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 2EFE2C3272\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 20 Oct 2023 13:18:41 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 71DBF6297F;\n\tFri, 20 Oct 2023 15:18:40 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 4C0DE60557\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 20 Oct 2023 15:18:39 +0200 (CEST)","from pendragon.ideasonboard.com (213-243-189-158.bb.dnainternet.fi\n\t[213.243.189.158])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 4F0A763C;\n\tFri, 20 Oct 2023 15:18:30 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1697807920;\n\tbh=09j7E1x6WDVS7zkia1x5szvlFbcRPXjTvhWwO+IZH9Y=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=geAbqfdz0zX5+/+fmM9WyHzqdM+kJMI1fHwMRdXkpKan/nYhFpv3JL/k0ltlxZq4B\n\t1XYrUBEVLN3p1YKDSuJvABcL4GFbOcOlbPleuhNBR8RIu6B9k2vvNFtX2q0xv9syeO\n\tYVOo6PdyQ/rdYztg4J7UlAWDseevZDViZwx3RAYZtge5/6/qnglMWlDkRGH7xfbl7l\n\tWZmlFkR7NUETu5KagwGLjra+vDfWOC2AHWn8T8nMokdFalPwKPUi1BMsHuYtra96uy\n\tK2Hq8X2FNk0/TbAKPe5CK+WzyjM0bhmhFv3NLjAyIWOYo3So9JpOXVZcHS15jOEkDM\n\tw+64Kl4TrbIEw==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1697807910;\n\tbh=09j7E1x6WDVS7zkia1x5szvlFbcRPXjTvhWwO+IZH9Y=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=iizYI7o5o5Pu8EC6UvByICXow1kDDWJXRZ752/xdp0DAHfrixTNe+rzXmh3UvIPh+\n\tEQ+lKN616laCn7OAQQJE99D8ktH8jzeBWtqq0EafcDshz1JH3IiZo0xpO6tMxlQFzY\n\tXjTQ4ECVJw1EIh6iFBtuHL5d2XIQDv5stNz9YT4g="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"iizYI7o5\"; dkim-atps=neutral","Date":"Fri, 20 Oct 2023 16:18:45 +0300","To":"David Plowman <david.plowman@raspberrypi.com>","Message-ID":"<20231020131845.GE20914@pendragon.ideasonboard.com>","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>\n\t<20231019205235.GO14832@pendragon.ideasonboard.com>\n\t<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>\n\t<20231020095535.GB20914@pendragon.ideasonboard.com>\n\t<CAHW6GY+s8zL-5GeHDS50Ch6fE=3cuCkp3WXqY=QB89w2V7DLXg@mail.gmail.com>\n\t<20231020112605.GD20914@pendragon.ideasonboard.com>\n\t<CAHW6GYKULifqq70bG1JGBDM3T7QAyfJ-uCYrDT2by4P4NZ1oNg@mail.gmail.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<CAHW6GYKULifqq70bG1JGBDM3T7QAyfJ-uCYrDT2by4P4NZ1oNg@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Laurent Pinchart via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":28039,"web_url":"https://patchwork.libcamera.org/comment/28039/","msgid":"<20231024082130.GA10517@pendragon.ideasonboard.com>","date":"2023-10-24T08:21:30","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nTo make sure we're not both waiting on each other, there are questions\nfor you below in my previous reply that I'd like to figure out to make\nprogress on this patch.\n\nOn Fri, Oct 20, 2023 at 04:18:45PM +0300, Laurent Pinchart via libcamera-devel wrote:\n> On Fri, Oct 20, 2023 at 01:11:04PM +0100, David Plowman wrote:\n> > On Fri, 20 Oct 2023 at 12:26, Laurent Pinchart wrote:\n> > > On Fri, Oct 20, 2023 at 12:09:36PM +0100, David Plowman wrote:\n> > > > On Fri, 20 Oct 2023 at 10:55, Laurent Pinchart wrote:\n> > > > > On Fri, Oct 20, 2023 at 09:37:28AM +0100, David Plowman wrote:\n> > > > > > On Thu, 19 Oct 2023 at 21:52, Laurent Pinchart wrote:\n> > > > > > > On Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> > > > > > > > We add an HdrMode control (to enable and disable HDR processing)\n> > > > > > > > and an HdrChannel, which indicates what kind of HDR frame (short, long\n> > > > > > > > or medium) has just arrived.\n> > > > > > > >\n> > > > > > > > Currently the HdrMode supports the following values:\n> > > > > > > >\n> > > > > > > > * Off - no HDR processing at all.\n> > > > > > > > * MultiExposureUnmerged - frames at multiple different exposures are\n> > > > > > > >   produced, but not merged together. They are returned \"as is\".\n> > > > > > > > * MultiExposure - frames at multiple different exposures are merged\n> > > > > > > >   to create HDR images.\n> > > > > > > > * SingleExposure - multiple frames all at the same exposure are\n> > > > > > > >   merged to create HDR images.\n> > > > > > > > * Night - multiple frames will be combined to create \"night mode\"\n> > > > > > > >   images.\n> > > > > > > >\n> > > > > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > > > > > > ---\n> > > > > > > >  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n> > > > > > > >  1 file changed, 75 insertions(+)\n> > > > > > > >\n> > > > > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > > > > > > index f2e542f4..c3232abf 100644\n> > > > > > > > --- a/src/libcamera/control_ids.yaml\n> > > > > > > > +++ b/src/libcamera/control_ids.yaml\n> > > > > > > > @@ -774,6 +774,81 @@ controls:\n> > > > > > > >              Continuous AF is paused. No further state changes or lens movements\n> > > > > > > >              will occur until the AfPauseResume control is sent.\n> > > > > > > >\n> > > > > > > > +  - HdrMode:\n> > > > > > > > +      type: int32_t\n> > > > > > > > +      description: |\n> > > > > > > > +        Control to set the mode to be used for High Dynamic Range (HDR)\n> > > > > > > > +        imaging. HDR techniques typically include multiple exposure, image\n> > > > > > > > +        fusion and tone mapping techniques to improve the dynamic range of the\n> > > > > > > > +        resulting images.\n> > > > > > > > +\n> > > > > > > > +        When using an HDR mode, images are tagged to indicate which HDR channel\n> > > > > > > > +        (long, medium or short) they come from.\n> > > > > > > > +\n> > > > > > > > +        \\sa HdrChannel\n> > > > > > > > +\n> > > > > > > > +      enum:\n> > > > > > > > +        - name: HdrModeOff\n> > > > > > > > +          value: 0\n> > > > > > > > +          description: |\n> > > > > > > > +            HDR is disabled. The HDR channel, if present, will report\n> > > > > > > > +            HdrChannelNone.\n> > > > > > >\n> > > > > > > Stating what HDR channel is used is an improvement compared to the\n> > > > > > > previous version, but there's still an option left to implementors here:\n> > > > > > > reporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\n> > > > > > > see a reason to allow both, I would pick the latter:\n> > > > > > >\n> > > > > > >             HDR is disabled. Metadata for this frame will not include the\n> > > > > > >             HdrChannel control.\n> > > > > >\n> > > > > > Yes, I think that's fine.\n> > > > > >\n> > > > > > > > +        - name: HdrModeMultiExposureUnmerged\n> > > > > > > > +          value: 1\n> > > > > > > > +          description: |\n> > > > > > > > +            Multiple exposures will be generated in an alternating fashion.\n> > > > > > > > +            However, they will not be merged together and will be returned to\n> > > > > > > > +            the application as they are. Each image will be tagged with the\n> > > > > > > > +            correct HDR channel, indicating what kind of exposure (long, medium\n> > > > > > > > +            or short) it is.  The expectation is that, if necessary, the\n> > > > > > > > +            application can merge them to create HDR images for itself.\n> > > > > > >\n> > > > > > > You mention here long, medium and short. Does this mean there will\n> > > > > > > always be three channels ?\n> > > > > >\n> > > > > > No - it's whatever the implementation wants to do. We don't use\n> > > > > > medium, for example. I think it's quite likely that some vendors would\n> > > > > > want other channels, such as \"very short\" and \"very long\". Maybe the\n> > > > > > description can avoid implying that all channels will appear.\n> > > > >\n> > > > > Shouldn't this be something that the application should control ? I'm\n> > > > > increasingly thinking that this shouldn't be an HDR mode, but should\n> > > > > instead be controlled through a per-frame control mechanism. Is there\n> > > > > any reason this can't be done, not right now (I don't want to delay this\n> > > > > patch unnecessarily), but at some point in the future ?\n> > > >\n> > > > Yes, I'd be happy with an API for multi-channel AGC, which is what\n> > > > this really is. This was actually where I started, though it seemed\n> > > > somewhat controversial so I thought it might be more palatable\n> > > > presented like this instead. But either way works for me.\n> > > >\n> > > > I'm very keen to handle this inside our algorithms, so that\n> > > > applications don't explicitly have to switch AGC channels all the\n> > > > time. That's a more reliable approach (there will be fewer failures to\n> > > > switch on every frame), and saves the application from a whole world\n> > > > of pain and complication.\n> > > >\n> > > > Though I'm all in favour of per-frame controls for lots of other\n> > > > reasons. We've been very keen to see that move forward for quite a\n> > > > long time now!\n> > >\n> > > Both make sense. I'm thinking that we may need some kind of \"controls\n> > > scheduling\" API to help applications alternate between different\n> > > settings, in addition to the full manual mode offered by per-frame\n> > > controls. I don't know yet how that would look like though.\n> > \n> > Control scheduling is an interesting idea. I'm not sure how it relates\n> > to AGC because AGC/AE controls are always a bit special because of the\n> > frame delays that are incurred in the sensor. Controls also tend to\n> > wind up at the back of the request queue which is another debatable\n> > feature for us, and not a place we'd be wanting exposure/gain updates\n> > to languish. So certainly things to think about.\n> \n> Another point to keep in mind is that, while application-side HDR with\n> staggered exposures in its simplest form from a libcamera point of view\n> would see the application specify per-frame manual exposures, I think we\n> need to provide a way to still run the libcamera-side AEC/AGC. Maybe\n> setting the ExposureValue per frame would be such a way, to enable\n> generation of darker and lighter frames based on the \"normal\" exposure\n> calculated by the AEC/AGC algorithm. Maybe we will also need more/other\n> controls for this.\n> \n> > > > > > > > +        - name: HdrModeMultiExposure\n> > > > > > > > +          value: 2\n> > > > > > > > +          description: |\n> > > > > > > > +            Multiple exposures will be generated and merged to create HDR\n> > > > > > > > +            images. Each image will be tagged with the HDR channel (long, medium\n> > > > > > > > +            or short) that arrived and which caused this image to be output.\n> > > > > > > > +        - name: HdrModeSingleExposure\n> > > > > > > > +          value: 3\n> > > > > > > > +          description: |\n> > > > > > > > +            Multiple frames all at a single exposure will be used to create HDR\n> > > > > > > > +            images. These images should be reported as all corresponding to the\n> > > > > > > > +            HDR short channel.\n> > > > > > > > +        - name: HdrModeNight\n> > > > > > > > +          value: 4\n> > > > > > > > +          description: |\n> > > > > > > > +            Multiple frames will be combined to produce \"night mode\"\n> > > > > > > > +            images. Images will be tagged as belonging either to the long,\n> > > > > > > > +            medium or short HDR channel according to the implementation.\n> > > > > > >\n> > > > > > > Does this mean that night more will always use multi-exposure, or that\n> > > > > > > it is implementation-defined ?\n> > > > > >\n> > > > > > I really think that needs to be implementation defined. Our night mode\n> > > > > > is single-exposure, but I can't possibly predict what anyone else\n> > > > > > would want to do.\n> > > > >\n> > > > > Won't it be problematic for applications if they don't know what\n> > > > > HdrChannel values they will get ? How do you expect HdrChannel to be\n> > > > > used in the HDR modes where the camera combines images (all but the\n> > > > > Unmerged mode) ?\n> > > >\n> > > > Really, what I'm hoping is that the IQ stability control will indicate\n> > > > when it's OK to capture the frame, and this will be set when all the\n> > > > exposures that the implementation wants to use have arrived. So at\n> > > > that point the channel information will be of interest to some folks,\n> > > > but not essential to general applications. But obviously the IQ\n> > > > stability control discussion needs to progress.\n> > >\n> > > Those are two different issues, aren't they ? The point I was trying to\n> > > make is that I think the HdrChannel metadata will not be useful for\n> > > applications if we don't standardize at least a bit what it will report.\n> > > Uou gave an example where there would be a HdrChannelNone once per\n> > > \"group\" of exposures, to be used as a viewfinder. If applications need\n> > > that, they need to be able to rely on it being present, or at least know\n> > > if it will be, or have a way to select it.\n> > \n> > I agree standardisation is good here. I also think that beyond some\n> > obvious cases (long/medium/short) it will be difficult. I'm already\n> > expecting \"very short\" and \"very long\" at some point. I see a need for\n> > \"this isn't HDR at all but is for preview\", maybe that's just \"this is\n> > a normal AGC image\". I think some platforms might just want a long run\n> > of exposures where some might be the same, some might be different,\n> > and the only way to name them will be \"0\", \"1\" and so on.\n> \n> Could we at least decide on basic rules that govern what channels should\n> be used, depending on the number of channels ? For instance, if we were\n> to define 5 channels (very short, short, medium, long, very long), and a\n> camera used 3 channels, it would be horrible to ask applications\n> developers to be prepared to handle all possible combinations. Neither\n> would it make much sense, in my opinion, to allow cameras to use \"very\n> short, long, very long\" in that case. The exposures are relative to each\n> other, there's no rule to govern the delta between them, so I think we\n> can at least set some rules on the ordering.\n> \n> For the single-exposure case, does it even make sense to report a\n> channel, given that a single one is used ?\n> \n> For the multi-exposure cases, can we tell which channels will be used\n> based on the number of channels ? If you want some images to be reported\n> with channel \"None\", can we document this, and explain what those images\n> will be ?\n> \n> > We've had this problem to some extent elsewhere and have adopted a\n> > \"custom\" value. But it's not a great solution, and I don't think it\n> > really flies here when there could be many \"custom\" values.\n> > \n> > > What I'm missing here is a design, it seems to be more of a prototype to\n> > > see what will happen. It's fine to prototype things, but I would also\n> > > like to think about a longer term design. At the very least, how can we\n> > > get feedback on what will be useful to users, and when will we revisit\n> > > this ?\n> > \n> > I wouldn't disagree with any of that, though I think experience tells\n> > us that the only real way to get feedback is to put stuff out there\n> > and see how people react. We aren't short of wider camera forums where\n> > getting meaningful engagement from other parties is difficult! To be\n> > fair, Pi users are very good in this respect, though I think most\n> > would engage more actively when there's stuff to play with. But hard\n> > to say!\n> \n> I agree, getting it in the hands of users is certainly a way to get\n> feedback (how constructive that feedback is is a different question\n> :-)).\n> \n> > > > > > > > +\n> > > > > > > > +  - HdrChannel:\n> > > > > > > > +      type: int32_t\n> > > > > > > > +      description: |\n> > > > > > > > +        This value is reported back to the application so that it can discover\n> > > > > > > > +        whether this capture corresponds to the short or long exposure image (or\n> > > > > > > > +        any other image used by the HDR procedure). An application can monitor\n> > > > > > > > +        the HDR channel to discover when the differently exposed images have\n> > > > > > > > +        arrived.\n> > > > > > > > +\n> > > > > > > > +      enum:\n> > > > > > > > +        - name: HdrChannelNone\n> > > > > > > > +          value: 0\n> > > > > > > > +          description: |\n> > > > > > > > +            This image does not correspond to any of the captures used to create\n> > > > > > > > +            an HDR image.\n> > > > > > >\n> > > > > > > As indicated above, do we need this, or should we not report HdrChannel\n> > > > > > > when HDR is disabled ?\n> > > > > >\n> > > > > > Actually I'd quite like to keep this, even if it's not reported when HDR is off.\n> > > > > >\n> > > > > > One use case is multi-exposure HDR on a Pi 4. You can't merge images\n> > > > > > so how would you get a viewfinder? You could intersperse some\n> > > > > > \"ordinary AGC\" frames with your long/short/whatever frames. You might\n> > > > > > want to label these \"HdrChannelNone\". I suppose you could label them\n> > > > > > \"medium\", but then maybe you're using \"medium\" for other frames that\n> > > > > > your HDR implementation requires. Like everything else, it all falls a\n> > > > > > bit into the \"who knows what anyone will do\" category.\n> > > > >\n> > > > > This makes me believe even more strongly that the unmerged mode\n> > > > > shouldn't be an HDR mode. We're trying to cram too many assumptions\n> > > > > about the application needs here.\n> > > > >\n> > > > > If you want to keep HdrChannelNone for the time being until we drop\n> > > > > HdrMultiExposureUnmerged, then this needs better documentation. Until I\n> > > > > read your reply I had no idea that the unmerged mode would produce\n> > > > > images for the \"None\" channel. If this isn't documented properly, it\n> > > > > won't be usable for applications.\n> > > > >\n> > > > > Depending on the priorities (I assume the Pi 5 HDR modes will be more\n> > > > > interesting than the unmerged operation on Pi 4), we could also drop the\n> > > > > unmerged mode for now, and design a solution better for application-side\n> > > > > HDR.\n> > > >\n> > > > I'm definitely very keen to move forward all the discussions on\n> > > > multi-channel AGC, IQ stability, per-frame controls and HDR. I am\n> > > > aware that I want to be very flexible about many aspects of HDR, and I\n> > > > explicitly do not want to tie our users down with \"it will work this\n> > > > way\", I want them to be able to implement the strategies that work\n> > > > best for them. So maybe vendor-specific controls are another topic\n> > > > that needs to be added to this list.\n> > > >\n> > > > For the moment, however, Pi 5 is upon us and realistically, I think we\n> > > > have no ability to change anything at this point. I'm definitely open\n> > > > to suggestions on how to rationalise things after our release.\n> > >\n> > > Would you prefer focussing on Pi 5 first and postponing\n> > > HdrModeMultiExposureUnmerged, or bundling them all together ? If your\n> > > focus is in Pi 5 and you won't have time to release an implementation of\n> > > HdrModeMultiExposureUnmerged with corresponding application-side support\n> > > in the short term, maybe it would be better to merge the camera-side HDR\n> > > support right now, and revisit HdrModeMultiExposureUnmerged when you'll\n> > > have time to work on its implementation ? To be perfectly clear here, I\n> > > think HdrModeMultiExposureUnmerged is less well-defined than the other\n> > > modes, and thus needs more work, but I'm also not opposed to merging it\n> > > early if we can come up with a plan to then improve it.\n> > \n> > For us, the release we're making in a day or two is a fait accompli at\n> > this point (Pi 5s are being shipped, I hear), and all this stuff is in\n> > it. All these modes are implemented and work. So from our point of\n> > view, bundling everything that we've done together would minimse\n> > divergence. Though we can live with it too, at least temporarily.\n> \n> OK, fine with me.\n> \n> > But as I said, I'm more than happy to revisit stuff. I think\n> > multi-channel AGC is probably a better answer than \"unmerged\" HDR, but\n> > we need to agree an approach.\n> > \n> > I think many platforms would react with horror to multi-channel AGC.\n> > They'd take the view that they'll just drive multiple different\n> > exposures explicitly, and not worry about multiple channels. Though\n> > you still have to decide if they need to adjust dynamically and how\n> > you switch between them, how they don't get too far apart, and how to\n> > make it jump from one exposure to another without adapting slowly. So\n> > it's all the same problems, just without exposing a general solution\n> > that enables it. But it's definitely more viable if you're only doing\n> > still image HDR. I can't escape wondering whether the dawn of vendor\n> > extensions is drawing closer.\n> \n> It's always drawing closer, but I think we're still far away from it for\n> HDR specifically. This is just the starting point.\n> \n> > So lots to think about!\n> > \n> > > > > > I guess there's also an outside chance that some implementations can't\n> > > > > > flick exposures instantaneously and accurately like we can, so maybe\n> > > > > > it would suit them too. Or we could have an\n> > > > > > \"HdrChannelCouldntMakeThisWorkProperly\" value? Having wrestled with\n> > > > > > all this for a while I can see the attraction! :)\n> > > > > >\n> > > > > > I'll send out another version today with the changes in line with all\n> > > > > > the above. Obviously everyone please shout if we want to do anything\n> > > > > > different.\n> > > > > >\n> > > > > > > > +        - name: HdrChannelShort\n> > > > > > > > +          value: 1\n> > > > > > > > +          description: |\n> > > > > > > > +            This is a short exposure image.\n> > > > > > > > +        - name: HdrChannelMedium\n> > > > > > > > +          value: 2\n> > > > > > > > +          description: |\n> > > > > > > > +            This is a medium exposure image.\n> > > > > > > > +        - name: HdrChannelLong\n> > > > > > > > +          value: 3\n> > > > > > > > +          description: |\n> > > > > > > > +            This is a long exposure image.\n> > > > > > > > +\n> > > > > > > >    # ----------------------------------------------------------------------------\n> > > > > > > >    # Draft controls section\n> > > > > > > >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 765AEBDCBD\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue, 24 Oct 2023 08:21:25 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id B4BCD6297F;\n\tTue, 24 Oct 2023 10:21:24 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 9D65261DCC\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 24 Oct 2023 10:21:23 +0200 (CEST)","from pendragon.ideasonboard.com (213-243-189-158.bb.dnainternet.fi\n\t[213.243.189.158])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 1E653AE;\n\tTue, 24 Oct 2023 10:21:11 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1698135684;\n\tbh=iOWlESOH4a6ExQhsmNhvcwGPNhAtNW+0dm3mBfbGvoI=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=ak0VZKSLB6JxrjiBTr7TZsdc+46BuOSKGnOJVOHD3lLpk2Bduacvs8i0KVxjZEIje\n\tWyf5adhnBJVW3c8tARAUSyRHFa8Zwpgf6bNufyb6EsFHh3guKfb+rmKopDlnh2HP1E\n\tQ9rSyEPEHyDGXieO9eqWQ9WoBeYqMVQzHqQFu0Fn0yPypcZSNvIlrdHVrw1LMpQXw7\n\tNdvbZARKYrfXrqRVZIx5eGwTMpt9ozAA3UWYye8toMHsnz8z6OXRDrGOSg0SaHEmlZ\n\t6B8DbE4q1gIUVQ0mYdr+z7iaC94jMEMsGxQIf02+VL089ZmT39el6UQrzyhQCNqUkC\n\tk95OJ5zBSqApQ==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1698135672;\n\tbh=iOWlESOH4a6ExQhsmNhvcwGPNhAtNW+0dm3mBfbGvoI=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=SgV0upGl39Aq1n0ug6Yd3190s6U73RPq+nHK8PXm8miXUWYEiabkoIRlLoj/ha90Q\n\t3ifZbSigvAHp/Dc9X/lmnvYEKnJbXLVYE1YW64j+29UOi9XxQNmgWzX3UzBx03g6dj\n\tIEDq+9fh54Zr7fmoT4P1ZXfkgAzZxUfTrZK5ABO4="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"SgV0upGl\"; dkim-atps=neutral","Date":"Tue, 24 Oct 2023 11:21:30 +0300","To":"David Plowman <david.plowman@raspberrypi.com>","Message-ID":"<20231024082130.GA10517@pendragon.ideasonboard.com>","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>\n\t<20231019205235.GO14832@pendragon.ideasonboard.com>\n\t<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>\n\t<20231020095535.GB20914@pendragon.ideasonboard.com>\n\t<CAHW6GY+s8zL-5GeHDS50Ch6fE=3cuCkp3WXqY=QB89w2V7DLXg@mail.gmail.com>\n\t<20231020112605.GD20914@pendragon.ideasonboard.com>\n\t<CAHW6GYKULifqq70bG1JGBDM3T7QAyfJ-uCYrDT2by4P4NZ1oNg@mail.gmail.com>\n\t<20231020131845.GE20914@pendragon.ideasonboard.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<20231020131845.GE20914@pendragon.ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Laurent Pinchart via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":28042,"web_url":"https://patchwork.libcamera.org/comment/28042/","msgid":"<CAHW6GY+Z2xC7NSXu3UgAfATLr4Ve2+-oumj=XaFuOdtZVDU=Pw@mail.gmail.com>","date":"2023-10-24T11:03:36","subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Laurent\n\nYes, I was meaning to answer, but also just wanted to explain a few\nmore general things. Perhaps these are things to consider slightly\nbeyond the immediate patches.\n\nThe first thing I wanted to add is that I suspect the things I want to\ndo could well be regarded as \"platform specific\". I don't think we\nshould be too surprised, my aim after all is to help people get all\nthe best features out of their Pis, even when these features may be\nspecific to Pis in various ways.\n\nThe other point is that I think Raspberry Pi are not \"typical\" camera\nusers. Other folks are wanting to write still or video capture\napplications, whereas I'm trying to give our users a toolkit of\ntechniques so that they can experiment with imaging for themselves.\nInevitably, these will include Raspberry Pi techniques. This will\nbecome trickier once we get beyond the \"easy\" features, like \"set the\nexposure\".\n\nHDR may be a case in point. I'm keen to expose the way we have\nimplemented HDR to our users. Whilst I've made certain decisions\n(\"I'll meter the short channel like this...\") I don't expect that to\nsuit all our users. So I want them to be able to change it. I want\nthem to be able to define their own AGC channels, for whatever\nuse-cases they have. Many of our users have specific and quite\ntechnical use cases that I know nothing about and can't predict, so\nthings need to be flexible and configurable. To a large extent this\nmay be via the tuning file, but I expect there will be some external\ncontrols necessary too.\n\nNow, supposing we find another vendor who wants to implement HDR, I\nthink there could be some issues. At a high level, things might look\nthe same (\"turn on HDR mode\", \"use some metadata to tell me that the\nHDR images are good to capture\"), but after that I'm quite doubtful. I\nthink any other vendor would look at what Raspberry Pi have done and\nsay \"Yeah, no. Just no. We're using our HDR that we've spent the last\n100 person-years developing, thank you very much\". (And possibly make\neverything proprietary and expose only the simplest API that gives\nminimal control to users - but we don't know.)\n\nAnyway, hopefully that provides a bit of the context here, and so now\nlet me try to comment on the more specific questions!\n\nOn Tue, 24 Oct 2023 at 09:21, Laurent Pinchart\n<laurent.pinchart@ideasonboard.com> wrote:\n>\n> Hi David,\n>\n> To make sure we're not both waiting on each other, there are questions\n> for you below in my previous reply that I'd like to figure out to make\n> progress on this patch.\n>\n> On Fri, Oct 20, 2023 at 04:18:45PM +0300, Laurent Pinchart via libcamera-devel wrote:\n> > On Fri, Oct 20, 2023 at 01:11:04PM +0100, David Plowman wrote:\n> > > On Fri, 20 Oct 2023 at 12:26, Laurent Pinchart wrote:\n> > > > On Fri, Oct 20, 2023 at 12:09:36PM +0100, David Plowman wrote:\n> > > > > On Fri, 20 Oct 2023 at 10:55, Laurent Pinchart wrote:\n> > > > > > On Fri, Oct 20, 2023 at 09:37:28AM +0100, David Plowman wrote:\n> > > > > > > On Thu, 19 Oct 2023 at 21:52, Laurent Pinchart wrote:\n> > > > > > > > On Thu, Oct 19, 2023 at 12:52:20PM +0100, David Plowman via libcamera-devel wrote:\n> > > > > > > > > We add an HdrMode control (to enable and disable HDR processing)\n> > > > > > > > > and an HdrChannel, which indicates what kind of HDR frame (short, long\n> > > > > > > > > or medium) has just arrived.\n> > > > > > > > >\n> > > > > > > > > Currently the HdrMode supports the following values:\n> > > > > > > > >\n> > > > > > > > > * Off - no HDR processing at all.\n> > > > > > > > > * MultiExposureUnmerged - frames at multiple different exposures are\n> > > > > > > > >   produced, but not merged together. They are returned \"as is\".\n> > > > > > > > > * MultiExposure - frames at multiple different exposures are merged\n> > > > > > > > >   to create HDR images.\n> > > > > > > > > * SingleExposure - multiple frames all at the same exposure are\n> > > > > > > > >   merged to create HDR images.\n> > > > > > > > > * Night - multiple frames will be combined to create \"night mode\"\n> > > > > > > > >   images.\n> > > > > > > > >\n> > > > > > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > > > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > > > > > > > ---\n> > > > > > > > >  src/libcamera/control_ids.yaml | 75 ++++++++++++++++++++++++++++++++++\n> > > > > > > > >  1 file changed, 75 insertions(+)\n> > > > > > > > >\n> > > > > > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > > > > > > > index f2e542f4..c3232abf 100644\n> > > > > > > > > --- a/src/libcamera/control_ids.yaml\n> > > > > > > > > +++ b/src/libcamera/control_ids.yaml\n> > > > > > > > > @@ -774,6 +774,81 @@ controls:\n> > > > > > > > >              Continuous AF is paused. No further state changes or lens movements\n> > > > > > > > >              will occur until the AfPauseResume control is sent.\n> > > > > > > > >\n> > > > > > > > > +  - HdrMode:\n> > > > > > > > > +      type: int32_t\n> > > > > > > > > +      description: |\n> > > > > > > > > +        Control to set the mode to be used for High Dynamic Range (HDR)\n> > > > > > > > > +        imaging. HDR techniques typically include multiple exposure, image\n> > > > > > > > > +        fusion and tone mapping techniques to improve the dynamic range of the\n> > > > > > > > > +        resulting images.\n> > > > > > > > > +\n> > > > > > > > > +        When using an HDR mode, images are tagged to indicate which HDR channel\n> > > > > > > > > +        (long, medium or short) they come from.\n> > > > > > > > > +\n> > > > > > > > > +        \\sa HdrChannel\n> > > > > > > > > +\n> > > > > > > > > +      enum:\n> > > > > > > > > +        - name: HdrModeOff\n> > > > > > > > > +          value: 0\n> > > > > > > > > +          description: |\n> > > > > > > > > +            HDR is disabled. The HDR channel, if present, will report\n> > > > > > > > > +            HdrChannelNone.\n> > > > > > > >\n> > > > > > > > Stating what HDR channel is used is an improvement compared to the\n> > > > > > > > previous version, but there's still an option left to implementors here:\n> > > > > > > > reporting HdrChannelNone, or not reporting HdrChannel at all. Unless you\n> > > > > > > > see a reason to allow both, I would pick the latter:\n> > > > > > > >\n> > > > > > > >             HDR is disabled. Metadata for this frame will not include the\n> > > > > > > >             HdrChannel control.\n> > > > > > >\n> > > > > > > Yes, I think that's fine.\n> > > > > > >\n> > > > > > > > > +        - name: HdrModeMultiExposureUnmerged\n> > > > > > > > > +          value: 1\n> > > > > > > > > +          description: |\n> > > > > > > > > +            Multiple exposures will be generated in an alternating fashion.\n> > > > > > > > > +            However, they will not be merged together and will be returned to\n> > > > > > > > > +            the application as they are. Each image will be tagged with the\n> > > > > > > > > +            correct HDR channel, indicating what kind of exposure (long, medium\n> > > > > > > > > +            or short) it is.  The expectation is that, if necessary, the\n> > > > > > > > > +            application can merge them to create HDR images for itself.\n> > > > > > > >\n> > > > > > > > You mention here long, medium and short. Does this mean there will\n> > > > > > > > always be three channels ?\n> > > > > > >\n> > > > > > > No - it's whatever the implementation wants to do. We don't use\n> > > > > > > medium, for example. I think it's quite likely that some vendors would\n> > > > > > > want other channels, such as \"very short\" and \"very long\". Maybe the\n> > > > > > > description can avoid implying that all channels will appear.\n> > > > > >\n> > > > > > Shouldn't this be something that the application should control ? I'm\n> > > > > > increasingly thinking that this shouldn't be an HDR mode, but should\n> > > > > > instead be controlled through a per-frame control mechanism. Is there\n> > > > > > any reason this can't be done, not right now (I don't want to delay this\n> > > > > > patch unnecessarily), but at some point in the future ?\n> > > > >\n> > > > > Yes, I'd be happy with an API for multi-channel AGC, which is what\n> > > > > this really is. This was actually where I started, though it seemed\n> > > > > somewhat controversial so I thought it might be more palatable\n> > > > > presented like this instead. But either way works for me.\n> > > > >\n> > > > > I'm very keen to handle this inside our algorithms, so that\n> > > > > applications don't explicitly have to switch AGC channels all the\n> > > > > time. That's a more reliable approach (there will be fewer failures to\n> > > > > switch on every frame), and saves the application from a whole world\n> > > > > of pain and complication.\n> > > > >\n> > > > > Though I'm all in favour of per-frame controls for lots of other\n> > > > > reasons. We've been very keen to see that move forward for quite a\n> > > > > long time now!\n> > > >\n> > > > Both make sense. I'm thinking that we may need some kind of \"controls\n> > > > scheduling\" API to help applications alternate between different\n> > > > settings, in addition to the full manual mode offered by per-frame\n> > > > controls. I don't know yet how that would look like though.\n> > >\n> > > Control scheduling is an interesting idea. I'm not sure how it relates\n> > > to AGC because AGC/AE controls are always a bit special because of the\n> > > frame delays that are incurred in the sensor. Controls also tend to\n> > > wind up at the back of the request queue which is another debatable\n> > > feature for us, and not a place we'd be wanting exposure/gain updates\n> > > to languish. So certainly things to think about.\n> >\n> > Another point to keep in mind is that, while application-side HDR with\n> > staggered exposures in its simplest form from a libcamera point of view\n> > would see the application specify per-frame manual exposures, I think we\n> > need to provide a way to still run the libcamera-side AEC/AGC. Maybe\n> > setting the ExposureValue per frame would be such a way, to enable\n> > generation of darker and lighter frames based on the \"normal\" exposure\n> > calculated by the AEC/AGC algorithm. Maybe we will also need more/other\n> > controls for this.\n\nI think application-side HDR is quite feasible for still image\ncapture, and would expect many users to want to do it this way,\npossibly using different AGC \"channels\", possibly doing it all \"by\nhand\".\n\nI'm not persuaded by application-side HDR for video capture as I think\nit would be difficult to use. In fact, I think most of our users would\nactually give up if they had to do this for themselves.\n\nMetering for HDR is an interesting question. Having worked on this for\na bit now, I've come to the conclusion that using ev ('ExposureValue')\nto meter for different channels is a poor solution as it forces you to\nunder-expose even in cases where you didn't need to. A much better\nidea (and what I've implemented) is to be able to target different AGC\nchannels at different parts of the image histogram. But these are just\nmy decisions - they may not even suit all Pi users, and certainly not\nany other HDR implementor!\n\n> >\n> > > > > > > > > +        - name: HdrModeMultiExposure\n> > > > > > > > > +          value: 2\n> > > > > > > > > +          description: |\n> > > > > > > > > +            Multiple exposures will be generated and merged to create HDR\n> > > > > > > > > +            images. Each image will be tagged with the HDR channel (long, medium\n> > > > > > > > > +            or short) that arrived and which caused this image to be output.\n> > > > > > > > > +        - name: HdrModeSingleExposure\n> > > > > > > > > +          value: 3\n> > > > > > > > > +          description: |\n> > > > > > > > > +            Multiple frames all at a single exposure will be used to create HDR\n> > > > > > > > > +            images. These images should be reported as all corresponding to the\n> > > > > > > > > +            HDR short channel.\n> > > > > > > > > +        - name: HdrModeNight\n> > > > > > > > > +          value: 4\n> > > > > > > > > +          description: |\n> > > > > > > > > +            Multiple frames will be combined to produce \"night mode\"\n> > > > > > > > > +            images. Images will be tagged as belonging either to the long,\n> > > > > > > > > +            medium or short HDR channel according to the implementation.\n> > > > > > > >\n> > > > > > > > Does this mean that night more will always use multi-exposure, or that\n> > > > > > > > it is implementation-defined ?\n> > > > > > >\n> > > > > > > I really think that needs to be implementation defined. Our night mode\n> > > > > > > is single-exposure, but I can't possibly predict what anyone else\n> > > > > > > would want to do.\n> > > > > >\n> > > > > > Won't it be problematic for applications if they don't know what\n> > > > > > HdrChannel values they will get ? How do you expect HdrChannel to be\n> > > > > > used in the HDR modes where the camera combines images (all but the\n> > > > > > Unmerged mode) ?\n> > > > >\n> > > > > Really, what I'm hoping is that the IQ stability control will indicate\n> > > > > when it's OK to capture the frame, and this will be set when all the\n> > > > > exposures that the implementation wants to use have arrived. So at\n> > > > > that point the channel information will be of interest to some folks,\n> > > > > but not essential to general applications. But obviously the IQ\n> > > > > stability control discussion needs to progress.\n> > > >\n> > > > Those are two different issues, aren't they ? The point I was trying to\n> > > > make is that I think the HdrChannel metadata will not be useful for\n> > > > applications if we don't standardize at least a bit what it will report.\n> > > > Uou gave an example where there would be a HdrChannelNone once per\n> > > > \"group\" of exposures, to be used as a viewfinder. If applications need\n> > > > that, they need to be able to rely on it being present, or at least know\n> > > > if it will be, or have a way to select it.\n> > >\n> > > I agree standardisation is good here. I also think that beyond some\n> > > obvious cases (long/medium/short) it will be difficult. I'm already\n> > > expecting \"very short\" and \"very long\" at some point. I see a need for\n> > > \"this isn't HDR at all but is for preview\", maybe that's just \"this is\n> > > a normal AGC image\". I think some platforms might just want a long run\n> > > of exposures where some might be the same, some might be different,\n> > > and the only way to name them will be \"0\", \"1\" and so on.\n> >\n> > Could we at least decide on basic rules that govern what channels should\n> > be used, depending on the number of channels ? For instance, if we were\n> > to define 5 channels (very short, short, medium, long, very long), and a\n> > camera used 3 channels, it would be horrible to ask applications\n> > developers to be prepared to handle all possible combinations. Neither\n> > would it make much sense, in my opinion, to allow cameras to use \"very\n> > short, long, very long\" in that case. The exposures are relative to each\n> > other, there's no rule to govern the delta between them, so I think we\n> > can at least set some rules on the ordering.\n\nAs I think I said earlier, I believe there is a \"high-level\" HDR\ninterface, where you basically just turn HDR on, and wait for some\nmetadata on the image that says \"it's ready now\". This makes for an\neasy interface, and lets any vendor implement HDR in whatever way they\nwant.\n\nAt that point, the HDR channel information is just for those who are\ncurious, you don't \"need\" it. Though I really want to tag all our\nimages with exactly what channel they are. We'll have users who want\nto monitor the exposures on the individual channels so that they can\nunderstand what's going on.\n\nThe precise order in which channels come out also becomes fairly\nunimportant now, too. I could also imagine other vendors not wanting\nto divulge any channel information at all, so maybe it needs to be a\npurely optional thing. I think quite a few vendors regard HDR as\n\"secret sauce\" and may be highly secretive about their\nimplementations.\n\n> >\n> > For the single-exposure case, does it even make sense to report a\n> > channel, given that a single one is used ?\n\nI could potentially go either way here, though where I'm actually\ndoing short channel metering I'd probably still nudge towards calling\nit the \"short\" channel.\n\n> >\n> > For the multi-exposure cases, can we tell which channels will be used\n> > based on the number of channels ? If you want some images to be reported\n> > with channel \"None\", can we document this, and explain what those images\n> > will be ?\n\nI'm not sure about this one. I could imagine someone saying they want\nto run in a mode where the frames come out in the order \"short none\nlong none\" so that they can get a half framerate viewfinder. (Or they\ncould ask for \"short long none\" and put up with a 1/3 rate viewfinder\n- their choice.)\n\nPerhaps we could have metadata that lists what channels are going to\nbe used, and also what the \"cycle length\" is after which the same\nsequence of channels repeats (bearing in mind that interrupts can be\nmissed and channels \"go missing\" if you're unlucky)? But I'm very\nprepared to accept that this kind of thing may not be sensible for\nanyone other than Raspberry Pi!\n\nI'm sorry if that possibly complicates everything. Perhaps there's\nsome mileage in agreeing the \"simple HDR\" API initially, and leaving\nthe rest for further discussion, and possibly vendor specific controls\nwhere we think Raspberry Pi's requirements are not a good general fit?\n\nThanks!\n\nDavid\n\n> >\n> > > We've had this problem to some extent elsewhere and have adopted a\n> > > \"custom\" value. But it's not a great solution, and I don't think it\n> > > really flies here when there could be many \"custom\" values.\n> > >\n> > > > What I'm missing here is a design, it seems to be more of a prototype to\n> > > > see what will happen. It's fine to prototype things, but I would also\n> > > > like to think about a longer term design. At the very least, how can we\n> > > > get feedback on what will be useful to users, and when will we revisit\n> > > > this ?\n> > >\n> > > I wouldn't disagree with any of that, though I think experience tells\n> > > us that the only real way to get feedback is to put stuff out there\n> > > and see how people react. We aren't short of wider camera forums where\n> > > getting meaningful engagement from other parties is difficult! To be\n> > > fair, Pi users are very good in this respect, though I think most\n> > > would engage more actively when there's stuff to play with. But hard\n> > > to say!\n> >\n> > I agree, getting it in the hands of users is certainly a way to get\n> > feedback (how constructive that feedback is is a different question\n> > :-)).\n> >\n> > > > > > > > > +\n> > > > > > > > > +  - HdrChannel:\n> > > > > > > > > +      type: int32_t\n> > > > > > > > > +      description: |\n> > > > > > > > > +        This value is reported back to the application so that it can discover\n> > > > > > > > > +        whether this capture corresponds to the short or long exposure image (or\n> > > > > > > > > +        any other image used by the HDR procedure). An application can monitor\n> > > > > > > > > +        the HDR channel to discover when the differently exposed images have\n> > > > > > > > > +        arrived.\n> > > > > > > > > +\n> > > > > > > > > +      enum:\n> > > > > > > > > +        - name: HdrChannelNone\n> > > > > > > > > +          value: 0\n> > > > > > > > > +          description: |\n> > > > > > > > > +            This image does not correspond to any of the captures used to create\n> > > > > > > > > +            an HDR image.\n> > > > > > > >\n> > > > > > > > As indicated above, do we need this, or should we not report HdrChannel\n> > > > > > > > when HDR is disabled ?\n> > > > > > >\n> > > > > > > Actually I'd quite like to keep this, even if it's not reported when HDR is off.\n> > > > > > >\n> > > > > > > One use case is multi-exposure HDR on a Pi 4. You can't merge images\n> > > > > > > so how would you get a viewfinder? You could intersperse some\n> > > > > > > \"ordinary AGC\" frames with your long/short/whatever frames. You might\n> > > > > > > want to label these \"HdrChannelNone\". I suppose you could label them\n> > > > > > > \"medium\", but then maybe you're using \"medium\" for other frames that\n> > > > > > > your HDR implementation requires. Like everything else, it all falls a\n> > > > > > > bit into the \"who knows what anyone will do\" category.\n> > > > > >\n> > > > > > This makes me believe even more strongly that the unmerged mode\n> > > > > > shouldn't be an HDR mode. We're trying to cram too many assumptions\n> > > > > > about the application needs here.\n> > > > > >\n> > > > > > If you want to keep HdrChannelNone for the time being until we drop\n> > > > > > HdrMultiExposureUnmerged, then this needs better documentation. Until I\n> > > > > > read your reply I had no idea that the unmerged mode would produce\n> > > > > > images for the \"None\" channel. If this isn't documented properly, it\n> > > > > > won't be usable for applications.\n> > > > > >\n> > > > > > Depending on the priorities (I assume the Pi 5 HDR modes will be more\n> > > > > > interesting than the unmerged operation on Pi 4), we could also drop the\n> > > > > > unmerged mode for now, and design a solution better for application-side\n> > > > > > HDR.\n> > > > >\n> > > > > I'm definitely very keen to move forward all the discussions on\n> > > > > multi-channel AGC, IQ stability, per-frame controls and HDR. I am\n> > > > > aware that I want to be very flexible about many aspects of HDR, and I\n> > > > > explicitly do not want to tie our users down with \"it will work this\n> > > > > way\", I want them to be able to implement the strategies that work\n> > > > > best for them. So maybe vendor-specific controls are another topic\n> > > > > that needs to be added to this list.\n> > > > >\n> > > > > For the moment, however, Pi 5 is upon us and realistically, I think we\n> > > > > have no ability to change anything at this point. I'm definitely open\n> > > > > to suggestions on how to rationalise things after our release.\n> > > >\n> > > > Would you prefer focussing on Pi 5 first and postponing\n> > > > HdrModeMultiExposureUnmerged, or bundling them all together ? If your\n> > > > focus is in Pi 5 and you won't have time to release an implementation of\n> > > > HdrModeMultiExposureUnmerged with corresponding application-side support\n> > > > in the short term, maybe it would be better to merge the camera-side HDR\n> > > > support right now, and revisit HdrModeMultiExposureUnmerged when you'll\n> > > > have time to work on its implementation ? To be perfectly clear here, I\n> > > > think HdrModeMultiExposureUnmerged is less well-defined than the other\n> > > > modes, and thus needs more work, but I'm also not opposed to merging it\n> > > > early if we can come up with a plan to then improve it.\n> > >\n> > > For us, the release we're making in a day or two is a fait accompli at\n> > > this point (Pi 5s are being shipped, I hear), and all this stuff is in\n> > > it. All these modes are implemented and work. So from our point of\n> > > view, bundling everything that we've done together would minimse\n> > > divergence. Though we can live with it too, at least temporarily.\n> >\n> > OK, fine with me.\n> >\n> > > But as I said, I'm more than happy to revisit stuff. I think\n> > > multi-channel AGC is probably a better answer than \"unmerged\" HDR, but\n> > > we need to agree an approach.\n> > >\n> > > I think many platforms would react with horror to multi-channel AGC.\n> > > They'd take the view that they'll just drive multiple different\n> > > exposures explicitly, and not worry about multiple channels. Though\n> > > you still have to decide if they need to adjust dynamically and how\n> > > you switch between them, how they don't get too far apart, and how to\n> > > make it jump from one exposure to another without adapting slowly. So\n> > > it's all the same problems, just without exposing a general solution\n> > > that enables it. But it's definitely more viable if you're only doing\n> > > still image HDR. I can't escape wondering whether the dawn of vendor\n> > > extensions is drawing closer.\n> >\n> > It's always drawing closer, but I think we're still far away from it for\n> > HDR specifically. This is just the starting point.\n> >\n> > > So lots to think about!\n> > >\n> > > > > > > I guess there's also an outside chance that some implementations can't\n> > > > > > > flick exposures instantaneously and accurately like we can, so maybe\n> > > > > > > it would suit them too. Or we could have an\n> > > > > > > \"HdrChannelCouldntMakeThisWorkProperly\" value? Having wrestled with\n> > > > > > > all this for a while I can see the attraction! :)\n> > > > > > >\n> > > > > > > I'll send out another version today with the changes in line with all\n> > > > > > > the above. Obviously everyone please shout if we want to do anything\n> > > > > > > different.\n> > > > > > >\n> > > > > > > > > +        - name: HdrChannelShort\n> > > > > > > > > +          value: 1\n> > > > > > > > > +          description: |\n> > > > > > > > > +            This is a short exposure image.\n> > > > > > > > > +        - name: HdrChannelMedium\n> > > > > > > > > +          value: 2\n> > > > > > > > > +          description: |\n> > > > > > > > > +            This is a medium exposure image.\n> > > > > > > > > +        - name: HdrChannelLong\n> > > > > > > > > +          value: 3\n> > > > > > > > > +          description: |\n> > > > > > > > > +            This is a long exposure image.\n> > > > > > > > > +\n> > > > > > > > >    # ----------------------------------------------------------------------------\n> > > > > > > > >    # Draft controls section\n> > > > > > > > >\n>\n> --\n> Regards,\n>\n> Laurent Pinchart","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 878DFC0DA4\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue, 24 Oct 2023 11:03:52 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id AE2936297F;\n\tTue, 24 Oct 2023 13:03:51 +0200 (CEST)","from mail-qk1-x734.google.com (mail-qk1-x734.google.com\n\t[IPv6:2607:f8b0:4864:20::734])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id BA31861DCF\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 24 Oct 2023 13:03:49 +0200 (CEST)","by mail-qk1-x734.google.com with SMTP id\n\taf79cd13be357-778ac9c898dso211197985a.0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 24 Oct 2023 04:03:49 -0700 (PDT)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1698145431;\n\tbh=zMARNrCcgNdKf4DwJTxbxkyPYUj37OwzZJZHncm0TeM=;\n\th=References:In-Reply-To:Date:To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=M2B1AV6OC7Vx6Ut9XMkBmUQT3tum6Dz7vgZj/lb9KIArpyrUZcUCL3J0yu0bB8Xid\n\t2BaVwwC7ozDyVpaw6BRMonwilp/YlaMS6zMxQtZRKoheSXNU1rmXr1ZMm2NvzfmraG\n\tey6SrzsX4Hf/TuNErOmo1kG0SD9DJe5kxti4TwI117lw5V3pNgwb77MWc+DQVv6cuV\n\tVFCFEURLjcXm/YUB3l3mHdEc7RoSqTp/SCjTRr3v+S2oWbwbVQhC0D+WD0dYRyIidt\n\ttTRmc2JgIQH+cknIbJx4X4pEdBWInZj+h4D3Mk7gDd8n3l96GdoZ/DCBOlOmJ6j7Lw\n\txTRoQSDZQaqFw==","v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google; t=1698145428; x=1698750228;\n\tdarn=lists.libcamera.org; \n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:from:to:cc:subject:date:message-id:reply-to;\n\tbh=9oyyw6h84o5ANtuLRvtx/yVmxmN3yVzLZxiYZoCI2rQ=;\n\tb=a13CyjNZE/ynVtkEVKFOe1JxxCDtVn6p64lHj67r2VVHrgGZdkJtXos2oTkcSWlBsr\n\tZFdXDghuM+UIV+hLhMBbXT3hpS/2mRyHU9ejggoQrXUALTW3RmXPUlH6bkXG8E/H0Lc0\n\tiEX/e4np5xmoPCgTDcDoB2ROhw9xomEr34rEGGT8VkJRyVo6UC/b3jyM4D9JV4J5P8O4\n\t7+IPU/s+SulsGYhyFNoD+I2gtpYh17ifzVAuZY/ZDf94lBnNjMNODIno4iMHegAq08Nn\n\tYY1DEEcTJrYVrH0qJqFEfILHo1tc4NUozTTc8WS6tB456D1haEJ4bk6MWCL65Dn56Q9Y\n\thE9w=="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key; \n\tunprotected) header.d=raspberrypi.com\n\theader.i=@raspberrypi.com\n\theader.b=\"a13CyjNZ\"; dkim-atps=neutral","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20230601; t=1698145428; x=1698750228;\n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:x-gm-message-state:from:to:cc:subject:date:message-id\n\t:reply-to;\n\tbh=9oyyw6h84o5ANtuLRvtx/yVmxmN3yVzLZxiYZoCI2rQ=;\n\tb=Ln4BX4FWPUZ7TndNfrAKxIOfGxEpvwvnlOty40ws19ieBPfTAcm6PbiLveOQdTiqDC\n\teRYjw+oxQv/gVvjp1Nn5vkoJozv88qOqxArNVf2w3xQfspVBNpfDahJwgnKB7kZEYZF1\n\tpKD5DFDS7S5sLwpMqtyDepQF7UI42/4A/9jwQUqkT64P57Lc/u+pMRCXOxyLuVTZvxEA\n\tZQQYXVTOlwNiDT+18nVOZIGSdH+Q2xUSuU2pwNhikzdY109/L7swHI6TSf5GfqUgH6t0\n\tJkINps8hwrD13NKoDLnIamPVzYnVQxZqZlWJgWWmCcCR717UB9uDzDURKcitW5aOTVaJ\n\tzIrw==","X-Gm-Message-State":"AOJu0Yzui8pILgPQro/Sl9z18Ncn+gjDvTH8mOYJgfAehHb4py5xGsM5\n\t8IKGJDLy8LHLex3RdQXdfpkBxGTQVpfIBxFqmFZesnFczmHBGi6G","X-Google-Smtp-Source":"AGHT+IHxgbE5WK1XjKkXORtoVC+DspesBuQXGkSxfnEbcWC9lL++VKpuAVFcr2lPN0kBA9LwIO5+aSTws3/42yoJDVc=","X-Received":"by 2002:ad4:5bc8:0:b0:66d:1ffe:bcd4 with SMTP id\n\tt8-20020ad45bc8000000b0066d1ffebcd4mr10635097qvt.44.1698145427929;\n\tTue, 24 Oct 2023 04:03:47 -0700 (PDT)","MIME-Version":"1.0","References":"<20231019115220.4473-1-david.plowman@raspberrypi.com>\n\t<20231019205235.GO14832@pendragon.ideasonboard.com>\n\t<CAHW6GY+BWEHxU_JfzLjpRhVzqdBbtWxZp0uVri-sirRobKGqVQ@mail.gmail.com>\n\t<20231020095535.GB20914@pendragon.ideasonboard.com>\n\t<CAHW6GY+s8zL-5GeHDS50Ch6fE=3cuCkp3WXqY=QB89w2V7DLXg@mail.gmail.com>\n\t<20231020112605.GD20914@pendragon.ideasonboard.com>\n\t<CAHW6GYKULifqq70bG1JGBDM3T7QAyfJ-uCYrDT2by4P4NZ1oNg@mail.gmail.com>\n\t<20231020131845.GE20914@pendragon.ideasonboard.com>\n\t<20231024082130.GA10517@pendragon.ideasonboard.com>","In-Reply-To":"<20231024082130.GA10517@pendragon.ideasonboard.com>","Date":"Tue, 24 Oct 2023 12:03:36 +0100","Message-ID":"<CAHW6GY+Z2xC7NSXu3UgAfATLr4Ve2+-oumj=XaFuOdtZVDU=Pw@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH v3] libcamera: controls: Add controls\n\tfor HDR","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"David Plowman via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}}]