[{"id":16274,"web_url":"https://patchwork.libcamera.org/comment/16274/","msgid":"<542f47d6-4dd7-6b9f-c604-fef3b13ad8ed@ideasonboard.com>","date":"2021-04-14T10:49:47","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":4,"url":"https://patchwork.libcamera.org/api/people/4/","name":"Kieran Bingham","email":"kieran.bingham@ideasonboard.com"},"content":"Hi David,\n\nOn 14/04/2021 11:29, David Plowman wrote:\n> This allows derived classes to override them if they have any special\n> behaviours to implement. For instance if a particular camera mode\n> produces a different signal level to other modes, you might choose to\n> address that in the gain or exposure methods.\n> \n> Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n\nThis isn't actually used yet is it?\nBut I don't see any harm in it, given that's the point of the cam helper\ninterface.\n\nReviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n\n> ---\n>  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n>  1 file changed, 2 insertions(+), 2 deletions(-)\n> \n> diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp\n> index 1b2d6eec..4053a870 100644\n> --- a/src/ipa/raspberrypi/cam_helper.hpp\n> +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> @@ -66,8 +66,8 @@ public:\n>  \tvirtual ~CamHelper();\n>  \tvoid SetCameraMode(const CameraMode &mode);\n>  \tMdParser &Parser() const { return *parser_; }\n> -\tuint32_t ExposureLines(double exposure_us) const;\n> -\tdouble Exposure(uint32_t exposure_lines) const; // in us\n> +\tvirtual uint32_t ExposureLines(double exposure_us) const;\n> +\tvirtual double Exposure(uint32_t exposure_lines) const; // in us\n>  \tvirtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,\n>  \t\t\t\t      double maxFrameDuration) const;\n>  \tvirtual uint32_t GainCode(double gain) const = 
0;\n>"},{"id":16326,"web_url":"https://patchwork.libcamera.org/comment/16326/","msgid":"<YHoIBNpFTzakzIhi@pendragon.ideasonboard.com>","date":"2021-04-16T21:56:20","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nThank you for the patch.\n\nOn Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:\n> This allows derived classes to override them if they have any special\n> behaviours to implement. For instance if a particular camera mode\n> produces a different signal level to other modes, you might choose to\n> address that in the gain or exposure methods.\n\nFor my information, would you be able to give an example of how this\nwould be used ? 
What kind of particular camera mode would produce a\ndifferent signal level, is this related to binning, or something else ?\n\n> Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n\nReviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n\n> ---\n>  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n>  1 file changed, 2 insertions(+), 2 deletions(-)\n> \n> diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp\n> index 1b2d6eec..4053a870 100644\n> --- a/src/ipa/raspberrypi/cam_helper.hpp\n> +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> @@ -66,8 +66,8 @@ public:\n>  \tvirtual ~CamHelper();\n>  \tvoid SetCameraMode(const CameraMode &mode);\n>  \tMdParser &Parser() const { return *parser_; }\n> -\tuint32_t ExposureLines(double exposure_us) const;\n> -\tdouble Exposure(uint32_t exposure_lines) const; // in us\n> +\tvirtual uint32_t ExposureLines(double exposure_us) const;\n> +\tvirtual double Exposure(uint32_t exposure_lines) const; // in us\n>  \tvirtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,\n>  \t\t\t\t      double maxFrameDuration) const;\n>  \tvirtual uint32_t GainCode(double gain) const = 0;"},{"id":16332,"web_url":"https://patchwork.libcamera.org/comment/16332/","msgid":"<CAHW6GYL0csoeG3JH0tD4n=jg03GpcmxGGq=e6dcUvfrxtAuQ1g@mail.gmail.com>","date":"2021-04-17T08:19:51","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Laurent\n\nThanks for the comments and questions!\n\nOn Fri, 16 Apr 2021 at 22:56, Laurent Pinchart\n<laurent.pinchart@ideasonboard.com> wrote:\n>\n> Hi David,\n>\n> Thank you for the patch.\n>\n> On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:\n> > This allows derived classes to override them if they have any special\n> > behaviours to implement. 
For instance if a particular camera mode\n> > produces a different signal level to other modes, you might choose to\n> > address that in the gain or exposure methods.\n>\n> For my information, would you be able to give an example of how this\n> would be used ? What kind of particular camera mode would produce a\n> different signal level, is this related to binning, or something else ?\n\nYes, so the sensor I'm looking at produces twice the signal level in\nthe binning modes that it does in the full resolution mode. I think\nthis difference would be awkward for applications which means we have\nto hide it somewhere.\n\nOne of my earlier emails talked a little bit about hiding this in the\nAGC. You'd have to know a \"base gain\" for every mode (so 1 for full\nres, 2 for binned), and then have some process where we have public\nexposure/gain numbers (which are the same for all modes) and there's a\nprocess of conversion into and out of the AGC and IPA which deal with\nreal exposure/gain values. But I'm really not sure I want to go there,\nat least right now!\n\nI think this is a pragmatic alternative. You can already hide the\ndifference by changing the CamHelper's Gain and GainCode functions, to\napply the magic extra factor of 2 when necessary. This change here\nmakes it possible to apply that factor in the exposure instead - you\ncould imagine someone wanting the lowest noise possible and accepting\nthat the real exposure will be double (I wouldn't want to double\nexposures without people explicitly making that choice). And finally,\nthis change doesn't make it any more difficult to do the compensation\nin the AGC at a later date, should we wish.\n\nIn the CamHelper for my new sensor, I even make it easy to mix the\nfactor of 2 across both gain and exposure, so you could, for example,\nhave sqrt(2) in each.\n\nThat got rather long... 
I hope it was understandable!\n\nDavid\n\n>\n> > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n>\n> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n>\n> > ---\n> >  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n> >  1 file changed, 2 insertions(+), 2 deletions(-)\n> >\n> > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp\n> > index 1b2d6eec..4053a870 100644\n> > --- a/src/ipa/raspberrypi/cam_helper.hpp\n> > +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> > @@ -66,8 +66,8 @@ public:\n> >       virtual ~CamHelper();\n> >       void SetCameraMode(const CameraMode &mode);\n> >       MdParser &Parser() const { return *parser_; }\n> > -     uint32_t ExposureLines(double exposure_us) const;\n> > -     double Exposure(uint32_t exposure_lines) const; // in us\n> > +     virtual uint32_t ExposureLines(double exposure_us) const;\n> > +     virtual double Exposure(uint32_t exposure_lines) const; // in us\n> >       virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,\n> >                                     double maxFrameDuration) const;\n> >       virtual uint32_t GainCode(double gain) const = 0;\n>\n> --\n> Regards,\n>\n> Laurent Pinchart"},{"id":16623,"web_url":"https://patchwork.libcamera.org/comment/16623/","msgid":"<YIetKmO+nQU2bS7A@pendragon.ideasonboard.com>","date":"2021-04-27T06:20:26","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nSorry for the late reply, catching up with reviews.\n\nOn Sat, Apr 17, 2021 at 09:19:51AM +0100, David Plowman wrote:\n> On Fri, 16 Apr 2021 at 22:56, Laurent Pinchart wrote:\n> > On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:\n> > > This allows derived classes to override them if they have any special\n> > > behaviours to implement. 
For instance if a particular camera mode\n> > > produces a different signal level to other modes, you might choose to\n> > > address that in the gain or exposure methods.\n> >\n> > For my information, would you be able to give an example of how this\n> > would be used ? What kind of particular camera mode would produce a\n> > different signal level, is this related to binning, or something else ?\n> \n> Yes, so the sensor I'm looking at produces twice the signal level in\n> the binning modes that it does in the full resolution mode. I think\n> this difference would be awkward for applications which means we have\n> to hide it somewhere.\n> \n> One of my earlier emails talked a little bit about hiding this in the\n> AGC. You'd have to know a \"base gain\" for every mode (so 1 for full\n> res, 2 for binned), and then have some process where we have public\n> exposure/gain numbers (which are the same for all modes) and there's a\n> process of conversion into and out of the AGC and IPA which deal with\n> real exposure/gain values. But I'm really not sure I want to go there,\n> at least right now!\n> \n> I think this is a pragmatic alternative. You can already hide the\n> difference by changing the CamHelper's Gain and GainCode functions, to\n> apply the magic extra factor of 2 when necessary. This change here\n> makes it possible to apply that factor in the exposure instead - you\n> could imagine someone wanting the lowest noise possible and accepting\n> that the real exposure will be double (I wouldn't want to double\n> exposures without people explicitly making that choice). And finally,\n> this change doesn't make it any more difficult to do the compensation\n> in the AGC at a later date, should we wish.\n> \n> In the CamHelper for my new sensor, I even make it easy to mix the\n> factor of 2 across both gain and exposure, so you could, for example,\n> have sqrt(2) in each.\n> \n> That got rather long... I hope it was understandable!\n\nThanks for the information. 
There may be something that I don't get\nthough, as I don't see why the exposure needs to be affected. The\nexposure time has a big influence on motion blur, so it seems to me that\nwe shouldn't cheat here. The gain seems to be a better place to hide\nthis sensor feature. I wonder if we should consider moving from gain to\nsensor sensitivity.\n\n> > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> >\n> > Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n> >\n> > > ---\n> > >  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n> > >  1 file changed, 2 insertions(+), 2 deletions(-)\n> > >\n> > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp\n> > > index 1b2d6eec..4053a870 100644\n> > > --- a/src/ipa/raspberrypi/cam_helper.hpp\n> > > +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> > > @@ -66,8 +66,8 @@ public:\n> > >       virtual ~CamHelper();\n> > >       void SetCameraMode(const CameraMode &mode);\n> > >       MdParser &Parser() const { return *parser_; }\n> > > -     uint32_t ExposureLines(double exposure_us) const;\n> > > -     double Exposure(uint32_t exposure_lines) const; // in us\n> > > +     virtual uint32_t ExposureLines(double exposure_us) const;\n> > > +     virtual double Exposure(uint32_t exposure_lines) const; // in us\n> > >       virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,\n> > >                                     double maxFrameDuration) const;\n> > >       virtual uint32_t GainCode(double gain) const = 0;"},{"id":16648,"web_url":"https://patchwork.libcamera.org/comment/16648/","msgid":"<CAHW6GYKpdCD_VdPbaQJg_bzGV+-hi2PrMzxUQF8epTu9vG3B5Q@mail.gmail.com>","date":"2021-04-27T08:22:00","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Laurent\n\nThanks for the comments.\n\nOn Tue, 27 Apr 2021 at 07:20, Laurent Pinchart\n<laurent.pinchart@ideasonboard.com> wrote:\n>\n> Hi David,\n>\n> Sorry for the late reply, catching up with reviews.\n>\n> On Sat, Apr 17, 2021 at 09:19:51AM +0100, David Plowman wrote:\n> > On Fri, 16 Apr 2021 at 22:56, Laurent Pinchart wrote:\n> > > On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:\n> > > > This allows derived classes to override them if they have any special\n> > > > behaviours to implement. 
For instance if a particular camera mode\n> > > > produces a different signal level to other modes, you might choose to\n> > > > address that in the gain or exposure methods.\n> > >\n> > > For my information, would you be able to give an example of how this\n> > > would be used ? What kind of particular camera mode would produce a\n> > > different signal level, is this related to binning, or something else ?\n> >\n> > Yes, so the sensor I'm looking at produces twice the signal level in\n> > the binning modes that it does in the full resolution mode. I think\n> > this difference would be awkward for applications which means we have\n> > to hide it somewhere.\n> >\n> > One of my earlier emails talked a little bit about hiding this in the\n> > AGC. You'd have to know a \"base gain\" for every mode (so 1 for full\n> > res, 2 for binned), and then have some process where we have public\n> > exposure/gain numbers (which are the same for all modes) and there's a\n> > process of conversion into and out of the AGC and IPA which deal with\n> > real exposure/gain values. But I'm really not sure I want to go there,\n> > at least right now!\n> >\n> > I think this is a pragmatic alternative. You can already hide the\n> > difference by changing the CamHelper's Gain and GainCode functions, to\n> > apply the magic extra factor of 2 when necessary. This change here\n> > makes it possible to apply that factor in the exposure instead - you\n> > could imagine someone wanting the lowest noise possible and accepting\n> > that the real exposure will be double (I wouldn't want to double\n> > exposures without people explicitly making that choice). 
And finally,\n> > this change doesn't make it any more difficult to do the compensation\n> > in the AGC at a later date, should we wish.\n> >\n> > In the CamHelper for my new sensor, I even make it easy to mix the\n> > factor of 2 across both gain and exposure, so you could, for example,\n> > have sqrt(2) in each.\n> >\n> > That got rather long... I hope it was understandable!\n>\n> Thanks for the information. There may be something that I don't get\n> though, as I don't see why the exposure needs to be affected. The\n> exposure time has a big influence on motion blur, so it seems to me that\n> we shouldn't cheat here. The gain seems to be a better place to hide\n> this sensor feature. I wonder if we should consider moving from gain to\n> sensor sensitivity.\n\nYou're right, exposure doesn't *need* to be affected, and indeed my\ndefault position is to handle the difference using gain instead. The\nintention is that people are *able* to keep the gain to a minimum if\nthey want to, at the cost of changing the exposure - but this has to\nbe a deliberate choice.\n\nI agree the \"sensitivity\" is a good idea, and indeed a much better\nterm than I used above, which was \"base gain\".\n\nNote that sensitivity is a per-mode thing here. I think it's a good\nsolution but has some consequences:\n\n- If an application wants to change camera mode and set the exposure\nthey're going to have to deal with per-mode sensitivities and adapt\ntheir numbers accordingly. Do we like this? I'm always very nervous of\ncomplicating an application's life!\n\n- You could try and handle per-mode sensitivities either outside the\nAGC (effectively that's what this allows) or you could handle it\nwithin the AGC. Whenever the camera mode changes you'd have to adjust\nthe state variables according to the ratio of old_sensitivity /\nnew_sensitivity. You'd probably have to use a different exposure\nprofile, possibly it needs to be set by the application - or could we\ndo something more automatic? 
The implementation would also depend on\nwhether we want applications to have to pay attention to the \"per-mode\nsensitivities\" or not.\n\nSo overall, I'm in several minds about going this \"per-mode\nsensitivity\" route and the work it requires on the AGC, as well as the\nimplications for applications. In the meantime, making the exposure\nmethods virtual allows these sensors to work and doesn't make a later\nstep towards per-mode sensitivities any more difficult. You would\nsimply do that work and then delete your virtual exposure methods.\n\nPossibly I've repeated myself rather a lot there - but maybe I've\nexplained it better this time? :)\n\nThanks\nDavid\n\n>\n> > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > >\n> > > Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n> > >\n> > > > ---\n> > > >  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n> > > >  1 file changed, 2 insertions(+), 2 deletions(-)\n> > > >\n> > > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp\n> > > > index 1b2d6eec..4053a870 100644\n> > > > --- a/src/ipa/raspberrypi/cam_helper.hpp\n> > > > +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> > > > @@ -66,8 +66,8 @@ public:\n> > > >       virtual ~CamHelper();\n> > > >       void SetCameraMode(const CameraMode &mode);\n> > > >       MdParser &Parser() const { return *parser_; }\n> > > > -     uint32_t ExposureLines(double exposure_us) const;\n> > > > -     double Exposure(uint32_t exposure_lines) const; // in us\n> > > > +     virtual uint32_t ExposureLines(double exposure_us) const;\n> > > > +     virtual double Exposure(uint32_t exposure_lines) const; // in us\n> > > >       virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,\n> > > >                                     double maxFrameDuration) const;\n> > > >       virtual uint32_t GainCode(double gain) const = 0;\n>\n> --\n> 
Regards,\n>\n> Laurent Pinchart","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id CBAD4BDE19\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue, 27 Apr 2021 08:22:13 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 195C9688C2;\n\tTue, 27 Apr 2021 10:22:13 +0200 (CEST)","from mail-wm1-x32b.google.com (mail-wm1-x32b.google.com\n\t[IPv6:2a00:1450:4864:20::32b])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 0471C688B5\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 27 Apr 2021 10:22:11 +0200 (CEST)","by mail-wm1-x32b.google.com with SMTP id\n\tn3-20020a05600c4f83b02901425630b2c2so617710wmq.1\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 27 Apr 2021 01:22:11 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"WEf2YAm+\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=PnQ6woK8pXSpD6LDthWtI8AovLzx9iuNUvtOuzt5md8=;\n\tb=WEf2YAm+apyJqr4qZhF/rrcf8lOehzK3utC8vCCvhSjtKP96s72kolAqLbcK44NRMB\n\tcAplM45xCkyzmYbBhjVYWr0C7HRXTOfjGyjsUAw5gX1Mx7JpkQyvNxPkGJKP+8brgKyF\n\t1cmSG9hNv7UsDB0HQIYyhOZCLnuGXPaf3R3joKiKM//B+sG4Cjatbj342HYbFYhu7u3u\n\tNJiHrjoouJTYvItXQ1+MuqsGzyUdObCQh2dXBMA1cHfrnPp2f/g2sOpixZc12FNgsfnq\n\tn8bgkeMts05c4JC7Oc1Tg7pNNQ7LN/ml8ZLdT3G/r/jb46hoH8GpzoHqMoRM+yGG3wLF\n\tNNjg==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=PnQ6woK8pXSpD6LDthWtI8AovLzx9iuNUvtOuzt5md8=;\n\tb=GeFTDP5WWEVqA3DarHo1pzUUjVRkoGIzZck0TsUPKBpaj0/51VDo79w3SD+csDks4P\n\tV583StM9n+YBhEKgjDKmtjGUA069pbvS2P4j4DDWoFcn2OXEqjm5t4J70qOfFlkHl1fH\n\tIDJcJE6Tx5e4fymmBNlUSjyLo0u+qf7ub/LLjac6HVP/tR9/IsdInCub76zJzg6VC9eu\n\tg2DbOyKAwTWeIwQKa3MCjGZzylAAIpqudh0lT6oU5geMhw93d/yzY6P4AdNQCPdRQP+l\n\txNddQuPxmBnGjaDn3uJ5TRtualLRGt6jKo73Kqkrhdf2+Wg+ci89sachiujUMzbae1Zc\n\tv/3A==","X-Gm-Message-State":"AOAM530sHMFt9aYbIz0mAPpoK6KR+8m2W3DZNM2yP7Ez7R0N89fs/vKY\n\tIqSzKWJjxLHKmpUhiluCpzKd55rgPWiAFo5EDazY7w==","X-Google-Smtp-Source":"ABdhPJxe0mabelFgAVRbv+Oma1h7Tfm1fNOwDOWJKnWGJ/eIwRqAWn84L3nVf5+mhGqpyRL0KBU3a5Pl2vbJr1G/018=","X-Received":"by 2002:a7b:c097:: with SMTP id\n\tr23mr23101182wmh.40.1619511731657; \n\tTue, 27 Apr 2021 01:22:11 -0700 (PDT)","MIME-Version":"1.0","References":"<20210414102955.9503-1-david.plowman@raspberrypi.com>\n\t<20210414102955.9503-2-david.plowman@raspberrypi.com>\n\t<YHoIBNpFTzakzIhi@pendragon.ideasonboard.com>\n\t<CAHW6GYL0csoeG3JH0tD4n=jg03GpcmxGGq=e6dcUvfrxtAuQ1g@mail.gmail.com>\n\t<YIetKmO+nQU2bS7A@pendragon.ideasonboard.com>","In-Reply-To":"<YIetKmO+nQU2bS7A@pendragon.ideasonboard.com>","From":"David Plowman <david.plowman@raspberrypi.com>","Date":"Tue, 27 Apr 2021 09:22:00 +0100","Message-ID":"<CAHW6GYKpdCD_VdPbaQJg_bzGV+-hi2PrMzxUQF8epTu9vG3B5Q@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods 
virtual","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":16670,"web_url":"https://patchwork.libcamera.org/comment/16670/","msgid":"<YIlSKYg+CjMlfkNs@pendragon.ideasonboard.com>","date":"2021-04-28T12:16:41","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nOn Tue, Apr 27, 2021 at 09:22:00AM +0100, David Plowman wrote:\n> On Tue, 27 Apr 2021 at 07:20, Laurent Pinchart wrote:\n> > On Sat, Apr 17, 2021 at 09:19:51AM +0100, David Plowman wrote:\n> > > On Fri, 16 Apr 2021 at 22:56, Laurent Pinchart wrote:\n> > > > On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:\n> > > > > This allows derived classes to override them if they have any special\n> > > > > behaviours to implement. 
For instance if a particular camera mode\n> > > > > produces a different signal level to other modes, you might choose to\n> > > > > address that in the gain or exposure methods.\n> > > >\n> > > > For my information, would you be able to give an example of how this\n> > > > would be used ? What kind of particular camera mode would produce a\n> > > > different signal level, is this related to binning, or something else ?\n> > >\n> > > Yes, so the sensor I'm looking at produces twice the signal level in\n> > > the binning modes that it does in the full resolution mode. I think\n> > > this difference would be awkward for applications which means we have\n> > > to hide it somewhere.\n> > >\n> > > One of my earlier emails talked a little bit about hiding this in the\n> > > AGC. You'd have to know a \"base gain\" for every mode (so 1 for full\n> > > res, 2 for binned), and then have some process where we have public\n> > > exposure/gain numbers (which are the same for all modes) and there's a\n> > > process of conversion into and out of the AGC and IPA which deal with\n> > > real exposure/gain values. But I'm really not sure I want to go there,\n> > > at least right now!\n> > >\n> > > I think this is a pragmatic alternative. You can already hide the\n> > > difference by changing the CamHelper's Gain and GainCode functions, to\n> > > apply the magic extra factor of 2 when necessary. This change here\n> > > makes it possible to apply that factor in the exposure instead - you\n> > > could imagine someone wanting the lowest noise possible and accepting\n> > > that the real exposure will be double (I wouldn't want to double\n> > > exposures without people explicitly making that choice). 
And finally,\n> > > this change doesn't make it any more difficult to do the compensation\n> > > in the AGC at a later date, should we wish.\n> > >\n> > > In the CamHelper for my new sensor, I even make it easy to mix the\n> > > factor of 2 across both gain and exposure, so you could, for example,\n> > > have sqrt(2) in each.\n> > >\n> > > That got rather long... I hope it was understandable!\n> >\n> > Thanks for the information. There may be something that I don't get\n> > though, as I don't see why the exposure needs to be affected. The\n> > exposure time has a big influence on motion blur, so it seems to me that\n> > we shouldn't cheat here. The gain seems to be a better place to hide\n> > this sensor feature. I wonder if we should consider moving from gain to\n> > sensor sensitivity.\n> \n> You're right, exposure doesn't *need* to be affected, and indeed my\n> default position is to handle the difference using gain instead. The\n> intention is that people are *able* to keep the gain to a minimum if\n> they want to, at the cost of changing the exposure - but this has to\n> be a deliberate choice.\n>\n> I agree the \"sensitivity\" is a good idea, and indeed a much better\n> term than I used above, which was \"base gain\".\n> \n> Note that sensitivity is a per-mode thing here. I think it's a good\n> solution but has some consequences:\n> \n> - If an application wants to change camera mode and set the exposure\n> they're going to have to deal with per-mode sensitivities and adapt\n> their numbers accordingly. Do we like this? I'm always very nervous of\n> complicating an application's life!\n>\n> - You could try and handle per-mode sensitivities either outside the\n> AGC (effectively that's what this allows) or you could handle it\n> within the AGC. Whenever the camera mode changes you'd have to adjust\n> the state variables according to the ratio of old_sensitivity /\n> new_sensitivity. 
You'd probably have to use a different exposure\n> profile, possibly it needs to be set by the application - or could we\n> do something more automatic? The implementation would also depend on\n> whether we want applications to have to pay attention to the \"per-mode\n> sensitivities\" or not.\n> \n> So overall, I'm in several minds about going this \"per-mode\n> sensitivity\" route and the work it requires on the AGC, as well as the\n> implications for applications. In the meantime, making the exposure\n> methods virtual allows these sensors to work and doesn't make a later\n> step towards per-mode sensitivities any more difficult. You would\n> simply do that work and then delete your virtual exposure methods.\n\nI'm sorry, but I still don't see why it requires making the below\nfunctions virtual. They convert between exposure expressed in lines and\nexposure expressed as a time, why would that need to be sensor-specific\n? I guess I need to see an actual use case first.\n\n> Possibly I've repeated myself rather a lot there - but maybe I've\n> explained it better this time? 
:)\n> \n> > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > >\n> > > > Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n> > > >\n> > > > > ---\n> > > > >  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n> > > > >  1 file changed, 2 insertions(+), 2 deletions(-)\n> > > > >\n> > > > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > index 1b2d6eec..4053a870 100644\n> > > > > --- a/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > @@ -66,8 +66,8 @@ public:\n> > > > >       virtual ~CamHelper();\n> > > > >       void SetCameraMode(const CameraMode &mode);\n> > > > >       MdParser &Parser() const { return *parser_; }\n> > > > > -     uint32_t ExposureLines(double exposure_us) const;\n> > > > > -     double Exposure(uint32_t exposure_lines) const; // in us\n> > > > > +     virtual uint32_t ExposureLines(double exposure_us) const;\n> > > > > +     virtual double Exposure(uint32_t exposure_lines) const; // in us\n> > > > >       virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,\n> > > > >                                     double maxFrameDuration) const;\n> > > > >       virtual uint32_t GainCode(double gain) const = 0;","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 5F212BDE44\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 28 Apr 2021 12:16:50 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id B42A2688DA;\n\tWed, 28 Apr 2021 14:16:49 +0200 (CEST)","from perceval.ideasonboard.com 
(perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id C9ADE688D0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 28 Apr 2021 14:16:47 +0200 (CEST)","from pendragon.ideasonboard.com (62-78-145-57.bb.dnainternet.fi\n\t[62.78.145.57])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 4F72A2CF;\n\tWed, 28 Apr 2021 14:16:47 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"Xd/epZ/K\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1619612207;\n\tbh=AVh/886+kD0sySP7SC5DGnYnoyVT56Uf+8hTFeGuDgQ=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=Xd/epZ/KYRZLcD/A1znukZZfTp2cKe8BVyeRXMaf2NLFAPk5yWjetLzWCwwWnyclu\n\t4A/JODPF6gxKiz7iN38+I9P2xX50CE/Y/WjQpQr872eC5MMOuui1M1MRfefbgLDO/M\n\tQLAcDqFzwXOvLkKGyLi7TAv8SAFqY/ejeox/AWFQ=","Date":"Wed, 28 Apr 2021 15:16:41 +0300","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"David Plowman <david.plowman@raspberrypi.com>","Message-ID":"<YIlSKYg+CjMlfkNs@pendragon.ideasonboard.com>","References":"<20210414102955.9503-1-david.plowman@raspberrypi.com>\n\t<20210414102955.9503-2-david.plowman@raspberrypi.com>\n\t<YHoIBNpFTzakzIhi@pendragon.ideasonboard.com>\n\t<CAHW6GYL0csoeG3JH0tD4n=jg03GpcmxGGq=e6dcUvfrxtAuQ1g@mail.gmail.com>\n\t<YIetKmO+nQU2bS7A@pendragon.ideasonboard.com>\n\t<CAHW6GYKpdCD_VdPbaQJg_bzGV+-hi2PrMzxUQF8epTu9vG3B5Q@mail.gmail.com>","MIME-Version":"1.0","Content-Disposition":"inline","In-Reply-To":"<CAHW6GYKpdCD_VdPbaQJg_bzGV+-hi2PrMzxUQF8epTu9vG3B5Q@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods 
virtual","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":16671,"web_url":"https://patchwork.libcamera.org/comment/16671/","msgid":"<CAEmqJPpTT00SmEHwmH=XPoeSg-3cxqsmyoMhufjNWKXUjX8D0w@mail.gmail.com>","date":"2021-04-28T12:35:08","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/people/34/","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"content":"Hi Laurent,\n\nOn Wed, 28 Apr 2021 at 13:16, Laurent Pinchart <\nlaurent.pinchart@ideasonboard.com> wrote:\n\n> Hi David,\n>\n> On Tue, Apr 27, 2021 at 09:22:00AM +0100, David Plowman wrote:\n> > On Tue, 27 Apr 2021 at 07:20, Laurent Pinchart wrote:\n> > > On Sat, Apr 17, 2021 at 09:19:51AM +0100, David Plowman wrote:\n> > > > On Fri, 16 Apr 2021 at 22:56, Laurent Pinchart wrote:\n> > > > > On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:\n> > > > > > This allows derived classes to override them if they have any\n> special\n> > > > > > behaviours to implement. 
For instance if a particular camera mode\n> > > > > > produces a different signal level to other modes, you might\n> choose to\n> > > > > > address that in the gain or exposure methods.\n> > > > >\n> > > > > For my information, would you be able to give an example of how\n> this\n> > > > > would be used ? What kind of particular camera mode would produce a\n> > > > > different signal level, is this related to binning, or something\n> else ?\n> > > >\n> > > > Yes, so the sensor I'm looking at produces twice the signal level in\n> > > > the binning modes that it does in the full resolution mode. I think\n> > > > this difference would be awkward for applications which means we have\n> > > > to hide it somewhere.\n> > > >\n> > > > One of my earlier emails talked a little bit about hiding this in the\n> > > > AGC. You'd have to know a \"base gain\" for every mode (so 1 for full\n> > > > res, 2 for binned), and then have some process where we have public\n> > > > exposure/gain numbers (which are the same for all modes) and there's\n> a\n> > > > process of conversion into and out of the AGC and IPA which deal with\n> > > > real exposure/gain values. But I'm really not sure I want to go\n> there,\n> > > > at least right now!\n> > > >\n> > > > I think this is a pragmatic alternative. You can already hide the\n> > > > difference by changing the CamHelper's Gain and GainCode functions,\n> to\n> > > > apply the magic extra factor of 2 when necessary. This change here\n> > > > makes it possible to apply that factor in the exposure instead - you\n> > > > could imagine someone wanting the lowest noise possible and accepting\n> > > > that the real exposure will be double (I wouldn't want to double\n> > > > exposures without people explicitly making that choice). 
And finally,\n> > > > this change doesn't make it any more difficult to do the compensation\n> > > > in the AGC at a later date, should we wish.\n> > > >\n> > > > In the CamHelper for my new sensor, I even make it easy to mix the\n> > > > factor of 2 across both gain and exposure, so you could, for example,\n> > > > have sqrt(2) in each.\n> > > >\n> > > > That got rather long... I hope it was understandable!\n> > >\n> > > Thanks for the information. There may be something that I don't get\n> > > though, as I don't see why the exposure needs to be affected. The\n> > > exposure time has a big influence on motion blur, so it seems to me\n> that\n> > > we shouldn't cheat here. The gain seems to be a better place to hide\n> > > this sensor feature. I wonder if we should consider moving from gain to\n> > > sensor sensitivity.\n> >\n> > You're right, exposure doesn't *need* to be affected, and indeed my\n> > default position is to handle the difference using gain instead. The\n> > intention is that people are *able* to keep the gain to a minimum if\n> > they want to, at the cost of changing the exposure - but this has to\n> > be a deliberate choice.\n> >\n> > I agree the \"sensitivity\" is a good idea, and indeed a much better\n> > term than I used above, which was \"base gain\".\n> >\n> > Note that sensitivity is a per-mode thing here. I think it's a good\n> > solution but has some consequences:\n> >\n> > - If an application wants to change camera mode and set the exposure\n> > they're going to have to deal with per-mode sensitivities and adapt\n> > their numbers accordingly. Do we like this? I'm always very nervous of\n> > complicating an application's life!\n> >\n> > - You could try and handle per-mode sensitivities either outside the\n> > AGC (effectively that's what this allows) or you could handle it\n> > within the AGC. Whenever the camera mode changes you'd have to adjust\n> > the state variables according to the ratio of old_sensitivity /\n> > new_sensitivity. 
You'd probably have to use a different exposure\n> > profile, possibly it needs to be set by the application - or could we\n> > do something more automatic? The implementation would also depend on\n> > whether we want applications to have to pay attention to the \"per-mode\n> > sensitivities\" or not.\n> >\n> > So overall, I'm in several minds about going this \"per-mode\n> > sensitivity\" route and the work it requires on the AGC, as well as the\n> > implications for applications. In the meantime, making the exposure\n> > methods virtual allows these sensors to work and doesn't make a later\n> > step towards per-mode sensitivities any more difficult. You would\n> > simply do that work and then delete your virtual exposure methods.\n>\n> I'm sorry, but I still don't see why it requires making the below\n> functions virtual. They convert between exposure expressed in lines and\n> exposure expressed as a time, why would that need to be sensor-specific\n> ? I guess I need to see an actual use case first.\n>\n\nThis is entirely tangential to David's usage, but I also need this method to\nbe virtual so I can override this for imx477 ultra long exposure modes.  In\nthis mode, we have some additional scaling to apply to the standard\nequation.\n\nRegards,\nNaush\n\n\n>\n> > Possibly I've repeated myself rather a lot there - but maybe I've\n> > explained it better this time? 
:)\n> >\n> > > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > > >\n> > > > > Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n> > > > >\n> > > > > > ---\n> > > > > >  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n> > > > > >  1 file changed, 2 insertions(+), 2 deletions(-)\n> > > > > >\n> > > > > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp\n> b/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > > index 1b2d6eec..4053a870 100644\n> > > > > > --- a/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > > +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > > @@ -66,8 +66,8 @@ public:\n> > > > > >       virtual ~CamHelper();\n> > > > > >       void SetCameraMode(const CameraMode &mode);\n> > > > > >       MdParser &Parser() const { return *parser_; }\n> > > > > > -     uint32_t ExposureLines(double exposure_us) const;\n> > > > > > -     double Exposure(uint32_t exposure_lines) const; // in us\n> > > > > > +     virtual uint32_t ExposureLines(double exposure_us) const;\n> > > > > > +     virtual double Exposure(uint32_t exposure_lines) const; //\n> in us\n> > > > > >       virtual uint32_t GetVBlanking(double &exposure_us, double\n> minFrameDuration,\n> > > > > >                                     double maxFrameDuration)\n> const;\n> > > > > >       virtual uint32_t GainCode(double gain) const = 0;\n>\n> --\n> Regards,\n>\n> Laurent Pinchart\n> _______________________________________________\n> libcamera-devel mailing list\n> libcamera-devel@lists.libcamera.org\n> https://lists.libcamera.org/listinfo/libcamera-devel\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 8A945BDE41\n\tfor 
<parsemail@patchwork.libcamera.org>;\n\tWed, 28 Apr 2021 12:35:27 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id D6D1B688DC;\n\tWed, 28 Apr 2021 14:35:26 +0200 (CEST)","from mail-lf1-x12b.google.com (mail-lf1-x12b.google.com\n\t[IPv6:2a00:1450:4864:20::12b])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id EF328688D0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 28 Apr 2021 14:35:24 +0200 (CEST)","by mail-lf1-x12b.google.com with SMTP id h36so44782409lfv.7\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 28 Apr 2021 05:35:24 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"V3scT6Cf\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=A0hXi0RCVU6LU08C5qMwV/z49NykwU6wHClAP/KD/RI=;\n\tb=V3scT6CfBIbgyBUXAgPkDBLXG+3d3prKn3WbYHbenbvt8E1el0qRCLyyi8bK2G35DN\n\tRmyUCYTMyTkt28wri+luod9GTPu5EbSsH+rrIyfGpgauD3h/91qRcHsum2ezcUC6mpL0\n\t2J7Q2pIQqbFQ4573ULeryfgJ8animscmE/nSpwJfPQCNgj/NUCPpKFsmEKxqaLOlTEJE\n\tA2S8hlrN4Pf9SIWgYSgyh4R4N0duorBE1IECNGOEUSApLSQyW0CaKFDRgEil/6Tfw4ZH\n\tpEq6mvTdmVXA7PT3S5m5lHGokZsMlqoF5F72Xkd4sy5oJUTLmCyz3a9cjTBUToO0CxT4\n\tQPqg==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
From: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Date: Wed, 28 Apr 2021 16:32:59 +0300
Subject: Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make CamHelper exposure methods virtual

Hi Naush,

On Wed, Apr 28, 2021 at 01:35:08PM +0100, Naushir Patuck wrote:
> On Wed, 28 Apr 2021 at 13:16, Laurent Pinchart wrote:
> > On Tue, Apr 27, 2021 at 09:22:00AM +0100, David Plowman wrote:
> > > On Tue, 27 Apr 2021 at 07:20, Laurent Pinchart wrote:
> > > > On Sat, Apr 17, 2021 at 09:19:51AM +0100, David Plowman wrote:
> > > > > On Fri, 16 Apr 2021 at 22:56, Laurent Pinchart wrote:
> > > > > > On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:
> > > > > > > This allows derived classes to override them if they have any special
> > > > > > > behaviours to implement. For instance, if a particular camera mode
> > > > > > > produces a different signal level to other modes, you might choose to
> > > > > > > address that in the gain or exposure methods.
> > > > > >
> > > > > > For my information, would you be able to give an example of how this
> > > > > > would be used? What kind of camera mode would produce a different
> > > > > > signal level? Is this related to binning, or something else?
> > > > >
> > > > > Yes, the sensor I'm looking at produces twice the signal level in
> > > > > the binning modes that it does in the full resolution mode. I think
> > > > > this difference would be awkward for applications, which means we have
> > > > > to hide it somewhere.
> > > > >
> > > > > One of my earlier emails talked a little about hiding this in the
> > > > > AGC. You'd have to know a "base gain" for every mode (so 1 for full
> > > > > res, 2 for binned), and then have some process where we have public
> > > > > exposure/gain numbers (which are the same for all modes) and a
> > > > > process of conversion into and out of the AGC and IPA, which deal with
> > > > > real exposure/gain values. But I'm really not sure I want to go there,
> > > > > at least right now!
> > > > >
> > > > > I think this is a pragmatic alternative. You can already hide the
> > > > > difference by changing the CamHelper's Gain and GainCode functions to
> > > > > apply the magic extra factor of 2 when necessary. This change makes
> > > > > it possible to apply that factor in the exposure instead - you
> > > > > could imagine someone wanting the lowest noise possible and accepting
> > > > > that the real exposure will be double (I wouldn't want to double
> > > > > exposures without people explicitly making that choice). And finally,
> > > > > this change doesn't make it any more difficult to do the compensation
> > > > > in the AGC at a later date, should we wish.
> > > > >
> > > > > In the CamHelper for my new sensor, I even make it easy to mix the
> > > > > factor of 2 across both gain and exposure, so you could, for example,
> > > > > have sqrt(2) in each.
> > > > >
> > > > > That got rather long... I hope it was understandable!
> > > >
> > > > Thanks for the information. There may be something I don't get,
> > > > though, as I don't see why the exposure needs to be affected. The
> > > > exposure time has a big influence on motion blur, so it seems to me that
> > > > we shouldn't cheat here. The gain seems a better place to hide this
> > > > sensor feature. I wonder if we should consider moving from gain to
> > > > sensor sensitivity.
> > >
> > > You're right, exposure doesn't *need* to be affected, and indeed my
> > > default position is to handle the difference using gain instead. The
> > > intention is that people are *able* to keep the gain to a minimum if
> > > they want to, at the cost of changing the exposure - but this has to
> > > be a deliberate choice.
> > >
> > > I agree "sensitivity" is a good idea, and indeed a much better term
> > > than the one I used above, "base gain".
> > >
> > > Note that sensitivity is a per-mode thing here. I think it's a good
> > > solution but has some consequences:
> > >
> > > - If an application wants to change camera mode and set the exposure,
> > > they're going to have to deal with per-mode sensitivities and adapt
> > > their numbers accordingly. Do we like this? I'm always very nervous of
> > > complicating an application's life!
> > >
> > > - You could handle per-mode sensitivities either outside the AGC
> > > (effectively that's what this allows) or within the AGC. Whenever the
> > > camera mode changes, you'd have to adjust the state variables according
> > > to the ratio of old_sensitivity / new_sensitivity. You'd probably have
> > > to use a different exposure profile; possibly it needs to be set by the
> > > application - or could we do something more automatic? The
> > > implementation would also depend on whether we want applications to pay
> > > attention to the "per-mode sensitivities" or not.
> > >
> > > So overall, I'm in several minds about going this "per-mode
> > > sensitivity" route, the work it requires on the AGC, and the
> > > implications for applications. In the meantime, making the exposure
> > > methods virtual allows these sensors to work and doesn't make a later
> > > step towards per-mode sensitivities any more difficult. You would
> > > simply do that work and then delete your virtual exposure methods.
> >
> > I'm sorry, but I still don't see why this requires making the below
> > functions virtual. They convert between exposure expressed in lines and
> > exposure expressed as a time; why would that need to be sensor-specific?
> > I guess I need to see an actual use case first.
>
> This is entirely tangential to David's usage, but I also need these
> methods to be virtual so I can override them for imx477 ultra long
> exposure modes. In this mode, we have some additional scaling to apply
> to the standard equation.

I assume that in that case the exposure_lines argument, and the return
value of ExposureLines(), won't be expressed as a number of lines, right?

Given that sensors also often have a fine exposure time expressed in
pixels, in addition to the coarse exposure time expressed in lines,
could we create an Exposure structure that could model exposure settings
(coarse in lines, fine in pixels, plus other parameters for long
exposures) in a way that wouldn't be sensor-specific?

> > > Possibly I've repeated myself rather a lot there - but maybe I've
> > > explained it better this time? :)
> > >
> > > > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
> > > > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
> > > > > >
> > > > > > Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
> > > > > >
> > > > > > > ---
> > > > > > >  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--
> > > > > > >  1 file changed, 2 insertions(+), 2 deletions(-)
> > > > > > >
> > > > > > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp
> > > > > > > index 1b2d6eec..4053a870 100644
> > > > > > > --- a/src/ipa/raspberrypi/cam_helper.hpp
> > > > > > > +++ b/src/ipa/raspberrypi/cam_helper.hpp
> > > > > > > @@ -66,8 +66,8 @@ public:
> > > > > > >  	virtual ~CamHelper();
> > > > > > >  	void SetCameraMode(const CameraMode &mode);
> > > > > > >  	MdParser &Parser() const { return *parser_; }
> > > > > > > -	uint32_t ExposureLines(double exposure_us) const;
> > > > > > > -	double Exposure(uint32_t exposure_lines) const; // in us
> > > > > > > +	virtual uint32_t ExposureLines(double exposure_us) const;
> > > > > > > +	virtual double Exposure(uint32_t exposure_lines) const; // in us
> > > > > > >  	virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,
> > > > > > >  				      double maxFrameDuration) const;
> > > > > > >  	virtual uint32_t GainCode(double gain) const = 0;
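[Editor's note: David's factor-of-two compensation can be pictured with a short sketch. The stand-in CamHelper and CameraMode below are simplified mock-ups, not the real classes from src/ipa/raspberrypi/cam_helper.hpp, and CamHelperExample with its `binned` flag is entirely hypothetical; it only illustrates how the now-virtual exposure methods could hide a mode-dependent signal level.]

```cpp
#include <cassert>
#include <cstdint>

/* Simplified stand-ins for the real CamHelper/CameraMode types. */
struct CameraMode {
	double lineLengthUs; /* duration of one sensor line, in us */
	bool binned;         /* hypothetical flag marking a 2x2 binned mode */
};

class CamHelper
{
public:
	virtual ~CamHelper() {}
	void SetCameraMode(const CameraMode &mode) { mode_ = mode; }
	virtual uint32_t ExposureLines(double exposure_us) const
	{
		return static_cast<uint32_t>(exposure_us / mode_.lineLengthUs + 0.5);
	}
	virtual double Exposure(uint32_t exposure_lines) const /* in us */
	{
		return exposure_lines * mode_.lineLengthUs;
	}

protected:
	CameraMode mode_;
};

/*
 * Hypothetical helper for a sensor whose binned modes produce twice the
 * signal level: halve the exposure actually programmed in binned modes so
 * the delivered brightness matches the full-resolution mode, and double
 * the reported value when converting lines back to time.
 */
class CamHelperExample : public CamHelper
{
public:
	uint32_t ExposureLines(double exposure_us) const override
	{
		if (mode_.binned)
			exposure_us /= 2.0;
		return CamHelper::ExposureLines(exposure_us);
	}

	double Exposure(uint32_t exposure_lines) const override
	{
		double exposure_us = CamHelper::Exposure(exposure_lines);
		return mode_.binned ? exposure_us * 2.0 : exposure_us;
	}
};
```

The same pattern would let a helper split the factor across gain and exposure (sqrt(2) in each), as David mentions.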
From: Naushir Patuck <naush@raspberrypi.com>
Date: Wed, 28 Apr 2021 14:40:26 +0100
Subject: Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make CamHelper exposure methods virtual

Hi Laurent,

On Wed, 28 Apr 2021 at 14:33, Laurent Pinchart wrote:
> On Wed, Apr 28, 2021 at 01:35:08PM +0100, Naushir Patuck wrote:
> > [snip]
> >
> > This is entirely tangential to David's usage, but I also need these
> > methods to be virtual so I can override them for imx477 ultra long
> > exposure modes. In this mode, we have some additional scaling to apply
> > to the standard equation.
>
> I assume that in that case the exposure_lines argument, and the return
> value of ExposureLines(), won't be expressed as a number of lines, right?

I have not worked through the details yet, but yes, this is possibly going
to be the case with the expected implementation.

> Given that sensors also often have a fine exposure time expressed in
> pixels, in addition to the coarse exposure time expressed in lines,
> could we create an Exposure structure that could model exposure settings
> (coarse in lines, fine in pixels, plus other parameters for long
> exposures) in a way that wouldn't be sensor-specific?

Perhaps that might be a way to go that could be generalised for all sensors.
I do still wonder about global shutter sensors though. I've not worked with
many of them, but the ones I have expressed exposure in terms entirely
unrelated to HTS/VTS - but maybe that's something to worry about when we
come across it :-)

[snip]
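[Editor's note: one possible shape for the Exposure structure Laurent proposes is sketched below. Every name and field here is hypothetical - a real design would be driven by the sensors that need supporting - but it shows how coarse lines, fine pixels, and a long-exposure multiplier could be modelled without being sensor-specific.]

```cpp
#include <cassert>
#include <cstdint>

/* Hypothetical sensor-agnostic exposure model. */
struct Exposure {
	uint32_t coarseLines;  /* coarse exposure time, in lines */
	uint32_t finePixels;   /* fine exposure time, in pixels */
	uint32_t longExpShift; /* extra power-of-two line multiplier for
				  long exposure modes, 0 when unused */
};

/* Convert to microseconds, given the per-mode sensor timing. */
static double ExposureUs(const Exposure &e, double lineLengthUs,
			 double pixelsPerUs)
{
	double lines = static_cast<double>(e.coarseLines) * (1u << e.longExpShift);
	return lines * lineLengthUs + e.finePixels / pixelsPerUs;
}
```

A per-sensor helper would then only be responsible for packing such a structure into its own registers, while the conversion to and from time stays generic.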
From: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Date: Wed, 28 Apr 2021 17:05:44 +0300
Subject: Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make CamHelper exposure methods virtual

Hi Naush,

On Wed, Apr 28, 2021 at 02:40:26PM +0100, Naushir Patuck wrote:
> [snip]
>
> > Given that sensors also often have a fine exposure time expressed in
> > pixels, in addition to the coarse exposure time expressed in lines,
> > could we create an Exposure structure that could model exposure settings
> > (coarse in lines, fine in pixels, plus other parameters for long
> > exposures) in a way that wouldn't be sensor-specific?
>
> Perhaps that might be a way to go that could be generalised for all sensors.

Would you be able to check if this could be achieved for the different
sensors that you need to support?

> I do still wonder about global shutter sensors though. I've not worked with
> many of them, but the ones I have expressed exposure in terms entirely
> unrelated to HTS/VTS - but maybe that's something to worry about when we
> come across it :-)

Agreed.

[snip]

-- 
Regards,

Laurent Pinchart
<naush@raspberrypi.com>","Message-ID":"<YIlruG6KTHqNHDF8@pendragon.ideasonboard.com>","References":"<20210414102955.9503-1-david.plowman@raspberrypi.com>\n\t<20210414102955.9503-2-david.plowman@raspberrypi.com>\n\t<YHoIBNpFTzakzIhi@pendragon.ideasonboard.com>\n\t<CAHW6GYL0csoeG3JH0tD4n=jg03GpcmxGGq=e6dcUvfrxtAuQ1g@mail.gmail.com>\n\t<YIetKmO+nQU2bS7A@pendragon.ideasonboard.com>\n\t<CAHW6GYKpdCD_VdPbaQJg_bzGV+-hi2PrMzxUQF8epTu9vG3B5Q@mail.gmail.com>\n\t<YIlSKYg+CjMlfkNs@pendragon.ideasonboard.com>\n\t<CAEmqJPpTT00SmEHwmH=XPoeSg-3cxqsmyoMhufjNWKXUjX8D0w@mail.gmail.com>\n\t<YIlkCw6Adu3sk5S2@pendragon.ideasonboard.com>\n\t<CAEmqJPrj9HAHp_Jn_4M0p7uRf6hDmo62sWXvP1b63G+Z-m1nmA@mail.gmail.com>","MIME-Version":"1.0","Content-Disposition":"inline","In-Reply-To":"<CAEmqJPrj9HAHp_Jn_4M0p7uRf6hDmo62sWXvP1b63G+Z-m1nmA@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" 
<libcamera-devel-bounces@lists.libcamera.org>"}},{"id":16677,"web_url":"https://patchwork.libcamera.org/comment/16677/","msgid":"<CAEmqJPrfBK3a0iiddwsgmHpkUgjDWN6M4z5y1o-=UWT-sdQ-hQ@mail.gmail.com>","date":"2021-04-28T14:12:39","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/people/34/","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"content":"On Wed, 28 Apr 2021 at 15:05, Laurent Pinchart <\nlaurent.pinchart@ideasonboard.com> wrote:\n\n> Hi Naush,\n>\n> On Wed, Apr 28, 2021 at 02:40:26PM +0100, Naushir Patuck wrote:\n> > On Wed, 28 Apr 2021 at 14:33, Laurent Pinchart wrote:\n> > > On Wed, Apr 28, 2021 at 01:35:08PM +0100, Naushir Patuck wrote:\n> > > > On Wed, 28 Apr 2021 at 13:16, Laurent Pinchart wrote:\n> > > > > On Tue, Apr 27, 2021 at 09:22:00AM +0100, David Plowman wrote:\n> > > > > > On Tue, 27 Apr 2021 at 07:20, Laurent Pinchart wrote:\n> > > > > > > On Sat, Apr 17, 2021 at 09:19:51AM +0100, David Plowman wrote:\n> > > > > > > > On Fri, 16 Apr 2021 at 22:56, Laurent Pinchart wrote:\n> > > > > > > > > On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman\n> wrote:\n> > > > > > > > > > This allows derived classes to override them if they\n> have any special\n> > > > > > > > > > behaviours to implement. For instance if a particular\n> camera mode\n> > > > > > > > > > produces a different signal level to other modes, you\n> might choose to\n> > > > > > > > > > address that in the gain or exposure methods.\n> > > > > > > > >\n> > > > > > > > > For my information, would you be able to give an example\n> of how this\n> > > > > > > > > would be used ? 
What kind of particular camera mode would\n> produce a\n> > > > > > > > > different signal level, is this related to binning, or\n> something else ?\n> > > > > > > >\n> > > > > > > > Yes, so the sensor I'm looking at produces twice the signal\n> level in\n> > > > > > > > the binning modes that it does in the full resolution mode.\n> I think\n> > > > > > > > this difference would be awkward for applications which\n> means we have\n> > > > > > > > to hide it somewhere.\n> > > > > > > >\n> > > > > > > > One of my earlier emails talked a little bit about hiding\n> this in the\n> > > > > > > > AGC. You'd have to know a \"base gain\" for every mode (so 1\n> for full\n> > > > > > > > res, 2 for binned), and then have some process where we have\n> public\n> > > > > > > > exposure/gain numbers (which are the same for all modes) and\n> there's a\n> > > > > > > > process of conversion into and out of the AGC and IPA which\n> deal with\n> > > > > > > > real exposure/gain values. But I'm really not sure I want to\n> go there,\n> > > > > > > > at least right now!\n> > > > > > > >\n> > > > > > > > I think this is a pragmatic alternative. You can already\n> hide the\n> > > > > > > > difference by changing the CamHelper's Gain and GainCode\n> functions, to\n> > > > > > > > apply the magic extra factor of 2 when necessary. This\n> change here\n> > > > > > > > makes it possible to apply that factor in the exposure\n> instead - you\n> > > > > > > > could imagine someone wanting the lowest noise possible and\n> accepting\n> > > > > > > > that the real exposure will be double (I wouldn't want to\n> double\n> > > > > > > > exposures without people explicitly making that choice). 
And\n> finally,\n> > > > > > > > this change doesn't make it any more difficult to do the\n> compensation\n> > > > > > > > in the AGC at a later date, should we wish.\n> > > > > > > >\n> > > > > > > > In the CamHelper for my new sensor, I even make it easy to\n> mix the\n> > > > > > > > factor of 2 across both gain and exposure, so you could, for\n> example,\n> > > > > > > > have sqrt(2) in each.\n> > > > > > > >\n> > > > > > > > That got rather long... I hope it was understandable!\n> > > > > > >\n> > > > > > > Thanks for the information. There may be something that I\n> don't get\n> > > > > > > though, as I don't see why the exposure needs to be affected.\n> The\n> > > > > > > exposure time has a big influence on motion blur, so it seems\n> to me that\n> > > > > > > we shouldn't cheat here. The gain seems to be a better place\n> to hide\n> > > > > > > this sensor feature. I wonder if we should consider moving\n> from gain to\n> > > > > > > sensor sensitivity.\n> > > > > >\n> > > > > > You're right, exposure doesn't *need* to be affected, and indeed\n> my\n> > > > > > default position is to handle the difference using gain instead.\n> The\n> > > > > > intention is that people are *able* to keep the gain to a\n> minimum if\n> > > > > > they want to, at the cost of changing the exposure - but this\n> has to\n> > > > > > be a deliberate choice.\n> > > > > >\n> > > > > > I agree the \"sensitivity\" is a good idea, and indeed a much\n> better\n> > > > > > term than I used above, which was \"base gain\".\n> > > > > >\n> > > > > > Note that sensitivity is a per-mode thing here. I think it's a\n> good\n> > > > > > solution but has some consequences:\n> > > > > >\n> > > > > > - If an application wants to change camera mode and set the\n> exposure\n> > > > > > they're going to have to deal with per-mode sensitivities and\n> adapt\n> > > > > > their numbers accordingly. Do we like this? 
I'm always very\n> nervous of\n> > > > > > complicating an application's life!\n> > > > > >\n> > > > > > - You could try and handle per-mode sensitivities either outside\n> the\n> > > > > > AGC (effectively that's what this allows) or you could handle it\n> > > > > > within the AGC. Whenever the camera mode changes you'd have to\n> adjust\n> > > > > > the state variables according to the ratio of old_sensitivity /\n> > > > > > new_sensitivity. You'd probably have to use a different exposure\n> > > > > > profile, possibly it needs to be set by the application - or\n> could we\n> > > > > > do something more automatic? The implementation would also\n> depend on\n> > > > > > whether we want applications to have to pay attention to the\n> \"per-mode\n> > > > > > sensitivities\" or not.\n> > > > > >\n> > > > > > So overall, I'm in several minds about going this \"per-mode\n> > > > > > sensitivity\" route and the work it requires on the AGC, as well\n> as the\n> > > > > > implications for applications. In the meantime, making the\n> exposure\n> > > > > > methods virtual allows these sensors to work and doesn't make a\n> later\n> > > > > > step towards per-mode sensitivities any more difficult. You would\n> > > > > > simply do that work and then delete your virtual exposure\n> methods.\n> > > > >\n> > > > > I'm sorry, but I still don't see why it requires making the below\n> > > > > functions virtual. They convert between exposure expressed in\n> lines and\n> > > > > exposure expressed as a time, why would that need to be\n> sensor-specific\n> > > > > ? I guess I need to see an actual use case first.\n> > > >\n> > > > This is entirely tangential to David's usage, but I also need this\n> method to\n> > > > be virtual so I can override this for imx477 ultra long exposure\n> modes. 
In\n> > > > this mode, we have some additional scaling to apply to the standard\n> > > > equation.\n> > >\n> > > I assume that in that case the exposure_lines argument, and the return\n> > > value of ExposureLines(), won't be expressed as a number of lines,\n> right\n> > > ?\n> >\n> > I have not worked through the details yet, but yes, this is possibly\n> going\n> > to be the case with the expected implementation.\n> >\n> > > Given that sensors also often have a fine exposure time expressed in\n> > > pixels, in addition to the coarse exposure time expressed in lines,\n> > > could we create an Exposure structure that could model exposure\n> settings\n> > > (coarse in lines, fine in pixels, plus other parameters for long\n> > > exposures) in a way that wouldn't be sensor-specific ?\n> >\n> > Perhaps that might be a way to go that could be generalised for all\n> sensors.\n>\n> Would you be able to check if this could be achieved for the different\n> sensors that you need to support ?\n>\n\nYes, I'll have a look and see what we can come up with.\n\n\n>\n> > I do still wonder about global shutter sensors though.  I've not worked\n> with\n> > many of them, but the ones I have expressed exposure in terms entirely\n> > unrelated to HTS/VTS, but maybe something to worry about when we\n> > come across it :-)\n>\n> Agreed.\n>\n> > > > > > Possibly I've repeated myself rather a lot there - but maybe I've\n> > > > > > explained it better this time? 
:)\n> > > > > >\n> > > > > > > > > > Signed-off-by: David Plowman <\n> david.plowman@raspberrypi.com>\n> > > > > > > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > > > > > > > >\n> > > > > > > > > Reviewed-by: Laurent Pinchart <\n> laurent.pinchart@ideasonboard.com>\n> > > > > > > > >\n> > > > > > > > > > ---\n> > > > > > > > > >  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n> > > > > > > > > >  1 file changed, 2 insertions(+), 2 deletions(-)\n> > > > > > > > > >\n> > > > > > > > > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp\n> b/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > > > > > > index 1b2d6eec..4053a870 100644\n> > > > > > > > > > --- a/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > > > > > > +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> > > > > > > > > > @@ -66,8 +66,8 @@ public:\n> > > > > > > > > >       virtual ~CamHelper();\n> > > > > > > > > >       void SetCameraMode(const CameraMode &mode);\n> > > > > > > > > >       MdParser &Parser() const { return *parser_; }\n> > > > > > > > > > -     uint32_t ExposureLines(double exposure_us) const;\n> > > > > > > > > > -     double Exposure(uint32_t exposure_lines) const; //\n> in us\n> > > > > > > > > > +     virtual uint32_t ExposureLines(double exposure_us)\n> const;\n> > > > > > > > > > +     virtual double Exposure(uint32_t exposure_lines)\n> const; // in us\n> > > > > > > > > >       virtual uint32_t GetVBlanking(double &exposure_us,\n> double minFrameDuration,\n> > > > > > > > > >                                     double\n> maxFrameDuration) const;\n> > > > > > > > > >       virtual uint32_t GainCode(double gain) const = 0;\n>\n> --\n> Regards,\n>\n> Laurent Pinchart\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS 
id 1422FBDE44\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 28 Apr 2021 14:12:59 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 4BAAF688E2;\n\tWed, 28 Apr 2021 16:12:58 +0200 (CEST)","from mail-lj1-x22d.google.com (mail-lj1-x22d.google.com\n\t[IPv6:2a00:1450:4864:20::22d])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 33A04688DC\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 28 Apr 2021 16:12:56 +0200 (CEST)","by mail-lj1-x22d.google.com with SMTP id o5so39321810ljc.1\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 28 Apr 2021 07:12:56 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"Dq9LAdKW\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=HSpdkmIad/JP3iRB/2amVBmTOuSO7yvFVJhaDw3uo+g=;\n\tb=Dq9LAdKWs0Zh18Csy3S/Ru106JBIAtUsD40mU19ENdbWQCQzxFIgqiXhH1nJZLWp9N\n\tSWTZ50tk/eNnx9QtWtYsJdCD355AgQslG7vVx6kQTLI38Aqqof95jGpB89nEl2kBKE/m\n\tn0wR51e1Rsq9gTc2y2apOm7hrLQKolm93o8TFwXfPU5PfpX+v8xgFEMEfr7Ch8N9eUyD\n\tGQxEouzHAbjsgwLtaBjoCZNFB8FPHZyomuCzeiWayUDebOSxHNEJGg2K6SAf3yYyBFht\n\tKvOUM5eTsSVCEBmnPLOEBdW92wxdUCBN1JTfLxKoOK+gUKLHQDgaw7W8WPdpZTcqalD3\n\tXGzw==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=HSpdkmIad/JP3iRB/2amVBmTOuSO7yvFVJhaDw3uo+g=;\n\tb=HGQNz8PrssdVQM6742cSM/xzSH7g3ZyVIglhMwRCPkWHiUbuzOwUMEwq8YF3b24eCe\n\tMgs6P4N3dqIEcSNP9OHTfpJVJpulMxyWJ6qX5ZFNCLXpbZPGJQBVd8KNV0kBnw/R/dPK\n\tJjUIdKV6clq6wn9Ut0wgzvj8W/LBBe9jHAXBrjzvC170zybFh+lnkPC4PjLKiKZp2wQw\n\tGP71wP8JMLA5U+w20qnhYeSOK4pjkj4OX7wd6k8Zu25jyEUGlmpcmI2+PhzhyNqJ0T9s\n\tBZhixLpKe2iIc1+4upsOp7NcXvoTe7aBBtudIrPAK7+EFC563otTfp5j2zd6UWiLEhmb\n\tLOBQ==","X-Gm-Message-State":"AOAM531OPFyKz6YU06mtV/GgW+3+Yy/DFcFgBSyeuqYn/TEpk6zoc5Am\n\tF4BBjo5QJ6JO/E7PvXGbBHR/69HgjvgieFsp5Oq7vQ==","X-Google-Smtp-Source":"ABdhPJzLvU0t/ihn5egZ8FEQFk5MLweakNecZCcissRRPsT09dWT2+eLowusi8q+rFcK3qievwmxnSeiKnRQ2XvAc1I=","X-Received":"by 2002:a05:651c:335:: with SMTP id\n\tb21mr20413969ljp.299.1619619175217; \n\tWed, 28 Apr 2021 07:12:55 -0700 (PDT)","MIME-Version":"1.0","References":"<20210414102955.9503-1-david.plowman@raspberrypi.com>\n\t<20210414102955.9503-2-david.plowman@raspberrypi.com>\n\t<YHoIBNpFTzakzIhi@pendragon.ideasonboard.com>\n\t<CAHW6GYL0csoeG3JH0tD4n=jg03GpcmxGGq=e6dcUvfrxtAuQ1g@mail.gmail.com>\n\t<YIetKmO+nQU2bS7A@pendragon.ideasonboard.com>\n\t<CAHW6GYKpdCD_VdPbaQJg_bzGV+-hi2PrMzxUQF8epTu9vG3B5Q@mail.gmail.com>\n\t<YIlSKYg+CjMlfkNs@pendragon.ideasonboard.com>\n\t<CAEmqJPpTT00SmEHwmH=XPoeSg-3cxqsmyoMhufjNWKXUjX8D0w@mail.gmail.com>\n\t<YIlkCw6Adu3sk5S2@pendragon.ideasonboard.com>\n\t<CAEmqJPrj9HAHp_Jn_4M0p7uRf6hDmo62sWXvP1b63G+Z-m1nmA@mail.gmail.com>\n\t<YIlruG6KTHqNHDF8@pendragon.ideasonboard.com>","In-Reply-To":"<YIlruG6KTHqNHDF8@pendragon.ideasonboard.com>","From":"Naushir Patuck <naush@raspberrypi.com>","Date":"Wed, 28 Apr 2021 15:12:39 +0100","Message-ID":"<CAEmqJPrfBK3a0iiddwsgmHpkUgjDWN6M4z5y1o-=UWT-sdQ-hQ@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods 
virtual","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"multipart/mixed;\n\tboundary=\"===============3394174990836841840==\"","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":16755,"web_url":"https://patchwork.libcamera.org/comment/16755/","msgid":"<CAHW6GYJN3ZpREM_AC9+uj2fV4nt8dYw-83bCiJ704oA4=OzotA@mail.gmail.com>","date":"2021-05-04T09:15:26","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi again\n\nThanks for all the various opinions. If I may wade in again... :)\n\nLike Naush, I have some more requirements for what the Exposure\nfunctions need to do. The sensor I'm currently dealing with has the\nconstraint - in some readout modes only - that the number of exposure\nlines has to be a multiple of 2. It's one of those \"quad Bayer\" or\n\"tetracell\" sensors, so you can imagine where this is coming from. 
You\nmight speculate that so-called \"nonacell\" sensors might want multiples\nof 3, though I've never had my hands on one of those!\n\nI suppose we could argue that the ControlInfos needs to record and\nadvertise the \"step\" that V4L2 parameters expose - we clearly need to\nfix up the exposure correctly before we pass it into the delayed\ncontrol. Or at least, it must have the true value when we read it out.\n\nPerhaps the problem with making this function virtual is that we've\nnever been totally clear what it's for. Maybe it really is supposed to\ndo nothing but a division, though at that point, shouldn't we remove\nit and just do the division? Perhaps its purpose is to hide away\npeculiarities of particular sensors (and modes) on the grounds that\nthey're going to be difficult to predict in advance for all future\ndevices; that probably sums up how I've always viewed it.\n\nI already treat the GainCode function rather like this. For example,\nwhere different modes have different minimum gains, or even different\nsensitivities, I soak that up in the virtual GainCode method, so it's\nnot \"give me the code for this gain\", it's \"give me the code I need to\nprogram for this gain, even if it's not really this exact gain\". Here\nI'm just wanting to give developers the *option* to get minimum gain\nimages if they're prepared deliberately to alter the exposure. I won't\nargue that some solution involving sensitivities and per-mode\nsensitivities is not better, but we'd need to decide how that\ninteracts with the AEC/AGC, with applications and so on. 
I'm feeling\nlike that might be another one of those rather long discussions...\n\nThanks!\nDavid\n\nOn Wed, 28 Apr 2021 at 15:12, Naushir Patuck <naush@raspberrypi.com> wrote:\n>\n>\n>\n> On Wed, 28 Apr 2021 at 15:05, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote:\n>>\n>> Hi Naush,\n>>\n>> On Wed, Apr 28, 2021 at 02:40:26PM +0100, Naushir Patuck wrote:\n>> > On Wed, 28 Apr 2021 at 14:33, Laurent Pinchart wrote:\n>> > > On Wed, Apr 28, 2021 at 01:35:08PM +0100, Naushir Patuck wrote:\n>> > > > On Wed, 28 Apr 2021 at 13:16, Laurent Pinchart wrote:\n>> > > > > On Tue, Apr 27, 2021 at 09:22:00AM +0100, David Plowman wrote:\n>> > > > > > On Tue, 27 Apr 2021 at 07:20, Laurent Pinchart wrote:\n>> > > > > > > On Sat, Apr 17, 2021 at 09:19:51AM +0100, David Plowman wrote:\n>> > > > > > > > On Fri, 16 Apr 2021 at 22:56, Laurent Pinchart wrote:\n>> > > > > > > > > On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:\n>> > > > > > > > > > This allows derived classes to override them if they have any special\n>> > > > > > > > > > behaviours to implement. For instance if a particular camera mode\n>> > > > > > > > > > produces a different signal level to other modes, you might choose to\n>> > > > > > > > > > address that in the gain or exposure methods.\n>> > > > > > > > >\n>> > > > > > > > > For my information, would you be able to give an example of how this\n>> > > > > > > > > would be used ? What kind of particular camera mode would produce a\n>> > > > > > > > > different signal level, is this related to binning, or something else ?\n>> > > > > > > >\n>> > > > > > > > Yes, so the sensor I'm looking at produces twice the signal level in\n>> > > > > > > > the binning modes that it does in the full resolution mode. 
I think\n>> > > > > > > > this difference would be awkward for applications which means we have\n>> > > > > > > > to hide it somewhere.\n>> > > > > > > >\n>> > > > > > > > One of my earlier emails talked a little bit about hiding this in the\n>> > > > > > > > AGC. You'd have to know a \"base gain\" for every mode (so 1 for full\n>> > > > > > > > res, 2 for binned), and then have some process where we have public\n>> > > > > > > > exposure/gain numbers (which are the same for all modes) and there's a\n>> > > > > > > > process of conversion into and out of the AGC and IPA which deal with\n>> > > > > > > > real exposure/gain values. But I'm really not sure I want to go there,\n>> > > > > > > > at least right now!\n>> > > > > > > >\n>> > > > > > > > I think this is a pragmatic alternative. You can already hide the\n>> > > > > > > > difference by changing the CamHelper's Gain and GainCode functions, to\n>> > > > > > > > apply the magic extra factor of 2 when necessary. This change here\n>> > > > > > > > makes it possible to apply that factor in the exposure instead - you\n>> > > > > > > > could imagine someone wanting the lowest noise possible and accepting\n>> > > > > > > > that the real exposure will be double (I wouldn't want to double\n>> > > > > > > > exposures without people explicitly making that choice). And finally,\n>> > > > > > > > this change doesn't make it any more difficult to do the compensation\n>> > > > > > > > in the AGC at a later date, should we wish.\n>> > > > > > > >\n>> > > > > > > > In the CamHelper for my new sensor, I even make it easy to mix the\n>> > > > > > > > factor of 2 across both gain and exposure, so you could, for example,\n>> > > > > > > > have sqrt(2) in each.\n>> > > > > > > >\n>> > > > > > > > That got rather long... I hope it was understandable!\n>> > > > > > >\n>> > > > > > > Thanks for the information. There may be something that I don't get\n>> > > > > > > though, as I don't see why the exposure needs to be affected. 
The\n>> > > > > > > exposure time has a big influence on motion blur, so it seems to me that\n>> > > > > > > we shouldn't cheat here. The gain seems to be a better place to hide\n>> > > > > > > this sensor feature. I wonder if we should consider moving from gain to\n>> > > > > > > sensor sensitivity.\n>> > > > > >\n>> > > > > > You're right, exposure doesn't *need* to be affected, and indeed my\n>> > > > > > default position is to handle the difference using gain instead. The\n>> > > > > > intention is that people are *able* to keep the gain to a minimum if\n>> > > > > > they want to, at the cost of changing the exposure - but this has to\n>> > > > > > be a deliberate choice.\n>> > > > > >\n>> > > > > > I agree the \"sensitivity\" is a good idea, and indeed a much better\n>> > > > > > term than I used above, which was \"base gain\".\n>> > > > > >\n>> > > > > > Note that sensitivity is a per-mode thing here. I think it's a good\n>> > > > > > solution but has some consequences:\n>> > > > > >\n>> > > > > > - If an application wants to change camera mode and set the exposure\n>> > > > > > they're going to have to deal with per-mode sensitivities and adapt\n>> > > > > > their numbers accordingly. Do we like this? I'm always very nervous of\n>> > > > > > complicating an application's life!\n>> > > > > >\n>> > > > > > - You could try and handle per-mode sensitivities either outside the\n>> > > > > > AGC (effectively that's what this allows) or you could handle it\n>> > > > > > within the AGC. Whenever the camera mode changes you'd have to adjust\n>> > > > > > the state variables according to the ratio of old_sensitivity /\n>> > > > > > new_sensitivity. You'd probably have to use a different exposure\n>> > > > > > profile, possibly it needs to be set by the application - or could we\n>> > > > > > do something more automatic? 
The implementation would also depend on\n>> > > > > > whether we want applications to have to pay attention to the \"per-mode\n>> > > > > > sensitivities\" or not.\n>> > > > > >\n>> > > > > > So overall, I'm in several minds about going this \"per-mode\n>> > > > > > sensitivity\" route and the work it requires on the AGC, as well as the\n>> > > > > > implications for applications. In the meantime, making the exposure\n>> > > > > > methods virtual allows these sensors to work and doesn't make a later\n>> > > > > > step towards per-mode sensitivities any more difficult. You would\n>> > > > > > simply do that work and then delete your virtual exposure methods.\n>> > > > >\n>> > > > > I'm sorry, but I still don't see why it requires making the below\n>> > > > > functions virtual. They convert between exposure expressed in lines and\n>> > > > > exposure expressed as a time, why would that need to be sensor-specific\n>> > > > > ? I guess I need to see an actual use case first.\n>> > > >\n>> > > > This is entirely tangential to David's usage, but I also need this method to\n>> > > > be virtual so I can override this for imx477 ultra long exposure modes. 
In\n>> > > > this mode, we have some additional scaling to apply to the standard\n>> > > > equation.\n>> > >\n>> > > I assume that in that case the exposure_lines argument, and the return\n>> > > value of ExposureLines(), won't be expressed as a number of lines, right\n>> > > ?\n>> >\n>> > I have not worked through the details yet, but yes, this is possibly going\n>> > to be the case with the expected implementation.\n>> >\n>> > > Given that sensors also often have a fine exposure time expressed in\n>> > > pixels, in addition to the coarse exposure time expressed in lines,\n>> > > could we create an Exposure structure that could model exposure settings\n>> > > (coarse in lines, fine in pixels, plus other parameters for long\n>> > > exposures) in a way that wouldn't be sensor-specific ?\n>> >\n>> > Perhaps that might be a way to go that could be generalised for all sensors.\n>>\n>> Would you be able to check if this could be achieved for the different\n>> sensors that you need to support ?\n>\n>\n> Yes, I'll have a look and see what we can come up with.\n>\n>>\n>>\n>> > I do still wonder about global shutter sensors though.  I've not worked with\n>> > many of them, but the ones I have expressed exposure in terms entirely\n>> > unrelated to HTS/VTS, but maybe something to worry about when we\n>> > come across it :-)\n>>\n>> Agreed.\n>>\n>> > > > > > Possibly I've repeated myself rather a lot there - but maybe I've\n>> > > > > > explained it better this time? 
:)\n>> > > > > >\n>> > > > > > > > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n>> > > > > > > > > > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n>> > > > > > > > >\n>> > > > > > > > > Reviewed-by: Laurent Pinchart < laurent.pinchart@ideasonboard.com>\n>> > > > > > > > >\n>> > > > > > > > > > ---\n>> > > > > > > > > >  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n>> > > > > > > > > >  1 file changed, 2 insertions(+), 2 deletions(-)\n>> > > > > > > > > >\n>> > > > > > > > > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp\n>> > > > > > > > > > index 1b2d6eec..4053a870 100644\n>> > > > > > > > > > --- a/src/ipa/raspberrypi/cam_helper.hpp\n>> > > > > > > > > > +++ b/src/ipa/raspberrypi/cam_helper.hpp\n>> > > > > > > > > > @@ -66,8 +66,8 @@ public:\n>> > > > > > > > > >       virtual ~CamHelper();\n>> > > > > > > > > >       void SetCameraMode(const CameraMode &mode);\n>> > > > > > > > > >       MdParser &Parser() const { return *parser_; }\n>> > > > > > > > > > -     uint32_t ExposureLines(double exposure_us) const;\n>> > > > > > > > > > -     double Exposure(uint32_t exposure_lines) const; // in us\n>> > > > > > > > > > +     virtual uint32_t ExposureLines(double exposure_us) const;\n>> > > > > > > > > > +     virtual double Exposure(uint32_t exposure_lines) const; // in us\n>> > > > > > > > > >       virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,\n>> > > > > > > > > >                                     double maxFrameDuration) const;\n>> > > > > > > > > >       virtual uint32_t GainCode(double gain) const = 0;\n>>\n>> --\n>> Regards,\n>>\n>> Laurent Pinchart","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with 
ESMTPS id 2ACF7BDE79\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue,  4 May 2021 09:15:41 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 8771B6891A;\n\tTue,  4 May 2021 11:15:40 +0200 (CEST)","from mail-wr1-x42d.google.com (mail-wr1-x42d.google.com\n\t[IPv6:2a00:1450:4864:20::42d])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id D0B8C6890C\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue,  4 May 2021 11:15:38 +0200 (CEST)","by mail-wr1-x42d.google.com with SMTP id t18so8558781wry.1\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 04 May 2021 02:15:38 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"bkg5XfCF\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=nbWZXFS1etP2TM1BBTHXnE8qE0uhrtwc05y3CzBvWSU=;\n\tb=bkg5XfCFTIeo3+cXGP61GBT7EnFxdMpyLYQ+fTKZmG4hpwg7qGWOXCRtzXsEScRpju\n\tg3xwTdhjP50bsA0wEhSpSw3Vp2cvlGWfFaLVpy5ixJ6J6bpHsFslYI0B3K02hXvZVgGD\n\tMpjS3E+YYIX0teM7bG0KFKIfAM1dMA6/+VdoeHlFfASgeOanqDGyklrBEoUwgZqhHdle\n\tuYbqZVDSvCc9pM5SERkciv6K6oPQkEz0vnH5NNFHfQK/ZIXBBxri2BXx3pj+nZTJw7Jw\n\tt2JoSwUTjSYLiENEooIHnO2s+qsGOoJYVAGaTF0HzLJ5Meb4QEPS1A/P9L2tkvPd54Bp\n\tDr7A==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=nbWZXFS1etP2TM1BBTHXnE8qE0uhrtwc05y3CzBvWSU=;\n\tb=uFLvnHutIoV8QAUq7xoVJ1MDLrcyKALjXCeq/Pt4hp2jEyS8lk2jj4Dd6jxc+fwhu4\n\tnSDY55wnGFTyx0vRfBZOp3iITrF8uteLF+DojMVW2ChB1EyIJhmeeEUJjdlVdZ5r/VYi\n\teJ1FdBEtQXWaVHy4vEvkqPoL6PgMonMz2YkT/GVlPgpSzZ1yKvBQIrOcSjtXcfw1AxFH\n\t1D/TmU0VcKhDjWZW3bbivSvbiE/nY5KaqnVyK4dMXRsfIGD/s2d/788d2wk3BeT2vXij\n\tGf6t6HD9UKHl6eNhTNHXBHQGc83E2Dou+NyRPtATaP0iUBWcKL/mgcArc62FLDPEOgxE\n\t9i6A==","X-Gm-Message-State":"AOAM531qNFX/LUy7kzooppAvwAOXedelkjYc0Sc+5qOU/Bssmthh324u\n\tdCKgNanmFS2ZfVhVIR97zy4Lm7e6lHSAV6ARZgLJJGcG6X4=","X-Google-Smtp-Source":"ABdhPJzdIZQztce0iqgG6eCQyyfh+Hr9DBpGySc0Ay7tODOvxhrMo7h69CqXFSAUV/5vG/+0chvWYOWvqq9jprIwiic=","X-Received":"by 2002:a05:6000:18a4:: with SMTP id\n\tb4mr23438407wri.86.1620119738446; \n\tTue, 04 May 2021 02:15:38 -0700 (PDT)","MIME-Version":"1.0","References":"<20210414102955.9503-1-david.plowman@raspberrypi.com>\n\t<20210414102955.9503-2-david.plowman@raspberrypi.com>\n\t<YHoIBNpFTzakzIhi@pendragon.ideasonboard.com>\n\t<CAHW6GYL0csoeG3JH0tD4n=jg03GpcmxGGq=e6dcUvfrxtAuQ1g@mail.gmail.com>\n\t<YIetKmO+nQU2bS7A@pendragon.ideasonboard.com>\n\t<CAHW6GYKpdCD_VdPbaQJg_bzGV+-hi2PrMzxUQF8epTu9vG3B5Q@mail.gmail.com>\n\t<YIlSKYg+CjMlfkNs@pendragon.ideasonboard.com>\n\t<CAEmqJPpTT00SmEHwmH=XPoeSg-3cxqsmyoMhufjNWKXUjX8D0w@mail.gmail.com>\n\t<YIlkCw6Adu3sk5S2@pendragon.ideasonboard.com>\n\t<CAEmqJPrj9HAHp_Jn_4M0p7uRf6hDmo62sWXvP1b63G+Z-m1nmA@mail.gmail.com>\n\t<YIlruG6KTHqNHDF8@pendragon.ideasonboard.com>\n\t<CAEmqJPrfBK3a0iiddwsgmHpkUgjDWN6M4z5y1o-=UWT-sdQ-hQ@mail.gmail.com>","In-Reply-To":"<CAEmqJPrfBK3a0iiddwsgmHpkUgjDWN6M4z5y1o-=UWT-sdQ-hQ@mail.gmail.com>","From":"David Plowman <david.plowman@raspberrypi.com>","Date":"Tue, 4 May 2021 10:15:26 +0100","Message-ID":"<CAHW6GYJN3ZpREM_AC9+uj2fV4nt8dYw-83bCiJ704oA4=OzotA@mail.gmail.com>","To":"Naushir Patuck <naush@raspberrypi.com>","Subject":"Re: 
[libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":16758,"web_url":"https://patchwork.libcamera.org/comment/16758/","msgid":"<YJE/QqoJqBkHnGSN@pendragon.ideasonboard.com>","date":"2021-05-04T12:34:10","subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nThank you for the patch.\n\nOn Tue, May 04, 2021 at 10:15:26AM +0100, David Plowman wrote:\n> Hi again\n> \n> Thanks for all the various opinions. If I may wade in again... :)\n> \n> Like Naush, I have some more requirements for what the Exposure\n> functions need to do. The sensor I'm currently dealing with has the\n> constraint - in some readout modes only - that the number of exposure\n> lines has to be a multiple of 2. 
It's one of those \"quad Bayer\" or\n> \"tetracell\" sensors, so you can imagine where this is coming from. You\n> might speculate that so-called \"nonacell\" sensors might want multiples\n> of 3, though I've never had my hands on one of those!\n> \n> I suppose we could argue that the ControlInfos needs to record and\n> advertise the \"step\" that V4L2 parameters expose - we clearly need to\n> fix up the exposure correctly before we pass it into the delayed\n> control. Or at least, it must have the true value when we read it out.\n\nThere's also information we could hardcode in userspace. Generally\nspeaking, querying information automatically is nice, but it can become\ntricky when the information isn't static, as we've seen with the limits\nfor the vertical blanking control. I'm not opposed to storing\ninformation in libcamera, but that should preferably be done as data, in\nthe sensor database, instead of virtual functions. I'm not fond of\nfunctions that adjust parameters in an unspecified way, as they may work\n\"magically\" in some use cases, but fail in other ones. I prefer giving\nenough information to the caller to figure out the needed adjustments, with\nthe option of having a standard helper performing the most common\nadjustments in a documented way.\n\n> Perhaps the problem with making this function virtual is that we've\n> never been totally clear what it's for.\n\nI think that matches the reason why I dislike virtual functions that\nI've just explained :-)\n\n> Maybe it really is supposed to\n> do nothing but a division, though at that point, shouldn't we remove\n> it and just do the division? Perhaps its purpose is to hide away\n> peculiarities of particular sensors (and modes) on the grounds that\n> they're going to be difficult to predict in advance for all future\n> devices; that probably sums up how I've always viewed it.\n> \n> I already treat the GainCode function rather like this. 
For example,\n> where different modes have different minimum gains, or even different\n> sensitivities, I soak that up in the virtual GainCode method, so it's\n> not \"give me the code for this gain\", it's \"give me the code I need to\n> program for this gain, even if it's not really this exact gain\".\n\nThat's the part that bothers me. If we can document the function to\nspecify exactly what types of adjustments it can make, then I'm OK; the\ncaller can decide whether that's fine, or if its use cases require\nsomething different. It's the black box aspect that I don't like.\n\n> Here\n> I'm just wanting to give developers the *option* to get minimum gain\n> images if they're prepared deliberately to alter the exposure. I won't\n> argue that some solution involving sensitivities and per-mode\n> sensitivities is not better, but we'd need to decide how that\n> interacts with the AEC/AGC, with applications and so on. I'm feeling\n> like that might be another one of those rather long discussions...\n\nIt's a needed discussion though, so maybe we should bite the bullet\ninstead of shoving it under the carpet :-) \n\n> On Wed, 28 Apr 2021 at 15:12, Naushir Patuck wrote:\n> > On Wed, 28 Apr 2021 at 15:05, Laurent Pinchart wrote:\n> >> On Wed, Apr 28, 2021 at 02:40:26PM +0100, Naushir Patuck wrote:\n> >>> On Wed, 28 Apr 2021 at 14:33, Laurent Pinchart wrote:\n> >>>> On Wed, Apr 28, 2021 at 01:35:08PM +0100, David Plowman wrote:\n> >>>>> On Wed, 28 Apr 2021 at 13:16, Laurent Pinchart wrote:\n> >>>>>> On Tue, Apr 27, 2021 at 09:22:00AM +0100, David Plowman wrote:\n> >>>>>>> On Tue, 27 Apr 2021 at 07:20, Laurent Pinchart wrote:\n> >>>>>>>> On Sat, Apr 17, 2021 at 09:19:51AM +0100, David Plowman wrote:\n> >>>>>>>>> On Fri, 16 Apr 2021 at 22:56, Laurent Pinchart wrote:\n> >>>>>>>>>> On Wed, Apr 14, 2021 at 11:29:53AM +0100, David Plowman wrote:\n> >>>>>>>>>>> This allows derived classes to override them if they have any special\n> >>>>>>>>>>> behaviours to implement. 
For instance if a particular camera mode\n> >>>>>>>>>>> produces a different signal level to other modes, you might choose to\n> >>>>>>>>>>> address that in the gain or exposure methods.\n> >>>>>>>>>>\n> >>>>>>>>>> For my information, would you be able to give an example of how this\n> >>>>>>>>>> would be used ? What kind of particular camera mode would produce a\n> >>>>>>>>>> different signal level, is this related to binning, or something else ?\n> >>>>>>>>>\n> >>>>>>>>> Yes, so the sensor I'm looking at produces twice the signal level in\n> >>>>>>>>> the binning modes that it does in the full resolution mode. I think\n> >>>>>>>>> this difference would be awkward for applications which means we have\n> >>>>>>>>> to hide it somewhere.\n> >>>>>>>>>\n> >>>>>>>>> One of my earlier emails talked a little bit about hiding this in the\n> >>>>>>>>> AGC. You'd have to know a \"base gain\" for every mode (so 1 for full\n> >>>>>>>>> res, 2 for binned), and then have some process where we have public\n> >>>>>>>>> exposure/gain numbers (which are the same for all modes) and there's a\n> >>>>>>>>> process of conversion into and out of the AGC and IPA which deal with\n> >>>>>>>>> real exposure/gain values. But I'm really not sure I want to go there,\n> >>>>>>>>> at least right now!\n> >>>>>>>>>\n> >>>>>>>>> I think this is a pragmatic alternative. You can already hide the\n> >>>>>>>>> difference by changing the CamHelper's Gain and GainCode functions, to\n> >>>>>>>>> apply the magic extra factor of 2 when necessary. This change here\n> >>>>>>>>> makes it possible to apply that factor in the exposure instead - you\n> >>>>>>>>> could imagine someone wanting the lowest noise possible and accepting\n> >>>>>>>>> that the real exposure will be double (I wouldn't want to double\n> >>>>>>>>> exposures without people explicitly making that choice). 
And finally,\n> >>>>>>>>> this change doesn't make it any more difficult to do the compensation\n> >>>>>>>>> in the AGC at a later date, should we wish.\n> >>>>>>>>>\n> >>>>>>>>> In the CamHelper for my new sensor, I even make it easy to mix the\n> >>>>>>>>> factor of 2 across both gain and exposure, so you could, for example,\n> >>>>>>>>> have sqrt(2) in each.\n> >>>>>>>>>\n> >>>>>>>>> That got rather long... I hope it was understandable!\n> >>>>>>>>\n> >>>>>>>> Thanks for the information. There may be something that I don't get\n> >>>>>>>> though, as I don't see why the exposure needs to be affected. The\n> >>>>>>>> exposure time has a big influence on motion blur, so it seems to me that\n> >>>>>>>> we shouldn't cheat here. The gain seems to be a better place to hide\n> >>>>>>>> this sensor feature. I wonder if we should consider moving from gain to\n> >>>>>>>> sensor sensitivity.\n> >>>>>>>\n> >>>>>>> You're right, exposure doesn't *need* to be affected, and indeed my\n> >>>>>>> default position is to handle the difference using gain instead. The\n> >>>>>>> intention is that people are *able* to keep the gain to a minimum if\n> >>>>>>> they want to, at the cost of changing the exposure - but this has to\n> >>>>>>> be a deliberate choice.\n> >>>>>>>\n> >>>>>>> I agree the \"sensitivity\" is a good idea, and indeed a much better\n> >>>>>>> term than I used above, which was \"base gain\".\n> >>>>>>>\n> >>>>>>> Note that sensitivity is a per-mode thing here. I think it's a good\n> >>>>>>> solution but has some consequences:\n> >>>>>>>\n> >>>>>>> - If an application wants to change camera mode and set the exposure\n> >>>>>>> they're going to have to deal with per-mode sensitivities and adapt\n> >>>>>>> their numbers accordingly. Do we like this? 
I'm always very nervous of\n> >>>>>>> complicating an application's life!\n> >>>>>>>\n> >>>>>>> - You could try and handle per-mode sensitivities either outside the\n> >>>>>>> AGC (effectively that's what this allows) or you could handle it\n> >>>>>>> within the AGC. Whenever the camera mode changes you'd have to adjust\n> >>>>>>> the state variables according to the ratio of old_sensitivity /\n> >>>>>>> new_sensitivity. You'd probably have to use a different exposure\n> >>>>>>> profile, possibly it needs to be set by the application - or could we\n> >>>>>>> do something more automatic? The implementation would also depend on\n> >>>>>>> whether we want applications to have to pay attention to the \"per-mode\n> >>>>>>> sensitivities\" or not.\n> >>>>>>>\n> >>>>>>> So overall, I'm in several minds about going this \"per-mode\n> >>>>>>> sensitivity\" route and the work it requires on the AGC, as well as the\n> >>>>>>> implications for applications. In the meantime, making the exposure\n> >>>>>>> methods virtual allows these sensors to work and doesn't make a later\n> >>>>>>> step towards per-mode sensitivities any more difficult. You would\n> >>>>>>> simply do that work and then delete your virtual exposure methods.\n> >>>>>>\n> >>>>>> I'm sorry, but I still don't see why it requires making the below\n> >>>>>> functions virtual. They convert between exposure expressed in lines and\n> >>>>>> exposure expressed as a time, why would that need to be sensor-specific\n> >>>>>> ? I guess I need to see an actual use case first.\n> >>>>>\n> >>>>> This is entirely tangential to David's usage, but I also need this method to\n> >>>>> be virtual so I can override this for imx477 ultra long exposure modes. 
In\n> >>>>> this mode, we have some additional scaling to apply to the standard\n> >>>>> equation.\n> >>>>\n> >>>> I assume that in that case the exposure_lines argument, and the return\n> >>>> value of ExposureLines(), won't be expressed as a number of lines, right\n> >>>> ?\n> >>>\n> >>> I have not worked through the details yet, but yes, this is possibly going\n> >>> to be the case with the expected implementation.\n> >>>\n> >>>> Given that sensors also often have a fine exposure time expressed in\n> >>>> pixels, in addition to the coarse exposure time expressed in lines,\n> >>>> could we create an Exposure structure that could model exposure settings\n> >>>> (coarse in lines, fine in pixels, plus other parameters for long\n> >>>> exposures) in a way that wouldn't be sensor-specific ?\n> >>>\n> >>> Perhaps that might be a way to go that could be generalised for all sensors.\n> >>\n> >> Would you be able to check if this could be achieved for the different\n> >> sensors that you need to support ?\n> >\n> > Yes, I'll have a look and see what we can come up with.\n> >\n> >>> I do still wonder about global shutter sensors though.  I've not worked with\n> >>> many of them, but the ones I have expressed exposure in terms entirely\n> >>> unrelated to HTS/VTS, but maybe something to worry about when we\n> >>> come across it :-)\n> >>\n> >> Agreed.\n> >>\n> >>>>>>> Possibly I've repeated myself rather a lot there - but maybe I've\n> >>>>>>> explained it better this time? 
:)\n> >>>>>>>\n> >>>>>>>>>>> Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> >>>>>>>>>>> Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> >>>>>>>>>>\n> >>>>>>>>>> Reviewed-by: Laurent Pinchart < laurent.pinchart@ideasonboard.com>\n> >>>>>>>>>>\n> >>>>>>>>>>> ---\n> >>>>>>>>>>>  src/ipa/raspberrypi/cam_helper.hpp | 4 ++--\n> >>>>>>>>>>>  1 file changed, 2 insertions(+), 2 deletions(-)\n> >>>>>>>>>>>\n> >>>>>>>>>>> diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp\n> >>>>>>>>>>> index 1b2d6eec..4053a870 100644\n> >>>>>>>>>>> --- a/src/ipa/raspberrypi/cam_helper.hpp\n> >>>>>>>>>>> +++ b/src/ipa/raspberrypi/cam_helper.hpp\n> >>>>>>>>>>> @@ -66,8 +66,8 @@ public:\n> >>>>>>>>>>>       virtual ~CamHelper();\n> >>>>>>>>>>>       void SetCameraMode(const CameraMode &mode);\n> >>>>>>>>>>>       MdParser &Parser() const { return *parser_; }\n> >>>>>>>>>>> -     uint32_t ExposureLines(double exposure_us) const;\n> >>>>>>>>>>> -     double Exposure(uint32_t exposure_lines) const; // in us\n> >>>>>>>>>>> +     virtual uint32_t ExposureLines(double exposure_us) const;\n> >>>>>>>>>>> +     virtual double Exposure(uint32_t exposure_lines) const; // in us\n> >>>>>>>>>>>       virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,\n> >>>>>>>>>>>                                     double maxFrameDuration) const;\n> >>>>>>>>>>>       virtual uint32_t GainCode(double gain) const = 0;","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 5C07EBDE79\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue,  4 May 2021 12:34:16 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 
D25056890E;\n\tTue,  4 May 2021 14:34:15 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 04502602BF\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue,  4 May 2021 14:34:13 +0200 (CEST)","from pendragon.ideasonboard.com (62-78-145-57.bb.dnainternet.fi\n\t[62.78.145.57])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 8604058E;\n\tTue,  4 May 2021 14:34:13 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"eN520B/l\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1620131653;\n\tbh=Lje99Qq0EHs0Y9JCtDV50yNlT80lgoAjl1v01g/woQo=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=eN520B/l19OnVVyWH6DwQ8O2oOr7shR16bOCIipUxJAGVlip/RDuWvh/JfFjOuyBW\n\tYxG0d5lLgRo4g2QmjWiOJ6JBRcpxNTe90+fmqztp5QUO1tCecTrUAnyRpI5cy6ancg\n\tH0OmPhmA1/rl+ArM6CJiViaNupLnzpsega8TR7E8=","Date":"Tue, 4 May 2021 15:34:10 +0300","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"David Plowman 
<david.plowman@raspberrypi.com>","Message-ID":"<YJE/QqoJqBkHnGSN@pendragon.ideasonboard.com>","References":"<CAHW6GYL0csoeG3JH0tD4n=jg03GpcmxGGq=e6dcUvfrxtAuQ1g@mail.gmail.com>\n\t<YIetKmO+nQU2bS7A@pendragon.ideasonboard.com>\n\t<CAHW6GYKpdCD_VdPbaQJg_bzGV+-hi2PrMzxUQF8epTu9vG3B5Q@mail.gmail.com>\n\t<YIlSKYg+CjMlfkNs@pendragon.ideasonboard.com>\n\t<CAEmqJPpTT00SmEHwmH=XPoeSg-3cxqsmyoMhufjNWKXUjX8D0w@mail.gmail.com>\n\t<YIlkCw6Adu3sk5S2@pendragon.ideasonboard.com>\n\t<CAEmqJPrj9HAHp_Jn_4M0p7uRf6hDmo62sWXvP1b63G+Z-m1nmA@mail.gmail.com>\n\t<YIlruG6KTHqNHDF8@pendragon.ideasonboard.com>\n\t<CAEmqJPrfBK3a0iiddwsgmHpkUgjDWN6M4z5y1o-=UWT-sdQ-hQ@mail.gmail.com>\n\t<CAHW6GYJN3ZpREM_AC9+uj2fV4nt8dYw-83bCiJ704oA4=OzotA@mail.gmail.com>","MIME-Version":"1.0","Content-Disposition":"inline","In-Reply-To":"<CAHW6GYJN3ZpREM_AC9+uj2fV4nt8dYw-83bCiJ704oA4=OzotA@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH v3 1/3] ipa: raspberrypi: Make\n\tCamHelper exposure methods virtual","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}}]