[{"id":11732,"web_url":"https://patchwork.libcamera.org/comment/11732/","msgid":"<20200730163804.gyhkgcrcwwbfn2tj@uno.localdomain>","date":"2020-07-30T16:38:04","subject":"Re: [libcamera-devel] [PATCH v3 1/2] libcamera: Infrastructure for\n\tdigital zoom","submitter":{"id":3,"url":"https://patchwork.libcamera.org/api/people/3/","name":"Jacopo Mondi","email":"jacopo@jmondi.org"},"content":"Hi David,\n  sorry for jumping late on the topic, I managed to read the\ndiscussions in v1 and v2 only today.\n\nOn Thu, Jul 23, 2020 at 09:43:37AM +0100, David Plowman wrote:\n> These changes add a Digital Zoom control, taking a rectangle as its\n> argument, indicating the region of the sensor output that the\n> pipeline will \"zoom up\" to the final output size.\n>\n> Additionally, we need to have a method returning the \"pipelineCrop\"\n> which gives the dimensions of the sensor output, taken by the\n> pipeline, and within which we can subsequently pan and zoom.\n\nThis is the part I don't get. What's the intended user of this\ninformation ? Is this meant to be provided to applications so that\nthey know which is the area inside which they can later pan/zoom, right ?\n\nI'm kind of against ad-hoc functions to retrieve information about\nthe image processing pipeline configuration. I can easily see them\nmultiply and make our API awful, and each platform would need to add\nits own very-specific bits, and that certainly can't go through the\nmain Camera class API.\n\nIf I got your use case right:\n\n1) You need to report the pipeline crop area, which is a rectangle\n   defined on the frame that is produced by the sensor.\n   Using your example:\n\n        sensorSize = 1920x1440\n        pipelineCrop = 1920x1080\n\n   It seems to me the 'pipelineCrop' is a good candidate to be a\n   CameraConfiguration property, which at the moment is not yet\n   implemented. 
Roughly speaking, we have been discussing it in the\n   context of frame durations, and it's about augmenting the\n   CameraConfiguration class with a set of properties that depend on\n   the currently applied configuration of the enabled streams.\n\n   Ideally we should be able to receive a CameraConfiguration, validate\n   it, and populate it with properties, like the minimum frame\n   duration (which is a combination of the durations of all streams), and\n   now with this new 'pipelineCrop', which is the result of a decision\n   the pipeline handler takes based on the requested output stream\n   sizes.\n\n   I think the ideal plan forward for this would be to define a new\n   control/property (I would say property, as it's immutable from\n   applications) that is set by pipeline handlers during validate()\n   when the sensor mode is selected and eventually cropped.\n\n   Applications can access it from the CameraConfiguration and decide\n   where to set their pan/zoom rectangle.\n\n   How the pipeline handler applies the cropping to the sensor\n   produced frame is less relevant imo. Systems which have direct access\n   to the v4l2 subdev will probably use the crop/compose rectangle\n   applied on the sensor. In your case I see you use the crop target\n   on the ISP video device node. Am I wrong, or could you calculate\n   that at validate() time, once you know all the desired output sizes and\n   based on them identify the 'best' sensor mode to use ?\n\n2) Applications, once they know the 'pipelineCrop' (I'm ok with that\n   term btw, as we have analogCrop and we will have a digitalCrop for\n   the sensor's processing stages, so pipelineCrop feels right), can\n   set their pan/zoom rectangle with a control, such as the one you have\n   defined below. Now, what do you think the reference of that rectangle\n   should be ? I mean, should the 'pipelineCrop' always be in position\n   (0,0) and the 'digitalZoom' be applied on top of it ? 
I would say\n   so, but then we lose the information on which part of the image\n   the pipeline decided to crop, and I'm not sure whether it is\n   relevant or not.\n\n   The other possibility is to report the 'pipelineCrop' in respect to\n   the full active pixel array size, and ask applications to provide a\n   'digitalZoom' rectangle with the same reference. Is it worth the\n   additional complication in your opinion?\n\n   A third option is to report the sensor frame size AND the pipeline crop\n   defined in respect to it, and let applications provide a digitalZoom\n   rectangle defined in respect to the pipeline crop. This would look\n   like (numbers are random):\n        sensorFrameSize = {0, 0, 1920, 1440}\n        pipelineCrop = { 0, 150, 1920, 1080}\n        digitalZoom = {300, 300, 1280, 720}\n\n   This would allow applications to know that the sensor frame is\n   1920x1440, the pipeline selected a region of 1920x1080 by applying\n   an offset of 150 pixels in the vertical direction, and the desired\n   digital zoom is then applied on this last rectangle (resulting in an\n   effective (300,450) dislocation from the full sensor frame).\n\nI won't bikeshed on names too much, just leave as a reference that\nAndroid reports the same property we are defining as pipelineCrop as\n'scaler.cropRegion', as for them 'scaler' is used to report properties\nof the ISP processing pipeline.\n\nI'm sorry for the wall of text, I wish we had a blackboard :)\n\nThanks\n  j\n\n>\n> Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> ---\n>  include/libcamera/camera.h                    |  2 ++\n>  include/libcamera/internal/pipeline_handler.h |  4 +++\n>  src/libcamera/camera.cpp                      | 27 +++++++++++++++++++\n>  src/libcamera/control_ids.yaml                | 10 +++++++\n>  4 files changed, 43 insertions(+)\n>\n> diff --git a/include/libcamera/camera.h b/include/libcamera/camera.h\n> index 4d1a4a9..6819b8e 100644\n> --- a/include/libcamera/camera.h\n> +++ 
b/include/libcamera/camera.h\n> @@ -92,6 +92,8 @@ public:\n>  \tstd::unique_ptr<CameraConfiguration> generateConfiguration(const StreamRoles &roles = {});\n>  \tint configure(CameraConfiguration *config);\n>\n> +\tSize const &getPipelineCrop() const;\n> +\n>  \tRequest *createRequest(uint64_t cookie = 0);\n>  \tint queueRequest(Request *request);\n>\n> diff --git a/include/libcamera/internal/pipeline_handler.h b/include/libcamera/internal/pipeline_handler.h\n> index 22e629a..5bfe890 100644\n> --- a/include/libcamera/internal/pipeline_handler.h\n> +++ b/include/libcamera/internal/pipeline_handler.h\n> @@ -89,6 +89,8 @@ public:\n>\n>  \tconst char *name() const { return name_; }\n>\n> +\tSize const &getPipelineCrop() const { return pipelineCrop_; }\n> +\n>  protected:\n>  \tvoid registerCamera(std::shared_ptr<Camera> camera,\n>  \t\t\t    std::unique_ptr<CameraData> data);\n> @@ -100,6 +102,8 @@ protected:\n>\n>  \tCameraManager *manager_;\n>\n> +\tSize pipelineCrop_;\n> +\n>  private:\n>  \tvoid mediaDeviceDisconnected(MediaDevice *media);\n>  \tvirtual void disconnect();\n> diff --git a/src/libcamera/camera.cpp b/src/libcamera/camera.cpp\n> index 69a1b44..f8b8ec6 100644\n> --- a/src/libcamera/camera.cpp\n> +++ b/src/libcamera/camera.cpp\n> @@ -793,6 +793,33 @@ int Camera::configure(CameraConfiguration *config)\n>  \treturn 0;\n>  }\n>\n> +/**\n> + * \\brief Return the size of the sensor image being used by the pipeline\n> + * to create the output.\n> + *\n> + * This method returns the size, in pixels, of the raw image read from the\n> + * sensor and which is used by the pipeline to form the output image(s)\n> + * (rescaling if necessary). Note that these values take account of any\n> + * cropping performed on the sensor output so as to produce the correct\n> + * aspect ratio. 
It would normally be necessary to retrieve these values\n> + * in order to calculate correct parameters for digital zoom.\n> + *\n> + * Example: a sensor mode may produce a 1920x1440 output image. But if an\n> + * application has requested a 16:9 image, the values returned here might\n> + * be 1920x1080 - the largest portion of the sensor output that provides\n> + * the correct aspect ratio.\n> + *\n> + * \\context This function is \\threadsafe. It will only return valid\n> + * (non-zero) values when the camera has been configured.\n> + *\n> + * \\return The dimensions of the sensor image used by the pipeline.\n> + */\n> +\n> +Size const &Camera::getPipelineCrop() const\n> +{\n> +\treturn p_->pipe_->getPipelineCrop();\n> +}\n> +\n>  /**\n>   * \\brief Create a request object for the camera\n>   * \\param[in] cookie Opaque cookie for application use\n> diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> index 988b501..5a099d5 100644\n> --- a/src/libcamera/control_ids.yaml\n> +++ b/src/libcamera/control_ids.yaml\n> @@ -262,4 +262,14 @@ controls:\n>          In this respect, it is not necessarily aimed at providing a way to\n>          implement a focus algorithm by the application, rather an indication of\n>          how in-focus a frame is.\n> +\n> +  - DigitalZoom:\n> +      type: Rectangle\n> +      description: |\n> +        Sets the portion of the full sensor image, in pixels, that will be\n> +        used for digital zoom. That is, this part of the sensor output will\n> +        be scaled up to make the full size output image (and everything else\n> +        discarded). To obtain the \"full sensor image\" that is available, the\n> +        method Camera::getPipelineCrop() should be called once the camera is\n> +        configured. 
An application may pan and zoom within this rectangle.\n>  ...\n> --\n> 2.20.1\n>\n> _______________________________________________\n> libcamera-devel mailing list\n> libcamera-devel@lists.libcamera.org\n> https://lists.libcamera.org/listinfo/libcamera-devel","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 33B6ABD86F\n\tfor <parsemail@patchwork.libcamera.org>;\n\tThu, 30 Jul 2020 16:34:27 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 9C74F61988;\n\tThu, 30 Jul 2020 18:34:26 +0200 (CEST)","from relay6-d.mail.gandi.net (relay6-d.mail.gandi.net\n\t[217.70.183.198])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 54A5B611A2\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu, 30 Jul 2020 18:34:25 +0200 (CEST)","from uno.localdomain (2-224-242-101.ip172.fastwebnet.it\n\t[2.224.242.101]) (Authenticated sender: jacopo@jmondi.org)\n\tby relay6-d.mail.gandi.net (Postfix) with ESMTPSA id B0F55C0003;\n\tThu, 30 Jul 2020 16:34:24 +0000 (UTC)"],"X-Originating-IP":"2.224.242.101","Date":"Thu, 30 Jul 2020 18:38:04 +0200","From":"Jacopo Mondi <jacopo@jmondi.org>","To":"David Plowman <david.plowman@raspberrypi.com>","Message-ID":"<20200730163804.gyhkgcrcwwbfn2tj@uno.localdomain>","References":"<20200723084338.13711-1-david.plowman@raspberrypi.com>\n\t<20200723084338.13711-2-david.plowman@raspberrypi.com>","MIME-Version":"1.0","Content-Disposition":"inline","In-Reply-To":"<20200723084338.13711-2-david.plowman@raspberrypi.com>","Subject":"Re: [libcamera-devel] [PATCH v3 1/2] libcamera: Infrastructure for\n\tdigital 
zoom","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera-devel@lists.libcamera.org","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":11743,"web_url":"https://patchwork.libcamera.org/comment/11743/","msgid":"<CAHW6GYKvV436aqjNSHt62Eqc62EU4p4R-bLemX53f0f6vYNBTA@mail.gmail.com>","date":"2020-07-31T09:02:44","subject":"Re: [libcamera-devel] [PATCH v3 1/2] libcamera: Infrastructure for\n\tdigital zoom","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Jacopo, everyone\n\nOn Thu, 30 Jul 2020 at 17:34, Jacopo Mondi <jacopo@jmondi.org> wrote:\n>\n> Hi David,\n>   sorry for jumping late on the topic, I managed to read the\n> discussions in v1 and v2 only today.\n\nPlease don't apologise, I'm very glad that you've picked this up. 
This\nis actually one of the \"big ticket\" items that libcamera is currently\nlacking compared to our existing stack, so I was going to give the\ndiscussion a \"nudge\" quite soon!\n\n>\n> On Thu, Jul 23, 2020 at 09:43:37AM +0100, David Plowman wrote:\n> > These changes add a Digital Zoom control, taking a rectangle as its\n> > argument, indicating the region of the sensor output that the\n> > pipeline will \"zoom up\" to the final output size.\n> >\n> > Additionally, we need to have a method returning the \"pipelineCrop\"\n> > which gives the dimensions of the sensor output, taken by the\n> > pipeline, and within which we can subsequently pan and zoom.\n>\n> This is the part I don't get. What's the intended user of this\n> information ? Is this meant to be provided to applications so that\n> they know which is the area inside which they can later pan/zoom, right ?\n\nCorrect. I'm imagining a user might say \"I want to see the image at 2x\nzoom\". If we know the unzoomed image is being created by (as per one\nof my examples somewhere) 2028x1140 pixels from the sensor, then the\napplication would send a \"digital zoom\" of offset: (507,285) size:\n1014x570.\n\nI think the feeling was that we liked having control right down to the\npixel level, which in turn means that the numbers 2028x1140 need to be\nsupplied to the application from somewhere.\n\n>\n> I'm kind of against ad-hoc functions to retrieve information about\n> the image processing pipeline configuration. 
I can easily see them\n> multiply and make our API awful, and each platform would need to add\n> its own very-specific bits, and that can't certainly go through the\n> main Camera class API.\n>\n> If I got your use case right:\n>\n> 1) You need to report the pipeline crop area, which is a rectangle\n>    defined on the frame that is produced by the sensor.\n>    Using your example:\n>\n>         sensorSize = 1920x1440\n>         pipelineCrop = 1920x1080\n>\n>    It seems to me the 'pipelineCrop' is a good candidate to be a\n>    CameraConfiguration property, which at the moment is not yet\n>    implemented. Roughly speaking, we have been discussing it in the\n>    context of frame durations, and it's about augmenting the\n>    CameraConfiguration class with a set of properties that depend on\n>    the currently applied configuration of the enabled streams.\n>\n>    Ideally we should be able to receive CameraConfiguration, validate\n>    it, and populate it with properties, like the minimum frame\n>    duration (which is a combination of durations of all streams), and\n>    now with this new 'pipelineCrop' which is the result of a decision\n>    the pipeline handler takes based on the requested output stream\n>    sizes.\n>\n>    I think the ideal plan forward for this would be to define a new\n>    control/property (I would say property as it's immutable from\n>    applications) that is set by pipeline handlers during validate()\n>    when the sensor mode is selected and eventually cropped.\n\nAbsolutely. The problem with this has never been how to calculate the\n\"pipelineCrop\", or who calculates it (that's the pipeline handler),\nbut how to signal this value to the application. When can I have this\nfeature, please? :)\n\n>\n>    Application can access it from the CameraConfiguration and decide\n>    where to set their pan/zoom rectangle.\n>\n>    How the pipeline handler applies the cropping to the sensor\n>    produced frame is less relevant imo. 
Systems which have direct access\n>    to the v4l2 subdev will probably use the crop/compose rectangle\n>    applied on the sensor. In your case I see you use the crop target\n>    on the ISP video device node. Am I wrong or you could calculate\n>    that at validate() time, once you know all the desired output sizes and\n>    based on them identify the 'best' sensor mode to use ?\n\nWe update the selection on the ISP device node. I would expect this\napproach to be more common than doing it on the sensor itself (you\ndon't lose statistics for the rest of the frame), but it's of course\nup to the pipeline handlers.\n\n>\n> 2) Application, once they know the 'pipelineCrop' (I'm ok with that\n>    term btw, as we have analogCrop and we will have a digitalCrop for\n>    the sensor's processing stages, so pipelineCrop feels right) can\n>    set their pan/zoom rectangle with a control, as the one you have\n>    defined below. Now, what do you think the reference of that rectangle\n>    should be ? I mean, should the 'pipelineCrop' always be in position\n>    (0,0) and the 'digitalZoom' be applied on top of it ? I would say\n>    so, but then we lose the information on which part of the image\n>    the pipeline decided to crop, and I'm not sure it is relevant or\n>    not.\n\nSo I'm in favour of giving applications only the size of the pipeline\ncrop, nothing else. It's all you need to implement digital zoom, and\npanning within the original image, and it makes life very\nstraightforward for everyone.\n\nI agree there are more complex scenarios you could imagine. Maybe you\nlet applications pan/zoom within the whole of the sensor area, so\nyou'd be able to pan to parts of the image that you can't see in the\n\"default unzoomed\" version. I mean, there's nothing \"wrong\" in this,\nit just feels more complicated and a bit unexpected to me. 
I also\nthink this would delegate most of the aspect ratio calculations to the\napplications, and I suspect they'd get themselves confused and\nactually implement the simpler scheme anyway... But yes, there is a\ndecision to be made here.\n\n>\n>    The other possibility is to report the 'pipelineCrop' in respect to\n>    the full active pixel array size, and ask application to provide a\n>    'digitalZoom' rectangle with the same reference. Is it worth the\n>    additional complication in you opinion?\n\nSorry, answered that one above!\n\n>\n>    Third option is to report the sensor frame size AND the pipeline crop\n>    defined in respect to it, and let application provide a digitalZoom\n>    rectangle defined in respect to the pipeline crop. This would look\n>    like (numbers are random):\n>         sensorFrameSize = {0, 0, 1920, 1440}\n>         pipelineCrop = { 0, 150, 1920, 1080}\n>         digitalZoom = {300, 300, 1280, 720}\n>\n>    This would allow application to know that the sensor frame is\n>    1920x1400, the pipeline selected a region on 1920x1080 by applying\n>    an offset of 150 pixels in the vertical direction and the desired\n>    digital zoom is then applied on this last rectangle (resulting in a\n>    effective (300,450) dislocation from the full sensor frame).\n>\n> I won't bikeshed on names too much, just leave as a reference that\n> Android reports the same property we are defining as pipelineCrop as\n> 'scalera.cropRegion' as for them 'scaler' is used to report properties\n> of the ISP processing pipeline.\n\nI could go with \"scalerCrop\" too!\n\nBest regards\nDavid\n\n>\n> I'm sorry for the wall of text, I wish we had a blackboard :)\n>\n> Thanks\n>   j\n>\n> >\n> > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > ---\n> >  include/libcamera/camera.h                    |  2 ++\n> >  include/libcamera/internal/pipeline_handler.h |  4 +++\n> >  src/libcamera/camera.cpp                      | 27 +++++++++++++++++++\n> >  
src/libcamera/control_ids.yaml                | 10 +++++++\n> >  4 files changed, 43 insertions(+)\n> >\n> > diff --git a/include/libcamera/camera.h b/include/libcamera/camera.h\n> > index 4d1a4a9..6819b8e 100644\n> > --- a/include/libcamera/camera.h\n> > +++ b/include/libcamera/camera.h\n> > @@ -92,6 +92,8 @@ public:\n> >       std::unique_ptr<CameraConfiguration> generateConfiguration(const StreamRoles &roles = {});\n> >       int configure(CameraConfiguration *config);\n> >\n> > +     Size const &getPipelineCrop() const;\n> > +\n> >       Request *createRequest(uint64_t cookie = 0);\n> >       int queueRequest(Request *request);\n> >\n> > diff --git a/include/libcamera/internal/pipeline_handler.h b/include/libcamera/internal/pipeline_handler.h\n> > index 22e629a..5bfe890 100644\n> > --- a/include/libcamera/internal/pipeline_handler.h\n> > +++ b/include/libcamera/internal/pipeline_handler.h\n> > @@ -89,6 +89,8 @@ public:\n> >\n> >       const char *name() const { return name_; }\n> >\n> > +     Size const &getPipelineCrop() const { return pipelineCrop_; }\n> > +\n> >  protected:\n> >       void registerCamera(std::shared_ptr<Camera> camera,\n> >                           std::unique_ptr<CameraData> data);\n> > @@ -100,6 +102,8 @@ protected:\n> >\n> >       CameraManager *manager_;\n> >\n> > +     Size pipelineCrop_;\n> > +\n> >  private:\n> >       void mediaDeviceDisconnected(MediaDevice *media);\n> >       virtual void disconnect();\n> > diff --git a/src/libcamera/camera.cpp b/src/libcamera/camera.cpp\n> > index 69a1b44..f8b8ec6 100644\n> > --- a/src/libcamera/camera.cpp\n> > +++ b/src/libcamera/camera.cpp\n> > @@ -793,6 +793,33 @@ int Camera::configure(CameraConfiguration *config)\n> >       return 0;\n> >  }\n> >\n> > +/**\n> > + * \\brief Return the size of the sensor image being used by the pipeline\n> > + * to create the output.\n> > + *\n> > + * This method returns the size, in pixels, of the raw image read from the\n> > + * sensor and which is used by 
the pipeline to form the output image(s)\n> > + * (rescaling if necessary). Note that these values take account of any\n> > + * cropping performed on the sensor output so as to produce the correct\n> > + * aspect ratio. It would normally be necessary to retrieve these values\n> > + * in order to calculate correct parameters for digital zoom.\n> > + *\n> > + * Example: a sensor mode may produce a 1920x1440 output image. But if an\n> > + * application has requested a 16:9 image, the values returned here might\n> > + * be 1920x1080 - the largest portion of the sensor output that provides\n> > + * the correct aspect ratio.\n> > + *\n> > + * \\context This function is \\threadsafe. It will only return valid\n> > + * (non-zero) values when the camera has been configured.\n> > + *\n> > + * \\return The dimensions of the sensor image used by the pipeline.\n> > + */\n> > +\n> > +Size const &Camera::getPipelineCrop() const\n> > +{\n> > +     return p_->pipe_->getPipelineCrop();\n> > +}\n> > +\n> >  /**\n> >   * \\brief Create a request object for the camera\n> >   * \\param[in] cookie Opaque cookie for application use\n> > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > index 988b501..5a099d5 100644\n> > --- a/src/libcamera/control_ids.yaml\n> > +++ b/src/libcamera/control_ids.yaml\n> > @@ -262,4 +262,14 @@ controls:\n> >          In this respect, it is not necessarily aimed at providing a way to\n> >          implement a focus algorithm by the application, rather an indication of\n> >          how in-focus a frame is.\n> > +\n> > +  - DigitalZoom:\n> > +      type: Rectangle\n> > +      description: |\n> > +        Sets the portion of the full sensor image, in pixels, that will be\n> > +        used for digital zoom. That is, this part of the sensor output will\n> > +        be scaled up to make the full size output image (and everything else\n> > +        discarded). 
To obtain the \"full sensor image\" that is available, the\n> > +        method Camera::getOutputCrop() should be called once the camera is\n> > +        configured. An application may pan and zoom within this rectangle.\n> >  ...\n> > --\n> > 2.20.1\n> >\n> > _______________________________________________\n> > libcamera-devel mailing list\n> > libcamera-devel@lists.libcamera.org\n> > https://lists.libcamera.org/listinfo/libcamera-devel","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 7D762BD86F\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 31 Jul 2020 09:03:00 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id F352E61DCF;\n\tFri, 31 Jul 2020 11:02:59 +0200 (CEST)","from mail-oi1-x242.google.com (mail-oi1-x242.google.com\n\t[IPv6:2607:f8b0:4864:20::242])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 484B260395\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 31 Jul 2020 11:02:58 +0200 (CEST)","by mail-oi1-x242.google.com with SMTP id q4so14407843oia.1\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 31 Jul 2020 02:02:58 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"YwtLnWUn\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; 
bh=Anl1umTtNDMttWCxsm+2uqjLQyT8PL8rMiWqDfMUcyg=;\n\tb=YwtLnWUns3zvFsj1H3nQsT8Mllm17bnTeeHo9OTSmTyxJBYJ/kONCdt0oSp7v/RwKO\n\tsPD8hBFWjrbSLbxf3bY7hw6bxJKjXTxDW1q/4GBUM+qDrw2C+xVPNnuIMEXJGaSvCLzD\n\tF1Q+FExTXWF+xQnLWZBCK6IBchP7A+r2LHFK35kUESg/PNCnSRE5r00v2RwZboasYO9F\n\tbu0YYXCVZLHPkHkscohUsvaOH5qvg+CuFVS059hZpNXZ0KCBUgDYlVQPcOArgJwVutaH\n\tChRnDSM4uz/pyVU8T8xf3g00lQIISQR0SOWa517+j0RUx8JnyAankuegfury6XSU0koU\n\tjMGQ==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20161025;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=Anl1umTtNDMttWCxsm+2uqjLQyT8PL8rMiWqDfMUcyg=;\n\tb=hA60WcibIowK4UIU5tK4oohAXwnoQ5FHAqKQaVjPqGf0ZJ5cJIo8y0IXoUzpF/OFjh\n\tj11wMbjiB/v1U9Bu9jqCBwb7+fJMjpLB+pLkbPabiRlj8QeP4xW/7MsGRkjJzVkVbC/y\n\tM7bmL2UeRpINjERXPZe0QV8JnimsHptrG65QFWmeB+KdKbH3IvWovuNfSQooKuOifl6+\n\tp5l08hhioJ+SnrTxWbcQ1nfH88kazD8Iah4a4ZQ9b14TgMi4plZryoN5FenGh08AeRjG\n\t79Yp2otdCn0OOo8uxUtXRW4gm43tES49SvL2unC8J5ep08tHTnn0N5/UuPgfWNFxxiHw\n\tQzdQ==","X-Gm-Message-State":"AOAM533LKS/9WuckpTILFmVHw/13ogFz9tVO0RbhqqM47+cCRT6IX3GN\n\tzJjkCk8ARvOlvhHqv5EFheVzrY4mXx+1f6rkrUbMyQ==","X-Google-Smtp-Source":"ABdhPJw2el/JhxdmtaXJ+S+Z7PX46cm2+Q+eejYHtl/NhYesbH3kKsbIAgbuoaqstqO6/NPUaPJxL/9Bv6EKESMgqZk=","X-Received":"by 2002:aca:84e:: with SMTP id 75mr2267330oii.107.1596186176724; \n\tFri, 31 Jul 2020 02:02:56 -0700 (PDT)","MIME-Version":"1.0","References":"<20200723084338.13711-1-david.plowman@raspberrypi.com>\n\t<20200723084338.13711-2-david.plowman@raspberrypi.com>\n\t<20200730163804.gyhkgcrcwwbfn2tj@uno.localdomain>","In-Reply-To":"<20200730163804.gyhkgcrcwwbfn2tj@uno.localdomain>","From":"David Plowman <david.plowman@raspberrypi.com>","Date":"Fri, 31 Jul 2020 10:02:44 +0100","Message-ID":"<CAHW6GYKvV436aqjNSHt62Eqc62EU4p4R-bLemX53f0f6vYNBTA@mail.gmail.com>","To":"Jacopo Mondi <jacopo@jmondi.org>","Subject":"Re: [libcamera-devel] [PATCH v3 1/2] libcamera: Infrastructure for\n\tdigital 
zoom","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera-devel@lists.libcamera.org","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":11764,"web_url":"https://patchwork.libcamera.org/comment/11764/","msgid":"<20200801093001.nokdnc5nz5zfde32@uno.localdomain>","date":"2020-08-01T09:30:01","subject":"Re: [libcamera-devel] [PATCH v3 1/2] libcamera: Infrastructure for\n\tdigital zoom","submitter":{"id":3,"url":"https://patchwork.libcamera.org/api/people/3/","name":"Jacopo Mondi","email":"jacopo@jmondi.org"},"content":"Hi David,\n\nOn Fri, Jul 31, 2020 at 10:02:44AM +0100, David Plowman wrote:\n> Hi Jacopo, everyone\n>\n> On Thu, 30 Jul 2020 at 17:34, Jacopo Mondi <jacopo@jmondi.org> wrote:\n> >\n> > Hi David,\n> >   sorry for jumping late on the topic, I managed to read the\n> > discussions in v1 and v2 only today.\n>\n> Please don't apologise, I'm very glad that you've picked this up. 
This\n> is actually one of the \"big ticket\" items that libcamera is currently\n> lacking compared to our existing stack, so I was going to give the\n> discussion a \"nudge\" quite soon!\n>\n> >\n> > On Thu, Jul 23, 2020 at 09:43:37AM +0100, David Plowman wrote:\n> > > These changes add a Digital Zoom control, taking a rectangle as its\n> > > argument, indicating the region of the sensor output that the\n> > > pipeline will \"zoom up\" to the final output size.\n> > >\n> > > Additionally, we need to have a method returning the \"pipelineCrop\"\n> > > which gives the dimensions of the sensor output, taken by the\n> > > pipeline, and within which we can subsequently pan and zoom.\n> >\n> > This is the part I don't get. What's the intended user of this\n> > information ? Is this meant to be provided to applications so that\n> > they know which is the area inside which they can later pan/zoom, right ?\n>\n> Correct. I'm imagining a user might say \"I want to see the image at 2x\n> zoom\". If we know the unzoomed image is being created by (as per one\n> of my examples somewhere) 2028x1140 pixels from the sensor, then the\n> application would send a \"digital zoom\" of offset: (507,285) size:\n> 1014x570.\n>\n> I think the feeling was that we liked having control right down to the\n> pixel level, which in turn means that the numbers 2028x1140 need to be\n> supplied to the application from somewhere.\n>\n\nJust for the sake of discussion: would a 'digitalZoomFactor' (or whatever\nother name) control that takes, for example, a 2x value, alongside a more\nprecise 'digitalZoomRectangle' that behaves as you have described, make\nsense ?\n\n> >\n> > I'm kind of against ad-hoc functions to retrieve information about\n> > the image processing pipeline configuration. 
I can easily see them\n> > multiply and make our API awful, and each platform would need to add\n> > its own very-specific bits, and that can't certainly go through the\n> > main Camera class API.\n> >\n> > If I got your use case right:\n> >\n> > 1) You need to report the pipeline crop area, which is a rectangle\n> >    defined on the frame that is produced by the sensor.\n> >    Using your example:\n> >\n> >         sensorSize = 1920x1440\n> >         pipelineCrop = 1920x1080\n> >\n> >    It seems to me the 'pipelineCrop' is a good candidate to be a\n> >    CameraConfiguration property, which at the moment is not yet\n> >    implemented. Roughly speaking, we have been discussing it in the\n> >    context of frame durations, and it's about augmenting the\n> >    CameraConfiguration class with a set of properties that depend on\n> >    the currently applied configuration of the enabled streams.\n> >\n> >    Ideally we should be able to receive CameraConfiguration, validate\n> >    it, and populate it with properties, like the minimum frame\n> >    duration (which is a combination of durations of all streams), and\n> >    now with this new 'pipelineCrop' which is the result of a decision\n> >    the pipeline handler takes based on the requested output stream\n> >    sizes.\n> >\n> >    I think the ideal plan forward for this would be to define a new\n> >    control/property (I would say property as it's immutable from\n> >    applications) that is set by pipeline handlers during validate()\n> >    when the sensor mode is selected and eventually cropped.\n>\n> Absolutely. The problem with this has never been how to calculate the\n> \"pipelineCrop\", or who calculates it (that's the pipeline handler),\n> but how to signal this value to the application. When can I have this\n> feature, please? 
:)\n>\n\nI think we need to resume discussions on frame durations calculation,\nor decide that the proposed direction for 'pipelineCrop' is the\nright one and:\n1) Define the property\n2) Add a ControlList of properties to CameraConfiguration\n3) Instrument the pipeline handler to report it in validate()\n\nIt seems pretty straightforward to me, but I might be missing pieces.\nLaurent, do you have opinions here ?\n\n> >\n> >    Application can access it from the CameraConfiguration and decide\n> >    where to set their pan/zoom rectangle.\n> >\n> >    How the pipeline handler applies the cropping to the sensor\n> >    produced frame is less relevant imo. Systems which have direct access\n> >    to the v4l2 subdev will probably use the crop/compose rectangle\n> >    applied on the sensor. In your case I see you use the crop target\n> >    on the ISP video device node. Am I wrong or you could calculate\n> >    that at validate() time, once you know all the desired output sizes and\n> >    based on them identify the 'best' sensor mode to use ?\n>\n> We update the selection on the ISP device node. I would expect this\n> approach to be more common than doing it on the sensor itself (you\n> don't lose statistics for the rest of the frame), but it's of course\n> up to the pipeline handlers.\n>\n\nYou are right, this will almost always come from the ISP (sub)device node,\nnot from the sensor.\n\n> >\n> > 2) Application, once they know the 'pipelineCrop' (I'm ok with that\n> >    term btw, as we have analogCrop and we will have a digitalCrop for\n> >    the sensor's processing stages, so pipelineCrop feels right) can\n> >    set their pan/zoom rectangle with a control, as the one you have\n> >    defined below. Now, what do you think the reference of that rectangle\n> >    should be ? I mean, should the 'pipelineCrop' always be in position\n> >    (0,0) and the 'digitalZoom' be applied on top of it ? 
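If the 'digitalZoom' rectangle is taken relative to a (0,0)-based 'pipelineCrop', recovering its position on the full sensor frame is a single translation by the crop's own offset. A sketch with a placeholder type (not the actual libcamera Rectangle class):

```cpp
// Placeholder standing in for libcamera::Rectangle: an offset plus a size.
struct Rect { int x, y; unsigned int width, height; };

// Compose a zoom rectangle expressed relative to the pipeline crop back
// into full sensor-frame coordinates by adding the crop's offset.
static Rect toSensorFrame(const Rect &pipelineCrop, const Rect &zoom)
{
	return { pipelineCrop.x + zoom.x, pipelineCrop.y + zoom.y,
		 zoom.width, zoom.height };
}
```

For instance, a pipeline crop at (0,150) combined with a zoom rectangle at (300,300) lands at an effective (300,450) on the sensor frame, so either reference can be converted to the other without losing information.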
I would say\n> >    so, but then we lose the information on which part of the image\n> >    the pipeline decided to crop, and I'm not sure whether it is relevant\n> >    or not.\n>\n> So I'm in favour of giving applications only the size of the pipeline\n> crop, nothing else. It's all you need to implement digital zoom, and\n> panning within the original image, and it makes life very\n> straightforward for everyone.\n>\n> I agree there are more complex scenarios you could imagine. Maybe you\n> let applications pan/zoom within the whole of the sensor area, so\n> you'd be able to pan to parts of the image that you can't see in the\n> \"default unzoomed\" version. I mean, there's nothing \"wrong\" in this,\n> it just feels more complicated and a bit unexpected to me. I also\n> think this would delegate most of the aspect ratio calculations to the\n> applications, and I suspect they'd get themselves confused and\n> actually implement the simpler scheme anyway... But yes, there is a\n> decision to be made here.\n>\n> >\n> >    The other possibility is to report the 'pipelineCrop' in respect to\n> >    the full active pixel array size, and ask application to provide a\n> >    'digitalZoom' rectangle with the same reference. Is it worth the\n> >    additional complication in your opinion?\n>\n> Sorry, answered that one above!\n>\n> >\n> >    Third option is to report the sensor frame size AND the pipeline crop\n> >    defined in respect to it, and let application provide a digitalZoom\n> >    rectangle defined in respect to the pipeline crop. 
This would look\n> >    like (numbers are random):\n> >         sensorFrameSize = {0, 0, 1920, 1440}\n> >         pipelineCrop = { 0, 150, 1920, 1080}\n> >         digitalZoom = {300, 300, 1280, 720}\n> >\n> >    This would allow application to know that the sensor frame is\n> >    1920x1440, the pipeline selected a region of 1920x1080 by applying\n> >    an offset of 150 pixels in the vertical direction and the desired\n> >    digital zoom is then applied on this last rectangle (resulting in an\n> >    effective (300,450) offset from the full sensor frame).\n> >\n> > I won't bikeshed on names too much, just leave as a reference that\n> > Android reports the same property we are defining as pipelineCrop as\n> > 'scaler.cropRegion' as for them 'scaler' is used to report properties\n> > of the ISP processing pipeline.\n>\n> I could go with \"scalerCrop\" too!\n\nJust for reference, I paste here the Android control documentation,\nmostly to wrap our heads around the fact they decided to implement it\nusing the full pixel array as reference, but the documentation later\nthrows the sensor scaling in the mix, so I don't fully get it.\n\nReading it through might help spot details I might have missed.\n\n------------------------------------------------------------------------------\nandroid.scaler.cropRegion\n\nThe desired region of the sensor to read out for this capture.\n\nPixel coordinates relative to android.sensor.info.activeArraySize or\nandroid.sensor.info.preCorrectionActiveArraySize depending on\ndistortion correction capability and mode\n\nThis control can be used to implement digital zoom.\n\nFor devices not supporting android.distortionCorrection.mode control,\nthe coordinate system always follows that of\nandroid.sensor.info.activeArraySize, with (0, 0) being the top-left\npixel of the active array.\n\nFor devices supporting android.distortionCorrection.mode control, the\ncoordinate system depends on the mode being set. When the distortion\ncorrection 
mode is OFF, the coordinate system follows\nandroid.sensor.info.preCorrectionActiveArraySize, with (0, 0) being\nthe top-left pixel of the pre-correction active array. When the\ndistortion correction mode is not OFF, the coordinate system follows\nandroid.sensor.info.activeArraySize, with (0, 0) being the top-left\npixel of the active array.\n\nOutput streams use this rectangle to produce their output, cropping to\na smaller region if necessary to maintain the stream's aspect ratio,\nthen scaling the sensor input to match the output's configured\nresolution.\n\nThe crop region is applied after the RAW to other color space (e.g.\nYUV) conversion. Since raw streams (e.g. RAW16) don't have the\nconversion stage, they are not croppable. The crop region will be\nignored by raw streams.\n\nFor non-raw streams, any additional per-stream cropping will be done\nto maximize the final pixel area of the stream.\n\nFor example, if the crop region is set to a 4:3 aspect ratio, then 4:3\nstreams will use the exact crop region. 
16:9 streams will further crop\nvertically (letterbox).\n\nConversely, if the crop region is set to a 16:9, then 4:3 outputs will\ncrop horizontally (pillarbox), and 16:9 streams will match exactly.\nThese additional crops will be centered within the crop region.\n\nIf the coordinate system is android.sensor.info.activeArraySize, the\nwidth and height of the crop region cannot be set to be smaller than\nfloor( activeArraySize.width / android.scaler.availableMaxDigitalZoom\n) and floor( activeArraySize.height /\nandroid.scaler.availableMaxDigitalZoom ), respectively.\n\nIf the coordinate system is\nandroid.sensor.info.preCorrectionActiveArraySize, the width and height\nof the crop region cannot be set to be smaller than floor(\npreCorrectionActiveArraySize.width /\nandroid.scaler.availableMaxDigitalZoom ) and floor(\npreCorrectionActiveArraySize.height /\nandroid.scaler.availableMaxDigitalZoom ), respectively.\n\nThe camera device may adjust the crop region to account for rounding\nand other hardware requirements; the final crop region used will be\nincluded in the output capture result.\n\nHAL Implementation Details: The output streams must maintain square\npixels at all times, no matter what the relative aspect ratios of the\ncrop region and the stream are. Negative values for corner are allowed\nfor raw output if full pixel array is larger than active pixel array.\nWidth and height may be rounded to nearest larger supportable width,\nespecially for raw output, where only a few fixed scales may be\npossible.\n\nFor a set of output streams configured, if the sensor output is\ncropped to a smaller size than pre-correction active array size, the\nHAL need follow below cropping rules:\n\nThe HAL need handle the cropRegion as if the sensor crop size is the\neffective pre-correction active array size. 
More specifically, the HAL\nmust transform the requested cropRegion from\nandroid.sensor.info.preCorrectionActiveArraySize to the sensor cropped\npixel area size in this way:\n\nTranslate the requested cropRegion w.r.t., the left top corner of the\nsensor cropped pixel area by (tx, ty), where ty = sensorCrop.top *\n(sensorCrop.height / preCorrectionActiveArraySize.height) and tx =\nsensorCrop.left * (sensorCrop.width /\npreCorrectionActiveArraySize.width). The (sensorCrop.top,\nsensorCrop.left) is the coordinate based off the\nandroid.sensor.info.activeArraySize.  Scale the width and height of\nrequested cropRegion with scaling factor of\nsensorCrop.width/preCorrectionActiveArraySize.width and\nsensorCrop.height/preCorrectionActiveArraySize.height\nrespectively. Once this new cropRegion is calculated, the HAL must use\nthis region to crop the image with regard to the sensor crop size\n(effective pre-correction active array size). The HAL still need\nfollow the general cropping rule for this new cropRegion and effective\npre-correction active array size.  The HAL must report the cropRegion\nwith regard to android.sensor.info.preCorrectionActiveArraySize. The\nHAL need convert the new cropRegion generated above w.r.t., full\npre-correction active array size. 
The reported cropRegion may be\nslightly different with the requested cropRegion since the HAL may\nadjust the crop region to account for rounding, conversion error, or\nother hardware limitations.\n------------------------------------------------------------------------------\n\nThanks\n  j\n\n\n>\n> Best regards\n> David\n>\n> >\n> > I'm sorry for the wall of text, I wish we had a blackboard :)\n> >\n> > Thanks\n> >   j\n> >\n> > >\n> > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > ---\n> > >  include/libcamera/camera.h                    |  2 ++\n> > >  include/libcamera/internal/pipeline_handler.h |  4 +++\n> > >  src/libcamera/camera.cpp                      | 27 +++++++++++++++++++\n> > >  src/libcamera/control_ids.yaml                | 10 +++++++\n> > >  4 files changed, 43 insertions(+)\n> > >\n> > > diff --git a/include/libcamera/camera.h b/include/libcamera/camera.h\n> > > index 4d1a4a9..6819b8e 100644\n> > > --- a/include/libcamera/camera.h\n> > > +++ b/include/libcamera/camera.h\n> > > @@ -92,6 +92,8 @@ public:\n> > >       std::unique_ptr<CameraConfiguration> generateConfiguration(const StreamRoles &roles = {});\n> > >       int configure(CameraConfiguration *config);\n> > >\n> > > +     Size const &getPipelineCrop() const;\n> > > +\n> > >       Request *createRequest(uint64_t cookie = 0);\n> > >       int queueRequest(Request *request);\n> > >\n> > > diff --git a/include/libcamera/internal/pipeline_handler.h b/include/libcamera/internal/pipeline_handler.h\n> > > index 22e629a..5bfe890 100644\n> > > --- a/include/libcamera/internal/pipeline_handler.h\n> > > +++ b/include/libcamera/internal/pipeline_handler.h\n> > > @@ -89,6 +89,8 @@ public:\n> > >\n> > >       const char *name() const { return name_; }\n> > >\n> > > +     Size const &getPipelineCrop() const { return pipelineCrop_; }\n> > > +\n> > >  protected:\n> > >       void registerCamera(std::shared_ptr<Camera> camera,\n> > >                           
std::unique_ptr<CameraData> data);\n> > > @@ -100,6 +102,8 @@ protected:\n> > >\n> > >       CameraManager *manager_;\n> > >\n> > > +     Size pipelineCrop_;\n> > > +\n> > >  private:\n> > >       void mediaDeviceDisconnected(MediaDevice *media);\n> > >       virtual void disconnect();\n> > > diff --git a/src/libcamera/camera.cpp b/src/libcamera/camera.cpp\n> > > index 69a1b44..f8b8ec6 100644\n> > > --- a/src/libcamera/camera.cpp\n> > > +++ b/src/libcamera/camera.cpp\n> > > @@ -793,6 +793,33 @@ int Camera::configure(CameraConfiguration *config)\n> > >       return 0;\n> > >  }\n> > >\n> > > +/**\n> > > + * \\brief Return the size of the sensor image being used by the pipeline\n> > > + * to create the output.\n> > > + *\n> > > + * This method returns the size, in pixels, of the raw image read from the\n> > > + * sensor and which is used by the pipeline to form the output image(s)\n> > > + * (rescaling if necessary). Note that these values take account of any\n> > > + * cropping performed on the sensor output so as to produce the correct\n> > > + * aspect ratio. It would normally be necessary to retrieve these values\n> > > + * in order to calculate correct parameters for digital zoom.\n> > > + *\n> > > + * Example: a sensor mode may produce a 1920x1440 output image. But if an\n> > > + * application has requested a 16:9 image, the values returned here might\n> > > + * be 1920x1080 - the largest portion of the sensor output that provides\n> > > + * the correct aspect ratio.\n> > > + *\n> > > + * \\context This function is \\threadsafe. 
It will only return valid\n> > > + * (non-zero) values when the camera has been configured.\n> > > + *\n> > > + * \\return The dimensions of the sensor image used by the pipeline.\n> > > + */\n> > > +\n> > > +Size const &Camera::getPipelineCrop() const\n> > > +{\n> > > +     return p_->pipe_->getPipelineCrop();\n> > > +}\n> > > +\n> > >  /**\n> > >   * \\brief Create a request object for the camera\n> > >   * \\param[in] cookie Opaque cookie for application use\n> > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > index 988b501..5a099d5 100644\n> > > --- a/src/libcamera/control_ids.yaml\n> > > +++ b/src/libcamera/control_ids.yaml\n> > > @@ -262,4 +262,14 @@ controls:\n> > >          In this respect, it is not necessarily aimed at providing a way to\n> > >          implement a focus algorithm by the application, rather an indication of\n> > >          how in-focus a frame is.\n> > > +\n> > > +  - DigitalZoom:\n> > > +      type: Rectangle\n> > > +      description: |\n> > > +        Sets the portion of the full sensor image, in pixels, that will be\n> > > +        used for digital zoom. That is, this part of the sensor output will\n> > > +        be scaled up to make the full size output image (and everything else\n> > > +        discarded). To obtain the \"full sensor image\" that is available, the\n> > > +        method Camera::getPipelineCrop() should be called once the camera is\n> > > +        configured. 
An application may pan and zoom within this rectangle.\n> > >  ...\n> > > --\n> > > 2.20.1\n> > >\n> > > _______________________________________________\n> > > libcamera-devel mailing list\n> > > libcamera-devel@lists.libcamera.org\n> > > https://lists.libcamera.org/listinfo/libcamera-devel","headers":{"Date":"Sat, 1 Aug 2020 11:30:01 +0200","From":"Jacopo Mondi <jacopo@jmondi.org>","To":"David Plowman <david.plowman@raspberrypi.com>,\n\tLaurent Pinchart <laurent.pinchart@ideasonboard.com>","Message-ID":"<20200801093001.nokdnc5nz5zfde32@uno.localdomain>","References":"<20200723084338.13711-1-david.plowman@raspberrypi.com>\n\t<20200723084338.13711-2-david.plowman@raspberrypi.com>\n\t<20200730163804.gyhkgcrcwwbfn2tj@uno.localdomain>\n\t<CAHW6GYKvV436aqjNSHt62Eqc62EU4p4R-bLemX53f0f6vYNBTA@mail.gmail.com>","In-Reply-To":"<CAHW6GYKvV436aqjNSHt62Eqc62EU4p4R-bLemX53f0f6vYNBTA@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH v3 1/2] libcamera: Infrastructure for\n\tdigital zoom","Cc":"libcamera-devel@lists.libcamera.org"}},{"id":11789,"web_url":"https://patchwork.libcamera.org/comment/11789/","msgid":"<CAHW6GY+0HB62HLDGVwwzLkXD8cT_N1dpaDbffTC-yd-EygACzQ@mail.gmail.com>","date":"2020-08-03T11:05:25","subject":"Re: [libcamera-devel] [PATCH v3 1/2] libcamera: Infrastructure for\n\tdigital zoom","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Jacopo\n\nThanks for all the information, especially 
good to know what Android wants...\n\nOn Sat, 1 Aug 2020 at 10:26, Jacopo Mondi <jacopo@jmondi.org> wrote:\n>\n> Hi David,\n>\n> On Fri, Jul 31, 2020 at 10:02:44AM +0100, David Plowman wrote:\n> > Hi Jacopo, everyone\n> >\n> > On Thu, 30 Jul 2020 at 17:34, Jacopo Mondi <jacopo@jmondi.org> wrote:\n> > >\n> > > Hi David,\n> > >   sorry for jumping late on the topic, I managed to read the\n> > > discussions in v1 and v2 only today.\n> >\n> > Please don't apologise, I'm very glad that you've picked this up. This\n> > is actually one of the \"big ticket\" items that libcamera is currently\n> > lacking compared to our existing stack, so I was going to give the\n> > discussion a \"nudge\" quite soon!\n> >\n> > >\n> > > On Thu, Jul 23, 2020 at 09:43:37AM +0100, David Plowman wrote:\n> > > > These changes add a Digital Zoom control, taking a rectangle as its\n> > > > argument, indicating the region of the sensor output that the\n> > > > pipeline will \"zoom up\" to the final output size.\n> > > >\n> > > > Additionally, we need to have a method returning the \"pipelineCrop\"\n> > > > which gives the dimensions of the sensor output, taken by the\n> > > > pipeline, and within which we can subsequently pan and zoom.\n> > >\n> > > This is the part I don't get. What's the intended user of this\n> > > information ? Is this meant to be provided to applications so that\n> > > they know which is the area inside which they can later pan/zoom, right ?\n> >\n> > Correct. I'm imagining a user might say \"I want to see the image at 2x\n> > zoom\". 
If we know the unzoomed image is being created by (as per one\n> > of my examples somewhere) 2028x1140 pixels from the sensor, then the\n> > application would send a \"digital zoom\" of offset: (507,285) size:\n> > 1014x570.\n> >\n> > I think the feeling was that we liked having control right down to the\n> > pixel level, which in turn means that the numbers 2028x1140 need to be\n> > supplied to the application from somewhere.\n> >\n>\n> Just for the sake of discussion, would a 'digitalZoomFactor' (or whatever\n> other name) control that takes, for example, a 2x value, alongside\n> a more precise 'digitalZoomRectangle' that behaves like you have\n> described, make sense ?\n\nYes possibly, though I slightly get the feeling we'd be making more\nwork for ourselves... another control, pipeline handlers would all be\nexpected to do it, does it mean more metadata being returned... You\nalso have to define what it would do if someone has already panned\naway from the centre etc. Maybe adding some kind of helper function\nwould be a good idea here? (e.g. give me the Rectangle I need for this\nzoom factor, and then there's only one control).\n\n>\n> > >\n> > > I'm kind of against to ad-hoc function to retrieve information about\n> > > the image processing pipeline configuration. I can easily see them\n> > > multiply and make our API awful, and each platform would need to add\n> > > its own very-specific bits, and that can't certainly go through the\n> > > main Camera class API.\n> > >\n> > > If I got your use case right:\n> > >\n> > > 1) You need to report the pipeline crop area, which is a rectangle\n> > >    defined on the frame that is produced by the sensor.\n> > >    Using your example:\n> > >\n> > >         sensorSize = 1920x1440\n> > >         pipelineCrop = 1920x1080\n> > >\n> > >    It seems to me the 'pipelineCrop' is a good candidate to be a\n> > >    CameraConfiguration property, which at the moment is not yet\n> > >    implemented. 
Roughly speaking, we have been discussing it in the\n> > >    context of frame durations, and it's about augmenting the\n> > >    CameraConfiguration class with a set of properties that depend on\n> > >    the currently applied configuration of the enabled streams.\n> > >\n> > >    Ideally we should be able to receive CameraConfiguration, validate\n> > >    it, and populate it with properties, like the minimum frame\n> > >    duration (which is a combination of durations of all streams), and\n> > >    now with this new 'pipelineCrop' which is the result of a decision\n> > >    the pipeline handler takes based on the requested output stream\n> > >    sizes.\n> > >\n> > >    I think the ideal plan forward for this would be to define a new\n> > >    control/property (I would say property as it's immutable from\n> > >    applications) that is set by pipeline handlers during validate()\n> > >    when the sensor mode is selected and eventually cropped.\n> >\n> > Absolutely. The problem with this has never been how to calculate the\n> > \"pipelineCrop\", or who calculates it (that's the pipeline handler),\n> > but how to signal this value to the application. When can I have this\n> > feature, please? :)\n> >\n>\n> I think we need to resume discussions on frame durations calculation,\n> or decide that the proposed direction for 'pipelineCrop' is the\n> right one and:\n> 1) Define the property\n> 2) Add a ControlList of properties to CameraConfiguration\n> 3) Instrument the pipeline handler to report it in validate()\n>\n> It seems pretty straightforward to me, but I might be missing pieces.\n> Laurent, do you have opinions here ?\n>\n> > >\n> > >    Application can access it from the CameraConfiguration and decide\n> > >    where to set their pan/zoom rectangle.\n> > >\n> > >    How the pipeline handler applies the cropping to the sensor\n> > >    produced frame is less relevant imo. 
Systems which have direct access\n> > >    to the v4l2 subdev will probably use the crop/compose rectangle\n> > >    applied on the sensor. In your case I see you use the crop target\n> > >    on the ISP video device node. Am I wrong or you could calculate\n> > >    that at validate() time, once you know all the desired output sizes and\n> > >    based on them identify the 'best' sensor mode to use ?\n> >\n> > We update the selection on the ISP device node. I would expect this\n> > approach to be more common than doing it on the sensor itself (you\n> > don't lose statistics for the rest of the frame), but it's of course\n> > up to the pipeline handlers.\n> >\n>\n> You are right, this will mostly always come from ISP (sub)device node,\n> not from the sensor.\n>\n> > >\n> > > 2) Application, once they know the 'pipelineCrop' (I'm ok with that\n> > >    term btw, as we have analogCrop and we will have a digitalCrop for\n> > >    the sensor's processing stages, so pipelineCrop feels right) can\n> > >    set their pan/zoom rectangle with a control, as the one you have\n> > >    defined below. Now, what do you think the reference of that rectangle\n> > >    should be ? I mean, should the 'pipelineCrop' always be in position\n> > >    (0,0) and the 'digitalZoom' be applied on top of it ? I would say\n> > >    so, but then we lose the information on which part of the image\n> > >    the pipeline decided to crop, and I'm not sure it is relevant or\n> > >    not.\n> >\n> > So I'm in favour of giving applications only the size of the pipeline\n> > crop, nothing else. It's all you need to implement digital zoom, and\n> > panning within the original image, and it makes life very\n> > straightforward for everyone.\n> >\n> > I agree there are more complex scenarios you could imagine. Maybe you\n> > let applications pan/zoom within the whole of the sensor area, so\n> > you'd be able to pan to parts of the image that you can't see in the\n> > \"default unzoomed\" version. 
I mean, there's nothing \"wrong\" in this,\n> > it just feels more complicated and a bit unexpected to me. I also\n> > think this would delegate most of the aspect ratio calculations to the\n> > applications, and I suspect they'd get themselves confused and\n> > actually implement the simpler scheme anyway... But yes, there is a\n> > decision to be made here.\n> >\n> > >\n> > >    The other possibility is to report the 'pipelineCrop' in respect to\n> > >    the full active pixel array size, and ask application to provide a\n> > >    'digitalZoom' rectangle with the same reference. Is it worth the\n> > >    additional complication in your opinion?\n> >\n> > Sorry, answered that one above!\n> >\n> > >\n> > >    Third option is to report the sensor frame size AND the pipeline crop\n> > >    defined in respect to it, and let application provide a digitalZoom\n> > >    rectangle defined in respect to the pipeline crop. This would look\n> > >    like (numbers are random):\n> > >         sensorFrameSize = {0, 0, 1920, 1440}\n> > >         pipelineCrop = { 0, 150, 1920, 1080}\n> > >         digitalZoom = {300, 300, 1280, 720}\n> > >\n> > >    This would allow application to know that the sensor frame is\n> > >    1920x1440, the pipeline selected a region of 1920x1080 by applying\n> > >    an offset of 150 pixels in the vertical direction and the desired\n> > >    digital zoom is then applied on this last rectangle (resulting in an\n> > >    effective (300,450) offset from the full sensor frame).\n> > >\n> > > I won't bikeshed on names too much, just leave as a reference that\n> > > Android reports the same property we are defining as pipelineCrop as\n> > > 'scaler.cropRegion' as for them 'scaler' is used to report properties\n> > > of the ISP processing pipeline.\n> >\n> > I could go with \"scalerCrop\" too!\n>\n> Just for reference, I paste here the Android control documentation,\n> mostly to wrap our heads around the fact they decided to implement it\n> using the 
full pixel array as reference, but the documentation later\n> throws the sensor scaling in the mix, so I don't fully get it.\n>\n> Reading it through might help spotting details I might have missed.\n\nThis is indeed different. They're letting you pan anywhere that the\nsensor will let you, and then they're sorting out the aspect ratio\nissues for you, it seems.\n\nPanning anywhere that the sensor lets you is rather going away from\nwhat I was proposing. It means they'll get the behaviour that I\nthought was a bit weird (zooming in more will reveal previously\ninvisible parts of the image). But they take care of the aspect ratio\ncalculations for the user, though that presumably means you couldn't\nask for a strange aspect ratio even if you wanted to.\n\nPerhaps the answer is like the one I gave earlier. We should remain as\nflexible as possible (weird behaviours and all) and provide helper\nfunctions to make it easy for applications to implement the behaviours\nthey want. So that would mean:\n\n1. The \"pipeline crop\" disappears and we use the total image size that\nis available from the sensor. That is, the size of the sensor output\nin actual pixels, taking account of any scaling/binning/cropping\nperformed *within* the sensor. I think this might already be available\nsomewhere? (If the pipeline handler doesn't want to use the full\nsensor area for some reason, then the pipeline handler can re-scale\ninternally, though you might not have quite the same pixel-level\ncontrol as you thought)\n\n2. We take an arbitrary rectangle supplied by the application and\nsimply use it to specify the pan/zoom without altering it. The extra\nmunging that Android would do can be performed in the\nlibcamera/Android interface layer.\n\n3. 
Then there's a helper function taking that sensor output size from\n1, the desired pan/zoom parameters (2), and called\ngetMeTheSensibleZoomRectanglePlease (OK, we might change that name).\n\nHow does that sound?\nDavid\n\n>\n> ------------------------------------------------------------------------------\n> android.scaler.cropRegion\n>\n> The desired region of the sensor to read out for this capture.\n>\n> Pixel coordinates relative to android.sensor.info.activeArraySize or\n> android.sensor.info.preCorrectionActiveArraySize depending on\n> distortion correction capability and mode\n>\n> This control can be used to implement digital zoom.\n>\n> For devices not supporting android.distortionCorrection.mode control,\n> the coordinate system always follows that of\n> android.sensor.info.activeArraySize, with (0, 0) being the top-left\n> pixel of the active array.\n>\n> For devices supporting android.distortionCorrection.mode control, the\n> coordinate system depends on the mode being set.When the distortion\n> correction mode is OFF, the coordinate system follows\n> android.sensor.info.preCorrectionActiveArraySize, with (0, 0) being\n> the top-left pixel of the pre-correction active array.When the\n> distortion correction mode is not OFF, the coordinate system follows\n> android.sensor.info.activeArraySize, with (0, 0) being the top-left\n> pixel of the active array.\n>\n> Output streams use this rectangle to produce their output,cropping to\n> a smaller region if necessary to maintain the stream's aspect ratio,\n> then scaling the sensor input to match the output's configured\n> resolution.\n>\n> The crop region is applied after the RAW to other color space (e.g.\n> YUV) conversion. Since raw streams (e.g. RAW16) don't have the\n> conversion stage, they are not croppable. 
The crop region will be\n> ignored by raw streams.\n>\n> For non-raw streams, any additional per-stream cropping will be done\n> to maximize the final pixel area of the stream.\n>\n> For example, if the crop region is set to a 4:3 aspect ratio, then 4:3\n> streams will use the exact crop region. 16:9 streams will further crop\n> vertically (letterbox).\n>\n> Conversely, if the crop region is set to a 16:9, then 4:3 outputs will\n> crop horizontally (pillarbox), and 16:9 streams will match exactly.\n> These additional crops will be centered within the crop region.\n>\n> If the coordinate system is android.sensor.info.activeArraySize, the\n> width and height of the crop region cannot be set to be smaller than\n> floor( activeArraySize.width / android.scaler.availableMaxDigitalZoom\n> ) and floor( activeArraySize.height /\n> android.scaler.availableMaxDigitalZoom ), respectively.\n>\n> If the coordinate system is\n> android.sensor.info.preCorrectionActiveArraySize, the width and height\n> of the crop region cannot be set to be smaller than floor(\n> preCorrectionActiveArraySize.width /\n> android.scaler.availableMaxDigitalZoom ) and floor(\n> preCorrectionActiveArraySize.height /\n> android.scaler.availableMaxDigitalZoom ),respectively.\n>\n> The camera device may adjust the crop region to account for rounding\n> and other hardware requirements; the final crop region used will be\n> included in the output capture result.\n>\n> HAL Implementation Details: The output streams must maintain square\n> pixels at all times, no matter what the relative aspect ratios of the\n> crop region and the stream are. 
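The letterbox/pillarbox rule quoted here amounts to fitting each stream's aspect ratio inside the crop region and centring the result. A sketch of that calculation (placeholder types, not Android HAL or libcamera API):

```cpp
// Placeholder standing in for an image size in pixels.
struct Size { unsigned int width, height; };

// Per-stream crop per the quoted rule: keep the largest part of the crop
// region that matches the stream's aspect ratio. A crop region wider than
// the stream is pillarboxed, a taller one letterboxed; e.g. a 4:3 region
// of 1600x1200 becomes 1600x900 for a 16:9 stream.
static Size streamCrop(const Size &cropRegion, const Size &stream)
{
	// Compare aspect ratios by cross-multiplying to avoid floating point.
	unsigned long long lhs =
		static_cast<unsigned long long>(cropRegion.width) * stream.height;
	unsigned long long rhs =
		static_cast<unsigned long long>(stream.width) * cropRegion.height;

	if (lhs > rhs) /* crop region wider than stream: pillarbox */
		return { static_cast<unsigned int>(rhs / stream.height),
			 cropRegion.height };

	/* crop region taller than (or as tall as) stream: letterbox */
	return { cropRegion.width,
		 static_cast<unsigned int>(lhs / stream.width) };
}
```

The centring step (not shown) then just offsets the result by half the width/height difference within the crop region, as the quoted text describes.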
Negative values for corner are allowed\n> for raw output if the full pixel array is larger than the active pixel\n> array. Width and height may be rounded to the nearest larger\n> supportable width, especially for raw output, where only a few fixed\n> scales may be possible.\n>\n> For a set of configured output streams, if the sensor output is\n> cropped to a smaller size than the pre-correction active array size,\n> the HAL needs to follow the cropping rules below:\n>\n> The HAL needs to handle the cropRegion as if the sensor crop size is\n> the effective pre-correction active array size. More specifically, the\n> HAL must transform the requested cropRegion from\n> android.sensor.info.preCorrectionActiveArraySize to the sensor cropped\n> pixel area size in this way:\n>\n> Translate the requested cropRegion w.r.t. the top-left corner of the\n> sensor cropped pixel area by (tx, ty), where ty = sensorCrop.top *\n> (sensorCrop.height / preCorrectionActiveArraySize.height) and tx =\n> sensorCrop.left * (sensorCrop.width /\n> preCorrectionActiveArraySize.width). The (sensorCrop.top,\n> sensorCrop.left) coordinate is based on\n> android.sensor.info.activeArraySize. Scale the width and height of the\n> requested cropRegion by scaling factors of\n> sensorCrop.width/preCorrectionActiveArraySize.width and\n> sensorCrop.height/preCorrectionActiveArraySize.height,\n> respectively. Once this new cropRegion is calculated, the HAL must use\n> this region to crop the image with regard to the sensor crop size\n> (effective pre-correction active array size). The HAL still needs to\n> follow the general cropping rule for this new cropRegion and the\n> effective pre-correction active array size. The HAL must report the\n> cropRegion with regard to\n> android.sensor.info.preCorrectionActiveArraySize. The HAL needs to\n> convert the new cropRegion generated above w.r.t. the full\n> pre-correction active array size. 
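
[Aside: read literally, the translate-and-scale mapping quoted above could be sketched like this - purely illustrative, with an invented Rect type and function name, and the formulas taken verbatim from the quoted text:]

```cpp
#include <cstdint>

struct Rect {
	int32_t left, top;
	uint32_t width, height;
};

/*
 * Illustrative sketch (invented names): map a requested cropRegion,
 * defined on the pre-correction active array, into the sensor-cropped
 * pixel area. Translate by (tx, ty), then scale width and height by
 * the sensorCrop / preCorrectionActiveArraySize ratios, following the
 * quoted text literally.
 */
Rect mapCropRegion(const Rect &requested, const Rect &sensorCrop,
		   uint32_t preWidth, uint32_t preHeight)
{
	double sx = static_cast<double>(sensorCrop.width) / preWidth;
	double sy = static_cast<double>(sensorCrop.height) / preHeight;

	/* tx = sensorCrop.left * (sensorCrop.width / preWidth), etc. */
	double tx = sensorCrop.left * sx;
	double ty = sensorCrop.top * sy;

	Rect mapped;
	mapped.left = static_cast<int32_t>(requested.left - tx);
	mapped.top = static_cast<int32_t>(requested.top - ty);
	mapped.width = static_cast<uint32_t>(requested.width * sx);
	mapped.height = static_cast<uint32_t>(requested.height * sy);
	return mapped;
}
```

[E.g. with a 4000x3000 pre-correction array and a 2000x1500 sensor crop at (400, 300), a requested region of 2000x1200 at (1000, 600) maps to 1000x600 at (800, 450).]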
The reported cropRegion may be\n> slightly different from the requested cropRegion since the HAL may\n> adjust the crop region to account for rounding, conversion error, or\n> other hardware limitations.\n> ------------------------------------------------------------------------------\n>\n> Thanks\n>   j\n>\n>\n> >\n> > Best regards\n> > David\n> >\n> > >\n> > > I'm sorry for the wall of text, I wish we had a blackboard :)\n> > >\n> > > Thanks\n> > >   j\n> > >\n> > > >\n> > > > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > > > ---\n> > > >  include/libcamera/camera.h                    |  2 ++\n> > > >  include/libcamera/internal/pipeline_handler.h |  4 +++\n> > > >  src/libcamera/camera.cpp                      | 27 +++++++++++++++++++\n> > > >  src/libcamera/control_ids.yaml                | 10 +++++++\n> > > >  4 files changed, 43 insertions(+)\n> > > >\n> > > > diff --git a/include/libcamera/camera.h b/include/libcamera/camera.h\n> > > > index 4d1a4a9..6819b8e 100644\n> > > > --- a/include/libcamera/camera.h\n> > > > +++ b/include/libcamera/camera.h\n> > > > @@ -92,6 +92,8 @@ public:\n> > > >       std::unique_ptr<CameraConfiguration> generateConfiguration(const StreamRoles &roles = {});\n> > > >       int configure(CameraConfiguration *config);\n> > > >\n> > > > +     Size const &getPipelineCrop() const;\n> > > > +\n> > > >       Request *createRequest(uint64_t cookie = 0);\n> > > >       int queueRequest(Request *request);\n> > > >\n> > > > diff --git a/include/libcamera/internal/pipeline_handler.h b/include/libcamera/internal/pipeline_handler.h\n> > > > index 22e629a..5bfe890 100644\n> > > > --- a/include/libcamera/internal/pipeline_handler.h\n> > > > +++ b/include/libcamera/internal/pipeline_handler.h\n> > > > @@ -89,6 +89,8 @@ public:\n> > > >\n> > > >       const char *name() const { return name_; }\n> > > >\n> > > > +     Size const &getPipelineCrop() const { return pipelineCrop_; }\n> > > > +\n> > > >  protected:\n> > > >       void 
registerCamera(std::shared_ptr<Camera> camera,\n> > > >                           std::unique_ptr<CameraData> data);\n> > > > @@ -100,6 +102,8 @@ protected:\n> > > >\n> > > >       CameraManager *manager_;\n> > > >\n> > > > +     Size pipelineCrop_;\n> > > > +\n> > > >  private:\n> > > >       void mediaDeviceDisconnected(MediaDevice *media);\n> > > >       virtual void disconnect();\n> > > > diff --git a/src/libcamera/camera.cpp b/src/libcamera/camera.cpp\n> > > > index 69a1b44..f8b8ec6 100644\n> > > > --- a/src/libcamera/camera.cpp\n> > > > +++ b/src/libcamera/camera.cpp\n> > > > @@ -793,6 +793,33 @@ int Camera::configure(CameraConfiguration *config)\n> > > >       return 0;\n> > > >  }\n> > > >\n> > > > +/**\n> > > > + * \\brief Return the size of the sensor image being used by the pipeline\n> > > > + * to create the output.\n> > > > + *\n> > > > + * This method returns the size, in pixels, of the raw image read from the\n> > > > + * sensor and which is used by the pipeline to form the output image(s)\n> > > > + * (rescaling if necessary). Note that these values take account of any\n> > > > + * cropping performed on the sensor output so as to produce the correct\n> > > > + * aspect ratio. It would normally be necessary to retrieve these values\n> > > > + * in order to calculate correct parameters for digital zoom.\n> > > > + *\n> > > > + * Example: a sensor mode may produce a 1920x1440 output image. But if an\n> > > > + * application has requested a 16:9 image, the values returned here might\n> > > > + * be 1920x1080 - the largest portion of the sensor output that provides\n> > > > + * the correct aspect ratio.\n> > > > + *\n> > > > + * \\context This function is \\threadsafe. 
It will only return valid\n> > > > + * (non-zero) values when the camera has been configured.\n> > > > + *\n> > > > + * \\return The dimensions of the sensor image used by the pipeline.\n> > > > + */\n> > > > +\n> > > > +Size const &Camera::getPipelineCrop() const\n> > > > +{\n> > > > +     return p_->pipe_->getPipelineCrop();\n> > > > +}\n> > > > +\n> > > >  /**\n> > > >   * \\brief Create a request object for the camera\n> > > >   * \\param[in] cookie Opaque cookie for application use\n> > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml\n> > > > index 988b501..5a099d5 100644\n> > > > --- a/src/libcamera/control_ids.yaml\n> > > > +++ b/src/libcamera/control_ids.yaml\n> > > > @@ -262,4 +262,14 @@ controls:\n> > > >          In this respect, it is not necessarily aimed at providing a way to\n> > > >          implement a focus algorithm by the application, rather an indication of\n> > > >          how in-focus a frame is.\n> > > > +\n> > > > +  - DigitalZoom:\n> > > > +      type: Rectangle\n> > > > +      description: |\n> > > > +        Sets the portion of the full sensor image, in pixels, that will be\n> > > > +        used for digital zoom. That is, this part of the sensor output will\n> > > > +        be scaled up to make the full size output image (and everything else\n> > > > +        discarded). To obtain the \"full sensor image\" that is available, the\n> > > > +        method Camera::getPipelineCrop() should be called once the camera is 
An application may pan and zoom within this rectangle.\n> > > >  ...\n> > > > --\n> > > > 2.20.1\n> > > >\n> > > > _______________________________________________\n> > > > libcamera-devel mailing list\n> > > > libcamera-devel@lists.libcamera.org\n> > > > https://lists.libcamera.org/listinfo/libcamera-devel","headers":{"From":"David Plowman <david.plowman@raspberrypi.com>","Date":"Mon, 3 Aug 2020 12:05:25 +0100","Message-ID":"<CAHW6GY+0HB62HLDGVwwzLkXD8cT_N1dpaDbffTC-yd-EygACzQ@mail.gmail.com>","In-Reply-To":"<20200801093001.nokdnc5nz5zfde32@uno.localdomain>","To":"Jacopo Mondi <jacopo@jmondi.org>","Subject":"Re: [libcamera-devel] [PATCH v3 1/2] libcamera: Infrastructure for digital zoom","Cc":"libcamera-devel@lists.libcamera.org"}}]