[{"id":26229,"web_url":"https://patchwork.libcamera.org/comment/26229/","msgid":"<CAHW6GYL68eiz-2iTBAq-qPgjwL+6j4EXfxkuMaR6Rz4n4N+5DQ@mail.gmail.com>","date":"2023-01-16T12:24:09","subject":"Re: [libcamera-devel] [PATCH v2 2/5] libcamera: camera_sensor:\n\tValidate Transform","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Jacopo\n\nThanks for the patch!\n\nOn Sat, 14 Jan 2023 at 19:47, Jacopo Mondi\n<jacopo.mondi@ideasonboard.com> wrote:\n>\n> The two pipeline handlers that currently support Transform (IPU3 and\n> RkISP1) implement it by operating H/V flips on the image sensor.\n\nShouldn't that be three PHs, including Raspberry Pi... I'm feeling left out!\n\n>\n> Centralize the code that validates a Transform request against the\n> sensor rotation capabilities in the CameraSensor class.\n>\n> The implementation in the IPU3 pipeline handler was copied from the\n> RaspberryPi implementation, and is now centralized in CameraSensor to\n> make it easier for other platforms.\n>\n> The CameraSensor::validateTransform() implementation comes directly from\n> the RaspberryPi pipeline handler, no functional changes intended.\n>\n> Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>\n> ---\n>  include/libcamera/internal/camera_sensor.h    |  4 ++\n>  src/libcamera/camera_sensor.cpp               | 65 +++++++++++++++++++\n>  src/libcamera/pipeline/ipu3/ipu3.cpp          | 45 ++-----------\n>  .../pipeline/raspberrypi/raspberrypi.cpp      | 59 ++---------------\n>  4 files changed, 82 insertions(+), 91 deletions(-)\n>\n> diff --git a/include/libcamera/internal/camera_sensor.h b/include/libcamera/internal/camera_sensor.h\n> index 878f3c28a3c9..bea52badaff7 100644\n> --- a/include/libcamera/internal/camera_sensor.h\n> +++ b/include/libcamera/internal/camera_sensor.h\n> @@ -29,6 +29,8 @@ class BayerFormat;\n>  class CameraLens;\n>  class MediaEntity;\n>\n> +enum class 
Transform;\n> +\n>  struct CameraSensorProperties;\n>\n>  class CameraSensor : protected Loggable\n> @@ -68,6 +70,8 @@ public:\n>\n>         CameraLens *focusLens() { return focusLens_.get(); }\n>\n> +       Transform validateTransform(Transform *transform) const;\n> +\n>  protected:\n>         std::string logPrefix() const override;\n>\n> diff --git a/src/libcamera/camera_sensor.cpp b/src/libcamera/camera_sensor.cpp\n> index 83ac075a4265..3518d3e3b02a 100644\n> --- a/src/libcamera/camera_sensor.cpp\n> +++ b/src/libcamera/camera_sensor.cpp\n> @@ -962,6 +962,71 @@ void CameraSensor::updateControlInfo()\n>   * connected to the sensor\n>   */\n>\n> +/**\n> + * \\brief Validate a transform request against the sensor capabilities\n> + * \\param[inout] transform The requested transformation, updated to match\n> + * the sensor capabilities\n> + *\n> + * The requested \\a transform is adjusted to compensate for the sensor's\n> + * mounting rotation and validated agains the sensor capabilities.\n> + *\n> + * For example, if the requested \\a transform is Transform::Identity and the\n> + * sensor rotation is 180 degrees, the resulting transform returned by the\n> + * function is Transform::Rot180 to automatically correct the image, but only if\n> + * the sensor can actually apply horizontal and vertical flips.\n\nI'm not completely sure about the explanation here... what we're doing\nis so easy to get in a muddle over that I wonder if we could be\nclearer.\n\nThe critical point is that the output from this function is supposed\nto be the transform you need to apply to the sensor in order to give\nyou the input transform that you requested, compensating for any\nsensor rotation. Perhaps these two paragraphs might go something like\nthis:\n\n* The output transform is the transform that needs to be applied to the sensor\n* in order to produce the input \\a transform, compensating for the sensor's\n* mounting rotation. 
It is also validated against the sensor's ability\nto perform\n* horizontal and vertical flips.\n*\n* For example, if the requested \\a transform is Transform::Identity and the\n* sensor rotation is 180 degrees, the output transform will be Transform::Rot180\n* to correct the images so that they appear to have\nTransform::Identity, but only if\n* the sensor can apply horizontal and vertical flips.\n\nBut please only use any of that if you think it's helpful. We should\nperhaps also say something more about the way the input transform is\nchanged?\n\n* The input \\a transform is the transform that the caller wants, and\nit is adjusted\n* according to the capabilities of the sensor to represent the\n\"nearest\" transform\n* that can actually be delivered.\n\n> + *\n> + * \\return A Transform instance that represents which transformation has been\n> + * applied to the camera sensor\n> + */\n> +Transform CameraSensor::validateTransform(Transform *transform) const\n> +{\n> +       /* Adjust the requested transform to compensate the sensor rotation. */\n> +       int32_t rotation = properties().get(properties::Rotation).value_or(0);\n> +       bool success;\n> +\n> +       Transform rotationTransform = transformFromRotation(rotation, &success);\n> +       if (!success)\n> +               LOG(CameraSensor, Warning) << \"Invalid rotation of \" << rotation\n> +                                          << \" degrees - ignoring\";\n> +\n> +       Transform combined = *transform * rotationTransform;\n\nI think there's still an outstanding question about the direction of\nthe rotation... 
but not an issue for this patch!\n\n> +\n> +       /*\n> +        * We combine the platform and user transform, but must \"adjust away\"\n> +        * any combined result that includes a transform, as we can't do those.\n> +        * In this case, flipping only the transpose bit is helpful to\n> +        * applications - they either get the transform they requested, or have\n> +        * to do a simple transpose themselves (they don't have to worry about\n> +        * the other possible cases).\n> +        */\n> +       if (!!(combined & Transform::Transpose)) {\n> +               /*\n> +                * Flipping the transpose bit in \"transform\" flips it in the\n> +                * combined result too (as it's the last thing that happens),\n> +                * which is of course clearing it.\n> +                */\n> +               *transform ^= Transform::Transpose;\n> +               combined &= ~Transform::Transpose;\n> +       }\n> +\n> +       /*\n> +        * We also check if the sensor doesn't do h/vflips at all, in which\n> +        * case we clear them, and the application will have to do everything.\n> +        */\n> +       if (!supportFlips_ && !!combined) {\n> +               /*\n> +                * If the sensor can do no transforms, then combined must be\n> +                * changed to the identity. The only user transform that gives\n> +                * rise to this the inverse of the rotation. (Recall that\n\nTypo in the original code! 
s/this the/this is the/\n\nAnyway, sorry for all the fussing over the documentation, otherwise I\nthink this is a very good change.\n\nReviewed-by: David Plowman <david.plowman@raspberrypi.com>\n\nThanks!\nDavid\n\n> +                * combined = transform * rotationTransform.)\n> +                */\n> +               *transform = -rotationTransform;\n> +               combined = Transform::Identity;\n> +       }\n> +\n> +       return combined;\n> +}\n> +\n>  std::string CameraSensor::logPrefix() const\n>  {\n>         return \"'\" + entity_->name() + \"'\";\n> diff --git a/src/libcamera/pipeline/ipu3/ipu3.cpp b/src/libcamera/pipeline/ipu3/ipu3.cpp\n> index e4d79ea44aed..a424ac914859 100644\n> --- a/src/libcamera/pipeline/ipu3/ipu3.cpp\n> +++ b/src/libcamera/pipeline/ipu3/ipu3.cpp\n> @@ -184,48 +184,15 @@ CameraConfiguration::Status IPU3CameraConfiguration::validate()\n>         if (config_.empty())\n>                 return Invalid;\n>\n> -       Transform combined = transform * data_->rotationTransform_;\n> -\n> -       /*\n> -        * We combine the platform and user transform, but must \"adjust away\"\n> -        * any combined result that includes a transposition, as we can't do\n> -        * those. 
In this case, flipping only the transpose bit is helpful to\n> -        * applications - they either get the transform they requested, or have\n> -        * to do a simple transpose themselves (they don't have to worry about\n> -        * the other possible cases).\n> -        */\n> -       if (!!(combined & Transform::Transpose)) {\n> -               /*\n> -                * Flipping the transpose bit in \"transform\" flips it in the\n> -                * combined result too (as it's the last thing that happens),\n> -                * which is of course clearing it.\n> -                */\n> -               transform ^= Transform::Transpose;\n> -               combined &= ~Transform::Transpose;\n> -               status = Adjusted;\n> -       }\n> -\n>         /*\n> -        * We also check if the sensor doesn't do h/vflips at all, in which\n> -        * case we clear them, and the application will have to do everything.\n> +        * Validate the requested transform against the sensor capabilities and\n> +        * rotation and store the final combined transform that configure() will\n> +        * need to apply to the sensor to save us working it out again.\n>          */\n> -       if (!data_->supportsFlips_ && !!combined) {\n> -               /*\n> -                * If the sensor can do no transforms, then combined must be\n> -                * changed to the identity. The only user transform that gives\n> -                * rise to this is the inverse of the rotation. 
(Recall that\n> -                * combined = transform * rotationTransform.)\n> -                */\n> -               transform = -data_->rotationTransform_;\n> -               combined = Transform::Identity;\n> +       Transform requestedTransform = transform;\n> +       combinedTransform_ = data_->cio2_.sensor()->validateTransform(&transform);\n> +       if (transform != requestedTransform)\n>                 status = Adjusted;\n> -       }\n> -\n> -       /*\n> -        * Store the final combined transform that configure() will need to\n> -        * apply to the sensor to save us working it out again.\n> -        */\n> -       combinedTransform_ = combined;\n>\n>         /* Cap the number of entries to the available streams. */\n>         if (config_.size() > kMaxStreams) {\n> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> index 8569df17976a..c086a69a913d 100644\n> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> @@ -367,59 +367,14 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n>         status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);\n>\n>         /*\n> -        * What if the platform has a non-90 degree rotation? We can't even\n> -        * \"adjust\" the configuration and carry on. Alternatively, raising an\n> -        * error means the platform can never run. 
Let's just print a warning\n> -        * and continue regardless; the rotation is effectively set to zero.\n> +        * Validate the requested transform against the sensor capabilities and\n> +        * rotation and store the final combined transform that configure() will\n> +        * need to apply to the sensor to save us working it out again.\n>          */\n> -       int32_t rotation = data_->sensor_->properties().get(properties::Rotation).value_or(0);\n> -       bool success;\n> -       Transform rotationTransform = transformFromRotation(rotation, &success);\n> -       if (!success)\n> -               LOG(RPI, Warning) << \"Invalid rotation of \" << rotation\n> -                                 << \" degrees - ignoring\";\n> -       Transform combined = transform * rotationTransform;\n> -\n> -       /*\n> -        * We combine the platform and user transform, but must \"adjust away\"\n> -        * any combined result that includes a transform, as we can't do those.\n> -        * In this case, flipping only the transpose bit is helpful to\n> -        * applications - they either get the transform they requested, or have\n> -        * to do a simple transpose themselves (they don't have to worry about\n> -        * the other possible cases).\n> -        */\n> -       if (!!(combined & Transform::Transpose)) {\n> -               /*\n> -                * Flipping the transpose bit in \"transform\" flips it in the\n> -                * combined result too (as it's the last thing that happens),\n> -                * which is of course clearing it.\n> -                */\n> -               transform ^= Transform::Transpose;\n> -               combined &= ~Transform::Transpose;\n> -               status = Adjusted;\n> -       }\n> -\n> -       /*\n> -        * We also check if the sensor doesn't do h/vflips at all, in which\n> -        * case we clear them, and the application will have to do everything.\n> -        */\n> -       if (!data_->supportsFlips_ && 
!!combined) {\n> -               /*\n> -                * If the sensor can do no transforms, then combined must be\n> -                * changed to the identity. The only user transform that gives\n> -                * rise to this the inverse of the rotation. (Recall that\n> -                * combined = transform * rotationTransform.)\n> -                */\n> -               transform = -rotationTransform;\n> -               combined = Transform::Identity;\n> +       Transform requestedTransform = transform;\n> +       combinedTransform_ = data_->sensor_->validateTransform(&transform);\n> +       if (transform != requestedTransform)\n>                 status = Adjusted;\n> -       }\n> -\n> -       /*\n> -        * Store the final combined transform that configure() will need to\n> -        * apply to the sensor to save us working it out again.\n> -        */\n> -       combinedTransform_ = combined;\n>\n>         unsigned int rawCount = 0, outCount = 0, count = 0, maxIndex = 0;\n>         std::pair<int, Size> outSize[2];\n> @@ -454,7 +409,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n>                         if (data_->flipsAlterBayerOrder_) {\n>                                 BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);\n>                                 bayer.order = data_->nativeBayerOrder_;\n> -                               bayer = bayer.transform(combined);\n> +                               bayer = bayer.transform(combinedTransform_);\n>                                 fourcc = bayer.toV4L2PixelFormat();\n>                         }\n>\n> --\n> 2.39.0\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id CBC22BD87C\n\tfor 
<parsemail@patchwork.libcamera.org>;\n\tMon, 16 Jan 2023 12:24:24 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 37FBC625D8;\n\tMon, 16 Jan 2023 13:24:24 +0100 (CET)","from mail-oi1-x22e.google.com (mail-oi1-x22e.google.com\n\t[IPv6:2607:f8b0:4864:20::22e])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id D6B9C61F01\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 16 Jan 2023 13:24:21 +0100 (CET)","by mail-oi1-x22e.google.com with SMTP id r205so23197757oib.9\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 16 Jan 2023 04:24:21 -0800 (PST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1673871864;\n\tbh=KobNBcs7aKwNhhxstajEZN8F+z7w7qiWz85x7uRzXks=;\n\th=References:In-Reply-To:Date:To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=30YNzCvhZf+pvFBf1xL9VDy2i90AWBp19lgkX21aezxB3WV5DYEcxFwFYzzRWy+N+\n\tOxpdDCUdPVLV6uunu6FT/aCD1Chh+iRewgU2rjlnEK/du8i9m3b70+Vnr9rQDsnXtw\n\tCgR4a92k7Qz3U67qSFimFs7uDpyq14E6dVuWfqp/Se2yCM4w2te0pd4rb2Chb1WxO+\n\tqhfWJGDGR4RPaeRSUIFJ8765mZH8M5LuFWab8LRg8x8fk6F0co0emKjvNGDvM0ZV1g\n\tqwLZcG2ZyDAWhXULYFVvsrE4FRbsodi7IZr7hEd+zN51S1U4Px19b81K8dbssssOLg\n\tD0XQIvibhLx5A==","v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:from:to:cc:subject:date:message-id:reply-to;\n\tbh=R73CkaXO3jsDbEru+2ptCtYZHRxFACj40AYO7FXiiV4=;\n\tb=KicLKUHyn4XV5JyPRvkWRwjkGzuAEvBZjg3v+Jkecpo6CoLV5c45is8hnt6Y9EKtkT\n\tPxxILO/AQXkC1e8tm7SUZuY3LqzMtx4ifejPwdvv4nnBBOl+kyJ70EvDThsJts/h4C0j\n\taWvO+xg7mq2EH5K6kf5Zp5f3Hnefn/MG2RmwdrBWQJHKspsrAX5GXmk0/isaaEP9w9DG\n\tZDwqsoomUA4W8OtrPqfZ/pPFoIB4GLboQ9chgjUW9tc48djUnMCm7/dWL/IzBCZuAVLc\n\t8xnHecf8skmyYpu7CmnQimg5hvvk0xn26K+lfU841efN2Dr9W272/AtIxDzsS+L1JOe+\n\tx9/Q=="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass 
(2048-bit key; \n\tunprotected) header.d=raspberrypi.com\n\theader.i=@raspberrypi.com\n\theader.b=\"KicLKUHy\"; dkim-atps=neutral","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20210112;\n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:x-gm-message-state:from:to:cc:subject:date:message-id\n\t:reply-to;\n\tbh=R73CkaXO3jsDbEru+2ptCtYZHRxFACj40AYO7FXiiV4=;\n\tb=Ldjfcui8g1EorgD8wPRHnman4jSwZ0n1iRBBrXHvDUIoXXTEPhhFNRSim7uG5FW+Q0\n\t07P0wwN6pgO6nrdSfR3eu8ndl+XsvutxoTM2SqBVa8gPfHUM91zU/pXSWJ9vJYf9C7sU\n\tI8HAl8VsmCOsSxp6zcl/2rmWCwrj8w9BwKkXiSqoH8shg+jK+jOK4ij03kY5FXykpUce\n\t8nq1I26mvtelXUKZFMp6pus5NnjZ3OCP3Q80UhNyxUO47W8+GzF2OpU7wkYDD2424/r0\n\tud8M8LrtvjW8toJkypwhc5XqqB89bwwQd6pwGET6vUSP77kQ6AUvUz1pMTTukHE8RNoe\n\tO+nw==","X-Gm-Message-State":"AFqh2kqjvksBcIVbgi77Qa81PisURbn2JMm7gI0zFqD5u82OATCnqYUt\n\tB67FhhokmndJQcvHx35S5HsFO520qsziEHgH9PqK9g==","X-Google-Smtp-Source":"AMrXdXtKQk3kE6cXCAmVf1EG+id+/yBTlTgUmNlMakPA45asxWaDh27Ur0op6ke+IxsyvUruGGYI3O/kQ2nqsoGHX/w=","X-Received":"by 2002:aca:910:0:b0:354:4cfe:aaad with SMTP id\n\t16-20020aca0910000000b003544cfeaaadmr3899655oij.246.1673871860325;\n\tMon, 16 Jan 2023 04:24:20 -0800 (PST)","MIME-Version":"1.0","References":"<20230114194712.23272-1-jacopo.mondi@ideasonboard.com>\n\t<20230114194712.23272-3-jacopo.mondi@ideasonboard.com>","In-Reply-To":"<20230114194712.23272-3-jacopo.mondi@ideasonboard.com>","Date":"Mon, 16 Jan 2023 12:24:09 +0000","Message-ID":"<CAHW6GYL68eiz-2iTBAq-qPgjwL+6j4EXfxkuMaR6Rz4n4N+5DQ@mail.gmail.com>","To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH v2 2/5] libcamera: camera_sensor:\n\tValidate 
Transform","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"David Plowman via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":26231,"web_url":"https://patchwork.libcamera.org/comment/26231/","msgid":"<20230116124532.fng7573ci26zayej@uno.localdomain>","date":"2023-01-16T12:45:32","subject":"Re: [libcamera-devel] [PATCH v2 2/5] libcamera: camera_sensor:\n\tValidate Transform","submitter":{"id":143,"url":"https://patchwork.libcamera.org/api/people/143/","name":"Jacopo Mondi","email":"jacopo.mondi@ideasonboard.com"},"content":"Hi David\n\nOn Mon, Jan 16, 2023 at 12:24:09PM +0000, David Plowman wrote:\n> Hi Jacopo\n>\n> Thanks for the patch!\n>\n> On Sat, 14 Jan 2023 at 19:47, Jacopo Mondi\n> <jacopo.mondi@ideasonboard.com> wrote:\n> >\n> > The two pipeline handlers that currently support Transform (IPU3 and\n> > RkISP1) implement it by operating H/V flips on the image sensor.\n>\n> Shouldn't that be three PHs, including Raspberry Pi... 
I'm feeling left out!\n>\n\nNo, it's two PHs, but it should be RPi and not RkISP1 :)\n\n> >\n> > Centralize the code that validates a Transform request against the\n> > sensor rotation capabilities in the CameraSensor class.\n> >\n> > The implementation in the IPU3 pipeline handler was copied from the\n> > RaspberryPi implementation, and is now centralized in CameraSensor to\n> > make it easier for other platforms.\n> >\n> > The CameraSensor::validateTransform() implementation comes directly from\n> > the RaspberryPi pipeline handler, no functional changes intended.\n> >\n> > Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>\n> > ---\n> >  include/libcamera/internal/camera_sensor.h    |  4 ++\n> >  src/libcamera/camera_sensor.cpp               | 65 +++++++++++++++++++\n> >  src/libcamera/pipeline/ipu3/ipu3.cpp          | 45 ++-----------\n> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 59 ++---------------\n> >  4 files changed, 82 insertions(+), 91 deletions(-)\n> >\n> > diff --git a/include/libcamera/internal/camera_sensor.h b/include/libcamera/internal/camera_sensor.h\n> > index 878f3c28a3c9..bea52badaff7 100644\n> > --- a/include/libcamera/internal/camera_sensor.h\n> > +++ b/include/libcamera/internal/camera_sensor.h\n> > @@ -29,6 +29,8 @@ class BayerFormat;\n> >  class CameraLens;\n> >  class MediaEntity;\n> >\n> > +enum class Transform;\n> > +\n> >  struct CameraSensorProperties;\n> >\n> >  class CameraSensor : protected Loggable\n> > @@ -68,6 +70,8 @@ public:\n> >\n> >         CameraLens *focusLens() { return focusLens_.get(); }\n> >\n> > +       Transform validateTransform(Transform *transform) const;\n> > +\n> >  protected:\n> >         std::string logPrefix() const override;\n> >\n> > diff --git a/src/libcamera/camera_sensor.cpp b/src/libcamera/camera_sensor.cpp\n> > index 83ac075a4265..3518d3e3b02a 100644\n> > --- a/src/libcamera/camera_sensor.cpp\n> > +++ b/src/libcamera/camera_sensor.cpp\n> > @@ -962,6 +962,71 @@ void 
CameraSensor::updateControlInfo()\n> >   * connected to the sensor\n> >   */\n> >\n> > +/**\n> > + * \\brief Validate a transform request against the sensor capabilities\n> > + * \\param[inout] transform The requested transformation, updated to match\n> > + * the sensor capabilities\n> > + *\n> > + * The requested \\a transform is adjusted to compensate for the sensor's\n> > + * mounting rotation and validated agains the sensor capabilities.\n> > + *\n> > + * For example, if the requested \\a transform is Transform::Identity and the\n> > + * sensor rotation is 180 degrees, the resulting transform returned by the\n> > + * function is Transform::Rot180 to automatically correct the image, but only if\n> > + * the sensor can actually apply horizontal and vertical flips.\n>\n> I'm not completely sure about the explanation here... what we're doing\n> is so easy to get in a muddle over that I wonder if we could be\n> clearer.\n>\n> The critical point is that the output from this function is supposed\n> to be the transform you need to apply to the sensor in order to give\n> you the input transform that you requested, compensating for any\n> sensor rotation. Perhaps these two paragraphs might go something like\n> this:\n>\n> * The output transform is the transform that needs to be applied to the sensor\n> * in order to produce the input \\a transform, compensating for the sensor's\n> * mounting rotation. It is also validated against the sensor's ability\n> to perform\n> * horizontal and vertical flips.\n> *\n> * For example, if the requested \\a transform is Transform::Identity and the\n> * sensor rotation is 180 degrees, the output transform will be Transform::Rot180\n> * to correct the images so that they appear to have\n> Transform::Identity, but only if\n> * the sensor can apply horizontal and vertical flips.\n>\n> But please only use any of that if you think it's helpful. 
We should\n> perhaps also say something more about the way the input transform is\n> changed?\n>\n> * The input \\a transform is the transform that the caller wants, and\n> it is adjusted\n> * according to the capabilities of the sensor to represent the\n> \"nearest\" transform\n> * that can actually be delivered.\n>\n\nI'm fine taking all of this in if it sounds better to you. What I want\nin this series is to consolidate the current implementation that comes\nfrom your pipeline in a single place, so we can discuss the general\ntopics about how to use the Transform on a single implementation.\n\n> > + *\n> > + * \\return A Transform instance that represents which transformation has been\n> > + * applied to the camera sensor\n> > + */\n> > +Transform CameraSensor::validateTransform(Transform *transform) const\n> > +{\n> > +       /* Adjust the requested transform to compensate the sensor rotation. */\n> > +       int32_t rotation = properties().get(properties::Rotation).value_or(0);\n> > +       bool success;\n> > +\n> > +       Transform rotationTransform = transformFromRotation(rotation, &success);\n> > +       if (!success)\n> > +               LOG(CameraSensor, Warning) << \"Invalid rotation of \" << rotation\n> > +                                          << \" degrees - ignoring\";\n> > +\n> > +       Transform combined = *transform * rotationTransform;\n>\n> I think there's still an outstanding question about the direction of\n> the rotation... but not an issue for this patch!\n>\n\nI would prefer to maintain the current implementation and make changes\non top, in a single location.\n\n> > +\n> > +       /*\n> > +        * We combine the platform and user transform, but must \"adjust away\"\n> > +        * any combined result that includes a transform, as we can't do those.\n> > +        * In this case, flipping only the transpose bit is helpful to\n> > +        * applications - they either get the transform they requested, or have\n> > +        * to do a simple transpose themselves (they don't have to worry about\n> > +        * the other possible cases).\n> > +        */\n> > +       if (!!(combined & Transform::Transpose)) {\n> > +               /*\n> > +                * Flipping the transpose bit in \"transform\" flips it in the\n> > +                * combined result too (as it's the last thing that happens),\n> > +                * which is of course clearing it.\n> > +                */\n> > +               *transform ^= Transform::Transpose;\n> > +               combined &= ~Transform::Transpose;\n> > +       }\n> > +\n> > +       /*\n> > +        * We also check if the sensor doesn't do h/vflips at all, in which\n> > +        * case we clear them, and the application will have to do everything.\n> > +        */\n> > +       if (!supportFlips_ && !!combined) {\n> > +               /*\n> > +                * If the sensor can do no transforms, then combined must be\n> > +                * changed to the identity. The only user transform that gives\n> > +                * rise to this the inverse of the rotation. (Recall that\n>\n> Typo in the original code! 
s/this the/this is the/\n\nAh, I thought it was just me not getting the phrase :)\nI'll fix it.\n\n>\n> Anyway, sorry for all the fussing over the documentation, otherwise I\n> think this is a very good change.\n>\n\nNo worries, quite the contrary actually.\n\n> Reviewed-by: David Plowman <david.plowman@raspberrypi.com>\n>\n\nThanks\n\n> Thanks!\n> David\n>\n> > +                * combined = transform * rotationTransform.)\n> > +                */\n> > +               *transform = -rotationTransform;\n> > +               combined = Transform::Identity;\n> > +       }\n> > +\n> > +       return combined;\n> > +}\n> > +\n> >  std::string CameraSensor::logPrefix() const\n> >  {\n> >         return \"'\" + entity_->name() + \"'\";\n> > diff --git a/src/libcamera/pipeline/ipu3/ipu3.cpp b/src/libcamera/pipeline/ipu3/ipu3.cpp\n> > index e4d79ea44aed..a424ac914859 100644\n> > --- a/src/libcamera/pipeline/ipu3/ipu3.cpp\n> > +++ b/src/libcamera/pipeline/ipu3/ipu3.cpp\n> > @@ -184,48 +184,15 @@ CameraConfiguration::Status IPU3CameraConfiguration::validate()\n> >         if (config_.empty())\n> >                 return Invalid;\n> >\n> > -       Transform combined = transform * data_->rotationTransform_;\n> > -\n> > -       /*\n> > -        * We combine the platform and user transform, but must \"adjust away\"\n> > -        * any combined result that includes a transposition, as we can't do\n> > -        * those. 
In this case, flipping only the transpose bit is helpful to\n> > -        * applications - they either get the transform they requested, or have\n> > -        * to do a simple transpose themselves (they don't have to worry about\n> > -        * the other possible cases).\n> > -        */\n> > -       if (!!(combined & Transform::Transpose)) {\n> > -               /*\n> > -                * Flipping the transpose bit in \"transform\" flips it in the\n> > -                * combined result too (as it's the last thing that happens),\n> > -                * which is of course clearing it.\n> > -                */\n> > -               transform ^= Transform::Transpose;\n> > -               combined &= ~Transform::Transpose;\n> > -               status = Adjusted;\n> > -       }\n> > -\n> >         /*\n> > -        * We also check if the sensor doesn't do h/vflips at all, in which\n> > -        * case we clear them, and the application will have to do everything.\n> > +        * Validate the requested transform against the sensor capabilities and\n> > +        * rotation and store the final combined transform that configure() will\n> > +        * need to apply to the sensor to save us working it out again.\n> >          */\n> > -       if (!data_->supportsFlips_ && !!combined) {\n> > -               /*\n> > -                * If the sensor can do no transforms, then combined must be\n> > -                * changed to the identity. The only user transform that gives\n> > -                * rise to this is the inverse of the rotation. 
(Recall that\n> > -                * combined = transform * rotationTransform.)\n> > -                */\n> > -               transform = -data_->rotationTransform_;\n> > -               combined = Transform::Identity;\n> > +       Transform requestedTransform = transform;\n> > +       combinedTransform_ = data_->cio2_.sensor()->validateTransform(&transform);\n> > +       if (transform != requestedTransform)\n> >                 status = Adjusted;\n> > -       }\n> > -\n> > -       /*\n> > -        * Store the final combined transform that configure() will need to\n> > -        * apply to the sensor to save us working it out again.\n> > -        */\n> > -       combinedTransform_ = combined;\n> >\n> >         /* Cap the number of entries to the available streams. */\n> >         if (config_.size() > kMaxStreams) {\n> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > index 8569df17976a..c086a69a913d 100644\n> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > @@ -367,59 +367,14 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n> >         status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);\n> >\n> >         /*\n> > -        * What if the platform has a non-90 degree rotation? We can't even\n> > -        * \"adjust\" the configuration and carry on. Alternatively, raising an\n> > -        * error means the platform can never run. 
Let's just print a warning\n> > -        * and continue regardless; the rotation is effectively set to zero.\n> > +        * Validate the requested transform against the sensor capabilities and\n> > +        * rotation and store the final combined transform that configure() will\n> > +        * need to apply to the sensor to save us working it out again.\n> >          */\n> > -       int32_t rotation = data_->sensor_->properties().get(properties::Rotation).value_or(0);\n> > -       bool success;\n> > -       Transform rotationTransform = transformFromRotation(rotation, &success);\n> > -       if (!success)\n> > -               LOG(RPI, Warning) << \"Invalid rotation of \" << rotation\n> > -                                 << \" degrees - ignoring\";\n> > -       Transform combined = transform * rotationTransform;\n> > -\n> > -       /*\n> > -        * We combine the platform and user transform, but must \"adjust away\"\n> > -        * any combined result that includes a transform, as we can't do those.\n> > -        * In this case, flipping only the transpose bit is helpful to\n> > -        * applications - they either get the transform they requested, or have\n> > -        * to do a simple transpose themselves (they don't have to worry about\n> > -        * the other possible cases).\n> > -        */\n> > -       if (!!(combined & Transform::Transpose)) {\n> > -               /*\n> > -                * Flipping the transpose bit in \"transform\" flips it in the\n> > -                * combined result too (as it's the last thing that happens),\n> > -                * which is of course clearing it.\n> > -                */\n> > -               transform ^= Transform::Transpose;\n> > -               combined &= ~Transform::Transpose;\n> > -               status = Adjusted;\n> > -       }\n> > -\n> > -       /*\n> > -        * We also check if the sensor doesn't do h/vflips at all, in which\n> > -        * case we clear them, and the application will have to do 
everything.\n> > -        */\n> > -       if (!data_->supportsFlips_ && !!combined) {\n> > -               /*\n> > -                * If the sensor can do no transforms, then combined must be\n> > -                * changed to the identity. The only user transform that gives\n> > -                * rise to this the inverse of the rotation. (Recall that\n> > -                * combined = transform * rotationTransform.)\n> > -                */\n> > -               transform = -rotationTransform;\n> > -               combined = Transform::Identity;\n> > +       Transform requestedTransform = transform;\n> > +       combinedTransform_ = data_->sensor_->validateTransform(&transform);\n> > +       if (transform != requestedTransform)\n> >                 status = Adjusted;\n> > -       }\n> > -\n> > -       /*\n> > -        * Store the final combined transform that configure() will need to\n> > -        * apply to the sensor to save us working it out again.\n> > -        */\n> > -       combinedTransform_ = combined;\n> >\n> >         unsigned int rawCount = 0, outCount = 0, count = 0, maxIndex = 0;\n> >         std::pair<int, Size> outSize[2];\n> > @@ -454,7 +409,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n> >                         if (data_->flipsAlterBayerOrder_) {\n> >                                 BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);\n> >                                 bayer.order = data_->nativeBayerOrder_;\n> > -                               bayer = bayer.transform(combined);\n> > +                               bayer = bayer.transform(combinedTransform_);\n> >                                 fourcc = bayer.toV4L2PixelFormat();\n> >                         }\n> >\n> > --\n> > 2.39.0\n> >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com 
(lancelot.ideasonboard.com\n\t[92.243.16.209])"],"Date":"Mon, 16 Jan 2023 13:45:32 +0100","To":"David Plowman <david.plowman@raspberrypi.com>","Message-ID":"<20230116124532.fng7573ci26zayej@uno.localdomain>","References":"<20230114194712.23272-1-jacopo.mondi@ideasonboard.com>\n\t<20230114194712.23272-3-jacopo.mondi@ideasonboard.com>\n\t<CAHW6GYL68eiz-2iTBAq-qPgjwL+6j4EXfxkuMaR6Rz4n4N+5DQ@mail.gmail.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<CAHW6GYL68eiz-2iTBAq-qPgjwL+6j4EXfxkuMaR6Rz4n4N+5DQ@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH v2 2/5] libcamera: camera_sensor:\n\tValidate Transform","From":"Jacopo Mondi via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Cc":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>,\n\tlibcamera-devel@lists.libcamera.org"}}]