From patchwork Sat Jan 14 19:47:09 2023
X-Patchwork-Submitter: Jacopo Mondi
X-Patchwork-Id: 18113
From: Jacopo Mondi
To: libcamera-devel@lists.libcamera.org
Cc: Jacopo Mondi
Date: Sat, 14 Jan 2023 20:47:09 +0100
Message-Id: <20230114194712.23272-3-jacopo.mondi@ideasonboard.com>
In-Reply-To: <20230114194712.23272-1-jacopo.mondi@ideasonboard.com>
References: <20230114194712.23272-1-jacopo.mondi@ideasonboard.com>
Subject: [libcamera-devel] [PATCH v2 2/5] libcamera: camera_sensor: Validate Transform

The two pipeline handlers that currently support Transform (IPU3 and
RaspberryPi) implement it by applying H/V flips on the image sensor.
Centralize the code that validates a Transform request against the
sensor rotation capabilities in the CameraSensor class. The
implementation in the IPU3 pipeline handler was originally copied from
the RaspberryPi one; centralizing it in CameraSensor makes it easier
for other platforms to support Transform as well.

The CameraSensor::validateTransform() implementation comes directly from
the RaspberryPi pipeline handler; no functional changes are intended.

Signed-off-by: Jacopo Mondi
Reviewed-by: David Plowman
---
 include/libcamera/internal/camera_sensor.h    |  4 ++
 src/libcamera/camera_sensor.cpp               | 65 +++++++++++++++++++
 src/libcamera/pipeline/ipu3/ipu3.cpp          | 45 ++-----------
 .../pipeline/raspberrypi/raspberrypi.cpp      | 59 ++---------------
 4 files changed, 82 insertions(+), 91 deletions(-)

diff --git a/include/libcamera/internal/camera_sensor.h b/include/libcamera/internal/camera_sensor.h
index 878f3c28a3c9..bea52badaff7 100644
--- a/include/libcamera/internal/camera_sensor.h
+++ b/include/libcamera/internal/camera_sensor.h
@@ -29,6 +29,8 @@ class BayerFormat;
 class CameraLens;
 class MediaEntity;
 
+enum class Transform;
+
 struct CameraSensorProperties;
 
 class CameraSensor : protected Loggable
@@ -68,6 +70,8 @@ public:
 
 	CameraLens *focusLens() { return focusLens_.get(); }
 
+	Transform validateTransform(Transform *transform) const;
+
 protected:
 	std::string logPrefix() const override;
 
diff --git a/src/libcamera/camera_sensor.cpp b/src/libcamera/camera_sensor.cpp
index 83ac075a4265..3518d3e3b02a 100644
--- a/src/libcamera/camera_sensor.cpp
+++ b/src/libcamera/camera_sensor.cpp
@@ -962,6 +962,71 @@ void CameraSensor::updateControlInfo()
  * connected to the sensor
  */
 
+/**
+ * \brief Validate a transform request against the sensor capabilities
+ * \param[inout] transform The requested transformation, updated to match
+ * the sensor capabilities
+ *
+ * The requested \a transform is adjusted to compensate for the sensor's
+ * mounting rotation and validated against the sensor capabilities.
+ *
+ * For example, if the requested \a transform is Transform::Identity and the
+ * sensor rotation is 180 degrees, the resulting transform returned by the
+ * function is Transform::Rot180 to automatically correct the image, but only
+ * if the sensor can actually apply horizontal and vertical flips.
+ *
+ * \return A Transform instance that represents which transformation has been
+ * applied to the camera sensor
+ */
+Transform CameraSensor::validateTransform(Transform *transform) const
+{
+	/* Adjust the requested transform to compensate for the sensor rotation. */
+	int32_t rotation = properties().get(properties::Rotation).value_or(0);
+	bool success;
+
+	Transform rotationTransform = transformFromRotation(rotation, &success);
+	if (!success)
+		LOG(CameraSensor, Warning) << "Invalid rotation of " << rotation
+					   << " degrees - ignoring";
+
+	Transform combined = *transform * rotationTransform;
+
+	/*
+	 * We combine the platform and user transform, but must "adjust away"
+	 * any combined result that includes a transposition, as we can't do
+	 * those. In this case, flipping only the transpose bit is helpful to
+	 * applications - they either get the transform they requested, or have
+	 * to do a simple transpose themselves (they don't have to worry about
+	 * the other possible cases).
+	 */
+	if (!!(combined & Transform::Transpose)) {
+		/*
+		 * Flipping the transpose bit in "transform" flips it in the
+		 * combined result too (as it's the last thing that happens),
+		 * which is of course clearing it.
+		 */
+		*transform ^= Transform::Transpose;
+		combined &= ~Transform::Transpose;
+	}
+
+	/*
+	 * We also check if the sensor doesn't do h/vflips at all, in which
+	 * case we clear them, and the application will have to do everything.
+	 */
+	if (!supportFlips_ && !!combined) {
+		/*
+		 * If the sensor can do no transforms, then combined must be
+		 * changed to the identity. The only user transform that gives
+		 * rise to this is the inverse of the rotation. (Recall that
+		 * combined = transform * rotationTransform.)
+		 */
+		*transform = -rotationTransform;
+		combined = Transform::Identity;
+	}
+
+	return combined;
+}
+
 std::string CameraSensor::logPrefix() const
 {
 	return "'" + entity_->name() + "'";
diff --git a/src/libcamera/pipeline/ipu3/ipu3.cpp b/src/libcamera/pipeline/ipu3/ipu3.cpp
index e4d79ea44aed..a424ac914859 100644
--- a/src/libcamera/pipeline/ipu3/ipu3.cpp
+++ b/src/libcamera/pipeline/ipu3/ipu3.cpp
@@ -184,48 +184,15 @@ CameraConfiguration::Status IPU3CameraConfiguration::validate()
 	if (config_.empty())
 		return Invalid;
 
-	Transform combined = transform * data_->rotationTransform_;
-
-	/*
-	 * We combine the platform and user transform, but must "adjust away"
-	 * any combined result that includes a transposition, as we can't do
-	 * those. In this case, flipping only the transpose bit is helpful to
-	 * applications - they either get the transform they requested, or have
-	 * to do a simple transpose themselves (they don't have to worry about
-	 * the other possible cases).
-	 */
-	if (!!(combined & Transform::Transpose)) {
-		/*
-		 * Flipping the transpose bit in "transform" flips it in the
-		 * combined result too (as it's the last thing that happens),
-		 * which is of course clearing it.
-		 */
-		transform ^= Transform::Transpose;
-		combined &= ~Transform::Transpose;
-		status = Adjusted;
-	}
-
 	/*
-	 * We also check if the sensor doesn't do h/vflips at all, in which
-	 * case we clear them, and the application will have to do everything.
+	 * Validate the requested transform against the sensor capabilities and
+	 * rotation and store the final combined transform that configure() will
+	 * need to apply to the sensor to save us working it out again.
 	 */
-	if (!data_->supportsFlips_ && !!combined) {
-		/*
-		 * If the sensor can do no transforms, then combined must be
-		 * changed to the identity. The only user transform that gives
-		 * rise to this is the inverse of the rotation. (Recall that
-		 * combined = transform * rotationTransform.)
-		 */
-		transform = -data_->rotationTransform_;
-		combined = Transform::Identity;
+	Transform requestedTransform = transform;
+	combinedTransform_ = data_->cio2_.sensor()->validateTransform(&transform);
+	if (transform != requestedTransform)
 		status = Adjusted;
-	}
-
-	/*
-	 * Store the final combined transform that configure() will need to
-	 * apply to the sensor to save us working it out again.
-	 */
-	combinedTransform_ = combined;
 
 	/* Cap the number of entries to the available streams. */
 	if (config_.size() > kMaxStreams) {
diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
index 8569df17976a..c086a69a913d 100644
--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
@@ -367,59 +367,14 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()
 	status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);
 
 	/*
-	 * What if the platform has a non-90 degree rotation? We can't even
-	 * "adjust" the configuration and carry on. Alternatively, raising an
-	 * error means the platform can never run. Let's just print a warning
-	 * and continue regardless; the rotation is effectively set to zero.
+	 * Validate the requested transform against the sensor capabilities and
+	 * rotation and store the final combined transform that configure() will
+	 * need to apply to the sensor to save us working it out again.
 	 */
-	int32_t rotation = data_->sensor_->properties().get(properties::Rotation).value_or(0);
-	bool success;
-	Transform rotationTransform = transformFromRotation(rotation, &success);
-	if (!success)
-		LOG(RPI, Warning) << "Invalid rotation of " << rotation
-				  << " degrees - ignoring";
-	Transform combined = transform * rotationTransform;
-
-	/*
-	 * We combine the platform and user transform, but must "adjust away"
-	 * any combined result that includes a transform, as we can't do those.
-	 * In this case, flipping only the transpose bit is helpful to
-	 * applications - they either get the transform they requested, or have
-	 * to do a simple transpose themselves (they don't have to worry about
-	 * the other possible cases).
-	 */
-	if (!!(combined & Transform::Transpose)) {
-		/*
-		 * Flipping the transpose bit in "transform" flips it in the
-		 * combined result too (as it's the last thing that happens),
-		 * which is of course clearing it.
-		 */
-		transform ^= Transform::Transpose;
-		combined &= ~Transform::Transpose;
-		status = Adjusted;
-	}
-
-	/*
-	 * We also check if the sensor doesn't do h/vflips at all, in which
-	 * case we clear them, and the application will have to do everything.
-	 */
-	if (!data_->supportsFlips_ && !!combined) {
-		/*
-		 * If the sensor can do no transforms, then combined must be
-		 * changed to the identity. The only user transform that gives
-		 * rise to this the inverse of the rotation. (Recall that
-		 * combined = transform * rotationTransform.)
-		 */
-		transform = -rotationTransform;
-		combined = Transform::Identity;
+	Transform requestedTransform = transform;
+	combinedTransform_ = data_->sensor_->validateTransform(&transform);
+	if (transform != requestedTransform)
 		status = Adjusted;
-	}
-
-	/*
-	 * Store the final combined transform that configure() will need to
-	 * apply to the sensor to save us working it out again.
-	 */
-	combinedTransform_ = combined;
 
 	unsigned int rawCount = 0, outCount = 0, count = 0, maxIndex = 0;
 	std::pair outSize[2];
@@ -454,7 +409,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()
 
 		if (data_->flipsAlterBayerOrder_) {
 			BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);
 			bayer.order = data_->nativeBayerOrder_;
-			bayer = bayer.transform(combined);
+			bayer = bayer.transform(combinedTransform_);
 			fourcc = bayer.toV4L2PixelFormat();
 		}
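
The transform handling described in the validateTransform() documentation can
be illustrated with a small standalone model. The sketch below is not part of
the patch and does not use the real libcamera Transform type, which also has a
Transpose bit and a more involved composition rule; it restricts itself to H/V
flips, where composition is a plain XOR of the flip bits and every transform is
its own inverse, to show the two cases the new IPU3 and RaspberryPi validate()
code relies on: a flip-capable sensor absorbing a 180-degree mounting rotation,
and a sensor without flip support handing the correction back to the
application. The names mirror the patch but this is a model, not libcamera code.

#include <cassert>
#include <cstdint>
#include <iostream>

/*
 * Minimal stand-in for a flip-only transform: no Transpose bit, so
 * composition is a plain XOR of the H/V flip bits and every element is
 * its own inverse.
 */
enum class Transform : uint8_t {
	Identity = 0,
	HFlip = 1,
	VFlip = 2,
	Rot180 = 3,	/* HFlip + VFlip */
};

constexpr Transform operator*(Transform a, Transform b)
{
	return static_cast<Transform>(static_cast<uint8_t>(a) ^
				      static_cast<uint8_t>(b));
}

constexpr Transform operator-(Transform t)
{
	/* H/V flips and Rot180 are involutions: each is its own inverse. */
	return t;
}

/*
 * Model of the validateTransform() logic for a sensor mounted with a 0 or
 * 180 degree rotation. 'transform' is the application request, updated in
 * place; the return value is what gets applied on the sensor.
 */
Transform validateTransform(Transform *transform, Transform rotationTransform,
			    bool supportFlips)
{
	Transform combined = *transform * rotationTransform;

	if (!supportFlips && combined != Transform::Identity) {
		/*
		 * The sensor cannot flip at all: report that nothing is
		 * applied on the sensor and hand the inverse of the rotation
		 * back to the application.
		 */
		*transform = -rotationTransform;
		combined = Transform::Identity;
	}

	return combined;
}

int main()
{
	/* Sensor mounted upside-down, application asks for Identity. */
	Transform requested = Transform::Identity;
	Transform applied = validateTransform(&requested, Transform::Rot180, true);
	assert(applied == Transform::Rot180);     /* sensor corrects the image */
	assert(requested == Transform::Identity); /* request granted unchanged */

	/* Same request, but the sensor cannot flip: the request is adjusted. */
	requested = Transform::Identity;
	applied = validateTransform(&requested, Transform::Rot180, false);
	assert(applied == Transform::Identity);   /* nothing applied on the sensor */
	assert(requested == Transform::Rot180);   /* application must rotate itself */

	std::cout << "flip-only validateTransform() model: both cases check out\n";
	return 0;
}

Built with a C++17 compiler, both assertions pass: the pipeline handlers only
need to compare the adjusted request against the original one to decide whether
to flag the configuration as Adjusted, exactly as the new validate() code does.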