[libcamera-devel,v5] libcamera: ipu3: Add rotation to ipu3 pipeline

Message ID 20210222113227.395737-1-jacopo@jmondi.org
State Accepted

Commit Message

Jacopo Mondi Feb. 22, 2021, 11:32 a.m. UTC
From: Fabian Wüthrich <me@fabwu.ch>

Use the same transformation logic as in the Raspberry Pi pipeline to
implement rotations in the IPU3 pipeline.

Tested on a Surface Book 2 with an experimental driver for OV5693.

Signed-off-by: Fabian Wüthrich <me@fabwu.ch>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
---

I have rebased and reworked this slightly:

- Rebased on master
- Remove snake_case variables
- Move the H/V flip setting from the end of configure() to just before
  the point where we bail out if the request contains only raw streams,
  so that the transform can be applied to RAW images as well
- Minor style changes such as breaking lines more often

Fabian, could you re-ack this, and, if you have a setup to do so, maybe re-test?
I'll push it with your ack!

Thanks
   j

---
 src/libcamera/pipeline/ipu3/ipu3.cpp | 87 +++++++++++++++++++++++++++-
 1 file changed, 85 insertions(+), 2 deletions(-)

--
2.30.0
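The transpose handling added to validate() can be sketched with a minimal
stand-in for libcamera's Transform flags. The enum values and helper below
are illustrative assumptions for the purpose of this sketch, not libcamera's
actual definitions:

```cpp
#include <cassert>
#include <cstdint>

/* Illustrative stand-in for libcamera's Transform bitflags. */
enum : uint8_t {
	Identity  = 0,
	HFlip     = 1 << 0,
	VFlip     = 1 << 1,
	Transpose = 1 << 2,
};

/*
 * Sketch of the "adjust away" step: if the combined user+sensor transform
 * contains a transposition (which the sensor flips cannot perform), flip
 * the Transpose bit in the user transform -- as it is the last operation
 * applied, this clears it from the combined result -- and report the
 * configuration as adjusted.
 */
bool adjustAwayTranspose(uint8_t &transform, uint8_t &combined)
{
	if (!(combined & Transpose))
		return false;

	transform ^= Transpose;
	combined &= static_cast<uint8_t>(~Transpose);
	return true;
}
```

An application that requested a transform containing a transposition gets the
same transform back with only the Transpose bit flipped, so it knows that a
single transpose on its side completes the request.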

Comments

Fabian Wüthrich Feb. 22, 2021, 7:55 p.m. UTC | #1
Hi Jacopo

I tested your changes on my Surface Book 2 and everything works fine:

Acked-by: Fabian Wüthrich <me@fabwu.ch>

Jacopo Mondi Feb. 23, 2021, 8:49 a.m. UTC | #2
On Mon, Feb 22, 2021 at 08:55:19PM +0100, Fabian Wüthrich wrote:
> Hi Jacopo
>
> I tested your changes on my Surface Book 2 and everything works fine:
>
> Acked-by: Fabian Wüthrich <me@fabwu.ch>

Thanks, finally pushed!


Patch

diff --git a/src/libcamera/pipeline/ipu3/ipu3.cpp b/src/libcamera/pipeline/ipu3/ipu3.cpp
index b8a655ce0b68..0561a4610007 100644
--- a/src/libcamera/pipeline/ipu3/ipu3.cpp
+++ b/src/libcamera/pipeline/ipu3/ipu3.cpp
@@ -16,6 +16,7 @@ 
 #include <libcamera/formats.h>
 #include <libcamera/ipa/ipu3_ipa_interface.h>
 #include <libcamera/ipa/ipu3_ipa_proxy.h>
+#include <libcamera/property_ids.h>
 #include <libcamera/request.h>
 #include <libcamera/stream.h>

@@ -75,6 +76,9 @@  public:

 	uint32_t exposureTime_;
 	Rectangle cropRegion_;
+	bool supportsFlips_;
+	Transform rotationTransform_;
+
 	std::unique_ptr<DelayedControls> delayedCtrls_;
 	IPU3Frames frameInfos_;

@@ -95,6 +99,9 @@  public:
 	const StreamConfiguration &cio2Format() const { return cio2Configuration_; }
 	const ImgUDevice::PipeConfig imguConfig() const { return pipeConfig_; }

+	/* Cache the combinedTransform_ that will be applied to the sensor */
+	Transform combinedTransform_;
+
 private:
 	/*
 	 * The IPU3CameraData instance is guaranteed to be valid as long as the
@@ -167,11 +174,49 @@  CameraConfiguration::Status IPU3CameraConfiguration::validate()
 	if (config_.empty())
 		return Invalid;

-	if (transform != Transform::Identity) {
-		transform = Transform::Identity;
+	Transform combined = transform * data_->rotationTransform_;
+
+	/*
+	 * We combine the platform and user transform, but must "adjust away"
+	 * any combined result that includes a transposition, as we can't do
+	 * those. In this case, flipping only the transpose bit is helpful to
+	 * applications - they either get the transform they requested, or have
+	 * to do a simple transpose themselves (they don't have to worry about
+	 * the other possible cases).
+	 */
+	if (!!(combined & Transform::Transpose)) {
+		/*
+		 * Flipping the transpose bit in "transform" flips it in the
+		 * combined result too (as it's the last thing that happens),
+		 * which is of course clearing it.
+		 */
+		transform ^= Transform::Transpose;
+		combined &= ~Transform::Transpose;
+		status = Adjusted;
+	}
+
+	/*
+	 * We also check if the sensor doesn't do h/vflips at all, in which
+	 * case we clear them, and the application will have to do everything.
+	 */
+	if (!data_->supportsFlips_ && !!combined) {
+		/*
+		 * If the sensor can do no transforms, then combined must be
+		 * changed to the identity. The only user transform that gives
+		 * rise to this is the inverse of the rotation. (Recall that
+		 * combined = transform * rotationTransform.)
+		 */
+		transform = -data_->rotationTransform_;
+		combined = Transform::Identity;
 		status = Adjusted;
 	}

+	/*
+	 * Store the final combined transform that configure() will need to
+	 * apply to the sensor to save us working it out again.
+	 */
+	combinedTransform_ = combined;
+
 	/* Cap the number of entries to the available streams. */
 	if (config_.size() > IPU3_MAX_STREAMS) {
 		config_.resize(IPU3_MAX_STREAMS);
@@ -503,6 +548,24 @@  int PipelineHandlerIPU3::configure(Camera *camera, CameraConfiguration *c)
 	cio2->sensor()->sensorInfo(&sensorInfo);
 	data->cropRegion_ = sensorInfo.analogCrop;

+	/*
+	 * Configure the H/V flip controls based on the combination of
+	 * the sensor and user transform.
+	 */
+	if (data->supportsFlips_) {
+		ControlList sensorCtrls(cio2->sensor()->controls());
+		sensorCtrls.set(V4L2_CID_HFLIP,
+				static_cast<int32_t>(!!(config->combinedTransform_
+							& Transform::HFlip)));
+		sensorCtrls.set(V4L2_CID_VFLIP,
+				static_cast<int32_t>(!!(config->combinedTransform_
+						        & Transform::VFlip)));
+
+		ret = cio2->sensor()->setControls(&sensorCtrls);
+		if (ret)
+			return ret;
+	}
+
 	/*
 	 * If the ImgU gets configured, its driver seems to expect that
 	 * buffers will be queued to its outputs, as otherwise the next
@@ -980,6 +1043,26 @@  int PipelineHandlerIPU3::registerCameras()
 		data->cio2_.frameStart().connect(data->delayedCtrls_.get(),
 						 &DelayedControls::applyControls);

+		/* Convert the sensor rotation to a transformation */
+		int32_t rotation = 0;
+		if (data->properties_.contains(properties::Rotation))
+			rotation = data->properties_.get(properties::Rotation);
+		else
+			LOG(IPU3, Warning) << "Rotation control not exposed by "
+					   << cio2->sensor()->id()
+					   << ". Assume rotation 0";
+
+		bool success;
+		data->rotationTransform_ = transformFromRotation(rotation, &success);
+		if (!success)
+			LOG(IPU3, Warning) << "Invalid rotation of " << rotation
+					   << " degrees: ignoring";
+
+		ControlList ctrls = cio2->sensor()->getControls({ V4L2_CID_HFLIP });
+		if (!ctrls.empty())
+			/* We assume it will support vflips too... */
+			data->supportsFlips_ = true;
+
 		/**
 		 * \todo Dynamically assign ImgU and output devices to each
 		 * stream and camera; as of now, limit support to two cameras
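The rotation-to-transform conversion used in registerCameras() can be
sketched for the two angles that need no transposition. The helper name and
its identity fallback are assumptions made for this sketch; libcamera's real
transformFromRotation() also handles 90/270 degrees via a transpose:

```cpp
#include <cassert>

/* Illustrative flip bits; not libcamera's actual Transform enum. */
enum : unsigned { Identity = 0, HFlip = 1 << 0, VFlip = 1 << 1 };

/*
 * Simplified rotation-to-transform mapping. Only 0 and 180 degrees are
 * handled because neither requires a transposition; a 180-degree rotation
 * is equivalent to flipping both axes. Unknown angles report failure and
 * fall back to the identity, matching the pipeline's
 * "Invalid rotation ... ignoring" path.
 */
unsigned transformFromRotationSketch(int angle, bool *success)
{
	angle = ((angle % 360) + 360) % 360; /* normalise to [0, 360) */

	switch (angle) {
	case 0:
		*success = true;
		return Identity;
	case 180:
		*success = true;
		return HFlip | VFlip;
	default:
		*success = false;
		return Identity;
	}
}
```

With this mapping, a sensor mounted upside down (Rotation = 180) yields both
flip bits set, which configure() would then translate into the V4L2_CID_HFLIP
and V4L2_CID_VFLIP controls.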