[libcamera-devel,v2,2/3] libcamera: raspberrypi: Add control of sensor vblanking

Message ID 20200513091120.1653-3-naush@raspberrypi.com
State Superseded
Series
  • raspberrypi: FPS control

Commit Message

Naushir Patuck May 13, 2020, 9:11 a.m. UTC
Add support for setting V4L2_CID_VBLANK appropriately when setting
V4L2_CID_EXPOSURE. This will allow adaptive framerates during
viewfinder use cases (e.g. when the exposure time goes above 33ms, we
can reduce the framerate to lower than 30fps).
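
The relationship described in the commit message can be sketched numerically (a minimal illustration with assumed values, not code from the patch):

```cpp
#include <algorithm>

/*
 * Illustrative only: when the exposure time exceeds the nominal frame
 * duration, the frame must lengthen and the frame rate drops. All values
 * are in microseconds; 33333 us corresponds to the nominal 30fps frame.
 */
double frameRateForExposure(double exposureUs, double nominalFrameUs)
{
	double frameUs = std::max(exposureUs, nominalFrameUs);
	return 1e6 / frameUs;
}
```

For example, a 50ms exposure forces the frame duration to 50ms, i.e. 20fps, while any exposure under 33.3ms leaves the sensor at 30fps.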

The minimum and maximum frame durations are provided via libcamera
controls, and will prioritise exposure time limits over any AGC request.

V4L2_CID_VBLANK is controlled through the staggered writer, just like
the exposure and gain controls.

Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
---
 include/ipa/raspberrypi.h                     |  1 +
 src/ipa/raspberrypi/cam_helper.cpp            | 38 ++++++++++++++-
 src/ipa/raspberrypi/cam_helper.hpp            | 15 +++++-
 src/ipa/raspberrypi/cam_helper_imx219.cpp     | 11 ++++-
 src/ipa/raspberrypi/cam_helper_imx477.cpp     | 11 ++++-
 src/ipa/raspberrypi/cam_helper_ov5647.cpp     | 11 ++++-
 src/ipa/raspberrypi/raspberrypi.cpp           | 48 ++++++++++++++++---
 .../pipeline/raspberrypi/raspberrypi.cpp      |  2 +
 8 files changed, 124 insertions(+), 13 deletions(-)

Comments

Laurent Pinchart May 24, 2020, 11:16 p.m. UTC | #1
Hi Naush,

Thank you for the patch.

On Wed, May 13, 2020 at 10:11:19AM +0100, Naushir Patuck wrote:
> Add support for setting V4L2_CID_VBLANK appropriately when setting
> V4L2_CID_EXPOSURE. This will allow adaptive framerates during
> viewfinder use cases (e.g. when the exposure time goes above 33ms, we
> can reduce the framerate to lower than 30fps).
> 
> The minimum and maximum frame durations are provided via libcamera
> controls, and will prioritise exposure time limits over any AGC request.

General question about the AGC and frame duration selection behaviour.
When the scene illumination drops and the application allows the frame
rate to be lowered, is it best to adapt the frame duration continuously,
or jump from, let's say, 30fps to 15fps and stay at 15fps ? I could
imagine adaptive frame durations to be annoying to handle for codecs
or displays, but I probably lack a good view of the details.

> V4L2_CID_VBLANK is controlled through the staggered writer, just like
> the exposure and gain controls.
> 
> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> ---
>  include/ipa/raspberrypi.h                     |  1 +
>  src/ipa/raspberrypi/cam_helper.cpp            | 38 ++++++++++++++-
>  src/ipa/raspberrypi/cam_helper.hpp            | 15 +++++-
>  src/ipa/raspberrypi/cam_helper_imx219.cpp     | 11 ++++-
>  src/ipa/raspberrypi/cam_helper_imx477.cpp     | 11 ++++-
>  src/ipa/raspberrypi/cam_helper_ov5647.cpp     | 11 ++++-
>  src/ipa/raspberrypi/raspberrypi.cpp           | 48 ++++++++++++++++---
>  .../pipeline/raspberrypi/raspberrypi.cpp      |  2 +
>  8 files changed, 124 insertions(+), 13 deletions(-)
> 
> diff --git a/include/ipa/raspberrypi.h b/include/ipa/raspberrypi.h
> index c109469e..74954db2 100644
> --- a/include/ipa/raspberrypi.h
> +++ b/include/ipa/raspberrypi.h
> @@ -51,6 +51,7 @@ static const ControlInfoMap RPiControls = {
>  	{ &controls::Brightness, ControlInfo(-1.0f, 1.0f) },
>  	{ &controls::Contrast, ControlInfo(0.0f, 32.0f) },
>  	{ &controls::Saturation, ControlInfo(0.0f, 32.0f) },
> +	{ &controls::FrameDurations, ControlInfo(1.0e3f, 1.0e9f) }
>  };
>  
>  } /* namespace libcamera */
> diff --git a/src/ipa/raspberrypi/cam_helper.cpp b/src/ipa/raspberrypi/cam_helper.cpp
> index 7f05d2c6..06732241 100644
> --- a/src/ipa/raspberrypi/cam_helper.cpp
> +++ b/src/ipa/raspberrypi/cam_helper.cpp
> @@ -34,8 +34,10 @@ CamHelper *CamHelper::Create(std::string const &cam_name)
>  	return nullptr;
>  }
>  
> -CamHelper::CamHelper(MdParser *parser)
> -	: parser_(parser), initialized_(false)
> +CamHelper::CamHelper(MdParser *parser, unsigned int maxFrameLength,
> +		     unsigned int frameIntegrationDiff)
> +	: parser_(parser), initialized_(false), maxFrameLength_(maxFrameLength),
> +	  frameIntegrationDiff_(frameIntegrationDiff)
>  {
>  }
>  
> @@ -56,6 +58,38 @@ double CamHelper::Exposure(uint32_t exposure_lines) const
>  	return exposure_lines * mode_.line_length / 1000.0;
>  }
>  
> +uint32_t CamHelper::GetVBlanking(double &exposure, double minFrameDuration,
> +				 double maxFrameDuration) const
> +{
> +	uint32_t frameLengthMin, frameLengthMax, vblank;
> +	uint32_t exposureLines = ExposureLines(exposure);
> +
> +	assert(initialized_);
> +	/*
> +	 * Clip frame length by the frame duration range and the maximum allowable
> +	 * value in the sensor, given by maxFrameLength_.
> +	 */
> +	frameLengthMin = std::min<uint32_t>(1e3 * minFrameDuration / mode_.line_length,
> +					    maxFrameLength_);
> +	frameLengthMax = std::min<uint32_t>(1e3 * maxFrameDuration / mode_.line_length,
> +					    maxFrameLength_);

Not something to be addressed as part of this series, but as we divide
and multiply by 1000 in various places, should we specify all duration
controls in nanoseconds instead of microseconds ?

> +	/*
> +	 * Limit the exposure to the maximum frame duration requested, and
> +	 * re-calculate if it has been clipped.
> +	 */
> +	exposureLines = std::min(frameLengthMax - frameIntegrationDiff_,
> +				 exposureLines);
> +	exposure = Exposure(exposureLines);
> +
> +	/* Limit the vblank to the range allowed by the frame length limits. */
> +	vblank = std::max<uint32_t>(mode_.height, exposureLines) -
> +		 mode_.height + frameIntegrationDiff_;

If the exposure time is shorter than the frame height, do we need the
frameIntegrationDiff_ margin ? Assuming a sensor that would have a
minimum vblank of 0 and a frameIntegrationDiff_ of 40, if height was
1080 and exposureLines 500, there should be no need to set vblank to 40,
right ?

> +	vblank = std::max(vblank, frameLengthMin - mode_.height);
> +	vblank = std::min(vblank, frameLengthMax - mode_.height);

You can use

	vblank = utils::clamp(vblank, frameLengthMin - mode_.height,
			      frameLengthMax - mode_.height);

> +
> +	return vblank;
> +}
> +
>  void CamHelper::SetCameraMode(const CameraMode &mode)
>  {
>  	mode_ = mode;
> diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp
> index 0c8aa29a..fdcdbddb 100644
> --- a/src/ipa/raspberrypi/cam_helper.hpp
> +++ b/src/ipa/raspberrypi/cam_helper.hpp
> @@ -68,12 +68,15 @@ class CamHelper
>  {
>  public:
>  	static CamHelper *Create(std::string const &cam_name);
> -	CamHelper(MdParser *parser);
> +	CamHelper(MdParser *parser, unsigned int maxFrameLength,
> +		  unsigned int frameIntegrationDiff);
>  	virtual ~CamHelper();
>  	void SetCameraMode(const CameraMode &mode);
>  	MdParser &Parser() const { return *parser_; }
>  	uint32_t ExposureLines(double exposure_us) const;
>  	double Exposure(uint32_t exposure_lines) const; // in us
> +	virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,
> +				      double maxFrameDuration) const;

Does this function need to be virtual ?

>  	virtual uint32_t GainCode(double gain) const = 0;
>  	virtual double Gain(uint32_t gain_code) const = 0;
>  	virtual void GetDelays(int &exposure_delay, int &gain_delay) const;
> @@ -83,10 +86,20 @@ public:
>  	virtual unsigned int MistrustFramesStartup() const;
>  	virtual unsigned int MistrustFramesModeSwitch() const;
>  	virtual CamTransform GetOrientation() const;
> +
>  protected:
>  	MdParser *parser_;
>  	CameraMode mode_;
> +
> +private:
>  	bool initialized_;
> +	/* Largest possible frame length, in units of lines. */
> +	unsigned int maxFrameLength_;
> +	/*
> +	 * Smallest difference between the frame length and integration time,
> +	 * in units of lines.
> +	 */
> +	unsigned int frameIntegrationDiff_;
>  };
>  
>  // This is for registering camera helpers with the system, so that the
> diff --git a/src/ipa/raspberrypi/cam_helper_imx219.cpp b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> index 35c6597c..fd95a31a 100644
> --- a/src/ipa/raspberrypi/cam_helper_imx219.cpp
> +++ b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> @@ -50,13 +50,22 @@ public:
>  	unsigned int MistrustFramesModeSwitch() const override;
>  	bool SensorEmbeddedDataPresent() const override;
>  	CamTransform GetOrientation() const override;
> +
> +private:
> +	/*
> +	 * Smallest difference between the frame length and integration time,
> +	 * in units of lines.
> +	 */
> +	static constexpr int frameIntegrationDiff = 4;
> +	/* Largest possible frame length, in units of lines. */
> +	static constexpr int maxFrameLength = 0xffff;
>  };
>  
>  CamHelperImx219::CamHelperImx219()
>  #if ENABLE_EMBEDDED_DATA
>  	: CamHelper(new MdParserImx219())

Shouldn't this be updated too ?

>  #else
> -	: CamHelper(new MdParserRPi())
> +	: CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
>  #endif
>  {
>  }
> diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> index 69544456..4a1cab76 100644
> --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> @@ -39,10 +39,19 @@ public:
>  	double Gain(uint32_t gain_code) const override;
>  	bool SensorEmbeddedDataPresent() const override;
>  	CamTransform GetOrientation() const override;
> +
> +private:
> +	/*
> +	 * Smallest difference between the frame length and integration time,
> +	 * in units of lines.
> +	 */
> +	static constexpr int frameIntegrationDiff = 22;
> +	/* Largest possible frame length, in units of lines. */
> +	static constexpr int maxFrameLength = 0xffdc;
>  };
>  
>  CamHelperImx477::CamHelperImx477()
> -	: CamHelper(new MdParserImx477())
> +	: CamHelper(new MdParserImx477(), maxFrameLength, frameIntegrationDiff)
>  {
>  }
>  
> diff --git a/src/ipa/raspberrypi/cam_helper_ov5647.cpp b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> index 3dbcb164..d814fa90 100644
> --- a/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> +++ b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> @@ -22,6 +22,15 @@ public:
>  	unsigned int HideFramesModeSwitch() const override;
>  	unsigned int MistrustFramesStartup() const override;
>  	unsigned int MistrustFramesModeSwitch() const override;
> +
> +private:
> +	/*
> +	 * Smallest difference between the frame length and integration time,
> +	 * in units of lines.
> +	 */
> +	static constexpr int frameIntegrationDiff = 4;
> +	/* Largest possible frame length, in units of lines. */
> +	static constexpr int maxFrameLength = 0xffff;
>  };
>  
>  /*
> @@ -30,7 +39,7 @@ public:
>   */
>  
>  CamHelperOv5647::CamHelperOv5647()
> -	: CamHelper(new MdParserRPi())
> +	: CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
>  {
>  }
>  
> diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> index 3bcc0815..1af5e74a 100644
> --- a/src/ipa/raspberrypi/raspberrypi.cpp
> +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> @@ -53,6 +53,8 @@ namespace libcamera {
>  /* Configure the sensor with these values initially. */
>  #define DEFAULT_ANALOGUE_GAIN 1.0
>  #define DEFAULT_EXPOSURE_TIME 20000
> +#define DEFAULT_MIN_FRAME_DURATION (1e6 / 30.0)
> +#define DEFAULT_MAX_FRAME_DURATION (1e6 / 0.01)
>  
>  LOG_DEFINE_CATEGORY(IPARPI)
>  
> @@ -136,6 +138,9 @@ private:
>  	/* LS table allocation passed in from the pipeline handler. */
>  	uint32_t lsTableHandle_;
>  	void *lsTable_;
> +
> +	/* Frame duration (1/fps) given in microseconds. */
> +	double minFrameDuration_, maxFrameDuration_;

Shouldn't these be float, or uint32_t depending on the outcome of the
discussion for patch 1/3 ?

>  };
>  
>  int IPARPi::init(const IPASettings &settings)
> @@ -252,13 +257,20 @@ void IPARPi::configure(const CameraSensorInfo &sensorInfo,
>  		controller_.Initialise();
>  		controllerInit_ = true;
>  
> -		/* Calculate initial values for gain and exposure. */
> +		/* Calculate initial values for gain, vblank, and exposure */
> +		minFrameDuration_ = DEFAULT_MIN_FRAME_DURATION;
> +		maxFrameDuration_ = DEFAULT_MAX_FRAME_DURATION;
> +
> +		double exposure = DEFAULT_EXPOSURE_TIME;
> +		int32_t vblank = helper_->GetVBlanking(exposure, minFrameDuration_,
> +						       maxFrameDuration_);
> +		int32_t exposure_lines = helper_->ExposureLines(exposure);
>  		int32_t gain_code = helper_->GainCode(DEFAULT_ANALOGUE_GAIN);
> -		int32_t exposure_lines = helper_->ExposureLines(DEFAULT_EXPOSURE_TIME);
>  
>  		ControlList ctrls(unicam_ctrls_);
> -		ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> +		ctrls.set(V4L2_CID_VBLANK, vblank);
>  		ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> +		ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
>  
>  		IPAOperationData op;
>  		op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
> @@ -630,6 +642,17 @@ void IPARPi::queueRequest(const ControlList &controls)
>  			break;
>  		}
>  
> +		case controls::FRAME_DURATIONS: {
> +			auto frameDurations = ctrl.second.get<Span<const float>>();
> +
> +			/* This will be applied once AGC recalculations occur. */
> +			minFrameDuration_ = frameDurations[0];
> +			maxFrameDuration_ = frameDurations[1];

Should the values be adjusted based on the sensor capabilities ?

> +			libcameraMetadata_.set(controls::FrameDurations,
> +					       { frameDurations[0], frameDurations[1] });

Hmmm... From an application point of view, wouldn't it be more useful to
know what frame duration was actually used for the current frame ? You
don't have to implement this as part of this series, but I could imagine
a FrameDuration metadata control being useful for that purpose.
FrameDurations would be set by applications, and FrameDuration would be
reported through metadata. (On a side note, would FrameDurationLimits be
a better name than FrameDuration then ?)

> +			break;
> +		}
> +
>  		default:
>  			LOG(IPARPI, Warning)
>  				<< "Ctrl " << controls::controls.at(ctrl.first)->name()
> @@ -795,7 +818,12 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
>  	op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
>  
>  	int32_t gain_code = helper_->GainCode(agcStatus->analogue_gain);
> -	int32_t exposure_lines = helper_->ExposureLines(agcStatus->shutter_time);
> +
> +	/* GetVBlanking might clip exposure time to the fps limits. */
> +	double exposure = agcStatus->shutter_time;
> +	int32_t vblanking = helper_->GetVBlanking(exposure, minFrameDuration_,
> +						  maxFrameDuration_);

I had completely missed, when reviewing the GetVBlanking()
implementation above, that the first parameter was a reference and was
used to return a value. We tend in libcamera to use const references for
input parameters, and pointers for output or in/out parameters to make
that clearer. It's not ideal though, as a pointer can be null, and using
references when a parameter should not be null is nicer. No need to
change the code here, but maybe a comment on top of GetVBlanking() would
be useful to point out that the exposure time can be adjusted.
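
The convention mentioned here can be illustrated with a hedged sketch (the signature and names below are hypothetical, not a proposal for the actual API):

```cpp
#include <cstdint>

/*
 * Hypothetical variant of GetVBlanking() following the convention
 * described above: the in/out exposure is passed by pointer, so the
 * possible adjustment is visible at the call site (&exposure). The real
 * vblank computation is elided; the return value is a placeholder.
 */
uint32_t getVBlankingSketch(double *exposure, double maxExposure)
{
	if (*exposure > maxExposure)
		*exposure = maxExposure;	/* clipping is explicit */
	return 0;
}
```

At the call site, `getVBlankingSketch(&exposure, ...)` makes it obvious that `exposure` may come back modified, which is the readability benefit being discussed.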

> +	int32_t exposure_lines = helper_->ExposureLines(exposure);
>  
>  	if (unicam_ctrls_.find(V4L2_CID_ANALOGUE_GAIN) == unicam_ctrls_.end()) {
>  		LOG(IPARPI, Error) << "Can't find analogue gain control";
> @@ -807,14 +835,20 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
>  		return;
>  	}
>  
> -	LOG(IPARPI, Debug) << "Applying AGC Exposure: " << agcStatus->shutter_time
> -			   << " (Shutter lines: " << exposure_lines << ") Gain: "
> +	LOG(IPARPI, Debug) << "Applying AGC Exposure: " << exposure
> +			   << " (Shutter lines: " << exposure_lines << ", AGC requested "
> +			   << agcStatus->shutter_time << ") Gain: "
>  			   << agcStatus->analogue_gain << " (Gain Code: "
>  			   << gain_code << ")";

Could the AGC algorithm get confused if it requests a certain exposure,
and a smaller value is set in the sensor ? Will it keep trying to push
the exposure up to increase the luminance of the scene without
noticing it's pointless ?

>  
>  	ControlList ctrls(unicam_ctrls_);
> -	ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> +	/*
> +	 * VBLANK must be set before EXPOSURE as the former will adjust the
> +	 * limits of the latter control.
> +	 */

This will be nasty on the V4L2 side. We write multiple controls with a
single VIDIOC_S_EXT_CTRLS call, and the V4L2 control framework will
first clamp all control values to the limits before setting any control.
The limits for the exposure time will thus not be adjusted before its
value is clamped :-( I'm not sure how to best solve this. Ideally the
problem should be fixed in the control framework, but I doubt that will
happen. We need to at least ask Hans to see what options we have (or
rather, what options we don't have). One easy hack may be to report a
large range for the exposure time limits in the camera sensor drivers,
and adjust the value in .s_ctrl() instead of relying on the control
framework to do that for us.
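
The clamping-order problem described here can be made concrete with a toy model of the control framework (a simplified illustration with assumed limits, not the actual V4L2 code):

```cpp
#include <algorithm>

/*
 * Toy model of the behaviour described above: in a single batched set
 * (like VIDIOC_S_EXT_CTRLS), every value is clamped against the current
 * limits before any control is applied, so an EXPOSURE that relies on
 * the new VBLANK is clamped against the old frame length. All numbers
 * are illustrative.
 */
struct ToyControls {
	int height = 1080;	/* active lines */
	int vblank = 0;
	int exposure = 0;

	int exposureMax() const { return height + vblank; }

	/* One batched call: clamp everything first, then apply. */
	void setBatch(int newVblank, int newExposure)
	{
		int clampedExposure = std::min(newExposure, exposureMax());
		vblank = newVblank;	/* the limit only moves here */
		exposure = clampedExposure;
	}
};
```

In this model, setting VBLANK and EXPOSURE in one batch clamps the exposure against the old limit, while issuing two batches (VBLANK first) obtains the requested value — which is why splitting the calls, or adjusting the value in `.s_ctrl()` as suggested, sidesteps the problem.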

> +	ctrls.set(V4L2_CID_VBLANK, vblanking);
>  	ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> +	ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
>  	op.controls.push_back(ctrls);
>  	queueFrameAction.emit(0, op);
>  }
> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> index 21a1d7f7..948290e2 100644
> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> @@ -1159,7 +1159,9 @@ void RPiCameraData::queueFrameAction(unsigned int frame, const IPAOperationData
>  		if (!staggeredCtrl_) {
>  			staggeredCtrl_.init(unicam_[Unicam::Image].dev(),
>  					    { { V4L2_CID_ANALOGUE_GAIN, action.data[0] },
> +					      { V4L2_CID_VBLANK, action.data[1] },
>  					      { V4L2_CID_EXPOSURE, action.data[1] } });

So the initial value of the exposure time and the vertical blanking are
always identical ?

> +
>  			sensorMetadata_ = action.data[2];
>  		}
>
Naushir Patuck May 26, 2020, 9:29 a.m. UTC | #2
Hi Laurent,

Thank you for the comments.

On Mon, 25 May 2020 at 00:16, Laurent Pinchart
<laurent.pinchart@ideasonboard.com> wrote:
>
> Hi Naush,
>
> Thank you for the patch.
>
> On Wed, May 13, 2020 at 10:11:19AM +0100, Naushir Patuck wrote:
> > Add support for setting V4L2_CID_VBLANK appropriately when setting
> > V4L2_CID_EXPOSURE. This will allow adaptive framerates during
> > viewfinder use cases (e.g. when the exposure time goes above 33ms, we
> > can reduce the framerate to lower than 30fps).
> >
> > The minimum and maximum frame durations are provided via libcamera
> > controls, and will prioritise exposure time limits over any AGC request.
>
> General question about the AGC and frame duration selection behaviour.
> When the scene illumination drops and the application allows the frame
> rate to be lowered, is it best to adapt the frame duration continuously,
> or jump from, let's say, 30fps to 15fps and stay at 15fps ? I could
> imagine adaptive frame durations to be annoying to handle for codecs
> or displays, but I probably lack a good view of the details.

Traditionally, we had always allowed the fps to gradually drop with
exposure during viewfinder use cases.  This would provide a better
user experience (in my opinion) over a step change to the fps.  It
also ensures we run with the maximum possible framerate whenever we
can.  However, during video encoding, we have always left the fps at a
fixed rate, thereby limiting the maximum exposure time allowed.  I
guess it is essentially up to the AGC algorithm to decide how it wants
to handle it depending on the use case.

> > V4L2_CID_VBLANK is controlled through the staggered writer, just like
> > the exposure and gain controls.
> >
> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > ---
> >  include/ipa/raspberrypi.h                     |  1 +
> >  src/ipa/raspberrypi/cam_helper.cpp            | 38 ++++++++++++++-
> >  src/ipa/raspberrypi/cam_helper.hpp            | 15 +++++-
> >  src/ipa/raspberrypi/cam_helper_imx219.cpp     | 11 ++++-
> >  src/ipa/raspberrypi/cam_helper_imx477.cpp     | 11 ++++-
> >  src/ipa/raspberrypi/cam_helper_ov5647.cpp     | 11 ++++-
> >  src/ipa/raspberrypi/raspberrypi.cpp           | 48 ++++++++++++++++---
> >  .../pipeline/raspberrypi/raspberrypi.cpp      |  2 +
> >  8 files changed, 124 insertions(+), 13 deletions(-)
> >
> > diff --git a/include/ipa/raspberrypi.h b/include/ipa/raspberrypi.h
> > index c109469e..74954db2 100644
> > --- a/include/ipa/raspberrypi.h
> > +++ b/include/ipa/raspberrypi.h
> > @@ -51,6 +51,7 @@ static const ControlInfoMap RPiControls = {
> >       { &controls::Brightness, ControlInfo(-1.0f, 1.0f) },
> >       { &controls::Contrast, ControlInfo(0.0f, 32.0f) },
> >       { &controls::Saturation, ControlInfo(0.0f, 32.0f) },
> > +     { &controls::FrameDurations, ControlInfo(1.0e3f, 1.0e9f) }
> >  };
> >
> >  } /* namespace libcamera */
> > diff --git a/src/ipa/raspberrypi/cam_helper.cpp b/src/ipa/raspberrypi/cam_helper.cpp
> > index 7f05d2c6..06732241 100644
> > --- a/src/ipa/raspberrypi/cam_helper.cpp
> > +++ b/src/ipa/raspberrypi/cam_helper.cpp
> > @@ -34,8 +34,10 @@ CamHelper *CamHelper::Create(std::string const &cam_name)
> >       return nullptr;
> >  }
> >
> > -CamHelper::CamHelper(MdParser *parser)
> > -     : parser_(parser), initialized_(false)
> > +CamHelper::CamHelper(MdParser *parser, unsigned int maxFrameLength,
> > +                  unsigned int frameIntegrationDiff)
> > +     : parser_(parser), initialized_(false), maxFrameLength_(maxFrameLength),
> > +       frameIntegrationDiff_(frameIntegrationDiff)
> >  {
> >  }
> >
> > @@ -56,6 +58,38 @@ double CamHelper::Exposure(uint32_t exposure_lines) const
> >       return exposure_lines * mode_.line_length / 1000.0;
> >  }
> >
> > +uint32_t CamHelper::GetVBlanking(double &exposure, double minFrameDuration,
> > +                              double maxFrameDuration) const
> > +{
> > +     uint32_t frameLengthMin, frameLengthMax, vblank;
> > +     uint32_t exposureLines = ExposureLines(exposure);
> > +
> > +     assert(initialized_);
> > +     /*
> > +      * Clip frame length by the frame duration range and the maximum allowable
> > +      * value in the sensor, given by maxFrameLength_.
> > +      */
> > +     frameLengthMin = std::min<uint32_t>(1e3 * minFrameDuration / mode_.line_length,
> > +                                         maxFrameLength_);
> > +     frameLengthMax = std::min<uint32_t>(1e3 * maxFrameDuration / mode_.line_length,
> > +                                         maxFrameLength_);
>
> Not something to be addressed as part of this series, but as we divide
> and multiply by 1000 in various places, should we specify all duration
> controls in nanoseconds instead of microseconds ?

Good question :)  I think microseconds is more "user friendly", but I
do see the point that we convert to nanoseconds everywhere.  My
suggestion would be to leave it as microseconds for the user-facing
controls.

>
> > +     /*
> > +      * Limit the exposure to the maximum frame duration requested, and
> > +      * re-calculate if it has been clipped.
> > +      */
> > +     exposureLines = std::min(frameLengthMax - frameIntegrationDiff_,
> > +                              exposureLines);
> > +     exposure = Exposure(exposureLines);
> > +
> > +     /* Limit the vblank to the range allowed by the frame length limits. */
> > +     vblank = std::max<uint32_t>(mode_.height, exposureLines) -
> > +              mode_.height + frameIntegrationDiff_;
>
> If the exposure time is shorter than the frame height, do we need the
> frameIntegrationDiff_ margin ? Assuming a sensor that would have a
> minimum vblank of 0 and a frameIntegrationDiff_ of 40, if height was
> 1080 and exposureLines 500, there should be no need to set vblank to 40,
> right ?

Yes, you are correct.  Will fix this calculation.

>
> > +     vblank = std::max(vblank, frameLengthMin - mode_.height);
> > +     vblank = std::min(vblank, frameLengthMax - mode_.height);
>
> You can use
>
>         vblank = utils::clamp(vblank, frameLengthMin - mode_.height,
>                               frameLengthMax - mode_.height);

Ack.

> > +
> > +     return vblank;
> > +}
> > +
> >  void CamHelper::SetCameraMode(const CameraMode &mode)
> >  {
> >       mode_ = mode;
> > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp
> > index 0c8aa29a..fdcdbddb 100644
> > --- a/src/ipa/raspberrypi/cam_helper.hpp
> > +++ b/src/ipa/raspberrypi/cam_helper.hpp
> > @@ -68,12 +68,15 @@ class CamHelper
> >  {
> >  public:
> >       static CamHelper *Create(std::string const &cam_name);
> > -     CamHelper(MdParser *parser);
> > +     CamHelper(MdParser *parser, unsigned int maxFrameLength,
> > +               unsigned int frameIntegrationDiff);
> >       virtual ~CamHelper();
> >       void SetCameraMode(const CameraMode &mode);
> >       MdParser &Parser() const { return *parser_; }
> >       uint32_t ExposureLines(double exposure_us) const;
> >       double Exposure(uint32_t exposure_lines) const; // in us
> > +     virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,
> > +                                   double maxFrameDuration) const;
>
> Does this function need to be virtual ?

I'm not sure :)

I wanted to be flexible enough that if a sensor wanted to do something
a bit different from the norm, it could override the base
implementation.  I don't have a particular example in mind though.

>
> >       virtual uint32_t GainCode(double gain) const = 0;
> >       virtual double Gain(uint32_t gain_code) const = 0;
> >       virtual void GetDelays(int &exposure_delay, int &gain_delay) const;
> > @@ -83,10 +86,20 @@ public:
> >       virtual unsigned int MistrustFramesStartup() const;
> >       virtual unsigned int MistrustFramesModeSwitch() const;
> >       virtual CamTransform GetOrientation() const;
> > +
> >  protected:
> >       MdParser *parser_;
> >       CameraMode mode_;
> > +
> > +private:
> >       bool initialized_;
> > +     /* Largest possible frame length, in units of lines. */
> > +     unsigned int maxFrameLength_;
> > +     /*
> > +      * Smallest difference between the frame length and integration time,
> > +      * in units of lines.
> > +      */
> > +     unsigned int frameIntegrationDiff_;
> >  };
> >
> >  // This is for registering camera helpers with the system, so that the
> > diff --git a/src/ipa/raspberrypi/cam_helper_imx219.cpp b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > index 35c6597c..fd95a31a 100644
> > --- a/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > +++ b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > @@ -50,13 +50,22 @@ public:
> >       unsigned int MistrustFramesModeSwitch() const override;
> >       bool SensorEmbeddedDataPresent() const override;
> >       CamTransform GetOrientation() const override;
> > +
> > +private:
> > +     /*
> > +      * Smallest difference between the frame length and integration time,
> > +      * in units of lines.
> > +      */
> > +     static constexpr int frameIntegrationDiff = 4;
> > +     /* Largest possible frame length, in units of lines. */
> > +     static constexpr int maxFrameLength = 0xffff;
> >  };
> >
> >  CamHelperImx219::CamHelperImx219()
> >  #if ENABLE_EMBEDDED_DATA
> >       : CamHelper(new MdParserImx219())
>
> Shouldn't this be updated too ?

Ack.

>
> >  #else
> > -     : CamHelper(new MdParserRPi())
> > +     : CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
> >  #endif
> >  {
> >  }
> > diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > index 69544456..4a1cab76 100644
> > --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > @@ -39,10 +39,19 @@ public:
> >       double Gain(uint32_t gain_code) const override;
> >       bool SensorEmbeddedDataPresent() const override;
> >       CamTransform GetOrientation() const override;
> > +
> > +private:
> > +     /*
> > +      * Smallest difference between the frame length and integration time,
> > +      * in units of lines.
> > +      */
> > +     static constexpr int frameIntegrationDiff = 22;
> > +     /* Largest possible frame length, in units of lines. */
> > +     static constexpr int maxFrameLength = 0xffdc;
> >  };
> >
> >  CamHelperImx477::CamHelperImx477()
> > -     : CamHelper(new MdParserImx477())
> > +     : CamHelper(new MdParserImx477(), maxFrameLength, frameIntegrationDiff)
> >  {
> >  }
> >
> > diff --git a/src/ipa/raspberrypi/cam_helper_ov5647.cpp b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > index 3dbcb164..d814fa90 100644
> > --- a/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > +++ b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > @@ -22,6 +22,15 @@ public:
> >       unsigned int HideFramesModeSwitch() const override;
> >       unsigned int MistrustFramesStartup() const override;
> >       unsigned int MistrustFramesModeSwitch() const override;
> > +
> > +private:
> > +     /*
> > +      * Smallest difference between the frame length and integration time,
> > +      * in units of lines.
> > +      */
> > +     static constexpr int frameIntegrationDiff = 4;
> > +     /* Largest possible frame length, in units of lines. */
> > +     static constexpr int maxFrameLength = 0xffff;
> >  };
> >
> >  /*
> > @@ -30,7 +39,7 @@ public:
> >   */
> >
> >  CamHelperOv5647::CamHelperOv5647()
> > -     : CamHelper(new MdParserRPi())
> > +     : CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
> >  {
> >  }
> >
> > diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> > index 3bcc0815..1af5e74a 100644
> > --- a/src/ipa/raspberrypi/raspberrypi.cpp
> > +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> > @@ -53,6 +53,8 @@ namespace libcamera {
> >  /* Configure the sensor with these values initially. */
> >  #define DEFAULT_ANALOGUE_GAIN 1.0
> >  #define DEFAULT_EXPOSURE_TIME 20000
> > +#define DEFAULT_MIN_FRAME_DURATION (1e6 / 30.0)
> > +#define DEFAULT_MAX_FRAME_DURATION (1e6 / 0.01)
> >
> >  LOG_DEFINE_CATEGORY(IPARPI)
> >
> > @@ -136,6 +138,9 @@ private:
> >       /* LS table allocation passed in from the pipeline handler. */
> >       uint32_t lsTableHandle_;
> >       void *lsTable_;
> > +
> > +     /* Frame duration (1/fps) given in microseconds. */
> > +     double minFrameDuration_, maxFrameDuration_;
>
> Shouldn't these be float, or uint32_t depending on the outcome of the
> discussion for patch 1/3 ?

Ack.

>
> >  };
> >
> >  int IPARPi::init(const IPASettings &settings)
> > @@ -252,13 +257,20 @@ void IPARPi::configure(const CameraSensorInfo &sensorInfo,
> >               controller_.Initialise();
> >               controllerInit_ = true;
> >
> > -             /* Calculate initial values for gain and exposure. */
> > +             /* Calculate initial values for gain, vblank, and exposure */
> > +             minFrameDuration_ = DEFAULT_MIN_FRAME_DURATION;
> > +             maxFrameDuration_ = DEFAULT_MAX_FRAME_DURATION;
> > +
> > +             double exposure = DEFAULT_EXPOSURE_TIME;
> > +             int32_t vblank = helper_->GetVBlanking(exposure, minFrameDuration_,
> > +                                                    maxFrameDuration_);
> > +             int32_t exposure_lines = helper_->ExposureLines(exposure);
> >               int32_t gain_code = helper_->GainCode(DEFAULT_ANALOGUE_GAIN);
> > -             int32_t exposure_lines = helper_->ExposureLines(DEFAULT_EXPOSURE_TIME);
> >
> >               ControlList ctrls(unicam_ctrls_);
> > -             ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > +             ctrls.set(V4L2_CID_VBLANK, vblank);
> >               ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> > +             ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> >
> >               IPAOperationData op;
> >               op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
> > @@ -630,6 +642,17 @@ void IPARPi::queueRequest(const ControlList &controls)
> >                       break;
> >               }
> >
> > +             case controls::FRAME_DURATIONS: {
> > +                     auto frameDurations = ctrl.second.get<Span<const float>>();
> > +
> > +                     /* This will be applied once AGC recalculations occur. */
> > +                     minFrameDuration_ = frameDurations[0];
> > +                     maxFrameDuration_ = frameDurations[1];
>
> Should the values be adjusted based on the sensor capabilities ?

They do, but separately in the CamHelper::GetVBlanking() call.  Unless
I have misunderstood the comment?

>
> > +                     libcameraMetadata_.set(controls::FrameDurations,
> > +                                            { frameDurations[0], frameDurations[1] });
>
> Hmmm... From an application point of view, wouldn't it be more useful to
> know what frame duration was actually used for the current frame ? You
> don't have to implement this as part of this series, but I could imagine
> a FrameDuration metadata control being useful for that purpose.
> FrameDurations would be set by applications, and FrameDuration would be
> reported through metadata. (On a side note, would FrameDurationLimits be
> a better name than FrameDuration then ?)

Yes, good point.  I can update this to reflect the actual frame durations used.
Happy to rename FrameDuration -> FrameDurationLimits.

>
> > +                     break;
> > +             }
> > +
> >               default:
> >                       LOG(IPARPI, Warning)
> >                               << "Ctrl " << controls::controls.at(ctrl.first)->name()
> > @@ -795,7 +818,12 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
> >       op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
> >
> >       int32_t gain_code = helper_->GainCode(agcStatus->analogue_gain);
> > -     int32_t exposure_lines = helper_->ExposureLines(agcStatus->shutter_time);
> > +
> > +     /* GetVBlanking might clip exposure time to the fps limits. */
> > +     double exposure = agcStatus->shutter_time;
> > +     int32_t vblanking = helper_->GetVBlanking(exposure, minFrameDuration_,
> > +                                               maxFrameDuration_);
>
> I had completely missed, when reviewing the GetVBlanking()
> implementation above, that the first parameter was a reference and was
> used to return a value. We tend in libcamera to use const references for
> input parameters, and pointers for output or in/out parameters to make
> that clearer. It's not ideal though, as pointers can be null, and using
> references when a parameter should not be null is nicer. No need to
> change the code here, but maybe a comment on top of GetVBlanking() would
> be useful to point out that the exposure time can be adjusted.

Ack.

>
> > +     int32_t exposure_lines = helper_->ExposureLines(exposure);
> >
> >       if (unicam_ctrls_.find(V4L2_CID_ANALOGUE_GAIN) == unicam_ctrls_.end()) {
> >               LOG(IPARPI, Error) << "Can't find analogue gain control";
> > @@ -807,14 +835,20 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
> >               return;
> >       }
> >
> > -     LOG(IPARPI, Debug) << "Applying AGC Exposure: " << agcStatus->shutter_time
> > -                        << " (Shutter lines: " << exposure_lines << ") Gain: "
> > +     LOG(IPARPI, Debug) << "Applying AGC Exposure: " << exposure
> > +                        << " (Shutter lines: " << exposure_lines << ", AGC requested "
> > +                        << agcStatus->shutter_time << ") Gain: "
> >                          << agcStatus->analogue_gain << " (Gain Code: "
> >                          << gain_code << ")";
>
> Could the AGC algorithm get confused if it requests a certain exposure,
> and a smaller value is set in the sensor ? Will it keep trying to push
> the exposure up to increase the luminance of the scene without
> noticing it's pointless ?

Yes, it can, and yet it will :)
However, there is no harm done.  As long as the AGC algorithm gets
given the actual exposure time / gain used for the frame - rather than
what it requested - it will fill up the rest of the exposure with
digital gain to get the expected brightness.  This is why it is
important for CamHelper::GetVBlanking() to return the exposure
that would actually be used.

David and I did discuss passing in the framerate limits into the AGC
algorithm, but we felt it added complexity for no real gain in
functionality.  The AGC algorithm would not behave any differently in
either case.  What could help a user who wanted a very specific
behavior with framerate limits would be to set up a suitable exposure
profile (division of shutter speed and gain) in the tuning file.
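
To make that concrete, here is a toy model of the behaviour (not the
actual Raspberry Pi AGC code; all names are illustrative): the AGC asks
for a total exposure, and whatever the frame duration limit clips from
the shutter time comes back as digital gain, so overall brightness is
preserved.

```cpp
#include <algorithm>
#include <cassert>

/*
 * Toy model only: times are in microseconds. If the frame duration
 * limit clips the requested shutter time, digital gain makes up the
 * shortfall so the brightness target is unchanged.
 */
struct Exposure {
	double shutterUs;
	double digitalGain;
};

Exposure applyFrameDurationLimit(double requestedShutterUs,
				 double maxShutterUs)
{
	double shutter = std::min(requestedShutterUs, maxShutterUs);
	/* Whatever was clipped from the shutter comes back as gain. */
	double gain = requestedShutterUs / shutter;
	return { shutter, gain };
}
```

So a 40ms request against a 33ms cap yields a 33ms shutter plus roughly
1.21x digital gain, and an unclipped request keeps gain at 1.0.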

>
> >
> >       ControlList ctrls(unicam_ctrls_);
> > -     ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > +     /*
> > +      * VBLANK must be set before EXPOSURE as the former will adjust the
> > +      * limits of the latter control.
> > +      */
>
> This will be nasty on the V4L2 side. We write multiple controls with a
> single VIDIOC_S_EXT_CTRLS call, and the V4L2 control framework will
> first clamp all control values to the limits before setting any control.
> The limits for the exposure time will thus not be adjusted before its
> value is clamped :-( I'm not sure how to best solve this. Ideally the
> problem should be fixed in the control framework, but I doubt that will
> happen. We need to at least ask Hans to see what options we have (or
> rather, what options we don't have). One easy hack may be to report a
> large range for the exposure time limits in the camera sensor drivers,
> and adjust the value in .s_ctrl() instead of relying on the control
> framework to do that for us.

Ack.  This needs a bit more thinking.

If I were to separate out the VBLANK set ctrl from the EXPOSURE set
ctrl would that help here?  Note that I cannot easily do that right
now as the staggered writer only has an interface to set a single list
of ctrls.

I suppose the current behavior means that we get 1 frame of exposure
that is possibly wrong or clipped before the vblank has updated the
ctrl limits in the sensor driver.
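
To make the failure mode concrete, here is a toy simulation of the
batching behaviour described above (not real kernel or libcamera code;
values and names are illustrative only):

```cpp
#include <algorithm>
#include <cassert>

/*
 * Toy simulation: EXPOSURE is limited by the current frame length
 * (height + vblank - margin), and a single VIDIOC_S_EXT_CTRLS batch
 * clamps every value against the *old* limits before applying any.
 */
struct FakeSensor {
	int height = 1080;
	int vblank = 40;
	int margin = 4;	/* frame length / integration time difference */
	int exposure = 1000;

	int exposureMax() const { return height + vblank - margin; }

	/* One S_EXT_CTRLS call: clamp everything first, then apply. */
	void setBatch(int newVblank, int newExposure)
	{
		int clamped = std::min(newExposure, exposureMax());
		vblank = newVblank;
		exposure = clamped;
	}
};
```

Setting VBLANK=1000 and EXPOSURE=2000 in one batch leaves the exposure
clipped to the old 1116-line limit, whereas splitting it into a VBLANK
call followed by an EXPOSURE call gets the full 2000 lines through.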

> > +     ctrls.set(V4L2_CID_VBLANK, vblanking);
> >       ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> > +     ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> >       op.controls.push_back(ctrls);
> >       queueFrameAction.emit(0, op);
> >  }
> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > index 21a1d7f7..948290e2 100644
> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > @@ -1159,7 +1159,9 @@ void RPiCameraData::queueFrameAction(unsigned int frame, const IPAOperationData
> >               if (!staggeredCtrl_) {
> >                       staggeredCtrl_.init(unicam_[Unicam::Image].dev(),
> >                                           { { V4L2_CID_ANALOGUE_GAIN, action.data[0] },
> > +                                           { V4L2_CID_VBLANK, action.data[1] },
> >                                             { V4L2_CID_EXPOSURE, action.data[1] } });
>
> So the initial value of the exposure time and the vertical blanking are
> always identical ?

This is setting up the delay in applying the ctrl for the staggered
writer.  So I would expect EXPOSURE and VBLANK to have the same delay
in the sensor.

>
> > +
> >                       sensorMetadata_ = action.data[2];
> >               }
> >
>
> --
> Regards,
>
> Laurent Pinchart

Thanks,
Naush
Laurent Pinchart May 28, 2020, 1:30 a.m. UTC | #3
Hi Naush,

On Tue, May 26, 2020 at 10:29:12AM +0100, Naushir Patuck wrote:
> On Mon, 25 May 2020 at 00:16, Laurent Pinchart wrote:
> > On Wed, May 13, 2020 at 10:11:19AM +0100, Naushir Patuck wrote:
> > > Add support for setting V4L2_CID_VBLANK appropriately when setting
> > > V4L2_CID_EXPOSURE. This will allow adaptive framerates during
> > > viewfinder use cases (e.g. when the exposure time goes above 33ms, we
> > > can reduce the framerate to lower than 30fps).
> > >
> > > The minimum and maximum frame durations are provided via libcamera
> > > controls, and will prioritise exposure time limits over any AGC request.
> >
> > General question about the AGC and frame duration selection behaviour.
> > When the scene illumination drops and the application allows the frame
> > rate to be lowered, is it best to adapt the frame duration continuously,
> > or jump from, let's say, 30fps to 15fps and stay at 15fps ? I could
> > imagine adaptive frame durations to be annoying to handle for codecs
> > or displays, but I probably lack a good view of the details.
> 
> Traditionally, we had always allowed the fps to gradually drop with
> exposure during viewfinder use cases.  This would provide a better
> user experience (in my opinion) over a step change to the fps.  It
> also ensures we run with the maximum possible framerate whenever we
> can.  However, during video encoding, we have always left the fps at a
> fixed rate, thereby limiting the maximum exposure time allowed.  I
> guess it is essentially up to the AGC algorithm to decide how it wants
> to handle it depending on the use case.

Thanks for the information. Would you expect applications to specify a
frame duration interval in the viewfinder use case and a fixed frame
duration (with min == max) in the video recording use case, or would you
handle that internally in the IPA ? I suppose an application may want
the IPA to pick either 15fps or 30fps depending on the scene
illumination and stick to that ? In that case we would need to provide a
hint to tell whether the frame rate should be adapted continuously or
not. Or we could do so on the application side by specifying a 15-30 fps
interval, capturing a few frames until AEGC locks, reading the frame
duration selected by the IPA, setting the frame duration to a fixed
value and proceeding with the video recording. It's essentially a
question of who would be responsible for implementing this (application
or IPA), and if we decide to put the burden on the application, we could
also provide a higher-level video recording helper to implement that
sequence. What do you think would be best ?

> > > V4L2_CID_VBLANK is controlled through the staggered writer, just like
> > > the exposure and gain controls.
> > >
> > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > > ---
> > >  include/ipa/raspberrypi.h                     |  1 +
> > >  src/ipa/raspberrypi/cam_helper.cpp            | 38 ++++++++++++++-
> > >  src/ipa/raspberrypi/cam_helper.hpp            | 15 +++++-
> > >  src/ipa/raspberrypi/cam_helper_imx219.cpp     | 11 ++++-
> > >  src/ipa/raspberrypi/cam_helper_imx477.cpp     | 11 ++++-
> > >  src/ipa/raspberrypi/cam_helper_ov5647.cpp     | 11 ++++-
> > >  src/ipa/raspberrypi/raspberrypi.cpp           | 48 ++++++++++++++++---
> > >  .../pipeline/raspberrypi/raspberrypi.cpp      |  2 +
> > >  8 files changed, 124 insertions(+), 13 deletions(-)
> > >
> > > diff --git a/include/ipa/raspberrypi.h b/include/ipa/raspberrypi.h
> > > index c109469e..74954db2 100644
> > > --- a/include/ipa/raspberrypi.h
> > > +++ b/include/ipa/raspberrypi.h
> > > @@ -51,6 +51,7 @@ static const ControlInfoMap RPiControls = {
> > >       { &controls::Brightness, ControlInfo(-1.0f, 1.0f) },
> > >       { &controls::Contrast, ControlInfo(0.0f, 32.0f) },
> > >       { &controls::Saturation, ControlInfo(0.0f, 32.0f) },
> > > +     { &controls::FrameDurations, ControlInfo(1.0e3f, 1.0e9f) }
> > >  };
> > >
> > >  } /* namespace libcamera */
> > > diff --git a/src/ipa/raspberrypi/cam_helper.cpp b/src/ipa/raspberrypi/cam_helper.cpp
> > > index 7f05d2c6..06732241 100644
> > > --- a/src/ipa/raspberrypi/cam_helper.cpp
> > > +++ b/src/ipa/raspberrypi/cam_helper.cpp
> > > @@ -34,8 +34,10 @@ CamHelper *CamHelper::Create(std::string const &cam_name)
> > >       return nullptr;
> > >  }
> > >
> > > -CamHelper::CamHelper(MdParser *parser)
> > > -     : parser_(parser), initialized_(false)
> > > +CamHelper::CamHelper(MdParser *parser, unsigned int maxFrameLength,
> > > +                  unsigned int frameIntegrationDiff)
> > > +     : parser_(parser), initialized_(false), maxFrameLength_(maxFrameLength),
> > > +       frameIntegrationDiff_(frameIntegrationDiff)
> > >  {
> > >  }
> > >
> > > @@ -56,6 +58,38 @@ double CamHelper::Exposure(uint32_t exposure_lines) const
> > >       return exposure_lines * mode_.line_length / 1000.0;
> > >  }
> > >
> > > +uint32_t CamHelper::GetVBlanking(double &exposure, double minFrameDuration,
> > > +                              double maxFrameDuration) const
> > > +{
> > > +     uint32_t frameLengthMin, frameLengthMax, vblank;
> > > +     uint32_t exposureLines = ExposureLines(exposure);
> > > +
> > > +     assert(initialized_);
> > > +     /*
> > > +      * Clip frame length by the frame duration range and the maximum allowable
> > > +      * value in the sensor, given by maxFrameLength_.
> > > +      */
> > > +     frameLengthMin = std::min<uint32_t>(1e3 * minFrameDuration / mode_.line_length,
> > > +                                         maxFrameLength_);
> > > +     frameLengthMax = std::min<uint32_t>(1e3 * maxFrameDuration / mode_.line_length,
> > > +                                         maxFrameLength_);
> >
> > Not something to be addressed as part of this series, but as we divide
> > and multiply by 1000 in various places, should we specify all duration
> > controls in nanoseconds instead of microseconds ?
> 
> Good question :)  I think microseconds is more "user friendly", but I
> do see the point that we convert to nanoseconds everywhere.  My
> suggestion would be to leave it as microseconds for the user-facing
> controls.

Let's do so for now. I may reconsider that later though if we need more
precision than 1µs (timestamps are where I expect precision to be
needed, and we express them in ns already, maybe we won't need as much
precision in the durations) or if we end up having controls in both
units and we decide to make the API more consistent.

> > > +     /*
> > > +      * Limit the exposure to the maximum frame duration requested, and
> > > +      * re-calculate if it has been clipped.
> > > +      */
> > > +     exposureLines = std::min(frameLengthMax - frameIntegrationDiff_,
> > > +                              exposureLines);
> > > +     exposure = Exposure(exposureLines);
> > > +
> > > +     /* Limit the vblank to the range allowed by the frame length limits. */
> > > +     vblank = std::max<uint32_t>(mode_.height, exposureLines) -
> > > +              mode_.height + frameIntegrationDiff_;
> >
> > If the exposure time is shorter than the frame height, do we need the
> > frameIntegrationDiff_ margin ? Assuming a sensor that would have a
> > minimum vblank of 0 and a frameIntegrationDiff_ of 40, if height was
> > 1080 and exposureLines 500, there should be no need to set vblank to 40,
> > right ?
> 
> Yes, you are correct.  Will fix this calculation.
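
Just to make sure we mean the same thing, something along these lines
is what I'd expect (an untested sketch, written to avoid unsigned
underflow; the function name is mine, not from the patch):

```cpp
#include <algorithm>
#include <cstdint>

/*
 * Sketch of the corrected calculation: the frameIntegrationDiff
 * margin only matters once the exposure pushes the frame length past
 * the mode height. Taking the max before subtracting keeps the
 * unsigned arithmetic safe.
 */
uint32_t minimumVBlank(uint32_t exposureLines, uint32_t height,
		       uint32_t frameIntegrationDiff)
{
	return std::max(exposureLines + frameIntegrationDiff, height) -
	       height;
}
```

With your example (height 1080, diff 40), 500 exposure lines give a
minimum vblank of 0, while 2000 lines give 960.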
> 
> >
> > > +     vblank = std::max(vblank, frameLengthMin - mode_.height);
> > > +     vblank = std::min(vblank, frameLengthMax - mode_.height);
> >
> > You can use
> >
> >         vblank = utils::clamp(vblank, frameLengthMin - mode_.height,
> >                               frameLengthMax - mode_.height);
> 
> Ack.
> 
> > > +
> > > +     return vblank;
> > > +}
> > > +
> > >  void CamHelper::SetCameraMode(const CameraMode &mode)
> > >  {
> > >       mode_ = mode;
> > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp
> > > index 0c8aa29a..fdcdbddb 100644
> > > --- a/src/ipa/raspberrypi/cam_helper.hpp
> > > +++ b/src/ipa/raspberrypi/cam_helper.hpp
> > > @@ -68,12 +68,15 @@ class CamHelper
> > >  {
> > >  public:
> > >       static CamHelper *Create(std::string const &cam_name);
> > > -     CamHelper(MdParser *parser);
> > > +     CamHelper(MdParser *parser, unsigned int maxFrameLength,
> > > +               unsigned int frameIntegrationDiff);
> > >       virtual ~CamHelper();
> > >       void SetCameraMode(const CameraMode &mode);
> > >       MdParser &Parser() const { return *parser_; }
> > >       uint32_t ExposureLines(double exposure_us) const;
> > >       double Exposure(uint32_t exposure_lines) const; // in us
> > > +     virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,
> > > +                                   double maxFrameDuration) const;
> >
> > Does this function need to be virtual ?
> 
> I'm not sure :)
> 
> I wanted to be flexible enough that if a sensor wanted to do something
> a bit different from the norm, it could override the base
> implementation.  I don't have a particular example in mind though.

We could always make it virtual at that point, as this is an internal
API.

> > >       virtual uint32_t GainCode(double gain) const = 0;
> > >       virtual double Gain(uint32_t gain_code) const = 0;
> > >       virtual void GetDelays(int &exposure_delay, int &gain_delay) const;
> > > @@ -83,10 +86,20 @@ public:
> > >       virtual unsigned int MistrustFramesStartup() const;
> > >       virtual unsigned int MistrustFramesModeSwitch() const;
> > >       virtual CamTransform GetOrientation() const;
> > > +
> > >  protected:
> > >       MdParser *parser_;
> > >       CameraMode mode_;
> > > +
> > > +private:
> > >       bool initialized_;
> > > +     /* Largest possible frame length, in units of lines. */
> > > +     unsigned int maxFrameLength_;
> > > +     /*
> > > +      * Smallest difference between the frame length and integration time,
> > > +      * in units of lines.
> > > +      */
> > > +     unsigned int frameIntegrationDiff_;
> > >  };
> > >
> > >  // This is for registering camera helpers with the system, so that the
> > > diff --git a/src/ipa/raspberrypi/cam_helper_imx219.cpp b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > > index 35c6597c..fd95a31a 100644
> > > --- a/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > > +++ b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > > @@ -50,13 +50,22 @@ public:
> > >       unsigned int MistrustFramesModeSwitch() const override;
> > >       bool SensorEmbeddedDataPresent() const override;
> > >       CamTransform GetOrientation() const override;
> > > +
> > > +private:
> > > +     /*
> > > +      * Smallest difference between the frame length and integration time,
> > > +      * in units of lines.
> > > +      */
> > > +     static constexpr int frameIntegrationDiff = 4;
> > > +     /* Largest possible frame length, in units of lines. */
> > > +     static constexpr int maxFrameLength = 0xffff;
> > >  };
> > >
> > >  CamHelperImx219::CamHelperImx219()
> > >  #if ENABLE_EMBEDDED_DATA
> > >       : CamHelper(new MdParserImx219())
> >
> > Shouldn't this be updated too ?
> 
> Ack.
> 
> > >  #else
> > > -     : CamHelper(new MdParserRPi())
> > > +     : CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
> > >  #endif
> > >  {
> > >  }
> > > diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > index 69544456..4a1cab76 100644
> > > --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > @@ -39,10 +39,19 @@ public:
> > >       double Gain(uint32_t gain_code) const override;
> > >       bool SensorEmbeddedDataPresent() const override;
> > >       CamTransform GetOrientation() const override;
> > > +
> > > +private:
> > > +     /*
> > > +      * Smallest difference between the frame length and integration time,
> > > +      * in units of lines.
> > > +      */
> > > +     static constexpr int frameIntegrationDiff = 22;
> > > +     /* Largest possible frame length, in units of lines. */
> > > +     static constexpr int maxFrameLength = 0xffdc;
> > >  };
> > >
> > >  CamHelperImx477::CamHelperImx477()
> > > -     : CamHelper(new MdParserImx477())
> > > +     : CamHelper(new MdParserImx477(), maxFrameLength, frameIntegrationDiff)
> > >  {
> > >  }
> > >
> > > diff --git a/src/ipa/raspberrypi/cam_helper_ov5647.cpp b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > > index 3dbcb164..d814fa90 100644
> > > --- a/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > > +++ b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > > @@ -22,6 +22,15 @@ public:
> > >       unsigned int HideFramesModeSwitch() const override;
> > >       unsigned int MistrustFramesStartup() const override;
> > >       unsigned int MistrustFramesModeSwitch() const override;
> > > +
> > > +private:
> > > +     /*
> > > +      * Smallest difference between the frame length and integration time,
> > > +      * in units of lines.
> > > +      */
> > > +     static constexpr int frameIntegrationDiff = 4;
> > > +     /* Largest possible frame length, in units of lines. */
> > > +     static constexpr int maxFrameLength = 0xffff;
> > >  };
> > >
> > >  /*
> > > @@ -30,7 +39,7 @@ public:
> > >   */
> > >
> > >  CamHelperOv5647::CamHelperOv5647()
> > > -     : CamHelper(new MdParserRPi())
> > > +     : CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
> > >  {
> > >  }
> > >
> > > diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> > > index 3bcc0815..1af5e74a 100644
> > > --- a/src/ipa/raspberrypi/raspberrypi.cpp
> > > +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> > > @@ -53,6 +53,8 @@ namespace libcamera {
> > >  /* Configure the sensor with these values initially. */
> > >  #define DEFAULT_ANALOGUE_GAIN 1.0
> > >  #define DEFAULT_EXPOSURE_TIME 20000
> > > +#define DEFAULT_MIN_FRAME_DURATION (1e6 / 30.0)
> > > +#define DEFAULT_MAX_FRAME_DURATION (1e6 / 0.01)
> > >
> > >  LOG_DEFINE_CATEGORY(IPARPI)
> > >
> > > @@ -136,6 +138,9 @@ private:
> > >       /* LS table allocation passed in from the pipeline handler. */
> > >       uint32_t lsTableHandle_;
> > >       void *lsTable_;
> > > +
> > > +     /* Frame duration (1/fps) given in microseconds. */
> > > +     double minFrameDuration_, maxFrameDuration_;
> >
> > Shouldn't these be float, or uint32_t depending on the outcome of the
> > discussion for patch 1/3 ?
> 
> Ack.
> 
> > >  };
> > >
> > >  int IPARPi::init(const IPASettings &settings)
> > > @@ -252,13 +257,20 @@ void IPARPi::configure(const CameraSensorInfo &sensorInfo,
> > >               controller_.Initialise();
> > >               controllerInit_ = true;
> > >
> > > -             /* Calculate initial values for gain and exposure. */
> > > +             /* Calculate initial values for gain, vblank, and exposure */
> > > +             minFrameDuration_ = DEFAULT_MIN_FRAME_DURATION;
> > > +             maxFrameDuration_ = DEFAULT_MAX_FRAME_DURATION;
> > > +
> > > +             double exposure = DEFAULT_EXPOSURE_TIME;
> > > +             int32_t vblank = helper_->GetVBlanking(exposure, minFrameDuration_,
> > > +                                                    maxFrameDuration_);
> > > +             int32_t exposure_lines = helper_->ExposureLines(exposure);
> > >               int32_t gain_code = helper_->GainCode(DEFAULT_ANALOGUE_GAIN);
> > > -             int32_t exposure_lines = helper_->ExposureLines(DEFAULT_EXPOSURE_TIME);
> > >
> > >               ControlList ctrls(unicam_ctrls_);
> > > -             ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > > +             ctrls.set(V4L2_CID_VBLANK, vblank);
> > >               ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> > > +             ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > >
> > >               IPAOperationData op;
> > >               op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
> > > @@ -630,6 +642,17 @@ void IPARPi::queueRequest(const ControlList &controls)
> > >                       break;
> > >               }
> > >
> > > +             case controls::FRAME_DURATIONS: {
> > > +                     auto frameDurations = ctrl.second.get<Span<const float>>();
> > > +
> > > +                     /* This will be applied once AGC recalculations occur. */
> > > +                     minFrameDuration_ = frameDurations[0];
> > > +                     maxFrameDuration_ = frameDurations[1];
> >
> > Should the values be adjusted based on the sensor capabilities ?
> 
> They do, but separately in the CamHelper::GetVBlanking() call.  Unless
> I have misunderstood the comment?

I think I was just confused, this seems fine.

> > > +                     libcameraMetadata_.set(controls::FrameDurations,
> > > +                                            { frameDurations[0], frameDurations[1] });
> >
> > Hmmm... From an application point of view, wouldn't it be more useful to
> > know what frame duration was actually used for the current frame ? You
> > don't have to implement this as part of this series, but I could imagine
> > a FrameDuration metadata control being useful for that purpose.
> > FrameDurations would be set by applications, and FrameDuration would be
> > reported through metadata. (On a side note, would FrameDurationLimits be
> > a better name than FrameDuration then ?)
> 
> Yes, good point.  I can update this to reflect the actual frame durations used.
> Happy to rename FrameDuration -> FrameDurationLimits.
> 
> > > +                     break;
> > > +             }
> > > +
> > >               default:
> > >                       LOG(IPARPI, Warning)
> > >                               << "Ctrl " << controls::controls.at(ctrl.first)->name()
> > > @@ -795,7 +818,12 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
> > >       op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
> > >
> > >       int32_t gain_code = helper_->GainCode(agcStatus->analogue_gain);
> > > -     int32_t exposure_lines = helper_->ExposureLines(agcStatus->shutter_time);
> > > +
> > > +     /* GetVBlanking might clip exposure time to the fps limits. */
> > > +     double exposure = agcStatus->shutter_time;
> > > +     int32_t vblanking = helper_->GetVBlanking(exposure, minFrameDuration_,
> > > +                                               maxFrameDuration_);
> >
> > I had completely missed, when reviewing the GetVBlanking()
> > implementation above, that the first parameter was a reference and was
> > used to return a value. We tend in libcamera to use const references for
> > input parameters, and pointers for output or in/out parameters to make
> > that clearer. It's not ideal though, as pointers can be null, and using
> > references when a parameter should not be null is nicer. No need to
> > change the code here, but maybe a comment on top of GetVBlanking() would
> > be useful to point out that the exposure time can be adjusted.
> 
> Ack.
> 
> > > +     int32_t exposure_lines = helper_->ExposureLines(exposure);
> > >
> > >       if (unicam_ctrls_.find(V4L2_CID_ANALOGUE_GAIN) == unicam_ctrls_.end()) {
> > >               LOG(IPARPI, Error) << "Can't find analogue gain control";
> > > @@ -807,14 +835,20 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
> > >               return;
> > >       }
> > >
> > > -     LOG(IPARPI, Debug) << "Applying AGC Exposure: " << agcStatus->shutter_time
> > > -                        << " (Shutter lines: " << exposure_lines << ") Gain: "
> > > +     LOG(IPARPI, Debug) << "Applying AGC Exposure: " << exposure
> > > +                        << " (Shutter lines: " << exposure_lines << ", AGC requested "
> > > +                        << agcStatus->shutter_time << ") Gain: "
> > >                          << agcStatus->analogue_gain << " (Gain Code: "
> > >                          << gain_code << ")";
> >
> > Could the AGC algorithm get confused if it requests a certain exposure,
> > and a smaller value is set in the sensor ? Will it keep trying to push
> > the exposure up to increase the luminance of the scene without
> > noticing it's pointless ?
> 
> Yes, it can, and yet it will :)
> However, there is no harm done.  As long as the AGC algorithm gets
> given the actual exposure time / gain used for the frame - rather than
> what it requested - it will fill up the rest of the exposure with
> digital gain to get the expected brightness.  This is why it is
> important for CamHelper::GetVBlanking() to return the exposure
> that would actually be used.
> 
> David and I did discuss passing in the framerate limits into the AGC
> algorithm, but we felt it added complexity for no real gain in
> functionality.  The AGC algorithm would not behave any differently in
> either case.  What could help a user who wanted a very specific
> behavior with framerate limits would be to set up a suitable exposure
> profile (division of shutter speed and gain) in the tuning file.

OK.

> > >
> > >       ControlList ctrls(unicam_ctrls_);
> > > -     ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > > +     /*
> > > +      * VBLANK must be set before EXPOSURE as the former will adjust the
> > > +      * limits of the latter control.
> > > +      */
> >
> > This will be nasty on the V4L2 side. We write multiple controls with a
> > single VIDIOC_S_EXT_CTRLS call, and the V4L2 control framework will
> > first clamp all control values to the limits before setting any control.
> > The limits for the exposure time will thus not be adjusted before its
> > value is clamped :-( I'm not sure how to best solve this. Ideally the
> > problem should be fixed in the control framework, but I doubt that will
> > happen. We need to at least ask Hans to see what options we have (or
> > rather, what options we don't have). One easy hack may be to report a
> > large range for the exposure time limits in the camera sensor drivers,
> > and adjust the value in .s_ctrl() instead of relying on the control
> > framework to do that for us.
> 
> Ack.  This needs a bit more thinking.
> 
> If I were to separate out the VBLANK set ctrl from the EXPOSURE set
> ctrl would that help here?  Note that I cannot easily do that right
> now as the staggered writer only has an interface to set a single list
> of ctrls.
> 
> I suppose the current behavior means that we get 1 frame of exposure
> that is possibly wrong or clipped before the vblank has updated the
> ctrl limits in the sensor driver.

The more I think about it, the more I believe it's a deficiency of the
V4L2 API, and we should work around it by reporting exposure time limits
that don't depend on the blanking, and clamping the value in .s_ctrl().
We would otherwise need much more complexity on the userspace side, and
it's not worth it.
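
Roughly what I have in mind, reduced to a user-space stand-in for
illustration (the name and values are mine, this is not actual kernel
code): the driver would advertise a wide, static EXPOSURE range, and
clamp against the current VBLANK inside its .s_ctrl() handler rather
than relying on the control framework.

```cpp
#include <algorithm>
#include <cstdint>

/*
 * Stand-in for the clamp a sensor driver's .s_ctrl() handler would
 * apply: the exposure limit tracks the current vblank, minus the
 * sensor's frame length / integration time margin.
 */
uint32_t sensorClampExposure(uint32_t exposure, uint32_t height,
			     uint32_t vblank, uint32_t margin)
{
	uint32_t maxExposure = height + vblank - margin;

	return std::min(exposure, maxExposure);
}
```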

> > > +     ctrls.set(V4L2_CID_VBLANK, vblanking);
> > >       ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> > > +     ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > >       op.controls.push_back(ctrls);
> > >       queueFrameAction.emit(0, op);
> > >  }
> > > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > > index 21a1d7f7..948290e2 100644
> > > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > > @@ -1159,7 +1159,9 @@ void RPiCameraData::queueFrameAction(unsigned int frame, const IPAOperationData
> > >               if (!staggeredCtrl_) {
> > >                       staggeredCtrl_.init(unicam_[Unicam::Image].dev(),
> > >                                           { { V4L2_CID_ANALOGUE_GAIN, action.data[0] },
> > > +                                           { V4L2_CID_VBLANK, action.data[1] },
> > >                                             { V4L2_CID_EXPOSURE, action.data[1] } });
> >
> > So the initial value of the exposure time and the vertical blanking are
> > always identical ?
> 
> This is setting up the delay in applying the ctrl for the staggered
> writer.  So I would expect EXPOSURE and VBLANK to have the same delay
> in the sensor.

Oops, I have misunderstood this, my bad.

> > > +
> > >                       sensorMetadata_ = action.data[2];
> > >               }
> > >
Naushir Patuck May 28, 2020, 9:08 a.m. UTC | #4
Hi Laurent,

On Thu, 28 May 2020 at 02:30, Laurent Pinchart
<laurent.pinchart@ideasonboard.com> wrote:
>
> Hi Naush,
>
> On Tue, May 26, 2020 at 10:29:12AM +0100, Naushir Patuck wrote:
> > On Mon, 25 May 2020 at 00:16, Laurent Pinchart wrote:
> > > On Wed, May 13, 2020 at 10:11:19AM +0100, Naushir Patuck wrote:
> > > > Add support for setting V4L2_CID_VBLANK appropriately when setting
> > > > V4L2_CID_EXPOSURE. This will allow adaptive framerates during
> > > > viewfinder use cases (e.g. when the exposure time goes above 33ms, we
> > > > can reduce the framerate to lower than 30fps).
> > > >
> > > > The minimum and maximum frame durations are provided via libcamera
> > > > controls, and will prioritise exposure time limits over any AGC request.
> > >
> > > General question about the AGC and frame duration selection behaviour.
> > > When the scene illumination drops and the application allows the frame
> > > rate to be lowered, is it best to adapt the frame duration continuously,
> > > or jump from, let's say, 30fps to 15fps and stay at 15fps ? I could
> > > imagine adaptive frame durations to be annoying to handle for codecs
> > > or displays, but I probably lack a good view of the details.
> >
> > Traditionally, we had always allowed the fps to gradually drop with
> > exposure during viewfinder use cases.  This would provide a better
> > user experience (in my opinion) over a step change to the fps.  It
> > also ensures we run with the maximum possible framerate whenever we
> > can.  However, during video encoding, we have always left the fps at a
> > fixed rate, thereby limiting the maximum exposure time allowed.  I
> > guess it is essentially up to the AGC algorithm to decide how it wants
> > to handle it depending on the use case.
>
> Thanks for the information. Would you expect applications to specify a
> frame duration interval in the viewfinder use case and a fixed frame
> duration (with min == max) in the video recording use case, or would you
> handle that internally in the IPA ? I suppose an application may want
> the IPA to pick either 15fps or 30fps depending on the scene
> illumination and stick to that ? In that case we would need to provide a
> hint to tell whether the frame rate should be adapted continuously or
> not. Or we could do so on the application side by specifying a 15-30 fps
> interval, capturing a few frames until AEGC locks, reading the frame
> duration selected by the IPA, setting the frame duration to a fixed
> value and proceeding with the video recording. It's essentially a
> question of who would be responsible for implementing this (application
> or IPA), and if we decide to put the burden on the application, we could
> also provide a higher-level video recording helper to implement that
> sequence. What do you think would be best ?

I would expect the application to make the decision as to what frame
durations it wants based on the intended use cases.  In my opinion, we
should minimise the burden on the application side for these types of
controls and let the IPA handle it.  So, having hints passed into the
IPA for either gradual fps changes or step changes does seem like a
good solution to me.

>
> > > > V4L2_CID_VBLANK is controlled through the staggered writer, just like
> > > > the exposure and gain controls.
> > > >
> > > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > > > ---
> > > >  include/ipa/raspberrypi.h                     |  1 +
> > > >  src/ipa/raspberrypi/cam_helper.cpp            | 38 ++++++++++++++-
> > > >  src/ipa/raspberrypi/cam_helper.hpp            | 15 +++++-
> > > >  src/ipa/raspberrypi/cam_helper_imx219.cpp     | 11 ++++-
> > > >  src/ipa/raspberrypi/cam_helper_imx477.cpp     | 11 ++++-
> > > >  src/ipa/raspberrypi/cam_helper_ov5647.cpp     | 11 ++++-
> > > >  src/ipa/raspberrypi/raspberrypi.cpp           | 48 ++++++++++++++++---
> > > >  .../pipeline/raspberrypi/raspberrypi.cpp      |  2 +
> > > >  8 files changed, 124 insertions(+), 13 deletions(-)
> > > >
> > > > diff --git a/include/ipa/raspberrypi.h b/include/ipa/raspberrypi.h
> > > > index c109469e..74954db2 100644
> > > > --- a/include/ipa/raspberrypi.h
> > > > +++ b/include/ipa/raspberrypi.h
> > > > @@ -51,6 +51,7 @@ static const ControlInfoMap RPiControls = {
> > > >       { &controls::Brightness, ControlInfo(-1.0f, 1.0f) },
> > > >       { &controls::Contrast, ControlInfo(0.0f, 32.0f) },
> > > >       { &controls::Saturation, ControlInfo(0.0f, 32.0f) },
> > > > +     { &controls::FrameDurations, ControlInfo(1.0e3f, 1.0e9f) }
> > > >  };
> > > >
> > > >  } /* namespace libcamera */
> > > > diff --git a/src/ipa/raspberrypi/cam_helper.cpp b/src/ipa/raspberrypi/cam_helper.cpp
> > > > index 7f05d2c6..06732241 100644
> > > > --- a/src/ipa/raspberrypi/cam_helper.cpp
> > > > +++ b/src/ipa/raspberrypi/cam_helper.cpp
> > > > @@ -34,8 +34,10 @@ CamHelper *CamHelper::Create(std::string const &cam_name)
> > > >       return nullptr;
> > > >  }
> > > >
> > > > -CamHelper::CamHelper(MdParser *parser)
> > > > -     : parser_(parser), initialized_(false)
> > > > +CamHelper::CamHelper(MdParser *parser, unsigned int maxFrameLength,
> > > > +                  unsigned int frameIntegrationDiff)
> > > > +     : parser_(parser), initialized_(false), maxFrameLength_(maxFrameLength),
> > > > +       frameIntegrationDiff_(frameIntegrationDiff)
> > > >  {
> > > >  }
> > > >
> > > > @@ -56,6 +58,38 @@ double CamHelper::Exposure(uint32_t exposure_lines) const
> > > >       return exposure_lines * mode_.line_length / 1000.0;
> > > >  }
> > > >
> > > > +uint32_t CamHelper::GetVBlanking(double &exposure, double minFrameDuration,
> > > > +                              double maxFrameDuration) const
> > > > +{
> > > > +     uint32_t frameLengthMin, frameLengthMax, vblank;
> > > > +     uint32_t exposureLines = ExposureLines(exposure);
> > > > +
> > > > +     assert(initialized_);
> > > > +     /*
> > > > +      * Clip frame length by the frame duration range and the maximum allowable
> > > > +      * value in the sensor, given by maxFrameLength_.
> > > > +      */
> > > > +     frameLengthMin = std::min<uint32_t>(1e3 * minFrameDuration / mode_.line_length,
> > > > +                                         maxFrameLength_);
> > > > +     frameLengthMax = std::min<uint32_t>(1e3 * maxFrameDuration / mode_.line_length,
> > > > +                                         maxFrameLength_);
> > >
> > > Not something to be addressed as part of this series, but as we divide
> > > and multiply by 1000 in various places, should we specify all duration
> > > controls in nanoseconds instead of microseconds ?
> >
> > Good question :)  I think microseconds is more "user friendly", but I
> > do see the point that we convert to nanoseconds everywhere.  My
> > suggestion would be to leave it as microseconds for the user-facing
> > controls.
>
> Let's do so for now. I may reconsider that later though if we need more
> precision than 1µs (timestamps are where I expect precision to be
> needed, and we express them in ns already, maybe we won't need as much
> precision in the durations) or if we end up having controls in both
> units and we decide to make the API more consistent.

Agreed.
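As a concrete illustration of the unit handling discussed above: frame durations arrive in microseconds while mode_.line_length is in nanoseconds, which is where the 1e3 factor in GetVBlanking() comes from. A minimal sketch, with illustrative names and values rather than the actual libcamera API:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

/*
 * Convert a frame duration in microseconds into a frame length in
 * lines, given the sensor line length in nanoseconds, and clip it to
 * the largest frame length the sensor register can hold.
 */
uint32_t durationToFrameLength(double frameDurationUs, double lineLengthNs,
			       uint32_t maxFrameLength)
{
	/* 1e3 converts µs to ns so the division yields a line count. */
	return std::min<uint32_t>(1e3 * frameDurationUs / lineLengthNs,
				  maxFrameLength);
}
```

For example, a 33333 µs (30 fps) duration with a 30000 ns line length gives a frame length of 1111 lines, while an extremely long duration clips to maxFrameLength.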

>
> > > > +     /*
> > > > +      * Limit the exposure to the maximum frame duration requested, and
> > > > +      * re-calculate if it has been clipped.
> > > > +      */
> > > > +     exposureLines = std::min(frameLengthMax - frameIntegrationDiff_,
> > > > +                              exposureLines);
> > > > +     exposure = Exposure(exposureLines);
> > > > +
> > > > +     /* Limit the vblank to the range allowed by the frame length limits. */
> > > > +     vblank = std::max<uint32_t>(mode_.height, exposureLines) -
> > > > +              mode_.height + frameIntegrationDiff_;
> > >
> > > If the exposure time is shorter than the frame height, do we need the
> > > frameIntegrationDiff_ margin ? Assuming a sensor that would have a
> > > minimum vblank of 0 and a frameIntegrationDiff_ of 40, if height was
> > > 1080 and exposureLines 500, there should be no need to set vblank to 40,
> > > right ?
> >
> > Yes, you are correct.  Will fix this calculation.
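For illustration, the corrected calculation could look something like the sketch below, where the frameIntegrationDiff margin only comes into play once the exposure pushes the frame length past the mode height. Names mirror the patch, but this is a sketch of the fix under discussion, not the final implementation:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

/*
 * Compute the vertical blanking needed for a given exposure. The frame
 * length must exceed the integration time by frameIntegrationDiff
 * lines, but when the exposure is short the mode height alone already
 * satisfies that margin, so no extra blanking is required.
 */
uint32_t getVBlanking(uint32_t exposureLines, uint32_t height,
		      uint32_t frameLengthMin, uint32_t frameLengthMax,
		      uint32_t frameIntegrationDiff)
{
	uint32_t frameLength = std::max(height,
					exposureLines + frameIntegrationDiff);
	/* Respect the frame duration limits requested by the user. */
	frameLength = std::max(frameLength, frameLengthMin);
	frameLength = std::min(frameLength, frameLengthMax);
	return frameLength - height;
}
```

With height 1080 and frameIntegrationDiff 40, an exposure of 500 lines now yields a vblank of 0 rather than 40, matching the example above.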
> >
> > >
> > > > +     vblank = std::max(vblank, frameLengthMin - mode_.height);
> > > > +     vblank = std::min(vblank, frameLengthMax - mode_.height);
> > >
> > > You can use
> > >
> > >         vblank = utils::clamp(vblank, frameLengthMin - mode_.height,
> > >                               frameLengthMax - mode_.height);
> >
> > Ack.
> >
> > > > +
> > > > +     return vblank;
> > > > +}
> > > > +
> > > >  void CamHelper::SetCameraMode(const CameraMode &mode)
> > > >  {
> > > >       mode_ = mode;
> > > > diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp
> > > > index 0c8aa29a..fdcdbddb 100644
> > > > --- a/src/ipa/raspberrypi/cam_helper.hpp
> > > > +++ b/src/ipa/raspberrypi/cam_helper.hpp
> > > > @@ -68,12 +68,15 @@ class CamHelper
> > > >  {
> > > >  public:
> > > >       static CamHelper *Create(std::string const &cam_name);
> > > > -     CamHelper(MdParser *parser);
> > > > +     CamHelper(MdParser *parser, unsigned int maxFrameLength,
> > > > +               unsigned int frameIntegrationDiff);
> > > >       virtual ~CamHelper();
> > > >       void SetCameraMode(const CameraMode &mode);
> > > >       MdParser &Parser() const { return *parser_; }
> > > >       uint32_t ExposureLines(double exposure_us) const;
> > > >       double Exposure(uint32_t exposure_lines) const; // in us
> > > > +     virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,
> > > > +                                   double maxFrameDuration) const;
> > >
> > > Does this function need to be virtual ?
> >
> > I'm not sure :)
> >
> > I wanted to be flexible enough that if a sensor wanted to do something
> > a bit different from the norm, it could override the base
> > implementation.  I don't have a particular example in mind though.
>
> We could always make it virtual at that point then, this is an internal
> API.

Ack.

>
> > > >       virtual uint32_t GainCode(double gain) const = 0;
> > > >       virtual double Gain(uint32_t gain_code) const = 0;
> > > >       virtual void GetDelays(int &exposure_delay, int &gain_delay) const;
> > > > @@ -83,10 +86,20 @@ public:
> > > >       virtual unsigned int MistrustFramesStartup() const;
> > > >       virtual unsigned int MistrustFramesModeSwitch() const;
> > > >       virtual CamTransform GetOrientation() const;
> > > > +
> > > >  protected:
> > > >       MdParser *parser_;
> > > >       CameraMode mode_;
> > > > +
> > > > +private:
> > > >       bool initialized_;
> > > > +     /* Largest possible frame length, in units of lines. */
> > > > +     unsigned int maxFrameLength_;
> > > > +     /*
> > > > +      * Smallest difference between the frame length and integration time,
> > > > +      * in units of lines.
> > > > +      */
> > > > +     unsigned int frameIntegrationDiff_;
> > > >  };
> > > >
> > > >  // This is for registering camera helpers with the system, so that the
> > > > diff --git a/src/ipa/raspberrypi/cam_helper_imx219.cpp b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > > > index 35c6597c..fd95a31a 100644
> > > > --- a/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > > > +++ b/src/ipa/raspberrypi/cam_helper_imx219.cpp
> > > > @@ -50,13 +50,22 @@ public:
> > > >       unsigned int MistrustFramesModeSwitch() const override;
> > > >       bool SensorEmbeddedDataPresent() const override;
> > > >       CamTransform GetOrientation() const override;
> > > > +
> > > > +private:
> > > > +     /*
> > > > +      * Smallest difference between the frame length and integration time,
> > > > +      * in units of lines.
> > > > +      */
> > > > +     static constexpr int frameIntegrationDiff = 4;
> > > > +     /* Largest possible frame length, in units of lines. */
> > > > +     static constexpr int maxFrameLength = 0xffff;
> > > >  };
> > > >
> > > >  CamHelperImx219::CamHelperImx219()
> > > >  #if ENABLE_EMBEDDED_DATA
> > > >       : CamHelper(new MdParserImx219())
> > >
> > > Shouldn't this be updated too ?
> >
> > Ack.
> >
> > > >  #else
> > > > -     : CamHelper(new MdParserRPi())
> > > > +     : CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
> > > >  #endif
> > > >  {
> > > >  }
> > > > diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > > index 69544456..4a1cab76 100644
> > > > --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > > +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > > @@ -39,10 +39,19 @@ public:
> > > >       double Gain(uint32_t gain_code) const override;
> > > >       bool SensorEmbeddedDataPresent() const override;
> > > >       CamTransform GetOrientation() const override;
> > > > +
> > > > +private:
> > > > +     /*
> > > > +      * Smallest difference between the frame length and integration time,
> > > > +      * in units of lines.
> > > > +      */
> > > > +     static constexpr int frameIntegrationDiff = 22;
> > > > +     /* Largest possible frame length, in units of lines. */
> > > > +     static constexpr int maxFrameLength = 0xffdc;
> > > >  };
> > > >
> > > >  CamHelperImx477::CamHelperImx477()
> > > > -     : CamHelper(new MdParserImx477())
> > > > +     : CamHelper(new MdParserImx477(), maxFrameLength, frameIntegrationDiff)
> > > >  {
> > > >  }
> > > >
> > > > diff --git a/src/ipa/raspberrypi/cam_helper_ov5647.cpp b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > > > index 3dbcb164..d814fa90 100644
> > > > --- a/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > > > +++ b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
> > > > @@ -22,6 +22,15 @@ public:
> > > >       unsigned int HideFramesModeSwitch() const override;
> > > >       unsigned int MistrustFramesStartup() const override;
> > > >       unsigned int MistrustFramesModeSwitch() const override;
> > > > +
> > > > +private:
> > > > +     /*
> > > > +      * Smallest difference between the frame length and integration time,
> > > > +      * in units of lines.
> > > > +      */
> > > > +     static constexpr int frameIntegrationDiff = 4;
> > > > +     /* Largest possible frame length, in units of lines. */
> > > > +     static constexpr int maxFrameLength = 0xffff;
> > > >  };
> > > >
> > > >  /*
> > > > @@ -30,7 +39,7 @@ public:
> > > >   */
> > > >
> > > >  CamHelperOv5647::CamHelperOv5647()
> > > > -     : CamHelper(new MdParserRPi())
> > > > +     : CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
> > > >  {
> > > >  }
> > > >
> > > > diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> > > > index 3bcc0815..1af5e74a 100644
> > > > --- a/src/ipa/raspberrypi/raspberrypi.cpp
> > > > +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> > > > @@ -53,6 +53,8 @@ namespace libcamera {
> > > >  /* Configure the sensor with these values initially. */
> > > >  #define DEFAULT_ANALOGUE_GAIN 1.0
> > > >  #define DEFAULT_EXPOSURE_TIME 20000
> > > > +#define DEFAULT_MIN_FRAME_DURATION (1e6 / 30.0)
> > > > +#define DEFAULT_MAX_FRAME_DURATION (1e6 / 0.01)
> > > >
> > > >  LOG_DEFINE_CATEGORY(IPARPI)
> > > >
> > > > @@ -136,6 +138,9 @@ private:
> > > >       /* LS table allocation passed in from the pipeline handler. */
> > > >       uint32_t lsTableHandle_;
> > > >       void *lsTable_;
> > > > +
> > > > +     /* Frame duration (1/fps) given in microseconds. */
> > > > +     double minFrameDuration_, maxFrameDuration_;
> > >
> > > Shouldn't these be float, or uint32_t depending on the outcome of the
> > > discussion for patch 1/3 ?
> >
> > Ack.
> >
> > > >  };
> > > >
> > > >  int IPARPi::init(const IPASettings &settings)
> > > > @@ -252,13 +257,20 @@ void IPARPi::configure(const CameraSensorInfo &sensorInfo,
> > > >               controller_.Initialise();
> > > >               controllerInit_ = true;
> > > >
> > > > -             /* Calculate initial values for gain and exposure. */
> > > > +             /* Calculate initial values for gain, vblank, and exposure */
> > > > +             minFrameDuration_ = DEFAULT_MIN_FRAME_DURATION;
> > > > +             maxFrameDuration_ = DEFAULT_MAX_FRAME_DURATION;
> > > > +
> > > > +             double exposure = DEFAULT_EXPOSURE_TIME;
> > > > +             int32_t vblank = helper_->GetVBlanking(exposure, minFrameDuration_,
> > > > +                                                    maxFrameDuration_);
> > > > +             int32_t exposure_lines = helper_->ExposureLines(exposure);
> > > >               int32_t gain_code = helper_->GainCode(DEFAULT_ANALOGUE_GAIN);
> > > > -             int32_t exposure_lines = helper_->ExposureLines(DEFAULT_EXPOSURE_TIME);
> > > >
> > > >               ControlList ctrls(unicam_ctrls_);
> > > > -             ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > > > +             ctrls.set(V4L2_CID_VBLANK, vblank);
> > > >               ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> > > > +             ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > > >
> > > >               IPAOperationData op;
> > > >               op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
> > > > @@ -630,6 +642,17 @@ void IPARPi::queueRequest(const ControlList &controls)
> > > >                       break;
> > > >               }
> > > >
> > > > +             case controls::FRAME_DURATIONS: {
> > > > +                     auto frameDurations = ctrl.second.get<Span<const float>>();
> > > > +
> > > > +                     /* This will be applied once AGC recalculations occur. */
> > > > +                     minFrameDuration_ = frameDurations[0];
> > > > +                     maxFrameDuration_ = frameDurations[1];
> > >
> > > Should the values be adjusted based on the sensor capabilities ?
> >
> > They do, but separately in the CamHelper::GetVBlanking() call.  Unless
> > I have misunderstood the comment?
>
> I think I was just confused, this seems fine.
>
> > > > +                     libcameraMetadata_.set(controls::FrameDurations,
> > > > +                                            { frameDurations[0], frameDurations[1] });
> > >
> > > Hmmm... From an application point of view, wouldn't it be more useful to
> > > know what frame duration was actually used for the current frame ? You
> > > don't have to implement this as part of this series, but I could imagine
> > > a FrameDuration metadata control being useful for that purpose.
> > > FrameDurations would be set by applications, and FrameDuration would be
> > > reported through metadata. (On a side note, would FrameDurationLimits be
> > > a better name than FrameDuration then ?)
> >
> > Yes, good point.  I can update this to reflect the actual frame durations used.
> > Happy to rename FrameDuration -> FrameDurationLimits.
> >
> > > > +                     break;
> > > > +             }
> > > > +
> > > >               default:
> > > >                       LOG(IPARPI, Warning)
> > > >                               << "Ctrl " << controls::controls.at(ctrl.first)->name()
> > > > @@ -795,7 +818,12 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
> > > >       op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
> > > >
> > > >       int32_t gain_code = helper_->GainCode(agcStatus->analogue_gain);
> > > > -     int32_t exposure_lines = helper_->ExposureLines(agcStatus->shutter_time);
> > > > +
> > > > +     /* GetVBlanking might clip exposure time to the fps limits. */
> > > > +     double exposure = agcStatus->shutter_time;
> > > > +     int32_t vblanking = helper_->GetVBlanking(exposure, minFrameDuration_,
> > > > +                                               maxFrameDuration_);
> > >
> > > I had completely missed, when reviewing the GetVBlanking()
> > > implementation above, that the first parameter was a reference and was
> > > used to return a value. We tend in libcamera to use const references for
> > > input parameters, and pointers for output or in/out parameters to make
> > > that clearer. It's not ideal though, as pointer can be null, and using
> > > references when a parameter should not be null is nicer. No need to
> > > change the code here, but maybe a comment on top of GetVBlanking() would
> > > be useful to point out that the exposure time can be adjusted.
> >
> > Ack.
> >
> > > > +     int32_t exposure_lines = helper_->ExposureLines(exposure);
> > > >
> > > >       if (unicam_ctrls_.find(V4L2_CID_ANALOGUE_GAIN) == unicam_ctrls_.end()) {
> > > >               LOG(IPARPI, Error) << "Can't find analogue gain control";
> > > > @@ -807,14 +835,20 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
> > > >               return;
> > > >       }
> > > >
> > > > -     LOG(IPARPI, Debug) << "Applying AGC Exposure: " << agcStatus->shutter_time
> > > > -                        << " (Shutter lines: " << exposure_lines << ") Gain: "
> > > > +     LOG(IPARPI, Debug) << "Applying AGC Exposure: " << exposure
> > > > +                        << " (Shutter lines: " << exposure_lines << ", AGC requested "
> > > > +                        << agcStatus->shutter_time << ") Gain: "
> > > >                          << agcStatus->analogue_gain << " (Gain Code: "
> > > >                          << gain_code << ")";
> > >
> > > Could the AGC algorithm get confused if it requests a certain exposure,
> > > and a smaller value is set in the sensor ? Will it keep trying to push
> > > the exposure up to increase the luminance of the scene without
> > > noticing it's pointless ?
> >
> > Yes, it can, and yet it will :)
> > However, there is no harm done.  As long as the AGC algorithm gets
> > given the actual exposure time / gain used for the frame - rather than
> > what it requested - it will fill up the rest of the exposure with
> > digital gain to get the expected brightness.  This is why it is
> > important for CamHelper::GetVBlanking() to return the exposure
> > that would actually be used.
> >
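A minimal sketch of the compensation described above: the AGC divides the exposure it requested by the exposure actually programmed, and applies the ratio as digital gain so the frame brightness still matches. This is illustrative only, not the Raspberry Pi AGC code:

```cpp
#include <cassert>

/*
 * When the sensor clips the requested exposure (e.g. to honour a
 * maximum frame duration), make up the shortfall with digital gain.
 * actualExposureUs is the value returned through GetVBlanking()'s
 * reference parameter after clipping.
 */
double compensateDigitalGain(double requestedExposureUs,
			     double actualExposureUs)
{
	return requestedExposureUs / actualExposureUs;
}
```

For example, if 40 ms was requested but only 20 ms could be programmed, a digital gain of 2.0 restores the expected brightness.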
> > David and I did discuss passing in the framerate limits into the AGC
> > algorithm, but we felt it added complexity for no real gain in
> > functionality.  The AGC algorithm would not behave any differently in
> > either case.  What could help a user who wanted a very specific
> > behavior with framerate limits would be to set up a suitable exposure
> > profile (division of shutter speed and gain) in the tuning file.
>
> OK.
>
> > > >
> > > >       ControlList ctrls(unicam_ctrls_);
> > > > -     ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > > > +     /*
> > > > +      * VBLANK must be set before EXPOSURE as the former will adjust the
> > > > +      * limits of the latter control.
> > > > +      */
> > >
> > > This will be nasty on the V4L2 side. We write multiple controls with a
> > > single VIDIOC_S_EXT_CTRLS call, and the V4L2 control framework will
> > > first clamp all control values to the limits before setting any control.
> > > The limits for the exposure time will thus not be adjusted before its
> > > value is clamped :-( I'm not sure how to best solve this. Ideally the
> > > problem should be fixed in the control framework, but I doubt that will
> > > happen. We need to at least ask Hans to see what options we have (or
> > > rather, what options we don't have). One easy hack may be to report a
> > > large range for the exposure time limits in the camera sensor drivers,
> > > and adjust the value in .s_ctrl() instead of relying on the control
> > > framework to do that for us.
> >
> > Ack.  This needs a bit more thinking.
> >
> > If I were to separate out the VBLANK set ctrl with the EXPOSURE set
> > ctrl would that help here?  Note that I cannot easily do that right
> > now as the staggered writer only has an interface to set a single list
> > of ctrls.
> >
> > I suppose the current behavior means that we get 1 frame of exposure
> > that is possibly wrong or clipped before the vblank has updated the
> > ctrl limits in the sensor driver.
>
> The more I think about it, the more I believe it's a deficiency of the
> V4L2 API, and we should work around it by reporting exposure time limits
> that don't depend on the blanking, and clamping the value in .s_ctrl().
> We would otherwise need much more complexity on the userspace side, and
> it's not worth it.

Ack, do you think it's probably best kept as a separate thread of work?
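For reference, the workaround being proposed amounts to something like the clamp below inside a sensor driver's .s_ctrl() handler: report wide static exposure limits to userspace, and clamp the requested value against the currently programmed frame length at set time instead of relying on the control framework. All names here are a hypothetical sketch, not a real driver API:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

/*
 * Clamp a requested exposure (in lines) against the frame length
 * implied by the current vblank, keeping the mandatory
 * frameIntegrationDiff margin between frame length and integration
 * time. A driver would call this from its .s_ctrl() handler for
 * V4L2_CID_EXPOSURE, so the control framework's stale limits no
 * longer matter.
 */
uint32_t clampExposure(uint32_t requestedLines, uint32_t height,
		       uint32_t vblank, uint32_t frameIntegrationDiff)
{
	uint32_t maxExposure = height + vblank - frameIntegrationDiff;
	return std::min(requestedLines, maxExposure);
}
```

With this in place, VBLANK and EXPOSURE written in a single VIDIOC_S_EXT_CTRLS call no longer depend on the framework re-evaluating the exposure limits first.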

>
> > > > +     ctrls.set(V4L2_CID_VBLANK, vblanking);
> > > >       ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
> > > > +     ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
> > > >       op.controls.push_back(ctrls);
> > > >       queueFrameAction.emit(0, op);
> > > >  }
> > > > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > > > index 21a1d7f7..948290e2 100644
> > > > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > > > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > > > @@ -1159,7 +1159,9 @@ void RPiCameraData::queueFrameAction(unsigned int frame, const IPAOperationData
> > > >               if (!staggeredCtrl_) {
> > > >                       staggeredCtrl_.init(unicam_[Unicam::Image].dev(),
> > > >                                           { { V4L2_CID_ANALOGUE_GAIN, action.data[0] },
> > > > +                                           { V4L2_CID_VBLANK, action.data[1] },
> > > >                                             { V4L2_CID_EXPOSURE, action.data[1] } });
> > >
> > > So the initial value of the exposure time and the vertical blanking are
> > > always identical ?
> >
> > This is setting up the delay in applying the ctrl for the staggered
> > writer.  So I would expect EXPOSURE and VBLANK to have the same delay
> > in the sensor.
>
> Oops, I have misunderstood this, my bad.
>
> > > > +
> > > >                       sensorMetadata_ = action.data[2];
> > > >               }
> > > >
>
> --
> Regards,
>
> Laurent Pinchart

Regards,
Naush

Patch

diff --git a/include/ipa/raspberrypi.h b/include/ipa/raspberrypi.h
index c109469e..74954db2 100644
--- a/include/ipa/raspberrypi.h
+++ b/include/ipa/raspberrypi.h
@@ -51,6 +51,7 @@  static const ControlInfoMap RPiControls = {
 	{ &controls::Brightness, ControlInfo(-1.0f, 1.0f) },
 	{ &controls::Contrast, ControlInfo(0.0f, 32.0f) },
 	{ &controls::Saturation, ControlInfo(0.0f, 32.0f) },
+	{ &controls::FrameDurations, ControlInfo(1.0e3f, 1.0e9f) }
 };
 
 } /* namespace libcamera */
diff --git a/src/ipa/raspberrypi/cam_helper.cpp b/src/ipa/raspberrypi/cam_helper.cpp
index 7f05d2c6..06732241 100644
--- a/src/ipa/raspberrypi/cam_helper.cpp
+++ b/src/ipa/raspberrypi/cam_helper.cpp
@@ -34,8 +34,10 @@  CamHelper *CamHelper::Create(std::string const &cam_name)
 	return nullptr;
 }
 
-CamHelper::CamHelper(MdParser *parser)
-	: parser_(parser), initialized_(false)
+CamHelper::CamHelper(MdParser *parser, unsigned int maxFrameLength,
+		     unsigned int frameIntegrationDiff)
+	: parser_(parser), initialized_(false), maxFrameLength_(maxFrameLength),
+	  frameIntegrationDiff_(frameIntegrationDiff)
 {
 }
 
@@ -56,6 +58,38 @@  double CamHelper::Exposure(uint32_t exposure_lines) const
 	return exposure_lines * mode_.line_length / 1000.0;
 }
 
+uint32_t CamHelper::GetVBlanking(double &exposure, double minFrameDuration,
+				 double maxFrameDuration) const
+{
+	uint32_t frameLengthMin, frameLengthMax, vblank;
+	uint32_t exposureLines = ExposureLines(exposure);
+
+	assert(initialized_);
+	/*
+	 * Clip frame length by the frame duration range and the maximum allowable
+	 * value in the sensor, given by maxFrameLength_.
+	 */
+	frameLengthMin = std::min<uint32_t>(1e3 * minFrameDuration / mode_.line_length,
+					    maxFrameLength_);
+	frameLengthMax = std::min<uint32_t>(1e3 * maxFrameDuration / mode_.line_length,
+					    maxFrameLength_);
+	/*
+	 * Limit the exposure to the maximum frame duration requested, and
+	 * re-calculate if it has been clipped.
+	 */
+	exposureLines = std::min(frameLengthMax - frameIntegrationDiff_,
+				 exposureLines);
+	exposure = Exposure(exposureLines);
+
+	/* Limit the vblank to the range allowed by the frame length limits. */
+	vblank = std::max<uint32_t>(mode_.height, exposureLines) -
+		 mode_.height + frameIntegrationDiff_;
+	vblank = std::max(vblank, frameLengthMin - mode_.height);
+	vblank = std::min(vblank, frameLengthMax - mode_.height);
+
+	return vblank;
+}
+
 void CamHelper::SetCameraMode(const CameraMode &mode)
 {
 	mode_ = mode;
diff --git a/src/ipa/raspberrypi/cam_helper.hpp b/src/ipa/raspberrypi/cam_helper.hpp
index 0c8aa29a..fdcdbddb 100644
--- a/src/ipa/raspberrypi/cam_helper.hpp
+++ b/src/ipa/raspberrypi/cam_helper.hpp
@@ -68,12 +68,15 @@  class CamHelper
 {
 public:
 	static CamHelper *Create(std::string const &cam_name);
-	CamHelper(MdParser *parser);
+	CamHelper(MdParser *parser, unsigned int maxFrameLength,
+		  unsigned int frameIntegrationDiff);
 	virtual ~CamHelper();
 	void SetCameraMode(const CameraMode &mode);
 	MdParser &Parser() const { return *parser_; }
 	uint32_t ExposureLines(double exposure_us) const;
 	double Exposure(uint32_t exposure_lines) const; // in us
+	virtual uint32_t GetVBlanking(double &exposure_us, double minFrameDuration,
+				      double maxFrameDuration) const;
 	virtual uint32_t GainCode(double gain) const = 0;
 	virtual double Gain(uint32_t gain_code) const = 0;
 	virtual void GetDelays(int &exposure_delay, int &gain_delay) const;
@@ -83,10 +86,20 @@  public:
 	virtual unsigned int MistrustFramesStartup() const;
 	virtual unsigned int MistrustFramesModeSwitch() const;
 	virtual CamTransform GetOrientation() const;
+
 protected:
 	MdParser *parser_;
 	CameraMode mode_;
+
+private:
 	bool initialized_;
+	/* Largest possible frame length, in units of lines. */
+	unsigned int maxFrameLength_;
+	/*
+	 * Smallest difference between the frame length and integration time,
+	 * in units of lines.
+	 */
+	unsigned int frameIntegrationDiff_;
 };
 
 // This is for registering camera helpers with the system, so that the
diff --git a/src/ipa/raspberrypi/cam_helper_imx219.cpp b/src/ipa/raspberrypi/cam_helper_imx219.cpp
index 35c6597c..fd95a31a 100644
--- a/src/ipa/raspberrypi/cam_helper_imx219.cpp
+++ b/src/ipa/raspberrypi/cam_helper_imx219.cpp
@@ -50,13 +50,22 @@  public:
 	unsigned int MistrustFramesModeSwitch() const override;
 	bool SensorEmbeddedDataPresent() const override;
 	CamTransform GetOrientation() const override;
+
+private:
+	/*
+	 * Smallest difference between the frame length and integration time,
+	 * in units of lines.
+	 */
+	static constexpr int frameIntegrationDiff = 4;
+	/* Largest possible frame length, in units of lines. */
+	static constexpr int maxFrameLength = 0xffff;
 };
 
 CamHelperImx219::CamHelperImx219()
 #if ENABLE_EMBEDDED_DATA
 	: CamHelper(new MdParserImx219())
 #else
-	: CamHelper(new MdParserRPi())
+	: CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
 #endif
 {
 }
diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
index 69544456..4a1cab76 100644
--- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
+++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
@@ -39,10 +39,19 @@  public:
 	double Gain(uint32_t gain_code) const override;
 	bool SensorEmbeddedDataPresent() const override;
 	CamTransform GetOrientation() const override;
+
+private:
+	/*
+	 * Smallest difference between the frame length and integration time,
+	 * in units of lines.
+	 */
+	static constexpr int frameIntegrationDiff = 22;
+	/* Largest possible frame length, in units of lines. */
+	static constexpr int maxFrameLength = 0xffdc;
 };
 
 CamHelperImx477::CamHelperImx477()
-	: CamHelper(new MdParserImx477())
+	: CamHelper(new MdParserImx477(), maxFrameLength, frameIntegrationDiff)
 {
 }
 
diff --git a/src/ipa/raspberrypi/cam_helper_ov5647.cpp b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
index 3dbcb164..d814fa90 100644
--- a/src/ipa/raspberrypi/cam_helper_ov5647.cpp
+++ b/src/ipa/raspberrypi/cam_helper_ov5647.cpp
@@ -22,6 +22,15 @@  public:
 	unsigned int HideFramesModeSwitch() const override;
 	unsigned int MistrustFramesStartup() const override;
 	unsigned int MistrustFramesModeSwitch() const override;
+
+private:
+	/*
+	 * Smallest difference between the frame length and integration time,
+	 * in units of lines.
+	 */
+	static constexpr int frameIntegrationDiff = 4;
+	/* Largest possible frame length, in units of lines. */
+	static constexpr int maxFrameLength = 0xffff;
 };
 
 /*
@@ -30,7 +39,7 @@  public:
  */
 
 CamHelperOv5647::CamHelperOv5647()
-	: CamHelper(new MdParserRPi())
+	: CamHelper(new MdParserRPi(), maxFrameLength, frameIntegrationDiff)
 {
 }
 
diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
index 3bcc0815..1af5e74a 100644
--- a/src/ipa/raspberrypi/raspberrypi.cpp
+++ b/src/ipa/raspberrypi/raspberrypi.cpp
@@ -53,6 +53,8 @@  namespace libcamera {
 /* Configure the sensor with these values initially. */
 #define DEFAULT_ANALOGUE_GAIN 1.0
 #define DEFAULT_EXPOSURE_TIME 20000
+#define DEFAULT_MIN_FRAME_DURATION (1e6 / 30.0)
+#define DEFAULT_MAX_FRAME_DURATION (1e6 / 0.01)
 
 LOG_DEFINE_CATEGORY(IPARPI)
 
@@ -136,6 +138,9 @@  private:
 	/* LS table allocation passed in from the pipeline handler. */
 	uint32_t lsTableHandle_;
 	void *lsTable_;
+
+	/* Frame duration (1/fps) given in microseconds. */
+	double minFrameDuration_, maxFrameDuration_;
 };
 
 int IPARPi::init(const IPASettings &settings)
@@ -252,13 +257,20 @@  void IPARPi::configure(const CameraSensorInfo &sensorInfo,
 		controller_.Initialise();
 		controllerInit_ = true;
 
-		/* Calculate initial values for gain and exposure. */
+		/* Calculate initial values for gain, vblank, and exposure */
+		minFrameDuration_ = DEFAULT_MIN_FRAME_DURATION;
+		maxFrameDuration_ = DEFAULT_MAX_FRAME_DURATION;
+
+		double exposure = DEFAULT_EXPOSURE_TIME;
+		int32_t vblank = helper_->GetVBlanking(exposure, minFrameDuration_,
+						       maxFrameDuration_);
+		int32_t exposure_lines = helper_->ExposureLines(exposure);
 		int32_t gain_code = helper_->GainCode(DEFAULT_ANALOGUE_GAIN);
-		int32_t exposure_lines = helper_->ExposureLines(DEFAULT_EXPOSURE_TIME);
 
 		ControlList ctrls(unicam_ctrls_);
-		ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
+		ctrls.set(V4L2_CID_VBLANK, vblank);
 		ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
+		ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
 
 		IPAOperationData op;
 		op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
@@ -630,6 +642,17 @@  void IPARPi::queueRequest(const ControlList &controls)
 			break;
 		}
 
+		case controls::FRAME_DURATIONS: {
+			auto frameDurations = ctrl.second.get<Span<const float>>();
+
+			/* This will be applied once AGC recalculations occur. */
+			minFrameDuration_ = frameDurations[0];
+			maxFrameDuration_ = frameDurations[1];
+			libcameraMetadata_.set(controls::FrameDurations,
+					       { frameDurations[0], frameDurations[1] });
+			break;
+		}
+
 		default:
 			LOG(IPARPI, Warning)
 				<< "Ctrl " << controls::controls.at(ctrl.first)->name()
@@ -795,7 +818,12 @@  void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
 	op.operation = RPI_IPA_ACTION_V4L2_SET_STAGGERED;
 
 	int32_t gain_code = helper_->GainCode(agcStatus->analogue_gain);
-	int32_t exposure_lines = helper_->ExposureLines(agcStatus->shutter_time);
+
+	/* GetVBlanking might clip exposure time to the fps limits. */
+	double exposure = agcStatus->shutter_time;
+	int32_t vblanking = helper_->GetVBlanking(exposure, minFrameDuration_,
+						  maxFrameDuration_);
+	int32_t exposure_lines = helper_->ExposureLines(exposure);
 
 	if (unicam_ctrls_.find(V4L2_CID_ANALOGUE_GAIN) == unicam_ctrls_.end()) {
 		LOG(IPARPI, Error) << "Can't find analogue gain control";
@@ -807,14 +835,20 @@  void IPARPi::applyAGC(const struct AgcStatus *agcStatus)
 		return;
 	}
 
-	LOG(IPARPI, Debug) << "Applying AGC Exposure: " << agcStatus->shutter_time
-			   << " (Shutter lines: " << exposure_lines << ") Gain: "
+	LOG(IPARPI, Debug) << "Applying AGC Exposure: " << exposure
+			   << " (Shutter lines: " << exposure_lines << ", AGC requested "
+			   << agcStatus->shutter_time << ") Gain: "
 			   << agcStatus->analogue_gain << " (Gain Code: "
 			   << gain_code << ")";
 
 	ControlList ctrls(unicam_ctrls_);
-	ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
+	/*
+	 * VBLANK must be set before EXPOSURE as the former will adjust the
+	 * limits of the latter control.
+	 */
+	ctrls.set(V4L2_CID_VBLANK, vblanking);
 	ctrls.set(V4L2_CID_EXPOSURE, exposure_lines);
+	ctrls.set(V4L2_CID_ANALOGUE_GAIN, gain_code);
 	op.controls.push_back(ctrls);
 	queueFrameAction.emit(0, op);
 }
diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
index 21a1d7f7..948290e2 100644
--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
@@ -1159,7 +1159,9 @@  void RPiCameraData::queueFrameAction(unsigned int frame, const IPAOperationData
 		if (!staggeredCtrl_) {
 			staggeredCtrl_.init(unicam_[Unicam::Image].dev(),
 					    { { V4L2_CID_ANALOGUE_GAIN, action.data[0] },
+					      { V4L2_CID_VBLANK, action.data[1] },
 					      { V4L2_CID_EXPOSURE, action.data[1] } });
+
 			sensorMetadata_ = action.data[2];
 		}
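
For readers following the patch, the interaction between `GetVBlanking()` and the frame duration limits can be sketched as below. This is a simplified illustration, not the actual `CamHelper` implementation: the struct, function names, parameters, and the microsecond line-length units are assumptions chosen for clarity. The idea it demonstrates is the one described in the commit message: the frame length is stretched to accommodate the requested exposure within the min/max frame duration limits, and the exposure is clipped only when it cannot fit, so exposure time takes priority over the AGC's frame rate.

```cpp
#include <algorithm>
#include <cstdint>

/* Hypothetical sensor mode description; units are assumptions. */
struct SensorMode {
	uint32_t height;   /* active lines per frame */
	double lineLength; /* duration of one line, in microseconds */
};

/*
 * Compute the VBLANK value (in lines) for a requested exposure and the
 * frame duration limits. The exposure is passed by reference and clipped
 * in place if it cannot fit within the maximum frame duration, mirroring
 * the "GetVBlanking might clip exposure time" behaviour in the patch.
 */
static uint32_t getVBlanking(double &exposureUs, double minFrameDurationUs,
			     double maxFrameDurationUs, const SensorMode &mode,
			     uint32_t maxFrameLength,
			     uint32_t frameIntegrationDiff)
{
	uint32_t exposureLines =
		static_cast<uint32_t>(exposureUs / mode.lineLength);
	uint32_t frameLengthMin =
		static_cast<uint32_t>(minFrameDurationUs / mode.lineLength);
	uint32_t frameLengthMax = std::min<uint32_t>(
		static_cast<uint32_t>(maxFrameDurationUs / mode.lineLength),
		maxFrameLength);

	/*
	 * The frame must be at least frameIntegrationDiff lines longer than
	 * the integration time, bounded by the frame duration limits.
	 */
	uint32_t frameLength = std::clamp(exposureLines + frameIntegrationDiff,
					  frameLengthMin, frameLengthMax);

	/* Clip the exposure if the frame cannot be stretched far enough. */
	exposureLines = std::min(exposureLines,
				 frameLength - frameIntegrationDiff);
	exposureUs = exposureLines * mode.lineLength;

	return frameLength - mode.height;
}
```

With a 1080-line mode and a 30 us line time, a 20 ms exposure fits inside a 30 fps frame, so only the minimum vblank is returned; a 50 ms exposure instead stretches the frame length (lowering the frame rate), exactly the adaptive-framerate behaviour the commit message describes.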