[v2,00/10] libipa: agc: Calculate exposure limits

Message ID 20251028-exposure-limits-v2-0-a8b5a318323e@ideasonboard.com

Message

Jacopo Mondi Oct. 28, 2025, 9:31 a.m. UTC
By definition, the exposure time that can be programmed on a sensor is
limited by the frame duration and whenever the frame duration is
updated, the exposure time limits should be updated as well.

This is not the case for IPAs using the AgcMeanLuminance algorithm from
libipa, where the exposure limits are computed at configure() time and
never updated.

This has two implications:
1) The AGC algorithms will always operate with an exposure time bound to
   the frame duration programmed at startup time
2) The Camera::controls() limits are not updated to reflect the new
   constraints

This series addresses issue 1) by changing the AgcMeanLuminance class
and the associated helpers to regulate the exposure time based on the
currently programmed maximum frame duration, using a sensor-specific
margin that is now reported by the CameraHelper class.
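As a rough sketch of the idea (this is not the series' actual code; the
helper name and signature below are just placeholders), the upper
exposure limit follows the programmed frame duration minus the margin:

  #include <algorithm>

  #include <libcamera/base/utils.h>

  using libcamera::utils::Duration;

  /* Hypothetical helper: bound the exposure limit by the frame duration. */
  Duration maxExposure(Duration maxFrameDuration, Duration exposureMargin,
                       Duration sensorMaxExposure)
  {
          return std::min<Duration>(maxFrameDuration - exposureMargin,
                                    sensorMaxExposure);
  }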

As the AgcMeanLuminance algorithm tries to push the exposure time up
to the maximum allowed frame duration, initializing the algorithm
with the sensor's maximum capabilities results in a very low frame rate,
unless the user specifies a FrameDurationLimits range in which the
algorithm should operate. To avoid slowing the frame rate down too much,
a "default" frame rate is computed at algorithm configuration time, where
the canonical 30, 15 and 10 FPS values are tested and selected in order
of preference.
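A minimal sketch of that selection, assuming hypothetical
minFrameDuration/maxFrameDuration inputs derived from the sensor limits
(the series' actual code may differ):

  #include <array>
  #include <chrono>

  #include <libcamera/base/utils.h>

  using libcamera::utils::Duration;

  /* Pick the first of 30, 15 or 10 FPS that fits the sensor limits. */
  Duration defaultFrameDuration(Duration minFrameDuration,
                                Duration maxFrameDuration)
  {
          const std::array<Duration, 3> candidates{
                  std::chrono::duration<double>(1.0 / 30.0),
                  std::chrono::duration<double>(1.0 / 15.0),
                  std::chrono::duration<double>(1.0 / 10.0),
          };

          for (Duration candidate : candidates) {
                  if (candidate >= minFrameDuration &&
                      candidate <= maxFrameDuration)
                          return candidate;
          }

          /* No canonical rate fits: run at the sensor's fastest rate. */
          return minFrameDuration;
  }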

The last patch is not for inclusion, as Mali support needs to be
finalized upstream first, but it shows how it is possible to handle
frame duration limits and exposure with the new model (and I needed that
patch for testing on Mali anyway).

Tested with camshark on Mali-C55, and on RkISP1 with a test script that
sets a longer frame duration (66 ms) and validates that the exposure
time is increased to match the new limits.

RkISP1Agc agc.cpp:644 Divided up exposure time, analogue gain, quantization gain and digital gain are 33251.27us, 10.6667, 1.00018 and 6.49639
61.545331 (47.58 fps) cam0-stream0 seq: 000050 bytesused: 2073600/1036800
RkISP1Agc agc.cpp:644 Divided up exposure time, analogue gain, quantization gain and digital gain are 66578.16us, 10.6667, 1.00018 and 3.24451
61.566241 (47.82 fps) cam0-stream0 seq: 000051 bytesused: 2073600/1036800

Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
---
Changes in v2:
- Rework of v1 that moves the exposure limits computation to the
  AgcMeanLuminance exposure helpers

---
Jacopo Mondi (9):
      ipa: camera_sensor_helper: Introduce exposureMargin()
      ipa: ipu3: Move CameraHelper to context
      ipa: libipa: agc: Make Agc::configure() set limits
      ipa: libipa: agc: Configure with frame duration
      ipa: libipa: agc: Initialize exposure with frame duration
      ipa: libipa: agc: Set exposure limits with frame duration
      ipa: libipa: agc: Initialize a sensible frame duration
      ipa: libipa: agc: Calculate frame duration in helper
      [DNI] ipa: mali: Handle FrameDurationLimits

Kieran Bingham (1):
      ipa: mali-c55: Move CameraHelper to context

 src/ipa/ipu3/algorithms/agc.cpp         |  32 +++++++---
 src/ipa/ipu3/ipa_context.cpp            |   3 +
 src/ipa/ipu3/ipa_context.h              |   3 +
 src/ipa/ipu3/ipu3.cpp                   |  22 +++----
 src/ipa/libipa/agc_mean_luminance.cpp   |  73 ++++++++++++++++++---
 src/ipa/libipa/agc_mean_luminance.h     |  18 +++++-
 src/ipa/libipa/camera_sensor_helper.cpp |  72 +++++++++++++++++++++
 src/ipa/libipa/camera_sensor_helper.h   |   2 +
 src/ipa/libipa/exposure_mode_helper.cpp | 109 +++++++++++++++++++++++++++-----
 src/ipa/libipa/exposure_mode_helper.h   |  16 ++++-
 src/ipa/mali-c55/algorithms/agc.cpp     |  76 +++++++++++++++++++---
 src/ipa/mali-c55/ipa_context.cpp        |   6 ++
 src/ipa/mali-c55/ipa_context.h          |  12 ++++
 src/ipa/mali-c55/mali-c55.cpp           |  44 ++++++-------
 src/ipa/rkisp1/algorithms/agc.cpp       |  42 ++++++------
 15 files changed, 430 insertions(+), 100 deletions(-)
---
base-commit: 36f9cdcdb4a3c69e10b7da8a3de8354421cdcb56
change-id: 20251017-exposure-limits-5c48252fe548

Best regards,

Comments

Hans de Goede Oct. 28, 2025, 9:53 a.m. UTC | #1
Hi Jacopo,

On 28-Oct-25 10:31 AM, Jacopo Mondi wrote:
> By definition, the exposure time that can be programmed on a sensor is
> limited by the frame duration and whenever the frame duration is
> updated, the exposure time limits should be updated as well.
> 
> This is not the case for IPAs using the AgcMeanLuminance algorithm from
> libipa, where the exposure limits are computed at configure() time and
> never updated.
> 
> This has two implications:
> 1) The AGC algorithms will always operate with an exposure time bound to
>    the frame duration programmed at startup time
> 2) The Camera::controls() limits are not updated to reflect the new
>    constraints
> 
> This series addresses issue 1) by changing the AgcMeanLuminance class
> and the associated helpers to regulate the exposure time base on the
> currently programmed max frame duration using a sensor-specific margin
> that is now reported by the CameraHelper class.

I have a question about the above paragraph. The sensor-specific margin
is already known by the kernel's sensor driver and properly written
sensor drivers will update the exposure range reported by the query-ctrl
ioctl after changing the vblank control.

So I think it would be better to rely on the kernel for this info
and refresh the cached exposure-control limits when changing the vblank
control rather than duplicating this info in the CameraHelper class?

One could even derive the exposure-margin by computing the configured
frame-time (in a unit of lines) and then subtracting the max-exposure
reported by the sensor from that frame-time-in-lines.
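For example, something along these lines (a sketch with hypothetical
inputs, all in units of lines, assuming the driver really derives its
max-exposure from the frame length):

  #include <cstdint>

  /* Margin in lines = frame length in lines - driver-reported max exposure. */
  uint32_t exposureMarginLines(uint32_t frameHeight, uint32_t vblank,
                               uint32_t maxExposureLines)
  {
          uint32_t frameLengthLines = frameHeight + vblank;

          return frameLengthLines - maxExposureLines;
  }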

> As the AgcMeanLuminance algorithm tries to push the exposure time up
> to the maximum allowed frame duration, initializing the algorithm
> with the sensor's maximum capabilities results in a very low frame rate,
> unless the user specifies a FrameDurationLimits range in which the
> algorithm should operate. To avoid slowing the frame rate down too much
> compute a "default" frame rate at algorithm configuration time, were
> the canonical 30, 15 and 10 FPS values are tested and selected in order
> of preference.

I must admit I'm not familiar with the AgcMeanLuminance algorithm.
Does this algorithm also set / used to set the vblank control to
the maximum allowed frame duration? Or does it dynamically adjust
vblank when it needs more exposure "range" ?

I've been thinking about adding support to dynamically increase
vblank when more exposure time is needed in the softISP AGC code,
but it would probably be better to first convert that over to
the shared AGC helpers. I know that Kieran is working on this.

At least some UVC cameras have an auto-exposure priority (or is it mode?)
v4l2 control which can be set to "aperture priority" to make them
automatically increase the frame duration if necessary because of low
light conditions. I think we want something similar for libcamera's
AGC.

Regards,

Hans






> 
> The last patch is not for inclusion as Mali support needs to be
> finalized upstream first, but shows how is it possible to handle
> frame duration limits and exposure with the new model (and I needed that
> patch for testing on Mali anyway).
> 
> Tested with camshark on Mali-C55 and with a test scripts that sets a
> longer frame duration (66 msecs) and by validating that the exposure
> time is increased to match the new limits on RkISP1.
> 
> RkISP1Agc agc.cpp:644 Divided up exposure time, analogue gain, quantization gain and digital gain are 33251.27us, 10.6667, 1.00018 and 6.49639
> 61.545331 (47.58 fps) cam0-stream0 seq: 000050 bytesused: 2073600/1036800
> kISP1Agc agc.cpp:644 Divided up exposure time, analogue gain, quantization gain and digital gain are 66578.16us, 10.6667, 1.00018 and 3.24451
> 61.566241 (47.82 fps) cam0-stream0 seq: 000051 bytesused: 2073600/1036800
> 
> Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
> ---
> Changes in v2:
> - Rework of v1 that moves the exposure limits computation to the
>   AgcMeanLuminance exposure helpers
> 
> ---
> Jacopo Mondi (9):
>       ipa: camera_sensor_helper: Introduce exposureMargin()
>       ipa: ipu3: Move CameraHelper to context
>       ipa: libipa: agc: Make Agc::configure() set limits
>       ipa: libipa: agc: Configure with frame duration
>       ipa: libipa: agc: Initialize exposure with frame duration
>       ipa: libipa: agc: Set exposure limits with frame duration
>       ipa: libipa: agc: Initialize a sensible frame duration
>       ipa: libipa: agc: Calculate frame duration in helper
>       [DNI] ipa: mali: Handle FrameDurationLimits
> 
> Kieran Bingham (1):
>       ipa: mali-c55: Move CameraHelper to context
> 
>  src/ipa/ipu3/algorithms/agc.cpp         |  32 +++++++---
>  src/ipa/ipu3/ipa_context.cpp            |   3 +
>  src/ipa/ipu3/ipa_context.h              |   3 +
>  src/ipa/ipu3/ipu3.cpp                   |  22 +++----
>  src/ipa/libipa/agc_mean_luminance.cpp   |  73 ++++++++++++++++++---
>  src/ipa/libipa/agc_mean_luminance.h     |  18 +++++-
>  src/ipa/libipa/camera_sensor_helper.cpp |  72 +++++++++++++++++++++
>  src/ipa/libipa/camera_sensor_helper.h   |   2 +
>  src/ipa/libipa/exposure_mode_helper.cpp | 109 +++++++++++++++++++++++++++-----
>  src/ipa/libipa/exposure_mode_helper.h   |  16 ++++-
>  src/ipa/mali-c55/algorithms/agc.cpp     |  76 +++++++++++++++++++---
>  src/ipa/mali-c55/ipa_context.cpp        |   6 ++
>  src/ipa/mali-c55/ipa_context.h          |  12 ++++
>  src/ipa/mali-c55/mali-c55.cpp           |  44 ++++++-------
>  src/ipa/rkisp1/algorithms/agc.cpp       |  42 ++++++------
>  15 files changed, 430 insertions(+), 100 deletions(-)
> ---
> base-commit: 36f9cdcdb4a3c69e10b7da8a3de8354421cdcb56
> change-id: 20251017-exposure-limits-5c48252fe548
> 
> Best regards,
Robert Mader Oct. 28, 2025, 11:26 a.m. UTC | #2
Just gave it a go on my PinePhone Pro (rkisp1/rk3399) and can confirm it 
still fixes low framerates for me. So for the common and rkisp1 bits:

Tested-by: Robert Mader <robert.mader@collabora.com>

On 10/28/25 10:31, Jacopo Mondi wrote:
> By definition, the exposure time that can be programmed on a sensor is
> limited by the frame duration and whenever the frame duration is
> updated, the exposure time limits should be updated as well.
>
> This is not the case for IPAs using the AgcMeanLuminance algorithm from
> libipa, where the exposure limits are computed at configure() time and
> never updated.
>
> This has two implications:
> 1) The AGC algorithms will always operate with an exposure time bound to
>     the frame duration programmed at startup time
> 2) The Camera::controls() limits are not updated to reflect the new
>     constraints
>
> This series addresses issue 1) by changing the AgcMeanLuminance class
> and the associated helpers to regulate the exposure time base on the
> currently programmed max frame duration using a sensor-specific margin
> that is now reported by the CameraHelper class.
>
> As the AgcMeanLuminance algorithm tries to push the exposure time up
> to the maximum allowed frame duration, initializing the algorithm
> with the sensor's maximum capabilities results in a very low frame rate,
> unless the user specifies a FrameDurationLimits range in which the
> algorithm should operate. To avoid slowing the frame rate down too much
> compute a "default" frame rate at algorithm configuration time, were
> the canonical 30, 15 and 10 FPS values are tested and selected in order
> of preference.
>
> The last patch is not for inclusion as Mali support needs to be
> finalized upstream first, but shows how is it possible to handle
> frame duration limits and exposure with the new model (and I needed that
> patch for testing on Mali anyway).
>
> Tested with camshark on Mali-C55 and with a test scripts that sets a
> longer frame duration (66 msecs) and by validating that the exposure
> time is increased to match the new limits on RkISP1.
>
> RkISP1Agc agc.cpp:644 Divided up exposure time, analogue gain, quantization gain and digital gain are 33251.27us, 10.6667, 1.00018 and 6.49639
> 61.545331 (47.58 fps) cam0-stream0 seq: 000050 bytesused: 2073600/1036800
> kISP1Agc agc.cpp:644 Divided up exposure time, analogue gain, quantization gain and digital gain are 66578.16us, 10.6667, 1.00018 and 3.24451
> 61.566241 (47.82 fps) cam0-stream0 seq: 000051 bytesused: 2073600/1036800
>
> Signed-off-by: Jacopo Mondi<jacopo.mondi@ideasonboard.com>
> ---
> Changes in v2:
> - Rework of v1 that moves the exposure limits computation to the
>    AgcMeanLuminance exposure helpers
>
> ---
> Jacopo Mondi (9):
>        ipa: camera_sensor_helper: Introduce exposureMargin()
>        ipa: ipu3: Move CameraHelper to context
>        ipa: libipa: agc: Make Agc::configure() set limits
>        ipa: libipa: agc: Configure with frame duration
>        ipa: libipa: agc: Initialize exposure with frame duration
>        ipa: libipa: agc: Set exposure limits with frame duration
>        ipa: libipa: agc: Initialize a sensible frame duration
>        ipa: libipa: agc: Calculate frame duration in helper
>        [DNI] ipa: mali: Handle FrameDurationLimits
>
> Kieran Bingham (1):
>        ipa: mali-c55: Move CameraHelper to context
>
>   src/ipa/ipu3/algorithms/agc.cpp         |  32 +++++++---
>   src/ipa/ipu3/ipa_context.cpp            |   3 +
>   src/ipa/ipu3/ipa_context.h              |   3 +
>   src/ipa/ipu3/ipu3.cpp                   |  22 +++----
>   src/ipa/libipa/agc_mean_luminance.cpp   |  73 ++++++++++++++++++---
>   src/ipa/libipa/agc_mean_luminance.h     |  18 +++++-
>   src/ipa/libipa/camera_sensor_helper.cpp |  72 +++++++++++++++++++++
>   src/ipa/libipa/camera_sensor_helper.h   |   2 +
>   src/ipa/libipa/exposure_mode_helper.cpp | 109 +++++++++++++++++++++++++++-----
>   src/ipa/libipa/exposure_mode_helper.h   |  16 ++++-
>   src/ipa/mali-c55/algorithms/agc.cpp     |  76 +++++++++++++++++++---
>   src/ipa/mali-c55/ipa_context.cpp        |   6 ++
>   src/ipa/mali-c55/ipa_context.h          |  12 ++++
>   src/ipa/mali-c55/mali-c55.cpp           |  44 ++++++-------
>   src/ipa/rkisp1/algorithms/agc.cpp       |  42 ++++++------
>   15 files changed, 430 insertions(+), 100 deletions(-)
> ---
> base-commit: 36f9cdcdb4a3c69e10b7da8a3de8354421cdcb56
> change-id: 20251017-exposure-limits-5c48252fe548
>
> Best regards,
Jacopo Mondi Oct. 28, 2025, 11:42 a.m. UTC | #3
Hi Hans

On Tue, Oct 28, 2025 at 10:53:15AM +0100, Hans de Goede wrote:
> Hi Jacopo,
>
> On 28-Oct-25 10:31 AM, Jacopo Mondi wrote:
> > By definition, the exposure time that can be programmed on a sensor is
> > limited by the frame duration and whenever the frame duration is
> > updated, the exposure time limits should be updated as well.
> >
> > This is not the case for IPAs using the AgcMeanLuminance algorithm from
> > libipa, where the exposure limits are computed at configure() time and
> > never updated.
> >
> > This has two implications:
> > 1) The AGC algorithms will always operate with an exposure time bound to
> >    the frame duration programmed at startup time
> > 2) The Camera::controls() limits are not updated to reflect the new
> >    constraints
> >
> > This series addresses issue 1) by changing the AgcMeanLuminance class
> > and the associated helpers to regulate the exposure time base on the
> > currently programmed max frame duration using a sensor-specific margin
> > that is now reported by the CameraHelper class.
>
> I have a question about the above paragraph. The sensor-specific margin
> is already known by the kernel's sensor driver and properly written
> sensor drivers will update the exposure range reported by the query-ctrl
> ioctl after changing the vblank control.
>
> So I think it would be better to rely on the kernel for this info
> and refresh the cached exposure-control limits when changing the vblank
> control rather then duplicating this info in the CameraHelper class ?

We discussed this with Niklas in v1
https://patchwork.libcamera.org/patch/24695/#36372

My main concern (one more ioctl per frame aside) is that relying on
the kernel to update the control limits will introduce a delay of a
variable number of frames that corresponds to the pipeline depth plus
the control delays.

Whenever we process statistics for frame X and calculate a new vblank
value to accommodate the newly computed exposure value (within the
FrameDurationLimits), we push the new value to DelayedControls, which
caches it in a queue. Whenever a new frame is started the queue of
controls is popped and the controls get applied to the device. They
will take effect a few frames later (while the kernel interface
updates the control limits immediately, if I'm not mistaken).

The IPA operates "in the future" and as once it has consumed request X
(which will be realized on the device at frame X + depth) it shall
calculate new values from frame X+1. This means that once the IPA
state is updated to new limits, all calculations from that point on
should use the new state, and we can't rely on the device statee for this.

We currently do not properly track Request id/frame sequences, and we
kind of ignore the pipeline depth as we keep consuming requests as
soon as they arrive, but it's not unreasonable to assume that the
overall delay, once we properly track the depth and compute parameters
early enough, can be around 6 or 7 frames.

>
> One could even calculate the exposure-margin by calculating the configured
> frame-time (in a unit of lines) and then substracting the max-exposure
> reported by the sensor from the frame-time-in-lines.

Stefan suggested that too.

At configure() time you take the sensor's exposure time and the frame
length, and the difference between the two is the margin. However, there
is no indication anywhere that the sensor driver has to initialize
controls in this way, and considering how sensor drivers differ slightly
from one another I wouldn't bet on them being consistent.

>
> > As the AgcMeanLuminance algorithm tries to push the exposure time up
> > to the maximum allowed frame duration, initializing the algorithm
> > with the sensor's maximum capabilities results in a very low frame rate,
> > unless the user specifies a FrameDurationLimits range in which the
> > algorithm should operate. To avoid slowing the frame rate down too much
> > compute a "default" frame rate at algorithm configuration time, were
> > the canonical 30, 15 and 10 FPS values are tested and selected in order
> > of preference.
>
> I must admit I'm not familiar with the AgcMeanLuminance algorithm.
> Does this algorithm also set / used to set the vblank control to
> the maximum allowed frame duration? Or does it dynamically adjust
> vblank when it needs more exposure "range" ?

It didn't, as it didn't calculate the frame duration.

The only IPA that uses AgcMeanLuminance and handles VBLANK is the
RkISP1 one, and as you can see in 09/10 the way it does that is to take
the exposure time calculated by AgcMeanLuminance::calculateNewEV()
and use it as the desired frame duration. This is wrong, as it breaks
the assumption that we have a safe margin. The only reason it doesn't
break is that the driver clamps the exposure once VBLANK is applied, but
I presume this also causes a misalignment between what the algorithm
thinks the exposure time is and what is actually applied to the device
(and I presume this is what Stefan reported as causing oscillations when
I naively proposed to ignore the margin and let the driver adjust the
exposure).

This is the reason for 09/10: AgcMeanLuminance now calculates the
frame duration as well, using (the exposure time it has computed + margin),
and sends it back to the IPA, which transforms it into a proper vblank
value to be eventually set on the device.
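Roughly the conversion the IPA would perform (a sketch, not the code in
09/10; lineDuration and frameHeight are assumed to come from the sensor
configuration):

  #include <cstdint>

  #include <libcamera/base/utils.h>

  using libcamera::utils::Duration;

  /* frameDuration = (frameHeight + vblank) * lineDuration */
  uint32_t frameDurationToVblank(Duration frameDuration, Duration lineDuration,
                                 uint32_t frameHeight)
  {
          uint32_t frameLengthLines = frameDuration / lineDuration;

          return frameLengthLines - frameHeight;
  }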

>
> I've been thinking about adding support to dynamically increase
> vblank when more exposure time is needed in the softISP AGC code,
> but it would probably be better to first convert that over to
> the shared AGC helpers. I know that Kieran is working on this.

Yes, I think that's a desirable goal.

My whole effort in this series has been around moving as much as
possible of the logic to the shared algorithm, so that all IPAs using it
will get a consistent behaviour for free.

Also, patch 10/10 does for Mali what you want to do for Simple, so it
could serve as a reference.

>
> At least some UVC cameras have an auto-exposure priority (or is mode?)
> v4l2 control which can be set to "aperture priority" to make them
> automatically increase frameduration if necessary because of low
> light conditions. I think we want something similar for libcamera's
> AGC.

Yes, that's why we reworked the AGC-related controls to have a
separate ExposureTimeMode and AnalogueGainMode instead of the single
"big button" AeEnable.

To implement priorities, applications can fix one of the two and let the
algorithm adjust the other.

The exposure time will anyway always be clamped to the user-selected
FrameDurationLimits[], so that has to be properly adjusted too.
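For example (an illustrative sketch with made-up values): fix the
exposure time manually, leave the gain in auto mode, and give the AGC a
frame duration range to work within:

  #include <array>
  #include <cstdint>

  #include <libcamera/control_ids.h>
  #include <libcamera/controls.h>

  using namespace libcamera;

  void requestExposurePriority(ControlList &requestControls)
  {
          requestControls.set(controls::ExposureTimeMode,
                              controls::ExposureTimeModeManual);
          requestControls.set(controls::ExposureTime, 20000); /* 20 ms */

          requestControls.set(controls::AnalogueGainMode,
                              controls::AnalogueGainModeAuto);

          /* Let the AGC pick frame durations between 33 ms and 100 ms. */
          const std::array<int64_t, 2> limits{ 33333, 100000 };
          requestControls.set(controls::FrameDurationLimits, limits);
  }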

Thanks
  j
>
> Regards,
>
> Hans
>
>
>
>
>
>
> >
> > The last patch is not for inclusion as Mali support needs to be
> > finalized upstream first, but shows how is it possible to handle
> > frame duration limits and exposure with the new model (and I needed that
> > patch for testing on Mali anyway).
> >
> > Tested with camshark on Mali-C55 and with a test scripts that sets a
> > longer frame duration (66 msecs) and by validating that the exposure
> > time is increased to match the new limits on RkISP1.
> >
> > RkISP1Agc agc.cpp:644 Divided up exposure time, analogue gain, quantization gain and digital gain are 33251.27us, 10.6667, 1.00018 and 6.49639
> > 61.545331 (47.58 fps) cam0-stream0 seq: 000050 bytesused: 2073600/1036800
> > kISP1Agc agc.cpp:644 Divided up exposure time, analogue gain, quantization gain and digital gain are 66578.16us, 10.6667, 1.00018 and 3.24451
> > 61.566241 (47.82 fps) cam0-stream0 seq: 000051 bytesused: 2073600/1036800
> >
> > Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
> > ---
> > Changes in v2:
> > - Rework of v1 that moves the exposure limits computation to the
> >   AgcMeanLuminance exposure helpers
> >
> > ---
> > Jacopo Mondi (9):
> >       ipa: camera_sensor_helper: Introduce exposureMargin()
> >       ipa: ipu3: Move CameraHelper to context
> >       ipa: libipa: agc: Make Agc::configure() set limits
> >       ipa: libipa: agc: Configure with frame duration
> >       ipa: libipa: agc: Initialize exposure with frame duration
> >       ipa: libipa: agc: Set exposure limits with frame duration
> >       ipa: libipa: agc: Initialize a sensible frame duration
> >       ipa: libipa: agc: Calculate frame duration in helper
> >       [DNI] ipa: mali: Handle FrameDurationLimits
> >
> > Kieran Bingham (1):
> >       ipa: mali-c55: Move CameraHelper to context
> >
> >  src/ipa/ipu3/algorithms/agc.cpp         |  32 +++++++---
> >  src/ipa/ipu3/ipa_context.cpp            |   3 +
> >  src/ipa/ipu3/ipa_context.h              |   3 +
> >  src/ipa/ipu3/ipu3.cpp                   |  22 +++----
> >  src/ipa/libipa/agc_mean_luminance.cpp   |  73 ++++++++++++++++++---
> >  src/ipa/libipa/agc_mean_luminance.h     |  18 +++++-
> >  src/ipa/libipa/camera_sensor_helper.cpp |  72 +++++++++++++++++++++
> >  src/ipa/libipa/camera_sensor_helper.h   |   2 +
> >  src/ipa/libipa/exposure_mode_helper.cpp | 109 +++++++++++++++++++++++++++-----
> >  src/ipa/libipa/exposure_mode_helper.h   |  16 ++++-
> >  src/ipa/mali-c55/algorithms/agc.cpp     |  76 +++++++++++++++++++---
> >  src/ipa/mali-c55/ipa_context.cpp        |   6 ++
> >  src/ipa/mali-c55/ipa_context.h          |  12 ++++
> >  src/ipa/mali-c55/mali-c55.cpp           |  44 ++++++-------
> >  src/ipa/rkisp1/algorithms/agc.cpp       |  42 ++++++------
> >  15 files changed, 430 insertions(+), 100 deletions(-)
> > ---
> > base-commit: 36f9cdcdb4a3c69e10b7da8a3de8354421cdcb56
> > change-id: 20251017-exposure-limits-5c48252fe548
> >
> > Best regards,
>
Hans de Goede Oct. 28, 2025, 2:48 p.m. UTC | #4
Hi Jacopo,

On 28-Oct-25 12:42 PM, Jacopo Mondi wrote:
> Hi Hans
> 
> On Tue, Oct 28, 2025 at 10:53:15AM +0100, Hans de Goede wrote:
>> Hi Jacopo,
>>
>> On 28-Oct-25 10:31 AM, Jacopo Mondi wrote:
>>> By definition, the exposure time that can be programmed on a sensor is
>>> limited by the frame duration and whenever the frame duration is
>>> updated, the exposure time limits should be updated as well.
>>>
>>> This is not the case for IPAs using the AgcMeanLuminance algorithm from
>>> libipa, where the exposure limits are computed at configure() time and
>>> never updated.
>>>
>>> This has two implications:
>>> 1) The AGC algorithms will always operate with an exposure time bound to
>>>    the frame duration programmed at startup time
>>> 2) The Camera::controls() limits are not updated to reflect the new
>>>    constraints
>>>
>>> This series addresses issue 1) by changing the AgcMeanLuminance class
>>> and the associated helpers to regulate the exposure time base on the
>>> currently programmed max frame duration using a sensor-specific margin
>>> that is now reported by the CameraHelper class.
>>
>> I have a question about the above paragraph. The sensor-specific margin
>> is already known by the kernel's sensor driver and properly written
>> sensor drivers will update the exposure range reported by the query-ctrl
>> ioctl after changing the vblank control.
>>
>> So I think it would be better to rely on the kernel for this info
>> and refresh the cached exposure-control limits when changing the vblank
>> control rather then duplicating this info in the CameraHelper class ?
> 
> We discussed this with Niklas in v1
> https://patchwork.libcamera.org/patch/24695/#36372
> 
> My main concern (one more ioctl per frame apart) is that relying on
> the kernel to update the control limits will introduce a delay of a
> variable number of frames that correponds to the pipeline depth plus
> the control delays.
> 
> Whenever processing statistics for frame X and a new vblank value is
> calculated to accommodate the newly computed exposure value (within
> the FrameDurationLimits) we push the new value to DelayedControls
> which caches it in a queue. Whenever a new frame is started the
> queue of controls is popped and controls get applied to the device. They
> will take effect a few frames later (while the kernel interface
> updates the control limits immediately if I'm not mistaken).

Right, I forgot about the delayed-controls stuff; not being able
to get the new exposure limits from the kernel then makes sense.

Question: is there code somewhere to guarantee that when DelayedControls
changes vblank it applies the vblank before the exposure?

IIRC libcamera uses set-ext-ctrls to set them all at once, but
most (all?) sensor drivers do not implement this, so the kernel
just applies them 1-by-1 internally. And if vblank gets increased
only after exposure then for one frame exposure will still get
clamped based on the old vblank.

This problem actually sounds like a more likely cause for Stefan's
oscillation issue you mention below.

>> One could even calculate the exposure-margin by calculating the configured
>> frame-time (in a unit of lines) and then substracting the max-exposure
>> reported by the sensor from the frame-time-in-lines.
> 
> Stefan suggested that too.
> 
> When you configure() you take the sensor's exposure time and the frame
> length and the diff between the two is the margin. However there is no
> indication anywhere that the sensor driver has to initialize controls
> in this way and considering how slightly different sensor drivers are
> I wouldn't be on them being consistent.

Hmm, I would say that we need to fix the sensor drivers then. But since
we need the CameraHelper for the analogue-gain bits anyway, I guess that
having to add the exposure vs frame-length margin is not a big problem.

>>> As the AgcMeanLuminance algorithm tries to push the exposure time up
>>> to the maximum allowed frame duration, initializing the algorithm
>>> with the sensor's maximum capabilities results in a very low frame rate,
>>> unless the user specifies a FrameDurationLimits range in which the
>>> algorithm should operate. To avoid slowing the frame rate down too much
>>> compute a "default" frame rate at algorithm configuration time, were
>>> the canonical 30, 15 and 10 FPS values are tested and selected in order
>>> of preference.
>>
>> I must admit I'm not familiar with the AgcMeanLuminance algorithm.
>> Does this algorithm also set / used to set the vblank control to
>> the maximum allowed frame duration? Or does it dynamically adjust
>> vblank when it needs more exposure "range" ?
> 
> It didn't, as it didn't calculate the frame duration.
> 
> The only IPA that use AgcMeanLuminance and handles VBLANK is the
> RkISP1 and as you can see in 09/10 the way it does that is to take
> the exposure time calculated by AgcMeanLuminance::calculateNewEV()
> and use it as the desired frame duration. This is wrong as it breaks
> the assumption we have a safe margin. It only doesn't break as the
> driver will clamp exposure once VBLANK is applied, but I presume this
> also causes a mis-alignment between what the algorithm think the
> exposure time is and what is actually applied to the device (and I
> presume this is what Stefan has reported causing oscillations when I
> naively proposed to ignore the margin and let the driver adjust
> exposure).

See above, applying exposure before an increased vblank will lead
to a much bigger clamp which is a more likely cause of oscillation.

> This is the reason for 09/10: the AgcMeanLuminance now calculates the
> duration as well using the (exposure time it has computed + margin) and
> sends it back to the IPA that transforms it in a proper vblank value
> to be eventually set on the device.
> 
>>
>> I've been thinking about adding support to dynamically increase
>> vblank when more exposure time is needed in the softISP AGC code,
>> but it would probably be better to first convert that over to
>> the shared AGC helpers. I know that Kieran is working on this.
> 
> Yes, I think that's a desirable goal.
> 
> My whole effort in this series has been around moving as much as
> possible of the logic to the shared algorithm so that all IPA using it
> will get a consistent behaviour from "free".

Ack.

Thank you for taking the time to answer all my questions.

Regards,

Hans
Naushir Patuck Oct. 28, 2025, 2:58 p.m. UTC | #5
Hi Hans,

On Tue, 28 Oct 2025 at 14:49, Hans de Goede <hansg@kernel.org> wrote:
>
> Hi Jacopo,
>
> On 28-Oct-25 12:42 PM, Jacopo Mondi wrote:
> > Hi Hans
> >
> > On Tue, Oct 28, 2025 at 10:53:15AM +0100, Hans de Goede wrote:
> >> Hi Jacopo,
> >>
> >> On 28-Oct-25 10:31 AM, Jacopo Mondi wrote:
> >>> By definition, the exposure time that can be programmed on a sensor is
> >>> limited by the frame duration and whenever the frame duration is
> >>> updated, the exposure time limits should be updated as well.
> >>>
> >>> This is not the case for IPAs using the AgcMeanLuminance algorithm from
> >>> libipa, where the exposure limits are computed at configure() time and
> >>> never updated.
> >>>
> >>> This has two implications:
> >>> 1) The AGC algorithms will always operate with an exposure time bound to
> >>>    the frame duration programmed at startup time
> >>> 2) The Camera::controls() limits are not updated to reflect the new
> >>>    constraints
> >>>
> >>> This series addresses issue 1) by changing the AgcMeanLuminance class
> >>> and the associated helpers to regulate the exposure time base on the
> >>> currently programmed max frame duration using a sensor-specific margin
> >>> that is now reported by the CameraHelper class.
> >>
> >> I have a question about the above paragraph. The sensor-specific margin
> >> is already known by the kernel's sensor driver and properly written
> >> sensor drivers will update the exposure range reported by the query-ctrl
> >> ioctl after changing the vblank control.
> >>
> >> So I think it would be better to rely on the kernel for this info
> >> and refresh the cached exposure-control limits when changing the vblank
> >> control rather then duplicating this info in the CameraHelper class ?
> >
> > We discussed this with Niklas in v1
> > https://patchwork.libcamera.org/patch/24695/#36372
> >
> > My main concern (one more ioctl per frame apart) is that relying on
> > the kernel to update the control limits will introduce a delay of a
> > variable number of frames that correponds to the pipeline depth plus
> > the control delays.
> >
> > Whenever processing statistics for frame X and a new vblank value is
> > calculated to accommodate the newly computed exposure value (within
> > the FrameDurationLimits) we push the new value to DelayedControls
> > which caches it in a queue. Whenever a new frame is started the
> > queue of controls is popped and controls get applied to the device. They
> > will take effect a few frames later (while the kernel interface
> > updates the control limits immediately if I'm not mistaken).
>
> Right I forgot about the delayed-controls stuff, not being able
> to get the new exposure limits from the kernel then makes sense.
>
> Question: is there code somewhere to guarantee that when delayed
> controls changes vblank it applies the vblank before the exposure?
>
> IIRC libcamera uses set-ext-ctrls to set them all at once, but
> most (all?) sensor drivers do not implement this, so the kernel
> just applies them 1-by-1 internally. And if vblank gets increased
> only after exposure then for one frame exposure will still get
> clamped based on the old vblank.
>
> This problem actually sounds like a more likely cause for Stefan's
> oscillation issue you mention below.

When constructing the DelayedControls object, we mark the VBLANK
control as "priority write". This ensures that VBLANK is written
ahead of any other controls (particularly EXPOSURE) for this very
reason.

Regards,
Naush


>
> >> One could even calculate the exposure-margin by calculating the configured
> >> frame-time (in a unit of lines) and then substracting the max-exposure
> >> reported by the sensor from the frame-time-in-lines.
> >
> > Stefan suggested that too.
> >
> > When you configure() you take the sensor's exposure time and the frame
> > length and the diff between the two is the margin. However there is no
> > indication anywhere that the sensor driver has to initialize controls
> > in this way and considering how slightly different sensor drivers are
> > I wouldn't be on them being consistent.
>
> Hmm, I would say that we need to fix the sensor drivers then. But since
> we need the camerahelper for the analog-gain bits I guess that having
> to add the exposure vs framelength margin is not a big problem.
>
> >>> As the AgcMeanLuminance algorithm tries to push the exposure time up
> >>> to the maximum allowed frame duration, initializing the algorithm
> >>> with the sensor's maximum capabilities results in a very low frame rate,
> >>> unless the user specifies a FrameDurationLimits range in which the
> >>> algorithm should operate. To avoid slowing the frame rate down too much
> >>> compute a "default" frame rate at algorithm configuration time, were
> >>> the canonical 30, 15 and 10 FPS values are tested and selected in order
> >>> of preference.
> >>
> >> I must admit I'm not familiar with the AgcMeanLuminance algorithm.
> >> Does this algorithm also set / used to set the vblank control to
> >> the maximum allowed frame duration? Or does it dynamically adjust
> >> vblank when it needs more exposure "range" ?
> >
> > It didn't, as it didn't calculate the frame duration.
> >
> > The only IPA that use AgcMeanLuminance and handles VBLANK is the
> > RkISP1 and as you can see in 09/10 the way it does that is to take
> > the exposure time calculated by AgcMeanLuminance::calculateNewEV()
> > and use it as the desired frame duration. This is wrong as it breaks
> > the assumption we have a safe margin. It only doesn't break as the
> > driver will clamp exposure once VBLANK is applied, but I presume this
> > also causes a mis-alignment between what the algorithm think the
> > exposure time is and what is actually applied to the device (and I
> > presume this is what Stefan has reported causing oscillations when I
> > naively proposed to ignore the margin and let the driver adjust
> > exposure).
>
> See above, applying exposure before an increased vblank will lead
> to a much bigger clamp which is a more likely cause of oscillation.
>
> > This is the reason for 09/10: the AgcMeanLuminance now calculates the
> > duration as well using the (exposure time it has computed + margin) and
> > sends it back to the IPA that transforms it in a proper vblank value
> > to be eventually set on the device.
> >
> >>
> >> I've been thinking about adding support to dynamically increase
> >> vblank when more exposure time is needed in the softISP AGC code,
> >> but it would probably be better to first convert that over to
> >> the shared AGC helpers. I know that Kieran is working on this.
> >
> > Yes, I think that's a desirable goal.
> >
> > My whole effort in this series has been around moving as much as
> > possible of the logic to the shared algorithm so that all IPA using it
> > will get a consistent behaviour from "free".
>
> Ack.
>
> Thank you for taking the time to answer all my questions.
>
> Regards,
>
> Hans
>
>
Jacopo Mondi Oct. 28, 2025, 3:27 p.m. UTC | #6
Hi Hans

On Tue, Oct 28, 2025 at 03:48:58PM +0100, Hans de Goede wrote:
> Hi Jacopo,
>
> On 28-Oct-25 12:42 PM, Jacopo Mondi wrote:
> > Hi Hans
> >
> > On Tue, Oct 28, 2025 at 10:53:15AM +0100, Hans de Goede wrote:
> >> Hi Jacopo,
> >>
> >> On 28-Oct-25 10:31 AM, Jacopo Mondi wrote:
> >>> By definition, the exposure time that can be programmed on a sensor is
> >>> limited by the frame duration and whenever the frame duration is
> >>> updated, the exposure time limits should be updated as well.
> >>>
> >>> This is not the case for IPAs using the AgcMeanLuminance algorithm from
> >>> libipa, where the exposure limits are computed at configure() time and
> >>> never updated.
> >>>
> >>> This has two implications:
> >>> 1) The AGC algorithms will always operate with an exposure time bound to
> >>>    the frame duration programmed at startup time
> >>> 2) The Camera::controls() limits are not updated to reflect the new
> >>>    constraints
> >>>
> >>> This series addresses issue 1) by changing the AgcMeanLuminance class
> >>> and the associated helpers to regulate the exposure time base on the
> >>> currently programmed max frame duration using a sensor-specific margin
> >>> that is now reported by the CameraHelper class.
> >>
> >> I have a question about the above paragraph. The sensor-specific margin
> >> is already known by the kernel's sensor driver and properly written
> >> sensor drivers will update the exposure range reported by the query-ctrl
> >> ioctl after changing the vblank control.
> >>
> >> So I think it would be better to rely on the kernel for this info
> >> and refresh the cached exposure-control limits when changing the vblank
> >> control rather then duplicating this info in the CameraHelper class ?
> >
> > We discussed this with Niklas in v1
> > https://patchwork.libcamera.org/patch/24695/#36372
> >
> > My main concern (one more ioctl per frame apart) is that relying on
> > the kernel to update the control limits will introduce a delay of a
> > variable number of frames that correponds to the pipeline depth plus
> > the control delays.
> >
> > Whenever processing statistics for frame X and a new vblank value is
> > calculated to accommodate the newly computed exposure value (within
> > the FrameDurationLimits) we push the new value to DelayedControls
> > which caches it in a queue. Whenever a new frame is started the
> > queue of controls is popped and controls get applied to the device. They
> > will take effect a few frames later (while the kernel interface
> > updates the control limits immediately if I'm not mistaken).
>
> Right I forgot about the delayed-controls stuff, not being able
> to get the new exposure limits from the kernel then makes sense.
>
> Question: is there code somewhere to guarantee that when delayed
> controls changes vblank it applies the vblank before the exposure?

Yes, DelayedControls has a "priority" parameter to make sure VBLANK
is updated before EXPOSURE:
https://git.libcamera.org/libcamera/libcamera.git/tree/src/libcamera/delayed_controls.cpp#n50
https://git.libcamera.org/libcamera/libcamera.git/tree/src/libcamera/delayed_controls.cpp#n232
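For reference, roughly how a pipeline handler sets that up (a sketch;
the delay values here are illustrative, not taken from a real sensor):

  #include <cstdint>
  #include <memory>
  #include <unordered_map>

  #include <linux/v4l2-controls.h>

  #include "libcamera/internal/delayed_controls.h"

  using namespace libcamera;

  std::unique_ptr<DelayedControls> makeDelayedControls(V4L2Device *sensorDevice)
  {
          /* { control id, { frame delay, priority write } } */
          std::unordered_map<uint32_t, DelayedControls::ControlParams> params = {
                  { V4L2_CID_ANALOGUE_GAIN, { 1, false } },
                  { V4L2_CID_EXPOSURE, { 2, false } },
                  /* VBLANK is written to the device before the other controls. */
                  { V4L2_CID_VBLANK, { 2, true } },
          };

          return std::make_unique<DelayedControls>(sensorDevice, params);
  }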

>
> IIRC libcamera uses set-ext-ctrls to set them all at once, but
> most (all?) sensor drivers do not implement this, so the kernel
> just applies them 1-by-1 internally. And if vblank gets increased
> only after exposure then for one frame exposure will still get
> clamped based on the old vblank.
>
> This problem actually sounds like a more likely cause for Stefan's
> oscillation issue you mention below.

You're right. Thankfully RPi handled this when they first implemented
DelayedControls.

>
> >> One could even calculate the exposure-margin by calculating the configured
> >> frame-time (in a unit of lines) and then substracting the max-exposure
> >> reported by the sensor from the frame-time-in-lines.
> >
> > Stefan suggested that too.
> >
> > When you configure() you take the sensor's exposure time and the frame
> > length and the diff between the two is the margin. However there is no
> > indication anywhere that the sensor driver has to initialize controls
> > in this way and considering how slightly different sensor drivers are
> > I wouldn't be on them being consistent.
>
> Hmm, I would say that we need to fix the sensor drivers then. But since
> we need the camerahelper for the analog-gain bits I guess that having
> to add the exposure vs framelength margin is not a big problem.
>
> >>> As the AgcMeanLuminance algorithm tries to push the exposure time up
> >>> to the maximum allowed frame duration, initializing the algorithm
> >>> with the sensor's maximum capabilities results in a very low frame rate,
> >>> unless the user specifies a FrameDurationLimits range in which the
> >>> algorithm should operate. To avoid slowing the frame rate down too much
> >>> compute a "default" frame rate at algorithm configuration time, were
> >>> the canonical 30, 15 and 10 FPS values are tested and selected in order
> >>> of preference.
> >>
> >> I must admit I'm not familiar with the AgcMeanLuminance algorithm.
> >> Does this algorithm also set / used to set the vblank control to
> >> the maximum allowed frame duration? Or does it dynamically adjust
> >> vblank when it needs more exposure "range" ?
> >
> > It didn't, as it didn't calculate the frame duration.
> >
> > The only IPA that use AgcMeanLuminance and handles VBLANK is the
> > RkISP1 and as you can see in 09/10 the way it does that is to take
> > the exposure time calculated by AgcMeanLuminance::calculateNewEV()
> > and use it as the desired frame duration. This is wrong as it breaks
> > the assumption we have a safe margin. It only doesn't break as the
> > driver will clamp exposure once VBLANK is applied, but I presume this
> > also causes a mis-alignment between what the algorithm think the
> > exposure time is and what is actually applied to the device (and I
> > presume this is what Stefan has reported causing oscillations when I
> > naively proposed to ignore the margin and let the driver adjust
> > exposure).
>
> See above, applying exposure before an increased vblank will lead
> to a much bigger clamp which is a more likely cause of oscillation.
>
> > This is the reason for 09/10: the AgcMeanLuminance now calculates the
> > duration as well using the (exposure time it has computed + margin) and
> > sends it back to the IPA that transforms it in a proper vblank value
> > to be eventually set on the device.
> >
> >>
> >> I've been thinking about adding support to dynamically increase
> >> vblank when more exposure time is needed in the softISP AGC code,
> >> but it would probably be better to first convert that over to
> >> the shared AGC helpers. I know that Kieran is working on this.
> >
> > Yes, I think that's a desirable goal.
> >
> > My whole effort in this series has been around moving as much as
> > possible of the logic to the shared algorithm so that all IPA using it
> > will get a consistent behaviour from "free".
>
> Ack.
>
> Thank you for taking the time to answer all my questions.
>

No worries at all, this is very helpful, as some of my understanding of
the issue had to be validated, and that's exactly what reviews are for ;)

Thanks
   j

> Regards,
>
> Hans
>
>
Niklas Söderlund Oct. 28, 2025, 3:38 p.m. UTC | #7
Hello,

On 2025-10-28 15:48:58 +0100, Hans de Goede wrote:
> Hi Jacopo,
> 
> On 28-Oct-25 12:42 PM, Jacopo Mondi wrote:
> > Hi Hans
> > 
> > On Tue, Oct 28, 2025 at 10:53:15AM +0100, Hans de Goede wrote:
> >> Hi Jacopo,
> >>
> >> On 28-Oct-25 10:31 AM, Jacopo Mondi wrote:
> >>> By definition, the exposure time that can be programmed on a sensor is
> >>> limited by the frame duration and whenever the frame duration is
> >>> updated, the exposure time limits should be updated as well.
> >>>
> >>> This is not the case for IPAs using the AgcMeanLuminance algorithm from
> >>> libipa, where the exposure limits are computed at configure() time and
> >>> never updated.
> >>>
> >>> This has two implications:
> >>> 1) The AGC algorithms will always operate with an exposure time bound to
> >>>    the frame duration programmed at startup time
> >>> 2) The Camera::controls() limits are not updated to reflect the new
> >>>    constraints
> >>>
> >>> This series addresses issue 1) by changing the AgcMeanLuminance class
> >>> and the associated helpers to regulate the exposure time base on the
> >>> currently programmed max frame duration using a sensor-specific margin
> >>> that is now reported by the CameraHelper class.
> >>
> >> I have a question about the above paragraph. The sensor-specific margin
> >> is already known by the kernel's sensor driver and properly written
> >> sensor drivers will update the exposure range reported by the query-ctrl
> >> ioctl after changing the vblank control.
> >>
> >> So I think it would be better to rely on the kernel for this info
> >> and refresh the cached exposure-control limits when changing the vblank
> >> control rather then duplicating this info in the CameraHelper class ?
> > 
> > We discussed this with Niklas in v1
> > https://patchwork.libcamera.org/patch/24695/#36372
> > 
> > My main concern (one more ioctl per frame apart) is that relying on
> > the kernel to update the control limits will introduce a delay of a
> > variable number of frames that correponds to the pipeline depth plus
> > the control delays.
> > 
> > Whenever processing statistics for frame X and a new vblank value is
> > calculated to accommodate the newly computed exposure value (within
> > the FrameDurationLimits) we push the new value to DelayedControls
> > which caches it in a queue. Whenever a new frame is started the
> > queue of controls is popped and controls get applied to the device. They
> > will take effect a few frames later (while the kernel interface
> > updates the control limits immediately if I'm not mistaken).
> 
> Right I forgot about the delayed-controls stuff, not being able
> to get the new exposure limits from the kernel then makes sense.

I too understand why it's easier not to consult the kernel for the new 
limits. I have no real strong feeling about whether the correct place for 
this limit calculation is the kernel or user-space. What I do feel 
somewhat strongly about is that if we do the limit calculation in 
user-space we should never consult the kernel for the limits.

This just mixes two ways of doing the same thing and is very confusing 
when reading the code. So if doing this in user-space is the way forward 
I think we should drop the reading of min and max of V4L2_CID_EXPOSURE 
in IPARkISP1::configure() and instead use this method there too.

I also feel, a little less strongly as it's a big work item, that it's 
really wrong to duplicate the calculation both in kernel and user-space.

As this IMHO exposes a weakness in the V4L2 API, we should think about 
how we want the V4L2_CID_EXPOSURE control to report min/max values. Do 
we really want it to report its min/max values based on the current 
V4L2_CID_VBLANK setting, given that it's too expensive to query them 
with an ioctl for each frame?

Can we simplify the control to report the min and max values the sensor 
can achieve provided the VBLANK setting allows it, and just reject, at 
set time, any value that does not conform to the current VBLANK setting?

If so, we could stop handling this control dependency in each sensor 
driver, as is done for example in the IMX219 driver:

  static int imx219_set_ctrl(struct v4l2_ctrl *ctrl)
  {
    ...

    if (ctrl->id == V4L2_CID_VBLANK) {
    	int exposure_max, exposure_def;

    	/* Update max exposure while meeting expected vblanking */
    	exposure_max = format->height + ctrl->val - 4;
    	exposure_def = (exposure_max < IMX219_EXPOSURE_DEFAULT) ?
    		exposure_max : IMX219_EXPOSURE_DEFAULT;
    	__v4l2_ctrl_modify_range(imx219->exposure,
    				 imx219->exposure->minimum,
    				 exposure_max, imx219->exposure->step,
    				 exposure_def);
    }

    ...
  }

>
> Question: is there code somewhere to guarantee that when delayed
> controls changes vblank it applies the vblank before the exposure?
> 
> IIRC libcamera uses set-ext-ctrls to set them all at once, but
> most (all?) sensor drivers do not implement this, so the kernel
> just applies them 1-by-1 internally. And if vblank gets increased
> only after exposure then for one frame exposure will still get
> clamped based on the old vblank.
> 
> This problem actually sounds like a more likely cause for Stefan's
> oscillation issue you mention below.
> 
> >> One could even calculate the exposure-margin by calculating the configured
> >> frame-time (in a unit of lines) and then substracting the max-exposure
> >> reported by the sensor from the frame-time-in-lines.
> > 
> > Stefan suggested that too.
> > 
> > When you configure() you take the sensor's exposure time and the frame
> > length and the diff between the two is the margin. However there is no
> > indication anywhere that the sensor driver has to initialize controls
> > in this way and considering how slightly different sensor drivers are
> > I wouldn't be on them being consistent.
> 
> Hmm, I would say that we need to fix the sensor drivers then. But since
> we need the camerahelper for the analog-gain bits I guess that having
> to add the exposure vs framelength margin is not a big problem.
> 
> >>> As the AgcMeanLuminance algorithm tries to push the exposure time up
> >>> to the maximum allowed frame duration, initializing the algorithm
> >>> with the sensor's maximum capabilities results in a very low frame rate,
> >>> unless the user specifies a FrameDurationLimits range in which the
> >>> algorithm should operate. To avoid slowing the frame rate down too much
> >>> compute a "default" frame rate at algorithm configuration time, were
> >>> the canonical 30, 15 and 10 FPS values are tested and selected in order
> >>> of preference.
> >>
> >> I must admit I'm not familiar with the AgcMeanLuminance algorithm.
> >> Does this algorithm also set / used to set the vblank control to
> >> the maximum allowed frame duration? Or does it dynamically adjust
> >> vblank when it needs more exposure "range" ?
> > 
> > It didn't, as it didn't calculate the frame duration.
> > 
> > The only IPA that use AgcMeanLuminance and handles VBLANK is the
> > RkISP1 and as you can see in 09/10 the way it does that is to take
> > the exposure time calculated by AgcMeanLuminance::calculateNewEV()
> > and use it as the desired frame duration. This is wrong as it breaks
> > the assumption we have a safe margin. It only doesn't break as the
> > driver will clamp exposure once VBLANK is applied, but I presume this
> > also causes a mis-alignment between what the algorithm think the
> > exposure time is and what is actually applied to the device (and I
> > presume this is what Stefan has reported causing oscillations when I
> > naively proposed to ignore the margin and let the driver adjust
> > exposure).
> 
> See above, applying exposure before an increased vblank will lead
> to a much bigger clamp which is a more likely cause of oscillation.
> 
> > This is the reason for 09/10: the AgcMeanLuminance now calculates the
> > duration as well using the (exposure time it has computed + margin) and
> > sends it back to the IPA that transforms it in a proper vblank value
> > to be eventually set on the device.
> > 
> >>
> >> I've been thinking about adding support to dynamically increase
> >> vblank when more exposure time is needed in the softISP AGC code,
> >> but it would probably be better to first convert that over to
> >> the shared AGC helpers. I know that Kieran is working on this.
> > 
> > Yes, I think that's a desirable goal.
> > 
> > My whole effort in this series has been around moving as much as
> > possible of the logic to the shared algorithm so that all IPA using it
> > will get a consistent behaviour from "free".
> 
> Ack.
> 
> Thank you for taking the time to answer all my questions.
> 
> Regards,
> 
> Hans
> 
>