[libcamera-devel,RFC,v3,05/16] controls: Replace AeLocked with AeState, and add AeLock

Message ID 20210702103800.41291-6-paul.elder@ideasonboard.com
State Superseded
Delegated to: Paul Elder
Series
  • Preliminary FULL plumbing

Commit Message

Paul Elder July 2, 2021, 10:37 a.m. UTC
AeLocked alone isn't sufficient for reporting the AE state, so replace
it with AeState. Add an AeLock control for instructing the camera to
lock the AE values.

Update the current users of AeLocked accordingly.

Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>

---
No change in v3
---
 src/ipa/raspberrypi/raspberrypi.cpp |  5 +-
 src/ipa/rkisp1/rkisp1.cpp           | 13 ++--
 src/libcamera/control_ids.yaml      | 96 ++++++++++++++++++-----------
 3 files changed, 71 insertions(+), 43 deletions(-)
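
For illustration only (not part of the patch), a minimal sketch of how an application might use the new control pair, assuming libcamera's C++ request API of the time; exact accessor signatures may differ between versions:

#include <libcamera/control_ids.h>
#include <libcamera/controls.h>
#include <libcamera/request.h>

using namespace libcamera;

/* Lock AE once the algorithm reports convergence for a completed frame. */
void onRequestCompleted(Request *completed, Request *next)
{
        const ControlList &metadata = completed->metadata();

        /* AeState is reported per-frame in the request metadata. */
        if (!metadata.contains(controls::AeState))
                return;

        int32_t state = metadata.get(controls::AeState);
        if (state == controls::AeStateConverged)
                next->controls().set(controls::AeLock, true);
}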

Comments

Jacopo Mondi July 5, 2021, 3:52 p.m. UTC | #1
Hi Paul,

On Fri, Jul 02, 2021 at 07:37:49PM +0900, Paul Elder wrote:
> AeLocked alone isn't sufficient for reporting the AE state, so replace
> it with AeState. Add an AeLock control for instructing the camera to
> lock the AE values.
>
> Update the current users of AeLocked accordingly.

As discussed, I like this change. Let's see what RPi and Laurent
think.

Thanks
  j

>
> Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
>
> ---
> No change in v3
> ---
>  src/ipa/raspberrypi/raspberrypi.cpp |  5 +-
>  src/ipa/rkisp1/rkisp1.cpp           | 13 ++--
>  src/libcamera/control_ids.yaml      | 96 ++++++++++++++++++-----------
>  3 files changed, 71 insertions(+), 43 deletions(-)
>
> diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> index 4d09a84f..4981aa29 100644
> --- a/src/ipa/raspberrypi/raspberrypi.cpp
> +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> @@ -469,7 +469,10 @@ void IPARPi::reportMetadata()
>
>  	AgcStatus *agcStatus = rpiMetadata_.GetLocked<AgcStatus>("agc.status");
>  	if (agcStatus) {
> -		libcameraMetadata_.set(controls::AeLocked, agcStatus->locked);
> +		libcameraMetadata_.set(controls::AeState,
> +				       agcStatus->locked ?
> +				       controls::AeStateLocked :
> +				       controls::AeStateSearching);
>  		libcameraMetadata_.set(controls::DigitalGain, agcStatus->digital_gain);
>  	}
>
> diff --git a/src/ipa/rkisp1/rkisp1.cpp b/src/ipa/rkisp1/rkisp1.cpp
> index cdfb4d13..4eca26e2 100644
> --- a/src/ipa/rkisp1/rkisp1.cpp
> +++ b/src/ipa/rkisp1/rkisp1.cpp
> @@ -51,7 +51,7 @@ private:
>  			      const rkisp1_stat_buffer *stats);
>
>  	void setControls(unsigned int frame);
> -	void metadataReady(unsigned int frame, unsigned int aeState);
> +	void metadataReady(unsigned int frame, int aeState);
>
>  	std::map<unsigned int, FrameBuffer> buffers_;
>  	std::map<unsigned int, void *> buffersMemory_;
> @@ -227,7 +227,7 @@ void IPARkISP1::updateStatistics(unsigned int frame,
>  				 const rkisp1_stat_buffer *stats)
>  {
>  	const rkisp1_cif_isp_stat *params = &stats->params;
> -	unsigned int aeState = 0;
> +	int aeState = controls::AeStateInactive;
>
>  	if (stats->meas_type & RKISP1_CIF_ISP_STAT_AUTOEXP) {
>  		const rkisp1_cif_isp_ae_stat *ae = &params->ae;
> @@ -262,7 +262,9 @@ void IPARkISP1::updateStatistics(unsigned int frame,
>  			setControls(frame + 1);
>  		}
>
> -		aeState = fabs(factor - 1.0f) < 0.05f ? 2 : 1;
> +		aeState = fabs(factor - 1.0f) < 0.05f ?
> +			  controls::AeStateConverged :
> +			  controls::AeStateSearching;
>  	}
>
>  	metadataReady(frame, aeState);
> @@ -281,12 +283,11 @@ void IPARkISP1::setControls(unsigned int frame)
>  	queueFrameAction.emit(frame, op);
>  }
>
> -void IPARkISP1::metadataReady(unsigned int frame, unsigned int aeState)
> +void IPARkISP1::metadataReady(unsigned int frame, int aeState)
>  {
>  	ControlList ctrls(controls::controls);
>
> -	if (aeState)
> -		ctrls.set(controls::AeLocked, aeState == 2);
> +	ctrls.set(controls::AeState, aeState);
>
>  	RkISP1Action op;
>  	op.op = ActionMetadata;
> diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> index 9d4638ae..5717bc1f 100644
> --- a/src/libcamera/control_ids.yaml
> +++ b/src/libcamera/control_ids.yaml
> @@ -14,16 +14,70 @@ controls:
>
>          \sa ExposureTime AnalogueGain
>
> -  - AeLocked:
> +  - AeLock:
>        type: bool
>        description: |
> -        Report the lock status of a running AE algorithm.
> +        Control to lock the AE values.
> +        When set to true, the AE algorithm is locked to its latest parameters,
> +        and will not change exposure settings until set to false.
> +        \sa AeState
>
> -        If the AE algorithm is locked the value shall be set to true, if it's
> -        converging it shall be set to false. If the AE algorithm is not
> -        running the control shall not be present in the metadata control list.
> +  - AeState:
> +      type: int32_t
> +      description: |
> +        Control to report the current AE algorithm state. Enabling or disabling
> +        AE (AeEnable) always resets the AeState to AeStateInactive. The camera
> +        device can do several state transitions between two results, if it is
> +        allowed by the state transition table. For example, AeStateInactive may
> +        never actually be seen in a result.
>
> -        \sa AeEnable
> +        The state in the result is the state for this image (in sync with this
> +        image). If AE state becomes AeStateConverged, then the image data
> +        associated with the result should be good to use.
> +
> +        The state transitions mentioned below assume that AeEnable is on.
> +
> +        \sa AeLock
> +      enum:
> +        - name: AeStateInactive
> +          value: 0
> +          description: |
> +            The AE algorithm is inactive.
> +            If the camera initiates an AE scan, the state shall go to Searching.
> +            If AeLock is on, the state shall go to Locked.
> +        - name: AeStateSearching
> +          value: 1
> +          description: |
> +            The AE algorithm has not converged yet.
> +            If the camera finishes an AE scan, the state shall go to Converged.
> +            If the camera finishes an AE scan, but flash is required, the state
> +            shall go to FlashRequired.
> +            If AeLock is on, the state shall go to Locked.
> +        - name: AeStateConverged
> +          value: 2
> +          description: |
> +            The AE algorithm has converged.
> +            If the camera initiates an AE scan, the state shall go to Searching.
> +            If AeLock is on, the state shall go to Locked.
> +        - name: AeStateLocked
> +          value: 3
> +          description: |
> +            The AE algorithm is locked.
> +            If AeLock is off, the state can go to Searching, Converged, or
> +            FlashRequired.
> +        - name: AeStateFlashRequired
> +          value: 4
> +          description: |
> +            The AE algorithm would need a flash for good results
> +            If the camera initiates an AE scan, the state shall go to Searching.
> +            If AeLock is on, the state shall go to Locked.
> +        - name: AeStatePrecapture
> +          value: 5
> +          description: |
> +            The AE algorithm has started a pre-capture metering session.
> +            After the sequence is finished, the state shall go to Converged if
> +            AeLock is off, and Locked if it is on.
> +            \sa AePrecaptureTrigger
>
>    # AeMeteringMode needs further attention:
>    # - Auto-generate max enum value.
> @@ -477,36 +531,6 @@ controls:
>              High quality aberration correction which might reduce the frame
>              rate.
>
> -  - AeState:
> -      type: int32_t
> -      draft: true
> -      description: |
> -       Control to report the current AE algorithm state. Currently identical to
> -       ANDROID_CONTROL_AE_STATE.
> -
> -        Current state of the AE algorithm.
> -      enum:
> -        - name: AeStateInactive
> -          value: 0
> -          description: The AE algorithm is inactive.
> -        - name: AeStateSearching
> -          value: 1
> -          description: The AE algorithm has not converged yet.
> -        - name: AeStateConverged
> -          value: 2
> -          description: The AE algorithm has converged.
> -        - name: AeStateLocked
> -          value: 3
> -          description: The AE algorithm is locked.
> -        - name: AeStateFlashRequired
> -          value: 4
> -          description: The AE algorithm would need a flash for good results
> -        - name: AeStatePrecapture
> -          value: 5
> -          description: |
> -            The AE algorithm has started a pre-capture metering session.
> -            \sa AePrecaptureTrigger
> -
>    - AfState:
>        type: int32_t
>        draft: true
> --
> 2.27.0
>
Naushir Patuck July 5, 2021, 5:34 p.m. UTC | #2
Hi Paul,

Thank you for your work.

On Fri, 2 Jul 2021 at 11:38, Paul Elder <paul.elder@ideasonboard.com> wrote:

> AeLocked alone isn't sufficient for reporting the AE state, so replace
> it with AeState. Add an AeLock control for instructing the camera to
> lock the AE values.
>
> Update the current users of AeLocked accordingly.
>
> Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
>
> ---
> No change in v3
> ---
>  src/ipa/raspberrypi/raspberrypi.cpp |  5 +-
>  src/ipa/rkisp1/rkisp1.cpp           | 13 ++--
>  src/libcamera/control_ids.yaml      | 96 ++++++++++++++++++-----------
>  3 files changed, 71 insertions(+), 43 deletions(-)
>
> diff --git a/src/ipa/raspberrypi/raspberrypi.cpp
> b/src/ipa/raspberrypi/raspberrypi.cpp
> index 4d09a84f..4981aa29 100644
> --- a/src/ipa/raspberrypi/raspberrypi.cpp
> +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> @@ -469,7 +469,10 @@ void IPARPi::reportMetadata()
>
>         AgcStatus *agcStatus = rpiMetadata_.GetLocked<AgcStatus>("agc.status");
>         if (agcStatus) {
> -               libcameraMetadata_.set(controls::AeLocked, agcStatus->locked);
> +               libcameraMetadata_.set(controls::AeState,
> +                                      agcStatus->locked ?
> +                                      controls::AeStateLocked :
> +                                      controls::AeStateSearching);
>

Yes, this seems like the correct state mapping.


>                 libcameraMetadata_.set(controls::DigitalGain, agcStatus->digital_gain);
>         }
>
> diff --git a/src/ipa/rkisp1/rkisp1.cpp b/src/ipa/rkisp1/rkisp1.cpp
> index cdfb4d13..4eca26e2 100644
> --- a/src/ipa/rkisp1/rkisp1.cpp
> +++ b/src/ipa/rkisp1/rkisp1.cpp
> @@ -51,7 +51,7 @@ private:
>                               const rkisp1_stat_buffer *stats);
>
>         void setControls(unsigned int frame);
> -       void metadataReady(unsigned int frame, unsigned int aeState);
> +       void metadataReady(unsigned int frame, int aeState);
>
>         std::map<unsigned int, FrameBuffer> buffers_;
>         std::map<unsigned int, void *> buffersMemory_;
> @@ -227,7 +227,7 @@ void IPARkISP1::updateStatistics(unsigned int frame,
>                                  const rkisp1_stat_buffer *stats)
>  {
>         const rkisp1_cif_isp_stat *params = &stats->params;
> -       unsigned int aeState = 0;
> +       int aeState = controls::AeStateInactive;
>
>         if (stats->meas_type & RKISP1_CIF_ISP_STAT_AUTOEXP) {
>                 const rkisp1_cif_isp_ae_stat *ae = &params->ae;
> @@ -262,7 +262,9 @@ void IPARkISP1::updateStatistics(unsigned int frame,
>                         setControls(frame + 1);
>                 }
>
> -               aeState = fabs(factor - 1.0f) < 0.05f ? 2 : 1;
> +               aeState = fabs(factor - 1.0f) < 0.05f ?
> +                         controls::AeStateConverged :
> +                         controls::AeStateSearching;
>         }
>
>         metadataReady(frame, aeState);
> @@ -281,12 +283,11 @@ void IPARkISP1::setControls(unsigned int frame)
>         queueFrameAction.emit(frame, op);
>  }
>
> -void IPARkISP1::metadataReady(unsigned int frame, unsigned int aeState)
> +void IPARkISP1::metadataReady(unsigned int frame, int aeState)
>  {
>         ControlList ctrls(controls::controls);
>
> -       if (aeState)
> -               ctrls.set(controls::AeLocked, aeState == 2);
> +       ctrls.set(controls::AeState, aeState);
>
>         RkISP1Action op;
>         op.op = ActionMetadata;
> diff --git a/src/libcamera/control_ids.yaml
> b/src/libcamera/control_ids.yaml
> index 9d4638ae..5717bc1f 100644
> --- a/src/libcamera/control_ids.yaml
> +++ b/src/libcamera/control_ids.yaml
> @@ -14,16 +14,70 @@ controls:
>
>          \sa ExposureTime AnalogueGain
>
> -  - AeLocked:
> +  - AeLock:
>        type: bool
>        description: |
> -        Report the lock status of a running AE algorithm.
> +        Control to lock the AE values.
> +        When set to true, the AE algorithm is locked to its latest parameters,
> +        and will not change exposure settings until set to false.
> +        \sa AeState
>

I know I am repeating myself here, but I don't see the need to differentiate
between AeLock and AeEnable == false.


> -        If the AE algorithm is locked the value shall be set to true, if
> it's
> -        converging it shall be set to false. If the AE algorithm is not
> -        running the control shall not be present in the metadata control
> list.
> +  - AeState:
> +      type: int32_t
> +      description: |
> +        Control to report the current AE algorithm state. Enabling or
> disabling
> +        AE (AeEnable) always resets the AeState to AeStateInactive. The
> camera
> +        device can do several state transitions between two results, if
> it is
> +        allowed by the state transition table. For example,
> AeStateInactive may
> +        never actually be seen in a result.
>
> -        \sa AeEnable
> +        The state in the result is the state for this image (in sync with
> this
> +        image). If AE state becomes AeStateConverged, then the image data
> +        associated with the result should be good to use.
> +
> +        The state transitions mentioned below assume that AeEnable is on.
> +
> +        \sa AeLock
> +      enum:
> +        - name: AeStateInactive
> +          value: 0
> +          description: |
> +            The AE algorithm is inactive.
> +            If the camera initiates an AE scan, the state shall go to Searching.
> +            If AeLock is on, the state shall go to Locked.
>

This state is also a bit ambiguous to me.  Is this the state that must be
reported if AeEnable == false?  If so, is that not also conveyed by setting
the latter?

If the objective of the change is to wholesale do what Android does, then I
don't really have any objection to this change as-is, and we can treat the
above states as best we can in the Raspberry Pi IPA.  Otherwise, I do see
some scope to simplify some of this.

Regards,
Naush


> +        - name: AeStateSearching
> +          value: 1
> +          description: |
> +            The AE algorithm has not converged yet.
> +            If the camera finishes an AE scan, the state shall go to
> Converged.
> +            If the camera finishes an AE scan, but flash is required, the
> state
> +            shall go to FlashRequired.
> +            If AeLock is on, the state shall go to Locked.
> +        - name: AeStateConverged
> +          value: 2
> +          description: |
> +            The AE algorithm has converged.
> +            If the camera initiates an AE scan, the state shall go to
> Searching.
> +            If AeLock is on, the state shall go to Locked.
> +        - name: AeStateLocked
> +          value: 3
> +          description: |
> +            The AE algorithm is locked.
> +            If AeLock is off, the state can go to Searching, Converged, or
> +            FlashRequired.
> +        - name: AeStateFlashRequired
> +          value: 4
> +          description: |
> +            The AE algorithm would need a flash for good results
> +            If the camera initiates an AE scan, the state shall go to
> Searching.
> +            If AeLock is on, the state shall go to Locked.
> +        - name: AeStatePrecapture
> +          value: 5
> +          description: |
> +            The AE algorithm has started a pre-capture metering session.
> +            After the sequence is finished, the state shall go to
> Converged if
> +            AeLock is off, and Locked if it is on.
> +            \sa AePrecaptureTrigger
>
>    # AeMeteringMode needs further attention:
>    # - Auto-generate max enum value.
> @@ -477,36 +531,6 @@ controls:
>              High quality aberration correction which might reduce the
> frame
>              rate.
>
> -  - AeState:
> -      type: int32_t
> -      draft: true
> -      description: |
> -       Control to report the current AE algorithm state. Currently
> identical to
> -       ANDROID_CONTROL_AE_STATE.
> -
> -        Current state of the AE algorithm.
> -      enum:
> -        - name: AeStateInactive
> -          value: 0
> -          description: The AE algorithm is inactive.
> -        - name: AeStateSearching
> -          value: 1
> -          description: The AE algorithm has not converged yet.
> -        - name: AeStateConverged
> -          value: 2
> -          description: The AE algorithm has converged.
> -        - name: AeStateLocked
> -          value: 3
> -          description: The AE algorithm is locked.
> -        - name: AeStateFlashRequired
> -          value: 4
> -          description: The AE algorithm would need a flash for good
> results
> -        - name: AeStatePrecapture
> -          value: 5
> -          description: |
> -            The AE algorithm has started a pre-capture metering session.
> -            \sa AePrecaptureTrigger
> -
>    - AfState:
>        type: int32_t
>        draft: true
> --
> 2.27.0
>
>
Laurent Pinchart July 11, 2021, 10:36 p.m. UTC | #4
Hi Naush,

(now really CC'ing David)

Apologies for the late reply, I should have shared more information on
this topic earlier.

On Mon, Jul 05, 2021 at 06:34:07PM +0100, Naushir Patuck wrote:
> On Fri, 2 Jul 2021 at 11:38, Paul Elder wrote:
> > AeLocked alone isn't sufficient for reporting the AE state, so replace
> > it with AeState. Add an AeLock control for instructing the camera to
> > lock the AE values.
> >
> > Update the current users of AeLocked accordingly.
> >
> > Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
> >
> > ---
> > No change in v3
> > ---
> >  src/ipa/raspberrypi/raspberrypi.cpp |  5 +-
> >  src/ipa/rkisp1/rkisp1.cpp           | 13 ++--
> >  src/libcamera/control_ids.yaml      | 96 ++++++++++++++++++-----------
> >  3 files changed, 71 insertions(+), 43 deletions(-)
> >
> > diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> > index 4d09a84f..4981aa29 100644
> > --- a/src/ipa/raspberrypi/raspberrypi.cpp
> > +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> > @@ -469,7 +469,10 @@ void IPARPi::reportMetadata()
> >
> >         AgcStatus *agcStatus = rpiMetadata_.GetLocked<AgcStatus>("agc.status");
> >         if (agcStatus) {
> > -               libcameraMetadata_.set(controls::AeLocked, agcStatus->locked);
> > +               libcameraMetadata_.set(controls::AeState,
> > +                                      agcStatus->locked ?
> > +                                      controls::AeStateLocked :
> > +                                      controls::AeStateSearching);
> 
> Yes, this seems like the correct state mapping.

Is it ? I thought it should be AeStateConverged.

> >                 libcameraMetadata_.set(controls::DigitalGain, agcStatus->digital_gain);
> >         }
> >
> > diff --git a/src/ipa/rkisp1/rkisp1.cpp b/src/ipa/rkisp1/rkisp1.cpp
> > index cdfb4d13..4eca26e2 100644
> > --- a/src/ipa/rkisp1/rkisp1.cpp
> > +++ b/src/ipa/rkisp1/rkisp1.cpp
> > @@ -51,7 +51,7 @@ private:
> >                               const rkisp1_stat_buffer *stats);
> >
> >         void setControls(unsigned int frame);
> > -       void metadataReady(unsigned int frame, unsigned int aeState);
> > +       void metadataReady(unsigned int frame, int aeState);
> >
> >         std::map<unsigned int, FrameBuffer> buffers_;
> >         std::map<unsigned int, void *> buffersMemory_;
> > @@ -227,7 +227,7 @@ void IPARkISP1::updateStatistics(unsigned int frame,
> >                                  const rkisp1_stat_buffer *stats)
> >  {
> >         const rkisp1_cif_isp_stat *params = &stats->params;
> > -       unsigned int aeState = 0;
> > +       int aeState = controls::AeStateInactive;
> >
> >         if (stats->meas_type & RKISP1_CIF_ISP_STAT_AUTOEXP) {
> >                 const rkisp1_cif_isp_ae_stat *ae = &params->ae;
> > @@ -262,7 +262,9 @@ void IPARkISP1::updateStatistics(unsigned int frame,
> >                         setControls(frame + 1);
> >                 }
> >
> > -               aeState = fabs(factor - 1.0f) < 0.05f ? 2 : 1;
> > +               aeState = fabs(factor - 1.0f) < 0.05f ?
> > +                         controls::AeStateConverged :
> > +                         controls::AeStateSearching;
> >         }
> >
> >         metadataReady(frame, aeState);
> > @@ -281,12 +283,11 @@ void IPARkISP1::setControls(unsigned int frame)
> >         queueFrameAction.emit(frame, op);
> >  }
> >
> > -void IPARkISP1::metadataReady(unsigned int frame, unsigned int aeState)
> > +void IPARkISP1::metadataReady(unsigned int frame, int aeState)
> >  {
> >         ControlList ctrls(controls::controls);
> >
> > -       if (aeState)
> > -               ctrls.set(controls::AeLocked, aeState == 2);
> > +       ctrls.set(controls::AeState, aeState);
> >
> >         RkISP1Action op;
> >         op.op = ActionMetadata;
> > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> > index 9d4638ae..5717bc1f 100644
> > --- a/src/libcamera/control_ids.yaml
> > +++ b/src/libcamera/control_ids.yaml
> > @@ -14,16 +14,70 @@ controls:
> >
> >          \sa ExposureTime AnalogueGain
> >
> > -  - AeLocked:
> > +  - AeLock:
> >        type: bool
> >        description: |
> > -        Report the lock status of a running AE algorithm.
> > +        Control to lock the AE values.
> > +        When set to true, the AE algorithm is locked to its latest parameters,
> > +        and will not change exposure settings until set to false.
> > +        \sa AeState
> 
> I know I am repeating myself here, but I don't see the need to differentiate between
> AeLock and AeEnable  == falses.

Please see below.

> > -        If the AE algorithm is locked the value shall be set to true, if it's
> > -        converging it shall be set to false. If the AE algorithm is not
> > -        running the control shall not be present in the metadata control list.
> > +  - AeState:
> > +      type: int32_t
> > +      description: |
> > +        Control to report the current AE algorithm state. Enabling or disabling
> > +        AE (AeEnable) always resets the AeState to AeStateInactive. The camera
> > +        device can do several state transitions between two results, if it is
> > +        allowed by the state transition table. For example, AeStateInactive may
> > +        never actually be seen in a result.
> >
> > -        \sa AeEnable
> > +        The state in the result is the state for this image (in sync with this
> > +        image). If AE state becomes AeStateConverged, then the image data
> > +        associated with the result should be good to use.
> > +
> > +        The state transitions mentioned below assume that AeEnable is on.
> > +
> > +        \sa AeLock
> > +      enum:
> > +        - name: AeStateInactive
> > +          value: 0
> > +          description: |
> > +            The AE algorithm is inactive.
> > +            If the camera initiates an AE scan, the state shall go to Searching.
> > +            If AeLock is on, the state shall go to Locked.
> >
> 
> This state is also a bit ambiguous to me.  Is this the state that must be reported
> if AeEnable == false?  If so, is that not also conveyed by setting the latter?
> 
> If the objective of the change is to wholesale do what Android does, then I don't
> really have any objection to this change as-is, and we can treat the above
> states as best we can in the Raspberry Pi IPA.  Otherwise, I do see some scope
> to simplify some of this.

First of all, let me clarify the goal. The libcamera native API needs to
offer the features required to implement the Android camera HAL API, but
we don't need to just duplicate it. When there are reasons to depart
(for instance to fix historical mistakes made in Android, to provide
additional features, or just to expose an API we consider more
consistent), we should do so.

Then, a word about the process. Implementing support for an Android
control requires plumbing it through the camera stack: definition of a
libcamera control, translation between the Android and libcamera control
in the Android camera HAL, implementation of the libcamera control in
the pipeline handlers, and, when applicable, in the IPAs. Defining
libcamera controls, as any API design effort, takes time. To allow
moving forward on all the other parts, we have a fast-track process to
define draft controls that mimic the Android controls. All the draft
controls will be redesigned before freezing the libcamera public API.

This means that AeLock and AeState should be marked as draft controls as
part of this patch.

This being said, we don't need to define controls as draft controls if
we want to already go through a complete design process. The discussion
that stemmed from this patch series goes in that direction.

It's useful, in this context, to understand why the Android AE (and AWB)
controls have been designed this way. Conceptually speaking, the
algorithms are considered, in Android, to be one level above manual
controls.

 Controls                                                        Metadata
             /---------- == OFF ---------\
             |                           v
aeMode      -/             +------+     |\
aeLock                 --> |  AE  | --> |0\     +--------+
aeExposureCompensation     +------+     | | --> | Sensor | --> exposureTime
                                        | |     +--------+     sensitivity
exposureTime   -----------------------> |1/
sensitivity                             |/

(The AE controls are in the android.control class, while the sensor
manual controls are in the android.sensor class.)

The aeMode control selects whether the AE algorithm (when != OFF) or the
manual controls (when == OFF) drive the sensor. The aeLock control tells
the AE algorithm to freeze its output (when ON) or operate normally
(when OFF). The metadata always report the values applied to the sensor,
regardless of whether they come from the AE algorithm or from manual
controls.

One important difference, compared to the model we currently implement,
is that the Android AE algorithm is independent from the manual sensor
controls. The values of the manual controls aren't affected by the AE
algorithm, and when AE is disabled, the last set manual control values
are applied. This thus requires an aeLock control to transition
seamlessly between AE and manual mode, as the application needs to set
the manual controls to the values applied to the sensor on the last
frame that AE was on, which couldn't otherwise be done due to the
internal delays.
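
To illustrate this handover in libcamera terms, here is a rough sketch
(using AeLock as proposed in this patch, plus the existing AeEnable,
ExposureTime and AnalogueGain controls; the exact semantics are what we
still need to define, so treat it as an assumption, not a specification):

#include <libcamera/control_ids.h>
#include <libcamera/controls.h>
#include <libcamera/request.h>

using namespace libcamera;

/*
 * Rough sketch only: lock AE, then copy the frozen values reported in the
 * metadata into the manual controls before switching AE off, so the
 * transition causes no visible exposure jump.
 */
void switchToManual(Request *lockFrame, Request *manualFrame,
                    const ControlList &lockedMetadata)
{
        /* 1. Ask the AE algorithm to hold its output. */
        lockFrame->controls().set(controls::AeLock, true);

        /* 2. Read back the values the algorithm settled on. */
        int32_t exposure = lockedMetadata.get(controls::ExposureTime);
        float gain = lockedMetadata.get(controls::AnalogueGain);

        /* 3. Re-apply them as manual values with AE disabled. */
        manualFrame->controls().set(controls::AeEnable, false);
        manualFrame->controls().set(controls::ExposureTime, exposure);
        manualFrame->controls().set(controls::AnalogueGain, gain);
}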

The existing libcamera AE controls are ill-defined. There's a \todo
comment to document the interactions between AeEnable and setting a
fixed value for ExposureTime or AnalogueGain. I'm actually not sure what
happens if AeEnable is set to false when ExposureTime and/or
AnalogueGain are 0, or when AeEnable is set to true when ExposureTime
and/or AnalogueGain are not 0.

We need a formal model for how libcamera algorithms interact with manual
controls. It doesn't have to duplicate the Android model, but it has to
offer at least the same features. I believe we need to take it one step
further, as we want to support exposure priority and aperture priority.
I think we should actually talk about gain/sensitivity priority instead
of aperture priority for the existing feature, and already prepare for
devices that have a controllable lens aperture.

Now, the hard question: how should libcamera model this ? :-) Naush,
David, your feedback from a RPi point of view would be very appreciated.
Please also let me know if the above explanation is clear or if you need
more information.

> > +        - name: AeStateSearching
> > +          value: 1
> > +          description: |
> > +            The AE algorithm has not converged yet.
> > +            If the camera finishes an AE scan, the state shall go to Converged.
> > +            If the camera finishes an AE scan, but flash is required, the state
> > +            shall go to FlashRequired.
> > +            If AeLock is on, the state shall go to Locked.
> > +        - name: AeStateConverged
> > +          value: 2
> > +          description: |
> > +            The AE algorithm has converged.
> > +            If the camera initiates an AE scan, the state shall go to Searching.
> > +            If AeLock is on, the state shall go to Locked.
> > +        - name: AeStateLocked
> > +          value: 3
> > +          description: |
> > +            The AE algorithm is locked.
> > +            If AeLock is off, the state can go to Searching, Converged, or
> > +            FlashRequired.
> > +        - name: AeStateFlashRequired
> > +          value: 4
> > +          description: |
> > +            The AE algorithm would need a flash for good results
> > +            If the camera initiates an AE scan, the state shall go to Searching.
> > +            If AeLock is on, the state shall go to Locked.
> > +        - name: AeStatePrecapture
> > +          value: 5
> > +          description: |
> > +            The AE algorithm has started a pre-capture metering session.
> > +            After the sequence is finished, the state shall go to Converged if
> > +            AeLock is off, and Locked if it is on.
> > +            \sa AePrecaptureTrigger
> >
> >    # AeMeteringMode needs further attention:
> >    # - Auto-generate max enum value.
> > @@ -477,36 +531,6 @@ controls:
> >              High quality aberration correction which might reduce the frame
> >              rate.
> >
> > -  - AeState:
> > -      type: int32_t
> > -      draft: true
> > -      description: |
> > -       Control to report the current AE algorithm state. Currently identical to
> > -       ANDROID_CONTROL_AE_STATE.
> > -
> > -        Current state of the AE algorithm.
> > -      enum:
> > -        - name: AeStateInactive
> > -          value: 0
> > -          description: The AE algorithm is inactive.
> > -        - name: AeStateSearching
> > -          value: 1
> > -          description: The AE algorithm has not converged yet.
> > -        - name: AeStateConverged
> > -          value: 2
> > -          description: The AE algorithm has converged.
> > -        - name: AeStateLocked
> > -          value: 3
> > -          description: The AE algorithm is locked.
> > -        - name: AeStateFlashRequired
> > -          value: 4
> > -          description: The AE algorithm would need a flash for good results
> > -        - name: AeStatePrecapture
> > -          value: 5
> > -          description: |
> > -            The AE algorithm has started a pre-capture metering session.
> > -            \sa AePrecaptureTrigger
> > -
> >    - AfState:
> >        type: int32_t
> >        draft: true
Naushir Patuck July 12, 2021, 2:50 p.m. UTC | #5
Hi Laurent,

On Sun, 11 Jul 2021 at 23:37, Laurent Pinchart <
laurent.pinchart@ideasonboard.com> wrote:

> Hi Naush,
>
> (now really CC'ing David)
>
> Apologies for the late reply, I should have shared more information on
> this topic earlier.
>
> On Mon, Jul 05, 2021 at 06:34:07PM +0100, Naushir Patuck wrote:
> > On Fri, 2 Jul 2021 at 11:38, Paul Elder wrote:
> > > AeLocked alone isn't sufficient for reporting the AE state, so replace
> > > it with AeState. Add an AeLock control for instructing the camera to
> > > lock the AE values.
> > >
> > > Update the current users of AeLocked accordingly.
> > >
> > > Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
> > >
> > > ---
> > > No change in v3
> > > ---
> > >  src/ipa/raspberrypi/raspberrypi.cpp |  5 +-
> > >  src/ipa/rkisp1/rkisp1.cpp           | 13 ++--
> > >  src/libcamera/control_ids.yaml      | 96 ++++++++++++++++++-----------
> > >  3 files changed, 71 insertions(+), 43 deletions(-)
> > >
> > > diff --git a/src/ipa/raspberrypi/raspberrypi.cpp
> b/src/ipa/raspberrypi/raspberrypi.cpp
> > > index 4d09a84f..4981aa29 100644
> > > --- a/src/ipa/raspberrypi/raspberrypi.cpp
> > > +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> > > @@ -469,7 +469,10 @@ void IPARPi::reportMetadata()
> > >
> > >         AgcStatus *agcStatus =
> rpiMetadata_.GetLocked<AgcStatus>("agc.status");
> > >         if (agcStatus) {
> > > -               libcameraMetadata_.set(controls::AeLocked,
> agcStatus->locked);
> > > +               libcameraMetadata_.set(controls::AeState,
> > > +                                      agcStatus->locked ?
> > > +                                      controls::AeStateLocked :
> > > +                                      controls::AeStateSearching);
> >
> > Yes, this seems like the correct state mapping.
>
> Is it ? I thought it should be AeStateConverged.
>
> > >                 libcameraMetadata_.set(controls::DigitalGain,
> agcStatus->digital_gain);
> > >         }
> > >
> > > diff --git a/src/ipa/rkisp1/rkisp1.cpp b/src/ipa/rkisp1/rkisp1.cpp
> > > index cdfb4d13..4eca26e2 100644
> > > --- a/src/ipa/rkisp1/rkisp1.cpp
> > > +++ b/src/ipa/rkisp1/rkisp1.cpp
> > > @@ -51,7 +51,7 @@ private:
> > >                               const rkisp1_stat_buffer *stats);
> > >
> > >         void setControls(unsigned int frame);
> > > -       void metadataReady(unsigned int frame, unsigned int aeState);
> > > +       void metadataReady(unsigned int frame, int aeState);
> > >
> > >         std::map<unsigned int, FrameBuffer> buffers_;
> > >         std::map<unsigned int, void *> buffersMemory_;
> > > @@ -227,7 +227,7 @@ void IPARkISP1::updateStatistics(unsigned int
> frame,
> > >                                  const rkisp1_stat_buffer *stats)
> > >  {
> > >         const rkisp1_cif_isp_stat *params = &stats->params;
> > > -       unsigned int aeState = 0;
> > > +       int aeState = controls::AeStateInactive;
> > >
> > >         if (stats->meas_type & RKISP1_CIF_ISP_STAT_AUTOEXP) {
> > >                 const rkisp1_cif_isp_ae_stat *ae = &params->ae;
> > > @@ -262,7 +262,9 @@ void IPARkISP1::updateStatistics(unsigned int
> frame,
> > >                         setControls(frame + 1);
> > >                 }
> > >
> > > -               aeState = fabs(factor - 1.0f) < 0.05f ? 2 : 1;
> > > +               aeState = fabs(factor - 1.0f) < 0.05f ?
> > > +                         controls::AeStateConverged :
> > > +                         controls::AeStateSearching;
> > >         }
> > >
> > >         metadataReady(frame, aeState);
> > > @@ -281,12 +283,11 @@ void IPARkISP1::setControls(unsigned int frame)
> > >         queueFrameAction.emit(frame, op);
> > >  }
> > >
> > > -void IPARkISP1::metadataReady(unsigned int frame, unsigned int
> aeState)
> > > +void IPARkISP1::metadataReady(unsigned int frame, int aeState)
> > >  {
> > >         ControlList ctrls(controls::controls);
> > >
> > > -       if (aeState)
> > > -               ctrls.set(controls::AeLocked, aeState == 2);
> > > +       ctrls.set(controls::AeState, aeState);
> > >
> > >         RkISP1Action op;
> > >         op.op = ActionMetadata;
> > > diff --git a/src/libcamera/control_ids.yaml
> b/src/libcamera/control_ids.yaml
> > > index 9d4638ae..5717bc1f 100644
> > > --- a/src/libcamera/control_ids.yaml
> > > +++ b/src/libcamera/control_ids.yaml
> > > @@ -14,16 +14,70 @@ controls:
> > >
> > >          \sa ExposureTime AnalogueGain
> > >
> > > -  - AeLocked:
> > > +  - AeLock:
> > >        type: bool
> > >        description: |
> > > -        Report the lock status of a running AE algorithm.
> > > +        Control to lock the AE values.
> > > +        When set to true, the AE algorithm is locked to its latest
> parameters,
> > > +        and will not change exposure settings until set to false.
> > > +        \sa AeState
> >
> > I know I am repeating myself here, but I don't see the need to
> differentiate between
> > AeLock and AeEnable  == falses.
>
> Please see below.
>
> > > -        If the AE algorithm is locked the value shall be set to true,
> if it's
> > > -        converging it shall be set to false. If the AE algorithm is
> not
> > > -        running the control shall not be present in the metadata
> control list.
> > > +  - AeState:
> > > +      type: int32_t
> > > +      description: |
> > > +        Control to report the current AE algorithm state. Enabling or
> disabling
> > > +        AE (AeEnable) always resets the AeState to AeStateInactive.
> The camera
> > > +        device can do several state transitions between two results,
> if it is
> > > +        allowed by the state transition table. For example,
> AeStateInactive may
> > > +        never actually be seen in a result.
> > >
> > > -        \sa AeEnable
> > > +        The state in the result is the state for this image (in sync
> with this
> > > +        image). If AE state becomes AeStateConverged, then the image
> data
> > > +        associated with the result should be good to use.
> > > +
> > > +        The state transitions mentioned below assume that AeEnable is
> on.
> > > +
> > > +        \sa AeLock
> > > +      enum:
> > > +        - name: AeStateInactive
> > > +          value: 0
> > > +          description: |
> > > +            The AE algorithm is inactive.
> > > +            If the camera initiates an AE scan, the state shall go to
> Searching.
> > > +            If AeLock is on, the state shall go to Locked.
> > >
> >
> > This state is also a bit ambiguous to me.  Is this the state that must
> be reported
> > if AeEnable == false?  If so, is that not also conveyed by setting the
> latter?
> >
> > If the objective of the change is to wholesale do what Android does,
> then I don't
> > really have any objection to this change as-is, and we can treat the
> above
> > states as best we can in the Raspberry Pi IPA.  Otherwise, I do see some
> scope
> > to simplify some of this.
>
> First of all, let me clarify the goal. The libcamera native API needs to
> offer the features required to implement the Android camera HAL API, but
> we don't need to just duplicate it. When there are reasons to depart
> (for instance to fix historical mistakes made in Android, to provide
> additional features, or just to expose an API we consider more
> consistent), we should do so.
>
> Then, a word about the process. Implementing support for an Android
> control requires plumbing it through the camera stack: definition of a
> libcamera control, translation between the Android and libcamera control
> in the Android camera HAL, implementation of the libcamera control in
> the pipeline handlesr, and, when applicable, in the IPAs. Defining
> libcamera controls, as any API design effort, takes time. To allow
> moving forward on all the other parts, we have a fast-track process to
> define draft controls that mimick the Android controls. All the draft
> controls will be redesigned before freezing the libcamera public API.
>
> This means that AeLock and AeState should be marked as draft controls as
> part of this patch.
>
> This being said, we don't need to define controls as draft controls if
> we want to already go through a complete design process. The discussion
> that stemmed from this patch series goes in that direction.
>
> It's useful, in this context, to understand why the Android AE (and AWB)
> controls have been designed this way. Conceptually speaking, the
> algorithms are considered, in Android, to be one level above manual
> controls.
>
>  Controls                                                        Metadata
>              /---------- == OFF ---------\
>              |                           v
> aeMode      -/             +------+     |\
> aeLock                 --> |  AE  | --> |0\     +--------+
> aeExposureCompensation     +------+     | | --> | Sensor | --> exposureTime
>                                         | |     +--------+     sensitivity
> exposureTime   -----------------------> |1/
> sensitivity                             |/
>
> (The AE controls are in the adnroid.control class, while the sensor
> manual control are in the android.sensor class.)
>
> The aeMode control selects whether the AE algorithm (when != OFF) or the
> manual controls (when == OFF) drive the sensor. The aeLock control tells
> the AE algorithm to freeze its output (when ON) or operate normally
> (when OFF). The metadata always report the values applied to the sensor,
> regardless of whether they come from the AE algorithm or from manual
> controls.
>
> One important difference, compared to the model we currently implement,
> is that the Android AE algorithm is independent from the manual sensor
> controls. The values of the manual controls aren't affected by the AE
> algorithm, and when AE is disabled, the last set manual control values
> are applied. This thus require an aeLock control to transition
> seamlessly between AE and manual mode, as the application needs to set
> the manual controls to the values applied to the sensor on the last
> frame that AE was on, which couldn't otherwise be done due to the
> internal delays.
>

Thank you for the detailed explanation here.  I can now better understand
the Android AE state transitions and why they may be needed.


>
> The existing libcamera AE controls are ill-defined. There's a \todo
> comment to document the interactions between AeEnable and setting a
> fixed value for ExposureTime or AnalogueGain. I'm actually not sure what
> happens if AeEnable is set to false when ExposureTime and/or
> AnalogueGain are 0, or when AeEnable is set to true when ExposureTime
> and/or AnalogueGain are not 0.
>

To provide some brief context on the Raspberry Pi AGC, we never really
go into a "disabled" state at all.  This is because the AGC algorithm always
provides the shutter/gain values to pass to the sensor, even in manual mode
(more on this below).  We use an internal Paused state to lock the
shutter/gain values so that they can never change, which essentially mimics
a "disabled" state.

For manual operation, we pass shutter/gain ctrl values (either or both) to
the AGC, and that is what the AGC requests off the sensor.  This happens
regardless of whether the AGC is in its internal paused state or not.  If
the AGC is not in the paused state, and we only provide one manual control
value (e.g. shutter speed), then the AGC will vary the gain as it sees fit
to converge to the desired brightness.  Setting a manual shutter/gain ctrl
value back to 0 hands control of that parameter back to the AGC, and it
starts adapting it as necessary to converge to the desired brightness.

This type of implementation does allow us to essentially do all that
Android would expect of the AGC, but does also allow us the flexibility
to do Aperture/Shutter/Gain priority modes if needed.


> We need a formal model for how libcamera algorithms interact with manual
> controls. It doesn't have to duplicate the Android model, but it has to
> offer at least the same features. I believe we need to take it one step
> further, as we want to support exposure priority and aperture priority.
> I think we should actually talk about gain/sensitivity priority instead
> of aperture priority for the existing feature, and already prepare for
> devices that have a controllable lens aperture.
>

> Now, the hard question: how should libcamera model this ? :-) Naush,
> David, your feedback from a RPi point of view would be very appreciated.
>
>
It would be easy for me to say do it the Raspberry Pi way :-)
But in reality, I don't think the Android AE model gains us an
advantage over our existing model.  And having to track one less state
(locked vs disabled) is always a good thing.

Explicitly defining AeModes to have (amongst others) ShutterPriority,
GainPriority, AperturePriority and Auto modes would be useful though.
This avoids the messiness of setting 0 to hand manual control back to
the AGC?
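
Purely as an illustration of that suggestion (none of these names exist
in control_ids.yaml today, they are made up for this mail), the
generated constants for such a control could look something like:

    /* Hypothetical AeMode values, for illustration only. */
    enum {
        AeModeAuto = 0,             /* AGC drives everything */
        AeModeShutterPriority = 1,  /* shutter fixed, AGC varies the rest */
        AeModeGainPriority = 2,     /* gain fixed, AGC varies the rest */
        AeModeAperturePriority = 3, /* aperture fixed, AGC varies the rest */
    };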

I'm sure David will also want to share his opinions on this, but he
won't be able to respond for a few days.

Regards,
Naush


> Please also let me know if the above explanation is clear or if you need
> more information.

> > +        - name: AeStateSearching
> > > +          value: 1
> > > +          description: |
> > > +            The AE algorithm has not converged yet.
> > > +            If the camera finishes an AE scan, the state shall go to
> Converged.
> > > +            If the camera finishes an AE scan, but flash is required,
> the state
> > > +            shall go to FlashRequired.
> > > +            If AeLock is on, the state shall go to Locked.
> > > +        - name: AeStateConverged
> > > +          value: 2
> > > +          description: |
> > > +            The AE algorithm has converged.
> > > +            If the camera initiates an AE scan, the state shall go to
> Searching.
> > > +            If AeLock is on, the state shall go to Locked.
> > > +        - name: AeStateLocked
> > > +          value: 3
> > > +          description: |
> > > +            The AE algorithm is locked.
> > > +            If AeLock is off, the state can go to Searching,
> Converged, or
> > > +            FlashRequired.
> > > +        - name: AeStateFlashRequired
> > > +          value: 4
> > > +          description: |
> > > +            The AE algorithm would need a flash for good results
> > > +            If the camera initiates an AE scan, the state shall go to
> Searching.
> > > +            If AeLock is on, the state shall go to Locked.
> > > +        - name: AeStatePrecapture
> > > +          value: 5
> > > +          description: |
> > > +            The AE algorithm has started a pre-capture metering
> session.
> > > +            After the sequence is finished, the state shall go to
> Converged if
> > > +            AeLock is off, and Locked if it is on.
> > > +            \sa AePrecaptureTrigger
> > >
> > >    # AeMeteringMode needs further attention:
> > >    # - Auto-generate max enum value.
> > > @@ -477,36 +531,6 @@ controls:
> > >              High quality aberration correction which might reduce the
> frame
> > >              rate.
> > >
> > > -  - AeState:
> > > -      type: int32_t
> > > -      draft: true
> > > -      description: |
> > > -       Control to report the current AE algorithm state. Currently
> identical to
> > > -       ANDROID_CONTROL_AE_STATE.
> > > -
> > > -        Current state of the AE algorithm.
> > > -      enum:
> > > -        - name: AeStateInactive
> > > -          value: 0
> > > -          description: The AE algorithm is inactive.
> > > -        - name: AeStateSearching
> > > -          value: 1
> > > -          description: The AE algorithm has not converged yet.
> > > -        - name: AeStateConverged
> > > -          value: 2
> > > -          description: The AE algorithm has converged.
> > > -        - name: AeStateLocked
> > > -          value: 3
> > > -          description: The AE algorithm is locked.
> > > -        - name: AeStateFlashRequired
> > > -          value: 4
> > > -          description: The AE algorithm would need a flash for good
> results
> > > -        - name: AeStatePrecapture
> > > -          value: 5
> > > -          description: |
> > > -            The AE algorithm has started a pre-capture metering
> session.
> > > -            \sa AePrecaptureTrigger
> > > -
> > >    - AfState:
> > >        type: int32_t
> > >        draft: true
>
> --
> Regards,
>
> Laurent Pinchart
>
Laurent Pinchart Aug. 16, 2021, 2:07 a.m. UTC | #6
Hi Naush,

On Mon, Jul 12, 2021 at 03:50:29PM +0100, Naushir Patuck wrote:
> On Sun, 11 Jul 2021 at 23:37, Laurent Pinchart wrote:
> > On Mon, Jul 05, 2021 at 06:34:07PM +0100, Naushir Patuck wrote:
> > > On Fri, 2 Jul 2021 at 11:38, Paul Elder wrote:
> > > > AeLocked alone isn't sufficient for reporting the AE state, so replace
> > > > it with AeState. Add an AeLock control for instructing the camera to
> > > > lock the AE values.
> > > >
> > > > Update the current users of AeLocked accordingly.
> > > >
> > > > Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
> > > >
> > > > ---
> > > > No change in v3
> > > > ---
> > > >  src/ipa/raspberrypi/raspberrypi.cpp |  5 +-
> > > >  src/ipa/rkisp1/rkisp1.cpp           | 13 ++--
> > > >  src/libcamera/control_ids.yaml      | 96 ++++++++++++++++++-----------
> > > >  3 files changed, 71 insertions(+), 43 deletions(-)
> > > >
> > > > diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
> > > > index 4d09a84f..4981aa29 100644
> > > > --- a/src/ipa/raspberrypi/raspberrypi.cpp
> > > > +++ b/src/ipa/raspberrypi/raspberrypi.cpp
> > > > @@ -469,7 +469,10 @@ void IPARPi::reportMetadata()
> > > >
> > > >         AgcStatus *agcStatus = rpiMetadata_.GetLocked<AgcStatus>("agc.status");
> > > >         if (agcStatus) {
> > > > -               libcameraMetadata_.set(controls::AeLocked, agcStatus->locked);
> > > > +               libcameraMetadata_.set(controls::AeState,
> > > > +                                      agcStatus->locked ?
> > > > +                                      controls::AeStateLocked :
> > > > +                                      controls::AeStateSearching);
> > >
> > > Yes, this seems like the correct state mapping.
> >
> > Is it ? I thought it should be AeStateConverged.
> >
> > > >                 libcameraMetadata_.set(controls::DigitalGain, agcStatus->digital_gain);
> > > >         }
> > > >
> > > > diff --git a/src/ipa/rkisp1/rkisp1.cpp b/src/ipa/rkisp1/rkisp1.cpp
> > > > index cdfb4d13..4eca26e2 100644
> > > > --- a/src/ipa/rkisp1/rkisp1.cpp
> > > > +++ b/src/ipa/rkisp1/rkisp1.cpp
> > > > @@ -51,7 +51,7 @@ private:
> > > >                               const rkisp1_stat_buffer *stats);
> > > >
> > > >         void setControls(unsigned int frame);
> > > > -       void metadataReady(unsigned int frame, unsigned int aeState);
> > > > +       void metadataReady(unsigned int frame, int aeState);
> > > >
> > > >         std::map<unsigned int, FrameBuffer> buffers_;
> > > >         std::map<unsigned int, void *> buffersMemory_;
> > > > @@ -227,7 +227,7 @@ void IPARkISP1::updateStatistics(unsigned int frame,
> > > >                                  const rkisp1_stat_buffer *stats)
> > > >  {
> > > >         const rkisp1_cif_isp_stat *params = &stats->params;
> > > > -       unsigned int aeState = 0;
> > > > +       int aeState = controls::AeStateInactive;
> > > >
> > > >         if (stats->meas_type & RKISP1_CIF_ISP_STAT_AUTOEXP) {
> > > >                 const rkisp1_cif_isp_ae_stat *ae = &params->ae;
> > > > @@ -262,7 +262,9 @@ void IPARkISP1::updateStatistics(unsigned int frame,
> > > >                         setControls(frame + 1);
> > > >                 }
> > > >
> > > > -               aeState = fabs(factor - 1.0f) < 0.05f ? 2 : 1;
> > > > +               aeState = fabs(factor - 1.0f) < 0.05f ?
> > > > +                         controls::AeStateConverged :
> > > > +                         controls::AeStateSearching;
> > > >         }
> > > >
> > > >         metadataReady(frame, aeState);
> > > > @@ -281,12 +283,11 @@ void IPARkISP1::setControls(unsigned int frame)
> > > >         queueFrameAction.emit(frame, op);
> > > >  }
> > > >
> > > > -void IPARkISP1::metadataReady(unsigned int frame, unsigned int aeState)
> > > > +void IPARkISP1::metadataReady(unsigned int frame, int aeState)
> > > >  {
> > > >         ControlList ctrls(controls::controls);
> > > >
> > > > -       if (aeState)
> > > > -               ctrls.set(controls::AeLocked, aeState == 2);
> > > > +       ctrls.set(controls::AeState, aeState);
> > > >
> > > >         RkISP1Action op;
> > > >         op.op = ActionMetadata;
> > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> > > > index 9d4638ae..5717bc1f 100644
> > > > --- a/src/libcamera/control_ids.yaml
> > > > +++ b/src/libcamera/control_ids.yaml
> > > > @@ -14,16 +14,70 @@ controls:
> > > >
> > > >          \sa ExposureTime AnalogueGain
> > > >
> > > > -  - AeLocked:
> > > > +  - AeLock:
> > > >        type: bool
> > > >        description: |
> > > > -        Report the lock status of a running AE algorithm.
> > > > +        Control to lock the AE values.
> > > > +        When set to true, the AE algorithm is locked to its latest parameters,
> > > > +        and will not change exposure settings until set to false.
> > > > +        \sa AeState
> > >
> > > I know I am repeating myself here, but I don't see the need to differentiate between
> > > AeLock and AeEnable  == falses.
> >
> > Please see below.
> >
> > > > -        If the AE algorithm is locked the value shall be set to true, if it's
> > > > -        converging it shall be set to false. If the AE algorithm is not
> > > > -        running the control shall not be present in the metadata control list.
> > > > +  - AeState:
> > > > +      type: int32_t
> > > > +      description: |
> > > > +        Control to report the current AE algorithm state. Enabling or disabling
> > > > +        AE (AeEnable) always resets the AeState to AeStateInactive. The camera
> > > > +        device can do several state transitions between two results, if it is
> > > > +        allowed by the state transition table. For example, AeStateInactive may
> > > > +        never actually be seen in a result.
> > > >
> > > > -        \sa AeEnable
> > > > +        The state in the result is the state for this image (in sync with this
> > > > +        image). If AE state becomes AeStateConverged, then the image data
> > > > +        associated with the result should be good to use.
> > > > +
> > > > +        The state transitions mentioned below assume that AeEnable is on.
> > > > +
> > > > +        \sa AeLock
> > > > +      enum:
> > > > +        - name: AeStateInactive
> > > > +          value: 0
> > > > +          description: |
> > > > +            The AE algorithm is inactive.
> > > > +            If the camera initiates an AE scan, the state shall go to Searching.
> > > > +            If AeLock is on, the state shall go to Locked.
> > > >
> > >
> > > This state is also a bit ambiguous to me.  Is this the state that must be reported
> > > if AeEnable == false?  If so, is that not also conveyed by setting the latter?
> > >
> > > If the objective of the change is to wholesale do what Android does, then I don't
> > > really have any objection to this change as-is, and we can treat the above
> > > states as best we can in the Raspberry Pi IPA.  Otherwise, I do see some scope
> > > to simplify some of this.
> >
> > First of all, let me clarify the goal. The libcamera native API needs to
> > offer the features required to implement the Android camera HAL API, but
> > we don't need to just duplicate it. When there are reasons to depart
> > (for instance to fix historical mistakes made in Android, to provide
> > additional features, or just to expose an API we consider more
> > consistent), we should do so.
> >
> > Then, a word about the process. Implementing support for an Android
> > control requires plumbing it through the camera stack: definition of a
> > libcamera control, translation between the Android and libcamera control
> > in the Android camera HAL, implementation of the libcamera control in
> > the pipeline handlesr, and, when applicable, in the IPAs. Defining
> > libcamera controls, as any API design effort, takes time. To allow
> > moving forward on all the other parts, we have a fast-track process to
> > define draft controls that mimick the Android controls. All the draft
> > controls will be redesigned before freezing the libcamera public API.
> >
> > This means that AeLock and AeState should be marked as draft controls as
> > part of this patch.
> >
> > This being said, we don't need to define controls as draft controls if
> > we want to already go through a complete design process. The discussion
> > that stemmed from this patch series goes in that direction.
> >
> > It's useful, in this context, to understand why the Android AE (and AWB)
> > controls have been designed this way. Conceptually speaking, the
> > algorithms are considered, in Android, to be one level above manual
> > controls.
> >
> >  Controls                                                        Metadata
> >              /---------- == OFF ---------\
> >              |                           v
> > aeMode      -/             +------+     |\
> > aeLock                 --> |  AE  | --> |0\     +--------+
> > aeExposureCompensation     +------+     | | --> | Sensor | --> exposureTime
> >                                         | |     +--------+     sensitivity
> > exposureTime   -----------------------> |1/
> > sensitivity                             |/
> >
> > (The AE controls are in the adnroid.control class, while the sensor
> > manual control are in the android.sensor class.)
> >
> > The aeMode control selects whether the AE algorithm (when != OFF) or the
> > manual controls (when == OFF) drive the sensor. The aeLock control tells
> > the AE algorithm to freeze its output (when ON) or operate normally
> > (when OFF). The metadata always report the values applied to the sensor,
> > regardless of whether they come from the AE algorithm or from manual
> > controls.
> >
> > One important difference, compared to the model we currently implement,
> > is that the Android AE algorithm is independent from the manual sensor
> > controls. The values of the manual controls aren't affected by the AE
> > algorithm, and when AE is disabled, the last set manual control values
> > are applied. This thus require an aeLock control to transition
> > seamlessly between AE and manual mode, as the application needs to set
> > the manual controls to the values applied to the sensor on the last
> > frame that AE was on, which couldn't otherwise be done due to the
> > internal delays.
> 
> Thank you for the detailed explanation here.  I can now better understand
> the Android AE state transitions and why they may be needed.
> 
> > The existing libcamera AE controls are ill-defined. There's a \todo
> > comment to document the interactions between AeEnable and setting a
> > fixed value for ExposureTime or AnalogueGain. I'm actually not sure what
> > happens if AeEnable is set to false when ExposureTime and/or
> > AnalogueGain are 0, or when AeEnable is set to true when ExposureTime
> > and/or AnalogueGain are not 0.
> 
> To provide some brief context on the Raspberry Pi AGC, we never really
> go into a "disabled" state at all.  This is because the AGC algorithm always
> provides the shutter/gain values to pass to the sensor, even in manual mode
> (more on this below).  We use an internal Paused state to lock the shutter/gain
> values so that they can never change, which essentially mimics a "disabled"
> state.
> 
> For manual operation, we pass shutter/gain ctrl values (either or both) to the AGC,
> and that is what the AGC requests off the sensor.  This happens regardless of
> whether the AGC is in it's internal paused state or not.  If the AGC is not in
> paused state, and we only provide one manual control value (e.g. shutter speed),
> then the AGC will vary the gain as it sees fit to converge to the desired
> brightness.  Setting a manual shutter/gain ctrl value back to 0 hands back control
> of the control to the AGC and it starts adapting it as necessary to converge to
> the desired brightness.
> 
> This type of implementation does allow us to essentially do all that Android would
> except of the AGC, but does also allow us the flexibility to do Aperture/Shutter/Gain
> priority modes if needed.

If I understand correctly, when the AeEnable control is set to false,
the AGC algorithm is paused, which just means that it sets the fixed
shutter and gain to the last computed values. Those are the values that
will then be applied to the sensor, until either the AeEnable control is
set back to true, or the manual gain or exposure time control is set to
a different value.

This leads us to the first interesting behaviour: if the application
keeps queuing requests that contain the AeEnable control set to false, as
well as manual exposure and/or gain values, in order to apply the manual
values as expected, the Agc::SetFixedShutter() and
Agc::SetFixedAnalogueGain() functions copy the fixed values to the
status_ structure, so that pausing the algorithm won't override them.
There's a bit of fragility here that I'm not sure I like, as it seems
fairly error-prone (not in the existing Raspberry Pi IPA implementation,
but from a general IPA module development point of view).

Another interesting side effect is what happens if the application then
sets the exposure time and/or gain to 0 while keeping AeEnable to false.
The AGC algorithm will resume operation in that case. We can of course
argue that this is a corner case and that applications shouldn't do
this, but it's in any case something that needs to be documented.

Looking at how to switch from auto to manual exposure seamlessly, an
application would need to go through the following sequence:

- Set AeEnable to false in a request, without setting a manual gain or
  exposure.
- Wait until metadata reports AeEnable set to false (this isn't
  currently implemented in the Raspberry Pi IPA), to know that the
  setting has taken effect.
- Take the reported gain and exposure time from the request that
  reported AeEnable set to false, and use them as starting values for
  manual control of the AGC.
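
A rough sketch of that sequence, written against the controls as they
exist today (including the AeEnable metadata report that, as noted, the
Raspberry Pi IPA doesn't implement yet), might look as follows. It
assumes the ControlList::get() signature current at the time of this
series, and leaves out the capture loop and error handling.

    #include <libcamera/camera.h>
    #include <libcamera/control_ids.h>
    #include <libcamera/controls.h>
    #include <libcamera/request.h>

    using namespace libcamera;

    /* Step 1: queue a request that disables AE, with no manual values. */
    static void startSwitchToManual(Camera *camera, Request *request)
    {
        request->controls().set(controls::AeEnable, false);
        camera->queueRequest(request);
    }

    /*
     * Steps 2 and 3, from the requestCompleted handler: once the
     * metadata reports AeEnable == false, reuse the reported values as
     * the starting point for manual control in the next request.
     * Simplification: this assumes AeEnable is always present in the
     * metadata.
     */
    static void finishSwitchToManual(Request *completed, Request *next)
    {
        const ControlList &meta = completed->metadata();
        if (meta.get(controls::AeEnable))
            return;

        next->controls().set(controls::ExposureTime,
                             meta.get(controls::ExposureTime));
        next->controls().set(controls::AnalogueGain,
                             meta.get(controls::AnalogueGain));
    }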

On Android, the equivalent sequence is documented in
https://ideasonboard.org/android/docs.html#controls_android.control.aeLock.
The sequences are quite similar, the main difference being that what
Android calls aeLock is called AeEnable in libcamera. This causes
another issue: the AeEnable control needs to be reported in metadata,
but its value doesn't actually indicate whether the gain and exposure
time are controlled automatically or not. If an application sets
AeEnable to true but also sets manual gain and exposure time, metadata
will report AeEnable to true, but the AGC will be manually controlled.
This will also be difficult to document in a clear way I think.

Another point that bothers me is that treating 0 as a magic value to
mean automatic control prevents from reporting a minimum value for the
exposure time and gain to applications. The Raspberry Pi IPA reports the
AnalogueGain control as having a 1.0 minimum value, but when we'll
enforce bound checking in the libcamera core (as this really shouldn't
be duplicated by all IPAs), this will break.

> > We need a formal model for how libcamera algorithms interact with manual
> > controls. It doesn't have to duplicate the Android model, but it has to
> > offer at least the same features. I believe we need to take it one step
> > further, as we want to support exposure priority and aperture priority.
> > I think we should actually talk about gain/sensitivity priority instead
> > of aperture priority for the existing feature, and already prepare for
> > devices that have a controllable lens aperture.
> >
> > Now, the hard question: how should libcamera model this ? :-) Naush,
> > David, your feedback from a RPi point of view would be very appreciated.
>
> It would be easy for me to say do it the Raspberry Pi way :-)

I have no ideological opposition to that. What bothers me at the moment
is that the interactions between the controls related to AGC are not
well defined in the documentation, and based on the comments above, I
think it will be difficult to do so in a clear way, and avoid
error-prone behaviours. There's also the bound checking issue for the
gain (and exposure time, the lower limit happens to be 0 for the
Raspberry Pi IPA, but that won't be the case universally) that seems
difficult to fix.

I fear that documenting the behaviour of the existing controls will be
difficult, and will result in unclear documentation. If we want to keep
the existing behaviour, I'd like to see documentation that shows that my
fears are unfounded.

This being said, I'm not advocating for replicating the Android
controls, as, as discussed previously, they don't support implementing
shutter or gain priority, which is an important feature. We may however
decide to design a different semantics for the AE controls that would
support all our use cases and result in simpler documentation.

> But in reality, I don't think the Android AE model gains us an advantage over our existing
> model.  And having to track one less state (locked vs disabled) is always a good thing.
> 
> Explicitly defining AeModes to have (amongst others) ShutterPrioiry, GainPrioiry,
> AperturePriority and Auto modes would be useful though.  This avoids the messiness of
> setting 0 to hand back manual control back to the AGC?

Once we get more than two parameters to control (adding aperture to
shutter and gain), I think this would become a bit messy, as there will
be 8 combinations of what can be controlled manually vs. automatically.

> I'm sure David will also want to share his opinions on this, but he won't be able to
> respond for a few days.

If David has time to share his opinion at some point, that would be
helpful.

> > Please also let me know if the above explanation is clear or if you need
> > more information.
> >
> > > +        - name: AeStateSearching
> > > > +          value: 1
> > > > +          description: |
> > > > +            The AE algorithm has not converged yet.
> > > > +            If the camera finishes an AE scan, the state shall go to Converged.
> > > > +            If the camera finishes an AE scan, but flash is required, the state
> > > > +            shall go to FlashRequired.
> > > > +            If AeLock is on, the state shall go to Locked.
> > > > +        - name: AeStateConverged
> > > > +          value: 2
> > > > +          description: |
> > > > +            The AE algorithm has converged.
> > > > +            If the camera initiates an AE scan, the state shall go to Searching.
> > > > +            If AeLock is on, the state shall go to Locked.
> > > > +        - name: AeStateLocked
> > > > +          value: 3
> > > > +          description: |
> > > > +            The AE algorithm is locked.
> > > > +            If AeLock is off, the state can go to Searching, Converged, or
> > > > +            FlashRequired.
> > > > +        - name: AeStateFlashRequired
> > > > +          value: 4
> > > > +          description: |
> > > > +            The AE algorithm would need a flash for good results
> > > > +            If the camera initiates an AE scan, the state shall go to Searching.
> > > > +            If AeLock is on, the state shall go to Locked.
> > > > +        - name: AeStatePrecapture
> > > > +          value: 5
> > > > +          description: |
> > > > +            The AE algorithm has started a pre-capture metering session.
> > > > +            After the sequence is finished, the state shall go to Converged if
> > > > +            AeLock is off, and Locked if it is on.
> > > > +            \sa AePrecaptureTrigger
> > > >
> > > >    # AeMeteringMode needs further attention:
> > > >    # - Auto-generate max enum value.
> > > > @@ -477,36 +531,6 @@ controls:
> > > >              High quality aberration correction which might reduce the frame
> > > >              rate.
> > > >
> > > > -  - AeState:
> > > > -      type: int32_t
> > > > -      draft: true
> > > > -      description: |
> > > > -       Control to report the current AE algorithm state. Currently identical to
> > > > -       ANDROID_CONTROL_AE_STATE.
> > > > -
> > > > -        Current state of the AE algorithm.
> > > > -      enum:
> > > > -        - name: AeStateInactive
> > > > -          value: 0
> > > > -          description: The AE algorithm is inactive.
> > > > -        - name: AeStateSearching
> > > > -          value: 1
> > > > -          description: The AE algorithm has not converged yet.
> > > > -        - name: AeStateConverged
> > > > -          value: 2
> > > > -          description: The AE algorithm has converged.
> > > > -        - name: AeStateLocked
> > > > -          value: 3
> > > > -          description: The AE algorithm is locked.
> > > > -        - name: AeStateFlashRequired
> > > > -          value: 4
> > > > -          description: The AE algorithm would need a flash for good results
> > > > -        - name: AeStatePrecapture
> > > > -          value: 5
> > > > -          description: |
> > > > -            The AE algorithm has started a pre-capture metering session.
> > > > -            \sa AePrecaptureTrigger
> > > > -
> > > >    - AfState:
> > > >        type: int32_t
> > > >        draft: true
David Plowman Aug. 24, 2021, 3:04 p.m. UTC | #7
Hi Laurent

Sorry for the late reply, it seemed to get erroneously discarded by my
human level spam filter (i.e. myself)! Apologies for that. I've also
deleted anything with more than three "> > >" indents, I was rather
drowning - hope that doesn't confuse anyone!

Anyway, here are some of my thoughts on the matter. In the end I
decided to leave them at the top here, I'm not too sure how well they
fit into all the text below.

1. I like the idea of being able to fix some but not all of the
AEC/AGC "levers" (e.g. gain but not exposure). Android seems not to
allow this? But I think it's essential.

2. We don't like unfixing these levers by writing a magic zero. There
needs to be another mechanism.

3. In the RPi code, we fix specific levers by writing a fixed value.
Yet there's no way to fix a single lever at the current value, which
is what happens to all the levers when we disable AEC/AGC. That feels
a bit inconsistent.

4. I think point 1 above means that fixing all levers is a natural
state of the AEC/AGC algorithm, so there doesn't need to be any kind
of completely separate manual mode. My personal view is therefore that
libcamera possibly shouldn't emulate the Android behaviour here.

So here's something that might work for everyone:

- Every lever has its own enable, assuming the implementation can support it.

- Disabling AEC/AGC globally is the same as disabling all the levers.

- Enabling a specific lever is obviously the new version of writing a
zero to it!

- When you disable a lever, it gets fixed to its current value.

- Or you can disable a lever by writing a specific value, which is
obviously then the value it gets.

So I think that might cover all the use cases we have? I don't really
see the point of a separate "AeLock" control in libcamera itself, it
simply seems unnecessary (unless I've missed something). Nonetheless,
the Android view of the world can easily be implemented on top of
this, including the "lock" mechanism.
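
A purely hypothetical sketch of how this could look from the
application side (the AnalogueGainAuto name below does not exist, it is
only there to illustrate the per-lever enable; "request" is an existing
libcamera Request, with the usual libcamera headers included):

    ControlList &ctrls = request->controls();

    /* Freeze the gain at whatever the AGC is currently using, while
     * the shutter stays automatic (hypothetical control). */
    ctrls.set(controls::AnalogueGainAuto, false);

    /* Alternatively, fix it to an explicit value in the same request. */
    ctrls.set(controls::AnalogueGainAuto, false);
    ctrls.set(controls::AnalogueGain, 4.0f);

    /* And hand it back to the AGC later, with no magic 0 involved. */
    ctrls.set(controls::AnalogueGainAuto, true);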

I expect non-Android applications will prefer to disable levers
without specifying a value so that you just pick up the current one.
To me this feels like the most usual situation.

But I'm always happy to be corrected or persuaded otherwise!

Thanks

David

On Mon, 16 Aug 2021 at 03:07, Laurent Pinchart
<laurent.pinchart@ideasonboard.com> wrote:
>
> Hi Naush,
>
> On Mon, Jul 12, 2021 at 03:50:29PM +0100, Naushir Patuck wrote:
> > On Sun, 11 Jul 2021 at 23:37, Laurent Pinchart wrote:
> > > On Mon, Jul 05, 2021 at 06:34:07PM +0100, Naushir Patuck wrote:
...
> > >
> > > Is it ? I thought it should be AeStateConverged.
> > >
...
> > >
> > > Please see below.
> > >
...
> > >
> > > First of all, let me clarify the goal. The libcamera native API needs to
> > > offer the features required to implement the Android camera HAL API, but
> > > we don't need to just duplicate it. When there are reasons to depart
> > > (for instance to fix historical mistakes made in Android, to provide
> > > additional features, or just to expose an API we consider more
> > > consistent), we should do so.
> > >
> > > Then, a word about the process. Implementing support for an Android
> > > control requires plumbing it through the camera stack: definition of a
> > > libcamera control, translation between the Android and libcamera control
> > > in the Android camera HAL, implementation of the libcamera control in
> > > the pipeline handlesr, and, when applicable, in the IPAs. Defining
> > > libcamera controls, as any API design effort, takes time. To allow
> > > moving forward on all the other parts, we have a fast-track process to
> > > define draft controls that mimick the Android controls. All the draft
> > > controls will be redesigned before freezing the libcamera public API.
> > >
> > > This means that AeLock and AeState should be marked as draft controls as
> > > part of this patch.
> > >
> > > This being said, we don't need to define controls as draft controls if
> > > we want to already go through a complete design process. The discussion
> > > that stemmed from this patch series goes in that direction.
> > >
> > > It's useful, in this context, to understand why the Android AE (and AWB)
> > > controls have been designed this way. Conceptually speaking, the
> > > algorithms are considered, in Android, to be one level above manual
> > > controls.
> > >
> > >  Controls                                                        Metadata
> > >              /---------- == OFF ---------\
> > >              |                           v
> > > aeMode      -/             +------+     |\
> > > aeLock                 --> |  AE  | --> |0\     +--------+
> > > aeExposureCompensation     +------+     | | --> | Sensor | --> exposureTime
> > >                                         | |     +--------+     sensitivity
> > > exposureTime   -----------------------> |1/
> > > sensitivity                             |/
> > >
> > > (The AE controls are in the adnroid.control class, while the sensor
> > > manual control are in the android.sensor class.)
> > >
> > > The aeMode control selects whether the AE algorithm (when != OFF) or the
> > > manual controls (when == OFF) drive the sensor. The aeLock control tells
> > > the AE algorithm to freeze its output (when ON) or operate normally
> > > (when OFF). The metadata always report the values applied to the sensor,
> > > regardless of whether they come from the AE algorithm or from manual
> > > controls.
> > >
> > > One important difference, compared to the model we currently implement,
> > > is that the Android AE algorithm is independent from the manual sensor
> > > controls. The values of the manual controls aren't affected by the AE
> > > algorithm, and when AE is disabled, the last set manual control values
> > > are applied. This thus require an aeLock control to transition
> > > seamlessly between AE and manual mode, as the application needs to set
> > > the manual controls to the values applied to the sensor on the last
> > > frame that AE was on, which couldn't otherwise be done due to the
> > > internal delays.
> >
> > Thank you for the detailed explanation here.  I can now better understand
> > the Android AE state transitions and why they may be needed.
> >
> > > The existing libcamera AE controls are ill-defined. There's a \todo
> > > comment to document the interactions between AeEnable and setting a
> > > fixed value for ExposureTime or AnalogueGain. I'm actually not sure what
> > > happens if AeEnable is set to false when ExposureTime and/or
> > > AnalogueGain are 0, or when AeEnable is set to true when ExposureTime
> > > and/or AnalogueGain are not 0.
> >
> > To provide some brief context on the Raspberry Pi AGC, we never really
> > go into a "disabled" state at all.  This is because the AGC algorithm always
> > provides the shutter/gain values to pass to the sensor, even in manual mode
> > (more on this below).  We use an internal Paused state to lock the shutter/gain
> > values so that they can never change, which essentially mimics a "disabled"
> > state.
> >
> > For manual operation, we pass shutter/gain ctrl values (either or both) to the AGC,
> > and that is what the AGC requests off the sensor.  This happens regardless of
> > whether the AGC is in it's internal paused state or not.  If the AGC is not in
> > paused state, and we only provide one manual control value (e.g. shutter speed),
> > then the AGC will vary the gain as it sees fit to converge to the desired
> > brightness.  Setting a manual shutter/gain ctrl value back to 0 hands back control
> > of the control to the AGC and it starts adapting it as necessary to converge to
> > the desired brightness.
> >
> > This type of implementation does allow us to essentially do all that Android would
> > except of the AGC, but does also allow us the flexibility to do Aperture/Shutter/Gain
> > priority modes if needed.
>
> If I understand correctly, when the AeEnable control is set to false,
> the AGC algorithm is paused, which just means that it sets the fixed
> shutter and gain to the last computed values. Those are the values that
> will then be applied to the sensor, until either the AeEnable control is
> set back to true, or the manual gain or exposure time control is set to
> a different value.
>
> This leads us to the first interesting behaviour: if the application
> keeps queuing request that contain the AeEnable control set to false, as
> well as manual exposure and/or gain values, in order to apply the manual
> values as expected, the Agc::SetFixedShutter() and
> Agc::SetFixedAnalogueGain() functions copy the fixed values to the
> status_ structure, so that pausing the algorithm won't override them.
> There's a bit of fragility here that I'm not sure to like, as it seems
> fairly error-prone (not in the existing Raspberry Pi IPA implementation,
> but from a general IPA module development point of view).
>
> Another interesting side effect is what happens if the application then
> sets the exposure time and/or gain to 0 while keeping AeEnable to false.
> The AGC algorithm will resume operation in that case. We can of course
> argue that this is a corner case and that applications shouldn't do
> this, but it's in any case something that needs to be documented.
>
> Looking at how to switch from auto to manual exposure seamlessly, an
> application would need to go through the following sequence:
>
> - Set AeEnable to false in a request, without setting a manual gain or
>   exposure.
> - Wait until metadata reports AeEnable set to false (this isn't
>   currently implemented in the Raspberry Pi IPA), to know that the
>   setting has taken effect.
> - Take the reported gain and exposure time from the request that
>   reported AeEnable set to false, and use them as starting values for
>   manual control of the AGC.
>
> On Android, the equivalent sequence is documented in
> https://ideasonboard.org/android/docs.html#controls_android.control.aeLock.
> The sequences are quite similar, the main difference being that when
> Android calls aeLock is called AeEnable in libcamera. This causes
> another issue: the AeEnable control needs to be reported in metadata,
> but its value doesn't actually indicate whether the gain and exposure
> time are controlled automatically or not. If an application sets
> AeEnable to true but also sets manual gain and exposure time, metadata
> will report AeEnable to true, but the AGC will be manually controlled.
> This will also be difficult to document in a clear way I think.
>
> Another point that bothers me is that treating 0 as a magic value to
> mean automatic control prevents from reporting a minimum value for the
> exposure time and gain to applications. The Raspberry Pi IPA report the
> AnalogueGain control as having a 1.0 minimum value, but when we'll
> enforce bound checking in the libcamera core (as this really shouldn't
> be duplicated by all IPAs), this will break.
>
> > > We need a formal model for how libcamera algorithms interact with manual
> > > controls. It doesn't have to duplicate the Android model, but it has to
> > > offer at least the same features. I believe we need to take it one step
> > > further, as we want to support exposure priority and aperture priority.
> > > I think we should actually talk about gain/sensitivity priority instead
> > > of aperture priority for the existing feature, and already prepare for
> > > devices that have a controllable lens aperture.
> > >
> > > Now, the hard question: how should libcamera model this ? :-) Naush,
> > > David, your feedback from a RPi point of view would be very appreciated.
> >
> > It would be easy for me to say do it the Raspberry Pi way :-)
>
> I have no ideological opposition to that. What bothers me at the moment
> is that the interactions between the controls related to AGC are not
> well defined in the documentation, and based on the comments above, I
> think it will be difficult to do so in a clear way, and avoid
> error-prone behaviours. There's also the bound checking issue for the
> gain (and exposure time, the lower limit happens to be 0 for the
> Raspberry Pi IPA, but that won't be the case universally) that seems
> difficult to fix.
>
> I fear that documenting the behaviour of the existing controls will be
> difficult, and will result in unclear documentation. If we want to keep
> the existing behaviour, I'd like to see documentation that shows that my
> fears are unfounded.
>
> This being said, I'm not advocating for replicating the Android
> controls, as, as discussed previously, they don't support implementing
> shutter or gain priority, which is an important feature. We may however
> decide to design a different semantics for the AE controls that would
> support all our use cases and result in simpler documentation.
>
> > But in reality, I don't think the Android AE model gains us an advantage over our existing
> > model.  And having to track one less state (locked vs disabled) is always a good thing.
> >
> > Explicitly defining AeModes to have (amongst others) ShutterPrioiry, GainPrioiry,
> > AperturePriority and Auto modes would be useful though.  This avoids the messiness of
> > setting 0 to hand back manual control back to the AGC?
>
> Once we'll get more than two parameters to control (adding aperture to
> shutter and gain), I think this would become a bit messy, as there will
> be 8 combinations of what can be controlled manually vs. automatically.
>
> > I'm sure David will also want to share his opinions on this, but he won't be able to
> > respond for a few days.
>
> If David has time to share his opinion at some point, that would be
> helpful.
>
> > > Please also let me know if the above explanation is clear or if you need
> > > more information.
> > >
...
>
> --
> Regards,
>
> Laurent Pinchart
Paul Elder Aug. 25, 2021, 10:20 a.m. UTC | #8
Hi David,

Thank you for your input.

On Tue, Aug 24, 2021 at 04:04:47PM +0100, David Plowman wrote:
> Hi Laurent
> 
> Sorry for the late reply, it seemed to get erroneously discarded by my
> human level spam filter (i.e. myself)! Apologies for that. I've also
> deleted anything with more than three "> > >" indents, I was rather
> drowning - hope that doesn't confuse anyone!
> 
> Anyway, here are some of my thoughts on the matter. In the end I
> decided to leave them at the top here, I'm not too sure how well they
> fit into all the text below.
> 
> 1. I like the idea of being able to fix some but not all of the
> AEC/AGC "levers" (e.g. gain but not exposure). Android seems not to
> allow this? But I think it's essential.
> 
> 2. We don't like unfixing these levers by writing a magic zero. There
> needs to be another mechanism.
> 
> 3. In the RPi code, we fix specific levers by writing a fixed value.
> Yet there's no way to fix a single lever at the current value, which
> is what happens to all the levers when we disable AEC/AGC. That feels
> a bit inconsistent.
> 
> 4. I think point 1 above means that fixing all levers is a natural
> state of the AEC/AGC algorithm, so there doesn't need to be any kind
> of completely separate manual mode. My personal view is therefore that
> libcamera possibly shouldn't emulate the Android behaviour here.

After a long meditation, I've reached these same points as well. I'm
glad we agree on these points.

> 
> So here's something that might work for everyone:
> 
> - Every lever has its own enable, assuming the implementation can support it.
> 
> - Disabling AEC/AGC globally is the same as disabling all the levers.
> 
> - Enabling a specific lever is obviously the new version of writing a
> zero to it!
> 
> - When you disable a lever, it gets fixed to its current value.
> 
> - Or you can disable a lever by writing a specific value, which is
> obviously then the value it gets.

And I'm working on a design that practically does all these things :)

> 
> So I think that might cover all the use cases we have? I don't really
> see the point of a separate "AeLock" control in libcamera itself, it
> simply seems unnecessary (unless I've missed something). Nonetheless,
> the Android view of the world can easily be implemented on top of
> this, including the "lock" mechanism.

We need AeLock to transition seamlessly from auto to manual. If you set
AE to off and feed the values that were last computed by AE in the same
request, then the requests that were already in-flight would still have
AE on, and the values might change. Then once your first "AE off"
request gets in, it'll be a seamful (i.e. visible) change from auto to
manual.

Instead, you set AeLock and wait for it to propagate through the
pipeline. Once it has, you can copy those exposure values, and send a
request with those + AE off. Thus you can have seamless switching from
auto to off.
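
A sketch of that flow, using the AeLock and AeState controls proposed
in this series (fragments only; "camera", "request", "completed" and
"next" are assumed to exist, and error handling is omitted):

    /* Step 1: lock the AGC output while AE stays enabled. */
    request->controls().set(controls::AeLock, true);
    camera->queueRequest(request);

    /* Step 2, in the requestCompleted handler: wait until the metadata
     * reports the lock as effective, then copy the reported values. */
    const ControlList &meta = completed->metadata();
    if (meta.get(controls::AeState) == controls::AeStateLocked) {
        /* Step 3: switch to manual with the copied values. */
        ControlList &nextCtrls = next->controls();
        nextCtrls.set(controls::AeEnable, false);
        nextCtrls.set(controls::ExposureTime,
                      meta.get(controls::ExposureTime));
        nextCtrls.set(controls::AnalogueGain,
                      meta.get(controls::AnalogueGain));
    }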

I'm thinking of having some sort of lock for each of the "sub-levers"
(as you call them; I call them "sub-controls"). But I need to figure
out and confirm that the flow will work.


Thanks,

Paul

> I expect non-Android applications will prefer to disable levers
> without specifying a value so that you just pick up the current one.
> To me this feels like the most usual situation.
> 
> But I'm always happy to be corrected or persuaded otherwise!
> 
> Thanks
> 
> David
> 
> On Mon, 16 Aug 2021 at 03:07, Laurent Pinchart
> <laurent.pinchart@ideasonboard.com> wrote:
> >
> > Hi Naush,
> >
> > On Mon, Jul 12, 2021 at 03:50:29PM +0100, Naushir Patuck wrote:
> > > On Sun, 11 Jul 2021 at 23:37, Laurent Pinchart wrote:
> > > > On Mon, Jul 05, 2021 at 06:34:07PM +0100, Naushir Patuck wrote:
> ...
> > > >
> > > > Is it ? I thought it should be AeStateConverged.
> > > >
> ...
> > > >
> > > > Please see below.
> > > >
> ...
> > > >
> > > > First of all, let me clarify the goal. The libcamera native API needs to
> > > > offer the features required to implement the Android camera HAL API, but
> > > > we don't need to just duplicate it. When there are reasons to depart
> > > > (for instance to fix historical mistakes made in Android, to provide
> > > > additional features, or just to expose an API we consider more
> > > > consistent), we should do so.
> > > >
> > > > Then, a word about the process. Implementing support for an Android
> > > > control requires plumbing it through the camera stack: definition of a
> > > > libcamera control, translation between the Android and libcamera control
> > > > in the Android camera HAL, implementation of the libcamera control in
> > > > the pipeline handlesr, and, when applicable, in the IPAs. Defining
> > > > libcamera controls, as any API design effort, takes time. To allow
> > > > moving forward on all the other parts, we have a fast-track process to
> > > > define draft controls that mimick the Android controls. All the draft
> > > > controls will be redesigned before freezing the libcamera public API.
> > > >
> > > > This means that AeLock and AeState should be marked as draft controls as
> > > > part of this patch.
> > > >
> > > > This being said, we don't need to define controls as draft controls if
> > > > we want to already go through a complete design process. The discussion
> > > > that stemmed from this patch series goes in that direction.
> > > >
> > > > It's useful, in this context, to understand why the Android AE (and AWB)
> > > > controls have been designed this way. Conceptually speaking, the
> > > > algorithms are considered, in Android, to be one level above manual
> > > > controls.
> > > >
> > > >  Controls                                                        Metadata
> > > >              /---------- == OFF ---------\
> > > >              |                           v
> > > > aeMode      -/             +------+     |\
> > > > aeLock                 --> |  AE  | --> |0\     +--------+
> > > > aeExposureCompensation     +------+     | | --> | Sensor | --> exposureTime
> > > >                                         | |     +--------+     sensitivity
> > > > exposureTime   -----------------------> |1/
> > > > sensitivity                             |/
> > > >
> > > > (The AE controls are in the adnroid.control class, while the sensor
> > > > manual control are in the android.sensor class.)
> > > >
> > > > The aeMode control selects whether the AE algorithm (when != OFF) or the
> > > > manual controls (when == OFF) drive the sensor. The aeLock control tells
> > > > the AE algorithm to freeze its output (when ON) or operate normally
> > > > (when OFF). The metadata always report the values applied to the sensor,
> > > > regardless of whether they come from the AE algorithm or from manual
> > > > controls.
> > > >
> > > > One important difference, compared to the model we currently implement,
> > > > is that the Android AE algorithm is independent from the manual sensor
> > > > controls. The values of the manual controls aren't affected by the AE
> > > > algorithm, and when AE is disabled, the last set manual control values
> > > > are applied. This thus require an aeLock control to transition
> > > > seamlessly between AE and manual mode, as the application needs to set
> > > > the manual controls to the values applied to the sensor on the last
> > > > frame that AE was on, which couldn't otherwise be done due to the
> > > > internal delays.
> > >
> > > Thank you for the detailed explanation here.  I can now better understand
> > > the Android AE state transitions and why they may be needed.
> > >
> > > > The existing libcamera AE controls are ill-defined. There's a \todo
> > > > comment to document the interactions between AeEnable and setting a
> > > > fixed value for ExposureTime or AnalogueGain. I'm actually not sure what
> > > > happens if AeEnable is set to false when ExposureTime and/or
> > > > AnalogueGain are 0, or when AeEnable is set to true when ExposureTime
> > > > and/or AnalogueGain are not 0.
> > >
> > > To provide some brief context on the Raspberry Pi AGC, we never really
> > > go into a "disabled" state at all.  This is because the AGC algorithm always
> > > provides the shutter/gain values to pass to the sensor, even in manual mode
> > > (more on this below).  We use an internal Paused state to lock the shutter/gain
> > > values so that they can never change, which essentially mimics a "disabled"
> > > state.
> > >
> > > For manual operation, we pass shutter/gain ctrl values (either or both) to the AGC,
> > > and that is what the AGC requests off the sensor.  This happens regardless of
> > > whether the AGC is in it's internal paused state or not.  If the AGC is not in
> > > paused state, and we only provide one manual control value (e.g. shutter speed),
> > > then the AGC will vary the gain as it sees fit to converge to the desired
> > > brightness.  Setting a manual shutter/gain ctrl value back to 0 hands back control
> > > of the control to the AGC and it starts adapting it as necessary to converge to
> > > the desired brightness.
> > >
> > > This type of implementation does allow us to essentially do all that Android would
> > > except of the AGC, but does also allow us the flexibility to do Aperture/Shutter/Gain
> > > priority modes if needed.
> >
> > If I understand correctly, when the AeEnable control is set to false,
> > the AGC algorithm is paused, which just means that it sets the fixed
> > shutter and gain to the last computed values. Those are the values that
> > will then be applied to the sensor, until either the AeEnable control is
> > set back to true, or the manual gain or exposure time control is set to
> > a different value.
> >
> > This leads us to the first interesting behaviour: if the application
> > keeps queuing request that contain the AeEnable control set to false, as
> > well as manual exposure and/or gain values, in order to apply the manual
> > values as expected, the Agc::SetFixedShutter() and
> > Agc::SetFixedAnalogueGain() functions copy the fixed values to the
> > status_ structure, so that pausing the algorithm won't override them.
> > There's a bit of fragility here that I'm not sure to like, as it seems
> > fairly error-prone (not in the existing Raspberry Pi IPA implementation,
> > but from a general IPA module development point of view).
> >
> > Another interesting side effect is what happens if the application then
> > sets the exposure time and/or gain to 0 while keeping AeEnable to false.
> > The AGC algorithm will resume operation in that case. We can of course
> > argue that this is a corner case and that applications shouldn't do
> > this, but it's in any case something that needs to be documented.
> >
> > Looking at how to switch from auto to manual exposure seamlessly, an
> > application would need to go through the following sequence:
> >
> > - Set AeEnable to false in a request, without setting a manual gain or
> >   exposure.
> > - Wait until metadata reports AeEnable set to false (this isn't
> >   currently implemented in the Raspberry Pi IPA), to know that the
> >   setting has taken effect.
> > - Take the reported gain and exposure time from the request that
> >   reported AeEnable set to false, and use them as starting values for
> >   manual control of the AGC.
> >
> > On Android, the equivalent sequence is documented in
> > https://ideasonboard.org/android/docs.html#controls_android.control.aeLock.
> > The sequences are quite similar, the main difference being that when
> > Android calls aeLock is called AeEnable in libcamera. This causes
> > another issue: the AeEnable control needs to be reported in metadata,
> > but its value doesn't actually indicate whether the gain and exposure
> > time are controlled automatically or not. If an application sets
> > AeEnable to true but also sets manual gain and exposure time, metadata
> > will report AeEnable to true, but the AGC will be manually controlled.
> > This will also be difficult to document in a clear way I think.
> >
> > Another point that bothers me is that treating 0 as a magic value to
> > mean automatic control prevents from reporting a minimum value for the
> > exposure time and gain to applications. The Raspberry Pi IPA report the
> > AnalogueGain control as having a 1.0 minimum value, but when we'll
> > enforce bound checking in the libcamera core (as this really shouldn't
> > be duplicated by all IPAs), this will break.
> >
> > > > We need a formal model for how libcamera algorithms interact with manual
> > > > controls. It doesn't have to duplicate the Android model, but it has to
> > > > offer at least the same features. I believe we need to take it one step
> > > > further, as we want to support exposure priority and aperture priority.
> > > > I think we should actually talk about gain/sensitivity priority instead
> > > > of aperture priority for the existing feature, and already prepare for
> > > > devices that have a controllable lens aperture.
> > > >
> > > > Now, the hard question: how should libcamera model this ? :-) Naush,
> > > > David, your feedback from a RPi point of view would be very appreciated.
> > >
> > > It would be easy for me to say do it the Raspberry Pi way :-)
> >
> > I have no ideological opposition to that. What bothers me at the moment
> > is that the interactions between the controls related to AGC are not
> > well defined in the documentation, and based on the comments above, I
> > think it will be difficult to do so in a clear way, and avoid
> > error-prone behaviours. There's also the bound checking issue for the
> > gain (and exposure time, the lower limit happens to be 0 for the
> > Raspberry Pi IPA, but that won't be the case universally) that seems
> > difficult to fix.
> >
> > I fear that documenting the behaviour of the existing controls will be
> > difficult, and will result in unclear documentation. If we want to keep
> > the existing behaviour, I'd like to see documentation that shows that my
> > fears are unfounded.
> >
> > This being said, I'm not advocating for replicating the Android
> > controls, as, as discussed previously, they don't support implementing
> > shutter or gain priority, which is an important feature. We may however
> > decide to design a different semantics for the AE controls that would
> > support all our use cases and result in simpler documentation.
> >
> > > But in reality, I don't think the Android AE model gains us an advantage over our existing
> > > model.  And having to track one less state (locked vs disabled) is always a good thing.
> > >
> > > Explicitly defining AeModes to have (amongst others) ShutterPrioiry, GainPrioiry,
> > > AperturePriority and Auto modes would be useful though.  This avoids the messiness of
> > > setting 0 to hand back manual control back to the AGC?
> >
> > Once we'll get more than two parameters to control (adding aperture to
> > shutter and gain), I think this would become a bit messy, as there will
> > be 8 combinations of what can be controlled manually vs. automatically.
> >
> > > I'm sure David will also want to share his opinions on this, but he won't be able to
> > > respond for a few days.
> >
> > If David has time to share his opinion at some point, that would be
> > helpful.
> >
> > > > Please also let me know if the above explanation is clear or if you need
> > > > more information.
> > > >
> ...
Laurent Pinchart Aug. 31, 2021, 12:39 a.m. UTC | #9
Hi Paul and David,

David, thanks a lot for your feedback. It's very appreciated, as usual.

On Wed, Aug 25, 2021 at 07:20:56PM +0900, paul.elder@ideasonboard.com wrote:
> On Tue, Aug 24, 2021 at 04:04:47PM +0100, David Plowman wrote:
> > Hi Laurent
> > 
> > Sorry for the late reply, it seemed to get erroneously discarded by my
> > human level spam filter (i.e. myself)! Apologies for that. I've also

No need to apologize, I'd probably also want to file most of my mails in
a place where I wouldn't see them if I were to receive them ;-)

> > deleted anything with more than three "> > >" indents, I was rather
> > drowning - hope that doesn't confuse anyone!
> > 
> > Anyway, here are some of my thoughts on the matter. In the end I
> > decided to leave them at the top here, I'm not too sure how well they
> > fit into all the text below.
> > 
> > 1. I like the idea of being able to fix some but not all of the
> > AEC/AGC "levers" (e.g. gain but not exposure). Android seems not to
> > allow this? But I think it's essential.

Agreed, I think we need this. The upside is that we'd then implement a
superset of what Android requires, so Android support won't cause extra
work.

> > 2. We don't like unfixing these levers by writing a magic zero. There
> > needs to be another mechanism.

Happy to hear that you agree on this. Let's find a good API, and then we
can port the Raspberry Pi AGC algorithm to it.

> > 3. In the RPi code, we fix specific levers by writing a fixed value.
> > Yet there's no way to fix a single lever at the current value, which
> > is what happens to all the levers when we disable AEC/AGC. That feels
> > a bit inconsistent.
> > 
> > 4. I think point 1 above means that fixing all levers is a natural
> > state of the AEC/AGC algorithm, so there doesn't need to be any kind
> > of completely separate manual mode. My personal view is therefore that
> > libcamera possibly shouldn't emulate the Android behaviour here.
> 
> After a long meditation, I've reached these same points as well. I'm
> glad we agree on these points.
> 
> > So here's something that might work for everyone:
> > 
> > - Every lever has its own enable, assuming the implementation can support it.
> > 
> > - Disabling AEC/AGC globally is the same as disabling all the levers.
> > 
> > - Enabling a specific lever is obviously the new version of writing a
> > zero to it!
> > 
> > - When you disable a lever, it gets fixed to its current value.

This is the part that may be tricky, as explained by Paul below, to
implement a seamless transition.

> > - Or you can disable a lever by writing a specific value, which is
> > obviously then the value it gets.
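> > 
> > To make that concrete, here is a minimal sketch of what the application
> > side could look like. AnalogueGainAuto is a hypothetical per-lever
> > enable control, named here only for illustration; it doesn't exist
> > today:
> > 
> >     /* AnalogueGainAuto is hypothetical, only for illustration. */
> >     libcamera::ControlList &ctrls = request->controls();
> > 
> >     /* Fix the gain at whatever the AGC is currently using. */
> >     ctrls.set(controls::AnalogueGainAuto, false);
> > 
> >     /* Or fix it to an explicit value, which also disables the lever. */
> >     ctrls.set(controls::AnalogueGain, 4.0f);
> > 
> >     /* Hand the gain lever back to the AGC later on. */
> >     ctrls.set(controls::AnalogueGainAuto, true);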
> 
> And I'm working on a design that practically does all these things :)
> 
> > So I think that might cover all the use cases we have? I don't really
> > see the point of a separate "AeLock" control in libcamera itself, it
> > simply seems unnecessary (unless I've missed something). Nonetheless,
> > the Android view of the world can easily be implemented on top of
> > this, including the "lock" mechanism.
> 
> We need AeLock to transition seamlessly from auto to manual. If you set
> AE to off and, in the same request, feed the values that were last
> computed by AE, then the requests that were already in-flight would
> still have AE on, and the values might change. Then once your first
> "AE off" request gets in, it'll be a seamful change from auto to manual.
> 
> Instead, you set AeLock and wait for it to propagate through the
> pipeline. Once it has, you can copy those exposure values, and send a
> request with those + AE off. Thus you can have seamless switching from
> auto to off.
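> 
> As a rough sketch (only illustrative: AeLock and AeState come from this
> series, request and next are Requests the application has created, and
> waitForCompleted() stands in for however it collects completed requests;
> error handling is omitted):
> 
>     /* 1. Ask the AE algorithm to freeze its output. */
>     request->controls().set(controls::AeLock, true);
>     camera->queueRequest(request.get());
> 
>     /* 2. Wait until a completed request reports that the lock has taken
>      * effect (waitForCompleted() is a placeholder). */
>     Request *done = waitForCompleted();
>     while (done->metadata().get(controls::AeState) != controls::AeStateLocked)
>             done = waitForCompleted();
> 
>     /* 3. Copy the frozen values and switch to fully manual control. */
>     int32_t exposure = done->metadata().get(controls::ExposureTime);
>     float gain = done->metadata().get(controls::AnalogueGain);
> 
>     next->controls().set(controls::AeEnable, false);
>     next->controls().set(controls::ExposureTime, exposure);
>     next->controls().set(controls::AnalogueGain, gain);
>     camera->queueRequest(next.get());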
> 
> I'm thinking of having some sort of lock for each of the "sub-levers"
> (as you call them; I call them "sub-controls"). But I need to figure out
> and confirm that the flow will work.

Sounds good to me. I'll now review your proposal :-)

> > I expect non-Android applications will prefer to disable levers
> > without specifying a value so that you just pick up the current one.
> > To me this feels like the most usual situation.
> > 
> > But I'm always happy to be corrected or persuaded otherwise!
> > 
> > On Mon, 16 Aug 2021 at 03:07, Laurent Pinchart wrote:
> > > On Mon, Jul 12, 2021 at 03:50:29PM +0100, Naushir Patuck wrote:
> > > > On Sun, 11 Jul 2021 at 23:37, Laurent Pinchart wrote:
> > > > > On Mon, Jul 05, 2021 at 06:34:07PM +0100, Naushir Patuck wrote:
> > ...
> > > > >
> > > > > Is it ? I thought it should be AeStateConverged.
> > > > >
> > ...
> > > > >
> > > > > Please see below.
> > > > >
> > ...
> > > > >
> > > > > First of all, let me clarify the goal. The libcamera native API needs to
> > > > > offer the features required to implement the Android camera HAL API, but
> > > > > we don't need to just duplicate it. When there are reasons to depart
> > > > > (for instance to fix historical mistakes made in Android, to provide
> > > > > additional features, or just to expose an API we consider more
> > > > > consistent), we should do so.
> > > > >
> > > > > Then, a word about the process. Implementing support for an Android
> > > > > control requires plumbing it through the camera stack: definition of a
> > > > > libcamera control, translation between the Android and libcamera control
> > > > > in the Android camera HAL, implementation of the libcamera control in
> > > > > the pipeline handlers, and, when applicable, in the IPAs. Defining
> > > > > libcamera controls, as any API design effort, takes time. To allow
> > > > > moving forward on all the other parts, we have a fast-track process to
> > > > > define draft controls that mimic the Android controls. All the draft
> > > > > controls will be redesigned before freezing the libcamera public API.
> > > > >
> > > > > This means that AeLock and AeState should be marked as draft controls as
> > > > > part of this patch.
> > > > >
> > > > > This being said, we don't need to define controls as draft controls if
> > > > > we want to already go through a complete design process. The discussion
> > > > > that stemmed from this patch series goes in that direction.
> > > > >
> > > > > It's useful, in this context, to understand why the Android AE (and AWB)
> > > > > controls have been designed this way. Conceptually speaking, the
> > > > > algorithms are considered, in Android, to be one level above manual
> > > > > controls.
> > > > >
> > > > >  Controls                                                        Metadata
> > > > >              /---------- == OFF ---------\
> > > > >              |                           v
> > > > > aeMode      -/             +------+     |\
> > > > > aeLock                 --> |  AE  | --> |0\     +--------+
> > > > > aeExposureCompensation     +------+     | | --> | Sensor | --> exposureTime
> > > > >                                         | |     +--------+     sensitivity
> > > > > exposureTime   -----------------------> |1/
> > > > > sensitivity                             |/
> > > > >
> > > > > (The AE controls are in the android.control class, while the sensor
> > > > > manual controls are in the android.sensor class.)
> > > > >
> > > > > The aeMode control selects whether the AE algorithm (when != OFF) or the
> > > > > manual controls (when == OFF) drive the sensor. The aeLock control tells
> > > > > the AE algorithm to freeze its output (when ON) or operate normally
> > > > > (when OFF). The metadata always report the values applied to the sensor,
> > > > > regardless of whether they come from the AE algorithm or from manual
> > > > > controls.
> > > > >
> > > > > One important difference, compared to the model we currently implement,
> > > > > is that the Android AE algorithm is independent from the manual sensor
> > > > > controls. The values of the manual controls aren't affected by the AE
> > > > > algorithm, and when AE is disabled, the last set manual control values
> > > > > are applied. This thus requires an aeLock control to transition
> > > > > seamlessly between AE and manual mode, as the application needs to set
> > > > > the manual controls to the values applied to the sensor on the last
> > > > > frame that AE was on, which couldn't otherwise be done due to the
> > > > > internal delays.
> > > >
> > > > Thank you for the detailed explanation here.  I can now better understand
> > > > the Android AE state transitions and why they may be needed.
> > > >
> > > > > The existing libcamera AE controls are ill-defined. There's a \todo
> > > > > comment to document the interactions between AeEnable and setting a
> > > > > fixed value for ExposureTime or AnalogueGain. I'm actually not sure what
> > > > > happens if AeEnable is set to false when ExposureTime and/or
> > > > > AnalogueGain are 0, or when AeEnable is set to true when ExposureTime
> > > > > and/or AnalogueGain are not 0.
> > > >
> > > > To provide some brief context on the Raspberry Pi AGC, we never really
> > > > go into a "disabled" state at all.  This is because the AGC algorithm always
> > > > provides the shutter/gain values to pass to the sensor, even in manual mode
> > > > (more on this below).  We use an internal Paused state to lock the shutter/gain
> > > > values so that they can never change, which essentially mimics a "disabled"
> > > > state.
> > > >
> > > > For manual operation, we pass shutter/gain ctrl values (either or both) to the AGC,
> > > > and that is what the AGC requests off the sensor.  This happens regardless of
> > > > whether the AGC is in its internal paused state or not.  If the AGC is not in
> > > > paused state, and we only provide one manual control value (e.g. shutter speed),
> > > > then the AGC will vary the gain as it sees fit to converge to the desired
> > > > brightness.  Setting a manual shutter/gain ctrl value back to 0 hands control
> > > > of that parameter back to the AGC, which starts adapting it as necessary to
> > > > converge to the desired brightness.
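> > > > 
> > > > As an illustration (values picked arbitrarily, request being the
> > > > application's Request), shutter priority with the current semantics
> > > > is roughly:
> > > > 
> > > >     /* Fix the shutter, let the AGC keep adjusting the gain. */
> > > >     request->controls().set(controls::ExposureTime, 10000); /* 10 ms */
> > > >     request->controls().set(controls::AnalogueGain, 0.0f);  /* 0 = auto */
> > > > 
> > > >     /* Later, hand the shutter back to the AGC as well. */
> > > >     request->controls().set(controls::ExposureTime, 0);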
> > > >
> > > > This type of implementation does allow us to essentially do all that Android would
> > > > expect of the AGC, but also allows us the flexibility to do Aperture/Shutter/Gain
> > > > priority modes if needed.
> > >
> > > If I understand correctly, when the AeEnable control is set to false,
> > > the AGC algorithm is paused, which just means that it sets the fixed
> > > shutter and gain to the last computed values. Those are the values that
> > > will then be applied to the sensor, until either the AeEnable control is
> > > set back to true, or the manual gain or exposure time control is set to
> > > a different value.
> > >
> > > This leads us to the first interesting behaviour: if the application
> > > keeps queuing requests that contain the AeEnable control set to false, as
> > > well as manual exposure and/or gain values, in order to apply the manual
> > > values as expected, the Agc::SetFixedShutter() and
> > > Agc::SetFixedAnalogueGain() functions copy the fixed values to the
> > > status_ structure, so that pausing the algorithm won't override them.
> > > There's a bit of fragility here that I'm not sure to like, as it seems
> > > fairly error-prone (not in the existing Raspberry Pi IPA implementation,
> > > but from a general IPA module development point of view).
> > >
> > > Another interesting side effect is what happens if the application then
> > > sets the exposure time and/or gain to 0 while keeping AeEnable to false.
> > > The AGC algorithm will resume operation in that case. We can of course
> > > argue that this is a corner case and that applications shouldn't do
> > > this, but it's in any case something that needs to be documented.
> > >
> > > Looking at how to switch from auto to manual exposure seamlessly, an
> > > application would need to go through the following sequence:
> > >
> > > - Set AeEnable to false in a request, without setting a manual gain or
> > >   exposure.
> > > - Wait until metadata reports AeEnable set to false (this isn't
> > >   currently implemented in the Raspberry Pi IPA), to know that the
> > >   setting has taken effect.
> > > - Take the reported gain and exposure time from the request that
> > >   reported AeEnable set to false, and use them as starting values for
> > >   manual control of the AGC.
> > >
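> > > Expressed as code, that sequence would look roughly as follows. This
> > > is only a sketch: waitForCompleted() is a placeholder for the
> > > application's completion handling, and the metadata reporting of
> > > AeEnable mentioned above is assumed to exist:
> > >
> > >     /* Pause the AGC without giving it manual values. */
> > >     request->controls().set(controls::AeEnable, false);
> > >     camera->queueRequest(request.get());
> > >
> > >     /* Wait for a result that reports AeEnable == false. */
> > >     Request *done = waitForCompleted();
> > >     while (done->metadata().get(controls::AeEnable))
> > >             done = waitForCompleted();
> > >
> > >     /* Use the reported values as the start of manual AGC control. */
> > >     int32_t exposure = done->metadata().get(controls::ExposureTime);
> > >     float gain = done->metadata().get(controls::AnalogueGain);
> > >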
> > > On Android, the equivalent sequence is documented in
> > > https://ideasonboard.org/android/docs.html#controls_android.control.aeLock.
> > > The sequences are quite similar, the main difference being that what
> > > Android calls aeLock is called AeEnable in libcamera. This causes
> > > another issue: the AeEnable control needs to be reported in metadata,
> > > but its value doesn't actually indicate whether the gain and exposure
> > > time are controlled automatically or not. If an application sets
> > > AeEnable to true but also sets manual gain and exposure time, metadata
> > > will report AeEnable to true, but the AGC will be manually controlled.
> > > This will also be difficult to document in a clear way I think.
> > >
> > > Another point that bothers me is that treating 0 as a magic value to
> > > mean automatic control prevents us from reporting a minimum value for the
> > > exposure time and gain to applications. The Raspberry Pi IPA reports the
> > > AnalogueGain control as having a 1.0 minimum value, but when we
> > > enforce bound checking in the libcamera core (as this really shouldn't
> > > be duplicated by all IPAs), this will break.
> > >
> > > > > We need a formal model for how libcamera algorithms interact with manual
> > > > > controls. It doesn't have to duplicate the Android model, but it has to
> > > > > offer at least the same features. I believe we need to take it one step
> > > > > further, as we want to support exposure priority and aperture priority.
> > > > > I think we should actually talk about gain/sensitivity priority instead
> > > > > of aperture priority for the existing feature, and already prepare for
> > > > > devices that have a controllable lens aperture.
> > > > >
> > > > > Now, the hard question: how should libcamera model this ? :-) Naush,
> > > > > David, your feedback from a RPi point of view would be very appreciated.
> > > >
> > > > It would be easy for me to say do it the Raspberry Pi way :-)
> > >
> > > I have no ideological opposition to that. What bothers me at the moment
> > > is that the interactions between the controls related to AGC are not
> > > well defined in the documentation, and based on the comments above, I
> > > think it will be difficult to do so in a clear way, and avoid
> > > error-prone behaviours. There's also the bound checking issue for the
> > > gain (and exposure time, the lower limit happens to be 0 for the
> > > Raspberry Pi IPA, but that won't be the case universally) that seems
> > > difficult to fix.
> > >
> > > I fear that documenting the behaviour of the existing controls will be
> > > difficult, and will result in unclear documentation. If we want to keep
> > > the existing behaviour, I'd like to see documentation that shows that my
> > > fears are unfounded.
> > >
> > > This being said, I'm not advocating for replicating the Android
> > > controls, as, as discussed previously, they don't support implementing
> > > shutter or gain priority, which is an important feature. We may however
> > > decide to design a different semantics for the AE controls that would
> > > support all our use cases and result in simpler documentation.
> > >
> > > > But in reality, I don't think the Android AE model gains us an advantage over our existing
> > > > model.  And having to track one less state (locked vs disabled) is always a good thing.
> > > >
> > > > Explicitly defining AeModes to have (amongst others) ShutterPriority, GainPriority,
> > > > AperturePriority and Auto modes would be useful though.  This avoids the messiness of
> > > > setting 0 to hand manual control back to the AGC?
> > >
> > > Once we get more than two parameters to control (adding aperture to
> > > shutter and gain), I think this would become a bit messy, as there will
> > > be 8 combinations of what can be controlled manually vs. automatically.
> > >
> > > > I'm sure David will also want to share his opinions on this, but he won't be able to
> > > > respond for a few days.
> > >
> > > If David has time to share his opinion at some point, that would be
> > > helpful.
> > >
> > > > > Please also let me know if the above explanation is clear or if you need
> > > > > more information.
> > > > >
> > ...

Patch
diff mbox series

diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp
index 4d09a84f..4981aa29 100644
--- a/src/ipa/raspberrypi/raspberrypi.cpp
+++ b/src/ipa/raspberrypi/raspberrypi.cpp
@@ -469,7 +469,10 @@  void IPARPi::reportMetadata()
 
 	AgcStatus *agcStatus = rpiMetadata_.GetLocked<AgcStatus>("agc.status");
 	if (agcStatus) {
-		libcameraMetadata_.set(controls::AeLocked, agcStatus->locked);
+		libcameraMetadata_.set(controls::AeState,
+				       agcStatus->locked ?
+				       controls::AeStateLocked :
+				       controls::AeStateSearching);
 		libcameraMetadata_.set(controls::DigitalGain, agcStatus->digital_gain);
 	}
 
diff --git a/src/ipa/rkisp1/rkisp1.cpp b/src/ipa/rkisp1/rkisp1.cpp
index cdfb4d13..4eca26e2 100644
--- a/src/ipa/rkisp1/rkisp1.cpp
+++ b/src/ipa/rkisp1/rkisp1.cpp
@@ -51,7 +51,7 @@  private:
 			      const rkisp1_stat_buffer *stats);
 
 	void setControls(unsigned int frame);
-	void metadataReady(unsigned int frame, unsigned int aeState);
+	void metadataReady(unsigned int frame, int aeState);
 
 	std::map<unsigned int, FrameBuffer> buffers_;
 	std::map<unsigned int, void *> buffersMemory_;
@@ -227,7 +227,7 @@  void IPARkISP1::updateStatistics(unsigned int frame,
 				 const rkisp1_stat_buffer *stats)
 {
 	const rkisp1_cif_isp_stat *params = &stats->params;
-	unsigned int aeState = 0;
+	int aeState = controls::AeStateInactive;
 
 	if (stats->meas_type & RKISP1_CIF_ISP_STAT_AUTOEXP) {
 		const rkisp1_cif_isp_ae_stat *ae = &params->ae;
@@ -262,7 +262,9 @@  void IPARkISP1::updateStatistics(unsigned int frame,
 			setControls(frame + 1);
 		}
 
-		aeState = fabs(factor - 1.0f) < 0.05f ? 2 : 1;
+		aeState = fabs(factor - 1.0f) < 0.05f ?
+			  controls::AeStateConverged :
+			  controls::AeStateSearching;
 	}
 
 	metadataReady(frame, aeState);
@@ -281,12 +283,11 @@  void IPARkISP1::setControls(unsigned int frame)
 	queueFrameAction.emit(frame, op);
 }
 
-void IPARkISP1::metadataReady(unsigned int frame, unsigned int aeState)
+void IPARkISP1::metadataReady(unsigned int frame, int aeState)
 {
 	ControlList ctrls(controls::controls);
 
-	if (aeState)
-		ctrls.set(controls::AeLocked, aeState == 2);
+	ctrls.set(controls::AeState, aeState);
 
 	RkISP1Action op;
 	op.op = ActionMetadata;
diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
index 9d4638ae..5717bc1f 100644
--- a/src/libcamera/control_ids.yaml
+++ b/src/libcamera/control_ids.yaml
@@ -14,16 +14,70 @@  controls:
 
         \sa ExposureTime AnalogueGain
 
-  - AeLocked:
+  - AeLock:
       type: bool
       description: |
-        Report the lock status of a running AE algorithm.
+        Control to lock the AE values.
+        When set to true, the AE algorithm is locked to its latest parameters,
+        and will not change exposure settings until set to false.
+        \sa AeState
 
-        If the AE algorithm is locked the value shall be set to true, if it's
-        converging it shall be set to false. If the AE algorithm is not
-        running the control shall not be present in the metadata control list.
+  - AeState:
+      type: int32_t
+      description: |
+        Control to report the current AE algorithm state. Enabling or disabling
+        AE (AeEnable) always resets the AeState to AeStateInactive. The camera
+        device can do several state transitions between two results, if it is
+        allowed by the state transition table. For example, AeStateInactive may
+        never actually be seen in a result.
 
-        \sa AeEnable
+        The state in the result is the state for this image (in sync with this
+        image). If AE state becomes AeStateConverged, then the image data
+        associated with the result should be good to use.
+
+        The state transitions mentioned below assume that AeEnable is on.
+
+        \sa AeLock
+      enum:
+        - name: AeStateInactive
+          value: 0
+          description: |
+            The AE algorithm is inactive.
+            If the camera initiates an AE scan, the state shall go to Searching.
+            If AeLock is on, the state shall go to Locked.
+        - name: AeStateSearching
+          value: 1
+          description: |
+            The AE algorithm has not converged yet.
+            If the camera finishes an AE scan, the state shall go to Converged.
+            If the camera finishes an AE scan, but flash is required, the state
+            shall go to FlashRequired.
+            If AeLock is on, the state shall go to Locked.
+        - name: AeStateConverged
+          value: 2
+          description: |
+            The AE algorithm has converged.
+            If the camera initiates an AE scan, the state shall go to Searching.
+            If AeLock is on, the state shall go to Locked.
+        - name: AeStateLocked
+          value: 3
+          description: |
+            The AE algorithm is locked.
+            If AeLock is off, the state can go to Searching, Converged, or
+            FlashRequired.
+        - name: AeStateFlashRequired
+          value: 4
+          description: |
+            The AE algorithm would need a flash for good results
+            If the camera initiates an AE scan, the state shall go to Searching.
+            If AeLock is on, the state shall go to Locked.
+        - name: AeStatePrecapture
+          value: 5
+          description: |
+            The AE algorithm has started a pre-capture metering session.
+            After the sequence is finished, the state shall go to Converged if
+            AeLock is off, and Locked if it is on.
+            \sa AePrecaptureTrigger
 
   # AeMeteringMode needs further attention:
   # - Auto-generate max enum value.
@@ -477,36 +531,6 @@  controls:
             High quality aberration correction which might reduce the frame
             rate.
 
-  - AeState:
-      type: int32_t
-      draft: true
-      description: |
-       Control to report the current AE algorithm state. Currently identical to
-       ANDROID_CONTROL_AE_STATE.
-
-        Current state of the AE algorithm.
-      enum:
-        - name: AeStateInactive
-          value: 0
-          description: The AE algorithm is inactive.
-        - name: AeStateSearching
-          value: 1
-          description: The AE algorithm has not converged yet.
-        - name: AeStateConverged
-          value: 2
-          description: The AE algorithm has converged.
-        - name: AeStateLocked
-          value: 3
-          description: The AE algorithm is locked.
-        - name: AeStateFlashRequired
-          value: 4
-          description: The AE algorithm would need a flash for good results
-        - name: AeStatePrecapture
-          value: 5
-          description: |
-            The AE algorithm has started a pre-capture metering session.
-            \sa AePrecaptureTrigger
-
   - AfState:
       type: int32_t
       draft: true