[04/10] ipa: libipa: Add MeanLuminanceAgc base class
diff mbox series

Message ID 20240322131451.3092931-5-dan.scally@ideasonboard.com
State New
Headers show
Series
  • Centralise Agc into libipa
Related show

Commit Message

Daniel Scally March 22, 2024, 1:14 p.m. UTC
The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
very large part; following the Rpi IPA's algorithm in spirit with a
few tunable values in that IPA being hardcoded in the libipa ones.
Add a new base class for MeanLuminanceAgc which implements the same
algorithm and additionally parses yaml tuning files to inform an IPA
module's Agc algorithm about valid constraint and exposure modes and
their associated bounds.

Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
---
 src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
 src/ipa/libipa/agc.h       |  82 ++++++
 src/ipa/libipa/meson.build |   2 +
 3 files changed, 610 insertions(+)
 create mode 100644 src/ipa/libipa/agc.cpp
 create mode 100644 src/ipa/libipa/agc.h

Comments

Jacopo Mondi March 25, 2024, 8:16 p.m. UTC | #1
Hi Dan

On Fri, Mar 22, 2024 at 01:14:45PM +0000, Daniel Scally wrote:
> The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
> very large part; following the Rpi IPA's algorithm in spirit with a
> few tunable values in that IPA being hardcoded in the libipa ones.
> Add a new base class for MeanLuminanceAgc which implements the same

nit: I would rather call this one AgcMeanLuminance.

One other note, not sure if applicable here, from the experience of
trying to upstream an auto-focus algorithm for RkISP1. We had there a
base class to define the algorithm interface, one derived class for
the common calculation and one platform-specific part for statistics
collection and IPA module interfacing.

The base class was there mostly to handle the algorithm state machine
by handling the controls that influence the algorithm behaviour. In
case of AEGC I can think, in example, about handling the switch
between enable/disable of auto-mode (and consequentially handling a
manually set ExposureTime and AnalogueGain), switching between
different ExposureModes etc

This is the last attempt for AF
https://patchwork.libcamera.org/patch/18510/

and I'm wondering if it would be desirable to abstract away from the
MeanLuminance part the part that is tightly coupled with the
libcamera's controls definition.

Let's make a concrete example: look how the rkisp1 and the ipu3 agc
implementations handle AeEnabled (or better, look at how the rkisp1
does that and the IPU3 does not).

Ideally, my goal would be to abstract the handling of the control and
all the state machine that decides if the manual or auto-computed
values should be used to an AEGCAlgorithm base class.

Your series does that for the tuning file parsing already but does
that in the MeanLuminance method implementation, while it should or
could be common to all AEGC methods.

One very interesting experiment could be starting with this and then
plumb-in AeEnable support in the IPU3 in example and move everything
common to a base class.

> algorithm and additionally parses yaml tuning files to inform an IPA
> module's Agc algorithm about valid constraint and exposure modes and
> their associated bounds.
>
> Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
> ---
>  src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
>  src/ipa/libipa/agc.h       |  82 ++++++
>  src/ipa/libipa/meson.build |   2 +
>  3 files changed, 610 insertions(+)
>  create mode 100644 src/ipa/libipa/agc.cpp
>  create mode 100644 src/ipa/libipa/agc.h
>
> diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
> new file mode 100644
> index 00000000..af57a571
> --- /dev/null
> +++ b/src/ipa/libipa/agc.cpp
> @@ -0,0 +1,526 @@
> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> +/*
> + * Copyright (C) 2024 Ideas on Board Oy
> + *
> + * agc.cpp - Base class for libipa-compliant AGC algorithms
> + */
> +
> +#include "agc.h"
> +
> +#include <cmath>
> +
> +#include <libcamera/base/log.h>
> +#include <libcamera/control_ids.h>
> +
> +#include "exposure_mode_helper.h"
> +
> +using namespace libcamera::controls;
> +
> +/**
> + * \file agc.h
> + * \brief Base class implementing mean luminance AEGC.

nit: not '.' at the end of briefs

> + */
> +
> +namespace libcamera {
> +
> +using namespace std::literals::chrono_literals;
> +
> +LOG_DEFINE_CATEGORY(Agc)
> +
> +namespace ipa {
> +
> +/*
> + * Number of frames to wait before calculating stats on minimum exposure
> + * \todo should this be a tunable value?

Does this depend on the ISP (comes from IPA), the sensor (comes from
tuning file) or... both ? :)

> + */
> +static constexpr uint32_t kNumStartupFrames = 10;
> +
> +/*
> + * Default relative luminance target
> + *
> + * This value should be chosen so that when the camera points at a grey target,
> + * the resulting image brightness looks "right". Custom values can be passed
> + * as the relativeLuminanceTarget value in sensor tuning files.
> + */
> +static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
> +
> +/**
> + * \struct MeanLuminanceAgc::AgcConstraint
> + * \brief The boundaries and target for an AeConstraintMode constraint
> + *
> + * This structure describes an AeConstraintMode constraint for the purposes of
> + * this algorithm. The algorithm will apply the constraints by calculating the
> + * Histogram's inter-quantile mean between the given quantiles and ensure that
> + * the resulting value is the right side of the given target (as defined by the
> + * boundary and luminance target).
> + */

Here, in example.

controls::AeConstraintMode and the supported values are defined as
(core|vendor) controls in control_ids_*.yaml

The tuning file expresses the constraint modes using the Control
definition (I wonder if this has always been like this) but it
definitely ties the tuning file to the controls definition.

Applications use controls::AeConstraintMode to select one of the
constrained modes to have the algorithm use it.

In all of this, how much is part of the MeanLuminance implementation
and how much shared between possibly multiple implementations ?

> +
> +/**
> + * \enum MeanLuminanceAgc::AgcConstraint::Bound
> + * \brief Specify whether the constraint defines a lower or upper bound
> + * \var MeanLuminanceAgc::AgcConstraint::LOWER
> + * \brief The constraint defines a lower bound
> + * \var MeanLuminanceAgc::AgcConstraint::UPPER
> + * \brief The constraint defines an upper bound
> + */
> +
> +/**
> + * \var MeanLuminanceAgc::AgcConstraint::bound
> + * \brief The type of constraint bound
> + */
> +
> +/**
> + * \var MeanLuminanceAgc::AgcConstraint::qLo
> + * \brief The lower quantile to use for the constraint
> + */
> +
> +/**
> + * \var MeanLuminanceAgc::AgcConstraint::qHi
> + * \brief The upper quantile to use for the constraint
> + */
> +
> +/**
> + * \var MeanLuminanceAgc::AgcConstraint::yTarget
> + * \brief The luminance target for the constraint
> + */
> +
> +/**
> + * \class MeanLuminanceAgc
> + * \brief a mean-based auto-exposure algorithm
> + *
> + * This algorithm calculates a shutter time, analogue and digital gain such that
> + * the normalised mean luminance value of an image is driven towards a target,
> + * which itself is discovered from tuning data. The algorithm is a two-stage
> + * process:
> + *
> + * In the first stage, an initial gain value is derived by iteratively comparing
> + * the gain-adjusted mean luminance across an entire image against a target, and
> + * selecting a value which pushes it as closely as possible towards the target.
> + *
> + * In the second stage we calculate the gain required to drive the average of a
> + * section of a histogram to a target value, where the target and the boundaries
> + * of the section of the histogram used in the calculation are taken from the
> + * values defined for the currently configured AeConstraintMode within the
> + * tuning data. The gain from the first stage is then clamped to the gain from
> + * this stage.
> + *
> + * The final gain is used to adjust the effective exposure value of the image,
> + * and that new exposure value divided into shutter time, analogue gain and
> + * digital gain according to the selected AeExposureMode.
> + */
> +
> +MeanLuminanceAgc::MeanLuminanceAgc()
> +	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
> +{
> +}
> +
> +/**
> + * \brief Parse the relative luminance target from the tuning data
> + * \param[in] tuningData The YamlObject holding the algorithm's tuning data
> + */
> +void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
> +{
> +	relativeLuminanceTarget_ =
> +		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);

How do you expect this to be computed in the tuning file ?

> +}
> +
> +/**
> + * \brief Parse an AeConstraintMode constraint from tuning data
> + * \param[in] modeDict the YamlObject holding the constraint data
> + * \param[in] id The constraint ID from AeConstraintModeEnum
> + */
> +void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
> +{
> +	for (const auto &[boundName, content] : modeDict.asDict()) {
> +		if (boundName != "upper" && boundName != "lower") {
> +			LOG(Agc, Warning)
> +				<< "Ignoring unknown constraint bound '" << boundName << "'";
> +			continue;
> +		}
> +
> +		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
> +		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
> +		double qLo = content["qLo"].get<double>().value_or(0.98);
> +		double qHi = content["qHi"].get<double>().value_or(1.0);
> +		double yTarget =
> +			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);
> +
> +		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
> +
> +		if (!constraintModes_.count(id))
> +			constraintModes_[id] = {};
> +
> +		if (idx)
> +			constraintModes_[id].push_back(constraint);
> +		else
> +			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
> +	}
> +}
> +
> +/**
> + * \brief Parse tuning data file to populate AeConstraintMode control
> + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> + *
> + * The Agc algorithm's tuning data should contain a dictionary called
> + * AeConstraintMode containing per-mode setting dictionaries with the key being
> + * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
> + * contain either a "lower" or "upper" key, or both, in this format:
> + *
> + * \code{.unparsed}
> + * algorithms:
> + *   - Agc:
> + *       AeConstraintMode:
> + *         ConstraintNormal:
> + *           lower:
> + *             qLo: 0.98
> + *             qHi: 1.0
> + *             yTarget: 0.5

Ok, so this ties the tuning file not just to the libcamera controls
definition, but to this specific implementation of the algorithm ?

Not that it was not expected, and I think it's fine, as using a
libcamera defined control value as 'index' makes sure the applications
will deal with the same interface, but this largely conflicts with the
idea to have shared parsing for all algorithms in a base class.

Also, we made AeConstrainModes a 'core control' because at the time
when RPi first implemented AGC support there was no alternative to
that. Then the RPi implementation has been copied in all other
platforms and this is still fine as a 'core control'. This however
seems to be a tuning parameter for this specific algorithm
implementation, isn't it ?

> + *         ConstraintHighlight:
> + *           lower:
> + *             qLo: 0.98
> + *             qHi: 1.0
> + *             yTarget: 0.5
> + *           upper:
> + *             qLo: 0.98
> + *             qHi: 1.0
> + *             yTarget: 0.8
> + *
> + * \endcode
> + *
> + * The parsed dictionaries are used to populate an array of available values for
> + * the AeConstraintMode control and stored for later use in the algorithm.
> + *
> + * \return -EINVAL Where a defined constraint mode is invalid
> + * \return 0 on success
> + */
> +int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
> +{
> +	std::vector<ControlValue> availableConstraintModes;
> +
> +	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
> +	if (yamlConstraintModes.isDictionary()) {
> +		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
> +			if (AeConstraintModeNameValueMap.find(modeName) ==
> +			    AeConstraintModeNameValueMap.end()) {
> +				LOG(Agc, Warning)
> +					<< "Skipping unknown constraint mode '" << modeName << "'";
> +				continue;
> +			}
> +
> +			if (!modeDict.isDictionary()) {
> +				LOG(Agc, Error)
> +					<< "Invalid constraint mode '" << modeName << "'";
> +				return -EINVAL;
> +			}
> +
> +			parseConstraint(modeDict,
> +					AeConstraintModeNameValueMap.at(modeName));
> +			availableConstraintModes.push_back(
> +				AeConstraintModeNameValueMap.at(modeName));
> +		}
> +	}
> +
> +	/*
> +	 * If the tuning data file contains no constraints then we use the
> +	 * default constraint that the various Agc algorithms were adhering to
> +	 * anyway before centralisation.
> +	 */
> +	if (constraintModes_.empty()) {
> +		AgcConstraint constraint = {
> +			AgcConstraint::Bound::LOWER,
> +			0.98,
> +			1.0,
> +			0.5
> +		};
> +
> +		constraintModes_[controls::ConstraintNormal].insert(
> +			constraintModes_[controls::ConstraintNormal].begin(),
> +			constraint);
> +		availableConstraintModes.push_back(
> +			AeConstraintModeNameValueMap.at("ConstraintNormal"));
> +	}
> +
> +	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
> +
> +	return 0;
> +}
> +
> +/**
> + * \brief Parse tuning data file to populate AeExposureMode control
> + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> + *
> + * The Agc algorithm's tuning data should contain a dictionary called
> + * AeExposureMode containing per-mode setting dictionaries with the key being
> + * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
> + * contain an array of shutter times with the key "shutter" and an array of gain
> + * values with the key "gain", in this format:
> + *
> + * \code{.unparsed}
> + * algorithms:
> + *   - Agc:
> + *       AeExposureMode:

Same reasoning as per the constraints, really.

There's nothing bad here, if not me realizing our controls definition
is intimately tied with the algorithms implementation, so I wonder
again if even handling things like AeEnable in a common form makes
any sense..

Not going to review the actual implementation now, as it comes from
the existing ones...


> + *         ExposureNormal:
> + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> + *         ExposureShort:
> + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> + *
> + * \endcode
> + *
> + * The parsed dictionaries are used to populate an array of available values for
> + * the AeExposureMode control and to create ExposureModeHelpers
> + *
> + * \return -EINVAL Where a defined constraint mode is invalid
> + * \return 0 on success
> + */
> +int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
> +{
> +	std::vector<ControlValue> availableExposureModes;
> +	int ret;
> +
> +	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
> +	if (yamlExposureModes.isDictionary()) {
> +		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
> +			if (AeExposureModeNameValueMap.find(modeName) ==
> +			    AeExposureModeNameValueMap.end()) {
> +				LOG(Agc, Warning)
> +					<< "Skipping unknown exposure mode '" << modeName << "'";
> +				continue;
> +			}
> +
> +			if (!modeValues.isDictionary()) {
> +				LOG(Agc, Error)
> +					<< "Invalid exposure mode '" << modeName << "'";
> +				return -EINVAL;
> +			}
> +
> +			std::vector<uint32_t> shutters =
> +				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
> +			std::vector<double> gains =
> +				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
> +
> +			std::vector<utils::Duration> shutterDurations;
> +			std::transform(shutters.begin(), shutters.end(),
> +				std::back_inserter(shutterDurations),
> +				[](uint32_t time) { return std::chrono::microseconds(time); });
> +
> +			std::shared_ptr<ExposureModeHelper> helper =
> +				std::make_shared<ExposureModeHelper>();
> +			if ((ret = helper->init(shutterDurations, gains) < 0)) {
> +				LOG(Agc, Error)
> +					<< "Failed to parse exposure mode '" << modeName << "'";
> +				return ret;
> +			}
> +
> +			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
> +			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
> +		}
> +	}
> +
> +	/*
> +	 * If we don't have any exposure modes in the tuning data we create an
> +	 * ExposureModeHelper using empty shutter time and gain arrays, which
> +	 * will then internally simply drive the shutter as high as possible
> +	 * before touching gain
> +	 */
> +	if (availableExposureModes.empty()) {
> +		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
> +		std::vector<utils::Duration> shutterDurations = {};
> +		std::vector<double> gains = {};
> +
> +		std::shared_ptr<ExposureModeHelper> helper =
> +			std::make_shared<ExposureModeHelper>();
> +		if ((ret = helper->init(shutterDurations, gains) < 0)) {
> +			LOG(Agc, Error)
> +				<< "Failed to create default ExposureModeHelper";
> +			return ret;
> +		}
> +
> +		exposureModeHelpers_[exposureModeId] = helper;
> +		availableExposureModes.push_back(exposureModeId);
> +	}
> +
> +	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
> +
> +	return 0;
> +}
> +
> +/**
> + * \fn MeanLuminanceAgc::constraintModes()
> + * \brief Get the constraint modes that have been parsed from tuning data
> + */
> +
> +/**
> + * \fn MeanLuminanceAgc::exposureModeHelpers()
> + * \brief Get the ExposureModeHelpers that have been parsed from tuning data
> + */
> +
> +/**
> + * \fn MeanLuminanceAgc::controls()
> + * \brief Get the controls that have been generated after parsing tuning data
> + */
> +
> +/**
> + * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
> + * \brief Estimate the luminance of an image, adjusted by a given gain
> + * \param[in] gain The gain with which to adjust the luminance estimate
> + *
> + * This function is a pure virtual function because estimation of luminance is a
> + * hardware-specific operation, which depends wholly on the format of the stats
> + * that are delivered to libcamera from the ISP. Derived classes must implement
> + * an overriding function that calculates the normalised mean luminance value
> + * across the entire image.
> + *
> + * \return The normalised relative luminance of the image
> + */
> +
> +/**
> + * \brief Estimate the initial gain needed to achieve a relative luminance
> + *        target

nit: we don't usually indent in briefs, or in general when breaking
lines in doxygen as far as I can tell

Not going

> + *
> + * To account for non-linearity caused by saturation, the value needs to be
> + * estimated in an iterative process, as multiplying by a gain will not increase
> + * the relative luminance by the same factor if some image regions are saturated
> + *
> + * \return The calculated initial gain
> + */
> +double MeanLuminanceAgc::estimateInitialGain()
> +{
> +	double yTarget = relativeLuminanceTarget_;
> +	double yGain = 1.0;
> +
> +	for (unsigned int i = 0; i < 8; i++) {
> +		double yValue = estimateLuminance(yGain);
> +		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
> +
> +		yGain *= extra_gain;
> +		LOG(Agc, Debug) << "Y value: " << yValue
> +				<< ", Y target: " << yTarget
> +				<< ", gives gain " << yGain;
> +
> +		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
> +			break;
> +	}
> +
> +	return yGain;
> +}
> +
> +/**
> + * \brief Clamp gain within the bounds of a defined constraint
> + * \param[in] constraintModeIndex The index of the constraint to adhere to
> + * \param[in] hist A histogram over which to calculate inter-quantile means
> + * \param[in] gain The gain to clamp
> + *
> + * \return The gain clamped within the constraint bounds
> + */
> +double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
> +					     const Histogram &hist,
> +					     double gain)
> +{
> +	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
> +	for (const AgcConstraint &constraint : constraints) {
> +		double newGain = constraint.yTarget * hist.bins() /
> +				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
> +
> +		if (constraint.bound == AgcConstraint::Bound::LOWER &&
> +		    newGain > gain)
> +			gain = newGain;
> +
> +		if (constraint.bound == AgcConstraint::Bound::UPPER &&
> +		    newGain < gain)
> +			gain = newGain;
> +	}
> +
> +	return gain;
> +}
> +
> +/**
> + * \brief Apply a filter on the exposure value to limit the speed of changes
> + * \param[in] exposureValue The target exposure from the AGC algorithm
> + *
> + * The speed of the filter is adaptive, and will produce the target quicker
> + * during startup, or when the target exposure is within 20% of the most recent
> + * filter output.
> + *
> + * \return The filtered exposure
> + */
> +utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
> +{
> +	double speed = 0.2;
> +
> +	/* Adapt instantly if we are in startup phase. */
> +	if (frameCount_ < kNumStartupFrames)
> +		speed = 1.0;
> +
> +	/*
> +	 * If we are close to the desired result, go faster to avoid making
> +	 * multiple micro-adjustments.
> +	 * \todo Make this customisable?
> +	 */
> +	if (filteredExposure_ < 1.2 * exposureValue &&
> +	    filteredExposure_ > 0.8 * exposureValue)
> +		speed = sqrt(speed);
> +
> +	filteredExposure_ = speed * exposureValue +
> +			    filteredExposure_ * (1.0 - speed);
> +
> +	return filteredExposure_;
> +}
> +
> +/**
> + * \brief Calculate the new exposure value
> + * \param[in] constraintModeIndex The index of the current constraint mode
> + * \param[in] exposureModeIndex The index of the current exposure mode
> + * \param[in] yHist A Histogram from the ISP statistics to use in constraining
> + *	      the calculated gain
> + * \param[in] effectiveExposureValue The EV applied to the frame from which the
> + *	      statistics in use derive
> + *
> + * Calculate a new exposure value to try to obtain the target. The calculated
> + * exposure value is filtered to prevent rapid changes from frame to frame, and
> + * divided into shutter time, analogue and digital gain.
> + *
> + * \return Tuple of shutter time, analogue gain, and digital gain
> + */
> +std::tuple<utils::Duration, double, double>
> +MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
> +				 uint32_t exposureModeIndex,
> +				 const Histogram &yHist,
> +				 utils::Duration effectiveExposureValue)
> +{
> +	/*
> +	 * The pipeline handler should validate that we have received an allowed
> +	 * value for AeExposureMode.
> +	 */
> +	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
> +		exposureModeHelpers_.at(exposureModeIndex);
> +
> +	double gain = estimateInitialGain();
> +	gain = constraintClampGain(constraintModeIndex, yHist, gain);
> +
> +	/*
> +	 * We don't check whether we're already close to the target, because
> +	 * even if the effective exposure value is the same as the last frame's
> +	 * we could have switched to an exposure mode that would require a new
> +	 * pass through the splitExposure() function.
> +	 */
> +
> +	utils::Duration newExposureValue = effectiveExposureValue * gain;
> +	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
> +					   * exposureModeHelper->maxGain();
> +	newExposureValue = std::min(newExposureValue, maxTotalExposure);
> +
> +	/*
> +	 * We filter the exposure value to make sure changes are not too jarring
> +	 * from frame to frame.
> +	 */
> +	newExposureValue = filterExposure(newExposureValue);
> +
> +	frameCount_++;
> +	return exposureModeHelper->splitExposure(newExposureValue);
> +}
> +
> +}; /* namespace ipa */
> +
> +}; /* namespace libcamera */
> diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
> new file mode 100644
> index 00000000..902a359a
> --- /dev/null
> +++ b/src/ipa/libipa/agc.h
> @@ -0,0 +1,82 @@
> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> +/*
> + * Copyright (C) 2024 Ideas on Board Oy
> + *
> + agc.h - Base class for libipa-compliant AGC algorithms
> + */
> +
> +#pragma once
> +
> +#include <tuple>
> +#include <vector>
> +
> +#include <libcamera/controls.h>
> +
> +#include "libcamera/internal/yaml_parser.h"
> +
> +#include "exposure_mode_helper.h"
> +#include "histogram.h"
> +
> +namespace libcamera {
> +
> +namespace ipa {
> +
> +class MeanLuminanceAgc
> +{
> +public:
> +	MeanLuminanceAgc();
> +	virtual ~MeanLuminanceAgc() = default;
> +
> +	struct AgcConstraint {
> +		enum class Bound {
> +			LOWER = 0,
> +			UPPER = 1
> +		};
> +		Bound bound;
> +		double qLo;
> +		double qHi;
> +		double yTarget;
> +	};
> +
> +	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
> +	void parseConstraint(const YamlObject &modeDict, int32_t id);
> +	int parseConstraintModes(const YamlObject &tuningData);
> +	int parseExposureModes(const YamlObject &tuningData);
> +
> +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
> +	{
> +		return constraintModes_;
> +	}
> +
> +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
> +	{
> +		return exposureModeHelpers_;
> +	}
> +
> +	ControlInfoMap::Map controls()
> +	{
> +		return controls_;
> +	}
> +
> +	virtual double estimateLuminance(const double gain) = 0;
> +	double estimateInitialGain();
> +	double constraintClampGain(uint32_t constraintModeIndex,
> +				   const Histogram &hist,
> +				   double gain);
> +	utils::Duration filterExposure(utils::Duration exposureValue);
> +	std::tuple<utils::Duration, double, double>
> +	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
> +		       const Histogram &yHist, utils::Duration effectiveExposureValue);
> +private:
> +	uint64_t frameCount_;
> +	utils::Duration filteredExposure_;
> +	double relativeLuminanceTarget_;
> +
> +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
> +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
> +	ControlInfoMap::Map controls_;
> +};
> +
> +}; /* namespace ipa */
> +
> +}; /* namespace libcamera */
> diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
> index 37fbd177..31cc8d70 100644
> --- a/src/ipa/libipa/meson.build
> +++ b/src/ipa/libipa/meson.build
> @@ -1,6 +1,7 @@
>  # SPDX-License-Identifier: CC0-1.0
>
>  libipa_headers = files([
> +    'agc.h',
>      'algorithm.h',
>      'camera_sensor_helper.h',
>      'exposure_mode_helper.h',
> @@ -10,6 +11,7 @@ libipa_headers = files([
>  ])
>
>  libipa_sources = files([
> +    'agc.cpp',
>      'algorithm.cpp',
>      'camera_sensor_helper.cpp',
>      'exposure_mode_helper.cpp',
> --
> 2.34.1
>
Stefan Klug March 27, 2024, 3:25 p.m. UTC | #2
Hi Daniel,

thanks for the patch. Jacopo already did the whole review, so only a
small comment...

On Fri, Mar 22, 2024 at 01:14:45PM +0000, Daniel Scally wrote:
> The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
> very large part; following the Rpi IPA's algorithm in spirit with a
> few tunable values in that IPA being hardcoded in the libipa ones.
> Add a new base class for MeanLuminanceAgc which implements the same
> algorithm and additionally parses yaml tuning files to inform an IPA
> module's Agc algorithm about valid constraint and exposure modes and
> their associated bounds.
> 
> Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
> ---
>  src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
>  src/ipa/libipa/agc.h       |  82 ++++++
>  src/ipa/libipa/meson.build |   2 +
>  3 files changed, 610 insertions(+)
>  create mode 100644 src/ipa/libipa/agc.cpp
>  create mode 100644 src/ipa/libipa/agc.h
> 
> diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
> new file mode 100644
> index 00000000..af57a571
> --- /dev/null
> +++ b/src/ipa/libipa/agc.cpp
> @@ -0,0 +1,526 @@
> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> +/*
> + * Copyright (C) 2024 Ideas on Board Oy
> + *
> + * agc.cpp - Base class for libipa-compliant AGC algorithms
> + */
> +
> +#include "agc.h"
> +
> +#include <cmath>
> +
> +#include <libcamera/base/log.h>
> +#include <libcamera/control_ids.h>
> +
> +#include "exposure_mode_helper.h"
> +
> +using namespace libcamera::controls;
> +
> +/**
> + * \file agc.h
> + * \brief Base class implementing mean luminance AEGC.
> + */
> +
> +namespace libcamera {
> +
> +using namespace std::literals::chrono_literals;
> +
> +LOG_DEFINE_CATEGORY(Agc)
> +
> +namespace ipa {
> +
> +/*
> + * Number of frames to wait before calculating stats on minimum exposure
> + * \todo should this be a tunable value?
> + */
> +static constexpr uint32_t kNumStartupFrames = 10;
> +
> +/*
> + * Default relative luminance target
> + *
> + * This value should be chosen so that when the camera points at a grey target,
> + * the resulting image brightness looks "right". Custom values can be passed
> + * as the relativeLuminanceTarget value in sensor tuning files.
> + */
> +static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
> +
> +/**
> + * \struct MeanLuminanceAgc::AgcConstraint
> + * \brief The boundaries and target for an AeConstraintMode constraint
> + *
> + * This structure describes an AeConstraintMode constraint for the purposes of
> + * this algorithm. The algorithm will apply the constraints by calculating the
> + * Histogram's inter-quantile mean between the given quantiles and ensure that
> + * the resulting value is the right side of the given target (as defined by the
> + * boundary and luminance target).
> + */
> +
> +/**
> + * \enum MeanLuminanceAgc::AgcConstraint::Bound
> + * \brief Specify whether the constraint defines a lower or upper bound
> + * \var MeanLuminanceAgc::AgcConstraint::LOWER
> + * \brief The constraint defines a lower bound
> + * \var MeanLuminanceAgc::AgcConstraint::UPPER
> + * \brief The constraint defines an upper bound
> + */
> +
> +/**
> + * \var MeanLuminanceAgc::AgcConstraint::bound
> + * \brief The type of constraint bound
> + */
> +
> +/**
> + * \var MeanLuminanceAgc::AgcConstraint::qLo
> + * \brief The lower quantile to use for the constraint
> + */
> +
> +/**
> + * \var MeanLuminanceAgc::AgcConstraint::qHi
> + * \brief The upper quantile to use for the constraint
> + */
> +
> +/**
> + * \var MeanLuminanceAgc::AgcConstraint::yTarget
> + * \brief The luminance target for the constraint
> + */
> +
> +/**
> + * \class MeanLuminanceAgc
> + * \brief a mean-based auto-exposure algorithm
> + *
> + * This algorithm calculates a shutter time, analogue and digital gain such that
> + * the normalised mean luminance value of an image is driven towards a target,
> + * which itself is discovered from tuning data. The algorithm is a two-stage
> + * process:
> + *
> + * In the first stage, an initial gain value is derived by iteratively comparing
> + * the gain-adjusted mean luminance across an entire image against a target, and
> + * selecting a value which pushes it as closely as possible towards the target.
> + *
> + * In the second stage we calculate the gain required to drive the average of a
> + * section of a histogram to a target value, where the target and the boundaries
> + * of the section of the histogram used in the calculation are taken from the
> + * values defined for the currently configured AeConstraintMode within the
> + * tuning data. The gain from the first stage is then clamped to the gain from
> + * this stage.
> + *
> + * The final gain is used to adjust the effective exposure value of the image,
> + * and that new exposure value divided into shutter time, analogue gain and
> + * digital gain according to the selected AeExposureMode.
> + */
> +
> +MeanLuminanceAgc::MeanLuminanceAgc()
> +	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
> +{
> +}
> +
> +/**
> + * \brief Parse the relative luminance target from the tuning data
> + * \param[in] tuningData The YamlObject holding the algorithm's tuning data
> + */
> +void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
> +{
> +	relativeLuminanceTarget_ =
> +		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);
> +}
> +
> +/**
> + * \brief Parse an AeConstraintMode constraint from tuning data
> + * \param[in] modeDict the YamlObject holding the constraint data
> + * \param[in] id The constraint ID from AeConstraintModeEnum
> + */
> +void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
> +{
> +	for (const auto &[boundName, content] : modeDict.asDict()) {
> +		if (boundName != "upper" && boundName != "lower") {
> +			LOG(Agc, Warning)
> +				<< "Ignoring unknown constraint bound '" << boundName << "'";
> +			continue;
> +		}
> +
> +		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
> +		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
> +		double qLo = content["qLo"].get<double>().value_or(0.98);
> +		double qHi = content["qHi"].get<double>().value_or(1.0);
> +		double yTarget =
> +			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);
> +
> +		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
> +
> +		if (!constraintModes_.count(id))
> +			constraintModes_[id] = {};
> +
> +		if (idx)
> +			constraintModes_[id].push_back(constraint);
> +		else
> +			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
> +	}
> +}
> +
> +/**
> + * \brief Parse tuning data file to populate AeConstraintMode control
> + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> + *
> + * The Agc algorithm's tuning data should contain a dictionary called
> + * AeConstraintMode containing per-mode setting dictionaries with the key being
> + * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
> + * contain either a "lower" or "upper" key, or both, in this format:
> + *
> + * \code{.unparsed}
> + * algorithms:
> + *   - Agc:
> + *       AeConstraintMode:
> + *         ConstraintNormal:
> + *           lower:
> + *             qLo: 0.98
> + *             qHi: 1.0
> + *             yTarget: 0.5
> + *         ConstraintHighlight:
> + *           lower:
> + *             qLo: 0.98
> + *             qHi: 1.0
> + *             yTarget: 0.5
> + *           upper:
> + *             qLo: 0.98
> + *             qHi: 1.0
> + *             yTarget: 0.8
> + *
> + * \endcode
> + *
> + * The parsed dictionaries are used to populate an array of available values for
> + * the AeConstraintMode control and stored for later use in the algorithm.
> + *
> + * \return -EINVAL Where a defined constraint mode is invalid
> + * \return 0 on success
> + */
> +int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
> +{
> +	std::vector<ControlValue> availableConstraintModes;
> +
> +	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
> +	if (yamlConstraintModes.isDictionary()) {
> +		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
> +			if (AeConstraintModeNameValueMap.find(modeName) ==
> +			    AeConstraintModeNameValueMap.end()) {
> +				LOG(Agc, Warning)
> +					<< "Skipping unknown constraint mode '" << modeName << "'";
> +				continue;
> +			}
> +
> +			if (!modeDict.isDictionary()) {
> +				LOG(Agc, Error)
> +					<< "Invalid constraint mode '" << modeName << "'";
> +				return -EINVAL;
> +			}
> +
> +			parseConstraint(modeDict,
> +					AeConstraintModeNameValueMap.at(modeName));
> +			availableConstraintModes.push_back(
> +				AeConstraintModeNameValueMap.at(modeName));
> +		}
> +	}
> +
> +	/*
> +	 * If the tuning data file contains no constraints then we use the
> +	 * default constraint that the various Agc algorithms were adhering to
> +	 * anyway before centralisation.
> +	 */
> +	if (constraintModes_.empty()) {
> +		AgcConstraint constraint = {
> +			AgcConstraint::Bound::LOWER,
> +			0.98,
> +			1.0,
> +			0.5
> +		};
> +
> +		constraintModes_[controls::ConstraintNormal].insert(
> +			constraintModes_[controls::ConstraintNormal].begin(),
> +			constraint);
> +		availableConstraintModes.push_back(
> +			AeConstraintModeNameValueMap.at("ConstraintNormal"));
> +	}
> +
> +	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
> +
> +	return 0;
> +}
> +
> +/**
> + * \brief Parse tuning data file to populate AeExposureMode control
> + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> + *
> + * The Agc algorithm's tuning data should contain a dictionary called
> + * AeExposureMode containing per-mode setting dictionaries with the key being
> + * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
> + * contain an array of shutter times with the key "shutter" and an array of gain
> + * values with the key "gain", in this format:
> + *
> + * \code{.unparsed}
> + * algorithms:
> + *   - Agc:
> + *       AeExposureMode:
> + *         ExposureNormal:
> + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> + *         ExposureShort:
> + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> + *
> + * \endcode
> + *
> + * The parsed dictionaries are used to populate an array of available values for
> + * the AeExposureMode control and to create ExposureModeHelpers
> + *
> + * \return -EINVAL Where a defined constraint mode is invalid
> + * \return 0 on success
> + */
> +int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
> +{
> +	std::vector<ControlValue> availableExposureModes;
> +	int ret;
> +
> +	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
> +	if (yamlExposureModes.isDictionary()) {
> +		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
> +			if (AeExposureModeNameValueMap.find(modeName) ==
> +			    AeExposureModeNameValueMap.end()) {
> +				LOG(Agc, Warning)
> +					<< "Skipping unknown exposure mode '" << modeName << "'";
> +				continue;
> +			}
> +
> +			if (!modeValues.isDictionary()) {
> +				LOG(Agc, Error)
> +					<< "Invalid exposure mode '" << modeName << "'";
> +				return -EINVAL;
> +			}
> +
> +			std::vector<uint32_t> shutters =
> +				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
> +			std::vector<double> gains =
> +				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
> +
> +			std::vector<utils::Duration> shutterDurations;
> +			std::transform(shutters.begin(), shutters.end(),
> +				std::back_inserter(shutterDurations),
> +				[](uint32_t time) { return std::chrono::microseconds(time); });
> +
> +			std::shared_ptr<ExposureModeHelper> helper =
> +				std::make_shared<ExposureModeHelper>();
> +			if ((ret = helper->init(shutterDurations, gains) < 0)) {
> +				LOG(Agc, Error)
> +					<< "Failed to parse exposure mode '" << modeName << "'";
> +				return ret;
> +			}
> +
> +			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
> +			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
> +		}
> +	}
> +
> +	/*
> +	 * If we don't have any exposure modes in the tuning data we create an
> +	 * ExposureModeHelper using empty shutter time and gain arrays, which
> +	 * will then internally simply drive the shutter as high as possible
> +	 * before touching gain
> +	 */
> +	if (availableExposureModes.empty()) {
> +		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
> +		std::vector<utils::Duration> shutterDurations = {};
> +		std::vector<double> gains = {};
> +
> +		std::shared_ptr<ExposureModeHelper> helper =
> +			std::make_shared<ExposureModeHelper>();
> +		if ((ret = helper->init(shutterDurations, gains) < 0)) {
> +			LOG(Agc, Error)
> +				<< "Failed to create default ExposureModeHelper";
> +			return ret;
> +		}
> +
> +		exposureModeHelpers_[exposureModeId] = helper;
> +		availableExposureModes.push_back(exposureModeId);
> +	}
> +
> +	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
> +
> +	return 0;
> +}
> +
> +/**
> + * \fn MeanLuminanceAgc::constraintModes()
> + * \brief Get the constraint modes that have been parsed from tuning data
> + */
> +
> +/**
> + * \fn MeanLuminanceAgc::exposureModeHelpers()
> + * \brief Get the ExposureModeHelpers that have been parsed from tuning data
> + */
> +
> +/**
> + * \fn MeanLuminanceAgc::controls()
> + * \brief Get the controls that have been generated after parsing tuning data
> + */
> +
> +/**
> + * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
> + * \brief Estimate the luminance of an image, adjusted by a given gain
> + * \param[in] gain The gain with which to adjust the luminance estimate
> + *
> + * This function is a pure virtual function because estimation of luminance is a
> + * hardware-specific operation, which depends wholly on the format of the stats
> + * that are delivered to libcamera from the ISP. Derived classes must implement
> + * an overriding function that calculates the normalised mean luminance value
> + * across the entire image.
> + *
> + * \return The normalised relative luminance of the image
> + */
> +
> +/**
> + * \brief Estimate the initial gain needed to achieve a relative luminance
> + *        target
> + *
> + * To account for non-linearity caused by saturation, the value needs to be
> + * estimated in an iterative process, as multiplying by a gain will not increase
> + * the relative luminance by the same factor if some image regions are saturated
> + *
> + * \return The calculated initial gain
> + */
> +double MeanLuminanceAgc::estimateInitialGain()
> +{
> +	double yTarget = relativeLuminanceTarget_;
> +	double yGain = 1.0;
> +
> +	for (unsigned int i = 0; i < 8; i++) {
> +		double yValue = estimateLuminance(yGain);
> +		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
> +
> +		yGain *= extra_gain;
> +		LOG(Agc, Debug) << "Y value: " << yValue
> +				<< ", Y target: " << yTarget
> +				<< ", gives gain " << yGain;
> +
> +		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
> +			break;
> +	}
> +
> +	return yGain;
> +}
> +
> +/**
> + * \brief Clamp gain within the bounds of a defined constraint
> + * \param[in] constraintModeIndex The index of the constraint to adhere to
> + * \param[in] hist A histogram over which to calculate inter-quantile means
> + * \param[in] gain The gain to clamp
> + *
> + * \return The gain clamped within the constraint bounds
> + */
> +double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
> +					     const Histogram &hist,
> +					     double gain)
> +{
> +	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
> +	for (const AgcConstraint &constraint : constraints) {
> +		double newGain = constraint.yTarget * hist.bins() /
> +				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
> +
> +		if (constraint.bound == AgcConstraint::Bound::LOWER &&
> +		    newGain > gain)
> +			gain = newGain;
> +
> +		if (constraint.bound == AgcConstraint::Bound::UPPER &&
> +		    newGain < gain)
> +			gain = newGain;
> +	}
> +
> +	return gain;
> +}
> +
> +/**
> + * \brief Apply a filter on the exposure value to limit the speed of changes
> + * \param[in] exposureValue The target exposure from the AGC algorithm
> + *
> + * The speed of the filter is adaptive, and will produce the target quicker
> + * during startup, or when the target exposure is within 20% of the most recent
> + * filter output.
> + *
> + * \return The filtered exposure
> + */
> +utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
> +{
> +	double speed = 0.2;
> +
> +	/* Adapt instantly if we are in startup phase. */
> +	if (frameCount_ < kNumStartupFrames)
> +		speed = 1.0;

frameCount is only reset in the constructor. If I got it right, this
class doesn't know when the camera is stopped/restarted. This might be
on purpose, with the expectation that a camera continues with the last
state after a stop()/start(). Could we fix the comment to express that
somehow ( Adapt instntly if we are in the first startup phase ) or
something. I'm no native speaker, so maybe you have better ideas :-)

Cheers Stefan

> +
> +	/*
> +	 * If we are close to the desired result, go faster to avoid making
> +	 * multiple micro-adjustments.
> +	 * \todo Make this customisable?
> +	 */
> +	if (filteredExposure_ < 1.2 * exposureValue &&
> +	    filteredExposure_ > 0.8 * exposureValue)
> +		speed = sqrt(speed);
> +
> +	filteredExposure_ = speed * exposureValue +
> +			    filteredExposure_ * (1.0 - speed);
> +
> +	return filteredExposure_;
> +}
> +
> +/**
> + * \brief Calculate the new exposure value
> + * \param[in] constraintModeIndex The index of the current constraint mode
> + * \param[in] exposureModeIndex The index of the current exposure mode
> + * \param[in] yHist A Histogram from the ISP statistics to use in constraining
> + *	      the calculated gain
> + * \param[in] effectiveExposureValue The EV applied to the frame from which the
> + *	      statistics in use derive
> + *
> + * Calculate a new exposure value to try to obtain the target. The calculated
> + * exposure value is filtered to prevent rapid changes from frame to frame, and
> + * divided into shutter time, analogue and digital gain.
> + *
> + * \return Tuple of shutter time, analogue gain, and digital gain
> + */
> +std::tuple<utils::Duration, double, double>
> +MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
> +				 uint32_t exposureModeIndex,
> +				 const Histogram &yHist,
> +				 utils::Duration effectiveExposureValue)
> +{
> +	/*
> +	 * The pipeline handler should validate that we have received an allowed
> +	 * value for AeExposureMode.
> +	 */
> +	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
> +		exposureModeHelpers_.at(exposureModeIndex);
> +
> +	double gain = estimateInitialGain();
> +	gain = constraintClampGain(constraintModeIndex, yHist, gain);
> +
> +	/*
> +	 * We don't check whether we're already close to the target, because
> +	 * even if the effective exposure value is the same as the last frame's
> +	 * we could have switched to an exposure mode that would require a new
> +	 * pass through the splitExposure() function.
> +	 */
> +
> +	utils::Duration newExposureValue = effectiveExposureValue * gain;
> +	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
> +					   * exposureModeHelper->maxGain();
> +	newExposureValue = std::min(newExposureValue, maxTotalExposure);
> +
> +	/*
> +	 * We filter the exposure value to make sure changes are not too jarring
> +	 * from frame to frame.
> +	 */
> +	newExposureValue = filterExposure(newExposureValue);
> +
> +	frameCount_++;
> +	return exposureModeHelper->splitExposure(newExposureValue);
> +}
> +
> +}; /* namespace ipa */
> +
> +}; /* namespace libcamera */
> diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
> new file mode 100644
> index 00000000..902a359a
> --- /dev/null
> +++ b/src/ipa/libipa/agc.h
> @@ -0,0 +1,82 @@
> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> +/*
> + * Copyright (C) 2024 Ideas on Board Oy
> + *
> + agc.h - Base class for libipa-compliant AGC algorithms
> + */
> +
> +#pragma once
> +
> +#include <tuple>
> +#include <vector>
> +
> +#include <libcamera/controls.h>
> +
> +#include "libcamera/internal/yaml_parser.h"
> +
> +#include "exposure_mode_helper.h"
> +#include "histogram.h"
> +
> +namespace libcamera {
> +
> +namespace ipa {
> +
> +class MeanLuminanceAgc
> +{
> +public:
> +	MeanLuminanceAgc();
> +	virtual ~MeanLuminanceAgc() = default;
> +
> +	struct AgcConstraint {
> +		enum class Bound {
> +			LOWER = 0,
> +			UPPER = 1
> +		};
> +		Bound bound;
> +		double qLo;
> +		double qHi;
> +		double yTarget;
> +	};
> +
> +	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
> +	void parseConstraint(const YamlObject &modeDict, int32_t id);
> +	int parseConstraintModes(const YamlObject &tuningData);
> +	int parseExposureModes(const YamlObject &tuningData);
> +
> +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
> +	{
> +		return constraintModes_;
> +	}
> +
> +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
> +	{
> +		return exposureModeHelpers_;
> +	}
> +
> +	ControlInfoMap::Map controls()
> +	{
> +		return controls_;
> +	}
> +
> +	virtual double estimateLuminance(const double gain) = 0;
> +	double estimateInitialGain();
> +	double constraintClampGain(uint32_t constraintModeIndex,
> +				   const Histogram &hist,
> +				   double gain);
> +	utils::Duration filterExposure(utils::Duration exposureValue);
> +	std::tuple<utils::Duration, double, double>
> +	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
> +		       const Histogram &yHist, utils::Duration effectiveExposureValue);
> +private:
> +	uint64_t frameCount_;
> +	utils::Duration filteredExposure_;
> +	double relativeLuminanceTarget_;
> +
> +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
> +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
> +	ControlInfoMap::Map controls_;
> +};
> +
> +}; /* namespace ipa */
> +
> +}; /* namespace libcamera */
> diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
> index 37fbd177..31cc8d70 100644
> --- a/src/ipa/libipa/meson.build
> +++ b/src/ipa/libipa/meson.build
> @@ -1,6 +1,7 @@
>  # SPDX-License-Identifier: CC0-1.0
>  
>  libipa_headers = files([
> +    'agc.h',
>      'algorithm.h',
>      'camera_sensor_helper.h',
>      'exposure_mode_helper.h',
> @@ -10,6 +11,7 @@ libipa_headers = files([
>  ])
>  
>  libipa_sources = files([
> +    'agc.cpp',
>      'algorithm.cpp',
>      'camera_sensor_helper.cpp',
>      'exposure_mode_helper.cpp',
> -- 
> 2.34.1
>
Laurent Pinchart April 5, 2024, 11:08 p.m. UTC | #3
On Wed, Mar 27, 2024 at 04:25:45PM +0100, Stefan Klug wrote:
> Hi Daniel,
> 
> thanks for the patch. Jacopo already did the whole review, so only a
> small comment...
> 
> On Fri, Mar 22, 2024 at 01:14:45PM +0000, Daniel Scally wrote:
> > The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
> > very large part; following the Rpi IPA's algorithm in spirit with a
> > few tunable values in that IPA being hardcoded in the libipa ones.
> > Add a new base class for MeanLuminanceAgc which implements the same
> > algorithm and additionally parses yaml tuning files to inform an IPA
> > module's Agc algorithm about valid constraint and exposure modes and
> > their associated bounds.
> > 
> > Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
> > ---
> >  src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
> >  src/ipa/libipa/agc.h       |  82 ++++++
> >  src/ipa/libipa/meson.build |   2 +
> >  3 files changed, 610 insertions(+)
> >  create mode 100644 src/ipa/libipa/agc.cpp
> >  create mode 100644 src/ipa/libipa/agc.h
> > 
> > diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
> > new file mode 100644
> > index 00000000..af57a571
> > --- /dev/null
> > +++ b/src/ipa/libipa/agc.cpp
> > @@ -0,0 +1,526 @@
> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > +/*
> > + * Copyright (C) 2024 Ideas on Board Oy
> > + *
> > + * agc.cpp - Base class for libipa-compliant AGC algorithms
> > + */
> > +
> > +#include "agc.h"
> > +
> > +#include <cmath>
> > +
> > +#include <libcamera/base/log.h>
> > +#include <libcamera/control_ids.h>
> > +
> > +#include "exposure_mode_helper.h"
> > +
> > +using namespace libcamera::controls;
> > +
> > +/**
> > + * \file agc.h
> > + * \brief Base class implementing mean luminance AEGC.
> > + */
> > +
> > +namespace libcamera {
> > +
> > +using namespace std::literals::chrono_literals;
> > +
> > +LOG_DEFINE_CATEGORY(Agc)
> > +
> > +namespace ipa {
> > +
> > +/*
> > + * Number of frames to wait before calculating stats on minimum exposure
> > + * \todo should this be a tunable value?
> > + */
> > +static constexpr uint32_t kNumStartupFrames = 10;
> > +
> > +/*
> > + * Default relative luminance target
> > + *
> > + * This value should be chosen so that when the camera points at a grey target,
> > + * the resulting image brightness looks "right". Custom values can be passed
> > + * as the relativeLuminanceTarget value in sensor tuning files.
> > + */
> > +static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
> > +
> > +/**
> > + * \struct MeanLuminanceAgc::AgcConstraint
> > + * \brief The boundaries and target for an AeConstraintMode constraint
> > + *
> > + * This structure describes an AeConstraintMode constraint for the purposes of
> > + * this algorithm. The algorithm will apply the constraints by calculating the
> > + * Histogram's inter-quantile mean between the given quantiles and ensure that
> > + * the resulting value is the right side of the given target (as defined by the
> > + * boundary and luminance target).
> > + */
> > +
> > +/**
> > + * \enum MeanLuminanceAgc::AgcConstraint::Bound
> > + * \brief Specify whether the constraint defines a lower or upper bound
> > + * \var MeanLuminanceAgc::AgcConstraint::LOWER
> > + * \brief The constraint defines a lower bound
> > + * \var MeanLuminanceAgc::AgcConstraint::UPPER
> > + * \brief The constraint defines an upper bound
> > + */
> > +
> > +/**
> > + * \var MeanLuminanceAgc::AgcConstraint::bound
> > + * \brief The type of constraint bound
> > + */
> > +
> > +/**
> > + * \var MeanLuminanceAgc::AgcConstraint::qLo
> > + * \brief The lower quantile to use for the constraint
> > + */
> > +
> > +/**
> > + * \var MeanLuminanceAgc::AgcConstraint::qHi
> > + * \brief The upper quantile to use for the constraint
> > + */
> > +
> > +/**
> > + * \var MeanLuminanceAgc::AgcConstraint::yTarget
> > + * \brief The luminance target for the constraint
> > + */
> > +
> > +/**
> > + * \class MeanLuminanceAgc
> > + * \brief a mean-based auto-exposure algorithm
> > + *
> > + * This algorithm calculates a shutter time, analogue and digital gain such that
> > + * the normalised mean luminance value of an image is driven towards a target,
> > + * which itself is discovered from tuning data. The algorithm is a two-stage
> > + * process:
> > + *
> > + * In the first stage, an initial gain value is derived by iteratively comparing
> > + * the gain-adjusted mean luminance across an entire image against a target, and
> > + * selecting a value which pushes it as closely as possible towards the target.
> > + *
> > + * In the second stage we calculate the gain required to drive the average of a
> > + * section of a histogram to a target value, where the target and the boundaries
> > + * of the section of the histogram used in the calculation are taken from the
> > + * values defined for the currently configured AeConstraintMode within the
> > + * tuning data. The gain from the first stage is then clamped to the gain from
> > + * this stage.
> > + *
> > + * The final gain is used to adjust the effective exposure value of the image,
> > + * and that new exposure value divided into shutter time, analogue gain and
> > + * digital gain according to the selected AeExposureMode.
> > + */
> > +
> > +MeanLuminanceAgc::MeanLuminanceAgc()
> > +	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
> > +{
> > +}
> > +
> > +/**
> > + * \brief Parse the relative luminance target from the tuning data
> > + * \param[in] tuningData The YamlObject holding the algorithm's tuning data
> > + */
> > +void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
> > +{
> > +	relativeLuminanceTarget_ =
> > +		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);
> > +}
> > +
> > +/**
> > + * \brief Parse an AeConstraintMode constraint from tuning data
> > + * \param[in] modeDict the YamlObject holding the constraint data
> > + * \param[in] id The constraint ID from AeConstraintModeEnum
> > + */
> > +void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
> > +{
> > +	for (const auto &[boundName, content] : modeDict.asDict()) {
> > +		if (boundName != "upper" && boundName != "lower") {
> > +			LOG(Agc, Warning)
> > +				<< "Ignoring unknown constraint bound '" << boundName << "'";
> > +			continue;
> > +		}
> > +
> > +		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
> > +		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
> > +		double qLo = content["qLo"].get<double>().value_or(0.98);
> > +		double qHi = content["qHi"].get<double>().value_or(1.0);
> > +		double yTarget =
> > +			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);
> > +
> > +		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
> > +
> > +		if (!constraintModes_.count(id))
> > +			constraintModes_[id] = {};
> > +
> > +		if (idx)
> > +			constraintModes_[id].push_back(constraint);
> > +		else
> > +			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
> > +	}
> > +}
> > +
> > +/**
> > + * \brief Parse tuning data file to populate AeConstraintMode control
> > + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> > + *
> > + * The Agc algorithm's tuning data should contain a dictionary called
> > + * AeConstraintMode containing per-mode setting dictionaries with the key being
> > + * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
> > + * contain either a "lower" or "upper" key, or both, in this format:
> > + *
> > + * \code{.unparsed}
> > + * algorithms:
> > + *   - Agc:
> > + *       AeConstraintMode:
> > + *         ConstraintNormal:
> > + *           lower:
> > + *             qLo: 0.98
> > + *             qHi: 1.0
> > + *             yTarget: 0.5
> > + *         ConstraintHighlight:
> > + *           lower:
> > + *             qLo: 0.98
> > + *             qHi: 1.0
> > + *             yTarget: 0.5
> > + *           upper:
> > + *             qLo: 0.98
> > + *             qHi: 1.0
> > + *             yTarget: 0.8
> > + *
> > + * \endcode
> > + *
> > + * The parsed dictionaries are used to populate an array of available values for
> > + * the AeConstraintMode control and stored for later use in the algorithm.
> > + *
> > + * \return -EINVAL Where a defined constraint mode is invalid
> > + * \return 0 on success
> > + */
> > +int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
> > +{
> > +	std::vector<ControlValue> availableConstraintModes;
> > +
> > +	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
> > +	if (yamlConstraintModes.isDictionary()) {
> > +		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
> > +			if (AeConstraintModeNameValueMap.find(modeName) ==
> > +			    AeConstraintModeNameValueMap.end()) {
> > +				LOG(Agc, Warning)
> > +					<< "Skipping unknown constraint mode '" << modeName << "'";
> > +				continue;
> > +			}
> > +
> > +			if (!modeDict.isDictionary()) {
> > +				LOG(Agc, Error)
> > +					<< "Invalid constraint mode '" << modeName << "'";
> > +				return -EINVAL;
> > +			}
> > +
> > +			parseConstraint(modeDict,
> > +					AeConstraintModeNameValueMap.at(modeName));
> > +			availableConstraintModes.push_back(
> > +				AeConstraintModeNameValueMap.at(modeName));
> > +		}
> > +	}
> > +
> > +	/*
> > +	 * If the tuning data file contains no constraints then we use the
> > +	 * default constraint that the various Agc algorithms were adhering to
> > +	 * anyway before centralisation.
> > +	 */
> > +	if (constraintModes_.empty()) {
> > +		AgcConstraint constraint = {
> > +			AgcConstraint::Bound::LOWER,
> > +			0.98,
> > +			1.0,
> > +			0.5
> > +		};
> > +
> > +		constraintModes_[controls::ConstraintNormal].insert(
> > +			constraintModes_[controls::ConstraintNormal].begin(),
> > +			constraint);
> > +		availableConstraintModes.push_back(
> > +			AeConstraintModeNameValueMap.at("ConstraintNormal"));
> > +	}
> > +
> > +	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
> > +
> > +	return 0;
> > +}
> > +
> > +/**
> > + * \brief Parse tuning data file to populate AeExposureMode control
> > + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> > + *
> > + * The Agc algorithm's tuning data should contain a dictionary called
> > + * AeExposureMode containing per-mode setting dictionaries with the key being
> > + * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
> > + * contain an array of shutter times with the key "shutter" and an array of gain
> > + * values with the key "gain", in this format:
> > + *
> > + * \code{.unparsed}
> > + * algorithms:
> > + *   - Agc:
> > + *       AeExposureMode:
> > + *         ExposureNormal:
> > + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> > + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> > + *         ExposureShort:
> > + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> > + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> > + *
> > + * \endcode
> > + *
> > + * The parsed dictionaries are used to populate an array of available values for
> > + * the AeExposureMode control and to create ExposureModeHelpers
> > + *
> > + * \return -EINVAL Where a defined constraint mode is invalid
> > + * \return 0 on success
> > + */
> > +int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
> > +{
> > +	std::vector<ControlValue> availableExposureModes;
> > +	int ret;
> > +
> > +	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
> > +	if (yamlExposureModes.isDictionary()) {
> > +		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
> > +			if (AeExposureModeNameValueMap.find(modeName) ==
> > +			    AeExposureModeNameValueMap.end()) {
> > +				LOG(Agc, Warning)
> > +					<< "Skipping unknown exposure mode '" << modeName << "'";
> > +				continue;
> > +			}
> > +
> > +			if (!modeValues.isDictionary()) {
> > +				LOG(Agc, Error)
> > +					<< "Invalid exposure mode '" << modeName << "'";
> > +				return -EINVAL;
> > +			}
> > +
> > +			std::vector<uint32_t> shutters =
> > +				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
> > +			std::vector<double> gains =
> > +				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
> > +
> > +			std::vector<utils::Duration> shutterDurations;
> > +			std::transform(shutters.begin(), shutters.end(),
> > +				std::back_inserter(shutterDurations),
> > +				[](uint32_t time) { return std::chrono::microseconds(time); });
> > +
> > +			std::shared_ptr<ExposureModeHelper> helper =
> > +				std::make_shared<ExposureModeHelper>();
> > +			if ((ret = helper->init(shutterDurations, gains) < 0)) {
> > +				LOG(Agc, Error)
> > +					<< "Failed to parse exposure mode '" << modeName << "'";
> > +				return ret;
> > +			}
> > +
> > +			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
> > +			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
> > +		}
> > +	}
> > +
> > +	/*
> > +	 * If we don't have any exposure modes in the tuning data we create an
> > +	 * ExposureModeHelper using empty shutter time and gain arrays, which
> > +	 * will then internally simply drive the shutter as high as possible
> > +	 * before touching gain
> > +	 */
> > +	if (availableExposureModes.empty()) {
> > +		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
> > +		std::vector<utils::Duration> shutterDurations = {};
> > +		std::vector<double> gains = {};
> > +
> > +		std::shared_ptr<ExposureModeHelper> helper =
> > +			std::make_shared<ExposureModeHelper>();
> > +		if ((ret = helper->init(shutterDurations, gains) < 0)) {
> > +			LOG(Agc, Error)
> > +				<< "Failed to create default ExposureModeHelper";
> > +			return ret;
> > +		}
> > +
> > +		exposureModeHelpers_[exposureModeId] = helper;
> > +		availableExposureModes.push_back(exposureModeId);
> > +	}
> > +
> > +	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
> > +
> > +	return 0;
> > +}
> > +
> > +/**
> > + * \fn MeanLuminanceAgc::constraintModes()
> > + * \brief Get the constraint modes that have been parsed from tuning data
> > + */
> > +
> > +/**
> > + * \fn MeanLuminanceAgc::exposureModeHelpers()
> > + * \brief Get the ExposureModeHelpers that have been parsed from tuning data
> > + */
> > +
> > +/**
> > + * \fn MeanLuminanceAgc::controls()
> > + * \brief Get the controls that have been generated after parsing tuning data
> > + */
> > +
> > +/**
> > + * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
> > + * \brief Estimate the luminance of an image, adjusted by a given gain
> > + * \param[in] gain The gain with which to adjust the luminance estimate
> > + *
> > + * This function is a pure virtual function because estimation of luminance is a
> > + * hardware-specific operation, which depends wholly on the format of the stats
> > + * that are delivered to libcamera from the ISP. Derived classes must implement
> > + * an overriding function that calculates the normalised mean luminance value
> > + * across the entire image.
> > + *
> > + * \return The normalised relative luminance of the image
> > + */
> > +
> > +/**
> > + * \brief Estimate the initial gain needed to achieve a relative luminance
> > + *        target
> > + *
> > + * To account for non-linearity caused by saturation, the value needs to be
> > + * estimated in an iterative process, as multiplying by a gain will not increase
> > + * the relative luminance by the same factor if some image regions are saturated
> > + *
> > + * \return The calculated initial gain
> > + */
> > +double MeanLuminanceAgc::estimateInitialGain()
> > +{
> > +	double yTarget = relativeLuminanceTarget_;
> > +	double yGain = 1.0;
> > +
> > +	for (unsigned int i = 0; i < 8; i++) {
> > +		double yValue = estimateLuminance(yGain);
> > +		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
> > +
> > +		yGain *= extra_gain;
> > +		LOG(Agc, Debug) << "Y value: " << yValue
> > +				<< ", Y target: " << yTarget
> > +				<< ", gives gain " << yGain;
> > +
> > +		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
> > +			break;
> > +	}
> > +
> > +	return yGain;
> > +}
> > +
> > +/**
> > + * \brief Clamp gain within the bounds of a defined constraint
> > + * \param[in] constraintModeIndex The index of the constraint to adhere to
> > + * \param[in] hist A histogram over which to calculate inter-quantile means
> > + * \param[in] gain The gain to clamp
> > + *
> > + * \return The gain clamped within the constraint bounds
> > + */
> > +double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
> > +					     const Histogram &hist,
> > +					     double gain)
> > +{
> > +	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
> > +	for (const AgcConstraint &constraint : constraints) {
> > +		double newGain = constraint.yTarget * hist.bins() /
> > +				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
> > +
> > +		if (constraint.bound == AgcConstraint::Bound::LOWER &&
> > +		    newGain > gain)
> > +			gain = newGain;
> > +
> > +		if (constraint.bound == AgcConstraint::Bound::UPPER &&
> > +		    newGain < gain)
> > +			gain = newGain;
> > +	}
> > +
> > +	return gain;
> > +}
> > +
> > +/**
> > + * \brief Apply a filter on the exposure value to limit the speed of changes
> > + * \param[in] exposureValue The target exposure from the AGC algorithm
> > + *
> > + * The speed of the filter is adaptive, and will produce the target quicker
> > + * during startup, or when the target exposure is within 20% of the most recent
> > + * filter output.
> > + *
> > + * \return The filtered exposure
> > + */
> > +utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
> > +{
> > +	double speed = 0.2;
> > +
> > +	/* Adapt instantly if we are in startup phase. */
> > +	if (frameCount_ < kNumStartupFrames)
> > +		speed = 1.0;
> 
> frameCount is only reset in the constructor. If I got it right, this
> class doesn't know when the camera is stopped/restarted. This might be
> on purpose, with the expectation that a camera continues with the last
> state after a stop()/start(). Could we fix the comment to express that
> somehow ( Adapt instntly if we are in the first startup phase ) or
> something. I'm no native speaker, so maybe you have better ideas :-)

I think it would be better to reset the counter when the camera is
stopped and restarted. Otherwise we'll have inconsistent behaviour and
difficult to debug problems. We can later improve the restart case if
needed, possibly taking into account how long the camera has been
stopped for.

> > +
> > +	/*
> > +	 * If we are close to the desired result, go faster to avoid making
> > +	 * multiple micro-adjustments.
> > +	 * \todo Make this customisable?
> > +	 */
> > +	if (filteredExposure_ < 1.2 * exposureValue &&
> > +	    filteredExposure_ > 0.8 * exposureValue)
> > +		speed = sqrt(speed);
> > +
> > +	filteredExposure_ = speed * exposureValue +
> > +			    filteredExposure_ * (1.0 - speed);
> > +
> > +	return filteredExposure_;
> > +}
> > +
> > +/**
> > + * \brief Calculate the new exposure value
> > + * \param[in] constraintModeIndex The index of the current constraint mode
> > + * \param[in] exposureModeIndex The index of the current exposure mode
> > + * \param[in] yHist A Histogram from the ISP statistics to use in constraining
> > + *	      the calculated gain
> > + * \param[in] effectiveExposureValue The EV applied to the frame from which the
> > + *	      statistics in use derive
> > + *
> > + * Calculate a new exposure value to try to obtain the target. The calculated
> > + * exposure value is filtered to prevent rapid changes from frame to frame, and
> > + * divided into shutter time, analogue and digital gain.
> > + *
> > + * \return Tuple of shutter time, analogue gain, and digital gain
> > + */
> > +std::tuple<utils::Duration, double, double>
> > +MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
> > +				 uint32_t exposureModeIndex,
> > +				 const Histogram &yHist,
> > +				 utils::Duration effectiveExposureValue)
> > +{
> > +	/*
> > +	 * The pipeline handler should validate that we have received an allowed
> > +	 * value for AeExposureMode.
> > +	 */
> > +	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
> > +		exposureModeHelpers_.at(exposureModeIndex);
> > +
> > +	double gain = estimateInitialGain();
> > +	gain = constraintClampGain(constraintModeIndex, yHist, gain);
> > +
> > +	/*
> > +	 * We don't check whether we're already close to the target, because
> > +	 * even if the effective exposure value is the same as the last frame's
> > +	 * we could have switched to an exposure mode that would require a new
> > +	 * pass through the splitExposure() function.
> > +	 */
> > +
> > +	utils::Duration newExposureValue = effectiveExposureValue * gain;
> > +	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
> > +					   * exposureModeHelper->maxGain();
> > +	newExposureValue = std::min(newExposureValue, maxTotalExposure);
> > +
> > +	/*
> > +	 * We filter the exposure value to make sure changes are not too jarring
> > +	 * from frame to frame.
> > +	 */
> > +	newExposureValue = filterExposure(newExposureValue);
> > +
> > +	frameCount_++;
> > +	return exposureModeHelper->splitExposure(newExposureValue);
> > +}
> > +
> > +}; /* namespace ipa */
> > +
> > +}; /* namespace libcamera */
> > diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
> > new file mode 100644
> > index 00000000..902a359a
> > --- /dev/null
> > +++ b/src/ipa/libipa/agc.h
> > @@ -0,0 +1,82 @@
> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > +/*
> > + * Copyright (C) 2024 Ideas on Board Oy
> > + *
> > + agc.h - Base class for libipa-compliant AGC algorithms
> > + */
> > +
> > +#pragma once
> > +
> > +#include <tuple>
> > +#include <vector>
> > +
> > +#include <libcamera/controls.h>
> > +
> > +#include "libcamera/internal/yaml_parser.h"
> > +
> > +#include "exposure_mode_helper.h"
> > +#include "histogram.h"
> > +
> > +namespace libcamera {
> > +
> > +namespace ipa {
> > +
> > +class MeanLuminanceAgc
> > +{
> > +public:
> > +	MeanLuminanceAgc();
> > +	virtual ~MeanLuminanceAgc() = default;
> > +
> > +	struct AgcConstraint {
> > +		enum class Bound {
> > +			LOWER = 0,
> > +			UPPER = 1
> > +		};
> > +		Bound bound;
> > +		double qLo;
> > +		double qHi;
> > +		double yTarget;
> > +	};
> > +
> > +	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
> > +	void parseConstraint(const YamlObject &modeDict, int32_t id);
> > +	int parseConstraintModes(const YamlObject &tuningData);
> > +	int parseExposureModes(const YamlObject &tuningData);
> > +
> > +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
> > +	{
> > +		return constraintModes_;
> > +	}
> > +
> > +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
> > +	{
> > +		return exposureModeHelpers_;
> > +	}
> > +
> > +	ControlInfoMap::Map controls()
> > +	{
> > +		return controls_;
> > +	}
> > +
> > +	virtual double estimateLuminance(const double gain) = 0;
> > +	double estimateInitialGain();
> > +	double constraintClampGain(uint32_t constraintModeIndex,
> > +				   const Histogram &hist,
> > +				   double gain);
> > +	utils::Duration filterExposure(utils::Duration exposureValue);
> > +	std::tuple<utils::Duration, double, double>
> > +	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
> > +		       const Histogram &yHist, utils::Duration effectiveExposureValue);
> > +private:
> > +	uint64_t frameCount_;
> > +	utils::Duration filteredExposure_;
> > +	double relativeLuminanceTarget_;
> > +
> > +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
> > +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
> > +	ControlInfoMap::Map controls_;
> > +};
> > +
> > +}; /* namespace ipa */
> > +
> > +}; /* namespace libcamera */
> > diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
> > index 37fbd177..31cc8d70 100644
> > --- a/src/ipa/libipa/meson.build
> > +++ b/src/ipa/libipa/meson.build
> > @@ -1,6 +1,7 @@
> >  # SPDX-License-Identifier: CC0-1.0
> >  
> >  libipa_headers = files([
> > +    'agc.h',
> >      'algorithm.h',
> >      'camera_sensor_helper.h',
> >      'exposure_mode_helper.h',
> > @@ -10,6 +11,7 @@ libipa_headers = files([
> >  ])
> >  
> >  libipa_sources = files([
> > +    'agc.cpp',
> >      'algorithm.cpp',
> >      'camera_sensor_helper.cpp',
> >      'exposure_mode_helper.cpp',
Laurent Pinchart April 6, 2024, 1:07 a.m. UTC | #4
Hello,

On Mon, Mar 25, 2024 at 09:16:00PM +0100, Jacopo Mondi wrote:
> On Fri, Mar 22, 2024 at 01:14:45PM +0000, Daniel Scally wrote:
> > The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
> > very large part; following the Rpi IPA's algorithm in spirit with a
> > few tunable values in that IPA being hardcoded in the libipa ones.
> > Add a new base class for MeanLuminanceAgc which implements the same
> 
> nit: I would rather call this one AgcMeanLuminance.

And rename the files accordingly ?

> One other note, not sure if applicable here, from the experience of
> trying to upstream an auto-focus algorithm for RkISP1. We had there a
> base class to define the algorithm interface, one derived class for
> the common calculation and one platform-specific part for statistics
> collection and IPA module interfacing.
> 
> The base class was there mostly to handle the algorithm state machine
> by handling the controls that influence the algorithm behaviour. In
> case of AEGC I can think, in example, about handling the switch
> between enable/disable of auto-mode (and consequentially handling a
> manually set ExposureTime and AnalogueGain), switching between
> different ExposureModes etc
> 
> This is the last attempt for AF
> https://patchwork.libcamera.org/patch/18510/
> 
> and I'm wondering if it would be desirable to abstract away from the
> MeanLuminance part the part that is tightly coupled with the
> libcamera's controls definition.
> 
> Let's make a concrete example: look how the rkisp1 and the ipu3 agc
> implementations handle AeEnabled (or better, look at how the rkisp1
> does that and the IPU3 does not).
> 
> Ideally, my goal would be to abstract the handling of the control and
> all the state machine that decides if the manual or auto-computed
> values should be used to an AEGCAlgorithm base class.
> 
> Your series does that for the tuning file parsing already but does
> that in the MeanLuminance method implementation, while it should or
> could be common to all AEGC methods.
> 
> One very interesting experiment could be starting with this and then
> plumb-in AeEnable support in the IPU3 in example and move everything
> common to a base class.
> 
> > algorithm and additionally parses yaml tuning files to inform an IPA
> > module's Agc algorithm about valid constraint and exposure modes and
> > their associated bounds.
> >
> > Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
> > ---
> >  src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
> >  src/ipa/libipa/agc.h       |  82 ++++++
> >  src/ipa/libipa/meson.build |   2 +
> >  3 files changed, 610 insertions(+)
> >  create mode 100644 src/ipa/libipa/agc.cpp
> >  create mode 100644 src/ipa/libipa/agc.h
> >
> > diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
> > new file mode 100644
> > index 00000000..af57a571
> > --- /dev/null
> > +++ b/src/ipa/libipa/agc.cpp
> > @@ -0,0 +1,526 @@
> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > +/*
> > + * Copyright (C) 2024 Ideas on Board Oy
> > + *
> > + * agc.cpp - Base class for libipa-compliant AGC algorithms

We could have different AGC algorithms that are all libipa-compliant.
This is one particular AGC algorithm, right ?

> > + */
> > +
> > +#include "agc.h"
> > +
> > +#include <cmath>
> > +
> > +#include <libcamera/base/log.h>
> > +#include <libcamera/control_ids.h>
> > +
> > +#include "exposure_mode_helper.h"
> > +
> > +using namespace libcamera::controls;
> > +
> > +/**
> > + * \file agc.h
> > + * \brief Base class implementing mean luminance AEGC.
> 
> nit: not '.' at the end of briefs
> 
> > + */
> > +
> > +namespace libcamera {
> > +
> > +using namespace std::literals::chrono_literals;
> > +
> > +LOG_DEFINE_CATEGORY(Agc)
> > +
> > +namespace ipa {
> > +
> > +/*
> > + * Number of frames to wait before calculating stats on minimum exposure

This doesn't seem to match what the value is used for.

> > + * \todo should this be a tunable value?
> 
> Does this depend on the ISP (comes from IPA), the sensor (comes from
> tuning file) or... both ? :)
> 
> > + */
> > +static constexpr uint32_t kNumStartupFrames = 10;
> > +
> > +/*
> > + * Default relative luminance target
> > + *
> > + * This value should be chosen so that when the camera points at a grey target,
> > + * the resulting image brightness looks "right". Custom values can be passed
> > + * as the relativeLuminanceTarget value in sensor tuning files.
> > + */
> > +static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
> > +
> > +/**
> > + * \struct MeanLuminanceAgc::AgcConstraint
> > + * \brief The boundaries and target for an AeConstraintMode constraint
> > + *
> > + * This structure describes an AeConstraintMode constraint for the purposes of
> > + * this algorithm. The algorithm will apply the constraints by calculating the
> > + * Histogram's inter-quantile mean between the given quantiles and ensure that
> > + * the resulting value is the right side of the given target (as defined by the
> > + * boundary and luminance target).
> > + */
> 
> Here, in example.
> 
> controls::AeConstraintMode and the supported values are defined as
> (core|vendor) controls in control_ids_*.yaml
> 
> The tuning file expresses the constraint modes using the Control
> definition (I wonder if this has always been like this) but it
> definitely ties the tuning file to the controls definition.
> 
> Applications use controls::AeConstraintMode to select one of the
> constrained modes to have the algorithm use it.
> 
> In all of this, how much is part of the MeanLuminance implementation
> and how much shared between possibly multiple implementations ?
> 
> > +
> > +/**
> > + * \enum MeanLuminanceAgc::AgcConstraint::Bound
> > + * \brief Specify whether the constraint defines a lower or upper bound
> > + * \var MeanLuminanceAgc::AgcConstraint::LOWER
> > + * \brief The constraint defines a lower bound
> > + * \var MeanLuminanceAgc::AgcConstraint::UPPER
> > + * \brief The constraint defines an upper bound
> > + */
> > +
> > +/**
> > + * \var MeanLuminanceAgc::AgcConstraint::bound
> > + * \brief The type of constraint bound
> > + */
> > +
> > +/**
> > + * \var MeanLuminanceAgc::AgcConstraint::qLo
> > + * \brief The lower quantile to use for the constraint
> > + */
> > +
> > +/**
> > + * \var MeanLuminanceAgc::AgcConstraint::qHi
> > + * \brief The upper quantile to use for the constraint
> > + */
> > +
> > +/**
> > + * \var MeanLuminanceAgc::AgcConstraint::yTarget
> > + * \brief The luminance target for the constraint
> > + */
> > +
> > +/**
> > + * \class MeanLuminanceAgc
> > + * \brief a mean-based auto-exposure algorithm

s/a/A/

> > + *
> > + * This algorithm calculates a shutter time, analogue and digital gain such that
> > + * the normalised mean luminance value of an image is driven towards a target,
> > + * which itself is discovered from tuning data. The algorithm is a two-stage
> > + * process:

s/:/./

or make the next two paragraph bullet list entries.

> > + *
> > + * In the first stage, an initial gain value is derived by iteratively comparing
> > + * the gain-adjusted mean luminance across an entire image against a target, and
> > + * selecting a value which pushes it as closely as possible towards the target.
> > + *
> > + * In the second stage we calculate the gain required to drive the average of a
> > + * section of a histogram to a target value, where the target and the boundaries
> > + * of the section of the histogram used in the calculation are taken from the
> > + * values defined for the currently configured AeConstraintMode within the
> > + * tuning data. The gain from the first stage is then clamped to the gain from
> > + * this stage.
> > + *
> > + * The final gain is used to adjust the effective exposure value of the image,
> > + * and that new exposure value divided into shutter time, analogue gain and

s/divided/is divided/

> > + * digital gain according to the selected AeExposureMode.

There should be a mention here that this class handles tuning file
parsing and expects a particular tuning data format. You can reference
the function that parses the tuning data for detailed documentation
about the format.

> > + */
> > +
> > +MeanLuminanceAgc::MeanLuminanceAgc()
> > +	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
> > +{
> > +}
> > +
> > +/**
> > + * \brief Parse the relative luminance target from the tuning data
> > + * \param[in] tuningData The YamlObject holding the algorithm's tuning data
> > + */
> > +void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
> > +{
> > +	relativeLuminanceTarget_ =
> > +		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);
> 
> How do you expect this to be computed in the tuning file ?
> 
> > +}
> > +
> > +/**
> > + * \brief Parse an AeConstraintMode constraint from tuning data
> > + * \param[in] modeDict the YamlObject holding the constraint data
> > + * \param[in] id The constraint ID from AeConstraintModeEnum
> > + */
> > +void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
> > +{
> > +	for (const auto &[boundName, content] : modeDict.asDict()) {
> > +		if (boundName != "upper" && boundName != "lower") {
> > +			LOG(Agc, Warning)
> > +				<< "Ignoring unknown constraint bound '" << boundName << "'";
> > +			continue;
> > +		}
> > +
> > +		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
> > +		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
> > +		double qLo = content["qLo"].get<double>().value_or(0.98);
> > +		double qHi = content["qHi"].get<double>().value_or(1.0);
> > +		double yTarget =
> > +			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);
> > +
> > +		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
> > +
> > +		if (!constraintModes_.count(id))
> > +			constraintModes_[id] = {};
> > +
> > +		if (idx)
> > +			constraintModes_[id].push_back(constraint);
> > +		else
> > +			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
> > +	}
> > +}
> > +
> > +/**
> > + * \brief Parse tuning data file to populate AeConstraintMode control
> > + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> > + *
> > + * The Agc algorithm's tuning data should contain a dictionary called
> > + * AeConstraintMode containing per-mode setting dictionaries with the key being
> > + * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
> > + * contain either a "lower" or "upper" key, or both, in this format:
> > + *
> > + * \code{.unparsed}
> > + * algorithms:
> > + *   - Agc:
> > + *       AeConstraintMode:
> > + *         ConstraintNormal:
> > + *           lower:
> > + *             qLo: 0.98
> > + *             qHi: 1.0
> > + *             yTarget: 0.5
> 
> Ok, so this ties the tuning file not just to the libcamera controls
> definition, but to this specific implementation of the algorithm ?

Tying the algorithm implementation with the tuning data format seems
unavoidable. I'm slightly concerned about handling the YAML parsing here
though, as I'm wondering if we can always guarantee that the file will
not need to diverge somehow (possibly with additional data somewhere
that would make the shared parser fail), but that's probably such a
small risk that we can worry about it later.

> Not that it was not expected, and I think it's fine, as using a
> libcamera defined control value as 'index' makes sure the applications
> will deal with the same interface, but this largely conflicts with the
> idea to have shared parsing for all algorithms in a base class.

The idea comes straight from the RPi implementation, which expect users
to customize tuning files with new modes. Indexing into a non standard
list of modes isn't an API I really like, and that's something I think
we need to fix eventually.

> Also, we made AeConstrainModes a 'core control' because at the time
> when RPi first implemented AGC support there was no alternative to
> that. Then the RPi implementation has been copied in all other
> platforms and this is still fine as a 'core control'. This however
> seems to be a tuning parameter for this specific algorithm
> implementation, isn't it ?

I should read your whole reply before writing anything :-)

> > + *         ConstraintHighlight:
> > + *           lower:
> > + *             qLo: 0.98
> > + *             qHi: 1.0
> > + *             yTarget: 0.5
> > + *           upper:
> > + *             qLo: 0.98
> > + *             qHi: 1.0
> > + *             yTarget: 0.8
> > + *
> > + * \endcode
> > + *
> > + * The parsed dictionaries are used to populate an array of available values for
> > + * the AeConstraintMode control and stored for later use in the algorithm.
> > + *
> > + * \return -EINVAL Where a defined constraint mode is invalid
> > + * \return 0 on success
> > + */
> > +int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
> > +{
> > +	std::vector<ControlValue> availableConstraintModes;
> > +
> > +	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
> > +	if (yamlConstraintModes.isDictionary()) {
> > +		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
> > +			if (AeConstraintModeNameValueMap.find(modeName) ==
> > +			    AeConstraintModeNameValueMap.end()) {
> > +				LOG(Agc, Warning)
> > +					<< "Skipping unknown constraint mode '" << modeName << "'";
> > +				continue;
> > +			}
> > +
> > +			if (!modeDict.isDictionary()) {
> > +				LOG(Agc, Error)
> > +					<< "Invalid constraint mode '" << modeName << "'";
> > +				return -EINVAL;
> > +			}
> > +
> > +			parseConstraint(modeDict,
> > +					AeConstraintModeNameValueMap.at(modeName));
> > +			availableConstraintModes.push_back(
> > +				AeConstraintModeNameValueMap.at(modeName));
> > +		}
> > +	}
> > +
> > +	/*
> > +	 * If the tuning data file contains no constraints then we use the
> > +	 * default constraint that the various Agc algorithms were adhering to
> > +	 * anyway before centralisation.
> > +	 */
> > +	if (constraintModes_.empty()) {
> > +		AgcConstraint constraint = {
> > +			AgcConstraint::Bound::LOWER,
> > +			0.98,
> > +			1.0,
> > +			0.5
> > +		};
> > +
> > +		constraintModes_[controls::ConstraintNormal].insert(
> > +			constraintModes_[controls::ConstraintNormal].begin(),
> > +			constraint);
> > +		availableConstraintModes.push_back(
> > +			AeConstraintModeNameValueMap.at("ConstraintNormal"));
> > +	}
> > +
> > +	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
> > +
> > +	return 0;
> > +}
> > +
> > +/**
> > + * \brief Parse tuning data file to populate AeExposureMode control
> > + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> > + *
> > + * The Agc algorithm's tuning data should contain a dictionary called
> > + * AeExposureMode containing per-mode setting dictionaries with the key being
> > + * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
> > + * contain an array of shutter times with the key "shutter" and an array of gain
> > + * values with the key "gain", in this format:
> > + *
> > + * \code{.unparsed}
> > + * algorithms:
> > + *   - Agc:
> > + *       AeExposureMode:
> 
> Same reasoning as per the constraints, really.
> 
> There's nothing bad here, if not me realizing our controls definition
> is intimately tied with the algorithms implementation, so I wonder
> again if even handling things like AeEnable in a common form makes
> any sense..
> 
> Not going to review the actual implementation now, as it comes from
> the existing ones...
> 
> 
> > + *         ExposureNormal:
> > + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> > + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> > + *         ExposureShort:
> > + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> > + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> > + *
> > + * \endcode
> > + *
> > + * The parsed dictionaries are used to populate an array of available values for
> > + * the AeExposureMode control and to create ExposureModeHelpers
> > + *
> > + * \return -EINVAL Where a defined constraint mode is invalid
> > + * \return 0 on success
> > + */
> > +int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
> > +{
> > +	std::vector<ControlValue> availableExposureModes;
> > +	int ret;
> > +
> > +	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
> > +	if (yamlExposureModes.isDictionary()) {
> > +		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
> > +			if (AeExposureModeNameValueMap.find(modeName) ==
> > +			    AeExposureModeNameValueMap.end()) {
> > +				LOG(Agc, Warning)
> > +					<< "Skipping unknown exposure mode '" << modeName << "'";
> > +				continue;
> > +			}
> > +
> > +			if (!modeValues.isDictionary()) {
> > +				LOG(Agc, Error)
> > +					<< "Invalid exposure mode '" << modeName << "'";
> > +				return -EINVAL;
> > +			}
> > +
> > +			std::vector<uint32_t> shutters =
> > +				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
> > +			std::vector<double> gains =
> > +				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
> > +
> > +			std::vector<utils::Duration> shutterDurations;
> > +			std::transform(shutters.begin(), shutters.end(),
> > +				std::back_inserter(shutterDurations),
> > +				[](uint32_t time) { return std::chrono::microseconds(time); });
> > +
> > +			std::shared_ptr<ExposureModeHelper> helper =
> > +				std::make_shared<ExposureModeHelper>();
> > +			if ((ret = helper->init(shutterDurations, gains) < 0)) {
> > +				LOG(Agc, Error)
> > +					<< "Failed to parse exposure mode '" << modeName << "'";
> > +				return ret;
> > +			}
> > +
> > +			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
> > +			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
> > +		}
> > +	}
> > +
> > +	/*
> > +	 * If we don't have any exposure modes in the tuning data we create an
> > +	 * ExposureModeHelper using empty shutter time and gain arrays, which
> > +	 * will then internally simply drive the shutter as high as possible
> > +	 * before touching gain
> > +	 */
> > +	if (availableExposureModes.empty()) {
> > +		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
> > +		std::vector<utils::Duration> shutterDurations = {};
> > +		std::vector<double> gains = {};
> > +
> > +		std::shared_ptr<ExposureModeHelper> helper =
> > +			std::make_shared<ExposureModeHelper>();
> > +		if ((ret = helper->init(shutterDurations, gains) < 0)) {
> > +			LOG(Agc, Error)
> > +				<< "Failed to create default ExposureModeHelper";
> > +			return ret;
> > +		}
> > +
> > +		exposureModeHelpers_[exposureModeId] = helper;
> > +		availableExposureModes.push_back(exposureModeId);
> > +	}
> > +
> > +	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
> > +
> > +	return 0;
> > +}
> > +
> > +/**
> > + * \fn MeanLuminanceAgc::constraintModes()
> > + * \brief Get the constraint modes that have been parsed from tuning data
> > + */
> > +
> > +/**
> > + * \fn MeanLuminanceAgc::exposureModeHelpers()
> > + * \brief Get the ExposureModeHelpers that have been parsed from tuning data
> > + */
> > +
> > +/**
> > + * \fn MeanLuminanceAgc::controls()
> > + * \brief Get the controls that have been generated after parsing tuning data
> > + */
> > +
> > +/**
> > + * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
> > + * \brief Estimate the luminance of an image, adjusted by a given gain
> > + * \param[in] gain The gain with which to adjust the luminance estimate
> > + *
> > + * This function is a pure virtual function because estimation of luminance is a
> > + * hardware-specific operation, which depends wholly on the format of the stats
> > + * that are delivered to libcamera from the ISP. Derived classes must implement
> > + * an overriding function that calculates the normalised mean luminance value
> > + * across the entire image.
> > + *
> > + * \return The normalised relative luminance of the image
> > + */
> > +
> > +/**
> > + * \brief Estimate the initial gain needed to achieve a relative luminance
> > + *        target
> 
> nit: we don't usually indent in briefs, or in general when breaking
> lines in doxygen as far as I can tell
> 
> Not going
> 
> > + *
> > + * To account for non-linearity caused by saturation, the value needs to be
> > + * estimated in an iterative process, as multiplying by a gain will not increase
> > + * the relative luminance by the same factor if some image regions are saturated
> > + *
> > + * \return The calculated initial gain
> > + */
> > +double MeanLuminanceAgc::estimateInitialGain()
> > +{
> > +	double yTarget = relativeLuminanceTarget_;
> > +	double yGain = 1.0;
> > +
> > +	for (unsigned int i = 0; i < 8; i++) {
> > +		double yValue = estimateLuminance(yGain);
> > +		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
> > +
> > +		yGain *= extra_gain;
> > +		LOG(Agc, Debug) << "Y value: " << yValue
> > +				<< ", Y target: " << yTarget
> > +				<< ", gives gain " << yGain;
> > +
> > +		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
> > +			break;
> > +	}
> > +
> > +	return yGain;
> > +}
> > +
> > +/**
> > + * \brief Clamp gain within the bounds of a defined constraint
> > + * \param[in] constraintModeIndex The index of the constraint to adhere to
> > + * \param[in] hist A histogram over which to calculate inter-quantile means
> > + * \param[in] gain The gain to clamp
> > + *
> > + * \return The gain clamped within the constraint bounds
> > + */
> > +double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
> > +					     const Histogram &hist,
> > +					     double gain)
> > +{
> > +	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
> > +	for (const AgcConstraint &constraint : constraints) {
> > +		double newGain = constraint.yTarget * hist.bins() /
> > +				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
> > +
> > +		if (constraint.bound == AgcConstraint::Bound::LOWER &&
> > +		    newGain > gain)
> > +			gain = newGain;
> > +
> > +		if (constraint.bound == AgcConstraint::Bound::UPPER &&
> > +		    newGain < gain)
> > +			gain = newGain;
> > +	}
> > +
> > +	return gain;
> > +}
> > +
> > +/**
> > + * \brief Apply a filter on the exposure value to limit the speed of changes
> > + * \param[in] exposureValue The target exposure from the AGC algorithm
> > + *
> > + * The speed of the filter is adaptive, and will produce the target quicker
> > + * during startup, or when the target exposure is within 20% of the most recent
> > + * filter output.
> > + *
> > + * \return The filtered exposure
> > + */
> > +utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
> > +{
> > +	double speed = 0.2;
> > +
> > +	/* Adapt instantly if we are in startup phase. */
> > +	if (frameCount_ < kNumStartupFrames)
> > +		speed = 1.0;
> > +
> > +	/*
> > +	 * If we are close to the desired result, go faster to avoid making
> > +	 * multiple micro-adjustments.
> > +	 * \todo Make this customisable?
> > +	 */
> > +	if (filteredExposure_ < 1.2 * exposureValue &&
> > +	    filteredExposure_ > 0.8 * exposureValue)
> > +		speed = sqrt(speed);
> > +
> > +	filteredExposure_ = speed * exposureValue +
> > +			    filteredExposure_ * (1.0 - speed);
> > +
> > +	return filteredExposure_;
> > +}
> > +
> > +/**
> > + * \brief Calculate the new exposure value
> > + * \param[in] constraintModeIndex The index of the current constraint mode
> > + * \param[in] exposureModeIndex The index of the current exposure mode
> > + * \param[in] yHist A Histogram from the ISP statistics to use in constraining
> > + *	      the calculated gain
> > + * \param[in] effectiveExposureValue The EV applied to the frame from which the
> > + *	      statistics in use derive
> > + *
> > + * Calculate a new exposure value to try to obtain the target. The calculated
> > + * exposure value is filtered to prevent rapid changes from frame to frame, and
> > + * divided into shutter time, analogue and digital gain.
> > + *
> > + * \return Tuple of shutter time, analogue gain, and digital gain
> > + */
> > +std::tuple<utils::Duration, double, double>
> > +MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
> > +				 uint32_t exposureModeIndex,
> > +				 const Histogram &yHist,
> > +				 utils::Duration effectiveExposureValue)
> > +{
> > +	/*
> > +	 * The pipeline handler should validate that we have received an allowed
> > +	 * value for AeExposureMode.
> > +	 */
> > +	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
> > +		exposureModeHelpers_.at(exposureModeIndex);
> > +
> > +	double gain = estimateInitialGain();
> > +	gain = constraintClampGain(constraintModeIndex, yHist, gain);
> > +
> > +	/*
> > +	 * We don't check whether we're already close to the target, because
> > +	 * even if the effective exposure value is the same as the last frame's
> > +	 * we could have switched to an exposure mode that would require a new
> > +	 * pass through the splitExposure() function.
> > +	 */
> > +
> > +	utils::Duration newExposureValue = effectiveExposureValue * gain;
> > +	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
> > +					   * exposureModeHelper->maxGain();
> > +	newExposureValue = std::min(newExposureValue, maxTotalExposure);
> > +
> > +	/*
> > +	 * We filter the exposure value to make sure changes are not too jarring
> > +	 * from frame to frame.
> > +	 */
> > +	newExposureValue = filterExposure(newExposureValue);
> > +
> > +	frameCount_++;
> > +	return exposureModeHelper->splitExposure(newExposureValue);
> > +}
> > +
> > +}; /* namespace ipa */
> > +
> > +}; /* namespace libcamera */
> > diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
> > new file mode 100644
> > index 00000000..902a359a
> > --- /dev/null
> > +++ b/src/ipa/libipa/agc.h
> > @@ -0,0 +1,82 @@
> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > +/*
> > + * Copyright (C) 2024 Ideas on Board Oy
> > + *
> > + agc.h - Base class for libipa-compliant AGC algorithms
> > + */
> > +
> > +#pragma once
> > +
> > +#include <tuple>
> > +#include <vector>
> > +
> > +#include <libcamera/controls.h>
> > +
> > +#include "libcamera/internal/yaml_parser.h"
> > +
> > +#include "exposure_mode_helper.h"
> > +#include "histogram.h"
> > +
> > +namespace libcamera {
> > +
> > +namespace ipa {
> > +
> > +class MeanLuminanceAgc
> > +{
> > +public:
> > +	MeanLuminanceAgc();
> > +	virtual ~MeanLuminanceAgc() = default;

No need to make this inline, define the destructor in the .cpp with

MeanLumimanceAgc::~MeanLuminanceAgc() = default;

> > +
> > +	struct AgcConstraint {
> > +		enum class Bound {
> > +			LOWER = 0,
> > +			UPPER = 1

Wrong coding style.

> > +		};
> > +		Bound bound;
> > +		double qLo;
> > +		double qHi;
> > +		double yTarget;
> > +	};
> > +
> > +	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
> > +	void parseConstraint(const YamlObject &modeDict, int32_t id);

This function doesn't seem to be called from the derived classes, you
can make it private. Same for other functions below.

> > +	int parseConstraintModes(const YamlObject &tuningData);
> > +	int parseExposureModes(const YamlObject &tuningData);

Is there a reason to split the parsing in multiple functions, given that
they are called the same from from both the rkisp1 and ipu3
implementations ?

> > +
> > +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
> > +	{
> > +		return constraintModes_;
> > +	}

This seems unused, so the AgcConstraint structure can be made private.

> > +
> > +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
> > +	{
> > +		return exposureModeHelpers_;
> > +	}

Can we avoid exposing this too ?

I never expected there would be multiple helpers when reading the
documentation of the ExposureModeHelper class. That may be due to
missing documentation.

> > +
> > +	ControlInfoMap::Map controls()
> > +	{
> > +		return controls_;
> > +	}

Hmmm... We don't have precendents for pushing ControlInfo to algorithms.
Are we sure we want to go that way ?

> > +
> > +	virtual double estimateLuminance(const double gain) = 0;
> > +	double estimateInitialGain();
> > +	double constraintClampGain(uint32_t constraintModeIndex,
> > +				   const Histogram &hist,
> > +				   double gain);
> > +	utils::Duration filterExposure(utils::Duration exposureValue);
> > +	std::tuple<utils::Duration, double, double>
> > +	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
> > +		       const Histogram &yHist, utils::Duration effectiveExposureValue);

Missing blank line.

> > +private:
> > +	uint64_t frameCount_;
> > +	utils::Duration filteredExposure_;
> > +	double relativeLuminanceTarget_;
> > +
> > +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
> > +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
> > +	ControlInfoMap::Map controls_;
> > +};
> > +
> > +}; /* namespace ipa */
> > +
> > +}; /* namespace libcamera */
> > diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
> > index 37fbd177..31cc8d70 100644
> > --- a/src/ipa/libipa/meson.build
> > +++ b/src/ipa/libipa/meson.build
> > @@ -1,6 +1,7 @@
> >  # SPDX-License-Identifier: CC0-1.0
> >
> >  libipa_headers = files([
> > +    'agc.h',
> >      'algorithm.h',
> >      'camera_sensor_helper.h',
> >      'exposure_mode_helper.h',
> > @@ -10,6 +11,7 @@ libipa_headers = files([
> >  ])
> >
> >  libipa_sources = files([
> > +    'agc.cpp',
> >      'algorithm.cpp',
> >      'camera_sensor_helper.cpp',
> >      'exposure_mode_helper.cpp',
Laurent Pinchart April 6, 2024, 1:27 a.m. UTC | #5
On Sat, Apr 06, 2024 at 04:07:52AM +0300, Laurent Pinchart wrote:
> Hello,
> 
> On Mon, Mar 25, 2024 at 09:16:00PM +0100, Jacopo Mondi wrote:
> > On Fri, Mar 22, 2024 at 01:14:45PM +0000, Daniel Scally wrote:
> > > The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
> > > very large part; following the Rpi IPA's algorithm in spirit with a
> > > few tunable values in that IPA being hardcoded in the libipa ones.
> > > Add a new base class for MeanLuminanceAgc which implements the same
> > 
> > nit: I would rather call this one AgcMeanLuminance.
> 
> And rename the files accordingly ?
> 
> > One other note, not sure if applicable here, from the experience of
> > trying to upstream an auto-focus algorithm for RkISP1. We had there a
> > base class to define the algorithm interface, one derived class for
> > the common calculation and one platform-specific part for statistics
> > collection and IPA module interfacing.
> > 
> > The base class was there mostly to handle the algorithm state machine
> > by handling the controls that influence the algorithm behaviour. In
> > case of AEGC I can think, in example, about handling the switch
> > between enable/disable of auto-mode (and consequentially handling a
> > manually set ExposureTime and AnalogueGain), switching between
> > different ExposureModes etc
> > 
> > This is the last attempt for AF
> > https://patchwork.libcamera.org/patch/18510/
> > 
> > and I'm wondering if it would be desirable to abstract away from the
> > MeanLuminance part the part that is tightly coupled with the
> > libcamera's controls definition.
> > 
> > Let's make a concrete example: look how the rkisp1 and the ipu3 agc
> > implementations handle AeEnabled (or better, look at how the rkisp1
> > does that and the IPU3 does not).
> > 
> > Ideally, my goal would be to abstract the handling of the control and
> > all the state machine that decides if the manual or auto-computed
> > values should be used to an AEGCAlgorithm base class.
> > 
> > Your series does that for the tuning file parsing already but does
> > that in the MeanLuminance method implementation, while it should or
> > could be common to all AEGC methods.
> > 
> > One very interesting experiment could be starting with this and then
> > plumb-in AeEnable support in the IPU3 in example and move everything
> > common to a base class.
> > 
> > > algorithm and additionally parses yaml tuning files to inform an IPA
> > > module's Agc algorithm about valid constraint and exposure modes and
> > > their associated bounds.
> > >
> > > Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
> > > ---
> > >  src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
> > >  src/ipa/libipa/agc.h       |  82 ++++++
> > >  src/ipa/libipa/meson.build |   2 +
> > >  3 files changed, 610 insertions(+)
> > >  create mode 100644 src/ipa/libipa/agc.cpp
> > >  create mode 100644 src/ipa/libipa/agc.h
> > >
> > > diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
> > > new file mode 100644
> > > index 00000000..af57a571
> > > --- /dev/null
> > > +++ b/src/ipa/libipa/agc.cpp
> > > @@ -0,0 +1,526 @@
> > > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > > +/*
> > > + * Copyright (C) 2024 Ideas on Board Oy
> > > + *
> > > + * agc.cpp - Base class for libipa-compliant AGC algorithms
> 
> We could have different AGC algorithms that are all libipa-compliant.
> This is one particular AGC algorithm, right ?
> 
> > > + */
> > > +
> > > +#include "agc.h"
> > > +
> > > +#include <cmath>
> > > +
> > > +#include <libcamera/base/log.h>
> > > +#include <libcamera/control_ids.h>
> > > +
> > > +#include "exposure_mode_helper.h"
> > > +
> > > +using namespace libcamera::controls;
> > > +
> > > +/**
> > > + * \file agc.h
> > > + * \brief Base class implementing mean luminance AEGC.
> > 
> > nit: not '.' at the end of briefs
> > 
> > > + */
> > > +
> > > +namespace libcamera {
> > > +
> > > +using namespace std::literals::chrono_literals;
> > > +
> > > +LOG_DEFINE_CATEGORY(Agc)
> > > +
> > > +namespace ipa {
> > > +
> > > +/*
> > > + * Number of frames to wait before calculating stats on minimum exposure
> 
> This doesn't seem to match what the value is used for.
> 
> > > + * \todo should this be a tunable value?
> > 
> > Does this depend on the ISP (comes from IPA), the sensor (comes from
> > tuning file) or... both ? :)
> > 
> > > + */
> > > +static constexpr uint32_t kNumStartupFrames = 10;
> > > +
> > > +/*
> > > + * Default relative luminance target
> > > + *
> > > + * This value should be chosen so that when the camera points at a grey target,
> > > + * the resulting image brightness looks "right". Custom values can be passed
> > > + * as the relativeLuminanceTarget value in sensor tuning files.
> > > + */
> > > +static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
> > > +
> > > +/**
> > > + * \struct MeanLuminanceAgc::AgcConstraint
> > > + * \brief The boundaries and target for an AeConstraintMode constraint
> > > + *
> > > + * This structure describes an AeConstraintMode constraint for the purposes of
> > > + * this algorithm. The algorithm will apply the constraints by calculating the
> > > + * Histogram's inter-quantile mean between the given quantiles and ensure that
> > > + * the resulting value is the right side of the given target (as defined by the
> > > + * boundary and luminance target).
> > > + */
> > 
> > Here, in example.
> > 
> > controls::AeConstraintMode and the supported values are defined as
> > (core|vendor) controls in control_ids_*.yaml
> > 
> > The tuning file expresses the constraint modes using the Control
> > definition (I wonder if this has always been like this) but it
> > definitely ties the tuning file to the controls definition.
> > 
> > Applications use controls::AeConstraintMode to select one of the
> > constrained modes to have the algorithm use it.
> > 
> > In all of this, how much is part of the MeanLuminance implementation
> > and how much shared between possibly multiple implementations ?
> > 
> > > +
> > > +/**
> > > + * \enum MeanLuminanceAgc::AgcConstraint::Bound
> > > + * \brief Specify whether the constraint defines a lower or upper bound
> > > + * \var MeanLuminanceAgc::AgcConstraint::LOWER
> > > + * \brief The constraint defines a lower bound
> > > + * \var MeanLuminanceAgc::AgcConstraint::UPPER
> > > + * \brief The constraint defines an upper bound
> > > + */
> > > +
> > > +/**
> > > + * \var MeanLuminanceAgc::AgcConstraint::bound
> > > + * \brief The type of constraint bound
> > > + */
> > > +
> > > +/**
> > > + * \var MeanLuminanceAgc::AgcConstraint::qLo
> > > + * \brief The lower quantile to use for the constraint
> > > + */
> > > +
> > > +/**
> > > + * \var MeanLuminanceAgc::AgcConstraint::qHi
> > > + * \brief The upper quantile to use for the constraint
> > > + */
> > > +
> > > +/**
> > > + * \var MeanLuminanceAgc::AgcConstraint::yTarget
> > > + * \brief The luminance target for the constraint
> > > + */
> > > +
> > > +/**
> > > + * \class MeanLuminanceAgc
> > > + * \brief a mean-based auto-exposure algorithm
> 
> s/a/A/
> 
> > > + *
> > > + * This algorithm calculates a shutter time, analogue and digital gain such that
> > > + * the normalised mean luminance value of an image is driven towards a target,
> > > + * which itself is discovered from tuning data. The algorithm is a two-stage
> > > + * process:
> 
> s/:/./
> 
> or make the next two paragraph bullet list entries.
> 
> > > + *
> > > + * In the first stage, an initial gain value is derived by iteratively comparing
> > > + * the gain-adjusted mean luminance across an entire image against a target, and
> > > + * selecting a value which pushes it as closely as possible towards the target.
> > > + *
> > > + * In the second stage we calculate the gain required to drive the average of a
> > > + * section of a histogram to a target value, where the target and the boundaries
> > > + * of the section of the histogram used in the calculation are taken from the
> > > + * values defined for the currently configured AeConstraintMode within the
> > > + * tuning data. The gain from the first stage is then clamped to the gain from
> > > + * this stage.

Another thing we need to document is that requirements to use this
class. The IPU3 can provide the required histogram because we compute a
poor man's histogram from the grid of AWB stats, as the ImgU doesn't
give us a histogram. Platforms that can't provide a histogram at all
won't be able to use this. What precision do we need for the histogram
(number of bins, and number of pixels used for the calculation) for it
to be usable with this class ?

> > > + *
> > > + * The final gain is used to adjust the effective exposure value of the image,
> > > + * and that new exposure value divided into shutter time, analogue gain and
> 
> s/divided/is divided/
> 
> > > + * digital gain according to the selected AeExposureMode.
> 
> There should be a mention here that this class handles tuning file
> parsing and expects a particular tuning data format. You can reference
> the function that parses the tuning data for detailed documentation
> about the format.
> 
> > > + */
> > > +
> > > +MeanLuminanceAgc::MeanLuminanceAgc()
> > > +	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
> > > +{
> > > +}
> > > +
> > > +/**
> > > + * \brief Parse the relative luminance target from the tuning data
> > > + * \param[in] tuningData The YamlObject holding the algorithm's tuning data
> > > + */
> > > +void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
> > > +{
> > > +	relativeLuminanceTarget_ =
> > > +		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);
> > 
> > How do you expect this to be computed in the tuning file ?
> > 
> > > +}
> > > +
> > > +/**
> > > + * \brief Parse an AeConstraintMode constraint from tuning data
> > > + * \param[in] modeDict the YamlObject holding the constraint data
> > > + * \param[in] id The constraint ID from AeConstraintModeEnum
> > > + */
> > > +void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
> > > +{
> > > +	for (const auto &[boundName, content] : modeDict.asDict()) {
> > > +		if (boundName != "upper" && boundName != "lower") {
> > > +			LOG(Agc, Warning)
> > > +				<< "Ignoring unknown constraint bound '" << boundName << "'";
> > > +			continue;
> > > +		}
> > > +
> > > +		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
> > > +		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
> > > +		double qLo = content["qLo"].get<double>().value_or(0.98);
> > > +		double qHi = content["qHi"].get<double>().value_or(1.0);
> > > +		double yTarget =
> > > +			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);
> > > +
> > > +		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
> > > +
> > > +		if (!constraintModes_.count(id))
> > > +			constraintModes_[id] = {};
> > > +
> > > +		if (idx)
> > > +			constraintModes_[id].push_back(constraint);
> > > +		else
> > > +			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
> > > +	}
> > > +}
> > > +
> > > +/**
> > > + * \brief Parse tuning data file to populate AeConstraintMode control
> > > + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> > > + *
> > > + * The Agc algorithm's tuning data should contain a dictionary called
> > > + * AeConstraintMode containing per-mode setting dictionaries with the key being
> > > + * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
> > > + * contain either a "lower" or "upper" key, or both, in this format:
> > > + *
> > > + * \code{.unparsed}
> > > + * algorithms:
> > > + *   - Agc:
> > > + *       AeConstraintMode:
> > > + *         ConstraintNormal:
> > > + *           lower:
> > > + *             qLo: 0.98
> > > + *             qHi: 1.0
> > > + *             yTarget: 0.5
> > 
> > Ok, so this ties the tuning file not just to the libcamera controls
> > definition, but to this specific implementation of the algorithm ?
> 
> Tying the algorithm implementation with the tuning data format seems
> unavoidable. I'm slightly concerned about handling the YAML parsing here
> though, as I'm wondering if we can always guarantee that the file will
> not need to diverge somehow (possibly with additional data somewhere
> that would make the shared parser fail), but that's probably such a
> small risk that we can worry about it later.
> 
> > Not that it was not expected, and I think it's fine, as using a
> > libcamera defined control value as 'index' makes sure the applications
> > will deal with the same interface, but this largely conflicts with the
> > idea to have shared parsing for all algorithms in a base class.
> 
> The idea comes straight from the RPi implementation, which expect users
> to customize tuning files with new modes. Indexing into a non standard
> list of modes isn't an API I really like, and that's something I think
> we need to fix eventually.
> 
> > Also, we made AeConstrainModes a 'core control' because at the time
> > when RPi first implemented AGC support there was no alternative to
> > that. Then the RPi implementation has been copied in all other
> > platforms and this is still fine as a 'core control'. This however
> > seems to be a tuning parameter for this specific algorithm
> > implementation, isn't it ?
> 
> I should read your whole reply before writing anything :-)
> 
> > > + *         ConstraintHighlight:
> > > + *           lower:
> > > + *             qLo: 0.98
> > > + *             qHi: 1.0
> > > + *             yTarget: 0.5
> > > + *           upper:
> > > + *             qLo: 0.98
> > > + *             qHi: 1.0
> > > + *             yTarget: 0.8
> > > + *
> > > + * \endcode
> > > + *
> > > + * The parsed dictionaries are used to populate an array of available values for
> > > + * the AeConstraintMode control and stored for later use in the algorithm.
> > > + *
> > > + * \return -EINVAL Where a defined constraint mode is invalid
> > > + * \return 0 on success
> > > + */
> > > +int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
> > > +{
> > > +	std::vector<ControlValue> availableConstraintModes;
> > > +
> > > +	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
> > > +	if (yamlConstraintModes.isDictionary()) {
> > > +		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
> > > +			if (AeConstraintModeNameValueMap.find(modeName) ==
> > > +			    AeConstraintModeNameValueMap.end()) {
> > > +				LOG(Agc, Warning)
> > > +					<< "Skipping unknown constraint mode '" << modeName << "'";
> > > +				continue;
> > > +			}
> > > +
> > > +			if (!modeDict.isDictionary()) {
> > > +				LOG(Agc, Error)
> > > +					<< "Invalid constraint mode '" << modeName << "'";
> > > +				return -EINVAL;
> > > +			}
> > > +
> > > +			parseConstraint(modeDict,
> > > +					AeConstraintModeNameValueMap.at(modeName));
> > > +			availableConstraintModes.push_back(
> > > +				AeConstraintModeNameValueMap.at(modeName));
> > > +		}
> > > +	}
> > > +
> > > +	/*
> > > +	 * If the tuning data file contains no constraints then we use the
> > > +	 * default constraint that the various Agc algorithms were adhering to
> > > +	 * anyway before centralisation.
> > > +	 */
> > > +	if (constraintModes_.empty()) {
> > > +		AgcConstraint constraint = {
> > > +			AgcConstraint::Bound::LOWER,
> > > +			0.98,
> > > +			1.0,
> > > +			0.5
> > > +		};
> > > +
> > > +		constraintModes_[controls::ConstraintNormal].insert(
> > > +			constraintModes_[controls::ConstraintNormal].begin(),
> > > +			constraint);
> > > +		availableConstraintModes.push_back(
> > > +			AeConstraintModeNameValueMap.at("ConstraintNormal"));
> > > +	}
> > > +
> > > +	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
> > > +
> > > +	return 0;
> > > +}
> > > +
> > > +/**
> > > + * \brief Parse tuning data file to populate AeExposureMode control
> > > + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> > > + *
> > > + * The Agc algorithm's tuning data should contain a dictionary called
> > > + * AeExposureMode containing per-mode setting dictionaries with the key being
> > > + * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
> > > + * contain an array of shutter times with the key "shutter" and an array of gain
> > > + * values with the key "gain", in this format:
> > > + *
> > > + * \code{.unparsed}
> > > + * algorithms:
> > > + *   - Agc:
> > > + *       AeExposureMode:
> > 
> > Same reasoning as per the constraints, really.
> > 
> > There's nothing bad here, if not me realizing our controls definition
> > is intimately tied with the algorithms implementation, so I wonder
> > again if even handling things like AeEnable in a common form makes
> > any sense..
> > 
> > Not going to review the actual implementation now, as it comes from
> > the existing ones...
> > 
> > 
> > > + *         ExposureNormal:
> > > + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> > > + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> > > + *         ExposureShort:
> > > + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> > > + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> > > + *
> > > + * \endcode
> > > + *
> > > + * The parsed dictionaries are used to populate an array of available values for
> > > + * the AeExposureMode control and to create ExposureModeHelpers
> > > + *
> > > + * \return -EINVAL Where a defined constraint mode is invalid
> > > + * \return 0 on success
> > > + */
> > > +int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
> > > +{
> > > +	std::vector<ControlValue> availableExposureModes;
> > > +	int ret;
> > > +
> > > +	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
> > > +	if (yamlExposureModes.isDictionary()) {
> > > +		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
> > > +			if (AeExposureModeNameValueMap.find(modeName) ==
> > > +			    AeExposureModeNameValueMap.end()) {
> > > +				LOG(Agc, Warning)
> > > +					<< "Skipping unknown exposure mode '" << modeName << "'";
> > > +				continue;
> > > +			}
> > > +
> > > +			if (!modeValues.isDictionary()) {
> > > +				LOG(Agc, Error)
> > > +					<< "Invalid exposure mode '" << modeName << "'";
> > > +				return -EINVAL;
> > > +			}
> > > +
> > > +			std::vector<uint32_t> shutters =
> > > +				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
> > > +			std::vector<double> gains =
> > > +				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
> > > +
> > > +			std::vector<utils::Duration> shutterDurations;
> > > +			std::transform(shutters.begin(), shutters.end(),
> > > +				std::back_inserter(shutterDurations),
> > > +				[](uint32_t time) { return std::chrono::microseconds(time); });
> > > +
> > > +			std::shared_ptr<ExposureModeHelper> helper =
> > > +				std::make_shared<ExposureModeHelper>();
> > > +			if ((ret = helper->init(shutterDurations, gains) < 0)) {
> > > +				LOG(Agc, Error)
> > > +					<< "Failed to parse exposure mode '" << modeName << "'";
> > > +				return ret;
> > > +			}
> > > +
> > > +			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
> > > +			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
> > > +		}
> > > +	}
> > > +
> > > +	/*
> > > +	 * If we don't have any exposure modes in the tuning data we create an
> > > +	 * ExposureModeHelper using empty shutter time and gain arrays, which
> > > +	 * will then internally simply drive the shutter as high as possible
> > > +	 * before touching gain
> > > +	 */
> > > +	if (availableExposureModes.empty()) {
> > > +		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
> > > +		std::vector<utils::Duration> shutterDurations = {};
> > > +		std::vector<double> gains = {};
> > > +
> > > +		std::shared_ptr<ExposureModeHelper> helper =
> > > +			std::make_shared<ExposureModeHelper>();
> > > +		if ((ret = helper->init(shutterDurations, gains) < 0)) {
> > > +			LOG(Agc, Error)
> > > +				<< "Failed to create default ExposureModeHelper";
> > > +			return ret;
> > > +		}
> > > +
> > > +		exposureModeHelpers_[exposureModeId] = helper;
> > > +		availableExposureModes.push_back(exposureModeId);
> > > +	}
> > > +
> > > +	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
> > > +
> > > +	return 0;
> > > +}
> > > +
> > > +/**
> > > + * \fn MeanLuminanceAgc::constraintModes()
> > > + * \brief Get the constraint modes that have been parsed from tuning data
> > > + */
> > > +
> > > +/**
> > > + * \fn MeanLuminanceAgc::exposureModeHelpers()
> > > + * \brief Get the ExposureModeHelpers that have been parsed from tuning data
> > > + */
> > > +
> > > +/**
> > > + * \fn MeanLuminanceAgc::controls()
> > > + * \brief Get the controls that have been generated after parsing tuning data
> > > + */
> > > +
> > > +/**
> > > + * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
> > > + * \brief Estimate the luminance of an image, adjusted by a given gain
> > > + * \param[in] gain The gain with which to adjust the luminance estimate
> > > + *
> > > + * This function is a pure virtual function because estimation of luminance is a
> > > + * hardware-specific operation, which depends wholly on the format of the stats
> > > + * that are delivered to libcamera from the ISP. Derived classes must implement
> > > + * an overriding function that calculates the normalised mean luminance value
> > > + * across the entire image.
> > > + *
> > > + * \return The normalised relative luminance of the image
> > > + */
> > > +
> > > +/**
> > > + * \brief Estimate the initial gain needed to achieve a relative luminance
> > > + *        target
> > 
> > nit: we don't usually indent in briefs, or in general when breaking
> > lines in doxygen as far as I can tell
> > 
> > Not going
> > 
> > > + *
> > > + * To account for non-linearity caused by saturation, the value needs to be
> > > + * estimated in an iterative process, as multiplying by a gain will not increase
> > > + * the relative luminance by the same factor if some image regions are saturated
> > > + *
> > > + * \return The calculated initial gain
> > > + */
> > > +double MeanLuminanceAgc::estimateInitialGain()
> > > +{
> > > +	double yTarget = relativeLuminanceTarget_;
> > > +	double yGain = 1.0;
> > > +
> > > +	for (unsigned int i = 0; i < 8; i++) {
> > > +		double yValue = estimateLuminance(yGain);
> > > +		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
> > > +
> > > +		yGain *= extra_gain;
> > > +		LOG(Agc, Debug) << "Y value: " << yValue
> > > +				<< ", Y target: " << yTarget
> > > +				<< ", gives gain " << yGain;
> > > +
> > > +		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
> > > +			break;
> > > +	}
> > > +
> > > +	return yGain;
> > > +}
> > > +
> > > +/**
> > > + * \brief Clamp gain within the bounds of a defined constraint
> > > + * \param[in] constraintModeIndex The index of the constraint to adhere to
> > > + * \param[in] hist A histogram over which to calculate inter-quantile means
> > > + * \param[in] gain The gain to clamp
> > > + *
> > > + * \return The gain clamped within the constraint bounds
> > > + */
> > > +double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
> > > +					     const Histogram &hist,
> > > +					     double gain)
> > > +{
> > > +	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
> > > +	for (const AgcConstraint &constraint : constraints) {
> > > +		double newGain = constraint.yTarget * hist.bins() /
> > > +				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
> > > +
> > > +		if (constraint.bound == AgcConstraint::Bound::LOWER &&
> > > +		    newGain > gain)
> > > +			gain = newGain;
> > > +
> > > +		if (constraint.bound == AgcConstraint::Bound::UPPER &&
> > > +		    newGain < gain)
> > > +			gain = newGain;
> > > +	}
> > > +
> > > +	return gain;
> > > +}
> > > +
> > > +/**
> > > + * \brief Apply a filter on the exposure value to limit the speed of changes
> > > + * \param[in] exposureValue The target exposure from the AGC algorithm
> > > + *
> > > + * The speed of the filter is adaptive, and will produce the target quicker
> > > + * during startup, or when the target exposure is within 20% of the most recent
> > > + * filter output.
> > > + *
> > > + * \return The filtered exposure
> > > + */
> > > +utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
> > > +{
> > > +	double speed = 0.2;
> > > +
> > > +	/* Adapt instantly if we are in startup phase. */
> > > +	if (frameCount_ < kNumStartupFrames)
> > > +		speed = 1.0;
> > > +
> > > +	/*
> > > +	 * If we are close to the desired result, go faster to avoid making
> > > +	 * multiple micro-adjustments.
> > > +	 * \todo Make this customisable?
> > > +	 */
> > > +	if (filteredExposure_ < 1.2 * exposureValue &&
> > > +	    filteredExposure_ > 0.8 * exposureValue)
> > > +		speed = sqrt(speed);
> > > +
> > > +	filteredExposure_ = speed * exposureValue +
> > > +			    filteredExposure_ * (1.0 - speed);
> > > +
> > > +	return filteredExposure_;
> > > +}
> > > +
> > > +/**
> > > + * \brief Calculate the new exposure value
> > > + * \param[in] constraintModeIndex The index of the current constraint mode
> > > + * \param[in] exposureModeIndex The index of the current exposure mode
> > > + * \param[in] yHist A Histogram from the ISP statistics to use in constraining
> > > + *	      the calculated gain
> > > + * \param[in] effectiveExposureValue The EV applied to the frame from which the
> > > + *	      statistics in use derive
> > > + *
> > > + * Calculate a new exposure value to try to obtain the target. The calculated
> > > + * exposure value is filtered to prevent rapid changes from frame to frame, and
> > > + * divided into shutter time, analogue and digital gain.
> > > + *
> > > + * \return Tuple of shutter time, analogue gain, and digital gain
> > > + */
> > > +std::tuple<utils::Duration, double, double>
> > > +MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
> > > +				 uint32_t exposureModeIndex,
> > > +				 const Histogram &yHist,
> > > +				 utils::Duration effectiveExposureValue)
> > > +{
> > > +	/*
> > > +	 * The pipeline handler should validate that we have received an allowed
> > > +	 * value for AeExposureMode.
> > > +	 */
> > > +	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
> > > +		exposureModeHelpers_.at(exposureModeIndex);
> > > +
> > > +	double gain = estimateInitialGain();
> > > +	gain = constraintClampGain(constraintModeIndex, yHist, gain);
> > > +
> > > +	/*
> > > +	 * We don't check whether we're already close to the target, because
> > > +	 * even if the effective exposure value is the same as the last frame's
> > > +	 * we could have switched to an exposure mode that would require a new
> > > +	 * pass through the splitExposure() function.
> > > +	 */
> > > +
> > > +	utils::Duration newExposureValue = effectiveExposureValue * gain;
> > > +	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
> > > +					   * exposureModeHelper->maxGain();
> > > +	newExposureValue = std::min(newExposureValue, maxTotalExposure);
> > > +
> > > +	/*
> > > +	 * We filter the exposure value to make sure changes are not too jarring
> > > +	 * from frame to frame.
> > > +	 */
> > > +	newExposureValue = filterExposure(newExposureValue);
> > > +
> > > +	frameCount_++;
> > > +	return exposureModeHelper->splitExposure(newExposureValue);
> > > +}
> > > +
> > > +}; /* namespace ipa */
> > > +
> > > +}; /* namespace libcamera */
> > > diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
> > > new file mode 100644
> > > index 00000000..902a359a
> > > --- /dev/null
> > > +++ b/src/ipa/libipa/agc.h
> > > @@ -0,0 +1,82 @@
> > > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > > +/*
> > > + * Copyright (C) 2024 Ideas on Board Oy
> > > + *
> > > + agc.h - Base class for libipa-compliant AGC algorithms
> > > + */
> > > +
> > > +#pragma once
> > > +
> > > +#include <tuple>
> > > +#include <vector>
> > > +
> > > +#include <libcamera/controls.h>
> > > +
> > > +#include "libcamera/internal/yaml_parser.h"
> > > +
> > > +#include "exposure_mode_helper.h"
> > > +#include "histogram.h"
> > > +
> > > +namespace libcamera {
> > > +
> > > +namespace ipa {
> > > +
> > > +class MeanLuminanceAgc
> > > +{
> > > +public:
> > > +	MeanLuminanceAgc();
> > > +	virtual ~MeanLuminanceAgc() = default;
> 
> No need to make this inline, define the destructor in the .cpp with
> 
> MeanLumimanceAgc::~MeanLuminanceAgc() = default;
> 
> > > +
> > > +	struct AgcConstraint {
> > > +		enum class Bound {
> > > +			LOWER = 0,
> > > +			UPPER = 1
> 
> Wrong coding style.
> 
> > > +		};
> > > +		Bound bound;
> > > +		double qLo;
> > > +		double qHi;
> > > +		double yTarget;
> > > +	};
> > > +
> > > +	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
> > > +	void parseConstraint(const YamlObject &modeDict, int32_t id);
> 
> This function doesn't seem to be called from the derived classes, you
> can make it private. Same for other functions below.
> 
> > > +	int parseConstraintModes(const YamlObject &tuningData);
> > > +	int parseExposureModes(const YamlObject &tuningData);
> 
> Is there a reason to split the parsing in multiple functions, given that
> they are called the same from from both the rkisp1 and ipu3
> implementations ?
> 
> > > +
> > > +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
> > > +	{
> > > +		return constraintModes_;
> > > +	}
> 
> This seems unused, so the AgcConstraint structure can be made private.
> 
> > > +
> > > +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
> > > +	{
> > > +		return exposureModeHelpers_;
> > > +	}
> 
> Can we avoid exposing this too ?
> 
> I never expected there would be multiple helpers when reading the
> documentation of the ExposureModeHelper class. That may be due to
> missing documentation.
> 
> > > +
> > > +	ControlInfoMap::Map controls()
> > > +	{
> > > +		return controls_;
> > > +	}
> 
> Hmmm... We don't have precendents for pushing ControlInfo to algorithms.
> Are we sure we want to go that way ?
> 
> > > +
> > > +	virtual double estimateLuminance(const double gain) = 0;
> > > +	double estimateInitialGain();
> > > +	double constraintClampGain(uint32_t constraintModeIndex,
> > > +				   const Histogram &hist,
> > > +				   double gain);
> > > +	utils::Duration filterExposure(utils::Duration exposureValue);
> > > +	std::tuple<utils::Duration, double, double>
> > > +	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
> > > +		       const Histogram &yHist, utils::Duration effectiveExposureValue);
> 
> Missing blank line.
> 
> > > +private:
> > > +	uint64_t frameCount_;
> > > +	utils::Duration filteredExposure_;
> > > +	double relativeLuminanceTarget_;
> > > +
> > > +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
> > > +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
> > > +	ControlInfoMap::Map controls_;
> > > +};
> > > +
> > > +}; /* namespace ipa */
> > > +
> > > +}; /* namespace libcamera */
> > > diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
> > > index 37fbd177..31cc8d70 100644
> > > --- a/src/ipa/libipa/meson.build
> > > +++ b/src/ipa/libipa/meson.build
> > > @@ -1,6 +1,7 @@
> > >  # SPDX-License-Identifier: CC0-1.0
> > >
> > >  libipa_headers = files([
> > > +    'agc.h',
> > >      'algorithm.h',
> > >      'camera_sensor_helper.h',
> > >      'exposure_mode_helper.h',
> > > @@ -10,6 +11,7 @@ libipa_headers = files([
> > >  ])
> > >
> > >  libipa_sources = files([
> > > +    'agc.cpp',
> > >      'algorithm.cpp',
> > >      'camera_sensor_helper.cpp',
> > >      'exposure_mode_helper.cpp',
> 
> -- 
> Regards,
> 
> Laurent Pinchart
Daniel Scally April 8, 2024, 10:57 a.m. UTC | #6
Hi Jacopo, Laurent

On 06/04/2024 02:07, Laurent Pinchart wrote:
> Hello,
>
> On Mon, Mar 25, 2024 at 09:16:00PM +0100, Jacopo Mondi wrote:
>> On Fri, Mar 22, 2024 at 01:14:45PM +0000, Daniel Scally wrote:
>>> The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
>>> very large part; following the Rpi IPA's algorithm in spirit with a
>>> few tunable values in that IPA being hardcoded in the libipa ones.
>>> Add a new base class for MeanLuminanceAgc which implements the same
>> nit: I would rather call this one AgcMeanLuminance.
> And rename the files accordingly ?
>
>> One other note, not sure if applicable here, from the experience of
>> trying to upstream an auto-focus algorithm for RkISP1. We had there a
>> base class to define the algorithm interface, one derived class for
>> the common calculation and one platform-specific part for statistics
>> collection and IPA module interfacing.
>>
>> The base class was there mostly to handle the algorithm state machine
>> by handling the controls that influence the algorithm behaviour. In
>> case of AEGC I can think, in example, about handling the switch
>> between enable/disable of auto-mode (and consequentially handling a
>> manually set ExposureTime and AnalogueGain), switching between
>> different ExposureModes etc
>>
>> This is the last attempt for AF
>> https://patchwork.libcamera.org/patch/18510/
>>
>> and I'm wondering if it would be desirable to abstract away from the
>> MeanLuminance part the part that is tightly coupled with the
>> libcamera's controls definition.
>>
>> Let's make a concrete example: look how the rkisp1 and the ipu3 agc
>> implementations handle AeEnabled (or better, look at how the rkisp1
>> does that and the IPU3 does not).
>>
>> Ideally, my goal would be to abstract the handling of the control and
>> all the state machine that decides if the manual or auto-computed
>> values should be used to an AEGCAlgorithm base class.
>>
>> Your series does that for the tuning file parsing already but does
>> that in the MeanLuminance method implementation, while it should or
>> could be common to all AEGC methods.
>>
>> One very interesting experiment could be starting with this and then
>> plumb-in AeEnable support in the IPU3 in example and move everything
>> common to a base class.


I agree that conceptually that seems like a good direction of travel.

>>
>>> algorithm and additionally parses yaml tuning files to inform an IPA
>>> module's Agc algorithm about valid constraint and exposure modes and
>>> their associated bounds.
>>>
>>> Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
>>> ---
>>>   src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
>>>   src/ipa/libipa/agc.h       |  82 ++++++
>>>   src/ipa/libipa/meson.build |   2 +
>>>   3 files changed, 610 insertions(+)
>>>   create mode 100644 src/ipa/libipa/agc.cpp
>>>   create mode 100644 src/ipa/libipa/agc.h
>>>
>>> diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
>>> new file mode 100644
>>> index 00000000..af57a571
>>> --- /dev/null
>>> +++ b/src/ipa/libipa/agc.cpp
>>> @@ -0,0 +1,526 @@
>>> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
>>> +/*
>>> + * Copyright (C) 2024 Ideas on Board Oy
>>> + *
>>> + * agc.cpp - Base class for libipa-compliant AGC algorithms
> We could have different AGC algorithms that are all libipa-compliant.
> This is one particular AGC algorithm, right ?


Yes - the filenames do need to change.

>
>>> + */
>>> +
>>> +#include "agc.h"
>>> +
>>> +#include <cmath>
>>> +
>>> +#include <libcamera/base/log.h>
>>> +#include <libcamera/control_ids.h>
>>> +
>>> +#include "exposure_mode_helper.h"
>>> +
>>> +using namespace libcamera::controls;
>>> +
>>> +/**
>>> + * \file agc.h
>>> + * \brief Base class implementing mean luminance AEGC.
>> nit: not '.' at the end of briefs
>>
>>> + */
>>> +
>>> +namespace libcamera {
>>> +
>>> +using namespace std::literals::chrono_literals;
>>> +
>>> +LOG_DEFINE_CATEGORY(Agc)
>>> +
>>> +namespace ipa {
>>> +
>>> +/*
>>> + * Number of frames to wait before calculating stats on minimum exposure
> This doesn't seem to match what the value is used for.
>
>>> + * \todo should this be a tunable value?
>> Does this depend on the ISP (comes from IPA), the sensor (comes from
>> tuning file) or... both ? :)
>>
>>> + */
>>> +static constexpr uint32_t kNumStartupFrames = 10;
>>> +
>>> +/*
>>> + * Default relative luminance target
>>> + *
>>> + * This value should be chosen so that when the camera points at a grey target,
>>> + * the resulting image brightness looks "right". Custom values can be passed
>>> + * as the relativeLuminanceTarget value in sensor tuning files.
>>> + */
>>> +static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
>>> +
>>> +/**
>>> + * \struct MeanLuminanceAgc::AgcConstraint
>>> + * \brief The boundaries and target for an AeConstraintMode constraint
>>> + *
>>> + * This structure describes an AeConstraintMode constraint for the purposes of
>>> + * this algorithm. The algorithm will apply the constraints by calculating the
>>> + * Histogram's inter-quantile mean between the given quantiles and ensure that
>>> + * the resulting value is the right side of the given target (as defined by the
>>> + * boundary and luminance target).
>>> + */
>> Here, in example.
>>
>> controls::AeConstraintMode and the supported values are defined as
>> (core|vendor) controls in control_ids_*.yaml
>>
>> The tuning file expresses the constraint modes using the Control
>> definition (I wonder if this has always been like this) but it
>> definitely ties the tuning file to the controls definition.
>>
>> Applications use controls::AeConstraintMode to select one of the
>> constrained modes to have the algorithm use it.
>>
>> In all of this, how much is part of the MeanLuminance implementation
>> and how much shared between possibly multiple implementations ?
>>
>>> +
>>> +/**
>>> + * \enum MeanLuminanceAgc::AgcConstraint::Bound
>>> + * \brief Specify whether the constraint defines a lower or upper bound
>>> + * \var MeanLuminanceAgc::AgcConstraint::LOWER
>>> + * \brief The constraint defines a lower bound
>>> + * \var MeanLuminanceAgc::AgcConstraint::UPPER
>>> + * \brief The constraint defines an upper bound
>>> + */
>>> +
>>> +/**
>>> + * \var MeanLuminanceAgc::AgcConstraint::bound
>>> + * \brief The type of constraint bound
>>> + */
>>> +
>>> +/**
>>> + * \var MeanLuminanceAgc::AgcConstraint::qLo
>>> + * \brief The lower quantile to use for the constraint
>>> + */
>>> +
>>> +/**
>>> + * \var MeanLuminanceAgc::AgcConstraint::qHi
>>> + * \brief The upper quantile to use for the constraint
>>> + */
>>> +
>>> +/**
>>> + * \var MeanLuminanceAgc::AgcConstraint::yTarget
>>> + * \brief The luminance target for the constraint
>>> + */
>>> +
>>> +/**
>>> + * \class MeanLuminanceAgc
>>> + * \brief a mean-based auto-exposure algorithm
> s/a/A/
>
>>> + *
>>> + * This algorithm calculates a shutter time, analogue and digital gain such that
>>> + * the normalised mean luminance value of an image is driven towards a target,
>>> + * which itself is discovered from tuning data. The algorithm is a two-stage
>>> + * process:
> s/:/./
>
> or make the next two paragraph bullet list entries.
>
>>> + *
>>> + * In the first stage, an initial gain value is derived by iteratively comparing
>>> + * the gain-adjusted mean luminance across an entire image against a target, and
>>> + * selecting a value which pushes it as closely as possible towards the target.
>>> + *
>>> + * In the second stage we calculate the gain required to drive the average of a
>>> + * section of a histogram to a target value, where the target and the boundaries
>>> + * of the section of the histogram used in the calculation are taken from the
>>> + * values defined for the currently configured AeConstraintMode within the
>>> + * tuning data. The gain from the first stage is then clamped to the gain from
>>> + * this stage.
>>> + *
>>> + * The final gain is used to adjust the effective exposure value of the image,
>>> + * and that new exposure value divided into shutter time, analogue gain and
> s/divided/is divided/
>
>>> + * digital gain according to the selected AeExposureMode.
> There should be a mention here that this class handles tuning file
> parsing and expects a particular tuning data format. You can reference
> the function that parses the tuning data for detailed documentation
> about the format.
>
>>> + */
>>> +
>>> +MeanLuminanceAgc::MeanLuminanceAgc()
>>> +	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
>>> +{
>>> +}
>>> +
>>> +/**
>>> + * \brief Parse the relative luminance target from the tuning data
>>> + * \param[in] tuningData The YamlObject holding the algorithm's tuning data
>>> + */
>>> +void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
>>> +{
>>> +	relativeLuminanceTarget_ =
>>> +		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);
>> How do you expect this to be computed in the tuning file ?
>>
>>> +}
>>> +
>>> +/**
>>> + * \brief Parse an AeConstraintMode constraint from tuning data
>>> + * \param[in] modeDict the YamlObject holding the constraint data
>>> + * \param[in] id The constraint ID from AeConstraintModeEnum
>>> + */
>>> +void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
>>> +{
>>> +	for (const auto &[boundName, content] : modeDict.asDict()) {
>>> +		if (boundName != "upper" && boundName != "lower") {
>>> +			LOG(Agc, Warning)
>>> +				<< "Ignoring unknown constraint bound '" << boundName << "'";
>>> +			continue;
>>> +		}
>>> +
>>> +		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
>>> +		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
>>> +		double qLo = content["qLo"].get<double>().value_or(0.98);
>>> +		double qHi = content["qHi"].get<double>().value_or(1.0);
>>> +		double yTarget =
>>> +			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);
>>> +
>>> +		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
>>> +
>>> +		if (!constraintModes_.count(id))
>>> +			constraintModes_[id] = {};
>>> +
>>> +		if (idx)
>>> +			constraintModes_[id].push_back(constraint);
>>> +		else
>>> +			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
>>> +	}
>>> +}
>>> +
>>> +/**
>>> + * \brief Parse tuning data file to populate AeConstraintMode control
>>> + * \param[in] tuningData the YamlObject representing the tuning data for Agc
>>> + *
>>> + * The Agc algorithm's tuning data should contain a dictionary called
>>> + * AeConstraintMode containing per-mode setting dictionaries with the key being
>>> + * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
>>> + * contain either a "lower" or "upper" key, or both, in this format:
>>> + *
>>> + * \code{.unparsed}
>>> + * algorithms:
>>> + *   - Agc:
>>> + *       AeConstraintMode:
>>> + *         ConstraintNormal:
>>> + *           lower:
>>> + *             qLo: 0.98
>>> + *             qHi: 1.0
>>> + *             yTarget: 0.5
>> Ok, so this ties the tuning file not just to the libcamera controls
>> definition, but to this specific implementation of the algorithm ?
> Tying the algorithm implementation with the tuning data format seems
> unavoidable. I'm slightly concerned about handling the YAML parsing here
> though, as I'm wondering if we can always guarantee that the file will
> not need to diverge somehow (possibly with additional data somewhere
> that would make the shared parser fail), but that's probably such a
> small risk that we can worry about it later.
>
>> Not that it was not expected, and I think it's fine, as using a
>> libcamera defined control value as 'index' makes sure the applications
>> will deal with the same interface, but this largely conflicts with the
>> idea to have shared parsing for all algorithms in a base class.
> The idea comes straight from the RPi implementation, which expect users
> to customize tuning files with new modes. Indexing into a non standard
> list of modes isn't an API I really like, and that's something I think
> we need to fix eventually.
>
>> Also, we made AeConstrainModes a 'core control' because at the time
>> when RPi first implemented AGC support there was no alternative to
>> that. Then the RPi implementation has been copied in all other
>> platforms and this is still fine as a 'core control'. This however
>> seems to be a tuning parameter for this specific algorithm
>> implementation, isn't it ?
> I should read your whole reply before writing anything :-)


We (Jacopo, Stefan and I) discussed this a little in a call a while ago too; I think that although 
the ExposureMode control and probably AeEnable could be centralised across all implementations of 
Agc, the ConstraintMode and the luminance target are really quite specific to this mean luminance 
algorithm and so parsing the settings for those probably does need to live here. For the same reason 
the question below about whether it's right to pass a ControlInfoMap::Map to the algorithms is, I 
think, necessary - the controls that get instantiated might depend on the version of an algorithm 
that is "selected" by the tuning file.


In recognition of that, in addition to renaming the file to AgcMeanLuminance I think that the key 
for the parameters for the algorithm in the tuning files will need to change to that too.

>
>>> + *         ConstraintHighlight:
>>> + *           lower:
>>> + *             qLo: 0.98
>>> + *             qHi: 1.0
>>> + *             yTarget: 0.5
>>> + *           upper:
>>> + *             qLo: 0.98
>>> + *             qHi: 1.0
>>> + *             yTarget: 0.8
>>> + *
>>> + * \endcode
>>> + *
>>> + * The parsed dictionaries are used to populate an array of available values for
>>> + * the AeConstraintMode control and stored for later use in the algorithm.
>>> + *
>>> + * \return -EINVAL Where a defined constraint mode is invalid
>>> + * \return 0 on success
>>> + */
>>> +int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
>>> +{
>>> +	std::vector<ControlValue> availableConstraintModes;
>>> +
>>> +	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
>>> +	if (yamlConstraintModes.isDictionary()) {
>>> +		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
>>> +			if (AeConstraintModeNameValueMap.find(modeName) ==
>>> +			    AeConstraintModeNameValueMap.end()) {
>>> +				LOG(Agc, Warning)
>>> +					<< "Skipping unknown constraint mode '" << modeName << "'";
>>> +				continue;
>>> +			}
>>> +
>>> +			if (!modeDict.isDictionary()) {
>>> +				LOG(Agc, Error)
>>> +					<< "Invalid constraint mode '" << modeName << "'";
>>> +				return -EINVAL;
>>> +			}
>>> +
>>> +			parseConstraint(modeDict,
>>> +					AeConstraintModeNameValueMap.at(modeName));
>>> +			availableConstraintModes.push_back(
>>> +				AeConstraintModeNameValueMap.at(modeName));
>>> +		}
>>> +	}
>>> +
>>> +	/*
>>> +	 * If the tuning data file contains no constraints then we use the
>>> +	 * default constraint that the various Agc algorithms were adhering to
>>> +	 * anyway before centralisation.
>>> +	 */
>>> +	if (constraintModes_.empty()) {
>>> +		AgcConstraint constraint = {
>>> +			AgcConstraint::Bound::LOWER,
>>> +			0.98,
>>> +			1.0,
>>> +			0.5
>>> +		};
>>> +
>>> +		constraintModes_[controls::ConstraintNormal].insert(
>>> +			constraintModes_[controls::ConstraintNormal].begin(),
>>> +			constraint);
>>> +		availableConstraintModes.push_back(
>>> +			AeConstraintModeNameValueMap.at("ConstraintNormal"));
>>> +	}
>>> +
>>> +	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
>>> +
>>> +	return 0;
>>> +}
>>> +
>>> +/**
>>> + * \brief Parse tuning data file to populate AeExposureMode control
>>> + * \param[in] tuningData the YamlObject representing the tuning data for Agc
>>> + *
>>> + * The Agc algorithm's tuning data should contain a dictionary called
>>> + * AeExposureMode containing per-mode setting dictionaries with the key being
>>> + * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
>>> + * contain an array of shutter times with the key "shutter" and an array of gain
>>> + * values with the key "gain", in this format:
>>> + *
>>> + * \code{.unparsed}
>>> + * algorithms:
>>> + *   - Agc:
>>> + *       AeExposureMode:
>> Same reasoning as per the constraints, really.
>>
>> There's nothing bad here, if not me realizing our controls definition
>> is intimately tied with the algorithms implementation, so I wonder
>> again if even handling things like AeEnable in a common form makes
>> any sense..
>>
>> Not going to review the actual implementation now, as it comes from
>> the existing ones...
>>
>>
>>> + *         ExposureNormal:
>>> + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
>>> + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
>>> + *         ExposureShort:
>>> + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
>>> + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
>>> + *
>>> + * \endcode
>>> + *
>>> + * The parsed dictionaries are used to populate an array of available values for
>>> + * the AeExposureMode control and to create ExposureModeHelpers
>>> + *
>>> + * \return -EINVAL Where a defined constraint mode is invalid
>>> + * \return 0 on success
>>> + */
>>> +int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
>>> +{
>>> +	std::vector<ControlValue> availableExposureModes;
>>> +	int ret;
>>> +
>>> +	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
>>> +	if (yamlExposureModes.isDictionary()) {
>>> +		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
>>> +			if (AeExposureModeNameValueMap.find(modeName) ==
>>> +			    AeExposureModeNameValueMap.end()) {
>>> +				LOG(Agc, Warning)
>>> +					<< "Skipping unknown exposure mode '" << modeName << "'";
>>> +				continue;
>>> +			}
>>> +
>>> +			if (!modeValues.isDictionary()) {
>>> +				LOG(Agc, Error)
>>> +					<< "Invalid exposure mode '" << modeName << "'";
>>> +				return -EINVAL;
>>> +			}
>>> +
>>> +			std::vector<uint32_t> shutters =
>>> +				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
>>> +			std::vector<double> gains =
>>> +				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
>>> +
>>> +			std::vector<utils::Duration> shutterDurations;
>>> +			std::transform(shutters.begin(), shutters.end(),
>>> +				std::back_inserter(shutterDurations),
>>> +				[](uint32_t time) { return std::chrono::microseconds(time); });
>>> +
>>> +			std::shared_ptr<ExposureModeHelper> helper =
>>> +				std::make_shared<ExposureModeHelper>();
>>> +			if ((ret = helper->init(shutterDurations, gains) < 0)) {
>>> +				LOG(Agc, Error)
>>> +					<< "Failed to parse exposure mode '" << modeName << "'";
>>> +				return ret;
>>> +			}
>>> +
>>> +			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
>>> +			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
>>> +		}
>>> +	}
>>> +
>>> +	/*
>>> +	 * If we don't have any exposure modes in the tuning data we create an
>>> +	 * ExposureModeHelper using empty shutter time and gain arrays, which
>>> +	 * will then internally simply drive the shutter as high as possible
>>> +	 * before touching gain
>>> +	 */
>>> +	if (availableExposureModes.empty()) {
>>> +		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
>>> +		std::vector<utils::Duration> shutterDurations = {};
>>> +		std::vector<double> gains = {};
>>> +
>>> +		std::shared_ptr<ExposureModeHelper> helper =
>>> +			std::make_shared<ExposureModeHelper>();
>>> +		if ((ret = helper->init(shutterDurations, gains) < 0)) {
>>> +			LOG(Agc, Error)
>>> +				<< "Failed to create default ExposureModeHelper";
>>> +			return ret;
>>> +		}
>>> +
>>> +		exposureModeHelpers_[exposureModeId] = helper;
>>> +		availableExposureModes.push_back(exposureModeId);
>>> +	}
>>> +
>>> +	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
>>> +
>>> +	return 0;
>>> +}
>>> +
>>> +/**
>>> + * \fn MeanLuminanceAgc::constraintModes()
>>> + * \brief Get the constraint modes that have been parsed from tuning data
>>> + */
>>> +
>>> +/**
>>> + * \fn MeanLuminanceAgc::exposureModeHelpers()
>>> + * \brief Get the ExposureModeHelpers that have been parsed from tuning data
>>> + */
>>> +
>>> +/**
>>> + * \fn MeanLuminanceAgc::controls()
>>> + * \brief Get the controls that have been generated after parsing tuning data
>>> + */
>>> +
>>> +/**
>>> + * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
>>> + * \brief Estimate the luminance of an image, adjusted by a given gain
>>> + * \param[in] gain The gain with which to adjust the luminance estimate
>>> + *
>>> + * This function is a pure virtual function because estimation of luminance is a
>>> + * hardware-specific operation, which depends wholly on the format of the stats
>>> + * that are delivered to libcamera from the ISP. Derived classes must implement
>>> + * an overriding function that calculates the normalised mean luminance value
>>> + * across the entire image.
>>> + *
>>> + * \return The normalised relative luminance of the image
>>> + */
>>> +
>>> +/**
>>> + * \brief Estimate the initial gain needed to achieve a relative luminance
>>> + *        target
>> nit: we don't usually indent in briefs, or in general when breaking
>> lines in doxygen as far as I can tell
>>
>> Not going
>>
>>> + *
>>> + * To account for non-linearity caused by saturation, the value needs to be
>>> + * estimated in an iterative process, as multiplying by a gain will not increase
>>> + * the relative luminance by the same factor if some image regions are saturated
>>> + *
>>> + * \return The calculated initial gain
>>> + */
>>> +double MeanLuminanceAgc::estimateInitialGain()
>>> +{
>>> +	double yTarget = relativeLuminanceTarget_;
>>> +	double yGain = 1.0;
>>> +
>>> +	for (unsigned int i = 0; i < 8; i++) {
>>> +		double yValue = estimateLuminance(yGain);
>>> +		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
>>> +
>>> +		yGain *= extra_gain;
>>> +		LOG(Agc, Debug) << "Y value: " << yValue
>>> +				<< ", Y target: " << yTarget
>>> +				<< ", gives gain " << yGain;
>>> +
>>> +		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
>>> +			break;
>>> +	}
>>> +
>>> +	return yGain;
>>> +}
>>> +
>>> +/**
>>> + * \brief Clamp gain within the bounds of a defined constraint
>>> + * \param[in] constraintModeIndex The index of the constraint to adhere to
>>> + * \param[in] hist A histogram over which to calculate inter-quantile means
>>> + * \param[in] gain The gain to clamp
>>> + *
>>> + * \return The gain clamped within the constraint bounds
>>> + */
>>> +double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
>>> +					     const Histogram &hist,
>>> +					     double gain)
>>> +{
>>> +	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
>>> +	for (const AgcConstraint &constraint : constraints) {
>>> +		double newGain = constraint.yTarget * hist.bins() /
>>> +				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
>>> +
>>> +		if (constraint.bound == AgcConstraint::Bound::LOWER &&
>>> +		    newGain > gain)
>>> +			gain = newGain;
>>> +
>>> +		if (constraint.bound == AgcConstraint::Bound::UPPER &&
>>> +		    newGain < gain)
>>> +			gain = newGain;
>>> +	}
>>> +
>>> +	return gain;
>>> +}
>>> +
>>> +/**
>>> + * \brief Apply a filter on the exposure value to limit the speed of changes
>>> + * \param[in] exposureValue The target exposure from the AGC algorithm
>>> + *
>>> + * The speed of the filter is adaptive, and will produce the target quicker
>>> + * during startup, or when the target exposure is within 20% of the most recent
>>> + * filter output.
>>> + *
>>> + * \return The filtered exposure
>>> + */
>>> +utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
>>> +{
>>> +	double speed = 0.2;
>>> +
>>> +	/* Adapt instantly if we are in startup phase. */
>>> +	if (frameCount_ < kNumStartupFrames)
>>> +		speed = 1.0;
>>> +
>>> +	/*
>>> +	 * If we are close to the desired result, go faster to avoid making
>>> +	 * multiple micro-adjustments.
>>> +	 * \todo Make this customisable?
>>> +	 */
>>> +	if (filteredExposure_ < 1.2 * exposureValue &&
>>> +	    filteredExposure_ > 0.8 * exposureValue)
>>> +		speed = sqrt(speed);
>>> +
>>> +	filteredExposure_ = speed * exposureValue +
>>> +			    filteredExposure_ * (1.0 - speed);
>>> +
>>> +	return filteredExposure_;
>>> +}
>>> +
>>> +/**
>>> + * \brief Calculate the new exposure value
>>> + * \param[in] constraintModeIndex The index of the current constraint mode
>>> + * \param[in] exposureModeIndex The index of the current exposure mode
>>> + * \param[in] yHist A Histogram from the ISP statistics to use in constraining
>>> + *	      the calculated gain
>>> + * \param[in] effectiveExposureValue The EV applied to the frame from which the
>>> + *	      statistics in use derive
>>> + *
>>> + * Calculate a new exposure value to try to obtain the target. The calculated
>>> + * exposure value is filtered to prevent rapid changes from frame to frame, and
>>> + * divided into shutter time, analogue and digital gain.
>>> + *
>>> + * \return Tuple of shutter time, analogue gain, and digital gain
>>> + */
>>> +std::tuple<utils::Duration, double, double>
>>> +MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
>>> +				 uint32_t exposureModeIndex,
>>> +				 const Histogram &yHist,
>>> +				 utils::Duration effectiveExposureValue)
>>> +{
>>> +	/*
>>> +	 * The pipeline handler should validate that we have received an allowed
>>> +	 * value for AeExposureMode.
>>> +	 */
>>> +	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
>>> +		exposureModeHelpers_.at(exposureModeIndex);
>>> +
>>> +	double gain = estimateInitialGain();
>>> +	gain = constraintClampGain(constraintModeIndex, yHist, gain);
>>> +
>>> +	/*
>>> +	 * We don't check whether we're already close to the target, because
>>> +	 * even if the effective exposure value is the same as the last frame's
>>> +	 * we could have switched to an exposure mode that would require a new
>>> +	 * pass through the splitExposure() function.
>>> +	 */
>>> +
>>> +	utils::Duration newExposureValue = effectiveExposureValue * gain;
>>> +	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
>>> +					   * exposureModeHelper->maxGain();
>>> +	newExposureValue = std::min(newExposureValue, maxTotalExposure);
>>> +
>>> +	/*
>>> +	 * We filter the exposure value to make sure changes are not too jarring
>>> +	 * from frame to frame.
>>> +	 */
>>> +	newExposureValue = filterExposure(newExposureValue);
>>> +
>>> +	frameCount_++;
>>> +	return exposureModeHelper->splitExposure(newExposureValue);
>>> +}
>>> +
>>> +}; /* namespace ipa */
>>> +
>>> +}; /* namespace libcamera */
>>> diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
>>> new file mode 100644
>>> index 00000000..902a359a
>>> --- /dev/null
>>> +++ b/src/ipa/libipa/agc.h
>>> @@ -0,0 +1,82 @@
>>> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
>>> +/*
>>> + * Copyright (C) 2024 Ideas on Board Oy
>>> + *
>>> + agc.h - Base class for libipa-compliant AGC algorithms
>>> + */
>>> +
>>> +#pragma once
>>> +
>>> +#include <tuple>
>>> +#include <vector>
>>> +
>>> +#include <libcamera/controls.h>
>>> +
>>> +#include "libcamera/internal/yaml_parser.h"
>>> +
>>> +#include "exposure_mode_helper.h"
>>> +#include "histogram.h"
>>> +
>>> +namespace libcamera {
>>> +
>>> +namespace ipa {
>>> +
>>> +class MeanLuminanceAgc
>>> +{
>>> +public:
>>> +	MeanLuminanceAgc();
>>> +	virtual ~MeanLuminanceAgc() = default;
> No need to make this inline, define the destructor in the .cpp with
>
> MeanLumimanceAgc::~MeanLuminanceAgc() = default;
>
>>> +
>>> +	struct AgcConstraint {
>>> +		enum class Bound {
>>> +			LOWER = 0,
>>> +			UPPER = 1
> Wrong coding style.
>
>>> +		};
>>> +		Bound bound;
>>> +		double qLo;
>>> +		double qHi;
>>> +		double yTarget;
>>> +	};
>>> +
>>> +	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
>>> +	void parseConstraint(const YamlObject &modeDict, int32_t id);
> This function doesn't seem to be called from the derived classes, you
> can make it private. Same for other functions below.
>
>>> +	int parseConstraintModes(const YamlObject &tuningData);
>>> +	int parseExposureModes(const YamlObject &tuningData);
> Is there a reason to split the parsing in multiple functions, given that
> they are called the same from from both the rkisp1 and ipu3
> implementations ?
>
>>> +
>>> +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
>>> +	{
>>> +		return constraintModes_;
>>> +	}
> This seems unused, so the AgcConstraint structure can be made private.
>
>>> +
>>> +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
>>> +	{
>>> +		return exposureModeHelpers_;
>>> +	}
> Can we avoid exposing this too ?
>
> I never expected there would be multiple helpers when reading the
> documentation of the ExposureModeHelper class. That may be due to
> missing documentation.
>
>>> +
>>> +	ControlInfoMap::Map controls()
>>> +	{
>>> +		return controls_;
>>> +	}
> Hmmm... We don't have precendents for pushing ControlInfo to algorithms.
> Are we sure we want to go that way ?
>
>>> +
>>> +	virtual double estimateLuminance(const double gain) = 0;
>>> +	double estimateInitialGain();
>>> +	double constraintClampGain(uint32_t constraintModeIndex,
>>> +				   const Histogram &hist,
>>> +				   double gain);
>>> +	utils::Duration filterExposure(utils::Duration exposureValue);
>>> +	std::tuple<utils::Duration, double, double>
>>> +	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
>>> +		       const Histogram &yHist, utils::Duration effectiveExposureValue);
> Missing blank line.
>
>>> +private:
>>> +	uint64_t frameCount_;
>>> +	utils::Duration filteredExposure_;
>>> +	double relativeLuminanceTarget_;
>>> +
>>> +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
>>> +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
>>> +	ControlInfoMap::Map controls_;
>>> +};
>>> +
>>> +}; /* namespace ipa */
>>> +
>>> +}; /* namespace libcamera */
>>> diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
>>> index 37fbd177..31cc8d70 100644
>>> --- a/src/ipa/libipa/meson.build
>>> +++ b/src/ipa/libipa/meson.build
>>> @@ -1,6 +1,7 @@
>>>   # SPDX-License-Identifier: CC0-1.0
>>>
>>>   libipa_headers = files([
>>> +    'agc.h',
>>>       'algorithm.h',
>>>       'camera_sensor_helper.h',
>>>       'exposure_mode_helper.h',
>>> @@ -10,6 +11,7 @@ libipa_headers = files([
>>>   ])
>>>
>>>   libipa_sources = files([
>>> +    'agc.cpp',
>>>       'algorithm.cpp',
>>>       'camera_sensor_helper.cpp',
>>>       'exposure_mode_helper.cpp',
Daniel Scally April 8, 2024, 11:09 a.m. UTC | #7
Hi Laurent

On 06/04/2024 02:27, Laurent Pinchart wrote:
> On Sat, Apr 06, 2024 at 04:07:52AM +0300, Laurent Pinchart wrote:
>> Hello,
>>
>> On Mon, Mar 25, 2024 at 09:16:00PM +0100, Jacopo Mondi wrote:
>>> On Fri, Mar 22, 2024 at 01:14:45PM +0000, Daniel Scally wrote:
>>>> The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
>>>> very large part; following the Rpi IPA's algorithm in spirit with a
>>>> few tunable values in that IPA being hardcoded in the libipa ones.
>>>> Add a new base class for MeanLuminanceAgc which implements the same
>>> nit: I would rather call this one AgcMeanLuminance.
>> And rename the files accordingly ?
>>
>>> One other note, not sure if applicable here, from the experience of
>>> trying to upstream an auto-focus algorithm for RkISP1. We had there a
>>> base class to define the algorithm interface, one derived class for
>>> the common calculation and one platform-specific part for statistics
>>> collection and IPA module interfacing.
>>>
>>> The base class was there mostly to handle the algorithm state machine
>>> by handling the controls that influence the algorithm behaviour. In
>>> case of AEGC I can think, in example, about handling the switch
>>> between enable/disable of auto-mode (and consequentially handling a
>>> manually set ExposureTime and AnalogueGain), switching between
>>> different ExposureModes etc
>>>
>>> This is the last attempt for AF
>>> https://patchwork.libcamera.org/patch/18510/
>>>
>>> and I'm wondering if it would be desirable to abstract away from the
>>> MeanLuminance part the part that is tightly coupled with the
>>> libcamera's controls definition.
>>>
>>> Let's make a concrete example: look how the rkisp1 and the ipu3 agc
>>> implementations handle AeEnabled (or better, look at how the rkisp1
>>> does that and the IPU3 does not).
>>>
>>> Ideally, my goal would be to abstract the handling of the control and
>>> all the state machine that decides if the manual or auto-computed
>>> values should be used to an AEGCAlgorithm base class.
>>>
>>> Your series does that for the tuning file parsing already but does
>>> that in the MeanLuminance method implementation, while it should or
>>> could be common to all AEGC methods.
>>>
>>> One very interesting experiment could be starting with this and then
>>> plumb-in AeEnable support in the IPU3 in example and move everything
>>> common to a base class.
>>>
>>>> algorithm and additionally parses yaml tuning files to inform an IPA
>>>> module's Agc algorithm about valid constraint and exposure modes and
>>>> their associated bounds.
>>>>
>>>> Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
>>>> ---
>>>>   src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
>>>>   src/ipa/libipa/agc.h       |  82 ++++++
>>>>   src/ipa/libipa/meson.build |   2 +
>>>>   3 files changed, 610 insertions(+)
>>>>   create mode 100644 src/ipa/libipa/agc.cpp
>>>>   create mode 100644 src/ipa/libipa/agc.h
>>>>
>>>> diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
>>>> new file mode 100644
>>>> index 00000000..af57a571
>>>> --- /dev/null
>>>> +++ b/src/ipa/libipa/agc.cpp
>>>> @@ -0,0 +1,526 @@
>>>> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
>>>> +/*
>>>> + * Copyright (C) 2024 Ideas on Board Oy
>>>> + *
>>>> + * agc.cpp - Base class for libipa-compliant AGC algorithms
>> We could have different AGC algorithms that are all libipa-compliant.
>> This is one particular AGC algorithm, right ?
>>
>>>> + */
>>>> +
>>>> +#include "agc.h"
>>>> +
>>>> +#include <cmath>
>>>> +
>>>> +#include <libcamera/base/log.h>
>>>> +#include <libcamera/control_ids.h>
>>>> +
>>>> +#include "exposure_mode_helper.h"
>>>> +
>>>> +using namespace libcamera::controls;
>>>> +
>>>> +/**
>>>> + * \file agc.h
>>>> + * \brief Base class implementing mean luminance AEGC.
>>> nit: not '.' at the end of briefs
>>>
>>>> + */
>>>> +
>>>> +namespace libcamera {
>>>> +
>>>> +using namespace std::literals::chrono_literals;
>>>> +
>>>> +LOG_DEFINE_CATEGORY(Agc)
>>>> +
>>>> +namespace ipa {
>>>> +
>>>> +/*
>>>> + * Number of frames to wait before calculating stats on minimum exposure
>> This doesn't seem to match what the value is used for.
>>
>>>> + * \todo should this be a tunable value?
>>> Does this depend on the ISP (comes from IPA), the sensor (comes from
>>> tuning file) or... both ? :)
>>>
>>>> + */
>>>> +static constexpr uint32_t kNumStartupFrames = 10;
>>>> +
>>>> +/*
>>>> + * Default relative luminance target
>>>> + *
>>>> + * This value should be chosen so that when the camera points at a grey target,
>>>> + * the resulting image brightness looks "right". Custom values can be passed
>>>> + * as the relativeLuminanceTarget value in sensor tuning files.
>>>> + */
>>>> +static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
>>>> +
>>>> +/**
>>>> + * \struct MeanLuminanceAgc::AgcConstraint
>>>> + * \brief The boundaries and target for an AeConstraintMode constraint
>>>> + *
>>>> + * This structure describes an AeConstraintMode constraint for the purposes of
>>>> + * this algorithm. The algorithm will apply the constraints by calculating the
>>>> + * Histogram's inter-quantile mean between the given quantiles and ensure that
>>>> + * the resulting value is the right side of the given target (as defined by the
>>>> + * boundary and luminance target).
>>>> + */
>>> Here, in example.
>>>
>>> controls::AeConstraintMode and the supported values are defined as
>>> (core|vendor) controls in control_ids_*.yaml
>>>
>>> The tuning file expresses the constraint modes using the Control
>>> definition (I wonder if this has always been like this) but it
>>> definitely ties the tuning file to the controls definition.
>>>
>>> Applications use controls::AeConstraintMode to select one of the
>>> constrained modes to have the algorithm use it.
>>>
>>> In all of this, how much is part of the MeanLuminance implementation
>>> and how much shared between possibly multiple implementations ?
>>>
>>>> +
>>>> +/**
>>>> + * \enum MeanLuminanceAgc::AgcConstraint::Bound
>>>> + * \brief Specify whether the constraint defines a lower or upper bound
>>>> + * \var MeanLuminanceAgc::AgcConstraint::LOWER
>>>> + * \brief The constraint defines a lower bound
>>>> + * \var MeanLuminanceAgc::AgcConstraint::UPPER
>>>> + * \brief The constraint defines an upper bound
>>>> + */
>>>> +
>>>> +/**
>>>> + * \var MeanLuminanceAgc::AgcConstraint::bound
>>>> + * \brief The type of constraint bound
>>>> + */
>>>> +
>>>> +/**
>>>> + * \var MeanLuminanceAgc::AgcConstraint::qLo
>>>> + * \brief The lower quantile to use for the constraint
>>>> + */
>>>> +
>>>> +/**
>>>> + * \var MeanLuminanceAgc::AgcConstraint::qHi
>>>> + * \brief The upper quantile to use for the constraint
>>>> + */
>>>> +
>>>> +/**
>>>> + * \var MeanLuminanceAgc::AgcConstraint::yTarget
>>>> + * \brief The luminance target for the constraint
>>>> + */
>>>> +
>>>> +/**
>>>> + * \class MeanLuminanceAgc
>>>> + * \brief a mean-based auto-exposure algorithm
>> s/a/A/
>>
>>>> + *
>>>> + * This algorithm calculates a shutter time, analogue and digital gain such that
>>>> + * the normalised mean luminance value of an image is driven towards a target,
>>>> + * which itself is discovered from tuning data. The algorithm is a two-stage
>>>> + * process:
>> s/:/./
>>
>> or make the next two paragraph bullet list entries.
>>
>>>> + *
>>>> + * In the first stage, an initial gain value is derived by iteratively comparing
>>>> + * the gain-adjusted mean luminance across an entire image against a target, and
>>>> + * selecting a value which pushes it as closely as possible towards the target.
>>>> + *
>>>> + * In the second stage we calculate the gain required to drive the average of a
>>>> + * section of a histogram to a target value, where the target and the boundaries
>>>> + * of the section of the histogram used in the calculation are taken from the
>>>> + * values defined for the currently configured AeConstraintMode within the
>>>> + * tuning data. The gain from the first stage is then clamped to the gain from
>>>> + * this stage.
> Another thing we need to document is that requirements to use this
> class. The IPU3 can provide the required histogram because we compute a
> poor man's histogram from the grid of AWB stats, as the ImgU doesn't
> give us a histogram. Platforms that can't provide a histogram at all
> won't be able to use this. What precision do we need for the histogram
> (number of bins, and number of pixels used for the calculation) for it
> to be usable with this class ?


I'm not sure how to go about that last question really; the precision that's necessary for the 
histogram is itself dependent on the precision of the constraints that you want to set...which ought 
to be informed by the capabilities of the hardware that you're writing the tuning file for anyway. 
Or in other words, even a very small histogram could be used as long as you account for the 
resulting imprecision when you define the constraints that you want to adhere to...and I suspect 
that would affect the quality of the algorithm.

>
>>>> + *
>>>> + * The final gain is used to adjust the effective exposure value of the image,
>>>> + * and that new exposure value divided into shutter time, analogue gain and
>> s/divided/is divided/
>>
>>>> + * digital gain according to the selected AeExposureMode.
>> There should be a mention here that this class handles tuning file
>> parsing and expects a particular tuning data format. You can reference
>> the function that parses the tuning data for detailed documentation
>> about the format.
>>
>>>> + */
>>>> +
>>>> +MeanLuminanceAgc::MeanLuminanceAgc()
>>>> +	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
>>>> +{
>>>> +}
>>>> +
>>>> +/**
>>>> + * \brief Parse the relative luminance target from the tuning data
>>>> + * \param[in] tuningData The YamlObject holding the algorithm's tuning data
>>>> + */
>>>> +void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
>>>> +{
>>>> +	relativeLuminanceTarget_ =
>>>> +		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);
>>> How do you expect this to be computed in the tuning file ?
>>>
>>>> +}
>>>> +
>>>> +/**
>>>> + * \brief Parse an AeConstraintMode constraint from tuning data
>>>> + * \param[in] modeDict the YamlObject holding the constraint data
>>>> + * \param[in] id The constraint ID from AeConstraintModeEnum
>>>> + */
>>>> +void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
>>>> +{
>>>> +	for (const auto &[boundName, content] : modeDict.asDict()) {
>>>> +		if (boundName != "upper" && boundName != "lower") {
>>>> +			LOG(Agc, Warning)
>>>> +				<< "Ignoring unknown constraint bound '" << boundName << "'";
>>>> +			continue;
>>>> +		}
>>>> +
>>>> +		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
>>>> +		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
>>>> +		double qLo = content["qLo"].get<double>().value_or(0.98);
>>>> +		double qHi = content["qHi"].get<double>().value_or(1.0);
>>>> +		double yTarget =
>>>> +			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);
>>>> +
>>>> +		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
>>>> +
>>>> +		if (!constraintModes_.count(id))
>>>> +			constraintModes_[id] = {};
>>>> +
>>>> +		if (idx)
>>>> +			constraintModes_[id].push_back(constraint);
>>>> +		else
>>>> +			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
>>>> +	}
>>>> +}
>>>> +
>>>> +/**
>>>> + * \brief Parse tuning data file to populate AeConstraintMode control
>>>> + * \param[in] tuningData the YamlObject representing the tuning data for Agc
>>>> + *
>>>> + * The Agc algorithm's tuning data should contain a dictionary called
>>>> + * AeConstraintMode containing per-mode setting dictionaries with the key being
>>>> + * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
>>>> + * contain either a "lower" or "upper" key, or both, in this format:
>>>> + *
>>>> + * \code{.unparsed}
>>>> + * algorithms:
>>>> + *   - Agc:
>>>> + *       AeConstraintMode:
>>>> + *         ConstraintNormal:
>>>> + *           lower:
>>>> + *             qLo: 0.98
>>>> + *             qHi: 1.0
>>>> + *             yTarget: 0.5
>>> Ok, so this ties the tuning file not just to the libcamera controls
>>> definition, but to this specific implementation of the algorithm ?
>> Tying the algorithm implementation with the tuning data format seems
>> unavoidable. I'm slightly concerned about handling the YAML parsing here
>> though, as I'm wondering if we can always guarantee that the file will
>> not need to diverge somehow (possibly with additional data somewhere
>> that would make the shared parser fail), but that's probably such a
>> small risk that we can worry about it later.
>>
>>> Not that it was not expected, and I think it's fine, as using a
>>> libcamera defined control value as 'index' makes sure the applications
>>> will deal with the same interface, but this largely conflicts with the
>>> idea to have shared parsing for all algorithms in a base class.
>> The idea comes straight from the RPi implementation, which expect users
>> to customize tuning files with new modes. Indexing into a non standard
>> list of modes isn't an API I really like, and that's something I think
>> we need to fix eventually.
>>
>>> Also, we made AeConstrainModes a 'core control' because at the time
>>> when RPi first implemented AGC support there was no alternative to
>>> that. Then the RPi implementation has been copied in all other
>>> platforms and this is still fine as a 'core control'. This however
>>> seems to be a tuning parameter for this specific algorithm
>>> implementation, isn't it ?
>> I should read your whole reply before writing anything :-)
>>
>>>> + *         ConstraintHighlight:
>>>> + *           lower:
>>>> + *             qLo: 0.98
>>>> + *             qHi: 1.0
>>>> + *             yTarget: 0.5
>>>> + *           upper:
>>>> + *             qLo: 0.98
>>>> + *             qHi: 1.0
>>>> + *             yTarget: 0.8
>>>> + *
>>>> + * \endcode
>>>> + *
>>>> + * The parsed dictionaries are used to populate an array of available values for
>>>> + * the AeConstraintMode control and stored for later use in the algorithm.
>>>> + *
>>>> + * \return -EINVAL Where a defined constraint mode is invalid
>>>> + * \return 0 on success
>>>> + */
>>>> +int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
>>>> +{
>>>> +	std::vector<ControlValue> availableConstraintModes;
>>>> +
>>>> +	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
>>>> +	if (yamlConstraintModes.isDictionary()) {
>>>> +		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
>>>> +			if (AeConstraintModeNameValueMap.find(modeName) ==
>>>> +			    AeConstraintModeNameValueMap.end()) {
>>>> +				LOG(Agc, Warning)
>>>> +					<< "Skipping unknown constraint mode '" << modeName << "'";
>>>> +				continue;
>>>> +			}
>>>> +
>>>> +			if (!modeDict.isDictionary()) {
>>>> +				LOG(Agc, Error)
>>>> +					<< "Invalid constraint mode '" << modeName << "'";
>>>> +				return -EINVAL;
>>>> +			}
>>>> +
>>>> +			parseConstraint(modeDict,
>>>> +					AeConstraintModeNameValueMap.at(modeName));
>>>> +			availableConstraintModes.push_back(
>>>> +				AeConstraintModeNameValueMap.at(modeName));
>>>> +		}
>>>> +	}
>>>> +
>>>> +	/*
>>>> +	 * If the tuning data file contains no constraints then we use the
>>>> +	 * default constraint that the various Agc algorithms were adhering to
>>>> +	 * anyway before centralisation.
>>>> +	 */
>>>> +	if (constraintModes_.empty()) {
>>>> +		AgcConstraint constraint = {
>>>> +			AgcConstraint::Bound::LOWER,
>>>> +			0.98,
>>>> +			1.0,
>>>> +			0.5
>>>> +		};
>>>> +
>>>> +		constraintModes_[controls::ConstraintNormal].insert(
>>>> +			constraintModes_[controls::ConstraintNormal].begin(),
>>>> +			constraint);
>>>> +		availableConstraintModes.push_back(
>>>> +			AeConstraintModeNameValueMap.at("ConstraintNormal"));
>>>> +	}
>>>> +
>>>> +	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
>>>> +
>>>> +	return 0;
>>>> +}
>>>> +
>>>> +/**
>>>> + * \brief Parse tuning data file to populate AeExposureMode control
>>>> + * \param[in] tuningData the YamlObject representing the tuning data for Agc
>>>> + *
>>>> + * The Agc algorithm's tuning data should contain a dictionary called
>>>> + * AeExposureMode containing per-mode setting dictionaries with the key being
>>>> + * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
>>>> + * contain an array of shutter times with the key "shutter" and an array of gain
>>>> + * values with the key "gain", in this format:
>>>> + *
>>>> + * \code{.unparsed}
>>>> + * algorithms:
>>>> + *   - Agc:
>>>> + *       AeExposureMode:
>>> Same reasoning as per the constraints, really.
>>>
>>> There's nothing bad here, if not me realizing our controls definition
>>> is intimately tied with the algorithms implementation, so I wonder
>>> again if even handling things like AeEnable in a common form makes
>>> any sense..
>>>
>>> Not going to review the actual implementation now, as it comes from
>>> the existing ones...
>>>
>>>
>>>> + *         ExposureNormal:
>>>> + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
>>>> + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
>>>> + *         ExposureShort:
>>>> + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
>>>> + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
>>>> + *
>>>> + * \endcode
>>>> + *
>>>> + * The parsed dictionaries are used to populate an array of available values for
>>>> + * the AeExposureMode control and to create ExposureModeHelpers
>>>> + *
>>>> + * \return -EINVAL Where a defined constraint mode is invalid
>>>> + * \return 0 on success
>>>> + */
>>>> +int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
>>>> +{
>>>> +	std::vector<ControlValue> availableExposureModes;
>>>> +	int ret;
>>>> +
>>>> +	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
>>>> +	if (yamlExposureModes.isDictionary()) {
>>>> +		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
>>>> +			if (AeExposureModeNameValueMap.find(modeName) ==
>>>> +			    AeExposureModeNameValueMap.end()) {
>>>> +				LOG(Agc, Warning)
>>>> +					<< "Skipping unknown exposure mode '" << modeName << "'";
>>>> +				continue;
>>>> +			}
>>>> +
>>>> +			if (!modeValues.isDictionary()) {
>>>> +				LOG(Agc, Error)
>>>> +					<< "Invalid exposure mode '" << modeName << "'";
>>>> +				return -EINVAL;
>>>> +			}
>>>> +
>>>> +			std::vector<uint32_t> shutters =
>>>> +				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
>>>> +			std::vector<double> gains =
>>>> +				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
>>>> +
>>>> +			std::vector<utils::Duration> shutterDurations;
>>>> +			std::transform(shutters.begin(), shutters.end(),
>>>> +				std::back_inserter(shutterDurations),
>>>> +				[](uint32_t time) { return std::chrono::microseconds(time); });
>>>> +
>>>> +			std::shared_ptr<ExposureModeHelper> helper =
>>>> +				std::make_shared<ExposureModeHelper>();
>>>> +			if ((ret = helper->init(shutterDurations, gains) < 0)) {
>>>> +				LOG(Agc, Error)
>>>> +					<< "Failed to parse exposure mode '" << modeName << "'";
>>>> +				return ret;
>>>> +			}
>>>> +
>>>> +			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
>>>> +			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
>>>> +		}
>>>> +	}
>>>> +
>>>> +	/*
>>>> +	 * If we don't have any exposure modes in the tuning data we create an
>>>> +	 * ExposureModeHelper using empty shutter time and gain arrays, which
>>>> +	 * will then internally simply drive the shutter as high as possible
>>>> +	 * before touching gain
>>>> +	 */
>>>> +	if (availableExposureModes.empty()) {
>>>> +		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
>>>> +		std::vector<utils::Duration> shutterDurations = {};
>>>> +		std::vector<double> gains = {};
>>>> +
>>>> +		std::shared_ptr<ExposureModeHelper> helper =
>>>> +			std::make_shared<ExposureModeHelper>();
>>>> +		if ((ret = helper->init(shutterDurations, gains) < 0)) {
>>>> +			LOG(Agc, Error)
>>>> +				<< "Failed to create default ExposureModeHelper";
>>>> +			return ret;
>>>> +		}
>>>> +
>>>> +		exposureModeHelpers_[exposureModeId] = helper;
>>>> +		availableExposureModes.push_back(exposureModeId);
>>>> +	}
>>>> +
>>>> +	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
>>>> +
>>>> +	return 0;
>>>> +}
>>>> +
>>>> +/**
>>>> + * \fn MeanLuminanceAgc::constraintModes()
>>>> + * \brief Get the constraint modes that have been parsed from tuning data
>>>> + */
>>>> +
>>>> +/**
>>>> + * \fn MeanLuminanceAgc::exposureModeHelpers()
>>>> + * \brief Get the ExposureModeHelpers that have been parsed from tuning data
>>>> + */
>>>> +
>>>> +/**
>>>> + * \fn MeanLuminanceAgc::controls()
>>>> + * \brief Get the controls that have been generated after parsing tuning data
>>>> + */
>>>> +
>>>> +/**
>>>> + * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
>>>> + * \brief Estimate the luminance of an image, adjusted by a given gain
>>>> + * \param[in] gain The gain with which to adjust the luminance estimate
>>>> + *
>>>> + * This function is a pure virtual function because estimation of luminance is a
>>>> + * hardware-specific operation, which depends wholly on the format of the stats
>>>> + * that are delivered to libcamera from the ISP. Derived classes must implement
>>>> + * an overriding function that calculates the normalised mean luminance value
>>>> + * across the entire image.
>>>> + *
>>>> + * \return The normalised relative luminance of the image
>>>> + */
>>>> +
>>>> +/**
>>>> + * \brief Estimate the initial gain needed to achieve a relative luminance
>>>> + *        target
>>> nit: we don't usually indent in briefs, or in general when breaking
>>> lines in doxygen as far as I can tell
>>>
>>> Not going
>>>
>>>> + *
>>>> + * To account for non-linearity caused by saturation, the value needs to be
>>>> + * estimated in an iterative process, as multiplying by a gain will not increase
>>>> + * the relative luminance by the same factor if some image regions are saturated
>>>> + *
>>>> + * \return The calculated initial gain
>>>> + */
>>>> +double MeanLuminanceAgc::estimateInitialGain()
>>>> +{
>>>> +	double yTarget = relativeLuminanceTarget_;
>>>> +	double yGain = 1.0;
>>>> +
>>>> +	for (unsigned int i = 0; i < 8; i++) {
>>>> +		double yValue = estimateLuminance(yGain);
>>>> +		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
>>>> +
>>>> +		yGain *= extra_gain;
>>>> +		LOG(Agc, Debug) << "Y value: " << yValue
>>>> +				<< ", Y target: " << yTarget
>>>> +				<< ", gives gain " << yGain;
>>>> +
>>>> +		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
>>>> +			break;
>>>> +	}
>>>> +
>>>> +	return yGain;
>>>> +}
>>>> +
>>>> +/**
>>>> + * \brief Clamp gain within the bounds of a defined constraint
>>>> + * \param[in] constraintModeIndex The index of the constraint to adhere to
>>>> + * \param[in] hist A histogram over which to calculate inter-quantile means
>>>> + * \param[in] gain The gain to clamp
>>>> + *
>>>> + * \return The gain clamped within the constraint bounds
>>>> + */
>>>> +double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
>>>> +					     const Histogram &hist,
>>>> +					     double gain)
>>>> +{
>>>> +	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
>>>> +	for (const AgcConstraint &constraint : constraints) {
>>>> +		double newGain = constraint.yTarget * hist.bins() /
>>>> +				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
>>>> +
>>>> +		if (constraint.bound == AgcConstraint::Bound::LOWER &&
>>>> +		    newGain > gain)
>>>> +			gain = newGain;
>>>> +
>>>> +		if (constraint.bound == AgcConstraint::Bound::UPPER &&
>>>> +		    newGain < gain)
>>>> +			gain = newGain;
>>>> +	}
>>>> +
>>>> +	return gain;
>>>> +}
>>>> +
>>>> +/**
>>>> + * \brief Apply a filter on the exposure value to limit the speed of changes
>>>> + * \param[in] exposureValue The target exposure from the AGC algorithm
>>>> + *
>>>> + * The speed of the filter is adaptive, and will produce the target quicker
>>>> + * during startup, or when the target exposure is within 20% of the most recent
>>>> + * filter output.
>>>> + *
>>>> + * \return The filtered exposure
>>>> + */
>>>> +utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
>>>> +{
>>>> +	double speed = 0.2;
>>>> +
>>>> +	/* Adapt instantly if we are in startup phase. */
>>>> +	if (frameCount_ < kNumStartupFrames)
>>>> +		speed = 1.0;
>>>> +
>>>> +	/*
>>>> +	 * If we are close to the desired result, go faster to avoid making
>>>> +	 * multiple micro-adjustments.
>>>> +	 * \todo Make this customisable?
>>>> +	 */
>>>> +	if (filteredExposure_ < 1.2 * exposureValue &&
>>>> +	    filteredExposure_ > 0.8 * exposureValue)
>>>> +		speed = sqrt(speed);
>>>> +
>>>> +	filteredExposure_ = speed * exposureValue +
>>>> +			    filteredExposure_ * (1.0 - speed);
>>>> +
>>>> +	return filteredExposure_;
>>>> +}
>>>> +
>>>> +/**
>>>> + * \brief Calculate the new exposure value
>>>> + * \param[in] constraintModeIndex The index of the current constraint mode
>>>> + * \param[in] exposureModeIndex The index of the current exposure mode
>>>> + * \param[in] yHist A Histogram from the ISP statistics to use in constraining
>>>> + *	      the calculated gain
>>>> + * \param[in] effectiveExposureValue The EV applied to the frame from which the
>>>> + *	      statistics in use derive
>>>> + *
>>>> + * Calculate a new exposure value to try to obtain the target. The calculated
>>>> + * exposure value is filtered to prevent rapid changes from frame to frame, and
>>>> + * divided into shutter time, analogue and digital gain.
>>>> + *
>>>> + * \return Tuple of shutter time, analogue gain, and digital gain
>>>> + */
>>>> +std::tuple<utils::Duration, double, double>
>>>> +MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
>>>> +				 uint32_t exposureModeIndex,
>>>> +				 const Histogram &yHist,
>>>> +				 utils::Duration effectiveExposureValue)
>>>> +{
>>>> +	/*
>>>> +	 * The pipeline handler should validate that we have received an allowed
>>>> +	 * value for AeExposureMode.
>>>> +	 */
>>>> +	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
>>>> +		exposureModeHelpers_.at(exposureModeIndex);
>>>> +
>>>> +	double gain = estimateInitialGain();
>>>> +	gain = constraintClampGain(constraintModeIndex, yHist, gain);
>>>> +
>>>> +	/*
>>>> +	 * We don't check whether we're already close to the target, because
>>>> +	 * even if the effective exposure value is the same as the last frame's
>>>> +	 * we could have switched to an exposure mode that would require a new
>>>> +	 * pass through the splitExposure() function.
>>>> +	 */
>>>> +
>>>> +	utils::Duration newExposureValue = effectiveExposureValue * gain;
>>>> +	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
>>>> +					   * exposureModeHelper->maxGain();
>>>> +	newExposureValue = std::min(newExposureValue, maxTotalExposure);
>>>> +
>>>> +	/*
>>>> +	 * We filter the exposure value to make sure changes are not too jarring
>>>> +	 * from frame to frame.
>>>> +	 */
>>>> +	newExposureValue = filterExposure(newExposureValue);
>>>> +
>>>> +	frameCount_++;
>>>> +	return exposureModeHelper->splitExposure(newExposureValue);
>>>> +}
>>>> +
>>>> +}; /* namespace ipa */
>>>> +
>>>> +}; /* namespace libcamera */
>>>> diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
>>>> new file mode 100644
>>>> index 00000000..902a359a
>>>> --- /dev/null
>>>> +++ b/src/ipa/libipa/agc.h
>>>> @@ -0,0 +1,82 @@
>>>> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
>>>> +/*
>>>> + * Copyright (C) 2024 Ideas on Board Oy
>>>> + *
>>>> + agc.h - Base class for libipa-compliant AGC algorithms
>>>> + */
>>>> +
>>>> +#pragma once
>>>> +
>>>> +#include <tuple>
>>>> +#include <vector>
>>>> +
>>>> +#include <libcamera/controls.h>
>>>> +
>>>> +#include "libcamera/internal/yaml_parser.h"
>>>> +
>>>> +#include "exposure_mode_helper.h"
>>>> +#include "histogram.h"
>>>> +
>>>> +namespace libcamera {
>>>> +
>>>> +namespace ipa {
>>>> +
>>>> +class MeanLuminanceAgc
>>>> +{
>>>> +public:
>>>> +	MeanLuminanceAgc();
>>>> +	virtual ~MeanLuminanceAgc() = default;
>> No need to make this inline, define the destructor in the .cpp with
>>
>> MeanLumimanceAgc::~MeanLuminanceAgc() = default;
>>
>>>> +
>>>> +	struct AgcConstraint {
>>>> +		enum class Bound {
>>>> +			LOWER = 0,
>>>> +			UPPER = 1
>> Wrong coding style.
>>
>>>> +		};
>>>> +		Bound bound;
>>>> +		double qLo;
>>>> +		double qHi;
>>>> +		double yTarget;
>>>> +	};
>>>> +
>>>> +	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
>>>> +	void parseConstraint(const YamlObject &modeDict, int32_t id);
>> This function doesn't seem to be called from the derived classes, you
>> can make it private. Same for other functions below.
>>
>>>> +	int parseConstraintModes(const YamlObject &tuningData);
>>>> +	int parseExposureModes(const YamlObject &tuningData);
>> Is there a reason to split the parsing in multiple functions, given that
>> they are called the same from from both the rkisp1 and ipu3
>> implementations ?
>>
>>>> +
>>>> +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
>>>> +	{
>>>> +		return constraintModes_;
>>>> +	}
>> This seems unused, so the AgcConstraint structure can be made private.
>>
>>>> +
>>>> +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
>>>> +	{
>>>> +		return exposureModeHelpers_;
>>>> +	}
>> Can we avoid exposing this too ?
>>
>> I never expected there would be multiple helpers when reading the
>> documentation of the ExposureModeHelper class. That may be due to
>> missing documentation.
>>
>>>> +
>>>> +	ControlInfoMap::Map controls()
>>>> +	{
>>>> +		return controls_;
>>>> +	}
>> Hmmm... We don't have precendents for pushing ControlInfo to algorithms.
>> Are we sure we want to go that way ?
>>
>>>> +
>>>> +	virtual double estimateLuminance(const double gain) = 0;
>>>> +	double estimateInitialGain();
>>>> +	double constraintClampGain(uint32_t constraintModeIndex,
>>>> +				   const Histogram &hist,
>>>> +				   double gain);
>>>> +	utils::Duration filterExposure(utils::Duration exposureValue);
>>>> +	std::tuple<utils::Duration, double, double>
>>>> +	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
>>>> +		       const Histogram &yHist, utils::Duration effectiveExposureValue);
>> Missing blank line.
>>
>>>> +private:
>>>> +	uint64_t frameCount_;
>>>> +	utils::Duration filteredExposure_;
>>>> +	double relativeLuminanceTarget_;
>>>> +
>>>> +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
>>>> +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
>>>> +	ControlInfoMap::Map controls_;
>>>> +};
>>>> +
>>>> +}; /* namespace ipa */
>>>> +
>>>> +}; /* namespace libcamera */
>>>> diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
>>>> index 37fbd177..31cc8d70 100644
>>>> --- a/src/ipa/libipa/meson.build
>>>> +++ b/src/ipa/libipa/meson.build
>>>> @@ -1,6 +1,7 @@
>>>>   # SPDX-License-Identifier: CC0-1.0
>>>>
>>>>   libipa_headers = files([
>>>> +    'agc.h',
>>>>       'algorithm.h',
>>>>       'camera_sensor_helper.h',
>>>>       'exposure_mode_helper.h',
>>>> @@ -10,6 +11,7 @@ libipa_headers = files([
>>>>   ])
>>>>
>>>>   libipa_sources = files([
>>>> +    'agc.cpp',
>>>>       'algorithm.cpp',
>>>>       'camera_sensor_helper.cpp',
>>>>       'exposure_mode_helper.cpp',
>> -- 
>> Regards,
>>
>> Laurent Pinchart
Paul Elder April 9, 2024, 9:37 a.m. UTC | #8
On Sat, Apr 06, 2024 at 04:07:52AM +0300, Laurent Pinchart wrote:
> Hello,
> 
> On Mon, Mar 25, 2024 at 09:16:00PM +0100, Jacopo Mondi wrote:
> > On Fri, Mar 22, 2024 at 01:14:45PM +0000, Daniel Scally wrote:
> > > The Agc algorithms for the RkIsp1 and IPU3 IPAs do the same thing in
> > > very large part; following the Rpi IPA's algorithm in spirit with a
> > > few tunable values in that IPA being hardcoded in the libipa ones.
> > > Add a new base class for MeanLuminanceAgc which implements the same
> > 
> > nit: I would rather call this one AgcMeanLuminance.
> 
> And rename the files accordingly ?
> 
> > One other note, not sure if applicable here, from the experience of
> > trying to upstream an auto-focus algorithm for RkISP1. We had there a
> > base class to define the algorithm interface, one derived class for
> > the common calculation and one platform-specific part for statistics
> > collection and IPA module interfacing.
> > 
> > The base class was there mostly to handle the algorithm state machine
> > by handling the controls that influence the algorithm behaviour. In
> > case of AEGC I can think, in example, about handling the switch
> > between enable/disable of auto-mode (and consequentially handling a
> > manually set ExposureTime and AnalogueGain), switching between
> > different ExposureModes etc
> > 
> > This is the last attempt for AF
> > https://patchwork.libcamera.org/patch/18510/
> > 
> > and I'm wondering if it would be desirable to abstract away from the
> > MeanLuminance part the part that is tightly coupled with the
> > libcamera's controls definition.
> > 
> > Let's make a concrete example: look how the rkisp1 and the ipu3 agc
> > implementations handle AeEnabled (or better, look at how the rkisp1
> > does that and the IPU3 does not).
> > 
> > Ideally, my goal would be to abstract the handling of the control and
> > all the state machine that decides if the manual or auto-computed
> > values should be used to an AEGCAlgorithm base class.
> > 
> > Your series does that for the tuning file parsing already but does
> > that in the MeanLuminance method implementation, while it should or
> > could be common to all AEGC methods.
> > 
> > One very interesting experiment could be starting with this and then
> > plumb-in AeEnable support in the IPU3 in example and move everything
> > common to a base class.
> > 
> > > algorithm and additionally parses yaml tuning files to inform an IPA
> > > module's Agc algorithm about valid constraint and exposure modes and
> > > their associated bounds.
> > >
> > > Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
> > > ---
> > >  src/ipa/libipa/agc.cpp     | 526 +++++++++++++++++++++++++++++++++++++
> > >  src/ipa/libipa/agc.h       |  82 ++++++
> > >  src/ipa/libipa/meson.build |   2 +
> > >  3 files changed, 610 insertions(+)
> > >  create mode 100644 src/ipa/libipa/agc.cpp
> > >  create mode 100644 src/ipa/libipa/agc.h
> > >
> > > diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
> > > new file mode 100644
> > > index 00000000..af57a571
> > > --- /dev/null
> > > +++ b/src/ipa/libipa/agc.cpp
> > > @@ -0,0 +1,526 @@
> > > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > > +/*
> > > + * Copyright (C) 2024 Ideas on Board Oy
> > > + *
> > > + * agc.cpp - Base class for libipa-compliant AGC algorithms
> 
> We could have different AGC algorithms that are all libipa-compliant.
> This is one particular AGC algorithm, right ?
> 
> > > + */
> > > +
> > > +#include "agc.h"
> > > +
> > > +#include <cmath>
> > > +
> > > +#include <libcamera/base/log.h>
> > > +#include <libcamera/control_ids.h>
> > > +
> > > +#include "exposure_mode_helper.h"
> > > +
> > > +using namespace libcamera::controls;
> > > +
> > > +/**
> > > + * \file agc.h
> > > + * \brief Base class implementing mean luminance AEGC.
> > 
> > nit: not '.' at the end of briefs
> > 
> > > + */
> > > +
> > > +namespace libcamera {
> > > +
> > > +using namespace std::literals::chrono_literals;
> > > +
> > > +LOG_DEFINE_CATEGORY(Agc)
> > > +
> > > +namespace ipa {
> > > +
> > > +/*
> > > + * Number of frames to wait before calculating stats on minimum exposure
> 
> This doesn't seem to match what the value is used for.
> 
> > > + * \todo should this be a tunable value?
> > 
> > Does this depend on the ISP (comes from IPA), the sensor (comes from
> > tuning file) or... both ? :)
> > 
> > > + */
> > > +static constexpr uint32_t kNumStartupFrames = 10;
> > > +
> > > +/*
> > > + * Default relative luminance target
> > > + *
> > > + * This value should be chosen so that when the camera points at a grey target,
> > > + * the resulting image brightness looks "right". Custom values can be passed
> > > + * as the relativeLuminanceTarget value in sensor tuning files.
> > > + */
> > > +static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
> > > +
> > > +/**
> > > + * \struct MeanLuminanceAgc::AgcConstraint
> > > + * \brief The boundaries and target for an AeConstraintMode constraint
> > > + *
> > > + * This structure describes an AeConstraintMode constraint for the purposes of
> > > + * this algorithm. The algorithm will apply the constraints by calculating the
> > > + * Histogram's inter-quantile mean between the given quantiles and ensure that
> > > + * the resulting value is the right side of the given target (as defined by the
> > > + * boundary and luminance target).
> > > + */
> > 
> > Here, in example.
> > 
> > controls::AeConstraintMode and the supported values are defined as
> > (core|vendor) controls in control_ids_*.yaml
> > 
> > The tuning file expresses the constraint modes using the Control
> > definition (I wonder if this has always been like this) but it
> > definitely ties the tuning file to the controls definition.
> > 
> > Applications use controls::AeConstraintMode to select one of the
> > constrained modes to have the algorithm use it.
> > 
> > In all of this, how much is part of the MeanLuminance implementation
> > and how much shared between possibly multiple implementations ?
> > 
> > > +
> > > +/**
> > > + * \enum MeanLuminanceAgc::AgcConstraint::Bound
> > > + * \brief Specify whether the constraint defines a lower or upper bound
> > > + * \var MeanLuminanceAgc::AgcConstraint::LOWER
> > > + * \brief The constraint defines a lower bound
> > > + * \var MeanLuminanceAgc::AgcConstraint::UPPER
> > > + * \brief The constraint defines an upper bound
> > > + */
> > > +
> > > +/**
> > > + * \var MeanLuminanceAgc::AgcConstraint::bound
> > > + * \brief The type of constraint bound
> > > + */
> > > +
> > > +/**
> > > + * \var MeanLuminanceAgc::AgcConstraint::qLo
> > > + * \brief The lower quantile to use for the constraint
> > > + */
> > > +
> > > +/**
> > > + * \var MeanLuminanceAgc::AgcConstraint::qHi
> > > + * \brief The upper quantile to use for the constraint
> > > + */
> > > +
> > > +/**
> > > + * \var MeanLuminanceAgc::AgcConstraint::yTarget
> > > + * \brief The luminance target for the constraint
> > > + */
> > > +
> > > +/**
> > > + * \class MeanLuminanceAgc
> > > + * \brief a mean-based auto-exposure algorithm
> 
> s/a/A/
> 
> > > + *
> > > + * This algorithm calculates a shutter time, analogue and digital gain such that
> > > + * the normalised mean luminance value of an image is driven towards a target,
> > > + * which itself is discovered from tuning data. The algorithm is a two-stage
> > > + * process:
> 
> s/:/./
> 
> or make the next two paragraph bullet list entries.
> 
> > > + *
> > > + * In the first stage, an initial gain value is derived by iteratively comparing
> > > + * the gain-adjusted mean luminance across an entire image against a target, and
> > > + * selecting a value which pushes it as closely as possible towards the target.
> > > + *
> > > + * In the second stage we calculate the gain required to drive the average of a
> > > + * section of a histogram to a target value, where the target and the boundaries
> > > + * of the section of the histogram used in the calculation are taken from the
> > > + * values defined for the currently configured AeConstraintMode within the
> > > + * tuning data. The gain from the first stage is then clamped to the gain from
> > > + * this stage.
> > > + *
> > > + * The final gain is used to adjust the effective exposure value of the image,
> > > + * and that new exposure value divided into shutter time, analogue gain and
> 
> s/divided/is divided/
> 
> > > + * digital gain according to the selected AeExposureMode.
> 
> There should be a mention here that this class handles tuning file
> parsing and expects a particular tuning data format. You can reference
> the function that parses the tuning data for detailed documentation
> about the format.
> 
> > > + */
> > > +
> > > +MeanLuminanceAgc::MeanLuminanceAgc()
> > > +	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
> > > +{
> > > +}
> > > +
> > > +/**
> > > + * \brief Parse the relative luminance target from the tuning data
> > > + * \param[in] tuningData The YamlObject holding the algorithm's tuning data
> > > + */
> > > +void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
> > > +{
> > > +	relativeLuminanceTarget_ =
> > > +		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);
> > 
> > How do you expect this to be computed in the tuning file ?
> > 
> > > +}
> > > +
> > > +/**
> > > + * \brief Parse an AeConstraintMode constraint from tuning data
> > > + * \param[in] modeDict the YamlObject holding the constraint data
> > > + * \param[in] id The constraint ID from AeConstraintModeEnum
> > > + */
> > > +void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
> > > +{
> > > +	for (const auto &[boundName, content] : modeDict.asDict()) {
> > > +		if (boundName != "upper" && boundName != "lower") {
> > > +			LOG(Agc, Warning)
> > > +				<< "Ignoring unknown constraint bound '" << boundName << "'";
> > > +			continue;
> > > +		}
> > > +
> > > +		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
> > > +		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
> > > +		double qLo = content["qLo"].get<double>().value_or(0.98);
> > > +		double qHi = content["qHi"].get<double>().value_or(1.0);
> > > +		double yTarget =
> > > +			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);

So this should really be a piecewise linear function and not a scalar.
That's why I had it parse as a vector from the get-go so that we don't
run into compatibility issues later when we do add it.

I was going to fix it on top in "ipa: libipa: agc: Change luminance
target to piecewise linear function" [1] but clearly I forgot and only
did it for the main luminance target. Up to you if you want to fix it in
v2 or if you want me to do it on top in my v2.

[1] https://patchwork.libcamera.org/patch/19854/

> > > +
> > > +		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
> > > +
> > > +		if (!constraintModes_.count(id))
> > > +			constraintModes_[id] = {};
> > > +
> > > +		if (idx)
> > > +			constraintModes_[id].push_back(constraint);
> > > +		else
> > > +			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
> > > +	}
> > > +}
> > > +
> > > +/**
> > > + * \brief Parse tuning data file to populate AeConstraintMode control
> > > + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> > > + *
> > > + * The Agc algorithm's tuning data should contain a dictionary called
> > > + * AeConstraintMode containing per-mode setting dictionaries with the key being
> > > + * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
> > > + * contain either a "lower" or "upper" key, or both, in this format:
> > > + *
> > > + * \code{.unparsed}
> > > + * algorithms:
> > > + *   - Agc:
> > > + *       AeConstraintMode:
> > > + *         ConstraintNormal:
> > > + *           lower:
> > > + *             qLo: 0.98
> > > + *             qHi: 1.0
> > > + *             yTarget: 0.5
> > 
> > Ok, so this ties the tuning file not just to the libcamera controls
> > definition, but to this specific implementation of the algorithm ?
> 
> Tying the algorithm implementation with the tuning data format seems
> unavoidable. I'm slightly concerned about handling the YAML parsing here
> though, as I'm wondering if we can always guarantee that the file will
> not need to diverge somehow (possibly with additional data somewhere
> that would make the shared parser fail), but that's probably such a
> small risk that we can worry about it later.
> 
> > Not that it was not expected, and I think it's fine, as using a
> > libcamera defined control value as 'index' makes sure the applications
> > will deal with the same interface, but this largely conflicts with the
> > idea to have shared parsing for all algorithms in a base class.
> 
> The idea comes straight from the RPi implementation, which expect users
> to customize tuning files with new modes. Indexing into a non standard
> list of modes isn't an API I really like, and that's something I think
> we need to fix eventually.
> 
> > Also, we made AeConstrainModes a 'core control' because at the time
> > when RPi first implemented AGC support there was no alternative to
> > that. Then the RPi implementation has been copied in all other
> > platforms and this is still fine as a 'core control'. This however
> > seems to be a tuning parameter for this specific algorithm
> > implementation, isn't it ?
> 
> I should read your whole reply before writing anything :-)
> 
> > > + *         ConstraintHighlight:
> > > + *           lower:
> > > + *             qLo: 0.98
> > > + *             qHi: 1.0
> > > + *             yTarget: 0.5
> > > + *           upper:
> > > + *             qLo: 0.98
> > > + *             qHi: 1.0
> > > + *             yTarget: 0.8
> > > + *
> > > + * \endcode
> > > + *
> > > + * The parsed dictionaries are used to populate an array of available values for
> > > + * the AeConstraintMode control and stored for later use in the algorithm.
> > > + *
> > > + * \return -EINVAL Where a defined constraint mode is invalid
> > > + * \return 0 on success
> > > + */
> > > +int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
> > > +{
> > > +	std::vector<ControlValue> availableConstraintModes;
> > > +
> > > +	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
> > > +	if (yamlConstraintModes.isDictionary()) {
> > > +		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
> > > +			if (AeConstraintModeNameValueMap.find(modeName) ==
> > > +			    AeConstraintModeNameValueMap.end()) {
> > > +				LOG(Agc, Warning)
> > > +					<< "Skipping unknown constraint mode '" << modeName << "'";
> > > +				continue;
> > > +			}
> > > +
> > > +			if (!modeDict.isDictionary()) {
> > > +				LOG(Agc, Error)
> > > +					<< "Invalid constraint mode '" << modeName << "'";
> > > +				return -EINVAL;
> > > +			}
> > > +
> > > +			parseConstraint(modeDict,
> > > +					AeConstraintModeNameValueMap.at(modeName));
> > > +			availableConstraintModes.push_back(
> > > +				AeConstraintModeNameValueMap.at(modeName));
> > > +		}
> > > +	}
> > > +
> > > +	/*
> > > +	 * If the tuning data file contains no constraints then we use the
> > > +	 * default constraint that the various Agc algorithms were adhering to
> > > +	 * anyway before centralisation.
> > > +	 */
> > > +	if (constraintModes_.empty()) {
> > > +		AgcConstraint constraint = {
> > > +			AgcConstraint::Bound::LOWER,
> > > +			0.98,
> > > +			1.0,
> > > +			0.5
> > > +		};
> > > +
> > > +		constraintModes_[controls::ConstraintNormal].insert(
> > > +			constraintModes_[controls::ConstraintNormal].begin(),
> > > +			constraint);
> > > +		availableConstraintModes.push_back(
> > > +			AeConstraintModeNameValueMap.at("ConstraintNormal"));
> > > +	}
> > > +
> > > +	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
> > > +
> > > +	return 0;
> > > +}
> > > +
> > > +/**
> > > + * \brief Parse tuning data file to populate AeExposureMode control
> > > + * \param[in] tuningData the YamlObject representing the tuning data for Agc
> > > + *
> > > + * The Agc algorithm's tuning data should contain a dictionary called
> > > + * AeExposureMode containing per-mode setting dictionaries with the key being
> > > + * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
> > > + * contain an array of shutter times with the key "shutter" and an array of gain
> > > + * values with the key "gain", in this format:
> > > + *
> > > + * \code{.unparsed}
> > > + * algorithms:
> > > + *   - Agc:
> > > + *       AeExposureMode:
> > 
> > Same reasoning as per the constraints, really.
> > 
> > There's nothing bad here, if not me realizing our controls definition
> > is intimately tied with the algorithms implementation, so I wonder
> > again if even handling things like AeEnable in a common form makes
> > any sense..
> > 
> > Not going to review the actual implementation now, as it comes from
> > the existing ones...
> > 
> > 
> > > + *         ExposureNormal:
> > > + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> > > + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> > > + *         ExposureShort:
> > > + *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
> > > + *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
> > > + *
> > > + * \endcode
> > > + *
> > > + * The parsed dictionaries are used to populate an array of available values for
> > > + * the AeExposureMode control and to create ExposureModeHelpers
> > > + *
> > > + * \return -EINVAL Where a defined constraint mode is invalid
> > > + * \return 0 on success
> > > + */
> > > +int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
> > > +{
> > > +	std::vector<ControlValue> availableExposureModes;
> > > +	int ret;
> > > +
> > > +	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
> > > +	if (yamlExposureModes.isDictionary()) {
> > > +		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
> > > +			if (AeExposureModeNameValueMap.find(modeName) ==
> > > +			    AeExposureModeNameValueMap.end()) {
> > > +				LOG(Agc, Warning)
> > > +					<< "Skipping unknown exposure mode '" << modeName << "'";
> > > +				continue;
> > > +			}
> > > +
> > > +			if (!modeValues.isDictionary()) {
> > > +				LOG(Agc, Error)
> > > +					<< "Invalid exposure mode '" << modeName << "'";
> > > +				return -EINVAL;
> > > +			}
> > > +
> > > +			std::vector<uint32_t> shutters =
> > > +				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
> > > +			std::vector<double> gains =
> > > +				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
> > > +
> > > +			std::vector<utils::Duration> shutterDurations;
> > > +			std::transform(shutters.begin(), shutters.end(),
> > > +				std::back_inserter(shutterDurations),
> > > +				[](uint32_t time) { return std::chrono::microseconds(time); });
> > > +
> > > +			std::shared_ptr<ExposureModeHelper> helper =
> > > +				std::make_shared<ExposureModeHelper>();
> > > +			if ((ret = helper->init(shutterDurations, gains) < 0)) {
> > > +				LOG(Agc, Error)
> > > +					<< "Failed to parse exposure mode '" << modeName << "'";
> > > +				return ret;
> > > +			}
> > > +
> > > +			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
> > > +			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
> > > +		}
> > > +	}
> > > +
> > > +	/*
> > > +	 * If we don't have any exposure modes in the tuning data we create an
> > > +	 * ExposureModeHelper using empty shutter time and gain arrays, which
> > > +	 * will then internally simply drive the shutter as high as possible
> > > +	 * before touching gain
> > > +	 */
> > > +	if (availableExposureModes.empty()) {
> > > +		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
> > > +		std::vector<utils::Duration> shutterDurations = {};
> > > +		std::vector<double> gains = {};
> > > +
> > > +		std::shared_ptr<ExposureModeHelper> helper =
> > > +			std::make_shared<ExposureModeHelper>();
> > > +		if ((ret = helper->init(shutterDurations, gains) < 0)) {
> > > +			LOG(Agc, Error)
> > > +				<< "Failed to create default ExposureModeHelper";
> > > +			return ret;
> > > +		}
> > > +
> > > +		exposureModeHelpers_[exposureModeId] = helper;
> > > +		availableExposureModes.push_back(exposureModeId);
> > > +	}
> > > +
> > > +	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
> > > +
> > > +	return 0;
> > > +}
> > > +
> > > +/**
> > > + * \fn MeanLuminanceAgc::constraintModes()
> > > + * \brief Get the constraint modes that have been parsed from tuning data
> > > + */
> > > +
> > > +/**
> > > + * \fn MeanLuminanceAgc::exposureModeHelpers()
> > > + * \brief Get the ExposureModeHelpers that have been parsed from tuning data
> > > + */
> > > +
> > > +/**
> > > + * \fn MeanLuminanceAgc::controls()
> > > + * \brief Get the controls that have been generated after parsing tuning data
> > > + */
> > > +
> > > +/**
> > > + * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
> > > + * \brief Estimate the luminance of an image, adjusted by a given gain
> > > + * \param[in] gain The gain with which to adjust the luminance estimate
> > > + *
> > > + * This function is a pure virtual function because estimation of luminance is a
> > > + * hardware-specific operation, which depends wholly on the format of the stats
> > > + * that are delivered to libcamera from the ISP. Derived classes must implement
> > > + * an overriding function that calculates the normalised mean luminance value
> > > + * across the entire image.
> > > + *
> > > + * \return The normalised relative luminance of the image
> > > + */
> > > +
> > > +/**
> > > + * \brief Estimate the initial gain needed to achieve a relative luminance
> > > + *        target
> > 
> > nit: we don't usually indent in briefs, or in general when breaking
> > lines in doxygen as far as I can tell
> > 
> > Not going
> > 
> > > + *
> > > + * To account for non-linearity caused by saturation, the value needs to be
> > > + * estimated in an iterative process, as multiplying by a gain will not increase
> > > + * the relative luminance by the same factor if some image regions are saturated
> > > + *
> > > + * \return The calculated initial gain
> > > + */
> > > +double MeanLuminanceAgc::estimateInitialGain()
> > > +{
> > > +	double yTarget = relativeLuminanceTarget_;
> > > +	double yGain = 1.0;
> > > +
> > > +	for (unsigned int i = 0; i < 8; i++) {
> > > +		double yValue = estimateLuminance(yGain);
> > > +		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
> > > +
> > > +		yGain *= extra_gain;
> > > +		LOG(Agc, Debug) << "Y value: " << yValue
> > > +				<< ", Y target: " << yTarget
> > > +				<< ", gives gain " << yGain;
> > > +
> > > +		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
> > > +			break;
> > > +	}
> > > +
> > > +	return yGain;
> > > +}
> > > +
> > > +/**
> > > + * \brief Clamp gain within the bounds of a defined constraint
> > > + * \param[in] constraintModeIndex The index of the constraint to adhere to
> > > + * \param[in] hist A histogram over which to calculate inter-quantile means
> > > + * \param[in] gain The gain to clamp
> > > + *
> > > + * \return The gain clamped within the constraint bounds
> > > + */
> > > +double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
> > > +					     const Histogram &hist,
> > > +					     double gain)
> > > +{
> > > +	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
> > > +	for (const AgcConstraint &constraint : constraints) {
> > > +		double newGain = constraint.yTarget * hist.bins() /
> > > +				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
> > > +
> > > +		if (constraint.bound == AgcConstraint::Bound::LOWER &&
> > > +		    newGain > gain)
> > > +			gain = newGain;
> > > +
> > > +		if (constraint.bound == AgcConstraint::Bound::UPPER &&
> > > +		    newGain < gain)
> > > +			gain = newGain;
> > > +	}
> > > +
> > > +	return gain;
> > > +}
> > > +
> > > +/**
> > > + * \brief Apply a filter on the exposure value to limit the speed of changes
> > > + * \param[in] exposureValue The target exposure from the AGC algorithm
> > > + *
> > > + * The speed of the filter is adaptive, and will produce the target quicker
> > > + * during startup, or when the target exposure is within 20% of the most recent
> > > + * filter output.
> > > + *
> > > + * \return The filtered exposure
> > > + */
> > > +utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
> > > +{
> > > +	double speed = 0.2;
> > > +
> > > +	/* Adapt instantly if we are in startup phase. */
> > > +	if (frameCount_ < kNumStartupFrames)
> > > +		speed = 1.0;
> > > +
> > > +	/*
> > > +	 * If we are close to the desired result, go faster to avoid making
> > > +	 * multiple micro-adjustments.
> > > +	 * \todo Make this customisable?
> > > +	 */
> > > +	if (filteredExposure_ < 1.2 * exposureValue &&
> > > +	    filteredExposure_ > 0.8 * exposureValue)
> > > +		speed = sqrt(speed);
> > > +
> > > +	filteredExposure_ = speed * exposureValue +
> > > +			    filteredExposure_ * (1.0 - speed);
> > > +
> > > +	return filteredExposure_;
> > > +}
> > > +
> > > +/**
> > > + * \brief Calculate the new exposure value
> > > + * \param[in] constraintModeIndex The index of the current constraint mode
> > > + * \param[in] exposureModeIndex The index of the current exposure mode
> > > + * \param[in] yHist A Histogram from the ISP statistics to use in constraining
> > > + *	      the calculated gain
> > > + * \param[in] effectiveExposureValue The EV applied to the frame from which the
> > > + *	      statistics in use derive
> > > + *
> > > + * Calculate a new exposure value to try to obtain the target. The calculated
> > > + * exposure value is filtered to prevent rapid changes from frame to frame, and
> > > + * divided into shutter time, analogue and digital gain.
> > > + *
> > > + * \return Tuple of shutter time, analogue gain, and digital gain
> > > + */
> > > +std::tuple<utils::Duration, double, double>
> > > +MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
> > > +				 uint32_t exposureModeIndex,
> > > +				 const Histogram &yHist,
> > > +				 utils::Duration effectiveExposureValue)
> > > +{
> > > +	/*
> > > +	 * The pipeline handler should validate that we have received an allowed
> > > +	 * value for AeExposureMode.
> > > +	 */
> > > +	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
> > > +		exposureModeHelpers_.at(exposureModeIndex);
> > > +
> > > +	double gain = estimateInitialGain();
> > > +	gain = constraintClampGain(constraintModeIndex, yHist, gain);
> > > +
> > > +	/*
> > > +	 * We don't check whether we're already close to the target, because
> > > +	 * even if the effective exposure value is the same as the last frame's
> > > +	 * we could have switched to an exposure mode that would require a new
> > > +	 * pass through the splitExposure() function.
> > > +	 */
> > > +
> > > +	utils::Duration newExposureValue = effectiveExposureValue * gain;
> > > +	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
> > > +					   * exposureModeHelper->maxGain();
> > > +	newExposureValue = std::min(newExposureValue, maxTotalExposure);
> > > +
> > > +	/*
> > > +	 * We filter the exposure value to make sure changes are not too jarring
> > > +	 * from frame to frame.
> > > +	 */
> > > +	newExposureValue = filterExposure(newExposureValue);
> > > +
> > > +	frameCount_++;
> > > +	return exposureModeHelper->splitExposure(newExposureValue);
> > > +}
> > > +
> > > +}; /* namespace ipa */
> > > +
> > > +}; /* namespace libcamera */
> > > diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
> > > new file mode 100644
> > > index 00000000..902a359a
> > > --- /dev/null
> > > +++ b/src/ipa/libipa/agc.h
> > > @@ -0,0 +1,82 @@
> > > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > > +/*
> > > + * Copyright (C) 2024 Ideas on Board Oy
> > > + *
> > > + agc.h - Base class for libipa-compliant AGC algorithms
> > > + */
> > > +
> > > +#pragma once
> > > +
> > > +#include <tuple>
> > > +#include <vector>
> > > +
> > > +#include <libcamera/controls.h>
> > > +
> > > +#include "libcamera/internal/yaml_parser.h"
> > > +
> > > +#include "exposure_mode_helper.h"
> > > +#include "histogram.h"
> > > +
> > > +namespace libcamera {
> > > +
> > > +namespace ipa {
> > > +
> > > +class MeanLuminanceAgc
> > > +{
> > > +public:
> > > +	MeanLuminanceAgc();
> > > +	virtual ~MeanLuminanceAgc() = default;
> 
> No need to make this inline, define the destructor in the .cpp with
> 
> MeanLumimanceAgc::~MeanLuminanceAgc() = default;
> 
> > > +
> > > +	struct AgcConstraint {
> > > +		enum class Bound {
> > > +			LOWER = 0,
> > > +			UPPER = 1
> 
> Wrong coding style.
> 
> > > +		};
> > > +		Bound bound;
> > > +		double qLo;
> > > +		double qHi;
> > > +		double yTarget;
> > > +	};
> > > +
> > > +	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
> > > +	void parseConstraint(const YamlObject &modeDict, int32_t id);
> 
> This function doesn't seem to be called from the derived classes, you
> can make it private. Same for other functions below.
> 
> > > +	int parseConstraintModes(const YamlObject &tuningData);
> > > +	int parseExposureModes(const YamlObject &tuningData);
> 
> Is there a reason to split the parsing in multiple functions, given that
> they are called the same from from both the rkisp1 and ipu3
> implementations ?

(I assume this code came from what I wrote) It made the parent parsing
function more readable/nice imo :)

> 
> > > +
> > > +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
> > > +	{
> > > +		return constraintModes_;
> > > +	}
> 
> This seems unused, so the AgcConstraint structure can be made private.

I'm using it! (on patches on top)

The alternative is to make these protected as opposed to private, which
might be better than getter functions.

> 
> > > +
> > > +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
> > > +	{
> > > +		return exposureModeHelpers_;
> > > +	}
> 
> Can we avoid exposing this too ?

Same here.

> 
> I never expected there would be multiple helpers when reading the
> documentation of the ExposureModeHelper class. That may be due to
> missing documentation.

Yeah I didn't write any documentation for it...

The ExposureModeHelper only helps for a single ExposureMode so...

Should we instead have an ExposuresModeHelper? I didn't feel too
strongly that I needed that level (when I wrote it originally) and that
a map of ExposureModeHelpers was enough.

> 
> > > +
> > > +	ControlInfoMap::Map controls()
> > > +	{
> > > +		return controls_;
> > > +	}
> 
> Hmmm... We don't have precendents for pushing ControlInfo to algorithms.
> Are we sure we want to go that way ?

I added a similar pattern in the rkisp1 AGC/IPA (which I assume is where
this comes from?). But that was so that the algorithms could report to
the IPA what controls and values are supported so that the IPA can in
turn report it to the pipeline handler (and then to the application).

So this hunk actually conflicts with what I added (in "ipa: rkisp1: agc:
Plumb mode-selection and frame duration controls " [2]) because the
rkisp1 agc now inherits from both this and rkisp1 algo. That's why I
have to specify Algorithm::controls_.

[2] https://patchwork.libcamera.org/patch/19855/


Thanks,

Paul

> 
> > > +
> > > +	virtual double estimateLuminance(const double gain) = 0;
> > > +	double estimateInitialGain();
> > > +	double constraintClampGain(uint32_t constraintModeIndex,
> > > +				   const Histogram &hist,
> > > +				   double gain);
> > > +	utils::Duration filterExposure(utils::Duration exposureValue);
> > > +	std::tuple<utils::Duration, double, double>
> > > +	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
> > > +		       const Histogram &yHist, utils::Duration effectiveExposureValue);
> 
> Missing blank line.
> 
> > > +private:
> > > +	uint64_t frameCount_;
> > > +	utils::Duration filteredExposure_;
> > > +	double relativeLuminanceTarget_;
> > > +
> > > +	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
> > > +	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
> > > +	ControlInfoMap::Map controls_;
> > > +};
> > > +
> > > +}; /* namespace ipa */
> > > +
> > > +}; /* namespace libcamera */
> > > diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
> > > index 37fbd177..31cc8d70 100644
> > > --- a/src/ipa/libipa/meson.build
> > > +++ b/src/ipa/libipa/meson.build
> > > @@ -1,6 +1,7 @@
> > >  # SPDX-License-Identifier: CC0-1.0
> > >
> > >  libipa_headers = files([
> > > +    'agc.h',
> > >      'algorithm.h',
> > >      'camera_sensor_helper.h',
> > >      'exposure_mode_helper.h',
> > > @@ -10,6 +11,7 @@ libipa_headers = files([
> > >  ])
> > >
> > >  libipa_sources = files([
> > > +    'agc.cpp',
> > >      'algorithm.cpp',
> > >      'camera_sensor_helper.cpp',
> > >      'exposure_mode_helper.cpp',

Patch
diff mbox series

diff --git a/src/ipa/libipa/agc.cpp b/src/ipa/libipa/agc.cpp
new file mode 100644
index 00000000..af57a571
--- /dev/null
+++ b/src/ipa/libipa/agc.cpp
@@ -0,0 +1,526 @@ 
+/* SPDX-License-Identifier: LGPL-2.1-or-later */
+/*
+ * Copyright (C) 2024 Ideas on Board Oy
+ *
+ * agc.cpp - Base class for libipa-compliant AGC algorithms
+ */
+
+#include "agc.h"
+
+#include <cmath>
+
+#include <libcamera/base/log.h>
+#include <libcamera/control_ids.h>
+
+#include "exposure_mode_helper.h"
+
+using namespace libcamera::controls;
+
+/**
+ * \file agc.h
+ * \brief Base class implementing mean luminance AEGC.
+ */
+
+namespace libcamera {
+
+using namespace std::literals::chrono_literals;
+
+LOG_DEFINE_CATEGORY(Agc)
+
+namespace ipa {
+
+/*
+ * Number of frames to wait before calculating stats on minimum exposure
+ * \todo should this be a tunable value?
+ */
+static constexpr uint32_t kNumStartupFrames = 10;
+
+/*
+ * Default relative luminance target
+ *
+ * This value should be chosen so that when the camera points at a grey target,
+ * the resulting image brightness looks "right". Custom values can be passed
+ * as the relativeLuminanceTarget value in sensor tuning files.
+ */
+static constexpr double kDefaultRelativeLuminanceTarget = 0.16;
+
+/**
+ * \struct MeanLuminanceAgc::AgcConstraint
+ * \brief The boundaries and target for an AeConstraintMode constraint
+ *
+ * This structure describes an AeConstraintMode constraint for the purposes of
+ * this algorithm. The algorithm will apply the constraints by calculating the
+ * Histogram's inter-quantile mean between the given quantiles and ensure that
+ * the resulting value is the right side of the given target (as defined by the
+ * boundary and luminance target).
+ */
+
+/**
+ * \enum MeanLuminanceAgc::AgcConstraint::Bound
+ * \brief Specify whether the constraint defines a lower or upper bound
+ * \var MeanLuminanceAgc::AgcConstraint::LOWER
+ * \brief The constraint defines a lower bound
+ * \var MeanLuminanceAgc::AgcConstraint::UPPER
+ * \brief The constraint defines an upper bound
+ */
+
+/**
+ * \var MeanLuminanceAgc::AgcConstraint::bound
+ * \brief The type of constraint bound
+ */
+
+/**
+ * \var MeanLuminanceAgc::AgcConstraint::qLo
+ * \brief The lower quantile to use for the constraint
+ */
+
+/**
+ * \var MeanLuminanceAgc::AgcConstraint::qHi
+ * \brief The upper quantile to use for the constraint
+ */
+
+/**
+ * \var MeanLuminanceAgc::AgcConstraint::yTarget
+ * \brief The luminance target for the constraint
+ */
+
+/**
+ * \class MeanLuminanceAgc
+ * \brief a mean-based auto-exposure algorithm
+ *
+ * This algorithm calculates a shutter time, analogue and digital gain such that
+ * the normalised mean luminance value of an image is driven towards a target,
+ * which itself is discovered from tuning data. The algorithm is a two-stage
+ * process:
+ *
+ * In the first stage, an initial gain value is derived by iteratively comparing
+ * the gain-adjusted mean luminance across an entire image against a target, and
+ * selecting a value which pushes it as closely as possible towards the target.
+ *
+ * In the second stage we calculate the gain required to drive the average of a
+ * section of a histogram to a target value, where the target and the boundaries
+ * of the section of the histogram used in the calculation are taken from the
+ * values defined for the currently configured AeConstraintMode within the
+ * tuning data. The gain from the first stage is then clamped to the gain from
+ * this stage.
+ *
+ * The final gain is used to adjust the effective exposure value of the image,
+ * and that new exposure value divided into shutter time, analogue gain and
+ * digital gain according to the selected AeExposureMode.
+ */
+
+MeanLuminanceAgc::MeanLuminanceAgc()
+	: frameCount_(0), filteredExposure_(0s), relativeLuminanceTarget_(0)
+{
+}
+
+/**
+ * \brief Parse the relative luminance target from the tuning data
+ * \param[in] tuningData The YamlObject holding the algorithm's tuning data
+ */
+void MeanLuminanceAgc::parseRelativeLuminanceTarget(const YamlObject &tuningData)
+{
+	relativeLuminanceTarget_ =
+		tuningData["relativeLuminanceTarget"].get<double>(kDefaultRelativeLuminanceTarget);
+}
+
+/**
+ * \brief Parse an AeConstraintMode constraint from tuning data
+ * \param[in] modeDict the YamlObject holding the constraint data
+ * \param[in] id The constraint ID from AeConstraintModeEnum
+ */
+void MeanLuminanceAgc::parseConstraint(const YamlObject &modeDict, int32_t id)
+{
+	for (const auto &[boundName, content] : modeDict.asDict()) {
+		if (boundName != "upper" && boundName != "lower") {
+			LOG(Agc, Warning)
+				<< "Ignoring unknown constraint bound '" << boundName << "'";
+			continue;
+		}
+
+		unsigned int idx = static_cast<unsigned int>(boundName == "upper");
+		AgcConstraint::Bound bound = static_cast<AgcConstraint::Bound>(idx);
+		double qLo = content["qLo"].get<double>().value_or(0.98);
+		double qHi = content["qHi"].get<double>().value_or(1.0);
+		double yTarget =
+			content["yTarget"].getList<double>().value_or(std::vector<double>{ 0.5 }).at(0);
+
+		AgcConstraint constraint = { bound, qLo, qHi, yTarget };
+
+		if (!constraintModes_.count(id))
+			constraintModes_[id] = {};
+
+		if (idx)
+			constraintModes_[id].push_back(constraint);
+		else
+			constraintModes_[id].insert(constraintModes_[id].begin(), constraint);
+	}
+}
+
+/**
+ * \brief Parse tuning data file to populate AeConstraintMode control
+ * \param[in] tuningData the YamlObject representing the tuning data for Agc
+ *
+ * The Agc algorithm's tuning data should contain a dictionary called
+ * AeConstraintMode containing per-mode setting dictionaries with the key being
+ * a value from \ref controls::AeConstraintModeNameValueMap. Each mode dict may
+ * contain either a "lower" or "upper" key, or both, in this format:
+ *
+ * \code{.unparsed}
+ * algorithms:
+ *   - Agc:
+ *       AeConstraintMode:
+ *         ConstraintNormal:
+ *           lower:
+ *             qLo: 0.98
+ *             qHi: 1.0
+ *             yTarget: 0.5
+ *         ConstraintHighlight:
+ *           lower:
+ *             qLo: 0.98
+ *             qHi: 1.0
+ *             yTarget: 0.5
+ *           upper:
+ *             qLo: 0.98
+ *             qHi: 1.0
+ *             yTarget: 0.8
+ *
+ * \endcode
+ *
+ * The parsed dictionaries are used to populate an array of available values for
+ * the AeConstraintMode control and stored for later use in the algorithm.
+ *
+ * \return -EINVAL Where a defined constraint mode is invalid
+ * \return 0 on success
+ */
+int MeanLuminanceAgc::parseConstraintModes(const YamlObject &tuningData)
+{
+	std::vector<ControlValue> availableConstraintModes;
+
+	const YamlObject &yamlConstraintModes = tuningData[controls::AeConstraintMode.name()];
+	if (yamlConstraintModes.isDictionary()) {
+		for (const auto &[modeName, modeDict] : yamlConstraintModes.asDict()) {
+			if (AeConstraintModeNameValueMap.find(modeName) ==
+			    AeConstraintModeNameValueMap.end()) {
+				LOG(Agc, Warning)
+					<< "Skipping unknown constraint mode '" << modeName << "'";
+				continue;
+			}
+
+			if (!modeDict.isDictionary()) {
+				LOG(Agc, Error)
+					<< "Invalid constraint mode '" << modeName << "'";
+				return -EINVAL;
+			}
+
+			parseConstraint(modeDict,
+					AeConstraintModeNameValueMap.at(modeName));
+			availableConstraintModes.push_back(
+				AeConstraintModeNameValueMap.at(modeName));
+		}
+	}
+
+	/*
+	 * If the tuning data file contains no constraints then we use the
+	 * default constraint that the various Agc algorithms were adhering to
+	 * anyway before centralisation.
+	 */
+	if (constraintModes_.empty()) {
+		AgcConstraint constraint = {
+			AgcConstraint::Bound::LOWER,
+			0.98,
+			1.0,
+			0.5
+		};
+
+		constraintModes_[controls::ConstraintNormal].insert(
+			constraintModes_[controls::ConstraintNormal].begin(),
+			constraint);
+		availableConstraintModes.push_back(
+			AeConstraintModeNameValueMap.at("ConstraintNormal"));
+	}
+
+	controls_[&controls::AeConstraintMode] = ControlInfo(availableConstraintModes);
+
+	return 0;
+}
+
+/**
+ * \brief Parse tuning data file to populate AeExposureMode control
+ * \param[in] tuningData the YamlObject representing the tuning data for Agc
+ *
+ * The Agc algorithm's tuning data should contain a dictionary called
+ * AeExposureMode containing per-mode setting dictionaries with the key being
+ * a value from \ref controls::AeExposureModeNameValueMap. Each mode dict should
+ * contain an array of shutter times with the key "shutter" and an array of gain
+ * values with the key "gain", in this format:
+ *
+ * \code{.unparsed}
+ * algorithms:
+ *   - Agc:
+ *       AeExposureMode:
+ *         ExposureNormal:
+ *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
+ *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
+ *         ExposureShort:
+ *           shutter: [ 100, 10000, 30000, 60000, 120000 ]
+ *           gain: [ 1.0, 2.0, 4.0, 6.0, 6.0 ]
+ *
+ * \endcode
+ *
+ * The parsed dictionaries are used to populate an array of available values for
+ * the AeExposureMode control and to create ExposureModeHelpers
+ *
+ * \return -EINVAL Where a defined constraint mode is invalid
+ * \return 0 on success
+ */
+int MeanLuminanceAgc::parseExposureModes(const YamlObject &tuningData)
+{
+	std::vector<ControlValue> availableExposureModes;
+	int ret;
+
+	const YamlObject &yamlExposureModes = tuningData[controls::AeExposureMode.name()];
+	if (yamlExposureModes.isDictionary()) {
+		for (const auto &[modeName, modeValues] : yamlExposureModes.asDict()) {
+			if (AeExposureModeNameValueMap.find(modeName) ==
+			    AeExposureModeNameValueMap.end()) {
+				LOG(Agc, Warning)
+					<< "Skipping unknown exposure mode '" << modeName << "'";
+				continue;
+			}
+
+			if (!modeValues.isDictionary()) {
+				LOG(Agc, Error)
+					<< "Invalid exposure mode '" << modeName << "'";
+				return -EINVAL;
+			}
+
+			std::vector<uint32_t> shutters =
+				modeValues["shutter"].getList<uint32_t>().value_or(std::vector<uint32_t>{});
+			std::vector<double> gains =
+				modeValues["gain"].getList<double>().value_or(std::vector<double>{});
+
+			std::vector<utils::Duration> shutterDurations;
+			std::transform(shutters.begin(), shutters.end(),
+				std::back_inserter(shutterDurations),
+				[](uint32_t time) { return std::chrono::microseconds(time); });
+
+			std::shared_ptr<ExposureModeHelper> helper =
+				std::make_shared<ExposureModeHelper>();
+			if ((ret = helper->init(shutterDurations, gains) < 0)) {
+				LOG(Agc, Error)
+					<< "Failed to parse exposure mode '" << modeName << "'";
+				return ret;
+			}
+
+			exposureModeHelpers_[AeExposureModeNameValueMap.at(modeName)] = helper;
+			availableExposureModes.push_back(AeExposureModeNameValueMap.at(modeName));
+		}
+	}
+
+	/*
+	 * If we don't have any exposure modes in the tuning data we create an
+	 * ExposureModeHelper using empty shutter time and gain arrays, which
+	 * will then internally simply drive the shutter as high as possible
+	 * before touching gain
+	 */
+	if (availableExposureModes.empty()) {
+		int32_t exposureModeId = AeExposureModeNameValueMap.at("ExposureNormal");
+		std::vector<utils::Duration> shutterDurations = {};
+		std::vector<double> gains = {};
+
+		std::shared_ptr<ExposureModeHelper> helper =
+			std::make_shared<ExposureModeHelper>();
+		if ((ret = helper->init(shutterDurations, gains) < 0)) {
+			LOG(Agc, Error)
+				<< "Failed to create default ExposureModeHelper";
+			return ret;
+		}
+
+		exposureModeHelpers_[exposureModeId] = helper;
+		availableExposureModes.push_back(exposureModeId);
+	}
+
+	controls_[&controls::AeExposureMode] = ControlInfo(availableExposureModes);
+
+	return 0;
+}
+
+/**
+ * \fn MeanLuminanceAgc::constraintModes()
+ * \brief Get the constraint modes that have been parsed from tuning data
+ */
+
+/**
+ * \fn MeanLuminanceAgc::exposureModeHelpers()
+ * \brief Get the ExposureModeHelpers that have been parsed from tuning data
+ */
+
+/**
+ * \fn MeanLuminanceAgc::controls()
+ * \brief Get the controls that have been generated after parsing tuning data
+ */
+
+/**
+ * \fn MeanLuminanceAgc::estimateLuminance(const double gain)
+ * \brief Estimate the luminance of an image, adjusted by a given gain
+ * \param[in] gain The gain with which to adjust the luminance estimate
+ *
+ * This function is a pure virtual function because estimation of luminance is a
+ * hardware-specific operation, which depends wholly on the format of the stats
+ * that are delivered to libcamera from the ISP. Derived classes must implement
+ * an overriding function that calculates the normalised mean luminance value
+ * across the entire image.
+ *
+ * \return The normalised relative luminance of the image
+ */
+
+/**
+ * \brief Estimate the initial gain needed to achieve a relative luminance
+ *        target
+ *
+ * To account for non-linearity caused by saturation, the value needs to be
+ * estimated in an iterative process, as multiplying by a gain will not increase
+ * the relative luminance by the same factor if some image regions are saturated
+ *
+ * \return The calculated initial gain
+ */
+double MeanLuminanceAgc::estimateInitialGain()
+{
+	double yTarget = relativeLuminanceTarget_;
+	double yGain = 1.0;
+
+	for (unsigned int i = 0; i < 8; i++) {
+		double yValue = estimateLuminance(yGain);
+		double extra_gain = std::min(10.0, yTarget / (yValue + .001));
+
+		yGain *= extra_gain;
+		LOG(Agc, Debug) << "Y value: " << yValue
+				<< ", Y target: " << yTarget
+				<< ", gives gain " << yGain;
+
+		if (utils::abs_diff(extra_gain, 1.0) < 0.01)
+			break;
+	}
+
+	return yGain;
+}
+
+/**
+ * \brief Clamp gain within the bounds of a defined constraint
+ * \param[in] constraintModeIndex The index of the constraint to adhere to
+ * \param[in] hist A histogram over which to calculate inter-quantile means
+ * \param[in] gain The gain to clamp
+ *
+ * \return The gain clamped within the constraint bounds
+ */
+double MeanLuminanceAgc::constraintClampGain(uint32_t constraintModeIndex,
+					     const Histogram &hist,
+					     double gain)
+{
+	std::vector<AgcConstraint> &constraints = constraintModes_[constraintModeIndex];
+	for (const AgcConstraint &constraint : constraints) {
+		double newGain = constraint.yTarget * hist.bins() /
+				 hist.interQuantileMean(constraint.qLo, constraint.qHi);
+
+		if (constraint.bound == AgcConstraint::Bound::LOWER &&
+		    newGain > gain)
+			gain = newGain;
+
+		if (constraint.bound == AgcConstraint::Bound::UPPER &&
+		    newGain < gain)
+			gain = newGain;
+	}
+
+	return gain;
+}
+
+/**
+ * \brief Apply a filter on the exposure value to limit the speed of changes
+ * \param[in] exposureValue The target exposure from the AGC algorithm
+ *
+ * The speed of the filter is adaptive, and will produce the target quicker
+ * during startup, or when the target exposure is within 20% of the most recent
+ * filter output.
+ *
+ * \return The filtered exposure
+ */
+utils::Duration MeanLuminanceAgc::filterExposure(utils::Duration exposureValue)
+{
+	double speed = 0.2;
+
+	/* Adapt instantly if we are in startup phase. */
+	if (frameCount_ < kNumStartupFrames)
+		speed = 1.0;
+
+	/*
+	 * If we are close to the desired result, go faster to avoid making
+	 * multiple micro-adjustments.
+	 * \todo Make this customisable?
+	 */
+	if (filteredExposure_ < 1.2 * exposureValue &&
+	    filteredExposure_ > 0.8 * exposureValue)
+		speed = sqrt(speed);
+
+	filteredExposure_ = speed * exposureValue +
+			    filteredExposure_ * (1.0 - speed);
+
+	return filteredExposure_;
+}
+
+/**
+ * \brief Calculate the new exposure value
+ * \param[in] constraintModeIndex The index of the current constraint mode
+ * \param[in] exposureModeIndex The index of the current exposure mode
+ * \param[in] yHist A Histogram from the ISP statistics to use in constraining
+ *	      the calculated gain
+ * \param[in] effectiveExposureValue The EV applied to the frame from which the
+ *	      statistics in use derive
+ *
+ * Calculate a new exposure value to try to obtain the target. The calculated
+ * exposure value is filtered to prevent rapid changes from frame to frame, and
+ * divided into shutter time, analogue and digital gain.
+ *
+ * \return Tuple of shutter time, analogue gain, and digital gain
+ */
+std::tuple<utils::Duration, double, double>
+MeanLuminanceAgc::calculateNewEv(uint32_t constraintModeIndex,
+				 uint32_t exposureModeIndex,
+				 const Histogram &yHist,
+				 utils::Duration effectiveExposureValue)
+{
+	/*
+	 * The pipeline handler should validate that we have received an allowed
+	 * value for AeExposureMode.
+	 */
+	std::shared_ptr<ExposureModeHelper> exposureModeHelper =
+		exposureModeHelpers_.at(exposureModeIndex);
+
+	double gain = estimateInitialGain();
+	gain = constraintClampGain(constraintModeIndex, yHist, gain);
+
+	/*
+	 * We don't check whether we're already close to the target, because
+	 * even if the effective exposure value is the same as the last frame's
+	 * we could have switched to an exposure mode that would require a new
+	 * pass through the splitExposure() function.
+	 */
+
+	utils::Duration newExposureValue = effectiveExposureValue * gain;
+	utils::Duration maxTotalExposure = exposureModeHelper->maxShutter()
+					   * exposureModeHelper->maxGain();
+	newExposureValue = std::min(newExposureValue, maxTotalExposure);
+
+	/*
+	 * We filter the exposure value to make sure changes are not too jarring
+	 * from frame to frame.
+	 */
+	newExposureValue = filterExposure(newExposureValue);
+
+	frameCount_++;
+	return exposureModeHelper->splitExposure(newExposureValue);
+}
+
+}; /* namespace ipa */
+
+}; /* namespace libcamera */
diff --git a/src/ipa/libipa/agc.h b/src/ipa/libipa/agc.h
new file mode 100644
index 00000000..902a359a
--- /dev/null
+++ b/src/ipa/libipa/agc.h
@@ -0,0 +1,82 @@ 
+/* SPDX-License-Identifier: LGPL-2.1-or-later */
+/*
+ * Copyright (C) 2024 Ideas on Board Oy
+ *
+ agc.h - Base class for libipa-compliant AGC algorithms
+ */
+
+#pragma once
+
+#include <tuple>
+#include <vector>
+
+#include <libcamera/controls.h>
+
+#include "libcamera/internal/yaml_parser.h"
+
+#include "exposure_mode_helper.h"
+#include "histogram.h"
+
+namespace libcamera {
+
+namespace ipa {
+
+class MeanLuminanceAgc
+{
+public:
+	MeanLuminanceAgc();
+	virtual ~MeanLuminanceAgc() = default;
+
+	struct AgcConstraint {
+		enum class Bound {
+			LOWER = 0,
+			UPPER = 1
+		};
+		Bound bound;
+		double qLo;
+		double qHi;
+		double yTarget;
+	};
+
+	void parseRelativeLuminanceTarget(const YamlObject &tuningData);
+	void parseConstraint(const YamlObject &modeDict, int32_t id);
+	int parseConstraintModes(const YamlObject &tuningData);
+	int parseExposureModes(const YamlObject &tuningData);
+
+	std::map<int32_t, std::vector<AgcConstraint>> constraintModes()
+	{
+		return constraintModes_;
+	}
+
+	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers()
+	{
+		return exposureModeHelpers_;
+	}
+
+	ControlInfoMap::Map controls()
+	{
+		return controls_;
+	}
+
+	virtual double estimateLuminance(const double gain) = 0;
+	double estimateInitialGain();
+	double constraintClampGain(uint32_t constraintModeIndex,
+				   const Histogram &hist,
+				   double gain);
+	utils::Duration filterExposure(utils::Duration exposureValue);
+	std::tuple<utils::Duration, double, double>
+	calculateNewEv(uint32_t constraintModeIndex, uint32_t exposureModeIndex,
+		       const Histogram &yHist, utils::Duration effectiveExposureValue);
+private:
+	uint64_t frameCount_;
+	utils::Duration filteredExposure_;
+	double relativeLuminanceTarget_;
+
+	std::map<int32_t, std::vector<AgcConstraint>> constraintModes_;
+	std::map<int32_t, std::shared_ptr<ExposureModeHelper>> exposureModeHelpers_;
+	ControlInfoMap::Map controls_;
+};
+
+}; /* namespace ipa */
+
+}; /* namespace libcamera */
diff --git a/src/ipa/libipa/meson.build b/src/ipa/libipa/meson.build
index 37fbd177..31cc8d70 100644
--- a/src/ipa/libipa/meson.build
+++ b/src/ipa/libipa/meson.build
@@ -1,6 +1,7 @@ 
 # SPDX-License-Identifier: CC0-1.0
 
 libipa_headers = files([
+    'agc.h',
     'algorithm.h',
     'camera_sensor_helper.h',
     'exposure_mode_helper.h',
@@ -10,6 +11,7 @@  libipa_headers = files([
 ])
 
 libipa_sources = files([
+    'agc.cpp',
     'algorithm.cpp',
     'camera_sensor_helper.cpp',
     'exposure_mode_helper.cpp',