[{"id":27761,"web_url":"https://patchwork.libcamera.org/comment/27761/","msgid":"<brn2l4a6hepdgnl4txnhhebgnl66lgda5edzj4cp5dwrgpwckp@wps62szgdufb>","date":"2023-09-12T14:46:01","subject":"Re: [libcamera-devel] [PATCH v3 2/5] ipa: rpi: agc: Reorganise code\n\tfor multi-channel AGC","submitter":{"id":143,"url":"https://patchwork.libcamera.org/api/people/143/","name":"Jacopo Mondi","email":"jacopo.mondi@ideasonboard.com"},"content":"Hi David\n\nOn Tue, Sep 12, 2023 at 11:24:39AM +0100, David Plowman via libcamera-devel wrote:\n> This commit does the basic reorganisation of the code in order to\n> implement multi-channel AGC. The main changes are:\n>\n> * The previous Agc class (in agc.cpp) has become the AgcChannel class\n>   in (agc_channel.cpp).\n>\n> * A new Agc class is introduced which is a wrapper round a number of\n>   AgcChannels.\n>\n> * The basic plumbing from ipa_base.cpp to Agc is updated to include a\n>   channel number. All the existing controls are hardwired to talk\n>   directly to channel 0.\n>\n> There are a couple of limitations which we expect to apply to\n> multi-channel AGC. We're not allowing different frame durations to be\n> applied to the channels, nor are we allowing separate metering\n> modes. To be fair, supporting these things is not impossible, but\n> there are reasons why it may be tricky so they remain \"TBD\" for now.\n>\n> This patch only includes the basic reorganisation and plumbing. It\n> does not yet update the important methods (switchMode, prepare and\n> process) to implement multi-channel AGC properly. This will appear in\n> a subsequent commit. 
For now, these functions are hard-coded just to\n> use channel 0, thereby preserving the existing behaviour.\n>\n> Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> ---\n>  src/ipa/rpi/common/ipa_base.cpp            |  20 +-\n>  src/ipa/rpi/controller/agc_algorithm.h     |  19 +-\n>  src/ipa/rpi/controller/meson.build         |   1 +\n>  src/ipa/rpi/controller/rpi/agc.cpp         | 912 +++-----------------\n>  src/ipa/rpi/controller/rpi/agc.h           | 121 +--\n>  src/ipa/rpi/controller/rpi/agc_channel.cpp | 924 +++++++++++++++++++++\n>  src/ipa/rpi/controller/rpi/agc_channel.h   | 137 +++\n>  7 files changed, 1219 insertions(+), 915 deletions(-)\n>  create mode 100644 src/ipa/rpi/controller/rpi/agc_channel.cpp\n>  create mode 100644 src/ipa/rpi/controller/rpi/agc_channel.h\n>\n> diff --git a/src/ipa/rpi/common/ipa_base.cpp b/src/ipa/rpi/common/ipa_base.cpp\n> index a47ae3a9..f7e7ad5e 100644\n> --- a/src/ipa/rpi/common/ipa_base.cpp\n> +++ b/src/ipa/rpi/common/ipa_base.cpp\n> @@ -699,9 +699,9 @@ void IpaBase::applyControls(const ControlList &controls)\n>  \t\t\t}\n>\n>  \t\t\tif (ctrl.second.get<bool>() == false)\n> -\t\t\t\tagc->disableAuto();\n> +\t\t\t\tagc->disableAuto(0);\n>  \t\t\telse\n> -\t\t\t\tagc->enableAuto();\n> +\t\t\t\tagc->enableAuto(0);\n>\n>  \t\t\tlibcameraMetadata_.set(controls::AeEnable, ctrl.second.get<bool>());\n>  \t\t\tbreak;\n> @@ -717,7 +717,7 @@ void IpaBase::applyControls(const ControlList &controls)\n>  \t\t\t}\n>\n>  \t\t\t/* The control provides units of microseconds. 
*/\n> -\t\t\tagc->setFixedShutter(ctrl.second.get<int32_t>() * 1.0us);\n> +\t\t\tagc->setFixedShutter(0, ctrl.second.get<int32_t>() * 1.0us);\n>\n>  \t\t\tlibcameraMetadata_.set(controls::ExposureTime, ctrl.second.get<int32_t>());\n>  \t\t\tbreak;\n> @@ -732,7 +732,7 @@ void IpaBase::applyControls(const ControlList &controls)\n>  \t\t\t\tbreak;\n>  \t\t\t}\n>\n> -\t\t\tagc->setFixedAnalogueGain(ctrl.second.get<float>());\n> +\t\t\tagc->setFixedAnalogueGain(0, ctrl.second.get<float>());\n>\n>  \t\t\tlibcameraMetadata_.set(controls::AnalogueGain,\n>  \t\t\t\t\t       ctrl.second.get<float>());\n> @@ -770,7 +770,7 @@ void IpaBase::applyControls(const ControlList &controls)\n>\n>  \t\t\tint32_t idx = ctrl.second.get<int32_t>();\n>  \t\t\tif (ConstraintModeTable.count(idx)) {\n> -\t\t\t\tagc->setConstraintMode(ConstraintModeTable.at(idx));\n> +\t\t\t\tagc->setConstraintMode(0, ConstraintModeTable.at(idx));\n>  \t\t\t\tlibcameraMetadata_.set(controls::AeConstraintMode, idx);\n>  \t\t\t} else {\n>  \t\t\t\tLOG(IPARPI, Error) << \"Constraint mode \" << idx\n> @@ -790,7 +790,7 @@ void IpaBase::applyControls(const ControlList &controls)\n>\n>  \t\t\tint32_t idx = ctrl.second.get<int32_t>();\n>  \t\t\tif (ExposureModeTable.count(idx)) {\n> -\t\t\t\tagc->setExposureMode(ExposureModeTable.at(idx));\n> +\t\t\t\tagc->setExposureMode(0, ExposureModeTable.at(idx));\n>  \t\t\t\tlibcameraMetadata_.set(controls::AeExposureMode, idx);\n>  \t\t\t} else {\n>  \t\t\t\tLOG(IPARPI, Error) << \"Exposure mode \" << idx\n> @@ -813,7 +813,7 @@ void IpaBase::applyControls(const ControlList &controls)\n>  \t\t\t * So convert to 2^EV\n>  \t\t\t */\n>  \t\t\tdouble ev = pow(2.0, ctrl.second.get<float>());\n> -\t\t\tagc->setEv(ev);\n> +\t\t\tagc->setEv(0, ev);\n>  \t\t\tlibcameraMetadata_.set(controls::ExposureValue,\n>  \t\t\t\t\t       ctrl.second.get<float>());\n>  \t\t\tbreak;\n> @@ -833,12 +833,12 @@ void IpaBase::applyControls(const ControlList &controls)\n>\n>  \t\t\tswitch (mode) {\n>  
\t\t\tcase controls::FlickerOff:\n> -\t\t\t\tagc->setFlickerPeriod(0us);\n> +\t\t\t\tagc->setFlickerPeriod(0, 0us);\n>\n>  \t\t\t\tbreak;\n>\n>  \t\t\tcase controls::FlickerManual:\n> -\t\t\t\tagc->setFlickerPeriod(flickerState_.manualPeriod);\n> +\t\t\t\tagc->setFlickerPeriod(0, flickerState_.manualPeriod);\n>\n>  \t\t\t\tbreak;\n>\n> @@ -872,7 +872,7 @@ void IpaBase::applyControls(const ControlList &controls)\n>  \t\t\t * first, and the period updated after, or vice versa.\n>  \t\t\t */\n>  \t\t\tif (flickerState_.mode == controls::FlickerManual)\n> -\t\t\t\tagc->setFlickerPeriod(flickerState_.manualPeriod);\n> +\t\t\t\tagc->setFlickerPeriod(0, flickerState_.manualPeriod);\n>\n>  \t\t\tbreak;\n>  \t\t}\n> diff --git a/src/ipa/rpi/controller/agc_algorithm.h b/src/ipa/rpi/controller/agc_algorithm.h\n> index b6949daa..b8986560 100644\n> --- a/src/ipa/rpi/controller/agc_algorithm.h\n> +++ b/src/ipa/rpi/controller/agc_algorithm.h\n> @@ -21,16 +21,19 @@ public:\n>  \t/* An AGC algorithm must provide the following: */\n>  \tvirtual unsigned int getConvergenceFrames() const = 0;\n>  \tvirtual std::vector<double> const &getWeights() const = 0;\n> -\tvirtual void setEv(double ev) = 0;\n> -\tvirtual void setFlickerPeriod(libcamera::utils::Duration flickerPeriod) = 0;\n> -\tvirtual void setFixedShutter(libcamera::utils::Duration fixedShutter) = 0;\n> +\tvirtual void setEv(unsigned int channel, double ev) = 0;\n> +\tvirtual void setFlickerPeriod(unsigned int channel,\n> +\t\t\t\t      libcamera::utils::Duration flickerPeriod) = 0;\n> +\tvirtual void setFixedShutter(unsigned int channel,\n> +\t\t\t\t     libcamera::utils::Duration fixedShutter) = 0;\n>  \tvirtual void setMaxShutter(libcamera::utils::Duration maxShutter) = 0;\n> -\tvirtual void setFixedAnalogueGain(double fixedAnalogueGain) = 0;\n> +\tvirtual void setFixedAnalogueGain(unsigned int channel, double fixedAnalogueGain) = 0;\n>  \tvirtual void setMeteringMode(std::string const &meteringModeName) = 0;\n> -\tvirtual 
void setExposureMode(std::string const &exposureModeName) = 0;\n> -\tvirtual void setConstraintMode(std::string const &contraintModeName) = 0;\n> -\tvirtual void enableAuto() = 0;\n> -\tvirtual void disableAuto() = 0;\n> +\tvirtual void setExposureMode(unsigned int channel, std::string const &exposureModeName) = 0;\n> +\tvirtual void setConstraintMode(unsigned int channel, std::string const &contraintModeName) = 0;\n> +\tvirtual void enableAuto(unsigned int channel) = 0;\n> +\tvirtual void disableAuto(unsigned int channel) = 0;\n> +\tvirtual void setActiveChannels(const std::vector<unsigned int> &activeChannels) = 0;\n>  };\n>\n>  } /* namespace RPiController */\n> diff --git a/src/ipa/rpi/controller/meson.build b/src/ipa/rpi/controller/meson.build\n> index feb0334e..20b9cda9 100644\n> --- a/src/ipa/rpi/controller/meson.build\n> +++ b/src/ipa/rpi/controller/meson.build\n> @@ -8,6 +8,7 @@ rpi_ipa_controller_sources = files([\n>      'pwl.cpp',\n>      'rpi/af.cpp',\n>      'rpi/agc.cpp',\n> +    'rpi/agc_channel.cpp',\n>      'rpi/alsc.cpp',\n>      'rpi/awb.cpp',\n>      'rpi/black_level.cpp',\n> diff --git a/src/ipa/rpi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp\n> index 7b02972a..598fc890 100644\n> --- a/src/ipa/rpi/controller/rpi/agc.cpp\n> +++ b/src/ipa/rpi/controller/rpi/agc.cpp\n> @@ -5,20 +5,12 @@\n>   * agc.cpp - AGC/AEC control algorithm\n>   */\n>\n> -#include <algorithm>\n> -#include <map>\n> -#include <tuple>\n> +#include \"agc.h\"\n>\n>  #include <libcamera/base/log.h>\n>\n> -#include \"../awb_status.h\"\n> -#include \"../device_status.h\"\n> -#include \"../histogram.h\"\n> -#include \"../lux_status.h\"\n>  #include \"../metadata.h\"\n>\n> -#include \"agc.h\"\n> -\n>  using namespace RPiController;\n>  using namespace libcamera;\n>  using libcamera::utils::Duration;\n> @@ -28,881 +20,205 @@ LOG_DEFINE_CATEGORY(RPiAgc)\n>\n>  #define NAME \"rpi.agc\"\n>\n> -int AgcMeteringMode::read(const libcamera::YamlObject &params)\n> 
+Agc::Agc(Controller *controller)\n> +\t: AgcAlgorithm(controller),\n> +\t  activeChannels_({ 0 })\n>  {\n> -\tconst YamlObject &yamlWeights = params[\"weights\"];\n> -\n> -\tfor (const auto &p : yamlWeights.asList()) {\n> -\t\tauto value = p.get<double>();\n> -\t\tif (!value)\n> -\t\t\treturn -EINVAL;\n> -\t\tweights.push_back(*value);\n> -\t}\n> -\n> -\treturn 0;\n>  }\n>\n> -static std::tuple<int, std::string>\n> -readMeteringModes(std::map<std::string, AgcMeteringMode> &metering_modes,\n> -\t\t  const libcamera::YamlObject &params)\n> +char const *Agc::name() const\n>  {\n> -\tstd::string first;\n> -\tint ret;\n> -\n> -\tfor (const auto &[key, value] : params.asDict()) {\n> -\t\tAgcMeteringMode meteringMode;\n> -\t\tret = meteringMode.read(value);\n> -\t\tif (ret)\n> -\t\t\treturn { ret, {} };\n> -\n> -\t\tmetering_modes[key] = std::move(meteringMode);\n> -\t\tif (first.empty())\n> -\t\t\tfirst = key;\n> -\t}\n> -\n> -\treturn { 0, first };\n> +\treturn NAME;\n>  }\n>\n> -int AgcExposureMode::read(const libcamera::YamlObject &params)\n> +int Agc::read(const libcamera::YamlObject &params)\n>  {\n> -\tauto value = params[\"shutter\"].getList<double>();\n> -\tif (!value)\n> -\t\treturn -EINVAL;\n> -\tstd::transform(value->begin(), value->end(), std::back_inserter(shutter),\n> -\t\t       [](double v) { return v * 1us; });\n> -\n> -\tvalue = params[\"gain\"].getList<double>();\n> -\tif (!value)\n> -\t\treturn -EINVAL;\n> -\tgain = std::move(*value);\n> -\n> -\tif (shutter.size() < 2 || gain.size() < 2) {\n> -\t\tLOG(RPiAgc, Error)\n> -\t\t\t<< \"AgcExposureMode: must have at least two entries in exposure profile\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tif (shutter.size() != gain.size()) {\n> -\t\tLOG(RPiAgc, Error)\n> -\t\t\t<< \"AgcExposureMode: expect same number of exposure and gain entries in exposure profile\";\n> -\t\treturn -EINVAL;\n> +\t/*\n> +\t * When there is only a single channel we can read the old style syntax.\n> +\t * Otherwise we expect a 
\"channels\" keyword followed by a list of configurations.\n> +\t */\n> +\tif (!params.contains(\"channels\")) {\n> +\t\tLOG(RPiAgc, Debug) << \"Single channel only\";\n> +\t\tchannelData_.emplace_back();\n> +\t\treturn channelData_.back().channel.read(params, getHardwareConfig());\n>  \t}\n>\n> -\treturn 0;\n> -}\n> -\n> -static std::tuple<int, std::string>\n> -readExposureModes(std::map<std::string, AgcExposureMode> &exposureModes,\n> -\t\t  const libcamera::YamlObject &params)\n> -{\n> -\tstd::string first;\n> -\tint ret;\n> -\n> -\tfor (const auto &[key, value] : params.asDict()) {\n> -\t\tAgcExposureMode exposureMode;\n> -\t\tret = exposureMode.read(value);\n> +\tconst auto &channels = params[\"channels\"].asList();\n> +\tfor (auto ch = channels.begin(); ch != channels.end(); ch++) {\n> +\t\tLOG(RPiAgc, Debug) << \"Read AGC channel\";\n> +\t\tchannelData_.emplace_back();\n> +\t\tint ret = channelData_.back().channel.read(*ch, getHardwareConfig());\n>  \t\tif (ret)\n> -\t\t\treturn { ret, {} };\n> -\n> -\t\texposureModes[key] = std::move(exposureMode);\n> -\t\tif (first.empty())\n> -\t\t\tfirst = key;\n> +\t\t\treturn ret;\n>  \t}\n>\n> -\treturn { 0, first };\n> -}\n> -\n> -int AgcConstraint::read(const libcamera::YamlObject &params)\n> -{\n> -\tstd::string boundString = params[\"bound\"].get<std::string>(\"\");\n> -\ttransform(boundString.begin(), boundString.end(),\n> -\t\t  boundString.begin(), ::toupper);\n> -\tif (boundString != \"UPPER\" && boundString != \"LOWER\") {\n> -\t\tLOG(RPiAgc, Error) << \"AGC constraint type should be UPPER or LOWER\";\n> -\t\treturn -EINVAL;\n> +\tLOG(RPiAgc, Debug) << \"Read \" << channelData_.size() << \" channel(s)\";\n> +\tif (channelData_.empty()) {\n> +\t\tLOG(RPiAgc, Error) << \"No AGC channels provided\";\n> +\t\treturn -1;\n>  \t}\n> -\tbound = boundString == \"UPPER\" ? 
Bound::UPPER : Bound::LOWER;\n> -\n> -\tauto value = params[\"q_lo\"].get<double>();\n> -\tif (!value)\n> -\t\treturn -EINVAL;\n> -\tqLo = *value;\n> -\n> -\tvalue = params[\"q_hi\"].get<double>();\n> -\tif (!value)\n> -\t\treturn -EINVAL;\n> -\tqHi = *value;\n> -\n> -\treturn yTarget.read(params[\"y_target\"]);\n> -}\n>\n> -static std::tuple<int, AgcConstraintMode>\n> -readConstraintMode(const libcamera::YamlObject &params)\n> -{\n> -\tAgcConstraintMode mode;\n> -\tint ret;\n> -\n> -\tfor (const auto &p : params.asList()) {\n> -\t\tAgcConstraint constraint;\n> -\t\tret = constraint.read(p);\n> -\t\tif (ret)\n> -\t\t\treturn { ret, {} };\n> -\n> -\t\tmode.push_back(std::move(constraint));\n> -\t}\n> -\n> -\treturn { 0, mode };\n> +\treturn 0;\n>  }\n>\n> -static std::tuple<int, std::string>\n> -readConstraintModes(std::map<std::string, AgcConstraintMode> &constraintModes,\n> -\t\t    const libcamera::YamlObject &params)\n> +int Agc::checkChannel(unsigned int channelIndex) const\n>  {\n> -\tstd::string first;\n> -\tint ret;\n> -\n> -\tfor (const auto &[key, value] : params.asDict()) {\n> -\t\tstd::tie(ret, constraintModes[key]) = readConstraintMode(value);\n> -\t\tif (ret)\n> -\t\t\treturn { ret, {} };\n> -\n> -\t\tif (first.empty())\n> -\t\t\tfirst = key;\n> +\tif (channelIndex >= channelData_.size()) {\n> +\t\tLOG(RPiAgc, Warning) << \"AGC channel \" << channelIndex << \" not available\";\n> +\t\treturn -1;\n>  \t}\n>\n> -\treturn { 0, first };\n> -}\n> -\n> -int AgcConfig::read(const libcamera::YamlObject &params)\n> -{\n> -\tLOG(RPiAgc, Debug) << \"AgcConfig\";\n> -\tint ret;\n> -\n> -\tstd::tie(ret, defaultMeteringMode) =\n> -\t\treadMeteringModes(meteringModes, params[\"metering_modes\"]);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\tstd::tie(ret, defaultExposureMode) =\n> -\t\treadExposureModes(exposureModes, params[\"exposure_modes\"]);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\tstd::tie(ret, defaultConstraintMode) =\n> -\t\treadConstraintModes(constraintModes, 
params[\"constraint_modes\"]);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tret = yTarget.read(params[\"y_target\"]);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tspeed = params[\"speed\"].get<double>(0.2);\n> -\tstartupFrames = params[\"startup_frames\"].get<uint16_t>(10);\n> -\tconvergenceFrames = params[\"convergence_frames\"].get<unsigned int>(6);\n> -\tfastReduceThreshold = params[\"fast_reduce_threshold\"].get<double>(0.4);\n> -\tbaseEv = params[\"base_ev\"].get<double>(1.0);\n> -\n> -\t/* Start with quite a low value as ramping up is easier than ramping down. */\n> -\tdefaultExposureTime = params[\"default_exposure_time\"].get<double>(1000) * 1us;\n> -\tdefaultAnalogueGain = params[\"default_analogue_gain\"].get<double>(1.0);\n> -\n>  \treturn 0;\n>  }\n>\n> -Agc::ExposureValues::ExposureValues()\n> -\t: shutter(0s), analogueGain(0),\n> -\t  totalExposure(0s), totalExposureNoDG(0s)\n> +void Agc::disableAuto(unsigned int channelIndex)\n>  {\n> -}\n> -\n> -Agc::Agc(Controller *controller)\n> -\t: AgcAlgorithm(controller), meteringMode_(nullptr),\n> -\t  exposureMode_(nullptr), constraintMode_(nullptr),\n> -\t  frameCount_(0), lockCount_(0),\n> -\t  lastTargetExposure_(0s), ev_(1.0), flickerPeriod_(0s),\n> -\t  maxShutter_(0s), fixedShutter_(0s), fixedAnalogueGain_(0.0)\n> -{\n> -\tmemset(&awb_, 0, sizeof(awb_));\n> -\t/*\n> -\t * Setting status_.totalExposureValue_ to zero initially tells us\n> -\t * it's not been calculated yet (i.e. 
Process hasn't yet run).\n> -\t */\n> -\tstatus_ = {};\n> -\tstatus_.ev = ev_;\n> -}\n> +\tif (checkChannel(channelIndex))\n> +\t\treturn;\n>\n> -char const *Agc::name() const\n> -{\n> -\treturn NAME;\n> +\tLOG(RPiAgc, Debug) << \"disableAuto for channel \" << channelIndex;\n> +\tchannelData_[channelIndex].channel.disableAuto();\n>  }\n>\n> -int Agc::read(const libcamera::YamlObject &params)\n> +void Agc::enableAuto(unsigned int channelIndex)\n>  {\n> -\tLOG(RPiAgc, Debug) << \"Agc\";\n> -\n> -\tint ret = config_.read(params);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tconst Size &size = getHardwareConfig().agcZoneWeights;\n> -\tfor (auto const &modes : config_.meteringModes) {\n> -\t\tif (modes.second.weights.size() != size.width * size.height) {\n> -\t\t\tLOG(RPiAgc, Error) << \"AgcMeteringMode: Incorrect number of weights\";\n> -\t\t\treturn -EINVAL;\n> -\t\t}\n> -\t}\n> +\tif (checkChannel(channelIndex))\n> +\t\treturn;\n>\n> -\t/*\n> -\t * Set the config's defaults (which are the first ones it read) as our\n> -\t * current modes, until someone changes them.  (they're all known to\n> -\t * exist at this point)\n> -\t */\n> -\tmeteringModeName_ = config_.defaultMeteringMode;\n> -\tmeteringMode_ = &config_.meteringModes[meteringModeName_];\n> -\texposureModeName_ = config_.defaultExposureMode;\n> -\texposureMode_ = &config_.exposureModes[exposureModeName_];\n> -\tconstraintModeName_ = config_.defaultConstraintMode;\n> -\tconstraintMode_ = &config_.constraintModes[constraintModeName_];\n> -\t/* Set up the \"last shutter/gain\" values, in case AGC starts \"disabled\". 
*/\n> -\tstatus_.shutterTime = config_.defaultExposureTime;\n> -\tstatus_.analogueGain = config_.defaultAnalogueGain;\n> -\treturn 0;\n> -}\n> -\n> -void Agc::disableAuto()\n> -{\n> -\tfixedShutter_ = status_.shutterTime;\n> -\tfixedAnalogueGain_ = status_.analogueGain;\n> -}\n> -\n> -void Agc::enableAuto()\n> -{\n> -\tfixedShutter_ = 0s;\n> -\tfixedAnalogueGain_ = 0;\n> +\tLOG(RPiAgc, Debug) << \"enableAuto for channel \" << channelIndex;\n> +\tchannelData_[channelIndex].channel.enableAuto();\n>  }\n>\n>  unsigned int Agc::getConvergenceFrames() const\n>  {\n> -\t/*\n> -\t * If shutter and gain have been explicitly set, there is no\n> -\t * convergence to happen, so no need to drop any frames - return zero.\n> -\t */\n> -\tif (fixedShutter_ && fixedAnalogueGain_)\n> -\t\treturn 0;\n> -\telse\n> -\t\treturn config_.convergenceFrames;\n> +\t/* If there are n channels, it presumably takes n times as long to converge. */\n> +\treturn channelData_[0].channel.getConvergenceFrames() * activeChannels_.size();\n>  }\n>\n>  std::vector<double> const &Agc::getWeights() const\n>  {\n>  \t/*\n> -\t * In case someone calls setMeteringMode and then this before the\n> -\t * algorithm has run and updated the meteringMode_ pointer.\n> +\t * In future the metering weights may be determined differently, making it\n> +\t * difficult to associate different sets of weight with different channels.\n> +\t * Therefore we shall impose a limitation, at least for now, that all\n> +\t * channels will use the same weights.\n>  \t */\n> -\tauto it = config_.meteringModes.find(meteringModeName_);\n> -\tif (it == config_.meteringModes.end())\n> -\t\treturn meteringMode_->weights;\n> -\treturn it->second.weights;\n> +\treturn channelData_[0].channel.getWeights();\n>  }\n>\n> -void Agc::setEv(double ev)\n> +void Agc::setEv(unsigned int channelIndex, double ev)\n>  {\n> -\tev_ = ev;\n> -}\n> +\tif (checkChannel(channelIndex))\n> +\t\treturn;\n>\n> -void Agc::setFlickerPeriod(Duration 
flickerPeriod)\n> -{\n> -\tflickerPeriod_ = flickerPeriod;\n> +\tLOG(RPiAgc, Debug) << \"setEv \" << ev << \" for channel \" << channelIndex;\n> +\tchannelData_[channelIndex].channel.setEv(ev);\n>  }\n>\n> -void Agc::setMaxShutter(Duration maxShutter)\n> +void Agc::setFlickerPeriod(unsigned int channelIndex, Duration flickerPeriod)\n>  {\n> -\tmaxShutter_ = maxShutter;\n> -}\n> +\tif (checkChannel(channelIndex))\n> +\t\treturn;\n>\n> -void Agc::setFixedShutter(Duration fixedShutter)\n> -{\n> -\tfixedShutter_ = fixedShutter;\n> -\t/* Set this in case someone calls disableAuto() straight after. */\n> -\tstatus_.shutterTime = limitShutter(fixedShutter_);\n> +\tLOG(RPiAgc, Debug) << \"setFlickerPeriod \" << flickerPeriod\n> +\t\t\t   << \" for channel \" << channelIndex;\n> +\tchannelData_[channelIndex].channel.setFlickerPeriod(flickerPeriod);\n>  }\n>\n> -void Agc::setFixedAnalogueGain(double fixedAnalogueGain)\n> -{\n> -\tfixedAnalogueGain_ = fixedAnalogueGain;\n> -\t/* Set this in case someone calls disableAuto() straight after. */\n> -\tstatus_.analogueGain = limitGain(fixedAnalogueGain);\n> -}\n> -\n> -void Agc::setMeteringMode(std::string const &meteringModeName)\n> -{\n> -\tmeteringModeName_ = meteringModeName;\n> -}\n> -\n> -void Agc::setExposureMode(std::string const &exposureModeName)\n> -{\n> -\texposureModeName_ = exposureModeName;\n> -}\n> -\n> -void Agc::setConstraintMode(std::string const &constraintModeName)\n> -{\n> -\tconstraintModeName_ = constraintModeName;\n> -}\n> -\n> -void Agc::switchMode(CameraMode const &cameraMode,\n> -\t\t     Metadata *metadata)\n> +void Agc::setMaxShutter(Duration maxShutter)\n>  {\n> -\t/* AGC expects the mode sensitivity always to be non-zero. */\n> -\tASSERT(cameraMode.sensitivity);\n> -\n> -\thousekeepConfig();\n> -\n> -\t/*\n> -\t * Store the mode in the local state. 
We must cache the sensitivity of\n> -\t * of the previous mode for the calculations below.\n> -\t */\n> -\tdouble lastSensitivity = mode_.sensitivity;\n> -\tmode_ = cameraMode;\n> -\n> -\tDuration fixedShutter = limitShutter(fixedShutter_);\n> -\tif (fixedShutter && fixedAnalogueGain_) {\n> -\t\t/* We're going to reset the algorithm here with these fixed values. */\n> -\n> -\t\tfetchAwbStatus(metadata);\n> -\t\tdouble minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> -\t\tASSERT(minColourGain != 0.0);\n> -\n> -\t\t/* This is the equivalent of computeTargetExposure and applyDigitalGain. */\n> -\t\ttarget_.totalExposureNoDG = fixedShutter_ * fixedAnalogueGain_;\n> -\t\ttarget_.totalExposure = target_.totalExposureNoDG / minColourGain;\n> -\n> -\t\t/* Equivalent of filterExposure. This resets any \"history\". */\n> -\t\tfiltered_ = target_;\n> -\n> -\t\t/* Equivalent of divideUpExposure. */\n> -\t\tfiltered_.shutter = fixedShutter;\n> -\t\tfiltered_.analogueGain = fixedAnalogueGain_;\n> -\t} else if (status_.totalExposureValue) {\n> -\t\t/*\n> -\t\t * On a mode switch, various things could happen:\n> -\t\t * - the exposure profile might change\n> -\t\t * - a fixed exposure or gain might be set\n> -\t\t * - the new mode's sensitivity might be different\n> -\t\t * We cope with the last of these by scaling the target values. After\n> -\t\t * that we just need to re-divide the exposure/gain according to the\n> -\t\t * current exposure profile, which takes care of everything else.\n> -\t\t */\n> -\n> -\t\tdouble ratio = lastSensitivity / cameraMode.sensitivity;\n> -\t\ttarget_.totalExposureNoDG *= ratio;\n> -\t\ttarget_.totalExposure *= ratio;\n> -\t\tfiltered_.totalExposureNoDG *= ratio;\n> -\t\tfiltered_.totalExposure *= ratio;\n> -\n> -\t\tdivideUpExposure();\n> -\t} else {\n> -\t\t/*\n> -\t\t * We come through here on startup, when at least one of the shutter\n> -\t\t * or gain has not been fixed. 
We must still write those values out so\n> -\t\t * that they will be applied immediately. We supply some arbitrary defaults\n> -\t\t * for any that weren't set.\n> -\t\t */\n> -\n> -\t\t/* Equivalent of divideUpExposure. */\n> -\t\tfiltered_.shutter = fixedShutter ? fixedShutter : config_.defaultExposureTime;\n> -\t\tfiltered_.analogueGain = fixedAnalogueGain_ ? fixedAnalogueGain_ : config_.defaultAnalogueGain;\n> -\t}\n> -\n> -\twriteAndFinish(metadata, false);\n> +\t/* Frame durations will be the same across all channels too. */\n> +\tfor (auto &data : channelData_)\n> +\t\tdata.channel.setMaxShutter(maxShutter);\n>  }\n>\n> -void Agc::prepare(Metadata *imageMetadata)\n> +void Agc::setFixedShutter(unsigned int channelIndex, Duration fixedShutter)\n>  {\n> -\tDuration totalExposureValue = status_.totalExposureValue;\n> -\tAgcStatus delayedStatus;\n> -\tAgcPrepareStatus prepareStatus;\n> -\n> -\tif (!imageMetadata->get(\"agc.delayed_status\", delayedStatus))\n> -\t\ttotalExposureValue = delayedStatus.totalExposureValue;\n> -\n> -\tprepareStatus.digitalGain = 1.0;\n> -\tprepareStatus.locked = false;\n> -\n> -\tif (status_.totalExposureValue) {\n> -\t\t/* Process has run, so we have meaningful values. */\n> -\t\tDeviceStatus deviceStatus;\n> -\t\tif (imageMetadata->get(\"device.status\", deviceStatus) == 0) {\n> -\t\t\tDuration actualExposure = deviceStatus.shutterSpeed *\n> -\t\t\t\t\t\t  deviceStatus.analogueGain;\n> -\t\t\tif (actualExposure) {\n> -\t\t\t\tdouble digitalGain = totalExposureValue / actualExposure;\n> -\t\t\t\tLOG(RPiAgc, Debug) << \"Want total exposure \" << totalExposureValue;\n> -\t\t\t\t/*\n> -\t\t\t\t * Never ask for a gain < 1.0, and also impose\n> -\t\t\t\t * some upper limit. 
Make it customisable?\n> -\t\t\t\t */\n> -\t\t\t\tprepareStatus.digitalGain = std::max(1.0, std::min(digitalGain, 4.0));\n> -\t\t\t\tLOG(RPiAgc, Debug) << \"Actual exposure \" << actualExposure;\n> -\t\t\t\tLOG(RPiAgc, Debug) << \"Use digitalGain \" << prepareStatus.digitalGain;\n> -\t\t\t\tLOG(RPiAgc, Debug) << \"Effective exposure \"\n> -\t\t\t\t\t\t   << actualExposure * prepareStatus.digitalGain;\n> -\t\t\t\t/* Decide whether AEC/AGC has converged. */\n> -\t\t\t\tprepareStatus.locked = updateLockStatus(deviceStatus);\n> -\t\t\t}\n> -\t\t} else\n> -\t\t\tLOG(RPiAgc, Warning) << name() << \": no device metadata\";\n> -\t\timageMetadata->set(\"agc.prepare_status\", prepareStatus);\n> -\t}\n> -}\n> +\tif (checkChannel(channelIndex))\n> +\t\treturn;\n>\n> -void Agc::process(StatisticsPtr &stats, Metadata *imageMetadata)\n> -{\n> -\tframeCount_++;\n> -\t/*\n> -\t * First a little bit of housekeeping, fetching up-to-date settings and\n> -\t * configuration, that kind of thing.\n> -\t */\n> -\thousekeepConfig();\n> -\t/* Fetch the AWB status immediately, so that we can assume it's there. */\n> -\tfetchAwbStatus(imageMetadata);\n> -\t/* Get the current exposure values for the frame that's just arrived. */\n> -\tfetchCurrentExposure(imageMetadata);\n> -\t/* Compute the total gain we require relative to the current exposure. */\n> -\tdouble gain, targetY;\n> -\tcomputeGain(stats, imageMetadata, gain, targetY);\n> -\t/* Now compute the target (final) exposure which we think we want. */\n> -\tcomputeTargetExposure(gain);\n> -\t/* The results have to be filtered so as not to change too rapidly. */\n> -\tfilterExposure();\n> -\t/*\n> -\t * Some of the exposure has to be applied as digital gain, so work out\n> -\t * what that is. 
This function also tells us whether it's decided to\n> -\t * \"desaturate\" the image more quickly.\n> -\t */\n> -\tbool desaturate = applyDigitalGain(gain, targetY);\n> -\t/*\n> -\t * The last thing is to divide up the exposure value into a shutter time\n> -\t * and analogue gain, according to the current exposure mode.\n> -\t */\n> -\tdivideUpExposure();\n> -\t/* Finally advertise what we've done. */\n> -\twriteAndFinish(imageMetadata, desaturate);\n> +\tLOG(RPiAgc, Debug) << \"setFixedShutter \" << fixedShutter\n> +\t\t\t   << \" for channel \" << channelIndex;\n> +\tchannelData_[channelIndex].channel.setFixedShutter(fixedShutter);\n>  }\n>\n> -bool Agc::updateLockStatus(DeviceStatus const &deviceStatus)\n> +void Agc::setFixedAnalogueGain(unsigned int channelIndex, double fixedAnalogueGain)\n>  {\n> -\tconst double errorFactor = 0.10; /* make these customisable? */\n> -\tconst int maxLockCount = 5;\n> -\t/* Reset \"lock count\" when we exceed this multiple of errorFactor */\n> -\tconst double resetMargin = 1.5;\n> +\tif (checkChannel(channelIndex))\n> +\t\treturn;\n>\n> -\t/* Add 200us to the exposure time error to allow for line quantisation. */\n> -\tDuration exposureError = lastDeviceStatus_.shutterSpeed * errorFactor + 200us;\n> -\tdouble gainError = lastDeviceStatus_.analogueGain * errorFactor;\n> -\tDuration targetError = lastTargetExposure_ * errorFactor;\n> -\n> -\t/*\n> -\t * Note that we don't know the exposure/gain limits of the sensor, so\n> -\t * the values we keep requesting may be unachievable. 
For this reason\n> -\t * we only insist that we're close to values in the past few frames.\n> -\t */\n> -\tif (deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed - exposureError &&\n> -\t    deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed + exposureError &&\n> -\t    deviceStatus.analogueGain > lastDeviceStatus_.analogueGain - gainError &&\n> -\t    deviceStatus.analogueGain < lastDeviceStatus_.analogueGain + gainError &&\n> -\t    status_.targetExposureValue > lastTargetExposure_ - targetError &&\n> -\t    status_.targetExposureValue < lastTargetExposure_ + targetError)\n> -\t\tlockCount_ = std::min(lockCount_ + 1, maxLockCount);\n> -\telse if (deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed - resetMargin * exposureError ||\n> -\t\t deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed + resetMargin * exposureError ||\n> -\t\t deviceStatus.analogueGain < lastDeviceStatus_.analogueGain - resetMargin * gainError ||\n> -\t\t deviceStatus.analogueGain > lastDeviceStatus_.analogueGain + resetMargin * gainError ||\n> -\t\t status_.targetExposureValue < lastTargetExposure_ - resetMargin * targetError ||\n> -\t\t status_.targetExposureValue > lastTargetExposure_ + resetMargin * targetError)\n> -\t\tlockCount_ = 0;\n> -\n> -\tlastDeviceStatus_ = deviceStatus;\n> -\tlastTargetExposure_ = status_.targetExposureValue;\n> -\n> -\tLOG(RPiAgc, Debug) << \"Lock count updated to \" << lockCount_;\n> -\treturn lockCount_ == maxLockCount;\n> +\tLOG(RPiAgc, Debug) << \"setFixedAnalogueGain \" << fixedAnalogueGain\n> +\t\t\t   << \" for channel \" << channelIndex;\n> +\tchannelData_[channelIndex].channel.setFixedAnalogueGain(fixedAnalogueGain);\n>  }\n>\n> -void Agc::housekeepConfig()\n> +void Agc::setMeteringMode(std::string const &meteringModeName)\n>  {\n> -\t/* First fetch all the up-to-date settings, so no one else has to do it. 
*/\n> -\tstatus_.ev = ev_;\n> -\tstatus_.fixedShutter = limitShutter(fixedShutter_);\n> -\tstatus_.fixedAnalogueGain = fixedAnalogueGain_;\n> -\tstatus_.flickerPeriod = flickerPeriod_;\n> -\tLOG(RPiAgc, Debug) << \"ev \" << status_.ev << \" fixedShutter \"\n> -\t\t\t   << status_.fixedShutter << \" fixedAnalogueGain \"\n> -\t\t\t   << status_.fixedAnalogueGain;\n> -\t/*\n> -\t * Make sure the \"mode\" pointers point to the up-to-date things, if\n> -\t * they've changed.\n> -\t */\n> -\tif (meteringModeName_ != status_.meteringMode) {\n> -\t\tauto it = config_.meteringModes.find(meteringModeName_);\n> -\t\tif (it == config_.meteringModes.end()) {\n> -\t\t\tLOG(RPiAgc, Warning) << \"No metering mode \" << meteringModeName_;\n> -\t\t\tmeteringModeName_ = status_.meteringMode;\n> -\t\t} else {\n> -\t\t\tmeteringMode_ = &it->second;\n> -\t\t\tstatus_.meteringMode = meteringModeName_;\n> -\t\t}\n> -\t}\n> -\tif (exposureModeName_ != status_.exposureMode) {\n> -\t\tauto it = config_.exposureModes.find(exposureModeName_);\n> -\t\tif (it == config_.exposureModes.end()) {\n> -\t\t\tLOG(RPiAgc, Warning) << \"No exposure profile \" << exposureModeName_;\n> -\t\t\texposureModeName_ = status_.exposureMode;\n> -\t\t} else {\n> -\t\t\texposureMode_ = &it->second;\n> -\t\t\tstatus_.exposureMode = exposureModeName_;\n> -\t\t}\n> -\t}\n> -\tif (constraintModeName_ != status_.constraintMode) {\n> -\t\tauto it = config_.constraintModes.find(constraintModeName_);\n> -\t\tif (it == config_.constraintModes.end()) {\n> -\t\t\tLOG(RPiAgc, Warning) << \"No constraint list \" << constraintModeName_;\n> -\t\t\tconstraintModeName_ = status_.constraintMode;\n> -\t\t} else {\n> -\t\t\tconstraintMode_ = &it->second;\n> -\t\t\tstatus_.constraintMode = constraintModeName_;\n> -\t\t}\n> -\t}\n> -\tLOG(RPiAgc, Debug) << \"exposureMode \"\n> -\t\t\t   << exposureModeName_ << \" constraintMode \"\n> -\t\t\t   << constraintModeName_ << \" meteringMode \"\n> -\t\t\t   << meteringModeName_;\n> +\t/* 
Metering modes will be the same across all channels too. */\n> +\tfor (auto &data : channelData_)\n> +\t\tdata.channel.setMeteringMode(meteringModeName);\n>  }\n>\n> -void Agc::fetchCurrentExposure(Metadata *imageMetadata)\n> +void Agc::setExposureMode(unsigned int channelIndex, std::string const &exposureModeName)\n>  {\n> -\tstd::unique_lock<Metadata> lock(*imageMetadata);\n> -\tDeviceStatus *deviceStatus =\n> -\t\timageMetadata->getLocked<DeviceStatus>(\"device.status\");\n> -\tif (!deviceStatus)\n> -\t\tLOG(RPiAgc, Fatal) << \"No device metadata\";\n> -\tcurrent_.shutter = deviceStatus->shutterSpeed;\n> -\tcurrent_.analogueGain = deviceStatus->analogueGain;\n> -\tAgcStatus *agcStatus =\n> -\t\timageMetadata->getLocked<AgcStatus>(\"agc.status\");\n> -\tcurrent_.totalExposure = agcStatus ? agcStatus->totalExposureValue : 0s;\n> -\tcurrent_.totalExposureNoDG = current_.shutter * current_.analogueGain;\n> -}\n> +\tif (checkChannel(channelIndex))\n> +\t\treturn;\n>\n> -void Agc::fetchAwbStatus(Metadata *imageMetadata)\n> -{\n> -\tawb_.gainR = 1.0; /* in case not found in metadata */\n> -\tawb_.gainG = 1.0;\n> -\tawb_.gainB = 1.0;\n> -\tif (imageMetadata->get(\"awb.status\", awb_) != 0)\n> -\t\tLOG(RPiAgc, Debug) << \"No AWB status found\";\n> +\tLOG(RPiAgc, Debug) << \"setExposureMode \" << exposureModeName\n> +\t\t\t   << \" for channel \" << channelIndex;\n> +\tchannelData_[channelIndex].channel.setExposureMode(exposureModeName);\n>  }\n>\n> -static double computeInitialY(StatisticsPtr &stats, AwbStatus const &awb,\n> -\t\t\t      std::vector<double> &weights, double gain)\n> +void Agc::setConstraintMode(unsigned int channelIndex, std::string const &constraintModeName)\n>  {\n> -\tconstexpr uint64_t maxVal = 1 << Statistics::NormalisationFactorPow2;\n> +\tif (checkChannel(channelIndex))\n> +\t\treturn;\n>\n> -\tASSERT(weights.size() == stats->agcRegions.numRegions());\n> -\n> -\t/*\n> -\t * Note that the weights are applied by the IPA to the statistics 
directly,\n> -\t * before they are given to us here.\n> -\t */\n> -\tdouble rSum = 0, gSum = 0, bSum = 0, pixelSum = 0;\n> -\tfor (unsigned int i = 0; i < stats->agcRegions.numRegions(); i++) {\n> -\t\tauto &region = stats->agcRegions.get(i);\n> -\t\trSum += std::min<double>(region.val.rSum * gain, (maxVal - 1) * region.counted);\n> -\t\tgSum += std::min<double>(region.val.gSum * gain, (maxVal - 1) * region.counted);\n> -\t\tbSum += std::min<double>(region.val.bSum * gain, (maxVal - 1) * region.counted);\n> -\t\tpixelSum += region.counted;\n> -\t}\n> -\tif (pixelSum == 0.0) {\n> -\t\tLOG(RPiAgc, Warning) << \"computeInitialY: pixelSum is zero\";\n> -\t\treturn 0;\n> -\t}\n> -\tdouble ySum = rSum * awb.gainR * .299 +\n> -\t\t      gSum * awb.gainG * .587 +\n> -\t\t      bSum * awb.gainB * .114;\n> -\treturn ySum / pixelSum / maxVal;\n> +\tchannelData_[channelIndex].channel.setConstraintMode(constraintModeName);\n>  }\n>\n> -/*\n> - * We handle extra gain through EV by adjusting our Y targets. However, you\n> - * simply can't monitor histograms once they get very close to (or beyond!)\n> - * saturation, so we clamp the Y targets to this value. 
It does mean that EV\n> - * increases don't necessarily do quite what you might expect in certain\n> - * (contrived) cases.\n> - */\n> -\n> -static constexpr double EvGainYTargetLimit = 0.9;\n> -\n> -static double constraintComputeGain(AgcConstraint &c, const Histogram &h, double lux,\n> -\t\t\t\t    double evGain, double &targetY)\n> +template<typename T>\n> +std::ostream &operator<<(std::ostream &os, const std::vector<T> &v)\n>  {\n> -\ttargetY = c.yTarget.eval(c.yTarget.domain().clip(lux));\n> -\ttargetY = std::min(EvGainYTargetLimit, targetY * evGain);\n> -\tdouble iqm = h.interQuantileMean(c.qLo, c.qHi);\n> -\treturn (targetY * h.bins()) / iqm;\n> +\tos << \"{\";\n> +\tfor (const auto &e : v)\n> +\t\tos << \" \" << e;\n> +\tos << \" }\";\n> +\treturn os;\n>  }\n>\n> -void Agc::computeGain(StatisticsPtr &statistics, Metadata *imageMetadata,\n> -\t\t      double &gain, double &targetY)\n> +void Agc::setActiveChannels(const std::vector<unsigned int> &activeChannels)\n>  {\n> -\tstruct LuxStatus lux = {};\n> -\tlux.lux = 400; /* default lux level to 400 in case no metadata found */\n> -\tif (imageMetadata->get(\"lux.status\", lux) != 0)\n> -\t\tLOG(RPiAgc, Warning) << \"No lux level found\";\n> -\tconst Histogram &h = statistics->yHist;\n> -\tdouble evGain = status_.ev * config_.baseEv;\n> -\t/*\n> -\t * The initial gain and target_Y come from some of the regions. 
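For readers less familiar with the histogram maths being moved around here, constraintComputeGain reduces to one division once the target is expressed in the histogram's units: the gain that would move the measured inter-quantile mean onto the (EV-scaled, clamped) target. A standalone sketch with plain doubles standing in for the Histogram API (iqm and bins are stand-ins for interQuantileMean() and bins(), not the patch's code):

```cpp
#include <cassert>

/*
 * Sketch of the constraint gain calculation: given a luminance target
 * in [0, 1] and the measured inter-quantile mean (in histogram bin
 * units), return the gain that would move the mean onto the target.
 * iqm/bins stand in for Histogram::interQuantileMean()/bins().
 */
inline double constraintGain(double targetY, double iqm, double bins)
{
	/* Scale targetY into bin units before comparing with the mean. */
	return (targetY * bins) / iqm;
}
```

So a mean sitting at a quarter of the target brightness asks for a gain of 4, and so on, before the lower/upper bound logic decides whether to adopt it.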
After\n> -\t * that we consider the histogram constraints.\n> -\t */\n> -\ttargetY = config_.yTarget.eval(config_.yTarget.domain().clip(lux.lux));\n> -\ttargetY = std::min(EvGainYTargetLimit, targetY * evGain);\n> -\n> -\t/*\n> -\t * Do this calculation a few times as brightness increase can be\n> -\t * non-linear when there are saturated regions.\n> -\t */\n> -\tgain = 1.0;\n> -\tfor (int i = 0; i < 8; i++) {\n> -\t\tdouble initialY = computeInitialY(statistics, awb_, meteringMode_->weights, gain);\n> -\t\tdouble extraGain = std::min(10.0, targetY / (initialY + .001));\n> -\t\tgain *= extraGain;\n> -\t\tLOG(RPiAgc, Debug) << \"Initial Y \" << initialY << \" target \" << targetY\n> -\t\t\t\t   << \" gives gain \" << gain;\n> -\t\tif (extraGain < 1.01) /* close enough */\n> -\t\t\tbreak;\n> -\t}\n> -\n> -\tfor (auto &c : *constraintMode_) {\n> -\t\tdouble newTargetY;\n> -\t\tdouble newGain = constraintComputeGain(c, h, lux.lux, evGain, newTargetY);\n> -\t\tLOG(RPiAgc, Debug) << \"Constraint has target_Y \"\n> -\t\t\t\t   << newTargetY << \" giving gain \" << newGain;\n> -\t\tif (c.bound == AgcConstraint::Bound::LOWER && newGain > gain) {\n> -\t\t\tLOG(RPiAgc, Debug) << \"Lower bound constraint adopted\";\n> -\t\t\tgain = newGain;\n> -\t\t\ttargetY = newTargetY;\n> -\t\t} else if (c.bound == AgcConstraint::Bound::UPPER && newGain < gain) {\n> -\t\t\tLOG(RPiAgc, Debug) << \"Upper bound constraint adopted\";\n> -\t\t\tgain = newGain;\n> -\t\t\ttargetY = newTargetY;\n> -\t\t}\n> +\tif (activeChannels.empty()) {\n> +\t\tLOG(RPiAgc, Warning) << \"No active AGC channels supplied\";\n> +\t\treturn;\n>  \t}\n> -\tLOG(RPiAgc, Debug) << \"Final gain \" << gain << \" (target_Y \" << targetY << \" ev \"\n> -\t\t\t   << status_.ev << \" base_ev \" << config_.baseEv\n> -\t\t\t   << \")\";\n> -}\n> -\n> -void Agc::computeTargetExposure(double gain)\n> -{\n> -\tif (status_.fixedShutter && status_.fixedAnalogueGain) {\n> -\t\t/*\n> -\t\t * When ag and shutter are both fixed, we need 
to drive the\n> -\t\t * total exposure so that we end up with a digital gain of at least\n> -\t\t * 1/minColourGain. Otherwise we'd desaturate channels causing\n> -\t\t * white to go cyan or magenta.\n> -\t\t */\n> -\t\tdouble minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> -\t\tASSERT(minColourGain != 0.0);\n> -\t\ttarget_.totalExposure =\n> -\t\t\tstatus_.fixedShutter * status_.fixedAnalogueGain / minColourGain;\n> -\t} else {\n> -\t\t/*\n> -\t\t * The statistics reflect the image without digital gain, so the final\n> -\t\t * total exposure we're aiming for is:\n> -\t\t */\n> -\t\ttarget_.totalExposure = current_.totalExposureNoDG * gain;\n> -\t\t/* The final target exposure is also limited to what the exposure mode allows. */\n> -\t\tDuration maxShutter = status_.fixedShutter\n> -\t\t\t\t\t      ? status_.fixedShutter\n> -\t\t\t\t\t      : exposureMode_->shutter.back();\n> -\t\tmaxShutter = limitShutter(maxShutter);\n> -\t\tDuration maxTotalExposure =\n> -\t\t\tmaxShutter *\n> -\t\t\t(status_.fixedAnalogueGain != 0.0\n> -\t\t\t\t ? 
status_.fixedAnalogueGain\n> -\t\t\t\t : exposureMode_->gain.back());\n> -\t\ttarget_.totalExposure = std::min(target_.totalExposure, maxTotalExposure);\n> -\t}\n> -\tLOG(RPiAgc, Debug) << \"Target totalExposure \" << target_.totalExposure;\n> -}\n>\n> -bool Agc::applyDigitalGain(double gain, double targetY)\n> -{\n> -\tdouble minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> -\tASSERT(minColourGain != 0.0);\n> -\tdouble dg = 1.0 / minColourGain;\n> -\t/*\n> -\t * I think this pipeline subtracts black level and rescales before we\n> -\t * get the stats, so no need to worry about it.\n> -\t */\n> -\tLOG(RPiAgc, Debug) << \"after AWB, target dg \" << dg << \" gain \" << gain\n> -\t\t\t   << \" target_Y \" << targetY;\n> -\t/*\n> -\t * Finally, if we're trying to reduce exposure but the target_Y is\n> -\t * \"close\" to 1.0, then the gain computed for that constraint will be\n> -\t * only slightly less than one, because the measured Y can never be\n> -\t * larger than 1.0. When this happens, demand a large digital gain so\n> -\t * that the exposure can be reduced, de-saturating the image much more\n> -\t * quickly (and we then approach the correct value more quickly from\n> -\t * below).\n> -\t */\n> -\tbool desaturate = targetY > config_.fastReduceThreshold &&\n> -\t\t\t  gain < sqrt(targetY);\n> -\tif (desaturate)\n> -\t\tdg /= config_.fastReduceThreshold;\n> -\tLOG(RPiAgc, Debug) << \"Digital gain \" << dg << \" desaturate? 
\" << desaturate;\n> -\tfiltered_.totalExposureNoDG = filtered_.totalExposure / dg;\n> -\tLOG(RPiAgc, Debug) << \"Target totalExposureNoDG \" << filtered_.totalExposureNoDG;\n> -\treturn desaturate;\n> -}\n> -\n> -void Agc::filterExposure()\n> -{\n> -\tdouble speed = config_.speed;\n> -\t/*\n> -\t * AGC adapts instantly if both shutter and gain are directly specified\n> -\t * or we're in the startup phase.\n> -\t */\n> -\tif ((status_.fixedShutter && status_.fixedAnalogueGain) ||\n> -\t    frameCount_ <= config_.startupFrames)\n> -\t\tspeed = 1.0;\n> -\tif (!filtered_.totalExposure) {\n> -\t\tfiltered_.totalExposure = target_.totalExposure;\n> -\t} else {\n> -\t\t/*\n> -\t\t * If close to the result go faster, to save making so many\n> -\t\t * micro-adjustments on the way. (Make this customisable?)\n> -\t\t */\n> -\t\tif (filtered_.totalExposure < 1.2 * target_.totalExposure &&\n> -\t\t    filtered_.totalExposure > 0.8 * target_.totalExposure)\n> -\t\t\tspeed = sqrt(speed);\n> -\t\tfiltered_.totalExposure = speed * target_.totalExposure +\n> -\t\t\t\t\t  filtered_.totalExposure * (1.0 - speed);\n> -\t}\n> -\tLOG(RPiAgc, Debug) << \"After filtering, totalExposure \" << filtered_.totalExposure\n> -\t\t\t   << \" no dg \" << filtered_.totalExposureNoDG;\n> -}\n> +\tfor (auto index : activeChannels)\n> +\t\tif (checkChannel(index))\n> +\t\t\treturn;\n>\n> -void Agc::divideUpExposure()\n> -{\n> -\t/*\n> -\t * Sending the fixed shutter/gain cases through the same code may seem\n> -\t * unnecessary, but it will make more sense when extend this to cover\n> -\t * variable aperture.\n> -\t */\n> -\tDuration exposureValue = filtered_.totalExposureNoDG;\n> -\tDuration shutterTime;\n> -\tdouble analogueGain;\n> -\tshutterTime = status_.fixedShutter ? status_.fixedShutter\n> -\t\t\t\t\t   : exposureMode_->shutter[0];\n> -\tshutterTime = limitShutter(shutterTime);\n> -\tanalogueGain = status_.fixedAnalogueGain != 0.0 ? 
status_.fixedAnalogueGain\n> -\t\t\t\t\t\t\t: exposureMode_->gain[0];\n> -\tanalogueGain = limitGain(analogueGain);\n> -\tif (shutterTime * analogueGain < exposureValue) {\n> -\t\tfor (unsigned int stage = 1;\n> -\t\t     stage < exposureMode_->gain.size(); stage++) {\n> -\t\t\tif (!status_.fixedShutter) {\n> -\t\t\t\tDuration stageShutter =\n> -\t\t\t\t\tlimitShutter(exposureMode_->shutter[stage]);\n> -\t\t\t\tif (stageShutter * analogueGain >= exposureValue) {\n> -\t\t\t\t\tshutterTime = exposureValue / analogueGain;\n> -\t\t\t\t\tbreak;\n> -\t\t\t\t}\n> -\t\t\t\tshutterTime = stageShutter;\n> -\t\t\t}\n> -\t\t\tif (status_.fixedAnalogueGain == 0.0) {\n> -\t\t\t\tif (exposureMode_->gain[stage] * shutterTime >= exposureValue) {\n> -\t\t\t\t\tanalogueGain = exposureValue / shutterTime;\n> -\t\t\t\t\tbreak;\n> -\t\t\t\t}\n> -\t\t\t\tanalogueGain = exposureMode_->gain[stage];\n> -\t\t\t\tanalogueGain = limitGain(analogueGain);\n> -\t\t\t}\n> -\t\t}\n> -\t}\n> -\tLOG(RPiAgc, Debug) << \"Divided up shutter and gain are \" << shutterTime << \" and \"\n> -\t\t\t   << analogueGain;\n> -\t/*\n> -\t * Finally adjust shutter time for flicker avoidance (require both\n> -\t * shutter and gain not to be fixed).\n> -\t */\n> -\tif (!status_.fixedShutter && !status_.fixedAnalogueGain &&\n> -\t    status_.flickerPeriod) {\n> -\t\tint flickerPeriods = shutterTime / status_.flickerPeriod;\n> -\t\tif (flickerPeriods) {\n> -\t\t\tDuration newShutterTime = flickerPeriods * status_.flickerPeriod;\n> -\t\t\tanalogueGain *= shutterTime / newShutterTime;\n> -\t\t\t/*\n> -\t\t\t * We should still not allow the ag to go over the\n> -\t\t\t * largest value in the exposure mode. 
Note that this\n> -\t\t\t * may force more of the total exposure into the digital\n> -\t\t\t * gain as a side-effect.\n> -\t\t\t */\n> -\t\t\tanalogueGain = std::min(analogueGain, exposureMode_->gain.back());\n> -\t\t\tanalogueGain = limitGain(analogueGain);\n> -\t\t\tshutterTime = newShutterTime;\n> -\t\t}\n> -\t\tLOG(RPiAgc, Debug) << \"After flicker avoidance, shutter \"\n> -\t\t\t\t   << shutterTime << \" gain \" << analogueGain;\n> -\t}\n> -\tfiltered_.shutter = shutterTime;\n> -\tfiltered_.analogueGain = analogueGain;\n> +\tLOG(RPiAgc, Debug) << \"setActiveChannels \" << activeChannels;\n> +\tactiveChannels_ = activeChannels;\n>  }\n>\n> -void Agc::writeAndFinish(Metadata *imageMetadata, bool desaturate)\n> +void Agc::switchMode(CameraMode const &cameraMode,\n> +\t\t     Metadata *metadata)\n>  {\n> -\tstatus_.totalExposureValue = filtered_.totalExposure;\n> -\tstatus_.targetExposureValue = desaturate ? 0s : target_.totalExposureNoDG;\n> -\tstatus_.shutterTime = filtered_.shutter;\n> -\tstatus_.analogueGain = filtered_.analogueGain;\n> -\t/*\n> -\t * Write to metadata as well, in case anyone wants to update the camera\n> -\t * immediately.\n> -\t */\n> -\timageMetadata->set(\"agc.status\", status_);\n> -\tLOG(RPiAgc, Debug) << \"Output written, total exposure requested is \"\n> -\t\t\t   << filtered_.totalExposure;\n> -\tLOG(RPiAgc, Debug) << \"Camera exposure update: shutter time \" << filtered_.shutter\n> -\t\t\t   << \" analogue gain \" << filtered_.analogueGain;\n> +\tLOG(RPiAgc, Debug) << \"switchMode for channel 0\";\n> +\tchannelData_[0].channel.switchMode(cameraMode, metadata);\n>  }\n>\n> -Duration Agc::limitShutter(Duration shutter)\n> +void Agc::prepare(Metadata *imageMetadata)\n>  {\n> -\t/*\n> -\t * shutter == 0 is a special case for fixed shutter values, and must pass\n> -\t * through unchanged\n> -\t */\n> -\tif (!shutter)\n> -\t\treturn shutter;\n> -\n> -\tshutter = std::clamp(shutter, mode_.minShutter, maxShutter_);\n> -\treturn shutter;\n> 
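On the flicker-avoidance step quoted above, the arithmetic is simple but easy to misread: the shutter is rounded down to a whole number of flicker periods and the analogue gain is scaled up so the shutter * gain product is preserved. A standalone sketch with plain doubles in place of Durations (invented names; the patch additionally re-clamps the boosted gain to the exposure mode's maximum, omitted here):

```cpp
#include <cassert>

/*
 * Sketch of the flicker-avoidance adjustment: quantise the shutter
 * down to an integer number of flicker periods and compensate with
 * analogue gain so total exposure stays constant. Plain doubles
 * (microseconds) stand in for libcamera Durations.
 */
struct ShutterGain {
	double shutter;
	double gain;
};

inline ShutterGain avoidFlicker(double shutter, double gain, double flickerPeriod)
{
	int periods = static_cast<int>(shutter / flickerPeriod);
	if (periods) {
		double newShutter = periods * flickerPeriod;
		gain *= shutter / newShutter; /* preserve shutter * gain */
		shutter = newShutter;
	}
	return { shutter, gain };
}
```

Shutters shorter than one flicker period pass through unchanged, which is why the `if (flickerPeriods)` guard matters.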
+\tLOG(RPiAgc, Debug) << \"prepare for channel 0\";\n> +\tchannelData_[0].channel.prepare(imageMetadata);\n>  }\n>\n> -double Agc::limitGain(double gain) const\n> +void Agc::process(StatisticsPtr &stats, Metadata *imageMetadata)\n>  {\n> -\t/*\n> -\t * Only limit the lower bounds of the gain value to what the sensor limits.\n> -\t * The upper bound on analogue gain will be made up with additional digital\n> -\t * gain applied by the ISP.\n> -\t *\n> -\t * gain == 0.0 is a special case for fixed shutter values, and must pass\n> -\t * through unchanged\n> -\t */\n> -\tif (!gain)\n> -\t\treturn gain;\n> -\n> -\tgain = std::max(gain, mode_.minAnalogueGain);\n> -\treturn gain;\n> +\tLOG(RPiAgc, Debug) << \"process for channel 0\";\n> +\tchannelData_[0].channel.process(stats, imageMetadata);\n>  }\n>\n>  /* Register algorithm with the system. */\n> diff --git a/src/ipa/rpi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h\n> index aaf77c8f..24f0a271 100644\n> --- a/src/ipa/rpi/controller/rpi/agc.h\n> +++ b/src/ipa/rpi/controller/rpi/agc.h\n> @@ -6,60 +6,18 @@\n>   */\n>  #pragma once\n>\n> +#include <optional>\n\nIs this used ?\n\n> +#include <string>\n>  #include <vector>\n> -#include <mutex>\n> -\n> -#include <libcamera/base/utils.h>\n>\n>  #include \"../agc_algorithm.h\"\n> -#include \"../agc_status.h\"\n> -#include \"../pwl.h\"\n>\n> -/* This is our implementation of AGC. 
*/\n> +#include \"agc_channel.h\"\n>\n>  namespace RPiController {\n>\n> -struct AgcMeteringMode {\n> -\tstd::vector<double> weights;\n> -\tint read(const libcamera::YamlObject &params);\n> -};\n> -\n> -struct AgcExposureMode {\n> -\tstd::vector<libcamera::utils::Duration> shutter;\n> -\tstd::vector<double> gain;\n> -\tint read(const libcamera::YamlObject &params);\n> -};\n> -\n> -struct AgcConstraint {\n> -\tenum class Bound { LOWER = 0, UPPER = 1 };\n> -\tBound bound;\n> -\tdouble qLo;\n> -\tdouble qHi;\n> -\tPwl yTarget;\n> -\tint read(const libcamera::YamlObject &params);\n> -};\n> -\n> -typedef std::vector<AgcConstraint> AgcConstraintMode;\n> -\n> -struct AgcConfig {\n> -\tint read(const libcamera::YamlObject &params);\n> -\tstd::map<std::string, AgcMeteringMode> meteringModes;\n> -\tstd::map<std::string, AgcExposureMode> exposureModes;\n> -\tstd::map<std::string, AgcConstraintMode> constraintModes;\n> -\tPwl yTarget;\n> -\tdouble speed;\n> -\tuint16_t startupFrames;\n> -\tunsigned int convergenceFrames;\n> -\tdouble maxChange;\n> -\tdouble minChange;\n> -\tdouble fastReduceThreshold;\n> -\tdouble speedUpThreshold;\n> -\tstd::string defaultMeteringMode;\n> -\tstd::string defaultExposureMode;\n> -\tstd::string defaultConstraintMode;\n> -\tdouble baseEv;\n> -\tlibcamera::utils::Duration defaultExposureTime;\n> -\tdouble defaultAnalogueGain;\n> +struct AgcChannelData {\n> +\tAgcChannel channel;\n>  };\n>\n>  class Agc : public AgcAlgorithm\n> @@ -70,65 +28,30 @@ public:\n>  \tint read(const libcamera::YamlObject &params) override;\n>  \tunsigned int getConvergenceFrames() const override;\n>  \tstd::vector<double> const &getWeights() const override;\n> -\tvoid setEv(double ev) override;\n> -\tvoid setFlickerPeriod(libcamera::utils::Duration flickerPeriod) override;\n> +\tvoid setEv(unsigned int channel, double ev) override;\n> +\tvoid setFlickerPeriod(unsigned int channelIndex,\n> +\t\t\t      libcamera::utils::Duration flickerPeriod) override;\n>  \tvoid 
setMaxShutter(libcamera::utils::Duration maxShutter) override;\n> -\tvoid setFixedShutter(libcamera::utils::Duration fixedShutter) override;\n> -\tvoid setFixedAnalogueGain(double fixedAnalogueGain) override;\n> +\tvoid setFixedShutter(unsigned int channelIndex,\n> +\t\t\t     libcamera::utils::Duration fixedShutter) override;\n> +\tvoid setFixedAnalogueGain(unsigned int channelIndex,\n> +\t\t\t\t  double fixedAnalogueGain) override;\n>  \tvoid setMeteringMode(std::string const &meteringModeName) override;\n> -\tvoid setExposureMode(std::string const &exposureModeName) override;\n> -\tvoid setConstraintMode(std::string const &contraintModeName) override;\n> -\tvoid enableAuto() override;\n> -\tvoid disableAuto() override;\n> +\tvoid setExposureMode(unsigned int channelIndex,\n> +\t\t\t     std::string const &exposureModeName) override;\n> +\tvoid setConstraintMode(unsigned int channelIndex,\n> +\t\t\t       std::string const &contraintModeName) override;\n> +\tvoid enableAuto(unsigned int channelIndex) override;\n> +\tvoid disableAuto(unsigned int channelIndex) override;\n>  \tvoid switchMode(CameraMode const &cameraMode, Metadata *metadata) override;\n>  \tvoid prepare(Metadata *imageMetadata) override;\n>  \tvoid process(StatisticsPtr &stats, Metadata *imageMetadata) override;\n> +\tvoid setActiveChannels(const std::vector<unsigned int> &activeChannels) override;\n>\n>  private:\n> -\tbool updateLockStatus(DeviceStatus const &deviceStatus);\n> -\tAgcConfig config_;\n> -\tvoid housekeepConfig();\n> -\tvoid fetchCurrentExposure(Metadata *imageMetadata);\n> -\tvoid fetchAwbStatus(Metadata *imageMetadata);\n> -\tvoid computeGain(StatisticsPtr &statistics, Metadata *imageMetadata,\n> -\t\t\t double &gain, double &targetY);\n> -\tvoid computeTargetExposure(double gain);\n> -\tvoid filterExposure();\n> -\tbool applyDigitalGain(double gain, double targetY);\n> -\tvoid divideUpExposure();\n> -\tvoid writeAndFinish(Metadata *imageMetadata, bool desaturate);\n> 
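The shape of the new per-channel API above may deserve a tiny illustration: every channel-indexed entry point first validates the index via checkChannel() (which returns true on error, so the call is dropped), then forwards to the corresponding AgcChannel. A reduced hypothetical sketch, not the patch's actual classes:

```cpp
#include <cstdio>
#include <vector>

/*
 * Reduced sketch of the wrapper dispatch pattern: checkChannel()
 * rejects out-of-range indices, and valid calls forward to the
 * per-channel object. Channel is a stand-in for AgcChannel.
 */
struct Channel {
	double ev = 1.0;
	void setEv(double v) { ev = v; }
};

struct AgcWrapper {
	std::vector<Channel> channels = std::vector<Channel>(3);

	bool checkChannel(unsigned int index) const
	{
		if (index < channels.size())
			return false;
		fprintf(stderr, "AGC channel %u out of range\n", index);
		return true; /* true means "ignore this call" */
	}

	void setEv(unsigned int index, double ev)
	{
		if (checkChannel(index))
			return;
		channels[index].setEv(ev);
	}
};
```

Keeping the guard in one place means each forwarding setter stays a two-liner, which is presumably why the patch structures them all identically.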
-\tlibcamera::utils::Duration limitShutter(libcamera::utils::Duration shutter);\n> -\tdouble limitGain(double gain) const;\n> -\tAgcMeteringMode *meteringMode_;\n> -\tAgcExposureMode *exposureMode_;\n> -\tAgcConstraintMode *constraintMode_;\n> -\tCameraMode mode_;\n> -\tuint64_t frameCount_;\n> -\tAwbStatus awb_;\n> -\tstruct ExposureValues {\n> -\t\tExposureValues();\n> -\n> -\t\tlibcamera::utils::Duration shutter;\n> -\t\tdouble analogueGain;\n> -\t\tlibcamera::utils::Duration totalExposure;\n> -\t\tlibcamera::utils::Duration totalExposureNoDG; /* without digital gain */\n> -\t};\n> -\tExposureValues current_;  /* values for the current frame */\n> -\tExposureValues target_;   /* calculate the values we want here */\n> -\tExposureValues filtered_; /* these values are filtered towards target */\n> -\tAgcStatus status_;\n> -\tint lockCount_;\n> -\tDeviceStatus lastDeviceStatus_;\n> -\tlibcamera::utils::Duration lastTargetExposure_;\n> -\t/* Below here the \"settings\" that applications can change. 
*/\n> -\tstd::string meteringModeName_;\n> -\tstd::string exposureModeName_;\n> -\tstd::string constraintModeName_;\n> -\tdouble ev_;\n> -\tlibcamera::utils::Duration flickerPeriod_;\n> -\tlibcamera::utils::Duration maxShutter_;\n> -\tlibcamera::utils::Duration fixedShutter_;\n> -\tdouble fixedAnalogueGain_;\n> +\tint checkChannel(unsigned int channel) const;\n> +\tstd::vector<AgcChannelData> channelData_;\n> +\tstd::vector<unsigned int> activeChannels_;\n>  };\n>\n>  } /* namespace RPiController */\n> diff --git a/src/ipa/rpi/controller/rpi/agc_channel.cpp b/src/ipa/rpi/controller/rpi/agc_channel.cpp\n> new file mode 100644\n> index 00000000..7c1aba81\n> --- /dev/null\n> +++ b/src/ipa/rpi/controller/rpi/agc_channel.cpp\n> @@ -0,0 +1,924 @@\n> +/* SPDX-License-Identifier: BSD-2-Clause */\n> +/*\n> + * Copyright (C) 2023, Raspberry Pi Ltd\n> + *\n> + * agc.cpp - AGC/AEC control algorithm\n\nagc_channel.cpp\n\n> + */\n> +\n> +#include \"agc_channel.h\"\n> +\n> +#include <algorithm>\n> +#include <tuple>\n> +\n> +#include <libcamera/base/log.h>\n> +\n> +#include \"../awb_status.h\"\n> +#include \"../device_status.h\"\n> +#include \"../histogram.h\"\n> +#include \"../lux_status.h\"\n> +#include \"../metadata.h\"\n> +\n> +using namespace RPiController;\n> +using namespace libcamera;\n> +using libcamera::utils::Duration;\n> +using namespace std::literals::chrono_literals;\n> +\n> +LOG_DECLARE_CATEGORY(RPiAgc)\n> +\n> +int AgcMeteringMode::read(const libcamera::YamlObject &params)\n> +{\n> +\tconst YamlObject &yamlWeights = params[\"weights\"];\n> +\n> +\tfor (const auto &p : yamlWeights.asList()) {\n> +\t\tauto value = p.get<double>();\n> +\t\tif (!value)\n> +\t\t\treturn -EINVAL;\n> +\t\tweights.push_back(*value);\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +static std::tuple<int, std::string>\n> +readMeteringModes(std::map<std::string, AgcMeteringMode> &metering_modes,\n> +\t\t  const libcamera::YamlObject &params)\n> +{\n> +\tstd::string first;\n> +\tint ret;\n> +\n> 
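One note on the readMeteringModes() shape below, since the same pattern repeats for the exposure and constraint mode readers: the parser returns both an error code and the first key it encountered, and that first key later becomes the default mode name in AgcConfig::read(). A simplified sketch, with a plain vector of pairs standing in for YamlObject::asDict() (which yields entries in tuning-file order):

```cpp
#include <cassert>
#include <map>
#include <string>
#include <tuple>
#include <utility>
#include <vector>

/*
 * Sketch of the read*Modes() pattern: parse named modes in file
 * order, remembering the first key so the caller can use it as the
 * default. The vector of pairs stands in for YamlObject::asDict().
 */
static std::tuple<int, std::string>
readModes(std::map<std::string, int> &modes,
	  const std::vector<std::pair<std::string, int>> &params)
{
	std::string first;

	for (const auto &[key, value] : params) {
		modes[key] = value; /* a real parser would check errors here */
		if (first.empty())
			first = key;
	}

	return { 0, first };
}
```

The upshot is that whichever mode is listed first in the tuning file wins as the default, with no separate "default" key needed.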
+\tfor (const auto &[key, value] : params.asDict()) {\n> +\t\tAgcMeteringMode meteringMode;\n> +\t\tret = meteringMode.read(value);\n> +\t\tif (ret)\n> +\t\t\treturn { ret, {} };\n> +\n> +\t\tmetering_modes[key] = std::move(meteringMode);\n> +\t\tif (first.empty())\n> +\t\t\tfirst = key;\n> +\t}\n> +\n> +\treturn { 0, first };\n> +}\n> +\n> +int AgcExposureMode::read(const libcamera::YamlObject &params)\n> +{\n> +\tauto value = params[\"shutter\"].getList<double>();\n> +\tif (!value)\n> +\t\treturn -EINVAL;\n> +\tstd::transform(value->begin(), value->end(), std::back_inserter(shutter),\n> +\t\t       [](double v) { return v * 1us; });\n> +\n> +\tvalue = params[\"gain\"].getList<double>();\n> +\tif (!value)\n> +\t\treturn -EINVAL;\n> +\tgain = std::move(*value);\n> +\n> +\tif (shutter.size() < 2 || gain.size() < 2) {\n> +\t\tLOG(RPiAgc, Error)\n> +\t\t\t<< \"AgcExposureMode: must have at least two entries in exposure profile\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tif (shutter.size() != gain.size()) {\n> +\t\tLOG(RPiAgc, Error)\n> +\t\t\t<< \"AgcExposureMode: expect same number of exposure and gain entries in exposure profile\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +static std::tuple<int, std::string>\n> +readExposureModes(std::map<std::string, AgcExposureMode> &exposureModes,\n> +\t\t  const libcamera::YamlObject &params)\n> +{\n> +\tstd::string first;\n> +\tint ret;\n> +\n> +\tfor (const auto &[key, value] : params.asDict()) {\n> +\t\tAgcExposureMode exposureMode;\n> +\t\tret = exposureMode.read(value);\n> +\t\tif (ret)\n> +\t\t\treturn { ret, {} };\n> +\n> +\t\texposureModes[key] = std::move(exposureMode);\n> +\t\tif (first.empty())\n> +\t\t\tfirst = key;\n> +\t}\n> +\n> +\treturn { 0, first };\n> +}\n> +\n> +int AgcConstraint::read(const libcamera::YamlObject &params)\n> +{\n> +\tstd::string boundString = params[\"bound\"].get<std::string>(\"\");\n> +\ttransform(boundString.begin(), boundString.end(),\n> +\t\t  
boundString.begin(), ::toupper);\n> +\tif (boundString != \"UPPER\" && boundString != \"LOWER\") {\n> +\t\tLOG(RPiAgc, Error) << \"AGC constraint type should be UPPER or LOWER\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\tbound = boundString == \"UPPER\" ? Bound::UPPER : Bound::LOWER;\n> +\n> +\tauto value = params[\"q_lo\"].get<double>();\n> +\tif (!value)\n> +\t\treturn -EINVAL;\n> +\tqLo = *value;\n> +\n> +\tvalue = params[\"q_hi\"].get<double>();\n> +\tif (!value)\n> +\t\treturn -EINVAL;\n> +\tqHi = *value;\n> +\n> +\treturn yTarget.read(params[\"y_target\"]);\n> +}\n> +\n> +static std::tuple<int, AgcConstraintMode>\n> +readConstraintMode(const libcamera::YamlObject &params)\n> +{\n> +\tAgcConstraintMode mode;\n> +\tint ret;\n> +\n> +\tfor (const auto &p : params.asList()) {\n> +\t\tAgcConstraint constraint;\n> +\t\tret = constraint.read(p);\n> +\t\tif (ret)\n> +\t\t\treturn { ret, {} };\n> +\n> +\t\tmode.push_back(std::move(constraint));\n> +\t}\n> +\n> +\treturn { 0, mode };\n> +}\n> +\n> +static std::tuple<int, std::string>\n> +readConstraintModes(std::map<std::string, AgcConstraintMode> &constraintModes,\n> +\t\t    const libcamera::YamlObject &params)\n> +{\n> +\tstd::string first;\n> +\tint ret;\n> +\n> +\tfor (const auto &[key, value] : params.asDict()) {\n> +\t\tstd::tie(ret, constraintModes[key]) = readConstraintMode(value);\n> +\t\tif (ret)\n> +\t\t\treturn { ret, {} };\n> +\n> +\t\tif (first.empty())\n> +\t\t\tfirst = key;\n> +\t}\n> +\n> +\treturn { 0, first };\n> +}\n> +\n> +int AgcConfig::read(const libcamera::YamlObject &params)\n> +{\n> +\tLOG(RPiAgc, Debug) << \"AgcConfig\";\n> +\tint ret;\n> +\n> +\tstd::tie(ret, defaultMeteringMode) =\n> +\t\treadMeteringModes(meteringModes, params[\"metering_modes\"]);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\tstd::tie(ret, defaultExposureMode) =\n> +\t\treadExposureModes(exposureModes, params[\"exposure_modes\"]);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\tstd::tie(ret, defaultConstraintMode) =\n> 
+\t\treadConstraintModes(constraintModes, params[\"constraint_modes\"]);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tret = yTarget.read(params[\"y_target\"]);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tspeed = params[\"speed\"].get<double>(0.2);\n> +\tstartupFrames = params[\"startup_frames\"].get<uint16_t>(10);\n> +\tconvergenceFrames = params[\"convergence_frames\"].get<unsigned int>(6);\n> +\tfastReduceThreshold = params[\"fast_reduce_threshold\"].get<double>(0.4);\n> +\tbaseEv = params[\"base_ev\"].get<double>(1.0);\n> +\n> +\t/* Start with quite a low value as ramping up is easier than ramping down. */\n> +\tdefaultExposureTime = params[\"default_exposure_time\"].get<double>(1000) * 1us;\n> +\tdefaultAnalogueGain = params[\"default_analogue_gain\"].get<double>(1.0);\n> +\n> +\treturn 0;\n> +}\n> +\n> +AgcChannel::ExposureValues::ExposureValues()\n> +\t: shutter(0s), analogueGain(0),\n> +\t  totalExposure(0s), totalExposureNoDG(0s)\n> +{\n> +}\n> +\n> +AgcChannel::AgcChannel()\n> +\t: meteringMode_(nullptr), exposureMode_(nullptr), constraintMode_(nullptr),\n> +\t  frameCount_(0), lockCount_(0),\n> +\t  lastTargetExposure_(0s), ev_(1.0), flickerPeriod_(0s),\n> +\t  maxShutter_(0s), fixedShutter_(0s), fixedAnalogueGain_(0.0)\n> +{\n> +\tmemset(&awb_, 0, sizeof(awb_));\n> +\t/*\n> +\t * Setting status_.totalExposureValue_ to zero initially tells us\n> +\t * it's not been calculated yet (i.e. 
Process hasn't yet run).\n> +\t */\n> +\tstatus_ = {};\n> +\tstatus_.ev = ev_;\n> +}\n> +\n> +int AgcChannel::read(const libcamera::YamlObject &params,\n> +\t\t     const Controller::HardwareConfig &hardwareConfig)\n> +{\n> +\tint ret = config_.read(params);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tconst Size &size = hardwareConfig.agcZoneWeights;\n> +\tfor (auto const &modes : config_.meteringModes) {\n> +\t\tif (modes.second.weights.size() != size.width * size.height) {\n> +\t\t\tLOG(RPiAgc, Error) << \"AgcMeteringMode: Incorrect number of weights\";\n> +\t\t\treturn -EINVAL;\n> +\t\t}\n> +\t}\n> +\n> +\t/*\n> +\t * Set the config's defaults (which are the first ones it read) as our\n> +\t * current modes, until someone changes them.  (they're all known to\n> +\t * exist at this point)\n> +\t */\n> +\tmeteringModeName_ = config_.defaultMeteringMode;\n> +\tmeteringMode_ = &config_.meteringModes[meteringModeName_];\n> +\texposureModeName_ = config_.defaultExposureMode;\n> +\texposureMode_ = &config_.exposureModes[exposureModeName_];\n> +\tconstraintModeName_ = config_.defaultConstraintMode;\n> +\tconstraintMode_ = &config_.constraintModes[constraintModeName_];\n> +\t/* Set up the \"last shutter/gain\" values, in case AGC starts \"disabled\". 
*/\n> +\tstatus_.shutterTime = config_.defaultExposureTime;\n> +\tstatus_.analogueGain = config_.defaultAnalogueGain;\n> +\treturn 0;\n> +}\n> +\n> +void AgcChannel::disableAuto()\n> +{\n> +\tfixedShutter_ = status_.shutterTime;\n> +\tfixedAnalogueGain_ = status_.analogueGain;\n> +}\n> +\n> +void AgcChannel::enableAuto()\n> +{\n> +\tfixedShutter_ = 0s;\n> +\tfixedAnalogueGain_ = 0;\n> +}\n> +\n> +unsigned int AgcChannel::getConvergenceFrames() const\n> +{\n> +\t/*\n> +\t * If shutter and gain have been explicitly set, there is no\n> +\t * convergence to happen, so no need to drop any frames - return zero.\n> +\t */\n> +\tif (fixedShutter_ && fixedAnalogueGain_)\n> +\t\treturn 0;\n> +\telse\n> +\t\treturn config_.convergenceFrames;\n> +}\n> +\n> +std::vector<double> const &AgcChannel::getWeights() const\n> +{\n> +\t/*\n> +\t * In case someone calls setMeteringMode and then this before the\n> +\t * algorithm has run and updated the meteringMode_ pointer.\n> +\t */\n> +\tauto it = config_.meteringModes.find(meteringModeName_);\n> +\tif (it == config_.meteringModes.end())\n> +\t\treturn meteringMode_->weights;\n> +\treturn it->second.weights;\n> +}\n> +\n> +void AgcChannel::setEv(double ev)\n> +{\n> +\tev_ = ev;\n> +}\n> +\n> +void AgcChannel::setFlickerPeriod(Duration flickerPeriod)\n> +{\n> +\tflickerPeriod_ = flickerPeriod;\n> +}\n> +\n> +void AgcChannel::setMaxShutter(Duration maxShutter)\n> +{\n> +\tmaxShutter_ = maxShutter;\n> +}\n> +\n> +void AgcChannel::setFixedShutter(Duration fixedShutter)\n> +{\n> +\tfixedShutter_ = fixedShutter;\n> +\t/* Set this in case someone calls disableAuto() straight after. */\n> +\tstatus_.shutterTime = limitShutter(fixedShutter_);\n> +}\n> +\n> +void AgcChannel::setFixedAnalogueGain(double fixedAnalogueGain)\n> +{\n> +\tfixedAnalogueGain_ = fixedAnalogueGain;\n> +\t/* Set this in case someone calls disableAuto() straight after. 
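A subtlety in the enableAuto()/disableAuto() pair above that is easy to miss: disabling auto doesn't stop the loop, it latches the last computed shutter and gain as the new fixed values, and re-enabling just clears the zero-valued "not fixed" sentinels. A minimal sketch of that latching with plain doubles and invented names:

```cpp
#include <cassert>

/*
 * Sketch of the disableAuto()/enableAuto() latching: 0 is the
 * sentinel for "not fixed". Disabling copies the last AGC result
 * into the fixed fields so the current exposure is held; enabling
 * clears them again. Values here are arbitrary placeholders.
 */
struct ChannelLatch {
	double fixedShutter = 0.0;
	double fixedGain = 0.0;
	double statusShutter = 1000.0; /* last values the loop computed */
	double statusGain = 2.0;

	void disableAuto()
	{
		fixedShutter = statusShutter;
		fixedGain = statusGain;
	}

	void enableAuto()
	{
		fixedShutter = 0.0;
		fixedGain = 0.0;
	}

	bool isAuto() const
	{
		return fixedShutter == 0.0 || fixedGain == 0.0;
	}
};
```

This is also why setFixedShutter()/setFixedAnalogueGain() update status_ immediately: a disableAuto() issued straight afterwards must latch the values just set, not stale ones.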
*/\n> +\tstatus_.analogueGain = limitGain(fixedAnalogueGain);\n> +}\n> +\n> +void AgcChannel::setMeteringMode(std::string const &meteringModeName)\n> +{\n> +\tmeteringModeName_ = meteringModeName;\n> +}\n> +\n> +void AgcChannel::setExposureMode(std::string const &exposureModeName)\n> +{\n> +\texposureModeName_ = exposureModeName;\n> +}\n> +\n> +void AgcChannel::setConstraintMode(std::string const &constraintModeName)\n> +{\n> +\tconstraintModeName_ = constraintModeName;\n> +}\n> +\n> +void AgcChannel::switchMode(CameraMode const &cameraMode,\n> +\t\t\t    Metadata *metadata)\n> +{\n> +\t/* AGC expects the mode sensitivity always to be non-zero. */\n> +\tASSERT(cameraMode.sensitivity);\n> +\n> +\thousekeepConfig();\n> +\n> +\t/*\n> +\t * Store the mode in the local state. We must cache the sensitivity\n> +\t * of the previous mode for the calculations below.\n> +\t */\n> +\tdouble lastSensitivity = mode_.sensitivity;\n> +\tmode_ = cameraMode;\n> +\n> +\tDuration fixedShutter = limitShutter(fixedShutter_);\n> +\tif (fixedShutter && fixedAnalogueGain_) {\n> +\t\t/* We're going to reset the algorithm here with these fixed values. */\n> +\n> +\t\tfetchAwbStatus(metadata);\n> +\t\tdouble minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> +\t\tASSERT(minColourGain != 0.0);\n> +\n> +\t\t/* This is the equivalent of computeTargetExposure and applyDigitalGain. */\n> +\t\ttarget_.totalExposureNoDG = fixedShutter_ * fixedAnalogueGain_;\n> +\t\ttarget_.totalExposure = target_.totalExposureNoDG / minColourGain;\n> +\n> +\t\t/* Equivalent of filterExposure. This resets any \"history\". */\n> +\t\tfiltered_ = target_;\n> +\n> +\t\t/* Equivalent of divideUpExposure. 
*/\n> +\t\tfiltered_.shutter = fixedShutter;\n> +\t\tfiltered_.analogueGain = fixedAnalogueGain_;\n> +\t} else if (status_.totalExposureValue) {\n> +\t\t/*\n> +\t\t * On a mode switch, various things could happen:\n> +\t\t * - the exposure profile might change\n> +\t\t * - a fixed exposure or gain might be set\n> +\t\t * - the new mode's sensitivity might be different\n> +\t\t * We cope with the last of these by scaling the target values. After\n> +\t\t * that we just need to re-divide the exposure/gain according to the\n> +\t\t * current exposure profile, which takes care of everything else.\n> +\t\t */\n> +\n> +\t\tdouble ratio = lastSensitivity / cameraMode.sensitivity;\n> +\t\ttarget_.totalExposureNoDG *= ratio;\n> +\t\ttarget_.totalExposure *= ratio;\n> +\t\tfiltered_.totalExposureNoDG *= ratio;\n> +\t\tfiltered_.totalExposure *= ratio;\n> +\n> +\t\tdivideUpExposure();\n> +\t} else {\n> +\t\t/*\n> +\t\t * We come through here on startup, when at least one of the shutter\n> +\t\t * or gain has not been fixed. We must still write those values out so\n> +\t\t * that they will be applied immediately. We supply some arbitrary defaults\n> +\t\t * for any that weren't set.\n> +\t\t */\n> +\n> +\t\t/* Equivalent of divideUpExposure. */\n> +\t\tfiltered_.shutter = fixedShutter ? fixedShutter : config_.defaultExposureTime;\n> +\t\tfiltered_.analogueGain = fixedAnalogueGain_ ? 
fixedAnalogueGain_ : config_.defaultAnalogueGain;\n> +\t}\n> +\n> +\twriteAndFinish(metadata, false);\n> +}\n> +\n> +void AgcChannel::prepare(Metadata *imageMetadata)\n> +{\n> +\tDuration totalExposureValue = status_.totalExposureValue;\n> +\tAgcStatus delayedStatus;\n> +\tAgcPrepareStatus prepareStatus;\n> +\n> +\tif (!imageMetadata->get(\"agc.delayed_status\", delayedStatus))\n> +\t\ttotalExposureValue = delayedStatus.totalExposureValue;\n> +\n> +\tprepareStatus.digitalGain = 1.0;\n> +\tprepareStatus.locked = false;\n> +\n> +\tif (status_.totalExposureValue) {\n> +\t\t/* Process has run, so we have meaningful values. */\n> +\t\tDeviceStatus deviceStatus;\n> +\t\tif (imageMetadata->get(\"device.status\", deviceStatus) == 0) {\n> +\t\t\tDuration actualExposure = deviceStatus.shutterSpeed *\n> +\t\t\t\t\t\t  deviceStatus.analogueGain;\n> +\t\t\tif (actualExposure) {\n> +\t\t\t\tdouble digitalGain = totalExposureValue / actualExposure;\n> +\t\t\t\tLOG(RPiAgc, Debug) << \"Want total exposure \" << totalExposureValue;\n> +\t\t\t\t/*\n> +\t\t\t\t * Never ask for a gain < 1.0, and also impose\n> +\t\t\t\t * some upper limit. Make it customisable?\n> +\t\t\t\t */\n> +\t\t\t\tprepareStatus.digitalGain = std::max(1.0, std::min(digitalGain, 4.0));\n> +\t\t\t\tLOG(RPiAgc, Debug) << \"Actual exposure \" << actualExposure;\n> +\t\t\t\tLOG(RPiAgc, Debug) << \"Use digitalGain \" << prepareStatus.digitalGain;\n> +\t\t\t\tLOG(RPiAgc, Debug) << \"Effective exposure \"\n> +\t\t\t\t\t\t   << actualExposure * prepareStatus.digitalGain;\n> +\t\t\t\t/* Decide whether AEC/AGC has converged. 
*/\n> +\t\t\t\tprepareStatus.locked = updateLockStatus(deviceStatus);\n> +\t\t\t}\n> +\t\t} else\n> +\t\t\tLOG(RPiAgc, Warning) << \"AgcChannel: no device metadata\";\n> +\t\timageMetadata->set(\"agc.prepare_status\", prepareStatus);\n> +\t}\n> +}\n> +\n> +void AgcChannel::process(StatisticsPtr &stats, Metadata *imageMetadata)\n> +{\n> +\tframeCount_++;\n> +\t/*\n> +\t * First a little bit of housekeeping, fetching up-to-date settings and\n> +\t * configuration, that kind of thing.\n> +\t */\n> +\thousekeepConfig();\n> +\t/* Fetch the AWB status immediately, so that we can assume it's there. */\n> +\tfetchAwbStatus(imageMetadata);\n> +\t/* Get the current exposure values for the frame that's just arrived. */\n> +\tfetchCurrentExposure(imageMetadata);\n> +\t/* Compute the total gain we require relative to the current exposure. */\n> +\tdouble gain, targetY;\n> +\tcomputeGain(stats, imageMetadata, gain, targetY);\n> +\t/* Now compute the target (final) exposure which we think we want. */\n> +\tcomputeTargetExposure(gain);\n> +\t/* The results have to be filtered so as not to change too rapidly. */\n> +\tfilterExposure();\n> +\t/*\n> +\t * Some of the exposure has to be applied as digital gain, so work out\n> +\t * what that is. This function also tells us whether it's decided to\n> +\t * \"desaturate\" the image more quickly.\n> +\t */\n> +\tbool desaturate = applyDigitalGain(gain, targetY);\n> +\t/*\n> +\t * The last thing is to divide up the exposure value into a shutter time\n> +\t * and analogue gain, according to the current exposure mode.\n> +\t */\n> +\tdivideUpExposure();\n> +\t/* Finally advertise what we've done. */\n> +\twriteAndFinish(imageMetadata, desaturate);\n> +}\n> +\n> +bool AgcChannel::updateLockStatus(DeviceStatus const &deviceStatus)\n> +{\n> +\tconst double errorFactor = 0.10; /* make these customisable? 
*/\n> +\tconst int maxLockCount = 5;\n> +\t/* Reset \"lock count\" when we exceed this multiple of errorFactor */\n> +\tconst double resetMargin = 1.5;\n> +\n> +\t/* Add 200us to the exposure time error to allow for line quantisation. */\n> +\tDuration exposureError = lastDeviceStatus_.shutterSpeed * errorFactor + 200us;\n> +\tdouble gainError = lastDeviceStatus_.analogueGain * errorFactor;\n> +\tDuration targetError = lastTargetExposure_ * errorFactor;\n> +\n> +\t/*\n> +\t * Note that we don't know the exposure/gain limits of the sensor, so\n> +\t * the values we keep requesting may be unachievable. For this reason\n> +\t * we only insist that we're close to values in the past few frames.\n> +\t */\n> +\tif (deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed - exposureError &&\n> +\t    deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed + exposureError &&\n> +\t    deviceStatus.analogueGain > lastDeviceStatus_.analogueGain - gainError &&\n> +\t    deviceStatus.analogueGain < lastDeviceStatus_.analogueGain + gainError &&\n> +\t    status_.targetExposureValue > lastTargetExposure_ - targetError &&\n> +\t    status_.targetExposureValue < lastTargetExposure_ + targetError)\n> +\t\tlockCount_ = std::min(lockCount_ + 1, maxLockCount);\n> +\telse if (deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed - resetMargin * exposureError ||\n> +\t\t deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed + resetMargin * exposureError ||\n> +\t\t deviceStatus.analogueGain < lastDeviceStatus_.analogueGain - resetMargin * gainError ||\n> +\t\t deviceStatus.analogueGain > lastDeviceStatus_.analogueGain + resetMargin * gainError ||\n> +\t\t status_.targetExposureValue < lastTargetExposure_ - resetMargin * targetError ||\n> +\t\t status_.targetExposureValue > lastTargetExposure_ + resetMargin * targetError)\n> +\t\tlockCount_ = 0;\n> +\n> +\tlastDeviceStatus_ = deviceStatus;\n> +\tlastTargetExposure_ = status_.targetExposureValue;\n> +\n> +\tLOG(RPiAgc, 
Debug) << \"Lock count updated to \" << lockCount_;\n> +\treturn lockCount_ == maxLockCount;\n> +}\n> +\n> +void AgcChannel::housekeepConfig()\n> +{\n> +\t/* First fetch all the up-to-date settings, so no one else has to do it. */\n> +\tstatus_.ev = ev_;\n> +\tstatus_.fixedShutter = limitShutter(fixedShutter_);\n> +\tstatus_.fixedAnalogueGain = fixedAnalogueGain_;\n> +\tstatus_.flickerPeriod = flickerPeriod_;\n> +\tLOG(RPiAgc, Debug) << \"ev \" << status_.ev << \" fixedShutter \"\n> +\t\t\t   << status_.fixedShutter << \" fixedAnalogueGain \"\n> +\t\t\t   << status_.fixedAnalogueGain;\n> +\t/*\n> +\t * Make sure the \"mode\" pointers point to the up-to-date things, if\n> +\t * they've changed.\n> +\t */\n> +\tif (meteringModeName_ != status_.meteringMode) {\n> +\t\tauto it = config_.meteringModes.find(meteringModeName_);\n> +\t\tif (it == config_.meteringModes.end()) {\n> +\t\t\tLOG(RPiAgc, Warning) << \"No metering mode \" << meteringModeName_;\n> +\t\t\tmeteringModeName_ = status_.meteringMode;\n> +\t\t} else {\n> +\t\t\tmeteringMode_ = &it->second;\n> +\t\t\tstatus_.meteringMode = meteringModeName_;\n> +\t\t}\n> +\t}\n> +\tif (exposureModeName_ != status_.exposureMode) {\n> +\t\tauto it = config_.exposureModes.find(exposureModeName_);\n> +\t\tif (it == config_.exposureModes.end()) {\n> +\t\t\tLOG(RPiAgc, Warning) << \"No exposure profile \" << exposureModeName_;\n> +\t\t\texposureModeName_ = status_.exposureMode;\n> +\t\t} else {\n> +\t\t\texposureMode_ = &it->second;\n> +\t\t\tstatus_.exposureMode = exposureModeName_;\n> +\t\t}\n> +\t}\n> +\tif (constraintModeName_ != status_.constraintMode) {\n> +\t\tauto it = config_.constraintModes.find(constraintModeName_);\n> +\t\tif (it == config_.constraintModes.end()) {\n> +\t\t\tLOG(RPiAgc, Warning) << \"No constraint list \" << constraintModeName_;\n> +\t\t\tconstraintModeName_ = status_.constraintMode;\n> +\t\t} else {\n> +\t\t\tconstraintMode_ = &it->second;\n> +\t\t\tstatus_.constraintMode = constraintModeName_;\n> 
+\t\t}\n> +\t}\n> +\tLOG(RPiAgc, Debug) << \"exposureMode \"\n> +\t\t\t   << exposureModeName_ << \" constraintMode \"\n> +\t\t\t   << constraintModeName_ << \" meteringMode \"\n> +\t\t\t   << meteringModeName_;\n> +}\n> +\n> +void AgcChannel::fetchCurrentExposure(Metadata *imageMetadata)\n> +{\n> +\tstd::unique_lock<Metadata> lock(*imageMetadata);\n> +\tDeviceStatus *deviceStatus =\n> +\t\timageMetadata->getLocked<DeviceStatus>(\"device.status\");\n> +\tif (!deviceStatus)\n> +\t\tLOG(RPiAgc, Fatal) << \"No device metadata\";\n> +\tcurrent_.shutter = deviceStatus->shutterSpeed;\n> +\tcurrent_.analogueGain = deviceStatus->analogueGain;\n> +\tAgcStatus *agcStatus =\n> +\t\timageMetadata->getLocked<AgcStatus>(\"agc.status\");\n> +\tcurrent_.totalExposure = agcStatus ? agcStatus->totalExposureValue : 0s;\n> +\tcurrent_.totalExposureNoDG = current_.shutter * current_.analogueGain;\n> +}\n> +\n> +void AgcChannel::fetchAwbStatus(Metadata *imageMetadata)\n> +{\n> +\tawb_.gainR = 1.0; /* in case not found in metadata */\n> +\tawb_.gainG = 1.0;\n> +\tawb_.gainB = 1.0;\n> +\tif (imageMetadata->get(\"awb.status\", awb_) != 0)\n> +\t\tLOG(RPiAgc, Debug) << \"No AWB status found\";\n> +}\n> +\n> +static double computeInitialY(StatisticsPtr &stats, AwbStatus const &awb,\n> +\t\t\t      std::vector<double> &weights, double gain)\n> +{\n> +\tconstexpr uint64_t maxVal = 1 << Statistics::NormalisationFactorPow2;\n> +\n> +\t/*\n> +\t * If we have no AGC region stats, but do have a Y histogram, use that\n> +\t * directly to calculate the mean Y value of the image.\n> +\t */\n> +\tif (!stats->agcRegions.numRegions() && stats->yHist.bins()) {\n> +\t\t/*\n> +\t\t * When the gain is applied to the histogram, anything below minBin\n> +\t\t * will scale up directly with the gain, but anything above that\n> +\t\t * will saturate into the top bin.\n> +\t\t */\n> +\t\tauto &hist = stats->yHist;\n> +\t\tdouble minBin = std::min(1.0, 1.0 / gain) * hist.bins();\n> +\t\tdouble binMean = 
hist.interBinMean(0.0, minBin);\n> +\t\tdouble numUnsaturated = hist.cumulativeFreq(minBin);\n> +\t\t/* This term is from all the pixels that won't saturate. */\n> +\t\tdouble ySum = binMean * gain * numUnsaturated;\n> +\t\t/* And add the ones that will saturate. */\n> +\t\tySum += (hist.total() - numUnsaturated) * hist.bins();\n> +\t\treturn ySum / hist.total() / hist.bins();\n> +\t}\n> +\n> +\tASSERT(weights.size() == stats->agcRegions.numRegions());\n> +\n> +\t/*\n> +\t * Note that the weights are applied by the IPA to the statistics directly,\n> +\t * before they are given to us here.\n> +\t */\n> +\tdouble rSum = 0, gSum = 0, bSum = 0, pixelSum = 0;\n> +\tfor (unsigned int i = 0; i < stats->agcRegions.numRegions(); i++) {\n> +\t\tauto &region = stats->agcRegions.get(i);\n> +\t\trSum += std::min<double>(region.val.rSum * gain, (maxVal - 1) * region.counted);\n> +\t\tgSum += std::min<double>(region.val.gSum * gain, (maxVal - 1) * region.counted);\n> +\t\tbSum += std::min<double>(region.val.bSum * gain, (maxVal - 1) * region.counted);\n> +\t\tpixelSum += region.counted;\n> +\t}\n> +\tif (pixelSum == 0.0) {\n> +\t\tLOG(RPiAgc, Warning) << \"computeInitialY: pixelSum is zero\";\n> +\t\treturn 0;\n> +\t}\n> +\n> +\tdouble ySum;\n> +\t/* Factor in the AWB correction if needed. */\n> +\tif (stats->agcStatsPos == Statistics::AgcStatsPos::PreWb) {\n> +\t\tySum = rSum * awb.gainR * .299 +\n> +\t\t       gSum * awb.gainG * .587 +\n> +\t\t       bSum * awb.gainB * .114;\n> +\t} else\n> +\t\tySum = rSum * .299 + gSum * .587 + bSum * .114;\n> +\n> +\treturn ySum / pixelSum / (1 << 16);\n> +}\n> +\n> +/*\n> + * We handle extra gain through EV by adjusting our Y targets. However, you\n> + * simply can't monitor histograms once they get very close to (or beyond!)\n> + * saturation, so we clamp the Y targets to this value. 
It does mean that EV\n> + * increases don't necessarily do quite what you might expect in certain\n> + * (contrived) cases.\n> + */\n> +\n> +static constexpr double EvGainYTargetLimit = 0.9;\n> +\n> +static double constraintComputeGain(AgcConstraint &c, const Histogram &h, double lux,\n> +\t\t\t\t    double evGain, double &targetY)\n> +{\n> +\ttargetY = c.yTarget.eval(c.yTarget.domain().clip(lux));\n> +\ttargetY = std::min(EvGainYTargetLimit, targetY * evGain);\n> +\tdouble iqm = h.interQuantileMean(c.qLo, c.qHi);\n> +\treturn (targetY * h.bins()) / iqm;\n> +}\n> +\n> +void AgcChannel::computeGain(StatisticsPtr &statistics, Metadata *imageMetadata,\n> +\t\t\t     double &gain, double &targetY)\n> +{\n> +\tstruct LuxStatus lux = {};\n> +\tlux.lux = 400; /* default lux level to 400 in case no metadata found */\n> +\tif (imageMetadata->get(\"lux.status\", lux) != 0)\n> +\t\tLOG(RPiAgc, Warning) << \"No lux level found\";\n> +\tconst Histogram &h = statistics->yHist;\n> +\tdouble evGain = status_.ev * config_.baseEv;\n> +\t/*\n> +\t * The initial gain and target_Y come from some of the regions. 
After\n> +\t * that we consider the histogram constraints.\n> +\t */\n> +\ttargetY = config_.yTarget.eval(config_.yTarget.domain().clip(lux.lux));\n> +\ttargetY = std::min(EvGainYTargetLimit, targetY * evGain);\n> +\n> +\t/*\n> +\t * Do this calculation a few times as brightness increase can be\n> +\t * non-linear when there are saturated regions.\n> +\t */\n> +\tgain = 1.0;\n> +\tfor (int i = 0; i < 8; i++) {\n> +\t\tdouble initialY = computeInitialY(statistics, awb_, meteringMode_->weights, gain);\n> +\t\tdouble extraGain = std::min(10.0, targetY / (initialY + .001));\n> +\t\tgain *= extraGain;\n> +\t\tLOG(RPiAgc, Debug) << \"Initial Y \" << initialY << \" target \" << targetY\n> +\t\t\t\t   << \" gives gain \" << gain;\n> +\t\tif (extraGain < 1.01) /* close enough */\n> +\t\t\tbreak;\n> +\t}\n> +\n> +\tfor (auto &c : *constraintMode_) {\n> +\t\tdouble newTargetY;\n> +\t\tdouble newGain = constraintComputeGain(c, h, lux.lux, evGain, newTargetY);\n> +\t\tLOG(RPiAgc, Debug) << \"Constraint has target_Y \"\n> +\t\t\t\t   << newTargetY << \" giving gain \" << newGain;\n> +\t\tif (c.bound == AgcConstraint::Bound::LOWER && newGain > gain) {\n> +\t\t\tLOG(RPiAgc, Debug) << \"Lower bound constraint adopted\";\n> +\t\t\tgain = newGain;\n> +\t\t\ttargetY = newTargetY;\n> +\t\t} else if (c.bound == AgcConstraint::Bound::UPPER && newGain < gain) {\n> +\t\t\tLOG(RPiAgc, Debug) << \"Upper bound constraint adopted\";\n> +\t\t\tgain = newGain;\n> +\t\t\ttargetY = newTargetY;\n> +\t\t}\n> +\t}\n> +\tLOG(RPiAgc, Debug) << \"Final gain \" << gain << \" (target_Y \" << targetY << \" ev \"\n> +\t\t\t   << status_.ev << \" base_ev \" << config_.baseEv\n> +\t\t\t   << \")\";\n> +}\n> +\n> +void AgcChannel::computeTargetExposure(double gain)\n> +{\n> +\tif (status_.fixedShutter && status_.fixedAnalogueGain) {\n> +\t\t/*\n> +\t\t * When ag and shutter are both fixed, we need to drive the\n> +\t\t * total exposure so that we end up with a digital gain of at least\n> +\t\t * 
1/minColourGain. Otherwise we'd desaturate channels causing\n> +\t\t * white to go cyan or magenta.\n> +\t\t */\n> +\t\tdouble minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> +\t\tASSERT(minColourGain != 0.0);\n> +\t\ttarget_.totalExposure =\n> +\t\t\tstatus_.fixedShutter * status_.fixedAnalogueGain / minColourGain;\n> +\t} else {\n> +\t\t/*\n> +\t\t * The statistics reflect the image without digital gain, so the final\n> +\t\t * total exposure we're aiming for is:\n> +\t\t */\n> +\t\ttarget_.totalExposure = current_.totalExposureNoDG * gain;\n> +\t\t/* The final target exposure is also limited to what the exposure mode allows. */\n> +\t\tDuration maxShutter = status_.fixedShutter\n> +\t\t\t\t\t      ? status_.fixedShutter\n> +\t\t\t\t\t      : exposureMode_->shutter.back();\n> +\t\tmaxShutter = limitShutter(maxShutter);\n> +\t\tDuration maxTotalExposure =\n> +\t\t\tmaxShutter *\n> +\t\t\t(status_.fixedAnalogueGain != 0.0\n> +\t\t\t\t ? status_.fixedAnalogueGain\n> +\t\t\t\t : exposureMode_->gain.back());\n> +\t\ttarget_.totalExposure = std::min(target_.totalExposure, maxTotalExposure);\n> +\t}\n> +\tLOG(RPiAgc, Debug) << \"Target totalExposure \" << target_.totalExposure;\n> +}\n> +\n> +bool AgcChannel::applyDigitalGain(double gain, double targetY)\n> +{\n> +\tdouble minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> +\tASSERT(minColourGain != 0.0);\n> +\tdouble dg = 1.0 / minColourGain;\n> +\t/*\n> +\t * I think this pipeline subtracts black level and rescales before we\n> +\t * get the stats, so no need to worry about it.\n> +\t */\n> +\tLOG(RPiAgc, Debug) << \"after AWB, target dg \" << dg << \" gain \" << gain\n> +\t\t\t   << \" target_Y \" << targetY;\n> +\t/*\n> +\t * Finally, if we're trying to reduce exposure but the target_Y is\n> +\t * \"close\" to 1.0, then the gain computed for that constraint will be\n> +\t * only slightly less than one, because the measured Y can never be\n> +\t * larger than 1.0. 
When this happens, demand a large digital gain so\n> +\t * that the exposure can be reduced, de-saturating the image much more\n> +\t * quickly (and we then approach the correct value more quickly from\n> +\t * below).\n> +\t */\n> +\tbool desaturate = targetY > config_.fastReduceThreshold &&\n> +\t\t\t  gain < sqrt(targetY);\n> +\tif (desaturate)\n> +\t\tdg /= config_.fastReduceThreshold;\n> +\tLOG(RPiAgc, Debug) << \"Digital gain \" << dg << \" desaturate? \" << desaturate;\n> +\tfiltered_.totalExposureNoDG = filtered_.totalExposure / dg;\n> +\tLOG(RPiAgc, Debug) << \"Target totalExposureNoDG \" << filtered_.totalExposureNoDG;\n> +\treturn desaturate;\n> +}\n> +\n> +void AgcChannel::filterExposure()\n> +{\n> +\tdouble speed = config_.speed;\n> +\t/*\n> +\t * AGC adapts instantly if both shutter and gain are directly specified\n> +\t * or we're in the startup phase.\n> +\t */\n> +\tif ((status_.fixedShutter && status_.fixedAnalogueGain) ||\n> +\t    frameCount_ <= config_.startupFrames)\n> +\t\tspeed = 1.0;\n> +\tif (!filtered_.totalExposure) {\n> +\t\tfiltered_.totalExposure = target_.totalExposure;\n> +\t} else {\n> +\t\t/*\n> +\t\t * If close to the result go faster, to save making so many\n> +\t\t * micro-adjustments on the way. 
(Make this customisable?)\n> +\t\t */\n> +\t\tif (filtered_.totalExposure < 1.2 * target_.totalExposure &&\n> +\t\t    filtered_.totalExposure > 0.8 * target_.totalExposure)\n> +\t\t\tspeed = sqrt(speed);\n> +\t\tfiltered_.totalExposure = speed * target_.totalExposure +\n> +\t\t\t\t\t  filtered_.totalExposure * (1.0 - speed);\n> +\t}\n> +\tLOG(RPiAgc, Debug) << \"After filtering, totalExposure \" << filtered_.totalExposure\n> +\t\t\t   << \" no dg \" << filtered_.totalExposureNoDG;\n> +}\n> +\n> +void AgcChannel::divideUpExposure()\n> +{\n> +\t/*\n> +\t * Sending the fixed shutter/gain cases through the same code may seem\n> +\t * unnecessary, but it will make more sense when we extend this to cover\n> +\t * variable aperture.\n> +\t */\n> +\tDuration exposureValue = filtered_.totalExposureNoDG;\n> +\tDuration shutterTime;\n> +\tdouble analogueGain;\n> +\tshutterTime = status_.fixedShutter ? status_.fixedShutter\n> +\t\t\t\t\t   : exposureMode_->shutter[0];\n> +\tshutterTime = limitShutter(shutterTime);\n> +\tanalogueGain = status_.fixedAnalogueGain != 0.0 ? 
status_.fixedAnalogueGain\n> +\t\t\t\t\t\t\t: exposureMode_->gain[0];\n> +\tanalogueGain = limitGain(analogueGain);\n> +\tif (shutterTime * analogueGain < exposureValue) {\n> +\t\tfor (unsigned int stage = 1;\n> +\t\t     stage < exposureMode_->gain.size(); stage++) {\n> +\t\t\tif (!status_.fixedShutter) {\n> +\t\t\t\tDuration stageShutter =\n> +\t\t\t\t\tlimitShutter(exposureMode_->shutter[stage]);\n> +\t\t\t\tif (stageShutter * analogueGain >= exposureValue) {\n> +\t\t\t\t\tshutterTime = exposureValue / analogueGain;\n> +\t\t\t\t\tbreak;\n> +\t\t\t\t}\n> +\t\t\t\tshutterTime = stageShutter;\n> +\t\t\t}\n> +\t\t\tif (status_.fixedAnalogueGain == 0.0) {\n> +\t\t\t\tif (exposureMode_->gain[stage] * shutterTime >= exposureValue) {\n> +\t\t\t\t\tanalogueGain = exposureValue / shutterTime;\n> +\t\t\t\t\tbreak;\n> +\t\t\t\t}\n> +\t\t\t\tanalogueGain = exposureMode_->gain[stage];\n> +\t\t\t\tanalogueGain = limitGain(analogueGain);\n> +\t\t\t}\n> +\t\t}\n> +\t}\n> +\tLOG(RPiAgc, Debug) << \"Divided up shutter and gain are \" << shutterTime << \" and \"\n> +\t\t\t   << analogueGain;\n> +\t/*\n> +\t * Finally adjust shutter time for flicker avoidance (require both\n> +\t * shutter and gain not to be fixed).\n> +\t */\n> +\tif (!status_.fixedShutter && !status_.fixedAnalogueGain &&\n> +\t    status_.flickerPeriod) {\n> +\t\tint flickerPeriods = shutterTime / status_.flickerPeriod;\n> +\t\tif (flickerPeriods) {\n> +\t\t\tDuration newShutterTime = flickerPeriods * status_.flickerPeriod;\n> +\t\t\tanalogueGain *= shutterTime / newShutterTime;\n> +\t\t\t/*\n> +\t\t\t * We should still not allow the ag to go over the\n> +\t\t\t * largest value in the exposure mode. 
Note that this\n> +\t\t\t * may force more of the total exposure into the digital\n> +\t\t\t * gain as a side-effect.\n> +\t\t\t */\n> +\t\t\tanalogueGain = std::min(analogueGain, exposureMode_->gain.back());\n> +\t\t\tanalogueGain = limitGain(analogueGain);\n> +\t\t\tshutterTime = newShutterTime;\n> +\t\t}\n> +\t\tLOG(RPiAgc, Debug) << \"After flicker avoidance, shutter \"\n> +\t\t\t\t   << shutterTime << \" gain \" << analogueGain;\n> +\t}\n> +\tfiltered_.shutter = shutterTime;\n> +\tfiltered_.analogueGain = analogueGain;\n> +}\n> +\n> +void AgcChannel::writeAndFinish(Metadata *imageMetadata, bool desaturate)\n> +{\n> +\tstatus_.totalExposureValue = filtered_.totalExposure;\n> +\tstatus_.targetExposureValue = desaturate ? 0s : target_.totalExposureNoDG;\n> +\tstatus_.shutterTime = filtered_.shutter;\n> +\tstatus_.analogueGain = filtered_.analogueGain;\n> +\t/*\n> +\t * Write to metadata as well, in case anyone wants to update the camera\n> +\t * immediately.\n> +\t */\n> +\timageMetadata->set(\"agc.status\", status_);\n> +\tLOG(RPiAgc, Debug) << \"Output written, total exposure requested is \"\n> +\t\t\t   << filtered_.totalExposure;\n> +\tLOG(RPiAgc, Debug) << \"Camera exposure update: shutter time \" << filtered_.shutter\n> +\t\t\t   << \" analogue gain \" << filtered_.analogueGain;\n> +}\n> +\n> +Duration AgcChannel::limitShutter(Duration shutter)\n> +{\n> +\t/*\n> +\t * shutter == 0 is a special case for fixed shutter values, and must pass\n> +\t * through unchanged\n> +\t */\n> +\tif (!shutter)\n> +\t\treturn shutter;\n> +\n> +\tshutter = std::clamp(shutter, mode_.minShutter, maxShutter_);\n> +\treturn shutter;\n> +}\n> +\n> +double AgcChannel::limitGain(double gain) const\n> +{\n> +\t/*\n> +\t * Only limit the lower bounds of the gain value to what the sensor limits.\n> +\t * The upper bound on analogue gain will be made up with additional digital\n> +\t * gain applied by the ISP.\n> +\t *\n> +\t * gain == 0.0 is a special case for fixed shutter values, and 
must pass\n> +\t * through unchanged\n> +\t */\n> +\tif (!gain)\n> +\t\treturn gain;\n> +\n> +\tgain = std::max(gain, mode_.minAnalogueGain);\n> +\treturn gain;\n> +}\n> diff --git a/src/ipa/rpi/controller/rpi/agc_channel.h b/src/ipa/rpi/controller/rpi/agc_channel.h\n> new file mode 100644\n> index 00000000..d5a5cf3a\n> --- /dev/null\n> +++ b/src/ipa/rpi/controller/rpi/agc_channel.h\n> @@ -0,0 +1,137 @@\n> +/* SPDX-License-Identifier: BSD-2-Clause */\n> +/*\n> + * Copyright (C) 2023, Raspberry Pi Ltd\n> + *\n> + * agc.h - AGC/AEC control algorithm\n\nAnd agc_channel.h\n\nSorry for missing these in the previous review\n\nThe rest has been clarified in the previous review and the code is\njust mostly moved around\n\nReviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>\n\nThanks\n  j\n\n\n> + */\n> +#pragma once\n> +\n> +#include <map>\n> +#include <string>\n> +#include <vector>\n> +\n> +#include <libcamera/base/utils.h>\n> +\n> +#include \"../agc_status.h\"\n> +#include \"../awb_status.h\"\n> +#include \"../controller.h\"\n> +#include \"../pwl.h\"\n> +\n> +/* This is our implementation of AGC. 
*/\n> +\n> +namespace RPiController {\n> +\n> +struct AgcMeteringMode {\n> +\tstd::vector<double> weights;\n> +\tint read(const libcamera::YamlObject &params);\n> +};\n> +\n> +struct AgcExposureMode {\n> +\tstd::vector<libcamera::utils::Duration> shutter;\n> +\tstd::vector<double> gain;\n> +\tint read(const libcamera::YamlObject &params);\n> +};\n> +\n> +struct AgcConstraint {\n> +\tenum class Bound { LOWER = 0,\n> +\t\t\t   UPPER = 1 };\n> +\tBound bound;\n> +\tdouble qLo;\n> +\tdouble qHi;\n> +\tPwl yTarget;\n> +\tint read(const libcamera::YamlObject &params);\n> +};\n> +\n> +typedef std::vector<AgcConstraint> AgcConstraintMode;\n> +\n> +struct AgcConfig {\n> +\tint read(const libcamera::YamlObject &params);\n> +\tstd::map<std::string, AgcMeteringMode> meteringModes;\n> +\tstd::map<std::string, AgcExposureMode> exposureModes;\n> +\tstd::map<std::string, AgcConstraintMode> constraintModes;\n> +\tPwl yTarget;\n> +\tdouble speed;\n> +\tuint16_t startupFrames;\n> +\tunsigned int convergenceFrames;\n> +\tdouble maxChange;\n> +\tdouble minChange;\n> +\tdouble fastReduceThreshold;\n> +\tdouble speedUpThreshold;\n> +\tstd::string defaultMeteringMode;\n> +\tstd::string defaultExposureMode;\n> +\tstd::string defaultConstraintMode;\n> +\tdouble baseEv;\n> +\tlibcamera::utils::Duration defaultExposureTime;\n> +\tdouble defaultAnalogueGain;\n> +};\n> +\n> +class AgcChannel\n> +{\n> +public:\n> +\tAgcChannel();\n> +\tint read(const libcamera::YamlObject &params,\n> +\t\t const Controller::HardwareConfig &hardwareConfig);\n> +\tunsigned int getConvergenceFrames() const;\n> +\tstd::vector<double> const &getWeights() const;\n> +\tvoid setEv(double ev);\n> +\tvoid setFlickerPeriod(libcamera::utils::Duration flickerPeriod);\n> +\tvoid setMaxShutter(libcamera::utils::Duration maxShutter);\n> +\tvoid setFixedShutter(libcamera::utils::Duration fixedShutter);\n> +\tvoid setFixedAnalogueGain(double fixedAnalogueGain);\n> +\tvoid setMeteringMode(std::string const &meteringModeName);\n> 
+\tvoid setExposureMode(std::string const &exposureModeName);\n> +\tvoid setConstraintMode(std::string const &constraintModeName);\n> +\tvoid enableAuto();\n> +\tvoid disableAuto();\n> +\tvoid switchMode(CameraMode const &cameraMode, Metadata *metadata);\n> +\tvoid prepare(Metadata *imageMetadata);\n> +\tvoid process(StatisticsPtr &stats, Metadata *imageMetadata);\n> +\n> +private:\n> +\tbool updateLockStatus(DeviceStatus const &deviceStatus);\n> +\tAgcConfig config_;\n> +\tvoid housekeepConfig();\n> +\tvoid fetchCurrentExposure(Metadata *imageMetadata);\n> +\tvoid fetchAwbStatus(Metadata *imageMetadata);\n> +\tvoid computeGain(StatisticsPtr &statistics, Metadata *imageMetadata,\n> +\t\t\t double &gain, double &targetY);\n> +\tvoid computeTargetExposure(double gain);\n> +\tvoid filterExposure();\n> +\tbool applyDigitalGain(double gain, double targetY);\n> +\tvoid divideUpExposure();\n> +\tvoid writeAndFinish(Metadata *imageMetadata, bool desaturate);\n> +\tlibcamera::utils::Duration limitShutter(libcamera::utils::Duration shutter);\n> +\tdouble limitGain(double gain) const;\n> +\tAgcMeteringMode *meteringMode_;\n> +\tAgcExposureMode *exposureMode_;\n> +\tAgcConstraintMode *constraintMode_;\n> +\tCameraMode mode_;\n> +\tuint64_t frameCount_;\n> +\tAwbStatus awb_;\n> +\tstruct ExposureValues {\n> +\t\tExposureValues();\n> +\n> +\t\tlibcamera::utils::Duration shutter;\n> +\t\tdouble analogueGain;\n> +\t\tlibcamera::utils::Duration totalExposure;\n> +\t\tlibcamera::utils::Duration totalExposureNoDG; /* without digital gain */\n> +\t};\n> +\tExposureValues current_; /* values for the current frame */\n> +\tExposureValues target_; /* calculate the values we want here */\n> +\tExposureValues filtered_; /* these values are filtered towards target */\n> +\tAgcStatus status_;\n> +\tint lockCount_;\n> +\tDeviceStatus lastDeviceStatus_;\n> +\tlibcamera::utils::Duration lastTargetExposure_;\n> +\t/* Below here are the \"settings\" that applications can change. 
*/\n> +\tstd::string meteringModeName_;\n> +\tstd::string exposureModeName_;\n> +\tstd::string constraintModeName_;\n> +\tdouble ev_;\n> +\tlibcamera::utils::Duration flickerPeriod_;\n> +\tlibcamera::utils::Duration maxShutter_;\n> +\tlibcamera::utils::Duration fixedShutter_;\n> +\tdouble fixedAnalogueGain_;\n> +};\n> +\n> +} /* namespace RPiController */\n> --\n> 2.30.2\n>","headers":{"Date":"Tue, 12 Sep 2023 16:46:01 +0200","From":"Jacopo Mondi via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org","Subject":"Re: [libcamera-devel] [PATCH v3 2/5] ipa: rpi: agc: Reorganise code\n\tfor multi-channel AGC","Message-ID":"<brn2l4a6hepdgnl4txnhhebgnl66lgda5edzj4cp5dwrgpwckp@wps62szgdufb>","References":"<20230912102442.169001-1-david.plowman@raspberrypi.com>\n\t<20230912102442.169001-3-david.plowman@raspberrypi.com>","In-Reply-To":"<20230912102442.169001-3-david.plowman@raspberrypi.com>"}},{"id":27782,"web_url":"https://patchwork.libcamera.org/comment/27782/","msgid":"<CAHW6GYJozb9vz=BRFiMMdXTNm7pQbk9k03zhNOWRc=Rtx9H1BA@mail.gmail.com>","date":"2023-09-15T08:08:08","subject":"Re: [libcamera-devel] [PATCH v3 2/5] ipa: rpi: agc: Reorganise code\n\tfor multi-channel AGC","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Jacopo\n\nThanks for the review again!\n\nOn Tue, 12 Sept 2023 at 15:46, Jacopo Mondi\n<jacopo.mondi@ideasonboard.com> wrote:\n>\n> Hi David\n>\n> On Tue, Sep 12, 2023 at 11:24:39AM +0100, David Plowman via libcamera-devel wrote:\n> > This commit does the basic reorganisation of the code in order to\n> > implement multi-channel AGC. 
The main changes are:\n> >\n> > * The previous Agc class (in agc.cpp) has become the AgcChannel class\n> >   in (agc_channel.cpp).\n> >\n> > * A new Agc class is introduced which is a wrapper round a number of\n> >   AgcChannels.\n> >\n> > * The basic plumbing from ipa_base.cpp to Agc is updated to include a\n> >   channel number. All the existing controls are hardwired to talk\n> >   directly to channel 0.\n> >\n> > There are a couple of limitations which we expect to apply to\n> > multi-channel AGC. We're not allowing different frame durations to be\n> > applied to the channels, nor are we allowing separate metering\n> > modes. To be fair, supporting these things is not impossible, but\n> > there are reasons why it may be tricky so they remain \"TBD\" for now.\n> >\n> > This patch only includes the basic reorganisation and plumbing. It\n> > does not yet update the important methods (switchMode, prepare and\n> > process) to implement multi-channel AGC properly. This will appear in\n> > a subsequent commit. 
For now, these functions are hard-coded just to\n> > use channel 0, thereby preserving the existing behaviour.\n> >\n> > Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> > Reviewed-by: Naushir Patuck <naush@raspberrypi.com>\n> > ---\n> >  src/ipa/rpi/common/ipa_base.cpp            |  20 +-\n> >  src/ipa/rpi/controller/agc_algorithm.h     |  19 +-\n> >  src/ipa/rpi/controller/meson.build         |   1 +\n> >  src/ipa/rpi/controller/rpi/agc.cpp         | 912 +++-----------------\n> >  src/ipa/rpi/controller/rpi/agc.h           | 121 +--\n> >  src/ipa/rpi/controller/rpi/agc_channel.cpp | 924 +++++++++++++++++++++\n> >  src/ipa/rpi/controller/rpi/agc_channel.h   | 137 +++\n> >  7 files changed, 1219 insertions(+), 915 deletions(-)\n> >  create mode 100644 src/ipa/rpi/controller/rpi/agc_channel.cpp\n> >  create mode 100644 src/ipa/rpi/controller/rpi/agc_channel.h\n> >\n> > diff --git a/src/ipa/rpi/common/ipa_base.cpp b/src/ipa/rpi/common/ipa_base.cpp\n> > index a47ae3a9..f7e7ad5e 100644\n> > --- a/src/ipa/rpi/common/ipa_base.cpp\n> > +++ b/src/ipa/rpi/common/ipa_base.cpp\n> > @@ -699,9 +699,9 @@ void IpaBase::applyControls(const ControlList &controls)\n> >                       }\n> >\n> >                       if (ctrl.second.get<bool>() == false)\n> > -                             agc->disableAuto();\n> > +                             agc->disableAuto(0);\n> >                       else\n> > -                             agc->enableAuto();\n> > +                             agc->enableAuto(0);\n> >\n> >                       libcameraMetadata_.set(controls::AeEnable, ctrl.second.get<bool>());\n> >                       break;\n> > @@ -717,7 +717,7 @@ void IpaBase::applyControls(const ControlList &controls)\n> >                       }\n> >\n> >                       /* The control provides units of microseconds. 
*/\n> > -                     agc->setFixedShutter(ctrl.second.get<int32_t>() * 1.0us);\n> > +                     agc->setFixedShutter(0, ctrl.second.get<int32_t>() * 1.0us);\n> >\n> >                       libcameraMetadata_.set(controls::ExposureTime, ctrl.second.get<int32_t>());\n> >                       break;\n> > @@ -732,7 +732,7 @@ void IpaBase::applyControls(const ControlList &controls)\n> >                               break;\n> >                       }\n> >\n> > -                     agc->setFixedAnalogueGain(ctrl.second.get<float>());\n> > +                     agc->setFixedAnalogueGain(0, ctrl.second.get<float>());\n> >\n> >                       libcameraMetadata_.set(controls::AnalogueGain,\n> >                                              ctrl.second.get<float>());\n> > @@ -770,7 +770,7 @@ void IpaBase::applyControls(const ControlList &controls)\n> >\n> >                       int32_t idx = ctrl.second.get<int32_t>();\n> >                       if (ConstraintModeTable.count(idx)) {\n> > -                             agc->setConstraintMode(ConstraintModeTable.at(idx));\n> > +                             agc->setConstraintMode(0, ConstraintModeTable.at(idx));\n> >                               libcameraMetadata_.set(controls::AeConstraintMode, idx);\n> >                       } else {\n> >                               LOG(IPARPI, Error) << \"Constraint mode \" << idx\n> > @@ -790,7 +790,7 @@ void IpaBase::applyControls(const ControlList &controls)\n> >\n> >                       int32_t idx = ctrl.second.get<int32_t>();\n> >                       if (ExposureModeTable.count(idx)) {\n> > -                             agc->setExposureMode(ExposureModeTable.at(idx));\n> > +                             agc->setExposureMode(0, ExposureModeTable.at(idx));\n> >                               libcameraMetadata_.set(controls::AeExposureMode, idx);\n> >                       } else {\n> >                               LOG(IPARPI, Error) << \"Exposure mode \" 
<< idx\n> > @@ -813,7 +813,7 @@ void IpaBase::applyControls(const ControlList &controls)\n> >                        * So convert to 2^EV\n> >                        */\n> >                       double ev = pow(2.0, ctrl.second.get<float>());\n> > -                     agc->setEv(ev);\n> > +                     agc->setEv(0, ev);\n> >                       libcameraMetadata_.set(controls::ExposureValue,\n> >                                              ctrl.second.get<float>());\n> >                       break;\n> > @@ -833,12 +833,12 @@ void IpaBase::applyControls(const ControlList &controls)\n> >\n> >                       switch (mode) {\n> >                       case controls::FlickerOff:\n> > -                             agc->setFlickerPeriod(0us);\n> > +                             agc->setFlickerPeriod(0, 0us);\n> >\n> >                               break;\n> >\n> >                       case controls::FlickerManual:\n> > -                             agc->setFlickerPeriod(flickerState_.manualPeriod);\n> > +                             agc->setFlickerPeriod(0, flickerState_.manualPeriod);\n> >\n> >                               break;\n> >\n> > @@ -872,7 +872,7 @@ void IpaBase::applyControls(const ControlList &controls)\n> >                        * first, and the period updated after, or vice versa.\n> >                        */\n> >                       if (flickerState_.mode == controls::FlickerManual)\n> > -                             agc->setFlickerPeriod(flickerState_.manualPeriod);\n> > +                             agc->setFlickerPeriod(0, flickerState_.manualPeriod);\n> >\n> >                       break;\n> >               }\n> > diff --git a/src/ipa/rpi/controller/agc_algorithm.h b/src/ipa/rpi/controller/agc_algorithm.h\n> > index b6949daa..b8986560 100644\n> > --- a/src/ipa/rpi/controller/agc_algorithm.h\n> > +++ b/src/ipa/rpi/controller/agc_algorithm.h\n> > @@ -21,16 +21,19 @@ public:\n> >       /* An AGC algorithm must provide the 
following: */\n> >       virtual unsigned int getConvergenceFrames() const = 0;\n> >       virtual std::vector<double> const &getWeights() const = 0;\n> > -     virtual void setEv(double ev) = 0;\n> > -     virtual void setFlickerPeriod(libcamera::utils::Duration flickerPeriod) = 0;\n> > -     virtual void setFixedShutter(libcamera::utils::Duration fixedShutter) = 0;\n> > +     virtual void setEv(unsigned int channel, double ev) = 0;\n> > +     virtual void setFlickerPeriod(unsigned int channel,\n> > +                                   libcamera::utils::Duration flickerPeriod) = 0;\n> > +     virtual void setFixedShutter(unsigned int channel,\n> > +                                  libcamera::utils::Duration fixedShutter) = 0;\n> >       virtual void setMaxShutter(libcamera::utils::Duration maxShutter) = 0;\n> > -     virtual void setFixedAnalogueGain(double fixedAnalogueGain) = 0;\n> > +     virtual void setFixedAnalogueGain(unsigned int channel, double fixedAnalogueGain) = 0;\n> >       virtual void setMeteringMode(std::string const &meteringModeName) = 0;\n> > -     virtual void setExposureMode(std::string const &exposureModeName) = 0;\n> > -     virtual void setConstraintMode(std::string const &contraintModeName) = 0;\n> > -     virtual void enableAuto() = 0;\n> > -     virtual void disableAuto() = 0;\n> > +     virtual void setExposureMode(unsigned int channel, std::string const &exposureModeName) = 0;\n> > +     virtual void setConstraintMode(unsigned int channel, std::string const &contraintModeName) = 0;\n> > +     virtual void enableAuto(unsigned int channel) = 0;\n> > +     virtual void disableAuto(unsigned int channel) = 0;\n> > +     virtual void setActiveChannels(const std::vector<unsigned int> &activeChannels) = 0;\n> >  };\n> >\n> >  } /* namespace RPiController */\n> > diff --git a/src/ipa/rpi/controller/meson.build b/src/ipa/rpi/controller/meson.build\n> > index feb0334e..20b9cda9 100644\n> > --- a/src/ipa/rpi/controller/meson.build\n> > +++ 
b/src/ipa/rpi/controller/meson.build\n> > @@ -8,6 +8,7 @@ rpi_ipa_controller_sources = files([\n> >      'pwl.cpp',\n> >      'rpi/af.cpp',\n> >      'rpi/agc.cpp',\n> > +    'rpi/agc_channel.cpp',\n> >      'rpi/alsc.cpp',\n> >      'rpi/awb.cpp',\n> >      'rpi/black_level.cpp',\n> > diff --git a/src/ipa/rpi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp\n> > index 7b02972a..598fc890 100644\n> > --- a/src/ipa/rpi/controller/rpi/agc.cpp\n> > +++ b/src/ipa/rpi/controller/rpi/agc.cpp\n> > @@ -5,20 +5,12 @@\n> >   * agc.cpp - AGC/AEC control algorithm\n> >   */\n> >\n> > -#include <algorithm>\n> > -#include <map>\n> > -#include <tuple>\n> > +#include \"agc.h\"\n> >\n> >  #include <libcamera/base/log.h>\n> >\n> > -#include \"../awb_status.h\"\n> > -#include \"../device_status.h\"\n> > -#include \"../histogram.h\"\n> > -#include \"../lux_status.h\"\n> >  #include \"../metadata.h\"\n> >\n> > -#include \"agc.h\"\n> > -\n> >  using namespace RPiController;\n> >  using namespace libcamera;\n> >  using libcamera::utils::Duration;\n> > @@ -28,881 +20,205 @@ LOG_DEFINE_CATEGORY(RPiAgc)\n> >\n> >  #define NAME \"rpi.agc\"\n> >\n> > -int AgcMeteringMode::read(const libcamera::YamlObject &params)\n> > +Agc::Agc(Controller *controller)\n> > +     : AgcAlgorithm(controller),\n> > +       activeChannels_({ 0 })\n> >  {\n> > -     const YamlObject &yamlWeights = params[\"weights\"];\n> > -\n> > -     for (const auto &p : yamlWeights.asList()) {\n> > -             auto value = p.get<double>();\n> > -             if (!value)\n> > -                     return -EINVAL;\n> > -             weights.push_back(*value);\n> > -     }\n> > -\n> > -     return 0;\n> >  }\n> >\n> > -static std::tuple<int, std::string>\n> > -readMeteringModes(std::map<std::string, AgcMeteringMode> &metering_modes,\n> > -               const libcamera::YamlObject &params)\n> > +char const *Agc::name() const\n> >  {\n> > -     std::string first;\n> > -     int ret;\n> > -\n> > -     for (const auto 
&[key, value] : params.asDict()) {\n> > -             AgcMeteringMode meteringMode;\n> > -             ret = meteringMode.read(value);\n> > -             if (ret)\n> > -                     return { ret, {} };\n> > -\n> > -             metering_modes[key] = std::move(meteringMode);\n> > -             if (first.empty())\n> > -                     first = key;\n> > -     }\n> > -\n> > -     return { 0, first };\n> > +     return NAME;\n> >  }\n> >\n> > -int AgcExposureMode::read(const libcamera::YamlObject &params)\n> > +int Agc::read(const libcamera::YamlObject &params)\n> >  {\n> > -     auto value = params[\"shutter\"].getList<double>();\n> > -     if (!value)\n> > -             return -EINVAL;\n> > -     std::transform(value->begin(), value->end(), std::back_inserter(shutter),\n> > -                    [](double v) { return v * 1us; });\n> > -\n> > -     value = params[\"gain\"].getList<double>();\n> > -     if (!value)\n> > -             return -EINVAL;\n> > -     gain = std::move(*value);\n> > -\n> > -     if (shutter.size() < 2 || gain.size() < 2) {\n> > -             LOG(RPiAgc, Error)\n> > -                     << \"AgcExposureMode: must have at least two entries in exposure profile\";\n> > -             return -EINVAL;\n> > -     }\n> > -\n> > -     if (shutter.size() != gain.size()) {\n> > -             LOG(RPiAgc, Error)\n> > -                     << \"AgcExposureMode: expect same number of exposure and gain entries in exposure profile\";\n> > -             return -EINVAL;\n> > +     /*\n> > +      * When there is only a single channel we can read the old style syntax.\n> > +      * Otherwise we expect a \"channels\" keyword followed by a list of configurations.\n> > +      */\n> > +     if (!params.contains(\"channels\")) {\n> > +             LOG(RPiAgc, Debug) << \"Single channel only\";\n> > +             channelData_.emplace_back();\n> > +             return channelData_.back().channel.read(params, getHardwareConfig());\n> >       }\n> >\n> > -     
return 0;\n> > -}\n> > -\n> > -static std::tuple<int, std::string>\n> > -readExposureModes(std::map<std::string, AgcExposureMode> &exposureModes,\n> > -               const libcamera::YamlObject &params)\n> > -{\n> > -     std::string first;\n> > -     int ret;\n> > -\n> > -     for (const auto &[key, value] : params.asDict()) {\n> > -             AgcExposureMode exposureMode;\n> > -             ret = exposureMode.read(value);\n> > +     const auto &channels = params[\"channels\"].asList();\n> > +     for (auto ch = channels.begin(); ch != channels.end(); ch++) {\n> > +             LOG(RPiAgc, Debug) << \"Read AGC channel\";\n> > +             channelData_.emplace_back();\n> > +             int ret = channelData_.back().channel.read(*ch, getHardwareConfig());\n> >               if (ret)\n> > -                     return { ret, {} };\n> > -\n> > -             exposureModes[key] = std::move(exposureMode);\n> > -             if (first.empty())\n> > -                     first = key;\n> > +                     return ret;\n> >       }\n> >\n> > -     return { 0, first };\n> > -}\n> > -\n> > -int AgcConstraint::read(const libcamera::YamlObject &params)\n> > -{\n> > -     std::string boundString = params[\"bound\"].get<std::string>(\"\");\n> > -     transform(boundString.begin(), boundString.end(),\n> > -               boundString.begin(), ::toupper);\n> > -     if (boundString != \"UPPER\" && boundString != \"LOWER\") {\n> > -             LOG(RPiAgc, Error) << \"AGC constraint type should be UPPER or LOWER\";\n> > -             return -EINVAL;\n> > +     LOG(RPiAgc, Debug) << \"Read \" << channelData_.size() << \" channel(s)\";\n> > +     if (channelData_.empty()) {\n> > +             LOG(RPiAgc, Error) << \"No AGC channels provided\";\n> > +             return -1;\n> >       }\n> > -     bound = boundString == \"UPPER\" ? 
Bound::UPPER : Bound::LOWER;\n> > -\n> > -     auto value = params[\"q_lo\"].get<double>();\n> > -     if (!value)\n> > -             return -EINVAL;\n> > -     qLo = *value;\n> > -\n> > -     value = params[\"q_hi\"].get<double>();\n> > -     if (!value)\n> > -             return -EINVAL;\n> > -     qHi = *value;\n> > -\n> > -     return yTarget.read(params[\"y_target\"]);\n> > -}\n> >\n> > -static std::tuple<int, AgcConstraintMode>\n> > -readConstraintMode(const libcamera::YamlObject &params)\n> > -{\n> > -     AgcConstraintMode mode;\n> > -     int ret;\n> > -\n> > -     for (const auto &p : params.asList()) {\n> > -             AgcConstraint constraint;\n> > -             ret = constraint.read(p);\n> > -             if (ret)\n> > -                     return { ret, {} };\n> > -\n> > -             mode.push_back(std::move(constraint));\n> > -     }\n> > -\n> > -     return { 0, mode };\n> > +     return 0;\n> >  }\n> >\n> > -static std::tuple<int, std::string>\n> > -readConstraintModes(std::map<std::string, AgcConstraintMode> &constraintModes,\n> > -                 const libcamera::YamlObject &params)\n> > +int Agc::checkChannel(unsigned int channelIndex) const\n> >  {\n> > -     std::string first;\n> > -     int ret;\n> > -\n> > -     for (const auto &[key, value] : params.asDict()) {\n> > -             std::tie(ret, constraintModes[key]) = readConstraintMode(value);\n> > -             if (ret)\n> > -                     return { ret, {} };\n> > -\n> > -             if (first.empty())\n> > -                     first = key;\n> > +     if (channelIndex >= channelData_.size()) {\n> > +             LOG(RPiAgc, Warning) << \"AGC channel \" << channelIndex << \" not available\";\n> > +             return -1;\n> >       }\n> >\n> > -     return { 0, first };\n> > -}\n> > -\n> > -int AgcConfig::read(const libcamera::YamlObject &params)\n> > -{\n> > -     LOG(RPiAgc, Debug) << \"AgcConfig\";\n> > -     int ret;\n> > -\n> > -     std::tie(ret, defaultMeteringMode) =\n> 
> -             readMeteringModes(meteringModes, params[\"metering_modes\"]);\n> > -     if (ret)\n> > -             return ret;\n> > -     std::tie(ret, defaultExposureMode) =\n> > -             readExposureModes(exposureModes, params[\"exposure_modes\"]);\n> > -     if (ret)\n> > -             return ret;\n> > -     std::tie(ret, defaultConstraintMode) =\n> > -             readConstraintModes(constraintModes, params[\"constraint_modes\"]);\n> > -     if (ret)\n> > -             return ret;\n> > -\n> > -     ret = yTarget.read(params[\"y_target\"]);\n> > -     if (ret)\n> > -             return ret;\n> > -\n> > -     speed = params[\"speed\"].get<double>(0.2);\n> > -     startupFrames = params[\"startup_frames\"].get<uint16_t>(10);\n> > -     convergenceFrames = params[\"convergence_frames\"].get<unsigned int>(6);\n> > -     fastReduceThreshold = params[\"fast_reduce_threshold\"].get<double>(0.4);\n> > -     baseEv = params[\"base_ev\"].get<double>(1.0);\n> > -\n> > -     /* Start with quite a low value as ramping up is easier than ramping down. 
*/\n> > -     defaultExposureTime = params[\"default_exposure_time\"].get<double>(1000) * 1us;\n> > -     defaultAnalogueGain = params[\"default_analogue_gain\"].get<double>(1.0);\n> > -\n> >       return 0;\n> >  }\n> >\n> > -Agc::ExposureValues::ExposureValues()\n> > -     : shutter(0s), analogueGain(0),\n> > -       totalExposure(0s), totalExposureNoDG(0s)\n> > +void Agc::disableAuto(unsigned int channelIndex)\n> >  {\n> > -}\n> > -\n> > -Agc::Agc(Controller *controller)\n> > -     : AgcAlgorithm(controller), meteringMode_(nullptr),\n> > -       exposureMode_(nullptr), constraintMode_(nullptr),\n> > -       frameCount_(0), lockCount_(0),\n> > -       lastTargetExposure_(0s), ev_(1.0), flickerPeriod_(0s),\n> > -       maxShutter_(0s), fixedShutter_(0s), fixedAnalogueGain_(0.0)\n> > -{\n> > -     memset(&awb_, 0, sizeof(awb_));\n> > -     /*\n> > -      * Setting status_.totalExposureValue_ to zero initially tells us\n> > -      * it's not been calculated yet (i.e. Process hasn't yet run).\n> > -      */\n> > -     status_ = {};\n> > -     status_.ev = ev_;\n> > -}\n> > +     if (checkChannel(channelIndex))\n> > +             return;\n> >\n> > -char const *Agc::name() const\n> > -{\n> > -     return NAME;\n> > +     LOG(RPiAgc, Debug) << \"disableAuto for channel \" << channelIndex;\n> > +     channelData_[channelIndex].channel.disableAuto();\n> >  }\n> >\n> > -int Agc::read(const libcamera::YamlObject &params)\n> > +void Agc::enableAuto(unsigned int channelIndex)\n> >  {\n> > -     LOG(RPiAgc, Debug) << \"Agc\";\n> > -\n> > -     int ret = config_.read(params);\n> > -     if (ret)\n> > -             return ret;\n> > -\n> > -     const Size &size = getHardwareConfig().agcZoneWeights;\n> > -     for (auto const &modes : config_.meteringModes) {\n> > -             if (modes.second.weights.size() != size.width * size.height) {\n> > -                     LOG(RPiAgc, Error) << \"AgcMeteringMode: Incorrect number of weights\";\n> > -                     return 
-EINVAL;\n> > -             }\n> > -     }\n> > +     if (checkChannel(channelIndex))\n> > +             return;\n> >\n> > -     /*\n> > -      * Set the config's defaults (which are the first ones it read) as our\n> > -      * current modes, until someone changes them.  (they're all known to\n> > -      * exist at this point)\n> > -      */\n> > -     meteringModeName_ = config_.defaultMeteringMode;\n> > -     meteringMode_ = &config_.meteringModes[meteringModeName_];\n> > -     exposureModeName_ = config_.defaultExposureMode;\n> > -     exposureMode_ = &config_.exposureModes[exposureModeName_];\n> > -     constraintModeName_ = config_.defaultConstraintMode;\n> > -     constraintMode_ = &config_.constraintModes[constraintModeName_];\n> > -     /* Set up the \"last shutter/gain\" values, in case AGC starts \"disabled\". */\n> > -     status_.shutterTime = config_.defaultExposureTime;\n> > -     status_.analogueGain = config_.defaultAnalogueGain;\n> > -     return 0;\n> > -}\n> > -\n> > -void Agc::disableAuto()\n> > -{\n> > -     fixedShutter_ = status_.shutterTime;\n> > -     fixedAnalogueGain_ = status_.analogueGain;\n> > -}\n> > -\n> > -void Agc::enableAuto()\n> > -{\n> > -     fixedShutter_ = 0s;\n> > -     fixedAnalogueGain_ = 0;\n> > +     LOG(RPiAgc, Debug) << \"enableAuto for channel \" << channelIndex;\n> > +     channelData_[channelIndex].channel.enableAuto();\n> >  }\n> >\n> >  unsigned int Agc::getConvergenceFrames() const\n> >  {\n> > -     /*\n> > -      * If shutter and gain have been explicitly set, there is no\n> > -      * convergence to happen, so no need to drop any frames - return zero.\n> > -      */\n> > -     if (fixedShutter_ && fixedAnalogueGain_)\n> > -             return 0;\n> > -     else\n> > -             return config_.convergenceFrames;\n> > +     /* If there are n channels, it presumably takes n times as long to converge. 
*/\n> > +     return channelData_[0].channel.getConvergenceFrames() * activeChannels_.size();\n> >  }\n> >\n> >  std::vector<double> const &Agc::getWeights() const\n> >  {\n> >       /*\n> > -      * In case someone calls setMeteringMode and then this before the\n> > -      * algorithm has run and updated the meteringMode_ pointer.\n> > +      * In future the metering weights may be determined differently, making it\n> > +      * difficult to associate different sets of weight with different channels.\n> > +      * Therefore we shall impose a limitation, at least for now, that all\n> > +      * channels will use the same weights.\n> >        */\n> > -     auto it = config_.meteringModes.find(meteringModeName_);\n> > -     if (it == config_.meteringModes.end())\n> > -             return meteringMode_->weights;\n> > -     return it->second.weights;\n> > +     return channelData_[0].channel.getWeights();\n> >  }\n> >\n> > -void Agc::setEv(double ev)\n> > +void Agc::setEv(unsigned int channelIndex, double ev)\n> >  {\n> > -     ev_ = ev;\n> > -}\n> > +     if (checkChannel(channelIndex))\n> > +             return;\n> >\n> > -void Agc::setFlickerPeriod(Duration flickerPeriod)\n> > -{\n> > -     flickerPeriod_ = flickerPeriod;\n> > +     LOG(RPiAgc, Debug) << \"setEv \" << ev << \" for channel \" << channelIndex;\n> > +     channelData_[channelIndex].channel.setEv(ev);\n> >  }\n> >\n> > -void Agc::setMaxShutter(Duration maxShutter)\n> > +void Agc::setFlickerPeriod(unsigned int channelIndex, Duration flickerPeriod)\n> >  {\n> > -     maxShutter_ = maxShutter;\n> > -}\n> > +     if (checkChannel(channelIndex))\n> > +             return;\n> >\n> > -void Agc::setFixedShutter(Duration fixedShutter)\n> > -{\n> > -     fixedShutter_ = fixedShutter;\n> > -     /* Set this in case someone calls disableAuto() straight after. 
*/\n> > -     status_.shutterTime = limitShutter(fixedShutter_);\n> > +     LOG(RPiAgc, Debug) << \"setFlickerPeriod \" << flickerPeriod\n> > +                        << \" for channel \" << channelIndex;\n> > +     channelData_[channelIndex].channel.setFlickerPeriod(flickerPeriod);\n> >  }\n> >\n> > -void Agc::setFixedAnalogueGain(double fixedAnalogueGain)\n> > -{\n> > -     fixedAnalogueGain_ = fixedAnalogueGain;\n> > -     /* Set this in case someone calls disableAuto() straight after. */\n> > -     status_.analogueGain = limitGain(fixedAnalogueGain);\n> > -}\n> > -\n> > -void Agc::setMeteringMode(std::string const &meteringModeName)\n> > -{\n> > -     meteringModeName_ = meteringModeName;\n> > -}\n> > -\n> > -void Agc::setExposureMode(std::string const &exposureModeName)\n> > -{\n> > -     exposureModeName_ = exposureModeName;\n> > -}\n> > -\n> > -void Agc::setConstraintMode(std::string const &constraintModeName)\n> > -{\n> > -     constraintModeName_ = constraintModeName;\n> > -}\n> > -\n> > -void Agc::switchMode(CameraMode const &cameraMode,\n> > -                  Metadata *metadata)\n> > +void Agc::setMaxShutter(Duration maxShutter)\n> >  {\n> > -     /* AGC expects the mode sensitivity always to be non-zero. */\n> > -     ASSERT(cameraMode.sensitivity);\n> > -\n> > -     housekeepConfig();\n> > -\n> > -     /*\n> > -      * Store the mode in the local state. We must cache the sensitivity of\n> > -      * of the previous mode for the calculations below.\n> > -      */\n> > -     double lastSensitivity = mode_.sensitivity;\n> > -     mode_ = cameraMode;\n> > -\n> > -     Duration fixedShutter = limitShutter(fixedShutter_);\n> > -     if (fixedShutter && fixedAnalogueGain_) {\n> > -             /* We're going to reset the algorithm here with these fixed values. 
*/\n> > -\n> > -             fetchAwbStatus(metadata);\n> > -             double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> > -             ASSERT(minColourGain != 0.0);\n> > -\n> > -             /* This is the equivalent of computeTargetExposure and applyDigitalGain. */\n> > -             target_.totalExposureNoDG = fixedShutter_ * fixedAnalogueGain_;\n> > -             target_.totalExposure = target_.totalExposureNoDG / minColourGain;\n> > -\n> > -             /* Equivalent of filterExposure. This resets any \"history\". */\n> > -             filtered_ = target_;\n> > -\n> > -             /* Equivalent of divideUpExposure. */\n> > -             filtered_.shutter = fixedShutter;\n> > -             filtered_.analogueGain = fixedAnalogueGain_;\n> > -     } else if (status_.totalExposureValue) {\n> > -             /*\n> > -              * On a mode switch, various things could happen:\n> > -              * - the exposure profile might change\n> > -              * - a fixed exposure or gain might be set\n> > -              * - the new mode's sensitivity might be different\n> > -              * We cope with the last of these by scaling the target values. After\n> > -              * that we just need to re-divide the exposure/gain according to the\n> > -              * current exposure profile, which takes care of everything else.\n> > -              */\n> > -\n> > -             double ratio = lastSensitivity / cameraMode.sensitivity;\n> > -             target_.totalExposureNoDG *= ratio;\n> > -             target_.totalExposure *= ratio;\n> > -             filtered_.totalExposureNoDG *= ratio;\n> > -             filtered_.totalExposure *= ratio;\n> > -\n> > -             divideUpExposure();\n> > -     } else {\n> > -             /*\n> > -              * We come through here on startup, when at least one of the shutter\n> > -              * or gain has not been fixed. 
We must still write those values out so\n> > -              * that they will be applied immediately. We supply some arbitrary defaults\n> > -              * for any that weren't set.\n> > -              */\n> > -\n> > -             /* Equivalent of divideUpExposure. */\n> > -             filtered_.shutter = fixedShutter ? fixedShutter : config_.defaultExposureTime;\n> > -             filtered_.analogueGain = fixedAnalogueGain_ ? fixedAnalogueGain_ : config_.defaultAnalogueGain;\n> > -     }\n> > -\n> > -     writeAndFinish(metadata, false);\n> > +     /* Frame durations will be the same across all channels too. */\n> > +     for (auto &data : channelData_)\n> > +             data.channel.setMaxShutter(maxShutter);\n> >  }\n> >\n> > -void Agc::prepare(Metadata *imageMetadata)\n> > +void Agc::setFixedShutter(unsigned int channelIndex, Duration fixedShutter)\n> >  {\n> > -     Duration totalExposureValue = status_.totalExposureValue;\n> > -     AgcStatus delayedStatus;\n> > -     AgcPrepareStatus prepareStatus;\n> > -\n> > -     if (!imageMetadata->get(\"agc.delayed_status\", delayedStatus))\n> > -             totalExposureValue = delayedStatus.totalExposureValue;\n> > -\n> > -     prepareStatus.digitalGain = 1.0;\n> > -     prepareStatus.locked = false;\n> > -\n> > -     if (status_.totalExposureValue) {\n> > -             /* Process has run, so we have meaningful values. 
*/\n> > -             DeviceStatus deviceStatus;\n> > -             if (imageMetadata->get(\"device.status\", deviceStatus) == 0) {\n> > -                     Duration actualExposure = deviceStatus.shutterSpeed *\n> > -                                               deviceStatus.analogueGain;\n> > -                     if (actualExposure) {\n> > -                             double digitalGain = totalExposureValue / actualExposure;\n> > -                             LOG(RPiAgc, Debug) << \"Want total exposure \" << totalExposureValue;\n> > -                             /*\n> > -                              * Never ask for a gain < 1.0, and also impose\n> > -                              * some upper limit. Make it customisable?\n> > -                              */\n> > -                             prepareStatus.digitalGain = std::max(1.0, std::min(digitalGain, 4.0));\n> > -                             LOG(RPiAgc, Debug) << \"Actual exposure \" << actualExposure;\n> > -                             LOG(RPiAgc, Debug) << \"Use digitalGain \" << prepareStatus.digitalGain;\n> > -                             LOG(RPiAgc, Debug) << \"Effective exposure \"\n> > -                                                << actualExposure * prepareStatus.digitalGain;\n> > -                             /* Decide whether AEC/AGC has converged. 
*/\n> > -                             prepareStatus.locked = updateLockStatus(deviceStatus);\n> > -                     }\n> > -             } else\n> > -                     LOG(RPiAgc, Warning) << name() << \": no device metadata\";\n> > -             imageMetadata->set(\"agc.prepare_status\", prepareStatus);\n> > -     }\n> > -}\n> > +     if (checkChannel(channelIndex))\n> > +             return;\n> >\n> > -void Agc::process(StatisticsPtr &stats, Metadata *imageMetadata)\n> > -{\n> > -     frameCount_++;\n> > -     /*\n> > -      * First a little bit of housekeeping, fetching up-to-date settings and\n> > -      * configuration, that kind of thing.\n> > -      */\n> > -     housekeepConfig();\n> > -     /* Fetch the AWB status immediately, so that we can assume it's there. */\n> > -     fetchAwbStatus(imageMetadata);\n> > -     /* Get the current exposure values for the frame that's just arrived. */\n> > -     fetchCurrentExposure(imageMetadata);\n> > -     /* Compute the total gain we require relative to the current exposure. */\n> > -     double gain, targetY;\n> > -     computeGain(stats, imageMetadata, gain, targetY);\n> > -     /* Now compute the target (final) exposure which we think we want. */\n> > -     computeTargetExposure(gain);\n> > -     /* The results have to be filtered so as not to change too rapidly. */\n> > -     filterExposure();\n> > -     /*\n> > -      * Some of the exposure has to be applied as digital gain, so work out\n> > -      * what that is. This function also tells us whether it's decided to\n> > -      * \"desaturate\" the image more quickly.\n> > -      */\n> > -     bool desaturate = applyDigitalGain(gain, targetY);\n> > -     /*\n> > -      * The last thing is to divide up the exposure value into a shutter time\n> > -      * and analogue gain, according to the current exposure mode.\n> > -      */\n> > -     divideUpExposure();\n> > -     /* Finally advertise what we've done. 
*/\n> > -     writeAndFinish(imageMetadata, desaturate);\n> > +     LOG(RPiAgc, Debug) << \"setFixedShutter \" << fixedShutter\n> > +                        << \" for channel \" << channelIndex;\n> > +     channelData_[channelIndex].channel.setFixedShutter(fixedShutter);\n> >  }\n> >\n> > -bool Agc::updateLockStatus(DeviceStatus const &deviceStatus)\n> > +void Agc::setFixedAnalogueGain(unsigned int channelIndex, double fixedAnalogueGain)\n> >  {\n> > -     const double errorFactor = 0.10; /* make these customisable? */\n> > -     const int maxLockCount = 5;\n> > -     /* Reset \"lock count\" when we exceed this multiple of errorFactor */\n> > -     const double resetMargin = 1.5;\n> > +     if (checkChannel(channelIndex))\n> > +             return;\n> >\n> > -     /* Add 200us to the exposure time error to allow for line quantisation. */\n> > -     Duration exposureError = lastDeviceStatus_.shutterSpeed * errorFactor + 200us;\n> > -     double gainError = lastDeviceStatus_.analogueGain * errorFactor;\n> > -     Duration targetError = lastTargetExposure_ * errorFactor;\n> > -\n> > -     /*\n> > -      * Note that we don't know the exposure/gain limits of the sensor, so\n> > -      * the values we keep requesting may be unachievable. 
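[Editor's note] The lock detection in updateLockStatus() is a saturating counter with hysteresis: a frame within the errorFactor band of the previous one bumps the count (AE reports "locked" once it reaches maxLockCount), a frame beyond resetMargin times that band resets it, and anything in between holds the count. The real code compares shutter speed, analogue gain and target exposure together; a reduced model with one scalar standing in for all three:

```cpp
#include <algorithm>
#include <cmath>

/*
 * Reduced model of Agc::updateLockStatus() using the constants from the
 * quoted code. One scalar measurement stands in for the shutter/gain/
 * target-exposure triple the real code checks.
 */
int updateLockCount(int lockCount, double current, double previous)
{
	const double errorFactor = 0.10; /* within +/-10%: converging */
	const int maxLockCount = 5;      /* locked once we reach this */
	const double resetMargin = 1.5;  /* beyond 1.5x the band: reset */
	double error = previous * errorFactor;

	if (std::abs(current - previous) < error)
		return std::min(lockCount + 1, maxLockCount);
	else if (std::abs(current - previous) > resetMargin * error)
		return 0;
	return lockCount; /* dead band: hold the current count */
}
```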
For this reason\n> > -      * we only insist that we're close to values in the past few frames.\n> > -      */\n> > -     if (deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed - exposureError &&\n> > -         deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed + exposureError &&\n> > -         deviceStatus.analogueGain > lastDeviceStatus_.analogueGain - gainError &&\n> > -         deviceStatus.analogueGain < lastDeviceStatus_.analogueGain + gainError &&\n> > -         status_.targetExposureValue > lastTargetExposure_ - targetError &&\n> > -         status_.targetExposureValue < lastTargetExposure_ + targetError)\n> > -             lockCount_ = std::min(lockCount_ + 1, maxLockCount);\n> > -     else if (deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed - resetMargin * exposureError ||\n> > -              deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed + resetMargin * exposureError ||\n> > -              deviceStatus.analogueGain < lastDeviceStatus_.analogueGain - resetMargin * gainError ||\n> > -              deviceStatus.analogueGain > lastDeviceStatus_.analogueGain + resetMargin * gainError ||\n> > -              status_.targetExposureValue < lastTargetExposure_ - resetMargin * targetError ||\n> > -              status_.targetExposureValue > lastTargetExposure_ + resetMargin * targetError)\n> > -             lockCount_ = 0;\n> > -\n> > -     lastDeviceStatus_ = deviceStatus;\n> > -     lastTargetExposure_ = status_.targetExposureValue;\n> > -\n> > -     LOG(RPiAgc, Debug) << \"Lock count updated to \" << lockCount_;\n> > -     return lockCount_ == maxLockCount;\n> > +     LOG(RPiAgc, Debug) << \"setFixedAnalogueGain \" << fixedAnalogueGain\n> > +                        << \" for channel \" << channelIndex;\n> > +     channelData_[channelIndex].channel.setFixedAnalogueGain(fixedAnalogueGain);\n> >  }\n> >\n> > -void Agc::housekeepConfig()\n> > +void Agc::setMeteringMode(std::string const &meteringModeName)\n> >  {\n> > -     /* 
First fetch all the up-to-date settings, so no one else has to do it. */\n> > -     status_.ev = ev_;\n> > -     status_.fixedShutter = limitShutter(fixedShutter_);\n> > -     status_.fixedAnalogueGain = fixedAnalogueGain_;\n> > -     status_.flickerPeriod = flickerPeriod_;\n> > -     LOG(RPiAgc, Debug) << \"ev \" << status_.ev << \" fixedShutter \"\n> > -                        << status_.fixedShutter << \" fixedAnalogueGain \"\n> > -                        << status_.fixedAnalogueGain;\n> > -     /*\n> > -      * Make sure the \"mode\" pointers point to the up-to-date things, if\n> > -      * they've changed.\n> > -      */\n> > -     if (meteringModeName_ != status_.meteringMode) {\n> > -             auto it = config_.meteringModes.find(meteringModeName_);\n> > -             if (it == config_.meteringModes.end()) {\n> > -                     LOG(RPiAgc, Warning) << \"No metering mode \" << meteringModeName_;\n> > -                     meteringModeName_ = status_.meteringMode;\n> > -             } else {\n> > -                     meteringMode_ = &it->second;\n> > -                     status_.meteringMode = meteringModeName_;\n> > -             }\n> > -     }\n> > -     if (exposureModeName_ != status_.exposureMode) {\n> > -             auto it = config_.exposureModes.find(exposureModeName_);\n> > -             if (it == config_.exposureModes.end()) {\n> > -                     LOG(RPiAgc, Warning) << \"No exposure profile \" << exposureModeName_;\n> > -                     exposureModeName_ = status_.exposureMode;\n> > -             } else {\n> > -                     exposureMode_ = &it->second;\n> > -                     status_.exposureMode = exposureModeName_;\n> > -             }\n> > -     }\n> > -     if (constraintModeName_ != status_.constraintMode) {\n> > -             auto it = config_.constraintModes.find(constraintModeName_);\n> > -             if (it == config_.constraintModes.end()) {\n> > -                     LOG(RPiAgc, Warning) << \"No 
constraint list \" << constraintModeName_;\n> > -                     constraintModeName_ = status_.constraintMode;\n> > -             } else {\n> > -                     constraintMode_ = &it->second;\n> > -                     status_.constraintMode = constraintModeName_;\n> > -             }\n> > -     }\n> > -     LOG(RPiAgc, Debug) << \"exposureMode \"\n> > -                        << exposureModeName_ << \" constraintMode \"\n> > -                        << constraintModeName_ << \" meteringMode \"\n> > -                        << meteringModeName_;\n> > +     /* Metering modes will be the same across all channels too. */\n> > +     for (auto &data : channelData_)\n> > +             data.channel.setMeteringMode(meteringModeName);\n> >  }\n> >\n> > -void Agc::fetchCurrentExposure(Metadata *imageMetadata)\n> > +void Agc::setExposureMode(unsigned int channelIndex, std::string const &exposureModeName)\n> >  {\n> > -     std::unique_lock<Metadata> lock(*imageMetadata);\n> > -     DeviceStatus *deviceStatus =\n> > -             imageMetadata->getLocked<DeviceStatus>(\"device.status\");\n> > -     if (!deviceStatus)\n> > -             LOG(RPiAgc, Fatal) << \"No device metadata\";\n> > -     current_.shutter = deviceStatus->shutterSpeed;\n> > -     current_.analogueGain = deviceStatus->analogueGain;\n> > -     AgcStatus *agcStatus =\n> > -             imageMetadata->getLocked<AgcStatus>(\"agc.status\");\n> > -     current_.totalExposure = agcStatus ? 
agcStatus->totalExposureValue : 0s;\n> > -     current_.totalExposureNoDG = current_.shutter * current_.analogueGain;\n> > -}\n> > +     if (checkChannel(channelIndex))\n> > +             return;\n> >\n> > -void Agc::fetchAwbStatus(Metadata *imageMetadata)\n> > -{\n> > -     awb_.gainR = 1.0; /* in case not found in metadata */\n> > -     awb_.gainG = 1.0;\n> > -     awb_.gainB = 1.0;\n> > -     if (imageMetadata->get(\"awb.status\", awb_) != 0)\n> > -             LOG(RPiAgc, Debug) << \"No AWB status found\";\n> > +     LOG(RPiAgc, Debug) << \"setExposureMode \" << exposureModeName\n> > +                        << \" for channel \" << channelIndex;\n> > +     channelData_[channelIndex].channel.setExposureMode(exposureModeName);\n> >  }\n> >\n> > -static double computeInitialY(StatisticsPtr &stats, AwbStatus const &awb,\n> > -                           std::vector<double> &weights, double gain)\n> > +void Agc::setConstraintMode(unsigned int channelIndex, std::string const &constraintModeName)\n> >  {\n> > -     constexpr uint64_t maxVal = 1 << Statistics::NormalisationFactorPow2;\n> > +     if (checkChannel(channelIndex))\n> > +             return;\n> >\n> > -     ASSERT(weights.size() == stats->agcRegions.numRegions());\n> > -\n> > -     /*\n> > -      * Note that the weights are applied by the IPA to the statistics directly,\n> > -      * before they are given to us here.\n> > -      */\n> > -     double rSum = 0, gSum = 0, bSum = 0, pixelSum = 0;\n> > -     for (unsigned int i = 0; i < stats->agcRegions.numRegions(); i++) {\n> > -             auto &region = stats->agcRegions.get(i);\n> > -             rSum += std::min<double>(region.val.rSum * gain, (maxVal - 1) * region.counted);\n> > -             gSum += std::min<double>(region.val.gSum * gain, (maxVal - 1) * region.counted);\n> > -             bSum += std::min<double>(region.val.bSum * gain, (maxVal - 1) * region.counted);\n> > -             pixelSum += region.counted;\n> > -     }\n> > -     if (pixelSum == 
0.0) {\n> > -             LOG(RPiAgc, Warning) << \"computeInitialY: pixelSum is zero\";\n> > -             return 0;\n> > -     }\n> > -     double ySum = rSum * awb.gainR * .299 +\n> > -                   gSum * awb.gainG * .587 +\n> > -                   bSum * awb.gainB * .114;\n> > -     return ySum / pixelSum / maxVal;\n> > +     channelData_[channelIndex].channel.setConstraintMode(constraintModeName);\n> >  }\n> >\n> > -/*\n> > - * We handle extra gain through EV by adjusting our Y targets. However, you\n> > - * simply can't monitor histograms once they get very close to (or beyond!)\n> > - * saturation, so we clamp the Y targets to this value. It does mean that EV\n> > - * increases don't necessarily do quite what you might expect in certain\n> > - * (contrived) cases.\n> > - */\n> > -\n> > -static constexpr double EvGainYTargetLimit = 0.9;\n> > -\n> > -static double constraintComputeGain(AgcConstraint &c, const Histogram &h, double lux,\n> > -                                 double evGain, double &targetY)\n> > +template<typename T>\n> > +std::ostream &operator<<(std::ostream &os, const std::vector<T> &v)\n> >  {\n> > -     targetY = c.yTarget.eval(c.yTarget.domain().clip(lux));\n> > -     targetY = std::min(EvGainYTargetLimit, targetY * evGain);\n> > -     double iqm = h.interQuantileMean(c.qLo, c.qHi);\n> > -     return (targetY * h.bins()) / iqm;\n> > +     os << \"{\";\n> > +     for (const auto &e : v)\n> > +             os << \" \" << e;\n> > +     os << \" }\";\n> > +     return os;\n> >  }\n> >\n> > -void Agc::computeGain(StatisticsPtr &statistics, Metadata *imageMetadata,\n> > -                   double &gain, double &targetY)\n> > +void Agc::setActiveChannels(const std::vector<unsigned int> &activeChannels)\n> >  {\n> > -     struct LuxStatus lux = {};\n> > -     lux.lux = 400; /* default lux level to 400 in case no metadata found */\n> > -     if (imageMetadata->get(\"lux.status\", lux) != 0)\n> > -             LOG(RPiAgc, Warning) << \"No lux 
level found\";\n> > -     const Histogram &h = statistics->yHist;\n> > -     double evGain = status_.ev * config_.baseEv;\n> > -     /*\n> > -      * The initial gain and target_Y come from some of the regions. After\n> > -      * that we consider the histogram constraints.\n> > -      */\n> > -     targetY = config_.yTarget.eval(config_.yTarget.domain().clip(lux.lux));\n> > -     targetY = std::min(EvGainYTargetLimit, targetY * evGain);\n> > -\n> > -     /*\n> > -      * Do this calculation a few times as brightness increase can be\n> > -      * non-linear when there are saturated regions.\n> > -      */\n> > -     gain = 1.0;\n> > -     for (int i = 0; i < 8; i++) {\n> > -             double initialY = computeInitialY(statistics, awb_, meteringMode_->weights, gain);\n> > -             double extraGain = std::min(10.0, targetY / (initialY + .001));\n> > -             gain *= extraGain;\n> > -             LOG(RPiAgc, Debug) << \"Initial Y \" << initialY << \" target \" << targetY\n> > -                                << \" gives gain \" << gain;\n> > -             if (extraGain < 1.01) /* close enough */\n> > -                     break;\n> > -     }\n> > -\n> > -     for (auto &c : *constraintMode_) {\n> > -             double newTargetY;\n> > -             double newGain = constraintComputeGain(c, h, lux.lux, evGain, newTargetY);\n> > -             LOG(RPiAgc, Debug) << \"Constraint has target_Y \"\n> > -                                << newTargetY << \" giving gain \" << newGain;\n> > -             if (c.bound == AgcConstraint::Bound::LOWER && newGain > gain) {\n> > -                     LOG(RPiAgc, Debug) << \"Lower bound constraint adopted\";\n> > -                     gain = newGain;\n> > -                     targetY = newTargetY;\n> > -             } else if (c.bound == AgcConstraint::Bound::UPPER && newGain < gain) {\n> > -                     LOG(RPiAgc, Debug) << \"Upper bound constraint adopted\";\n> > -                     gain = newGain;\n> > -       
              targetY = newTargetY;\n> > -             }\n> > +     if (activeChannels.empty()) {\n> > +             LOG(RPiAgc, Warning) << \"No active AGC channels supplied\";\n> > +             return;\n> >       }\n> > -     LOG(RPiAgc, Debug) << \"Final gain \" << gain << \" (target_Y \" << targetY << \" ev \"\n> > -                        << status_.ev << \" base_ev \" << config_.baseEv\n> > -                        << \")\";\n> > -}\n> > -\n> > -void Agc::computeTargetExposure(double gain)\n> > -{\n> > -     if (status_.fixedShutter && status_.fixedAnalogueGain) {\n> > -             /*\n> > -              * When ag and shutter are both fixed, we need to drive the\n> > -              * total exposure so that we end up with a digital gain of at least\n> > -              * 1/minColourGain. Otherwise we'd desaturate channels causing\n> > -              * white to go cyan or magenta.\n> > -              */\n> > -             double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> > -             ASSERT(minColourGain != 0.0);\n> > -             target_.totalExposure =\n> > -                     status_.fixedShutter * status_.fixedAnalogueGain / minColourGain;\n> > -     } else {\n> > -             /*\n> > -              * The statistics reflect the image without digital gain, so the final\n> > -              * total exposure we're aiming for is:\n> > -              */\n> > -             target_.totalExposure = current_.totalExposureNoDG * gain;\n> > -             /* The final target exposure is also limited to what the exposure mode allows. */\n> > -             Duration maxShutter = status_.fixedShutter\n> > -                                           ? 
status_.fixedShutter\n> > -                                           : exposureMode_->shutter.back();\n> > -             maxShutter = limitShutter(maxShutter);\n> > -             Duration maxTotalExposure =\n> > -                     maxShutter *\n> > -                     (status_.fixedAnalogueGain != 0.0\n> > -                              ? status_.fixedAnalogueGain\n> > -                              : exposureMode_->gain.back());\n> > -             target_.totalExposure = std::min(target_.totalExposure, maxTotalExposure);\n> > -     }\n> > -     LOG(RPiAgc, Debug) << \"Target totalExposure \" << target_.totalExposure;\n> > -}\n> >\n> > -bool Agc::applyDigitalGain(double gain, double targetY)\n> > -{\n> > -     double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> > -     ASSERT(minColourGain != 0.0);\n> > -     double dg = 1.0 / minColourGain;\n> > -     /*\n> > -      * I think this pipeline subtracts black level and rescales before we\n> > -      * get the stats, so no need to worry about it.\n> > -      */\n> > -     LOG(RPiAgc, Debug) << \"after AWB, target dg \" << dg << \" gain \" << gain\n> > -                        << \" target_Y \" << targetY;\n> > -     /*\n> > -      * Finally, if we're trying to reduce exposure but the target_Y is\n> > -      * \"close\" to 1.0, then the gain computed for that constraint will be\n> > -      * only slightly less than one, because the measured Y can never be\n> > -      * larger than 1.0. 
When this happens, demand a large digital gain so\n> > -      * that the exposure can be reduced, de-saturating the image much more\n> > -      * quickly (and we then approach the correct value more quickly from\n> > -      * below).\n> > -      */\n> > -     bool desaturate = targetY > config_.fastReduceThreshold &&\n> > -                       gain < sqrt(targetY);\n> > -     if (desaturate)\n> > -             dg /= config_.fastReduceThreshold;\n> > -     LOG(RPiAgc, Debug) << \"Digital gain \" << dg << \" desaturate? \" << desaturate;\n> > -     filtered_.totalExposureNoDG = filtered_.totalExposure / dg;\n> > -     LOG(RPiAgc, Debug) << \"Target totalExposureNoDG \" << filtered_.totalExposureNoDG;\n> > -     return desaturate;\n> > -}\n> > -\n> > -void Agc::filterExposure()\n> > -{\n> > -     double speed = config_.speed;\n> > -     /*\n> > -      * AGC adapts instantly if both shutter and gain are directly specified\n> > -      * or we're in the startup phase.\n> > -      */\n> > -     if ((status_.fixedShutter && status_.fixedAnalogueGain) ||\n> > -         frameCount_ <= config_.startupFrames)\n> > -             speed = 1.0;\n> > -     if (!filtered_.totalExposure) {\n> > -             filtered_.totalExposure = target_.totalExposure;\n> > -     } else {\n> > -             /*\n> > -              * If close to the result go faster, to save making so many\n> > -              * micro-adjustments on the way. 
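[Editor's note] The desaturation decision quoted above is a two-part predicate: the target Y must be near saturation (above fastReduceThreshold, a tuning-file value from AgcConfig; no particular value is assumed below) and the computed gain must be too timid a reduction, which the code detects by comparing it against sqrt(targetY). Isolated for reference:

```cpp
#include <cmath>

/*
 * The desaturate test from applyDigitalGain() in the quoted diff: when
 * the target Y is close to 1.0, the measured Y (which can never exceed
 * 1.0) yields a gain only slightly below 1.0, so exposure would shrink
 * very slowly. This predicate flags that case so extra digital gain can
 * be demanded instead. fastReduceThreshold is a tuning parameter; the
 * values in the tests are illustrative only.
 */
bool shouldDesaturate(double targetY, double gain, double fastReduceThreshold)
{
	return targetY > fastReduceThreshold && gain < std::sqrt(targetY);
}
```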
(Make this customisable?)\n> > -              */\n> > -             if (filtered_.totalExposure < 1.2 * target_.totalExposure &&\n> > -                 filtered_.totalExposure > 0.8 * target_.totalExposure)\n> > -                     speed = sqrt(speed);\n> > -             filtered_.totalExposure = speed * target_.totalExposure +\n> > -                                       filtered_.totalExposure * (1.0 - speed);\n> > -     }\n> > -     LOG(RPiAgc, Debug) << \"After filtering, totalExposure \" << filtered_.totalExposure\n> > -                        << \" no dg \" << filtered_.totalExposureNoDG;\n> > -}\n> > +     for (auto index : activeChannels)\n> > +             if (checkChannel(index))\n> > +                     return;\n> >\n> > -void Agc::divideUpExposure()\n> > -{\n> > -     /*\n> > -      * Sending the fixed shutter/gain cases through the same code may seem\n> > -      * unnecessary, but it will make more sense when extend this to cover\n> > -      * variable aperture.\n> > -      */\n> > -     Duration exposureValue = filtered_.totalExposureNoDG;\n> > -     Duration shutterTime;\n> > -     double analogueGain;\n> > -     shutterTime = status_.fixedShutter ? status_.fixedShutter\n> > -                                        : exposureMode_->shutter[0];\n> > -     shutterTime = limitShutter(shutterTime);\n> > -     analogueGain = status_.fixedAnalogueGain != 0.0 ? 
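[Editor's note] The smoothing in filterExposure() is a first-order IIR pull towards the target exposure, with the speed boosted to sqrt(speed) (larger, for speed < 1) once the filtered value is within the +/-20% band, so convergence doesn't tail off into endless micro-adjustments. A single step, with plain doubles in place of libcamera Durations:

```cpp
#include <cmath>

/*
 * One step of the exposure filter from the quoted filterExposure(): on
 * the first frame (filtered == 0) the target is adopted directly; within
 * +/-20% of the target the speed is raised to sqrt(speed); otherwise a
 * plain exponential blend towards the target is applied.
 */
double filterExposureStep(double filtered, double target, double speed)
{
	if (filtered == 0.0)
		return target; /* startup: adopt the target immediately */
	if (filtered < 1.2 * target && filtered > 0.8 * target)
		speed = std::sqrt(speed); /* close enough: go faster */
	return speed * target + filtered * (1.0 - speed);
}
```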
status_.fixedAnalogueGain\n> > -                                                     : exposureMode_->gain[0];\n> > -     analogueGain = limitGain(analogueGain);\n> > -     if (shutterTime * analogueGain < exposureValue) {\n> > -             for (unsigned int stage = 1;\n> > -                  stage < exposureMode_->gain.size(); stage++) {\n> > -                     if (!status_.fixedShutter) {\n> > -                             Duration stageShutter =\n> > -                                     limitShutter(exposureMode_->shutter[stage]);\n> > -                             if (stageShutter * analogueGain >= exposureValue) {\n> > -                                     shutterTime = exposureValue / analogueGain;\n> > -                                     break;\n> > -                             }\n> > -                             shutterTime = stageShutter;\n> > -                     }\n> > -                     if (status_.fixedAnalogueGain == 0.0) {\n> > -                             if (exposureMode_->gain[stage] * shutterTime >= exposureValue) {\n> > -                                     analogueGain = exposureValue / shutterTime;\n> > -                                     break;\n> > -                             }\n> > -                             analogueGain = exposureMode_->gain[stage];\n> > -                             analogueGain = limitGain(analogueGain);\n> > -                     }\n> > -             }\n> > -     }\n> > -     LOG(RPiAgc, Debug) << \"Divided up shutter and gain are \" << shutterTime << \" and \"\n> > -                        << analogueGain;\n> > -     /*\n> > -      * Finally adjust shutter time for flicker avoidance (require both\n> > -      * shutter and gain not to be fixed).\n> > -      */\n> > -     if (!status_.fixedShutter && !status_.fixedAnalogueGain &&\n> > -         status_.flickerPeriod) {\n> > -             int flickerPeriods = shutterTime / status_.flickerPeriod;\n> > -             if (flickerPeriods) {\n> > -          
           Duration newShutterTime = flickerPeriods * status_.flickerPeriod;\n> > -                     analogueGain *= shutterTime / newShutterTime;\n> > -                     /*\n> > -                      * We should still not allow the ag to go over the\n> > -                      * largest value in the exposure mode. Note that this\n> > -                      * may force more of the total exposure into the digital\n> > -                      * gain as a side-effect.\n> > -                      */\n> > -                     analogueGain = std::min(analogueGain, exposureMode_->gain.back());\n> > -                     analogueGain = limitGain(analogueGain);\n> > -                     shutterTime = newShutterTime;\n> > -             }\n> > -             LOG(RPiAgc, Debug) << \"After flicker avoidance, shutter \"\n> > -                                << shutterTime << \" gain \" << analogueGain;\n> > -     }\n> > -     filtered_.shutter = shutterTime;\n> > -     filtered_.analogueGain = analogueGain;\n> > +     LOG(RPiAgc, Debug) << \"setActiveChannels \" << activeChannels;\n> > +     activeChannels_ = activeChannels;\n> >  }\n> >\n> > -void Agc::writeAndFinish(Metadata *imageMetadata, bool desaturate)\n> > +void Agc::switchMode(CameraMode const &cameraMode,\n> > +                  Metadata *metadata)\n> >  {\n> > -     status_.totalExposureValue = filtered_.totalExposure;\n> > -     status_.targetExposureValue = desaturate ? 
0s : target_.totalExposureNoDG;\n> > -     status_.shutterTime = filtered_.shutter;\n> > -     status_.analogueGain = filtered_.analogueGain;\n> > -     /*\n> > -      * Write to metadata as well, in case anyone wants to update the camera\n> > -      * immediately.\n> > -      */\n> > -     imageMetadata->set(\"agc.status\", status_);\n> > -     LOG(RPiAgc, Debug) << \"Output written, total exposure requested is \"\n> > -                        << filtered_.totalExposure;\n> > -     LOG(RPiAgc, Debug) << \"Camera exposure update: shutter time \" << filtered_.shutter\n> > -                        << \" analogue gain \" << filtered_.analogueGain;\n> > +     LOG(RPiAgc, Debug) << \"switchMode for channel 0\";\n> > +     channelData_[0].channel.switchMode(cameraMode, metadata);\n> >  }\n> >\n> > -Duration Agc::limitShutter(Duration shutter)\n> > +void Agc::prepare(Metadata *imageMetadata)\n> >  {\n> > -     /*\n> > -      * shutter == 0 is a special case for fixed shutter values, and must pass\n> > -      * through unchanged\n> > -      */\n> > -     if (!shutter)\n> > -             return shutter;\n> > -\n> > -     shutter = std::clamp(shutter, mode_.minShutter, maxShutter_);\n> > -     return shutter;\n> > +     LOG(RPiAgc, Debug) << \"prepare for channel 0\";\n> > +     channelData_[0].channel.prepare(imageMetadata);\n> >  }\n> >\n> > -double Agc::limitGain(double gain) const\n> > +void Agc::process(StatisticsPtr &stats, Metadata *imageMetadata)\n> >  {\n> > -     /*\n> > -      * Only limit the lower bounds of the gain value to what the sensor limits.\n> > -      * The upper bound on analogue gain will be made up with additional digital\n> > -      * gain applied by the ISP.\n> > -      *\n> > -      * gain == 0.0 is a special case for fixed shutter values, and must pass\n> > -      * through unchanged\n> > -      */\n> > -     if (!gain)\n> > -             return gain;\n> > -\n> > -     gain = std::max(gain, mode_.minAnalogueGain);\n> > -     return gain;\n> > +   
  LOG(RPiAgc, Debug) << \"process for channel 0\";\n> > +     channelData_[0].channel.process(stats, imageMetadata);\n> >  }\n> >\n> >  /* Register algorithm with the system. */\n> > diff --git a/src/ipa/rpi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h\n> > index aaf77c8f..24f0a271 100644\n> > --- a/src/ipa/rpi/controller/rpi/agc.h\n> > +++ b/src/ipa/rpi/controller/rpi/agc.h\n> > @@ -6,60 +6,18 @@\n> >   */\n> >  #pragma once\n> >\n> > +#include <optional>\n>\n> Is this used ?\n\nYou're right, having moved those data fields into the next commit,\nthis include can move too!\n\n>\n> > +#include <string>\n> >  #include <vector>\n> > -#include <mutex>\n> > -\n> > -#include <libcamera/base/utils.h>\n> >\n> >  #include \"../agc_algorithm.h\"\n> > -#include \"../agc_status.h\"\n> > -#include \"../pwl.h\"\n> >\n> > -/* This is our implementation of AGC. */\n> > +#include \"agc_channel.h\"\n> >\n> >  namespace RPiController {\n> >\n> > -struct AgcMeteringMode {\n> > -     std::vector<double> weights;\n> > -     int read(const libcamera::YamlObject &params);\n> > -};\n> > -\n> > -struct AgcExposureMode {\n> > -     std::vector<libcamera::utils::Duration> shutter;\n> > -     std::vector<double> gain;\n> > -     int read(const libcamera::YamlObject &params);\n> > -};\n> > -\n> > -struct AgcConstraint {\n> > -     enum class Bound { LOWER = 0, UPPER = 1 };\n> > -     Bound bound;\n> > -     double qLo;\n> > -     double qHi;\n> > -     Pwl yTarget;\n> > -     int read(const libcamera::YamlObject &params);\n> > -};\n> > -\n> > -typedef std::vector<AgcConstraint> AgcConstraintMode;\n> > -\n> > -struct AgcConfig {\n> > -     int read(const libcamera::YamlObject &params);\n> > -     std::map<std::string, AgcMeteringMode> meteringModes;\n> > -     std::map<std::string, AgcExposureMode> exposureModes;\n> > -     std::map<std::string, AgcConstraintMode> constraintModes;\n> > -     Pwl yTarget;\n> > -     double speed;\n> > -     uint16_t startupFrames;\n> > -     unsigned 
int convergenceFrames;\n> > -     double maxChange;\n> > -     double minChange;\n> > -     double fastReduceThreshold;\n> > -     double speedUpThreshold;\n> > -     std::string defaultMeteringMode;\n> > -     std::string defaultExposureMode;\n> > -     std::string defaultConstraintMode;\n> > -     double baseEv;\n> > -     libcamera::utils::Duration defaultExposureTime;\n> > -     double defaultAnalogueGain;\n> > +struct AgcChannelData {\n> > +     AgcChannel channel;\n> >  };\n> >\n> >  class Agc : public AgcAlgorithm\n> > @@ -70,65 +28,30 @@ public:\n> >       int read(const libcamera::YamlObject &params) override;\n> >       unsigned int getConvergenceFrames() const override;\n> >       std::vector<double> const &getWeights() const override;\n> > -     void setEv(double ev) override;\n> > -     void setFlickerPeriod(libcamera::utils::Duration flickerPeriod) override;\n> > +     void setEv(unsigned int channel, double ev) override;\n> > +     void setFlickerPeriod(unsigned int channelIndex,\n> > +                           libcamera::utils::Duration flickerPeriod) override;\n> >       void setMaxShutter(libcamera::utils::Duration maxShutter) override;\n> > -     void setFixedShutter(libcamera::utils::Duration fixedShutter) override;\n> > -     void setFixedAnalogueGain(double fixedAnalogueGain) override;\n> > +     void setFixedShutter(unsigned int channelIndex,\n> > +                          libcamera::utils::Duration fixedShutter) override;\n> > +     void setFixedAnalogueGain(unsigned int channelIndex,\n> > +                               double fixedAnalogueGain) override;\n> >       void setMeteringMode(std::string const &meteringModeName) override;\n> > -     void setExposureMode(std::string const &exposureModeName) override;\n> > -     void setConstraintMode(std::string const &contraintModeName) override;\n> > -     void enableAuto() override;\n> > -     void disableAuto() override;\n> > +     void setExposureMode(unsigned int channelIndex,\n> > +       
                   std::string const &exposureModeName) override;\n> > +     void setConstraintMode(unsigned int channelIndex,\n> > +                            std::string const &contraintModeName) override;\n> > +     void enableAuto(unsigned int channelIndex) override;\n> > +     void disableAuto(unsigned int channelIndex) override;\n> >       void switchMode(CameraMode const &cameraMode, Metadata *metadata) override;\n> >       void prepare(Metadata *imageMetadata) override;\n> >       void process(StatisticsPtr &stats, Metadata *imageMetadata) override;\n> > +     void setActiveChannels(const std::vector<unsigned int> &activeChannels) override;\n> >\n> >  private:\n> > -     bool updateLockStatus(DeviceStatus const &deviceStatus);\n> > -     AgcConfig config_;\n> > -     void housekeepConfig();\n> > -     void fetchCurrentExposure(Metadata *imageMetadata);\n> > -     void fetchAwbStatus(Metadata *imageMetadata);\n> > -     void computeGain(StatisticsPtr &statistics, Metadata *imageMetadata,\n> > -                      double &gain, double &targetY);\n> > -     void computeTargetExposure(double gain);\n> > -     void filterExposure();\n> > -     bool applyDigitalGain(double gain, double targetY);\n> > -     void divideUpExposure();\n> > -     void writeAndFinish(Metadata *imageMetadata, bool desaturate);\n> > -     libcamera::utils::Duration limitShutter(libcamera::utils::Duration shutter);\n> > -     double limitGain(double gain) const;\n> > -     AgcMeteringMode *meteringMode_;\n> > -     AgcExposureMode *exposureMode_;\n> > -     AgcConstraintMode *constraintMode_;\n> > -     CameraMode mode_;\n> > -     uint64_t frameCount_;\n> > -     AwbStatus awb_;\n> > -     struct ExposureValues {\n> > -             ExposureValues();\n> > -\n> > -             libcamera::utils::Duration shutter;\n> > -             double analogueGain;\n> > -             libcamera::utils::Duration totalExposure;\n> > -             libcamera::utils::Duration totalExposureNoDG; /* without 
digital gain */\n> > -     };\n> > -     ExposureValues current_;  /* values for the current frame */\n> > -     ExposureValues target_;   /* calculate the values we want here */\n> > -     ExposureValues filtered_; /* these values are filtered towards target */\n> > -     AgcStatus status_;\n> > -     int lockCount_;\n> > -     DeviceStatus lastDeviceStatus_;\n> > -     libcamera::utils::Duration lastTargetExposure_;\n> > -     /* Below here the \"settings\" that applications can change. */\n> > -     std::string meteringModeName_;\n> > -     std::string exposureModeName_;\n> > -     std::string constraintModeName_;\n> > -     double ev_;\n> > -     libcamera::utils::Duration flickerPeriod_;\n> > -     libcamera::utils::Duration maxShutter_;\n> > -     libcamera::utils::Duration fixedShutter_;\n> > -     double fixedAnalogueGain_;\n> > +     int checkChannel(unsigned int channel) const;\n> > +     std::vector<AgcChannelData> channelData_;\n> > +     std::vector<unsigned int> activeChannels_;\n> >  };\n> >\n> >  } /* namespace RPiController */\n> > diff --git a/src/ipa/rpi/controller/rpi/agc_channel.cpp b/src/ipa/rpi/controller/rpi/agc_channel.cpp\n> > new file mode 100644\n> > index 00000000..7c1aba81\n> > --- /dev/null\n> > +++ b/src/ipa/rpi/controller/rpi/agc_channel.cpp\n> > @@ -0,0 +1,924 @@\n> > +/* SPDX-License-Identifier: BSD-2-Clause */\n> > +/*\n> > + * Copyright (C) 2023, Raspberry Pi Ltd\n> > + *\n> > + * agc.cpp - AGC/AEC control algorithm\n>\n> agc_channel.cpp\n\nThank you!\n\n>\n> > + */\n> > +\n> > +#include \"agc_channel.h\"\n> > +\n> > +#include <algorithm>\n> > +#include <tuple>\n> > +\n> > +#include <libcamera/base/log.h>\n> > +\n> > +#include \"../awb_status.h\"\n> > +#include \"../device_status.h\"\n> > +#include \"../histogram.h\"\n> > +#include \"../lux_status.h\"\n> > +#include \"../metadata.h\"\n> > +\n> > +using namespace RPiController;\n> > +using namespace libcamera;\n> > +using libcamera::utils::Duration;\n> > +using namespace 
std::literals::chrono_literals;\n> > +\n> > +LOG_DECLARE_CATEGORY(RPiAgc)\n> > +\n> > +int AgcMeteringMode::read(const libcamera::YamlObject &params)\n> > +{\n> > +     const YamlObject &yamlWeights = params[\"weights\"];\n> > +\n> > +     for (const auto &p : yamlWeights.asList()) {\n> > +             auto value = p.get<double>();\n> > +             if (!value)\n> > +                     return -EINVAL;\n> > +             weights.push_back(*value);\n> > +     }\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +static std::tuple<int, std::string>\n> > +readMeteringModes(std::map<std::string, AgcMeteringMode> &metering_modes,\n> > +               const libcamera::YamlObject &params)\n> > +{\n> > +     std::string first;\n> > +     int ret;\n> > +\n> > +     for (const auto &[key, value] : params.asDict()) {\n> > +             AgcMeteringMode meteringMode;\n> > +             ret = meteringMode.read(value);\n> > +             if (ret)\n> > +                     return { ret, {} };\n> > +\n> > +             metering_modes[key] = std::move(meteringMode);\n> > +             if (first.empty())\n> > +                     first = key;\n> > +     }\n> > +\n> > +     return { 0, first };\n> > +}\n> > +\n> > +int AgcExposureMode::read(const libcamera::YamlObject &params)\n> > +{\n> > +     auto value = params[\"shutter\"].getList<double>();\n> > +     if (!value)\n> > +             return -EINVAL;\n> > +     std::transform(value->begin(), value->end(), std::back_inserter(shutter),\n> > +                    [](double v) { return v * 1us; });\n> > +\n> > +     value = params[\"gain\"].getList<double>();\n> > +     if (!value)\n> > +             return -EINVAL;\n> > +     gain = std::move(*value);\n> > +\n> > +     if (shutter.size() < 2 || gain.size() < 2) {\n> > +             LOG(RPiAgc, Error)\n> > +                     << \"AgcExposureMode: must have at least two entries in exposure profile\";\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +     if (shutter.size() 
!= gain.size()) {\n> > +             LOG(RPiAgc, Error)\n> > +                     << \"AgcExposureMode: expect same number of exposure and gain entries in exposure profile\";\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +static std::tuple<int, std::string>\n> > +readExposureModes(std::map<std::string, AgcExposureMode> &exposureModes,\n> > +               const libcamera::YamlObject &params)\n> > +{\n> > +     std::string first;\n> > +     int ret;\n> > +\n> > +     for (const auto &[key, value] : params.asDict()) {\n> > +             AgcExposureMode exposureMode;\n> > +             ret = exposureMode.read(value);\n> > +             if (ret)\n> > +                     return { ret, {} };\n> > +\n> > +             exposureModes[key] = std::move(exposureMode);\n> > +             if (first.empty())\n> > +                     first = key;\n> > +     }\n> > +\n> > +     return { 0, first };\n> > +}\n> > +\n> > +int AgcConstraint::read(const libcamera::YamlObject &params)\n> > +{\n> > +     std::string boundString = params[\"bound\"].get<std::string>(\"\");\n> > +     transform(boundString.begin(), boundString.end(),\n> > +               boundString.begin(), ::toupper);\n> > +     if (boundString != \"UPPER\" && boundString != \"LOWER\") {\n> > +             LOG(RPiAgc, Error) << \"AGC constraint type should be UPPER or LOWER\";\n> > +             return -EINVAL;\n> > +     }\n> > +     bound = boundString == \"UPPER\" ? 
Bound::UPPER : Bound::LOWER;\n> > +\n> > +     auto value = params[\"q_lo\"].get<double>();\n> > +     if (!value)\n> > +             return -EINVAL;\n> > +     qLo = *value;\n> > +\n> > +     value = params[\"q_hi\"].get<double>();\n> > +     if (!value)\n> > +             return -EINVAL;\n> > +     qHi = *value;\n> > +\n> > +     return yTarget.read(params[\"y_target\"]);\n> > +}\n> > +\n> > +static std::tuple<int, AgcConstraintMode>\n> > +readConstraintMode(const libcamera::YamlObject &params)\n> > +{\n> > +     AgcConstraintMode mode;\n> > +     int ret;\n> > +\n> > +     for (const auto &p : params.asList()) {\n> > +             AgcConstraint constraint;\n> > +             ret = constraint.read(p);\n> > +             if (ret)\n> > +                     return { ret, {} };\n> > +\n> > +             mode.push_back(std::move(constraint));\n> > +     }\n> > +\n> > +     return { 0, mode };\n> > +}\n> > +\n> > +static std::tuple<int, std::string>\n> > +readConstraintModes(std::map<std::string, AgcConstraintMode> &constraintModes,\n> > +                 const libcamera::YamlObject &params)\n> > +{\n> > +     std::string first;\n> > +     int ret;\n> > +\n> > +     for (const auto &[key, value] : params.asDict()) {\n> > +             std::tie(ret, constraintModes[key]) = readConstraintMode(value);\n> > +             if (ret)\n> > +                     return { ret, {} };\n> > +\n> > +             if (first.empty())\n> > +                     first = key;\n> > +     }\n> > +\n> > +     return { 0, first };\n> > +}\n> > +\n> > +int AgcConfig::read(const libcamera::YamlObject &params)\n> > +{\n> > +     LOG(RPiAgc, Debug) << \"AgcConfig\";\n> > +     int ret;\n> > +\n> > +     std::tie(ret, defaultMeteringMode) =\n> > +             readMeteringModes(meteringModes, params[\"metering_modes\"]);\n> > +     if (ret)\n> > +             return ret;\n> > +     std::tie(ret, defaultExposureMode) =\n> > +             readExposureModes(exposureModes, 
params[\"exposure_modes\"]);\n> > +     if (ret)\n> > +             return ret;\n> > +     std::tie(ret, defaultConstraintMode) =\n> > +             readConstraintModes(constraintModes, params[\"constraint_modes\"]);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     ret = yTarget.read(params[\"y_target\"]);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     speed = params[\"speed\"].get<double>(0.2);\n> > +     startupFrames = params[\"startup_frames\"].get<uint16_t>(10);\n> > +     convergenceFrames = params[\"convergence_frames\"].get<unsigned int>(6);\n> > +     fastReduceThreshold = params[\"fast_reduce_threshold\"].get<double>(0.4);\n> > +     baseEv = params[\"base_ev\"].get<double>(1.0);\n> > +\n> > +     /* Start with quite a low value as ramping up is easier than ramping down. */\n> > +     defaultExposureTime = params[\"default_exposure_time\"].get<double>(1000) * 1us;\n> > +     defaultAnalogueGain = params[\"default_analogue_gain\"].get<double>(1.0);\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +AgcChannel::ExposureValues::ExposureValues()\n> > +     : shutter(0s), analogueGain(0),\n> > +       totalExposure(0s), totalExposureNoDG(0s)\n> > +{\n> > +}\n> > +\n> > +AgcChannel::AgcChannel()\n> > +     : meteringMode_(nullptr), exposureMode_(nullptr), constraintMode_(nullptr),\n> > +       frameCount_(0), lockCount_(0),\n> > +       lastTargetExposure_(0s), ev_(1.0), flickerPeriod_(0s),\n> > +       maxShutter_(0s), fixedShutter_(0s), fixedAnalogueGain_(0.0)\n> > +{\n> > +     memset(&awb_, 0, sizeof(awb_));\n> > +     /*\n> > +      * Setting status_.totalExposureValue_ to zero initially tells us\n> > +      * it's not been calculated yet (i.e. 
Process hasn't yet run).\n> > +      */\n> > +     status_ = {};\n> > +     status_.ev = ev_;\n> > +}\n> > +\n> > +int AgcChannel::read(const libcamera::YamlObject &params,\n> > +                  const Controller::HardwareConfig &hardwareConfig)\n> > +{\n> > +     int ret = config_.read(params);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     const Size &size = hardwareConfig.agcZoneWeights;\n> > +     for (auto const &modes : config_.meteringModes) {\n> > +             if (modes.second.weights.size() != size.width * size.height) {\n> > +                     LOG(RPiAgc, Error) << \"AgcMeteringMode: Incorrect number of weights\";\n> > +                     return -EINVAL;\n> > +             }\n> > +     }\n> > +\n> > +     /*\n> > +      * Set the config's defaults (which are the first ones it read) as our\n> > +      * current modes, until someone changes them.  (they're all known to\n> > +      * exist at this point)\n> > +      */\n> > +     meteringModeName_ = config_.defaultMeteringMode;\n> > +     meteringMode_ = &config_.meteringModes[meteringModeName_];\n> > +     exposureModeName_ = config_.defaultExposureMode;\n> > +     exposureMode_ = &config_.exposureModes[exposureModeName_];\n> > +     constraintModeName_ = config_.defaultConstraintMode;\n> > +     constraintMode_ = &config_.constraintModes[constraintModeName_];\n> > +     /* Set up the \"last shutter/gain\" values, in case AGC starts \"disabled\". 
*/\n> > +     status_.shutterTime = config_.defaultExposureTime;\n> > +     status_.analogueGain = config_.defaultAnalogueGain;\n> > +     return 0;\n> > +}\n> > +\n> > +void AgcChannel::disableAuto()\n> > +{\n> > +     fixedShutter_ = status_.shutterTime;\n> > +     fixedAnalogueGain_ = status_.analogueGain;\n> > +}\n> > +\n> > +void AgcChannel::enableAuto()\n> > +{\n> > +     fixedShutter_ = 0s;\n> > +     fixedAnalogueGain_ = 0;\n> > +}\n> > +\n> > +unsigned int AgcChannel::getConvergenceFrames() const\n> > +{\n> > +     /*\n> > +      * If shutter and gain have been explicitly set, there is no\n> > +      * convergence to happen, so no need to drop any frames - return zero.\n> > +      */\n> > +     if (fixedShutter_ && fixedAnalogueGain_)\n> > +             return 0;\n> > +     else\n> > +             return config_.convergenceFrames;\n> > +}\n> > +\n> > +std::vector<double> const &AgcChannel::getWeights() const\n> > +{\n> > +     /*\n> > +      * In case someone calls setMeteringMode and then this before the\n> > +      * algorithm has run and updated the meteringMode_ pointer.\n> > +      */\n> > +     auto it = config_.meteringModes.find(meteringModeName_);\n> > +     if (it == config_.meteringModes.end())\n> > +             return meteringMode_->weights;\n> > +     return it->second.weights;\n> > +}\n> > +\n> > +void AgcChannel::setEv(double ev)\n> > +{\n> > +     ev_ = ev;\n> > +}\n> > +\n> > +void AgcChannel::setFlickerPeriod(Duration flickerPeriod)\n> > +{\n> > +     flickerPeriod_ = flickerPeriod;\n> > +}\n> > +\n> > +void AgcChannel::setMaxShutter(Duration maxShutter)\n> > +{\n> > +     maxShutter_ = maxShutter;\n> > +}\n> > +\n> > +void AgcChannel::setFixedShutter(Duration fixedShutter)\n> > +{\n> > +     fixedShutter_ = fixedShutter;\n> > +     /* Set this in case someone calls disableAuto() straight after. 
*/\n> > +     status_.shutterTime = limitShutter(fixedShutter_);\n> > +}\n> > +\n> > +void AgcChannel::setFixedAnalogueGain(double fixedAnalogueGain)\n> > +{\n> > +     fixedAnalogueGain_ = fixedAnalogueGain;\n> > +     /* Set this in case someone calls disableAuto() straight after. */\n> > +     status_.analogueGain = limitGain(fixedAnalogueGain);\n> > +}\n> > +\n> > +void AgcChannel::setMeteringMode(std::string const &meteringModeName)\n> > +{\n> > +     meteringModeName_ = meteringModeName;\n> > +}\n> > +\n> > +void AgcChannel::setExposureMode(std::string const &exposureModeName)\n> > +{\n> > +     exposureModeName_ = exposureModeName;\n> > +}\n> > +\n> > +void AgcChannel::setConstraintMode(std::string const &constraintModeName)\n> > +{\n> > +     constraintModeName_ = constraintModeName;\n> > +}\n> > +\n> > +void AgcChannel::switchMode(CameraMode const &cameraMode,\n> > +                         Metadata *metadata)\n> > +{\n> > +     /* AGC expects the mode sensitivity always to be non-zero. */\n> > +     ASSERT(cameraMode.sensitivity);\n> > +\n> > +     housekeepConfig();\n> > +\n> > +     /*\n> > +      * Store the mode in the local state. We must cache the sensitivity\n> > +      * of the previous mode for the calculations below.\n> > +      */\n> > +     double lastSensitivity = mode_.sensitivity;\n> > +     mode_ = cameraMode;\n> > +\n> > +     Duration fixedShutter = limitShutter(fixedShutter_);\n> > +     if (fixedShutter && fixedAnalogueGain_) {\n> > +             /* We're going to reset the algorithm here with these fixed values. */\n> > +\n> > +             fetchAwbStatus(metadata);\n> > +             double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> > +             ASSERT(minColourGain != 0.0);\n> > +\n> > +             /* This is the equivalent of computeTargetExposure and applyDigitalGain. 
*/\n> > +             target_.totalExposureNoDG = fixedShutter_ * fixedAnalogueGain_;\n> > +             target_.totalExposure = target_.totalExposureNoDG / minColourGain;\n> > +\n> > +             /* Equivalent of filterExposure. This resets any \"history\". */\n> > +             filtered_ = target_;\n> > +\n> > +             /* Equivalent of divideUpExposure. */\n> > +             filtered_.shutter = fixedShutter;\n> > +             filtered_.analogueGain = fixedAnalogueGain_;\n> > +     } else if (status_.totalExposureValue) {\n> > +             /*\n> > +              * On a mode switch, various things could happen:\n> > +              * - the exposure profile might change\n> > +              * - a fixed exposure or gain might be set\n> > +              * - the new mode's sensitivity might be different\n> > +              * We cope with the last of these by scaling the target values. After\n> > +              * that we just need to re-divide the exposure/gain according to the\n> > +              * current exposure profile, which takes care of everything else.\n> > +              */\n> > +\n> > +             double ratio = lastSensitivity / cameraMode.sensitivity;\n> > +             target_.totalExposureNoDG *= ratio;\n> > +             target_.totalExposure *= ratio;\n> > +             filtered_.totalExposureNoDG *= ratio;\n> > +             filtered_.totalExposure *= ratio;\n> > +\n> > +             divideUpExposure();\n> > +     } else {\n> > +             /*\n> > +              * We come through here on startup, when at least one of the shutter\n> > +              * or gain has not been fixed. We must still write those values out so\n> > +              * that they will be applied immediately. We supply some arbitrary defaults\n> > +              * for any that weren't set.\n> > +              */\n> > +\n> > +             /* Equivalent of divideUpExposure. */\n> > +             filtered_.shutter = fixedShutter ? 
fixedShutter : config_.defaultExposureTime;\n> > +             filtered_.analogueGain = fixedAnalogueGain_ ? fixedAnalogueGain_ : config_.defaultAnalogueGain;\n> > +     }\n> > +\n> > +     writeAndFinish(metadata, false);\n> > +}\n> > +\n> > +void AgcChannel::prepare(Metadata *imageMetadata)\n> > +{\n> > +     Duration totalExposureValue = status_.totalExposureValue;\n> > +     AgcStatus delayedStatus;\n> > +     AgcPrepareStatus prepareStatus;\n> > +\n> > +     if (!imageMetadata->get(\"agc.delayed_status\", delayedStatus))\n> > +             totalExposureValue = delayedStatus.totalExposureValue;\n> > +\n> > +     prepareStatus.digitalGain = 1.0;\n> > +     prepareStatus.locked = false;\n> > +\n> > +     if (status_.totalExposureValue) {\n> > +             /* Process has run, so we have meaningful values. */\n> > +             DeviceStatus deviceStatus;\n> > +             if (imageMetadata->get(\"device.status\", deviceStatus) == 0) {\n> > +                     Duration actualExposure = deviceStatus.shutterSpeed *\n> > +                                               deviceStatus.analogueGain;\n> > +                     if (actualExposure) {\n> > +                             double digitalGain = totalExposureValue / actualExposure;\n> > +                             LOG(RPiAgc, Debug) << \"Want total exposure \" << totalExposureValue;\n> > +                             /*\n> > +                              * Never ask for a gain < 1.0, and also impose\n> > +                              * some upper limit. 
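The digital gain computed in prepare() above is just a clamped ratio of wanted to achieved exposure. A minimal standalone sketch, with plain doubles standing in for utils::Duration and the 4.0 upper limit hard-coded as in the patch:

```cpp
#include <algorithm>
#include <cassert>

/* Hypothetical helper mirroring the clamp in AgcChannel::prepare():
 * digital gain is the ratio of the wanted total exposure to what the
 * sensor actually delivered, never below 1.0 and capped at 4.0. */
double clampedDigitalGain(double totalExposureUs, double actualExposureUs)
{
	return std::max(1.0, std::min(totalExposureUs / actualExposureUs, 4.0));
}
```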
Make it customisable?\n> > +                              */\n> > +                             prepareStatus.digitalGain = std::max(1.0, std::min(digitalGain, 4.0));\n> > +                             LOG(RPiAgc, Debug) << \"Actual exposure \" << actualExposure;\n> > +                             LOG(RPiAgc, Debug) << \"Use digitalGain \" << prepareStatus.digitalGain;\n> > +                             LOG(RPiAgc, Debug) << \"Effective exposure \"\n> > +                                                << actualExposure * prepareStatus.digitalGain;\n> > +                             /* Decide whether AEC/AGC has converged. */\n> > +                             prepareStatus.locked = updateLockStatus(deviceStatus);\n> > +                     }\n> > +             } else\n> > +                     LOG(RPiAgc, Warning) << \"AgcChannel: no device metadata\";\n> > +             imageMetadata->set(\"agc.prepare_status\", prepareStatus);\n> > +     }\n> > +}\n> > +\n> > +void AgcChannel::process(StatisticsPtr &stats, Metadata *imageMetadata)\n> > +{\n> > +     frameCount_++;\n> > +     /*\n> > +      * First a little bit of housekeeping, fetching up-to-date settings and\n> > +      * configuration, that kind of thing.\n> > +      */\n> > +     housekeepConfig();\n> > +     /* Fetch the AWB status immediately, so that we can assume it's there. */\n> > +     fetchAwbStatus(imageMetadata);\n> > +     /* Get the current exposure values for the frame that's just arrived. */\n> > +     fetchCurrentExposure(imageMetadata);\n> > +     /* Compute the total gain we require relative to the current exposure. */\n> > +     double gain, targetY;\n> > +     computeGain(stats, imageMetadata, gain, targetY);\n> > +     /* Now compute the target (final) exposure which we think we want. */\n> > +     computeTargetExposure(gain);\n> > +     /* The results have to be filtered so as not to change too rapidly. 
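The filtering step mentioned above is an exponential moving average. A minimal sketch, with plain doubles in place of utils::Duration and the +/-20% "close" band and sqrt speed-up as in filterExposure():

```cpp
#include <cassert>
#include <cmath>

/* Sketch of filterExposure(): blend the filtered value towards the
 * target at the configured speed; within +/-20% of the target, boost
 * the speed to its square root to cut down on micro-adjustments. */
double filterExposure(double filtered, double target, double speed)
{
	if (filtered == 0.0)
		return target; /* no history yet: adopt the target directly */
	if (filtered < 1.2 * target && filtered > 0.8 * target)
		speed = std::sqrt(speed); /* nearly there: converge faster */
	return speed * target + filtered * (1.0 - speed);
}
```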
*/\n> > +     filterExposure();\n> > +     /*\n> > +      * Some of the exposure has to be applied as digital gain, so work out\n> > +      * what that is. This function also tells us whether it's decided to\n> > +      * \"desaturate\" the image more quickly.\n> > +      */\n> > +     bool desaturate = applyDigitalGain(gain, targetY);\n> > +     /*\n> > +      * The last thing is to divide up the exposure value into a shutter time\n> > +      * and analogue gain, according to the current exposure mode.\n> > +      */\n> > +     divideUpExposure();\n> > +     /* Finally advertise what we've done. */\n> > +     writeAndFinish(imageMetadata, desaturate);\n> > +}\n> > +\n> > +bool AgcChannel::updateLockStatus(DeviceStatus const &deviceStatus)\n> > +{\n> > +     const double errorFactor = 0.10; /* make these customisable? */\n> > +     const int maxLockCount = 5;\n> > +     /* Reset \"lock count\" when we exceed this multiple of errorFactor */\n> > +     const double resetMargin = 1.5;\n> > +\n> > +     /* Add 200us to the exposure time error to allow for line quantisation. */\n> > +     Duration exposureError = lastDeviceStatus_.shutterSpeed * errorFactor + 200us;\n> > +     double gainError = lastDeviceStatus_.analogueGain * errorFactor;\n> > +     Duration targetError = lastTargetExposure_ * errorFactor;\n> > +\n> > +     /*\n> > +      * Note that we don't know the exposure/gain limits of the sensor, so\n> > +      * the values we keep requesting may be unachievable. 
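The lock detection in updateLockStatus() amounts to a saturating counter with hysteresis. A toy single-parameter sketch (the real code checks shutter, gain and target exposure together; maxLockCount and resetMargin copied from the patch):

```cpp
#include <algorithm>
#include <cassert>

/* Toy model of the lock-count hysteresis: values within +/-error of the
 * previous request bump the count (saturating at maxLockCount); values
 * beyond resetMargin * error reset it; anything in between leaves it
 * unchanged. "Locked" once the count saturates. */
struct LockCounter {
	int count = 0;
	static constexpr int maxLockCount = 5;
	static constexpr double resetMargin = 1.5;

	bool update(double value, double last, double error)
	{
		if (value > last - error && value < last + error)
			count = std::min(count + 1, maxLockCount);
		else if (value < last - resetMargin * error ||
			 value > last + resetMargin * error)
			count = 0;
		return count == maxLockCount;
	}
};
```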
For this reason\n> > +      * we only insist that we're close to values in the past few frames.\n> > +      */\n> > +     if (deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed - exposureError &&\n> > +         deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed + exposureError &&\n> > +         deviceStatus.analogueGain > lastDeviceStatus_.analogueGain - gainError &&\n> > +         deviceStatus.analogueGain < lastDeviceStatus_.analogueGain + gainError &&\n> > +         status_.targetExposureValue > lastTargetExposure_ - targetError &&\n> > +         status_.targetExposureValue < lastTargetExposure_ + targetError)\n> > +             lockCount_ = std::min(lockCount_ + 1, maxLockCount);\n> > +     else if (deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed - resetMargin * exposureError ||\n> > +              deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed + resetMargin * exposureError ||\n> > +              deviceStatus.analogueGain < lastDeviceStatus_.analogueGain - resetMargin * gainError ||\n> > +              deviceStatus.analogueGain > lastDeviceStatus_.analogueGain + resetMargin * gainError ||\n> > +              status_.targetExposureValue < lastTargetExposure_ - resetMargin * targetError ||\n> > +              status_.targetExposureValue > lastTargetExposure_ + resetMargin * targetError)\n> > +             lockCount_ = 0;\n> > +\n> > +     lastDeviceStatus_ = deviceStatus;\n> > +     lastTargetExposure_ = status_.targetExposureValue;\n> > +\n> > +     LOG(RPiAgc, Debug) << \"Lock count updated to \" << lockCount_;\n> > +     return lockCount_ == maxLockCount;\n> > +}\n> > +\n> > +void AgcChannel::housekeepConfig()\n> > +{\n> > +     /* First fetch all the up-to-date settings, so no one else has to do it. 
*/\n> > +     status_.ev = ev_;\n> > +     status_.fixedShutter = limitShutter(fixedShutter_);\n> > +     status_.fixedAnalogueGain = fixedAnalogueGain_;\n> > +     status_.flickerPeriod = flickerPeriod_;\n> > +     LOG(RPiAgc, Debug) << \"ev \" << status_.ev << \" fixedShutter \"\n> > +                        << status_.fixedShutter << \" fixedAnalogueGain \"\n> > +                        << status_.fixedAnalogueGain;\n> > +     /*\n> > +      * Make sure the \"mode\" pointers point to the up-to-date things, if\n> > +      * they've changed.\n> > +      */\n> > +     if (meteringModeName_ != status_.meteringMode) {\n> > +             auto it = config_.meteringModes.find(meteringModeName_);\n> > +             if (it == config_.meteringModes.end()) {\n> > +                     LOG(RPiAgc, Warning) << \"No metering mode \" << meteringModeName_;\n> > +                     meteringModeName_ = status_.meteringMode;\n> > +             } else {\n> > +                     meteringMode_ = &it->second;\n> > +                     status_.meteringMode = meteringModeName_;\n> > +             }\n> > +     }\n> > +     if (exposureModeName_ != status_.exposureMode) {\n> > +             auto it = config_.exposureModes.find(exposureModeName_);\n> > +             if (it == config_.exposureModes.end()) {\n> > +                     LOG(RPiAgc, Warning) << \"No exposure profile \" << exposureModeName_;\n> > +                     exposureModeName_ = status_.exposureMode;\n> > +             } else {\n> > +                     exposureMode_ = &it->second;\n> > +                     status_.exposureMode = exposureModeName_;\n> > +             }\n> > +     }\n> > +     if (constraintModeName_ != status_.constraintMode) {\n> > +             auto it = config_.constraintModes.find(constraintModeName_);\n> > +             if (it == config_.constraintModes.end()) {\n> > +                     LOG(RPiAgc, Warning) << \"No constraint list \" << constraintModeName_;\n> > +                     
constraintModeName_ = status_.constraintMode;\n> > +             } else {\n> > +                     constraintMode_ = &it->second;\n> > +                     status_.constraintMode = constraintModeName_;\n> > +             }\n> > +     }\n> > +     LOG(RPiAgc, Debug) << \"exposureMode \"\n> > +                        << exposureModeName_ << \" constraintMode \"\n> > +                        << constraintModeName_ << \" meteringMode \"\n> > +                        << meteringModeName_;\n> > +}\n> > +\n> > +void AgcChannel::fetchCurrentExposure(Metadata *imageMetadata)\n> > +{\n> > +     std::unique_lock<Metadata> lock(*imageMetadata);\n> > +     DeviceStatus *deviceStatus =\n> > +             imageMetadata->getLocked<DeviceStatus>(\"device.status\");\n> > +     if (!deviceStatus)\n> > +             LOG(RPiAgc, Fatal) << \"No device metadata\";\n> > +     current_.shutter = deviceStatus->shutterSpeed;\n> > +     current_.analogueGain = deviceStatus->analogueGain;\n> > +     AgcStatus *agcStatus =\n> > +             imageMetadata->getLocked<AgcStatus>(\"agc.status\");\n> > +     current_.totalExposure = agcStatus ? 
agcStatus->totalExposureValue : 0s;\n> > +     current_.totalExposureNoDG = current_.shutter * current_.analogueGain;\n> > +}\n> > +\n> > +void AgcChannel::fetchAwbStatus(Metadata *imageMetadata)\n> > +{\n> > +     awb_.gainR = 1.0; /* in case not found in metadata */\n> > +     awb_.gainG = 1.0;\n> > +     awb_.gainB = 1.0;\n> > +     if (imageMetadata->get(\"awb.status\", awb_) != 0)\n> > +             LOG(RPiAgc, Debug) << \"No AWB status found\";\n> > +}\n> > +\n> > +static double computeInitialY(StatisticsPtr &stats, AwbStatus const &awb,\n> > +                           std::vector<double> &weights, double gain)\n> > +{\n> > +     constexpr uint64_t maxVal = 1 << Statistics::NormalisationFactorPow2;\n> > +\n> > +     /*\n> > +      * If we have no AGC region stats, but do have a Y histogram, use that\n> > +      * directly to calculate the mean Y value of the image.\n> > +      */\n> > +     if (!stats->agcRegions.numRegions() && stats->yHist.bins()) {\n> > +             /*\n> > +              * When the gain is applied to the histogram, anything below minBin\n> > +              * will scale up directly with the gain, but anything above that\n> > +              * will saturate into the top bin.\n> > +              */\n> > +             auto &hist = stats->yHist;\n> > +             double minBin = std::min(1.0, 1.0 / gain) * hist.bins();\n> > +             double binMean = hist.interBinMean(0.0, minBin);\n> > +             double numUnsaturated = hist.cumulativeFreq(minBin);\n> > +             /* This term is from all the pixels that won't saturate. */\n> > +             double ySum = binMean * gain * numUnsaturated;\n> > +             /* And add the ones that will saturate. 
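The histogram arithmetic described in that comment can be checked with a tiny numeric model, with toy inputs standing in for the Histogram queries interBinMean(), cumulativeFreq(), total() and bins():

```cpp
#include <cassert>

/* Toy model of the histogram path in computeInitialY(): pixels below
 * the saturation point scale with the gain, pixels above it pin to the
 * top bin. Inputs stand in for the real Histogram queries. */
double meanYWithGain(double binMean, double gain, double numUnsaturated,
		     double total, double bins)
{
	double ySum = binMean * gain * numUnsaturated; /* unsaturated pixels */
	ySum += (total - numUnsaturated) * bins;       /* saturated pixels */
	return ySum / total / bins;
}
```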
*/\n> > +             ySum += (hist.total() - numUnsaturated) * hist.bins();\n> > +             return ySum / hist.total() / hist.bins();\n> > +     }\n> > +\n> > +     ASSERT(weights.size() == stats->agcRegions.numRegions());\n> > +\n> > +     /*\n> > +      * Note that the weights are applied by the IPA to the statistics directly,\n> > +      * before they are given to us here.\n> > +      */\n> > +     double rSum = 0, gSum = 0, bSum = 0, pixelSum = 0;\n> > +     for (unsigned int i = 0; i < stats->agcRegions.numRegions(); i++) {\n> > +             auto &region = stats->agcRegions.get(i);\n> > +             rSum += std::min<double>(region.val.rSum * gain, (maxVal - 1) * region.counted);\n> > +             gSum += std::min<double>(region.val.gSum * gain, (maxVal - 1) * region.counted);\n> > +             bSum += std::min<double>(region.val.bSum * gain, (maxVal - 1) * region.counted);\n> > +             pixelSum += region.counted;\n> > +     }\n> > +     if (pixelSum == 0.0) {\n> > +             LOG(RPiAgc, Warning) << \"computeInitialY: pixelSum is zero\";\n> > +             return 0;\n> > +     }\n> > +\n> > +     double ySum;\n> > +     /* Factor in the AWB correction if needed. */\n> > +     if (stats->agcStatsPos == Statistics::AgcStatsPos::PreWb) {\n> > +             ySum = rSum * awb.gainR * .299 +\n> > +                    gSum * awb.gainG * .587 +\n> > +                    bSum * awb.gainB * .114;\n> > +     } else\n> > +             ySum = rSum * .299 + gSum * .587 + bSum * .114;\n> > +\n> > +     return ySum / pixelSum / (1 << 16);\n> > +}\n> > +\n> > +/*\n> > + * We handle extra gain through EV by adjusting our Y targets. However, you\n> > + * simply can't monitor histograms once they get very close to (or beyond!)\n> > + * saturation, so we clamp the Y targets to this value. 
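The clamp described in that comment is a one-liner. A minimal sketch, with EvGainYTargetLimit = 0.9 as in the patch:

```cpp
#include <algorithm>
#include <cassert>

/* Sketch of the EV-driven Y-target clamp: extra EV gain scales the Y
 * target, but the result is limited to 0.9 because metering becomes
 * unreliable close to saturation. */
constexpr double EvGainYTargetLimit = 0.9;

double clampedYTarget(double targetY, double evGain)
{
	return std::min(EvGainYTargetLimit, targetY * evGain);
}
```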
It does mean that EV\n> > + * increases don't necessarily do quite what you might expect in certain\n> > + * (contrived) cases.\n> > + */\n> > +\n> > +static constexpr double EvGainYTargetLimit = 0.9;\n> > +\n> > +static double constraintComputeGain(AgcConstraint &c, const Histogram &h, double lux,\n> > +                                 double evGain, double &targetY)\n> > +{\n> > +     targetY = c.yTarget.eval(c.yTarget.domain().clip(lux));\n> > +     targetY = std::min(EvGainYTargetLimit, targetY * evGain);\n> > +     double iqm = h.interQuantileMean(c.qLo, c.qHi);\n> > +     return (targetY * h.bins()) / iqm;\n> > +}\n> > +\n> > +void AgcChannel::computeGain(StatisticsPtr &statistics, Metadata *imageMetadata,\n> > +                          double &gain, double &targetY)\n> > +{\n> > +     struct LuxStatus lux = {};\n> > +     lux.lux = 400; /* default lux level to 400 in case no metadata found */\n> > +     if (imageMetadata->get(\"lux.status\", lux) != 0)\n> > +             LOG(RPiAgc, Warning) << \"No lux level found\";\n> > +     const Histogram &h = statistics->yHist;\n> > +     double evGain = status_.ev * config_.baseEv;\n> > +     /*\n> > +      * The initial gain and target_Y come from some of the regions. 
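The iterative gain search in computeGain() can be sketched with a toy measurement model; measureY below is a hypothetical stand-in for computeInitialY(), clipping at 1.0 the way a saturating sensor would, with the loop bounds and 1.01 "close enough" test copied from the patch:

```cpp
#include <algorithm>
#include <cassert>

/* Sketch of the iterative gain refinement: because the measured Y
 * saturates, a single division does not land on the target when parts
 * of the image clip, so a few multiplicative refinements converge
 * instead. */
double solveGain(double targetY, double sceneY)
{
	auto measureY = [sceneY](double gain) {
		return std::min(1.0, sceneY * gain); /* clips like a sensor */
	};
	double gain = 1.0;
	for (int i = 0; i < 8; i++) {
		double initialY = measureY(gain);
		double extraGain = std::min(10.0, targetY / (initialY + .001));
		gain *= extraGain;
		if (extraGain < 1.01) /* close enough */
			break;
	}
	return gain;
}
```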
After\n> > +      * that we consider the histogram constraints.\n> > +      */\n> > +     targetY = config_.yTarget.eval(config_.yTarget.domain().clip(lux.lux));\n> > +     targetY = std::min(EvGainYTargetLimit, targetY * evGain);\n> > +\n> > +     /*\n> > +      * Do this calculation a few times as brightness increase can be\n> > +      * non-linear when there are saturated regions.\n> > +      */\n> > +     gain = 1.0;\n> > +     for (int i = 0; i < 8; i++) {\n> > +             double initialY = computeInitialY(statistics, awb_, meteringMode_->weights, gain);\n> > +             double extraGain = std::min(10.0, targetY / (initialY + .001));\n> > +             gain *= extraGain;\n> > +             LOG(RPiAgc, Debug) << \"Initial Y \" << initialY << \" target \" << targetY\n> > +                                << \" gives gain \" << gain;\n> > +             if (extraGain < 1.01) /* close enough */\n> > +                     break;\n> > +     }\n> > +\n> > +     for (auto &c : *constraintMode_) {\n> > +             double newTargetY;\n> > +             double newGain = constraintComputeGain(c, h, lux.lux, evGain, newTargetY);\n> > +             LOG(RPiAgc, Debug) << \"Constraint has target_Y \"\n> > +                                << newTargetY << \" giving gain \" << newGain;\n> > +             if (c.bound == AgcConstraint::Bound::LOWER && newGain > gain) {\n> > +                     LOG(RPiAgc, Debug) << \"Lower bound constraint adopted\";\n> > +                     gain = newGain;\n> > +                     targetY = newTargetY;\n> > +             } else if (c.bound == AgcConstraint::Bound::UPPER && newGain < gain) {\n> > +                     LOG(RPiAgc, Debug) << \"Upper bound constraint adopted\";\n> > +                     gain = newGain;\n> > +                     targetY = newTargetY;\n> > +             }\n> > +     }\n> > +     LOG(RPiAgc, Debug) << \"Final gain \" << gain << \" (target_Y \" << targetY << \" ev \"\n> > +                        << 
status_.ev << \" base_ev \" << config_.baseEv\n> > +                        << \")\";\n> > +}\n> > +\n> > +void AgcChannel::computeTargetExposure(double gain)\n> > +{\n> > +     if (status_.fixedShutter && status_.fixedAnalogueGain) {\n> > +             /*\n> > +              * When ag and shutter are both fixed, we need to drive the\n> > +              * total exposure so that we end up with a digital gain of at least\n> > +              * 1/minColourGain. Otherwise we'd desaturate channels causing\n> > +              * white to go cyan or magenta.\n> > +              */\n> > +             double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> > +             ASSERT(minColourGain != 0.0);\n> > +             target_.totalExposure =\n> > +                     status_.fixedShutter * status_.fixedAnalogueGain / minColourGain;\n> > +     } else {\n> > +             /*\n> > +              * The statistics reflect the image without digital gain, so the final\n> > +              * total exposure we're aiming for is:\n> > +              */\n> > +             target_.totalExposure = current_.totalExposureNoDG * gain;\n> > +             /* The final target exposure is also limited to what the exposure mode allows. */\n> > +             Duration maxShutter = status_.fixedShutter\n> > +                                           ? status_.fixedShutter\n> > +                                           : exposureMode_->shutter.back();\n> > +             maxShutter = limitShutter(maxShutter);\n> > +             Duration maxTotalExposure =\n> > +                     maxShutter *\n> > +                     (status_.fixedAnalogueGain != 0.0\n> > +                              ? 
status_.fixedAnalogueGain\n> > +                              : exposureMode_->gain.back());\n> > +             target_.totalExposure = std::min(target_.totalExposure, maxTotalExposure);\n> > +     }\n> > +     LOG(RPiAgc, Debug) << \"Target totalExposure \" << target_.totalExposure;\n> > +}\n> > +\n> > +bool AgcChannel::applyDigitalGain(double gain, double targetY)\n> > +{\n> > +     double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });\n> > +     ASSERT(minColourGain != 0.0);\n> > +     double dg = 1.0 / minColourGain;\n> > +     /*\n> > +      * I think this pipeline subtracts black level and rescales before we\n> > +      * get the stats, so no need to worry about it.\n> > +      */\n> > +     LOG(RPiAgc, Debug) << \"after AWB, target dg \" << dg << \" gain \" << gain\n> > +                        << \" target_Y \" << targetY;\n> > +     /*\n> > +      * Finally, if we're trying to reduce exposure but the target_Y is\n> > +      * \"close\" to 1.0, then the gain computed for that constraint will be\n> > +      * only slightly less than one, because the measured Y can never be\n> > +      * larger than 1.0. When this happens, demand a large digital gain so\n> > +      * that the exposure can be reduced, de-saturating the image much more\n> > +      * quickly (and we then approach the correct value more quickly from\n> > +      * below).\n> > +      */\n> > +     bool desaturate = targetY > config_.fastReduceThreshold &&\n> > +                       gain < sqrt(targetY);\n> > +     if (desaturate)\n> > +             dg /= config_.fastReduceThreshold;\n> > +     LOG(RPiAgc, Debug) << \"Digital gain \" << dg << \" desaturate? 
\" << desaturate;\n> > +     filtered_.totalExposureNoDG = filtered_.totalExposure / dg;\n> > +     LOG(RPiAgc, Debug) << \"Target totalExposureNoDG \" << filtered_.totalExposureNoDG;\n> > +     return desaturate;\n> > +}\n> > +\n> > +void AgcChannel::filterExposure()\n> > +{\n> > +     double speed = config_.speed;\n> > +     /*\n> > +      * AGC adapts instantly if both shutter and gain are directly specified\n> > +      * or we're in the startup phase.\n> > +      */\n> > +     if ((status_.fixedShutter && status_.fixedAnalogueGain) ||\n> > +         frameCount_ <= config_.startupFrames)\n> > +             speed = 1.0;\n> > +     if (!filtered_.totalExposure) {\n> > +             filtered_.totalExposure = target_.totalExposure;\n> > +     } else {\n> > +             /*\n> > +              * If close to the result go faster, to save making so many\n> > +              * micro-adjustments on the way. (Make this customisable?)\n> > +              */\n> > +             if (filtered_.totalExposure < 1.2 * target_.totalExposure &&\n> > +                 filtered_.totalExposure > 0.8 * target_.totalExposure)\n> > +                     speed = sqrt(speed);\n> > +             filtered_.totalExposure = speed * target_.totalExposure +\n> > +                                       filtered_.totalExposure * (1.0 - speed);\n> > +     }\n> > +     LOG(RPiAgc, Debug) << \"After filtering, totalExposure \" << filtered_.totalExposure\n> > +                        << \" no dg \" << filtered_.totalExposureNoDG;\n> > +}\n> > +\n> > +void AgcChannel::divideUpExposure()\n> > +{\n> > +     /*\n> > +      * Sending the fixed shutter/gain cases through the same code may seem\n> > +      * unnecessary, but it will make more sense when extend this to cover\n> > +      * variable aperture.\n> > +      */\n> > +     Duration exposureValue = filtered_.totalExposureNoDG;\n> > +     Duration shutterTime;\n> > +     double analogueGain;\n> > +     shutterTime = status_.fixedShutter ? 
status_.fixedShutter\n> > +                                        : exposureMode_->shutter[0];\n> > +     shutterTime = limitShutter(shutterTime);\n> > +     analogueGain = status_.fixedAnalogueGain != 0.0 ? status_.fixedAnalogueGain\n> > +                                                     : exposureMode_->gain[0];\n> > +     analogueGain = limitGain(analogueGain);\n> > +     if (shutterTime * analogueGain < exposureValue) {\n> > +             for (unsigned int stage = 1;\n> > +                  stage < exposureMode_->gain.size(); stage++) {\n> > +                     if (!status_.fixedShutter) {\n> > +                             Duration stageShutter =\n> > +                                     limitShutter(exposureMode_->shutter[stage]);\n> > +                             if (stageShutter * analogueGain >= exposureValue) {\n> > +                                     shutterTime = exposureValue / analogueGain;\n> > +                                     break;\n> > +                             }\n> > +                             shutterTime = stageShutter;\n> > +                     }\n> > +                     if (status_.fixedAnalogueGain == 0.0) {\n> > +                             if (exposureMode_->gain[stage] * shutterTime >= exposureValue) {\n> > +                                     analogueGain = exposureValue / shutterTime;\n> > +                                     break;\n> > +                             }\n> > +                             analogueGain = exposureMode_->gain[stage];\n> > +                             analogueGain = limitGain(analogueGain);\n> > +                     }\n> > +             }\n> > +     }\n> > +     LOG(RPiAgc, Debug) << \"Divided up shutter and gain are \" << shutterTime << \" and \"\n> > +                        << analogueGain;\n> > +     /*\n> > +      * Finally adjust shutter time for flicker avoidance (require both\n> > +      * shutter and gain not to be fixed).\n> > +      */\n> > +     if 
(!status_.fixedShutter && !status_.fixedAnalogueGain &&\n> > +         status_.flickerPeriod) {\n> > +             int flickerPeriods = shutterTime / status_.flickerPeriod;\n> > +             if (flickerPeriods) {\n> > +                     Duration newShutterTime = flickerPeriods * status_.flickerPeriod;\n> > +                     analogueGain *= shutterTime / newShutterTime;\n> > +                     /*\n> > +                      * We should still not allow the ag to go over the\n> > +                      * largest value in the exposure mode. Note that this\n> > +                      * may force more of the total exposure into the digital\n> > +                      * gain as a side-effect.\n> > +                      */\n> > +                     analogueGain = std::min(analogueGain, exposureMode_->gain.back());\n> > +                     analogueGain = limitGain(analogueGain);\n> > +                     shutterTime = newShutterTime;\n> > +             }\n> > +             LOG(RPiAgc, Debug) << \"After flicker avoidance, shutter \"\n> > +                                << shutterTime << \" gain \" << analogueGain;\n> > +     }\n> > +     filtered_.shutter = shutterTime;\n> > +     filtered_.analogueGain = analogueGain;\n> > +}\n> > +\n> > +void AgcChannel::writeAndFinish(Metadata *imageMetadata, bool desaturate)\n> > +{\n> > +     status_.totalExposureValue = filtered_.totalExposure;\n> > +     status_.targetExposureValue = desaturate ? 
0s : target_.totalExposureNoDG;\n> > +     status_.shutterTime = filtered_.shutter;\n> > +     status_.analogueGain = filtered_.analogueGain;\n> > +     /*\n> > +      * Write to metadata as well, in case anyone wants to update the camera\n> > +      * immediately.\n> > +      */\n> > +     imageMetadata->set(\"agc.status\", status_);\n> > +     LOG(RPiAgc, Debug) << \"Output written, total exposure requested is \"\n> > +                        << filtered_.totalExposure;\n> > +     LOG(RPiAgc, Debug) << \"Camera exposure update: shutter time \" << filtered_.shutter\n> > +                        << \" analogue gain \" << filtered_.analogueGain;\n> > +}\n> > +\n> > +Duration AgcChannel::limitShutter(Duration shutter)\n> > +{\n> > +     /*\n> > +      * shutter == 0 is a special case for fixed shutter values, and must pass\n> > +      * through unchanged\n> > +      */\n> > +     if (!shutter)\n> > +             return shutter;\n> > +\n> > +     shutter = std::clamp(shutter, mode_.minShutter, maxShutter_);\n> > +     return shutter;\n> > +}\n> > +\n> > +double AgcChannel::limitGain(double gain) const\n> > +{\n> > +     /*\n> > +      * Only limit the lower bounds of the gain value to what the sensor limits.\n> > +      * The upper bound on analogue gain will be made up with additional digital\n> > +      * gain applied by the ISP.\n> > +      *\n> > +      * gain == 0.0 is a special case for fixed shutter values, and must pass\n> > +      * through unchanged\n> > +      */\n> > +     if (!gain)\n> > +             return gain;\n> > +\n> > +     gain = std::max(gain, mode_.minAnalogueGain);\n> > +     return gain;\n> > +}\n> > diff --git a/src/ipa/rpi/controller/rpi/agc_channel.h b/src/ipa/rpi/controller/rpi/agc_channel.h\n> > new file mode 100644\n> > index 00000000..d5a5cf3a\n> > --- /dev/null\n> > +++ b/src/ipa/rpi/controller/rpi/agc_channel.h\n> > @@ -0,0 +1,137 @@\n> > +/* SPDX-License-Identifier: BSD-2-Clause */\n> > +/*\n> > + * Copyright (C) 2023, Raspberry Pi 
Ltd\n> > + *\n> > + * agc.h - AGC/AEC control algorithm\n>\n> And agc_channel.h\n\nAnd again.\n\nThanks for doing all this reviewing!\n\nBest regards\nDavid\n\n>\n> Sorry for missing these in the previous review\n>\n> The rest has been clarified in the previous review and the code is\n> just mostly moved around\n>\n> Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>\n>\n> Thanks\n>   j\n>\n>\n> > + */\n> > +#pragma once\n> > +\n> > +#include <map>\n> > +#include <string>\n> > +#include <vector>\n> > +\n> > +#include <libcamera/base/utils.h>\n> > +\n> > +#include \"../agc_status.h\"\n> > +#include \"../awb_status.h\"\n> > +#include \"../controller.h\"\n> > +#include \"../pwl.h\"\n> > +\n> > +/* This is our implementation of AGC. */\n> > +\n> > +namespace RPiController {\n> > +\n> > +struct AgcMeteringMode {\n> > +     std::vector<double> weights;\n> > +     int read(const libcamera::YamlObject &params);\n> > +};\n> > +\n> > +struct AgcExposureMode {\n> > +     std::vector<libcamera::utils::Duration> shutter;\n> > +     std::vector<double> gain;\n> > +     int read(const libcamera::YamlObject &params);\n> > +};\n> > +\n> > +struct AgcConstraint {\n> > +     enum class Bound { LOWER = 0,\n> > +                        UPPER = 1 };\n> > +     Bound bound;\n> > +     double qLo;\n> > +     double qHi;\n> > +     Pwl yTarget;\n> > +     int read(const libcamera::YamlObject &params);\n> > +};\n> > +\n> > +typedef std::vector<AgcConstraint> AgcConstraintMode;\n> > +\n> > +struct AgcConfig {\n> > +     int read(const libcamera::YamlObject &params);\n> > +     std::map<std::string, AgcMeteringMode> meteringModes;\n> > +     std::map<std::string, AgcExposureMode> exposureModes;\n> > +     std::map<std::string, AgcConstraintMode> constraintModes;\n> > +     Pwl yTarget;\n> > +     double speed;\n> > +     uint16_t startupFrames;\n> > +     unsigned int convergenceFrames;\n> > +     double maxChange;\n> > +     double minChange;\n> > +     double fastReduceThreshold;\n> 
> +     double speedUpThreshold;\n> > +     std::string defaultMeteringMode;\n> > +     std::string defaultExposureMode;\n> > +     std::string defaultConstraintMode;\n> > +     double baseEv;\n> > +     libcamera::utils::Duration defaultExposureTime;\n> > +     double defaultAnalogueGain;\n> > +};\n> > +\n> > +class AgcChannel\n> > +{\n> > +public:\n> > +     AgcChannel();\n> > +     int read(const libcamera::YamlObject &params,\n> > +              const Controller::HardwareConfig &hardwareConfig);\n> > +     unsigned int getConvergenceFrames() const;\n> > +     std::vector<double> const &getWeights() const;\n> > +     void setEv(double ev);\n> > +     void setFlickerPeriod(libcamera::utils::Duration flickerPeriod);\n> > +     void setMaxShutter(libcamera::utils::Duration maxShutter);\n> > +     void setFixedShutter(libcamera::utils::Duration fixedShutter);\n> > +     void setFixedAnalogueGain(double fixedAnalogueGain);\n> > +     void setMeteringMode(std::string const &meteringModeName);\n> > +     void setExposureMode(std::string const &exposureModeName);\n> > +     void setConstraintMode(std::string const &constraintModeName);\n> > +     void enableAuto();\n> > +     void disableAuto();\n> > +     void switchMode(CameraMode const &cameraMode, Metadata *metadata);\n> > +     void prepare(Metadata *imageMetadata);\n> > +     void process(StatisticsPtr &stats, Metadata *imageMetadata);\n> > +\n> > +private:\n> > +     bool updateLockStatus(DeviceStatus const &deviceStatus);\n> > +     AgcConfig config_;\n> > +     void housekeepConfig();\n> > +     void fetchCurrentExposure(Metadata *imageMetadata);\n> > +     void fetchAwbStatus(Metadata *imageMetadata);\n> > +     void computeGain(StatisticsPtr &statistics, Metadata *imageMetadata,\n> > +                      double &gain, double &targetY);\n> > +     void computeTargetExposure(double gain);\n> > +     void filterExposure();\n> > +     bool applyDigitalGain(double gain, double targetY);\n> > +     void 
divideUpExposure();\n> > +     void writeAndFinish(Metadata *imageMetadata, bool desaturate);\n> > +     libcamera::utils::Duration limitShutter(libcamera::utils::Duration shutter);\n> > +     double limitGain(double gain) const;\n> > +     AgcMeteringMode *meteringMode_;\n> > +     AgcExposureMode *exposureMode_;\n> > +     AgcConstraintMode *constraintMode_;\n> > +     CameraMode mode_;\n> > +     uint64_t frameCount_;\n> > +     AwbStatus awb_;\n> > +     struct ExposureValues {\n> > +             ExposureValues();\n> > +\n> > +             libcamera::utils::Duration shutter;\n> > +             double analogueGain;\n> > +             libcamera::utils::Duration totalExposure;\n> > +             libcamera::utils::Duration totalExposureNoDG; /* without digital gain */\n> > +     };\n> > +     ExposureValues current_; /* values for the current frame */\n> > +     ExposureValues target_; /* calculate the values we want here */\n> > +     ExposureValues filtered_; /* these values are filtered towards target */\n> > +     AgcStatus status_;\n> > +     int lockCount_;\n> > +     DeviceStatus lastDeviceStatus_;\n> > +     libcamera::utils::Duration lastTargetExposure_;\n> > +     /* Below here the \"settings\" that applications can change. 
*/\n> > +     std::string meteringModeName_;\n> > +     std::string exposureModeName_;\n> > +     std::string constraintModeName_;\n> > +     double ev_;\n> > +     libcamera::utils::Duration flickerPeriod_;\n> > +     libcamera::utils::Duration maxShutter_;\n> > +     libcamera::utils::Duration fixedShutter_;\n> > +     double fixedAnalogueGain_;\n> > +};\n> > +\n> > +} /* namespace RPiController */\n> > --\n> > 2.30.2\n> >","headers":{"Date":"Fri, 15 Sep 2023 09:08:08 +0100","Message-ID":"<CAHW6GYJozb9vz=BRFiMMdXTNm7pQbk9k03zhNOWRc=Rtx9H1BA@mail.gmail.com>","To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v3 2/5] ipa: rpi: agc: Reorganise code for multi-channel AGC","From":"David Plowman via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org"}}]
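As an editorial aside for readers of the quoted patch: the flicker-avoidance step in AgcChannel::divideUpExposure() is quite compact, so here is a standalone, simplified sketch of the same idea. It uses plain microsecond doubles rather than libcamera::utils::Duration, and the function name avoidFlicker and the maxGain parameter are illustrative, not part of the patch's API.

```cpp
#include <algorithm>
#include <cassert>
#include <utility>

/*
 * Simplified sketch of the flicker-avoidance logic quoted above: the
 * shutter is rounded down to a whole number of flicker periods, and the
 * lost exposure is moved into analogue gain, capped at maxGain (the
 * largest gain the exposure mode allows). Durations are in microseconds
 * here; the real code uses libcamera::utils::Duration. Returns the new
 * (shutter, gain) pair.
 */
std::pair<double, double> avoidFlicker(double shutterUs, double gain,
				       double flickerPeriodUs, double maxGain)
{
	int periods = static_cast<int>(shutterUs / flickerPeriodUs);
	/* A shutter shorter than one flicker period passes through unchanged. */
	if (!periods)
		return { shutterUs, gain };
	double newShutterUs = periods * flickerPeriodUs;
	/* Keep total exposure constant: gain rises as the shutter shrinks. */
	gain = std::min(gain * shutterUs / newShutterUs, maxGain);
	return { newShutterUs, gain };
}
```

For example, a 12.5 ms shutter under 50 Hz mains flicker (10 ms period) becomes 10 ms with the gain scaled by 1.25; when the gain cap bites, the shortfall ends up in digital gain instead, which is exactly the side-effect the patch's comment warns about.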