{"id":15947,"url":"https://patchwork.libcamera.org/api/patches/15947/?format=json","web_url":"https://patchwork.libcamera.org/patch/15947/","project":{"id":1,"url":"https://patchwork.libcamera.org/api/projects/1/?format=json","name":"libcamera","link_name":"libcamera","list_id":"libcamera_core","list_email":"libcamera-devel@lists.libcamera.org","web_url":"","scm_url":"","webscm_url":""},"msgid":"<20220518131030.421225-2-umang.jain@ideasonboard.com>","date":"2022-05-18T13:10:28","name":"[libcamera-devel,v4,1/3] ipa: ipu3: Rework IPAFrameContext","commit_ref":"bab437df1fb02046fc8dfd4bb5457e0b60ce3213","pull_url":null,"state":"accepted","archived":false,"hash":"65eac61bec7755882d65f78f2376f1224992af8f","submitter":{"id":86,"url":"https://patchwork.libcamera.org/api/people/86/?format=json","name":"Umang Jain","email":"umang.jain@ideasonboard.com"},"delegate":null,"mbox":"https://patchwork.libcamera.org/patch/15947/mbox/","series":[{"id":3121,"url":"https://patchwork.libcamera.org/api/series/3121/?format=json","web_url":"https://patchwork.libcamera.org/project/libcamera/list/?series=3121","date":"2022-05-18T13:10:27","name":"[libcamera-devel,v4,1/3] ipa: ipu3: Rework IPAFrameContext","version":4,"mbox":"https://patchwork.libcamera.org/series/3121/mbox/"}],"comments":"https://patchwork.libcamera.org/api/patches/15947/comments/","check":"pending","checks":"https://patchwork.libcamera.org/api/patches/15947/checks/","tags":{},"headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id D6469C3256\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 18 May 2022 13:10:43 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id E95BB6565C;\n\tWed, 18 May 
2022 15:10:41 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 85B0A65656\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 18 May 2022 15:10:39 +0200 (CEST)","from perceval.ideasonboard.com (unknown [45.131.31.124])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id BA483E50;\n\tWed, 18 May 2022 15:10:38 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1652879442;\n\tbh=+LeCO4mftC3kxvlv2pt8aAm5z9+wKzVWlqJrEAjT11k=;\n\th=To:Date:In-Reply-To:References:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:\n\tFrom;\n\tb=WVtEg9HTRvhgpXc6UxEc3efF9N9UFztTgyk384iVWzvhCWkUCpXm/WNSG6uZpsagj\n\tw5dFbgJWOb4r9lE/Vnxzm07RhI6vHhPizdYms46M7NOxwvQb1tliLppiU9txoVmxbo\n\tzla5LljNqh5eX+f9QDeHOwBjPJ4/jqQLmaC6EhTFLWqcZWuuTWBSH8iBsEsXZZNBJM\n\tketrjbjhZWhll1OYWKPOvqGd5EyGZy89A2OFuqdY25QQl5dD3rz6min+p0LPBoaVHC\n\tvUMmWhU5P3JsNfb9LHR1uqDvxgRDXJXgNuZ5ubDj6tP4U8N8UD8fsD+u9Aw5TqSFN7\n\tBcKoRxPd2xweg==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1652879439;\n\tbh=+LeCO4mftC3kxvlv2pt8aAm5z9+wKzVWlqJrEAjT11k=;\n\th=From:To:Cc:Subject:Date:In-Reply-To:References:From;\n\tb=bZyPAmwxViHBa2zts4uKDLNgJJwFpM3rFYZjTatknmTo4gC2YStHRjXfJmU3orLx/\n\tHCIu5t96cI0JGxA6ewuXNZzPIr/gAVbe23ZetTSOnQB3RbGJ6J1kpgkl+PQOVhrih6\n\tLQ1XwUGHu4LhefcaA3nIHOlnmm+FxBLUsUQJu3VA="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"bZyPAmwx\"; dkim-atps=neutral","To":"libcamera-devel@lists.libcamera.org","Date":"Wed, 18 May 2022 15:10:28 +0200","Message-Id":"<20220518131030.421225-2-umang.jain@ideasonboard.com>","X-Mailer":"git-send-email 
2.31.1","In-Reply-To":"<20220518131030.421225-1-umang.jain@ideasonboard.com>","References":"<20220518131030.421225-1-umang.jain@ideasonboard.com>","MIME-Version":"1.0","Content-Transfer-Encoding":"8bit","Subject":"[libcamera-devel] [PATCH v4 1/3] ipa: ipu3: Rework IPAFrameContext","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Umang Jain via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Umang Jain <umang.jain@ideasonboard.com>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"},"content":"Currently, IPAFrameContext consolidates the values computed by the\nactive state of the algorithms, along with the values applied on\nthe sensor.\n\nMoving ahead, we want to have a frame context associated with each\nincoming request (or frame to be captured). 
This shouldn't\nnecessarily be tied to \"active state\" of the algorithms, hence:\n\t- Rename current IPAFrameContext -> IPAActiveState\n\t  This will now reflect the latest active state of the\n\t  algorithms and has nothing to do with any frame-related\n\t  ops/values.\n\t- Re-instate IPAFrameContext with a sub-structure 'sensor'\n\t  currently storing the exposure and gain value.\n\nAdapt the various accesses to the frame context to the new changes\nas described above.\n\nSubsequently, the re-instated IPAFrameContext will be extended to\ncontain a frame number and ControlList to remember the incoming\nrequest controls provided by the application. A ring-buffer will\nbe introduced to store these frame contexts for a certain number\nof frames.\n\nSigned-off-by: Umang Jain <umang.jain@ideasonboard.com>\nReviewed-by: Jacopo Mondi <jacopo@jmondi.org>\nReviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>\nReviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n---\n src/ipa/ipu3/algorithms/af.cpp           | 42 +++++------\n src/ipa/ipu3/algorithms/agc.cpp          | 21 +++---\n src/ipa/ipu3/algorithms/agc.h            |  2 +-\n src/ipa/ipu3/algorithms/awb.cpp          | 16 ++---\n src/ipa/ipu3/algorithms/tone_mapping.cpp | 10 +--\n src/ipa/ipu3/ipa_context.cpp             | 88 ++++++++++++++----------\n src/ipa/ipu3/ipa_context.h               | 16 +++--\n src/ipa/ipu3/ipu3.cpp                    | 11 +--\n 8 files changed, 112 insertions(+), 94 deletions(-)","diff":"diff --git a/src/ipa/ipu3/algorithms/af.cpp b/src/ipa/ipu3/algorithms/af.cpp\nindex f700b01f..8a5a6b1a 100644\n--- a/src/ipa/ipu3/algorithms/af.cpp\n+++ b/src/ipa/ipu3/algorithms/af.cpp\n@@ -185,11 +185,11 @@ int Af::configure(IPAContext &context, const IPAConfigInfo &configInfo)\n \tafIgnoreFrameReset();\n \n \t/* Initial focus value */\n-\tcontext.frameContext.af.focus = 0;\n+\tcontext.activeState.af.focus = 0;\n \t/* Maximum variance of the AF statistics 
*/\n-\tcontext.frameContext.af.maxVariance = 0;\n+\tcontext.activeState.af.maxVariance = 0;\n \t/* The stable AF value flag. if it is true, the AF should be in a stable state. */\n-\tcontext.frameContext.af.stable = false;\n+\tcontext.activeState.af.stable = false;\n \n \treturn 0;\n }\n@@ -211,10 +211,10 @@ void Af::afCoarseScan(IPAContext &context)\n \n \tif (afScan(context, kCoarseSearchStep)) {\n \t\tcoarseCompleted_ = true;\n-\t\tcontext.frameContext.af.maxVariance = 0;\n-\t\tfocus_ = context.frameContext.af.focus -\n-\t\t\t (context.frameContext.af.focus * kFineRange);\n-\t\tcontext.frameContext.af.focus = focus_;\n+\t\tcontext.activeState.af.maxVariance = 0;\n+\t\tfocus_ = context.activeState.af.focus -\n+\t\t\t (context.activeState.af.focus * kFineRange);\n+\t\tcontext.activeState.af.focus = focus_;\n \t\tpreviousVariance_ = 0;\n \t\tmaxStep_ = std::clamp(focus_ + static_cast<uint32_t>((focus_ * kFineRange)),\n \t\t\t\t      0U, kMaxFocusSteps);\n@@ -237,7 +237,7 @@ void Af::afFineScan(IPAContext &context)\n \t\treturn;\n \n \tif (afScan(context, kFineSearchStep)) {\n-\t\tcontext.frameContext.af.stable = true;\n+\t\tcontext.activeState.af.stable = true;\n \t\tfineCompleted_ = true;\n \t}\n }\n@@ -254,10 +254,10 @@ void Af::afReset(IPAContext &context)\n \tif (afNeedIgnoreFrame())\n \t\treturn;\n \n-\tcontext.frameContext.af.maxVariance = 0;\n-\tcontext.frameContext.af.focus = 0;\n+\tcontext.activeState.af.maxVariance = 0;\n+\tcontext.activeState.af.focus = 0;\n \tfocus_ = 0;\n-\tcontext.frameContext.af.stable = false;\n+\tcontext.activeState.af.stable = false;\n \tignoreCounter_ = kIgnoreFrame;\n \tpreviousVariance_ = 0.0;\n \tcoarseCompleted_ = false;\n@@ -280,7 +280,7 @@ bool Af::afScan(IPAContext &context, int min_step)\n {\n \tif (focus_ > maxStep_) {\n \t\t/* If reach the max step, move lens to the position. 
*/\n-\t\tcontext.frameContext.af.focus = bestFocus_;\n+\t\tcontext.activeState.af.focus = bestFocus_;\n \t\treturn true;\n \t} else {\n \t\t/*\n@@ -288,8 +288,8 @@ bool Af::afScan(IPAContext &context, int min_step)\n \t\t * derivative. If the direction changes, it means we have\n \t\t * passed a maximum one step before.\n \t\t */\n-\t\tif ((currentVariance_ - context.frameContext.af.maxVariance) >=\n-\t\t    -(context.frameContext.af.maxVariance * 0.1)) {\n+\t\tif ((currentVariance_ - context.activeState.af.maxVariance) >=\n+\t\t    -(context.activeState.af.maxVariance * 0.1)) {\n \t\t\t/*\n \t\t\t * Positive and zero derivative:\n \t\t\t * The variance is still increasing. The focus could be\n@@ -298,8 +298,8 @@ bool Af::afScan(IPAContext &context, int min_step)\n \t\t\t */\n \t\t\tbestFocus_ = focus_;\n \t\t\tfocus_ += min_step;\n-\t\t\tcontext.frameContext.af.focus = focus_;\n-\t\t\tcontext.frameContext.af.maxVariance = currentVariance_;\n+\t\t\tcontext.activeState.af.focus = focus_;\n+\t\t\tcontext.activeState.af.maxVariance = currentVariance_;\n \t\t} else {\n \t\t\t/*\n \t\t\t * Negative derivative:\n@@ -307,7 +307,7 @@ bool Af::afScan(IPAContext &context, int min_step)\n \t\t\t * variance is found. 
Set focus step to previous good one\n \t\t\t * then return immediately.\n \t\t\t */\n-\t\t\tcontext.frameContext.af.focus = bestFocus_;\n+\t\t\tcontext.activeState.af.focus = bestFocus_;\n \t\t\treturn true;\n \t\t}\n \t}\n@@ -389,13 +389,13 @@ double Af::afEstimateVariance(Span<const y_table_item_t> y_items, bool isY1)\n bool Af::afIsOutOfFocus(IPAContext context)\n {\n \tconst uint32_t diff_var = std::abs(currentVariance_ -\n-\t\t\t\t\t   context.frameContext.af.maxVariance);\n-\tconst double var_ratio = diff_var / context.frameContext.af.maxVariance;\n+\t\t\t\t\t   context.activeState.af.maxVariance);\n+\tconst double var_ratio = diff_var / context.activeState.af.maxVariance;\n \n \tLOG(IPU3Af, Debug) << \"Variance change rate: \"\n \t\t\t   << var_ratio\n \t\t\t   << \" Current VCM step: \"\n-\t\t\t   << context.frameContext.af.focus;\n+\t\t\t   << context.activeState.af.focus;\n \n \tif (var_ratio > kMaxChange)\n \t\treturn true;\n@@ -437,7 +437,7 @@ void Af::process(IPAContext &context, const ipu3_uapi_stats_3a *stats)\n \t */\n \tcurrentVariance_ = afEstimateVariance(y_items, !coarseCompleted_);\n \n-\tif (!context.frameContext.af.stable) {\n+\tif (!context.activeState.af.stable) {\n \t\tafCoarseScan(context);\n \t\tafFineScan(context);\n \t} else {\ndiff --git a/src/ipa/ipu3/algorithms/agc.cpp b/src/ipa/ipu3/algorithms/agc.cpp\nindex 7d4b3503..fdeec09d 100644\n--- a/src/ipa/ipu3/algorithms/agc.cpp\n+++ b/src/ipa/ipu3/algorithms/agc.cpp\n@@ -87,7 +87,7 @@ int Agc::configure(IPAContext &context,\n \t\t   [[maybe_unused]] const IPAConfigInfo &configInfo)\n {\n \tconst IPASessionConfiguration &configuration = context.configuration;\n-\tIPAFrameContext &frameContext = context.frameContext;\n+\tIPAActiveState &activeState = context.activeState;\n \n \tstride_ = configuration.grid.stride;\n \n@@ -99,8 +99,8 @@ int Agc::configure(IPAContext &context,\n \tmaxAnalogueGain_ = std::min(configuration.agc.maxAnalogueGain, kMaxAnalogueGain);\n \n \t/* Configure the 
default exposure and gain. */\n-\tframeContext.agc.gain = std::max(minAnalogueGain_, kMinAnalogueGain);\n-\tframeContext.agc.exposure = 10ms / configuration.sensor.lineDuration;\n+\tactiveState.agc.gain = std::max(minAnalogueGain_, kMinAnalogueGain);\n+\tactiveState.agc.exposure = 10ms / configuration.sensor.lineDuration;\n \n \tframeCount_ = 0;\n \treturn 0;\n@@ -249,9 +249,10 @@ void Agc::computeExposure(IPAContext &context, double yGain,\n \t\t\t    << shutterTime << \" and \"\n \t\t\t    << stepGain;\n \n+\tIPAActiveState &activeState = context.activeState;\n \t/* Update the estimated exposure and gain. */\n-\tframeContext.agc.exposure = shutterTime / configuration.sensor.lineDuration;\n-\tframeContext.agc.gain = stepGain;\n+\tactiveState.agc.exposure = shutterTime / configuration.sensor.lineDuration;\n+\tactiveState.agc.gain = stepGain;\n }\n \n /**\n@@ -279,7 +280,7 @@ void Agc::computeExposure(IPAContext &context, double yGain,\n  * More detailed information can be found in:\n  * https://en.wikipedia.org/wiki/Relative_luminance\n  */\n-double Agc::estimateLuminance(IPAFrameContext &frameContext,\n+double Agc::estimateLuminance(IPAActiveState &activeState,\n \t\t\t      const ipu3_uapi_grid_config &grid,\n \t\t\t      const ipu3_uapi_stats_3a *stats,\n \t\t\t      double gain)\n@@ -307,9 +308,9 @@ double Agc::estimateLuminance(IPAFrameContext &frameContext,\n \t * Apply the AWB gains to approximate colours correctly, use the Rec.\n \t * 601 formula to calculate the relative luminance, and normalize it.\n \t */\n-\tdouble ySum = redSum * frameContext.awb.gains.red * 0.299\n-\t\t    + greenSum * frameContext.awb.gains.green * 0.587\n-\t\t    + blueSum * frameContext.awb.gains.blue * 0.114;\n+\tdouble ySum = redSum * activeState.awb.gains.red * 0.299\n+\t\t    + greenSum * activeState.awb.gains.green * 0.587\n+\t\t    + blueSum * activeState.awb.gains.blue * 0.114;\n \n \treturn ySum / (grid.height * grid.width) / 255;\n }\n@@ -344,7 +345,7 @@ void 
Agc::process(IPAContext &context, const ipu3_uapi_stats_3a *stats)\n \tdouble yTarget = kRelativeLuminanceTarget;\n \n \tfor (unsigned int i = 0; i < 8; i++) {\n-\t\tdouble yValue = estimateLuminance(context.frameContext,\n+\t\tdouble yValue = estimateLuminance(context.activeState,\n \t\t\t\t\t\t  context.configuration.grid.bdsGrid,\n \t\t\t\t\t\t  stats, yGain);\n \t\tdouble extraGain = std::min(10.0, yTarget / (yValue + .001));\ndiff --git a/src/ipa/ipu3/algorithms/agc.h b/src/ipa/ipu3/algorithms/agc.h\nindex ad705605..31420841 100644\n--- a/src/ipa/ipu3/algorithms/agc.h\n+++ b/src/ipa/ipu3/algorithms/agc.h\n@@ -36,7 +36,7 @@ private:\n \tutils::Duration filterExposure(utils::Duration currentExposure);\n \tvoid computeExposure(IPAContext &context, double yGain,\n \t\t\t     double iqMeanGain);\n-\tdouble estimateLuminance(IPAFrameContext &frameContext,\n+\tdouble estimateLuminance(IPAActiveState &activeState,\n \t\t\t\t const ipu3_uapi_grid_config &grid,\n \t\t\t\t const ipu3_uapi_stats_3a *stats,\n \t\t\t\t double gain);\ndiff --git a/src/ipa/ipu3/algorithms/awb.cpp b/src/ipa/ipu3/algorithms/awb.cpp\nindex 87a6cc7a..ab6924eb 100644\n--- a/src/ipa/ipu3/algorithms/awb.cpp\n+++ b/src/ipa/ipu3/algorithms/awb.cpp\n@@ -396,10 +396,10 @@ void Awb::process(IPAContext &context, const ipu3_uapi_stats_3a *stats)\n \t * The results are cached, so if no results were calculated, we set the\n \t * cached values from asyncResults_ here.\n \t */\n-\tcontext.frameContext.awb.gains.blue = asyncResults_.blueGain;\n-\tcontext.frameContext.awb.gains.green = asyncResults_.greenGain;\n-\tcontext.frameContext.awb.gains.red = asyncResults_.redGain;\n-\tcontext.frameContext.awb.temperatureK = asyncResults_.temperatureK;\n+\tcontext.activeState.awb.gains.blue = asyncResults_.blueGain;\n+\tcontext.activeState.awb.gains.green = asyncResults_.greenGain;\n+\tcontext.activeState.awb.gains.red = asyncResults_.redGain;\n+\tcontext.activeState.awb.temperatureK = asyncResults_.temperatureK;\n }\n 
\n constexpr uint16_t Awb::threshold(float value)\n@@ -450,10 +450,10 @@ void Awb::prepare(IPAContext &context, ipu3_uapi_params *params)\n \tparams->acc_param.bnr.opt_center_sqr.y_sqr_reset = params->acc_param.bnr.opt_center.y_reset\n \t\t\t\t\t\t\t* params->acc_param.bnr.opt_center.y_reset;\n \t/* Convert to u3.13 fixed point values */\n-\tparams->acc_param.bnr.wb_gains.gr = 8192 * context.frameContext.awb.gains.green;\n-\tparams->acc_param.bnr.wb_gains.r  = 8192 * context.frameContext.awb.gains.red;\n-\tparams->acc_param.bnr.wb_gains.b  = 8192 * context.frameContext.awb.gains.blue;\n-\tparams->acc_param.bnr.wb_gains.gb = 8192 * context.frameContext.awb.gains.green;\n+\tparams->acc_param.bnr.wb_gains.gr = 8192 * context.activeState.awb.gains.green;\n+\tparams->acc_param.bnr.wb_gains.r  = 8192 * context.activeState.awb.gains.red;\n+\tparams->acc_param.bnr.wb_gains.b  = 8192 * context.activeState.awb.gains.blue;\n+\tparams->acc_param.bnr.wb_gains.gb = 8192 * context.activeState.awb.gains.green;\n \n \tLOG(IPU3Awb, Debug) << \"Color temperature estimated: \" << asyncResults_.temperatureK;\n \ndiff --git a/src/ipa/ipu3/algorithms/tone_mapping.cpp b/src/ipa/ipu3/algorithms/tone_mapping.cpp\nindex 2040eda5..7c78d0d9 100644\n--- a/src/ipa/ipu3/algorithms/tone_mapping.cpp\n+++ b/src/ipa/ipu3/algorithms/tone_mapping.cpp\n@@ -42,7 +42,7 @@ int ToneMapping::configure(IPAContext &context,\n \t\t\t   [[maybe_unused]] const IPAConfigInfo &configInfo)\n {\n \t/* Initialise tone mapping gamma value. */\n-\tcontext.frameContext.toneMapping.gamma = 0.0;\n+\tcontext.activeState.toneMapping.gamma = 0.0;\n \n \treturn 0;\n }\n@@ -60,7 +60,7 @@ void ToneMapping::prepare([[maybe_unused]] IPAContext &context,\n {\n \t/* Copy the calculated LUT into the parameters buffer. 
*/\n \tmemcpy(params->acc_param.gamma.gc_lut.lut,\n-\t       context.frameContext.toneMapping.gammaCorrection.lut,\n+\t       context.activeState.toneMapping.gammaCorrection.lut,\n \t       IPU3_UAPI_GAMMA_CORR_LUT_ENTRIES *\n \t       sizeof(params->acc_param.gamma.gc_lut.lut[0]));\n \n@@ -87,11 +87,11 @@ void ToneMapping::process(IPAContext &context,\n \t */\n \tgamma_ = 1.1;\n \n-\tif (context.frameContext.toneMapping.gamma == gamma_)\n+\tif (context.activeState.toneMapping.gamma == gamma_)\n \t\treturn;\n \n \tstruct ipu3_uapi_gamma_corr_lut &lut =\n-\t\tcontext.frameContext.toneMapping.gammaCorrection;\n+\t\tcontext.activeState.toneMapping.gammaCorrection;\n \n \tfor (uint32_t i = 0; i < std::size(lut.lut); i++) {\n \t\tdouble j = static_cast<double>(i) / (std::size(lut.lut) - 1);\n@@ -101,7 +101,7 @@ void ToneMapping::process(IPAContext &context,\n \t\tlut.lut[i] = gamma * 8191;\n \t}\n \n-\tcontext.frameContext.toneMapping.gamma = gamma_;\n+\tcontext.activeState.toneMapping.gamma = gamma_;\n }\n \n } /* namespace ipa::ipu3::algorithms */\ndiff --git a/src/ipa/ipu3/ipa_context.cpp b/src/ipa/ipu3/ipa_context.cpp\nindex b1570dde..383c2e37 100644\n--- a/src/ipa/ipu3/ipa_context.cpp\n+++ b/src/ipa/ipu3/ipa_context.cpp\n@@ -24,19 +24,31 @@ namespace libcamera::ipa::ipu3 {\n  * may also be updated in the start() operation.\n  */\n \n+/**\n+ * \\struct IPAActiveState\n+ * \\brief The active state of the IPA algorithms\n+ *\n+ * The IPA is fed with the statistics generated from the latest frame captured\n+ * by the hardware. The statistics are then processed by the IPA algorithms to\n+ * compute ISP parameters required for the next frame capture. 
The current state\n+ * of the algorithms is reflected through the IPAActiveState to store the values\n+ * most recently computed by the IPA algorithms.\n+ */\n+\n /**\n  * \\struct IPAFrameContext\n- * \\brief Per-frame context for algorithms\n+ * \\brief Context for a frame\n  *\n  * The frame context stores data specific to a single frame processed by the\n  * IPA. Each frame processed by the IPA has a context associated with it,\n  * accessible through the IPAContext structure.\n  *\n- * \\todo Detail how to access contexts for a particular frame\n- *\n- * Each of the fields in the frame context belongs to either a specific\n- * algorithm, or to the top-level IPA module. A field may be read by any\n- * algorithm, but should only be written by its owner.\n+ * Fields in the frame context should reflect values and controls\n+ * associated with the specific frame as requested by the application, and\n+ * as configured by the hardware. Fields can be read by algorithms to\n+ * determine if they should update any specific action for this frame, and\n+ * finally to update the metadata control lists when the frame is fully\n+ * completed.\n  */\n \n /**\n@@ -49,10 +61,10 @@ namespace libcamera::ipa::ipu3 {\n  * \\var IPAContext::frameContext\n  * \\brief The frame context for the frame being processed\n  *\n- * \\todo While the frame context is supposed to be per-frame, this\n- * single frame context stores data related to both the current frame\n- * and the previous frames, with fields being updated as the algorithms\n- * are run. 
This needs to be turned into real per-frame data storage.\n+ * \\var IPAContext::activeState\n+ * \\brief The current state of IPA algorithms\n+ *\n+ * \\todo The frame context needs to be turned into real per-frame data storage.\n  */\n \n /**\n@@ -78,17 +90,17 @@ namespace libcamera::ipa::ipu3 {\n  */\n \n /**\n- * \\var IPAFrameContext::af\n+ * \\var IPAActiveState::af\n  * \\brief Context for the Automatic Focus algorithm\n  *\n- * \\struct  IPAFrameContext::af\n- * \\var IPAFrameContext::af.focus\n+ * \\struct IPAActiveState::af\n+ * \\var IPAActiveState::af.focus\n  * \\brief Current position of the lens\n  *\n- * \\var IPAFrameContext::af.maxVariance\n+ * \\var IPAActiveState::af.maxVariance\n  * \\brief The maximum variance of the current image.\n  *\n- * \\var IPAFrameContext::af.stable\n+ * \\var IPAActiveState::af.stable\n  * \\brief It is set to true, if the best focus is found.\n  */\n \n@@ -121,64 +133,64 @@ namespace libcamera::ipa::ipu3 {\n  */\n \n /**\n- * \\var IPAFrameContext::agc\n+ * \\var IPAActiveState::agc\n  * \\brief Context for the Automatic Gain Control algorithm\n  *\n  * The exposure and gain determined are expected to be applied to the sensor\n  * at the earliest opportunity.\n  *\n- * \\var IPAFrameContext::agc.exposure\n+ * \\var IPAActiveState::agc.exposure\n  * \\brief Exposure time expressed as a number of lines\n  *\n- * \\var IPAFrameContext::agc.gain\n+ * \\var IPAActiveState::agc.gain\n  * \\brief Analogue gain multiplier\n  *\n  * The gain should be adapted to the sensor specific gain code before applying.\n  */\n \n /**\n- * \\var IPAFrameContext::awb\n+ * \\var IPAActiveState::awb\n  * \\brief Context for the Automatic White Balance algorithm\n  *\n- * \\struct IPAFrameContext::awb.gains\n+ * \\struct IPAActiveState::awb.gains\n  * \\brief White balance gains\n  *\n- * \\var IPAFrameContext::awb.gains.red\n+ * \\var IPAActiveState::awb.gains.red\n  * \\brief White balance gain for R channel\n  *\n- * \\var 
IPAFrameContext::awb.gains.green\n+ * \\var IPAActiveState::awb.gains.green\n  * \\brief White balance gain for G channel\n  *\n- * \\var IPAFrameContext::awb.gains.blue\n+ * \\var IPAActiveState::awb.gains.blue\n  * \\brief White balance gain for B channel\n  *\n- * \\var IPAFrameContext::awb.temperatureK\n+ * \\var IPAActiveState::awb.temperatureK\n  * \\brief Estimated color temperature\n  */\n \n /**\n- * \\var IPAFrameContext::sensor\n- * \\brief Effective sensor values\n- *\n- * \\var IPAFrameContext::sensor.exposure\n- * \\brief Exposure time expressed as a number of lines\n- *\n- * \\var IPAFrameContext::sensor.gain\n- * \\brief Analogue gain multiplier\n- */\n-\n-/**\n- * \\var IPAFrameContext::toneMapping\n+ * \\var IPAActiveState::toneMapping\n  * \\brief Context for ToneMapping and Gamma control\n  *\n- * \\var IPAFrameContext::toneMapping.gamma\n+ * \\var IPAActiveState::toneMapping.gamma\n  * \\brief Gamma value for the LUT\n  *\n- * \\var IPAFrameContext::toneMapping.gammaCorrection\n+ * \\var IPAActiveState::toneMapping.gammaCorrection\n  * \\brief Per-pixel tone mapping implemented as a LUT\n  *\n  * The LUT structure is defined by the IPU3 kernel interface. 
See\n  * <linux/intel-ipu3.h> struct ipu3_uapi_gamma_corr_lut for further details.\n  */\n \n+/**\n+ * \\var IPAFrameContext::sensor\n+ * \\brief Effective sensor values that were applied for the frame\n+ *\n+ * \\var IPAFrameContext::sensor.exposure\n+ * \\brief Exposure time expressed as a number of lines\n+ *\n+ * \\var IPAFrameContext::sensor.gain\n+ * \\brief Analogue gain multiplier\n+ */\n+\n } /* namespace libcamera::ipa::ipu3 */\ndiff --git a/src/ipa/ipu3/ipa_context.h b/src/ipa/ipu3/ipa_context.h\nindex 103498ef..8d681131 100644\n--- a/src/ipa/ipu3/ipa_context.h\n+++ b/src/ipa/ipu3/ipa_context.h\n@@ -42,7 +42,7 @@ struct IPASessionConfiguration {\n \t} sensor;\n };\n \n-struct IPAFrameContext {\n+struct IPAActiveState {\n \tstruct {\n \t\tuint32_t focus;\n \t\tdouble maxVariance;\n@@ -64,19 +64,23 @@ struct IPAFrameContext {\n \t\tdouble temperatureK;\n \t} awb;\n \n-\tstruct {\n-\t\tuint32_t exposure;\n-\t\tdouble gain;\n-\t} sensor;\n-\n \tstruct {\n \t\tdouble gamma;\n \t\tstruct ipu3_uapi_gamma_corr_lut gammaCorrection;\n \t} toneMapping;\n };\n \n+struct IPAFrameContext {\n+\tstruct {\n+\t\tuint32_t exposure;\n+\t\tdouble gain;\n+\t} sensor;\n+};\n+\n struct IPAContext {\n \tIPASessionConfiguration configuration;\n+\tIPAActiveState activeState;\n+\n \tIPAFrameContext frameContext;\n };\n \ndiff --git a/src/ipa/ipu3/ipu3.cpp b/src/ipa/ipu3/ipu3.cpp\nindex dd6cfd79..3b4fc911 100644\n--- a/src/ipa/ipu3/ipu3.cpp\n+++ b/src/ipa/ipu3/ipu3.cpp\n@@ -454,7 +454,8 @@ int IPAIPU3::configure(const IPAConfigInfo &configInfo,\n \n \tcalculateBdsGrid(configInfo.bdsOutputSize);\n \n-\t/* Clean frameContext at each reconfiguration. */\n+\t/* Clean IPAActiveState at each reconfiguration. 
*/\n+\tcontext_.activeState = {};\n \tcontext_.frameContext = {};\n \n \tif (!validateSensorControls()) {\n@@ -585,7 +586,7 @@ void IPAIPU3::processStatsBuffer(const uint32_t frame,\n \n \tctrls.set(controls::AnalogueGain, context_.frameContext.sensor.gain);\n \n-\tctrls.set(controls::ColourTemperature, context_.frameContext.awb.temperatureK);\n+\tctrls.set(controls::ColourTemperature, context_.activeState.awb.temperatureK);\n \n \tctrls.set(controls::ExposureTime, context_.frameContext.sensor.exposure * lineDuration);\n \n@@ -623,8 +624,8 @@ void IPAIPU3::queueRequest([[maybe_unused]] const uint32_t frame,\n  */\n void IPAIPU3::setControls(unsigned int frame)\n {\n-\tint32_t exposure = context_.frameContext.agc.exposure;\n-\tint32_t gain = camHelper_->gainCode(context_.frameContext.agc.gain);\n+\tint32_t exposure = context_.activeState.agc.exposure;\n+\tint32_t gain = camHelper_->gainCode(context_.activeState.agc.gain);\n \n \tControlList ctrls(sensorCtrls_);\n \tctrls.set(V4L2_CID_EXPOSURE, exposure);\n@@ -632,7 +633,7 @@ void IPAIPU3::setControls(unsigned int frame)\n \n \tControlList lensCtrls(lensCtrls_);\n \tlensCtrls.set(V4L2_CID_FOCUS_ABSOLUTE,\n-\t\t      static_cast<int32_t>(context_.frameContext.af.focus));\n+\t\t      static_cast<int32_t>(context_.activeState.af.focus));\n \n \tsetSensorControls.emit(frame, ctrls, lensCtrls);\n }\n","prefixes":["libcamera-devel","v4","1/3"]}