From patchwork Mon Jul 31 09:46:37 2023
X-Patchwork-Submitter: David Plowman
X-Patchwork-Id: 18905
To: libcamera-devel@lists.libcamera.org
Date: Mon, 31 Jul 2023 10:46:37 +0100
Message-Id: <20230731094641.73646-2-david.plowman@raspberrypi.com>
X-Mailer: git-send-email 2.30.2
In-Reply-To: <20230731094641.73646-1-david.plowman@raspberrypi.com>
References: <20230731094641.73646-1-david.plowman@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 1/5] ipa: rpi: histogram: Add interBinMean()
From: David Plowman
Reply-To: David Plowman

From: Naushir Patuck

Add a new helper function Histogram::interBinMean() that essentially
replaces the existing Histogram::interQuantileMean() logic, but operating
on bins instead. Rework interQuantileMean() to call into interBinMean()
with the appropriate conversion from quantiles to bins.

Signed-off-by: Naushir Patuck
Reviewed-by: Kieran Bingham
---
 src/ipa/rpi/controller/histogram.cpp | 22 ++++++++++++++--------
 src/ipa/rpi/controller/histogram.h   |  2 ++
 2 files changed, 16 insertions(+), 8 deletions(-)

diff --git a/src/ipa/rpi/controller/histogram.cpp b/src/ipa/rpi/controller/histogram.cpp
index 16a9207f..0a27ba2c 100644
--- a/src/ipa/rpi/controller/histogram.cpp
+++ b/src/ipa/rpi/controller/histogram.cpp
@@ -45,20 +45,26 @@ double Histogram::quantile(double q, int first, int last) const
 	return first + frac;
 }
 
-double Histogram::interQuantileMean(double qLo, double qHi) const
+double Histogram::interBinMean(double binLo, double binHi) const
 {
-	assert(qHi > qLo);
-	double pLo = quantile(qLo);
-	double pHi = quantile(qHi, (int)pLo);
+	assert(binHi > binLo);
 	double sumBinFreq = 0, cumulFreq = 0;
-	for (double pNext = floor(pLo) + 1.0; pNext <= ceil(pHi);
-	     pLo = pNext, pNext += 1.0) {
-		int bin = floor(pLo);
+	for (double binNext = floor(binLo) + 1.0; binNext <= ceil(binHi);
+	     binLo = binNext, binNext += 1.0) {
+		int bin = floor(binLo);
 		double freq = (cumulative_[bin + 1] - cumulative_[bin]) *
-			      (std::min(pNext, pHi) - pLo);
+			      (std::min(binNext, binHi) - binLo);
 		sumBinFreq += bin * freq;
 		cumulFreq += freq;
 	}
 	/* add 0.5 to give an average for bin mid-points */
 	return sumBinFreq / cumulFreq + 0.5;
 }
+
+double Histogram::interQuantileMean(double qLo, double qHi) const
+{
+	assert(qHi > qLo);
+	double pLo = quantile(qLo);
+	double pHi = quantile(qHi, (int)pLo);
+	return interBinMean(pLo, pHi);
+}
diff --git a/src/ipa/rpi/controller/histogram.h b/src/ipa/rpi/controller/histogram.h
index 6b3e3a9e..e2c5509b 100644
--- a/src/ipa/rpi/controller/histogram.h
+++ b/src/ipa/rpi/controller/histogram.h
@@ -38,6 +38,8 @@ public:
 	uint64_t total() const { return cumulative_[cumulative_.size() - 1]; }
 	/* Cumulative frequency up to a (fractional) point in a bin. */
 	uint64_t cumulativeFreq(double bin) const;
+	/* Return the mean value between two (fractional) bins. */
+	double interBinMean(double binLo, double binHi) const;
 	/*
 	 * Return the (fractional) bin of the point q (0 <= q <= 1) through the
 	 * histogram. Optionally provide limits to help.
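[Editorial aside, not part of the patch: the split introduced above can be exercised with a small standalone sketch. The quantile() below is a simplified stand-in for Histogram::quantile(), and the toy cumulative array is invented for illustration; only interBinMean() mirrors the patch's logic, with interQuantileMean() reduced to a quantile-to-bin conversion followed by a call to interBinMean().]

#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstdint>
#include <iostream>
#include <vector>

/* Fractional bin at which the cumulative frequency reaches quantile q. */
static double quantile(const std::vector<uint64_t> &cumulative, double q)
{
	assert(q >= 0 && q <= 1);
	uint64_t total = cumulative.back();
	double item = q * total;
	/* Find the first bin whose cumulative count reaches "item". */
	size_t bin = 0;
	while (cumulative[bin + 1] < item)
		bin++;
	uint64_t fracInBin = cumulative[bin + 1] - cumulative[bin];
	double frac = fracInBin ? (item - cumulative[bin]) / fracInBin : 0.5;
	return bin + frac;
}

/* Mean bin position between two fractional bins, weighted by bin counts. */
static double interBinMean(const std::vector<uint64_t> &cumulative,
			   double binLo, double binHi)
{
	assert(binHi > binLo);
	double sumBinFreq = 0, cumulFreq = 0;
	for (double binNext = std::floor(binLo) + 1.0; binNext <= std::ceil(binHi);
	     binLo = binNext, binNext += 1.0) {
		int bin = std::floor(binLo);
		double freq = (cumulative[bin + 1] - cumulative[bin]) *
			      (std::min(binNext, binHi) - binLo);
		sumBinFreq += bin * freq;
		cumulFreq += freq;
	}
	/* Add 0.5 to report bin mid-points rather than bin edges. */
	return sumBinFreq / cumulFreq + 0.5;
}

int main()
{
	/* 4-bin histogram with counts { 1, 4, 3, 2 }, stored as cumulative sums. */
	std::vector<uint64_t> cumulative = { 0, 1, 5, 8, 10 };
	double pLo = quantile(cumulative, 0.25);
	double pHi = quantile(cumulative, 0.75);
	/* interQuantileMean() in the patch is now just this composition. */
	std::cout << interBinMean(cumulative, pLo, pHi) << "\n";
	return 0;
}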
From patchwork Mon Jul 31 09:46:38 2023
X-Patchwork-Submitter: David Plowman
X-Patchwork-Id: 18908
Received: from
pi4-davidp.pitowers.org ([2a00:1098:3142:14:2bce:64d6:1a5c:49a2]) by smtp.gmail.com with ESMTPSA id 9-20020a05600c240900b003fa98908014sm13612838wmp.8.2023.07.31.02.46.44 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Mon, 31 Jul 2023 02:46:45 -0700 (PDT) To: libcamera-devel@lists.libcamera.org Date: Mon, 31 Jul 2023 10:46:38 +0100 Message-Id: <20230731094641.73646-3-david.plowman@raspberrypi.com> X-Mailer: git-send-email 2.30.2 In-Reply-To: <20230731094641.73646-1-david.plowman@raspberrypi.com> References: <20230731094641.73646-1-david.plowman@raspberrypi.com> MIME-Version: 1.0 Subject: [libcamera-devel] [PATCH 2/5] ipa: rpi: agc: Reorganise code for multi-channel AGC X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , X-Patchwork-Original-From: David Plowman via libcamera-devel From: David Plowman Reply-To: David Plowman Errors-To: libcamera-devel-bounces@lists.libcamera.org Sender: "libcamera-devel" This commit does the basic reorganisation of the code in order to implement multi-channel AGC. The main changes are: * The previous Agc class (in agc.cpp) has become the AgcChannel class in (agc_channel.cpp). * A new Agc class is introduced which is a wrapper round a number of AgcChannels. * The basic plumbing from ipa_base.cpp to Agc is updated to include a channel number. All the existing controls are hardwired to talk directly to channel 0. There are a couple of limitations which we expect to apply to multi-channel AGC. We're not allowing different frame durations to be applied to the channels, nor are we allowing separate metering modes. To be fair, supporting these things is not impossible, but there are reasons why it may be tricky so they remain "TBD" for now. This patch only includes the basic reorganisation and plumbing. It does not yet update the important methods (switchMode, prepare and process) to implement multi-channel AGC properly. This will appear in a subsequent commit. For now, these functions are hard-coded just to use channel 0, thereby preserving the existing behaviour. Signed-off-by: David Plowman Reviewed-by: Naushir Patuck Reviewed-by: Naushir Patuck --- src/ipa/rpi/common/ipa_base.cpp | 14 +- src/ipa/rpi/controller/agc_algorithm.h | 19 +- src/ipa/rpi/controller/meson.build | 1 + src/ipa/rpi/controller/rpi/agc.cpp | 910 +++----------------- src/ipa/rpi/controller/rpi/agc.h | 122 +-- src/ipa/rpi/controller/rpi/agc_channel.cpp | 927 +++++++++++++++++++++ src/ipa/rpi/controller/rpi/agc_channel.h | 135 +++ 7 files changed, 1216 insertions(+), 912 deletions(-) create mode 100644 src/ipa/rpi/controller/rpi/agc_channel.cpp create mode 100644 src/ipa/rpi/controller/rpi/agc_channel.h diff --git a/src/ipa/rpi/common/ipa_base.cpp b/src/ipa/rpi/common/ipa_base.cpp index 6ae84cc6..f29c32fd 100644 --- a/src/ipa/rpi/common/ipa_base.cpp +++ b/src/ipa/rpi/common/ipa_base.cpp @@ -692,9 +692,9 @@ void IpaBase::applyControls(const ControlList &controls) } if (ctrl.second.get() == false) - agc->disableAuto(); + agc->disableAuto(0); else - agc->enableAuto(); + agc->enableAuto(0); libcameraMetadata_.set(controls::AeEnable, ctrl.second.get()); break; @@ -710,7 +710,7 @@ void IpaBase::applyControls(const ControlList &controls) } /* The control provides units of microseconds. 
*/ - agc->setFixedShutter(ctrl.second.get() * 1.0us); + agc->setFixedShutter(0, ctrl.second.get() * 1.0us); libcameraMetadata_.set(controls::ExposureTime, ctrl.second.get()); break; @@ -725,7 +725,7 @@ void IpaBase::applyControls(const ControlList &controls) break; } - agc->setFixedAnalogueGain(ctrl.second.get()); + agc->setFixedAnalogueGain(0, ctrl.second.get()); libcameraMetadata_.set(controls::AnalogueGain, ctrl.second.get()); @@ -763,7 +763,7 @@ void IpaBase::applyControls(const ControlList &controls) int32_t idx = ctrl.second.get(); if (ConstraintModeTable.count(idx)) { - agc->setConstraintMode(ConstraintModeTable.at(idx)); + agc->setConstraintMode(0, ConstraintModeTable.at(idx)); libcameraMetadata_.set(controls::AeConstraintMode, idx); } else { LOG(IPARPI, Error) << "Constraint mode " << idx @@ -783,7 +783,7 @@ void IpaBase::applyControls(const ControlList &controls) int32_t idx = ctrl.second.get(); if (ExposureModeTable.count(idx)) { - agc->setExposureMode(ExposureModeTable.at(idx)); + agc->setExposureMode(0, ExposureModeTable.at(idx)); libcameraMetadata_.set(controls::AeExposureMode, idx); } else { LOG(IPARPI, Error) << "Exposure mode " << idx @@ -806,7 +806,7 @@ void IpaBase::applyControls(const ControlList &controls) * So convert to 2^EV */ double ev = pow(2.0, ctrl.second.get()); - agc->setEv(ev); + agc->setEv(0, ev); libcameraMetadata_.set(controls::ExposureValue, ctrl.second.get()); break; diff --git a/src/ipa/rpi/controller/agc_algorithm.h b/src/ipa/rpi/controller/agc_algorithm.h index b6949daa..b8986560 100644 --- a/src/ipa/rpi/controller/agc_algorithm.h +++ b/src/ipa/rpi/controller/agc_algorithm.h @@ -21,16 +21,19 @@ public: /* An AGC algorithm must provide the following: */ virtual unsigned int getConvergenceFrames() const = 0; virtual std::vector const &getWeights() const = 0; - virtual void setEv(double ev) = 0; - virtual void setFlickerPeriod(libcamera::utils::Duration flickerPeriod) = 0; - virtual void setFixedShutter(libcamera::utils::Duration fixedShutter) = 0; + virtual void setEv(unsigned int channel, double ev) = 0; + virtual void setFlickerPeriod(unsigned int channel, + libcamera::utils::Duration flickerPeriod) = 0; + virtual void setFixedShutter(unsigned int channel, + libcamera::utils::Duration fixedShutter) = 0; virtual void setMaxShutter(libcamera::utils::Duration maxShutter) = 0; - virtual void setFixedAnalogueGain(double fixedAnalogueGain) = 0; + virtual void setFixedAnalogueGain(unsigned int channel, double fixedAnalogueGain) = 0; virtual void setMeteringMode(std::string const &meteringModeName) = 0; - virtual void setExposureMode(std::string const &exposureModeName) = 0; - virtual void setConstraintMode(std::string const &contraintModeName) = 0; - virtual void enableAuto() = 0; - virtual void disableAuto() = 0; + virtual void setExposureMode(unsigned int channel, std::string const &exposureModeName) = 0; + virtual void setConstraintMode(unsigned int channel, std::string const &contraintModeName) = 0; + virtual void enableAuto(unsigned int channel) = 0; + virtual void disableAuto(unsigned int channel) = 0; + virtual void setActiveChannels(const std::vector &activeChannels) = 0; }; } /* namespace RPiController */ diff --git a/src/ipa/rpi/controller/meson.build b/src/ipa/rpi/controller/meson.build index feb0334e..20b9cda9 100644 --- a/src/ipa/rpi/controller/meson.build +++ b/src/ipa/rpi/controller/meson.build @@ -8,6 +8,7 @@ rpi_ipa_controller_sources = files([ 'pwl.cpp', 'rpi/af.cpp', 'rpi/agc.cpp', + 'rpi/agc_channel.cpp', 'rpi/alsc.cpp', 'rpi/awb.cpp', 
'rpi/black_level.cpp', diff --git a/src/ipa/rpi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp index 7b02972a..c9c9c297 100644 --- a/src/ipa/rpi/controller/rpi/agc.cpp +++ b/src/ipa/rpi/controller/rpi/agc.cpp @@ -5,20 +5,12 @@ * agc.cpp - AGC/AEC control algorithm */ -#include -#include -#include +#include "agc.h" #include -#include "../awb_status.h" -#include "../device_status.h" -#include "../histogram.h" -#include "../lux_status.h" #include "../metadata.h" -#include "agc.h" - using namespace RPiController; using namespace libcamera; using libcamera::utils::Duration; @@ -28,881 +20,203 @@ LOG_DEFINE_CATEGORY(RPiAgc) #define NAME "rpi.agc" -int AgcMeteringMode::read(const libcamera::YamlObject ¶ms) +Agc::Agc(Controller *controller) + : AgcAlgorithm(controller), + activeChannels_({ 0 }) { - const YamlObject &yamlWeights = params["weights"]; - - for (const auto &p : yamlWeights.asList()) { - auto value = p.get(); - if (!value) - return -EINVAL; - weights.push_back(*value); - } - - return 0; } -static std::tuple -readMeteringModes(std::map &metering_modes, - const libcamera::YamlObject ¶ms) +char const *Agc::name() const { - std::string first; - int ret; - - for (const auto &[key, value] : params.asDict()) { - AgcMeteringMode meteringMode; - ret = meteringMode.read(value); - if (ret) - return { ret, {} }; - - metering_modes[key] = std::move(meteringMode); - if (first.empty()) - first = key; - } - - return { 0, first }; + return NAME; } -int AgcExposureMode::read(const libcamera::YamlObject ¶ms) +int Agc::read(const libcamera::YamlObject ¶ms) { - auto value = params["shutter"].getList(); - if (!value) - return -EINVAL; - std::transform(value->begin(), value->end(), std::back_inserter(shutter), - [](double v) { return v * 1us; }); - - value = params["gain"].getList(); - if (!value) - return -EINVAL; - gain = std::move(*value); - - if (shutter.size() < 2 || gain.size() < 2) { - LOG(RPiAgc, Error) - << "AgcExposureMode: must have at least two entries in exposure profile"; - return -EINVAL; - } - - if (shutter.size() != gain.size()) { - LOG(RPiAgc, Error) - << "AgcExposureMode: expect same number of exposure and gain entries in exposure profile"; - return -EINVAL; + /* + * When there is only a single channel we can read the old style syntax. + * Otherwise we expect a "channels" keyword followed by a list of configurations. 
+ */ + if (!params.contains("channels")) { + LOG(RPiAgc, Debug) << "Single channel only"; + channelData_.emplace_back(); + return channelData_.back().channel.read(params, getHardwareConfig()); } - return 0; -} - -static std::tuple -readExposureModes(std::map &exposureModes, - const libcamera::YamlObject ¶ms) -{ - std::string first; - int ret; - - for (const auto &[key, value] : params.asDict()) { - AgcExposureMode exposureMode; - ret = exposureMode.read(value); + const auto &channels = params["channels"].asList(); + for (auto ch = channels.begin(); ch != channels.end(); ch++) { + LOG(RPiAgc, Debug) << "Read AGC channel"; + channelData_.emplace_back(); + int ret = channelData_.back().channel.read(*ch, getHardwareConfig()); if (ret) - return { ret, {} }; - - exposureModes[key] = std::move(exposureMode); - if (first.empty()) - first = key; + return ret; } - return { 0, first }; -} - -int AgcConstraint::read(const libcamera::YamlObject ¶ms) -{ - std::string boundString = params["bound"].get(""); - transform(boundString.begin(), boundString.end(), - boundString.begin(), ::toupper); - if (boundString != "UPPER" && boundString != "LOWER") { - LOG(RPiAgc, Error) << "AGC constraint type should be UPPER or LOWER"; - return -EINVAL; + LOG(RPiAgc, Debug) << "Read " << channelData_.size() << " channel(s)"; + if (channelData_.empty()) { + LOG(RPiAgc, Error) << "No AGC channels provided"; + return -1; } - bound = boundString == "UPPER" ? Bound::UPPER : Bound::LOWER; - - auto value = params["q_lo"].get(); - if (!value) - return -EINVAL; - qLo = *value; - - value = params["q_hi"].get(); - if (!value) - return -EINVAL; - qHi = *value; - - return yTarget.read(params["y_target"]); -} -static std::tuple -readConstraintMode(const libcamera::YamlObject ¶ms) -{ - AgcConstraintMode mode; - int ret; - - for (const auto &p : params.asList()) { - AgcConstraint constraint; - ret = constraint.read(p); - if (ret) - return { ret, {} }; - - mode.push_back(std::move(constraint)); - } - - return { 0, mode }; + return 0; } -static std::tuple -readConstraintModes(std::map &constraintModes, - const libcamera::YamlObject ¶ms) +int Agc::checkChannel(unsigned int channelIndex) const { - std::string first; - int ret; - - for (const auto &[key, value] : params.asDict()) { - std::tie(ret, constraintModes[key]) = readConstraintMode(value); - if (ret) - return { ret, {} }; - - if (first.empty()) - first = key; + if (channelIndex >= channelData_.size()) { + LOG(RPiAgc, Warning) << "AGC channel " << channelIndex << " not available"; + return -1; } - return { 0, first }; -} - -int AgcConfig::read(const libcamera::YamlObject ¶ms) -{ - LOG(RPiAgc, Debug) << "AgcConfig"; - int ret; - - std::tie(ret, defaultMeteringMode) = - readMeteringModes(meteringModes, params["metering_modes"]); - if (ret) - return ret; - std::tie(ret, defaultExposureMode) = - readExposureModes(exposureModes, params["exposure_modes"]); - if (ret) - return ret; - std::tie(ret, defaultConstraintMode) = - readConstraintModes(constraintModes, params["constraint_modes"]); - if (ret) - return ret; - - ret = yTarget.read(params["y_target"]); - if (ret) - return ret; - - speed = params["speed"].get(0.2); - startupFrames = params["startup_frames"].get(10); - convergenceFrames = params["convergence_frames"].get(6); - fastReduceThreshold = params["fast_reduce_threshold"].get(0.4); - baseEv = params["base_ev"].get(1.0); - - /* Start with quite a low value as ramping up is easier than ramping down. 
*/ - defaultExposureTime = params["default_exposure_time"].get(1000) * 1us; - defaultAnalogueGain = params["default_analogue_gain"].get(1.0); - return 0; } -Agc::ExposureValues::ExposureValues() - : shutter(0s), analogueGain(0), - totalExposure(0s), totalExposureNoDG(0s) +void Agc::disableAuto(unsigned int channelIndex) { -} - -Agc::Agc(Controller *controller) - : AgcAlgorithm(controller), meteringMode_(nullptr), - exposureMode_(nullptr), constraintMode_(nullptr), - frameCount_(0), lockCount_(0), - lastTargetExposure_(0s), ev_(1.0), flickerPeriod_(0s), - maxShutter_(0s), fixedShutter_(0s), fixedAnalogueGain_(0.0) -{ - memset(&awb_, 0, sizeof(awb_)); - /* - * Setting status_.totalExposureValue_ to zero initially tells us - * it's not been calculated yet (i.e. Process hasn't yet run). - */ - status_ = {}; - status_.ev = ev_; -} + if (checkChannel(channelIndex)) + return; -char const *Agc::name() const -{ - return NAME; + LOG(RPiAgc, Debug) << "disableAuto for channel " << channelIndex; + channelData_[channelIndex].channel.disableAuto(); } -int Agc::read(const libcamera::YamlObject ¶ms) +void Agc::enableAuto(unsigned int channelIndex) { - LOG(RPiAgc, Debug) << "Agc"; - - int ret = config_.read(params); - if (ret) - return ret; - - const Size &size = getHardwareConfig().agcZoneWeights; - for (auto const &modes : config_.meteringModes) { - if (modes.second.weights.size() != size.width * size.height) { - LOG(RPiAgc, Error) << "AgcMeteringMode: Incorrect number of weights"; - return -EINVAL; - } - } + if (checkChannel(channelIndex)) + return; - /* - * Set the config's defaults (which are the first ones it read) as our - * current modes, until someone changes them. (they're all known to - * exist at this point) - */ - meteringModeName_ = config_.defaultMeteringMode; - meteringMode_ = &config_.meteringModes[meteringModeName_]; - exposureModeName_ = config_.defaultExposureMode; - exposureMode_ = &config_.exposureModes[exposureModeName_]; - constraintModeName_ = config_.defaultConstraintMode; - constraintMode_ = &config_.constraintModes[constraintModeName_]; - /* Set up the "last shutter/gain" values, in case AGC starts "disabled". */ - status_.shutterTime = config_.defaultExposureTime; - status_.analogueGain = config_.defaultAnalogueGain; - return 0; -} - -void Agc::disableAuto() -{ - fixedShutter_ = status_.shutterTime; - fixedAnalogueGain_ = status_.analogueGain; -} - -void Agc::enableAuto() -{ - fixedShutter_ = 0s; - fixedAnalogueGain_ = 0; + LOG(RPiAgc, Debug) << "enableAuto for channel " << channelIndex; + channelData_[channelIndex].channel.enableAuto(); } unsigned int Agc::getConvergenceFrames() const { - /* - * If shutter and gain have been explicitly set, there is no - * convergence to happen, so no need to drop any frames - return zero. - */ - if (fixedShutter_ && fixedAnalogueGain_) - return 0; - else - return config_.convergenceFrames; + /* If there are n channels, it presumably takes n times as long to converge. */ + return channelData_[0].channel.getConvergenceFrames() * activeChannels_.size(); } std::vector const &Agc::getWeights() const { /* - * In case someone calls setMeteringMode and then this before the - * algorithm has run and updated the meteringMode_ pointer. + * A limitation is that we're going to have to use the same weights across + * all channels. 
*/ - auto it = config_.meteringModes.find(meteringModeName_); - if (it == config_.meteringModes.end()) - return meteringMode_->weights; - return it->second.weights; + return channelData_[0].channel.getWeights(); } -void Agc::setEv(double ev) +void Agc::setEv(unsigned int channelIndex, double ev) { - ev_ = ev; -} + if (checkChannel(channelIndex)) + return; -void Agc::setFlickerPeriod(Duration flickerPeriod) -{ - flickerPeriod_ = flickerPeriod; + LOG(RPiAgc, Debug) << "setEv " << ev << " for channel " << channelIndex; + channelData_[channelIndex].channel.setEv(ev); } -void Agc::setMaxShutter(Duration maxShutter) +void Agc::setFlickerPeriod(unsigned int channelIndex, Duration flickerPeriod) { - maxShutter_ = maxShutter; -} + if (checkChannel(channelIndex)) + return; -void Agc::setFixedShutter(Duration fixedShutter) -{ - fixedShutter_ = fixedShutter; - /* Set this in case someone calls disableAuto() straight after. */ - status_.shutterTime = limitShutter(fixedShutter_); + LOG(RPiAgc, Debug) << "setFlickerPeriod " << flickerPeriod + << " for channel " << channelIndex; + channelData_[channelIndex].channel.setFlickerPeriod(flickerPeriod); } -void Agc::setFixedAnalogueGain(double fixedAnalogueGain) -{ - fixedAnalogueGain_ = fixedAnalogueGain; - /* Set this in case someone calls disableAuto() straight after. */ - status_.analogueGain = limitGain(fixedAnalogueGain); -} - -void Agc::setMeteringMode(std::string const &meteringModeName) -{ - meteringModeName_ = meteringModeName; -} - -void Agc::setExposureMode(std::string const &exposureModeName) -{ - exposureModeName_ = exposureModeName; -} - -void Agc::setConstraintMode(std::string const &constraintModeName) -{ - constraintModeName_ = constraintModeName; -} - -void Agc::switchMode(CameraMode const &cameraMode, - Metadata *metadata) +void Agc::setMaxShutter(Duration maxShutter) { - /* AGC expects the mode sensitivity always to be non-zero. */ - ASSERT(cameraMode.sensitivity); - - housekeepConfig(); - - /* - * Store the mode in the local state. We must cache the sensitivity of - * of the previous mode for the calculations below. - */ - double lastSensitivity = mode_.sensitivity; - mode_ = cameraMode; - - Duration fixedShutter = limitShutter(fixedShutter_); - if (fixedShutter && fixedAnalogueGain_) { - /* We're going to reset the algorithm here with these fixed values. */ - - fetchAwbStatus(metadata); - double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 }); - ASSERT(minColourGain != 0.0); - - /* This is the equivalent of computeTargetExposure and applyDigitalGain. */ - target_.totalExposureNoDG = fixedShutter_ * fixedAnalogueGain_; - target_.totalExposure = target_.totalExposureNoDG / minColourGain; - - /* Equivalent of filterExposure. This resets any "history". */ - filtered_ = target_; - - /* Equivalent of divideUpExposure. */ - filtered_.shutter = fixedShutter; - filtered_.analogueGain = fixedAnalogueGain_; - } else if (status_.totalExposureValue) { - /* - * On a mode switch, various things could happen: - * - the exposure profile might change - * - a fixed exposure or gain might be set - * - the new mode's sensitivity might be different - * We cope with the last of these by scaling the target values. After - * that we just need to re-divide the exposure/gain according to the - * current exposure profile, which takes care of everything else. 
- */ - - double ratio = lastSensitivity / cameraMode.sensitivity; - target_.totalExposureNoDG *= ratio; - target_.totalExposure *= ratio; - filtered_.totalExposureNoDG *= ratio; - filtered_.totalExposure *= ratio; - - divideUpExposure(); - } else { - /* - * We come through here on startup, when at least one of the shutter - * or gain has not been fixed. We must still write those values out so - * that they will be applied immediately. We supply some arbitrary defaults - * for any that weren't set. - */ - - /* Equivalent of divideUpExposure. */ - filtered_.shutter = fixedShutter ? fixedShutter : config_.defaultExposureTime; - filtered_.analogueGain = fixedAnalogueGain_ ? fixedAnalogueGain_ : config_.defaultAnalogueGain; - } - - writeAndFinish(metadata, false); + /* Frame durations will be the same across all channels too. */ + for (auto &data : channelData_) + data.channel.setMaxShutter(maxShutter); } -void Agc::prepare(Metadata *imageMetadata) +void Agc::setFixedShutter(unsigned int channelIndex, Duration fixedShutter) { - Duration totalExposureValue = status_.totalExposureValue; - AgcStatus delayedStatus; - AgcPrepareStatus prepareStatus; - - if (!imageMetadata->get("agc.delayed_status", delayedStatus)) - totalExposureValue = delayedStatus.totalExposureValue; - - prepareStatus.digitalGain = 1.0; - prepareStatus.locked = false; - - if (status_.totalExposureValue) { - /* Process has run, so we have meaningful values. */ - DeviceStatus deviceStatus; - if (imageMetadata->get("device.status", deviceStatus) == 0) { - Duration actualExposure = deviceStatus.shutterSpeed * - deviceStatus.analogueGain; - if (actualExposure) { - double digitalGain = totalExposureValue / actualExposure; - LOG(RPiAgc, Debug) << "Want total exposure " << totalExposureValue; - /* - * Never ask for a gain < 1.0, and also impose - * some upper limit. Make it customisable? - */ - prepareStatus.digitalGain = std::max(1.0, std::min(digitalGain, 4.0)); - LOG(RPiAgc, Debug) << "Actual exposure " << actualExposure; - LOG(RPiAgc, Debug) << "Use digitalGain " << prepareStatus.digitalGain; - LOG(RPiAgc, Debug) << "Effective exposure " - << actualExposure * prepareStatus.digitalGain; - /* Decide whether AEC/AGC has converged. */ - prepareStatus.locked = updateLockStatus(deviceStatus); - } - } else - LOG(RPiAgc, Warning) << name() << ": no device metadata"; - imageMetadata->set("agc.prepare_status", prepareStatus); - } -} + if (checkChannel(channelIndex)) + return; -void Agc::process(StatisticsPtr &stats, Metadata *imageMetadata) -{ - frameCount_++; - /* - * First a little bit of housekeeping, fetching up-to-date settings and - * configuration, that kind of thing. - */ - housekeepConfig(); - /* Fetch the AWB status immediately, so that we can assume it's there. */ - fetchAwbStatus(imageMetadata); - /* Get the current exposure values for the frame that's just arrived. */ - fetchCurrentExposure(imageMetadata); - /* Compute the total gain we require relative to the current exposure. */ - double gain, targetY; - computeGain(stats, imageMetadata, gain, targetY); - /* Now compute the target (final) exposure which we think we want. */ - computeTargetExposure(gain); - /* The results have to be filtered so as not to change too rapidly. */ - filterExposure(); - /* - * Some of the exposure has to be applied as digital gain, so work out - * what that is. This function also tells us whether it's decided to - * "desaturate" the image more quickly. 
- */ - bool desaturate = applyDigitalGain(gain, targetY); - /* - * The last thing is to divide up the exposure value into a shutter time - * and analogue gain, according to the current exposure mode. - */ - divideUpExposure(); - /* Finally advertise what we've done. */ - writeAndFinish(imageMetadata, desaturate); + LOG(RPiAgc, Debug) << "setFixedShutter " << fixedShutter + << " for channel " << channelIndex; + channelData_[channelIndex].channel.setFixedShutter(fixedShutter); } -bool Agc::updateLockStatus(DeviceStatus const &deviceStatus) +void Agc::setFixedAnalogueGain(unsigned int channelIndex, double fixedAnalogueGain) { - const double errorFactor = 0.10; /* make these customisable? */ - const int maxLockCount = 5; - /* Reset "lock count" when we exceed this multiple of errorFactor */ - const double resetMargin = 1.5; + if (checkChannel(channelIndex)) + return; - /* Add 200us to the exposure time error to allow for line quantisation. */ - Duration exposureError = lastDeviceStatus_.shutterSpeed * errorFactor + 200us; - double gainError = lastDeviceStatus_.analogueGain * errorFactor; - Duration targetError = lastTargetExposure_ * errorFactor; - - /* - * Note that we don't know the exposure/gain limits of the sensor, so - * the values we keep requesting may be unachievable. For this reason - * we only insist that we're close to values in the past few frames. - */ - if (deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed - exposureError && - deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed + exposureError && - deviceStatus.analogueGain > lastDeviceStatus_.analogueGain - gainError && - deviceStatus.analogueGain < lastDeviceStatus_.analogueGain + gainError && - status_.targetExposureValue > lastTargetExposure_ - targetError && - status_.targetExposureValue < lastTargetExposure_ + targetError) - lockCount_ = std::min(lockCount_ + 1, maxLockCount); - else if (deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed - resetMargin * exposureError || - deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed + resetMargin * exposureError || - deviceStatus.analogueGain < lastDeviceStatus_.analogueGain - resetMargin * gainError || - deviceStatus.analogueGain > lastDeviceStatus_.analogueGain + resetMargin * gainError || - status_.targetExposureValue < lastTargetExposure_ - resetMargin * targetError || - status_.targetExposureValue > lastTargetExposure_ + resetMargin * targetError) - lockCount_ = 0; - - lastDeviceStatus_ = deviceStatus; - lastTargetExposure_ = status_.targetExposureValue; - - LOG(RPiAgc, Debug) << "Lock count updated to " << lockCount_; - return lockCount_ == maxLockCount; + LOG(RPiAgc, Debug) << "setFixedAnalogueGain " << fixedAnalogueGain + << " for channel " << channelIndex; + channelData_[channelIndex].channel.setFixedAnalogueGain(fixedAnalogueGain); } -void Agc::housekeepConfig() +void Agc::setMeteringMode(std::string const &meteringModeName) { - /* First fetch all the up-to-date settings, so no one else has to do it. */ - status_.ev = ev_; - status_.fixedShutter = limitShutter(fixedShutter_); - status_.fixedAnalogueGain = fixedAnalogueGain_; - status_.flickerPeriod = flickerPeriod_; - LOG(RPiAgc, Debug) << "ev " << status_.ev << " fixedShutter " - << status_.fixedShutter << " fixedAnalogueGain " - << status_.fixedAnalogueGain; - /* - * Make sure the "mode" pointers point to the up-to-date things, if - * they've changed. 
- */ - if (meteringModeName_ != status_.meteringMode) { - auto it = config_.meteringModes.find(meteringModeName_); - if (it == config_.meteringModes.end()) { - LOG(RPiAgc, Warning) << "No metering mode " << meteringModeName_; - meteringModeName_ = status_.meteringMode; - } else { - meteringMode_ = &it->second; - status_.meteringMode = meteringModeName_; - } - } - if (exposureModeName_ != status_.exposureMode) { - auto it = config_.exposureModes.find(exposureModeName_); - if (it == config_.exposureModes.end()) { - LOG(RPiAgc, Warning) << "No exposure profile " << exposureModeName_; - exposureModeName_ = status_.exposureMode; - } else { - exposureMode_ = &it->second; - status_.exposureMode = exposureModeName_; - } - } - if (constraintModeName_ != status_.constraintMode) { - auto it = config_.constraintModes.find(constraintModeName_); - if (it == config_.constraintModes.end()) { - LOG(RPiAgc, Warning) << "No constraint list " << constraintModeName_; - constraintModeName_ = status_.constraintMode; - } else { - constraintMode_ = &it->second; - status_.constraintMode = constraintModeName_; - } - } - LOG(RPiAgc, Debug) << "exposureMode " - << exposureModeName_ << " constraintMode " - << constraintModeName_ << " meteringMode " - << meteringModeName_; + /* Metering modes will be the same across all channels too. */ + for (auto &data : channelData_) + data.channel.setMeteringMode(meteringModeName); } -void Agc::fetchCurrentExposure(Metadata *imageMetadata) +void Agc::setExposureMode(unsigned int channelIndex, std::string const &exposureModeName) { - std::unique_lock lock(*imageMetadata); - DeviceStatus *deviceStatus = - imageMetadata->getLocked("device.status"); - if (!deviceStatus) - LOG(RPiAgc, Fatal) << "No device metadata"; - current_.shutter = deviceStatus->shutterSpeed; - current_.analogueGain = deviceStatus->analogueGain; - AgcStatus *agcStatus = - imageMetadata->getLocked("agc.status"); - current_.totalExposure = agcStatus ? agcStatus->totalExposureValue : 0s; - current_.totalExposureNoDG = current_.shutter * current_.analogueGain; -} + if (checkChannel(channelIndex)) + return; -void Agc::fetchAwbStatus(Metadata *imageMetadata) -{ - awb_.gainR = 1.0; /* in case not found in metadata */ - awb_.gainG = 1.0; - awb_.gainB = 1.0; - if (imageMetadata->get("awb.status", awb_) != 0) - LOG(RPiAgc, Debug) << "No AWB status found"; + LOG(RPiAgc, Debug) << "setExposureMode " << exposureModeName + << " for channel " << channelIndex; + channelData_[channelIndex].channel.setExposureMode(exposureModeName); } -static double computeInitialY(StatisticsPtr &stats, AwbStatus const &awb, - std::vector &weights, double gain) +void Agc::setConstraintMode(unsigned int channelIndex, std::string const &constraintModeName) { - constexpr uint64_t maxVal = 1 << Statistics::NormalisationFactorPow2; + if (checkChannel(channelIndex)) + return; - ASSERT(weights.size() == stats->agcRegions.numRegions()); - - /* - * Note that the weights are applied by the IPA to the statistics directly, - * before they are given to us here. 
- */ - double rSum = 0, gSum = 0, bSum = 0, pixelSum = 0; - for (unsigned int i = 0; i < stats->agcRegions.numRegions(); i++) { - auto ®ion = stats->agcRegions.get(i); - rSum += std::min(region.val.rSum * gain, (maxVal - 1) * region.counted); - gSum += std::min(region.val.gSum * gain, (maxVal - 1) * region.counted); - bSum += std::min(region.val.bSum * gain, (maxVal - 1) * region.counted); - pixelSum += region.counted; - } - if (pixelSum == 0.0) { - LOG(RPiAgc, Warning) << "computeInitialY: pixelSum is zero"; - return 0; - } - double ySum = rSum * awb.gainR * .299 + - gSum * awb.gainG * .587 + - bSum * awb.gainB * .114; - return ySum / pixelSum / maxVal; + channelData_[channelIndex].channel.setConstraintMode(constraintModeName); } -/* - * We handle extra gain through EV by adjusting our Y targets. However, you - * simply can't monitor histograms once they get very close to (or beyond!) - * saturation, so we clamp the Y targets to this value. It does mean that EV - * increases don't necessarily do quite what you might expect in certain - * (contrived) cases. - */ - -static constexpr double EvGainYTargetLimit = 0.9; - -static double constraintComputeGain(AgcConstraint &c, const Histogram &h, double lux, - double evGain, double &targetY) +template +std::ostream &operator<<(std::ostream &os, const std::vector &v) { - targetY = c.yTarget.eval(c.yTarget.domain().clip(lux)); - targetY = std::min(EvGainYTargetLimit, targetY * evGain); - double iqm = h.interQuantileMean(c.qLo, c.qHi); - return (targetY * h.bins()) / iqm; + os << "{"; + for (const auto &e : v) + os << " " << e; + os << " }"; + return os; } -void Agc::computeGain(StatisticsPtr &statistics, Metadata *imageMetadata, - double &gain, double &targetY) +void Agc::setActiveChannels(const std::vector &activeChannels) { - struct LuxStatus lux = {}; - lux.lux = 400; /* default lux level to 400 in case no metadata found */ - if (imageMetadata->get("lux.status", lux) != 0) - LOG(RPiAgc, Warning) << "No lux level found"; - const Histogram &h = statistics->yHist; - double evGain = status_.ev * config_.baseEv; - /* - * The initial gain and target_Y come from some of the regions. After - * that we consider the histogram constraints. - */ - targetY = config_.yTarget.eval(config_.yTarget.domain().clip(lux.lux)); - targetY = std::min(EvGainYTargetLimit, targetY * evGain); - - /* - * Do this calculation a few times as brightness increase can be - * non-linear when there are saturated regions. 
- */ - gain = 1.0; - for (int i = 0; i < 8; i++) { - double initialY = computeInitialY(statistics, awb_, meteringMode_->weights, gain); - double extraGain = std::min(10.0, targetY / (initialY + .001)); - gain *= extraGain; - LOG(RPiAgc, Debug) << "Initial Y " << initialY << " target " << targetY - << " gives gain " << gain; - if (extraGain < 1.01) /* close enough */ - break; - } - - for (auto &c : *constraintMode_) { - double newTargetY; - double newGain = constraintComputeGain(c, h, lux.lux, evGain, newTargetY); - LOG(RPiAgc, Debug) << "Constraint has target_Y " - << newTargetY << " giving gain " << newGain; - if (c.bound == AgcConstraint::Bound::LOWER && newGain > gain) { - LOG(RPiAgc, Debug) << "Lower bound constraint adopted"; - gain = newGain; - targetY = newTargetY; - } else if (c.bound == AgcConstraint::Bound::UPPER && newGain < gain) { - LOG(RPiAgc, Debug) << "Upper bound constraint adopted"; - gain = newGain; - targetY = newTargetY; - } + if (activeChannels.empty()) { + LOG(RPiAgc, Warning) << "No active AGC channels supplied"; + return; } - LOG(RPiAgc, Debug) << "Final gain " << gain << " (target_Y " << targetY << " ev " - << status_.ev << " base_ev " << config_.baseEv - << ")"; -} - -void Agc::computeTargetExposure(double gain) -{ - if (status_.fixedShutter && status_.fixedAnalogueGain) { - /* - * When ag and shutter are both fixed, we need to drive the - * total exposure so that we end up with a digital gain of at least - * 1/minColourGain. Otherwise we'd desaturate channels causing - * white to go cyan or magenta. - */ - double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 }); - ASSERT(minColourGain != 0.0); - target_.totalExposure = - status_.fixedShutter * status_.fixedAnalogueGain / minColourGain; - } else { - /* - * The statistics reflect the image without digital gain, so the final - * total exposure we're aiming for is: - */ - target_.totalExposure = current_.totalExposureNoDG * gain; - /* The final target exposure is also limited to what the exposure mode allows. */ - Duration maxShutter = status_.fixedShutter - ? status_.fixedShutter - : exposureMode_->shutter.back(); - maxShutter = limitShutter(maxShutter); - Duration maxTotalExposure = - maxShutter * - (status_.fixedAnalogueGain != 0.0 - ? status_.fixedAnalogueGain - : exposureMode_->gain.back()); - target_.totalExposure = std::min(target_.totalExposure, maxTotalExposure); - } - LOG(RPiAgc, Debug) << "Target totalExposure " << target_.totalExposure; -} -bool Agc::applyDigitalGain(double gain, double targetY) -{ - double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 }); - ASSERT(minColourGain != 0.0); - double dg = 1.0 / minColourGain; - /* - * I think this pipeline subtracts black level and rescales before we - * get the stats, so no need to worry about it. - */ - LOG(RPiAgc, Debug) << "after AWB, target dg " << dg << " gain " << gain - << " target_Y " << targetY; - /* - * Finally, if we're trying to reduce exposure but the target_Y is - * "close" to 1.0, then the gain computed for that constraint will be - * only slightly less than one, because the measured Y can never be - * larger than 1.0. When this happens, demand a large digital gain so - * that the exposure can be reduced, de-saturating the image much more - * quickly (and we then approach the correct value more quickly from - * below). 
- */ - bool desaturate = targetY > config_.fastReduceThreshold && - gain < sqrt(targetY); - if (desaturate) - dg /= config_.fastReduceThreshold; - LOG(RPiAgc, Debug) << "Digital gain " << dg << " desaturate? " << desaturate; - filtered_.totalExposureNoDG = filtered_.totalExposure / dg; - LOG(RPiAgc, Debug) << "Target totalExposureNoDG " << filtered_.totalExposureNoDG; - return desaturate; -} - -void Agc::filterExposure() -{ - double speed = config_.speed; - /* - * AGC adapts instantly if both shutter and gain are directly specified - * or we're in the startup phase. - */ - if ((status_.fixedShutter && status_.fixedAnalogueGain) || - frameCount_ <= config_.startupFrames) - speed = 1.0; - if (!filtered_.totalExposure) { - filtered_.totalExposure = target_.totalExposure; - } else { - /* - * If close to the result go faster, to save making so many - * micro-adjustments on the way. (Make this customisable?) - */ - if (filtered_.totalExposure < 1.2 * target_.totalExposure && - filtered_.totalExposure > 0.8 * target_.totalExposure) - speed = sqrt(speed); - filtered_.totalExposure = speed * target_.totalExposure + - filtered_.totalExposure * (1.0 - speed); - } - LOG(RPiAgc, Debug) << "After filtering, totalExposure " << filtered_.totalExposure - << " no dg " << filtered_.totalExposureNoDG; -} + for (auto index : activeChannels) + if (checkChannel(index)) + return; -void Agc::divideUpExposure() -{ - /* - * Sending the fixed shutter/gain cases through the same code may seem - * unnecessary, but it will make more sense when extend this to cover - * variable aperture. - */ - Duration exposureValue = filtered_.totalExposureNoDG; - Duration shutterTime; - double analogueGain; - shutterTime = status_.fixedShutter ? status_.fixedShutter - : exposureMode_->shutter[0]; - shutterTime = limitShutter(shutterTime); - analogueGain = status_.fixedAnalogueGain != 0.0 ? status_.fixedAnalogueGain - : exposureMode_->gain[0]; - analogueGain = limitGain(analogueGain); - if (shutterTime * analogueGain < exposureValue) { - for (unsigned int stage = 1; - stage < exposureMode_->gain.size(); stage++) { - if (!status_.fixedShutter) { - Duration stageShutter = - limitShutter(exposureMode_->shutter[stage]); - if (stageShutter * analogueGain >= exposureValue) { - shutterTime = exposureValue / analogueGain; - break; - } - shutterTime = stageShutter; - } - if (status_.fixedAnalogueGain == 0.0) { - if (exposureMode_->gain[stage] * shutterTime >= exposureValue) { - analogueGain = exposureValue / shutterTime; - break; - } - analogueGain = exposureMode_->gain[stage]; - analogueGain = limitGain(analogueGain); - } - } - } - LOG(RPiAgc, Debug) << "Divided up shutter and gain are " << shutterTime << " and " - << analogueGain; - /* - * Finally adjust shutter time for flicker avoidance (require both - * shutter and gain not to be fixed). - */ - if (!status_.fixedShutter && !status_.fixedAnalogueGain && - status_.flickerPeriod) { - int flickerPeriods = shutterTime / status_.flickerPeriod; - if (flickerPeriods) { - Duration newShutterTime = flickerPeriods * status_.flickerPeriod; - analogueGain *= shutterTime / newShutterTime; - /* - * We should still not allow the ag to go over the - * largest value in the exposure mode. Note that this - * may force more of the total exposure into the digital - * gain as a side-effect. 
- */ - analogueGain = std::min(analogueGain, exposureMode_->gain.back()); - analogueGain = limitGain(analogueGain); - shutterTime = newShutterTime; - } - LOG(RPiAgc, Debug) << "After flicker avoidance, shutter " - << shutterTime << " gain " << analogueGain; - } - filtered_.shutter = shutterTime; - filtered_.analogueGain = analogueGain; + LOG(RPiAgc, Debug) << "setActiveChannels " << activeChannels; + activeChannels_ = activeChannels; } -void Agc::writeAndFinish(Metadata *imageMetadata, bool desaturate) +void Agc::switchMode(CameraMode const &cameraMode, + Metadata *metadata) { - status_.totalExposureValue = filtered_.totalExposure; - status_.targetExposureValue = desaturate ? 0s : target_.totalExposureNoDG; - status_.shutterTime = filtered_.shutter; - status_.analogueGain = filtered_.analogueGain; - /* - * Write to metadata as well, in case anyone wants to update the camera - * immediately. - */ - imageMetadata->set("agc.status", status_); - LOG(RPiAgc, Debug) << "Output written, total exposure requested is " - << filtered_.totalExposure; - LOG(RPiAgc, Debug) << "Camera exposure update: shutter time " << filtered_.shutter - << " analogue gain " << filtered_.analogueGain; + LOG(RPiAgc, Debug) << "switchMode for channel 0"; + channelData_[0].channel.switchMode(cameraMode, metadata); } -Duration Agc::limitShutter(Duration shutter) +void Agc::prepare(Metadata *imageMetadata) { - /* - * shutter == 0 is a special case for fixed shutter values, and must pass - * through unchanged - */ - if (!shutter) - return shutter; - - shutter = std::clamp(shutter, mode_.minShutter, maxShutter_); - return shutter; + LOG(RPiAgc, Debug) << "prepare for channel 0"; + channelData_[0].channel.prepare(imageMetadata); } -double Agc::limitGain(double gain) const +void Agc::process(StatisticsPtr &stats, Metadata *imageMetadata) { - /* - * Only limit the lower bounds of the gain value to what the sensor limits. - * The upper bound on analogue gain will be made up with additional digital - * gain applied by the ISP. - * - * gain == 0.0 is a special case for fixed shutter values, and must pass - * through unchanged - */ - if (!gain) - return gain; - - gain = std::max(gain, mode_.minAnalogueGain); - return gain; + LOG(RPiAgc, Debug) << "process for channel 0"; + channelData_[0].channel.process(stats, imageMetadata); } /* Register algorithm with the system. */ diff --git a/src/ipa/rpi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h index aaf77c8f..a9158910 100644 --- a/src/ipa/rpi/controller/rpi/agc.h +++ b/src/ipa/rpi/controller/rpi/agc.h @@ -6,60 +6,19 @@ */ #pragma once +#include #include -#include - -#include #include "../agc_algorithm.h" -#include "../agc_status.h" -#include "../pwl.h" -/* This is our implementation of AGC. 
*/ +#include "agc_channel.h" namespace RPiController { -struct AgcMeteringMode { - std::vector weights; - int read(const libcamera::YamlObject ¶ms); -}; - -struct AgcExposureMode { - std::vector shutter; - std::vector gain; - int read(const libcamera::YamlObject ¶ms); -}; - -struct AgcConstraint { - enum class Bound { LOWER = 0, UPPER = 1 }; - Bound bound; - double qLo; - double qHi; - Pwl yTarget; - int read(const libcamera::YamlObject ¶ms); -}; - -typedef std::vector AgcConstraintMode; - -struct AgcConfig { - int read(const libcamera::YamlObject ¶ms); - std::map meteringModes; - std::map exposureModes; - std::map constraintModes; - Pwl yTarget; - double speed; - uint16_t startupFrames; - unsigned int convergenceFrames; - double maxChange; - double minChange; - double fastReduceThreshold; - double speedUpThreshold; - std::string defaultMeteringMode; - std::string defaultExposureMode; - std::string defaultConstraintMode; - double baseEv; - libcamera::utils::Duration defaultExposureTime; - double defaultAnalogueGain; +struct AgcChannelData { + AgcChannel channel; + std::optional deviceStatus; + StatisticsPtr statistics; }; class Agc : public AgcAlgorithm @@ -70,65 +29,30 @@ public: int read(const libcamera::YamlObject ¶ms) override; unsigned int getConvergenceFrames() const override; std::vector const &getWeights() const override; - void setEv(double ev) override; - void setFlickerPeriod(libcamera::utils::Duration flickerPeriod) override; + void setEv(unsigned int channel, double ev) override; + void setFlickerPeriod(unsigned int channelIndex, + libcamera::utils::Duration flickerPeriod) override; void setMaxShutter(libcamera::utils::Duration maxShutter) override; - void setFixedShutter(libcamera::utils::Duration fixedShutter) override; - void setFixedAnalogueGain(double fixedAnalogueGain) override; + void setFixedShutter(unsigned int channelIndex, + libcamera::utils::Duration fixedShutter) override; + void setFixedAnalogueGain(unsigned int channelIndex, + double fixedAnalogueGain) override; void setMeteringMode(std::string const &meteringModeName) override; - void setExposureMode(std::string const &exposureModeName) override; - void setConstraintMode(std::string const &contraintModeName) override; - void enableAuto() override; - void disableAuto() override; + void setExposureMode(unsigned int channelIndex, + std::string const &exposureModeName) override; + void setConstraintMode(unsigned int channelIndex, + std::string const &contraintModeName) override; + void enableAuto(unsigned int channelIndex) override; + void disableAuto(unsigned int channelIndex) override; void switchMode(CameraMode const &cameraMode, Metadata *metadata) override; void prepare(Metadata *imageMetadata) override; void process(StatisticsPtr &stats, Metadata *imageMetadata) override; + void setActiveChannels(const std::vector &activeChannels) override; private: - bool updateLockStatus(DeviceStatus const &deviceStatus); - AgcConfig config_; - void housekeepConfig(); - void fetchCurrentExposure(Metadata *imageMetadata); - void fetchAwbStatus(Metadata *imageMetadata); - void computeGain(StatisticsPtr &statistics, Metadata *imageMetadata, - double &gain, double &targetY); - void computeTargetExposure(double gain); - void filterExposure(); - bool applyDigitalGain(double gain, double targetY); - void divideUpExposure(); - void writeAndFinish(Metadata *imageMetadata, bool desaturate); - libcamera::utils::Duration limitShutter(libcamera::utils::Duration shutter); - double limitGain(double gain) const; - AgcMeteringMode 
*meteringMode_; - AgcExposureMode *exposureMode_; - AgcConstraintMode *constraintMode_; - CameraMode mode_; - uint64_t frameCount_; - AwbStatus awb_; - struct ExposureValues { - ExposureValues(); - - libcamera::utils::Duration shutter; - double analogueGain; - libcamera::utils::Duration totalExposure; - libcamera::utils::Duration totalExposureNoDG; /* without digital gain */ - }; - ExposureValues current_; /* values for the current frame */ - ExposureValues target_; /* calculate the values we want here */ - ExposureValues filtered_; /* these values are filtered towards target */ - AgcStatus status_; - int lockCount_; - DeviceStatus lastDeviceStatus_; - libcamera::utils::Duration lastTargetExposure_; - /* Below here the "settings" that applications can change. */ - std::string meteringModeName_; - std::string exposureModeName_; - std::string constraintModeName_; - double ev_; - libcamera::utils::Duration flickerPeriod_; - libcamera::utils::Duration maxShutter_; - libcamera::utils::Duration fixedShutter_; - double fixedAnalogueGain_; + int checkChannel(unsigned int channel) const; + std::vector channelData_; + std::vector activeChannels_; }; } /* namespace RPiController */ diff --git a/src/ipa/rpi/controller/rpi/agc_channel.cpp b/src/ipa/rpi/controller/rpi/agc_channel.cpp new file mode 100644 index 00000000..d6e30ef2 --- /dev/null +++ b/src/ipa/rpi/controller/rpi/agc_channel.cpp @@ -0,0 +1,927 @@ +/* SPDX-License-Identifier: BSD-2-Clause */ +/* + * Copyright (C) 2019, Raspberry Pi Ltd + * + * agc.cpp - AGC/AEC control algorithm + */ + +#include +#include +#include + +#include + +#include "../awb_status.h" +#include "../device_status.h" +#include "../histogram.h" +#include "../lux_status.h" +#include "../metadata.h" + +#include "agc.h" + +using namespace RPiController; +using namespace libcamera; +using libcamera::utils::Duration; +using namespace std::literals::chrono_literals; + +LOG_DECLARE_CATEGORY(RPiAgc) + +#define NAME "rpi.agc" + +int AgcMeteringMode::read(const libcamera::YamlObject ¶ms) +{ + const YamlObject &yamlWeights = params["weights"]; + + for (const auto &p : yamlWeights.asList()) { + auto value = p.get(); + if (!value) + return -EINVAL; + weights.push_back(*value); + } + + return 0; +} + +static std::tuple +readMeteringModes(std::map &metering_modes, + const libcamera::YamlObject ¶ms) +{ + std::string first; + int ret; + + for (const auto &[key, value] : params.asDict()) { + AgcMeteringMode meteringMode; + ret = meteringMode.read(value); + if (ret) + return { ret, {} }; + + metering_modes[key] = std::move(meteringMode); + if (first.empty()) + first = key; + } + + return { 0, first }; +} + +int AgcExposureMode::read(const libcamera::YamlObject ¶ms) +{ + auto value = params["shutter"].getList(); + if (!value) + return -EINVAL; + std::transform(value->begin(), value->end(), std::back_inserter(shutter), + [](double v) { return v * 1us; }); + + value = params["gain"].getList(); + if (!value) + return -EINVAL; + gain = std::move(*value); + + if (shutter.size() < 2 || gain.size() < 2) { + LOG(RPiAgc, Error) + << "AgcExposureMode: must have at least two entries in exposure profile"; + return -EINVAL; + } + + if (shutter.size() != gain.size()) { + LOG(RPiAgc, Error) + << "AgcExposureMode: expect same number of exposure and gain entries in exposure profile"; + return -EINVAL; + } + + return 0; +} + +static std::tuple +readExposureModes(std::map &exposureModes, + const libcamera::YamlObject ¶ms) +{ + std::string first; + int ret; + + for (const auto &[key, value] : params.asDict()) { + 
AgcExposureMode exposureMode; + ret = exposureMode.read(value); + if (ret) + return { ret, {} }; + + exposureModes[key] = std::move(exposureMode); + if (first.empty()) + first = key; + } + + return { 0, first }; +} + +int AgcConstraint::read(const libcamera::YamlObject ¶ms) +{ + std::string boundString = params["bound"].get(""); + transform(boundString.begin(), boundString.end(), + boundString.begin(), ::toupper); + if (boundString != "UPPER" && boundString != "LOWER") { + LOG(RPiAgc, Error) << "AGC constraint type should be UPPER or LOWER"; + return -EINVAL; + } + bound = boundString == "UPPER" ? Bound::UPPER : Bound::LOWER; + + auto value = params["q_lo"].get(); + if (!value) + return -EINVAL; + qLo = *value; + + value = params["q_hi"].get(); + if (!value) + return -EINVAL; + qHi = *value; + + return yTarget.read(params["y_target"]); +} + +static std::tuple +readConstraintMode(const libcamera::YamlObject ¶ms) +{ + AgcConstraintMode mode; + int ret; + + for (const auto &p : params.asList()) { + AgcConstraint constraint; + ret = constraint.read(p); + if (ret) + return { ret, {} }; + + mode.push_back(std::move(constraint)); + } + + return { 0, mode }; +} + +static std::tuple +readConstraintModes(std::map &constraintModes, + const libcamera::YamlObject ¶ms) +{ + std::string first; + int ret; + + for (const auto &[key, value] : params.asDict()) { + std::tie(ret, constraintModes[key]) = readConstraintMode(value); + if (ret) + return { ret, {} }; + + if (first.empty()) + first = key; + } + + return { 0, first }; +} + +int AgcConfig::read(const libcamera::YamlObject ¶ms) +{ + LOG(RPiAgc, Debug) << "AgcConfig"; + int ret; + + std::tie(ret, defaultMeteringMode) = + readMeteringModes(meteringModes, params["metering_modes"]); + if (ret) + return ret; + std::tie(ret, defaultExposureMode) = + readExposureModes(exposureModes, params["exposure_modes"]); + if (ret) + return ret; + std::tie(ret, defaultConstraintMode) = + readConstraintModes(constraintModes, params["constraint_modes"]); + if (ret) + return ret; + + ret = yTarget.read(params["y_target"]); + if (ret) + return ret; + + speed = params["speed"].get(0.2); + startupFrames = params["startup_frames"].get(10); + convergenceFrames = params["convergence_frames"].get(6); + fastReduceThreshold = params["fast_reduce_threshold"].get(0.4); + baseEv = params["base_ev"].get(1.0); + + /* Start with quite a low value as ramping up is easier than ramping down. */ + defaultExposureTime = params["default_exposure_time"].get(1000) * 1us; + defaultAnalogueGain = params["default_analogue_gain"].get(1.0); + + return 0; +} + +AgcChannel::ExposureValues::ExposureValues() + : shutter(0s), analogueGain(0), + totalExposure(0s), totalExposureNoDG(0s) +{ +} + +AgcChannel::AgcChannel() + : meteringMode_(nullptr), exposureMode_(nullptr), constraintMode_(nullptr), + frameCount_(0), lockCount_(0), + lastTargetExposure_(0s), ev_(1.0), flickerPeriod_(0s), + maxShutter_(0s), fixedShutter_(0s), fixedAnalogueGain_(0.0) +{ + memset(&awb_, 0, sizeof(awb_)); + /* + * Setting status_.totalExposureValue_ to zero initially tells us + * it's not been calculated yet (i.e. Process hasn't yet run). 
+ */ + status_ = {}; + status_.ev = ev_; +} + +int AgcChannel::read(const libcamera::YamlObject ¶ms, + const Controller::HardwareConfig &hardwareConfig) +{ + int ret = config_.read(params); + if (ret) + return ret; + + const Size &size = hardwareConfig.agcZoneWeights; + for (auto const &modes : config_.meteringModes) { + if (modes.second.weights.size() != size.width * size.height) { + LOG(RPiAgc, Error) << "AgcMeteringMode: Incorrect number of weights"; + return -EINVAL; + } + } + + /* + * Set the config's defaults (which are the first ones it read) as our + * current modes, until someone changes them. (they're all known to + * exist at this point) + */ + meteringModeName_ = config_.defaultMeteringMode; + meteringMode_ = &config_.meteringModes[meteringModeName_]; + exposureModeName_ = config_.defaultExposureMode; + exposureMode_ = &config_.exposureModes[exposureModeName_]; + constraintModeName_ = config_.defaultConstraintMode; + constraintMode_ = &config_.constraintModes[constraintModeName_]; + /* Set up the "last shutter/gain" values, in case AGC starts "disabled". */ + status_.shutterTime = config_.defaultExposureTime; + status_.analogueGain = config_.defaultAnalogueGain; + return 0; +} + +void AgcChannel::disableAuto() +{ + fixedShutter_ = status_.shutterTime; + fixedAnalogueGain_ = status_.analogueGain; +} + +void AgcChannel::enableAuto() +{ + fixedShutter_ = 0s; + fixedAnalogueGain_ = 0; +} + +unsigned int AgcChannel::getConvergenceFrames() const +{ + /* + * If shutter and gain have been explicitly set, there is no + * convergence to happen, so no need to drop any frames - return zero. + */ + if (fixedShutter_ && fixedAnalogueGain_) + return 0; + else + return config_.convergenceFrames; +} + +std::vector const &AgcChannel::getWeights() const +{ + /* + * In case someone calls setMeteringMode and then this before the + * algorithm has run and updated the meteringMode_ pointer. + */ + auto it = config_.meteringModes.find(meteringModeName_); + if (it == config_.meteringModes.end()) + return meteringMode_->weights; + return it->second.weights; +} + +void AgcChannel::setEv(double ev) +{ + ev_ = ev; +} + +void AgcChannel::setFlickerPeriod(Duration flickerPeriod) +{ + flickerPeriod_ = flickerPeriod; +} + +void AgcChannel::setMaxShutter(Duration maxShutter) +{ + maxShutter_ = maxShutter; +} + +void AgcChannel::setFixedShutter(Duration fixedShutter) +{ + fixedShutter_ = fixedShutter; + /* Set this in case someone calls disableAuto() straight after. */ + status_.shutterTime = limitShutter(fixedShutter_); +} + +void AgcChannel::setFixedAnalogueGain(double fixedAnalogueGain) +{ + fixedAnalogueGain_ = fixedAnalogueGain; + /* Set this in case someone calls disableAuto() straight after. */ + status_.analogueGain = limitGain(fixedAnalogueGain); +} + +void AgcChannel::setMeteringMode(std::string const &meteringModeName) +{ + meteringModeName_ = meteringModeName; +} + +void AgcChannel::setExposureMode(std::string const &exposureModeName) +{ + exposureModeName_ = exposureModeName; +} + +void AgcChannel::setConstraintMode(std::string const &constraintModeName) +{ + constraintModeName_ = constraintModeName; +} + +void AgcChannel::switchMode(CameraMode const &cameraMode, + Metadata *metadata) +{ + /* AGC expects the mode sensitivity always to be non-zero. */ + ASSERT(cameraMode.sensitivity); + + housekeepConfig(); + + /* + * Store the mode in the local state. We must cache the sensitivity of + * of the previous mode for the calculations below. 
+ */ + double lastSensitivity = mode_.sensitivity; + mode_ = cameraMode; + + Duration fixedShutter = limitShutter(fixedShutter_); + if (fixedShutter && fixedAnalogueGain_) { + /* We're going to reset the algorithm here with these fixed values. */ + + fetchAwbStatus(metadata); + double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 }); + ASSERT(minColourGain != 0.0); + + /* This is the equivalent of computeTargetExposure and applyDigitalGain. */ + target_.totalExposureNoDG = fixedShutter_ * fixedAnalogueGain_; + target_.totalExposure = target_.totalExposureNoDG / minColourGain; + + /* Equivalent of filterExposure. This resets any "history". */ + filtered_ = target_; + + /* Equivalent of divideUpExposure. */ + filtered_.shutter = fixedShutter; + filtered_.analogueGain = fixedAnalogueGain_; + } else if (status_.totalExposureValue) { + /* + * On a mode switch, various things could happen: + * - the exposure profile might change + * - a fixed exposure or gain might be set + * - the new mode's sensitivity might be different + * We cope with the last of these by scaling the target values. After + * that we just need to re-divide the exposure/gain according to the + * current exposure profile, which takes care of everything else. + */ + + double ratio = lastSensitivity / cameraMode.sensitivity; + target_.totalExposureNoDG *= ratio; + target_.totalExposure *= ratio; + filtered_.totalExposureNoDG *= ratio; + filtered_.totalExposure *= ratio; + + divideUpExposure(); + } else { + /* + * We come through here on startup, when at least one of the shutter + * or gain has not been fixed. We must still write those values out so + * that they will be applied immediately. We supply some arbitrary defaults + * for any that weren't set. + */ + + /* Equivalent of divideUpExposure. */ + filtered_.shutter = fixedShutter ? fixedShutter : config_.defaultExposureTime; + filtered_.analogueGain = fixedAnalogueGain_ ? fixedAnalogueGain_ : config_.defaultAnalogueGain; + } + + writeAndFinish(metadata, false); +} + +void AgcChannel::prepare(Metadata *imageMetadata) +{ + Duration totalExposureValue = status_.totalExposureValue; + AgcStatus delayedStatus; + AgcPrepareStatus prepareStatus; + + if (!imageMetadata->get("agc.delayed_status", delayedStatus)) + totalExposureValue = delayedStatus.totalExposureValue; + + prepareStatus.digitalGain = 1.0; + prepareStatus.locked = false; + + if (status_.totalExposureValue) { + /* Process has run, so we have meaningful values. */ + DeviceStatus deviceStatus; + if (imageMetadata->get("device.status", deviceStatus) == 0) { + Duration actualExposure = deviceStatus.shutterSpeed * + deviceStatus.analogueGain; + if (actualExposure) { + double digitalGain = totalExposureValue / actualExposure; + LOG(RPiAgc, Debug) << "Want total exposure " << totalExposureValue; + /* + * Never ask for a gain < 1.0, and also impose + * some upper limit. Make it customisable? + */ + prepareStatus.digitalGain = std::max(1.0, std::min(digitalGain, 4.0)); + LOG(RPiAgc, Debug) << "Actual exposure " << actualExposure; + LOG(RPiAgc, Debug) << "Use digitalGain " << prepareStatus.digitalGain; + LOG(RPiAgc, Debug) << "Effective exposure " + << actualExposure * prepareStatus.digitalGain; + /* Decide whether AEC/AGC has converged. 
*/ + prepareStatus.locked = updateLockStatus(deviceStatus); + } + } else + LOG(RPiAgc, Warning) << "AgcChannel: no device metadata"; + imageMetadata->set("agc.prepare_status", prepareStatus); + } +} + +void AgcChannel::process(StatisticsPtr &stats, Metadata *imageMetadata) +{ + frameCount_++; + /* + * First a little bit of housekeeping, fetching up-to-date settings and + * configuration, that kind of thing. + */ + housekeepConfig(); + /* Fetch the AWB status immediately, so that we can assume it's there. */ + fetchAwbStatus(imageMetadata); + /* Get the current exposure values for the frame that's just arrived. */ + fetchCurrentExposure(imageMetadata); + /* Compute the total gain we require relative to the current exposure. */ + double gain, targetY; + computeGain(stats, imageMetadata, gain, targetY); + /* Now compute the target (final) exposure which we think we want. */ + computeTargetExposure(gain); + /* The results have to be filtered so as not to change too rapidly. */ + filterExposure(); + /* + * Some of the exposure has to be applied as digital gain, so work out + * what that is. This function also tells us whether it's decided to + * "desaturate" the image more quickly. + */ + bool desaturate = applyDigitalGain(gain, targetY); + /* + * The last thing is to divide up the exposure value into a shutter time + * and analogue gain, according to the current exposure mode. + */ + divideUpExposure(); + /* Finally advertise what we've done. */ + writeAndFinish(imageMetadata, desaturate); +} + +bool AgcChannel::updateLockStatus(DeviceStatus const &deviceStatus) +{ + const double errorFactor = 0.10; /* make these customisable? */ + const int maxLockCount = 5; + /* Reset "lock count" when we exceed this multiple of errorFactor */ + const double resetMargin = 1.5; + + /* Add 200us to the exposure time error to allow for line quantisation. */ + Duration exposureError = lastDeviceStatus_.shutterSpeed * errorFactor + 200us; + double gainError = lastDeviceStatus_.analogueGain * errorFactor; + Duration targetError = lastTargetExposure_ * errorFactor; + + /* + * Note that we don't know the exposure/gain limits of the sensor, so + * the values we keep requesting may be unachievable. For this reason + * we only insist that we're close to values in the past few frames. 
+ */ + if (deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed - exposureError && + deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed + exposureError && + deviceStatus.analogueGain > lastDeviceStatus_.analogueGain - gainError && + deviceStatus.analogueGain < lastDeviceStatus_.analogueGain + gainError && + status_.targetExposureValue > lastTargetExposure_ - targetError && + status_.targetExposureValue < lastTargetExposure_ + targetError) + lockCount_ = std::min(lockCount_ + 1, maxLockCount); + else if (deviceStatus.shutterSpeed < lastDeviceStatus_.shutterSpeed - resetMargin * exposureError || + deviceStatus.shutterSpeed > lastDeviceStatus_.shutterSpeed + resetMargin * exposureError || + deviceStatus.analogueGain < lastDeviceStatus_.analogueGain - resetMargin * gainError || + deviceStatus.analogueGain > lastDeviceStatus_.analogueGain + resetMargin * gainError || + status_.targetExposureValue < lastTargetExposure_ - resetMargin * targetError || + status_.targetExposureValue > lastTargetExposure_ + resetMargin * targetError) + lockCount_ = 0; + + lastDeviceStatus_ = deviceStatus; + lastTargetExposure_ = status_.targetExposureValue; + + LOG(RPiAgc, Debug) << "Lock count updated to " << lockCount_; + return lockCount_ == maxLockCount; +} + +void AgcChannel::housekeepConfig() +{ + /* First fetch all the up-to-date settings, so no one else has to do it. */ + status_.ev = ev_; + status_.fixedShutter = limitShutter(fixedShutter_); + status_.fixedAnalogueGain = fixedAnalogueGain_; + status_.flickerPeriod = flickerPeriod_; + LOG(RPiAgc, Debug) << "ev " << status_.ev << " fixedShutter " + << status_.fixedShutter << " fixedAnalogueGain " + << status_.fixedAnalogueGain; + /* + * Make sure the "mode" pointers point to the up-to-date things, if + * they've changed. + */ + if (meteringModeName_ != status_.meteringMode) { + auto it = config_.meteringModes.find(meteringModeName_); + if (it == config_.meteringModes.end()) { + LOG(RPiAgc, Warning) << "No metering mode " << meteringModeName_; + meteringModeName_ = status_.meteringMode; + } else { + meteringMode_ = &it->second; + status_.meteringMode = meteringModeName_; + } + } + if (exposureModeName_ != status_.exposureMode) { + auto it = config_.exposureModes.find(exposureModeName_); + if (it == config_.exposureModes.end()) { + LOG(RPiAgc, Warning) << "No exposure profile " << exposureModeName_; + exposureModeName_ = status_.exposureMode; + } else { + exposureMode_ = &it->second; + status_.exposureMode = exposureModeName_; + } + } + if (constraintModeName_ != status_.constraintMode) { + auto it = config_.constraintModes.find(constraintModeName_); + if (it == config_.constraintModes.end()) { + LOG(RPiAgc, Warning) << "No constraint list " << constraintModeName_; + constraintModeName_ = status_.constraintMode; + } else { + constraintMode_ = &it->second; + status_.constraintMode = constraintModeName_; + } + } + LOG(RPiAgc, Debug) << "exposureMode " + << exposureModeName_ << " constraintMode " + << constraintModeName_ << " meteringMode " + << meteringModeName_; +} + +void AgcChannel::fetchCurrentExposure(Metadata *imageMetadata) +{ + std::unique_lock lock(*imageMetadata); + DeviceStatus *deviceStatus = + imageMetadata->getLocked("device.status"); + if (!deviceStatus) + LOG(RPiAgc, Fatal) << "No device metadata"; + current_.shutter = deviceStatus->shutterSpeed; + current_.analogueGain = deviceStatus->analogueGain; + AgcStatus *agcStatus = + imageMetadata->getLocked("agc.status"); + current_.totalExposure = agcStatus ? 
agcStatus->totalExposureValue : 0s; + current_.totalExposureNoDG = current_.shutter * current_.analogueGain; +} + +void AgcChannel::fetchAwbStatus(Metadata *imageMetadata) +{ + awb_.gainR = 1.0; /* in case not found in metadata */ + awb_.gainG = 1.0; + awb_.gainB = 1.0; + if (imageMetadata->get("awb.status", awb_) != 0) + LOG(RPiAgc, Debug) << "No AWB status found"; +} + +static double computeInitialY(StatisticsPtr &stats, AwbStatus const &awb, + std::vector &weights, double gain) +{ + constexpr uint64_t maxVal = 1 << Statistics::NormalisationFactorPow2; + + /* + * If we have no AGC region stats, but do have a Y histogram, use that + * directly to calculate the mean Y value of the image. + */ + if (!stats->agcRegions.numRegions() && stats->yHist.bins()) { + /* + * When the gain is applied to the histogram, anything below minBin + * will scale up directly with the gain, but anything above that + * will saturate into the top bin. + */ + auto &hist = stats->yHist; + double minBin = std::min(1.0, 1.0 / gain) * hist.bins(); + double binMean = hist.interBinMean(0.0, minBin); + double numUnsaturated = hist.cumulativeFreq(minBin); + /* This term is from all the pixels that won't saturate. */ + double ySum = binMean * gain * numUnsaturated; + /* And add the ones that will saturate. */ + ySum += (hist.total() - numUnsaturated) * hist.bins(); + return ySum / hist.total() / hist.bins(); + } + + ASSERT(weights.size() == stats->agcRegions.numRegions()); + + /* + * Note that the weights are applied by the IPA to the statistics directly, + * before they are given to us here. + */ + double rSum = 0, gSum = 0, bSum = 0, pixelSum = 0; + for (unsigned int i = 0; i < stats->agcRegions.numRegions(); i++) { + auto &region = stats->agcRegions.get(i); + rSum += std::min(region.val.rSum * gain, (maxVal - 1) * region.counted); + gSum += std::min(region.val.gSum * gain, (maxVal - 1) * region.counted); + bSum += std::min(region.val.bSum * gain, (maxVal - 1) * region.counted); + pixelSum += region.counted; + } + if (pixelSum == 0.0) { + LOG(RPiAgc, Warning) << "computeInitialY: pixelSum is zero"; + return 0; + } + + double ySum; + /* Factor in the AWB correction if needed. */ + if (stats->agcStatsPos == Statistics::AgcStatsPos::PreWb) { + ySum = rSum * awb.gainR * .299 + + gSum * awb.gainG * .587 + + gSum * awb.gainB * .114; + } else + ySum = rSum * .299 + gSum * .587 + gSum * .114; + + return ySum / pixelSum / (1 << 16); +} + +/* + * We handle extra gain through EV by adjusting our Y targets. However, you + * simply can't monitor histograms once they get very close to (or beyond!) + * saturation, so we clamp the Y targets to this value. It does mean that EV + * increases don't necessarily do quite what you might expect in certain + * (contrived) cases.
+ */ + +static constexpr double EvGainYTargetLimit = 0.9; + +static double constraintComputeGain(AgcConstraint &c, const Histogram &h, double lux, + double evGain, double &targetY) +{ + targetY = c.yTarget.eval(c.yTarget.domain().clip(lux)); + targetY = std::min(EvGainYTargetLimit, targetY * evGain); + double iqm = h.interQuantileMean(c.qLo, c.qHi); + return (targetY * h.bins()) / iqm; +} + +void AgcChannel::computeGain(StatisticsPtr &statistics, Metadata *imageMetadata, + double &gain, double &targetY) +{ + struct LuxStatus lux = {}; + lux.lux = 400; /* default lux level to 400 in case no metadata found */ + if (imageMetadata->get("lux.status", lux) != 0) + LOG(RPiAgc, Warning) << "No lux level found"; + const Histogram &h = statistics->yHist; + double evGain = status_.ev * config_.baseEv; + /* + * The initial gain and target_Y come from some of the regions. After + * that we consider the histogram constraints. + */ + targetY = config_.yTarget.eval(config_.yTarget.domain().clip(lux.lux)); + targetY = std::min(EvGainYTargetLimit, targetY * evGain); + + /* + * Do this calculation a few times as brightness increase can be + * non-linear when there are saturated regions. + */ + gain = 1.0; + for (int i = 0; i < 8; i++) { + double initialY = computeInitialY(statistics, awb_, meteringMode_->weights, gain); + double extraGain = std::min(10.0, targetY / (initialY + .001)); + gain *= extraGain; + LOG(RPiAgc, Debug) << "Initial Y " << initialY << " target " << targetY + << " gives gain " << gain; + if (extraGain < 1.01) /* close enough */ + break; + } + + for (auto &c : *constraintMode_) { + double newTargetY; + double newGain = constraintComputeGain(c, h, lux.lux, evGain, newTargetY); + LOG(RPiAgc, Debug) << "Constraint has target_Y " + << newTargetY << " giving gain " << newGain; + if (c.bound == AgcConstraint::Bound::LOWER && newGain > gain) { + LOG(RPiAgc, Debug) << "Lower bound constraint adopted"; + gain = newGain; + targetY = newTargetY; + } else if (c.bound == AgcConstraint::Bound::UPPER && newGain < gain) { + LOG(RPiAgc, Debug) << "Upper bound constraint adopted"; + gain = newGain; + targetY = newTargetY; + } + } + LOG(RPiAgc, Debug) << "Final gain " << gain << " (target_Y " << targetY << " ev " + << status_.ev << " base_ev " << config_.baseEv + << ")"; +} + +void AgcChannel::computeTargetExposure(double gain) +{ + if (status_.fixedShutter && status_.fixedAnalogueGain) { + /* + * When ag and shutter are both fixed, we need to drive the + * total exposure so that we end up with a digital gain of at least + * 1/minColourGain. Otherwise we'd desaturate channels causing + * white to go cyan or magenta. + */ + double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 }); + ASSERT(minColourGain != 0.0); + target_.totalExposure = + status_.fixedShutter * status_.fixedAnalogueGain / minColourGain; + } else { + /* + * The statistics reflect the image without digital gain, so the final + * total exposure we're aiming for is: + */ + target_.totalExposure = current_.totalExposureNoDG * gain; + /* The final target exposure is also limited to what the exposure mode allows. */ + Duration maxShutter = status_.fixedShutter + ? status_.fixedShutter + : exposureMode_->shutter.back(); + maxShutter = limitShutter(maxShutter); + Duration maxTotalExposure = + maxShutter * + (status_.fixedAnalogueGain != 0.0 + ? 
status_.fixedAnalogueGain + : exposureMode_->gain.back()); + target_.totalExposure = std::min(target_.totalExposure, maxTotalExposure); + } + LOG(RPiAgc, Debug) << "Target totalExposure " << target_.totalExposure; +} + +bool AgcChannel::applyDigitalGain(double gain, double targetY) +{ + double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 }); + ASSERT(minColourGain != 0.0); + double dg = 1.0 / minColourGain; + /* + * I think this pipeline subtracts black level and rescales before we + * get the stats, so no need to worry about it. + */ + LOG(RPiAgc, Debug) << "after AWB, target dg " << dg << " gain " << gain + << " target_Y " << targetY; + /* + * Finally, if we're trying to reduce exposure but the target_Y is + * "close" to 1.0, then the gain computed for that constraint will be + * only slightly less than one, because the measured Y can never be + * larger than 1.0. When this happens, demand a large digital gain so + * that the exposure can be reduced, de-saturating the image much more + * quickly (and we then approach the correct value more quickly from + * below). + */ + bool desaturate = targetY > config_.fastReduceThreshold && + gain < sqrt(targetY); + if (desaturate) + dg /= config_.fastReduceThreshold; + LOG(RPiAgc, Debug) << "Digital gain " << dg << " desaturate? " << desaturate; + filtered_.totalExposureNoDG = filtered_.totalExposure / dg; + LOG(RPiAgc, Debug) << "Target totalExposureNoDG " << filtered_.totalExposureNoDG; + return desaturate; +} + +void AgcChannel::filterExposure() +{ + double speed = config_.speed; + /* + * AGC adapts instantly if both shutter and gain are directly specified + * or we're in the startup phase. + */ + if ((status_.fixedShutter && status_.fixedAnalogueGain) || + frameCount_ <= config_.startupFrames) + speed = 1.0; + if (!filtered_.totalExposure) { + filtered_.totalExposure = target_.totalExposure; + } else { + /* + * If close to the result go faster, to save making so many + * micro-adjustments on the way. (Make this customisable?) + */ + if (filtered_.totalExposure < 1.2 * target_.totalExposure && + filtered_.totalExposure > 0.8 * target_.totalExposure) + speed = sqrt(speed); + filtered_.totalExposure = speed * target_.totalExposure + + filtered_.totalExposure * (1.0 - speed); + } + LOG(RPiAgc, Debug) << "After filtering, totalExposure " << filtered_.totalExposure + << " no dg " << filtered_.totalExposureNoDG; +} + +void AgcChannel::divideUpExposure() +{ + /* + * Sending the fixed shutter/gain cases through the same code may seem + * unnecessary, but it will make more sense when extend this to cover + * variable aperture. + */ + Duration exposureValue = filtered_.totalExposureNoDG; + Duration shutterTime; + double analogueGain; + shutterTime = status_.fixedShutter ? status_.fixedShutter + : exposureMode_->shutter[0]; + shutterTime = limitShutter(shutterTime); + analogueGain = status_.fixedAnalogueGain != 0.0 ? 
status_.fixedAnalogueGain + : exposureMode_->gain[0]; + analogueGain = limitGain(analogueGain); + if (shutterTime * analogueGain < exposureValue) { + for (unsigned int stage = 1; + stage < exposureMode_->gain.size(); stage++) { + if (!status_.fixedShutter) { + Duration stageShutter = + limitShutter(exposureMode_->shutter[stage]); + if (stageShutter * analogueGain >= exposureValue) { + shutterTime = exposureValue / analogueGain; + break; + } + shutterTime = stageShutter; + } + if (status_.fixedAnalogueGain == 0.0) { + if (exposureMode_->gain[stage] * shutterTime >= exposureValue) { + analogueGain = exposureValue / shutterTime; + break; + } + analogueGain = exposureMode_->gain[stage]; + analogueGain = limitGain(analogueGain); + } + } + } + LOG(RPiAgc, Debug) << "Divided up shutter and gain are " << shutterTime << " and " + << analogueGain; + /* + * Finally adjust shutter time for flicker avoidance (require both + * shutter and gain not to be fixed). + */ + if (!status_.fixedShutter && !status_.fixedAnalogueGain && + status_.flickerPeriod) { + int flickerPeriods = shutterTime / status_.flickerPeriod; + if (flickerPeriods) { + Duration newShutterTime = flickerPeriods * status_.flickerPeriod; + analogueGain *= shutterTime / newShutterTime; + /* + * We should still not allow the ag to go over the + * largest value in the exposure mode. Note that this + * may force more of the total exposure into the digital + * gain as a side-effect. + */ + analogueGain = std::min(analogueGain, exposureMode_->gain.back()); + analogueGain = limitGain(analogueGain); + shutterTime = newShutterTime; + } + LOG(RPiAgc, Debug) << "After flicker avoidance, shutter " + << shutterTime << " gain " << analogueGain; + } + filtered_.shutter = shutterTime; + filtered_.analogueGain = analogueGain; +} + +void AgcChannel::writeAndFinish(Metadata *imageMetadata, bool desaturate) +{ + status_.totalExposureValue = filtered_.totalExposure; + status_.targetExposureValue = desaturate ? 0s : target_.totalExposureNoDG; + status_.shutterTime = filtered_.shutter; + status_.analogueGain = filtered_.analogueGain; + /* + * Write to metadata as well, in case anyone wants to update the camera + * immediately. + */ + imageMetadata->set("agc.status", status_); + LOG(RPiAgc, Debug) << "Output written, total exposure requested is " + << filtered_.totalExposure; + LOG(RPiAgc, Debug) << "Camera exposure update: shutter time " << filtered_.shutter + << " analogue gain " << filtered_.analogueGain; +} + +Duration AgcChannel::limitShutter(Duration shutter) +{ + /* + * shutter == 0 is a special case for fixed shutter values, and must pass + * through unchanged + */ + if (!shutter) + return shutter; + + shutter = std::clamp(shutter, mode_.minShutter, maxShutter_); + return shutter; +} + +double AgcChannel::limitGain(double gain) const +{ + /* + * Only limit the lower bounds of the gain value to what the sensor limits. + * The upper bound on analogue gain will be made up with additional digital + * gain applied by the ISP. 
+ * + * gain == 0.0 is a special case for fixed shutter values, and must pass + * through unchanged + */ + if (!gain) + return gain; + + gain = std::max(gain, mode_.minAnalogueGain); + return gain; +} diff --git a/src/ipa/rpi/controller/rpi/agc_channel.h b/src/ipa/rpi/controller/rpi/agc_channel.h new file mode 100644 index 00000000..dc4356f3 --- /dev/null +++ b/src/ipa/rpi/controller/rpi/agc_channel.h @@ -0,0 +1,135 @@ +/* SPDX-License-Identifier: BSD-2-Clause */ +/* + * Copyright (C) 2019, Raspberry Pi Ltd + * + * agc.h - AGC/AEC control algorithm + */ +#pragma once + +#include +#include + +#include + +#include "../agc_status.h" +#include "../awb_status.h" +#include "../pwl.h" + +/* This is our implementation of AGC. */ + +namespace RPiController { + +struct AgcMeteringMode { + std::vector weights; + int read(const libcamera::YamlObject ¶ms); +}; + +struct AgcExposureMode { + std::vector shutter; + std::vector gain; + int read(const libcamera::YamlObject ¶ms); +}; + +struct AgcConstraint { + enum class Bound { LOWER = 0, + UPPER = 1 }; + Bound bound; + double qLo; + double qHi; + Pwl yTarget; + int read(const libcamera::YamlObject ¶ms); +}; + +typedef std::vector AgcConstraintMode; + +struct AgcConfig { + int read(const libcamera::YamlObject ¶ms); + std::map meteringModes; + std::map exposureModes; + std::map constraintModes; + Pwl yTarget; + double speed; + uint16_t startupFrames; + unsigned int convergenceFrames; + double maxChange; + double minChange; + double fastReduceThreshold; + double speedUpThreshold; + std::string defaultMeteringMode; + std::string defaultExposureMode; + std::string defaultConstraintMode; + double baseEv; + libcamera::utils::Duration defaultExposureTime; + double defaultAnalogueGain; +}; + +class AgcChannel +{ +public: + AgcChannel(); + int read(const libcamera::YamlObject ¶ms, + const Controller::HardwareConfig &hardwareConfig); + unsigned int getConvergenceFrames() const; + std::vector const &getWeights() const; + void setEv(double ev); + void setFlickerPeriod(libcamera::utils::Duration flickerPeriod); + void setMaxShutter(libcamera::utils::Duration maxShutter); + void setFixedShutter(libcamera::utils::Duration fixedShutter); + void setFixedAnalogueGain(double fixedAnalogueGain); + void setMeteringMode(std::string const &meteringModeName); + void setExposureMode(std::string const &exposureModeName); + void setConstraintMode(std::string const &contraintModeName); + void enableAuto(); + void disableAuto(); + void switchMode(CameraMode const &cameraMode, Metadata *metadata); + void prepare(Metadata *imageMetadata); + void process(StatisticsPtr &stats, Metadata *imageMetadata); + +private: + bool updateLockStatus(DeviceStatus const &deviceStatus); + AgcConfig config_; + void housekeepConfig(); + void fetchCurrentExposure(Metadata *imageMetadata); + void fetchAwbStatus(Metadata *imageMetadata); + void computeGain(StatisticsPtr &statistics, Metadata *imageMetadata, + double &gain, double &targetY); + void computeTargetExposure(double gain); + void filterExposure(); + bool applyDigitalGain(double gain, double targetY); + void divideUpExposure(); + void writeAndFinish(Metadata *imageMetadata, bool desaturate); + libcamera::utils::Duration limitShutter(libcamera::utils::Duration shutter); + double limitGain(double gain) const; + AgcMeteringMode *meteringMode_; + AgcExposureMode *exposureMode_; + AgcConstraintMode *constraintMode_; + CameraMode mode_; + uint64_t frameCount_; + AwbStatus awb_; + struct ExposureValues { + ExposureValues(); + + libcamera::utils::Duration 
shutter; + double analogueGain; + libcamera::utils::Duration totalExposure; + libcamera::utils::Duration totalExposureNoDG; /* without digital gain */ + }; + ExposureValues current_; /* values for the current frame */ + ExposureValues target_; /* calculate the values we want here */ + ExposureValues filtered_; /* these values are filtered towards target */ + AgcStatus status_; + int lockCount_; + DeviceStatus lastDeviceStatus_; + libcamera::utils::Duration lastTargetExposure_; + /* Below here the "settings" that applications can change. */ + std::string meteringModeName_; + std::string exposureModeName_; + std::string constraintModeName_; + double ev_; + libcamera::utils::Duration flickerPeriod_; + libcamera::utils::Duration maxShutter_; + libcamera::utils::Duration fixedShutter_; + double fixedAnalogueGain_; +}; + +} /* namespace RPiController */ From patchwork Mon Jul 31 09:46:39 2023 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: David Plowman X-Patchwork-Id: 18906 Return-Path: X-Original-To: parsemail@patchwork.libcamera.org Delivered-To: parsemail@patchwork.libcamera.org Received: from lancelot.ideasonboard.com (lancelot.ideasonboard.com [92.243.16.209]) by patchwork.libcamera.org (Postfix) with ESMTPS id 35C80C32A9 for ; Mon, 31 Jul 2023 09:46:51 +0000 (UTC) Received: from lancelot.ideasonboard.com (localhost [IPv6:::1]) by lancelot.ideasonboard.com (Postfix) with ESMTP id 3ABA3627F9; Mon, 31 Jul 2023 11:46:50 +0200 (CEST) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org; s=mail; t=1690796810; bh=ZJSi6bOkfN3TjdsBu0GV55WnZa2Ci/74JphlKiKVBE8=; h=To:Date:In-Reply-To:References:Subject:List-Id:List-Unsubscribe: List-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To: From; b=ucl7xIw6WbA4lv+qq70QyRsqaWVX70GqXLckNFIygMTPweT1Brnq2GXMuhsMnuks8 0iF9YAY79tM8PN0Ko9M29ZlabCX6g4xEkAVjDwAZg03U5JqgBsu+VBbPglxVsmKGP4 Y0tySKUXo22RIRhoQfgwYxcpzYsj9nIhfrqu9bGSgEi67InlqILckR9eLhsVIf3UCw QaxELqHcegfkkwBfLBPZXDXgPmATzS0tav3KuYK9Dds9LzOHHMJ+8bjyAyBmHC6QOV /vSCVfEOU4+OvAJP8yGc9hR6v1zSwP0s0jVRF/+VJrAV3f7fVyAKzQGDfhWGWhbCvi d/UbfJ+SSdX9A== Received: from mail-wm1-x32b.google.com (mail-wm1-x32b.google.com [IPv6:2a00:1450:4864:20::32b]) by lancelot.ideasonboard.com (Postfix) with ESMTPS id 27202627EF for ; Mon, 31 Jul 2023 11:46:47 +0200 (CEST) Authentication-Results: lancelot.ideasonboard.com; dkim=pass (2048-bit key; unprotected) header.d=raspberrypi.com header.i=@raspberrypi.com header.b="IEL0U3VN"; dkim-atps=neutral Received: by mail-wm1-x32b.google.com with SMTP id 5b1f17b1804b1-3fbf1b82d9cso39674235e9.2 for ; Mon, 31 Jul 2023 02:46:47 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=raspberrypi.com; s=google; t=1690796806; x=1691401606; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=8+ExL/Z4e7Z72TgXvZCoUG9fhEbBhbxBscCQO3R9Tko=; b=IEL0U3VNPzDWQPvIBFOA7kUPF57l6JsX5zIxmVsCFM49SYdgbjoLl9s51JlDbUiQD9 LjXTIVTh49ZmtGdpziumdFYGoMipNUaKosXoyzMSdDRLf4HupzyljAvEyo2uTt7F9TTH mDyru3vXZejNqisXzmsNbbwlG3BKSPveK8ME6z/um/fUuqS4kH2qT3IJfJ8Tfmz2YyqP RLhcFq2vur5VR1n9ryxQC7ugvUUFpl5MuRq+IKRkE2dRSyedT0eL3MjNzEs1J7q2bL00 VSjhmK9y+9h1VPz7QdNbpEuOcmNL0o+K+IYRtZV6M7pxwH7wmcvenIlvBuln+Dbj7/8W mODg== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20221208; t=1690796806; x=1691401606; h=content-transfer-encoding:mime-version:references:in-reply-to 
:message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=8+ExL/Z4e7Z72TgXvZCoUG9fhEbBhbxBscCQO3R9Tko=; b=i6RF+A1MiqxCXjPbPUyIzDgh4Cd+s36ZBj4Vmy5Nwe7DquxT6e+SOg5Vp1DzXIyzs9 bX6xpysATe92swx7fe4JsYEztOIcTjMZ/VTmW0ALDSGFwTGZurA/be/jMm2p5i7jjYfX k24NcN+daDEe9TNcZj0MdVvFqQXGeHP9/R9L4sBxgnsRL7RH9WEqpqKZwAyzrtGQVSwy FwRXWGJ1i1ne+CXslA7tAfv4qT/EBY/4mLXtSctxa5UuOnLyLXGiCjNSPR41KhoagnEm K1Sj0ml/9pKAlOL+8bBITmclaVF1xlmjyw7qDueA5NJTPbwV2Lo1JPEcQr2gA0lp6K+b 6bJg== X-Gm-Message-State: ABy/qLYc+AyPy6S3Yq78Z7YNNmZPg/kUs2CGDkertMJQK9pWa3k9H1G0 +24CB06MySeivtr6nqi/KNGaw/pfqJhNMx7i5K0= X-Google-Smtp-Source: APBJJlH/b5FGCEMCeeeQ/5l2eYWjNDJzO8KjUUwWrKwllqx+lVnxEFaP6bljfKoQk0HCoTst3tr7Ww== X-Received: by 2002:a1c:7917:0:b0:3fe:16f4:d865 with SMTP id l23-20020a1c7917000000b003fe16f4d865mr3915666wme.23.1690796806169; Mon, 31 Jul 2023 02:46:46 -0700 (PDT) Received: from pi4-davidp.pitowers.org ([2a00:1098:3142:14:2bce:64d6:1a5c:49a2]) by smtp.gmail.com with ESMTPSA id 9-20020a05600c240900b003fa98908014sm13612838wmp.8.2023.07.31.02.46.45 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Mon, 31 Jul 2023 02:46:45 -0700 (PDT) To: libcamera-devel@lists.libcamera.org Date: Mon, 31 Jul 2023 10:46:39 +0100 Message-Id: <20230731094641.73646-4-david.plowman@raspberrypi.com> X-Mailer: git-send-email 2.30.2 In-Reply-To: <20230731094641.73646-1-david.plowman@raspberrypi.com> References: <20230731094641.73646-1-david.plowman@raspberrypi.com> MIME-Version: 1.0 Subject: [libcamera-devel] [PATCH 3/5] ipa: rpi: agc: Implementation of multi-channel AGC X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , X-Patchwork-Original-From: David Plowman via libcamera-devel From: David Plowman Reply-To: David Plowman Errors-To: libcamera-devel-bounces@lists.libcamera.org Sender: "libcamera-devel" The switchMode, prepare and process methods are updated to implement multi-channel AGC correctly: * switchMode now invokes switchMode on all the channels (whether active or not). * prepare must find what channel the current frame is, and run on behalf of that channel. * process updates the most recent DeviceStatus and statistics for the channel of the frame that has just arrived, but generates updated values working through the active channels in round-robin fashion. One minor detail in process is that we don't want to change the DeviceStatus metadata of the current frame, so we now pass this to the AgcChannel's process method, rather than letting it find the DeviceStatus in the metadata. 
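To make the round-robin bookkeeping described above concrete, here is a minimal standalone C++ sketch. It is not part of the patch: Stats and ChannelData below are simplified stand-ins for the real statistics and per-channel state, and the channel each frame "belongs to" is faked rather than read back from the delayed AGC status.

#include <cstdio>
#include <optional>
#include <vector>

struct Stats {
	int frame; /* stand-in for the real statistics buffer */
};

struct ChannelData {
	std::optional<Stats> stats; /* most recent statistics seen for this channel */
};

int main()
{
	std::vector<unsigned int> activeChannels{ 0, 1 };
	std::vector<ChannelData> channelData(2);
	unsigned int index = 0; /* index into activeChannels */

	for (int frame = 0; frame < 4; frame++) {
		/* The frame that has just arrived came from some channel; cache its stats. */
		unsigned int statsChannel = frame % 2; /* faked here; really taken from the delayed status */
		channelData[statsChannel].stats = Stats{ frame };

		/* New values are generated for the next active channel, in round-robin fashion. */
		unsigned int computeChannel = activeChannels[index];
		if (channelData[computeChannel].stats)
			std::printf("frame %d: run channel %u using its cached stats (from frame %d)\n",
				    frame, computeChannel, channelData[computeChannel].stats->frame);
		else
			std::printf("frame %d: channel %u not seen yet\n", frame, computeChannel);

		index = (index + 1) % activeChannels.size();
	}

	return 0;
}

Running this shows updates being generated for channels 0 and 1 alternately, regardless of which channel supplied the statistics for the frame that arrived, which mirrors the behaviour the process() changes below implement with the real types.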
Signed-off-by: David Plowman Reviewed-by: Naushir Patuck --- src/ipa/rpi/controller/agc_status.h | 1 + src/ipa/rpi/controller/rpi/agc.cpp | 108 +++++++++++++++++++-- src/ipa/rpi/controller/rpi/agc.h | 1 + src/ipa/rpi/controller/rpi/agc_channel.cpp | 13 +-- src/ipa/rpi/controller/rpi/agc_channel.h | 4 +- 5 files changed, 109 insertions(+), 18 deletions(-) diff --git a/src/ipa/rpi/controller/agc_status.h b/src/ipa/rpi/controller/agc_status.h index 597eddd7..e5c4ee22 100644 --- a/src/ipa/rpi/controller/agc_status.h +++ b/src/ipa/rpi/controller/agc_status.h @@ -36,6 +36,7 @@ struct AgcStatus { int floatingRegionEnable; libcamera::utils::Duration fixedShutter; double fixedAnalogueGain; + unsigned int channel; }; struct AgcPrepareStatus { diff --git a/src/ipa/rpi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp index c9c9c297..7e627bba 100644 --- a/src/ipa/rpi/controller/rpi/agc.cpp +++ b/src/ipa/rpi/controller/rpi/agc.cpp @@ -22,7 +22,8 @@ LOG_DEFINE_CATEGORY(RPiAgc) Agc::Agc(Controller *controller) : AgcAlgorithm(controller), - activeChannels_({ 0 }) + activeChannels_({ 0 }), + index_(0) { } @@ -203,20 +204,113 @@ void Agc::setActiveChannels(const std::vector &activeChannels) void Agc::switchMode(CameraMode const &cameraMode, Metadata *metadata) { - LOG(RPiAgc, Debug) << "switchMode for channel 0"; - channelData_[0].channel.switchMode(cameraMode, metadata); + /* + * We run switchMode on every channel, and then we're going to start over + * with the first active channel again which means that this channel's + * status needs to be the one we leave in the metadata. + */ + AgcStatus status; + + for (unsigned int channelIndex = 0; channelIndex < channelData_.size(); channelIndex++) { + LOG(RPiAgc, Debug) << "switchMode for channel " << channelIndex; + channelData_[channelIndex].channel.switchMode(cameraMode, metadata); + if (channelIndex == activeChannels_[0]) + metadata->get("agc.status", status); + } + + status.channel = activeChannels_[0]; + metadata->set("agc.status", status); + index_ = 0; +} + +static void getChannelIndex(Metadata *metadata, const char *message, unsigned int &channelIndex) +{ + std::unique_lock lock(*metadata); + AgcStatus *status = metadata->getLocked("agc.delayed_status"); + if (status) + channelIndex = status->channel; + else + /* This does happen at startup, otherwise it would be a Warning or Error. */ + LOG(RPiAgc, Debug) << message; +} + +static void setChannelIndex(Metadata *metadata, const char *message, unsigned int channelIndex) +{ + std::unique_lock lock(*metadata); + AgcStatus *status = metadata->getLocked("agc.status"); + if (status) + status->channel = channelIndex; + else + /* This does happen at startup, otherwise it would be a Warning or Error. */ + LOG(RPiAgc, Debug) << message; } void Agc::prepare(Metadata *imageMetadata) { - LOG(RPiAgc, Debug) << "prepare for channel 0"; - channelData_[0].channel.prepare(imageMetadata); + /* + * The DeviceStatus in the metadata should be correct for the image we + * are processing. THe delayed status should tell us what channel this frame + * was from, so we will use that channel's prepare method. + * + * \todo To be honest, there's not much that's stateful in the prepare methods + * so we should perhaps re-evaluate whether prepare even needs to be done + * "per channel". 
+ */ + unsigned int channelIndex = activeChannels_[0]; + getChannelIndex(imageMetadata, "prepare: no delayed status", channelIndex); + + LOG(RPiAgc, Debug) << "prepare for channel " << channelIndex; + channelData_[channelIndex].channel.prepare(imageMetadata); } void Agc::process(StatisticsPtr &stats, Metadata *imageMetadata) { - LOG(RPiAgc, Debug) << "process for channel 0"; - channelData_[0].channel.process(stats, imageMetadata); + /* + * We want to generate values for the next channel in round robin fashion + * (i.e. the channel at location index_ in the activeChannel list), even though + * the statistics we have will be for a different channel (which we find + * again from the delayed status). + */ + + /* Generate updated AGC values for this channel: */ + unsigned int channelIndex = activeChannels_[index_]; + AgcChannelData &channelData = channelData_[channelIndex]; + /* Stats are from this channel: */ + unsigned int statsIndex = 0; + getChannelIndex(imageMetadata, "process: no delayed status for stats", statsIndex); + LOG(RPiAgc, Debug) << "process for channel " << channelIndex; + + /* + * We keep a cache of the most recent DeviceStatus and stats for each channel, + * so that we can invoke the next channel's process method with the most up to date + * values. + */ + LOG(RPiAgc, Debug) << "Save DeviceStatus and stats for channel " << statsIndex; + DeviceStatus deviceStatus; + if (imageMetadata->get("device.status", deviceStatus) == 0) + channelData_[statsIndex].deviceStatus = deviceStatus; + else + /* Every frame should have a DeviceStatus. */ + LOG(RPiAgc, Error) << "process: no device status found"; + channelData_[statsIndex].statistics = stats; + + /* + * Finally fetch the most recent DeviceStatus and stats for this channel, if both + * exist, and call process(). We must make the agc.status metadata record correctly + * which channel this is. + */ + if (channelData.statistics && channelData.deviceStatus) { + deviceStatus = *channelData.deviceStatus; + stats = channelData.statistics; + } else + /* Can also happen when new channels start. */ + LOG(RPiAgc, Debug) << "process: channel " << channelIndex << " not seen yet"; + + channelData.channel.process(stats, &deviceStatus, imageMetadata); + setChannelIndex(imageMetadata, "process: no AGC status found", channelIndex); + + /* And onto the next channel for the next call. */ + index_ = (index_ + 1) % activeChannels_.size(); } /* Register algorithm with the system. */ diff --git a/src/ipa/rpi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h index a9158910..2eed2bab 100644 --- a/src/ipa/rpi/controller/rpi/agc.h +++ b/src/ipa/rpi/controller/rpi/agc.h @@ -53,6 +53,7 @@ private: int checkChannel(unsigned int channel) const; std::vector channelData_; std::vector activeChannels_; + unsigned int index_; /* index into the activeChannels_ */ }; } /* namespace RPiController */ diff --git a/src/ipa/rpi/controller/rpi/agc_channel.cpp b/src/ipa/rpi/controller/rpi/agc_channel.cpp index d6e30ef2..ddec611f 100644 --- a/src/ipa/rpi/controller/rpi/agc_channel.cpp +++ b/src/ipa/rpi/controller/rpi/agc_channel.cpp @@ -447,7 +447,7 @@ void AgcChannel::prepare(Metadata *imageMetadata) } } -void AgcChannel::process(StatisticsPtr &stats, Metadata *imageMetadata) +void AgcChannel::process(StatisticsPtr &stats, const DeviceStatus *deviceStatus, Metadata *imageMetadata) { frameCount_++; /* @@ -458,7 +458,7 @@ void AgcChannel::process(StatisticsPtr &stats, Metadata *imageMetadata) /* Fetch the AWB status immediately, so that we can assume it's there. 
*/ fetchAwbStatus(imageMetadata); /* Get the current exposure values for the frame that's just arrived. */ - fetchCurrentExposure(imageMetadata); + fetchCurrentExposure(deviceStatus); /* Compute the total gain we require relative to the current exposure. */ double gain, targetY; computeGain(stats, imageMetadata, gain, targetY); @@ -570,18 +570,13 @@ void AgcChannel::housekeepConfig() << meteringModeName_; } -void AgcChannel::fetchCurrentExposure(Metadata *imageMetadata) +void AgcChannel::fetchCurrentExposure(const DeviceStatus *deviceStatus) { - std::unique_lock lock(*imageMetadata); - DeviceStatus *deviceStatus = - imageMetadata->getLocked("device.status"); if (!deviceStatus) LOG(RPiAgc, Fatal) << "No device metadata"; current_.shutter = deviceStatus->shutterSpeed; current_.analogueGain = deviceStatus->analogueGain; - AgcStatus *agcStatus = - imageMetadata->getLocked("agc.status"); - current_.totalExposure = agcStatus ? agcStatus->totalExposureValue : 0s; + current_.totalExposure = 0s; /* this value is unused */ current_.totalExposureNoDG = current_.shutter * current_.analogueGain; } diff --git a/src/ipa/rpi/controller/rpi/agc_channel.h b/src/ipa/rpi/controller/rpi/agc_channel.h index dc4356f3..0e3d44b0 100644 --- a/src/ipa/rpi/controller/rpi/agc_channel.h +++ b/src/ipa/rpi/controller/rpi/agc_channel.h @@ -83,13 +83,13 @@ public: void disableAuto(); void switchMode(CameraMode const &cameraMode, Metadata *metadata); void prepare(Metadata *imageMetadata); - void process(StatisticsPtr &stats, Metadata *imageMetadata); + void process(StatisticsPtr &stats, const DeviceStatus *deviceStatus, Metadata *imageMetadata); private: bool updateLockStatus(DeviceStatus const &deviceStatus); AgcConfig config_; void housekeepConfig(); - void fetchCurrentExposure(Metadata *imageMetadata); + void fetchCurrentExposure(const DeviceStatus *deviceStatus); void fetchAwbStatus(Metadata *imageMetadata); void computeGain(StatisticsPtr &statistics, Metadata *imageMetadata, double &gain, double &targetY); From patchwork Mon Jul 31 09:46:40 2023 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: David Plowman X-Patchwork-Id: 18907 Return-Path: X-Original-To: parsemail@patchwork.libcamera.org Delivered-To: parsemail@patchwork.libcamera.org Received: from lancelot.ideasonboard.com (lancelot.ideasonboard.com [92.243.16.209]) by patchwork.libcamera.org (Postfix) with ESMTPS id 80DABBDB13 for ; Mon, 31 Jul 2023 09:46:52 +0000 (UTC) Received: from lancelot.ideasonboard.com (localhost [IPv6:::1]) by lancelot.ideasonboard.com (Postfix) with ESMTP id 2F93E627EF; Mon, 31 Jul 2023 11:46:52 +0200 (CEST) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org; s=mail; t=1690796812; bh=jGq5jBI3BqwxJE7FTkFcHo3Tzt/Lct4FiWi97U8t/Dw=; h=To:Date:In-Reply-To:References:Subject:List-Id:List-Unsubscribe: List-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To: From; b=cq6Q+hxRL2Bb2036cb2E6OUW2aNU0jGEUjMmkQ9ERNixklmOdJW3m3oHV/JcMPm+l T0VozjCettKXs1O75lp4bryXjKkkU21/1HE77gI0W5V2Ih3yxdEWjxKCJjRLlaUMIv jHJBWmYUqp8p/mYUL7U/kf02giYp8aVbKGqImaTyLJLmaKjGQJ05JVAbmPM7fsaaxO Yo6bfhzIgoSJIhYn75Db8qE/vCQBlaYzTL1/Xdx8Mpwtd/pgXOeuGGBEB2W6qsWPgo kCeydiRSXJ/5CWONWy/em4w4r8UJY89MAx+ZBPr6ii1QaIS7MCHxVqN11Z90mFspKT 1oqkNlhy5sIyg== Received: from mail-lj1-x233.google.com (mail-lj1-x233.google.com [IPv6:2a00:1450:4864:20::233]) by lancelot.ideasonboard.com (Postfix) with ESMTPS id 23FDC627ED for ; Mon, 31 Jul 2023 11:46:48 +0200 (CEST) Authentication-Results: 
lancelot.ideasonboard.com; dkim=pass (2048-bit key; unprotected) header.d=raspberrypi.com header.i=@raspberrypi.com header.b="PcxU5ntk"; dkim-atps=neutral Received: by mail-lj1-x233.google.com with SMTP id 38308e7fff4ca-2b74fa5e7d7so62476651fa.2 for ; Mon, 31 Jul 2023 02:46:48 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=raspberrypi.com; s=google; t=1690796807; x=1691401607; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=9vptURN2phqMoGwpBtJfb059dhhw16s4PNq8MmQlO9c=; b=PcxU5ntk9nVmjQPGYiB98sQC4cjYPEuaxew9p/w7sCVNhp9lOi6f9lPjOIDhd8r/ZG RNxQ5FLtEkyUpuxlDO8AGnxanv7eEe1xX+i+mxpkwV8MDMfQaeFk3Q7NGbEowvK6IAxu OJdAFb7ccbXeiWQgyz/pYIF5kC6ES/nL4ceuZOBrKhhrvK+pL9jYWjgB8UCd+1QLEOrZ WU67DO+0TceNJyZRiLOKAX0fyQrSv/pwFEstejGedkftHVbodYZrQXQ+Q98sAG880HW3 v78fnDEaIIihGfD8X0MqVy+5QdRJgmQaHrPOf9oPz/F7ZCTVgjvasTpk3X4Wym6jElwH 2Dww== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20221208; t=1690796807; x=1691401607; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=9vptURN2phqMoGwpBtJfb059dhhw16s4PNq8MmQlO9c=; b=Sf2n8ByIQYTYYiPfXBfWR+GOORcnptQEw3ql8SjGMD9r7rzCNG7TCjLfZ1T911x5N3 17gmDy7acnxiSBXVwMNaBlRQxS95GiHlokB2NXadqhzyMI2gCft0DYKiQvjkRtLvombx M/fUFQmt4S4gRWUsSgaQ9cOoYexPot6URVDOPDl8KI9f8iDh7k4NW7/rTo+/l6BAlxRk Hlbu9NmLLCLuReAdVZqqCW/rTeakBXeckEM4UWNK5V3YqI38GM6c5WBmWsoLjz1wq5xb jDWyUweLGuhSEH+Ya9T+f2fJmhz2oUzDI7U1Pw1YNIAMtlXn2/fg1Xs6vI3C/DOXpr9b QYXg== X-Gm-Message-State: ABy/qLbPgPS1sAR+G6E2SQUx02pWW+qAd2LSoQTFcAD7Tva+rDxiJNJP 138zEPpvYuAQccsqw8pJHwUo/ZNlydQ1zB9RPiY= X-Google-Smtp-Source: APBJJlGJe5DiqDhgfKL8QlafMloOwTkNiWXt91kNby/O38RTSvX3zLJGr3gUF0Fa9ejZR+QzKrBZqw== X-Received: by 2002:a2e:93cd:0:b0:2b9:43a7:376e with SMTP id p13-20020a2e93cd000000b002b943a7376emr3881949ljh.29.1690796807039; Mon, 31 Jul 2023 02:46:47 -0700 (PDT) Received: from pi4-davidp.pitowers.org ([2a00:1098:3142:14:2bce:64d6:1a5c:49a2]) by smtp.gmail.com with ESMTPSA id 9-20020a05600c240900b003fa98908014sm13612838wmp.8.2023.07.31.02.46.46 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Mon, 31 Jul 2023 02:46:46 -0700 (PDT) To: libcamera-devel@lists.libcamera.org Date: Mon, 31 Jul 2023 10:46:40 +0100 Message-Id: <20230731094641.73646-5-david.plowman@raspberrypi.com> X-Mailer: git-send-email 2.30.2 In-Reply-To: <20230731094641.73646-1-david.plowman@raspberrypi.com> References: <20230731094641.73646-1-david.plowman@raspberrypi.com> MIME-Version: 1.0 Subject: [libcamera-devel] [PATCH 4/5] ipa: rpi: agc: Add AgcChannelConstraint class X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , X-Patchwork-Original-From: David Plowman via libcamera-devel From: David Plowman Reply-To: David Plowman Errors-To: libcamera-devel-bounces@lists.libcamera.org Sender: "libcamera-devel" A channel constraint is somewhat similar to the upper/lower bound constraints that we use elsewhere, but these constraints apply between multiple AGC channels. For example, it lets you say things like "don't let the channel 1 total exposure be more than 8x that of channel 0", and so on. By using both an upper and lower bound constraint, you could fix one AGC channel always to be a fixed ratio of another. Also read a vector of them (if present) when loading the tuning file. 
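To illustrate the idea, here is a minimal standalone C++ sketch (not part of the patch) of how an upper or lower bound with a factor can clamp one channel's total exposure against another's. The Duration alias merely stands in for libcamera::utils::Duration, and the ChannelConstraint struct only mirrors, and is not, the AgcChannelConstraint class this patch adds.

#include <algorithm>
#include <chrono>
#include <cstdio>

enum class Bound { Lower, Upper };

struct ChannelConstraint {
	unsigned int channel; /* the other channel we are constrained against */
	Bound bound;
	double factor;
};

using Duration = std::chrono::duration<double, std::micro>;

Duration applyConstraint(Duration exposure, Duration otherExposure, const ChannelConstraint &c)
{
	Duration limit = otherExposure * c.factor;
	if (c.bound == Bound::Upper)
		return std::min(exposure, limit); /* e.g. "no more than 8x channel 0" */
	return std::max(exposure, limit); /* LOWER bound: at least factor times the other channel */
}

int main()
{
	/* Channel 1 may request at most 8x the total exposure of channel 0. */
	ChannelConstraint c{ 0u, Bound::Upper, 8.0 };
	Duration channel0{ 1000.0 };  /* 1000us requested by channel 0 */
	Duration channel1{ 20000.0 }; /* 20000us requested by channel 1 */

	Duration clamped = applyConstraint(channel1, channel0, c);
	std::printf("channel 1 total exposure clamped to %.0f us\n", clamped.count());

	return 0;
}

As the commit message notes, pairing an UPPER and a LOWER constraint with the same factor would pin one channel to a fixed ratio of another.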
Signed-off-by: David Plowman Reviewed-by: Naushir Patuck --- src/ipa/rpi/controller/rpi/agc_channel.cpp | 42 ++++++++++++++++++++++ src/ipa/rpi/controller/rpi/agc_channel.h | 10 ++++++ 2 files changed, 52 insertions(+) diff --git a/src/ipa/rpi/controller/rpi/agc_channel.cpp b/src/ipa/rpi/controller/rpi/agc_channel.cpp index ddec611f..ed8a3b30 100644 --- a/src/ipa/rpi/controller/rpi/agc_channel.cpp +++ b/src/ipa/rpi/controller/rpi/agc_channel.cpp @@ -173,6 +173,42 @@ readConstraintModes(std::map &constraintModes, return { 0, first }; } +int AgcChannelConstraint::read(const libcamera::YamlObject ¶ms) +{ + std::string boundString = params["bound"].get(""); + transform(boundString.begin(), boundString.end(), + boundString.begin(), ::toupper); + if (boundString != "UPPER" && boundString != "LOWER") { + LOG(RPiAgc, Error) << "AGC channelconstraint type should be UPPER or LOWER"; + return -EINVAL; + } + bound = boundString == "UPPER" ? Bound::UPPER : Bound::LOWER; + + auto value = params["factor"].get(); + if (!value) + return -EINVAL; + factor = *value; + + return 0; +} + +int readChannelConstraints(std::vector channelConstraints, + const libcamera::YamlObject ¶ms) +{ + int ret = 0; + + for (const auto &p : params.asList()) { + AgcChannelConstraint constraint; + ret = constraint.read(p); + if (ret) + return ret; + + channelConstraints.push_back(std::move(constraint)); + } + + return ret; +} + int AgcConfig::read(const libcamera::YamlObject ¶ms) { LOG(RPiAgc, Debug) << "AgcConfig"; @@ -191,6 +227,12 @@ int AgcConfig::read(const libcamera::YamlObject ¶ms) if (ret) return ret; + if (params.contains("channel_constraints")) { + ret = readChannelConstraints(channelConstraints, params["channel_constraints"]); + if (ret) + return ret; + } + ret = yTarget.read(params["y_target"]); if (ret) return ret; diff --git a/src/ipa/rpi/controller/rpi/agc_channel.h b/src/ipa/rpi/controller/rpi/agc_channel.h index 0e3d44b0..446125ef 100644 --- a/src/ipa/rpi/controller/rpi/agc_channel.h +++ b/src/ipa/rpi/controller/rpi/agc_channel.h @@ -42,11 +42,21 @@ struct AgcConstraint { typedef std::vector AgcConstraintMode; +struct AgcChannelConstraint { + enum class Bound { LOWER = 0, + UPPER = 1 }; + Bound bound; + unsigned int channel; + double factor; + int read(const libcamera::YamlObject ¶ms); +}; + struct AgcConfig { int read(const libcamera::YamlObject ¶ms); std::map meteringModes; std::map exposureModes; std::map constraintModes; + std::vector channelConstraints; Pwl yTarget; double speed; uint16_t startupFrames; From patchwork Mon Jul 31 09:46:41 2023 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: David Plowman X-Patchwork-Id: 18909 Return-Path: X-Original-To: parsemail@patchwork.libcamera.org Delivered-To: parsemail@patchwork.libcamera.org Received: from lancelot.ideasonboard.com (lancelot.ideasonboard.com [92.243.16.209]) by patchwork.libcamera.org (Postfix) with ESMTPS id BDD3FBDB13 for ; Mon, 31 Jul 2023 09:46:54 +0000 (UTC) Received: from lancelot.ideasonboard.com (localhost [IPv6:::1]) by lancelot.ideasonboard.com (Postfix) with ESMTP id 59DFD627F5; Mon, 31 Jul 2023 11:46:54 +0200 (CEST) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org; s=mail; t=1690796814; bh=/uTwRHVyrFT2HTsBf8wu4vdEWf1dVb1Bcnb02Sh6A5o=; h=To:Date:In-Reply-To:References:Subject:List-Id:List-Unsubscribe: List-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To: From; b=yD74CfNRbAW6vXPAbdK6Cm0h1faX8gi7EmFkKU1x+h5Cn7VXL1EOzpp7MnIFS//jM 
+tIq/orQxiTB1cCTYUSgvefQqPZf20ZqXiEMq3p+HBehxLo3fQ6YOwjRr4QT8rdGpW WQGhBPfJUh3bmu84Rh/lDmluMiM9xB3O7R8TgNwBFW7HTGdBERPLgIcLLQ2TOxou+9 OuXwMlUepwJ9FhJiThDusPqTNE1C5HNzuqtNncsGk5ujqNpOQTY2P8W/uHPugrBQWt kWyF1o/VptwO2Tp8rBnkgGH78gdI5NhDH9kD2KCL4syBxBGFAhEQR4muE3BZKC3oI/ D0v5M4qWV82tA== Received: from mail-wm1-x331.google.com (mail-wm1-x331.google.com [IPv6:2a00:1450:4864:20::331]) by lancelot.ideasonboard.com (Postfix) with ESMTPS id 7FB4C627F4 for ; Mon, 31 Jul 2023 11:46:48 +0200 (CEST) Authentication-Results: lancelot.ideasonboard.com; dkim=pass (2048-bit key; unprotected) header.d=raspberrypi.com header.i=@raspberrypi.com header.b="p3QU2A9K"; dkim-atps=neutral Received: by mail-wm1-x331.google.com with SMTP id 5b1f17b1804b1-3fbc63c2e84so48072575e9.3 for ; Mon, 31 Jul 2023 02:46:48 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=raspberrypi.com; s=google; t=1690796808; x=1691401608; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=4rXIM8Dmbucq8jQtDzcDP9vwhEVUIMbE0QaT2+4DNc4=; b=p3QU2A9KbeSr6QkEWyNc2YMZktFrGADj36Et2ZP9Mka7wPKvkhUWSaDzDH+bE14S4l QL6oaWwfPhWmH8w8SWW5dlmlXnkRFZ9g8TSnsuzqtpwcDBheW18S6k3Oj/KVTRBEgtUb ZTxcvojbcs3xHg8o85EIqo6C7TvHA4eZrOLG1O9SfBnUwAURw+WtqcjIKeT3q+D47X42 RdAh41XGZgolcDg4ev3yb3WzDsnfsUB11iC7fmTljTCmEWcmzfu7aF8vSJmZ0p6Q/B1M o/qvQQAA6ZS8SBtxTC7Ni3kY5RDrt3Fim3D3iGV4PEsWRKKSIyTiYL27gunKFLaYiPUC AMsg== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20221208; t=1690796808; x=1691401608; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=4rXIM8Dmbucq8jQtDzcDP9vwhEVUIMbE0QaT2+4DNc4=; b=VsHeVO/3kyXodFgybR5zaY+UPtFV8eZxG+5YZLPTv6BVB1XFgN4kmEczkojusSUIWw ly03/SonGZsU86VEBQmlCj7+mx922ZxXg3sozwp/10/MVq8UJm6EM3BhqSPEOPErO2AT qWkj0WEb/fYjBThLqBKGX/u3Udfx98SGhr0YdaSTFFVOFPquYmgRcbiUJ8E4TKcq60YG El4c+ceW1qQD/e/zfzAUBtnN3I3ePLTGGRgg0TwmmrZOWCY0J5NV8WU92HOoS0MSHBAP GPF4UXgOb61WkFMCUwLjE2EjS/KV7Mbz6aEZ22NQmoQraxxlSGlOHmkTfmyA9MaFnm5U 4y6g== X-Gm-Message-State: ABy/qLabunypinXJicO0KAH/7WrwOU4hQ2yZPlFx944IKhYrOE4Vba3F dFksrW5ekKk6vdpdRYuefSRbdREv5HP109cirdk= X-Google-Smtp-Source: APBJJlHgTehi+Q+xi1vtwdEQPJBcl1MwlPNr9OJfrlRNwva6Z0veOvlS0iFWhCgp5mMSorXMpjsRhA== X-Received: by 2002:a1c:e915:0:b0:3fc:2e8:ea8b with SMTP id q21-20020a1ce915000000b003fc02e8ea8bmr7767293wmc.28.1690796807862; Mon, 31 Jul 2023 02:46:47 -0700 (PDT) Received: from pi4-davidp.pitowers.org ([2a00:1098:3142:14:2bce:64d6:1a5c:49a2]) by smtp.gmail.com with ESMTPSA id 9-20020a05600c240900b003fa98908014sm13612838wmp.8.2023.07.31.02.46.47 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Mon, 31 Jul 2023 02:46:47 -0700 (PDT) To: libcamera-devel@lists.libcamera.org Date: Mon, 31 Jul 2023 10:46:41 +0100 Message-Id: <20230731094641.73646-6-david.plowman@raspberrypi.com> X-Mailer: git-send-email 2.30.2 In-Reply-To: <20230731094641.73646-1-david.plowman@raspberrypi.com> References: <20230731094641.73646-1-david.plowman@raspberrypi.com> MIME-Version: 1.0 Subject: [libcamera-devel] [PATCH 5/5] ipa: rpi: agc: Use channel constraints in the AGC algorithm X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , X-Patchwork-Original-From: David Plowman via libcamera-devel From: David Plowman Reply-To: David Plowman Errors-To: 
Whenever we run Agc::process(), we store the most recent total exposure
requested for each channel. With these values we can apply the channel
constraints after time-filtering the requested total exposure, but before
working out how much digital gain is needed.

Signed-off-by: David Plowman
Reviewed-by: Naushir Patuck
---
 src/ipa/rpi/controller/rpi/agc.cpp         | 21 ++++--
 src/ipa/rpi/controller/rpi/agc.h           |  1 +
 src/ipa/rpi/controller/rpi/agc_channel.cpp | 76 ++++++++++++++++++----
 src/ipa/rpi/controller/rpi/agc_channel.h   |  8 ++-
 src/ipa/rpi/vc4/data/imx477.json           |  3 +-
 5 files changed, 90 insertions(+), 19 deletions(-)

diff --git a/src/ipa/rpi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp
index 7e627bba..7077fbff 100644
--- a/src/ipa/rpi/controller/rpi/agc.cpp
+++ b/src/ipa/rpi/controller/rpi/agc.cpp
@@ -40,6 +40,7 @@ int Agc::read(const libcamera::YamlObject &params)
 	 */
 	if (!params.contains("channels")) {
 		LOG(RPiAgc, Debug) << "Single channel only";
+		channelResults_.resize(1, 0s);
 		channelData_.emplace_back();
 		return channelData_.back().channel.read(params, getHardwareConfig());
 	}
@@ -59,6 +60,8 @@ int Agc::read(const libcamera::YamlObject &params)
 		return -1;
 	}
 
+	channelResults_.resize(channelData_.size(), 0s);
+
 	return 0;
 }
 
@@ -234,15 +237,21 @@ static void getChannelIndex(Metadata *metadata, const char *message, unsigned int channelIndex)
 		LOG(RPiAgc, Debug) << message;
 }
 
-static void setChannelIndex(Metadata *metadata, const char *message, unsigned int channelIndex)
+static libcamera::utils::Duration
+setChannelIndex(Metadata *metadata, const char *message, unsigned int channelIndex)
 {
 	std::unique_lock<Metadata> lock(*metadata);
 	AgcStatus *status = metadata->getLocked<AgcStatus>("agc.status");
-	if (status)
+	libcamera::utils::Duration dur = 0s;
+
+	if (status) {
 		status->channel = channelIndex;
-	else
+		dur = status->totalExposureValue;
+	} else
 		/* This does happen at startup, otherwise it would be a Warning or Error. */
 		LOG(RPiAgc, Debug) << message;
+
+	return dur;
 }
 
 void Agc::prepare(Metadata *imageMetadata)
@@ -306,8 +315,10 @@ void Agc::process(StatisticsPtr &stats, Metadata *imageMetadata)
 		/* Can also happen when new channels start. */
 		LOG(RPiAgc, Debug) << "process: channel " << channelIndex << " not seen yet";
 
-	channelData.channel.process(stats, &deviceStatus, imageMetadata);
-	setChannelIndex(imageMetadata, "process: no AGC status found", channelIndex);
+	channelData.channel.process(stats, &deviceStatus, imageMetadata, channelResults_);
+	auto dur = setChannelIndex(imageMetadata, "process: no AGC status found", channelIndex);
+	if (dur)
+		channelResults_[channelIndex] = dur;
 
 	/* And onto the next channel for the next call. */
 	index_ = (index_ + 1) % activeChannels_.size();
diff --git a/src/ipa/rpi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h
index 2eed2bab..acd6dc2f 100644
--- a/src/ipa/rpi/controller/rpi/agc.h
+++ b/src/ipa/rpi/controller/rpi/agc.h
@@ -54,6 +54,7 @@ private:
 	std::vector<AgcChannelData> channelData_;
 	std::vector<unsigned int> activeChannels_;
 	unsigned int index_; /* index into the activeChannels_ */
+	AgcChannelResults channelResults_;
 };
 
 } /* namespace RPiController */
diff --git a/src/ipa/rpi/controller/rpi/agc_channel.cpp b/src/ipa/rpi/controller/rpi/agc_channel.cpp
index ed8a3b30..4e60802c 100644
--- a/src/ipa/rpi/controller/rpi/agc_channel.cpp
+++ b/src/ipa/rpi/controller/rpi/agc_channel.cpp
@@ -175,6 +175,11 @@ readConstraintModes(std::map<std::string, AgcConstraintMode> &constraintModes,
 
 int AgcChannelConstraint::read(const libcamera::YamlObject &params)
 {
+	auto channelValue = params["channel"].get<unsigned int>();
+	if (!channelValue)
+		return -EINVAL;
+	channel = *channelValue;
+
 	std::string boundString = params["bound"].get<std::string>("");
 	transform(boundString.begin(), boundString.end(),
 		  boundString.begin(), ::toupper);
@@ -184,15 +189,15 @@ int AgcChannelConstraint::read(const libcamera::YamlObject &params)
 	}
 	bound = boundString == "UPPER" ? Bound::UPPER : Bound::LOWER;
 
-	auto value = params["factor"].get<double>();
-	if (!value)
+	auto factorValue = params["factor"].get<double>();
+	if (!factorValue)
 		return -EINVAL;
-	factor = *value;
+	factor = *factorValue;
 
 	return 0;
 }
 
-int readChannelConstraints(std::vector<AgcChannelConstraint> channelConstraints,
+int readChannelConstraints(std::vector<AgcChannelConstraint> &channelConstraints,
 			   const libcamera::YamlObject &params)
 {
 	int ret = 0;
@@ -231,6 +236,7 @@ int AgcConfig::read(const libcamera::YamlObject &params)
 		ret = readChannelConstraints(channelConstraints, params["channel_constraints"]);
 		if (ret)
 			return ret;
+		LOG(RPiAgc, Info) << "Read " << channelConstraints.size() << " channel constraints";
 	}
 
 	ret = yTarget.read(params["y_target"]);
@@ -489,7 +495,8 @@ void AgcChannel::prepare(Metadata *imageMetadata)
 	}
 }
 
-void AgcChannel::process(StatisticsPtr &stats, const DeviceStatus *deviceStatus, Metadata *imageMetadata)
+void AgcChannel::process(StatisticsPtr &stats, const DeviceStatus *deviceStatus,
+			 Metadata *imageMetadata, const AgcChannelResults &channelResults)
 {
 	frameCount_++;
 	/*
@@ -508,12 +515,17 @@ void AgcChannel::process(StatisticsPtr &stats, const DeviceStatus *deviceStatus,
 	computeTargetExposure(gain);
 	/* The results have to be filtered so as not to change too rapidly. */
 	filterExposure();
+	/*
+	 * We may be asked to limit the exposure using other channels. If another channel
+	 * determines our upper bound we may want to know this later.
+	 */
+	bool channelBound = applyChannelConstraints(channelResults);
 	/*
 	 * Some of the exposure has to be applied as digital gain, so work out
-	 * what that is. This function also tells us whether it's decided to
-	 * "desaturate" the image more quickly.
+	 * what that is. It also tells us whether it's trying to desaturate the image
+	 * more quickly, which can only happen when another channel is not limiting us.
 	 */
-	bool desaturate = applyDigitalGain(gain, targetY);
+	bool desaturate = applyDigitalGain(gain, targetY, channelBound);
 	/*
 	 * The last thing is to divide up the exposure value into a shutter time
 	 * and analogue gain, according to the current exposure mode.
@@ -792,7 +804,49 @@ void AgcChannel::computeTargetExposure(double gain)
 	LOG(RPiAgc, Debug) << "Target totalExposure " << target_.totalExposure;
 }
 
-bool AgcChannel::applyDigitalGain(double gain, double targetY)
+bool AgcChannel::applyChannelConstraints(const AgcChannelResults &channelResults)
+{
+	bool channelBound = false;
+	LOG(RPiAgc, Debug)
+		<< "Total exposure before channel constraints " << filtered_.totalExposure;
+
+	for (const auto &constraint : config_.channelConstraints) {
+		LOG(RPiAgc, Debug)
+			<< "Check constraint: channel " << constraint.channel << " bound "
+			<< (constraint.bound == AgcChannelConstraint::Bound::UPPER ? "UPPER" : "LOWER")
+			<< " factor " << constraint.factor;
+		if (constraint.channel >= channelResults.size() ||
+		    !channelResults[constraint.channel]) {
+			LOG(RPiAgc, Debug) << "no such channel - skipped";
+			continue;
+		}
+
+		if (channelResults[constraint.channel] == 0s) {
+			LOG(RPiAgc, Debug) << "zero exposure - skipped";
+			continue;
+		}
+
+		libcamera::utils::Duration limitExposure =
+			channelResults[constraint.channel] * constraint.factor;
+		LOG(RPiAgc, Debug) << "Limit exposure " << limitExposure;
+		if ((constraint.bound == AgcChannelConstraint::Bound::UPPER &&
+		     filtered_.totalExposure > limitExposure) ||
+		    (constraint.bound == AgcChannelConstraint::Bound::LOWER &&
+		     filtered_.totalExposure < limitExposure)) {
+			filtered_.totalExposure = limitExposure;
+			LOG(RPiAgc, Debug) << "Applies";
+			channelBound = true;
+		} else
+			LOG(RPiAgc, Debug) << "Does not apply";
+	}
+
+	LOG(RPiAgc, Debug)
+		<< "Total exposure after channel constraints " << filtered_.totalExposure;
+
+	return channelBound;
+}
+
+bool AgcChannel::applyDigitalGain(double gain, double targetY, bool channelBound)
 {
 	double minColourGain = std::min({ awb_.gainR, awb_.gainG, awb_.gainB, 1.0 });
 	ASSERT(minColourGain != 0.0);
@@ -812,8 +866,8 @@ bool AgcChannel::applyDigitalGain(double gain, double targetY)
 	 * quickly (and we then approach the correct value more quickly from
 	 * below).
 	 */
-	bool desaturate = targetY > config_.fastReduceThreshold &&
-			  gain < sqrt(targetY);
+	bool desaturate = !channelBound &&
+			  targetY > config_.fastReduceThreshold && gain < sqrt(targetY);
 	if (desaturate)
 		dg /= config_.fastReduceThreshold;
 	LOG(RPiAgc, Debug) << "Digital gain " << dg << " desaturate? " << desaturate;
" << desaturate; diff --git a/src/ipa/rpi/controller/rpi/agc_channel.h b/src/ipa/rpi/controller/rpi/agc_channel.h index 446125ef..7ce34360 100644 --- a/src/ipa/rpi/controller/rpi/agc_channel.h +++ b/src/ipa/rpi/controller/rpi/agc_channel.h @@ -19,6 +19,8 @@ namespace RPiController { +using AgcChannelResults = std::vector; + struct AgcMeteringMode { std::vector weights; int read(const libcamera::YamlObject ¶ms); @@ -93,7 +95,8 @@ public: void disableAuto(); void switchMode(CameraMode const &cameraMode, Metadata *metadata); void prepare(Metadata *imageMetadata); - void process(StatisticsPtr &stats, const DeviceStatus *deviceStatus, Metadata *imageMetadata); + void process(StatisticsPtr &stats, const DeviceStatus *deviceStatus, Metadata *imageMetadata, + const AgcChannelResults &channelResults); private: bool updateLockStatus(DeviceStatus const &deviceStatus); @@ -105,7 +108,8 @@ private: double &gain, double &targetY); void computeTargetExposure(double gain); void filterExposure(); - bool applyDigitalGain(double gain, double targetY); + bool applyChannelConstraints(const AgcChannelResults &channelResults); + bool applyDigitalGain(double gain, double targetY, bool channelBound); void divideUpExposure(); void writeAndFinish(Metadata *imageMetadata, bool desaturate); libcamera::utils::Duration limitShutter(libcamera::utils::Duration shutter); diff --git a/src/ipa/rpi/vc4/data/imx477.json b/src/ipa/rpi/vc4/data/imx477.json index daffc268..bbe6da0d 100644 --- a/src/ipa/rpi/vc4/data/imx477.json +++ b/src/ipa/rpi/vc4/data/imx477.json @@ -515,4 +515,5 @@ "rpi.sharpen": { } } ] -} \ No newline at end of file +} +