{"id":18556,"url":"https://patchwork.libcamera.org/api/1.1/patches/18556/?format=json","web_url":"https://patchwork.libcamera.org/patch/18556/","project":{"id":1,"url":"https://patchwork.libcamera.org/api/1.1/projects/1/?format=json","name":"libcamera","link_name":"libcamera","list_id":"libcamera_core","list_email":"libcamera-devel@lists.libcamera.org","web_url":"","scm_url":"","webscm_url":""},"msgid":"<20230426131057.21550-9-naush@raspberrypi.com>","date":"2023-04-26T13:10:52","name":"[libcamera-devel,08/13] ipa: raspberrypi: Introduce IpaBase class","commit_ref":null,"pull_url":null,"state":"superseded","archived":false,"hash":"769b2e84af1314649ca6a5a4d6f7f3b0a48a44a1","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/1.1/people/34/?format=json","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"delegate":null,"mbox":"https://patchwork.libcamera.org/patch/18556/mbox/","series":[{"id":3847,"url":"https://patchwork.libcamera.org/api/1.1/series/3847/?format=json","web_url":"https://patchwork.libcamera.org/project/libcamera/list/?series=3847","date":"2023-04-26T13:10:44","name":"Raspberry Pi: Code refactoring","version":1,"mbox":"https://patchwork.libcamera.org/series/3847/mbox/"}],"comments":"https://patchwork.libcamera.org/api/patches/18556/comments/","check":"pending","checks":"https://patchwork.libcamera.org/api/patches/18556/checks/","tags":{},"headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id D9262BDCBD\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 26 Apr 2023 13:13:22 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 9AC3B627FC;\n\tWed, 26 Apr 2023 15:13:22 +0200 (CEST)","from 
mail-wr1-x42c.google.com (mail-wr1-x42c.google.com\n\t[IPv6:2a00:1450:4864:20::42c])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 59245627F4\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 26 Apr 2023 15:13:19 +0200 (CEST)","by mail-wr1-x42c.google.com with SMTP id\n\tffacd0b85a97d-304935cc79bso1664031f8f.2\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 26 Apr 2023 06:13:19 -0700 (PDT)","from localhost.localdomain ([93.93.133.154])\n\tby smtp.gmail.com with ESMTPSA id\n\tk5-20020adff5c5000000b002f103ca90cdsm15780949wrp.101.2023.04.26.06.13.16\n\t(version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256);\n\tWed, 26 Apr 2023 06:13:17 -0700 (PDT)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1682514802;\n\tbh=uHaMjaQtRWAvzDDt+50Z4XiKaoeIIRXN6Pyyigo56eU=;\n\th=To:Date:In-Reply-To:References:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:\n\tFrom;\n\tb=kUNlcXMvVgdsK6z+xsj2vZwQHLhAa0aUH79bainNGZTYSi0jiqOWILFPt+D5pkFxi\n\tB4AAbhvDvfRcOM1uRmGSmZTUbpyTr9hCoXKpbK6/pQqf6uLg54h1YJrZ4XzjVGRjPe\n\tpiH15+tTdsB8iLAC0NX9eZvUeC6RPqOaV6UNAdGQ4IncY1q59nIZIG4+naO+5W5jjy\n\tdemPVAzaeal1NtFgebya5PKTIRloxo0M807vGFt5e0gJ7v4hw4bBVBXy7wL1IbRWj7\n\tXEaOrDqCbripIStXTKsQt2VlfATKmMSEPeqhdblWvvi6aGuq95SA/8ZDsQC9jAubl1\n\t7yA6aaob3+b1g==","v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google; t=1682514798; 
x=1685106798;\n\th=content-transfer-encoding:mime-version:references:in-reply-to\n\t:message-id:date:subject:cc:to:from:from:to:cc:subject:date\n\t:message-id:reply-to;\n\tbh=UN7NzksCAs6yD/SeL953NjwaD52/vVK3ZRb29175FX4=;\n\tb=rQ5OCFBf8kd7NdnrrJRnl9POLafVuZCKNI17RtiM64tU7UgHXrYLwxTV34xeD+6NEo\n\tWwCobjDTm7v5eNVrYmTale8W7cGVmqYMyDFhUHfGmoMGsv8VNe6LHJ30NbRCbDKp1Og2\n\tdjh2wuJYzhVLNwfGEiqSNDyeYxIvvEkGaZIJx+JNcXFWHU4+EVwE9uVNrjwcYjuLkOUB\n\tvDEmj8ACaTUv2oZboyC8vLQK4KigkB/5wwP62gjNj9AuaNSP35oLzQf3WtHXTJgD4zbG\n\tbFxWuJrYuktyf1hn3fDeQXD/JSaADIFDGQTFBdyMp3EF01Hi2ufIifoCPSvS9ehsssuq\n\teEkA=="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key; \n\tunprotected) header.d=raspberrypi.com\n\theader.i=@raspberrypi.com\n\theader.b=\"rQ5OCFBf\"; dkim-atps=neutral","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20221208; t=1682514798; x=1685106798;\n\th=content-transfer-encoding:mime-version:references:in-reply-to\n\t:message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc\n\t:subject:date:message-id:reply-to;\n\tbh=UN7NzksCAs6yD/SeL953NjwaD52/vVK3ZRb29175FX4=;\n\tb=WEkDozkSNUm8eXSuu4oOMUW0tL6AVI+VkOMYtmM1IzLbAU2/J806cr9UbbHacEZ4U7\n\tHh7nsHCHJmfQGTCIkWeqRfnMGF5v8C0brjrWuQM6WE1KcrRTqEmGBtihr2NAesHmTVNw\n\thRGHMvkRwNxDofNm6cwLPUjPuzJxZynuKiOb1dtMOZwZVdVrJP6Sw+7fvI4LM1uICtmo\n\t2wfy0Cm6+YR09ioqeMQKGOBq0dUtMaFHA5svxmJVstncbJrgAuMizm1aouAvehAYic80\n\tHXV7i3FqX0kPZXO7G7Sq81DqQJeiY7ZSMhguvllDRWIEb016ke9No5KMa/TuPXg3sjU3\n\t8aGg==","X-Gm-Message-State":"AAQBX9eohBuDVH9qnUtoklhDttRz5SkWAS0f0Cgw5uffTUmz6x91oqbf\n\tvXlYPlIB37fjDi42ub40GWCocZn72C4x8tGgVBuUXw==","X-Google-Smtp-Source":"AKy350an6nUsOcqcOwi+L8td0qf4iHzRO50BSdfH7ofd1Dx0OqgPHx/d/+WlnRcau0fp34WTnhFGAg==","X-Received":"by 2002:adf:d08a:0:b0:2f8:3225:2bc1 with SMTP id\n\ty10-20020adfd08a000000b002f832252bc1mr15325905wrh.41.1682514797555; \n\tWed, 26 Apr 2023 06:13:17 -0700 (PDT)","To":"libcamera-devel@lists.libcamera.org","Date":"Wed, 26 Apr 2023 14:10:52 
+0100","Message-Id":"<20230426131057.21550-9-naush@raspberrypi.com>","X-Mailer":"git-send-email 2.34.1","In-Reply-To":"<20230426131057.21550-1-naush@raspberrypi.com>","References":"<20230426131057.21550-1-naush@raspberrypi.com>","MIME-Version":"1.0","Content-Transfer-Encoding":"8bit","Subject":"[libcamera-devel] [PATCH 08/13] ipa: raspberrypi: Introduce IpaBase\n\tclass","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Naushir Patuck via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Naushir Patuck <naush@raspberrypi.com>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"},"content":"Create a new IpaBase class that handles general purpose housekeeping\nduties for the Raspberry Pi IPA. The code implementing the new class is\nessentially pulled from the existing ipa/rpi/vc4/raspberrypi.cpp\nfile with a minimal amount of refactoring.\n\nCreate a derived IpaVc4 class from IpaBase that handles the VC4 pipeline\nspecific tasks of the IPA. 
Again, code for this class implementation is\ntaken from the existing ipa/rpi/vc4/raspberrypi.cpp with a\nminimal amount of refactoring.\n\nThe goal of this change is to allow third parties to implement their own\nIPA running on the Raspberry Pi without duplicating all of the IPA\nhousekeeping tasks.\n\nSigned-off-by: Naushir Patuck <naush@raspberrypi.com>\n---\n src/ipa/meson.build                           |    5 +-\n .../raspberrypi.cpp => common/ipa_base.cpp}   | 1219 +++++------------\n src/ipa/rpi/common/ipa_base.h                 |  125 ++\n src/ipa/rpi/common/meson.build                |    7 +\n src/ipa/rpi/vc4/meson.build                   |    8 +-\n src/ipa/rpi/vc4/vc4.cpp                       |  540 ++++++++\n 6 files changed, 1012 insertions(+), 892 deletions(-)\n rename src/ipa/rpi/{vc4/raspberrypi.cpp => common/ipa_base.cpp} (65%)\n create mode 100644 src/ipa/rpi/common/ipa_base.h\n create mode 100644 src/ipa/rpi/common/meson.build\n create mode 100644 src/ipa/rpi/vc4/vc4.cpp","diff":"diff --git a/src/ipa/meson.build b/src/ipa/meson.build\nindex 10d3b44ca7b6..0c622c38a4a0 100644\n--- a/src/ipa/meson.build\n+++ b/src/ipa/meson.build\n@@ -37,13 +37,14 @@ endif\n \n enabled_ipa_modules = []\n \n-# If the Raspberry Pi VC4 IPA is enabled, ensure we include the  rpi/cam_helper\n-# and rpi/controller subdirectories in the build.\n+# If the Raspberry Pi VC4 IPA is enabled, ensure we include the cam_helper,\n+# common and controller subdirectories in the build.\n #\n # This is done here and not within rpi/vc4/meson.build as meson does not\n # allow the subdir command to traverse up the directory tree.\n if pipelines.contains('rpi/vc4')\n     subdir('rpi/cam_helper')\n+    subdir('rpi/common')\n     subdir('rpi/controller')\n endif\n \ndiff --git a/src/ipa/rpi/vc4/raspberrypi.cpp b/src/ipa/rpi/common/ipa_base.cpp\nsimilarity index 65%\nrename from src/ipa/rpi/vc4/raspberrypi.cpp\nrename to src/ipa/rpi/common/ipa_base.cpp\nindex e3ca8e2f7cbe..becada5973ad 
100644\n--- a/src/ipa/rpi/vc4/raspberrypi.cpp\n+++ b/src/ipa/rpi/common/ipa_base.cpp\n@@ -1,60 +1,30 @@\n /* SPDX-License-Identifier: BSD-2-Clause */\n /*\n- * Copyright (C) 2019-2021, Raspberry Pi Ltd\n+ * Copyright (C) 2019-2023, Raspberry Pi Ltd\n  *\n- * rpi.cpp - Raspberry Pi Image Processing Algorithms\n+ * ipa_base.cpp - Raspberry Pi IPA base class\n  */\n \n-#include <algorithm>\n-#include <array>\n-#include <cstring>\n-#include <deque>\n-#include <fcntl.h>\n-#include <math.h>\n-#include <stdint.h>\n-#include <string.h>\n-#include <sys/mman.h>\n-#include <vector>\n+#include \"ipa_base.h\"\n \n-#include <linux/bcm2835-isp.h>\n+#include <cmath>\n \n #include <libcamera/base/log.h>\n-#include <libcamera/base/shared_fd.h>\n #include <libcamera/base/span.h>\n-\n #include <libcamera/control_ids.h>\n-#include <libcamera/controls.h>\n-#include <libcamera/framebuffer.h>\n-#include <libcamera/request.h>\n-\n-#include <libcamera/ipa/ipa_interface.h>\n-#include <libcamera/ipa/ipa_module_info.h>\n-#include <libcamera/ipa/raspberrypi_ipa_interface.h>\n \n-#include \"libcamera/internal/mapped_framebuffer.h\"\n-\n-#include \"cam_helper/cam_helper.h\"\n #include \"controller/af_algorithm.h\"\n #include \"controller/af_status.h\"\n #include \"controller/agc_algorithm.h\"\n-#include \"controller/agc_status.h\"\n-#include \"controller/alsc_status.h\"\n #include \"controller/awb_algorithm.h\"\n #include \"controller/awb_status.h\"\n #include \"controller/black_level_status.h\"\n #include \"controller/ccm_algorithm.h\"\n #include \"controller/ccm_status.h\"\n #include \"controller/contrast_algorithm.h\"\n-#include \"controller/contrast_status.h\"\n-#include \"controller/controller.h\"\n #include \"controller/denoise_algorithm.h\"\n-#include \"controller/denoise_status.h\"\n-#include \"controller/dpc_status.h\"\n-#include \"controller/geq_status.h\"\n #include \"controller/lux_status.h\"\n-#include \"controller/metadata.h\"\n #include \"controller/sharpen_algorithm.h\"\n-#include 
\"controller/sharpen_status.h\"\n #include \"controller/statistics.h\"\n \n namespace libcamera {\n@@ -62,8 +32,7 @@ namespace libcamera {\n using namespace std::literals::chrono_literals;\n using utils::Duration;\n \n-/* Number of metadata objects available in the context list. */\n-constexpr unsigned int numMetadataContexts = 16;\n+namespace {\n \n /* Number of frame length times to hold in the queue. */\n constexpr unsigned int FrameLengthsQueueSize = 10;\n@@ -80,7 +49,7 @@ constexpr Duration defaultMaxFrameDuration = 250.0s;\n  * we rate-limit the controller Prepare() and Process() calls to lower than or\n  * equal to this rate.\n  */\n-constexpr Duration controllerMinFrameDuration = 1.0s / 30.0;\n+static constexpr Duration controllerMinFrameDuration = 1.0s / 30.0;\n \n /* List of controls handled by the Raspberry Pi IPA */\n static const ControlInfoMap::Map ipaControls{\n@@ -116,118 +85,23 @@ static const ControlInfoMap::Map ipaAfControls{\n \t{ &controls::LensPosition, ControlInfo(0.0f, 32.0f, 1.0f) }\n };\n \n+} /* namespace */\n+\n LOG_DEFINE_CATEGORY(IPARPI)\n \n namespace ipa::RPi {\n \n-class IPARPi : public IPARPiInterface\n+IpaBase::IpaBase()\n+\t: frameCount_(0), mistrustCount_(0), lastRunTimestamp_(0), firstStart_(true),\n+\t  controller_()\n {\n-public:\n-\tIPARPi()\n-\t\t: controller_(), frameCount_(0), checkCount_(0), mistrustCount_(0),\n-\t\t  lastRunTimestamp_(0), lsTable_(nullptr), firstStart_(true),\n-\t\t  lastTimeout_(0s)\n-\t{\n-\t}\n-\n-\t~IPARPi()\n-\t{\n-\t\tif (lsTable_)\n-\t\t\tmunmap(lsTable_, MaxLsGridSize);\n-\t}\n-\n-\tint init(const IPASettings &settings, const InitParams &params, InitResult *result) override;\n-\tvoid start(const ControlList &controls, StartResult *result) override;\n-\tvoid stop() override {}\n-\n-\tint configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams &params,\n-\t\t      ConfigResult *result) override;\n-\tvoid mapBuffers(const std::vector<IPABuffer> &buffers) override;\n-\tvoid 
unmapBuffers(const std::vector<unsigned int> &ids) override;\n-\tvoid prepareIsp(const PrepareParams &params) override;\n-\tvoid processStats(const ProcessParams &params) override;\n-\n-private:\n-\tvoid setMode(const IPACameraSensorInfo &sensorInfo);\n-\tbool validateSensorControls();\n-\tbool validateIspControls();\n-\tbool validateLensControls();\n-\tvoid applyControls(const ControlList &controls);\n-\tvoid prepare(const PrepareParams &params);\n-\tvoid reportMetadata(unsigned int ipaContext, ControlList *controls) override;\n-\tvoid fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext);\n-\tRPiController::StatisticsPtr fillStatistics(bcm2835_isp_stats *stats) const;\n-\tvoid process(unsigned int bufferId, unsigned int ipaContext);\n-\tvoid setCameraTimeoutValue();\n-\tvoid applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration);\n-\tvoid applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls);\n-\tvoid applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls);\n-\tvoid applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls);\n-\tvoid applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls);\n-\tvoid applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls);\n-\tvoid applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls);\n-\tvoid applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls);\n-\tvoid applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls);\n-\tvoid applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls);\n-\tvoid applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls);\n-\tvoid applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls);\n-\tvoid applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls);\n-\tvoid resampleTable(uint16_t dest[], const std::vector<double> &src, int destW, int destH);\n-\n-\tstd::map<unsigned int, MappedFrameBuffer> 
buffers_;\n-\n-\tControlInfoMap sensorCtrls_;\n-\tControlInfoMap ispCtrls_;\n-\tControlInfoMap lensCtrls_;\n-\tbool lensPresent_;\n-\tControlList libcameraMetadata_;\n-\n-\t/* Camera sensor params. */\n-\tCameraMode mode_;\n-\n-\t/* Raspberry Pi controller specific defines. */\n-\tstd::unique_ptr<RPiController::CamHelper> helper_;\n-\tRPiController::Controller controller_;\n-\tstd::array<RPiController::Metadata, numMetadataContexts> rpiMetadata_;\n-\n-\t/*\n-\t * We count frames to decide if the frame must be hidden (e.g. from\n-\t * display) or mistrusted (i.e. not given to the control algos).\n-\t */\n-\tuint64_t frameCount_;\n-\n-\t/* For checking the sequencing of Prepare/Process calls. */\n-\tuint64_t checkCount_;\n-\n-\t/* How many frames we should avoid running control algos on. */\n-\tunsigned int mistrustCount_;\n-\n-\t/* Number of frames that need to be dropped on startup. */\n-\tunsigned int dropFrameCount_;\n-\n-\t/* Frame timestamp for the last run of the controller. */\n-\tuint64_t lastRunTimestamp_;\n-\n-\t/* Do we run a Controller::process() for this frame? */\n-\tbool processPending_;\n-\n-\t/* LS table allocation passed in from the pipeline handler. */\n-\tSharedFD lsTableHandle_;\n-\tvoid *lsTable_;\n-\n-\t/* Distinguish the first camera start from others. */\n-\tbool firstStart_;\n-\n-\t/* Frame duration (1/fps) limits. */\n-\tDuration minFrameDuration_;\n-\tDuration maxFrameDuration_;\n+}\n \n-\t/* Track the frame length times over FrameLengthsQueueSize frames. */\n-\tstd::deque<Duration> frameLengths_;\n-\tDuration lastTimeout_;\n-};\n+IpaBase::~IpaBase()\n+{\n+}\n \n-int IPARPi::init(const IPASettings &settings, const InitParams &params, InitResult *result)\n+int32_t IpaBase::init(const IPASettings &settings, const InitParams &params, InitResult *result)\n {\n \t/*\n \t * Load the \"helper\" for this sensor. 
This tells us all the device specific stuff\n@@ -264,14 +138,6 @@ int IPARPi::init(const IPASettings &settings, const InitParams &params, InitResu\n \t\treturn ret;\n \t}\n \n-\tconst std::string &target = controller_.getTarget();\n-\tif (target != \"bcm2835\") {\n-\t\tLOG(IPARPI, Error)\n-\t\t\t<< \"Tuning data file target returned \\\"\" << target << \"\\\"\"\n-\t\t\t<< \", expected \\\"bcm2835\\\"\";\n-\t\treturn -EINVAL;\n-\t}\n-\n \tlensPresent_ = params.lensPresent;\n \n \tcontroller_.initialise();\n@@ -282,10 +148,88 @@ int IPARPi::init(const IPASettings &settings, const InitParams &params, InitResu\n \t\tctrlMap.merge(ControlInfoMap::Map(ipaAfControls));\n \tresult->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls);\n \n-\treturn 0;\n+\treturn platformInit(params, result);\n }\n \n-void IPARPi::start(const ControlList &controls, StartResult *result)\n+int32_t IpaBase::configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams &params,\n+\t\t\t   ConfigResult *result)\n+{\n+\tsensorCtrls_ = params.sensorControls;\n+\n+\tif (!validateSensorControls()) {\n+\t\tLOG(IPARPI, Error) << \"Sensor control validation failed.\";\n+\t\treturn -1;\n+\t}\n+\n+\tif (lensPresent_) {\n+\t\tlensCtrls_ = params.lensControls;\n+\t\tif (!validateLensControls()) {\n+\t\t\tLOG(IPARPI, Warning) << \"Lens validation failed, \"\n+\t\t\t\t\t     << \"no lens control will be available.\";\n+\t\t\tlensPresent_ = false;\n+\t\t}\n+\t}\n+\n+\t/* Setup a metadata ControlList to output metadata. */\n+\tlibcameraMetadata_ = ControlList(controls::controls);\n+\n+\t/* Re-assemble camera mode using the sensor info. */\n+\tsetMode(sensorInfo);\n+\n+\tmode_.transform = static_cast<libcamera::Transform>(params.transform);\n+\n+\t/* Pass the camera mode to the CamHelper to setup algorithms. 
*/\n+\thelper_->setCameraMode(mode_);\n+\n+\t/*\n+\t * Initialise this ControlList correctly, even if empty, in case the IPA is\n+\t * running in isolation mode (passing the ControlList through the IPC layer).\n+\t */\n+\tControlList ctrls(sensorCtrls_);\n+\n+\t/* The pipeline handler passes out the mode's sensitivity. */\n+\tresult->modeSensitivity = mode_.sensitivity;\n+\n+\tif (firstStart_) {\n+\t\t/* Supply initial values for frame durations. */\n+\t\tapplyFrameDurations(defaultMinFrameDuration, defaultMaxFrameDuration);\n+\n+\t\t/* Supply initial values for gain and exposure. */\n+\t\tAgcStatus agcStatus;\n+\t\tagcStatus.shutterTime = defaultExposureTime;\n+\t\tagcStatus.analogueGain = defaultAnalogueGain;\n+\t\tapplyAGC(&agcStatus, ctrls);\n+\t}\n+\n+\tresult->controls = std::move(ctrls);\n+\n+\t/*\n+\t * Apply the correct limits to the exposure, gain and frame duration controls\n+\t * based on the current sensor mode.\n+\t */\n+\tControlInfoMap::Map ctrlMap = ipaControls;\n+\tctrlMap[&controls::FrameDurationLimits] =\n+\t\tControlInfo(static_cast<int64_t>(mode_.minFrameDuration.get<std::micro>()),\n+\t\t\t    static_cast<int64_t>(mode_.maxFrameDuration.get<std::micro>()));\n+\n+\tctrlMap[&controls::AnalogueGain] =\n+\t\tControlInfo(static_cast<float>(mode_.minAnalogueGain),\n+\t\t\t    static_cast<float>(mode_.maxAnalogueGain));\n+\n+\tctrlMap[&controls::ExposureTime] =\n+\t\tControlInfo(static_cast<int32_t>(mode_.minShutter.get<std::micro>()),\n+\t\t\t    static_cast<int32_t>(mode_.maxShutter.get<std::micro>()));\n+\n+\t/* Declare Autofocus controls, only if we have a controllable lens */\n+\tif (lensPresent_)\n+\t\tctrlMap.merge(ControlInfoMap::Map(ipaAfControls));\n+\n+\tresult->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls);\n+\n+\treturn platformConfigure(params, result);\n+}\n+\n+void IpaBase::start(const ControlList &controls, StartResult *result)\n {\n \tRPiController::Metadata metadata;\n \n@@ -320,7 +264,6 @@ void 
IPARPi::start(const ControlList &controls, StartResult *result)\n \t * or merely a mode switch in a running system.\n \t */\n \tframeCount_ = 0;\n-\tcheckCount_ = 0;\n \tif (firstStart_) {\n \t\tdropFrameCount_ = helper_->hideFramesStartup();\n \t\tmistrustCount_ = helper_->mistrustFramesStartup();\n@@ -363,229 +306,141 @@ void IPARPi::start(const ControlList &controls, StartResult *result)\n \tlastRunTimestamp_ = 0;\n }\n \n-void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo)\n+void IpaBase::mapBuffers(const std::vector<IPABuffer> &buffers)\n {\n-\tmode_.bitdepth = sensorInfo.bitsPerPixel;\n-\tmode_.width = sensorInfo.outputSize.width;\n-\tmode_.height = sensorInfo.outputSize.height;\n-\tmode_.sensorWidth = sensorInfo.activeAreaSize.width;\n-\tmode_.sensorHeight = sensorInfo.activeAreaSize.height;\n-\tmode_.cropX = sensorInfo.analogCrop.x;\n-\tmode_.cropY = sensorInfo.analogCrop.y;\n-\tmode_.pixelRate = sensorInfo.pixelRate;\n-\n-\t/*\n-\t * Calculate scaling parameters. The scale_[xy] factors are determined\n-\t * by the ratio between the crop rectangle size and the output size.\n-\t */\n-\tmode_.scaleX = sensorInfo.analogCrop.width / sensorInfo.outputSize.width;\n-\tmode_.scaleY = sensorInfo.analogCrop.height / sensorInfo.outputSize.height;\n-\n-\t/*\n-\t * We're not told by the pipeline handler how scaling is split between\n-\t * binning and digital scaling. For now, as a heuristic, assume that\n-\t * downscaling up to 2 is achieved through binning, and that any\n-\t * additional scaling is achieved through digital scaling.\n-\t *\n-\t * \\todo Get the pipeline handle to provide the full data\n-\t */\n-\tmode_.binX = std::min(2, static_cast<int>(mode_.scaleX));\n-\tmode_.binY = std::min(2, static_cast<int>(mode_.scaleY));\n-\n-\t/* The noise factor is the square root of the total binning factor. 
*/\n-\tmode_.noiseFactor = sqrt(mode_.binX * mode_.binY);\n-\n-\t/*\n-\t * Calculate the line length as the ratio between the line length in\n-\t * pixels and the pixel rate.\n-\t */\n-\tmode_.minLineLength = sensorInfo.minLineLength * (1.0s / sensorInfo.pixelRate);\n-\tmode_.maxLineLength = sensorInfo.maxLineLength * (1.0s / sensorInfo.pixelRate);\n-\n-\t/*\n-\t * Set the frame length limits for the mode to ensure exposure and\n-\t * framerate calculations are clipped appropriately.\n-\t */\n-\tmode_.minFrameLength = sensorInfo.minFrameLength;\n-\tmode_.maxFrameLength = sensorInfo.maxFrameLength;\n-\n-\t/* Store these for convenience. */\n-\tmode_.minFrameDuration = mode_.minFrameLength * mode_.minLineLength;\n-\tmode_.maxFrameDuration = mode_.maxFrameLength * mode_.maxLineLength;\n-\n-\t/*\n-\t * Some sensors may have different sensitivities in different modes;\n-\t * the CamHelper will know the correct value.\n-\t */\n-\tmode_.sensitivity = helper_->getModeSensitivity(mode_);\n-\n-\tconst ControlInfo &gainCtrl = sensorCtrls_.at(V4L2_CID_ANALOGUE_GAIN);\n-\tconst ControlInfo &shutterCtrl = sensorCtrls_.at(V4L2_CID_EXPOSURE);\n-\n-\tmode_.minAnalogueGain = helper_->gain(gainCtrl.min().get<int32_t>());\n-\tmode_.maxAnalogueGain = helper_->gain(gainCtrl.max().get<int32_t>());\n-\n-\t/* Shutter speed is calculated based on the limits of the frame durations. 
*/\n-\tmode_.minShutter = helper_->exposure(shutterCtrl.min().get<int32_t>(), mode_.minLineLength);\n-\tmode_.maxShutter = Duration::max();\n-\thelper_->getBlanking(mode_.maxShutter,\n-\t\t\t     mode_.minFrameDuration, mode_.maxFrameDuration);\n+\tfor (const IPABuffer &buffer : buffers) {\n+\t\tconst FrameBuffer fb(buffer.planes);\n+\t\tbuffers_.emplace(buffer.id,\n+\t\t\t\t MappedFrameBuffer(&fb, MappedFrameBuffer::MapFlag::ReadWrite));\n+\t}\n }\n \n-int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams &params,\n-\t\t      ConfigResult *result)\n+void IpaBase::unmapBuffers(const std::vector<unsigned int> &ids)\n {\n-\tsensorCtrls_ = params.sensorControls;\n-\tispCtrls_ = params.ispControls;\n-\n-\tif (!validateSensorControls()) {\n-\t\tLOG(IPARPI, Error) << \"Sensor control validation failed.\";\n-\t\treturn -1;\n-\t}\n-\n-\tif (!validateIspControls()) {\n-\t\tLOG(IPARPI, Error) << \"ISP control validation failed.\";\n-\t\treturn -1;\n-\t}\n+\tfor (unsigned int id : ids) {\n+\t\tauto it = buffers_.find(id);\n+\t\tif (it == buffers_.end())\n+\t\t\tcontinue;\n \n-\tif (lensPresent_) {\n-\t\tlensCtrls_ = params.lensControls;\n-\t\tif (!validateLensControls()) {\n-\t\t\tLOG(IPARPI, Warning) << \"Lens validation failed, \"\n-\t\t\t\t\t     << \"no lens control will be available.\";\n-\t\t\tlensPresent_ = false;\n-\t\t}\n+\t\tbuffers_.erase(id);\n \t}\n+}\n \n-\t/* Setup a metadata ControlList to output metadata. */\n-\tlibcameraMetadata_ = ControlList(controls::controls);\n-\n-\t/* Re-assemble camera mode using the sensor info. */\n-\tsetMode(sensorInfo);\n-\n-\tmode_.transform = static_cast<libcamera::Transform>(params.transform);\n+void IpaBase::prepareIsp(const PrepareParams &params)\n+{\n+\tapplyControls(params.requestControls);\n \n-\t/* Store the lens shading table pointer and handle if available. */\n-\tif (params.lsTableHandle.isValid()) {\n-\t\t/* Remove any previous table, if there was one. 
*/\n-\t\tif (lsTable_) {\n-\t\t\tmunmap(lsTable_, MaxLsGridSize);\n-\t\t\tlsTable_ = nullptr;\n-\t\t}\n+\t/*\n+\t * At start-up, or after a mode-switch, we may want to\n+\t * avoid running the control algos for a few frames in case\n+\t * they are \"unreliable\".\n+\t */\n+\tint64_t frameTimestamp = params.sensorControls.get(controls::SensorTimestamp).value_or(0);\n+\tunsigned int ipaContext = params.ipaContext % rpiMetadata_.size();\n+\tRPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext];\n+\tSpan<uint8_t> embeddedBuffer;\n \n-\t\t/* Map the LS table buffer into user space. */\n-\t\tlsTableHandle_ = std::move(params.lsTableHandle);\n-\t\tif (lsTableHandle_.isValid()) {\n-\t\t\tlsTable_ = mmap(nullptr, MaxLsGridSize, PROT_READ | PROT_WRITE,\n-\t\t\t\t\tMAP_SHARED, lsTableHandle_.get(), 0);\n+\trpiMetadata.clear();\n+\tfillDeviceStatus(params.sensorControls, ipaContext);\n \n-\t\t\tif (lsTable_ == MAP_FAILED) {\n-\t\t\t\tLOG(IPARPI, Error) << \"dmaHeap mmap failure for LS table.\";\n-\t\t\t\tlsTable_ = nullptr;\n-\t\t\t}\n-\t\t}\n+\tif (params.buffers.embedded) {\n+\t\t/*\n+\t\t * Pipeline handler has supplied us with an embedded data buffer,\n+\t\t * we must pass it to the CamHelper for parsing.\n+\t\t */\n+\t\tauto it = buffers_.find(params.buffers.embedded);\n+\t\tASSERT(it != buffers_.end());\n+\t\tembeddedBuffer = it->second.planes()[0];\n \t}\n \n-\t/* Pass the camera mode to the CamHelper to setup algorithms. */\n-\thelper_->setCameraMode(mode_);\n-\n \t/*\n-\t * Initialise this ControlList correctly, even if empty, in case the IPA is\n-\t * running is isolation mode (passing the ControlList through the IPC layer).\n+\t * AGC wants to know the algorithm status from the time it actioned the\n+\t * sensor exposure/gain changes. 
So fetch it from the metadata list\n+\t * indexed by the IPA cookie returned, and put it in the current frame\n+\t * metadata.\n \t */\n-\tControlList ctrls(sensorCtrls_);\n-\n-\t/* The pipeline handler passes out the mode's sensitivity. */\n-\tresult->modeSensitivity = mode_.sensitivity;\n+\tAgcStatus agcStatus;\n+\tRPiController::Metadata &delayedMetadata = rpiMetadata_[params.delayContext];\n+\tif (!delayedMetadata.get<AgcStatus>(\"agc.status\", agcStatus))\n+\t\trpiMetadata.set(\"agc.delayed_status\", agcStatus);\n \n-\tif (firstStart_) {\n-\t\t/* Supply initial values for frame durations. */\n-\t\tapplyFrameDurations(defaultMinFrameDuration, defaultMaxFrameDuration);\n+\t/*\n+\t * This may overwrite the DeviceStatus using values from the sensor\n+\t * metadata, and may also do additional custom processing.\n+\t */\n+\thelper_->prepare(embeddedBuffer, rpiMetadata);\n \n-\t\t/* Supply initial values for gain and exposure. */\n-\t\tAgcStatus agcStatus;\n-\t\tagcStatus.shutterTime = defaultExposureTime;\n-\t\tagcStatus.analogueGain = defaultAnalogueGain;\n-\t\tapplyAGC(&agcStatus, ctrls);\n+\t/* Allow a 10% margin on the comparison below. */\n+\tDuration delta = (frameTimestamp - lastRunTimestamp_) * 1.0ns;\n+\tif (lastRunTimestamp_ && frameCount_ > dropFrameCount_ &&\n+\t    delta < controllerMinFrameDuration * 0.9) {\n+\t\t/*\n+\t\t * Ensure we merge the previous frame's metadata with the current\n+\t\t * frame. This will not overwrite exposure/gain values for the\n+\t\t * current frame, or any other bits of metadata that were added\n+\t\t * in helper_->Prepare().\n+\t\t */\n+\t\tRPiController::Metadata &lastMetadata =\n+\t\t\trpiMetadata_[(ipaContext ? 
ipaContext : rpiMetadata_.size()) - 1];\n+\t\trpiMetadata.mergeCopy(lastMetadata);\n+\t\tprocessPending_ = false;\n+\t} else {\n+\t\tprocessPending_ = true;\n+\t\tlastRunTimestamp_ = frameTimestamp;\n \t}\n \n-\tresult->controls = std::move(ctrls);\n-\n \t/*\n-\t * Apply the correct limits to the exposure, gain and frame duration controls\n-\t * based on the current sensor mode.\n+\t * If a statistics buffer has been passed in, call processStats\n+\t * directly now before prepare() since the statistics are available in-line\n+\t * with the Bayer frame.\n \t */\n-\tControlInfoMap::Map ctrlMap = ipaControls;\n-\tctrlMap[&controls::FrameDurationLimits] =\n-\t\tControlInfo(static_cast<int64_t>(mode_.minFrameDuration.get<std::micro>()),\n-\t\t\t    static_cast<int64_t>(mode_.maxFrameDuration.get<std::micro>()));\n-\n-\tctrlMap[&controls::AnalogueGain] =\n-\t\tControlInfo(static_cast<float>(mode_.minAnalogueGain),\n-\t\t\t    static_cast<float>(mode_.maxAnalogueGain));\n-\n-\tctrlMap[&controls::ExposureTime] =\n-\t\tControlInfo(static_cast<int32_t>(mode_.minShutter.get<std::micro>()),\n-\t\t\t    static_cast<int32_t>(mode_.maxShutter.get<std::micro>()));\n-\n-\t/* Declare Autofocus controls, only if we have a controllable lens */\n-\tif (lensPresent_)\n-\t\tctrlMap.merge(ControlInfoMap::Map(ipaAfControls));\n-\n-\tresult->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls);\n-\treturn 0;\n-}\n-\n-void IPARPi::mapBuffers(const std::vector<IPABuffer> &buffers)\n-{\n-\tfor (const IPABuffer &buffer : buffers) {\n-\t\tconst FrameBuffer fb(buffer.planes);\n-\t\tbuffers_.emplace(buffer.id,\n-\t\t\t\t MappedFrameBuffer(&fb, MappedFrameBuffer::MapFlag::ReadWrite));\n+\tif (params.buffers.stats)\n+\t\tprocessStats({ params.buffers, params.ipaContext });\n+\n+\t/* Do we need/want to call prepare? */\n+\tif (processPending_) {\n+\t\tcontroller_.prepare(&rpiMetadata);\n+\t\t/* Actually prepare the ISP parameters for the frame. 
*/\n+\t\tplatformPrepareIsp(params, rpiMetadata);\n \t}\n-}\n \n-void IPARPi::unmapBuffers(const std::vector<unsigned int> &ids)\n-{\n-\tfor (unsigned int id : ids) {\n-\t\tauto it = buffers_.find(id);\n-\t\tif (it == buffers_.end())\n-\t\t\tcontinue;\n+\tframeCount_++;\n \n-\t\tbuffers_.erase(id);\n-\t}\n+\t/* Ready to push the input buffer into the ISP. */\n+\tprepareIspComplete.emit(params.buffers);\n }\n \n-void IPARPi::processStats(const ProcessParams &params)\n+void IpaBase::processStats(const ProcessParams &params)\n {\n-\tunsigned int context = params.ipaContext % rpiMetadata_.size();\n+\tunsigned int ipaContext = params.ipaContext % rpiMetadata_.size();\n \n-\tif (++checkCount_ != frameCount_) /* assert here? */\n-\t\tLOG(IPARPI, Error) << \"WARNING: Prepare/Process mismatch!!!\";\n-\tif (processPending_ && frameCount_ > mistrustCount_)\n-\t\tprocess(params.buffers.stats, context);\n+\tif (processPending_ && frameCount_ > mistrustCount_) {\n+\t\tRPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext];\n \n-\tprocessStatsComplete.emit(params.buffers);\n-}\n+\t\tauto it = buffers_.find(params.buffers.stats);\n+\t\tif (it == buffers_.end()) {\n+\t\t\tLOG(IPARPI, Error) << \"Could not find stats buffer!\";\n+\t\t\treturn;\n+\t\t}\n \n+\t\tRPiController::StatisticsPtr statistics = platformProcessStats(it->second.planes()[0]);\n \n-void IPARPi::prepareIsp(const PrepareParams &params)\n-{\n-\tapplyControls(params.requestControls);\n+\t\thelper_->process(statistics, rpiMetadata);\n+\t\tcontroller_.process(statistics, &rpiMetadata);\n \n-\t/*\n-\t * At start-up, or after a mode-switch, we may want to\n-\t * avoid running the control algos for a few frames in case\n-\t * they are \"unreliable\".\n-\t */\n-\tprepare(params);\n-\tframeCount_++;\n+\t\tstruct AgcStatus agcStatus;\n+\t\tif (rpiMetadata.get(\"agc.status\", agcStatus) == 0) {\n+\t\t\tControlList ctrls(sensorCtrls_);\n+\t\t\tapplyAGC(&agcStatus, ctrls);\n+\t\t\tsetDelayedControls.emit(ctrls, 
ipaContext);\n+\t\t\tsetCameraTimeoutValue();\n+\t\t}\n+\t}\n \n-\t/* Ready to push the input buffer into the ISP. */\n-\tprepareIspComplete.emit(params.buffers);\n+\tprocessStatsComplete.emit(params.buffers);\n }\n \n-void IPARPi::reportMetadata(unsigned int ipaContext, ControlList *controls)\n+void IpaBase::reportMetadata(unsigned int ipaContext, ControlList *metadata)\n {\n \tRPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext % rpiMetadata_.size()];\n \tstd::unique_lock<RPiController::Metadata> lock(rpiMetadata);\n@@ -658,48 +513,132 @@ void IPARPi::reportMetadata(unsigned int ipaContext, ControlList *controls)\n \t\tlibcameraMetadata_.set(controls::FocusFoM, focusFoM);\n \t}\n \n-\tCcmStatus *ccmStatus = rpiMetadata.getLocked<CcmStatus>(\"ccm.status\");\n-\tif (ccmStatus) {\n-\t\tfloat m[9];\n-\t\tfor (unsigned int i = 0; i < 9; i++)\n-\t\t\tm[i] = ccmStatus->matrix[i];\n-\t\tlibcameraMetadata_.set(controls::ColourCorrectionMatrix, m);\n-\t}\n+\tCcmStatus *ccmStatus = rpiMetadata.getLocked<CcmStatus>(\"ccm.status\");\n+\tif (ccmStatus) {\n+\t\tfloat m[9];\n+\t\tfor (unsigned int i = 0; i < 9; i++)\n+\t\t\tm[i] = ccmStatus->matrix[i];\n+\t\tlibcameraMetadata_.set(controls::ColourCorrectionMatrix, m);\n+\t}\n+\n+\tconst AfStatus *afStatus = rpiMetadata.getLocked<AfStatus>(\"af.status\");\n+\tif (afStatus) {\n+\t\tint32_t s, p;\n+\t\tswitch (afStatus->state) {\n+\t\tcase AfState::Scanning:\n+\t\t\ts = controls::AfStateScanning;\n+\t\t\tbreak;\n+\t\tcase AfState::Focused:\n+\t\t\ts = controls::AfStateFocused;\n+\t\t\tbreak;\n+\t\tcase AfState::Failed:\n+\t\t\ts = controls::AfStateFailed;\n+\t\t\tbreak;\n+\t\tdefault:\n+\t\t\ts = controls::AfStateIdle;\n+\t\t}\n+\t\tswitch (afStatus->pauseState) {\n+\t\tcase AfPauseState::Pausing:\n+\t\t\tp = controls::AfPauseStatePausing;\n+\t\t\tbreak;\n+\t\tcase AfPauseState::Paused:\n+\t\t\tp = controls::AfPauseStatePaused;\n+\t\t\tbreak;\n+\t\tdefault:\n+\t\t\tp = 
controls::AfPauseStateRunning;\n+\t\t}\n+\t\tlibcameraMetadata_.set(controls::AfState, s);\n+\t\tlibcameraMetadata_.set(controls::AfPauseState, p);\n+\t}\n+\n+\t*metadata = std::move(libcameraMetadata_);\n+}\n+\n+void IpaBase::setMode(const IPACameraSensorInfo &sensorInfo)\n+{\n+\tmode_.bitdepth = sensorInfo.bitsPerPixel;\n+\tmode_.width = sensorInfo.outputSize.width;\n+\tmode_.height = sensorInfo.outputSize.height;\n+\tmode_.sensorWidth = sensorInfo.activeAreaSize.width;\n+\tmode_.sensorHeight = sensorInfo.activeAreaSize.height;\n+\tmode_.cropX = sensorInfo.analogCrop.x;\n+\tmode_.cropY = sensorInfo.analogCrop.y;\n+\tmode_.pixelRate = sensorInfo.pixelRate;\n+\n+\t/*\n+\t * Calculate scaling parameters. The scale_[xy] factors are determined\n+\t * by the ratio between the crop rectangle size and the output size.\n+\t */\n+\tmode_.scaleX = sensorInfo.analogCrop.width / sensorInfo.outputSize.width;\n+\tmode_.scaleY = sensorInfo.analogCrop.height / sensorInfo.outputSize.height;\n+\n+\t/*\n+\t * We're not told by the pipeline handler how scaling is split between\n+\t * binning and digital scaling. For now, as a heuristic, assume that\n+\t * downscaling up to 2 is achieved through binning, and that any\n+\t * additional scaling is achieved through digital scaling.\n+\t *\n+\t * \\todo Get the pipeline handle to provide the full data\n+\t */\n+\tmode_.binX = std::min(2, static_cast<int>(mode_.scaleX));\n+\tmode_.binY = std::min(2, static_cast<int>(mode_.scaleY));\n+\n+\t/* The noise factor is the square root of the total binning factor. 
*/\n+\tmode_.noiseFactor = std::sqrt(mode_.binX * mode_.binY);\n+\n+\t/*\n+\t * Calculate the line length as the ratio between the line length in\n+\t * pixels and the pixel rate.\n+\t */\n+\tmode_.minLineLength = sensorInfo.minLineLength * (1.0s / sensorInfo.pixelRate);\n+\tmode_.maxLineLength = sensorInfo.maxLineLength * (1.0s / sensorInfo.pixelRate);\n+\n+\t/*\n+\t * Set the frame length limits for the mode to ensure exposure and\n+\t * framerate calculations are clipped appropriately.\n+\t */\n+\tmode_.minFrameLength = sensorInfo.minFrameLength;\n+\tmode_.maxFrameLength = sensorInfo.maxFrameLength;\n+\n+\t/* Store these for convenience. */\n+\tmode_.minFrameDuration = mode_.minFrameLength * mode_.minLineLength;\n+\tmode_.maxFrameDuration = mode_.maxFrameLength * mode_.maxLineLength;\n+\n+\t/*\n+\t * Some sensors may have different sensitivities in different modes;\n+\t * the CamHelper will know the correct value.\n+\t */\n+\tmode_.sensitivity = helper_->getModeSensitivity(mode_);\n+\n+\tconst ControlInfo &gainCtrl = sensorCtrls_.at(V4L2_CID_ANALOGUE_GAIN);\n+\tconst ControlInfo &shutterCtrl = sensorCtrls_.at(V4L2_CID_EXPOSURE);\n+\n+\tmode_.minAnalogueGain = helper_->gain(gainCtrl.min().get<int32_t>());\n+\tmode_.maxAnalogueGain = helper_->gain(gainCtrl.max().get<int32_t>());\n+\n+\t/* Shutter speed is calculated based on the limits of the frame durations. */\n+\tmode_.minShutter = helper_->exposure(shutterCtrl.min().get<int32_t>(), mode_.minLineLength);\n+\tmode_.maxShutter = Duration::max();\n+\thelper_->getBlanking(mode_.maxShutter,\n+\t\t\t     mode_.minFrameDuration, mode_.maxFrameDuration);\n+}\n+\n+void IpaBase::setCameraTimeoutValue()\n+{\n+\t/*\n+\t * Take the maximum value of the exposure queue as the camera timeout\n+\t * value to pass back to the pipeline handler. 
Only signal if it has changed\n+\t * from the last set value.\n+\t */\n+\tauto max = std::max_element(frameLengths_.begin(), frameLengths_.end());\n \n-\tconst AfStatus *afStatus = rpiMetadata.getLocked<AfStatus>(\"af.status\");\n-\tif (afStatus) {\n-\t\tint32_t s, p;\n-\t\tswitch (afStatus->state) {\n-\t\tcase AfState::Scanning:\n-\t\t\ts = controls::AfStateScanning;\n-\t\t\tbreak;\n-\t\tcase AfState::Focused:\n-\t\t\ts = controls::AfStateFocused;\n-\t\t\tbreak;\n-\t\tcase AfState::Failed:\n-\t\t\ts = controls::AfStateFailed;\n-\t\t\tbreak;\n-\t\tdefault:\n-\t\t\ts = controls::AfStateIdle;\n-\t\t}\n-\t\tswitch (afStatus->pauseState) {\n-\t\tcase AfPauseState::Pausing:\n-\t\t\tp = controls::AfPauseStatePausing;\n-\t\t\tbreak;\n-\t\tcase AfPauseState::Paused:\n-\t\t\tp = controls::AfPauseStatePaused;\n-\t\t\tbreak;\n-\t\tdefault:\n-\t\t\tp = controls::AfPauseStateRunning;\n-\t\t}\n-\t\tlibcameraMetadata_.set(controls::AfState, s);\n-\t\tlibcameraMetadata_.set(controls::AfPauseState, p);\n+\tif (*max != lastTimeout_) {\n+\t\tsetCameraTimeout.emit(max->get<std::milli>());\n+\t\tlastTimeout_ = *max;\n \t}\n-\n-\t*controls = std::move(libcameraMetadata_);\n }\n \n-bool IPARPi::validateSensorControls()\n+bool IpaBase::validateSensorControls()\n {\n \tstatic const uint32_t ctrls[] = {\n \t\tV4L2_CID_ANALOGUE_GAIN,\n@@ -719,35 +658,7 @@ bool IPARPi::validateSensorControls()\n \treturn true;\n }\n \n-bool IPARPi::validateIspControls()\n-{\n-\tstatic const uint32_t ctrls[] = {\n-\t\tV4L2_CID_RED_BALANCE,\n-\t\tV4L2_CID_BLUE_BALANCE,\n-\t\tV4L2_CID_DIGITAL_GAIN,\n-\t\tV4L2_CID_USER_BCM2835_ISP_CC_MATRIX,\n-\t\tV4L2_CID_USER_BCM2835_ISP_GAMMA,\n-\t\tV4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL,\n-\t\tV4L2_CID_USER_BCM2835_ISP_GEQ,\n-\t\tV4L2_CID_USER_BCM2835_ISP_DENOISE,\n-\t\tV4L2_CID_USER_BCM2835_ISP_SHARPEN,\n-\t\tV4L2_CID_USER_BCM2835_ISP_DPC,\n-\t\tV4L2_CID_USER_BCM2835_ISP_LENS_SHADING,\n-\t\tV4L2_CID_USER_BCM2835_ISP_CDN,\n-\t};\n-\n-\tfor (auto c : ctrls) {\n-\t\tif 
(ispCtrls_.find(c) == ispCtrls_.end()) {\n-\t\t\tLOG(IPARPI, Error) << \"Unable to find ISP control \"\n-\t\t\t\t\t   << utils::hex(c);\n-\t\t\treturn false;\n-\t\t}\n-\t}\n-\n-\treturn true;\n-}\n-\n-bool IPARPi::validateLensControls()\n+bool IpaBase::validateLensControls()\n {\n \tif (lensCtrls_.find(V4L2_CID_FOCUS_ABSOLUTE) == lensCtrls_.end()) {\n \t\tLOG(IPARPI, Error) << \"Unable to find Lens control V4L2_CID_FOCUS_ABSOLUTE\";\n@@ -820,7 +731,7 @@ static const std::map<int32_t, RPiController::AfAlgorithm::AfPause> AfPauseTable\n \t{ controls::AfPauseResume, RPiController::AfAlgorithm::AfPauseResume },\n };\n \n-void IPARPi::applyControls(const ControlList &controls)\n+void IpaBase::applyControls(const ControlList &controls)\n {\n \tusing RPiController::AfAlgorithm;\n \n@@ -840,7 +751,7 @@ void IPARPi::applyControls(const ControlList &controls)\n \t\tif (mode == AfModeTable.end()) {\n \t\t\tLOG(IPARPI, Error) << \"AF mode \" << idx\n \t\t\t\t\t   << \" not recognised\";\n-\t\t} else\n+\t\t} else if (af)\n \t\t\taf->setMode(mode->second);\n \t}\n \n@@ -1112,6 +1023,10 @@ void IPARPi::applyControls(const ControlList &controls)\n \t\tcase controls::NOISE_REDUCTION_MODE: {\n \t\t\tRPiController::DenoiseAlgorithm *sdn = dynamic_cast<RPiController::DenoiseAlgorithm *>(\n \t\t\t\tcontroller_.getAlgorithm(\"SDN\"));\n+\t\t\t/* Some platforms may have a combined \"denoise\" algorithm instead. 
*/\n+\t\t\tif (!sdn)\n+\t\t\t\tsdn = dynamic_cast<RPiController::DenoiseAlgorithm *>(\n+\t\t\t\tcontroller_.getAlgorithm(\"denoise\"));\n \t\t\tif (!sdn) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set NOISE_REDUCTION_MODE - no SDN algorithm\";\n@@ -1248,125 +1163,12 @@ void IPARPi::applyControls(const ControlList &controls)\n \t\t\tbreak;\n \t\t}\n \t}\n-}\n-\n-void IPARPi::prepare(const PrepareParams &params)\n-{\n-\tint64_t frameTimestamp = params.sensorControls.get(controls::SensorTimestamp).value_or(0);\n-\tunsigned int ipaContext = params.ipaContext % rpiMetadata_.size();\n-\tRPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext];\n-\tSpan<uint8_t> embeddedBuffer;\n-\n-\trpiMetadata.clear();\n-\tfillDeviceStatus(params.sensorControls, ipaContext);\n-\n-\tif (params.buffers.embedded) {\n-\t\t/*\n-\t\t * Pipeline handler has supplied us with an embedded data buffer,\n-\t\t * we must pass it to the CamHelper for parsing.\n-\t\t */\n-\t\tauto it = buffers_.find(params.buffers.embedded);\n-\t\tASSERT(it != buffers_.end());\n-\t\tembeddedBuffer = it->second.planes()[0];\n-\t}\n-\n-\t/*\n-\t * AGC wants to know the algorithm status from the time it actioned the\n-\t * sensor exposure/gain changes. So fetch it from the metadata list\n-\t * indexed by the IPA cookie returned, and put it in the current frame\n-\t * metadata.\n-\t */\n-\tAgcStatus agcStatus;\n-\tRPiController::Metadata &delayedMetadata = rpiMetadata_[params.delayContext];\n-\tif (!delayedMetadata.get<AgcStatus>(\"agc.status\", agcStatus))\n-\t\trpiMetadata.set(\"agc.delayed_status\", agcStatus);\n-\n-\t/*\n-\t * This may overwrite the DeviceStatus using values from the sensor\n-\t * metadata, and may also do additional custom processing.\n-\t */\n-\thelper_->prepare(embeddedBuffer, rpiMetadata);\n-\n-\t/* Allow a 10% margin on the comparison below. 
*/\n-\tDuration delta = (frameTimestamp - lastRunTimestamp_) * 1.0ns;\n-\tif (lastRunTimestamp_ && frameCount_ > dropFrameCount_ &&\n-\t    delta < controllerMinFrameDuration * 0.9) {\n-\t\t/*\n-\t\t * Ensure we merge the previous frame's metadata with the current\n-\t\t * frame. This will not overwrite exposure/gain values for the\n-\t\t * current frame, or any other bits of metadata that were added\n-\t\t * in helper_->Prepare().\n-\t\t */\n-\t\tRPiController::Metadata &lastMetadata =\n-\t\t\trpiMetadata_[(ipaContext ? ipaContext : rpiMetadata_.size()) - 1];\n-\t\trpiMetadata.mergeCopy(lastMetadata);\n-\t\tprocessPending_ = false;\n-\t\treturn;\n-\t}\n-\n-\tlastRunTimestamp_ = frameTimestamp;\n-\tprocessPending_ = true;\n-\n-\tControlList ctrls(ispCtrls_);\n-\n-\tcontroller_.prepare(&rpiMetadata);\n-\n-\t/* Lock the metadata buffer to avoid constant locks/unlocks. */\n-\tstd::unique_lock<RPiController::Metadata> lock(rpiMetadata);\n-\n-\tAwbStatus *awbStatus = rpiMetadata.getLocked<AwbStatus>(\"awb.status\");\n-\tif (awbStatus)\n-\t\tapplyAWB(awbStatus, ctrls);\n-\n-\tCcmStatus *ccmStatus = rpiMetadata.getLocked<CcmStatus>(\"ccm.status\");\n-\tif (ccmStatus)\n-\t\tapplyCCM(ccmStatus, ctrls);\n-\n-\tAgcStatus *dgStatus = rpiMetadata.getLocked<AgcStatus>(\"agc.status\");\n-\tif (dgStatus)\n-\t\tapplyDG(dgStatus, ctrls);\n-\n-\tAlscStatus *lsStatus = rpiMetadata.getLocked<AlscStatus>(\"alsc.status\");\n-\tif (lsStatus)\n-\t\tapplyLS(lsStatus, ctrls);\n-\n-\tContrastStatus *contrastStatus = rpiMetadata.getLocked<ContrastStatus>(\"contrast.status\");\n-\tif (contrastStatus)\n-\t\tapplyGamma(contrastStatus, ctrls);\n-\n-\tBlackLevelStatus *blackLevelStatus = rpiMetadata.getLocked<BlackLevelStatus>(\"black_level.status\");\n-\tif (blackLevelStatus)\n-\t\tapplyBlackLevel(blackLevelStatus, ctrls);\n-\n-\tGeqStatus *geqStatus = rpiMetadata.getLocked<GeqStatus>(\"geq.status\");\n-\tif (geqStatus)\n-\t\tapplyGEQ(geqStatus, ctrls);\n-\n-\tDenoiseStatus *denoiseStatus = 
rpiMetadata.getLocked<DenoiseStatus>(\"denoise.status\");\n-\tif (denoiseStatus)\n-\t\tapplyDenoise(denoiseStatus, ctrls);\n-\n-\tSharpenStatus *sharpenStatus = rpiMetadata.getLocked<SharpenStatus>(\"sharpen.status\");\n-\tif (sharpenStatus)\n-\t\tapplySharpen(sharpenStatus, ctrls);\n-\n-\tDpcStatus *dpcStatus = rpiMetadata.getLocked<DpcStatus>(\"dpc.status\");\n-\tif (dpcStatus)\n-\t\tapplyDPC(dpcStatus, ctrls);\n-\n-\tconst AfStatus *afStatus = rpiMetadata.getLocked<AfStatus>(\"af.status\");\n-\tif (afStatus) {\n-\t\tControlList lensctrls(lensCtrls_);\n-\t\tapplyAF(afStatus, lensctrls);\n-\t\tif (!lensctrls.empty())\n-\t\t\tsetLensControls.emit(lensctrls);\n-\t}\n \n-\tif (!ctrls.empty())\n-\t\tsetIspControls.emit(ctrls);\n+\t/* Give derived classes a chance to examine the new controls. */\n+\thandleControls(controls);\n }\n \n-void IPARPi::fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext)\n+void IpaBase::fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext)\n {\n \tDeviceStatus deviceStatus = {};\n \n@@ -1390,103 +1192,7 @@ void IPARPi::fillDeviceStatus(const ControlList &sensorControls, unsigned int ip\n \trpiMetadata_[ipaContext].set(\"device.status\", deviceStatus);\n }\n \n-RPiController::StatisticsPtr IPARPi::fillStatistics(bcm2835_isp_stats *stats) const\n-{\n-\tusing namespace RPiController;\n-\n-\tconst Controller::HardwareConfig &hw = controller_.getHardwareConfig();\n-\tunsigned int i;\n-\tStatisticsPtr statistics =\n-\t\tstd::make_unique<Statistics>(Statistics::AgcStatsPos::PreWb, Statistics::ColourStatsPos::PostLsc);\n-\n-\t/* RGB histograms are not used, so do not populate them. */\n-\tstatistics->yHist = RPiController::Histogram(stats->hist[0].g_hist,\n-\t\t\t\t\t\t     hw.numHistogramBins);\n-\n-\t/* All region sums are based on a 16-bit normalised pipeline bit-depth. 
*/\n-\tunsigned int scale = Statistics::NormalisationFactorPow2 - hw.pipelineWidth;\n-\n-\tstatistics->awbRegions.init(hw.awbRegions);\n-\tfor (i = 0; i < statistics->awbRegions.numRegions(); i++)\n-\t\tstatistics->awbRegions.set(i, { { stats->awb_stats[i].r_sum << scale,\n-\t\t\t\t\t\t  stats->awb_stats[i].g_sum << scale,\n-\t\t\t\t\t\t  stats->awb_stats[i].b_sum << scale },\n-\t\t\t\t\t\tstats->awb_stats[i].counted,\n-\t\t\t\t\t\tstats->awb_stats[i].notcounted });\n-\n-\tstatistics->agcRegions.init(hw.agcRegions);\n-\tfor (i = 0; i < statistics->agcRegions.numRegions(); i++)\n-\t\tstatistics->agcRegions.set(i, { { stats->agc_stats[i].r_sum << scale,\n-\t\t\t\t\t\t  stats->agc_stats[i].g_sum << scale,\n-\t\t\t\t\t\t  stats->agc_stats[i].b_sum << scale },\n-\t\t\t\t\t\tstats->agc_stats[i].counted,\n-\t\t\t\t\t\tstats->awb_stats[i].notcounted });\n-\n-\tstatistics->focusRegions.init(hw.focusRegions);\n-\tfor (i = 0; i < statistics->focusRegions.numRegions(); i++)\n-\t\tstatistics->focusRegions.set(i, { stats->focus_stats[i].contrast_val[1][1] / 1000,\n-\t\t\t\t\t\t  stats->focus_stats[i].contrast_val_num[1][1],\n-\t\t\t\t\t\t  stats->focus_stats[i].contrast_val_num[1][0] });\n-\treturn statistics;\n-}\n-\n-void IPARPi::process(unsigned int bufferId, unsigned int ipaContext)\n-{\n-\tRPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext];\n-\n-\tauto it = buffers_.find(bufferId);\n-\tif (it == buffers_.end()) {\n-\t\tLOG(IPARPI, Error) << \"Could not find stats buffer!\";\n-\t\treturn;\n-\t}\n-\n-\tSpan<uint8_t> mem = it->second.planes()[0];\n-\tbcm2835_isp_stats *stats = reinterpret_cast<bcm2835_isp_stats *>(mem.data());\n-\tRPiController::StatisticsPtr statistics = fillStatistics(stats);\n-\n-\t/* Save the focus stats in the metadata structure to report out later. 
*/\n-\trpiMetadata_[ipaContext].set(\"focus.status\", statistics->focusRegions);\n-\n-\thelper_->process(statistics, rpiMetadata);\n-\tcontroller_.process(statistics, &rpiMetadata);\n-\n-\tstruct AgcStatus agcStatus;\n-\tif (rpiMetadata.get(\"agc.status\", agcStatus) == 0) {\n-\t\tControlList ctrls(sensorCtrls_);\n-\t\tapplyAGC(&agcStatus, ctrls);\n-\n-\t\tsetDelayedControls.emit(ctrls, ipaContext);\n-\t\tsetCameraTimeoutValue();\n-\t}\n-}\n-\n-void IPARPi::setCameraTimeoutValue()\n-{\n-\t/*\n-\t * Take the maximum value of the exposure queue as the camera timeout\n-\t * value to pass back to the pipeline handler. Only signal if it has changed\n-\t * from the last set value.\n-\t */\n-\tauto max = std::max_element(frameLengths_.begin(), frameLengths_.end());\n-\n-\tif (*max != lastTimeout_) {\n-\t\tsetCameraTimeout.emit(max->get<std::milli>());\n-\t\tlastTimeout_ = *max;\n-\t}\n-}\n-\n-void IPARPi::applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls)\n-{\n-\tLOG(IPARPI, Debug) << \"Applying WB R: \" << awbStatus->gainR << \" B: \"\n-\t\t\t   << awbStatus->gainB;\n-\n-\tctrls.set(V4L2_CID_RED_BALANCE,\n-\t\t  static_cast<int32_t>(awbStatus->gainR * 1000));\n-\tctrls.set(V4L2_CID_BLUE_BALANCE,\n-\t\t  static_cast<int32_t>(awbStatus->gainB * 1000));\n-}\n-\n-void IPARPi::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration)\n+void IpaBase::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration)\n {\n \t/*\n \t * This will only be applied once AGC recalculations occur.\n@@ -1518,7 +1224,7 @@ void IPARPi::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDur\n \tagc->setMaxShutter(maxShutter);\n }\n \n-void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls)\n+void IpaBase::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls)\n {\n \tconst int32_t minGainCode = helper_->gainCode(mode_.minAnalogueGain);\n \tconst int32_t maxGainCode = 
helper_->gainCode(mode_.maxAnalogueGain);\n@@ -1570,269 +1276,6 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls)\n \t\t\t\t\t\t  helper_->hblankToLineLength(hblank)));\n }\n \n-void IPARPi::applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls)\n-{\n-\tctrls.set(V4L2_CID_DIGITAL_GAIN,\n-\t\t  static_cast<int32_t>(dgStatus->digitalGain * 1000));\n-}\n-\n-void IPARPi::applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls)\n-{\n-\tbcm2835_isp_custom_ccm ccm;\n-\n-\tfor (int i = 0; i < 9; i++) {\n-\t\tccm.ccm.ccm[i / 3][i % 3].den = 1000;\n-\t\tccm.ccm.ccm[i / 3][i % 3].num = 1000 * ccmStatus->matrix[i];\n-\t}\n-\n-\tccm.enabled = 1;\n-\tccm.ccm.offsets[0] = ccm.ccm.offsets[1] = ccm.ccm.offsets[2] = 0;\n-\n-\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&ccm),\n-\t\t\t\t\t    sizeof(ccm) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, c);\n-}\n-\n-void IPARPi::applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls)\n-{\n-\tconst unsigned int numGammaPoints = controller_.getHardwareConfig().numGammaPoints;\n-\tstruct bcm2835_isp_gamma gamma;\n-\n-\tfor (unsigned int i = 0; i < numGammaPoints - 1; i++) {\n-\t\tint x = i < 16 ? i * 1024\n-\t\t\t       : (i < 24 ? 
(i - 16) * 2048 + 16384\n-\t\t\t\t\t : (i - 24) * 4096 + 32768);\n-\t\tgamma.x[i] = x;\n-\t\tgamma.y[i] = std::min<uint16_t>(65535, contrastStatus->gammaCurve.eval(x));\n-\t}\n-\n-\tgamma.x[numGammaPoints - 1] = 65535;\n-\tgamma.y[numGammaPoints - 1] = 65535;\n-\tgamma.enabled = 1;\n-\n-\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&gamma),\n-\t\t\t\t\t    sizeof(gamma) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_GAMMA, c);\n-}\n-\n-void IPARPi::applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls)\n-{\n-\tbcm2835_isp_black_level blackLevel;\n-\n-\tblackLevel.enabled = 1;\n-\tblackLevel.black_level_r = blackLevelStatus->blackLevelR;\n-\tblackLevel.black_level_g = blackLevelStatus->blackLevelG;\n-\tblackLevel.black_level_b = blackLevelStatus->blackLevelB;\n-\n-\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&blackLevel),\n-\t\t\t\t\t    sizeof(blackLevel) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, c);\n-}\n-\n-void IPARPi::applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls)\n-{\n-\tbcm2835_isp_geq geq;\n-\n-\tgeq.enabled = 1;\n-\tgeq.offset = geqStatus->offset;\n-\tgeq.slope.den = 1000;\n-\tgeq.slope.num = 1000 * geqStatus->slope;\n-\n-\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&geq),\n-\t\t\t\t\t    sizeof(geq) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_GEQ, c);\n-}\n-\n-void IPARPi::applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls)\n-{\n-\tusing RPiController::DenoiseMode;\n-\n-\tbcm2835_isp_denoise denoise;\n-\tDenoiseMode mode = static_cast<DenoiseMode>(denoiseStatus->mode);\n-\n-\tdenoise.enabled = mode != DenoiseMode::Off;\n-\tdenoise.constant = denoiseStatus->noiseConstant;\n-\tdenoise.slope.num = 1000 * denoiseStatus->noiseSlope;\n-\tdenoise.slope.den = 1000;\n-\tdenoise.strength.num = 1000 * denoiseStatus->strength;\n-\tdenoise.strength.den = 1000;\n-\n-\t/* Set the CDN mode to match the SDN operating mode. 
*/\n-\tbcm2835_isp_cdn cdn;\n-\tswitch (mode) {\n-\tcase DenoiseMode::ColourFast:\n-\t\tcdn.enabled = 1;\n-\t\tcdn.mode = CDN_MODE_FAST;\n-\t\tbreak;\n-\tcase DenoiseMode::ColourHighQuality:\n-\t\tcdn.enabled = 1;\n-\t\tcdn.mode = CDN_MODE_HIGH_QUALITY;\n-\t\tbreak;\n-\tdefault:\n-\t\tcdn.enabled = 0;\n-\t}\n-\n-\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&denoise),\n-\t\t\t\t\t    sizeof(denoise) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_DENOISE, c);\n-\n-\tc = ControlValue(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&cdn),\n-\t\t\t\t\t      sizeof(cdn) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_CDN, c);\n-}\n-\n-void IPARPi::applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls)\n-{\n-\tbcm2835_isp_sharpen sharpen;\n-\n-\tsharpen.enabled = 1;\n-\tsharpen.threshold.num = 1000 * sharpenStatus->threshold;\n-\tsharpen.threshold.den = 1000;\n-\tsharpen.strength.num = 1000 * sharpenStatus->strength;\n-\tsharpen.strength.den = 1000;\n-\tsharpen.limit.num = 1000 * sharpenStatus->limit;\n-\tsharpen.limit.den = 1000;\n-\n-\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&sharpen),\n-\t\t\t\t\t    sizeof(sharpen) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_SHARPEN, c);\n-}\n-\n-void IPARPi::applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls)\n-{\n-\tbcm2835_isp_dpc dpc;\n-\n-\tdpc.enabled = 1;\n-\tdpc.strength = dpcStatus->strength;\n-\n-\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&dpc),\n-\t\t\t\t\t    sizeof(dpc) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_DPC, c);\n-}\n-\n-void IPARPi::applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls)\n-{\n-\t/*\n-\t * Program lens shading tables into pipeline.\n-\t * Choose smallest cell size that won't exceed 63x48 cells.\n-\t */\n-\tconst int cellSizes[] = { 16, 32, 64, 128, 256 };\n-\tunsigned int numCells = std::size(cellSizes);\n-\tunsigned int i, w, h, cellSize;\n-\tfor (i = 0; i < numCells; i++) {\n-\t\tcellSize = 
cellSizes[i];\n-\t\tw = (mode_.width + cellSize - 1) / cellSize;\n-\t\th = (mode_.height + cellSize - 1) / cellSize;\n-\t\tif (w < 64 && h <= 48)\n-\t\t\tbreak;\n-\t}\n-\n-\tif (i == numCells) {\n-\t\tLOG(IPARPI, Error) << \"Cannot find cell size\";\n-\t\treturn;\n-\t}\n-\n-\t/* We're going to supply corner sampled tables, 16 bit samples. */\n-\tw++, h++;\n-\tbcm2835_isp_lens_shading ls = {\n-\t\t.enabled = 1,\n-\t\t.grid_cell_size = cellSize,\n-\t\t.grid_width = w,\n-\t\t.grid_stride = w,\n-\t\t.grid_height = h,\n-\t\t/* .dmabuf will be filled in by pipeline handler. */\n-\t\t.dmabuf = 0,\n-\t\t.ref_transform = 0,\n-\t\t.corner_sampled = 1,\n-\t\t.gain_format = GAIN_FORMAT_U4P10\n-\t};\n-\n-\tif (!lsTable_ || w * h * 4 * sizeof(uint16_t) > MaxLsGridSize) {\n-\t\tLOG(IPARPI, Error) << \"Do not have a correctly allocate lens shading table!\";\n-\t\treturn;\n-\t}\n-\n-\tif (lsStatus) {\n-\t\t/* Format will be u4.10 */\n-\t\tuint16_t *grid = static_cast<uint16_t *>(lsTable_);\n-\n-\t\tresampleTable(grid, lsStatus->r, w, h);\n-\t\tresampleTable(grid + w * h, lsStatus->g, w, h);\n-\t\tstd::memcpy(grid + 2 * w * h, grid + w * h, w * h * sizeof(uint16_t));\n-\t\tresampleTable(grid + 3 * w * h, lsStatus->b, w, h);\n-\t}\n-\n-\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&ls),\n-\t\t\t\t\t    sizeof(ls) });\n-\tctrls.set(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, c);\n-}\n-\n-void IPARPi::applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls)\n-{\n-\tif (afStatus->lensSetting) {\n-\t\tControlValue v(afStatus->lensSetting.value());\n-\t\tlensCtrls.set(V4L2_CID_FOCUS_ABSOLUTE, v);\n-\t}\n-}\n-\n-/*\n- * Resamples a 16x12 table with central sampling to destW x destH with corner\n- * sampling.\n- */\n-void IPARPi::resampleTable(uint16_t dest[], const std::vector<double> &src,\n-\t\t\t   int destW, int destH)\n-{\n-\t/*\n-\t * Precalculate and cache the x sampling locations and phases to\n-\t * save recomputing them on every row.\n-\t 
*/\n-\tassert(destW > 1 && destH > 1 && destW <= 64);\n-\tint xLo[64], xHi[64];\n-\tdouble xf[64];\n-\tdouble x = -0.5, xInc = 16.0 / (destW - 1);\n-\tfor (int i = 0; i < destW; i++, x += xInc) {\n-\t\txLo[i] = floor(x);\n-\t\txf[i] = x - xLo[i];\n-\t\txHi[i] = xLo[i] < 15 ? xLo[i] + 1 : 15;\n-\t\txLo[i] = xLo[i] > 0 ? xLo[i] : 0;\n-\t}\n-\n-\t/* Now march over the output table generating the new values. */\n-\tdouble y = -0.5, yInc = 12.0 / (destH - 1);\n-\tfor (int j = 0; j < destH; j++, y += yInc) {\n-\t\tint yLo = floor(y);\n-\t\tdouble yf = y - yLo;\n-\t\tint yHi = yLo < 11 ? yLo + 1 : 11;\n-\t\tyLo = yLo > 0 ? yLo : 0;\n-\t\tdouble const *rowAbove = src.data() + yLo * 16;\n-\t\tdouble const *rowBelow = src.data() + yHi * 16;\n-\t\tfor (int i = 0; i < destW; i++) {\n-\t\t\tdouble above = rowAbove[xLo[i]] * (1 - xf[i]) + rowAbove[xHi[i]] * xf[i];\n-\t\t\tdouble below = rowBelow[xLo[i]] * (1 - xf[i]) + rowBelow[xHi[i]] * xf[i];\n-\t\t\tint result = floor(1024 * (above * (1 - yf) + below * yf) + .5);\n-\t\t\t*(dest++) = result > 16383 ? 
16383 : result; /* want u4.10 */\n-\t\t}\n-\t}\n-}\n-\n } /* namespace ipa::RPi */\n \n-/*\n- * External IPA module interface\n- */\n-extern \"C\" {\n-const struct IPAModuleInfo ipaModuleInfo = {\n-\tIPA_MODULE_API_VERSION,\n-\t1,\n-\t\"PipelineHandlerRPi\",\n-\t\"vc4\",\n-};\n-\n-IPAInterface *ipaCreate()\n-{\n-\treturn new ipa::RPi::IPARPi();\n-}\n-\n-} /* extern \"C\" */\n-\n } /* namespace libcamera */\ndiff --git a/src/ipa/rpi/common/ipa_base.h b/src/ipa/rpi/common/ipa_base.h\nnew file mode 100644\nindex 000000000000..b9fbf9414d63\n--- /dev/null\n+++ b/src/ipa/rpi/common/ipa_base.h\n@@ -0,0 +1,125 @@\n+/* SPDX-License-Identifier: BSD-2-Clause */\n+/*\n+ * Copyright (C) 2023, Raspberry Pi Ltd\n+ *\n+ * ipa_base.h - Raspberry Pi IPA base class\n+ */\n+#pragma once\n+\n+#include <array>\n+#include <deque>\n+#include <vector>\n+\n+#include <libcamera/base/utils.h>\n+\n+#include <libcamera/controls.h>\n+\n+#include <libcamera/ipa/ipa_interface.h>\n+#include <libcamera/ipa/ipa_module_info.h>\n+#include <libcamera/ipa/raspberrypi_ipa_interface.h>\n+\n+#include \"libcamera/internal/mapped_framebuffer.h\"\n+\n+#include \"cam_helper/cam_helper.h\"\n+#include \"controller/agc_status.h\"\n+#include \"controller/camera_mode.h\"\n+#include \"controller/controller.h\"\n+#include \"controller/metadata.h\"\n+\n+namespace libcamera {\n+\n+namespace ipa::RPi {\n+\n+class IpaBase : public IPARPiInterface\n+{\n+public:\n+\tIpaBase();\n+\tvirtual ~IpaBase() = 0;\n+\n+\tint32_t init(const IPASettings &settings, const InitParams &params, InitResult *result) override;\n+\tint32_t configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams &params,\n+\t\t\t  ConfigResult *result) override;\n+\n+\tvoid start(const ControlList &controls, StartResult *result) override;\n+\tvoid stop() override {}\n+\n+\tvoid mapBuffers(const std::vector<IPABuffer> &buffers) override;\n+\tvoid unmapBuffers(const std::vector<unsigned int> &ids) override;\n+\n+\tvoid prepareIsp(const PrepareParams 
&params) override;\n+\tvoid processStats(const ProcessParams &params) override;\n+\n+\tvoid reportMetadata(unsigned int ipaContext, ControlList *controls) override;\n+\n+private:\n+\t/* Number of metadata objects available in the context list. */\n+\tstatic constexpr unsigned int numMetadataContexts = 16;\n+\n+\tvirtual int32_t platformInit(const InitParams &params, InitResult *result) = 0;\n+\tvirtual int32_t platformConfigure(const ConfigParams &params, ConfigResult *result) = 0;\n+\n+\tvirtual void platformPrepareIsp(const PrepareParams &params,\n+\t\t\t\t\tRPiController::Metadata &rpiMetadata) = 0;\n+\tvirtual RPiController::StatisticsPtr platformProcessStats(Span<uint8_t> mem) = 0;\n+\n+\tvoid setMode(const IPACameraSensorInfo &sensorInfo);\n+\tvoid setCameraTimeoutValue();\n+\tbool validateSensorControls();\n+\tbool validateLensControls();\n+\tvoid applyControls(const ControlList &controls);\n+\tvirtual void handleControls(const ControlList &controls) = 0;\n+\tvoid fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext);\n+\tvoid applyFrameDurations(utils::Duration minFrameDuration, utils::Duration maxFrameDuration);\n+\tvoid applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls);\n+\n+\tstd::map<unsigned int, MappedFrameBuffer> buffers_;\n+\n+\tbool lensPresent_;\n+\tControlList libcameraMetadata_;\n+\n+\tstd::array<RPiController::Metadata, numMetadataContexts> rpiMetadata_;\n+\n+\t/*\n+\t * We count frames to decide if the frame must be hidden (e.g. from\n+\t * display) or mistrusted (i.e. not given to the control algos).\n+\t */\n+\tuint64_t frameCount_;\n+\n+\t/* How many frames we should avoid running control algos on. */\n+\tunsigned int mistrustCount_;\n+\n+\t/* Number of frames that need to be dropped on startup. */\n+\tunsigned int dropFrameCount_;\n+\n+\t/* Frame timestamp for the last run of the controller. */\n+\tuint64_t lastRunTimestamp_;\n+\n+\t/* Do we run a Controller::process() for this frame? 
*/\n+\tbool processPending_;\n+\n+\t/* Distinguish the first camera start from others. */\n+\tbool firstStart_;\n+\n+\t/* Frame duration (1/fps) limits. */\n+\tutils::Duration minFrameDuration_;\n+\tutils::Duration maxFrameDuration_;\n+\n+protected:\n+\t/* Raspberry Pi controller specific defines. */\n+\tstd::unique_ptr<RPiController::CamHelper> helper_;\n+\tRPiController::Controller controller_;\n+\n+\tControlInfoMap sensorCtrls_;\n+\tControlInfoMap lensCtrls_;\n+\n+\t/* Camera sensor params. */\n+\tCameraMode mode_;\n+\n+\t/* Track the frame length times over FrameLengthsQueueSize frames. */\n+\tstd::deque<utils::Duration> frameLengths_;\n+\tutils::Duration lastTimeout_;\n+};\n+\n+} /* namespace ipa::RPi */\n+\n+} /* namespace libcamera */\ndiff --git a/src/ipa/rpi/common/meson.build b/src/ipa/rpi/common/meson.build\nnew file mode 100644\nindex 000000000000..a7189f8389af\n--- /dev/null\n+++ b/src/ipa/rpi/common/meson.build\n@@ -0,0 +1,5 @@\n+# SPDX-License-Identifier: CC0-1.0\n+\n+rpi_ipa_common_sources = files([\n+    'ipa_base.cpp',\n+])\ndiff --git a/src/ipa/rpi/vc4/meson.build b/src/ipa/rpi/vc4/meson.build\nindex 992d0f1ab5a7..b581391586b3 100644\n--- a/src/ipa/rpi/vc4/meson.build\n+++ b/src/ipa/rpi/vc4/meson.build\n@@ -13,11 +13,15 @@ vc4_ipa_includes = [\n ]\n \n vc4_ipa_sources = files([\n-    'raspberrypi.cpp',\n+    'vc4.cpp',\n ])\n \n vc4_ipa_includes += include_directories('..')\n-vc4_ipa_sources += [rpi_ipa_cam_helper_sources, rpi_ipa_controller_sources]\n+vc4_ipa_sources += [\n+    rpi_ipa_cam_helper_sources,\n+    rpi_ipa_common_sources,\n+    rpi_ipa_controller_sources,\n+]\n \n mod = shared_module(ipa_name,\n                     [vc4_ipa_sources, libcamera_generated_ipa_headers],\ndiff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp\nnew file mode 100644\nindex 000000000000..0d929cda6c4a\n--- /dev/null\n+++ b/src/ipa/rpi/vc4/vc4.cpp\n@@ -0,0 +1,540 @@\n+/* SPDX-License-Identifier: BSD-2-Clause 
*/\n+/*\n+ * Copyright (C) 2019-2021, Raspberry Pi Ltd\n+ *\n+ * vc4.cpp - Raspberry Pi VC4/BCM2835 ISP IPA.\n+ */\n+\n+#include <string.h>\n+#include <sys/mman.h>\n+\n+#include <linux/bcm2835-isp.h>\n+\n+#include <libcamera/base/log.h>\n+\n+#include \"common/ipa_base.h\"\n+#include \"controller/af_status.h\"\n+#include \"controller/agc_algorithm.h\"\n+#include \"controller/alsc_status.h\"\n+#include \"controller/awb_status.h\"\n+#include \"controller/black_level_status.h\"\n+#include \"controller/ccm_status.h\"\n+#include \"controller/contrast_status.h\"\n+#include \"controller/denoise_algorithm.h\"\n+#include \"controller/denoise_status.h\"\n+#include \"controller/dpc_status.h\"\n+#include \"controller/geq_status.h\"\n+#include \"controller/lux_status.h\"\n+#include \"controller/noise_status.h\"\n+#include \"controller/sharpen_status.h\"\n+\n+namespace libcamera {\n+\n+LOG_DECLARE_CATEGORY(IPARPI)\n+\n+namespace ipa::RPi {\n+\n+class IpaVc4 final : public IpaBase\n+{\n+public:\n+\tIpaVc4()\n+\t\t: IpaBase(), lsTable_(nullptr)\n+\t{\n+\t}\n+\n+\t~IpaVc4()\n+\t{\n+\t\tif (lsTable_)\n+\t\t\tmunmap(lsTable_, MaxLsGridSize);\n+\t}\n+\n+\n+private:\n+\tint32_t platformInit(const InitParams &params, InitResult *result) override;\n+\tint32_t platformConfigure(const ConfigParams &params, ConfigResult *result) override;\n+\n+\tvoid platformPrepareIsp(const PrepareParams &params, RPiController::Metadata &rpiMetadata) override;\n+\tRPiController::StatisticsPtr platformProcessStats(Span<uint8_t> mem) override;\n+\n+\tvoid handleControls(const ControlList &controls) override;\n+\tbool validateIspControls();\n+\n+\tvoid applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls);\n+\tvoid applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls);\n+\tvoid applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls);\n+\tvoid applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls);\n+\tvoid applyGamma(const struct ContrastStatus 
*contrastStatus, ControlList &ctrls);\n+\tvoid applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls);\n+\tvoid applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls);\n+\tvoid applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls);\n+\tvoid applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls);\n+\tvoid applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls);\n+\tvoid applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls);\n+\tvoid resampleTable(uint16_t dest[], const std::vector<double> &src, int destW, int destH);\n+\n+\t/* VC4 ISP controls. */\n+\tControlInfoMap ispCtrls_;\n+\n+\t/* LS table allocation passed in from the pipeline handler. */\n+\tSharedFD lsTableHandle_;\n+\tvoid *lsTable_;\n+};\n+\n+int32_t IpaVc4::platformInit([[maybe_unused]] const InitParams &params, [[maybe_unused]] InitResult *result)\n+{\n+\tconst std::string &target = controller_.getTarget();\n+\n+\tif (target != \"bcm2835\") {\n+\t\tLOG(IPARPI, Error)\n+\t\t\t<< \"Tuning data file target returned \\\"\" << target << \"\\\"\"\n+\t\t\t<< \", expected \\\"bcm2835\\\"\";\n+\t\treturn -EINVAL;\n+\t}\n+\n+\treturn 0;\n+}\n+\n+int32_t IpaVc4::platformConfigure(const ConfigParams &params, [[maybe_unused]] ConfigResult *result)\n+{\n+\tispCtrls_ = params.ispControls;\n+\tif (!validateIspControls()) {\n+\t\tLOG(IPARPI, Error) << \"ISP control validation failed.\";\n+\t\treturn -1;\n+\t}\n+\n+\t/* Store the lens shading table pointer and handle if available. */\n+\tif (params.lsTableHandle.isValid()) {\n+\t\t/* Remove any previous table, if there was one. */\n+\t\tif (lsTable_) {\n+\t\t\tmunmap(lsTable_, MaxLsGridSize);\n+\t\t\tlsTable_ = nullptr;\n+\t\t}\n+\n+\t\t/* Map the LS table buffer into user space. 
*/\n+\t\tlsTableHandle_ = std::move(params.lsTableHandle);\n+\t\tif (lsTableHandle_.isValid()) {\n+\t\t\tlsTable_ = mmap(nullptr, MaxLsGridSize, PROT_READ | PROT_WRITE,\n+\t\t\t\t\tMAP_SHARED, lsTableHandle_.get(), 0);\n+\n+\t\t\tif (lsTable_ == MAP_FAILED) {\n+\t\t\t\tLOG(IPARPI, Error) << \"dmaHeap mmap failure for LS table.\";\n+\t\t\t\tlsTable_ = nullptr;\n+\t\t\t}\n+\t\t}\n+\t}\n+\n+\treturn 0;\n+}\n+\n+void IpaVc4::platformPrepareIsp([[maybe_unused]] const PrepareParams &params,\n+\t\t\t\tRPiController::Metadata &rpiMetadata)\n+{\n+\tControlList ctrls(ispCtrls_);\n+\n+\t/* Lock the metadata buffer to avoid constant locks/unlocks. */\n+\tstd::unique_lock<RPiController::Metadata> lock(rpiMetadata);\n+\n+\tAwbStatus *awbStatus = rpiMetadata.getLocked<AwbStatus>(\"awb.status\");\n+\tif (awbStatus)\n+\t\tapplyAWB(awbStatus, ctrls);\n+\n+\tCcmStatus *ccmStatus = rpiMetadata.getLocked<CcmStatus>(\"ccm.status\");\n+\tif (ccmStatus)\n+\t\tapplyCCM(ccmStatus, ctrls);\n+\n+\tAgcStatus *dgStatus = rpiMetadata.getLocked<AgcStatus>(\"agc.status\");\n+\tif (dgStatus)\n+\t\tapplyDG(dgStatus, ctrls);\n+\n+\tAlscStatus *lsStatus = rpiMetadata.getLocked<AlscStatus>(\"alsc.status\");\n+\tif (lsStatus)\n+\t\tapplyLS(lsStatus, ctrls);\n+\n+\tContrastStatus *contrastStatus = rpiMetadata.getLocked<ContrastStatus>(\"contrast.status\");\n+\tif (contrastStatus)\n+\t\tapplyGamma(contrastStatus, ctrls);\n+\n+\tBlackLevelStatus *blackLevelStatus = rpiMetadata.getLocked<BlackLevelStatus>(\"black_level.status\");\n+\tif (blackLevelStatus)\n+\t\tapplyBlackLevel(blackLevelStatus, ctrls);\n+\n+\tGeqStatus *geqStatus = rpiMetadata.getLocked<GeqStatus>(\"geq.status\");\n+\tif (geqStatus)\n+\t\tapplyGEQ(geqStatus, ctrls);\n+\n+\tDenoiseStatus *denoiseStatus = rpiMetadata.getLocked<DenoiseStatus>(\"denoise.status\");\n+\tif (denoiseStatus)\n+\t\tapplyDenoise(denoiseStatus, ctrls);\n+\n+\tSharpenStatus *sharpenStatus = rpiMetadata.getLocked<SharpenStatus>(\"sharpen.status\");\n+\tif 
(sharpenStatus)\n+\t\tapplySharpen(sharpenStatus, ctrls);\n+\n+\tDpcStatus *dpcStatus = rpiMetadata.getLocked<DpcStatus>(\"dpc.status\");\n+\tif (dpcStatus)\n+\t\tapplyDPC(dpcStatus, ctrls);\n+\n+\tconst AfStatus *afStatus = rpiMetadata.getLocked<AfStatus>(\"af.status\");\n+\tif (afStatus) {\n+\t\tControlList lensctrls(lensCtrls_);\n+\t\tapplyAF(afStatus, lensctrls);\n+\t\tif (!lensctrls.empty())\n+\t\t\tsetLensControls.emit(lensctrls);\n+\t}\n+\n+\tif (!ctrls.empty())\n+\t\tsetIspControls.emit(ctrls);\n+}\n+\n+RPiController::StatisticsPtr IpaVc4::platformProcessStats(Span<uint8_t> mem)\n+{\n+\tusing namespace RPiController;\n+\n+\tconst bcm2835_isp_stats *stats = reinterpret_cast<bcm2835_isp_stats *>(mem.data());\n+\tStatisticsPtr statistics = std::make_unique<Statistics>(Statistics::AgcStatsPos::PreWb,\n+\t\t\t\t\t\t\t\tStatistics::ColourStatsPos::PostLsc);\n+\tconst Controller::HardwareConfig &hw = controller_.getHardwareConfig();\n+\tunsigned int i;\n+\n+\t/* RGB histograms are not used, so do not populate them. */\n+\tstatistics->yHist = RPiController::Histogram(stats->hist[0].g_hist,\n+\t\t\t\t\t\t     hw.numHistogramBins);\n+\n+\t/* All region sums are based on a 16-bit normalised pipeline bit-depth. 
*/\n+\tunsigned int scale = Statistics::NormalisationFactorPow2 - hw.pipelineWidth;\n+\n+\tstatistics->awbRegions.init(hw.awbRegions);\n+\tfor (i = 0; i < statistics->awbRegions.numRegions(); i++)\n+\t\tstatistics->awbRegions.set(i, { { stats->awb_stats[i].r_sum << scale,\n+\t\t\t\t\t\t  stats->awb_stats[i].g_sum << scale,\n+\t\t\t\t\t\t  stats->awb_stats[i].b_sum << scale },\n+\t\t\t\t\t\tstats->awb_stats[i].counted,\n+\t\t\t\t\t\tstats->awb_stats[i].notcounted });\n+\n+\tstatistics->agcRegions.init(hw.agcRegions);\n+\tfor (i = 0; i < statistics->agcRegions.numRegions(); i++)\n+\t\tstatistics->agcRegions.set(i, { { stats->agc_stats[i].r_sum << scale,\n+\t\t\t\t\t\t  stats->agc_stats[i].g_sum << scale,\n+\t\t\t\t\t\t  stats->agc_stats[i].b_sum << scale },\n+\t\t\t\t\t\tstats->agc_stats[i].counted,\n+\t\t\t\t\t\tstats->agc_stats[i].notcounted });\n+\n+\tstatistics->focusRegions.init(hw.focusRegions);\n+\tfor (i = 0; i < statistics->focusRegions.numRegions(); i++)\n+\t\tstatistics->focusRegions.set(i, { stats->focus_stats[i].contrast_val[1][1] / 1000,\n+\t\t\t\t\t\t  stats->focus_stats[i].contrast_val_num[1][1],\n+\t\t\t\t\t\t  stats->focus_stats[i].contrast_val_num[1][0] });\n+\n+\treturn statistics;\n+}\n+\n+void IpaVc4::handleControls([[maybe_unused]] const ControlList &controls)\n+{\n+\t/* No controls require any special updates to the hardware configuration. 
*/\n+}\n+\n+bool IpaVc4::validateIspControls()\n+{\n+\tstatic const uint32_t ctrls[] = {\n+\t\tV4L2_CID_RED_BALANCE,\n+\t\tV4L2_CID_BLUE_BALANCE,\n+\t\tV4L2_CID_DIGITAL_GAIN,\n+\t\tV4L2_CID_USER_BCM2835_ISP_CC_MATRIX,\n+\t\tV4L2_CID_USER_BCM2835_ISP_GAMMA,\n+\t\tV4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL,\n+\t\tV4L2_CID_USER_BCM2835_ISP_GEQ,\n+\t\tV4L2_CID_USER_BCM2835_ISP_DENOISE,\n+\t\tV4L2_CID_USER_BCM2835_ISP_SHARPEN,\n+\t\tV4L2_CID_USER_BCM2835_ISP_DPC,\n+\t\tV4L2_CID_USER_BCM2835_ISP_LENS_SHADING,\n+\t\tV4L2_CID_USER_BCM2835_ISP_CDN,\n+\t};\n+\n+\tfor (auto c : ctrls) {\n+\t\tif (ispCtrls_.find(c) == ispCtrls_.end()) {\n+\t\t\tLOG(IPARPI, Error) << \"Unable to find ISP control \"\n+\t\t\t\t\t   << utils::hex(c);\n+\t\t\treturn false;\n+\t\t}\n+\t}\n+\n+\treturn true;\n+}\n+\n+void IpaVc4::applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls)\n+{\n+\tLOG(IPARPI, Debug) << \"Applying WB R: \" << awbStatus->gainR << \" B: \"\n+\t\t\t   << awbStatus->gainB;\n+\n+\tctrls.set(V4L2_CID_RED_BALANCE,\n+\t\t  static_cast<int32_t>(awbStatus->gainR * 1000));\n+\tctrls.set(V4L2_CID_BLUE_BALANCE,\n+\t\t  static_cast<int32_t>(awbStatus->gainB * 1000));\n+}\n+\n+void IpaVc4::applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls)\n+{\n+\tctrls.set(V4L2_CID_DIGITAL_GAIN,\n+\t\t  static_cast<int32_t>(dgStatus->digitalGain * 1000));\n+}\n+\n+void IpaVc4::applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls)\n+{\n+\tbcm2835_isp_custom_ccm ccm;\n+\n+\tfor (int i = 0; i < 9; i++) {\n+\t\tccm.ccm.ccm[i / 3][i % 3].den = 1000;\n+\t\tccm.ccm.ccm[i / 3][i % 3].num = 1000 * ccmStatus->matrix[i];\n+\t}\n+\n+\tccm.enabled = 1;\n+\tccm.ccm.offsets[0] = ccm.ccm.offsets[1] = ccm.ccm.offsets[2] = 0;\n+\n+\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&ccm),\n+\t\t\t\t\t    sizeof(ccm) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, c);\n+}\n+\n+void IpaVc4::applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList 
&ctrls)\n+{\n+\tbcm2835_isp_black_level blackLevel;\n+\n+\tblackLevel.enabled = 1;\n+\tblackLevel.black_level_r = blackLevelStatus->blackLevelR;\n+\tblackLevel.black_level_g = blackLevelStatus->blackLevelG;\n+\tblackLevel.black_level_b = blackLevelStatus->blackLevelB;\n+\n+\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&blackLevel),\n+\t\t\t\t\t    sizeof(blackLevel) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, c);\n+}\n+\n+void IpaVc4::applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls)\n+{\n+\tconst unsigned int numGammaPoints = controller_.getHardwareConfig().numGammaPoints;\n+\tstruct bcm2835_isp_gamma gamma;\n+\n+\tfor (unsigned int i = 0; i < numGammaPoints - 1; i++) {\n+\t\tint x = i < 16 ? i * 1024\n+\t\t\t       : (i < 24 ? (i - 16) * 2048 + 16384\n+\t\t\t\t\t : (i - 24) * 4096 + 32768);\n+\t\tgamma.x[i] = x;\n+\t\tgamma.y[i] = std::min<uint16_t>(65535, contrastStatus->gammaCurve.eval(x));\n+\t}\n+\n+\tgamma.x[numGammaPoints - 1] = 65535;\n+\tgamma.y[numGammaPoints - 1] = 65535;\n+\tgamma.enabled = 1;\n+\n+\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&gamma),\n+\t\t\t\t\t    sizeof(gamma) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_GAMMA, c);\n+}\n+\n+void IpaVc4::applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls)\n+{\n+\tbcm2835_isp_geq geq;\n+\n+\tgeq.enabled = 1;\n+\tgeq.offset = geqStatus->offset;\n+\tgeq.slope.den = 1000;\n+\tgeq.slope.num = 1000 * geqStatus->slope;\n+\n+\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&geq),\n+\t\t\t\t\t    sizeof(geq) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_GEQ, c);\n+}\n+\n+void IpaVc4::applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls)\n+{\n+\tusing RPiController::DenoiseMode;\n+\n+\tbcm2835_isp_denoise denoise;\n+\tDenoiseMode mode = static_cast<DenoiseMode>(denoiseStatus->mode);\n+\n+\tdenoise.enabled = mode != DenoiseMode::Off;\n+\tdenoise.constant = 
denoiseStatus->noiseConstant;\n+\tdenoise.slope.num = 1000 * denoiseStatus->noiseSlope;\n+\tdenoise.slope.den = 1000;\n+\tdenoise.strength.num = 1000 * denoiseStatus->strength;\n+\tdenoise.strength.den = 1000;\n+\n+\t/* Set the CDN mode to match the SDN operating mode. */\n+\tbcm2835_isp_cdn cdn;\n+\tswitch (mode) {\n+\tcase DenoiseMode::ColourFast:\n+\t\tcdn.enabled = 1;\n+\t\tcdn.mode = CDN_MODE_FAST;\n+\t\tbreak;\n+\tcase DenoiseMode::ColourHighQuality:\n+\t\tcdn.enabled = 1;\n+\t\tcdn.mode = CDN_MODE_HIGH_QUALITY;\n+\t\tbreak;\n+\tdefault:\n+\t\tcdn.enabled = 0;\n+\t}\n+\n+\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&denoise),\n+\t\t\t\t\t    sizeof(denoise) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_DENOISE, c);\n+\n+\tc = ControlValue(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&cdn),\n+\t\t\t\t\t      sizeof(cdn) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_CDN, c);\n+}\n+\n+void IpaVc4::applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls)\n+{\n+\tbcm2835_isp_sharpen sharpen;\n+\n+\tsharpen.enabled = 1;\n+\tsharpen.threshold.num = 1000 * sharpenStatus->threshold;\n+\tsharpen.threshold.den = 1000;\n+\tsharpen.strength.num = 1000 * sharpenStatus->strength;\n+\tsharpen.strength.den = 1000;\n+\tsharpen.limit.num = 1000 * sharpenStatus->limit;\n+\tsharpen.limit.den = 1000;\n+\n+\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&sharpen),\n+\t\t\t\t\t    sizeof(sharpen) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_SHARPEN, c);\n+}\n+\n+void IpaVc4::applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls)\n+{\n+\tbcm2835_isp_dpc dpc;\n+\n+\tdpc.enabled = 1;\n+\tdpc.strength = dpcStatus->strength;\n+\n+\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&dpc),\n+\t\t\t\t\t    sizeof(dpc) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_DPC, c);\n+}\n+\n+void IpaVc4::applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls)\n+{\n+\t/*\n+\t * Program lens shading tables into 
pipeline.\n+\t * Choose smallest cell size that won't exceed 63x48 cells.\n+\t */\n+\tconst int cellSizes[] = { 16, 32, 64, 128, 256 };\n+\tunsigned int numCells = std::size(cellSizes);\n+\tunsigned int i, w, h, cellSize;\n+\tfor (i = 0; i < numCells; i++) {\n+\t\tcellSize = cellSizes[i];\n+\t\tw = (mode_.width + cellSize - 1) / cellSize;\n+\t\th = (mode_.height + cellSize - 1) / cellSize;\n+\t\tif (w < 64 && h <= 48)\n+\t\t\tbreak;\n+\t}\n+\n+\tif (i == numCells) {\n+\t\tLOG(IPARPI, Error) << \"Cannot find cell size\";\n+\t\treturn;\n+\t}\n+\n+\t/* We're going to supply corner sampled tables, 16 bit samples. */\n+\tw++, h++;\n+\tbcm2835_isp_lens_shading ls = {\n+\t\t.enabled = 1,\n+\t\t.grid_cell_size = cellSize,\n+\t\t.grid_width = w,\n+\t\t.grid_stride = w,\n+\t\t.grid_height = h,\n+\t\t/* .dmabuf will be filled in by pipeline handler. */\n+\t\t.dmabuf = 0,\n+\t\t.ref_transform = 0,\n+\t\t.corner_sampled = 1,\n+\t\t.gain_format = GAIN_FORMAT_U4P10\n+\t};\n+\n+\tif (!lsTable_ || w * h * 4 * sizeof(uint16_t) > MaxLsGridSize) {\n+\t\tLOG(IPARPI, Error) << \"Do not have a correctly allocated lens shading table!\";\n+\t\treturn;\n+\t}\n+\n+\tif (lsStatus) {\n+\t\t/* Format will be u4.10 */\n+\t\tuint16_t *grid = static_cast<uint16_t *>(lsTable_);\n+\n+\t\tresampleTable(grid, lsStatus->r, w, h);\n+\t\tresampleTable(grid + w * h, lsStatus->g, w, h);\n+\t\tmemcpy(grid + 2 * w * h, grid + w * h, w * h * sizeof(uint16_t));\n+\t\tresampleTable(grid + 3 * w * h, lsStatus->b, w, h);\n+\t}\n+\n+\tControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&ls),\n+\t\t\t\t\t    sizeof(ls) });\n+\tctrls.set(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, c);\n+}\n+\n+void IpaVc4::applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls)\n+{\n+\tif (afStatus->lensSetting) {\n+\t\tControlValue v(afStatus->lensSetting.value());\n+\t\tlensCtrls.set(V4L2_CID_FOCUS_ABSOLUTE, v);\n+\t}\n+}\n+\n+/*\n+ * Resamples a 16x12 table with central sampling to destW x destH with corner\n+ * 
sampling.\n+ */\n+void IpaVc4::resampleTable(uint16_t dest[], const std::vector<double> &src,\n+\t\t\t   int destW, int destH)\n+{\n+\t/*\n+\t * Precalculate and cache the x sampling locations and phases to\n+\t * save recomputing them on every row.\n+\t */\n+\tassert(destW > 1 && destH > 1 && destW <= 64);\n+\tint xLo[64], xHi[64];\n+\tdouble xf[64];\n+\tdouble x = -0.5, xInc = 16.0 / (destW - 1);\n+\tfor (int i = 0; i < destW; i++, x += xInc) {\n+\t\txLo[i] = floor(x);\n+\t\txf[i] = x - xLo[i];\n+\t\txHi[i] = xLo[i] < 15 ? xLo[i] + 1 : 15;\n+\t\txLo[i] = xLo[i] > 0 ? xLo[i] : 0;\n+\t}\n+\n+\t/* Now march over the output table generating the new values. */\n+\tdouble y = -0.5, yInc = 12.0 / (destH - 1);\n+\tfor (int j = 0; j < destH; j++, y += yInc) {\n+\t\tint yLo = floor(y);\n+\t\tdouble yf = y - yLo;\n+\t\tint yHi = yLo < 11 ? yLo + 1 : 11;\n+\t\tyLo = yLo > 0 ? yLo : 0;\n+\t\tdouble const *rowAbove = src.data() + yLo * 16;\n+\t\tdouble const *rowBelow = src.data() + yHi * 16;\n+\t\tfor (int i = 0; i < destW; i++) {\n+\t\t\tdouble above = rowAbove[xLo[i]] * (1 - xf[i]) + rowAbove[xHi[i]] * xf[i];\n+\t\t\tdouble below = rowBelow[xLo[i]] * (1 - xf[i]) + rowBelow[xHi[i]] * xf[i];\n+\t\t\tint result = floor(1024 * (above * (1 - yf) + below * yf) + .5);\n+\t\t\t*(dest++) = result > 16383 ? 16383 : result; /* want u4.10 */\n+\t\t}\n+\t}\n+}\n+\n+} /* namespace ipa::RPi */\n+\n+/*\n+ * External IPA module interface\n+ */\n+extern \"C\" {\n+const struct IPAModuleInfo ipaModuleInfo = {\n+\tIPA_MODULE_API_VERSION,\n+\t1,\n+\t\"PipelineHandlerRPi\",\n+\t\"vc4\",\n+};\n+\n+IPAInterface *ipaCreate()\n+{\n+\treturn new ipa::RPi::IpaVc4();\n+}\n+\n+} /* extern \"C\" */\n+\n+} /* namespace libcamera */\n","prefixes":["libcamera-devel","08/13"]}