From patchwork Wed Apr 26 13:10:45 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18549
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:45 +0100
Message-Id: <20230426131057.21550-2-naush@raspberrypi.com>
In-Reply-To:
<20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 01/13] meson: ipa: Add mapping for pipeline handler to mojom interface file
From: Naushir Patuck
Reply-To: Naushir Patuck

Allow an arbitrary mapping between the pipeline handler and IPA mojom
interface file in the build system. This removes the 1:1 mapping of
pipeline handler name to mojom filename, and allows more flexibility
for pipeline developers.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 Documentation/guides/ipa.rst      | 19 ++++++++--------
 include/libcamera/ipa/meson.build | 36 ++++++++++++++++++++-----------
 2 files changed, 33 insertions(+), 22 deletions(-)

diff --git a/Documentation/guides/ipa.rst b/Documentation/guides/ipa.rst
index fc0317451e24..89839408672a 100644
--- a/Documentation/guides/ipa.rst
+++ b/Documentation/guides/ipa.rst
@@ -269,35 +269,36 @@ The following is an example of an event interface definition:
 Compiling the IPA interface
 ---------------------------

-After the IPA interface is defined in include/libcamera/ipa/{pipeline_name}.mojom,
+After the IPA interface is defined in include/libcamera/ipa/{interface_name}.mojom,
 an entry for it must be added in meson so that it can be compiled. The filename
-must be added to the ipa_mojom_files object in include/libcamera/ipa/meson.build.
+must be added to the pipeline_ipa_mojom_mapping object in include/libcamera/ipa/meson.build.
+This object maps the pipeline handler name with an ipa interface file.

 For example, adding the raspberrypi.mojom file to meson:

 .. code-block:: none

-    ipa_mojom_files = [
-        'raspberrypi.mojom',
+    pipeline_ipa_mojom_mapping = [
+        'raspberrypi': 'raspberrypi.mojom',
     ]

 This will cause the mojo data definition file to be compiled. Specifically, it
 generates five files:

 - a header describing the custom data structures, and the complete IPA
-  interface (at {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_interface.h)
+  interface (at {$build_dir}/include/libcamera/ipa/{interface}_ipa_interface.h)

 - a serializer implementing de/serialization for the custom data structures (at
-  {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_serializer.h)
+  {$build_dir}/include/libcamera/ipa/{interface}_ipa_serializer.h)

 - a proxy header describing a specialized IPA proxy (at
-  {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_proxy.h)
+  {$build_dir}/include/libcamera/ipa/{interface}_ipa_proxy.h)

 - a proxy source implementing the IPA proxy (at
-  {$build_dir}/src/libcamera/proxy/{pipeline}_ipa_proxy.cpp)
+  {$build_dir}/src/libcamera/proxy/{interface}_ipa_proxy.cpp)

 - a proxy worker source implementing the other end of the IPA proxy (at
-  {$build_dir}/src/libcamera/proxy/worker/{pipeline}_ipa_proxy_worker.cpp)
+  {$build_dir}/src/libcamera/proxy/worker/{interface}_ipa_proxy_worker.cpp)

 The IPA proxy serves as the layer between the pipeline handler and the IPA, and
 handles threading vs isolation transparently.
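The mapping introduced by this patch drives a meson foreach that compiles each mojom module at most once, and only when its pipeline handler is enabled. The selection logic can be sketched as a standalone C++ function; all names below are illustrative, not libcamera or meson API:

```cpp
#include <cassert>
#include <set>
#include <string>
#include <utility>
#include <vector>

/*
 * Sketch of the foreach over pipeline_ipa_mojom_mapping: a mojom file is
 * marked as handled before the enabled-pipeline check, so each module is
 * built at most once even when several pipelines share the same file.
 */
std::vector<std::string>
mojomsToBuild(const std::vector<std::pair<std::string, std::string>> &mapping,
              const std::set<std::string> &enabledPipelines)
{
	std::vector<std::string> build;
	std::set<std::string> seen;

	for (const auto &[pipeline, file] : mapping) {
		if (!seen.insert(file).second)
			continue; /* duplicate mojom module, already handled */
		if (!enabledPipelines.count(pipeline))
			continue; /* pipeline not selected in this build */
		build.push_back(file);
	}

	return build;
}
```

With the mapping from this patch and only one pipeline enabled, only that pipeline's mojom file is selected; a file shared by two enabled pipelines is still selected once.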
 The pipeline handler and the IPA

diff --git a/include/libcamera/ipa/meson.build b/include/libcamera/ipa/meson.build
index 442ca3dd7e1c..67c31cb04ccf 100644
--- a/include/libcamera/ipa/meson.build
+++ b/include/libcamera/ipa/meson.build
@@ -60,13 +60,15 @@ libcamera_generated_ipa_headers += custom_target('core_ipa_serializer_h',
         './' +'@INPUT@'
     ])

-ipa_mojom_files = [
-    'ipu3.mojom',
-    'raspberrypi.mojom',
-    'rkisp1.mojom',
-    'vimc.mojom',
-]
-
+# Mapping from pipeline handler name to mojom file
+pipeline_ipa_mojom_mapping = {
+    'ipu3': 'ipu3.mojom',
+    'rkisp1': 'rkisp1.mojom',
+    'raspberrypi': 'raspberrypi.mojom',
+    'vimc': 'vimc.mojom',
+}
+
+ipa_mojom_files = []
 ipa_mojoms = []

 #
@@ -75,14 +77,22 @@ ipa_mojoms = []

 # TODO Define per-pipeline ControlInfoMap with yaml?

-foreach file : ipa_mojom_files
+foreach pipeline, file : pipeline_ipa_mojom_mapping
+    name = file.split('.')[0]

-    if name not in pipelines
+    # Ensure we do not build duplicate mojom modules
+    if (file in ipa_mojom_files)
+        continue
+    endif
+
+    ipa_mojom_files += file
+
+    if pipeline not in pipelines
         continue
     endif

-    # {pipeline}.mojom-module
+    # {interface}.mojom-module
     mojom = custom_target(name + '_mojom_module',
                           input : file,
                           output : file + '-module',
@@ -94,7 +104,7 @@ foreach file : ipa_mojom_files
                               '--mojoms', '@INPUT@'
                           ])

-    # {pipeline}_ipa_interface.h
+    # {interface}_ipa_interface.h
     header = custom_target(name + '_ipa_interface_h',
                            input : mojom,
                            output : name + '_ipa_interface.h',
@@ -110,7 +120,7 @@ foreach file : ipa_mojom_files
                               './' +'@INPUT@'
                           ])

-    # {pipeline}_ipa_serializer.h
+    # {interface}_ipa_serializer.h
     serializer = custom_target(name + '_ipa_serializer_h',
                                input : mojom,
                                output : name + '_ipa_serializer.h',
@@ -124,7 +134,7 @@ foreach file : ipa_mojom_files
                               './' +'@INPUT@'
                           ])

-    # {pipeline}_ipa_proxy.h
+    # {interface}_ipa_proxy.h
     proxy_header = custom_target(name + '_proxy_h',
                                  input : mojom,
                                  output : name + '_ipa_proxy.h',

From patchwork Wed Apr 26 13:10:46 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18550
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:46 +0100
Message-Id: <20230426131057.21550-3-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References:
<20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 02/13] libcamera: ipa_proxy: Allow a prefix for the configuration file
From: Naushir Patuck
Reply-To: Naushir Patuck

Add a prefix parameter to IPAProxy::configurationFile(). This prefix is
added to the search path when locating IPA configuration files in the
system directories. For example, the system directories
etc/libcamera/ipa/<prefix>/ and share/libcamera/ipa/<prefix>/ will be
used to search for the IPA configuration files.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 include/libcamera/internal/ipa_proxy.h             |  3 ++-
 src/libcamera/ipa_proxy.cpp                        | 11 +++++++----
 src/libcamera/pipeline/ipu3/ipu3.cpp               |  4 ++--
 src/libcamera/pipeline/raspberrypi/raspberrypi.cpp |  2 +-
 src/libcamera/pipeline/rkisp1/rkisp1.cpp           |  4 ++--
 src/libcamera/pipeline/vimc/vimc.cpp               |  2 +-
 test/ipa/ipa_interface_test.cpp                    |  2 +-
 7 files changed, 16 insertions(+), 12 deletions(-)

diff --git a/include/libcamera/internal/ipa_proxy.h b/include/libcamera/internal/ipa_proxy.h
index 781c8b623605..4ec357425fd3 100644
--- a/include/libcamera/internal/ipa_proxy.h
+++ b/include/libcamera/internal/ipa_proxy.h
@@ -31,7 +31,8 @@ public:

 	bool isValid() const { return valid_; }

-	std::string configurationFile(const std::string &file) const;
+	std::string configurationFile(const std::string &file,
+				      const std::string &prefix) const;

 protected:
 	std::string resolvePath(const std::string &file) const;
diff --git a/src/libcamera/ipa_proxy.cpp b/src/libcamera/ipa_proxy.cpp
index 3f2cc6b89f60..4a27b0a993fa 100644
--- a/src/libcamera/ipa_proxy.cpp
+++ b/src/libcamera/ipa_proxy.cpp
@@ -72,6 +72,7 @@ IPAProxy::~IPAProxy()
 /**
  * \brief Retrieve the absolute path to an IPA configuration file
  * \param[in] name The configuration file name
+ * \param[in] prefix The configuration directory prefix when searching system paths
  *
  * This function locates the configuration file for an IPA and returns its
  * absolute path. It searches the following directories, in order:
@@ -80,8 +81,8 @@ IPAProxy::~IPAProxy()
  *   environment variable ; or
  * - If libcamera is not installed, the src/ipa/ directory within the source
  *   tree ; otherwise
- * - The system sysconf (etc/libcamera/ipa) and the data (share/libcamera/ipa/)
- *   directories.
+ * - The system sysconf (etc/libcamera/ipa/<prefix>/) and the data
+ *   (share/libcamera/ipa/<prefix>/) directories.
 *
 * The system directories are not searched if libcamera is not installed.
 *
@@ -92,7 +93,8 @@ IPAProxy::~IPAProxy()
 * \return The full path to the IPA configuration file, or an empty string if
 * no configuration file can be found
 */
-std::string IPAProxy::configurationFile(const std::string &name) const
+std::string IPAProxy::configurationFile(const std::string &name,
+					const std::string &prefix) const
 {
 	struct stat statbuf;
 	int ret;
@@ -139,7 +141,8 @@ std::string IPAProxy::configurationFile(const std::string &name) const
 	} else {
 		/* Else look in the system locations. */
 		for (const auto &dir : utils::split(IPA_CONFIG_DIR, ":")) {
-			std::string confPath = dir + "/" + ipaName + "/" + name;
+			std::string confPath = dir + "/" + prefix + "/" +
+					       ipaName + "/" + name;
 			ret = stat(confPath.c_str(), &statbuf);
 			if (ret == 0 && (statbuf.st_mode & S_IFMT) == S_IFREG)
 				return confPath;
diff --git a/src/libcamera/pipeline/ipu3/ipu3.cpp b/src/libcamera/pipeline/ipu3/ipu3.cpp
index 355cb0cb76b8..a48d7e78d25e 100644
--- a/src/libcamera/pipeline/ipu3/ipu3.cpp
+++ b/src/libcamera/pipeline/ipu3/ipu3.cpp
@@ -1186,9 +1186,9 @@ int IPU3CameraData::loadIPA()
 	 * The API tuning file is made from the sensor name. If the tuning file
 	 * isn't found, fall back to the 'uncalibrated' file.
 	 */
-	std::string ipaTuningFile = ipa_->configurationFile(sensor->model() + ".yaml");
+	std::string ipaTuningFile = ipa_->configurationFile(sensor->model() + ".yaml", "");
 	if (ipaTuningFile.empty())
-		ipaTuningFile = ipa_->configurationFile("uncalibrated.yaml");
+		ipaTuningFile = ipa_->configurationFile("uncalibrated.yaml", "");

 	ret = ipa_->init(IPASettings{ ipaTuningFile, sensor->model() },
 			 sensorInfo, sensor->controls(), &ipaControls_);
diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
index 0060044143cc..a4fff28bf198 100644
--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
@@ -1668,7 +1668,7 @@ int RPiCameraData::loadIPA(ipa::RPi::IPAInitResult *result)
 		std::string model = sensor_->model();
 		if (isMonoSensor(sensor_))
 			model += "_mono";
-		configurationFile = ipa_->configurationFile(model + ".json");
+		configurationFile = ipa_->configurationFile(model + ".json", "");
 	} else {
 		configurationFile = std::string(configFromEnv);
 	}
diff --git a/src/libcamera/pipeline/rkisp1/rkisp1.cpp b/src/libcamera/pipeline/rkisp1/rkisp1.cpp
index 8a30fe061d04..e338cdee2a2d 100644
--- a/src/libcamera/pipeline/rkisp1/rkisp1.cpp
+++ b/src/libcamera/pipeline/rkisp1/rkisp1.cpp
@@ -349,13 +349,13 @@ int RkISP1CameraData::loadIPA(unsigned int hwRevision)
 	std::string ipaTuningFile;
 	char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RKISP1_TUNING_FILE");
 	if (!configFromEnv || *configFromEnv == '\0') {
-		ipaTuningFile = ipa_->configurationFile(sensor_->model() + ".yaml");
+		ipaTuningFile = ipa_->configurationFile(sensor_->model() + ".yaml", "");
 		/*
 		 * If the tuning file isn't found, fall back to the
 		 * 'uncalibrated' configuration file.
 		 */
 		if (ipaTuningFile.empty())
-			ipaTuningFile = ipa_->configurationFile("uncalibrated.yaml");
+			ipaTuningFile = ipa_->configurationFile("uncalibrated.yaml", "");
 	} else {
 		ipaTuningFile = std::string(configFromEnv);
 	}
diff --git a/src/libcamera/pipeline/vimc/vimc.cpp b/src/libcamera/pipeline/vimc/vimc.cpp
index 204f5ad73f6d..5fbabcc5763d 100644
--- a/src/libcamera/pipeline/vimc/vimc.cpp
+++ b/src/libcamera/pipeline/vimc/vimc.cpp
@@ -472,7 +472,7 @@ bool PipelineHandlerVimc::match(DeviceEnumerator *enumerator)
 	data->ipa_->paramsBufferReady.connect(data.get(),
 					      &VimcCameraData::paramsBufferReady);

-	std::string conf = data->ipa_->configurationFile("vimc.conf");
+	std::string conf = data->ipa_->configurationFile("vimc.conf", "");
 	Flags<ipa::vimc::TestFlag> inFlags = ipa::vimc::TestFlag::Flag2;
 	Flags<ipa::vimc::TestFlag> outFlags;
 	data->ipa_->init(IPASettings{ conf, data->sensor_->model() },
diff --git a/test/ipa/ipa_interface_test.cpp b/test/ipa/ipa_interface_test.cpp
index 051ef96e7ed2..b25de222b9b4 100644
--- a/test/ipa/ipa_interface_test.cpp
+++ b/test/ipa/ipa_interface_test.cpp
@@ -105,7 +105,7 @@ protected:
 	}

 	/* Test initialization of IPA module.
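The effect of the new prefix argument on the system search path can be sketched in isolation. The helper below is illustrative only, not the libcamera API; it shows how each system directory is extended with the prefix before the IPA name and the file name, and how an empty prefix degenerates to the old layout (with a harmless double slash on POSIX systems):

```cpp
#include <string>
#include <vector>

/*
 * Illustrative sketch of the search-path construction added to
 * IPAProxy::configurationFile(): every configured system directory is
 * extended with the caller-supplied prefix before the IPA name and the
 * configuration file name.
 */
std::vector<std::string> candidatePaths(const std::vector<std::string> &dirs,
					const std::string &prefix,
					const std::string &ipaName,
					const std::string &name)
{
	std::vector<std::string> paths;

	for (const std::string &dir : dirs)
		paths.push_back(dir + "/" + prefix + "/" + ipaName + "/" + name);

	return paths;
}
```

For example, with the hypothetical prefix "rpi/vc4", IPA name "rpi" and file "imx477.json", the sysconf directory yields etc/libcamera/ipa/rpi/vc4/rpi/imx477.json, matching the layout described in the commit message.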
 	 */
-	std::string conf = ipa_->configurationFile("vimc.conf");
+	std::string conf = ipa_->configurationFile("vimc.conf", "");
 	Flags<ipa::vimc::TestFlag> inFlags;
 	Flags<ipa::vimc::TestFlag> outFlags;
 	int ret = ipa_->init(IPASettings{ conf, "vimc" },

From patchwork Wed Apr 26 13:10:47 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18551
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:47 +0100
Message-Id: <20230426131057.21550-4-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 03/13] pipeline: raspberrypi: Refactor and move pipeline handler code
From: Naushir Patuck
Reply-To: Naushir Patuck

Split the Raspberry Pi pipeline handler code into common and VC4/BCM2835
specific file structures. The common code files now live in
src/libcamera/pipeline/rpi/common/ and the VC4 specific files in
src/libcamera/pipeline/rpi/vc4/.

To build the pipeline handler, the meson configuration option selecting
the Raspberry Pi pipeline handler has changed from "raspberrypi" to
"rpi/vc4":

    meson setup build -Dpipelines=rpi/vc4

There are no functional changes in the pipeline handler code itself.
Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 Documentation/environment_variables.rst                 | 2 +-
 Documentation/guides/introduction.rst                   | 2 +-
 Documentation/guides/ipa.rst                            | 2 +-
 Documentation/guides/pipeline-handler.rst               | 2 +-
 include/libcamera/ipa/meson.build                       | 2 +-
 meson.build                                             | 2 +-
 meson_options.txt                                       | 2 +-
 src/libcamera/pipeline/meson.build                      | 9 +++++++++
 .../{raspberrypi => rpi/common}/delayed_controls.cpp    | 0
 .../{raspberrypi => rpi/common}/delayed_controls.h      | 0
 src/libcamera/pipeline/rpi/common/meson.build           | 8 ++++++++
 .../pipeline/{raspberrypi => rpi/common}/rpi_stream.cpp | 0
 .../pipeline/{raspberrypi => rpi/common}/rpi_stream.h   | 0
 .../pipeline/{raspberrypi => rpi/vc4}/data/example.yaml | 0
 .../pipeline/{raspberrypi => rpi/vc4}/data/meson.build  | 2 +-
 .../pipeline/{raspberrypi => rpi/vc4}/dma_heaps.cpp     | 0
 .../pipeline/{raspberrypi => rpi/vc4}/dma_heaps.h       | 0
 .../pipeline/{raspberrypi => rpi/vc4}/meson.build       | 2 --
 .../pipeline/{raspberrypi => rpi/vc4}/raspberrypi.cpp   | 2 +-
 19 files changed, 26 insertions(+), 11 deletions(-)
 rename src/libcamera/pipeline/{raspberrypi => rpi/common}/delayed_controls.cpp (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/common}/delayed_controls.h (100%)
 create mode 100644 src/libcamera/pipeline/rpi/common/meson.build
 rename src/libcamera/pipeline/{raspberrypi => rpi/common}/rpi_stream.cpp (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/common}/rpi_stream.h (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/data/example.yaml (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/data/meson.build (63%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/dma_heaps.cpp (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/dma_heaps.h (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/meson.build (71%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/raspberrypi.cpp (99%)

diff --git a/Documentation/environment_variables.rst b/Documentation/environment_variables.rst
index ceeb251a2ac0..4bf38b877897 100644
--- a/Documentation/environment_variables.rst
+++ b/Documentation/environment_variables.rst
@@ -40,7 +40,7 @@ LIBCAMERA_IPA_MODULE_PATH
 LIBCAMERA_RPI_CONFIG_FILE
    Define a custom configuration file to use in the Raspberry Pi pipeline handler.

-   Example value: ``/usr/local/share/libcamera/pipeline/raspberrypi/minimal_mem.yaml``
+   Example value: ``/usr/local/share/libcamera/pipeline/rpi/vc4/minimal_mem.yaml``

 Further details
 ---------------
diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
index 2d1760c1866b..700ec2d33c30 100644
--- a/Documentation/guides/introduction.rst
+++ b/Documentation/guides/introduction.rst
@@ -288,7 +288,7 @@ with dedicated pipeline handlers:

    - Intel IPU3 (ipu3)
    - Rockchip RK3399 (rkisp1)
-   - RaspberryPi 3 and 4 (raspberrypi)
+   - RaspberryPi 3 and 4 (rpi/vc4)

 Furthermore, generic platform support is provided for the following:
diff --git a/Documentation/guides/ipa.rst b/Documentation/guides/ipa.rst
index 89839408672a..10301d89fc80 100644
--- a/Documentation/guides/ipa.rst
+++ b/Documentation/guides/ipa.rst
@@ -279,7 +279,7 @@ For example, adding the raspberrypi.mojom file to meson:
 .. code-block:: none

     pipeline_ipa_mojom_mapping = [
-        'raspberrypi': 'raspberrypi.mojom',
+        'rpi/vc4': 'raspberrypi.mojom',
     ]
diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
index 4d38fa23fbcd..7d143b0eaecb 100644
--- a/Documentation/guides/pipeline-handler.rst
+++ b/Documentation/guides/pipeline-handler.rst
@@ -183,7 +183,7 @@ to the libcamera build options in the top level ``meson_options.txt``.
    option('pipelines',
           type : 'array',
-          choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc', 'vivid'],
+          choices : ['ipu3', 'rpi/vc4', 'rkisp1', 'simple', 'uvcvideo', 'vimc', 'vivid'],
           description : 'Select which pipeline handlers to include')
diff --git a/include/libcamera/ipa/meson.build b/include/libcamera/ipa/meson.build
index 67c31cb04ccf..bae15d64d7d8 100644
--- a/include/libcamera/ipa/meson.build
+++ b/include/libcamera/ipa/meson.build
@@ -64,7 +64,7 @@ libcamera_generated_ipa_headers += custom_target('core_ipa_serializer_h',
 pipeline_ipa_mojom_mapping = {
     'ipu3': 'ipu3.mojom',
     'rkisp1': 'rkisp1.mojom',
-    'raspberrypi': 'raspberrypi.mojom',
+    'rpi/vc4': 'raspberrypi.mojom',
     'vimc': 'vimc.mojom',
 }
diff --git a/meson.build b/meson.build
index 8628e6acebee..f60da3e17719 100644
--- a/meson.build
+++ b/meson.build
@@ -179,7 +179,7 @@ arch_x86 = ['x86', 'x86_64']
 pipelines_support = {
     'imx8-isi': arch_arm,
     'ipu3': arch_x86,
-    'raspberrypi': arch_arm,
+    'rpi/vc4': arch_arm,
     'rkisp1': arch_arm,
     'simple': arch_arm,
     'uvcvideo': ['any'],
diff --git a/meson_options.txt b/meson_options.txt
index 78a78b7214d5..e1f4c205aa94 100644
--- a/meson_options.txt
+++ b/meson_options.txt
@@ -43,7 +43,7 @@ option('pipelines',
         'auto',
         'imx8-isi',
         'ipu3',
-        'raspberrypi',
+        'rpi/vc4',
         'rkisp1',
         'simple',
         'uvcvideo',
diff --git a/src/libcamera/pipeline/meson.build b/src/libcamera/pipeline/meson.build
index f14869f3a3c0..4f55611013db 100644
--- a/src/libcamera/pipeline/meson.build
+++ b/src/libcamera/pipeline/meson.build
@@ -3,6 +3,15 @@
 # Location of pipeline specific configuration files
 pipeline_data_dir = libcamera_datadir / 'pipeline'

+# If the Raspberry Pi VC4 pipeline handler is enabled, ensure we include the
+# rpi/common subdirectory in the build.
+#
+# This is done here and not within rpi/vc4/meson.build as meson does not
+# allow the subdir command to traverse up the directory tree.
+if pipelines.contains('rpi/vc4') + subdir('rpi/common') +endif + foreach pipeline : pipelines subdir(pipeline) endforeach diff --git a/src/libcamera/pipeline/raspberrypi/delayed_controls.cpp b/src/libcamera/pipeline/rpi/common/delayed_controls.cpp similarity index 100% rename from src/libcamera/pipeline/raspberrypi/delayed_controls.cpp rename to src/libcamera/pipeline/rpi/common/delayed_controls.cpp diff --git a/src/libcamera/pipeline/raspberrypi/delayed_controls.h b/src/libcamera/pipeline/rpi/common/delayed_controls.h similarity index 100% rename from src/libcamera/pipeline/raspberrypi/delayed_controls.h rename to src/libcamera/pipeline/rpi/common/delayed_controls.h diff --git a/src/libcamera/pipeline/rpi/common/meson.build b/src/libcamera/pipeline/rpi/common/meson.build new file mode 100644 index 000000000000..1dec6d3d028b --- /dev/null +++ b/src/libcamera/pipeline/rpi/common/meson.build @@ -0,0 +1,8 @@ +# SPDX-License-Identifier: CC0-1.0 + +libcamera_sources += files([ + 'delayed_controls.cpp', + 'rpi_stream.cpp', +]) + +includes += include_directories('.') diff --git a/src/libcamera/pipeline/raspberrypi/rpi_stream.cpp b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp similarity index 100% rename from src/libcamera/pipeline/raspberrypi/rpi_stream.cpp rename to src/libcamera/pipeline/rpi/common/rpi_stream.cpp diff --git a/src/libcamera/pipeline/raspberrypi/rpi_stream.h b/src/libcamera/pipeline/rpi/common/rpi_stream.h similarity index 100% rename from src/libcamera/pipeline/raspberrypi/rpi_stream.h rename to src/libcamera/pipeline/rpi/common/rpi_stream.h diff --git a/src/libcamera/pipeline/raspberrypi/data/example.yaml b/src/libcamera/pipeline/rpi/vc4/data/example.yaml similarity index 100% rename from src/libcamera/pipeline/raspberrypi/data/example.yaml rename to src/libcamera/pipeline/rpi/vc4/data/example.yaml diff --git a/src/libcamera/pipeline/raspberrypi/data/meson.build b/src/libcamera/pipeline/rpi/vc4/data/meson.build similarity index 63% rename from 
src/libcamera/pipeline/raspberrypi/data/meson.build rename to src/libcamera/pipeline/rpi/vc4/data/meson.build index 1c70433bbcbc..a7dfa02320e5 100644 --- a/src/libcamera/pipeline/raspberrypi/data/meson.build +++ b/src/libcamera/pipeline/rpi/vc4/data/meson.build @@ -5,4 +5,4 @@ conf_files = files([ ]) install_data(conf_files, - install_dir : pipeline_data_dir / 'raspberrypi') + install_dir : pipeline_data_dir / 'rpi/vc4') diff --git a/src/libcamera/pipeline/raspberrypi/dma_heaps.cpp b/src/libcamera/pipeline/rpi/vc4/dma_heaps.cpp similarity index 100% rename from src/libcamera/pipeline/raspberrypi/dma_heaps.cpp rename to src/libcamera/pipeline/rpi/vc4/dma_heaps.cpp diff --git a/src/libcamera/pipeline/raspberrypi/dma_heaps.h b/src/libcamera/pipeline/rpi/vc4/dma_heaps.h similarity index 100% rename from src/libcamera/pipeline/raspberrypi/dma_heaps.h rename to src/libcamera/pipeline/rpi/vc4/dma_heaps.h diff --git a/src/libcamera/pipeline/raspberrypi/meson.build b/src/libcamera/pipeline/rpi/vc4/meson.build similarity index 71% rename from src/libcamera/pipeline/raspberrypi/meson.build rename to src/libcamera/pipeline/rpi/vc4/meson.build index 59e8fb18c555..228823f30922 100644 --- a/src/libcamera/pipeline/raspberrypi/meson.build +++ b/src/libcamera/pipeline/rpi/vc4/meson.build @@ -1,10 +1,8 @@ # SPDX-License-Identifier: CC0-1.0 libcamera_sources += files([ - 'delayed_controls.cpp', 'dma_heaps.cpp', 'raspberrypi.cpp', - 'rpi_stream.cpp', ]) subdir('data') diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp similarity index 99% rename from src/libcamera/pipeline/raspberrypi/raspberrypi.cpp rename to src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp index a4fff28bf198..4595773d2517 100644 --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp +++ b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp @@ -2,7 +2,7 @@ /* * Copyright (C) 2019-2021, Raspberry Pi Ltd * - * raspberrypi.cpp - Pipeline handler for Raspberry Pi 
devices + * raspberrypi.cpp - Pipeline handler for VC4 based Raspberry Pi devices */ #include #include From patchwork Wed Apr 26 13:10:48 2023 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Naushir Patuck X-Patchwork-Id: 18552 To: libcamera-devel@lists.libcamera.org Date: Wed, 26 Apr 2023 14:10:48 +0100 Message-Id:
<20230426131057.21550-5-naush@raspberrypi.com> X-Mailer: git-send-email 2.34.1 In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com> References: <20230426131057.21550-1-naush@raspberrypi.com> Subject: [libcamera-devel] [PATCH 04/13] ipa: raspberrypi: Refactor and move IPA code From: Naushir Patuck Reply-To: Naushir Patuck Split the Raspberry Pi IPA code into common and VC4/BCM2835 specific file structures. The common code files now live in src/ipa/rpi/{cam_helper,controller}/ and the VC4 specific files in src/ipa/rpi/vc4/. To build the IPA, the meson configuration option to select the Raspberry Pi IPA has now changed from "raspberrypi" to "rpi/vc4": meson setup build -Dipas=rpi/vc4 With this change, the camera tuning files are now installed under share/libcamera/ipa/rpi/vc4/ Signed-off-by: Naushir Patuck Reviewed-by: Jacopo Mondi --- Documentation/environment_variables.rst | 2 +- meson_options.txt | 2 +- src/ipa/meson.build | 10 +++ src/ipa/raspberrypi/meson.build | 66 ------------------- src/ipa/{raspberrypi => rpi}/README.md | 0 .../cam_helper}/cam_helper.cpp | 0 .../cam_helper}/cam_helper.h | 2 +- .../cam_helper}/cam_helper_imx219.cpp | 0 .../cam_helper}/cam_helper_imx290.cpp | 0 .../cam_helper}/cam_helper_imx296.cpp | 0 .../cam_helper}/cam_helper_imx477.cpp | 0 .../cam_helper}/cam_helper_imx519.cpp | 0 .../cam_helper}/cam_helper_imx708.cpp | 0 .../cam_helper}/cam_helper_ov5647.cpp | 0 .../cam_helper}/cam_helper_ov9281.cpp | 0 .../cam_helper}/md_parser.h | 0 .../cam_helper}/md_parser_smia.cpp | 0 src/ipa/rpi/cam_helper/meson.build | 14 ++++ .../controller/af_algorithm.h | 0 .../controller/af_status.h | 0
.../controller/agc_algorithm.h | 0 .../controller/agc_status.h | 0 .../controller/algorithm.cpp | 0 .../controller/algorithm.h | 0 .../controller/alsc_status.h | 0 .../controller/awb_algorithm.h | 0 .../controller/awb_status.h | 0 .../controller/black_level_status.h | 0 .../controller/camera_mode.h | 0 .../controller/ccm_algorithm.h | 0 .../controller/ccm_status.h | 0 .../controller/contrast_algorithm.h | 0 .../controller/contrast_status.h | 0 .../controller/controller.cpp | 0 .../controller/controller.h | 0 .../controller/denoise_algorithm.h | 0 .../controller/denoise_status.h | 0 .../controller/device_status.cpp | 0 .../controller/device_status.h | 0 .../controller/dpc_status.h | 0 .../controller/geq_status.h | 0 .../controller/histogram.cpp | 0 .../controller/histogram.h | 0 .../controller/lux_status.h | 0 src/ipa/rpi/controller/meson.build | 22 +++++++ .../controller/metadata.h | 0 .../controller/noise_status.h | 0 .../controller/pdaf_data.h | 0 .../{raspberrypi => rpi}/controller/pwl.cpp | 0 src/ipa/{raspberrypi => rpi}/controller/pwl.h | 0 .../controller/region_stats.h | 0 .../controller/rpi/af.cpp | 0 .../{raspberrypi => rpi}/controller/rpi/af.h | 0 .../controller/rpi/agc.cpp | 0 .../{raspberrypi => rpi}/controller/rpi/agc.h | 0 .../controller/rpi/alsc.cpp | 0 .../controller/rpi/alsc.h | 0 .../controller/rpi/awb.cpp | 0 .../{raspberrypi => rpi}/controller/rpi/awb.h | 0 .../controller/rpi/black_level.cpp | 0 .../controller/rpi/black_level.h | 0 .../controller/rpi/ccm.cpp | 0 .../{raspberrypi => rpi}/controller/rpi/ccm.h | 0 .../controller/rpi/contrast.cpp | 0 .../controller/rpi/contrast.h | 0 .../controller/rpi/dpc.cpp | 0 .../{raspberrypi => rpi}/controller/rpi/dpc.h | 0 .../controller/rpi/focus.h | 0 .../controller/rpi/geq.cpp | 0 .../{raspberrypi => rpi}/controller/rpi/geq.h | 0 .../controller/rpi/lux.cpp | 0 .../{raspberrypi => rpi}/controller/rpi/lux.h | 0 .../controller/rpi/noise.cpp | 0 .../controller/rpi/noise.h | 0 .../controller/rpi/sdn.cpp | 0 
.../{raspberrypi => rpi}/controller/rpi/sdn.h | 0 .../controller/rpi/sharpen.cpp | 0 .../controller/rpi/sharpen.h | 0 .../controller/sharpen_algorithm.h | 0 .../controller/sharpen_status.h | 0 .../controller}/statistics.h | 0 .../{raspberrypi => rpi/vc4}/data/imx219.json | 0 .../vc4}/data/imx219_noir.json | 0 .../{raspberrypi => rpi/vc4}/data/imx290.json | 0 .../{raspberrypi => rpi/vc4}/data/imx296.json | 0 .../vc4}/data/imx296_mono.json | 0 .../{raspberrypi => rpi/vc4}/data/imx378.json | 0 .../{raspberrypi => rpi/vc4}/data/imx477.json | 0 .../vc4}/data/imx477_noir.json | 0 .../vc4}/data/imx477_scientific.json | 0 .../vc4}/data/imx477_v1.json | 0 .../{raspberrypi => rpi/vc4}/data/imx519.json | 0 .../{raspberrypi => rpi/vc4}/data/imx708.json | 0 .../vc4}/data/imx708_noir.json | 0 .../vc4}/data/imx708_wide.json | 0 .../vc4}/data/imx708_wide_noir.json | 0 .../{raspberrypi => rpi/vc4}/data/meson.build | 2 +- .../{raspberrypi => rpi/vc4}/data/ov5647.json | 0 .../vc4}/data/ov5647_noir.json | 0 .../vc4}/data/ov9281_mono.json | 0 .../vc4}/data/se327m12.json | 0 .../vc4}/data/uncalibrated.json | 0 src/ipa/rpi/vc4/meson.build | 40 +++++++++++ .../{raspberrypi => rpi/vc4}/raspberrypi.cpp | 48 +++++++------- .../pipeline/rpi/vc4/raspberrypi.cpp | 2 +- 105 files changed, 115 insertions(+), 95 deletions(-) delete mode 100644 src/ipa/raspberrypi/meson.build rename src/ipa/{raspberrypi => rpi}/README.md (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper.cpp (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper.h (99%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx219.cpp (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx290.cpp (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx296.cpp (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx477.cpp (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx519.cpp (100%) rename src/ipa/{raspberrypi => 
rpi/cam_helper}/cam_helper_imx708.cpp (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_ov5647.cpp (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_ov9281.cpp (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/md_parser.h (100%) rename src/ipa/{raspberrypi => rpi/cam_helper}/md_parser_smia.cpp (100%) create mode 100644 src/ipa/rpi/cam_helper/meson.build rename src/ipa/{raspberrypi => rpi}/controller/af_algorithm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/af_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/agc_algorithm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/agc_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/algorithm.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/algorithm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/alsc_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/awb_algorithm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/awb_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/black_level_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/camera_mode.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/ccm_algorithm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/ccm_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/contrast_algorithm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/contrast_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/controller.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/controller.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/denoise_algorithm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/denoise_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/device_status.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/device_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/dpc_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/geq_status.h (100%) rename 
src/ipa/{raspberrypi => rpi}/controller/histogram.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/histogram.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/lux_status.h (100%) create mode 100644 src/ipa/rpi/controller/meson.build rename src/ipa/{raspberrypi => rpi}/controller/metadata.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/noise_status.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/pdaf_data.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/pwl.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/pwl.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/region_stats.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/af.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/af.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/agc.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/agc.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/alsc.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/alsc.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/awb.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/awb.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/black_level.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/black_level.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/ccm.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/ccm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/contrast.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/contrast.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/dpc.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/dpc.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/focus.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/geq.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/geq.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/lux.cpp (100%) rename src/ipa/{raspberrypi => 
rpi}/controller/rpi/lux.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/noise.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/noise.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/sdn.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/sdn.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/sharpen.cpp (100%) rename src/ipa/{raspberrypi => rpi}/controller/rpi/sharpen.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/sharpen_algorithm.h (100%) rename src/ipa/{raspberrypi => rpi}/controller/sharpen_status.h (100%) rename src/ipa/{raspberrypi => rpi/controller}/statistics.h (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx219.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx219_noir.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx290.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx296.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx296_mono.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx378.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx477.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx477_noir.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx477_scientific.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx477_v1.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx519.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx708.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx708_noir.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx708_wide.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/imx708_wide_noir.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/meson.build (89%) rename src/ipa/{raspberrypi => rpi/vc4}/data/ov5647.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/ov5647_noir.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/ov9281_mono.json (100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/se327m12.json (100%) rename 
src/ipa/{raspberrypi => rpi/vc4}/data/uncalibrated.json (100%) create mode 100644 src/ipa/rpi/vc4/meson.build rename src/ipa/{raspberrypi => rpi/vc4}/raspberrypi.cpp (98%) diff --git a/Documentation/environment_variables.rst b/Documentation/environment_variables.rst index 4bf38b877897..888707785ee1 100644 --- a/Documentation/environment_variables.rst +++ b/Documentation/environment_variables.rst @@ -143,7 +143,7 @@ contain tuning parameters for the algorithms, in JSON format. The ``LIBCAMERA_IPA_CONFIG_PATH`` variable can be used to specify custom storage locations to search for those configuration files. -`Examples `__ +`Examples `__ IPA module ~~~~~~~~~~ diff --git a/meson_options.txt b/meson_options.txt index e1f4c205aa94..596f813ea5e5 100644 --- a/meson_options.txt +++ b/meson_options.txt @@ -27,7 +27,7 @@ option('gstreamer', option('ipas', type : 'array', - choices : ['ipu3', 'raspberrypi', 'rkisp1', 'vimc'], + choices : ['ipu3', 'rpi/vc4', 'rkisp1', 'vimc'], description : 'Select which IPA modules to build') option('lc-compliance', diff --git a/src/ipa/meson.build b/src/ipa/meson.build index 76ad5b445601..10d3b44ca7b6 100644 --- a/src/ipa/meson.build +++ b/src/ipa/meson.build @@ -37,6 +37,16 @@ endif enabled_ipa_modules = [] +# If the Raspberry Pi VC4 IPA is enabled, ensure we include the rpi/cam_helper +# and rpi/controller subdirectories in the build. +# +# This is done here and not within rpi/vc4/meson.build as meson does not +# allow the subdir command to traverse up the directory tree. +if pipelines.contains('rpi/vc4') + subdir('rpi/cam_helper') + subdir('rpi/controller') +endif + # The ipa-sign-install.sh script which uses the ipa_names variable will itself # prepend MESON_INSTALL_DESTDIR_PREFIX to each ipa module name, therefore we # must not include the prefix string here. 
diff --git a/src/ipa/raspberrypi/meson.build b/src/ipa/raspberrypi/meson.build deleted file mode 100644 index de78cbd80f9c..000000000000 --- a/src/ipa/raspberrypi/meson.build +++ /dev/null @@ -1,66 +0,0 @@ -# SPDX-License-Identifier: CC0-1.0 - -ipa_name = 'ipa_rpi' - -rpi_ipa_deps = [ - libcamera_private, - libatomic, -] - -rpi_ipa_includes = [ - ipa_includes, - libipa_includes, - include_directories('controller') -] - -rpi_ipa_sources = files([ - 'raspberrypi.cpp', - 'md_parser_smia.cpp', - 'cam_helper.cpp', - 'cam_helper_ov5647.cpp', - 'cam_helper_imx219.cpp', - 'cam_helper_imx290.cpp', - 'cam_helper_imx296.cpp', - 'cam_helper_imx477.cpp', - 'cam_helper_imx519.cpp', - 'cam_helper_imx708.cpp', - 'cam_helper_ov9281.cpp', - 'controller/controller.cpp', - 'controller/histogram.cpp', - 'controller/algorithm.cpp', - 'controller/rpi/af.cpp', - 'controller/rpi/alsc.cpp', - 'controller/rpi/awb.cpp', - 'controller/rpi/sharpen.cpp', - 'controller/rpi/black_level.cpp', - 'controller/rpi/geq.cpp', - 'controller/rpi/noise.cpp', - 'controller/rpi/lux.cpp', - 'controller/rpi/agc.cpp', - 'controller/rpi/dpc.cpp', - 'controller/rpi/ccm.cpp', - 'controller/rpi/contrast.cpp', - 'controller/rpi/sdn.cpp', - 'controller/pwl.cpp', - 'controller/device_status.cpp', -]) - -mod = shared_module(ipa_name, - [rpi_ipa_sources, libcamera_generated_ipa_headers], - name_prefix : '', - include_directories : rpi_ipa_includes, - dependencies : rpi_ipa_deps, - link_with : libipa, - install : true, - install_dir : ipa_install_dir) - -if ipa_sign_module - custom_target(ipa_name + '.so.sign', - input : mod, - output : ipa_name + '.so.sign', - command : [ipa_sign, ipa_priv_key, '@INPUT@', '@OUTPUT@'], - install : false, - build_by_default : true) -endif - -subdir('data') diff --git a/src/ipa/raspberrypi/README.md b/src/ipa/rpi/README.md similarity index 100% rename from src/ipa/raspberrypi/README.md rename to src/ipa/rpi/README.md diff --git a/src/ipa/raspberrypi/cam_helper.cpp 
b/src/ipa/rpi/cam_helper/cam_helper.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper.cpp rename to src/ipa/rpi/cam_helper/cam_helper.cpp diff --git a/src/ipa/raspberrypi/cam_helper.h b/src/ipa/rpi/cam_helper/cam_helper.h similarity index 99% rename from src/ipa/raspberrypi/cam_helper.h rename to src/ipa/rpi/cam_helper/cam_helper.h index b3f8c9803094..58a4b202d5a8 100644 --- a/src/ipa/raspberrypi/cam_helper.h +++ b/src/ipa/rpi/cam_helper/cam_helper.h @@ -13,7 +13,7 @@ #include #include -#include "camera_mode.h" +#include "controller/camera_mode.h" #include "controller/controller.h" #include "controller/metadata.h" #include "md_parser.h" diff --git a/src/ipa/raspberrypi/cam_helper_imx219.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx219.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx219.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx219.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx290.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx290.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx290.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx290.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx296.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx296.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx296.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx296.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx477.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx477.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx477.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx519.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx519.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx519.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx519.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx708.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx708.cpp similarity index 100% rename from 
src/ipa/raspberrypi/cam_helper_imx708.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx708.cpp diff --git a/src/ipa/raspberrypi/cam_helper_ov5647.cpp b/src/ipa/rpi/cam_helper/cam_helper_ov5647.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_ov5647.cpp rename to src/ipa/rpi/cam_helper/cam_helper_ov5647.cpp diff --git a/src/ipa/raspberrypi/cam_helper_ov9281.cpp b/src/ipa/rpi/cam_helper/cam_helper_ov9281.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_ov9281.cpp rename to src/ipa/rpi/cam_helper/cam_helper_ov9281.cpp diff --git a/src/ipa/raspberrypi/md_parser.h b/src/ipa/rpi/cam_helper/md_parser.h similarity index 100% rename from src/ipa/raspberrypi/md_parser.h rename to src/ipa/rpi/cam_helper/md_parser.h diff --git a/src/ipa/raspberrypi/md_parser_smia.cpp b/src/ipa/rpi/cam_helper/md_parser_smia.cpp similarity index 100% rename from src/ipa/raspberrypi/md_parser_smia.cpp rename to src/ipa/rpi/cam_helper/md_parser_smia.cpp diff --git a/src/ipa/rpi/cam_helper/meson.build b/src/ipa/rpi/cam_helper/meson.build new file mode 100644 index 000000000000..aa42ae91f69d --- /dev/null +++ b/src/ipa/rpi/cam_helper/meson.build @@ -0,0 +1,14 @@ +# SPDX-License-Identifier: CC0-1.0 + +rpi_ipa_cam_helper_sources = files([ + 'cam_helper.cpp', + 'cam_helper_ov5647.cpp', + 'cam_helper_imx219.cpp', + 'cam_helper_imx290.cpp', + 'cam_helper_imx296.cpp', + 'cam_helper_imx477.cpp', + 'cam_helper_imx519.cpp', + 'cam_helper_imx708.cpp', + 'cam_helper_ov9281.cpp', + 'md_parser_smia.cpp', +]) diff --git a/src/ipa/raspberrypi/controller/af_algorithm.h b/src/ipa/rpi/controller/af_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/af_algorithm.h rename to src/ipa/rpi/controller/af_algorithm.h diff --git a/src/ipa/raspberrypi/controller/af_status.h b/src/ipa/rpi/controller/af_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/af_status.h rename to src/ipa/rpi/controller/af_status.h diff --git 
a/src/ipa/raspberrypi/controller/agc_algorithm.h b/src/ipa/rpi/controller/agc_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/agc_algorithm.h rename to src/ipa/rpi/controller/agc_algorithm.h diff --git a/src/ipa/raspberrypi/controller/agc_status.h b/src/ipa/rpi/controller/agc_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/agc_status.h rename to src/ipa/rpi/controller/agc_status.h diff --git a/src/ipa/raspberrypi/controller/algorithm.cpp b/src/ipa/rpi/controller/algorithm.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/algorithm.cpp rename to src/ipa/rpi/controller/algorithm.cpp diff --git a/src/ipa/raspberrypi/controller/algorithm.h b/src/ipa/rpi/controller/algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/algorithm.h rename to src/ipa/rpi/controller/algorithm.h diff --git a/src/ipa/raspberrypi/controller/alsc_status.h b/src/ipa/rpi/controller/alsc_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/alsc_status.h rename to src/ipa/rpi/controller/alsc_status.h diff --git a/src/ipa/raspberrypi/controller/awb_algorithm.h b/src/ipa/rpi/controller/awb_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/awb_algorithm.h rename to src/ipa/rpi/controller/awb_algorithm.h diff --git a/src/ipa/raspberrypi/controller/awb_status.h b/src/ipa/rpi/controller/awb_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/awb_status.h rename to src/ipa/rpi/controller/awb_status.h diff --git a/src/ipa/raspberrypi/controller/black_level_status.h b/src/ipa/rpi/controller/black_level_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/black_level_status.h rename to src/ipa/rpi/controller/black_level_status.h diff --git a/src/ipa/raspberrypi/controller/camera_mode.h b/src/ipa/rpi/controller/camera_mode.h similarity index 100% rename from src/ipa/raspberrypi/controller/camera_mode.h rename to 
src/ipa/rpi/controller/camera_mode.h diff --git a/src/ipa/raspberrypi/controller/ccm_algorithm.h b/src/ipa/rpi/controller/ccm_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/ccm_algorithm.h rename to src/ipa/rpi/controller/ccm_algorithm.h diff --git a/src/ipa/raspberrypi/controller/ccm_status.h b/src/ipa/rpi/controller/ccm_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/ccm_status.h rename to src/ipa/rpi/controller/ccm_status.h diff --git a/src/ipa/raspberrypi/controller/contrast_algorithm.h b/src/ipa/rpi/controller/contrast_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/contrast_algorithm.h rename to src/ipa/rpi/controller/contrast_algorithm.h diff --git a/src/ipa/raspberrypi/controller/contrast_status.h b/src/ipa/rpi/controller/contrast_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/contrast_status.h rename to src/ipa/rpi/controller/contrast_status.h diff --git a/src/ipa/raspberrypi/controller/controller.cpp b/src/ipa/rpi/controller/controller.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/controller.cpp rename to src/ipa/rpi/controller/controller.cpp diff --git a/src/ipa/raspberrypi/controller/controller.h b/src/ipa/rpi/controller/controller.h similarity index 100% rename from src/ipa/raspberrypi/controller/controller.h rename to src/ipa/rpi/controller/controller.h diff --git a/src/ipa/raspberrypi/controller/denoise_algorithm.h b/src/ipa/rpi/controller/denoise_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/denoise_algorithm.h rename to src/ipa/rpi/controller/denoise_algorithm.h diff --git a/src/ipa/raspberrypi/controller/denoise_status.h b/src/ipa/rpi/controller/denoise_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/denoise_status.h rename to src/ipa/rpi/controller/denoise_status.h diff --git a/src/ipa/raspberrypi/controller/device_status.cpp 
b/src/ipa/rpi/controller/device_status.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/device_status.cpp rename to src/ipa/rpi/controller/device_status.cpp diff --git a/src/ipa/raspberrypi/controller/device_status.h b/src/ipa/rpi/controller/device_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/device_status.h rename to src/ipa/rpi/controller/device_status.h diff --git a/src/ipa/raspberrypi/controller/dpc_status.h b/src/ipa/rpi/controller/dpc_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/dpc_status.h rename to src/ipa/rpi/controller/dpc_status.h diff --git a/src/ipa/raspberrypi/controller/geq_status.h b/src/ipa/rpi/controller/geq_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/geq_status.h rename to src/ipa/rpi/controller/geq_status.h diff --git a/src/ipa/raspberrypi/controller/histogram.cpp b/src/ipa/rpi/controller/histogram.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/histogram.cpp rename to src/ipa/rpi/controller/histogram.cpp diff --git a/src/ipa/raspberrypi/controller/histogram.h b/src/ipa/rpi/controller/histogram.h similarity index 100% rename from src/ipa/raspberrypi/controller/histogram.h rename to src/ipa/rpi/controller/histogram.h diff --git a/src/ipa/raspberrypi/controller/lux_status.h b/src/ipa/rpi/controller/lux_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/lux_status.h rename to src/ipa/rpi/controller/lux_status.h diff --git a/src/ipa/rpi/controller/meson.build b/src/ipa/rpi/controller/meson.build new file mode 100644 index 000000000000..90a67412444e --- /dev/null +++ b/src/ipa/rpi/controller/meson.build @@ -0,0 +1,22 @@ +# SPDX-License-Identifier: CC0-1.0 + +rpi_ipa_controller_sources = files([ + 'algorithm.cpp', + 'controller.cpp', + 'device_status.cpp', + 'histogram.cpp', + 'pwl.cpp', + 'rpi/af.cpp', + 'rpi/agc.cpp', + 'rpi/alsc.cpp', + 'rpi/awb.cpp', + 'rpi/black_level.cpp', + 'rpi/ccm.cpp', + 
'rpi/contrast.cpp', + 'rpi/dpc.cpp', + 'rpi/geq.cpp', + 'rpi/lux.cpp', + 'rpi/noise.cpp', + 'rpi/sdn.cpp', + 'rpi/sharpen.cpp', +]) diff --git a/src/ipa/raspberrypi/controller/metadata.h b/src/ipa/rpi/controller/metadata.h similarity index 100% rename from src/ipa/raspberrypi/controller/metadata.h rename to src/ipa/rpi/controller/metadata.h diff --git a/src/ipa/raspberrypi/controller/noise_status.h b/src/ipa/rpi/controller/noise_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/noise_status.h rename to src/ipa/rpi/controller/noise_status.h diff --git a/src/ipa/raspberrypi/controller/pdaf_data.h b/src/ipa/rpi/controller/pdaf_data.h similarity index 100% rename from src/ipa/raspberrypi/controller/pdaf_data.h rename to src/ipa/rpi/controller/pdaf_data.h diff --git a/src/ipa/raspberrypi/controller/pwl.cpp b/src/ipa/rpi/controller/pwl.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/pwl.cpp rename to src/ipa/rpi/controller/pwl.cpp diff --git a/src/ipa/raspberrypi/controller/pwl.h b/src/ipa/rpi/controller/pwl.h similarity index 100% rename from src/ipa/raspberrypi/controller/pwl.h rename to src/ipa/rpi/controller/pwl.h diff --git a/src/ipa/raspberrypi/controller/region_stats.h b/src/ipa/rpi/controller/region_stats.h similarity index 100% rename from src/ipa/raspberrypi/controller/region_stats.h rename to src/ipa/rpi/controller/region_stats.h diff --git a/src/ipa/raspberrypi/controller/rpi/af.cpp b/src/ipa/rpi/controller/rpi/af.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/af.cpp rename to src/ipa/rpi/controller/rpi/af.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/af.h b/src/ipa/rpi/controller/rpi/af.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/af.h rename to src/ipa/rpi/controller/rpi/af.h diff --git a/src/ipa/raspberrypi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/agc.cpp rename to 
src/ipa/rpi/controller/rpi/agc.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/agc.h rename to src/ipa/rpi/controller/rpi/agc.h diff --git a/src/ipa/raspberrypi/controller/rpi/alsc.cpp b/src/ipa/rpi/controller/rpi/alsc.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/alsc.cpp rename to src/ipa/rpi/controller/rpi/alsc.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/alsc.h b/src/ipa/rpi/controller/rpi/alsc.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/alsc.h rename to src/ipa/rpi/controller/rpi/alsc.h diff --git a/src/ipa/raspberrypi/controller/rpi/awb.cpp b/src/ipa/rpi/controller/rpi/awb.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/awb.cpp rename to src/ipa/rpi/controller/rpi/awb.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/awb.h b/src/ipa/rpi/controller/rpi/awb.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/awb.h rename to src/ipa/rpi/controller/rpi/awb.h diff --git a/src/ipa/raspberrypi/controller/rpi/black_level.cpp b/src/ipa/rpi/controller/rpi/black_level.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/black_level.cpp rename to src/ipa/rpi/controller/rpi/black_level.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/black_level.h b/src/ipa/rpi/controller/rpi/black_level.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/black_level.h rename to src/ipa/rpi/controller/rpi/black_level.h diff --git a/src/ipa/raspberrypi/controller/rpi/ccm.cpp b/src/ipa/rpi/controller/rpi/ccm.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/ccm.cpp rename to src/ipa/rpi/controller/rpi/ccm.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/ccm.h b/src/ipa/rpi/controller/rpi/ccm.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/ccm.h rename to 
src/ipa/rpi/controller/rpi/ccm.h diff --git a/src/ipa/raspberrypi/controller/rpi/contrast.cpp b/src/ipa/rpi/controller/rpi/contrast.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/contrast.cpp rename to src/ipa/rpi/controller/rpi/contrast.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/contrast.h b/src/ipa/rpi/controller/rpi/contrast.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/contrast.h rename to src/ipa/rpi/controller/rpi/contrast.h diff --git a/src/ipa/raspberrypi/controller/rpi/dpc.cpp b/src/ipa/rpi/controller/rpi/dpc.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/dpc.cpp rename to src/ipa/rpi/controller/rpi/dpc.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/dpc.h b/src/ipa/rpi/controller/rpi/dpc.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/dpc.h rename to src/ipa/rpi/controller/rpi/dpc.h diff --git a/src/ipa/raspberrypi/controller/rpi/focus.h b/src/ipa/rpi/controller/rpi/focus.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/focus.h rename to src/ipa/rpi/controller/rpi/focus.h diff --git a/src/ipa/raspberrypi/controller/rpi/geq.cpp b/src/ipa/rpi/controller/rpi/geq.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/geq.cpp rename to src/ipa/rpi/controller/rpi/geq.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/geq.h b/src/ipa/rpi/controller/rpi/geq.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/geq.h rename to src/ipa/rpi/controller/rpi/geq.h diff --git a/src/ipa/raspberrypi/controller/rpi/lux.cpp b/src/ipa/rpi/controller/rpi/lux.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/lux.cpp rename to src/ipa/rpi/controller/rpi/lux.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/lux.h b/src/ipa/rpi/controller/rpi/lux.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/lux.h rename to src/ipa/rpi/controller/rpi/lux.h diff --git 
a/src/ipa/raspberrypi/controller/rpi/noise.cpp b/src/ipa/rpi/controller/rpi/noise.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/noise.cpp rename to src/ipa/rpi/controller/rpi/noise.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/noise.h b/src/ipa/rpi/controller/rpi/noise.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/noise.h rename to src/ipa/rpi/controller/rpi/noise.h diff --git a/src/ipa/raspberrypi/controller/rpi/sdn.cpp b/src/ipa/rpi/controller/rpi/sdn.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/sdn.cpp rename to src/ipa/rpi/controller/rpi/sdn.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/sdn.h b/src/ipa/rpi/controller/rpi/sdn.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/sdn.h rename to src/ipa/rpi/controller/rpi/sdn.h diff --git a/src/ipa/raspberrypi/controller/rpi/sharpen.cpp b/src/ipa/rpi/controller/rpi/sharpen.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/sharpen.cpp rename to src/ipa/rpi/controller/rpi/sharpen.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/sharpen.h b/src/ipa/rpi/controller/rpi/sharpen.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/sharpen.h rename to src/ipa/rpi/controller/rpi/sharpen.h diff --git a/src/ipa/raspberrypi/controller/sharpen_algorithm.h b/src/ipa/rpi/controller/sharpen_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/sharpen_algorithm.h rename to src/ipa/rpi/controller/sharpen_algorithm.h diff --git a/src/ipa/raspberrypi/controller/sharpen_status.h b/src/ipa/rpi/controller/sharpen_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/sharpen_status.h rename to src/ipa/rpi/controller/sharpen_status.h diff --git a/src/ipa/raspberrypi/statistics.h b/src/ipa/rpi/controller/statistics.h similarity index 100% rename from src/ipa/raspberrypi/statistics.h rename to src/ipa/rpi/controller/statistics.h diff --git 
a/src/ipa/raspberrypi/data/imx219.json b/src/ipa/rpi/vc4/data/imx219.json similarity index 100% rename from src/ipa/raspberrypi/data/imx219.json rename to src/ipa/rpi/vc4/data/imx219.json diff --git a/src/ipa/raspberrypi/data/imx219_noir.json b/src/ipa/rpi/vc4/data/imx219_noir.json similarity index 100% rename from src/ipa/raspberrypi/data/imx219_noir.json rename to src/ipa/rpi/vc4/data/imx219_noir.json diff --git a/src/ipa/raspberrypi/data/imx290.json b/src/ipa/rpi/vc4/data/imx290.json similarity index 100% rename from src/ipa/raspberrypi/data/imx290.json rename to src/ipa/rpi/vc4/data/imx290.json diff --git a/src/ipa/raspberrypi/data/imx296.json b/src/ipa/rpi/vc4/data/imx296.json similarity index 100% rename from src/ipa/raspberrypi/data/imx296.json rename to src/ipa/rpi/vc4/data/imx296.json diff --git a/src/ipa/raspberrypi/data/imx296_mono.json b/src/ipa/rpi/vc4/data/imx296_mono.json similarity index 100% rename from src/ipa/raspberrypi/data/imx296_mono.json rename to src/ipa/rpi/vc4/data/imx296_mono.json diff --git a/src/ipa/raspberrypi/data/imx378.json b/src/ipa/rpi/vc4/data/imx378.json similarity index 100% rename from src/ipa/raspberrypi/data/imx378.json rename to src/ipa/rpi/vc4/data/imx378.json diff --git a/src/ipa/raspberrypi/data/imx477.json b/src/ipa/rpi/vc4/data/imx477.json similarity index 100% rename from src/ipa/raspberrypi/data/imx477.json rename to src/ipa/rpi/vc4/data/imx477.json diff --git a/src/ipa/raspberrypi/data/imx477_noir.json b/src/ipa/rpi/vc4/data/imx477_noir.json similarity index 100% rename from src/ipa/raspberrypi/data/imx477_noir.json rename to src/ipa/rpi/vc4/data/imx477_noir.json diff --git a/src/ipa/raspberrypi/data/imx477_scientific.json b/src/ipa/rpi/vc4/data/imx477_scientific.json similarity index 100% rename from src/ipa/raspberrypi/data/imx477_scientific.json rename to src/ipa/rpi/vc4/data/imx477_scientific.json diff --git a/src/ipa/raspberrypi/data/imx477_v1.json b/src/ipa/rpi/vc4/data/imx477_v1.json similarity index 100% 
rename from src/ipa/raspberrypi/data/imx477_v1.json rename to src/ipa/rpi/vc4/data/imx477_v1.json diff --git a/src/ipa/raspberrypi/data/imx519.json b/src/ipa/rpi/vc4/data/imx519.json similarity index 100% rename from src/ipa/raspberrypi/data/imx519.json rename to src/ipa/rpi/vc4/data/imx519.json diff --git a/src/ipa/raspberrypi/data/imx708.json b/src/ipa/rpi/vc4/data/imx708.json similarity index 100% rename from src/ipa/raspberrypi/data/imx708.json rename to src/ipa/rpi/vc4/data/imx708.json diff --git a/src/ipa/raspberrypi/data/imx708_noir.json b/src/ipa/rpi/vc4/data/imx708_noir.json similarity index 100% rename from src/ipa/raspberrypi/data/imx708_noir.json rename to src/ipa/rpi/vc4/data/imx708_noir.json diff --git a/src/ipa/raspberrypi/data/imx708_wide.json b/src/ipa/rpi/vc4/data/imx708_wide.json similarity index 100% rename from src/ipa/raspberrypi/data/imx708_wide.json rename to src/ipa/rpi/vc4/data/imx708_wide.json diff --git a/src/ipa/raspberrypi/data/imx708_wide_noir.json b/src/ipa/rpi/vc4/data/imx708_wide_noir.json similarity index 100% rename from src/ipa/raspberrypi/data/imx708_wide_noir.json rename to src/ipa/rpi/vc4/data/imx708_wide_noir.json diff --git a/src/ipa/raspberrypi/data/meson.build b/src/ipa/rpi/vc4/data/meson.build similarity index 89% rename from src/ipa/raspberrypi/data/meson.build rename to src/ipa/rpi/vc4/data/meson.build index b163a052f57c..327462fa2420 100644 --- a/src/ipa/raspberrypi/data/meson.build +++ b/src/ipa/rpi/vc4/data/meson.build @@ -23,4 +23,4 @@ conf_files = files([ ]) install_data(conf_files, - install_dir : ipa_data_dir / 'raspberrypi') + install_dir : ipa_data_dir / 'rpi/vc4') diff --git a/src/ipa/raspberrypi/data/ov5647.json b/src/ipa/rpi/vc4/data/ov5647.json similarity index 100% rename from src/ipa/raspberrypi/data/ov5647.json rename to src/ipa/rpi/vc4/data/ov5647.json diff --git a/src/ipa/raspberrypi/data/ov5647_noir.json b/src/ipa/rpi/vc4/data/ov5647_noir.json similarity index 100% rename from 
src/ipa/raspberrypi/data/ov5647_noir.json rename to src/ipa/rpi/vc4/data/ov5647_noir.json diff --git a/src/ipa/raspberrypi/data/ov9281_mono.json b/src/ipa/rpi/vc4/data/ov9281_mono.json similarity index 100% rename from src/ipa/raspberrypi/data/ov9281_mono.json rename to src/ipa/rpi/vc4/data/ov9281_mono.json diff --git a/src/ipa/raspberrypi/data/se327m12.json b/src/ipa/rpi/vc4/data/se327m12.json similarity index 100% rename from src/ipa/raspberrypi/data/se327m12.json rename to src/ipa/rpi/vc4/data/se327m12.json diff --git a/src/ipa/raspberrypi/data/uncalibrated.json b/src/ipa/rpi/vc4/data/uncalibrated.json similarity index 100% rename from src/ipa/raspberrypi/data/uncalibrated.json rename to src/ipa/rpi/vc4/data/uncalibrated.json diff --git a/src/ipa/rpi/vc4/meson.build b/src/ipa/rpi/vc4/meson.build new file mode 100644 index 000000000000..992d0f1ab5a7 --- /dev/null +++ b/src/ipa/rpi/vc4/meson.build @@ -0,0 +1,40 @@ +# SPDX-License-Identifier: CC0-1.0 + +ipa_name = 'ipa_vc4' + +vc4_ipa_deps = [ + libcamera_private, + libatomic, +] + +vc4_ipa_includes = [ + ipa_includes, + libipa_includes, +] + +vc4_ipa_sources = files([ + 'raspberrypi.cpp', +]) + +vc4_ipa_includes += include_directories('..') +vc4_ipa_sources += [rpi_ipa_cam_helper_sources, rpi_ipa_controller_sources] + +mod = shared_module(ipa_name, + [vc4_ipa_sources, libcamera_generated_ipa_headers], + name_prefix : '', + include_directories : vc4_ipa_includes, + dependencies : vc4_ipa_deps, + link_with : libipa, + install : true, + install_dir : ipa_install_dir) + +if ipa_sign_module + custom_target(ipa_name + '.so.sign', + input : mod, + output : ipa_name + '.so.sign', + command : [ipa_sign, ipa_priv_key, '@INPUT@', '@OUTPUT@'], + install : false, + build_by_default : true) +endif + +subdir('data') diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/rpi/vc4/raspberrypi.cpp similarity index 98% rename from src/ipa/raspberrypi/raspberrypi.cpp rename to src/ipa/rpi/vc4/raspberrypi.cpp index 
9c29fa9a5e5c..841635ccde2b 100644 --- a/src/ipa/raspberrypi/raspberrypi.cpp +++ b/src/ipa/rpi/vc4/raspberrypi.cpp @@ -33,29 +33,29 @@ #include "libcamera/internal/mapped_framebuffer.h" -#include "af_algorithm.h" -#include "af_status.h" -#include "agc_algorithm.h" -#include "agc_status.h" -#include "alsc_status.h" -#include "awb_algorithm.h" -#include "awb_status.h" -#include "black_level_status.h" -#include "cam_helper.h" -#include "ccm_algorithm.h" -#include "ccm_status.h" -#include "contrast_algorithm.h" -#include "contrast_status.h" -#include "controller.h" -#include "denoise_algorithm.h" -#include "denoise_status.h" -#include "dpc_status.h" -#include "geq_status.h" -#include "lux_status.h" -#include "metadata.h" -#include "sharpen_algorithm.h" -#include "sharpen_status.h" -#include "statistics.h" +#include "cam_helper/cam_helper.h" +#include "controller/af_algorithm.h" +#include "controller/af_status.h" +#include "controller/agc_algorithm.h" +#include "controller/agc_status.h" +#include "controller/alsc_status.h" +#include "controller/awb_algorithm.h" +#include "controller/awb_status.h" +#include "controller/black_level_status.h" +#include "controller/ccm_algorithm.h" +#include "controller/ccm_status.h" +#include "controller/contrast_algorithm.h" +#include "controller/contrast_status.h" +#include "controller/controller.h" +#include "controller/denoise_algorithm.h" +#include "controller/denoise_status.h" +#include "controller/dpc_status.h" +#include "controller/geq_status.h" +#include "controller/lux_status.h" +#include "controller/metadata.h" +#include "controller/sharpen_algorithm.h" +#include "controller/sharpen_status.h" +#include "controller/statistics.h" namespace libcamera { @@ -1840,7 +1840,7 @@ const struct IPAModuleInfo ipaModuleInfo = { IPA_MODULE_API_VERSION, 1, "PipelineHandlerRPi", - "raspberrypi", + "vc4", }; IPAInterface *ipaCreate() diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp index 
4595773d2517..cfde7f2e2229 100644 --- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp +++ b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp @@ -1668,7 +1668,7 @@ int RPiCameraData::loadIPA(ipa::RPi::IPAInitResult *result) std::string model = sensor_->model(); if (isMonoSensor(sensor_)) model += "_mono"; - configurationFile = ipa_->configurationFile(model + ".json", ""); + configurationFile = ipa_->configurationFile(model + ".json", "rpi"); } else { configurationFile = std::string(configFromEnv); }

From patchwork Wed Apr 26 13:10:49 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18553
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:49 +0100
Message-Id: <20230426131057.21550-6-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 05/13] pipeline: raspberrypi: rpi_stream: Set invalid buffer to id == 0
From: Naushir Patuck

At present, an RPiStream buffer id == -1 indicates an invalid value. As a simplification, use id == 0 to indicate an invalid value. This allows for better code readability. As a consequence of this, use unsigned int for the buffer id values.
Signed-off-by: Naushir Patuck Reviewed-by: Jacopo Mondi Reviewed-by: Laurent Pinchart --- .../pipeline/rpi/common/rpi_stream.cpp | 10 +++++----- src/libcamera/pipeline/rpi/common/rpi_stream.h | 18 +++++++++--------- src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp | 6 +++--- 3 files changed, 17 insertions(+), 17 deletions(-) diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp index 2bb10f25d6ca..3690667e9aa6 100644 --- a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp +++ b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp @@ -55,17 +55,17 @@ const BufferMap &Stream::getBuffers() const return bufferMap_; } -int Stream::getBufferId(FrameBuffer *buffer) const +unsigned int Stream::getBufferId(FrameBuffer *buffer) const { if (importOnly_) - return -1; + return 0; /* Find the buffer in the map, and return the buffer id. */ auto it = std::find_if(bufferMap_.begin(), bufferMap_.end(), [&buffer](auto const &p) { return p.second == buffer; }); if (it == bufferMap_.end()) - return -1; + return 0; return it->first; } @@ -77,10 +77,10 @@ void Stream::setExternalBuffer(FrameBuffer *buffer) void Stream::removeExternalBuffer(FrameBuffer *buffer) { - int id = getBufferId(buffer); + unsigned int id = getBufferId(buffer); /* Ensure we have this buffer in the stream, and it is marked external. 
*/ - ASSERT(id != -1 && (id & BufferMask::MaskExternalBuffer)); + ASSERT(id & BufferMask::MaskExternalBuffer); bufferMap_.erase(id); } diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.h b/src/libcamera/pipeline/rpi/common/rpi_stream.h index b8bd79cf1535..1aae674967e1 100644 --- a/src/libcamera/pipeline/rpi/common/rpi_stream.h +++ b/src/libcamera/pipeline/rpi/common/rpi_stream.h @@ -58,7 +58,7 @@ public: void setExportedBuffers(std::vector<std::unique_ptr<FrameBuffer>> *buffers); const BufferMap &getBuffers() const; - int getBufferId(FrameBuffer *buffer) const; + unsigned int getBufferId(FrameBuffer *buffer) const; void setExternalBuffer(FrameBuffer *buffer); void removeExternalBuffer(FrameBuffer *buffer); @@ -74,25 +74,25 @@ private: class IdGenerator { public: - IdGenerator(int max) + IdGenerator(unsigned int max) : max_(max), id_(0) { } - int get() + unsigned int get() { - int id; + unsigned int id; if (!recycle_.empty()) { id = recycle_.front(); recycle_.pop(); } else { - id = id_++; + id = ++id_; ASSERT(id_ <= max_); } return id; } - void release(int id) + void release(unsigned int id) { recycle_.push(id); } @@ -104,9 +104,9 @@ private: - int max_; - int id_; - std::queue<int> recycle_; + unsigned int max_; + unsigned int id_; + std::queue<unsigned int> recycle_; }; void clearBuffers(); diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp index cfde7f2e2229..e54bf1ef2c17 100644 --- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp +++ b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp @@ -1210,7 +1210,7 @@ int PipelineHandlerRPi::queueRequestDevice(Camera *camera, Request *request) continue; FrameBuffer *buffer = request->findBuffer(stream); - if (buffer && stream->getBufferId(buffer) == -1) { + if (buffer && !stream->getBufferId(buffer)) { /* * This buffer is not recognised, so it must have been allocated * outside the v4l2 device.
Store it in the stream buffer list @@ -2042,7 +2042,7 @@ void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer) for (RPi::Stream &s : unicam_) { index = s.getBufferId(buffer); - if (index != -1) { + if (index) { stream = &s; break; } @@ -2098,7 +2098,7 @@ void RPiCameraData::ispOutputDequeue(FrameBuffer *buffer) for (RPi::Stream &s : isp_) { index = s.getBufferId(buffer); - if (index != -1) { + if (index) { stream = &s; break; }

From patchwork Wed Apr 26 13:10:50 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18554
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:50 +0100
Message-Id: <20230426131057.21550-7-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 06/13] pipeline: ipa: raspberrypi: Restructure the IPA mojom interface
From: Naushir Patuck

Restructure the IPA mojom interface to be more consistent in the use of the API. Function parameters are now grouped into *Params structures and results are now returned in *Results structures. The following pipeline -> IPA interfaces have been removed: signalQueueRequest(libcamera.ControlList controls); signalIspPrepare(ISPConfig data); signalStatReady(uint32 bufferId, uint32 ipaContext); and replaced with: prepareIsp(PrepareParams params); processStats(ProcessParams params); signalQueueRequest() is now encompassed within prepareIsp().
The following IPA -> pipeline interfaces have been removed: runIsp(uint32 bufferId); embeddedComplete(uint32 bufferId); statsMetadataComplete(uint32 bufferId, libcamera.ControlList controls); and replaced with the following async calls: prepareIspComplete(BufferIds buffers); processStatsComplete(BufferIds buffers); metadataReady(libcamera.ControlList metadata); Signed-off-by: Naushir Patuck Reviewed-by: Jacopo Mondi --- include/libcamera/ipa/raspberrypi.mojom | 127 +++++--------- src/ipa/rpi/vc4/raspberrypi.cpp | 102 +++++------- .../pipeline/rpi/vc4/raspberrypi.cpp | 155 +++++++++--------- 3 files changed, 165 insertions(+), 219 deletions(-) diff --git a/include/libcamera/ipa/raspberrypi.mojom b/include/libcamera/ipa/raspberrypi.mojom index 80e0126618c8..7d56248f4ab3 100644 --- a/include/libcamera/ipa/raspberrypi.mojom +++ b/include/libcamera/ipa/raspberrypi.mojom @@ -8,7 +8,7 @@ module ipa.RPi; import "include/libcamera/ipa/core.mojom"; -/* Size of the LS grid allocation. */ +/* Size of the LS grid allocation on VC4. 
*/ const uint32 MaxLsGridSize = 0x8000; struct SensorConfig { @@ -19,117 +19,78 @@ struct SensorConfig { uint32 sensorMetadata; }; -struct IPAInitResult { +struct InitParams { + bool lensPresent; +}; + +struct InitResult { SensorConfig sensorConfig; libcamera.ControlInfoMap controlInfo; }; -struct ISPConfig { - uint32 embeddedBufferId; - uint32 bayerBufferId; - bool embeddedBufferPresent; - libcamera.ControlList controls; - uint32 ipaContext; - uint32 delayContext; +struct BufferIds { + uint32 bayer; + uint32 embedded; + uint32 stats; }; -struct IPAConfig { +struct ConfigParams { uint32 transform; - libcamera.SharedFD lsTableHandle; libcamera.ControlInfoMap sensorControls; libcamera.ControlInfoMap ispControls; libcamera.ControlInfoMap lensControls; + /* VC4 specifc */ + libcamera.SharedFD lsTableHandle; }; -struct IPAConfigResult { - float modeSensitivity; - libcamera.ControlInfoMap controlInfo; +struct ConfigResult { + float modeSensitivity; + libcamera.ControlInfoMap controlInfo; + libcamera.ControlList controls; }; -struct StartConfig { +struct StartResult { libcamera.ControlList controls; int32 dropFrameCount; }; +struct PrepareParams { + BufferIds buffers; + libcamera.ControlList sensorControls; + libcamera.ControlList requestControls; + uint32 ipaContext; + uint32 delayContext; +}; + +struct ProcessParams { + BufferIds buffers; + uint32 ipaContext; +}; + interface IPARPiInterface { - init(libcamera.IPASettings settings, bool lensPresent) - => (int32 ret, IPAInitResult result); - start(libcamera.ControlList controls) => (StartConfig startConfig); + init(libcamera.IPASettings settings, InitParams params) + => (int32 ret, InitResult result); + + start(libcamera.ControlList controls) => (StartResult result); stop(); - /** - * \fn configure() - * \brief Configure the IPA stream and sensor settings - * \param[in] sensorInfo Camera sensor information - * \param[in] ipaConfig Pipeline-handler-specific configuration data - * \param[out] controls Controls to apply by 
the pipeline entity - * \param[out] result Other results that the pipeline handler may require - * - * This function shall be called when the camera is configured to inform - * the IPA of the camera's streams and the sensor settings. - * - * The \a sensorInfo conveys information about the camera sensor settings that - * the pipeline handler has selected for the configuration. - * - * The \a ipaConfig and \a controls parameters carry data passed by the - * pipeline handler to the IPA and back. - */ - configure(libcamera.IPACameraSensorInfo sensorInfo, - IPAConfig ipaConfig) - => (int32 ret, libcamera.ControlList controls, IPAConfigResult result); - - /** - * \fn mapBuffers() - * \brief Map buffers shared between the pipeline handler and the IPA - * \param[in] buffers List of buffers to map - * - * This function informs the IPA module of memory buffers set up by the - * pipeline handler that the IPA needs to access. It provides dmabuf - * file handles for each buffer, and associates the buffers with unique - * numerical IDs. - * - * IPAs shall map the dmabuf file handles to their address space and - * keep a cache of the mappings, indexed by the buffer numerical IDs. - * The IDs are used in all other IPA interface functions to refer to - * buffers, including the unmapBuffers() function. - * - * All buffers that the pipeline handler wishes to share with an IPA - * shall be mapped with this function. Buffers may be mapped all at once - * with a single call, or mapped and unmapped dynamically at runtime, - * depending on the IPA protocol. Regardless of the protocol, all - * buffers mapped at a given time shall have unique numerical IDs. - * - * The numerical IDs have no meaning defined by the IPA interface, and - * should be treated as opaque handles by IPAs, with the only exception - * that ID zero is invalid. 
- * - * \sa unmapBuffers() - */ - mapBuffers(array buffers); + configure(libcamera.IPACameraSensorInfo sensorInfo, ConfigParams params) + => (int32 ret, ConfigResult result); - /** - * \fn unmapBuffers() - * \brief Unmap buffers shared by the pipeline to the IPA - * \param[in] ids List of buffer IDs to unmap - * - * This function removes mappings set up with mapBuffers(). Numerical - * IDs of unmapped buffers may be reused when mapping new buffers. - * - * \sa mapBuffers() - */ + mapBuffers(array buffers); unmapBuffers(array ids); - [async] signalStatReady(uint32 bufferId, uint32 ipaContext); - [async] signalQueueRequest(libcamera.ControlList controls); - [async] signalIspPrepare(ISPConfig data); + [async] prepareIsp(PrepareParams params); + [async] processStats(ProcessParams params); }; interface IPARPiEventInterface { - statsMetadataComplete(uint32 bufferId, libcamera.ControlList controls); - runIsp(uint32 bufferId); - embeddedComplete(uint32 bufferId); + prepareIspComplete(BufferIds buffers); + processStatsComplete(BufferIds buffers); + metadataReady(libcamera.ControlList metadata); setIspControls(libcamera.ControlList controls); setDelayedControls(libcamera.ControlList controls, uint32 delayContext); setLensControls(libcamera.ControlList controls); setCameraTimeout(uint32 maxFrameLengthMs); }; + diff --git a/src/ipa/rpi/vc4/raspberrypi.cpp b/src/ipa/rpi/vc4/raspberrypi.cpp index 841635ccde2b..d737f1d662a0 100644 --- a/src/ipa/rpi/vc4/raspberrypi.cpp +++ b/src/ipa/rpi/vc4/raspberrypi.cpp @@ -136,30 +136,28 @@ public: munmap(lsTable_, MaxLsGridSize); } - int init(const IPASettings &settings, bool lensPresent, IPAInitResult *result) override; - void start(const ControlList &controls, StartConfig *startConfig) override; + int init(const IPASettings &settings, const InitParams &params, InitResult *result) override; + void start(const ControlList &controls, StartResult *result) override; void stop() override {} - int configure(const IPACameraSensorInfo &sensorInfo, const
IPAConfig &data, - ControlList *controls, IPAConfigResult *result) override; + int configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams &params, + ConfigResult *result) override; void mapBuffers(const std::vector &buffers) override; void unmapBuffers(const std::vector &ids) override; - void signalStatReady(const uint32_t bufferId, uint32_t ipaContext) override; - void signalQueueRequest(const ControlList &controls) override; - void signalIspPrepare(const ISPConfig &data) override; + void prepareIsp(const PrepareParams &params) override; + void processStats(const ProcessParams &params) override; private: void setMode(const IPACameraSensorInfo &sensorInfo); bool validateSensorControls(); bool validateIspControls(); bool validateLensControls(); - void queueRequest(const ControlList &controls); - void returnEmbeddedBuffer(unsigned int bufferId); - void prepareISP(const ISPConfig &data); + void applyControls(const ControlList &controls); + void prepare(const PrepareParams &params); void reportMetadata(unsigned int ipaContext); void fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext); RPiController::StatisticsPtr fillStatistics(bcm2835_isp_stats *stats) const; - void processStats(unsigned int bufferId, unsigned int ipaContext); + void process(unsigned int bufferId, unsigned int ipaContext); void setCameraTimeoutValue(); void applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration); void applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls); @@ -229,7 +227,7 @@ private: Duration lastTimeout_; }; -int IPARPi::init(const IPASettings &settings, bool lensPresent, IPAInitResult *result) +int IPARPi::init(const IPASettings &settings, const InitParams &params, InitResult *result) { /* * Load the "helper" for this sensor.
This tells us all the device specific stuff @@ -274,7 +272,7 @@ int IPARPi::init(const IPASettings &settings, bool lensPresent, IPAInitResult *r return -EINVAL; } - lensPresent_ = lensPresent; + lensPresent_ = params.lensPresent; controller_.initialise(); @@ -287,14 +285,13 @@ int IPARPi::init(const IPASettings &settings, bool lensPresent, IPAInitResult *r return 0; } -void IPARPi::start(const ControlList &controls, StartConfig *startConfig) +void IPARPi::start(const ControlList &controls, StartResult *result) { RPiController::Metadata metadata; - ASSERT(startConfig); if (!controls.empty()) { /* We have been given some controls to action before start. */ - queueRequest(controls); + applyControls(controls); } controller_.switchMode(mode_, &metadata); @@ -313,7 +310,7 @@ void IPARPi::start(const ControlList &controls, StartConfig *startConfig) if (agcStatus.shutterTime && agcStatus.analogueGain) { ControlList ctrls(sensorCtrls_); applyAGC(&agcStatus, ctrls); - startConfig->controls = std::move(ctrls); + result->controls = std::move(ctrls); setCameraTimeoutValue(); } @@ -360,7 +357,7 @@ void IPARPi::start(const ControlList &controls, StartConfig *startConfig) mistrustCount_ = helper_->mistrustFramesModeSwitch(); } - startConfig->dropFrameCount = dropFrameCount_; + result->dropFrameCount = dropFrameCount_; firstStart_ = false; lastRunTimestamp_ = 0; @@ -435,11 +432,11 @@ void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo) mode_.minFrameDuration, mode_.maxFrameDuration); } -int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ipaConfig, - ControlList *controls, IPAConfigResult *result) +int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams &params, + ConfigResult *result) { - sensorCtrls_ = ipaConfig.sensorControls; - ispCtrls_ = ipaConfig.ispControls; + sensorCtrls_ = params.sensorControls; + ispCtrls_ = params.ispControls; if (!validateSensorControls()) { LOG(IPARPI, Error) << "Sensor control validation failed.";
@@ -452,7 +449,7 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ip } if (lensPresent_) { - lensCtrls_ = ipaConfig.lensControls; + lensCtrls_ = params.lensControls; if (!validateLensControls()) { LOG(IPARPI, Warning) << "Lens validation failed, " << "no lens control will be available."; @@ -466,10 +463,10 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ip /* Re-assemble camera mode using the sensor info. */ setMode(sensorInfo); - mode_.transform = static_cast(ipaConfig.transform); + mode_.transform = static_cast(params.transform); /* Store the lens shading table pointer and handle if available. */ - if (ipaConfig.lsTableHandle.isValid()) { + if (params.lsTableHandle.isValid()) { /* Remove any previous table, if there was one. */ if (lsTable_) { munmap(lsTable_, MaxLsGridSize); @@ -477,7 +474,7 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ip } /* Map the LS table buffer into user space. */ - lsTableHandle_ = std::move(ipaConfig.lsTableHandle); + lsTableHandle_ = std::move(params.lsTableHandle); if (lsTableHandle_.isValid()) { lsTable_ = mmap(nullptr, MaxLsGridSize, PROT_READ | PROT_WRITE, MAP_SHARED, lsTableHandle_.get(), 0); @@ -512,8 +509,7 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ip applyAGC(&agcStatus, ctrls); } - ASSERT(controls); - *controls = std::move(ctrls); + result->controls = std::move(ctrls); /* * Apply the correct limits to the exposure, gain and frame duration controls @@ -560,37 +556,34 @@ void IPARPi::unmapBuffers(const std::vector &ids) } } -void IPARPi::signalStatReady(uint32_t bufferId, uint32_t ipaContext) +void IPARPi::processStats(const ProcessParams &params) { - unsigned int context = ipaContext % rpiMetadata_.size(); + unsigned int context = params.ipaContext % rpiMetadata_.size(); if (++checkCount_ != frameCount_) /* assert here?
*/ LOG(IPARPI, Error) << "WARNING: Prepare/Process mismatch!!!"; if (processPending_ && frameCount_ > mistrustCount_) - processStats(bufferId, context); + process(params.buffers.stats, context); reportMetadata(context); - - statsMetadataComplete.emit(bufferId, libcameraMetadata_); + processStatsComplete.emit(params.buffers); } -void IPARPi::signalQueueRequest(const ControlList &controls) -{ - queueRequest(controls); -} -void IPARPi::signalIspPrepare(const ISPConfig &data) +void IPARPi::prepareIsp(const PrepareParams &params) { + applyControls(params.requestControls); + /* * At start-up, or after a mode-switch, we may want to * avoid running the control algos for a few frames in case * they are "unreliable". */ - prepareISP(data); + prepare(params); frameCount_++; /* Ready to push the input buffer into the ISP. */ - runIsp.emit(data.bayerBufferId); + prepareIspComplete.emit(params.buffers); } void IPARPi::reportMetadata(unsigned int ipaContext) @@ -703,6 +696,8 @@ void IPARPi::reportMetadata(unsigned int ipaContext) libcameraMetadata_.set(controls::AfState, s); libcameraMetadata_.set(controls::AfPauseState, p); } + + metadataReady.emit(libcameraMetadata_); } bool IPARPi::validateSensorControls() @@ -826,7 +821,7 @@ static const std::map AfPauseTable { controls::AfPauseResume, RPiController::AfAlgorithm::AfPauseResume }, }; -void IPARPi::queueRequest(const ControlList &controls) +void IPARPi::applyControls(const ControlList &controls) { using RPiController::AfAlgorithm; @@ -1256,27 +1251,22 @@ void IPARPi::queueRequest(const ControlList &controls) } } -void IPARPi::returnEmbeddedBuffer(unsigned int bufferId) +void IPARPi::prepare(const PrepareParams &params) { - embeddedComplete.emit(bufferId); -} - -void IPARPi::prepareISP(const ISPConfig &data) -{ - int64_t frameTimestamp = data.controls.get(controls::SensorTimestamp).value_or(0); - unsigned int ipaContext = data.ipaContext % rpiMetadata_.size(); + int64_t frameTimestamp =
params.sensorControls.get(controls::SensorTimestamp).value_or(0); + unsigned int ipaContext = params.ipaContext % rpiMetadata_.size(); RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; Span embeddedBuffer; rpiMetadata.clear(); - fillDeviceStatus(data.controls, ipaContext); + fillDeviceStatus(params.sensorControls, ipaContext); - if (data.embeddedBufferPresent) { + if (params.buffers.embedded) { /* * Pipeline handler has supplied us with an embedded data buffer, * we must pass it to the CamHelper for parsing. */ - auto it = buffers_.find(data.embeddedBufferId); + auto it = buffers_.find(params.buffers.embedded); ASSERT(it != buffers_.end()); embeddedBuffer = it->second.planes()[0]; } @@ -1288,7 +1278,7 @@ void IPARPi::prepareISP(const ISPConfig &data) * metadata. */ AgcStatus agcStatus; - RPiController::Metadata &delayedMetadata = rpiMetadata_[data.delayContext]; + RPiController::Metadata &delayedMetadata = rpiMetadata_[params.delayContext]; if (!delayedMetadata.get("agc.status", agcStatus)) rpiMetadata.set("agc.delayed_status", agcStatus); @@ -1298,10 +1288,6 @@ void IPARPi::prepareISP(const ISPConfig &data) */ helper_->prepare(embeddedBuffer, rpiMetadata); - /* Done with embedded data now, return to pipeline handler asap. */ - if (data.embeddedBufferPresent) - returnEmbeddedBuffer(data.embeddedBufferId); - /* Allow a 10% margin on the comparison below. 
*/ Duration delta = (frameTimestamp - lastRunTimestamp_) * 1.0ns; if (lastRunTimestamp_ && frameCount_ > dropFrameCount_ && @@ -1445,7 +1431,7 @@ RPiController::StatisticsPtr IPARPi::fillStatistics(bcm2835_isp_stats *stats) co return statistics; } -void IPARPi::processStats(unsigned int bufferId, unsigned int ipaContext) +void IPARPi::process(unsigned int bufferId, unsigned int ipaContext) { RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp index e54bf1ef2c17..30ce6feab0d1 100644 --- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp +++ b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp @@ -200,15 +200,15 @@ public: void freeBuffers(); void frameStarted(uint32_t sequence); - int loadIPA(ipa::RPi::IPAInitResult *result); - int configureIPA(const CameraConfiguration *config, ipa::RPi::IPAConfigResult *result); + int loadIPA(ipa::RPi::InitResult *result); + int configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result); int loadPipelineConfiguration(); void enumerateVideoDevices(MediaLink *link); - void statsMetadataComplete(uint32_t bufferId, const ControlList &controls); - void runIsp(uint32_t bufferId); - void embeddedComplete(uint32_t bufferId); + void processStatsComplete(const ipa::RPi::BufferIds &buffers); + void metadataReady(const ControlList &metadata); + void prepareIspComplete(const ipa::RPi::BufferIds &buffers); void setIspControls(const ControlList &controls); void setDelayedControls(const ControlList &controls, uint32_t delayContext); void setLensControls(const ControlList &controls); @@ -238,7 +238,7 @@ public: /* The vector below is just for convenience when iterating over all streams. */ std::vector streams_; /* Stores the ids of the buffers mapped in the IPA. 
*/ - std::unordered_set ipaBuffers_; + std::unordered_set bufferIds_; /* * Stores a cascade of Video Mux or Bridge devices between the sensor and * Unicam together with media link across the entities. @@ -1000,7 +1000,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop); - ipa::RPi::IPAConfigResult result; + ipa::RPi::ConfigResult result; ret = data->configureIPA(config, &result); if (ret) LOG(RPI, Error) << "Failed to configure the IPA: " << ret; @@ -1117,17 +1117,17 @@ int PipelineHandlerRPi::start(Camera *camera, const ControlList *controls) data->applyScalerCrop(*controls); /* Start the IPA. */ - ipa::RPi::StartConfig startConfig; + ipa::RPi::StartResult result; data->ipa_->start(controls ? *controls : ControlList{ controls::controls }, - &startConfig); + &result); /* Apply any gain/exposure settings that the IPA may have passed back. */ - if (!startConfig.controls.empty()) - data->setSensorControls(startConfig.controls); + if (!result.controls.empty()) + data->setSensorControls(result.controls); /* Configure the number of dropped frames required on startup. */ data->dropFrameCount_ = data->config_.disableStartupFrameDrops - ? 0 : startConfig.dropFrameCount; + ? 
0 : result.dropFrameCount; for (auto const stream : data->streams_) stream->resetBuffers(); @@ -1358,7 +1358,7 @@ int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, Me data->sensorFormats_ = populateSensorFormats(data->sensor_); - ipa::RPi::IPAInitResult result; + ipa::RPi::InitResult result; if (data->loadIPA(&result)) { LOG(RPI, Error) << "Failed to load a suitable IPA library"; return -EINVAL; @@ -1599,7 +1599,7 @@ int PipelineHandlerRPi::prepareBuffers(Camera *camera) void PipelineHandlerRPi::mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask) { RPiCameraData *data = cameraData(camera); - std::vector ipaBuffers; + std::vector bufferIds; /* * Link the FrameBuffers with the id (key value) in the map stored in * the RPi stream object - along with an identifier mask. @@ -1608,12 +1608,12 @@ void PipelineHandlerRPi::mapBuffers(Camera *camera, const RPi::BufferMap &buffer * handler and the IPA. */ for (auto const &it : buffers) { - ipaBuffers.push_back(IPABuffer(mask | it.first, + bufferIds.push_back(IPABuffer(mask | it.first, it.second->planes())); - data->ipaBuffers_.insert(mask | it.first); + data->bufferIds_.insert(mask | it.first); } - data->ipa_->mapBuffers(ipaBuffers); + data->ipa_->mapBuffers(bufferIds); } void RPiCameraData::freeBuffers() @@ -1623,10 +1623,10 @@ void RPiCameraData::freeBuffers() * Copy the buffer ids from the unordered_set to a vector to * pass to the IPA. 
*/ - std::vector ipaBuffers(ipaBuffers_.begin(), - ipaBuffers_.end()); - ipa_->unmapBuffers(ipaBuffers); - ipaBuffers_.clear(); + std::vector bufferIds(bufferIds_.begin(), + bufferIds_.end()); + ipa_->unmapBuffers(bufferIds); + bufferIds_.clear(); } for (auto const stream : streams_) @@ -1643,16 +1643,16 @@ void RPiCameraData::frameStarted(uint32_t sequence) delayedCtrls_->applyControls(sequence); } -int RPiCameraData::loadIPA(ipa::RPi::IPAInitResult *result) +int RPiCameraData::loadIPA(ipa::RPi::InitResult *result) { ipa_ = IPAManager::createIPA(pipe(), 1, 1); if (!ipa_) return -ENOENT; - ipa_->statsMetadataComplete.connect(this, &RPiCameraData::statsMetadataComplete); - ipa_->runIsp.connect(this, &RPiCameraData::runIsp); - ipa_->embeddedComplete.connect(this, &RPiCameraData::embeddedComplete); + ipa_->processStatsComplete.connect(this, &RPiCameraData::processStatsComplete); + ipa_->prepareIspComplete.connect(this, &RPiCameraData::prepareIspComplete); + ipa_->metadataReady.connect(this, &RPiCameraData::metadataReady); ipa_->setIspControls.connect(this, &RPiCameraData::setIspControls); ipa_->setDelayedControls.connect(this, &RPiCameraData::setDelayedControls); ipa_->setLensControls.connect(this, &RPiCameraData::setLensControls); @@ -1674,23 +1674,25 @@ int RPiCameraData::loadIPA(ipa::RPi::IPAInitResult *result) } IPASettings settings(configurationFile, sensor_->model()); + ipa::RPi::InitParams params; - return ipa_->init(settings, !!sensor_->focusLens(), result); + params.lensPresent = !!sensor_->focusLens(); + return ipa_->init(settings, params, result); } -int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::IPAConfigResult *result) +int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result) { std::map entityControls; - ipa::RPi::IPAConfig ipaConfig; + ipa::RPi::ConfigParams params; /* \todo Move passing of ispControls and lensControls to ipa::init() */ - ipaConfig.sensorControls = 
sensor_->controls(); - ipaConfig.ispControls = isp_[Isp::Input].dev()->controls(); + params.sensorControls = sensor_->controls(); + params.ispControls = isp_[Isp::Input].dev()->controls(); if (sensor_->focusLens()) - ipaConfig.lensControls = sensor_->focusLens()->controls(); + params.lensControls = sensor_->focusLens()->controls(); /* Always send the user transform to the IPA. */ - ipaConfig.transform = static_cast(config->transform); + params.transform = static_cast(config->transform); /* Allocate the lens shading table via dmaHeap and pass to the IPA. */ if (!lsTable_.isValid()) { @@ -1703,7 +1705,7 @@ int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::IPA * \todo Investigate if mapping the lens shading table buffer * could be handled with mapBuffers(). */ - ipaConfig.lsTableHandle = lsTable_; + params.lsTableHandle = lsTable_; } /* We store the IPACameraSensorInfo for digital zoom calculations. */ @@ -1714,15 +1716,14 @@ int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::IPA } /* Ready the IPA - it must know about the sensor resolution. 
*/ - ControlList controls; - ret = ipa_->configure(sensorInfo_, ipaConfig, &controls, result); + ret = ipa_->configure(sensorInfo_, params, result); if (ret < 0) { LOG(RPI, Error) << "IPA configuration failed!"; return -EPIPE; } - if (!controls.empty()) - setSensorControls(controls); + if (!result->controls.empty()) + setSensorControls(result->controls); return 0; } @@ -1883,24 +1884,32 @@ void RPiCameraData::enumerateVideoDevices(MediaLink *link) } } -void RPiCameraData::statsMetadataComplete(uint32_t bufferId, const ControlList &controls) +void RPiCameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers) { if (!isRunning()) return; - FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(bufferId & RPi::MaskID); + FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID); handleStreamBuffer(buffer, &isp_[Isp::Stats]); + state_ = State::IpaComplete; + handleState(); +} + +void RPiCameraData::metadataReady(const ControlList &metadata) +{ + if (!isRunning()) + return; /* Add to the Request metadata buffer what the IPA has provided. */ Request *request = requestQueue_.front(); - request->metadata().merge(controls); + request->metadata().merge(metadata); /* * Inform the sensor of the latest colour gains if it has the * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set). */ - const auto &colourGains = controls.get(libcamera::controls::ColourGains); + const auto &colourGains = metadata.get(libcamera::controls::ColourGains); if (notifyGainsUnity_ && colourGains) { /* The control wants linear gains in the order B, Gb, Gr, R. 
*/ ControlList ctrls(sensor_->controls()); @@ -1914,33 +1923,29 @@ void RPiCameraData::statsMetadataComplete(uint32_t bufferId, const ControlList & sensor_->setControls(&ctrls); } - - state_ = State::IpaComplete; - handleState(); } -void RPiCameraData::runIsp(uint32_t bufferId) +void RPiCameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers) { + unsigned int embeddedId = buffers.embedded & RPi::MaskID; + unsigned int bayer = buffers.bayer & RPi::MaskID; + FrameBuffer *buffer; + if (!isRunning()) return; - FrameBuffer *buffer = unicam_[Unicam::Image].getBuffers().at(bufferId & RPi::MaskID); - - LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bufferId & RPi::MaskID) + buffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID); + LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bayer & RPi::MaskID) << ", timestamp: " << buffer->metadata().timestamp; isp_[Isp::Input].queueBuffer(buffer); ispOutputCount_ = 0; - handleState(); -} -void RPiCameraData::embeddedComplete(uint32_t bufferId) -{ - if (!isRunning()) - return; + if (sensorMetadata_ && embeddedId) { + buffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID); + handleStreamBuffer(buffer, &unicam_[Unicam::Embedded]); + } - FrameBuffer *buffer = unicam_[Unicam::Embedded].getBuffers().at(bufferId & RPi::MaskID); - handleStreamBuffer(buffer, &unicam_[Unicam::Embedded]); handleState(); } @@ -2116,8 +2121,10 @@ void RPiCameraData::ispOutputDequeue(FrameBuffer *buffer) * application until after the IPA signals so. */ if (stream == &isp_[Isp::Stats]) { - ipa_->signalStatReady(RPi::MaskStats | static_cast(index), - requestQueue_.front()->sequence()); + ipa::RPi::ProcessParams params; + params.buffers.stats = index | RPi::MaskStats; + params.ipaContext = requestQueue_.front()->sequence(); + ipa_->processStats(params); } else { /* Any other ISP output can be handed back to the application now. 
handleStreamBuffer(buffer, stream); @@ -2344,38 +2351,30 @@ void RPiCameraData::tryRunPipeline() request->metadata().clear(); fillRequestMetadata(bayerFrame.controls, request); - /* - * Process all the user controls by the IPA. Once this is complete, we - * queue the ISP output buffer listed in the request to start the HW - * pipeline. - */ - ipa_->signalQueueRequest(request->controls()); - /* Set our state to say the pipeline is active. */ state_ = State::Busy; - unsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer); + unsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer); - LOG(RPI, Debug) << "Signalling signalIspPrepare:" - << " Bayer buffer id: " << bayerId; + LOG(RPI, Debug) << "Signalling prepareIsp:" + << " Bayer buffer id: " << bayer; - ipa::RPi::ISPConfig ispPrepare; - ispPrepare.bayerBufferId = RPi::MaskBayerData | bayerId; - ispPrepare.controls = std::move(bayerFrame.controls); - ispPrepare.ipaContext = request->sequence(); - ispPrepare.delayContext = bayerFrame.delayContext; + ipa::RPi::PrepareParams params; + params.buffers.bayer = RPi::MaskBayerData | bayer; + params.sensorControls = std::move(bayerFrame.controls); + params.requestControls = request->controls(); + params.ipaContext = request->sequence(); + params.delayContext = bayerFrame.delayContext; if (embeddedBuffer) { unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer); - ispPrepare.embeddedBufferId = RPi::MaskEmbeddedData | embeddedId; - ispPrepare.embeddedBufferPresent = true; - - LOG(RPI, Debug) << "Signalling signalIspPrepare:" + params.buffers.embedded = RPi::MaskEmbeddedData | embeddedId; + LOG(RPI, Debug) << "Signalling prepareIsp:" << " Embedded buffer id: " << embeddedId; } - ipa_->signalIspPrepare(ispPrepare); + ipa_->prepareIsp(params); } bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer)

From patchwork Wed Apr 26 13:10:51 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18555
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:51 +0100
Message-Id: <20230426131057.21550-8-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 07/13] pipeline: ipa: raspberrypi: Replace IPA metadataReady() call
From: Naushir Patuck

Remove the IPA -> pipeline metadataReady() call used to signify that the returned Request metadata is ready to be consumed. Instead, replace it with a new pipeline -> IPA reportMetadata() call that explicitly asks the IPA to fill in the Request metadata at an appropriate time. This change allows the pipeline handler to dictate when it receives the metadata from the IPA if the structure of the pipeline were ever to change.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
include/libcamera/ipa/raspberrypi.mojom | 2 ++
src/ipa/rpi/vc4/raspberrypi.cpp | 9 ++++-----
src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp | 18 +++++++-----------
3 files changed, 13 insertions(+), 16 deletions(-)

diff --git a/include/libcamera/ipa/raspberrypi.mojom b/include/libcamera/ipa/raspberrypi.mojom index 7d56248f4ab3..620f13ba9717 100644 --- a/include/libcamera/ipa/raspberrypi.mojom +++ b/include/libcamera/ipa/raspberrypi.mojom @@ -82,6 +82,8 @@ interface IPARPiInterface { [async] prepareIsp(PrepareParams params); [async] processStats(ProcessParams params); + + reportMetadata(uint32 ipaContext) => (libcamera.ControlList controls); }; interface IPARPiEventInterface { diff --git a/src/ipa/rpi/vc4/raspberrypi.cpp b/src/ipa/rpi/vc4/raspberrypi.cpp index d737f1d662a0..e3ca8e2f7cbe 100644 --- a/src/ipa/rpi/vc4/raspberrypi.cpp +++ b/src/ipa/rpi/vc4/raspberrypi.cpp @@ -154,7 +154,7 @@ private: bool validateLensControls(); void applyControls(const ControlList &controls); void
prepare(const PrepareParams &params); - void reportMetadata(unsigned int ipaContext); + void reportMetadata(unsigned int ipaContext, ControlList *controls) override; void fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext); RPiController::StatisticsPtr fillStatistics(bcm2835_isp_stats *stats) const; void process(unsigned int bufferId, unsigned int ipaContext); @@ -565,7 +565,6 @@ void IPARPi::processStats(const ProcessParams &params) if (processPending_ && frameCount_ > mistrustCount_) process(params.buffers.stats, context); - reportMetadata(context); processStatsComplete.emit(params.buffers); } @@ -586,9 +585,9 @@ void IPARPi::prepareIsp(const PrepareParams &params) prepareIspComplete.emit(params.buffers); } -void IPARPi::reportMetadata(unsigned int ipaContext) +void IPARPi::reportMetadata(unsigned int ipaContext, ControlList *controls) { - RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; + RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext % rpiMetadata_.size()]; std::unique_lock lock(rpiMetadata); /* @@ -697,7 +696,7 @@ void IPARPi::reportMetadata(unsigned int ipaContext) libcameraMetadata_.set(controls::AfPauseState, p); } - metadataReady.emit(libcameraMetadata_); + *controls = std::move(libcameraMetadata_); } bool IPARPi::validateSensorControls() diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp index 30ce6feab0d1..bd66468683df 100644 --- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp +++ b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp @@ -207,7 +207,6 @@ public: void enumerateVideoDevices(MediaLink *link); void processStatsComplete(const ipa::RPi::BufferIds &buffers); - void metadataReady(const ControlList &metadata); void prepareIspComplete(const ipa::RPi::BufferIds &buffers); void setIspControls(const ControlList &controls); void setDelayedControls(const ControlList &controls, uint32_t delayContext); @@ -1652,7 +1651,6 @@ int
RPiCameraData::loadIPA(ipa::RPi::InitResult *result)
 	ipa_->processStatsComplete.connect(this, &RPiCameraData::processStatsComplete);
 	ipa_->prepareIspComplete.connect(this, &RPiCameraData::prepareIspComplete);
-	ipa_->metadataReady.connect(this, &RPiCameraData::metadataReady);
 	ipa_->setIspControls.connect(this, &RPiCameraData::setIspControls);
 	ipa_->setDelayedControls.connect(this, &RPiCameraData::setDelayedControls);
 	ipa_->setLensControls.connect(this, &RPiCameraData::setLensControls);
@@ -1892,17 +1890,12 @@ void RPiCameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers)
 	FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID);
 	handleStreamBuffer(buffer, &isp_[Isp::Stats]);
-	state_ = State::IpaComplete;
-	handleState();
-}
-
-void RPiCameraData::metadataReady(const ControlList &metadata)
-{
-	if (!isRunning())
-		return;
-
-	/* Add to the Request metadata buffer what the IPA has provided. */
+	/* Last thing to do is to fill up the request metadata. */
 	Request *request = requestQueue_.front();
+	ControlList metadata;
+
+	ipa_->reportMetadata(request->sequence(), &metadata);
 	request->metadata().merge(metadata);
 
 	/*
@@ -1923,6 +1916,9 @@ void RPiCameraData::metadataReady(const ControlList &metadata)
 		sensor_->setControls(&ctrls);
 	}
+
+	state_ = State::IpaComplete;
+	handleState();
 }
 
 void RPiCameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers)

From patchwork Wed Apr 26 13:10:52 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18556
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:52 +0100
Message-Id: <20230426131057.21550-9-naush@raspberrypi.com>
X-Mailer: git-send-email 2.34.1
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 08/13] ipa: raspberrypi: Introduce IpaBase class
From: Naushir Patuck

Create a new IpaBase class that handles general-purpose housekeeping duties
for the Raspberry Pi IPA. The code implementing the new class is essentially
pulled from the existing ipa/rpi/vc4/raspberrypi.cpp file with a minimal
amount of refactoring.

Create a derived IpaVc4 class from IpaBase that handles the VC4 pipeline
specific tasks of the IPA. Again, the code for this class implementation is
taken from the existing ipa/rpi/vc4/raspberrypi.cpp with a minimal amount of
refactoring.

The goal of this change is to allow third parties to implement their own IPA
running on the Raspberry Pi without duplicating all of the IPA housekeeping
tasks.
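[Note: the base/derived split described above can be sketched as below. This
is an illustrative stand-in only, not libcamera code: the class names and the
platformPrepareIsp() hook mirror names from the patch, but the members and
bodies are invented for demonstration. The base class owns the generic
per-frame housekeeping, and the pure-virtual hook is the only thing a
platform such as VC4 has to supply.]

```cpp
#include <cassert>
#include <cstdint>
#include <string>

/* Generic housekeeping lives in the base class. */
class IpaBase {
public:
	virtual ~IpaBase() = default;

	/*
	 * Shared per-frame entry point: do the common bookkeeping
	 * (here just a frame counter), then delegate the
	 * platform-specific ISP preparation to the derived class.
	 */
	std::string prepareIsp()
	{
		frameCount_++;
		return platformPrepareIsp();
	}

	uint64_t frameCount() const { return frameCount_; }

protected:
	/* Platform hook: each pipeline implements only this part. */
	virtual std::string platformPrepareIsp() = 0;

private:
	uint64_t frameCount_ = 0;
};

/* VC4-specific derived class: only the platform hook lives here. */
class IpaVc4 : public IpaBase {
protected:
	std::string platformPrepareIsp() override
	{
		return "vc4-isp-params";
	}
};
```

A third-party IPA would subclass IpaBase the same way IpaVc4 does, overriding
only the platform hooks while inheriting all the housekeeping for free.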
Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 src/ipa/meson.build                            |    5 +-
 .../raspberrypi.cpp => common/ipa_base.cpp}    | 1219 +++++------------
 src/ipa/rpi/common/ipa_base.h                  |  125 ++
 src/ipa/rpi/common/meson.build                 |    7 +
 src/ipa/rpi/vc4/meson.build                    |    8 +-
 src/ipa/rpi/vc4/vc4.cpp                        |  540 ++++++++
 6 files changed, 1012 insertions(+), 892 deletions(-)
 rename src/ipa/rpi/{vc4/raspberrypi.cpp => common/ipa_base.cpp} (65%)
 create mode 100644 src/ipa/rpi/common/ipa_base.h
 create mode 100644 src/ipa/rpi/common/meson.build
 create mode 100644 src/ipa/rpi/vc4/vc4.cpp

diff --git a/src/ipa/meson.build b/src/ipa/meson.build
index 10d3b44ca7b6..0c622c38a4a0 100644
--- a/src/ipa/meson.build
+++ b/src/ipa/meson.build
@@ -37,13 +37,14 @@ endif
 
 enabled_ipa_modules = []
 
-# If the Raspberry Pi VC4 IPA is enabled, ensure we include the rpi/cam_helper
-# and rpi/controller subdirectories in the build.
+# If the Raspberry Pi VC4 IPA is enabled, ensure we include the cam_helper,
+# common and controller subdirectories in the build.
 #
 # This is done here and not within rpi/vc4/meson.build as meson does not
 # allow the subdir command to traverse up the directory tree.
if pipelines.contains('rpi/vc4') subdir('rpi/cam_helper') + subdir('rpi/common') subdir('rpi/controller') endif diff --git a/src/ipa/rpi/vc4/raspberrypi.cpp b/src/ipa/rpi/common/ipa_base.cpp similarity index 65% rename from src/ipa/rpi/vc4/raspberrypi.cpp rename to src/ipa/rpi/common/ipa_base.cpp index e3ca8e2f7cbe..becada5973ad 100644 --- a/src/ipa/rpi/vc4/raspberrypi.cpp +++ b/src/ipa/rpi/common/ipa_base.cpp @@ -1,60 +1,30 @@ /* SPDX-License-Identifier: BSD-2-Clause */ /* - * Copyright (C) 2019-2021, Raspberry Pi Ltd + * Copyright (C) 2019-2023, Raspberry Pi Ltd * - * rpi.cpp - Raspberry Pi Image Processing Algorithms + * ipa_base.cpp - Raspberry Pi IPA base class */ -#include -#include -#include -#include -#include -#include -#include -#include -#include -#include +#include "ipa_base.h" -#include +#include #include -#include #include - #include -#include -#include -#include - -#include -#include -#include -#include "libcamera/internal/mapped_framebuffer.h" - -#include "cam_helper/cam_helper.h" #include "controller/af_algorithm.h" #include "controller/af_status.h" #include "controller/agc_algorithm.h" -#include "controller/agc_status.h" -#include "controller/alsc_status.h" #include "controller/awb_algorithm.h" #include "controller/awb_status.h" #include "controller/black_level_status.h" #include "controller/ccm_algorithm.h" #include "controller/ccm_status.h" #include "controller/contrast_algorithm.h" -#include "controller/contrast_status.h" -#include "controller/controller.h" #include "controller/denoise_algorithm.h" -#include "controller/denoise_status.h" -#include "controller/dpc_status.h" -#include "controller/geq_status.h" #include "controller/lux_status.h" -#include "controller/metadata.h" #include "controller/sharpen_algorithm.h" -#include "controller/sharpen_status.h" #include "controller/statistics.h" namespace libcamera { @@ -62,8 +32,7 @@ namespace libcamera { using namespace std::literals::chrono_literals; using utils::Duration; -/* Number of metadata 
objects available in the context list. */ -constexpr unsigned int numMetadataContexts = 16; +namespace { /* Number of frame length times to hold in the queue. */ constexpr unsigned int FrameLengthsQueueSize = 10; @@ -80,7 +49,7 @@ constexpr Duration defaultMaxFrameDuration = 250.0s; * we rate-limit the controller Prepare() and Process() calls to lower than or * equal to this rate. */ -constexpr Duration controllerMinFrameDuration = 1.0s / 30.0; +static constexpr Duration controllerMinFrameDuration = 1.0s / 30.0; /* List of controls handled by the Raspberry Pi IPA */ static const ControlInfoMap::Map ipaControls{ @@ -116,118 +85,23 @@ static const ControlInfoMap::Map ipaAfControls{ { &controls::LensPosition, ControlInfo(0.0f, 32.0f, 1.0f) } }; +} /* namespace */ + LOG_DEFINE_CATEGORY(IPARPI) namespace ipa::RPi { -class IPARPi : public IPARPiInterface +IpaBase::IpaBase() + : frameCount_(0), mistrustCount_(0), lastRunTimestamp_(0), firstStart_(true), + controller_() { -public: - IPARPi() - : controller_(), frameCount_(0), checkCount_(0), mistrustCount_(0), - lastRunTimestamp_(0), lsTable_(nullptr), firstStart_(true), - lastTimeout_(0s) - { - } - - ~IPARPi() - { - if (lsTable_) - munmap(lsTable_, MaxLsGridSize); - } - - int init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) override; - void start(const ControlList &controls, StartResult *result) override; - void stop() override {} - - int configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, - ConfigResult *result) override; - void mapBuffers(const std::vector &buffers) override; - void unmapBuffers(const std::vector &ids) override; - void prepareIsp(const PrepareParams ¶ms) override; - void processStats(const ProcessParams ¶ms) override; - -private: - void setMode(const IPACameraSensorInfo &sensorInfo); - bool validateSensorControls(); - bool validateIspControls(); - bool validateLensControls(); - void applyControls(const ControlList &controls); - void prepare(const 
PrepareParams ¶ms); - void reportMetadata(unsigned int ipaContext, ControlList *controls) override; - void fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext); - RPiController::StatisticsPtr fillStatistics(bcm2835_isp_stats *stats) const; - void process(unsigned int bufferId, unsigned int ipaContext); - void setCameraTimeoutValue(); - void applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration); - void applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls); - void applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls); - void applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls); - void applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls); - void applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls); - void applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls); - void applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls); - void applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls); - void applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls); - void applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls); - void applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls); - void applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls); - void resampleTable(uint16_t dest[], const std::vector &src, int destW, int destH); - - std::map buffers_; - - ControlInfoMap sensorCtrls_; - ControlInfoMap ispCtrls_; - ControlInfoMap lensCtrls_; - bool lensPresent_; - ControlList libcameraMetadata_; - - /* Camera sensor params. */ - CameraMode mode_; - - /* Raspberry Pi controller specific defines. */ - std::unique_ptr helper_; - RPiController::Controller controller_; - std::array rpiMetadata_; - - /* - * We count frames to decide if the frame must be hidden (e.g. from - * display) or mistrusted (i.e. not given to the control algos). 
- */ - uint64_t frameCount_; - - /* For checking the sequencing of Prepare/Process calls. */ - uint64_t checkCount_; - - /* How many frames we should avoid running control algos on. */ - unsigned int mistrustCount_; - - /* Number of frames that need to be dropped on startup. */ - unsigned int dropFrameCount_; - - /* Frame timestamp for the last run of the controller. */ - uint64_t lastRunTimestamp_; - - /* Do we run a Controller::process() for this frame? */ - bool processPending_; - - /* LS table allocation passed in from the pipeline handler. */ - SharedFD lsTableHandle_; - void *lsTable_; - - /* Distinguish the first camera start from others. */ - bool firstStart_; - - /* Frame duration (1/fps) limits. */ - Duration minFrameDuration_; - Duration maxFrameDuration_; +} - /* Track the frame length times over FrameLengthsQueueSize frames. */ - std::deque frameLengths_; - Duration lastTimeout_; -}; +IpaBase::~IpaBase() +{ +} -int IPARPi::init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) +int32_t IpaBase::init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) { /* * Load the "helper" for this sensor. 
This tells us all the device specific stuff @@ -264,14 +138,6 @@ int IPARPi::init(const IPASettings &settings, const InitParams ¶ms, InitResu return ret; } - const std::string &target = controller_.getTarget(); - if (target != "bcm2835") { - LOG(IPARPI, Error) - << "Tuning data file target returned \"" << target << "\"" - << ", expected \"bcm2835\""; - return -EINVAL; - } - lensPresent_ = params.lensPresent; controller_.initialise(); @@ -282,10 +148,88 @@ int IPARPi::init(const IPASettings &settings, const InitParams ¶ms, InitResu ctrlMap.merge(ControlInfoMap::Map(ipaAfControls)); result->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls); - return 0; + return platformInit(params, result); } -void IPARPi::start(const ControlList &controls, StartResult *result) +int32_t IpaBase::configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, + ConfigResult *result) +{ + sensorCtrls_ = params.sensorControls; + + if (!validateSensorControls()) { + LOG(IPARPI, Error) << "Sensor control validation failed."; + return -1; + } + + if (lensPresent_) { + lensCtrls_ = params.lensControls; + if (!validateLensControls()) { + LOG(IPARPI, Warning) << "Lens validation failed, " + << "no lens control will be available."; + lensPresent_ = false; + } + } + + /* Setup a metadata ControlList to output metadata. */ + libcameraMetadata_ = ControlList(controls::controls); + + /* Re-assemble camera mode using the sensor info. */ + setMode(sensorInfo); + + mode_.transform = static_cast(params.transform); + + /* Pass the camera mode to the CamHelper to setup algorithms. */ + helper_->setCameraMode(mode_); + + /* + * Initialise this ControlList correctly, even if empty, in case the IPA is + * running is isolation mode (passing the ControlList through the IPC layer). + */ + ControlList ctrls(sensorCtrls_); + + /* The pipeline handler passes out the mode's sensitivity. 
*/ + result->modeSensitivity = mode_.sensitivity; + + if (firstStart_) { + /* Supply initial values for frame durations. */ + applyFrameDurations(defaultMinFrameDuration, defaultMaxFrameDuration); + + /* Supply initial values for gain and exposure. */ + AgcStatus agcStatus; + agcStatus.shutterTime = defaultExposureTime; + agcStatus.analogueGain = defaultAnalogueGain; + applyAGC(&agcStatus, ctrls); + } + + result->controls = std::move(ctrls); + + /* + * Apply the correct limits to the exposure, gain and frame duration controls + * based on the current sensor mode. + */ + ControlInfoMap::Map ctrlMap = ipaControls; + ctrlMap[&controls::FrameDurationLimits] = + ControlInfo(static_cast(mode_.minFrameDuration.get()), + static_cast(mode_.maxFrameDuration.get())); + + ctrlMap[&controls::AnalogueGain] = + ControlInfo(static_cast(mode_.minAnalogueGain), + static_cast(mode_.maxAnalogueGain)); + + ctrlMap[&controls::ExposureTime] = + ControlInfo(static_cast(mode_.minShutter.get()), + static_cast(mode_.maxShutter.get())); + + /* Declare Autofocus controls, only if we have a controllable lens */ + if (lensPresent_) + ctrlMap.merge(ControlInfoMap::Map(ipaAfControls)); + + result->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls); + + return platformConfigure(params, result); +} + +void IpaBase::start(const ControlList &controls, StartResult *result) { RPiController::Metadata metadata; @@ -320,7 +264,6 @@ void IPARPi::start(const ControlList &controls, StartResult *result) * or merely a mode switch in a running system. 
*/ frameCount_ = 0; - checkCount_ = 0; if (firstStart_) { dropFrameCount_ = helper_->hideFramesStartup(); mistrustCount_ = helper_->mistrustFramesStartup(); @@ -363,229 +306,141 @@ void IPARPi::start(const ControlList &controls, StartResult *result) lastRunTimestamp_ = 0; } -void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo) +void IpaBase::mapBuffers(const std::vector &buffers) { - mode_.bitdepth = sensorInfo.bitsPerPixel; - mode_.width = sensorInfo.outputSize.width; - mode_.height = sensorInfo.outputSize.height; - mode_.sensorWidth = sensorInfo.activeAreaSize.width; - mode_.sensorHeight = sensorInfo.activeAreaSize.height; - mode_.cropX = sensorInfo.analogCrop.x; - mode_.cropY = sensorInfo.analogCrop.y; - mode_.pixelRate = sensorInfo.pixelRate; - - /* - * Calculate scaling parameters. The scale_[xy] factors are determined - * by the ratio between the crop rectangle size and the output size. - */ - mode_.scaleX = sensorInfo.analogCrop.width / sensorInfo.outputSize.width; - mode_.scaleY = sensorInfo.analogCrop.height / sensorInfo.outputSize.height; - - /* - * We're not told by the pipeline handler how scaling is split between - * binning and digital scaling. For now, as a heuristic, assume that - * downscaling up to 2 is achieved through binning, and that any - * additional scaling is achieved through digital scaling. - * - * \todo Get the pipeline handle to provide the full data - */ - mode_.binX = std::min(2, static_cast(mode_.scaleX)); - mode_.binY = std::min(2, static_cast(mode_.scaleY)); - - /* The noise factor is the square root of the total binning factor. */ - mode_.noiseFactor = sqrt(mode_.binX * mode_.binY); - - /* - * Calculate the line length as the ratio between the line length in - * pixels and the pixel rate. 
- */ - mode_.minLineLength = sensorInfo.minLineLength * (1.0s / sensorInfo.pixelRate); - mode_.maxLineLength = sensorInfo.maxLineLength * (1.0s / sensorInfo.pixelRate); - - /* - * Set the frame length limits for the mode to ensure exposure and - * framerate calculations are clipped appropriately. - */ - mode_.minFrameLength = sensorInfo.minFrameLength; - mode_.maxFrameLength = sensorInfo.maxFrameLength; - - /* Store these for convenience. */ - mode_.minFrameDuration = mode_.minFrameLength * mode_.minLineLength; - mode_.maxFrameDuration = mode_.maxFrameLength * mode_.maxLineLength; - - /* - * Some sensors may have different sensitivities in different modes; - * the CamHelper will know the correct value. - */ - mode_.sensitivity = helper_->getModeSensitivity(mode_); - - const ControlInfo &gainCtrl = sensorCtrls_.at(V4L2_CID_ANALOGUE_GAIN); - const ControlInfo &shutterCtrl = sensorCtrls_.at(V4L2_CID_EXPOSURE); - - mode_.minAnalogueGain = helper_->gain(gainCtrl.min().get()); - mode_.maxAnalogueGain = helper_->gain(gainCtrl.max().get()); - - /* Shutter speed is calculated based on the limits of the frame durations. 
*/ - mode_.minShutter = helper_->exposure(shutterCtrl.min().get(), mode_.minLineLength); - mode_.maxShutter = Duration::max(); - helper_->getBlanking(mode_.maxShutter, - mode_.minFrameDuration, mode_.maxFrameDuration); + for (const IPABuffer &buffer : buffers) { + const FrameBuffer fb(buffer.planes); + buffers_.emplace(buffer.id, + MappedFrameBuffer(&fb, MappedFrameBuffer::MapFlag::ReadWrite)); + } } -int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, - ConfigResult *result) +void IpaBase::unmapBuffers(const std::vector &ids) { - sensorCtrls_ = params.sensorControls; - ispCtrls_ = params.ispControls; - - if (!validateSensorControls()) { - LOG(IPARPI, Error) << "Sensor control validation failed."; - return -1; - } - - if (!validateIspControls()) { - LOG(IPARPI, Error) << "ISP control validation failed."; - return -1; - } + for (unsigned int id : ids) { + auto it = buffers_.find(id); + if (it == buffers_.end()) + continue; - if (lensPresent_) { - lensCtrls_ = params.lensControls; - if (!validateLensControls()) { - LOG(IPARPI, Warning) << "Lens validation failed, " - << "no lens control will be available."; - lensPresent_ = false; - } + buffers_.erase(id); } +} - /* Setup a metadata ControlList to output metadata. */ - libcameraMetadata_ = ControlList(controls::controls); - - /* Re-assemble camera mode using the sensor info. */ - setMode(sensorInfo); - - mode_.transform = static_cast(params.transform); +void IpaBase::prepareIsp(const PrepareParams ¶ms) +{ + applyControls(params.requestControls); - /* Store the lens shading table pointer and handle if available. */ - if (params.lsTableHandle.isValid()) { - /* Remove any previous table, if there was one. */ - if (lsTable_) { - munmap(lsTable_, MaxLsGridSize); - lsTable_ = nullptr; - } + /* + * At start-up, or after a mode-switch, we may want to + * avoid running the control algos for a few frames in case + * they are "unreliable". 
+ */ + int64_t frameTimestamp = params.sensorControls.get(controls::SensorTimestamp).value_or(0); + unsigned int ipaContext = params.ipaContext % rpiMetadata_.size(); + RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; + Span embeddedBuffer; - /* Map the LS table buffer into user space. */ - lsTableHandle_ = std::move(params.lsTableHandle); - if (lsTableHandle_.isValid()) { - lsTable_ = mmap(nullptr, MaxLsGridSize, PROT_READ | PROT_WRITE, - MAP_SHARED, lsTableHandle_.get(), 0); + rpiMetadata.clear(); + fillDeviceStatus(params.sensorControls, ipaContext); - if (lsTable_ == MAP_FAILED) { - LOG(IPARPI, Error) << "dmaHeap mmap failure for LS table."; - lsTable_ = nullptr; - } - } + if (params.buffers.embedded) { + /* + * Pipeline handler has supplied us with an embedded data buffer, + * we must pass it to the CamHelper for parsing. + */ + auto it = buffers_.find(params.buffers.embedded); + ASSERT(it != buffers_.end()); + embeddedBuffer = it->second.planes()[0]; } - /* Pass the camera mode to the CamHelper to setup algorithms. */ - helper_->setCameraMode(mode_); - /* - * Initialise this ControlList correctly, even if empty, in case the IPA is - * running is isolation mode (passing the ControlList through the IPC layer). + * AGC wants to know the algorithm status from the time it actioned the + * sensor exposure/gain changes. So fetch it from the metadata list + * indexed by the IPA cookie returned, and put it in the current frame + * metadata. */ - ControlList ctrls(sensorCtrls_); - - /* The pipeline handler passes out the mode's sensitivity. */ - result->modeSensitivity = mode_.sensitivity; + AgcStatus agcStatus; + RPiController::Metadata &delayedMetadata = rpiMetadata_[params.delayContext]; + if (!delayedMetadata.get("agc.status", agcStatus)) + rpiMetadata.set("agc.delayed_status", agcStatus); - if (firstStart_) { - /* Supply initial values for frame durations. 
*/ - applyFrameDurations(defaultMinFrameDuration, defaultMaxFrameDuration); + /* + * This may overwrite the DeviceStatus using values from the sensor + * metadata, and may also do additional custom processing. + */ + helper_->prepare(embeddedBuffer, rpiMetadata); - /* Supply initial values for gain and exposure. */ - AgcStatus agcStatus; - agcStatus.shutterTime = defaultExposureTime; - agcStatus.analogueGain = defaultAnalogueGain; - applyAGC(&agcStatus, ctrls); + /* Allow a 10% margin on the comparison below. */ + Duration delta = (frameTimestamp - lastRunTimestamp_) * 1.0ns; + if (lastRunTimestamp_ && frameCount_ > dropFrameCount_ && + delta < controllerMinFrameDuration * 0.9) { + /* + * Ensure we merge the previous frame's metadata with the current + * frame. This will not overwrite exposure/gain values for the + * current frame, or any other bits of metadata that were added + * in helper_->Prepare(). + */ + RPiController::Metadata &lastMetadata = + rpiMetadata_[(ipaContext ? ipaContext : rpiMetadata_.size()) - 1]; + rpiMetadata.mergeCopy(lastMetadata); + processPending_ = false; + } else { + processPending_ = true; + lastRunTimestamp_ = frameTimestamp; } - result->controls = std::move(ctrls); - /* - * Apply the correct limits to the exposure, gain and frame duration controls - * based on the current sensor mode. + * If a statistics buffer has been passed in, call processStats + * directly now before prepare() since the statistics are available in-line + * with the Bayer frame. 
*/ - ControlInfoMap::Map ctrlMap = ipaControls; - ctrlMap[&controls::FrameDurationLimits] = - ControlInfo(static_cast(mode_.minFrameDuration.get()), - static_cast(mode_.maxFrameDuration.get())); - - ctrlMap[&controls::AnalogueGain] = - ControlInfo(static_cast(mode_.minAnalogueGain), - static_cast(mode_.maxAnalogueGain)); - - ctrlMap[&controls::ExposureTime] = - ControlInfo(static_cast(mode_.minShutter.get()), - static_cast(mode_.maxShutter.get())); - - /* Declare Autofocus controls, only if we have a controllable lens */ - if (lensPresent_) - ctrlMap.merge(ControlInfoMap::Map(ipaAfControls)); - - result->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls); - return 0; -} - -void IPARPi::mapBuffers(const std::vector &buffers) -{ - for (const IPABuffer &buffer : buffers) { - const FrameBuffer fb(buffer.planes); - buffers_.emplace(buffer.id, - MappedFrameBuffer(&fb, MappedFrameBuffer::MapFlag::ReadWrite)); + if (params.buffers.stats) + processStats({ params.buffers, params.ipaContext }); + + /* Do we need/want to call prepare? */ + if (processPending_) { + controller_.prepare(&rpiMetadata); + /* Actually prepare the ISP parameters for the frame. */ + platformPrepareIsp(params, rpiMetadata); } -} -void IPARPi::unmapBuffers(const std::vector &ids) -{ - for (unsigned int id : ids) { - auto it = buffers_.find(id); - if (it == buffers_.end()) - continue; + frameCount_++; - buffers_.erase(id); - } + /* Ready to push the input buffer into the ISP. */ + prepareIspComplete.emit(params.buffers); } -void IPARPi::processStats(const ProcessParams ¶ms) +void IpaBase::processStats(const ProcessParams ¶ms) { - unsigned int context = params.ipaContext % rpiMetadata_.size(); + unsigned int ipaContext = params.ipaContext % rpiMetadata_.size(); - if (++checkCount_ != frameCount_) /* assert here? 
*/ - LOG(IPARPI, Error) << "WARNING: Prepare/Process mismatch!!!"; - if (processPending_ && frameCount_ > mistrustCount_) - process(params.buffers.stats, context); + if (processPending_ && frameCount_ > mistrustCount_) { + RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; - processStatsComplete.emit(params.buffers); -} + auto it = buffers_.find(params.buffers.stats); + if (it == buffers_.end()) { + LOG(IPARPI, Error) << "Could not find stats buffer!"; + return; + } + RPiController::StatisticsPtr statistics = platformProcessStats(it->second.planes()[0]); -void IPARPi::prepareIsp(const PrepareParams ¶ms) -{ - applyControls(params.requestControls); + helper_->process(statistics, rpiMetadata); + controller_.process(statistics, &rpiMetadata); - /* - * At start-up, or after a mode-switch, we may want to - * avoid running the control algos for a few frames in case - * they are "unreliable". - */ - prepare(params); - frameCount_++; + struct AgcStatus agcStatus; + if (rpiMetadata.get("agc.status", agcStatus) == 0) { + ControlList ctrls(sensorCtrls_); + applyAGC(&agcStatus, ctrls); + setDelayedControls.emit(ctrls, ipaContext); + setCameraTimeoutValue(); + } + } - /* Ready to push the input buffer into the ISP. 
*/ - prepareIspComplete.emit(params.buffers); + processStatsComplete.emit(params.buffers); } -void IPARPi::reportMetadata(unsigned int ipaContext, ControlList *controls) +void IpaBase::reportMetadata(unsigned int ipaContext, ControlList *metadata) { RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext % rpiMetadata_.size()]; std::unique_lock lock(rpiMetadata); @@ -658,48 +513,132 @@ void IPARPi::reportMetadata(unsigned int ipaContext, ControlList *controls) libcameraMetadata_.set(controls::FocusFoM, focusFoM); } - CcmStatus *ccmStatus = rpiMetadata.getLocked("ccm.status"); - if (ccmStatus) { - float m[9]; - for (unsigned int i = 0; i < 9; i++) - m[i] = ccmStatus->matrix[i]; - libcameraMetadata_.set(controls::ColourCorrectionMatrix, m); - } + CcmStatus *ccmStatus = rpiMetadata.getLocked("ccm.status"); + if (ccmStatus) { + float m[9]; + for (unsigned int i = 0; i < 9; i++) + m[i] = ccmStatus->matrix[i]; + libcameraMetadata_.set(controls::ColourCorrectionMatrix, m); + } + + const AfStatus *afStatus = rpiMetadata.getLocked("af.status"); + if (afStatus) { + int32_t s, p; + switch (afStatus->state) { + case AfState::Scanning: + s = controls::AfStateScanning; + break; + case AfState::Focused: + s = controls::AfStateFocused; + break; + case AfState::Failed: + s = controls::AfStateFailed; + break; + default: + s = controls::AfStateIdle; + } + switch (afStatus->pauseState) { + case AfPauseState::Pausing: + p = controls::AfPauseStatePausing; + break; + case AfPauseState::Paused: + p = controls::AfPauseStatePaused; + break; + default: + p = controls::AfPauseStateRunning; + } + libcameraMetadata_.set(controls::AfState, s); + libcameraMetadata_.set(controls::AfPauseState, p); + } + + *metadata = std::move(libcameraMetadata_); +} + +void IpaBase::setMode(const IPACameraSensorInfo &sensorInfo) +{ + mode_.bitdepth = sensorInfo.bitsPerPixel; + mode_.width = sensorInfo.outputSize.width; + mode_.height = sensorInfo.outputSize.height; + mode_.sensorWidth = 
sensorInfo.activeAreaSize.width; + mode_.sensorHeight = sensorInfo.activeAreaSize.height; + mode_.cropX = sensorInfo.analogCrop.x; + mode_.cropY = sensorInfo.analogCrop.y; + mode_.pixelRate = sensorInfo.pixelRate; + + /* + * Calculate scaling parameters. The scale_[xy] factors are determined + * by the ratio between the crop rectangle size and the output size. + */ + mode_.scaleX = sensorInfo.analogCrop.width / sensorInfo.outputSize.width; + mode_.scaleY = sensorInfo.analogCrop.height / sensorInfo.outputSize.height; + + /* + * We're not told by the pipeline handler how scaling is split between + * binning and digital scaling. For now, as a heuristic, assume that + * downscaling up to 2 is achieved through binning, and that any + * additional scaling is achieved through digital scaling. + * + * \todo Get the pipeline handle to provide the full data + */ + mode_.binX = std::min(2, static_cast(mode_.scaleX)); + mode_.binY = std::min(2, static_cast(mode_.scaleY)); + + /* The noise factor is the square root of the total binning factor. */ + mode_.noiseFactor = std::sqrt(mode_.binX * mode_.binY); + + /* + * Calculate the line length as the ratio between the line length in + * pixels and the pixel rate. + */ + mode_.minLineLength = sensorInfo.minLineLength * (1.0s / sensorInfo.pixelRate); + mode_.maxLineLength = sensorInfo.maxLineLength * (1.0s / sensorInfo.pixelRate); + + /* + * Set the frame length limits for the mode to ensure exposure and + * framerate calculations are clipped appropriately. + */ + mode_.minFrameLength = sensorInfo.minFrameLength; + mode_.maxFrameLength = sensorInfo.maxFrameLength; + + /* Store these for convenience. */ + mode_.minFrameDuration = mode_.minFrameLength * mode_.minLineLength; + mode_.maxFrameDuration = mode_.maxFrameLength * mode_.maxLineLength; + + /* + * Some sensors may have different sensitivities in different modes; + * the CamHelper will know the correct value. 
+ */ + mode_.sensitivity = helper_->getModeSensitivity(mode_); + + const ControlInfo &gainCtrl = sensorCtrls_.at(V4L2_CID_ANALOGUE_GAIN); + const ControlInfo &shutterCtrl = sensorCtrls_.at(V4L2_CID_EXPOSURE); + + mode_.minAnalogueGain = helper_->gain(gainCtrl.min().get()); + mode_.maxAnalogueGain = helper_->gain(gainCtrl.max().get()); + + /* Shutter speed is calculated based on the limits of the frame durations. */ + mode_.minShutter = helper_->exposure(shutterCtrl.min().get(), mode_.minLineLength); + mode_.maxShutter = Duration::max(); + helper_->getBlanking(mode_.maxShutter, + mode_.minFrameDuration, mode_.maxFrameDuration); +} + +void IpaBase::setCameraTimeoutValue() +{ + /* + * Take the maximum value of the exposure queue as the camera timeout + * value to pass back to the pipeline handler. Only signal if it has changed + * from the last set value. + */ + auto max = std::max_element(frameLengths_.begin(), frameLengths_.end()); - const AfStatus *afStatus = rpiMetadata.getLocked("af.status"); - if (afStatus) { - int32_t s, p; - switch (afStatus->state) { - case AfState::Scanning: - s = controls::AfStateScanning; - break; - case AfState::Focused: - s = controls::AfStateFocused; - break; - case AfState::Failed: - s = controls::AfStateFailed; - break; - default: - s = controls::AfStateIdle; - } - switch (afStatus->pauseState) { - case AfPauseState::Pausing: - p = controls::AfPauseStatePausing; - break; - case AfPauseState::Paused: - p = controls::AfPauseStatePaused; - break; - default: - p = controls::AfPauseStateRunning; - } - libcameraMetadata_.set(controls::AfState, s); - libcameraMetadata_.set(controls::AfPauseState, p); + if (*max != lastTimeout_) { + setCameraTimeout.emit(max->get()); + lastTimeout_ = *max; } - - *controls = std::move(libcameraMetadata_); } -bool IPARPi::validateSensorControls() +bool IpaBase::validateSensorControls() { static const uint32_t ctrls[] = { V4L2_CID_ANALOGUE_GAIN, @@ -719,35 +658,7 @@ bool IPARPi::validateSensorControls() return 
true; } -bool IPARPi::validateIspControls() -{ - static const uint32_t ctrls[] = { - V4L2_CID_RED_BALANCE, - V4L2_CID_BLUE_BALANCE, - V4L2_CID_DIGITAL_GAIN, - V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, - V4L2_CID_USER_BCM2835_ISP_GAMMA, - V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, - V4L2_CID_USER_BCM2835_ISP_GEQ, - V4L2_CID_USER_BCM2835_ISP_DENOISE, - V4L2_CID_USER_BCM2835_ISP_SHARPEN, - V4L2_CID_USER_BCM2835_ISP_DPC, - V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, - V4L2_CID_USER_BCM2835_ISP_CDN, - }; - - for (auto c : ctrls) { - if (ispCtrls_.find(c) == ispCtrls_.end()) { - LOG(IPARPI, Error) << "Unable to find ISP control " - << utils::hex(c); - return false; - } - } - - return true; -} - -bool IPARPi::validateLensControls() +bool IpaBase::validateLensControls() { if (lensCtrls_.find(V4L2_CID_FOCUS_ABSOLUTE) == lensCtrls_.end()) { LOG(IPARPI, Error) << "Unable to find Lens control V4L2_CID_FOCUS_ABSOLUTE"; @@ -820,7 +731,7 @@ static const std::map AfPauseTable { controls::AfPauseResume, RPiController::AfAlgorithm::AfPauseResume }, }; -void IPARPi::applyControls(const ControlList &controls) +void IpaBase::applyControls(const ControlList &controls) { using RPiController::AfAlgorithm; @@ -840,7 +751,7 @@ void IPARPi::applyControls(const ControlList &controls) if (mode == AfModeTable.end()) { LOG(IPARPI, Error) << "AF mode " << idx << " not recognised"; - } else + } else if (af) af->setMode(mode->second); } @@ -1112,6 +1023,10 @@ void IPARPi::applyControls(const ControlList &controls) case controls::NOISE_REDUCTION_MODE: { RPiController::DenoiseAlgorithm *sdn = dynamic_cast( controller_.getAlgorithm("SDN")); + /* Some platforms may have a combined "denoise" algorithm instead. 
*/ + if (!sdn) + sdn = dynamic_cast( + controller_.getAlgorithm("denoise")); if (!sdn) { LOG(IPARPI, Warning) << "Could not set NOISE_REDUCTION_MODE - no SDN algorithm"; @@ -1248,125 +1163,12 @@ void IPARPi::applyControls(const ControlList &controls) break; } } -} - -void IPARPi::prepare(const PrepareParams ¶ms) -{ - int64_t frameTimestamp = params.sensorControls.get(controls::SensorTimestamp).value_or(0); - unsigned int ipaContext = params.ipaContext % rpiMetadata_.size(); - RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; - Span embeddedBuffer; - - rpiMetadata.clear(); - fillDeviceStatus(params.sensorControls, ipaContext); - - if (params.buffers.embedded) { - /* - * Pipeline handler has supplied us with an embedded data buffer, - * we must pass it to the CamHelper for parsing. - */ - auto it = buffers_.find(params.buffers.embedded); - ASSERT(it != buffers_.end()); - embeddedBuffer = it->second.planes()[0]; - } - - /* - * AGC wants to know the algorithm status from the time it actioned the - * sensor exposure/gain changes. So fetch it from the metadata list - * indexed by the IPA cookie returned, and put it in the current frame - * metadata. - */ - AgcStatus agcStatus; - RPiController::Metadata &delayedMetadata = rpiMetadata_[params.delayContext]; - if (!delayedMetadata.get("agc.status", agcStatus)) - rpiMetadata.set("agc.delayed_status", agcStatus); - - /* - * This may overwrite the DeviceStatus using values from the sensor - * metadata, and may also do additional custom processing. - */ - helper_->prepare(embeddedBuffer, rpiMetadata); - - /* Allow a 10% margin on the comparison below. */ - Duration delta = (frameTimestamp - lastRunTimestamp_) * 1.0ns; - if (lastRunTimestamp_ && frameCount_ > dropFrameCount_ && - delta < controllerMinFrameDuration * 0.9) { - /* - * Ensure we merge the previous frame's metadata with the current - * frame. 
This will not overwrite exposure/gain values for the - * current frame, or any other bits of metadata that were added - * in helper_->Prepare(). - */ - RPiController::Metadata &lastMetadata = - rpiMetadata_[(ipaContext ? ipaContext : rpiMetadata_.size()) - 1]; - rpiMetadata.mergeCopy(lastMetadata); - processPending_ = false; - return; - } - - lastRunTimestamp_ = frameTimestamp; - processPending_ = true; - - ControlList ctrls(ispCtrls_); - - controller_.prepare(&rpiMetadata); - - /* Lock the metadata buffer to avoid constant locks/unlocks. */ - std::unique_lock lock(rpiMetadata); - - AwbStatus *awbStatus = rpiMetadata.getLocked("awb.status"); - if (awbStatus) - applyAWB(awbStatus, ctrls); - - CcmStatus *ccmStatus = rpiMetadata.getLocked("ccm.status"); - if (ccmStatus) - applyCCM(ccmStatus, ctrls); - - AgcStatus *dgStatus = rpiMetadata.getLocked("agc.status"); - if (dgStatus) - applyDG(dgStatus, ctrls); - - AlscStatus *lsStatus = rpiMetadata.getLocked("alsc.status"); - if (lsStatus) - applyLS(lsStatus, ctrls); - - ContrastStatus *contrastStatus = rpiMetadata.getLocked("contrast.status"); - if (contrastStatus) - applyGamma(contrastStatus, ctrls); - - BlackLevelStatus *blackLevelStatus = rpiMetadata.getLocked("black_level.status"); - if (blackLevelStatus) - applyBlackLevel(blackLevelStatus, ctrls); - - GeqStatus *geqStatus = rpiMetadata.getLocked("geq.status"); - if (geqStatus) - applyGEQ(geqStatus, ctrls); - - DenoiseStatus *denoiseStatus = rpiMetadata.getLocked("denoise.status"); - if (denoiseStatus) - applyDenoise(denoiseStatus, ctrls); - - SharpenStatus *sharpenStatus = rpiMetadata.getLocked("sharpen.status"); - if (sharpenStatus) - applySharpen(sharpenStatus, ctrls); - - DpcStatus *dpcStatus = rpiMetadata.getLocked("dpc.status"); - if (dpcStatus) - applyDPC(dpcStatus, ctrls); - - const AfStatus *afStatus = rpiMetadata.getLocked("af.status"); - if (afStatus) { - ControlList lensctrls(lensCtrls_); - applyAF(afStatus, lensctrls); - if (!lensctrls.empty()) - 
setLensControls.emit(lensctrls); - } - if (!ctrls.empty()) - setIspControls.emit(ctrls); + /* Give derived classes a chance to examine the new controls. */ + handleControls(controls); } -void IPARPi::fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext) +void IpaBase::fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext) { DeviceStatus deviceStatus = {}; @@ -1390,103 +1192,7 @@ void IPARPi::fillDeviceStatus(const ControlList &sensorControls, unsigned int ip rpiMetadata_[ipaContext].set("device.status", deviceStatus); } -RPiController::StatisticsPtr IPARPi::fillStatistics(bcm2835_isp_stats *stats) const -{ - using namespace RPiController; - - const Controller::HardwareConfig &hw = controller_.getHardwareConfig(); - unsigned int i; - StatisticsPtr statistics = - std::make_unique(Statistics::AgcStatsPos::PreWb, Statistics::ColourStatsPos::PostLsc); - - /* RGB histograms are not used, so do not populate them. */ - statistics->yHist = RPiController::Histogram(stats->hist[0].g_hist, - hw.numHistogramBins); - - /* All region sums are based on a 16-bit normalised pipeline bit-depth. 
*/ - unsigned int scale = Statistics::NormalisationFactorPow2 - hw.pipelineWidth; - - statistics->awbRegions.init(hw.awbRegions); - for (i = 0; i < statistics->awbRegions.numRegions(); i++) - statistics->awbRegions.set(i, { { stats->awb_stats[i].r_sum << scale, - stats->awb_stats[i].g_sum << scale, - stats->awb_stats[i].b_sum << scale }, - stats->awb_stats[i].counted, - stats->awb_stats[i].notcounted }); - - statistics->agcRegions.init(hw.agcRegions); - for (i = 0; i < statistics->agcRegions.numRegions(); i++) - statistics->agcRegions.set(i, { { stats->agc_stats[i].r_sum << scale, - stats->agc_stats[i].g_sum << scale, - stats->agc_stats[i].b_sum << scale }, - stats->agc_stats[i].counted, - stats->awb_stats[i].notcounted }); - - statistics->focusRegions.init(hw.focusRegions); - for (i = 0; i < statistics->focusRegions.numRegions(); i++) - statistics->focusRegions.set(i, { stats->focus_stats[i].contrast_val[1][1] / 1000, - stats->focus_stats[i].contrast_val_num[1][1], - stats->focus_stats[i].contrast_val_num[1][0] }); - return statistics; -} - -void IPARPi::process(unsigned int bufferId, unsigned int ipaContext) -{ - RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; - - auto it = buffers_.find(bufferId); - if (it == buffers_.end()) { - LOG(IPARPI, Error) << "Could not find stats buffer!"; - return; - } - - Span mem = it->second.planes()[0]; - bcm2835_isp_stats *stats = reinterpret_cast(mem.data()); - RPiController::StatisticsPtr statistics = fillStatistics(stats); - - /* Save the focus stats in the metadata structure to report out later. 
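The 16-bit normalisation applied to the region sums above can be shown standalone. `normaliseSum()` is a hypothetical helper, and the constant 16 stands in for `Statistics::NormalisationFactorPow2`.

```cpp
#include <cassert>
#include <cstdint>

/*
 * Sketch of the region-sum normalisation above: raw sums produced by a
 * pipeline of pipelineWidth bits are left-shifted so that all
 * statistics share a common 16-bit normalised bit-depth.
 */
uint64_t normaliseSum(uint64_t rawSum, unsigned int pipelineWidth)
{
	/* scale = NormalisationFactorPow2 - pipelineWidth */
	unsigned int scale = 16 - pipelineWidth;
	return rawSum << scale;
}
```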
*/ - rpiMetadata_[ipaContext].set("focus.status", statistics->focusRegions); - - helper_->process(statistics, rpiMetadata); - controller_.process(statistics, &rpiMetadata); - - struct AgcStatus agcStatus; - if (rpiMetadata.get("agc.status", agcStatus) == 0) { - ControlList ctrls(sensorCtrls_); - applyAGC(&agcStatus, ctrls); - - setDelayedControls.emit(ctrls, ipaContext); - setCameraTimeoutValue(); - } -} - -void IPARPi::setCameraTimeoutValue() -{ - /* - * Take the maximum value of the exposure queue as the camera timeout - * value to pass back to the pipeline handler. Only signal if it has changed - * from the last set value. - */ - auto max = std::max_element(frameLengths_.begin(), frameLengths_.end()); - - if (*max != lastTimeout_) { - setCameraTimeout.emit(max->get()); - lastTimeout_ = *max; - } -} - -void IPARPi::applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls) -{ - LOG(IPARPI, Debug) << "Applying WB R: " << awbStatus->gainR << " B: " - << awbStatus->gainB; - - ctrls.set(V4L2_CID_RED_BALANCE, - static_cast(awbStatus->gainR * 1000)); - ctrls.set(V4L2_CID_BLUE_BALANCE, - static_cast(awbStatus->gainB * 1000)); -} - -void IPARPi::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration) +void IpaBase::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration) { /* * This will only be applied once AGC recalculations occur. 
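The camera-timeout behaviour in `setCameraTimeoutValue()` above, which signals only when the maximum of the frame-length queue changes, can be sketched as follows. `TimeoutTracker` and `countTimeoutSignals()` are illustrative stand-ins, with a counter replacing the `setCameraTimeout.emit()` signal.

```cpp
#include <algorithm>
#include <cassert>
#include <deque>
#include <vector>

/*
 * Sketch of the timeout logic above: take the longest frame duration
 * in the recent queue and report it only when it differs from the
 * previously reported value.
 */
struct TimeoutTracker {
	double lastTimeout = 0.0;
	int emitted = 0;

	void update(const std::deque<double> &frameLengths)
	{
		double max = *std::max_element(frameLengths.begin(),
					       frameLengths.end());
		if (max != lastTimeout) {
			lastTimeout = max;
			emitted++; /* stands in for setCameraTimeout.emit() */
		}
	}
};

/* Count how many times a sequence of queue snapshots would signal. */
int countTimeoutSignals(const std::vector<std::deque<double>> &frames)
{
	TimeoutTracker t;
	for (const auto &q : frames)
		t.update(q);
	return t.emitted;
}
```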
@@ -1518,7 +1224,7 @@ void IPARPi::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDur agc->setMaxShutter(maxShutter); } -void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls) +void IpaBase::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls) { const int32_t minGainCode = helper_->gainCode(mode_.minAnalogueGain); const int32_t maxGainCode = helper_->gainCode(mode_.maxAnalogueGain); @@ -1570,269 +1276,6 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls) helper_->hblankToLineLength(hblank))); } -void IPARPi::applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls) -{ - ctrls.set(V4L2_CID_DIGITAL_GAIN, - static_cast(dgStatus->digitalGain * 1000)); -} - -void IPARPi::applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls) -{ - bcm2835_isp_custom_ccm ccm; - - for (int i = 0; i < 9; i++) { - ccm.ccm.ccm[i / 3][i % 3].den = 1000; - ccm.ccm.ccm[i / 3][i % 3].num = 1000 * ccmStatus->matrix[i]; - } - - ccm.enabled = 1; - ccm.ccm.offsets[0] = ccm.ccm.offsets[1] = ccm.ccm.offsets[2] = 0; - - ControlValue c(Span{ reinterpret_cast(&ccm), - sizeof(ccm) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, c); -} - -void IPARPi::applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls) -{ - const unsigned int numGammaPoints = controller_.getHardwareConfig().numGammaPoints; - struct bcm2835_isp_gamma gamma; - - for (unsigned int i = 0; i < numGammaPoints - 1; i++) { - int x = i < 16 ? i * 1024 - : (i < 24 ? 
(i - 16) * 2048 + 16384 - : (i - 24) * 4096 + 32768); - gamma.x[i] = x; - gamma.y[i] = std::min(65535, contrastStatus->gammaCurve.eval(x)); - } - - gamma.x[numGammaPoints - 1] = 65535; - gamma.y[numGammaPoints - 1] = 65535; - gamma.enabled = 1; - - ControlValue c(Span{ reinterpret_cast(&gamma), - sizeof(gamma) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_GAMMA, c); -} - -void IPARPi::applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls) -{ - bcm2835_isp_black_level blackLevel; - - blackLevel.enabled = 1; - blackLevel.black_level_r = blackLevelStatus->blackLevelR; - blackLevel.black_level_g = blackLevelStatus->blackLevelG; - blackLevel.black_level_b = blackLevelStatus->blackLevelB; - - ControlValue c(Span{ reinterpret_cast(&blackLevel), - sizeof(blackLevel) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, c); -} - -void IPARPi::applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls) -{ - bcm2835_isp_geq geq; - - geq.enabled = 1; - geq.offset = geqStatus->offset; - geq.slope.den = 1000; - geq.slope.num = 1000 * geqStatus->slope; - - ControlValue c(Span{ reinterpret_cast(&geq), - sizeof(geq) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_GEQ, c); -} - -void IPARPi::applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls) -{ - using RPiController::DenoiseMode; - - bcm2835_isp_denoise denoise; - DenoiseMode mode = static_cast(denoiseStatus->mode); - - denoise.enabled = mode != DenoiseMode::Off; - denoise.constant = denoiseStatus->noiseConstant; - denoise.slope.num = 1000 * denoiseStatus->noiseSlope; - denoise.slope.den = 1000; - denoise.strength.num = 1000 * denoiseStatus->strength; - denoise.strength.den = 1000; - - /* Set the CDN mode to match the SDN operating mode. 
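The gamma-curve x spacing used in `applyGamma()` above can be sketched on its own: points are packed more densely at the dark end and spaced progressively wider towards the top of the 16-bit range. `gammaX()` is a hypothetical helper reproducing only that piecewise spacing.

```cpp
#include <cassert>

/*
 * Sketch of the gamma x-spacing above: steps of 1024 for the first 16
 * points, 2048 for the next 8, and 4096 thereafter, covering the
 * 16-bit input range with finer resolution in the shadows.
 */
int gammaX(unsigned int i)
{
	return i < 16 ? i * 1024
		      : (i < 24 ? (i - 16) * 2048 + 16384
				: (i - 24) * 4096 + 32768);
}
```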
*/ - bcm2835_isp_cdn cdn; - switch (mode) { - case DenoiseMode::ColourFast: - cdn.enabled = 1; - cdn.mode = CDN_MODE_FAST; - break; - case DenoiseMode::ColourHighQuality: - cdn.enabled = 1; - cdn.mode = CDN_MODE_HIGH_QUALITY; - break; - default: - cdn.enabled = 0; - } - - ControlValue c(Span{ reinterpret_cast(&denoise), - sizeof(denoise) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_DENOISE, c); - - c = ControlValue(Span{ reinterpret_cast(&cdn), - sizeof(cdn) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_CDN, c); -} - -void IPARPi::applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls) -{ - bcm2835_isp_sharpen sharpen; - - sharpen.enabled = 1; - sharpen.threshold.num = 1000 * sharpenStatus->threshold; - sharpen.threshold.den = 1000; - sharpen.strength.num = 1000 * sharpenStatus->strength; - sharpen.strength.den = 1000; - sharpen.limit.num = 1000 * sharpenStatus->limit; - sharpen.limit.den = 1000; - - ControlValue c(Span{ reinterpret_cast(&sharpen), - sizeof(sharpen) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_SHARPEN, c); -} - -void IPARPi::applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls) -{ - bcm2835_isp_dpc dpc; - - dpc.enabled = 1; - dpc.strength = dpcStatus->strength; - - ControlValue c(Span{ reinterpret_cast(&dpc), - sizeof(dpc) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_DPC, c); -} - -void IPARPi::applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls) -{ - /* - * Program lens shading tables into pipeline. - * Choose smallest cell size that won't exceed 63x48 cells. 
- */ - const int cellSizes[] = { 16, 32, 64, 128, 256 }; - unsigned int numCells = std::size(cellSizes); - unsigned int i, w, h, cellSize; - for (i = 0; i < numCells; i++) { - cellSize = cellSizes[i]; - w = (mode_.width + cellSize - 1) / cellSize; - h = (mode_.height + cellSize - 1) / cellSize; - if (w < 64 && h <= 48) - break; - } - - if (i == numCells) { - LOG(IPARPI, Error) << "Cannot find cell size"; - return; - } - - /* We're going to supply corner sampled tables, 16 bit samples. */ - w++, h++; - bcm2835_isp_lens_shading ls = { - .enabled = 1, - .grid_cell_size = cellSize, - .grid_width = w, - .grid_stride = w, - .grid_height = h, - /* .dmabuf will be filled in by pipeline handler. */ - .dmabuf = 0, - .ref_transform = 0, - .corner_sampled = 1, - .gain_format = GAIN_FORMAT_U4P10 - }; - - if (!lsTable_ || w * h * 4 * sizeof(uint16_t) > MaxLsGridSize) { - LOG(IPARPI, Error) << "Do not have a correctly allocate lens shading table!"; - return; - } - - if (lsStatus) { - /* Format will be u4.10 */ - uint16_t *grid = static_cast(lsTable_); - - resampleTable(grid, lsStatus->r, w, h); - resampleTable(grid + w * h, lsStatus->g, w, h); - std::memcpy(grid + 2 * w * h, grid + w * h, w * h * sizeof(uint16_t)); - resampleTable(grid + 3 * w * h, lsStatus->b, w, h); - } - - ControlValue c(Span{ reinterpret_cast(&ls), - sizeof(ls) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, c); -} - -void IPARPi::applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls) -{ - if (afStatus->lensSetting) { - ControlValue v(afStatus->lensSetting.value()); - lensCtrls.set(V4L2_CID_FOCUS_ABSOLUTE, v); - } -} - -/* - * Resamples a 16x12 table with central sampling to destW x destH with corner - * sampling. - */ -void IPARPi::resampleTable(uint16_t dest[], const std::vector &src, - int destW, int destH) -{ - /* - * Precalculate and cache the x sampling locations and phases to - * save recomputing them on every row. 
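The lens-shading cell-size search above can be sketched standalone: the smallest candidate cell size whose grid stays within the 63x48 limit is chosen. `chooseCellSize()` is a hypothetical helper; it returns 0 where the original logs an error.

```cpp
#include <cassert>

/*
 * Sketch of the cell-size search above. Note the asymmetric bound in
 * the original: width is limited to fewer than 64 cells (i.e. at most
 * 63) while height may be exactly 48.
 */
unsigned int chooseCellSize(unsigned int width, unsigned int height)
{
	static const unsigned int cellSizes[] = { 16, 32, 64, 128, 256 };

	for (unsigned int cellSize : cellSizes) {
		/* Round the grid dimensions up to cover the whole image. */
		unsigned int w = (width + cellSize - 1) / cellSize;
		unsigned int h = (height + cellSize - 1) / cellSize;
		if (w < 64 && h <= 48)
			return cellSize;
	}

	return 0; /* no candidate fits; the original reports an error */
}
```

For a 4056x3040 sensor, a 64-pixel cell would give exactly 64x48 cells, which fails the `w < 64` test, so the search settles on 128-pixel cells.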
- */ - assert(destW > 1 && destH > 1 && destW <= 64); - int xLo[64], xHi[64]; - double xf[64]; - double x = -0.5, xInc = 16.0 / (destW - 1); - for (int i = 0; i < destW; i++, x += xInc) { - xLo[i] = floor(x); - xf[i] = x - xLo[i]; - xHi[i] = xLo[i] < 15 ? xLo[i] + 1 : 15; - xLo[i] = xLo[i] > 0 ? xLo[i] : 0; - } - - /* Now march over the output table generating the new values. */ - double y = -0.5, yInc = 12.0 / (destH - 1); - for (int j = 0; j < destH; j++, y += yInc) { - int yLo = floor(y); - double yf = y - yLo; - int yHi = yLo < 11 ? yLo + 1 : 11; - yLo = yLo > 0 ? yLo : 0; - double const *rowAbove = src.data() + yLo * 16; - double const *rowBelow = src.data() + yHi * 16; - for (int i = 0; i < destW; i++) { - double above = rowAbove[xLo[i]] * (1 - xf[i]) + rowAbove[xHi[i]] * xf[i]; - double below = rowBelow[xLo[i]] * (1 - xf[i]) + rowBelow[xHi[i]] * xf[i]; - int result = floor(1024 * (above * (1 - yf) + below * yf) + .5); - *(dest++) = result > 16383 ? 16383 : result; /* want u4.10 */ - } - } -} - } /* namespace ipa::RPi */ -/* - * External IPA module interface - */ -extern "C" { -const struct IPAModuleInfo ipaModuleInfo = { - IPA_MODULE_API_VERSION, - 1, - "PipelineHandlerRPi", - "vc4", -}; - -IPAInterface *ipaCreate() -{ - return new ipa::RPi::IPARPi(); -} - -} /* extern "C" */ - } /* namespace libcamera */ diff --git a/src/ipa/rpi/common/ipa_base.h b/src/ipa/rpi/common/ipa_base.h new file mode 100644 index 000000000000..b9fbf9414d63 --- /dev/null +++ b/src/ipa/rpi/common/ipa_base.h @@ -0,0 +1,125 @@ +/* SPDX-License-Identifier: BSD-2-Clause */ +/* + * Copyright (C) 2023, Raspberry Pi Ltd + * + * ipa_base.cpp - Raspberry Pi IPA base class + */ +#pragma once + +#include +#include +#include + +#include + +#include + +#include +#include +#include + +#include "libcamera/internal/mapped_framebuffer.h" + +#include "cam_helper/cam_helper.h" +#include "controller/agc_status.h" +#include "controller/camera_mode.h" +#include "controller/controller.h" +#include 
"controller/metadata.h" + +namespace libcamera { + +namespace ipa::RPi { + +class IpaBase : public IPARPiInterface +{ +public: + IpaBase(); + virtual ~IpaBase() = 0; + + int32_t init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) override; + int32_t configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, + ConfigResult *result) override; + + void start(const ControlList &controls, StartResult *result) override; + void stop() override {} + + void mapBuffers(const std::vector &buffers) override; + void unmapBuffers(const std::vector &ids) override; + + void prepareIsp(const PrepareParams ¶ms) override; + void processStats(const ProcessParams ¶ms) override; + + void reportMetadata(unsigned int ipaContext, ControlList *controls) override; + +private: + /* Number of metadata objects available in the context list. */ + static constexpr unsigned int numMetadataContexts = 16; + + virtual int32_t platformInit(const InitParams ¶ms, InitResult *result) = 0; + virtual int32_t platformConfigure(const ConfigParams ¶ms, ConfigResult *result) = 0; + + virtual void platformPrepareIsp(const PrepareParams ¶ms, + RPiController::Metadata &rpiMetadata) = 0; + virtual RPiController::StatisticsPtr platformProcessStats(Span mem) = 0; + + void setMode(const IPACameraSensorInfo &sensorInfo); + void setCameraTimeoutValue(); + bool validateSensorControls(); + bool validateLensControls(); + void applyControls(const ControlList &controls); + virtual void handleControls(const ControlList &controls) = 0; + void fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext); + void applyFrameDurations(utils::Duration minFrameDuration, utils::Duration maxFrameDuration); + void applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls); + + std::map buffers_; + + bool lensPresent_; + ControlList libcameraMetadata_; + + std::array rpiMetadata_; + + /* + * We count frames to decide if the frame must be hidden (e.g. 
from + * display) or mistrusted (i.e. not given to the control algos). + */ + uint64_t frameCount_; + + /* How many frames we should avoid running control algos on. */ + unsigned int mistrustCount_; + + /* Number of frames that need to be dropped on startup. */ + unsigned int dropFrameCount_; + + /* Frame timestamp for the last run of the controller. */ + uint64_t lastRunTimestamp_; + + /* Do we run a Controller::process() for this frame? */ + bool processPending_; + + /* Distinguish the first camera start from others. */ + bool firstStart_; + + /* Frame duration (1/fps) limits. */ + utils::Duration minFrameDuration_; + utils::Duration maxFrameDuration_; + +protected: + /* Raspberry Pi controller specific defines. */ + std::unique_ptr helper_; + RPiController::Controller controller_; + + ControlInfoMap sensorCtrls_; + ControlInfoMap lensCtrls_; + + /* Camera sensor params. */ + CameraMode mode_; + + /* Track the frame length times over FrameLengthsQueueSize frames. */ + std::deque frameLengths_; + utils::Duration lastTimeout_; +}; + +} /* namespace ipa::RPi */ + +} /* namespace libcamera */ diff --git a/src/ipa/rpi/common/meson.build b/src/ipa/rpi/common/meson.build new file mode 100644 index 000000000000..a7189f8389af --- /dev/null +++ b/src/ipa/rpi/common/meson.build @@ -0,0 +1,7 @@ +# SPDX-License-Identifier: CC0-1.0 + +# SPDX-License-Identifier: CC0-1.0 + +rpi_ipa_common_sources = files([ + 'ipa_base.cpp', +]) diff --git a/src/ipa/rpi/vc4/meson.build b/src/ipa/rpi/vc4/meson.build index 992d0f1ab5a7..b581391586b3 100644 --- a/src/ipa/rpi/vc4/meson.build +++ b/src/ipa/rpi/vc4/meson.build @@ -13,11 +13,15 @@ vc4_ipa_includes = [ ] vc4_ipa_sources = files([ - 'raspberrypi.cpp', + 'vc4.cpp', ]) vc4_ipa_includes += include_directories('..') -vc4_ipa_sources += [rpi_ipa_cam_helper_sources, rpi_ipa_controller_sources] +vc4_ipa_sources += [ + rpi_ipa_cam_helper_sources, + rpi_ipa_common_sources, + rpi_ipa_controller_sources, +] mod = shared_module(ipa_name, 
[vc4_ipa_sources, libcamera_generated_ipa_headers], diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp new file mode 100644 index 000000000000..0d929cda6c4a --- /dev/null +++ b/src/ipa/rpi/vc4/vc4.cpp @@ -0,0 +1,540 @@ +/* SPDX-License-Identifier: BSD-2-Clause */ +/* + * Copyright (C) 2019-2021, Raspberry Pi Ltd + * + * rpi.cpp - Raspberry Pi VC4/BCM2835 ISP IPA. + */ + +#include +#include + +#include + +#include + +#include "common/ipa_base.h" +#include "controller/af_status.h" +#include "controller/agc_algorithm.h" +#include "controller/alsc_status.h" +#include "controller/awb_status.h" +#include "controller/black_level_status.h" +#include "controller/ccm_status.h" +#include "controller/contrast_status.h" +#include "controller/denoise_algorithm.h" +#include "controller/denoise_status.h" +#include "controller/dpc_status.h" +#include "controller/geq_status.h" +#include "controller/lux_status.h" +#include "controller/noise_status.h" +#include "controller/sharpen_status.h" + +namespace libcamera { + +LOG_DECLARE_CATEGORY(IPARPI) + +namespace ipa::RPi { + +class IpaVc4 final : public IpaBase +{ +public: + IpaVc4() + : IpaBase(), lsTable_(nullptr) + { + } + + ~IpaVc4() + { + if (lsTable_) + munmap(lsTable_, MaxLsGridSize); + } + + +private: + int32_t platformInit(const InitParams ¶ms, InitResult *result) override; + int32_t platformConfigure(const ConfigParams ¶ms, ConfigResult *result) override; + + void platformPrepareIsp(const PrepareParams ¶ms, RPiController::Metadata &rpiMetadata) override; + RPiController::StatisticsPtr platformProcessStats(Span mem) override; + + void handleControls(const ControlList &controls) override; + bool validateIspControls(); + + void applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls); + void applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls); + void applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls); + void applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList 
&ctrls); + void applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls); + void applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls); + void applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls); + void applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls); + void applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls); + void applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls); + void applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls); + void resampleTable(uint16_t dest[], const std::vector &src, int destW, int destH); + + /* VC4 ISP controls. */ + ControlInfoMap ispCtrls_; + + /* LS table allocation passed in from the pipeline handler. */ + SharedFD lsTableHandle_; + void *lsTable_; +}; + +int32_t IpaVc4::platformInit([[maybe_unused]] const InitParams ¶ms, [[maybe_unused]] InitResult *result) +{ + const std::string &target = controller_.getTarget(); + + if (target != "bcm2835") { + LOG(IPARPI, Error) + << "Tuning data file target returned \"" << target << "\"" + << ", expected \"bcm2835\""; + return -EINVAL; + } + + return 0; +} + +int32_t IpaVc4::platformConfigure(const ConfigParams ¶ms, [[maybe_unused]] ConfigResult *result) +{ + ispCtrls_ = params.ispControls; + if (!validateIspControls()) { + LOG(IPARPI, Error) << "ISP control validation failed."; + return -1; + } + + /* Store the lens shading table pointer and handle if available. */ + if (params.lsTableHandle.isValid()) { + /* Remove any previous table, if there was one. */ + if (lsTable_) { + munmap(lsTable_, MaxLsGridSize); + lsTable_ = nullptr; + } + + /* Map the LS table buffer into user space. 
*/ + lsTableHandle_ = std::move(params.lsTableHandle); + if (lsTableHandle_.isValid()) { + lsTable_ = mmap(nullptr, MaxLsGridSize, PROT_READ | PROT_WRITE, + MAP_SHARED, lsTableHandle_.get(), 0); + + if (lsTable_ == MAP_FAILED) { + LOG(IPARPI, Error) << "dmaHeap mmap failure for LS table."; + lsTable_ = nullptr; + } + } + } + + return 0; +} + +void IpaVc4::platformPrepareIsp([[maybe_unused]] const PrepareParams ¶ms, + RPiController::Metadata &rpiMetadata) +{ + ControlList ctrls(ispCtrls_); + + /* Lock the metadata buffer to avoid constant locks/unlocks. */ + std::unique_lock lock(rpiMetadata); + + AwbStatus *awbStatus = rpiMetadata.getLocked("awb.status"); + if (awbStatus) + applyAWB(awbStatus, ctrls); + + CcmStatus *ccmStatus = rpiMetadata.getLocked("ccm.status"); + if (ccmStatus) + applyCCM(ccmStatus, ctrls); + + AgcStatus *dgStatus = rpiMetadata.getLocked("agc.status"); + if (dgStatus) + applyDG(dgStatus, ctrls); + + AlscStatus *lsStatus = rpiMetadata.getLocked("alsc.status"); + if (lsStatus) + applyLS(lsStatus, ctrls); + + ContrastStatus *contrastStatus = rpiMetadata.getLocked("contrast.status"); + if (contrastStatus) + applyGamma(contrastStatus, ctrls); + + BlackLevelStatus *blackLevelStatus = rpiMetadata.getLocked("black_level.status"); + if (blackLevelStatus) + applyBlackLevel(blackLevelStatus, ctrls); + + GeqStatus *geqStatus = rpiMetadata.getLocked("geq.status"); + if (geqStatus) + applyGEQ(geqStatus, ctrls); + + DenoiseStatus *denoiseStatus = rpiMetadata.getLocked("denoise.status"); + if (denoiseStatus) + applyDenoise(denoiseStatus, ctrls); + + SharpenStatus *sharpenStatus = rpiMetadata.getLocked("sharpen.status"); + if (sharpenStatus) + applySharpen(sharpenStatus, ctrls); + + DpcStatus *dpcStatus = rpiMetadata.getLocked("dpc.status"); + if (dpcStatus) + applyDPC(dpcStatus, ctrls); + + const AfStatus *afStatus = rpiMetadata.getLocked("af.status"); + if (afStatus) { + ControlList lensctrls(lensCtrls_); + applyAF(afStatus, lensctrls); + if 
(!lensctrls.empty()) + setLensControls.emit(lensctrls); + } + + if (!ctrls.empty()) + setIspControls.emit(ctrls); +} + +RPiController::StatisticsPtr IpaVc4::platformProcessStats(Span<uint8_t> mem) +{ + using namespace RPiController; + + const bcm2835_isp_stats *stats = reinterpret_cast<bcm2835_isp_stats *>(mem.data()); + StatisticsPtr statistics = std::make_unique<Statistics>(Statistics::AgcStatsPos::PreWb, + Statistics::ColourStatsPos::PostLsc); + const Controller::HardwareConfig &hw = controller_.getHardwareConfig(); + unsigned int i; + + /* RGB histograms are not used, so do not populate them. */ + statistics->yHist = RPiController::Histogram(stats->hist[0].g_hist, + hw.numHistogramBins); + + /* All region sums are based on a 16-bit normalised pipeline bit-depth. */ + unsigned int scale = Statistics::NormalisationFactorPow2 - hw.pipelineWidth; + + statistics->awbRegions.init(hw.awbRegions); + for (i = 0; i < statistics->awbRegions.numRegions(); i++) + statistics->awbRegions.set(i, { { stats->awb_stats[i].r_sum << scale, + stats->awb_stats[i].g_sum << scale, + stats->awb_stats[i].b_sum << scale }, + stats->awb_stats[i].counted, + stats->awb_stats[i].notcounted }); + + statistics->agcRegions.init(hw.agcRegions); + for (i = 0; i < statistics->agcRegions.numRegions(); i++) + statistics->agcRegions.set(i, { { stats->agc_stats[i].r_sum << scale, + stats->agc_stats[i].g_sum << scale, + stats->agc_stats[i].b_sum << scale }, + stats->agc_stats[i].counted, + stats->agc_stats[i].notcounted }); + + statistics->focusRegions.init(hw.focusRegions); + for (i = 0; i < statistics->focusRegions.numRegions(); i++) + statistics->focusRegions.set(i, { stats->focus_stats[i].contrast_val[1][1] / 1000, + stats->focus_stats[i].contrast_val_num[1][1], + stats->focus_stats[i].contrast_val_num[1][0] }); + + return statistics; +} + +void IpaVc4::handleControls([[maybe_unused]] const ControlList &controls) +{ + /* No controls require any special updates to the hardware configuration.
*/ +} + +bool IpaVc4::validateIspControls() +{ + static const uint32_t ctrls[] = { + V4L2_CID_RED_BALANCE, + V4L2_CID_BLUE_BALANCE, + V4L2_CID_DIGITAL_GAIN, + V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, + V4L2_CID_USER_BCM2835_ISP_GAMMA, + V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, + V4L2_CID_USER_BCM2835_ISP_GEQ, + V4L2_CID_USER_BCM2835_ISP_DENOISE, + V4L2_CID_USER_BCM2835_ISP_SHARPEN, + V4L2_CID_USER_BCM2835_ISP_DPC, + V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, + V4L2_CID_USER_BCM2835_ISP_CDN, + }; + + for (auto c : ctrls) { + if (ispCtrls_.find(c) == ispCtrls_.end()) { + LOG(IPARPI, Error) << "Unable to find ISP control " + << utils::hex(c); + return false; + } + } + + return true; +} + +void IpaVc4::applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls) +{ + LOG(IPARPI, Debug) << "Applying WB R: " << awbStatus->gainR << " B: " + << awbStatus->gainB; + + ctrls.set(V4L2_CID_RED_BALANCE, + static_cast(awbStatus->gainR * 1000)); + ctrls.set(V4L2_CID_BLUE_BALANCE, + static_cast(awbStatus->gainB * 1000)); +} + +void IpaVc4::applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls) +{ + ctrls.set(V4L2_CID_DIGITAL_GAIN, + static_cast(dgStatus->digitalGain * 1000)); +} + +void IpaVc4::applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls) +{ + bcm2835_isp_custom_ccm ccm; + + for (int i = 0; i < 9; i++) { + ccm.ccm.ccm[i / 3][i % 3].den = 1000; + ccm.ccm.ccm[i / 3][i % 3].num = 1000 * ccmStatus->matrix[i]; + } + + ccm.enabled = 1; + ccm.ccm.offsets[0] = ccm.ccm.offsets[1] = ccm.ccm.offsets[2] = 0; + + ControlValue c(Span{ reinterpret_cast(&ccm), + sizeof(ccm) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, c); +} + +void IpaVc4::applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls) +{ + bcm2835_isp_black_level blackLevel; + + blackLevel.enabled = 1; + blackLevel.black_level_r = blackLevelStatus->blackLevelR; + blackLevel.black_level_g = blackLevelStatus->blackLevelG; + blackLevel.black_level_b = blackLevelStatus->blackLevelB; + 
+	ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&blackLevel),
+					    sizeof(blackLevel) });
+	ctrls.set(V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, c);
+}
+
+void IpaVc4::applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls)
+{
+	const unsigned int numGammaPoints = controller_.getHardwareConfig().numGammaPoints;
+	struct bcm2835_isp_gamma gamma;
+
+	for (unsigned int i = 0; i < numGammaPoints - 1; i++) {
+		int x = i < 16 ? i * 1024
+			       : (i < 24 ? (i - 16) * 2048 + 16384
+					 : (i - 24) * 4096 + 32768);
+		gamma.x[i] = x;
+		gamma.y[i] = std::min<uint16_t>(65535, contrastStatus->gammaCurve.eval(x));
+	}
+
+	gamma.x[numGammaPoints - 1] = 65535;
+	gamma.y[numGammaPoints - 1] = 65535;
+	gamma.enabled = 1;
+
+	ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&gamma),
+					    sizeof(gamma) });
+	ctrls.set(V4L2_CID_USER_BCM2835_ISP_GAMMA, c);
+}
+
+void IpaVc4::applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls)
+{
+	bcm2835_isp_geq geq;
+
+	geq.enabled = 1;
+	geq.offset = geqStatus->offset;
+	geq.slope.den = 1000;
+	geq.slope.num = 1000 * geqStatus->slope;
+
+	ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&geq),
+					    sizeof(geq) });
+	ctrls.set(V4L2_CID_USER_BCM2835_ISP_GEQ, c);
+}
+
+void IpaVc4::applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls)
+{
+	using RPiController::DenoiseMode;
+
+	bcm2835_isp_denoise denoise;
+	DenoiseMode mode = static_cast<DenoiseMode>(denoiseStatus->mode);
+
+	denoise.enabled = mode != DenoiseMode::Off;
+	denoise.constant = denoiseStatus->noiseConstant;
+	denoise.slope.num = 1000 * denoiseStatus->noiseSlope;
+	denoise.slope.den = 1000;
+	denoise.strength.num = 1000 * denoiseStatus->strength;
+	denoise.strength.den = 1000;
+
+	/* Set the CDN mode to match the SDN operating mode.
+	 */
+	bcm2835_isp_cdn cdn;
+	switch (mode) {
+	case DenoiseMode::ColourFast:
+		cdn.enabled = 1;
+		cdn.mode = CDN_MODE_FAST;
+		break;
+	case DenoiseMode::ColourHighQuality:
+		cdn.enabled = 1;
+		cdn.mode = CDN_MODE_HIGH_QUALITY;
+		break;
+	default:
+		cdn.enabled = 0;
+	}
+
+	ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&denoise),
+					    sizeof(denoise) });
+	ctrls.set(V4L2_CID_USER_BCM2835_ISP_DENOISE, c);
+
+	c = ControlValue(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&cdn),
+					      sizeof(cdn) });
+	ctrls.set(V4L2_CID_USER_BCM2835_ISP_CDN, c);
+}
+
+void IpaVc4::applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls)
+{
+	bcm2835_isp_sharpen sharpen;
+
+	sharpen.enabled = 1;
+	sharpen.threshold.num = 1000 * sharpenStatus->threshold;
+	sharpen.threshold.den = 1000;
+	sharpen.strength.num = 1000 * sharpenStatus->strength;
+	sharpen.strength.den = 1000;
+	sharpen.limit.num = 1000 * sharpenStatus->limit;
+	sharpen.limit.den = 1000;
+
+	ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&sharpen),
+					    sizeof(sharpen) });
+	ctrls.set(V4L2_CID_USER_BCM2835_ISP_SHARPEN, c);
+}
+
+void IpaVc4::applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls)
+{
+	bcm2835_isp_dpc dpc;
+
+	dpc.enabled = 1;
+	dpc.strength = dpcStatus->strength;
+
+	ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&dpc),
+					    sizeof(dpc) });
+	ctrls.set(V4L2_CID_USER_BCM2835_ISP_DPC, c);
+}
+
+void IpaVc4::applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls)
+{
+	/*
+	 * Program lens shading tables into pipeline.
+	 * Choose smallest cell size that won't exceed 63x48 cells.
+	 */
+	const int cellSizes[] = { 16, 32, 64, 128, 256 };
+	unsigned int numCells = std::size(cellSizes);
+	unsigned int i, w, h, cellSize;
+	for (i = 0; i < numCells; i++) {
+		cellSize = cellSizes[i];
+		w = (mode_.width + cellSize - 1) / cellSize;
+		h = (mode_.height + cellSize - 1) / cellSize;
+		if (w < 64 && h <= 48)
+			break;
+	}
+
+	if (i == numCells) {
+		LOG(IPARPI, Error) << "Cannot find cell size";
+		return;
+	}
+
+	/* We're going to supply corner sampled tables, 16 bit samples. */
+	w++, h++;
+	bcm2835_isp_lens_shading ls = {
+		.enabled = 1,
+		.grid_cell_size = cellSize,
+		.grid_width = w,
+		.grid_stride = w,
+		.grid_height = h,
+		/* .dmabuf will be filled in by pipeline handler. */
+		.dmabuf = 0,
+		.ref_transform = 0,
+		.corner_sampled = 1,
+		.gain_format = GAIN_FORMAT_U4P10
+	};
+
+	if (!lsTable_ || w * h * 4 * sizeof(uint16_t) > MaxLsGridSize) {
+		LOG(IPARPI, Error) << "Do not have a correctly allocated lens shading table!";
+		return;
+	}
+
+	if (lsStatus) {
+		/* Format will be u4.10 */
+		uint16_t *grid = static_cast<uint16_t *>(lsTable_);
+
+		resampleTable(grid, lsStatus->r, w, h);
+		resampleTable(grid + w * h, lsStatus->g, w, h);
+		memcpy(grid + 2 * w * h, grid + w * h, w * h * sizeof(uint16_t));
+		resampleTable(grid + 3 * w * h, lsStatus->b, w, h);
+	}
+
+	ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&ls),
+					    sizeof(ls) });
+	ctrls.set(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, c);
+}
+
+void IpaVc4::applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls)
+{
+	if (afStatus->lensSetting) {
+		ControlValue v(afStatus->lensSetting.value());
+		lensCtrls.set(V4L2_CID_FOCUS_ABSOLUTE, v);
+	}
+}
+
+/*
+ * Resamples a 16x12 table with central sampling to destW x destH with corner
+ * sampling.
+ */
+void IpaVc4::resampleTable(uint16_t dest[], const std::vector<double> &src,
+			   int destW, int destH)
+{
+	/*
+	 * Precalculate and cache the x sampling locations and phases to
+	 * save recomputing them on every row.
+	 */
+	assert(destW > 1 && destH > 1 && destW <= 64);
+	int xLo[64], xHi[64];
+	double xf[64];
+	double x = -0.5, xInc = 16.0 / (destW - 1);
+	for (int i = 0; i < destW; i++, x += xInc) {
+		xLo[i] = floor(x);
+		xf[i] = x - xLo[i];
+		xHi[i] = xLo[i] < 15 ? xLo[i] + 1 : 15;
+		xLo[i] = xLo[i] > 0 ? xLo[i] : 0;
+	}
+
+	/* Now march over the output table generating the new values. */
+	double y = -0.5, yInc = 12.0 / (destH - 1);
+	for (int j = 0; j < destH; j++, y += yInc) {
+		int yLo = floor(y);
+		double yf = y - yLo;
+		int yHi = yLo < 11 ? yLo + 1 : 11;
+		yLo = yLo > 0 ? yLo : 0;
+		double const *rowAbove = src.data() + yLo * 16;
+		double const *rowBelow = src.data() + yHi * 16;
+		for (int i = 0; i < destW; i++) {
+			double above = rowAbove[xLo[i]] * (1 - xf[i]) + rowAbove[xHi[i]] * xf[i];
+			double below = rowBelow[xLo[i]] * (1 - xf[i]) + rowBelow[xHi[i]] * xf[i];
+			int result = floor(1024 * (above * (1 - yf) + below * yf) + .5);
+			*(dest++) = result > 16383 ? 16383 : result; /* want u4.10 */
+		}
+	}
+}
+
+} /* namespace ipa::RPi */
+
+/*
+ * External IPA module interface
+ */
+extern "C" {
+const struct IPAModuleInfo ipaModuleInfo = {
+	IPA_MODULE_API_VERSION,
+	1,
+	"PipelineHandlerRPi",
+	"vc4",
+};
+
+IPAInterface *ipaCreate()
+{
+	return new ipa::RPi::IpaVc4();
+}
+
+} /* extern "C" */
+
+} /* namespace libcamera */

From patchwork Wed Apr 26 13:10:53 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18557
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:53 +0100
Message-Id: <20230426131057.21550-10-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 09/13] ipa: raspberrypi: agc: Move weights out of AGC
From: Naushir Patuck <naush@raspberrypi.com>

From: David Plowman

The region weights for the AGC zones were previously handled
by the AGC algorithm. Now it's the IPA (vc4.cpp) that applies them
directly to the statistics that we pass to the AGC.

Signed-off-by: David Plowman
Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 src/ipa/rpi/controller/agc_algorithm.h |  3 +++
 src/ipa/rpi/controller/rpi/agc.cpp     | 27 +++++++++++++++++---------
 src/ipa/rpi/controller/rpi/agc.h       |  1 +
 src/ipa/rpi/vc4/vc4.cpp                | 26 ++++++++++++++++++-------
 4 files changed, 41 insertions(+), 16 deletions(-)

diff --git a/src/ipa/rpi/controller/agc_algorithm.h b/src/ipa/rpi/controller/agc_algorithm.h
index 36e6c11058ee..b6949daa7135 100644
--- a/src/ipa/rpi/controller/agc_algorithm.h
+++ b/src/ipa/rpi/controller/agc_algorithm.h
@@ -6,6 +6,8 @@
  */
 #pragma once
 
+#include <vector>
+
 #include <libcamera/base/utils.h>
 
 #include "algorithm.h"
@@ -18,6 +20,7 @@ public:
 	AgcAlgorithm(Controller *controller) : Algorithm(controller) {}
 	/* An AGC algorithm must provide the following: */
 	virtual unsigned int getConvergenceFrames() const = 0;
+	virtual std::vector<double> const &getWeights() const = 0;
 	virtual void setEv(double ev) = 0;
 	virtual void setFlickerPeriod(libcamera::utils::Duration flickerPeriod) = 0;
 	virtual void setFixedShutter(libcamera::utils::Duration fixedShutter) = 0;
diff --git a/src/ipa/rpi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp
index e6fb7b8dbeb3..e79c82e2e65b 100644
--- a/src/ipa/rpi/controller/rpi/agc.cpp
+++ b/src/ipa/rpi/controller/rpi/agc.cpp
@@ -292,6 +292,18 @@ unsigned int Agc::getConvergenceFrames() const
 	return config_.convergenceFrames;
 }
 
+std::vector<double> const &Agc::getWeights() const
+{
+	/*
+	 * In case someone calls setMeteringMode and then this before the
+	 * algorithm has run and updated the meteringMode_ pointer.
+	 */
+	auto it = config_.meteringModes.find(meteringModeName_);
+	if (it == config_.meteringModes.end())
+		return meteringMode_->weights;
+	return it->second.weights;
+}
+
 void Agc::setEv(double ev)
 {
 	ev_ = ev;
@@ -595,19 +607,16 @@ static double computeInitialY(StatisticsPtr &stats, AwbStatus const &awb,
 	ASSERT(weights.size() == stats->agcRegions.numRegions());
 
 	/*
-	 * Note how the calculation below means that equal weights give you
-	 * "average" metering (i.e. all pixels equally important).
+	 * Note that the weights are applied by the IPA to the statistics
+	 * directly, before they are given to us here.
 	 */
 	double rSum = 0, gSum = 0, bSum = 0, pixelSum = 0;
 	for (unsigned int i = 0; i < stats->agcRegions.numRegions(); i++) {
 		auto &region = stats->agcRegions.get(i);
-		double rAcc = std::min(region.val.rSum * gain, (maxVal - 1) * region.counted);
-		double gAcc = std::min(region.val.gSum * gain, (maxVal - 1) * region.counted);
-		double bAcc = std::min(region.val.bSum * gain, (maxVal - 1) * region.counted);
-		rSum += rAcc * weights[i];
-		gSum += gAcc * weights[i];
-		bSum += bAcc * weights[i];
-		pixelSum += region.counted * weights[i];
+		rSum += std::min(region.val.rSum * gain, (maxVal - 1) * region.counted);
+		gSum += std::min(region.val.gSum * gain, (maxVal - 1) * region.counted);
+		bSum += std::min(region.val.bSum * gain, (maxVal - 1) * region.counted);
+		pixelSum += region.counted;
 	}
 	if (pixelSum == 0.0) {
 		LOG(RPiAgc, Warning) << "computeInitialY: pixelSum is zero";
diff --git a/src/ipa/rpi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h
index 4e5f272fac78..939f97295a02 100644
--- a/src/ipa/rpi/controller/rpi/agc.h
+++ b/src/ipa/rpi/controller/rpi/agc.h
@@ -69,6 +69,7 @@ public:
 	char const *name() const override;
 	int read(const libcamera::YamlObject &params) override;
 	unsigned int getConvergenceFrames() const override;
+	std::vector<double> const &getWeights() const override;
 	void setEv(double ev) override;
 	void setFlickerPeriod(libcamera::utils::Duration flickerPeriod)
override;
 	void setMaxShutter(libcamera::utils::Duration maxShutter) override;
diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp
index 0d929cda6c4a..0e022c2aeed3 100644
--- a/src/ipa/rpi/vc4/vc4.cpp
+++ b/src/ipa/rpi/vc4/vc4.cpp
@@ -211,13 +211,25 @@ RPiController::StatisticsPtr IpaVc4::platformProcessStats(Span<uint8_t> mem)
 						   stats->awb_stats[i].counted,
 						   stats->awb_stats[i].notcounted });
 
-	statistics->agcRegions.init(hw.agcRegions);
-	for (i = 0; i < statistics->agcRegions.numRegions(); i++)
-		statistics->agcRegions.set(i, { { stats->agc_stats[i].r_sum << scale,
-						  stats->agc_stats[i].g_sum << scale,
-						  stats->agc_stats[i].b_sum << scale },
-						stats->agc_stats[i].counted,
-						stats->awb_stats[i].notcounted });
+	RPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm *>(
+		controller_.getAlgorithm("agc"));
+	if (!agc) {
+		LOG(IPARPI, Debug) << "No AGC algorithm - not copying statistics";
+		statistics->agcRegions.init(0);
+	} else {
+		statistics->agcRegions.init(hw.agcRegions);
+		const std::vector<double> &weights = agc->getWeights();
+		for (i = 0; i < statistics->agcRegions.numRegions(); i++) {
+			uint64_t rSum = (stats->agc_stats[i].r_sum << scale) * weights[i];
+			uint64_t gSum = (stats->agc_stats[i].g_sum << scale) * weights[i];
+			uint64_t bSum = (stats->agc_stats[i].b_sum << scale) * weights[i];
+			uint32_t counted = stats->agc_stats[i].counted * weights[i];
+			uint32_t notcounted = stats->agc_stats[i].notcounted * weights[i];
+			statistics->agcRegions.set(i, { { rSum, gSum, bSum },
+							counted,
+							notcounted });
+		}
+	}
 
 	statistics->focusRegions.init(hw.focusRegions);
 	for (i = 0; i < statistics->focusRegions.numRegions(); i++)

From patchwork Wed Apr 26 13:10:54 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18558
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:54 +0100
Message-Id: <20230426131057.21550-11-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 10/13] pipeline: raspberrypi: Make RPi::Stream::name() return const std::string &
From: Naushir Patuck <naush@raspberrypi.com>

Return a const std::string reference from RPi::Stream::name() to avoid
copying a string when not needed.
---
 src/libcamera/pipeline/rpi/common/rpi_stream.cpp | 2 +-
 src/libcamera/pipeline/rpi/common/rpi_stream.h   | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp
index 3690667e9aa6..b7e4130f5e56 100644
--- a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp
+++ b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp
@@ -19,7 +19,7 @@ V4L2VideoDevice *Stream::dev() const
 	return dev_.get();
 }
 
-std::string Stream::name() const
+const std::string &Stream::name() const
 {
 	return name_;
 }
diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.h b/src/libcamera/pipeline/rpi/common/rpi_stream.h
index 1aae674967e1..b8c74de35863 100644
--- a/src/libcamera/pipeline/rpi/common/rpi_stream.h
+++ b/src/libcamera/pipeline/rpi/common/rpi_stream.h
@@ -49,7 +49,7 @@ public:
 	}
 
 	V4L2VideoDevice *dev() const;
-	std::string name() const;
+	const std::string &name() const;
 	bool isImporter() const;
 	void resetBuffers();

From patchwork Wed Apr 26 13:10:55 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18561
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:55 +0100
Message-Id: <20230426131057.21550-12-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 11/13] pipeline: raspberrypi: Introduce PipelineHandlerBase class
From: Naushir Patuck <naush@raspberrypi.com>

Create
a new PipelineHandlerBase class that handles general purpose
housekeeping duties for the Raspberry Pi pipeline handler. The code
implementing the new class is essentially pulled from the existing
pipeline/rpi/vc4/raspberrypi.cpp file with a small amount of
refactoring.

Create a derived PipelineHandlerVc4 class from PipelineHandlerBase that
handles the VC4 pipeline specific tasks of the pipeline handler. Again,
the code for this class implementation is taken from the existing
pipeline/rpi/vc4/raspberrypi.cpp with a small amount of refactoring.

The goal of this change is to allow third parties to implement their own
pipeline handlers running on the Raspberry Pi without duplicating all of
the pipeline handler housekeeping tasks.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 src/ipa/rpi/vc4/vc4.cpp                       |    2 +-
 src/libcamera/pipeline/rpi/common/meson.build |    1 +
 .../pipeline/rpi/common/pipeline_base.cpp     | 1447 ++++++++++
 .../pipeline/rpi/common/pipeline_base.h       |  276 ++
 .../pipeline/rpi/vc4/data/example.yaml        |    4 +-
 src/libcamera/pipeline/rpi/vc4/meson.build    |    2 +-
 .../pipeline/rpi/vc4/raspberrypi.cpp          | 2428 -----------------
 src/libcamera/pipeline/rpi/vc4/vc4.cpp        | 1001 +++++++
 8 files changed, 2729 insertions(+), 2432 deletions(-)
 create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.cpp
 create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.h
 delete mode 100644 src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp
 create mode 100644 src/libcamera/pipeline/rpi/vc4/vc4.cpp

diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp
index 0e022c2aeed3..f3d83b2afadf 100644
--- a/src/ipa/rpi/vc4/vc4.cpp
+++ b/src/ipa/rpi/vc4/vc4.cpp
@@ -538,7 +538,7 @@ extern "C" {
 const struct IPAModuleInfo ipaModuleInfo = {
 	IPA_MODULE_API_VERSION,
 	1,
-	"PipelineHandlerRPi",
+	"PipelineHandlerVc4",
 	"vc4",
 };
 
diff --git a/src/libcamera/pipeline/rpi/common/meson.build b/src/libcamera/pipeline/rpi/common/meson.build
index 1dec6d3d028b..f8ea790b42a1 100644
---
 a/src/libcamera/pipeline/rpi/common/meson.build
+++ b/src/libcamera/pipeline/rpi/common/meson.build
@@ -2,6 +2,7 @@
 
 libcamera_sources += files([
     'delayed_controls.cpp',
+    'pipeline_base.cpp',
     'rpi_stream.cpp',
 ])
 
diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp
new file mode 100644
index 000000000000..012766b38c32
--- /dev/null
+++ b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp
@@ -0,0 +1,1447 @@
+/* SPDX-License-Identifier: LGPL-2.1-or-later */
+/*
+ * Copyright (C) 2019-2023, Raspberry Pi Ltd
+ *
+ * pipeline_base.cpp - Pipeline handler base class for Raspberry Pi devices
+ */
+
+#include "pipeline_base.h"
+
+#include <chrono>
+
+#include <linux/media-bus-format.h>
+#include <linux/videodev2.h>
+
+#include <libcamera/base/file.h>
+#include <libcamera/base/utils.h>
+
+#include <libcamera/formats.h>
+#include <libcamera/logging.h>
+#include <libcamera/property_ids.h>
+
+#include "libcamera/internal/camera_lens.h"
+#include "libcamera/internal/ipa_manager.h"
+#include "libcamera/internal/v4l2_subdevice.h"
+
+using namespace std::chrono_literals;
+
+namespace libcamera {
+
+using namespace RPi;
+
+LOG_DEFINE_CATEGORY(RPI)
+
+namespace {
+
+constexpr unsigned int defaultRawBitDepth = 12;
+
+bool isRaw(const PixelFormat &pixFmt)
+{
+	/* This test works for both Bayer and raw mono formats. */
+	return BayerFormat::fromPixelFormat(pixFmt).isValid();
+}
+
+PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code,
+				  BayerFormat::Packing packingReq)
+{
+	BayerFormat bayer = BayerFormat::fromMbusCode(mbus_code);
+
+	ASSERT(bayer.isValid());
+
+	bayer.packing = packingReq;
+	PixelFormat pix = bayer.toPixelFormat();
+
+	/*
+	 * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed
+	 * variants. So if the PixelFormat returns as invalid, use the non-packed
+	 * conversion instead.
+	 */
+	if (!pix.isValid()) {
+		bayer.packing = BayerFormat::Packing::None;
+		pix = bayer.toPixelFormat();
+	}
+
+	return pix;
+}
+
+SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)
+{
+	SensorFormats formats;
+
+	for (auto const mbusCode : sensor->mbusCodes())
+		formats.emplace(mbusCode, sensor->sizes(mbusCode));
+
+	return formats;
+}
+
+bool isMonoSensor(std::unique_ptr<CameraSensor> &sensor)
+{
+	unsigned int mbusCode = sensor->mbusCodes()[0];
+	const BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode);
+
+	return bayer.order == BayerFormat::Order::MONO;
+}
+
+double scoreFormat(double desired, double actual)
+{
+	double score = desired - actual;
+	/* Smaller desired dimensions are preferred. */
+	if (score < 0.0)
+		score = (-score) / 8;
+	/* Penalise non-exact matches. */
+	if (actual != desired)
+		score *= 2;
+
+	return score;
+}
+
+V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth)
+{
+	double bestScore = std::numeric_limits<double>::max(), score;
+	V4L2SubdeviceFormat bestFormat;
+	bestFormat.colorSpace = ColorSpace::Raw;
+
+	constexpr float penaltyAr = 1500.0;
+	constexpr float penaltyBitDepth = 500.0;
+
+	/* Calculate the closest/best mode from the user requested size. */
+	for (const auto &iter : formatsMap) {
+		const unsigned int mbusCode = iter.first;
+		const PixelFormat format = mbusCodeToPixelFormat(mbusCode,
+								 BayerFormat::Packing::None);
+		const PixelFormatInfo &info = PixelFormatInfo::info(format);
+
+		for (const Size &size : iter.second) {
+			double reqAr = static_cast<double>(req.width) / req.height;
+			double fmtAr = static_cast<double>(size.width) / size.height;
+
+			/* Score the dimensions for closeness. */
+			score = scoreFormat(req.width, size.width);
+			score += scoreFormat(req.height, size.height);
+			score += penaltyAr * scoreFormat(reqAr, fmtAr);
+
+			/* Add any penalties... this is not an exact science!
*/ + score += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth; + + if (score <= bestScore) { + bestScore = score; + bestFormat.mbus_code = mbusCode; + bestFormat.size = size; + } + + LOG(RPI, Debug) << "Format: " << size + << " fmt " << format + << " Score: " << score + << " (best " << bestScore << ")"; + } + } + + return bestFormat; +} + +const std::vector validColorSpaces = { + ColorSpace::Sycc, + ColorSpace::Smpte170m, + ColorSpace::Rec709 +}; + +std::optional findValidColorSpace(const ColorSpace &colourSpace) +{ + for (auto cs : validColorSpaces) { + if (colourSpace.primaries == cs.primaries && + colourSpace.transferFunction == cs.transferFunction) + return cs; + } + + return std::nullopt; +} + +bool isRgb(const PixelFormat &pixFmt) +{ + const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt); + return info.colourEncoding == PixelFormatInfo::ColourEncodingRGB; +} + +bool isYuv(const PixelFormat &pixFmt) +{ + /* The code below would return true for raw mono streams, so weed those out first. */ + if (isRaw(pixFmt)) + return false; + + const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt); + return info.colourEncoding == PixelFormatInfo::ColourEncodingYUV; +} + +} /* namespace */ + +/* + * Raspberry Pi drivers expect the following colour spaces: + * - V4L2_COLORSPACE_RAW for raw streams. + * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for + * non-raw streams. Other fields such as transfer function, YCbCr encoding and + * quantisation are not used. 
+ * + * The libcamera colour spaces that we wish to use corresponding to these are therefore: + * - ColorSpace::Raw for V4L2_COLORSPACE_RAW + * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG + * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M + * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709 + */ +CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags) +{ + Status status = Valid; + yuvColorSpace_.reset(); + + for (auto cfg : config_) { + /* First fix up raw streams to have the "raw" colour space. */ + if (isRaw(cfg.pixelFormat)) { + /* If there was no value here, that doesn't count as "adjusted". */ + if (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw) + status = Adjusted; + cfg.colorSpace = ColorSpace::Raw; + continue; + } + + /* Next we need to find our shared colour space. The first valid one will do. */ + if (cfg.colorSpace && !yuvColorSpace_) + yuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value()); + } + + /* If no colour space was given anywhere, choose sYCC. */ + if (!yuvColorSpace_) + yuvColorSpace_ = ColorSpace::Sycc; + + /* Note the version of this that any RGB streams will have to use. */ + rgbColorSpace_ = yuvColorSpace_; + rgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None; + rgbColorSpace_->range = ColorSpace::Range::Full; + + /* Go through the streams again and force everyone to the same colour space. */ + for (auto cfg : config_) { + if (cfg.colorSpace == ColorSpace::Raw) + continue; + + if (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) { + /* Again, no value means "not adjusted". */ + if (cfg.colorSpace) + status = Adjusted; + cfg.colorSpace = yuvColorSpace_; + } + if (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) { + /* Be nice, and let the YUV version count as non-adjusted too. 
*/ + if (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_) + status = Adjusted; + cfg.colorSpace = rgbColorSpace_; + } + } + + return status; +} + +CameraConfiguration::Status RPiCameraConfiguration::validate() +{ + Status status = Valid; + + if (config_.empty()) + return Invalid; + + status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace); + + /* + * Validate the requested transform against the sensor capabilities and + * rotation and store the final combined transform that configure() will + * need to apply to the sensor to save us working it out again. + */ + Transform requestedTransform = transform; + combinedTransform_ = data_->sensor_->validateTransform(&transform); + if (transform != requestedTransform) + status = Adjusted; + + std::vector rawStreams, outStreams; + for (const auto &[index, cfg] : utils::enumerate(config_)) { + if (isRaw(cfg.pixelFormat)) + rawStreams.emplace_back(index, &cfg); + else + outStreams.emplace_back(index, &cfg); + } + + /* Sort the streams so the highest resolution is first. */ + std::sort(rawStreams.begin(), rawStreams.end(), + [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; }); + + std::sort(outStreams.begin(), outStreams.end(), + [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; }); + + /* Do any platform specific fixups. */ + status = data_->platformValidate(rawStreams, outStreams); + if (status == Invalid) + return Invalid; + + /* Further fixups on the RAW streams. */ + for (auto &raw : rawStreams) { + StreamConfiguration &cfg = config_.at(raw.index); + V4L2DeviceFormat rawFormat; + + const PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat); + unsigned int bitDepth = info.isValid() ? 
info.bitsPerPixel : defaultRawBitDepth; + V4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth); + + rawFormat.size = sensorFormat.size; + rawFormat.fourcc = raw.dev->toV4L2PixelFormat(cfg.pixelFormat); + + int ret = raw.dev->tryFormat(&rawFormat); + if (ret) + return Invalid; + /* + * Some sensors change their Bayer order when they are h-flipped + * or v-flipped, according to the transform. If this one does, we + * must advertise the transformed Bayer order in the raw stream. + * Note how we must fetch the "native" (i.e. untransformed) Bayer + * order, because the sensor may currently be flipped! + */ + V4L2PixelFormat fourcc = rawFormat.fourcc; + if (data_->flipsAlterBayerOrder_) { + BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc); + bayer.order = data_->nativeBayerOrder_; + bayer = bayer.transform(combinedTransform_); + fourcc = bayer.toV4L2PixelFormat(); + } + + PixelFormat inputPixFormat = fourcc.toPixelFormat(); + if (raw.cfg->size != rawFormat.size || raw.cfg->pixelFormat != inputPixFormat) { + raw.cfg->size = rawFormat.size; + raw.cfg->pixelFormat = inputPixFormat; + status = Adjusted; + } + + raw.cfg->stride = rawFormat.planes[0].bpl; + raw.cfg->frameSize = rawFormat.planes[0].size; + } + + /* Further fixups on the ISP output streams. */ + for (auto &out : outStreams) { + StreamConfiguration &cfg = config_.at(out.index); + PixelFormat &cfgPixFmt = cfg.pixelFormat; + V4L2VideoDevice::Formats fmts = out.dev->formats(); + + if (fmts.find(out.dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) { + /* If we cannot find a native format, use a default one. */ + cfgPixFmt = formats::NV12; + status = Adjusted; + } + + V4L2DeviceFormat format; + format.fourcc = out.dev->toV4L2PixelFormat(cfg.pixelFormat); + format.size = cfg.size; + /* We want to send the associated YCbCr info through to the driver. 
*/ + format.colorSpace = yuvColorSpace_; + + LOG(RPI, Debug) + << "Try color space " << ColorSpace::toString(cfg.colorSpace); + + int ret = out.dev->tryFormat(&format); + if (ret) + return Invalid; + + /* + * But for RGB streams, the YCbCr info gets overwritten on the way back + * so we must check against what the stream cfg says, not what we actually + * requested (which carefully included the YCbCr info)! + */ + if (cfg.colorSpace != format.colorSpace) { + status = Adjusted; + LOG(RPI, Debug) + << "Color space changed from " + << ColorSpace::toString(cfg.colorSpace) << " to " + << ColorSpace::toString(format.colorSpace); + } + + cfg.colorSpace = format.colorSpace; + cfg.stride = format.planes[0].bpl; + cfg.frameSize = format.planes[0].size; + } + + return status; +} + +V4L2DeviceFormat PipelineHandlerBase::toV4L2DeviceFormat(const V4L2VideoDevice *dev, + const V4L2SubdeviceFormat &format, + BayerFormat::Packing packingReq) +{ + unsigned int mbus_code = format.mbus_code; + const PixelFormat pix = mbusCodeToPixelFormat(mbus_code, packingReq); + V4L2DeviceFormat deviceFormat; + + deviceFormat.fourcc = dev->toV4L2PixelFormat(pix); + deviceFormat.size = format.size; + deviceFormat.colorSpace = format.colorSpace; + return deviceFormat; +} + +std::unique_ptr +PipelineHandlerBase::generateConfiguration(Camera *camera, const StreamRoles &roles) +{ + CameraData *data = cameraData(camera); + std::unique_ptr config = + std::make_unique(data); + V4L2SubdeviceFormat sensorFormat; + unsigned int bufferCount; + PixelFormat pixelFormat; + V4L2VideoDevice::Formats fmts; + Size size; + std::optional colorSpace; + + if (roles.empty()) + return config; + + Size sensorSize = data->sensor_->resolution(); + for (const StreamRole role : roles) { + switch (role) { + case StreamRole::Raw: + size = sensorSize; + sensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth); + pixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code, + BayerFormat::Packing::CSI2); + 
ASSERT(pixelFormat.isValid()); + colorSpace = ColorSpace::Raw; + bufferCount = 2; + break; + + case StreamRole::StillCapture: + fmts = data->ispFormats(); + pixelFormat = formats::NV12; + /* + * Still image codecs usually expect the sYCC color space. + * Even RGB codecs will be fine as the RGB we get with the + * sYCC color space is the same as sRGB. + */ + colorSpace = ColorSpace::Sycc; + /* Return the largest sensor resolution. */ + size = sensorSize; + bufferCount = 1; + break; + + case StreamRole::VideoRecording: + /* + * The colour denoise algorithm requires the analysis + * image, produced by the second ISP output, to be in + * YUV420 format. Select this format as the default, to + * maximize chances that it will be picked by + * applications and enable usage of the colour denoise + * algorithm. + */ + fmts = data->ispFormats(); + pixelFormat = formats::YUV420; + /* + * Choose a color space appropriate for video recording. + * Rec.709 will be a good default for HD resolutions. + */ + colorSpace = ColorSpace::Rec709; + size = { 1920, 1080 }; + bufferCount = 4; + break; + + case StreamRole::Viewfinder: + fmts = data->ispFormats(); + pixelFormat = formats::ARGB8888; + colorSpace = ColorSpace::Sycc; + size = { 800, 600 }; + bufferCount = 4; + break; + + default: + LOG(RPI, Error) << "Requested stream role not supported: " + << role; + return nullptr; + } + + std::map> deviceFormats; + if (role == StreamRole::Raw) { + /* Translate the MBUS codes to a PixelFormat. */ + for (const auto &format : data->sensorFormats_) { + PixelFormat pf = mbusCodeToPixelFormat(format.first, + BayerFormat::Packing::CSI2); + if (pf.isValid()) + deviceFormats.emplace(std::piecewise_construct, std::forward_as_tuple(pf), + std::forward_as_tuple(format.second.begin(), format.second.end())); + } + } else { + /* + * Translate the V4L2PixelFormat to PixelFormat. Note that we + * limit the recommended largest ISP output size to match the + * sensor resolution. 
+			 */
+			for (const auto &format : fmts) {
+				PixelFormat pf = format.first.toPixelFormat();
+				if (pf.isValid()) {
+					const SizeRange &ispSizes = format.second[0];
+					deviceFormats[pf].emplace_back(ispSizes.min, sensorSize,
+								       ispSizes.hStep, ispSizes.vStep);
+				}
+			}
+		}
+
+		/* Add the stream format based on the device node used for the use case. */
+		StreamFormats formats(deviceFormats);
+		StreamConfiguration cfg(formats);
+		cfg.size = size;
+		cfg.pixelFormat = pixelFormat;
+		cfg.colorSpace = colorSpace;
+		cfg.bufferCount = bufferCount;
+		config->addConfiguration(cfg);
+	}
+
+	config->validate();
+
+	return config;
+}
+
+int PipelineHandlerBase::configure(Camera *camera, CameraConfiguration *config)
+{
+	CameraData *data = cameraData(camera);
+	int ret;
+
+	/* Start by freeing all buffers and reset the stream states. */
+	data->freeBuffers();
+	for (auto const stream : data->streams_)
+		stream->setExternal(false);
+
+	std::vector<StreamParams> rawStreams, ispStreams;
+	std::optional<BayerFormat::Packing> packing;
+	unsigned int bitDepth = defaultRawBitDepth;
+
+	for (unsigned i = 0; i < config->size(); i++) {
+		StreamConfiguration *cfg = &config->at(i);
+
+		if (isRaw(cfg->pixelFormat))
+			rawStreams.emplace_back(i, cfg);
+		else
+			ispStreams.emplace_back(i, cfg);
+	}
+
+	/* Sort the streams so the highest resolution is first. */
+	std::sort(rawStreams.begin(), rawStreams.end(),
+		  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });
+
+	std::sort(ispStreams.begin(), ispStreams.end(),
+		  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });
+
+	/*
+	 * Calculate the best sensor mode we can use based on the user's request,
+	 * and apply it to the sensor with the cached transform, if any.
+	 *
+	 * If we have been given a RAW stream, use that size for setting up the sensor.
+	 */
+	if (!rawStreams.empty()) {
+		BayerFormat bayerFormat = BayerFormat::fromPixelFormat(rawStreams[0].cfg->pixelFormat);
+		/* Replace the user requested packing/bit-depth.
*/ + packing = bayerFormat.packing; + bitDepth = bayerFormat.bitDepth; + } + + V4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_, + rawStreams.empty() ? ispStreams[0].cfg->size + : rawStreams[0].cfg->size, + bitDepth); + /* Apply any cached transform. */ + const RPiCameraConfiguration *rpiConfig = static_cast(config); + + /* Then apply the format on the sensor. */ + ret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_); + if (ret) + return ret; + + /* + * Platform specific internal stream configuration. This also assigns + * external streams which get configured below. + */ + ret = data->platformConfigure(sensorFormat, packing, rawStreams, ispStreams); + if (ret) + return ret; + + ipa::RPi::ConfigResult result; + ret = data->configureIPA(config, &result); + if (ret) { + LOG(RPI, Error) << "Failed to configure the IPA: " << ret; + return ret; + } + + /* + * Set the scaler crop to the value we are using (scaled to native sensor + * coordinates). + */ + data->scalerCrop_ = data->scaleIspCrop(data->ispCrop_); + + /* + * Update the ScalerCropMaximum to the correct value for this camera mode. + * For us, it's the same as the "analogue crop". + * + * \todo Make this property the ScalerCrop maximum value when dynamic + * controls are available and set it at validate() time + */ + data->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop); + + /* Store the mode sensitivity for the application. */ + data->properties_.set(properties::SensorSensitivity, result.modeSensitivity); + + /* Update the controls that the Raspberry Pi IPA can handle. */ + ControlInfoMap::Map ctrlMap; + for (auto const &c : result.controlInfo) + ctrlMap.emplace(c.first, c.second); + + /* Add the ScalerCrop control limits based on the current mode. 
 */
+	Rectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_));
+	ctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, data->scalerCrop_);
+
+	data->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap());
+
+	/* Set up the Video Mux/Bridge entities. */
+	for (auto &[device, link] : data->bridgeDevices_) {
+		/*
+		 * Start by disabling all the sink pad links on the devices in the
+		 * cascade, with the exception of the link connecting the device.
+		 */
+		for (const MediaPad *p : device->entity()->pads()) {
+			if (!(p->flags() & MEDIA_PAD_FL_SINK))
+				continue;
+
+			for (MediaLink *l : p->links()) {
+				if (l != link)
+					l->setEnabled(false);
+			}
+		}
+
+		/*
+		 * Next, enable the entity -> entity links, and set up the pad format.
+		 *
+		 * \todo Some bridge devices may change the media bus code, so we
+		 * ought to read the source pad format and propagate it to the sink pad.
+		 */
+		link->setEnabled(true);
+		const MediaPad *sinkPad = link->sink();
+		ret = device->setFormat(sinkPad->index(), &sensorFormat);
+		if (ret) {
+			LOG(RPI, Error) << "Failed to set format on " << device->entity()->name()
+					<< " pad " << sinkPad->index()
+					<< " with format " << sensorFormat
+					<< ": " << ret;
+			return ret;
+		}
+
+		LOG(RPI, Debug) << "Configured media link on device " << device->entity()->name()
+				<< " on pad " << sinkPad->index();
+	}
+
+	return 0;
+}
+
+int PipelineHandlerBase::exportFrameBuffers([[maybe_unused]] Camera *camera, libcamera::Stream *stream,
+					    std::vector<std::unique_ptr<FrameBuffer>> *buffers)
+{
+	RPi::Stream *s = static_cast<RPi::Stream *>(stream);
+	unsigned int count = stream->configuration().bufferCount;
+	int ret = s->dev()->exportBuffers(count, buffers);
+
+	s->setExportedBuffers(buffers);
+
+	return ret;
+}
+
+int PipelineHandlerBase::start(Camera *camera, const ControlList *controls)
+{
+	CameraData *data = cameraData(camera);
+	int ret;
+
+	/* Check if a ScalerCrop control was specified.
*/ + if (controls) + data->calculateScalerCrop(*controls); + + /* Start the IPA. */ + ipa::RPi::StartResult result; + data->ipa_->start(controls ? *controls : ControlList{ controls::controls }, + &result); + + /* Apply any gain/exposure settings that the IPA may have passed back. */ + if (!result.controls.empty()) + data->setSensorControls(result.controls); + + /* Configure the number of dropped frames required on startup. */ + data->dropFrameCount_ = data->config_.disableStartupFrameDrops ? 0 + : result.dropFrameCount; + + for (auto const stream : data->streams_) + stream->resetBuffers(); + + if (!data->buffersAllocated_) { + /* Allocate buffers for internal pipeline usage. */ + ret = prepareBuffers(camera); + if (ret) { + LOG(RPI, Error) << "Failed to allocate buffers"; + data->freeBuffers(); + stop(camera); + return ret; + } + data->buffersAllocated_ = true; + } + + /* We need to set the dropFrameCount_ before queueing buffers. */ + ret = queueAllBuffers(camera); + if (ret) { + LOG(RPI, Error) << "Failed to queue buffers"; + stop(camera); + return ret; + } + + /* + * Reset the delayed controls with the gain and exposure values set by + * the IPA. + */ + data->delayedCtrls_->reset(0); + data->state_ = CameraData::State::Idle; + + /* Enable SOF event generation. */ + data->frontendDevice()->setFrameStartEnabled(true); + + data->platformStart(); + + /* Start all streams. */ + for (auto const stream : data->streams_) { + ret = stream->dev()->streamOn(); + if (ret) { + stop(camera); + return ret; + } + } + + return 0; +} + +void PipelineHandlerBase::stopDevice(Camera *camera) +{ + CameraData *data = cameraData(camera); + + data->state_ = CameraData::State::Stopped; + data->platformStop(); + + for (auto const stream : data->streams_) + stream->dev()->streamOff(); + + /* Disable SOF event generation. */ + data->frontendDevice()->setFrameStartEnabled(false); + + data->clearIncompleteRequests(); + + /* Stop the IPA. 
*/ + data->ipa_->stop(); +} + +void PipelineHandlerBase::releaseDevice(Camera *camera) +{ + CameraData *data = cameraData(camera); + data->freeBuffers(); +} + +int PipelineHandlerBase::queueRequestDevice(Camera *camera, Request *request) +{ + CameraData *data = cameraData(camera); + + if (!data->isRunning()) + return -EINVAL; + + LOG(RPI, Debug) << "queueRequestDevice: New request."; + + /* Push all buffers supplied in the Request to the respective streams. */ + for (auto stream : data->streams_) { + if (!stream->isExternal()) + continue; + + FrameBuffer *buffer = request->findBuffer(stream); + if (buffer && !stream->getBufferId(buffer)) { + /* + * This buffer is not recognised, so it must have been allocated + * outside the v4l2 device. Store it in the stream buffer list + * so we can track it. + */ + stream->setExternalBuffer(buffer); + } + + /* + * If no buffer is provided by the request for this stream, we + * queue a nullptr to the stream to signify that it must use an + * internally allocated buffer for this capture request. This + * buffer will not be given back to the application, but is used + * to support the internal pipeline flow. + * + * The below queueBuffer() call will do nothing if there are not + * enough internal buffers allocated, but this will be handled by + * queuing the request for buffers in the RPiStream object. + */ + int ret = stream->queueBuffer(buffer); + if (ret) + return ret; + } + + /* Push the request to the back of the queue. 
 */
+	data->requestQueue_.push(request);
+	data->handleState();
+
+	return 0;
+}
+
+int PipelineHandlerBase::registerCamera(MediaDevice *frontend, const std::string &frontendName,
+					MediaDevice *backend, MediaEntity *sensorEntity)
+{
+	std::unique_ptr<CameraData> cameraData = allocateCameraData();
+	CameraData *data = cameraData.get();
+	int ret;
+
+	data->sensor_ = std::make_unique<CameraSensor>(sensorEntity);
+	if (!data->sensor_)
+		return -EINVAL;
+
+	if (data->sensor_->init())
+		return -EINVAL;
+
+	data->sensorFormats_ = populateSensorFormats(data->sensor_);
+
+	/*
+	 * Enumerate all the Video Mux/Bridge devices across the sensor -> frontend
+	 * chain. There may be a cascade of devices in this chain!
+	 */
+	MediaLink *link = sensorEntity->getPadByIndex(0)->links()[0];
+	data->enumerateVideoDevices(link, frontendName);
+
+	ipa::RPi::InitResult result;
+	if (data->loadIPA(&result)) {
+		LOG(RPI, Error) << "Failed to load a suitable IPA library";
+		return -EINVAL;
+	}
+
+	/*
+	 * Set up our delayed control writer with the sensor default
+	 * gain and exposure delays. Mark VBLANK for priority write.
+	 */
+	std::unordered_map<uint32_t, DelayedControls::ControlParams> params = {
+		{ V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } },
+		{ V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } },
+		{ V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } },
+		{ V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } }
+	};
+	data->delayedCtrls_ = std::make_unique<DelayedControls>(data->sensor_->device(), params);
+	data->sensorMetadata_ = result.sensorConfig.sensorMetadata;
+
+	/* Register initial controls that the Raspberry Pi IPA can handle. */
+	data->controlInfo_ = std::move(result.controlInfo);
+
+	/* Initialize the camera properties. */
+	data->properties_ = data->sensor_->properties();
+
+	/*
+	 * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the
+	 * sensor of the colour gains. It is defined to be a linear gain where
+	 * the default value represents a gain of exactly one.
+	 */
+	auto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS);
+	if (it != data->sensor_->controls().end())
+		data->notifyGainsUnity_ = it->second.def().get<int32_t>();
+
+	/*
+	 * Set a default value for the ScalerCropMaximum property to show
+	 * that we support its use, however, initialise it to zero because
+	 * it's not meaningful until a camera mode has been chosen.
+	 */
+	data->properties_.set(properties::ScalerCropMaximum, Rectangle{});
+
+	/*
+	 * We cache two things about the sensor in relation to transforms
+	 * (meaning horizontal and vertical flips): whether they affect the
+	 * Bayer ordering, and what the "native" Bayer order is when no
+	 * transforms are applied.
+	 *
+	 * We note that the sensor's cached list of supported formats is
+	 * already in the "native" order, with any flips having been undone.
+	 */
+	const V4L2Subdevice *sensor = data->sensor_->device();
+	const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);
+	if (hflipCtrl) {
+		/* We assume it will support vflips too... */
+		data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;
+	}
+
+	/* Look for a valid Bayer format. */
+	BayerFormat bayerFormat;
+	for (const auto &iter : data->sensorFormats_) {
+		bayerFormat = BayerFormat::fromMbusCode(iter.first);
+		if (bayerFormat.isValid())
+			break;
+	}
+
+	if (!bayerFormat.isValid()) {
+		LOG(RPI, Error) << "No Bayer format found";
+		return -EINVAL;
+	}
+	data->nativeBayerOrder_ = bayerFormat.order;
+
+	ret = data->loadPipelineConfiguration();
+	if (ret) {
+		LOG(RPI, Error) << "Unable to load pipeline configuration";
+		return ret;
+	}
+
+	ret = platformRegister(cameraData, frontend, backend);
+	if (ret)
+		return ret;
+
+	/* Set up the general IPA signal handlers.
*/ + data->frontendDevice()->dequeueTimeout.connect(data, &RPi::CameraData::cameraTimeout); + data->frontendDevice()->frameStart.connect(data, &RPi::CameraData::frameStarted); + data->ipa_->setDelayedControls.connect(data, &CameraData::setDelayedControls); + data->ipa_->setLensControls.connect(data, &CameraData::setLensControls); + + return 0; +} + +void PipelineHandlerBase::mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask) +{ + CameraData *data = cameraData(camera); + std::vector bufferIds; + /* + * Link the FrameBuffers with the id (key value) in the map stored in + * the RPi stream object - along with an identifier mask. + * + * This will allow us to identify buffers passed between the pipeline + * handler and the IPA. + */ + for (auto const &it : buffers) { + bufferIds.push_back(IPABuffer(mask | it.first, + it.second->planes())); + data->bufferIds_.insert(mask | it.first); + } + + data->ipa_->mapBuffers(bufferIds); +} + +int PipelineHandlerBase::queueAllBuffers(Camera *camera) +{ + CameraData *data = cameraData(camera); + int ret; + + for (auto const stream : data->streams_) { + if (!stream->isExternal()) { + ret = stream->queueAllBuffers(); + if (ret < 0) + return ret; + } else { + /* + * For external streams, we must queue up a set of internal + * buffers to handle the number of drop frames requested by + * the IPA. This is done by passing nullptr in queueBuffer(). + * + * The below queueBuffer() call will do nothing if there + * are not enough internal buffers allocated, but this will + * be handled by queuing the request for buffers in the + * RPiStream object. + */ + unsigned int i; + for (i = 0; i < data->dropFrameCount_; i++) { + ret = stream->queueBuffer(nullptr); + if (ret) + return ret; + } + } + } + + return 0; +} + +void CameraData::freeBuffers() +{ + if (ipa_) { + /* + * Copy the buffer ids from the unordered_set to a vector to + * pass to the IPA. 
+		 */
+		std::vector<unsigned int> bufferIds(bufferIds_.begin(),
+						    bufferIds_.end());
+		ipa_->unmapBuffers(bufferIds);
+		bufferIds_.clear();
+	}
+
+	for (auto const stream : streams_)
+		stream->releaseBuffers();
+
+	platformFreeBuffers();
+
+	buffersAllocated_ = false;
+}
+
+/*
+ * enumerateVideoDevices() iterates over the Media Controller topology, starting
+ * at the sensor and finishing at the frontend. For each sensor, CameraData
+ * stores a unique list of any intermediate video mux or bridge devices
+ * connected in a cascade, together with the entity to entity link.
+ *
+ * Entity pad configuration and link enabling happen at the end of configure().
+ * We first disable all pad links on each entity device in the chain, and then
+ * selectively enable the specific links connecting the sensor to the frontend
+ * across all intermediate muxes and bridges.
+ *
+ * In the cascaded topology below, if Sensor1 is used, the Mux2 -> Mux1 link
+ * will be disabled, and the Sensor1 -> Mux1 -> Frontend links enabled.
+ * Alternatively, if Sensor3 is used, the Sensor2 -> Mux2 and Sensor1 -> Mux1
+ * links are disabled, and the Sensor3 -> Mux2 -> Mux1 -> Frontend links are
+ * enabled. All other links will remain unchanged.
+ *
+ *                 +----------+
+ *                 |    FE    |
+ *                 +-----^----+
+ *                       |
+ *                   +---+---+
+ *                   | Mux1  |<------+
+ *                   +--^----+       |
+ *                      |            |
+ *                +-----+---+    +---+---+
+ *                | Sensor1 |    | Mux2  |<--+
+ *                +---------+    +-^-----+   |
+ *                                 |         |
+ *                         +-------+-+   +---+-----+
+ *                         | Sensor2 |   | Sensor3 |
+ *                         +---------+   +---------+
+ */
+void CameraData::enumerateVideoDevices(MediaLink *link, const std::string &frontend)
+{
+	const MediaPad *sinkPad = link->sink();
+	const MediaEntity *entity = sinkPad->entity();
+	bool frontendFound = false;
+
+	/* We only deal with Video Mux and Bridge devices in cascade. */
+	if (entity->function() != MEDIA_ENT_F_VID_MUX &&
+	    entity->function() != MEDIA_ENT_F_VID_IF_BRIDGE)
+		return;
+
+	/* Find the source pad for this Video Mux or Bridge device.
*/ + const MediaPad *sourcePad = nullptr; + for (const MediaPad *pad : entity->pads()) { + if (pad->flags() & MEDIA_PAD_FL_SOURCE) { + /* + * We can only deal with devices that have a single source + * pad. If this device has multiple source pads, ignore it + * and this branch in the cascade. + */ + if (sourcePad) + return; + + sourcePad = pad; + } + } + + LOG(RPI, Debug) << "Found video mux device " << entity->name() + << " linked to sink pad " << sinkPad->index(); + + bridgeDevices_.emplace_back(std::make_unique(entity), link); + bridgeDevices_.back().first->open(); + + /* + * Iterate through all the sink pad links down the cascade to find any + * other Video Mux and Bridge devices. + */ + for (MediaLink *l : sourcePad->links()) { + enumerateVideoDevices(l, frontend); + /* Once we reach the Frontend entity, we are done. */ + if (l->sink()->entity()->name() == frontend) { + frontendFound = true; + break; + } + } + + /* This identifies the end of our entity enumeration recursion. */ + if (link->source()->entity()->function() == MEDIA_ENT_F_CAM_SENSOR) { + /* + * If the frontend is not at the end of this cascade, we cannot + * configure this topology automatically, so remove all entity references. 
+ */ + if (!frontendFound) { + LOG(RPI, Warning) << "Cannot automatically configure this MC topology!"; + bridgeDevices_.clear(); + } + } +} + +int CameraData::loadPipelineConfiguration() +{ + config_ = { + .disableStartupFrameDrops = false, + .cameraTimeoutValue = 0, + }; + + /* Initial configuration of the platform, in case no config file is present */ + platformPipelineConfigure({}); + + char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RPI_CONFIG_FILE"); + if (!configFromEnv || *configFromEnv == '\0') + return 0; + + std::string filename = std::string(configFromEnv); + File file(filename); + + if (!file.open(File::OpenModeFlag::ReadOnly)) { + LOG(RPI, Error) << "Failed to open configuration file '" << filename << "'"; + return -EIO; + } + + LOG(RPI, Info) << "Using configuration file '" << filename << "'"; + + std::unique_ptr root = YamlParser::parse(file); + if (!root) { + LOG(RPI, Warning) << "Failed to parse configuration file, using defaults"; + return 0; + } + + std::optional ver = (*root)["version"].get(); + if (!ver || *ver != 1.0) { + LOG(RPI, Error) << "Unexpected configuration file version reported"; + return -EINVAL; + } + + const YamlObject &phConfig = (*root)["pipeline_handler"]; + + config_.disableStartupFrameDrops = + phConfig["disable_startup_frame_drops"].get(config_.disableStartupFrameDrops); + + config_.cameraTimeoutValue = + phConfig["camera_timeout_value_ms"].get(config_.cameraTimeoutValue); + + if (config_.cameraTimeoutValue) { + /* Disable the IPA signal to control timeout and set the user requested value. */ + ipa_->setCameraTimeout.disconnect(); + frontendDevice()->setDequeueTimeout(config_.cameraTimeoutValue * 1ms); + } + + return platformPipelineConfigure(root); +} + +int CameraData::loadIPA(ipa::RPi::InitResult *result) +{ + ipa_ = IPAManager::createIPA(pipe(), 1, 1); + + if (!ipa_) + return -ENOENT; + + /* + * The configuration (tuning file) is made from the sensor name unless + * the environment variable overrides it. 
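[Reviewer's note: for reference, a plausible LIBCAMERA_RPI_CONFIG_FILE matching the keys parsed by loadPipelineConfiguration() above. The values shown are simply the in-code defaults; a real file would override them.]

```yaml
# Hypothetical LIBCAMERA_RPI_CONFIG_FILE contents; keys match the parser
# above, values are illustrative only.
version: 1.0
pipeline_handler:
        # Set to true to skip the startup frames the IPA asks to drop.
        disable_startup_frame_drops: false
        # Frontend dequeue timeout in ms; 0 leaves the IPA in control.
        camera_timeout_value_ms: 0
```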
+ */ + std::string configurationFile; + char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RPI_TUNING_FILE"); + if (!configFromEnv || *configFromEnv == '\0') { + std::string model = sensor_->model(); + if (isMonoSensor(sensor_)) + model += "_mono"; + configurationFile = ipa_->configurationFile(model + ".json", "rpi"); + } else { + configurationFile = std::string(configFromEnv); + } + + IPASettings settings(configurationFile, sensor_->model()); + ipa::RPi::InitParams params; + + params.lensPresent = !!sensor_->focusLens(); + int ret = platformInitIpa(params); + if (ret) + return ret; + + return ipa_->init(settings, params, result); +} + +int CameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result) +{ + std::map entityControls; + ipa::RPi::ConfigParams params; + int ret; + + params.sensorControls = sensor_->controls(); + if (sensor_->focusLens()) + params.lensControls = sensor_->focusLens()->controls(); + + ret = platformConfigureIpa(params); + if (ret) + return ret; + + /* We store the IPACameraSensorInfo for digital zoom calculations. */ + ret = sensor_->sensorInfo(&sensorInfo_); + if (ret) { + LOG(RPI, Error) << "Failed to retrieve camera sensor info"; + return ret; + } + + /* Always send the user transform to the IPA. */ + params.transform = static_cast(config->transform); + + /* Ready the IPA - it must know about the sensor resolution. 
+	 */
+	ret = ipa_->configure(sensorInfo_, params, result);
+	if (ret < 0) {
+		LOG(RPI, Error) << "IPA configuration failed!";
+		return -EPIPE;
+	}
+
+	if (!result->controls.empty())
+		setSensorControls(result->controls);
+
+	return 0;
+}
+
+void CameraData::setDelayedControls(const ControlList &controls, uint32_t delayContext)
+{
+	if (!delayedCtrls_->push(controls, delayContext))
+		LOG(RPI, Error) << "V4L2 DelayedControl set failed";
+}
+
+void CameraData::setLensControls(const ControlList &controls)
+{
+	CameraLens *lens = sensor_->focusLens();
+
+	if (lens && controls.contains(V4L2_CID_FOCUS_ABSOLUTE)) {
+		ControlValue const &focusValue = controls.get(V4L2_CID_FOCUS_ABSOLUTE);
+		lens->setFocusPosition(focusValue.get<int32_t>());
+	}
+}
+
+void CameraData::setSensorControls(ControlList &controls)
+{
+	/*
+	 * We need to ensure that if both VBLANK and EXPOSURE are present, the
+	 * former must be written ahead of, and separately from EXPOSURE to avoid
+	 * V4L2 rejecting the latter. This is identical to what DelayedControls
+	 * does with the priority write flag.
+	 *
+	 * As a consequence of the below logic, VBLANK gets set twice, and we
+	 * rely on the v4l2 framework to not pass the second control set to the
+	 * driver as the actual control value has not changed.
+	 */
+	if (controls.contains(V4L2_CID_EXPOSURE) && controls.contains(V4L2_CID_VBLANK)) {
+		ControlList vblank_ctrl;
+
+		vblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));
+		sensor_->setControls(&vblank_ctrl);
+	}
+
+	sensor_->setControls(&controls);
+}
+
+Rectangle CameraData::scaleIspCrop(const Rectangle &ispCrop) const
+{
+	/*
+	 * Scale a crop rectangle defined in the ISP's coordinates into native sensor
+	 * coordinates.
+	 */
+	Rectangle nativeCrop = ispCrop.scaledBy(sensorInfo_.analogCrop.size(),
+						sensorInfo_.outputSize);
+	nativeCrop.translateBy(sensorInfo_.analogCrop.topLeft());
+	return nativeCrop;
+}
+
+void CameraData::calculateScalerCrop(const ControlList &controls)
+{
+	const auto &scalerCrop = controls.get(controls::ScalerCrop);
+	if (scalerCrop) {
+		Rectangle nativeCrop = *scalerCrop;
+
+		if (!nativeCrop.width || !nativeCrop.height)
+			nativeCrop = { 0, 0, 1, 1 };
+
+		/* Create a version of the crop scaled to ISP (camera mode) pixels. */
+		Rectangle ispCrop = nativeCrop.translatedBy(-sensorInfo_.analogCrop.topLeft());
+		ispCrop.scaleBy(sensorInfo_.outputSize, sensorInfo_.analogCrop.size());
+
+		/*
+		 * The crop that we set must be:
+		 * 1. At least as big as ispMinCropSize_, once that's been
+		 *    enlarged to the same aspect ratio.
+		 * 2. With the same mid-point, if possible.
+		 * 3. But it can't go outside the sensor area.
+		 */
+		Size minSize = ispMinCropSize_.expandedToAspectRatio(nativeCrop.size());
+		Size size = ispCrop.size().expandedTo(minSize);
+		ispCrop = size.centeredTo(ispCrop.center()).enclosedIn(Rectangle(sensorInfo_.outputSize));
+
+		if (ispCrop != ispCrop_) {
+			ispCrop_ = ispCrop;
+			platformIspCrop();
+
+			/*
+			 * Also update the ScalerCrop in the metadata with what we actually
+			 * used. But we must first rescale that from ISP (camera mode) pixels
+			 * back into sensor native pixels.
+			 */
+			scalerCrop_ = scaleIspCrop(ispCrop_);
+		}
+	}
+}
+
+void CameraData::cameraTimeout()
+{
+	LOG(RPI, Error) << "Camera frontend has timed out!";
+	LOG(RPI, Error) << "Please check that your camera sensor connector is attached securely.";
+	LOG(RPI, Error) << "Alternatively, try another cable and/or sensor.";
+
+	state_ = CameraData::State::Error;
+	platformStop();
+
+	/*
+	 * To allow the application to attempt a recovery from this timeout,
+	 * stop all devices streaming, and return any outstanding requests as
+	 * incomplete and cancelled.
+ */ + for (auto const stream : streams_) + stream->dev()->streamOff(); + + clearIncompleteRequests(); +} + +void CameraData::frameStarted(uint32_t sequence) +{ + LOG(RPI, Debug) << "Frame start " << sequence; + + /* Write any controls for the next frame as soon as we can. */ + delayedCtrls_->applyControls(sequence); +} + +void CameraData::clearIncompleteRequests() +{ + /* + * All outstanding requests (and associated buffers) must be returned + * back to the application. + */ + while (!requestQueue_.empty()) { + Request *request = requestQueue_.front(); + + for (auto &b : request->buffers()) { + FrameBuffer *buffer = b.second; + /* + * Has the buffer already been handed back to the + * request? If not, do so now. + */ + if (buffer->request()) { + buffer->_d()->cancel(); + pipe()->completeBuffer(request, buffer); + } + } + + pipe()->completeRequest(request); + requestQueue_.pop(); + } +} + +void CameraData::handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream) +{ + /* + * It is possible to be here without a pending request, so check + * that we actually have one to action, otherwise we just return + * buffer back to the stream. + */ + Request *request = requestQueue_.empty() ? nullptr : requestQueue_.front(); + if (!dropFrameCount_ && request && request->findBuffer(stream) == buffer) { + /* + * Check if this is an externally provided buffer, and if + * so, we must stop tracking it in the pipeline handler. + */ + handleExternalBuffer(buffer, stream); + /* + * Tag the buffer as completed, returning it to the + * application. + */ + pipe()->completeBuffer(request, buffer); + } else { + /* + * This buffer was not part of the Request (which happens if an + * internal buffer was used for an external stream, or + * unconditionally for internal streams), or there is no pending + * request, so we can recycle it. 
+ */ + stream->returnBuffer(buffer); + } +} + +void CameraData::handleState() +{ + switch (state_) { + case State::Stopped: + case State::Busy: + case State::Error: + break; + + case State::IpaComplete: + /* If the request is completed, we will switch to Idle state. */ + checkRequestCompleted(); + /* + * No break here, we want to try running the pipeline again. + * The fallthrough clause below suppresses compiler warnings. + */ + [[fallthrough]]; + + case State::Idle: + tryRunPipeline(); + break; + } +} + +void CameraData::handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream) +{ + unsigned int id = stream->getBufferId(buffer); + + if (!(id & MaskExternalBuffer)) + return; + + /* Stop the Stream object from tracking the buffer. */ + stream->removeExternalBuffer(buffer); +} + +void CameraData::checkRequestCompleted() +{ + bool requestCompleted = false; + /* + * If we are dropping this frame, do not touch the request, simply + * change the state to IDLE when ready. + */ + if (!dropFrameCount_) { + Request *request = requestQueue_.front(); + if (request->hasPendingBuffers()) + return; + + /* Must wait for metadata to be filled in before completing. */ + if (state_ != State::IpaComplete) + return; + + pipe()->completeRequest(request); + requestQueue_.pop(); + requestCompleted = true; + } + + /* + * Make sure we have three outputs completed in the case of a dropped + * frame. 
+ */ + if (state_ == State::IpaComplete && + ((ispOutputCount_ == ispOutputTotal_ && dropFrameCount_) || requestCompleted)) { + state_ = State::Idle; + if (dropFrameCount_) { + dropFrameCount_--; + LOG(RPI, Debug) << "Dropping frame at the request of the IPA (" + << dropFrameCount_ << " left)"; + } + } +} + +void CameraData::fillRequestMetadata(const ControlList &bufferControls, Request *request) +{ + request->metadata().set(controls::SensorTimestamp, + bufferControls.get(controls::SensorTimestamp).value_or(0)); + + request->metadata().set(controls::ScalerCrop, scalerCrop_); +} + +} /* namespace libcamera */ diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.h b/src/libcamera/pipeline/rpi/common/pipeline_base.h new file mode 100644 index 000000000000..318266a6fb51 --- /dev/null +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.h @@ -0,0 +1,276 @@ +/* SPDX-License-Identifier: LGPL-2.1-or-later */ +/* + * Copyright (C) 2019-2023, Raspberry Pi Ltd + * + * pipeline_base.h - Pipeline handler base class for Raspberry Pi devices + */ + +#include +#include +#include +#include +#include +#include +#include +#include + +#include +#include + +#include "libcamera/internal/bayer_format.h" +#include "libcamera/internal/camera.h" +#include "libcamera/internal/camera_sensor.h" +#include "libcamera/internal/framebuffer.h" +#include "libcamera/internal/media_device.h" +#include "libcamera/internal/media_object.h" +#include "libcamera/internal/pipeline_handler.h" +#include "libcamera/internal/v4l2_videodevice.h" +#include "libcamera/internal/yaml_parser.h" + +#include +#include + +#include "delayed_controls.h" +#include "rpi_stream.h" + +using namespace std::chrono_literals; + +namespace libcamera { + +namespace RPi { + +/* Map of mbus codes to supported sizes reported by the sensor. 
+ */
+using SensorFormats = std::map<unsigned int, std::vector<Size>>;
+
+class CameraData : public Camera::Private
+{
+public:
+	CameraData(PipelineHandler *pipe)
+		: Camera::Private(pipe), state_(State::Stopped),
+		  flipsAlterBayerOrder_(false), dropFrameCount_(0), buffersAllocated_(false),
+		  ispOutputCount_(0), ispOutputTotal_(0)
+	{
+	}
+
+	virtual ~CameraData()
+	{
+	}
+
+	struct StreamParams {
+		StreamParams()
+			: index(0), cfg(nullptr), dev(nullptr)
+		{
+		}
+
+		StreamParams(unsigned int index_, StreamConfiguration *cfg_)
+			: index(index_), cfg(cfg_), dev(nullptr)
+		{
+		}
+
+		unsigned int index;
+		StreamConfiguration *cfg;
+		V4L2VideoDevice *dev;
+	};
+
+	virtual CameraConfiguration::Status platformValidate(std::vector<StreamParams> &rawStreams,
+							     std::vector<StreamParams> &outStreams) const = 0;
+	virtual int platformConfigure(const V4L2SubdeviceFormat &sensorFormat,
+				      std::optional<BayerFormat::Packing> packing,
+				      std::vector<StreamParams> &rawStreams,
+				      std::vector<StreamParams> &outStreams) = 0;
+	virtual void platformStart() = 0;
+	virtual void platformStop() = 0;
+
+	void freeBuffers();
+	virtual void platformFreeBuffers() = 0;
+
+	void enumerateVideoDevices(MediaLink *link, const std::string &frontend);
+
+	int loadPipelineConfiguration();
+	int loadIPA(ipa::RPi::InitResult *result);
+	int configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result);
+	virtual int platformInitIpa(ipa::RPi::InitParams &params) = 0;
+	virtual int platformConfigureIpa(ipa::RPi::ConfigParams &params) = 0;
+
+	void setDelayedControls(const ControlList &controls, uint32_t delayContext);
+	void setLensControls(const ControlList &controls);
+	void setSensorControls(ControlList &controls);
+
+	Rectangle scaleIspCrop(const Rectangle &ispCrop) const;
+	void calculateScalerCrop(const ControlList &controls);
+	virtual void platformIspCrop() = 0;
+
+	void cameraTimeout();
+	void frameStarted(uint32_t sequence);
+
+	void clearIncompleteRequests();
+	void handleStreamBuffer(FrameBuffer *buffer, Stream *stream);
+	void handleState();
+
+	virtual V4L2VideoDevice::Formats ispFormats() const = 0;
+	virtual V4L2VideoDevice::Formats rawFormats() const = 0;
+	virtual V4L2VideoDevice *frontendDevice() = 0;
+
+	virtual int platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) = 0;
+
+	std::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;
+
+	std::unique_ptr<CameraSensor> sensor_;
+	SensorFormats sensorFormats_;
+
+	/* The vector below is just for convenience when iterating over all streams. */
+	std::vector<Stream *> streams_;
+	/* Stores the ids of the buffers mapped in the IPA. */
+	std::unordered_set<unsigned int> bufferIds_;
+	/*
+	 * Stores a cascade of Video Mux or Bridge devices between the sensor and
+	 * Unicam together with media link across the entities.
+	 */
+	std::vector<std::pair<std::unique_ptr<V4L2Subdevice>, MediaLink *>> bridgeDevices_;
+
+	std::unique_ptr<DelayedControls> delayedCtrls_;
+	bool sensorMetadata_;
+
+	/*
+	 * All the functions in this class are called from a single calling
+	 * thread. So, we do not need to have any mutex to protect access to any
+	 * of the variables below.
+	 */
+	enum class State { Stopped, Idle, Busy, IpaComplete, Error };
+	State state_;
+
+	bool isRunning()
+	{
+		return state_ != State::Stopped && state_ != State::Error;
+	}
+
+	std::queue<Request *> requestQueue_;
+
+	/* Store the "native" Bayer order (that is, with no transforms applied). */
+	bool flipsAlterBayerOrder_;
+	BayerFormat::Order nativeBayerOrder_;
+
+	/* For handling digital zoom. */
+	IPACameraSensorInfo sensorInfo_;
+	Rectangle ispCrop_; /* crop in ISP (camera mode) pixels */
+	Rectangle scalerCrop_; /* crop in sensor native pixels */
+	Size ispMinCropSize_;
+
+	unsigned int dropFrameCount_;
+
+	/*
+	 * If set, this stores the value that represents a gain of one for
+	 * the V4L2_CID_NOTIFY_GAINS control.
+	 */
+	std::optional<int32_t> notifyGainsUnity_;
+
+	/* Have internal buffers been allocated? */
+	bool buffersAllocated_;
+
+	struct Config {
+		/*
+		 * Override any request from the IPA to drop a number of startup
+		 * frames.
+		 */
+		bool disableStartupFrameDrops;
+		/*
+		 * Override the camera timeout value calculated by the IPA based
+		 * on frame durations.
+		 */
+		unsigned int cameraTimeoutValue;
+	};
+
+	Config config_;
+
+protected:
+	void fillRequestMetadata(const ControlList &bufferControls,
+				 Request *request);
+
+	virtual void tryRunPipeline() = 0;
+
+	unsigned int ispOutputCount_;
+	unsigned int ispOutputTotal_;
+
+private:
+	void handleExternalBuffer(FrameBuffer *buffer, Stream *stream);
+	void checkRequestCompleted();
+};
+
+class PipelineHandlerBase : public PipelineHandler
+{
+public:
+	PipelineHandlerBase(CameraManager *manager)
+		: PipelineHandler(manager)
+	{
+	}
+
+	virtual ~PipelineHandlerBase()
+	{
+	}
+
+	static V4L2DeviceFormat toV4L2DeviceFormat(const V4L2VideoDevice *dev,
+						   const V4L2SubdeviceFormat &format,
+						   BayerFormat::Packing packingReq);
+
+	std::unique_ptr<CameraConfiguration>
+	generateConfiguration(Camera *camera, const StreamRoles &roles) override;
+	int configure(Camera *camera, CameraConfiguration *config) override;
+
+	int exportFrameBuffers(Camera *camera, libcamera::Stream *stream,
+			       std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;
+
+	int start(Camera *camera, const ControlList *controls) override;
+	void stopDevice(Camera *camera) override;
+	void releaseDevice(Camera *camera) override;
+
+	int queueRequestDevice(Camera *camera, Request *request) override;
+
+protected:
+	int registerCamera(MediaDevice *frontend, const std::string &frontendName,
+			   MediaDevice *backend, MediaEntity *sensorEntity);
+
+	void mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask);
+
+	virtual std::unique_ptr<CameraData> allocateCameraData() = 0;
+	virtual int platformRegister(std::unique_ptr<CameraData> &cameraData,
+				     MediaDevice *unicam, MediaDevice *isp) = 0;
+
+private:
+	CameraData *cameraData(Camera *camera)
+	{
+		return static_cast<CameraData *>(camera->_d());
+	}
+
+	int queueAllBuffers(Camera *camera);
+	virtual int prepareBuffers(Camera *camera) = 0;
+};
+
+class RPiCameraConfiguration final : public CameraConfiguration
+{
+public:
+	RPiCameraConfiguration(const CameraData *data)
+		: CameraConfiguration(), data_(data)
+	{
+	}
+
+
CameraConfiguration::Status validateColorSpaces(ColorSpaceFlags flags); + Status validate() override; + + /* Cache the combinedTransform_ that will be applied to the sensor */ + Transform combinedTransform_; + +private: + const CameraData *data_; + + /* + * Store the colour spaces that all our streams will have. RGB format streams + * will have the same colorspace as YUV streams, with YCbCr field cleared and + * range set to full. + */ + std::optional yuvColorSpace_; + std::optional rgbColorSpace_; +}; + +} /* namespace RPi */ + +} /* namespace libcamera */ diff --git a/src/libcamera/pipeline/rpi/vc4/data/example.yaml b/src/libcamera/pipeline/rpi/vc4/data/example.yaml index c90f518f8849..b8e01adeaf40 100644 --- a/src/libcamera/pipeline/rpi/vc4/data/example.yaml +++ b/src/libcamera/pipeline/rpi/vc4/data/example.yaml @@ -34,13 +34,13 @@ # # "disable_startup_frame_drops": false, - # Custom timeout value (in ms) for Unicam to use. This overrides + # Custom timeout value (in ms) for camera to use. This overrides # the value computed by the pipeline handler based on frame # durations. # # Set this value to 0 to use the pipeline handler computed # timeout value. 
# - # "unicam_timeout_value_ms": 0, + # "camera_timeout_value_ms": 0, } } diff --git a/src/libcamera/pipeline/rpi/vc4/meson.build b/src/libcamera/pipeline/rpi/vc4/meson.build index 228823f30922..cdb049c58d2c 100644 --- a/src/libcamera/pipeline/rpi/vc4/meson.build +++ b/src/libcamera/pipeline/rpi/vc4/meson.build @@ -2,7 +2,7 @@ libcamera_sources += files([ 'dma_heaps.cpp', - 'raspberrypi.cpp', + 'vc4.cpp', ]) subdir('data') diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp deleted file mode 100644 index bd66468683df..000000000000 --- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp +++ /dev/null @@ -1,2428 +0,0 @@ -/* SPDX-License-Identifier: LGPL-2.1-or-later */ -/* - * Copyright (C) 2019-2021, Raspberry Pi Ltd - * - * raspberrypi.cpp - Pipeline handler for VC4 based Raspberry Pi devices - */ -#include -#include -#include -#include -#include -#include -#include -#include -#include - -#include -#include -#include - -#include -#include -#include -#include -#include -#include -#include -#include - -#include -#include -#include - -#include "libcamera/internal/bayer_format.h" -#include "libcamera/internal/camera.h" -#include "libcamera/internal/camera_lens.h" -#include "libcamera/internal/camera_sensor.h" -#include "libcamera/internal/device_enumerator.h" -#include "libcamera/internal/framebuffer.h" -#include "libcamera/internal/ipa_manager.h" -#include "libcamera/internal/media_device.h" -#include "libcamera/internal/pipeline_handler.h" -#include "libcamera/internal/v4l2_videodevice.h" -#include "libcamera/internal/yaml_parser.h" - -#include "delayed_controls.h" -#include "dma_heaps.h" -#include "rpi_stream.h" - -using namespace std::chrono_literals; - -namespace libcamera { - -LOG_DEFINE_CATEGORY(RPI) - -namespace { - -constexpr unsigned int defaultRawBitDepth = 12; - -/* Map of mbus codes to supported sizes reported by the sensor. 
*/ -using SensorFormats = std::map>; - -SensorFormats populateSensorFormats(std::unique_ptr &sensor) -{ - SensorFormats formats; - - for (auto const mbusCode : sensor->mbusCodes()) - formats.emplace(mbusCode, sensor->sizes(mbusCode)); - - return formats; -} - -bool isMonoSensor(std::unique_ptr &sensor) -{ - unsigned int mbusCode = sensor->mbusCodes()[0]; - const BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode); - - return bayer.order == BayerFormat::Order::MONO; -} - -PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code, - BayerFormat::Packing packingReq) -{ - BayerFormat bayer = BayerFormat::fromMbusCode(mbus_code); - - ASSERT(bayer.isValid()); - - bayer.packing = packingReq; - PixelFormat pix = bayer.toPixelFormat(); - - /* - * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed - * variants. So if the PixelFormat returns as invalid, use the non-packed - * conversion instead. - */ - if (!pix.isValid()) { - bayer.packing = BayerFormat::Packing::None; - pix = bayer.toPixelFormat(); - } - - return pix; -} - -V4L2DeviceFormat toV4L2DeviceFormat(const V4L2VideoDevice *dev, - const V4L2SubdeviceFormat &format, - BayerFormat::Packing packingReq) -{ - const PixelFormat pix = mbusCodeToPixelFormat(format.mbus_code, packingReq); - V4L2DeviceFormat deviceFormat; - - deviceFormat.fourcc = dev->toV4L2PixelFormat(pix); - deviceFormat.size = format.size; - deviceFormat.colorSpace = format.colorSpace; - return deviceFormat; -} - -bool isRaw(const PixelFormat &pixFmt) -{ - /* This test works for both Bayer and raw mono formats. */ - return BayerFormat::fromPixelFormat(pixFmt).isValid(); -} - -double scoreFormat(double desired, double actual) -{ - double score = desired - actual; - /* Smaller desired dimensions are preferred. */ - if (score < 0.0) - score = (-score) / 8; - /* Penalise non-exact matches. 
*/ - if (actual != desired) - score *= 2; - - return score; -} - -V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth) -{ - double bestScore = std::numeric_limits::max(), score; - V4L2SubdeviceFormat bestFormat; - bestFormat.colorSpace = ColorSpace::Raw; - - constexpr float penaltyAr = 1500.0; - constexpr float penaltyBitDepth = 500.0; - - /* Calculate the closest/best mode from the user requested size. */ - for (const auto &iter : formatsMap) { - const unsigned int mbusCode = iter.first; - const PixelFormat format = mbusCodeToPixelFormat(mbusCode, - BayerFormat::Packing::None); - const PixelFormatInfo &info = PixelFormatInfo::info(format); - - for (const Size &size : iter.second) { - double reqAr = static_cast(req.width) / req.height; - double fmtAr = static_cast(size.width) / size.height; - - /* Score the dimensions for closeness. */ - score = scoreFormat(req.width, size.width); - score += scoreFormat(req.height, size.height); - score += penaltyAr * scoreFormat(reqAr, fmtAr); - - /* Add any penalties... this is not an exact science! 
*/ - score += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth; - - if (score <= bestScore) { - bestScore = score; - bestFormat.mbus_code = mbusCode; - bestFormat.size = size; - } - - LOG(RPI, Debug) << "Format: " << size - << " fmt " << format - << " Score: " << score - << " (best " << bestScore << ")"; - } - } - - return bestFormat; -} - -enum class Unicam : unsigned int { Image, Embedded }; -enum class Isp : unsigned int { Input, Output0, Output1, Stats }; - -} /* namespace */ - -class RPiCameraData : public Camera::Private -{ -public: - RPiCameraData(PipelineHandler *pipe) - : Camera::Private(pipe), state_(State::Stopped), - flipsAlterBayerOrder_(false), dropFrameCount_(0), - buffersAllocated_(false), ispOutputCount_(0) - { - } - - ~RPiCameraData() - { - freeBuffers(); - } - - void freeBuffers(); - void frameStarted(uint32_t sequence); - - int loadIPA(ipa::RPi::InitResult *result); - int configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result); - int loadPipelineConfiguration(); - - void enumerateVideoDevices(MediaLink *link); - - void processStatsComplete(const ipa::RPi::BufferIds &buffers); - void prepareIspComplete(const ipa::RPi::BufferIds &buffers); - void setIspControls(const ControlList &controls); - void setDelayedControls(const ControlList &controls, uint32_t delayContext); - void setLensControls(const ControlList &controls); - void setCameraTimeout(uint32_t maxExposureTimeMs); - void setSensorControls(ControlList &controls); - void unicamTimeout(); - - /* bufferComplete signal handlers. 
*/ - void unicamBufferDequeue(FrameBuffer *buffer); - void ispInputDequeue(FrameBuffer *buffer); - void ispOutputDequeue(FrameBuffer *buffer); - - void clearIncompleteRequests(); - void handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream); - void handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream); - void handleState(); - Rectangle scaleIspCrop(const Rectangle &ispCrop) const; - void applyScalerCrop(const ControlList &controls); - - std::unique_ptr ipa_; - - std::unique_ptr sensor_; - SensorFormats sensorFormats_; - /* Array of Unicam and ISP device streams and associated buffers/streams. */ - RPi::Device unicam_; - RPi::Device isp_; - /* The vector below is just for convenience when iterating over all streams. */ - std::vector streams_; - /* Stores the ids of the buffers mapped in the IPA. */ - std::unordered_set bufferIds_; - /* - * Stores a cascade of Video Mux or Bridge devices between the sensor and - * Unicam together with media link across the entities. - */ - std::vector, MediaLink *>> bridgeDevices_; - - /* DMAHEAP allocation helper. */ - RPi::DmaHeap dmaHeap_; - SharedFD lsTable_; - - std::unique_ptr delayedCtrls_; - bool sensorMetadata_; - - /* - * All the functions in this class are called from a single calling - * thread. So, we do not need to have any mutex to protect access to any - * of the variables below. - */ - enum class State { Stopped, Idle, Busy, IpaComplete, Error }; - State state_; - - bool isRunning() - { - return state_ != State::Stopped && state_ != State::Error; - } - - struct BayerFrame { - FrameBuffer *buffer; - ControlList controls; - unsigned int delayContext; - }; - - std::queue bayerQueue_; - std::queue embeddedQueue_; - std::deque requestQueue_; - - /* - * Store the "native" Bayer order (that is, with no transforms - * applied). - */ - bool flipsAlterBayerOrder_; - BayerFormat::Order nativeBayerOrder_; - - /* For handling digital zoom. 
*/ - IPACameraSensorInfo sensorInfo_; - Rectangle ispCrop_; /* crop in ISP (camera mode) pixels */ - Rectangle scalerCrop_; /* crop in sensor native pixels */ - Size ispMinCropSize_; - - unsigned int dropFrameCount_; - - /* - * If set, this stores the value that represets a gain of one for - * the V4L2_CID_NOTIFY_GAINS control. - */ - std::optional notifyGainsUnity_; - - /* Have internal buffers been allocated? */ - bool buffersAllocated_; - - struct Config { - /* - * The minimum number of internal buffers to be allocated for - * the Unicam Image stream. - */ - unsigned int minUnicamBuffers; - /* - * The minimum total (internal + external) buffer count used for - * the Unicam Image stream. - * - * Note that: - * minTotalUnicamBuffers must be >= 1, and - * minTotalUnicamBuffers >= minUnicamBuffers - */ - unsigned int minTotalUnicamBuffers; - /* - * Override any request from the IPA to drop a number of startup - * frames. - */ - bool disableStartupFrameDrops; - /* - * Override the Unicam timeout value calculated by the IPA based - * on frame durations. - */ - unsigned int unicamTimeoutValue; - }; - - Config config_; - -private: - void checkRequestCompleted(); - void fillRequestMetadata(const ControlList &bufferControls, - Request *request); - void tryRunPipeline(); - bool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer); - - unsigned int ispOutputCount_; -}; - -class RPiCameraConfiguration : public CameraConfiguration -{ -public: - RPiCameraConfiguration(const RPiCameraData *data); - - CameraConfiguration::Status validateColorSpaces(ColorSpaceFlags flags); - Status validate() override; - - /* Cache the combinedTransform_ that will be applied to the sensor */ - Transform combinedTransform_; - -private: - const RPiCameraData *data_; - - /* - * Store the colour spaces that all our streams will have. RGB format streams - * will have the same colorspace as YUV streams, with YCbCr field cleared and - * range set to full. 
- */ - std::optional yuvColorSpace_; - std::optional rgbColorSpace_; -}; - -class PipelineHandlerRPi : public PipelineHandler -{ -public: - PipelineHandlerRPi(CameraManager *manager); - - std::unique_ptr generateConfiguration(Camera *camera, - const StreamRoles &roles) override; - int configure(Camera *camera, CameraConfiguration *config) override; - - int exportFrameBuffers(Camera *camera, Stream *stream, - std::vector> *buffers) override; - - int start(Camera *camera, const ControlList *controls) override; - void stopDevice(Camera *camera) override; - - int queueRequestDevice(Camera *camera, Request *request) override; - - bool match(DeviceEnumerator *enumerator) override; - - void releaseDevice(Camera *camera) override; - -private: - RPiCameraData *cameraData(Camera *camera) - { - return static_cast(camera->_d()); - } - - int registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity); - int queueAllBuffers(Camera *camera); - int prepareBuffers(Camera *camera); - void mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask); -}; - -RPiCameraConfiguration::RPiCameraConfiguration(const RPiCameraData *data) - : CameraConfiguration(), data_(data) -{ -} - -static const std::vector validColorSpaces = { - ColorSpace::Sycc, - ColorSpace::Smpte170m, - ColorSpace::Rec709 -}; - -static std::optional findValidColorSpace(const ColorSpace &colourSpace) -{ - for (auto cs : validColorSpaces) { - if (colourSpace.primaries == cs.primaries && - colourSpace.transferFunction == cs.transferFunction) - return cs; - } - - return std::nullopt; -} - -static bool isRgb(const PixelFormat &pixFmt) -{ - const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt); - return info.colourEncoding == PixelFormatInfo::ColourEncodingRGB; -} - -static bool isYuv(const PixelFormat &pixFmt) -{ - /* The code below would return true for raw mono streams, so weed those out first. 
*/ - if (isRaw(pixFmt)) - return false; - - const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt); - return info.colourEncoding == PixelFormatInfo::ColourEncodingYUV; -} - -/* - * Raspberry Pi drivers expect the following colour spaces: - * - V4L2_COLORSPACE_RAW for raw streams. - * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for - * non-raw streams. Other fields such as transfer function, YCbCr encoding and - * quantisation are not used. - * - * The libcamera colour spaces that we wish to use corresponding to these are therefore: - * - ColorSpace::Raw for V4L2_COLORSPACE_RAW - * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG - * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M - * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709 - */ - -CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags) -{ - Status status = Valid; - yuvColorSpace_.reset(); - - for (auto cfg : config_) { - /* First fix up raw streams to have the "raw" colour space. */ - if (isRaw(cfg.pixelFormat)) { - /* If there was no value here, that doesn't count as "adjusted". */ - if (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw) - status = Adjusted; - cfg.colorSpace = ColorSpace::Raw; - continue; - } - - /* Next we need to find our shared colour space. The first valid one will do. */ - if (cfg.colorSpace && !yuvColorSpace_) - yuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value()); - } - - /* If no colour space was given anywhere, choose sYCC. */ - if (!yuvColorSpace_) - yuvColorSpace_ = ColorSpace::Sycc; - - /* Note the version of this that any RGB streams will have to use. */ - rgbColorSpace_ = yuvColorSpace_; - rgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None; - rgbColorSpace_->range = ColorSpace::Range::Full; - - /* Go through the streams again and force everyone to the same colour space. 
*/ - for (auto cfg : config_) { - if (cfg.colorSpace == ColorSpace::Raw) - continue; - - if (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) { - /* Again, no value means "not adjusted". */ - if (cfg.colorSpace) - status = Adjusted; - cfg.colorSpace = yuvColorSpace_; - } - if (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) { - /* Be nice, and let the YUV version count as non-adjusted too. */ - if (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_) - status = Adjusted; - cfg.colorSpace = rgbColorSpace_; - } - } - - return status; -} - -CameraConfiguration::Status RPiCameraConfiguration::validate() -{ - Status status = Valid; - - if (config_.empty()) - return Invalid; - - status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace); - - /* - * Validate the requested transform against the sensor capabilities and - * rotation and store the final combined transform that configure() will - * need to apply to the sensor to save us working it out again. - */ - Transform requestedTransform = transform; - combinedTransform_ = data_->sensor_->validateTransform(&transform); - if (transform != requestedTransform) - status = Adjusted; - - unsigned int rawCount = 0, outCount = 0, count = 0, maxIndex = 0; - std::pair outSize[2]; - Size maxSize; - for (StreamConfiguration &cfg : config_) { - if (isRaw(cfg.pixelFormat)) { - /* - * Calculate the best sensor mode we can use based on - * the user request. - */ - V4L2VideoDevice *unicam = data_->unicam_[Unicam::Image].dev(); - const PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat); - unsigned int bitDepth = info.isValid() ? 
info.bitsPerPixel : defaultRawBitDepth; - V4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth); - BayerFormat::Packing packing = BayerFormat::Packing::CSI2; - if (info.isValid() && !info.packed) - packing = BayerFormat::Packing::None; - V4L2DeviceFormat unicamFormat = toV4L2DeviceFormat(unicam, sensorFormat, packing); - int ret = unicam->tryFormat(&unicamFormat); - if (ret) - return Invalid; - - /* - * Some sensors change their Bayer order when they are - * h-flipped or v-flipped, according to the transform. - * If this one does, we must advertise the transformed - * Bayer order in the raw stream. Note how we must - * fetch the "native" (i.e. untransformed) Bayer order, - * because the sensor may currently be flipped! - */ - V4L2PixelFormat fourcc = unicamFormat.fourcc; - if (data_->flipsAlterBayerOrder_) { - BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc); - bayer.order = data_->nativeBayerOrder_; - bayer = bayer.transform(combinedTransform_); - fourcc = bayer.toV4L2PixelFormat(); - } - - PixelFormat unicamPixFormat = fourcc.toPixelFormat(); - if (cfg.size != unicamFormat.size || - cfg.pixelFormat != unicamPixFormat) { - cfg.size = unicamFormat.size; - cfg.pixelFormat = unicamPixFormat; - status = Adjusted; - } - - cfg.stride = unicamFormat.planes[0].bpl; - cfg.frameSize = unicamFormat.planes[0].size; - - rawCount++; - } else { - outSize[outCount] = std::make_pair(count, cfg.size); - /* Record the largest resolution for fixups later. */ - if (maxSize < cfg.size) { - maxSize = cfg.size; - maxIndex = outCount; - } - outCount++; - } - - count++; - - /* Can only output 1 RAW stream, or 2 YUV/RGB streams. */ - if (rawCount > 1 || outCount > 2) { - LOG(RPI, Error) << "Invalid number of streams requested"; - return Invalid; - } - } - - /* - * Now do any fixups needed. For the two ISP outputs, one stream must be - * equal or smaller than the other in all dimensions. 
- */ - for (unsigned int i = 0; i < outCount; i++) { - outSize[i].second.width = std::min(outSize[i].second.width, - maxSize.width); - outSize[i].second.height = std::min(outSize[i].second.height, - maxSize.height); - - if (config_.at(outSize[i].first).size != outSize[i].second) { - config_.at(outSize[i].first).size = outSize[i].second; - status = Adjusted; - } - - /* - * Also validate the correct pixel formats here. - * Note that Output0 and Output1 support a different - * set of formats. - * - * Output 0 must be for the largest resolution. We will - * have that fixed up in the code above. - * - */ - StreamConfiguration &cfg = config_.at(outSize[i].first); - PixelFormat &cfgPixFmt = cfg.pixelFormat; - V4L2VideoDevice *dev; - - if (i == maxIndex) - dev = data_->isp_[Isp::Output0].dev(); - else - dev = data_->isp_[Isp::Output1].dev(); - - V4L2VideoDevice::Formats fmts = dev->formats(); - - if (fmts.find(dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) { - /* If we cannot find a native format, use a default one. */ - cfgPixFmt = formats::NV12; - status = Adjusted; - } - - V4L2DeviceFormat format; - format.fourcc = dev->toV4L2PixelFormat(cfg.pixelFormat); - format.size = cfg.size; - /* We want to send the associated YCbCr info through to the driver. */ - format.colorSpace = yuvColorSpace_; - - LOG(RPI, Debug) - << "Try color space " << ColorSpace::toString(cfg.colorSpace); - - int ret = dev->tryFormat(&format); - if (ret) - return Invalid; - - /* - * But for RGB streams, the YCbCr info gets overwritten on the way back - * so we must check against what the stream cfg says, not what we actually - * requested (which carefully included the YCbCr info)! 
- */ - if (cfg.colorSpace != format.colorSpace) { - status = Adjusted; - LOG(RPI, Debug) - << "Color space changed from " - << ColorSpace::toString(cfg.colorSpace) << " to " - << ColorSpace::toString(format.colorSpace); - } - - cfg.colorSpace = format.colorSpace; - - cfg.stride = format.planes[0].bpl; - cfg.frameSize = format.planes[0].size; - - } - - return status; -} - -PipelineHandlerRPi::PipelineHandlerRPi(CameraManager *manager) - : PipelineHandler(manager) -{ -} - -std::unique_ptr -PipelineHandlerRPi::generateConfiguration(Camera *camera, const StreamRoles &roles) -{ - RPiCameraData *data = cameraData(camera); - std::unique_ptr config = - std::make_unique(data); - V4L2SubdeviceFormat sensorFormat; - unsigned int bufferCount; - PixelFormat pixelFormat; - V4L2VideoDevice::Formats fmts; - Size size; - std::optional colorSpace; - - if (roles.empty()) - return config; - - unsigned int rawCount = 0; - unsigned int outCount = 0; - Size sensorSize = data->sensor_->resolution(); - for (const StreamRole role : roles) { - switch (role) { - case StreamRole::Raw: - size = sensorSize; - sensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth); - pixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code, - BayerFormat::Packing::CSI2); - ASSERT(pixelFormat.isValid()); - colorSpace = ColorSpace::Raw; - bufferCount = 2; - rawCount++; - break; - - case StreamRole::StillCapture: - fmts = data->isp_[Isp::Output0].dev()->formats(); - pixelFormat = formats::NV12; - /* - * Still image codecs usually expect the sYCC color space. - * Even RGB codecs will be fine as the RGB we get with the - * sYCC color space is the same as sRGB. - */ - colorSpace = ColorSpace::Sycc; - /* Return the largest sensor resolution. */ - size = sensorSize; - bufferCount = 1; - outCount++; - break; - - case StreamRole::VideoRecording: - /* - * The colour denoise algorithm requires the analysis - * image, produced by the second ISP output, to be in - * YUV420 format. 
Select this format as the default, to - * maximize chances that it will be picked by - * applications and enable usage of the colour denoise - * algorithm. - */ - fmts = data->isp_[Isp::Output0].dev()->formats(); - pixelFormat = formats::YUV420; - /* - * Choose a color space appropriate for video recording. - * Rec.709 will be a good default for HD resolutions. - */ - colorSpace = ColorSpace::Rec709; - size = { 1920, 1080 }; - bufferCount = 4; - outCount++; - break; - - case StreamRole::Viewfinder: - fmts = data->isp_[Isp::Output0].dev()->formats(); - pixelFormat = formats::ARGB8888; - colorSpace = ColorSpace::Sycc; - size = { 800, 600 }; - bufferCount = 4; - outCount++; - break; - - default: - LOG(RPI, Error) << "Requested stream role not supported: " - << role; - return nullptr; - } - - if (rawCount > 1 || outCount > 2) { - LOG(RPI, Error) << "Invalid stream roles requested"; - return nullptr; - } - - std::map> deviceFormats; - if (role == StreamRole::Raw) { - /* Translate the MBUS codes to a PixelFormat. */ - for (const auto &format : data->sensorFormats_) { - PixelFormat pf = mbusCodeToPixelFormat(format.first, - BayerFormat::Packing::CSI2); - if (pf.isValid()) - deviceFormats.emplace(std::piecewise_construct, std::forward_as_tuple(pf), - std::forward_as_tuple(format.second.begin(), format.second.end())); - } - } else { - /* - * Translate the V4L2PixelFormat to PixelFormat. Note that we - * limit the recommended largest ISP output size to match the - * sensor resolution. - */ - for (const auto &format : fmts) { - PixelFormat pf = format.first.toPixelFormat(); - if (pf.isValid()) { - const SizeRange &ispSizes = format.second[0]; - deviceFormats[pf].emplace_back(ispSizes.min, sensorSize, - ispSizes.hStep, ispSizes.vStep); - } - } - } - - /* Add the stream format based on the device node used for the use case. 
*/ - StreamFormats formats(deviceFormats); - StreamConfiguration cfg(formats); - cfg.size = size; - cfg.pixelFormat = pixelFormat; - cfg.colorSpace = colorSpace; - cfg.bufferCount = bufferCount; - config->addConfiguration(cfg); - } - - config->validate(); - - return config; -} - -int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) -{ - RPiCameraData *data = cameraData(camera); - int ret; - - /* Start by freeing all buffers and reset the Unicam and ISP stream states. */ - data->freeBuffers(); - for (auto const stream : data->streams_) - stream->setExternal(false); - - BayerFormat::Packing packing = BayerFormat::Packing::CSI2; - Size maxSize, sensorSize; - unsigned int maxIndex = 0; - bool rawStream = false; - unsigned int bitDepth = defaultRawBitDepth; - - /* - * Look for the RAW stream (if given) size as well as the largest - * ISP output size. - */ - for (unsigned i = 0; i < config->size(); i++) { - StreamConfiguration &cfg = config->at(i); - - if (isRaw(cfg.pixelFormat)) { - /* - * If we have been given a RAW stream, use that size - * for setting up the sensor. - */ - sensorSize = cfg.size; - rawStream = true; - /* Check if the user has explicitly set an unpacked format. */ - BayerFormat bayerFormat = BayerFormat::fromPixelFormat(cfg.pixelFormat); - packing = bayerFormat.packing; - bitDepth = bayerFormat.bitDepth; - } else { - if (cfg.size > maxSize) { - maxSize = config->at(i).size; - maxIndex = i; - } - } - } - - /* - * Calculate the best sensor mode we can use based on the user's - * request, and apply it to the sensor with the cached transform, if - * any. - */ - V4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_, rawStream ? 
sensorSize : maxSize, bitDepth); - const RPiCameraConfiguration *rpiConfig = static_cast(config); - ret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_); - if (ret) - return ret; - - V4L2VideoDevice *unicam = data->unicam_[Unicam::Image].dev(); - V4L2DeviceFormat unicamFormat = toV4L2DeviceFormat(unicam, sensorFormat, packing); - ret = unicam->setFormat(&unicamFormat); - if (ret) - return ret; - - LOG(RPI, Info) << "Sensor: " << camera->id() - << " - Selected sensor format: " << sensorFormat - << " - Selected unicam format: " << unicamFormat; - - ret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat); - if (ret) - return ret; - - /* - * See which streams are requested, and route the user - * StreamConfiguration appropriately. - */ - V4L2DeviceFormat format; - bool output0Set = false, output1Set = false; - for (unsigned i = 0; i < config->size(); i++) { - StreamConfiguration &cfg = config->at(i); - - if (isRaw(cfg.pixelFormat)) { - cfg.setStream(&data->unicam_[Unicam::Image]); - data->unicam_[Unicam::Image].setExternal(true); - continue; - } - - /* The largest resolution gets routed to the ISP Output 0 node. */ - RPi::Stream *stream = i == maxIndex ? 
&data->isp_[Isp::Output0] - : &data->isp_[Isp::Output1]; - - V4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg.pixelFormat); - format.size = cfg.size; - format.fourcc = fourcc; - format.colorSpace = cfg.colorSpace; - - LOG(RPI, Debug) << "Setting " << stream->name() << " to " - << format; - - ret = stream->dev()->setFormat(&format); - if (ret) - return -EINVAL; - - if (format.size != cfg.size || format.fourcc != fourcc) { - LOG(RPI, Error) - << "Failed to set requested format on " << stream->name() - << ", returned " << format; - return -EINVAL; - } - - LOG(RPI, Debug) - << "Stream " << stream->name() << " has color space " - << ColorSpace::toString(cfg.colorSpace); - - cfg.setStream(stream); - stream->setExternal(true); - - if (i != maxIndex) - output1Set = true; - else - output0Set = true; - } - - /* - * If ISP::Output0 stream has not been configured by the application, - * we must allow the hardware to generate an output so that the data - * flow in the pipeline handler remains consistent, and we still generate - * statistics for the IPA to use. So enable the output at a very low - * resolution for internal use. - * - * \todo Allow the pipeline to work correctly without Output0 and only - * statistics coming from the hardware. - */ - if (!output0Set) { - V4L2VideoDevice *dev = data->isp_[Isp::Output0].dev(); - - maxSize = Size(320, 240); - format = {}; - format.size = maxSize; - format.fourcc = dev->toV4L2PixelFormat(formats::YUV420); - /* No one asked for output, so the color space doesn't matter. */ - format.colorSpace = ColorSpace::Sycc; - ret = dev->setFormat(&format); - if (ret) { - LOG(RPI, Error) - << "Failed to set default format on ISP Output0: " - << ret; - return -EINVAL; - } - - LOG(RPI, Debug) << "Defaulting ISP Output0 format to " - << format; - } - - /* - * If ISP::Output1 stream has not been requested by the application, we - * set it up for internal use now. 
This second stream will be used for - * fast colour denoise, and must be a quarter resolution of the ISP::Output0 - * stream. However, also limit the maximum size to 1200 pixels in the - * larger dimension, just to avoid being wasteful with buffer allocations - * and memory bandwidth. - * - * \todo If Output 1 format is not YUV420, Output 1 ought to be disabled as - * colour denoise will not run. - */ - if (!output1Set) { - V4L2VideoDevice *dev = data->isp_[Isp::Output1].dev(); - - V4L2DeviceFormat output1Format; - constexpr Size maxDimensions(1200, 1200); - const Size limit = maxDimensions.boundedToAspectRatio(format.size); - - output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2); - output1Format.colorSpace = format.colorSpace; - output1Format.fourcc = dev->toV4L2PixelFormat(formats::YUV420); - - LOG(RPI, Debug) << "Setting ISP Output1 (internal) to " - << output1Format; - - ret = dev->setFormat(&output1Format); - if (ret) { - LOG(RPI, Error) << "Failed to set format on ISP Output1: " - << ret; - return -EINVAL; - } - } - - /* ISP statistics output format. */ - format = {}; - format.fourcc = V4L2PixelFormat(V4L2_META_FMT_BCM2835_ISP_STATS); - ret = data->isp_[Isp::Stats].dev()->setFormat(&format); - if (ret) { - LOG(RPI, Error) << "Failed to set format on ISP stats stream: " - << format; - return ret; - } - - /* Figure out the smallest selection the ISP will allow. */ - Rectangle testCrop(0, 0, 1, 1); - data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop); - data->ispMinCropSize_ = testCrop.size(); - - /* Adjust aspect ratio by providing crops on the input image. 
*/ - Size size = unicamFormat.size.boundedToAspectRatio(maxSize); - Rectangle crop = size.centeredTo(Rectangle(unicamFormat.size).center()); - Rectangle defaultCrop = crop; - data->ispCrop_ = crop; - - data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop); - - ipa::RPi::ConfigResult result; - ret = data->configureIPA(config, &result); - if (ret) - LOG(RPI, Error) << "Failed to configure the IPA: " << ret; - - /* - * Set the scaler crop to the value we are using (scaled to native sensor - * coordinates). - */ - data->scalerCrop_ = data->scaleIspCrop(data->ispCrop_); - - /* - * Configure the Unicam embedded data output format only if the sensor - * supports it. - */ - if (data->sensorMetadata_) { - V4L2SubdeviceFormat embeddedFormat; - - data->sensor_->device()->getFormat(1, &embeddedFormat); - format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA); - format.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height; - - LOG(RPI, Debug) << "Setting embedded data format."; - ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format); - if (ret) { - LOG(RPI, Error) << "Failed to set format on Unicam embedded: " - << format; - return ret; - } - } - - /* - * Update the ScalerCropMaximum to the correct value for this camera mode. - * For us, it's the same as the "analogue crop". - * - * \todo Make this property the ScalerCrop maximum value when dynamic - * controls are available and set it at validate() time - */ - data->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop); - - /* Store the mode sensitivity for the application. */ - data->properties_.set(properties::SensorSensitivity, result.modeSensitivity); - - /* Update the controls that the Raspberry Pi IPA can handle. */ - ControlInfoMap::Map ctrlMap; - for (auto const &c : result.controlInfo) - ctrlMap.emplace(c.first, c.second); - - /* Add the ScalerCrop control limits based on the current mode. 
*/ - Rectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_)); - defaultCrop = data->scaleIspCrop(defaultCrop); - ctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, defaultCrop); - - data->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap()); - - /* Setup the Video Mux/Bridge entities. */ - for (auto &[device, link] : data->bridgeDevices_) { - /* - * Start by disabling all the sink pad links on the devices in the - * cascade, with the exception of the link connecting the device. - */ - for (const MediaPad *p : device->entity()->pads()) { - if (!(p->flags() & MEDIA_PAD_FL_SINK)) - continue; - - for (MediaLink *l : p->links()) { - if (l != link) - l->setEnabled(false); - } - } - - /* - * Next, enable the entity -> entity links, and setup the pad format. - * - * \todo Some bridge devices may change the media bus code, so we - * ought to read the source pad format and propagate it to the sink pad. - */ - link->setEnabled(true); - const MediaPad *sinkPad = link->sink(); - ret = device->setFormat(sinkPad->index(), &sensorFormat); - if (ret) { - LOG(RPI, Error) << "Failed to set format on " << device->entity()->name() - << " pad " << sinkPad->index() - << " with format " << format - << ": " << ret; - return ret; - } - - LOG(RPI, Debug) << "Configured media link on device " << device->entity()->name() - << " on pad " << sinkPad->index(); - } - - return ret; -} - -int PipelineHandlerRPi::exportFrameBuffers([[maybe_unused]] Camera *camera, Stream *stream, - std::vector<std::unique_ptr<FrameBuffer>> *buffers) -{ - RPi::Stream *s = static_cast<RPi::Stream *>(stream); - unsigned int count = stream->configuration().bufferCount; - int ret = s->dev()->exportBuffers(count, buffers); - - s->setExportedBuffers(buffers); - - return ret; -} - -int PipelineHandlerRPi::start(Camera *camera, const ControlList *controls) -{ - RPiCameraData *data = cameraData(camera); - int ret; - - /* Check if a ScalerCrop control was specified.
*/ - if (controls) - data->applyScalerCrop(*controls); - - /* Start the IPA. */ - ipa::RPi::StartResult result; - data->ipa_->start(controls ? *controls : ControlList{ controls::controls }, - &result); - - /* Apply any gain/exposure settings that the IPA may have passed back. */ - if (!result.controls.empty()) - data->setSensorControls(result.controls); - - /* Configure the number of dropped frames required on startup. */ - data->dropFrameCount_ = data->config_.disableStartupFrameDrops - ? 0 : result.dropFrameCount; - - for (auto const stream : data->streams_) - stream->resetBuffers(); - - if (!data->buffersAllocated_) { - /* Allocate buffers for internal pipeline usage. */ - ret = prepareBuffers(camera); - if (ret) { - LOG(RPI, Error) << "Failed to allocate buffers"; - data->freeBuffers(); - stop(camera); - return ret; - } - data->buffersAllocated_ = true; - } - - /* We need to set the dropFrameCount_ before queueing buffers. */ - ret = queueAllBuffers(camera); - if (ret) { - LOG(RPI, Error) << "Failed to queue buffers"; - stop(camera); - return ret; - } - - /* Enable SOF event generation. */ - data->unicam_[Unicam::Image].dev()->setFrameStartEnabled(true); - - /* - * Reset the delayed controls with the gain and exposure values set by - * the IPA. - */ - data->delayedCtrls_->reset(0); - - data->state_ = RPiCameraData::State::Idle; - - /* Start all streams. */ - for (auto const stream : data->streams_) { - ret = stream->dev()->streamOn(); - if (ret) { - stop(camera); - return ret; - } - } - - return 0; -} - -void PipelineHandlerRPi::stopDevice(Camera *camera) -{ - RPiCameraData *data = cameraData(camera); - - data->state_ = RPiCameraData::State::Stopped; - - /* Disable SOF event generation. */ - data->unicam_[Unicam::Image].dev()->setFrameStartEnabled(false); - - for (auto const stream : data->streams_) - stream->dev()->streamOff(); - - data->clearIncompleteRequests(); - data->bayerQueue_ = {}; - data->embeddedQueue_ = {}; - - /* Stop the IPA. 
*/ - data->ipa_->stop(); -} - -int PipelineHandlerRPi::queueRequestDevice(Camera *camera, Request *request) -{ - RPiCameraData *data = cameraData(camera); - - if (!data->isRunning()) - return -EINVAL; - - LOG(RPI, Debug) << "queueRequestDevice: New request."; - - /* Push all buffers supplied in the Request to the respective streams. */ - for (auto stream : data->streams_) { - if (!stream->isExternal()) - continue; - - FrameBuffer *buffer = request->findBuffer(stream); - if (buffer && !stream->getBufferId(buffer)) { - /* - * This buffer is not recognised, so it must have been allocated - * outside the v4l2 device. Store it in the stream buffer list - * so we can track it. - */ - stream->setExternalBuffer(buffer); - } - - /* - * If no buffer is provided by the request for this stream, we - * queue a nullptr to the stream to signify that it must use an - * internally allocated buffer for this capture request. This - * buffer will not be given back to the application, but is used - * to support the internal pipeline flow. - * - * The below queueBuffer() call will do nothing if there are not - * enough internal buffers allocated, but this will be handled by - * queuing the request for buffers in the RPiStream object. - */ - int ret = stream->queueBuffer(buffer); - if (ret) - return ret; - } - - /* Push the request to the back of the queue. */ - data->requestQueue_.push_back(request); - data->handleState(); - - return 0; -} - -bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator) -{ - constexpr unsigned int numUnicamDevices = 2; - - /* - * Loop over all Unicam instances, but return out once a match is found. - * This is to ensure we correctly enumerate the camera when an instance - * of Unicam has registered with media controller, but has not registered - * device nodes due to a sensor subdevice failure.
- */ - for (unsigned int i = 0; i < numUnicamDevices; i++) { - DeviceMatch unicam("unicam"); - MediaDevice *unicamDevice = acquireMediaDevice(enumerator, unicam); - - if (!unicamDevice) { - LOG(RPI, Debug) << "Unable to acquire a Unicam instance"; - continue; - } - - DeviceMatch isp("bcm2835-isp"); - MediaDevice *ispDevice = acquireMediaDevice(enumerator, isp); - - if (!ispDevice) { - LOG(RPI, Debug) << "Unable to acquire ISP instance"; - continue; - } - - /* - * The loop below is used to register multiple cameras behind one or more - * video mux devices that are attached to a particular Unicam instance. - * Obviously these cameras cannot be used simultaneously. - */ - unsigned int numCameras = 0; - for (MediaEntity *entity : unicamDevice->entities()) { - if (entity->function() != MEDIA_ENT_F_CAM_SENSOR) - continue; - - int ret = registerCamera(unicamDevice, ispDevice, entity); - if (ret) - LOG(RPI, Error) << "Failed to register camera " - << entity->name() << ": " << ret; - else - numCameras++; - } - - if (numCameras) - return true; - } - - return false; -} - -void PipelineHandlerRPi::releaseDevice(Camera *camera) -{ - RPiCameraData *data = cameraData(camera); - data->freeBuffers(); -} - -int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity) -{ - std::unique_ptr data = std::make_unique(this); - - if (!data->dmaHeap_.isValid()) - return -ENOMEM; - - MediaEntity *unicamImage = unicam->getEntityByName("unicam-image"); - MediaEntity *ispOutput0 = isp->getEntityByName("bcm2835-isp0-output0"); - MediaEntity *ispCapture1 = isp->getEntityByName("bcm2835-isp0-capture1"); - MediaEntity *ispCapture2 = isp->getEntityByName("bcm2835-isp0-capture2"); - MediaEntity *ispCapture3 = isp->getEntityByName("bcm2835-isp0-capture3"); - - if (!unicamImage || !ispOutput0 || !ispCapture1 || !ispCapture2 || !ispCapture3) - return -ENOENT; - - /* Locate and open the unicam video streams. 
*/ - data->unicam_[Unicam::Image] = RPi::Stream("Unicam Image", unicamImage); - - /* An embedded data node will not be present if the sensor does not support it. */ - MediaEntity *unicamEmbedded = unicam->getEntityByName("unicam-embedded"); - if (unicamEmbedded) { - data->unicam_[Unicam::Embedded] = RPi::Stream("Unicam Embedded", unicamEmbedded); - data->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data.get(), - &RPiCameraData::unicamBufferDequeue); - } - - /* Tag the ISP input stream as an import stream. */ - data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, true); - data->isp_[Isp::Output0] = RPi::Stream("ISP Output0", ispCapture1); - data->isp_[Isp::Output1] = RPi::Stream("ISP Output1", ispCapture2); - data->isp_[Isp::Stats] = RPi::Stream("ISP Stats", ispCapture3); - - /* Wire up all the buffer connections. */ - data->unicam_[Unicam::Image].dev()->dequeueTimeout.connect(data.get(), &RPiCameraData::unicamTimeout); - data->unicam_[Unicam::Image].dev()->frameStart.connect(data.get(), &RPiCameraData::frameStarted); - data->unicam_[Unicam::Image].dev()->bufferReady.connect(data.get(), &RPiCameraData::unicamBufferDequeue); - data->isp_[Isp::Input].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispInputDequeue); - data->isp_[Isp::Output0].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue); - data->isp_[Isp::Output1].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue); - data->isp_[Isp::Stats].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue); - - data->sensor_ = std::make_unique(sensorEntity); - if (!data->sensor_) - return -EINVAL; - - if (data->sensor_->init()) - return -EINVAL; - - /* - * Enumerate all the Video Mux/Bridge devices across the sensor -> unicam - * chain. There may be a cascade of devices in this chain! 
- */ - MediaLink *link = sensorEntity->getPadByIndex(0)->links()[0]; - data->enumerateVideoDevices(link); - - data->sensorFormats_ = populateSensorFormats(data->sensor_); - - ipa::RPi::InitResult result; - if (data->loadIPA(&result)) { - LOG(RPI, Error) << "Failed to load a suitable IPA library"; - return -EINVAL; - } - - if (result.sensorConfig.sensorMetadata ^ !!unicamEmbedded) { - LOG(RPI, Warning) << "Mismatch between Unicam and CamHelper for embedded data usage!"; - result.sensorConfig.sensorMetadata = false; - if (unicamEmbedded) - data->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect(); - } - - /* - * Open all Unicam and ISP streams. The exception is the embedded data - * stream, which only gets opened below if the IPA reports that the sensor - * supports embedded data. - * - * The below grouping is just for convenience so that we can easily - * iterate over all streams in one go. - */ - data->streams_.push_back(&data->unicam_[Unicam::Image]); - if (result.sensorConfig.sensorMetadata) - data->streams_.push_back(&data->unicam_[Unicam::Embedded]); - - for (auto &stream : data->isp_) - data->streams_.push_back(&stream); - - for (auto stream : data->streams_) { - int ret = stream->dev()->open(); - if (ret) - return ret; - } - - if (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) { - LOG(RPI, Error) << "Unicam driver does not use the MediaController, please update your kernel!"; - return -EINVAL; - } - - /* - * Setup our delayed control writer with the sensor default - * gain and exposure delays. Mark VBLANK for priority write. 
- */ - std::unordered_map params = { - { V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } }, - { V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } }, - { V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } }, - { V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } } - }; - data->delayedCtrls_ = std::make_unique(data->sensor_->device(), params); - data->sensorMetadata_ = result.sensorConfig.sensorMetadata; - - /* Register initial controls that the Raspberry Pi IPA can handle. */ - data->controlInfo_ = std::move(result.controlInfo); - - /* Initialize the camera properties. */ - data->properties_ = data->sensor_->properties(); - - /* - * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the - * sensor of the colour gains. It is defined to be a linear gain where - * the default value represents a gain of exactly one. - */ - auto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS); - if (it != data->sensor_->controls().end()) - data->notifyGainsUnity_ = it->second.def().get(); - - /* - * Set a default value for the ScalerCropMaximum property to show - * that we support its use, however, initialise it to zero because - * it's not meaningful until a camera mode has been chosen. - */ - data->properties_.set(properties::ScalerCropMaximum, Rectangle{}); - - /* - * We cache two things about the sensor in relation to transforms - * (meaning horizontal and vertical flips): if they affect the Bayer - * ordering, and what the "native" Bayer order is, when no transforms - * are applied. - * - * We note that the sensor's cached list of supported formats is - * already in the "native" order, with any flips having been undone. - */ - const V4L2Subdevice *sensor = data->sensor_->device(); - const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP); - if (hflipCtrl) { - /* We assume it will support vflips too... 
*/ - data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT; - } - - /* Look for a valid Bayer format. */ - BayerFormat bayerFormat; - for (const auto &iter : data->sensorFormats_) { - bayerFormat = BayerFormat::fromMbusCode(iter.first); - if (bayerFormat.isValid()) - break; - } - - if (!bayerFormat.isValid()) { - LOG(RPI, Error) << "No Bayer format found"; - return -EINVAL; - } - data->nativeBayerOrder_ = bayerFormat.order; - - /* - * List the available streams an application may request. At present, we - * do not advertise Unicam Embedded and ISP Statistics streams, as there - * is no mechanism for the application to request non-image buffer formats. - */ - std::set streams; - streams.insert(&data->unicam_[Unicam::Image]); - streams.insert(&data->isp_[Isp::Output0]); - streams.insert(&data->isp_[Isp::Output1]); - - int ret = data->loadPipelineConfiguration(); - if (ret) { - LOG(RPI, Error) << "Unable to load pipeline configuration"; - return ret; - } - - /* Create and register the camera. */ - const std::string &id = data->sensor_->id(); - std::shared_ptr camera = - Camera::create(std::move(data), id, streams); - PipelineHandler::registerCamera(std::move(camera)); - - LOG(RPI, Info) << "Registered camera " << id - << " to Unicam device " << unicam->deviceNode() - << " and ISP device " << isp->deviceNode(); - return 0; -} - -int PipelineHandlerRPi::queueAllBuffers(Camera *camera) -{ - RPiCameraData *data = cameraData(camera); - int ret; - - for (auto const stream : data->streams_) { - if (!stream->isExternal()) { - ret = stream->queueAllBuffers(); - if (ret < 0) - return ret; - } else { - /* - * For external streams, we must queue up a set of internal - * buffers to handle the number of drop frames requested by - * the IPA. This is done by passing nullptr in queueBuffer(). 
- * - * The below queueBuffer() call will do nothing if there - * are not enough internal buffers allocated, but this will - * be handled by queuing the request for buffers in the - * RPiStream object. - */ - unsigned int i; - for (i = 0; i < data->dropFrameCount_; i++) { - ret = stream->queueBuffer(nullptr); - if (ret) - return ret; - } - } - } - - return 0; -} - -int PipelineHandlerRPi::prepareBuffers(Camera *camera) -{ - RPiCameraData *data = cameraData(camera); - unsigned int numRawBuffers = 0; - int ret; - - for (Stream *s : camera->streams()) { - if (isRaw(s->configuration().pixelFormat)) { - numRawBuffers = s->configuration().bufferCount; - break; - } - } - - /* Decide how many internal buffers to allocate. */ - for (auto const stream : data->streams_) { - unsigned int numBuffers; - /* - * For Unicam, allocate a minimum number of buffers for internal - * use as we want to avoid any frame drops. - */ - const unsigned int minBuffers = data->config_.minTotalUnicamBuffers; - if (stream == &data->unicam_[Unicam::Image]) { - /* - * If an application has configured a RAW stream, allocate - * additional buffers to make up the minimum, but ensure - * we have at least minUnicamBuffers of internal buffers - * to use to minimise frame drops. - */ - numBuffers = std::max(data->config_.minUnicamBuffers, - minBuffers - numRawBuffers); - } else if (stream == &data->isp_[Isp::Input]) { - /* - * ISP input buffers are imported from Unicam, so follow - * similar logic as above to count all the RAW buffers - * available. - */ - numBuffers = numRawBuffers + - std::max(data->config_.minUnicamBuffers, - minBuffers - numRawBuffers); - - } else if (stream == &data->unicam_[Unicam::Embedded]) { - /* - * Embedded data buffers are (currently) for internal use, - * so allocate the minimum required to avoid frame drops. - */ - numBuffers = minBuffers; - } else { - /* - * Since the ISP runs synchronous with the IPA and requests, - * we only ever need one set of internal buffers. 
Any buffers - * the application wants to hold onto will already be exported - * through PipelineHandlerRPi::exportFrameBuffers(). - */ - numBuffers = 1; - } - - ret = stream->prepareBuffers(numBuffers); - if (ret < 0) - return ret; - } - - /* - * Pass the stats and embedded data buffers to the IPA. No other - * buffers need to be passed. - */ - mapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), RPi::MaskStats); - if (data->sensorMetadata_) - mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), - RPi::MaskEmbeddedData); - - return 0; -} - -void PipelineHandlerRPi::mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask) -{ - RPiCameraData *data = cameraData(camera); - std::vector bufferIds; - /* - * Link the FrameBuffers with the id (key value) in the map stored in - * the RPi stream object - along with an identifier mask. - * - * This will allow us to identify buffers passed between the pipeline - * handler and the IPA. - */ - for (auto const &it : buffers) { - bufferIds.push_back(IPABuffer(mask | it.first, - it.second->planes())); - data->bufferIds_.insert(mask | it.first); - } - - data->ipa_->mapBuffers(bufferIds); -} - -void RPiCameraData::freeBuffers() -{ - if (ipa_) { - /* - * Copy the buffer ids from the unordered_set to a vector to - * pass to the IPA. - */ - std::vector bufferIds(bufferIds_.begin(), - bufferIds_.end()); - ipa_->unmapBuffers(bufferIds); - bufferIds_.clear(); - } - - for (auto const stream : streams_) - stream->releaseBuffers(); - - buffersAllocated_ = false; -} - -void RPiCameraData::frameStarted(uint32_t sequence) -{ - LOG(RPI, Debug) << "frame start " << sequence; - - /* Write any controls for the next frame as soon as we can. 
*/ - delayedCtrls_->applyControls(sequence); -} - -int RPiCameraData::loadIPA(ipa::RPi::InitResult *result) -{ - ipa_ = IPAManager::createIPA(pipe(), 1, 1); - - if (!ipa_) - return -ENOENT; - - ipa_->processStatsComplete.connect(this, &RPiCameraData::processStatsComplete); - ipa_->prepareIspComplete.connect(this, &RPiCameraData::prepareIspComplete); - ipa_->setIspControls.connect(this, &RPiCameraData::setIspControls); - ipa_->setDelayedControls.connect(this, &RPiCameraData::setDelayedControls); - ipa_->setLensControls.connect(this, &RPiCameraData::setLensControls); - ipa_->setCameraTimeout.connect(this, &RPiCameraData::setCameraTimeout); - - /* - * The configuration (tuning file) is made from the sensor name unless - * the environment variable overrides it. - */ - std::string configurationFile; - char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RPI_TUNING_FILE"); - if (!configFromEnv || *configFromEnv == '\0') { - std::string model = sensor_->model(); - if (isMonoSensor(sensor_)) - model += "_mono"; - configurationFile = ipa_->configurationFile(model + ".json", "rpi"); - } else { - configurationFile = std::string(configFromEnv); - } - - IPASettings settings(configurationFile, sensor_->model()); - ipa::RPi::InitParams params; - - params.lensPresent = !!sensor_->focusLens(); - return ipa_->init(settings, params, result); -} - -int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result) -{ - std::map entityControls; - ipa::RPi::ConfigParams params; - - /* \todo Move passing of ispControls and lensControls to ipa::init() */ - params.sensorControls = sensor_->controls(); - params.ispControls = isp_[Isp::Input].dev()->controls(); - if (sensor_->focusLens()) - params.lensControls = sensor_->focusLens()->controls(); - - /* Always send the user transform to the IPA. */ - params.transform = static_cast(config->transform); - - /* Allocate the lens shading table via dmaHeap and pass to the IPA. 
*/ - if (!lsTable_.isValid()) { - lsTable_ = SharedFD(dmaHeap_.alloc("ls_grid", ipa::RPi::MaxLsGridSize)); - if (!lsTable_.isValid()) - return -ENOMEM; - - /* Allow the IPA to mmap the LS table via the file descriptor. */ - /* - * \todo Investigate if mapping the lens shading table buffer - * could be handled with mapBuffers(). - */ - params.lsTableHandle = lsTable_; - } - - /* We store the IPACameraSensorInfo for digital zoom calculations. */ - int ret = sensor_->sensorInfo(&sensorInfo_); - if (ret) { - LOG(RPI, Error) << "Failed to retrieve camera sensor info"; - return ret; - } - - /* Ready the IPA - it must know about the sensor resolution. */ - ret = ipa_->configure(sensorInfo_, params, result); - if (ret < 0) { - LOG(RPI, Error) << "IPA configuration failed!"; - return -EPIPE; - } - - if (!result->controls.empty()) - setSensorControls(result->controls); - - return 0; -} - -int RPiCameraData::loadPipelineConfiguration() -{ - config_ = { - .minUnicamBuffers = 2, - .minTotalUnicamBuffers = 4, - .disableStartupFrameDrops = false, - .unicamTimeoutValue = 0, - }; - - char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RPI_CONFIG_FILE"); - if (!configFromEnv || *configFromEnv == '\0') - return 0; - - std::string filename = std::string(configFromEnv); - File file(filename); - - if (!file.open(File::OpenModeFlag::ReadOnly)) { - LOG(RPI, Error) << "Failed to open configuration file '" << filename << "'"; - return -EIO; - } - - LOG(RPI, Info) << "Using configuration file '" << filename << "'"; - - std::unique_ptr root = YamlParser::parse(file); - if (!root) { - LOG(RPI, Warning) << "Failed to parse configuration file, using defaults"; - return 0; - } - - std::optional ver = (*root)["version"].get(); - if (!ver || *ver != 1.0) { - LOG(RPI, Error) << "Unexpected configuration file version reported"; - return -EINVAL; - } - - const YamlObject &phConfig = (*root)["pipeline_handler"]; - config_.minUnicamBuffers = - 
phConfig["min_unicam_buffers"].get(config_.minUnicamBuffers); - config_.minTotalUnicamBuffers = - phConfig["min_total_unicam_buffers"].get(config_.minTotalUnicamBuffers); - config_.disableStartupFrameDrops = - phConfig["disable_startup_frame_drops"].get(config_.disableStartupFrameDrops); - config_.unicamTimeoutValue = - phConfig["unicam_timeout_value_ms"].get(config_.unicamTimeoutValue); - - if (config_.unicamTimeoutValue) { - /* Disable the IPA signal to control timeout and set the user requested value. */ - ipa_->setCameraTimeout.disconnect(); - unicam_[Unicam::Image].dev()->setDequeueTimeout(config_.unicamTimeoutValue * 1ms); - } - - if (config_.minTotalUnicamBuffers < config_.minUnicamBuffers) { - LOG(RPI, Error) << "Invalid configuration: min_total_unicam_buffers must be >= min_unicam_buffers"; - return -EINVAL; - } - - if (config_.minTotalUnicamBuffers < 1) { - LOG(RPI, Error) << "Invalid configuration: min_total_unicam_buffers must be >= 1"; - return -EINVAL; - } - - return 0; -} - -/* - * enumerateVideoDevices() iterates over the Media Controller topology, starting - * at the sensor and finishing at Unicam. For each sensor, RPiCameraData stores - * a unique list of any intermediate video mux or bridge devices connected in a - * cascade, together with the entity to entity link. - * - * Entity pad configuration and link enabling happens at the end of configure(). - * We first disable all pad links on each entity device in the chain, and then - * selectively enabling the specific links to link sensor to Unicam across all - * intermediate muxes and bridges. - * - * In the cascaded topology below, if Sensor1 is used, the Mux2 -> Mux1 link - * will be disabled, and Sensor1 -> Mux1 -> Unicam links enabled. Alternatively, - * if Sensor3 is used, the Sensor2 -> Mux2 and Sensor1 -> Mux1 links are disabled, - * and Sensor3 -> Mux2 -> Mux1 -> Unicam links are enabled. All other links will - * remain unchanged. 
- * - * +----------+ - * | Unicam | - * +-----^----+ - * | - * +---+---+ - * | Mux1 <-------+ - * +--^----+ | - * | | - * +-----+---+ +---+---+ - * | Sensor1 | | Mux2 |<--+ - * +---------+ +-^-----+ | - * | | - * +-------+-+ +---+-----+ - * | Sensor2 | | Sensor3 | - * +---------+ +---------+ - */ -void RPiCameraData::enumerateVideoDevices(MediaLink *link) -{ - const MediaPad *sinkPad = link->sink(); - const MediaEntity *entity = sinkPad->entity(); - bool unicamFound = false; - - /* We only deal with Video Mux and Bridge devices in cascade. */ - if (entity->function() != MEDIA_ENT_F_VID_MUX && - entity->function() != MEDIA_ENT_F_VID_IF_BRIDGE) - return; - - /* Find the source pad for this Video Mux or Bridge device. */ - const MediaPad *sourcePad = nullptr; - for (const MediaPad *pad : entity->pads()) { - if (pad->flags() & MEDIA_PAD_FL_SOURCE) { - /* - * We can only deal with devices that have a single source - * pad. If this device has multiple source pads, ignore it - * and this branch in the cascade. - */ - if (sourcePad) - return; - - sourcePad = pad; - } - } - - LOG(RPI, Debug) << "Found video mux device " << entity->name() - << " linked to sink pad " << sinkPad->index(); - - bridgeDevices_.emplace_back(std::make_unique(entity), link); - bridgeDevices_.back().first->open(); - - /* - * Iterate through all the sink pad links down the cascade to find any - * other Video Mux and Bridge devices. - */ - for (MediaLink *l : sourcePad->links()) { - enumerateVideoDevices(l); - /* Once we reach the Unicam entity, we are done. */ - if (l->sink()->entity()->name() == "unicam-image") { - unicamFound = true; - break; - } - } - - /* This identifies the end of our entity enumeration recursion. */ - if (link->source()->entity()->function() == MEDIA_ENT_F_CAM_SENSOR) { - /* - * If Unicam is not at the end of this cascade, we cannot configure - * this topology automatically, so remove all entity references. 
- */ - if (!unicamFound) { - LOG(RPI, Warning) << "Cannot automatically configure this MC topology!"; - bridgeDevices_.clear(); - } - } -} - -void RPiCameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers) -{ - if (!isRunning()) - return; - - FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID); - - handleStreamBuffer(buffer, &isp_[Isp::Stats]); - - /* Last thing to do is to fill up the request metadata. */ - Request *request = requestQueue_.front(); - ControlList metadata; - - ipa_->reportMetadata(request->sequence(), &metadata); - request->metadata().merge(metadata); - - /* - * Inform the sensor of the latest colour gains if it has the - * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set). - */ - const auto &colourGains = metadata.get(libcamera::controls::ColourGains); - if (notifyGainsUnity_ && colourGains) { - /* The control wants linear gains in the order B, Gb, Gr, R. */ - ControlList ctrls(sensor_->controls()); - std::array gains{ - static_cast((*colourGains)[1] * *notifyGainsUnity_), - *notifyGainsUnity_, - *notifyGainsUnity_, - static_cast((*colourGains)[0] * *notifyGainsUnity_) - }; - ctrls.set(V4L2_CID_NOTIFY_GAINS, Span{ gains }); - - sensor_->setControls(&ctrls); - } - - state_ = State::IpaComplete; - handleState(); -} - -void RPiCameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers) -{ - unsigned int embeddedId = buffers.embedded & RPi::MaskID; - unsigned int bayer = buffers.bayer & RPi::MaskID; - FrameBuffer *buffer; - - if (!isRunning()) - return; - - buffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID); - LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bayer & RPi::MaskID) - << ", timestamp: " << buffer->metadata().timestamp; - - isp_[Isp::Input].queueBuffer(buffer); - ispOutputCount_ = 0; - - if (sensorMetadata_ && embeddedId) { - buffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID); - handleStreamBuffer(buffer, 
&unicam_[Unicam::Embedded]); - } - - handleState(); -} - -void RPiCameraData::setIspControls(const ControlList &controls) -{ - ControlList ctrls = controls; - - if (ctrls.contains(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)) { - ControlValue &value = - const_cast(ctrls.get(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)); - Span s = value.data(); - bcm2835_isp_lens_shading *ls = - reinterpret_cast(s.data()); - ls->dmabuf = lsTable_.get(); - } - - isp_[Isp::Input].dev()->setControls(&ctrls); - handleState(); -} - -void RPiCameraData::setDelayedControls(const ControlList &controls, uint32_t delayContext) -{ - if (!delayedCtrls_->push(controls, delayContext)) - LOG(RPI, Error) << "V4L2 DelayedControl set failed"; - handleState(); -} - -void RPiCameraData::setLensControls(const ControlList &controls) -{ - CameraLens *lens = sensor_->focusLens(); - - if (lens && controls.contains(V4L2_CID_FOCUS_ABSOLUTE)) { - ControlValue const &focusValue = controls.get(V4L2_CID_FOCUS_ABSOLUTE); - lens->setFocusPosition(focusValue.get()); - } -} - -void RPiCameraData::setCameraTimeout(uint32_t maxFrameLengthMs) -{ - /* - * Set the dequeue timeout to the larger of 5x the maximum reported - * frame length advertised by the IPA over a number of frames. Allow - * a minimum timeout value of 1s. - */ - utils::Duration timeout = - std::max(1s, 5 * maxFrameLengthMs * 1ms); - - LOG(RPI, Debug) << "Setting Unicam timeout to " << timeout; - unicam_[Unicam::Image].dev()->setDequeueTimeout(timeout); -} - -void RPiCameraData::setSensorControls(ControlList &controls) -{ - /* - * We need to ensure that if both VBLANK and EXPOSURE are present, the - * former must be written ahead of, and separately from EXPOSURE to avoid - * V4L2 rejecting the latter. This is identical to what DelayedControls - * does with the priority write flag. 
- * - * As a consequence of the below logic, VBLANK gets set twice, and we - * rely on the v4l2 framework to not pass the second control set to the - * driver as the actual control value has not changed. - */ - if (controls.contains(V4L2_CID_EXPOSURE) && controls.contains(V4L2_CID_VBLANK)) { - ControlList vblank_ctrl; - - vblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK)); - sensor_->setControls(&vblank_ctrl); - } - - sensor_->setControls(&controls); -} - -void RPiCameraData::unicamTimeout() -{ - LOG(RPI, Error) << "Unicam has timed out!"; - LOG(RPI, Error) << "Please check that your camera sensor connector is attached securely."; - LOG(RPI, Error) << "Alternatively, try another cable and/or sensor."; - - state_ = RPiCameraData::State::Error; - /* - * To allow the application to attempt a recovery from this timeout, - * stop all devices streaming, and return any outstanding requests as - * incomplete and cancelled. - */ - for (auto const stream : streams_) - stream->dev()->streamOff(); - - clearIncompleteRequests(); -} - -void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer) -{ - RPi::Stream *stream = nullptr; - int index; - - if (!isRunning()) - return; - - for (RPi::Stream &s : unicam_) { - index = s.getBufferId(buffer); - if (index) { - stream = &s; - break; - } - } - - /* The buffer must belong to one of our streams. */ - ASSERT(stream); - - LOG(RPI, Debug) << "Stream " << stream->name() << " buffer dequeue" - << ", buffer id " << index - << ", timestamp: " << buffer->metadata().timestamp; - - if (stream == &unicam_[Unicam::Image]) { - /* - * Lookup the sensor controls used for this frame sequence from - * DelayedControl and queue them along with the frame buffer. - */ - auto [ctrl, delayContext] = delayedCtrls_->get(buffer->metadata().sequence); - /* - * Add the frame timestamp to the ControlList for the IPA to use - * as it does not receive the FrameBuffer object. 
- */ - ctrl.set(controls::SensorTimestamp, buffer->metadata().timestamp); - bayerQueue_.push({ buffer, std::move(ctrl), delayContext }); - } else { - embeddedQueue_.push(buffer); - } - - handleState(); -} - -void RPiCameraData::ispInputDequeue(FrameBuffer *buffer) -{ - if (!isRunning()) - return; - - LOG(RPI, Debug) << "Stream ISP Input buffer complete" - << ", buffer id " << unicam_[Unicam::Image].getBufferId(buffer) - << ", timestamp: " << buffer->metadata().timestamp; - - /* The ISP input buffer gets re-queued into Unicam. */ - handleStreamBuffer(buffer, &unicam_[Unicam::Image]); - handleState(); -} - -void RPiCameraData::ispOutputDequeue(FrameBuffer *buffer) -{ - RPi::Stream *stream = nullptr; - int index; - - if (!isRunning()) - return; - - for (RPi::Stream &s : isp_) { - index = s.getBufferId(buffer); - if (index) { - stream = &s; - break; - } - } - - /* The buffer must belong to one of our ISP output streams. */ - ASSERT(stream); - - LOG(RPI, Debug) << "Stream " << stream->name() << " buffer complete" - << ", buffer id " << index - << ", timestamp: " << buffer->metadata().timestamp; - - /* - * ISP statistics buffer must not be re-queued or sent back to the - * application until after the IPA signals so. - */ - if (stream == &isp_[Isp::Stats]) { - ipa::RPi::ProcessParams params; - params.buffers.stats = index | RPi::MaskStats; - params.ipaContext = requestQueue_.front()->sequence(); - ipa_->processStats(params); - } else { - /* Any other ISP output can be handed back to the application now. */ - handleStreamBuffer(buffer, stream); - } - - /* - * Increment the number of ISP outputs generated. - * This is needed to track dropped frames. - */ - ispOutputCount_++; - - handleState(); -} - -void RPiCameraData::clearIncompleteRequests() -{ - /* - * All outstanding requests (and associated buffers) must be returned - * back to the application. 
- */ - while (!requestQueue_.empty()) { - Request *request = requestQueue_.front(); - - for (auto &b : request->buffers()) { - FrameBuffer *buffer = b.second; - /* - * Has the buffer already been handed back to the - * request? If not, do so now. - */ - if (buffer->request()) { - buffer->_d()->cancel(); - pipe()->completeBuffer(request, buffer); - } - } - - pipe()->completeRequest(request); - requestQueue_.pop_front(); - } -} - -void RPiCameraData::handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream) -{ - /* - * It is possible to be here without a pending request, so check - * that we actually have one to action, otherwise we just return - * buffer back to the stream. - */ - Request *request = requestQueue_.empty() ? nullptr : requestQueue_.front(); - if (!dropFrameCount_ && request && request->findBuffer(stream) == buffer) { - /* - * Check if this is an externally provided buffer, and if - * so, we must stop tracking it in the pipeline handler. - */ - handleExternalBuffer(buffer, stream); - /* - * Tag the buffer as completed, returning it to the - * application. - */ - pipe()->completeBuffer(request, buffer); - } else { - /* - * This buffer was not part of the Request (which happens if an - * internal buffer was used for an external stream, or - * unconditionally for internal streams), or there is no pending - * request, so we can recycle it. - */ - stream->returnBuffer(buffer); - } -} - -void RPiCameraData::handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream) -{ - unsigned int id = stream->getBufferId(buffer); - - if (!(id & RPi::MaskExternalBuffer)) - return; - - /* Stop the Stream object from tracking the buffer. */ - stream->removeExternalBuffer(buffer); -} - -void RPiCameraData::handleState() -{ - switch (state_) { - case State::Stopped: - case State::Busy: - case State::Error: - break; - - case State::IpaComplete: - /* If the request is completed, we will switch to Idle state. 
*/ - checkRequestCompleted(); - /* - * No break here, we want to try running the pipeline again. - * The fallthrough clause below suppresses compiler warnings. - */ - [[fallthrough]]; - - case State::Idle: - tryRunPipeline(); - break; - } -} - -void RPiCameraData::checkRequestCompleted() -{ - bool requestCompleted = false; - /* - * If we are dropping this frame, do not touch the request, simply - * change the state to IDLE when ready. - */ - if (!dropFrameCount_) { - Request *request = requestQueue_.front(); - if (request->hasPendingBuffers()) - return; - - /* Must wait for metadata to be filled in before completing. */ - if (state_ != State::IpaComplete) - return; - - pipe()->completeRequest(request); - requestQueue_.pop_front(); - requestCompleted = true; - } - - /* - * Make sure we have three outputs completed in the case of a dropped - * frame. - */ - if (state_ == State::IpaComplete && - ((ispOutputCount_ == 3 && dropFrameCount_) || requestCompleted)) { - state_ = State::Idle; - if (dropFrameCount_) { - dropFrameCount_--; - LOG(RPI, Debug) << "Dropping frame at the request of the IPA (" - << dropFrameCount_ << " left)"; - } - } -} - -Rectangle RPiCameraData::scaleIspCrop(const Rectangle &ispCrop) const -{ - /* - * Scale a crop rectangle defined in the ISP's coordinates into native sensor - * coordinates. - */ - Rectangle nativeCrop = ispCrop.scaledBy(sensorInfo_.analogCrop.size(), - sensorInfo_.outputSize); - nativeCrop.translateBy(sensorInfo_.analogCrop.topLeft()); - return nativeCrop; -} - -void RPiCameraData::applyScalerCrop(const ControlList &controls) -{ - const auto &scalerCrop = controls.get(controls::ScalerCrop); - if (scalerCrop) { - Rectangle nativeCrop = *scalerCrop; - - if (!nativeCrop.width || !nativeCrop.height) - nativeCrop = { 0, 0, 1, 1 }; - - /* Create a version of the crop scaled to ISP (camera mode) pixels. 
*/ - Rectangle ispCrop = nativeCrop.translatedBy(-sensorInfo_.analogCrop.topLeft()); - ispCrop.scaleBy(sensorInfo_.outputSize, sensorInfo_.analogCrop.size()); - - /* - * The crop that we set must be: - * 1. At least as big as ispMinCropSize_, once that's been - * enlarged to the same aspect ratio. - * 2. With the same mid-point, if possible. - * 3. But it can't go outside the sensor area. - */ - Size minSize = ispMinCropSize_.expandedToAspectRatio(nativeCrop.size()); - Size size = ispCrop.size().expandedTo(minSize); - ispCrop = size.centeredTo(ispCrop.center()).enclosedIn(Rectangle(sensorInfo_.outputSize)); - - if (ispCrop != ispCrop_) { - isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &ispCrop); - ispCrop_ = ispCrop; - - /* - * Also update the ScalerCrop in the metadata with what we actually - * used. But we must first rescale that from ISP (camera mode) pixels - * back into sensor native pixels. - */ - scalerCrop_ = scaleIspCrop(ispCrop_); - } - } -} - -void RPiCameraData::fillRequestMetadata(const ControlList &bufferControls, - Request *request) -{ - request->metadata().set(controls::SensorTimestamp, - bufferControls.get(controls::SensorTimestamp).value_or(0)); - - request->metadata().set(controls::ScalerCrop, scalerCrop_); -} - -void RPiCameraData::tryRunPipeline() -{ - FrameBuffer *embeddedBuffer; - BayerFrame bayerFrame; - - /* If any of our request or buffer queues are empty, we cannot proceed. */ - if (state_ != State::Idle || requestQueue_.empty() || - bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_)) - return; - - if (!findMatchingBuffers(bayerFrame, embeddedBuffer)) - return; - - /* Take the first request from the queue and action the IPA. */ - Request *request = requestQueue_.front(); - - /* See if a new ScalerCrop value needs to be applied. */ - applyScalerCrop(request->controls()); - - /* - * Clear the request metadata and fill it with some initial non-IPA - * related controls. 
We clear it first because the request metadata - * may have been populated if we have dropped the previous frame. - */ - request->metadata().clear(); - fillRequestMetadata(bayerFrame.controls, request); - - /* Set our state to say the pipeline is active. */ - state_ = State::Busy; - - unsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer); - - LOG(RPI, Debug) << "Signalling prepareIsp:" - << " Bayer buffer id: " << bayer; - - ipa::RPi::PrepareParams params; - params.buffers.bayer = RPi::MaskBayerData | bayer; - params.sensorControls = std::move(bayerFrame.controls); - params.requestControls = request->controls(); - params.ipaContext = request->sequence(); - params.delayContext = bayerFrame.delayContext; - - if (embeddedBuffer) { - unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer); - - params.buffers.embedded = RPi::MaskEmbeddedData | embeddedId; - LOG(RPI, Debug) << "Signalling prepareIsp:" - << " Embedded buffer id: " << embeddedId; - } - - ipa_->prepareIsp(params); -} - -bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer) -{ - if (bayerQueue_.empty()) - return false; - - /* - * Find the embedded data buffer with a matching timestamp to pass to - * the IPA. Any embedded buffers with a timestamp lower than the - * current bayer buffer will be removed and re-queued to the driver. - */ - uint64_t ts = bayerQueue_.front().buffer->metadata().timestamp; - embeddedBuffer = nullptr; - while (!embeddedQueue_.empty()) { - FrameBuffer *b = embeddedQueue_.front(); - if (b->metadata().timestamp < ts) { - embeddedQueue_.pop(); - unicam_[Unicam::Embedded].returnBuffer(b); - LOG(RPI, Debug) << "Dropping unmatched input frame in stream " - << unicam_[Unicam::Embedded].name(); - } else if (b->metadata().timestamp == ts) { - /* Found a match! */ - embeddedBuffer = b; - embeddedQueue_.pop(); - break; - } else { - break; /* Only higher timestamps from here. 
*/ - } - } - - if (!embeddedBuffer && sensorMetadata_) { - if (embeddedQueue_.empty()) { - /* - * If the embedded buffer queue is empty, wait for the next - * buffer to arrive - dequeue ordering may send the image - * buffer first. - */ - LOG(RPI, Debug) << "Waiting for next embedded buffer."; - return false; - } - - /* Log if there is no matching embedded data buffer found. */ - LOG(RPI, Debug) << "Returning bayer frame without a matching embedded buffer."; - } - - bayerFrame = std::move(bayerQueue_.front()); - bayerQueue_.pop(); - - return true; -} - -REGISTER_PIPELINE_HANDLER(PipelineHandlerRPi) - -} /* namespace libcamera */ diff --git a/src/libcamera/pipeline/rpi/vc4/vc4.cpp b/src/libcamera/pipeline/rpi/vc4/vc4.cpp new file mode 100644 index 000000000000..a085a06376a8 --- /dev/null +++ b/src/libcamera/pipeline/rpi/vc4/vc4.cpp @@ -0,0 +1,1001 @@ +/* SPDX-License-Identifier: LGPL-2.1-or-later */ +/* + * Copyright (C) 2019-2023, Raspberry Pi Ltd + * + * vc4.cpp - Pipeline handler for VC4 based Raspberry Pi devices + */ + +#include +#include +#include + +#include + +#include "libcamera/internal/device_enumerator.h" + +#include "dma_heaps.h" +#include "pipeline_base.h" +#include "rpi_stream.h" + +using namespace std::chrono_literals; + +namespace libcamera { + +LOG_DECLARE_CATEGORY(RPI) + +namespace { + +enum class Unicam : unsigned int { Image, Embedded }; +enum class Isp : unsigned int { Input, Output0, Output1, Stats }; + +} /* namespace */ + +class Vc4CameraData final : public RPi::CameraData +{ +public: + Vc4CameraData(PipelineHandler *pipe) + : RPi::CameraData(pipe) + { + } + + ~Vc4CameraData() + { + freeBuffers(); + } + + V4L2VideoDevice::Formats ispFormats() const override + { + return isp_[Isp::Output0].dev()->formats(); + } + + V4L2VideoDevice::Formats rawFormats() const override + { + return unicam_[Unicam::Image].dev()->formats(); + } + + V4L2VideoDevice *frontendDevice() override + { + return unicam_[Unicam::Image].dev(); + } + + void 
platformFreeBuffers() override + { + } + + CameraConfiguration::Status platformValidate(std::vector &rawStreams, + std::vector &outStreams) const override; + + int platformPipelineConfigure(const std::unique_ptr &root) override; + + void platformStart() override; + void platformStop() override; + + void unicamBufferDequeue(FrameBuffer *buffer); + void ispInputDequeue(FrameBuffer *buffer); + void ispOutputDequeue(FrameBuffer *buffer); + + void processStatsComplete(const ipa::RPi::BufferIds &buffers); + void prepareIspComplete(const ipa::RPi::BufferIds &buffers); + void setIspControls(const ControlList &controls); + void setCameraTimeout(uint32_t maxFrameLengthMs); + + /* Array of Unicam and ISP device streams and associated buffers/streams. */ + RPi::Device unicam_; + RPi::Device isp_; + + /* DMAHEAP allocation helper. */ + RPi::DmaHeap dmaHeap_; + SharedFD lsTable_; + + struct Config { + /* + * The minimum number of internal buffers to be allocated for + * the Unicam Image stream. + */ + unsigned int minUnicamBuffers; + /* + * The minimum total (internal + external) buffer count used for + * the Unicam Image stream. 
+ * + * Note that: + * minTotalUnicamBuffers must be >= 1, and + * minTotalUnicamBuffers >= minUnicamBuffers + */ + unsigned int minTotalUnicamBuffers; + }; + + Config config_; + +private: + void platformIspCrop() override + { + isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &ispCrop_); + } + + int platformConfigure(const V4L2SubdeviceFormat &sensorFormat, + std::optional packing, + std::vector &rawStreams, + std::vector &outStreams) override; + int platformConfigureIpa(ipa::RPi::ConfigParams &params) override; + + int platformInitIpa([[maybe_unused]] ipa::RPi::InitParams &params) override + { + return 0; + } + + struct BayerFrame { + FrameBuffer *buffer; + ControlList controls; + unsigned int delayContext; + }; + + void tryRunPipeline() override; + bool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer); + + std::queue bayerQueue_; + std::queue embeddedQueue_; +}; + +class PipelineHandlerVc4 : public RPi::PipelineHandlerBase +{ +public: + PipelineHandlerVc4(CameraManager *manager) + : RPi::PipelineHandlerBase(manager) + { + } + + ~PipelineHandlerVc4() + { + } + + bool match(DeviceEnumerator *enumerator) override; + +private: + Vc4CameraData *cameraData(Camera *camera) + { + return static_cast(camera->_d()); + } + + std::unique_ptr allocateCameraData() override + { + return std::make_unique(this); + } + + int prepareBuffers(Camera *camera) override; + int platformRegister(std::unique_ptr &cameraData, + MediaDevice *unicam, MediaDevice *isp) override; +}; + +bool PipelineHandlerVc4::match(DeviceEnumerator *enumerator) +{ + constexpr unsigned int numUnicamDevices = 2; + + /* + * Loop over all Unicam instances, but return once a match is found. + * This is to ensure we correctly enumerate the camera when an instance + * of Unicam has registered with media controller, but has not registered + * device nodes due to a sensor subdevice failure.
+ */ + for (unsigned int i = 0; i < numUnicamDevices; i++) { + DeviceMatch unicam("unicam"); + MediaDevice *unicamDevice = acquireMediaDevice(enumerator, unicam); + + if (!unicamDevice) { + LOG(RPI, Debug) << "Unable to acquire a Unicam instance"; + break; + } + + DeviceMatch isp("bcm2835-isp"); + MediaDevice *ispDevice = acquireMediaDevice(enumerator, isp); + + if (!ispDevice) { + LOG(RPI, Debug) << "Unable to acquire ISP instance"; + break; + } + + /* + * The loop below is used to register multiple cameras behind one or more + * video mux devices that are attached to a particular Unicam instance. + * Obviously these cameras cannot be used simultaneously. + */ + unsigned int numCameras = 0; + for (MediaEntity *entity : unicamDevice->entities()) { + if (entity->function() != MEDIA_ENT_F_CAM_SENSOR) + continue; + + int ret = RPi::PipelineHandlerBase::registerCamera(unicamDevice, "unicam-image", + ispDevice, entity); + if (ret) + LOG(RPI, Error) << "Failed to register camera " + << entity->name() << ": " << ret; + else + numCameras++; + } + + if (numCameras) + return true; + } + + return false; +} + +int PipelineHandlerVc4::prepareBuffers(Camera *camera) +{ + Vc4CameraData *data = cameraData(camera); + unsigned int numRawBuffers = 0; + int ret; + + for (Stream *s : camera->streams()) { + if (BayerFormat::fromPixelFormat(s->configuration().pixelFormat).isValid()) { + numRawBuffers = s->configuration().bufferCount; + break; + } + } + + /* Decide how many internal buffers to allocate. */ + for (auto const stream : data->streams_) { + unsigned int numBuffers; + /* + * For Unicam, allocate a minimum number of buffers for internal + * use as we want to avoid any frame drops. 
+ */ + const unsigned int minBuffers = data->config_.minTotalUnicamBuffers; + if (stream == &data->unicam_[Unicam::Image]) { + /* + * If an application has configured a RAW stream, allocate + * additional buffers to make up the minimum, but ensure + * we have at least minUnicamBuffers of internal buffers + * to use to minimise frame drops. + */ + numBuffers = std::max(data->config_.minUnicamBuffers, + minBuffers - numRawBuffers); + } else if (stream == &data->isp_[Isp::Input]) { + /* + * ISP input buffers are imported from Unicam, so follow + * similar logic as above to count all the RAW buffers + * available. + */ + numBuffers = numRawBuffers + + std::max(data->config_.minUnicamBuffers, + minBuffers - numRawBuffers); + + } else if (stream == &data->unicam_[Unicam::Embedded]) { + /* + * Embedded data buffers are (currently) for internal use, + * so allocate the minimum required to avoid frame drops. + */ + numBuffers = minBuffers; + } else { + /* + * Since the ISP runs synchronous with the IPA and requests, + * we only ever need one set of internal buffers. Any buffers + * the application wants to hold onto will already be exported + * through PipelineHandlerRPi::exportFrameBuffers(). + */ + numBuffers = 1; + } + + ret = stream->prepareBuffers(numBuffers); + if (ret < 0) + return ret; + } + + /* + * Pass the stats and embedded data buffers to the IPA. No other + * buffers need to be passed. 
+ */ + mapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), RPi::MaskStats); + if (data->sensorMetadata_) + mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), + RPi::MaskEmbeddedData); + + return 0; +} + +int PipelineHandlerVc4::platformRegister(std::unique_ptr &cameraData, MediaDevice *unicam, MediaDevice *isp) +{ + Vc4CameraData *data = static_cast(cameraData.get()); + + if (!data->dmaHeap_.isValid()) + return -ENOMEM; + + MediaEntity *unicamImage = unicam->getEntityByName("unicam-image"); + MediaEntity *ispOutput0 = isp->getEntityByName("bcm2835-isp0-output0"); + MediaEntity *ispCapture1 = isp->getEntityByName("bcm2835-isp0-capture1"); + MediaEntity *ispCapture2 = isp->getEntityByName("bcm2835-isp0-capture2"); + MediaEntity *ispCapture3 = isp->getEntityByName("bcm2835-isp0-capture3"); + + if (!unicamImage || !ispOutput0 || !ispCapture1 || !ispCapture2 || !ispCapture3) + return -ENOENT; + + /* Locate and open the unicam video streams. */ + data->unicam_[Unicam::Image] = RPi::Stream("Unicam Image", unicamImage); + + /* An embedded data node will not be present if the sensor does not support it. */ + MediaEntity *unicamEmbedded = unicam->getEntityByName("unicam-embedded"); + if (unicamEmbedded) { + data->unicam_[Unicam::Embedded] = RPi::Stream("Unicam Embedded", unicamEmbedded); + data->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data, + &Vc4CameraData::unicamBufferDequeue); + } + + /* Tag the ISP input stream as an import stream. */ + data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, true); + data->isp_[Isp::Output0] = RPi::Stream("ISP Output0", ispCapture1); + data->isp_[Isp::Output1] = RPi::Stream("ISP Output1", ispCapture2); + data->isp_[Isp::Stats] = RPi::Stream("ISP Stats", ispCapture3); + + /* Wire up all the buffer connections. 
*/ + data->unicam_[Unicam::Image].dev()->bufferReady.connect(data, &Vc4CameraData::unicamBufferDequeue); + data->isp_[Isp::Input].dev()->bufferReady.connect(data, &Vc4CameraData::ispInputDequeue); + data->isp_[Isp::Output0].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue); + data->isp_[Isp::Output1].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue); + data->isp_[Isp::Stats].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue); + + if (data->sensorMetadata_ ^ !!data->unicam_[Unicam::Embedded].dev()) { + LOG(RPI, Warning) << "Mismatch between Unicam and CamHelper for embedded data usage!"; + data->sensorMetadata_ = false; + if (data->unicam_[Unicam::Embedded].dev()) + data->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect(); + } + + /* + * Open all Unicam and ISP streams. The exception is the embedded data + * stream, which only gets opened below if the IPA reports that the sensor + * supports embedded data. + * + * The below grouping is just for convenience so that we can easily + * iterate over all streams in one go. + */ + data->streams_.push_back(&data->unicam_[Unicam::Image]); + if (data->sensorMetadata_) + data->streams_.push_back(&data->unicam_[Unicam::Embedded]); + + for (auto &stream : data->isp_) + data->streams_.push_back(&stream); + + for (auto stream : data->streams_) { + int ret = stream->dev()->open(); + if (ret) + return ret; + } + + if (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) { + LOG(RPI, Error) << "Unicam driver does not use the MediaController, please update your kernel!"; + return -EINVAL; + } + + /* Wire up all the IPA connections. 
*/ + data->ipa_->processStatsComplete.connect(data, &Vc4CameraData::processStatsComplete); + data->ipa_->prepareIspComplete.connect(data, &Vc4CameraData::prepareIspComplete); + data->ipa_->setIspControls.connect(data, &Vc4CameraData::setIspControls); + data->ipa_->setCameraTimeout.connect(data, &Vc4CameraData::setCameraTimeout); + + /* + * List the available streams an application may request. At present, we + * do not advertise Unicam Embedded and ISP Statistics streams, as there + * is no mechanism for the application to request non-image buffer formats. + */ + std::set<Stream *> streams; + streams.insert(&data->unicam_[Unicam::Image]); + streams.insert(&data->isp_[Isp::Output0]); + streams.insert(&data->isp_[Isp::Output1]); + + /* Create and register the camera. */ + const std::string &id = data->sensor_->id(); + std::shared_ptr<Camera> camera = + Camera::create(std::move(cameraData), id, streams); + PipelineHandler::registerCamera(std::move(camera)); + + LOG(RPI, Info) << "Registered camera " << id + << " to Unicam device " << unicam->deviceNode() + << " and ISP device " << isp->deviceNode(); + + return 0; +} + +CameraConfiguration::Status Vc4CameraData::platformValidate(std::vector<StreamParams> &rawStreams, + std::vector<StreamParams> &outStreams) const +{ + CameraConfiguration::Status status = CameraConfiguration::Status::Valid; + + /* Can only output 1 RAW stream, or 2 YUV/RGB streams. */ + if (rawStreams.size() > 1 || outStreams.size() > 2) { + LOG(RPI, Error) << "Invalid number of streams requested"; + return CameraConfiguration::Status::Invalid; + } + + if (!rawStreams.empty()) + rawStreams[0].dev = unicam_[Unicam::Image].dev(); + + /* + * For the two ISP outputs, one stream must be equal or smaller than the + * other in all dimensions. + * + * Index 0 contains the largest requested resolution. 
+ */ + for (unsigned int i = 0; i < outStreams.size(); i++) { + Size size; + + size.width = std::min(outStreams[i].cfg->size.width, + outStreams[0].cfg->size.width); + size.height = std::min(outStreams[i].cfg->size.height, + outStreams[0].cfg->size.height); + + if (outStreams[i].cfg->size != size) { + outStreams[i].cfg->size = size; + status = CameraConfiguration::Status::Adjusted; + } + + /* + * Output 0 must be for the largest resolution. We will + * have that fixed up in the code above. + */ + outStreams[i].dev = isp_[i == 0 ? Isp::Output0 : Isp::Output1].dev(); + } + + return status; +} + +int Vc4CameraData::platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) +{ + config_ = { + .minUnicamBuffers = 2, + .minTotalUnicamBuffers = 4, + }; + + if (!root) + return 0; + + std::optional<double> ver = (*root)["version"].get<double>(); + if (!ver || *ver != 1.0) { + LOG(RPI, Error) << "Unexpected configuration file version reported"; + return -EINVAL; + } + + std::optional<std::string> target = (*root)["target"].get<std::string>(); + if (!target || *target != "bcm2835") { + LOG(RPI, Error) << "Unexpected target reported: expected \"bcm2835\", got " + << *target; + return -EINVAL; + } + + const YamlObject &phConfig = (*root)["pipeline_handler"]; + config_.minUnicamBuffers = + phConfig["min_unicam_buffers"].get<unsigned int>(config_.minUnicamBuffers); + config_.minTotalUnicamBuffers = + phConfig["min_total_unicam_buffers"].get<unsigned int>(config_.minTotalUnicamBuffers); + + if (config_.minTotalUnicamBuffers < config_.minUnicamBuffers) { + LOG(RPI, Error) << "Invalid configuration: min_total_unicam_buffers must be >= min_unicam_buffers"; + return -EINVAL; + } + + if (config_.minTotalUnicamBuffers < 1) { + LOG(RPI, Error) << "Invalid configuration: min_total_unicam_buffers must be >= 1"; + return -EINVAL; + } + + return 0; +} + +int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat, + std::optional<BayerFormat::Packing> packing, + std::vector<StreamParams> &rawStreams, + std::vector<StreamParams> &outStreams) +{ + int ret; + + if (!packing) + packing = 
BayerFormat::Packing::CSI2; + + V4L2VideoDevice *unicam = unicam_[Unicam::Image].dev(); + V4L2DeviceFormat unicamFormat = RPi::PipelineHandlerBase::toV4L2DeviceFormat(unicam, sensorFormat, *packing); + + ret = unicam->setFormat(&unicamFormat); + if (ret) + return ret; + + /* + * See which streams are requested, and route the user + * StreamConfiguration appropriately. + */ + if (!rawStreams.empty()) { + rawStreams[0].cfg->setStream(&unicam_[Unicam::Image]); + unicam_[Unicam::Image].setExternal(true); + } + + ret = isp_[Isp::Input].dev()->setFormat(&unicamFormat); + if (ret) + return ret; + + LOG(RPI, Info) << "Sensor: " << sensor_->id() + << " - Selected sensor format: " << sensorFormat + << " - Selected unicam format: " << unicamFormat; + + /* Use a sensible small default size if no output streams are configured. */ + Size maxSize = outStreams.empty() ? Size(320, 240) : outStreams[0].cfg->size; + V4L2DeviceFormat format; + + for (unsigned int i = 0; i < outStreams.size(); i++) { + StreamConfiguration *cfg = outStreams[i].cfg; + + /* The largest resolution gets routed to the ISP Output 0 node. */ + RPi::Stream *stream = i == 0 ? 
&isp_[Isp::Output0] : &isp_[Isp::Output1]; + + V4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg->pixelFormat); + format.size = cfg->size; + format.fourcc = fourcc; + format.colorSpace = cfg->colorSpace; + + LOG(RPI, Debug) << "Setting " << stream->name() << " to " + << format; + + ret = stream->dev()->setFormat(&format); + if (ret) + return -EINVAL; + + if (format.size != cfg->size || format.fourcc != fourcc) { + LOG(RPI, Error) + << "Failed to set requested format on " << stream->name() + << ", returned " << format; + return -EINVAL; + } + + LOG(RPI, Debug) + << "Stream " << stream->name() << " has color space " + << ColorSpace::toString(cfg->colorSpace); + + cfg->setStream(stream); + stream->setExternal(true); + } + + ispOutputTotal_ = outStreams.size(); + + /* + * If ISP::Output0 stream has not been configured by the application, + * we must allow the hardware to generate an output so that the data + * flow in the pipeline handler remains consistent, and we still generate + * statistics for the IPA to use. So enable the output at a very low + * resolution for internal use. + * + * \todo Allow the pipeline to work correctly without Output0 and only + * statistics coming from the hardware. + */ + if (outStreams.empty()) { + V4L2VideoDevice *dev = isp_[Isp::Output0].dev(); + + format = {}; + format.size = maxSize; + format.fourcc = dev->toV4L2PixelFormat(formats::YUV420); + /* No one asked for output, so the color space doesn't matter. */ + format.colorSpace = ColorSpace::Sycc; + ret = dev->setFormat(&format); + if (ret) { + LOG(RPI, Error) + << "Failed to set default format on ISP Output0: " + << ret; + return -EINVAL; + } + + ispOutputTotal_++; + + LOG(RPI, Debug) << "Defaulting ISP Output0 format to " + << format; + } + + /* + * If ISP::Output1 stream has not been requested by the application, we + * set it up for internal use now. 
This second stream will be used for + * fast colour denoise, and must be a quarter resolution of the ISP::Output0 + * stream. However, also limit the maximum size to 1200 pixels in the + * larger dimension, just to avoid being wasteful with buffer allocations + * and memory bandwidth. + * + * \todo If Output 1 format is not YUV420, Output 1 ought to be disabled as + * colour denoise will not run. + */ + if (outStreams.size() == 1) { + V4L2VideoDevice *dev = isp_[Isp::Output1].dev(); + + V4L2DeviceFormat output1Format; + constexpr Size maxDimensions(1200, 1200); + const Size limit = maxDimensions.boundedToAspectRatio(format.size); + + output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2); + output1Format.colorSpace = format.colorSpace; + output1Format.fourcc = dev->toV4L2PixelFormat(formats::YUV420); + + LOG(RPI, Debug) << "Setting ISP Output1 (internal) to " + << output1Format; + + ret = dev->setFormat(&output1Format); + if (ret) { + LOG(RPI, Error) << "Failed to set format on ISP Output1: " + << ret; + return -EINVAL; + } + + ispOutputTotal_++; + } + + /* ISP statistics output format. */ + format = {}; + format.fourcc = V4L2PixelFormat(V4L2_META_FMT_BCM2835_ISP_STATS); + ret = isp_[Isp::Stats].dev()->setFormat(&format); + if (ret) { + LOG(RPI, Error) << "Failed to set format on ISP stats stream: " + << format; + return ret; + } + + ispOutputTotal_++; + + /* + * Configure the Unicam embedded data output format only if the sensor + * supports it. 
+ */ + if (sensorMetadata_) { + V4L2SubdeviceFormat embeddedFormat; + + sensor_->device()->getFormat(1, &embeddedFormat); + format = {}; + format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA); + format.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height; + + LOG(RPI, Debug) << "Setting embedded data format " << format.toString(); + ret = unicam_[Unicam::Embedded].dev()->setFormat(&format); + if (ret) { + LOG(RPI, Error) << "Failed to set format on Unicam embedded: " + << format; + return ret; + } + } + + /* Figure out the smallest selection the ISP will allow. */ + Rectangle testCrop(0, 0, 1, 1); + isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop); + ispMinCropSize_ = testCrop.size(); + + /* Adjust aspect ratio by providing crops on the input image. */ + Size size = unicamFormat.size.boundedToAspectRatio(maxSize); + ispCrop_ = size.centeredTo(Rectangle(unicamFormat.size).center()); + + platformIspCrop(); + + return 0; +} + +int Vc4CameraData::platformConfigureIpa(ipa::RPi::ConfigParams ¶ms) +{ + params.ispControls = isp_[Isp::Input].dev()->controls(); + + /* Allocate the lens shading table via dmaHeap and pass to the IPA. */ + if (!lsTable_.isValid()) { + lsTable_ = SharedFD(dmaHeap_.alloc("ls_grid", ipa::RPi::MaxLsGridSize)); + if (!lsTable_.isValid()) + return -ENOMEM; + + /* Allow the IPA to mmap the LS table via the file descriptor. */ + /* + * \todo Investigate if mapping the lens shading table buffer + * could be handled with mapBuffers(). 
+ */ + params.lsTableHandle = lsTable_; + } + + return 0; +} + +void Vc4CameraData::platformStart() +{ +} + +void Vc4CameraData::platformStop() +{ + bayerQueue_ = {}; + embeddedQueue_ = {}; +} + +void Vc4CameraData::unicamBufferDequeue(FrameBuffer *buffer) +{ + RPi::Stream *stream = nullptr; + unsigned int index; + + if (!isRunning()) + return; + + for (RPi::Stream &s : unicam_) { + index = s.getBufferId(buffer); + if (index) { + stream = &s; + break; + } + } + + /* The buffer must belong to one of our streams. */ + ASSERT(stream); + + LOG(RPI, Debug) << "Stream " << stream->name() << " buffer dequeue" + << ", buffer id " << index + << ", timestamp: " << buffer->metadata().timestamp; + + if (stream == &unicam_[Unicam::Image]) { + /* + * Lookup the sensor controls used for this frame sequence from + * DelayedControl and queue them along with the frame buffer. + */ + auto [ctrl, delayContext] = delayedCtrls_->get(buffer->metadata().sequence); + /* + * Add the frame timestamp to the ControlList for the IPA to use + * as it does not receive the FrameBuffer object. + */ + ctrl.set(controls::SensorTimestamp, buffer->metadata().timestamp); + bayerQueue_.push({ buffer, std::move(ctrl), delayContext }); + } else { + embeddedQueue_.push(buffer); + } + + handleState(); +} + +void Vc4CameraData::ispInputDequeue(FrameBuffer *buffer) +{ + if (!isRunning()) + return; + + LOG(RPI, Debug) << "Stream ISP Input buffer complete" + << ", buffer id " << unicam_[Unicam::Image].getBufferId(buffer) + << ", timestamp: " << buffer->metadata().timestamp; + + /* The ISP input buffer gets re-queued into Unicam. 
*/ + handleStreamBuffer(buffer, &unicam_[Unicam::Image]); + handleState(); +} + +void Vc4CameraData::ispOutputDequeue(FrameBuffer *buffer) +{ + RPi::Stream *stream = nullptr; + unsigned int index; + + if (!isRunning()) + return; + + for (RPi::Stream &s : isp_) { + index = s.getBufferId(buffer); + if (index) { + stream = &s; + break; + } + } + + /* The buffer must belong to one of our ISP output streams. */ + ASSERT(stream); + + LOG(RPI, Debug) << "Stream " << stream->name() << " buffer complete" + << ", buffer id " << index + << ", timestamp: " << buffer->metadata().timestamp; + + /* + * ISP statistics buffer must not be re-queued or sent back to the + * application until after the IPA signals so. + */ + if (stream == &isp_[Isp::Stats]) { + ipa::RPi::ProcessParams params; + params.buffers.stats = index | RPi::MaskStats; + params.ipaContext = requestQueue_.front()->sequence(); + ipa_->processStats(params); + } else { + /* Any other ISP output can be handed back to the application now. */ + handleStreamBuffer(buffer, stream); + } + + /* + * Increment the number of ISP outputs generated. + * This is needed to track dropped frames. + */ + ispOutputCount_++; + + handleState(); +} + +void Vc4CameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers) +{ + if (!isRunning()) + return; + + FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID); + + handleStreamBuffer(buffer, &isp_[Isp::Stats]); + + /* Last thing to do is to fill up the request metadata. */ + Request *request = requestQueue_.front(); + ControlList metadata(controls::controls); + + ipa_->reportMetadata(request->sequence(), &metadata); + request->metadata().merge(metadata); + + /* + * Inform the sensor of the latest colour gains if it has the + * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set). 
+ */ + const auto &colourGains = metadata.get(libcamera::controls::ColourGains); + if (notifyGainsUnity_ && colourGains) { + /* The control wants linear gains in the order B, Gb, Gr, R. */ + ControlList ctrls(sensor_->controls()); + std::array<int32_t, 4> gains{ + static_cast<int32_t>((*colourGains)[1] * *notifyGainsUnity_), + *notifyGainsUnity_, + *notifyGainsUnity_, + static_cast<int32_t>((*colourGains)[0] * *notifyGainsUnity_) + }; + ctrls.set(V4L2_CID_NOTIFY_GAINS, Span<const int32_t>{ gains }); + + sensor_->setControls(&ctrls); + } + + state_ = State::IpaComplete; + handleState(); +} + +void Vc4CameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers) +{ + unsigned int embeddedId = buffers.embedded & RPi::MaskID; + unsigned int bayer = buffers.bayer & RPi::MaskID; + FrameBuffer *buffer; + + if (!isRunning()) + return; + + buffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID); + LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bayer & RPi::MaskID) + << ", timestamp: " << buffer->metadata().timestamp; + + isp_[Isp::Input].queueBuffer(buffer); + ispOutputCount_ = 0; + + if (sensorMetadata_ && embeddedId) { + buffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID); + handleStreamBuffer(buffer, &unicam_[Unicam::Embedded]); + } + + handleState(); +} + +void Vc4CameraData::setIspControls(const ControlList &controls) +{ + ControlList ctrls = controls; + + if (ctrls.contains(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)) { + ControlValue &value = + const_cast<ControlValue &>(ctrls.get(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)); + Span<uint8_t> s = value.data(); + bcm2835_isp_lens_shading *ls = + reinterpret_cast<bcm2835_isp_lens_shading *>(s.data()); + ls->dmabuf = lsTable_.get(); + } + + isp_[Isp::Input].dev()->setControls(&ctrls); + handleState(); +} + +void Vc4CameraData::setCameraTimeout(uint32_t maxFrameLengthMs) +{ + /* + * Set the dequeue timeout to the larger of 5x the maximum reported + * frame length advertised by the IPA over a number of frames. Allow + * a minimum timeout value of 1s. 
+ */ + utils::Duration timeout = + std::max(1s, 5 * maxFrameLengthMs * 1ms); + + LOG(RPI, Debug) << "Setting Unicam timeout to " << timeout; + unicam_[Unicam::Image].dev()->setDequeueTimeout(timeout); +} + +void Vc4CameraData::tryRunPipeline() +{ + FrameBuffer *embeddedBuffer; + BayerFrame bayerFrame; + + /* If any of our request or buffer queues are empty, we cannot proceed. */ + if (state_ != State::Idle || requestQueue_.empty() || + bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_)) + return; + + if (!findMatchingBuffers(bayerFrame, embeddedBuffer)) + return; + + /* Take the first request from the queue and action the IPA. */ + Request *request = requestQueue_.front(); + + /* See if a new ScalerCrop value needs to be applied. */ + calculateScalerCrop(request->controls()); + + /* + * Clear the request metadata and fill it with some initial non-IPA + * related controls. We clear it first because the request metadata + * may have been populated if we have dropped the previous frame. + */ + request->metadata().clear(); + fillRequestMetadata(bayerFrame.controls, request); + + /* Set our state to say the pipeline is active. 
*/ + state_ = State::Busy; + + unsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer); + + LOG(RPI, Debug) << "Signalling prepareIsp:" + << " Bayer buffer id: " << bayer; + + ipa::RPi::PrepareParams params; + params.buffers.bayer = RPi::MaskBayerData | bayer; + params.sensorControls = std::move(bayerFrame.controls); + params.requestControls = request->controls(); + params.ipaContext = request->sequence(); + params.delayContext = bayerFrame.delayContext; + + if (embeddedBuffer) { + unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer); + + params.buffers.embedded = RPi::MaskEmbeddedData | embeddedId; + LOG(RPI, Debug) << "Signalling prepareIsp:" + << " Embedded buffer id: " << embeddedId; + } + + ipa_->prepareIsp(params); +} + +bool Vc4CameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer) +{ + if (bayerQueue_.empty()) + return false; + + /* + * Find the embedded data buffer with a matching timestamp to pass to + * the IPA. Any embedded buffers with a timestamp lower than the + * current bayer buffer will be removed and re-queued to the driver. + */ + uint64_t ts = bayerQueue_.front().buffer->metadata().timestamp; + embeddedBuffer = nullptr; + while (!embeddedQueue_.empty()) { + FrameBuffer *b = embeddedQueue_.front(); + if (b->metadata().timestamp < ts) { + embeddedQueue_.pop(); + unicam_[Unicam::Embedded].returnBuffer(b); + LOG(RPI, Debug) << "Dropping unmatched input frame in stream " + << unicam_[Unicam::Embedded].name(); + } else if (b->metadata().timestamp == ts) { + /* Found a match! */ + embeddedBuffer = b; + embeddedQueue_.pop(); + break; + } else { + break; /* Only higher timestamps from here. */ + } + } + + if (!embeddedBuffer && sensorMetadata_) { + if (embeddedQueue_.empty()) { + /* + * If the embedded buffer queue is empty, wait for the next + * buffer to arrive - dequeue ordering may send the image + * buffer first. 
+ */ + LOG(RPI, Debug) << "Waiting for next embedded buffer."; + return false; + } + + /* Log if there is no matching embedded data buffer found. */ + LOG(RPI, Debug) << "Returning bayer frame without a matching embedded buffer."; + } + + bayerFrame = std::move(bayerQueue_.front()); + bayerQueue_.pop(); + + return true; +} + +REGISTER_PIPELINE_HANDLER(PipelineHandlerVc4) + +} /* namespace libcamera */ 

From patchwork Wed Apr 26 13:10:56 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18559
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:56 +0100
Message-Id: <20230426131057.21550-13-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 12/13] pipeline: raspberrypi: Add stream flags to RPi::Stream
From: Naushir Patuck

Add a flags_ field to indicate stream state information in RPi::Stream. This replaces the existing external_ and importOnly_ boolean flags. 
Signed-off-by: Naushir Patuck Reviewed-by: Jacopo Mondi --- .../pipeline/rpi/common/pipeline_base.cpp | 8 ++-- .../pipeline/rpi/common/rpi_stream.cpp | 40 ++++++++++-------- .../pipeline/rpi/common/rpi_stream.h | 42 ++++++++++++------- src/libcamera/pipeline/rpi/vc4/vc4.cpp | 8 ++-- 4 files changed, 60 insertions(+), 38 deletions(-) diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp index 012766b38c32..4d38d511c383 100644 --- a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp @@ -31,6 +31,8 @@ using namespace RPi; LOG_DEFINE_CATEGORY(RPI) +using Flag = RPi::Stream::StreamFlag; + namespace { constexpr unsigned int defaultRawBitDepth = 12; @@ -504,7 +506,7 @@ int PipelineHandlerBase::configure(Camera *camera, CameraConfiguration *config) /* Start by freeing all buffers and reset the stream states. */ data->freeBuffers(); for (auto const stream : data->streams_) - stream->setExternal(false); + stream->clearFlags(Flag::External); std::vector rawStreams, ispStreams; std::optional packing; @@ -752,7 +754,7 @@ int PipelineHandlerBase::queueRequestDevice(Camera *camera, Request *request) /* Push all buffers supplied in the Request to the respective streams. 
*/ for (auto stream : data->streams_) { - if (!stream->isExternal()) + if (!(stream->getFlags() & Flag::External)) continue; FrameBuffer *buffer = request->findBuffer(stream); @@ -931,7 +933,7 @@ int PipelineHandlerBase::queueAllBuffers(Camera *camera) int ret; for (auto const stream : data->streams_) { - if (!stream->isExternal()) { + if (!(stream->getFlags() & Flag::External)) { ret = stream->queueAllBuffers(); if (ret < 0) return ret; diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp index b7e4130f5e56..c158843cb0ed 100644 --- a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp +++ b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp @@ -14,6 +14,24 @@ LOG_DEFINE_CATEGORY(RPISTREAM) namespace RPi { +void Stream::setFlags(StreamFlags flags) +{ + flags_ |= flags; + + /* Import streams cannot be external. */ + ASSERT(!(flags_ & StreamFlag::External) || !(flags_ & StreamFlag::ImportOnly)); +} + +void Stream::clearFlags(StreamFlags flags) +{ + flags_ &= ~flags; +} + +RPi::Stream::StreamFlags Stream::getFlags() const +{ + return flags_; +} + V4L2VideoDevice *Stream::dev() const { return dev_.get(); @@ -32,18 +50,6 @@ void Stream::resetBuffers() availableBuffers_.push(buffer.get()); } -void Stream::setExternal(bool external) -{ - /* Import streams cannot be external. */ - ASSERT(!external || !importOnly_); - external_ = external; -} - -bool Stream::isExternal() const -{ - return external_; -} - void Stream::setExportedBuffers(std::vector> *buffers) { for (auto const &buffer : *buffers) @@ -57,7 +63,7 @@ const BufferMap &Stream::getBuffers() const unsigned int Stream::getBufferId(FrameBuffer *buffer) const { - if (importOnly_) + if (flags_ & StreamFlag::ImportOnly) return 0; /* Find the buffer in the map, and return the buffer id. 
*/ @@ -88,7 +94,7 @@ int Stream::prepareBuffers(unsigned int count) { int ret; - if (!importOnly_) { + if (!(flags_ & StreamFlag::ImportOnly)) { if (count) { /* Export some frame buffers for internal use. */ ret = dev_->exportBuffers(count, &internalBuffers_); @@ -113,7 +119,7 @@ int Stream::prepareBuffers(unsigned int count) * \todo Find a better heuristic, or, even better, an exact solution to * this issue. */ - if (isExternal() || importOnly_) + if ((flags_ & StreamFlag::External) || (flags_ & StreamFlag::ImportOnly)) count = count * 2; return dev_->importBuffers(count); @@ -160,7 +166,7 @@ int Stream::queueBuffer(FrameBuffer *buffer) void Stream::returnBuffer(FrameBuffer *buffer) { - if (!external_) { + if (!(flags_ & StreamFlag::External)) { /* For internal buffers, simply requeue back to the device. */ queueToDevice(buffer); return; @@ -204,7 +210,7 @@ int Stream::queueAllBuffers() { int ret; - if (external_) + if (flags_ & StreamFlag::External) return 0; while (!availableBuffers_.empty()) { diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.h b/src/libcamera/pipeline/rpi/common/rpi_stream.h index b8c74de35863..6edd304bdfe2 100644 --- a/src/libcamera/pipeline/rpi/common/rpi_stream.h +++ b/src/libcamera/pipeline/rpi/common/rpi_stream.h @@ -12,6 +12,7 @@ #include #include +#include #include #include "libcamera/internal/v4l2_videodevice.h" @@ -37,25 +38,41 @@ enum BufferMask { class Stream : public libcamera::Stream { public: + enum class StreamFlag { + None = 0, + /* + * Indicates that this stream only imports buffers, e.g. the ISP + * input stream. + */ + ImportOnly = (1 << 0), + /* + * Indicates that this stream is active externally, i.e. the + * buffers might be provided by (and returned to) the application. 
+ */ + External = (1 << 1), + }; + + using StreamFlags = Flags<StreamFlag>; + + Stream() - : id_(BufferMask::MaskID) + : flags_(StreamFlag::None), id_(BufferMask::MaskID) { } - Stream(const char *name, MediaEntity *dev, bool importOnly = false) - : external_(false), importOnly_(importOnly), name_(name), + Stream(const char *name, MediaEntity *dev, StreamFlags flags = StreamFlag::None) + : flags_(flags), name_(name), dev_(std::make_unique<V4L2VideoDevice>(dev)), id_(BufferMask::MaskID) { } + void setFlags(StreamFlags flags); + void clearFlags(StreamFlags flags); + StreamFlags getFlags() const; + V4L2VideoDevice *dev() const; const std::string &name() const; - bool isImporter() const; void resetBuffers(); - void setExternal(bool external); - bool isExternal() const; - void setExportedBuffers(std::vector<std::unique_ptr<FrameBuffer>> *buffers); const BufferMap &getBuffers() const; unsigned int getBufferId(FrameBuffer *buffer) const; @@ -112,14 +129,7 @@ private: void clearBuffers(); int queueToDevice(FrameBuffer *buffer); - /* - * Indicates that this stream is active externally, i.e. the buffers - * might be provided by (and returned to) the application. - */ - bool external_; - - /* Indicates that this stream only imports buffers, e.g. ISP input. */ - bool importOnly_; + StreamFlags flags_; /* Stream name identifier. */ std::string name_; @@ -182,4 +192,6 @@ public: } /* namespace RPi */ +LIBCAMERA_FLAGS_ENABLE_OPERATORS(RPi::Stream::StreamFlag) + } /* namespace libcamera */ diff --git a/src/libcamera/pipeline/rpi/vc4/vc4.cpp b/src/libcamera/pipeline/rpi/vc4/vc4.cpp index a085a06376a8..bd7bfb3a7663 100644 --- a/src/libcamera/pipeline/rpi/vc4/vc4.cpp +++ b/src/libcamera/pipeline/rpi/vc4/vc4.cpp @@ -23,6 +23,8 @@ namespace libcamera { LOG_DECLARE_CATEGORY(RPI) +using Flag = RPi::Stream::StreamFlag; + namespace { enum class Unicam : unsigned int { Image, Embedded }; @@ -320,7 +322,7 @@ int PipelineHandlerVc4::platformRegister(std::unique_ptr &camer } /* Tag the ISP input stream as an import stream. 
*/ - data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, true); + data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, Flag::ImportOnly); data->isp_[Isp::Output0] = RPi::Stream("ISP Output0", ispCapture1); data->isp_[Isp::Output1] = RPi::Stream("ISP Output1", ispCapture2); data->isp_[Isp::Stats] = RPi::Stream("ISP Stats", ispCapture3); @@ -502,7 +504,7 @@ int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat, */ if (!rawStreams.empty()) { rawStreams[0].cfg->setStream(&unicam_[Unicam::Image]); - unicam_[Unicam::Image].setExternal(true); + unicam_[Unicam::Image].setFlags(Flag::External); } ret = isp_[Isp::Input].dev()->setFormat(&unicamFormat); @@ -547,7 +549,7 @@ int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat, << ColorSpace::toString(cfg->colorSpace); cfg->setStream(stream); - stream->setExternal(true); + stream->setFlags(Flag::External); } ispOutputTotal_ = outStreams.size(); From patchwork Wed Apr 26 13:10:57 2023 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Naushir Patuck X-Patchwork-Id: 18560 Return-Path: X-Original-To: parsemail@patchwork.libcamera.org Delivered-To: parsemail@patchwork.libcamera.org Received: from lancelot.ideasonboard.com (lancelot.ideasonboard.com [92.243.16.209]) by patchwork.libcamera.org (Postfix) with ESMTPS id 6DBDCC3275 for ; Wed, 26 Apr 2023 13:13:44 +0000 (UTC) Received: from lancelot.ideasonboard.com (localhost [IPv6:::1]) by lancelot.ideasonboard.com (Postfix) with ESMTP id 1EDB3627FF; Wed, 26 Apr 2023 15:13:44 +0200 (CEST) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org; s=mail; t=1682514824; bh=oY+NgQ2D+mWVJsnVUJdRNmCftrAmbxszpbVYstEIXzo=; h=To:Date:In-Reply-To:References:Subject:List-Id:List-Unsubscribe: List-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To: From; b=bui+jd2/0713KThCN1uGyCouuDjn7KKGTmLB3/ooXDxGoEswIG0XFt74cWX1HAdNg 
To: libcamera-devel@lists.libcamera.org
Date: Wed, 26 Apr 2023 14:10:57 +0100
Message-Id: <20230426131057.21550-14-naush@raspberrypi.com>
In-Reply-To: <20230426131057.21550-1-naush@raspberrypi.com>
References: <20230426131057.21550-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 13/13] pipeline: vc4: Connect/disconnect IPA and dequeue signals on start/stop
From: Naushir Patuck

We currently rely on a state check to see if any of the IPA and buffer
dequeue signal handlers need to run. Replace this check by explicitly
disconnecting the appropriate signals on camera stop, and re-connecting
them on camera start.
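The lifecycle change described in the commit message can be sketched with a minimal, self-contained stand-in for a libcamera-style signal (the `Signal` and `CameraData` types below are illustrative stand-ins, not libcamera's actual API): instead of guarding each handler with an `isRunning()` check, handlers are connected on start and disconnected on stop, so a late emission after stop simply finds no listeners.

```cpp
#include <cassert>
#include <functional>
#include <utility>
#include <vector>

// Minimal stand-in for a libcamera-style Signal: slots connect,
// emit() invokes them in order, disconnect() drops them all.
class Signal {
public:
	void connect(std::function<void(int)> slot)
	{
		slots_.push_back(std::move(slot));
	}
	void disconnect() { slots_.clear(); }
	void emit(int arg)
	{
		for (auto &s : slots_)
			s(arg);
	}

private:
	std::vector<std::function<void(int)>> slots_;
};

// Sketch of the pattern the patch adopts: connect the dequeue handler
// in start(), disconnect it in stop(). No per-handler running check
// is needed, because after stop() the signal has no listeners.
struct CameraData {
	Signal bufferReady;
	int dequeued = 0;

	void start()
	{
		bufferReady.connect([this](int) { dequeued++; });
	}
	void stop() { bufferReady.disconnect(); }
};
```

A handler that fires between `stop()` and the next `start()` is then a no-op by construction rather than by an explicit early return.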
Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 src/libcamera/pipeline/rpi/vc4/vc4.cpp | 54 +++++++++++---------------
 1 file changed, 23 insertions(+), 31 deletions(-)

diff --git a/src/libcamera/pipeline/rpi/vc4/vc4.cpp b/src/libcamera/pipeline/rpi/vc4/vc4.cpp
index bd7bfb3a7663..4b3f5a7fc9fe 100644
--- a/src/libcamera/pipeline/rpi/vc4/vc4.cpp
+++ b/src/libcamera/pipeline/rpi/vc4/vc4.cpp
@@ -315,11 +315,8 @@ int PipelineHandlerVc4::platformRegister(std::unique_ptr &camer
 	/* An embedded data node will not be present if the sensor does not support it. */
 	MediaEntity *unicamEmbedded = unicam->getEntityByName("unicam-embedded");
-	if (unicamEmbedded) {
+	if (unicamEmbedded)
 		data->unicam_[Unicam::Embedded] = RPi::Stream("Unicam Embedded", unicamEmbedded);
-		data->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data,
-									   &Vc4CameraData::unicamBufferDequeue);
-	}
 
 	/* Tag the ISP input stream as an import stream. */
 	data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, Flag::ImportOnly);
@@ -327,18 +324,9 @@ int PipelineHandlerVc4::platformRegister(std::unique_ptr &camer
 	data->isp_[Isp::Output1] = RPi::Stream("ISP Output1", ispCapture2);
 	data->isp_[Isp::Stats] = RPi::Stream("ISP Stats", ispCapture3);
 
-	/* Wire up all the buffer connections. */
-	data->unicam_[Unicam::Image].dev()->bufferReady.connect(data, &Vc4CameraData::unicamBufferDequeue);
-	data->isp_[Isp::Input].dev()->bufferReady.connect(data, &Vc4CameraData::ispInputDequeue);
-	data->isp_[Isp::Output0].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);
-	data->isp_[Isp::Output1].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);
-	data->isp_[Isp::Stats].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);
-
 	if (data->sensorMetadata_ ^ !!data->unicam_[Unicam::Embedded].dev()) {
 		LOG(RPI, Warning) << "Mismatch between Unicam and CamHelper for embedded data usage!";
 		data->sensorMetadata_ = false;
-		if (data->unicam_[Unicam::Embedded].dev())
-			data->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect();
 	}
 
 	/*
@@ -367,9 +355,7 @@ int PipelineHandlerVc4::platformRegister(std::unique_ptr &camer
 		return -EINVAL;
 	}
 
-	/* Write up all the IPA connections. */
-	data->ipa_->processStatsComplete.connect(data, &Vc4CameraData::processStatsComplete);
-	data->ipa_->prepareIspComplete.connect(data, &Vc4CameraData::prepareIspComplete);
+	/* Wire up the default IPA connections. The others get connected on start() */
 	data->ipa_->setIspControls.connect(data, &Vc4CameraData::setIspControls);
 	data->ipa_->setCameraTimeout.connect(data, &Vc4CameraData::setCameraTimeout);
 
@@ -691,10 +677,31 @@ int Vc4CameraData::platformConfigureIpa(ipa::RPi::ConfigParams &params)
 
 void Vc4CameraData::platformStart()
 {
+	unicam_[Unicam::Image].dev()->bufferReady.connect(this, &Vc4CameraData::unicamBufferDequeue);
+	isp_[Isp::Input].dev()->bufferReady.connect(this, &Vc4CameraData::ispInputDequeue);
+	isp_[Isp::Output0].dev()->bufferReady.connect(this, &Vc4CameraData::ispOutputDequeue);
+	isp_[Isp::Output1].dev()->bufferReady.connect(this, &Vc4CameraData::ispOutputDequeue);
+	isp_[Isp::Stats].dev()->bufferReady.connect(this, &Vc4CameraData::ispOutputDequeue);
+	ipa_->processStatsComplete.connect(this, &Vc4CameraData::processStatsComplete);
+	ipa_->prepareIspComplete.connect(this, &Vc4CameraData::prepareIspComplete);
+
+	if (sensorMetadata_)
+		unicam_[Unicam::Embedded].dev()->bufferReady.connect(this, &Vc4CameraData::unicamBufferDequeue);
 }
 
 void Vc4CameraData::platformStop()
 {
+	unicam_[Unicam::Image].dev()->bufferReady.disconnect();
+	isp_[Isp::Input].dev()->bufferReady.disconnect();
+	isp_[Isp::Output0].dev()->bufferReady.disconnect();
+	isp_[Isp::Output1].dev()->bufferReady.disconnect();
+	isp_[Isp::Stats].dev()->bufferReady.disconnect();
+	ipa_->processStatsComplete.disconnect();
+	ipa_->prepareIspComplete.disconnect();
+
+	if (sensorMetadata_)
+		unicam_[Unicam::Embedded].dev()->bufferReady.disconnect();
+
 	bayerQueue_ = {};
 	embeddedQueue_ = {};
 }
@@ -704,9 +711,6 @@ void Vc4CameraData::unicamBufferDequeue(FrameBuffer *buffer)
 	RPi::Stream *stream = nullptr;
 	unsigned int index;
 
-	if (!isRunning())
-		return;
-
 	for (RPi::Stream &s : unicam_) {
 		index = s.getBufferId(buffer);
 		if (index) {
@@ -743,9 +747,6 @@ void Vc4CameraData::unicamBufferDequeue(FrameBuffer *buffer)
 
 void Vc4CameraData::ispInputDequeue(FrameBuffer *buffer)
 {
-	if (!isRunning())
-		return;
-
 	LOG(RPI, Debug) << "Stream ISP Input buffer complete"
			<< ", buffer id " << unicam_[Unicam::Image].getBufferId(buffer)
			<< ", timestamp: " << buffer->metadata().timestamp;
@@ -760,9 +761,6 @@ void Vc4CameraData::ispOutputDequeue(FrameBuffer *buffer)
 	RPi::Stream *stream = nullptr;
 	unsigned int index;
 
-	if (!isRunning())
-		return;
-
 	for (RPi::Stream &s : isp_) {
 		index = s.getBufferId(buffer);
 		if (index) {
@@ -803,9 +801,6 @@ void Vc4CameraData::ispOutputDequeue(FrameBuffer *buffer)
 
 void Vc4CameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers)
 {
-	if (!isRunning())
-		return;
-
 	FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID);
 
 	handleStreamBuffer(buffer, &isp_[Isp::Stats]);
@@ -846,9 +841,6 @@ void Vc4CameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers)
 	unsigned int bayer = buffers.bayer & RPi::MaskID;
 	FrameBuffer *buffer;
 
-	if (!isRunning())
-		return;
-
 	buffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID);
 	LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bayer & RPi::MaskID)
			<< ", timestamp: " << buffer->metadata().timestamp;
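The first patch in the series above replaces the `external_`/`importOnly_` bools with a single `StreamFlags` bitmask. A minimal, self-contained sketch of that type-safe flags pattern follows; libcamera's real implementation is the `Flags<E>` template plus the `LIBCAMERA_FLAGS_ENABLE_OPERATORS` macro, and here only `External = (1 << 1)` is visible in the patch, so the `ImportOnly` bit value and the `has()` helper are assumptions for illustration.

```cpp
#include <cassert>

// Flag bits carried by a stream. External's value matches the patch;
// ImportOnly's bit is assumed for this sketch.
enum class StreamFlag : unsigned int {
	None = 0,
	ImportOnly = (1 << 0),
	External = (1 << 1),
};

// Hand-rolled operators standing in for LIBCAMERA_FLAGS_ENABLE_OPERATORS:
// enum classes have no implicit bitwise ops, which is what makes the
// bitmask type-safe compared to raw unsigned ints.
constexpr StreamFlag operator|(StreamFlag a, StreamFlag b)
{
	return static_cast<StreamFlag>(static_cast<unsigned int>(a) |
				       static_cast<unsigned int>(b));
}

// Hypothetical helper: test whether a flag is set in a flag set.
constexpr bool has(StreamFlag set, StreamFlag f)
{
	return static_cast<unsigned int>(set) & static_cast<unsigned int>(f);
}

// Sketch of the setFlags()/clearFlags()/getFlags() interface the patch
// adds to RPi::Stream, reduced to the flag bookkeeping alone.
class Stream {
public:
	void setFlags(StreamFlag f) { flags_ = flags_ | f; }
	void clearFlags(StreamFlag f)
	{
		flags_ = static_cast<StreamFlag>(static_cast<unsigned int>(flags_) &
						 ~static_cast<unsigned int>(f));
	}
	StreamFlag getFlags() const { return flags_; }

private:
	StreamFlag flags_ = StreamFlag::None;
};
```

Collapsing the two bools into one field lets call sites such as `RPi::Stream("ISP Input", ispOutput0, Flag::ImportOnly)` state intent explicitly instead of passing a bare `true`.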