From patchwork Wed May 3 12:20:23 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18586
To: libcamera-devel@lists.libcamera.org
Date: Wed, 3 May 2023 13:20:23 +0100
Message-Id: <20230503122035.32026-2-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 01/13] meson: ipa: Add mapping for pipeline handler to mojom interface file
From: Naushir Patuck
Cc: Jacopo Mondi

Allow an arbitrary mapping between the pipeline handler and IPA mojom
interface file in the build system. This removes the 1:1 mapping of
pipeline handler name to mojom filename, and allows more flexibility to
pipeline developers.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
Reviewed-by: Laurent Pinchart
---
 Documentation/guides/ipa.rst      | 39 ++++++++++++++++++++-----------
 include/libcamera/ipa/meson.build | 36 +++++++++++++++++-----------
 2 files changed, 48 insertions(+), 27 deletions(-)

diff --git a/Documentation/guides/ipa.rst b/Documentation/guides/ipa.rst
index fc0317451e24..51dbe3aad5f5 100644
--- a/Documentation/guides/ipa.rst
+++ b/Documentation/guides/ipa.rst
@@ -19,6 +19,16 @@
 connect to, in order to receive data from the IPA asynchronously. In addition,
 it contains any custom data structures that the pipeline handler and IPA may
 pass to each other.
 
+It is possible to use the same IPA interface with multiple pipeline handlers
+on different hardware platforms. Generally in such cases, these platforms would
+have a common hardware ISP pipeline. For instance, the rkisp1 pipeline handler
+supports both the RK3399 and the i.MX8MP as they integrate the same ISP.
+However, the i.MX8MP has a more complex camera pipeline, which may call for a
+dedicated pipeline handler in the future. As the ISP is the same as for RK3399,
+the same IPA interface could be used for both pipeline handlers. The build files
+provide a mapping from pipeline handler to the IPA interface name as detailed in
+:ref:`compiling-section`.
+
 The IPA protocol refers to the agreement between the pipeline handler and the
 IPA regarding the expected response(s) from the IPA for given calls to the
 IPA. This protocol doesn't need to be declared anywhere in code, but it shall
@@ -43,7 +53,7 @@
 interface definition is thus written by the pipeline handler author, based on
 how they design the interactions between the pipeline handler and the IPA.
 
 The entire IPA interface, including the functions, signals, and any custom
-structs shall be defined in a file named {pipeline_name}.mojom under
+structs shall be defined in a file named {interface_name}.mojom under
 include/libcamera/ipa/.
 
 .. _mojo Interface Definition Language: https://chromium.googlesource.com/chromium/src.git/+/master/mojo/public/tools/bindings/README.md
@@ -150,8 +160,6 @@
 and the Event IPA interface, which describes the signals received by the
 pipeline handler that the IPA can emit. Both must be defined. This section
 focuses on the Main IPA interface.
 
-The main interface must be named as IPA{pipeline_name}Interface.
-
 The functions that the pipeline handler can call from the IPA may be
 synchronous or asynchronous. Synchronous functions do not return until the IPA
 returns from the function, while asynchronous functions return immediately
@@ -243,7 +251,7 @@
 then it may be empty. These emissions are meant to notify the pipeline handler
 of some event, such as request data is ready, and *must not* be used to drive
 the camera pipeline from the IPA.
 
-The event interface must be named as IPA{pipeline_name}EventInterface.
+The event interface must be named as IPA{interface_name}EventInterface.
 
 Functions defined in the event interface are implicitly asynchronous. Thus
 they cannot return any value. Specifying the [async] tag is not
@@ -266,38 +274,41 @@ The following is an example of an event interface definition:
         setStaggered(libcamera.ControlList controls);
     };
 
+.. _compiling-section:
+
 Compiling the IPA interface
 ---------------------------
 
-After the IPA interface is defined in include/libcamera/ipa/{pipeline_name}.mojom,
+After the IPA interface is defined in include/libcamera/ipa/{interface_name}.mojom,
 an entry for it must be added in meson so that it can be compiled. The filename
-must be added to the ipa_mojom_files object in include/libcamera/ipa/meson.build.
+must be added to the pipeline_ipa_mojom_mapping object in include/libcamera/ipa/meson.build.
+This object maps the pipeline handler name with an ipa interface file.
 
 For example, adding the raspberrypi.mojom file to meson:
 
 .. code-block:: none
 
-   ipa_mojom_files = [
-       'raspberrypi.mojom',
+   pipeline_ipa_mojom_mapping = [
+       'rpi/vc4': 'raspberrypi.mojom',
    ]
 
 This will cause the mojo data definition file to be compiled. Specifically, it
 generates five files:
 
 - a header describing the custom data structures, and the complete IPA
-  interface (at {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_interface.h)
+  interface (at {$build_dir}/include/libcamera/ipa/{interface}_ipa_interface.h)
 
 - a serializer implementing de/serialization for the custom data structures (at
-  {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_serializer.h)
+  {$build_dir}/include/libcamera/ipa/{interface}_ipa_serializer.h)
 
 - a proxy header describing a specialized IPA proxy (at
-  {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_proxy.h)
+  {$build_dir}/include/libcamera/ipa/{interface}_ipa_proxy.h)
 
 - a proxy source implementing the IPA proxy (at
-  {$build_dir}/src/libcamera/proxy/{pipeline}_ipa_proxy.cpp)
+  {$build_dir}/src/libcamera/proxy/{interface}_ipa_proxy.cpp)
 
 - a proxy worker source implementing the other end of the IPA proxy (at
-  {$build_dir}/src/libcamera/proxy/worker/{pipeline}_ipa_proxy_worker.cpp)
+  {$build_dir}/src/libcamera/proxy/worker/{interface}_ipa_proxy_worker.cpp)
 
 The IPA proxy serves as the layer between the pipeline handler and the IPA, and
 handles threading vs isolation transparently. The pipeline handler and the IPA
@@ -312,7 +323,7 @@ file, the following header must be included:
 
 .. code-block:: C++
 
-   #include
+   #include
 
 The POD types of the structs simply become their C++ counterparts, eg. uint32
 in mojo will become uint32_t in C++. mojo map becomes C++ std::map, and mojo
diff --git a/include/libcamera/ipa/meson.build b/include/libcamera/ipa/meson.build
index 442ca3dd7e1c..c57b3a5e1570 100644
--- a/include/libcamera/ipa/meson.build
+++ b/include/libcamera/ipa/meson.build
@@ -60,13 +60,15 @@ libcamera_generated_ipa_headers += custom_target('core_ipa_serializer_h',
         './' +'@INPUT@'
     ])
 
-ipa_mojom_files = [
-    'ipu3.mojom',
-    'raspberrypi.mojom',
-    'rkisp1.mojom',
-    'vimc.mojom',
-]
-
+# Mapping from pipeline handler name to mojom file
+pipeline_ipa_mojom_mapping = {
+    'ipu3': 'ipu3.mojom',
+    'rkisp1': 'rkisp1.mojom',
+    'raspberrypi': 'raspberrypi.mojom',
+    'vimc': 'vimc.mojom',
+}
+
+ipa_mojom_files = []
 ipa_mojoms = []
 
 #
@@ -75,14 +77,22 @@ ipa_mojoms = []
 # TODO Define per-pipeline ControlInfoMap with yaml?
 
-foreach file : ipa_mojom_files
+foreach pipeline, file : pipeline_ipa_mojom_mapping
+
     name = file.split('.')[0]
 
-    if name not in pipelines
+    # Ensure we do not build duplicate mojom modules
+    if file in ipa_mojom_files
+        continue
+    endif
+
+    ipa_mojom_files += file
+
+    if pipeline not in pipelines
         continue
     endif
 
-    # {pipeline}.mojom-module
+    # {interface}.mojom-module
     mojom = custom_target(name + '_mojom_module',
         input : file,
         output : file + '-module',
@@ -94,7 +104,7 @@ foreach file : ipa_mojom_files
         '--mojoms', '@INPUT@'
     ])
 
-    # {pipeline}_ipa_interface.h
+    # {interface}_ipa_interface.h
     header = custom_target(name + '_ipa_interface_h',
         input : mojom,
         output : name + '_ipa_interface.h',
@@ -110,7 +120,7 @@ foreach file : ipa_mojom_files
         './' +'@INPUT@'
     ])
 
-    # {pipeline}_ipa_serializer.h
+    # {interface}_ipa_serializer.h
     serializer = custom_target(name + '_ipa_serializer_h',
         input : mojom,
         output : name + '_ipa_serializer.h',
@@ -124,7 +134,7 @@ foreach file : ipa_mojom_files
         './' +'@INPUT@'
    ])
 
-    # {pipeline}_ipa_proxy.h
+    # {interface}_ipa_proxy.h
     proxy_header = custom_target(name + '_proxy_h',
         input : mojom,
         output : name + '_ipa_proxy.h',

From patchwork Wed May 3 12:20:24 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18587
To: libcamera-devel@lists.libcamera.org
Date: Wed, 3 May 2023 13:20:24 +0100
Message-Id: <20230503122035.32026-3-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 02/13] libcamera: ipa: Remove character restriction on the IPA name
From: Naushir Patuck

Remove the restriction on using the '/' character in the IPA name string.
This allows more flexibility in IPA directory structures where different
IPA modules might live in subdirectories under the usual src/ipa// top
level directory.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
Reviewed-by: Laurent Pinchart
---
 src/libcamera/ipa_module.cpp | 11 +++++------
 1 file changed, 5 insertions(+), 6 deletions(-)

diff --git a/src/libcamera/ipa_module.cpp b/src/libcamera/ipa_module.cpp
index c152153c180f..3ca7074b3b58 100644
--- a/src/libcamera/ipa_module.cpp
+++ b/src/libcamera/ipa_module.cpp
@@ -225,9 +225,9 @@ Span elfLoadSymbol(Span elf, const char *symbol)
 * \brief The name of the IPA module
 *
 * The name may be used to build file system paths to IPA-specific resources.
- * It shall only contain printable characters, and may not contain '/', '*',
- * '?' or '\'. For IPA modules included in libcamera, it shall match the
- * directory of the IPA module in the source tree.
+ * It shall only contain printable characters, and may not contain '*', '?' or
+ * '\'. For IPA modules included in libcamera, it shall match the directory of
+ * the IPA module in the source tree.
 *
 * \todo Allow user to choose to isolate open source IPAs
 */
@@ -304,9 +304,8 @@ int IPAModule::loadIPAModuleInfo()
 	std::string ipaName = info_.name;
 	auto iter = std::find_if_not(ipaName.begin(), ipaName.end(),
 				     [](unsigned char c) -> bool {
-					     return isprint(c) && c != '/' &&
-						    c != '?' && c != '*' &&
-						    c != '\\';
+					     return isprint(c) && c != '?' &&
+						    c != '*' && c != '\\';
 				     });
 	if (iter != ipaName.end()) {
 		LOG(IPAModule, Error)

From patchwork Wed May 3 12:20:25 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18588
To: libcamera-devel@lists.libcamera.org
Date: Wed, 3 May 2023 13:20:25 +0100
Message-Id: <20230503122035.32026-4-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 03/13] ipa: meson: Allow nested IPA directory structures
From: Naushir Patuck

The current IPA build files require a flat directory structure for the
IPAs. Modify the build files to remove this restriction and allow a
directory structure such as:

src/ipa
|- raspberrypi
|  |- common
|  |- cam_helpers
|  |- controller
|  |- vc4
|- rkisp1
|- ipu3

where each subdir (e.g. raspberrypi/common, raspberrypi/cam_helper) has its
own meson.build file. Such a directory structure will be introduced for the
Raspberry Pi IPA in a future commit.
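[Editor's note: under such a layout, each nested subdirectory would carry its
own meson.build that contributes sources to the enclosing IPA. As a rough
sketch only (the file path, source file names and variable names below are
hypothetical, not taken from this series), a nested build file could look
like:]

```meson
# Hypothetical src/ipa/raspberrypi/common/meson.build.
# File and variable names are illustrative only.
rpi_ipa_common_sources = files([
    'cam_helper.cpp',
    'md_parser.cpp',
])

# Accumulate into a source list defined by the parent meson.build.
rpi_ipa_sources += rpi_ipa_common_sources
```

[The parent src/ipa/raspberrypi/meson.build would pull this in with
subdir('common') before building the IPA module.]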
Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
Reviewed-by: Laurent Pinchart
---
 meson.build                     |  2 +-
 src/ipa/ipu3/meson.build        |  2 ++
 src/ipa/meson.build             | 26 +++++++++++++++++++++-----
 src/ipa/raspberrypi/meson.build |  2 ++
 src/ipa/rkisp1/meson.build      |  2 ++
 5 files changed, 28 insertions(+), 6 deletions(-)

diff --git a/meson.build b/meson.build
index d3289181b7b9..2d99029bf5b7 100644
--- a/meson.build
+++ b/meson.build
@@ -262,7 +262,7 @@ py_mod.find_installation('python3', modules: py_modules)
 ## Summarise Configurations
 summary({
             'Enabled pipelines': pipelines,
-            'Enabled IPA modules': enabled_ipa_modules,
+            'Enabled IPA modules': enabled_ipa_names,
             'Tracing support': tracing_enabled,
             'Android support': android_enabled,
             'GStreamer support': gst_enabled,
diff --git a/src/ipa/ipu3/meson.build b/src/ipa/ipu3/meson.build
index 658e7c9bc366..66c398432d43 100644
--- a/src/ipa/ipu3/meson.build
+++ b/src/ipa/ipu3/meson.build
@@ -29,3 +29,5 @@ if ipa_sign_module
         install : false,
         build_by_default : true)
 endif
+
+ipa_names += ipa_name
diff --git a/src/ipa/meson.build b/src/ipa/meson.build
index 76ad5b445601..fac92f32fdb9 100644
--- a/src/ipa/meson.build
+++ b/src/ipa/meson.build
@@ -36,16 +36,32 @@ if get_option('test') and 'vimc' not in ipa_modules
 endif
 
 enabled_ipa_modules = []
+enabled_ipa_names = []
+ipa_names = []
 
 # The ipa-sign-install.sh script which uses the ipa_names variable will itself
 # prepend MESON_INSTALL_DESTDIR_PREFIX to each ipa module name, therefore we
 # must not include the prefix string here.
+
+subdirs = []
 foreach pipeline : pipelines
-    if ipa_modules.contains(pipeline)
-        subdir(pipeline)
-        ipa_names += ipa_install_dir / ipa_name + '.so'
-        enabled_ipa_modules += pipeline
+    if not ipa_modules.contains(pipeline)
+        continue
+    endif
+
+    enabled_ipa_names += pipeline
+
+    # Allow multi-level directory structuring for the IPAs if needed.
+    pipeline = pipeline.split('/')[0]
+    if pipeline in subdirs
+        continue
     endif
+
+    subdir(pipeline)
+    subdirs += [pipeline]
+endforeach
+
+foreach ipa_name : ipa_names
+    enabled_ipa_modules += ipa_install_dir / ipa_name + '.so'
 endforeach
 
 if ipa_sign_module
@@ -54,5 +70,5 @@ if ipa_sign_module
     # install time, which invalidates the signatures.
     meson.add_install_script('ipa-sign-install.sh',
                              ipa_priv_key.full_path(),
-                             ipa_names)
+                             enabled_ipa_modules)
 endif
diff --git a/src/ipa/raspberrypi/meson.build b/src/ipa/raspberrypi/meson.build
index de78cbd80f9c..95437cbcc962 100644
--- a/src/ipa/raspberrypi/meson.build
+++ b/src/ipa/raspberrypi/meson.build
@@ -64,3 +64,5 @@ if ipa_sign_module
 endif
 
 subdir('data')
+
+ipa_names += ipa_name
diff --git a/src/ipa/rkisp1/meson.build b/src/ipa/rkisp1/meson.build
index ccb84b27525b..e813da53ae9b 100644
--- a/src/ipa/rkisp1/meson.build
+++ b/src/ipa/rkisp1/meson.build
@@ -29,3 +29,5 @@ if ipa_sign_module
         install : false,
         build_by_default : true)
 endif
+
+ipa_names += ipa_name

From patchwork Wed May 3 12:20:26 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18589
To: libcamera-devel@lists.libcamera.org
Date: Wed, 3 May 2023 13:20:26 +0100
Message-Id: <20230503122035.32026-5-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 04/13] pipeline: meson: Allow nested pipeline handler directory structures
From: Naushir Patuck

The current pipeline handler build files require a flat directory
structure for each pipeline handler. Modify the build files to remove this
restriction and allow a directory structure such as:

src/libcamera/pipeline/
|- raspberrypi
|  |- common
|  |- vc4
|- rkisp1
|- ipu3

where each subdir (e.g. raspberrypi/common, raspberrypi/vc4) has its own
meson.build file. Such a directory structure will be introduced for the
Raspberry Pi pipeline handler in a future commit.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
Reviewed-by: Laurent Pinchart
---
 src/libcamera/pipeline/meson.build | 9 +++++++++
 1 file changed, 9 insertions(+)

diff --git a/src/libcamera/pipeline/meson.build b/src/libcamera/pipeline/meson.build
index f14869f3a3c0..059c68bb964f 100644
--- a/src/libcamera/pipeline/meson.build
+++ b/src/libcamera/pipeline/meson.build
@@ -3,6 +3,15 @@
 # Location of pipeline specific configuration files
 pipeline_data_dir = libcamera_datadir / 'pipeline'
 
+# Allow multi-level directory structuring for the pipeline handlers if needed.
+subdirs = []
+
 foreach pipeline : pipelines
+    pipeline = pipeline.split('/')[0]
+    if pipeline in subdirs
+        continue
+    endif
+
     subdir(pipeline)
+    subdirs += [pipeline]
 endforeach

From patchwork Wed May 3 12:20:27 2023
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18590
From patchwork Wed May 3 12:20:27 2023

From: Naushir Patuck
To: libcamera-devel@lists.libcamera.org
Cc: Jacopo Mondi
Date: Wed, 3 May 2023 13:20:27 +0100
Message-Id: <20230503122035.32026-6-naush@raspberrypi.com>
In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com>
References: <20230503122035.32026-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 05/13] pipeline: raspberrypi: Refactor and move pipeline handler code

Split the Raspberry Pi pipeline handler code into common and VC4/BCM2835
specific file structures. The common code files now live in
src/libcamera/pipeline/rpi/common/ and the vc4 specific files in
src/libcamera/pipeline/rpi/vc4/.
To build the pipeline handler, the meson configuration option to select
the Raspberry Pi pipeline handler has now changed from "raspberrypi" to
"rpi/vc4":

meson setup build -Dpipelines=rpi/vc4

There are no functional changes in the pipeline handler code itself.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
Reviewed-by: Laurent Pinchart
---
 Documentation/environment_variables.rst | 2 +-
 Documentation/guides/introduction.rst | 2 +-
 Documentation/guides/pipeline-handler.rst | 2 +-
 include/libcamera/ipa/meson.build | 2 +-
 meson.build | 2 +-
 meson_options.txt | 2 +-
 .../{raspberrypi => rpi/common}/delayed_controls.cpp | 0
 .../{raspberrypi => rpi/common}/delayed_controls.h | 0
 src/libcamera/pipeline/rpi/common/meson.build | 6 ++++++
 .../{raspberrypi => rpi/common}/rpi_stream.cpp | 0
 .../{raspberrypi => rpi/common}/rpi_stream.h | 0
 src/libcamera/pipeline/rpi/meson.build | 12 ++++++++++++
 .../{raspberrypi => rpi/vc4}/data/example.yaml | 0
 .../{raspberrypi => rpi/vc4}/data/meson.build | 2 +-
 .../pipeline/{raspberrypi => rpi/vc4}/dma_heaps.cpp | 0
 .../pipeline/{raspberrypi => rpi/vc4}/dma_heaps.h | 0
 .../pipeline/{raspberrypi => rpi/vc4}/meson.build | 2 --
 .../{raspberrypi => rpi/vc4}/raspberrypi.cpp | 6 +++---
 18 files changed, 28 insertions(+), 12 deletions(-)
 rename src/libcamera/pipeline/{raspberrypi => rpi/common}/delayed_controls.cpp (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/common}/delayed_controls.h (100%)
 create mode 100644 src/libcamera/pipeline/rpi/common/meson.build
 rename src/libcamera/pipeline/{raspberrypi => rpi/common}/rpi_stream.cpp (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/common}/rpi_stream.h (100%)
 create mode 100644 src/libcamera/pipeline/rpi/meson.build
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/data/example.yaml (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/data/meson.build (63%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/dma_heaps.cpp (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/dma_heaps.h (100%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/meson.build (71%)
 rename src/libcamera/pipeline/{raspberrypi => rpi/vc4}/raspberrypi.cpp (99%)

diff --git a/Documentation/environment_variables.rst b/Documentation/environment_variables.rst
index ceeb251a2ac0..4bf38b877897 100644
--- a/Documentation/environment_variables.rst
+++ b/Documentation/environment_variables.rst
@@ -40,7 +40,7 @@ LIBCAMERA_IPA_MODULE_PATH
 LIBCAMERA_RPI_CONFIG_FILE
    Define a custom configuration file to use in the Raspberry Pi pipeline handler.
 
-   Example value: ``/usr/local/share/libcamera/pipeline/raspberrypi/minimal_mem.yaml``
+   Example value: ``/usr/local/share/libcamera/pipeline/rpi/vc4/minimal_mem.yaml``
 
 Further details
 ---------------
diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
index 2d1760c1866b..700ec2d33c30 100644
--- a/Documentation/guides/introduction.rst
+++ b/Documentation/guides/introduction.rst
@@ -288,7 +288,7 @@ with dedicated pipeline handlers:
    - Intel IPU3 (ipu3)
    - Rockchip RK3399 (rkisp1)
-   - RaspberryPi 3 and 4 (raspberrypi)
+   - RaspberryPi 3 and 4 (rpi/vc4)
 
 Furthermore, generic platform support is provided for the following:
diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
index 1acd1812cf37..57644534de61 100644
--- a/Documentation/guides/pipeline-handler.rst
+++ b/Documentation/guides/pipeline-handler.rst
@@ -183,7 +183,7 @@ to the libcamera build options in the top level ``meson_options.txt``.
     option('pipelines',
             type : 'array',
-            choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc', 'vivid'],
+            choices : ['ipu3', 'rkisp1', 'rpi/vc4', 'simple', 'uvcvideo', 'vimc', 'vivid'],
             description : 'Select which pipeline handlers to include')
diff --git a/include/libcamera/ipa/meson.build b/include/libcamera/ipa/meson.build
index c57b3a5e1570..6060c68f047d 100644
--- a/include/libcamera/ipa/meson.build
+++ b/include/libcamera/ipa/meson.build
@@ -64,7 +64,7 @@ libcamera_generated_ipa_headers += custom_target('core_ipa_serializer_h',
 pipeline_ipa_mojom_mapping = {
     'ipu3': 'ipu3.mojom',
     'rkisp1': 'rkisp1.mojom',
-    'raspberrypi': 'raspberrypi.mojom',
+    'rpi/vc4': 'raspberrypi.mojom',
     'vimc': 'vimc.mojom',
 }
diff --git a/meson.build b/meson.build
index 2d99029bf5b7..e1fd924307f7 100644
--- a/meson.build
+++ b/meson.build
@@ -194,8 +194,8 @@ arch_x86 = ['x86', 'x86_64']
 pipelines_support = {
     'imx8-isi': arch_arm,
     'ipu3': arch_x86,
-    'raspberrypi': arch_arm,
     'rkisp1': arch_arm,
+    'rpi/vc4': arch_arm,
     'simple': arch_arm,
     'uvcvideo': ['any'],
     'vimc': ['test'],
diff --git a/meson_options.txt b/meson_options.txt
index 78a78b7214d5..b4afb8e591a8 100644
--- a/meson_options.txt
+++ b/meson_options.txt
@@ -43,8 +43,8 @@ option('pipelines',
             'auto',
             'imx8-isi',
             'ipu3',
-            'raspberrypi',
             'rkisp1',
+            'rpi/vc4',
             'simple',
             'uvcvideo',
             'vimc'
diff --git a/src/libcamera/pipeline/raspberrypi/delayed_controls.cpp b/src/libcamera/pipeline/rpi/common/delayed_controls.cpp
similarity index 100%
rename from src/libcamera/pipeline/raspberrypi/delayed_controls.cpp
rename to src/libcamera/pipeline/rpi/common/delayed_controls.cpp
diff --git a/src/libcamera/pipeline/raspberrypi/delayed_controls.h b/src/libcamera/pipeline/rpi/common/delayed_controls.h
similarity index 100%
rename from src/libcamera/pipeline/raspberrypi/delayed_controls.h
rename to src/libcamera/pipeline/rpi/common/delayed_controls.h
diff --git a/src/libcamera/pipeline/rpi/common/meson.build b/src/libcamera/pipeline/rpi/common/meson.build
new file mode 100644
index 000000000000..2ad594cf8d1a
--- /dev/null
+++ b/src/libcamera/pipeline/rpi/common/meson.build
@@ -0,0 +1,6 @@
+# SPDX-License-Identifier: CC0-1.0
+
+libcamera_sources += files([
+    'delayed_controls.cpp',
+    'rpi_stream.cpp',
+])
diff --git a/src/libcamera/pipeline/raspberrypi/rpi_stream.cpp b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp
similarity index 100%
rename from src/libcamera/pipeline/raspberrypi/rpi_stream.cpp
rename to src/libcamera/pipeline/rpi/common/rpi_stream.cpp
diff --git a/src/libcamera/pipeline/raspberrypi/rpi_stream.h b/src/libcamera/pipeline/rpi/common/rpi_stream.h
similarity index 100%
rename from src/libcamera/pipeline/raspberrypi/rpi_stream.h
rename to src/libcamera/pipeline/rpi/common/rpi_stream.h
diff --git a/src/libcamera/pipeline/rpi/meson.build b/src/libcamera/pipeline/rpi/meson.build
new file mode 100644
index 000000000000..2391b6a9729e
--- /dev/null
+++ b/src/libcamera/pipeline/rpi/meson.build
@@ -0,0 +1,12 @@
+# SPDX-License-Identifier: CC0-1.0
+
+subdir('common')
+
+foreach pipeline : pipelines
+    pipeline = pipeline.split('/')
+    if pipeline.length() < 2 or pipeline[0] != 'rpi'
+        continue
+    endif
+
+    subdir(pipeline[1])
+endforeach
diff --git a/src/libcamera/pipeline/raspberrypi/data/example.yaml b/src/libcamera/pipeline/rpi/vc4/data/example.yaml
similarity index 100%
rename from src/libcamera/pipeline/raspberrypi/data/example.yaml
rename to src/libcamera/pipeline/rpi/vc4/data/example.yaml
diff --git a/src/libcamera/pipeline/raspberrypi/data/meson.build b/src/libcamera/pipeline/rpi/vc4/data/meson.build
similarity index 63%
rename from src/libcamera/pipeline/raspberrypi/data/meson.build
rename to src/libcamera/pipeline/rpi/vc4/data/meson.build
index 1c70433bbcbc..cca5e3885a4e 100644
--- a/src/libcamera/pipeline/raspberrypi/data/meson.build
+++ b/src/libcamera/pipeline/rpi/vc4/data/meson.build
@@ -5,4 +5,4 @@ conf_files = files([
 ])
 
 install_data(conf_files,
-             install_dir : pipeline_data_dir / 'raspberrypi')
+             install_dir : pipeline_data_dir / 'rpi' / 'vc4')
diff --git a/src/libcamera/pipeline/raspberrypi/dma_heaps.cpp b/src/libcamera/pipeline/rpi/vc4/dma_heaps.cpp
similarity index 100%
rename from src/libcamera/pipeline/raspberrypi/dma_heaps.cpp
rename to src/libcamera/pipeline/rpi/vc4/dma_heaps.cpp
diff --git a/src/libcamera/pipeline/raspberrypi/dma_heaps.h b/src/libcamera/pipeline/rpi/vc4/dma_heaps.h
similarity index 100%
rename from src/libcamera/pipeline/raspberrypi/dma_heaps.h
rename to src/libcamera/pipeline/rpi/vc4/dma_heaps.h
diff --git a/src/libcamera/pipeline/raspberrypi/meson.build b/src/libcamera/pipeline/rpi/vc4/meson.build
similarity index 71%
rename from src/libcamera/pipeline/raspberrypi/meson.build
rename to src/libcamera/pipeline/rpi/vc4/meson.build
index 59e8fb18c555..228823f30922 100644
--- a/src/libcamera/pipeline/raspberrypi/meson.build
+++ b/src/libcamera/pipeline/rpi/vc4/meson.build
@@ -1,10 +1,8 @@
 # SPDX-License-Identifier: CC0-1.0
 
 libcamera_sources += files([
-    'delayed_controls.cpp',
     'dma_heaps.cpp',
     'raspberrypi.cpp',
-    'rpi_stream.cpp',
 ])
 
 subdir('data')
diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp
similarity index 99%
rename from src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
rename to src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp
index 0060044143cc..af464d153f28 100644
--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
+++ b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp
@@ -2,7 +2,7 @@
 /*
  * Copyright (C) 2019-2021, Raspberry Pi Ltd
  *
- * raspberrypi.cpp - Pipeline handler for Raspberry Pi devices
+ * raspberrypi.cpp - Pipeline handler for VC4-based Raspberry Pi devices
  */
 
 #include
 #include
@@ -43,9 +43,9 @@
 #include "libcamera/internal/v4l2_videodevice.h"
 #include "libcamera/internal/yaml_parser.h"
 
-#include "delayed_controls.h"
+#include "../common/delayed_controls.h"
+#include "../common/rpi_stream.h"
 #include "dma_heaps.h"
-#include "rpi_stream.h"
 
 using namespace std::chrono_literals;
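The new src/libcamera/pipeline/rpi/meson.build introduced in this patch
dispatches on the second path component of every 'rpi/…' pipeline option. Its
selection logic can be sketched in Python as follows — an illustration only,
not part of the patch ('rpi/other' below is a hypothetical option value):

```python
def rpi_subdirs(pipelines):
    """Select the subdirectory to descend into for each 'rpi/<x>'
    pipeline option, mirroring the split('/')/continue logic in
    rpi/meson.build; unrelated or one-component options are skipped."""
    selected = []
    for pipeline in pipelines:
        parts = pipeline.split('/')
        if len(parts) < 2 or parts[0] != 'rpi':
            continue
        selected.append(parts[1])
    return selected

# 'rpi/vc4' selects subdir('vc4'); 'rkisp1' and 'ipu3' are ignored here
# because they are handled by their own top-level directories.
print(rpi_subdirs(['rpi/vc4', 'rkisp1', 'ipu3']))  # → ['vc4']
```

Note the guard also skips a bare 'rpi' option, so only fully-qualified
'rpi/<subdir>' values ever reach subdir().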
From patchwork Wed May 3 12:20:28 2023

From: Naushir Patuck
To: libcamera-devel@lists.libcamera.org
Cc: Jacopo Mondi
Date: Wed, 3 May 2023 13:20:28 +0100
Message-Id: <20230503122035.32026-7-naush@raspberrypi.com>
In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com>
References: <20230503122035.32026-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 06/13] ipa: raspberrypi: Refactor and move IPA code

Split the Raspberry Pi IPA code into common and VC4/BCM2835 specific
file structures. The common code files now live in
src/ipa/rpi/{cam_helper,controller}/ and the vc4 specific files in
src/ipa/rpi/vc4/.

To build the IPA, the meson configuration option to select the Raspberry
Pi IPA has now changed from "raspberrypi" to "rpi/vc4":

meson setup build -Dipas=rpi/vc4

With this change, the camera tuning files are now installed under
share/libcamera/ipa/rpi/vc4/

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
Reviewed-by: Laurent Pinchart
---
 Documentation/environment_variables.rst | 2 +-
 meson_options.txt | 2 +-
 src/ipa/raspberrypi/meson.build | 68 -------------------
 src/ipa/{raspberrypi => rpi}/README.md | 0
 .../cam_helper}/cam_helper.cpp | 0
 .../cam_helper}/cam_helper.h | 2 +-
 .../cam_helper}/cam_helper_imx219.cpp | 0
 .../cam_helper}/cam_helper_imx290.cpp | 0
 .../cam_helper}/cam_helper_imx296.cpp | 0
 .../cam_helper}/cam_helper_imx477.cpp | 0
 .../cam_helper}/cam_helper_imx519.cpp | 0
 .../cam_helper}/cam_helper_imx708.cpp | 0
 .../cam_helper}/cam_helper_ov5647.cpp | 0
 .../cam_helper}/cam_helper_ov9281.cpp | 0
 .../cam_helper}/md_parser.h | 0
 .../cam_helper}/md_parser_smia.cpp | 0
 src/ipa/rpi/cam_helper/meson.build | 26 +++++++
 .../controller/af_algorithm.h | 0
 .../controller/af_status.h | 0
 .../controller/agc_algorithm.h | 0
 .../controller/agc_status.h | 0
 .../controller/algorithm.cpp | 0
 .../controller/algorithm.h | 0
 .../controller/alsc_status.h | 0
 .../controller/awb_algorithm.h | 0
 .../controller/awb_status.h | 0
 .../controller/black_level_status.h | 0
 .../controller/camera_mode.h | 0
 .../controller/ccm_algorithm.h | 0
 .../controller/ccm_status.h | 0
 .../controller/contrast_algorithm.h | 0
 .../controller/contrast_status.h | 0
 .../controller/controller.cpp | 0
 .../controller/controller.h | 0
 .../controller/denoise_algorithm.h | 0
 .../controller/denoise_status.h | 0
 .../controller/device_status.cpp | 0
 .../controller/device_status.h | 0
 .../controller/dpc_status.h | 0
 .../controller/geq_status.h | 0
 .../controller/histogram.cpp | 0
 .../controller/histogram.h | 0
 .../controller/lux_status.h | 0
 src/ipa/rpi/controller/meson.build | 29 ++++++++
 .../controller/metadata.h | 0
 .../controller/noise_status.h | 0
 .../controller/pdaf_data.h | 0
 .../{raspberrypi => rpi}/controller/pwl.cpp | 0
 src/ipa/{raspberrypi => rpi}/controller/pwl.h | 0
 .../controller/region_stats.h | 0
 .../controller/rpi/af.cpp | 0
 .../{raspberrypi => rpi}/controller/rpi/af.h | 0
 .../controller/rpi/agc.cpp | 0
 .../{raspberrypi => rpi}/controller/rpi/agc.h | 0
 .../controller/rpi/alsc.cpp | 0
 .../controller/rpi/alsc.h | 0
 .../controller/rpi/awb.cpp | 0
 .../{raspberrypi => rpi}/controller/rpi/awb.h | 0
 .../controller/rpi/black_level.cpp | 0
 .../controller/rpi/black_level.h | 0
 .../controller/rpi/ccm.cpp | 0
 .../{raspberrypi => rpi}/controller/rpi/ccm.h | 0
 .../controller/rpi/contrast.cpp | 0
 .../controller/rpi/contrast.h | 0
 .../controller/rpi/dpc.cpp | 0
 .../{raspberrypi => rpi}/controller/rpi/dpc.h | 0
 .../controller/rpi/focus.h | 0
 .../controller/rpi/geq.cpp | 0
 .../{raspberrypi => rpi}/controller/rpi/geq.h | 0
 .../controller/rpi/lux.cpp | 0
 .../{raspberrypi => rpi}/controller/rpi/lux.h | 0
 .../controller/rpi/noise.cpp | 0
 .../controller/rpi/noise.h | 0
 .../controller/rpi/sdn.cpp | 0
 .../{raspberrypi => rpi}/controller/rpi/sdn.h | 0
 .../controller/rpi/sharpen.cpp | 0
 .../controller/rpi/sharpen.h | 0
 .../controller/sharpen_algorithm.h | 0
 .../controller/sharpen_status.h | 0
 .../controller}/statistics.h | 0
 src/ipa/rpi/meson.build | 13 ++++
 .../{raspberrypi => rpi/vc4}/data/imx219.json | 0
 .../vc4}/data/imx219_noir.json | 0
 .../{raspberrypi => rpi/vc4}/data/imx290.json | 0
 .../{raspberrypi => rpi/vc4}/data/imx296.json | 0
 .../vc4}/data/imx296_mono.json | 0
 .../{raspberrypi => rpi/vc4}/data/imx378.json | 0
 .../{raspberrypi => rpi/vc4}/data/imx477.json | 0
 .../vc4}/data/imx477_noir.json | 0
 .../vc4}/data/imx477_scientific.json | 0
 .../vc4}/data/imx477_v1.json | 0
 .../{raspberrypi => rpi/vc4}/data/imx519.json | 0
 .../{raspberrypi => rpi/vc4}/data/imx708.json | 0
 .../vc4}/data/imx708_noir.json | 0
 .../vc4}/data/imx708_wide.json | 0
 .../vc4}/data/imx708_wide_noir.json | 0
 .../{raspberrypi => rpi/vc4}/data/meson.build | 2 +-
 .../{raspberrypi => rpi/vc4}/data/ov5647.json | 0
 .../vc4}/data/ov5647_noir.json | 0
 .../vc4}/data/ov9281_mono.json | 0
 .../vc4}/data/se327m12.json | 0
 .../vc4}/data/uncalibrated.json | 0
 src/ipa/rpi/vc4/meson.build | 48 +++++++++++++
 .../{raspberrypi => rpi/vc4}/raspberrypi.cpp | 48 ++++++------
 104 files changed, 144 insertions(+), 96 deletions(-)
 delete mode 100644 src/ipa/raspberrypi/meson.build
 rename src/ipa/{raspberrypi => rpi}/README.md (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper.h (99%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx219.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx290.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx296.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx477.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx519.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_imx708.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_ov5647.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/cam_helper_ov9281.cpp (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/md_parser.h (100%)
 rename src/ipa/{raspberrypi => rpi/cam_helper}/md_parser_smia.cpp (100%)
 create mode 100644 src/ipa/rpi/cam_helper/meson.build
 rename src/ipa/{raspberrypi => rpi}/controller/af_algorithm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/af_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/agc_algorithm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/agc_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/algorithm.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/algorithm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/alsc_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/awb_algorithm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/awb_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/black_level_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/camera_mode.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/ccm_algorithm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/ccm_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/contrast_algorithm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/contrast_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/controller.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/controller.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/denoise_algorithm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/denoise_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/device_status.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/device_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/dpc_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/geq_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/histogram.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/histogram.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/lux_status.h (100%)
 create mode 100644 src/ipa/rpi/controller/meson.build
 rename src/ipa/{raspberrypi => rpi}/controller/metadata.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/noise_status.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/pdaf_data.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/pwl.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/pwl.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/region_stats.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/af.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/af.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/agc.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/agc.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/alsc.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/alsc.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/awb.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/awb.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/black_level.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/black_level.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/ccm.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/ccm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/contrast.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/contrast.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/dpc.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/dpc.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/focus.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/geq.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/geq.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/lux.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/lux.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/noise.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/noise.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/sdn.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/sdn.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/sharpen.cpp (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/rpi/sharpen.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/sharpen_algorithm.h (100%)
 rename src/ipa/{raspberrypi => rpi}/controller/sharpen_status.h (100%)
 rename src/ipa/{raspberrypi => rpi/controller}/statistics.h (100%)
 create mode 100644 src/ipa/rpi/meson.build
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx219.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx219_noir.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx290.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx296.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx296_mono.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx378.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx477.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx477_noir.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx477_scientific.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx477_v1.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx519.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx708.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx708_noir.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx708_wide.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/imx708_wide_noir.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/meson.build (89%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/ov5647.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/ov5647_noir.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/ov9281_mono.json (100%)
 rename src/ipa/{raspberrypi => rpi/vc4}/data/se327m12.json
(100%) rename src/ipa/{raspberrypi => rpi/vc4}/data/uncalibrated.json (100%) create mode 100644 src/ipa/rpi/vc4/meson.build rename src/ipa/{raspberrypi => rpi/vc4}/raspberrypi.cpp (98%) diff --git a/Documentation/environment_variables.rst b/Documentation/environment_variables.rst index 4bf38b877897..a9b230bcd93e 100644 --- a/Documentation/environment_variables.rst +++ b/Documentation/environment_variables.rst @@ -143,7 +143,7 @@ contain tuning parameters for the algorithms, in JSON format. The ``LIBCAMERA_IPA_CONFIG_PATH`` variable can be used to specify custom storage locations to search for those configuration files. -`Examples `__ +`Examples `__ IPA module ~~~~~~~~~~ diff --git a/meson_options.txt b/meson_options.txt index b4afb8e591a8..c8cd53b49ba1 100644 --- a/meson_options.txt +++ b/meson_options.txt @@ -27,7 +27,7 @@ option('gstreamer', option('ipas', type : 'array', - choices : ['ipu3', 'raspberrypi', 'rkisp1', 'vimc'], + choices : ['ipu3', 'rkisp1', 'rpi/vc4', 'vimc'], description : 'Select which IPA modules to build') option('lc-compliance', diff --git a/src/ipa/raspberrypi/meson.build b/src/ipa/raspberrypi/meson.build deleted file mode 100644 index 95437cbcc962..000000000000 --- a/src/ipa/raspberrypi/meson.build +++ /dev/null @@ -1,68 +0,0 @@ -# SPDX-License-Identifier: CC0-1.0 - -ipa_name = 'ipa_rpi' - -rpi_ipa_deps = [ - libcamera_private, - libatomic, -] - -rpi_ipa_includes = [ - ipa_includes, - libipa_includes, - include_directories('controller') -] - -rpi_ipa_sources = files([ - 'raspberrypi.cpp', - 'md_parser_smia.cpp', - 'cam_helper.cpp', - 'cam_helper_ov5647.cpp', - 'cam_helper_imx219.cpp', - 'cam_helper_imx290.cpp', - 'cam_helper_imx296.cpp', - 'cam_helper_imx477.cpp', - 'cam_helper_imx519.cpp', - 'cam_helper_imx708.cpp', - 'cam_helper_ov9281.cpp', - 'controller/controller.cpp', - 'controller/histogram.cpp', - 'controller/algorithm.cpp', - 'controller/rpi/af.cpp', - 'controller/rpi/alsc.cpp', - 'controller/rpi/awb.cpp', - 
'controller/rpi/sharpen.cpp', - 'controller/rpi/black_level.cpp', - 'controller/rpi/geq.cpp', - 'controller/rpi/noise.cpp', - 'controller/rpi/lux.cpp', - 'controller/rpi/agc.cpp', - 'controller/rpi/dpc.cpp', - 'controller/rpi/ccm.cpp', - 'controller/rpi/contrast.cpp', - 'controller/rpi/sdn.cpp', - 'controller/pwl.cpp', - 'controller/device_status.cpp', -]) - -mod = shared_module(ipa_name, - [rpi_ipa_sources, libcamera_generated_ipa_headers], - name_prefix : '', - include_directories : rpi_ipa_includes, - dependencies : rpi_ipa_deps, - link_with : libipa, - install : true, - install_dir : ipa_install_dir) - -if ipa_sign_module - custom_target(ipa_name + '.so.sign', - input : mod, - output : ipa_name + '.so.sign', - command : [ipa_sign, ipa_priv_key, '@INPUT@', '@OUTPUT@'], - install : false, - build_by_default : true) -endif - -subdir('data') - -ipa_names += ipa_name diff --git a/src/ipa/raspberrypi/README.md b/src/ipa/rpi/README.md similarity index 100% rename from src/ipa/raspberrypi/README.md rename to src/ipa/rpi/README.md diff --git a/src/ipa/raspberrypi/cam_helper.cpp b/src/ipa/rpi/cam_helper/cam_helper.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper.cpp rename to src/ipa/rpi/cam_helper/cam_helper.cpp diff --git a/src/ipa/raspberrypi/cam_helper.h b/src/ipa/rpi/cam_helper/cam_helper.h similarity index 99% rename from src/ipa/raspberrypi/cam_helper.h rename to src/ipa/rpi/cam_helper/cam_helper.h index b3f8c9803094..58a4b202d5a8 100644 --- a/src/ipa/raspberrypi/cam_helper.h +++ b/src/ipa/rpi/cam_helper/cam_helper.h @@ -13,7 +13,7 @@ #include #include -#include "camera_mode.h" +#include "controller/camera_mode.h" #include "controller/controller.h" #include "controller/metadata.h" #include "md_parser.h" diff --git a/src/ipa/raspberrypi/cam_helper_imx219.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx219.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx219.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx219.cpp diff --git 
a/src/ipa/raspberrypi/cam_helper_imx290.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx290.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx290.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx290.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx296.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx296.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx296.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx296.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx477.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx477.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx477.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx519.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx519.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx519.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx519.cpp diff --git a/src/ipa/raspberrypi/cam_helper_imx708.cpp b/src/ipa/rpi/cam_helper/cam_helper_imx708.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_imx708.cpp rename to src/ipa/rpi/cam_helper/cam_helper_imx708.cpp diff --git a/src/ipa/raspberrypi/cam_helper_ov5647.cpp b/src/ipa/rpi/cam_helper/cam_helper_ov5647.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_ov5647.cpp rename to src/ipa/rpi/cam_helper/cam_helper_ov5647.cpp diff --git a/src/ipa/raspberrypi/cam_helper_ov9281.cpp b/src/ipa/rpi/cam_helper/cam_helper_ov9281.cpp similarity index 100% rename from src/ipa/raspberrypi/cam_helper_ov9281.cpp rename to src/ipa/rpi/cam_helper/cam_helper_ov9281.cpp diff --git a/src/ipa/raspberrypi/md_parser.h b/src/ipa/rpi/cam_helper/md_parser.h similarity index 100% rename from src/ipa/raspberrypi/md_parser.h rename to src/ipa/rpi/cam_helper/md_parser.h diff --git a/src/ipa/raspberrypi/md_parser_smia.cpp b/src/ipa/rpi/cam_helper/md_parser_smia.cpp similarity index 100% rename from src/ipa/raspberrypi/md_parser_smia.cpp rename to 
src/ipa/rpi/cam_helper/md_parser_smia.cpp
diff --git a/src/ipa/rpi/cam_helper/meson.build b/src/ipa/rpi/cam_helper/meson.build
new file mode 100644
index 000000000000..bdf2db8eb742
--- /dev/null
+++ b/src/ipa/rpi/cam_helper/meson.build
@@ -0,0 +1,26 @@
+# SPDX-License-Identifier: CC0-1.0
+
+rpi_ipa_cam_helper_sources = files([
+    'cam_helper.cpp',
+    'cam_helper_ov5647.cpp',
+    'cam_helper_imx219.cpp',
+    'cam_helper_imx290.cpp',
+    'cam_helper_imx296.cpp',
+    'cam_helper_imx477.cpp',
+    'cam_helper_imx519.cpp',
+    'cam_helper_imx708.cpp',
+    'cam_helper_ov9281.cpp',
+    'md_parser_smia.cpp',
+])
+
+rpi_ipa_cam_helper_includes = [
+    include_directories('..'),
+]
+
+rpi_ipa_cam_helper_deps = [
+    libcamera_private,
+]
+
+rpi_ipa_cam_helper_lib = static_library('rpi_ipa_cam_helper', rpi_ipa_cam_helper_sources,
+                                        include_directories : rpi_ipa_cam_helper_includes,
+                                        dependencies : rpi_ipa_cam_helper_deps)
diff --git a/src/ipa/raspberrypi/controller/af_algorithm.h b/src/ipa/rpi/controller/af_algorithm.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/af_algorithm.h
rename to src/ipa/rpi/controller/af_algorithm.h
diff --git a/src/ipa/raspberrypi/controller/af_status.h b/src/ipa/rpi/controller/af_status.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/af_status.h
rename to src/ipa/rpi/controller/af_status.h
diff --git a/src/ipa/raspberrypi/controller/agc_algorithm.h b/src/ipa/rpi/controller/agc_algorithm.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/agc_algorithm.h
rename to src/ipa/rpi/controller/agc_algorithm.h
diff --git a/src/ipa/raspberrypi/controller/agc_status.h b/src/ipa/rpi/controller/agc_status.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/agc_status.h
rename to src/ipa/rpi/controller/agc_status.h
diff --git a/src/ipa/raspberrypi/controller/algorithm.cpp b/src/ipa/rpi/controller/algorithm.cpp
similarity index 100%
rename from src/ipa/raspberrypi/controller/algorithm.cpp
rename to
src/ipa/rpi/controller/algorithm.cpp diff --git a/src/ipa/raspberrypi/controller/algorithm.h b/src/ipa/rpi/controller/algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/algorithm.h rename to src/ipa/rpi/controller/algorithm.h diff --git a/src/ipa/raspberrypi/controller/alsc_status.h b/src/ipa/rpi/controller/alsc_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/alsc_status.h rename to src/ipa/rpi/controller/alsc_status.h diff --git a/src/ipa/raspberrypi/controller/awb_algorithm.h b/src/ipa/rpi/controller/awb_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/awb_algorithm.h rename to src/ipa/rpi/controller/awb_algorithm.h diff --git a/src/ipa/raspberrypi/controller/awb_status.h b/src/ipa/rpi/controller/awb_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/awb_status.h rename to src/ipa/rpi/controller/awb_status.h diff --git a/src/ipa/raspberrypi/controller/black_level_status.h b/src/ipa/rpi/controller/black_level_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/black_level_status.h rename to src/ipa/rpi/controller/black_level_status.h diff --git a/src/ipa/raspberrypi/controller/camera_mode.h b/src/ipa/rpi/controller/camera_mode.h similarity index 100% rename from src/ipa/raspberrypi/controller/camera_mode.h rename to src/ipa/rpi/controller/camera_mode.h diff --git a/src/ipa/raspberrypi/controller/ccm_algorithm.h b/src/ipa/rpi/controller/ccm_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/ccm_algorithm.h rename to src/ipa/rpi/controller/ccm_algorithm.h diff --git a/src/ipa/raspberrypi/controller/ccm_status.h b/src/ipa/rpi/controller/ccm_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/ccm_status.h rename to src/ipa/rpi/controller/ccm_status.h diff --git a/src/ipa/raspberrypi/controller/contrast_algorithm.h b/src/ipa/rpi/controller/contrast_algorithm.h similarity index 100% rename from 
src/ipa/raspberrypi/controller/contrast_algorithm.h rename to src/ipa/rpi/controller/contrast_algorithm.h diff --git a/src/ipa/raspberrypi/controller/contrast_status.h b/src/ipa/rpi/controller/contrast_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/contrast_status.h rename to src/ipa/rpi/controller/contrast_status.h diff --git a/src/ipa/raspberrypi/controller/controller.cpp b/src/ipa/rpi/controller/controller.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/controller.cpp rename to src/ipa/rpi/controller/controller.cpp diff --git a/src/ipa/raspberrypi/controller/controller.h b/src/ipa/rpi/controller/controller.h similarity index 100% rename from src/ipa/raspberrypi/controller/controller.h rename to src/ipa/rpi/controller/controller.h diff --git a/src/ipa/raspberrypi/controller/denoise_algorithm.h b/src/ipa/rpi/controller/denoise_algorithm.h similarity index 100% rename from src/ipa/raspberrypi/controller/denoise_algorithm.h rename to src/ipa/rpi/controller/denoise_algorithm.h diff --git a/src/ipa/raspberrypi/controller/denoise_status.h b/src/ipa/rpi/controller/denoise_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/denoise_status.h rename to src/ipa/rpi/controller/denoise_status.h diff --git a/src/ipa/raspberrypi/controller/device_status.cpp b/src/ipa/rpi/controller/device_status.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/device_status.cpp rename to src/ipa/rpi/controller/device_status.cpp diff --git a/src/ipa/raspberrypi/controller/device_status.h b/src/ipa/rpi/controller/device_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/device_status.h rename to src/ipa/rpi/controller/device_status.h diff --git a/src/ipa/raspberrypi/controller/dpc_status.h b/src/ipa/rpi/controller/dpc_status.h similarity index 100% rename from src/ipa/raspberrypi/controller/dpc_status.h rename to src/ipa/rpi/controller/dpc_status.h diff --git 
a/src/ipa/raspberrypi/controller/geq_status.h b/src/ipa/rpi/controller/geq_status.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/geq_status.h
rename to src/ipa/rpi/controller/geq_status.h
diff --git a/src/ipa/raspberrypi/controller/histogram.cpp b/src/ipa/rpi/controller/histogram.cpp
similarity index 100%
rename from src/ipa/raspberrypi/controller/histogram.cpp
rename to src/ipa/rpi/controller/histogram.cpp
diff --git a/src/ipa/raspberrypi/controller/histogram.h b/src/ipa/rpi/controller/histogram.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/histogram.h
rename to src/ipa/rpi/controller/histogram.h
diff --git a/src/ipa/raspberrypi/controller/lux_status.h b/src/ipa/rpi/controller/lux_status.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/lux_status.h
rename to src/ipa/rpi/controller/lux_status.h
diff --git a/src/ipa/rpi/controller/meson.build b/src/ipa/rpi/controller/meson.build
new file mode 100644
index 000000000000..feb0334e8bb4
--- /dev/null
+++ b/src/ipa/rpi/controller/meson.build
@@ -0,0 +1,29 @@
+# SPDX-License-Identifier: CC0-1.0
+
+rpi_ipa_controller_sources = files([
+    'algorithm.cpp',
+    'controller.cpp',
+    'device_status.cpp',
+    'histogram.cpp',
+    'pwl.cpp',
+    'rpi/af.cpp',
+    'rpi/agc.cpp',
+    'rpi/alsc.cpp',
+    'rpi/awb.cpp',
+    'rpi/black_level.cpp',
+    'rpi/ccm.cpp',
+    'rpi/contrast.cpp',
+    'rpi/dpc.cpp',
+    'rpi/geq.cpp',
+    'rpi/lux.cpp',
+    'rpi/noise.cpp',
+    'rpi/sdn.cpp',
+    'rpi/sharpen.cpp',
+])
+
+rpi_ipa_controller_deps = [
+    libcamera_private,
+]
+
+rpi_ipa_controller_lib = static_library('rpi_ipa_controller', rpi_ipa_controller_sources,
+                                        dependencies : rpi_ipa_controller_deps)
diff --git a/src/ipa/raspberrypi/controller/metadata.h b/src/ipa/rpi/controller/metadata.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/metadata.h
rename to src/ipa/rpi/controller/metadata.h
diff --git a/src/ipa/raspberrypi/controller/noise_status.h b/src/ipa/rpi/controller/noise_status.h
similarity index 100% rename from src/ipa/raspberrypi/controller/noise_status.h rename to src/ipa/rpi/controller/noise_status.h diff --git a/src/ipa/raspberrypi/controller/pdaf_data.h b/src/ipa/rpi/controller/pdaf_data.h similarity index 100% rename from src/ipa/raspberrypi/controller/pdaf_data.h rename to src/ipa/rpi/controller/pdaf_data.h diff --git a/src/ipa/raspberrypi/controller/pwl.cpp b/src/ipa/rpi/controller/pwl.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/pwl.cpp rename to src/ipa/rpi/controller/pwl.cpp diff --git a/src/ipa/raspberrypi/controller/pwl.h b/src/ipa/rpi/controller/pwl.h similarity index 100% rename from src/ipa/raspberrypi/controller/pwl.h rename to src/ipa/rpi/controller/pwl.h diff --git a/src/ipa/raspberrypi/controller/region_stats.h b/src/ipa/rpi/controller/region_stats.h similarity index 100% rename from src/ipa/raspberrypi/controller/region_stats.h rename to src/ipa/rpi/controller/region_stats.h diff --git a/src/ipa/raspberrypi/controller/rpi/af.cpp b/src/ipa/rpi/controller/rpi/af.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/af.cpp rename to src/ipa/rpi/controller/rpi/af.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/af.h b/src/ipa/rpi/controller/rpi/af.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/af.h rename to src/ipa/rpi/controller/rpi/af.h diff --git a/src/ipa/raspberrypi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/agc.cpp rename to src/ipa/rpi/controller/rpi/agc.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/agc.h rename to src/ipa/rpi/controller/rpi/agc.h diff --git a/src/ipa/raspberrypi/controller/rpi/alsc.cpp b/src/ipa/rpi/controller/rpi/alsc.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/alsc.cpp rename to 
src/ipa/rpi/controller/rpi/alsc.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/alsc.h b/src/ipa/rpi/controller/rpi/alsc.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/alsc.h rename to src/ipa/rpi/controller/rpi/alsc.h diff --git a/src/ipa/raspberrypi/controller/rpi/awb.cpp b/src/ipa/rpi/controller/rpi/awb.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/awb.cpp rename to src/ipa/rpi/controller/rpi/awb.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/awb.h b/src/ipa/rpi/controller/rpi/awb.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/awb.h rename to src/ipa/rpi/controller/rpi/awb.h diff --git a/src/ipa/raspberrypi/controller/rpi/black_level.cpp b/src/ipa/rpi/controller/rpi/black_level.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/black_level.cpp rename to src/ipa/rpi/controller/rpi/black_level.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/black_level.h b/src/ipa/rpi/controller/rpi/black_level.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/black_level.h rename to src/ipa/rpi/controller/rpi/black_level.h diff --git a/src/ipa/raspberrypi/controller/rpi/ccm.cpp b/src/ipa/rpi/controller/rpi/ccm.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/ccm.cpp rename to src/ipa/rpi/controller/rpi/ccm.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/ccm.h b/src/ipa/rpi/controller/rpi/ccm.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/ccm.h rename to src/ipa/rpi/controller/rpi/ccm.h diff --git a/src/ipa/raspberrypi/controller/rpi/contrast.cpp b/src/ipa/rpi/controller/rpi/contrast.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/contrast.cpp rename to src/ipa/rpi/controller/rpi/contrast.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/contrast.h b/src/ipa/rpi/controller/rpi/contrast.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/contrast.h rename 
to src/ipa/rpi/controller/rpi/contrast.h diff --git a/src/ipa/raspberrypi/controller/rpi/dpc.cpp b/src/ipa/rpi/controller/rpi/dpc.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/dpc.cpp rename to src/ipa/rpi/controller/rpi/dpc.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/dpc.h b/src/ipa/rpi/controller/rpi/dpc.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/dpc.h rename to src/ipa/rpi/controller/rpi/dpc.h diff --git a/src/ipa/raspberrypi/controller/rpi/focus.h b/src/ipa/rpi/controller/rpi/focus.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/focus.h rename to src/ipa/rpi/controller/rpi/focus.h diff --git a/src/ipa/raspberrypi/controller/rpi/geq.cpp b/src/ipa/rpi/controller/rpi/geq.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/geq.cpp rename to src/ipa/rpi/controller/rpi/geq.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/geq.h b/src/ipa/rpi/controller/rpi/geq.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/geq.h rename to src/ipa/rpi/controller/rpi/geq.h diff --git a/src/ipa/raspberrypi/controller/rpi/lux.cpp b/src/ipa/rpi/controller/rpi/lux.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/lux.cpp rename to src/ipa/rpi/controller/rpi/lux.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/lux.h b/src/ipa/rpi/controller/rpi/lux.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/lux.h rename to src/ipa/rpi/controller/rpi/lux.h diff --git a/src/ipa/raspberrypi/controller/rpi/noise.cpp b/src/ipa/rpi/controller/rpi/noise.cpp similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/noise.cpp rename to src/ipa/rpi/controller/rpi/noise.cpp diff --git a/src/ipa/raspberrypi/controller/rpi/noise.h b/src/ipa/rpi/controller/rpi/noise.h similarity index 100% rename from src/ipa/raspberrypi/controller/rpi/noise.h rename to src/ipa/rpi/controller/rpi/noise.h diff --git 
a/src/ipa/raspberrypi/controller/rpi/sdn.cpp b/src/ipa/rpi/controller/rpi/sdn.cpp
similarity index 100%
rename from src/ipa/raspberrypi/controller/rpi/sdn.cpp
rename to src/ipa/rpi/controller/rpi/sdn.cpp
diff --git a/src/ipa/raspberrypi/controller/rpi/sdn.h b/src/ipa/rpi/controller/rpi/sdn.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/rpi/sdn.h
rename to src/ipa/rpi/controller/rpi/sdn.h
diff --git a/src/ipa/raspberrypi/controller/rpi/sharpen.cpp b/src/ipa/rpi/controller/rpi/sharpen.cpp
similarity index 100%
rename from src/ipa/raspberrypi/controller/rpi/sharpen.cpp
rename to src/ipa/rpi/controller/rpi/sharpen.cpp
diff --git a/src/ipa/raspberrypi/controller/rpi/sharpen.h b/src/ipa/rpi/controller/rpi/sharpen.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/rpi/sharpen.h
rename to src/ipa/rpi/controller/rpi/sharpen.h
diff --git a/src/ipa/raspberrypi/controller/sharpen_algorithm.h b/src/ipa/rpi/controller/sharpen_algorithm.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/sharpen_algorithm.h
rename to src/ipa/rpi/controller/sharpen_algorithm.h
diff --git a/src/ipa/raspberrypi/controller/sharpen_status.h b/src/ipa/rpi/controller/sharpen_status.h
similarity index 100%
rename from src/ipa/raspberrypi/controller/sharpen_status.h
rename to src/ipa/rpi/controller/sharpen_status.h
diff --git a/src/ipa/raspberrypi/statistics.h b/src/ipa/rpi/controller/statistics.h
similarity index 100%
rename from src/ipa/raspberrypi/statistics.h
rename to src/ipa/rpi/controller/statistics.h
diff --git a/src/ipa/rpi/meson.build b/src/ipa/rpi/meson.build
new file mode 100644
index 000000000000..7d7a61f7cea7
--- /dev/null
+++ b/src/ipa/rpi/meson.build
@@ -0,0 +1,13 @@
+# SPDX-License-Identifier: CC0-1.0
+
+subdir('cam_helper')
+subdir('controller')
+
+foreach pipeline : pipelines
+    pipeline = pipeline.split('/')
+    if pipeline.length() < 2 or pipeline[0] != 'rpi'
+        continue
+    endif
+
+    subdir(pipeline[1])
+endforeach
diff --git
a/src/ipa/raspberrypi/data/imx219.json b/src/ipa/rpi/vc4/data/imx219.json similarity index 100% rename from src/ipa/raspberrypi/data/imx219.json rename to src/ipa/rpi/vc4/data/imx219.json diff --git a/src/ipa/raspberrypi/data/imx219_noir.json b/src/ipa/rpi/vc4/data/imx219_noir.json similarity index 100% rename from src/ipa/raspberrypi/data/imx219_noir.json rename to src/ipa/rpi/vc4/data/imx219_noir.json diff --git a/src/ipa/raspberrypi/data/imx290.json b/src/ipa/rpi/vc4/data/imx290.json similarity index 100% rename from src/ipa/raspberrypi/data/imx290.json rename to src/ipa/rpi/vc4/data/imx290.json diff --git a/src/ipa/raspberrypi/data/imx296.json b/src/ipa/rpi/vc4/data/imx296.json similarity index 100% rename from src/ipa/raspberrypi/data/imx296.json rename to src/ipa/rpi/vc4/data/imx296.json diff --git a/src/ipa/raspberrypi/data/imx296_mono.json b/src/ipa/rpi/vc4/data/imx296_mono.json similarity index 100% rename from src/ipa/raspberrypi/data/imx296_mono.json rename to src/ipa/rpi/vc4/data/imx296_mono.json diff --git a/src/ipa/raspberrypi/data/imx378.json b/src/ipa/rpi/vc4/data/imx378.json similarity index 100% rename from src/ipa/raspberrypi/data/imx378.json rename to src/ipa/rpi/vc4/data/imx378.json diff --git a/src/ipa/raspberrypi/data/imx477.json b/src/ipa/rpi/vc4/data/imx477.json similarity index 100% rename from src/ipa/raspberrypi/data/imx477.json rename to src/ipa/rpi/vc4/data/imx477.json diff --git a/src/ipa/raspberrypi/data/imx477_noir.json b/src/ipa/rpi/vc4/data/imx477_noir.json similarity index 100% rename from src/ipa/raspberrypi/data/imx477_noir.json rename to src/ipa/rpi/vc4/data/imx477_noir.json diff --git a/src/ipa/raspberrypi/data/imx477_scientific.json b/src/ipa/rpi/vc4/data/imx477_scientific.json similarity index 100% rename from src/ipa/raspberrypi/data/imx477_scientific.json rename to src/ipa/rpi/vc4/data/imx477_scientific.json diff --git a/src/ipa/raspberrypi/data/imx477_v1.json b/src/ipa/rpi/vc4/data/imx477_v1.json similarity index 100% 
rename from src/ipa/raspberrypi/data/imx477_v1.json rename to src/ipa/rpi/vc4/data/imx477_v1.json diff --git a/src/ipa/raspberrypi/data/imx519.json b/src/ipa/rpi/vc4/data/imx519.json similarity index 100% rename from src/ipa/raspberrypi/data/imx519.json rename to src/ipa/rpi/vc4/data/imx519.json diff --git a/src/ipa/raspberrypi/data/imx708.json b/src/ipa/rpi/vc4/data/imx708.json similarity index 100% rename from src/ipa/raspberrypi/data/imx708.json rename to src/ipa/rpi/vc4/data/imx708.json diff --git a/src/ipa/raspberrypi/data/imx708_noir.json b/src/ipa/rpi/vc4/data/imx708_noir.json similarity index 100% rename from src/ipa/raspberrypi/data/imx708_noir.json rename to src/ipa/rpi/vc4/data/imx708_noir.json diff --git a/src/ipa/raspberrypi/data/imx708_wide.json b/src/ipa/rpi/vc4/data/imx708_wide.json similarity index 100% rename from src/ipa/raspberrypi/data/imx708_wide.json rename to src/ipa/rpi/vc4/data/imx708_wide.json diff --git a/src/ipa/raspberrypi/data/imx708_wide_noir.json b/src/ipa/rpi/vc4/data/imx708_wide_noir.json similarity index 100% rename from src/ipa/raspberrypi/data/imx708_wide_noir.json rename to src/ipa/rpi/vc4/data/imx708_wide_noir.json diff --git a/src/ipa/raspberrypi/data/meson.build b/src/ipa/rpi/vc4/data/meson.build similarity index 89% rename from src/ipa/raspberrypi/data/meson.build rename to src/ipa/rpi/vc4/data/meson.build index b163a052f57c..bcf5658ba5d2 100644 --- a/src/ipa/raspberrypi/data/meson.build +++ b/src/ipa/rpi/vc4/data/meson.build @@ -23,4 +23,4 @@ conf_files = files([ ]) install_data(conf_files, - install_dir : ipa_data_dir / 'raspberrypi') + install_dir : ipa_data_dir / 'rpi' / 'vc4') diff --git a/src/ipa/raspberrypi/data/ov5647.json b/src/ipa/rpi/vc4/data/ov5647.json similarity index 100% rename from src/ipa/raspberrypi/data/ov5647.json rename to src/ipa/rpi/vc4/data/ov5647.json diff --git a/src/ipa/raspberrypi/data/ov5647_noir.json b/src/ipa/rpi/vc4/data/ov5647_noir.json similarity index 100% rename from 
src/ipa/raspberrypi/data/ov5647_noir.json
rename to src/ipa/rpi/vc4/data/ov5647_noir.json
diff --git a/src/ipa/raspberrypi/data/ov9281_mono.json b/src/ipa/rpi/vc4/data/ov9281_mono.json
similarity index 100%
rename from src/ipa/raspberrypi/data/ov9281_mono.json
rename to src/ipa/rpi/vc4/data/ov9281_mono.json
diff --git a/src/ipa/raspberrypi/data/se327m12.json b/src/ipa/rpi/vc4/data/se327m12.json
similarity index 100%
rename from src/ipa/raspberrypi/data/se327m12.json
rename to src/ipa/rpi/vc4/data/se327m12.json
diff --git a/src/ipa/raspberrypi/data/uncalibrated.json b/src/ipa/rpi/vc4/data/uncalibrated.json
similarity index 100%
rename from src/ipa/raspberrypi/data/uncalibrated.json
rename to src/ipa/rpi/vc4/data/uncalibrated.json
diff --git a/src/ipa/rpi/vc4/meson.build b/src/ipa/rpi/vc4/meson.build
new file mode 100644
index 000000000000..cbd4dec62659
--- /dev/null
+++ b/src/ipa/rpi/vc4/meson.build
@@ -0,0 +1,48 @@
+# SPDX-License-Identifier: CC0-1.0
+
+ipa_name = 'ipa_rpi_vc4'
+
+vc4_ipa_deps = [
+    libcamera_private,
+    libatomic,
+]
+
+vc4_ipa_libs = [
+    rpi_ipa_cam_helper_lib,
+    rpi_ipa_controller_lib
+]
+
+vc4_ipa_includes = [
+    ipa_includes,
+    libipa_includes,
+]
+
+vc4_ipa_sources = files([
+    'raspberrypi.cpp',
+])
+
+vc4_ipa_includes += include_directories('..')
+
+mod = shared_module(ipa_name,
+                    [vc4_ipa_sources, libcamera_generated_ipa_headers],
+                    name_prefix : '',
+                    include_directories : vc4_ipa_includes,
+                    dependencies : vc4_ipa_deps,
+                    link_with : libipa,
+                    link_whole : vc4_ipa_libs,
+                    install : true,
+                    install_dir : ipa_install_dir)
+
+if ipa_sign_module
+    custom_target(ipa_name + '.so.sign',
+                  input : mod,
+                  output : ipa_name + '.so.sign',
+                  command : [ipa_sign, ipa_priv_key, '@INPUT@', '@OUTPUT@'],
+                  install : false,
+                  build_by_default : true)
+endif
+
+subdir('data')
+
+ipa_names += ipa_name
+
diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/rpi/vc4/raspberrypi.cpp
similarity index 98%
rename from src/ipa/raspberrypi/raspberrypi.cpp
rename
to src/ipa/rpi/vc4/raspberrypi.cpp index 9c29fa9a5e5c..5d3bf4caf3da 100644 --- a/src/ipa/raspberrypi/raspberrypi.cpp +++ b/src/ipa/rpi/vc4/raspberrypi.cpp @@ -33,29 +33,29 @@ #include "libcamera/internal/mapped_framebuffer.h" -#include "af_algorithm.h" -#include "af_status.h" -#include "agc_algorithm.h" -#include "agc_status.h" -#include "alsc_status.h" -#include "awb_algorithm.h" -#include "awb_status.h" -#include "black_level_status.h" -#include "cam_helper.h" -#include "ccm_algorithm.h" -#include "ccm_status.h" -#include "contrast_algorithm.h" -#include "contrast_status.h" -#include "controller.h" -#include "denoise_algorithm.h" -#include "denoise_status.h" -#include "dpc_status.h" -#include "geq_status.h" -#include "lux_status.h" -#include "metadata.h" -#include "sharpen_algorithm.h" -#include "sharpen_status.h" -#include "statistics.h" +#include "cam_helper/cam_helper.h" +#include "controller/af_algorithm.h" +#include "controller/af_status.h" +#include "controller/agc_algorithm.h" +#include "controller/agc_status.h" +#include "controller/alsc_status.h" +#include "controller/awb_algorithm.h" +#include "controller/awb_status.h" +#include "controller/black_level_status.h" +#include "controller/ccm_algorithm.h" +#include "controller/ccm_status.h" +#include "controller/contrast_algorithm.h" +#include "controller/contrast_status.h" +#include "controller/controller.h" +#include "controller/denoise_algorithm.h" +#include "controller/denoise_status.h" +#include "controller/dpc_status.h" +#include "controller/geq_status.h" +#include "controller/lux_status.h" +#include "controller/metadata.h" +#include "controller/sharpen_algorithm.h" +#include "controller/sharpen_status.h" +#include "controller/statistics.h" namespace libcamera { @@ -1840,7 +1840,7 @@ const struct IPAModuleInfo ipaModuleInfo = { IPA_MODULE_API_VERSION, 1, "PipelineHandlerRPi", - "raspberrypi", + "rpi/vc4", }; IPAInterface *ipaCreate() From patchwork Wed May 3 12:20:29 2023 Content-Type: text/plain; 
charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18592
To: libcamera-devel@lists.libcamera.org
Date: Wed, 3 May 2023 13:20:29 +0100
Message-Id: <20230503122035.32026-8-naush@raspberrypi.com>
X-Mailer: git-send-email 2.34.1
In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com>
References:
<20230503122035.32026-1-naush@raspberrypi.com>
MIME-Version: 1.0
Subject: [libcamera-devel] [PATCH 07/13] pipeline: raspberrypi: rpi_stream: Set invalid buffer to id == 0
From: Naushir Patuck
Reply-To: Naushir Patuck
Cc: Jacopo Mondi

At present, an RPiStream buffer id of -1 indicates an invalid value. As a
simplification, use id == 0 to indicate an invalid value instead. This
allows for better code readability. As a consequence of this, use
unsigned int for the buffer id values.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
Reviewed-by: Laurent Pinchart
---
 .../pipeline/rpi/common/rpi_stream.cpp         | 10 +++++-----
 src/libcamera/pipeline/rpi/common/rpi_stream.h | 18 +++++++++---------
 src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp |  6 +++---
 3 files changed, 17 insertions(+), 17 deletions(-)

diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp
index 2bb10f25d6ca..3690667e9aa6 100644
--- a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp
+++ b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp
@@ -55,17 +55,17 @@ const BufferMap &Stream::getBuffers() const
 	return bufferMap_;
 }
 
-int Stream::getBufferId(FrameBuffer *buffer) const
+unsigned int Stream::getBufferId(FrameBuffer *buffer) const
 {
 	if (importOnly_)
-		return -1;
+		return 0;
 
 	/* Find the buffer in the map, and return the buffer id.
 */
 	auto it = std::find_if(bufferMap_.begin(), bufferMap_.end(),
 			       [&buffer](auto const &p) { return p.second == buffer; });
 	if (it == bufferMap_.end())
-		return -1;
+		return 0;
 
 	return it->first;
 }
 
@@ -77,10 +77,10 @@ void Stream::setExternalBuffer(FrameBuffer *buffer)
 
 void Stream::removeExternalBuffer(FrameBuffer *buffer)
 {
-	int id = getBufferId(buffer);
+	unsigned int id = getBufferId(buffer);
 
 	/* Ensure we have this buffer in the stream, and it is marked external. */
-	ASSERT(id != -1 && (id & BufferMask::MaskExternalBuffer));
+	ASSERT(id & BufferMask::MaskExternalBuffer);
 
 	bufferMap_.erase(id);
 }
diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.h b/src/libcamera/pipeline/rpi/common/rpi_stream.h
index b8bd79cf1535..1aae674967e1 100644
--- a/src/libcamera/pipeline/rpi/common/rpi_stream.h
+++ b/src/libcamera/pipeline/rpi/common/rpi_stream.h
@@ -58,7 +58,7 @@ public:
 	void setExportedBuffers(std::vector<std::unique_ptr<FrameBuffer>> *buffers);
 	const BufferMap &getBuffers() const;
-	int getBufferId(FrameBuffer *buffer) const;
+	unsigned int getBufferId(FrameBuffer *buffer) const;
 
 	void setExternalBuffer(FrameBuffer *buffer);
 	void removeExternalBuffer(FrameBuffer *buffer);
@@ -74,25 +74,25 @@ private:
 	class IdGenerator
 	{
 	public:
-		IdGenerator(int max)
+		IdGenerator(unsigned int max)
 			: max_(max), id_(0)
 		{
 		}
 
-		int get()
+		unsigned int get()
 		{
-			int id;
+			unsigned int id;
 
 			if (!recycle_.empty()) {
 				id = recycle_.front();
 				recycle_.pop();
 			} else {
-				id = id_++;
+				id = ++id_;
 				ASSERT(id_ <= max_);
 			}
 
 			return id;
 		}
 
-		void release(int id)
+		void release(unsigned int id)
 		{
 			recycle_.push(id);
 		}
 
@@ -104,9 +104,9 @@ private:
 		}
 
 	private:
-		int max_;
-		int id_;
-		std::queue<int> recycle_;
+		unsigned int max_;
+		unsigned int id_;
+		std::queue<unsigned int> recycle_;
 	};
 
 	void clearBuffers();
diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp
index af464d153f28..96749c0d1318 100644
--- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp
+++
b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp
@@ -1210,7 +1210,7 @@ int PipelineHandlerRPi::queueRequestDevice(Camera *camera, Request *request)
 			continue;
 
 		FrameBuffer *buffer = request->findBuffer(stream);
-		if (buffer && stream->getBufferId(buffer) == -1) {
+		if (buffer && !stream->getBufferId(buffer)) {
 			/*
 			 * This buffer is not recognised, so it must have been allocated
 			 * outside the v4l2 device. Store it in the stream buffer list
@@ -2042,7 +2042,7 @@ void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)
 	for (RPi::Stream &s : unicam_) {
 		index = s.getBufferId(buffer);
-		if (index != -1) {
+		if (index) {
 			stream = &s;
 			break;
 		}
@@ -2098,7 +2098,7 @@ void RPiCameraData::ispOutputDequeue(FrameBuffer *buffer)
 	for (RPi::Stream &s : isp_) {
 		index = s.getBufferId(buffer);
-		if (index != -1) {
+		if (index) {
 			stream = &s;
 			break;
 		}

From patchwork Wed May 3 12:20:30 2023
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18593
To: libcamera-devel@lists.libcamera.org
Date: Wed, 3 May 2023 13:20:30 +0100
Message-Id: <20230503122035.32026-9-naush@raspberrypi.com>
X-Mailer: git-send-email 2.34.1
In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com>
References: <20230503122035.32026-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 08/13] pipeline: ipa: raspberrypi: Restructure the IPA mojom interface
From: Naushir Patuck
Reply-To: Naushir Patuck
Cc: Jacopo Mondi
Sender: "libcamera-devel"

Restructure the IPA mojom interface to be more consistent in the use of
the API. Function parameters are now grouped into *Params structures and
results are now returned in *Results structures.
The following pipeline -> IPA interfaces have been removed:

signalQueueRequest(libcamera.ControlList controls);
signalIspPrepare(ISPConfig data);
signalStatReady(uint32 bufferId, uint32 ipaContext);

and replaced with:

prepareIsp(PrepareParams params);
processStats(ProcessParams params);

signalQueueRequest() is now encompassed within prepareIsp().

The following IPA -> pipeline interfaces have been removed:

runIsp(uint32 bufferId);
embeddedComplete(uint32 bufferId);
statsMetadataComplete(uint32 bufferId, libcamera.ControlList controls);

and replaced with the following async calls:

prepareIspComplete(BufferIds buffers);
processStatsComplete(BufferIds buffers);
metadataReady(libcamera.ControlList metadata);

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 include/libcamera/ipa/raspberrypi.mojom        | 238 +++++++++++++++---
 src/ipa/rpi/vc4/raspberrypi.cpp                | 102 ++++----
 .../pipeline/rpi/vc4/raspberrypi.cpp           | 155 ++++++------
 3 files changed, 320 insertions(+), 175 deletions(-)

diff --git a/include/libcamera/ipa/raspberrypi.mojom b/include/libcamera/ipa/raspberrypi.mojom
index 80e0126618c8..ba786e647ca1 100644
--- a/include/libcamera/ipa/raspberrypi.mojom
+++ b/include/libcamera/ipa/raspberrypi.mojom
@@ -8,7 +8,7 @@ module ipa.RPi;
 
 import "include/libcamera/ipa/core.mojom";
 
-/* Size of the LS grid allocation. */
+/* Size of the LS grid allocation on VC4.
*/ const uint32 MaxLsGridSize = 0x8000; struct SensorConfig { @@ -19,64 +19,123 @@ struct SensorConfig { uint32 sensorMetadata; }; -struct IPAInitResult { +struct InitParams { + bool lensPresent; +}; + +struct InitResult { SensorConfig sensorConfig; libcamera.ControlInfoMap controlInfo; }; -struct ISPConfig { - uint32 embeddedBufferId; - uint32 bayerBufferId; - bool embeddedBufferPresent; - libcamera.ControlList controls; - uint32 ipaContext; - uint32 delayContext; +struct BufferIds { + uint32 bayer; + uint32 embedded; + uint32 stats; }; -struct IPAConfig { +struct ConfigParams { uint32 transform; - libcamera.SharedFD lsTableHandle; libcamera.ControlInfoMap sensorControls; libcamera.ControlInfoMap ispControls; libcamera.ControlInfoMap lensControls; + /* VC4 specific */ + libcamera.SharedFD lsTableHandle; }; -struct IPAConfigResult { - float modeSensitivity; - libcamera.ControlInfoMap controlInfo; +struct ConfigResult { + float modeSensitivity; + libcamera.ControlInfoMap controlInfo; + libcamera.ControlList controls; }; -struct StartConfig { +struct StartResult { libcamera.ControlList controls; int32 dropFrameCount; }; +struct PrepareParams { + BufferIds buffers; + libcamera.ControlList sensorControls; + libcamera.ControlList requestControls; + uint32 ipaContext; + uint32 delayContext; +}; + +struct ProcessParams { + BufferIds buffers; + uint32 ipaContext; +}; + interface IPARPiInterface { - init(libcamera.IPASettings settings, bool lensPresent) - => (int32 ret, IPAInitResult result); - start(libcamera.ControlList controls) => (StartConfig startConfig); + /** + * \fn init() + * \brief Initialise the IPA + * \param[in] settings Camera sensor information and configuration file + * \param[in] params Platform specific initialisation parameters + * \param[out] ret 0 on success or a negative error code otherwise + * \param[out] result Static sensor configuration and controls available + * + * This function initialises the IPA for a particular sensor from the + * pipeline 
handler. + * + * The \a settings conveys information about the camera sensor and + * configuration file requested by the pipeline handler. + * + * The \a result parameter returns the sensor delay for the given camera + * as well as a ControlInfoMap of available controls that can be handled + * by the IPA. + */ + init(libcamera.IPASettings settings, InitParams params) + => (int32 ret, InitResult result); + + /** + * \fn start() + * \brief Start the IPA + * \param[in] controls List of control to handle + * \param[out] result Controls to apply and number of dropped frames + * + * This function sets the IPA to a started state. + * + * The \a controls provide a list of controls to handle immediately. The + * actual controls to apply on the sensor and ISP in the pipeline + * handler are returned in \a result. + * + * The number of convergence frames to be dropped is also returned in + * \a result. + */ + start(libcamera.ControlList controls) => (StartResult result); + + /** + * \fn start() + * \brief Stop the IPA + * + * This function sets the IPA to a stopped state. + */ stop(); /** * \fn configure() - * \brief Configure the IPA stream and sensor settings - * \param[in] sensorInfo Camera sensor information - * \param[in] ipaConfig Pipeline-handler-specific configuration data - * \param[out] controls Controls to apply by the pipeline entity - * \param[out] result Other results that the pipeline handler may require + * \brief Configure the IPA + * \param[in] sensorInfo Sensor mode configuration + * \param[in] params Platform configuration parameters + * \param[out] ret 0 on success or a negative error code otherwise + * \param[out] result Results of the configuration operation * - * This function shall be called when the camera is configured to inform - * the IPA of the camera's streams and the sensor settings. 
+ * This function configures the IPA for a particular camera + * configuration * - * The \a sensorInfo conveys information about the camera sensor settings that - * the pipeline handler has selected for the configuration. + * The \a params parameter provides a list of available controls for the + * ISP, sensor and lens devices, and the user requested transform + * operation. It can also provide platform specific configuration + * parameters, e.g. the lens shading table memory handle for VC4. * - * The \a ipaConfig and \a controls parameters carry data passed by the - * pipeline handler to the IPA and back. + * The \a result parameter returns the available controls for the given + * camera mode, a list of controls to apply to the sensor device, and + * the requested mode's sensitivity characteristics. */ - configure(libcamera.IPACameraSensorInfo sensorInfo, - IPAConfig ipaConfig) - => (int32 ret, libcamera.ControlList controls, IPAConfigResult result); + configure(libcamera.IPACameraSensorInfo sensorInfo, ConfigParams params) + => (int32 ret, ConfigResult result); /** * \fn mapBuffers() @@ -99,7 +158,7 @@ interface IPARPiInterface { * depending on the IPA protocol. Regardless of the protocol, all * buffers mapped at a given time shall have unique numerical IDs. * - * The numerical IDs have no meaning defined by the IPA interface, and + * The numerical IDs have no meaning defined by the IPA interface, and * should be treated as opaque handles by IPAs, with the only exception * that ID zero is invalid. 
* @@ -119,17 +178,118 @@ interface IPARPiInterface { */ unmapBuffers(array ids); - [async] signalStatReady(uint32 bufferId, uint32 ipaContext); - [async] signalQueueRequest(libcamera.ControlList controls); - [async] signalIspPrepare(ISPConfig data); + /** + * \fn prepareIsp() + * \brief Prepare the ISP configuration for a frame + * \param[in] params Parameter set for the frame to process + * + * This function call into all the algorithms in preparation for the + * frame to be processed by the ISP. + * + * The \a params parameter lists the buffer IDs for the Bayer and + * embedded data buffers, a ControlList of sensor frame params, and + * a ControlList of request controls for the current frame. + * + * Additionally, \a params also contains the IPA context (ipaContext) to + * use as an index location to store control algorithm results, and a + * historical IPA context (delayContext) that was active when the sensor + * settings were requested by the IPA. + */ + [async] prepareIsp(PrepareParams params); + + /** + * \fn processStats() + * \brief Process the statistics provided by the ISP + * \param[in] params Parameter set for the statistics to process + * + * This function call into all the algorithms to provide the statistics + * generated by the ISP for the processed frame. + * + * The \a params parameter lists the buffer ID for the statistics buffer + * and an IPA context (ipaContext) to use as an index location to store + * algorithm results. + */ + [async] processStats(ProcessParams params); }; interface IPARPiEventInterface { - statsMetadataComplete(uint32 bufferId, libcamera.ControlList controls); - runIsp(uint32 bufferId); - embeddedComplete(uint32 bufferId); + /** + * \fn prepareIspComplete() + * \brief Signal completion of \a prepareIsp + * \param[in] buffers Bayer and embedded buffers actioned. 
+ * + * This asynchronous event is signalled to the pipeline handler once + * the \a prepareIsp signal has completed, and the ISP is ready to start + * processing the frame. The embedded data buffer may be recycled after + * this event. + */ + prepareIspComplete(BufferIds buffers); + + /** + * \fn processStatsComplete() + * \brief Signal completion of \a processStats + * \param[in] buffers Statistics buffers actioned. + * + * This asynchronous event is signalled to the pipeline handler once + * the \a processStats signal has completed. The statistics buffer may + * be recycled after this event. + */ + processStatsComplete(BufferIds buffers); + + /** + * \fn metadataReady() + * \brief Signal request metadata is to be merged + * \param[in] metadata Control list of metadata to be merged + * + * This asynchronous event is signalled to the pipeline handler once + * all the frame metadata has been gathered. The pipeline handler will + * copy or merge this metadata into the \a Request returned back to the + * application. + */ + metadataReady(libcamera.ControlList metadata); + + /** + * \fn setIspControls() + * \brief Signal ISP controls to be applied. + * \param[in] controls List of controls to be applied. + * + * This asynchronous event is signalled to the pipeline handler during + * the \a prepareISP signal after all algorithms have been run and the + * IPA requires ISP controls to be applied for the frame. + */ setIspControls(libcamera.ControlList controls); + + /** + * \fn setDelayedControls() + * \brief Signal Sensor controls to be applied. + * \param[in] controls List of controls to be applied. + * \param[in] delayContext IPA context index used for this request + * + * This asynchronous event is signalled to the pipeline handler when + * the IPA requires sensor specific controls (e.g. shutter speed, gain, + * blanking) to be applied. 
+ */ setDelayedControls(libcamera.ControlList controls, uint32 delayContext); + + /** + * \fn setLensControls() + * \brief Signal lens controls to be applied. + * \param[in] controls List of controls to be applied. + * + * This asynchronous event is signalled to the pipeline handler when + * the IPA requires a lens movement control to be applied. + */ setLensControls(libcamera.ControlList controls); + + /** + * \fn setCameraTimeout() + * \brief Request a watchdog timeout value to use + * \param[in] maxFrameLengthMs Timeout value in ms + * + * This asynchronous event is used by the IPA to inform the pipeline + * handler of an acceptable watchdog timer value to use for the sensor + * stream. This value is based on the history of frame lengths requested + * by the IPA. + */ setCameraTimeout(uint32 maxFrameLengthMs); }; diff --git a/src/ipa/rpi/vc4/raspberrypi.cpp b/src/ipa/rpi/vc4/raspberrypi.cpp index 5d3bf4caf3da..17ea5c046f4f 100644 --- a/src/ipa/rpi/vc4/raspberrypi.cpp +++ b/src/ipa/rpi/vc4/raspberrypi.cpp @@ -136,30 +136,28 @@ public: munmap(lsTable_, MaxLsGridSize); } - int init(const IPASettings &settings, bool lensPresent, IPAInitResult *result) override; - void start(const ControlList &controls, StartConfig *startConfig) override; + int init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) override; + void start(const ControlList &controls, StartResult *result) override; void stop() override {} - int configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &data, - ControlList *controls, IPAConfigResult *result) override; + int configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, + ConfigResult *result) override; void mapBuffers(const std::vector &buffers) override; void unmapBuffers(const std::vector &ids) override; - void signalStatReady(const uint32_t bufferId, uint32_t ipaContext) override; - void signalQueueRequest(const ControlList &controls) override; - void signalIspPrepare(const ISPConfig &data) 
override; + void prepareIsp(const PrepareParams ¶ms) override; + void processStats(const ProcessParams ¶ms) override; private: void setMode(const IPACameraSensorInfo &sensorInfo); bool validateSensorControls(); bool validateIspControls(); bool validateLensControls(); - void queueRequest(const ControlList &controls); - void returnEmbeddedBuffer(unsigned int bufferId); - void prepareISP(const ISPConfig &data); + void applyControls(const ControlList &controls); + void prepare(const PrepareParams ¶ms); void reportMetadata(unsigned int ipaContext); void fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext); RPiController::StatisticsPtr fillStatistics(bcm2835_isp_stats *stats) const; - void processStats(unsigned int bufferId, unsigned int ipaContext); + void process(unsigned int bufferId, unsigned int ipaContext); void setCameraTimeoutValue(); void applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration); void applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls); @@ -229,7 +227,7 @@ private: Duration lastTimeout_; }; -int IPARPi::init(const IPASettings &settings, bool lensPresent, IPAInitResult *result) +int IPARPi::init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) { /* * Load the "helper" for this sensor. This tells us all the device specific stuff @@ -274,7 +272,7 @@ int IPARPi::init(const IPASettings &settings, bool lensPresent, IPAInitResult *r return -EINVAL; } - lensPresent_ = lensPresent; + lensPresent_ = params.lensPresent; controller_.initialise(); @@ -287,14 +285,13 @@ int IPARPi::init(const IPASettings &settings, bool lensPresent, IPAInitResult *r return 0; } -void IPARPi::start(const ControlList &controls, StartConfig *startConfig) +void IPARPi::start(const ControlList &controls, StartResult *result) { RPiController::Metadata metadata; - ASSERT(startConfig); if (!controls.empty()) { /* We have been given some controls to action before start. 
*/ - queueRequest(controls); + applyControls(controls); } controller_.switchMode(mode_, &metadata); @@ -313,7 +310,7 @@ void IPARPi::start(const ControlList &controls, StartConfig *startConfig) if (agcStatus.shutterTime && agcStatus.analogueGain) { ControlList ctrls(sensorCtrls_); applyAGC(&agcStatus, ctrls); - startConfig->controls = std::move(ctrls); + result->controls = std::move(ctrls); setCameraTimeoutValue(); } @@ -360,7 +357,7 @@ void IPARPi::start(const ControlList &controls, StartConfig *startConfig) mistrustCount_ = helper_->mistrustFramesModeSwitch(); } - startConfig->dropFrameCount = dropFrameCount_; + result->dropFrameCount = dropFrameCount_; firstStart_ = false; lastRunTimestamp_ = 0; @@ -435,11 +432,11 @@ void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo) mode_.minFrameDuration, mode_.maxFrameDuration); } -int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ipaConfig, - ControlList *controls, IPAConfigResult *result) +int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, + ConfigResult *result) { - sensorCtrls_ = ipaConfig.sensorControls; - ispCtrls_ = ipaConfig.ispControls; + sensorCtrls_ = params.sensorControls; + ispCtrls_ = params.ispControls; if (!validateSensorControls()) { LOG(IPARPI, Error) << "Sensor control validation failed."; @@ -452,7 +449,7 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ip } if (lensPresent_) { - lensCtrls_ = ipaConfig.lensControls; + lensCtrls_ = params.lensControls; if (!validateLensControls()) { LOG(IPARPI, Warning) << "Lens validation failed, " << "no lens control will be available."; @@ -466,10 +463,10 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ip /* Re-assemble camera mode using the sensor info. 
*/ setMode(sensorInfo); - mode_.transform = static_cast(ipaConfig.transform); + mode_.transform = static_cast(params.transform); /* Store the lens shading table pointer and handle if available. */ - if (ipaConfig.lsTableHandle.isValid()) { + if (params.lsTableHandle.isValid()) { /* Remove any previous table, if there was one. */ if (lsTable_) { munmap(lsTable_, MaxLsGridSize); @@ -477,7 +474,7 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ip } /* Map the LS table buffer into user space. */ - lsTableHandle_ = std::move(ipaConfig.lsTableHandle); + lsTableHandle_ = std::move(params.lsTableHandle); if (lsTableHandle_.isValid()) { lsTable_ = mmap(nullptr, MaxLsGridSize, PROT_READ | PROT_WRITE, MAP_SHARED, lsTableHandle_.get(), 0); @@ -512,8 +509,7 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const IPAConfig &ip applyAGC(&agcStatus, ctrls); } - ASSERT(controls); - *controls = std::move(ctrls); + result->controls = std::move(ctrls); /* * Apply the correct limits to the exposure, gain and frame duration controls @@ -560,37 +556,34 @@ void IPARPi::unmapBuffers(const std::vector &ids) } } -void IPARPi::signalStatReady(uint32_t bufferId, uint32_t ipaContext) +void IPARPi::processStats(const ProcessParams ¶ms) { - unsigned int context = ipaContext % rpiMetadata_.size(); + unsigned int context = params.ipaContext % rpiMetadata_.size(); if (++checkCount_ != frameCount_) /* assert here? 
*/ LOG(IPARPI, Error) << "WARNING: Prepare/Process mismatch!!!"; if (processPending_ && frameCount_ > mistrustCount_) - processStats(bufferId, context); + process(params.buffers.stats, context); reportMetadata(context); - - statsMetadataComplete.emit(bufferId, libcameraMetadata_); + processStatsComplete.emit(params.buffers); } -void IPARPi::signalQueueRequest(const ControlList &controls) -{ - queueRequest(controls); -} -void IPARPi::signalIspPrepare(const ISPConfig &data) +void IPARPi::prepareIsp(const PrepareParams ¶ms) { + applyControls(params.requestControls); + /* * At start-up, or after a mode-switch, we may want to * avoid running the control algos for a few frames in case * they are "unreliable". */ - prepareISP(data); + prepare(params); frameCount_++; /* Ready to push the input buffer into the ISP. */ - runIsp.emit(data.bayerBufferId); + prepareIspComplete.emit(params.buffers); } void IPARPi::reportMetadata(unsigned int ipaContext) @@ -703,6 +696,8 @@ void IPARPi::reportMetadata(unsigned int ipaContext) libcameraMetadata_.set(controls::AfState, s); libcameraMetadata_.set(controls::AfPauseState, p); } + + metadataReady.emit(libcameraMetadata_); } bool IPARPi::validateSensorControls() @@ -826,7 +821,7 @@ static const std::map AfPauseTable { controls::AfPauseResume, RPiController::AfAlgorithm::AfPauseResume }, }; -void IPARPi::queueRequest(const ControlList &controls) +void IPARPi::applyControls(const ControlList &controls) { using RPiController::AfAlgorithm; @@ -1256,27 +1251,22 @@ void IPARPi::queueRequest(const ControlList &controls) } } -void IPARPi::returnEmbeddedBuffer(unsigned int bufferId) +void IPARPi::prepare(const PrepareParams ¶ms) { - embeddedComplete.emit(bufferId); -} - -void IPARPi::prepareISP(const ISPConfig &data) -{ - int64_t frameTimestamp = data.controls.get(controls::SensorTimestamp).value_or(0); - unsigned int ipaContext = data.ipaContext % rpiMetadata_.size(); + int64_t frameTimestamp = 
params.sensorControls.get(controls::SensorTimestamp).value_or(0); + unsigned int ipaContext = params.ipaContext % rpiMetadata_.size(); RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; Span embeddedBuffer; rpiMetadata.clear(); - fillDeviceStatus(data.controls, ipaContext); + fillDeviceStatus(params.sensorControls, ipaContext); - if (data.embeddedBufferPresent) { + if (params.buffers.embedded) { /* * Pipeline handler has supplied us with an embedded data buffer, * we must pass it to the CamHelper for parsing. */ - auto it = buffers_.find(data.embeddedBufferId); + auto it = buffers_.find(params.buffers.embedded); ASSERT(it != buffers_.end()); embeddedBuffer = it->second.planes()[0]; } @@ -1288,7 +1278,7 @@ void IPARPi::prepareISP(const ISPConfig &data) * metadata. */ AgcStatus agcStatus; - RPiController::Metadata &delayedMetadata = rpiMetadata_[data.delayContext]; + RPiController::Metadata &delayedMetadata = rpiMetadata_[params.delayContext]; if (!delayedMetadata.get("agc.status", agcStatus)) rpiMetadata.set("agc.delayed_status", agcStatus); @@ -1298,10 +1288,6 @@ void IPARPi::prepareISP(const ISPConfig &data) */ helper_->prepare(embeddedBuffer, rpiMetadata); - /* Done with embedded data now, return to pipeline handler asap. */ - if (data.embeddedBufferPresent) - returnEmbeddedBuffer(data.embeddedBufferId); - /* Allow a 10% margin on the comparison below. 
*/ Duration delta = (frameTimestamp - lastRunTimestamp_) * 1.0ns; if (lastRunTimestamp_ && frameCount_ > dropFrameCount_ && @@ -1445,7 +1431,7 @@ RPiController::StatisticsPtr IPARPi::fillStatistics(bcm2835_isp_stats *stats) co return statistics; } -void IPARPi::processStats(unsigned int bufferId, unsigned int ipaContext) +void IPARPi::process(unsigned int bufferId, unsigned int ipaContext) { RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp index 96749c0d1318..27b3bde4dbac 100644 --- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp +++ b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp @@ -200,15 +200,15 @@ public: void freeBuffers(); void frameStarted(uint32_t sequence); - int loadIPA(ipa::RPi::IPAInitResult *result); - int configureIPA(const CameraConfiguration *config, ipa::RPi::IPAConfigResult *result); + int loadIPA(ipa::RPi::InitResult *result); + int configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result); int loadPipelineConfiguration(); void enumerateVideoDevices(MediaLink *link); - void statsMetadataComplete(uint32_t bufferId, const ControlList &controls); - void runIsp(uint32_t bufferId); - void embeddedComplete(uint32_t bufferId); + void processStatsComplete(const ipa::RPi::BufferIds &buffers); + void metadataReady(const ControlList &metadata); + void prepareIspComplete(const ipa::RPi::BufferIds &buffers); void setIspControls(const ControlList &controls); void setDelayedControls(const ControlList &controls, uint32_t delayContext); void setLensControls(const ControlList &controls); @@ -238,7 +238,7 @@ public: /* The vector below is just for convenience when iterating over all streams. */ std::vector streams_; /* Stores the ids of the buffers mapped in the IPA. 
*/ - std::unordered_set ipaBuffers_; + std::unordered_set bufferIds_; /* * Stores a cascade of Video Mux or Bridge devices between the sensor and * Unicam together with media link across the entities. @@ -1000,7 +1000,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop); - ipa::RPi::IPAConfigResult result; + ipa::RPi::ConfigResult result; ret = data->configureIPA(config, &result); if (ret) LOG(RPI, Error) << "Failed to configure the IPA: " << ret; @@ -1117,17 +1117,17 @@ int PipelineHandlerRPi::start(Camera *camera, const ControlList *controls) data->applyScalerCrop(*controls); /* Start the IPA. */ - ipa::RPi::StartConfig startConfig; + ipa::RPi::StartResult result; data->ipa_->start(controls ? *controls : ControlList{ controls::controls }, - &startConfig); + &result); /* Apply any gain/exposure settings that the IPA may have passed back. */ - if (!startConfig.controls.empty()) - data->setSensorControls(startConfig.controls); + if (!result.controls.empty()) + data->setSensorControls(result.controls); /* Configure the number of dropped frames required on startup. */ data->dropFrameCount_ = data->config_.disableStartupFrameDrops - ? 0 : startConfig.dropFrameCount; + ? 
0 : result.dropFrameCount; for (auto const stream : data->streams_) stream->resetBuffers(); @@ -1358,7 +1358,7 @@ int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, Me data->sensorFormats_ = populateSensorFormats(data->sensor_); - ipa::RPi::IPAInitResult result; + ipa::RPi::InitResult result; if (data->loadIPA(&result)) { LOG(RPI, Error) << "Failed to load a suitable IPA library"; return -EINVAL; @@ -1599,7 +1599,7 @@ int PipelineHandlerRPi::prepareBuffers(Camera *camera) void PipelineHandlerRPi::mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask) { RPiCameraData *data = cameraData(camera); - std::vector ipaBuffers; + std::vector bufferIds; /* * Link the FrameBuffers with the id (key value) in the map stored in * the RPi stream object - along with an identifier mask. @@ -1608,12 +1608,12 @@ void PipelineHandlerRPi::mapBuffers(Camera *camera, const RPi::BufferMap &buffer * handler and the IPA. */ for (auto const &it : buffers) { - ipaBuffers.push_back(IPABuffer(mask | it.first, + bufferIds.push_back(IPABuffer(mask | it.first, it.second->planes())); - data->ipaBuffers_.insert(mask | it.first); + data->bufferIds_.insert(mask | it.first); } - data->ipa_->mapBuffers(ipaBuffers); + data->ipa_->mapBuffers(bufferIds); } void RPiCameraData::freeBuffers() @@ -1623,10 +1623,10 @@ void RPiCameraData::freeBuffers() * Copy the buffer ids from the unordered_set to a vector to * pass to the IPA. 
*/ - std::vector ipaBuffers(ipaBuffers_.begin(), - ipaBuffers_.end()); - ipa_->unmapBuffers(ipaBuffers); - ipaBuffers_.clear(); + std::vector bufferIds(bufferIds_.begin(), + bufferIds_.end()); + ipa_->unmapBuffers(bufferIds); + bufferIds_.clear(); } for (auto const stream : streams_) @@ -1643,16 +1643,16 @@ void RPiCameraData::frameStarted(uint32_t sequence) delayedCtrls_->applyControls(sequence); } -int RPiCameraData::loadIPA(ipa::RPi::IPAInitResult *result) +int RPiCameraData::loadIPA(ipa::RPi::InitResult *result) { ipa_ = IPAManager::createIPA(pipe(), 1, 1); if (!ipa_) return -ENOENT; - ipa_->statsMetadataComplete.connect(this, &RPiCameraData::statsMetadataComplete); - ipa_->runIsp.connect(this, &RPiCameraData::runIsp); - ipa_->embeddedComplete.connect(this, &RPiCameraData::embeddedComplete); + ipa_->processStatsComplete.connect(this, &RPiCameraData::processStatsComplete); + ipa_->prepareIspComplete.connect(this, &RPiCameraData::prepareIspComplete); + ipa_->metadataReady.connect(this, &RPiCameraData::metadataReady); ipa_->setIspControls.connect(this, &RPiCameraData::setIspControls); ipa_->setDelayedControls.connect(this, &RPiCameraData::setDelayedControls); ipa_->setLensControls.connect(this, &RPiCameraData::setLensControls); @@ -1674,23 +1674,25 @@ int RPiCameraData::loadIPA(ipa::RPi::IPAInitResult *result) } IPASettings settings(configurationFile, sensor_->model()); + ipa::RPi::InitParams params; - return ipa_->init(settings, !!sensor_->focusLens(), result); + params.lensPresent = !!sensor_->focusLens(); + return ipa_->init(settings, params, result); } -int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::IPAConfigResult *result) +int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result) { std::map entityControls; - ipa::RPi::IPAConfig ipaConfig; + ipa::RPi::ConfigParams params; /* \todo Move passing of ispControls and lensControls to ipa::init() */ - ipaConfig.sensorControls = 
sensor_->controls(); - ipaConfig.ispControls = isp_[Isp::Input].dev()->controls(); + params.sensorControls = sensor_->controls(); + params.ispControls = isp_[Isp::Input].dev()->controls(); if (sensor_->focusLens()) - ipaConfig.lensControls = sensor_->focusLens()->controls(); + params.lensControls = sensor_->focusLens()->controls(); /* Always send the user transform to the IPA. */ - ipaConfig.transform = static_cast(config->transform); + params.transform = static_cast(config->transform); /* Allocate the lens shading table via dmaHeap and pass to the IPA. */ if (!lsTable_.isValid()) { @@ -1703,7 +1705,7 @@ int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::IPA * \todo Investigate if mapping the lens shading table buffer * could be handled with mapBuffers(). */ - ipaConfig.lsTableHandle = lsTable_; + params.lsTableHandle = lsTable_; } /* We store the IPACameraSensorInfo for digital zoom calculations. */ @@ -1714,15 +1716,14 @@ int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::IPA } /* Ready the IPA - it must know about the sensor resolution. 
*/ - ControlList controls; - ret = ipa_->configure(sensorInfo_, ipaConfig, &controls, result); + ret = ipa_->configure(sensorInfo_, params, result); if (ret < 0) { LOG(RPI, Error) << "IPA configuration failed!"; return -EPIPE; } - if (!controls.empty()) - setSensorControls(controls); + if (!result->controls.empty()) + setSensorControls(result->controls); return 0; } @@ -1883,24 +1884,32 @@ void RPiCameraData::enumerateVideoDevices(MediaLink *link) } } -void RPiCameraData::statsMetadataComplete(uint32_t bufferId, const ControlList &controls) +void RPiCameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers) { if (!isRunning()) return; - FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(bufferId & RPi::MaskID); + FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID); handleStreamBuffer(buffer, &isp_[Isp::Stats]); + state_ = State::IpaComplete; + handleState(); +} + +void RPiCameraData::metadataReady(const ControlList &metadata) +{ + if (!isRunning()) + return; /* Add to the Request metadata buffer what the IPA has provided. */ Request *request = requestQueue_.front(); - request->metadata().merge(controls); + request->metadata().merge(metadata); /* * Inform the sensor of the latest colour gains if it has the * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set). */ - const auto &colourGains = controls.get(libcamera::controls::ColourGains); + const auto &colourGains = metadata.get(libcamera::controls::ColourGains); if (notifyGainsUnity_ && colourGains) { /* The control wants linear gains in the order B, Gb, Gr, R. 
*/ ControlList ctrls(sensor_->controls()); @@ -1914,33 +1923,29 @@ void RPiCameraData::statsMetadataComplete(uint32_t bufferId, const ControlList & sensor_->setControls(&ctrls); } - - state_ = State::IpaComplete; - handleState(); } -void RPiCameraData::runIsp(uint32_t bufferId) +void RPiCameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers) { + unsigned int embeddedId = buffers.embedded & RPi::MaskID; + unsigned int bayer = buffers.bayer & RPi::MaskID; + FrameBuffer *buffer; + if (!isRunning()) return; - FrameBuffer *buffer = unicam_[Unicam::Image].getBuffers().at(bufferId & RPi::MaskID); - - LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bufferId & RPi::MaskID) + buffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID); + LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bayer & RPi::MaskID) << ", timestamp: " << buffer->metadata().timestamp; isp_[Isp::Input].queueBuffer(buffer); ispOutputCount_ = 0; - handleState(); -} -void RPiCameraData::embeddedComplete(uint32_t bufferId) -{ - if (!isRunning()) - return; + if (sensorMetadata_ && embeddedId) { + buffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID); + handleStreamBuffer(buffer, &unicam_[Unicam::Embedded]); + } - FrameBuffer *buffer = unicam_[Unicam::Embedded].getBuffers().at(bufferId & RPi::MaskID); - handleStreamBuffer(buffer, &unicam_[Unicam::Embedded]); handleState(); } @@ -2116,8 +2121,10 @@ void RPiCameraData::ispOutputDequeue(FrameBuffer *buffer) * application until after the IPA signals so. */ if (stream == &isp_[Isp::Stats]) { - ipa_->signalStatReady(RPi::MaskStats | static_cast(index), - requestQueue_.front()->sequence()); + ipa::RPi::ProcessParams params; + params.buffers.stats = index | RPi::MaskStats; + params.ipaContext = requestQueue_.front()->sequence(); + ipa_->processStats(params); } else { /* Any other ISP output can be handed back to the application now. 
*/ handleStreamBuffer(buffer, stream); @@ -2344,38 +2351,30 @@ void RPiCameraData::tryRunPipeline() request->metadata().clear(); fillRequestMetadata(bayerFrame.controls, request); - /* - * Process all the user controls by the IPA. Once this is complete, we - * queue the ISP output buffer listed in the request to start the HW - * pipeline. - */ - ipa_->signalQueueRequest(request->controls()); - /* Set our state to say the pipeline is active. */ state_ = State::Busy; - unsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer); + unsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer); - LOG(RPI, Debug) << "Signalling signalIspPrepare:" - << " Bayer buffer id: " << bayerId; + LOG(RPI, Debug) << "Signalling prepareIsp:" + << " Bayer buffer id: " << bayer; - ipa::RPi::ISPConfig ispPrepare; - ispPrepare.bayerBufferId = RPi::MaskBayerData | bayerId; - ispPrepare.controls = std::move(bayerFrame.controls); - ispPrepare.ipaContext = request->sequence(); - ispPrepare.delayContext = bayerFrame.delayContext; + ipa::RPi::PrepareParams params; + params.buffers.bayer = RPi::MaskBayerData | bayer; + params.sensorControls = std::move(bayerFrame.controls); + params.requestControls = request->controls(); + params.ipaContext = request->sequence(); + params.delayContext = bayerFrame.delayContext; if (embeddedBuffer) { unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer); - ispPrepare.embeddedBufferId = RPi::MaskEmbeddedData | embeddedId; - ispPrepare.embeddedBufferPresent = true; - - LOG(RPI, Debug) << "Signalling signalIspPrepare:" + params.buffers.embedded = RPi::MaskEmbeddedData | embeddedId; + LOG(RPI, Debug) << "Signalling prepareIsp:" << " Embedded buffer id: " << embeddedId; } - ipa_->signalIspPrepare(ispPrepare); + ipa_->prepareIsp(params); } bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer) From patchwork Wed May 3 12:20:31 2023 Content-Type: text/plain; charset="utf-8" 
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
X-Patchwork-Submitter: Naushir Patuck
X-Patchwork-Id: 18594
To: libcamera-devel@lists.libcamera.org
Date: Wed, 3 May 2023 13:20:31 +0100
Message-Id: <20230503122035.32026-10-naush@raspberrypi.com>
X-Mailer: git-send-email 2.34.1
In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com>
References:
<20230503122035.32026-1-naush@raspberrypi.com>
Subject: [libcamera-devel] [PATCH 09/13] ipa: raspberrypi: Introduce IpaBase class
From: Naushir Patuck
Reply-To: Naushir Patuck
Cc: Jacopo Mondi

Create a new IpaBase class that handles general purpose housekeeping duties
for the Raspberry Pi IPA. The implementation of the new class is essentially
pulled from the existing ipa/rpi/vc4/raspberrypi.cpp file with a minimal
amount of refactoring.

Create a derived IpaVc4 class from IpaBase that handles the VC4 pipeline
specific tasks of the IPA. Again, code for this class implementation is taken
from the existing ipa/rpi/vc4/raspberrypi.cpp with a minimal amount of
refactoring.

The goal of this change is to allow third parties to implement their own IPA
running on the Raspberry Pi without duplicating all of the IPA housekeeping
tasks.
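[Editor's note, not part of the patch: the base/derived split described above
can be sketched roughly as below. The class and hook names (IpaBase, IpaVc4,
platformInit) follow the patch, but the member and hook bodies here are
illustrative assumptions only, not the patch's code.]

```cpp
// Illustrative sketch: IpaBase owns the generic IPA housekeeping and defers
// platform-specific steps to virtual hooks that a derived class such as
// IpaVc4 implements. Hook bodies are placeholders, not real VC4 code.
class IpaBase {
public:
	virtual ~IpaBase() = default;

	// Generic entry point: do the common housekeeping, then call the
	// platform hook so the derived class can finish the job.
	int init()
	{
		housekeepingDone = true; /* stand-in for controller/helper setup */
		return platformInit();
	}

	bool housekeepingDone = false;

protected:
	virtual int platformInit() = 0; /* supplied by the platform class */
};

class IpaVc4 : public IpaBase {
protected:
	int platformInit() override
	{
		/* VC4-specific initialisation would go here. */
		return 0;
	}
};
```

A third-party IPA would derive from IpaBase in the same way as IpaVc4,
overriding only the platform hooks while reusing the common housekeeping.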
Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 .../raspberrypi.cpp => common/ipa_base.cpp} | 1305 +++++------------
 src/ipa/rpi/common/ipa_base.h               |  122 ++
 src/ipa/rpi/common/meson.build              |   17 +
 src/ipa/rpi/meson.build                     |    1 +
 src/ipa/rpi/vc4/meson.build                 |    4 +-
 src/ipa/rpi/vc4/vc4.cpp                     |  540 +++++++
 6 files changed, 1056 insertions(+), 933 deletions(-)
 rename src/ipa/rpi/{vc4/raspberrypi.cpp => common/ipa_base.cpp} (65%)
 create mode 100644 src/ipa/rpi/common/ipa_base.h
 create mode 100644 src/ipa/rpi/common/meson.build
 create mode 100644 src/ipa/rpi/vc4/vc4.cpp

diff --git a/src/ipa/rpi/vc4/raspberrypi.cpp b/src/ipa/rpi/common/ipa_base.cpp
similarity index 65%
rename from src/ipa/rpi/vc4/raspberrypi.cpp
rename to src/ipa/rpi/common/ipa_base.cpp
index 17ea5c046f4f..db7a0eb3a1ca 100644
--- a/src/ipa/rpi/vc4/raspberrypi.cpp
+++ b/src/ipa/rpi/common/ipa_base.cpp
@@ -1,60 +1,30 @@
 /* SPDX-License-Identifier: BSD-2-Clause */
 /*
- * Copyright (C) 2019-2021, Raspberry Pi Ltd
+ * Copyright (C) 2019-2023, Raspberry Pi Ltd
  *
- * rpi.cpp - Raspberry Pi Image Processing Algorithms
+ * ipa_base.cpp - Raspberry Pi IPA base class
  */
-#include
-#include
-#include
-#include
-#include
-#include
-#include
-#include
-#include
-#include
+#include "ipa_base.h"
-#include
+#include
 #include
-#include
 #include
-
 #include
-#include
-#include
-#include
-
-#include
-#include
-#include
-#include "libcamera/internal/mapped_framebuffer.h"
-
-#include "cam_helper/cam_helper.h"
 #include "controller/af_algorithm.h"
 #include "controller/af_status.h"
 #include "controller/agc_algorithm.h"
-#include "controller/agc_status.h"
-#include "controller/alsc_status.h"
 #include "controller/awb_algorithm.h"
 #include "controller/awb_status.h"
 #include "controller/black_level_status.h"
 #include "controller/ccm_algorithm.h"
 #include "controller/ccm_status.h"
 #include "controller/contrast_algorithm.h"
-#include "controller/contrast_status.h"
-#include "controller/controller.h"
 #include
"controller/denoise_algorithm.h" -#include "controller/denoise_status.h" -#include "controller/dpc_status.h" -#include "controller/geq_status.h" #include "controller/lux_status.h" -#include "controller/metadata.h" #include "controller/sharpen_algorithm.h" -#include "controller/sharpen_status.h" #include "controller/statistics.h" namespace libcamera { @@ -62,8 +32,7 @@ namespace libcamera { using namespace std::literals::chrono_literals; using utils::Duration; -/* Number of metadata objects available in the context list. */ -constexpr unsigned int numMetadataContexts = 16; +namespace { /* Number of frame length times to hold in the queue. */ constexpr unsigned int FrameLengthsQueueSize = 10; @@ -83,7 +52,7 @@ constexpr Duration defaultMaxFrameDuration = 250.0s; constexpr Duration controllerMinFrameDuration = 1.0s / 30.0; /* List of controls handled by the Raspberry Pi IPA */ -static const ControlInfoMap::Map ipaControls{ +const ControlInfoMap::Map ipaControls{ { &controls::AeEnable, ControlInfo(false, true) }, { &controls::ExposureTime, ControlInfo(0, 66666) }, { &controls::AnalogueGain, ControlInfo(1.0f, 16.0f) }, @@ -105,7 +74,7 @@ static const ControlInfoMap::Map ipaControls{ }; /* IPA controls handled conditionally, if the lens has a focus control */ -static const ControlInfoMap::Map ipaAfControls{ +const ControlInfoMap::Map ipaAfControls{ { &controls::AfMode, ControlInfo(controls::AfModeValues) }, { &controls::AfRange, ControlInfo(controls::AfRangeValues) }, { &controls::AfSpeed, ControlInfo(controls::AfSpeedValues) }, @@ -116,118 +85,23 @@ static const ControlInfoMap::Map ipaAfControls{ { &controls::LensPosition, ControlInfo(0.0f, 32.0f, 1.0f) } }; +} /* namespace */ + LOG_DEFINE_CATEGORY(IPARPI) namespace ipa::RPi { -class IPARPi : public IPARPiInterface +IpaBase::IpaBase() + : controller_(), frameCount_(0), mistrustCount_(0), lastRunTimestamp_(0), + firstStart_(true) { -public: - IPARPi() - : controller_(), frameCount_(0), checkCount_(0), mistrustCount_(0), 
- lastRunTimestamp_(0), lsTable_(nullptr), firstStart_(true), - lastTimeout_(0s) - { - } - - ~IPARPi() - { - if (lsTable_) - munmap(lsTable_, MaxLsGridSize); - } - - int init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) override; - void start(const ControlList &controls, StartResult *result) override; - void stop() override {} - - int configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, - ConfigResult *result) override; - void mapBuffers(const std::vector &buffers) override; - void unmapBuffers(const std::vector &ids) override; - void prepareIsp(const PrepareParams ¶ms) override; - void processStats(const ProcessParams ¶ms) override; - -private: - void setMode(const IPACameraSensorInfo &sensorInfo); - bool validateSensorControls(); - bool validateIspControls(); - bool validateLensControls(); - void applyControls(const ControlList &controls); - void prepare(const PrepareParams ¶ms); - void reportMetadata(unsigned int ipaContext); - void fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext); - RPiController::StatisticsPtr fillStatistics(bcm2835_isp_stats *stats) const; - void process(unsigned int bufferId, unsigned int ipaContext); - void setCameraTimeoutValue(); - void applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration); - void applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls); - void applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls); - void applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls); - void applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls); - void applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls); - void applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls); - void applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls); - void applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls); - void applySharpen(const struct SharpenStatus 
*sharpenStatus, ControlList &ctrls); - void applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls); - void applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls); - void applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls); - void resampleTable(uint16_t dest[], const std::vector &src, int destW, int destH); - - std::map buffers_; - - ControlInfoMap sensorCtrls_; - ControlInfoMap ispCtrls_; - ControlInfoMap lensCtrls_; - bool lensPresent_; - ControlList libcameraMetadata_; - - /* Camera sensor params. */ - CameraMode mode_; - - /* Raspberry Pi controller specific defines. */ - std::unique_ptr helper_; - RPiController::Controller controller_; - std::array rpiMetadata_; - - /* - * We count frames to decide if the frame must be hidden (e.g. from - * display) or mistrusted (i.e. not given to the control algos). - */ - uint64_t frameCount_; - - /* For checking the sequencing of Prepare/Process calls. */ - uint64_t checkCount_; - - /* How many frames we should avoid running control algos on. */ - unsigned int mistrustCount_; - - /* Number of frames that need to be dropped on startup. */ - unsigned int dropFrameCount_; - - /* Frame timestamp for the last run of the controller. */ - uint64_t lastRunTimestamp_; - - /* Do we run a Controller::process() for this frame? */ - bool processPending_; - - /* LS table allocation passed in from the pipeline handler. */ - SharedFD lsTableHandle_; - void *lsTable_; - - /* Distinguish the first camera start from others. */ - bool firstStart_; - - /* Frame duration (1/fps) limits. */ - Duration minFrameDuration_; - Duration maxFrameDuration_; +} - /* Track the frame length times over FrameLengthsQueueSize frames. 
*/ - std::deque frameLengths_; - Duration lastTimeout_; -}; +IpaBase::~IpaBase() +{ +} -int IPARPi::init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) +int32_t IpaBase::init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) { /* * Load the "helper" for this sensor. This tells us all the device specific stuff @@ -264,14 +138,6 @@ int IPARPi::init(const IPASettings &settings, const InitParams ¶ms, InitResu return ret; } - const std::string &target = controller_.getTarget(); - if (target != "bcm2835") { - LOG(IPARPI, Error) - << "Tuning data file target returned \"" << target << "\"" - << ", expected \"bcm2835\""; - return -EINVAL; - } - lensPresent_ = params.lensPresent; controller_.initialise(); @@ -282,10 +148,88 @@ int IPARPi::init(const IPASettings &settings, const InitParams ¶ms, InitResu ctrlMap.merge(ControlInfoMap::Map(ipaAfControls)); result->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls); - return 0; + return platformInit(params, result); +} + +int32_t IpaBase::configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, + ConfigResult *result) +{ + sensorCtrls_ = params.sensorControls; + + if (!validateSensorControls()) { + LOG(IPARPI, Error) << "Sensor control validation failed."; + return -1; + } + + if (lensPresent_) { + lensCtrls_ = params.lensControls; + if (!validateLensControls()) { + LOG(IPARPI, Warning) << "Lens validation failed, " + << "no lens control will be available."; + lensPresent_ = false; + } + } + + /* Setup a metadata ControlList to output metadata. */ + libcameraMetadata_ = ControlList(controls::controls); + + /* Re-assemble camera mode using the sensor info. */ + setMode(sensorInfo); + + mode_.transform = static_cast(params.transform); + + /* Pass the camera mode to the CamHelper to setup algorithms. 
*/ + helper_->setCameraMode(mode_); + + /* + * Initialise this ControlList correctly, even if empty, in case the IPA is + * running is isolation mode (passing the ControlList through the IPC layer). + */ + ControlList ctrls(sensorCtrls_); + + /* The pipeline handler passes out the mode's sensitivity. */ + result->modeSensitivity = mode_.sensitivity; + + if (firstStart_) { + /* Supply initial values for frame durations. */ + applyFrameDurations(defaultMinFrameDuration, defaultMaxFrameDuration); + + /* Supply initial values for gain and exposure. */ + AgcStatus agcStatus; + agcStatus.shutterTime = defaultExposureTime; + agcStatus.analogueGain = defaultAnalogueGain; + applyAGC(&agcStatus, ctrls); + } + + result->controls = std::move(ctrls); + + /* + * Apply the correct limits to the exposure, gain and frame duration controls + * based on the current sensor mode. + */ + ControlInfoMap::Map ctrlMap = ipaControls; + ctrlMap[&controls::FrameDurationLimits] = + ControlInfo(static_cast(mode_.minFrameDuration.get()), + static_cast(mode_.maxFrameDuration.get())); + + ctrlMap[&controls::AnalogueGain] = + ControlInfo(static_cast(mode_.minAnalogueGain), + static_cast(mode_.maxAnalogueGain)); + + ctrlMap[&controls::ExposureTime] = + ControlInfo(static_cast(mode_.minShutter.get()), + static_cast(mode_.maxShutter.get())); + + /* Declare Autofocus controls, only if we have a controllable lens */ + if (lensPresent_) + ctrlMap.merge(ControlInfoMap::Map(ipaAfControls)); + + result->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls); + + return platformConfigure(params, result); } -void IPARPi::start(const ControlList &controls, StartResult *result) +void IpaBase::start(const ControlList &controls, StartResult *result) { RPiController::Metadata metadata; @@ -320,7 +264,6 @@ void IPARPi::start(const ControlList &controls, StartResult *result) * or merely a mode switch in a running system. 
*/ frameCount_ = 0; - checkCount_ = 0; if (firstStart_) { dropFrameCount_ = helper_->hideFramesStartup(); mistrustCount_ = helper_->mistrustFramesStartup(); @@ -363,7 +306,142 @@ void IPARPi::start(const ControlList &controls, StartResult *result) lastRunTimestamp_ = 0; } -void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo) +void IpaBase::mapBuffers(const std::vector &buffers) +{ + for (const IPABuffer &buffer : buffers) { + const FrameBuffer fb(buffer.planes); + buffers_.emplace(buffer.id, + MappedFrameBuffer(&fb, MappedFrameBuffer::MapFlag::ReadWrite)); + } +} + +void IpaBase::unmapBuffers(const std::vector &ids) +{ + for (unsigned int id : ids) { + auto it = buffers_.find(id); + if (it == buffers_.end()) + continue; + + buffers_.erase(id); + } +} + +void IpaBase::prepareIsp(const PrepareParams ¶ms) +{ + applyControls(params.requestControls); + + /* + * At start-up, or after a mode-switch, we may want to + * avoid running the control algos for a few frames in case + * they are "unreliable". + */ + int64_t frameTimestamp = params.sensorControls.get(controls::SensorTimestamp).value_or(0); + unsigned int ipaContext = params.ipaContext % rpiMetadata_.size(); + RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; + Span embeddedBuffer; + + rpiMetadata.clear(); + fillDeviceStatus(params.sensorControls, ipaContext); + + if (params.buffers.embedded) { + /* + * Pipeline handler has supplied us with an embedded data buffer, + * we must pass it to the CamHelper for parsing. + */ + auto it = buffers_.find(params.buffers.embedded); + ASSERT(it != buffers_.end()); + embeddedBuffer = it->second.planes()[0]; + } + + /* + * AGC wants to know the algorithm status from the time it actioned the + * sensor exposure/gain changes. So fetch it from the metadata list + * indexed by the IPA cookie returned, and put it in the current frame + * metadata. 
+ */ + AgcStatus agcStatus; + RPiController::Metadata &delayedMetadata = rpiMetadata_[params.delayContext]; + if (!delayedMetadata.get("agc.status", agcStatus)) + rpiMetadata.set("agc.delayed_status", agcStatus); + + /* + * This may overwrite the DeviceStatus using values from the sensor + * metadata, and may also do additional custom processing. + */ + helper_->prepare(embeddedBuffer, rpiMetadata); + + /* Allow a 10% margin on the comparison below. */ + Duration delta = (frameTimestamp - lastRunTimestamp_) * 1.0ns; + if (lastRunTimestamp_ && frameCount_ > dropFrameCount_ && + delta < controllerMinFrameDuration * 0.9) { + /* + * Ensure we merge the previous frame's metadata with the current + * frame. This will not overwrite exposure/gain values for the + * current frame, or any other bits of metadata that were added + * in helper_->Prepare(). + */ + RPiController::Metadata &lastMetadata = + rpiMetadata_[(ipaContext ? ipaContext : rpiMetadata_.size()) - 1]; + rpiMetadata.mergeCopy(lastMetadata); + processPending_ = false; + } else { + processPending_ = true; + lastRunTimestamp_ = frameTimestamp; + } + + /* + * If a statistics buffer has been passed in, call processStats + * directly now before prepare() since the statistics are available in-line + * with the Bayer frame. + */ + if (params.buffers.stats) + processStats({ params.buffers, params.ipaContext }); + + /* Do we need/want to call prepare? */ + if (processPending_) { + controller_.prepare(&rpiMetadata); + /* Actually prepare the ISP parameters for the frame. */ + platformPrepareIsp(params, rpiMetadata); + } + + frameCount_++; + + /* Ready to push the input buffer into the ISP. 
*/ + prepareIspComplete.emit(params.buffers); +} + +void IpaBase::processStats(const ProcessParams ¶ms) +{ + unsigned int ipaContext = params.ipaContext % rpiMetadata_.size(); + + if (processPending_ && frameCount_ > mistrustCount_) { + RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; + + auto it = buffers_.find(params.buffers.stats); + if (it == buffers_.end()) { + LOG(IPARPI, Error) << "Could not find stats buffer!"; + return; + } + + RPiController::StatisticsPtr statistics = platformProcessStats(it->second.planes()[0]); + + helper_->process(statistics, rpiMetadata); + controller_.process(statistics, &rpiMetadata); + + struct AgcStatus agcStatus; + if (rpiMetadata.get("agc.status", agcStatus) == 0) { + ControlList ctrls(sensorCtrls_); + applyAGC(&agcStatus, ctrls); + setDelayedControls.emit(ctrls, ipaContext); + setCameraTimeoutValue(); + } + } + + reportMetadata(ipaContext); + processStatsComplete.emit(params.buffers); +} + +void IpaBase::setMode(const IPACameraSensorInfo &sensorInfo) { mode_.bitdepth = sensorInfo.bitsPerPixel; mode_.width = sensorInfo.outputSize.width; @@ -393,7 +471,7 @@ void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo) mode_.binY = std::min(2, static_cast(mode_.scaleY)); /* The noise factor is the square root of the total binning factor. */ - mode_.noiseFactor = sqrt(mode_.binX * mode_.binY); + mode_.noiseFactor = std::sqrt(mode_.binX * mode_.binY); /* * Calculate the line length as the ratio between the line length in @@ -432,327 +510,46 @@ void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo) mode_.minFrameDuration, mode_.maxFrameDuration); } -int IPARPi::configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, - ConfigResult *result) +void IpaBase::setCameraTimeoutValue() { - sensorCtrls_ = params.sensorControls; - ispCtrls_ = params.ispControls; + /* + * Take the maximum value of the exposure queue as the camera timeout + * value to pass back to the pipeline handler. 
Only signal if it has changed + * from the last set value. + */ + auto max = std::max_element(frameLengths_.begin(), frameLengths_.end()); - if (!validateSensorControls()) { - LOG(IPARPI, Error) << "Sensor control validation failed."; - return -1; + if (*max != lastTimeout_) { + setCameraTimeout.emit(max->get()); + lastTimeout_ = *max; } +} - if (!validateIspControls()) { - LOG(IPARPI, Error) << "ISP control validation failed."; - return -1; - } +bool IpaBase::validateSensorControls() +{ + static const uint32_t ctrls[] = { + V4L2_CID_ANALOGUE_GAIN, + V4L2_CID_EXPOSURE, + V4L2_CID_VBLANK, + V4L2_CID_HBLANK, + }; - if (lensPresent_) { - lensCtrls_ = params.lensControls; - if (!validateLensControls()) { - LOG(IPARPI, Warning) << "Lens validation failed, " - << "no lens control will be available."; - lensPresent_ = false; + for (auto c : ctrls) { + if (sensorCtrls_.find(c) == sensorCtrls_.end()) { + LOG(IPARPI, Error) << "Unable to find sensor control " + << utils::hex(c); + return false; } } - /* Setup a metadata ControlList to output metadata. */ - libcameraMetadata_ = ControlList(controls::controls); - - /* Re-assemble camera mode using the sensor info. */ - setMode(sensorInfo); - - mode_.transform = static_cast(params.transform); - - /* Store the lens shading table pointer and handle if available. */ - if (params.lsTableHandle.isValid()) { - /* Remove any previous table, if there was one. */ - if (lsTable_) { - munmap(lsTable_, MaxLsGridSize); - lsTable_ = nullptr; - } - - /* Map the LS table buffer into user space. */ - lsTableHandle_ = std::move(params.lsTableHandle); - if (lsTableHandle_.isValid()) { - lsTable_ = mmap(nullptr, MaxLsGridSize, PROT_READ | PROT_WRITE, - MAP_SHARED, lsTableHandle_.get(), 0); + return true; +} - if (lsTable_ == MAP_FAILED) { - LOG(IPARPI, Error) << "dmaHeap mmap failure for LS table."; - lsTable_ = nullptr; - } - } - } - - /* Pass the camera mode to the CamHelper to setup algorithms. 
*/ - helper_->setCameraMode(mode_); - - /* - * Initialise this ControlList correctly, even if empty, in case the IPA is - * running is isolation mode (passing the ControlList through the IPC layer). - */ - ControlList ctrls(sensorCtrls_); - - /* The pipeline handler passes out the mode's sensitivity. */ - result->modeSensitivity = mode_.sensitivity; - - if (firstStart_) { - /* Supply initial values for frame durations. */ - applyFrameDurations(defaultMinFrameDuration, defaultMaxFrameDuration); - - /* Supply initial values for gain and exposure. */ - AgcStatus agcStatus; - agcStatus.shutterTime = defaultExposureTime; - agcStatus.analogueGain = defaultAnalogueGain; - applyAGC(&agcStatus, ctrls); - } - - result->controls = std::move(ctrls); - - /* - * Apply the correct limits to the exposure, gain and frame duration controls - * based on the current sensor mode. - */ - ControlInfoMap::Map ctrlMap = ipaControls; - ctrlMap[&controls::FrameDurationLimits] = - ControlInfo(static_cast(mode_.minFrameDuration.get()), - static_cast(mode_.maxFrameDuration.get())); - - ctrlMap[&controls::AnalogueGain] = - ControlInfo(static_cast(mode_.minAnalogueGain), - static_cast(mode_.maxAnalogueGain)); - - ctrlMap[&controls::ExposureTime] = - ControlInfo(static_cast(mode_.minShutter.get()), - static_cast(mode_.maxShutter.get())); - - /* Declare Autofocus controls, only if we have a controllable lens */ - if (lensPresent_) - ctrlMap.merge(ControlInfoMap::Map(ipaAfControls)); - - result->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls); - return 0; -} - -void IPARPi::mapBuffers(const std::vector &buffers) -{ - for (const IPABuffer &buffer : buffers) { - const FrameBuffer fb(buffer.planes); - buffers_.emplace(buffer.id, - MappedFrameBuffer(&fb, MappedFrameBuffer::MapFlag::ReadWrite)); - } -} - -void IPARPi::unmapBuffers(const std::vector &ids) -{ - for (unsigned int id : ids) { - auto it = buffers_.find(id); - if (it == buffers_.end()) - continue; - - buffers_.erase(id); - 
} -} - -void IPARPi::processStats(const ProcessParams ¶ms) -{ - unsigned int context = params.ipaContext % rpiMetadata_.size(); - - if (++checkCount_ != frameCount_) /* assert here? */ - LOG(IPARPI, Error) << "WARNING: Prepare/Process mismatch!!!"; - if (processPending_ && frameCount_ > mistrustCount_) - process(params.buffers.stats, context); - - reportMetadata(context); - processStatsComplete.emit(params.buffers); -} - - -void IPARPi::prepareIsp(const PrepareParams ¶ms) -{ - applyControls(params.requestControls); - - /* - * At start-up, or after a mode-switch, we may want to - * avoid running the control algos for a few frames in case - * they are "unreliable". - */ - prepare(params); - frameCount_++; - - /* Ready to push the input buffer into the ISP. */ - prepareIspComplete.emit(params.buffers); -} - -void IPARPi::reportMetadata(unsigned int ipaContext) -{ - RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; - std::unique_lock lock(rpiMetadata); - - /* - * Certain information about the current frame and how it will be - * processed can be extracted and placed into the libcamera metadata - * buffer, where an application could query it. 
- */ - DeviceStatus *deviceStatus = rpiMetadata.getLocked("device.status"); - if (deviceStatus) { - libcameraMetadata_.set(controls::ExposureTime, - deviceStatus->shutterSpeed.get()); - libcameraMetadata_.set(controls::AnalogueGain, deviceStatus->analogueGain); - libcameraMetadata_.set(controls::FrameDuration, - helper_->exposure(deviceStatus->frameLength, deviceStatus->lineLength).get()); - if (deviceStatus->sensorTemperature) - libcameraMetadata_.set(controls::SensorTemperature, *deviceStatus->sensorTemperature); - if (deviceStatus->lensPosition) - libcameraMetadata_.set(controls::LensPosition, *deviceStatus->lensPosition); - } - - AgcStatus *agcStatus = rpiMetadata.getLocked("agc.status"); - if (agcStatus) { - libcameraMetadata_.set(controls::AeLocked, agcStatus->locked); - libcameraMetadata_.set(controls::DigitalGain, agcStatus->digitalGain); - } - - LuxStatus *luxStatus = rpiMetadata.getLocked("lux.status"); - if (luxStatus) - libcameraMetadata_.set(controls::Lux, luxStatus->lux); - - AwbStatus *awbStatus = rpiMetadata.getLocked("awb.status"); - if (awbStatus) { - libcameraMetadata_.set(controls::ColourGains, { static_cast(awbStatus->gainR), - static_cast(awbStatus->gainB) }); - libcameraMetadata_.set(controls::ColourTemperature, awbStatus->temperatureK); - } - - BlackLevelStatus *blackLevelStatus = rpiMetadata.getLocked("black_level.status"); - if (blackLevelStatus) - libcameraMetadata_.set(controls::SensorBlackLevels, - { static_cast(blackLevelStatus->blackLevelR), - static_cast(blackLevelStatus->blackLevelG), - static_cast(blackLevelStatus->blackLevelG), - static_cast(blackLevelStatus->blackLevelB) }); - - RPiController::FocusRegions *focusStatus = - rpiMetadata.getLocked("focus.status"); - if (focusStatus) { - /* - * Calculate the average FoM over the central (symmetric) positions - * to give an overall scene FoM. This can change later if it is - * not deemed suitable. 
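The central-window averaging of the focus figure of merit described above can be sketched in isolation. This is a hedged illustration, not the IPA's code: the flat `grid` vector, the `centralFoM` name and the direct `>> 16` scaling stand in for `RPiController::FocusRegions` and its accessors.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

/*
 * Average a grid of focus FoM values over the central region (middle
 * third of rows, middle half of columns), mirroring the windowing used
 * in reportMetadata(). Grid layout and types are simplified stand-ins.
 */
static uint32_t centralFoM(const std::vector<uint64_t> &grid,
			   unsigned int cols, unsigned int rows)
{
	uint64_t sum = 0;
	unsigned int numRegions = 0;

	for (unsigned int r = rows / 3; r < rows - rows / 3; r++) {
		for (unsigned int c = cols / 4; c < cols - cols / 4; c++) {
			sum += grid[r * cols + c];
			numRegions++;
		}
	}

	/* Scale the average down to a 16-bit result, as the IPA does. */
	return (sum / numRegions) >> 16;
}
```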
- */ - libcamera::Size size = focusStatus->size(); - unsigned rows = size.height; - unsigned cols = size.width; - - uint64_t sum = 0; - unsigned int numRegions = 0; - for (unsigned r = rows / 3; r < rows - rows / 3; ++r) { - for (unsigned c = cols / 4; c < cols - cols / 4; ++c) { - sum += focusStatus->get({ (int)c, (int)r }).val; - numRegions++; - } - } - - uint32_t focusFoM = (sum / numRegions) >> 16; - libcameraMetadata_.set(controls::FocusFoM, focusFoM); - } - - CcmStatus *ccmStatus = rpiMetadata.getLocked("ccm.status"); - if (ccmStatus) { - float m[9]; - for (unsigned int i = 0; i < 9; i++) - m[i] = ccmStatus->matrix[i]; - libcameraMetadata_.set(controls::ColourCorrectionMatrix, m); - } - - const AfStatus *afStatus = rpiMetadata.getLocked("af.status"); - if (afStatus) { - int32_t s, p; - switch (afStatus->state) { - case AfState::Scanning: - s = controls::AfStateScanning; - break; - case AfState::Focused: - s = controls::AfStateFocused; - break; - case AfState::Failed: - s = controls::AfStateFailed; - break; - default: - s = controls::AfStateIdle; - } - switch (afStatus->pauseState) { - case AfPauseState::Pausing: - p = controls::AfPauseStatePausing; - break; - case AfPauseState::Paused: - p = controls::AfPauseStatePaused; - break; - default: - p = controls::AfPauseStateRunning; - } - libcameraMetadata_.set(controls::AfState, s); - libcameraMetadata_.set(controls::AfPauseState, p); - } - - metadataReady.emit(libcameraMetadata_); -} - -bool IPARPi::validateSensorControls() -{ - static const uint32_t ctrls[] = { - V4L2_CID_ANALOGUE_GAIN, - V4L2_CID_EXPOSURE, - V4L2_CID_VBLANK, - V4L2_CID_HBLANK, - }; - - for (auto c : ctrls) { - if (sensorCtrls_.find(c) == sensorCtrls_.end()) { - LOG(IPARPI, Error) << "Unable to find sensor control " - << utils::hex(c); - return false; - } - } - - return true; -} - -bool IPARPi::validateIspControls() -{ - static const uint32_t ctrls[] = { - V4L2_CID_RED_BALANCE, - V4L2_CID_BLUE_BALANCE, - V4L2_CID_DIGITAL_GAIN, - 
V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, - V4L2_CID_USER_BCM2835_ISP_GAMMA, - V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, - V4L2_CID_USER_BCM2835_ISP_GEQ, - V4L2_CID_USER_BCM2835_ISP_DENOISE, - V4L2_CID_USER_BCM2835_ISP_SHARPEN, - V4L2_CID_USER_BCM2835_ISP_DPC, - V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, - V4L2_CID_USER_BCM2835_ISP_CDN, - }; - - for (auto c : ctrls) { - if (ispCtrls_.find(c) == ispCtrls_.end()) { - LOG(IPARPI, Error) << "Unable to find ISP control " - << utils::hex(c); - return false; - } - } - - return true; -} - -bool IPARPi::validateLensControls() -{ - if (lensCtrls_.find(V4L2_CID_FOCUS_ABSOLUTE) == lensCtrls_.end()) { - LOG(IPARPI, Error) << "Unable to find Lens control V4L2_CID_FOCUS_ABSOLUTE"; - return false; +bool IpaBase::validateLensControls() +{ + if (lensCtrls_.find(V4L2_CID_FOCUS_ABSOLUTE) == lensCtrls_.end()) { + LOG(IPARPI, Error) << "Unable to find Lens control V4L2_CID_FOCUS_ABSOLUTE"; + return false; } return true; @@ -821,7 +618,7 @@ static const std::map AfPauseTable { controls::AfPauseResume, RPiController::AfAlgorithm::AfPauseResume }, }; -void IPARPi::applyControls(const ControlList &controls) +void IpaBase::applyControls(const ControlList &controls) { using RPiController::AfAlgorithm; @@ -841,7 +638,7 @@ void IPARPi::applyControls(const ControlList &controls) if (mode == AfModeTable.end()) { LOG(IPARPI, Error) << "AF mode " << idx << " not recognised"; - } else + } else if (af) af->setMode(mode->second); } @@ -1113,6 +910,10 @@ void IPARPi::applyControls(const ControlList &controls) case controls::NOISE_REDUCTION_MODE: { RPiController::DenoiseAlgorithm *sdn = dynamic_cast( controller_.getAlgorithm("SDN")); + /* Some platforms may have a combined "denoise" algorithm instead. 
*/ + if (!sdn) + sdn = dynamic_cast( + controller_.getAlgorithm("denoise")); if (!sdn) { LOG(IPARPI, Warning) << "Could not set NOISE_REDUCTION_MODE - no SDN algorithm"; @@ -1249,125 +1050,12 @@ void IPARPi::applyControls(const ControlList &controls) break; } } -} - -void IPARPi::prepare(const PrepareParams ¶ms) -{ - int64_t frameTimestamp = params.sensorControls.get(controls::SensorTimestamp).value_or(0); - unsigned int ipaContext = params.ipaContext % rpiMetadata_.size(); - RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; - Span embeddedBuffer; - - rpiMetadata.clear(); - fillDeviceStatus(params.sensorControls, ipaContext); - - if (params.buffers.embedded) { - /* - * Pipeline handler has supplied us with an embedded data buffer, - * we must pass it to the CamHelper for parsing. - */ - auto it = buffers_.find(params.buffers.embedded); - ASSERT(it != buffers_.end()); - embeddedBuffer = it->second.planes()[0]; - } - - /* - * AGC wants to know the algorithm status from the time it actioned the - * sensor exposure/gain changes. So fetch it from the metadata list - * indexed by the IPA cookie returned, and put it in the current frame - * metadata. - */ - AgcStatus agcStatus; - RPiController::Metadata &delayedMetadata = rpiMetadata_[params.delayContext]; - if (!delayedMetadata.get("agc.status", agcStatus)) - rpiMetadata.set("agc.delayed_status", agcStatus); - - /* - * This may overwrite the DeviceStatus using values from the sensor - * metadata, and may also do additional custom processing. - */ - helper_->prepare(embeddedBuffer, rpiMetadata); - - /* Allow a 10% margin on the comparison below. */ - Duration delta = (frameTimestamp - lastRunTimestamp_) * 1.0ns; - if (lastRunTimestamp_ && frameCount_ > dropFrameCount_ && - delta < controllerMinFrameDuration * 0.9) { - /* - * Ensure we merge the previous frame's metadata with the current - * frame. 
This will not overwrite exposure/gain values for the - * current frame, or any other bits of metadata that were added - * in helper_->Prepare(). - */ - RPiController::Metadata &lastMetadata = - rpiMetadata_[(ipaContext ? ipaContext : rpiMetadata_.size()) - 1]; - rpiMetadata.mergeCopy(lastMetadata); - processPending_ = false; - return; - } - - lastRunTimestamp_ = frameTimestamp; - processPending_ = true; - - ControlList ctrls(ispCtrls_); - - controller_.prepare(&rpiMetadata); - - /* Lock the metadata buffer to avoid constant locks/unlocks. */ - std::unique_lock lock(rpiMetadata); - - AwbStatus *awbStatus = rpiMetadata.getLocked("awb.status"); - if (awbStatus) - applyAWB(awbStatus, ctrls); - - CcmStatus *ccmStatus = rpiMetadata.getLocked("ccm.status"); - if (ccmStatus) - applyCCM(ccmStatus, ctrls); - - AgcStatus *dgStatus = rpiMetadata.getLocked("agc.status"); - if (dgStatus) - applyDG(dgStatus, ctrls); - - AlscStatus *lsStatus = rpiMetadata.getLocked("alsc.status"); - if (lsStatus) - applyLS(lsStatus, ctrls); - - ContrastStatus *contrastStatus = rpiMetadata.getLocked("contrast.status"); - if (contrastStatus) - applyGamma(contrastStatus, ctrls); - - BlackLevelStatus *blackLevelStatus = rpiMetadata.getLocked("black_level.status"); - if (blackLevelStatus) - applyBlackLevel(blackLevelStatus, ctrls); - - GeqStatus *geqStatus = rpiMetadata.getLocked("geq.status"); - if (geqStatus) - applyGEQ(geqStatus, ctrls); - - DenoiseStatus *denoiseStatus = rpiMetadata.getLocked("denoise.status"); - if (denoiseStatus) - applyDenoise(denoiseStatus, ctrls); - - SharpenStatus *sharpenStatus = rpiMetadata.getLocked("sharpen.status"); - if (sharpenStatus) - applySharpen(sharpenStatus, ctrls); - - DpcStatus *dpcStatus = rpiMetadata.getLocked("dpc.status"); - if (dpcStatus) - applyDPC(dpcStatus, ctrls); - - const AfStatus *afStatus = rpiMetadata.getLocked("af.status"); - if (afStatus) { - ControlList lensctrls(lensCtrls_); - applyAF(afStatus, lensctrls); - if (!lensctrls.empty()) - 
setLensControls.emit(lensctrls); - } - if (!ctrls.empty()) - setIspControls.emit(ctrls); + /* Give derived classes a chance to examine the new controls. */ + handleControls(controls); } -void IPARPi::fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext) +void IpaBase::fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext) { DeviceStatus deviceStatus = {}; @@ -1391,103 +1079,121 @@ void IPARPi::fillDeviceStatus(const ControlList &sensorControls, unsigned int ip rpiMetadata_[ipaContext].set("device.status", deviceStatus); } -RPiController::StatisticsPtr IPARPi::fillStatistics(bcm2835_isp_stats *stats) const -{ - using namespace RPiController; - - const Controller::HardwareConfig &hw = controller_.getHardwareConfig(); - unsigned int i; - StatisticsPtr statistics = - std::make_unique(Statistics::AgcStatsPos::PreWb, Statistics::ColourStatsPos::PostLsc); - - /* RGB histograms are not used, so do not populate them. */ - statistics->yHist = RPiController::Histogram(stats->hist[0].g_hist, - hw.numHistogramBins); - - /* All region sums are based on a 16-bit normalised pipeline bit-depth. 
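The bit-depth normalisation mentioned above amounts to a left shift by the difference between the normalisation depth and the hardware pipeline depth. A minimal sketch; `normaliseSum` is a hypothetical helper and the 13-bit example width is an assumption, not the real `Statistics` API:

```cpp
#include <cassert>
#include <cstdint>

/*
 * Normalise a region sum captured at the hardware pipeline bit-depth up
 * to the common bit-depth the controller algorithms expect, as done with
 * the "<< scale" in fillStatistics(). The pipeline width here (e.g. 13)
 * is illustrative only.
 */
static uint64_t normaliseSum(uint64_t sum, unsigned int pipelineWidth,
			     unsigned int normalisationFactorPow2 = 16)
{
	unsigned int scale = normalisationFactorPow2 - pipelineWidth;
	return sum << scale;
}
```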
*/ - unsigned int scale = Statistics::NormalisationFactorPow2 - hw.pipelineWidth; - - statistics->awbRegions.init(hw.awbRegions); - for (i = 0; i < statistics->awbRegions.numRegions(); i++) - statistics->awbRegions.set(i, { { stats->awb_stats[i].r_sum << scale, - stats->awb_stats[i].g_sum << scale, - stats->awb_stats[i].b_sum << scale }, - stats->awb_stats[i].counted, - stats->awb_stats[i].notcounted }); - - statistics->agcRegions.init(hw.agcRegions); - for (i = 0; i < statistics->agcRegions.numRegions(); i++) - statistics->agcRegions.set(i, { { stats->agc_stats[i].r_sum << scale, - stats->agc_stats[i].g_sum << scale, - stats->agc_stats[i].b_sum << scale }, - stats->agc_stats[i].counted, - stats->awb_stats[i].notcounted }); - - statistics->focusRegions.init(hw.focusRegions); - for (i = 0; i < statistics->focusRegions.numRegions(); i++) - statistics->focusRegions.set(i, { stats->focus_stats[i].contrast_val[1][1] / 1000, - stats->focus_stats[i].contrast_val_num[1][1], - stats->focus_stats[i].contrast_val_num[1][0] }); - return statistics; -} - -void IPARPi::process(unsigned int bufferId, unsigned int ipaContext) +void IpaBase::reportMetadata(unsigned int ipaContext) { RPiController::Metadata &rpiMetadata = rpiMetadata_[ipaContext]; + std::unique_lock lock(rpiMetadata); - auto it = buffers_.find(bufferId); - if (it == buffers_.end()) { - LOG(IPARPI, Error) << "Could not find stats buffer!"; - return; + /* + * Certain information about the current frame and how it will be + * processed can be extracted and placed into the libcamera metadata + * buffer, where an application could query it. 
+ */ + DeviceStatus *deviceStatus = rpiMetadata.getLocked("device.status"); + if (deviceStatus) { + libcameraMetadata_.set(controls::ExposureTime, + deviceStatus->shutterSpeed.get()); + libcameraMetadata_.set(controls::AnalogueGain, deviceStatus->analogueGain); + libcameraMetadata_.set(controls::FrameDuration, + helper_->exposure(deviceStatus->frameLength, deviceStatus->lineLength).get()); + if (deviceStatus->sensorTemperature) + libcameraMetadata_.set(controls::SensorTemperature, *deviceStatus->sensorTemperature); + if (deviceStatus->lensPosition) + libcameraMetadata_.set(controls::LensPosition, *deviceStatus->lensPosition); } - Span mem = it->second.planes()[0]; - bcm2835_isp_stats *stats = reinterpret_cast(mem.data()); - RPiController::StatisticsPtr statistics = fillStatistics(stats); + AgcStatus *agcStatus = rpiMetadata.getLocked("agc.status"); + if (agcStatus) { + libcameraMetadata_.set(controls::AeLocked, agcStatus->locked); + libcameraMetadata_.set(controls::DigitalGain, agcStatus->digitalGain); + } - /* Save the focus stats in the metadata structure to report out later. 
*/ - rpiMetadata_[ipaContext].set("focus.status", statistics->focusRegions); + LuxStatus *luxStatus = rpiMetadata.getLocked("lux.status"); + if (luxStatus) + libcameraMetadata_.set(controls::Lux, luxStatus->lux); - helper_->process(statistics, rpiMetadata); - controller_.process(statistics, &rpiMetadata); + AwbStatus *awbStatus = rpiMetadata.getLocked("awb.status"); + if (awbStatus) { + libcameraMetadata_.set(controls::ColourGains, { static_cast(awbStatus->gainR), + static_cast(awbStatus->gainB) }); + libcameraMetadata_.set(controls::ColourTemperature, awbStatus->temperatureK); + } - struct AgcStatus agcStatus; - if (rpiMetadata.get("agc.status", agcStatus) == 0) { - ControlList ctrls(sensorCtrls_); - applyAGC(&agcStatus, ctrls); + BlackLevelStatus *blackLevelStatus = rpiMetadata.getLocked("black_level.status"); + if (blackLevelStatus) + libcameraMetadata_.set(controls::SensorBlackLevels, + { static_cast(blackLevelStatus->blackLevelR), + static_cast(blackLevelStatus->blackLevelG), + static_cast(blackLevelStatus->blackLevelG), + static_cast(blackLevelStatus->blackLevelB) }); - setDelayedControls.emit(ctrls, ipaContext); - setCameraTimeoutValue(); - } -} + RPiController::FocusRegions *focusStatus = + rpiMetadata.getLocked("focus.status"); + if (focusStatus) { + /* + * Calculate the average FoM over the central (symmetric) positions + * to give an overall scene FoM. This can change later if it is + * not deemed suitable. + */ + libcamera::Size size = focusStatus->size(); + unsigned rows = size.height; + unsigned cols = size.width; -void IPARPi::setCameraTimeoutValue() -{ - /* - * Take the maximum value of the exposure queue as the camera timeout - * value to pass back to the pipeline handler. Only signal if it has changed - * from the last set value. 
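The timeout logic described above, i.e. take the largest queued frame length and signal only on change, can be sketched as follows. `updateTimeout` and its callback parameter are hypothetical stand-ins for `setCameraTimeoutValue()` and the `setCameraTimeout` signal, and `std::chrono` replaces libcamera's `utils::Duration`:

```cpp
#include <algorithm>
#include <cassert>
#include <chrono>
#include <deque>

using namespace std::chrono_literals;

/*
 * Derive the camera timeout from the largest frame length currently
 * queued, emitting the callback only when the value changes.
 */
template<typename Callback>
static void updateTimeout(const std::deque<std::chrono::nanoseconds> &frameLengths,
			  std::chrono::nanoseconds &lastTimeout, Callback emit)
{
	auto max = std::max_element(frameLengths.begin(), frameLengths.end());

	if (max != frameLengths.end() && *max != lastTimeout) {
		emit(*max);
		lastTimeout = *max;
	}
}
```

A second call with an unchanged queue emits nothing, matching the "only signal if it has changed" behaviour.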
- */ - auto max = std::max_element(frameLengths_.begin(), frameLengths_.end()); + uint64_t sum = 0; + unsigned int numRegions = 0; + for (unsigned r = rows / 3; r < rows - rows / 3; ++r) { + for (unsigned c = cols / 4; c < cols - cols / 4; ++c) { + sum += focusStatus->get({ (int)c, (int)r }).val; + numRegions++; + } + } - if (*max != lastTimeout_) { - setCameraTimeout.emit(max->get()); - lastTimeout_ = *max; + uint32_t focusFoM = (sum / numRegions) >> 16; + libcameraMetadata_.set(controls::FocusFoM, focusFoM); } -} -void IPARPi::applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls) -{ - LOG(IPARPI, Debug) << "Applying WB R: " << awbStatus->gainR << " B: " - << awbStatus->gainB; + CcmStatus *ccmStatus = rpiMetadata.getLocked("ccm.status"); + if (ccmStatus) { + float m[9]; + for (unsigned int i = 0; i < 9; i++) + m[i] = ccmStatus->matrix[i]; + libcameraMetadata_.set(controls::ColourCorrectionMatrix, m); + } + + const AfStatus *afStatus = rpiMetadata.getLocked("af.status"); + if (afStatus) { + int32_t s, p; + switch (afStatus->state) { + case AfState::Scanning: + s = controls::AfStateScanning; + break; + case AfState::Focused: + s = controls::AfStateFocused; + break; + case AfState::Failed: + s = controls::AfStateFailed; + break; + default: + s = controls::AfStateIdle; + } + switch (afStatus->pauseState) { + case AfPauseState::Pausing: + p = controls::AfPauseStatePausing; + break; + case AfPauseState::Paused: + p = controls::AfPauseStatePaused; + break; + default: + p = controls::AfPauseStateRunning; + } + libcameraMetadata_.set(controls::AfState, s); + libcameraMetadata_.set(controls::AfPauseState, p); + } - ctrls.set(V4L2_CID_RED_BALANCE, - static_cast(awbStatus->gainR * 1000)); - ctrls.set(V4L2_CID_BLUE_BALANCE, - static_cast(awbStatus->gainB * 1000)); + metadataReady.emit(libcameraMetadata_); } -void IPARPi::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration) +void IpaBase::applyFrameDurations(Duration minFrameDuration, Duration 
maxFrameDuration) { /* * This will only be applied once AGC recalculations occur. @@ -1519,7 +1225,7 @@ void IPARPi::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDur agc->setMaxShutter(maxShutter); } -void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls) +void IpaBase::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls) { const int32_t minGainCode = helper_->gainCode(mode_.minAnalogueGain); const int32_t maxGainCode = helper_->gainCode(mode_.maxAnalogueGain); @@ -1571,269 +1277,6 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls) helper_->hblankToLineLength(hblank))); } -void IPARPi::applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls) -{ - ctrls.set(V4L2_CID_DIGITAL_GAIN, - static_cast(dgStatus->digitalGain * 1000)); -} - -void IPARPi::applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls) -{ - bcm2835_isp_custom_ccm ccm; - - for (int i = 0; i < 9; i++) { - ccm.ccm.ccm[i / 3][i % 3].den = 1000; - ccm.ccm.ccm[i / 3][i % 3].num = 1000 * ccmStatus->matrix[i]; - } - - ccm.enabled = 1; - ccm.ccm.offsets[0] = ccm.ccm.offsets[1] = ccm.ccm.offsets[2] = 0; - - ControlValue c(Span{ reinterpret_cast(&ccm), - sizeof(ccm) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, c); -} - -void IPARPi::applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls) -{ - const unsigned int numGammaPoints = controller_.getHardwareConfig().numGammaPoints; - struct bcm2835_isp_gamma gamma; - - for (unsigned int i = 0; i < numGammaPoints - 1; i++) { - int x = i < 16 ? i * 1024 - : (i < 24 ? 
(i - 16) * 2048 + 16384 - : (i - 24) * 4096 + 32768); - gamma.x[i] = x; - gamma.y[i] = std::min(65535, contrastStatus->gammaCurve.eval(x)); - } - - gamma.x[numGammaPoints - 1] = 65535; - gamma.y[numGammaPoints - 1] = 65535; - gamma.enabled = 1; - - ControlValue c(Span{ reinterpret_cast(&gamma), - sizeof(gamma) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_GAMMA, c); -} - -void IPARPi::applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls) -{ - bcm2835_isp_black_level blackLevel; - - blackLevel.enabled = 1; - blackLevel.black_level_r = blackLevelStatus->blackLevelR; - blackLevel.black_level_g = blackLevelStatus->blackLevelG; - blackLevel.black_level_b = blackLevelStatus->blackLevelB; - - ControlValue c(Span{ reinterpret_cast(&blackLevel), - sizeof(blackLevel) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, c); -} - -void IPARPi::applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls) -{ - bcm2835_isp_geq geq; - - geq.enabled = 1; - geq.offset = geqStatus->offset; - geq.slope.den = 1000; - geq.slope.num = 1000 * geqStatus->slope; - - ControlValue c(Span{ reinterpret_cast(&geq), - sizeof(geq) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_GEQ, c); -} - -void IPARPi::applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls) -{ - using RPiController::DenoiseMode; - - bcm2835_isp_denoise denoise; - DenoiseMode mode = static_cast(denoiseStatus->mode); - - denoise.enabled = mode != DenoiseMode::Off; - denoise.constant = denoiseStatus->noiseConstant; - denoise.slope.num = 1000 * denoiseStatus->noiseSlope; - denoise.slope.den = 1000; - denoise.strength.num = 1000 * denoiseStatus->strength; - denoise.strength.den = 1000; - - /* Set the CDN mode to match the SDN operating mode. 
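The SDN-to-CDN mode mapping above reduces to a small switch. In this hedged sketch the `DenoiseMode` and `CDN_MODE_*` definitions are local stand-ins, not the real `bcm2835-isp` kernel header values:

```cpp
#include <cassert>

/* Local stand-ins for the controller and kernel-header mode values. */
enum class DenoiseMode { Off, ColourOff, ColourFast, ColourHighQuality };
enum CdnMode { CDN_MODE_FAST = 0, CDN_MODE_HIGH_QUALITY = 1 };

struct CdnConfig {
	bool enabled;
	CdnMode mode;
};

/*
 * Mirror the switch in applyDenoise(): colour denoise modes enable the
 * CDN block with a matching quality setting, anything else disables it.
 */
static CdnConfig cdnFromSdnMode(DenoiseMode mode)
{
	switch (mode) {
	case DenoiseMode::ColourFast:
		return { true, CDN_MODE_FAST };
	case DenoiseMode::ColourHighQuality:
		return { true, CDN_MODE_HIGH_QUALITY };
	default:
		return { false, CDN_MODE_FAST };
	}
}
```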
*/ - bcm2835_isp_cdn cdn; - switch (mode) { - case DenoiseMode::ColourFast: - cdn.enabled = 1; - cdn.mode = CDN_MODE_FAST; - break; - case DenoiseMode::ColourHighQuality: - cdn.enabled = 1; - cdn.mode = CDN_MODE_HIGH_QUALITY; - break; - default: - cdn.enabled = 0; - } - - ControlValue c(Span{ reinterpret_cast(&denoise), - sizeof(denoise) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_DENOISE, c); - - c = ControlValue(Span{ reinterpret_cast(&cdn), - sizeof(cdn) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_CDN, c); -} - -void IPARPi::applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls) -{ - bcm2835_isp_sharpen sharpen; - - sharpen.enabled = 1; - sharpen.threshold.num = 1000 * sharpenStatus->threshold; - sharpen.threshold.den = 1000; - sharpen.strength.num = 1000 * sharpenStatus->strength; - sharpen.strength.den = 1000; - sharpen.limit.num = 1000 * sharpenStatus->limit; - sharpen.limit.den = 1000; - - ControlValue c(Span{ reinterpret_cast(&sharpen), - sizeof(sharpen) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_SHARPEN, c); -} - -void IPARPi::applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls) -{ - bcm2835_isp_dpc dpc; - - dpc.enabled = 1; - dpc.strength = dpcStatus->strength; - - ControlValue c(Span{ reinterpret_cast(&dpc), - sizeof(dpc) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_DPC, c); -} - -void IPARPi::applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls) -{ - /* - * Program lens shading tables into pipeline. - * Choose smallest cell size that won't exceed 63x48 cells. 
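The cell-size search described above can be shown standalone. `chooseCellSize` is a hypothetical helper; the ceil-division of the mode dimensions and the `w < 64 && h <= 48` bound mirror the loop in `applyLS()`:

```cpp
#include <cassert>

/*
 * Pick the smallest lens shading cell size whose resulting grid fits
 * within the hardware limit of 63x48 cells; return 0 if none does.
 */
static unsigned int chooseCellSize(unsigned int width, unsigned int height)
{
	static const unsigned int cellSizes[] = { 16, 32, 64, 128, 256 };

	for (unsigned int cellSize : cellSizes) {
		/* Ceil-divide the mode dimensions by the cell size. */
		unsigned int w = (width + cellSize - 1) / cellSize;
		unsigned int h = (height + cellSize - 1) / cellSize;
		if (w < 64 && h <= 48)
			return cellSize;
	}

	return 0;
}
```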
- */ - const int cellSizes[] = { 16, 32, 64, 128, 256 }; - unsigned int numCells = std::size(cellSizes); - unsigned int i, w, h, cellSize; - for (i = 0; i < numCells; i++) { - cellSize = cellSizes[i]; - w = (mode_.width + cellSize - 1) / cellSize; - h = (mode_.height + cellSize - 1) / cellSize; - if (w < 64 && h <= 48) - break; - } - - if (i == numCells) { - LOG(IPARPI, Error) << "Cannot find cell size"; - return; - } - - /* We're going to supply corner sampled tables, 16 bit samples. */ - w++, h++; - bcm2835_isp_lens_shading ls = { - .enabled = 1, - .grid_cell_size = cellSize, - .grid_width = w, - .grid_stride = w, - .grid_height = h, - /* .dmabuf will be filled in by pipeline handler. */ - .dmabuf = 0, - .ref_transform = 0, - .corner_sampled = 1, - .gain_format = GAIN_FORMAT_U4P10 - }; - - if (!lsTable_ || w * h * 4 * sizeof(uint16_t) > MaxLsGridSize) { - LOG(IPARPI, Error) << "Do not have a correctly allocate lens shading table!"; - return; - } - - if (lsStatus) { - /* Format will be u4.10 */ - uint16_t *grid = static_cast(lsTable_); - - resampleTable(grid, lsStatus->r, w, h); - resampleTable(grid + w * h, lsStatus->g, w, h); - std::memcpy(grid + 2 * w * h, grid + w * h, w * h * sizeof(uint16_t)); - resampleTable(grid + 3 * w * h, lsStatus->b, w, h); - } - - ControlValue c(Span{ reinterpret_cast(&ls), - sizeof(ls) }); - ctrls.set(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, c); -} - -void IPARPi::applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls) -{ - if (afStatus->lensSetting) { - ControlValue v(afStatus->lensSetting.value()); - lensCtrls.set(V4L2_CID_FOCUS_ABSOLUTE, v); - } -} - -/* - * Resamples a 16x12 table with central sampling to destW x destH with corner - * sampling. - */ -void IPARPi::resampleTable(uint16_t dest[], const std::vector &src, - int destW, int destH) -{ - /* - * Precalculate and cache the x sampling locations and phases to - * save recomputing them on every row. 
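The resampling above maps centre-sampled source positions onto corner-sampled destination positions with clamped linear interpolation. A one-dimensional sketch under simplified assumptions (`resampleRow` is illustrative, not the IPA's two-dimensional `resampleTable`):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

/*
 * Resample a centre-sampled row of values to destW corner-sampled
 * values, clamping the source indices at both edges as the IPA does.
 */
static std::vector<double> resampleRow(const std::vector<double> &src, int destW)
{
	std::vector<double> dest(destW);
	double x = -0.5;
	double xInc = static_cast<double>(src.size()) / (destW - 1);

	for (int i = 0; i < destW; i++, x += xInc) {
		int lo = static_cast<int>(std::floor(x));
		double f = x - lo;
		int hi = std::min(lo + 1, static_cast<int>(src.size()) - 1);
		lo = std::max(lo, 0);
		dest[i] = src[lo] * (1 - f) + src[hi] * f;
	}

	return dest;
}
```

A constant input row resamples to the same constant, since edge clamping keeps the interpolation weights summing to the same sample.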
- */ - assert(destW > 1 && destH > 1 && destW <= 64); - int xLo[64], xHi[64]; - double xf[64]; - double x = -0.5, xInc = 16.0 / (destW - 1); - for (int i = 0; i < destW; i++, x += xInc) { - xLo[i] = floor(x); - xf[i] = x - xLo[i]; - xHi[i] = xLo[i] < 15 ? xLo[i] + 1 : 15; - xLo[i] = xLo[i] > 0 ? xLo[i] : 0; - } - - /* Now march over the output table generating the new values. */ - double y = -0.5, yInc = 12.0 / (destH - 1); - for (int j = 0; j < destH; j++, y += yInc) { - int yLo = floor(y); - double yf = y - yLo; - int yHi = yLo < 11 ? yLo + 1 : 11; - yLo = yLo > 0 ? yLo : 0; - double const *rowAbove = src.data() + yLo * 16; - double const *rowBelow = src.data() + yHi * 16; - for (int i = 0; i < destW; i++) { - double above = rowAbove[xLo[i]] * (1 - xf[i]) + rowAbove[xHi[i]] * xf[i]; - double below = rowBelow[xLo[i]] * (1 - xf[i]) + rowBelow[xHi[i]] * xf[i]; - int result = floor(1024 * (above * (1 - yf) + below * yf) + .5); - *(dest++) = result > 16383 ? 16383 : result; /* want u4.10 */ - } - } -} - } /* namespace ipa::RPi */ -/* - * External IPA module interface - */ -extern "C" { -const struct IPAModuleInfo ipaModuleInfo = { - IPA_MODULE_API_VERSION, - 1, - "PipelineHandlerRPi", - "rpi/vc4", -}; - -IPAInterface *ipaCreate() -{ - return new ipa::RPi::IPARPi(); -} - -} /* extern "C" */ - } /* namespace libcamera */ diff --git a/src/ipa/rpi/common/ipa_base.h b/src/ipa/rpi/common/ipa_base.h new file mode 100644 index 000000000000..6f9c46bb16b1 --- /dev/null +++ b/src/ipa/rpi/common/ipa_base.h @@ -0,0 +1,122 @@ +/* SPDX-License-Identifier: BSD-2-Clause */ +/* + * Copyright (C) 2023, Raspberry Pi Ltd + * + * ipa_base.h - Raspberry Pi IPA base class + */ +#pragma once + +#include +#include +#include +#include + +#include +#include + +#include + +#include "libcamera/internal/mapped_framebuffer.h" + +#include "cam_helper/cam_helper.h" +#include "controller/agc_status.h" +#include "controller/camera_mode.h" +#include "controller/controller.h" +#include 
"controller/metadata.h" + +namespace libcamera { + +namespace ipa::RPi { + +class IpaBase : public IPARPiInterface +{ +public: + IpaBase(); + ~IpaBase(); + + int32_t init(const IPASettings &settings, const InitParams ¶ms, InitResult *result) override; + int32_t configure(const IPACameraSensorInfo &sensorInfo, const ConfigParams ¶ms, + ConfigResult *result) override; + + void start(const ControlList &controls, StartResult *result) override; + void stop() override {} + + void mapBuffers(const std::vector &buffers) override; + void unmapBuffers(const std::vector &ids) override; + + void prepareIsp(const PrepareParams ¶ms) override; + void processStats(const ProcessParams ¶ms) override; + +protected: + /* Raspberry Pi controller specific defines. */ + std::unique_ptr helper_; + RPiController::Controller controller_; + + ControlInfoMap sensorCtrls_; + ControlInfoMap lensCtrls_; + + /* Camera sensor params. */ + CameraMode mode_; + + /* Track the frame length times over FrameLengthsQueueSize frames. */ + std::deque frameLengths_; + utils::Duration lastTimeout_; + +private: + /* Number of metadata objects available in the context list. 
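The fixed pool of metadata contexts noted above is indexed by wrapping the IPA context counter, as the `params.ipaContext % rpiMetadata_.size()` expressions elsewhere in this patch do. A trivial sketch; `contextIndex` is a hypothetical helper:

```cpp
#include <cassert>

/* Wrap the monotonically increasing IPA context onto the slot pool. */
static unsigned int contextIndex(unsigned int ipaContext)
{
	static constexpr unsigned int numMetadataContexts = 16;
	return ipaContext % numMetadataContexts;
}
```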
*/ + static constexpr unsigned int numMetadataContexts = 16; + + virtual int32_t platformInit(const InitParams ¶ms, InitResult *result) = 0; + virtual int32_t platformConfigure(const ConfigParams ¶ms, ConfigResult *result) = 0; + + virtual void platformPrepareIsp(const PrepareParams ¶ms, + RPiController::Metadata &rpiMetadata) = 0; + virtual RPiController::StatisticsPtr platformProcessStats(Span mem) = 0; + + void setMode(const IPACameraSensorInfo &sensorInfo); + void setCameraTimeoutValue(); + bool validateSensorControls(); + bool validateLensControls(); + void applyControls(const ControlList &controls); + virtual void handleControls(const ControlList &controls) = 0; + void fillDeviceStatus(const ControlList &sensorControls, unsigned int ipaContext); + void reportMetadata(unsigned int ipaContext); + void applyFrameDurations(utils::Duration minFrameDuration, utils::Duration maxFrameDuration); + void applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls); + + std::map buffers_; + + bool lensPresent_; + ControlList libcameraMetadata_; + + std::array rpiMetadata_; + + /* + * We count frames to decide if the frame must be hidden (e.g. from + * display) or mistrusted (i.e. not given to the control algos). + */ + uint64_t frameCount_; + + /* How many frames we should avoid running control algos on. */ + unsigned int mistrustCount_; + + /* Number of frames that need to be dropped on startup. */ + unsigned int dropFrameCount_; + + /* Frame timestamp for the last run of the controller. */ + uint64_t lastRunTimestamp_; + + /* Do we run a Controller::process() for this frame? */ + bool processPending_; + + /* Distinguish the first camera start from others. */ + bool firstStart_; + + /* Frame duration (1/fps) limits. 
*/ + utils::Duration minFrameDuration_; + utils::Duration maxFrameDuration_; +}; + +} /* namespace ipa::RPi */ + +} /* namespace libcamera */ diff --git a/src/ipa/rpi/common/meson.build b/src/ipa/rpi/common/meson.build new file mode 100644 index 000000000000..73d2ee732339 --- /dev/null +++ b/src/ipa/rpi/common/meson.build @@ -0,0 +1,17 @@ +# SPDX-License-Identifier: CC0-1.0 + +rpi_ipa_common_sources = files([ + 'ipa_base.cpp', +]) + +rpi_ipa_common_includes = [ + include_directories('..'), +] + +rpi_ipa_common_deps = [ + libcamera_private, +] + +rpi_ipa_common_lib = static_library('rpi_ipa_common', rpi_ipa_common_sources, + include_directories : rpi_ipa_common_includes, + dependencies : rpi_ipa_common_deps) diff --git a/src/ipa/rpi/meson.build b/src/ipa/rpi/meson.build index 7d7a61f7cea7..4811c76f3f9e 100644 --- a/src/ipa/rpi/meson.build +++ b/src/ipa/rpi/meson.build @@ -1,6 +1,7 @@ # SPDX-License-Identifier: CC0-1.0 subdir('cam_helper') +subdir('common') subdir('controller') foreach pipeline : pipelines diff --git a/src/ipa/rpi/vc4/meson.build b/src/ipa/rpi/vc4/meson.build index cbd4dec62659..590e9197756d 100644 --- a/src/ipa/rpi/vc4/meson.build +++ b/src/ipa/rpi/vc4/meson.build @@ -9,6 +9,7 @@ vc4_ipa_deps = [ vc4_ipa_libs = [ rpi_ipa_cam_helper_lib, + rpi_ipa_common_lib, rpi_ipa_controller_lib ] @@ -18,7 +19,7 @@ vc4_ipa_includes = [ ] vc4_ipa_sources = files([ - 'raspberrypi.cpp', + 'vc4.cpp', ]) vc4_ipa_includes += include_directories('..') @@ -45,4 +46,3 @@ endif subdir('data') ipa_names += ipa_name - diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp new file mode 100644 index 000000000000..a32db9bcaf9a --- /dev/null +++ b/src/ipa/rpi/vc4/vc4.cpp @@ -0,0 +1,540 @@ +/* SPDX-License-Identifier: BSD-2-Clause */ +/* + * Copyright (C) 2019-2021, Raspberry Pi Ltd + * + * vc4.cpp - Raspberry Pi VC4/BCM2835 ISP IPA. 
+ */ + +#include +#include + +#include + +#include +#include + +#include "common/ipa_base.h" +#include "controller/af_status.h" +#include "controller/agc_algorithm.h" +#include "controller/alsc_status.h" +#include "controller/awb_status.h" +#include "controller/black_level_status.h" +#include "controller/ccm_status.h" +#include "controller/contrast_status.h" +#include "controller/denoise_algorithm.h" +#include "controller/denoise_status.h" +#include "controller/dpc_status.h" +#include "controller/geq_status.h" +#include "controller/lux_status.h" +#include "controller/noise_status.h" +#include "controller/sharpen_status.h" + +namespace libcamera { + +LOG_DECLARE_CATEGORY(IPARPI) + +namespace ipa::RPi { + +class IpaVc4 final : public IpaBase +{ +public: + IpaVc4() + : IpaBase(), lsTable_(nullptr) + { + } + + ~IpaVc4() + { + if (lsTable_) + munmap(lsTable_, MaxLsGridSize); + } + +private: + int32_t platformInit(const InitParams ¶ms, InitResult *result) override; + int32_t platformConfigure(const ConfigParams ¶ms, ConfigResult *result) override; + + void platformPrepareIsp(const PrepareParams ¶ms, RPiController::Metadata &rpiMetadata) override; + RPiController::StatisticsPtr platformProcessStats(Span mem) override; + + void handleControls(const ControlList &controls) override; + bool validateIspControls(); + + void applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls); + void applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls); + void applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls); + void applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls); + void applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls); + void applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls); + void applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls); + void applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls); + void applyDPC(const struct DpcStatus 
*dpcStatus, ControlList &ctrls); + void applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls); + void applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls); + void resampleTable(uint16_t dest[], const std::vector<double> &src, int destW, int destH); + + /* VC4 ISP controls. */ + ControlInfoMap ispCtrls_; + + /* LS table allocation passed in from the pipeline handler. */ + SharedFD lsTableHandle_; + void *lsTable_; +}; + +int32_t IpaVc4::platformInit([[maybe_unused]] const InitParams &params, [[maybe_unused]] InitResult *result) +{ + const std::string &target = controller_.getTarget(); + + if (target != "bcm2835") { + LOG(IPARPI, Error) + << "Tuning data file target returned \"" << target << "\"" + << ", expected \"bcm2835\""; + return -EINVAL; + } + + return 0; +} + +int32_t IpaVc4::platformConfigure(const ConfigParams &params, [[maybe_unused]] ConfigResult *result) +{ + ispCtrls_ = params.ispControls; + if (!validateIspControls()) { + LOG(IPARPI, Error) << "ISP control validation failed."; + return -1; + } + + /* Store the lens shading table pointer and handle if available. */ + if (params.lsTableHandle.isValid()) { + /* Remove any previous table, if there was one. */ + if (lsTable_) { + munmap(lsTable_, MaxLsGridSize); + lsTable_ = nullptr; + } + + /* Map the LS table buffer into user space. */ + lsTableHandle_ = std::move(params.lsTableHandle); + if (lsTableHandle_.isValid()) { + lsTable_ = mmap(nullptr, MaxLsGridSize, PROT_READ | PROT_WRITE, + MAP_SHARED, lsTableHandle_.get(), 0); + + if (lsTable_ == MAP_FAILED) { + LOG(IPARPI, Error) << "dmaHeap mmap failure for LS table."; + lsTable_ = nullptr; + } + } + } + + return 0; +} + +void IpaVc4::platformPrepareIsp([[maybe_unused]] const PrepareParams &params, + RPiController::Metadata &rpiMetadata) +{ + ControlList ctrls(ispCtrls_); + + /* Lock the metadata buffer to avoid constant locks/unlocks.
*/ + std::unique_lock<RPiController::Metadata> lock(rpiMetadata); + + AwbStatus *awbStatus = rpiMetadata.getLocked<AwbStatus>("awb.status"); + if (awbStatus) + applyAWB(awbStatus, ctrls); + + CcmStatus *ccmStatus = rpiMetadata.getLocked<CcmStatus>("ccm.status"); + if (ccmStatus) + applyCCM(ccmStatus, ctrls); + + AgcStatus *dgStatus = rpiMetadata.getLocked<AgcStatus>("agc.status"); + if (dgStatus) + applyDG(dgStatus, ctrls); + + AlscStatus *lsStatus = rpiMetadata.getLocked<AlscStatus>("alsc.status"); + if (lsStatus) + applyLS(lsStatus, ctrls); + + ContrastStatus *contrastStatus = rpiMetadata.getLocked<ContrastStatus>("contrast.status"); + if (contrastStatus) + applyGamma(contrastStatus, ctrls); + + BlackLevelStatus *blackLevelStatus = rpiMetadata.getLocked<BlackLevelStatus>("black_level.status"); + if (blackLevelStatus) + applyBlackLevel(blackLevelStatus, ctrls); + + GeqStatus *geqStatus = rpiMetadata.getLocked<GeqStatus>("geq.status"); + if (geqStatus) + applyGEQ(geqStatus, ctrls); + + DenoiseStatus *denoiseStatus = rpiMetadata.getLocked<DenoiseStatus>("denoise.status"); + if (denoiseStatus) + applyDenoise(denoiseStatus, ctrls); + + SharpenStatus *sharpenStatus = rpiMetadata.getLocked<SharpenStatus>("sharpen.status"); + if (sharpenStatus) + applySharpen(sharpenStatus, ctrls); + + DpcStatus *dpcStatus = rpiMetadata.getLocked<DpcStatus>("dpc.status"); + if (dpcStatus) + applyDPC(dpcStatus, ctrls); + + const AfStatus *afStatus = rpiMetadata.getLocked<AfStatus>("af.status"); + if (afStatus) { + ControlList lensctrls(lensCtrls_); + applyAF(afStatus, lensctrls); + if (!lensctrls.empty()) + setLensControls.emit(lensctrls); + } + + if (!ctrls.empty()) + setIspControls.emit(ctrls); +} + +RPiController::StatisticsPtr IpaVc4::platformProcessStats(Span<uint8_t> mem) +{ + using namespace RPiController; + + const bcm2835_isp_stats *stats = reinterpret_cast<const bcm2835_isp_stats *>(mem.data()); + StatisticsPtr statistics = std::make_unique<Statistics>(Statistics::AgcStatsPos::PreWb, + Statistics::ColourStatsPos::PostLsc); + const Controller::HardwareConfig &hw = controller_.getHardwareConfig(); + unsigned int i; + + /* RGB histograms are not used, so do not populate them.
*/ + statistics->yHist = RPiController::Histogram(stats->hist[0].g_hist, + hw.numHistogramBins); + + /* All region sums are based on a 16-bit normalised pipeline bit-depth. */ + unsigned int scale = Statistics::NormalisationFactorPow2 - hw.pipelineWidth; + + statistics->awbRegions.init(hw.awbRegions); + for (i = 0; i < statistics->awbRegions.numRegions(); i++) + statistics->awbRegions.set(i, { { stats->awb_stats[i].r_sum << scale, + stats->awb_stats[i].g_sum << scale, + stats->awb_stats[i].b_sum << scale }, + stats->awb_stats[i].counted, + stats->awb_stats[i].notcounted }); + + statistics->agcRegions.init(hw.agcRegions); + for (i = 0; i < statistics->agcRegions.numRegions(); i++) + statistics->agcRegions.set(i, { { stats->agc_stats[i].r_sum << scale, + stats->agc_stats[i].g_sum << scale, + stats->agc_stats[i].b_sum << scale }, + stats->agc_stats[i].counted, + stats->awb_stats[i].notcounted }); + + statistics->focusRegions.init(hw.focusRegions); + for (i = 0; i < statistics->focusRegions.numRegions(); i++) + statistics->focusRegions.set(i, { stats->focus_stats[i].contrast_val[1][1] / 1000, + stats->focus_stats[i].contrast_val_num[1][1], + stats->focus_stats[i].contrast_val_num[1][0] }); + + return statistics; +} + +void IpaVc4::handleControls([[maybe_unused]] const ControlList &controls) +{ + /* No controls require any special updates to the hardware configuration. 
*/ +} + +bool IpaVc4::validateIspControls() +{ + static const uint32_t ctrls[] = { + V4L2_CID_RED_BALANCE, + V4L2_CID_BLUE_BALANCE, + V4L2_CID_DIGITAL_GAIN, + V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, + V4L2_CID_USER_BCM2835_ISP_GAMMA, + V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, + V4L2_CID_USER_BCM2835_ISP_GEQ, + V4L2_CID_USER_BCM2835_ISP_DENOISE, + V4L2_CID_USER_BCM2835_ISP_SHARPEN, + V4L2_CID_USER_BCM2835_ISP_DPC, + V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, + V4L2_CID_USER_BCM2835_ISP_CDN, + }; + + for (auto c : ctrls) { + if (ispCtrls_.find(c) == ispCtrls_.end()) { + LOG(IPARPI, Error) << "Unable to find ISP control " + << utils::hex(c); + return false; + } + } + + return true; +} + +void IpaVc4::applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls) +{ + LOG(IPARPI, Debug) << "Applying WB R: " << awbStatus->gainR << " B: " + << awbStatus->gainB; + + ctrls.set(V4L2_CID_RED_BALANCE, + static_cast<int32_t>(awbStatus->gainR * 1000)); + ctrls.set(V4L2_CID_BLUE_BALANCE, + static_cast<int32_t>(awbStatus->gainB * 1000)); +} + +void IpaVc4::applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls) +{ + ctrls.set(V4L2_CID_DIGITAL_GAIN, + static_cast<int32_t>(dgStatus->digitalGain * 1000)); +} + +void IpaVc4::applyCCM(const struct CcmStatus *ccmStatus, ControlList &ctrls) +{ + bcm2835_isp_custom_ccm ccm; + + for (int i = 0; i < 9; i++) { + ccm.ccm.ccm[i / 3][i % 3].den = 1000; + ccm.ccm.ccm[i / 3][i % 3].num = 1000 * ccmStatus->matrix[i]; + } + + ccm.enabled = 1; + ccm.ccm.offsets[0] = ccm.ccm.offsets[1] = ccm.ccm.offsets[2] = 0; + + ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&ccm), + sizeof(ccm) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_CC_MATRIX, c); +} + +void IpaVc4::applyBlackLevel(const struct BlackLevelStatus *blackLevelStatus, ControlList &ctrls) +{ + bcm2835_isp_black_level blackLevel; + + blackLevel.enabled = 1; + blackLevel.black_level_r = blackLevelStatus->blackLevelR; + blackLevel.black_level_g = blackLevelStatus->blackLevelG; + blackLevel.black_level_b = blackLevelStatus->blackLevelB; +
+ ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&blackLevel), + sizeof(blackLevel) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_BLACK_LEVEL, c); +} + +void IpaVc4::applyGamma(const struct ContrastStatus *contrastStatus, ControlList &ctrls) +{ + const unsigned int numGammaPoints = controller_.getHardwareConfig().numGammaPoints; + struct bcm2835_isp_gamma gamma; + + for (unsigned int i = 0; i < numGammaPoints - 1; i++) { + int x = i < 16 ? i * 1024 + : (i < 24 ? (i - 16) * 2048 + 16384 + : (i - 24) * 4096 + 32768); + gamma.x[i] = x; + gamma.y[i] = std::min<uint16_t>(65535, contrastStatus->gammaCurve.eval(x)); + } + + gamma.x[numGammaPoints - 1] = 65535; + gamma.y[numGammaPoints - 1] = 65535; + gamma.enabled = 1; + + ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&gamma), + sizeof(gamma) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_GAMMA, c); +} + +void IpaVc4::applyGEQ(const struct GeqStatus *geqStatus, ControlList &ctrls) +{ + bcm2835_isp_geq geq; + + geq.enabled = 1; + geq.offset = geqStatus->offset; + geq.slope.den = 1000; + geq.slope.num = 1000 * geqStatus->slope; + + ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&geq), + sizeof(geq) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_GEQ, c); +} + +void IpaVc4::applyDenoise(const struct DenoiseStatus *denoiseStatus, ControlList &ctrls) +{ + using RPiController::DenoiseMode; + + bcm2835_isp_denoise denoise; + DenoiseMode mode = static_cast<DenoiseMode>(denoiseStatus->mode); + + denoise.enabled = mode != DenoiseMode::Off; + denoise.constant = denoiseStatus->noiseConstant; + denoise.slope.num = 1000 * denoiseStatus->noiseSlope; + denoise.slope.den = 1000; + denoise.strength.num = 1000 * denoiseStatus->strength; + denoise.strength.den = 1000; + + /* Set the CDN mode to match the SDN operating mode.
*/ + bcm2835_isp_cdn cdn; + switch (mode) { + case DenoiseMode::ColourFast: + cdn.enabled = 1; + cdn.mode = CDN_MODE_FAST; + break; + case DenoiseMode::ColourHighQuality: + cdn.enabled = 1; + cdn.mode = CDN_MODE_HIGH_QUALITY; + break; + default: + cdn.enabled = 0; + } + + ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&denoise), + sizeof(denoise) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_DENOISE, c); + + c = ControlValue(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&cdn), + sizeof(cdn) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_CDN, c); +} + +void IpaVc4::applySharpen(const struct SharpenStatus *sharpenStatus, ControlList &ctrls) +{ + bcm2835_isp_sharpen sharpen; + + sharpen.enabled = 1; + sharpen.threshold.num = 1000 * sharpenStatus->threshold; + sharpen.threshold.den = 1000; + sharpen.strength.num = 1000 * sharpenStatus->strength; + sharpen.strength.den = 1000; + sharpen.limit.num = 1000 * sharpenStatus->limit; + sharpen.limit.den = 1000; + + ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&sharpen), + sizeof(sharpen) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_SHARPEN, c); +} + +void IpaVc4::applyDPC(const struct DpcStatus *dpcStatus, ControlList &ctrls) +{ + bcm2835_isp_dpc dpc; + + dpc.enabled = 1; + dpc.strength = dpcStatus->strength; + + ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&dpc), + sizeof(dpc) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_DPC, c); +} + +void IpaVc4::applyLS(const struct AlscStatus *lsStatus, ControlList &ctrls) +{ + /* + * Program lens shading tables into pipeline. + * Choose smallest cell size that won't exceed 63x48 cells.
+ */ + const int cellSizes[] = { 16, 32, 64, 128, 256 }; + unsigned int numCells = std::size(cellSizes); + unsigned int i, w, h, cellSize; + for (i = 0; i < numCells; i++) { + cellSize = cellSizes[i]; + w = (mode_.width + cellSize - 1) / cellSize; + h = (mode_.height + cellSize - 1) / cellSize; + if (w < 64 && h <= 48) + break; + } + + if (i == numCells) { + LOG(IPARPI, Error) << "Cannot find cell size"; + return; + } + + /* We're going to supply corner sampled tables, 16 bit samples. */ + w++, h++; + bcm2835_isp_lens_shading ls = { + .enabled = 1, + .grid_cell_size = cellSize, + .grid_width = w, + .grid_stride = w, + .grid_height = h, + /* .dmabuf will be filled in by pipeline handler. */ + .dmabuf = 0, + .ref_transform = 0, + .corner_sampled = 1, + .gain_format = GAIN_FORMAT_U4P10 + }; + + if (!lsTable_ || w * h * 4 * sizeof(uint16_t) > MaxLsGridSize) { + LOG(IPARPI, Error) << "Do not have a correctly allocated lens shading table!"; + return; + } + + if (lsStatus) { + /* Format will be u4.10 */ + uint16_t *grid = static_cast<uint16_t *>(lsTable_); + + resampleTable(grid, lsStatus->r, w, h); + resampleTable(grid + w * h, lsStatus->g, w, h); + memcpy(grid + 2 * w * h, grid + w * h, w * h * sizeof(uint16_t)); + resampleTable(grid + 3 * w * h, lsStatus->b, w, h); + } + + ControlValue c(Span<const uint8_t>{ reinterpret_cast<uint8_t *>(&ls), + sizeof(ls) }); + ctrls.set(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING, c); +} + +void IpaVc4::applyAF(const struct AfStatus *afStatus, ControlList &lensCtrls) +{ + if (afStatus->lensSetting) { + ControlValue v(afStatus->lensSetting.value()); + lensCtrls.set(V4L2_CID_FOCUS_ABSOLUTE, v); + } +} + +/* + * Resamples a 16x12 table with central sampling to destW x destH with corner + * sampling. + */ +void IpaVc4::resampleTable(uint16_t dest[], const std::vector<double> &src, + int destW, int destH) +{ + /* + * Precalculate and cache the x sampling locations and phases to + * save recomputing them on every row.
+ */ + assert(destW > 1 && destH > 1 && destW <= 64); + int xLo[64], xHi[64]; + double xf[64]; + double x = -0.5, xInc = 16.0 / (destW - 1); + for (int i = 0; i < destW; i++, x += xInc) { + xLo[i] = floor(x); + xf[i] = x - xLo[i]; + xHi[i] = xLo[i] < 15 ? xLo[i] + 1 : 15; + xLo[i] = xLo[i] > 0 ? xLo[i] : 0; + } + + /* Now march over the output table generating the new values. */ + double y = -0.5, yInc = 12.0 / (destH - 1); + for (int j = 0; j < destH; j++, y += yInc) { + int yLo = floor(y); + double yf = y - yLo; + int yHi = yLo < 11 ? yLo + 1 : 11; + yLo = yLo > 0 ? yLo : 0; + double const *rowAbove = src.data() + yLo * 16; + double const *rowBelow = src.data() + yHi * 16; + for (int i = 0; i < destW; i++) { + double above = rowAbove[xLo[i]] * (1 - xf[i]) + rowAbove[xHi[i]] * xf[i]; + double below = rowBelow[xLo[i]] * (1 - xf[i]) + rowBelow[xHi[i]] * xf[i]; + int result = floor(1024 * (above * (1 - yf) + below * yf) + .5); + *(dest++) = result > 16383 ? 16383 : result; /* want u4.10 */ + } + } +} + +} /* namespace ipa::RPi */ + +/* + * External IPA module interface + */ +extern "C" { +const struct IPAModuleInfo ipaModuleInfo = { + IPA_MODULE_API_VERSION, + 1, + "PipelineHandlerRPi", + "rpi/vc4", +}; + +IPAInterface *ipaCreate() +{ + return new ipa::RPi::IpaVc4(); +} + +} /* extern "C" */ + +} /* namespace libcamera */ From patchwork Wed May 3 12:20:32 2023 X-Patchwork-Submitter: Naushir Patuck X-Patchwork-Id: 18595
To: libcamera-devel@lists.libcamera.org Date: Wed, 3 May 2023 13:20:32 +0100 Message-Id: <20230503122035.32026-11-naush@raspberrypi.com> In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com> References: <20230503122035.32026-1-naush@raspberrypi.com> Subject: [libcamera-devel] [PATCH 10/13] ipa: raspberrypi: agc: Move weights out of AGC From: Naushir Patuck Reply-To: Naushir Patuck Cc: Jacopo Mondi From: David Plowman The region weights for the AGC zones are handled
by the AGC algorithm. Apply them directly in the IPA (vc4.cpp) to the statistics that we pass to the AGC. Signed-off-by: David Plowman Signed-off-by: Naushir Patuck Reviewed-by: Jacopo Mondi --- src/ipa/rpi/controller/agc_algorithm.h | 3 +++ src/ipa/rpi/controller/rpi/agc.cpp | 27 +++++++++++++++++--------- src/ipa/rpi/controller/rpi/agc.h | 1 + src/ipa/rpi/vc4/vc4.cpp | 26 ++++++++++++++++++------- 4 files changed, 41 insertions(+), 16 deletions(-) diff --git a/src/ipa/rpi/controller/agc_algorithm.h b/src/ipa/rpi/controller/agc_algorithm.h index 36e6c11058ee..b6949daa7135 100644 --- a/src/ipa/rpi/controller/agc_algorithm.h +++ b/src/ipa/rpi/controller/agc_algorithm.h @@ -6,6 +6,8 @@ */ #pragma once +#include <vector> + #include <libcamera/base/utils.h> #include "algorithm.h" @@ -18,6 +20,7 @@ public: AgcAlgorithm(Controller *controller) : Algorithm(controller) {} /* An AGC algorithm must provide the following: */ virtual unsigned int getConvergenceFrames() const = 0; + virtual std::vector<double> const &getWeights() const = 0; virtual void setEv(double ev) = 0; virtual void setFlickerPeriod(libcamera::utils::Duration flickerPeriod) = 0; virtual void setFixedShutter(libcamera::utils::Duration fixedShutter) = 0; diff --git a/src/ipa/rpi/controller/rpi/agc.cpp b/src/ipa/rpi/controller/rpi/agc.cpp index e6fb7b8dbeb3..e79c82e2e65b 100644 --- a/src/ipa/rpi/controller/rpi/agc.cpp +++ b/src/ipa/rpi/controller/rpi/agc.cpp @@ -292,6 +292,18 @@ unsigned int Agc::getConvergenceFrames() const return config_.convergenceFrames; } +std::vector<double> const &Agc::getWeights() const +{ + /* + * In case someone calls setMeteringMode and then this before the + * algorithm has run and updated the meteringMode_ pointer.
+ */ + auto it = config_.meteringModes.find(meteringModeName_); + if (it == config_.meteringModes.end()) + return meteringMode_->weights; + return it->second.weights; +} + void Agc::setEv(double ev) { ev_ = ev; @@ -595,19 +607,16 @@ static double computeInitialY(StatisticsPtr &stats, AwbStatus const &awb, ASSERT(weights.size() == stats->agcRegions.numRegions()); /* - * Note how the calculation below means that equal weights give you - * "average" metering (i.e. all pixels equally important). + * Note that the weights are applied by the IPA to the statistics directly, + * before they are given to us here. */ double rSum = 0, gSum = 0, bSum = 0, pixelSum = 0; for (unsigned int i = 0; i < stats->agcRegions.numRegions(); i++) { auto &region = stats->agcRegions.get(i); - double rAcc = std::min(region.val.rSum * gain, (maxVal - 1) * region.counted); - double gAcc = std::min(region.val.gSum * gain, (maxVal - 1) * region.counted); - double bAcc = std::min(region.val.bSum * gain, (maxVal - 1) * region.counted); - rSum += rAcc * weights[i]; - gSum += gAcc * weights[i]; - bSum += bAcc * weights[i]; - pixelSum += region.counted * weights[i]; + rSum += std::min(region.val.rSum * gain, (maxVal - 1) * region.counted); + gSum += std::min(region.val.gSum * gain, (maxVal - 1) * region.counted); + bSum += std::min(region.val.bSum * gain, (maxVal - 1) * region.counted); + pixelSum += region.counted; } if (pixelSum == 0.0) { LOG(RPiAgc, Warning) << "computeInitialY: pixelSum is zero"; diff --git a/src/ipa/rpi/controller/rpi/agc.h b/src/ipa/rpi/controller/rpi/agc.h index 4e5f272fac78..939f97295a02 100644 --- a/src/ipa/rpi/controller/rpi/agc.h +++ b/src/ipa/rpi/controller/rpi/agc.h @@ -69,6 +69,7 @@ public: char const *name() const override; int read(const libcamera::YamlObject &params) override; unsigned int getConvergenceFrames() const override; + std::vector<double> const &getWeights() const override; void setEv(double ev) override; void setFlickerPeriod(libcamera::utils::Duration flickerPeriod)
override; void setMaxShutter(libcamera::utils::Duration maxShutter) override; diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp index a32db9bcaf9a..e38024ac5418 100644 --- a/src/ipa/rpi/vc4/vc4.cpp +++ b/src/ipa/rpi/vc4/vc4.cpp @@ -211,13 +211,25 @@ RPiController::StatisticsPtr IpaVc4::platformProcessStats(Span<uint8_t> mem) stats->awb_stats[i].counted, stats->awb_stats[i].notcounted }); - statistics->agcRegions.init(hw.agcRegions); - for (i = 0; i < statistics->agcRegions.numRegions(); i++) - statistics->agcRegions.set(i, { { stats->agc_stats[i].r_sum << scale, - stats->agc_stats[i].g_sum << scale, - stats->agc_stats[i].b_sum << scale }, - stats->agc_stats[i].counted, - stats->awb_stats[i].notcounted }); + RPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm *>( + controller_.getAlgorithm("agc")); + if (!agc) { + LOG(IPARPI, Debug) << "No AGC algorithm - not copying statistics"; + statistics->agcRegions.init(0); + } else { + statistics->agcRegions.init(hw.agcRegions); + const std::vector<double> &weights = agc->getWeights(); + for (i = 0; i < statistics->agcRegions.numRegions(); i++) { + uint64_t rSum = (stats->agc_stats[i].r_sum << scale) * weights[i]; + uint64_t gSum = (stats->agc_stats[i].g_sum << scale) * weights[i]; + uint64_t bSum = (stats->agc_stats[i].b_sum << scale) * weights[i]; + uint32_t counted = stats->agc_stats[i].counted * weights[i]; + uint32_t notcounted = stats->agc_stats[i].notcounted * weights[i]; + statistics->agcRegions.set(i, { { rSum, gSum, bSum }, + counted, + notcounted }); + } + } statistics->focusRegions.init(hw.focusRegions); for (i = 0; i < statistics->focusRegions.numRegions(); i++) From patchwork Wed May 3 12:20:33 2023 X-Patchwork-Submitter: Naushir Patuck X-Patchwork-Id: 18596
To: libcamera-devel@lists.libcamera.org Date: Wed, 3 May 2023 13:20:33 +0100 Message-Id: <20230503122035.32026-12-naush@raspberrypi.com> In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com> References: <20230503122035.32026-1-naush@raspberrypi.com> Subject: [libcamera-devel] [PATCH 11/13] pipeline: raspberrypi: Make RPi::Stream::name() return const std::string &
From: Naushir Patuck Reply-To: Naushir Patuck Cc: Jacopo Mondi Return a const std::string reference from RPi::Stream::name() to avoid copying a string when not needed. Signed-off-by: Naushir Patuck Reviewed-by: Jacopo Mondi Reviewed-by: Laurent Pinchart --- src/libcamera/pipeline/rpi/common/rpi_stream.cpp | 2 +- src/libcamera/pipeline/rpi/common/rpi_stream.h | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp index 3690667e9aa6..b7e4130f5e56 100644 --- a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp +++ b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp @@ -19,7 +19,7 @@ V4L2VideoDevice *Stream::dev() const return dev_.get(); } -std::string Stream::name() const +const std::string &Stream::name() const { return name_; } diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.h b/src/libcamera/pipeline/rpi/common/rpi_stream.h index 1aae674967e1..b8c74de35863 100644 --- a/src/libcamera/pipeline/rpi/common/rpi_stream.h +++ b/src/libcamera/pipeline/rpi/common/rpi_stream.h @@ -49,7 +49,7 @@ public: } V4L2VideoDevice *dev() const; - std::string name() const; + const std::string &name() const; bool isImporter() const; void resetBuffers(); From patchwork Wed May 3 12:20:34 2023 X-Patchwork-Submitter: Naushir Patuck X-Patchwork-Id: 18598
a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20221208; t=1683116464; x=1685708464; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=oaL4taXlaJ8cLKhe+VlyWbqONZgo9v5Q7ZU/W6CDKqA=; b=HK9NFNCh1d4fC7viUXB05Bnyk/GVMwar7Q65CQW9zEAy0JlBPhzXVwHZGYql9KT8Mu 5xjd2F06qLtrKTh+ekUYHEPzbEN4QMKsUaj8GcXC2OtH6hRZHzpYK9ZXWtwy4OU1JyGN vCw/LbZShlmWymvtmZISzwrOClIVueBh1a97H7OKDLaGptLLvOH5YtzX+AXSlciAN0k6 o3ov7Nnjz8JK4arleTtxiGq/Bc2GZL2d32tgAU8lZjiIXuv2MMWcBUvfEVRtcSUH5sbT KSjVQuqfNk2QKe5HPdH9GzSx5rNReSN+Nh8rP2GyDn4ogB9paEZ1b1Crj7Ey4/Pb/Nfo tARQ== X-Gm-Message-State: AC+VfDysSj/yabgYLxl/8G1dFt6/HqsRRs0fZY4HbZia5j1NoFDgf82f 3v0/XIgWZ/Tye8l9Z4L/4ves1vVKmweJ0W4kmhjujQ== X-Google-Smtp-Source: ACHHUZ5Kkw2knb86KvSwW2BMuBLdkmKehggmfBcI4CR9w9PvYfEdZQ7xi3t1pnLXrrRXZGndiiws5Q== X-Received: by 2002:a05:600c:ac8:b0:3f1:72f8:6a92 with SMTP id c8-20020a05600c0ac800b003f172f86a92mr14201357wmr.20.1683116462158; Wed, 03 May 2023 05:21:02 -0700 (PDT) Received: from localhost.localdomain ([93.93.133.154]) by smtp.gmail.com with ESMTPSA id f23-20020a7bcd17000000b003ee443bf0c7sm1736785wmj.16.2023.05.03.05.21.01 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Wed, 03 May 2023 05:21:01 -0700 (PDT) To: libcamera-devel@lists.libcamera.org Date: Wed, 3 May 2023 13:20:34 +0100 Message-Id: <20230503122035.32026-13-naush@raspberrypi.com> X-Mailer: git-send-email 2.34.1 In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com> References: <20230503122035.32026-1-naush@raspberrypi.com> MIME-Version: 1.0 Subject: [libcamera-devel] [PATCH 12/13] pipeline: raspberrypi: Introduce PipelineHandlerBase class X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , X-Patchwork-Original-From: Naushir Patuck via libcamera-devel From: Naushir Patuck Reply-To: Naushir Patuck Cc: 
Jacopo Mondi

Create a new PipelineHandlerBase class that handles general purpose
housekeeping duties for the Raspberry Pi pipeline handler. Code for the
implementation of the new class is essentially pulled from the existing
pipeline/rpi/vc4/raspberrypi.cpp file with a small amount of refactoring.

Create a derived PipelineHandlerVc4 class from PipelineHandlerBase that
handles the VC4 pipeline specific tasks of the pipeline handler. Again,
code for this class implementation is taken from the existing
pipeline/rpi/vc4/raspberrypi.cpp with a small amount of refactoring.

The goal of this change is to allow third parties to implement their own
pipeline handlers running on the Raspberry Pi without duplicating all of
the pipeline handler housekeeping tasks.

Signed-off-by: Naushir Patuck
Reviewed-by: Jacopo Mondi
---
 src/ipa/rpi/vc4/vc4.cpp                       |    2 +-
 src/libcamera/pipeline/rpi/common/meson.build |    1 +
 .../pipeline/rpi/common/pipeline_base.cpp     | 1480 ++++++++++
 .../pipeline/rpi/common/pipeline_base.h       |  277 ++
 .../pipeline/rpi/vc4/data/example.yaml        |    4 +-
 src/libcamera/pipeline/rpi/vc4/meson.build    |    2 +-
 .../pipeline/rpi/vc4/raspberrypi.cpp          | 2432 -----------------
 src/libcamera/pipeline/rpi/vc4/vc4.cpp        |  973 +++++++
 8 files changed, 2735 insertions(+), 2436 deletions(-)
 create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.cpp
 create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.h
 delete mode 100644 src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp
 create mode 100644 src/libcamera/pipeline/rpi/vc4/vc4.cpp

diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp
index e38024ac5418..789a345fa16e 100644
--- a/src/ipa/rpi/vc4/vc4.cpp
+++ b/src/ipa/rpi/vc4/vc4.cpp
@@ -538,7 +538,7 @@ extern "C" {
 const struct IPAModuleInfo ipaModuleInfo = {
 	IPA_MODULE_API_VERSION,
 	1,
-	"PipelineHandlerRPi",
+	"PipelineHandlerVc4",
 	"rpi/vc4",
 };

diff --git
a/src/libcamera/pipeline/rpi/common/meson.build b/src/libcamera/pipeline/rpi/common/meson.build index 2ad594cf8d1a..8fb7e823279d 100644 --- a/src/libcamera/pipeline/rpi/common/meson.build +++ b/src/libcamera/pipeline/rpi/common/meson.build @@ -2,5 +2,6 @@ libcamera_sources += files([ 'delayed_controls.cpp', + 'pipeline_base.cpp', 'rpi_stream.cpp', ]) diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp new file mode 100644 index 000000000000..69c67acd7d45 --- /dev/null +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp @@ -0,0 +1,1480 @@ +/* SPDX-License-Identifier: LGPL-2.1-or-later */ +/* + * Copyright (C) 2019-2023, Raspberry Pi Ltd + * + * pipeline_base.cpp - Pipeline handler base class for Raspberry Pi devices + */ + +#include "pipeline_base.h" + +#include + +#include +#include + +#include +#include + +#include +#include +#include + +#include "libcamera/internal/camera_lens.h" +#include "libcamera/internal/ipa_manager.h" +#include "libcamera/internal/v4l2_subdevice.h" + +using namespace std::chrono_literals; + +namespace libcamera { + +using namespace RPi; + +LOG_DEFINE_CATEGORY(RPI) + +namespace { + +constexpr unsigned int defaultRawBitDepth = 12; + +bool isRaw(const PixelFormat &pixFmt) +{ + /* This test works for both Bayer and raw mono formats. */ + return BayerFormat::fromPixelFormat(pixFmt).isValid(); +} + +PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code, + BayerFormat::Packing packingReq) +{ + BayerFormat bayer = BayerFormat::fromMbusCode(mbus_code); + + ASSERT(bayer.isValid()); + + bayer.packing = packingReq; + PixelFormat pix = bayer.toPixelFormat(); + + /* + * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed + * variants. So if the PixelFormat returns as invalid, use the non-packed + * conversion instead. 
+ */ + if (!pix.isValid()) { + bayer.packing = BayerFormat::Packing::None; + pix = bayer.toPixelFormat(); + } + + return pix; +} + +SensorFormats populateSensorFormats(std::unique_ptr &sensor) +{ + SensorFormats formats; + + for (auto const mbusCode : sensor->mbusCodes()) + formats.emplace(mbusCode, sensor->sizes(mbusCode)); + + return formats; +} + +bool isMonoSensor(std::unique_ptr &sensor) +{ + unsigned int mbusCode = sensor->mbusCodes()[0]; + const BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode); + + return bayer.order == BayerFormat::Order::MONO; +} + +double scoreFormat(double desired, double actual) +{ + double score = desired - actual; + /* Smaller desired dimensions are preferred. */ + if (score < 0.0) + score = (-score) / 8; + /* Penalise non-exact matches. */ + if (actual != desired) + score *= 2; + + return score; +} + +V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth) +{ + double bestScore = std::numeric_limits::max(), score; + V4L2SubdeviceFormat bestFormat; + bestFormat.colorSpace = ColorSpace::Raw; + + constexpr float penaltyAr = 1500.0; + constexpr float penaltyBitDepth = 500.0; + + /* Calculate the closest/best mode from the user requested size. */ + for (const auto &iter : formatsMap) { + const unsigned int mbusCode = iter.first; + const PixelFormat format = mbusCodeToPixelFormat(mbusCode, + BayerFormat::Packing::None); + const PixelFormatInfo &info = PixelFormatInfo::info(format); + + for (const Size &size : iter.second) { + double reqAr = static_cast(req.width) / req.height; + double fmtAr = static_cast(size.width) / size.height; + + /* Score the dimensions for closeness. */ + score = scoreFormat(req.width, size.width); + score += scoreFormat(req.height, size.height); + score += penaltyAr * scoreFormat(reqAr, fmtAr); + + /* Add any penalties... this is not an exact science! 
*/ + score += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth; + + if (score <= bestScore) { + bestScore = score; + bestFormat.mbus_code = mbusCode; + bestFormat.size = size; + } + + LOG(RPI, Debug) << "Format: " << size + << " fmt " << format + << " Score: " << score + << " (best " << bestScore << ")"; + } + } + + return bestFormat; +} + +const std::vector validColorSpaces = { + ColorSpace::Sycc, + ColorSpace::Smpte170m, + ColorSpace::Rec709 +}; + +std::optional findValidColorSpace(const ColorSpace &colourSpace) +{ + for (auto cs : validColorSpaces) { + if (colourSpace.primaries == cs.primaries && + colourSpace.transferFunction == cs.transferFunction) + return cs; + } + + return std::nullopt; +} + +bool isRgb(const PixelFormat &pixFmt) +{ + const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt); + return info.colourEncoding == PixelFormatInfo::ColourEncodingRGB; +} + +bool isYuv(const PixelFormat &pixFmt) +{ + /* The code below would return true for raw mono streams, so weed those out first. */ + if (isRaw(pixFmt)) + return false; + + const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt); + return info.colourEncoding == PixelFormatInfo::ColourEncodingYUV; +} + +} /* namespace */ + +/* + * Raspberry Pi drivers expect the following colour spaces: + * - V4L2_COLORSPACE_RAW for raw streams. + * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for + * non-raw streams. Other fields such as transfer function, YCbCr encoding and + * quantisation are not used. 
+ * + * The libcamera colour spaces that we wish to use corresponding to these are therefore: + * - ColorSpace::Raw for V4L2_COLORSPACE_RAW + * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG + * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M + * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709 + */ +CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags) +{ + Status status = Valid; + yuvColorSpace_.reset(); + + for (auto cfg : config_) { + /* First fix up raw streams to have the "raw" colour space. */ + if (isRaw(cfg.pixelFormat)) { + /* If there was no value here, that doesn't count as "adjusted". */ + if (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw) + status = Adjusted; + cfg.colorSpace = ColorSpace::Raw; + continue; + } + + /* Next we need to find our shared colour space. The first valid one will do. */ + if (cfg.colorSpace && !yuvColorSpace_) + yuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value()); + } + + /* If no colour space was given anywhere, choose sYCC. */ + if (!yuvColorSpace_) + yuvColorSpace_ = ColorSpace::Sycc; + + /* Note the version of this that any RGB streams will have to use. */ + rgbColorSpace_ = yuvColorSpace_; + rgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None; + rgbColorSpace_->range = ColorSpace::Range::Full; + + /* Go through the streams again and force everyone to the same colour space. */ + for (auto cfg : config_) { + if (cfg.colorSpace == ColorSpace::Raw) + continue; + + if (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) { + /* Again, no value means "not adjusted". */ + if (cfg.colorSpace) + status = Adjusted; + cfg.colorSpace = yuvColorSpace_; + } + if (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) { + /* Be nice, and let the YUV version count as non-adjusted too. 
*/ + if (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_) + status = Adjusted; + cfg.colorSpace = rgbColorSpace_; + } + } + + return status; +} + +CameraConfiguration::Status RPiCameraConfiguration::validate() +{ + Status status = Valid; + + if (config_.empty()) + return Invalid; + + status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace); + + /* + * Validate the requested transform against the sensor capabilities and + * rotation and store the final combined transform that configure() will + * need to apply to the sensor to save us working it out again. + */ + Transform requestedTransform = transform; + combinedTransform_ = data_->sensor_->validateTransform(&transform); + if (transform != requestedTransform) + status = Adjusted; + + std::vector rawStreams, outStreams; + for (const auto &[index, cfg] : utils::enumerate(config_)) { + if (isRaw(cfg.pixelFormat)) + rawStreams.emplace_back(index, &cfg); + else + outStreams.emplace_back(index, &cfg); + } + + /* Sort the streams so the highest resolution is first. */ + std::sort(rawStreams.begin(), rawStreams.end(), + [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; }); + + std::sort(outStreams.begin(), outStreams.end(), + [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; }); + + /* Do any platform specific fixups. */ + status = data_->platformValidate(rawStreams, outStreams); + if (status == Invalid) + return Invalid; + + /* Further fixups on the RAW streams. */ + for (auto &raw : rawStreams) { + StreamConfiguration &cfg = config_.at(raw.index); + V4L2DeviceFormat rawFormat; + + const PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat); + unsigned int bitDepth = info.isValid() ? 
info.bitsPerPixel : defaultRawBitDepth; + V4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth); + + rawFormat.size = sensorFormat.size; + rawFormat.fourcc = raw.dev->toV4L2PixelFormat(cfg.pixelFormat); + + int ret = raw.dev->tryFormat(&rawFormat); + if (ret) + return Invalid; + /* + * Some sensors change their Bayer order when they are h-flipped + * or v-flipped, according to the transform. If this one does, we + * must advertise the transformed Bayer order in the raw stream. + * Note how we must fetch the "native" (i.e. untransformed) Bayer + * order, because the sensor may currently be flipped! + */ + V4L2PixelFormat fourcc = rawFormat.fourcc; + if (data_->flipsAlterBayerOrder_) { + BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc); + bayer.order = data_->nativeBayerOrder_; + bayer = bayer.transform(combinedTransform_); + fourcc = bayer.toV4L2PixelFormat(); + } + + PixelFormat inputPixFormat = fourcc.toPixelFormat(); + if (raw.cfg->size != rawFormat.size || raw.cfg->pixelFormat != inputPixFormat) { + raw.cfg->size = rawFormat.size; + raw.cfg->pixelFormat = inputPixFormat; + status = Adjusted; + } + + raw.cfg->stride = rawFormat.planes[0].bpl; + raw.cfg->frameSize = rawFormat.planes[0].size; + } + + /* Further fixups on the ISP output streams. */ + for (auto &out : outStreams) { + StreamConfiguration &cfg = config_.at(out.index); + PixelFormat &cfgPixFmt = cfg.pixelFormat; + V4L2VideoDevice::Formats fmts = out.dev->formats(); + + if (fmts.find(out.dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) { + /* If we cannot find a native format, use a default one. */ + cfgPixFmt = formats::NV12; + status = Adjusted; + } + + V4L2DeviceFormat format; + format.fourcc = out.dev->toV4L2PixelFormat(cfg.pixelFormat); + format.size = cfg.size; + /* We want to send the associated YCbCr info through to the driver. 
*/ + format.colorSpace = yuvColorSpace_; + + LOG(RPI, Debug) + << "Try color space " << ColorSpace::toString(cfg.colorSpace); + + int ret = out.dev->tryFormat(&format); + if (ret) + return Invalid; + + /* + * But for RGB streams, the YCbCr info gets overwritten on the way back + * so we must check against what the stream cfg says, not what we actually + * requested (which carefully included the YCbCr info)! + */ + if (cfg.colorSpace != format.colorSpace) { + status = Adjusted; + LOG(RPI, Debug) + << "Color space changed from " + << ColorSpace::toString(cfg.colorSpace) << " to " + << ColorSpace::toString(format.colorSpace); + } + + cfg.colorSpace = format.colorSpace; + cfg.stride = format.planes[0].bpl; + cfg.frameSize = format.planes[0].size; + } + + return status; +} + +V4L2DeviceFormat PipelineHandlerBase::toV4L2DeviceFormat(const V4L2VideoDevice *dev, + const V4L2SubdeviceFormat &format, + BayerFormat::Packing packingReq) +{ + unsigned int mbus_code = format.mbus_code; + const PixelFormat pix = mbusCodeToPixelFormat(mbus_code, packingReq); + V4L2DeviceFormat deviceFormat; + + deviceFormat.fourcc = dev->toV4L2PixelFormat(pix); + deviceFormat.size = format.size; + deviceFormat.colorSpace = format.colorSpace; + return deviceFormat; +} + +std::unique_ptr +PipelineHandlerBase::generateConfiguration(Camera *camera, const StreamRoles &roles) +{ + CameraData *data = cameraData(camera); + std::unique_ptr config = + std::make_unique(data); + V4L2SubdeviceFormat sensorFormat; + unsigned int bufferCount; + PixelFormat pixelFormat; + V4L2VideoDevice::Formats fmts; + Size size; + std::optional colorSpace; + + if (roles.empty()) + return config; + + Size sensorSize = data->sensor_->resolution(); + for (const StreamRole role : roles) { + switch (role) { + case StreamRole::Raw: + size = sensorSize; + sensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth); + pixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code, + BayerFormat::Packing::CSI2); + 
ASSERT(pixelFormat.isValid()); + colorSpace = ColorSpace::Raw; + bufferCount = 2; + break; + + case StreamRole::StillCapture: + fmts = data->ispFormats(); + pixelFormat = formats::NV12; + /* + * Still image codecs usually expect the sYCC color space. + * Even RGB codecs will be fine as the RGB we get with the + * sYCC color space is the same as sRGB. + */ + colorSpace = ColorSpace::Sycc; + /* Return the largest sensor resolution. */ + size = sensorSize; + bufferCount = 1; + break; + + case StreamRole::VideoRecording: + /* + * The colour denoise algorithm requires the analysis + * image, produced by the second ISP output, to be in + * YUV420 format. Select this format as the default, to + * maximize chances that it will be picked by + * applications and enable usage of the colour denoise + * algorithm. + */ + fmts = data->ispFormats(); + pixelFormat = formats::YUV420; + /* + * Choose a color space appropriate for video recording. + * Rec.709 will be a good default for HD resolutions. + */ + colorSpace = ColorSpace::Rec709; + size = { 1920, 1080 }; + bufferCount = 4; + break; + + case StreamRole::Viewfinder: + fmts = data->ispFormats(); + pixelFormat = formats::ARGB8888; + colorSpace = ColorSpace::Sycc; + size = { 800, 600 }; + bufferCount = 4; + break; + + default: + LOG(RPI, Error) << "Requested stream role not supported: " + << role; + return nullptr; + } + + std::map> deviceFormats; + if (role == StreamRole::Raw) { + /* Translate the MBUS codes to a PixelFormat. */ + for (const auto &format : data->sensorFormats_) { + PixelFormat pf = mbusCodeToPixelFormat(format.first, + BayerFormat::Packing::CSI2); + if (pf.isValid()) + deviceFormats.emplace(std::piecewise_construct, std::forward_as_tuple(pf), + std::forward_as_tuple(format.second.begin(), format.second.end())); + } + } else { + /* + * Translate the V4L2PixelFormat to PixelFormat. Note that we + * limit the recommended largest ISP output size to match the + * sensor resolution. 
+ */ + for (const auto &format : fmts) { + PixelFormat pf = format.first.toPixelFormat(); + if (pf.isValid()) { + const SizeRange &ispSizes = format.second[0]; + deviceFormats[pf].emplace_back(ispSizes.min, sensorSize, + ispSizes.hStep, ispSizes.vStep); + } + } + } + + /* Add the stream format based on the device node used for the use case. */ + StreamFormats formats(deviceFormats); + StreamConfiguration cfg(formats); + cfg.size = size; + cfg.pixelFormat = pixelFormat; + cfg.colorSpace = colorSpace; + cfg.bufferCount = bufferCount; + config->addConfiguration(cfg); + } + + config->validate(); + + return config; +} + +int PipelineHandlerBase::configure(Camera *camera, CameraConfiguration *config) +{ + CameraData *data = cameraData(camera); + int ret; + + /* Start by freeing all buffers and reset the stream states. */ + data->freeBuffers(); + for (auto const stream : data->streams_) + stream->setExternal(false); + + std::vector rawStreams, ispStreams; + std::optional packing; + unsigned int bitDepth = defaultRawBitDepth; + + for (unsigned i = 0; i < config->size(); i++) { + StreamConfiguration *cfg = &config->at(i); + + if (isRaw(cfg->pixelFormat)) + rawStreams.emplace_back(i, cfg); + else + ispStreams.emplace_back(i, cfg); + } + + /* Sort the streams so the highest resolution is first. */ + std::sort(rawStreams.begin(), rawStreams.end(), + [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; }); + + std::sort(ispStreams.begin(), ispStreams.end(), + [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; }); + + /* + * Calculate the best sensor mode we can use based on the user's request, + * and apply it to the sensor with the cached tranform, if any. + * + * If we have been given a RAW stream, use that size for setting up the sensor. + */ + if (!rawStreams.empty()) { + BayerFormat bayerFormat = BayerFormat::fromPixelFormat(rawStreams[0].cfg->pixelFormat); + /* Replace the user requested packing/bit-depth. 
*/ + packing = bayerFormat.packing; + bitDepth = bayerFormat.bitDepth; + } + + V4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_, + rawStreams.empty() ? ispStreams[0].cfg->size + : rawStreams[0].cfg->size, + bitDepth); + /* Apply any cached transform. */ + const RPiCameraConfiguration *rpiConfig = static_cast(config); + + /* Then apply the format on the sensor. */ + ret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_); + if (ret) + return ret; + + /* + * Platform specific internal stream configuration. This also assigns + * external streams which get configured below. + */ + ret = data->platformConfigure(sensorFormat, packing, rawStreams, ispStreams); + if (ret) + return ret; + + ipa::RPi::ConfigResult result; + ret = data->configureIPA(config, &result); + if (ret) { + LOG(RPI, Error) << "Failed to configure the IPA: " << ret; + return ret; + } + + /* + * Set the scaler crop to the value we are using (scaled to native sensor + * coordinates). + */ + data->scalerCrop_ = data->scaleIspCrop(data->ispCrop_); + + /* + * Update the ScalerCropMaximum to the correct value for this camera mode. + * For us, it's the same as the "analogue crop". + * + * \todo Make this property the ScalerCrop maximum value when dynamic + * controls are available and set it at validate() time + */ + data->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop); + + /* Store the mode sensitivity for the application. */ + data->properties_.set(properties::SensorSensitivity, result.modeSensitivity); + + /* Update the controls that the Raspberry Pi IPA can handle. */ + ControlInfoMap::Map ctrlMap; + for (auto const &c : result.controlInfo) + ctrlMap.emplace(c.first, c.second); + + /* Add the ScalerCrop control limits based on the current mode. 
*/ + Rectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_)); + ctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, data->scalerCrop_); + + data->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap()); + + /* Setup the Video Mux/Bridge entities. */ + for (auto &[device, link] : data->bridgeDevices_) { + /* + * Start by disabling all the sink pad links on the devices in the + * cascade, with the exception of the link connecting the device. + */ + for (const MediaPad *p : device->entity()->pads()) { + if (!(p->flags() & MEDIA_PAD_FL_SINK)) + continue; + + for (MediaLink *l : p->links()) { + if (l != link) + l->setEnabled(false); + } + } + + /* + * Next, enable the entity -> entity links, and setup the pad format. + * + * \todo Some bridge devices may chainge the media bus code, so we + * ought to read the source pad format and propagate it to the sink pad. + */ + link->setEnabled(true); + const MediaPad *sinkPad = link->sink(); + ret = device->setFormat(sinkPad->index(), &sensorFormat); + if (ret) { + LOG(RPI, Error) << "Failed to set format on " << device->entity()->name() + << " pad " << sinkPad->index() + << " with format " << sensorFormat + << ": " << ret; + return ret; + } + + LOG(RPI, Debug) << "Configured media link on device " << device->entity()->name() + << " on pad " << sinkPad->index(); + } + + return 0; +} + +int PipelineHandlerBase::exportFrameBuffers([[maybe_unused]] Camera *camera, libcamera::Stream *stream, + std::vector> *buffers) +{ + RPi::Stream *s = static_cast(stream); + unsigned int count = stream->configuration().bufferCount; + int ret = s->dev()->exportBuffers(count, buffers); + + s->setExportedBuffers(buffers); + + return ret; +} + +int PipelineHandlerBase::start(Camera *camera, const ControlList *controls) +{ + CameraData *data = cameraData(camera); + int ret; + + /* Check if a ScalerCrop control was specified. 
*/ + if (controls) + data->applyScalerCrop(*controls); + + /* Start the IPA. */ + ipa::RPi::StartResult result; + data->ipa_->start(controls ? *controls : ControlList{ controls::controls }, + &result); + + /* Apply any gain/exposure settings that the IPA may have passed back. */ + if (!result.controls.empty()) + data->setSensorControls(result.controls); + + /* Configure the number of dropped frames required on startup. */ + data->dropFrameCount_ = data->config_.disableStartupFrameDrops + ? 0 : result.dropFrameCount; + + for (auto const stream : data->streams_) + stream->resetBuffers(); + + if (!data->buffersAllocated_) { + /* Allocate buffers for internal pipeline usage. */ + ret = prepareBuffers(camera); + if (ret) { + LOG(RPI, Error) << "Failed to allocate buffers"; + data->freeBuffers(); + stop(camera); + return ret; + } + data->buffersAllocated_ = true; + } + + /* We need to set the dropFrameCount_ before queueing buffers. */ + ret = queueAllBuffers(camera); + if (ret) { + LOG(RPI, Error) << "Failed to queue buffers"; + stop(camera); + return ret; + } + + /* + * Reset the delayed controls with the gain and exposure values set by + * the IPA. + */ + data->delayedCtrls_->reset(0); + data->state_ = CameraData::State::Idle; + + /* Enable SOF event generation. */ + data->frontendDevice()->setFrameStartEnabled(true); + + data->platformStart(); + + /* Start all streams. */ + for (auto const stream : data->streams_) { + ret = stream->dev()->streamOn(); + if (ret) { + stop(camera); + return ret; + } + } + + return 0; +} + +void PipelineHandlerBase::stopDevice(Camera *camera) +{ + CameraData *data = cameraData(camera); + + data->state_ = CameraData::State::Stopped; + data->platformStop(); + + for (auto const stream : data->streams_) + stream->dev()->streamOff(); + + /* Disable SOF event generation. */ + data->frontendDevice()->setFrameStartEnabled(false); + + data->clearIncompleteRequests(); + + /* Stop the IPA. 
*/ + data->ipa_->stop(); +} + +void PipelineHandlerBase::releaseDevice(Camera *camera) +{ + CameraData *data = cameraData(camera); + data->freeBuffers(); +} + +int PipelineHandlerBase::queueRequestDevice(Camera *camera, Request *request) +{ + CameraData *data = cameraData(camera); + + if (!data->isRunning()) + return -EINVAL; + + LOG(RPI, Debug) << "queueRequestDevice: New request."; + + /* Push all buffers supplied in the Request to the respective streams. */ + for (auto stream : data->streams_) { + if (!stream->isExternal()) + continue; + + FrameBuffer *buffer = request->findBuffer(stream); + if (buffer && !stream->getBufferId(buffer)) { + /* + * This buffer is not recognised, so it must have been allocated + * outside the v4l2 device. Store it in the stream buffer list + * so we can track it. + */ + stream->setExternalBuffer(buffer); + } + + /* + * If no buffer is provided by the request for this stream, we + * queue a nullptr to the stream to signify that it must use an + * internally allocated buffer for this capture request. This + * buffer will not be given back to the application, but is used + * to support the internal pipeline flow. + * + * The below queueBuffer() call will do nothing if there are not + * enough internal buffers allocated, but this will be handled by + * queuing the request for buffers in the RPiStream object. + */ + int ret = stream->queueBuffer(buffer); + if (ret) + return ret; + } + + /* Push the request to the back of the queue. 
*/ + data->requestQueue_.push(request); + data->handleState(); + + return 0; +} + +int PipelineHandlerBase::registerCamera(std::unique_ptr &cameraData, + MediaDevice *frontend, const std::string &frontendName, + MediaDevice *backend, MediaEntity *sensorEntity) +{ + CameraData *data = cameraData.get(); + int ret; + + data->sensor_ = std::make_unique(sensorEntity); + if (!data->sensor_) + return -EINVAL; + + if (data->sensor_->init()) + return -EINVAL; + + data->sensorFormats_ = populateSensorFormats(data->sensor_); + + /* + * Enumerate all the Video Mux/Bridge devices across the sensor -> Fr + * chain. There may be a cascade of devices in this chain! + */ + MediaLink *link = sensorEntity->getPadByIndex(0)->links()[0]; + data->enumerateVideoDevices(link, frontendName); + + ipa::RPi::InitResult result; + if (data->loadIPA(&result)) { + LOG(RPI, Error) << "Failed to load a suitable IPA library"; + return -EINVAL; + } + + /* + * Setup our delayed control writer with the sensor default + * gain and exposure delays. Mark VBLANK for priority write. + */ + std::unordered_map params = { + { V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } }, + { V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } }, + { V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } }, + { V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } } + }; + data->delayedCtrls_ = std::make_unique(data->sensor_->device(), params); + data->sensorMetadata_ = result.sensorConfig.sensorMetadata; + + /* Register initial controls that the Raspberry Pi IPA can handle. */ + data->controlInfo_ = std::move(result.controlInfo); + + /* Initialize the camera properties. */ + data->properties_ = data->sensor_->properties(); + + /* + * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the + * sensor of the colour gains. It is defined to be a linear gain where + * the default value represents a gain of exactly one. 
+ */ + auto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS); + if (it != data->sensor_->controls().end()) + data->notifyGainsUnity_ = it->second.def().get(); + + /* + * Set a default value for the ScalerCropMaximum property to show + * that we support its use, however, initialise it to zero because + * it's not meaningful until a camera mode has been chosen. + */ + data->properties_.set(properties::ScalerCropMaximum, Rectangle{}); + + /* + * We cache two things about the sensor in relation to transforms + * (meaning horizontal and vertical flips): if they affect the Bayer + * ordering, and what the "native" Bayer order is, when no transforms + * are applied. + * + * If flips are supported verify if they affect the Bayer ordering + * and what the "native" Bayer order is, when no transforms are + * applied. + * + * We note that the sensor's cached list of supported formats is + * already in the "native" order, with any flips having been undone. + */ + const V4L2Subdevice *sensor = data->sensor_->device(); + const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP); + if (hflipCtrl) { + /* We assume it will support vflips too... */ + data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT; + } + + /* Look for a valid Bayer format. */ + BayerFormat bayerFormat; + for (const auto &iter : data->sensorFormats_) { + bayerFormat = BayerFormat::fromMbusCode(iter.first); + if (bayerFormat.isValid()) + break; + } + + if (!bayerFormat.isValid()) { + LOG(RPI, Error) << "No Bayer format found"; + return -EINVAL; + } + data->nativeBayerOrder_ = bayerFormat.order; + + ret = data->loadPipelineConfiguration(); + if (ret) { + LOG(RPI, Error) << "Unable to load pipeline configuration"; + return ret; + } + + ret = platformRegister(cameraData, frontend, backend); + if (ret) + return ret; + + /* Setup the general IPA signal handlers. 
*/ + data->frontendDevice()->dequeueTimeout.connect(data, &RPi::CameraData::cameraTimeout); + data->frontendDevice()->frameStart.connect(data, &RPi::CameraData::frameStarted); + data->ipa_->setDelayedControls.connect(data, &CameraData::setDelayedControls); + data->ipa_->setLensControls.connect(data, &CameraData::setLensControls); + data->ipa_->metadataReady.connect(data, &CameraData::metadataReady); + + return 0; +} + +void PipelineHandlerBase::mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask) +{ + CameraData *data = cameraData(camera); + std::vector bufferIds; + /* + * Link the FrameBuffers with the id (key value) in the map stored in + * the RPi stream object - along with an identifier mask. + * + * This will allow us to identify buffers passed between the pipeline + * handler and the IPA. + */ + for (auto const &it : buffers) { + bufferIds.push_back(IPABuffer(mask | it.first, + it.second->planes())); + data->bufferIds_.insert(mask | it.first); + } + + data->ipa_->mapBuffers(bufferIds); +} + +int PipelineHandlerBase::queueAllBuffers(Camera *camera) +{ + CameraData *data = cameraData(camera); + int ret; + + for (auto const stream : data->streams_) { + if (!stream->isExternal()) { + ret = stream->queueAllBuffers(); + if (ret < 0) + return ret; + } else { + /* + * For external streams, we must queue up a set of internal + * buffers to handle the number of drop frames requested by + * the IPA. This is done by passing nullptr in queueBuffer(). + * + * The below queueBuffer() call will do nothing if there + * are not enough internal buffers allocated, but this will + * be handled by queuing the request for buffers in the + * RPiStream object. + */ + unsigned int i; + for (i = 0; i < data->dropFrameCount_; i++) { + ret = stream->queueBuffer(nullptr); + if (ret) + return ret; + } + } + } + + return 0; +} + +void CameraData::freeBuffers() +{ + if (ipa_) { + /* + * Copy the buffer ids from the unordered_set to a vector to + * pass to the IPA. 
+ */ + std::vector<unsigned int> bufferIds(bufferIds_.begin(), + bufferIds_.end()); + ipa_->unmapBuffers(bufferIds); + bufferIds_.clear(); + } + + for (auto const stream : streams_) + stream->releaseBuffers(); + + platformFreeBuffers(); + + buffersAllocated_ = false; +} + +/* + * enumerateVideoDevices() iterates over the Media Controller topology, starting + * at the sensor and finishing at the frontend. For each sensor, CameraData stores + * a unique list of any intermediate video mux or bridge devices connected in a + * cascade, together with the entity to entity link. + * + * Entity pad configuration and link enabling happens at the end of configure(). + * We first disable all pad links on each entity device in the chain, and then + * selectively enable the specific links connecting the sensor to the frontend + * across all intermediate muxes and bridges. + * + * In the cascaded topology below, if Sensor1 is used, the Mux2 -> Mux1 link + * will be disabled, and Sensor1 -> Mux1 -> Frontend links enabled. Alternatively, + * if Sensor3 is used, the Sensor2 -> Mux2 and Sensor1 -> Mux1 links are disabled, + * and Sensor3 -> Mux2 -> Mux1 -> Frontend links are enabled. All other links will + * remain unchanged. + * + * +----------+ + * | FE | + * +-----^----+ + * | + * +---+---+ + * | Mux1 |<------+ + * +--^---- | + * | | + * +-----+---+ +---+---+ + * | Sensor1 | | Mux2 |<--+ + * +---------+ +-^-----+ | + * | | + * +-------+-+ +---+-----+ + * | Sensor2 | | Sensor3 | + * +---------+ +---------+ + */ +void CameraData::enumerateVideoDevices(MediaLink *link, const std::string &frontend) +{ + const MediaPad *sinkPad = link->sink(); + const MediaEntity *entity = sinkPad->entity(); + bool frontendFound = false; + + /* We only deal with Video Mux and Bridge devices in cascade. */ + if (entity->function() != MEDIA_ENT_F_VID_MUX && + entity->function() != MEDIA_ENT_F_VID_IF_BRIDGE) + return; + + /* Find the source pad for this Video Mux or Bridge device. 
*/ + const MediaPad *sourcePad = nullptr; + for (const MediaPad *pad : entity->pads()) { + if (pad->flags() & MEDIA_PAD_FL_SOURCE) { + /* + * We can only deal with devices that have a single source + * pad. If this device has multiple source pads, ignore it + * and this branch in the cascade. + */ + if (sourcePad) + return; + + sourcePad = pad; + } + } + + LOG(RPI, Debug) << "Found video mux device " << entity->name() + << " linked to sink pad " << sinkPad->index(); + + bridgeDevices_.emplace_back(std::make_unique<V4L2Subdevice>(entity), link); + bridgeDevices_.back().first->open(); + + /* + * Iterate through all the sink pad links down the cascade to find any + * other Video Mux and Bridge devices. + */ + for (MediaLink *l : sourcePad->links()) { + enumerateVideoDevices(l, frontend); + /* Once we reach the Frontend entity, we are done. */ + if (l->sink()->entity()->name() == frontend) { + frontendFound = true; + break; + } + } + + /* This identifies the end of our entity enumeration recursion. */ + if (link->source()->entity()->function() == MEDIA_ENT_F_CAM_SENSOR) { + /* + * If the frontend is not at the end of this cascade, we cannot + * configure this topology automatically, so remove all entity + * references. 
+ */ + if (!frontendFound) { + LOG(RPI, Warning) << "Cannot automatically configure this MC topology!"; + bridgeDevices_.clear(); + } + } +} + +int CameraData::loadPipelineConfiguration() +{ + config_ = { + .disableStartupFrameDrops = false, + .cameraTimeoutValue = 0, + }; + + /* Initial configuration of the platform, in case no config file is present */ + platformPipelineConfigure({}); + + char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RPI_CONFIG_FILE"); + if (!configFromEnv || *configFromEnv == '\0') + return 0; + + std::string filename = std::string(configFromEnv); + File file(filename); + + if (!file.open(File::OpenModeFlag::ReadOnly)) { + LOG(RPI, Error) << "Failed to open configuration file '" << filename << "'"; + return -EIO; + } + + LOG(RPI, Info) << "Using configuration file '" << filename << "'"; + + std::unique_ptr<YamlObject> root = YamlParser::parse(file); + if (!root) { + LOG(RPI, Warning) << "Failed to parse configuration file, using defaults"; + return 0; + } + + std::optional<double> ver = (*root)["version"].get<double>(); + if (!ver || *ver != 1.0) { + LOG(RPI, Error) << "Unexpected configuration file version reported"; + return -EINVAL; + } + + const YamlObject &phConfig = (*root)["pipeline_handler"]; + + config_.disableStartupFrameDrops = + phConfig["disable_startup_frame_drops"].get(config_.disableStartupFrameDrops); + + config_.cameraTimeoutValue = + phConfig["camera_timeout_value_ms"].get(config_.cameraTimeoutValue); + + if (config_.cameraTimeoutValue) { + /* Disable the IPA signal to control timeout and set the user requested value. */ + ipa_->setCameraTimeout.disconnect(); + frontendDevice()->setDequeueTimeout(config_.cameraTimeoutValue * 1ms); + } + + return platformPipelineConfigure(root); +} + +int CameraData::loadIPA(ipa::RPi::InitResult *result) +{ + ipa_ = IPAManager::createIPA<ipa::RPi::IPAProxyRPi>(pipe(), 1, 1); + + if (!ipa_) + return -ENOENT; + + /* + * The configuration (tuning file) is made from the sensor name unless + * the environment variable overrides it. 
+ */ + std::string configurationFile; + char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RPI_TUNING_FILE"); + if (!configFromEnv || *configFromEnv == '\0') { + std::string model = sensor_->model(); + if (isMonoSensor(sensor_)) + model += "_mono"; + configurationFile = ipa_->configurationFile(model + ".json"); + } else { + configurationFile = std::string(configFromEnv); + } + + IPASettings settings(configurationFile, sensor_->model()); + ipa::RPi::InitParams params; + + params.lensPresent = !!sensor_->focusLens(); + int ret = platformInitIpa(params); + if (ret) + return ret; + + return ipa_->init(settings, params, result); +} + +int CameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result) +{ + std::map<unsigned int, ControlInfoMap> entityControls; + ipa::RPi::ConfigParams params; + int ret; + + params.sensorControls = sensor_->controls(); + if (sensor_->focusLens()) + params.lensControls = sensor_->focusLens()->controls(); + + ret = platformConfigureIpa(params); + if (ret) + return ret; + + /* We store the IPACameraSensorInfo for digital zoom calculations. */ + ret = sensor_->sensorInfo(&sensorInfo_); + if (ret) { + LOG(RPI, Error) << "Failed to retrieve camera sensor info"; + return ret; + } + + /* Always send the user transform to the IPA. */ + params.transform = static_cast<unsigned int>(config->transform); + + /* Ready the IPA - it must know about the sensor resolution. */ + ret = ipa_->configure(sensorInfo_, params, result); + if (ret < 0) { + LOG(RPI, Error) << "IPA configuration failed!"; + return -EPIPE; + } + + if (!result->controls.empty()) + setSensorControls(result->controls); + + return 0; +} + +void CameraData::metadataReady(const ControlList &metadata) +{ + if (!isRunning()) + return; + + /* Last thing to do is to fill up the request metadata. 
*/ + Request *request = requestQueue_.front(); + request->metadata().merge(metadata); + + /* + * Inform the sensor of the latest colour gains if it has the + * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set). + */ + const auto &colourGains = metadata.get(libcamera::controls::ColourGains); + if (notifyGainsUnity_ && colourGains) { + /* The control wants linear gains in the order B, Gb, Gr, R. */ + ControlList ctrls(sensor_->controls()); + std::array<int32_t, 4> gains{ + static_cast<int32_t>((*colourGains)[1] * *notifyGainsUnity_), + *notifyGainsUnity_, + *notifyGainsUnity_, + static_cast<int32_t>((*colourGains)[0] * *notifyGainsUnity_) + }; + ctrls.set(V4L2_CID_NOTIFY_GAINS, Span<const int32_t>{ gains }); + + sensor_->setControls(&ctrls); + } +} + +void CameraData::setDelayedControls(const ControlList &controls, uint32_t delayContext) +{ + if (!delayedCtrls_->push(controls, delayContext)) + LOG(RPI, Error) << "V4L2 DelayedControl set failed"; +} + +void CameraData::setLensControls(const ControlList &controls) +{ + CameraLens *lens = sensor_->focusLens(); + + if (lens && controls.contains(V4L2_CID_FOCUS_ABSOLUTE)) { + ControlValue const &focusValue = controls.get(V4L2_CID_FOCUS_ABSOLUTE); + lens->setFocusPosition(focusValue.get<int32_t>()); + } +} + +void CameraData::setSensorControls(ControlList &controls) +{ + /* + * We need to ensure that if both VBLANK and EXPOSURE are present, the + * former must be written ahead of, and separately from EXPOSURE to avoid + * V4L2 rejecting the latter. This is identical to what DelayedControls + * does with the priority write flag. + * + * As a consequence of the below logic, VBLANK gets set twice, and we + * rely on the v4l2 framework to not pass the second control set to the + * driver as the actual control value has not changed. 
+ */ + if (controls.contains(V4L2_CID_EXPOSURE) && controls.contains(V4L2_CID_VBLANK)) { + ControlList vblank_ctrl; + + vblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK)); + sensor_->setControls(&vblank_ctrl); + } + + sensor_->setControls(&controls); +} + +Rectangle CameraData::scaleIspCrop(const Rectangle &ispCrop) const +{ + /* + * Scale a crop rectangle defined in the ISP's coordinates into native sensor + * coordinates. + */ + Rectangle nativeCrop = ispCrop.scaledBy(sensorInfo_.analogCrop.size(), + sensorInfo_.outputSize); + nativeCrop.translateBy(sensorInfo_.analogCrop.topLeft()); + return nativeCrop; +} + +void CameraData::applyScalerCrop(const ControlList &controls) +{ + const auto &scalerCrop = controls.get(controls::ScalerCrop); + if (scalerCrop) { + Rectangle nativeCrop = *scalerCrop; + + if (!nativeCrop.width || !nativeCrop.height) + nativeCrop = { 0, 0, 1, 1 }; + + /* Create a version of the crop scaled to ISP (camera mode) pixels. */ + Rectangle ispCrop = nativeCrop.translatedBy(-sensorInfo_.analogCrop.topLeft()); + ispCrop.scaleBy(sensorInfo_.outputSize, sensorInfo_.analogCrop.size()); + + /* + * The crop that we set must be: + * 1. At least as big as ispMinCropSize_, once that's been + * enlarged to the same aspect ratio. + * 2. With the same mid-point, if possible. + * 3. But it can't go outside the sensor area. + */ + Size minSize = ispMinCropSize_.expandedToAspectRatio(nativeCrop.size()); + Size size = ispCrop.size().expandedTo(minSize); + ispCrop = size.centeredTo(ispCrop.center()).enclosedIn(Rectangle(sensorInfo_.outputSize)); + + if (ispCrop != ispCrop_) { + ispCrop_ = ispCrop; + platformSetIspCrop(); + + /* + * Also update the ScalerCrop in the metadata with what we actually + * used. But we must first rescale that from ISP (camera mode) pixels + * back into sensor native pixels. 
+ */ + scalerCrop_ = scaleIspCrop(ispCrop_); + } + } +} + +void CameraData::cameraTimeout() +{ + LOG(RPI, Error) << "Camera frontend has timed out!"; + LOG(RPI, Error) << "Please check that your camera sensor connector is attached securely."; + LOG(RPI, Error) << "Alternatively, try another cable and/or sensor."; + + state_ = CameraData::State::Error; + platformStop(); + + /* + * To allow the application to attempt a recovery from this timeout, + * stop all devices streaming, and return any outstanding requests as + * incomplete and cancelled. + */ + for (auto const stream : streams_) + stream->dev()->streamOff(); + + clearIncompleteRequests(); +} + +void CameraData::frameStarted(uint32_t sequence) +{ + LOG(RPI, Debug) << "Frame start " << sequence; + + /* Write any controls for the next frame as soon as we can. */ + delayedCtrls_->applyControls(sequence); +} + +void CameraData::clearIncompleteRequests() +{ + /* + * All outstanding requests (and associated buffers) must be returned + * back to the application. + */ + while (!requestQueue_.empty()) { + Request *request = requestQueue_.front(); + + for (auto &b : request->buffers()) { + FrameBuffer *buffer = b.second; + /* + * Has the buffer already been handed back to the + * request? If not, do so now. + */ + if (buffer->request()) { + buffer->_d()->cancel(); + pipe()->completeBuffer(request, buffer); + } + } + + pipe()->completeRequest(request); + requestQueue_.pop(); + } +} + +void CameraData::handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream) +{ + /* + * It is possible to be here without a pending request, so check + * that we actually have one to action, otherwise we just return + * buffer back to the stream. + */ + Request *request = requestQueue_.empty() ? nullptr : requestQueue_.front(); + if (!dropFrameCount_ && request && request->findBuffer(stream) == buffer) { + /* + * Check if this is an externally provided buffer, and if + * so, we must stop tracking it in the pipeline handler. 
+ */ + handleExternalBuffer(buffer, stream); + /* + * Tag the buffer as completed, returning it to the + * application. + */ + pipe()->completeBuffer(request, buffer); + } else { + /* + * This buffer was not part of the Request (which happens if an + * internal buffer was used for an external stream, or + * unconditionally for internal streams), or there is no pending + * request, so we can recycle it. + */ + stream->returnBuffer(buffer); + } +} + +void CameraData::handleState() +{ + switch (state_) { + case State::Stopped: + case State::Busy: + case State::Error: + break; + + case State::IpaComplete: + /* If the request is completed, we will switch to Idle state. */ + checkRequestCompleted(); + /* + * No break here, we want to try running the pipeline again. + * The fallthrough clause below suppresses compiler warnings. + */ + [[fallthrough]]; + + case State::Idle: + tryRunPipeline(); + break; + } +} + +void CameraData::handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream) +{ + unsigned int id = stream->getBufferId(buffer); + + if (!(id & MaskExternalBuffer)) + return; + + /* Stop the Stream object from tracking the buffer. */ + stream->removeExternalBuffer(buffer); +} + +void CameraData::checkRequestCompleted() +{ + bool requestCompleted = false; + /* + * If we are dropping this frame, do not touch the request, simply + * change the state to IDLE when ready. + */ + if (!dropFrameCount_) { + Request *request = requestQueue_.front(); + if (request->hasPendingBuffers()) + return; + + /* Must wait for metadata to be filled in before completing. */ + if (state_ != State::IpaComplete) + return; + + pipe()->completeRequest(request); + requestQueue_.pop(); + requestCompleted = true; + } + + /* + * Make sure we have three outputs completed in the case of a dropped + * frame. 
+ */ + if (state_ == State::IpaComplete && + ((ispOutputCount_ == ispOutputTotal_ && dropFrameCount_) || + requestCompleted)) { + state_ = State::Idle; + if (dropFrameCount_) { + dropFrameCount_--; + LOG(RPI, Debug) << "Dropping frame at the request of the IPA (" + << dropFrameCount_ << " left)"; + } + } +} + +void CameraData::fillRequestMetadata(const ControlList &bufferControls, Request *request) +{ + request->metadata().set(controls::SensorTimestamp, + bufferControls.get(controls::SensorTimestamp).value_or(0)); + + request->metadata().set(controls::ScalerCrop, scalerCrop_); +} + +} /* namespace libcamera */ diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.h b/src/libcamera/pipeline/rpi/common/pipeline_base.h new file mode 100644 index 000000000000..6b19b56c8ee8 --- /dev/null +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.h @@ -0,0 +1,277 @@ +/* SPDX-License-Identifier: LGPL-2.1-or-later */ +/* + * Copyright (C) 2019-2023, Raspberry Pi Ltd + * + * pipeline_base.h - Pipeline handler base class for Raspberry Pi devices + */ + +#include +#include +#include +#include +#include +#include +#include +#include + +#include +#include + +#include "libcamera/internal/bayer_format.h" +#include "libcamera/internal/camera.h" +#include "libcamera/internal/camera_sensor.h" +#include "libcamera/internal/framebuffer.h" +#include "libcamera/internal/media_device.h" +#include "libcamera/internal/media_object.h" +#include "libcamera/internal/pipeline_handler.h" +#include "libcamera/internal/v4l2_videodevice.h" +#include "libcamera/internal/yaml_parser.h" + +#include +#include + +#include "delayed_controls.h" +#include "rpi_stream.h" + +using namespace std::chrono_literals; + +namespace libcamera { + +namespace RPi { + +/* Map of mbus codes to supported sizes reported by the sensor. 
*/ +using SensorFormats = std::map<unsigned int, std::vector<Size>>; + +class CameraData : public Camera::Private +{ +public: + CameraData(PipelineHandler *pipe) + : Camera::Private(pipe), state_(State::Stopped), + flipsAlterBayerOrder_(false), dropFrameCount_(0), buffersAllocated_(false), + ispOutputCount_(0), ispOutputTotal_(0) + { + } + + virtual ~CameraData() + { + } + + struct StreamParams { + StreamParams() + : index(0), cfg(nullptr), dev(nullptr) + { + } + + StreamParams(unsigned int index_, StreamConfiguration *cfg_) + : index(index_), cfg(cfg_), dev(nullptr) + { + } + + unsigned int index; + StreamConfiguration *cfg; + V4L2VideoDevice *dev; + }; + + virtual CameraConfiguration::Status platformValidate(std::vector<StreamParams> &rawStreams, + std::vector<StreamParams> &outStreams) const = 0; + virtual int platformConfigure(const V4L2SubdeviceFormat &sensorFormat, + std::optional<BayerFormat::Packing> packing, + std::vector<StreamParams> &rawStreams, + std::vector<StreamParams> &outStreams) = 0; + virtual void platformStart() = 0; + virtual void platformStop() = 0; + + void freeBuffers(); + virtual void platformFreeBuffers() = 0; + + void enumerateVideoDevices(MediaLink *link, const std::string &frontend); + + int loadPipelineConfiguration(); + int loadIPA(ipa::RPi::InitResult *result); + int configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result); + virtual int platformInitIpa(ipa::RPi::InitParams &params) = 0; + virtual int platformConfigureIpa(ipa::RPi::ConfigParams &params) = 0; + + void metadataReady(const ControlList &metadata); + void setDelayedControls(const ControlList &controls, uint32_t delayContext); + void setLensControls(const ControlList &controls); + void setSensorControls(ControlList &controls); + + Rectangle scaleIspCrop(const Rectangle &ispCrop) const; + void applyScalerCrop(const ControlList &controls); + virtual void platformSetIspCrop() = 0; + + void cameraTimeout(); + void frameStarted(uint32_t sequence); + + void clearIncompleteRequests(); + void handleStreamBuffer(FrameBuffer *buffer, Stream *stream); + void handleState(); + 
+ virtual V4L2VideoDevice::Formats ispFormats() const = 0; + virtual V4L2VideoDevice::Formats rawFormats() const = 0; + virtual V4L2VideoDevice *frontendDevice() = 0; + + virtual int platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) = 0; + + std::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_; + + std::unique_ptr<CameraSensor> sensor_; + SensorFormats sensorFormats_; + + /* The vector below is just for convenience when iterating over all streams. */ + std::vector<Stream *> streams_; + /* Stores the ids of the buffers mapped in the IPA. */ + std::unordered_set<unsigned int> bufferIds_; + /* + * Stores a cascade of Video Mux or Bridge devices between the sensor and + * the frontend together with the media link across the entities. + */ + std::vector<std::pair<std::unique_ptr<V4L2Subdevice>, MediaLink *>> bridgeDevices_; + + std::unique_ptr<DelayedControls> delayedCtrls_; + bool sensorMetadata_; + + /* + * All the functions in this class are called from a single calling + * thread. So, we do not need to have any mutex to protect access to any + * of the variables below. + */ + enum class State { Stopped, Idle, Busy, IpaComplete, Error }; + State state_; + + bool isRunning() + { + return state_ != State::Stopped && state_ != State::Error; + } + + std::queue<Request *> requestQueue_; + + /* Store the "native" Bayer order (that is, with no transforms applied). */ + bool flipsAlterBayerOrder_; + BayerFormat::Order nativeBayerOrder_; + + /* For handling digital zoom. */ + IPACameraSensorInfo sensorInfo_; + Rectangle ispCrop_; /* crop in ISP (camera mode) pixels */ + Rectangle scalerCrop_; /* crop in sensor native pixels */ + Size ispMinCropSize_; + + unsigned int dropFrameCount_; + + /* + * If set, this stores the value that represents a gain of one for + * the V4L2_CID_NOTIFY_GAINS control. + */ + std::optional<int32_t> notifyGainsUnity_; + + /* Have internal buffers been allocated? */ + bool buffersAllocated_; + + struct Config { + /* + * Override any request from the IPA to drop a number of startup + * frames. 
+ */ + bool disableStartupFrameDrops; + /* + * Override the camera timeout value calculated by the IPA based + * on frame durations. + */ + unsigned int cameraTimeoutValue; + }; + + Config config_; + +protected: + void fillRequestMetadata(const ControlList &bufferControls, + Request *request); + + virtual void tryRunPipeline() = 0; + + unsigned int ispOutputCount_; + unsigned int ispOutputTotal_; + +private: + void handleExternalBuffer(FrameBuffer *buffer, Stream *stream); + void checkRequestCompleted(); +}; + +class PipelineHandlerBase : public PipelineHandler +{ +public: + PipelineHandlerBase(CameraManager *manager) + : PipelineHandler(manager) + { + } + + virtual ~PipelineHandlerBase() + { + } + + static V4L2DeviceFormat toV4L2DeviceFormat(const V4L2VideoDevice *dev, + const V4L2SubdeviceFormat &format, + BayerFormat::Packing packingReq); + + std::unique_ptr<CameraConfiguration> + generateConfiguration(Camera *camera, const StreamRoles &roles) override; + int configure(Camera *camera, CameraConfiguration *config) override; + + int exportFrameBuffers(Camera *camera, libcamera::Stream *stream, + std::vector<std::unique_ptr<FrameBuffer>> *buffers) override; + + int start(Camera *camera, const ControlList *controls) override; + void stopDevice(Camera *camera) override; + void releaseDevice(Camera *camera) override; + + int queueRequestDevice(Camera *camera, Request *request) override; + +protected: + int registerCamera(std::unique_ptr<CameraData> &cameraData, + MediaDevice *frontend, const std::string &frontendName, + MediaDevice *backend, MediaEntity *sensorEntity); + + void mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask); + + virtual int platformRegister(std::unique_ptr<CameraData> &cameraData, + MediaDevice *unicam, MediaDevice *isp) = 0; + +private: + CameraData *cameraData(Camera *camera) + { + return static_cast<CameraData *>(camera->_d()); + } + + int queueAllBuffers(Camera *camera); + virtual int prepareBuffers(Camera *camera) = 0; +}; + +class RPiCameraConfiguration final : public CameraConfiguration +{ +public: + 
RPiCameraConfiguration(const CameraData *data) + : CameraConfiguration(), data_(data) + { + } + + CameraConfiguration::Status validateColorSpaces(ColorSpaceFlags flags); + Status validate() override; + + /* Cache the combinedTransform_ that will be applied to the sensor */ + Transform combinedTransform_; + +private: + const CameraData *data_; + + /* + * Store the colour spaces that all our streams will have. RGB format streams + * will have the same colorspace as YUV streams, with the YCbCr field cleared + * and the range set to full. + */ + std::optional<ColorSpace> yuvColorSpace_; + std::optional<ColorSpace> rgbColorSpace_; +}; + +} /* namespace RPi */ + +} /* namespace libcamera */ diff --git a/src/libcamera/pipeline/rpi/vc4/data/example.yaml b/src/libcamera/pipeline/rpi/vc4/data/example.yaml index c90f518f8849..b8e01adeaf40 100644 --- a/src/libcamera/pipeline/rpi/vc4/data/example.yaml +++ b/src/libcamera/pipeline/rpi/vc4/data/example.yaml @@ -34,13 +34,13 @@ # # "disable_startup_frame_drops": false, - # Custom timeout value (in ms) for Unicam to use. This overrides + # Custom timeout value (in ms) for the camera to use. This overrides # the value computed by the pipeline handler based on frame # durations. # # Set this value to 0 to use the pipeline handler computed # timeout value. 
# - # "unicam_timeout_value_ms": 0, + # "camera_timeout_value_ms": 0, } } diff --git a/src/libcamera/pipeline/rpi/vc4/meson.build b/src/libcamera/pipeline/rpi/vc4/meson.build index 228823f30922..cdb049c58d2c 100644 --- a/src/libcamera/pipeline/rpi/vc4/meson.build +++ b/src/libcamera/pipeline/rpi/vc4/meson.build @@ -2,7 +2,7 @@ libcamera_sources += files([ 'dma_heaps.cpp', - 'raspberrypi.cpp', + 'vc4.cpp', ]) subdir('data') diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp deleted file mode 100644 index 27b3bde4dbac..000000000000 --- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp +++ /dev/null @@ -1,2432 +0,0 @@ -/* SPDX-License-Identifier: LGPL-2.1-or-later */ -/* - * Copyright (C) 2019-2021, Raspberry Pi Ltd - * - * raspberrypi.cpp - Pipeline handler for VC4-based Raspberry Pi devices - */ -#include -#include -#include -#include -#include -#include -#include -#include -#include - -#include -#include -#include - -#include -#include -#include -#include -#include -#include -#include -#include - -#include -#include -#include - -#include "libcamera/internal/bayer_format.h" -#include "libcamera/internal/camera.h" -#include "libcamera/internal/camera_lens.h" -#include "libcamera/internal/camera_sensor.h" -#include "libcamera/internal/device_enumerator.h" -#include "libcamera/internal/framebuffer.h" -#include "libcamera/internal/ipa_manager.h" -#include "libcamera/internal/media_device.h" -#include "libcamera/internal/pipeline_handler.h" -#include "libcamera/internal/v4l2_videodevice.h" -#include "libcamera/internal/yaml_parser.h" - -#include "../common/delayed_controls.h" -#include "../common/rpi_stream.h" -#include "dma_heaps.h" - -using namespace std::chrono_literals; - -namespace libcamera { - -LOG_DEFINE_CATEGORY(RPI) - -namespace { - -constexpr unsigned int defaultRawBitDepth = 12; - -/* Map of mbus codes to supported sizes reported by the sensor. 
*/ -using SensorFormats = std::map>; - -SensorFormats populateSensorFormats(std::unique_ptr &sensor) -{ - SensorFormats formats; - - for (auto const mbusCode : sensor->mbusCodes()) - formats.emplace(mbusCode, sensor->sizes(mbusCode)); - - return formats; -} - -bool isMonoSensor(std::unique_ptr &sensor) -{ - unsigned int mbusCode = sensor->mbusCodes()[0]; - const BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode); - - return bayer.order == BayerFormat::Order::MONO; -} - -PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code, - BayerFormat::Packing packingReq) -{ - BayerFormat bayer = BayerFormat::fromMbusCode(mbus_code); - - ASSERT(bayer.isValid()); - - bayer.packing = packingReq; - PixelFormat pix = bayer.toPixelFormat(); - - /* - * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed - * variants. So if the PixelFormat returns as invalid, use the non-packed - * conversion instead. - */ - if (!pix.isValid()) { - bayer.packing = BayerFormat::Packing::None; - pix = bayer.toPixelFormat(); - } - - return pix; -} - -V4L2DeviceFormat toV4L2DeviceFormat(const V4L2VideoDevice *dev, - const V4L2SubdeviceFormat &format, - BayerFormat::Packing packingReq) -{ - const PixelFormat pix = mbusCodeToPixelFormat(format.mbus_code, packingReq); - V4L2DeviceFormat deviceFormat; - - deviceFormat.fourcc = dev->toV4L2PixelFormat(pix); - deviceFormat.size = format.size; - deviceFormat.colorSpace = format.colorSpace; - return deviceFormat; -} - -bool isRaw(const PixelFormat &pixFmt) -{ - /* This test works for both Bayer and raw mono formats. */ - return BayerFormat::fromPixelFormat(pixFmt).isValid(); -} - -double scoreFormat(double desired, double actual) -{ - double score = desired - actual; - /* Smaller desired dimensions are preferred. */ - if (score < 0.0) - score = (-score) / 8; - /* Penalise non-exact matches. 
*/ - if (actual != desired) - score *= 2; - - return score; -} - -V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth) -{ - double bestScore = std::numeric_limits::max(), score; - V4L2SubdeviceFormat bestFormat; - bestFormat.colorSpace = ColorSpace::Raw; - - constexpr float penaltyAr = 1500.0; - constexpr float penaltyBitDepth = 500.0; - - /* Calculate the closest/best mode from the user requested size. */ - for (const auto &iter : formatsMap) { - const unsigned int mbusCode = iter.first; - const PixelFormat format = mbusCodeToPixelFormat(mbusCode, - BayerFormat::Packing::None); - const PixelFormatInfo &info = PixelFormatInfo::info(format); - - for (const Size &size : iter.second) { - double reqAr = static_cast(req.width) / req.height; - double fmtAr = static_cast(size.width) / size.height; - - /* Score the dimensions for closeness. */ - score = scoreFormat(req.width, size.width); - score += scoreFormat(req.height, size.height); - score += penaltyAr * scoreFormat(reqAr, fmtAr); - - /* Add any penalties... this is not an exact science! 
*/ - score += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth; - - if (score <= bestScore) { - bestScore = score; - bestFormat.mbus_code = mbusCode; - bestFormat.size = size; - } - - LOG(RPI, Debug) << "Format: " << size - << " fmt " << format - << " Score: " << score - << " (best " << bestScore << ")"; - } - } - - return bestFormat; -} - -enum class Unicam : unsigned int { Image, Embedded }; -enum class Isp : unsigned int { Input, Output0, Output1, Stats }; - -} /* namespace */ - -class RPiCameraData : public Camera::Private -{ -public: - RPiCameraData(PipelineHandler *pipe) - : Camera::Private(pipe), state_(State::Stopped), - flipsAlterBayerOrder_(false), dropFrameCount_(0), - buffersAllocated_(false), ispOutputCount_(0) - { - } - - ~RPiCameraData() - { - freeBuffers(); - } - - void freeBuffers(); - void frameStarted(uint32_t sequence); - - int loadIPA(ipa::RPi::InitResult *result); - int configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result); - int loadPipelineConfiguration(); - - void enumerateVideoDevices(MediaLink *link); - - void processStatsComplete(const ipa::RPi::BufferIds &buffers); - void metadataReady(const ControlList &metadata); - void prepareIspComplete(const ipa::RPi::BufferIds &buffers); - void setIspControls(const ControlList &controls); - void setDelayedControls(const ControlList &controls, uint32_t delayContext); - void setLensControls(const ControlList &controls); - void setCameraTimeout(uint32_t maxExposureTimeMs); - void setSensorControls(ControlList &controls); - void unicamTimeout(); - - /* bufferComplete signal handlers. 
*/ - void unicamBufferDequeue(FrameBuffer *buffer); - void ispInputDequeue(FrameBuffer *buffer); - void ispOutputDequeue(FrameBuffer *buffer); - - void clearIncompleteRequests(); - void handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream); - void handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream); - void handleState(); - Rectangle scaleIspCrop(const Rectangle &ispCrop) const; - void applyScalerCrop(const ControlList &controls); - - std::unique_ptr ipa_; - - std::unique_ptr sensor_; - SensorFormats sensorFormats_; - /* Array of Unicam and ISP device streams and associated buffers/streams. */ - RPi::Device unicam_; - RPi::Device isp_; - /* The vector below is just for convenience when iterating over all streams. */ - std::vector streams_; - /* Stores the ids of the buffers mapped in the IPA. */ - std::unordered_set bufferIds_; - /* - * Stores a cascade of Video Mux or Bridge devices between the sensor and - * Unicam together with media link across the entities. - */ - std::vector, MediaLink *>> bridgeDevices_; - - /* DMAHEAP allocation helper. */ - RPi::DmaHeap dmaHeap_; - SharedFD lsTable_; - - std::unique_ptr delayedCtrls_; - bool sensorMetadata_; - - /* - * All the functions in this class are called from a single calling - * thread. So, we do not need to have any mutex to protect access to any - * of the variables below. - */ - enum class State { Stopped, Idle, Busy, IpaComplete, Error }; - State state_; - - bool isRunning() - { - return state_ != State::Stopped && state_ != State::Error; - } - - struct BayerFrame { - FrameBuffer *buffer; - ControlList controls; - unsigned int delayContext; - }; - - std::queue bayerQueue_; - std::queue embeddedQueue_; - std::deque requestQueue_; - - /* - * Store the "native" Bayer order (that is, with no transforms - * applied). - */ - bool flipsAlterBayerOrder_; - BayerFormat::Order nativeBayerOrder_; - - /* For handling digital zoom. 
*/ - IPACameraSensorInfo sensorInfo_; - Rectangle ispCrop_; /* crop in ISP (camera mode) pixels */ - Rectangle scalerCrop_; /* crop in sensor native pixels */ - Size ispMinCropSize_; - - unsigned int dropFrameCount_; - - /* - * If set, this stores the value that represets a gain of one for - * the V4L2_CID_NOTIFY_GAINS control. - */ - std::optional notifyGainsUnity_; - - /* Have internal buffers been allocated? */ - bool buffersAllocated_; - - struct Config { - /* - * The minimum number of internal buffers to be allocated for - * the Unicam Image stream. - */ - unsigned int minUnicamBuffers; - /* - * The minimum total (internal + external) buffer count used for - * the Unicam Image stream. - * - * Note that: - * minTotalUnicamBuffers must be >= 1, and - * minTotalUnicamBuffers >= minUnicamBuffers - */ - unsigned int minTotalUnicamBuffers; - /* - * Override any request from the IPA to drop a number of startup - * frames. - */ - bool disableStartupFrameDrops; - /* - * Override the Unicam timeout value calculated by the IPA based - * on frame durations. - */ - unsigned int unicamTimeoutValue; - }; - - Config config_; - -private: - void checkRequestCompleted(); - void fillRequestMetadata(const ControlList &bufferControls, - Request *request); - void tryRunPipeline(); - bool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer); - - unsigned int ispOutputCount_; -}; - -class RPiCameraConfiguration : public CameraConfiguration -{ -public: - RPiCameraConfiguration(const RPiCameraData *data); - - CameraConfiguration::Status validateColorSpaces(ColorSpaceFlags flags); - Status validate() override; - - /* Cache the combinedTransform_ that will be applied to the sensor */ - Transform combinedTransform_; - -private: - const RPiCameraData *data_; - - /* - * Store the colour spaces that all our streams will have. RGB format streams - * will have the same colorspace as YUV streams, with YCbCr field cleared and - * range set to full. 
-	 */
-	std::optional<ColorSpace> yuvColorSpace_;
-	std::optional<ColorSpace> rgbColorSpace_;
-};
-
-class PipelineHandlerRPi : public PipelineHandler
-{
-public:
-	PipelineHandlerRPi(CameraManager *manager);
-
-	std::unique_ptr<CameraConfiguration> generateConfiguration(Camera *camera,
-								   const StreamRoles &roles) override;
-	int configure(Camera *camera, CameraConfiguration *config) override;
-
-	int exportFrameBuffers(Camera *camera, Stream *stream,
-			       std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;
-
-	int start(Camera *camera, const ControlList *controls) override;
-	void stopDevice(Camera *camera) override;
-
-	int queueRequestDevice(Camera *camera, Request *request) override;
-
-	bool match(DeviceEnumerator *enumerator) override;
-
-	void releaseDevice(Camera *camera) override;
-
-private:
-	RPiCameraData *cameraData(Camera *camera)
-	{
-		return static_cast<RPiCameraData *>(camera->_d());
-	}
-
-	int registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity);
-	int queueAllBuffers(Camera *camera);
-	int prepareBuffers(Camera *camera);
-	void mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask);
-};
-
-RPiCameraConfiguration::RPiCameraConfiguration(const RPiCameraData *data)
-	: CameraConfiguration(), data_(data)
-{
-}
-
-static const std::vector<ColorSpace> validColorSpaces = {
-	ColorSpace::Sycc,
-	ColorSpace::Smpte170m,
-	ColorSpace::Rec709
-};
-
-static std::optional<ColorSpace> findValidColorSpace(const ColorSpace &colourSpace)
-{
-	for (auto cs : validColorSpaces) {
-		if (colourSpace.primaries == cs.primaries &&
-		    colourSpace.transferFunction == cs.transferFunction)
-			return cs;
-	}
-
-	return std::nullopt;
-}
-
-static bool isRgb(const PixelFormat &pixFmt)
-{
-	const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);
-	return info.colourEncoding == PixelFormatInfo::ColourEncodingRGB;
-}
-
-static bool isYuv(const PixelFormat &pixFmt)
-{
-	/* The code below would return true for raw mono streams, so weed those out first. */
-	if (isRaw(pixFmt))
-		return false;
-
-	const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);
-	return info.colourEncoding == PixelFormatInfo::ColourEncodingYUV;
-}
-
-/*
- * Raspberry Pi drivers expect the following colour spaces:
- * - V4L2_COLORSPACE_RAW for raw streams.
- * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for
- *   non-raw streams. Other fields such as transfer function, YCbCr encoding and
- *   quantisation are not used.
- *
- * The libcamera colour spaces that we wish to use corresponding to these are therefore:
- * - ColorSpace::Raw for V4L2_COLORSPACE_RAW
- * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG
- * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M
- * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709
- */
-
-CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags)
-{
-	Status status = Valid;
-	yuvColorSpace_.reset();
-
-	for (auto cfg : config_) {
-		/* First fix up raw streams to have the "raw" colour space. */
-		if (isRaw(cfg.pixelFormat)) {
-			/* If there was no value here, that doesn't count as "adjusted". */
-			if (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw)
-				status = Adjusted;
-			cfg.colorSpace = ColorSpace::Raw;
-			continue;
-		}
-
-		/* Next we need to find our shared colour space. The first valid one will do. */
-		if (cfg.colorSpace && !yuvColorSpace_)
-			yuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value());
-	}
-
-	/* If no colour space was given anywhere, choose sYCC. */
-	if (!yuvColorSpace_)
-		yuvColorSpace_ = ColorSpace::Sycc;
-
-	/* Note the version of this that any RGB streams will have to use. */
-	rgbColorSpace_ = yuvColorSpace_;
-	rgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None;
-	rgbColorSpace_->range = ColorSpace::Range::Full;
-
-	/* Go through the streams again and force everyone to the same colour space. */
-	for (auto cfg : config_) {
-		if (cfg.colorSpace == ColorSpace::Raw)
-			continue;
-
-		if (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) {
-			/* Again, no value means "not adjusted". */
-			if (cfg.colorSpace)
-				status = Adjusted;
-			cfg.colorSpace = yuvColorSpace_;
-		}
-		if (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) {
-			/* Be nice, and let the YUV version count as non-adjusted too. */
-			if (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_)
-				status = Adjusted;
-			cfg.colorSpace = rgbColorSpace_;
-		}
-	}
-
-	return status;
-}
-
-CameraConfiguration::Status RPiCameraConfiguration::validate()
-{
-	Status status = Valid;
-
-	if (config_.empty())
-		return Invalid;
-
-	status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);
-
-	/*
-	 * Validate the requested transform against the sensor capabilities and
-	 * rotation and store the final combined transform that configure() will
-	 * need to apply to the sensor to save us working it out again.
-	 */
-	Transform requestedTransform = transform;
-	combinedTransform_ = data_->sensor_->validateTransform(&transform);
-	if (transform != requestedTransform)
-		status = Adjusted;
-
-	unsigned int rawCount = 0, outCount = 0, count = 0, maxIndex = 0;
-	std::pair<unsigned int, Size> outSize[2];
-	Size maxSize;
-	for (StreamConfiguration &cfg : config_) {
-		if (isRaw(cfg.pixelFormat)) {
-			/*
-			 * Calculate the best sensor mode we can use based on
-			 * the user request.
-			 */
-			V4L2VideoDevice *unicam = data_->unicam_[Unicam::Image].dev();
-			const PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat);
-			unsigned int bitDepth = info.isValid() ?
-						info.bitsPerPixel : defaultRawBitDepth;
-			V4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth);
-			BayerFormat::Packing packing = BayerFormat::Packing::CSI2;
-			if (info.isValid() && !info.packed)
-				packing = BayerFormat::Packing::None;
-			V4L2DeviceFormat unicamFormat = toV4L2DeviceFormat(unicam, sensorFormat, packing);
-			int ret = unicam->tryFormat(&unicamFormat);
-			if (ret)
-				return Invalid;
-
-			/*
-			 * Some sensors change their Bayer order when they are
-			 * h-flipped or v-flipped, according to the transform.
-			 * If this one does, we must advertise the transformed
-			 * Bayer order in the raw stream. Note how we must
-			 * fetch the "native" (i.e. untransformed) Bayer order,
-			 * because the sensor may currently be flipped!
-			 */
-			V4L2PixelFormat fourcc = unicamFormat.fourcc;
-			if (data_->flipsAlterBayerOrder_) {
-				BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);
-				bayer.order = data_->nativeBayerOrder_;
-				bayer = bayer.transform(combinedTransform_);
-				fourcc = bayer.toV4L2PixelFormat();
-			}
-
-			PixelFormat unicamPixFormat = fourcc.toPixelFormat();
-			if (cfg.size != unicamFormat.size ||
-			    cfg.pixelFormat != unicamPixFormat) {
-				cfg.size = unicamFormat.size;
-				cfg.pixelFormat = unicamPixFormat;
-				status = Adjusted;
-			}
-
-			cfg.stride = unicamFormat.planes[0].bpl;
-			cfg.frameSize = unicamFormat.planes[0].size;
-
-			rawCount++;
-		} else {
-			outSize[outCount] = std::make_pair(count, cfg.size);
-			/* Record the largest resolution for fixups later. */
-			if (maxSize < cfg.size) {
-				maxSize = cfg.size;
-				maxIndex = outCount;
-			}
-			outCount++;
-		}
-
-		count++;
-
-		/* Can only output 1 RAW stream, or 2 YUV/RGB streams. */
-		if (rawCount > 1 || outCount > 2) {
-			LOG(RPI, Error) << "Invalid number of streams requested";
-			return Invalid;
-		}
-	}
-
-	/*
-	 * Now do any fixups needed. For the two ISP outputs, one stream must be
-	 * equal or smaller than the other in all dimensions.
-	 */
-	for (unsigned int i = 0; i < outCount; i++) {
-		outSize[i].second.width = std::min(outSize[i].second.width,
-						   maxSize.width);
-		outSize[i].second.height = std::min(outSize[i].second.height,
-						    maxSize.height);
-
-		if (config_.at(outSize[i].first).size != outSize[i].second) {
-			config_.at(outSize[i].first).size = outSize[i].second;
-			status = Adjusted;
-		}
-
-		/*
-		 * Also validate the correct pixel formats here.
-		 * Note that Output0 and Output1 support a different
-		 * set of formats.
-		 *
-		 * Output 0 must be for the largest resolution. We will
-		 * have that fixed up in the code above.
-		 *
-		 */
-		StreamConfiguration &cfg = config_.at(outSize[i].first);
-		PixelFormat &cfgPixFmt = cfg.pixelFormat;
-		V4L2VideoDevice *dev;
-
-		if (i == maxIndex)
-			dev = data_->isp_[Isp::Output0].dev();
-		else
-			dev = data_->isp_[Isp::Output1].dev();
-
-		V4L2VideoDevice::Formats fmts = dev->formats();
-
-		if (fmts.find(dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) {
-			/* If we cannot find a native format, use a default one. */
-			cfgPixFmt = formats::NV12;
-			status = Adjusted;
-		}
-
-		V4L2DeviceFormat format;
-		format.fourcc = dev->toV4L2PixelFormat(cfg.pixelFormat);
-		format.size = cfg.size;
-		/* We want to send the associated YCbCr info through to the driver. */
-		format.colorSpace = yuvColorSpace_;
-
-		LOG(RPI, Debug)
-			<< "Try color space " << ColorSpace::toString(cfg.colorSpace);
-
-		int ret = dev->tryFormat(&format);
-		if (ret)
-			return Invalid;
-
-		/*
-		 * But for RGB streams, the YCbCr info gets overwritten on the way back
-		 * so we must check against what the stream cfg says, not what we actually
-		 * requested (which carefully included the YCbCr info)!
-		 */
-		if (cfg.colorSpace != format.colorSpace) {
-			status = Adjusted;
-			LOG(RPI, Debug)
-				<< "Color space changed from "
-				<< ColorSpace::toString(cfg.colorSpace) << " to "
-				<< ColorSpace::toString(format.colorSpace);
-		}
-
-		cfg.colorSpace = format.colorSpace;
-
-		cfg.stride = format.planes[0].bpl;
-		cfg.frameSize = format.planes[0].size;
-
-	}
-
-	return status;
-}
-
-PipelineHandlerRPi::PipelineHandlerRPi(CameraManager *manager)
-	: PipelineHandler(manager)
-{
-}
-
-std::unique_ptr<CameraConfiguration>
-PipelineHandlerRPi::generateConfiguration(Camera *camera, const StreamRoles &roles)
-{
-	RPiCameraData *data = cameraData(camera);
-	std::unique_ptr<CameraConfiguration> config =
-		std::make_unique<RPiCameraConfiguration>(data);
-	V4L2SubdeviceFormat sensorFormat;
-	unsigned int bufferCount;
-	PixelFormat pixelFormat;
-	V4L2VideoDevice::Formats fmts;
-	Size size;
-	std::optional<ColorSpace> colorSpace;
-
-	if (roles.empty())
-		return config;
-
-	unsigned int rawCount = 0;
-	unsigned int outCount = 0;
-	Size sensorSize = data->sensor_->resolution();
-	for (const StreamRole role : roles) {
-		switch (role) {
-		case StreamRole::Raw:
-			size = sensorSize;
-			sensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth);
-			pixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code,
-							    BayerFormat::Packing::CSI2);
-			ASSERT(pixelFormat.isValid());
-			colorSpace = ColorSpace::Raw;
-			bufferCount = 2;
-			rawCount++;
-			break;
-
-		case StreamRole::StillCapture:
-			fmts = data->isp_[Isp::Output0].dev()->formats();
-			pixelFormat = formats::NV12;
-			/*
-			 * Still image codecs usually expect the sYCC color space.
-			 * Even RGB codecs will be fine as the RGB we get with the
-			 * sYCC color space is the same as sRGB.
-			 */
-			colorSpace = ColorSpace::Sycc;
-			/* Return the largest sensor resolution. */
-			size = sensorSize;
-			bufferCount = 1;
-			outCount++;
-			break;
-
-		case StreamRole::VideoRecording:
-			/*
-			 * The colour denoise algorithm requires the analysis
-			 * image, produced by the second ISP output, to be in
-			 * YUV420 format. Select this format as the default, to
-			 * maximize chances that it will be picked by
-			 * applications and enable usage of the colour denoise
-			 * algorithm.
-			 */
-			fmts = data->isp_[Isp::Output0].dev()->formats();
-			pixelFormat = formats::YUV420;
-			/*
-			 * Choose a color space appropriate for video recording.
-			 * Rec.709 will be a good default for HD resolutions.
-			 */
-			colorSpace = ColorSpace::Rec709;
-			size = { 1920, 1080 };
-			bufferCount = 4;
-			outCount++;
-			break;
-
-		case StreamRole::Viewfinder:
-			fmts = data->isp_[Isp::Output0].dev()->formats();
-			pixelFormat = formats::ARGB8888;
-			colorSpace = ColorSpace::Sycc;
-			size = { 800, 600 };
-			bufferCount = 4;
-			outCount++;
-			break;
-
-		default:
-			LOG(RPI, Error) << "Requested stream role not supported: "
-					<< role;
-			return nullptr;
-		}
-
-		if (rawCount > 1 || outCount > 2) {
-			LOG(RPI, Error) << "Invalid stream roles requested";
-			return nullptr;
-		}
-
-		std::map<PixelFormat, std::vector<SizeRange>> deviceFormats;
-		if (role == StreamRole::Raw) {
-			/* Translate the MBUS codes to a PixelFormat. */
-			for (const auto &format : data->sensorFormats_) {
-				PixelFormat pf = mbusCodeToPixelFormat(format.first,
-								       BayerFormat::Packing::CSI2);
-				if (pf.isValid())
-					deviceFormats.emplace(std::piecewise_construct, std::forward_as_tuple(pf),
-							      std::forward_as_tuple(format.second.begin(), format.second.end()));
-			}
-		} else {
-			/*
-			 * Translate the V4L2PixelFormat to PixelFormat. Note that we
-			 * limit the recommended largest ISP output size to match the
-			 * sensor resolution.
-			 */
-			for (const auto &format : fmts) {
-				PixelFormat pf = format.first.toPixelFormat();
-				if (pf.isValid()) {
-					const SizeRange &ispSizes = format.second[0];
-					deviceFormats[pf].emplace_back(ispSizes.min, sensorSize,
-								       ispSizes.hStep, ispSizes.vStep);
-				}
-			}
-		}
-
-		/* Add the stream format based on the device node used for the use case. */
-		StreamFormats formats(deviceFormats);
-		StreamConfiguration cfg(formats);
-		cfg.size = size;
-		cfg.pixelFormat = pixelFormat;
-		cfg.colorSpace = colorSpace;
-		cfg.bufferCount = bufferCount;
-		config->addConfiguration(cfg);
-	}
-
-	config->validate();
-
-	return config;
-}
-
-int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
-{
-	RPiCameraData *data = cameraData(camera);
-	int ret;
-
-	/* Start by freeing all buffers and reset the Unicam and ISP stream states. */
-	data->freeBuffers();
-	for (auto const stream : data->streams_)
-		stream->setExternal(false);
-
-	BayerFormat::Packing packing = BayerFormat::Packing::CSI2;
-	Size maxSize, sensorSize;
-	unsigned int maxIndex = 0;
-	bool rawStream = false;
-	unsigned int bitDepth = defaultRawBitDepth;
-
-	/*
-	 * Look for the RAW stream (if given) size as well as the largest
-	 * ISP output size.
-	 */
-	for (unsigned i = 0; i < config->size(); i++) {
-		StreamConfiguration &cfg = config->at(i);
-
-		if (isRaw(cfg.pixelFormat)) {
-			/*
-			 * If we have been given a RAW stream, use that size
-			 * for setting up the sensor.
-			 */
-			sensorSize = cfg.size;
-			rawStream = true;
-			/* Check if the user has explicitly set an unpacked format. */
-			BayerFormat bayerFormat = BayerFormat::fromPixelFormat(cfg.pixelFormat);
-			packing = bayerFormat.packing;
-			bitDepth = bayerFormat.bitDepth;
-		} else {
-			if (cfg.size > maxSize) {
-				maxSize = config->at(i).size;
-				maxIndex = i;
-			}
-		}
-	}
-
-	/*
-	 * Calculate the best sensor mode we can use based on the user's
-	 * request, and apply it to the sensor with the cached transform, if
-	 * any.
-	 */
-	V4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_, rawStream ?
-							  sensorSize : maxSize, bitDepth);
-	const RPiCameraConfiguration *rpiConfig = static_cast<const RPiCameraConfiguration *>(config);
-	ret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_);
-	if (ret)
-		return ret;
-
-	V4L2VideoDevice *unicam = data->unicam_[Unicam::Image].dev();
-	V4L2DeviceFormat unicamFormat = toV4L2DeviceFormat(unicam, sensorFormat, packing);
-	ret = unicam->setFormat(&unicamFormat);
-	if (ret)
-		return ret;
-
-	LOG(RPI, Info) << "Sensor: " << camera->id()
-		       << " - Selected sensor format: " << sensorFormat
-		       << " - Selected unicam format: " << unicamFormat;
-
-	ret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat);
-	if (ret)
-		return ret;
-
-	/*
-	 * See which streams are requested, and route the user
-	 * StreamConfiguration appropriately.
-	 */
-	V4L2DeviceFormat format;
-	bool output0Set = false, output1Set = false;
-	for (unsigned i = 0; i < config->size(); i++) {
-		StreamConfiguration &cfg = config->at(i);
-
-		if (isRaw(cfg.pixelFormat)) {
-			cfg.setStream(&data->unicam_[Unicam::Image]);
-			data->unicam_[Unicam::Image].setExternal(true);
-			continue;
-		}
-
-		/* The largest resolution gets routed to the ISP Output 0 node. */
-		RPi::Stream *stream = i == maxIndex ? &data->isp_[Isp::Output0]
-						    : &data->isp_[Isp::Output1];
-
-		V4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg.pixelFormat);
-		format.size = cfg.size;
-		format.fourcc = fourcc;
-		format.colorSpace = cfg.colorSpace;
-
-		LOG(RPI, Debug) << "Setting " << stream->name() << " to "
-				<< format;
-
-		ret = stream->dev()->setFormat(&format);
-		if (ret)
-			return -EINVAL;
-
-		if (format.size != cfg.size || format.fourcc != fourcc) {
-			LOG(RPI, Error)
-				<< "Failed to set requested format on " << stream->name()
-				<< ", returned " << format;
-			return -EINVAL;
-		}
-
-		LOG(RPI, Debug)
-			<< "Stream " << stream->name() << " has color space "
-			<< ColorSpace::toString(cfg.colorSpace);
-
-		cfg.setStream(stream);
-		stream->setExternal(true);
-
-		if (i != maxIndex)
-			output1Set = true;
-		else
-			output0Set = true;
-	}
-
-	/*
-	 * If ISP::Output0 stream has not been configured by the application,
-	 * we must allow the hardware to generate an output so that the data
-	 * flow in the pipeline handler remains consistent, and we still generate
-	 * statistics for the IPA to use. So enable the output at a very low
-	 * resolution for internal use.
-	 *
-	 * \todo Allow the pipeline to work correctly without Output0 and only
-	 * statistics coming from the hardware.
-	 */
-	if (!output0Set) {
-		V4L2VideoDevice *dev = data->isp_[Isp::Output0].dev();
-
-		maxSize = Size(320, 240);
-		format = {};
-		format.size = maxSize;
-		format.fourcc = dev->toV4L2PixelFormat(formats::YUV420);
-		/* No one asked for output, so the color space doesn't matter. */
-		format.colorSpace = ColorSpace::Sycc;
-		ret = dev->setFormat(&format);
-		if (ret) {
-			LOG(RPI, Error)
-				<< "Failed to set default format on ISP Output0: "
-				<< ret;
-			return -EINVAL;
-		}
-
-		LOG(RPI, Debug) << "Defaulting ISP Output0 format to "
-				<< format;
-	}
-
-	/*
-	 * If ISP::Output1 stream has not been requested by the application, we
-	 * set it up for internal use now. This second stream will be used for
-	 * fast colour denoise, and must be a quarter resolution of the ISP::Output0
-	 * stream. However, also limit the maximum size to 1200 pixels in the
-	 * larger dimension, just to avoid being wasteful with buffer allocations
-	 * and memory bandwidth.
-	 *
-	 * \todo If Output 1 format is not YUV420, Output 1 ought to be disabled as
-	 * colour denoise will not run.
-	 */
-	if (!output1Set) {
-		V4L2VideoDevice *dev = data->isp_[Isp::Output1].dev();
-
-		V4L2DeviceFormat output1Format;
-		constexpr Size maxDimensions(1200, 1200);
-		const Size limit = maxDimensions.boundedToAspectRatio(format.size);
-
-		output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2);
-		output1Format.colorSpace = format.colorSpace;
-		output1Format.fourcc = dev->toV4L2PixelFormat(formats::YUV420);
-
-		LOG(RPI, Debug) << "Setting ISP Output1 (internal) to "
-				<< output1Format;
-
-		ret = dev->setFormat(&output1Format);
-		if (ret) {
-			LOG(RPI, Error) << "Failed to set format on ISP Output1: "
-					<< ret;
-			return -EINVAL;
-		}
-	}
-
-	/* ISP statistics output format. */
-	format = {};
-	format.fourcc = V4L2PixelFormat(V4L2_META_FMT_BCM2835_ISP_STATS);
-	ret = data->isp_[Isp::Stats].dev()->setFormat(&format);
-	if (ret) {
-		LOG(RPI, Error) << "Failed to set format on ISP stats stream: "
-				<< format;
-		return ret;
-	}
-
-	/* Figure out the smallest selection the ISP will allow. */
-	Rectangle testCrop(0, 0, 1, 1);
-	data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop);
-	data->ispMinCropSize_ = testCrop.size();
-
-	/* Adjust aspect ratio by providing crops on the input image. */
-	Size size = unicamFormat.size.boundedToAspectRatio(maxSize);
-	Rectangle crop = size.centeredTo(Rectangle(unicamFormat.size).center());
-	Rectangle defaultCrop = crop;
-	data->ispCrop_ = crop;
-
-	data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop);
-
-	ipa::RPi::ConfigResult result;
-	ret = data->configureIPA(config, &result);
-	if (ret)
-		LOG(RPI, Error) << "Failed to configure the IPA: " << ret;
-
-	/*
-	 * Set the scaler crop to the value we are using (scaled to native sensor
-	 * coordinates).
-	 */
-	data->scalerCrop_ = data->scaleIspCrop(data->ispCrop_);
-
-	/*
-	 * Configure the Unicam embedded data output format only if the sensor
-	 * supports it.
-	 */
-	if (data->sensorMetadata_) {
-		V4L2SubdeviceFormat embeddedFormat;
-
-		data->sensor_->device()->getFormat(1, &embeddedFormat);
-		format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
-		format.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;
-
-		LOG(RPI, Debug) << "Setting embedded data format.";
-		ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);
-		if (ret) {
-			LOG(RPI, Error) << "Failed to set format on Unicam embedded: "
-					<< format;
-			return ret;
-		}
-	}
-
-	/*
-	 * Update the ScalerCropMaximum to the correct value for this camera mode.
-	 * For us, it's the same as the "analogue crop".
-	 *
-	 * \todo Make this property the ScalerCrop maximum value when dynamic
-	 * controls are available and set it at validate() time
-	 */
-	data->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop);
-
-	/* Store the mode sensitivity for the application. */
-	data->properties_.set(properties::SensorSensitivity, result.modeSensitivity);
-
-	/* Update the controls that the Raspberry Pi IPA can handle. */
-	ControlInfoMap::Map ctrlMap;
-	for (auto const &c : result.controlInfo)
-		ctrlMap.emplace(c.first, c.second);
-
-	/* Add the ScalerCrop control limits based on the current mode. */
-	Rectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_));
-	defaultCrop = data->scaleIspCrop(defaultCrop);
-	ctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, defaultCrop);
-
-	data->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap());
-
-	/* Setup the Video Mux/Bridge entities. */
-	for (auto &[device, link] : data->bridgeDevices_) {
-		/*
-		 * Start by disabling all the sink pad links on the devices in the
-		 * cascade, with the exception of the link connecting the device.
-		 */
-		for (const MediaPad *p : device->entity()->pads()) {
-			if (!(p->flags() & MEDIA_PAD_FL_SINK))
-				continue;
-
-			for (MediaLink *l : p->links()) {
-				if (l != link)
-					l->setEnabled(false);
-			}
-		}
-
-		/*
-		 * Next, enable the entity -> entity links, and setup the pad format.
-		 *
-		 * \todo Some bridge devices may change the media bus code, so we
-		 * ought to read the source pad format and propagate it to the sink pad.
-		 */
-		link->setEnabled(true);
-		const MediaPad *sinkPad = link->sink();
-		ret = device->setFormat(sinkPad->index(), &sensorFormat);
-		if (ret) {
-			LOG(RPI, Error) << "Failed to set format on " << device->entity()->name()
-					<< " pad " << sinkPad->index()
-					<< " with format " << format
-					<< ": " << ret;
-			return ret;
-		}
-
-		LOG(RPI, Debug) << "Configured media link on device " << device->entity()->name()
-				<< " on pad " << sinkPad->index();
-	}
-
-	return ret;
-}
-
-int PipelineHandlerRPi::exportFrameBuffers([[maybe_unused]] Camera *camera, Stream *stream,
-					   std::vector<std::unique_ptr<FrameBuffer>> *buffers)
-{
-	RPi::Stream *s = static_cast<RPi::Stream *>(stream);
-	unsigned int count = stream->configuration().bufferCount;
-	int ret = s->dev()->exportBuffers(count, buffers);
-
-	s->setExportedBuffers(buffers);
-
-	return ret;
-}
-
-int PipelineHandlerRPi::start(Camera *camera, const ControlList *controls)
-{
-	RPiCameraData *data = cameraData(camera);
-	int ret;
-
-	/* Check if a ScalerCrop control was specified. */
-	if (controls)
-		data->applyScalerCrop(*controls);
-
-	/* Start the IPA. */
-	ipa::RPi::StartResult result;
-	data->ipa_->start(controls ? *controls : ControlList{ controls::controls },
-			  &result);
-
-	/* Apply any gain/exposure settings that the IPA may have passed back. */
-	if (!result.controls.empty())
-		data->setSensorControls(result.controls);
-
-	/* Configure the number of dropped frames required on startup. */
-	data->dropFrameCount_ = data->config_.disableStartupFrameDrops
-				? 0 : result.dropFrameCount;
-
-	for (auto const stream : data->streams_)
-		stream->resetBuffers();
-
-	if (!data->buffersAllocated_) {
-		/* Allocate buffers for internal pipeline usage. */
-		ret = prepareBuffers(camera);
-		if (ret) {
-			LOG(RPI, Error) << "Failed to allocate buffers";
-			data->freeBuffers();
-			stop(camera);
-			return ret;
-		}
-		data->buffersAllocated_ = true;
-	}
-
-	/* We need to set the dropFrameCount_ before queueing buffers. */
-	ret = queueAllBuffers(camera);
-	if (ret) {
-		LOG(RPI, Error) << "Failed to queue buffers";
-		stop(camera);
-		return ret;
-	}
-
-	/* Enable SOF event generation. */
-	data->unicam_[Unicam::Image].dev()->setFrameStartEnabled(true);
-
-	/*
-	 * Reset the delayed controls with the gain and exposure values set by
-	 * the IPA.
-	 */
-	data->delayedCtrls_->reset(0);
-
-	data->state_ = RPiCameraData::State::Idle;
-
-	/* Start all streams. */
-	for (auto const stream : data->streams_) {
-		ret = stream->dev()->streamOn();
-		if (ret) {
-			stop(camera);
-			return ret;
-		}
-	}
-
-	return 0;
-}
-
-void PipelineHandlerRPi::stopDevice(Camera *camera)
-{
-	RPiCameraData *data = cameraData(camera);
-
-	data->state_ = RPiCameraData::State::Stopped;
-
-	/* Disable SOF event generation. */
-	data->unicam_[Unicam::Image].dev()->setFrameStartEnabled(false);
-
-	for (auto const stream : data->streams_)
-		stream->dev()->streamOff();
-
-	data->clearIncompleteRequests();
-	data->bayerQueue_ = {};
-	data->embeddedQueue_ = {};
-
-	/* Stop the IPA. */
-	data->ipa_->stop();
-}
-
-int PipelineHandlerRPi::queueRequestDevice(Camera *camera, Request *request)
-{
-	RPiCameraData *data = cameraData(camera);
-
-	if (!data->isRunning())
-		return -EINVAL;
-
-	LOG(RPI, Debug) << "queueRequestDevice: New request.";
-
-	/* Push all buffers supplied in the Request to the respective streams. */
-	for (auto stream : data->streams_) {
-		if (!stream->isExternal())
-			continue;
-
-		FrameBuffer *buffer = request->findBuffer(stream);
-		if (buffer && !stream->getBufferId(buffer)) {
-			/*
-			 * This buffer is not recognised, so it must have been allocated
-			 * outside the v4l2 device. Store it in the stream buffer list
-			 * so we can track it.
-			 */
-			stream->setExternalBuffer(buffer);
-		}
-
-		/*
-		 * If no buffer is provided by the request for this stream, we
-		 * queue a nullptr to the stream to signify that it must use an
-		 * internally allocated buffer for this capture request. This
-		 * buffer will not be given back to the application, but is used
-		 * to support the internal pipeline flow.
-		 *
-		 * The below queueBuffer() call will do nothing if there are not
-		 * enough internal buffers allocated, but this will be handled by
-		 * queuing the request for buffers in the RPiStream object.
-		 */
-		int ret = stream->queueBuffer(buffer);
-		if (ret)
-			return ret;
-	}
-
-	/* Push the request to the back of the queue. */
-	data->requestQueue_.push_back(request);
-	data->handleState();
-
-	return 0;
-}
-
-bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
-{
-	constexpr unsigned int numUnicamDevices = 2;
-
-	/*
-	 * Loop over all Unicam instances, but return out once a match is found.
-	 * This is to ensure we correctly enumerate the camera when an instance
-	 * of Unicam has registered with media controller, but has not registered
-	 * device nodes due to a sensor subdevice failure.
-	 */
-	for (unsigned int i = 0; i < numUnicamDevices; i++) {
-		DeviceMatch unicam("unicam");
-		MediaDevice *unicamDevice = acquireMediaDevice(enumerator, unicam);
-
-		if (!unicamDevice) {
-			LOG(RPI, Debug) << "Unable to acquire a Unicam instance";
-			continue;
-		}
-
-		DeviceMatch isp("bcm2835-isp");
-		MediaDevice *ispDevice = acquireMediaDevice(enumerator, isp);
-
-		if (!ispDevice) {
-			LOG(RPI, Debug) << "Unable to acquire ISP instance";
-			continue;
-		}
-
-		/*
-		 * The loop below is used to register multiple cameras behind one or more
-		 * video mux devices that are attached to a particular Unicam instance.
-		 * Obviously these cameras cannot be used simultaneously.
-		 */
-		unsigned int numCameras = 0;
-		for (MediaEntity *entity : unicamDevice->entities()) {
-			if (entity->function() != MEDIA_ENT_F_CAM_SENSOR)
-				continue;
-
-			int ret = registerCamera(unicamDevice, ispDevice, entity);
-			if (ret)
-				LOG(RPI, Error) << "Failed to register camera "
-						<< entity->name() << ": " << ret;
-			else
-				numCameras++;
-		}
-
-		if (numCameras)
-			return true;
-	}
-
-	return false;
-}
-
-void PipelineHandlerRPi::releaseDevice(Camera *camera)
-{
-	RPiCameraData *data = cameraData(camera);
-	data->freeBuffers();
-}
-
-int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity)
-{
-	std::unique_ptr<RPiCameraData> data = std::make_unique<RPiCameraData>(this);
-
-	if (!data->dmaHeap_.isValid())
-		return -ENOMEM;
-
-	MediaEntity *unicamImage = unicam->getEntityByName("unicam-image");
-	MediaEntity *ispOutput0 = isp->getEntityByName("bcm2835-isp0-output0");
-	MediaEntity *ispCapture1 = isp->getEntityByName("bcm2835-isp0-capture1");
-	MediaEntity *ispCapture2 = isp->getEntityByName("bcm2835-isp0-capture2");
-	MediaEntity *ispCapture3 = isp->getEntityByName("bcm2835-isp0-capture3");
-
-	if (!unicamImage || !ispOutput0 || !ispCapture1 || !ispCapture2 || !ispCapture3)
-		return -ENOENT;
-
-	/* Locate and open the unicam video streams. */
-	data->unicam_[Unicam::Image] = RPi::Stream("Unicam Image", unicamImage);
-
-	/* An embedded data node will not be present if the sensor does not support it. */
-	MediaEntity *unicamEmbedded = unicam->getEntityByName("unicam-embedded");
-	if (unicamEmbedded) {
-		data->unicam_[Unicam::Embedded] = RPi::Stream("Unicam Embedded", unicamEmbedded);
-		data->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data.get(),
-									   &RPiCameraData::unicamBufferDequeue);
-	}
-
-	/* Tag the ISP input stream as an import stream. */
-	data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, true);
-	data->isp_[Isp::Output0] = RPi::Stream("ISP Output0", ispCapture1);
-	data->isp_[Isp::Output1] = RPi::Stream("ISP Output1", ispCapture2);
-	data->isp_[Isp::Stats] = RPi::Stream("ISP Stats", ispCapture3);
-
-	/* Wire up all the buffer connections. */
-	data->unicam_[Unicam::Image].dev()->dequeueTimeout.connect(data.get(), &RPiCameraData::unicamTimeout);
-	data->unicam_[Unicam::Image].dev()->frameStart.connect(data.get(), &RPiCameraData::frameStarted);
-	data->unicam_[Unicam::Image].dev()->bufferReady.connect(data.get(), &RPiCameraData::unicamBufferDequeue);
-	data->isp_[Isp::Input].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispInputDequeue);
-	data->isp_[Isp::Output0].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue);
-	data->isp_[Isp::Output1].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue);
-	data->isp_[Isp::Stats].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue);
-
-	data->sensor_ = std::make_unique<CameraSensor>(sensorEntity);
-	if (!data->sensor_)
-		return -EINVAL;
-
-	if (data->sensor_->init())
-		return -EINVAL;
-
-	/*
-	 * Enumerate all the Video Mux/Bridge devices across the sensor -> unicam
-	 * chain. There may be a cascade of devices in this chain!
-	 */
-	MediaLink *link = sensorEntity->getPadByIndex(0)->links()[0];
-	data->enumerateVideoDevices(link);
-
-	data->sensorFormats_ = populateSensorFormats(data->sensor_);
-
-	ipa::RPi::InitResult result;
-	if (data->loadIPA(&result)) {
-		LOG(RPI, Error) << "Failed to load a suitable IPA library";
-		return -EINVAL;
-	}
-
-	if (result.sensorConfig.sensorMetadata ^ !!unicamEmbedded) {
-		LOG(RPI, Warning) << "Mismatch between Unicam and CamHelper for embedded data usage!";
-		result.sensorConfig.sensorMetadata = false;
-		if (unicamEmbedded)
-			data->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect();
-	}
-
-	/*
-	 * Open all Unicam and ISP streams. The exception is the embedded data
-	 * stream, which only gets opened below if the IPA reports that the sensor
-	 * supports embedded data.
-	 *
-	 * The below grouping is just for convenience so that we can easily
-	 * iterate over all streams in one go.
-	 */
-	data->streams_.push_back(&data->unicam_[Unicam::Image]);
-	if (result.sensorConfig.sensorMetadata)
-		data->streams_.push_back(&data->unicam_[Unicam::Embedded]);
-
-	for (auto &stream : data->isp_)
-		data->streams_.push_back(&stream);
-
-	for (auto stream : data->streams_) {
-		int ret = stream->dev()->open();
-		if (ret)
-			return ret;
-	}
-
-	if (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) {
-		LOG(RPI, Error) << "Unicam driver does not use the MediaController, please update your kernel!";
-		return -EINVAL;
-	}
-
-	/*
-	 * Setup our delayed control writer with the sensor default
-	 * gain and exposure delays. Mark VBLANK for priority write.
- */ - std::unordered_map params = { - { V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } }, - { V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } }, - { V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } }, - { V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } } - }; - data->delayedCtrls_ = std::make_unique(data->sensor_->device(), params); - data->sensorMetadata_ = result.sensorConfig.sensorMetadata; - - /* Register initial controls that the Raspberry Pi IPA can handle. */ - data->controlInfo_ = std::move(result.controlInfo); - - /* Initialize the camera properties. */ - data->properties_ = data->sensor_->properties(); - - /* - * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the - * sensor of the colour gains. It is defined to be a linear gain where - * the default value represents a gain of exactly one. - */ - auto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS); - if (it != data->sensor_->controls().end()) - data->notifyGainsUnity_ = it->second.def().get(); - - /* - * Set a default value for the ScalerCropMaximum property to show - * that we support its use, however, initialise it to zero because - * it's not meaningful until a camera mode has been chosen. - */ - data->properties_.set(properties::ScalerCropMaximum, Rectangle{}); - - /* - * We cache two things about the sensor in relation to transforms - * (meaning horizontal and vertical flips): if they affect the Bayer - * ordering, and what the "native" Bayer order is, when no transforms - * are applied. - * - * We note that the sensor's cached list of supported formats is - * already in the "native" order, with any flips having been undone. - */ - const V4L2Subdevice *sensor = data->sensor_->device(); - const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP); - if (hflipCtrl) { - /* We assume it will support vflips too... 
*/ - data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT; - } - - /* Look for a valid Bayer format. */ - BayerFormat bayerFormat; - for (const auto &iter : data->sensorFormats_) { - bayerFormat = BayerFormat::fromMbusCode(iter.first); - if (bayerFormat.isValid()) - break; - } - - if (!bayerFormat.isValid()) { - LOG(RPI, Error) << "No Bayer format found"; - return -EINVAL; - } - data->nativeBayerOrder_ = bayerFormat.order; - - /* - * List the available streams an application may request. At present, we - * do not advertise Unicam Embedded and ISP Statistics streams, as there - * is no mechanism for the application to request non-image buffer formats. - */ - std::set streams; - streams.insert(&data->unicam_[Unicam::Image]); - streams.insert(&data->isp_[Isp::Output0]); - streams.insert(&data->isp_[Isp::Output1]); - - int ret = data->loadPipelineConfiguration(); - if (ret) { - LOG(RPI, Error) << "Unable to load pipeline configuration"; - return ret; - } - - /* Create and register the camera. */ - const std::string &id = data->sensor_->id(); - std::shared_ptr camera = - Camera::create(std::move(data), id, streams); - PipelineHandler::registerCamera(std::move(camera)); - - LOG(RPI, Info) << "Registered camera " << id - << " to Unicam device " << unicam->deviceNode() - << " and ISP device " << isp->deviceNode(); - return 0; -} - -int PipelineHandlerRPi::queueAllBuffers(Camera *camera) -{ - RPiCameraData *data = cameraData(camera); - int ret; - - for (auto const stream : data->streams_) { - if (!stream->isExternal()) { - ret = stream->queueAllBuffers(); - if (ret < 0) - return ret; - } else { - /* - * For external streams, we must queue up a set of internal - * buffers to handle the number of drop frames requested by - * the IPA. This is done by passing nullptr in queueBuffer(). 
- * - * The below queueBuffer() call will do nothing if there - * are not enough internal buffers allocated, but this will - * be handled by queuing the request for buffers in the - * RPiStream object. - */ - unsigned int i; - for (i = 0; i < data->dropFrameCount_; i++) { - ret = stream->queueBuffer(nullptr); - if (ret) - return ret; - } - } - } - - return 0; -} - -int PipelineHandlerRPi::prepareBuffers(Camera *camera) -{ - RPiCameraData *data = cameraData(camera); - unsigned int numRawBuffers = 0; - int ret; - - for (Stream *s : camera->streams()) { - if (isRaw(s->configuration().pixelFormat)) { - numRawBuffers = s->configuration().bufferCount; - break; - } - } - - /* Decide how many internal buffers to allocate. */ - for (auto const stream : data->streams_) { - unsigned int numBuffers; - /* - * For Unicam, allocate a minimum number of buffers for internal - * use as we want to avoid any frame drops. - */ - const unsigned int minBuffers = data->config_.minTotalUnicamBuffers; - if (stream == &data->unicam_[Unicam::Image]) { - /* - * If an application has configured a RAW stream, allocate - * additional buffers to make up the minimum, but ensure - * we have at least minUnicamBuffers of internal buffers - * to use to minimise frame drops. - */ - numBuffers = std::max(data->config_.minUnicamBuffers, - minBuffers - numRawBuffers); - } else if (stream == &data->isp_[Isp::Input]) { - /* - * ISP input buffers are imported from Unicam, so follow - * similar logic as above to count all the RAW buffers - * available. - */ - numBuffers = numRawBuffers + - std::max(data->config_.minUnicamBuffers, - minBuffers - numRawBuffers); - - } else if (stream == &data->unicam_[Unicam::Embedded]) { - /* - * Embedded data buffers are (currently) for internal use, - * so allocate the minimum required to avoid frame drops. - */ - numBuffers = minBuffers; - } else { - /* - * Since the ISP runs synchronous with the IPA and requests, - * we only ever need one set of internal buffers. 
Any buffers - * the application wants to hold onto will already be exported - * through PipelineHandlerRPi::exportFrameBuffers(). - */ - numBuffers = 1; - } - - ret = stream->prepareBuffers(numBuffers); - if (ret < 0) - return ret; - } - - /* - * Pass the stats and embedded data buffers to the IPA. No other - * buffers need to be passed. - */ - mapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), RPi::MaskStats); - if (data->sensorMetadata_) - mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), - RPi::MaskEmbeddedData); - - return 0; -} - -void PipelineHandlerRPi::mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask) -{ - RPiCameraData *data = cameraData(camera); - std::vector bufferIds; - /* - * Link the FrameBuffers with the id (key value) in the map stored in - * the RPi stream object - along with an identifier mask. - * - * This will allow us to identify buffers passed between the pipeline - * handler and the IPA. - */ - for (auto const &it : buffers) { - bufferIds.push_back(IPABuffer(mask | it.first, - it.second->planes())); - data->bufferIds_.insert(mask | it.first); - } - - data->ipa_->mapBuffers(bufferIds); -} - -void RPiCameraData::freeBuffers() -{ - if (ipa_) { - /* - * Copy the buffer ids from the unordered_set to a vector to - * pass to the IPA. - */ - std::vector bufferIds(bufferIds_.begin(), - bufferIds_.end()); - ipa_->unmapBuffers(bufferIds); - bufferIds_.clear(); - } - - for (auto const stream : streams_) - stream->releaseBuffers(); - - buffersAllocated_ = false; -} - -void RPiCameraData::frameStarted(uint32_t sequence) -{ - LOG(RPI, Debug) << "frame start " << sequence; - - /* Write any controls for the next frame as soon as we can. 
*/ - delayedCtrls_->applyControls(sequence); -} - -int RPiCameraData::loadIPA(ipa::RPi::InitResult *result) -{ - ipa_ = IPAManager::createIPA(pipe(), 1, 1); - - if (!ipa_) - return -ENOENT; - - ipa_->processStatsComplete.connect(this, &RPiCameraData::processStatsComplete); - ipa_->prepareIspComplete.connect(this, &RPiCameraData::prepareIspComplete); - ipa_->metadataReady.connect(this, &RPiCameraData::metadataReady); - ipa_->setIspControls.connect(this, &RPiCameraData::setIspControls); - ipa_->setDelayedControls.connect(this, &RPiCameraData::setDelayedControls); - ipa_->setLensControls.connect(this, &RPiCameraData::setLensControls); - ipa_->setCameraTimeout.connect(this, &RPiCameraData::setCameraTimeout); - - /* - * The configuration (tuning file) is made from the sensor name unless - * the environment variable overrides it. - */ - std::string configurationFile; - char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RPI_TUNING_FILE"); - if (!configFromEnv || *configFromEnv == '\0') { - std::string model = sensor_->model(); - if (isMonoSensor(sensor_)) - model += "_mono"; - configurationFile = ipa_->configurationFile(model + ".json"); - } else { - configurationFile = std::string(configFromEnv); - } - - IPASettings settings(configurationFile, sensor_->model()); - ipa::RPi::InitParams params; - - params.lensPresent = !!sensor_->focusLens(); - return ipa_->init(settings, params, result); -} - -int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result) -{ - std::map entityControls; - ipa::RPi::ConfigParams params; - - /* \todo Move passing of ispControls and lensControls to ipa::init() */ - params.sensorControls = sensor_->controls(); - params.ispControls = isp_[Isp::Input].dev()->controls(); - if (sensor_->focusLens()) - params.lensControls = sensor_->focusLens()->controls(); - - /* Always send the user transform to the IPA. 
*/ - params.transform = static_cast(config->transform); - - /* Allocate the lens shading table via dmaHeap and pass to the IPA. */ - if (!lsTable_.isValid()) { - lsTable_ = SharedFD(dmaHeap_.alloc("ls_grid", ipa::RPi::MaxLsGridSize)); - if (!lsTable_.isValid()) - return -ENOMEM; - - /* Allow the IPA to mmap the LS table via the file descriptor. */ - /* - * \todo Investigate if mapping the lens shading table buffer - * could be handled with mapBuffers(). - */ - params.lsTableHandle = lsTable_; - } - - /* We store the IPACameraSensorInfo for digital zoom calculations. */ - int ret = sensor_->sensorInfo(&sensorInfo_); - if (ret) { - LOG(RPI, Error) << "Failed to retrieve camera sensor info"; - return ret; - } - - /* Ready the IPA - it must know about the sensor resolution. */ - ret = ipa_->configure(sensorInfo_, params, result); - if (ret < 0) { - LOG(RPI, Error) << "IPA configuration failed!"; - return -EPIPE; - } - - if (!result->controls.empty()) - setSensorControls(result->controls); - - return 0; -} - -int RPiCameraData::loadPipelineConfiguration() -{ - config_ = { - .minUnicamBuffers = 2, - .minTotalUnicamBuffers = 4, - .disableStartupFrameDrops = false, - .unicamTimeoutValue = 0, - }; - - char const *configFromEnv = utils::secure_getenv("LIBCAMERA_RPI_CONFIG_FILE"); - if (!configFromEnv || *configFromEnv == '\0') - return 0; - - std::string filename = std::string(configFromEnv); - File file(filename); - - if (!file.open(File::OpenModeFlag::ReadOnly)) { - LOG(RPI, Error) << "Failed to open configuration file '" << filename << "'"; - return -EIO; - } - - LOG(RPI, Info) << "Using configuration file '" << filename << "'"; - - std::unique_ptr root = YamlParser::parse(file); - if (!root) { - LOG(RPI, Warning) << "Failed to parse configuration file, using defaults"; - return 0; - } - - std::optional ver = (*root)["version"].get(); - if (!ver || *ver != 1.0) { - LOG(RPI, Error) << "Unexpected configuration file version reported"; - return -EINVAL; - } - - const 
YamlObject &phConfig = (*root)["pipeline_handler"];
-	config_.minUnicamBuffers =
-		phConfig["min_unicam_buffers"].get<unsigned int>(config_.minUnicamBuffers);
-	config_.minTotalUnicamBuffers =
-		phConfig["min_total_unicam_buffers"].get<unsigned int>(config_.minTotalUnicamBuffers);
-	config_.disableStartupFrameDrops =
-		phConfig["disable_startup_frame_drops"].get<bool>(config_.disableStartupFrameDrops);
-	config_.unicamTimeoutValue =
-		phConfig["unicam_timeout_value_ms"].get<unsigned int>(config_.unicamTimeoutValue);
-
-	if (config_.unicamTimeoutValue) {
-		/* Disable the IPA signal to control timeout and set the user requested value. */
-		ipa_->setCameraTimeout.disconnect();
-		unicam_[Unicam::Image].dev()->setDequeueTimeout(config_.unicamTimeoutValue * 1ms);
-	}
-
-	if (config_.minTotalUnicamBuffers < config_.minUnicamBuffers) {
-		LOG(RPI, Error) << "Invalid configuration: min_total_unicam_buffers must be >= min_unicam_buffers";
-		return -EINVAL;
-	}
-
-	if (config_.minTotalUnicamBuffers < 1) {
-		LOG(RPI, Error) << "Invalid configuration: min_total_unicam_buffers must be >= 1";
-		return -EINVAL;
-	}
-
-	return 0;
-}
-
-/*
- * enumerateVideoDevices() iterates over the Media Controller topology, starting
- * at the sensor and finishing at Unicam. For each sensor, RPiCameraData stores
- * a unique list of any intermediate video mux or bridge devices connected in a
- * cascade, together with the entity to entity link.
- *
- * Entity pad configuration and link enabling happens at the end of configure().
- * We first disable all pad links on each entity device in the chain, and then
- * selectively enable the specific links to link sensor to Unicam across all
- * intermediate muxes and bridges.
- *
- * In the cascaded topology below, if Sensor1 is used, the Mux2 -> Mux1 link
- * will be disabled, and Sensor1 -> Mux1 -> Unicam links enabled. Alternatively,
- * if Sensor3 is used, the Sensor2 -> Mux2 and Sensor1 -> Mux1 links are disabled,
- * and Sensor3 -> Mux2 -> Mux1 -> Unicam links are enabled.
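loadPipelineConfiguration() above reads an optional YAML file named by the LIBCAMERA_RPI_CONFIG_FILE environment variable. A minimal file using the keys parsed above might look like this; the file itself is hypothetical and the values shown are simply the built-in defaults:

```yaml
# Hypothetical contents of a file pointed to by LIBCAMERA_RPI_CONFIG_FILE.
# The parser rejects any version other than 1.0.
version: 1.0
pipeline_handler:
    # Constraint: min_total_unicam_buffers >= min_unicam_buffers, and >= 1.
    min_unicam_buffers: 2
    min_total_unicam_buffers: 4
    disable_startup_frame_drops: false
    # 0 leaves the IPA in control of the Unicam dequeue timeout.
    unicam_timeout_value_ms: 0
```

Setting `unicam_timeout_value_ms` to a non-zero value disconnects the IPA's setCameraTimeout signal and applies the fixed user-requested timeout instead.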
- * All other links will remain unchanged.
- *
- *  +----------+
- *  |  Unicam  |
- *  +-----^----+
- *        |
- *    +---+---+
- *    |  Mux1 <-------+
- *    +--^----+       |
- *       |            |
- * +-----+---+    +---+---+
- * | Sensor1 |    |  Mux2 |<--+
- * +---------+    +-^-----+   |
- *                  |         |
- *          +-------+-+   +---+-----+
- *          | Sensor2 |   | Sensor3 |
- *          +---------+   +---------+
- */
-void RPiCameraData::enumerateVideoDevices(MediaLink *link)
-{
-	const MediaPad *sinkPad = link->sink();
-	const MediaEntity *entity = sinkPad->entity();
-	bool unicamFound = false;
-
-	/* We only deal with Video Mux and Bridge devices in cascade. */
-	if (entity->function() != MEDIA_ENT_F_VID_MUX &&
-	    entity->function() != MEDIA_ENT_F_VID_IF_BRIDGE)
-		return;
-
-	/* Find the source pad for this Video Mux or Bridge device. */
-	const MediaPad *sourcePad = nullptr;
-	for (const MediaPad *pad : entity->pads()) {
-		if (pad->flags() & MEDIA_PAD_FL_SOURCE) {
-			/*
-			 * We can only deal with devices that have a single source
-			 * pad. If this device has multiple source pads, ignore it
-			 * and this branch in the cascade.
-			 */
-			if (sourcePad)
-				return;
-
-			sourcePad = pad;
-		}
-	}
-
-	LOG(RPI, Debug) << "Found video mux device " << entity->name()
-			<< " linked to sink pad " << sinkPad->index();
-
-	bridgeDevices_.emplace_back(std::make_unique<V4L2Subdevice>(entity), link);
-	bridgeDevices_.back().first->open();
-
-	/*
-	 * Iterate through all the sink pad links down the cascade to find any
-	 * other Video Mux and Bridge devices.
-	 */
-	for (MediaLink *l : sourcePad->links()) {
-		enumerateVideoDevices(l);
-		/* Once we reach the Unicam entity, we are done. */
-		if (l->sink()->entity()->name() == "unicam-image") {
-			unicamFound = true;
-			break;
-		}
-	}
-
-	/* This identifies the end of our entity enumeration recursion. */
-	if (link->source()->entity()->function() == MEDIA_ENT_F_CAM_SENSOR) {
-		/*
-		 * If Unicam is not at the end of this cascade, we cannot configure
-		 * this topology automatically, so remove all entity references.
- */ - if (!unicamFound) { - LOG(RPI, Warning) << "Cannot automatically configure this MC topology!"; - bridgeDevices_.clear(); - } - } -} - -void RPiCameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers) -{ - if (!isRunning()) - return; - - FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID); - - handleStreamBuffer(buffer, &isp_[Isp::Stats]); - state_ = State::IpaComplete; - handleState(); -} - -void RPiCameraData::metadataReady(const ControlList &metadata) -{ - if (!isRunning()) - return; - - /* Add to the Request metadata buffer what the IPA has provided. */ - Request *request = requestQueue_.front(); - request->metadata().merge(metadata); - - /* - * Inform the sensor of the latest colour gains if it has the - * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set). - */ - const auto &colourGains = metadata.get(libcamera::controls::ColourGains); - if (notifyGainsUnity_ && colourGains) { - /* The control wants linear gains in the order B, Gb, Gr, R. 
*/ - ControlList ctrls(sensor_->controls()); - std::array gains{ - static_cast((*colourGains)[1] * *notifyGainsUnity_), - *notifyGainsUnity_, - *notifyGainsUnity_, - static_cast((*colourGains)[0] * *notifyGainsUnity_) - }; - ctrls.set(V4L2_CID_NOTIFY_GAINS, Span{ gains }); - - sensor_->setControls(&ctrls); - } -} - -void RPiCameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers) -{ - unsigned int embeddedId = buffers.embedded & RPi::MaskID; - unsigned int bayer = buffers.bayer & RPi::MaskID; - FrameBuffer *buffer; - - if (!isRunning()) - return; - - buffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID); - LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bayer & RPi::MaskID) - << ", timestamp: " << buffer->metadata().timestamp; - - isp_[Isp::Input].queueBuffer(buffer); - ispOutputCount_ = 0; - - if (sensorMetadata_ && embeddedId) { - buffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID); - handleStreamBuffer(buffer, &unicam_[Unicam::Embedded]); - } - - handleState(); -} - -void RPiCameraData::setIspControls(const ControlList &controls) -{ - ControlList ctrls = controls; - - if (ctrls.contains(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)) { - ControlValue &value = - const_cast(ctrls.get(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)); - Span s = value.data(); - bcm2835_isp_lens_shading *ls = - reinterpret_cast(s.data()); - ls->dmabuf = lsTable_.get(); - } - - isp_[Isp::Input].dev()->setControls(&ctrls); - handleState(); -} - -void RPiCameraData::setDelayedControls(const ControlList &controls, uint32_t delayContext) -{ - if (!delayedCtrls_->push(controls, delayContext)) - LOG(RPI, Error) << "V4L2 DelayedControl set failed"; - handleState(); -} - -void RPiCameraData::setLensControls(const ControlList &controls) -{ - CameraLens *lens = sensor_->focusLens(); - - if (lens && controls.contains(V4L2_CID_FOCUS_ABSOLUTE)) { - ControlValue const &focusValue = controls.get(V4L2_CID_FOCUS_ABSOLUTE); - 
lens->setFocusPosition(focusValue.get()); - } -} - -void RPiCameraData::setCameraTimeout(uint32_t maxFrameLengthMs) -{ - /* - * Set the dequeue timeout to the larger of 5x the maximum reported - * frame length advertised by the IPA over a number of frames. Allow - * a minimum timeout value of 1s. - */ - utils::Duration timeout = - std::max(1s, 5 * maxFrameLengthMs * 1ms); - - LOG(RPI, Debug) << "Setting Unicam timeout to " << timeout; - unicam_[Unicam::Image].dev()->setDequeueTimeout(timeout); -} - -void RPiCameraData::setSensorControls(ControlList &controls) -{ - /* - * We need to ensure that if both VBLANK and EXPOSURE are present, the - * former must be written ahead of, and separately from EXPOSURE to avoid - * V4L2 rejecting the latter. This is identical to what DelayedControls - * does with the priority write flag. - * - * As a consequence of the below logic, VBLANK gets set twice, and we - * rely on the v4l2 framework to not pass the second control set to the - * driver as the actual control value has not changed. - */ - if (controls.contains(V4L2_CID_EXPOSURE) && controls.contains(V4L2_CID_VBLANK)) { - ControlList vblank_ctrl; - - vblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK)); - sensor_->setControls(&vblank_ctrl); - } - - sensor_->setControls(&controls); -} - -void RPiCameraData::unicamTimeout() -{ - LOG(RPI, Error) << "Unicam has timed out!"; - LOG(RPI, Error) << "Please check that your camera sensor connector is attached securely."; - LOG(RPI, Error) << "Alternatively, try another cable and/or sensor."; - - state_ = RPiCameraData::State::Error; - /* - * To allow the application to attempt a recovery from this timeout, - * stop all devices streaming, and return any outstanding requests as - * incomplete and cancelled. 
- */ - for (auto const stream : streams_) - stream->dev()->streamOff(); - - clearIncompleteRequests(); -} - -void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer) -{ - RPi::Stream *stream = nullptr; - int index; - - if (!isRunning()) - return; - - for (RPi::Stream &s : unicam_) { - index = s.getBufferId(buffer); - if (index) { - stream = &s; - break; - } - } - - /* The buffer must belong to one of our streams. */ - ASSERT(stream); - - LOG(RPI, Debug) << "Stream " << stream->name() << " buffer dequeue" - << ", buffer id " << index - << ", timestamp: " << buffer->metadata().timestamp; - - if (stream == &unicam_[Unicam::Image]) { - /* - * Lookup the sensor controls used for this frame sequence from - * DelayedControl and queue them along with the frame buffer. - */ - auto [ctrl, delayContext] = delayedCtrls_->get(buffer->metadata().sequence); - /* - * Add the frame timestamp to the ControlList for the IPA to use - * as it does not receive the FrameBuffer object. - */ - ctrl.set(controls::SensorTimestamp, buffer->metadata().timestamp); - bayerQueue_.push({ buffer, std::move(ctrl), delayContext }); - } else { - embeddedQueue_.push(buffer); - } - - handleState(); -} - -void RPiCameraData::ispInputDequeue(FrameBuffer *buffer) -{ - if (!isRunning()) - return; - - LOG(RPI, Debug) << "Stream ISP Input buffer complete" - << ", buffer id " << unicam_[Unicam::Image].getBufferId(buffer) - << ", timestamp: " << buffer->metadata().timestamp; - - /* The ISP input buffer gets re-queued into Unicam. */ - handleStreamBuffer(buffer, &unicam_[Unicam::Image]); - handleState(); -} - -void RPiCameraData::ispOutputDequeue(FrameBuffer *buffer) -{ - RPi::Stream *stream = nullptr; - int index; - - if (!isRunning()) - return; - - for (RPi::Stream &s : isp_) { - index = s.getBufferId(buffer); - if (index) { - stream = &s; - break; - } - } - - /* The buffer must belong to one of our ISP output streams. 
*/ - ASSERT(stream); - - LOG(RPI, Debug) << "Stream " << stream->name() << " buffer complete" - << ", buffer id " << index - << ", timestamp: " << buffer->metadata().timestamp; - - /* - * ISP statistics buffer must not be re-queued or sent back to the - * application until after the IPA signals so. - */ - if (stream == &isp_[Isp::Stats]) { - ipa::RPi::ProcessParams params; - params.buffers.stats = index | RPi::MaskStats; - params.ipaContext = requestQueue_.front()->sequence(); - ipa_->processStats(params); - } else { - /* Any other ISP output can be handed back to the application now. */ - handleStreamBuffer(buffer, stream); - } - - /* - * Increment the number of ISP outputs generated. - * This is needed to track dropped frames. - */ - ispOutputCount_++; - - handleState(); -} - -void RPiCameraData::clearIncompleteRequests() -{ - /* - * All outstanding requests (and associated buffers) must be returned - * back to the application. - */ - while (!requestQueue_.empty()) { - Request *request = requestQueue_.front(); - - for (auto &b : request->buffers()) { - FrameBuffer *buffer = b.second; - /* - * Has the buffer already been handed back to the - * request? If not, do so now. - */ - if (buffer->request()) { - buffer->_d()->cancel(); - pipe()->completeBuffer(request, buffer); - } - } - - pipe()->completeRequest(request); - requestQueue_.pop_front(); - } -} - -void RPiCameraData::handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream) -{ - /* - * It is possible to be here without a pending request, so check - * that we actually have one to action, otherwise we just return - * buffer back to the stream. - */ - Request *request = requestQueue_.empty() ? nullptr : requestQueue_.front(); - if (!dropFrameCount_ && request && request->findBuffer(stream) == buffer) { - /* - * Check if this is an externally provided buffer, and if - * so, we must stop tracking it in the pipeline handler. 
- */ - handleExternalBuffer(buffer, stream); - /* - * Tag the buffer as completed, returning it to the - * application. - */ - pipe()->completeBuffer(request, buffer); - } else { - /* - * This buffer was not part of the Request (which happens if an - * internal buffer was used for an external stream, or - * unconditionally for internal streams), or there is no pending - * request, so we can recycle it. - */ - stream->returnBuffer(buffer); - } -} - -void RPiCameraData::handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream) -{ - unsigned int id = stream->getBufferId(buffer); - - if (!(id & RPi::MaskExternalBuffer)) - return; - - /* Stop the Stream object from tracking the buffer. */ - stream->removeExternalBuffer(buffer); -} - -void RPiCameraData::handleState() -{ - switch (state_) { - case State::Stopped: - case State::Busy: - case State::Error: - break; - - case State::IpaComplete: - /* If the request is completed, we will switch to Idle state. */ - checkRequestCompleted(); - /* - * No break here, we want to try running the pipeline again. - * The fallthrough clause below suppresses compiler warnings. - */ - [[fallthrough]]; - - case State::Idle: - tryRunPipeline(); - break; - } -} - -void RPiCameraData::checkRequestCompleted() -{ - bool requestCompleted = false; - /* - * If we are dropping this frame, do not touch the request, simply - * change the state to IDLE when ready. - */ - if (!dropFrameCount_) { - Request *request = requestQueue_.front(); - if (request->hasPendingBuffers()) - return; - - /* Must wait for metadata to be filled in before completing. */ - if (state_ != State::IpaComplete) - return; - - pipe()->completeRequest(request); - requestQueue_.pop_front(); - requestCompleted = true; - } - - /* - * Make sure we have three outputs completed in the case of a dropped - * frame. 
- */ - if (state_ == State::IpaComplete && - ((ispOutputCount_ == 3 && dropFrameCount_) || requestCompleted)) { - state_ = State::Idle; - if (dropFrameCount_) { - dropFrameCount_--; - LOG(RPI, Debug) << "Dropping frame at the request of the IPA (" - << dropFrameCount_ << " left)"; - } - } -} - -Rectangle RPiCameraData::scaleIspCrop(const Rectangle &ispCrop) const -{ - /* - * Scale a crop rectangle defined in the ISP's coordinates into native sensor - * coordinates. - */ - Rectangle nativeCrop = ispCrop.scaledBy(sensorInfo_.analogCrop.size(), - sensorInfo_.outputSize); - nativeCrop.translateBy(sensorInfo_.analogCrop.topLeft()); - return nativeCrop; -} - -void RPiCameraData::applyScalerCrop(const ControlList &controls) -{ - const auto &scalerCrop = controls.get(controls::ScalerCrop); - if (scalerCrop) { - Rectangle nativeCrop = *scalerCrop; - - if (!nativeCrop.width || !nativeCrop.height) - nativeCrop = { 0, 0, 1, 1 }; - - /* Create a version of the crop scaled to ISP (camera mode) pixels. */ - Rectangle ispCrop = nativeCrop.translatedBy(-sensorInfo_.analogCrop.topLeft()); - ispCrop.scaleBy(sensorInfo_.outputSize, sensorInfo_.analogCrop.size()); - - /* - * The crop that we set must be: - * 1. At least as big as ispMinCropSize_, once that's been - * enlarged to the same aspect ratio. - * 2. With the same mid-point, if possible. - * 3. But it can't go outside the sensor area. - */ - Size minSize = ispMinCropSize_.expandedToAspectRatio(nativeCrop.size()); - Size size = ispCrop.size().expandedTo(minSize); - ispCrop = size.centeredTo(ispCrop.center()).enclosedIn(Rectangle(sensorInfo_.outputSize)); - - if (ispCrop != ispCrop_) { - isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &ispCrop); - ispCrop_ = ispCrop; - - /* - * Also update the ScalerCrop in the metadata with what we actually - * used. But we must first rescale that from ISP (camera mode) pixels - * back into sensor native pixels. 
- */ - scalerCrop_ = scaleIspCrop(ispCrop_); - } - } -} - -void RPiCameraData::fillRequestMetadata(const ControlList &bufferControls, - Request *request) -{ - request->metadata().set(controls::SensorTimestamp, - bufferControls.get(controls::SensorTimestamp).value_or(0)); - - request->metadata().set(controls::ScalerCrop, scalerCrop_); -} - -void RPiCameraData::tryRunPipeline() -{ - FrameBuffer *embeddedBuffer; - BayerFrame bayerFrame; - - /* If any of our request or buffer queues are empty, we cannot proceed. */ - if (state_ != State::Idle || requestQueue_.empty() || - bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_)) - return; - - if (!findMatchingBuffers(bayerFrame, embeddedBuffer)) - return; - - /* Take the first request from the queue and action the IPA. */ - Request *request = requestQueue_.front(); - - /* See if a new ScalerCrop value needs to be applied. */ - applyScalerCrop(request->controls()); - - /* - * Clear the request metadata and fill it with some initial non-IPA - * related controls. We clear it first because the request metadata - * may have been populated if we have dropped the previous frame. - */ - request->metadata().clear(); - fillRequestMetadata(bayerFrame.controls, request); - - /* Set our state to say the pipeline is active. 
*/ - state_ = State::Busy; - - unsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer); - - LOG(RPI, Debug) << "Signalling prepareIsp:" - << " Bayer buffer id: " << bayer; - - ipa::RPi::PrepareParams params; - params.buffers.bayer = RPi::MaskBayerData | bayer; - params.sensorControls = std::move(bayerFrame.controls); - params.requestControls = request->controls(); - params.ipaContext = request->sequence(); - params.delayContext = bayerFrame.delayContext; - - if (embeddedBuffer) { - unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer); - - params.buffers.embedded = RPi::MaskEmbeddedData | embeddedId; - LOG(RPI, Debug) << "Signalling prepareIsp:" - << " Embedded buffer id: " << embeddedId; - } - - ipa_->prepareIsp(params); -} - -bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer) -{ - if (bayerQueue_.empty()) - return false; - - /* - * Find the embedded data buffer with a matching timestamp to pass to - * the IPA. Any embedded buffers with a timestamp lower than the - * current bayer buffer will be removed and re-queued to the driver. - */ - uint64_t ts = bayerQueue_.front().buffer->metadata().timestamp; - embeddedBuffer = nullptr; - while (!embeddedQueue_.empty()) { - FrameBuffer *b = embeddedQueue_.front(); - if (b->metadata().timestamp < ts) { - embeddedQueue_.pop(); - unicam_[Unicam::Embedded].returnBuffer(b); - LOG(RPI, Debug) << "Dropping unmatched input frame in stream " - << unicam_[Unicam::Embedded].name(); - } else if (b->metadata().timestamp == ts) { - /* Found a match! */ - embeddedBuffer = b; - embeddedQueue_.pop(); - break; - } else { - break; /* Only higher timestamps from here. */ - } - } - - if (!embeddedBuffer && sensorMetadata_) { - if (embeddedQueue_.empty()) { - /* - * If the embedded buffer queue is empty, wait for the next - * buffer to arrive - dequeue ordering may send the image - * buffer first. 
- */ - LOG(RPI, Debug) << "Waiting for next embedded buffer."; - return false; - } - - /* Log if there is no matching embedded data buffer found. */ - LOG(RPI, Debug) << "Returning bayer frame without a matching embedded buffer."; - } - - bayerFrame = std::move(bayerQueue_.front()); - bayerQueue_.pop(); - - return true; -} - -REGISTER_PIPELINE_HANDLER(PipelineHandlerRPi) - -} /* namespace libcamera */ diff --git a/src/libcamera/pipeline/rpi/vc4/vc4.cpp b/src/libcamera/pipeline/rpi/vc4/vc4.cpp new file mode 100644 index 000000000000..c3a40326b228 --- /dev/null +++ b/src/libcamera/pipeline/rpi/vc4/vc4.cpp @@ -0,0 +1,973 @@ +/* SPDX-License-Identifier: LGPL-2.1-or-later */ +/* + * Copyright (C) 2019-2023, Raspberry Pi Ltd + * + * vc4.cpp - Pipeline handler for VC4-based Raspberry Pi devices + */ + +#include +#include +#include + +#include + +#include "libcamera/internal/device_enumerator.h" + +#include "../common/pipeline_base.h" +#include "../common/rpi_stream.h" + +#include "dma_heaps.h" + +using namespace std::chrono_literals; + +namespace libcamera { + +LOG_DECLARE_CATEGORY(RPI) + +namespace { + +enum class Unicam : unsigned int { Image, Embedded }; +enum class Isp : unsigned int { Input, Output0, Output1, Stats }; + +} /* namespace */ + +class Vc4CameraData final : public RPi::CameraData +{ +public: + Vc4CameraData(PipelineHandler *pipe) + : RPi::CameraData(pipe) + { + } + + ~Vc4CameraData() + { + freeBuffers(); + } + + V4L2VideoDevice::Formats ispFormats() const override + { + return isp_[Isp::Output0].dev()->formats(); + } + + V4L2VideoDevice::Formats rawFormats() const override + { + return unicam_[Unicam::Image].dev()->formats(); + } + + V4L2VideoDevice *frontendDevice() override + { + return unicam_[Unicam::Image].dev(); + } + + void platformFreeBuffers() override + { + } + + CameraConfiguration::Status platformValidate(std::vector &rawStreams, + std::vector &outStreams) const override; + + int platformPipelineConfigure(const std::unique_ptr &root) override; 
+ + void platformStart() override; + void platformStop() override; + + void unicamBufferDequeue(FrameBuffer *buffer); + void ispInputDequeue(FrameBuffer *buffer); + void ispOutputDequeue(FrameBuffer *buffer); + + void processStatsComplete(const ipa::RPi::BufferIds &buffers); + void prepareIspComplete(const ipa::RPi::BufferIds &buffers); + void setIspControls(const ControlList &controls); + void setCameraTimeout(uint32_t maxFrameLengthMs); + + /* Array of Unicam and ISP device streams and associated buffers/streams. */ + RPi::Device<Unicam, 2> unicam_; + RPi::Device<Isp, 4> isp_; + + /* DMAHEAP allocation helper. */ + RPi::DmaHeap dmaHeap_; + SharedFD lsTable_; + + struct Config { + /* + * The minimum number of internal buffers to be allocated for + * the Unicam Image stream. + */ + unsigned int minUnicamBuffers; + /* + * The minimum total (internal + external) buffer count used for + * the Unicam Image stream. + * + * Note that: + * minTotalUnicamBuffers must be >= 1, and + * minTotalUnicamBuffers >= minUnicamBuffers + */ + unsigned int minTotalUnicamBuffers; + }; + + Config config_; + +private: + void platformSetIspCrop() override + { + isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &ispCrop_); + } + + int platformConfigure(const V4L2SubdeviceFormat &sensorFormat, + std::optional<BayerFormat::Packing> packing, + std::vector<StreamParams> &rawStreams, + std::vector<StreamParams> &outStreams) override; + int platformConfigureIpa(ipa::RPi::ConfigParams &params) override; + + int platformInitIpa([[maybe_unused]] ipa::RPi::InitParams &params) override + { + return 0; + } + + struct BayerFrame { + FrameBuffer *buffer; + ControlList controls; + unsigned int delayContext; + }; + + void tryRunPipeline() override; + bool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer); + + std::queue<BayerFrame> bayerQueue_; + std::queue<FrameBuffer *> embeddedQueue_; +}; + +class PipelineHandlerVc4 : public RPi::PipelineHandlerBase +{ +public: + PipelineHandlerVc4(CameraManager *manager) + : RPi::PipelineHandlerBase(manager) + { + } + + 
~PipelineHandlerVc4() + { + } + + bool match(DeviceEnumerator *enumerator) override; + +private: + Vc4CameraData *cameraData(Camera *camera) + { + return static_cast<Vc4CameraData *>(camera->_d()); + } + + int prepareBuffers(Camera *camera) override; + int platformRegister(std::unique_ptr<RPi::CameraData> &cameraData, + MediaDevice *unicam, MediaDevice *isp) override; +}; + +bool PipelineHandlerVc4::match(DeviceEnumerator *enumerator) +{ + constexpr unsigned int numUnicamDevices = 2; + + /* + * Loop over all Unicam instances, but return once a match is found. + * This is to ensure we correctly enumerate the camera when an instance + * of Unicam has registered with the media controller, but has not registered + * device nodes due to a sensor subdevice failure. + */ + for (unsigned int i = 0; i < numUnicamDevices; i++) { + DeviceMatch unicam("unicam"); + MediaDevice *unicamDevice = acquireMediaDevice(enumerator, unicam); + + if (!unicamDevice) { + LOG(RPI, Debug) << "Unable to acquire a Unicam instance"; + continue; + } + + DeviceMatch isp("bcm2835-isp"); + MediaDevice *ispDevice = acquireMediaDevice(enumerator, isp); + + if (!ispDevice) { + LOG(RPI, Debug) << "Unable to acquire ISP instance"; + continue; + } + + /* + * The loop below is used to register multiple cameras behind one or more + * video mux devices that are attached to a particular Unicam instance. + * Obviously these cameras cannot be used simultaneously.
+ */ + unsigned int numCameras = 0; + for (MediaEntity *entity : unicamDevice->entities()) { + if (entity->function() != MEDIA_ENT_F_CAM_SENSOR) + continue; + + std::unique_ptr<RPi::CameraData> cameraData = std::make_unique<Vc4CameraData>(this); + int ret = RPi::PipelineHandlerBase::registerCamera(cameraData, + unicamDevice, "unicam-image", + ispDevice, entity); + if (ret) + LOG(RPI, Error) << "Failed to register camera " + << entity->name() << ": " << ret; + else + numCameras++; + } + + if (numCameras) + return true; + } + + return false; +} + +int PipelineHandlerVc4::prepareBuffers(Camera *camera) +{ + Vc4CameraData *data = cameraData(camera); + unsigned int numRawBuffers = 0; + int ret; + + for (Stream *s : camera->streams()) { + if (BayerFormat::fromPixelFormat(s->configuration().pixelFormat).isValid()) { + numRawBuffers = s->configuration().bufferCount; + break; + } + } + + /* Decide how many internal buffers to allocate. */ + for (auto const stream : data->streams_) { + unsigned int numBuffers; + /* + * For Unicam, allocate a minimum number of buffers for internal + * use as we want to avoid any frame drops. + */ + const unsigned int minBuffers = data->config_.minTotalUnicamBuffers; + if (stream == &data->unicam_[Unicam::Image]) { + /* + * If an application has configured a RAW stream, allocate + * additional buffers to make up the minimum, but ensure + * we have at least minUnicamBuffers of internal buffers + * to use to minimise frame drops. + */ + numBuffers = std::max(data->config_.minUnicamBuffers, + minBuffers - numRawBuffers); + } else if (stream == &data->isp_[Isp::Input]) { + /* + * ISP input buffers are imported from Unicam, so follow + * similar logic as above to count all the RAW buffers + * available.
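+ * + * As a worked example using the default configuration values below (minUnicamBuffers = 2, minTotalUnicamBuffers = 4): an application RAW stream configured with 4 buffers gives the Unicam Image stream max(2, 4 - 4) = 2 internal buffers, and the ISP input stream counts 4 + max(2, 0) = 6 buffers.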
+ */ + numBuffers = numRawBuffers + + std::max(data->config_.minUnicamBuffers, + minBuffers - numRawBuffers); + + } else if (stream == &data->unicam_[Unicam::Embedded]) { + /* + * Embedded data buffers are (currently) for internal use, + * so allocate the minimum required to avoid frame drops. + */ + numBuffers = minBuffers; + } else { + /* + * Since the ISP runs synchronously with the IPA and requests, + * we only ever need one set of internal buffers. Any buffers + * the application wants to hold onto will already be exported + * through PipelineHandlerRPi::exportFrameBuffers(). + */ + numBuffers = 1; + } + + ret = stream->prepareBuffers(numBuffers); + if (ret < 0) + return ret; + } + + /* + * Pass the stats and embedded data buffers to the IPA. No other + * buffers need to be passed. + */ + mapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), RPi::MaskStats); + if (data->sensorMetadata_) + mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), + RPi::MaskEmbeddedData); + + return 0; +} + +int PipelineHandlerVc4::platformRegister(std::unique_ptr<RPi::CameraData> &cameraData, MediaDevice *unicam, MediaDevice *isp) +{ + Vc4CameraData *data = static_cast<Vc4CameraData *>(cameraData.get()); + + if (!data->dmaHeap_.isValid()) + return -ENOMEM; + + MediaEntity *unicamImage = unicam->getEntityByName("unicam-image"); + MediaEntity *ispOutput0 = isp->getEntityByName("bcm2835-isp0-output0"); + MediaEntity *ispCapture1 = isp->getEntityByName("bcm2835-isp0-capture1"); + MediaEntity *ispCapture2 = isp->getEntityByName("bcm2835-isp0-capture2"); + MediaEntity *ispCapture3 = isp->getEntityByName("bcm2835-isp0-capture3"); + + if (!unicamImage || !ispOutput0 || !ispCapture1 || !ispCapture2 || !ispCapture3) + return -ENOENT; + + /* Locate and open the unicam video streams. */ + data->unicam_[Unicam::Image] = RPi::Stream("Unicam Image", unicamImage); + + /* An embedded data node will not be present if the sensor does not support it.
*/ + MediaEntity *unicamEmbedded = unicam->getEntityByName("unicam-embedded"); + if (unicamEmbedded) { + data->unicam_[Unicam::Embedded] = RPi::Stream("Unicam Embedded", unicamEmbedded); + data->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data, + &Vc4CameraData::unicamBufferDequeue); + } + + /* Tag the ISP input stream as an import stream. */ + data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, true); + data->isp_[Isp::Output0] = RPi::Stream("ISP Output0", ispCapture1); + data->isp_[Isp::Output1] = RPi::Stream("ISP Output1", ispCapture2); + data->isp_[Isp::Stats] = RPi::Stream("ISP Stats", ispCapture3); + + /* Wire up all the buffer connections. */ + data->unicam_[Unicam::Image].dev()->bufferReady.connect(data, &Vc4CameraData::unicamBufferDequeue); + data->isp_[Isp::Input].dev()->bufferReady.connect(data, &Vc4CameraData::ispInputDequeue); + data->isp_[Isp::Output0].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue); + data->isp_[Isp::Output1].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue); + data->isp_[Isp::Stats].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue); + + if (data->sensorMetadata_ ^ !!data->unicam_[Unicam::Embedded].dev()) { + LOG(RPI, Warning) << "Mismatch between Unicam and CamHelper for embedded data usage!"; + data->sensorMetadata_ = false; + if (data->unicam_[Unicam::Embedded].dev()) + data->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect(); + } + + /* + * Open all Unicam and ISP streams. The exception is the embedded data + * stream, which only gets opened below if the IPA reports that the sensor + * supports embedded data. + * + * The below grouping is just for convenience so that we can easily + * iterate over all streams in one go. 
+ */ + data->streams_.push_back(&data->unicam_[Unicam::Image]); + if (data->sensorMetadata_) + data->streams_.push_back(&data->unicam_[Unicam::Embedded]); + + for (auto &stream : data->isp_) + data->streams_.push_back(&stream); + + for (auto stream : data->streams_) { + int ret = stream->dev()->open(); + if (ret) + return ret; + } + + if (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) { + LOG(RPI, Error) << "Unicam driver does not use the MediaController, please update your kernel!"; + return -EINVAL; + } + + /* Wire up all the IPA connections. */ + data->ipa_->processStatsComplete.connect(data, &Vc4CameraData::processStatsComplete); + data->ipa_->prepareIspComplete.connect(data, &Vc4CameraData::prepareIspComplete); + data->ipa_->setIspControls.connect(data, &Vc4CameraData::setIspControls); + data->ipa_->setCameraTimeout.connect(data, &Vc4CameraData::setCameraTimeout); + + /* + * List the available streams an application may request. At present, we + * do not advertise Unicam Embedded and ISP Statistics streams, as there + * is no mechanism for the application to request non-image buffer formats. + */ + std::set<Stream *> streams; + streams.insert(&data->unicam_[Unicam::Image]); + streams.insert(&data->isp_[Isp::Output0]); + streams.insert(&data->isp_[Isp::Output1]); + + /* Create and register the camera. */ + const std::string &id = data->sensor_->id(); + std::shared_ptr<Camera> camera = + Camera::create(std::move(cameraData), id, streams); + PipelineHandler::registerCamera(std::move(camera)); + + LOG(RPI, Info) << "Registered camera " << id + << " to Unicam device " << unicam->deviceNode() + << " and ISP device " << isp->deviceNode(); + + return 0; +} + +CameraConfiguration::Status Vc4CameraData::platformValidate(std::vector<StreamParams> &rawStreams, + std::vector<StreamParams> &outStreams) const +{ + CameraConfiguration::Status status = CameraConfiguration::Status::Valid; + + /* Can only output 1 RAW stream, or 2 YUV/RGB streams.
*/ + if (rawStreams.size() > 1 || outStreams.size() > 2) { + LOG(RPI, Error) << "Invalid number of streams requested"; + return CameraConfiguration::Status::Invalid; + } + + if (!rawStreams.empty()) + rawStreams[0].dev = unicam_[Unicam::Image].dev(); + + /* + * For the two ISP outputs, one stream must be equal or smaller than the + * other in all dimensions. + * + * Index 0 contains the largest requested resolution. + */ + for (unsigned int i = 0; i < outStreams.size(); i++) { + Size size; + + size.width = std::min(outStreams[i].cfg->size.width, + outStreams[0].cfg->size.width); + size.height = std::min(outStreams[i].cfg->size.height, + outStreams[0].cfg->size.height); + + if (outStreams[i].cfg->size != size) { + outStreams[i].cfg->size = size; + status = CameraConfiguration::Status::Adjusted; + } + + /* + * Output 0 must be for the largest resolution. We will + * have that fixed up in the code above. + */ + outStreams[i].dev = isp_[i == 0 ? Isp::Output0 : Isp::Output1].dev(); + } + + return status; +} + +int Vc4CameraData::platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) +{ + config_ = { + .minUnicamBuffers = 2, + .minTotalUnicamBuffers = 4, + }; + + if (!root) + return 0; + + std::optional<double> ver = (*root)["version"].get<double>(); + if (!ver || *ver != 1.0) { + LOG(RPI, Error) << "Unexpected configuration file version reported"; + return -EINVAL; + } + + std::optional<std::string> target = (*root)["target"].get<std::string>(); + if (!target || *target != "bcm2835") { + LOG(RPI, Error) << "Unexpected target reported: expected \"bcm2835\", got " + << *target; + return -EINVAL; + } + + const YamlObject &phConfig = (*root)["pipeline_handler"]; + config_.minUnicamBuffers = + phConfig["min_unicam_buffers"].get<unsigned int>(config_.minUnicamBuffers); + config_.minTotalUnicamBuffers = + phConfig["min_total_unicam_buffers"].get<unsigned int>(config_.minTotalUnicamBuffers); + + if (config_.minTotalUnicamBuffers < config_.minUnicamBuffers) { + LOG(RPI, Error) << "Invalid configuration: min_total_unicam_buffers must be >= 
min_unicam_buffers"; + return -EINVAL; + } + + if (config_.minTotalUnicamBuffers < 1) { + LOG(RPI, Error) << "Invalid configuration: min_total_unicam_buffers must be >= 1"; + return -EINVAL; + } + + return 0; +} + +int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat, + std::optional<BayerFormat::Packing> packing, + std::vector<StreamParams> &rawStreams, + std::vector<StreamParams> &outStreams) +{ + int ret; + + if (!packing) + packing = BayerFormat::Packing::CSI2; + + V4L2VideoDevice *unicam = unicam_[Unicam::Image].dev(); + V4L2DeviceFormat unicamFormat = RPi::PipelineHandlerBase::toV4L2DeviceFormat(unicam, sensorFormat, *packing); + + ret = unicam->setFormat(&unicamFormat); + if (ret) + return ret; + + /* + * See which streams are requested, and route the user + * StreamConfiguration appropriately. + */ + if (!rawStreams.empty()) { + rawStreams[0].cfg->setStream(&unicam_[Unicam::Image]); + unicam_[Unicam::Image].setExternal(true); + } + + ret = isp_[Isp::Input].dev()->setFormat(&unicamFormat); + if (ret) + return ret; + + LOG(RPI, Info) << "Sensor: " << sensor_->id() + << " - Selected sensor format: " << sensorFormat + << " - Selected unicam format: " << unicamFormat; + + /* Use a sensible small default size if no output streams are configured. */ + Size maxSize = outStreams.empty() ? Size(320, 240) : outStreams[0].cfg->size; + V4L2DeviceFormat format; + + for (unsigned int i = 0; i < outStreams.size(); i++) { + StreamConfiguration *cfg = outStreams[i].cfg; + + /* The largest resolution gets routed to the ISP Output 0 node. */ + RPi::Stream *stream = i == 0 ? 
&isp_[Isp::Output0] : &isp_[Isp::Output1]; + + V4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg->pixelFormat); + format.size = cfg->size; + format.fourcc = fourcc; + format.colorSpace = cfg->colorSpace; + + LOG(RPI, Debug) << "Setting " << stream->name() << " to " + << format; + + ret = stream->dev()->setFormat(&format); + if (ret) + return -EINVAL; + + if (format.size != cfg->size || format.fourcc != fourcc) { + LOG(RPI, Error) + << "Failed to set requested format on " << stream->name() + << ", returned " << format; + return -EINVAL; + } + + LOG(RPI, Debug) + << "Stream " << stream->name() << " has color space " + << ColorSpace::toString(cfg->colorSpace); + + cfg->setStream(stream); + stream->setExternal(true); + } + + ispOutputTotal_ = outStreams.size(); + + /* + * If ISP::Output0 stream has not been configured by the application, + * we must allow the hardware to generate an output so that the data + * flow in the pipeline handler remains consistent, and we still generate + * statistics for the IPA to use. So enable the output at a very low + * resolution for internal use. + * + * \todo Allow the pipeline to work correctly without Output0 and only + * statistics coming from the hardware. + */ + if (outStreams.empty()) { + V4L2VideoDevice *dev = isp_[Isp::Output0].dev(); + + format = {}; + format.size = maxSize; + format.fourcc = dev->toV4L2PixelFormat(formats::YUV420); + /* No one asked for output, so the color space doesn't matter. */ + format.colorSpace = ColorSpace::Sycc; + ret = dev->setFormat(&format); + if (ret) { + LOG(RPI, Error) + << "Failed to set default format on ISP Output0: " + << ret; + return -EINVAL; + } + + ispOutputTotal_++; + + LOG(RPI, Debug) << "Defaulting ISP Output0 format to " + << format; + } + + /* + * If ISP::Output1 stream has not been requested by the application, we + * set it up for internal use now. 
This second stream will be used for + * fast colour denoise, and must be a quarter resolution of the ISP::Output0 + * stream. However, also limit the maximum size to 1200 pixels in the + * larger dimension, just to avoid being wasteful with buffer allocations + * and memory bandwidth. + * + * \todo If Output 1 format is not YUV420, Output 1 ought to be disabled as + * colour denoise will not run. + */ + if (outStreams.size() == 1) { + V4L2VideoDevice *dev = isp_[Isp::Output1].dev(); + + V4L2DeviceFormat output1Format; + constexpr Size maxDimensions(1200, 1200); + const Size limit = maxDimensions.boundedToAspectRatio(format.size); + + output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2); + output1Format.colorSpace = format.colorSpace; + output1Format.fourcc = dev->toV4L2PixelFormat(formats::YUV420); + + LOG(RPI, Debug) << "Setting ISP Output1 (internal) to " + << output1Format; + + ret = dev->setFormat(&output1Format); + if (ret) { + LOG(RPI, Error) << "Failed to set format on ISP Output1: " + << ret; + return -EINVAL; + } + + ispOutputTotal_++; + } + + /* ISP statistics output format. */ + format = {}; + format.fourcc = V4L2PixelFormat(V4L2_META_FMT_BCM2835_ISP_STATS); + ret = isp_[Isp::Stats].dev()->setFormat(&format); + if (ret) { + LOG(RPI, Error) << "Failed to set format on ISP stats stream: " + << format; + return ret; + } + + ispOutputTotal_++; + + /* + * Configure the Unicam embedded data output format only if the sensor + * supports it. 
+ */ + if (sensorMetadata_) { + V4L2SubdeviceFormat embeddedFormat; + + sensor_->device()->getFormat(1, &embeddedFormat); + format = {}; + format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA); + format.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height; + + LOG(RPI, Debug) << "Setting embedded data format " << format.toString(); + ret = unicam_[Unicam::Embedded].dev()->setFormat(&format); + if (ret) { + LOG(RPI, Error) << "Failed to set format on Unicam embedded: " + << format; + return ret; + } + } + + /* Figure out the smallest selection the ISP will allow. */ + Rectangle testCrop(0, 0, 1, 1); + isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop); + ispMinCropSize_ = testCrop.size(); + + /* Adjust aspect ratio by providing crops on the input image. */ + Size size = unicamFormat.size.boundedToAspectRatio(maxSize); + ispCrop_ = size.centeredTo(Rectangle(unicamFormat.size).center()); + + platformSetIspCrop(); + + return 0; +} + +int Vc4CameraData::platformConfigureIpa(ipa::RPi::ConfigParams &params) +{ + params.ispControls = isp_[Isp::Input].dev()->controls(); + + /* Allocate the lens shading table via dmaHeap and pass to the IPA. */ + if (!lsTable_.isValid()) { + lsTable_ = SharedFD(dmaHeap_.alloc("ls_grid", ipa::RPi::MaxLsGridSize)); + if (!lsTable_.isValid()) + return -ENOMEM; + + /* Allow the IPA to mmap the LS table via the file descriptor. */ + /* + * \todo Investigate if mapping the lens shading table buffer + * could be handled with mapBuffers().
+ */ + params.lsTableHandle = lsTable_; + } + + return 0; +} + +void Vc4CameraData::platformStart() +{ +} + +void Vc4CameraData::platformStop() +{ + bayerQueue_ = {}; + embeddedQueue_ = {}; +} + +void Vc4CameraData::unicamBufferDequeue(FrameBuffer *buffer) +{ + RPi::Stream *stream = nullptr; + unsigned int index; + + if (!isRunning()) + return; + + for (RPi::Stream &s : unicam_) { + index = s.getBufferId(buffer); + if (index) { + stream = &s; + break; + } + } + + /* The buffer must belong to one of our streams. */ + ASSERT(stream); + + LOG(RPI, Debug) << "Stream " << stream->name() << " buffer dequeue" + << ", buffer id " << index + << ", timestamp: " << buffer->metadata().timestamp; + + if (stream == &unicam_[Unicam::Image]) { + /* + * Lookup the sensor controls used for this frame sequence from + * DelayedControl and queue them along with the frame buffer. + */ + auto [ctrl, delayContext] = delayedCtrls_->get(buffer->metadata().sequence); + /* + * Add the frame timestamp to the ControlList for the IPA to use + * as it does not receive the FrameBuffer object. + */ + ctrl.set(controls::SensorTimestamp, buffer->metadata().timestamp); + bayerQueue_.push({ buffer, std::move(ctrl), delayContext }); + } else { + embeddedQueue_.push(buffer); + } + + handleState(); +} + +void Vc4CameraData::ispInputDequeue(FrameBuffer *buffer) +{ + if (!isRunning()) + return; + + LOG(RPI, Debug) << "Stream ISP Input buffer complete" + << ", buffer id " << unicam_[Unicam::Image].getBufferId(buffer) + << ", timestamp: " << buffer->metadata().timestamp; + + /* The ISP input buffer gets re-queued into Unicam. 
*/ + handleStreamBuffer(buffer, &unicam_[Unicam::Image]); + handleState(); +} + +void Vc4CameraData::ispOutputDequeue(FrameBuffer *buffer) +{ + RPi::Stream *stream = nullptr; + unsigned int index; + + if (!isRunning()) + return; + + for (RPi::Stream &s : isp_) { + index = s.getBufferId(buffer); + if (index) { + stream = &s; + break; + } + } + + /* The buffer must belong to one of our ISP output streams. */ + ASSERT(stream); + + LOG(RPI, Debug) << "Stream " << stream->name() << " buffer complete" + << ", buffer id " << index + << ", timestamp: " << buffer->metadata().timestamp; + + /* + * ISP statistics buffer must not be re-queued or sent back to the + * application until after the IPA signals so. + */ + if (stream == &isp_[Isp::Stats]) { + ipa::RPi::ProcessParams params; + params.buffers.stats = index | RPi::MaskStats; + params.ipaContext = requestQueue_.front()->sequence(); + ipa_->processStats(params); + } else { + /* Any other ISP output can be handed back to the application now. */ + handleStreamBuffer(buffer, stream); + } + + /* + * Increment the number of ISP outputs generated. + * This is needed to track dropped frames. 
+ */ + ispOutputCount_++; + + handleState(); +} + +void Vc4CameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers) +{ + if (!isRunning()) + return; + + FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID); + + handleStreamBuffer(buffer, &isp_[Isp::Stats]); + + state_ = State::IpaComplete; + handleState(); +} + +void Vc4CameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers) +{ + unsigned int embeddedId = buffers.embedded & RPi::MaskID; + unsigned int bayer = buffers.bayer & RPi::MaskID; + FrameBuffer *buffer; + + if (!isRunning()) + return; + + buffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID); + LOG(RPI, Debug) << "Input re-queue to ISP, buffer id " << (bayer & RPi::MaskID) + << ", timestamp: " << buffer->metadata().timestamp; + + isp_[Isp::Input].queueBuffer(buffer); + ispOutputCount_ = 0; + + if (sensorMetadata_ && embeddedId) { + buffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID); + handleStreamBuffer(buffer, &unicam_[Unicam::Embedded]); + } + + handleState(); +} + +void Vc4CameraData::setIspControls(const ControlList &controls) +{ + ControlList ctrls = controls; + + if (ctrls.contains(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)) { + ControlValue &value = + const_cast<ControlValue &>(ctrls.get(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)); + Span<uint8_t> s = value.data(); + bcm2835_isp_lens_shading *ls = + reinterpret_cast<bcm2835_isp_lens_shading *>(s.data()); + ls->dmabuf = lsTable_.get(); + } + + isp_[Isp::Input].dev()->setControls(&ctrls); + handleState(); +} + +void Vc4CameraData::setCameraTimeout(uint32_t maxFrameLengthMs) +{ + /* + * Set the dequeue timeout to 5x the maximum frame length advertised + * by the IPA over a number of frames, with a minimum timeout value + * of 1s.
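+ * + * For example, a reported maxFrameLengthMs of 33 yields max(1s, 5 * 33ms) = 1s, while a reported value of 250 yields max(1s, 1250ms) = 1.25s.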
+ */ + utils::Duration timeout = + std::max<utils::Duration>(1s, 5 * maxFrameLengthMs * 1ms); + + LOG(RPI, Debug) << "Setting Unicam timeout to " << timeout; + unicam_[Unicam::Image].dev()->setDequeueTimeout(timeout); +} + +void Vc4CameraData::tryRunPipeline() +{ + FrameBuffer *embeddedBuffer; + BayerFrame bayerFrame; + + /* If any of our request or buffer queues are empty, we cannot proceed. */ + if (state_ != State::Idle || requestQueue_.empty() || + bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_)) + return; + + if (!findMatchingBuffers(bayerFrame, embeddedBuffer)) + return; + + /* Take the first request from the queue and action the IPA. */ + Request *request = requestQueue_.front(); + + /* See if a new ScalerCrop value needs to be applied. */ + applyScalerCrop(request->controls()); + + /* + * Clear the request metadata and fill it with some initial non-IPA + * related controls. We clear it first because the request metadata + * may have been populated if we have dropped the previous frame. + */ + request->metadata().clear(); + fillRequestMetadata(bayerFrame.controls, request); + + /* Set our state to say the pipeline is active.
*/ + state_ = State::Busy; + + unsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer); + + LOG(RPI, Debug) << "Signalling prepareIsp:" + << " Bayer buffer id: " << bayer; + + ipa::RPi::PrepareParams params; + params.buffers.bayer = RPi::MaskBayerData | bayer; + params.sensorControls = std::move(bayerFrame.controls); + params.requestControls = request->controls(); + params.ipaContext = request->sequence(); + params.delayContext = bayerFrame.delayContext; + + if (embeddedBuffer) { + unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer); + + params.buffers.embedded = RPi::MaskEmbeddedData | embeddedId; + LOG(RPI, Debug) << "Signalling prepareIsp:" + << " Embedded buffer id: " << embeddedId; + } + + ipa_->prepareIsp(params); +} + +bool Vc4CameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer) +{ + if (bayerQueue_.empty()) + return false; + + /* + * Find the embedded data buffer with a matching timestamp to pass to + * the IPA. Any embedded buffers with a timestamp lower than the + * current bayer buffer will be removed and re-queued to the driver. + */ + uint64_t ts = bayerQueue_.front().buffer->metadata().timestamp; + embeddedBuffer = nullptr; + while (!embeddedQueue_.empty()) { + FrameBuffer *b = embeddedQueue_.front(); + if (b->metadata().timestamp < ts) { + embeddedQueue_.pop(); + unicam_[Unicam::Embedded].returnBuffer(b); + LOG(RPI, Debug) << "Dropping unmatched input frame in stream " + << unicam_[Unicam::Embedded].name(); + } else if (b->metadata().timestamp == ts) { + /* Found a match! */ + embeddedBuffer = b; + embeddedQueue_.pop(); + break; + } else { + break; /* Only higher timestamps from here. */ + } + } + + if (!embeddedBuffer && sensorMetadata_) { + if (embeddedQueue_.empty()) { + /* + * If the embedded buffer queue is empty, wait for the next + * buffer to arrive - dequeue ordering may send the image + * buffer first. 
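+ * + * For example, Unicam may dequeue the image buffer for frame N before the corresponding embedded data buffer; returning false here lets us retry once the embedded buffer for frame N has been dequeued.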
+ */ + LOG(RPI, Debug) << "Waiting for next embedded buffer."; + return false; + } + + /* Log if there is no matching embedded data buffer found. */ + LOG(RPI, Debug) << "Returning bayer frame without a matching embedded buffer."; + } + + bayerFrame = std::move(bayerQueue_.front()); + bayerQueue_.pop(); + + return true; +} + +REGISTER_PIPELINE_HANDLER(PipelineHandlerVc4) + +} /* namespace libcamera */
From patchwork Wed May 3 12:20:35 2023 X-Patchwork-Submitter: Naushir Patuck X-Patchwork-Id: 18597 To: libcamera-devel@lists.libcamera.org Date: Wed, 3 May 2023 13:20:35 +0100 Message-Id: <20230503122035.32026-14-naush@raspberrypi.com> In-Reply-To: <20230503122035.32026-1-naush@raspberrypi.com> References: <20230503122035.32026-1-naush@raspberrypi.com> Subject: [libcamera-devel] [PATCH 13/13] pipeline: raspberrypi: Add stream flags to RPi::Stream From: Naushir Patuck Cc: Jacopo Mondi
Add a flags_ field to indicate stream state information in RPi::Stream. This replaces the existing external_ and importOnly_ boolean flags.
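The boolean-to-bitmask refactor this commit message describes can be sketched in isolation as follows. This is an illustrative, self-contained stand-in: the flag names mirror the patch, but the real code uses libcamera's Flags<> template and the full RPi::Stream class shown in the diff, not this simplified helper.

```cpp
#include <cassert>

/* Two independent booleans (external_, importOnly_) become bits in one field. */
enum class StreamFlag : unsigned int {
	None = 0,
	ImportOnly = (1 << 0),
	External = (1 << 1),
};

class Stream
{
public:
	void setFlags(StreamFlag flag)
	{
		flags_ |= static_cast<unsigned int>(flag);
		/* Import streams cannot be external. */
		assert(!(hasFlag(StreamFlag::External) && hasFlag(StreamFlag::ImportOnly)));
	}

	void clearFlags(StreamFlag flag)
	{
		flags_ &= ~static_cast<unsigned int>(flag);
	}

	bool hasFlag(StreamFlag flag) const
	{
		return flags_ & static_cast<unsigned int>(flag);
	}

private:
	unsigned int flags_ = 0;
};
```

In this shape, the old `stream->isExternal()` call sites become flag tests, which is the same pattern the diff applies with `stream->getFlags() & StreamFlag::External`.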
Signed-off-by: Naushir Patuck Reviewed-by: Jacopo Mondi Reviewed-by: Laurent Pinchart --- .../pipeline/rpi/common/pipeline_base.cpp | 8 ++-- .../pipeline/rpi/common/rpi_stream.cpp | 40 ++++++++++-------- .../pipeline/rpi/common/rpi_stream.h | 42 ++++++++++++------- src/libcamera/pipeline/rpi/vc4/vc4.cpp | 8 ++-- 4 files changed, 60 insertions(+), 38 deletions(-) diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp index 69c67acd7d45..ba1797bcfef0 100644 --- a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp @@ -31,6 +31,8 @@ using namespace RPi; LOG_DEFINE_CATEGORY(RPI) +using StreamFlag = RPi::Stream::StreamFlag; + namespace { constexpr unsigned int defaultRawBitDepth = 12; @@ -504,7 +506,7 @@ int PipelineHandlerBase::configure(Camera *camera, CameraConfiguration *config) /* Start by freeing all buffers and reset the stream states. */ data->freeBuffers(); for (auto const stream : data->streams_) - stream->setExternal(false); + stream->clearFlags(StreamFlag::External); std::vector<StreamParams> rawStreams, ispStreams; std::optional<BayerFormat::Packing> packing; @@ -752,7 +754,7 @@ int PipelineHandlerBase::queueRequestDevice(Camera *camera, Request *request) /* Push all buffers supplied in the Request to the respective streams. 
*/ for (auto stream : data->streams_) { - if (!stream->isExternal()) + if (!(stream->getFlags() & StreamFlag::External)) continue; FrameBuffer *buffer = request->findBuffer(stream); @@ -932,7 +934,7 @@ int PipelineHandlerBase::queueAllBuffers(Camera *camera) int ret; for (auto const stream : data->streams_) { - if (!stream->isExternal()) { + if (!(stream->getFlags() & StreamFlag::External)) { ret = stream->queueAllBuffers(); if (ret < 0) return ret; diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp index b7e4130f5e56..c158843cb0ed 100644 --- a/src/libcamera/pipeline/rpi/common/rpi_stream.cpp +++ b/src/libcamera/pipeline/rpi/common/rpi_stream.cpp @@ -14,6 +14,24 @@ LOG_DEFINE_CATEGORY(RPISTREAM) namespace RPi { +void Stream::setFlags(StreamFlags flags) +{ + flags_ |= flags; + + /* Import streams cannot be external. */ + ASSERT(!(flags_ & StreamFlag::External) || !(flags_ & StreamFlag::ImportOnly)); +} + +void Stream::clearFlags(StreamFlags flags) +{ + flags_ &= ~flags; +} + +RPi::Stream::StreamFlags Stream::getFlags() const +{ + return flags_; +} + V4L2VideoDevice *Stream::dev() const { return dev_.get(); @@ -32,18 +50,6 @@ void Stream::resetBuffers() availableBuffers_.push(buffer.get()); } -void Stream::setExternal(bool external) -{ - /* Import streams cannot be external. */ - ASSERT(!external || !importOnly_); - external_ = external; -} - -bool Stream::isExternal() const -{ - return external_; -} - void Stream::setExportedBuffers(std::vector> *buffers) { for (auto const &buffer : *buffers) @@ -57,7 +63,7 @@ const BufferMap &Stream::getBuffers() const unsigned int Stream::getBufferId(FrameBuffer *buffer) const { - if (importOnly_) + if (flags_ & StreamFlag::ImportOnly) return 0; /* Find the buffer in the map, and return the buffer id. 
*/ @@ -88,7 +94,7 @@ int Stream::prepareBuffers(unsigned int count) { int ret; - if (!importOnly_) { + if (!(flags_ & StreamFlag::ImportOnly)) { if (count) { /* Export some frame buffers for internal use. */ ret = dev_->exportBuffers(count, &internalBuffers_); @@ -113,7 +119,7 @@ int Stream::prepareBuffers(unsigned int count) * \todo Find a better heuristic, or, even better, an exact solution to * this issue. */ - if (isExternal() || importOnly_) + if ((flags_ & StreamFlag::External) || (flags_ & StreamFlag::ImportOnly)) count = count * 2; return dev_->importBuffers(count); @@ -160,7 +166,7 @@ int Stream::queueBuffer(FrameBuffer *buffer) void Stream::returnBuffer(FrameBuffer *buffer) { - if (!external_) { + if (!(flags_ & StreamFlag::External)) { /* For internal buffers, simply requeue back to the device. */ queueToDevice(buffer); return; @@ -204,7 +210,7 @@ int Stream::queueAllBuffers() { int ret; - if (external_) + if (flags_ & StreamFlag::External) return 0; while (!availableBuffers_.empty()) { diff --git a/src/libcamera/pipeline/rpi/common/rpi_stream.h b/src/libcamera/pipeline/rpi/common/rpi_stream.h index b8c74de35863..6edd304bdfe2 100644 --- a/src/libcamera/pipeline/rpi/common/rpi_stream.h +++ b/src/libcamera/pipeline/rpi/common/rpi_stream.h @@ -12,6 +12,7 @@ #include #include +#include #include #include "libcamera/internal/v4l2_videodevice.h" @@ -37,25 +38,41 @@ enum BufferMask { class Stream : public libcamera::Stream { public: + enum class StreamFlag { + None = 0, + /* + * Indicates that this stream only imports buffers, e.g. the ISP + * input stream. + */ + ImportOnly = (1 << 0), + /* + * Indicates that this stream is active externally, i.e. the + * buffers might be provided by (and returned to) the application. 
+ */ + External = (1 << 1), + }; + + using StreamFlags = Flags; + Stream() - : id_(BufferMask::MaskID) + : flags_(StreamFlag::None), id_(BufferMask::MaskID) { } - Stream(const char *name, MediaEntity *dev, bool importOnly = false) - : external_(false), importOnly_(importOnly), name_(name), + Stream(const char *name, MediaEntity *dev, StreamFlags flags = StreamFlag::None) + : flags_(flags), name_(name), dev_(std::make_unique(dev)), id_(BufferMask::MaskID) { } + void setFlags(StreamFlags flags); + void clearFlags(StreamFlags flags); + StreamFlags getFlags() const; + V4L2VideoDevice *dev() const; const std::string &name() const; - bool isImporter() const; void resetBuffers(); - void setExternal(bool external); - bool isExternal() const; - void setExportedBuffers(std::vector> *buffers); const BufferMap &getBuffers() const; unsigned int getBufferId(FrameBuffer *buffer) const; @@ -112,14 +129,7 @@ private: void clearBuffers(); int queueToDevice(FrameBuffer *buffer); - /* - * Indicates that this stream is active externally, i.e. the buffers - * might be provided by (and returned to) the application. - */ - bool external_; - - /* Indicates that this stream only imports buffers, e.g. ISP input. */ - bool importOnly_; + StreamFlags flags_; /* Stream name identifier. */ std::string name_; @@ -182,4 +192,6 @@ public: } /* namespace RPi */ +LIBCAMERA_FLAGS_ENABLE_OPERATORS(RPi::Stream::StreamFlag) + } /* namespace libcamera */ diff --git a/src/libcamera/pipeline/rpi/vc4/vc4.cpp b/src/libcamera/pipeline/rpi/vc4/vc4.cpp index c3a40326b228..018cf4881d0e 100644 --- a/src/libcamera/pipeline/rpi/vc4/vc4.cpp +++ b/src/libcamera/pipeline/rpi/vc4/vc4.cpp @@ -24,6 +24,8 @@ namespace libcamera { LOG_DECLARE_CATEGORY(RPI) +using StreamFlag = RPi::Stream::StreamFlag; + namespace { enum class Unicam : unsigned int { Image, Embedded }; @@ -318,7 +320,7 @@ int PipelineHandlerVc4::platformRegister(std::unique_ptr &camer } /* Tag the ISP input stream as an import stream. 
*/ - data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, true); + data->isp_[Isp::Input] = RPi::Stream("ISP Input", ispOutput0, StreamFlag::ImportOnly); data->isp_[Isp::Output0] = RPi::Stream("ISP Output0", ispCapture1); data->isp_[Isp::Output1] = RPi::Stream("ISP Output1", ispCapture2); data->isp_[Isp::Stats] = RPi::Stream("ISP Stats", ispCapture3); @@ -500,7 +502,7 @@ int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat, */ if (!rawStreams.empty()) { rawStreams[0].cfg->setStream(&unicam_[Unicam::Image]); - unicam_[Unicam::Image].setExternal(true); + unicam_[Unicam::Image].setFlags(StreamFlag::External); } ret = isp_[Isp::Input].dev()->setFormat(&unicamFormat); @@ -545,7 +547,7 @@ int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat, << ColorSpace::toString(cfg->colorSpace); cfg->setStream(stream); - stream->setExternal(true); + stream->setFlags(StreamFlag::External); } ispOutputTotal_ = outStreams.size();