{"id":8979,"url":"https://patchwork.libcamera.org/api/patches/8979/?format=json","web_url":"https://patchwork.libcamera.org/patch/8979/","project":{"id":1,"url":"https://patchwork.libcamera.org/api/projects/1/?format=json","name":"libcamera","link_name":"libcamera","list_id":"libcamera_core","list_email":"libcamera-devel@lists.libcamera.org","web_url":"","scm_url":"","webscm_url":""},"msgid":"<20200724162734.73806-1-chris@gregariousmammal.com>","date":"2020-07-24T16:27:34","name":"[libcamera-devel] Add pipeline-handler writers Guide","commit_ref":null,"pull_url":null,"state":"superseded","archived":false,"hash":"7dc9195f6425c12c30d60b98198286db9cf069ff","submitter":{"id":55,"url":"https://patchwork.libcamera.org/api/people/55/?format=json","name":"Chris Chinchilla","email":"chris@gregariousmammal.com"},"delegate":null,"mbox":"https://patchwork.libcamera.org/patch/8979/mbox/","series":[{"id":1139,"url":"https://patchwork.libcamera.org/api/series/1139/?format=json","web_url":"https://patchwork.libcamera.org/project/libcamera/list/?series=1139","date":"2020-07-24T16:27:34","name":"[libcamera-devel] Add pipeline-handler writers Guide","version":1,"mbox":"https://patchwork.libcamera.org/series/1139/mbox/"}],"comments":"https://patchwork.libcamera.org/api/patches/8979/comments/","check":"pending","checks":"https://patchwork.libcamera.org/api/patches/8979/checks/","tags":{},"headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 8A4BBBD878\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 24 Jul 2020 16:27:47 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 2246D61220;\n\tFri, 24 Jul 2020 18:27:47 +0200 (CEST)","from 
mail-wr1-x444.google.com (mail-wr1-x444.google.com\n\t[IPv6:2a00:1450:4864:20::444])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 3ACFC605B2\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Jul 2020 18:27:46 +0200 (CEST)","by mail-wr1-x444.google.com with SMTP id f18so8870804wrs.0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Jul 2020 09:27:46 -0700 (PDT)","from localhost.localdomain ([196.240.57.36])\n\tby smtp.gmail.com with ESMTPSA id\n\tx9sm8132118wmk.45.2020.07.24.09.27.42\n\t(version=TLS1_2 cipher=ECDHE-ECDSA-AES128-GCM-SHA256 bits=128/128);\n\tFri, 24 Jul 2020 09:27:42 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=gregariousmammal.com\n\theader.i=@gregariousmammal.com header.b=\"vy/IcQu8\"; \n\tdkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=gregariousmammal.com; s=google;\n\th=from:to:cc:subject:date:message-id:mime-version\n\t:content-transfer-encoding;\n\tbh=r1uYhHrEC2zlXJCQnwtVW0faS00MJ/IGx2HbpPqb1Uo=;\n\tb=vy/IcQu8qWjqBxnQzI5edmTtVnNisn1vniElrieMGC7hxW7qSR2vi9bFErRkHdrSzz\n\tky0IRLglwZcycz/reyxtybiyjxYG4a4Dg4Cbalw5udxvGyFNLWCxl6SyR2vo9xsWDxEt\n\tivBeI2z2oHHOkR+6/toxGrc+opqvnz7763wdb9GCHY+/anDu+7NHzNj+bKHFfE1agC2n\n\tQGrp32SLr13U2AM9neG4WOnRJi34TOp0JijbjgJCYLSssjdYtxVa9ECyYPGlPMlb+uJP\n\tsEEPGgQHqBzne4qff9Y6AGhxMQW8x5HoYhrIggM8Mm34Sd7saQoPJGMXuVLIMgIAQdF4\n\t5WZQ==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025;\n\th=x-gm-message-state:from:to:cc:subject:date:message-id:mime-version\n\t:content-transfer-encoding;\n\tbh=r1uYhHrEC2zlXJCQnwtVW0faS00MJ/IGx2HbpPqb1Uo=;\n\tb=LZPpFc8bNsznK6xgJLpXZMnm2AeAydSy9SWeiZFUyc7ea6owdLtePCy1uQF+Q74ZYA\n\t2lvTTvVZYkOAPcV56wpECIjgED+9RP5D9IpKnlAABaJyIXa54bnuHm2eRSu3tjsns/fB\n\tVfySGb3eePYTba8xrtW00uHW9HdMs33keQYPbOp39t4GNzboECEw9BU534rWyWJDBenQ\n\tTh58jg5YolAnShpaFxjA6g5UNak5YiI8SkuZ98w+wysWh25yTvZss0Iv87bKE5opSmdd\n\tzw7qGU9CcYPX/4y8/LZ4EcjXJKpK5wz62EdlxKq9X+hNZDvDEAyXwQxjOzWDhr1yj7v6\n\tDrLQ==","X-Gm-Message-State":"AOAM533a4CokHfxyoMfBr34mNnAhu1fdJVaCXv3i21f6ZB9/uV7TMkht\n\tf1y0bl0KdZYzw8/+BdAaBN9ICdBK8ZimYWv8","X-Google-Smtp-Source":"ABdhPJypOuZ3nw7PfgcdFwJ4cL+04W/8G7mX5qy9//vcmVPDWl0j7YluvSzdTNmNOtQRYJdXLlX/2g==","X-Received":"by 2002:adf:fb87:: with SMTP id a7mr7946983wrr.390.1595608063753;\n\tFri, 24 Jul 2020 09:27:43 -0700 (PDT)","From":"chris@gregariousmammal.com","To":"libcamera-devel@lists.libcamera.org","Date":"Fri, 24 Jul 2020 18:27:34 +0200","Message-Id":"<20200724162734.73806-1-chris@gregariousmammal.com>","X-Mailer":"git-send-email 2.27.0","MIME-Version":"1.0","Subject":"[libcamera-devel] [PATCH] Add pipeline-handler writers Guide","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"Chris Chinchilla <chris@gregariousmammal.com>","Content-Type":"text/plain; 
charset=\"utf-8\"","Content-Transfer-Encoding":"base64","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"},"content":"From: Chris Chinchilla <chris@gregariousmammal.com>\n\nAdd pipeline-handler writers guide covering the steps to add support for a new device\n\nSigned-off-by: Chris Chinchilla <chris@gregariousmammal.com>\n---\n Documentation/write-pipeline-handler.rst | 1103 ++++++++++++++++++++++\n 1 file changed, 1103 insertions(+)\n create mode 100644 Documentation/write-pipeline-handler.rst","diff":"diff --git a/Documentation/write-pipeline-handler.rst b/Documentation/write-pipeline-handler.rst\nnew file mode 100644\nindex 0000000..cf4cdee\n--- /dev/null\n+++ b/Documentation/write-pipeline-handler.rst\n@@ -0,0 +1,1103 @@\n+Pipeline Handler Writers Guide\n+==============================\n+\n+Pipeline handlers are the abstraction layer for device-specific hardware\n+configuration. They access and control hardware through the V4L2 and\n+Media Controller kernel interfaces, and implement an internal API to\n+control the ISP and Capture components of a pipeline directly.\n+\n+A pipeline handler manages most aspects of interacting with a camera\n+device including:\n+\n+-  setting and retrieving frame controls\n+-  stream configuration\n+-  delivering frame data from a device to applications\n+\n+Pipeline handlers create Camera device instances based on the devices\n+they detect and support on the running system.\n+\n+If you want to add support for a particular device to the libcamera\n+codebase, you need to create a matching pipeline handler, and\n+specifically the code and configuration for the ISP.\n+\n+Prerequisite knowledge\n+----------------------\n+\n+A pipeline handler uses most of the following libcamera classes, and\n+below is a brief overview of each of those. The rest of this guide\n+explains how to use each of them in the context of a pipeline handler.\n+\n+.. 
TODO: Convert to sphinx refs\n+\n+-  `DeviceEnumerator <http://libcamera.org/api-html/classlibcamera_1_1DeviceEnumerator.html>`_:\n+   Enumerates all media devices attached to the system, and for each\n+   device found creates an instance of the ``MediaDevice`` class and\n+   stores it.\n+-  `DeviceMatch <http://libcamera.org/api-html/classlibcamera_1_1DeviceMatch.html>`_:\n+   Describes a media device search pattern using entity names, or other\n+   properties.\n+-  `MediaDevice <http://libcamera.org/api-html/classlibcamera_1_1MediaDevice.html>`_:\n+   Instances of this class are associated with a kernel media controller\n+   device and its connected objects.\n+-  `V4L2VideoDevice <http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html>`_:\n+   Models an instance of a V4L2 video device. It is constructed with the\n+   path to a V4L2 video device node.\n+-  `V4L2SubDevice <http://libcamera.org/api-html/classlibcamera_1_1V4L2Subdevice.html>`_:\n+   Provides an API to the sub-devices that model the hardware components\n+   of a V4L2 device.\n+-  `CameraSensor <http://libcamera.org/api-html/classlibcamera_1_1CameraSensor.html>`_:\n+   Abstracts camera sensor handling by hiding the details of the V4L2\n+   subdevice kernel API and caching sensor information.\n+-  `CameraData <http://libcamera.org/api-html/classlibcamera_1_1CameraData.html>`_:\n+   Represents device-specific data associated with a Camera that a\n+   pipeline handler might want to access.\n+-  `CameraConfiguration <http://libcamera.org/api-html/classlibcamera_1_1CameraConfiguration.html>`_:\n+   An ordered vector of StreamConfigurations used to describe desired\n+   stream parameters and, when validated, applied to the camera.\n+-  `StreamConfiguration <http://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html>`_:\n+   Models the configuration information for a single camera stream.\n+\n+Creating a PipelineHandler\n+--------------------------\n+\n+This guide walks through the 
steps to create a simple pipeline handler\n+called “Vivid” that supports the `V4L2 Virtual Video Test Driver\n+(vivid) <https://www.kernel.org/doc/html/latest/admin-guide/media/vivid.html>`_.\n+\n+To use the vivid test driver, you first need to load the vivid kernel module\n+with the ``modprobe vivid`` command.\n+\n+Create the skeleton file structure\n+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n+\n+To add a new pipeline handler, create a directory to hold the pipeline code\n+in the *src/libcamera/pipeline/* directory that matches the name of the\n+pipeline (in this case *vivid*). Inside the new directory add a *meson.build* file that integrates\n+with the libcamera build system, and a *vivid.cpp* file that matches the name of the pipeline.\n+\n+In the *meson.build* file, add the *vivid.cpp* file as a build source for libcamera by adding it to\n+the global meson ``libcamera_sources`` variable:\n+\n+.. code-block:: meson\n+\n+   # SPDX-License-Identifier: CC0-1.0\n+\n+   libcamera_sources += files([\n+       'vivid.cpp',\n+   ])\n+\n+Users of libcamera can selectively enable pipelines while building libcamera using the\n+``pipelines`` option.\n+\n+For example, to enable only the IPU3, UVC, and VIVID pipelines, specify them as a comma-separated\n+list with ``-Dpipelines`` when generating a build directory:\n+\n+.. code-block:: shell\n+\n+    meson build -Dpipelines=ipu3,uvcvideo,vivid\n+\n+`Read the Meson build configuration documentation <https://mesonbuild.com/Configuring-a-build-directory.html>`_ for more information.\n+\n+To add the new pipeline handler to this list of options, add its directory\n+name to the libcamera build options in the top-level *meson_options.txt*.\n+\n+.. 
code-block:: meson\n+\n+   option('pipelines',\n+           type : 'array',\n+           choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc', 'vivid'],\n+           description : 'Select which pipeline handlers to include')\n+\n+In *vivid.cpp* add the pipeline handler to the ``libcamera`` namespace, create a subclass of the\n+`PipelineHandler <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html>`_\n+base class, and add stub implementations for the overridden class members.\n+\n+.. code-block:: cpp\n+\n+    namespace libcamera {\n+\n+        class PipelineHandlerVivid : public PipelineHandler\n+        {\n+        public:\n+            PipelineHandlerVivid(CameraManager *manager);\n+\n+            CameraConfiguration *generateConfiguration(Camera *camera,\n+                                    const StreamRoles &roles) override;\n+            int configure(Camera *camera, CameraConfiguration *config) override;\n+\n+            int exportFrameBuffers(Camera *camera, Stream *stream,\n+                            std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;\n+\n+            int start(Camera *camera) override;\n+            void stop(Camera *camera) override;\n+\n+            int queueRequestDevice(Camera *camera, Request *request) override;\n+\n+            bool match(DeviceEnumerator *enumerator) override;\n+        };\n+\n+        PipelineHandlerVivid::PipelineHandlerVivid(CameraManager *manager)\n+            : PipelineHandler(manager)\n+        {\n+        }\n+\n+        CameraConfiguration *PipelineHandlerVivid::generateConfiguration(Camera *camera,\n+                                            const StreamRoles &roles)\n+        {\n+            return nullptr;\n+        }\n+\n+        int PipelineHandlerVivid::configure(Camera *camera, CameraConfiguration *config)\n+        {\n+            return -1;\n+        }\n+\n+        int PipelineHandlerVivid::exportFrameBuffers(Camera *camera, Stream *stream,\n+               
                     std::vector<std::unique_ptr<FrameBuffer>> *buffers)\n+        {\n+            return -1;\n+        }\n+\n+        int PipelineHandlerVivid::start(Camera *camera)\n+        {\n+            return -1;\n+        }\n+\n+        void PipelineHandlerVivid::stop(Camera *camera)\n+        {\n+        }\n+\n+        int PipelineHandlerVivid::queueRequestDevice(Camera *camera, Request *request)\n+        {\n+            return -1;\n+        }\n+\n+        bool PipelineHandlerVivid::match(DeviceEnumerator *enumerator)\n+        {\n+            return false;\n+        }\n+    } /* namespace libcamera */\n+\n+You must register the ``PipelineHandler`` subclass with the pipeline\n+handler factory using the\n+`REGISTER_PIPELINE_HANDLER <http://libcamera.org/api-html/pipeline__handler_8h.html>`_\n+macro, which registers the class and creates a global symbol to reference\n+it, making it available to try and match devices.\n+\n+Add the following before the closing curly bracket of the namespace declaration.\n+\n+.. code-block:: cpp\n+\n+   REGISTER_PIPELINE_HANDLER(PipelineHandlerVivid);\n+\n+For debugging and testing a pipeline handler during development, you can\n+define a log message category for the pipeline handler. The\n+``LOG_DEFINE_CATEGORY`` and ``LIBCAMERA_LOG_LEVELS`` macros are part of\n+`the inbuilt libcamera logging\n+infrastructure <http://libcamera.org/api-html/log_8h.html>`_ that allow\n+you to inspect internal operations in a user-configurable way.\n+\n+Add the following before the ``PipelineHandlerVivid`` class declaration.\n+\n+.. code-block:: cpp\n+\n+   LOG_DEFINE_CATEGORY(VIVID)\n+\n+At this point, add the following includes to the top of the file.\n+\n+.. 
code-block:: cpp\n+\n+   #include \"libcamera/internal/log.h\"\n+   #include \"libcamera/internal/pipeline_handler.h\"\n+\n+Run ``meson build`` and ``ninja -C build install`` to build the\n+libcamera code base, and confirm that the build system found the new\n+pipeline handler by running\n+``LIBCAMERA_LOG_LEVELS=Pipeline:0 ./build/src/cam/cam -l``.\n+\n+You should see output like the following.\n+\n+.. code-block:: shell\n+\n+    DEBUG Pipeline pipeline_handler.cpp:680 Registered pipeline handler \"PipelineHandlerVivid\"\n+\n+Matching devices\n+~~~~~~~~~~~~~~~~\n+\n+The\n+`match <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html>`_\n+class member function is the main entry point of any pipeline handler. When a\n+libcamera application starts the camera manager, the camera manager calls the\n+``match`` function of every registered pipeline handler with an enumerator of\n+all the devices it found on the system. The pipeline handler then acquires the\n+devices it supports and creates the resources it needs.\n+\n+The ``match`` method takes a pointer to the enumerator\n+as a parameter and should return ``true`` if it matches a device, and\n+``false`` if not.\n+\n+The ``match`` method should identify whether there are suitable devices available in the\n+``DeviceEnumerator`` that the pipeline supports. To do\n+this, construct the `DeviceMatch <http://libcamera.org/api-html/classlibcamera_1_1DeviceMatch.html>`_\n+class with the name of the media controller device to match. You can narrow\n+the search further by adding specific media entities to the search using the\n+``.add()`` method on the ``DeviceMatch``.\n+\n+This example uses search patterns that match vivid, but you should\n+change this value to suit your device identifier.\n+\n+Replace the contents of the ``PipelineHandlerVivid::match`` method with the following.\n+\n+.. 
code-block:: cpp\n+\n+   DeviceMatch dm(\"vivid\");\n+   dm.add(\"vivid-000-vid-cap\");\n+   return false; // Prevent infinite loops for now\n+\n+With the device matched, attempt to acquire exclusive access to the matching media controller device with the\n+`acquireMediaDevice <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a77e424fe704e7b26094164b9189e0f84>`_\n+method. If the device cannot be acquired, for example because it has already been acquired, the method returns ``nullptr``.\n+\n+Add the following below ``dm.add(\"vivid-000-vid-cap\");``.\n+\n+.. code-block:: cpp\n+\n+   MediaDevice *media = acquireMediaDevice(enumerator, dm);\n+   if (!media)\n+       return false;\n+\n+The file now needs an additional include. Add the following to the existing include block.\n+\n+.. code-block:: cpp\n+\n+   #include \"libcamera/internal/device_enumerator.h\"\n+\n+At this stage, the pipeline handler can successfully match\n+devices, but does not yet create a Camera that can be reported to libcamera.\n+\n+As a temporary validation step, add a debug print with ``LOG(VIVID, Debug) << \"Obtained Vivid Device\";``\n+before the closing ``return false; // Prevent infinite loops for now`` in the\n+``PipelineHandlerVivid::match`` method, to log when the pipeline handler successfully matches the ``MediaDevice``\n+and ``MediaEntity`` names.\n+\n+Test that the pipeline handler matches and finds a device by rebuilding,\n+and running\n+``LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./build/src/cam/cam -l``.\n+\n+You should see output like the following.\n+\n+.. code-block:: shell\n+\n+    DEBUG VIVID vivid.cpp:74 Obtained Vivid Device\n+\n+Creating camera data\n+~~~~~~~~~~~~~~~~~~~~\n+\n+If the enumerator finds a matching device, the ``match`` method can\n+create a Camera instance for it. 
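Putting the snippets above together, the interim ``match`` implementation (still containing the temporary debug print and ``return false``) looks like the following.\n+\n+.. code-block:: cpp\n+\n+   bool PipelineHandlerVivid::match(DeviceEnumerator *enumerator)\n+   {\n+       DeviceMatch dm(\"vivid\");\n+       dm.add(\"vivid-000-vid-cap\");\n+\n+       MediaDevice *media = acquireMediaDevice(enumerator, dm);\n+       if (!media)\n+           return false;\n+\n+       LOG(VIVID, Debug) << \"Obtained Vivid Device\";\n+\n+       return false; // Prevent infinite loops for now\n+   }\n+\n+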
Each Camera has device-specific data\n+stored using the `CameraData <http://libcamera.org/api-html/classlibcamera_1_1CameraData.html>`_\n+class, which you extend for the needs of a specific pipeline handler.\n+\n+You need to implement a ``VividCameraData`` class that inherits from the\n+``CameraData`` class and initializes the base class using the\n+``PipelineHandler`` pointer.\n+\n+Add the following code after the ``LOG_DEFINE_CATEGORY(VIVID)`` line.\n+\n+.. code-block:: cpp\n+\n+   class VividCameraData : public CameraData\n+   {\n+   public:\n+       VividCameraData(PipelineHandler *pipe, MediaDevice *media)\n+           : CameraData(pipe), media_(media), video_(nullptr)\n+       {\n+       }\n+\n+       ~VividCameraData()\n+       {\n+           delete video_;\n+       }\n+\n+       int init();\n+       void bufferReady(FrameBuffer *buffer);\n+\n+       MediaDevice *media_;\n+       V4L2VideoDevice *video_;\n+       Stream stream_;\n+   };\n+\n+``CameraData(pipe)`` initializes the base class through its\n+constructor, and then the ``VividCameraData`` members. ``media_`` is initialized\n+with the ``MediaDevice`` passed into this constructor, and the video\n+capture device is initialized to ``nullptr`` as it's not yet established.\n+\n+The class has a destructor method that deletes the video device when the ``CameraData`` is\n+destroyed by an application.\n+\n+Every camera data class instance needs at least the above, but yours may\n+add more, in which case you should also remove and clean up anything\n+extra in the destructor method.\n+\n+The pipeline handler then initializes the camera data by calling\n+``init()``. The base ``CameraData`` class doesn’t define the ``init()``\n+function; it’s up to pipeline handlers to define how they initialize the\n+camera and camera data. 
This method is one of the more device-specific\n+methods for a pipeline handler, and defines the context of the camera,\n+and how libcamera should communicate with the camera and store the data\n+it generates. For real hardware, this includes tasks such as opening the\n+ISP, or creating a sensor device.\n+\n+For this example, create an ``init()`` method after the ``VividCameraData`` class that creates a new\n+V4L2 video device from the matching ``MediaEntity`` within the ``MediaDevice``, using the\n+``getEntityByName()`` helper. Make sure you specify the entity name for your device.\n+\n+.. code-block:: cpp\n+\n+   int VividCameraData::init()\n+   {\n+       video_ = new V4L2VideoDevice(media_->getEntityByName(\"vivid-000-vid-cap\"));\n+       if (video_->open())\n+           return -ENODEV;\n+\n+       return 0;\n+   }\n+\n+Return to the ``match`` method, and remove ``LOG(VIVID, Debug) << \"Obtained Vivid Device\";`` and\n+``return false; // Prevent infinite loops for now``, replacing them with the following code.\n+\n+After a successful device match, the code below creates a new instance of the device-specific\n+``CameraData`` class, using a unique pointer to manage the lifetime of the instance.\n+\n+If the camera data initialization fails, return ``false`` from the ``match()``\n+method to indicate the failure and prevent a camera being registered for the device.\n+\n+.. code-block:: cpp\n+\n+   std::unique_ptr<VividCameraData> data = std::make_unique<VividCameraData>(this, media);\n+\n+   if (data->init())\n+       return false;\n+\n+Create a set of streams for the camera; for this device there is\n+only one. You create a camera using the static\n+`Camera::create <http://libcamera.org/api-html/classlibcamera_1_1Camera.html#a453740e0d2a2f495048ae307a85a2574>`_ method,\n+passing the pipeline handler, the name of the camera, and the streams\n+available. 
Then register the camera and its data with the camera manager\n+using\n+`registerCamera <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#adf02a7f1bbd87aca73c0e8d8e0e6c98b>`_.\n+At the end of the method, return ``true`` to indicate that a camera was\n+created successfully.\n+\n+Add the following below the code added above.\n+\n+.. code-block:: cpp\n+\n+   std::set<Stream *> streams{ &data->stream_ };\n+   std::shared_ptr<Camera> camera = Camera::create(this, data->video_->deviceName(), streams);\n+   registerCamera(std::move(camera), std::move(data));\n+\n+   return true;\n+\n+Add a private ``cameraData`` helper to the ``PipelineHandlerVivid`` class that obtains the camera\n+data and does the necessary casting to convert it to the pipeline-specific ``VividCameraData``.\n+This simplifies the process of obtaining the custom camera data, which you need throughout the code\n+for the pipeline handler.\n+\n+.. code-block:: cpp\n+\n+   private:\n+\n+       VividCameraData *cameraData(const Camera *camera)\n+       {\n+           return static_cast<VividCameraData *>(\n+               PipelineHandler::cameraData(camera));\n+       }\n+\n+At this point, you need to add the following new includes to provide the Camera interface and\n+device interaction interfaces.\n+\n+.. 
code-block:: cpp\n+\n+   #include <libcamera/camera.h>\n+   #include \"libcamera/internal/media_device.h\"\n+   #include \"libcamera/internal/v4l2_videodevice.h\"\n+\n+Generating a default configuration\n+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n+\n+When a libcamera-based application uses a pipeline handler to access and\n+control a camera stream, it typically generates a default configuration\n+for the device it wants to access, makes changes, and validates those\n+changes with the pipeline handler.\n+\n+The configuration includes aspects such as pixel formats, and the width\n+and height of streams.\n+\n+Create a `CameraConfiguration <http://libcamera.org/api-html/classlibcamera_1_1CameraConfiguration.html>`_ subclass for the camera device, and its empty constructor,\n+before the ``PipelineHandlerVivid`` class.\n+\n+.. code-block:: cpp\n+\n+   class VividCameraConfiguration : public CameraConfiguration\n+   {\n+   public:\n+       VividCameraConfiguration();\n+\n+       Status validate() override;\n+   };\n+\n+   VividCameraConfiguration::VividCameraConfiguration()\n+       : CameraConfiguration()\n+   {\n+   }\n+\n+To generate the default configuration, fill in the overridden\n+`generateConfiguration <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a7932e87735695500ce1f8c7ae449b65b>`_\n+method, which receives a pointer to the camera device. Notice the\n+``StreamRoles`` type, which libcamera uses to define the predefined ways\n+an application intends to use a camera (`You can read the full list in\n+the API\n+documentation <http://libcamera.org/api-html/stream_8h.html#file_a295d1f5e7828d95c0b0aabc0a8baac03>`_).\n+\n+In the ``generateConfiguration`` method, remove the ``return nullptr;``\n+and create pointers to a new instance of the configuration for the\n+pipeline, and the camera data.\n+\n+.. 
code-block:: cpp\n+\n+   CameraConfiguration *config = new VividCameraConfiguration();\n+   VividCameraData *data = cameraData(camera);\n+\n+A ``CameraConfiguration`` is specific to each pipeline, so you can only\n+create it from the pipeline handler code path. To allow applications to generate their\n+own configuration from scratch, add the following beneath the code above to\n+return the newly constructed empty configuration in case the application does not pass any\n+stream roles.\n+\n+.. code-block:: cpp\n+\n+   if (roles.empty())\n+       return config;\n+\n+A production pipeline handler should generate a ``StreamConfiguration`` for each of\n+the appropriate stream roles a camera device supports. For this simple\n+example (with only one stream), the pipeline handler always returns the\n+same configuration. How it does this is reproduced below, but we\n+recommend you take a look at the Raspberry Pi pipeline handler for a\n+realistic example.\n+\n+.. TODO: Add link\n+\n+To generate a ``StreamConfiguration``, you need a list of pixel formats and\n+frame sizes supported by the device. You can fetch a map of the\n+``V4L2PixelFormat`` and ``SizeRange`` values supported by the device, but the pipeline\n+handler needs to convert this to a ``libcamera::PixelFormat`` type to pass\n+to applications. You can do this using ``std::transform`` to convert the\n+formats and populate a new ``PixelFormat`` map as shown below. Add the following beneath the code\n+from above.\n+\n+.. 
code-block:: cpp\n+\n+   std::map<V4L2PixelFormat, std::vector<SizeRange>> v4l2Formats =\n+       data->video_->formats();\n+   std::map<PixelFormat, std::vector<SizeRange>> deviceFormats;\n+   std::transform(v4l2Formats.begin(), v4l2Formats.end(),\n+          std::inserter(deviceFormats, deviceFormats.begin()),\n+          [&](const decltype(v4l2Formats)::value_type &format) {\n+              return decltype(deviceFormats)::value_type{\n+                  format.first.toPixelFormat(),\n+                  format.second\n+              };\n+          });\n+\n+The\n+`StreamFormats <http://libcamera.org/api-html/classlibcamera_1_1StreamFormats.html>`_\n+class holds information about the pixel formats and frame sizes a stream\n+supports. The class groups size information by the pixel format that\n+can produce it.\n+\n+The\n+`StreamConfiguration <http://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html>`_\n+structure models all configurable information for a single video stream.\n+\n+The code below uses both classes to model the information an application\n+can use to configure a single stream.\n+\n+Add the following below the code from above.\n+\n+.. code-block:: cpp\n+\n+   StreamFormats formats(deviceFormats);\n+   StreamConfiguration cfg(formats);\n+\n+Create the default values for pixel formats, sizes, and buffer count\n+returned by the configuration.\n+\n+Add the following below the code from above.\n+\n+.. 
code-block:: cpp\n+\n+    cfg.pixelFormat = formats::BGR888;\n+    cfg.size = { 1280, 720 };\n+    cfg.bufferCount = 4;\n+\n+Add each ``StreamConfiguration`` you generate to the ``CameraConfiguration``, and validate the\n+``CameraConfiguration`` before returning it to the application.\n+\n+Add the following below the code from above.\n+\n+.. code-block:: cpp\n+\n+    config->addConfiguration(cfg);\n+\n+    config->validate();\n+\n+    return config;\n+\n+To validate any configuration, a pipeline handler must implement the\n+`validate <http://libcamera.org/api-html/classlibcamera_1_1CameraConfiguration.html#a29f8f263384c6149775b6011c7397093>`_\n+method, which takes the configuration passed to it, can make adjustments to make the configuration\n+valid, and returns the validation status. If changes are made, it marks the configuration as\n+``Adjusted``.\n+\n+Again, this example pipeline handler is simple; look at the Raspberry Pi pipeline handler for a\n+realistic example.\n+\n+.. TODO: Add link\n+.. TODO: Can we fit in a description of the checks that are actually done?\n+\n+Add the following code above ``PipelineHandlerVivid::configure``.\n+\n+.. 
code-block:: cpp\n+\n+   CameraConfiguration::Status VividCameraConfiguration::validate()\n+   {\n+       Status status = Valid;\n+\n+       if (config_.empty())\n+           return Invalid;\n+\n+       if (config_.size() > 1) {\n+           config_.resize(1);\n+           status = Adjusted;\n+       }\n+\n+       StreamConfiguration &cfg = config_[0];\n+\n+       const std::vector<libcamera::PixelFormat> formats = cfg.formats().pixelformats();\n+       if (std::find(formats.begin(), formats.end(), cfg.pixelFormat) == formats.end()) {\n+           cfg.pixelFormat = cfg.formats().pixelformats()[0];\n+           LOG(VIVID, Debug) << \"Adjusting format to \" << cfg.pixelFormat.toString();\n+           status = Adjusted;\n+       }\n+\n+       cfg.bufferCount = 4;\n+\n+       return status;\n+   }\n+\n+To handle pixel formats, add ``#include <libcamera/formats.h>`` to the include section, rebuild\n+the codebase, and use ``LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./build/src/cam/cam -c vivid -I`` to test that\n+the configuration is generated.\n+\n+You should see the following output.\n+\n+.. 
code-block:: shell\n+\n+    Using camera vivid\n+    0: 1280x720-BGR888\n+    * Pixelformat: NV21 (320x180)-(3840x2160)/(+0,+0)\n+    - 320x180\n+    - 640x360\n+    - 640x480\n+    - 1280x720\n+    - 1920x1080\n+    - 3840x2160\n+    * Pixelformat: NV12 (320x180)-(3840x2160)/(+0,+0)\n+    - 320x180\n+    - 640x360\n+    - 640x480\n+    - 1280x720\n+    - 1920x1080\n+    - 3840x2160\n+    * Pixelformat: BGRA8888 (320x180)-(3840x2160)/(+0,+0)\n+    - 320x180\n+    - 640x360\n+    - 640x480\n+    - 1280x720\n+    - 1920x1080\n+    - 3840x2160\n+    * Pixelformat: RGBA8888 (320x180)-(3840x2160)/(+0,+0)\n+    - 320x180\n+    - 640x360\n+    - 640x480\n+    - 1280x720\n+    - 1920x1080\n+    - 3840x2160\n+\n+Configuring a device\n+~~~~~~~~~~~~~~~~~~~~\n+\n+With the configuration generated and validated, a pipeline handler needs\n+a method that allows an application to apply a configuration\n+to a supported device.\n+\n+Replace the contents of the ``PipelineHandlerVivid::configure`` method\n+with the following code, which obtains the camera data and stream configuration. This pipeline\n+handler supports only a single stream, so it directly obtains the first\n+``StreamConfiguration`` from the camera configuration. A pipeline handler with\n+multiple streams should handle requests to configure each of them.\n+\n+.. code-block:: cpp\n+\n+   VividCameraData *data = cameraData(camera);\n+   int ret;\n+   StreamConfiguration &cfg = config->at(0);\n+\n+The Vivid capture device is a V4L2 video device, so create a\n+`V4L2DeviceFormat <http://libcamera.org/api-html/classlibcamera_1_1V4L2DeviceFormat.html>`_\n+with the fourcc and size attributes to apply directly to the capture device node. 
The\n+fourcc attribute is a `V4L2PixelFormat <http://libcamera.org/api-html/classlibcamera_1_1V4L2PixelFormat.html>`_ and differs from the ``libcamera::PixelFormat``.\n+Converting the format requires knowledge of the plane configuration for\n+multiplanar formats, so you must explicitly convert it using the\n+helpers provided by the ``V4L2VideoDevice``, in this case ``toV4L2PixelFormat``.\n+\n+Add the following code beneath the code from above.\n+\n+.. code-block:: cpp\n+\n+   V4L2DeviceFormat format = {};\n+   format.fourcc = data->video_->toV4L2PixelFormat(cfg.pixelFormat);\n+   format.size = cfg.size;\n+\n+Apply the format defined above to the capture device using the\n+`setFormat <http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#ad67b47dd9327ce5df43350b80c083cca>`_\n+helper method. Check whether the kernel driver\n+has adjusted the format, as this indicates that the pipeline handler failed\n+to handle the validation stages correctly, and the configure operation\n+must also fail.\n+\n+Add the following code beneath the code from above.\n+\n+.. code-block:: cpp\n+\n+       ret = data->video_->setFormat(&format);\n+       if (ret)\n+           return ret;\n+\n+       if (format.size != cfg.size ||\n+           format.fourcc != data->video_->toV4L2PixelFormat(cfg.pixelFormat))\n+           return -EINVAL;\n+\n+Finally, store and set stream-specific data reflecting the state of the\n+stream. Associate the configuration with the stream by using the\n+`setStream <http://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html#a74a0eb44dad1b00112c7c0443ae54a12>`_\n+method, and you can also set the values of individual stream\n+configuration members.\n+\n+.. NOTE: the cfg.setStream() call here associates the stream to the\n+   StreamConfiguration however that should quite likely be done as part of\n+   the validation process. TBD\n+\n+Add the following code beneath the code from above.\n+\n+.. 
code-block:: cpp\n+\n+       cfg.setStream(&data->stream_);\n+       cfg.stride = format.planes[0].bpl;\n+\n+       return 0;\n+\n+Buffer handling and stream control\n+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n+\n+An application using libcamera needs to reserve memory that camera data\n+can be written to for each individual stream requested, using\n+`FrameBuffer <http://libcamera.org/api-html/classlibcamera_1_1FrameBuffer.html>`_\n+instances to represent frames of data in memory.\n+\n+An application can create a `FrameBufferAllocator <http://libcamera.org/api-html/classlibcamera_1_1FrameBufferAllocator.html>`_ for a Camera to create\n+frame buffers suitable for use with that device.\n+\n+The ``FrameBufferAllocator`` uses the camera\n+device pipeline handler to export buffers from the underlying device\n+using the\n+`exportFrameBuffers <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a6312a69da7129c2ed41f9d9f790adf7c>`_\n+method, which all pipeline handlers must implement.\n+\n+The ``exportFrameBuffers()`` function uses a Camera and a Stream pointer to\n+identify the device to allocate buffers for, and\n+returns the allocated buffers by adding them to the ``buffers`` vector.\n+\n+Replace the contents of the ``exportFrameBuffers`` method with the following.\n+\n+.. 
code-block:: cpp\n+\n+    unsigned int count = stream->configuration().bufferCount;\n+    VividCameraData *data = cameraData(camera);\n+\n+    return data->video_->exportBuffers(count, buffers);\n+\n+This example method takes pointers to the camera, the stream, and a\n+vector of unique pointers to the frame buffers.\n+\n+The method checks the stream configuration to see how many buffers the application\n+requested. In the default configuration for this example this is 4, but\n+an application may have changed the value.\n+\n+It then uses ``exportBuffers()`` to create the buffers on the\n+underlying V4L2 video device, which allows a ``FrameBufferAllocator`` to\n+obtain buffers from the capture device specific to this stream, and\n+returns the number of buffers created.\n+\n+When an application obtains buffers through ``exportFrameBuffers``, they are\n+orphaned and left unassociated with the device, and the pipeline handler must reimport\n+them in its ``start`` method. This approach allows you\n+to use the same interface whether you are using internal or external\n+buffers for the stream.\n+\n+Replace the contents of the ``start`` method with the following.\n+\n+.. code-block:: cpp\n+\n+    VividCameraData *data = cameraData(camera);\n+    unsigned int count = data->stream_.configuration().bufferCount;\n+\n+    int ret = data->video_->importBuffers(count);\n+    if (ret < 0)\n+        return ret;\n+\n+    ret = data->video_->streamOn();\n+    if (ret < 0) {\n+        data->video_->releaseBuffers();\n+        return ret;\n+    }\n+\n+    return 0;\n+\n+The method imports buffers\n+(`importBuffers <http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a154f5283d16ebd5e15d63e212745cb64>`_)\n+to prepare the underlying V4L2 device based on the number requested, and\n+starts a stream using the\n+`streamOn <http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a588a5dc9d6f4c54c61136ac43ff9a8cc>`_\n+method. 
If either call fails, the error value is propagated to the caller,\n+and the `releaseBuffers <http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a191619c152f764e03bc461611f3fcd35>`_\n+method releases any allocated buffers to leave the device in a consistent state. If your pipeline handler\n+uses any image processing algorithms, you should also stop them.\n+\n+Add the following to the ``stop`` method, which stops the stream\n+(`streamOff <http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a61998710615bdf7aa25a046c8565ed66>`_)\n+and releases the buffers (``releaseBuffers``).\n+\n+.. code-block:: cpp\n+\n+    VividCameraData *data = cameraData(camera);\n+    data->video_->streamOff();\n+    data->video_->releaseBuffers();\n+\n+Event handling\n+~~~~~~~~~~~~~~\n+\n+The libcamera library uses signals and slots (`similar to\n+Qt <https://doc.qt.io/qt-5/signalsandslots.html>`_) to connect system\n+events with the callbacks that handle them.\n+\n+Pipeline handlers `connect <http://libcamera.org/api-html/classlibcamera_1_1Signal.html#aa04db72d5b3091ffbb4920565aeed382>`_ the ``bufferReady`` signal from the capture\n+devices they support to a member function slot to handle processing of available\n+frames. When a buffer is ready, the pipeline handler must propagate the\n+completion of that buffer, and when all buffers have successfully\n+completed for a request, also complete the Request itself.\n+\n+In this example, when a buffer completes, the event handler calls the buffer\n+completion handler of the pipeline handler, and because the device has a\n+single stream, immediately completes the request.\n+\n+Returning to the ``VividCameraData::init()`` method, add the\n+following line above the closing ``return 0;``. It connects the V4L2\n+device's ``bufferReady`` signal, which carries the completed frame\n+buffer, to the ``VividCameraData::bufferReady`` slot.\n+\n+.. 
code-block:: cpp\n+\n+   video_->bufferReady.connect(this, &VividCameraData::bufferReady);\n+\n+The libcamera library follows a familiar streaming request model for\n+data (frames of camera data). For each frame a camera captures, an\n+application must queue a request for it to the camera. With libcamera, a\n+``Request`` involves at least one ``Stream`` (one source from a ``Camera``)\n+with an associated ``FrameBuffer`` to fill with image data.\n+\n+Create the matching ``VividCameraData::bufferReady`` method, which takes\n+the completed frame buffer as a parameter, above the\n+``REGISTER_PIPELINE_HANDLER(PipelineHandlerVivid);`` line.\n+\n+The ``bufferReady`` method obtains the request from the buffer using the\n+``request`` method, and notifies the pipeline handler that the buffer\n+and request are completed. In this simple pipeline handler, each request\n+carries a single buffer, so it completes the buffer and the request immediately. You can find a more complex example of\n+event handling that supports multiple streams in the Raspberry Pi pipeline handler.\n+\n+.. TODO: Add link\n+\n+.. 
code-block:: cpp\n+\n+   void VividCameraData::bufferReady(FrameBuffer *buffer)\n+   {\n+       Request *request = buffer->request();\n+\n+       pipe_->completeBuffer(camera_, request, buffer);\n+       pipe_->completeRequest(camera_, request);\n+   }\n+\n+Queuing requests between applications and hardware\n+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n+\n+When an application sends a request to a pipeline handler, the pipeline\n+handler must parse the request and identify what actions it should take\n+to carry out the request on the hardware.\n+\n+This example pipeline handler identifies the buffer\n+(`findBuffer <http://libcamera.org/api-html/classlibcamera_1_1Request.html#ac66050aeb9b92c64218945158559c4d4>`_)\n+from the only supported stream and queues it to the capture device\n+(`queueBuffer <http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a594cd594686a8c1cf9ae8dba0b2a8a75>`_).\n+\n+Replace the contents of ``queueRequestDevice`` with the following.\n+\n+.. 
code-block:: cpp\n+\n+    VividCameraData *data = cameraData(camera);\n+    FrameBuffer *buffer = request->findBuffer(&data->stream_);\n+    if (!buffer) {\n+        LOG(VIVID, Error)\n+            << \"Attempt to queue request with invalid stream\";\n+\n+        return -ENOENT;\n+    }\n+\n+    int ret = data->video_->queueBuffer(buffer);\n+    if (ret < 0)\n+        return ret;\n+\n+    return 0;\n+\n+At this point you can test capture by rebuilding and running ``sudo LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./build/src/cam/cam -c vivid -I -C``, which should output frame capture data. Note that the environment variable must follow ``sudo`` to reach the command.\n+\n+Initializing frame controls\n+~~~~~~~~~~~~~~~~~~~~~~~~~~~\n+\n+Controls allow an application using libcamera to control capture\n+parameters for each stream on a per-frame basis, and a pipeline handler\n+can define the initial controls to suit the device.\n+\n+This section is particularly vivid-specific, as it sets the initial\n+values of controls to match `the controls that vivid\n+defines <https://www.kernel.org/doc/html/latest/admin-guide/media/vivid.html#controls>`_.\n+You won’t need any of the code below for your pipeline handler, but it’s\n+included as an example of how to implement what your pipeline handler\n+might need.\n+\n+Create a list of controls with the\n+`ControlList <http://libcamera.org/api-html/classlibcamera_1_1ControlList.html>`_\n+class, and set them using the\n+`set <http://libcamera.org/api-html/classlibcamera_1_1ControlList.html#a74a1a29abff5243e6e37ace8e24eb4ba>`_\n+method.\n+\n+This pipeline handler retains the global state of its controls and shows\n+an example of creating and setting a control list. In a production\n+pipeline handler, you typically set controls as part of a request.\n+\n+.. TODO: Link to example of the above\n+\n+Create defines beneath the current includes for convenience.\n+\n+.. 
code-block:: cpp\n+\n+   #define VIVID_CID_VIVID_BASE            (0x00f00000 | 0xf000)\n+   #define VIVID_CID_VIVID_CLASS           (0x00f00000 | 1)\n+   #define VIVID_CID_TEST_PATTERN          (VIVID_CID_VIVID_BASE  + 0)\n+   #define VIVID_CID_OSD_TEXT_MODE         (VIVID_CID_VIVID_BASE  + 1)\n+   #define VIVID_CID_HOR_MOVEMENT          (VIVID_CID_VIVID_BASE  + 2)\n+   #define VIVID_CID_VERT_MOVEMENT         (VIVID_CID_VIVID_BASE  + 3)\n+   #define VIVID_CID_SHOW_BORDER           (VIVID_CID_VIVID_BASE  + 4)\n+   #define VIVID_CID_SHOW_SQUARE           (VIVID_CID_VIVID_BASE  + 5)\n+   #define VIVID_CID_INSERT_SAV            (VIVID_CID_VIVID_BASE  + 6)\n+   #define VIVID_CID_INSERT_EAV            (VIVID_CID_VIVID_BASE  + 7)\n+   #define VIVID_CID_VBI_CAP_INTERLACED    (VIVID_CID_VIVID_BASE  + 8)\n+\n+In the ``configure`` method, add the following code above the\n+``cfg.setStream(&data->stream_);`` line.\n+\n+.. code-block:: cpp\n+\n+    ControlList controls(data->video_->controls());\n+    controls.set(VIVID_CID_TEST_PATTERN, 0);\n+    controls.set(VIVID_CID_OSD_TEXT_MODE, 0);\n+\n+    controls.set(V4L2_CID_BRIGHTNESS, 128);\n+    controls.set(V4L2_CID_CONTRAST, 128);\n+    controls.set(V4L2_CID_SATURATION, 128);\n+\n+    controls.set(VIVID_CID_HOR_MOVEMENT, 5);\n+\n+    ret = data->video_->setControls(&controls);\n+    if (ret) {\n+        LOG(VIVID, Error) << \"Failed to set controls: \" << ret;\n+        return ret < 0 ? ret : -EINVAL;\n+    }\n+\n+These controls configure VIVID to use a default test pattern, and\n+enable all on-screen display text, while configuring sensible brightness,\n+contrast and saturation values. 
Use the ``controls.set`` method to set individual controls.\n+\n+Enabling ``HOR_MOVEMENT`` adds movement to the video stream while\n+capturing, and all controls are set on the vivid capture device through\n+the ``setControls()`` call shown above.\n+\n+Processing controls\n+~~~~~~~~~~~~~~~~~~~\n+\n+When constructing the camera, a pipeline handler parses the available\n+controls on a capture device, maps supported controls to libcamera\n+controls, and initializes their default values.\n+\n+Create the ``processControls`` method above the ``queueRequestDevice`` method.\n+The method loops through the controls in a request and adjusts their\n+values to convert between libcamera control range definitions and their\n+corresponding values on the device.\n+\n+.. code-block:: cpp\n+\n+   int PipelineHandlerVivid::processControls(VividCameraData *data, Request *request)\n+   {\n+       ControlList controls(data->video_->controls());\n+\n+       for (auto it : request->controls()) {\n+           unsigned int id = it.first;\n+           unsigned int offset;\n+           uint32_t cid;\n+\n+           if (id == controls::Brightness) {\n+               cid = V4L2_CID_BRIGHTNESS;\n+               offset = 128;\n+           } else if (id == controls::Contrast) {\n+               cid = V4L2_CID_CONTRAST;\n+               offset = 0;\n+           } else if (id == controls::Saturation) {\n+               cid = V4L2_CID_SATURATION;\n+               offset = 0;\n+           } else {\n+               continue;\n+           }\n+\n+           int32_t value = lroundf(it.second.get<float>() * 128 + offset);\n+           controls.set(cid, utils::clamp(value, 0, 255));\n+       }\n+\n+       for (const auto &ctrl : controls)\n+           LOG(VIVID, Debug)\n+               << \"Setting control \" << utils::hex(ctrl.first)\n+               << \" to \" << ctrl.second.toString();\n+\n+       int ret = data->video_->setControls(&controls);\n+       if (ret) {\n+           LOG(VIVID, Error) << \"Failed to set controls: \" << ret;\n+           return ret < 0 ? ret : -EINVAL;\n+       }\n+\n+       return ret;\n+   }\n+\n+Declare the function prototype for the ``processControls`` method within\n+the private ``PipelineHandlerVivid`` class members, as it is only used internally as a\n+helper when processing requests.\n+\n+.. code-block:: cpp\n+\n+   private:\n+       int processControls(VividCameraData *data, Request *request);\n+\n+A pipeline handler is responsible for applying controls provided in a\n+Request to the relevant hardware devices. This could mean setting controls\n+directly on the capture device, or, where appropriate, on\n+``V4L2Subdevice`` instances. Each pipeline handler is responsible for\n+understanding the correct procedure for applying controls to the devices it supports.\n+\n+This example pipeline handler applies controls during the\n+`queueRequestDevice <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a106914cca210640c9da9ee1f0419e83c>`_\n+method for each request, and applies them to the capture device through\n+the capture node.\n+\n+In the ``queueRequestDevice`` method, replace the following.\n+\n+.. code-block:: cpp\n+\n+    int ret = data->video_->queueBuffer(buffer);\n+    if (ret < 0)\n+        return ret;\n+\n+With the following code.\n+\n+.. code-block:: cpp\n+\n+    int ret = processControls(data, request);\n+    if (ret < 0)\n+        return ret;\n+\n+    ret = data->video_->queueBuffer(buffer);\n+    if (ret < 0)\n+        return ret;\n+\n+In the ``init`` method, initialize the supported controls beneath the\n+``video_->bufferReady.connect(this, &VividCameraData::bufferReady);``\n+line by parsing the available controls on the V4L2 video device, and\n+creating corresponding libcamera controls, populated with their minimum,\n+maximum and default values.\n+\n+.. 
code-block:: cpp\n+\n+    const ControlInfoMap &controls = video_->controls();\n+    ControlInfoMap::Map ctrls;\n+\n+    for (const auto &ctrl : controls) {\n+        const ControlId *id;\n+        ControlInfo info;\n+\n+        switch (ctrl.first->id()) {\n+        case V4L2_CID_BRIGHTNESS:\n+            id = &controls::Brightness;\n+            info = ControlInfo{ { -1.0f }, { 1.0f }, { 0.0f } };\n+            break;\n+        case V4L2_CID_CONTRAST:\n+            id = &controls::Contrast;\n+            info = ControlInfo{ { 0.0f }, { 2.0f }, { 1.0f } };\n+            break;\n+        case V4L2_CID_SATURATION:\n+            id = &controls::Saturation;\n+            info = ControlInfo{ { 0.0f }, { 2.0f }, { 1.0f } };\n+            break;\n+        default:\n+            continue;\n+        }\n+\n+        ctrls.emplace(id, info);\n+    }\n+\n+    controlInfo_ = std::move(ctrls);\n+\n+At this point you need to add the following includes to the top of the file for controls handling.\n+\n+.. code-block:: cpp\n+\n+    #include <math.h>\n+    #include <libcamera/controls.h>\n+    #include <libcamera/control_ids.h>\n+\n+Testing a pipeline handler\n+~~~~~~~~~~~~~~~~~~~~~~~~~~\n+\n+Once you've written the pipeline handler, rebuild the codebase, and\n+use the ``LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./build/src/cam/cam -c vivid -I -C`` command\n+to test that the pipeline handler can detect the device and capture input.\n+\n+Running the command above outputs (a lot of) information about pixel formats, and then starts capturing frame data.\n\\ No newline at end of file\n","prefixes":["libcamera-devel"]}