[{"id":12084,"web_url":"https://patchwork.libcamera.org/comment/12084/","msgid":"<20200820154427.GS6593@pendragon.ideasonboard.com>","date":"2020-08-20T15:44:27","subject":"Re: [libcamera-devel] [PATCH v4 2/3] Documentation: Guides:\n\tPipeline Handler Writers Guide","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Kieran,\n\nThank you for the patch.\n\nOn Thu, Aug 20, 2020 at 02:47:50PM +0100, Kieran Bingham wrote:\n> From: Chris Chinchilla <chris@gregariousmammal.com>\n> \n> Introduce a pipeline-handler writers guide to provide a walk through of\n> the steps and concepts required to implement a new Pipeline Handler.\n> \n> Signed-off-by: Chris Chinchilla <chris@gregariousmammal.com>\n> [Reflow/Rework, update to mainline API]\n> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>\n> [Further reworks and review]\n> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>\n> ---\n>  Documentation/guides/pipeline-handler.rst | 1473 +++++++++++++++++++++\n>  Documentation/index.rst                   |    1 +\n>  Documentation/meson.build                 |    1 +\n>  3 files changed, 1475 insertions(+)\n>  create mode 100644 Documentation/guides/pipeline-handler.rst\n> \n> diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst\n> new file mode 100644\n> index 000000000000..eb795dcc35b6\n> --- /dev/null\n> +++ b/Documentation/guides/pipeline-handler.rst\n> @@ -0,0 +1,1473 @@\n> +.. SPDX-License-Identifier: CC-BY-SA-4.0\n> +\n> +Pipeline Handler Writers Guide\n> +==============================\n> +\n> +Pipeline handlers are the abstraction layer for device-specific hardware\n> +configuration. 
They access and control hardware through the V4L2 and Media\n> +Controller kernel interfaces, and implement an internal API to control the ISP\n> +and capture components of a pipeline directly.\n> +\n> +Prerequisite knowledge: system architecture\n> +-------------------------------------------\n> +\n> +A pipeline handler configures and manages the image acquisition and\n> +transformation pipeline realized by specialized system peripherals combined with\n> +an image source connected to the system through a data and control bus. Their\n> +presence, number and characteristics vary depending on the system design\n> +and the product integration of the target platform.\n> +\n> +System components can be classified in three macro-categories:\n> +\n> +.. TODO: Insert references to the open CSI-2 (and other) specification.\n> +\n> +- Input ports: Interfaces to external devices, usually image sensors,\n> +  which transfer data from the physical bus to locations accessible by other\n> +  system peripherals. An input port needs to be configured according to the\n> +  input image format and size and could optionally apply basic transformations\n> +  on the received images, most typically cropping/scaling and some format\n> +  conversions. The industry standard for the systems typically targeted by\n> +  libcamera is to have receivers compliant with the MIPI CSI-2 specifications,\n> +  implemented on a compatible physical layer such as MIPI D-PHY or MIPI C-PHY.\n> +  Other designs are possible but less common, such as LVDS or the legacy BT.601\n> +  and BT.656 parallel protocols.\n> +\n> +- Image Signal Processor (ISP): A specialized media processor which applies\n> +  digital transformations on image streams. ISPs can be integrated as part of\n> +  the SoC as a memory interfaced system peripheral or packaged as stand-alone\n> +  chips connected to the application processor through a bus. 
Most hardware used\n> +by libcamera makes use of in-system ISP designs but pipelines can equally\n> +support external ISP chips or be instrumented to use other system resources\n> +such as a GPU or an FPGA IP block. ISPs expose a software programming\n> +interface that allows the configuration of multiple processing blocks which\n> +form an \"Image Transformation Pipeline\". An ISP usually produces 'processed'\n> +image streams along with the metadata describing the processing steps which\n> +have been applied to generate the output frames.\n> +\n> +- Camera Sensor: A digital component that integrates an image sensor with control\n> +  electronics and usually a lens. It interfaces to the SoC image receiver ports\n> +  and is programmed to produce images in a format and size suitable for the\n> +  current system configuration. Complex camera modules can integrate on-board\n> +  ISP or DSP chips and process images before delivering them to the system. Most\n> +  systems with a dedicated ISP will usually integrate camera sensors\n> +  which produce images in Raw Bayer format and defer processing to the ISP.\n> +\n> +It is the responsibility of the pipeline handler to interface with these (and\n> +possibly other) components of the system and implement the following\n> +functionalities:\n> +\n> +- Detect and register camera devices available in the system with an associated\n> +  set of image streams.\n> +\n> +- Configure the image acquisition and processing pipeline by assigning the\n> +  system resources (memory, shared components, etc.) 
to satisfy the\n> +configuration requested by the application.\n> +\n> +- Start and stop the image acquisition and processing sessions.\n> +\n> +- Apply configuration settings requested by applications and computed by image\n> +  processing algorithms integrated in libcamera to the hardware devices.\n> +\n> +- Notify applications of the availability of new images and deliver them to the\n> +  correct locations.\n> +\n> +Prerequisite knowledge: libcamera architecture\n> +----------------------------------------------\n> +\n> +A pipeline handler makes use of the following libcamera classes to realize the\n> +functionalities described above. Below is a brief overview of each of those:\n> +\n> +.. TODO: (All) Convert to sphinx refs\n> +.. TODO: (MediaDevice) Reference to the Media Device API (possibly with versioning requirements)\n> +.. TODO: (IPAInterface) refer to the IPA guide\n> +\n> +-  `MediaDevice <http://libcamera.org/api-html/classlibcamera_1_1MediaDevice.html>`_:\n> +   Instances of this class are associated with a kernel media controller\n> +   device and its connected objects.\n> +\n> +-  `DeviceEnumerator <http://libcamera.org/api-html/classlibcamera_1_1DeviceEnumerator.html>`_:\n> +   Enumerates all media devices attached to the system and the media entities\n> +   registered with it, by creating instances of the ``MediaDevice`` class and\n> +   storing them.\n> +\n> +-  `DeviceMatch <http://libcamera.org/api-html/classlibcamera_1_1DeviceMatch.html>`_:\n> +   Describes a media device search pattern using entity names, or other\n> +   properties.\n> +\n> +-  `V4L2VideoDevice <http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html>`_:\n> +   Models an instance of a V4L2 video device constructed with the path to a V4L2\n> +   video device node.\n> +\n> +-  `V4L2SubDevice <http://libcamera.org/api-html/classlibcamera_1_1V4L2Subdevice.html>`_:\n> +   Provides an API to the sub-devices that model the hardware components of a\n> +   V4L2 
device.\n> +\n> +-  `CameraSensor <http://libcamera.org/api-html/classlibcamera_1_1CameraSensor.html>`_:\n> +   Abstracts camera sensor handling by hiding the details of the V4L2 subdevice\n> +   kernel API and caching sensor information.\n> +\n> +-  `CameraData <http://libcamera.org/api-html/classlibcamera_1_1CameraData.html>`_:\n> +   Represents device-specific data a pipeline handler associates to each Camera\n> +   instance.\n> +\n> +-  `StreamConfiguration <http://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html>`_:\n> +   Models the current configuration of an image stream produced by the camera by\n> +   reporting its format and sizes.\n> +\n> +-  `CameraConfiguration <http://libcamera.org/api-html/classlibcamera_1_1CameraConfiguration.html>`_:\n> +   Represents the current configuration of a camera, which includes a list of\n> +   stream configurations for each active stream in a capture session. When\n> +   validated, it is applied to the camera.\n> +\n> +-  `IPAInterface <http://libcamera.org/api-html/classlibcamera_1_1IPAInterface.html>`_:\n> +   The interface to the Image Processing Algorithm (IPA) module which performs\n> +   the computation of the image processing pipeline tuning parameters.\n> +\n> +-  `ControlList <http://libcamera.org/api-html/classlibcamera_1_1ControlList.html>`_:\n> +   A list of control items, indexed by Control<> instances or by numerical index.\n> +   It contains values used by applications and the IPA to change parameters of\n> +   image streams, to return to applications and share with the IPA the metadata\n> +   associated with the captured images, and to advertise the immutable camera\n> +   characteristics enumerated at system initialization time.\n> +\n> +Creating a PipelineHandler\n> +--------------------------\n> +\n> +This guide walks through the steps to create a simple pipeline handler\n> +called “Vivid” that supports the `V4L2 Virtual Video Test Driver`_ (vivid).\n> +\n> +To use the vivid test 
driver, you first need to check that the vivid kernel\n> +module is loaded, for example with the ``modprobe vivid`` command.\n> +\n> +.. _V4L2 Virtual Video Test Driver: https://www.kernel.org/doc/html/latest/admin-guide/media/vivid.html\n> +\n> +Create the skeleton file structure\n> +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +To add a new pipeline handler, create a directory to hold the pipeline code in\n> +the *src/libcamera/pipeline/* directory that matches the name of the pipeline\n> +(in this case *vivid*). Inside the new directory add a *meson.build* file that\n> +integrates with the libcamera build system, and a *vivid.cpp* file that matches\n> +the name of the pipeline.\n> +\n> +In the *meson.build* file, add the *vivid.cpp* file as a build source for\n> +libcamera by adding it to the global meson ``libcamera_sources`` variable:\n> +\n> +.. code-block:: none\n> +\n> +   # SPDX-License-Identifier: CC0-1.0\n> +\n> +   libcamera_sources += files([\n> +       'vivid.cpp',\n> +   ])\n> +\n> +Users of libcamera can selectively enable pipelines while building libcamera\n> +using the ``pipelines`` option.\n> +\n> +For example, to enable only the IPU3, UVC, and VIVID pipelines, specify them as\n> +a comma separated list with ``-Dpipelines`` when generating a build directory:\n> +\n> +.. code-block:: shell\n> +\n> +    meson build -Dpipelines=ipu3,uvcvideo,vivid\n> +\n> +Read the `Meson build configuration`_ documentation for more information on\n> +configuring a build directory.\n> +\n> +.. _Meson build configuration: https://mesonbuild.com/Configuring-a-build-directory.html\n> +\n> +To add the new pipeline handler to this list of options, add its directory name\n> +to the libcamera build options in the top level *meson_options.txt* file.\n> +\n> +.. 
code-block:: none\n> +\n> +   option('pipelines',\n> +           type : 'array',\n> +           choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc', 'vivid'],\n> +           description : 'Select which pipeline handlers to include')\n> +\n> +\n> +In *vivid.cpp* add the pipeline handler to the ``libcamera`` namespace, defining\n> +a `PipelineHandler`_ derived class named PipelineHandlerVivid, and add stub\n> +methods for the overridden class members.\n> +\n> +.. _PipelineHandler: http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html\n> +\n> +.. code-block:: cpp\n> +\n> +   namespace libcamera {\n> +\n> +   class PipelineHandlerVivid : public PipelineHandler\n> +   {\n> +   public:\n> +          PipelineHandlerVivid(CameraManager *manager);\n> +\n> +          CameraConfiguration *generateConfiguration(Camera *camera,\n> +                                                     const StreamRoles &roles) override;\n> +          int configure(Camera *camera, CameraConfiguration *config) override;\n> +\n> +          int exportFrameBuffers(Camera *camera, Stream *stream,\n> +                                 std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;\n> +\n> +          int start(Camera *camera) override;\n> +          void stop(Camera *camera) override;\n> +\n> +          int queueRequestDevice(Camera *camera, Request *request) override;\n> +\n> +          bool match(DeviceEnumerator *enumerator) override;\n> +   };\n> +\n> +   PipelineHandlerVivid::PipelineHandlerVivid(CameraManager *manager)\n> +          : PipelineHandler(manager)\n> +   {\n> +   }\n> +\n> +   CameraConfiguration *PipelineHandlerVivid::generateConfiguration(Camera *camera,\n> +                                                                    const StreamRoles &roles)\n> +   {\n> +          return nullptr;\n> +   }\n> +\n> +   int PipelineHandlerVivid::configure(Camera *camera, CameraConfiguration *config)\n> +   {\n> +          return -1;\n> +   }\n> +\n> +   int PipelineHandlerVivid::exportFrameBuffers(Camera 
*camera, Stream *stream,\n> +                                                std::vector<std::unique_ptr<FrameBuffer>> *buffers)\n> +   {\n> +          return -1;\n> +   }\n> +\n> +   int PipelineHandlerVivid::start(Camera *camera)\n> +   {\n> +          return -1;\n> +   }\n> +\n> +   void PipelineHandlerVivid::stop(Camera *camera)\n> +   {\n> +   }\n> +\n> +   int PipelineHandlerVivid::queueRequestDevice(Camera *camera, Request *request)\n> +   {\n> +          return -1;\n> +   }\n> +\n> +   bool PipelineHandlerVivid::match(DeviceEnumerator *enumerator)\n> +   {\n> +          return false;\n> +   }\n> +\n> +   REGISTER_PIPELINE_HANDLER(PipelineHandlerVivid);\n> +\n> +   } /* namespace libcamera */\n> +\n> +Note that you must register the ``PipelineHandler`` subclass with the pipeline\n> +handler factory using the `REGISTER_PIPELINE_HANDLER`_ macro, which\n> +creates a global symbol to reference the class and makes it\n> +available to try and match devices.\n> +\n> +.. _REGISTER_PIPELINE_HANDLER: http://libcamera.org/api-html/pipeline__handler_8h.html\n> +\n> +For debugging and testing a pipeline handler during development, you can define\n> +a log message category for the pipeline handler. The ``LOG_DEFINE_CATEGORY``\n> +macro and ``LIBCAMERA_LOG_LEVELS`` environment variable help you use the inbuilt\n> +libcamera `logging infrastructure`_ that allows for the inspection of internal\n> +operations in a user-configurable way.\n> +\n> +.. _logging infrastructure: http://libcamera.org/api-html/log_8h.html\n> +\n> +Add the following before the ``PipelineHandlerVivid`` class declaration:\n> +\n> +.. code-block:: cpp\n> +\n> +   LOG_DEFINE_CATEGORY(VIVID)\n> +\n> +At this point you need the following includes for logging and pipeline handler\n> +features:\n> +\n> +.. code-block:: cpp\n> +\n> +   #include \"libcamera/internal/log.h\"\n> +   #include \"libcamera/internal/pipeline_handler.h\"\n> +\n> +Run the following commands:\n> +\n> +.. 
code-block:: shell\n> +\n> +   meson build\n> +   ninja -C build\n> +\n> +These commands build the libcamera code base. Confirm that the build system\n> +found the new pipeline handler by running:\n> +\n> +.. code-block:: shell\n> +\n> +   LIBCAMERA_LOG_LEVELS=Pipeline:0 ./build/src/cam/cam -l\n> +\n> +You should see output like the below:\n> +\n> +.. code-block:: shell\n> +\n> +    DEBUG Pipeline pipeline_handler.cpp:680 Registered pipeline handler \"PipelineHandlerVivid\"\n> +\n> +Matching devices\n> +~~~~~~~~~~~~~~~~\n> +\n> +Each pipeline handler registered in libcamera gets tested against the current\n> +system configuration, by matching a ``DeviceMatch`` with the system\n> +``DeviceEnumerator``. A successful match makes sure all the requested components\n> +have been registered in the system and allows the pipeline handler to be\n> +initialized.\n> +\n> +The main entry point of a pipeline handler is the `match()`_ class member\n> +function. When the ``CameraManager`` is started (using the `start()`_ method),\n> +all the registered pipeline handlers are iterated and their ``match`` function\n> +called with an enumerator of all devices it found on the system.\n> +\n> +The match method should identify whether there are suitable devices available in the\n> +``DeviceEnumerator`` which the pipeline supports, returning ``true`` if it\n> +matches a device, and ``false`` if it does not. To do this, construct a\n> +`DeviceMatch`_ class with the name of the ``MediaController`` device to match.\n> +You can refine the search further by adding specific media entities to the\n> +search using the ``.add()`` method on the DeviceMatch.\n> +\n> +.. _match(): https://www.libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a7cd5b652a2414b543ec20ba9dabf61b6\n> +.. _start(): http://libcamera.org/api-html/classlibcamera_1_1CameraManager.html#a49e322880a2a26013bb0076788b298c5\n> +.. 
_DeviceMatch: http://libcamera.org/api-html/classlibcamera_1_1DeviceMatch.html\n> +\n> +This example uses search patterns that match vivid, but when developing a new\n> +pipeline handler, you should change this value to suit your device identifier.\n> +\n> +Replace the contents of the ``PipelineHandlerVivid::match`` method with the\n> +following:\n> +\n> +.. code-block:: cpp\n> +\n> +   DeviceMatch dm(\"vivid\");\n> +   dm.add(\"vivid-000-vid-cap\");\n> +   return false; // Prevent infinite loops for now\n> +\n> +With the device matching criteria defined, attempt to acquire exclusive access\n> +to the matching media controller device with the `acquireMediaDevice`_ method.\n> +If the method attempts to acquire a device it has already matched, it returns\n> +``nullptr``.\n> +\n> +.. _acquireMediaDevice: http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a77e424fe704e7b26094164b9189e0f84\n> +\n> +Add the following below ``dm.add(\"vivid-000-vid-cap\");``:\n> +\n> +.. code-block:: cpp\n> +\n> +   MediaDevice *media = acquireMediaDevice(enumerator, dm);\n> +   if (!media)\n> +           return false;\n> +\n> +The pipeline handler now needs an additional include. Add the following to the\n> +existing include block for device enumeration functionality:\n> +\n> +.. code-block:: cpp\n> +\n> +   #include \"libcamera/internal/device_enumerator.h\"\n> +\n> +At this stage, you should test that the pipeline handler can successfully match\n> +the devices, but you have not yet added any code to create a Camera which libcamera\n> +reports to applications.\n> +\n> +As a temporary validation step, add a debug print with\n> +\n> +.. 
code-block:: cpp\n> +\n> +   LOG(VIVID, Debug) << \"Vivid Device Identified\";\n> +\n> +before the final closing return statement in the ``PipelineHandlerVivid::match``\n> +method for when the pipeline handler successfully matches the\n> +``MediaDevice`` and ``MediaEntity`` names.\n> +\n> +Test that the pipeline handler matches and finds a device by rebuilding and\n> +running:\n> +\n> +.. code-block:: shell\n> +\n> +   ninja -C build\n> +   LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./build/src/cam/cam -l\n> +\n> +You should see output like the below:\n> +\n> +.. code-block:: shell\n> +\n> +    DEBUG VIVID vivid.cpp:74 Vivid Device Identified\n> +\n> +Creating camera devices\n> +~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +If the pipeline handler successfully matches with the system it is running on,\n> +it can proceed to initialization by creating all the required instances of the\n> +``V4L2VideoDevice``, ``V4L2Subdevice`` and ``CameraSensor`` hardware abstraction\n> +classes. If the pipeline handler supports an ISP, it can then also initialise\n> +the IPA module before proceeding to the creation of the Camera devices.\n> +\n> +An image ``Stream`` represents a sequence of images and data of known size and\n> +format, stored in application-accessible memory locations. Typical examples of\n> +streams are the ISP processed outputs and the raw images captured at the\n> +receiver's port output.\n> +\n> +The Pipeline Handler is responsible for defining the set of Streams associated\n> +with the Camera.\n> +\n> +Each Camera has instance-specific data represented using the `CameraData`_\n> +class, which can be extended for the specific needs of the pipeline handler.\n> +\n> +.. 
_CameraData: http://libcamera.org/api-html/classlibcamera_1_1CameraData.html\n> +\n> +\n> +To support the Camera we will later register, we need to create a CameraData\n> +class that we can implement for our specific Pipeline Handler.\n> +\n> +Define a new ``VividCameraData`` class derived from ``CameraData`` by adding\n> +the following code before the PipelineHandlerVivid class definition where it\n> +will be used:\n> +\n> +.. code-block:: cpp\n> +\n> +   class VividCameraData : public CameraData\n> +   {\n> +   public:\n> +          VividCameraData(PipelineHandler *pipe, MediaDevice *media)\n> +                : CameraData(pipe), media_(media), video_(nullptr)\n> +          {\n> +          }\n> +\n> +          ~VividCameraData()\n> +          {\n> +                delete video_;\n> +          }\n> +\n> +          int init();\n> +          void bufferReady(FrameBuffer *buffer);\n> +\n> +          MediaDevice *media_;\n> +          V4L2VideoDevice *video_;\n> +          Stream stream_;\n> +   };\n> +\n> +This example pipeline handler handles a single video device and supports a\n> +single stream, represented by the ``VividCameraData`` class members. More\n> +complex pipeline handlers might register cameras composed of several video\n> +devices and sub-devices, or multiple streams per camera that represent the\n> +several components of the image capture pipeline. You should represent all these\n> +components in the ``CameraData`` derived class when developing a custom\n> +PipelineHandler.\n> +\n> +In our example VividCameraData we implement an ``init()`` function to prepare\n> +the object from our PipelineHandler. However, the CameraData class does not\n> +specify an interface for initialisation, and PipelineHandlers can manage this\n> +based on their own needs. 
Derived CameraData classes are used only by their\n> +respective pipeline handlers.\n> +\n> +The CameraData class stores the context required for each camera instance and\n> +is usually responsible for opening all Devices used in the capture pipeline.\n> +\n> +We can now implement the ``init`` method for our example pipeline handler to\n> +create a new V4L2 video device from the media entity, which we can obtain using\n> +the `MediaDevice::getEntityByName`_ method. As our example\n> +is based upon the simplistic Vivid test device, we only need to open the single\n> +capture device, 'vivid-000-vid-cap'.\n> +\n> +.. _MediaDevice::getEntityByName: http://libcamera.org/api-html/classlibcamera_1_1MediaDevice.html#ad5d9279329ef4987ceece2694b33e230\n> +\n> +.. code-block:: cpp\n> +\n> +   int VividCameraData::init()\n> +   {\n> +          video_ = new V4L2VideoDevice(media_->getEntityByName(\"vivid-000-vid-cap\"));\n> +          if (video_->open())\n> +                return -ENODEV;\n> +\n> +          return 0;\n> +   }\n> +\n> +The CameraData should be created and initialised before we move on to register a\n> +new Camera device, so we need to construct and initialise our\n> +VividCameraData after we have identified our device within\n> +PipelineHandlerVivid::match(). The VividCameraData is wrapped by a\n> +std::unique_ptr to help manage the lifetime of our CameraData instance.\n> +\n> +If the camera data initialization fails, return ``false`` to indicate the\n> +failure to the ``match()`` method and prevent retrying of the pipeline handler.\n> +\n> +.. code-block:: cpp\n> +\n> +   std::unique_ptr<VividCameraData> data = std::make_unique<VividCameraData>(this, media);\n> +\n> +   if (data->init())\n> +           return false;\n> +\n> +\n> +Once the camera data has been initialized, the Camera device instances and the\n> +associated streams have to be registered. 
Create a set of streams for the\n> +camera, which for this device is just one stream. You create a camera using the static\n> +`Camera::create`_ method, passing the pipeline handler, the id of the camera,\n> +and the streams available. Then register the camera and its data with the\n> +pipeline handler and camera manager using `registerCamera`_.\n> +\n> +Finally, with a successful construction, we return ``true``, indicating that the\n> +PipelineHandler successfully matched and constructed a device.\n> +\n> +.. _Camera::create: http://libcamera.org/api-html/classlibcamera_1_1Camera.html#a453740e0d2a2f495048ae307a85a2574\n> +.. _registerCamera: http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#adf02a7f1bbd87aca73c0e8d8e0e6c98b\n> +\n> +.. code-block:: cpp\n> +\n> +   std::set<Stream *> streams{ &data->stream_ };\n> +   std::shared_ptr<Camera> camera = Camera::create(this, data->video_->deviceName(), streams);\n> +   registerCamera(std::move(camera), std::move(data));\n> +\n> +   return true;\n> +\n> +\n> +Our match function should now look like the following:\n> +\n> +.. code-block:: cpp\n> +\n> +   bool PipelineHandlerVivid::match(DeviceEnumerator *enumerator)\n> +   {\n> +   \tDeviceMatch dm(\"vivid\");\n> +   \tdm.add(\"vivid-000-vid-cap\");\n> +\n> +   \tMediaDevice *media = acquireMediaDevice(enumerator, dm);\n> +   \tif (!media)\n> +   \t\treturn false;\n> +\n> +   \tstd::unique_ptr<VividCameraData> data = std::make_unique<VividCameraData>(this, media);\n> +\n> +   \t/* Locate and open the capture video node. */\n> +   \tif (data->init())\n> +   \t\treturn false;\n> +\n> +   \t/* Create and register the camera. 
*/\n> +   \tstd::set<Stream *> streams{ &data->stream_ };\n> +   \tstd::shared_ptr<Camera> camera = Camera::create(this, data->video_->deviceName(), streams);\n> +   \tregisterCamera(std::move(camera), std::move(data));\n> +\n> +   \treturn true;\n> +   }\n> +\n> +We will need to use our custom CameraData class frequently throughout the\n> +pipeline handler, so we add a private convenience helper to our pipeline handler\n> +to obtain and cast the custom CameraData instance from a Camera instance.\n> +\n> +.. code-block:: cpp\n> +\n> +   private:\n> +       VividCameraData *cameraData(const Camera *camera)\n> +       {\n> +               return static_cast<VividCameraData *>(\n> +                        PipelineHandler::cameraData(camera));\n> +       }\n> +\n> +At this point, you need to add the following new includes to provide the Camera\n> +interface and device interaction interfaces:\n> +\n> +.. code-block:: cpp\n> +\n> +   #include <libcamera/camera.h>\n> +   #include \"libcamera/internal/media_device.h\"\n> +   #include \"libcamera/internal/v4l2_videodevice.h\"\n> +\n> +Registering controls and properties\n> +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +The libcamera `controls framework`_ allows an application to configure the\n> +stream capture parameters on a per-frame basis and is also used to advertise\n> +immutable properties of the ``Camera`` device.\n> +\n> +The libcamera controls and properties are defined in YAML form, which is\n> +processed to automatically generate documentation and interfaces. Controls are\n> +defined by the src/libcamera/`control_ids.yaml`_ file and camera properties\n> +are defined by src/libcamera/`properties_ids.yaml`_.\n> +\n> +.. _controls framework: http://libcamera.org/api-html/controls_8h.html\n> +.. _control_ids.yaml: http://libcamera.org/api-html/control__ids_8h.html\n> +.. 
_properties_ids.yaml: http://libcamera.org/api-html/property__ids_8h.html\n> +\n> +Pipeline handlers can optionally register the list of controls an application\n> +can set as well as a list of immutable camera properties. As both are\n> +camera-specific values, they are represented in the ``CameraData`` base class,\n> +which provides two members for this purpose: the `CameraData::controlInfo_`_ and\n> +the `CameraData::properties_`_ fields.\n> +\n> +.. _CameraData::controlInfo_: http://libcamera.org/api-html/classlibcamera_1_1CameraData.html#ab9fecd05c655df6084a2233872144a52\n> +.. _CameraData::properties_: http://libcamera.org/api-html/classlibcamera_1_1CameraData.html#a84002c29f45bd35566c172bb65e7ec0b\n> +\n> +The ``controlInfo_`` field represents a map of ``ControlId`` instances\n> +associated with the limits of valid values supported for the control. More\n> +information can be found in the `ControlInfoMap`_ class documentation.\n> +\n> +.. _ControlInfoMap: http://libcamera.org/api-html/classlibcamera_1_1ControlInfoMap.html\n> +\n> +Pipeline handlers register controls to expose the tunable device and IPA\n> +parameters to applications. Our example pipeline handler only exposes trivial\n> +controls of the video device, by registering a ``ControlId`` instance with\n> +associated values for each supported V4L2 control, but demonstrates the mapping\n> +of V4L2 controls to libcamera ``ControlId`` values.\n> +\n> +Complete the initialization of the ``VividCameraData`` class by adding the\n> +following code to the ``VividCameraData::init()`` method to initialise the\n> +controls. For more complex control configurations, this could of course be\n> +broken out to a separate function, but for now we just initialise the small set\n> +inline in our CameraData init:\n> +\n> +.. code-block:: cpp\n> +\n> +   /* Initialise the supported controls. 
*/\n> +   const ControlInfoMap &controls = video_->controls();\n> +   ControlInfoMap::Map ctrls;\n> +\n> +   for (const auto &ctrl : controls) {\n> +           const ControlId *id;\n> +           ControlInfo info;\n> +\n> +           switch (ctrl.first->id()) {\n> +           case V4L2_CID_BRIGHTNESS:\n> +                   id = &controls::Brightness;\n> +                   info = ControlInfo{ { -1.0f }, { 1.0f }, { 0.0f } };\n> +                   break;\n> +           case V4L2_CID_CONTRAST:\n> +                   id = &controls::Contrast;\n> +                   info = ControlInfo{ { 0.0f }, { 2.0f }, { 1.0f } };\n> +                   break;\n> +           case V4L2_CID_SATURATION:\n> +                   id = &controls::Saturation;\n> +                   info = ControlInfo{ { 0.0f }, { 2.0f }, { 1.0f } };\n> +                   break;\n> +           default:\n> +                   continue;\n> +           }\n> +\n> +           ctrls.emplace(id, info);\n> +   }\n> +\n> +   controlInfo_ = std::move(ctrls);\n> +\n> +The ``properties_`` field is a list of ``ControlId`` instances\n> +associated with immutable values, which represent static characteristics that can\n> +be used by applications to identify camera devices in the system. Properties can be\n> +registered by inspecting the values of V4L2 controls from the video devices and\n> +camera sensor (for example to retrieve the position and orientation of a camera)\n> +or by expressing other immutable characteristics. The example pipeline handler does\n> +not register any properties, but examples are available in the libcamera code\n> +base.\n> +\n> +.. TODO: Add a property example to the pipeline handler. At least the model.\n> +\n> +At this point you need to add the following includes to the top of the file for\n> +handling controls:\n> +\n> +.. 
code-block:: cpp\n> +\n> +   #include <libcamera/controls.h>\n> +   #include <libcamera/control_ids.h>\n> +\n> +Generating a default configuration\n> +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +Once ``Camera`` devices and the associated ``Streams`` have been registered, an\n> +application can proceed to acquire and configure the camera to prepare it for a\n> +frame capture session.\n> +\n> +Applications specify the requested configuration by assigning a\n> +``StreamConfiguration`` instance to each stream they want to enable, which\n> +expresses the desired image size and pixel format. The stream configurations are\n> +grouped in a ``CameraConfiguration``, which is inspected by the pipeline handler\n> +and validated to adjust it to a supported configuration. This may involve\n> +adjusting the formats, image sizes or alignments, for example to match the\n> +capabilities of the device.\n> +\n> +Applications may choose to repeat validation stages, adjusting parameters until a\n> +set of validated StreamConfigurations is returned that is acceptable for the\n> +application's needs. When the pipeline handler receives a valid camera\n> +configuration, it can use the image stream configurations to apply settings to\n> +the hardware devices.\n> +\n> +This configuration and validation process is managed with another\n> +pipeline-specific class derived from a common base implementation and interface.\n> +\n> +To support validation in our example pipeline handler, create a new class called\n> +``VividCameraConfiguration`` derived from the base `CameraConfiguration`_ class\n> +which we can implement and use within our ``PipelineHandlerVivid`` class.\n> +\n> +.. _CameraConfiguration: http://libcamera.org/api-html/classlibcamera_1_1CameraConfiguration.html\n> +\n> +The derived ``CameraConfiguration`` class must override the base class\n> +``validate()`` function, where the stream configuration inspection and\n> +adjustment happens.\n> +\n> +.. 
code-block:: cpp\n> +\n> +    class VividCameraConfiguration : public CameraConfiguration\n> +    {\n> +    public:\n> +           VividCameraConfiguration();\n> +\n> +           Status validate() override;\n> +    };\n> +\n> +    VividCameraConfiguration::VividCameraConfiguration()\n> +           : CameraConfiguration()\n> +    {\n> +    }\n> +\n> +Applications generate a ``CameraConfiguration`` instance by calling the\n> +`Camera::generateConfiguration()`_ function, which calls into the pipeline\n> +implementation of the overridden `PipelineHandler::generateConfiguration()`_\n> +method.\n> +\n> +.. _Camera::generateConfiguration(): http://libcamera.org/api-html/classlibcamera_1_1Camera.html#a25c80eb7fc9b1cf32692ce0c7f09991d\n> +.. _PipelineHandler::generateConfiguration(): http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a7932e87735695500ce1f8c7ae449b65b\n> +\n> +Configurations are generated by receiving a list of ``StreamRoles`` instances,\n> +which libcamera uses as predefined ways an application intends to use a camera\n> +(You can read the full list in the `StreamRole API`_ documentation). These are\n> +optional hints on how an application intends to use a stream, and a pipeline\n> +handler should return an ideal configuration for each role that is requested.\n> +\n> +.. _StreamRole API: http://libcamera.org/api-html/stream_8h.html#file_a295d1f5e7828d95c0b0aabc0a8baac03\n> +\n> +In the pipeline handler ``generateConfiguration`` implementation, remove the\n> +``return nullptr;``, create a new instance of the ``CameraConfiguration``\n> +derived class, and assign it to a base class pointer.\n> +\n> +.. code-block:: cpp\n> +\n> +   VividCameraData *data = cameraData(camera);\n> +   CameraConfiguration *config = new VividCameraConfiguration();\n> +\n> +A ``CameraConfiguration`` is specific to each pipeline, so you can only create\n> +it from the pipeline handler code path. 
Applications can also generate an empty\n> +configuration and add desired stream configurations manually. Pipelines must\n> +allow for this by returning an empty configuration if no roles are requested.\n> +\n> +To support this in our PipelineHandlerVivid, next add the following check in\n> +``generateConfiguration`` after the ``CameraConfiguration`` has been\n> +constructed:\n> +\n> +.. code-block:: cpp\n> +\n> +   if (roles.empty())\n> +           return config;\n> +\n> +A production pipeline handler should generate the ``StreamConfiguration`` for\n> +all the appropriate stream roles a camera device supports. For this simpler\n> +example (with only one stream), the pipeline handler always returns the same\n> +configuration, inferred from the underlying V4L2VideoDevice.\n> +\n> +How it does this is shown below, but examination of the more full-featured\n> +pipelines for IPU3, RKISP1 and RaspberryPi is recommended to explore more\n> +complex examples.\n> +\n> +To generate a ``StreamConfiguration``, you need a list of pixel formats and\n> +frame sizes which are the supported outputs of the stream. You can fetch a map\n> +of the ``V4L2PixelFormat`` and ``SizeRange`` supported by the underlying output\n> +device, but the pipeline handler needs to convert this to a\n> +``libcamera::PixelFormat`` type to pass to applications. We do this here using\n> +``std::transform`` to convert the formats and populate a new ``PixelFormat`` map\n> +as shown below.\n> +\n> +Continue adding the following code example to our ``generateConfiguration``\n> +implementation.\n> +\n> +.. 
code-block:: cpp\n> +\n> +   std::map<V4L2PixelFormat, std::vector<SizeRange>> v4l2Formats =\n> +   data->video_->formats();\n> +   std::map<PixelFormat, std::vector<SizeRange>> deviceFormats;\n> +   std::transform(v4l2Formats.begin(), v4l2Formats.end(),\n> +          std::inserter(deviceFormats, deviceFormats.begin()),\n> +          [&](const decltype(v4l2Formats)::value_type &format) {\n> +              return decltype(deviceFormats)::value_type{\n> +                  format.first.toPixelFormat(),\n> +                  format.second\n> +              };\n> +          });\n> +\n> +The `StreamFormats`_ class holds information about the pixel formats and frame\n> +sizes that a stream can support. The class groups size information by the pixel\n> +format that can produce it.\n> +\n> +.. _StreamFormats: http://libcamera.org/api-html/classlibcamera_1_1StreamFormats.html\n> +\n> +The code below uses the ``StreamFormats`` class to represent all of the\n> +supported pixel formats, associated with a list of frame sizes. It then\n> +generates a supported StreamConfiguration to model the information an\n> +application can use to configure a single stream.\n> +\n> +Continue adding the following code to support this:\n> +\n> +.. code-block:: cpp\n> +\n> +   StreamFormats formats(deviceFormats);\n> +   StreamConfiguration cfg(formats);\n> +\n> +As well as a list of supported StreamFormats, the StreamConfiguration is also\n> +expected to provide an initialised default configuration. This may be arbitrary,\n> +but depending on the use case you may wish to select an output that matches the\n> +sensor output, or prefer a pixel format which might provide higher performance\n> +on the hardware. The bufferCount represents the number of buffers required to\n> +support functional continuous processing on this stream.\n> +\n> +.. 
code-block:: cpp\n> +\n> +   cfg.pixelFormat = formats::BGR888;\n> +   cfg.size = { 1280, 720 };\n> +   cfg.bufferCount = 4;\n> +\n> +Finally add each ``StreamConfiguration`` generated to the\n> +``CameraConfiguration``, and ensure that it has been validated before returning\n> +it to the application. With only a single supported stream, this code adds only\n> +a single StreamConfiguration; however, a StreamConfiguration should be added for\n> +each supported role in a device that can handle more streams.\n> +\n> +Add the following code to complete the implementation of\n> +``generateConfiguration``:\n> +\n> +.. code-block:: cpp\n> +\n> +   config->addConfiguration(cfg);\n> +\n> +   config->validate();\n> +\n> +   return config;\n> +\n> +To validate a camera configuration, a pipeline handler must implement the\n> +`CameraConfiguration::validate()`_ method in its derived class to inspect all\n> +the stream configurations associated with it, make any adjustments required to\n> +make the configuration valid, and return the validation status.\n> +\n> +If changes are made, it marks the configuration as ``Adjusted``; however, if the\n> +requested configuration is not supported and cannot be adjusted it shall be\n> +refused and marked as ``Invalid``.\n> +\n> +.. _CameraConfiguration::validate(): http://libcamera.org/api-html/classlibcamera_1_1CameraConfiguration.html#a29f8f263384c6149775b6011c7397093\n> +\n> +The validation phase makes sure all the platform-specific constraints are\n> +respected by the requested configuration. The most trivial examples are making\n> +sure the requested image formats are supported and that the image alignment\n> +restrictions are adhered to. 
The pipeline handler-specific implementation of\n> +``validate()`` shall inspect all the configuration parameters received and never\n> +assume they are correct, as applications are free to change the requested stream\n> +parameters after the configuration has been generated.\n> +\n> +Again, this example pipeline handler is simple; look at the more complex\n> +implementations for realistic examples.\n> +\n> +Add the following function implementation to your file:\n> +\n> +.. code-block:: cpp\n> +\n> +   CameraConfiguration::Status VividCameraConfiguration::validate()\n> +   {\n> +           Status status = Valid;\n> +\n> +           if (config_.empty())\n> +                  return Invalid;\n> +\n> +           if (config_.size() > 1) {\n> +                  config_.resize(1);\n> +                  status = Adjusted;\n> +           }\n> +\n> +           StreamConfiguration &cfg = config_[0];\n> +\n> +           const std::vector<libcamera::PixelFormat> formats = cfg.formats().pixelformats();\n> +           if (std::find(formats.begin(), formats.end(), cfg.pixelFormat) == formats.end()) {\n> +                  cfg.pixelFormat = cfg.formats().pixelformats()[0];\n> +                  LOG(VIVID, Debug) << \"Adjusting format to \" << cfg.pixelFormat.toString();\n> +                  status = Adjusted;\n> +           }\n> +\n> +           cfg.bufferCount = 4;\n> +\n> +           return status;\n> +   }\n> +\n> +Now that we are handling the ``PixelFormat`` type, we also need to add\n> +``#include <libcamera/formats.h>`` to the include section before we rebuild the\n> +codebase, and test:\n> +\n> +.. code-block:: shell\n> +\n> +   ninja -C build\n> +   LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./build/src/cam/cam -c vivid -I\n> +\n> +You should see the following output showing the capabilities of our new pipeline\n> +handler, and that our configurations have been generated:\n> +\n> +.. 
code-block:: shell\n> +\n> +    Using camera vivid\n> +    0: 1280x720-BGR888\n> +    * Pixelformat: NV21 (320x180)-(3840x2160)/(+0,+0)\n> +    - 320x180\n> +    - 640x360\n> +    - 640x480\n> +    - 1280x720\n> +    - 1920x1080\n> +    - 3840x2160\n> +    * Pixelformat: NV12 (320x180)-(3840x2160)/(+0,+0)\n> +    - 320x180\n> +    - 640x360\n> +    - 640x480\n> +    - 1280x720\n> +    - 1920x1080\n> +    - 3840x2160\n> +    * Pixelformat: BGRA8888 (320x180)-(3840x2160)/(+0,+0)\n> +    - 320x180\n> +    - 640x360\n> +    - 640x480\n> +    - 1280x720\n> +    - 1920x1080\n> +    - 3840x2160\n> +    * Pixelformat: RGBA8888 (320x180)-(3840x2160)/(+0,+0)\n> +    - 320x180\n> +    - 640x360\n> +    - 640x480\n> +    - 1280x720\n> +    - 1920x1080\n> +    - 3840x2160\n> +\n> +Configuring a device\n> +~~~~~~~~~~~~~~~~~~~~\n> +\n> +With the configuration generated, and optionally modified and re-validated, a\n> +pipeline handler needs a method that allows an application to apply a\n> +configuration to the hardware devices.\n> +\n> +The `PipelineHandler::configure()`_ method receives a valid\n> +`CameraConfiguration`_ and applies the settings to hardware devices, using its\n> +parameters to prepare a device for a streaming session with the desired\n> +properties.\n> +\n> +.. _PipelineHandler::configure(): http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a930f2a9cdfb51dfb4b9ca3824e84fc29\n> +.. _CameraConfiguration: http://libcamera.org/api-html/classlibcamera_1_1CameraConfiguration.html\n> +\n> +Replace the contents of the stubbed ``PipelineHandlerVivid::configure`` method\n> +with the following to obtain the camera data and stream configuration. This\n> +pipeline handler supports only a single stream, so it directly obtains the first\n> +``StreamConfiguration`` from the camera configuration. A pipeline handler with\n> +multiple streams should inspect each StreamConfiguration and configure the\n> +system accordingly.\n> +\n> +.. 
code-block:: cpp\n> +\n> +   VividCameraData *data = cameraData(camera);\n> +   StreamConfiguration &cfg = config->at(0);\n> +   int ret;\n> +\n> +The Vivid capture device is a V4L2 video device, so we use a `V4L2DeviceFormat`_\n> +with the fourcc and size attributes to apply directly to the capture device\n> +node. The fourcc attribute is a `V4L2PixelFormat`_ and differs from the\n> +``libcamera::PixelFormat``. Converting the format requires knowledge of the\n> +plane configuration for multiplanar formats, so you must explicitly convert it\n> +using the helper ``V4L2VideoDevice::toV4L2PixelFormat()`` provided by the\n> +V4L2VideoDevice instance to which the format will be applied.\n> +\n> +.. _V4L2DeviceFormat: http://libcamera.org/api-html/classlibcamera_1_1V4L2DeviceFormat.html\n> +.. _V4L2PixelFormat: http://libcamera.org/api-html/classlibcamera_1_1V4L2PixelFormat.html\n> +\n> +Add the following code beneath the code from above:\n> +\n> +.. code-block:: cpp\n> +\n> +   V4L2DeviceFormat format = {};\n> +   format.fourcc = data->video_->toV4L2PixelFormat(cfg.pixelFormat);\n> +   format.size = cfg.size;\n> +\n> +Set the video device format defined above using the\n> +`V4L2VideoDevice::setFormat()`_ method. You should check if the kernel\n> +driver has adjusted the format, as this shows the pipeline handler has failed to\n> +handle the validation stages correctly, and the configure operation shall also\n> +fail.\n> +\n> +.. _V4L2VideoDevice::setFormat(): http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#ad67b47dd9327ce5df43350b80c083cca\n> +\n> +Continue the implementation with the following code:\n> +\n> +.. 
code-block:: cpp\n> +\n> +   ret = data->video_->setFormat(&format);\n> +   if (ret)\n> +          return ret;\n> +\n> +   if (format.size != cfg.size ||\n> +          format.fourcc != data->video_->toV4L2PixelFormat(cfg.pixelFormat))\n> +          return -EINVAL;\n> +\n> +Finally, store and set stream-specific data reflecting the state of the stream.\n> +Associate the configuration with the stream by using the\n> +`StreamConfiguration::setStream`_ method, and set the values of individual\n> +stream configuration members as required.\n> +\n> +.. _StreamConfiguration::setStream: http://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html#a74a0eb44dad1b00112c7c0443ae54a12\n> +\n> +.. NOTE: the cfg.setStream() call here associates the stream to the\n> +   StreamConfiguration however that should quite likely be done as part of\n> +   the validation process. TBD\n> +\n> +Complete the configure implementation with the following code:\n> +\n> +.. code-block:: cpp\n> +\n> +   cfg.setStream(&data->stream_);\n> +   cfg.stride = format.planes[0].bpl;\n> +\n> +   return 0;\n> +\n> +.. TODO: stride SHALL be assigned in validate\n> +\n> +Initializing device controls\n> +~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +Pipeline handlers can optionally initialize the video devices and camera sensor\n> +controls at system configuration time, to make sure they are defaulted to sane\n> +values. Handling of device controls is again performed using the libcamera\n> +`controls framework`_.\n> +\n> +.. _Controls Framework: http://libcamera.org/api-html/controls_8h.html\n> +\n> +This section is particularly specific to Vivid as it sets the initial values of\n> +controls to match `Vivid Controls`_ defined by the kernel driver. You won’t need\n> +any of the code below for your pipeline handler, but it’s included as an example\n> +of how to implement functionality your pipeline handler might need.\n> +\n> +.. 
_Vivid Controls: https://www.kernel.org/doc/html/latest/admin-guide/media/vivid.html#controls\n> +\n> +We need to add some definitions at the top of the file for convenience. These\n> +come directly from the kernel sources:\n> +\n> +.. code-block:: cpp\n> +\n> +   #define VIVID_CID_VIVID_BASE            (0x00f00000 | 0xf000)\n> +   #define VIVID_CID_VIVID_CLASS           (0x00f00000 | 1)\n> +   #define VIVID_CID_TEST_PATTERN          (VIVID_CID_VIVID_BASE  + 0)\n> +   #define VIVID_CID_OSD_TEXT_MODE         (VIVID_CID_VIVID_BASE  + 1)\n> +   #define VIVID_CID_HOR_MOVEMENT          (VIVID_CID_VIVID_BASE  + 2)\n> +\n> +We can now use the V4L2 control IDs to prepare a list of controls with the\n> +`ControlList`_ class, and set them using the `ControlList::set()`_ method.\n> +\n> +.. _ControlList: http://libcamera.org/api-html/classlibcamera_1_1ControlList.html\n> +.. _ControlList::set(): http://libcamera.org/api-html/classlibcamera_1_1ControlList.html#a74a1a29abff5243e6e37ace8e24eb4ba\n> +\n> +In our pipeline ``configure`` method, add the following code after the format\n> +has been set and checked to initialise the ControlList and apply it to the\n> +device:\n> +\n> +.. code-block:: cpp\n> +\n> +   ControlList controls(data->video_->controls());\n> +   controls.set(VIVID_CID_TEST_PATTERN, 0);\n> +   controls.set(VIVID_CID_OSD_TEXT_MODE, 0);\n> +\n> +   controls.set(V4L2_CID_BRIGHTNESS, 128);\n> +   controls.set(V4L2_CID_CONTRAST, 128);\n> +   controls.set(V4L2_CID_SATURATION, 128);\n> +\n> +   controls.set(VIVID_CID_HOR_MOVEMENT, 5);\n> +\n> +   ret = data->video_->setControls(&controls);\n> +   if (ret) {\n> +          LOG(VIVID, Error) << \"Failed to set controls: \" << ret;\n> +          return ret < 0 ? ret : -EINVAL;\n> +   }\n> +\n> +These controls configure VIVID to use a default test pattern, and enable all\n> +on-screen display text, while configuring sensible brightness, contrast and\n> +saturation values. 
Use the ``controls.set`` method to set individual controls.\n> +\n> +Buffer handling and stream control\n> +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +Once the system has been configured with the requested parameters, it is\n> +possible for applications to start capturing frames from the ``Camera`` device.\n> +\n> +Libcamera implements a per-frame request capture model, realized by queueing\n\ns/Libcamera/libcamera/ (there are two other occurrences)\n\n> +``Request`` instances to a ``Camera`` object. Before applications can start\n> +submitting capture requests the capture pipeline needs to be prepared to deliver\n> +frames as soon as they are requested. Memory should be initialized and made\n> +available to the devices which have to be started and ready to produce\n> +images. At the end of a capture session the ``Camera`` device needs to be\n> +stopped, to gracefully clean up any allocated memory and stop the hardware\n> +devices. Pipeline handlers implement two methods for these purposes, the\n> +``start()`` and ``stop()`` methods.\n> +\n> +The memory initialization phase that happens at ``start()`` time serves to\n> +configure video devices to be able to use memory buffers exported as dma-buf\n> +file descriptors. From the pipeline handler's perspective the video devices that\n> +provide application-facing streams always act as memory importers which use,\n> +in V4L2 terminology, buffers of V4L2_MEMORY_DMABUF memory type.\n> +\n> +Libcamera also provides an API to allocate and export memory to applications\n> +realized through the `exportFrameBuffers`_ function and the\n> +`FrameBufferAllocator`_ class which will be presented later.\n> +\n> +.. _exportFrameBuffers: http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a6312a69da7129c2ed41f9d9f790adf7c\n> +.. 
_FrameBufferAllocator: http://libcamera.org/api-html/classlibcamera_1_1FrameBufferAllocator.html\n> +\n> +Please refer to the V4L2VideoDevice API documentation, specifically the\n> +`allocateBuffers`_, `importBuffers`_ and `exportBuffers`_ functions for a\n> +detailed description of the video device memory management.\n> +\n> +.. _allocateBuffers: http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a3a1a77e5e6c220ea7878e89485864a1c\n> +.. _importBuffers: http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a154f5283d16ebd5e15d63e212745cb64\n> +.. _exportBuffers: http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#ae9c0b0a68f350725b63b73a6da5a2ecd\n> +\n> +Video memory buffers are represented in libcamera by the `FrameBuffer`_ class.\n> +A ``FrameBuffer`` instance has to be associated with each ``Stream`` which is\n> +part of a capture ``Request``. Pipeline handlers should prepare the capture\n> +devices by importing the dma-buf file descriptors they need to operate on. This\n> +operation is performed by using the ``V4L2VideoDevice`` API, which provides an\n> +``importBuffers()`` function that prepares the video device accordingly.\n> +\n> +.. _FrameBuffer: http://libcamera.org/api-html/classlibcamera_1_1FrameBuffer.html\n> +\n> +Implement the pipeline handler ``start()`` function by replacing the stub\n> +version with the following code:\n> +\n> +.. code-block:: cpp\n> +\n> +   VividCameraData *data = cameraData(camera);\n> +   unsigned int count = data->stream_.configuration().bufferCount;\n> +\n> +   int ret = data->video_->importBuffers(count);\n> +   if (ret < 0)\n> +         return ret;\n> +\n> +   return 0;\n> +\n> +During the startup phase pipeline handlers allocate any internal buffer pool\n> +required to transfer data between different components of the image capture\n> +pipeline, for example, between the CSI-2 receiver and the ISP input. 
The example\n> +pipeline does not require any internal pool, but examples are available in more\n> +complex pipeline handlers in the libcamera code base.\n> +\n> +Applications might want to use memory allocated in the video devices instead of\n> +allocating it from other parts of the system. libcamera provides an abstraction\n> +to assist with this task in the `FrameBufferAllocator`_ class. The\n> +``FrameBufferAllocator`` reserves memory for a ``Stream`` in the video device\n> +and exports it as dma-buf file descriptors. From this point on, the allocated\n> +``FrameBuffer`` instances are associated with ``Stream`` instances in a\n> +``Request`` and then imported by the pipeline handler in exactly the same\n> +fashion as if they were allocated elsewhere.\n> +\n> +.. _FrameBufferAllocator: http://libcamera.org/api-html/classlibcamera_1_1FrameBufferAllocator.html\n> +\n> +Pipeline handlers support the ``FrameBufferAllocator`` operations by\n> +implementing the `exportFrameBuffers`_ function, which will allocate memory in\n> +the video device associated with a stream and export it.\n> +\n> +.. _exportFrameBuffers: http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a6312a69da7129c2ed41f9d9f790adf7c\n> +\n> +Implement the ``exportFrameBuffers`` stub method with the following code to\n> +handle this:\n> +\n> +.. code-block:: cpp\n> +\n> +   unsigned int count = stream->configuration().bufferCount;\n> +   VividCameraData *data = cameraData(camera);\n> +\n> +   return data->video_->exportBuffers(count, buffers);\n> +\n> +Once memory has been properly set up, the video devices can be started to\n> +prepare for capture operations. Complete the ``start`` method implementation\n> +with the following code:\n> +\n> +.. 
code-block:: cpp\n> +\n> +   ret = data->video_->streamOn();\n> +   if (ret < 0) {\n> +          data->video_->releaseBuffers();\n> +          return ret;\n> +   }\n> +\n> +   return 0;\n> +\n> +The method starts the video device associated with the stream with the\n> +`streamOn`_ method. If the call fails, the error value is propagated to the\n> +caller and the `releaseBuffers`_ method releases any buffers to leave the device\n> +in a consistent state. If your pipeline handler uses any image processing\n> +algorithms or other devices, you should also stop them.\n> +\n> +.. _streamOn: http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a588a5dc9d6f4c54c61136ac43ff9a8cc\n> +.. _releaseBuffers: http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a191619c152f764e03bc461611f3fcd35\n> +\n> +Of course we also need to handle the corresponding actions to stop streaming on\n> +a device. Add the following to the ``stop`` method, to stop the stream with the\n> +`streamOff`_ method and release all buffers.\n> +\n> +.. _streamOff: http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a61998710615bdf7aa25a046c8565ed66\n> +\n> +.. code-block:: cpp\n> +\n> +   VividCameraData *data = cameraData(camera);\n> +   data->video_->streamOff();\n> +   data->video_->releaseBuffers();\n> +\n> +Queuing requests between applications and hardware\n> +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +libcamera implements a streaming model based on capture requests queued by an\n> +application to the ``Camera`` device. 
Each request contains at least one\n> +``Stream`` instance with an associated ``FrameBuffer`` object.\n> +\n> +When an application sends a capture request, the pipeline handler identifies\n> +which video devices have to be provided with buffers to generate a frame from\n> +the enabled streams.\n> +\n> +This example pipeline handler identifies the buffer using the `findBuffer`_\n> +helper from the only supported stream and queues it to the capture device\n> +directly with the `queueBuffer`_ method provided by the V4L2VideoDevice.\n> +\n> +.. _findBuffer: http://libcamera.org/api-html/classlibcamera_1_1Request.html#ac66050aeb9b92c64218945158559c4d4\n> +.. _queueBuffer: http://libcamera.org/api-html/classlibcamera_1_1V4L2VideoDevice.html#a594cd594686a8c1cf9ae8dba0b2a8a75\n> +\n> +Replace the stubbed contents of ``queueRequestDevice`` with the following:\n> +\n> +.. code-block:: cpp\n> +\n> +   VividCameraData *data = cameraData(camera);\n> +   FrameBuffer *buffer = request->findBuffer(&data->stream_);\n> +   if (!buffer) {\n> +          LOG(VIVID, Error)\n> +                  << \"Attempt to queue request with invalid stream\";\n> +\n> +          return -ENOENT;\n> +    }\n> +\n> +   int ret = data->video_->queueBuffer(buffer);\n> +   if (ret < 0)\n> +          return ret;\n> +\n> +   return 0;\n> +\n> +Processing controls\n> +~~~~~~~~~~~~~~~~~~~\n> +\n> +Capture requests not only contain streams and memory buffers, but can\n> +optionally contain a list of controls the application has set to modify the\n> +streaming parameters.\n> +\n> +Applications can set controls registered by the pipeline handler in the\n> +initialization phase, as explained in the `Registering controls and properties`_\n> +section.\n> +\n> +Implement a ``processControls`` method above the ``queueRequestDevice`` method\n> +to loop through the control list received with each request, and inspect the\n> +control values. 
Controls may need to be converted between the libcamera control\n> +range definitions and their corresponding values on the device before being set.\n> +\n> +.. code-block:: cpp\n> +\n> +   int PipelineHandlerVivid::processControls(VividCameraData *data, Request *request)\n> +   {\n> +          ControlList controls(data->video_->controls());\n> +\n> +          for (auto it : request->controls()) {\n> +                 unsigned int id = it.first;\n> +                 unsigned int offset;\n> +                 uint32_t cid;\n> +\n> +                 if (id == controls::Brightness) {\n> +                        cid = V4L2_CID_BRIGHTNESS;\n> +                        offset = 128;\n> +                 } else if (id == controls::Contrast) {\n> +                        cid = V4L2_CID_CONTRAST;\n> +                        offset = 0;\n> +                 } else if (id == controls::Saturation) {\n> +                        cid = V4L2_CID_SATURATION;\n> +                        offset = 0;\n> +                 } else {\n> +                        continue;\n> +                 }\n> +\n> +                 int32_t value = lroundf(it.second.get<float>() * 128 + offset);\n> +                 controls.set(cid, utils::clamp(value, 0, 255));\n> +          }\n> +\n> +          for (const auto &ctrl : controls)\n> +                 LOG(VIVID, Debug)\n> +                        << \"Setting control \" << utils::hex(ctrl.first)\n> +                        << \" to \" << ctrl.second.toString();\n> +\n> +          int ret = data->video_->setControls(&controls);\n> +          if (ret) {\n> +                 LOG(VIVID, Error) << \"Failed to set controls: \" << ret;\n> +                 return ret < 0 ? ret : -EINVAL;\n> +          }\n> +\n> +          return ret;\n> +   }\n> +\n> +Declare the function prototype for the ``processControls`` method within the\n> +private ``PipelineHandlerVivid`` class members, as it is only used internally as\n> +a helper when processing Requests.\n> +\n> +.. 
code-block:: cpp\n> +\n> +   private:\n> +        int processControls(VividCameraData *data, Request *request);\n> +\n> +A pipeline handler is responsible for applying controls provided in a Request to\n> +the relevant hardware devices. This could be directly on the capture device, or\n> +where appropriate by setting controls on V4L2Subdevices directly. Each pipeline\n> +handler is responsible for understanding the correct procedure for applying\n> +controls to the device they support.\n> +\n> +This example pipeline handler applies controls during the `queueRequestDevice`_\n> +method for each request, and applies them to the capture device through the\n> +capture node.\n> +\n> +.. _queueRequestDevice: http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a106914cca210640c9da9ee1f0419e83c\n> +\n> +In the ``queueRequestDevice`` method, replace the following:\n> +\n> +.. code-block:: cpp\n> +\n> +   int ret = data->video_->queueBuffer(buffer);\n> +   if (ret < 0)\n> +        return ret;\n> +\n> +With the following code:\n> +\n> +.. code-block:: cpp\n> +\n> +   int ret = processControls(data, request);\n> +   if (ret < 0)\n> +        return ret;\n> +\n> +   ret = data->video_->queueBuffer(buffer);\n> +   if (ret < 0)\n> +        return ret;\n> +\n> +We also need to add the following include directive to support the control\n> +value translation operations:\n> +\n> +.. code-block:: cpp\n> +\n> +   #include <math.h>\n> +\n> +Frame completion and event handling\n> +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +Libcamera implements a signals and slots mechanism (similar to `Qt Signals and\n> +Slots`_) to connect event sources with callbacks to handle them.\n> +\n> +As a general summary, a ``Slot`` can be connected to a ``Signal``, which when\n> +emitted triggers the execution of the connected slots.  A detailed description\n> +of the libcamera implementation is available in the `libcamera Signal and Slot`_\n> +classes documentation.\n> +\n> +.. 
_Qt Signals and Slots: https://doc.qt.io/qt-5/signalsandslots.html\n> +.. _libcamera Signal and Slot: http://libcamera.org/api-html/classlibcamera_1_1Signal.html#details\n> +\n> +In order to notify applications about the availability of new frames and data,\n> +the ``Camera`` device exposes two ``Signals`` which applications can connect to\n> +be notified of frame completion events. The ``bufferComplete`` signal serves to\n> +report to applications the completion event of a single ``Stream`` part of a\n> +``Request``, while the ``requestComplete`` signal notifies the completion of all\n> +the ``Streams`` and data submitted as part of a request. This mechanism allows\n> +implementation of partial request completion, which allows an application to\n> +inspect completed buffers associated with the single streams without waiting for\n> +all of them to be ready.\n> +\n> +The ``bufferComplete`` and ``requestComplete`` signals are emitted by the\n> +``Camera`` device upon notifications received from the pipeline handler, which\n> +tracks the buffers and request completion status.\n> +\n> +The single buffer completion notification is implemented by pipeline handlers by\n> +`connecting`_ the ``bufferReady`` signal of the capture devices they have queued\n> +buffers to, to a member function slot that handles processing of the completed\n> +frames. When a buffer is ready, the pipeline handler must propagate the\n> +completion of that buffer to the Camera by using the PipelineHandler base class\n> +``completeBuffer`` function. When all of the buffers referenced by a ``Request``\n> +have been completed, the pipeline handler must again notify the ``Camera`` using\n> +the PipelineHandler base class ``completeRequest`` function. The PipelineHandler\n> +class implementation makes sure the request completion notifications are\n> +delivered to applications in the same order as they have been submitted.\n> +\n> +.. 
_connecting: http://libcamera.org/api-html/classlibcamera_1_1Signal.html#aa04db72d5b3091ffbb4920565aeed382\n> +\n> +Returning to the ``int VividCameraData::init()`` method, add the following above\n> +the closing ``return 0;`` to connect the pipeline handler ``bufferReady``\n> +method to the V4L2 device buffer signal.\n> +\n> +.. code-block:: cpp\n> +\n> +   video_->bufferReady.connect(this, &VividCameraData::bufferReady);\n> +\n> +Create the matching ``VividCameraData::bufferReady`` method after your\n> +``VividCameraData::init()`` implementation.\n> +\n> +The ``bufferReady`` method obtains the request from the buffer using the\n> +``request`` method, and notifies the ``Camera`` that the buffer and\n> +request are completed. In this simpler pipeline handler, there is only one\n> +stream, so it completes the request immediately. You can find a more complex\n> +example of event handling that supports multiple streams in the libcamera\n> +code base.\n> +\n> +.. TODO: Add link\n> +\n> +.. code-block:: cpp\n> +\n> +   void VividCameraData::bufferReady(FrameBuffer *buffer)\n> +   {\n> +          Request *request = buffer->request();\n> +\n> +          pipe_->completeBuffer(camera_, request, buffer);\n> +          pipe_->completeRequest(camera_, request);\n> +   }\n> +\n> +Testing a pipeline handler\n> +~~~~~~~~~~~~~~~~~~~~~~~~~~\n> +\n> +Once you've written the pipeline handler, rebuild the code base and test capture\n> +through the pipeline with both the cam and qcam utilities:\n> +\n> +.. code-block:: shell\n> +\n> +   ninja -C build\n> +   ./build/src/cam/cam -c vivid -C 5\n> +\n> +This tests that the pipeline handler can detect a device and capture input.\n> +\n> +Running the command above outputs (a lot of) information about pixel formats,\n> +and then starts capturing frame data, and should provide an output such as the\n> +following:\n> +\n> +.. 
code-block:: none\n> +\n> +   user@dev:/home/libcamera$ ./build/src/cam/cam -c vivid -C5\n> +   [42:34:08.573066847] [186470]  INFO IPAManager ipa_manager.cpp:136 libcamera is not installed. Adding '/home/libcamera/build/src/ipa' to the IPA search path\n> +   [42:34:08.575908115] [186470]  INFO Camera camera_manager.cpp:287 libcamera v0.0.11+876-7b27d262\n> +   [42:34:08.610334268] [186471]  INFO IPAProxy ipa_proxy.cpp:122 libcamera is not installed. Loading IPA configuration from '/home/libcamera/src/ipa/vimc/data'\n> +   Using camera vivid\n> +   [42:34:08.618462130] [186470]  WARN V4L2 v4l2_pixelformat.cpp:176 Unsupported V4L2 pixel format Y10\n> +   ... <remaining Unsupported V4L2 pixel format warnings can be ignored>\n> +   [42:34:08.619901297] [186470]  INFO Camera camera.cpp:793 configuring streams: (0) 1280x720-BGR888\n> +   Capture 5 frames\n> +   fps: 0.00 stream0 seq: 000000 bytesused: 2764800\n> +   fps: 4.98 stream0 seq: 000001 bytesused: 2764800\n> +   fps: 5.00 stream0 seq: 000002 bytesused: 2764800\n> +   fps: 5.03 stream0 seq: 000003 bytesused: 2764800\n> +   fps: 5.03 stream0 seq: 000004 bytesused: 2764800\n> +\n> +This demonstrates that the pipeline handler is successfully capturing frames,\n> +but it is helpful to see the visual output and validate the images are being\n> +processed correctly. The libcamera project also implements a Qt-based\n> +application which will render the frames in a window for visual inspection:\n> +\n> +.. code-block:: shell\n> +\n> +   ./build/src/qcam/qcam -c vivid\n> +\n> +.. TODO: Running qcam with the vivid pipeline handler appears to have a bug and\n> +         no visual frames are seen. 
However disabling zero-copy on qcam renders\n> +         them successfully.\n> \\ No newline at end of file\n> diff --git a/Documentation/index.rst b/Documentation/index.rst\n> index cfcfd388699b..fb391d2b6ebf 100644\n> --- a/Documentation/index.rst\n> +++ b/Documentation/index.rst\n> @@ -14,3 +14,4 @@\n>     Contribute <contributing>\n>  \n>     Developer Guide <guides/introduction>\n> +   Pipeline Handler Writers Guide <guides/pipeline-handler>\n\nWriters or Writer's ?\n\nOther improvements can go on top.\n\nAcked-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n\n> diff --git a/Documentation/meson.build b/Documentation/meson.build\n> index dd7ae700af11..9f6c67071da9 100644\n> --- a/Documentation/meson.build\n> +++ b/Documentation/meson.build\n> @@ -53,6 +53,7 @@ if sphinx.found()\n>          'docs.rst',\n>          'index.rst',\n>          'guides/introduction.rst',\n> +        'guides/pipeline-handler.rst',\n>      ]\n>  \n>      release = 'release=v' + libcamera_git_version","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 8B120BE173\n\tfor <parsemail@patchwork.libcamera.org>;\n\tThu, 20 Aug 2020 15:44:47 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 2B28662082;\n\tThu, 20 Aug 2020 17:44:47 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 6FA4060381\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu, 20 Aug 2020 17:44:46 +0200 (CEST)","from pendragon.ideasonboard.com (62-78-145-57.bb.dnainternet.fi\n\t[62.78.145.57])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 
BA1E023D;\n\tThu, 20 Aug 2020 17:44:45 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"bxedf0c5\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1597938286;\n\tbh=YKyUeB7+bejsI9BZ4CBRCX83bAE7L55J8I5jc5QgFxU=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=bxedf0c5eXdtZYaNTnb3/qdDU5nn4r6e4J5nqxDTjU99SwSQoeZH/g+CjtvxSS3Gp\n\tg8iwjSlSA2DIkuIv/ke4VN7aN9Q9gDYktIlzv3l1JfcF/4hQs8OeA0hQz750AXGYtX\n\te3im8w2r9T5GH6fjwR5NQ64x9eK9W6q6AFmuGLIY=","Date":"Thu, 20 Aug 2020 18:44:27 +0300","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"Kieran Bingham <kieran.bingham@ideasonboard.com>","Message-ID":"<20200820154427.GS6593@pendragon.ideasonboard.com>","References":"<20200820134751.278033-1-kieran.bingham@ideasonboard.com>\n\t<20200820134751.278033-3-kieran.bingham@ideasonboard.com>","MIME-Version":"1.0","Content-Disposition":"inline","In-Reply-To":"<20200820134751.278033-3-kieran.bingham@ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v4 2/3] Documentation: Guides:\n\tPipeline Handler Writers Guide","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"Chris Ward 
<chris@gregariousmammal.com>,\n\tlibcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"text/plain; charset=\"utf-8\"","Content-Transfer-Encoding":"base64","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}}]