[{"id":30862,"web_url":"https://patchwork.libcamera.org/comment/30862/","msgid":"<20240819214022.GB4770@pendragon.ideasonboard.com>","date":"2024-08-19T21:40:22","subject":"Re: [PATCH v3 4/7] Documentation: Breakout docs.rst","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Dan,\n\nThank you for the patch.\n\nOn Mon, Aug 19, 2024 at 05:09:18PM +0100, Daniel Scally wrote:\n> In preparation for including more of the Documentation for libcamera\n> on the website, break out the libcamera Architecture and Feature\n> Requirements sections of docs.rst file into separate files for each\n> section. Add all of the new files to documentation-contents.rst so\n> they're included on the website too.\n> \n> Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>\n\nReviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n\n> ---\n> Changes since v2:\n> \n> \t- Reworked to breakout libcamera architecture instead of camera stack\n> \t  section. Dropped the R-b.\n> \n> Changes since v1:\n> \n> \t- Included the new files in meson's docs_sources array\n> \n>  Documentation/docs.rst                   | 278 -----------------------\n>  Documentation/documentation-contents.rst |   2 +\n>  Documentation/feature_requirements.rst   | 145 ++++++++++++\n>  Documentation/index.rst                  |   2 +\n>  Documentation/libcamera_architecture.rst | 138 +++++++++++\n>  Documentation/meson.build                |   2 +\n>  6 files changed, 289 insertions(+), 278 deletions(-)\n>  create mode 100644 Documentation/feature_requirements.rst\n>  create mode 100644 Documentation/libcamera_architecture.rst\n> \n> diff --git a/Documentation/docs.rst b/Documentation/docs.rst\n> index 0eacc924..1f0d12c0 100644\n> --- a/Documentation/docs.rst\n> +++ b/Documentation/docs.rst\n> @@ -21,149 +21,6 @@ The libcamera API is extensively documented using Doxygen. 
The :ref:`API\n>  nightly build <api>` contains the most up-to-date API documentation, built from\n>  the latest master branch.\n>  \n> -Feature Requirements\n> -====================\n> -\n> -Device enumeration\n> -------------------\n> -\n> -The library shall support enumerating all camera devices available in the\n> -system, including both fixed cameras and hotpluggable cameras. It shall\n> -support cameras plugged and unplugged after the initialization of the\n> -library, and shall offer a mechanism to notify applications of camera plug\n> -and unplug.\n> -\n> -The following types of cameras shall be supported:\n> -\n> -* Internal cameras designed for point-and-shoot still image and video\n> -  capture usage, either controlled directly by the CPU, or exposed through\n> -  an internal USB bus as a UVC device.\n> -\n> -* External UVC cameras designed for video conferencing usage.\n> -\n> -Other types of camera, including analog cameras, depth cameras, thermal\n> -cameras, external digital picture or movie cameras, are out of scope for\n> -this project.\n> -\n> -A hardware device that includes independent camera sensors, such as front\n> -and back sensors in a phone, shall be considered as multiple camera devices\n> -for the purpose of this library.\n> -\n> -Independent Camera Devices\n> ---------------------------\n> -\n> -When multiple cameras are present in the system and are able to operate\n> -independently from each other, the library shall expose them as multiple\n> -camera devices and support parallel operation without any additional usage\n> -restriction apart from the limitations inherent to the hardware (such as\n> -memory bandwidth, CPU usage or number of CSI-2 receivers for instance).\n> -\n> -Independent processes shall be able to use independent cameras devices\n> -without interfering with each other. 
A single camera device shall be\n> -usable by a single process at a time.\n> -\n> -Multiple streams support\n> -------------------------\n> -\n> -The library shall support multiple video streams running in parallel\n> -for each camera device, within the limits imposed by the system.\n> -\n> -Per frame controls\n> -------------------\n> -\n> -The library shall support controlling capture parameters for each stream\n> -on a per-frame basis, on a best effort basis based on the capabilities of the\n> -hardware and underlying software stack (including kernel drivers and\n> -firmware). It shall apply capture parameters to the frame they target, and\n> -report the value of the parameters that have effectively been used for each\n> -captured frame.\n> -\n> -When a camera device supports multiple streams, the library shall allow both\n> -control of each stream independently, and control of multiple streams\n> -together. Streams that are controlled together shall be synchronized. No\n> -synchronization is required for streams controlled independently.\n> -\n> -Capability Enumeration\n> -----------------------\n> -\n> -The library shall expose capabilities of each camera device in a way that\n> -allows applications to discover those capabilities dynamically. Applications\n> -shall be allowed to cache capabilities for as long as they are using the\n> -library. If capabilities can change at runtime, the library shall offer a\n> -mechanism to notify applications of such changes. Applications shall not\n> -cache capabilities in long term storage between runs.\n> -\n> -Capabilities shall be discovered dynamically at runtime from the device when\n> -possible, and may come, in part or in full, from platform configuration\n> -data.\n> -\n> -Device Profiles\n> ----------------\n> -\n> -The library may define different camera device profiles, each with a minimum\n> -set of required capabilities. 
Applications may use those profiles to quickly\n> -determine the level of features exposed by a device without parsing the full\n> -list of capabilities. Camera devices may implement additional capabilities\n> -on top of the minimum required set for the profile they expose.\n> -\n> -3A and Image Enhancement Algorithms\n> ------------------------------------\n> -\n> -The camera devices shall implement auto exposure, auto gain and auto white\n> -balance. Camera devices that include a focus lens shall implement auto\n> -focus. Additional image enhancement algorithms, such as noise reduction or\n> -video stabilization, may be implemented.\n> -\n> -All algorithms may be implemented in hardware or firmware outside of the\n> -library, or in software in the library. They shall all be controllable by\n> -applications.\n> -\n> -The library shall be architectured to isolate the 3A and image enhancement\n> -algorithms in a component with a documented API, respectively called the 3A\n> -component and the 3A API. The 3A API shall be stable, and shall allow both\n> -open-source and closed-source implementations of the 3A component.\n> -\n> -The library may include statically-linked open-source 3A components, and\n> -shall support dynamically-linked open-source and closed-source 3A\n> -components.\n> -\n> -Closed-source 3A Component Sandboxing\n> --------------------------------------\n> -\n> -For security purposes, it may be desired to run closed-source 3A components\n> -in a separate process. The 3A API would in such a case be transported over\n> -IPC. The 3A API shall make it possible to use any IPC mechanism that\n> -supports passing file descriptors.\n> -\n> -The library may implement an IPC mechanism, and shall support third-party\n> -platform-specific IPC mechanisms through the implementation of a\n> -platform-specific 3A API wrapper. 
No modification to the library shall be\n> -needed to use such third-party IPC mechanisms.\n> -\n> -The 3A component shall not directly access any device node on the system.\n> -Such accesses shall instead be performed through the 3A API. The library\n> -shall validate all accesses and restrict them to what is absolutely required\n> -by 3A components.\n> -\n> -V4L2 Compatibility Layer\n> -------------------------\n> -\n> -The project shall support traditional V4L2 application through an additional\n> -libcamera wrapper library. The wrapper library shall trap all accesses to\n> -camera devices through `LD_PRELOAD`, and route them through libcamera to\n> -emulate a high-level V4L2 camera device. It shall expose camera device\n> -features on a best-effort basis, and aim for the level of features\n> -traditionally available from a UVC camera designed for video conferencing.\n> -\n> -Android Camera HAL v3 Compatibility\n> ------------------------------------\n> -\n> -The library API shall expose all the features required to implement an\n> -Android Camera HAL v3 on top of libcamera. Some features of the HAL may be\n> -omitted as long as they can be implemented separately in the HAL, such as\n> -JPEG encoding, or YUV reprocessing.\n> -\n> -\n>  Camera Stack\n>  ============\n>  \n> @@ -269,138 +126,3 @@ Native libcamera API\n>    followed in the :doc:`Application writer's guide </guides/application-developer>`\n>  \n>  .. 
_GStreamer element: https://gstreamer.freedesktop.org/documentation/application-development/basics/elements.html\n> -\n> -libcamera Architecture\n> -======================\n> -\n> -::\n> -\n> -   ---------------------------< libcamera Public API >---------------------------\n> -                    ^                                      ^\n> -                    |                                      |\n> -                    v                                      v\n> -             +-------------+  +-------------------------------------------------+\n> -             |   Camera    |  |  Camera Device                                  |\n> -             |   Devices   |  | +---------------------------------------------+ |\n> -             |   Manager   |  | | Device-Agnostic                             | |\n> -             +-------------+  | |                                             | |\n> -                    ^         | |                    +------------------------+ |\n> -                    |         | |                    |   ~~~~~~~~~~~~~~~~~~~~~  |\n> -                    |         | |                    |  {  +---------------+  } |\n> -                    |         | |                    |  }  | ////Image//// |  { |\n> -                    |         | |                    | <-> | /Processing// |  } |\n> -                    |         | |                    |  }  | /Algorithms// |  { |\n> -                    |         | |                    |  {  +---------------+  } |\n> -                    |         | |                    |   ~~~~~~~~~~~~~~~~~~~~~  |\n> -                    |         | |                    | ======================== |\n> -                    |         | |                    |     +---------------+    |\n> -                    |         | |                    |     | //Pipeline/// |    |\n> -                    |         | |                    | <-> | ///Handler/// |    |\n> -                    |         | |                    |     | 
///////////// |    |\n> -                    |         | +--------------------+     +---------------+    |\n> -                    |         |                                 Device-Specific |\n> -                    |         +-------------------------------------------------+\n> -                    |                     ^                        ^\n> -                    |                     |                        |\n> -                    v                     v                        v\n> -           +--------------------------------------------------------------------+\n> -           | Helpers and Support Classes                                        |\n> -           | +-------------+  +-------------+  +-------------+  +-------------+ |\n> -           | |  MC & V4L2  |  |   Buffers   |  | Sandboxing  |  |   Plugins   | |\n> -           | |   Support   |  |  Allocator  |  |     IPC     |  |   Manager   | |\n> -           | +-------------+  +-------------+  +-------------+  +-------------+ |\n> -           | +-------------+  +-------------+                                   |\n> -           | |  Pipeline   |  |     ...     |                                   |\n> -           | |   Runner    |  |             |                                   |\n> -           | +-------------+  +-------------+                                   |\n> -           +--------------------------------------------------------------------+\n> -\n> -             /// Device-Specific Components\n> -             ~~~ Sandboxing\n> -\n> -While offering a unified API towards upper layers, and presenting\n> -itself as a single library, libcamera isn't monolithic. It exposes\n> -multiple components through its public API, is built around a set of\n> -separate helpers internally, uses device-specific components and can\n> -load dynamic plugins.\n> -\n> -Camera Devices Manager\n> -  The Camera Devices Manager provides a view of available cameras\n> -  in the system. 
It performs cold enumeration and runtime camera\n> -  management, and supports a hotplug notification mechanism in its\n> -  public API.\n> -\n> -  To avoid the cost associated with cold enumeration of all devices\n> -  at application start, and to arbitrate concurrent access to camera\n> -  devices, the Camera Devices Manager could later be split to a\n> -  separate service, possibly with integration in platform-specific\n> -  device management.\n> -\n> -Camera Device\n> -  The Camera Device represents a camera device to upper layers. It\n> -  exposes full control of the device through the public API, and is\n> -  thus the highest level object exposed by libcamera.\n> -\n> -  Camera Device instances are created by the Camera Devices\n> -  Manager. An optional function to create new instances could be exposed\n> -  through the public API to speed up initialization when the upper\n> -  layer knows how to directly address camera devices present in the\n> -  system.\n> -\n> -Pipeline Handler\n> -  The Pipeline Handler manages complex pipelines exposed by the kernel drivers\n> -  through the Media Controller and V4L2 APIs. It abstracts pipeline handling to\n> -  hide device-specific details to the rest of the library, and implements both\n> -  pipeline configuration based on stream configuration, and pipeline runtime\n> -  execution and scheduling when needed by the device.\n> -\n> -  This component is device-specific and is part of the libcamera code base. As\n> -  such it is covered by the same free software license as the rest of libcamera\n> -  and needs to be contributed upstream by device vendors. 
The Pipeline Handler\n> -  lives in the same process as the rest of the library, and has access to all\n> -  helpers and kernel camera-related devices.\n> -\n> -Image Processing Algorithms\n> -  Together with the hardware image processing and hardware statistics\n> -  collection, the Image Processing Algorithms implement 3A (Auto-Exposure,\n> -  Auto-White Balance and Auto-Focus) and other algorithms. They run on the CPU\n> -  and interact with the kernel camera devices to control hardware image\n> -  processing based on the parameters supplied by upper layers, closing the\n> -  control loop of the ISP.\n> -\n> -  This component is device-specific and is loaded as an external plugin. It can\n> -  be part of the libcamera code base, in which case it is covered by the same\n> -  license, or provided externally as an open-source or closed-source component.\n> -\n> -  The component is sandboxed and can only interact with libcamera through\n> -  internal APIs specifically marked as such. In particular it will have no\n> -  direct access to kernel camera devices, and all its accesses to image and\n> -  metadata will be mediated by dmabuf instances explicitly passed to the\n> -  component. The component must be prepared to run in a process separate from\n> -  the main libcamera process, and to have a very restricted view of the system,\n> -  including no access to networking APIs and limited access to file systems.\n> -\n> -  The sandboxing mechanism isn't defined by libcamera. One example\n> -  implementation will be provided as part of the project, and platforms vendors\n> -  will be able to provide their own sandboxing mechanism as a plugin.\n> -\n> -  libcamera should provide a basic implementation of Image Processing\n> -  Algorithms, to serve as a reference for the internal API. Device vendors are\n> -  expected to provide a full-fledged implementation compatible with their\n> -  Pipeline Handler. 
One goal of the libcamera project is to create an\n> -  environment in which the community will be able to compete with the\n> -  closed-source vendor binaries and develop a high quality open source\n> -  implementation.\n> -\n> -Helpers and Support Classes\n> -  While Pipeline Handlers are device-specific, implementations are expected to\n> -  share code due to usage of identical APIs towards the kernel camera drivers\n> -  and the Image Processing Algorithms. This includes without limitation handling\n> -  of the MC and V4L2 APIs, buffer management through dmabuf, and pipeline\n> -  discovery, configuration and scheduling. Such code will be factored out to\n> -  helpers when applicable.\n> -\n> -  Other parts of libcamera will also benefit from factoring code out to\n> -  self-contained support classes, even if such code is present only once in the\n> -  code base, in order to keep the source code clean and easy to read. This\n> -  should be the case for instance for plugin management.\n> diff --git a/Documentation/documentation-contents.rst b/Documentation/documentation-contents.rst\n> index a6915e05..c7b78c43 100644\n> --- a/Documentation/documentation-contents.rst\n> +++ b/Documentation/documentation-contents.rst\n> @@ -7,12 +7,14 @@\n>     * :doc:`/code-of-conduct`\n>     * :doc:`/coding-style`\n>     * :doc:`/environment_variables`\n> +   * :doc:`/feature_requirements`\n>     * :doc:`/guides/application-developer`\n>     * :doc:`/guides/introduction`\n>     * :doc:`/guides/ipa`\n>     * :doc:`/guides/pipeline-handler`\n>     * :doc:`/guides/tracing`\n>     * :doc:`/lens_driver_requirements`\n> +   * :doc:`/libcamera_architecture`\n>     * :doc:`/python-bindings`\n>     * :doc:`/sensor_driver_requirements`\n>     * :doc:`/software-isp-benchmarking`\n> diff --git a/Documentation/feature_requirements.rst b/Documentation/feature_requirements.rst\n> new file mode 100644\n> index 00000000..cae7e9ab\n> --- /dev/null\n> +++ 
b/Documentation/feature_requirements.rst\n> @@ -0,0 +1,145 @@\n> +.. SPDX-License-Identifier: CC-BY-SA-4.0\n> +\n> +.. include:: documentation-contents.rst\n> +\n> +Feature Requirements\n> +====================\n> +\n> +Device enumeration\n> +------------------\n> +\n> +The library shall support enumerating all camera devices available in the\n> +system, including both fixed cameras and hotpluggable cameras. It shall\n> +support cameras plugged and unplugged after the initialization of the\n> +library, and shall offer a mechanism to notify applications of camera plug\n> +and unplug.\n> +\n> +The following types of cameras shall be supported:\n> +\n> +* Internal cameras designed for point-and-shoot still image and video\n> +  capture usage, either controlled directly by the CPU, or exposed through\n> +  an internal USB bus as a UVC device.\n> +\n> +* External UVC cameras designed for video conferencing usage.\n> +\n> +Other types of camera, including analog cameras, depth cameras, thermal\n> +cameras, external digital picture or movie cameras, are out of scope for\n> +this project.\n> +\n> +A hardware device that includes independent camera sensors, such as front\n> +and back sensors in a phone, shall be considered as multiple camera devices\n> +for the purpose of this library.\n> +\n> +Independent Camera Devices\n> +--------------------------\n> +\n> +When multiple cameras are present in the system and are able to operate\n> +independently from each other, the library shall expose them as multiple\n> +camera devices and support parallel operation without any additional usage\n> +restriction apart from the limitations inherent to the hardware (such as\n> +memory bandwidth, CPU usage or number of CSI-2 receivers for instance).\n> +\n> +Independent processes shall be able to use independent cameras devices\n> +without interfering with each other. 
A single camera device shall be\n> +usable by a single process at a time.\n> +\n> +Multiple streams support\n> +------------------------\n> +\n> +The library shall support multiple video streams running in parallel\n> +for each camera device, within the limits imposed by the system.\n> +\n> +Per frame controls\n> +------------------\n> +\n> +The library shall support controlling capture parameters for each stream\n> +on a per-frame basis, on a best effort basis based on the capabilities of the\n> +hardware and underlying software stack (including kernel drivers and\n> +firmware). It shall apply capture parameters to the frame they target, and\n> +report the value of the parameters that have effectively been used for each\n> +captured frame.\n> +\n> +When a camera device supports multiple streams, the library shall allow both\n> +control of each stream independently, and control of multiple streams\n> +together. Streams that are controlled together shall be synchronized. No\n> +synchronization is required for streams controlled independently.\n> +\n> +Capability Enumeration\n> +----------------------\n> +\n> +The library shall expose capabilities of each camera device in a way that\n> +allows applications to discover those capabilities dynamically. Applications\n> +shall be allowed to cache capabilities for as long as they are using the\n> +library. If capabilities can change at runtime, the library shall offer a\n> +mechanism to notify applications of such changes. Applications shall not\n> +cache capabilities in long term storage between runs.\n> +\n> +Capabilities shall be discovered dynamically at runtime from the device when\n> +possible, and may come, in part or in full, from platform configuration\n> +data.\n> +\n> +Device Profiles\n> +---------------\n> +\n> +The library may define different camera device profiles, each with a minimum\n> +set of required capabilities. 
Applications may use those profiles to quickly\n> +determine the level of features exposed by a device without parsing the full\n> +list of capabilities. Camera devices may implement additional capabilities\n> +on top of the minimum required set for the profile they expose.\n> +\n> +3A and Image Enhancement Algorithms\n> +-----------------------------------\n> +\n> +The camera devices shall implement auto exposure, auto gain and auto white\n> +balance. Camera devices that include a focus lens shall implement auto\n> +focus. Additional image enhancement algorithms, such as noise reduction or\n> +video stabilization, may be implemented.\n> +\n> +All algorithms may be implemented in hardware or firmware outside of the\n> +library, or in software in the library. They shall all be controllable by\n> +applications.\n> +\n> +The library shall be architectured to isolate the 3A and image enhancement\n> +algorithms in a component with a documented API, respectively called the 3A\n> +component and the 3A API. The 3A API shall be stable, and shall allow both\n> +open-source and closed-source implementations of the 3A component.\n> +\n> +The library may include statically-linked open-source 3A components, and\n> +shall support dynamically-linked open-source and closed-source 3A\n> +components.\n> +\n> +Closed-source 3A Component Sandboxing\n> +-------------------------------------\n> +\n> +For security purposes, it may be desired to run closed-source 3A components\n> +in a separate process. The 3A API would in such a case be transported over\n> +IPC. The 3A API shall make it possible to use any IPC mechanism that\n> +supports passing file descriptors.\n> +\n> +The library may implement an IPC mechanism, and shall support third-party\n> +platform-specific IPC mechanisms through the implementation of a\n> +platform-specific 3A API wrapper. 
No modification to the library shall be\n> +needed to use such third-party IPC mechanisms.\n> +\n> +The 3A component shall not directly access any device node on the system.\n> +Such accesses shall instead be performed through the 3A API. The library\n> +shall validate all accesses and restrict them to what is absolutely required\n> +by 3A components.\n> +\n> +V4L2 Compatibility Layer\n> +------------------------\n> +\n> +The project shall support traditional V4L2 application through an additional\n> +libcamera wrapper library. The wrapper library shall trap all accesses to\n> +camera devices through `LD_PRELOAD`, and route them through libcamera to\n> +emulate a high-level V4L2 camera device. It shall expose camera device\n> +features on a best-effort basis, and aim for the level of features\n> +traditionally available from a UVC camera designed for video conferencing.\n> +\n> +Android Camera HAL v3 Compatibility\n> +-----------------------------------\n> +\n> +The library API shall expose all the features required to implement an\n> +Android Camera HAL v3 on top of libcamera. 
Some features of the HAL may be\n> +omitted as long as they can be implemented separately in the HAL, such as\n> +JPEG encoding, or YUV reprocessing.\n> diff --git a/Documentation/index.rst b/Documentation/index.rst\n> index 52ddc494..95c80b4e 100644\n> --- a/Documentation/index.rst\n> +++ b/Documentation/index.rst\n> @@ -18,8 +18,10 @@\n>     Camera Sensor Model <camera-sensor-model>\n>     Developer Guide <guides/introduction>\n>     Environment variables <environment_variables>\n> +   Feature Requirements <feature_requirements>\n>     IPA Writer's guide <guides/ipa>\n>     Lens driver requirements <lens_driver_requirements>\n> +   libcamera Architecture <libcamera_architecture>\n>     Pipeline Handler Writer's Guide <guides/pipeline-handler>\n>     Python Bindings <python-bindings>\n>     Sensor driver requirements <sensor_driver_requirements>\n> diff --git a/Documentation/libcamera_architecture.rst b/Documentation/libcamera_architecture.rst\n> new file mode 100644\n> index 00000000..1258db23\n> --- /dev/null\n> +++ b/Documentation/libcamera_architecture.rst\n> @@ -0,0 +1,138 @@\n> +.. SPDX-License-Identifier: CC-BY-SA-4.0\n> +\n> +.. 
include:: documentation-contents.rst\n> +\n> +libcamera Architecture\n> +======================\n> +\n> +::\n> +\n> +   ---------------------------< libcamera Public API >---------------------------\n> +                    ^                                      ^\n> +                    |                                      |\n> +                    v                                      v\n> +             +-------------+  +-------------------------------------------------+\n> +             |   Camera    |  |  Camera Device                                  |\n> +             |   Devices   |  | +---------------------------------------------+ |\n> +             |   Manager   |  | | Device-Agnostic                             | |\n> +             +-------------+  | |                                             | |\n> +                    ^         | |                    +------------------------+ |\n> +                    |         | |                    |   ~~~~~~~~~~~~~~~~~~~~~  |\n> +                    |         | |                    |  {  +---------------+  } |\n> +                    |         | |                    |  }  | ////Image//// |  { |\n> +                    |         | |                    | <-> | /Processing// |  } |\n> +                    |         | |                    |  }  | /Algorithms// |  { |\n> +                    |         | |                    |  {  +---------------+  } |\n> +                    |         | |                    |   ~~~~~~~~~~~~~~~~~~~~~  |\n> +                    |         | |                    | ======================== |\n> +                    |         | |                    |     +---------------+    |\n> +                    |         | |                    |     | //Pipeline/// |    |\n> +                    |         | |                    | <-> | ///Handler/// |    |\n> +                    |         | |                    |     | ///////////// |    |\n> +                    |         | 
+--------------------+     +---------------+    |\n> +                    |         |                                 Device-Specific |\n> +                    |         +-------------------------------------------------+\n> +                    |                     ^                        ^\n> +                    |                     |                        |\n> +                    v                     v                        v\n> +           +--------------------------------------------------------------------+\n> +           | Helpers and Support Classes                                        |\n> +           | +-------------+  +-------------+  +-------------+  +-------------+ |\n> +           | |  MC & V4L2  |  |   Buffers   |  | Sandboxing  |  |   Plugins   | |\n> +           | |   Support   |  |  Allocator  |  |     IPC     |  |   Manager   | |\n> +           | +-------------+  +-------------+  +-------------+  +-------------+ |\n> +           | +-------------+  +-------------+                                   |\n> +           | |  Pipeline   |  |     ...     |                                   |\n> +           | |   Runner    |  |             |                                   |\n> +           | +-------------+  +-------------+                                   |\n> +           +--------------------------------------------------------------------+\n> +\n> +             /// Device-Specific Components\n> +             ~~~ Sandboxing\n> +\n> +While offering a unified API towards upper layers, and presenting\n> +itself as a single library, libcamera isn't monolithic. It exposes\n> +multiple components through its public API, is built around a set of\n> +separate helpers internally, uses device-specific components and can\n> +load dynamic plugins.\n> +\n> +Camera Devices Manager\n> +  The Camera Devices Manager provides a view of available cameras\n> +  in the system. 
It performs cold enumeration and runtime camera\n> +  management, and supports a hotplug notification mechanism in its\n> +  public API.\n> +\n> +  To avoid the cost associated with cold enumeration of all devices\n> +  at application start, and to arbitrate concurrent access to camera\n> +  devices, the Camera Devices Manager could later be split to a\n> +  separate service, possibly with integration in platform-specific\n> +  device management.\n> +\n> +Camera Device\n> +  The Camera Device represents a camera device to upper layers. It\n> +  exposes full control of the device through the public API, and is\n> +  thus the highest level object exposed by libcamera.\n> +\n> +  Camera Device instances are created by the Camera Devices\n> +  Manager. An optional function to create new instances could be exposed\n> +  through the public API to speed up initialization when the upper\n> +  layer knows how to directly address camera devices present in the\n> +  system.\n> +\n> +Pipeline Handler\n> +  The Pipeline Handler manages complex pipelines exposed by the kernel drivers\n> +  through the Media Controller and V4L2 APIs. It abstracts pipeline handling to\n> +  hide device-specific details to the rest of the library, and implements both\n> +  pipeline configuration based on stream configuration, and pipeline runtime\n> +  execution and scheduling when needed by the device.\n> +\n> +  This component is device-specific and is part of the libcamera code base. As\n> +  such it is covered by the same free software license as the rest of libcamera\n> +  and needs to be contributed upstream by device vendors. 
The Pipeline Handler\n> +  lives in the same process as the rest of the library, and has access to all\n> +  helpers and kernel camera-related devices.\n> +\n> +Image Processing Algorithms\n> +  Together with the hardware image processing and hardware statistics\n> +  collection, the Image Processing Algorithms implement 3A (Auto-Exposure,\n> +  Auto-White Balance and Auto-Focus) and other algorithms. They run on the CPU\n> +  and interact with the kernel camera devices to control hardware image\n> +  processing based on the parameters supplied by upper layers, closing the\n> +  control loop of the ISP.\n> +\n> +  This component is device-specific and is loaded as an external plugin. It can\n> +  be part of the libcamera code base, in which case it is covered by the same\n> +  license, or provided externally as an open-source or closed-source component.\n> +\n> +  The component is sandboxed and can only interact with libcamera through\n> +  internal APIs specifically marked as such. In particular it will have no\n> +  direct access to kernel camera devices, and all its accesses to image and\n> +  metadata will be mediated by dmabuf instances explicitly passed to the\n> +  component. The component must be prepared to run in a process separate from\n> +  the main libcamera process, and to have a very restricted view of the system,\n> +  including no access to networking APIs and limited access to file systems.\n> +\n> +  The sandboxing mechanism isn't defined by libcamera. One example\n> +  implementation will be provided as part of the project, and platforms vendors\n> +  will be able to provide their own sandboxing mechanism as a plugin.\n> +\n> +  libcamera should provide a basic implementation of Image Processing\n> +  Algorithms, to serve as a reference for the internal API. Device vendors are\n> +  expected to provide a full-fledged implementation compatible with their\n> +  Pipeline Handler. 
One goal of the libcamera project is to create an\n> +  environment in which the community will be able to compete with the\n> +  closed-source vendor binaries and develop a high quality open source\n> +  implementation.\n> +\n> +Helpers and Support Classes\n> +  While Pipeline Handlers are device-specific, implementations are expected to\n> +  share code due to usage of identical APIs towards the kernel camera drivers\n> +  and the Image Processing Algorithms. This includes without limitation handling\n> +  of the MC and V4L2 APIs, buffer management through dmabuf, and pipeline\n> +  discovery, configuration and scheduling. Such code will be factored out to\n> +  helpers when applicable.\n> +\n> +  Other parts of libcamera will also benefit from factoring code out to\n> +  self-contained support classes, even if such code is present only once in the\n> +  code base, in order to keep the source code clean and easy to read. This\n> +  should be the case for instance for plugin management.\n> diff --git a/Documentation/meson.build b/Documentation/meson.build\n> index 79135b6f..54788d6d 100644\n> --- a/Documentation/meson.build\n> +++ b/Documentation/meson.build\n> @@ -131,6 +131,7 @@ if sphinx.found()\n>          'docs.rst',\n>          'documentation-contents.rst',\n>          'environment_variables.rst',\n> +        'feature_requirements.rst',\n>          'guides/application-developer.rst',\n>          'guides/introduction.rst',\n>          'guides/ipa.rst',\n> @@ -138,6 +139,7 @@ if sphinx.found()\n>          'guides/tracing.rst',\n>          'index.rst',\n>          'lens_driver_requirements.rst',\n> +        'libcamera_architecture.rst',\n>          'python-bindings.rst',\n>          'sensor_driver_requirements.rst',\n>          'software-isp-benchmarking.rst',","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from 
lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 8B571BDB13\n\tfor <parsemail@patchwork.libcamera.org>;\n\tMon, 19 Aug 2024 21:40:53 +0000 (UTC)"],"Date":"Tue, 20 Aug 2024 00:40:22 +0300","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"Daniel Scally <dan.scally@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Subject":"Re: [PATCH v3 4/7] Documentation: Breakout docs.rst","Message-ID":"<20240819214022.GB4770@pendragon.ideasonboard.com>","References":"<20240819160921.468981-1-dan.scally@ideasonboard.com>\n\t<20240819160921.468981-5-dan.scally@ideasonboard.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<20240819160921.468981-5-dan.scally@ideasonboard.com>","List-Id":"<libcamera-devel.lists.libcamera.org>"}}]