[libcamera-devel,v2] Create libcamera overview document and glossary

Message ID 20200702104542.19998-1-chris@gregariousmammal.com
State Superseded
Series
  • [libcamera-devel,v2] Create libcamera overview document and glossary
Commit Message

Chris Chinchilla July 2, 2020, 10:45 a.m. UTC
From: Chris Chinchilla <chris@gregariousmammal.com>

Creates a libcamera architecture overview guide and glossary.

Signed-off-by: Chris Chinchilla <chris@gregariousmammal.com>
---
 Documentation/introduction.rst | 197 +++++++++++++++++++++++++++++++++
 1 file changed, 197 insertions(+)
 create mode 100644 Documentation/introduction.rst

Comments

Umang Jain July 6, 2020, 5:55 p.m. UTC | #1
Hi Chris,

On 7/2/20 4:15 PM, chris@gregariousmammal.com wrote:
> From: Chris Chinchilla <chris@gregariousmammal.com>
>
> Creates a libcamera architecture overview guide and glossary.
>
> Signed-off-by: Chris Chinchilla <chris@gregariousmammal.com>
> ---
>   Documentation/introduction.rst | 197 +++++++++++++++++++++++++++++++++
>   1 file changed, 197 insertions(+)
>   create mode 100644 Documentation/introduction.rst
>
> diff --git a/Documentation/introduction.rst b/Documentation/introduction.rst
> new file mode 100644
> index 0000000..cf914c7
> --- /dev/null
> +++ b/Documentation/introduction.rst
> @@ -0,0 +1,197 @@
> +An introduction to libcamera
> +============================
> +
> +The Video for Linux 2 (`V4L2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_) API provides kernel drivers for devices that produce and manipulate
> +images and video. However, Linux was missing a convenient way for application developers to take
> +advantage of these kernel drivers in userspace. Vendors of devices that provide image input
> +sources, referred to as "image signal processors" (ISPs), sometimes contribute open-source V4L2
> +drivers. However, ISPs vary so much between vendors that writing portable ISP-based applications
> +is a hard task for developers.
> +
> +The libcamera library aims to fill this gap by providing a complete userspace camera
> +stack for Linux-based systems that supports a wide variety of ISPs, including systems with multiple ISPs attached.
> +
> +The library currently supports:
> +
> +-  Hardware
> +
> +   -  Intel IPU3
> +   -  Rockchip RK3399
> +   -  RaspberryPi 3 and 4
> +   -  USB video device class (UVC) cameras
> +   -  Virtual media controller (vimc) driver
> +
> +-  Software
> +
> +   - ChromeOS support through an Android HAL3 adaptation layer
nit: `ChromeOS` alignment not in line with other points below.
> +   -  Image processing algorithms
> +   -  A V4L2 compatibility layer for existing V4L2 based applications
> +   -  A gstreamer element for gstreamer pipeline based applications.
> +
> +The library provides a public API for managing ISPs, frame capture, video streams, frame and
> +request metadata, events and callbacks, image processing, and more.
> +
> +Where libcamera sits in the camera stack
> +----------------------------------------
> +
> +The libcamera library sits in userspace, just on top of the kernel drivers that directly interact
> +with hardware and the V4L2 family of APIs (Media Controller, V4L2 Video Device, and V4L2 sub-device APIs).
> +
> +When using libcamera in a camera stack, it is the core component, taking control of all camera
> +devices, and exposing a native C++ API to upper layers of the framework. Other language bindings are in development.
> +
> +Compatibility Layers
> +~~~~~~~~~~~~~~~~~~~~
> +
> +In a layer above the core framework are compatibility libraries to help existing applications and their developers.
> +
> +V4L2 Compatibility Layer
> +^^^^^^^^^^^^^^^^^^^^^^^^
> +
> +To emulate high-level V4L2 camera devices and capture all access to camera devices, libcamera uses
> +a shared library. The shared library is transparent to libcamera-based applications and injected
> +into a process address space using dynamic linking with ``LD_PRELOAD``.
> +
> +The compatibility layer exposes camera device features on a best-effort basis and aims for the
> +level of features traditionally available from a UVC camera designed for video conferencing.
> +
> +Android Camera HAL
> +^^^^^^^^^^^^^^^^^^
> +
> +The libcamera library offers Android camera support through a generic `hardware abstraction layer (HAL) <https://source.android.com/devices/camera/camera3_requests_hal>`_ implementation.
> +The HAL focuses on supporting features required by Android that are missing from libcamera, such as JPEG encoding.
> +
> +The Android camera HAL implementation initially targets the ``LIMITED``
> +`hardware level <https://source.android.com/devices/camera/versioning#camera_api2>`_,
> +with support for the ``FULL`` level added gradually afterwards.

Here I feel we should mention that ChromeOS actually uses the Android HAL
layer for its camera, to reiterate the point above ("ChromeOS support
through an Android HAL3 adaptation layer").

This is why libcamera ventures into Android's HAL layer (for now at least).
> +
> +gstreamer element
> +^^^^^^^^^^^^^^^^^
> +
> +The library provides `a gstreamer element <https://gstreamer.freedesktop.org/documentation/application-development/basics/elements.html?gi-language=c>`_ that routes libcamera data for
> +further processing in a gstreamer pipeline.
> +
> +libcamera Architecture
> +----------------------
> +
> +The libcamera library exposes one unified API but is built internally from
> +reusable components that provide hardware-specific support and configuration
> +behind a device-agnostic API.
> +
> +Camera Manager
> +~~~~~~~~~~~~~~
> +
> +The Camera Manager enumerates cameras at runtime and instantiates a `Pipeline Handler`_ to manage each Camera
> +device that libcamera supports. The Camera Manager supports hotplug detection
> +and notification events when supported by the underlying kernel devices.
> +
> +There is only ever one instance of the Camera Manager running per application. Each application's
> +instance of the Camera Manager ensures that only a single application can take control of a camera
> +device at once.
> +
> +`Read the Camera Manager API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1CameraManager.html>`_
> +
> +Camera Device
> +~~~~~~~~~~~~~
> +
> +The camera class represents a single item of camera hardware that is capable of producing one or more image streams, and provides the API to interact with the underlying device.
> +
> +If a system has multiple instances of the same hardware attached, each has its own instance of the camera class.
> +
> +The Camera class exposes full control of the device to upper layers of libcamera through the
> +public API, making it the highest-level object libcamera exposes, and the one all other API
> +operations interact with, from configuration to capture.
> +
> +`Read the Camera API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Camera.html>`_
> +
> +Frame controls
> +^^^^^^^^^^^^^^
> +
> +Depending on the capabilities of a camera device and the hardware it is connected
> +to, libcamera supports controlling capture parameters for each stream on a per-frame basis.
> +
> +These controls include auto exposure, gain, brightness, contrast, lux, white balance, color
> +temperature, and saturation.
> +
> +`Read the Control API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Control.html>`_
> +
> +Pipeline Handler
> +~~~~~~~~~~~~~~~~
> +
> +Pipeline handlers are the abstraction layer for platform-specific hardware configuration. They
> +access and control hardware through the V4L2 and Media Controller kernel interfaces, and implement
> +an internal API to control the ISP and Capture components of a pipeline directly.
> +
> +Pipeline handlers create Camera device instances based on the devices they detect and support on
> +the running system.
> +A pipeline handler manages most aspects of interacting with a camera device including:
> +
> +-  frame controls
> +-  pipelines and stream configuration
> +-  the data the camera produces, and the buffers it needs to hold the data
> +-  granting access to camera data
> +
> +Pipeline handlers form part of the libcamera codebase, and developers need to implement them for
> +complex hardware with an ISP that requires device-specific configurations.
> +
> +`Read the PipelineHandler API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html>`_
> +
> +Image Processing Algorithms
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +An image processing algorithm (IPA) component implements, behind a private API, the
> +3A (Auto-Exposure, Auto-White Balance, and Auto-Focus) algorithms.
> +
> +Each supported camera device has its own IPA component that runs on the CPU and
> +interacts with the kernel camera devices to control hardware image processing
> +based on the parameters supplied by upper layers, and helps maintain state,
> +closing the control loop of the ISP.
> +
> +An IPA component can be part of the libcamera code base, in which case the same
> +license covers it, or provided externally as an open-source or closed-source component.
> +
> +The component is sandboxed and can only interact with libcamera through specifically
> +marked APIs. A component has no direct access to kernel camera devices, and dmabuf
> +instances explicitly passed to the component control its access to image data and
> +metadata. The component should run in a process separate from the main libcamera
> +process, and has a restricted view of the system, including no access to networking APIs
> +and limited access to file systems.
> +
> +While libcamera requires sandboxing, the implementation is platform-specific, and handled by
> +platform-specific plugins.
> +
> +To support this security and sandboxing model, libcamera provides an IPC mechanism
> +for an IPA and Pipeline Handler to communicate, but developers need to create the
> +API themselves. Platform vendors can also use any other IPC mechanism that supports
> +passing file descriptors.
> +
> +3A and Image Enhancement Algorithms
> +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> +
> +Camera devices should implement auto exposure, auto gain, and auto white balance,
> +and those that include a focus lens should also implement autofocus.
> +
> +Device vendors implement the control methods required to control these algorithms in hardware or
> +firmware outside of libcamera, or in an IPA component.
> +
> +.. TODO: Add link to guide when completed
> +
> +Helpers and Support Classes
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +.. TODO: Feel like this is implied in other places of the doc
> +
> +To help developers create device-specific pipeline handlers and image processing
> +algorithms, libcamera provides helpers and support classes that sit on top of the
> +Media Controller and V4L2 APIs.
> +
> +Glossary
> +--------
> +
> +-  **ISP**: Image Signal Processor.
> +-  **Media controller API**: `The Linux Kernel API <https://www.kernel.org/doc/html/v4.10/media/uapi/mediactl/media-controller-intro.html>`_ that handles audio and video input and output.
Small question: anything specific to do with the v4.10 version of the
document here? Shouldn't we point to something newer / latest?
A quick search gave me v5.6
(https://www.kernel.org/doc/html/v5.6/media/uapi/mediactl/media-controller-intro.html),
based on the latest release, I guess.
I don't know what to do here specifically; maybe ask around a bit, but
generally something newer than v5.0 should be used, I guess.
> +-  **V4L2**: `Video for Linux API version 2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_. Device drivers and API for video capture on Linux.
> +-  **UVC camera**: `USB Video Class <https://en.wikipedia.org/wiki/USB_video_device_class>`_ that describes devices capable of streaming video over the UVC protocol.
> +-  **3A**: Common algorithms used by camera devices for auto-exposure, auto-white balance, and auto-focus.
> +-  **IPC**: `Inter-process communications based protocol <https://en.wikipedia.org/wiki/Inter-process_communication>`_.


Thanks for the work.

Reviewed-by: Umang Jain <email@uajain.com>
Chris Chinchilla July 10, 2020, 10:29 a.m. UTC | #2
Thanks Umang, some great small clarifications, just about to send a new patch

Chris

Patch

diff --git a/Documentation/introduction.rst b/Documentation/introduction.rst
new file mode 100644
index 0000000..cf914c7
--- /dev/null
+++ b/Documentation/introduction.rst
@@ -0,0 +1,197 @@ 
+An introduction to libcamera
+============================
+
+The Video for Linux 2 (`V4L2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_) API provides kernel drivers for devices that produce and manipulate
+images and video. However, Linux was missing a convenient way for application developers to take
+advantage of these kernel drivers in userspace. Vendors of devices that provide image input
+sources, referred to as "image signal processors" (ISPs), sometimes contribute open-source V4L2
+drivers. However, ISPs vary so much between vendors that writing portable ISP-based applications
+is a hard task for developers.
+
+The libcamera library aims to fill this gap by providing a complete userspace camera 
+stack for Linux-based systems that supports a wide variety of ISPs, including systems with multiple ISPs attached.
+
+The library currently supports:
+
+-  Hardware
+
+   -  Intel IPU3
+   -  Rockchip RK3399
+   -  RaspberryPi 3 and 4
+   -  USB video device class (UVC) cameras
+   -  Virtual media controller (vimc) driver
+
+-  Software
+
+   -  ChromeOS support through an Android HAL3 adaptation layer
+   -  Image processing algorithms
+   -  A V4L2 compatibility layer for existing V4L2 based applications
+   -  A gstreamer element for gstreamer pipeline based applications.
+
+The library provides a public API for managing ISPs, frame capture, video streams, frame and 
+request metadata, events and callbacks, image processing, and more.
+
+Where libcamera sits in the camera stack
+----------------------------------------
+
+The libcamera library sits in userspace, just on top of the kernel drivers that directly interact 
+with hardware and the V4L2 family of APIs (Media Controller, V4L2 Video Device, and V4L2 sub-device APIs).
+
+When using libcamera in a camera stack, it is the core component, taking control of all camera 
+devices, and exposing a native C++ API to upper layers of the framework. Other language bindings are in development.
+
+Compatibility Layers
+~~~~~~~~~~~~~~~~~~~~
+
+In a layer above the core framework are compatibility libraries to help existing applications and their developers.
+
+V4L2 Compatibility Layer
+^^^^^^^^^^^^^^^^^^^^^^^^
+
+To emulate high-level V4L2 camera devices and capture all access to camera devices, libcamera uses 
+a shared library. The shared library is transparent to libcamera-based applications and injected 
+into a process address space using dynamic linking with ``LD_PRELOAD``.
+
+The compatibility layer exposes camera device features on a best-effort basis and aims for the 
+level of features traditionally available from a UVC camera designed for video conferencing.
+
+Android Camera HAL
+^^^^^^^^^^^^^^^^^^
+
+The libcamera library offers Android camera support through a generic `hardware abstraction layer (HAL) <https://source.android.com/devices/camera/camera3_requests_hal>`_ implementation. 
+The HAL focuses on supporting features required by Android that are missing from libcamera, such as JPEG encoding.
+
+The Android camera HAL implementation initially targets the ``LIMITED``
+`hardware level <https://source.android.com/devices/camera/versioning#camera_api2>`_,
+with support for the ``FULL`` level added gradually afterwards. ChromeOS uses this
+Android HAL3 interface for its camera stack, as noted above.
+
+gstreamer element
+^^^^^^^^^^^^^^^^^
+
+The library provides `a gstreamer element <https://gstreamer.freedesktop.org/documentation/application-development/basics/elements.html?gi-language=c>`_ that routes libcamera data for 
+further processing in a gstreamer pipeline.
+
+libcamera Architecture
+----------------------
+
+The libcamera library exposes one unified API but is built internally from
+reusable components that provide hardware-specific support and configuration
+behind a device-agnostic API.
+
+Camera Manager
+~~~~~~~~~~~~~~
+
+The Camera Manager enumerates cameras at runtime and instantiates a `Pipeline Handler`_ to manage each Camera
+device that libcamera supports. The Camera Manager supports hotplug detection
+and notification events when supported by the underlying kernel devices.
+
+There is only ever one instance of the Camera Manager running per application. Each application's 
+instance of the Camera Manager ensures that only a single application can take control of a camera 
+device at once.
+
+`Read the Camera Manager API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1CameraManager.html>`_
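The enumeration and exclusive-access behaviour described above can be sketched with a toy model (hypothetical class shapes and device names, not the real libcamera API):

```cpp
#include <cassert>
#include <map>
#include <memory>
#include <string>
#include <vector>

// Toy model (not the real libcamera API): a camera that can be acquired
// by at most one user at a time.
class Camera {
public:
    explicit Camera(std::string id) : id_(std::move(id)) {}
    const std::string &id() const { return id_; }
    // Returns false if the camera is already held by someone else.
    bool acquire() { return acquired_ ? false : (acquired_ = true); }
    void release() { acquired_ = false; }

private:
    std::string id_;
    bool acquired_ = false;
};

// Toy manager: enumerates cameras and hands out shared references.
class CameraManager {
public:
    // The real library enumerates kernel media devices; here we fake
    // two devices purely for illustration.
    void start() {
        cameras_["/dev/media0"] = std::make_shared<Camera>("/dev/media0");
        cameras_["/dev/media1"] = std::make_shared<Camera>("/dev/media1");
    }
    std::shared_ptr<Camera> get(const std::string &id) {
        auto it = cameras_.find(id);
        return it == cameras_.end() ? nullptr : it->second;
    }
    std::vector<std::shared_ptr<Camera>> cameras() const {
        std::vector<std::shared_ptr<Camera>> out;
        for (const auto &kv : cameras_)
            out.push_back(kv.second);
        return out;
    }

private:
    std::map<std::string, std::shared_ptr<Camera>> cameras_;
};
```

A second acquire() on the same camera fails until it is released, mirroring the rule that only one application controls a device at once.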
+
+Camera Device
+~~~~~~~~~~~~~
+
+The camera class represents a single item of camera hardware that is capable of producing one or more image streams, and provides the API to interact with the underlying device. 
+
+If a system has multiple instances of the same hardware attached, each has its own instance of the camera class.
+
+The Camera class exposes full control of the device to upper layers of libcamera through the
+public API, making it the highest-level object libcamera exposes, and the one all other API
+operations interact with, from configuration to capture.
+
+`Read the Camera API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Camera.html>`_
+
+Frame controls
+^^^^^^^^^^^^^^
+
+Depending on the capabilities of a camera device and the hardware it is connected 
+to, libcamera supports controlling capture parameters for each stream on a per-frame basis.
+
+These controls include auto exposure, gain, brightness, contrast, lux, white balance, color 
+temperature, and saturation.
+
+`Read the Control API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Control.html>`_
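A per-frame control list of this kind can be sketched as a map from control IDs to typed values (illustrative IDs and types only; libcamera's actual ControlList and control definitions differ):

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <variant>

// Hypothetical control IDs for illustration; the real library defines
// its own identifiers and value types.
enum ControlId : uint32_t { Brightness, Contrast, ExposureTime, AwbEnable };
using ControlValue = std::variant<bool, int32_t, float>;

// A list of controls attached to a single capture request: each frame
// can carry its own set of parameter overrides.
struct ControlList {
    std::map<ControlId, ControlValue> values;

    void set(ControlId id, ControlValue v) { values[id] = v; }
    template <typename T> T get(ControlId id) const {
        return std::get<T>(values.at(id));
    }
    bool contains(ControlId id) const { return values.count(id) != 0; }
};
```

An application would fill one such list per request, leaving out any control it does not want to change for that frame.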
+
+Pipeline Handler
+~~~~~~~~~~~~~~~~
+
+Pipeline handlers are the abstraction layer for platform-specific hardware configuration. They 
+access and control hardware through the V4L2 and Media Controller kernel interfaces, and implement 
+an internal API to control the ISP and Capture components of a pipeline directly.
+
+Pipeline handlers create Camera device instances based on the devices they detect and support on
+the running system.
+
+A pipeline handler manages most aspects of interacting with a camera device including:
+
+-  frame controls
+-  pipelines and stream configuration
+-  the data the camera produces, and the buffers it needs to hold the data
+-  granting access to camera data
+
+Pipeline handlers form part of the libcamera codebase, and developers need to implement them for 
+complex hardware with an ISP that requires device-specific configurations.
+
+`Read the PipelineHandler API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html>`_
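The detect-and-claim pattern described above might look roughly like this (a simplified sketch; the real PipelineHandler interface is considerably richer):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal stand-in for a kernel media device, identified by its driver.
struct MediaDevice {
    std::string driver;
};

// Sketch of the pattern: each handler inspects a media device and, if it
// supports it, creates camera instances (names here stand in for Camera
// objects). Not libcamera's actual interface.
class PipelineHandler {
public:
    virtual ~PipelineHandler() = default;
    virtual std::vector<std::string> match(const MediaDevice &dev) = 0;
};

class UVCPipelineHandler : public PipelineHandler {
public:
    std::vector<std::string> match(const MediaDevice &dev) override {
        if (dev.driver == "uvcvideo")
            return { "uvc-camera" };
        return {};  // not our hardware, let another handler claim it
    }
};
```

The Camera Manager would try each registered handler against each media device until one claims it.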
+
+Image Processing Algorithms
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+An image processing algorithm (IPA) component implements, behind a private API, the
+3A (Auto-Exposure, Auto-White Balance, and Auto-Focus) algorithms.
+
+Each supported camera device has its own IPA component that runs on the CPU and 
+interacts with the kernel camera devices to control hardware image processing 
+based on the parameters supplied by upper layers, and helps maintain state, 
+closing the control loop of the ISP.
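Closing the control loop can be illustrated with a toy proportional auto-exposure controller (hypothetical linear sensor model and gains; real IPAs consume ISP statistics and are far more elaborate):

```cpp
#include <cassert>

// Hypothetical linear sensor model: measured brightness is proportional
// to the exposure time programmed into the kernel device.
double measured_brightness(double exposure)
{
    return 0.004 * exposure;
}

// One iteration of a toy proportional controller: nudge the exposure
// toward the target brightness based on the previous frame's
// measurement, closing the loop once per frame.
double ae_step(double exposure, double target, double measured)
{
    return exposure * (1.0 + 0.5 * (target - measured) / target);
}
```

Run per frame, the exposure converges to the value at which the measured brightness matches the target.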
+
+An IPA component can be part of the libcamera code base, in which case the same 
+license covers it, or provided externally as an open-source or closed-source component.
+
+The component is sandboxed and can only interact with libcamera through specifically 
+marked APIs. A component has no direct access to kernel camera devices, and dmabuf 
+instances explicitly passed to the component control its access to image data and 
+metadata. The component should run in a process separate from the main libcamera 
+process, and has a restricted view of the system, including no access to networking APIs
+and limited access to file systems.
+
+While libcamera requires sandboxing, the implementation is platform-specific, and handled by 
+platform-specific plugins.
+
+To support this security and sandboxing model, libcamera provides an IPC mechanism 
+for an IPA and Pipeline Handler to communicate, but developers need to create the 
+API themselves. Platform vendors can also use any other IPC mechanism that supports 
+passing file descriptors.
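One standard Linux mechanism that supports passing file descriptors between processes is SCM_RIGHTS ancillary data over a Unix-domain socket; a minimal sketch (generic POSIX/Linux code, not libcamera's actual transport):

```cpp
#include <cassert>
#include <cstring>
#include <sys/socket.h>
#include <sys/uio.h>
#include <unistd.h>

// Send a file descriptor over a Unix-domain socket using SCM_RIGHTS.
// Returns 0 on success, -1 on error.
int send_fd(int sock, int fd)
{
    char data = 'F';  /* at least one byte of payload is required */
    struct iovec iov { &data, 1 };
    char ctrl[CMSG_SPACE(sizeof(int))] = {};
    struct msghdr msg = {};
    msg.msg_iov = &iov;
    msg.msg_iovlen = 1;
    msg.msg_control = ctrl;
    msg.msg_controllen = sizeof(ctrl);
    struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
    cmsg->cmsg_level = SOL_SOCKET;
    cmsg->cmsg_type = SCM_RIGHTS;
    cmsg->cmsg_len = CMSG_LEN(sizeof(int));
    std::memcpy(CMSG_DATA(cmsg), &fd, sizeof(int));
    return sendmsg(sock, &msg, 0) == 1 ? 0 : -1;
}

// Receive a file descriptor sent with send_fd(); returns -1 on error.
int recv_fd(int sock)
{
    char data;
    struct iovec iov { &data, 1 };
    char ctrl[CMSG_SPACE(sizeof(int))] = {};
    struct msghdr msg = {};
    msg.msg_iov = &iov;
    msg.msg_iovlen = 1;
    msg.msg_control = ctrl;
    msg.msg_controllen = sizeof(ctrl);
    if (recvmsg(sock, &msg, 0) != 1)
        return -1;
    struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
    if (!cmsg || cmsg->cmsg_type != SCM_RIGHTS)
        return -1;
    int fd;
    std::memcpy(&fd, CMSG_DATA(cmsg), sizeof(int));
    return fd;
}
```

The kernel duplicates the descriptor into the receiving process, which is what makes sharing dmabuf handles across the sandbox boundary possible.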
+
+3A and Image Enhancement Algorithms
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Camera devices should implement auto exposure, auto gain, and auto white balance, 
+and those that include a focus lens should also implement autofocus.
+
+Device vendors implement the control methods required to control these algorithms in hardware or 
+firmware outside of libcamera, or in an IPA component.
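As one concrete example of such an algorithm, a textbook gray-world auto-white-balance computes per-channel gains from the channel means (illustrative only; shipping IPAs are considerably more sophisticated):

```cpp
#include <array>
#include <cassert>
#include <cmath>
#include <vector>

// Toy gray-world AWB: assume the scene averages to gray, and compute
// per-channel gains so each colour channel's mean matches the overall
// mean. Pixels are {R, G, B} triples in [0, 1].
std::array<double, 3> gray_world_gains(const std::vector<std::array<double, 3>> &pixels)
{
    std::array<double, 3> mean = { 0.0, 0.0, 0.0 };
    for (const auto &px : pixels)
        for (int c = 0; c < 3; ++c)
            mean[c] += px[c];
    for (int c = 0; c < 3; ++c)
        mean[c] /= static_cast<double>(pixels.size());
    const double gray = (mean[0] + mean[1] + mean[2]) / 3.0;
    return { gray / mean[0], gray / mean[1], gray / mean[2] };
}
```

Applying the returned gains to each channel neutralises a uniform colour cast, e.g. from a warm light source.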
+
+.. TODO: Add link to guide when completed
+
+Helpers and Support Classes
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+.. TODO: Feel like this is implied in other places of the doc
+
+To help developers create device-specific pipeline handlers and image processing 
+algorithms, libcamera provides helpers and support classes that sit on top of the 
+Media Controller and V4L2 APIs.
+
+Glossary
+--------
+
+-  **ISP**: Image Signal Processor.
+-  **Media controller API**: `The Linux kernel API <https://www.kernel.org/doc/html/v5.6/media/uapi/mediactl/media-controller-intro.html>`_ that handles audio and video input and output.
+-  **V4L2**: `Video for Linux API version 2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_. Device drivers and API for video capture on Linux.
+-  **UVC camera**: `USB Video Class <https://en.wikipedia.org/wiki/USB_video_device_class>`_ that describes devices capable of streaming video over the UVC protocol.
+-  **3A**: Common algorithms used by camera devices for auto-exposure, auto-white balance, and auto-focus.
+-  **IPC**: `Inter-process communications based protocol <https://en.wikipedia.org/wiki/Inter-process_communication>`_.