[libcamera-devel] Create libcamera overview and glossary

Message ID 20200626111732.62852-1-chris@gregariousmammal.com
State Superseded
Series
  • [libcamera-devel] Create libcamera overview and glossary

Commit Message

Chris Chinchilla June 26, 2020, 11:17 a.m. UTC
From: Chris Chinchilla <chris@gregariousmammal.com>

Creates a libcamera overview guide and glossary to give new developers an idea of why they should use libcamera, its architecture, and features it offers.

Signed-off-by: Chris Chinchilla <chris@gregariousmammal.com>
---
 Documentation/introduction.rst | 195 +++++++++++++++++++++++++++++++++
 1 file changed, 195 insertions(+)
 create mode 100644 Documentation/introduction.rst

Comments

Umang Jain June 26, 2020, 3:38 p.m. UTC | #1
Hi Chris,

Thank you for the work.

I read through the entire overview and have added my thoughts.
However, I don't feel I'm the right person to review "Image Processing
Algorithms" (and the sections below it), so it would be helpful for a
more experienced person to review those parts.

On 6/26/20 4:47 PM, chris@gregariousmammal.com wrote:
> From: Chris Chinchilla <chris@gregariousmammal.com>
You can drop this as you already have 'Signed-off' Tag below.
>
> Creates a libcamera overview guide and glossary to give new developers an idea of why they should use libcamera, its architecture, and features it offers.
Can you please honor 72-character line limit as per git's commit message
convention here? Then it would be perfect :)
>
> Signed-off-by: Chris Chinchilla <chris@gregariousmammal.com>
Perfect. :)
> ---
>   Documentation/introduction.rst | 195 +++++++++++++++++++++++++++++++++
>   1 file changed, 195 insertions(+)
>   create mode 100644 Documentation/introduction.rst
>
> diff --git a/Documentation/introduction.rst b/Documentation/introduction.rst
> new file mode 100644
> index 0000000..0539e60
> --- /dev/null
> +++ b/Documentation/introduction.rst
> @@ -0,0 +1,195 @@
> +An introduction to libcamera
> +============================
> +
> +The Video for Linux 2 (`V4L2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_) API provides kernel drivers for devices that provide and manipulate
> +images and video. However, Linux was missing a convenient way for application developers to take
> +advantage of these kernel drivers in userspace. Vendors of devices that provide image input sources
s/devices/camera devices/
>   
> +referred to as “image signal processors” (ISPs) sometimes contribute open-source V4L2 drivers.
> +However, ISPs vary so much, it’s a hard task for developers to write portable ISP-based
> +applications.
> +
> +The libcamera library aims to fill this gap by providing a complete userspace camera
> +stack for Linux-based systems that supports a wide variety of ISPs, including systems with multiple ISPs attached.
hmm, I didn't understand the phrase "with multiple ISPs attached".
Does a vendor generally offer multiple ISPs with the camera? Does it
happen quite often?
I honestly don't know, so a newbie reader (like me :P) probably won't
know either. Being slightly more verbose about the multiple-ISPs point
might help.
> +
> +The library currently supports:
> +
> +-  Hardware
> +
> +   -  Intel IPU3
> +   -  Rockchip RK3399
> +   -  RaspberryPi 3 and 4
> +   -  USB video device class (UVC) cameras
> +   -  Virtual media controller (vimc) driver
> +
> +-  Software
> +
> +   -  Other Linux-kernel-based operating systems (such as Android and ChromeOS)
> +   -  Image enhancement algorithms
> +   -  A V4L2 compatibility layer
> +
I personally find libcamera's ability to integrate with proprietary
IPAs a strong focus of public attention. Can we somehow highlight that
capability and say something about it a bit more 'explicitly' here? :)
No pressure though!
> +The library provides a public API for managing ISPs, frame capture, video streams, frame and
> +request metadata, events and callbacks, image processing, and more.
> +
> +Where libcamera sits in the camera stack
> +----------------------------------------
> +
> +The libcamera library sits in userspace, just on top of the kernel drivers that directly interact
> +with hardware and the V4L2 family of APIs (Media Controller, V4L2 Video Device, and V4L2 sub-device APIs).
> +
> +When using libcamera in a camera stack, it is the core component, taking control of all camera
> +devices, and exposing a native C++ API to upper layers of the framework. Other language bindings are coming soon.
> +
> +Compatibility Layers
> +~~~~~~~~~~~~~~~~~~~~
> +
> +In a layer above the core framework are compatibility libraries to help existing applications and their developers.
> +
Did you mean s/above/below/?
> +V4L2 Compatibility Layer
> +^^^^^^^^^^^^^^^^^^^^^^^^
> +
> +To emulate high-level V4L2 camera devices and capture all access to camera devices, libcamera uses
> +a shared library. The shared library is transparent to libcamera-based applications and injected
> +into a process address space using dynamic linking with ``LD_PRELOAD``.
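The ``LD_PRELOAD`` interposition technique described above can be sketched
with a minimal, hypothetical interposer (purely an illustration of the
mechanism, not the actual v4l2 compatibility layer; function and path names
are made up for the example):

```cpp
#include <dlfcn.h>
#include <fcntl.h>
#include <unistd.h>
#include <cassert>
#include <cstdarg>
#include <cstdio>
#include <cstring>

/*
 * Override the libc open() symbol. When this object is injected with
 * LD_PRELOAD, dynamic linking resolves the application's open() calls
 * here first; dlsym(RTLD_NEXT, ...) finds the real libc implementation.
 */
extern "C" int open(const char *path, int flags, ...)
{
	using open_fn = int (*)(const char *, int, ...);
	static open_fn real_open =
		reinterpret_cast<open_fn>(dlsym(RTLD_NEXT, "open"));

	/* Log accesses to video device nodes before forwarding them. */
	if (strncmp(path, "/dev/video", 10) == 0)
		fprintf(stderr, "intercepted open(%s)\n", path);

	mode_t mode = 0;
	if (flags & O_CREAT) {
		va_list ap;
		va_start(ap, flags);
		mode = va_arg(ap, mode_t);
		va_end(ap);
	}

	return real_open(path, flags, mode);
}
```

Built as a shared object (e.g. ``g++ -shared -fPIC interpose.cpp -o
libintercept.so -ldl``) and run with ``LD_PRELOAD=./libintercept.so app``,
every ``open()`` of a ``/dev/video*`` node in ``app`` would be logged before
being forwarded to libc.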
> +
> +The compatibility layer exposes camera device features on a best-effort basis and aims for the
> +level of features traditionally available from a UVC camera designed for video conferencing.
> +
> +Android Camera HAL
> +^^^^^^^^^^^^^^^^^^
> +
> +The libcamera library offers Android camera support through a generic `hardware abstraction layer (HAL) <https://source.android.com/devices/camera/camera3_requests_hal>`_ implementation.
> +The HAL focuses on supporting features required by Android that are missing from libcamera, such as JPEG encoding.
> +
> +The Android camera HAL implementation initially targets the ``LIMITED``
> +`hardware level <https://source.android.com/devices/camera/versioning#camera_api2>`_,
> +with support for the ``FULL`` level then gradually implemented.
> +
> +gstreamer element
> +^^^^^^^^^^^^^^^^^
> +
> +The library provides `a gstreamer element <https://gstreamer.freedesktop.org/documentation/application-development/basics/elements.html?gi-language=c>`_ for routing libcamera data to gstreamer for processing.
> +
> +libcamera Architecture
> +----------------------
> +
> +The libcamera library exposes one unified API, but behind that is built from a
I would slightly rephrase: s/behind that/on the backend side, it is built.../
> +set of specific components for each supported device, components for different
> +functionalities, and helpers for those components.
> +
> +Camera Devices Manager
> +~~~~~~~~~~~~~~~~~~~~~~
> +
> +Provides access and manages all cameras on a system. It enumerates cameras at
> +build and runtime and supports a hotplug notification mechanism.
> +
> +There is only ever one instance of the Camera Devices Manager running per application.
> +
> +`Read the Camera Manager API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1CameraManager.html>`_
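As a rough sketch of what this section describes, enumerating cameras through
the manager looks approximately like this (a hedged sketch against the
CameraManager API linked above; it needs libcamera installed to build, and
accessor names such as ``id()`` may differ between libcamera versions):

```cpp
#include <iostream>

#include <libcamera/libcamera.h>

int main()
{
	/* Only one CameraManager instance may exist per application. */
	libcamera::CameraManager manager;
	manager.start();

	/* Cameras are enumerated at runtime, not build time. */
	for (const auto &camera : manager.cameras())
		std::cout << "Found camera: " << camera->id() << std::endl;

	manager.stop();
	return 0;
}
```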
> +
> +Camera Device
> +~~~~~~~~~~~~~
> +
> +A camera device is a device capable of producing one or more image streams from
> +a single image source and can include multiple identical camera devices.
The 'multiple identical camera devices' bit confuses me here. I didn't 
have time right
now to dig into this but I had a brief conversation on IRC. Attaching 
here for your reference:

<uajain> From camera.cpp, can somebody clear me one question for me
<uajain> > * A camera device contains a single image source, and 
separate camera device
<uajain> >  * instances relate to different image sources.
<uajain> The first 'camera device'  in the sentence acutally means a 
"media device" (Like a phone/laptop)
<uajain> and the second 'camera device' is a camera (i.e. instance of 
Camera class)
<uajain> Am I making any sense?
<pinchartl> no, both mean a Camera instance
<pinchartl> this means that there's a 1:1 mapping between camera sensors 
and Camera instances
<pinchartl> (for now at least)
<uajain> hmm, I see :S
-*- uajain digs more
<uajain> I am acutally trying to de-reference Chris' statement on 
"libcamera overview and glossary patch"
<uajain> +A camera device is a device capable of producing one or more 
image streams from
<uajain> +a single image source and can include multiple identical 
camera devices.
<uajain> The 'can include multiple identical camera devices' part is 
tripping me up
<pinchartl> maybe he meant that a system can include multiple identical 
camera devices, as in multiple UVC webcams for instance ?
<pinchartl> I'm not sure
<uajain> Yeah, I need to ask him when he's online.

> +
> +The libcamera library represents a camera device to the upper layers of the library.
> +It exposes full control of the device through the public API, making it the highest
> +level object libcamera exposes, and the object that all other API operations interact
> +with from configuration to capture.
> +
> +`Read the Camera API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Camera.html>`_
> +
> +Frame controls
> +^^^^^^^^^^^^^^
> +
> +Depending on the capabilities of a camera device and the hardware it is connected
> +to, libcamera supports controlling capture parameters for each stream on a per-frame basis.
> +
> +These controls include auto exposure, gain, brightness, contrast, lux, white balance, color temperature, and saturation.
> +
> +`Read the Control API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Control.html>`_
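Per-frame controls are attached to a request before it is queued to the
camera. A hedged sketch against the control API linked above (assuming
libcamera is available; the exact set of supported controls depends on the
camera device):

```cpp
#include <libcamera/controls.h>
#include <libcamera/control_ids.h>
#include <libcamera/request.h>

/* 'request' would be obtained from Camera::createRequest(). */
void applyFrameControls(libcamera::Request *request)
{
	libcamera::ControlList &controls = request->controls();

	/* These capture parameters apply to this frame only. */
	controls.set(libcamera::controls::Brightness, 0.5f);
	controls.set(libcamera::controls::Contrast, 1.0f);
}
```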
> +
> +Pipeline Handler
> +~~~~~~~~~~~~~~~~
> +
> +A pipeline handler is a private API that manages the pipelines exposed by the kernel
> +drivers through the Media Controller and V4L2 APIs.
> +
> +Each supported camera device has its own pipeline handler component, which hides
> +and abstracts device-specific details, providing what the rest of the library
> +needs to interact with a device.
> +
> +A pipeline handler manages most aspects of interacting with a camera device including:
> +
> +-  acquiring use of the camera device
> +-  frame controls
> +-  pipelines and stream configuration
> +-  the data the camera produces, and the buffers it needs to hold the data
> +-  creating requests applications need to access camera data
I would also mention that the pipeline handler is responsible for
registering the camera with the CameraManager so that it can be made
available to applications.
> +
> +A pipeline handler is part of the libcamera codebase, and hardware vendors generally need to create them.
> +
> +`Read the PipelineHandler API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html>`_
> +
> +Image Processing Algorithms
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +An image processing algorithm (IPA) component is a private API that implements
> +3A (Auto-Exposure, Auto-White Balance, and Auto-Focus) algorithms.
> +
> +Each supported camera device has its own IPA component that runs on the CPU and
> +interacts with the kernel camera devices to control hardware image processing
> +based on the parameters supplied by upper layers, and helps maintain state,
> +closing the control loop of the ISP.
> +
> +An IPA component can be part of the libcamera code base, in which case the same
> +license covers it, or provided externally as an open-source or closed-source component.
> +
> +.. TODO: Better in the IPA guide?
> +
> +The component is sandboxed and can only interact with libcamera through specifically
> +marked APIs. A component has no direct access to kernel camera devices, and dmabuf
> +instances explicitly passed to the component control its access to image data and
> +metadata. The component should run in a process separate from the main libcamera
> +process, and has a restricted view of the system, including no access to networking APIs
> +and limited access to file systems.
> +
> +The sandboxing mechanism isn’t defined by libcamera. Platform vendors need to provide sandboxing
> +mechanisms as a plugin.
> +
> +To support this security and sandboxing model, libcamera provides an IPC mechanism
> +for an IPA and Pipeline Handler to communicate, but developers need to create the
> +API themselves. Platform vendors can also use any other IPC mechanism that supports
> +passing file descriptors.
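The file-descriptor-passing requirement mentioned above is the standard Unix
``SCM_RIGHTS`` mechanism. As a self-contained illustration (not libcamera's
actual IPC implementation), a descriptor can be handed across a Unix-domain
socket like this:

```cpp
#include <sys/socket.h>
#include <sys/uio.h>
#include <unistd.h>
#include <cassert>
#include <cstring>

/*
 * Pass an open file descriptor across a Unix-domain socket using an
 * SCM_RIGHTS ancillary message. Returns 0 on success, -1 on error.
 */
int send_fd(int sock, int fd)
{
	char payload = 'F'; /* at least one byte of real data is required */
	iovec iov{};
	iov.iov_base = &payload;
	iov.iov_len = 1;

	char ctrl[CMSG_SPACE(sizeof(int))] = {};
	msghdr msg{};
	msg.msg_iov = &iov;
	msg.msg_iovlen = 1;
	msg.msg_control = ctrl;
	msg.msg_controllen = sizeof(ctrl);

	cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
	cmsg->cmsg_level = SOL_SOCKET;
	cmsg->cmsg_type = SCM_RIGHTS;
	cmsg->cmsg_len = CMSG_LEN(sizeof(int));
	memcpy(CMSG_DATA(cmsg), &fd, sizeof(int));

	return sendmsg(sock, &msg, 0) == 1 ? 0 : -1;
}

/*
 * Receive a descriptor sent with send_fd(). The kernel installs a
 * duplicate of the sender's descriptor in the receiver's fd table.
 */
int recv_fd(int sock)
{
	char payload;
	iovec iov{};
	iov.iov_base = &payload;
	iov.iov_len = 1;

	char ctrl[CMSG_SPACE(sizeof(int))] = {};
	msghdr msg{};
	msg.msg_iov = &iov;
	msg.msg_iovlen = 1;
	msg.msg_control = ctrl;
	msg.msg_controllen = sizeof(ctrl);

	if (recvmsg(sock, &msg, 0) != 1)
		return -1;

	cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
	if (!cmsg || cmsg->cmsg_level != SOL_SOCKET ||
	    cmsg->cmsg_type != SCM_RIGHTS)
		return -1;

	int fd;
	memcpy(&fd, CMSG_DATA(cmsg), sizeof(int));
	return fd;
}
```

Because the kernel duplicates the descriptor on delivery, the receiving
process can use the buffer (e.g. a dmabuf) without ever having rights to
open the underlying device itself, which is what makes this mechanism a
good fit for the sandboxing model described above.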
> +
> +3A and Image Enhancement Algorithms
> +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> +
> +Camera devices should implement auto exposure, auto gain, and auto white balance,
> +and those that include a focus lens should also implement autofocus.
> +
> +Vendors can implement these algorithms in hardware or firmware outside of libcamera,
> +or in an IPA component.
> +
> +
> +.. TODO: Add link to guide when completed
> +
> +Helpers and Support Classes
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +.. TODO: Feel like this is implied in other places of the doc
> +
> +To help developers create device-specific pipeline handlers and image processing
> +algorithms, libcamera provides helpers and support classes that sit on top of the
> +Media Controller and V4L2 APIs.
> +
> +Glossary
> +--------
> +
> +-  **ISP**: Image Signal Processor.
> +-  **SoC**: System on a Chip.
> +-  **Media controller API**: `The Linux Kernel API <https://www.kernel.org/doc/html/v4.10/media/uapi/mediactl/media-controller-intro.html>`_ that handles audio and video input and output.
> +-  **V4L2**: `Video for Linux API version 2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_. Device drivers and API for video capture on Linux.
> +-  **UVC camera**: `USB Video Class <https://en.wikipedia.org/wiki/USB_video_device_class>`_ that describes devices capable of streaming video.
> +-  **3A**: Common algorithms used by camera devices for auto-exposure, auto-white balance and auto-focus).
> +-  **IPC**: `Inter-process communications based protocol <https://en.wikipedia.org/wiki/Inter-process_communication>`_.
> _______________________________________________
> libcamera-devel mailing list
> libcamera-devel@lists.libcamera.org
> https://lists.libcamera.org/listinfo/libcamera-devel
Kieran Bingham June 30, 2020, 12:42 p.m. UTC | #2
Hi Chris,

On 26/06/2020 12:17, chris@gregariousmammal.com wrote:
> From: Chris Chinchilla <chris@gregariousmammal.com>
> 
> Creates a libcamera overview guide and glossary to give new developers an idea of why they should use libcamera, its architecture, and features it offers.
> 
> Signed-off-by: Chris Chinchilla <chris@gregariousmammal.com>
> ---
>  Documentation/introduction.rst | 195 +++++++++++++++++++++++++++++++++
>  1 file changed, 195 insertions(+)
>  create mode 100644 Documentation/introduction.rst
> 
> diff --git a/Documentation/introduction.rst b/Documentation/introduction.rst
> new file mode 100644
> index 0000000..0539e60
> --- /dev/null
> +++ b/Documentation/introduction.rst
> @@ -0,0 +1,195 @@
> +An introduction to libcamera
> +============================
> +
> +The Video for Linux 2 (`V4L2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_) API provides kernel drivers for devices that provide and manipulate
> +images and video. However, Linux was missing a convenient way for application developers to take 
> +advantage of these kernel drivers in userspace. Vendors of devices that provide image input sources 
> +referred to as “image signal processors” (ISPs) sometimes contribute open-source V4L2 drivers. 
> +However, ISPs vary so much, it’s a hard task for developers to write portable ISP-based
> +applications.
> +
> +The libcamera library aims to fill this gap by providing a complete userspace camera 
> +stack for Linux-based systems that supports a wide variety of ISPs, including systems with multiple ISPs attached.

I'm not sure that 'systems with multiple ISPs attached' makes sense here.

> +
> +The library currently supports:
> +
> +-  Hardware
> +
> +   -  Intel IPU3
> +   -  Rockchip RK3399
> +   -  RaspberryPi 3 and 4
> +   -  USB video device class (UVC) cameras
> +   -  Virtual media controller (vimc) driver
> +
> +-  Software
> +
> +   -  Other Linux-kernel-based operating systems (such as Android and ChromeOS)

I'd describe this as :
    - ChromeOS support through an Android HAL3 adaptation layer

> +   -  A V4L2 compatibility layer
	      -  A V4L2 compatibility layer for existing V4L2 based applications

      -  A gstreamer element for gstreamer pipeline based applications.



> +   -  Image enhancement algorithms

I'm not sure this point counts here yet, so I'd drop it.
Unless it's meant to mean 'Image processing algorithms' ?

Maybe that's the same thing...

> +
> +The library provides a public API for managing ISPs, frame capture, video streams, frame and 
> +request metadata, events and callbacks, image processing, and more.
> +
> +Where libcamera sits in the camera stack
> +----------------------------------------
> +
> +The libcamera library sits in userspace, just on top of the kernel drivers that directly interact 
> +with hardware and the V4L2 family of APIs (Media Controller, V4L2 Video Device, and V4L2 sub-device APIs).
> +
> +When using libcamera in a camera stack, it is the core component, taking control of all camera 
> +devices, and exposing a native C++ API to upper layers of the framework. Other language bindings are coming soon.

I'd maybe refrain from saying 'coming soon', as that could imply 'next
week' which wouldn't be accurate.

How about " are expected potential developments."

> +
> +Compatibility Layers
> +~~~~~~~~~~~~~~~~~~~~
> +
> +In a layer above the core framework are compatibility libraries to help existing applications and their developers.
> +
> +V4L2 Compatibility Layer
> +^^^^^^^^^^^^^^^^^^^^^^^^
> +
> +To emulate high-level V4L2 camera devices and capture all access to camera devices, libcamera uses 
> +a shared library. The shared library is transparent to libcamera-based applications and injected 
> +into a process address space using dynamic linking with ``LD_PRELOAD``.
> +
> +The compatibility layer exposes camera device features on a best-effort basis and aims for the 
> +level of features traditionally available from a UVC camera designed for video conferencing.
> +
> +Android Camera HAL
> +^^^^^^^^^^^^^^^^^^
> +
> +The libcamera library offers Android camera support through a generic `hardware abstraction layer (HAL) <https://source.android.com/devices/camera/camera3_requests_hal>`_ implementation. 
> +The HAL focuses on supporting features required by Android that are missing from libcamera, such as JPEG encoding.
> +
> +The Android camera HAL implementation initially targets the ``LIMITED``
> +`hardware level <https://source.android.com/devices/camera/versioning#camera_api2>`_,
> +with support for the ``FULL`` level then gradually implemented.
> +
> +gstreamer element
> +^^^^^^^^^^^^^^^^^
> +
> +The library provides `a gstreamer element <https://gstreamer.freedesktop.org/documentation/application-development/basics/elements.html?gi-language=c>`_ for routing libcamera data to gstreamer for processing.

'for further processing in a gstreamer pipeline.' ?

> +
> +libcamera Architecture
> +----------------------
> +
> +The libcamera library exposes one unified API, but behind that is built from a

'it is built'

> +set of specific components for each supported device, components for different 
> +functionalities, and helpers for those components.

This seems quite vague, so I think we need to expand here a bit, though
I guess this is just an intro into this section ?


How about:

'is built from a collection of re-usable components to support platform
specific configuration of hardware {with/to give/to provide?} a device
agnostic API.'

> +
> +Camera Devices Manager

Camera Manager

> +~~~~~~~~~~~~~~~~~~~~~~
> +
> +Provides access and manages all cameras on a system. It enumerates cameras at 

'Provides access control' ?

> +build and runtime and supports a hotplug notification mechanism.


It can't enumerate cameras at build time. Only runtime.

"It enumerates cameras at runtime, and instantiates Pipeline Handlers to
manage each Camera device. The CameraManager supports hotplug detection
and notification events when supported by the underlying kernel devices."


> +
> +There is only ever one instance of the Camera Devices Manager running per application.

I'd probably call it the "Camera Device Manager".
And let's expand on the fact that it prevents concurrent access to
cameras, even across different applications:

"The Camera Device Manager ensures that only a single application can
take control of a camera at any one time."

Each application's instance of the CameraManager


> +
> +`Read the Camera Manager API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1CameraManager.html>`_
> +
> +Camera Device
> +~~~~~~~~~~~~~
> +
> +A camera device is a device capable of producing one or more image streams from 

I wouldn't call a "camera device" a device like that... It's a class to
represent the underlying hardware as a single object, providing the API
interface to interact with.

I see that the term 'Camera device' comes from the opening brief in the
detailed description at
http://libcamera.org/api-html/classlibcamera_1_1Camera.html#details, so
I suspect that should be improved/expanded upon there too.



> +a single image source and can include multiple identical camera devices.

I'm not sure that's quite right ("and can include multiple identical
camera devices..."). A single instance of a Camera class represents a
single underlying camera.

It can get complicated in the future, where a logical camera might be
composed of multiple sensors merged together (to produce exotic HDR or
such, I think) - but we don't need to worry about that for now. Currently,
a Camera instance is a single camera, which can produce multiple streams
from the same source capture image.

> +
> +The libcamera library represents a camera device to the upper layers of the library. 
> +It exposes full control of the device through the public API, making it the highest 
> +level object libcamera exposes, and the object that all other API operations interact 
> +with from configuration to capture.
> +
> +`Read the Camera API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Camera.html>`_
> +
> +Frame controls
> +^^^^^^^^^^^^^^
> +
> +Depending on the capabilities of a camera device and the hardware it is connected 
> +to, libcamera supports controlling capture parameters for each stream on a per-frame basis.
> +
> +These controls include auto exposure, gain, brightness, contrast, lux, white balance, color temperature, and saturation.
> +
> +`Read the Control API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Control.html>`_
> +
> +Pipeline Handler
> +~~~~~~~~~~~~~~~~
> +
> +A pipeline handler is a private API that manages the pipelines exposed by the kernel 

Hrm, it's more like -  A pipeline handler implements the internal API
required to manage ...

> +drivers through the Media Controller and V4L2 APIs.

Hrm: I'd say something more like:


"A pipeline handler implements the internal API to manage, configure,
and control the pipelines exposed by the kernel drivers through the
Media Controller and V4L2 kernel interfaces."

Or

"Pipeline handlers are the underlying abstraction layer for platform
specific configuration of hardware. They access and control hardware
through the V4L2 and Media Controller kernel interfaces, to implement an
internal API to directly control the ISP and Capture components of a
pipeline."



> +
> +Each supported camera device has its own pipeline handler component, which hides 
> +and abstracts device-specific details, providing what the rest of the library 

Not quite, Pipeline handlers 'create' Camera device instances, based on
the devices they detect and support in the running system.


> +needs to interact with a device.



> +
> +A pipeline handler manages most aspects of interacting with a camera device including:
> +
> +-  acquiring use of the camera device

Hrm ... Is that done by pipeline handlers, or the CameraManager ?

> +-  frame controls
> +-  pipelines and stream configuration
> +-  the data the camera produces, and the buffers it needs to hold the data
> +-  creating requests applications need to access camera data

Requests are 'given' to a Pipeline handler to describe what the pipeline
handler must do ...

> +
> +A pipeline handler is part of the libcamera codebase, and hardware vendors generally need to create them.

I'm not fond of that sentence, because while indeed a hardware vendor
might be best suited to create one for a target, they're probably not
always going to be the ones who do it.

maybe:

"Pipeline handlers form part of the internal libcamera codebase, and
need to be implemented for complex platforms with an ISP requiring
device-specific configuration."

> +
> +`Read the PipelineHandler API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html>`_
> +
> +Image Processing Algorithms
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +An image processing algorithm (IPA) component is a private API that implements 
> +3A (Auto-Exposure, Auto-White Balance, and Auto-Focus) algorithms.
> +
> +Each supported camera device has its own IPA component that runs on the CPU and 
> +interacts with the kernel camera devices to control hardware image processing 
> +based on the parameters supplied by upper layers, and helps maintain state, 
> +closing the control loop of the ISP.
> +
> +An IPA component can be part of the libcamera code base, in which case the same 
> +license covers it, or provided externally as an open-source or closed-source component.
> +
> +.. TODO: Better in the IPA guide?

I would say there's benefit to having some description here.

The IPA guide will be about what is needed to write and design an IPA
module.

> +
> +The component is sandboxed and can only interact with libcamera through specifically 
> +marked APIs. A component has no direct access to kernel camera devices, and dmabuf 
> +instances explicitly passed to the component control its access to image data and 
> +metadata. The component should run in a process separate from the main libcamera 
> +process, and has a restricted view of the system, including no access to networking APIs
> +and limited access to file systems.
> +
> +The sandboxing mechanism isn’t defined by libcamera. Platform vendors need to provide sandboxing 
> +mechanisms as a plugin.

Hrm ... We only really support two platforms currently: Linux and ChromeOS.

I would say we do define the requirements, but the implementation is
platform specific and handled via platform specific plugins?


> +
> +To support this security and sandboxing model, libcamera provides an IPC mechanism 
> +for an IPA and Pipeline Handler to communicate, but developers need to create the 
> +API themselves. Platform vendors can also use any other IPC mechanism that supports 
> +passing file descriptors.
> +
> +3A and Image Enhancement Algorithms
> +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> +
> +Camera devices should implement auto exposure, auto gain, and auto white balance, 
> +and those that include a focus lens should also implement autofocus.
> +
> +Vendors can implement these algorithms in hardware or firmware outside of libcamera, 
> +or in an IPA component.

I think it's more like "Vendors implement the control algorithms
required to control

> +
> +
> +.. TODO: Add link to guide when completed
> +
> +Helpers and Support Classes
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +.. TODO: Feel like this is implied in other places of the doc
> +
> +To help developers create device-specific pipeline handlers and image processing 
> +algorithms, libcamera provides helpers and support classes that sit on top of the 
> +Media Controller and V4L2 APIs.

Yes, either this section should be dropped, or it should actually
introduce and talk about the details of the MC/V4L2 classes, and the
DeviceEnumerator APIs.


> +
> +Glossary
> +--------
> +
> +-  **ISP**: Image Signal Processor.
> +-  **SoC**: System on a Chip.

SoC is never mentioned in this document so far, so I guess it doesn't
need to be in the glossary.


> +-  **Media controller API**: `The Linux Kernel API <https://www.kernel.org/doc/html/v4.10/media/uapi/mediactl/media-controller-intro.html>`_ that handles audio and video input and output.
> +-  **V4L2**: `Video for Linux API version 2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_. Device drivers and API for video capture on Linux.
> +-  **UVC camera**: `USB Video Class <https://en.wikipedia.org/wiki/USB_video_device_class>`_ that describes devices capable of streaming video.

"of streaming video over the UVC protocol."

> +-  **3A**: Common algorithms used by camera devices for auto-exposure, auto-white balance and auto-focus).
> +-  **IPC**: `Inter-process communications based protocol <https://en.wikipedia.org/wiki/Inter-process_communication>`_.
>

Patch

diff --git a/Documentation/introduction.rst b/Documentation/introduction.rst
new file mode 100644
index 0000000..0539e60
--- /dev/null
+++ b/Documentation/introduction.rst
@@ -0,0 +1,195 @@ 
+An introduction to libcamera
+============================
+
+The Video for Linux 2 (`V4L2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_) API provides kernel drivers for devices that provide and manipulate
+images and video. However, Linux was missing a convenient way for application developers to take 
+advantage of these kernel drivers in userspace. Vendors of devices that provide image input sources 
+referred to as “image signal processors” (ISPs) sometimes contribute open-source V4L2 drivers. 
+However, ISPs vary so much, it’s a hard task for developers to write portable ISP-based
+applications.
+
+The libcamera library aims to fill this gap by providing a complete userspace camera 
+stack for Linux-based systems that supports a wide variety of ISPs, including systems with multiple ISPs attached.
+
+The library currently supports:
+
+-  Hardware
+
+   -  Intel IPU3
+   -  Rockchip RK3399
+   -  RaspberryPi 3 and 4
+   -  USB video device class (UVC) cameras
+   -  Virtual media controller (vimc) driver
+
+-  Software
+
+   -  Other Linux-kernel-based operating systems (such as Android and ChromeOS)
+   -  Image enhancement algorithms
+   -  A V4L2 compatibility layer
+
+The library provides a public API for managing ISPs, frame capture, video streams, frame and 
+request metadata, events and callbacks, image processing, and more.
+
+Where libcamera sits in the camera stack
+----------------------------------------
+
+The libcamera library sits in userspace, just on top of the kernel drivers that directly interact 
+with hardware and the V4L2 family of APIs (Media Controller, V4L2 Video Device, and V4L2 sub-device APIs).
+
+When using libcamera in a camera stack, it is the core component, taking control of all camera 
+devices, and exposing a native C++ API to upper layers of the framework. Other language bindings are coming soon.
+
+Compatibility Layers
+~~~~~~~~~~~~~~~~~~~~
+
+In a layer above the core framework are compatibility libraries to help existing applications and their developers.
+
+V4L2 Compatibility Layer
+^^^^^^^^^^^^^^^^^^^^^^^^
+
+To emulate high-level V4L2 camera devices and capture all access to camera devices, libcamera uses 
+a shared library. The shared library is transparent to libcamera-based applications and injected 
+into a process address space using dynamic linking with ``LD_PRELOAD``.
+
+The compatibility layer exposes camera device features on a best-effort basis and aims for the 
+level of features traditionally available from a UVC camera designed for video conferencing.
+
+Android Camera HAL
+^^^^^^^^^^^^^^^^^^
+
+The libcamera library offers Android camera support through a generic `hardware abstraction layer (HAL) <https://source.android.com/devices/camera/camera3_requests_hal>`_ implementation. 
+The HAL focuses on supporting features required by Android that are missing from libcamera, such as JPEG encoding.
+
+The Android camera HAL implementation initially targets the ``LIMITED``
+`hardware level <https://source.android.com/devices/camera/versioning#camera_api2>`_,
+with support for the ``FULL`` level then gradually implemented.
+
+GStreamer element
+^^^^^^^^^^^^^^^^^
+
+The library provides `a GStreamer source element <https://gstreamer.freedesktop.org/documentation/application-development/basics/elements.html?gi-language=c>`_ that feeds frames captured by libcamera into a GStreamer pipeline for further processing.
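
As an illustration (the element name ``libcamerasrc`` and the set of installed plugins are
assumptions that depend on how libcamera and GStreamer were built):

```shell
# Capture frames through libcamera and display them in a window.
# Check `gst-inspect-1.0 libcamerasrc` to confirm the element exists.
gst-launch-1.0 libcamerasrc ! videoconvert ! autovideosink
```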
+
+libcamera Architecture
+----------------------
+
+The libcamera library exposes a single unified API. Behind that API, the library is built from 
+device-specific components for each supported camera, components for the different 
+functionalities of the library, and helpers for those components.
+
+Camera Manager
+~~~~~~~~~~~~~~
+
+Provides access to and manages all the cameras on a system. It enumerates the available cameras 
+when it starts, and supports a hotplug notification mechanism for cameras added or removed at runtime.
+
+There is only ever one instance of the Camera Manager running per application.
+
+`Read the Camera Manager API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1CameraManager.html>`_
+
+Camera Device
+~~~~~~~~~~~~~
+
+A camera device is a device capable of producing one or more image streams from a single image 
+source. A system can contain multiple camera devices, including multiple identical ones.
+
+The Camera class represents a camera device to the upper layers of the stack. It exposes full 
+control of the device through the public API, making it the highest-level object libcamera 
+exposes, and the object all other API operations interact with, from configuration to capture.
+
+`Read the Camera API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Camera.html>`_
+
+Frame controls
+^^^^^^^^^^^^^^
+
+Depending on the capabilities of a camera device and the hardware it is connected 
+to, libcamera supports controlling capture parameters for each stream on a per-frame basis.
+
+These controls include auto exposure, gain, brightness, contrast, lux, white balance, color temperature, and saturation.
+
+`Read the Control API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1Control.html>`_
+
+Pipeline Handler
+~~~~~~~~~~~~~~~~
+
+A pipeline handler is a private API that manages the pipelines exposed by the kernel 
+drivers through the Media Controller and V4L2 APIs.
+
+Each supported camera device has its own pipeline handler component, which hides 
+and abstracts device-specific details, providing what the rest of the library 
+needs to interact with a device.
+
+A pipeline handler manages most aspects of interacting with a camera device including:
+
+-  acquiring use of the camera device
+-  frame controls
+-  pipelines and stream configuration
+-  the data the camera produces, and the buffers it needs to hold the data
+-  creating requests applications need to access camera data
+
+Pipeline handlers are part of the libcamera codebase; hardware vendors that want to support a new device generally need to write one.
+
+`Read the PipelineHandler API documentation for more details <http://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html>`_
+
+Image Processing Algorithms
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+An image processing algorithm (IPA) component is a private API that implements 
+3A (Auto-Exposure, Auto-White Balance, and Auto-Focus) algorithms.
+
+Each supported camera device has its own IPA component that runs on the CPU and 
+interacts with the kernel camera devices to control hardware image processing 
+based on the parameters supplied by upper layers, and helps maintain state, 
+closing the control loop of the ISP.
+
+An IPA component can be part of the libcamera code base, in which case the same 
+license covers it, or provided externally as an open-source or closed-source component.
+
+.. TODO: Better in the IPA guide?
+
+The component is sandboxed and can only interact with libcamera through specifically 
+marked APIs. A component has no direct access to kernel camera devices, and dmabuf 
+instances explicitly passed to the component control its access to image data and 
+metadata. The component should run in a process separate from the main libcamera 
+process, and has a restricted view of the system, including no access to networking APIs
+and limited access to file systems.
+
+The sandboxing mechanism isn’t defined by libcamera; platform vendors need to provide a 
+sandboxing mechanism as a plugin.
+
+To support this security and sandboxing model, libcamera provides an IPC mechanism 
+for an IPA and Pipeline Handler to communicate, but developers need to create the 
+API themselves. Platform vendors can also use any other IPC mechanism that supports 
+passing file descriptors.
+
+3A and Image Enhancement Algorithms
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Camera devices should implement auto exposure, auto gain, and auto white balance, 
+and those that include a focus lens should also implement autofocus.
+
+Vendors can implement these algorithms in hardware or firmware outside of libcamera, 
+or in an IPA component.
+
+
+.. TODO: Add link to guide when completed
+
+Helpers and Support Classes
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+.. TODO: Feel like this is implied in other places of the doc
+
+To help developers create device-specific pipeline handlers and image processing 
+algorithms, libcamera provides helpers and support classes that sit on top of the 
+Media Controller and V4L2 APIs.
+
+Glossary
+--------
+
+-  **ISP**: Image Signal Processor. A specialized processor that converts and enhances raw image data from a camera sensor.
+-  **SoC**: System on a Chip. An integrated circuit that combines the processor, memory interfaces, and peripherals of a system on a single chip.
+-  **Media controller API**: `The Linux kernel API <https://www.kernel.org/doc/html/v4.10/media/uapi/mediactl/media-controller-intro.html>`_ for discovering and configuring the internal topology and pipelines of complex media devices.
+-  **V4L2**: `Video for Linux API version 2 <https://www.linuxtv.org/downloads/v4l-dvb-apis-new/userspace-api/v4l/v4l2.html>`_. Device drivers and API for video capture on Linux.
+-  **UVC camera**: A camera implementing the `USB Video Class <https://en.wikipedia.org/wiki/USB_video_device_class>`_, a USB standard for devices capable of streaming video.
+-  **3A**: The common algorithms camera devices use for auto-exposure, auto-white balance, and auto-focus.
+-  **IPC**: `Inter-process communication <https://en.wikipedia.org/wiki/Inter-process_communication>`_. A mechanism that allows separate processes to exchange data.