From patchwork Tue May 3 16:30:37 2022
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
X-Patchwork-Submitter: Quentin Schulz
X-Patchwork-Id: 15764
Date: Tue, 3 May 2022 18:30:37 +0200
Message-Id: <20220503163038.1174462-2-foss+libcamera@0leil.net>
X-Mailer: git-send-email 2.35.1
In-Reply-To: <20220503163038.1174462-1-foss+libcamera@0leil.net>
References: <20220503163038.1174462-1-foss+libcamera@0leil.net>
Subject: [libcamera-devel] [PATCH 2/3] Documentation: fix typos
From: Quentin Schulz
Reply-To: Quentin Schulz
Cc: libcamera-devel@lists.libcamera.org

From: Quentin Schulz

A few typos made it to the docs, so let's fix them.

Cc: Quentin Schulz
Signed-off-by: Quentin Schulz
Reviewed-by: Jacopo Mondi
Reviewed-by: Kieran Bingham
---
 Documentation/guides/introduction.rst     |  2 +-
 Documentation/guides/pipeline-handler.rst | 28 +++++++++++------------
 2 files changed, 15 insertions(+), 15 deletions(-)

diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
index d080679f..07f66881 100644
--- a/Documentation/guides/introduction.rst
+++ b/Documentation/guides/introduction.rst
@@ -221,7 +221,7 @@ Camera Device
   of producing one or more image streams, and provides the API to interact
   with the underlying device.
 
-  If a system has multiple instances of the same hardware attached, each has it's
+  If a system has multiple instances of the same hardware attached, each has its
   own instance of the camera class.
 
   The API exposes full control of the device to upper layers of libcamera through
diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
index a7208f57..989b0163 100644
--- a/Documentation/guides/pipeline-handler.rst
+++ b/Documentation/guides/pipeline-handler.rst
@@ -75,7 +75,7 @@ Prerequisite knowledge: libcamera architecture
 ----------------------------------------------
 
 A pipeline handler makes use of the following libcamera classes to realize the
-functionalities descibed above. Below is a brief overview of each of those:
+functionalities described above. Below is a brief overview of each of those:
 
 .. TODO: (All) Convert to sphinx refs
 .. TODO: (MediaDevice) Reference to the Media Device API (possibly with versioning requirements)
@@ -405,7 +405,7 @@ Creating camera devices
 If the pipeline handler successfully matches with the system it is running on,
 it can proceed to initialization, by creating all the required instances of the
 ``V4L2VideoDevice``, ``V4L2Subdevice`` and ``CameraSensor`` hardware abstraction
-classes. If the Pipeline handler supports an ISP, it can then also Initialise
+classes. If the Pipeline handler supports an ISP, it can then also initialise
 the IPA module before proceeding to the creation of the Camera devices.
 
 An image ``Stream`` represents a sequence of images and data of known size and
@@ -687,8 +687,8 @@ and validated to adjust it to a supported configuration. This may involve
 adjusting the formats or image sizes or alignments for example to match the
 capabilities of the device.
 
-Applications may choose to repeat validation stages, adjusting paramters until a
-set of validated StreamConfigurations are returned that is acceptable for the
+Applications may choose to repeat validation stages, adjusting parameters until
+a set of validated StreamConfigurations are returned that is acceptable for the
 applications needs. When the pipeline handler receives a valid camera
 configuration it can use the image stream configurations to apply settings to
 the hardware devices.
@@ -765,11 +765,11 @@ example (with only one stream), the pipeline handler always returns the same
 configuration, inferred from the underlying V4L2VideoDevice.
 
 How it does this is shown below, but examination of the more full-featured
-pipelines for IPU3, RKISP1 and RaspberryPi are recommend to explore more
+pipelines for IPU3, RKISP1 and RaspberryPi are recommended to explore more
 complex examples.
 
 To generate a ``StreamConfiguration``, you need a list of pixel formats and
-frame sizes which supported outputs of the stream. You can fetch a map of the
+frame sizes supported by outputs of the stream. You can fetch a map of the
 ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output device,
 but the pipeline handler needs to convert this to a ``libcamera::PixelFormat``
 type to pass to applications. We do this here using ``std::transform`` to
@@ -811,9 +811,9 @@ Continue adding the following code to support this:
    StreamConfiguration cfg(formats);
 
 As well as a list of supported StreamFormats, the StreamConfiguration is also
-expected to provide an initialsed default configuration. This may be arbitrary,
-but depending on use case you may which to select an output that matches the
-Sensor output, or prefer a pixelformat which might provide higher performance on
+expected to provide an initialised default configuration. This may be arbitrary,
+but depending on use case you may wish to select an output that matches the
+Sensor output, or prefer a pixelFormat which might provide higher performance on
 the hardware. The bufferCount represents the number of buffers required to
 support functional continuous processing on this stream.
 
@@ -826,7 +826,7 @@ support functional continuous processing on this stream.
 Finally add each ``StreamConfiguration`` generated to the
 ``CameraConfiguration``, and ensure that it has been validated before returning
 it to the application. With only a single supported stream, this code adds only
-a single StreamConfiguration however a StreamConfiguration should be added for
+a single StreamConfiguration. However a StreamConfiguration should be added for
 each supported role in a device that can handle more streams.
 
 Add the following code to complete the implementation of
@@ -841,7 +841,7 @@ Add the following code to complete the implementation of
    return config;
 
 To validate a camera configuration, a pipeline handler must implement the
-`CameraConfiguration::validate()`_ function in it's derived class to inspect all
+`CameraConfiguration::validate()`_ function in its derived class to inspect all
 the stream configuration associated to it, make any adjustments required to
 make the configuration valid, and return the validation status.
 
@@ -1372,9 +1372,9 @@ classes documentation.
 .. _libcamera Signal and Slot: http://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
 
 In order to notify applications about the availability of new frames and data,
-the ``Camera`` device exposes two ``Signals`` which applications can connect to
-be notified of frame completion events. The ``bufferComplete`` signal serves to
-report to applications the completion event of a single ``Stream`` part of a
+the ``Camera`` device exposes two ``Signals`` to which applications can connect
+to be notified of frame completion events. The ``bufferComplete`` signal serves
+to report to applications the completion event of a single ``Stream`` part of a
 ``Request``, while the ``requestComplete`` signal notifies the completion of all
 the ``Streams`` and data submitted as part of a request. This mechanism allows
 implementation of partial request completion, which allows an application to
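
For readers following that part of the guide, here is a minimal sketch (not
part of the patch itself) of the completion-signal mechanism touched by the
last hunk: an application connecting to the ``bufferComplete`` and
``requestComplete`` signals exposed by the Camera class. The handler and helper
names (onBufferComplete, onRequestComplete, connectCompletionSignals) are
illustrative only.

#include <iostream>
#include <memory>

#include <libcamera/libcamera.h>

using namespace libcamera;

/* Illustrative handler: called for each completed Stream buffer of a Request
 * (partial request completion). */
static void onBufferComplete(Request *request, FrameBuffer *buffer)
{
	std::cout << "request " << request->cookie() << ": buffer done, sequence "
		  << buffer->metadata().sequence << std::endl;
}

/* Illustrative handler: called once all Streams of the Request have completed. */
static void onRequestComplete(Request *request)
{
	std::cout << "request " << request->cookie() << " complete" << std::endl;
}

/* Illustrative helper: connect both completion signals of an acquired camera. */
static void connectCompletionSignals(std::shared_ptr<Camera> camera)
{
	/* Per-buffer notification, one per Stream of the Request. */
	camera->bufferComplete.connect(onBufferComplete);
	/* Notification that all buffers and data of the Request are done. */
	camera->requestComplete.connect(onRequestComplete);
}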