[libcamera-devel,2/3] Documentation: fix typos

Message ID 20220503163038.1174462-2-foss+libcamera@0leil.net
State Accepted
Series
  • [libcamera-devel,1/3] Documentation: replace ’ by '

Commit Message

Quentin Schulz May 3, 2022, 4:30 p.m. UTC
From: Quentin Schulz <quentin.schulz@theobroma-systems.com>

A few typos made it to the docs, so let's fix them.

Cc: Quentin Schulz <foss+libcamera@0leil.net>
Signed-off-by: Quentin Schulz <quentin.schulz@theobroma-systems.com>
---
 Documentation/guides/introduction.rst     |  2 +-
 Documentation/guides/pipeline-handler.rst | 28 +++++++++++------------
 2 files changed, 15 insertions(+), 15 deletions(-)

Comments

Jacopo Mondi May 4, 2022, 7:11 a.m. UTC | #1
Hi Quentin

On Tue, May 03, 2022 at 06:30:37PM +0200, Quentin Schulz via libcamera-devel wrote:
> From: Quentin Schulz <quentin.schulz@theobroma-systems.com>
>
> A few typos made it to the docs, so let's fix them.
>
> Cc: Quentin Schulz <foss+libcamera@0leil.net>
> Signed-off-by: Quentin Schulz <quentin.schulz@theobroma-systems.com>
> ---
>  Documentation/guides/introduction.rst     |  2 +-
>  Documentation/guides/pipeline-handler.rst | 28 +++++++++++------------
>  2 files changed, 15 insertions(+), 15 deletions(-)
>
> diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
> index d080679f..07f66881 100644
> --- a/Documentation/guides/introduction.rst
> +++ b/Documentation/guides/introduction.rst
> @@ -221,7 +221,7 @@ Camera Device
>    of producing one or more image streams, and provides the API to interact with
>    the underlying device.
>
> -  If a system has multiple instances of the same hardware attached, each has it's
> +  If a system has multiple instances of the same hardware attached, each has its

ack

>    own instance of the camera class.
>
>    The API exposes full control of the device to upper layers of libcamera through
> diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
> index a7208f57..989b0163 100644
> --- a/Documentation/guides/pipeline-handler.rst
> +++ b/Documentation/guides/pipeline-handler.rst
> @@ -75,7 +75,7 @@ Prerequisite knowledge: libcamera architecture
>  ----------------------------------------------
>
>  A pipeline handler makes use of the following libcamera classes to realize the
> -functionalities descibed above. Below is a brief overview of each of those:
> +functionalities described above. Below is a brief overview of each of those:

ack

>
>  .. TODO: (All) Convert to sphinx refs
>  .. TODO: (MediaDevice) Reference to the Media Device API (possibly with versioning requirements)
> @@ -405,7 +405,7 @@ Creating camera devices
>  If the pipeline handler successfully matches with the system it is running on,
>  it can proceed to initialization, by creating all the required instances of the
>  ``V4L2VideoDevice``, ``V4L2Subdevice`` and ``CameraSensor`` hardware abstraction
> -classes. If the Pipeline handler supports an ISP, it can then also Initialise
> +classes. If the Pipeline handler supports an ISP, it can then also initialise
>  the IPA module before proceeding to the creation of the Camera devices.
>
>  An image ``Stream`` represents a sequence of images and data of known size and
> @@ -687,8 +687,8 @@ and validated to adjust it to a supported configuration. This may involve
>  adjusting the formats or image sizes or alignments for example to match the
>  capabilities of the device.
>
> -Applications may choose to repeat validation stages, adjusting paramters until a
> -set of validated StreamConfigurations are returned that is acceptable for the
> +Applications may choose to repeat validation stages, adjusting parameters until
> +a set of validated StreamConfigurations are returned that is acceptable for the

ack

>  applications needs. When the pipeline handler receives a valid camera
>  configuration it can use the image stream configurations to apply settings to
>  the hardware devices.
> @@ -765,11 +765,11 @@ example (with only one stream), the pipeline handler always returns the same
>  configuration, inferred from the underlying V4L2VideoDevice.
>
>  How it does this is shown below, but examination of the more full-featured
> -pipelines for IPU3, RKISP1 and RaspberryPi are recommend to explore more
> +pipelines for IPU3, RKISP1 and RaspberryPi are recommended to explore more

ack

>  complex examples.
>
>  To generate a ``StreamConfiguration``, you need a list of pixel formats and
> -frame sizes which supported outputs of the stream. You can fetch a map of the
> +frame sizes supported by outputs of the stream. You can fetch a map of the

I think what was meant is

To generate a ``StreamConfiguration``, you need a list of pixel formats and
frame sizes which are supported as outputs of the stream. You can fetch a map of the


>  ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output device,
>  but the pipeline handler needs to convert this to a ``libcamera::PixelFormat``
>  type to pass to applications. We do this here using ``std::transform`` to
> @@ -811,9 +811,9 @@ Continue adding the following code to support this:
>     StreamConfiguration cfg(formats);
>
>  As well as a list of supported StreamFormats, the StreamConfiguration is also
> -expected to provide an initialised default configuration. This may be arbitrary,
> -but depending on use case you may which to select an output that matches the
> -Sensor output, or prefer a pixelformat which might provide higher performance on
> +expected to provide an initialised default configuration. This may be arbitrary,

ack

> +but depending on use case you may wish to select an output that matches the

'wish', or simply 'want'

> +Sensor output, or prefer a pixelFormat which might provide higher performance on

I would rather use ``PixelFormat`` as the term indicates a libcamera
type

>  the hardware. The bufferCount represents the number of buffers required to
>  support functional continuous processing on this stream.
>
> @@ -826,7 +826,7 @@ support functional continuous processing on this stream.
>  Finally add each ``StreamConfiguration`` generated to the
>  ``CameraConfiguration``, and ensure that it has been validated before returning
>  it to the application. With only a single supported stream, this code adds only
> -a single StreamConfiguration however a StreamConfiguration should be added for
> +a single StreamConfiguration. However a StreamConfiguration should be added for
>  each supported role in a device that can handle more streams.

ack

>
>  Add the following code to complete the implementation of
> @@ -841,7 +841,7 @@ Add the following code to complete the implementation of
>     return config;
>
>  To validate a camera configuration, a pipeline handler must implement the
> -`CameraConfiguration::validate()`_ function in it's derived class to inspect all
> +`CameraConfiguration::validate()`_ function in its derived class to inspect all

ack

>  the stream configuration associated to it, make any adjustments required to make
>  the configuration valid, and return the validation status.
>
> @@ -1372,9 +1372,9 @@ classes documentation.
>  .. _libcamera Signal and Slot: http://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
>
>  In order to notify applications about the availability of new frames and data,
> -the ``Camera`` device exposes two ``Signals`` which applications can connect to
> -be notified of frame completion events. The ``bufferComplete`` signal serves to
> -report to applications the completion event of a single ``Stream`` part of a
> +the ``Camera`` device exposes two ``Signals`` to which applications can connect

not sure the 'to' is required. Native speakers ?

However, all minor comments, if you agree they can be addressed when
applying

Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>

Thanks
  j

> +to be notified of frame completion events. The ``bufferComplete`` signal serves
> +to report to applications the completion event of a single ``Stream`` part of a
>  ``Request``, while the ``requestComplete`` signal notifies the completion of all
>  the ``Streams`` and data submitted as part of a request. This mechanism allows
>  implementation of partial request completion, which allows an application to
> --
> 2.35.1
>
Quentin Schulz May 4, 2022, 9:20 a.m. UTC | #2
Hi Jacopo,

On 5/4/22 09:11, Jacopo Mondi via libcamera-devel wrote:
> Hi Quentin
> 
> On Tue, May 03, 2022 at 06:30:37PM +0200, Quentin Schulz via libcamera-devel wrote:
>> From: Quentin Schulz <quentin.schulz@theobroma-systems.com>
>>
>> A few typos made it to the docs, so let's fix them.
>>
>> Cc: Quentin Schulz <foss+libcamera@0leil.net>
>> Signed-off-by: Quentin Schulz <quentin.schulz@theobroma-systems.com>
>> ---
>>   Documentation/guides/introduction.rst     |  2 +-
>>   Documentation/guides/pipeline-handler.rst | 28 +++++++++++------------
>>   2 files changed, 15 insertions(+), 15 deletions(-)
>>
>> diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
>> index d080679f..07f66881 100644
>> --- a/Documentation/guides/introduction.rst
>> +++ b/Documentation/guides/introduction.rst
>> @@ -221,7 +221,7 @@ Camera Device
>>     of producing one or more image streams, and provides the API to interact with
>>     the underlying device.
>>
>> -  If a system has multiple instances of the same hardware attached, each has it's
>> +  If a system has multiple instances of the same hardware attached, each has its
> 
> ack
> 
>>     own instance of the camera class.
>>
>>     The API exposes full control of the device to upper layers of libcamera through
>> diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
>> index a7208f57..989b0163 100644
>> --- a/Documentation/guides/pipeline-handler.rst
>> +++ b/Documentation/guides/pipeline-handler.rst
>> @@ -75,7 +75,7 @@ Prerequisite knowledge: libcamera architecture
>>   ----------------------------------------------
>>
>>   A pipeline handler makes use of the following libcamera classes to realize the
>> -functionalities descibed above. Below is a brief overview of each of those:
>> +functionalities described above. Below is a brief overview of each of those:
> 
> ack
> 
>>
>>   .. TODO: (All) Convert to sphinx refs
>>   .. TODO: (MediaDevice) Reference to the Media Device API (possibly with versioning requirements)
>> @@ -405,7 +405,7 @@ Creating camera devices
>>   If the pipeline handler successfully matches with the system it is running on,
>>   it can proceed to initialization, by creating all the required instances of the
>>   ``V4L2VideoDevice``, ``V4L2Subdevice`` and ``CameraSensor`` hardware abstraction
>> -classes. If the Pipeline handler supports an ISP, it can then also Initialise
>> +classes. If the Pipeline handler supports an ISP, it can then also initialise
>>   the IPA module before proceeding to the creation of the Camera devices.
>>
>>   An image ``Stream`` represents a sequence of images and data of known size and
>> @@ -687,8 +687,8 @@ and validated to adjust it to a supported configuration. This may involve
>>   adjusting the formats or image sizes or alignments for example to match the
>>   capabilities of the device.
>>
>> -Applications may choose to repeat validation stages, adjusting paramters until a
>> -set of validated StreamConfigurations are returned that is acceptable for the
>> +Applications may choose to repeat validation stages, adjusting parameters until
>> +a set of validated StreamConfigurations are returned that is acceptable for the
> 
> ack
> 
>>   applications needs. When the pipeline handler receives a valid camera
>>   configuration it can use the image stream configurations to apply settings to
>>   the hardware devices.
>> @@ -765,11 +765,11 @@ example (with only one stream), the pipeline handler always returns the same
>>   configuration, inferred from the underlying V4L2VideoDevice.
>>
>>   How it does this is shown below, but examination of the more full-featured
>> -pipelines for IPU3, RKISP1 and RaspberryPi are recommend to explore more
>> +pipelines for IPU3, RKISP1 and RaspberryPi are recommended to explore more
> 
> ack
> 
>>   complex examples.
>>
>>   To generate a ``StreamConfiguration``, you need a list of pixel formats and
>> -frame sizes which supported outputs of the stream. You can fetch a map of the
>> +frame sizes supported by outputs of the stream. You can fetch a map of the
> 
> I think what was meant is
> 
> To generate a ``StreamConfiguration``, you need a list of pixel formats and
> frame sizes which are supported as outputs of the stream. You can fetch a map of the
> 

Thanks, forgot to say I wasn't so sure about this one.

> 
>>   ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output device,
>>   but the pipeline handler needs to convert this to a ``libcamera::PixelFormat``
>>   type to pass to applications. We do this here using ``std::transform`` to
>> @@ -811,9 +811,9 @@ Continue adding the following code to support this:
>>      StreamConfiguration cfg(formats);
>>
>>   As well as a list of supported StreamFormats, the StreamConfiguration is also
>> -expected to provide an initialised default configuration. This may be arbitrary,
>> -but depending on use case you may which to select an output that matches the
>> -Sensor output, or prefer a pixelformat which might provide higher performance on
>> +expected to provide an initialised default configuration. This may be arbitrary,
> 
> ack
> 
>> +but depending on use case you may wish to select an output that matches the
> 
> 'wish', or simply 'want'
> 

I modified 'which' to 'wish'. I'm not sure what you're suggesting now;
can you rephrase, please?

>> +Sensor output, or prefer a pixelFormat which might provide higher performance on
> 
> I would rather use ``PixelFormat`` as the term indicates a libcamera
> type
> 

This shouldn't have made it to this patch since it's not a typo. I agree 
it should be ``PixelFormat``, but in another patch :)

>>   the hardware. The bufferCount represents the number of buffers required to
>>   support functional continuous processing on this stream.
>>
>> @@ -826,7 +826,7 @@ support functional continuous processing on this stream.
>>   Finally add each ``StreamConfiguration`` generated to the
>>   ``CameraConfiguration``, and ensure that it has been validated before returning
>>   it to the application. With only a single supported stream, this code adds only
>> -a single StreamConfiguration however a StreamConfiguration should be added for
>> +a single StreamConfiguration. However a StreamConfiguration should be added for
>>   each supported role in a device that can handle more streams.
> 
> ack
> 
>>
>>   Add the following code to complete the implementation of
>> @@ -841,7 +841,7 @@ Add the following code to complete the implementation of
>>      return config;
>>
>>   To validate a camera configuration, a pipeline handler must implement the
>> -`CameraConfiguration::validate()`_ function in it's derived class to inspect all
>> +`CameraConfiguration::validate()`_ function in its derived class to inspect all
> 
> ack
> 
>>   the stream configuration associated to it, make any adjustments required to make
>>   the configuration valid, and return the validation status.
>>
>> @@ -1372,9 +1372,9 @@ classes documentation.
>>  .. _libcamera Signal and Slot: http://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
>>
>>   In order to notify applications about the availability of new frames and data,
>> -the ``Camera`` device exposes two ``Signals`` which applications can connect to
>> -be notified of frame completion events. The ``bufferComplete`` signal serves to
>> -report to applications the completion event of a single ``Stream`` part of a
>> +the ``Camera`` device exposes two ``Signals`` to which applications can connect
> 
> not sure the 'to' is required. Native speakers ?
> 

It sounded wrong this way to me (not native speaker here :) ).

I had:
the camera device exposes two Signals which applications can connect to 
to be notified
or
the camera device exposes two Signals to which applications can connect 
to be notified

Either way, can't argue if it's already proper English with current 
state of the docs :)

> However, all minor comments, if you agree they can be addressed when
> applying
> 

I don't mind sending a v2, up to you.

Cheers,
Quentin
Kieran Bingham May 4, 2022, 9:42 a.m. UTC | #3
Quoting Jacopo Mondi via libcamera-devel (2022-05-04 08:11:42)
> Hi Quentin
> 
> On Tue, May 03, 2022 at 06:30:37PM +0200, Quentin Schulz via libcamera-devel wrote:
> > From: Quentin Schulz <quentin.schulz@theobroma-systems.com>
> >
> > A few typos made it to the docs, so let's fix them.
> >
> > Cc: Quentin Schulz <foss+libcamera@0leil.net>
> > Signed-off-by: Quentin Schulz <quentin.schulz@theobroma-systems.com>
> > ---
> >  Documentation/guides/introduction.rst     |  2 +-
> >  Documentation/guides/pipeline-handler.rst | 28 +++++++++++------------
> >  2 files changed, 15 insertions(+), 15 deletions(-)
> >
> > diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
> > index d080679f..07f66881 100644
> > --- a/Documentation/guides/introduction.rst
> > +++ b/Documentation/guides/introduction.rst
> > @@ -221,7 +221,7 @@ Camera Device
> >    of producing one or more image streams, and provides the API to interact with
> >    the underlying device.
> >
> > -  If a system has multiple instances of the same hardware attached, each has it's
> > +  If a system has multiple instances of the same hardware attached, each has its
> 
> ack
> 
> >    own instance of the camera class.
> >
> >    The API exposes full control of the device to upper layers of libcamera through
> > diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
> > index a7208f57..989b0163 100644
> > --- a/Documentation/guides/pipeline-handler.rst
> > +++ b/Documentation/guides/pipeline-handler.rst
> > @@ -75,7 +75,7 @@ Prerequisite knowledge: libcamera architecture
> >  ----------------------------------------------
> >
> >  A pipeline handler makes use of the following libcamera classes to realize the
> > -functionalities descibed above. Below is a brief overview of each of those:
> > +functionalities described above. Below is a brief overview of each of those:
> 
> ack
> 
> >
> >  .. TODO: (All) Convert to sphinx refs
> >  .. TODO: (MediaDevice) Reference to the Media Device API (possibly with versioning requirements)
> > @@ -405,7 +405,7 @@ Creating camera devices
> >  If the pipeline handler successfully matches with the system it is running on,
> >  it can proceed to initialization, by creating all the required instances of the
> >  ``V4L2VideoDevice``, ``V4L2Subdevice`` and ``CameraSensor`` hardware abstraction
> > -classes. If the Pipeline handler supports an ISP, it can then also Initialise
> > +classes. If the Pipeline handler supports an ISP, it can then also initialise
> >  the IPA module before proceeding to the creation of the Camera devices.
> >
> >  An image ``Stream`` represents a sequence of images and data of known size and
> > @@ -687,8 +687,8 @@ and validated to adjust it to a supported configuration. This may involve
> >  adjusting the formats or image sizes or alignments for example to match the
> >  capabilities of the device.
> >
> > -Applications may choose to repeat validation stages, adjusting paramters until a
> > -set of validated StreamConfigurations are returned that is acceptable for the
> > +Applications may choose to repeat validation stages, adjusting parameters until
> > +a set of validated StreamConfigurations are returned that is acceptable for the
> 
> ack
> 
> >  applications needs. When the pipeline handler receives a valid camera
> >  configuration it can use the image stream configurations to apply settings to
> >  the hardware devices.
> > @@ -765,11 +765,11 @@ example (with only one stream), the pipeline handler always returns the same
> >  configuration, inferred from the underlying V4L2VideoDevice.
> >
> >  How it does this is shown below, but examination of the more full-featured
> > -pipelines for IPU3, RKISP1 and RaspberryPi are recommend to explore more
> > +pipelines for IPU3, RKISP1 and RaspberryPi are recommended to explore more
> 
> ack
> 
> >  complex examples.
> >
> >  To generate a ``StreamConfiguration``, you need a list of pixel formats and
> > -frame sizes which supported outputs of the stream. You can fetch a map of the
> > +frame sizes supported by outputs of the stream. You can fetch a map of the
> 
> I think what was meant is
> 
> To generate a ``StreamConfiguration``, you need a list of pixel formats and
> frame sizes which are supported as outputs of the stream. You can fetch a map of the

Yes, I'd go with this one. It's describing how a single
StreamConfiguration should be constructed. And they are not supported by
outputs, they 'are' the output.


> >  ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output device,
> >  but the pipeline handler needs to convert this to a ``libcamera::PixelFormat``
> >  type to pass to applications. We do this here using ``std::transform`` to
> > @@ -811,9 +811,9 @@ Continue adding the following code to support this:
> >     StreamConfiguration cfg(formats);
> >
> >  As well as a list of supported StreamFormats, the StreamConfiguration is also
> > -expected to provide an initialised default configuration. This may be arbitrary,
> > -but depending on use case you may which to select an output that matches the
> > -Sensor output, or prefer a pixelformat which might provide higher performance on
> > +expected to provide an initialised default configuration. This may be arbitrary,
> 
> ack

I can't actually see any change in that line which is added here?

> 
> > +but depending on use case you may wish to select an output that matches the
> 
> 'wish' or either simply 'want'

I expect the original intent was wish - but I don't mind wish or want.

> 
> > +Sensor output, or prefer a pixelFormat which might provide higher performance on
> 
> I would rather use ``PixelFormat`` as the term indicates a libcamera
> a type


Yes, PixelFormat is better here than both pixelformat and pixelFormat.

> 
> >  the hardware. The bufferCount represents the number of buffers required to
> >  support functional continuous processing on this stream.
> >
> > @@ -826,7 +826,7 @@ support functional continuous processing on this stream.
> >  Finally add each ``StreamConfiguration`` generated to the
> >  ``CameraConfiguration``, and ensure that it has been validated before returning
> >  it to the application. With only a single supported stream, this code adds only
> > -a single StreamConfiguration however a StreamConfiguration should be added for
> > +a single StreamConfiguration. However a StreamConfiguration should be added for
> >  each supported role in a device that can handle more streams.
> 
> ack
> 
> >
> >  Add the following code to complete the implementation of
> > @@ -841,7 +841,7 @@ Add the following code to complete the implementation of
> >     return config;
> >
> >  To validate a camera configuration, a pipeline handler must implement the
> > -`CameraConfiguration::validate()`_ function in it's derived class to inspect all
> > +`CameraConfiguration::validate()`_ function in its derived class to inspect all
> 
> ack
> 
> >  the stream configuration associated to it, make any adjustments required to make
> >  the configuration valid, and return the validation status.
> >
> > @@ -1372,9 +1372,9 @@ classes documentation.
> >  .. _libcamera Signal and Slot: http://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
> >
> >  In order to notify applications about the availability of new frames and data,
> > -the ``Camera`` device exposes two ``Signals`` which applications can connect to
> > -be notified of frame completion events. The ``bufferComplete`` signal serves to
> > -report to applications the completion event of a single ``Stream`` part of a
> > +the ``Camera`` device exposes two ``Signals`` to which applications can connect
> 
> not sure the 'to' is required. Native speakers ?

I'm not 100% sure of the grammar rules there but I actually like the
extra 'to'.

Signals are 'connected to' something, so I could imagine this could
(/should) read:

"which applications can connect to - to be notified"

Having the first 'to' early still makes sense, and prevents the double
'to to'.

> However, all minor comments, if you agree they can be addressed when
> applying
> 
> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>


Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>



> 
> Thanks
>   j
> 
> > +to be notified of frame completion events. The ``bufferComplete`` signal serves
> > +to report to applications the completion event of a single ``Stream`` part of a
> >  ``Request``, while the ``requestComplete`` signal notifies the completion of all
> >  the ``Streams`` and data submitted as part of a request. This mechanism allows
> >  implementation of partial request completion, which allows an application to
> > --
> > 2.35.1
> >
Quentin Schulz May 4, 2022, 9:56 a.m. UTC | #4
Hi Kieran,

On 5/4/22 11:42, Kieran Bingham via libcamera-devel wrote:
> Quoting Jacopo Mondi via libcamera-devel (2022-05-04 08:11:42)
>> Hi Quentin
>>
>> On Tue, May 03, 2022 at 06:30:37PM +0200, Quentin Schulz via libcamera-devel wrote:
>>> From: Quentin Schulz <quentin.schulz@theobroma-systems.com>
>>>
>>> A few typos made it to the docs, so let's fix them.
>>>
>>> Cc: Quentin Schulz <foss+libcamera@0leil.net>
>>> Signed-off-by: Quentin Schulz <quentin.schulz@theobroma-systems.com>
>>> ---
[...]
>>>   ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output device,
>>>   but the pipeline handler needs to convert this to a ``libcamera::PixelFormat``
>>>   type to pass to applications. We do this here using ``std::transform`` to
>>> @@ -811,9 +811,9 @@ Continue adding the following code to support this:
>>>      StreamConfiguration cfg(formats);
>>>
>>>   As well as a list of supported StreamFormats, the StreamConfiguration is also
>>> -expected to provide an initialised default configuration. This may be arbitrary,
>>> -but depending on use case you may which to select an output that matches the
>>> -Sensor output, or prefer a pixelformat which might provide higher performance on
>>> +expected to provide an initialised default configuration. This may be arbitrary,
>>
>> ack
> 
> I can't actually see any change in that line which is added here?
> 

Mmmm. Don't know what Jacopo did to the git diff but the patch has the 
typo in the original file :)

s/initialsed/initialised/

s/which/wish/

Cheers,
Quentin
Jacopo Mondi May 4, 2022, 10:13 a.m. UTC | #5
On Wed, May 04, 2022 at 11:56:22AM +0200, Quentin Schulz wrote:
> Hi Kieran,
>
> On 5/4/22 11:42, Kieran Bingham via libcamera-devel wrote:
> > Quoting Jacopo Mondi via libcamera-devel (2022-05-04 08:11:42)
> > > Hi Quentin
> > >
> > > On Tue, May 03, 2022 at 06:30:37PM +0200, Quentin Schulz via libcamera-devel wrote:
> > > > From: Quentin Schulz <quentin.schulz@theobroma-systems.com>
> > > >
> > > > A few typos made it to the docs, so let's fix them.
> > > >
> > > > Cc: Quentin Schulz <foss+libcamera@0leil.net>
> > > > Signed-off-by: Quentin Schulz <quentin.schulz@theobroma-systems.com>
> > > > ---
> [...]
> > > >   ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output device,
> > > >   but the pipeline handler needs to convert this to a ``libcamera::PixelFormat``
> > > >   type to pass to applications. We do this here using ``std::transform`` to
> > > > @@ -811,9 +811,9 @@ Continue adding the following code to support this:
> > > >      StreamConfiguration cfg(formats);
> > > >
> > > >   As well as a list of supported StreamFormats, the StreamConfiguration is also
> > > > -expected to provide an initialised default configuration. This may be arbitrary,
> > > > -but depending on use case you may which to select an output that matches the
> > > > -Sensor output, or prefer a pixelformat which might provide higher performance on
> > > > +expected to provide an initialised default configuration. This may be arbitrary,
> > >
> > > ack
> >
> > I can't actually see any change in that line which is added here?
> >
>
> Mmmm. Don't know what Jacopo did to the git diff but the patch has the typo
> in the original file :)

Oops, my bad

To verify documentation changes like this one I simply search in vi
for the whole phrase, typing it in until one of the two lines differs.

I might have changed the first part to check if anything else had been
modified, sorry

>
> s/initialsed/initialised/
>
> s/which/wish/
>
> Cheers,
> Quentin

Patch

diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
index d080679f..07f66881 100644
--- a/Documentation/guides/introduction.rst
+++ b/Documentation/guides/introduction.rst
@@ -221,7 +221,7 @@  Camera Device
   of producing one or more image streams, and provides the API to interact with
   the underlying device.
 
-  If a system has multiple instances of the same hardware attached, each has it's
+  If a system has multiple instances of the same hardware attached, each has its
   own instance of the camera class.
 
   The API exposes full control of the device to upper layers of libcamera through
diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
index a7208f57..989b0163 100644
--- a/Documentation/guides/pipeline-handler.rst
+++ b/Documentation/guides/pipeline-handler.rst
@@ -75,7 +75,7 @@  Prerequisite knowledge: libcamera architecture
 ----------------------------------------------
 
 A pipeline handler makes use of the following libcamera classes to realize the
-functionalities descibed above. Below is a brief overview of each of those:
+functionalities described above. Below is a brief overview of each of those:
 
 .. TODO: (All) Convert to sphinx refs
 .. TODO: (MediaDevice) Reference to the Media Device API (possibly with versioning requirements)
@@ -405,7 +405,7 @@  Creating camera devices
 If the pipeline handler successfully matches with the system it is running on,
 it can proceed to initialization, by creating all the required instances of the
 ``V4L2VideoDevice``, ``V4L2Subdevice`` and ``CameraSensor`` hardware abstraction
-classes. If the Pipeline handler supports an ISP, it can then also Initialise
+classes. If the Pipeline handler supports an ISP, it can then also initialise
 the IPA module before proceeding to the creation of the Camera devices.
 
 An image ``Stream`` represents a sequence of images and data of known size and
@@ -687,8 +687,8 @@  and validated to adjust it to a supported configuration. This may involve
 adjusting the formats or image sizes or alignments for example to match the
 capabilities of the device.
 
-Applications may choose to repeat validation stages, adjusting paramters until a
-set of validated StreamConfigurations are returned that is acceptable for the
+Applications may choose to repeat validation stages, adjusting parameters until
+a set of validated StreamConfigurations are returned that is acceptable for the
 applications needs. When the pipeline handler receives a valid camera
 configuration it can use the image stream configurations to apply settings to
 the hardware devices.
@@ -765,11 +765,11 @@  example (with only one stream), the pipeline handler always returns the same
 configuration, inferred from the underlying V4L2VideoDevice.
 
 How it does this is shown below, but examination of the more full-featured
-pipelines for IPU3, RKISP1 and RaspberryPi are recommend to explore more
+pipelines for IPU3, RKISP1 and RaspberryPi are recommended to explore more
 complex examples.
 
 To generate a ``StreamConfiguration``, you need a list of pixel formats and
-frame sizes which supported outputs of the stream. You can fetch a map of the
+frame sizes supported by outputs of the stream. You can fetch a map of the
 ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output device,
 but the pipeline handler needs to convert this to a ``libcamera::PixelFormat``
 type to pass to applications. We do this here using ``std::transform`` to
@@ -811,9 +811,9 @@  Continue adding the following code to support this:
    StreamConfiguration cfg(formats);
 
 As well as a list of supported StreamFormats, the StreamConfiguration is also
-expected to provide an initialsed default configuration. This may be arbitrary,
-but depending on use case you may which to select an output that matches the
-Sensor output, or prefer a pixelformat which might provide higher performance on
+expected to provide an initialised default configuration. This may be arbitrary,
+but depending on use case you may wish to select an output that matches the
+Sensor output, or prefer a pixelFormat which might provide higher performance on
 the hardware. The bufferCount represents the number of buffers required to
 support functional continuous processing on this stream.
 
@@ -826,7 +826,7 @@  support functional continuous processing on this stream.
 Finally add each ``StreamConfiguration`` generated to the
 ``CameraConfiguration``, and ensure that it has been validated before returning
 it to the application. With only a single supported stream, this code adds only
-a single StreamConfiguration however a StreamConfiguration should be added for
+a single StreamConfiguration. However a StreamConfiguration should be added for
 each supported role in a device that can handle more streams.
 
 Add the following code to complete the implementation of
@@ -841,7 +841,7 @@  Add the following code to complete the implementation of
    return config;
 
 To validate a camera configuration, a pipeline handler must implement the
-`CameraConfiguration::validate()`_ function in it's derived class to inspect all
+`CameraConfiguration::validate()`_ function in its derived class to inspect all
 the stream configuration associated to it, make any adjustments required to make
 the configuration valid, and return the validation status.
 
@@ -1372,9 +1372,9 @@  classes documentation.
 .. _libcamera Signal and Slot: http://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
 
 In order to notify applications about the availability of new frames and data,
-the ``Camera`` device exposes two ``Signals`` which applications can connect to
-be notified of frame completion events. The ``bufferComplete`` signal serves to
-report to applications the completion event of a single ``Stream`` part of a
+the ``Camera`` device exposes two ``Signals`` to which applications can connect
+to be notified of frame completion events. The ``bufferComplete`` signal serves
+to report to applications the completion event of a single ``Stream`` part of a
 ``Request``, while the ``requestComplete`` signal notifies the completion of all
 the ``Streams`` and data submitted as part of a request. This mechanism allows
 implementation of partial request completion, which allows an application to