[libcamera-devel,v3,1/6] libcamera: properties: Define pixel array properties

Message ID 20200309180444.725757-2-jacopo@jmondi.org
State Superseded
Series
  • Camera properties and camera sensor factory

Commit Message

Jacopo Mondi March 9, 2020, 6:04 p.m. UTC
Add definition of pixel array related properties.

Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>

---
v2 -> v3:
- Pluralize PixelAreaSizes
- Use the new 'size' property in place of 'compound'.
  I tried to set meaningful values but it's not easy to get them right..
---
 src/libcamera/property_ids.yaml | 177 ++++++++++++++++++++++++++++++++
 1 file changed, 177 insertions(+)

Comments

Kieran Bingham March 24, 2020, 1:59 p.m. UTC | #1
Hi Jacopo,

On 09/03/2020 18:04, Jacopo Mondi wrote:
> Add definition of pixel array related properties.
> 
> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
> 
> ---
> v2 -> v3:
> - Pluralize PixelAreaSizes
> - Use the new 'size' property in place of 'compound'.
>   I tried to set meaningful values but it's not easy to get them right..
> ---
>  src/libcamera/property_ids.yaml | 177 ++++++++++++++++++++++++++++++++
>  1 file changed, 177 insertions(+)
> 
> diff --git a/src/libcamera/property_ids.yaml b/src/libcamera/property_ids.yaml
> index ce627fa042ba..4cecb5ad9ac3 100644
> --- a/src/libcamera/property_ids.yaml
> +++ b/src/libcamera/property_ids.yaml
> @@ -386,4 +386,181 @@ controls:
>                                |                    |
>                                |                    |
>                                +--------------------+
> +
> +  - PixelArraySize:
> +      type: float

Do we really need a float to store millimetres?

Hrm ... actually yes - a quick google for "IMX219 sensor size", for
instance, returns 4.60 mm (diagonal) ... :-)



> +      size: [2]
> +      description: |
> +        The physical sizes of the pixel array (width and height), in
> +        millimeters.
> +

Looking at the IMX219 product brief/datasheet:

Device Structure
* CMOS image sensor
* Image size : Diagonal 4.60 mm (Type 1/4.0)
* Total number of pixels : 3296 (H) × 2512 (V) approx. 8.28 M pixels
* Number of effective pixels : 3296 (H) × 2480 (V) approx. 8.17 M pixels
* Number of active pixels : 3280 (H) × 2464 (V) approx. 8.08 M pixels
* Chip size : 5.095 mm (H) × 4.930 mm (V) (w/ Scribe)
* Unit cell size : 1.12 μm (H) × 1.12 μm (V)
* Substrate material : Silicon

This is stating an Image Size (as a diagonal...) Presumably someone will
have to convert/identify that to a width/height appropriately?

I assume the chip-size is not relevant here.

For instance, the OV5647 specifies:

* image area: 3673.6 um x 2738.4um,

as well as die-dimensions.
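
For what it's worth, the numbers above look consistent if the physical
size is simply "pixels x unit cell size". A rough sketch of that
arithmetic (my own back-of-the-envelope check, not something taken from
either datasheet):

/* Back-of-the-envelope check: derive a candidate PixelArraySize for the
 * IMX219 from the active pixel counts and the unit cell size quoted above.
 * Whether the property should cover the active or the total array is
 * exactly the open question here. */
#include <cmath>
#include <cstdio>

int main()
{
	const double unitCellMm = 1.12e-3;	/* 1.12 um unit cell */
	const double width = 3280 * unitCellMm;	/* ~3.674 mm */
	const double height = 2464 * unitCellMm;	/* ~2.760 mm */
	const double diagonal = std::hypot(width, height);	/* ~4.59 mm, close to the quoted 4.60 mm */

	std::printf("%.3f mm x %.3f mm, diagonal %.2f mm\n",
		    width, height, diagonal);
	return 0;
}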





> +  - PixelArrayBounds:
> +      type: int32_t
> +      size: [2]
> +      description: |
> +        The camera sensor pixel array bounding rectangle vertical and
> +        horizontal sizes.

That reads like a string of unassociated words.

Perhaps some hyphenation would help?

The camera-sensor pixel-array bounding-rectangle vertical and horizontal
sizes.


Or:

The vertical and horizontal sizes for the maximum bounding rectangle of
the pixel array on the camera sensor.

> +
> +        For image sensors with a rectangular pixel array the sizes described by

/array/array,/

> +        this property are the same as the PixelAreaSize property size.
> +
> +        For image sensors with more complex pixel array displacements (such as
> +        cross-shaped pixel arrays, non symmetrical pixel arrays etc) this
> +        property represents the bounding rectangle in which all pixel array
> +        dimensions are inscribed into.
> +
> +        In example, the bounding rectangle sizes for image sensor with a

/In/For/

> +        cross-shaped pixel array is described as
> +
> +
> +                     PixelArrayBound(0) = width
> +                    /-----------------/
> +
> +            (0,0)-> +----+------+----+  /
> +                    |    |//////|    |  |
> +                    +----+//////+----+  |
> +                    |////////////////|  | PixelArrayBound(1) = height
> +                    +----+//////+----+  |
> +                    |    |//////|    |  |
> +                    +----+------+----+  /
> +                            |
> +                             -> Cross-shaped pixel area
> +
> +  - PixelArrays:
> +      type: int32_t
> +      size: [8]
> +      description: |
> +        The sensor pixel array rectangles, relative to the rectangle described
> +        by the PixelArrayBounds property.
> +
> +        This property describes an arbitrary number of (likely overlapping)
> +        rectangles, representing the pixel array areas the sensor is composed
> +        of.
> +
> +        Each rectangle is defined by its displacement from pixel (0, 0) of
> +        the bounding rectangle described by the PixelArrayBound property.
> +
> +        For image sensors with a rectangular pixel array, a single rectangle
> +        is required. For sensors with more complex pixel array displacements
> +        multiple rectangles shall be specified, ordered from the tallest to the

Should this state 'multiple overlapping rectangles' ?

> +        shorter one.
> +
> +        For each rectangle, this property reports the full pixel array size,
> +        including non-active pixels, black level calibration pixels etc.
> +
> +        In example, a simple sensor with a rectangular pixel array is described
> +        as
> +
> +                     PixelArrayBound(0) = width
> +                    /-----------------/
> +                      x1          x2
> +            (0,0)-> +-o------------o-+  /
> +                 y1 o +------------+ |  |
> +                    | |////////////| |  |
> +                    | |////////////| |  | PixelArrayBound(1) = height
> +                    | |////////////| |  |
> +                 y2 o +------------+ |  |
> +                    +----------------+  /
> +
> +                 PixelArray = (x1, y1, (x2 - x1), (y2 - y1))

So for a single 'described rect', this is
struct rect {
	int32_t x;
	int32_t y;

	int32_t width;
	int32_t height;
};

What are the remaining 4 values? 0?

(Is there a means to recognise when or when not to parse the second 'rect'?)


> +
> +        A more complex sensor, with a cross shaped pixel array displacement
> +        is described with 2 rectangles, with the vertical rectangle
> +        described first
> +
> +                     PixelArrayBound(0) = width
> +                    /-----------------/
> +                    x1  x2     x3  x4 W
> +            (0,0)-> +o---o------o---o+  /
> +                    |    |//////|    |  |
> +                 y1 o+---+------+---+|  |
> +                    ||///|//////|///||  | PixelArrayBound(1) = height
> +                 y2 o+---+------+---+|  |
> +                    |    |//////|    |  |
> +                 H  +----+------+----+  /
> +
> +
> +                PixelArray = ( (x2, 0, (x3 - x2), H),
> +                               (x1, y1, (x4 - x1), (y2 - y1))

Wouldn't it make sense to show y1,y2,y3,y4 as well, and use y4-y1 in
place of H? or are you trying to show that the W and H are not required
parameters, and the rectangles do not have to utilise the full width or
height of the sensor?


But why would the width of the sensor be larger than the width of the
largest horizontal rectangle ...


So we are really expressing:

> +                     PixelArrayBound(0) = width
> +                    /-----------------/
> +                    x1  x2     x3  x4 W
> +            (0,0)-> +o---o------o---o+  /
> +                    |    |//////|    |  |
> +                 y1 o    +------+    |  |
> +                    |    |//////|    |  | PixelArrayBound(1) = height
> +                 y2 o    +------+    |  |
> +                    |    |//////|    |  |
> +                 H  +----+------+----+  /

followed by:

> +                     PixelArrayBound(0) = width
> +                    /-----------------/
> +                    x1  x2     x3  x4 W
> +            (0,0)-> +o---o------o---o+  /
> +                    |                |  |
> +                 y1 o+---+------+---+|  |
> +                    ||///|//////|///||  | PixelArrayBound(1) = height
> +                 y2 o+---+------+---+|  |
> +                    |                |  |
> +                 H  +----+------+----+  /


So really, PixelArrays describes up to two rectangles...

What happens if the shape is more complex? Why do we support a cross
shaped pixel array, but not a hexagonal pixel array?

(perhaps cross shaped pixel arrays are a standard thing, I don't know,
or maybe it's just because we only support a maximum of two rects...,
and a polygon would be more bytes to encode...)


Querying this, it seems it's for advanced sensors which have layouts to
support different aspect ratios. It would be good to express that here
somewhere, but I'm not sure how yet.


> +
> +  - ActiveAreaSizes:
> +      type: int32_t
> +      size: [8]
> +      description: |
> +        The sensor active pixel area sizes, represented as rectangles

/sensor/sensor's/ (the active pixel area belongs to the sensor)


> +        inscribed in the ones described by the PixelArrays property.
> +
> +        One ActiveAreaSize rectangle per each rectangle described in the

/per/for/

> +        PixelArrays property is required. As a consequence, the two properties
> +        shall transport the same number of elements.
> +
> +        The ActiveAreaSize rectangles represent the maximum image sizes the
> +        sensor can produce.
> +
> +  - BayerFilterArrangement:
> +      type: int32_t
> +      description: |
> +        The pixel array color filter displacement.
> +
> +        This property describes the arrangement and readout sequence of the
> +        three RGB color components of the sensor's Bayer Color Filter Array
> +        (CFA).
> +
> +        Color filters are usually displaced in line-alternating fashion on the
> +        sensor pixel array. In example, one line might be composed of Red-Green
> +        while the successive is composed of Blue-Green color information.
> +
> +        The value of this property represent the arrangement of color filters

/represent/represents/

> +        in the top-left 2x2 pixel square.
> +
> +        In example, for a sensor with the following color filter displacement

/In example,/For example,/

> +
> +                 (0, 0)               (max-col)
> +           +---+    +--------------...---+
> +           |B|G|<---|B|G|B|G|B|G|B|...B|G|
> +           |G|R|<---|G|R|G|R|G|R|G|...G|R|
> +           +---+    |B|G|B|G|B|G|B|...B|G|
> +                    ...                  ..
> +                    ...                  ..
> +                    |G|R|G|R|G|R|G|...G|R|
> +                    |B|G|B|G|B|G|B|...B|G|   (max-lines)
> +                    +--------------...---+
> +
> +        The filer arrangement is represented by the BGGR value, which correspond

/filer/filter/
/correspond/corresponds/


> +        to the pixel readout sequence in line interleaved mode.
> +
> +      enum:
> +        - name: BayerFilterRGGB

Do we need to prefix these enum values (BayerFilterXXXX) for namespacing?
Or are they already namespaced by being specific to BayerFilterArrangement?

I.e. can we arrange that these become:

BayerFilterArrangement::RGGB ?

Rather than:

BayerFilterArrangement::BayerFilterRGGB ?

> +          value: 0
> +          description: |
> +            Color filter array displacement is Red-Green/Green-Blue
> +
> +        - name: BayerFilterGRBG
> +          value: 1
> +          description: |
> +            Color filter array displacement is Green-Red/Blue-Green
> +
> +        - name: BayerFilterGBRG
> +          value: 2
> +          description: |
> +            Color filter array displacement is Green-Blue/Red-Green
> +
> +        - name: BayerFilterBGGR
> +          value: 3
> +          description: |
> +            Color filter array displacement is Blue-Green/Green-Red
> +
> +        - name: BayerFilterNonStandard
> +          value: 4
> +          description: |
> +            The pixel array color filter does not use the standard Bayer RGB
> +            color model
> +
> +  - ISOSensitivityRange:
> +      type: int32_t
> +      size: [2]
> +      description: |
> +        The range of supported ISO sensitivities, as documented by the
> +        ISO 12232:2006 standard

The ISO 12232:2006 standard appears to have been superseded by
ISO 12232:2019. Do we need to specify versions? Or should we state
'or later' ... or ... ?

 (I see that "Speed ratings greater than 10000 have finally been defined
in ISO 12232:2019"...)


This is 2 int32_t entries, so presumably one is the minimum and one is
the maximum. I can guess which order they are in - but do they need to
be specified explicitly? (maybe not, but I ask as devil's advocate at least)


> +
>  ...
>
Jacopo Mondi March 26, 2020, 12:14 p.m. UTC | #2
Hi Kieran,

On Tue, Mar 24, 2020 at 01:59:43PM +0000, Kieran Bingham wrote:
> Hi Jacopo,
>
> On 09/03/2020 18:04, Jacopo Mondi wrote:
> > Add definition of pixel array related properties.
> >
> > Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
> >
> > ---
> > v2 -> v3:
> > - Pluralize PixelAreaSizes
> > - Use the new 'size' property in place of 'compound'.
> >   I tried to set meaningful values but it's not easy to get them right..
> > ---
> >  src/libcamera/property_ids.yaml | 177 ++++++++++++++++++++++++++++++++
> >  1 file changed, 177 insertions(+)
> >
> > diff --git a/src/libcamera/property_ids.yaml b/src/libcamera/property_ids.yaml
> > index ce627fa042ba..4cecb5ad9ac3 100644
> > --- a/src/libcamera/property_ids.yaml
> > +++ b/src/libcamera/property_ids.yaml
> > @@ -386,4 +386,181 @@ controls:
> >                                |                    |
> >                                |                    |
> >                                +--------------------+
> > +
> > +  - PixelArraySize:
> > +      type: float
>
> Do we really need a float to store millimetres?
>
> Hrm ... actually yes - a quick google for "IMX219 sensor size", for
> instance, returns 4.60 mm (diagonal) ... :-)
>
>
>
> > +      size: [2]
> > +      description: |
> > +        The physical sizes of the pixel array (width and height), in
> > +        millimeters.
> > +
>
> Looking at the IMX219 product brief/datasheet:
>
> Device Structure
> * CMOS image sensor
> * Image size : Diagonal 4.60 mm (Type 1/4.0)
> * Total number of pixels : 3296 (H) × 2512 (V) approx. 8.28 M pixels
> * Number of effective pixels : 3296 (H) × 2480 (V) approx. 8.17 M pixels
> * Number of active pixels : 3280 (H) × 2464 (V) approx. 8.08 M pixels
> * Chip size : 5.095 mm (H) × 4.930 mm (V) (w/ Scribe)
> * Unit cell size : 1.12 μm (H) × 1.12 μm (V)
> * Substrate material : Silicon
>
> This is stating an Image Size (as a diagonal...) Presumably someone will
> have to convert/identify that to a width/height appropriately?
>
> I assume the chip-size is not relevant here.

You are right, I have used that chip size in place of the pixel array
size, which can be calculated from Fig.3 of the chip manual instead.

>
> For instance, the OV5647 specifies:
>
> * image area: 3673.6 um x 2738.4um,
>
> as well as die-dimensions.
>
>
>
>
>
> > +  - PixelArrayBounds:
> > +      type: int32_t
> > +      size: [2]
> > +      description: |
> > +        The camera sensor pixel array bounding rectangle vertical and
> > +        horizontal sizes.
>
> That reads like a string of unassociated words.
>
> Perhaps some hyphenation would help?
>
> The camera-sensor pixel-array bounding-rectangle vertical and horizontal
> sizes.
>
>
> Or:
>
> The vertical and horizontal sizes for the maximum bounding rectangle of
> the pixel array on the camera sensor.

As clarified with Sakari, this property will be dropped. Cross-shaped
sensors expose different overlapping rectangles as supported modes,
but in his opinion there is no such thing as a cross-shaped pixel
matrix like the one reported below.

>
> > +
> > +        For image sensors with a rectangular pixel array the sizes described by
>
> /array/array,/
>
> > +        this property are the same as the PixelAreaSize property size.
> > +
> > +        For image sensors with more complex pixel array displacements (such as
> > +        cross-shaped pixel arrays, non symmetrical pixel arrays etc) this
> > +        property represents the bounding rectangle in which all pixel array
> > +        dimensions are inscribed into.
> > +
> > +        In example, the bounding rectangle sizes for image sensor with a
>
> /In/For/
>
> > +        cross-shaped pixel array is described as
> > +
> > +
> > +                     PixelArrayBound(0) = width
> > +                    /-----------------/
> > +
> > +            (0,0)-> +----+------+----+  /
> > +                    |    |//////|    |  |
> > +                    +----+//////+----+  |
> > +                    |////////////////|  | PixelArrayBound(1) = height
> > +                    +----+//////+----+  |
> > +                    |    |//////|    |  |
> > +                    +----+------+----+  /
> > +                            |
> > +                             -> Cross-shaped pixel area
> > +
> > +  - PixelArrays:
> > +      type: int32_t
> > +      size: [8]
> > +      description: |
> > +        The sensor pixel array rectangles, relative to the rectangle described
> > +        by the PixelArrayBounds property.
> > +
> > +        This property describes an arbitrary number of (likely overlapping)
> > +        rectangles, representing the pixel array areas the sensor is composed
> > +        of.
> > +
> > +        Each rectangle is defined by its displacement from pixel (0, 0) of
> > +        the bounding rectangle described by the PixelArrayBound property.
> > +
> > +        For image sensors with a rectangular pixel array, a single rectangle
> > +        is required. For sensors with more complex pixel array displacements
> > +        multiple rectangles shall be specified, ordered from the tallest to the
>
> Should this state 'multiple overlapping rectangles' ?
>

yes

> > +        shorter one.
> > +
> > +        For each rectangle, this property reports the full pixel array size,
> > +        including non-active pixels, black level calibration pixels etc.
> > +
> > +        In example, a simple sensor with a rectangular pixel array is described
> > +        as
> > +
> > +                     PixelArrayBound(0) = width
> > +                    /-----------------/
> > +                      x1          x2
> > +            (0,0)-> +-o------------o-+  /
> > +                 y1 o +------------+ |  |
> > +                    | |////////////| |  |
> > +                    | |////////////| |  | PixelArrayBound(1) = height
> > +                    | |////////////| |  |
> > +                 y2 o +------------+ |  |
> > +                    +----------------+  /
> > +
> > +                 PixelArray = (x1, y1, (x2 - x1), (y2 - y1))
>
> So for a single 'described rect', this is
> struct rect {
> 	int32_t x;
> 	int32_t y;
>
> 	int32_t width;
> 	int32_t height;
> };
>
> What are the remaining 4 values? 0?

There are no other values.
I recently realized that the value to assign to 'size:' is totally
arbitrary, whereas I had interpreted it as an upper bound (hence the
note in the patch commit message):

 - Use the new 'size' property in place of 'compound'.
   I tried to set meaningful values but it's not easy to get them right..

>
> (Is there a means to recognise when or when not to parse the second 'rect'?)

You'll get it from the property size.
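
Something like this is what I had in mind (only a sketch, assuming the
property reaches the application as a flat list of int32_t values, four
per rectangle):

/* Sketch: unpack the PixelArrays payload into rectangles. The number of
 * rectangles is not encoded explicitly; it follows from the property size. */
#include <cstdint>
#include <vector>

struct Rect {
	int32_t x;
	int32_t y;
	int32_t width;
	int32_t height;
};

std::vector<Rect> parsePixelArrays(const std::vector<int32_t> &values)
{
	std::vector<Rect> rects;

	for (size_t i = 0; i + 4 <= values.size(); i += 4)
		rects.push_back({ values[i], values[i + 1],
				  values[i + 2], values[i + 3] });

	return rects;
}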

>
>
> > +
> > +        A more complex sensor, with a cross shaped pixel array displacement
> > +        is described with 2 rectangles, with the vertical rectangle
> > +        described first
> > +
> > +                     PixelArrayBound(0) = width
> > +                    /-----------------/
> > +                    x1  x2     x3  x4 W
> > +            (0,0)-> +o---o------o---o+  /
> > +                    |    |//////|    |  |
> > +                 y1 o+---+------+---+|  |
> > +                    ||///|//////|///||  | PixelArrayBound(1) = height
> > +                 y2 o+---+------+---+|  |
> > +                    |    |//////|    |  |
> > +                 H  +----+------+----+  /
> > +
> > +
> > +                PixelArray = ( (x2, 0, (x3 - x2), H),
> > +                               (x1, y1, (x4 - x1), (y2 - y1))
>
> Wouldn't it make sense to show y1,y2,y3,y4 as well, and use y4-y1 in
> place of H? or are you trying to show that the W and H are not required
> parameters, and the rectangles do not have to utilise the full width or
> height of the sensor?

Yes, rectangles may or may not span the whole width or height; that's
why the two dimensions are described differently.

>
>
> But why would the width of the sensor be larger than the width of the
> largest horizontal rectangle ...

This property will have to be changed as there are no more bounds to
take into account

>
>
> So we are really expressing:
>
> > +                     PixelArrayBound(0) = width
> > +                    /-----------------/
> > +                    x1  x2     x3  x4 W
> > +            (0,0)-> +o---o------o---o+  /
> > +                    |    |//////|    |  |
> > +                 y1 o    +------+    |  |
> > +                    |    |//////|    |  | PixelArrayBound(1) = height
> > +                 y2 o    +------+    |  |
> > +                    |    |//////|    |  |
> > +                 H  +----+------+----+  /
>
> followed by:
>
> > +                     PixelArrayBound(0) = width
> > +                    /-----------------/
> > +                    x1  x2     x3  x4 W
> > +            (0,0)-> +o---o------o---o+  /
> > +                    |                |  |
> > +                 y1 o+---+------+---+|  |
> > +                    ||///|//////|///||  | PixelArrayBound(1) = height
> > +                 y2 o+---+------+---+|  |
> > +                    |                |  |
> > +                 H  +----+------+----+  /
>
>
> So really, PixelArrays describes up to two rectangles...
>
> What happens if the shape is more complex? Why do we support a cross
> shaped pixel array, but not a hexagonal pixel array?

I have heard from Sakari and Laurent about possible 'cross-shaped'
pixel arrays (which, after the clarification, are actually just sensors
with modes that maximize the FOV in one direction).

>
> (perhaps cross shaped pixel arrays are a standard thing, I don't know,
> or maybe it's just because we only support a maximum of two rects...,
> and a polygon would be more bytes to encode...)
>

Yes, let's leave it out

>
> Querying this, it seems it's for advanced sensors which have layouts to
> support different aspect ratios. It would be good to express that here
> somewhere, but I'm not sure how yet.

I'll try to capture that, but I don't think it's necessary. If an
integrator has a cross-shaped sensor to deal with, I'm sure she knows
what it's for.

>
>
> > +
> > +  - ActiveAreaSizes:
> > +      type: int32_t
> > +      size: [8]
> > +      description: |
> > +        The sensor active pixel area sizes, represented as rectangles
>
> /sensor/sensor's/ (the active pixel area belongs to the sensor)
>
>
> > +        inscribed in the ones described by the PixelArrays property.
> > +
> > +        One ActiveAreaSize rectangle per each rectangle described in the
>
> /per/for/
>

ack

> > +        PixelArrays property is required. As a consequence, the two properties
> > +        shall transport the same number of elements.
> > +
> > +        The ActiveAreaSize rectangles represent the maximum image sizes the
> > +        sensor can produce.
> > +
> > +  - BayerFilterArrangement:
> > +      type: int32_t
> > +      description: |
> > +        The pixel array color filter displacement.
> > +
> > +        This property describes the arrangement and readout sequence of the
> > +        three RGB color components of the sensor's Bayer Color Filter Array
> > +        (CFA).
> > +
> > +        Color filters are usually displaced in line-alternating fashion on the
> > +        sensor pixel array. In example, one line might be composed of Red-Green
> > +        while the successive is composed of Blue-Green color information.
> > +
> > +        The value of this property represent the arrangement of color filters
>
> /represent/represents/
>
> > +        in the top-left 2x2 pixel square.
> > +
> > +        In example, for a sensor with the following color filter displacement
>
> /In example,/For example,/
>
> > +
> > +                 (0, 0)               (max-col)
> > +           +---+    +--------------...---+
> > +           |B|G|<---|B|G|B|G|B|G|B|...B|G|
> > +           |G|R|<---|G|R|G|R|G|R|G|...G|R|
> > +           +---+    |B|G|B|G|B|G|B|...B|G|
> > +                    ...                  ..
> > +                    ...                  ..
> > +                    |G|R|G|R|G|R|G|...G|R|
> > +                    |B|G|B|G|B|G|B|...B|G|   (max-lines)
> > +                    +--------------...---+
> > +
> > +        The filer arrangement is represented by the BGGR value, which correspond
>
> /filer/filter/
> /correspond/corresponds/
>
>
> > +        to the pixel readout sequence in line interleaved mode.
> > +
> > +      enum:
> > +        - name: BayerFilterRGGB
>
> Do we need to prefix these enum values (BayerFilterXXXX) for namespacing?
> Or are they already namespaced by being specific to BayerFilterArrangement?
>
> I.e. can we arrange that these become:
>
> BayerFilterArrangement::RGGB ?
>
> Rather than:
>
> BayerFilterArrangement::BayerFilterRGGB ?
>

Looking at the generated property_ids.cpp


/**
 * \enum BayerFilterArrangementValues
 * \brief Supported BayerFilterArrangement values
 *
 * \var BayerFilterArrangementValues::BayerFilterRGGB
 * \brief Color filter array displacement is Red-Green/Green-Blue
 *
 * \var BayerFilterArrangementValues::BayerFilterGRBG
 * \brief Color filter array displacement is Green-Red/Blue-Green
 *
 * \var BayerFilterArrangementValues::BayerFilterGBRG
 * \brief Color filter array displacement is Green-Blue/Red-Green
 *
 * \var BayerFilterArrangementValues::BayerFilterBGGR
 * \brief Color filter array displacement is Blue-Green/Green-Red
 *
 * \var BayerFilterArrangementValues::BayerFilterNonStandard
 * \brief The pixel array color filter does not use the standard Bayer RGB
 * color model
 */

Shortening the enum entry names would allow them to be accessed as
'RGGB' only, which seems bad. And really, why does this matter?
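
In case it helps, this is how I picture applications consuming the
value (just a sketch; I'm using the numerical values from the patch and
keeping the generated constant names in comments):

/* Sketch: map a BayerFilterArrangement value to the colour order of the
 * top-left 2x2 pixel square, i.e. the line-interleaved readout sequence.
 * The BayerFilter* names are the constants this patch would generate,
 * not an existing API. */
#include <cstdint>

const char *cfaOrder(int32_t arrangement)
{
	switch (arrangement) {
	case 0:	/* BayerFilterRGGB */
		return "R G / G B";
	case 1:	/* BayerFilterGRBG */
		return "G R / B G";
	case 2:	/* BayerFilterGBRG */
		return "G B / R G";
	case 3:	/* BayerFilterBGGR */
		return "B G / G R";
	default: /* BayerFilterNonStandard */
		return "non-standard CFA";
	}
}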

> > +          value: 0
> > +          description: |
> > +            Color filter array displacement is Red-Green/Green-Blue
> > +
> > +        - name: BayerFilterGRBG
> > +          value: 1
> > +          description: |
> > +            Color filter array displacement is Green-Red/Blue-Green
> > +
> > +        - name: BayerFilterGBRG
> > +          value: 2
> > +          description: |
> > +            Color filter array displacement is Green-Blue/Red-Green
> > +
> > +        - name: BayerFilterBGGR
> > +          value: 3
> > +          description: |
> > +            Color filter array displacement is Blue-Green/Green-Red
> > +
> > +        - name: BayerFilterNonStandard
> > +          value: 4
> > +          description: |
> > +            The pixel array color filter does not use the standard Bayer RGB
> > +            color model
> > +
> > +  - ISOSensitivityRange:
> > +      type: int32_t
> > +      size: [2]
> > +      description: |
> > +        The range of supported ISO sensitivities, as documented by the
> > +        ISO 12232:2006 standard
>
> The ISO 12232:2006 standard appears to have been superseded by
> ISO 12232:2019. Do we need to specify versions? Or should we state
> 'or later' ... or ... ?
>
>  (I see that "Speed ratings greater than 10000 have finally been defined
> in ISO 12232:2019"...)

Thanks, I got that from the Android definition. Should we use the most
recent one? I admit I have no idea what I'm talking about.

>
>
> This is 2 int32_t entries, so presumably one is the minimum and one is
> the maximum. I can guess which order they are in - but do they need to
> be specified explicitly? (maybe not, but I ask as devil's advocate at least)

Well, one will be less than the other, right? We can specify it, but
it seems really obvious to me to specify a range as [min, max] and not
[max, min].
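
I.e. I would expect consumers to simply do something like (sketch,
assuming that ordering):

/* Sketch: treat ISOSensitivityRange as { min, max } and clamp a requested
 * sensitivity into the supported range. */
#include <algorithm>
#include <cstdint>

int32_t clampSensitivity(int32_t requested, const int32_t range[2])
{
	return std::clamp(requested, range[0], range[1]);
}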

Thanks for the detailed review

>
>
> > +
> >  ...
> >
>
> --
> Regards
> --
> Kieran

Patch

diff --git a/src/libcamera/property_ids.yaml b/src/libcamera/property_ids.yaml
index ce627fa042ba..4cecb5ad9ac3 100644
--- a/src/libcamera/property_ids.yaml
+++ b/src/libcamera/property_ids.yaml
@@ -386,4 +386,181 @@  controls:
                               |                    |
                               |                    |
                               +--------------------+
+
+  - PixelArraySize:
+      type: float
+      size: [2]
+      description: |
+        The physical sizes of the pixel array (width and height), in
+        millimeters.
+
+  - PixelArrayBounds:
+      type: int32_t
+      size: [2]
+      description: |
+        The camera sensor pixel array bounding rectangle vertical and
+        horizontal sizes.
+
+        For image sensors with a rectangular pixel array the sizes described by
+        this property are the same as the PixelAreaSize property size.
+
+        For image sensors with more complex pixel array displacements (such as
+        cross-shaped pixel arrays, non symmetrical pixel arrays etc) this
+        property represents the bounding rectangle in which all pixel array
+        dimensions are inscribed into.
+
+        In example, the bounding rectangle sizes for image sensor with a
+        cross-shaped pixel array is described as
+
+
+                     PixelArrayBound(0) = width
+                    /-----------------/
+
+            (0,0)-> +----+------+----+  /
+                    |    |//////|    |  |
+                    +----+//////+----+  |
+                    |////////////////|  | PixelArrayBound(1) = height
+                    +----+//////+----+  |
+                    |    |//////|    |  |
+                    +----+------+----+  /
+                            |
+                             -> Cross-shaped pixel area
+
+  - PixelArrays:
+      type: int32_t
+      size: [8]
+      description: |
+        The sensor pixel array rectangles, relative to the rectangle described
+        by the PixelArrayBounds property.
+
+        This property describes an arbitrary number of (likely overlapping)
+        rectangles, representing the pixel array areas the sensor is composed
+        of.
+
+        Each rectangle is defined by its displacement from pixel (0, 0) of
+        the bounding rectangle described by the PixelArrayBound property.
+
+        For image sensors with a rectangular pixel array, a single rectangle
+        is required. For sensors with more complex pixel array displacements
+        multiple rectangles shall be specified, ordered from the tallest to the
+        shorter one.
+
+        For each rectangle, this property reports the full pixel array size,
+        including non-active pixels, black level calibration pixels etc.
+
+        In example, a simple sensor with a rectangular pixel array is described
+        as
+
+                     PixelArrayBound(0) = width
+                    /-----------------/
+                      x1          x2
+            (0,0)-> +-o------------o-+  /
+                 y1 o +------------+ |  |
+                    | |////////////| |  |
+                    | |////////////| |  | PixelArrayBound(1) = height
+                    | |////////////| |  |
+                 y2 o +------------+ |  |
+                    +----------------+  /
+
+                 PixelArray = (x1, y1, (x2 - x1), (y2 - y1))
+
+        A more complex sensor, with a cross shaped pixel array displacement
+        is described with 2 rectangles, with the vertical rectangle
+        described first
+
+                     PixelArrayBound(0) = width
+                    /-----------------/
+                    x1  x2     x3  x4 W
+            (0,0)-> +o---o------o---o+  /
+                    |    |//////|    |  |
+                 y1 o+---+------+---+|  |
+                    ||///|//////|///||  | PixelArrayBound(1) = height
+                 y2 o+---+------+---+|  |
+                    |    |//////|    |  |
+                 H  +----+------+----+  /
+
+
+                PixelArray = ( (x2, 0, (x3 - x2), H),
+                               (x1, y1, (x4 - x1), (y2 - y1))
+
+  - ActiveAreaSizes:
+      type: int32_t
+      size: [8]
+      description: |
+        The sensor active pixel area sizes, represented as rectangles
+        inscribed in the ones described by the PixelArrays property.
+
+        One ActiveAreaSize rectangle per each rectangle described in the
+        PixelArrays property is required. As a consequence, the two properties
+        shall transport the same number of elements.
+
+        The ActiveAreaSize rectangles represent the maximum image sizes the
+        sensor can produce.
+
+  - BayerFilterArrangement:
+      type: int32_t
+      description: |
+        The pixel array color filter displacement.
+
+        This property describes the arrangement and readout sequence of the
+        three RGB color components of the sensor's Bayer Color Filter Array
+        (CFA).
+
+        Color filters are usually displaced in line-alternating fashion on the
+        sensor pixel array. In example, one line might be composed of Red-Green
+        while the successive is composed of Blue-Green color information.
+
+        The value of this property represent the arrangement of color filters
+        in the top-left 2x2 pixel square.
+
+        In example, for a sensor with the following color filter displacement
+
+                 (0, 0)               (max-col)
+           +---+    +--------------...---+
+           |B|G|<---|B|G|B|G|B|G|B|...B|G|
+           |G|R|<---|G|R|G|R|G|R|G|...G|R|
+           +---+    |B|G|B|G|B|G|B|...B|G|
+                    ...                  ..
+                    ...                  ..
+                    |G|R|G|R|G|R|G|...G|R|
+                    |B|G|B|G|B|G|B|...B|G|   (max-lines)
+                    +--------------...---+
+
+        The filer arrangement is represented by the BGGR value, which correspond
+        to the pixel readout sequence in line interleaved mode.
+
+      enum:
+        - name: BayerFilterRGGB
+          value: 0
+          description: |
+            Color filter array displacement is Red-Green/Green-Blue
+
+        - name: BayerFilterGRBG
+          value: 1
+          description: |
+            Color filter array displacement is Green-Red/Blue-Green
+
+        - name: BayerFilterGBRG
+          value: 2
+          description: |
+            Color filter array displacement is Green-Blue/Red-Green
+
+        - name: BayerFilterBGGR
+          value: 3
+          description: |
+            Color filter array displacement is Blue-Green/Green-Red
+
+        - name: BayerFilterNonStandard
+          value: 4
+          description: |
+            The pixel array color filter does not use the standard Bayer RGB
+            color model
+
+  - ISOSensitivityRange:
+      type: int32_t
+      size: [2]
+      description: |
+        The range of supported ISO sensitivities, as documented by the
+        ISO 12232:2006 standard
+
 ...