[libcamera-devel,v3,5/8] libcamera: properties: Add FrameDurationLimits

Message ID 20200605141002.49119-6-jacopo@jmondi.org
State Superseded, archived
Delegated to: Jacopo Mondi
Series
  • android: Build stream configuration map

Commit Message

Jacopo Mondi June 5, 2020, 2:09 p.m. UTC
Add a camera property to express the minimum and maximum frame durations.

Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
---

Cc Naush and Dave as this could potentially conflict with their on-going FPS
handling series. Sending it out for discussion.

---
 src/libcamera/property_ids.yaml | 47 +++++++++++++++++++++++++++++++++
 1 file changed, 47 insertions(+)

--
2.27.0

Comments

Laurent Pinchart June 26, 2020, 2:35 a.m. UTC | #1
Hi Jacopo,

Thank you for the patch.

On Fri, Jun 05, 2020 at 04:09:59PM +0200, Jacopo Mondi wrote:
> Add a camera property to express the minimum and maximum frame durations.
> 
> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
> ---
> 
> Cc Naush and Dave as this could potentially conflict with their on-going FPS
> handling series. Sending it out for discussion.
> 
> ---
>  src/libcamera/property_ids.yaml | 47 +++++++++++++++++++++++++++++++++
>  1 file changed, 47 insertions(+)
> 
> diff --git a/src/libcamera/property_ids.yaml b/src/libcamera/property_ids.yaml
> index ce627fa042ba..d703ab31eaac 100644
> --- a/src/libcamera/property_ids.yaml
> +++ b/src/libcamera/property_ids.yaml
> @@ -386,4 +386,51 @@ controls:
>                                |                    |
>                                |                    |
>                                +--------------------+
> +
> +  - FrameDurationLimits:
> +      type: int32_t
> +      size: [2]
> +      description: |
> +        The frame duration interval supported by the camera.
> +
> +        This property reports the camera minimum and maximum frame durations (in
> +        this order), expressed in nanoseconds, which define the limits of the
> +        achievable frame rate.
> +
> +        Camera devices should report durations calculated by inspecting the
> +        frame rate interval supported by the camera sensor, and adding to it
> +        any known additional delay introduced by the image acquisition process
> +        between the time a frame is captured and the time it is delivered to
> +        applications.

Those are two unrelated properties. The processing delay doesn't affect
the frame duration. Reporting the processing delay may make sense, but
it should then be another property.

> +        The durations reported here represent the time interval that occurs
> +        between the delivery of two consecutive frames for the fastest and
> +        slowest streams, without considering additional delays introduced by
> +        the sharing of system resources when multiple streams are captured at
> +        the same time.
> +
> +        This implies that the minimum reported frame duration, which corresponds
> +        to the highest possible camera frame rate, is calculated without taking
> +        into consideration how capture requests for multiple streams, sent to
> +        the camera as part of the same capture session, might influence the
> +        frame delivery rate by, for example, introducing delays due to the need
> +        to time-share components of the image acquisition pipeline. For
> +        instance, a request containing two scaled-down streams might require
> +        time-sharing the single scaler available in the system, introducing an
> +        additional delay that prevents the camera from delivering frames at the
> +        rate reported here.

This also mixes the delays and frame rates, it's quite unclear to me :-S

> +
> +        In the same way, the maximum frame duration, which corresponds to the
> +        lowest possible frame rate, does not take into consideration additional
> +        processing delays introduced by image encoding and processing that
> +        happen sporadically, for example, to produce still images in JPEG
> +        format.
> +
> +        As a consequence, applications should inspect this property to learn
> +        the camera capability limits, but should not assume the values
> +        reported here are achievable under all circumstances.
> +
> +        # \todo Expand to better describe how stalling streams (ie JPEG) impacts
> +        # the frame rate calculation.

This ends up in the generated .cpp file, and generates a doxygen
warning. You can either drop the # (in which case the todo will end up
in doxygen), or reduce the indentation by two spaces.

> +
>  ...
Jacopo Mondi June 26, 2020, 3:59 p.m. UTC | #2
Hi Laurent,

On Fri, Jun 26, 2020 at 05:35:07AM +0300, Laurent Pinchart wrote:
> Hi Jacopo,
>
> Thank you for the patch.
>
> On Fri, Jun 05, 2020 at 04:09:59PM +0200, Jacopo Mondi wrote:
> > Add a camera property to express the minimum and maximum frame durations.
> >
> > Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
> > ---
> >
> > Cc Naush and Dave as this could potentially conflict with their on-going FPS
> > handling series. Sending it out for discussion.
> >
> > ---
> >  src/libcamera/property_ids.yaml | 47 +++++++++++++++++++++++++++++++++
> >  1 file changed, 47 insertions(+)
> >
> > diff --git a/src/libcamera/property_ids.yaml b/src/libcamera/property_ids.yaml
> > index ce627fa042ba..d703ab31eaac 100644
> > --- a/src/libcamera/property_ids.yaml
> > +++ b/src/libcamera/property_ids.yaml
> > @@ -386,4 +386,51 @@ controls:
> >                                |                    |
> >                                |                    |
> >                                +--------------------+
> > +
> > +  - FrameDurationLimits:
> > +      type: int32_t
> > +      size: [2]
> > +      description: |
> > +        The frame duration interval supported by the camera.
> > +
> > +        This property reports the camera minimum and maximum frame durations (in
> > +        this order), expressed in nanoseconds, which define the limits of the
> > +        achievable frame rate.
> > +
> > +        Camera devices should report durations calculated by inspecting the
> > +        frame rate interval supported by the camera sensor, and adding to it
> > +        any known additional delay introduced by the image acquisition process
> > +        between the time a frame is captured and the time it is delivered to
> > +        applications.
>
> Those are two unrelated properties. The processing delay doesn't affect
> the frame duration. Reporting the processing delay may make sense, but

Doesn't it? From an application's perspective the frame rate is
actually defined by both the sensor frame duration and the ISP
processing delays, if known. I understand one is a sensor property and
the other one is platform/use-case specific, but from an application's
point of view, would it be better to have the two properties split and
then have to consider both?

> it should then be another property.
>
> > +        The durations reported here represent the time interval that occurs
> > +        between the delivery of two consecutive frames for the fastest and
> > +        slowest streams, without considering additional delays introduced by
> > +        the sharing of system resources when multiple streams are captured at
> > +        the same time.
> > +
> > +        This implies that the minimum reported frame duration, which corresponds
> > +        to the highest possible camera frame rate, is calculated without taking
> > +        into consideration how capture requests for multiple streams, sent to
> > +        the camera as part of the same capture session, might influence the
> > +        frame delivery rate by, for example, introducing delays due to the need
> > +        to time-share components of the image acquisition pipeline. For
> > +        instance, a request containing two scaled-down streams might require
> > +        time-sharing the single scaler available in the system, introducing an
> > +        additional delay that prevents the camera from delivering frames at the
> > +        rate reported here.
>
> This also mixes the delays and frame rates, it's quite unclear to me :-S
>
> > +
> > +        In the same way, the maximum frame duration, which corresponds to the
> > +        lowest possible frame rate, does not take into consideration additional
> > +        processing delays introduced by image encoding and processing that
> > +        happen sporadically, for example, to produce still images in JPEG
> > +        format.
> > +
> > +        As a consequence, applications should inspect this property to learn
> > +        the camera capability limits, but should not assume the values
> > +        reported here are achievable under all circumstances.
> > +
> > +        # \todo Expand to better describe how stalling streams (ie JPEG) impacts
> > +        # the frame rate calculation.
>
> This ends up in the generated .cpp file, and generates a doxygen
> warning. You can either drop the # (in which case the todo will end up
> in doxygen), or reduce the indentation by two spaces.
>
> > +
> >  ...
>
> --
> Regards,
>
> Laurent Pinchart
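
As an illustration of the trade-off discussed in this exchange, the sketch
below shows what the application-side arithmetic could look like if the
sensor frame duration and the processing delay were exposed as two separate
properties, as suggested in the review. The variable names and figures are
purely hypothetical and are not part of libcamera or of this patch.

#include <cstdint>
#include <iostream>

int main()
{
        /*
         * Hypothetical figures standing in for two separate properties: the
         * sensor's minimum frame duration and the pipeline's processing delay
         * between capture and delivery. Neither property exists in this form;
         * the values only illustrate the discussion above.
         */
        const int64_t minFrameDurationNs = 33333333; /* ~30 fps sensor limit */
        const int64_t processingDelayNs = 5000000;   /* 5 ms processing delay */

        /*
         * The frame duration bounds the sustained delivery rate, while the
         * processing delay describes how long after capture each frame reaches
         * the application. Whether applications should combine the two
         * themselves, or receive a single combined property, is the open
         * question in the exchange above.
         */
        const double maxSustainedFps = 1e9 / static_cast<double>(minFrameDurationNs);
        const double deliveryLatencyMs = processingDelayNs / 1e6;

        std::cout << "Maximum sustained frame rate: " << maxSustainedFps
                  << " fps" << std::endl;
        std::cout << "Capture-to-delivery latency: " << deliveryLatencyMs
                  << " ms" << std::endl;

        return 0;
}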

Patch

diff --git a/src/libcamera/property_ids.yaml b/src/libcamera/property_ids.yaml
index ce627fa042ba..d703ab31eaac 100644
--- a/src/libcamera/property_ids.yaml
+++ b/src/libcamera/property_ids.yaml
@@ -386,4 +386,51 @@ controls:
                               |                    |
                               |                    |
                               +--------------------+
+
+  - FrameDurationLimits:
+      type: int32_t
+      size: [2]
+      description: |
+        The frame duration interval supported by the camera.
+
+        This property reports the camera minimum and maximum frame durations (in
+        this order), expressed in nanoseconds, which define the limits of the
+        achievable frame rate.
+
+        Camera devices should report durations calculated by inspecting the
+        frame rate interval supported by the camera sensor, and adding to it
+        any known additional delay introduced by the image acquisition process
+        between the time a frame is captured and the time it is delivered to
+        applications.
+
+        The durations reported here represent the time interval that occurs
+        between the delivery of two consecutive frames for the fastest and
+        slowest streams, without considering additional delays introduced by
+        the sharing of system resources when multiple streams are captured at
+        the same time.
+
+        This implies that the minimum reported frame duration, which corresponds
+        to the highest possible camera frame rate, is calculated without taking
+        into consideration how capture requests for multiple streams, sent to
+        the camera as part of the same capture session, might influence the
+        frame delivery rate by, for example, introducing delays due to the need
+        to time-share components of the image acquisition pipeline. For
+        instance, a request containing two scaled-down streams might require
+        time-sharing the single scaler available in the system, introducing an
+        additional delay that prevents the camera from delivering frames at the
+        rate reported here.
+
+        In the same way, the maximum frame duration, which corresponds to the
+        lowest possible frame rate, does not take into consideration additional
+        processing delays introduced by image encoding and processing that
+        happen sporadically, for example, to produce still images in JPEG
+        format.
+
+        As a consequence, applications should inspect this property to learn
+        the camera capability limits, but should not assume the values
+        reported here are achievable under all circumstances.
+
+        # \todo Expand to better describe how stalling streams (ie JPEG) impacts
+        # the frame rate calculation.
+
 ...
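
For context beyond the patch itself, the following sketch shows how an
application could turn the two nanosecond values of the proposed property
into a frame rate range. The figures are invented; in a real application
they would be read from the camera's property list through the control
generated from this YAML entry, whose exact accessor is omitted here to
avoid assuming a particular libcamera API version.

#include <array>
#include <cstdint>
#include <iostream>

int main()
{
        /*
         * Stand-in for the proposed FrameDurationLimits property: minimum and
         * maximum frame durations in nanoseconds, in this order. The figures
         * (30 fps and 5 fps equivalents) are invented for the example and
         * would normally be read from the camera's properties.
         */
        const std::array<int32_t, 2> frameDurationLimits = { 33333333, 200000000 };

        /* The shortest duration corresponds to the fastest achievable rate. */
        const double maxFps = 1e9 / static_cast<double>(frameDurationLimits[0]);
        /* The longest duration corresponds to the slowest achievable rate. */
        const double minFps = 1e9 / static_cast<double>(frameDurationLimits[1]);

        std::cout << "Achievable frame rate range: " << minFps << " to "
                  << maxFps << " fps" << std::endl;

        return 0;
}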