Message ID: 20200513091120.1653-2-naush@raspberrypi.com
State: Superseded
Hi Naush,

Thank you for the patch.

On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote:
> Add a float array control (controls::FrameDurations) to specify the
> minimum and maximum (in that order) frame duration to be used by the
> camera sensor.
>
> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> ---
>  src/libcamera/control_ids.yaml | 14 ++++++++++++++
>  1 file changed, 14 insertions(+)
>
> diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> index 77ebc3f9..42694fc7 100644
> --- a/src/libcamera/control_ids.yaml
> +++ b/src/libcamera/control_ids.yaml
> @@ -239,4 +239,18 @@ controls:
>          pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels
>          control can only be returned in metadata.
>          size: [4]
> +
> +  - FrameDurations:
> +      type: float

Do we need sub-microsecond precision, or would a uint32_t control work ?
I'd rather have a fixed precision if possible.

> +      description: |
> +        Specifies the minimum and maximum (in that order) allowable frame
> +        duration, in micro-seconds, for the sensor to use. This could also limit
> +        the largest exposure times the sensor can use. For example, if a maximum
> +        frame duration of 33ms is requested (corresponding to 30 frames per
> +        second), the sensor will not be able raise the exposure time above 33ms.
> +        Note that the sensor may not always be able to provide the requested
> +        frame duration limits depending on its mode configuration.

This looks good to me, but I'd like to discuss the corner cases I can
already think about.

- Are there use cases for an application to specify a minimum frame
  duration only, or a maximum frame duration only (that is, any frame
  rate lower than a limit, or any frame rate higher than a limit) ? If
  so, how should that be specified ? We could set the minimum value to 0
  to mean that the maximum frame rate is unbounded, and the maximum
  value to UINT32_MAX to mean that the minimum frame rate is unbounded.
  Is there anything I'm overlooking ?

- Is 0 an acceptable value ? Or should 1 be the minimum value ?

- What happens if the requested frame duration isn't achievable ? Should
  we specify that the camera will use a frame duration as close to the
  requested range as possible ? Or could there be cases where a
  different behaviour would be needed ?

- Not something we need to address now, but do you see any future
  relation between this control and anti-banding (50 or 60Hz flicker
  avoidance) ? Anti-banding will mostly restrict possible exposure
  times, and should only indirectly interact with the frame duration if
  I'm not mistaken, is that correct ?

> +
> +      \sa ExposureTime
> +      size: [2]
> ...
Hi Laurent,

On Sun, 24 May 2020 at 23:23, Laurent Pinchart
<laurent.pinchart@ideasonboard.com> wrote:
>
> Hi Naush,
>
> Thank you for the patch.
>
> On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote:
> > Add a float array control (controls::FrameDurations) to specify the
> > minimum and maximum (in that order) frame duration to be used by the
> > camera sensor.
> >
> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > ---
> >  src/libcamera/control_ids.yaml | 14 ++++++++++++++
> >  1 file changed, 14 insertions(+)
> >
> > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> > index 77ebc3f9..42694fc7 100644
> > --- a/src/libcamera/control_ids.yaml
> > +++ b/src/libcamera/control_ids.yaml
> > @@ -239,4 +239,18 @@ controls:
> >          pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels
> >          control can only be returned in metadata.
> >          size: [4]
> > +
> > +  - FrameDurations:
> > +      type: float
>
> Do we need sub-microsecond precision, or would a uint32_t control work ?
> I'd rather have a fixed precision if possible.

The reason I chose floating point here was because of legacy standards
(e.g. 29.97/59.94 fps for SMPTE). It also does allow us to specify
< 1 fps if a user ever needs to (e.g. for a time lapse type capture use
case).

> > +      description: |
> > +        Specifies the minimum and maximum (in that order) allowable frame
> > +        duration, in micro-seconds, for the sensor to use. This could also limit
> > +        the largest exposure times the sensor can use. For example, if a maximum
> > +        frame duration of 33ms is requested (corresponding to 30 frames per
> > +        second), the sensor will not be able raise the exposure time above 33ms.
> > +        Note that the sensor may not always be able to provide the requested
> > +        frame duration limits depending on its mode configuration.
>
> This looks good to me, but I'd like to discuss the corner cases I can
> already think about.
>
> - Are there use cases for an application to specify a minimum frame
>   duration only, or a maximum frame duration only (that is, any frame
>   rate lower than a limit, or any frame rate higher than a limit) ? If
>   so, how should that be specified ? We could set the minimum value to 0
>   to mean that the maximum frame rate is unbounded, and the maximum
>   value to UINT32_MAX to mean that the minimum frame rate is unbounded.
>   Is there anything I'm overlooking ?

Good point, I can see we might only want to set one limit. In these
cases, the sensor specified limit would be used.

Speaking of this, I was going to address sensor reported framerate
after this patch. Do we need a way for the sensor driver to specify
its framerate limits if the application wants to know? Perhaps this
should be part of CameraSensorInfo, although it will be mode specific?
Or perhaps we just use the VBLANK min/max to give us the limits?

> - Is 0 an acceptable value ? Or should 1 be the minimum value ?

If we stick to floating point, we should use 0.0 as the min limit.

> - What happens if the requested frame duration isn't achievable ? Should
>   we specify that the camera will use a frame duration as close to the
>   requested range as possible ? Or could there be cases where a
>   different behaviour would be needed ?

I think the only sensible thing would be to truncate the requested
values to the sensor limits. See above on how we get the limits from
the sensor though.

> - Not something we need to address now, but do you see any future
>   relation between this control and anti-banding (50 or 60Hz flicker
>   avoidance) ? Anti-banding will mostly restrict possible exposure
>   times, and should only indirectly interact with the frame duration if
>   I'm not mistaken, is that correct ?

There should be minimal interaction with framerate control. As you
said, we will restrict use of certain exposure times for antibanding,
but will have more flexibility with framerate.

> > +
> > +      \sa ExposureTime
> > +      size: [2]
> > ...
>
> --
> Regards,
>
> Laurent Pinchart

Regards,
Naush
Hi Naush, Laurent,

On Mon, May 25, 2020 at 01:23:02AM +0300, Laurent Pinchart wrote:
> Hi Naush,
>
> Thank you for the patch.
>
> On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote:
> > Add a float array control (controls::FrameDurations) to specify the
> > minimum and maximum (in that order) frame duration to be used by the
> > camera sensor.
> >
> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > ---
> >  src/libcamera/control_ids.yaml | 14 ++++++++++++++
> >  1 file changed, 14 insertions(+)
> >
> > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> > index 77ebc3f9..42694fc7 100644
> > --- a/src/libcamera/control_ids.yaml
> > +++ b/src/libcamera/control_ids.yaml
> > @@ -239,4 +239,18 @@ controls:
> >          pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels
> >          control can only be returned in metadata.
> >          size: [4]
> > +
> > +  - FrameDurations:
> > +      type: float
>
> Do we need sub-microsecond precision, or would a uint32_t control work ?
> I'd rather have a fixed precision if possible.
>
> > +      description: |
> > +        Specifies the minimum and maximum (in that order) allowable frame
> > +        duration, in micro-seconds, for the sensor to use. This could also limit
> > +        the largest exposure times the sensor can use. For example, if a maximum
> > +        frame duration of 33ms is requested (corresponding to 30 frames per
> > +        second), the sensor will not be able raise the exposure time above 33ms.

Isn't it s/maximum frame duration/minimum frame duration/ ?

If at least 30fps are requested, the exposure time cannot be longer
than 33msec ? Am I reading this wrong ?

> > +        Note that the sensor may not always be able to provide the requested
> > +        frame duration limits depending on its mode configuration.

I've been really confused by this, until I realized I was having a
hard time trying to figure out how applications should use this to
express a request for a precise duration, and how the library should
use it to report the actual frame duration in metadata. Then I
realized this could probably be better interpreted not as a frame
duration request or result, but as a hint to the AE algorithm to clamp
its calculation into certain duration limits.

And I agree a duration range might be a good hint to drive the AE
routine, and if I got the rest of the series right, that's what
happens in the IPA with these patches.

If that's actually the intention, what about making this an
"AeFrameDurationLimits" control, to make it clear it applies to AE, and
using "FrameDuration" as an integer control to represent the requested
and precise frame duration in requests and metadata respectively.

On "FrameDuration" itself: would it make sense to only consider it
when running with AE off ? In that case it would be the application's
responsibility to calculate the exposure time and the frame duration
appropriately, but I think that's expected if you run in manual mode.
And for reference, that's what Android does as well :)

It would also rule out the need to specify how the frame duration
limits would be used when running in manual mode and wanting to
configure a precise value there, as I guess you would have to give min
and max the same value to express that.

This might open Pandora's box, but: is the frame duration a property of
the camera, or of a single stream ?

> This looks good to me, but I'd like to discuss the corner cases I can
> already think about.
>
> - Are there use cases for an application to specify a minimum frame
>   duration only, or a maximum frame duration only (that is, any frame
>   rate lower than a limit, or any frame rate higher than a limit) ? If
>   so, how should that be specified ? We could set the minimum value to 0
>   to mean that the maximum frame rate is unbounded, and the maximum
>   value to UINT32_MAX to mean that the minimum frame rate is unbounded.
>   Is there anything I'm overlooking ?

If this becomes an AE-related control and you specify that, I think
you should provide both values. If you don't care about one of the two,
just use the values of the (forthcoming) property that reports the min
and max frame durations of a camera. If an application does not
specify this control, the AE routine runs free, within the limits of
the camera sensor capabilities.

Does it make sense ?

Thanks
  j

> - Is 0 an acceptable value ? Or should 1 be the minimum value ?
>
> - What happens if the requested frame duration isn't achievable ? Should
>   we specify that the camera will use a frame duration as close to the
>   requested range as possible ? Or could there be cases where a
>   different behaviour would be needed ?
>
> - Not something we need to address now, but do you see any future
>   relation between this control and anti-banding (50 or 60Hz flicker
>   avoidance) ? Anti-banding will mostly restrict possible exposure
>   times, and should only indirectly interact with the frame duration if
>   I'm not mistaken, is that correct ?
>
> > +
> > +      \sa ExposureTime
> > +      size: [2]
> > ...
>
> --
> Regards,
>
> Laurent Pinchart
> _______________________________________________
> libcamera-devel mailing list
> libcamera-devel@lists.libcamera.org
> https://lists.libcamera.org/listinfo/libcamera-devel
On Tue, May 26, 2020 at 09:59:57AM +0100, Naushir Patuck wrote:
> On Sun, 24 May 2020 at 23:23, Laurent Pinchart wrote:
> > On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote:
> > > Add a float array control (controls::FrameDurations) to specify the
> > > minimum and maximum (in that order) frame duration to be used by the
> > > camera sensor.
> > >
> > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > > ---
> > >  src/libcamera/control_ids.yaml | 14 ++++++++++++++
> > >  1 file changed, 14 insertions(+)
> > >
> > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> > > index 77ebc3f9..42694fc7 100644
> > > --- a/src/libcamera/control_ids.yaml
> > > +++ b/src/libcamera/control_ids.yaml
> > > @@ -239,4 +239,18 @@ controls:
> > >          pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels
> > >          control can only be returned in metadata.
> > >          size: [4]
> > > +
> > > +  - FrameDurations:
> > > +      type: float
> >
> > Do we need sub-microsecond precision, or would a uint32_t control work ?
> > I'd rather have a fixed precision if possible.
>
> The reason I chose floating point here was because of legacy standards
> (e.g. 29.97/59.94 fps for SMPTE). It also does allow us to specify
> < 1 fps if a user ever needs to (e.g. for a time lapse type capture use
> case).

But we're now using a frame duration in µs. A value lower than one
would correspond to a frame rate higher than 1Mfps, which is unlikely
:-)

> > > +      description: |
> > > +        Specifies the minimum and maximum (in that order) allowable frame
> > > +        duration, in micro-seconds, for the sensor to use. This could also limit
> > > +        the largest exposure times the sensor can use. For example, if a maximum
> > > +        frame duration of 33ms is requested (corresponding to 30 frames per
> > > +        second), the sensor will not be able raise the exposure time above 33ms.
> > > +        Note that the sensor may not always be able to provide the requested
> > > +        frame duration limits depending on its mode configuration.
> >
> > This looks good to me, but I'd like to discuss the corner cases I can
> > already think about.
> >
> > - Are there use cases for an application to specify a minimum frame
> >   duration only, or a maximum frame duration only (that is, any frame
> >   rate lower than a limit, or any frame rate higher than a limit) ? If
> >   so, how should that be specified ? We could set the minimum value to 0
> >   to mean that the maximum frame rate is unbounded, and the maximum
> >   value to UINT32_MAX to mean that the minimum frame rate is unbounded.
> >   Is there anything I'm overlooking ?
>
> Good point, I can see we might only want to set one limit. In these
> cases, the sensor specified limit would be used.

How should applications specify that ? By setting the other value to 0
(for minimum) or UINT32_MAX (for maximum) ? Or do you think another
means would be better ?

> Speaking of this, I was going to address sensor reported framerate
> after this patch. Do we need a way for the sensor driver to specify
> its framerate limits if the application wants to know?

I think reporting the minimum and maximum achievable frame durations is
useful. It could be done by specifying FrameDurations (or
FrameDurationLimits, as proposed in the review of 2/2) as a property
(Camera::properties()). The maximum won't be a problem, but the minimum
(highest frame rate) may depend on the frame size, so this can be
tricky.

> Perhaps this should be part of CameraSensorInfo, although it will be
> mode specific? Or perhaps we just use the VBLANK min/max to give us
> the limits?

CameraSensorInfo is meant to convey information specific to the mode
that has been selected (or, more generally speaking, the sensor
configuration that has been set). It won't be a good match to report
data that need to be set in camera static properties, at least when
passed to IPAInterface::configure().

The maximum achievable frame rate depends on both the sensor and the
ISP, as the ISP bandwidth limit may be lower than what the sensor
supports. Could it also depend on the algorithms (and would thus need
to involve the IPA), or would it be possible to always compute the
limits in the pipeline handler ?

We need to figure out what information we need to calculate the limits,
and identify what information needs to be retrieved from the kernel (if
any), what static information about the sensor should come from a
userspace sensor database (similar to CamHelper, but moved to the
libcamera core or libipa), and how the pipeline handler and IPA should
collaborate to perform the computation. I can help designing all this,
but if you already have an idea regarding how it could be done (even if
in a way that would be specific to Raspberry Pi) that would be useful
as a base for discussions.

> > - Is 0 an acceptable value ? Or should 1 be the minimum value ?
>
> If we stick to floating point, we should use 0.0 as the min limit.
>
> > - What happens if the requested frame duration isn't achievable ? Should
> >   we specify that the camera will use a frame duration as close to the
> >   requested range as possible ? Or could there be cases where a
> >   different behaviour would be needed ?
>
> I think the only sensible thing would be to truncate the requested
> values to the sensor limits. See above on how we get the limits from
> the sensor though.

That sounds good to me. Could you update the description to document
this ?

> > - Not something we need to address now, but do you see any future
> >   relation between this control and anti-banding (50 or 60Hz flicker
> >   avoidance) ? Anti-banding will mostly restrict possible exposure
> >   times, and should only indirectly interact with the frame duration if
> >   I'm not mistaken, is that correct ?
>
> There should be minimal interaction with framerate control. As you
> said, we will restrict use of certain exposure times for antibanding,
> but will have more flexibility with framerate.

OK, good to have a confirmation that I shouldn't worry about it :-)

> > > +
> > > +      \sa ExposureTime
> > > +      size: [2]
> > > ...
Hi Jacopo,

On Wed, May 27, 2020 at 11:47:34PM +0200, Jacopo Mondi wrote:
> On Mon, May 25, 2020 at 01:23:02AM +0300, Laurent Pinchart wrote:
> > On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote:
> > > Add a float array control (controls::FrameDurations) to specify the
> > > minimum and maximum (in that order) frame duration to be used by the
> > > camera sensor.
> > >
> > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > > ---
> > >  src/libcamera/control_ids.yaml | 14 ++++++++++++++
> > >  1 file changed, 14 insertions(+)
> > >
> > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml
> > > index 77ebc3f9..42694fc7 100644
> > > --- a/src/libcamera/control_ids.yaml
> > > +++ b/src/libcamera/control_ids.yaml
> > > @@ -239,4 +239,18 @@ controls:
> > >          pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels
> > >          control can only be returned in metadata.
> > >          size: [4]
> > > +
> > > +  - FrameDurations:
> > > +      type: float
> >
> > Do we need sub-microsecond precision, or would a uint32_t control work ?
> > I'd rather have a fixed precision if possible.
> >
> > > +      description: |
> > > +        Specifies the minimum and maximum (in that order) allowable frame
> > > +        duration, in micro-seconds, for the sensor to use. This could also limit
> > > +        the largest exposure times the sensor can use. For example, if a maximum
> > > +        frame duration of 33ms is requested (corresponding to 30 frames per
> > > +        second), the sensor will not be able raise the exposure time above 33ms.
>
> Isn't it s/maximum frame duration/minimum frame duration/ ?
>
> If at least 30fps are requested, the exposure time cannot be longer
> than 33msec ? Am I reading this wrong ?

I may read it wrong, but I think Naush is correct here. If a *maximum*
frame duration of 33ms is requested [that is, a *minimum* frame rate of
30fps], the sensor will not be able to *raise* the exposure time above
33ms.

> > > +        Note that the sensor may not always be able to provide the requested
> > > +        frame duration limits depending on its mode configuration.
>
> I've been really confused by this, until I realized I was having a
> hard time trying to figure out how applications should use this to
> express a request for a precise duration, and how the library should
> use it to report the actual frame duration in metadata. Then I
> realized this could probably be better interpreted not as a frame
> duration request or result, but as a hint to the AE algorithm to clamp
> its calculation into certain duration limits.
>
> And I agree a duration range might be a good hint to drive the AE
> routine, and if I got the rest of the series right, that's what
> happens in the IPA with these patches.
>
> If that's actually the intention, what about making this an
> "AeFrameDurationLimits" control, to make it clear it applies to AE, and
> using "FrameDuration" as an integer control to represent the requested
> and precise frame duration in requests and metadata respectively.

I think (Ae)FrameDurationLimits makes sense as a control (and as a
property too, with the caveat that the minimum frame duration - highest
frame rate - can be dependent on the stream resolution), and
FrameDuration makes sense for metadata. More about FrameDuration as a
control below.

> On "FrameDuration" itself: would it make sense to only consider it
> when running with AE off ? In that case it would be the application's
> responsibility to calculate the exposure time and the frame duration
> appropriately, but I think that's expected if you run in manual mode.
> And for reference, that's what Android does as well :)
>
> It would also rule out the need to specify how the frame duration
> limits would be used when running in manual mode and wanting to
> configure a precise value there, as I guess you would have to give min
> and max the same value to express that.
>
> This might open Pandora's box, but: is the frame duration a property of
> the camera, or of a single stream ?

I've opened the same box in my previous e-mail :-) I think the frame
duration is a property of the camera, as all streams share the same
sensor, and are thus slave to the same frame rate. Some streams could
drop frames, but I think I would then report that as a decimation factor
per stream (not sure what the use cases would be though). This being
said, the maximum frame rate (minimum frame duration) can depend on the
configuration of the camera, due to bandwidth limitations. It shouldn't
affect the (Ae)FrameDuration(Limits) control or metadata, but will
affect the (Ae)FrameDurationLimits property. It's something we need to
address eventually, not necessarily as part of this series; we could
start with a limit that doesn't depend on the camera configuration and
then build additional features on top.

> > This looks good to me, but I'd like to discuss the corner cases I can
> > already think about.
> >
> > - Are there use cases for an application to specify a minimum frame
> >   duration only, or a maximum frame duration only (that is, any frame
> >   rate lower than a limit, or any frame rate higher than a limit) ? If
> >   so, how should that be specified ? We could set the minimum value to 0
> >   to mean that the maximum frame rate is unbounded, and the maximum
> >   value to UINT32_MAX to mean that the minimum frame rate is unbounded.
> >   Is there anything I'm overlooking ?
>
> If this becomes an AE-related control and you specify that, I think
> you should provide both values. If you don't care about one of the two,
> just use the values of the (forthcoming) property that reports the min
> and max frame durations of a camera. If an application does not
> specify this control, the AE routine runs free, within the limits of
> the camera sensor capabilities.
>
> Does it make sense ?

Yes that can work. Specifying anything below the minimum possible or
above the maximum possible value supported by the camera will be clamped
to the camera limits anyway.

> > - Is 0 an acceptable value ? Or should 1 be the minimum value ?
> >
> > - What happens if the requested frame duration isn't achievable ? Should
> >   we specify that the camera will use a frame duration as close to the
> >   requested range as possible ? Or could there be cases where a
> >   different behaviour would be needed ?
> >
> > - Not something we need to address now, but do you see any future
> >   relation between this control and anti-banding (50 or 60Hz flicker
> >   avoidance) ? Anti-banding will mostly restrict possible exposure
> >   times, and should only indirectly interact with the frame duration if
> >   I'm not mistaken, is that correct ?
> >
> > > +
> > > +      \sa ExposureTime
> > > +      size: [2]
> > > ...
Hi Laurent and Jacopo, I'll address the comments from both emails jointly below: On Thu, 28 May 2020 at 01:55, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote: > > Hi Jacopo, > > On Wed, May 27, 2020 at 11:47:34PM +0200, Jacopo Mondi wrote: > > On Mon, May 25, 2020 at 01:23:02AM +0300, Laurent Pinchart wrote: > > > On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote: > > > > Add a float array control (controls::FrameDurations) to specify the > > > > minimum and maximum (in that order) frame duration to be used by the > > > > camera sensor. > > > > > > > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > > > > --- > > > > src/libcamera/control_ids.yaml | 14 ++++++++++++++ > > > > 1 file changed, 14 insertions(+) > > > > > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml > > > > index 77ebc3f9..42694fc7 100644 > > > > --- a/src/libcamera/control_ids.yaml > > > > +++ b/src/libcamera/control_ids.yaml > > > > @@ -239,4 +239,18 @@ controls: > > > > pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels > > > > control can only be returned in metadata. > > > > size: [4] > > > > + > > > > + - FrameDurations: > > > > + type: float > > > > > > Do we need sub-microsecond precision, or would a uint32_t control work ? > > > I'd rather have a fixed precision if possible. > > > > > > > + description: | > > > > + Specifies the minimum and maximum (in that order) allowable frame > > > > + duration, in micro-seconds, for the sensor to use. This could also limit > > > > + the largest exposure times the sensor can use. For example, if a maximum > > > > + frame duration of 33ms is requested (corresponding to 30 frames per > > > > + second), the sensor will not be able raise the exposure time above 33ms. > > > > Isn't it s/maximum frame duration/minimum frame duration/ ? > > > > If at least 30fps are requested, the exposure time cannot be longer > > than 33msec ? Am I reading this wrong ? 
> > I may read it wrong, but I think Naush is correct here. If a *maximum* > frame duration of 33ms is requested [that is, a *minimum* frame rate of > 30fps], the sensor will not be able to *raise* the exposure time above > 33ms. > Yes, my wording in the description matches Laurent's interpretation. I must admit, I do find it more difficult talking about frame durations vs framerates :) > > > > + Note that the sensor may not always be able to provide the requested > > > > + frame duration limits depending on its mode configuration. > > > > I've been really confused by this, until I realized I was having an > > hard time trying to figure out how to use this for applications to > > express a request for a precise duration and how should the library > > have used it to report the actual frame durtion in metadata. Until I > > realized this could probably be better interpreted not as a frame > > duration request or result, but as an hint to the AE algorithm to > > clamp its calculation into certain duration limtis. > > > > And I agree a duration range might be a good hint to drive the AE > > routine and if I got the rest of the series right, that's > > what happens in the IPA with these patches. > > > > If that's actually the intention, what about making of this an > > "AeFrameDurationLimits" control, to make it clear it applies to AE and > > use "FrameDuration" as an integer control to represent the requested > > and precise frame duration in requests and metadata respectively. > > I think (Ae)FrameDurationLimits makes sense as a control (and as a > property too, with the caveat that the minimum frame duration - highest > frame rate - can be dependent on the stream resolution), and > FrameDuration makes sense for metadata. Good points here. 
When doing this change, I purposely did not want to directly tie this control with AE for a few of reasons, even though there is an interaction: 1) Even though the FrameDurationLimits does restrict what the AE can ask for, in our IPA, they are two distinct and separate operations. In fact, our AE does not even see the frameduration limits that the user set. The results of AE just get clipped by the FrameDurationLimits. This is possibly sub-optimal (see my last email for the discussion), but still gives the desired exposure. In future we will probably optimise this, but given the complexity, will leave it for another change. 2) AE does have a way to clip requested shutter speeds using the "Exposure Profile" tuning parameters which offers another degree of freedom to control the interaction of shutter speed with analogue gain. FrameDurationLimits only controls shutter speed which may not be desirable. > > More about FrameDuration as a control below. > > > On "FrameDuration" itself: would it make sense to only considered it > > when running with AE off ? In that case it would be applications > > responsibility to opportunely calculate the exposure time and the frame > > duration, but I think it's expected if you run in manual mode. And for > > reference, that's what Android does as well :) > > > > It would also rule out the need to specify how the frameDurationLimits > > would be used when running in manual mode and wanting to configure a > > precise value there, as I guess you would have to give to min and max > > the same value to express that. FrameDurationLimits should only really apply with AE enabled. AE varies the shutter speed all over the place, that could cause framerate changes that we need to clip. If AE is disabled and we have a manual shutter speed set, that dictates the frame duration, and any limits will not apply. Perhaps I should add this in the description? 
> > > > this might open pandora's box, but: is the frame duration a property of > > the camera, or of a single stream ? > > I've opened the same box in my previous e-mail :-) I think the frame > duration is a property of the camera, as all streams share the same > sensor, and are thus slave to the same frame rate. Some streams could > drop frames, but I think I would then report that as a decimation factor > per stream (not sure what the use cases would be though). This being > said, the maximum frame rate (minimum frame duration) can depend on the > configuration of the camera, due to bandwidth limitations. It shouldn't > affect the (Ae)FrameDuration(Limits) control or metadata, but will > affect the (Ae)FrameDurationLimits property. It's something we need to > address eventually, not necessarily as part of this series, we could > start with a limit that doesn't depend on the camera configuration and > then build additional features on top. Agreed. > > > > This looks good to me, but I'd like to discuss the corner cases I can > > > already think about. > > > > > > - Are there use cases for an application to specify a minimum frame > > > duration only, or a maximum frame duration only (that is, any frame > > > rate lower than a limit, or any frame rate higher than a limit) ? If > > > so, how should that be specified ? We could set the minimum value to 0 > > > to mean that the maximum frame rate is unbounded, and the maximum > > > value to UINT32_MAX to mean that the minimum frame rate is unbounded. > > > Is there anything I'm overlooking ? > > > > If this becames an AE-related control and you specify that, I think > > you should provide both values. If you don't care about one of the two > > just use the values of the (forthcoming) property that reports the > > min and max frame durations of a camera. If an applications does not > > specify this control, the AE routine runs free, in the limits > > of the camera sensor capabilities. > > > > Does it make sense ? 
> > Yes that can work. Specifying anything below the minimum possible or > above the maximum possible value supported by the camera will be clamped > to the camera limits anyway. > > > > - Is 0 an acceptable value ? Or should 1 be the minimum value ? > > > > > > - What happens if the requested frame duration isn't achievable ? Should > > > we specify that the camera will use a frame duration as close to the > > > requested range as possible ? Or could there be cases where a > > > different behaviour would be needed ? > > > > > > - Not something we need to address now, but do you see any future > > > relation between this control and anti-banding (50 or 60Hz flicker > > > avoidance) ? Anti-banding will mostly restrict possible exposure > > > times, and should only indirectly interact with the frame duration if > > > I'm not mistaken, is that correct ? > > > > > > > + > > > > + \sa ExposureTime > > > > + size: [2] > > > > ... > > -- > Regards, > > Laurent Pinchart Regards, Naush
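The clamping behaviour agreed above (any requested limits below the camera minimum or above the camera maximum are clamped to the camera's own range, so 0 and UINT32_MAX effectively mean "unbounded") could look like this minimal sketch; names and units (microseconds) are hypothetical:

```python
# Illustrative sketch of clamping a requested (min, max) frame
# duration pair to the camera's supported range. Hypothetical names.

def clamp_frame_durations(requested_us, camera_limits_us):
    """Clamp the requested [min, max] frame duration range to what
    the camera can actually deliver."""
    req_min, req_max = requested_us
    cam_min, cam_max = camera_limits_us
    return (max(req_min, cam_min), min(req_max, cam_max))

# Requesting (0, UINT32_MAX) means "don't care": it simply clamps
# down to the camera's own limits.
print(clamp_frame_durations((0, 2**32 - 1), (11111, 66666)))  # (11111, 66666)
```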
Hi Laurent, On Thu, May 28, 2020 at 03:55:41AM +0300, Laurent Pinchart wrote: > Hi Jacopo, > > On Wed, May 27, 2020 at 11:47:34PM +0200, Jacopo Mondi wrote: > > On Mon, May 25, 2020 at 01:23:02AM +0300, Laurent Pinchart wrote: > > > On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote: > > > > Add a float array control (controls::FrameDurations) to specify the > > > > minimum and maximum (in that order) frame duration to be used by the > > > > camera sensor. > > > > > > > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > > > > --- > > > > src/libcamera/control_ids.yaml | 14 ++++++++++++++ > > > > 1 file changed, 14 insertions(+) > > > > > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml > > > > index 77ebc3f9..42694fc7 100644 > > > > --- a/src/libcamera/control_ids.yaml > > > > +++ b/src/libcamera/control_ids.yaml > > > > @@ -239,4 +239,18 @@ controls: > > > > pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels > > > > control can only be returned in metadata. > > > > size: [4] > > > > + > > > > + - FrameDurations: > > > > + type: float > > > > > > Do we need sub-microsecond precision, or would a uint32_t control work ? > > > I'd rather have a fixed precision if possible. > > > > > > > + description: | > > > > + Specifies the minimum and maximum (in that order) allowable frame > > > > + duration, in micro-seconds, for the sensor to use. This could also limit > > > > + the largest exposure times the sensor can use. For example, if a maximum > > > > + frame duration of 33ms is requested (corresponding to 30 frames per > > > > + second), the sensor will not be able raise the exposure time above 33ms. > > > > Isn't it s/maximum frame duration/minimum frame duration/ ? > > > > If at least 30fps are requested, the exposure time cannot be longer > > than 33msec ? Am I reading this wrong ? > > I may read it wrong, but I think Naush is correct here. 
If a *maximum* > frame duration of 33ms is requested [that is, a *minimum* frame rate of > 30fps], the sensor will not be able to *raise* the exposure time above > 33ms. Ouch, correct, sorry for the noise. > > > > > + Note that the sensor may not always be able to provide the requested > > > > + frame duration limits depending on its mode configuration. > > > > I've been really confused by this, until I realized I was having an > > hard time trying to figure out how to use this for applications to > > express a request for a precise duration and how should the library > > have used it to report the actual frame durtion in metadata. Until I > > realized this could probably be better interpreted not as a frame > > duration request or result, but as an hint to the AE algorithm to > > clamp its calculation into certain duration limtis. > > > > And I agree a duration range might be a good hint to drive the AE > > routine and if I got the rest of the series right, that's > > what happens in the IPA with these patches. > > > > If that's actually the intention, what about making of this an > > "AeFrameDurationLimits" control, to make it clear it applies to AE and > > use "FrameDuration" as an integer control to represent the requested > > and precise frame duration in requests and metadata respectively. > > I think (Ae)FrameDurationLimits makes sense as a control (and as a > property too, with the caveat that the minimum frame duration - highest > frame rate - can be dependent on the stream resolution), and > FrameDuration makes sense for metadata. > > More about FrameDuration as a control below. > > > On "FrameDuration" itself: would it make sense to only considered it > > when running with AE off ? In that case it would be applications > > responsibility to opportunely calculate the exposure time and the frame > > duration, but I think it's expected if you run in manual mode. 
And for > > reference, that's what Android does as well :) > > > > It would also rule out the need to specify how the frameDurationLimits > > would be used when running in manual mode and wanting to configure a > > precise value there, as I guess you would have to give to min and max > > the same value to express that. > > > > this might open pandora's box, but: is the frame duration a property of > > the camera, or of a single stream ? > > I've opened the same box in my previous e-mail :-) I think the frame > duration is a property of the camera, as all streams share the same > sensor, and are thus slave to the same frame rate. Some streams could > drop frames, but I think I would then report that as a decimation factor > per stream (not sure what the use cases would be though). This being I was actually more concerned about processing that might stall the pipeline, like JPEG encoding. > said, the maximum frame rate (minimum frame duration) can depend on the > configuration of the camera, due to bandwidth limitations. It shouldn't > affect the (Ae)FrameDuration(Limits) control or metadata, but will > affect the (Ae)FrameDurationLimits property. It's something we need to > address eventually, not necessarily as part of this series, we could > start with a limit that doesn't depend on the camera configuration and > then build additional features on top. > I might actually need to fill that gap very soon as I need a way to report the per-configuration frame duration interval. This is something not part of this series, but it's good if we at least stabilize on naming. > > > This looks good to me, but I'd like to discuss the corner cases I can > > > already think about. > > > > > > - Are there use cases for an application to specify a minimum frame > > > duration only, or a maximum frame duration only (that is, any frame > > > rate lower than a limit, or any frame rate higher than a limit) ? If > > > so, how should that be specified ? 
We could set the minimum value to 0 > > > to mean that the maximum frame rate is unbounded, and the maximum > > > value to UINT32_MAX to mean that the minimum frame rate is unbounded. > > > Is there anything I'm overlooking ? > > > > If this becames an AE-related control and you specify that, I think > > you should provide both values. If you don't care about one of the two > > just use the values of the (forthcoming) property that reports the > > min and max frame durations of a camera. If an applications does not > > specify this control, the AE routine runs free, in the limits > > of the camera sensor capabilities. > > > > Does it make sense ? > > Yes that can work. Specifying anything below the minimum possible or > above the maximum possible value supported by the camera will be clamped > to the camera limits anyway. > True, we can clamp to those limits indeed > > > - Is 0 an acceptable value ? Or should 1 be the minimum value ? > > > > > > - What happens if the requested frame duration isn't achievable ? Should > > > we specify that the camera will use a frame duration as close to the > > > requested range as possible ? Or could there be cases where a > > > different behaviour would be needed ? > > > > > > - Not something we need to address now, but do you see any future > > > relation between this control and anti-banding (50 or 60Hz flicker > > > avoidance) ? Anti-banding will mostly restrict possible exposure > > > times, and should only indirectly interact with the frame duration if > > > I'm not mistaken, is that correct ? > > > > > > > + > > > > + \sa ExposureTime > > > > + size: [2] > > > > ... > > -- > Regards, > > Laurent Pinchart
Hi Naush, On Thu, May 28, 2020 at 09:52:51AM +0100, Naushir Patuck wrote: > Hi Laurent and Jacopo, > > I'll address the comments from both emails jointly below: > > On Thu, 28 May 2020 at 01:55, Laurent Pinchart > <laurent.pinchart@ideasonboard.com> wrote: > > > > Hi Jacopo, > > > > On Wed, May 27, 2020 at 11:47:34PM +0200, Jacopo Mondi wrote: > > > On Mon, May 25, 2020 at 01:23:02AM +0300, Laurent Pinchart wrote: > > > > On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote: > > > > > Add a float array control (controls::FrameDurations) to specify the > > > > > minimum and maximum (in that order) frame duration to be used by the > > > > > camera sensor. > > > > > > > > > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > > > > > --- > > > > > src/libcamera/control_ids.yaml | 14 ++++++++++++++ > > > > > 1 file changed, 14 insertions(+) > > > > > > > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml > > > > > index 77ebc3f9..42694fc7 100644 > > > > > --- a/src/libcamera/control_ids.yaml > > > > > +++ b/src/libcamera/control_ids.yaml > > > > > @@ -239,4 +239,18 @@ controls: > > > > > pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels > > > > > control can only be returned in metadata. > > > > > size: [4] > > > > > + > > > > > + - FrameDurations: > > > > > + type: float > > > > > > > > Do we need sub-microsecond precision, or would a uint32_t control work ? > > > > I'd rather have a fixed precision if possible. > > > > > > > > > + description: | > > > > > + Specifies the minimum and maximum (in that order) allowable frame > > > > > + duration, in micro-seconds, for the sensor to use. This could also limit > > > > > + the largest exposure times the sensor can use. For example, if a maximum > > > > > + frame duration of 33ms is requested (corresponding to 30 frames per > > > > > + second), the sensor will not be able raise the exposure time above 33ms. 
> > > > > > Isn't it s/maximum frame duration/minimum frame duration/ ? > > > > > > If at least 30fps are requested, the exposure time cannot be longer > > > than 33msec ? Am I reading this wrong ? > > > > I may read it wrong, but I think Naush is correct here. If a *maximum* > > frame duration of 33ms is requested [that is, a *minimum* frame rate of > > 30fps], the sensor will not be able to *raise* the exposure time above > > 33ms. > > > > Yes, my wording in the description matches Laurent's interpretation. > I must admit, I do find it more difficult talking about frame > durations vs framerates :) > > > > > > + Note that the sensor may not always be able to provide the requested > > > > > + frame duration limits depending on its mode configuration. > > > > > > I've been really confused by this, until I realized I was having an > > > hard time trying to figure out how to use this for applications to > > > express a request for a precise duration and how should the library > > > have used it to report the actual frame durtion in metadata. Until I > > > realized this could probably be better interpreted not as a frame > > > duration request or result, but as an hint to the AE algorithm to > > > clamp its calculation into certain duration limtis. > > > > > > And I agree a duration range might be a good hint to drive the AE > > > routine and if I got the rest of the series right, that's > > > what happens in the IPA with these patches. > > > > > > If that's actually the intention, what about making of this an > > > "AeFrameDurationLimits" control, to make it clear it applies to AE and > > > use "FrameDuration" as an integer control to represent the requested > > > and precise frame duration in requests and metadata respectively. 
> > > > I think (Ae)FrameDurationLimits makes sense as a control (and as a > > property too, with the caveat that the minimum frame duration - highest > > frame rate - can be dependent on the stream resolution), and > > FrameDuration makes sense for metadata. > > Good points here. When doing this change, I purposely did not want to > directly tie this control with AE for a few of reasons, even though > there is an interaction: > > 1) Even though the FrameDurationLimits does restrict what the AE can > ask for, in our IPA, they are two distinct and separate operations. > In fact, our AE does not even see the frameduration limits that the > user set. The results of AE just get clipped by the > FrameDurationLimits. This is possibly sub-optimal (see my last email > for the discussion), but still gives the desired exposure. In future > we will probably optimise this, but given the complexity, will leave > it for another change. Not sure I got this: "AE just get clipped by the FrameDuration" doesn't that mean this is an AE-tuning parameter ? Ideally, I think we should have both an "AE frame duration limits" and an "AE exposure limits" to allow applications (with the help of some pre-cooked profile possibly) to implement the canonical shutter-priority and exposure-priority modes.. > 2) AE does have a way to clip requested shutter speeds using the > "Exposure Profile" tuning parameters which offers another degree of > freedom to control the interaction of shutter speed with analogue > gain. FrameDurationLimits only controls shutter speed which may not > be desirable. Could this be addressed with an AE-exposure-priority control ? > I'm not sure, based on your last points, what's your view on this control's future development. Should this stay "frameDuration" or should this become AeFrameDurationLimit (or AeTargetDuration or whatever else appropriate) ? > > > > More about FrameDuration as a control below.
> > > > > On "FrameDuration" itself: would it make sense to only considered it > > > when running with AE off ? In that case it would be applications > > > responsibility to opportunely calculate the exposure time and the frame > > > duration, but I think it's expected if you run in manual mode. And for > > > reference, that's what Android does as well :) > > > > > > It would also rule out the need to specify how the frameDurationLimits > > > would be used when running in manual mode and wanting to configure a > > > precise value there, as I guess you would have to give to min and max > > > the same value to express that. > > FrameDurationLimits should only really apply with AE enabled. AE > varies the shutter speed all over the place, that could cause > framerate changes that we need to clip. If AE is disabled and we have > a manual shutter speed set, that dictates the frame duration, and any > limits will not apply. Perhaps I should add this in the description? > It really depends on what developments you foresee here: if the control becomes an AE-related one, I think it's not necessary to specify that. All in all, to me it seems like "frameDuration" should be reserved for manual duration control and to report the actual duration of the captured frames. As I'm working on reporting the camera frame duration limits, syncing on the naming and semantics associated with the control could help both of us progress, knowing we won't step on each other's toes... Thanks j > > > > > > > this might open pandora's box, but: is the frame duration a property of > > > the camera, or of a single stream ? > > > > I've opened the same box in my previous e-mail :-) I think the frame > > duration is a property of the camera, as all streams share the same > > sensor, and are thus slave to the same frame rate. Some streams could > > drop frames, but I think I would then report that as a decimation factor > > per stream (not sure what the use cases would be though).
This being > > said, the maximum frame rate (minimum frame duration) can depend on the > > configuration of the camera, due to bandwidth limitations. It shouldn't > > affect the (Ae)FrameDuration(Limits) control or metadata, but will > > affect the (Ae)FrameDurationLimits property. It's something we need to > > address eventually, not necessarily as part of this series, we could > > start with a limit that doesn't depend on the camera configuration and > > then build additional features on top. > > Agreed. > > > > > > > This looks good to me, but I'd like to discuss the corner cases I can > > > > already think about. > > > > > > > > - Are there use cases for an application to specify a minimum frame > > > > duration only, or a maximum frame duration only (that is, any frame > > > > rate lower than a limit, or any frame rate higher than a limit) ? If > > > > so, how should that be specified ? We could set the minimum value to 0 > > > > to mean that the maximum frame rate is unbounded, and the maximum > > > > value to UINT32_MAX to mean that the minimum frame rate is unbounded. > > > > Is there anything I'm overlooking ? > > > > > > If this becames an AE-related control and you specify that, I think > > > you should provide both values. If you don't care about one of the two > > > just use the values of the (forthcoming) property that reports the > > > min and max frame durations of a camera. If an applications does not > > > specify this control, the AE routine runs free, in the limits > > > of the camera sensor capabilities. > > > > > > Does it make sense ? > > > > Yes that can work. Specifying anything below the minimum possible or > > above the maximum possible value supported by the camera will be clamped > > to the camera limits anyway. > > > > > > - Is 0 an acceptable value ? Or should 1 be the minimum value ? > > > > > > > > - What happens if the requested frame duration isn't achievable ? 
Should > > > > we specify that the camera will use a frame duration as close to the > > > > requested range as possible ? Or could there be cases where a > > > > different behaviour would be needed ? > > > > > > > > - Not something we need to address now, but do you see any future > > > > relation between this control and anti-banding (50 or 60Hz flicker > > > > avoidance) ? Anti-banding will mostly restrict possible exposure > > > > times, and should only indirectly interact with the frame duration if > > > > I'm not mistaken, is that correct ? > > > > > > > > > + > > > > > + \sa ExposureTime > > > > > + size: [2] > > > > > ... > > > > -- > > Regards, > > > > Laurent Pinchart > > Regards, > Naush
Hi Jacopo, On Mon, 1 Jun 2020 at 16:40, Jacopo Mondi <jacopo@jmondi.org> wrote: > > Hi Naush, > > On Thu, May 28, 2020 at 09:52:51AM +0100, Naushir Patuck wrote: > > Hi Laurent and Jacopo, > > > > I'll address the comments from both emails jointly below: > > > > On Thu, 28 May 2020 at 01:55, Laurent Pinchart > > <laurent.pinchart@ideasonboard.com> wrote: > > > > > > Hi Jacopo, > > > > > > On Wed, May 27, 2020 at 11:47:34PM +0200, Jacopo Mondi wrote: > > > > On Mon, May 25, 2020 at 01:23:02AM +0300, Laurent Pinchart wrote: > > > > > On Wed, May 13, 2020 at 10:11:18AM +0100, Naushir Patuck wrote: > > > > > > Add a float array control (controls::FrameDurations) to specify the > > > > > > minimum and maximum (in that order) frame duration to be used by the > > > > > > camera sensor. > > > > > > > > > > > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > > > > > > --- > > > > > > src/libcamera/control_ids.yaml | 14 ++++++++++++++ > > > > > > 1 file changed, 14 insertions(+) > > > > > > > > > > > > diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml > > > > > > index 77ebc3f9..42694fc7 100644 > > > > > > --- a/src/libcamera/control_ids.yaml > > > > > > +++ b/src/libcamera/control_ids.yaml > > > > > > @@ -239,4 +239,18 @@ controls: > > > > > > pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels > > > > > > control can only be returned in metadata. > > > > > > size: [4] > > > > > > + > > > > > > + - FrameDurations: > > > > > > + type: float > > > > > > > > > > Do we need sub-microsecond precision, or would a uint32_t control work ? > > > > > I'd rather have a fixed precision if possible. > > > > > > > > > > > + description: | > > > > > > + Specifies the minimum and maximum (in that order) allowable frame > > > > > > + duration, in micro-seconds, for the sensor to use. This could also limit > > > > > > + the largest exposure times the sensor can use. 
For example, if a maximum > > > > > > + frame duration of 33ms is requested (corresponding to 30 frames per > > > > > > + second), the sensor will not be able raise the exposure time above 33ms. > > > > > > > > Isn't it s/maximum frame duration/minimum frame duration/ ? > > > > > > > > If at least 30fps are requested, the exposure time cannot be longer > > > > than 33msec ? Am I reading this wrong ? > > > > > > I may read it wrong, but I think Naush is correct here. If a *maximum* > > > frame duration of 33ms is requested [that is, a *minimum* frame rate of > > > 30fps], the sensor will not be able to *raise* the exposure time above > > > 33ms. > > > > > > > Yes, my wording in the description matches Laurent's interpretation. > > I must admit, I do find it more difficult talking about frame > > durations vs framerates :) > > > > > > > > + Note that the sensor may not always be able to provide the requested > > > > > > + frame duration limits depending on its mode configuration. > > > > > > > > I've been really confused by this, until I realized I was having an > > > > hard time trying to figure out how to use this for applications to > > > > express a request for a precise duration and how should the library > > > > have used it to report the actual frame durtion in metadata. Until I > > > > realized this could probably be better interpreted not as a frame > > > > duration request or result, but as an hint to the AE algorithm to > > > > clamp its calculation into certain duration limtis. > > > > > > > > And I agree a duration range might be a good hint to drive the AE > > > > routine and if I got the rest of the series right, that's > > > > what happens in the IPA with these patches. 
> > > > > > > > If that's actually the intention, what about making of this an > > > > "AeFrameDurationLimits" control, to make it clear it applies to AE and > > > > use "FrameDuration" as an integer control to represent the requested > > > > and precise frame duration in requests and metadata respectively. > > > > > > I think (Ae)FrameDurationLimits makes sense as a control (and as a > > > property too, with the caveat that the minimum frame duration - highest > > > frame rate - can be dependent on the stream resolution), and > > > FrameDuration makes sense for metadata. > > > > Good points here. When doing this change, I purposely did not want to > > directly tie this control with AE for a few of reasons, even though > > there is an interaction: > > > > 1) Even though the FrameDurationLimits does restrict what the AE can > > ask for, in our IPA, they are two distinct and separate operations. > > In fact, our AE does not even see the frameduration limits that the > > user set. The results of AE just get clipped by the > > FrameDurationLimits. This is possibly sub-optimal (see my last email > > for the discussion), but still gives the desired exposure. In future > > we will probably optimise this, but given the complexity, will leave > > it for another change. > > Not sure I got this: "AE just get clipped by the FrameDuration" > doesn't mean that this is an AE-tuning parameter ? Roughly speaking, the following sequence of events occur in the RPi IPA (per frame) for AE calculations: 1) AE gets given the current frame shutter speed and gain. 2) If the current exposure does not have the required exposure for the target brightness, it sets up a digital gain in the ISP to make up the difference. 3) It also generates a new shutter speed and gain value to program the sensor to achieve the target brightness without digital gain. This is done with no knowledge of what the maximum achievable exposure is (based on the FrameDuration). 
4) In the IPA, we look at this shutter speed and gain value requested by the AE. We clip the shutter speed based on the user-requested FrameDuration, if one was provided. So even though AE results do get clipped by the FrameDuration setting, AE has no knowledge about this. In the case of manual exposures (see below for a bit more context), the FrameDuration clipping will still apply exactly like above. Hence my choosing to call this control FrameDuration to decouple it from what the AE does. However, I am happy to change things and rename FrameDuration -> AeFrameDuration (or AeFrameDurationLimits?) if the consensus is there. > > Ideally, I think we should have both an "AE frame duration limits" and > an "AE exposure limits" to allow applications (with the help of some > pre-cooked profile possibily) to implement the canonical > shutter-priority and exposure-priority modes.. > > > 2) AE does have a way to clip requested shutter speeds using the > > "Exposure Profile" tuning parameters which offers another degree of > > freedom to control the interaction of shutter speed with analogue > > gain. FrameDurationLimits only controls shutter speed which may not > > be desirable. > > Could this be addressed with an AE-exposure-priority control ? You are right that we do need controls for both. The RPi IPA does have both controls. For "AE frame duration limits", we use FrameDuration (or AeFrameDuration depending on what we decide). For "AE exposure limits", we use the control AeExposureMode for exactly this. > > > > > I'm not sure, based on your last points, what's your view on this > control future developments. Should this stay "frameDuration" or > should this become AeFrameDurationLimit (or AeTargetDuration or > whatever else appropriate) ? See below. > > > > > > > More about FrameDuration as a control below. > > > > > > > On "FrameDuration" itself: would it make sense to only considered it > > > > when running with AE off ?
In that case it would be applications > > > > responsibility to opportunely calculate the exposure time and the frame > > > > duration, but I think it's expected if you run in manual mode. And for > > > > reference, that's what Android does as well :) > > > > > > > > It would also rule out the need to specify how the frameDurationLimits > > > > would be used when running in manual mode and wanting to configure a > > > > precise value there, as I guess you would have to give to min and max > > > > the same value to express that. > > > > FrameDurationLimits should only really apply with AE enabled. AE > > varies the shutter speed all over the place, that could cause > > framerate changes that we need to clip. If AE is disabled and we have > > a manual shutter speed set, that dictates the frame duration, and any > > limits will not apply. Perhaps I should add this in the description? > > I have to correct myself here. FrameDurationLimits will also apply with manual shutter speeds when AE is disabled. We could, say, have a manual shutter speed request of 25 ms, but still want to run at 30fps. There is a question on what happens if they clash, e.g. request a manual shutter speed of 66ms, and a FrameDuration of 30fps? What gets prioritised? Currently, in our implementation, FrameDuration will always be prioritised. > > It really depends what developments you foresee here, if the control > becomes an AE-relatd one I think it's not necessary to specify that. > > All in all, to me it seems like "frameDuration" should be reserved for > manual duration control and to report the actual duration of the > captured frames. So circling back to your earlier question: > Should this stay "frameDuration" or > should this become AeFrameDurationLimit (or AeTargetDuration or > whatever else appropriate) ? Given that I think FrameDurationLimits might apply in both AE and non-AE operations equally, I think we should keep the name as-is. 
However, again, if the consensus is to change, I am more than happy to make the change. > > As I'm working on reporting the camera frame duration limits, synching > on naming and semantic associated with the control could help both of > us progressing knowing we won't step on each other toes... Agreed, we should be consistent on what we decide. Please do let me know what the consensus is and I will be happy to follow it. Regards, Naush > Thanks > j > > > > > > > > > > > > this might open pandora's box, but: is the frame duration a property of > > > > the camera, or of a single stream ? > > > > > > I've opened the same box in my previous e-mail :-) I think the frame > > > duration is a property of the camera, as all streams share the same > > > sensor, and are thus slave to the same frame rate. Some streams could > > > drop frames, but I think I would then report that as a decimation factor > > > per stream (not sure what the use cases would be though). This being > > > said, the maximum frame rate (minimum frame duration) can depend on the > > > configuration of the camera, due to bandwidth limitations. It shouldn't > > > affect the (Ae)FrameDuration(Limits) control or metadata, but will > > > affect the (Ae)FrameDurationLimits property. It's something we need to > > > address eventually, not necessarily as part of this series, we could > > > start with a limit that doesn't depend on the camera configuration and > > > then build additional features on top. > > > > Agreed. > > > > > > > > > > This looks good to me, but I'd like to discuss the corner cases I can > > > > > already think about. > > > > > > > > > > - Are there use cases for an application to specify a minimum frame > > > > > duration only, or a maximum frame duration only (that is, any frame > > > > > rate lower than a limit, or any frame rate higher than a limit) ? If > > > > > so, how should that be specified ? 
We could set the minimum value to 0 > > > > > to mean that the maximum frame rate is unbounded, and the maximum > > > > > value to UINT32_MAX to mean that the minimum frame rate is unbounded. > > > > > Is there anything I'm overlooking ? > > > > > > > > If this becames an AE-related control and you specify that, I think > > > > you should provide both values. If you don't care about one of the two > > > > just use the values of the (forthcoming) property that reports the > > > > min and max frame durations of a camera. If an applications does not > > > > specify this control, the AE routine runs free, in the limits > > > > of the camera sensor capabilities. > > > > > > > > Does it make sense ? > > > > > > Yes that can work. Specifying anything below the minimum possible or > > > above the maximum possible value supported by the camera will be clamped > > > to the camera limits anyway. > > > > > > > > - Is 0 an acceptable value ? Or should 1 be the minimum value ? > > > > > > > > > > - What happens if the requested frame duration isn't achievable ? Should > > > > > we specify that the camera will use a frame duration as close to the > > > > > requested range as possible ? Or could there be cases where a > > > > > different behaviour would be needed ? > > > > > > > > > > - Not something we need to address now, but do you see any future > > > > > relation between this control and anti-banding (50 or 60Hz flicker > > > > > avoidance) ? Anti-banding will mostly restrict possible exposure > > > > > times, and should only indirectly interact with the frame duration if > > > > > I'm not mistaken, is that correct ? > > > > > > > > > > > + > > > > > > + \sa ExposureTime > > > > > > + size: [2] > > > > > > ... > > > > > > -- > > > Regards, > > > > > > Laurent Pinchart > > > > Regards, > > Naush
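The per-frame sequence Naush outlines earlier in this message (AE computes a shutter/gain pair with no knowledge of FrameDuration; the IPA then clips the shutter, with digital gain making up any exposure shortfall) can be sketched as follows. This is a rough illustration of the described flow, not RPi IPA code; all names are hypothetical:

```python
# Illustrative sketch of the described RPi IPA flow: clip the
# AE-requested shutter to the maximum frame duration (step 4), and
# let digital gain cover the exposure the clipped shutter lost
# (step 2). Hypothetical names, not real IPA identifiers.

def apply_frame_duration(ae_shutter_us, ae_analogue_gain, max_frame_duration_us):
    # The AE had no knowledge of the limit; clip its request here.
    shutter_us = min(ae_shutter_us, max_frame_duration_us)
    # Digital gain compensates for the exposure lost to clipping
    # (1.0 when nothing was clipped).
    digital_gain = ae_shutter_us / shutter_us
    return shutter_us, ae_analogue_gain, digital_gain

# A 66 ms AE request against a 33.333 ms max frame duration: the
# shutter is clipped and ~2x digital gain preserves brightness.
print(apply_frame_duration(66000, 2.0, 33333))
```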
Hi Naush, I won't go into details here as I would like Laurent to chime in and express his view on this topic, but I would have a question in the meantime. On Tue, Jun 02, 2020 at 10:38:41AM +0100, Naushir Patuck wrote: > Hi Jacopo, > > > On Mon, 1 Jun 2020 at 16:40, Jacopo Mondi <jacopo@jmondi.org> wrote: > > > > Hi Naush, > > > > On Thu, May 28, 2020 at 09:52:51AM +0100, Naushir Patuck wrote: [snip] > > > > > On "FrameDuration" itself: would it make sense to only considered it > > > > > when running with AE off ? In that case it would be applications > > > > > responsibility to opportunely calculate the exposure time and the frame > > > > > duration, but I think it's expected if you run in manual mode. And for > > > > > reference, that's what Android does as well :) > > > > > > > > > > It would also rule out the need to specify how the frameDurationLimits > > > > > would be used when running in manual mode and wanting to configure a > > > > > precise value there, as I guess you would have to give to min and max > > > > > the same value to express that. > > > > > > FrameDurationLimits should only really apply with AE enabled. AE > > > varies the shutter speed all over the place, that could cause > > > framerate changes that we need to clip. If AE is disabled and we have > > > a manual shutter speed set, that dictates the frame duration, and any > > > limits will not apply. Perhaps I should add this in the description? > > > > > I have to correct myself here. FrameDurationLimits will also apply > with manual shutter speeds when AE is disabled. We could, say, have a > manual shutter speed request of 25 ms, but still want to run at 30fps. > There is a question on what happens if they clash, e.g. request a > manual shutter speed of 66ms, and a FrameDuration of 30fps? What gets > prioritised? Currently, in our implementation, FrameDuration will > always be prioritised. Is there a rationale behind this decision ?
I'm asking as I see Android going in the other direction, with the exposure time always overriding the requested frame duration. It is specified in their documentation that the maximum frame duration a camera reports (lower frame rate) shall be at least the maximum allowed exposure time, so that all possible exposure values are known to be in the requested FPS range. I guess this should be enforced even if we give frame duration priority, or should we cap the maximum frame duration to the maximum available exposure time ? Just trying to get more data points to make an informed decision, as your question on what has to be prioritized is quite sensible... Thanks j
Hi Jacopo, On Wed, 3 Jun 2020 at 09:42, Jacopo Mondi <jacopo@jmondi.org> wrote: [snip]
Currently, in our implementation, FrameDuration will > > always be prioritised. > > Is there a rationale behind this decision ? I'm asking as I see Android > going in the other direction, with the exposure time always overriding > the requested frame duration. This was an entirely arbitrary choice on my part - based on what was easiest to do at the time :) > It is specified in their documentation that the maximum frame duration > a camera reports (lower frame rate) shall be at least the maximum > allowed exposure time, so that all possible exposure values are known > to be in the requested FPS range. I guess this should be enforced even > if we give frame duration priority, or should we cap the maximum frame > duration to the maximum available exposure time ? I have not looked at the Android docs in ages, but do they have two sets of frame duration limits, one that the sensor will report (sounds like this is what you are working on?) and one that the user requests (of course, bounded by the former)? This was where my thinking was going, and with frame duration priority, exposure time is capped by the user request (if given), or the sensor limits. But as always, if the consensus is to behave differently, I am happy to change things around. Regards, Naush > > Just trying to get more data points to make an informed decision, as > your question on what has to be prioritized is quite sensible... > > Thanks > j
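The two policies under discussion can be contrasted with a small sketch. Illustrative C++ only, with hypothetical function names; neither function is real libcamera or Android code, just a model of the clash in the 66 ms exposure vs 30 fps example:

```cpp
#include <algorithm>
#include <cstdint>

/* A frame as the sensor would produce it: exposure and total duration, in us. */
struct Frame {
	uint32_t exposureUs;
	uint32_t durationUs;
};

/*
 * Frame-duration priority (the Raspberry Pi choice described above): a
 * manual exposure longer than the maximum frame duration gets clipped.
 */
Frame frameDurationPriority(uint32_t reqExposureUs, uint32_t maxDurationUs)
{
	return { std::min(reqExposureUs, maxDurationUs), maxDurationUs };
}

/*
 * Exposure priority (the Android behaviour Jacopo mentions): the frame
 * duration is stretched instead, so the requested exposure always fits.
 */
Frame exposurePriority(uint32_t reqExposureUs, uint32_t maxDurationUs)
{
	return { reqExposureUs, std::max(reqExposureUs, maxDurationUs) };
}
```

With a 66 ms exposure request against a 33 ms (30 fps) maximum frame duration, the first policy holds 30 fps and clips the exposure to 33 ms, while the second delivers the full 66 ms exposure at roughly 15 fps.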
Hi Naush, On Wed, Jun 03, 2020 at 10:44:18AM +0100, Naushir Patuck wrote: [snip] > I have not looked at the Android docs in ages, but do they have two > sets of frame duration limits, one that the sensor will report (sounds > like this is what you are working on?) and one that the user requests Yes, there's a static property that reports the minFrameDurations for each supported stream configuration (including the ISP introduced delays): https://jmondi.org/android_metadata_tags/docs.html#static_android.scaler.availableMinFrameDurations And two controls for the sensor frame durations, one static property: https://jmondi.org/android_metadata_tags/docs.html#static_android.sensor.info.maxFrameDuration and a control: https://jmondi.org/android_metadata_tags/docs.html#controls_android.sensor.frameDuration > (of course, bounded by the former)? This was where my thinking was > going, and with frame duration priority, exposure time is capped by > the user request (if given), or the sensor limits. But as always, if > the consensus is to behave differently, I am happy to change things > around.
I don't have preferences, nor reasons that make me think one is better than the other. I wonder if we should enforce this at all or let pipeline handler/IPA decide that. Thanks j > > Regards, > Naush > > > > > > Just trying to get more data points to make an informed decision, as > > your question on what has to be prioritized is quite sensible... > > > > Thanks > > j
Hi Jacopo, On Wed, 3 Jun 2020 at 11:11, Jacopo Mondi <jacopo@jmondi.org> wrote: [snip]
This was where my thinking was > > going, and with frame duration priority, exposure time is capped by > > the user request (if given), or the sensor limits. But as always, if > > the consensus is to behave differently, I am happy to change things > > around. > > I don't have preferences, nor reasons that make me think one is better > than the other. I wonder if we should enforce this at all or let pipeline > handler/IPA decide that. V4L2 has it that VBLANK (i.e. frame rate) clips the max exposure time, therefore if libcamera doesn't follow that then you have to be very conscious of doing things in the correct order - reset VBLANK / frame rate first to reset the exposure ranges, and then set your desired exposure time. I'm trying to think whether I've encountered sensors that won't allow you to adjust the blanking periods whilst streaming, as those would limit the exposure time without stopping and restarting. None are jumping to mind, but others may have encountered such. For AE modes, there is the override control V4L2_CID_EXPOSURE_AUTO_PRIORITY [1], but otherwise all other references appear to say frame rate dictates max exposure. For performance tests it could be considered useful if the specified frame rate was guaranteed to be delivered. Then again you could claim it was a flawed test if it set the exposure time to a value that contradicted the frame rate it was trying to measure. Dave [1] https://www.kernel.org/doc/html/latest/media/uapi/v4l/ext-ctrls-camera.html
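Dave's ordering point can be illustrated with a toy sensor model. This is an illustrative sketch only: real drivers expose this through V4L2_CID_VBLANK and V4L2_CID_EXPOSURE (usually in lines, with per-sensor units), and `SensorModel` and its members are hypothetical names:

```cpp
#include <algorithm>
#include <cstdint>

/*
 * Toy model of the V4L2 behaviour described above: the maximum exposure is
 * derived from the frame length (height + VBLANK), and setting the exposure
 * clips the request against the *current* limit. Hypothetical class, not a
 * real driver interface.
 */
struct SensorModel {
	uint32_t heightLines;
	uint32_t vblankLines;
	uint32_t lineTimeUs;
	uint32_t exposureUs;

	uint32_t maxExposureUs() const
	{
		return (heightLines + vblankLines) * lineTimeUs;
	}

	void setVblank(uint32_t lines) { vblankLines = lines; }

	/* As in the kernel: the request is clipped to the current maximum. */
	void setExposure(uint32_t us)
	{
		exposureUs = std::min(us, maxExposureUs());
	}
};
```

Setting a long exposure before raising VBLANK leaves the clipped value in place even after the limit grows, which is why the reset-VBLANK-first, then-set-exposure order matters.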
diff --git a/src/libcamera/control_ids.yaml b/src/libcamera/control_ids.yaml index 77ebc3f9..42694fc7 100644 --- a/src/libcamera/control_ids.yaml +++ b/src/libcamera/control_ids.yaml @@ -239,4 +239,18 @@ controls: pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels control can only be returned in metadata. size: [4] + + - FrameDurations: + type: float + description: | + Specifies the minimum and maximum (in that order) allowable frame + duration, in micro-seconds, for the sensor to use. This could also limit + the largest exposure times the sensor can use. For example, if a maximum + frame duration of 33ms is requested (corresponding to 30 frames per + second), the sensor will not be able to raise the exposure time above 33ms. + Note that the sensor may not always be able to provide the requested + frame duration limits depending on its mode configuration. + + \sa ExposureTime + size: [2] ...
Add a float array control (controls::FrameDurations) to specify the minimum and maximum (in that order) frame duration to be used by the camera sensor. Signed-off-by: Naushir Patuck <naush@raspberrypi.com> --- src/libcamera/control_ids.yaml | 14 ++++++++++++++ 1 file changed, 14 insertions(+)