[libcamera-devel,v3,4/4] pipeline: raspberrypi: Allow either strict or non-strict buffer matching

Message ID 20210218170126.2060783-5-naush@raspberrypi.com
State Superseded
Series
  • Raspberry Pi: Embedded data usage

Commit Message

Naushir Patuck Feb. 18, 2021, 5:01 p.m. UTC
A new flag, StrictBufferMatching, is used to control the behavior of
the embedded and bayer buffer matching routine.

If set to true (default), we reject and drop all bayer frames that do
not have a matching embedded data buffer and vice-versa. This guarantees
the IPA will always have the correct frame exposure and gain values to
use.

If set to false, we use bayer frames that do not have a matching
embedded data buffer. In this case, the IPA will use the ControlList
passed to it for gain and exposure values.

Additionally, allow external stream buffers to behave as if
StrictBufferMatching = false since we are not allowed to drop them.

Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
---
 .../pipeline/raspberrypi/raspberrypi.cpp      | 31 ++++++++++++++++---
 1 file changed, 26 insertions(+), 5 deletions(-)
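As a rough illustration of the policy the commit message describes, here is a simplified, self-contained sketch. It is not the actual pipeline handler code: the names (StrictBufferMatching, timestamp matching, external buffers) mirror the patch, but the data structures are invented for brevity.

```cpp
#include <cstdint>
#include <deque>
#include <optional>

/* Compile-time policy flag, mirroring the constant added by the patch. */
static const bool StrictBufferMatching = true;

struct Buffer {
	std::uint64_t timestamp;
};

/*
 * Decide whether the oldest bayer frame can be delivered. Embedded-data
 * buffers older than the bayer timestamp are requeued (modelled here by
 * popping them); an exact timestamp match means strict pairing succeeded.
 * Returns the bayer buffer to deliver, or nothing if it must be held back.
 */
std::optional<Buffer> matchBuffers(const Buffer &bayer,
				   std::deque<Buffer> &embedded,
				   bool isExternal, bool sensorMetadata)
{
	/* Discard embedded buffers that predate the bayer frame. */
	while (!embedded.empty() &&
	       embedded.front().timestamp < bayer.timestamp)
		embedded.pop_front();

	/* Exact timestamp match: deliver the matched pair. */
	if (!embedded.empty() &&
	    embedded.front().timestamp == bayer.timestamp)
		return bayer;

	/*
	 * No match: still deliver the frame if it is an external buffer
	 * (cannot be dropped), strict matching is disabled, or the sensor
	 * produces no metadata at all.
	 */
	if (isExternal || !StrictBufferMatching || !sensorMetadata)
		return bayer;

	return std::nullopt; /* hold/drop under strict matching */
}
```

Under strict matching, a bayer frame whose embedded buffer was dropped is withheld; with `StrictBufferMatching = false` it is delivered and the IPA falls back to its locally stored gain/exposure values.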

Comments

Laurent Pinchart Feb. 21, 2021, 11:09 p.m. UTC | #1
Hi Naush,

Thank you for the patch.

On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:
> A new flag, StrictBufferMatching, is used to control the behavior of
> the embedded and bayer buffer matching routine.
> 
> If set to true (default), we reject and drop all bayer frames that do
> not have a matching embedded data buffer and vice-versa. This guarantees
> the IPA will always have the correct frame exposure and gain values to
> use.
> 
> If set to false, we use bayer frames that do not have a matching
> embedded data buffer. In this case, the IPA will use the ControlList
> passed to it for gain and exposure values.
> 
> Additionally, allow external stream buffers to behave as if
> StrictBufferMatching = false since we are not allowed to drop them.

Could you explain the use case for both options ? Assuming an API for
applications to set this, how would an application decide ?

> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> ---
>  .../pipeline/raspberrypi/raspberrypi.cpp      | 31 ++++++++++++++++---
>  1 file changed, 26 insertions(+), 5 deletions(-)
> 
> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> index d969c77993eb..7f66d6995176 100644
> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> @@ -46,6 +46,22 @@ namespace libcamera {
>  
>  LOG_DEFINE_CATEGORY(RPI)
>  
> +/*
> + * Set this to true to reject and drop all bayer frames that do not have a
> + * matching embedded data buffer and vice-versa. This guarantees the IPA will
> + * always have the correct frame exposure and gain values to use.
> + *
> + * Set this to false to use bayer frames that do not have a matching embedded
> + * data buffer. In this case, the IPA will use our local history for gain and
> + * exposure values; occasional frame drops may cause these numbers to be out of
> + * sync for a short period.
> + *
> + * \todo Ideally, this property should be set by the application, but we don't
> + * have any mechanism to pass generic properties into a pipeline handler at
> + * present.
> + */
> +static const bool StrictBufferMatching = true;
> +
>  namespace {
>  
>  bool isRaw(PixelFormat &pixFmt)
> @@ -1724,13 +1740,13 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
>  		embeddedBuffer = nullptr;
>  		while (!embeddedQueue_.empty()) {
>  			FrameBuffer *b = embeddedQueue_.front();
> -			if (!unicam_[Unicam::Embedded].isExternal() && b->metadata().timestamp < ts) {
> +			if (b->metadata().timestamp < ts) {
>  				embeddedQueue_.pop();
>  				unicam_[Unicam::Embedded].queueBuffer(b);
>  				embeddedRequeueCount++;
>  				LOG(RPI, Warning) << "Dropping unmatched input frame in stream "
>  						  << unicam_[Unicam::Embedded].name();
> -			} else if (unicam_[Unicam::Embedded].isExternal() || b->metadata().timestamp == ts) {
> +			} else if (b->metadata().timestamp == ts) {
>  				/* We pop the item from the queue lower down. */
>  				embeddedBuffer = b;
>  				break;
> @@ -1744,10 +1760,15 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
>  
>  			LOG(RPI, Debug) << "Could not find matching embedded buffer";
>  
> -			if (!sensorMetadata_) {
> +			if (unicam_[Unicam::Embedded].isExternal() ||
> +			    !StrictBufferMatching || !sensorMetadata_) {
>  				/*
> -				 * If there is no sensor metadata, simply return the
> -				 * first bayer frame in the queue.
> +				 * If any of the following is true:
> +				 * - This is an external stream buffer (so cannot be dropped).
> +				 * - We do not care about strict buffer matching.
> +				 * - There is no sensor metadata present.
> +				 * we can simply return the first bayer frame in the queue
> +				 * without a matching embedded buffer.
>  				 */
>  				LOG(RPI, Debug) << "Returning bayer frame without a match";
>  				bayerQueue_.pop();
Naushir Patuck Feb. 22, 2021, 1:51 p.m. UTC | #2
Hi Laurent


On Sun, 21 Feb 2021 at 23:09, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote:

> Hi Naush,
>
> Thank you for the patch.
>
> On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:
> > A new flag, StrictBufferMatching, is used to control the behavior of
> > the embedded and bayer buffer matching routine.
> >
> > If set to true (default), we reject and drop all bayer frames that do
> > not have a matching embedded data buffer and vice-versa. This guarantees
> > the IPA will always have the correct frame exposure and gain values to
> > use.
> >
> > If set to false, we use bayer frames that do not have a matching
> > embedded data buffer. In this case, the IPA will use the ControlList
> > passed to it for gain and exposure values.
> >
> > Additionally, allow external stream buffers to behave as if
> > StrictBufferMatching = false since we are not allowed to drop them.
>
> Could you explain the use case for both options ? Assuming an API for
> applications to set this, how would an application decide ?
>

As an example, consider a sensor that generates some kind of computational
metadata (e.g. face/object detection) in the sensor silicon and sends it out
via embedded data.  In such cases, the image data alone is not very helpful
for the application.  An application would need both embedded data and image
data buffers to do its magic, and crucially they must be in sync.  This is
where StrictBufferMatching = true would be used.  We are actually talking to
a vendor about this exact use case.  Of course, today we do not have the
ability to get to that embedded data buffer in the application, but we will
in the future, so right now this hint is more for the IPA having a guarantee
that both buffers are in sync.

For simpler use cases, e.g. high framerate video, we do not necessarily care
about strict matching, as the embedded buffer will only be used for AWB/AE
loops, and avoiding frame drops is more desirable.  These cases would use
StrictBufferMatching = false.
Perhaps a better strategy would be to make that the default...?

Hope that makes sense.

Naush



>
> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > ---
> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 31 ++++++++++++++++---
> >  1 file changed, 26 insertions(+), 5 deletions(-)
> >
> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > index d969c77993eb..7f66d6995176 100644
> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > @@ -46,6 +46,22 @@ namespace libcamera {
> >
> >  LOG_DEFINE_CATEGORY(RPI)
> >
> > +/*
> > + * Set this to true to reject and drop all bayer frames that do not
> have a
> > + * matching embedded data buffer and vice-versa. This guarantees the
> IPA will
> > + * always have the correct frame exposure and gain values to use.
> > + *
> > + * Set this to false to use bayer frames that do not have a matching
> embedded
> > + * data buffer. In this case, IPA will use use our local history for
> gain and
> > + * exposure values, occasional frame drops may cause these number to be
> out of
> > + * sync for a short period.
> > + *
> > + * \todo Ideally, this property should be set by the application, but
> we don't
> > + * have any mechanism to pass generic properties into a pipeline
> handler at
> > + * present.
> > + */
> > +static const bool StrictBufferMatching = true;
> > +
> >  namespace {
> >
> >  bool isRaw(PixelFormat &pixFmt)
> > @@ -1724,13 +1740,13 @@ bool
> RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
> >               embeddedBuffer = nullptr;
> >               while (!embeddedQueue_.empty()) {
> >                       FrameBuffer *b = embeddedQueue_.front();
> > -                     if (!unicam_[Unicam::Embedded].isExternal() &&
> b->metadata().timestamp < ts) {
> > +                     if (b->metadata().timestamp < ts) {
> >                               embeddedQueue_.pop();
> >                               unicam_[Unicam::Embedded].queueBuffer(b);
> >                               embeddedRequeueCount++;
> >                               LOG(RPI, Warning) << "Dropping unmatched
> input frame in stream "
> >                                                 <<
> unicam_[Unicam::Embedded].name();
> > -                     } else if (unicam_[Unicam::Embedded].isExternal()
> || b->metadata().timestamp == ts) {
> > +                     } else if (b->metadata().timestamp == ts) {
> >                               /* We pop the item from the queue lower
> down. */
> >                               embeddedBuffer = b;
> >                               break;
> > @@ -1744,10 +1760,15 @@ bool
> RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
> >
> >                       LOG(RPI, Debug) << "Could not find matching
> embedded buffer";
> >
> > -                     if (!sensorMetadata_) {
> > +                     if (unicam_[Unicam::Embedded].isExternal() ||
> > +                         !StrictBufferMatching || !sensorMetadata_) {
> >                               /*
> > -                              * If there is no sensor metadata, simply
> return the
> > -                              * first bayer frame in the queue.
> > +                              * If any of the follwing is true:
> > +                              * - This is an external stream buffer (so
> cannot be dropped).
> > +                              * - We do not care about strict buffer
> matching.
> > +                              * - There is no sensor metadata present.
> > +                              * we can simply return the first bayer
> frame in the queue
> > +                              * without a matching embedded buffer.
> >                                */
> >                               LOG(RPI, Debug) << "Returning bayer frame
> without a match";
> >                               bayerQueue_.pop();
>
> --
> Regards,
>
> Laurent Pinchart
>
David Plowman Feb. 25, 2021, 11:17 a.m. UTC | #3
Another interesting case that we're looking at is a sensor with
on-board HDR. Because the sensor tonemaps the image before outputting
it, it means that the statistics we get from the ISP are no good for
the AGC/AEC algorithm - they're not linear with exposure/gain. Because
of this the sensor actually outputs some statistics of its own that we
can collect into the "embedded data" buffers in our usual way. From
there, the plan would be to pass them to our AGC/AEC.

But anyway, this is another case where "strict" matching is important.
(Also, there will likely be changes in our pipeline handler and IPAs
in the short-to-medium term to support this kind of use-case more
straightforwardly!).

David

On Mon, 22 Feb 2021 at 13:52, Naushir Patuck <naush@raspberrypi.com> wrote:
>
> Hi Laurent
>
>
> On Sun, 21 Feb 2021 at 23:09, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote:
>>
>> Hi Naush,
>>
>> Thank you for the patch.
>>
>> On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:
>> > A new flag, StrictBufferMatching, is used to control the behavior of
>> > the embedded and bayer buffer matching routine.
>> >
>> > If set to true (default), we reject and drop all bayer frames that do
>> > not have a matching embedded data buffer and vice-versa. This guarantees
>> > the IPA will always have the correct frame exposure and gain values to
>> > use.
>> >
>> > If set to false, we use bayer frames that do not have a matching
>> > embedded data buffer. In this case, the IPA will use the ControlList
>> > passed to it for gain and exposure values.
>> >
>> > Additionally, allow external stream buffers to behave as if
>> > StrictBufferMatching = false since we are not allowed to drop them.
>>
>> Could you explain the use case for both options ? Assuming an API for
>> applications to set this, how would an application decide ?
>
>
> As an example, consider a sensor that generates some kind of computational
> metadata (e.g. face/object detection) in the sensor silicon and sends it out
> via embedded data.  In such cases, the image data alone is not very helpful
> for the application.  An application would need both embedded data and image
> data buffers to do its magic, and crucially they must be in sync.  This is where
> the StrictBufferMatching = true would be used.  We are actually talking to a
> vendor about this exact use case. Of course, today we do not have the ability to
> get to that embedded data buffer in the application, but we will in the future, so
> right now this hint is more for the IPA having a guarantee that both buffers are in
> sync.
>
> For simpler use cases, e.g. high framerate video, we do not necessarily care
> about strict matching, as the embedded buffer will only be used for AWB/AE loops,
> and avoiding frame drops is more desirable.  These cases would use StrictBufferMatching = false.
> Perhaps a better strategy would be to set this to the default...?
>
> Hope that makes sense.
>
> Naush
>
>
>>
>>
>> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
>> > ---
>> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 31 ++++++++++++++++---
>> >  1 file changed, 26 insertions(+), 5 deletions(-)
>> >
>> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
>> > index d969c77993eb..7f66d6995176 100644
>> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
>> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
>> > @@ -46,6 +46,22 @@ namespace libcamera {
>> >
>> >  LOG_DEFINE_CATEGORY(RPI)
>> >
>> > +/*
>> > + * Set this to true to reject and drop all bayer frames that do not have a
>> > + * matching embedded data buffer and vice-versa. This guarantees the IPA will
>> > + * always have the correct frame exposure and gain values to use.
>> > + *
>> > + * Set this to false to use bayer frames that do not have a matching embedded
>> > + * data buffer. In this case, the IPA will use our local history for gain and
>> > + * exposure values; occasional frame drops may cause these numbers to be out of
>> > + * sync for a short period.
>> > + *
>> > + * \todo Ideally, this property should be set by the application, but we don't
>> > + * have any mechanism to pass generic properties into a pipeline handler at
>> > + * present.
>> > + */
>> > +static const bool StrictBufferMatching = true;
>> > +
>> >  namespace {
>> >
>> >  bool isRaw(PixelFormat &pixFmt)
>> > @@ -1724,13 +1740,13 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
>> >               embeddedBuffer = nullptr;
>> >               while (!embeddedQueue_.empty()) {
>> >                       FrameBuffer *b = embeddedQueue_.front();
>> > -                     if (!unicam_[Unicam::Embedded].isExternal() && b->metadata().timestamp < ts) {
>> > +                     if (b->metadata().timestamp < ts) {
>> >                               embeddedQueue_.pop();
>> >                               unicam_[Unicam::Embedded].queueBuffer(b);
>> >                               embeddedRequeueCount++;
>> >                               LOG(RPI, Warning) << "Dropping unmatched input frame in stream "
>> >                                                 << unicam_[Unicam::Embedded].name();
>> > -                     } else if (unicam_[Unicam::Embedded].isExternal() || b->metadata().timestamp == ts) {
>> > +                     } else if (b->metadata().timestamp == ts) {
>> >                               /* We pop the item from the queue lower down. */
>> >                               embeddedBuffer = b;
>> >                               break;
>> > @@ -1744,10 +1760,15 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
>> >
>> >                       LOG(RPI, Debug) << "Could not find matching embedded buffer";
>> >
>> > -                     if (!sensorMetadata_) {
>> > +                     if (unicam_[Unicam::Embedded].isExternal() ||
>> > +                         !StrictBufferMatching || !sensorMetadata_) {
>> >                               /*
>> > -                              * If there is no sensor metadata, simply return the
>> > -                              * first bayer frame in the queue.
>> > +                              * If any of the following is true:
>> > +                              * - This is an external stream buffer (so cannot be dropped).
>> > +                              * - We do not care about strict buffer matching.
>> > +                              * - There is no sensor metadata present.
>> > +                              * we can simply return the first bayer frame in the queue
>> > +                              * without a matching embedded buffer.
>> >                                */
>> >                               LOG(RPI, Debug) << "Returning bayer frame without a match";
>> >                               bayerQueue_.pop();
>>
>> --
>> Regards,
>>
>> Laurent Pinchart
>
> _______________________________________________
> libcamera-devel mailing list
> libcamera-devel@lists.libcamera.org
> https://lists.libcamera.org/listinfo/libcamera-devel
Laurent Pinchart March 4, 2021, 2:12 a.m. UTC | #4
Hello,

On Thu, Feb 25, 2021 at 11:17:37AM +0000, David Plowman wrote:
> Another interesting case that we're looking at is a sensor with
> on-board HDR. Because the sensor tonemaps the image before outputting
> it, it means that the statistics we get from the ISP are no good for
> the AGC/AEC algorithm - they're not linear with exposure/gain. Because
> of this the sensor actually outputs some statistics of its own that we
> can collect into the "embedded data" buffers in our usual way. From
> there, the plan would be to pass them to our AGC/AEC.
> 
> But anyway, this is another case where "strict" matching is important.
> (Also, there will likely be changes in our pipeline handler and IPAs
> in the short-to-medium term to support this kind of use-case more
> straightforwardly!).
> 
> On Mon, 22 Feb 2021 at 13:52, Naushir Patuck <naush@raspberrypi.com> wrote:
> > On Sun, 21 Feb 2021 at 23:09, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote:
> >> On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:
> >> > A new flag, StrictBufferMatching, is used to control the behavior of
> >> > the embedded and bayer buffer matching routine.
> >> >
> >> > If set to true (default), we reject and drop all bayer frames that do
> >> > not have a matching embedded data buffer and vice-versa. This guarantees
> >> > the IPA will always have the correct frame exposure and gain values to
> >> > use.
> >> >
> >> > If set to false, we use bayer frames that do not have a matching
> >> > embedded data buffer. In this case, the IPA will use the ControlList
> >> > passed to it for gain and exposure values.
> >> >
> >> > Additionally, allow external stream buffers to behave as if
> >> > StrictBufferMatching = false since we are not allowed to drop them.
> >>
> >> Could you explain the use case for both options ? Assuming an API for
> >> applications to set this, how would an application decide ?
> >
> > As an example, consider a sensor that generates some kind of computational
> > metadata (e.g. face/object detection) in the sensor silicon and sends it out
> > via embedded data.  In such cases, the image data alone is not very helpful
> > for the application.  An application would need both embedded data and image
> > data buffers to do its magic, and crucially they must be in sync.  This is where
> > the StrictBufferMatching = true would be used.  We are actually talking to a
> > vendor about this exact use case. Of course, today we do not have the ability to
> > get to that embedded data buffer in the application, but we will in the future, so
> > right now this hint is more for the IPA having a guarantee that both buffers are in
> > sync.
> >
> > For simpler use cases, e.g. high framerate video, we do not necessarily care
> > about strict matching, as the embedded buffer will only be used for AWB/AE loops,
> > and avoiding frame drops is more desirable.  These cases would use StrictBufferMatching = false.
> > Perhaps a better strategy would be to set this to the default...?
> >
> > Hope that makes sense.

So I understand why strict matching is important, and that's the default
in this patch. Non-strict matching relaxes the requirements, but I'm not
sure what impact it will have on algorithms.

Furthermore, to make this really useful, we would need a way to
configure the policy at runtime, and I think it may be a hard choice for
applications. I wonder if there could be a way to handle the buffer
matching policy automagically ? The code here deals with the case where
no matching buffer is available, which I assume would be caused by a
frame drop (either on the image stream or the embedded data stream).
Would it be possible to detect such situations and handle them in an
automatic way ? I'm not sure what to propose exactly, but going in the
direction of exposing to applications controls that tune heuristics in
ill-defined ways worries me. We could easily end up with controls that
applications would have no meaningful way to set. If you ask me if I
need to avoid frame drops or avoid bad images, I'll tell you I need both
:-) I could possibly make a decision if I knew the exact risk of frame
drop, and exactly how "bad" images could be, but I don't think that's
something we'll be able to express in an API. I'd really like to avoid
going in this direction.

> >> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> >> > ---
> >> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 31 ++++++++++++++++---
> >> >  1 file changed, 26 insertions(+), 5 deletions(-)
> >> >
> >> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> >> > index d969c77993eb..7f66d6995176 100644
> >> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> >> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> >> > @@ -46,6 +46,22 @@ namespace libcamera {
> >> >
> >> >  LOG_DEFINE_CATEGORY(RPI)
> >> >
> >> > +/*
> >> > + * Set this to true to reject and drop all bayer frames that do not have a
> >> > + * matching embedded data buffer and vice-versa. This guarantees the IPA will
> >> > + * always have the correct frame exposure and gain values to use.
> >> > + *
> >> > + * Set this to false to use bayer frames that do not have a matching embedded
> >> > + * data buffer. In this case, the IPA will use our local history for gain and
> >> > + * exposure values; occasional frame drops may cause these numbers to be out of
> >> > + * sync for a short period.
> >> > + *
> >> > + * \todo Ideally, this property should be set by the application, but we don't
> >> > + * have any mechanism to pass generic properties into a pipeline handler at
> >> > + * present.
> >> > + */
> >> > +static const bool StrictBufferMatching = true;
> >> > +
> >> >  namespace {
> >> >
> >> >  bool isRaw(PixelFormat &pixFmt)
> >> > @@ -1724,13 +1740,13 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
> >> >               embeddedBuffer = nullptr;
> >> >               while (!embeddedQueue_.empty()) {
> >> >                       FrameBuffer *b = embeddedQueue_.front();
> >> > -                     if (!unicam_[Unicam::Embedded].isExternal() && b->metadata().timestamp < ts) {
> >> > +                     if (b->metadata().timestamp < ts) {
> >> >                               embeddedQueue_.pop();
> >> >                               unicam_[Unicam::Embedded].queueBuffer(b);
> >> >                               embeddedRequeueCount++;
> >> >                               LOG(RPI, Warning) << "Dropping unmatched input frame in stream "
> >> >                                                 << unicam_[Unicam::Embedded].name();
> >> > -                     } else if (unicam_[Unicam::Embedded].isExternal() || b->metadata().timestamp == ts) {
> >> > +                     } else if (b->metadata().timestamp == ts) {
> >> >                               /* We pop the item from the queue lower down. */
> >> >                               embeddedBuffer = b;
> >> >                               break;
> >> > @@ -1744,10 +1760,15 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
> >> >
> >> >                       LOG(RPI, Debug) << "Could not find matching embedded buffer";
> >> >
> >> > -                     if (!sensorMetadata_) {
> >> > +                     if (unicam_[Unicam::Embedded].isExternal() ||
> >> > +                         !StrictBufferMatching || !sensorMetadata_) {
> >> >                               /*
> >> > -                              * If there is no sensor metadata, simply return the
> >> > -                              * first bayer frame in the queue.
> >> > +                              * If any of the following is true:
> >> > +                              * - This is an external stream buffer (so cannot be dropped).
> >> > +                              * - We do not care about strict buffer matching.
> >> > +                              * - There is no sensor metadata present.
> >> > +                              * we can simply return the first bayer frame in the queue
> >> > +                              * without a matching embedded buffer.
> >> >                                */
> >> >                               LOG(RPI, Debug) << "Returning bayer frame without a match";
> >> >                               bayerQueue_.pop();
Naushir Patuck March 4, 2021, 8:48 a.m. UTC | #5
Hi Laurent,

On Thu, 4 Mar 2021 at 02:12, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote:

> Hello,
>
> On Thu, Feb 25, 2021 at 11:17:37AM +0000, David Plowman wrote:
> > Another interesting case that we're looking at is a sensor with
> > on-board HDR. Because the sensor tonemaps the image before outputting
> > it, it means that the statistics we get from the ISP are no good for
> > the AGC/AEC algorithm - they're not linear with exposure/gain. Because
> > of this the sensor actually outputs some statistics of its own that we
> > can collect into the "embedded data" buffers in our usual way. From
> > there, the plan would be to pass them to our AGC/AEC.
> >
> > But anyway, this is another case where "strict" matching is important.
> > (Also, there will likely be changes in our pipeline handler and IPAs
> > in the short-to-medium term to support this kind of use-case more
> > straightforwardly!).
> >
> > On Mon, 22 Feb 2021 at 13:52, Naushir Patuck <naush@raspberrypi.com>
> wrote:
> > > On Sun, 21 Feb 2021 at 23:09, Laurent Pinchart <
> laurent.pinchart@ideasonboard.com> wrote:
> > >> On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:
> > >> > A new flag, StrictBufferMatching, is used to control the behavior of
> > >> > the embedded and bayer buffer matching routine.
> > >> >
> > >> > If set to true (default), we reject and drop all bayer frames that
> do
> > >> > not have a matching embedded data buffer and vice-versa. This
> guarantees
> > >> > the IPA will always have the correct frame exposure and gain values
> to
> > >> > use.
> > >> >
> > >> > If set to false, we use bayer frames that do not have a matching
> > >> > embedded data buffer. In this case, the IPA will use the ControlList
> > >> > passed to it for gain and exposure values.
> > >> >
> > >> > Additionally, allow external stream buffers to behave as if
> > >> > StrictBufferMatching = false since we are not allowed to drop them.
> > >>
> > >> Could you explain the use case for both options ? Assuming an API for
> > >> applications to set this, how would an application decide ?
> > >
> > > As an example, consider a sensor that generates some kind of
> computational
> > > metadata (e.g. face/object detection) in the sensor silicon and sends
> it out
> > > via embedded data.  In such cases, the image data alone is not very
> helpful
> > > for the application.  An application would need both embedded data and
> image
> > > data buffers to do its magic, and crucially they must be in sync.
> This is where
> > > the StrictBufferMatching = true would be used.  We are actually
> talking to a
> > > vendor about this exact use case. Of course, today we do not have the
> ability to
> > > get to that embedded data buffer in the application, but we will in
> the future, so
> > > right now this hint is more for the IPA having a guarantee that both
> buffers are in
> > > sync.
> > >
> > > For simpler use cases, e.g. high framerate video, we do not
> necessarily care
> > > about strict matching, as the embedded buffer will only be used for
> AWB/AE loops,
> > > and avoiding frame drops is more desirable.  These cases would use
> StrictBufferMatching = false.
> > > Perhaps a better strategy would be to set this to the default...?
> > >
> > > Hope that makes sense.
>
> So I understand why strict matching is important, and that's the default
> in this patch. Non-strict matching relaxes the requirements, but I'm not
> sure what impact it will have on algorithms.
>

Hopefully non-strict buffer matching will not have any impact on the
algorithms :-)
I suppose it could possibly cause AE to go wrong in extreme cases where,
for whatever reason, the locally stored values were incorrect.


> Furthermore, to make this really useful, we would need a way to
> configure the policy at runtime, and I think it may be a hard choice for
> applications. I wonder if there could be a way to handle the buffer
> matching policy automagically ? The code here deals with the case where
> no matching buffer is available, which I assume would be caused by a
> frame drop (either on the image stream or the embedded data stream).
> Would it be possible to detect such situations and handle them in an
> automatic way ?


The pipeline handler is able to detect a buffer that cannot be matched,
but after that, I don't see a way for it to know how to handle this in an
automatic way.  It does not know the context of the buffers or application,
e.g., whether there is some computer vision embedded data that goes with the
image frame, in which case an unmatched frame should be dropped.


> I'm not sure what to propose exactly, but going in the
> direction of exposing to applications controls that tune heuristics in
> ill-defined ways worries me. We could easily end up with controls that
> applications would have no meaningful way to set. If you ask me if I
> need to avoid frame drops or avoid bad images, I'll tell you I need both
> :-) I could possibly make a decision if I knew the exact risk of frame
> drop, and exactly how "bad" images could be, but I don't think that's
> something we'll be able to express in an API. I'd really like to avoid
> going in this direction.
>

Yes, I do see your concerns here.  We do not want to put more onus on
the application side to manage pipelines, and in possibly a vendor specific
way.  However, I do see that more advanced users, or unique apps that
have a very specific purpose may want this sort of control available.

Some time ago, we talked about allowing applications to provide "hints"
to the pipeline handlers on possible operating behaviors.   I was thinking
that StrictBufferMatching could be one of those hints.  Other possible
hints may be HDR, ZSL, high framerate, low power operating modes
as a few examples off the top of my head.

We still need to be careful, as hints should not be enforceable, but act
only as a suggestion (hence "hints" :-)) that the pipeline handler may or
may not handle.

I'm not sure what the right approach here is, or if there even is one...?

Regards,
Naush

> >> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > >> > ---
> > >> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 31
> ++++++++++++++++---
> > >> >  1 file changed, 26 insertions(+), 5 deletions(-)
> > >> >
> > >> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > >> > index d969c77993eb..7f66d6995176 100644
> > >> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > >> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > >> > @@ -46,6 +46,22 @@ namespace libcamera {
> > >> >
> > >> >  LOG_DEFINE_CATEGORY(RPI)
> > >> >
> > >> > +/*
> > >> > + * Set this to true to reject and drop all bayer frames that do
> not have a
> > >> > + * matching embedded data buffer and vice-versa. This guarantees
> the IPA will
> > >> > + * always have the correct frame exposure and gain values to use.
> > >> > + *
> > >> > + * Set this to false to use bayer frames that do not have a
> matching embedded
> > >> > + * data buffer. In this case, IPA will use our local history
> for gain and
> > >> > + * exposure values; occasional frame drops may cause these numbers
> to be out of
> > >> > + * sync for a short period.
> > >> > + *
> > >> > + * \todo Ideally, this property should be set by the application,
> but we don't
> > >> > + * have any mechanism to pass generic properties into a pipeline
> handler at
> > >> > + * present.
> > >> > + */
> > >> > +static const bool StrictBufferMatching = true;
> > >> > +
> > >> >  namespace {
> > >> >
> > >> >  bool isRaw(PixelFormat &pixFmt)
> > >> > @@ -1724,13 +1740,13 @@ bool
> RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
> > >> >               embeddedBuffer = nullptr;
> > >> >               while (!embeddedQueue_.empty()) {
> > >> >                       FrameBuffer *b = embeddedQueue_.front();
> > >> > -                     if (!unicam_[Unicam::Embedded].isExternal()
> && b->metadata().timestamp < ts) {
> > >> > +                     if (b->metadata().timestamp < ts) {
> > >> >                               embeddedQueue_.pop();
> > >> >
>  unicam_[Unicam::Embedded].queueBuffer(b);
> > >> >                               embeddedRequeueCount++;
> > >> >                               LOG(RPI, Warning) << "Dropping
> unmatched input frame in stream "
> > >> >                                                 <<
> unicam_[Unicam::Embedded].name();
> > >> > -                     } else if
> (unicam_[Unicam::Embedded].isExternal() || b->metadata().timestamp == ts) {
> > >> > +                     } else if (b->metadata().timestamp == ts) {
> > >> >                               /* We pop the item from the queue
> lower down. */
> > >> >                               embeddedBuffer = b;
> > >> >                               break;
> > >> > @@ -1744,10 +1760,15 @@ bool
> RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
> > >> >
> > >> >                       LOG(RPI, Debug) << "Could not find matching
> embedded buffer";
> > >> >
> > >> > -                     if (!sensorMetadata_) {
> > >> > +                     if (unicam_[Unicam::Embedded].isExternal() ||
> > >> > +                         !StrictBufferMatching ||
> !sensorMetadata_) {
> > >> >                               /*
> > >> > -                              * If there is no sensor metadata,
> simply return the
> > >> > -                              * first bayer frame in the queue.
> > >> > +                              * If any of the following is true:
> > >> > +                              * - This is an external stream
> buffer (so cannot be dropped).
> > >> > +                              * - We do not care about strict
> buffer matching.
> > >> > +                              * - There is no sensor metadata
> present.
> > >> > +                              * we can simply return the first
> bayer frame in the queue
> > >> > +                              * without a matching embedded buffer.
> > >> >                                */
> > >> >                               LOG(RPI, Debug) << "Returning bayer
> frame without a match";
> > >> >                               bayerQueue_.pop();
>
> --
> Regards,
>
> Laurent Pinchart
>
David Plowman March 4, 2021, 5:19 p.m. UTC | #6
Hi Naush, Laurent

On Thu, 4 Mar 2021 at 08:49, Naushir Patuck <naush@raspberrypi.com> wrote:
>
> Hi Laurent,
>
> On Thu, 4 Mar 2021 at 02:12, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote:
>>
>> Hello,
>>
>> On Thu, Feb 25, 2021 at 11:17:37AM +0000, David Plowman wrote:
>> > Another interesting case that we're looking at is a sensor with
>> > on-board HDR. Because the sensor tonemaps the image before outputting
>> > it, it means that the statistics we get from the ISP are no good for
>> > the AGC/AEC algorithm - they're not linear with exposure/gain. Because
>> > of this the sensor actually outputs some statistics of its own that we
>> > can collect into the "embedded data" buffers in our usual way. From
>> > there, the plan would be to pass them to our AGC/AEC.
>> >
>> > But anyway, this is another case where "strict" matching is important.
>> > (Also, there will likely be changes in our pipeline handler and IPAs
>> > in the short-to-medium term to support this kind of use-case more
>> > straightforwardly!).
>> >
>> > On Mon, 22 Feb 2021 at 13:52, Naushir Patuck <naush@raspberrypi.com> wrote:
>> > > On Sun, 21 Feb 2021 at 23:09, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote:
>> > >> On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:
>> > >> > A new flag, StrictBufferMatching, is used to control the behavior of
>> > >> > the embedded and bayer buffer matching routine.
>> > >> >
>> > >> > If set to true (default), we reject and drop all bayer frames that do
>> > >> > not have a matching embedded data buffer and vice-versa. This guarantees
>> > >> > the IPA will always have the correct frame exposure and gain values to
>> > >> > use.
>> > >> >
>> > >> > If set to false, we use bayer frames that do not have a matching
>> > >> > embedded data buffer. In this case, IPA will use the ControlList
>> > >> > passed to it for gain and exposure values.
>> > >> >
>> > >> > Additionally, allow external stream buffers to behave as if
>> > >> > StrictBufferMatching = false since we are not allowed to drop them.
>> > >>
>> > >> Could you explain the use case for both options ? Assuming an API for
>> > >> applications to set this, how would an application decide ?
>> > >
>> > > As an example, consider a sensor that generates some kind of computational
>> > > metadata (e.g. face/object detection) in the sensor silicon and sends it out
>> > > via embedded data.  In such cases, the image data alone is not very helpful
>> > > for the application.  An application would need both embedded data and image
>> > > data buffers to do its magic, and crucially they must be in sync.  This is where
>> > > the StrictBufferMatching = true would be used.  We are actually talking to a
>> > > vendor about this exact use case. Of course, today we do not have the ability to
>> > > get to that embedded data buffer in the application, but we will in the future, so
>> > > right now this hint is more for the IPA having a guarantee that both buffers are in
>> > > sync.
>> > >
>> > > For simpler use cases, e.g. high framerate video, we do not necessarily care
>> > > about strict matching, as the embedded buffer will only be used for AWB/AE loops,
>> > > and avoiding frame drops is more desirable.  These cases would use StrictBufferMatching = false.
>> > > Perhaps a better strategy would be to set this to the default...?
>> > >
>> > > Hope that makes sense.
>>
>> So I understand why strict matching is important, and that's the default
>> in this patch. Non-strict matching relaxes the requirements, but I'm not
>> sure what impact it will have on algorithms.
>
>
> Hopefully non-strict buffer matching will not have any impact on the algorithms :-)
> I suppose it could possibly cause AE to go wrong in extreme cases where
> for whatever reason, the locally stored values were incorrect.
>
>>
>> Furthermore, to make this really useful, we would need a way to
>> configure the policy at runtime, and I think it may be a hard choice for
>> applications. I wonder if there could be a way to handle the buffer
>> matching policy automagically ? The code here deals with the case where
>> no matching buffer is available, which I assume would be caused by a
>> frame drop (either on the image stream or the embedded data stream).
>> Would it be possible to detect such situations and handle them in an
>> automatic way ?
>
>
> The pipeline handler is able to detect a buffer that cannot be matched,
> but after that, I don't see a way for it to know how to handle this in an
> automatic way.  It does not know the context of the buffers or application,
> e.g., whether there is some computer vision embedded data that goes with the
> image frame, in which case an unmatched frame should be dropped.
>
>>
>> I'm not sure what to propose exactly, but going in the
>> direction of exposing to applications controls that tune heuristics in
>> ill-defined ways worries me. We could easily end up with controls that
>> applications would have no meaningful way to set. If you ask me if I
>> need to avoid frame drops or avoid bad images, I'll tell you I need both
>> :-) I could possibly make a decision if I knew the exact risk of frame
>> drop, and exactly how "bad" images could be, but I don't think that's
>> something we'll be able to express in an API. I'd really like to avoid
>> going in this direction.
>
>
> Yes, I do see your concerns here.  We do not want to put more onus on
> the application side to manage pipelines, and in possibly a vendor specific
> way.  However, I do see that more advanced users, or unique apps that
> have a very specific purpose may want this sort of control available.
>
> Some time ago, we talked about allowing applications to provide "hints"
> to the pipeline handlers on possible operating behaviors.   I was thinking
> that StrictBufferMatching could be one of those hints.  Other possible
> hints may be HDR, ZSL, high framerate, low power operating modes
> as a few examples off the top of my head.
>
> We still need to be careful, as hints should not be enforceable, but act
> only as a suggestion (hence "hints" :-)) that the pipeline handler may or
> may not handle.
>
> I'm not sure what the right approach here is, or if there even is one...?

I wonder if CamHelpers would help here? Particularly in view of our
plans going forward to generalise the parsing for more types of
metadata, the CamHelpers will have to know what the metadata is and
could take a view on the risks of dropping frames or not.

David

>
> Regards,
> Naush
>
>> > >> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
>> > >> > ---
>> > >> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 31 ++++++++++++++++---
>> > >> >  1 file changed, 26 insertions(+), 5 deletions(-)
>> > >> >
>> > >> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
>> > >> > index d969c77993eb..7f66d6995176 100644
>> > >> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
>> > >> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
>> > >> > @@ -46,6 +46,22 @@ namespace libcamera {
>> > >> >
>> > >> >  LOG_DEFINE_CATEGORY(RPI)
>> > >> >
>> > >> > +/*
>> > >> > + * Set this to true to reject and drop all bayer frames that do not have a
>> > >> > + * matching embedded data buffer and vice-versa. This guarantees the IPA will
>> > >> > + * always have the correct frame exposure and gain values to use.
>> > >> > + *
>> > >> > + * Set this to false to use bayer frames that do not have a matching embedded
>> > >> > + * data buffer. In this case, IPA will use our local history for gain and
>> > >> > + * exposure values; occasional frame drops may cause these numbers to be out of
>> > >> > + * sync for a short period.
>> > >> > + *
>> > >> > + * \todo Ideally, this property should be set by the application, but we don't
>> > >> > + * have any mechanism to pass generic properties into a pipeline handler at
>> > >> > + * present.
>> > >> > + */
>> > >> > +static const bool StrictBufferMatching = true;
>> > >> > +
>> > >> >  namespace {
>> > >> >
>> > >> >  bool isRaw(PixelFormat &pixFmt)
>> > >> > @@ -1724,13 +1740,13 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
>> > >> >               embeddedBuffer = nullptr;
>> > >> >               while (!embeddedQueue_.empty()) {
>> > >> >                       FrameBuffer *b = embeddedQueue_.front();
>> > >> > -                     if (!unicam_[Unicam::Embedded].isExternal() && b->metadata().timestamp < ts) {
>> > >> > +                     if (b->metadata().timestamp < ts) {
>> > >> >                               embeddedQueue_.pop();
>> > >> >                               unicam_[Unicam::Embedded].queueBuffer(b);
>> > >> >                               embeddedRequeueCount++;
>> > >> >                               LOG(RPI, Warning) << "Dropping unmatched input frame in stream "
>> > >> >                                                 << unicam_[Unicam::Embedded].name();
>> > >> > -                     } else if (unicam_[Unicam::Embedded].isExternal() || b->metadata().timestamp == ts) {
>> > >> > +                     } else if (b->metadata().timestamp == ts) {
>> > >> >                               /* We pop the item from the queue lower down. */
>> > >> >                               embeddedBuffer = b;
>> > >> >                               break;
>> > >> > @@ -1744,10 +1760,15 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
>> > >> >
>> > >> >                       LOG(RPI, Debug) << "Could not find matching embedded buffer";
>> > >> >
>> > >> > -                     if (!sensorMetadata_) {
>> > >> > +                     if (unicam_[Unicam::Embedded].isExternal() ||
>> > >> > +                         !StrictBufferMatching || !sensorMetadata_) {
>> > >> >                               /*
>> > >> > -                              * If there is no sensor metadata, simply return the
>> > >> > -                              * first bayer frame in the queue.
>> > >> > +                              * If any of the following is true:
>> > >> > +                              * - This is an external stream buffer (so cannot be dropped).
>> > >> > +                              * - We do not care about strict buffer matching.
>> > >> > +                              * - There is no sensor metadata present.
>> > >> > +                              * we can simply return the first bayer frame in the queue
>> > >> > +                              * without a matching embedded buffer.
>> > >> >                                */
>> > >> >                               LOG(RPI, Debug) << "Returning bayer frame without a match";
>> > >> >                               bayerQueue_.pop();
>>
>> --
>> Regards,
>>
>> Laurent Pinchart
Naushir Patuck March 5, 2021, 9:02 a.m. UTC | #7
Hi David,

On Thu, 4 Mar 2021 at 17:19, David Plowman <david.plowman@raspberrypi.com>
wrote:

> Hi Naush, Laurent
>
> On Thu, 4 Mar 2021 at 08:49, Naushir Patuck <naush@raspberrypi.com> wrote:
> >
> > Hi Laurent,
> >
> > On Thu, 4 Mar 2021 at 02:12, Laurent Pinchart <
> laurent.pinchart@ideasonboard.com> wrote:
> >>
> >> Hello,
> >>
> >> On Thu, Feb 25, 2021 at 11:17:37AM +0000, David Plowman wrote:
> >> > Another interesting case that we're looking at is a sensor with
> >> > on-board HDR. Because the sensor tonemaps the image before outputting
> >> > it, it means that the statistics we get from the ISP are no good for
> >> > the AGC/AEC algorithm - they're not linear with exposure/gain. Because
> >> > of this the sensor actually outputs some statistics of its own that we
> >> > can collect into the "embedded data" buffers in our usual way. From
> >> > there, the plan would be to pass them to our AGC/AEC.
> >> >
> >> > But anyway, this is another case where "strict" matching is important.
> >> > (Also, there will likely be changes in our pipeline handler and IPAs
> >> > in the short-to-medium term to support this kind of use-case more
> >> > straightforwardly!).
> >> >
> >> > On Mon, 22 Feb 2021 at 13:52, Naushir Patuck <naush@raspberrypi.com>
> wrote:
> >> > > On Sun, 21 Feb 2021 at 23:09, Laurent Pinchart <
> laurent.pinchart@ideasonboard.com> wrote:
> >> > >> On Thu, Feb 18, 2021 at 05:01:26PM +0000, Naushir Patuck wrote:
> >> > >> > A new flag, StrictBufferMatching, is used to control the
> behavior of
> >> > >> > the embedded and bayer buffer matching routine.
> >> > >> >
> >> > >> > If set to true (default), we reject and drop all bayer frames
> that do
> >> > >> > not have a matching embedded data buffer and vice-versa. This
> guarantees
> >> > >> > the IPA will always have the correct frame exposure and gain
> values to
> >> > >> > use.
> >> > >> >
> >> > >> > If set to false, we use bayer frames that do not have a matching
> >> > >> > embedded data buffer. In this case, IPA will use the
> ControlList
> >> > >> > passed to it for gain and exposure values.
> >> > >> >
> >> > >> > Additionally, allow external stream buffers to behave as if
> >> > >> > StrictBufferMatching = false since we are not allowed to drop
> them.
> >> > >>
> >> > >> Could you explain the use case for both options ? Assuming an API
> for
> >> > >> applications to set this, how would an application decide ?
> >> > >
> >> > > As an example, consider a sensor that generates some kind of
> computational
> >> > > metadata (e.g. face/object detection) in the sensor silicon and
> sends it out
> >> > > via embedded data.  In such cases, the image data alone is not very
> helpful
> >> > > for the application.  An application would need both embedded data
> and image
> >> > > data buffers to do its magic, and crucially they must be in sync.
> This is where
> >> > > the StrictBufferMatching = true would be used.  We are actually
> talking to a
> >> > > vendor about this exact use case. Of course, today we do not have
> the ability to
> >> > > get to that embedded data buffer in the application, but we will in
> the future, so
> >> > > right now this hint is more for the IPA having a guarantee that
> both buffers are in
> >> > > sync.
> >> > >
> >> > > For simpler use cases, e.g. high framerate video, we do not
> necessarily care
> >> > > about strict matching, as the embedded buffer will only be used for
> AWB/AE loops,
> >> > > and avoiding frame drops is more desirable.  These cases would use
> >> > > StrictBufferMatching = false.
> >> > > Perhaps a better strategy would be to set this to the default...?
> >> > >
> >> > > Hope that makes sense.
> >>
> >> So I understand why strict matching is important, and that's the default
> >> in this patch. Non-strict matching relaxes the requirements, but I'm not
> >> sure what impact it will have on algorithms.
> >
> >
> > Hopefully non-strict buffer matching will not have any impact on the
> algorithms :-)
> > I suppose it could possibly cause AE to go wrong in extreme cases where
> > for whatever reason, the locally stored values were incorrect.
> >
> >>
> >> Furthermore, to make this really useful, we would need a way to
> >> configure the policy at runtime, and I think it may be a hard choice for
> >> applications. I wonder if there could be a way to handle the buffer
> >> matching policy automagically ? The code here deals with the case where
> >> no matching buffer is available, which I assume would be caused by a
> >> frame drop (either on the image stream or the embedded data stream).
> >> Would it be possible to detect such situations and handle them in an
> >> automatic way ?
> >
> >
> > The pipeline handler is able to detect a buffer that cannot be matched,
> > but after that, I don't see a way for it to know how to handle this in an
> > automatic way.  It does not know the context of the buffers or
> > application,
> > e.g., whether there is some computer vision embedded data that goes with the
> > image frame, in which case an unmatched frame should be dropped.
> >
> >>
> >> I'm not sure what to propose exactly, but going in the
> >> direction of exposing to applications controls that tune heuristics in
> >> ill-defined ways worries me. We could easily end up with controls that
> >> applications would have no meaningful way to set. If you ask me if I
> >> need to avoid frame drops or avoid bad images, I'll tell you I need both
> >> :-) I could possibly make a decision if I knew the exact risk of frame
> >> drop, and exactly how "bad" images could be, but I don't think that's
> >> something we'll be able to express in an API. I'd really like to avoid
> >> going in this direction.
> >
> >
> > Yes, I do see your concerns here.  We do not want to put more onus on
> > the application side to manage pipelines, and in possibly a vendor
> specific
> > way.  However, I do see that more advanced users, or unique apps that
> > have a very specific purpose may want this sort of control available.
> >
> > Some time ago, we talked about allowing applications to provide "hints"
> > to the pipeline handlers on possible operating behaviors.   I was
> thinking
> > that StrictBufferMatching could be one of those hints.  Other possible
> > hints may be HDR, ZSL, high framerate, low power operating modes
> > as a few examples off the top of my head.
> >
> > We still need to be careful, as hints should not be enforceable, but act
> > only as a suggestion (hence "hints" :-)) that the pipeline handler may or
> > may not handle.
> >
> > I'm not sure what the right approach here is, or if there even is one...?
>
> I wonder if CamHelpers would help here? Particularly in view of our
> plans going forward to generalise the parsing for more types of
> metadata, the CamHelpers will have to know what the metadata is and
> could take a view on the risks of dropping frames or not.
>

While I think this could be part of the solution, I don't think it would be
complete.  I still feel there is a level of control that an application may
(optionally) want to exert on the pipeline handlers.  If, for example,
the application wants the pipeline handler to return the metadata buffer
(I know we cannot do this right now but...), then the IPA cannot make
the decision in isolation.  If you expand on the "hints" idea, you could
also see the application sending hints about mode choices, e.g. fast
fps preferred, or larger FoV preferred, etc. The IPA would not know or
care about any of these choices.

Regards,
Naush
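To make the "hints" idea above concrete, here is a purely hypothetical sketch. None of these types exist in libcamera today; all names (`OperatingHint`, `OperatingHints`, `strictBufferMatching()`) and the biasing policy itself are invented for illustration of how non-binding hints might be consulted by a pipeline handler:

```cpp
#include <cassert>
#include <set>

/* Hypothetical only: a set of non-binding operating hints an application
 * could pass to a pipeline handler. */
enum class OperatingHint {
	StrictBufferMatching,
	HighFrameRate,
	LowPower,
};

class OperatingHints
{
public:
	void request(OperatingHint hint) { hints_.insert(hint); }
	bool requested(OperatingHint hint) const { return hints_.count(hint) != 0; }

private:
	std::set<OperatingHint> hints_;
};

/*
 * A pipeline handler remains free to ignore a hint; here it merely biases
 * the default policy: strict matching by default, relaxed only when the
 * application hinted at high framerate without also asking for strictness.
 */
bool strictBufferMatching(const OperatingHints &hints)
{
	return !hints.requested(OperatingHint::HighFrameRate) ||
	       hints.requested(OperatingHint::StrictBufferMatching);
}
```

Because the hint is advisory, an application that sets nothing still gets the safe strict default, which matches the behaviour of the patch as posted.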

Patch
diff mbox series

diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
index d969c77993eb..7f66d6995176 100644
--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
@@ -46,6 +46,22 @@  namespace libcamera {
 
 LOG_DEFINE_CATEGORY(RPI)
 
+/*
+ * Set this to true to reject and drop all bayer frames that do not have a
+ * matching embedded data buffer and vice-versa. This guarantees the IPA will
+ * always have the correct frame exposure and gain values to use.
+ *
+ * Set this to false to use bayer frames that do not have a matching embedded
+ * data buffer. In this case, IPA will use our local history for gain and
+ * exposure values; occasional frame drops may cause these numbers to be out of
+ * sync for a short period.
+ *
+ * \todo Ideally, this property should be set by the application, but we don't
+ * have any mechanism to pass generic properties into a pipeline handler at
+ * present.
+ */
+static const bool StrictBufferMatching = true;
+
 namespace {
 
 bool isRaw(PixelFormat &pixFmt)
@@ -1724,13 +1740,13 @@  bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
 		embeddedBuffer = nullptr;
 		while (!embeddedQueue_.empty()) {
 			FrameBuffer *b = embeddedQueue_.front();
-			if (!unicam_[Unicam::Embedded].isExternal() && b->metadata().timestamp < ts) {
+			if (b->metadata().timestamp < ts) {
 				embeddedQueue_.pop();
 				unicam_[Unicam::Embedded].queueBuffer(b);
 				embeddedRequeueCount++;
 				LOG(RPI, Warning) << "Dropping unmatched input frame in stream "
 						  << unicam_[Unicam::Embedded].name();
-			} else if (unicam_[Unicam::Embedded].isExternal() || b->metadata().timestamp == ts) {
+			} else if (b->metadata().timestamp == ts) {
 				/* We pop the item from the queue lower down. */
 				embeddedBuffer = b;
 				break;
@@ -1744,10 +1760,15 @@  bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em
 
 			LOG(RPI, Debug) << "Could not find matching embedded buffer";
 
-			if (!sensorMetadata_) {
+			if (unicam_[Unicam::Embedded].isExternal() ||
+			    !StrictBufferMatching || !sensorMetadata_) {
 				/*
-				 * If there is no sensor metadata, simply return the
-				 * first bayer frame in the queue.
+				 * If any of the following is true:
+				 * - This is an external stream buffer (so cannot be dropped).
+				 * - We do not care about strict buffer matching.
+				 * - There is no sensor metadata present.
+				 * we can simply return the first bayer frame in the queue
+				 * without a matching embedded buffer.
 				 */
 				LOG(RPI, Debug) << "Returning bayer frame without a match";
 				bayerQueue_.pop();
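The control flow the patch adds can be modelled with plain timestamp queues. This is a simplified sketch only: the real `findMatchingBuffers()` requeues stale embedded buffers back to Unicam and tracks requeue counts, and the external-stream and missing-sensor-metadata cases are collapsed here into a single `strictMatching` flag; all names below are illustrative, not libcamera API.

```cpp
#include <cassert>
#include <cstdint>
#include <deque>
#include <optional>

/* Timestamps stand in for FrameBuffer objects. */
struct MatchResult {
	uint64_t bayerTimestamp;
	std::optional<uint64_t> embeddedTimestamp;
};

std::optional<MatchResult> findMatch(std::deque<uint64_t> &bayerQueue,
				     std::deque<uint64_t> &embeddedQueue,
				     bool strictMatching)
{
	while (!bayerQueue.empty()) {
		uint64_t ts = bayerQueue.front();

		/*
		 * Discard embedded buffers older than the bayer frame. The
		 * real code requeues them to the device instead.
		 */
		while (!embeddedQueue.empty() && embeddedQueue.front() < ts)
			embeddedQueue.pop_front();

		if (!embeddedQueue.empty() && embeddedQueue.front() == ts) {
			/* Exact timestamp match: return the pair. */
			bayerQueue.pop_front();
			uint64_t embeddedTs = embeddedQueue.front();
			embeddedQueue.pop_front();
			return MatchResult{ ts, embeddedTs };
		}

		if (!strictMatching) {
			/* Non-strict: use the bayer frame without a match. */
			bayerQueue.pop_front();
			return MatchResult{ ts, std::nullopt };
		}

		/* Strict: drop the unmatched bayer frame and try the next. */
		bayerQueue.pop_front();
	}

	return std::nullopt;
}
```

In strict mode the IPA only ever sees pairs with identical timestamps, at the cost of dropped frames; in non-strict mode the frame is returned with no embedded buffer and the IPA falls back to the ControlList / local history for exposure and gain.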