[libcamera-devel,3/8] cam: Use SensorTimestamp rather than buffer metadata

Message ID 20211206233948.1351206-4-kieran.bingham@ideasonboard.com
State New
Delegated to: Kieran Bingham
Series
  • Request metadata: SensorSequence

Commit Message

Kieran Bingham Dec. 6, 2021, 11:39 p.m. UTC
The SensorTimestamp is defined to be the time of capture of the image.
While all streams should have the same timestamp, this is not always
defined or guaranteed as ISP drivers may not forward sequence numbers
and timestamps from their input buffer.

Use the Request metadata to get the SensorTimestamp which must be
set by the pipeline handlers according to the data from the capture
device.

Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
---
 src/cam/camera_session.cpp | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)
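The frame-rate computation touched by this hunk is simple enough to sketch stand-alone. The helper below mirrors the patched lines in `CameraSession::processRequest()`; the function name and the stand-alone wrapping are illustrative, not part of cam:

```cpp
#include <cassert>
#include <cstdint>

/*
 * Frame rate computation as done in CameraSession::processRequest():
 * timestamps are in nanoseconds, so 1e9 / delta yields frames per second.
 * 'last' is 0 before the first frame, which suppresses the first
 * (meaningless) value, matching the patched code.
 */
double computeFps(uint64_t ts, uint64_t &last)
{
	double fps = static_cast<double>(ts) - static_cast<double>(last);
	fps = last != 0 && fps ? 1000000000.0 / fps : 0.0;
	last = ts;
	return fps;
}
```

With a 50 ms frame interval this reports 20 fps, and the first sample reports 0.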

Comments

Laurent Pinchart Dec. 7, 2021, 12:09 a.m. UTC | #1
Hi Kieran,

Thank you for the patch.

On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
> The SensorTimestamp is defined to be the time of capture of the image.
> While all streams should have the same timestamp, this is not always
> defined or guaranteed as ISP drivers may not forward sequence numbers
> and timestamps from their input buffer.

That should then be solved by the pipeline handler, which should store
the correct timestamp in the buffer metadata.

> Use the Request metadata to get the SensorTimestamp which must be
> set by the pipeline handlers according to the data from the capture
> device.
> 
> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
> ---
>  src/cam/camera_session.cpp | 7 ++++---
>  1 file changed, 4 insertions(+), 3 deletions(-)
> 
> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
> index 1bf460fa3fb7..50170723c30f 100644
> --- a/src/cam/camera_session.cpp
> +++ b/src/cam/camera_session.cpp
> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request *request)
>  	const Request::BufferMap &buffers = request->buffers();
>  
>  	/*
> -	 * Compute the frame rate. The timestamp is arbitrarily retrieved from
> -	 * the first buffer, as all buffers should have matching timestamps.
> +	 * Compute the frame rate. The timestamp is retrieved from the
> +	 * SensorTimestamp property, though all streams should have the
> +	 * same timestamp.
>  	 */
> -	uint64_t ts = buffers.begin()->second->metadata().timestamp;
> +	uint64_t ts = request->metadata().get(controls::SensorTimestamp);

This seems reasonable. Why do we have timestamps in the buffer metadata
? :-)

>  	double fps = ts - last_;
>  	fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
>  	last_ = ts;
Umang Jain Dec. 7, 2021, 1:21 p.m. UTC | #2
Hi Laurent,

On 12/7/21 5:39 AM, Laurent Pinchart wrote:
> Hi Kieran,
>
> Thank you for the patch.
>
> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
>> The SensorTimestamp is defined to be the time of capture of the image.
>> While all streams should have the same timestamp, this is not always
>> defined or guaranteed as ISP drivers may not forward sequence numbers
>> and timestamps from their input buffer.
> That should then bo solved by the pipeline handler, which should store
> the correct timestamp in the buffer metadata.
>
>> Use the Request metadata to get the SensorTimestamp which must be
>> set by the pipeline handlers according to the data from the capture
>> device.
>>
>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
>> ---
>>   src/cam/camera_session.cpp | 7 ++++---
>>   1 file changed, 4 insertions(+), 3 deletions(-)
>>
>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
>> index 1bf460fa3fb7..50170723c30f 100644
>> --- a/src/cam/camera_session.cpp
>> +++ b/src/cam/camera_session.cpp
>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request *request)
>>   	const Request::BufferMap &buffers = request->buffers();
>>   
>>   	/*
>> -	 * Compute the frame rate. The timestamp is arbitrarily retrieved from
>> -	 * the first buffer, as all buffers should have matching timestamps.
>> +	 * Compute the frame rate. The timestamp is retrieved from the
>> +	 * SensorTimestamp property, though all streams should have the
>> +	 * same timestamp.
>>   	 */
>> -	uint64_t ts = buffers.begin()->second->metadata().timestamp;
>> +	uint64_t ts = request->metadata().get(controls::SensorTimestamp);
> This seems reasonable. Why do we have timestamps in the buffer metadata
> ? :-)


Because there is no buffer assigned at the point in time where we want to
capture the timestamp.

The exact location of the timestamp is a \todo

              * \todo The sensor timestamp should be better estimated by
              * connecting to the V4L2Device::frameStart signal.

in all the pipeline handlers as of now.

We *want* to capture it at the frameStart signal, which is emitted in
response to VIDIOC_DQEVENT, but we can't, as no buffer is assigned (yet).

We probably need a container that keeps track of timestamps and sequence
numbers until a buffer is assigned down the line, and then set the buffer
metadata by reading the seq# & timestamp from that container.
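The container idea above can be sketched as a small FIFO: frameStart events record a (sequence, timestamp) pair before any buffer exists, and when a buffer is dequeued later the oldest pending entry is popped and applied to it. All names here are hypothetical, not libcamera API:

```cpp
#include <cassert>
#include <cstdint>
#include <optional>
#include <queue>

/* Hypothetical per-frame data captured at the frameStart signal. */
struct FrameStartInfo {
	unsigned int sequence;
	uint64_t timestamp;
};

/*
 * Hypothetical queue bridging the gap between frameStart (no buffer
 * available yet) and buffer dequeue (where metadata gets filled in).
 */
class FrameStartQueue {
public:
	/* Called from the frameStart signal handler. */
	void recordFrameStart(unsigned int sequence, uint64_t timestamp)
	{
		pending_.push({ sequence, timestamp });
	}

	/* Called when a buffer is dequeued; returns the oldest pending entry. */
	std::optional<FrameStartInfo> takeForBuffer()
	{
		if (pending_.empty())
			return std::nullopt;
		FrameStartInfo info = pending_.front();
		pending_.pop();
		return info;
	}

private:
	std::queue<FrameStartInfo> pending_;
};
```

The FIFO ordering assumes frames complete in the order their frameStart events fired, which holds for a single capture device but would need more care with frame drops.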


>
>>   	double fps = ts - last_;
>>   	fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
>>   	last_ = ts;
Umang Jain Dec. 7, 2021, 1:37 p.m. UTC | #3
On 12/7/21 6:51 PM, Umang Jain wrote:
> Hi Laurent,
>
> On 12/7/21 5:39 AM, Laurent Pinchart wrote:
>> Hi Kieran,
>>
>> Thank you for the patch.
>>
>> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
>>> The SensorTimestamp is defined to be the time of capture of the image.
>>> While all streams should have the same timestamp, this is not always
>>> defined or guaranteed as ISP drivers may not forward sequence numbers
>>> and timestamps from their input buffer.
>> That should then bo solved by the pipeline handler, which should store
>> the correct timestamp in the buffer metadata.
>>
>>> Use the Request metadata to get the SensorTimestamp which must be
>>> set by the pipeline handlers according to the data from the capture
>>> device.
>>>
>>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
>>> ---
>>>   src/cam/camera_session.cpp | 7 ++++---
>>>   1 file changed, 4 insertions(+), 3 deletions(-)
>>>
>>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
>>> index 1bf460fa3fb7..50170723c30f 100644
>>> --- a/src/cam/camera_session.cpp
>>> +++ b/src/cam/camera_session.cpp
>>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request 
>>> *request)
>>>       const Request::BufferMap &buffers = request->buffers();
>>>         /*
>>> -     * Compute the frame rate. The timestamp is arbitrarily 
>>> retrieved from
>>> -     * the first buffer, as all buffers should have matching 
>>> timestamps.
>>> +     * Compute the frame rate. The timestamp is retrieved from the
>>> +     * SensorTimestamp property, though all streams should have the
>>> +     * same timestamp.
>>>        */
>>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
>>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
>> This seems reasonable. Why do we have timestamps in the buffer metadata
>> ? :-)


There is a strong chance I have misunderstood the question; I later
realized this is a cam-related patch.

To me, the question read as:

     why do we have timestamps in FrameBuffer.metadata_.timestamp ?

So ignore the discussion (if you want) :-P

>
>
> Because there is no buffer assigned on the time-point where we want to 
> capture the timestamp.
>
> The exact location of the timestamp is a \todo
>
>              * \todo The sensor timestamp should be better estimated 
> by connecting
>              * to the V4L2Device::frameStart signal.
>
> in all the pipeline-handlers as of now.
>
> We *want* to capture at frameStart signal, which emits in response to 
> VIDIOC_DQEVENT, but we can't as no buffer assigned (yet)
>
> We probably need to assign a container taking care of timestamps with 
> sequence numbers until a buffer is assigned down-the-line and then set 
> the buffer metadata by reading seq# & timestamp from that container.
>
>
>>
>>>       double fps = ts - last_;
>>>       fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
>>>       last_ = ts;
Laurent Pinchart Dec. 8, 2021, 6:57 p.m. UTC | #4
Hi Umang,

On Tue, Dec 07, 2021 at 07:07:46PM +0530, Umang Jain wrote:
> On 12/7/21 6:51 PM, Umang Jain wrote:
> > On 12/7/21 5:39 AM, Laurent Pinchart wrote:
> >> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
> >>> The SensorTimestamp is defined to be the time of capture of the image.
> >>> While all streams should have the same timestamp, this is not always
> >>> defined or guaranteed as ISP drivers may not forward sequence numbers
> >>> and timestamps from their input buffer.
> >>
> >> That should then bo solved by the pipeline handler, which should store
> >> the correct timestamp in the buffer metadata.
> >>
> >>> Use the Request metadata to get the SensorTimestamp which must be
> >>> set by the pipeline handlers according to the data from the capture
> >>> device.
> >>>
> >>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
> >>> ---
> >>>   src/cam/camera_session.cpp | 7 ++++---
> >>>   1 file changed, 4 insertions(+), 3 deletions(-)
> >>>
> >>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
> >>> index 1bf460fa3fb7..50170723c30f 100644
> >>> --- a/src/cam/camera_session.cpp
> >>> +++ b/src/cam/camera_session.cpp
> >>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request 
> >>> *request)
> >>>       const Request::BufferMap &buffers = request->buffers();
> >>>         /*
> >>> -     * Compute the frame rate. The timestamp is arbitrarily retrieved from
> >>> -     * the first buffer, as all buffers should have matching timestamps.
> >>> +     * Compute the frame rate. The timestamp is retrieved from the
> >>> +     * SensorTimestamp property, though all streams should have the
> >>> +     * same timestamp.
> >>>        */
> >>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
> >>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
> >>
> >> This seems reasonable. Why do we have timestamps in the buffer metadata
> >> ? :-)
> 
> Strong chance I have mis-understood the question, I later realized this 
> is a cam-related patch.
> 
> to me, the question translated :
> 
>      why do we have timestamps in the FrameBuffer.metadata_.timestamp ?

To clarify, I meant to ask why we have both controls::SensorTimestamp
*and* timestamps in individual buffers
(FrameBuffer::metadata_.timestamp). I suppose it's historical, the
timestamp member comes from

commit e94e52c0cb27f92c085da6e776af8b3d3172bbb2
Author: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Date:   Thu Jan 24 23:34:51 2019 +0100

    libcamera: v4l2_device: Update dequeued buffer information

while controls::SensorTimestamp has been introduced in

commit 84e79bd8b5347d91309cbd759bbd988b5144fb8d
Author: Jacopo Mondi <jacopo@jmondi.org>
Date:   Thu Jul 30 15:18:50 2020 +0200

    libcamera: control_ids: Define draft controls

> So ignore the discussion (if you want) :-P
> 
> > Because there is no buffer assigned on the time-point where we want to 
> > capture the timestamp.
> >
> > The exact location of the timestamp is a \todo
> >
> >              * \todo The sensor timestamp should be better estimated 
> > by connecting
> >              * to the V4L2Device::frameStart signal.
> >
> > in all the pipeline-handlers as of now.
> >
> > We *want* to capture at frameStart signal, which emits in response to 
> > VIDIOC_DQEVENT, but we can't as no buffer assigned (yet)
> >
> > We probably need to assign a container taking care of timestamps with 
> > sequence numbers until a buffer is assigned down-the-line and then set 
> > the buffer metadata by reading seq# & timestamp from that container.

Should we just drop FrameBuffer::metadata_.timestamp in favour of
controls::SensorTimestamp ? What would be the drawbacks, if any ?

> >>>       double fps = ts - last_;
> >>>       fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
> >>>       last_ = ts;
Kieran Bingham Dec. 8, 2021, 8:59 p.m. UTC | #5
Quoting Laurent Pinchart (2021-12-08 18:57:10)
> Hi Umang,
> 
> On Tue, Dec 07, 2021 at 07:07:46PM +0530, Umang Jain wrote:
> > On 12/7/21 6:51 PM, Umang Jain wrote:
> > > On 12/7/21 5:39 AM, Laurent Pinchart wrote:
> > >> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
> > >>> The SensorTimestamp is defined to be the time of capture of the image.
> > >>> While all streams should have the same timestamp, this is not always
> > >>> defined or guaranteed as ISP drivers may not forward sequence numbers
> > >>> and timestamps from their input buffer.
> > >>
> > >> That should then bo solved by the pipeline handler, which should store
> > >> the correct timestamp in the buffer metadata.
> > >>
> > >>> Use the Request metadata to get the SensorTimestamp which must be
> > >>> set by the pipeline handlers according to the data from the capture
> > >>> device.
> > >>>
> > >>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
> > >>> ---
> > >>>   src/cam/camera_session.cpp | 7 ++++---
> > >>>   1 file changed, 4 insertions(+), 3 deletions(-)
> > >>>
> > >>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
> > >>> index 1bf460fa3fb7..50170723c30f 100644
> > >>> --- a/src/cam/camera_session.cpp
> > >>> +++ b/src/cam/camera_session.cpp
> > >>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request 
> > >>> *request)
> > >>>       const Request::BufferMap &buffers = request->buffers();
> > >>>         /*
> > >>> -     * Compute the frame rate. The timestamp is arbitrarily retrieved from
> > >>> -     * the first buffer, as all buffers should have matching timestamps.
> > >>> +     * Compute the frame rate. The timestamp is retrieved from the
> > >>> +     * SensorTimestamp property, though all streams should have the
> > >>> +     * same timestamp.
> > >>>        */
> > >>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
> > >>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
> > >>
> > >> This seems reasonable. Why do we have timestamps in the buffer metadata
> > >> ? :-)
> > 
> > Strong chance I have mis-understood the question, I later realized this 
> > is a cam-related patch.
> > 
> > to me, the question translated :
> > 
> >      why do we have timestamps in the FrameBuffer.metadata_.timestamp ?
> 
> To clarify, I meant to ask why we have both controls::SensorTimestamp
> *and* timestamps in individual buffers
> (FrameBuffer::metadata_.timestamp). I suppose it's historical, the
> timestamp member comes from
> 
> commit e94e52c0cb27f92c085da6e776af8b3d3172bbb2
> Author: Niklas Söderlund <niklas.soderlund@ragnatech.se>
> Date:   Thu Jan 24 23:34:51 2019 +0100
> 
>     libcamera: v4l2_device: Update dequeued buffer information
> 
> while controls::SensorTimestamp has been introduced in
> 
> commit 84e79bd8b5347d91309cbd759bbd988b5144fb8d
> Author: Jacopo Mondi <jacopo@jmondi.org>
> Date:   Thu Jul 30 15:18:50 2020 +0200
> 
>     libcamera: control_ids: Define draft controls
> 
> > So ignore the discussion (if you want) :-P
> > 
> > > Because there is no buffer assigned on the time-point where we want to 
> > > capture the timestamp.
> > >
> > > The exact location of the timestamp is a \todo
> > >
> > >              * \todo The sensor timestamp should be better estimated 
> > > by connecting
> > >              * to the V4L2Device::frameStart signal.
> > >
> > > in all the pipeline-handlers as of now.
> > >
> > > We *want* to capture at frameStart signal, which emits in response to 
> > > VIDIOC_DQEVENT, but we can't as no buffer assigned (yet)
> > >
> > > We probably need to assign a container taking care of timestamps with 
> > > sequence numbers until a buffer is assigned down-the-line and then set 
> > > the buffer metadata by reading seq# & timestamp from that container.
> 
> Should we just drop FrameBuffer::metadata_.timestamp in favour of
> controls::SensorTimestamp ? What would be the drawbacks, if any ?

Drawbacks of using the metadata control:
 - Have to parse the metadata control list to get timestamps
   (/sequences)

Pros of the metadata control:
 - Only one place for it; all buffers in the same request are defined as
   having the same value (sequence and timestamp).


Drawbacks of per-buffer metadata:
 - Must be explicitly assigned to keep them in sync, even though they
   are perhaps the output of multiple video devices.

Pros of per-buffer metadata:
 - Quicker to retrieve the values? (defined structure?)

Any more on either side?


Personally, I'm currently thinking these are better defined by the
metadata() control list. After all, that's the purpose of the list,
right?


> 
> > >>>       double fps = ts - last_;
> > >>>       fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
> > >>>       last_ = ts;
> 
> -- 
> Regards,
> 
> Laurent Pinchart
Laurent Pinchart Dec. 8, 2021, 9:13 p.m. UTC | #6
Hi Kieran,

On Wed, Dec 08, 2021 at 08:59:29PM +0000, Kieran Bingham wrote:
> Quoting Laurent Pinchart (2021-12-08 18:57:10)
> > On Tue, Dec 07, 2021 at 07:07:46PM +0530, Umang Jain wrote:
> > > On 12/7/21 6:51 PM, Umang Jain wrote:
> > > > On 12/7/21 5:39 AM, Laurent Pinchart wrote:
> > > >> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
> > > >>> The SensorTimestamp is defined to be the time of capture of the image.
> > > >>> While all streams should have the same timestamp, this is not always
> > > >>> defined or guaranteed as ISP drivers may not forward sequence numbers
> > > >>> and timestamps from their input buffer.
> > > >>
> > > >> That should then bo solved by the pipeline handler, which should store
> > > >> the correct timestamp in the buffer metadata.
> > > >>
> > > >>> Use the Request metadata to get the SensorTimestamp which must be
> > > >>> set by the pipeline handlers according to the data from the capture
> > > >>> device.
> > > >>>
> > > >>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
> > > >>> ---
> > > >>>   src/cam/camera_session.cpp | 7 ++++---
> > > >>>   1 file changed, 4 insertions(+), 3 deletions(-)
> > > >>>
> > > >>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
> > > >>> index 1bf460fa3fb7..50170723c30f 100644
> > > >>> --- a/src/cam/camera_session.cpp
> > > >>> +++ b/src/cam/camera_session.cpp
> > > >>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request 
> > > >>> *request)
> > > >>>       const Request::BufferMap &buffers = request->buffers();
> > > >>>         /*
> > > >>> -     * Compute the frame rate. The timestamp is arbitrarily retrieved from
> > > >>> -     * the first buffer, as all buffers should have matching timestamps.
> > > >>> +     * Compute the frame rate. The timestamp is retrieved from the
> > > >>> +     * SensorTimestamp property, though all streams should have the
> > > >>> +     * same timestamp.
> > > >>>        */
> > > >>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
> > > >>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
> > > >>
> > > >> This seems reasonable. Why do we have timestamps in the buffer metadata
> > > >> ? :-)
> > > 
> > > Strong chance I have mis-understood the question, I later realized this 
> > > is a cam-related patch.
> > > 
> > > to me, the question translated :
> > > 
> > >      why do we have timestamps in the FrameBuffer.metadata_.timestamp ?
> > 
> > To clarify, I meant to ask why we have both controls::SensorTimestamp
> > *and* timestamps in individual buffers
> > (FrameBuffer::metadata_.timestamp). I suppose it's historical, the
> > timestamp member comes from
> > 
> > commit e94e52c0cb27f92c085da6e776af8b3d3172bbb2
> > Author: Niklas Söderlund <niklas.soderlund@ragnatech.se>
> > Date:   Thu Jan 24 23:34:51 2019 +0100
> > 
> >     libcamera: v4l2_device: Update dequeued buffer information
> > 
> > while controls::SensorTimestamp has been introduced in
> > 
> > commit 84e79bd8b5347d91309cbd759bbd988b5144fb8d
> > Author: Jacopo Mondi <jacopo@jmondi.org>
> > Date:   Thu Jul 30 15:18:50 2020 +0200
> > 
> >     libcamera: control_ids: Define draft controls
> > 
> > > So ignore the discussion (if you want) :-P
> > > 
> > > > Because there is no buffer assigned on the time-point where we want to 
> > > > capture the timestamp.
> > > >
> > > > The exact location of the timestamp is a \todo
> > > >
> > > >              * \todo The sensor timestamp should be better estimated 
> > > > by connecting
> > > >              * to the V4L2Device::frameStart signal.
> > > >
> > > > in all the pipeline-handlers as of now.
> > > >
> > > > We *want* to capture at frameStart signal, which emits in response to 
> > > > VIDIOC_DQEVENT, but we can't as no buffer assigned (yet)
> > > >
> > > > We probably need to assign a container taking care of timestamps with 
> > > > sequence numbers until a buffer is assigned down-the-line and then set 
> > > > the buffer metadata by reading seq# & timestamp from that container.
> > 
> > Should we just drop FrameBuffer::metadata_.timestamp in favour of
> > controls::SensorTimestamp ? What would be the drawbacks, if any ?
> 
> Drawbacks for using Metadata control:
>  - Have to parse the metadata control list to get timestamps
>    (/sequences)

The request metadata is also not available before the request fully
completes, while buffers will complete individually before that. This is
something we may want to address in a generic way though, as I suspect
there will be use cases for providing request metadata early.

> Pro's for Metadata control:
>  - Only one place for it, all buffers in the same request are defined as
>    having the same value (sequence and timestamp).

That's the part I like :-)

> Drawbacks for per-buffer metadata:
>  - Must be explictily assigned to keep them in sync even though they are
>    the output of multiple video devices perhaps.
> 
> Pro's for for per-buffer metadata:
>  - Quicker to retrieve the values? (defined structure?)

Possibly a bit quicker, but that seems marginal to me.

> Any more on either side?
> 
> 
> Personally, I'm currently thinking these are better defined by the
> metadata() control list. After all, that's the purpose of the list
> right?

I think so. I see very little value today in providing buffer completion
timestamps that would be different from the sensor timestamp (this could
possibly help measuring the processing time, but if that's the goal, it
could also be done by capturing the system timestamp in the buffer
completion handler on the application side, which would provide
statistics about the full processing time, including both the hardware
processing and the software processing).

Let's wait for more feedback, but if there's a general agreement that
this is the direction we want to take, we could remove
FrameBuffer::metadata_.timestamp independently of addressing the sensor
sequence issue.
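The application-side measurement suggested above can be sketched as follows. This assumes the sensor timestamp is nanoseconds on the same monotonic clock as `std::chrono::steady_clock`, which matches libcamera's convention on Linux but is stated here as an assumption; the helper names are illustrative:

```cpp
#include <cassert>
#include <chrono>
#include <cstdint>

/* Current monotonic time in nanoseconds (assumed same clock base as the
 * sensor timestamp). */
uint64_t nowNs()
{
	return std::chrono::duration_cast<std::chrono::nanoseconds>(
		       std::chrono::steady_clock::now().time_since_epoch())
		.count();
}

/*
 * Called from the application's buffer/request completion handler:
 * the difference between "now" and the sensor timestamp approximates the
 * total hardware + software processing latency for the frame.
 */
uint64_t processingLatencyNs(uint64_t sensorTimestampNs)
{
	return nowNs() - sensorTimestampNs;
}
```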

> > > >>>       double fps = ts - last_;
> > > >>>       fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
> > > >>>       last_ = ts;
Umang Jain Dec. 9, 2021, 5:59 a.m. UTC | #7
Hi,

On 12/9/21 2:43 AM, Laurent Pinchart wrote:
> Hi Kieran,
>
> On Wed, Dec 08, 2021 at 08:59:29PM +0000, Kieran Bingham wrote:
>> Quoting Laurent Pinchart (2021-12-08 18:57:10)
>>> On Tue, Dec 07, 2021 at 07:07:46PM +0530, Umang Jain wrote:
>>>> On 12/7/21 6:51 PM, Umang Jain wrote:
>>>>> On 12/7/21 5:39 AM, Laurent Pinchart wrote:
>>>>>> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
>>>>>>> The SensorTimestamp is defined to be the time of capture of the image.
>>>>>>> While all streams should have the same timestamp, this is not always
>>>>>>> defined or guaranteed as ISP drivers may not forward sequence numbers
>>>>>>> and timestamps from their input buffer.
>>>>>> That should then bo solved by the pipeline handler, which should store
>>>>>> the correct timestamp in the buffer metadata.
>>>>>>
>>>>>>> Use the Request metadata to get the SensorTimestamp which must be
>>>>>>> set by the pipeline handlers according to the data from the capture
>>>>>>> device.
>>>>>>>
>>>>>>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
>>>>>>> ---
>>>>>>>    src/cam/camera_session.cpp | 7 ++++---
>>>>>>>    1 file changed, 4 insertions(+), 3 deletions(-)
>>>>>>>
>>>>>>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
>>>>>>> index 1bf460fa3fb7..50170723c30f 100644
>>>>>>> --- a/src/cam/camera_session.cpp
>>>>>>> +++ b/src/cam/camera_session.cpp
>>>>>>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request
>>>>>>> *request)
>>>>>>>        const Request::BufferMap &buffers = request->buffers();
>>>>>>>          /*
>>>>>>> -     * Compute the frame rate. The timestamp is arbitrarily retrieved from
>>>>>>> -     * the first buffer, as all buffers should have matching timestamps.
>>>>>>> +     * Compute the frame rate. The timestamp is retrieved from the
>>>>>>> +     * SensorTimestamp property, though all streams should have the
>>>>>>> +     * same timestamp.
>>>>>>>         */
>>>>>>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
>>>>>>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
>>>>>> This seems reasonable. Why do we have timestamps in the buffer metadata
>>>>>> ? :-)
>>>> Strong chance I have mis-understood the question, I later realized this
>>>> is a cam-related patch.
>>>>
>>>> to me, the question translated :
>>>>
>>>>       why do we have timestamps in the FrameBuffer.metadata_.timestamp ?
>>> To clarify, I meant to ask why we have both controls::SensorTimestamp
>>> *and* timestamps in individual buffers
>>> (FrameBuffer::metadata_.timestamp). I suppose it's historical, the
>>> timestamp member comes from
>>>
>>> commit e94e52c0cb27f92c085da6e776af8b3d3172bbb2
>>> Author: Niklas Söderlund <niklas.soderlund@ragnatech.se>
>>> Date:   Thu Jan 24 23:34:51 2019 +0100
>>>
>>>      libcamera: v4l2_device: Update dequeued buffer information
>>>
>>> while controls::SensorTimestamp has been introduced in
>>>
>>> commit 84e79bd8b5347d91309cbd759bbd988b5144fb8d
>>> Author: Jacopo Mondi <jacopo@jmondi.org>
>>> Date:   Thu Jul 30 15:18:50 2020 +0200
>>>
>>>      libcamera: control_ids: Define draft controls
>>>
>>>> So ignore the discussion (if you want) :-P
>>>>
>>>>> Because there is no buffer assigned on the time-point where we want to
>>>>> capture the timestamp.
>>>>>
>>>>> The exact location of the timestamp is a \todo
>>>>>
>>>>>               * \todo The sensor timestamp should be better estimated
>>>>> by connecting
>>>>>               * to the V4L2Device::frameStart signal.
>>>>>
>>>>> in all the pipeline-handlers as of now.
>>>>>
>>>>> We *want* to capture at frameStart signal, which emits in response to
>>>>> VIDIOC_DQEVENT, but we can't as no buffer assigned (yet)
>>>>>
>>>>> We probably need to assign a container taking care of timestamps with
>>>>> sequence numbers until a buffer is assigned down-the-line and then set
>>>>> the buffer metadata by reading seq# & timestamp from that container.
>>> Should we just drop FrameBuffer::metadata_.timestamp in favour of
>>> controls::SensorTimestamp ? What would be the drawbacks, if any ?
>> Drawbacks for using Metadata control:
>>   - Have to parse the metadata control list to get timestamps
>>     (/sequences)
> The request metadata is also not available before the request fully
> completes, while buffers will complete individually before. This is
> something we may want to address in a generic way though, I suspect
> there will be use cases for providing request metadata early.
>
>> Pro's for Metadata control:
>>   - Only one place for it, all buffers in the same request are defined as
>>     having the same value (sequence and timestamp).
> That's the part I like :-)
>
>> Drawbacks for per-buffer metadata:
>>   - Must be explictily assigned to keep them in sync even though they are
>>     the output of multiple video devices perhaps.
>>
>> Pro's for for per-buffer metadata:
>>   - Quicker to retrieve the values? (defined structure?)
> Possibly a bit quicker, but that seems marginal to me.
>
>> Any more on either side?
>>
>>
>> Personally, I'm currently thinking these are better defined by the
>> metadata() control list. After all, that's the purpose of the list
>> right?
> I think so. I see very little value today in providing buffer completion
> timestamps that would be different than the sensor timestamp (this could


Agreed on this point. Applications should 'generally' be asking for the
request-completion timestamp and the sensor timestamp (if they want to
use them for hw/processing metrics).

Buffer-completion timestamps sit in between (with sub-cases such as
"direct" buffers / post-processed buffers etc.) - I can't see any major
value for applications in requesting those, but I could be wrong!

> possibly help measuring the processing time, but if that's the goal, it
> could also be done by capturing the system timestamp in the buffer
> completion handler on the application side, which would provide
> statistics about the full processing time, including both the hardware
> processing and the software processing).
>
> Let's wait for more feedback, but if there's a general agreement that
> this is the direction we want to take, we could remove
> FrameBuffer::metadata_.timestamp independently of addressing the sensor
> sequence issue.


Makes sense.

>
>>>>>>>        double fps = ts - last_;
>>>>>>>        fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
>>>>>>>        last_ = ts;
Naushir Patuck Dec. 9, 2021, 9:23 a.m. UTC | #8
Hi all,

On Wed, 8 Dec 2021 at 21:13, Laurent Pinchart <
laurent.pinchart@ideasonboard.com> wrote:

> Hi Kieran,
>
> On Wed, Dec 08, 2021 at 08:59:29PM +0000, Kieran Bingham wrote:
> > Quoting Laurent Pinchart (2021-12-08 18:57:10)
> > > On Tue, Dec 07, 2021 at 07:07:46PM +0530, Umang Jain wrote:
> > > > On 12/7/21 6:51 PM, Umang Jain wrote:
> > > > > On 12/7/21 5:39 AM, Laurent Pinchart wrote:
> > > > >> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
> > > > >>> The SensorTimestamp is defined to be the time of capture of the image.
> > > > >>> While all streams should have the same timestamp, this is not always
> > > > >>> defined or guaranteed as ISP drivers may not forward sequence numbers
> > > > >>> and timestamps from their input buffer.
> > > > >>
> > > > >> That should then be solved by the pipeline handler, which should store
> > > > >> the correct timestamp in the buffer metadata.
> > > > >>
> > > > >>> Use the Request metadata to get the SensorTimestamp which must be
> > > > >>> set by the pipeline handlers according to the data from the capture
> > > > >>> device.
> > > > >>>
> > > > >>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
> > > > >>> ---
> > > > >>>   src/cam/camera_session.cpp | 7 ++++---
> > > > >>>   1 file changed, 4 insertions(+), 3 deletions(-)
> > > > >>>
> > > > >>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
> > > > >>> index 1bf460fa3fb7..50170723c30f 100644
> > > > >>> --- a/src/cam/camera_session.cpp
> > > > >>> +++ b/src/cam/camera_session.cpp
> > > > >>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request *request)
> > > > >>>       const Request::BufferMap &buffers = request->buffers();
> > > > >>>         /*
> > > > >>> -     * Compute the frame rate. The timestamp is arbitrarily retrieved from
> > > > >>> -     * the first buffer, as all buffers should have matching timestamps.
> > > > >>> +     * Compute the frame rate. The timestamp is retrieved from the
> > > > >>> +     * SensorTimestamp property, though all streams should have the
> > > > >>> +     * same timestamp.
> > > > >>>        */
> > > > >>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
> > > > >>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
> > > > >>
> > > > >> This seems reasonable. Why do we have timestamps in the buffer metadata
> > > > >> ? :-)
> > > >
> > > > Strong chance I have mis-understood the question, I later realized this
> > > > is a cam-related patch.
> > > >
> > > > to me, the question translated :
> > > >
> > > >      why do we have timestamps in the FrameBuffer.metadata_.timestamp ?
> > >
> > > To clarify, I meant to ask why we have both controls::SensorTimestamp
> > > *and* timestamps in individual buffers
> > > (FrameBuffer::metadata_.timestamp). I suppose it's historical, the
> > > timestamp member comes from
> > >
> > > commit e94e52c0cb27f92c085da6e776af8b3d3172bbb2
> > > Author: Niklas Söderlund <niklas.soderlund@ragnatech.se>
> > > Date:   Thu Jan 24 23:34:51 2019 +0100
> > >
> > >     libcamera: v4l2_device: Update dequeued buffer information
> > >
> > > while controls::SensorTimestamp has been introduced in
> > >
> > > commit 84e79bd8b5347d91309cbd759bbd988b5144fb8d
> > > Author: Jacopo Mondi <jacopo@jmondi.org>
> > > Date:   Thu Jul 30 15:18:50 2020 +0200
> > >
> > >     libcamera: control_ids: Define draft controls
> > >
> > > > So ignore the discussion (if you want) :-P
> > > >
> > > > > Because there is no buffer assigned on the time-point where we want to
> > > > > capture the timestamp.
> > > > >
> > > > > The exact location of the timestamp is a \todo
> > > > >
> > > > >              * \todo The sensor timestamp should be better estimated by connecting
> > > > >              * to the V4L2Device::frameStart signal.
> > > > >
> > > > > in all the pipeline-handlers as of now.
> > > > >
> > > > > We *want* to capture at frameStart signal, which emits in response to
> > > > > VIDIOC_DQEVENT, but we can't as no buffer assigned (yet)
> > > > >
> > > > > We probably need to assign a container taking care of timestamps with
> > > > > sequence numbers until a buffer is assigned down-the-line and then set
> > > > > the buffer metadata by reading seq# & timestamp from that container.
> > >
> > > Should we just drop FrameBuffer::metadata_.timestamp in favour of
> > > controls::SensorTimestamp ? What would be the drawbacks, if any ?
> >
> > Drawbacks for using Metadata control:
> >  - Have to parse the metadata control list to get timestamps
> >    (/sequences)
>
> The request metadata is also not available before the request fully
> completes, while buffers will complete individually before. This is
> something we may want to address in a generic way though, I suspect
> there will be use cases for providing request metadata early.
>
> > Pro's for Metadata control:
> >  - Only one place for it, all buffers in the same request are defined as
> >    having the same value (sequence and timestamp).
>
> That's the part I like :-)
>
> > Drawbacks for per-buffer metadata:
> >  - Must be explicitly assigned to keep them in sync even though they are
> >    the output of multiple video devices perhaps.
> >
> > Pro's for per-buffer metadata:
> >  - Quicker to retrieve the values? (defined structure?)
>
> Possibly a bit quicker, but that seems marginal to me.
>
> > Any more on either side?
> >
> >
> > Personally, I'm currently thinking these are better defined by the
> > metadata() control list. After all, that's the purpose of the list
> > right?
>
> I think so. I see very little value today in providing buffer completion
> timestamps that would be different than the sensor timestamp (this could
> possibly help measuring the processing time, but if that's the goal, it
> could also be done by capturing the system timestamp in the buffer
> completion handler on the application side, which would provide
> statistics about the full processing time, including both the hardware
> processing and the software processing).
>

The above reason was exactly my rationale for keeping both :-)
However, it's not something we rely on, or even use really, so
I am happy to get rid of the buffer timestamps.

Naush



>
> Let's wait for more feedback, but if there's a general agreement that
> this is the direction we want to take, we could remove
> FrameBuffer::metadata_.timestamp independently of addressing the sensor
> sequence issue.
>
> > > > >>>       double fps = ts - last_;
> > > > >>>       fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
> > > > >>>       last_ = ts;
>
> --
> Regards,
>
> Laurent Pinchart
>
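[Editor's note: the frame-rate logic the patch touches can be illustrated with a small self-contained sketch. The `FpsCounter` class below is not libcamera API; it simply mirrors the `last_`/`fps` computation from `CameraSession::processRequest()` with nanosecond timestamps, independent of where the timestamp is sourced from (per-buffer metadata or `controls::SensorTimestamp`).]

```cpp
#include <cassert>
#include <cstdint>

// Mirrors the fps computation in CameraSession::processRequest(): the
// first sample yields 0.0 (no previous timestamp), later samples convert
// the nanosecond delta between consecutive sensor timestamps into a
// frames-per-second value.
class FpsCounter
{
public:
	double update(uint64_t ts)
	{
		double fps = ts - last_;
		fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
		last_ = ts;
		return fps;
	}

private:
	uint64_t last_ = 0;
};
```

For example, two timestamps 33,333,333 ns apart yield approximately 30 fps.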
Umang Jain June 1, 2022, 8:24 a.m. UTC | #9
Hi Kieran,

Thank you for the patch (again)

On 12/7/21 00:39, Kieran Bingham wrote:
> The SensorTimestamp is defined to be the time of capture of the image.
> While all streams should have the same timestamp, this is not always
> defined or guaranteed as ISP drivers may not forward sequence numbers
> and timestamps from their input buffer.
>
> Use the Request metadata to get the SensorTimestamp which must be
> set by the pipeline handlers according to the data from the capture
> device.
>
> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>


Much has been discussed regarding the per-buffer timestamp vs the request
timestamp, but it shouldn't affect this patch for now, hence:

Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>

> ---
>   src/cam/camera_session.cpp | 7 ++++---
>   1 file changed, 4 insertions(+), 3 deletions(-)
>
> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
> index 1bf460fa3fb7..50170723c30f 100644
> --- a/src/cam/camera_session.cpp
> +++ b/src/cam/camera_session.cpp
> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request *request)
>   	const Request::BufferMap &buffers = request->buffers();
>   
>   	/*
> -	 * Compute the frame rate. The timestamp is arbitrarily retrieved from
> -	 * the first buffer, as all buffers should have matching timestamps.
> +	 * Compute the frame rate. The timestamp is retrieved from the
> +	 * SensorTimestamp property, though all streams should have the
> +	 * same timestamp.
>   	 */
> -	uint64_t ts = buffers.begin()->second->metadata().timestamp;
> +	uint64_t ts = request->metadata().get(controls::SensorTimestamp);
>   	double fps = ts - last_;
>   	fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
>   	last_ = ts;
Umang Jain June 1, 2022, 8:34 a.m. UTC | #10
Hi All,

On 12/8/21 21:59, Kieran Bingham wrote:
> Quoting Laurent Pinchart (2021-12-08 18:57:10)
>> Hi Umang,
>>
>> On Tue, Dec 07, 2021 at 07:07:46PM +0530, Umang Jain wrote:
>>> On 12/7/21 6:51 PM, Umang Jain wrote:
>>>> On 12/7/21 5:39 AM, Laurent Pinchart wrote:
>>>>> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
>>>>>> The SensorTimestamp is defined to be the time of capture of the image.
>>>>>> While all streams should have the same timestamp, this is not always
>>>>>> defined or guaranteed as ISP drivers may not forward sequence numbers
>>>>>> and timestamps from their input buffer.
>>>>> That should then be solved by the pipeline handler, which should store
>>>>> the correct timestamp in the buffer metadata.
>>>>>
>>>>>> Use the Request metadata to get the SensorTimestamp which must be
>>>>>> set by the pipeline handlers according to the data from the capture
>>>>>> device.
>>>>>>
>>>>>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
>>>>>> ---
>>>>>>    src/cam/camera_session.cpp | 7 ++++---
>>>>>>    1 file changed, 4 insertions(+), 3 deletions(-)
>>>>>>
>>>>>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
>>>>>> index 1bf460fa3fb7..50170723c30f 100644
>>>>>> --- a/src/cam/camera_session.cpp
>>>>>> +++ b/src/cam/camera_session.cpp
>>>>>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request
>>>>>> *request)
>>>>>>        const Request::BufferMap &buffers = request->buffers();
>>>>>>          /*
>>>>>> -     * Compute the frame rate. The timestamp is arbitrarily retrieved from
>>>>>> -     * the first buffer, as all buffers should have matching timestamps.
>>>>>> +     * Compute the frame rate. The timestamp is retrieved from the
>>>>>> +     * SensorTimestamp property, though all streams should have the
>>>>>> +     * same timestamp.
>>>>>>         */
>>>>>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
>>>>>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
>>>>> This seems reasonable. Why do we have timestamps in the buffer metadata
>>>>> ? :-)
>>> Strong chance I have mis-understood the question, I later realized this
>>> is a cam-related patch.
>>>
>>> to me, the question translated :
>>>
>>>       why do we have timestamps in the FrameBuffer.metadata_.timestamp ?
>> To clarify, I meant to ask why we have both controls::SensorTimestamp
>> *and* timestamps in individual buffers
>> (FrameBuffer::metadata_.timestamp). I suppose it's historical, the
>> timestamp member comes from
>>
>> commit e94e52c0cb27f92c085da6e776af8b3d3172bbb2
>> Author: Niklas Söderlund <niklas.soderlund@ragnatech.se>
>> Date:   Thu Jan 24 23:34:51 2019 +0100
>>
>>      libcamera: v4l2_device: Update dequeued buffer information
>>
>> while controls::SensorTimestamp has been introduced in
>>
>> commit 84e79bd8b5347d91309cbd759bbd988b5144fb8d
>> Author: Jacopo Mondi <jacopo@jmondi.org>
>> Date:   Thu Jul 30 15:18:50 2020 +0200
>>
>>      libcamera: control_ids: Define draft controls
>>
>>> So ignore the discussion (if you want) :-P
>>>
>>>> Because there is no buffer assigned on the time-point where we want to
>>>> capture the timestamp.
>>>>
>>>> The exact location of the timestamp is a \todo
>>>>
>>>>               * \todo The sensor timestamp should be better estimated by connecting
>>>>               * to the V4L2Device::frameStart signal.
>>>>
>>>> in all the pipeline-handlers as of now.
>>>>
>>>> We *want* to capture at frameStart signal, which emits in response to
>>>> VIDIOC_DQEVENT, but we can't as no buffer assigned (yet)
>>>>
>>>> We probably need to assign a container taking care of timestamps with
>>>> sequence numbers until a buffer is assigned down-the-line and then set
>>>> the buffer metadata by reading seq# & timestamp from that container.
>> Should we just drop FrameBuffer::metadata_.timestamp in favour of
>> controls::SensorTimestamp ? What would be the drawbacks, if any ?


Understanding the 'why' of the per-framebuffer timestamp made me realize
that it's part of the partial request completion implementation. From the
application-developer guide:

"
Receiving notifications about the single buffer
completion event allows applications to implement partial request completion
support, and to inspect the buffer content before the request it is part 
of has
fully completed.
"

So that's why we have the timestamps in buffers. But it is also something
that can be done on the applications' side as well (as discussed further
in the series).

> Drawbacks for using Metadata control:
>   - Have to parse the metadata control list to get timestamps
>     (/sequences)
>
> Pro's for Metadata control:
>   - Only one place for it, all buffers in the same request are defined as
>     having the same value (sequence and timestamp).
>
>
> Drawbacks for per-buffer metadata:
>   - Must be explicitly assigned to keep them in sync even though they are
>     the output of multiple video devices perhaps.
>
> Pro's for per-buffer metadata:
>   - Quicker to retrieve the values? (defined structure?)
>
> Any more on either side?
>
>
> Personally, I'm currently thinking these are better defined by the
> metadata() control list. After all, that's the purpose of the list
> right?


I might be pulling the discussion away towards the partial request
completion side, but a single metadata() control list could be returned at
various points to allow a 'clear' partial completion implementation. That
might need to work with ControlList::merge() or ControlList::update(), so
it's a very separate topic indeed.

So I do think that the metadata() control list is the better place to
define these, as you suggested.

>
>
>>>>>>        double fps = ts - last_;
>>>>>>        fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
>>>>>>        last_ = ts;
>> -- 
>> Regards,
>>
>> Laurent Pinchart
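[Editor's note: the "container taking care of timestamps with sequence numbers" idea discussed above can be sketched as a small stand-alone class. This is purely illustrative, not libcamera API: the timestamp captured at frame start (when no buffer exists yet) is stored keyed by sequence number, then consumed when the corresponding buffer is dequeued down the line.]

```cpp
#include <cstdint>
#include <map>
#include <optional>

// Hypothetical store pairing frame-start timestamps with sequence numbers
// until a buffer becomes available to carry them.
class TimestampStore
{
public:
	// Called from a (hypothetical) frame-start handler, e.g. in
	// response to VIDIOC_DQEVENT, before any buffer is assigned.
	void recordFrameStart(uint32_t sequence, uint64_t timestamp)
	{
		pending_[sequence] = timestamp;
	}

	// Called when the buffer with this sequence number is dequeued;
	// returns std::nullopt if no frame-start event was recorded.
	std::optional<uint64_t> consume(uint32_t sequence)
	{
		auto it = pending_.find(sequence);
		if (it == pending_.end())
			return std::nullopt;
		uint64_t ts = it->second;
		pending_.erase(it);
		return ts;
	}

private:
	std::map<uint32_t, uint64_t> pending_;
};
```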
Laurent Pinchart June 8, 2022, 10:40 p.m. UTC | #11
Hi Umang,

On Wed, Jun 01, 2022 at 10:34:52AM +0200, Umang Jain wrote:
> On 12/8/21 21:59, Kieran Bingham wrote:
> > Quoting Laurent Pinchart (2021-12-08 18:57:10)
> >> On Tue, Dec 07, 2021 at 07:07:46PM +0530, Umang Jain wrote:
> >>> On 12/7/21 6:51 PM, Umang Jain wrote:
> >>>> On 12/7/21 5:39 AM, Laurent Pinchart wrote:
> >>>>> On Mon, Dec 06, 2021 at 11:39:43PM +0000, Kieran Bingham wrote:
> >>>>>> The SensorTimestamp is defined to be the time of capture of the image.
> >>>>>> While all streams should have the same timestamp, this is not always
> >>>>>> defined or guaranteed as ISP drivers may not forward sequence numbers
> >>>>>> and timestamps from their input buffer.
> >>>>> That should then be solved by the pipeline handler, which should store
> >>>>> the correct timestamp in the buffer metadata.
> >>>>>
> >>>>>> Use the Request metadata to get the SensorTimestamp which must be
> >>>>>> set by the pipeline handlers according to the data from the capture
> >>>>>> device.
> >>>>>>
> >>>>>> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
> >>>>>> ---
> >>>>>>    src/cam/camera_session.cpp | 7 ++++---
> >>>>>>    1 file changed, 4 insertions(+), 3 deletions(-)
> >>>>>>
> >>>>>> diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
> >>>>>> index 1bf460fa3fb7..50170723c30f 100644
> >>>>>> --- a/src/cam/camera_session.cpp
> >>>>>> +++ b/src/cam/camera_session.cpp
> >>>>>> @@ -359,10 +359,11 @@ void CameraSession::processRequest(Request
> >>>>>> *request)
> >>>>>>        const Request::BufferMap &buffers = request->buffers();
> >>>>>>          /*
> >>>>>> -     * Compute the frame rate. The timestamp is arbitrarily retrieved from
> >>>>>> -     * the first buffer, as all buffers should have matching timestamps.
> >>>>>> +     * Compute the frame rate. The timestamp is retrieved from the
> >>>>>> +     * SensorTimestamp property, though all streams should have the
> >>>>>> +     * same timestamp.
> >>>>>>         */
> >>>>>> -    uint64_t ts = buffers.begin()->second->metadata().timestamp;
> >>>>>> +    uint64_t ts = request->metadata().get(controls::SensorTimestamp);
> >>>>> This seems reasonable. Why do we have timestamps in the buffer metadata
> >>>>> ? :-)
> >>> 
> >>> Strong chance I have mis-understood the question, I later realized this
> >>> is a cam-related patch.
> >>>
> >>> to me, the question translated :
> >>>
> >>>       why do we have timestamps in the FrameBuffer.metadata_.timestamp ?
> >> 
> >> To clarify, I meant to ask why we have both controls::SensorTimestamp
> >> *and* timestamps in individual buffers
> >> (FrameBuffer::metadata_.timestamp). I suppose it's historical, the
> >> timestamp member comes from
> >>
> >> commit e94e52c0cb27f92c085da6e776af8b3d3172bbb2
> >> Author: Niklas Söderlund <niklas.soderlund@ragnatech.se>
> >> Date:   Thu Jan 24 23:34:51 2019 +0100
> >>
> >>      libcamera: v4l2_device: Update dequeued buffer information
> >>
> >> while controls::SensorTimestamp has been introduced in
> >>
> >> commit 84e79bd8b5347d91309cbd759bbd988b5144fb8d
> >> Author: Jacopo Mondi <jacopo@jmondi.org>
> >> Date:   Thu Jul 30 15:18:50 2020 +0200
> >>
> >>      libcamera: control_ids: Define draft controls
> >>
> >>> So ignore the discussion (if you want) :-P
> >>>
> >>>> Because there is no buffer assigned on the time-point where we want to
> >>>> capture the timestamp.
> >>>>
> >>>> The exact location of the timestamp is a \todo
> >>>>
> >>>>               * \todo The sensor timestamp should be better estimated by connecting
> >>>>               * to the V4L2Device::frameStart signal.
> >>>>
> >>>> in all the pipeline-handlers as of now.
> >>>>
> >>>> We *want* to capture at frameStart signal, which emits in response to
> >>>> VIDIOC_DQEVENT, but we can't as no buffer assigned (yet)
> >>>>
> >>>> We probably need to assign a container taking care of timestamps with
> >>>> sequence numbers until a buffer is assigned down-the-line and then set
> >>>> the buffer metadata by reading seq# & timestamp from that container.
> >> 
> >> Should we just drop FrameBuffer::metadata_.timestamp in favour of
> >> controls::SensorTimestamp ? What would be the drawbacks, if any ?
> 
> Understanding the 'why' per-framebuffer timestamp made me realize that
> it's a part of partial request completion implementation. From
> application-developer guide:
> 
> "
> Receiving notifications about the single buffer
> completion event allows applications to implement partial request completion
> support, and to inspect the buffer content before the request it is part of has
> fully completed.
> "
> 
> So that's why we have the timestamps in buffers. But it is also something
> that can be done on applications' side as well (as discussed further in
> the series)

If we need to provide timestamp information early, I'm tempted to do so
through a separate start of frame event, similar to how Android does it.
It will provide the information, but also allow applications to react as
early as possible to the start of a frame.

> > Drawbacks for using Metadata control:
> >   - Have to parse the metadata control list to get timestamps
> >     (/sequences)
> >
> > Pro's for Metadata control:
> >   - Only one place for it, all buffers in the same request are defined as
> >     having the same value (sequence and timestamp).
> >
> >
> > Drawbacks for per-buffer metadata:
> >   - Must be explicitly assigned to keep them in sync even though they are
> >     the output of multiple video devices perhaps.
> >
> > Pro's for per-buffer metadata:
> >   - Quicker to retrieve the values? (defined structure?)
> >
> > Any more on either side?
> >
> >
> > Personally, I'm currently thinking these are better defined by the
> > metadata() control list. After all, that's the purpose of the list
> > right?
> 
> I might be pulling away the discussion to partial request completion side,
> but a single metadata() control list can be returned at various points to
> have a 'clear' partial completion implementation because currently?
> Might need to work with ControlList::merge() or ControlList::update() stuff,
> so it's a very separate topic indeed.

We don't support that yet, but if there's a use case, it could be
implemented, yes.

> So I do think that metadata() control list is better for defining as you 
> suggested.
> 
> >>>>>>        double fps = ts - last_;
> >>>>>>        fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
> >>>>>>        last_ = ts;
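[Editor's note: Laurent's suggestion above — measuring full processing time on the application side once per-buffer timestamps are gone — can be sketched as follows. All names and values are illustrative, not libcamera API; in a real application `completionTimeNs` would come from a monotonic clock read in the request completion handler, and `sensorTimestampNs` from `controls::SensorTimestamp` in the request metadata.]

```cpp
#include <cstdint>
#include <vector>

// Per-frame timing sample collected by a hypothetical application.
struct FrameTiming {
	uint64_t sensorTimestampNs; /* capture time, from request metadata */
	uint64_t completionTimeNs;  /* system time at request completion */
};

// Average end-to-end latency (hardware ISP + software processing) over a
// set of completed frames, in nanoseconds.
static uint64_t averageLatencyNs(const std::vector<FrameTiming> &frames)
{
	if (frames.empty())
		return 0;

	uint64_t total = 0;
	for (const FrameTiming &f : frames)
		total += f.completionTimeNs - f.sensorTimestampNs;

	return total / frames.size();
}
```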

Patch
diff mbox series

diff --git a/src/cam/camera_session.cpp b/src/cam/camera_session.cpp
index 1bf460fa3fb7..50170723c30f 100644
--- a/src/cam/camera_session.cpp
+++ b/src/cam/camera_session.cpp
@@ -359,10 +359,11 @@  void CameraSession::processRequest(Request *request)
 	const Request::BufferMap &buffers = request->buffers();
 
 	/*
-	 * Compute the frame rate. The timestamp is arbitrarily retrieved from
-	 * the first buffer, as all buffers should have matching timestamps.
+	 * Compute the frame rate. The timestamp is retrieved from the
+	 * SensorTimestamp property, though all streams should have the
+	 * same timestamp.
 	 */
-	uint64_t ts = buffers.begin()->second->metadata().timestamp;
+	uint64_t ts = request->metadata().get(controls::SensorTimestamp);
 	double fps = ts - last_;
 	fps = last_ != 0 && fps ? 1000000000.0 / fps : 0.0;
 	last_ = ts;
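[Editor's note: the change above can be summarised conceptually as "prefer the single request-level sensor timestamp over per-buffer timestamps". The sketch below uses simplified stand-in types, not libcamera classes, to show the selection with a fallback to the old per-buffer path when a pipeline handler did not populate the metadata.]

```cpp
#include <cstdint>
#include <map>
#include <optional>

// Simplified stand-in for a FrameBuffer carrying per-buffer metadata.
struct FakeBuffer {
	uint64_t timestamp;
};

// Prefer the request-level SensorTimestamp (the new path in this patch);
// fall back to the first buffer's timestamp (the old, arbitrary path).
static uint64_t
selectTimestamp(const std::optional<uint64_t> &sensorTimestamp,
		const std::map<unsigned int, FakeBuffer> &buffers)
{
	if (sensorTimestamp)
		return *sensorTimestamp;

	return buffers.begin()->second.timestamp;
}
```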