[libcamera-devel,1/4] ipa: raspberrypi: Allow long exposure modes for imx477.

Message ID 20210614100040.3054433-2-naush@raspberrypi.com
State Superseded
Series
  • Raspberry Pi: Enable imx477 long exposure modes

Commit Message

Naushir Patuck June 14, 2021, 10 a.m. UTC
Update the imx477 CamHelper to use long exposure modes if needed.
This is done by overloading the CamHelper::GetVBlanking function to return a
frame length (and vblank value) computed using a scaling factor when the value
would be larger than what the sensor register could otherwise hold.

CamHelperImx477::Prepare is also overloaded to ensure that the "device.status"
metadata returns the right value if the long exposure scaling factor is used.
The scaling factor is unfortunately not returned back in metadata.

With the current imx477 driver, we can achieve a maximum exposure time of approx
127 seconds since the HBLANK control is read-only.

Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
---
 src/ipa/raspberrypi/cam_helper_imx477.cpp | 67 +++++++++++++++++++++++
 1 file changed, 67 insertions(+)
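
As a rough check on the "approx 127 seconds" figure in the commit message, the
sketch below works through the arithmetic implied by the patch constants. The
line duration is an assumption picked for illustration; the real value depends
on the sensor mode and the read-only HBLANK setting, so the result is
indicative only.

#include <cstdint>

constexpr uint32_t frameLengthMax = 0xffdc;       /* from the patch */
constexpr unsigned int longExposureShiftMax = 7;  /* from the patch */
constexpr double assumedLineDurationUs = 15.2;    /* assumption, mode dependent */

/* Largest frame length programmable once the maximum shift is applied. */
constexpr uint64_t maxFrameLengthLines =
	static_cast<uint64_t>(frameLengthMax) << longExposureShiftMax;

/* ~8.38 million lines; with the assumed line time this is roughly 127 s. */
constexpr double maxExposureSeconds =
	maxFrameLengthLines * assumedLineDurationUs / 1e6;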

Comments

Laurent Pinchart June 15, 2021, 3:19 a.m. UTC | #1
Hi Naush,

Thank you for the patch.

On Mon, Jun 14, 2021 at 11:00:37AM +0100, Naushir Patuck wrote:
> Update the imx477 CamHelper to use long exposure modes if needed.
> This is done by overloading the CamHelper::GetVBlanking function to return a
> frame length (and vblank value) computed using a scaling factor when the value
> would be larger than what the sensor register could otherwise hold.
> 
> CamHelperImx477::Prepare is also overloaded to ensure that the "device.status"
> metadata returns the right value if the long exposure scaling factor is use.

s/use/used/

> The scaling factor is unfortunately not returned back in metadata.
> 
> With the current imx477 driver, we can achieve a maximum exposure time of approx
> 127 seconds since the HBLANK control is read-only.
> 
> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> ---
>  src/ipa/raspberrypi/cam_helper_imx477.cpp | 67 +++++++++++++++++++++++
>  1 file changed, 67 insertions(+)
> 
> diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> index 7a1100c25afc..549c2620765a 100644
> --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> @@ -6,6 +6,7 @@
>   */
>  
>  #include <assert.h>
> +#include <cmath>
>  #include <stddef.h>
>  #include <stdio.h>
>  #include <stdlib.h>
> @@ -14,6 +15,7 @@
>  #include "md_parser.hpp"
>  
>  using namespace RPiController;
> +using libcamera::utils::Duration;
>  
>  /*
>   * We care about two gain registers and a pair of exposure registers. Their
> @@ -30,6 +32,9 @@ public:
>  	CamHelperImx477();
>  	uint32_t GainCode(double gain) const override;
>  	double Gain(uint32_t gain_code) const override;
> +	void Prepare(libcamera::Span<const uint8_t> buffer, Metadata &metadata) override;
> +	uint32_t GetVBlanking(Duration &exposure, Duration minFrameDuration,
> +			      Duration maxFrameDuration) const override;
>  	void GetDelays(int &exposure_delay, int &gain_delay,
>  		       int &vblank_delay) const override;
>  	bool SensorEmbeddedDataPresent() const override;
> @@ -40,6 +45,10 @@ private:
>  	 * in units of lines.
>  	 */
>  	static constexpr int frameIntegrationDiff = 22;
> +	/* Maximum frame length allowable for long exposure calculations. */
> +	static constexpr int frameLengthMax = 0xffdc;
> +	/* Largest long exposure scale factor given as a left shift on the frame length. */
> +	static constexpr int longExposureShiftMax = 7;
>  
>  	MdParserSmia imx477_parser;
>  
> @@ -72,6 +81,64 @@ double CamHelperImx477::Gain(uint32_t gain_code) const
>  	return 1024.0 / (1024 - gain_code);
>  }
>  
> +void CamHelperImx477::Prepare(libcamera::Span<const uint8_t> buffer, Metadata &metadata)
> +{
> +	DeviceStatus deviceStatus, parsedStatus;
> +
> +	/* Get the device status provided by DelayedControls */
> +	if (metadata.Get("device.status", deviceStatus) != 0)
> +		return;
> +
> +	/* Get the device status provided by the embedded data buffer. */
> +	CamHelper::parseEmbeddedData(buffer, metadata);
> +	metadata.Get("device.status", parsedStatus);
> +
> +	/*
> +	 * If the ratio of DelayedControls to embedded data shutter speed is > 1
> +	 * and is a factor of 2^N, then we can assume this is a long exposure mode
> +	 * frame. Since embedded data does not provide any hints of long exposure
> +	 * modes, make sure we use the DelayedControls values in the metadata.
> +	 * Otherwise, just go with the embedded data values.
> +	 */
> +	unsigned long ratio = std::lround(deviceStatus.shutter_speed / parsedStatus.shutter_speed);
> +	bool replace = (ratio > 1) && ((ratio & (~ratio + 1)) == ratio);
> +
> +	if (replace)
> +		metadata.Set("device.status", deviceStatus);
> +}
> +
> +uint32_t CamHelperImx477::GetVBlanking(Duration &exposure,
> +				       Duration minFrameDuration,
> +				       Duration maxFrameDuration) const
> +{
> +	uint32_t frameLength, exposureLines;
> +	unsigned int shift = 0;
> +
> +	frameLength = mode_.height + CamHelper::GetVBlanking(exposure, minFrameDuration,
> +							     maxFrameDuration);
> +	/*
> +	 * Check if the frame length calculated needs to be setup for long
> +	 * exposure mode. This will require us to use a long exposure scale
> +	 * factor provided by a shift operation in the sensor.
> +	 */
> +	while (frameLength > frameLengthMax) {
> +		if (++shift > longExposureShiftMax) {
> +			shift = longExposureShiftMax;
> +			frameLength = frameLengthMax;
> +			break;
> +		}
> +		frameLength >>= 1;
> +	}
> +
> +	/* Account for any rounding in the scaled frame length value. */
> +	frameLength <<= shift;
> +	exposureLines = CamHelper::ExposureLines(exposure);
> +	exposureLines = std::min(exposureLines, frameLength - frameIntegrationDiff);
> +	exposure = CamHelper::Exposure(exposureLines);

I'll try to review this series in more detail shortly, but I'd like to
already take this as an opportunity to discuss an idea related to the
cam helpers.

As they stand today, the cam helpers translate between natural units
(time for exposure, and a linear multipler for gain) and V4L2 control
values. This pushes knowledge of V4L2 to the IPA, which isn't great.
I've been toying with the idea of gathering all the V4L2 knowledge in
the libcamera core, which would move these conversions to the pipeline
handler (with the help of camera helpers on the libcamera side of
course).

One issue that bothered me is that the IPA wouldn't know about rounding
(or at least not right away, it could get information back from the
pipeline handler, but with a delay, and in any case that wouldn't be
very clean). The code above takes rounding into account. I'd thus like
your feedback on how important this is, and, in case (as I suspect) the
IPA needs to know about rounding, whether you think there would still be
a way to not add V4L2 knowledge to the IPA.
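
To make this a little more concrete, below is a purely hypothetical sketch of
what such a split could look like. Every name in it is invented for
illustration; none of it reflects an existing libcamera interface.

#include <algorithm>
#include <cstdint>

/* Hypothetical per-sensor description owned by the libcamera core. */
struct SensorLineTiming {
	double lineDurationUs;      /* duration of one line in microseconds */
	uint32_t minExposureLines;
	uint32_t maxExposureLines;
};

/*
 * Conversion done on the pipeline handler side: the IPA requests an exposure
 * in natural units, the core turns it into the V4L2 control value.
 */
uint32_t exposureToLines(double exposureUs, const SensorLineTiming &timing)
{
	uint32_t lines = static_cast<uint32_t>(exposureUs / timing.lineDurationUs);
	return std::clamp(lines, timing.minExposureLines, timing.maxExposureLines);
}

/*
 * The value actually programmed, converted back to natural units and handed
 * to the IPA with the frame, so the IPA still sees the rounded result.
 */
double linesToExposure(uint32_t lines, const SensorLineTiming &timing)
{
	return lines * timing.lineDurationUs;
}

The rounding question above is then whether the IPA can live with seeing the
rounded value only when the frame comes back.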

> +
> +	return frameLength - mode_.height;
> +}
> +
>  void CamHelperImx477::GetDelays(int &exposure_delay, int &gain_delay,
>  				int &vblank_delay) const
>  {
Naushir Patuck June 15, 2021, 12:55 p.m. UTC | #2
Hi Laurent,


On Tue, 15 Jun 2021 at 04:19, Laurent Pinchart
<laurent.pinchart@ideasonboard.com> wrote:

> Hi Naush,
>
> Thank you for the patch.
>
> On Mon, Jun 14, 2021 at 11:00:37AM +0100, Naushir Patuck wrote:
> > Update the imx477 CamHelper to use long exposure modes if needed.
> > This is done by overloading the CamHelper::GetVBlanking function to
> return a
> > frame length (and vblank value) computed using a scaling factor when the
> value
> > would be larger than what the sensor register could otherwise hold.
> >
> > CamHelperImx477::Prepare is also overloaded to ensure that the
> "device.status"
> > metadata returns the right value if the long exposure scaling factor is
> use.
>
> s/use/used/
>
> > The scaling factor is unfortunately not returned back in metadata.
> >
> > With the current imx477 driver, we can achieve a maximum exposure time
> of approx
> > 127 seconds since the HBLANK control is read-only.
> >
> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > ---
> >  src/ipa/raspberrypi/cam_helper_imx477.cpp | 67 +++++++++++++++++++++++
> >  1 file changed, 67 insertions(+)
> >
> > diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > index 7a1100c25afc..549c2620765a 100644
> > --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > @@ -6,6 +6,7 @@
> >   */
> >
> >  #include <assert.h>
> > +#include <cmath>
> >  #include <stddef.h>
> >  #include <stdio.h>
> >  #include <stdlib.h>
> > @@ -14,6 +15,7 @@
> >  #include "md_parser.hpp"
> >
> >  using namespace RPiController;
> > +using libcamera::utils::Duration;
> >
> >  /*
> >   * We care about two gain registers and a pair of exposure registers.
> Their
> > @@ -30,6 +32,9 @@ public:
> >       CamHelperImx477();
> >       uint32_t GainCode(double gain) const override;
> >       double Gain(uint32_t gain_code) const override;
> > +     void Prepare(libcamera::Span<const uint8_t> buffer, Metadata
> &metadata) override;
> > +     uint32_t GetVBlanking(Duration &exposure, Duration
> minFrameDuration,
> > +                           Duration maxFrameDuration) const override;
> >       void GetDelays(int &exposure_delay, int &gain_delay,
> >                      int &vblank_delay) const override;
> >       bool SensorEmbeddedDataPresent() const override;
> > @@ -40,6 +45,10 @@ private:
> >        * in units of lines.
> >        */
> >       static constexpr int frameIntegrationDiff = 22;
> > +     /* Maximum frame length allowable for long exposure calculations.
> */
> > +     static constexpr int frameLengthMax = 0xffdc;
> > +     /* Largest long exposure scale factor given as a left shift on the
> frame length. */
> > +     static constexpr int longExposureShiftMax = 7;
> >
> >       MdParserSmia imx477_parser;
> >
> > @@ -72,6 +81,64 @@ double CamHelperImx477::Gain(uint32_t gain_code) const
> >       return 1024.0 / (1024 - gain_code);
> >  }
> >
> > +void CamHelperImx477::Prepare(libcamera::Span<const uint8_t> buffer,
> Metadata &metadata)
> > +{
> > +     DeviceStatus deviceStatus, parsedStatus;
> > +
> > +     /* Get the device status provided by DelayedControls */
> > +     if (metadata.Get("device.status", deviceStatus) != 0)
> > +             return;
> > +
> > +     /* Get the device status provided by the embedded data buffer. */
> > +     CamHelper::parseEmbeddedData(buffer, metadata);
> > +     metadata.Get("device.status", parsedStatus);
> > +
> > +     /*
> > +      * If the ratio of DelayedControls to embedded data shutter speed
> is > 1
> > +      * and is a factor of 2^N, then we can assume this is a long
> exposure mode
> > +      * frame. Since embedded data does not provide any hints of long
> exposure
> > +      * modes, make sure we use the DelayedControls values in the
> metadata.
> > +      * Otherwise, just go with the embedded data values.
> > +      */
> > +     unsigned long ratio = std::lround(deviceStatus.shutter_speed /
> parsedStatus.shutter_speed);
> > +     bool replace = (ratio > 1) && ((ratio & (~ratio + 1)) == ratio);
> > +
> > +     if (replace)
> > +             metadata.Set("device.status", deviceStatus);
> > +}
> > +
> > +uint32_t CamHelperImx477::GetVBlanking(Duration &exposure,
> > +                                    Duration minFrameDuration,
> > +                                    Duration maxFrameDuration) const
> > +{
> > +     uint32_t frameLength, exposureLines;
> > +     unsigned int shift = 0;
> > +
> > +     frameLength = mode_.height + CamHelper::GetVBlanking(exposure,
> minFrameDuration,
> > +
> maxFrameDuration);
> > +     /*
> > +      * Check if the frame length calculated needs to be setup for long
> > +      * exposure mode. This will require us to use a long exposure scale
> > +      * factor provided by a shift operation in the sensor.
> > +      */
> > +     while (frameLength > frameLengthMax) {
> > +             if (++shift > longExposureShiftMax) {
> > +                     shift = longExposureShiftMax;
> > +                     frameLength = frameLengthMax;
> > +                     break;
> > +             }
> > +             frameLength >>= 1;
> > +     }
> > +
> > +     /* Account for any rounding in the scaled frame length value. */
> > +     frameLength <<= shift;
> > +     exposureLines = CamHelper::ExposureLines(exposure);
> > +     exposureLines = std::min(exposureLines, frameLength -
> frameIntegrationDiff);
> > +     exposure = CamHelper::Exposure(exposureLines);
>
> I'll try to review this series in more details shortly, but I'd like to
> already take this as an opportunity to discuss an idea related to the
> cam helpers.
>
> As they stand today, the cam helpers translate between natural units
> (time for exposure, and a linear multipler for gain) and V4L2 control
> values. This pushes knowledge of V4L2 to the IPA, which isn't great.
> I've been toying with the idea of gathering all the V4L2 knowledge in
> the libcamera core, which would move these conversions to the pipeline
> handler (with the helper of camera helpers on the libcamera side of
> course).
>
> One issue that bothered me is that the IPA wouldn't know about rounding
> (or at least not right away, it could get information back from the
> pipeline handler, but with a delay, and in any case that wouldn't be
> very clean). The code above takes rounding into account. I'd thus like
> your feedback on how important this is, and, in case (as I suspect) the
> IPA needs to know about rounding, whether you think there would still be
> a way to not add V4L2 knowledge to the IPA.
>

This is purely from the perspective of the Raspberry Pi IPA, so other vendors
may have different opinions or requirements.

For the RPi IPA, the key is that we get given the *actual* shutter speed and
gain values used by the sensor for that frame, and these may or may not be
rounded from what the AGC originally requested. When fetching shutter speed
and gain values from embedded data, this is implicit. When no embedded data is
available, the CamHelper does the rounding before passing the values into
DelayedControls, and the value given back to the IPA when the frame comes
through (with the appropriate frame delay) will also be rounded. So, if this
CamHelper functionality were to be moved to the pipeline handler domain mostly
as-is, we would see no ill effect. If the IPA were to somehow get shutter
speed and gain values that were not rounded appropriately, you would likely
see some tiny oscillations in the AGC, and David would not be happy with
that :-)

So in summary, I don't think the rounding will give us any problems; other
IPAs might have to take a similar approach to ours to claim the same. I do
think removing these sorts of conversions from the IPA domain is the right
thing to do. Having the IPA deal with only natural units will be a welcome
change!
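
To illustrate the round-trip described above, here is a minimal sketch of the
quantisation involved. The line duration and function shapes are simplified
for illustration; in the real code this is the CamHelper::ExposureLines() /
CamHelper::Exposure() pair visible in the patch, with the line duration coming
from the sensor mode.

#include <cstdint>

constexpr double lineDurationUs = 28.6; /* assumed value, mode dependent */

uint32_t exposureLines(double exposureUs)
{
	return static_cast<uint32_t>(exposureUs / lineDurationUs);
}

double exposure(uint32_t lines)
{
	return lines * lineDurationUs;
}

/*
 * The AGC request is quantised once, before it reaches DelayedControls, and
 * the same quantised value is what comes back with the frame. Feeding the
 * AGC an unrounded value instead would make it chase an exposure the sensor
 * cannot program, which is the small oscillation mentioned above.
 */
double roundExposure(double requestedUs)
{
	return exposure(exposureLines(requestedUs));
}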

Regards,
Naush


>
> > +
> > +     return frameLength - mode_.height;
> > +}
> > +
> >  void CamHelperImx477::GetDelays(int &exposure_delay, int &gain_delay,
> >                               int &vblank_delay) const
> >  {
>
> --
> Regards,
>
> Laurent Pinchart
>
David Plowman June 15, 2021, 1:27 p.m. UTC | #3
Hi Naush

Thanks for this feature!

On Mon, 14 Jun 2021 at 11:00, Naushir Patuck <naush@raspberrypi.com> wrote:
>
> Update the imx477 CamHelper to use long exposure modes if needed.
> This is done by overloading the CamHelper::GetVBlanking function to return a
> frame length (and vblank value) computed using a scaling factor when the value
> would be larger than what the sensor register could otherwise hold.
>
> CamHelperImx477::Prepare is also overloaded to ensure that the "device.status"
> metadata returns the right value if the long exposure scaling factor is use.
> The scaling factor is unfortunately not returned back in metadata.
>
> With the current imx477 driver, we can achieve a maximum exposure time of approx
> 127 seconds since the HBLANK control is read-only.
>
> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> ---
>  src/ipa/raspberrypi/cam_helper_imx477.cpp | 67 +++++++++++++++++++++++
>  1 file changed, 67 insertions(+)
>
> diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> index 7a1100c25afc..549c2620765a 100644
> --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> @@ -6,6 +6,7 @@
>   */
>
>  #include <assert.h>
> +#include <cmath>
>  #include <stddef.h>
>  #include <stdio.h>
>  #include <stdlib.h>
> @@ -14,6 +15,7 @@
>  #include "md_parser.hpp"
>
>  using namespace RPiController;
> +using libcamera::utils::Duration;
>
>  /*
>   * We care about two gain registers and a pair of exposure registers. Their
> @@ -30,6 +32,9 @@ public:
>         CamHelperImx477();
>         uint32_t GainCode(double gain) const override;
>         double Gain(uint32_t gain_code) const override;
> +       void Prepare(libcamera::Span<const uint8_t> buffer, Metadata &metadata) override;
> +       uint32_t GetVBlanking(Duration &exposure, Duration minFrameDuration,
> +                             Duration maxFrameDuration) const override;
>         void GetDelays(int &exposure_delay, int &gain_delay,
>                        int &vblank_delay) const override;
>         bool SensorEmbeddedDataPresent() const override;
> @@ -40,6 +45,10 @@ private:
>          * in units of lines.
>          */
>         static constexpr int frameIntegrationDiff = 22;
> +       /* Maximum frame length allowable for long exposure calculations. */
> +       static constexpr int frameLengthMax = 0xffdc;
> +       /* Largest long exposure scale factor given as a left shift on the frame length. */
> +       static constexpr int longExposureShiftMax = 7;
>
>         MdParserSmia imx477_parser;
>
> @@ -72,6 +81,64 @@ double CamHelperImx477::Gain(uint32_t gain_code) const
>         return 1024.0 / (1024 - gain_code);
>  }
>
> +void CamHelperImx477::Prepare(libcamera::Span<const uint8_t> buffer, Metadata &metadata)
> +{
> +       DeviceStatus deviceStatus, parsedStatus;
> +
> +       /* Get the device status provided by DelayedControls */
> +       if (metadata.Get("device.status", deviceStatus) != 0)
> +               return;
> +
> +       /* Get the device status provided by the embedded data buffer. */
> +       CamHelper::parseEmbeddedData(buffer, metadata);
> +       metadata.Get("device.status", parsedStatus);
> +
> +       /*
> +        * If the ratio of DelayedControls to embedded data shutter speed is > 1
> +        * and is a factor of 2^N, then we can assume this is a long exposure mode
> +        * frame. Since embedded data does not provide any hints of long exposure
> +        * modes, make sure we use the DelayedControls values in the metadata.
> +        * Otherwise, just go with the embedded data values.
> +        */
> +       unsigned long ratio = std::lround(deviceStatus.shutter_speed / parsedStatus.shutter_speed);
> +       bool replace = (ratio > 1) && ((ratio & (~ratio + 1)) == ratio);

Is there a case we need to worry about where "ratio" was only rounded
to a power of 2, but wouldn't otherwise have been one? (e.g.
deviceStatus.shutter_speed = 11, parsedStatus.shutter_speed = 5)

> +
> +       if (replace)
> +               metadata.Set("device.status", deviceStatus);
> +}
> +
> +uint32_t CamHelperImx477::GetVBlanking(Duration &exposure,
> +                                      Duration minFrameDuration,
> +                                      Duration maxFrameDuration) const
> +{
> +       uint32_t frameLength, exposureLines;
> +       unsigned int shift = 0;
> +
> +       frameLength = mode_.height + CamHelper::GetVBlanking(exposure, minFrameDuration,
> +                                                            maxFrameDuration);
> +       /*
> +        * Check if the frame length calculated needs to be setup for long
> +        * exposure mode. This will require us to use a long exposure scale
> +        * factor provided by a shift operation in the sensor.
> +        */
> +       while (frameLength > frameLengthMax) {
> +               if (++shift > longExposureShiftMax) {
> +                       shift = longExposureShiftMax;
> +                       frameLength = frameLengthMax;
> +                       break;
> +               }
> +               frameLength >>= 1;
> +       }
> +
> +       /* Account for any rounding in the scaled frame length value. */
> +       frameLength <<= shift;
> +       exposureLines = CamHelper::ExposureLines(exposure);
> +       exposureLines = std::min(exposureLines, frameLength - frameIntegrationDiff);
> +       exposure = CamHelper::Exposure(exposureLines);

Any reason for invoking the base class functions explicitly here? Not
that it bothers me, just curious...

Subject to clarification on my ratio rounding question:

Reviewed-by: David Plowman <david.plowman@raspberrypi.com>

Thanks!
David

> +
> +       return frameLength - mode_.height;
> +}
> +
>  void CamHelperImx477::GetDelays(int &exposure_delay, int &gain_delay,
>                                 int &vblank_delay) const
>  {
> --
> 2.25.1
>
Naushir Patuck June 15, 2021, 1:46 p.m. UTC | #4
Hi David,

Thank you for your feedback!

On Tue, 15 Jun 2021 at 14:28, David Plowman <david.plowman@raspberrypi.com>
wrote:

> Hi Naush
>
> Thanks for this feature!
>
> On Mon, 14 Jun 2021 at 11:00, Naushir Patuck <naush@raspberrypi.com>
> wrote:
> >
> > Update the imx477 CamHelper to use long exposure modes if needed.
> > This is done by overloading the CamHelper::GetVBlanking function to
> return a
> > frame length (and vblank value) computed using a scaling factor when the
> value
> > would be larger than what the sensor register could otherwise hold.
> >
> > CamHelperImx477::Prepare is also overloaded to ensure that the
> "device.status"
> > metadata returns the right value if the long exposure scaling factor is
> use.
> > The scaling factor is unfortunately not returned back in metadata.
> >
> > With the current imx477 driver, we can achieve a maximum exposure time
> of approx
> > 127 seconds since the HBLANK control is read-only.
> >
> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > ---
> >  src/ipa/raspberrypi/cam_helper_imx477.cpp | 67 +++++++++++++++++++++++
> >  1 file changed, 67 insertions(+)
> >
> > diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > index 7a1100c25afc..549c2620765a 100644
> > --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > @@ -6,6 +6,7 @@
> >   */
> >
> >  #include <assert.h>
> > +#include <cmath>
> >  #include <stddef.h>
> >  #include <stdio.h>
> >  #include <stdlib.h>
> > @@ -14,6 +15,7 @@
> >  #include "md_parser.hpp"
> >
> >  using namespace RPiController;
> > +using libcamera::utils::Duration;
> >
> >  /*
> >   * We care about two gain registers and a pair of exposure registers.
> Their
> > @@ -30,6 +32,9 @@ public:
> >         CamHelperImx477();
> >         uint32_t GainCode(double gain) const override;
> >         double Gain(uint32_t gain_code) const override;
> > +       void Prepare(libcamera::Span<const uint8_t> buffer, Metadata
> &metadata) override;
> > +       uint32_t GetVBlanking(Duration &exposure, Duration
> minFrameDuration,
> > +                             Duration maxFrameDuration) const override;
> >         void GetDelays(int &exposure_delay, int &gain_delay,
> >                        int &vblank_delay) const override;
> >         bool SensorEmbeddedDataPresent() const override;
> > @@ -40,6 +45,10 @@ private:
> >          * in units of lines.
> >          */
> >         static constexpr int frameIntegrationDiff = 22;
> > +       /* Maximum frame length allowable for long exposure
> calculations. */
> > +       static constexpr int frameLengthMax = 0xffdc;
> > +       /* Largest long exposure scale factor given as a left shift on
> the frame length. */
> > +       static constexpr int longExposureShiftMax = 7;
> >
> >         MdParserSmia imx477_parser;
> >
> > @@ -72,6 +81,64 @@ double CamHelperImx477::Gain(uint32_t gain_code) const
> >         return 1024.0 / (1024 - gain_code);
> >  }
> >
> > +void CamHelperImx477::Prepare(libcamera::Span<const uint8_t> buffer,
> Metadata &metadata)
> > +{
> > +       DeviceStatus deviceStatus, parsedStatus;
> > +
> > +       /* Get the device status provided by DelayedControls */
> > +       if (metadata.Get("device.status", deviceStatus) != 0)
> > +               return;
> > +
> > +       /* Get the device status provided by the embedded data buffer. */
> > +       CamHelper::parseEmbeddedData(buffer, metadata);
> > +       metadata.Get("device.status", parsedStatus);
> > +
> > +       /*
> > +        * If the ratio of DelayedControls to embedded data shutter
> speed is > 1
> > +        * and is a factor of 2^N, then we can assume this is a long
> exposure mode
> > +        * frame. Since embedded data does not provide any hints of long
> exposure
> > +        * modes, make sure we use the DelayedControls values in the
> metadata.
> > +        * Otherwise, just go with the embedded data values.
> > +        */
> > +       unsigned long ratio = std::lround(deviceStatus.shutter_speed /
> parsedStatus.shutter_speed);
> > +       bool replace = (ratio > 1) && ((ratio & (~ratio + 1)) == ratio);
>
> Is there a case we need to worry about where "ratio" was only rounded
> to a power of 2, but wouldn't otherwise have been one? (e.g.
> deviceStatus.shutter_speed = 11, parsedStatus.shutter_speed = 5)
>

The ratio must be a power of 2 or exactly 1 for the imx477. Any other value
means something has gone wrong :-) In the failure case, I assume that the
embedded data values are to be trusted and use them over the DelayedControls
values. Perhaps it would be worth adding a log point if this condition ever
hits?
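
Something along these lines, purely as a sketch of how the tail of Prepare()
could change. The (ratio & (ratio - 1)) == 0 form is equivalent to the patch's
check for non-zero values, and the logging category name is made up here;
whether a LOG() category is available in this file would need checking.

	unsigned long ratio = std::lround(deviceStatus.shutter_speed / parsedStatus.shutter_speed);
	bool powerOfTwo = (ratio & (ratio - 1)) == 0;

	if (ratio > 1 && powerOfTwo) {
		/* Long exposure frame: use the DelayedControls values. */
		metadata.Set("device.status", deviceStatus);
	} else if (ratio > 1) {
		/*
		 * Unexpected ratio: keep the embedded data values, but leave
		 * a trace so we notice if this ever happens in practice.
		 */
		LOG(CamHelperImx477, Warning)
			<< "Unexpected shutter speed ratio " << ratio;
	}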


>
> > +
> > +       if (replace)
> > +               metadata.Set("device.status", deviceStatus);
> > +}
> > +
> > +uint32_t CamHelperImx477::GetVBlanking(Duration &exposure,
> > +                                      Duration minFrameDuration,
> > +                                      Duration maxFrameDuration) const
> > +{
> > +       uint32_t frameLength, exposureLines;
> > +       unsigned int shift = 0;
> > +
> > +       frameLength = mode_.height + CamHelper::GetVBlanking(exposure,
> minFrameDuration,
> > +
> maxFrameDuration);
> > +       /*
> > +        * Check if the frame length calculated needs to be setup for
> long
> > +        * exposure mode. This will require us to use a long exposure
> scale
> > +        * factor provided by a shift operation in the sensor.
> > +        */
> > +       while (frameLength > frameLengthMax) {
> > +               if (++shift > longExposureShiftMax) {
> > +                       shift = longExposureShiftMax;
> > +                       frameLength = frameLengthMax;
> > +                       break;
> > +               }
> > +               frameLength >>= 1;
> > +       }
> > +
> > +       /* Account for any rounding in the scaled frame length value. */
> > +       frameLength <<= shift;
> > +       exposureLines = CamHelper::ExposureLines(exposure);
> > +       exposureLines = std::min(exposureLines, frameLength -
> frameIntegrationDiff);
> > +       exposure = CamHelper::Exposure(exposureLines);
>
> Any reason for invoking the base class functions explicitly here? Not
> that it bothers me, just curious...
>

I use CamHelper::GetVBlanking to compute the vblank and exposure values with
correct clipping and rounding based on the frame length limits. I could
possibly move that entire calculation into this function, but it would be very
similar, so I thought it would be better to call the base class
implementation.

Regards,
Naush


>
> Subject to clarification on my ratio rounding question:
>
> Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
>
> Thanks!
> David
>
> > +
> > +       return frameLength - mode_.height;
> > +}
> > +
> >  void CamHelperImx477::GetDelays(int &exposure_delay, int &gain_delay,
> >                                 int &vblank_delay) const
> >  {
> > --
> > 2.25.1
> >
>
Laurent Pinchart July 1, 2021, 11:55 p.m. UTC | #5
Hi Naush,

On Tue, Jun 15, 2021 at 01:55:55PM +0100, Naushir Patuck wrote:
> On Tue, 15 Jun 2021 at 04:19, Laurent Pinchart wrote:
> > On Mon, Jun 14, 2021 at 11:00:37AM +0100, Naushir Patuck wrote:
> > > Update the imx477 CamHelper to use long exposure modes if needed.
> > > This is done by overloading the CamHelper::GetVBlanking function to return a
> > > frame length (and vblank value) computed using a scaling factor when the value
> > > would be larger than what the sensor register could otherwise hold.
> > >
> > > CamHelperImx477::Prepare is also overloaded to ensure that the "device.status"
> > > metadata returns the right value if the long exposure scaling factor is use.
> >
> > s/use/used/
> >
> > > The scaling factor is unfortunately not returned back in metadata.
> > >
> > > With the current imx477 driver, we can achieve a maximum exposure time of approx
> > > 127 seconds since the HBLANK control is read-only.
> > >
> > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > > ---
> > >  src/ipa/raspberrypi/cam_helper_imx477.cpp | 67 +++++++++++++++++++++++
> > >  1 file changed, 67 insertions(+)
> > >
> > > diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > index 7a1100c25afc..549c2620765a 100644
> > > --- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > +++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
> > > @@ -6,6 +6,7 @@
> > >   */
> > >
> > >  #include <assert.h>
> > > +#include <cmath>
> > >  #include <stddef.h>
> > >  #include <stdio.h>
> > >  #include <stdlib.h>
> > > @@ -14,6 +15,7 @@
> > >  #include "md_parser.hpp"
> > >
> > >  using namespace RPiController;
> > > +using libcamera::utils::Duration;
> > >
> > >  /*
> > >   * We care about two gain registers and a pair of exposure registers. Their
> > > @@ -30,6 +32,9 @@ public:
> > >       CamHelperImx477();
> > >       uint32_t GainCode(double gain) const override;
> > >       double Gain(uint32_t gain_code) const override;
> > > +     void Prepare(libcamera::Span<const uint8_t> buffer, Metadata &metadata) override;
> > > +     uint32_t GetVBlanking(Duration &exposure, Duration minFrameDuration,
> > > +                           Duration maxFrameDuration) const override;
> > >       void GetDelays(int &exposure_delay, int &gain_delay,
> > >                      int &vblank_delay) const override;
> > >       bool SensorEmbeddedDataPresent() const override;
> > > @@ -40,6 +45,10 @@ private:
> > >        * in units of lines.
> > >        */
> > >       static constexpr int frameIntegrationDiff = 22;
> > > +     /* Maximum frame length allowable for long exposure calculations. */
> > > +     static constexpr int frameLengthMax = 0xffdc;
> > > +     /* Largest long exposure scale factor given as a left shift on the frame length. */
> > > +     static constexpr int longExposureShiftMax = 7;
> > >
> > >       MdParserSmia imx477_parser;
> > >
> > > @@ -72,6 +81,64 @@ double CamHelperImx477::Gain(uint32_t gain_code) const
> > >       return 1024.0 / (1024 - gain_code);
> > >  }
> > >
> > > +void CamHelperImx477::Prepare(libcamera::Span<const uint8_t> buffer, Metadata &metadata)
> > > +{
> > > +     DeviceStatus deviceStatus, parsedStatus;
> > > +
> > > +     /* Get the device status provided by DelayedControls */
> > > +     if (metadata.Get("device.status", deviceStatus) != 0)
> > > +             return;
> > > +
> > > +     /* Get the device status provided by the embedded data buffer. */
> > > +     CamHelper::parseEmbeddedData(buffer, metadata);
> > > +     metadata.Get("device.status", parsedStatus);
> > > +
> > > +     /*
> > > +      * If the ratio of DelayedControls to embedded data shutter speed is > 1
> > > +      * and is a factor of 2^N, then we can assume this is a long exposure mode
> > > +      * frame. Since embedded data does not provide any hints of long exposure
> > > +      * modes, make sure we use the DelayedControls values in the metadata.
> > > +      * Otherwise, just go with the embedded data values.
> > > +      */
> > > +     unsigned long ratio = std::lround(deviceStatus.shutter_speed / parsedStatus.shutter_speed);
> > > +     bool replace = (ratio > 1) && ((ratio & (~ratio + 1)) == ratio);
> > > +
> > > +     if (replace)
> > > +             metadata.Set("device.status", deviceStatus);
> > > +}
> > > +
> > > +uint32_t CamHelperImx477::GetVBlanking(Duration &exposure,
> > > +                                    Duration minFrameDuration,
> > > +                                    Duration maxFrameDuration) const
> > > +{
> > > +     uint32_t frameLength, exposureLines;
> > > +     unsigned int shift = 0;
> > > +
> > > +     frameLength = mode_.height + CamHelper::GetVBlanking(exposure, minFrameDuration,
> > > + maxFrameDuration);
> > > +     /*
> > > +      * Check if the frame length calculated needs to be setup for long
> > > +      * exposure mode. This will require us to use a long exposure scale
> > > +      * factor provided by a shift operation in the sensor.
> > > +      */
> > > +     while (frameLength > frameLengthMax) {
> > > +             if (++shift > longExposureShiftMax) {
> > > +                     shift = longExposureShiftMax;
> > > +                     frameLength = frameLengthMax;
> > > +                     break;
> > > +             }
> > > +             frameLength >>= 1;
> > > +     }
> > > +
> > > +     /* Account for any rounding in the scaled frame length value. */
> > > +     frameLength <<= shift;
> > > +     exposureLines = CamHelper::ExposureLines(exposure);
> > > +     exposureLines = std::min(exposureLines, frameLength - frameIntegrationDiff);
> > > +     exposure = CamHelper::Exposure(exposureLines);
> >
> > I'll try to review this series in more details shortly, but I'd like to
> > already take this as an opportunity to discuss an idea related to the
> > cam helpers.
> >
> > As they stand today, the cam helpers translate between natural units
> > (time for exposure, and a linear multipler for gain) and V4L2 control
> > values. This pushes knowledge of V4L2 to the IPA, which isn't great.
> > I've been toying with the idea of gathering all the V4L2 knowledge in
> > the libcamera core, which would move these conversions to the pipeline
> > handler (with the helper of camera helpers on the libcamera side of
> > course).
> >
> > One issue that bothered me is that the IPA wouldn't know about rounding
> > (or at least not right away, it could get information back from the
> > pipeline handler, but with a delay, and in any case that wouldn't be
> > very clean). The code above takes rounding into account. I'd thus like
> > your feedback on how important this is, and, in case (as I suspect) the
> > IPA needs to know about rounding, whether you think there would still be
> > a way to not add V4L2 knowledge to the IPA.
> 
> This is purely from the perspective of the Raspberry Pi IPA, so other vendors
> may have different opinions or requirements.
> 
> For the RPi IPA, the key is that we get given the *actual* shutter speed and gain
> values used by the sensor for that frame, and these may or may not be rounded
> from what the AGC originally requested.  When fetching shutter speed and gain
> values from embedded data, this is implicit.  When no embedded data is available,
> the CamHelper does the rounding before passing the values into DelayedControls,
> and the value given back to the IPA when the frame comes through (with appropriate
> frame delay) will also be rounded. So, if this CamHelper functionality were to be
> moved to the pipeline handler domain mostly as-is, we would see no ill effect.  If the
> IPA were to somehow get shutter speed and gain values that were not rounded
> appropriately, you would likely see some tiny oscillations in the AGC, and David
> would not be happy with that :-)
> 
> So in summary, I don't think the rounding will give us any problems, other IPAs
> might have to take a similar approach to ours to claim the same. I do think removing
> these sorts of conversions from the IPA domain is the right thing to do. Having the
> IPA deal with only natural units will be a welcome change!

Thanks for the information. I would also like to see this moved out of the
IPA, as it will simplify the code on the IPA side.

I do however fear that some pipeline handlers will want to know how
values are rounded before they get the data back along with the captured
frame. Maybe I'm too paranoid, and this will never be an issue. I think
it's worth exploring moving this to the pipeline handler side in any
case, one more item for my long todo list :-)

> > > +
> > > +     return frameLength - mode_.height;
> > > +}
> > > +
> > >  void CamHelperImx477::GetDelays(int &exposure_delay, int &gain_delay,
> > >                               int &vblank_delay) const
> > >  {

Patch

diff --git a/src/ipa/raspberrypi/cam_helper_imx477.cpp b/src/ipa/raspberrypi/cam_helper_imx477.cpp
index 7a1100c25afc..549c2620765a 100644
--- a/src/ipa/raspberrypi/cam_helper_imx477.cpp
+++ b/src/ipa/raspberrypi/cam_helper_imx477.cpp
@@ -6,6 +6,7 @@ 
  */
 
 #include <assert.h>
+#include <cmath>
 #include <stddef.h>
 #include <stdio.h>
 #include <stdlib.h>
@@ -14,6 +15,7 @@ 
 #include "md_parser.hpp"
 
 using namespace RPiController;
+using libcamera::utils::Duration;
 
 /*
  * We care about two gain registers and a pair of exposure registers. Their
@@ -30,6 +32,9 @@  public:
 	CamHelperImx477();
 	uint32_t GainCode(double gain) const override;
 	double Gain(uint32_t gain_code) const override;
+	void Prepare(libcamera::Span<const uint8_t> buffer, Metadata &metadata) override;
+	uint32_t GetVBlanking(Duration &exposure, Duration minFrameDuration,
+			      Duration maxFrameDuration) const override;
 	void GetDelays(int &exposure_delay, int &gain_delay,
 		       int &vblank_delay) const override;
 	bool SensorEmbeddedDataPresent() const override;
@@ -40,6 +45,10 @@  private:
 	 * in units of lines.
 	 */
 	static constexpr int frameIntegrationDiff = 22;
+	/* Maximum frame length allowable for long exposure calculations. */
+	static constexpr int frameLengthMax = 0xffdc;
+	/* Largest long exposure scale factor given as a left shift on the frame length. */
+	static constexpr int longExposureShiftMax = 7;
 
 	MdParserSmia imx477_parser;
 
@@ -72,6 +81,64 @@  double CamHelperImx477::Gain(uint32_t gain_code) const
 	return 1024.0 / (1024 - gain_code);
 }
 
+void CamHelperImx477::Prepare(libcamera::Span<const uint8_t> buffer, Metadata &metadata)
+{
+	DeviceStatus deviceStatus, parsedStatus;
+
+	/* Get the device status provided by DelayedControls */
+	if (metadata.Get("device.status", deviceStatus) != 0)
+		return;
+
+	/* Get the device status provided by the embedded data buffer. */
+	CamHelper::parseEmbeddedData(buffer, metadata);
+	metadata.Get("device.status", parsedStatus);
+
+	/*
+	 * If the ratio of DelayedControls to embedded data shutter speed is > 1
+	 * and is a factor of 2^N, then we can assume this is a long exposure mode
+	 * frame. Since embedded data does not provide any hints of long exposure
+	 * modes, make sure we use the DelayedControls values in the metadata.
+	 * Otherwise, just go with the embedded data values.
+	 */
+	unsigned long ratio = std::lround(deviceStatus.shutter_speed / parsedStatus.shutter_speed);
+	bool replace = (ratio > 1) && ((ratio & (~ratio + 1)) == ratio);
+
+	if (replace)
+		metadata.Set("device.status", deviceStatus);
+}
+
+uint32_t CamHelperImx477::GetVBlanking(Duration &exposure,
+				       Duration minFrameDuration,
+				       Duration maxFrameDuration) const
+{
+	uint32_t frameLength, exposureLines;
+	unsigned int shift = 0;
+
+	frameLength = mode_.height + CamHelper::GetVBlanking(exposure, minFrameDuration,
+							     maxFrameDuration);
+	/*
+	 * Check if the frame length calculated needs to be setup for long
+	 * exposure mode. This will require us to use a long exposure scale
+	 * factor provided by a shift operation in the sensor.
+	 */
+	while (frameLength > frameLengthMax) {
+		if (++shift > longExposureShiftMax) {
+			shift = longExposureShiftMax;
+			frameLength = frameLengthMax;
+			break;
+		}
+		frameLength >>= 1;
+	}
+
+	/* Account for any rounding in the scaled frame length value. */
+	frameLength <<= shift;
+	exposureLines = CamHelper::ExposureLines(exposure);
+	exposureLines = std::min(exposureLines, frameLength - frameIntegrationDiff);
+	exposure = CamHelper::Exposure(exposureLines);
+
+	return frameLength - mode_.height;
+}
+
 void CamHelperImx477::GetDelays(int &exposure_delay, int &gain_delay,
 				int &vblank_delay) const
 {