[libcamera-devel,2/6] pipeline: raspberrypi: Convert the pipeline handler to use media controller

Message ID 20211022115537.2964533-3-naush@raspberrypi.com
State Superseded
Series
  • Raspberry Pi: Conversion to media controller

Commit Message

Naushir Patuck Oct. 22, 2021, 11:55 a.m. UTC
Switch the pipeline handler to use the new Unicam media controller based driver.
With this change, the pipeline handler talks directly to the sensor device driver
to set controls and to set/get formats.

This change requires the accompanying Raspberry Pi Linux kernel change at
https://github.com/raspberrypi/linux/pull/4645. If that kernel change is not
present, the pipeline handler will fail to run and log an error message asking
the user to update their kernel build.

Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
---
 .../pipeline/raspberrypi/raspberrypi.cpp      | 113 +++++++++++-------
 1 file changed, 67 insertions(+), 46 deletions(-)
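
At a glance, the new configure() flow that this patch establishes can be condensed
from the diff as follows. This is an illustration only, not code from the patch
itself; requestedSize stands in for the raw/maximum size selection that
configure() performs, and the surrounding helpers (findBestMode, the Unicam and
sensor handles) are those defined in raspberrypi.cpp.

/*
 * Illustration: pick the best sensor mode from the cached SensorFormats map,
 * program the sensor subdevice directly via CameraSensor::setFormat(), then
 * apply the matching V4L2 pixel format to the Unicam image node (the ISP
 * input is set to the same format later, and again at start() in case flips
 * changed the Bayer order).
 */
V4L2DeviceFormat unicamFormat = findBestMode(data->sensorFormats_, requestedSize);

unsigned int mbusCode =
	BayerFormat::fromV4L2PixelFormat(unicamFormat.fourcc).toMbusCode();
V4L2SubdeviceFormat sensorFormat { .mbus_code = mbusCode, .size = unicamFormat.size };

int ret = data->sensor_->setFormat(&sensorFormat);
if (ret)
	return ret;

ret = data->unicam_[Unicam::Image].dev()->setFormat(&unicamFormat);
if (ret)
	return ret;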

Comments

David Plowman Oct. 22, 2021, 1:09 p.m. UTC | #1
Hi Naush

Thanks for this patch!

On Fri, 22 Oct 2021 at 12:55, Naushir Patuck <naush@raspberrypi.com> wrote:
>
> Switch the pipeline handler to use the new Unicam media controller based driver.
> With this change, we directly talk to the sensor device driver to set controls
> and set/get formats in the pipeline handler.
>
> This change requires the accompanying Raspberry Pi linux kernel change at
> https://github.com/raspberrypi/linux/pull/4645. If this kernel change is not
> present, the pipeline handler will fail to run with an error message informing
> the user to update the kernel build.
>
> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> ---
>  .../pipeline/raspberrypi/raspberrypi.cpp      | 113 +++++++++++-------
>  1 file changed, 67 insertions(+), 46 deletions(-)
>
> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> index 1634ca98f481..730f1575095c 100644
> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> @@ -48,6 +48,19 @@ LOG_DEFINE_CATEGORY(RPI)
>
>  namespace {
>
> +/* Map of mbus codes to supported sizes reported by the sensor. */
> +using SensorFormats = std::map<unsigned int, std::vector<Size>>;
> +
> +SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)
> +{
> +       SensorFormats formats;
> +
> +       for (auto const mbusCode : sensor->mbusCodes())
> +               formats.emplace(mbusCode, sensor->sizes(mbusCode));
> +
> +       return formats;
> +}
> +
>  bool isRaw(PixelFormat &pixFmt)
>  {
>         /*
> @@ -74,8 +87,7 @@ double scoreFormat(double desired, double actual)
>         return score;
>  }
>
> -V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,
> -                             const Size &req)
> +V4L2DeviceFormat findBestMode(const SensorFormats &formatsMap, const Size &req)

I wonder whether this function should return something more like an
mbus code and a size. The packing is not really relevant to what comes
out of the sensor and should, I suspect, actually be determined by any
raw stream that has been requested by the application. This all starts
to impinge a bit on the sensor-mode-hint discussions I've been having.

One thing I wonder at this point is how an application could express a
wish for packed or unpacked raw streams. I think generateConfiguration
should probably fill in whatever it wants, probably "packed" as that's
more efficient. Then the application would have to be able to change
this to "unpacked" - which we could make less annoying by making the
BayerFormat publicly available.

>  {
>         double bestScore = std::numeric_limits<double>::max(), score;
>         V4L2DeviceFormat bestMode;
> @@ -88,18 +100,17 @@ V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,
>
>         /* Calculate the closest/best mode from the user requested size. */
>         for (const auto &iter : formatsMap) {
> -               V4L2PixelFormat v4l2Format = iter.first;
> +               const unsigned int mbus_code = iter.first;
> +               const V4L2PixelFormat v4l2Format = BayerFormat::fromMbusCode(mbus_code).toV4L2PixelFormat();
>                 const PixelFormatInfo &info = PixelFormatInfo::info(v4l2Format);
>
> -               for (const SizeRange &sz : iter.second) {
> -                       double modeWidth = sz.contains(req) ? req.width : sz.max.width;
> -                       double modeHeight = sz.contains(req) ? req.height : sz.max.height;
> +               for (const Size &sz : iter.second) {
>                         double reqAr = static_cast<double>(req.width) / req.height;
> -                       double modeAr = modeWidth / modeHeight;
> +                       double modeAr = sz.width / sz.height;
>
>                         /* Score the dimensions for closeness. */
> -                       score = scoreFormat(req.width, modeWidth);
> -                       score += scoreFormat(req.height, modeHeight);
> +                       score = scoreFormat(req.width, sz.width);
> +                       score += scoreFormat(req.height, sz.height);
>                         score += PENALTY_AR * scoreFormat(reqAr, modeAr);
>
>                         /* Add any penalties... this is not an exact science! */
> @@ -116,10 +127,10 @@ V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,
>                         if (score <= bestScore) {
>                                 bestScore = score;
>                                 bestMode.fourcc = v4l2Format;
> -                               bestMode.size = Size(modeWidth, modeHeight);
> +                               bestMode.size = sz;
>                         }
>
> -                       LOG(RPI, Info) << "Mode: " << modeWidth << "x" << modeHeight
> +                       LOG(RPI, Info) << "Mode: " << sz.width << "x" << sz.height
>                                        << " fmt " << v4l2Format.toString()
>                                        << " Score: " << score
>                                        << " (best " << bestScore << ")";
> @@ -170,6 +181,7 @@ public:
>         std::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;
>
>         std::unique_ptr<CameraSensor> sensor_;
> +       SensorFormats sensorFormats_;
>         /* Array of Unicam and ISP device streams and associated buffers/streams. */
>         RPi::Device<Unicam, 2> unicam_;
>         RPi::Device<Isp, 4> isp_;
> @@ -352,8 +364,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()
>                          * Calculate the best sensor mode we can use based on
>                          * the user request.
>                          */
> -                       V4L2VideoDevice::Formats fmts = data_->unicam_[Unicam::Image].dev()->formats();
> -                       V4L2DeviceFormat sensorFormat = findBestMode(fmts, cfg.size);
> +                       V4L2DeviceFormat sensorFormat = findBestMode(data_->sensorFormats_, cfg.size);
>                         int ret = data_->unicam_[Unicam::Image].dev()->tryFormat(&sensorFormat);
>                         if (ret)
>                                 return Invalid;
> @@ -487,8 +498,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,
>                 switch (role) {
>                 case StreamRole::Raw:
>                         size = data->sensor_->resolution();
> -                       fmts = data->unicam_[Unicam::Image].dev()->formats();
> -                       sensorFormat = findBestMode(fmts, size);
> +                       sensorFormat = findBestMode(data->sensorFormats_, size);
>                         pixelFormat = sensorFormat.fourcc.toPixelFormat();
>                         ASSERT(pixelFormat.isValid());
>                         bufferCount = 2;
> @@ -599,32 +609,32 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
>         }
>
>         /* First calculate the best sensor mode we can use based on the user request. */
> -       V4L2VideoDevice::Formats fmts = data->unicam_[Unicam::Image].dev()->formats();
> -       V4L2DeviceFormat sensorFormat = findBestMode(fmts, rawStream ? sensorSize : maxSize);
> +       V4L2DeviceFormat unicamFormat = findBestMode(data->sensorFormats_, rawStream ? sensorSize : maxSize);
> +
> +       unsigned int mbus_code = BayerFormat::fromV4L2PixelFormat(unicamFormat.fourcc).toMbusCode();
> +       V4L2SubdeviceFormat sensorFormat { .mbus_code = mbus_code, .size = unicamFormat.size };
> +
> +       ret = data->sensor_->setFormat(&sensorFormat);
> +       if (ret)
> +               return ret;

As discussed above, the unicamFormat probably wants to take account of
the packing (or not) in any raw stream (if present).

>
>         /*
>          * Unicam image output format. The ISP input format gets set at start,
>          * just in case we have swapped bayer orders due to flips.
>          */
> -       ret = data->unicam_[Unicam::Image].dev()->setFormat(&sensorFormat);
> +       ret = data->unicam_[Unicam::Image].dev()->setFormat(&unicamFormat);
>         if (ret)
>                 return ret;
>
> -       /*
> -        * The control ranges associated with the sensor may need updating
> -        * after a format change.
> -        * \todo Use the CameraSensor::setFormat API instead.
> -        */
> -       data->sensor_->updateControlInfo();

Hmm, nice, I wonder if we could delete updateControlInfo entirely?

> -
>         LOG(RPI, Info) << "Sensor: " << camera->id()
> -                      << " - Selected mode: " << sensorFormat.toString();
> +                      << " - Selected sensor mode: " << sensorFormat.toString()
> +                      << " - Selected unicam mode: " << unicamFormat.toString();
>
>         /*
>          * This format may be reset on start() if the bayer order has changed
>          * because of flips in the sensor.
>          */
> -       ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);
> +       ret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat);
>         if (ret)
>                 return ret;
>
> @@ -746,8 +756,8 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
>         data->ispMinCropSize_ = testCrop.size();
>
>         /* Adjust aspect ratio by providing crops on the input image. */
> -       Size size = sensorFormat.size.boundedToAspectRatio(maxSize);
> -       Rectangle crop = size.centeredTo(Rectangle(sensorFormat.size).center());
> +       Size size = unicamFormat.size.boundedToAspectRatio(maxSize);
> +       Rectangle crop = size.centeredTo(Rectangle(unicamFormat.size).center());
>         data->ispCrop_ = crop;
>
>         data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop);
> @@ -761,10 +771,11 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
>          * supports it.
>          */
>         if (data->sensorMetadata_) {
> -               format = {};
> -               format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
> +               V4L2SubdeviceFormat embeddedFormat;
>
> -               LOG(RPI, Debug) << "Setting embedded data format.";
> +               data->sensor_->device()->getFormat(1, &embeddedFormat);
> +               format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
> +               format.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;
>                 ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);
>                 if (ret) {
>                         LOG(RPI, Error) << "Failed to set format on Unicam embedded: "
> @@ -847,9 +858,14 @@ int PipelineHandlerRPi::start(Camera *camera, const ControlList *controls)
>          * IPA configure may have changed the sensor flips - hence the bayer
>          * order. Get the sensor format and set the ISP input now.
>          */
> -       V4L2DeviceFormat sensorFormat;
> -       data->unicam_[Unicam::Image].dev()->getFormat(&sensorFormat);
> -       ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);
> +       V4L2SubdeviceFormat sensorFormat;
> +       data->sensor_->device()->getFormat(0, &sensorFormat);
> +
> +       V4L2DeviceFormat ispFormat;
> +       ispFormat.fourcc = BayerFormat::fromMbusCode(sensorFormat.mbus_code).toV4L2PixelFormat();
> +       ispFormat.size = sensorFormat.size;
> +
> +       ret = data->isp_[Isp::Input].dev()->setFormat(&ispFormat);
>         if (ret) {
>                 stop(camera);
>                 return ret;
> @@ -1004,6 +1020,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
>         if (data->sensor_->init())
>                 return false;
>
> +       data->sensorFormats_ = populateSensorFormats(data->sensor_);
> +
>         ipa::RPi::SensorConfig sensorConfig;
>         if (data->loadIPA(&sensorConfig)) {
>                 LOG(RPI, Error) << "Failed to load a suitable IPA library";
> @@ -1030,6 +1048,11 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
>                         return false;
>         }
>
> +       if (!(data->unicam_[Unicam::Image].dev()->caps().device_caps() & V4L2_CAP_IO_MC)) {
> +               LOG(RPI, Error) << "Unicam driver did not advertise V4L2_CAP_IO_MC, please update your kernel!";
> +               return false;
> +       }
> +
>         /*
>          * Setup our delayed control writer with the sensor default
>          * gain and exposure delays. Mark VBLANK for priority write.
> @@ -1039,7 +1062,7 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
>                 { V4L2_CID_EXPOSURE, { sensorConfig.exposureDelay, false } },
>                 { V4L2_CID_VBLANK, { sensorConfig.vblankDelay, true } }
>         };
> -       data->delayedCtrls_ = std::make_unique<DelayedControls>(data->unicam_[Unicam::Image].dev(), params);
> +       data->delayedCtrls_ = std::make_unique<DelayedControls>(data->sensor_->device(), params);
>         data->sensorMetadata_ = sensorConfig.sensorMetadata;
>
>         /* Register the controls that the Raspberry Pi IPA can handle. */
> @@ -1066,15 +1089,14 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
>          * As part of answering the final question, we reset the camera to
>          * no transform at all.
>          */
> -
> -       V4L2VideoDevice *dev = data->unicam_[Unicam::Image].dev();
> -       const struct v4l2_query_ext_ctrl *hflipCtrl = dev->controlInfo(V4L2_CID_HFLIP);
> +       const V4L2Subdevice *sensor = data->sensor_->device();
> +       const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);
>         if (hflipCtrl) {
>                 /* We assume it will support vflips too... */
>                 data->supportsFlips_ = true;
>                 data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;
>
> -               ControlList ctrls(dev->controls());
> +               ControlList ctrls(data->sensor_->controls());
>                 ctrls.set(V4L2_CID_HFLIP, 0);
>                 ctrls.set(V4L2_CID_VFLIP, 0);
>                 data->setSensorControls(ctrls);
> @@ -1082,9 +1104,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
>
>         /* Look for a valid Bayer format. */
>         BayerFormat bayerFormat;
> -       for (const auto &iter : dev->formats()) {
> -               V4L2PixelFormat v4l2Format = iter.first;
> -               bayerFormat = BayerFormat::fromV4L2PixelFormat(v4l2Format);
> +       for (const auto &iter : data->sensorFormats_) {
> +               bayerFormat = BayerFormat::fromMbusCode(iter.first);
>                 if (bayerFormat.isValid())
>                         break;
>         }
> @@ -1271,7 +1292,7 @@ int RPiCameraData::configureIPA(const CameraConfiguration *config)
>                 }
>         }
>
> -       entityControls.emplace(0, unicam_[Unicam::Image].dev()->controls());
> +       entityControls.emplace(0, sensor_->controls());
>         entityControls.emplace(1, isp_[Isp::Input].dev()->controls());
>
>         /* Always send the user transform to the IPA. */
> @@ -1406,10 +1427,10 @@ void RPiCameraData::setSensorControls(ControlList &controls)
>                 ControlList vblank_ctrl;
>
>                 vblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));
> -               unicam_[Unicam::Image].dev()->setControls(&vblank_ctrl);
> +               sensor_->setControls(&vblank_ctrl);
>         }
>
> -       unicam_[Unicam::Image].dev()->setControls(&controls);
> +       sensor_->setControls(&controls);
>  }
>
>  void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)
> --
> 2.25.1
>

Thanks!

David
Naushir Patuck Oct. 22, 2021, 1:28 p.m. UTC | #2
Hi David,

Thank you for your feedback.

On Fri, 22 Oct 2021 at 14:09, David Plowman <david.plowman@raspberrypi.com>
wrote:

> Hi Naush
>
> Thanks for this patch!
>
> On Fri, 22 Oct 2021 at 12:55, Naushir Patuck <naush@raspberrypi.com>
> wrote:
> >
> > Switch the pipeline handler to use the new Unicam media controller based
> driver.
> > With this change, we directly talk to the sensor device driver to set
> controls
> > and set/get formats in the pipeline handler.
> >
> > This change requires the accompanying Raspberry Pi linux kernel change at
> > https://github.com/raspberrypi/linux/pull/4645. If this kernel change
> is not
> > present, the pipeline handler will fail to run with an error message
> informing
> > the user to update the kernel build.
> >
> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > ---
> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 113 +++++++++++-------
> >  1 file changed, 67 insertions(+), 46 deletions(-)
> >
> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > index 1634ca98f481..730f1575095c 100644
> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > @@ -48,6 +48,19 @@ LOG_DEFINE_CATEGORY(RPI)
> >
> >  namespace {
> >
> > +/* Map of mbus codes to supported sizes reported by the sensor. */
> > +using SensorFormats = std::map<unsigned int, std::vector<Size>>;
> > +
> > +SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor>
> &sensor)
> > +{
> > +       SensorFormats formats;
> > +
> > +       for (auto const mbusCode : sensor->mbusCodes())
> > +               formats.emplace(mbusCode, sensor->sizes(mbusCode));
> > +
> > +       return formats;
> > +}
> > +
> >  bool isRaw(PixelFormat &pixFmt)
> >  {
> >         /*
> > @@ -74,8 +87,7 @@ double scoreFormat(double desired, double actual)
> >         return score;
> >  }
> >
> > -V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,
> > -                             const Size &req)
> > +V4L2DeviceFormat findBestMode(const SensorFormats &formatsMap, const
> Size &req)
>
> I wonder whether this function should return something more like an
> mbus code and a size. The packing is not really relevant to what comes
> out of the sensor and should, I suspect, actually be determined by any
> raw stream that has been requested by the application. This all starts
> to impinge a bit on the sensor-mode-hint discussions I've been having.
>

Yes, you are right! And conveniently, V4L2SubdeviceFormat is a struct that
holds exactly these two fields. I will return that struct from findBestMode
to make things simpler.
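
For illustration, a minimal sketch of what that reworked helper could look
like, reusing scoreFormat() and PENALTY_AR from the existing code and eliding
the remaining penalty terms into a comment. This is a sketch of the idea, not
the final implementation.

/*
 * Sketch: findBestMode returning a V4L2SubdeviceFormat (mbus code + size)
 * instead of a V4L2DeviceFormat. Callers then derive the Unicam fourcc (and
 * packing) only where the video node is programmed.
 */
V4L2SubdeviceFormat findBestMode(const SensorFormats &formatsMap, const Size &req)
{
	double bestScore = std::numeric_limits<double>::max();
	V4L2SubdeviceFormat bestMode {};

	for (const auto &[mbusCode, sizes] : formatsMap) {
		for (const Size &sz : sizes) {
			double reqAr = static_cast<double>(req.width) / req.height;
			double modeAr = static_cast<double>(sz.width) / sz.height;

			/* Score the dimensions and aspect ratio for closeness. */
			double score = scoreFormat(req.width, sz.width) +
				       scoreFormat(req.height, sz.height) +
				       PENALTY_AR * scoreFormat(reqAr, modeAr);
			/* Bit-depth and other penalties from the patch would go here. */

			if (score <= bestScore) {
				bestScore = score;
				bestMode.mbus_code = mbusCode;
				bestMode.size = sz;
			}
		}
	}

	return bestMode;
}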


> One thing I wonder at this point is how an application could express a
> wish for packed or unpacked raw streams. I think generateConfiguration
> should probably fill in whatever it wants, probably "packed" as that's
> more efficient. Then the application would have to be able to change
> this to "unpacked" - which we could make less annoying by making the
> BayerFormat publicly available.
>

That makes sense. For this change I will adjust the "packing" parameter to be
packed in all cases, except when the application requests an unpacked raw
stream pixel format.

We can consider opening up the BayerFormat class to the public later.
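
A rough sketch of that packing selection, for illustration only: the BayerFormat
packing member and enumerator names below are assumptions and may differ in this
tree, so treat it as the shape of the logic rather than the final code.

/*
 * Illustration only (not from the patch): derive the Unicam fourcc from the
 * selected mbus code, defaulting to the packed (CSI-2) layout and falling
 * back to the unpacked layout only when the application's raw stream asked
 * for it. The `packing` member and its enumerator names are assumptions.
 */
V4L2PixelFormat unicamFourcc(unsigned int mbusCode, bool unpackedRawRequested)
{
	BayerFormat bayer = BayerFormat::fromMbusCode(mbusCode);

	bayer.packing = unpackedRawRequested ? BayerFormat::Packing::None
					     : BayerFormat::Packing::CSI2;

	return bayer.toV4L2PixelFormat();
}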


>
> >  {
> >         double bestScore = std::numeric_limits<double>::max(), score;
> >         V4L2DeviceFormat bestMode;
> > @@ -88,18 +100,17 @@ V4L2DeviceFormat
> findBestMode(V4L2VideoDevice::Formats &formatsMap,
> >
> >         /* Calculate the closest/best mode from the user requested size.
> */
> >         for (const auto &iter : formatsMap) {
> > -               V4L2PixelFormat v4l2Format = iter.first;
> > +               const unsigned int mbus_code = iter.first;
> > +               const V4L2PixelFormat v4l2Format =
> BayerFormat::fromMbusCode(mbus_code).toV4L2PixelFormat();
> >                 const PixelFormatInfo &info =
> PixelFormatInfo::info(v4l2Format);
> >
> > -               for (const SizeRange &sz : iter.second) {
> > -                       double modeWidth = sz.contains(req) ? req.width
> : sz.max.width;
> > -                       double modeHeight = sz.contains(req) ?
> req.height : sz.max.height;
> > +               for (const Size &sz : iter.second) {
> >                         double reqAr = static_cast<double>(req.width) /
> req.height;
> > -                       double modeAr = modeWidth / modeHeight;
> > +                       double modeAr = sz.width / sz.height;
> >
> >                         /* Score the dimensions for closeness. */
> > -                       score = scoreFormat(req.width, modeWidth);
> > -                       score += scoreFormat(req.height, modeHeight);
> > +                       score = scoreFormat(req.width, sz.width);
> > +                       score += scoreFormat(req.height, sz.height);
> >                         score += PENALTY_AR * scoreFormat(reqAr, modeAr);
> >
> >                         /* Add any penalties... this is not an exact
> science! */
> > @@ -116,10 +127,10 @@ V4L2DeviceFormat
> findBestMode(V4L2VideoDevice::Formats &formatsMap,
> >                         if (score <= bestScore) {
> >                                 bestScore = score;
> >                                 bestMode.fourcc = v4l2Format;
> > -                               bestMode.size = Size(modeWidth,
> modeHeight);
> > +                               bestMode.size = sz;
> >                         }
> >
> > -                       LOG(RPI, Info) << "Mode: " << modeWidth << "x"
> << modeHeight
> > +                       LOG(RPI, Info) << "Mode: " << sz.width << "x" <<
> sz.height
> >                                        << " fmt " <<
> v4l2Format.toString()
> >                                        << " Score: " << score
> >                                        << " (best " << bestScore << ")";
> > @@ -170,6 +181,7 @@ public:
> >         std::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;
> >
> >         std::unique_ptr<CameraSensor> sensor_;
> > +       SensorFormats sensorFormats_;
> >         /* Array of Unicam and ISP device streams and associated
> buffers/streams. */
> >         RPi::Device<Unicam, 2> unicam_;
> >         RPi::Device<Isp, 4> isp_;
> > @@ -352,8 +364,7 @@ CameraConfiguration::Status
> RPiCameraConfiguration::validate()
> >                          * Calculate the best sensor mode we can use
> based on
> >                          * the user request.
> >                          */
> > -                       V4L2VideoDevice::Formats fmts =
> data_->unicam_[Unicam::Image].dev()->formats();
> > -                       V4L2DeviceFormat sensorFormat =
> findBestMode(fmts, cfg.size);
> > +                       V4L2DeviceFormat sensorFormat =
> findBestMode(data_->sensorFormats_, cfg.size);
> >                         int ret =
> data_->unicam_[Unicam::Image].dev()->tryFormat(&sensorFormat);
> >                         if (ret)
> >                                 return Invalid;
> > @@ -487,8 +498,7 @@ CameraConfiguration
> *PipelineHandlerRPi::generateConfiguration(Camera *camera,
> >                 switch (role) {
> >                 case StreamRole::Raw:
> >                         size = data->sensor_->resolution();
> > -                       fmts =
> data->unicam_[Unicam::Image].dev()->formats();
> > -                       sensorFormat = findBestMode(fmts, size);
> > +                       sensorFormat =
> findBestMode(data->sensorFormats_, size);
> >                         pixelFormat =
> sensorFormat.fourcc.toPixelFormat();
> >                         ASSERT(pixelFormat.isValid());
> >                         bufferCount = 2;
> > @@ -599,32 +609,32 @@ int PipelineHandlerRPi::configure(Camera *camera,
> CameraConfiguration *config)
> >         }
> >
> >         /* First calculate the best sensor mode we can use based on the
> user request. */
> > -       V4L2VideoDevice::Formats fmts =
> data->unicam_[Unicam::Image].dev()->formats();
> > -       V4L2DeviceFormat sensorFormat = findBestMode(fmts, rawStream ?
> sensorSize : maxSize);
> > +       V4L2DeviceFormat unicamFormat =
> findBestMode(data->sensorFormats_, rawStream ? sensorSize : maxSize);
> > +
> > +       unsigned int mbus_code =
> BayerFormat::fromV4L2PixelFormat(unicamFormat.fourcc).toMbusCode();
> > +       V4L2SubdeviceFormat sensorFormat { .mbus_code = mbus_code, .size
> = unicamFormat.size };
> > +
> > +       ret = data->sensor_->setFormat(&sensorFormat);
> > +       if (ret)
> > +               return ret;
>
> As discussed above, the unicamFormat probably wants to take account of
> the packing (or not) in any raw stream (if present).
>

Ack.


>
> >
> >         /*
> >          * Unicam image output format. The ISP input format gets set at
> start,
> >          * just in case we have swapped bayer orders due to flips.
> >          */
> > -       ret =
> data->unicam_[Unicam::Image].dev()->setFormat(&sensorFormat);
> > +       ret =
> data->unicam_[Unicam::Image].dev()->setFormat(&unicamFormat);
> >         if (ret)
> >                 return ret;
> >
> > -       /*
> > -        * The control ranges associated with the sensor may need
> updating
> > -        * after a format change.
> > -        * \todo Use the CameraSensor::setFormat API instead.
> > -        */
> > -       data->sensor_->updateControlInfo();
>
> Hmm, nice, I wonder if we could delete updateControlInfo entirely?
>

There do not seem to be any other users of V4L2Device::updateControlInfo(),
so I could remove it. I recall it was mainly for our use, so that should be
fine.

Regards,
Naush


> > -
> >         LOG(RPI, Info) << "Sensor: " << camera->id()
> > -                      << " - Selected mode: " <<
> sensorFormat.toString();
> > +                      << " - Selected sensor mode: " <<
> sensorFormat.toString()
> > +                      << " - Selected unicam mode: " <<
> unicamFormat.toString();
> >
> >         /*
> >          * This format may be reset on start() if the bayer order has
> changed
> >          * because of flips in the sensor.
> >          */
> > -       ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);
> > +       ret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat);
> >         if (ret)
> >                 return ret;
> >
> > @@ -746,8 +756,8 @@ int PipelineHandlerRPi::configure(Camera *camera,
> CameraConfiguration *config)
> >         data->ispMinCropSize_ = testCrop.size();
> >
> >         /* Adjust aspect ratio by providing crops on the input image. */
> > -       Size size = sensorFormat.size.boundedToAspectRatio(maxSize);
> > -       Rectangle crop =
> size.centeredTo(Rectangle(sensorFormat.size).center());
> > +       Size size = unicamFormat.size.boundedToAspectRatio(maxSize);
> > +       Rectangle crop =
> size.centeredTo(Rectangle(unicamFormat.size).center());
> >         data->ispCrop_ = crop;
> >
> >         data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP,
> &crop);
> > @@ -761,10 +771,11 @@ int PipelineHandlerRPi::configure(Camera *camera,
> CameraConfiguration *config)
> >          * supports it.
> >          */
> >         if (data->sensorMetadata_) {
> > -               format = {};
> > -               format.fourcc =
> V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
> > +               V4L2SubdeviceFormat embeddedFormat;
> >
> > -               LOG(RPI, Debug) << "Setting embedded data format.";
> > +               data->sensor_->device()->getFormat(1, &embeddedFormat);
> > +               format.fourcc =
> V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
> > +               format.planes[0].size = embeddedFormat.size.width *
> embeddedFormat.size.height;
> >                 ret =
> data->unicam_[Unicam::Embedded].dev()->setFormat(&format);
> >                 if (ret) {
> >                         LOG(RPI, Error) << "Failed to set format on
> Unicam embedded: "
> > @@ -847,9 +858,14 @@ int PipelineHandlerRPi::start(Camera *camera, const
> ControlList *controls)
> >          * IPA configure may have changed the sensor flips - hence the
> bayer
> >          * order. Get the sensor format and set the ISP input now.
> >          */
> > -       V4L2DeviceFormat sensorFormat;
> > -       data->unicam_[Unicam::Image].dev()->getFormat(&sensorFormat);
> > -       ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);
> > +       V4L2SubdeviceFormat sensorFormat;
> > +       data->sensor_->device()->getFormat(0, &sensorFormat);
> > +
> > +       V4L2DeviceFormat ispFormat;
> > +       ispFormat.fourcc =
> BayerFormat::fromMbusCode(sensorFormat.mbus_code).toV4L2PixelFormat();
> > +       ispFormat.size = sensorFormat.size;
> > +
> > +       ret = data->isp_[Isp::Input].dev()->setFormat(&ispFormat);
> >         if (ret) {
> >                 stop(camera);
> >                 return ret;
> > @@ -1004,6 +1020,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator
> *enumerator)
> >         if (data->sensor_->init())
> >                 return false;
> >
> > +       data->sensorFormats_ = populateSensorFormats(data->sensor_);
> > +
> >         ipa::RPi::SensorConfig sensorConfig;
> >         if (data->loadIPA(&sensorConfig)) {
> >                 LOG(RPI, Error) << "Failed to load a suitable IPA
> library";
> > @@ -1030,6 +1048,11 @@ bool PipelineHandlerRPi::match(DeviceEnumerator
> *enumerator)
> >                         return false;
> >         }
> >
> > +       if (!(data->unicam_[Unicam::Image].dev()->caps().device_caps() &
> V4L2_CAP_IO_MC)) {
> > +               LOG(RPI, Error) << "Unicam driver did not advertise
> V4L2_CAP_IO_MC, please update your kernel!";
> > +               return false;
> > +       }
> > +
> >         /*
> >          * Setup our delayed control writer with the sensor default
> >          * gain and exposure delays. Mark VBLANK for priority write.
> > @@ -1039,7 +1062,7 @@ bool PipelineHandlerRPi::match(DeviceEnumerator
> *enumerator)
> >                 { V4L2_CID_EXPOSURE, { sensorConfig.exposureDelay, false
> } },
> >                 { V4L2_CID_VBLANK, { sensorConfig.vblankDelay, true } }
> >         };
> > -       data->delayedCtrls_ =
> std::make_unique<DelayedControls>(data->unicam_[Unicam::Image].dev(),
> params);
> > +       data->delayedCtrls_ =
> std::make_unique<DelayedControls>(data->sensor_->device(), params);
> >         data->sensorMetadata_ = sensorConfig.sensorMetadata;
> >
> >         /* Register the controls that the Raspberry Pi IPA can handle. */
> > @@ -1066,15 +1089,14 @@ bool PipelineHandlerRPi::match(DeviceEnumerator
> *enumerator)
> >          * As part of answering the final question, we reset the camera
> to
> >          * no transform at all.
> >          */
> > -
> > -       V4L2VideoDevice *dev = data->unicam_[Unicam::Image].dev();
> > -       const struct v4l2_query_ext_ctrl *hflipCtrl =
> dev->controlInfo(V4L2_CID_HFLIP);
> > +       const V4L2Subdevice *sensor = data->sensor_->device();
> > +       const struct v4l2_query_ext_ctrl *hflipCtrl =
> sensor->controlInfo(V4L2_CID_HFLIP);
> >         if (hflipCtrl) {
> >                 /* We assume it will support vflips too... */
> >                 data->supportsFlips_ = true;
> >                 data->flipsAlterBayerOrder_ = hflipCtrl->flags &
> V4L2_CTRL_FLAG_MODIFY_LAYOUT;
> >
> > -               ControlList ctrls(dev->controls());
> > +               ControlList ctrls(data->sensor_->controls());
> >                 ctrls.set(V4L2_CID_HFLIP, 0);
> >                 ctrls.set(V4L2_CID_VFLIP, 0);
> >                 data->setSensorControls(ctrls);
> > @@ -1082,9 +1104,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator
> *enumerator)
> >
> >         /* Look for a valid Bayer format. */
> >         BayerFormat bayerFormat;
> > -       for (const auto &iter : dev->formats()) {
> > -               V4L2PixelFormat v4l2Format = iter.first;
> > -               bayerFormat =
> BayerFormat::fromV4L2PixelFormat(v4l2Format);
> > +       for (const auto &iter : data->sensorFormats_) {
> > +               bayerFormat = BayerFormat::fromMbusCode(iter.first);
> >                 if (bayerFormat.isValid())
> >                         break;
> >         }
> > @@ -1271,7 +1292,7 @@ int RPiCameraData::configureIPA(const
> CameraConfiguration *config)
> >                 }
> >         }
> >
> > -       entityControls.emplace(0,
> unicam_[Unicam::Image].dev()->controls());
> > +       entityControls.emplace(0, sensor_->controls());
> >         entityControls.emplace(1, isp_[Isp::Input].dev()->controls());
> >
> >         /* Always send the user transform to the IPA. */
> > @@ -1406,10 +1427,10 @@ void
> RPiCameraData::setSensorControls(ControlList &controls)
> >                 ControlList vblank_ctrl;
> >
> >                 vblank_ctrl.set(V4L2_CID_VBLANK,
> controls.get(V4L2_CID_VBLANK));
> > -               unicam_[Unicam::Image].dev()->setControls(&vblank_ctrl);
> > +               sensor_->setControls(&vblank_ctrl);
> >         }
> >
> > -       unicam_[Unicam::Image].dev()->setControls(&controls);
> > +       sensor_->setControls(&controls);
> >  }
> >
> >  void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)
> > --
> > 2.25.1
> >
>
> Thanks!
>
> David
>

Patch

diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
index 1634ca98f481..730f1575095c 100644
--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
@@ -48,6 +48,19 @@  LOG_DEFINE_CATEGORY(RPI)
 
 namespace {
 
+/* Map of mbus codes to supported sizes reported by the sensor. */
+using SensorFormats = std::map<unsigned int, std::vector<Size>>;
+
+SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)
+{
+	SensorFormats formats;
+
+	for (auto const mbusCode : sensor->mbusCodes())
+		formats.emplace(mbusCode, sensor->sizes(mbusCode));
+
+	return formats;
+}
+
 bool isRaw(PixelFormat &pixFmt)
 {
 	/*
@@ -74,8 +87,7 @@  double scoreFormat(double desired, double actual)
 	return score;
 }
 
-V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,
-			      const Size &req)
+V4L2DeviceFormat findBestMode(const SensorFormats &formatsMap, const Size &req)
 {
 	double bestScore = std::numeric_limits<double>::max(), score;
 	V4L2DeviceFormat bestMode;
@@ -88,18 +100,17 @@  V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,
 
 	/* Calculate the closest/best mode from the user requested size. */
 	for (const auto &iter : formatsMap) {
-		V4L2PixelFormat v4l2Format = iter.first;
+		const unsigned int mbus_code = iter.first;
+		const V4L2PixelFormat v4l2Format = BayerFormat::fromMbusCode(mbus_code).toV4L2PixelFormat();
 		const PixelFormatInfo &info = PixelFormatInfo::info(v4l2Format);
 
-		for (const SizeRange &sz : iter.second) {
-			double modeWidth = sz.contains(req) ? req.width : sz.max.width;
-			double modeHeight = sz.contains(req) ? req.height : sz.max.height;
+		for (const Size &sz : iter.second) {
 			double reqAr = static_cast<double>(req.width) / req.height;
-			double modeAr = modeWidth / modeHeight;
+			double modeAr = sz.width / sz.height;
 
 			/* Score the dimensions for closeness. */
-			score = scoreFormat(req.width, modeWidth);
-			score += scoreFormat(req.height, modeHeight);
+			score = scoreFormat(req.width, sz.width);
+			score += scoreFormat(req.height, sz.height);
 			score += PENALTY_AR * scoreFormat(reqAr, modeAr);
 
 			/* Add any penalties... this is not an exact science! */
@@ -116,10 +127,10 @@  V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,
 			if (score <= bestScore) {
 				bestScore = score;
 				bestMode.fourcc = v4l2Format;
-				bestMode.size = Size(modeWidth, modeHeight);
+				bestMode.size = sz;
 			}
 
-			LOG(RPI, Info) << "Mode: " << modeWidth << "x" << modeHeight
+			LOG(RPI, Info) << "Mode: " << sz.width << "x" << sz.height
 				       << " fmt " << v4l2Format.toString()
 				       << " Score: " << score
 				       << " (best " << bestScore << ")";
@@ -170,6 +181,7 @@  public:
 	std::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;
 
 	std::unique_ptr<CameraSensor> sensor_;
+	SensorFormats sensorFormats_;
 	/* Array of Unicam and ISP device streams and associated buffers/streams. */
 	RPi::Device<Unicam, 2> unicam_;
 	RPi::Device<Isp, 4> isp_;
@@ -352,8 +364,7 @@  CameraConfiguration::Status RPiCameraConfiguration::validate()
 			 * Calculate the best sensor mode we can use based on
 			 * the user request.
 			 */
-			V4L2VideoDevice::Formats fmts = data_->unicam_[Unicam::Image].dev()->formats();
-			V4L2DeviceFormat sensorFormat = findBestMode(fmts, cfg.size);
+			V4L2DeviceFormat sensorFormat = findBestMode(data_->sensorFormats_, cfg.size);
 			int ret = data_->unicam_[Unicam::Image].dev()->tryFormat(&sensorFormat);
 			if (ret)
 				return Invalid;
@@ -487,8 +498,7 @@  CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,
 		switch (role) {
 		case StreamRole::Raw:
 			size = data->sensor_->resolution();
-			fmts = data->unicam_[Unicam::Image].dev()->formats();
-			sensorFormat = findBestMode(fmts, size);
+			sensorFormat = findBestMode(data->sensorFormats_, size);
 			pixelFormat = sensorFormat.fourcc.toPixelFormat();
 			ASSERT(pixelFormat.isValid());
 			bufferCount = 2;
@@ -599,32 +609,32 @@  int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
 	}
 
 	/* First calculate the best sensor mode we can use based on the user request. */
-	V4L2VideoDevice::Formats fmts = data->unicam_[Unicam::Image].dev()->formats();
-	V4L2DeviceFormat sensorFormat = findBestMode(fmts, rawStream ? sensorSize : maxSize);
+	V4L2DeviceFormat unicamFormat = findBestMode(data->sensorFormats_, rawStream ? sensorSize : maxSize);
+
+	unsigned int mbus_code = BayerFormat::fromV4L2PixelFormat(unicamFormat.fourcc).toMbusCode();
+	V4L2SubdeviceFormat sensorFormat { .mbus_code = mbus_code, .size = unicamFormat.size };
+
+	ret = data->sensor_->setFormat(&sensorFormat);
+	if (ret)
+		return ret;
 
 	/*
 	 * Unicam image output format. The ISP input format gets set at start,
 	 * just in case we have swapped bayer orders due to flips.
 	 */
-	ret = data->unicam_[Unicam::Image].dev()->setFormat(&sensorFormat);
+	ret = data->unicam_[Unicam::Image].dev()->setFormat(&unicamFormat);
 	if (ret)
 		return ret;
 
-	/*
-	 * The control ranges associated with the sensor may need updating
-	 * after a format change.
-	 * \todo Use the CameraSensor::setFormat API instead.
-	 */
-	data->sensor_->updateControlInfo();
-
 	LOG(RPI, Info) << "Sensor: " << camera->id()
-		       << " - Selected mode: " << sensorFormat.toString();
+		       << " - Selected sensor mode: " << sensorFormat.toString()
+		       << " - Selected unicam mode: " << unicamFormat.toString();
 
 	/*
 	 * This format may be reset on start() if the bayer order has changed
 	 * because of flips in the sensor.
 	 */
-	ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);
+	ret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat);
 	if (ret)
 		return ret;
 
@@ -746,8 +756,8 @@  int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
 	data->ispMinCropSize_ = testCrop.size();
 
 	/* Adjust aspect ratio by providing crops on the input image. */
-	Size size = sensorFormat.size.boundedToAspectRatio(maxSize);
-	Rectangle crop = size.centeredTo(Rectangle(sensorFormat.size).center());
+	Size size = unicamFormat.size.boundedToAspectRatio(maxSize);
+	Rectangle crop = size.centeredTo(Rectangle(unicamFormat.size).center());
 	data->ispCrop_ = crop;
 
 	data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop);
@@ -761,10 +771,11 @@  int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
 	 * supports it.
 	 */
 	if (data->sensorMetadata_) {
-		format = {};
-		format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
+		V4L2SubdeviceFormat embeddedFormat;
 
-		LOG(RPI, Debug) << "Setting embedded data format.";
+		data->sensor_->device()->getFormat(1, &embeddedFormat);
+		format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);
+		format.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;
 		ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);
 		if (ret) {
 			LOG(RPI, Error) << "Failed to set format on Unicam embedded: "
@@ -847,9 +858,14 @@  int PipelineHandlerRPi::start(Camera *camera, const ControlList *controls)
 	 * IPA configure may have changed the sensor flips - hence the bayer
 	 * order. Get the sensor format and set the ISP input now.
 	 */
-	V4L2DeviceFormat sensorFormat;
-	data->unicam_[Unicam::Image].dev()->getFormat(&sensorFormat);
-	ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);
+	V4L2SubdeviceFormat sensorFormat;
+	data->sensor_->device()->getFormat(0, &sensorFormat);
+
+	V4L2DeviceFormat ispFormat;
+	ispFormat.fourcc = BayerFormat::fromMbusCode(sensorFormat.mbus_code).toV4L2PixelFormat();
+	ispFormat.size = sensorFormat.size;
+
+	ret = data->isp_[Isp::Input].dev()->setFormat(&ispFormat);
 	if (ret) {
 		stop(camera);
 		return ret;
@@ -1004,6 +1020,8 @@  bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
 	if (data->sensor_->init())
 		return false;
 
+	data->sensorFormats_ = populateSensorFormats(data->sensor_);
+
 	ipa::RPi::SensorConfig sensorConfig;
 	if (data->loadIPA(&sensorConfig)) {
 		LOG(RPI, Error) << "Failed to load a suitable IPA library";
@@ -1030,6 +1048,11 @@  bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
 			return false;
 	}
 
+	if (!(data->unicam_[Unicam::Image].dev()->caps().device_caps() & V4L2_CAP_IO_MC)) {
+		LOG(RPI, Error) << "Unicam driver did not advertise V4L2_CAP_IO_MC, please update your kernel!";
+		return false;
+	}
+
 	/*
 	 * Setup our delayed control writer with the sensor default
 	 * gain and exposure delays. Mark VBLANK for priority write.
@@ -1039,7 +1062,7 @@  bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
 		{ V4L2_CID_EXPOSURE, { sensorConfig.exposureDelay, false } },
 		{ V4L2_CID_VBLANK, { sensorConfig.vblankDelay, true } }
 	};
-	data->delayedCtrls_ = std::make_unique<DelayedControls>(data->unicam_[Unicam::Image].dev(), params);
+	data->delayedCtrls_ = std::make_unique<DelayedControls>(data->sensor_->device(), params);
 	data->sensorMetadata_ = sensorConfig.sensorMetadata;
 
 	/* Register the controls that the Raspberry Pi IPA can handle. */
@@ -1066,15 +1089,14 @@  bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
 	 * As part of answering the final question, we reset the camera to
 	 * no transform at all.
 	 */
-
-	V4L2VideoDevice *dev = data->unicam_[Unicam::Image].dev();
-	const struct v4l2_query_ext_ctrl *hflipCtrl = dev->controlInfo(V4L2_CID_HFLIP);
+	const V4L2Subdevice *sensor = data->sensor_->device();
+	const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);
 	if (hflipCtrl) {
 		/* We assume it will support vflips too... */
 		data->supportsFlips_ = true;
 		data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;
 
-		ControlList ctrls(dev->controls());
+		ControlList ctrls(data->sensor_->controls());
 		ctrls.set(V4L2_CID_HFLIP, 0);
 		ctrls.set(V4L2_CID_VFLIP, 0);
 		data->setSensorControls(ctrls);
@@ -1082,9 +1104,8 @@  bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)
 
 	/* Look for a valid Bayer format. */
 	BayerFormat bayerFormat;
-	for (const auto &iter : dev->formats()) {
-		V4L2PixelFormat v4l2Format = iter.first;
-		bayerFormat = BayerFormat::fromV4L2PixelFormat(v4l2Format);
+	for (const auto &iter : data->sensorFormats_) {
+		bayerFormat = BayerFormat::fromMbusCode(iter.first);
 		if (bayerFormat.isValid())
 			break;
 	}
@@ -1271,7 +1292,7 @@  int RPiCameraData::configureIPA(const CameraConfiguration *config)
 		}
 	}
 
-	entityControls.emplace(0, unicam_[Unicam::Image].dev()->controls());
+	entityControls.emplace(0, sensor_->controls());
 	entityControls.emplace(1, isp_[Isp::Input].dev()->controls());
 
 	/* Always send the user transform to the IPA. */
@@ -1406,10 +1427,10 @@  void RPiCameraData::setSensorControls(ControlList &controls)
 		ControlList vblank_ctrl;
 
 		vblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));
-		unicam_[Unicam::Image].dev()->setControls(&vblank_ctrl);
+		sensor_->setControls(&vblank_ctrl);
 	}
 
-	unicam_[Unicam::Image].dev()->setControls(&controls);
+	sensor_->setControls(&controls);
 }
 
 void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)