[{"id":26960,"web_url":"https://patchwork.libcamera.org/comment/26960/","msgid":"<gn6jv3ymkzfdp3s5ukbdf2toon22cocctz6rmz5rxi6qo2dr77@fmajlub3whx6>","date":"2023-04-27T10:58:27","subject":"Re: [libcamera-devel] [PATCH 11/13] pipeline: raspberrypi:\n\tIntroduce PipelineHandlerBase class","submitter":{"id":143,"url":"https://patchwork.libcamera.org/api/people/143/","name":"Jacopo Mondi","email":"jacopo.mondi@ideasonboard.com"},"content":"Hi Naush\n\nOn Wed, Apr 26, 2023 at 02:10:55PM +0100, Naushir Patuck via libcamera-devel wrote:\n> Create a new PipelineHandlerBase class that handles general purpose\n> housekeeping duties for the Raspberry Pi pipeline handler. The\n> implementation of the new class is essentially pulled from the existing\n> pipeline/rpi/vc4/raspberrypi.cpp file with a small amount of\n> refactoring.\n>\n> Create a derived PipelineHandlerVc4 class from PipelineHandlerBase that\n> handles the VC4 pipeline specific tasks of the pipeline handler. Again,\n> code for this class implementation is taken from the existing\n> pipeline/rpi/vc4/raspberrypi.cpp with a small amount of\n> refactoring.\n>\n> The goal of this change is to allow third parties to implement their own\n> pipeline handlers running on the Raspberry Pi without duplicating all of\n> the pipeline handler housekeeping tasks.\n>\n> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>\n\nReviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>\n\nThanks\n   j\n\n> ---\n>  src/ipa/rpi/vc4/vc4.cpp                       |    2 +-\n>  src/libcamera/pipeline/rpi/common/meson.build |    1 +\n>  .../pipeline/rpi/common/pipeline_base.cpp     | 1447 ++++++++++\n>  .../pipeline/rpi/common/pipeline_base.h       |  276 ++\n>  .../pipeline/rpi/vc4/data/example.yaml        |    4 +-\n>  src/libcamera/pipeline/rpi/vc4/meson.build    |    2 +-\n>  .../pipeline/rpi/vc4/raspberrypi.cpp          | 2428 -----------------\n>  src/libcamera/pipeline/rpi/vc4/vc4.cpp        | 1001 +++++++\n>  8 files changed, 2729 
insertions(+), 2432 deletions(-)\n>  create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n>  create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.h\n>  delete mode 100644 src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp\n>  create mode 100644 src/libcamera/pipeline/rpi/vc4/vc4.cpp\n>\n> diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp\n> index 0e022c2aeed3..f3d83b2afadf 100644\n> --- a/src/ipa/rpi/vc4/vc4.cpp\n> +++ b/src/ipa/rpi/vc4/vc4.cpp\n> @@ -538,7 +538,7 @@ extern \"C\" {\n>  const struct IPAModuleInfo ipaModuleInfo = {\n>  \tIPA_MODULE_API_VERSION,\n>  \t1,\n> -\t\"PipelineHandlerRPi\",\n> +\t\"PipelineHandlerVc4\",\n>  \t\"vc4\",\n>  };\n>\n> diff --git a/src/libcamera/pipeline/rpi/common/meson.build b/src/libcamera/pipeline/rpi/common/meson.build\n> index 1dec6d3d028b..f8ea790b42a1 100644\n> --- a/src/libcamera/pipeline/rpi/common/meson.build\n> +++ b/src/libcamera/pipeline/rpi/common/meson.build\n> @@ -2,6 +2,7 @@\n>\n>  libcamera_sources += files([\n>      'delayed_controls.cpp',\n> +    'pipeline_base.cpp',\n>      'rpi_stream.cpp',\n>  ])\n>\n> diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n> new file mode 100644\n> index 000000000000..012766b38c32\n> --- /dev/null\n> +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n> @@ -0,0 +1,1447 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> + *\n> + * pipeline_base.cpp - Pipeline handler base class for Raspberry Pi devices\n> + */\n> +\n> +#include \"pipeline_base.h\"\n> +\n> +#include <chrono>\n> +\n> +#include <linux/media-bus-format.h>\n> +#include <linux/videodev2.h>\n> +\n> +#include <libcamera/base/file.h>\n> +#include <libcamera/base/utils.h>\n> +\n> +#include <libcamera/formats.h>\n> +#include <libcamera/logging.h>\n> +#include <libcamera/property_ids.h>\n> +\n> +#include \"libcamera/internal/camera_lens.h\"\n> 
+#include \"libcamera/internal/ipa_manager.h\"\n> +#include \"libcamera/internal/v4l2_subdevice.h\"\n> +\n> +using namespace std::chrono_literals;\n> +\n> +namespace libcamera {\n> +\n> +using namespace RPi;\n> +\n> +LOG_DEFINE_CATEGORY(RPI)\n> +\n> +namespace {\n> +\n> +constexpr unsigned int defaultRawBitDepth = 12;\n> +\n> +bool isRaw(const PixelFormat &pixFmt)\n> +{\n> +\t/* This test works for both Bayer and raw mono formats. */\n> +\treturn BayerFormat::fromPixelFormat(pixFmt).isValid();\n> +}\n> +\n> +PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code,\n> +\t\t\t\t  BayerFormat::Packing packingReq)\n> +{\n> +\tBayerFormat bayer = BayerFormat::fromMbusCode(mbus_code);\n> +\n> +\tASSERT(bayer.isValid());\n> +\n> +\tbayer.packing = packingReq;\n> +\tPixelFormat pix = bayer.toPixelFormat();\n> +\n> +\t/*\n> +\t * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed\n> +\t * variants. So if the PixelFormat returns as invalid, use the non-packed\n> +\t * conversion instead.\n> +\t */\n> +\tif (!pix.isValid()) {\n> +\t\tbayer.packing = BayerFormat::Packing::None;\n> +\t\tpix = bayer.toPixelFormat();\n> +\t}\n> +\n> +\treturn pix;\n> +}\n> +\n> +SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)\n> +{\n> +\tSensorFormats formats;\n> +\n> +\tfor (auto const mbusCode : sensor->mbusCodes())\n> +\t\tformats.emplace(mbusCode, sensor->sizes(mbusCode));\n> +\n> +\treturn formats;\n> +}\n> +\n> +bool isMonoSensor(std::unique_ptr<CameraSensor> &sensor)\n> +{\n> +\tunsigned int mbusCode = sensor->mbusCodes()[0];\n> +\tconst BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode);\n> +\n> +\treturn bayer.order == BayerFormat::Order::MONO;\n> +}\n> +\n> +double scoreFormat(double desired, double actual)\n> +{\n> +\tdouble score = desired - actual;\n> +\t/* Smaller desired dimensions are preferred. */\n> +\tif (score < 0.0)\n> +\t\tscore = (-score) / 8;\n> +\t/* Penalise non-exact matches. 
*/\n> +\tif (actual != desired)\n> +\t\tscore *= 2;\n> +\n> +\treturn score;\n> +}\n> +\n> +V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth)\n> +{\n> +\tdouble bestScore = std::numeric_limits<double>::max(), score;\n> +\tV4L2SubdeviceFormat bestFormat;\n> +\tbestFormat.colorSpace = ColorSpace::Raw;\n> +\n> +\tconstexpr float penaltyAr = 1500.0;\n> +\tconstexpr float penaltyBitDepth = 500.0;\n> +\n> +\t/* Calculate the closest/best mode from the user requested size. */\n> +\tfor (const auto &iter : formatsMap) {\n> +\t\tconst unsigned int mbusCode = iter.first;\n> +\t\tconst PixelFormat format = mbusCodeToPixelFormat(mbusCode,\n> +\t\t\t\t\t\t\t\t BayerFormat::Packing::None);\n> +\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(format);\n> +\n> +\t\tfor (const Size &size : iter.second) {\n> +\t\t\tdouble reqAr = static_cast<double>(req.width) / req.height;\n> +\t\t\tdouble fmtAr = static_cast<double>(size.width) / size.height;\n> +\n> +\t\t\t/* Score the dimensions for closeness. */\n> +\t\t\tscore = scoreFormat(req.width, size.width);\n> +\t\t\tscore += scoreFormat(req.height, size.height);\n> +\t\t\tscore += penaltyAr * scoreFormat(reqAr, fmtAr);\n> +\n> +\t\t\t/* Add any penalties... this is not an exact science! 
*/\n> +\t\t\tscore += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth;\n> +\n> +\t\t\tif (score <= bestScore) {\n> +\t\t\t\tbestScore = score;\n> +\t\t\t\tbestFormat.mbus_code = mbusCode;\n> +\t\t\t\tbestFormat.size = size;\n> +\t\t\t}\n> +\n> +\t\t\tLOG(RPI, Debug) << \"Format: \" << size\n> +\t\t\t\t\t<< \" fmt \" << format\n> +\t\t\t\t\t<< \" Score: \" << score\n> +\t\t\t\t\t<< \" (best \" << bestScore << \")\";\n> +\t\t}\n> +\t}\n> +\n> +\treturn bestFormat;\n> +}\n> +\n> +const std::vector<ColorSpace> validColorSpaces = {\n> +\tColorSpace::Sycc,\n> +\tColorSpace::Smpte170m,\n> +\tColorSpace::Rec709\n> +};\n> +\n> +std::optional<ColorSpace> findValidColorSpace(const ColorSpace &colourSpace)\n> +{\n> +\tfor (auto cs : validColorSpaces) {\n> +\t\tif (colourSpace.primaries == cs.primaries &&\n> +\t\t    colourSpace.transferFunction == cs.transferFunction)\n> +\t\t\treturn cs;\n> +\t}\n> +\n> +\treturn std::nullopt;\n> +}\n> +\n> +bool isRgb(const PixelFormat &pixFmt)\n> +{\n> +\tconst PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);\n> +\treturn info.colourEncoding == PixelFormatInfo::ColourEncodingRGB;\n> +}\n> +\n> +bool isYuv(const PixelFormat &pixFmt)\n> +{\n> +\t/* The code below would return true for raw mono streams, so weed those out first. */\n> +\tif (isRaw(pixFmt))\n> +\t\treturn false;\n> +\n> +\tconst PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);\n> +\treturn info.colourEncoding == PixelFormatInfo::ColourEncodingYUV;\n> +}\n> +\n> +} /* namespace */\n> +\n> +/*\n> + * Raspberry Pi drivers expect the following colour spaces:\n> + * - V4L2_COLORSPACE_RAW for raw streams.\n> + * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for\n> + *   non-raw streams. 
Other fields such as transfer function, YCbCr encoding and\n> + *   quantisation are not used.\n> + *\n> + * The libcamera colour spaces that we wish to use corresponding to these are therefore:\n> + * - ColorSpace::Raw for V4L2_COLORSPACE_RAW\n> + * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG\n> + * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M\n> + * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709\n> + */\n> +CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags)\n> +{\n> +\tStatus status = Valid;\n> +\tyuvColorSpace_.reset();\n> +\n> +\tfor (auto cfg : config_) {\n> +\t\t/* First fix up raw streams to have the \"raw\" colour space. */\n> +\t\tif (isRaw(cfg.pixelFormat)) {\n> +\t\t\t/* If there was no value here, that doesn't count as \"adjusted\". */\n> +\t\t\tif (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw)\n> +\t\t\t\tstatus = Adjusted;\n> +\t\t\tcfg.colorSpace = ColorSpace::Raw;\n> +\t\t\tcontinue;\n> +\t\t}\n> +\n> +\t\t/* Next we need to find our shared colour space. The first valid one will do. */\n> +\t\tif (cfg.colorSpace && !yuvColorSpace_)\n> +\t\t\tyuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value());\n> +\t}\n> +\n> +\t/* If no colour space was given anywhere, choose sYCC. */\n> +\tif (!yuvColorSpace_)\n> +\t\tyuvColorSpace_ = ColorSpace::Sycc;\n> +\n> +\t/* Note the version of this that any RGB streams will have to use. */\n> +\trgbColorSpace_ = yuvColorSpace_;\n> +\trgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None;\n> +\trgbColorSpace_->range = ColorSpace::Range::Full;\n> +\n> +\t/* Go through the streams again and force everyone to the same colour space. */\n> +\tfor (auto cfg : config_) {\n> +\t\tif (cfg.colorSpace == ColorSpace::Raw)\n> +\t\t\tcontinue;\n> +\n> +\t\tif (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) {\n> +\t\t\t/* Again, no value means \"not adjusted\". 
*/\n> +\t\t\tif (cfg.colorSpace)\n> +\t\t\t\tstatus = Adjusted;\n> +\t\t\tcfg.colorSpace = yuvColorSpace_;\n> +\t\t}\n> +\t\tif (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) {\n> +\t\t\t/* Be nice, and let the YUV version count as non-adjusted too. */\n> +\t\t\tif (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_)\n> +\t\t\t\tstatus = Adjusted;\n> +\t\t\tcfg.colorSpace = rgbColorSpace_;\n> +\t\t}\n> +\t}\n> +\n> +\treturn status;\n> +}\n> +\n> +CameraConfiguration::Status RPiCameraConfiguration::validate()\n> +{\n> +\tStatus status = Valid;\n> +\n> +\tif (config_.empty())\n> +\t\treturn Invalid;\n> +\n> +\tstatus = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);\n> +\n> +\t/*\n> +\t * Validate the requested transform against the sensor capabilities and\n> +\t * rotation and store the final combined transform that configure() will\n> +\t * need to apply to the sensor to save us working it out again.\n> +\t */\n> +\tTransform requestedTransform = transform;\n> +\tcombinedTransform_ = data_->sensor_->validateTransform(&transform);\n> +\tif (transform != requestedTransform)\n> +\t\tstatus = Adjusted;\n> +\n> +\tstd::vector<CameraData::StreamParams> rawStreams, outStreams;\n> +\tfor (const auto &[index, cfg] : utils::enumerate(config_)) {\n> +\t\tif (isRaw(cfg.pixelFormat))\n> +\t\t\trawStreams.emplace_back(index, &cfg);\n> +\t\telse\n> +\t\t\toutStreams.emplace_back(index, &cfg);\n> +\t}\n> +\n> +\t/* Sort the streams so the highest resolution is first. */\n> +\tstd::sort(rawStreams.begin(), rawStreams.end(),\n> +\t\t  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> +\n> +\tstd::sort(outStreams.begin(), outStreams.end(),\n> +\t\t  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> +\n> +\t/* Do any platform specific fixups. */\n> +\tstatus = data_->platformValidate(rawStreams, outStreams);\n> +\tif (status == Invalid)\n> +\t\treturn Invalid;\n> +\n> +\t/* Further fixups on the RAW streams. 
*/\n> +\tfor (auto &raw : rawStreams) {\n> +\t\tStreamConfiguration &cfg = config_.at(raw.index);\n> +\t\tV4L2DeviceFormat rawFormat;\n> +\n> +\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat);\n> +\t\tunsigned int bitDepth = info.isValid() ? info.bitsPerPixel : defaultRawBitDepth;\n> +\t\tV4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth);\n> +\n> +\t\trawFormat.size = sensorFormat.size;\n> +\t\trawFormat.fourcc = raw.dev->toV4L2PixelFormat(cfg.pixelFormat);\n> +\n> +\t\tint ret = raw.dev->tryFormat(&rawFormat);\n> +\t\tif (ret)\n> +\t\t\treturn Invalid;\n> +\t\t/*\n> +\t\t * Some sensors change their Bayer order when they are h-flipped\n> +\t\t * or v-flipped, according to the transform. If this one does, we\n> +\t\t * must advertise the transformed Bayer order in the raw stream.\n> +\t\t * Note how we must fetch the \"native\" (i.e. untransformed) Bayer\n> +\t\t * order, because the sensor may currently be flipped!\n> +\t\t */\n> +\t\tV4L2PixelFormat fourcc = rawFormat.fourcc;\n> +\t\tif (data_->flipsAlterBayerOrder_) {\n> +\t\t\tBayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);\n> +\t\t\tbayer.order = data_->nativeBayerOrder_;\n> +\t\t\tbayer = bayer.transform(combinedTransform_);\n> +\t\t\tfourcc = bayer.toV4L2PixelFormat();\n> +\t\t}\n> +\n> +\t\tPixelFormat inputPixFormat = fourcc.toPixelFormat();\n> +\t\tif (raw.cfg->size != rawFormat.size || raw.cfg->pixelFormat != inputPixFormat) {\n> +\t\t\traw.cfg->size = rawFormat.size;\n> +\t\t\traw.cfg->pixelFormat = inputPixFormat;\n> +\t\t\tstatus = Adjusted;\n> +\t\t}\n> +\n> +\t\traw.cfg->stride = rawFormat.planes[0].bpl;\n> +\t\traw.cfg->frameSize = rawFormat.planes[0].size;\n> +\t}\n> +\n> +\t/* Further fixups on the ISP output streams. 
*/\n> +\tfor (auto &out : outStreams) {\n> +\t\tStreamConfiguration &cfg = config_.at(out.index);\n> +\t\tPixelFormat &cfgPixFmt = cfg.pixelFormat;\n> +\t\tV4L2VideoDevice::Formats fmts = out.dev->formats();\n> +\n> +\t\tif (fmts.find(out.dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) {\n> +\t\t\t/* If we cannot find a native format, use a default one. */\n> +\t\t\tcfgPixFmt = formats::NV12;\n> +\t\t\tstatus = Adjusted;\n> +\t\t}\n> +\n> +\t\tV4L2DeviceFormat format;\n> +\t\tformat.fourcc = out.dev->toV4L2PixelFormat(cfg.pixelFormat);\n> +\t\tformat.size = cfg.size;\n> +\t\t/* We want to send the associated YCbCr info through to the driver. */\n> +\t\tformat.colorSpace = yuvColorSpace_;\n> +\n> +\t\tLOG(RPI, Debug)\n> +\t\t\t<< \"Try color space \" << ColorSpace::toString(cfg.colorSpace);\n> +\n> +\t\tint ret = out.dev->tryFormat(&format);\n> +\t\tif (ret)\n> +\t\t\treturn Invalid;\n> +\n> +\t\t/*\n> +\t\t * But for RGB streams, the YCbCr info gets overwritten on the way back\n> +\t\t * so we must check against what the stream cfg says, not what we actually\n> +\t\t * requested (which carefully included the YCbCr info)!\n> +\t\t */\n> +\t\tif (cfg.colorSpace != format.colorSpace) {\n> +\t\t\tstatus = Adjusted;\n> +\t\t\tLOG(RPI, Debug)\n> +\t\t\t\t<< \"Color space changed from \"\n> +\t\t\t\t<< ColorSpace::toString(cfg.colorSpace) << \" to \"\n> +\t\t\t\t<< ColorSpace::toString(format.colorSpace);\n> +\t\t}\n> +\n> +\t\tcfg.colorSpace = format.colorSpace;\n> +\t\tcfg.stride = format.planes[0].bpl;\n> +\t\tcfg.frameSize = format.planes[0].size;\n> +\t}\n> +\n> +\treturn status;\n> +}\n> +\n> +V4L2DeviceFormat PipelineHandlerBase::toV4L2DeviceFormat(const V4L2VideoDevice *dev,\n> +\t\t\t\t\t\t\t const V4L2SubdeviceFormat &format,\n> +\t\t\t\t\t\t\t BayerFormat::Packing packingReq)\n> +{\n> +\tunsigned int mbus_code = format.mbus_code;\n> +\tconst PixelFormat pix = mbusCodeToPixelFormat(mbus_code, packingReq);\n> +\tV4L2DeviceFormat deviceFormat;\n> +\n> 
+\tdeviceFormat.fourcc = dev->toV4L2PixelFormat(pix);\n> +\tdeviceFormat.size = format.size;\n> +\tdeviceFormat.colorSpace = format.colorSpace;\n> +\treturn deviceFormat;\n> +}\n> +\n> +std::unique_ptr<CameraConfiguration>\n> +PipelineHandlerBase::generateConfiguration(Camera *camera, const StreamRoles &roles)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tstd::unique_ptr<CameraConfiguration> config =\n> +\t\tstd::make_unique<RPiCameraConfiguration>(data);\n> +\tV4L2SubdeviceFormat sensorFormat;\n> +\tunsigned int bufferCount;\n> +\tPixelFormat pixelFormat;\n> +\tV4L2VideoDevice::Formats fmts;\n> +\tSize size;\n> +\tstd::optional<ColorSpace> colorSpace;\n> +\n> +\tif (roles.empty())\n> +\t\treturn config;\n> +\n> +\tSize sensorSize = data->sensor_->resolution();\n> +\tfor (const StreamRole role : roles) {\n> +\t\tswitch (role) {\n> +\t\tcase StreamRole::Raw:\n> +\t\t\tsize = sensorSize;\n> +\t\t\tsensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth);\n> +\t\t\tpixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code,\n> +\t\t\t\t\t\t\t    BayerFormat::Packing::CSI2);\n> +\t\t\tASSERT(pixelFormat.isValid());\n> +\t\t\tcolorSpace = ColorSpace::Raw;\n> +\t\t\tbufferCount = 2;\n> +\t\t\tbreak;\n> +\n> +\t\tcase StreamRole::StillCapture:\n> +\t\t\tfmts = data->ispFormats();\n> +\t\t\tpixelFormat = formats::NV12;\n> +\t\t\t/*\n> +\t\t\t * Still image codecs usually expect the sYCC color space.\n> +\t\t\t * Even RGB codecs will be fine as the RGB we get with the\n> +\t\t\t * sYCC color space is the same as sRGB.\n> +\t\t\t */\n> +\t\t\tcolorSpace = ColorSpace::Sycc;\n> +\t\t\t/* Return the largest sensor resolution. */\n> +\t\t\tsize = sensorSize;\n> +\t\t\tbufferCount = 1;\n> +\t\t\tbreak;\n> +\n> +\t\tcase StreamRole::VideoRecording:\n> +\t\t\t/*\n> +\t\t\t * The colour denoise algorithm requires the analysis\n> +\t\t\t * image, produced by the second ISP output, to be in\n> +\t\t\t * YUV420 format. 
Select this format as the default, to\n> +\t\t\t * maximize chances that it will be picked by\n> +\t\t\t * applications and enable usage of the colour denoise\n> +\t\t\t * algorithm.\n> +\t\t\t */\n> +\t\t\tfmts = data->ispFormats();\n> +\t\t\tpixelFormat = formats::YUV420;\n> +\t\t\t/*\n> +\t\t\t * Choose a color space appropriate for video recording.\n> +\t\t\t * Rec.709 will be a good default for HD resolutions.\n> +\t\t\t */\n> +\t\t\tcolorSpace = ColorSpace::Rec709;\n> +\t\t\tsize = { 1920, 1080 };\n> +\t\t\tbufferCount = 4;\n> +\t\t\tbreak;\n> +\n> +\t\tcase StreamRole::Viewfinder:\n> +\t\t\tfmts = data->ispFormats();\n> +\t\t\tpixelFormat = formats::ARGB8888;\n> +\t\t\tcolorSpace = ColorSpace::Sycc;\n> +\t\t\tsize = { 800, 600 };\n> +\t\t\tbufferCount = 4;\n> +\t\t\tbreak;\n> +\n> +\t\tdefault:\n> +\t\t\tLOG(RPI, Error) << \"Requested stream role not supported: \"\n> +\t\t\t\t\t<< role;\n> +\t\t\treturn nullptr;\n> +\t\t}\n> +\n> +\t\tstd::map<PixelFormat, std::vector<SizeRange>> deviceFormats;\n> +\t\tif (role == StreamRole::Raw) {\n> +\t\t\t/* Translate the MBUS codes to a PixelFormat. */\n> +\t\t\tfor (const auto &format : data->sensorFormats_) {\n> +\t\t\t\tPixelFormat pf = mbusCodeToPixelFormat(format.first,\n> +\t\t\t\t\t\t\t\t       BayerFormat::Packing::CSI2);\n> +\t\t\t\tif (pf.isValid())\n> +\t\t\t\t\tdeviceFormats.emplace(std::piecewise_construct, std::forward_as_tuple(pf),\n> +\t\t\t\t\t\t\t      std::forward_as_tuple(format.second.begin(), format.second.end()));\n> +\t\t\t}\n> +\t\t} else {\n> +\t\t\t/*\n> +\t\t\t * Translate the V4L2PixelFormat to PixelFormat. 
Note that we\n> +\t\t\t * limit the recommended largest ISP output size to match the\n> +\t\t\t * sensor resolution.\n> +\t\t\t */\n> +\t\t\tfor (const auto &format : fmts) {\n> +\t\t\t\tPixelFormat pf = format.first.toPixelFormat();\n> +\t\t\t\tif (pf.isValid()) {\n> +\t\t\t\t\tconst SizeRange &ispSizes = format.second[0];\n> +\t\t\t\t\tdeviceFormats[pf].emplace_back(ispSizes.min, sensorSize,\n> +\t\t\t\t\t\t\t\t       ispSizes.hStep, ispSizes.vStep);\n> +\t\t\t\t}\n> +\t\t\t}\n> +\t\t}\n> +\n> +\t\t/* Add the stream format based on the device node used for the use case. */\n> +\t\tStreamFormats formats(deviceFormats);\n> +\t\tStreamConfiguration cfg(formats);\n> +\t\tcfg.size = size;\n> +\t\tcfg.pixelFormat = pixelFormat;\n> +\t\tcfg.colorSpace = colorSpace;\n> +\t\tcfg.bufferCount = bufferCount;\n> +\t\tconfig->addConfiguration(cfg);\n> +\t}\n> +\n> +\tconfig->validate();\n> +\n> +\treturn config;\n> +}\n> +\n> +int PipelineHandlerBase::configure(Camera *camera, CameraConfiguration *config)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tint ret;\n> +\n> +\t/* Start by freeing all buffers and reset the stream states. */\n> +\tdata->freeBuffers();\n> +\tfor (auto const stream : data->streams_)\n> +\t\tstream->setExternal(false);\n> +\n> +\tstd::vector<CameraData::StreamParams> rawStreams, ispStreams;\n> +\tstd::optional<BayerFormat::Packing> packing;\n> +\tunsigned int bitDepth = defaultRawBitDepth;\n> +\n> +\tfor (unsigned i = 0; i < config->size(); i++) {\n> +\t\tStreamConfiguration *cfg = &config->at(i);\n> +\n> +\t\tif (isRaw(cfg->pixelFormat))\n> +\t\t\trawStreams.emplace_back(i, cfg);\n> +\t\telse\n> +\t\t\tispStreams.emplace_back(i, cfg);\n> +\t}\n> +\n> +\t/* Sort the streams so the highest resolution is first. 
*/\n> +\tstd::sort(rawStreams.begin(), rawStreams.end(),\n> +\t\t  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> +\n> +\tstd::sort(ispStreams.begin(), ispStreams.end(),\n> +\t\t  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> +\n> +\t/*\n> +\t * Calculate the best sensor mode we can use based on the user's request,\n> +\t * and apply it to the sensor with the cached transform, if any.\n> +\t *\n> +\t * If we have been given a RAW stream, use that size for setting up the sensor.\n> +\t */\n> +\tif (!rawStreams.empty()) {\n> +\t\tBayerFormat bayerFormat = BayerFormat::fromPixelFormat(rawStreams[0].cfg->pixelFormat);\n> +\t\t/* Replace the user requested packing/bit-depth. */\n> +\t\tpacking = bayerFormat.packing;\n> +\t\tbitDepth = bayerFormat.bitDepth;\n> +\t}\n> +\n> +\tV4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_,\n> +\t\t\t\t\t\t\t  rawStreams.empty() ? ispStreams[0].cfg->size\n> +\t\t\t\t\t\t\t\t\t     : rawStreams[0].cfg->size,\n> +\t\t\t\t\t\t\t  bitDepth);\n> +\t/* Apply any cached transform. */\n> +\tconst RPiCameraConfiguration *rpiConfig = static_cast<const RPiCameraConfiguration *>(config);\n> +\n> +\t/* Then apply the format on the sensor. */\n> +\tret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/*\n> +\t * Platform specific internal stream configuration. 
This also assigns\n> +\t * external streams which get configured below.\n> +\t */\n> +\tret = data->platformConfigure(sensorFormat, packing, rawStreams, ispStreams);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tipa::RPi::ConfigResult result;\n> +\tret = data->configureIPA(config, &result);\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Failed to configure the IPA: \" << ret;\n> +\t\treturn ret;\n> +\t}\n> +\n> +\t/*\n> +\t * Set the scaler crop to the value we are using (scaled to native sensor\n> +\t * coordinates).\n> +\t */\n> +\tdata->scalerCrop_ = data->scaleIspCrop(data->ispCrop_);\n> +\n> +\t/*\n> +\t * Update the ScalerCropMaximum to the correct value for this camera mode.\n> +\t * For us, it's the same as the \"analogue crop\".\n> +\t *\n> +\t * \\todo Make this property the ScalerCrop maximum value when dynamic\n> +\t * controls are available and set it at validate() time\n> +\t */\n> +\tdata->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop);\n> +\n> +\t/* Store the mode sensitivity for the application. */\n> +\tdata->properties_.set(properties::SensorSensitivity, result.modeSensitivity);\n> +\n> +\t/* Update the controls that the Raspberry Pi IPA can handle. */\n> +\tControlInfoMap::Map ctrlMap;\n> +\tfor (auto const &c : result.controlInfo)\n> +\t\tctrlMap.emplace(c.first, c.second);\n> +\n> +\t/* Add the ScalerCrop control limits based on the current mode. */\n> +\tRectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_));\n> +\tctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, data->scalerCrop_);\n> +\n> +\tdata->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap());\n> +\n> +\t/* Setup the Video Mux/Bridge entities. 
*/\n> +\tfor (auto &[device, link] : data->bridgeDevices_) {\n> +\t\t/*\n> +\t\t * Start by disabling all the sink pad links on the devices in the\n> +\t\t * cascade, with the exception of the link connecting the device.\n> +\t\t */\n> +\t\tfor (const MediaPad *p : device->entity()->pads()) {\n> +\t\t\tif (!(p->flags() & MEDIA_PAD_FL_SINK))\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tfor (MediaLink *l : p->links()) {\n> +\t\t\t\tif (l != link)\n> +\t\t\t\t\tl->setEnabled(false);\n> +\t\t\t}\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * Next, enable the entity -> entity links, and setup the pad format.\n> +\t\t *\n> +\t\t * \\todo Some bridge devices may change the media bus code, so we\n> +\t\t * ought to read the source pad format and propagate it to the sink pad.\n> +\t\t */\n> +\t\tlink->setEnabled(true);\n> +\t\tconst MediaPad *sinkPad = link->sink();\n> +\t\tret = device->setFormat(sinkPad->index(), &sensorFormat);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to set format on \" << device->entity()->name()\n> +\t\t\t\t\t<< \" pad \" << sinkPad->index()\n> +\t\t\t\t\t<< \" with format \" << sensorFormat\n> +\t\t\t\t\t<< \": \" << ret;\n> +\t\t\treturn ret;\n> +\t\t}\n> +\n> +\t\tLOG(RPI, Debug) << \"Configured media link on device \" << device->entity()->name()\n> +\t\t\t\t<< \" on pad \" << sinkPad->index();\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +int PipelineHandlerBase::exportFrameBuffers([[maybe_unused]] Camera *camera, libcamera::Stream *stream,\n> +\t\t\t\t\t    std::vector<std::unique_ptr<FrameBuffer>> *buffers)\n> +{\n> +\tRPi::Stream *s = static_cast<RPi::Stream *>(stream);\n> +\tunsigned int count = stream->configuration().bufferCount;\n> +\tint ret = s->dev()->exportBuffers(count, buffers);\n> +\n> +\ts->setExportedBuffers(buffers);\n> +\n> +\treturn ret;\n> +}\n> +\n> +int PipelineHandlerBase::start(Camera *camera, const ControlList *controls)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tint ret;\n> +\n> +\t/* Check if a ScalerCrop 
control was specified. */\n> +\tif (controls)\n> +\t\tdata->calculateScalerCrop(*controls);\n> +\n> +\t/* Start the IPA. */\n> +\tipa::RPi::StartResult result;\n> +\tdata->ipa_->start(controls ? *controls : ControlList{ controls::controls },\n> +\t\t\t  &result);\n> +\n> +\t/* Apply any gain/exposure settings that the IPA may have passed back. */\n> +\tif (!result.controls.empty())\n> +\t\tdata->setSensorControls(result.controls);\n> +\n> +\t/* Configure the number of dropped frames required on startup. */\n> +\tdata->dropFrameCount_ = data->config_.disableStartupFrameDrops ? 0\n> +\t\t\t\t\t\t\t\t       : result.dropFrameCount;\n> +\n> +\tfor (auto const stream : data->streams_)\n> +\t\tstream->resetBuffers();\n> +\n> +\tif (!data->buffersAllocated_) {\n> +\t\t/* Allocate buffers for internal pipeline usage. */\n> +\t\tret = prepareBuffers(camera);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to allocate buffers\";\n> +\t\t\tdata->freeBuffers();\n> +\t\t\tstop(camera);\n> +\t\t\treturn ret;\n> +\t\t}\n> +\t\tdata->buffersAllocated_ = true;\n> +\t}\n> +\n> +\t/* We need to set the dropFrameCount_ before queueing buffers. */\n> +\tret = queueAllBuffers(camera);\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Failed to queue buffers\";\n> +\t\tstop(camera);\n> +\t\treturn ret;\n> +\t}\n> +\n> +\t/*\n> +\t * Reset the delayed controls with the gain and exposure values set by\n> +\t * the IPA.\n> +\t */\n> +\tdata->delayedCtrls_->reset(0);\n> +\tdata->state_ = CameraData::State::Idle;\n> +\n> +\t/* Enable SOF event generation. */\n> +\tdata->frontendDevice()->setFrameStartEnabled(true);\n> +\n> +\tdata->platformStart();\n> +\n> +\t/* Start all streams. 
*/\n> +\tfor (auto const stream : data->streams_) {\n> +\t\tret = stream->dev()->streamOn();\n> +\t\tif (ret) {\n> +\t\t\tstop(camera);\n> +\t\t\treturn ret;\n> +\t\t}\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +void PipelineHandlerBase::stopDevice(Camera *camera)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\n> +\tdata->state_ = CameraData::State::Stopped;\n> +\tdata->platformStop();\n> +\n> +\tfor (auto const stream : data->streams_)\n> +\t\tstream->dev()->streamOff();\n> +\n> +\t/* Disable SOF event generation. */\n> +\tdata->frontendDevice()->setFrameStartEnabled(false);\n> +\n> +\tdata->clearIncompleteRequests();\n> +\n> +\t/* Stop the IPA. */\n> +\tdata->ipa_->stop();\n> +}\n> +\n> +void PipelineHandlerBase::releaseDevice(Camera *camera)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tdata->freeBuffers();\n> +}\n> +\n> +int PipelineHandlerBase::queueRequestDevice(Camera *camera, Request *request)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\n> +\tif (!data->isRunning())\n> +\t\treturn -EINVAL;\n> +\n> +\tLOG(RPI, Debug) << \"queueRequestDevice: New request.\";\n> +\n> +\t/* Push all buffers supplied in the Request to the respective streams. */\n> +\tfor (auto stream : data->streams_) {\n> +\t\tif (!stream->isExternal())\n> +\t\t\tcontinue;\n> +\n> +\t\tFrameBuffer *buffer = request->findBuffer(stream);\n> +\t\tif (buffer && !stream->getBufferId(buffer)) {\n> +\t\t\t/*\n> +\t\t\t * This buffer is not recognised, so it must have been allocated\n> +\t\t\t * outside the v4l2 device. Store it in the stream buffer list\n> +\t\t\t * so we can track it.\n> +\t\t\t */\n> +\t\t\tstream->setExternalBuffer(buffer);\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * If no buffer is provided by the request for this stream, we\n> +\t\t * queue a nullptr to the stream to signify that it must use an\n> +\t\t * internally allocated buffer for this capture request. 
This\n> +\t\t * buffer will not be given back to the application, but is used\n> +\t\t * to support the internal pipeline flow.\n> +\t\t *\n> +\t\t * The below queueBuffer() call will do nothing if there are not\n> +\t\t * enough internal buffers allocated, but this will be handled by\n> +\t\t * queuing the request for buffers in the RPiStream object.\n> +\t\t */\n> +\t\tint ret = stream->queueBuffer(buffer);\n> +\t\tif (ret)\n> +\t\t\treturn ret;\n> +\t}\n> +\n> +\t/* Push the request to the back of the queue. */\n> +\tdata->requestQueue_.push(request);\n> +\tdata->handleState();\n> +\n> +\treturn 0;\n> +}\n> +\n> +int PipelineHandlerBase::registerCamera(MediaDevice *frontend, const std::string &frontendName,\n> +\t\t\t\t\tMediaDevice *backend, MediaEntity *sensorEntity)\n> +{\n> +\tstd::unique_ptr<CameraData> cameraData = allocateCameraData();\n> +\tCameraData *data = cameraData.get();\n> +\tint ret;\n> +\n> +\tdata->sensor_ = std::make_unique<CameraSensor>(sensorEntity);\n> +\tif (!data->sensor_)\n> +\t\treturn -EINVAL;\n> +\n> +\tif (data->sensor_->init())\n> +\t\treturn -EINVAL;\n> +\n> +\tdata->sensorFormats_ = populateSensorFormats(data->sensor_);\n> +\n> +\t/*\n> +\t * Enumerate all the Video Mux/Bridge devices across the sensor -> Fr\n> +\t * chain. There may be a cascade of devices in this chain!\n> +\t */\n> +\tMediaLink *link = sensorEntity->getPadByIndex(0)->links()[0];\n> +\tdata->enumerateVideoDevices(link, frontendName);\n> +\n> +\tipa::RPi::InitResult result;\n> +\tif (data->loadIPA(&result)) {\n> +\t\tLOG(RPI, Error) << \"Failed to load a suitable IPA library\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\t/*\n> +\t * Setup our delayed control writer with the sensor default\n> +\t * gain and exposure delays. 
Mark VBLANK for priority write.\n> +\t */\n> +\tstd::unordered_map<uint32_t, RPi::DelayedControls::ControlParams> params = {\n> +\t\t{ V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } },\n> +\t\t{ V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } },\n> +\t\t{ V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } },\n> +\t\t{ V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } }\n> +\t};\n> +\tdata->delayedCtrls_ = std::make_unique<RPi::DelayedControls>(data->sensor_->device(), params);\n> +\tdata->sensorMetadata_ = result.sensorConfig.sensorMetadata;\n> +\n> +\t/* Register initial controls that the Raspberry Pi IPA can handle. */\n> +\tdata->controlInfo_ = std::move(result.controlInfo);\n> +\n> +\t/* Initialize the camera properties. */\n> +\tdata->properties_ = data->sensor_->properties();\n> +\n> +\t/*\n> +\t * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the\n> +\t * sensor of the colour gains. It is defined to be a linear gain where\n> +\t * the default value represents a gain of exactly one.\n> +\t */\n> +\tauto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS);\n> +\tif (it != data->sensor_->controls().end())\n> +\t\tdata->notifyGainsUnity_ = it->second.def().get<int32_t>();\n> +\n> +\t/*\n> +\t * Set a default value for the ScalerCropMaximum property to show\n> +\t * that we support its use, however, initialise it to zero because\n> +\t * it's not meaningful until a camera mode has been chosen.\n> +\t */\n> +\tdata->properties_.set(properties::ScalerCropMaximum, Rectangle{});\n> +\n> +\t/*\n> +\t * We cache two things about the sensor in relation to transforms\n> +\t * (meaning horizontal and vertical flips): if they affect the Bayer\n> +\t * ordering, and what the \"native\" Bayer order is, when no transforms\n> +\t * are applied.\n> +\t *\n> +\t * If flips are supported, verify whether they affect the Bayer\n> +\t * ordering and what the \"native\" Bayer order is when no transforms\n> +\t * are applied.\n> +\t *\n> +\t * We note that the sensor's cached list of supported formats is\n> +\t * already in the \"native\" order, with any flips having been undone.\n> +\t */\n> +\tconst V4L2Subdevice *sensor = data->sensor_->device();\n> +\tconst struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);\n> +\tif (hflipCtrl) {\n> +\t\t/* We assume it will support vflips too... */\n> +\t\tdata->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;\n> +\t}\n> +\n> +\t/* Look for a valid Bayer format. */\n> +\tBayerFormat bayerFormat;\n> +\tfor (const auto &iter : data->sensorFormats_) {\n> +\t\tbayerFormat = BayerFormat::fromMbusCode(iter.first);\n> +\t\tif (bayerFormat.isValid())\n> +\t\t\tbreak;\n> +\t}\n> +\n> +\tif (!bayerFormat.isValid()) {\n> +\t\tLOG(RPI, Error) << \"No Bayer format found\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\tdata->nativeBayerOrder_ = bayerFormat.order;\n> +\n> +\tret = data->loadPipelineConfiguration();\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Unable to load pipeline configuration\";\n> +\t\treturn ret;\n> +\t}\n> +\n> +\tret = platformRegister(cameraData, frontend, backend);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/* Set up the general IPA signal handlers. 
*/\n> +\tdata->frontendDevice()->dequeueTimeout.connect(data, &RPi::CameraData::cameraTimeout);\n> +\tdata->frontendDevice()->frameStart.connect(data, &RPi::CameraData::frameStarted);\n> +\tdata->ipa_->setDelayedControls.connect(data, &CameraData::setDelayedControls);\n> +\tdata->ipa_->setLensControls.connect(data, &CameraData::setLensControls);\n> +\n> +\treturn 0;\n> +}\n> +\n> +void PipelineHandlerBase::mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tstd::vector<IPABuffer> bufferIds;\n> +\t/*\n> +\t * Link the FrameBuffers with the id (key value) in the map stored in\n> +\t * the RPi stream object - along with an identifier mask.\n> +\t *\n> +\t * This will allow us to identify buffers passed between the pipeline\n> +\t * handler and the IPA.\n> +\t */\n> +\tfor (auto const &it : buffers) {\n> +\t\tbufferIds.push_back(IPABuffer(mask | it.first,\n> +\t\t\t\t\t      it.second->planes()));\n> +\t\tdata->bufferIds_.insert(mask | it.first);\n> +\t}\n> +\n> +\tdata->ipa_->mapBuffers(bufferIds);\n> +}\n> +\n> +int PipelineHandlerBase::queueAllBuffers(Camera *camera)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tint ret;\n> +\n> +\tfor (auto const stream : data->streams_) {\n> +\t\tif (!stream->isExternal()) {\n> +\t\t\tret = stream->queueAllBuffers();\n> +\t\t\tif (ret < 0)\n> +\t\t\t\treturn ret;\n> +\t\t} else {\n> +\t\t\t/*\n> +\t\t\t * For external streams, we must queue up a set of internal\n> +\t\t\t * buffers to handle the number of drop frames requested by\n> +\t\t\t * the IPA. 
This is done by passing nullptr in queueBuffer().\n> +\t\t\t *\n> +\t\t\t * The below queueBuffer() call will do nothing if there\n> +\t\t\t * are not enough internal buffers allocated, but this will\n> +\t\t\t * be handled by queuing the request for buffers in the\n> +\t\t\t * RPiStream object.\n> +\t\t\t */\n> +\t\t\tunsigned int i;\n> +\t\t\tfor (i = 0; i < data->dropFrameCount_; i++) {\n> +\t\t\t\tret = stream->queueBuffer(nullptr);\n> +\t\t\t\tif (ret)\n> +\t\t\t\t\treturn ret;\n> +\t\t\t}\n> +\t\t}\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +void CameraData::freeBuffers()\n> +{\n> +\tif (ipa_) {\n> +\t\t/*\n> +\t\t * Copy the buffer ids from the unordered_set to a vector to\n> +\t\t * pass to the IPA.\n> +\t\t */\n> +\t\tstd::vector<unsigned int> bufferIds(bufferIds_.begin(),\n> +\t\t\t\t\t\t    bufferIds_.end());\n> +\t\tipa_->unmapBuffers(bufferIds);\n> +\t\tbufferIds_.clear();\n> +\t}\n> +\n> +\tfor (auto const stream : streams_)\n> +\t\tstream->releaseBuffers();\n> +\n> +\tplatformFreeBuffers();\n> +\n> +\tbuffersAllocated_ = false;\n> +}\n> +\n> +/*\n> + * enumerateVideoDevices() iterates over the Media Controller topology, starting\n> + * at the sensor and finishing at the frontend. For each sensor, CameraData stores\n> + * a unique list of any intermediate video mux or bridge devices connected in a\n> + * cascade, together with the entity-to-entity link.\n> + *\n> + * Entity pad configuration and link enabling happens at the end of configure().\n> + * We first disable all pad links on each entity device in the chain, and then\n> + * selectively enable the specific links that connect the sensor to the frontend\n> + * across all intermediate muxes and bridges.\n> + *\n> + * In the cascaded topology below, if Sensor1 is used, the Mux2 -> Mux1 link\n> + * will be disabled, and Sensor1 -> Mux1 -> Frontend links enabled. 
Alternatively,\n> + * if Sensor3 is used, the Sensor2 -> Mux2 and Sensor1 -> Mux1 links are disabled,\n> + * and Sensor3 -> Mux2 -> Mux1 -> Frontend links are enabled. All other links will\n> + * remain unchanged.\n> + *\n> + *  +----------+\n> + *  |     FE   |\n> + *  +-----^----+\n> + *        |\n> + *    +---+---+\n> + *    | Mux1  |<------+\n> + *    +--^----+       |\n> + *       |            |\n> + * +-----+---+    +---+---+\n> + * | Sensor1 |    |  Mux2 |<--+\n> + * +---------+    +-^-----+   |\n> + *                  |         |\n> + *          +-------+-+   +---+-----+\n> + *          | Sensor2 |   | Sensor3 |\n> + *          +---------+   +---------+\n> + */\n> +void CameraData::enumerateVideoDevices(MediaLink *link, const std::string &frontend)\n> +{\n> +\tconst MediaPad *sinkPad = link->sink();\n> +\tconst MediaEntity *entity = sinkPad->entity();\n> +\tbool frontendFound = false;\n> +\n> +\t/* We only deal with Video Mux and Bridge devices in cascade. */\n> +\tif (entity->function() != MEDIA_ENT_F_VID_MUX &&\n> +\t    entity->function() != MEDIA_ENT_F_VID_IF_BRIDGE)\n> +\t\treturn;\n> +\n> +\t/* Find the source pad for this Video Mux or Bridge device. */\n> +\tconst MediaPad *sourcePad = nullptr;\n> +\tfor (const MediaPad *pad : entity->pads()) {\n> +\t\tif (pad->flags() & MEDIA_PAD_FL_SOURCE) {\n> +\t\t\t/*\n> +\t\t\t * We can only deal with devices that have a single source\n> +\t\t\t * pad. 
If this device has multiple source pads, ignore it\n> +\t\t\t * and this branch in the cascade.\n> +\t\t\t */\n> +\t\t\tif (sourcePad)\n> +\t\t\t\treturn;\n> +\n> +\t\t\tsourcePad = pad;\n> +\t\t}\n> +\t}\n> +\n> +\tLOG(RPI, Debug) << \"Found video mux device \" << entity->name()\n> +\t\t\t<< \" linked to sink pad \" << sinkPad->index();\n> +\n> +\tbridgeDevices_.emplace_back(std::make_unique<V4L2Subdevice>(entity), link);\n> +\tbridgeDevices_.back().first->open();\n> +\n> +\t/*\n> +\t * Iterate through all the sink pad links down the cascade to find any\n> +\t * other Video Mux and Bridge devices.\n> +\t */\n> +\tfor (MediaLink *l : sourcePad->links()) {\n> +\t\tenumerateVideoDevices(l, frontend);\n> +\t\t/* Once we reach the Frontend entity, we are done. */\n> +\t\tif (l->sink()->entity()->name() == frontend) {\n> +\t\t\tfrontendFound = true;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\t/* This identifies the end of our entity enumeration recursion. */\n> +\tif (link->source()->entity()->function() == MEDIA_ENT_F_CAM_SENSOR) {\n> +\t\t/*\n> +\t\t * If the frontend is not at the end of this cascade, we cannot\n> +\t\t * configure this topology automatically, so remove all entity references.\n> +\t\t */\n> +\t\tif (!frontendFound) {\n> +\t\t\tLOG(RPI, Warning) << \"Cannot automatically configure this MC topology!\";\n> +\t\t\tbridgeDevices_.clear();\n> +\t\t}\n> +\t}\n> +}\n> +\n> +int CameraData::loadPipelineConfiguration()\n> +{\n> +\tconfig_ = {\n> +\t\t.disableStartupFrameDrops = false,\n> +\t\t.cameraTimeoutValue = 0,\n> +\t};\n> +\n> +\t/* Initial configuration of the platform, in case no config file is present */\n> +\tplatformPipelineConfigure({});\n> +\n> +\tchar const *configFromEnv = utils::secure_getenv(\"LIBCAMERA_RPI_CONFIG_FILE\");\n> +\tif (!configFromEnv || *configFromEnv == '\\0')\n> +\t\treturn 0;\n> +\n> +\tstd::string filename = std::string(configFromEnv);\n> +\tFile file(filename);\n> +\n> +\tif (!file.open(File::OpenModeFlag::ReadOnly)) {\n> 
+\t\tLOG(RPI, Error) << \"Failed to open configuration file '\" << filename << \"'\";\n> +\t\treturn -EIO;\n> +\t}\n> +\n> +\tLOG(RPI, Info) << \"Using configuration file '\" << filename << \"'\";\n> +\n> +\tstd::unique_ptr<YamlObject> root = YamlParser::parse(file);\n> +\tif (!root) {\n> +\t\tLOG(RPI, Warning) << \"Failed to parse configuration file, using defaults\";\n> +\t\treturn 0;\n> +\t}\n> +\n> +\tstd::optional<double> ver = (*root)[\"version\"].get<double>();\n> +\tif (!ver || *ver != 1.0) {\n> +\t\tLOG(RPI, Error) << \"Unexpected configuration file version reported\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tconst YamlObject &phConfig = (*root)[\"pipeline_handler\"];\n> +\n> +\tconfig_.disableStartupFrameDrops =\n> +\t\tphConfig[\"disable_startup_frame_drops\"].get<bool>(config_.disableStartupFrameDrops);\n> +\n> +\tconfig_.cameraTimeoutValue =\n> +\t\tphConfig[\"camera_timeout_value_ms\"].get<unsigned int>(config_.cameraTimeoutValue);\n> +\n> +\tif (config_.cameraTimeoutValue) {\n> +\t\t/* Disable the IPA signal to control timeout and set the user requested value. 
*/\n> +\t\tipa_->setCameraTimeout.disconnect();\n> +\t\tfrontendDevice()->setDequeueTimeout(config_.cameraTimeoutValue * 1ms);\n> +\t}\n> +\n> +\treturn platformPipelineConfigure(root);\n> +}\n> +\n> +int CameraData::loadIPA(ipa::RPi::InitResult *result)\n> +{\n> +\tipa_ = IPAManager::createIPA<ipa::RPi::IPAProxyRPi>(pipe(), 1, 1);\n> +\n> +\tif (!ipa_)\n> +\t\treturn -ENOENT;\n> +\n> +\t/*\n> +\t * The configuration (tuning file) is made from the sensor name unless\n> +\t * the environment variable overrides it.\n> +\t */\n> +\tstd::string configurationFile;\n> +\tchar const *configFromEnv = utils::secure_getenv(\"LIBCAMERA_RPI_TUNING_FILE\");\n> +\tif (!configFromEnv || *configFromEnv == '\\0') {\n> +\t\tstd::string model = sensor_->model();\n> +\t\tif (isMonoSensor(sensor_))\n> +\t\t\tmodel += \"_mono\";\n> +\t\tconfigurationFile = ipa_->configurationFile(model + \".json\", \"rpi\");\n> +\t} else {\n> +\t\tconfigurationFile = std::string(configFromEnv);\n> +\t}\n> +\n> +\tIPASettings settings(configurationFile, sensor_->model());\n> +\tipa::RPi::InitParams params;\n> +\n> +\tparams.lensPresent = !!sensor_->focusLens();\n> +\tint ret = platformInitIpa(params);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\treturn ipa_->init(settings, params, result);\n> +}\n> +\n> +int CameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result)\n> +{\n> +\tstd::map<unsigned int, ControlInfoMap> entityControls;\n> +\tipa::RPi::ConfigParams params;\n> +\tint ret;\n> +\n> +\tparams.sensorControls = sensor_->controls();\n> +\tif (sensor_->focusLens())\n> +\t\tparams.lensControls = sensor_->focusLens()->controls();\n> +\n> +\tret = platformConfigureIpa(params);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/* We store the IPACameraSensorInfo for digital zoom calculations. 
*/\n> +\tret = sensor_->sensorInfo(&sensorInfo_);\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Failed to retrieve camera sensor info\";\n> +\t\treturn ret;\n> +\t}\n> +\n> +\t/* Always send the user transform to the IPA. */\n> +\tparams.transform = static_cast<unsigned int>(config->transform);\n> +\n> +\t/* Ready the IPA - it must know about the sensor resolution. */\n> +\tret = ipa_->configure(sensorInfo_, params, result);\n> +\tif (ret < 0) {\n> +\t\tLOG(RPI, Error) << \"IPA configuration failed!\";\n> +\t\treturn -EPIPE;\n> +\t}\n> +\n> +\tif (!result->controls.empty())\n> +\t\tsetSensorControls(result->controls);\n> +\n> +\treturn 0;\n> +}\n> +\n> +void CameraData::setDelayedControls(const ControlList &controls, uint32_t delayContext)\n> +{\n> +\tif (!delayedCtrls_->push(controls, delayContext))\n> +\t\tLOG(RPI, Error) << \"V4L2 DelayedControl set failed\";\n> +}\n> +\n> +void CameraData::setLensControls(const ControlList &controls)\n> +{\n> +\tCameraLens *lens = sensor_->focusLens();\n> +\n> +\tif (lens && controls.contains(V4L2_CID_FOCUS_ABSOLUTE)) {\n> +\t\tControlValue const &focusValue = controls.get(V4L2_CID_FOCUS_ABSOLUTE);\n> +\t\tlens->setFocusPosition(focusValue.get<int32_t>());\n> +\t}\n> +}\n> +\n> +void CameraData::setSensorControls(ControlList &controls)\n> +{\n> +\t/*\n> +\t * We need to ensure that if both VBLANK and EXPOSURE are present, the\n> +\t * former must be written ahead of, and separately from EXPOSURE to avoid\n> +\t * V4L2 rejecting the latter. 
This is identical to what DelayedControls\n> +\t * does with the priority write flag.\n> +\t *\n> +\t * As a consequence of the below logic, VBLANK gets set twice, and we\n> +\t * rely on the v4l2 framework to not pass the second control set to the\n> +\t * driver as the actual control value has not changed.\n> +\t */\n> +\tif (controls.contains(V4L2_CID_EXPOSURE) && controls.contains(V4L2_CID_VBLANK)) {\n> +\t\tControlList vblank_ctrl;\n> +\n> +\t\tvblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));\n> +\t\tsensor_->setControls(&vblank_ctrl);\n> +\t}\n> +\n> +\tsensor_->setControls(&controls);\n> +}\n> +\n> +Rectangle CameraData::scaleIspCrop(const Rectangle &ispCrop) const\n> +{\n> +\t/*\n> +\t * Scale a crop rectangle defined in the ISP's coordinates into native sensor\n> +\t * coordinates.\n> +\t */\n> +\tRectangle nativeCrop = ispCrop.scaledBy(sensorInfo_.analogCrop.size(),\n> +\t\t\t\t\t\tsensorInfo_.outputSize);\n> +\tnativeCrop.translateBy(sensorInfo_.analogCrop.topLeft());\n> +\treturn nativeCrop;\n> +}\n> +\n> +void CameraData::calculateScalerCrop(const ControlList &controls)\n> +{\n> +\tconst auto &scalerCrop = controls.get<Rectangle>(controls::ScalerCrop);\n> +\tif (scalerCrop) {\n> +\t\tRectangle nativeCrop = *scalerCrop;\n> +\n> +\t\tif (!nativeCrop.width || !nativeCrop.height)\n> +\t\t\tnativeCrop = { 0, 0, 1, 1 };\n> +\n> +\t\t/* Create a version of the crop scaled to ISP (camera mode) pixels. */\n> +\t\tRectangle ispCrop = nativeCrop.translatedBy(-sensorInfo_.analogCrop.topLeft());\n> +\t\tispCrop.scaleBy(sensorInfo_.outputSize, sensorInfo_.analogCrop.size());\n> +\n> +\t\t/*\n> +\t\t * The crop that we set must be:\n> +\t\t * 1. At least as big as ispMinCropSize_, once that's been\n> +\t\t *    enlarged to the same aspect ratio.\n> +\t\t * 2. With the same mid-point, if possible.\n> +\t\t * 3. 
But it can't go outside the sensor area.\n> +\t\t */\n> +\t\tSize minSize = ispMinCropSize_.expandedToAspectRatio(nativeCrop.size());\n> +\t\tSize size = ispCrop.size().expandedTo(minSize);\n> +\t\tispCrop = size.centeredTo(ispCrop.center()).enclosedIn(Rectangle(sensorInfo_.outputSize));\n> +\n> +\t\tif (ispCrop != ispCrop_) {\n> +\t\t\tispCrop_ = ispCrop;\n> +\t\t\tplatformIspCrop();\n> +\n> +\t\t\t/*\n> +\t\t\t * Also update the ScalerCrop in the metadata with what we actually\n> +\t\t\t * used. But we must first rescale that from ISP (camera mode) pixels\n> +\t\t\t * back into sensor native pixels.\n> +\t\t\t */\n> +\t\t\tscalerCrop_ = scaleIspCrop(ispCrop_);\n> +\t\t}\n> +\t}\n> +}\n> +\n> +void CameraData::cameraTimeout()\n> +{\n> +\tLOG(RPI, Error) << \"Camera frontend has timed out!\";\n> +\tLOG(RPI, Error) << \"Please check that your camera sensor connector is attached securely.\";\n> +\tLOG(RPI, Error) << \"Alternatively, try another cable and/or sensor.\";\n> +\n> +\tstate_ = CameraData::State::Error;\n> +\tplatformStop();\n> +\n> +\t/*\n> +\t * To allow the application to attempt a recovery from this timeout,\n> +\t * stop all devices streaming, and return any outstanding requests as\n> +\t * incomplete and cancelled.\n> +\t */\n> +\tfor (auto const stream : streams_)\n> +\t\tstream->dev()->streamOff();\n> +\n> +\tclearIncompleteRequests();\n> +}\n> +\n> +void CameraData::frameStarted(uint32_t sequence)\n> +{\n> +\tLOG(RPI, Debug) << \"Frame start \" << sequence;\n> +\n> +\t/* Write any controls for the next frame as soon as we can. 
*/\n> +\tdelayedCtrls_->applyControls(sequence);\n> +}\n> +\n> +void CameraData::clearIncompleteRequests()\n> +{\n> +\t/*\n> +\t * All outstanding requests (and associated buffers) must be returned\n> +\t * to the application.\n> +\t */\n> +\twhile (!requestQueue_.empty()) {\n> +\t\tRequest *request = requestQueue_.front();\n> +\n> +\t\tfor (auto &b : request->buffers()) {\n> +\t\t\tFrameBuffer *buffer = b.second;\n> +\t\t\t/*\n> +\t\t\t * Has the buffer already been handed back to the\n> +\t\t\t * request? If not, do so now.\n> +\t\t\t */\n> +\t\t\tif (buffer->request()) {\n> +\t\t\t\tbuffer->_d()->cancel();\n> +\t\t\t\tpipe()->completeBuffer(request, buffer);\n> +\t\t\t}\n> +\t\t}\n> +\n> +\t\tpipe()->completeRequest(request);\n> +\t\trequestQueue_.pop();\n> +\t}\n> +}\n> +\n> +void CameraData::handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream)\n> +{\n> +\t/*\n> +\t * It is possible to be here without a pending request, so check\n> +\t * that we actually have one to action; otherwise we just return the\n> +\t * buffer back to the stream.\n> +\t */\n> +\tRequest *request = requestQueue_.empty() ? 
nullptr : requestQueue_.front();\n> +\tif (!dropFrameCount_ && request && request->findBuffer(stream) == buffer) {\n> +\t\t/*\n> +\t\t * Check if this is an externally provided buffer, and if\n> +\t\t * so, we must stop tracking it in the pipeline handler.\n> +\t\t */\n> +\t\thandleExternalBuffer(buffer, stream);\n> +\t\t/*\n> +\t\t * Tag the buffer as completed, returning it to the\n> +\t\t * application.\n> +\t\t */\n> +\t\tpipe()->completeBuffer(request, buffer);\n> +\t} else {\n> +\t\t/*\n> +\t\t * This buffer was not part of the Request (which happens if an\n> +\t\t * internal buffer was used for an external stream, or\n> +\t\t * unconditionally for internal streams), or there is no pending\n> +\t\t * request, so we can recycle it.\n> +\t\t */\n> +\t\tstream->returnBuffer(buffer);\n> +\t}\n> +}\n> +\n> +void CameraData::handleState()\n> +{\n> +\tswitch (state_) {\n> +\tcase State::Stopped:\n> +\tcase State::Busy:\n> +\tcase State::Error:\n> +\t\tbreak;\n> +\n> +\tcase State::IpaComplete:\n> +\t\t/* If the request is completed, we will switch to Idle state. */\n> +\t\tcheckRequestCompleted();\n> +\t\t/*\n> +\t\t * No break here, we want to try running the pipeline again.\n> +\t\t * The fallthrough clause below suppresses compiler warnings.\n> +\t\t */\n> +\t\t[[fallthrough]];\n> +\n> +\tcase State::Idle:\n> +\t\ttryRunPipeline();\n> +\t\tbreak;\n> +\t}\n> +}\n> +\n> +void CameraData::handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream)\n> +{\n> +\tunsigned int id = stream->getBufferId(buffer);\n> +\n> +\tif (!(id & MaskExternalBuffer))\n> +\t\treturn;\n> +\n> +\t/* Stop the Stream object from tracking the buffer. 
*/\n> +\tstream->removeExternalBuffer(buffer);\n> +}\n> +\n> +void CameraData::checkRequestCompleted()\n> +{\n> +\tbool requestCompleted = false;\n> +\t/*\n> +\t * If we are dropping this frame, do not touch the request, simply\n> +\t * change the state to IDLE when ready.\n> +\t */\n> +\tif (!dropFrameCount_) {\n> +\t\tRequest *request = requestQueue_.front();\n> +\t\tif (request->hasPendingBuffers())\n> +\t\t\treturn;\n> +\n> +\t\t/* Must wait for metadata to be filled in before completing. */\n> +\t\tif (state_ != State::IpaComplete)\n> +\t\t\treturn;\n> +\n> +\t\tpipe()->completeRequest(request);\n> +\t\trequestQueue_.pop();\n> +\t\trequestCompleted = true;\n> +\t}\n> +\n> +\t/*\n> +\t * Make sure we have three outputs completed in the case of a dropped\n> +\t * frame.\n> +\t */\n> +\tif (state_ == State::IpaComplete &&\n> +\t    ((ispOutputCount_ == ispOutputTotal_ && dropFrameCount_) || requestCompleted)) {\n> +\t\tstate_ = State::Idle;\n> +\t\tif (dropFrameCount_) {\n> +\t\t\tdropFrameCount_--;\n> +\t\t\tLOG(RPI, Debug) << \"Dropping frame at the request of the IPA (\"\n> +\t\t\t\t\t<< dropFrameCount_ << \" left)\";\n> +\t\t}\n> +\t}\n> +}\n> +\n> +void CameraData::fillRequestMetadata(const ControlList &bufferControls, Request *request)\n> +{\n> +\trequest->metadata().set(controls::SensorTimestamp,\n> +\t\t\t\tbufferControls.get(controls::SensorTimestamp).value_or(0));\n> +\n> +\trequest->metadata().set(controls::ScalerCrop, scalerCrop_);\n> +}\n> +\n> +} /* namespace libcamera */\n> diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.h b/src/libcamera/pipeline/rpi/common/pipeline_base.h\n> new file mode 100644\n> index 000000000000..318266a6fb51\n> --- /dev/null\n> +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.h\n> @@ -0,0 +1,276 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> + *\n> + * pipeline_base.h - Pipeline handler base class for Raspberry Pi devices\n> + */\n> +\n> 
+#include <map>\n> +#include <memory>\n> +#include <optional>\n> +#include <queue>\n> +#include <string>\n> +#include <unordered_set>\n> +#include <utility>\n> +#include <vector>\n> +\n> +#include <libcamera/controls.h>\n> +#include <libcamera/request.h>\n> +\n> +#include \"libcamera/internal/bayer_format.h\"\n> +#include \"libcamera/internal/camera.h\"\n> +#include \"libcamera/internal/camera_sensor.h\"\n> +#include \"libcamera/internal/framebuffer.h\"\n> +#include \"libcamera/internal/media_device.h\"\n> +#include \"libcamera/internal/media_object.h\"\n> +#include \"libcamera/internal/pipeline_handler.h\"\n> +#include \"libcamera/internal/v4l2_videodevice.h\"\n> +#include \"libcamera/internal/yaml_parser.h\"\n> +\n> +#include <libcamera/ipa/raspberrypi_ipa_interface.h>\n> +#include <libcamera/ipa/raspberrypi_ipa_proxy.h>\n> +\n> +#include \"delayed_controls.h\"\n> +#include \"rpi_stream.h\"\n> +\n> +using namespace std::chrono_literals;\n> +\n> +namespace libcamera {\n> +\n> +namespace RPi {\n> +\n> +/* Map of mbus codes to supported sizes reported by the sensor. 
*/\n> +using SensorFormats = std::map<unsigned int, std::vector<Size>>;\n> +\n> +class CameraData : public Camera::Private\n> +{\n> +public:\n> +\tCameraData(PipelineHandler *pipe)\n> +\t\t: Camera::Private(pipe), state_(State::Stopped),\n> +\t\t  flipsAlterBayerOrder_(false), dropFrameCount_(0), buffersAllocated_(false),\n> +\t\t  ispOutputCount_(0), ispOutputTotal_(0)\n> +\t{\n> +\t}\n> +\n> +\tvirtual ~CameraData()\n> +\t{\n> +\t}\n> +\n> +\tstruct StreamParams {\n> +\t\tStreamParams()\n> +\t\t\t: index(0), cfg(nullptr), dev(nullptr)\n> +\t\t{\n> +\t\t}\n> +\n> +\t\tStreamParams(unsigned int index_, StreamConfiguration *cfg_)\n> +\t\t\t: index(index_), cfg(cfg_), dev(nullptr)\n> +\t\t{\n> +\t\t}\n> +\n> +\t\tunsigned int index;\n> +\t\tStreamConfiguration *cfg;\n> +\t\tV4L2VideoDevice *dev;\n> +\t};\n> +\n> +\tvirtual CameraConfiguration::Status platformValidate(std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t\t\t\t     std::vector<StreamParams> &outStreams) const = 0;\n> +\tvirtual int platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> +\t\t\t\t      std::optional<BayerFormat::Packing> packing,\n> +\t\t\t\t      std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t      std::vector<StreamParams> &outStreams) = 0;\n> +\tvirtual void platformStart() = 0;\n> +\tvirtual void platformStop() = 0;\n> +\n> +\tvoid freeBuffers();\n> +\tvirtual void platformFreeBuffers() = 0;\n> +\n> +\tvoid enumerateVideoDevices(MediaLink *link, const std::string &frontend);\n> +\n> +\tint loadPipelineConfiguration();\n> +\tint loadIPA(ipa::RPi::InitResult *result);\n> +\tint configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result);\n> +\tvirtual int platformInitIpa(ipa::RPi::InitParams &params) = 0;\n> +\tvirtual int platformConfigureIpa(ipa::RPi::ConfigParams &params) = 0;\n> +\n> +\tvoid setDelayedControls(const ControlList &controls, uint32_t delayContext);\n> +\tvoid setLensControls(const ControlList &controls);\n> +\tvoid 
setSensorControls(ControlList &controls);\n> +\n> +\tRectangle scaleIspCrop(const Rectangle &ispCrop) const;\n> +\tvoid calculateScalerCrop(const ControlList &controls);\n> +\tvirtual void platformIspCrop() = 0;\n> +\n> +\tvoid cameraTimeout();\n> +\tvoid frameStarted(uint32_t sequence);\n> +\n> +\tvoid clearIncompleteRequests();\n> +\tvoid handleStreamBuffer(FrameBuffer *buffer, Stream *stream);\n> +\tvoid handleState();\n> +\n> +\tvirtual V4L2VideoDevice::Formats ispFormats() const = 0;\n> +\tvirtual V4L2VideoDevice::Formats rawFormats() const = 0;\n> +\tvirtual V4L2VideoDevice *frontendDevice() = 0;\n> +\n> +\tvirtual int platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) = 0;\n> +\n> +\tstd::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;\n> +\n> +\tstd::unique_ptr<CameraSensor> sensor_;\n> +\tSensorFormats sensorFormats_;\n> +\n> +\t/* The vector below is just for convenience when iterating over all streams. */\n> +\tstd::vector<Stream *> streams_;\n> +\t/* Stores the ids of the buffers mapped in the IPA. */\n> +\tstd::unordered_set<unsigned int> bufferIds_;\n> +\t/*\n> +\t * Stores a cascade of Video Mux or Bridge devices between the sensor and\n> +\t * Unicam together with media link across the entities.\n> +\t */\n> +\tstd::vector<std::pair<std::unique_ptr<V4L2Subdevice>, MediaLink *>> bridgeDevices_;\n> +\n> +\tstd::unique_ptr<DelayedControls> delayedCtrls_;\n> +\tbool sensorMetadata_;\n> +\n> +\t/*\n> +\t * All the functions in this class are called from a single calling\n> +\t * thread. So, we do not need to have any mutex to protect access to any\n> +\t * of the variables below.\n> +\t */\n> +\tenum class State { Stopped, Idle, Busy, IpaComplete, Error };\n> +\tState state_;\n> +\n> +\tbool isRunning()\n> +\t{\n> +\t\treturn state_ != State::Stopped && state_ != State::Error;\n> +\t}\n> +\n> +\tstd::queue<Request *> requestQueue_;\n> +\n> +\t/* Store the \"native\" Bayer order (that is, with no transforms applied). 
*/\n> +\tbool flipsAlterBayerOrder_;\n> +\tBayerFormat::Order nativeBayerOrder_;\n> +\n> +\t/* For handling digital zoom. */\n> +\tIPACameraSensorInfo sensorInfo_;\n> +\tRectangle ispCrop_; /* crop in ISP (camera mode) pixels */\n> +\tRectangle scalerCrop_; /* crop in sensor native pixels */\n> +\tSize ispMinCropSize_;\n> +\n> +\tunsigned int dropFrameCount_;\n> +\n> +\t/*\n> +\t * If set, this stores the value that represents a gain of one for\n> +\t * the V4L2_CID_NOTIFY_GAINS control.\n> +\t */\n> +\tstd::optional<int32_t> notifyGainsUnity_;\n> +\n> +\t/* Have internal buffers been allocated? */\n> +\tbool buffersAllocated_;\n> +\n> +\tstruct Config {\n> +\t\t/*\n> +\t\t * Override any request from the IPA to drop a number of startup\n> +\t\t * frames.\n> +\t\t */\n> +\t\tbool disableStartupFrameDrops;\n> +\t\t/*\n> +\t\t * Override the camera timeout value calculated by the IPA based\n> +\t\t * on frame durations.\n> +\t\t */\n> +\t\tunsigned int cameraTimeoutValue;\n> +\t};\n> +\n> +\tConfig config_;\n> +\n> +protected:\n> +\tvoid fillRequestMetadata(const ControlList &bufferControls,\n> +\t\t\t\t Request *request);\n> +\n> +\tvirtual void tryRunPipeline() = 0;\n> +\n> +\tunsigned int ispOutputCount_;\n> +\tunsigned int ispOutputTotal_;\n> +\n> +private:\n> +\tvoid handleExternalBuffer(FrameBuffer *buffer, Stream *stream);\n> +\tvoid checkRequestCompleted();\n> +};\n> +\n> +class PipelineHandlerBase : public PipelineHandler\n> +{\n> +public:\n> +\tPipelineHandlerBase(CameraManager *manager)\n> +\t\t: PipelineHandler(manager)\n> +\t{\n> +\t}\n> +\n> +\tvirtual ~PipelineHandlerBase()\n> +\t{\n> +\t}\n> +\n> +\tstatic V4L2DeviceFormat toV4L2DeviceFormat(const V4L2VideoDevice *dev,\n> +\t\t\t\t\t\t   const V4L2SubdeviceFormat &format,\n> +\t\t\t\t\t\t   BayerFormat::Packing packingReq);\n> +\n> +\tstd::unique_ptr<CameraConfiguration>\n> +\tgenerateConfiguration(Camera *camera, const StreamRoles &roles) override;\n> +\tint configure(Camera *camera, 
CameraConfiguration *config) override;\n> +\n> +\tint exportFrameBuffers(Camera *camera, libcamera::Stream *stream,\n> +\t\t\t       std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;\n> +\n> +\tint start(Camera *camera, const ControlList *controls) override;\n> +\tvoid stopDevice(Camera *camera) override;\n> +\tvoid releaseDevice(Camera *camera) override;\n> +\n> +\tint queueRequestDevice(Camera *camera, Request *request) override;\n> +\n> +protected:\n> +\tint registerCamera(MediaDevice *frontend, const std::string &frontendName,\n> +\t\t\t   MediaDevice *backend, MediaEntity *sensorEntity);\n> +\n> +\tvoid mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask);\n> +\n> +\tvirtual std::unique_ptr<CameraData> allocateCameraData() = 0;\n> +\tvirtual int platformRegister(std::unique_ptr<CameraData> &cameraData,\n> +\t\t\t\t     MediaDevice *unicam, MediaDevice *isp) = 0;\n> +\n> +private:\n> +\tCameraData *cameraData(Camera *camera)\n> +\t{\n> +\t\treturn static_cast<CameraData *>(camera->_d());\n> +\t}\n> +\n> +\tint queueAllBuffers(Camera *camera);\n> +\tvirtual int prepareBuffers(Camera *camera) = 0;\n> +};\n> +\n> +class RPiCameraConfiguration final : public CameraConfiguration\n> +{\n> +public:\n> +\tRPiCameraConfiguration(const CameraData *data)\n> +\t\t: CameraConfiguration(), data_(data)\n> +\t{\n> +\t}\n> +\n> +\tCameraConfiguration::Status validateColorSpaces(ColorSpaceFlags flags);\n> +\tStatus validate() override;\n> +\n> +\t/* Cache the combinedTransform_ that will be applied to the sensor */\n> +\tTransform combinedTransform_;\n> +\n> +private:\n> +\tconst CameraData *data_;\n> +\n> +\t/*\n> +\t * Store the colour spaces that all our streams will have. 
RGB format streams\n> +\t * will have the same colour space as YUV streams, with the YCbCr field cleared\n> +\t * and the range set to full.\n> +\t */\n> +\tstd::optional<ColorSpace> yuvColorSpace_;\n> +\tstd::optional<ColorSpace> rgbColorSpace_;\n> +};\n> +\n> +} /* namespace RPi */\n> +\n> +} /* namespace libcamera */\n> diff --git a/src/libcamera/pipeline/rpi/vc4/data/example.yaml b/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> index c90f518f8849..b8e01adeaf40 100644\n> --- a/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> +++ b/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> @@ -34,13 +34,13 @@\n>                  #\n>                  # \"disable_startup_frame_drops\": false,\n>\n> -                # Custom timeout value (in ms) for Unicam to use. This overrides\n> +                # Custom timeout value (in ms) for the camera to use. This overrides\n>                  # the value computed by the pipeline handler based on frame\n>                  # durations.\n>                  #\n>                  # Set this value to 0 to use the pipeline handler computed\n>                  # timeout value.\n>                  #\n> -                # \"unicam_timeout_value_ms\": 0,\n> +                # \"camera_timeout_value_ms\": 0,\n>          }\n>  }\n> diff --git a/src/libcamera/pipeline/rpi/vc4/meson.build b/src/libcamera/pipeline/rpi/vc4/meson.build\n> index 228823f30922..cdb049c58d2c 100644\n> --- a/src/libcamera/pipeline/rpi/vc4/meson.build\n> +++ b/src/libcamera/pipeline/rpi/vc4/meson.build\n> @@ -2,7 +2,7 @@\n>\n>  libcamera_sources += files([\n>      'dma_heaps.cpp',\n> -    'raspberrypi.cpp',\n> +    'vc4.cpp',\n>  ])\n>\n>  subdir('data')\n> diff --git a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp b/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp\n> deleted file mode 100644\n> index bd66468683df..000000000000\n> --- a/src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp\n> +++ /dev/null\n> @@ -1,2428 +0,0 @@\n> -/* SPDX-License-Identifier: 
LGPL-2.1-or-later */\n> -/*\n> - * Copyright (C) 2019-2021, Raspberry Pi Ltd\n> - *\n> - * raspberrypi.cpp - Pipeline handler for VC4 based Raspberry Pi devices\n> - */\n> -#include <algorithm>\n> -#include <assert.h>\n> -#include <cmath>\n> -#include <fcntl.h>\n> -#include <memory>\n> -#include <mutex>\n> -#include <queue>\n> -#include <unordered_set>\n> -#include <utility>\n> -\n> -#include <libcamera/base/file.h>\n> -#include <libcamera/base/shared_fd.h>\n> -#include <libcamera/base/utils.h>\n> -\n> -#include <libcamera/camera.h>\n> -#include <libcamera/control_ids.h>\n> -#include <libcamera/formats.h>\n> -#include <libcamera/ipa/raspberrypi_ipa_interface.h>\n> -#include <libcamera/ipa/raspberrypi_ipa_proxy.h>\n> -#include <libcamera/logging.h>\n> -#include <libcamera/property_ids.h>\n> -#include <libcamera/request.h>\n> -\n> -#include <linux/bcm2835-isp.h>\n> -#include <linux/media-bus-format.h>\n> -#include <linux/videodev2.h>\n> -\n> -#include \"libcamera/internal/bayer_format.h\"\n> -#include \"libcamera/internal/camera.h\"\n> -#include \"libcamera/internal/camera_lens.h\"\n> -#include \"libcamera/internal/camera_sensor.h\"\n> -#include \"libcamera/internal/device_enumerator.h\"\n> -#include \"libcamera/internal/framebuffer.h\"\n> -#include \"libcamera/internal/ipa_manager.h\"\n> -#include \"libcamera/internal/media_device.h\"\n> -#include \"libcamera/internal/pipeline_handler.h\"\n> -#include \"libcamera/internal/v4l2_videodevice.h\"\n> -#include \"libcamera/internal/yaml_parser.h\"\n> -\n> -#include \"delayed_controls.h\"\n> -#include \"dma_heaps.h\"\n> -#include \"rpi_stream.h\"\n> -\n> -using namespace std::chrono_literals;\n> -\n> -namespace libcamera {\n> -\n> -LOG_DEFINE_CATEGORY(RPI)\n> -\n> -namespace {\n> -\n> -constexpr unsigned int defaultRawBitDepth = 12;\n> -\n> -/* Map of mbus codes to supported sizes reported by the sensor. 
*/\n> -using SensorFormats = std::map<unsigned int, std::vector<Size>>;\n> -\n> -SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)\n> -{\n> -\tSensorFormats formats;\n> -\n> -\tfor (auto const mbusCode : sensor->mbusCodes())\n> -\t\tformats.emplace(mbusCode, sensor->sizes(mbusCode));\n> -\n> -\treturn formats;\n> -}\n> -\n> -bool isMonoSensor(std::unique_ptr<CameraSensor> &sensor)\n> -{\n> -\tunsigned int mbusCode = sensor->mbusCodes()[0];\n> -\tconst BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode);\n> -\n> -\treturn bayer.order == BayerFormat::Order::MONO;\n> -}\n> -\n> -PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code,\n> -\t\t\t\t  BayerFormat::Packing packingReq)\n> -{\n> -\tBayerFormat bayer = BayerFormat::fromMbusCode(mbus_code);\n> -\n> -\tASSERT(bayer.isValid());\n> -\n> -\tbayer.packing = packingReq;\n> -\tPixelFormat pix = bayer.toPixelFormat();\n> -\n> -\t/*\n> -\t * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed\n> -\t * variants. So if the PixelFormat returns as invalid, use the non-packed\n> -\t * conversion instead.\n> -\t */\n> -\tif (!pix.isValid()) {\n> -\t\tbayer.packing = BayerFormat::Packing::None;\n> -\t\tpix = bayer.toPixelFormat();\n> -\t}\n> -\n> -\treturn pix;\n> -}\n> -\n> -V4L2DeviceFormat toV4L2DeviceFormat(const V4L2VideoDevice *dev,\n> -\t\t\t\t    const V4L2SubdeviceFormat &format,\n> -\t\t\t\t    BayerFormat::Packing packingReq)\n> -{\n> -\tconst PixelFormat pix = mbusCodeToPixelFormat(format.mbus_code, packingReq);\n> -\tV4L2DeviceFormat deviceFormat;\n> -\n> -\tdeviceFormat.fourcc = dev->toV4L2PixelFormat(pix);\n> -\tdeviceFormat.size = format.size;\n> -\tdeviceFormat.colorSpace = format.colorSpace;\n> -\treturn deviceFormat;\n> -}\n> -\n> -bool isRaw(const PixelFormat &pixFmt)\n> -{\n> -\t/* This test works for both Bayer and raw mono formats. 
*/\n> -\treturn BayerFormat::fromPixelFormat(pixFmt).isValid();\n> -}\n> -\n> -double scoreFormat(double desired, double actual)\n> -{\n> -\tdouble score = desired - actual;\n> -\t/* Smaller desired dimensions are preferred. */\n> -\tif (score < 0.0)\n> -\t\tscore = (-score) / 8;\n> -\t/* Penalise non-exact matches. */\n> -\tif (actual != desired)\n> -\t\tscore *= 2;\n> -\n> -\treturn score;\n> -}\n> -\n> -V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth)\n> -{\n> -\tdouble bestScore = std::numeric_limits<double>::max(), score;\n> -\tV4L2SubdeviceFormat bestFormat;\n> -\tbestFormat.colorSpace = ColorSpace::Raw;\n> -\n> -\tconstexpr float penaltyAr = 1500.0;\n> -\tconstexpr float penaltyBitDepth = 500.0;\n> -\n> -\t/* Calculate the closest/best mode from the user requested size. */\n> -\tfor (const auto &iter : formatsMap) {\n> -\t\tconst unsigned int mbusCode = iter.first;\n> -\t\tconst PixelFormat format = mbusCodeToPixelFormat(mbusCode,\n> -\t\t\t\t\t\t\t\t BayerFormat::Packing::None);\n> -\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(format);\n> -\n> -\t\tfor (const Size &size : iter.second) {\n> -\t\t\tdouble reqAr = static_cast<double>(req.width) / req.height;\n> -\t\t\tdouble fmtAr = static_cast<double>(size.width) / size.height;\n> -\n> -\t\t\t/* Score the dimensions for closeness. */\n> -\t\t\tscore = scoreFormat(req.width, size.width);\n> -\t\t\tscore += scoreFormat(req.height, size.height);\n> -\t\t\tscore += penaltyAr * scoreFormat(reqAr, fmtAr);\n> -\n> -\t\t\t/* Add any penalties... this is not an exact science! 
*/\n> -\t\t\tscore += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth;\n> -\n> -\t\t\tif (score <= bestScore) {\n> -\t\t\t\tbestScore = score;\n> -\t\t\t\tbestFormat.mbus_code = mbusCode;\n> -\t\t\t\tbestFormat.size = size;\n> -\t\t\t}\n> -\n> -\t\t\tLOG(RPI, Debug) << \"Format: \" << size\n> -\t\t\t\t\t<< \" fmt \" << format\n> -\t\t\t\t\t<< \" Score: \" << score\n> -\t\t\t\t\t<< \" (best \" << bestScore << \")\";\n> -\t\t}\n> -\t}\n> -\n> -\treturn bestFormat;\n> -}\n> -\n> -enum class Unicam : unsigned int { Image, Embedded };\n> -enum class Isp : unsigned int { Input, Output0, Output1, Stats };\n> -\n> -} /* namespace */\n> -\n> -class RPiCameraData : public Camera::Private\n> -{\n> -public:\n> -\tRPiCameraData(PipelineHandler *pipe)\n> -\t\t: Camera::Private(pipe), state_(State::Stopped),\n> -\t\t  flipsAlterBayerOrder_(false), dropFrameCount_(0),\n> -\t\t  buffersAllocated_(false), ispOutputCount_(0)\n> -\t{\n> -\t}\n> -\n> -\t~RPiCameraData()\n> -\t{\n> -\t\tfreeBuffers();\n> -\t}\n> -\n> -\tvoid freeBuffers();\n> -\tvoid frameStarted(uint32_t sequence);\n> -\n> -\tint loadIPA(ipa::RPi::InitResult *result);\n> -\tint configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result);\n> -\tint loadPipelineConfiguration();\n> -\n> -\tvoid enumerateVideoDevices(MediaLink *link);\n> -\n> -\tvoid processStatsComplete(const ipa::RPi::BufferIds &buffers);\n> -\tvoid prepareIspComplete(const ipa::RPi::BufferIds &buffers);\n> -\tvoid setIspControls(const ControlList &controls);\n> -\tvoid setDelayedControls(const ControlList &controls, uint32_t delayContext);\n> -\tvoid setLensControls(const ControlList &controls);\n> -\tvoid setCameraTimeout(uint32_t maxExposureTimeMs);\n> -\tvoid setSensorControls(ControlList &controls);\n> -\tvoid unicamTimeout();\n> -\n> -\t/* bufferComplete signal handlers. 
*/\n> -\tvoid unicamBufferDequeue(FrameBuffer *buffer);\n> -\tvoid ispInputDequeue(FrameBuffer *buffer);\n> -\tvoid ispOutputDequeue(FrameBuffer *buffer);\n> -\n> -\tvoid clearIncompleteRequests();\n> -\tvoid handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream);\n> -\tvoid handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream);\n> -\tvoid handleState();\n> -\tRectangle scaleIspCrop(const Rectangle &ispCrop) const;\n> -\tvoid applyScalerCrop(const ControlList &controls);\n> -\n> -\tstd::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;\n> -\n> -\tstd::unique_ptr<CameraSensor> sensor_;\n> -\tSensorFormats sensorFormats_;\n> -\t/* Array of Unicam and ISP device streams and associated buffers/streams. */\n> -\tRPi::Device<Unicam, 2> unicam_;\n> -\tRPi::Device<Isp, 4> isp_;\n> -\t/* The vector below is just for convenience when iterating over all streams. */\n> -\tstd::vector<RPi::Stream *> streams_;\n> -\t/* Stores the ids of the buffers mapped in the IPA. */\n> -\tstd::unordered_set<unsigned int> bufferIds_;\n> -\t/*\n> -\t * Stores a cascade of Video Mux or Bridge devices between the sensor and\n> -\t * Unicam together with media link across the entities.\n> -\t */\n> -\tstd::vector<std::pair<std::unique_ptr<V4L2Subdevice>, MediaLink *>> bridgeDevices_;\n> -\n> -\t/* DMAHEAP allocation helper. */\n> -\tRPi::DmaHeap dmaHeap_;\n> -\tSharedFD lsTable_;\n> -\n> -\tstd::unique_ptr<RPi::DelayedControls> delayedCtrls_;\n> -\tbool sensorMetadata_;\n> -\n> -\t/*\n> -\t * All the functions in this class are called from a single calling\n> -\t * thread. 
So, we do not need to have any mutex to protect access to any\n> -\t * of the variables below.\n> -\t */\n> -\tenum class State { Stopped, Idle, Busy, IpaComplete, Error };\n> -\tState state_;\n> -\n> -\tbool isRunning()\n> -\t{\n> -\t\treturn state_ != State::Stopped && state_ != State::Error;\n> -\t}\n> -\n> -\tstruct BayerFrame {\n> -\t\tFrameBuffer *buffer;\n> -\t\tControlList controls;\n> -\t\tunsigned int delayContext;\n> -\t};\n> -\n> -\tstd::queue<BayerFrame> bayerQueue_;\n> -\tstd::queue<FrameBuffer *> embeddedQueue_;\n> -\tstd::deque<Request *> requestQueue_;\n> -\n> -\t/*\n> -\t * Store the \"native\" Bayer order (that is, with no transforms\n> -\t * applied).\n> -\t */\n> -\tbool flipsAlterBayerOrder_;\n> -\tBayerFormat::Order nativeBayerOrder_;\n> -\n> -\t/* For handling digital zoom. */\n> -\tIPACameraSensorInfo sensorInfo_;\n> -\tRectangle ispCrop_; /* crop in ISP (camera mode) pixels */\n> -\tRectangle scalerCrop_; /* crop in sensor native pixels */\n> -\tSize ispMinCropSize_;\n> -\n> -\tunsigned int dropFrameCount_;\n> -\n> -\t/*\n> -\t * If set, this stores the value that represets a gain of one for\n> -\t * the V4L2_CID_NOTIFY_GAINS control.\n> -\t */\n> -\tstd::optional<int32_t> notifyGainsUnity_;\n> -\n> -\t/* Have internal buffers been allocated? 
*/\n> -\tbool buffersAllocated_;\n> -\n> -\tstruct Config {\n> -\t\t/*\n> -\t\t * The minimum number of internal buffers to be allocated for\n> -\t\t * the Unicam Image stream.\n> -\t\t */\n> -\t\tunsigned int minUnicamBuffers;\n> -\t\t/*\n> -\t\t * The minimum total (internal + external) buffer count used for\n> -\t\t * the Unicam Image stream.\n> -\t\t *\n> -\t\t * Note that:\n> -\t\t * minTotalUnicamBuffers must be >= 1, and\n> -\t\t * minTotalUnicamBuffers >= minUnicamBuffers\n> -\t\t */\n> -\t\tunsigned int minTotalUnicamBuffers;\n> -\t\t/*\n> -\t\t * Override any request from the IPA to drop a number of startup\n> -\t\t * frames.\n> -\t\t */\n> -\t\tbool disableStartupFrameDrops;\n> -\t\t/*\n> -\t\t * Override the Unicam timeout value calculated by the IPA based\n> -\t\t * on frame durations.\n> -\t\t */\n> -\t\tunsigned int unicamTimeoutValue;\n> -\t};\n> -\n> -\tConfig config_;\n> -\n> -private:\n> -\tvoid checkRequestCompleted();\n> -\tvoid fillRequestMetadata(const ControlList &bufferControls,\n> -\t\t\t\t Request *request);\n> -\tvoid tryRunPipeline();\n> -\tbool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer);\n> -\n> -\tunsigned int ispOutputCount_;\n> -};\n> -\n> -class RPiCameraConfiguration : public CameraConfiguration\n> -{\n> -public:\n> -\tRPiCameraConfiguration(const RPiCameraData *data);\n> -\n> -\tCameraConfiguration::Status validateColorSpaces(ColorSpaceFlags flags);\n> -\tStatus validate() override;\n> -\n> -\t/* Cache the combinedTransform_ that will be applied to the sensor */\n> -\tTransform combinedTransform_;\n> -\n> -private:\n> -\tconst RPiCameraData *data_;\n> -\n> -\t/*\n> -\t * Store the colour spaces that all our streams will have. 
RGB format streams\n> -\t * will have the same colorspace as YUV streams, with YCbCr field cleared and\n> -\t * range set to full.\n> -         */\n> -\tstd::optional<ColorSpace> yuvColorSpace_;\n> -\tstd::optional<ColorSpace> rgbColorSpace_;\n> -};\n> -\n> -class PipelineHandlerRPi : public PipelineHandler\n> -{\n> -public:\n> -\tPipelineHandlerRPi(CameraManager *manager);\n> -\n> -\tstd::unique_ptr<CameraConfiguration> generateConfiguration(Camera *camera,\n> -\t\tconst StreamRoles &roles) override;\n> -\tint configure(Camera *camera, CameraConfiguration *config) override;\n> -\n> -\tint exportFrameBuffers(Camera *camera, Stream *stream,\n> -\t\t\t       std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;\n> -\n> -\tint start(Camera *camera, const ControlList *controls) override;\n> -\tvoid stopDevice(Camera *camera) override;\n> -\n> -\tint queueRequestDevice(Camera *camera, Request *request) override;\n> -\n> -\tbool match(DeviceEnumerator *enumerator) override;\n> -\n> -\tvoid releaseDevice(Camera *camera) override;\n> -\n> -private:\n> -\tRPiCameraData *cameraData(Camera *camera)\n> -\t{\n> -\t\treturn static_cast<RPiCameraData *>(camera->_d());\n> -\t}\n> -\n> -\tint registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity);\n> -\tint queueAllBuffers(Camera *camera);\n> -\tint prepareBuffers(Camera *camera);\n> -\tvoid mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask);\n> -};\n> -\n> -RPiCameraConfiguration::RPiCameraConfiguration(const RPiCameraData *data)\n> -\t: CameraConfiguration(), data_(data)\n> -{\n> -}\n> -\n> -static const std::vector<ColorSpace> validColorSpaces = {\n> -\tColorSpace::Sycc,\n> -\tColorSpace::Smpte170m,\n> -\tColorSpace::Rec709\n> -};\n> -\n> -static std::optional<ColorSpace> findValidColorSpace(const ColorSpace &colourSpace)\n> -{\n> -\tfor (auto cs : validColorSpaces) {\n> -\t\tif (colourSpace.primaries == cs.primaries &&\n> -\t\t    colourSpace.transferFunction == 
cs.transferFunction)\n> -\t\t\treturn cs;\n> -\t}\n> -\n> -\treturn std::nullopt;\n> -}\n> -\n> -static bool isRgb(const PixelFormat &pixFmt)\n> -{\n> -\tconst PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);\n> -\treturn info.colourEncoding == PixelFormatInfo::ColourEncodingRGB;\n> -}\n> -\n> -static bool isYuv(const PixelFormat &pixFmt)\n> -{\n> -\t/* The code below would return true for raw mono streams, so weed those out first. */\n> -\tif (isRaw(pixFmt))\n> -\t\treturn false;\n> -\n> -\tconst PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);\n> -\treturn info.colourEncoding == PixelFormatInfo::ColourEncodingYUV;\n> -}\n> -\n> -/*\n> - * Raspberry Pi drivers expect the following colour spaces:\n> - * - V4L2_COLORSPACE_RAW for raw streams.\n> - * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for\n> - *   non-raw streams. Other fields such as transfer function, YCbCr encoding and\n> - *   quantisation are not used.\n> - *\n> - * The libcamera colour spaces that we wish to use corresponding to these are therefore:\n> - * - ColorSpace::Raw for V4L2_COLORSPACE_RAW\n> - * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG\n> - * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M\n> - * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709\n> - */\n> -\n> -CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags)\n> -{\n> -\tStatus status = Valid;\n> -\tyuvColorSpace_.reset();\n> -\n> -\tfor (auto cfg : config_) {\n> -\t\t/* First fix up raw streams to have the \"raw\" colour space. */\n> -\t\tif (isRaw(cfg.pixelFormat)) {\n> -\t\t\t/* If there was no value here, that doesn't count as \"adjusted\". */\n> -\t\t\tif (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw)\n> -\t\t\t\tstatus = Adjusted;\n> -\t\t\tcfg.colorSpace = ColorSpace::Raw;\n> -\t\t\tcontinue;\n> -\t\t}\n> -\n> -\t\t/* Next we need to find our shared colour space. The first valid one will do. 
*/\n> -\t\tif (cfg.colorSpace && !yuvColorSpace_)\n> -\t\t\tyuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value());\n> -\t}\n> -\n> -\t/* If no colour space was given anywhere, choose sYCC. */\n> -\tif (!yuvColorSpace_)\n> -\t\tyuvColorSpace_ = ColorSpace::Sycc;\n> -\n> -\t/* Note the version of this that any RGB streams will have to use. */\n> -\trgbColorSpace_ = yuvColorSpace_;\n> -\trgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None;\n> -\trgbColorSpace_->range = ColorSpace::Range::Full;\n> -\n> -\t/* Go through the streams again and force everyone to the same colour space. */\n> -\tfor (auto cfg : config_) {\n> -\t\tif (cfg.colorSpace == ColorSpace::Raw)\n> -\t\t\tcontinue;\n> -\n> -\t\tif (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) {\n> -\t\t\t/* Again, no value means \"not adjusted\". */\n> -\t\t\tif (cfg.colorSpace)\n> -\t\t\t\tstatus = Adjusted;\n> -\t\t\tcfg.colorSpace = yuvColorSpace_;\n> -\t\t}\n> -\t\tif (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) {\n> -\t\t\t/* Be nice, and let the YUV version count as non-adjusted too. 
*/\n> -\t\t\tif (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_)\n> -\t\t\t\tstatus = Adjusted;\n> -\t\t\tcfg.colorSpace = rgbColorSpace_;\n> -\t\t}\n> -\t}\n> -\n> -\treturn status;\n> -}\n> -\n> -CameraConfiguration::Status RPiCameraConfiguration::validate()\n> -{\n> -\tStatus status = Valid;\n> -\n> -\tif (config_.empty())\n> -\t\treturn Invalid;\n> -\n> -\tstatus = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);\n> -\n> -\t/*\n> -\t * Validate the requested transform against the sensor capabilities and\n> -\t * rotation and store the final combined transform that configure() will\n> -\t * need to apply to the sensor to save us working it out again.\n> -\t */\n> -\tTransform requestedTransform = transform;\n> -\tcombinedTransform_ = data_->sensor_->validateTransform(&transform);\n> -\tif (transform != requestedTransform)\n> -\t\tstatus = Adjusted;\n> -\n> -\tunsigned int rawCount = 0, outCount = 0, count = 0, maxIndex = 0;\n> -\tstd::pair<int, Size> outSize[2];\n> -\tSize maxSize;\n> -\tfor (StreamConfiguration &cfg : config_) {\n> -\t\tif (isRaw(cfg.pixelFormat)) {\n> -\t\t\t/*\n> -\t\t\t * Calculate the best sensor mode we can use based on\n> -\t\t\t * the user request.\n> -\t\t\t */\n> -\t\t\tV4L2VideoDevice *unicam = data_->unicam_[Unicam::Image].dev();\n> -\t\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat);\n> -\t\t\tunsigned int bitDepth = info.isValid() ? 
info.bitsPerPixel : defaultRawBitDepth;\n> -\t\t\tV4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth);\n> -\t\t\tBayerFormat::Packing packing = BayerFormat::Packing::CSI2;\n> -\t\t\tif (info.isValid() && !info.packed)\n> -\t\t\t\tpacking = BayerFormat::Packing::None;\n> -\t\t\tV4L2DeviceFormat unicamFormat = toV4L2DeviceFormat(unicam, sensorFormat, packing);\n> -\t\t\tint ret = unicam->tryFormat(&unicamFormat);\n> -\t\t\tif (ret)\n> -\t\t\t\treturn Invalid;\n> -\n> -\t\t\t/*\n> -\t\t\t * Some sensors change their Bayer order when they are\n> -\t\t\t * h-flipped or v-flipped, according to the transform.\n> -\t\t\t * If this one does, we must advertise the transformed\n> -\t\t\t * Bayer order in the raw stream. Note how we must\n> -\t\t\t * fetch the \"native\" (i.e. untransformed) Bayer order,\n> -\t\t\t * because the sensor may currently be flipped!\n> -\t\t\t */\n> -\t\t\tV4L2PixelFormat fourcc = unicamFormat.fourcc;\n> -\t\t\tif (data_->flipsAlterBayerOrder_) {\n> -\t\t\t\tBayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);\n> -\t\t\t\tbayer.order = data_->nativeBayerOrder_;\n> -\t\t\t\tbayer = bayer.transform(combinedTransform_);\n> -\t\t\t\tfourcc = bayer.toV4L2PixelFormat();\n> -\t\t\t}\n> -\n> -\t\t\tPixelFormat unicamPixFormat = fourcc.toPixelFormat();\n> -\t\t\tif (cfg.size != unicamFormat.size ||\n> -\t\t\t    cfg.pixelFormat != unicamPixFormat) {\n> -\t\t\t\tcfg.size = unicamFormat.size;\n> -\t\t\t\tcfg.pixelFormat = unicamPixFormat;\n> -\t\t\t\tstatus = Adjusted;\n> -\t\t\t}\n> -\n> -\t\t\tcfg.stride = unicamFormat.planes[0].bpl;\n> -\t\t\tcfg.frameSize = unicamFormat.planes[0].size;\n> -\n> -\t\t\trawCount++;\n> -\t\t} else {\n> -\t\t\toutSize[outCount] = std::make_pair(count, cfg.size);\n> -\t\t\t/* Record the largest resolution for fixups later. 
*/\n> -\t\t\tif (maxSize < cfg.size) {\n> -\t\t\t\tmaxSize = cfg.size;\n> -\t\t\t\tmaxIndex = outCount;\n> -\t\t\t}\n> -\t\t\toutCount++;\n> -\t\t}\n> -\n> -\t\tcount++;\n> -\n> -\t\t/* Can only output 1 RAW stream, or 2 YUV/RGB streams. */\n> -\t\tif (rawCount > 1 || outCount > 2) {\n> -\t\t\tLOG(RPI, Error) << \"Invalid number of streams requested\";\n> -\t\t\treturn Invalid;\n> -\t\t}\n> -\t}\n> -\n> -\t/*\n> -\t * Now do any fixups needed. For the two ISP outputs, one stream must be\n> -\t * equal or smaller than the other in all dimensions.\n> -\t */\n> -\tfor (unsigned int i = 0; i < outCount; i++) {\n> -\t\toutSize[i].second.width = std::min(outSize[i].second.width,\n> -\t\t\t\t\t\t   maxSize.width);\n> -\t\toutSize[i].second.height = std::min(outSize[i].second.height,\n> -\t\t\t\t\t\t    maxSize.height);\n> -\n> -\t\tif (config_.at(outSize[i].first).size != outSize[i].second) {\n> -\t\t\tconfig_.at(outSize[i].first).size = outSize[i].second;\n> -\t\t\tstatus = Adjusted;\n> -\t\t}\n> -\n> -\t\t/*\n> -\t\t * Also validate the correct pixel formats here.\n> -\t\t * Note that Output0 and Output1 support a different\n> -\t\t * set of formats.\n> -\t\t *\n> -\t\t * Output 0 must be for the largest resolution. We will\n> -\t\t * have that fixed up in the code above.\n> -\t\t *\n> -\t\t */\n> -\t\tStreamConfiguration &cfg = config_.at(outSize[i].first);\n> -\t\tPixelFormat &cfgPixFmt = cfg.pixelFormat;\n> -\t\tV4L2VideoDevice *dev;\n> -\n> -\t\tif (i == maxIndex)\n> -\t\t\tdev = data_->isp_[Isp::Output0].dev();\n> -\t\telse\n> -\t\t\tdev = data_->isp_[Isp::Output1].dev();\n> -\n> -\t\tV4L2VideoDevice::Formats fmts = dev->formats();\n> -\n> -\t\tif (fmts.find(dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) {\n> -\t\t\t/* If we cannot find a native format, use a default one. 
*/\n> -\t\t\tcfgPixFmt = formats::NV12;\n> -\t\t\tstatus = Adjusted;\n> -\t\t}\n> -\n> -\t\tV4L2DeviceFormat format;\n> -\t\tformat.fourcc = dev->toV4L2PixelFormat(cfg.pixelFormat);\n> -\t\tformat.size = cfg.size;\n> -\t\t/* We want to send the associated YCbCr info through to the driver. */\n> -\t\tformat.colorSpace = yuvColorSpace_;\n> -\n> -\t\tLOG(RPI, Debug)\n> -\t\t\t<< \"Try color space \" << ColorSpace::toString(cfg.colorSpace);\n> -\n> -\t\tint ret = dev->tryFormat(&format);\n> -\t\tif (ret)\n> -\t\t\treturn Invalid;\n> -\n> -\t\t/*\n> -\t\t * But for RGB streams, the YCbCr info gets overwritten on the way back\n> -\t\t * so we must check against what the stream cfg says, not what we actually\n> -\t\t * requested (which carefully included the YCbCr info)!\n> -\t\t */\n> -\t\tif (cfg.colorSpace != format.colorSpace) {\n> -\t\t\tstatus = Adjusted;\n> -\t\t\tLOG(RPI, Debug)\n> -\t\t\t\t<< \"Color space changed from \"\n> -\t\t\t\t<< ColorSpace::toString(cfg.colorSpace) << \" to \"\n> -\t\t\t\t<< ColorSpace::toString(format.colorSpace);\n> -\t\t}\n> -\n> -\t\tcfg.colorSpace = format.colorSpace;\n> -\n> -\t\tcfg.stride = format.planes[0].bpl;\n> -\t\tcfg.frameSize = format.planes[0].size;\n> -\n> -\t}\n> -\n> -\treturn status;\n> -}\n> -\n> -PipelineHandlerRPi::PipelineHandlerRPi(CameraManager *manager)\n> -\t: PipelineHandler(manager)\n> -{\n> -}\n> -\n> -std::unique_ptr<CameraConfiguration>\n> -PipelineHandlerRPi::generateConfiguration(Camera *camera, const StreamRoles &roles)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\tstd::unique_ptr<CameraConfiguration> config =\n> -\t\tstd::make_unique<RPiCameraConfiguration>(data);\n> -\tV4L2SubdeviceFormat sensorFormat;\n> -\tunsigned int bufferCount;\n> -\tPixelFormat pixelFormat;\n> -\tV4L2VideoDevice::Formats fmts;\n> -\tSize size;\n> -\tstd::optional<ColorSpace> colorSpace;\n> -\n> -\tif (roles.empty())\n> -\t\treturn config;\n> -\n> -\tunsigned int rawCount = 0;\n> -\tunsigned int outCount = 0;\n> 
-\tSize sensorSize = data->sensor_->resolution();\n> -\tfor (const StreamRole role : roles) {\n> -\t\tswitch (role) {\n> -\t\tcase StreamRole::Raw:\n> -\t\t\tsize = sensorSize;\n> -\t\t\tsensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth);\n> -\t\t\tpixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code,\n> -\t\t\t\t\t\t\t    BayerFormat::Packing::CSI2);\n> -\t\t\tASSERT(pixelFormat.isValid());\n> -\t\t\tcolorSpace = ColorSpace::Raw;\n> -\t\t\tbufferCount = 2;\n> -\t\t\trawCount++;\n> -\t\t\tbreak;\n> -\n> -\t\tcase StreamRole::StillCapture:\n> -\t\t\tfmts = data->isp_[Isp::Output0].dev()->formats();\n> -\t\t\tpixelFormat = formats::NV12;\n> -\t\t\t/*\n> -\t\t\t * Still image codecs usually expect the sYCC color space.\n> -\t\t\t * Even RGB codecs will be fine as the RGB we get with the\n> -\t\t\t * sYCC color space is the same as sRGB.\n> -\t\t\t */\n> -\t\t\tcolorSpace = ColorSpace::Sycc;\n> -\t\t\t/* Return the largest sensor resolution. */\n> -\t\t\tsize = sensorSize;\n> -\t\t\tbufferCount = 1;\n> -\t\t\toutCount++;\n> -\t\t\tbreak;\n> -\n> -\t\tcase StreamRole::VideoRecording:\n> -\t\t\t/*\n> -\t\t\t * The colour denoise algorithm requires the analysis\n> -\t\t\t * image, produced by the second ISP output, to be in\n> -\t\t\t * YUV420 format. 
Select this format as the default, to\n> -\t\t\t * maximize chances that it will be picked by\n> -\t\t\t * applications and enable usage of the colour denoise\n> -\t\t\t * algorithm.\n> -\t\t\t */\n> -\t\t\tfmts = data->isp_[Isp::Output0].dev()->formats();\n> -\t\t\tpixelFormat = formats::YUV420;\n> -\t\t\t/*\n> -\t\t\t * Choose a color space appropriate for video recording.\n> -\t\t\t * Rec.709 will be a good default for HD resolutions.\n> -\t\t\t */\n> -\t\t\tcolorSpace = ColorSpace::Rec709;\n> -\t\t\tsize = { 1920, 1080 };\n> -\t\t\tbufferCount = 4;\n> -\t\t\toutCount++;\n> -\t\t\tbreak;\n> -\n> -\t\tcase StreamRole::Viewfinder:\n> -\t\t\tfmts = data->isp_[Isp::Output0].dev()->formats();\n> -\t\t\tpixelFormat = formats::ARGB8888;\n> -\t\t\tcolorSpace = ColorSpace::Sycc;\n> -\t\t\tsize = { 800, 600 };\n> -\t\t\tbufferCount = 4;\n> -\t\t\toutCount++;\n> -\t\t\tbreak;\n> -\n> -\t\tdefault:\n> -\t\t\tLOG(RPI, Error) << \"Requested stream role not supported: \"\n> -\t\t\t\t\t<< role;\n> -\t\t\treturn nullptr;\n> -\t\t}\n> -\n> -\t\tif (rawCount > 1 || outCount > 2) {\n> -\t\t\tLOG(RPI, Error) << \"Invalid stream roles requested\";\n> -\t\t\treturn nullptr;\n> -\t\t}\n> -\n> -\t\tstd::map<PixelFormat, std::vector<SizeRange>> deviceFormats;\n> -\t\tif (role == StreamRole::Raw) {\n> -\t\t\t/* Translate the MBUS codes to a PixelFormat. */\n> -\t\t\tfor (const auto &format : data->sensorFormats_) {\n> -\t\t\t\tPixelFormat pf = mbusCodeToPixelFormat(format.first,\n> -\t\t\t\t\t\t\t\t       BayerFormat::Packing::CSI2);\n> -\t\t\t\tif (pf.isValid())\n> -\t\t\t\t\tdeviceFormats.emplace(std::piecewise_construct,\tstd::forward_as_tuple(pf),\n> -\t\t\t\t\t\tstd::forward_as_tuple(format.second.begin(), format.second.end()));\n> -\t\t\t}\n> -\t\t} else {\n> -\t\t\t/*\n> -\t\t\t * Translate the V4L2PixelFormat to PixelFormat. 
Note that we\n> -\t\t\t * limit the recommended largest ISP output size to match the\n> -\t\t\t * sensor resolution.\n> -\t\t\t */\n> -\t\t\tfor (const auto &format : fmts) {\n> -\t\t\t\tPixelFormat pf = format.first.toPixelFormat();\n> -\t\t\t\tif (pf.isValid()) {\n> -\t\t\t\t\tconst SizeRange &ispSizes = format.second[0];\n> -\t\t\t\t\tdeviceFormats[pf].emplace_back(ispSizes.min, sensorSize,\n> -\t\t\t\t\t\t\t\t       ispSizes.hStep, ispSizes.vStep);\n> -\t\t\t\t}\n> -\t\t\t}\n> -\t\t}\n> -\n> -\t\t/* Add the stream format based on the device node used for the use case. */\n> -\t\tStreamFormats formats(deviceFormats);\n> -\t\tStreamConfiguration cfg(formats);\n> -\t\tcfg.size = size;\n> -\t\tcfg.pixelFormat = pixelFormat;\n> -\t\tcfg.colorSpace = colorSpace;\n> -\t\tcfg.bufferCount = bufferCount;\n> -\t\tconfig->addConfiguration(cfg);\n> -\t}\n> -\n> -\tconfig->validate();\n> -\n> -\treturn config;\n> -}\n> -\n> -int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\tint ret;\n> -\n> -\t/* Start by freeing all buffers and reset the Unicam and ISP stream states. 
*/\n> -\tdata->freeBuffers();\n> -\tfor (auto const stream : data->streams_)\n> -\t\tstream->setExternal(false);\n> -\n> -\tBayerFormat::Packing packing = BayerFormat::Packing::CSI2;\n> -\tSize maxSize, sensorSize;\n> -\tunsigned int maxIndex = 0;\n> -\tbool rawStream = false;\n> -\tunsigned int bitDepth = defaultRawBitDepth;\n> -\n> -\t/*\n> -\t * Look for the RAW stream (if given) size as well as the largest\n> -\t * ISP output size.\n> -\t */\n> -\tfor (unsigned i = 0; i < config->size(); i++) {\n> -\t\tStreamConfiguration &cfg = config->at(i);\n> -\n> -\t\tif (isRaw(cfg.pixelFormat)) {\n> -\t\t\t/*\n> -\t\t\t * If we have been given a RAW stream, use that size\n> -\t\t\t * for setting up the sensor.\n> -\t\t\t */\n> -\t\t\tsensorSize = cfg.size;\n> -\t\t\trawStream = true;\n> -\t\t\t/* Check if the user has explicitly set an unpacked format. */\n> -\t\t\tBayerFormat bayerFormat = BayerFormat::fromPixelFormat(cfg.pixelFormat);\n> -\t\t\tpacking = bayerFormat.packing;\n> -\t\t\tbitDepth = bayerFormat.bitDepth;\n> -\t\t} else {\n> -\t\t\tif (cfg.size > maxSize) {\n> -\t\t\t\tmaxSize = config->at(i).size;\n> -\t\t\t\tmaxIndex = i;\n> -\t\t\t}\n> -\t\t}\n> -\t}\n> -\n> -\t/*\n> -\t * Calculate the best sensor mode we can use based on the user's\n> -\t * request, and apply it to the sensor with the cached transform, if\n> -\t * any.\n> -\t */\n> -\tV4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_, rawStream ? 
sensorSize : maxSize, bitDepth);\n> -\tconst RPiCameraConfiguration *rpiConfig = static_cast<const RPiCameraConfiguration *>(config);\n> -\tret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tV4L2VideoDevice *unicam = data->unicam_[Unicam::Image].dev();\n> -\tV4L2DeviceFormat unicamFormat = toV4L2DeviceFormat(unicam, sensorFormat, packing);\n> -\tret = unicam->setFormat(&unicamFormat);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tLOG(RPI, Info) << \"Sensor: \" << camera->id()\n> -\t\t       << \" - Selected sensor format: \" << sensorFormat\n> -\t\t       << \" - Selected unicam format: \" << unicamFormat;\n> -\n> -\tret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\t/*\n> -\t * See which streams are requested, and route the user\n> -\t * StreamConfiguration appropriately.\n> -\t */\n> -\tV4L2DeviceFormat format;\n> -\tbool output0Set = false, output1Set = false;\n> -\tfor (unsigned i = 0; i < config->size(); i++) {\n> -\t\tStreamConfiguration &cfg = config->at(i);\n> -\n> -\t\tif (isRaw(cfg.pixelFormat)) {\n> -\t\t\tcfg.setStream(&data->unicam_[Unicam::Image]);\n> -\t\t\tdata->unicam_[Unicam::Image].setExternal(true);\n> -\t\t\tcontinue;\n> -\t\t}\n> -\n> -\t\t/* The largest resolution gets routed to the ISP Output 0 node. */\n> -\t\tRPi::Stream *stream = i == maxIndex ? 
&data->isp_[Isp::Output0]\n> -\t\t\t\t\t\t    : &data->isp_[Isp::Output1];\n> -\n> -\t\tV4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg.pixelFormat);\n> -\t\tformat.size = cfg.size;\n> -\t\tformat.fourcc = fourcc;\n> -\t\tformat.colorSpace = cfg.colorSpace;\n> -\n> -\t\tLOG(RPI, Debug) << \"Setting \" << stream->name() << \" to \"\n> -\t\t\t\t<< format;\n> -\n> -\t\tret = stream->dev()->setFormat(&format);\n> -\t\tif (ret)\n> -\t\t\treturn -EINVAL;\n> -\n> -\t\tif (format.size != cfg.size || format.fourcc != fourcc) {\n> -\t\t\tLOG(RPI, Error)\n> -\t\t\t\t<< \"Failed to set requested format on \" << stream->name()\n> -\t\t\t\t<< \", returned \" << format;\n> -\t\t\treturn -EINVAL;\n> -\t\t}\n> -\n> -\t\tLOG(RPI, Debug)\n> -\t\t\t<< \"Stream \" << stream->name() << \" has color space \"\n> -\t\t\t<< ColorSpace::toString(cfg.colorSpace);\n> -\n> -\t\tcfg.setStream(stream);\n> -\t\tstream->setExternal(true);\n> -\n> -\t\tif (i != maxIndex)\n> -\t\t\toutput1Set = true;\n> -\t\telse\n> -\t\t\toutput0Set = true;\n> -\t}\n> -\n> -\t/*\n> -\t * If ISP::Output0 stream has not been configured by the application,\n> -\t * we must allow the hardware to generate an output so that the data\n> -\t * flow in the pipeline handler remains consistent, and we still generate\n> -\t * statistics for the IPA to use. So enable the output at a very low\n> -\t * resolution for internal use.\n> -\t *\n> -\t * \\todo Allow the pipeline to work correctly without Output0 and only\n> -\t * statistics coming from the hardware.\n> -\t */\n> -\tif (!output0Set) {\n> -\t\tV4L2VideoDevice *dev = data->isp_[Isp::Output0].dev();\n> -\n> -\t\tmaxSize = Size(320, 240);\n> -\t\tformat = {};\n> -\t\tformat.size = maxSize;\n> -\t\tformat.fourcc = dev->toV4L2PixelFormat(formats::YUV420);\n> -\t\t/* No one asked for output, so the color space doesn't matter. 
*/\n> -\t\tformat.colorSpace = ColorSpace::Sycc;\n> -\t\tret = dev->setFormat(&format);\n> -\t\tif (ret) {\n> -\t\t\tLOG(RPI, Error)\n> -\t\t\t\t<< \"Failed to set default format on ISP Output0: \"\n> -\t\t\t\t<< ret;\n> -\t\t\treturn -EINVAL;\n> -\t\t}\n> -\n> -\t\tLOG(RPI, Debug) << \"Defaulting ISP Output0 format to \"\n> -\t\t\t\t<< format;\n> -\t}\n> -\n> -\t/*\n> -\t * If ISP::Output1 stream has not been requested by the application, we\n> -\t * set it up for internal use now. This second stream will be used for\n> -\t * fast colour denoise, and must be a quarter resolution of the ISP::Output0\n> -\t * stream. However, also limit the maximum size to 1200 pixels in the\n> -\t * larger dimension, just to avoid being wasteful with buffer allocations\n> -\t * and memory bandwidth.\n> -\t *\n> -\t * \\todo If Output 1 format is not YUV420, Output 1 ought to be disabled as\n> -\t * colour denoise will not run.\n> -\t */\n> -\tif (!output1Set) {\n> -\t\tV4L2VideoDevice *dev = data->isp_[Isp::Output1].dev();\n> -\n> -\t\tV4L2DeviceFormat output1Format;\n> -\t\tconstexpr Size maxDimensions(1200, 1200);\n> -\t\tconst Size limit = maxDimensions.boundedToAspectRatio(format.size);\n> -\n> -\t\toutput1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2);\n> -\t\toutput1Format.colorSpace = format.colorSpace;\n> -\t\toutput1Format.fourcc = dev->toV4L2PixelFormat(formats::YUV420);\n> -\n> -\t\tLOG(RPI, Debug) << \"Setting ISP Output1 (internal) to \"\n> -\t\t\t\t<< output1Format;\n> -\n> -\t\tret = dev->setFormat(&output1Format);\n> -\t\tif (ret) {\n> -\t\t\tLOG(RPI, Error) << \"Failed to set format on ISP Output1: \"\n> -\t\t\t\t\t<< ret;\n> -\t\t\treturn -EINVAL;\n> -\t\t}\n> -\t}\n> -\n> -\t/* ISP statistics output format. 
*/\n> -\tformat = {};\n> -\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_BCM2835_ISP_STATS);\n> -\tret = data->isp_[Isp::Stats].dev()->setFormat(&format);\n> -\tif (ret) {\n> -\t\tLOG(RPI, Error) << \"Failed to set format on ISP stats stream: \"\n> -\t\t\t\t<< format;\n> -\t\treturn ret;\n> -\t}\n> -\n> -\t/* Figure out the smallest selection the ISP will allow. */\n> -\tRectangle testCrop(0, 0, 1, 1);\n> -\tdata->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop);\n> -\tdata->ispMinCropSize_ = testCrop.size();\n> -\n> -\t/* Adjust aspect ratio by providing crops on the input image. */\n> -\tSize size = unicamFormat.size.boundedToAspectRatio(maxSize);\n> -\tRectangle crop = size.centeredTo(Rectangle(unicamFormat.size).center());\n> -\tRectangle defaultCrop = crop;\n> -\tdata->ispCrop_ = crop;\n> -\n> -\tdata->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop);\n> -\n> -\tipa::RPi::ConfigResult result;\n> -\tret = data->configureIPA(config, &result);\n> -\tif (ret)\n> -\t\tLOG(RPI, Error) << \"Failed to configure the IPA: \" << ret;\n> -\n> -\t/*\n> -\t * Set the scaler crop to the value we are using (scaled to native sensor\n> -\t * coordinates).\n> -\t */\n> -\tdata->scalerCrop_ = data->scaleIspCrop(data->ispCrop_);\n> -\n> -\t/*\n> -\t * Configure the Unicam embedded data output format only if the sensor\n> -\t * supports it.\n> -\t */\n> -\tif (data->sensorMetadata_) {\n> -\t\tV4L2SubdeviceFormat embeddedFormat;\n> -\n> -\t\tdata->sensor_->device()->getFormat(1, &embeddedFormat);\n> -\t\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> -\t\tformat.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;\n> -\n> -\t\tLOG(RPI, Debug) << \"Setting embedded data format.\";\n> -\t\tret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> -\t\tif (ret) {\n> -\t\t\tLOG(RPI, Error) << \"Failed to set format on Unicam embedded: \"\n> -\t\t\t\t\t<< format;\n> -\t\t\treturn ret;\n> -\t\t}\n> -\t}\n> -\n> 
-\t/*\n> -\t * Update the ScalerCropMaximum to the correct value for this camera mode.\n> -\t * For us, it's the same as the \"analogue crop\".\n> -\t *\n> -\t * \\todo Make this property the ScalerCrop maximum value when dynamic\n> -\t * controls are available and set it at validate() time\n> -\t */\n> -\tdata->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop);\n> -\n> -\t/* Store the mode sensitivity for the application. */\n> -\tdata->properties_.set(properties::SensorSensitivity, result.modeSensitivity);\n> -\n> -\t/* Update the controls that the Raspberry Pi IPA can handle. */\n> -\tControlInfoMap::Map ctrlMap;\n> -\tfor (auto const &c : result.controlInfo)\n> -\t\tctrlMap.emplace(c.first, c.second);\n> -\n> -\t/* Add the ScalerCrop control limits based on the current mode. */\n> -\tRectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_));\n> -\tdefaultCrop = data->scaleIspCrop(defaultCrop);\n> -\tctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, defaultCrop);\n> -\n> -\tdata->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap());\n> -\n> -\t/* Setup the Video Mux/Bridge entities. 
*/\n> -\tfor (auto &[device, link] : data->bridgeDevices_) {\n> -\t\t/*\n> -\t\t * Start by disabling all the sink pad links on the devices in the\n> -\t\t * cascade, with the exception of the link connecting the device.\n> -\t\t */\n> -\t\tfor (const MediaPad *p : device->entity()->pads()) {\n> -\t\t\tif (!(p->flags() & MEDIA_PAD_FL_SINK))\n> -\t\t\t\tcontinue;\n> -\n> -\t\t\tfor (MediaLink *l : p->links()) {\n> -\t\t\t\tif (l != link)\n> -\t\t\t\t\tl->setEnabled(false);\n> -\t\t\t}\n> -\t\t}\n> -\n> -\t\t/*\n> -\t\t * Next, enable the entity -> entity links, and set up the pad format.\n> -\t\t *\n> -\t\t * \todo Some bridge devices may change the media bus code, so we\n> -\t\t * ought to read the source pad format and propagate it to the sink pad.\n> -\t\t */\n> -\t\tlink->setEnabled(true);\n> -\t\tconst MediaPad *sinkPad = link->sink();\n> -\t\tret = device->setFormat(sinkPad->index(), &sensorFormat);\n> -\t\tif (ret) {\n> -\t\t\tLOG(RPI, Error) << \"Failed to set format on \" << device->entity()->name()\n> -\t\t\t\t\t<< \" pad \" << sinkPad->index()\n> -\t\t\t\t\t<< \" with format \" << format\n> -\t\t\t\t\t<< \": \" << ret;\n> -\t\t\treturn ret;\n> -\t\t}\n> -\n> -\t\tLOG(RPI, Debug) << \"Configured media link on device \" << device->entity()->name()\n> -\t\t\t\t<< \" on pad \" << sinkPad->index();\n> -\t}\n> -\n> -\treturn ret;\n> -}\n> -\n> -int PipelineHandlerRPi::exportFrameBuffers([[maybe_unused]] Camera *camera, Stream *stream,\n> -\t\t\t\t\t   std::vector<std::unique_ptr<FrameBuffer>> *buffers)\n> -{\n> -\tRPi::Stream *s = static_cast<RPi::Stream *>(stream);\n> -\tunsigned int count = stream->configuration().bufferCount;\n> -\tint ret = s->dev()->exportBuffers(count, buffers);\n> -\n> -\ts->setExportedBuffers(buffers);\n> -\n> -\treturn ret;\n> -}\n> -\n> -int PipelineHandlerRPi::start(Camera *camera, const ControlList *controls)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\tint ret;\n> -\n> -\t/* Check if a ScalerCrop control was 
specified. */\n> -\tif (controls)\n> -\t\tdata->applyScalerCrop(*controls);\n> -\n> -\t/* Start the IPA. */\n> -\tipa::RPi::StartResult result;\n> -\tdata->ipa_->start(controls ? *controls : ControlList{ controls::controls },\n> -\t\t\t  &result);\n> -\n> -\t/* Apply any gain/exposure settings that the IPA may have passed back. */\n> -\tif (!result.controls.empty())\n> -\t\tdata->setSensorControls(result.controls);\n> -\n> -\t/* Configure the number of dropped frames required on startup. */\n> -\tdata->dropFrameCount_ = data->config_.disableStartupFrameDrops\n> -\t\t\t\t? 0 : result.dropFrameCount;\n> -\n> -\tfor (auto const stream : data->streams_)\n> -\t\tstream->resetBuffers();\n> -\n> -\tif (!data->buffersAllocated_) {\n> -\t\t/* Allocate buffers for internal pipeline usage. */\n> -\t\tret = prepareBuffers(camera);\n> -\t\tif (ret) {\n> -\t\t\tLOG(RPI, Error) << \"Failed to allocate buffers\";\n> -\t\t\tdata->freeBuffers();\n> -\t\t\tstop(camera);\n> -\t\t\treturn ret;\n> -\t\t}\n> -\t\tdata->buffersAllocated_ = true;\n> -\t}\n> -\n> -\t/* We need to set the dropFrameCount_ before queueing buffers. */\n> -\tret = queueAllBuffers(camera);\n> -\tif (ret) {\n> -\t\tLOG(RPI, Error) << \"Failed to queue buffers\";\n> -\t\tstop(camera);\n> -\t\treturn ret;\n> -\t}\n> -\n> -\t/* Enable SOF event generation. */\n> -\tdata->unicam_[Unicam::Image].dev()->setFrameStartEnabled(true);\n> -\n> -\t/*\n> -\t * Reset the delayed controls with the gain and exposure values set by\n> -\t * the IPA.\n> -\t */\n> -\tdata->delayedCtrls_->reset(0);\n> -\n> -\tdata->state_ = RPiCameraData::State::Idle;\n> -\n> -\t/* Start all streams. 
*/\n> -\tfor (auto const stream : data->streams_) {\n> -\t\tret = stream->dev()->streamOn();\n> -\t\tif (ret) {\n> -\t\t\tstop(camera);\n> -\t\t\treturn ret;\n> -\t\t}\n> -\t}\n> -\n> -\treturn 0;\n> -}\n> -\n> -void PipelineHandlerRPi::stopDevice(Camera *camera)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\n> -\tdata->state_ = RPiCameraData::State::Stopped;\n> -\n> -\t/* Disable SOF event generation. */\n> -\tdata->unicam_[Unicam::Image].dev()->setFrameStartEnabled(false);\n> -\n> -\tfor (auto const stream : data->streams_)\n> -\t\tstream->dev()->streamOff();\n> -\n> -\tdata->clearIncompleteRequests();\n> -\tdata->bayerQueue_ = {};\n> -\tdata->embeddedQueue_ = {};\n> -\n> -\t/* Stop the IPA. */\n> -\tdata->ipa_->stop();\n> -}\n> -\n> -int PipelineHandlerRPi::queueRequestDevice(Camera *camera, Request *request)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\n> -\tif (!data->isRunning())\n> -\t\treturn -EINVAL;\n> -\n> -\tLOG(RPI, Debug) << \"queueRequestDevice: New request.\";\n> -\n> -\t/* Push all buffers supplied in the Request to the respective streams. */\n> -\tfor (auto stream : data->streams_) {\n> -\t\tif (!stream->isExternal())\n> -\t\t\tcontinue;\n> -\n> -\t\tFrameBuffer *buffer = request->findBuffer(stream);\n> -\t\tif (buffer && !stream->getBufferId(buffer)) {\n> -\t\t\t/*\n> -\t\t\t * This buffer is not recognised, so it must have been allocated\n> -\t\t\t * outside the v4l2 device. Store it in the stream buffer list\n> -\t\t\t * so we can track it.\n> -\t\t\t */\n> -\t\t\tstream->setExternalBuffer(buffer);\n> -\t\t}\n> -\n> -\t\t/*\n> -\t\t * If no buffer is provided by the request for this stream, we\n> -\t\t * queue a nullptr to the stream to signify that it must use an\n> -\t\t * internally allocated buffer for this capture request. 
This\n> -\t\t * buffer will not be given back to the application, but is used\n> -\t\t * to support the internal pipeline flow.\n> -\t\t *\n> -\t\t * The below queueBuffer() call will do nothing if there are not\n> -\t\t * enough internal buffers allocated, but this will be handled by\n> -\t\t * queuing the request for buffers in the RPiStream object.\n> -\t\t */\n> -\t\tint ret = stream->queueBuffer(buffer);\n> -\t\tif (ret)\n> -\t\t\treturn ret;\n> -\t}\n> -\n> -\t/* Push the request to the back of the queue. */\n> -\tdata->requestQueue_.push_back(request);\n> -\tdata->handleState();\n> -\n> -\treturn 0;\n> -}\n> -\n> -bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n> -{\n> -\tconstexpr unsigned int numUnicamDevices = 2;\n> -\n> -\t/*\n> -\t * Loop over all Unicam instances, but return as soon as a match is found.\n> -\t * This is to ensure we correctly enumerate the camera when an instance\n> -\t * of Unicam has registered with the media controller, but has not registered\n> -\t * device nodes due to a sensor subdevice failure.\n> -\t */\n> -\tfor (unsigned int i = 0; i < numUnicamDevices; i++) {\n> -\t\tDeviceMatch unicam(\"unicam\");\n> -\t\tMediaDevice *unicamDevice = acquireMediaDevice(enumerator, unicam);\n> -\n> -\t\tif (!unicamDevice) {\n> -\t\t\tLOG(RPI, Debug) << \"Unable to acquire a Unicam instance\";\n> -\t\t\tcontinue;\n> -\t\t}\n> -\n> -\t\tDeviceMatch isp(\"bcm2835-isp\");\n> -\t\tMediaDevice *ispDevice = acquireMediaDevice(enumerator, isp);\n> -\n> -\t\tif (!ispDevice) {\n> -\t\t\tLOG(RPI, Debug) << \"Unable to acquire ISP instance\";\n> -\t\t\tcontinue;\n> -\t\t}\n> -\n> -\t\t/*\n> -\t\t * The loop below is used to register multiple cameras behind one or more\n> -\t\t * video mux devices that are attached to a particular Unicam instance.\n> -\t\t * Obviously these cameras cannot be used simultaneously.\n> -\t\t */\n> -\t\tunsigned int numCameras = 0;\n> -\t\tfor (MediaEntity *entity : unicamDevice->entities()) {\n> -\t\t\tif 
(entity->function() != MEDIA_ENT_F_CAM_SENSOR)\n> -\t\t\t\tcontinue;\n> -\n> -\t\t\tint ret = registerCamera(unicamDevice, ispDevice, entity);\n> -\t\t\tif (ret)\n> -\t\t\t\tLOG(RPI, Error) << \"Failed to register camera \"\n> -\t\t\t\t\t\t<< entity->name() << \": \" << ret;\n> -\t\t\telse\n> -\t\t\t\tnumCameras++;\n> -\t\t}\n> -\n> -\t\tif (numCameras)\n> -\t\t\treturn true;\n> -\t}\n> -\n> -\treturn false;\n> -}\n> -\n> -void PipelineHandlerRPi::releaseDevice(Camera *camera)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\tdata->freeBuffers();\n> -}\n> -\n> -int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity)\n> -{\n> -\tstd::unique_ptr<RPiCameraData> data = std::make_unique<RPiCameraData>(this);\n> -\n> -\tif (!data->dmaHeap_.isValid())\n> -\t\treturn -ENOMEM;\n> -\n> -\tMediaEntity *unicamImage = unicam->getEntityByName(\"unicam-image\");\n> -\tMediaEntity *ispOutput0 = isp->getEntityByName(\"bcm2835-isp0-output0\");\n> -\tMediaEntity *ispCapture1 = isp->getEntityByName(\"bcm2835-isp0-capture1\");\n> -\tMediaEntity *ispCapture2 = isp->getEntityByName(\"bcm2835-isp0-capture2\");\n> -\tMediaEntity *ispCapture3 = isp->getEntityByName(\"bcm2835-isp0-capture3\");\n> -\n> -\tif (!unicamImage || !ispOutput0 || !ispCapture1 || !ispCapture2 || !ispCapture3)\n> -\t\treturn -ENOENT;\n> -\n> -\t/* Locate and open the unicam video streams. */\n> -\tdata->unicam_[Unicam::Image] = RPi::Stream(\"Unicam Image\", unicamImage);\n> -\n> -\t/* An embedded data node will not be present if the sensor does not support it. 
*/\n> -\tMediaEntity *unicamEmbedded = unicam->getEntityByName(\"unicam-embedded\");\n> -\tif (unicamEmbedded) {\n> -\t\tdata->unicam_[Unicam::Embedded] = RPi::Stream(\"Unicam Embedded\", unicamEmbedded);\n> -\t\tdata->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data.get(),\n> -\t\t\t\t\t\t\t\t\t   &RPiCameraData::unicamBufferDequeue);\n> -\t}\n> -\n> -\t/* Tag the ISP input stream as an import stream. */\n> -\tdata->isp_[Isp::Input] = RPi::Stream(\"ISP Input\", ispOutput0, true);\n> -\tdata->isp_[Isp::Output0] = RPi::Stream(\"ISP Output0\", ispCapture1);\n> -\tdata->isp_[Isp::Output1] = RPi::Stream(\"ISP Output1\", ispCapture2);\n> -\tdata->isp_[Isp::Stats] = RPi::Stream(\"ISP Stats\", ispCapture3);\n> -\n> -\t/* Wire up all the buffer connections. */\n> -\tdata->unicam_[Unicam::Image].dev()->dequeueTimeout.connect(data.get(), &RPiCameraData::unicamTimeout);\n> -\tdata->unicam_[Unicam::Image].dev()->frameStart.connect(data.get(), &RPiCameraData::frameStarted);\n> -\tdata->unicam_[Unicam::Image].dev()->bufferReady.connect(data.get(), &RPiCameraData::unicamBufferDequeue);\n> -\tdata->isp_[Isp::Input].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispInputDequeue);\n> -\tdata->isp_[Isp::Output0].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue);\n> -\tdata->isp_[Isp::Output1].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue);\n> -\tdata->isp_[Isp::Stats].dev()->bufferReady.connect(data.get(), &RPiCameraData::ispOutputDequeue);\n> -\n> -\tdata->sensor_ = std::make_unique<CameraSensor>(sensorEntity);\n> -\tif (!data->sensor_)\n> -\t\treturn -EINVAL;\n> -\n> -\tif (data->sensor_->init())\n> -\t\treturn -EINVAL;\n> -\n> -\t/*\n> -\t * Enumerate all the Video Mux/Bridge devices across the sensor -> unicam\n> -\t * chain. 
There may be a cascade of devices in this chain!\n> -\t */\n> -\tMediaLink *link = sensorEntity->getPadByIndex(0)->links()[0];\n> -\tdata->enumerateVideoDevices(link);\n> -\n> -\tdata->sensorFormats_ = populateSensorFormats(data->sensor_);\n> -\n> -\tipa::RPi::InitResult result;\n> -\tif (data->loadIPA(&result)) {\n> -\t\tLOG(RPI, Error) << \"Failed to load a suitable IPA library\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tif (result.sensorConfig.sensorMetadata ^ !!unicamEmbedded) {\n> -\t\tLOG(RPI, Warning) << \"Mismatch between Unicam and CamHelper for embedded data usage!\";\n> -\t\tresult.sensorConfig.sensorMetadata = false;\n> -\t\tif (unicamEmbedded)\n> -\t\t\tdata->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect();\n> -\t}\n> -\n> -\t/*\n> -\t * Open all Unicam and ISP streams. The exception is the embedded data\n> -\t * stream, which only gets opened below if the IPA reports that the sensor\n> -\t * supports embedded data.\n> -\t *\n> -\t * The below grouping is just for convenience so that we can easily\n> -\t * iterate over all streams in one go.\n> -\t */\n> -\tdata->streams_.push_back(&data->unicam_[Unicam::Image]);\n> -\tif (result.sensorConfig.sensorMetadata)\n> -\t\tdata->streams_.push_back(&data->unicam_[Unicam::Embedded]);\n> -\n> -\tfor (auto &stream : data->isp_)\n> -\t\tdata->streams_.push_back(&stream);\n> -\n> -\tfor (auto stream : data->streams_) {\n> -\t\tint ret = stream->dev()->open();\n> -\t\tif (ret)\n> -\t\t\treturn ret;\n> -\t}\n> -\n> -\tif (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) {\n> -\t\tLOG(RPI, Error) << \"Unicam driver does not use the MediaController, please update your kernel!\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\t/*\n> -\t * Setup our delayed control writer with the sensor default\n> -\t * gain and exposure delays. 
Mark VBLANK for priority write.\n> -\t */\n> -\tstd::unordered_map<uint32_t, RPi::DelayedControls::ControlParams> params = {\n> -\t\t{ V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } },\n> -\t\t{ V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } },\n> -\t\t{ V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } },\n> -\t\t{ V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } }\n> -\t};\n> -\tdata->delayedCtrls_ = std::make_unique<RPi::DelayedControls>(data->sensor_->device(), params);\n> -\tdata->sensorMetadata_ = result.sensorConfig.sensorMetadata;\n> -\n> -\t/* Register initial controls that the Raspberry Pi IPA can handle. */\n> -\tdata->controlInfo_ = std::move(result.controlInfo);\n> -\n> -\t/* Initialize the camera properties. */\n> -\tdata->properties_ = data->sensor_->properties();\n> -\n> -\t/*\n> -\t * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the\n> -\t * sensor of the colour gains. It is defined to be a linear gain where\n> -\t * the default value represents a gain of exactly one.\n> -\t */\n> -\tauto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS);\n> -\tif (it != data->sensor_->controls().end())\n> -\t\tdata->notifyGainsUnity_ = it->second.def().get<int32_t>();\n> -\n> -\t/*\n> -\t * Set a default value for the ScalerCropMaximum property to show\n> -\t * that we support its use, however, initialise it to zero because\n> -\t * it's not meaningful until a camera mode has been chosen.\n> -\t */\n> -\tdata->properties_.set(properties::ScalerCropMaximum, Rectangle{});\n> -\n> -\t/*\n> -\t * We cache two things about the sensor in relation to transforms\n> -\t * (meaning horizontal and vertical flips): if they affect the Bayer\n> -\t * ordering, and what the \"native\" Bayer order is, when no transforms\n> -\t * are applied.\n> -\t *\n> -\t * We note that the sensor's cached list of supported formats is\n> -\t * already in the \"native\" order, with any flips having been 
undone.\n> -\t */\n> -\tconst V4L2Subdevice *sensor = data->sensor_->device();\n> -\tconst struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);\n> -\tif (hflipCtrl) {\n> -\t\t/* We assume it will support vflips too... */\n> -\t\tdata->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;\n> -\t}\n> -\n> -\t/* Look for a valid Bayer format. */\n> -\tBayerFormat bayerFormat;\n> -\tfor (const auto &iter : data->sensorFormats_) {\n> -\t\tbayerFormat = BayerFormat::fromMbusCode(iter.first);\n> -\t\tif (bayerFormat.isValid())\n> -\t\t\tbreak;\n> -\t}\n> -\n> -\tif (!bayerFormat.isValid()) {\n> -\t\tLOG(RPI, Error) << \"No Bayer format found\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\tdata->nativeBayerOrder_ = bayerFormat.order;\n> -\n> -\t/*\n> -\t * List the available streams an application may request. At present, we\n> -\t * do not advertise Unicam Embedded and ISP Statistics streams, as there\n> -\t * is no mechanism for the application to request non-image buffer formats.\n> -\t */\n> -\tstd::set<Stream *> streams;\n> -\tstreams.insert(&data->unicam_[Unicam::Image]);\n> -\tstreams.insert(&data->isp_[Isp::Output0]);\n> -\tstreams.insert(&data->isp_[Isp::Output1]);\n> -\n> -\tint ret = data->loadPipelineConfiguration();\n> -\tif (ret) {\n> -\t\tLOG(RPI, Error) << \"Unable to load pipeline configuration\";\n> -\t\treturn ret;\n> -\t}\n> -\n> -\t/* Create and register the camera. 
*/\n> -\tconst std::string &id = data->sensor_->id();\n> -\tstd::shared_ptr<Camera> camera =\n> -\t\tCamera::create(std::move(data), id, streams);\n> -\tPipelineHandler::registerCamera(std::move(camera));\n> -\n> -\tLOG(RPI, Info) << \"Registered camera \" << id\n> -\t\t       << \" to Unicam device \" << unicam->deviceNode()\n> -\t\t       << \" and ISP device \" << isp->deviceNode();\n> -\treturn 0;\n> -}\n> -\n> -int PipelineHandlerRPi::queueAllBuffers(Camera *camera)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\tint ret;\n> -\n> -\tfor (auto const stream : data->streams_) {\n> -\t\tif (!stream->isExternal()) {\n> -\t\t\tret = stream->queueAllBuffers();\n> -\t\t\tif (ret < 0)\n> -\t\t\t\treturn ret;\n> -\t\t} else {\n> -\t\t\t/*\n> -\t\t\t * For external streams, we must queue up a set of internal\n> -\t\t\t * buffers to handle the number of drop frames requested by\n> -\t\t\t * the IPA. This is done by passing nullptr in queueBuffer().\n> -\t\t\t *\n> -\t\t\t * The below queueBuffer() call will do nothing if there\n> -\t\t\t * are not enough internal buffers allocated, but this will\n> -\t\t\t * be handled by queuing the request for buffers in the\n> -\t\t\t * RPiStream object.\n> -\t\t\t */\n> -\t\t\tunsigned int i;\n> -\t\t\tfor (i = 0; i < data->dropFrameCount_; i++) {\n> -\t\t\t\tret = stream->queueBuffer(nullptr);\n> -\t\t\t\tif (ret)\n> -\t\t\t\t\treturn ret;\n> -\t\t\t}\n> -\t\t}\n> -\t}\n> -\n> -\treturn 0;\n> -}\n> -\n> -int PipelineHandlerRPi::prepareBuffers(Camera *camera)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\tunsigned int numRawBuffers = 0;\n> -\tint ret;\n> -\n> -\tfor (Stream *s : camera->streams()) {\n> -\t\tif (isRaw(s->configuration().pixelFormat)) {\n> -\t\t\tnumRawBuffers = s->configuration().bufferCount;\n> -\t\t\tbreak;\n> -\t\t}\n> -\t}\n> -\n> -\t/* Decide how many internal buffers to allocate. 
*/\n> -\tfor (auto const stream : data->streams_) {\n> -\t\tunsigned int numBuffers;\n> -\t\t/*\n> -\t\t * For Unicam, allocate a minimum number of buffers for internal\n> -\t\t * use as we want to avoid any frame drops.\n> -\t\t */\n> -\t\tconst unsigned int minBuffers = data->config_.minTotalUnicamBuffers;\n> -\t\tif (stream == &data->unicam_[Unicam::Image]) {\n> -\t\t\t/*\n> -\t\t\t * If an application has configured a RAW stream, allocate\n> -\t\t\t * additional buffers to make up the minimum, but ensure\n> -\t\t\t * we have at least minUnicamBuffers of internal buffers\n> -\t\t\t * to use to minimise frame drops.\n> -\t\t\t */\n> -\t\t\tnumBuffers = std::max<int>(data->config_.minUnicamBuffers,\n> -\t\t\t\t\t\t   minBuffers - numRawBuffers);\n> -\t\t} else if (stream == &data->isp_[Isp::Input]) {\n> -\t\t\t/*\n> -\t\t\t * ISP input buffers are imported from Unicam, so follow\n> -\t\t\t * similar logic as above to count all the RAW buffers\n> -\t\t\t * available.\n> -\t\t\t */\n> -\t\t\tnumBuffers = numRawBuffers +\n> -\t\t\t\t     std::max<int>(data->config_.minUnicamBuffers,\n> -\t\t\t\t\t\t   minBuffers - numRawBuffers);\n> -\n> -\t\t} else if (stream == &data->unicam_[Unicam::Embedded]) {\n> -\t\t\t/*\n> -\t\t\t * Embedded data buffers are (currently) for internal use,\n> -\t\t\t * so allocate the minimum required to avoid frame drops.\n> -\t\t\t */\n> -\t\t\tnumBuffers = minBuffers;\n> -\t\t} else {\n> -\t\t\t/*\n> -\t\t\t * Since the ISP runs synchronous with the IPA and requests,\n> -\t\t\t * we only ever need one set of internal buffers. Any buffers\n> -\t\t\t * the application wants to hold onto will already be exported\n> -\t\t\t * through PipelineHandlerRPi::exportFrameBuffers().\n> -\t\t\t */\n> -\t\t\tnumBuffers = 1;\n> -\t\t}\n> -\n> -\t\tret = stream->prepareBuffers(numBuffers);\n> -\t\tif (ret < 0)\n> -\t\t\treturn ret;\n> -\t}\n> -\n> -\t/*\n> -\t * Pass the stats and embedded data buffers to the IPA. 
No other\n> -\t * buffers need to be passed.\n> -\t */\n> -\tmapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), RPi::MaskStats);\n> -\tif (data->sensorMetadata_)\n> -\t\tmapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(),\n> -\t\t\t   RPi::MaskEmbeddedData);\n> -\n> -\treturn 0;\n> -}\n> -\n> -void PipelineHandlerRPi::mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask)\n> -{\n> -\tRPiCameraData *data = cameraData(camera);\n> -\tstd::vector<IPABuffer> bufferIds;\n> -\t/*\n> -\t * Link the FrameBuffers with the id (key value) in the map stored in\n> -\t * the RPi stream object - along with an identifier mask.\n> -\t *\n> -\t * This will allow us to identify buffers passed between the pipeline\n> -\t * handler and the IPA.\n> -\t */\n> -\tfor (auto const &it : buffers) {\n> -\t\tbufferIds.push_back(IPABuffer(mask | it.first,\n> -\t\t\t\t\t       it.second->planes()));\n> -\t\tdata->bufferIds_.insert(mask | it.first);\n> -\t}\n> -\n> -\tdata->ipa_->mapBuffers(bufferIds);\n> -}\n> -\n> -void RPiCameraData::freeBuffers()\n> -{\n> -\tif (ipa_) {\n> -\t\t/*\n> -\t\t * Copy the buffer ids from the unordered_set to a vector to\n> -\t\t * pass to the IPA.\n> -\t\t */\n> -\t\tstd::vector<unsigned int> bufferIds(bufferIds_.begin(),\n> -\t\t\t\t\t\t    bufferIds_.end());\n> -\t\tipa_->unmapBuffers(bufferIds);\n> -\t\tbufferIds_.clear();\n> -\t}\n> -\n> -\tfor (auto const stream : streams_)\n> -\t\tstream->releaseBuffers();\n> -\n> -\tbuffersAllocated_ = false;\n> -}\n> -\n> -void RPiCameraData::frameStarted(uint32_t sequence)\n> -{\n> -\tLOG(RPI, Debug) << \"frame start \" << sequence;\n> -\n> -\t/* Write any controls for the next frame as soon as we can. 
*/\n> -\tdelayedCtrls_->applyControls(sequence);\n> -}\n> -\n> -int RPiCameraData::loadIPA(ipa::RPi::InitResult *result)\n> -{\n> -\tipa_ = IPAManager::createIPA<ipa::RPi::IPAProxyRPi>(pipe(), 1, 1);\n> -\n> -\tif (!ipa_)\n> -\t\treturn -ENOENT;\n> -\n> -\tipa_->processStatsComplete.connect(this, &RPiCameraData::processStatsComplete);\n> -\tipa_->prepareIspComplete.connect(this, &RPiCameraData::prepareIspComplete);\n> -\tipa_->setIspControls.connect(this, &RPiCameraData::setIspControls);\n> -\tipa_->setDelayedControls.connect(this, &RPiCameraData::setDelayedControls);\n> -\tipa_->setLensControls.connect(this, &RPiCameraData::setLensControls);\n> -\tipa_->setCameraTimeout.connect(this, &RPiCameraData::setCameraTimeout);\n> -\n> -\t/*\n> -\t * The configuration (tuning file) is made from the sensor name unless\n> -\t * the environment variable overrides it.\n> -\t */\n> -\tstd::string configurationFile;\n> -\tchar const *configFromEnv = utils::secure_getenv(\"LIBCAMERA_RPI_TUNING_FILE\");\n> -\tif (!configFromEnv || *configFromEnv == '\\0') {\n> -\t\tstd::string model = sensor_->model();\n> -\t\tif (isMonoSensor(sensor_))\n> -\t\t\tmodel += \"_mono\";\n> -\t\tconfigurationFile = ipa_->configurationFile(model + \".json\", \"rpi\");\n> -\t} else {\n> -\t\tconfigurationFile = std::string(configFromEnv);\n> -\t}\n> -\n> -\tIPASettings settings(configurationFile, sensor_->model());\n> -\tipa::RPi::InitParams params;\n> -\n> -\tparams.lensPresent = !!sensor_->focusLens();\n> -\treturn ipa_->init(settings, params, result);\n> -}\n> -\n> -int RPiCameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result)\n> -{\n> -\tstd::map<unsigned int, ControlInfoMap> entityControls;\n> -\tipa::RPi::ConfigParams params;\n> -\n> -\t/* \\todo Move passing of ispControls and lensControls to ipa::init() */\n> -\tparams.sensorControls = sensor_->controls();\n> -\tparams.ispControls = isp_[Isp::Input].dev()->controls();\n> -\tif (sensor_->focusLens())\n> 
-\t\tparams.lensControls = sensor_->focusLens()->controls();\n> -\n> -\t/* Always send the user transform to the IPA. */\n> -\tparams.transform = static_cast<unsigned int>(config->transform);\n> -\n> -\t/* Allocate the lens shading table via dmaHeap and pass to the IPA. */\n> -\tif (!lsTable_.isValid()) {\n> -\t\tlsTable_ = SharedFD(dmaHeap_.alloc(\"ls_grid\", ipa::RPi::MaxLsGridSize));\n> -\t\tif (!lsTable_.isValid())\n> -\t\t\treturn -ENOMEM;\n> -\n> -\t\t/* Allow the IPA to mmap the LS table via the file descriptor. */\n> -\t\t/*\n> -\t\t * \\todo Investigate if mapping the lens shading table buffer\n> -\t\t * could be handled with mapBuffers().\n> -\t\t */\n> -\t\tparams.lsTableHandle = lsTable_;\n> -\t}\n> -\n> -\t/* We store the IPACameraSensorInfo for digital zoom calculations. */\n> -\tint ret = sensor_->sensorInfo(&sensorInfo_);\n> -\tif (ret) {\n> -\t\tLOG(RPI, Error) << \"Failed to retrieve camera sensor info\";\n> -\t\treturn ret;\n> -\t}\n> -\n> -\t/* Ready the IPA - it must know about the sensor resolution. 
*/\n> -\tret = ipa_->configure(sensorInfo_, params, result);\n> -\tif (ret < 0) {\n> -\t\tLOG(RPI, Error) << \"IPA configuration failed!\";\n> -\t\treturn -EPIPE;\n> -\t}\n> -\n> -\tif (!result->controls.empty())\n> -\t\tsetSensorControls(result->controls);\n> -\n> -\treturn 0;\n> -}\n> -\n> -int RPiCameraData::loadPipelineConfiguration()\n> -{\n> -\tconfig_ = {\n> -\t\t.minUnicamBuffers = 2,\n> -\t\t.minTotalUnicamBuffers = 4,\n> -\t\t.disableStartupFrameDrops = false,\n> -\t\t.unicamTimeoutValue = 0,\n> -\t};\n> -\n> -\tchar const *configFromEnv = utils::secure_getenv(\"LIBCAMERA_RPI_CONFIG_FILE\");\n> -\tif (!configFromEnv || *configFromEnv == '\\0')\n> -\t\treturn 0;\n> -\n> -\tstd::string filename = std::string(configFromEnv);\n> -\tFile file(filename);\n> -\n> -\tif (!file.open(File::OpenModeFlag::ReadOnly)) {\n> -\t\tLOG(RPI, Error) << \"Failed to open configuration file '\" << filename << \"'\";\n> -\t\treturn -EIO;\n> -\t}\n> -\n> -\tLOG(RPI, Info) << \"Using configuration file '\" << filename << \"'\";\n> -\n> -\tstd::unique_ptr<YamlObject> root = YamlParser::parse(file);\n> -\tif (!root) {\n> -\t\tLOG(RPI, Warning) << \"Failed to parse configuration file, using defaults\";\n> -\t\treturn 0;\n> -\t}\n> -\n> -\tstd::optional<double> ver = (*root)[\"version\"].get<double>();\n> -\tif (!ver || *ver != 1.0) {\n> -\t\tLOG(RPI, Error) << \"Unexpected configuration file version reported\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tconst YamlObject &phConfig = (*root)[\"pipeline_handler\"];\n> -\tconfig_.minUnicamBuffers =\n> -\t\tphConfig[\"min_unicam_buffers\"].get<unsigned int>(config_.minUnicamBuffers);\n> -\tconfig_.minTotalUnicamBuffers =\n> -\t\tphConfig[\"min_total_unicam_buffers\"].get<unsigned int>(config_.minTotalUnicamBuffers);\n> -\tconfig_.disableStartupFrameDrops =\n> -\t\tphConfig[\"disable_startup_frame_drops\"].get<bool>(config_.disableStartupFrameDrops);\n> -\tconfig_.unicamTimeoutValue =\n> 
-\t\tphConfig[\"unicam_timeout_value_ms\"].get<unsigned int>(config_.unicamTimeoutValue);\n> -\n> -\tif (config_.unicamTimeoutValue) {\n> -\t\t/* Disable the IPA signal to control timeout and set the user-requested value. */\n> -\t\tipa_->setCameraTimeout.disconnect();\n> -\t\tunicam_[Unicam::Image].dev()->setDequeueTimeout(config_.unicamTimeoutValue * 1ms);\n> -\t}\n> -\n> -\tif (config_.minTotalUnicamBuffers < config_.minUnicamBuffers) {\n> -\t\tLOG(RPI, Error) << \"Invalid configuration: min_total_unicam_buffers must be >= min_unicam_buffers\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tif (config_.minTotalUnicamBuffers < 1) {\n> -\t\tLOG(RPI, Error) << \"Invalid configuration: min_total_unicam_buffers must be >= 1\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\treturn 0;\n> -}\n> -\n> -/*\n> - * enumerateVideoDevices() iterates over the Media Controller topology, starting\n> - * at the sensor and finishing at Unicam. For each sensor, RPiCameraData stores\n> - * a unique list of any intermediate video mux or bridge devices connected in a\n> - * cascade, together with the entity to entity link.\n> - *\n> - * Entity pad configuration and link enabling happens at the end of configure().\n> - * We first disable all pad links on each entity device in the chain, and then\n> - * selectively enable the specific links to link sensor to Unicam across all\n> - * intermediate muxes and bridges.\n> - *\n> - * In the cascaded topology below, if Sensor1 is used, the Mux2 -> Mux1 link\n> - * will be disabled, and Sensor1 -> Mux1 -> Unicam links enabled. Alternatively,\n> - * if Sensor3 is used, the Sensor2 -> Mux2 and Sensor1 -> Mux1 links are disabled,\n> - * and Sensor3 -> Mux2 -> Mux1 -> Unicam links are enabled. 
All other links will\n> - * remain unchanged.\n> - *\n> - *  +----------+\n> - *  |  Unicam  |\n> - *  +-----^----+\n> - *        |\n> - *    +---+---+\n> - *    |  Mux1 <-------+\n> - *    +--^----+       |\n> - *       |            |\n> - * +-----+---+    +---+---+\n> - * | Sensor1 |    |  Mux2 |<--+\n> - * +---------+    +-^-----+   |\n> - *                  |         |\n> - *          +-------+-+   +---+-----+\n> - *          | Sensor2 |   | Sensor3 |\n> - *          +---------+   +---------+\n> - */\n> -void RPiCameraData::enumerateVideoDevices(MediaLink *link)\n> -{\n> -\tconst MediaPad *sinkPad = link->sink();\n> -\tconst MediaEntity *entity = sinkPad->entity();\n> -\tbool unicamFound = false;\n> -\n> -\t/* We only deal with Video Mux and Bridge devices in cascade. */\n> -\tif (entity->function() != MEDIA_ENT_F_VID_MUX &&\n> -\t    entity->function() != MEDIA_ENT_F_VID_IF_BRIDGE)\n> -\t\treturn;\n> -\n> -\t/* Find the source pad for this Video Mux or Bridge device. */\n> -\tconst MediaPad *sourcePad = nullptr;\n> -\tfor (const MediaPad *pad : entity->pads()) {\n> -\t\tif (pad->flags() & MEDIA_PAD_FL_SOURCE) {\n> -\t\t\t/*\n> -\t\t\t * We can only deal with devices that have a single source\n> -\t\t\t * pad. If this device has multiple source pads, ignore it\n> -\t\t\t * and this branch in the cascade.\n> -\t\t\t */\n> -\t\t\tif (sourcePad)\n> -\t\t\t\treturn;\n> -\n> -\t\t\tsourcePad = pad;\n> -\t\t}\n> -\t}\n> -\n> -\tLOG(RPI, Debug) << \"Found video mux device \" << entity->name()\n> -\t\t\t<< \" linked to sink pad \" << sinkPad->index();\n> -\n> -\tbridgeDevices_.emplace_back(std::make_unique<V4L2Subdevice>(entity), link);\n> -\tbridgeDevices_.back().first->open();\n> -\n> -\t/*\n> -\t * Iterate through all the sink pad links down the cascade to find any\n> -\t * other Video Mux and Bridge devices.\n> -\t */\n> -\tfor (MediaLink *l : sourcePad->links()) {\n> -\t\tenumerateVideoDevices(l);\n> -\t\t/* Once we reach the Unicam entity, we are done. 
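The cascade walk described in the comment above can be illustrated with a standalone sketch. The graph model and entity names below (other than "unicam-image") are invented for illustration; the real code walks MediaLink/MediaPad objects and also checks entity functions and source-pad counts:

```cpp
#include <map>
#include <string>
#include <vector>

/* Toy graph: each entity maps to the entity its source pad feeds. */
using Graph = std::map<std::string, std::string>;

/*
 * Follow the chain from a sensor towards Unicam, collecting intermediate
 * mux/bridge entities. If the walk does not end at "unicam-image", the
 * topology cannot be configured automatically and the list is cleared,
 * mirroring the unicamFound check in enumerateVideoDevices().
 */
std::vector<std::string> enumerateCascade(const Graph &graph,
					  const std::string &sensor)
{
	std::vector<std::string> bridges;
	std::string node = graph.count(sensor) ? graph.at(sensor) : "";

	while (!node.empty() && node != "unicam-image") {
		bridges.push_back(node);
		node = graph.count(node) ? graph.at(node) : "";
	}

	/* Walk must terminate at Unicam to be usable. */
	if (node != "unicam-image")
		bridges.clear();

	return bridges;
}
```

For the Sensor3 case in the diagram, the chain Mux2 -> Mux1 is collected; a cascade that dead-ends returns nothing.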
*/\n> -\t\tif (l->sink()->entity()->name() == \"unicam-image\") {\n> -\t\t\tunicamFound = true;\n> -\t\t\tbreak;\n> -\t\t}\n> -\t}\n> -\n> -\t/* This identifies the end of our entity enumeration recursion. */\n> -\tif (link->source()->entity()->function() == MEDIA_ENT_F_CAM_SENSOR) {\n> -\t\t/*\n> -\t\t * If Unicam is not at the end of this cascade, we cannot configure\n> -\t\t * this topology automatically, so remove all entity references.\n> -\t\t */\n> -\t\tif (!unicamFound) {\n> -\t\t\tLOG(RPI, Warning) << \"Cannot automatically configure this MC topology!\";\n> -\t\t\tbridgeDevices_.clear();\n> -\t\t}\n> -\t}\n> -}\n> -\n> -void RPiCameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers)\n> -{\n> -\tif (!isRunning())\n> -\t\treturn;\n> -\n> -\tFrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID);\n> -\n> -\thandleStreamBuffer(buffer, &isp_[Isp::Stats]);\n> -\n> -\t/* Last thing to do is to fill up the request metadata. */\n> -\tRequest *request = requestQueue_.front();\n> -\tControlList metadata;\n> -\n> -\tipa_->reportMetadata(request->sequence(), &metadata);\n> -\trequest->metadata().merge(metadata);\n> -\n> -\t/*\n> -\t * Inform the sensor of the latest colour gains if it has the\n> -\t * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set).\n> -\t */\n> -\tconst auto &colourGains = metadata.get(libcamera::controls::ColourGains);\n> -\tif (notifyGainsUnity_ && colourGains) {\n> -\t\t/* The control wants linear gains in the order B, Gb, Gr, R. 
*/\n> -\t\tControlList ctrls(sensor_->controls());\n> -\t\tstd::array<int32_t, 4> gains{\n> -\t\t\tstatic_cast<int32_t>((*colourGains)[1] * *notifyGainsUnity_),\n> -\t\t\t*notifyGainsUnity_,\n> -\t\t\t*notifyGainsUnity_,\n> -\t\t\tstatic_cast<int32_t>((*colourGains)[0] * *notifyGainsUnity_)\n> -\t\t};\n> -\t\tctrls.set(V4L2_CID_NOTIFY_GAINS, Span<const int32_t>{ gains });\n> -\n> -\t\tsensor_->setControls(&ctrls);\n> -\t}\n> -\n> -\tstate_ = State::IpaComplete;\n> -\thandleState();\n> -}\n> -\n> -void RPiCameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers)\n> -{\n> -\tunsigned int embeddedId = buffers.embedded & RPi::MaskID;\n> -\tunsigned int bayer = buffers.bayer & RPi::MaskID;\n> -\tFrameBuffer *buffer;\n> -\n> -\tif (!isRunning())\n> -\t\treturn;\n> -\n> -\tbuffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID);\n> -\tLOG(RPI, Debug) << \"Input re-queue to ISP, buffer id \" << (bayer & RPi::MaskID)\n> -\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> -\n> -\tisp_[Isp::Input].queueBuffer(buffer);\n> -\tispOutputCount_ = 0;\n> -\n> -\tif (sensorMetadata_ && embeddedId) {\n> -\t\tbuffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID);\n> -\t\thandleStreamBuffer(buffer, &unicam_[Unicam::Embedded]);\n> -\t}\n> -\n> -\thandleState();\n> -}\n> -\n> -void RPiCameraData::setIspControls(const ControlList &controls)\n> -{\n> -\tControlList ctrls = controls;\n> -\n> -\tif (ctrls.contains(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)) {\n> -\t\tControlValue &value =\n> -\t\t\tconst_cast<ControlValue &>(ctrls.get(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING));\n> -\t\tSpan<uint8_t> s = value.data();\n> -\t\tbcm2835_isp_lens_shading *ls =\n> -\t\t\treinterpret_cast<bcm2835_isp_lens_shading *>(s.data());\n> -\t\tls->dmabuf = lsTable_.get();\n> -\t}\n> -\n> -\tisp_[Isp::Input].dev()->setControls(&ctrls);\n> -\thandleState();\n> -}\n> -\n> -void RPiCameraData::setDelayedControls(const ControlList &controls, uint32_t 
delayContext)\n> -{\n> -\tif (!delayedCtrls_->push(controls, delayContext))\n> -\t\tLOG(RPI, Error) << \"V4L2 DelayedControl set failed\";\n> -\thandleState();\n> -}\n> -\n> -void RPiCameraData::setLensControls(const ControlList &controls)\n> -{\n> -\tCameraLens *lens = sensor_->focusLens();\n> -\n> -\tif (lens && controls.contains(V4L2_CID_FOCUS_ABSOLUTE)) {\n> -\t\tControlValue const &focusValue = controls.get(V4L2_CID_FOCUS_ABSOLUTE);\n> -\t\tlens->setFocusPosition(focusValue.get<int32_t>());\n> -\t}\n> -}\n> -\n> -void RPiCameraData::setCameraTimeout(uint32_t maxFrameLengthMs)\n> -{\n> -\t/*\n> -\t * Set the dequeue timeout to 5x the maximum frame length advertised\n> -\t * by the IPA over recent frames, with a minimum timeout value of 1s.\n> -\t */\n> -\tutils::Duration timeout =\n> -\t\tstd::max<utils::Duration>(1s, 5 * maxFrameLengthMs * 1ms);\n> -\n> -\tLOG(RPI, Debug) << \"Setting Unicam timeout to \" << timeout;\n> -\tunicam_[Unicam::Image].dev()->setDequeueTimeout(timeout);\n> -}\n> -\n> -void RPiCameraData::setSensorControls(ControlList &controls)\n> -{\n> -\t/*\n> -\t * We need to ensure that if both VBLANK and EXPOSURE are present,\n> -\t * VBLANK is written ahead of, and separately from, EXPOSURE to avoid\n> -\t * V4L2 rejecting the latter. 
This is identical to what DelayedControls\n> -\t * does with the priority write flag.\n> -\t *\n> -\t * As a consequence of the below logic, VBLANK gets set twice, and we\n> -\t * rely on the v4l2 framework to not pass the second control set to the\n> -\t * driver as the actual control value has not changed.\n> -\t */\n> -\tif (controls.contains(V4L2_CID_EXPOSURE) && controls.contains(V4L2_CID_VBLANK)) {\n> -\t\tControlList vblank_ctrl;\n> -\n> -\t\tvblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));\n> -\t\tsensor_->setControls(&vblank_ctrl);\n> -\t}\n> -\n> -\tsensor_->setControls(&controls);\n> -}\n> -\n> -void RPiCameraData::unicamTimeout()\n> -{\n> -\tLOG(RPI, Error) << \"Unicam has timed out!\";\n> -\tLOG(RPI, Error) << \"Please check that your camera sensor connector is attached securely.\";\n> -\tLOG(RPI, Error) << \"Alternatively, try another cable and/or sensor.\";\n> -\n> -\tstate_ = RPiCameraData::State::Error;\n> -\t/*\n> -\t * To allow the application to attempt a recovery from this timeout,\n> -\t * stop all devices streaming, and return any outstanding requests as\n> -\t * incomplete and cancelled.\n> -\t */\n> -\tfor (auto const stream : streams_)\n> -\t\tstream->dev()->streamOff();\n> -\n> -\tclearIncompleteRequests();\n> -}\n> -\n> -void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)\n> -{\n> -\tRPi::Stream *stream = nullptr;\n> -\tint index;\n> -\n> -\tif (!isRunning())\n> -\t\treturn;\n> -\n> -\tfor (RPi::Stream &s : unicam_) {\n> -\t\tindex = s.getBufferId(buffer);\n> -\t\tif (index) {\n> -\t\t\tstream = &s;\n> -\t\t\tbreak;\n> -\t\t}\n> -\t}\n> -\n> -\t/* The buffer must belong to one of our streams. 
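The VBLANK/EXPOSURE write-ordering rule in setSensorControls() above can be sketched in isolation. This is a standalone illustration, not libcamera code; the control IDs below are stand-ins for the real V4L2 IDs:

```cpp
#include <cstdint>
#include <map>
#include <vector>

/* Stand-in control IDs (not the real V4L2_CID_* values). */
constexpr uint32_t CID_VBLANK = 1;
constexpr uint32_t CID_EXPOSURE = 2;

/*
 * Split a control set into ordered write batches: when both VBLANK and
 * EXPOSURE are present, VBLANK goes out first in a batch of its own,
 * mirroring what setSensorControls() does before the combined write.
 * VBLANK thus appears twice; the second write carries the same value.
 */
std::vector<std::map<uint32_t, int32_t>>
orderControlWrites(const std::map<uint32_t, int32_t> &ctrls)
{
	std::vector<std::map<uint32_t, int32_t>> batches;

	if (ctrls.count(CID_VBLANK) && ctrls.count(CID_EXPOSURE))
		batches.push_back({ { CID_VBLANK, ctrls.at(CID_VBLANK) } });

	/* The full set follows in a second batch. */
	batches.push_back(ctrls);
	return batches;
}
```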
*/\n> -\tASSERT(stream);\n> -\n> -\tLOG(RPI, Debug) << \"Stream \" << stream->name() << \" buffer dequeue\"\n> -\t\t\t<< \", buffer id \" << index\n> -\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> -\n> -\tif (stream == &unicam_[Unicam::Image]) {\n> -\t\t/*\n> -\t\t * Lookup the sensor controls used for this frame sequence from\n> -\t\t * DelayedControl and queue them along with the frame buffer.\n> -\t\t */\n> -\t\tauto [ctrl, delayContext] = delayedCtrls_->get(buffer->metadata().sequence);\n> -\t\t/*\n> -\t\t * Add the frame timestamp to the ControlList for the IPA to use\n> -\t\t * as it does not receive the FrameBuffer object.\n> -\t\t */\n> -\t\tctrl.set(controls::SensorTimestamp, buffer->metadata().timestamp);\n> -\t\tbayerQueue_.push({ buffer, std::move(ctrl), delayContext });\n> -\t} else {\n> -\t\tembeddedQueue_.push(buffer);\n> -\t}\n> -\n> -\thandleState();\n> -}\n> -\n> -void RPiCameraData::ispInputDequeue(FrameBuffer *buffer)\n> -{\n> -\tif (!isRunning())\n> -\t\treturn;\n> -\n> -\tLOG(RPI, Debug) << \"Stream ISP Input buffer complete\"\n> -\t\t\t<< \", buffer id \" << unicam_[Unicam::Image].getBufferId(buffer)\n> -\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> -\n> -\t/* The ISP input buffer gets re-queued into Unicam. */\n> -\thandleStreamBuffer(buffer, &unicam_[Unicam::Image]);\n> -\thandleState();\n> -}\n> -\n> -void RPiCameraData::ispOutputDequeue(FrameBuffer *buffer)\n> -{\n> -\tRPi::Stream *stream = nullptr;\n> -\tint index;\n> -\n> -\tif (!isRunning())\n> -\t\treturn;\n> -\n> -\tfor (RPi::Stream &s : isp_) {\n> -\t\tindex = s.getBufferId(buffer);\n> -\t\tif (index) {\n> -\t\t\tstream = &s;\n> -\t\t\tbreak;\n> -\t\t}\n> -\t}\n> -\n> -\t/* The buffer must belong to one of our ISP output streams. 
*/\n> -\tASSERT(stream);\n> -\n> -\tLOG(RPI, Debug) << \"Stream \" << stream->name() << \" buffer complete\"\n> -\t\t\t<< \", buffer id \" << index\n> -\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> -\n> -\t/*\n> -\t * ISP statistics buffer must not be re-queued or sent back to the\n> -\t * application until after the IPA signals so.\n> -\t */\n> -\tif (stream == &isp_[Isp::Stats]) {\n> -\t\tipa::RPi::ProcessParams params;\n> -\t\tparams.buffers.stats = index | RPi::MaskStats;\n> -\t\tparams.ipaContext = requestQueue_.front()->sequence();\n> -\t\tipa_->processStats(params);\n> -\t} else {\n> -\t\t/* Any other ISP output can be handed back to the application now. */\n> -\t\thandleStreamBuffer(buffer, stream);\n> -\t}\n> -\n> -\t/*\n> -\t * Increment the number of ISP outputs generated.\n> -\t * This is needed to track dropped frames.\n> -\t */\n> -\tispOutputCount_++;\n> -\n> -\thandleState();\n> -}\n> -\n> -void RPiCameraData::clearIncompleteRequests()\n> -{\n> -\t/*\n> -\t * All outstanding requests (and associated buffers) must be returned\n> -\t * back to the application.\n> -\t */\n> -\twhile (!requestQueue_.empty()) {\n> -\t\tRequest *request = requestQueue_.front();\n> -\n> -\t\tfor (auto &b : request->buffers()) {\n> -\t\t\tFrameBuffer *buffer = b.second;\n> -\t\t\t/*\n> -\t\t\t * Has the buffer already been handed back to the\n> -\t\t\t * request? 
If not, do so now.\n> -\t\t\t */\n> -\t\t\tif (buffer->request()) {\n> -\t\t\t\tbuffer->_d()->cancel();\n> -\t\t\t\tpipe()->completeBuffer(request, buffer);\n> -\t\t\t}\n> -\t\t}\n> -\n> -\t\tpipe()->completeRequest(request);\n> -\t\trequestQueue_.pop_front();\n> -\t}\n> -}\n> -\n> -void RPiCameraData::handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream)\n> -{\n> -\t/*\n> -\t * It is possible to be here without a pending request, so check\n> -\t * that we actually have one to action, otherwise we just return\n> -\t * buffer back to the stream.\n> -\t */\n> -\tRequest *request = requestQueue_.empty() ? nullptr : requestQueue_.front();\n> -\tif (!dropFrameCount_ && request && request->findBuffer(stream) == buffer) {\n> -\t\t/*\n> -\t\t * Check if this is an externally provided buffer, and if\n> -\t\t * so, we must stop tracking it in the pipeline handler.\n> -\t\t */\n> -\t\thandleExternalBuffer(buffer, stream);\n> -\t\t/*\n> -\t\t * Tag the buffer as completed, returning it to the\n> -\t\t * application.\n> -\t\t */\n> -\t\tpipe()->completeBuffer(request, buffer);\n> -\t} else {\n> -\t\t/*\n> -\t\t * This buffer was not part of the Request (which happens if an\n> -\t\t * internal buffer was used for an external stream, or\n> -\t\t * unconditionally for internal streams), or there is no pending\n> -\t\t * request, so we can recycle it.\n> -\t\t */\n> -\t\tstream->returnBuffer(buffer);\n> -\t}\n> -}\n> -\n> -void RPiCameraData::handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream)\n> -{\n> -\tunsigned int id = stream->getBufferId(buffer);\n> -\n> -\tif (!(id & RPi::MaskExternalBuffer))\n> -\t\treturn;\n> -\n> -\t/* Stop the Stream object from tracking the buffer. 
*/\n> -\tstream->removeExternalBuffer(buffer);\n> -}\n> -\n> -void RPiCameraData::handleState()\n> -{\n> -\tswitch (state_) {\n> -\tcase State::Stopped:\n> -\tcase State::Busy:\n> -\tcase State::Error:\n> -\t\tbreak;\n> -\n> -\tcase State::IpaComplete:\n> -\t\t/* If the request is completed, we will switch to Idle state. */\n> -\t\tcheckRequestCompleted();\n> -\t\t/*\n> -\t\t * No break here, we want to try running the pipeline again.\n> -\t\t * The fallthrough clause below suppresses compiler warnings.\n> -\t\t */\n> -\t\t[[fallthrough]];\n> -\n> -\tcase State::Idle:\n> -\t\ttryRunPipeline();\n> -\t\tbreak;\n> -\t}\n> -}\n> -\n> -void RPiCameraData::checkRequestCompleted()\n> -{\n> -\tbool requestCompleted = false;\n> -\t/*\n> -\t * If we are dropping this frame, do not touch the request, simply\n> -\t * change the state to IDLE when ready.\n> -\t */\n> -\tif (!dropFrameCount_) {\n> -\t\tRequest *request = requestQueue_.front();\n> -\t\tif (request->hasPendingBuffers())\n> -\t\t\treturn;\n> -\n> -\t\t/* Must wait for metadata to be filled in before completing. 
*/\n> -\t\tif (state_ != State::IpaComplete)\n> -\t\t\treturn;\n> -\n> -\t\tpipe()->completeRequest(request);\n> -\t\trequestQueue_.pop_front();\n> -\t\trequestCompleted = true;\n> -\t}\n> -\n> -\t/*\n> -\t * Make sure we have three outputs completed in the case of a dropped\n> -\t * frame.\n> -\t */\n> -\tif (state_ == State::IpaComplete &&\n> -\t    ((ispOutputCount_ == 3 && dropFrameCount_) || requestCompleted)) {\n> -\t\tstate_ = State::Idle;\n> -\t\tif (dropFrameCount_) {\n> -\t\t\tdropFrameCount_--;\n> -\t\t\tLOG(RPI, Debug) << \"Dropping frame at the request of the IPA (\"\n> -\t\t\t\t\t<< dropFrameCount_ << \" left)\";\n> -\t\t}\n> -\t}\n> -}\n> -\n> -Rectangle RPiCameraData::scaleIspCrop(const Rectangle &ispCrop) const\n> -{\n> -\t/*\n> -\t * Scale a crop rectangle defined in the ISP's coordinates into native sensor\n> -\t * coordinates.\n> -\t */\n> -\tRectangle nativeCrop = ispCrop.scaledBy(sensorInfo_.analogCrop.size(),\n> -\t\t\t\t\t\tsensorInfo_.outputSize);\n> -\tnativeCrop.translateBy(sensorInfo_.analogCrop.topLeft());\n> -\treturn nativeCrop;\n> -}\n> -\n> -void RPiCameraData::applyScalerCrop(const ControlList &controls)\n> -{\n> -\tconst auto &scalerCrop = controls.get<Rectangle>(controls::ScalerCrop);\n> -\tif (scalerCrop) {\n> -\t\tRectangle nativeCrop = *scalerCrop;\n> -\n> -\t\tif (!nativeCrop.width || !nativeCrop.height)\n> -\t\t\tnativeCrop = { 0, 0, 1, 1 };\n> -\n> -\t\t/* Create a version of the crop scaled to ISP (camera mode) pixels. */\n> -\t\tRectangle ispCrop = nativeCrop.translatedBy(-sensorInfo_.analogCrop.topLeft());\n> -\t\tispCrop.scaleBy(sensorInfo_.outputSize, sensorInfo_.analogCrop.size());\n> -\n> -\t\t/*\n> -\t\t * The crop that we set must be:\n> -\t\t * 1. At least as big as ispMinCropSize_, once that's been\n> -\t\t *    enlarged to the same aspect ratio.\n> -\t\t * 2. With the same mid-point, if possible.\n> -\t\t * 3. 
But it can't go outside the sensor area.\n> -\t\t */\n> -\t\tSize minSize = ispMinCropSize_.expandedToAspectRatio(nativeCrop.size());\n> -\t\tSize size = ispCrop.size().expandedTo(minSize);\n> -\t\tispCrop = size.centeredTo(ispCrop.center()).enclosedIn(Rectangle(sensorInfo_.outputSize));\n> -\n> -\t\tif (ispCrop != ispCrop_) {\n> -\t\t\tisp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &ispCrop);\n> -\t\t\tispCrop_ = ispCrop;\n> -\n> -\t\t\t/*\n> -\t\t\t * Also update the ScalerCrop in the metadata with what we actually\n> -\t\t\t * used. But we must first rescale that from ISP (camera mode) pixels\n> -\t\t\t * back into sensor native pixels.\n> -\t\t\t */\n> -\t\t\tscalerCrop_ = scaleIspCrop(ispCrop_);\n> -\t\t}\n> -\t}\n> -}\n> -\n> -void RPiCameraData::fillRequestMetadata(const ControlList &bufferControls,\n> -\t\t\t\t\tRequest *request)\n> -{\n> -\trequest->metadata().set(controls::SensorTimestamp,\n> -\t\t\t\tbufferControls.get(controls::SensorTimestamp).value_or(0));\n> -\n> -\trequest->metadata().set(controls::ScalerCrop, scalerCrop_);\n> -}\n> -\n> -void RPiCameraData::tryRunPipeline()\n> -{\n> -\tFrameBuffer *embeddedBuffer;\n> -\tBayerFrame bayerFrame;\n> -\n> -\t/* If any of our request or buffer queues are empty, we cannot proceed. */\n> -\tif (state_ != State::Idle || requestQueue_.empty() ||\n> -\t    bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_))\n> -\t\treturn;\n> -\n> -\tif (!findMatchingBuffers(bayerFrame, embeddedBuffer))\n> -\t\treturn;\n> -\n> -\t/* Take the first request from the queue and action the IPA. */\n> -\tRequest *request = requestQueue_.front();\n> -\n> -\t/* See if a new ScalerCrop value needs to be applied. */\n> -\tapplyScalerCrop(request->controls());\n> -\n> -\t/*\n> -\t * Clear the request metadata and fill it with some initial non-IPA\n> -\t * related controls. 
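The ISP-to-native coordinate mapping performed by scaleIspCrop() can be sketched with plain integer rectangles. The Rect/Size types below are simplified stand-ins for libcamera's Rectangle and Size classes:

```cpp
/* Simplified stand-ins for libcamera's Size and Rectangle. */
struct Size {
	unsigned int width, height;
};

struct Rect {
	int x, y;
	unsigned int width, height;
};

/*
 * Map a crop given in ISP (camera mode) pixels back to native sensor
 * pixels, as scaleIspCrop() does: scale by analogCrop.size()/outputSize,
 * then translate by the analog crop offset. Integer arithmetic only.
 */
Rect scaleIspCrop(const Rect &ispCrop, const Rect &analogCrop,
		  const Size &outputSize)
{
	Rect native;
	native.x = ispCrop.x * static_cast<int>(analogCrop.width) /
			   static_cast<int>(outputSize.width) + analogCrop.x;
	native.y = ispCrop.y * static_cast<int>(analogCrop.height) /
			   static_cast<int>(outputSize.height) + analogCrop.y;
	native.width = ispCrop.width * analogCrop.width / outputSize.width;
	native.height = ispCrop.height * analogCrop.height / outputSize.height;
	return native;
}
```

For a hypothetical 2x2-binned mode with an analog crop of 2560x1920 at (16, 10) and a 1280x960 output, the full ISP crop maps back to the full analog crop.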
We clear it first because the request metadata\n> -\t * may have been populated if we have dropped the previous frame.\n> -\t */\n> -\trequest->metadata().clear();\n> -\tfillRequestMetadata(bayerFrame.controls, request);\n> -\n> -\t/* Set our state to say the pipeline is active. */\n> -\tstate_ = State::Busy;\n> -\n> -\tunsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);\n> -\n> -\tLOG(RPI, Debug) << \"Signalling prepareIsp:\"\n> -\t\t\t<< \" Bayer buffer id: \" << bayer;\n> -\n> -\tipa::RPi::PrepareParams params;\n> -\tparams.buffers.bayer = RPi::MaskBayerData | bayer;\n> -\tparams.sensorControls = std::move(bayerFrame.controls);\n> -\tparams.requestControls = request->controls();\n> -\tparams.ipaContext = request->sequence();\n> -\tparams.delayContext = bayerFrame.delayContext;\n> -\n> -\tif (embeddedBuffer) {\n> -\t\tunsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n> -\n> -\t\tparams.buffers.embedded = RPi::MaskEmbeddedData | embeddedId;\n> -\t\tLOG(RPI, Debug) << \"Signalling prepareIsp:\"\n> -\t\t\t\t<< \" Embedded buffer id: \" << embeddedId;\n> -\t}\n> -\n> -\tipa_->prepareIsp(params);\n> -}\n> -\n> -bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer)\n> -{\n> -\tif (bayerQueue_.empty())\n> -\t\treturn false;\n> -\n> -\t/*\n> -\t * Find the embedded data buffer with a matching timestamp to pass to\n> -\t * the IPA. 
Any embedded buffers with a timestamp lower than the\n> -\t * current bayer buffer will be removed and re-queued to the driver.\n> -\t */\n> -\tuint64_t ts = bayerQueue_.front().buffer->metadata().timestamp;\n> -\tembeddedBuffer = nullptr;\n> -\twhile (!embeddedQueue_.empty()) {\n> -\t\tFrameBuffer *b = embeddedQueue_.front();\n> -\t\tif (b->metadata().timestamp < ts) {\n> -\t\t\tembeddedQueue_.pop();\n> -\t\t\tunicam_[Unicam::Embedded].returnBuffer(b);\n> -\t\t\tLOG(RPI, Debug) << \"Dropping unmatched input frame in stream \"\n> -\t\t\t\t\t<< unicam_[Unicam::Embedded].name();\n> -\t\t} else if (b->metadata().timestamp == ts) {\n> -\t\t\t/* Found a match! */\n> -\t\t\tembeddedBuffer = b;\n> -\t\t\tembeddedQueue_.pop();\n> -\t\t\tbreak;\n> -\t\t} else {\n> -\t\t\tbreak; /* Only higher timestamps from here. */\n> -\t\t}\n> -\t}\n> -\n> -\tif (!embeddedBuffer && sensorMetadata_) {\n> -\t\tif (embeddedQueue_.empty()) {\n> -\t\t\t/*\n> -\t\t\t * If the embedded buffer queue is empty, wait for the next\n> -\t\t\t * buffer to arrive - dequeue ordering may send the image\n> -\t\t\t * buffer first.\n> -\t\t\t */\n> -\t\t\tLOG(RPI, Debug) << \"Waiting for next embedded buffer.\";\n> -\t\t\treturn false;\n> -\t\t}\n> -\n> -\t\t/* Log if there is no matching embedded data buffer found. 
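The embedded-buffer matching rule above, reduced to timestamps only (a sketch, not the patch's code; queues of plain integers stand in for the buffer queues):

```cpp
#include <cstdint>
#include <deque>
#include <optional>

/*
 * The matching rule of findMatchingBuffers(): embedded buffers older
 * than the bayer frame are dropped (the real code re-queues them to the
 * driver), an equal timestamp is a match, and a newer timestamp means
 * no match exists for this bayer frame.
 */
std::optional<uint64_t> matchEmbedded(uint64_t bayerTs,
				      std::deque<uint64_t> &embeddedQueue)
{
	while (!embeddedQueue.empty()) {
		uint64_t ts = embeddedQueue.front();
		if (ts < bayerTs) {
			embeddedQueue.pop_front(); /* unmatched, drop */
		} else if (ts == bayerTs) {
			embeddedQueue.pop_front();
			return ts; /* found a match */
		} else {
			break; /* only newer timestamps from here */
		}
	}
	return std::nullopt;
}
```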
*/\n> -\t\tLOG(RPI, Debug) << \"Returning bayer frame without a matching embedded buffer.\";\n> -\t}\n> -\n> -\tbayerFrame = std::move(bayerQueue_.front());\n> -\tbayerQueue_.pop();\n> -\n> -\treturn true;\n> -}\n> -\n> -REGISTER_PIPELINE_HANDLER(PipelineHandlerRPi)\n> -\n> -} /* namespace libcamera */\n> diff --git a/src/libcamera/pipeline/rpi/vc4/vc4.cpp b/src/libcamera/pipeline/rpi/vc4/vc4.cpp\n> new file mode 100644\n> index 000000000000..a085a06376a8\n> --- /dev/null\n> +++ b/src/libcamera/pipeline/rpi/vc4/vc4.cpp\n> @@ -0,0 +1,1001 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> + *\n> + * vc4.cpp - Pipeline handler for VC4 based Raspberry Pi devices\n> + */\n> +\n> +#include <linux/bcm2835-isp.h>\n> +#include <linux/v4l2-controls.h>\n> +#include <linux/videodev2.h>\n> +\n> +#include <libcamera/formats.h>\n> +\n> +#include \"libcamera/internal/device_enumerator.h\"\n> +\n> +#include \"dma_heaps.h\"\n> +#include \"pipeline_base.h\"\n> +#include \"rpi_stream.h\"\n> +\n> +using namespace std::chrono_literals;\n> +\n> +namespace libcamera {\n> +\n> +LOG_DECLARE_CATEGORY(RPI)\n> +\n> +namespace {\n> +\n> +enum class Unicam : unsigned int { Image, Embedded };\n> +enum class Isp : unsigned int { Input, Output0, Output1, Stats };\n> +\n> +} /* namespace */\n> +\n> +class Vc4CameraData final : public RPi::CameraData\n> +{\n> +public:\n> +\tVc4CameraData(PipelineHandler *pipe)\n> +\t\t: RPi::CameraData(pipe)\n> +\t{\n> +\t}\n> +\n> +\t~Vc4CameraData()\n> +\t{\n> +\t\tfreeBuffers();\n> +\t}\n> +\n> +\tV4L2VideoDevice::Formats ispFormats() const override\n> +\t{\n> +\t\treturn isp_[Isp::Output0].dev()->formats();\n> +\t}\n> +\n> +\tV4L2VideoDevice::Formats rawFormats() const override\n> +\t{\n> +\t\treturn unicam_[Unicam::Image].dev()->formats();\n> +\t}\n> +\n> +\tV4L2VideoDevice *frontendDevice() override\n> +\t{\n> +\t\treturn unicam_[Unicam::Image].dev();\n> +\t}\n> +\n> +\tvoid platformFreeBuffers() 
override\n> +\t{\n> +\t}\n> +\n> +\tCameraConfiguration::Status platformValidate(std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t\t\t     std::vector<StreamParams> &outStreams) const override;\n> +\n> +\tint platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) override;\n> +\n> +\tvoid platformStart() override;\n> +\tvoid platformStop() override;\n> +\n> +\tvoid unicamBufferDequeue(FrameBuffer *buffer);\n> +\tvoid ispInputDequeue(FrameBuffer *buffer);\n> +\tvoid ispOutputDequeue(FrameBuffer *buffer);\n> +\n> +\tvoid processStatsComplete(const ipa::RPi::BufferIds &buffers);\n> +\tvoid prepareIspComplete(const ipa::RPi::BufferIds &buffers);\n> +\tvoid setIspControls(const ControlList &controls);\n> +\tvoid setCameraTimeout(uint32_t maxFrameLengthMs);\n> +\n> +\t/* Array of Unicam and ISP device streams and associated buffers/streams. */\n> +\tRPi::Device<Unicam, 2> unicam_;\n> +\tRPi::Device<Isp, 4> isp_;\n> +\n> +\t/* DMAHEAP allocation helper. */\n> +\tRPi::DmaHeap dmaHeap_;\n> +\tSharedFD lsTable_;\n> +\n> +\tstruct Config {\n> +\t\t/*\n> +\t\t * The minimum number of internal buffers to be allocated for\n> +\t\t * the Unicam Image stream.\n> +\t\t */\n> +\t\tunsigned int minUnicamBuffers;\n> +\t\t/*\n> +\t\t * The minimum total (internal + external) buffer count used for\n> +\t\t * the Unicam Image stream.\n> +\t\t *\n> +\t\t * Note that:\n> +\t\t * minTotalUnicamBuffers must be >= 1, and\n> +\t\t * minTotalUnicamBuffers >= minUnicamBuffers\n> +\t\t */\n> +\t\tunsigned int minTotalUnicamBuffers;\n> +\t};\n> +\n> +\tConfig config_;\n> +\n> +private:\n> +\tvoid platformIspCrop() override\n> +\t{\n> +\t\tisp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &ispCrop_);\n> +\t}\n> +\n> +\tint platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> +\t\t\t      std::optional<BayerFormat::Packing> packing,\n> +\t\t\t      std::vector<StreamParams> &rawStreams,\n> +\t\t\t      std::vector<StreamParams> &outStreams) override;\n> +\tint 
platformConfigureIpa(ipa::RPi::ConfigParams &params) override;\n> +\n> +\tint platformInitIpa([[maybe_unused]] ipa::RPi::InitParams &params) override\n> +\t{\n> +\t\treturn 0;\n> +\t}\n> +\n> +\tstruct BayerFrame {\n> +\t\tFrameBuffer *buffer;\n> +\t\tControlList controls;\n> +\t\tunsigned int delayContext;\n> +\t};\n> +\n> +\tvoid tryRunPipeline() override;\n> +\tbool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer);\n> +\n> +\tstd::queue<BayerFrame> bayerQueue_;\n> +\tstd::queue<FrameBuffer *> embeddedQueue_;\n> +};\n> +\n> +class PipelineHandlerVc4 : public RPi::PipelineHandlerBase\n> +{\n> +public:\n> +\tPipelineHandlerVc4(CameraManager *manager)\n> +\t\t: RPi::PipelineHandlerBase(manager)\n> +\t{\n> +\t}\n> +\n> +\t~PipelineHandlerVc4()\n> +\t{\n> +\t}\n> +\n> +\tbool match(DeviceEnumerator *enumerator) override;\n> +\n> +private:\n> +\tVc4CameraData *cameraData(Camera *camera)\n> +\t{\n> +\t\treturn static_cast<Vc4CameraData *>(camera->_d());\n> +\t}\n> +\n> +\tstd::unique_ptr<RPi::CameraData> allocateCameraData() override\n> +\t{\n> +\t\treturn std::make_unique<Vc4CameraData>(this);\n> +\t}\n> +\n> +\tint prepareBuffers(Camera *camera) override;\n> +\tint platformRegister(std::unique_ptr<RPi::CameraData> &cameraData,\n> +\t\t\t     MediaDevice *unicam, MediaDevice *isp) override;\n> +};\n> +\n> +bool PipelineHandlerVc4::match(DeviceEnumerator *enumerator)\n> +{\n> +\tconstexpr unsigned int numUnicamDevices = 2;\n> +\n> +\t/*\n> +\t * Loop over all Unicam instances, but return once a match is found.\n> +\t * This is to ensure we correctly enumerate the camera when an instance\n> +\t * of Unicam has registered with the media controller, but has not\n> +\t * registered device nodes due to a sensor subdevice failure.\n> +\t */\n> +\tfor (unsigned int i = 0; i < numUnicamDevices; i++) {\n> +\t\tDeviceMatch unicam(\"unicam\");\n> +\t\tMediaDevice *unicamDevice = acquireMediaDevice(enumerator, unicam);\n> +\n> +\t\tif (!unicamDevice) {\n> 
+\t\t\tLOG(RPI, Debug) << \"Unable to acquire a Unicam instance\";\n> +\t\t\tbreak;\n> +\t\t}\n> +\n> +\t\tDeviceMatch isp(\"bcm2835-isp\");\n> +\t\tMediaDevice *ispDevice = acquireMediaDevice(enumerator, isp);\n> +\n> +\t\tif (!ispDevice) {\n> +\t\t\tLOG(RPI, Debug) << \"Unable to acquire ISP instance\";\n> +\t\t\tbreak;\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * The loop below is used to register multiple cameras behind one or more\n> +\t\t * video mux devices that are attached to a particular Unicam instance.\n> +\t\t * Obviously these cameras cannot be used simultaneously.\n> +\t\t */\n> +\t\tunsigned int numCameras = 0;\n> +\t\tfor (MediaEntity *entity : unicamDevice->entities()) {\n> +\t\t\tif (entity->function() != MEDIA_ENT_F_CAM_SENSOR)\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tint ret = RPi::PipelineHandlerBase::registerCamera(unicamDevice, \"unicam-image\",\n> +\t\t\t\t\t\t\t\t\t   ispDevice, entity);\n> +\t\t\tif (ret)\n> +\t\t\t\tLOG(RPI, Error) << \"Failed to register camera \"\n> +\t\t\t\t\t\t<< entity->name() << \": \" << ret;\n> +\t\t\telse\n> +\t\t\t\tnumCameras++;\n> +\t\t}\n> +\n> +\t\tif (numCameras)\n> +\t\t\treturn true;\n> +\t}\n> +\n> +\treturn false;\n> +}\n> +\n> +int PipelineHandlerVc4::prepareBuffers(Camera *camera)\n> +{\n> +\tVc4CameraData *data = cameraData(camera);\n> +\tunsigned int numRawBuffers = 0;\n> +\tint ret;\n> +\n> +\tfor (Stream *s : camera->streams()) {\n> +\t\tif (BayerFormat::fromPixelFormat(s->configuration().pixelFormat).isValid()) {\n> +\t\t\tnumRawBuffers = s->configuration().bufferCount;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\t/* Decide how many internal buffers to allocate. 
*/\n> +\tfor (auto const stream : data->streams_) {\n> +\t\tunsigned int numBuffers;\n> +\t\t/*\n> +\t\t * For Unicam, allocate a minimum number of buffers for internal\n> +\t\t * use as we want to avoid any frame drops.\n> +\t\t */\n> +\t\tconst unsigned int minBuffers = data->config_.minTotalUnicamBuffers;\n> +\t\tif (stream == &data->unicam_[Unicam::Image]) {\n> +\t\t\t/*\n> +\t\t\t * If an application has configured a RAW stream, allocate\n> +\t\t\t * additional buffers to make up the minimum, but ensure\n> +\t\t\t * we have at least minUnicamBuffers of internal buffers\n> +\t\t\t * to use to minimise frame drops.\n> +\t\t\t */\n> +\t\t\tnumBuffers = std::max<int>(data->config_.minUnicamBuffers,\n> +\t\t\t\t\t\t   minBuffers - numRawBuffers);\n> +\t\t} else if (stream == &data->isp_[Isp::Input]) {\n> +\t\t\t/*\n> +\t\t\t * ISP input buffers are imported from Unicam, so follow\n> +\t\t\t * similar logic as above to count all the RAW buffers\n> +\t\t\t * available.\n> +\t\t\t */\n> +\t\t\tnumBuffers = numRawBuffers +\n> +\t\t\t\t     std::max<int>(data->config_.minUnicamBuffers,\n> +\t\t\t\t\t\t   minBuffers - numRawBuffers);\n> +\n> +\t\t} else if (stream == &data->unicam_[Unicam::Embedded]) {\n> +\t\t\t/*\n> +\t\t\t * Embedded data buffers are (currently) for internal use,\n> +\t\t\t * so allocate the minimum required to avoid frame drops.\n> +\t\t\t */\n> +\t\t\tnumBuffers = minBuffers;\n> +\t\t} else {\n> +\t\t\t/*\n> +\t\t\t * Since the ISP runs synchronously with the IPA and requests,\n> +\t\t\t * we only ever need one set of internal buffers. Any buffers\n> +\t\t\t * the application wants to hold onto will already be exported\n> +\t\t\t * through PipelineHandlerRPi::exportFrameBuffers().\n> +\t\t\t */\n> +\t\t\tnumBuffers = 1;\n> +\t\t}\n> +\n> +\t\tret = stream->prepareBuffers(numBuffers);\n> +\t\tif (ret < 0)\n> +\t\t\treturn ret;\n> +\t}\n> +\n> +\t/*\n> +\t * Pass the stats and embedded data buffers to the IPA. 
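The Unicam buffer-count arithmetic above works out as a small pure function (a sketch of just the computation; the default config values of minUnicamBuffers = 2 and minTotalUnicamBuffers = 4 come from loadPipelineConfiguration()):

```cpp
#include <algorithm>

/*
 * The Unicam Image buffer count from prepareBuffers(): top up to
 * minTotalUnicamBuffers once the application's exported RAW buffers are
 * counted, but never drop below minUnicamBuffers. The signed arithmetic
 * matters: the subtraction goes negative when the application exports
 * more RAW buffers than the minimum total.
 */
unsigned int unicamBufferCount(unsigned int minUnicamBuffers,
			       unsigned int minTotalUnicamBuffers,
			       unsigned int numRawBuffers)
{
	return std::max<int>(minUnicamBuffers,
			     static_cast<int>(minTotalUnicamBuffers) -
				     static_cast<int>(numRawBuffers));
}
```

With the defaults: no RAW stream gives 4 internal buffers, while an application exporting 3 or more RAW buffers still gets the floor of 2.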
No other\n> +\t * buffers need to be passed.\n> +\t */\n> +\tmapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), RPi::MaskStats);\n> +\tif (data->sensorMetadata_)\n> +\t\tmapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(),\n> +\t\t\t   RPi::MaskEmbeddedData);\n> +\n> +\treturn 0;\n> +}\n> +\n> +int PipelineHandlerVc4::platformRegister(std::unique_ptr<RPi::CameraData> &cameraData, MediaDevice *unicam, MediaDevice *isp)\n> +{\n> +\tVc4CameraData *data = static_cast<Vc4CameraData *>(cameraData.get());\n> +\n> +\tif (!data->dmaHeap_.isValid())\n> +\t\treturn -ENOMEM;\n> +\n> +\tMediaEntity *unicamImage = unicam->getEntityByName(\"unicam-image\");\n> +\tMediaEntity *ispOutput0 = isp->getEntityByName(\"bcm2835-isp0-output0\");\n> +\tMediaEntity *ispCapture1 = isp->getEntityByName(\"bcm2835-isp0-capture1\");\n> +\tMediaEntity *ispCapture2 = isp->getEntityByName(\"bcm2835-isp0-capture2\");\n> +\tMediaEntity *ispCapture3 = isp->getEntityByName(\"bcm2835-isp0-capture3\");\n> +\n> +\tif (!unicamImage || !ispOutput0 || !ispCapture1 || !ispCapture2 || !ispCapture3)\n> +\t\treturn -ENOENT;\n> +\n> +\t/* Locate and open the unicam video streams. */\n> +\tdata->unicam_[Unicam::Image] = RPi::Stream(\"Unicam Image\", unicamImage);\n> +\n> +\t/* An embedded data node will not be present if the sensor does not support it. */\n> +\tMediaEntity *unicamEmbedded = unicam->getEntityByName(\"unicam-embedded\");\n> +\tif (unicamEmbedded) {\n> +\t\tdata->unicam_[Unicam::Embedded] = RPi::Stream(\"Unicam Embedded\", unicamEmbedded);\n> +\t\tdata->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data,\n> +\t\t\t\t\t\t\t\t\t   &Vc4CameraData::unicamBufferDequeue);\n> +\t}\n> +\n> +\t/* Tag the ISP input stream as an import stream. 
*/\n> +\tdata->isp_[Isp::Input] = RPi::Stream(\"ISP Input\", ispOutput0, true);\n> +\tdata->isp_[Isp::Output0] = RPi::Stream(\"ISP Output0\", ispCapture1);\n> +\tdata->isp_[Isp::Output1] = RPi::Stream(\"ISP Output1\", ispCapture2);\n> +\tdata->isp_[Isp::Stats] = RPi::Stream(\"ISP Stats\", ispCapture3);\n> +\n> +\t/* Wire up all the buffer connections. */\n> +\tdata->unicam_[Unicam::Image].dev()->bufferReady.connect(data, &Vc4CameraData::unicamBufferDequeue);\n> +\tdata->isp_[Isp::Input].dev()->bufferReady.connect(data, &Vc4CameraData::ispInputDequeue);\n> +\tdata->isp_[Isp::Output0].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> +\tdata->isp_[Isp::Output1].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> +\tdata->isp_[Isp::Stats].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> +\n> +\tif (data->sensorMetadata_ ^ !!data->unicam_[Unicam::Embedded].dev()) {\n> +\t\tLOG(RPI, Warning) << \"Mismatch between Unicam and CamHelper for embedded data usage!\";\n> +\t\tdata->sensorMetadata_ = false;\n> +\t\tif (data->unicam_[Unicam::Embedded].dev())\n> +\t\t\tdata->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect();\n> +\t}\n> +\n> +\t/*\n> +\t * Open all Unicam and ISP streams. 
The exception is the embedded data\n> +\t * stream, which only gets opened below if the IPA reports that the sensor\n> +\t * supports embedded data.\n> +\t *\n> +\t * The below grouping is just for convenience so that we can easily\n> +\t * iterate over all streams in one go.\n> +\t */\n> +\tdata->streams_.push_back(&data->unicam_[Unicam::Image]);\n> +\tif (data->sensorMetadata_)\n> +\t\tdata->streams_.push_back(&data->unicam_[Unicam::Embedded]);\n> +\n> +\tfor (auto &stream : data->isp_)\n> +\t\tdata->streams_.push_back(&stream);\n> +\n> +\tfor (auto stream : data->streams_) {\n> +\t\tint ret = stream->dev()->open();\n> +\t\tif (ret)\n> +\t\t\treturn ret;\n> +\t}\n> +\n> +\tif (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) {\n> +\t\tLOG(RPI, Error) << \"Unicam driver does not use the MediaController, please update your kernel!\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\t/* Wire up all the IPA connections. */\n> +\tdata->ipa_->processStatsComplete.connect(data, &Vc4CameraData::processStatsComplete);\n> +\tdata->ipa_->prepareIspComplete.connect(data, &Vc4CameraData::prepareIspComplete);\n> +\tdata->ipa_->setIspControls.connect(data, &Vc4CameraData::setIspControls);\n> +\tdata->ipa_->setCameraTimeout.connect(data, &Vc4CameraData::setCameraTimeout);\n> +\n> +\t/*\n> +\t * List the available streams an application may request. At present, we\n> +\t * do not advertise Unicam Embedded and ISP Statistics streams, as there\n> +\t * is no mechanism for the application to request non-image buffer formats.\n> +\t */\n> +\tstd::set<Stream *> streams;\n> +\tstreams.insert(&data->unicam_[Unicam::Image]);\n> +\tstreams.insert(&data->isp_[Isp::Output0]);\n> +\tstreams.insert(&data->isp_[Isp::Output1]);\n> +\n> +\t/* Create and register the camera. 
*/\n> +\tconst std::string &id = data->sensor_->id();\n> +\tstd::shared_ptr<Camera> camera =\n> +\t\tCamera::create(std::move(cameraData), id, streams);\n> +\tPipelineHandler::registerCamera(std::move(camera));\n> +\n> +\tLOG(RPI, Info) << \"Registered camera \" << id\n> +\t\t       << \" to Unicam device \" << unicam->deviceNode()\n> +\t\t       << \" and ISP device \" << isp->deviceNode();\n> +\n> +\treturn 0;\n> +}\n> +\n> +CameraConfiguration::Status Vc4CameraData::platformValidate(std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t\t\t\t    std::vector<StreamParams> &outStreams) const\n> +{\n> +\tCameraConfiguration::Status status = CameraConfiguration::Status::Valid;\n> +\n> +\t/* Can only output 1 RAW stream, or 2 YUV/RGB streams. */\n> +\tif (rawStreams.size() > 1 || outStreams.size() > 2) {\n> +\t\tLOG(RPI, Error) << \"Invalid number of streams requested\";\n> +\t\treturn CameraConfiguration::Status::Invalid;\n> +\t}\n> +\n> +\tif (!rawStreams.empty())\n> +\t\trawStreams[0].dev = unicam_[Unicam::Image].dev();\n> +\n> +\t/*\n> +\t * For the two ISP outputs, one stream must be equal or smaller than the\n> +\t * other in all dimensions.\n> +\t *\n> +\t * Index 0 contains the largest requested resolution.\n> +\t */\n> +\tfor (unsigned int i = 0; i < outStreams.size(); i++) {\n> +\t\tSize size;\n> +\n> +\t\tsize.width = std::min(outStreams[i].cfg->size.width,\n> +\t\t\t\t      outStreams[0].cfg->size.width);\n> +\t\tsize.height = std::min(outStreams[i].cfg->size.height,\n> +\t\t\t\t       outStreams[0].cfg->size.height);\n> +\n> +\t\tif (outStreams[i].cfg->size != size) {\n> +\t\t\toutStreams[i].cfg->size = size;\n> +\t\t\tstatus = CameraConfiguration::Status::Adjusted;\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * Output 0 must be for the largest resolution. We will\n> +\t\t * have that fixed up in the code above.\n> +\t\t */\n> +\t\toutStreams[i].dev = isp_[i == 0 ? 
Isp::Output0 : Isp::Output1].dev();\n> +\t}\n> +\n> +\treturn status;\n> +}\n> +\n> +int Vc4CameraData::platformPipelineConfigure(const std::unique_ptr<YamlObject> &root)\n> +{\n> +\tconfig_ = {\n> +\t\t.minUnicamBuffers = 2,\n> +\t\t.minTotalUnicamBuffers = 4,\n> +\t};\n> +\n> +\tif (!root)\n> +\t\treturn 0;\n> +\n> +\tstd::optional<double> ver = (*root)[\"version\"].get<double>();\n> +\tif (!ver || *ver != 1.0) {\n> +\t\tLOG(RPI, Error) << \"Unexpected configuration file version reported\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tstd::optional<std::string> target = (*root)[\"target\"].get<std::string>();\n> +\tif (!target || *target != \"bcm2835\") {\n> +\t\tLOG(RPI, Error) << \"Unexpected target reported: expected \\\"bcm2835\\\", got \"\n> +\t\t\t\t<< *target;\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tconst YamlObject &phConfig = (*root)[\"pipeline_handler\"];\n> +\tconfig_.minUnicamBuffers =\n> +\t\tphConfig[\"min_unicam_buffers\"].get<unsigned int>(config_.minUnicamBuffers);\n> +\tconfig_.minTotalUnicamBuffers =\n> +\t\tphConfig[\"min_total_unicam_buffers\"].get<unsigned int>(config_.minTotalUnicamBuffers);\n> +\n> +\tif (config_.minTotalUnicamBuffers < config_.minUnicamBuffers) {\n> +\t\tLOG(RPI, Error) << \"Invalid configuration: min_total_unicam_buffers must be >= min_unicam_buffers\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tif (config_.minTotalUnicamBuffers < 1) {\n> +\t\tLOG(RPI, Error) << \"Invalid configuration: min_total_unicam_buffers must be >= 1\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> +\t\t\t\t     std::optional<BayerFormat::Packing> packing,\n> +\t\t\t\t     std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t     std::vector<StreamParams> &outStreams)\n> +{\n> +\tint ret;\n> +\n> +\tif (!packing)\n> +\t\tpacking = BayerFormat::Packing::CSI2;\n> +\n> +\tV4L2VideoDevice *unicam = unicam_[Unicam::Image].dev();\n> +\tV4L2DeviceFormat 
unicamFormat = RPi::PipelineHandlerBase::toV4L2DeviceFormat(unicam, sensorFormat, *packing);\n> +\n> +\tret = unicam->setFormat(&unicamFormat);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/*\n> +\t * See which streams are requested, and route the user\n> +\t * StreamConfiguration appropriately.\n> +\t */\n> +\tif (!rawStreams.empty()) {\n> +\t\trawStreams[0].cfg->setStream(&unicam_[Unicam::Image]);\n> +\t\tunicam_[Unicam::Image].setExternal(true);\n> +\t}\n> +\n> +\tret = isp_[Isp::Input].dev()->setFormat(&unicamFormat);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tLOG(RPI, Info) << \"Sensor: \" << sensor_->id()\n> +\t\t       << \" - Selected sensor format: \" << sensorFormat\n> +\t\t       << \" - Selected unicam format: \" << unicamFormat;\n> +\n> +\t/* Use a sensible small default size if no output streams are configured. */\n> +\tSize maxSize = outStreams.empty() ? Size(320, 240) : outStreams[0].cfg->size;\n> +\tV4L2DeviceFormat format;\n> +\n> +\tfor (unsigned int i = 0; i < outStreams.size(); i++) {\n> +\t\tStreamConfiguration *cfg = outStreams[i].cfg;\n> +\n> +\t\t/* The largest resolution gets routed to the ISP Output 0 node. */\n> +\t\tRPi::Stream *stream = i == 0 ? 
&isp_[Isp::Output0] : &isp_[Isp::Output1];\n> +\n> +\t\tV4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg->pixelFormat);\n> +\t\tformat.size = cfg->size;\n> +\t\tformat.fourcc = fourcc;\n> +\t\tformat.colorSpace = cfg->colorSpace;\n> +\n> +\t\tLOG(RPI, Debug) << \"Setting \" << stream->name() << \" to \"\n> +\t\t\t\t<< format;\n> +\n> +\t\tret = stream->dev()->setFormat(&format);\n> +\t\tif (ret)\n> +\t\t\treturn -EINVAL;\n> +\n> +\t\tif (format.size != cfg->size || format.fourcc != fourcc) {\n> +\t\t\tLOG(RPI, Error)\n> +\t\t\t\t<< \"Failed to set requested format on \" << stream->name()\n> +\t\t\t\t<< \", returned \" << format;\n> +\t\t\treturn -EINVAL;\n> +\t\t}\n> +\n> +\t\tLOG(RPI, Debug)\n> +\t\t\t<< \"Stream \" << stream->name() << \" has color space \"\n> +\t\t\t<< ColorSpace::toString(cfg->colorSpace);\n> +\n> +\t\tcfg->setStream(stream);\n> +\t\tstream->setExternal(true);\n> +\t}\n> +\n> +\tispOutputTotal_ = outStreams.size();\n> +\n> +\t/*\n> +\t * If ISP::Output0 stream has not been configured by the application,\n> +\t * we must allow the hardware to generate an output so that the data\n> +\t * flow in the pipeline handler remains consistent, and we still generate\n> +\t * statistics for the IPA to use. So enable the output at a very low\n> +\t * resolution for internal use.\n> +\t *\n> +\t * \\todo Allow the pipeline to work correctly without Output0 and only\n> +\t * statistics coming from the hardware.\n> +\t */\n> +\tif (outStreams.empty()) {\n> +\t\tV4L2VideoDevice *dev = isp_[Isp::Output0].dev();\n> +\n> +\t\tformat = {};\n> +\t\tformat.size = maxSize;\n> +\t\tformat.fourcc = dev->toV4L2PixelFormat(formats::YUV420);\n> +\t\t/* No one asked for output, so the color space doesn't matter. 
*/\n> +\t\tformat.colorSpace = ColorSpace::Sycc;\n> +\t\tret = dev->setFormat(&format);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error)\n> +\t\t\t\t<< \"Failed to set default format on ISP Output0: \"\n> +\t\t\t\t<< ret;\n> +\t\t\treturn -EINVAL;\n> +\t\t}\n> +\n> +\t\tispOutputTotal_++;\n> +\n> +\t\tLOG(RPI, Debug) << \"Defaulting ISP Output0 format to \"\n> +\t\t\t\t<< format;\n> +\t}\n> +\n> +\t/*\n> +\t * If ISP::Output1 stream has not been requested by the application, we\n> +\t * set it up for internal use now. This second stream will be used for\n> +\t * fast colour denoise, and must be a quarter resolution of the ISP::Output0\n> +\t * stream. However, also limit the maximum size to 1200 pixels in the\n> +\t * larger dimension, just to avoid being wasteful with buffer allocations\n> +\t * and memory bandwidth.\n> +\t *\n> +\t * \\todo If Output 1 format is not YUV420, Output 1 ought to be disabled as\n> +\t * colour denoise will not run.\n> +\t */\n> +\tif (outStreams.size() == 1) {\n> +\t\tV4L2VideoDevice *dev = isp_[Isp::Output1].dev();\n> +\n> +\t\tV4L2DeviceFormat output1Format;\n> +\t\tconstexpr Size maxDimensions(1200, 1200);\n> +\t\tconst Size limit = maxDimensions.boundedToAspectRatio(format.size);\n> +\n> +\t\toutput1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2);\n> +\t\toutput1Format.colorSpace = format.colorSpace;\n> +\t\toutput1Format.fourcc = dev->toV4L2PixelFormat(formats::YUV420);\n> +\n> +\t\tLOG(RPI, Debug) << \"Setting ISP Output1 (internal) to \"\n> +\t\t\t\t<< output1Format;\n> +\n> +\t\tret = dev->setFormat(&output1Format);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to set format on ISP Output1: \"\n> +\t\t\t\t\t<< ret;\n> +\t\t\treturn -EINVAL;\n> +\t\t}\n> +\n> +\t\tispOutputTotal_++;\n> +\t}\n> +\n> +\t/* ISP statistics output format. 
*/\n> +\tformat = {};\n> +\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_BCM2835_ISP_STATS);\n> +\tret = isp_[Isp::Stats].dev()->setFormat(&format);\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Failed to set format on ISP stats stream: \"\n> +\t\t\t\t<< format;\n> +\t\treturn ret;\n> +\t}\n> +\n> +\tispOutputTotal_++;\n> +\n> +\t/*\n> +\t * Configure the Unicam embedded data output format only if the sensor\n> +\t * supports it.\n> +\t */\n> +\tif (sensorMetadata_) {\n> +\t\tV4L2SubdeviceFormat embeddedFormat;\n> +\n> +\t\tsensor_->device()->getFormat(1, &embeddedFormat);\n> +\t\tformat = {};\n> +\t\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> +\t\tformat.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;\n> +\n> +\t\tLOG(RPI, Debug) << \"Setting embedded data format \" << format.toString();\n> +\t\tret = unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to set format on Unicam embedded: \"\n> +\t\t\t\t\t<< format;\n> +\t\t\treturn ret;\n> +\t\t}\n> +\t}\n> +\n> +\t/* Figure out the smallest selection the ISP will allow. */\n> +\tRectangle testCrop(0, 0, 1, 1);\n> +\tisp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop);\n> +\tispMinCropSize_ = testCrop.size();\n> +\n> +\t/* Adjust aspect ratio by providing crops on the input image. */\n> +\tSize size = unicamFormat.size.boundedToAspectRatio(maxSize);\n> +\tispCrop_ = size.centeredTo(Rectangle(unicamFormat.size).center());\n> +\n> +\tplatformIspCrop();\n> +\n> +\treturn 0;\n> +}\n> +\n> +int Vc4CameraData::platformConfigureIpa(ipa::RPi::ConfigParams &params)\n> +{\n> +\tparams.ispControls = isp_[Isp::Input].dev()->controls();\n> +\n> +\t/* Allocate the lens shading table via dmaHeap and pass to the IPA. 
*/\n> +\tif (!lsTable_.isValid()) {\n> +\t\tlsTable_ = SharedFD(dmaHeap_.alloc(\"ls_grid\", ipa::RPi::MaxLsGridSize));\n> +\t\tif (!lsTable_.isValid())\n> +\t\t\treturn -ENOMEM;\n> +\n> +\t\t/* Allow the IPA to mmap the LS table via the file descriptor. */\n> +\t\t/*\n> +\t\t * \\todo Investigate if mapping the lens shading table buffer\n> +\t\t * could be handled with mapBuffers().\n> +\t\t */\n> +\t\tparams.lsTableHandle = lsTable_;\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +void Vc4CameraData::platformStart()\n> +{\n> +}\n> +\n> +void Vc4CameraData::platformStop()\n> +{\n> +\tbayerQueue_ = {};\n> +\tembeddedQueue_ = {};\n> +}\n> +\n> +void Vc4CameraData::unicamBufferDequeue(FrameBuffer *buffer)\n> +{\n> +\tRPi::Stream *stream = nullptr;\n> +\tunsigned int index;\n> +\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tfor (RPi::Stream &s : unicam_) {\n> +\t\tindex = s.getBufferId(buffer);\n> +\t\tif (index) {\n> +\t\t\tstream = &s;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\t/* The buffer must belong to one of our streams. 
*/\n> +\tASSERT(stream);\n> +\n> +\tLOG(RPI, Debug) << \"Stream \" << stream->name() << \" buffer dequeue\"\n> +\t\t\t<< \", buffer id \" << index\n> +\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> +\n> +\tif (stream == &unicam_[Unicam::Image]) {\n> +\t\t/*\n> +\t\t * Lookup the sensor controls used for this frame sequence from\n> +\t\t * DelayedControl and queue them along with the frame buffer.\n> +\t\t */\n> +\t\tauto [ctrl, delayContext] = delayedCtrls_->get(buffer->metadata().sequence);\n> +\t\t/*\n> +\t\t * Add the frame timestamp to the ControlList for the IPA to use\n> +\t\t * as it does not receive the FrameBuffer object.\n> +\t\t */\n> +\t\tctrl.set(controls::SensorTimestamp, buffer->metadata().timestamp);\n> +\t\tbayerQueue_.push({ buffer, std::move(ctrl), delayContext });\n> +\t} else {\n> +\t\tembeddedQueue_.push(buffer);\n> +\t}\n> +\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::ispInputDequeue(FrameBuffer *buffer)\n> +{\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tLOG(RPI, Debug) << \"Stream ISP Input buffer complete\"\n> +\t\t\t<< \", buffer id \" << unicam_[Unicam::Image].getBufferId(buffer)\n> +\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> +\n> +\t/* The ISP input buffer gets re-queued into Unicam. */\n> +\thandleStreamBuffer(buffer, &unicam_[Unicam::Image]);\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::ispOutputDequeue(FrameBuffer *buffer)\n> +{\n> +\tRPi::Stream *stream = nullptr;\n> +\tunsigned int index;\n> +\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tfor (RPi::Stream &s : isp_) {\n> +\t\tindex = s.getBufferId(buffer);\n> +\t\tif (index) {\n> +\t\t\tstream = &s;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\t/* The buffer must belong to one of our ISP output streams. 
*/\n> +\tASSERT(stream);\n> +\n> +\tLOG(RPI, Debug) << \"Stream \" << stream->name() << \" buffer complete\"\n> +\t\t\t<< \", buffer id \" << index\n> +\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> +\n> +\t/*\n> +\t * ISP statistics buffer must not be re-queued or sent back to the\n> +\t * application until after the IPA signals so.\n> +\t */\n> +\tif (stream == &isp_[Isp::Stats]) {\n> +\t\tipa::RPi::ProcessParams params;\n> +\t\tparams.buffers.stats = index | RPi::MaskStats;\n> +\t\tparams.ipaContext = requestQueue_.front()->sequence();\n> +\t\tipa_->processStats(params);\n> +\t} else {\n> +\t\t/* Any other ISP output can be handed back to the application now. */\n> +\t\thandleStreamBuffer(buffer, stream);\n> +\t}\n> +\n> +\t/*\n> +\t * Increment the number of ISP outputs generated.\n> +\t * This is needed to track dropped frames.\n> +\t */\n> +\tispOutputCount_++;\n> +\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers)\n> +{\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tFrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID);\n> +\n> +\thandleStreamBuffer(buffer, &isp_[Isp::Stats]);\n> +\n> +\t/* Last thing to do is to fill up the request metadata. */\n> +\tRequest *request = requestQueue_.front();\n> +\tControlList metadata(controls::controls);\n> +\n> +\tipa_->reportMetadata(request->sequence(), &metadata);\n> +\trequest->metadata().merge(metadata);\n> +\n> +\t/*\n> +\t * Inform the sensor of the latest colour gains if it has the\n> +\t * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set).\n> +\t */\n> +\tconst auto &colourGains = metadata.get(libcamera::controls::ColourGains);\n> +\tif (notifyGainsUnity_ && colourGains) {\n> +\t\t/* The control wants linear gains in the order B, Gb, Gr, R. 
*/\n> +\t\tControlList ctrls(sensor_->controls());\n> +\t\tstd::array<int32_t, 4> gains{\n> +\t\t\tstatic_cast<int32_t>((*colourGains)[1] * *notifyGainsUnity_),\n> +\t\t\t*notifyGainsUnity_,\n> +\t\t\t*notifyGainsUnity_,\n> +\t\t\tstatic_cast<int32_t>((*colourGains)[0] * *notifyGainsUnity_)\n> +\t\t};\n> +\t\tctrls.set(V4L2_CID_NOTIFY_GAINS, Span<const int32_t>{ gains });\n> +\n> +\t\tsensor_->setControls(&ctrls);\n> +\t}\n> +\n> +\tstate_ = State::IpaComplete;\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers)\n> +{\n> +\tunsigned int embeddedId = buffers.embedded & RPi::MaskID;\n> +\tunsigned int bayer = buffers.bayer & RPi::MaskID;\n> +\tFrameBuffer *buffer;\n> +\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tbuffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID);\n> +\tLOG(RPI, Debug) << \"Input re-queue to ISP, buffer id \" << (bayer & RPi::MaskID)\n> +\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> +\n> +\tisp_[Isp::Input].queueBuffer(buffer);\n> +\tispOutputCount_ = 0;\n> +\n> +\tif (sensorMetadata_ && embeddedId) {\n> +\t\tbuffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID);\n> +\t\thandleStreamBuffer(buffer, &unicam_[Unicam::Embedded]);\n> +\t}\n> +\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::setIspControls(const ControlList &controls)\n> +{\n> +\tControlList ctrls = controls;\n> +\n> +\tif (ctrls.contains(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)) {\n> +\t\tControlValue &value =\n> +\t\t\tconst_cast<ControlValue &>(ctrls.get(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING));\n> +\t\tSpan<uint8_t> s = value.data();\n> +\t\tbcm2835_isp_lens_shading *ls =\n> +\t\t\treinterpret_cast<bcm2835_isp_lens_shading *>(s.data());\n> +\t\tls->dmabuf = lsTable_.get();\n> +\t}\n> +\n> +\tisp_[Isp::Input].dev()->setControls(&ctrls);\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::setCameraTimeout(uint32_t maxFrameLengthMs)\n> +{\n> +\t/*\n> +\t * 
Set the dequeue timeout to 5x the maximum frame length reported by\n> +\t * the IPA over recent frames, with a minimum timeout value of 1s.\n> +\t */\n> +\tutils::Duration timeout =\n> +\t\tstd::max<utils::Duration>(1s, 5 * maxFrameLengthMs * 1ms);\n> +\n> +\tLOG(RPI, Debug) << \"Setting Unicam timeout to \" << timeout;\n> +\tunicam_[Unicam::Image].dev()->setDequeueTimeout(timeout);\n> +}\n> +\n> +void Vc4CameraData::tryRunPipeline()\n> +{\n> +\tFrameBuffer *embeddedBuffer;\n> +\tBayerFrame bayerFrame;\n> +\n> +\t/* If any of our request or buffer queues are empty, we cannot proceed. */\n> +\tif (state_ != State::Idle || requestQueue_.empty() ||\n> +\t    bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_))\n> +\t\treturn;\n> +\n> +\tif (!findMatchingBuffers(bayerFrame, embeddedBuffer))\n> +\t\treturn;\n> +\n> +\t/* Take the first request from the queue and action the IPA. */\n> +\tRequest *request = requestQueue_.front();\n> +\n> +\t/* See if a new ScalerCrop value needs to be applied. */\n> +\tcalculateScalerCrop(request->controls());\n> +\n> +\t/*\n> +\t * Clear the request metadata and fill it with some initial non-IPA\n> +\t * related controls. We clear it first because the request metadata\n> +\t * may have been populated if we have dropped the previous frame.\n> +\t */\n> +\trequest->metadata().clear();\n> +\tfillRequestMetadata(bayerFrame.controls, request);\n> +\n> +\t/* Set our state to say the pipeline is active. 
*/\n> +\tstate_ = State::Busy;\n> +\n> +\tunsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);\n> +\n> +\tLOG(RPI, Debug) << \"Signalling prepareIsp:\"\n> +\t\t\t<< \" Bayer buffer id: \" << bayer;\n> +\n> +\tipa::RPi::PrepareParams params;\n> +\tparams.buffers.bayer = RPi::MaskBayerData | bayer;\n> +\tparams.sensorControls = std::move(bayerFrame.controls);\n> +\tparams.requestControls = request->controls();\n> +\tparams.ipaContext = request->sequence();\n> +\tparams.delayContext = bayerFrame.delayContext;\n> +\n> +\tif (embeddedBuffer) {\n> +\t\tunsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n> +\n> +\t\tparams.buffers.embedded = RPi::MaskEmbeddedData | embeddedId;\n> +\t\tLOG(RPI, Debug) << \"Signalling prepareIsp:\"\n> +\t\t\t\t<< \" Embedded buffer id: \" << embeddedId;\n> +\t}\n> +\n> +\tipa_->prepareIsp(params);\n> +}\n> +\n> +bool Vc4CameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer)\n> +{\n> +\tif (bayerQueue_.empty())\n> +\t\treturn false;\n> +\n> +\t/*\n> +\t * Find the embedded data buffer with a matching timestamp to pass to\n> +\t * the IPA. Any embedded buffers with a timestamp lower than the\n> +\t * current bayer buffer will be removed and re-queued to the driver.\n> +\t */\n> +\tuint64_t ts = bayerQueue_.front().buffer->metadata().timestamp;\n> +\tembeddedBuffer = nullptr;\n> +\twhile (!embeddedQueue_.empty()) {\n> +\t\tFrameBuffer *b = embeddedQueue_.front();\n> +\t\tif (b->metadata().timestamp < ts) {\n> +\t\t\tembeddedQueue_.pop();\n> +\t\t\tunicam_[Unicam::Embedded].returnBuffer(b);\n> +\t\t\tLOG(RPI, Debug) << \"Dropping unmatched input frame in stream \"\n> +\t\t\t\t\t<< unicam_[Unicam::Embedded].name();\n> +\t\t} else if (b->metadata().timestamp == ts) {\n> +\t\t\t/* Found a match! */\n> +\t\t\tembeddedBuffer = b;\n> +\t\t\tembeddedQueue_.pop();\n> +\t\t\tbreak;\n> +\t\t} else {\n> +\t\t\tbreak; /* Only higher timestamps from here. 
*/\n> +\t\t}\n> +\t}\n> +\n> +\tif (!embeddedBuffer && sensorMetadata_) {\n> +\t\tif (embeddedQueue_.empty()) {\n> +\t\t\t/*\n> +\t\t\t * If the embedded buffer queue is empty, wait for the next\n> +\t\t\t * buffer to arrive - dequeue ordering may send the image\n> +\t\t\t * buffer first.\n> +\t\t\t */\n> +\t\t\tLOG(RPI, Debug) << \"Waiting for next embedded buffer.\";\n> +\t\t\treturn false;\n> +\t\t}\n> +\n> +\t\t/* Log if there is no matching embedded data buffer found. */\n> +\t\tLOG(RPI, Debug) << \"Returning bayer frame without a matching embedded buffer.\";\n> +\t}\n> +\n> +\tbayerFrame = std::move(bayerQueue_.front());\n> +\tbayerQueue_.pop();\n> +\n> +\treturn true;\n> +}\n> +\n> +REGISTER_PIPELINE_HANDLER(PipelineHandlerVc4)\n> +\n> +} /* namespace libcamera */\n> --\n> 2.34.1\n>","headers":{"Date":"Thu, 27 Apr 2023 12:58:27 +0200","To":"Naushir Patuck <naush@raspberrypi.com>","Message-ID":"<gn6jv3ymkzfdp3s5ukbdf2toon22cocctz6rmz5rxi6qo2dr77@fmajlub3whx6>","References":"<20230426131057.21550-1-naush@raspberrypi.com>\n\t<20230426131057.21550-12-naush@raspberrypi.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<20230426131057.21550-12-naush@raspberrypi.com>","Subject":"Re: [libcamera-devel] [PATCH 11/13] pipeline: raspberrypi:\n\tIntroduce PipelineHandlerBase 
class","From":"Jacopo Mondi via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org"}},{"id":26978,"web_url":"https://patchwork.libcamera.org/comment/26978/","msgid":"<20230427164014.GJ28489@pendragon.ideasonboard.com>","date":"2023-04-27T16:40:14","subject":"Re: [libcamera-devel] [PATCH 11/13] pipeline: raspberrypi:\n\tIntroduce PipelineHandlerBase class","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Naush,\n\nThank you for the patch.\n\nOn Wed, Apr 26, 2023 at 02:10:55PM +0100, Naushir Patuck via libcamera-devel wrote:\n> Create a new PipelineHandlerBase class that handles general purpose\n> housekeeping duties for the Raspberry Pi pipeline handler. Code the\n> implementation of new class is essentially pulled from the existing\n> pipeline/rpi/vc4/raspberrypi.cpp file with a small amount of\n> refactoring.\n> \n> Create a derived PipelineHandlerVc4 class from PipelineHandlerBase that\n> handles the VC4 pipeline specific tasks of the pipeline handler. 
Again,\n> code for this class implementation is taken from the existing\n> pipeline/rpi/vc4/raspberrypi.cpp with a small amount of\n> refactoring.\n> \n> The goal of this change is to allow third parties to implement their own\n> pipeline handlers running on the Raspberry Pi without duplicating all of\n> the pipeline handler housekeeping tasks.\n> \n> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>\n> ---\n>  src/ipa/rpi/vc4/vc4.cpp                       |    2 +-\n>  src/libcamera/pipeline/rpi/common/meson.build |    1 +\n>  .../pipeline/rpi/common/pipeline_base.cpp     | 1447 ++++++++++\n>  .../pipeline/rpi/common/pipeline_base.h       |  276 ++\n>  .../pipeline/rpi/vc4/data/example.yaml        |    4 +-\n>  src/libcamera/pipeline/rpi/vc4/meson.build    |    2 +-\n>  .../pipeline/rpi/vc4/raspberrypi.cpp          | 2428 -----------------\n>  src/libcamera/pipeline/rpi/vc4/vc4.cpp        | 1001 +++++++\n>  8 files changed, 2729 insertions(+), 2432 deletions(-)\n>  create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n>  create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.h\n>  delete mode 100644 src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp\n>  create mode 100644 src/libcamera/pipeline/rpi/vc4/vc4.cpp\n> \n> diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp\n> index 0e022c2aeed3..f3d83b2afadf 100644\n> --- a/src/ipa/rpi/vc4/vc4.cpp\n> +++ b/src/ipa/rpi/vc4/vc4.cpp\n> @@ -538,7 +538,7 @@ extern \"C\" {\n>  const struct IPAModuleInfo ipaModuleInfo = {\n>  \tIPA_MODULE_API_VERSION,\n>  \t1,\n> -\t\"PipelineHandlerRPi\",\n> +\t\"PipelineHandlerVc4\",\n>  \t\"vc4\",\n>  };\n>  \n> diff --git a/src/libcamera/pipeline/rpi/common/meson.build b/src/libcamera/pipeline/rpi/common/meson.build\n> index 1dec6d3d028b..f8ea790b42a1 100644\n> --- a/src/libcamera/pipeline/rpi/common/meson.build\n> +++ b/src/libcamera/pipeline/rpi/common/meson.build\n> @@ -2,6 +2,7 @@\n>  \n>  libcamera_sources += files([\n>      
'delayed_controls.cpp',\n> +    'pipeline_base.cpp',\n>      'rpi_stream.cpp',\n>  ])\n>  \n> diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n> new file mode 100644\n> index 000000000000..012766b38c32\n> --- /dev/null\n> +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n> @@ -0,0 +1,1447 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> + *\n> + * pipeline_base.h - Pipeline handler base class for Raspberry Pi devices\n\npipeline_base.cpp\n\nMaybe we should drop the file names...\n\n> + */\n> +\n> +#include \"pipeline_base.h\"\n> +\n> +#include <chrono>\n> +\n> +#include <linux/media-bus-format.h>\n> +#include <linux/videodev2.h>\n> +\n> +#include <libcamera/base/file.h>\n> +#include <libcamera/base/utils.h>\n> +\n> +#include <libcamera/formats.h>\n> +#include <libcamera/logging.h>\n> +#include <libcamera/property_ids.h>\n> +\n> +#include \"libcamera/internal/camera_lens.h\"\n> +#include \"libcamera/internal/ipa_manager.h\"\n> +#include \"libcamera/internal/v4l2_subdevice.h\"\n> +\n> +using namespace std::chrono_literals;\n> +\n> +namespace libcamera {\n> +\n> +using namespace RPi;\n> +\n> +LOG_DEFINE_CATEGORY(RPI)\n> +\n> +namespace {\n> +\n> +constexpr unsigned int defaultRawBitDepth = 12;\n> +\n> +bool isRaw(const PixelFormat &pixFmt)\n> +{\n> +\t/* This test works for both Bayer and raw mono formats. */\n> +\treturn BayerFormat::fromPixelFormat(pixFmt).isValid();\n> +}\n> +\n> +PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code,\n> +\t\t\t\t  BayerFormat::Packing packingReq)\n> +{\n> +\tBayerFormat bayer = BayerFormat::fromMbusCode(mbus_code);\n> +\n> +\tASSERT(bayer.isValid());\n> +\n> +\tbayer.packing = packingReq;\n> +\tPixelFormat pix = bayer.toPixelFormat();\n> +\n> +\t/*\n> +\t * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed\n> +\t * variants. 
So if the PixelFormat returns as invalid, use the non-packed\n> +\t * conversion instead.\n> +\t */\n> +\tif (!pix.isValid()) {\n> +\t\tbayer.packing = BayerFormat::Packing::None;\n> +\t\tpix = bayer.toPixelFormat();\n> +\t}\n> +\n> +\treturn pix;\n> +}\n> +\n> +SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)\n> +{\n> +\tSensorFormats formats;\n> +\n> +\tfor (auto const mbusCode : sensor->mbusCodes())\n> +\t\tformats.emplace(mbusCode, sensor->sizes(mbusCode));\n> +\n> +\treturn formats;\n> +}\n> +\n> +bool isMonoSensor(std::unique_ptr<CameraSensor> &sensor)\n> +{\n> +\tunsigned int mbusCode = sensor->mbusCodes()[0];\n> +\tconst BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode);\n> +\n> +\treturn bayer.order == BayerFormat::Order::MONO;\n> +}\n> +\n> +double scoreFormat(double desired, double actual)\n> +{\n> +\tdouble score = desired - actual;\n> +\t/* Smaller desired dimensions are preferred. */\n> +\tif (score < 0.0)\n> +\t\tscore = (-score) / 8;\n> +\t/* Penalise non-exact matches. */\n> +\tif (actual != desired)\n> +\t\tscore *= 2;\n> +\n> +\treturn score;\n> +}\n> +\n> +V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth)\n> +{\n> +\tdouble bestScore = std::numeric_limits<double>::max(), score;\n> +\tV4L2SubdeviceFormat bestFormat;\n> +\tbestFormat.colorSpace = ColorSpace::Raw;\n> +\n> +\tconstexpr float penaltyAr = 1500.0;\n> +\tconstexpr float penaltyBitDepth = 500.0;\n> +\n> +\t/* Calculate the closest/best mode from the user requested size. 
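[Reviewer aside: the mode-scoring heuristic quoted here can be restated standalone. This is a minimal sketch with my own names (`scoreDimension`, `scoreMode`), not the patch's API: smaller-than-requested dimensions cost only an eighth of the difference, any non-exact match doubles the cost, and aspect-ratio and bit-depth mismatches carry the fixed weights from the patch.]

```cpp
#include <cstdlib>

/*
 * Illustrative restatement of findBestFormat()'s scoring. Lower is better;
 * the patch's loop uses `score <= bestScore`, so ties keep the later mode.
 */
double scoreDimension(double desired, double actual)
{
	double score = desired - actual;
	if (score < 0.0)
		score = -score / 8;	/* smaller than requested is cheap... */
	if (actual != desired)
		score *= 2;		/* ...but never free */
	return score;
}

double scoreMode(unsigned reqW, unsigned reqH, unsigned bitDepth,
		 unsigned w, unsigned h, unsigned bpp)
{
	constexpr double penaltyAr = 1500.0;
	constexpr double penaltyBitDepth = 500.0;
	double reqAr = double(reqW) / reqH;
	double fmtAr = double(w) / h;

	return scoreDimension(reqW, w) + scoreDimension(reqH, h) +
	       penaltyAr * scoreDimension(reqAr, fmtAr) +
	       penaltyBitDepth * std::abs(int(bpp) - int(bitDepth));
}
```

An exact size and bit-depth match scores zero; a same-size 10-bit mode against a 12-bit request costs exactly the bit-depth penalty times two.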
*/\n> +\tfor (const auto &iter : formatsMap) {\n> +\t\tconst unsigned int mbusCode = iter.first;\n> +\t\tconst PixelFormat format = mbusCodeToPixelFormat(mbusCode,\n> +\t\t\t\t\t\t\t\t BayerFormat::Packing::None);\n> +\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(format);\n> +\n> +\t\tfor (const Size &size : iter.second) {\n> +\t\t\tdouble reqAr = static_cast<double>(req.width) / req.height;\n> +\t\t\tdouble fmtAr = static_cast<double>(size.width) / size.height;\n> +\n> +\t\t\t/* Score the dimensions for closeness. */\n> +\t\t\tscore = scoreFormat(req.width, size.width);\n> +\t\t\tscore += scoreFormat(req.height, size.height);\n> +\t\t\tscore += penaltyAr * scoreFormat(reqAr, fmtAr);\n> +\n> +\t\t\t/* Add any penalties... this is not an exact science! */\n> +\t\t\tscore += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth;\n> +\n> +\t\t\tif (score <= bestScore) {\n> +\t\t\t\tbestScore = score;\n> +\t\t\t\tbestFormat.mbus_code = mbusCode;\n> +\t\t\t\tbestFormat.size = size;\n> +\t\t\t}\n> +\n> +\t\t\tLOG(RPI, Debug) << \"Format: \" << size\n> +\t\t\t\t\t<< \" fmt \" << format\n> +\t\t\t\t\t<< \" Score: \" << score\n> +\t\t\t\t\t<< \" (best \" << bestScore << \")\";\n> +\t\t}\n> +\t}\n> +\n> +\treturn bestFormat;\n> +}\n> +\n> +const std::vector<ColorSpace> validColorSpaces = {\n> +\tColorSpace::Sycc,\n> +\tColorSpace::Smpte170m,\n> +\tColorSpace::Rec709\n> +};\n> +\n> +std::optional<ColorSpace> findValidColorSpace(const ColorSpace &colourSpace)\n> +{\n> +\tfor (auto cs : validColorSpaces) {\n> +\t\tif (colourSpace.primaries == cs.primaries &&\n> +\t\t    colourSpace.transferFunction == cs.transferFunction)\n> +\t\t\treturn cs;\n> +\t}\n> +\n> +\treturn std::nullopt;\n> +}\n> +\n> +bool isRgb(const PixelFormat &pixFmt)\n> +{\n> +\tconst PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);\n> +\treturn info.colourEncoding == PixelFormatInfo::ColourEncodingRGB;\n> +}\n> +\n> +bool isYuv(const PixelFormat &pixFmt)\n> +{\n> +\t/* The code below 
would return true for raw mono streams, so weed those out first. */\n> +\tif (isRaw(pixFmt))\n> +\t\treturn false;\n> +\n> +\tconst PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);\n> +\treturn info.colourEncoding == PixelFormatInfo::ColourEncodingYUV;\n> +}\n> +\n> +} /* namespace */\n> +\n> +/*\n> + * Raspberry Pi drivers expect the following colour spaces:\n> + * - V4L2_COLORSPACE_RAW for raw streams.\n> + * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for\n> + *   non-raw streams. Other fields such as transfer function, YCbCr encoding and\n> + *   quantisation are not used.\n> + *\n> + * The libcamera colour spaces that we wish to use corresponding to these are therefore:\n> + * - ColorSpace::Raw for V4L2_COLORSPACE_RAW\n> + * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG\n> + * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M\n> + * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709\n> + */\n> +CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags)\n> +{\n> +\tStatus status = Valid;\n> +\tyuvColorSpace_.reset();\n> +\n> +\tfor (auto cfg : config_) {\n> +\t\t/* First fix up raw streams to have the \"raw\" colour space. */\n> +\t\tif (isRaw(cfg.pixelFormat)) {\n> +\t\t\t/* If there was no value here, that doesn't count as \"adjusted\". */\n> +\t\t\tif (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw)\n> +\t\t\t\tstatus = Adjusted;\n> +\t\t\tcfg.colorSpace = ColorSpace::Raw;\n> +\t\t\tcontinue;\n> +\t\t}\n> +\n> +\t\t/* Next we need to find our shared colour space. The first valid one will do. */\n> +\t\tif (cfg.colorSpace && !yuvColorSpace_)\n> +\t\t\tyuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value());\n> +\t}\n> +\n> +\t/* If no colour space was given anywhere, choose sYCC. */\n> +\tif (!yuvColorSpace_)\n> +\t\tyuvColorSpace_ = ColorSpace::Sycc;\n> +\n> +\t/* Note the version of this that any RGB streams will have to use. 
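[Reviewer aside: the shared-colour-space rule above reduces to "first valid request wins, sYCC otherwise". A toy model, using plain strings instead of libcamera's ColorSpace (and matching by name rather than by primaries/transfer function as the real findValidColorSpace() does):]

```cpp
#include <optional>
#include <string>
#include <vector>

/* Toy model of validateColorSpaces()'s shared colour-space selection. */
std::string pickSharedColorSpace(
	const std::vector<std::optional<std::string>> &requests)
{
	static const std::vector<std::string> valid = {
		"Sycc", "Smpte170m", "Rec709"
	};

	for (const auto &req : requests) {
		if (!req)
			continue;	/* no request doesn't count */
		for (const auto &v : valid)
			if (*req == v)
				return v;	/* first valid request wins */
	}
	return "Sycc";			/* default when none was given */
}
```

Every non-raw stream is then coerced to this choice (YUV streams directly, RGB streams to the full-range variant), which is why only the first valid request matters.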
*/\n> +\trgbColorSpace_ = yuvColorSpace_;\n> +\trgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None;\n> +\trgbColorSpace_->range = ColorSpace::Range::Full;\n> +\n> +\t/* Go through the streams again and force everyone to the same colour space. */\n> +\tfor (auto cfg : config_) {\n> +\t\tif (cfg.colorSpace == ColorSpace::Raw)\n> +\t\t\tcontinue;\n> +\n> +\t\tif (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) {\n> +\t\t\t/* Again, no value means \"not adjusted\". */\n> +\t\t\tif (cfg.colorSpace)\n> +\t\t\t\tstatus = Adjusted;\n> +\t\t\tcfg.colorSpace = yuvColorSpace_;\n> +\t\t}\n> +\t\tif (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) {\n> +\t\t\t/* Be nice, and let the YUV version count as non-adjusted too. */\n> +\t\t\tif (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_)\n> +\t\t\t\tstatus = Adjusted;\n> +\t\t\tcfg.colorSpace = rgbColorSpace_;\n> +\t\t}\n> +\t}\n> +\n> +\treturn status;\n> +}\n> +\n> +CameraConfiguration::Status RPiCameraConfiguration::validate()\n> +{\n> +\tStatus status = Valid;\n> +\n> +\tif (config_.empty())\n> +\t\treturn Invalid;\n> +\n> +\tstatus = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);\n> +\n> +\t/*\n> +\t * Validate the requested transform against the sensor capabilities and\n> +\t * rotation and store the final combined transform that configure() will\n> +\t * need to apply to the sensor to save us working it out again.\n> +\t */\n> +\tTransform requestedTransform = transform;\n> +\tcombinedTransform_ = data_->sensor_->validateTransform(&transform);\n> +\tif (transform != requestedTransform)\n> +\t\tstatus = Adjusted;\n> +\n> +\tstd::vector<CameraData::StreamParams> rawStreams, outStreams;\n> +\tfor (const auto &[index, cfg] : utils::enumerate(config_)) {\n> +\t\tif (isRaw(cfg.pixelFormat))\n> +\t\t\trawStreams.emplace_back(index, &cfg);\n> +\t\telse\n> +\t\t\toutStreams.emplace_back(index, &cfg);\n> +\t}\n> +\n> +\t/* Sort the streams so the highest resolution is first. 
*/\n> +\tstd::sort(rawStreams.begin(), rawStreams.end(),\n> +\t\t  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> +\n> +\tstd::sort(outStreams.begin(), outStreams.end(),\n> +\t\t  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> +\n> +\t/* Do any platform specific fixups. */\n> +\tstatus = data_->platformValidate(rawStreams, outStreams);\n> +\tif (status == Invalid)\n> +\t\treturn Invalid;\n> +\n> +\t/* Further fixups on the RAW streams. */\n> +\tfor (auto &raw : rawStreams) {\n> +\t\tStreamConfiguration &cfg = config_.at(raw.index);\n> +\t\tV4L2DeviceFormat rawFormat;\n> +\n> +\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat);\n> +\t\tunsigned int bitDepth = info.isValid() ? info.bitsPerPixel : defaultRawBitDepth;\n> +\t\tV4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth);\n> +\n> +\t\trawFormat.size = sensorFormat.size;\n> +\t\trawFormat.fourcc = raw.dev->toV4L2PixelFormat(cfg.pixelFormat);\n> +\n> +\t\tint ret = raw.dev->tryFormat(&rawFormat);\n> +\t\tif (ret)\n> +\t\t\treturn Invalid;\n> +\t\t/*\n> +\t\t * Some sensors change their Bayer order when they are h-flipped\n> +\t\t * or v-flipped, according to the transform. If this one does, we\n> +\t\t * must advertise the transformed Bayer order in the raw stream.\n> +\t\t * Note how we must fetch the \"native\" (i.e. 
untransformed) Bayer\n> +\t\t * order, because the sensor may currently be flipped!\n> +\t\t */\n> +\t\tV4L2PixelFormat fourcc = rawFormat.fourcc;\n> +\t\tif (data_->flipsAlterBayerOrder_) {\n> +\t\t\tBayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);\n> +\t\t\tbayer.order = data_->nativeBayerOrder_;\n> +\t\t\tbayer = bayer.transform(combinedTransform_);\n> +\t\t\tfourcc = bayer.toV4L2PixelFormat();\n> +\t\t}\n> +\n> +\t\tPixelFormat inputPixFormat = fourcc.toPixelFormat();\n> +\t\tif (raw.cfg->size != rawFormat.size || raw.cfg->pixelFormat != inputPixFormat) {\n> +\t\t\traw.cfg->size = rawFormat.size;\n> +\t\t\traw.cfg->pixelFormat = inputPixFormat;\n> +\t\t\tstatus = Adjusted;\n> +\t\t}\n> +\n> +\t\traw.cfg->stride = rawFormat.planes[0].bpl;\n> +\t\traw.cfg->frameSize = rawFormat.planes[0].size;\n> +\t}\n> +\n> +\t/* Further fixups on the ISP output streams. */\n> +\tfor (auto &out : outStreams) {\n> +\t\tStreamConfiguration &cfg = config_.at(out.index);\n> +\t\tPixelFormat &cfgPixFmt = cfg.pixelFormat;\n> +\t\tV4L2VideoDevice::Formats fmts = out.dev->formats();\n> +\n> +\t\tif (fmts.find(out.dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) {\n> +\t\t\t/* If we cannot find a native format, use a default one. */\n> +\t\t\tcfgPixFmt = formats::NV12;\n> +\t\t\tstatus = Adjusted;\n> +\t\t}\n> +\n> +\t\tV4L2DeviceFormat format;\n> +\t\tformat.fourcc = out.dev->toV4L2PixelFormat(cfg.pixelFormat);\n> +\t\tformat.size = cfg.size;\n> +\t\t/* We want to send the associated YCbCr info through to the driver. 
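[Reviewer aside: the Bayer-order fix-up above (fetch the native order, then apply the combined transform) is easier to see with a toy model. These are my own types, not libcamera's BayerFormat: flips permute the 2x2 CFA tile, so applying a transform to an already-flipped order would double-apply it.]

```cpp
/* The four possible 2x2 colour filter array orders. */
enum class Order { BGGR, GBRG, GRBG, RGGB };

Order hflip(Order o)
{
	switch (o) {		/* swap the two columns of the 2x2 tile */
	case Order::BGGR: return Order::GBRG;
	case Order::GBRG: return Order::BGGR;
	case Order::GRBG: return Order::RGGB;
	case Order::RGGB: return Order::GRBG;
	}
	return o;
}

Order vflip(Order o)
{
	switch (o) {		/* swap the two rows of the 2x2 tile */
	case Order::BGGR: return Order::GRBG;
	case Order::GRBG: return Order::BGGR;
	case Order::GBRG: return Order::RGGB;
	case Order::RGGB: return Order::GBRG;
	}
	return o;
}
```

For example, an H+V flip turns a native BGGR sensor into RGGB, which is what the advertised raw stream format must report.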
*/\n> +\t\tformat.colorSpace = yuvColorSpace_;\n> +\n> +\t\tLOG(RPI, Debug)\n> +\t\t\t<< \"Try color space \" << ColorSpace::toString(cfg.colorSpace);\n> +\n> +\t\tint ret = out.dev->tryFormat(&format);\n> +\t\tif (ret)\n> +\t\t\treturn Invalid;\n> +\n> +\t\t/*\n> +\t\t * But for RGB streams, the YCbCr info gets overwritten on the way back\n> +\t\t * so we must check against what the stream cfg says, not what we actually\n> +\t\t * requested (which carefully included the YCbCr info)!\n> +\t\t */\n> +\t\tif (cfg.colorSpace != format.colorSpace) {\n> +\t\t\tstatus = Adjusted;\n> +\t\t\tLOG(RPI, Debug)\n> +\t\t\t\t<< \"Color space changed from \"\n> +\t\t\t\t<< ColorSpace::toString(cfg.colorSpace) << \" to \"\n> +\t\t\t\t<< ColorSpace::toString(format.colorSpace);\n> +\t\t}\n> +\n> +\t\tcfg.colorSpace = format.colorSpace;\n> +\t\tcfg.stride = format.planes[0].bpl;\n> +\t\tcfg.frameSize = format.planes[0].size;\n> +\t}\n> +\n> +\treturn status;\n> +}\n> +\n> +V4L2DeviceFormat PipelineHandlerBase::toV4L2DeviceFormat(const V4L2VideoDevice *dev,\n> +\t\t\t\t\t\t\t const V4L2SubdeviceFormat &format,\n> +\t\t\t\t\t\t\t BayerFormat::Packing packingReq)\n> +{\n> +\tunsigned int mbus_code = format.mbus_code;\n> +\tconst PixelFormat pix = mbusCodeToPixelFormat(mbus_code, packingReq);\n> +\tV4L2DeviceFormat deviceFormat;\n> +\n> +\tdeviceFormat.fourcc = dev->toV4L2PixelFormat(pix);\n> +\tdeviceFormat.size = format.size;\n> +\tdeviceFormat.colorSpace = format.colorSpace;\n> +\treturn deviceFormat;\n> +}\n> +\n> +std::unique_ptr<CameraConfiguration>\n> +PipelineHandlerBase::generateConfiguration(Camera *camera, const StreamRoles &roles)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tstd::unique_ptr<CameraConfiguration> config =\n> +\t\tstd::make_unique<RPiCameraConfiguration>(data);\n> +\tV4L2SubdeviceFormat sensorFormat;\n> +\tunsigned int bufferCount;\n> +\tPixelFormat pixelFormat;\n> +\tV4L2VideoDevice::Formats fmts;\n> +\tSize size;\n> +\tstd::optional<ColorSpace> 
colorSpace;\n> +\n> +\tif (roles.empty())\n> +\t\treturn config;\n> +\n> +\tSize sensorSize = data->sensor_->resolution();\n> +\tfor (const StreamRole role : roles) {\n> +\t\tswitch (role) {\n> +\t\tcase StreamRole::Raw:\n> +\t\t\tsize = sensorSize;\n> +\t\t\tsensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth);\n> +\t\t\tpixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code,\n> +\t\t\t\t\t\t\t    BayerFormat::Packing::CSI2);\n> +\t\t\tASSERT(pixelFormat.isValid());\n> +\t\t\tcolorSpace = ColorSpace::Raw;\n> +\t\t\tbufferCount = 2;\n> +\t\t\tbreak;\n> +\n> +\t\tcase StreamRole::StillCapture:\n> +\t\t\tfmts = data->ispFormats();\n> +\t\t\tpixelFormat = formats::NV12;\n> +\t\t\t/*\n> +\t\t\t * Still image codecs usually expect the sYCC color space.\n> +\t\t\t * Even RGB codecs will be fine as the RGB we get with the\n> +\t\t\t * sYCC color space is the same as sRGB.\n> +\t\t\t */\n> +\t\t\tcolorSpace = ColorSpace::Sycc;\n> +\t\t\t/* Return the largest sensor resolution. */\n> +\t\t\tsize = sensorSize;\n> +\t\t\tbufferCount = 1;\n> +\t\t\tbreak;\n> +\n> +\t\tcase StreamRole::VideoRecording:\n> +\t\t\t/*\n> +\t\t\t * The colour denoise algorithm requires the analysis\n> +\t\t\t * image, produced by the second ISP output, to be in\n> +\t\t\t * YUV420 format. 
Select this format as the default, to\n> +\t\t\t * maximize chances that it will be picked by\n> +\t\t\t * applications and enable usage of the colour denoise\n> +\t\t\t * algorithm.\n> +\t\t\t */\n> +\t\t\tfmts = data->ispFormats();\n> +\t\t\tpixelFormat = formats::YUV420;\n> +\t\t\t/*\n> +\t\t\t * Choose a color space appropriate for video recording.\n> +\t\t\t * Rec.709 will be a good default for HD resolutions.\n> +\t\t\t */\n> +\t\t\tcolorSpace = ColorSpace::Rec709;\n> +\t\t\tsize = { 1920, 1080 };\n> +\t\t\tbufferCount = 4;\n> +\t\t\tbreak;\n> +\n> +\t\tcase StreamRole::Viewfinder:\n> +\t\t\tfmts = data->ispFormats();\n> +\t\t\tpixelFormat = formats::ARGB8888;\n> +\t\t\tcolorSpace = ColorSpace::Sycc;\n> +\t\t\tsize = { 800, 600 };\n> +\t\t\tbufferCount = 4;\n> +\t\t\tbreak;\n> +\n> +\t\tdefault:\n> +\t\t\tLOG(RPI, Error) << \"Requested stream role not supported: \"\n> +\t\t\t\t\t<< role;\n> +\t\t\treturn nullptr;\n> +\t\t}\n> +\n> +\t\tstd::map<PixelFormat, std::vector<SizeRange>> deviceFormats;\n> +\t\tif (role == StreamRole::Raw) {\n> +\t\t\t/* Translate the MBUS codes to a PixelFormat. */\n> +\t\t\tfor (const auto &format : data->sensorFormats_) {\n> +\t\t\t\tPixelFormat pf = mbusCodeToPixelFormat(format.first,\n> +\t\t\t\t\t\t\t\t       BayerFormat::Packing::CSI2);\n> +\t\t\t\tif (pf.isValid())\n> +\t\t\t\t\tdeviceFormats.emplace(std::piecewise_construct, std::forward_as_tuple(pf),\n> +\t\t\t\t\t\t\t      std::forward_as_tuple(format.second.begin(), format.second.end()));\n> +\t\t\t}\n> +\t\t} else {\n> +\t\t\t/*\n> +\t\t\t * Translate the V4L2PixelFormat to PixelFormat. 
Note that we\n> +\t\t\t * limit the recommended largest ISP output size to match the\n> +\t\t\t * sensor resolution.\n> +\t\t\t */\n> +\t\t\tfor (const auto &format : fmts) {\n> +\t\t\t\tPixelFormat pf = format.first.toPixelFormat();\n> +\t\t\t\tif (pf.isValid()) {\n> +\t\t\t\t\tconst SizeRange &ispSizes = format.second[0];\n> +\t\t\t\t\tdeviceFormats[pf].emplace_back(ispSizes.min, sensorSize,\n> +\t\t\t\t\t\t\t\t       ispSizes.hStep, ispSizes.vStep);\n> +\t\t\t\t}\n> +\t\t\t}\n> +\t\t}\n> +\n> +\t\t/* Add the stream format based on the device node used for the use case. */\n> +\t\tStreamFormats formats(deviceFormats);\n> +\t\tStreamConfiguration cfg(formats);\n> +\t\tcfg.size = size;\n> +\t\tcfg.pixelFormat = pixelFormat;\n> +\t\tcfg.colorSpace = colorSpace;\n> +\t\tcfg.bufferCount = bufferCount;\n> +\t\tconfig->addConfiguration(cfg);\n> +\t}\n> +\n> +\tconfig->validate();\n> +\n> +\treturn config;\n> +}\n> +\n> +int PipelineHandlerBase::configure(Camera *camera, CameraConfiguration *config)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tint ret;\n> +\n> +\t/* Start by freeing all buffers and reset the stream states. */\n> +\tdata->freeBuffers();\n> +\tfor (auto const stream : data->streams_)\n> +\t\tstream->setExternal(false);\n> +\n> +\tstd::vector<CameraData::StreamParams> rawStreams, ispStreams;\n> +\tstd::optional<BayerFormat::Packing> packing;\n> +\tunsigned int bitDepth = defaultRawBitDepth;\n> +\n> +\tfor (unsigned i = 0; i < config->size(); i++) {\n> +\t\tStreamConfiguration *cfg = &config->at(i);\n> +\n> +\t\tif (isRaw(cfg->pixelFormat))\n> +\t\t\trawStreams.emplace_back(i, cfg);\n> +\t\telse\n> +\t\t\tispStreams.emplace_back(i, cfg);\n> +\t}\n> +\n> +\t/* Sort the streams so the highest resolution is first. 
*/\n> +\tstd::sort(rawStreams.begin(), rawStreams.end(),\n> +\t\t  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> +\n> +\tstd::sort(ispStreams.begin(), ispStreams.end(),\n> +\t\t  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> +\n> +\t/*\n> +\t * Calculate the best sensor mode we can use based on the user's request,\n> +\t * and apply it to the sensor with the cached transform, if any.\n> +\t *\n> +\t * If we have been given a RAW stream, use that size for setting up the sensor.\n> +\t */\n> +\tif (!rawStreams.empty()) {\n> +\t\tBayerFormat bayerFormat = BayerFormat::fromPixelFormat(rawStreams[0].cfg->pixelFormat);\n> +\t\t/* Replace the user requested packing/bit-depth. */\n> +\t\tpacking = bayerFormat.packing;\n> +\t\tbitDepth = bayerFormat.bitDepth;\n> +\t}\n> +\n> +\tV4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_,\n> +\t\t\t\t\t\t\t  rawStreams.empty() ? ispStreams[0].cfg->size\n> +\t\t\t\t\t\t\t\t\t     : rawStreams[0].cfg->size,\n> +\t\t\t\t\t\t\t  bitDepth);\n> +\t/* Apply any cached transform. */\n> +\tconst RPiCameraConfiguration *rpiConfig = static_cast<const RPiCameraConfiguration *>(config);\n> +\n> +\t/* Then apply the format on the sensor. */\n> +\tret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/*\n> +\t * Platform specific internal stream configuration. 
This also assigns\n> +\t * external streams which get configured below.\n> +\t */\n> +\tret = data->platformConfigure(sensorFormat, packing, rawStreams, ispStreams);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tipa::RPi::ConfigResult result;\n> +\tret = data->configureIPA(config, &result);\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Failed to configure the IPA: \" << ret;\n> +\t\treturn ret;\n> +\t}\n> +\n> +\t/*\n> +\t * Set the scaler crop to the value we are using (scaled to native sensor\n> +\t * coordinates).\n> +\t */\n> +\tdata->scalerCrop_ = data->scaleIspCrop(data->ispCrop_);\n> +\n> +\t/*\n> +\t * Update the ScalerCropMaximum to the correct value for this camera mode.\n> +\t * For us, it's the same as the \"analogue crop\".\n> +\t *\n> +\t * \\todo Make this property the ScalerCrop maximum value when dynamic\n> +\t * controls are available and set it at validate() time\n> +\t */\n> +\tdata->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop);\n> +\n> +\t/* Store the mode sensitivity for the application. */\n> +\tdata->properties_.set(properties::SensorSensitivity, result.modeSensitivity);\n> +\n> +\t/* Update the controls that the Raspberry Pi IPA can handle. */\n> +\tControlInfoMap::Map ctrlMap;\n> +\tfor (auto const &c : result.controlInfo)\n> +\t\tctrlMap.emplace(c.first, c.second);\n> +\n> +\t/* Add the ScalerCrop control limits based on the current mode. */\n> +\tRectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_));\n> +\tctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, data->scalerCrop_);\n> +\n> +\tdata->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap());\n> +\n> +\t/* Setup the Video Mux/Bridge entities. 
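[Reviewer aside: the per-mux rule applied in the loop below is simply "exactly one sink link stays enabled". A sketch with toy types (not MediaLink/MediaPad) showing the invariant:]

```cpp
#include <vector>

/* Stand-in for a sink-pad link on a video mux or bridge. */
struct Link {
	int id;
	bool enabled;
};

/*
 * Mirror of the bridge-configuration rule: disable every sink-pad link on
 * the device except the one carrying the active sensor's stream.
 */
void selectActiveLink(std::vector<Link> &sinkLinks, int activeId)
{
	for (Link &l : sinkLinks)
		l.enabled = (l.id == activeId);
}
```

Applying this at every mux/bridge in the cascade leaves a single enabled path from the chosen sensor to the frontend, as the enumerateVideoDevices() comment later describes.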
*/\n> +\tfor (auto &[device, link] : data->bridgeDevices_) {\n> +\t\t/*\n> +\t\t * Start by disabling all the sink pad links on the devices in the\n> +\t\t * cascade, with the exception of the link connecting the device.\n> +\t\t */\n> +\t\tfor (const MediaPad *p : device->entity()->pads()) {\n> +\t\t\tif (!(p->flags() & MEDIA_PAD_FL_SINK))\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tfor (MediaLink *l : p->links()) {\n> +\t\t\t\tif (l != link)\n> +\t\t\t\t\tl->setEnabled(false);\n> +\t\t\t}\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * Next, enable the entity -> entity links, and setup the pad format.\n> +\t\t *\n> +\t\t * \\todo Some bridge devices may change the media bus code, so we\n> +\t\t * ought to read the source pad format and propagate it to the sink pad.\n> +\t\t */\n> +\t\tlink->setEnabled(true);\n> +\t\tconst MediaPad *sinkPad = link->sink();\n> +\t\tret = device->setFormat(sinkPad->index(), &sensorFormat);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to set format on \" << device->entity()->name()\n> +\t\t\t\t\t<< \" pad \" << sinkPad->index()\n> +\t\t\t\t\t<< \" with format \" << sensorFormat\n> +\t\t\t\t\t<< \": \" << ret;\n> +\t\t\treturn ret;\n> +\t\t}\n> +\n> +\t\tLOG(RPI, Debug) << \"Configured media link on device \" << device->entity()->name()\n> +\t\t\t\t<< \" on pad \" << sinkPad->index();\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +int PipelineHandlerBase::exportFrameBuffers([[maybe_unused]] Camera *camera, libcamera::Stream *stream,\n> +\t\t\t\t\t    std::vector<std::unique_ptr<FrameBuffer>> *buffers)\n> +{\n> +\tRPi::Stream *s = static_cast<RPi::Stream *>(stream);\n> +\tunsigned int count = stream->configuration().bufferCount;\n> +\tint ret = s->dev()->exportBuffers(count, buffers);\n> +\n> +\ts->setExportedBuffers(buffers);\n> +\n> +\treturn ret;\n> +}\n> +\n> +int PipelineHandlerBase::start(Camera *camera, const ControlList *controls)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tint ret;\n> +\n> +\t/* Check if a ScalerCrop 
control was specified. */\n> +\tif (controls)\n> +\t\tdata->calculateScalerCrop(*controls);\n> +\n> +\t/* Start the IPA. */\n> +\tipa::RPi::StartResult result;\n> +\tdata->ipa_->start(controls ? *controls : ControlList{ controls::controls },\n> +\t\t\t  &result);\n> +\n> +\t/* Apply any gain/exposure settings that the IPA may have passed back. */\n> +\tif (!result.controls.empty())\n> +\t\tdata->setSensorControls(result.controls);\n> +\n> +\t/* Configure the number of dropped frames required on startup. */\n> +\tdata->dropFrameCount_ = data->config_.disableStartupFrameDrops ? 0\n> +\t\t\t\t\t\t\t\t       : result.dropFrameCount;\n\n\tdata->dropFrameCount_ = data->config_.disableStartupFrameDrops\n\t\t\t      ? 0 : result.dropFrameCount;\n\n> +\n> +\tfor (auto const stream : data->streams_)\n> +\t\tstream->resetBuffers();\n> +\n> +\tif (!data->buffersAllocated_) {\n> +\t\t/* Allocate buffers for internal pipeline usage. */\n> +\t\tret = prepareBuffers(camera);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to allocate buffers\";\n> +\t\t\tdata->freeBuffers();\n> +\t\t\tstop(camera);\n> +\t\t\treturn ret;\n> +\t\t}\n> +\t\tdata->buffersAllocated_ = true;\n> +\t}\n> +\n> +\t/* We need to set the dropFrameCount_ before queueing buffers. */\n> +\tret = queueAllBuffers(camera);\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Failed to queue buffers\";\n> +\t\tstop(camera);\n> +\t\treturn ret;\n> +\t}\n> +\n> +\t/*\n> +\t * Reset the delayed controls with the gain and exposure values set by\n> +\t * the IPA.\n> +\t */\n> +\tdata->delayedCtrls_->reset(0);\n> +\tdata->state_ = CameraData::State::Idle;\n> +\n> +\t/* Enable SOF event generation. */\n> +\tdata->frontendDevice()->setFrameStartEnabled(true);\n> +\n> +\tdata->platformStart();\n> +\n> +\t/* Start all streams. 
*/\n> +\tfor (auto const stream : data->streams_) {\n> +\t\tret = stream->dev()->streamOn();\n> +\t\tif (ret) {\n> +\t\t\tstop(camera);\n> +\t\t\treturn ret;\n> +\t\t}\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +void PipelineHandlerBase::stopDevice(Camera *camera)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\n> +\tdata->state_ = CameraData::State::Stopped;\n> +\tdata->platformStop();\n> +\n> +\tfor (auto const stream : data->streams_)\n> +\t\tstream->dev()->streamOff();\n> +\n> +\t/* Disable SOF event generation. */\n> +\tdata->frontendDevice()->setFrameStartEnabled(false);\n> +\n> +\tdata->clearIncompleteRequests();\n> +\n> +\t/* Stop the IPA. */\n> +\tdata->ipa_->stop();\n> +}\n> +\n> +void PipelineHandlerBase::releaseDevice(Camera *camera)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tdata->freeBuffers();\n> +}\n> +\n> +int PipelineHandlerBase::queueRequestDevice(Camera *camera, Request *request)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\n> +\tif (!data->isRunning())\n> +\t\treturn -EINVAL;\n> +\n> +\tLOG(RPI, Debug) << \"queueRequestDevice: New request.\";\n> +\n> +\t/* Push all buffers supplied in the Request to the respective streams. */\n> +\tfor (auto stream : data->streams_) {\n> +\t\tif (!stream->isExternal())\n> +\t\t\tcontinue;\n> +\n> +\t\tFrameBuffer *buffer = request->findBuffer(stream);\n> +\t\tif (buffer && !stream->getBufferId(buffer)) {\n> +\t\t\t/*\n> +\t\t\t * This buffer is not recognised, so it must have been allocated\n> +\t\t\t * outside the v4l2 device. Store it in the stream buffer list\n> +\t\t\t * so we can track it.\n> +\t\t\t */\n> +\t\t\tstream->setExternalBuffer(buffer);\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * If no buffer is provided by the request for this stream, we\n> +\t\t * queue a nullptr to the stream to signify that it must use an\n> +\t\t * internally allocated buffer for this capture request. 
This\n> +\t\t * buffer will not be given back to the application, but is used\n> +\t\t * to support the internal pipeline flow.\n> +\t\t *\n> +\t\t * The below queueBuffer() call will do nothing if there are not\n> +\t\t * enough internal buffers allocated, but this will be handled by\n> +\t\t * queuing the request for buffers in the RPiStream object.\n> +\t\t */\n> +\t\tint ret = stream->queueBuffer(buffer);\n> +\t\tif (ret)\n> +\t\t\treturn ret;\n> +\t}\n> +\n> +\t/* Push the request to the back of the queue. */\n> +\tdata->requestQueue_.push(request);\n> +\tdata->handleState();\n> +\n> +\treturn 0;\n> +}\n> +\n> +int PipelineHandlerBase::registerCamera(MediaDevice *frontend, const std::string &frontendName,\n> +\t\t\t\t\tMediaDevice *backend, MediaEntity *sensorEntity)\n> +{\n> +\tstd::unique_ptr<CameraData> cameraData = allocateCameraData();\n> +\tCameraData *data = cameraData.get();\n> +\tint ret;\n> +\n> +\tdata->sensor_ = std::make_unique<CameraSensor>(sensorEntity);\n> +\tif (!data->sensor_)\n> +\t\treturn -EINVAL;\n> +\n> +\tif (data->sensor_->init())\n> +\t\treturn -EINVAL;\n> +\n> +\tdata->sensorFormats_ = populateSensorFormats(data->sensor_);\n> +\n> +\t/*\n> +\t * Enumerate all the Video Mux/Bridge devices across the sensor -> Fr\n> +\t * chain. There may be a cascade of devices in this chain!\n> +\t */\n> +\tMediaLink *link = sensorEntity->getPadByIndex(0)->links()[0];\n> +\tdata->enumerateVideoDevices(link, frontendName);\n> +\n> +\tipa::RPi::InitResult result;\n> +\tif (data->loadIPA(&result)) {\n> +\t\tLOG(RPI, Error) << \"Failed to load a suitable IPA library\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\t/*\n> +\t * Setup our delayed control writer with the sensor default\n> +\t * gain and exposure delays. 
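[Reviewer aside: the reason each control carries its own delay can be sketched with a rough model (this is not the real RPi::DelayedControls, just the idea): a value written on frame N only takes effect on frame N + delay, so in-flight values are kept in a small pipeline that shifts once per frame-start event. Assumes delay >= 1.]

```cpp
#include <deque>

/* Toy single-control model of a delayed control write. */
class DelayedValue
{
public:
	explicit DelayedValue(unsigned delay, int initial = 0)
		: pipeline_(delay + 1, initial) {}

	/* Stage a value; it lands `delay` frame starts from now. */
	void write(int value) { pipeline_.back() = value; }

	/* Called on each frame-start event; returns the value now in effect. */
	int frameStart()
	{
		pipeline_.pop_front();
		pipeline_.push_back(pipeline_.back());
		return pipeline_.front();
	}

private:
	std::deque<int> pipeline_;
};
```

With per-control delays like those in the params map, gain, exposure and blanking writes issued together all become active on the same frame despite taking effect after different numbers of frames in the sensor.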
Mark VBLANK for priority write.\n> +\t */\n> +\tstd::unordered_map<uint32_t, RPi::DelayedControls::ControlParams> params = {\n> +\t\t{ V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } },\n> +\t\t{ V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } },\n> +\t\t{ V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } },\n> +\t\t{ V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } }\n> +\t};\n> +\tdata->delayedCtrls_ = std::make_unique<RPi::DelayedControls>(data->sensor_->device(), params);\n> +\tdata->sensorMetadata_ = result.sensorConfig.sensorMetadata;\n> +\n> +\t/* Register initial controls that the Raspberry Pi IPA can handle. */\n> +\tdata->controlInfo_ = std::move(result.controlInfo);\n> +\n> +\t/* Initialize the camera properties. */\n> +\tdata->properties_ = data->sensor_->properties();\n> +\n> +\t/*\n> +\t * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the\n> +\t * sensor of the colour gains. It is defined to be a linear gain where\n> +\t * the default value represents a gain of exactly one.\n> +\t */\n> +\tauto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS);\n> +\tif (it != data->sensor_->controls().end())\n> +\t\tdata->notifyGainsUnity_ = it->second.def().get<int32_t>();\n> +\n> +\t/*\n> +\t * Set a default value for the ScalerCropMaximum property to show\n> +\t * that we support its use, however, initialise it to zero because\n> +\t * it's not meaningful until a camera mode has been chosen.\n> +\t */\n> +\tdata->properties_.set(properties::ScalerCropMaximum, Rectangle{});\n> +\n> +\t/*\n> +\t * We cache two things about the sensor in relation to transforms\n> +\t * (meaning horizontal and vertical flips): if they affect the Bayer\n> +         * ordering, and what the \"native\" Bayer order is, when no transforms\n\nWrong indentation.\n\n> +\t * are applied.\n> +\t *\n> +\t * If flips are supported verify if they affect the Bayer ordering\n> +\t * and what the \"native\" Bayer order 
is, when no transforms are\n> +\t * applied.\n> +\t *\n> +\t * We note that the sensor's cached list of supported formats is\n> +\t * already in the \"native\" order, with any flips having been undone.\n> +\t */\n> +\tconst V4L2Subdevice *sensor = data->sensor_->device();\n> +\tconst struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);\n> +\tif (hflipCtrl) {\n> +\t\t/* We assume it will support vflips too... */\n> +\t\tdata->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;\n> +\t}\n> +\n> +\t/* Look for a valid Bayer format. */\n> +\tBayerFormat bayerFormat;\n> +\tfor (const auto &iter : data->sensorFormats_) {\n> +\t\tbayerFormat = BayerFormat::fromMbusCode(iter.first);\n> +\t\tif (bayerFormat.isValid())\n> +\t\t\tbreak;\n> +\t}\n> +\n> +\tif (!bayerFormat.isValid()) {\n> +\t\tLOG(RPI, Error) << \"No Bayer format found\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\tdata->nativeBayerOrder_ = bayerFormat.order;\n> +\n> +\tret = data->loadPipelineConfiguration();\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Unable to load pipeline configuration\";\n> +\t\treturn ret;\n> +\t}\n> +\n> +\tret = platformRegister(cameraData, frontend, backend);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/* Setup the general IPA signal handlers. 
*/\n> +\tdata->frontendDevice()->dequeueTimeout.connect(data, &RPi::CameraData::cameraTimeout);\n> +\tdata->frontendDevice()->frameStart.connect(data, &RPi::CameraData::frameStarted);\n> +\tdata->ipa_->setDelayedControls.connect(data, &CameraData::setDelayedControls);\n> +\tdata->ipa_->setLensControls.connect(data, &CameraData::setLensControls);\n> +\n> +\treturn 0;\n> +}\n> +\n> +void PipelineHandlerBase::mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tstd::vector<IPABuffer> bufferIds;\n> +\t/*\n> +\t * Link the FrameBuffers with the id (key value) in the map stored in\n> +\t * the RPi stream object - along with an identifier mask.\n> +\t *\n> +\t * This will allow us to identify buffers passed between the pipeline\n> +\t * handler and the IPA.\n> +\t */\n> +\tfor (auto const &it : buffers) {\n> +\t\tbufferIds.push_back(IPABuffer(mask | it.first,\n> +\t\t\t\t\t      it.second->planes()));\n> +\t\tdata->bufferIds_.insert(mask | it.first);\n> +\t}\n> +\n> +\tdata->ipa_->mapBuffers(bufferIds);\n> +}\n> +\n> +int PipelineHandlerBase::queueAllBuffers(Camera *camera)\n> +{\n> +\tCameraData *data = cameraData(camera);\n> +\tint ret;\n> +\n> +\tfor (auto const stream : data->streams_) {\n> +\t\tif (!stream->isExternal()) {\n> +\t\t\tret = stream->queueAllBuffers();\n> +\t\t\tif (ret < 0)\n> +\t\t\t\treturn ret;\n> +\t\t} else {\n> +\t\t\t/*\n> +\t\t\t * For external streams, we must queue up a set of internal\n> +\t\t\t * buffers to handle the number of drop frames requested by\n> +\t\t\t * the IPA. 
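[Reviewer aside: the `mask | it.first` packing in mapBuffers() is worth spelling out. A sketch of the convention (the shift and mask values here are my own, purely illustrative): a per-stream mask is ORed into the buffer index so one integer round-trips through the IPA identifying both stream and buffer.]

```cpp
#include <cstdint>

constexpr uint32_t kMaskShift = 16;
constexpr uint32_t kIndexMask = (1u << kMaskShift) - 1;

/* Pack a per-stream mask and a buffer index into a single id. */
constexpr uint32_t packBufferId(uint32_t streamMask, uint32_t index)
{
	return streamMask | index;
}

constexpr uint32_t bufferIndex(uint32_t id)
{
	return id & kIndexMask;		/* recover the buffer index */
}

constexpr uint32_t bufferStream(uint32_t id)
{
	return id & ~kIndexMask;	/* recover the stream mask */
}
```

The pipeline handler keeps the set of issued ids (`bufferIds_`) so it can later unmap exactly the buffers the IPA was told about.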
This is done by passing nullptr in queueBuffer().\n> +\t\t\t *\n> +\t\t\t * The below queueBuffer() call will do nothing if there\n> +\t\t\t * are not enough internal buffers allocated, but this will\n> +\t\t\t * be handled by queuing the request for buffers in the\n> +\t\t\t * RPiStream object.\n> +\t\t\t */\n> +\t\t\tunsigned int i;\n> +\t\t\tfor (i = 0; i < data->dropFrameCount_; i++) {\n> +\t\t\t\tret = stream->queueBuffer(nullptr);\n> +\t\t\t\tif (ret)\n> +\t\t\t\t\treturn ret;\n> +\t\t\t}\n> +\t\t}\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +void CameraData::freeBuffers()\n> +{\n> +\tif (ipa_) {\n> +\t\t/*\n> +\t\t * Copy the buffer ids from the unordered_set to a vector to\n> +\t\t * pass to the IPA.\n> +\t\t */\n> +\t\tstd::vector<unsigned int> bufferIds(bufferIds_.begin(),\n> +\t\t\t\t\t\t    bufferIds_.end());\n> +\t\tipa_->unmapBuffers(bufferIds);\n> +\t\tbufferIds_.clear();\n> +\t}\n> +\n> +\tfor (auto const stream : streams_)\n> +\t\tstream->releaseBuffers();\n> +\n> +\tplatformFreeBuffers();\n> +\n> +\tbuffersAllocated_ = false;\n> +}\n> +\n> +/*\n> + * enumerateVideoDevices() iterates over the Media Controller topology, starting\n> + * at the sensor and finishing at the frontend. For each sensor, CameraData stores\n> + * a unique list of any intermediate video mux or bridge devices connected in a\n> + * cascade, together with the entity to entity link.\n> + *\n> + * Entity pad configuration and link enabling happens at the end of configure().\n> + * We first disable all pad links on each entity device in the chain, and then\n> + * selectively enable the specific links connecting the sensor to the frontend\n> + * across all intermediate muxes and bridges.\n> + *\n> + * In the cascaded topology below, if Sensor1 is used, the Mux2 -> Mux1 link\n> + * will be disabled, and Sensor1 -> Mux1 -> Frontend links enabled. 
Alternatively,\n> + * if Sensor3 is used, the Sensor2 -> Mux2 and Sensor1 -> Mux1 links are disabled,\n> + * and Sensor3 -> Mux2 -> Mux1 -> Frontend links are enabled. All other links will\n> + * remain unchanged.\n> + *\n> + *  +----------+\n> + *  |     FE   |\n> + *  +-----^----+\n> + *        |\n> + *    +---+---+\n> + *    | Mux1  |<------+\n> + *    +--^----+       |\n> + *       |            |\n> + * +-----+---+    +---+---+\n> + * | Sensor1 |    |  Mux2 |<--+\n> + * +---------+    +-^-----+   |\n> + *                  |         |\n> + *          +-------+-+   +---+-----+\n> + *          | Sensor2 |   | Sensor3 |\n> + *          +---------+   +---------+\n> + */\n> +void CameraData::enumerateVideoDevices(MediaLink *link, const std::string &frontend)\n> +{\n> +\tconst MediaPad *sinkPad = link->sink();\n> +\tconst MediaEntity *entity = sinkPad->entity();\n> +\tbool frontendFound = false;\n> +\n> +\t/* We only deal with Video Mux and Bridge devices in cascade. */\n> +\tif (entity->function() != MEDIA_ENT_F_VID_MUX &&\n> +\t    entity->function() != MEDIA_ENT_F_VID_IF_BRIDGE)\n> +\t\treturn;\n> +\n> +\t/* Find the source pad for this Video Mux or Bridge device. */\n> +\tconst MediaPad *sourcePad = nullptr;\n> +\tfor (const MediaPad *pad : entity->pads()) {\n> +\t\tif (pad->flags() & MEDIA_PAD_FL_SOURCE) {\n> +\t\t\t/*\n> +\t\t\t * We can only deal with devices that have a single source\n> +\t\t\t * pad. 
If this device has multiple source pads, ignore it\n> +\t\t\t * and this branch in the cascade.\n> +\t\t\t */\n> +\t\t\tif (sourcePad)\n> +\t\t\t\treturn;\n> +\n> +\t\t\tsourcePad = pad;\n> +\t\t}\n> +\t}\n> +\n> +\tLOG(RPI, Debug) << \"Found video mux device \" << entity->name()\n> +\t\t\t<< \" linked to sink pad \" << sinkPad->index();\n> +\n> +\tbridgeDevices_.emplace_back(std::make_unique<V4L2Subdevice>(entity), link);\n> +\tbridgeDevices_.back().first->open();\n> +\n> +\t/*\n> +\t * Iterate through all the sink pad links down the cascade to find any\n> +\t * other Video Mux and Bridge devices.\n> +\t */\n> +\tfor (MediaLink *l : sourcePad->links()) {\n> +\t\tenumerateVideoDevices(l, frontend);\n> +\t\t/* Once we reach the Frontend entity, we are done. */\n> +\t\tif (l->sink()->entity()->name() == frontend) {\n> +\t\t\tfrontendFound = true;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\t/* This identifies the end of our entity enumeration recursion. */\n> +\tif (link->source()->entity()->function() == MEDIA_ENT_F_CAM_SENSOR) {\n> +\t\t/*\n> +\t\t* If the frontend is not at the end of this cascade, we cannot\n> +\t\t* configure this topology automatically, so remove all entity references.\n\nLine wrap.\n\n> +\t\t*/\n> +\t\tif (!frontendFound) {\n> +\t\t\tLOG(RPI, Warning) << \"Cannot automatically configure this MC topology!\";\n> +\t\t\tbridgeDevices_.clear();\n> +\t\t}\n> +\t}\n> +}\n> +\n> +int CameraData::loadPipelineConfiguration()\n> +{\n> +\tconfig_ = {\n> +\t\t.disableStartupFrameDrops = false,\n> +\t\t.cameraTimeoutValue = 0,\n> +\t};\n> +\n> +\t/* Initial configuration of the platform, in case no config file is present */\n> +\tplatformPipelineConfigure({});\n> +\n> +\tchar const *configFromEnv = utils::secure_getenv(\"LIBCAMERA_RPI_CONFIG_FILE\");\n> +\tif (!configFromEnv || *configFromEnv == '\\0')\n> +\t\treturn 0;\n> +\n> +\tstd::string filename = std::string(configFromEnv);\n> +\tFile file(filename);\n> +\n> +\tif 
(!file.open(File::OpenModeFlag::ReadOnly)) {\n> +\t\tLOG(RPI, Error) << \"Failed to open configuration file '\" << filename << \"'\";\n> +\t\treturn -EIO;\n> +\t}\n> +\n> +\tLOG(RPI, Info) << \"Using configuration file '\" << filename << \"'\";\n> +\n> +\tstd::unique_ptr<YamlObject> root = YamlParser::parse(file);\n> +\tif (!root) {\n> +\t\tLOG(RPI, Warning) << \"Failed to parse configuration file, using defaults\";\n> +\t\treturn 0;\n> +\t}\n> +\n> +\tstd::optional<double> ver = (*root)[\"version\"].get<double>();\n> +\tif (!ver || *ver != 1.0) {\n> +\t\tLOG(RPI, Error) << \"Unexpected configuration file version reported\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tconst YamlObject &phConfig = (*root)[\"pipeline_handler\"];\n> +\n> +\tconfig_.disableStartupFrameDrops =\n> +\t\tphConfig[\"disable_startup_frame_drops\"].get<bool>(config_.disableStartupFrameDrops);\n> +\n> +\tconfig_.cameraTimeoutValue =\n> +\t\tphConfig[\"camera_timeout_value_ms\"].get<unsigned int>(config_.cameraTimeoutValue);\n> +\n> +\tif (config_.cameraTimeoutValue) {\n> +\t\t/* Disable the IPA signal to control timeout and set the user requested value. 
*/\n> +\t\tipa_->setCameraTimeout.disconnect();\n> +\t\tfrontendDevice()->setDequeueTimeout(config_.cameraTimeoutValue * 1ms);\n> +\t}\n> +\n> +\treturn platformPipelineConfigure(root);\n> +}\n> +\n> +int CameraData::loadIPA(ipa::RPi::InitResult *result)\n> +{\n> +\tipa_ = IPAManager::createIPA<ipa::RPi::IPAProxyRPi>(pipe(), 1, 1);\n> +\n> +\tif (!ipa_)\n> +\t\treturn -ENOENT;\n> +\n> +\t/*\n> +\t * The configuration (tuning file) is made from the sensor name unless\n> +\t * the environment variable overrides it.\n> +\t */\n> +\tstd::string configurationFile;\n> +\tchar const *configFromEnv = utils::secure_getenv(\"LIBCAMERA_RPI_TUNING_FILE\");\n> +\tif (!configFromEnv || *configFromEnv == '\\0') {\n> +\t\tstd::string model = sensor_->model();\n> +\t\tif (isMonoSensor(sensor_))\n> +\t\t\tmodel += \"_mono\";\n> +\t\tconfigurationFile = ipa_->configurationFile(model + \".json\", \"rpi\");\n> +\t} else {\n> +\t\tconfigurationFile = std::string(configFromEnv);\n> +\t}\n> +\n> +\tIPASettings settings(configurationFile, sensor_->model());\n> +\tipa::RPi::InitParams params;\n> +\n> +\tparams.lensPresent = !!sensor_->focusLens();\n> +\tint ret = platformInitIpa(params);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\treturn ipa_->init(settings, params, result);\n> +}\n> +\n> +int CameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result)\n> +{\n> +\tstd::map<unsigned int, ControlInfoMap> entityControls;\n> +\tipa::RPi::ConfigParams params;\n> +\tint ret;\n> +\n> +\tparams.sensorControls = sensor_->controls();\n> +\tif (sensor_->focusLens())\n> +\t\tparams.lensControls = sensor_->focusLens()->controls();\n> +\n> +\tret = platformConfigureIpa(params);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/* We store the IPACameraSensorInfo for digital zoom calculations. 
*/\n> +\tret = sensor_->sensorInfo(&sensorInfo_);\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Failed to retrieve camera sensor info\";\n> +\t\treturn ret;\n> +\t}\n> +\n> +\t/* Always send the user transform to the IPA. */\n> +\tparams.transform = static_cast<unsigned int>(config->transform);\n> +\n> +\t/* Ready the IPA - it must know about the sensor resolution. */\n> +\tret = ipa_->configure(sensorInfo_, params, result);\n> +\tif (ret < 0) {\n> +\t\tLOG(RPI, Error) << \"IPA configuration failed!\";\n> +\t\treturn -EPIPE;\n> +\t}\n> +\n> +\tif (!result->controls.empty())\n> +\t\tsetSensorControls(result->controls);\n> +\n> +\treturn 0;\n> +}\n> +\n> +void CameraData::setDelayedControls(const ControlList &controls, uint32_t delayContext)\n> +{\n> +\tif (!delayedCtrls_->push(controls, delayContext))\n> +\t\tLOG(RPI, Error) << \"V4L2 DelayedControl set failed\";\n> +}\n> +\n> +void CameraData::setLensControls(const ControlList &controls)\n> +{\n> +\tCameraLens *lens = sensor_->focusLens();\n> +\n> +\tif (lens && controls.contains(V4L2_CID_FOCUS_ABSOLUTE)) {\n> +\t\tControlValue const &focusValue = controls.get(V4L2_CID_FOCUS_ABSOLUTE);\n> +\t\tlens->setFocusPosition(focusValue.get<int32_t>());\n> +\t}\n> +}\n> +\n> +void CameraData::setSensorControls(ControlList &controls)\n> +{\n> +\t/*\n> +\t * We need to ensure that if both VBLANK and EXPOSURE are present, the\n> +\t * former must be written ahead of, and separately from EXPOSURE to avoid\n> +\t * V4L2 rejecting the latter. 
This is identical to what DelayedControls\n> +\t * does with the priority write flag.\n> +\t *\n> +\t * As a consequence of the below logic, VBLANK gets set twice, and we\n> +\t * rely on the v4l2 framework to not pass the second control set to the\n> +\t * driver as the actual control value has not changed.\n> +\t */\n> +\tif (controls.contains(V4L2_CID_EXPOSURE) && controls.contains(V4L2_CID_VBLANK)) {\n> +\t\tControlList vblank_ctrl;\n> +\n> +\t\tvblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));\n> +\t\tsensor_->setControls(&vblank_ctrl);\n> +\t}\n> +\n> +\tsensor_->setControls(&controls);\n> +}\n> +\n> +Rectangle CameraData::scaleIspCrop(const Rectangle &ispCrop) const\n> +{\n> +\t/*\n> +\t * Scale a crop rectangle defined in the ISP's coordinates into native sensor\n> +\t * coordinates.\n> +\t */\n> +\tRectangle nativeCrop = ispCrop.scaledBy(sensorInfo_.analogCrop.size(),\n> +\t\t\t\t\t\tsensorInfo_.outputSize);\n> +\tnativeCrop.translateBy(sensorInfo_.analogCrop.topLeft());\n> +\treturn nativeCrop;\n> +}\n> +\n> +void CameraData::calculateScalerCrop(const ControlList &controls)\n> +{\n> +\tconst auto &scalerCrop = controls.get<Rectangle>(controls::ScalerCrop);\n> +\tif (scalerCrop) {\n> +\t\tRectangle nativeCrop = *scalerCrop;\n> +\n> +\t\tif (!nativeCrop.width || !nativeCrop.height)\n> +\t\t\tnativeCrop = { 0, 0, 1, 1 };\n> +\n> +\t\t/* Create a version of the crop scaled to ISP (camera mode) pixels. */\n> +\t\tRectangle ispCrop = nativeCrop.translatedBy(-sensorInfo_.analogCrop.topLeft());\n> +\t\tispCrop.scaleBy(sensorInfo_.outputSize, sensorInfo_.analogCrop.size());\n> +\n> +\t\t/*\n> +\t\t * The crop that we set must be:\n> +\t\t * 1. At least as big as ispMinCropSize_, once that's been\n> +\t\t *    enlarged to the same aspect ratio.\n> +\t\t * 2. With the same mid-point, if possible.\n> +\t\t * 3. 
But it can't go outside the sensor area.\n> +\t\t */\n> +\t\tSize minSize = ispMinCropSize_.expandedToAspectRatio(nativeCrop.size());\n> +\t\tSize size = ispCrop.size().expandedTo(minSize);\n> +\t\tispCrop = size.centeredTo(ispCrop.center()).enclosedIn(Rectangle(sensorInfo_.outputSize));\n> +\n> +\t\tif (ispCrop != ispCrop_) {\n> +\t\t\tispCrop_ = ispCrop;\n> +\t\t\tplatformIspCrop();\n\nThe CameraData::calculateScalerCrop() function was previously called\napplyScalerCrop(). The new name implies it calculates the crop rectangle\nonly, without applying it.\n\nplatformIspCrop() is a fairly non-descriptive name. As far as I\nunderstand its purpose is to apply the ISP input crop rectangle, could\nyou rename it accordingly ?\n\n> +\n> +\t\t\t/*\n> +\t\t\t * Also update the ScalerCrop in the metadata with what we actually\n> +\t\t\t * used. But we must first rescale that from ISP (camera mode) pixels\n> +\t\t\t * back into sensor native pixels.\n> +\t\t\t */\n> +\t\t\tscalerCrop_ = scaleIspCrop(ispCrop_);\n> +\t\t}\n> +\t}\n> +}\n> +\n> +void CameraData::cameraTimeout()\n> +{\n> +\tLOG(RPI, Error) << \"Camera frontend has timed out!\";\n> +\tLOG(RPI, Error) << \"Please check that your camera sensor connector is attached securely.\";\n> +\tLOG(RPI, Error) << \"Alternatively, try another cable and/or sensor.\";\n> +\n> +\tstate_ = CameraData::State::Error;\n> +\tplatformStop();\n> +\n> +\t/*\n> +\t * To allow the application to attempt a recovery from this timeout,\n> +\t * stop all devices streaming, and return any outstanding requests as\n> +\t * incomplete and cancelled.\n> +\t */\n> +\tfor (auto const stream : streams_)\n> +\t\tstream->dev()->streamOff();\n> +\n> +\tclearIncompleteRequests();\n> +}\n> +\n> +void CameraData::frameStarted(uint32_t sequence)\n> +{\n> +\tLOG(RPI, Debug) << \"Frame start \" << sequence;\n> +\n> +\t/* Write any controls for the next frame as soon as we can. 
*/\n> +\tdelayedCtrls_->applyControls(sequence);\n> +}\n> +\n> +void CameraData::clearIncompleteRequests()\n> +{\n> +\t/*\n> +\t * All outstanding requests (and associated buffers) must be returned\n> +\t * to the application.\n> +\t */\n> +\twhile (!requestQueue_.empty()) {\n> +\t\tRequest *request = requestQueue_.front();\n> +\n> +\t\tfor (auto &b : request->buffers()) {\n> +\t\t\tFrameBuffer *buffer = b.second;\n> +\t\t\t/*\n> +\t\t\t * Has the buffer already been handed back to the\n> +\t\t\t * request? If not, do so now.\n> +\t\t\t */\n> +\t\t\tif (buffer->request()) {\n> +\t\t\t\tbuffer->_d()->cancel();\n> +\t\t\t\tpipe()->completeBuffer(request, buffer);\n> +\t\t\t}\n> +\t\t}\n> +\n> +\t\tpipe()->completeRequest(request);\n> +\t\trequestQueue_.pop();\n> +\t}\n> +}\n> +\n> +void CameraData::handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream)\n> +{\n> +\t/*\n> +\t * It is possible to be here without a pending request, so check\n> +\t * that we actually have one to action, otherwise we just return\n> +\t * the buffer back to the stream.\n> +\t */\n> +\tRequest *request = requestQueue_.empty() ? 
nullptr : requestQueue_.front();\n> +\tif (!dropFrameCount_ && request && request->findBuffer(stream) == buffer) {\n> +\t\t/*\n> +\t\t * Check if this is an externally provided buffer, and if\n> +\t\t * so, we must stop tracking it in the pipeline handler.\n> +\t\t */\n> +\t\thandleExternalBuffer(buffer, stream);\n> +\t\t/*\n> +\t\t * Tag the buffer as completed, returning it to the\n> +\t\t * application.\n> +\t\t */\n> +\t\tpipe()->completeBuffer(request, buffer);\n> +\t} else {\n> +\t\t/*\n> +\t\t * This buffer was not part of the Request (which happens if an\n> +\t\t * internal buffer was used for an external stream, or\n> +\t\t * unconditionally for internal streams), or there is no pending\n> +\t\t * request, so we can recycle it.\n> +\t\t */\n> +\t\tstream->returnBuffer(buffer);\n> +\t}\n> +}\n> +\n> +void CameraData::handleState()\n> +{\n> +\tswitch (state_) {\n> +\tcase State::Stopped:\n> +\tcase State::Busy:\n> +\tcase State::Error:\n> +\t\tbreak;\n> +\n> +\tcase State::IpaComplete:\n> +\t\t/* If the request is completed, we will switch to Idle state. */\n> +\t\tcheckRequestCompleted();\n> +\t\t/*\n> +\t\t * No break here, we want to try running the pipeline again.\n> +\t\t * The fallthrough clause below suppresses compiler warnings.\n> +\t\t */\n> +\t\t[[fallthrough]];\n> +\n> +\tcase State::Idle:\n> +\t\ttryRunPipeline();\n> +\t\tbreak;\n> +\t}\n> +}\n> +\n> +void CameraData::handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream)\n> +{\n> +\tunsigned int id = stream->getBufferId(buffer);\n> +\n> +\tif (!(id & MaskExternalBuffer))\n> +\t\treturn;\n> +\n> +\t/* Stop the Stream object from tracking the buffer. 
*/\n> +\tstream->removeExternalBuffer(buffer);\n> +}\n> +\n> +void CameraData::checkRequestCompleted()\n> +{\n> +\tbool requestCompleted = false;\n> +\t/*\n> +\t * If we are dropping this frame, do not touch the request, simply\n> +\t * change the state to IDLE when ready.\n> +\t */\n> +\tif (!dropFrameCount_) {\n> +\t\tRequest *request = requestQueue_.front();\n> +\t\tif (request->hasPendingBuffers())\n> +\t\t\treturn;\n> +\n> +\t\t/* Must wait for metadata to be filled in before completing. */\n> +\t\tif (state_ != State::IpaComplete)\n> +\t\t\treturn;\n> +\n> +\t\tpipe()->completeRequest(request);\n> +\t\trequestQueue_.pop();\n> +\t\trequestCompleted = true;\n> +\t}\n> +\n> +\t/*\n> +\t * Make sure we have three outputs completed in the case of a dropped\n> +\t * frame.\n> +\t */\n> +\tif (state_ == State::IpaComplete &&\n> +\t    ((ispOutputCount_ == ispOutputTotal_ && dropFrameCount_) || requestCompleted)) {\n\nLine wrap.\n\n> +\t\tstate_ = State::Idle;\n> +\t\tif (dropFrameCount_) {\n> +\t\t\tdropFrameCount_--;\n> +\t\t\tLOG(RPI, Debug) << \"Dropping frame at the request of the IPA (\"\n> +\t\t\t\t\t<< dropFrameCount_ << \" left)\";\n> +\t\t}\n> +\t}\n> +}\n> +\n> +void CameraData::fillRequestMetadata(const ControlList &bufferControls, Request *request)\n> +{\n> +\trequest->metadata().set(controls::SensorTimestamp,\n> +\t\t\t\tbufferControls.get(controls::SensorTimestamp).value_or(0));\n> +\n> +\trequest->metadata().set(controls::ScalerCrop, scalerCrop_);\n> +}\n> +\n> +} /* namespace libcamera */\n> diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.h b/src/libcamera/pipeline/rpi/common/pipeline_base.h\n> new file mode 100644\n> index 000000000000..318266a6fb51\n> --- /dev/null\n> +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.h\n> @@ -0,0 +1,276 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> + *\n> + * pipeline_base.h - Pipeline handler base class for Raspberry Pi 
devices\n> + */\n> +\n> +#include <map>\n> +#include <memory>\n> +#include <optional>\n> +#include <queue>\n> +#include <string>\n> +#include <unordered_set>\n> +#include <utility>\n> +#include <vector>\n> +\n> +#include <libcamera/controls.h>\n> +#include <libcamera/request.h>\n> +\n> +#include \"libcamera/internal/bayer_format.h\"\n> +#include \"libcamera/internal/camera.h\"\n> +#include \"libcamera/internal/camera_sensor.h\"\n> +#include \"libcamera/internal/framebuffer.h\"\n> +#include \"libcamera/internal/media_device.h\"\n> +#include \"libcamera/internal/media_object.h\"\n> +#include \"libcamera/internal/pipeline_handler.h\"\n> +#include \"libcamera/internal/v4l2_videodevice.h\"\n> +#include \"libcamera/internal/yaml_parser.h\"\n> +\n> +#include <libcamera/ipa/raspberrypi_ipa_interface.h>\n> +#include <libcamera/ipa/raspberrypi_ipa_proxy.h>\n> +\n> +#include \"delayed_controls.h\"\n> +#include \"rpi_stream.h\"\n> +\n> +using namespace std::chrono_literals;\n> +\n> +namespace libcamera {\n> +\n> +namespace RPi {\n> +\n> +/* Map of mbus codes to supported sizes reported by the sensor. 
*/\n> +using SensorFormats = std::map<unsigned int, std::vector<Size>>;\n> +\n> +class CameraData : public Camera::Private\n> +{\n> +public:\n> +\tCameraData(PipelineHandler *pipe)\n> +\t\t: Camera::Private(pipe), state_(State::Stopped),\n> +\t\t  flipsAlterBayerOrder_(false), dropFrameCount_(0), buffersAllocated_(false),\n> +\t\t  ispOutputCount_(0), ispOutputTotal_(0)\n> +\t{\n> +\t}\n> +\n> +\tvirtual ~CameraData()\n> +\t{\n> +\t}\n> +\n> +\tstruct StreamParams {\n> +\t\tStreamParams()\n> +\t\t\t: index(0), cfg(nullptr), dev(nullptr)\n> +\t\t{\n> +\t\t}\n> +\n> +\t\tStreamParams(unsigned int index_, StreamConfiguration *cfg_)\n> +\t\t\t: index(index_), cfg(cfg_), dev(nullptr)\n> +\t\t{\n> +\t\t}\n> +\n> +\t\tunsigned int index;\n> +\t\tStreamConfiguration *cfg;\n> +\t\tV4L2VideoDevice *dev;\n> +\t};\n> +\n> +\tvirtual CameraConfiguration::Status platformValidate(std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t\t\t\t     std::vector<StreamParams> &outStreams) const = 0;\n> +\tvirtual int platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> +\t\t\t\t      std::optional<BayerFormat::Packing> packing,\n> +\t\t\t\t      std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t      std::vector<StreamParams> &outStreams) = 0;\n\nAs explained in the review of the corresponding refactoring for the IPA\nmodule, all those abstract virtual platform*() functions are not nice.\nIt may be possible to remove some by overriding the PipelineHandler\nfunctions in the derived class and calling PipelineHandlerBase functions\nfrom there.\n\n> +\tvirtual void platformStart() = 0;\n> +\tvirtual void platformStop() = 0;\n> +\n> +\tvoid freeBuffers();\n> +\tvirtual void platformFreeBuffers() = 0;\n> +\n> +\tvoid enumerateVideoDevices(MediaLink *link, const std::string &frontend);\n> +\n> +\tint loadPipelineConfiguration();\n> +\tint loadIPA(ipa::RPi::InitResult *result);\n> +\tint configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result);\n> +\tvirtual int 
platformInitIpa(ipa::RPi::InitParams &params) = 0;\n> +\tvirtual int platformConfigureIpa(ipa::RPi::ConfigParams &params) = 0;\n> +\n> +\tvoid setDelayedControls(const ControlList &controls, uint32_t delayContext);\n> +\tvoid setLensControls(const ControlList &controls);\n> +\tvoid setSensorControls(ControlList &controls);\n> +\n> +\tRectangle scaleIspCrop(const Rectangle &ispCrop) const;\n> +\tvoid calculateScalerCrop(const ControlList &controls);\n> +\tvirtual void platformIspCrop() = 0;\n> +\n> +\tvoid cameraTimeout();\n> +\tvoid frameStarted(uint32_t sequence);\n> +\n> +\tvoid clearIncompleteRequests();\n> +\tvoid handleStreamBuffer(FrameBuffer *buffer, Stream *stream);\n> +\tvoid handleState();\n> +\n> +\tvirtual V4L2VideoDevice::Formats ispFormats() const = 0;\n> +\tvirtual V4L2VideoDevice::Formats rawFormats() const = 0;\n> +\tvirtual V4L2VideoDevice *frontendDevice() = 0;\n> +\n> +\tvirtual int platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) = 0;\n> +\n> +\tstd::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;\n> +\n> +\tstd::unique_ptr<CameraSensor> sensor_;\n> +\tSensorFormats sensorFormats_;\n> +\n> +\t/* The vector below is just for convenience when iterating over all streams. */\n> +\tstd::vector<Stream *> streams_;\n> +\t/* Stores the ids of the buffers mapped in the IPA. */\n> +\tstd::unordered_set<unsigned int> bufferIds_;\n> +\t/*\n> +\t * Stores a cascade of Video Mux or Bridge devices between the sensor and\n> +\t * Unicam together with media link across the entities.\n> +\t */\n> +\tstd::vector<std::pair<std::unique_ptr<V4L2Subdevice>, MediaLink *>> bridgeDevices_;\n> +\n> +\tstd::unique_ptr<DelayedControls> delayedCtrls_;\n> +\tbool sensorMetadata_;\n> +\n> +\t/*\n> +\t * All the functions in this class are called from a single calling\n> +\t * thread. 
So, we do not need to have any mutex to protect access to any\n> +\t * of the variables below.\n> +\t */\n> +\tenum class State { Stopped, Idle, Busy, IpaComplete, Error };\n> +\tState state_;\n> +\n> +\tbool isRunning()\n> +\t{\n> +\t\treturn state_ != State::Stopped && state_ != State::Error;\n> +\t}\n> +\n> +\tstd::queue<Request *> requestQueue_;\n> +\n> +\t/* Store the \"native\" Bayer order (that is, with no transforms applied). */\n> +\tbool flipsAlterBayerOrder_;\n> +\tBayerFormat::Order nativeBayerOrder_;\n> +\n> +\t/* For handling digital zoom. */\n> +\tIPACameraSensorInfo sensorInfo_;\n> +\tRectangle ispCrop_; /* crop in ISP (camera mode) pixels */\n> +\tRectangle scalerCrop_; /* crop in sensor native pixels */\n> +\tSize ispMinCropSize_;\n> +\n> +\tunsigned int dropFrameCount_;\n> +\n> +\t/*\n> +\t * If set, this stores the value that represents a gain of one for\n> +\t * the V4L2_CID_NOTIFY_GAINS control.\n> +\t */\n> +\tstd::optional<int32_t> notifyGainsUnity_;\n> +\n> +\t/* Have internal buffers been allocated? 
*/\n> +\tbool buffersAllocated_;\n> +\n> +\tstruct Config {\n> +\t\t/*\n> +\t\t * Override any request from the IPA to drop a number of startup\n> +\t\t * frames.\n> +\t\t */\n> +\t\tbool disableStartupFrameDrops;\n> +\t\t/*\n> +\t\t * Override the camera timeout value calculated by the IPA based\n> +\t\t * on frame durations.\n> +\t\t */\n> +\t\tunsigned int cameraTimeoutValue;\n> +\t};\n> +\n> +\tConfig config_;\n> +\n> +protected:\n> +\tvoid fillRequestMetadata(const ControlList &bufferControls,\n> +\t\t\t\t Request *request);\n> +\n> +\tvirtual void tryRunPipeline() = 0;\n> +\n> +\tunsigned int ispOutputCount_;\n> +\tunsigned int ispOutputTotal_;\n> +\n> +private:\n> +\tvoid handleExternalBuffer(FrameBuffer *buffer, Stream *stream);\n> +\tvoid checkRequestCompleted();\n> +};\n> +\n> +class PipelineHandlerBase : public PipelineHandler\n> +{\n> +public:\n> +\tPipelineHandlerBase(CameraManager *manager)\n> +\t\t: PipelineHandler(manager)\n> +\t{\n> +\t}\n> +\n> +\tvirtual ~PipelineHandlerBase()\n> +\t{\n> +\t}\n> +\n> +\tstatic V4L2DeviceFormat toV4L2DeviceFormat(const V4L2VideoDevice *dev,\n> +\t\t\t\t\t\t   const V4L2SubdeviceFormat &format,\n> +\t\t\t\t\t\t   BayerFormat::Packing packingReq);\n> +\n> +\tstd::unique_ptr<CameraConfiguration>\n> +\tgenerateConfiguration(Camera *camera, const StreamRoles &roles) override;\n> +\tint configure(Camera *camera, CameraConfiguration *config) override;\n> +\n> +\tint exportFrameBuffers(Camera *camera, libcamera::Stream *stream,\n> +\t\t\t       std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;\n> +\n> +\tint start(Camera *camera, const ControlList *controls) override;\n> +\tvoid stopDevice(Camera *camera) override;\n> +\tvoid releaseDevice(Camera *camera) override;\n> +\n> +\tint queueRequestDevice(Camera *camera, Request *request) override;\n> +\n> +protected:\n> +\tint registerCamera(MediaDevice *frontend, const std::string &frontendName,\n> +\t\t\t   MediaDevice *backend, MediaEntity *sensorEntity);\n> +\n> 
+\tvoid mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask);\n> +\n> +\tvirtual std::unique_ptr<CameraData> allocateCameraData() = 0;\n> +\tvirtual int platformRegister(std::unique_ptr<CameraData> &cameraData,\n> +\t\t\t\t     MediaDevice *unicam, MediaDevice *isp) = 0;\n> +\n> +private:\n> +\tCameraData *cameraData(Camera *camera)\n> +\t{\n> +\t\treturn static_cast<CameraData *>(camera->_d());\n> +\t}\n> +\n> +\tint queueAllBuffers(Camera *camera);\n> +\tvirtual int prepareBuffers(Camera *camera) = 0;\n> +};\n> +\n> +class RPiCameraConfiguration final : public CameraConfiguration\n> +{\n> +public:\n> +\tRPiCameraConfiguration(const CameraData *data)\n> +\t\t: CameraConfiguration(), data_(data)\n> +\t{\n> +\t}\n> +\n> +\tCameraConfiguration::Status validateColorSpaces(ColorSpaceFlags flags);\n> +\tStatus validate() override;\n> +\n> +\t/* Cache the combinedTransform_ that will be applied to the sensor */\n> +\tTransform combinedTransform_;\n> +\n> +private:\n> +\tconst CameraData *data_;\n> +\n> +\t/*\n> +\t * Store the colour spaces that all our streams will have. RGB format streams\n> +\t * will have the same colorspace as YUV streams, with YCbCr field cleared and\n> +\t * range set to full.\n> +\t */\n> +\tstd::optional<ColorSpace> yuvColorSpace_;\n> +\tstd::optional<ColorSpace> rgbColorSpace_;\n> +};\n> +\n> +} /* namespace RPi */\n> +\n> +} /* namespace libcamera */\n> diff --git a/src/libcamera/pipeline/rpi/vc4/data/example.yaml b/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> index c90f518f8849..b8e01adeaf40 100644\n> --- a/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> +++ b/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> @@ -34,13 +34,13 @@\n>                  #\n>                  # \"disable_startup_frame_drops\": false,\n>  \n> -                # Custom timeout value (in ms) for Unicam to use. This overrides\n> +                # Custom timeout value (in ms) for the camera to use. 
This overrides\n>                  # the value computed by the pipeline handler based on frame\n>                  # durations.\n>                  #\n>                  # Set this value to 0 to use the pipeline handler computed\n>                  # timeout value.\n>                  #\n> -                # \"unicam_timeout_value_ms\": 0,\n> +                # \"camera_timeout_value_ms\": 0,\n>          }\n>  }\n> diff --git a/src/libcamera/pipeline/rpi/vc4/meson.build b/src/libcamera/pipeline/rpi/vc4/meson.build\n> index 228823f30922..cdb049c58d2c 100644\n> --- a/src/libcamera/pipeline/rpi/vc4/meson.build\n> +++ b/src/libcamera/pipeline/rpi/vc4/meson.build\n> @@ -2,7 +2,7 @@\n>  \n>  libcamera_sources += files([\n>      'dma_heaps.cpp',\n> -    'raspberrypi.cpp',\n> +    'vc4.cpp',\n>  ])\n>  \n>  subdir('data')\n\n[snip]\n\n> diff --git a/src/libcamera/pipeline/rpi/vc4/vc4.cpp b/src/libcamera/pipeline/rpi/vc4/vc4.cpp\n> new file mode 100644\n> index 000000000000..a085a06376a8\n> --- /dev/null\n> +++ b/src/libcamera/pipeline/rpi/vc4/vc4.cpp\n> @@ -0,0 +1,1001 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> + *\n> + * vc4.cpp - Pipeline handler for VC4 based Raspberry Pi devices\n> + */\n> +\n> +#include <linux/bcm2835-isp.h>\n> +#include <linux/v4l2-controls.h>\n> +#include <linux/videodev2.h>\n> +\n> +#include <libcamera/formats.h>\n> +\n> +#include \"libcamera/internal/device_enumerator.h\"\n> +\n> +#include \"dma_heaps.h\"\n> +#include \"pipeline_base.h\"\n> +#include \"rpi_stream.h\"\n> +\n> +using namespace std::chrono_literals;\n> +\n> +namespace libcamera {\n> +\n> +LOG_DECLARE_CATEGORY(RPI)\n> +\n> +namespace {\n> +\n> +enum class Unicam : unsigned int { Image, Embedded };\n> +enum class Isp : unsigned int { Input, Output0, Output1, Stats };\n> +\n> +} /* namespace */\n> +\n> +class Vc4CameraData final : public RPi::CameraData\n> +{\n> +public:\n> +\tVc4CameraData(PipelineHandler *pipe)\n> 
+\t\t: RPi::CameraData(pipe)\n> +\t{\n> +\t}\n> +\n> +\t~Vc4CameraData()\n> +\t{\n> +\t\tfreeBuffers();\n> +\t}\n> +\n> +\tV4L2VideoDevice::Formats ispFormats() const override\n> +\t{\n> +\t\treturn isp_[Isp::Output0].dev()->formats();\n> +\t}\n> +\n> +\tV4L2VideoDevice::Formats rawFormats() const override\n> +\t{\n> +\t\treturn unicam_[Unicam::Image].dev()->formats();\n> +\t}\n> +\n> +\tV4L2VideoDevice *frontendDevice() override\n> +\t{\n> +\t\treturn unicam_[Unicam::Image].dev();\n> +\t}\n> +\n> +\tvoid platformFreeBuffers() override\n> +\t{\n> +\t}\n> +\n> +\tCameraConfiguration::Status platformValidate(std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t\t\t     std::vector<StreamParams> &outStreams) const override;\n> +\n> +\tint platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) override;\n> +\n> +\tvoid platformStart() override;\n> +\tvoid platformStop() override;\n> +\n> +\tvoid unicamBufferDequeue(FrameBuffer *buffer);\n> +\tvoid ispInputDequeue(FrameBuffer *buffer);\n> +\tvoid ispOutputDequeue(FrameBuffer *buffer);\n> +\n> +\tvoid processStatsComplete(const ipa::RPi::BufferIds &buffers);\n> +\tvoid prepareIspComplete(const ipa::RPi::BufferIds &buffers);\n> +\tvoid setIspControls(const ControlList &controls);\n> +\tvoid setCameraTimeout(uint32_t maxFrameLengthMs);\n> +\n> +\t/* Array of Unicam and ISP device streams and associated buffers/streams. */\n> +\tRPi::Device<Unicam, 2> unicam_;\n> +\tRPi::Device<Isp, 4> isp_;\n> +\n> +\t/* DMAHEAP allocation helper. 
*/\n> +\tRPi::DmaHeap dmaHeap_;\n> +\tSharedFD lsTable_;\n> +\n> +\tstruct Config {\n> +\t\t/*\n> +\t\t * The minimum number of internal buffers to be allocated for\n> +\t\t * the Unicam Image stream.\n> +\t\t */\n> +\t\tunsigned int minUnicamBuffers;\n> +\t\t/*\n> +\t\t * The minimum total (internal + external) buffer count used for\n> +\t\t * the Unicam Image stream.\n> +\t\t *\n> +\t\t * Note that:\n> +\t\t * minTotalUnicamBuffers must be >= 1, and\n> +\t\t * minTotalUnicamBuffers >= minUnicamBuffers\n> +\t\t */\n> +\t\tunsigned int minTotalUnicamBuffers;\n> +\t};\n> +\n> +\tConfig config_;\n> +\n> +private:\n> +\tvoid platformIspCrop() override\n> +\t{\n> +\t\tisp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &ispCrop_);\n> +\t}\n> +\n> +\tint platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> +\t\t\t      std::optional<BayerFormat::Packing> packing,\n> +\t\t\t      std::vector<StreamParams> &rawStreams,\n> +\t\t\t      std::vector<StreamParams> &outStreams) override;\n> +\tint platformConfigureIpa(ipa::RPi::ConfigParams &params) override;\n> +\n> +\tint platformInitIpa([[maybe_unused]] ipa::RPi::InitParams &params) override\n> +\t{\n> +\t\treturn 0;\n> +\t}\n> +\n> +\tstruct BayerFrame {\n> +\t\tFrameBuffer *buffer;\n> +\t\tControlList controls;\n> +\t\tunsigned int delayContext;\n> +\t};\n> +\n> +\tvoid tryRunPipeline() override;\n> +\tbool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer);\n> +\n> +\tstd::queue<BayerFrame> bayerQueue_;\n> +\tstd::queue<FrameBuffer *> embeddedQueue_;\n> +};\n> +\n> +class PipelineHandlerVc4 : public RPi::PipelineHandlerBase\n> +{\n> +public:\n> +\tPipelineHandlerVc4(CameraManager *manager)\n> +\t\t: RPi::PipelineHandlerBase(manager)\n> +\t{\n> +\t}\n> +\n> +\t~PipelineHandlerVc4()\n> +\t{\n> +\t}\n> +\n> +\tbool match(DeviceEnumerator *enumerator) override;\n> +\n> +private:\n> +\tVc4CameraData *cameraData(Camera *camera)\n> +\t{\n> +\t\treturn static_cast<Vc4CameraData 
*>(camera->_d());\n> +\t}\n> +\n> +\tstd::unique_ptr<RPi::CameraData> allocateCameraData() override\n> +\t{\n> +\t\treturn std::make_unique<Vc4CameraData>(this);\n> +\t}\n> +\n> +\tint prepareBuffers(Camera *camera) override;\n> +\tint platformRegister(std::unique_ptr<RPi::CameraData> &cameraData,\n> +\t\t\t     MediaDevice *unicam, MediaDevice *isp) override;\n> +};\n> +\n> +bool PipelineHandlerVc4::match(DeviceEnumerator *enumerator)\n> +{\n> +\tconstexpr unsigned int numUnicamDevices = 2;\n> +\n> +\t/*\n> +\t * Loop over all Unicam instances, but return once a match is found.\n> +\t * This is to ensure we correctly enumerate the camera when an instance\n> +\t * of Unicam has registered with the media controller, but has not registered\n> +\t * device nodes due to a sensor subdevice failure.\n> +\t */\n> +\tfor (unsigned int i = 0; i < numUnicamDevices; i++) {\n> +\t\tDeviceMatch unicam(\"unicam\");\n> +\t\tMediaDevice *unicamDevice = acquireMediaDevice(enumerator, unicam);\n> +\n> +\t\tif (!unicamDevice) {\n> +\t\t\tLOG(RPI, Debug) << \"Unable to acquire a Unicam instance\";\n> +\t\t\tbreak;\n\nThe existing code continues here instead of breaking. Same if\n!ispDevice. 
Is this change intended?\n\n> +\t\t}\n> +\n> +\t\tDeviceMatch isp(\"bcm2835-isp\");\n> +\t\tMediaDevice *ispDevice = acquireMediaDevice(enumerator, isp);\n> +\n> +\t\tif (!ispDevice) {\n> +\t\t\tLOG(RPI, Debug) << \"Unable to acquire ISP instance\";\n> +\t\t\tbreak;\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * The loop below is used to register multiple cameras behind one or more\n> +\t\t * video mux devices that are attached to a particular Unicam instance.\n> +\t\t * Obviously these cameras cannot be used simultaneously.\n> +\t\t */\n> +\t\tunsigned int numCameras = 0;\n> +\t\tfor (MediaEntity *entity : unicamDevice->entities()) {\n> +\t\t\tif (entity->function() != MEDIA_ENT_F_CAM_SENSOR)\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tint ret = RPi::PipelineHandlerBase::registerCamera(unicamDevice, \"unicam-image\",\n> +\t\t\t\t\t\t\t\t\t   ispDevice, entity);\n> +\t\t\tif (ret)\n> +\t\t\t\tLOG(RPI, Error) << \"Failed to register camera \"\n> +\t\t\t\t\t\t<< entity->name() << \": \" << ret;\n> +\t\t\telse\n> +\t\t\t\tnumCameras++;\n> +\t\t}\n> +\n> +\t\tif (numCameras)\n> +\t\t\treturn true;\n> +\t}\n> +\n> +\treturn false;\n> +}\n> +\n> +int PipelineHandlerVc4::prepareBuffers(Camera *camera)\n> +{\n> +\tVc4CameraData *data = cameraData(camera);\n> +\tunsigned int numRawBuffers = 0;\n> +\tint ret;\n> +\n> +\tfor (Stream *s : camera->streams()) {\n> +\t\tif (BayerFormat::fromPixelFormat(s->configuration().pixelFormat).isValid()) {\n> +\t\t\tnumRawBuffers = s->configuration().bufferCount;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\t/* Decide how many internal buffers to allocate. 
*/\n> +\tfor (auto const stream : data->streams_) {\n> +\t\tunsigned int numBuffers;\n> +\t\t/*\n> +\t\t * For Unicam, allocate a minimum number of buffers for internal\n> +\t\t * use as we want to avoid any frame drops.\n> +\t\t */\n> +\t\tconst unsigned int minBuffers = data->config_.minTotalUnicamBuffers;\n> +\t\tif (stream == &data->unicam_[Unicam::Image]) {\n> +\t\t\t/*\n> +\t\t\t * If an application has configured a RAW stream, allocate\n> +\t\t\t * additional buffers to make up the minimum, but ensure\n> +\t\t\t * we have at least minUnicamBuffers of internal buffers\n> +\t\t\t * to use to minimise frame drops.\n> +\t\t\t */\n> +\t\t\tnumBuffers = std::max<int>(data->config_.minUnicamBuffers,\n> +\t\t\t\t\t\t   minBuffers - numRawBuffers);\n> +\t\t} else if (stream == &data->isp_[Isp::Input]) {\n> +\t\t\t/*\n> +\t\t\t * ISP input buffers are imported from Unicam, so follow\n> +\t\t\t * similar logic as above to count all the RAW buffers\n> +\t\t\t * available.\n> +\t\t\t */\n> +\t\t\tnumBuffers = numRawBuffers +\n> +\t\t\t\t     std::max<int>(data->config_.minUnicamBuffers,\n> +\t\t\t\t\t\t   minBuffers - numRawBuffers);\n> +\n> +\t\t} else if (stream == &data->unicam_[Unicam::Embedded]) {\n> +\t\t\t/*\n> +\t\t\t * Embedded data buffers are (currently) for internal use,\n> +\t\t\t * so allocate the minimum required to avoid frame drops.\n> +\t\t\t */\n> +\t\t\tnumBuffers = minBuffers;\n> +\t\t} else {\n> +\t\t\t/*\n> +\t\t\t * Since the ISP runs synchronous with the IPA and requests,\n> +\t\t\t * we only ever need one set of internal buffers. Any buffers\n> +\t\t\t * the application wants to hold onto will already be exported\n> +\t\t\t * through PipelineHandlerRPi::exportFrameBuffers().\n> +\t\t\t */\n> +\t\t\tnumBuffers = 1;\n> +\t\t}\n> +\n> +\t\tret = stream->prepareBuffers(numBuffers);\n> +\t\tif (ret < 0)\n> +\t\t\treturn ret;\n> +\t}\n> +\n> +\t/*\n> +\t * Pass the stats and embedded data buffers to the IPA. 
No other\n> +\t * buffers need to be passed.\n> +\t */\n> +\tmapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), RPi::MaskStats);\n> +\tif (data->sensorMetadata_)\n> +\t\tmapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(),\n> +\t\t\t   RPi::MaskEmbeddedData);\n> +\n> +\treturn 0;\n> +}\n> +\n> +int PipelineHandlerVc4::platformRegister(std::unique_ptr<RPi::CameraData> &cameraData, MediaDevice *unicam, MediaDevice *isp)\n> +{\n> +\tVc4CameraData *data = static_cast<Vc4CameraData *>(cameraData.get());\n> +\n> +\tif (!data->dmaHeap_.isValid())\n> +\t\treturn -ENOMEM;\n> +\n> +\tMediaEntity *unicamImage = unicam->getEntityByName(\"unicam-image\");\n> +\tMediaEntity *ispOutput0 = isp->getEntityByName(\"bcm2835-isp0-output0\");\n> +\tMediaEntity *ispCapture1 = isp->getEntityByName(\"bcm2835-isp0-capture1\");\n> +\tMediaEntity *ispCapture2 = isp->getEntityByName(\"bcm2835-isp0-capture2\");\n> +\tMediaEntity *ispCapture3 = isp->getEntityByName(\"bcm2835-isp0-capture3\");\n> +\n> +\tif (!unicamImage || !ispOutput0 || !ispCapture1 || !ispCapture2 || !ispCapture3)\n> +\t\treturn -ENOENT;\n> +\n> +\t/* Locate and open the unicam video streams. */\n> +\tdata->unicam_[Unicam::Image] = RPi::Stream(\"Unicam Image\", unicamImage);\n> +\n> +\t/* An embedded data node will not be present if the sensor does not support it. */\n> +\tMediaEntity *unicamEmbedded = unicam->getEntityByName(\"unicam-embedded\");\n> +\tif (unicamEmbedded) {\n> +\t\tdata->unicam_[Unicam::Embedded] = RPi::Stream(\"Unicam Embedded\", unicamEmbedded);\n> +\t\tdata->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data,\n> +\t\t\t\t\t\t\t\t\t   &Vc4CameraData::unicamBufferDequeue);\n> +\t}\n> +\n> +\t/* Tag the ISP input stream as an import stream. 
*/\n> +\tdata->isp_[Isp::Input] = RPi::Stream(\"ISP Input\", ispOutput0, true);\n> +\tdata->isp_[Isp::Output0] = RPi::Stream(\"ISP Output0\", ispCapture1);\n> +\tdata->isp_[Isp::Output1] = RPi::Stream(\"ISP Output1\", ispCapture2);\n> +\tdata->isp_[Isp::Stats] = RPi::Stream(\"ISP Stats\", ispCapture3);\n> +\n> +\t/* Wire up all the buffer connections. */\n> +\tdata->unicam_[Unicam::Image].dev()->bufferReady.connect(data, &Vc4CameraData::unicamBufferDequeue);\n> +\tdata->isp_[Isp::Input].dev()->bufferReady.connect(data, &Vc4CameraData::ispInputDequeue);\n> +\tdata->isp_[Isp::Output0].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> +\tdata->isp_[Isp::Output1].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> +\tdata->isp_[Isp::Stats].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> +\n> +\tif (data->sensorMetadata_ ^ !!data->unicam_[Unicam::Embedded].dev()) {\n> +\t\tLOG(RPI, Warning) << \"Mismatch between Unicam and CamHelper for embedded data usage!\";\n> +\t\tdata->sensorMetadata_ = false;\n> +\t\tif (data->unicam_[Unicam::Embedded].dev())\n> +\t\t\tdata->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect();\n> +\t}\n> +\n> +\t/*\n> +\t * Open all Unicam and ISP streams. 
The exception is the embedded data\n> +\t * stream, which only gets opened below if the IPA reports that the sensor\n> +\t * supports embedded data.\n> +\t *\n> +\t * The below grouping is just for convenience so that we can easily\n> +\t * iterate over all streams in one go.\n> +\t */\n> +\tdata->streams_.push_back(&data->unicam_[Unicam::Image]);\n> +\tif (data->sensorMetadata_)\n> +\t\tdata->streams_.push_back(&data->unicam_[Unicam::Embedded]);\n> +\n> +\tfor (auto &stream : data->isp_)\n> +\t\tdata->streams_.push_back(&stream);\n> +\n> +\tfor (auto stream : data->streams_) {\n> +\t\tint ret = stream->dev()->open();\n> +\t\tif (ret)\n> +\t\t\treturn ret;\n> +\t}\n> +\n> +\tif (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) {\n> +\t\tLOG(RPI, Error) << \"Unicam driver does not use the MediaController, please update your kernel!\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\t/* Wire up all the IPA connections. */\n> +\tdata->ipa_->processStatsComplete.connect(data, &Vc4CameraData::processStatsComplete);\n> +\tdata->ipa_->prepareIspComplete.connect(data, &Vc4CameraData::prepareIspComplete);\n> +\tdata->ipa_->setIspControls.connect(data, &Vc4CameraData::setIspControls);\n> +\tdata->ipa_->setCameraTimeout.connect(data, &Vc4CameraData::setCameraTimeout);\n> +\n> +\t/*\n> +\t * List the available streams an application may request. At present, we\n> +\t * do not advertise Unicam Embedded and ISP Statistics streams, as there\n> +\t * is no mechanism for the application to request non-image buffer formats.\n> +\t */\n> +\tstd::set<Stream *> streams;\n> +\tstreams.insert(&data->unicam_[Unicam::Image]);\n> +\tstreams.insert(&data->isp_[Isp::Output0]);\n> +\tstreams.insert(&data->isp_[Isp::Output1]);\n> +\n> +\t/* Create and register the camera. 
*/\n> +\tconst std::string &id = data->sensor_->id();\n> +\tstd::shared_ptr<Camera> camera =\n> +\t\tCamera::create(std::move(cameraData), id, streams);\n> +\tPipelineHandler::registerCamera(std::move(camera));\n> +\n> +\tLOG(RPI, Info) << \"Registered camera \" << id\n> +\t\t       << \" to Unicam device \" << unicam->deviceNode()\n> +\t\t       << \" and ISP device \" << isp->deviceNode();\n> +\n> +\treturn 0;\n> +}\n> +\n> +CameraConfiguration::Status Vc4CameraData::platformValidate(std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t\t\t\t    std::vector<StreamParams> &outStreams) const\n> +{\n> +\tCameraConfiguration::Status status = CameraConfiguration::Status::Valid;\n> +\n> +\t/* Can only output 1 RAW stream, or 2 YUV/RGB streams. */\n> +\tif (rawStreams.size() > 1 || outStreams.size() > 2) {\n> +\t\tLOG(RPI, Error) << \"Invalid number of streams requested\";\n> +\t\treturn CameraConfiguration::Status::Invalid;\n> +\t}\n> +\n> +\tif (!rawStreams.empty())\n> +\t\trawStreams[0].dev = unicam_[Unicam::Image].dev();\n> +\n> +\t/*\n> +\t * For the two ISP outputs, one stream must be equal or smaller than the\n> +\t * other in all dimensions.\n> +\t *\n> +\t * Index 0 contains the largest requested resolution.\n> +\t */\n> +\tfor (unsigned int i = 0; i < outStreams.size(); i++) {\n> +\t\tSize size;\n> +\n> +\t\tsize.width = std::min(outStreams[i].cfg->size.width,\n> +\t\t\t\t      outStreams[0].cfg->size.width);\n> +\t\tsize.height = std::min(outStreams[i].cfg->size.height,\n> +\t\t\t\t       outStreams[0].cfg->size.height);\n> +\n> +\t\tif (outStreams[i].cfg->size != size) {\n> +\t\t\toutStreams[i].cfg->size = size;\n> +\t\t\tstatus = CameraConfiguration::Status::Adjusted;\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * Output 0 must be for the largest resolution. We will\n> +\t\t * have that fixed up in the code above.\n> +\t\t */\n> +\t\toutStreams[i].dev = isp_[i == 0 ? 
Isp::Output0 : Isp::Output1].dev();\n> +\t}\n> +\n> +\treturn status;\n> +}\n> +\n> +int Vc4CameraData::platformPipelineConfigure(const std::unique_ptr<YamlObject> &root)\n> +{\n> +\tconfig_ = {\n> +\t\t.minUnicamBuffers = 2,\n> +\t\t.minTotalUnicamBuffers = 4,\n> +\t};\n> +\n> +\tif (!root)\n> +\t\treturn 0;\n> +\n> +\tstd::optional<double> ver = (*root)[\"version\"].get<double>();\n> +\tif (!ver || *ver != 1.0) {\n> +\t\tLOG(RPI, Error) << \"Unexpected configuration file version reported\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tstd::optional<std::string> target = (*root)[\"target\"].get<std::string>();\n> +\tif (!target || *target != \"bcm2835\") {\n> +\t\tLOG(RPI, Error) << \"Unexpected target reported: expected \\\"bcm2835\\\", got \"\n> +\t\t\t\t<< *target;\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tconst YamlObject &phConfig = (*root)[\"pipeline_handler\"];\n> +\tconfig_.minUnicamBuffers =\n> +\t\tphConfig[\"min_unicam_buffers\"].get<unsigned int>(config_.minUnicamBuffers);\n> +\tconfig_.minTotalUnicamBuffers =\n> +\t\tphConfig[\"min_total_unicam_buffers\"].get<unsigned int>(config_.minTotalUnicamBuffers);\n> +\n> +\tif (config_.minTotalUnicamBuffers < config_.minUnicamBuffers) {\n> +\t\tLOG(RPI, Error) << \"Invalid configuration: min_total_unicam_buffers must be >= min_unicam_buffers\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tif (config_.minTotalUnicamBuffers < 1) {\n> +\t\tLOG(RPI, Error) << \"Invalid configuration: min_total_unicam_buffers must be >= 1\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> +\t\t\t\t     std::optional<BayerFormat::Packing> packing,\n> +\t\t\t\t     std::vector<StreamParams> &rawStreams,\n> +\t\t\t\t     std::vector<StreamParams> &outStreams)\n> +{\n> +\tint ret;\n> +\n> +\tif (!packing)\n> +\t\tpacking = BayerFormat::Packing::CSI2;\n> +\n> +\tV4L2VideoDevice *unicam = unicam_[Unicam::Image].dev();\n> +\tV4L2DeviceFormat 
unicamFormat = RPi::PipelineHandlerBase::toV4L2DeviceFormat(unicam, sensorFormat, *packing);\n> +\n> +\tret = unicam->setFormat(&unicamFormat);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/*\n> +\t * See which streams are requested, and route the user\n> +\t * StreamConfiguration appropriately.\n> +\t */\n> +\tif (!rawStreams.empty()) {\n> +\t\trawStreams[0].cfg->setStream(&unicam_[Unicam::Image]);\n> +\t\tunicam_[Unicam::Image].setExternal(true);\n> +\t}\n> +\n> +\tret = isp_[Isp::Input].dev()->setFormat(&unicamFormat);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tLOG(RPI, Info) << \"Sensor: \" << sensor_->id()\n> +\t\t       << \" - Selected sensor format: \" << sensorFormat\n> +\t\t       << \" - Selected unicam format: \" << unicamFormat;\n> +\n> +\t/* Use a sensible small default size if no output streams are configured. */\n> +\tSize maxSize = outStreams.empty() ? Size(320, 240) : outStreams[0].cfg->size;\n> +\tV4L2DeviceFormat format;\n> +\n> +\tfor (unsigned int i = 0; i < outStreams.size(); i++) {\n> +\t\tStreamConfiguration *cfg = outStreams[i].cfg;\n> +\n> +\t\t/* The largest resolution gets routed to the ISP Output 0 node. */\n> +\t\tRPi::Stream *stream = i == 0 ? 
&isp_[Isp::Output0] : &isp_[Isp::Output1];\n> +\n> +\t\tV4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg->pixelFormat);\n> +\t\tformat.size = cfg->size;\n> +\t\tformat.fourcc = fourcc;\n> +\t\tformat.colorSpace = cfg->colorSpace;\n> +\n> +\t\tLOG(RPI, Debug) << \"Setting \" << stream->name() << \" to \"\n> +\t\t\t\t<< format;\n> +\n> +\t\tret = stream->dev()->setFormat(&format);\n> +\t\tif (ret)\n> +\t\t\treturn -EINVAL;\n> +\n> +\t\tif (format.size != cfg->size || format.fourcc != fourcc) {\n> +\t\t\tLOG(RPI, Error)\n> +\t\t\t\t<< \"Failed to set requested format on \" << stream->name()\n> +\t\t\t\t<< \", returned \" << format;\n> +\t\t\treturn -EINVAL;\n> +\t\t}\n> +\n> +\t\tLOG(RPI, Debug)\n> +\t\t\t<< \"Stream \" << stream->name() << \" has color space \"\n> +\t\t\t<< ColorSpace::toString(cfg->colorSpace);\n> +\n> +\t\tcfg->setStream(stream);\n> +\t\tstream->setExternal(true);\n> +\t}\n> +\n> +\tispOutputTotal_ = outStreams.size();\n> +\n> +\t/*\n> +\t * If ISP::Output0 stream has not been configured by the application,\n> +\t * we must allow the hardware to generate an output so that the data\n> +\t * flow in the pipeline handler remains consistent, and we still generate\n> +\t * statistics for the IPA to use. So enable the output at a very low\n> +\t * resolution for internal use.\n> +\t *\n> +\t * \\todo Allow the pipeline to work correctly without Output0 and only\n> +\t * statistics coming from the hardware.\n> +\t */\n> +\tif (outStreams.empty()) {\n> +\t\tV4L2VideoDevice *dev = isp_[Isp::Output0].dev();\n> +\n> +\t\tformat = {};\n> +\t\tformat.size = maxSize;\n> +\t\tformat.fourcc = dev->toV4L2PixelFormat(formats::YUV420);\n> +\t\t/* No one asked for output, so the color space doesn't matter. 
*/\n> +\t\tformat.colorSpace = ColorSpace::Sycc;\n> +\t\tret = dev->setFormat(&format);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error)\n> +\t\t\t\t<< \"Failed to set default format on ISP Output0: \"\n> +\t\t\t\t<< ret;\n> +\t\t\treturn -EINVAL;\n> +\t\t}\n> +\n> +\t\tispOutputTotal_++;\n> +\n> +\t\tLOG(RPI, Debug) << \"Defaulting ISP Output0 format to \"\n> +\t\t\t\t<< format;\n> +\t}\n> +\n> +\t/*\n> +\t * If ISP::Output1 stream has not been requested by the application, we\n> +\t * set it up for internal use now. This second stream will be used for\n> +\t * fast colour denoise, and must be a quarter resolution of the ISP::Output0\n> +\t * stream. However, also limit the maximum size to 1200 pixels in the\n> +\t * larger dimension, just to avoid being wasteful with buffer allocations\n> +\t * and memory bandwidth.\n> +\t *\n> +\t * \\todo If Output 1 format is not YUV420, Output 1 ought to be disabled as\n> +\t * colour denoise will not run.\n> +\t */\n> +\tif (outStreams.size() == 1) {\n> +\t\tV4L2VideoDevice *dev = isp_[Isp::Output1].dev();\n> +\n> +\t\tV4L2DeviceFormat output1Format;\n> +\t\tconstexpr Size maxDimensions(1200, 1200);\n> +\t\tconst Size limit = maxDimensions.boundedToAspectRatio(format.size);\n> +\n> +\t\toutput1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2);\n> +\t\toutput1Format.colorSpace = format.colorSpace;\n> +\t\toutput1Format.fourcc = dev->toV4L2PixelFormat(formats::YUV420);\n> +\n> +\t\tLOG(RPI, Debug) << \"Setting ISP Output1 (internal) to \"\n> +\t\t\t\t<< output1Format;\n> +\n> +\t\tret = dev->setFormat(&output1Format);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to set format on ISP Output1: \"\n> +\t\t\t\t\t<< ret;\n> +\t\t\treturn -EINVAL;\n> +\t\t}\n> +\n> +\t\tispOutputTotal_++;\n> +\t}\n> +\n> +\t/* ISP statistics output format. 
*/\n> +\tformat = {};\n> +\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_BCM2835_ISP_STATS);\n> +\tret = isp_[Isp::Stats].dev()->setFormat(&format);\n> +\tif (ret) {\n> +\t\tLOG(RPI, Error) << \"Failed to set format on ISP stats stream: \"\n> +\t\t\t\t<< format;\n> +\t\treturn ret;\n> +\t}\n> +\n> +\tispOutputTotal_++;\n> +\n> +\t/*\n> +\t * Configure the Unicam embedded data output format only if the sensor\n> +\t * supports it.\n> +\t */\n> +\tif (sensorMetadata_) {\n> +\t\tV4L2SubdeviceFormat embeddedFormat;\n> +\n> +\t\tsensor_->device()->getFormat(1, &embeddedFormat);\n> +\t\tformat = {};\n> +\t\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> +\t\tformat.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;\n> +\n> +\t\tLOG(RPI, Debug) << \"Setting embedded data format \" << format.toString();\n> +\t\tret = unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to set format on Unicam embedded: \"\n> +\t\t\t\t\t<< format;\n> +\t\t\treturn ret;\n> +\t\t}\n> +\t}\n> +\n> +\t/* Figure out the smallest selection the ISP will allow. */\n> +\tRectangle testCrop(0, 0, 1, 1);\n> +\tisp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop);\n> +\tispMinCropSize_ = testCrop.size();\n> +\n> +\t/* Adjust aspect ratio by providing crops on the input image. */\n> +\tSize size = unicamFormat.size.boundedToAspectRatio(maxSize);\n> +\tispCrop_ = size.centeredTo(Rectangle(unicamFormat.size).center());\n> +\n> +\tplatformIspCrop();\n> +\n> +\treturn 0;\n> +}\n> +\n> +int Vc4CameraData::platformConfigureIpa(ipa::RPi::ConfigParams &params)\n> +{\n> +\tparams.ispControls = isp_[Isp::Input].dev()->controls();\n> +\n> +\t/* Allocate the lens shading table via dmaHeap and pass to the IPA. 
*/\n> +\tif (!lsTable_.isValid()) {\n> +\t\tlsTable_ = SharedFD(dmaHeap_.alloc(\"ls_grid\", ipa::RPi::MaxLsGridSize));\n> +\t\tif (!lsTable_.isValid())\n> +\t\t\treturn -ENOMEM;\n> +\n> +\t\t/* Allow the IPA to mmap the LS table via the file descriptor. */\n> +\t\t/*\n> +\t\t * \\todo Investigate if mapping the lens shading table buffer\n> +\t\t * could be handled with mapBuffers().\n> +\t\t */\n> +\t\tparams.lsTableHandle = lsTable_;\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +void Vc4CameraData::platformStart()\n> +{\n> +}\n> +\n> +void Vc4CameraData::platformStop()\n> +{\n> +\tbayerQueue_ = {};\n> +\tembeddedQueue_ = {};\n> +}\n> +\n> +void Vc4CameraData::unicamBufferDequeue(FrameBuffer *buffer)\n> +{\n> +\tRPi::Stream *stream = nullptr;\n> +\tunsigned int index;\n> +\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tfor (RPi::Stream &s : unicam_) {\n> +\t\tindex = s.getBufferId(buffer);\n> +\t\tif (index) {\n> +\t\t\tstream = &s;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\t/* The buffer must belong to one of our streams. 
*/\n> +\tASSERT(stream);\n> +\n> +\tLOG(RPI, Debug) << \"Stream \" << stream->name() << \" buffer dequeue\"\n> +\t\t\t<< \", buffer id \" << index\n> +\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> +\n> +\tif (stream == &unicam_[Unicam::Image]) {\n> +\t\t/*\n> +\t\t * Lookup the sensor controls used for this frame sequence from\n> +\t\t * DelayedControl and queue them along with the frame buffer.\n> +\t\t */\n> +\t\tauto [ctrl, delayContext] = delayedCtrls_->get(buffer->metadata().sequence);\n> +\t\t/*\n> +\t\t * Add the frame timestamp to the ControlList for the IPA to use\n> +\t\t * as it does not receive the FrameBuffer object.\n> +\t\t */\n> +\t\tctrl.set(controls::SensorTimestamp, buffer->metadata().timestamp);\n> +\t\tbayerQueue_.push({ buffer, std::move(ctrl), delayContext });\n> +\t} else {\n> +\t\tembeddedQueue_.push(buffer);\n> +\t}\n> +\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::ispInputDequeue(FrameBuffer *buffer)\n> +{\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tLOG(RPI, Debug) << \"Stream ISP Input buffer complete\"\n> +\t\t\t<< \", buffer id \" << unicam_[Unicam::Image].getBufferId(buffer)\n> +\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> +\n> +\t/* The ISP input buffer gets re-queued into Unicam. */\n> +\thandleStreamBuffer(buffer, &unicam_[Unicam::Image]);\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::ispOutputDequeue(FrameBuffer *buffer)\n> +{\n> +\tRPi::Stream *stream = nullptr;\n> +\tunsigned int index;\n> +\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tfor (RPi::Stream &s : isp_) {\n> +\t\tindex = s.getBufferId(buffer);\n> +\t\tif (index) {\n> +\t\t\tstream = &s;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\t/* The buffer must belong to one of our ISP output streams. 
*/\n> +\tASSERT(stream);\n> +\n> +\tLOG(RPI, Debug) << \"Stream \" << stream->name() << \" buffer complete\"\n> +\t\t\t<< \", buffer id \" << index\n> +\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> +\n> +\t/*\n> +\t * ISP statistics buffer must not be re-queued or sent back to the\n> +\t * application until after the IPA signals so.\n> +\t */\n> +\tif (stream == &isp_[Isp::Stats]) {\n> +\t\tipa::RPi::ProcessParams params;\n> +\t\tparams.buffers.stats = index | RPi::MaskStats;\n> +\t\tparams.ipaContext = requestQueue_.front()->sequence();\n> +\t\tipa_->processStats(params);\n> +\t} else {\n> +\t\t/* Any other ISP output can be handed back to the application now. */\n> +\t\thandleStreamBuffer(buffer, stream);\n> +\t}\n> +\n> +\t/*\n> +\t * Increment the number of ISP outputs generated.\n> +\t * This is needed to track dropped frames.\n> +\t */\n> +\tispOutputCount_++;\n> +\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers)\n> +{\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tFrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID);\n> +\n> +\thandleStreamBuffer(buffer, &isp_[Isp::Stats]);\n> +\n> +\t/* Last thing to do is to fill up the request metadata. */\n> +\tRequest *request = requestQueue_.front();\n> +\tControlList metadata(controls::controls);\n> +\n> +\tipa_->reportMetadata(request->sequence(), &metadata);\n> +\trequest->metadata().merge(metadata);\n> +\n> +\t/*\n> +\t * Inform the sensor of the latest colour gains if it has the\n> +\t * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set).\n> +\t */\n> +\tconst auto &colourGains = metadata.get(libcamera::controls::ColourGains);\n> +\tif (notifyGainsUnity_ && colourGains) {\n> +\t\t/* The control wants linear gains in the order B, Gb, Gr, R. 
*/\n> +\t\tControlList ctrls(sensor_->controls());\n> +\t\tstd::array<int32_t, 4> gains{\n> +\t\t\tstatic_cast<int32_t>((*colourGains)[1] * *notifyGainsUnity_),\n> +\t\t\t*notifyGainsUnity_,\n> +\t\t\t*notifyGainsUnity_,\n> +\t\t\tstatic_cast<int32_t>((*colourGains)[0] * *notifyGainsUnity_)\n> +\t\t};\n> +\t\tctrls.set(V4L2_CID_NOTIFY_GAINS, Span<const int32_t>{ gains });\n> +\n> +\t\tsensor_->setControls(&ctrls);\n> +\t}\n> +\n> +\tstate_ = State::IpaComplete;\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers)\n> +{\n> +\tunsigned int embeddedId = buffers.embedded & RPi::MaskID;\n> +\tunsigned int bayer = buffers.bayer & RPi::MaskID;\n> +\tFrameBuffer *buffer;\n> +\n> +\tif (!isRunning())\n> +\t\treturn;\n> +\n> +\tbuffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID);\n> +\tLOG(RPI, Debug) << \"Input re-queue to ISP, buffer id \" << (bayer & RPi::MaskID)\n> +\t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n> +\n> +\tisp_[Isp::Input].queueBuffer(buffer);\n> +\tispOutputCount_ = 0;\n> +\n> +\tif (sensorMetadata_ && embeddedId) {\n> +\t\tbuffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID);\n> +\t\thandleStreamBuffer(buffer, &unicam_[Unicam::Embedded]);\n> +\t}\n> +\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::setIspControls(const ControlList &controls)\n> +{\n> +\tControlList ctrls = controls;\n> +\n> +\tif (ctrls.contains(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)) {\n> +\t\tControlValue &value =\n> +\t\t\tconst_cast<ControlValue &>(ctrls.get(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING));\n> +\t\tSpan<uint8_t> s = value.data();\n> +\t\tbcm2835_isp_lens_shading *ls =\n> +\t\t\treinterpret_cast<bcm2835_isp_lens_shading *>(s.data());\n> +\t\tls->dmabuf = lsTable_.get();\n> +\t}\n> +\n> +\tisp_[Isp::Input].dev()->setControls(&ctrls);\n> +\thandleState();\n> +}\n> +\n> +void Vc4CameraData::setCameraTimeout(uint32_t maxFrameLengthMs)\n> +{\n> +\t/*\n> +\t * 
Set the dequeue timeout to 5x the maximum frame length advertised\n> +\t * by the IPA, but no less than 1s.\n> +\t */\n> +\tutils::Duration timeout =\n> +\t\tstd::max<utils::Duration>(1s, 5 * maxFrameLengthMs * 1ms);\n> +\n> +\tLOG(RPI, Debug) << \"Setting Unicam timeout to \" << timeout;\n> +\tunicam_[Unicam::Image].dev()->setDequeueTimeout(timeout);\n> +}\n> +\n> +void Vc4CameraData::tryRunPipeline()\n> +{\n> +\tFrameBuffer *embeddedBuffer;\n> +\tBayerFrame bayerFrame;\n> +\n> +\t/* If any of our request or buffer queues are empty, we cannot proceed. */\n> +\tif (state_ != State::Idle || requestQueue_.empty() ||\n> +\t    bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_))\n> +\t\treturn;\n> +\n> +\tif (!findMatchingBuffers(bayerFrame, embeddedBuffer))\n> +\t\treturn;\n> +\n> +\t/* Take the first request from the queue and action the IPA. */\n> +\tRequest *request = requestQueue_.front();\n> +\n> +\t/* See if a new ScalerCrop value needs to be applied. */\n> +\tcalculateScalerCrop(request->controls());\n> +\n> +\t/*\n> +\t * Clear the request metadata and fill it with some initial non-IPA\n> +\t * related controls. We clear it first because the request metadata\n> +\t * may have been populated if we have dropped the previous frame.\n> +\t */\n> +\trequest->metadata().clear();\n> +\tfillRequestMetadata(bayerFrame.controls, request);\n> +\n> +\t/* Set our state to say the pipeline is active. 
*/\n> +\tstate_ = State::Busy;\n> +\n> +\tunsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);\n> +\n> +\tLOG(RPI, Debug) << \"Signalling prepareIsp:\"\n> +\t\t\t<< \" Bayer buffer id: \" << bayer;\n> +\n> +\tipa::RPi::PrepareParams params;\n> +\tparams.buffers.bayer = RPi::MaskBayerData | bayer;\n> +\tparams.sensorControls = std::move(bayerFrame.controls);\n> +\tparams.requestControls = request->controls();\n> +\tparams.ipaContext = request->sequence();\n> +\tparams.delayContext = bayerFrame.delayContext;\n> +\n> +\tif (embeddedBuffer) {\n> +\t\tunsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n> +\n> +\t\tparams.buffers.embedded = RPi::MaskEmbeddedData | embeddedId;\n> +\t\tLOG(RPI, Debug) << \"Signalling prepareIsp:\"\n> +\t\t\t\t<< \" Embedded buffer id: \" << embeddedId;\n> +\t}\n> +\n> +\tipa_->prepareIsp(params);\n> +}\n> +\n> +bool Vc4CameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer)\n> +{\n> +\tif (bayerQueue_.empty())\n> +\t\treturn false;\n> +\n> +\t/*\n> +\t * Find the embedded data buffer with a matching timestamp to pass to\n> +\t * the IPA. Any embedded buffers with a timestamp lower than the\n> +\t * current bayer buffer will be removed and re-queued to the driver.\n> +\t */\n> +\tuint64_t ts = bayerQueue_.front().buffer->metadata().timestamp;\n> +\tembeddedBuffer = nullptr;\n> +\twhile (!embeddedQueue_.empty()) {\n> +\t\tFrameBuffer *b = embeddedQueue_.front();\n> +\t\tif (b->metadata().timestamp < ts) {\n> +\t\t\tembeddedQueue_.pop();\n> +\t\t\tunicam_[Unicam::Embedded].returnBuffer(b);\n> +\t\t\tLOG(RPI, Debug) << \"Dropping unmatched input frame in stream \"\n> +\t\t\t\t\t<< unicam_[Unicam::Embedded].name();\n> +\t\t} else if (b->metadata().timestamp == ts) {\n> +\t\t\t/* Found a match! */\n> +\t\t\tembeddedBuffer = b;\n> +\t\t\tembeddedQueue_.pop();\n> +\t\t\tbreak;\n> +\t\t} else {\n> +\t\t\tbreak; /* Only higher timestamps from here. 
*/\n> +\t\t}\n> +\t}\n> +\n> +\tif (!embeddedBuffer && sensorMetadata_) {\n> +\t\tif (embeddedQueue_.empty()) {\n> +\t\t\t/*\n> +\t\t\t * If the embedded buffer queue is empty, wait for the next\n> +\t\t\t * buffer to arrive - dequeue ordering may send the image\n> +\t\t\t * buffer first.\n> +\t\t\t */\n> +\t\t\tLOG(RPI, Debug) << "Waiting for next embedded buffer.";\n> +\t\t\treturn false;\n> +\t\t}\n> +\n> +\t\t/* Log if there is no matching embedded data buffer found. */\n> +\t\tLOG(RPI, Debug) << "Returning bayer frame without a matching embedded buffer.";\n> +\t}\n> +\n> +\tbayerFrame = std::move(bayerQueue_.front());\n> +\tbayerQueue_.pop();\n> +\n> +\treturn true;\n> +}\n> +\n> +REGISTER_PIPELINE_HANDLER(PipelineHandlerVc4)\n> +\n> +} /* namespace libcamera */"},{"id":26986,"web_url":"https://patchwork.libcamera.org/comment/26986/","msgid":"<CAEmqJPqf3x_SDtgzr=a3yWzqqs4MjgtSUOdpNnqJ6Cpz=cNeKQ@mail.gmail.com>","date":"2023-04-28T09:03:11","subject":"Re: [libcamera-devel] [PATCH 11/13] pipeline: raspberrypi:\n\tIntroduce PipelineHandlerBase class","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/people/34/","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"content":"Hi Laurent,\n\nThank you for your feedback.\n\nOn Thu, 27 Apr 2023 at 17:40, Laurent Pinchart\n<laurent.pinchart@ideasonboard.com> wrote:\n>\n> Hi Naush,\n>\n> Thank you for the patch.\n>\n> On Wed, Apr 26, 2023 at 02:10:55PM +0100, Naushir Patuck via libcamera-devel wrote:\n> > Create a new PipelineHandlerBase class that handles general purpose\n> > housekeeping duties for the Raspberry Pi pipeline handler. 
Code for the\n> > implementation of the new class is essentially pulled from the existing\n> > pipeline/rpi/vc4/raspberrypi.cpp file with a small amount of\n> > refactoring.\n> >\n> > Create a derived PipelineHandlerVc4 class from PipelineHandlerBase that\n> > handles the VC4 pipeline specific tasks of the pipeline handler. Again,\n> > code for this class implementation is taken from the existing\n> > pipeline/rpi/vc4/raspberrypi.cpp with a small amount of\n> > refactoring.\n> >\n> > The goal of this change is to allow third parties to implement their own\n> > pipeline handlers running on the Raspberry Pi without duplicating all of\n> > the pipeline handler housekeeping tasks.\n> >\n> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>\n> > ---\n> >  src/ipa/rpi/vc4/vc4.cpp                       |    2 +-\n> >  src/libcamera/pipeline/rpi/common/meson.build |    1 +\n> >  .../pipeline/rpi/common/pipeline_base.cpp     | 1447 ++++++++++\n> >  .../pipeline/rpi/common/pipeline_base.h       |  276 ++\n> >  .../pipeline/rpi/vc4/data/example.yaml        |    4 +-\n> >  src/libcamera/pipeline/rpi/vc4/meson.build    |    2 +-\n> >  .../pipeline/rpi/vc4/raspberrypi.cpp          | 2428 -----------------\n> >  src/libcamera/pipeline/rpi/vc4/vc4.cpp        | 1001 +++++++\n> >  8 files changed, 2729 insertions(+), 2432 deletions(-)\n> >  create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n> >  create mode 100644 src/libcamera/pipeline/rpi/common/pipeline_base.h\n> >  delete mode 100644 src/libcamera/pipeline/rpi/vc4/raspberrypi.cpp\n> >  create mode 100644 src/libcamera/pipeline/rpi/vc4/vc4.cpp\n> >\n> > diff --git a/src/ipa/rpi/vc4/vc4.cpp b/src/ipa/rpi/vc4/vc4.cpp\n> > index 0e022c2aeed3..f3d83b2afadf 100644\n> > --- a/src/ipa/rpi/vc4/vc4.cpp\n> > +++ b/src/ipa/rpi/vc4/vc4.cpp\n> > @@ -538,7 +538,7 @@ extern \"C\" {\n> >  const struct IPAModuleInfo ipaModuleInfo = {\n> >       IPA_MODULE_API_VERSION,\n> >       1,\n> > -     \"PipelineHandlerRPi\",\n> > +     
\"PipelineHandlerVc4\",\n> >       \"vc4\",\n> >  };\n> >\n> > diff --git a/src/libcamera/pipeline/rpi/common/meson.build b/src/libcamera/pipeline/rpi/common/meson.build\n> > index 1dec6d3d028b..f8ea790b42a1 100644\n> > --- a/src/libcamera/pipeline/rpi/common/meson.build\n> > +++ b/src/libcamera/pipeline/rpi/common/meson.build\n> > @@ -2,6 +2,7 @@\n> >\n> >  libcamera_sources += files([\n> >      'delayed_controls.cpp',\n> > +    'pipeline_base.cpp',\n> >      'rpi_stream.cpp',\n> >  ])\n> >\n> > diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.cpp b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n> > new file mode 100644\n> > index 000000000000..012766b38c32\n> > --- /dev/null\n> > +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.cpp\n> > @@ -0,0 +1,1447 @@\n> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> > +/*\n> > + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> > + *\n> > + * pipeline_base.h - Pipeline handler base class for Raspberry Pi devices\n>\n> pipeline_base.cpp\n>\n> Maybe we should drop the file names...\n>\n> > + */\n> > +\n> > +#include \"pipeline_base.h\"\n> > +\n> > +#include <chrono>\n> > +\n> > +#include <linux/media-bus-format.h>\n> > +#include <linux/videodev2.h>\n> > +\n> > +#include <libcamera/base/file.h>\n> > +#include <libcamera/base/utils.h>\n> > +\n> > +#include <libcamera/formats.h>\n> > +#include <libcamera/logging.h>\n> > +#include <libcamera/property_ids.h>\n> > +\n> > +#include \"libcamera/internal/camera_lens.h\"\n> > +#include \"libcamera/internal/ipa_manager.h\"\n> > +#include \"libcamera/internal/v4l2_subdevice.h\"\n> > +\n> > +using namespace std::chrono_literals;\n> > +\n> > +namespace libcamera {\n> > +\n> > +using namespace RPi;\n> > +\n> > +LOG_DEFINE_CATEGORY(RPI)\n> > +\n> > +namespace {\n> > +\n> > +constexpr unsigned int defaultRawBitDepth = 12;\n> > +\n> > +bool isRaw(const PixelFormat &pixFmt)\n> > +{\n> > +     /* This test works for both Bayer and raw mono formats. 
*/\n> > +     return BayerFormat::fromPixelFormat(pixFmt).isValid();\n> > +}\n> > +\n> > +PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code,\n> > +                               BayerFormat::Packing packingReq)\n> > +{\n> > +     BayerFormat bayer = BayerFormat::fromMbusCode(mbus_code);\n> > +\n> > +     ASSERT(bayer.isValid());\n> > +\n> > +     bayer.packing = packingReq;\n> > +     PixelFormat pix = bayer.toPixelFormat();\n> > +\n> > +     /*\n> > +      * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed\n> > +      * variants. So if the PixelFormat returns as invalid, use the non-packed\n> > +      * conversion instead.\n> > +      */\n> > +     if (!pix.isValid()) {\n> > +             bayer.packing = BayerFormat::Packing::None;\n> > +             pix = bayer.toPixelFormat();\n> > +     }\n> > +\n> > +     return pix;\n> > +}\n> > +\n> > +SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)\n> > +{\n> > +     SensorFormats formats;\n> > +\n> > +     for (auto const mbusCode : sensor->mbusCodes())\n> > +             formats.emplace(mbusCode, sensor->sizes(mbusCode));\n> > +\n> > +     return formats;\n> > +}\n> > +\n> > +bool isMonoSensor(std::unique_ptr<CameraSensor> &sensor)\n> > +{\n> > +     unsigned int mbusCode = sensor->mbusCodes()[0];\n> > +     const BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode);\n> > +\n> > +     return bayer.order == BayerFormat::Order::MONO;\n> > +}\n> > +\n> > +double scoreFormat(double desired, double actual)\n> > +{\n> > +     double score = desired - actual;\n> > +     /* Smaller desired dimensions are preferred. */\n> > +     if (score < 0.0)\n> > +             score = (-score) / 8;\n> > +     /* Penalise non-exact matches. 
*/\n> > +     if (actual != desired)\n> > +             score *= 2;\n> > +\n> > +     return score;\n> > +}\n> > +\n> > +V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth)\n> > +{\n> > +     double bestScore = std::numeric_limits<double>::max(), score;\n> > +     V4L2SubdeviceFormat bestFormat;\n> > +     bestFormat.colorSpace = ColorSpace::Raw;\n> > +\n> > +     constexpr float penaltyAr = 1500.0;\n> > +     constexpr float penaltyBitDepth = 500.0;\n> > +\n> > +     /* Calculate the closest/best mode from the user requested size. */\n> > +     for (const auto &iter : formatsMap) {\n> > +             const unsigned int mbusCode = iter.first;\n> > +             const PixelFormat format = mbusCodeToPixelFormat(mbusCode,\n> > +                                                              BayerFormat::Packing::None);\n> > +             const PixelFormatInfo &info = PixelFormatInfo::info(format);\n> > +\n> > +             for (const Size &size : iter.second) {\n> > +                     double reqAr = static_cast<double>(req.width) / req.height;\n> > +                     double fmtAr = static_cast<double>(size.width) / size.height;\n> > +\n> > +                     /* Score the dimensions for closeness. */\n> > +                     score = scoreFormat(req.width, size.width);\n> > +                     score += scoreFormat(req.height, size.height);\n> > +                     score += penaltyAr * scoreFormat(reqAr, fmtAr);\n> > +\n> > +                     /* Add any penalties... this is not an exact science! 
*/\n> > +                     score += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth;\n> > +\n> > +                     if (score <= bestScore) {\n> > +                             bestScore = score;\n> > +                             bestFormat.mbus_code = mbusCode;\n> > +                             bestFormat.size = size;\n> > +                     }\n> > +\n> > +                     LOG(RPI, Debug) << \"Format: \" << size\n> > +                                     << \" fmt \" << format\n> > +                                     << \" Score: \" << score\n> > +                                     << \" (best \" << bestScore << \")\";\n> > +             }\n> > +     }\n> > +\n> > +     return bestFormat;\n> > +}\n> > +\n> > +const std::vector<ColorSpace> validColorSpaces = {\n> > +     ColorSpace::Sycc,\n> > +     ColorSpace::Smpte170m,\n> > +     ColorSpace::Rec709\n> > +};\n> > +\n> > +std::optional<ColorSpace> findValidColorSpace(const ColorSpace &colourSpace)\n> > +{\n> > +     for (auto cs : validColorSpaces) {\n> > +             if (colourSpace.primaries == cs.primaries &&\n> > +                 colourSpace.transferFunction == cs.transferFunction)\n> > +                     return cs;\n> > +     }\n> > +\n> > +     return std::nullopt;\n> > +}\n> > +\n> > +bool isRgb(const PixelFormat &pixFmt)\n> > +{\n> > +     const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);\n> > +     return info.colourEncoding == PixelFormatInfo::ColourEncodingRGB;\n> > +}\n> > +\n> > +bool isYuv(const PixelFormat &pixFmt)\n> > +{\n> > +     /* The code below would return true for raw mono streams, so weed those out first. 
*/\n> > +     if (isRaw(pixFmt))\n> > +             return false;\n> > +\n> > +     const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);\n> > +     return info.colourEncoding == PixelFormatInfo::ColourEncodingYUV;\n> > +}\n> > +\n> > +} /* namespace */\n> > +\n> > +/*\n> > + * Raspberry Pi drivers expect the following colour spaces:\n> > + * - V4L2_COLORSPACE_RAW for raw streams.\n> > + * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for\n> > + *   non-raw streams. Other fields such as transfer function, YCbCr encoding and\n> > + *   quantisation are not used.\n> > + *\n> > + * The libcamera colour spaces that we wish to use corresponding to these are therefore:\n> > + * - ColorSpace::Raw for V4L2_COLORSPACE_RAW\n> > + * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG\n> > + * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M\n> > + * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709\n> > + */\n> > +CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags)\n> > +{\n> > +     Status status = Valid;\n> > +     yuvColorSpace_.reset();\n> > +\n> > +     for (auto cfg : config_) {\n> > +             /* First fix up raw streams to have the \"raw\" colour space. */\n> > +             if (isRaw(cfg.pixelFormat)) {\n> > +                     /* If there was no value here, that doesn't count as \"adjusted\". */\n> > +                     if (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw)\n> > +                             status = Adjusted;\n> > +                     cfg.colorSpace = ColorSpace::Raw;\n> > +                     continue;\n> > +             }\n> > +\n> > +             /* Next we need to find our shared colour space. The first valid one will do. 
*/\n> > +             if (cfg.colorSpace && !yuvColorSpace_)\n> > +                     yuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value());\n> > +     }\n> > +\n> > +     /* If no colour space was given anywhere, choose sYCC. */\n> > +     if (!yuvColorSpace_)\n> > +             yuvColorSpace_ = ColorSpace::Sycc;\n> > +\n> > +     /* Note the version of this that any RGB streams will have to use. */\n> > +     rgbColorSpace_ = yuvColorSpace_;\n> > +     rgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None;\n> > +     rgbColorSpace_->range = ColorSpace::Range::Full;\n> > +\n> > +     /* Go through the streams again and force everyone to the same colour space. */\n> > +     for (auto cfg : config_) {\n> > +             if (cfg.colorSpace == ColorSpace::Raw)\n> > +                     continue;\n> > +\n> > +             if (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) {\n> > +                     /* Again, no value means \"not adjusted\". */\n> > +                     if (cfg.colorSpace)\n> > +                             status = Adjusted;\n> > +                     cfg.colorSpace = yuvColorSpace_;\n> > +             }\n> > +             if (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) {\n> > +                     /* Be nice, and let the YUV version count as non-adjusted too. 
*/\n> > +                     if (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_)\n> > +                             status = Adjusted;\n> > +                     cfg.colorSpace = rgbColorSpace_;\n> > +             }\n> > +     }\n> > +\n> > +     return status;\n> > +}\n> > +\n> > +CameraConfiguration::Status RPiCameraConfiguration::validate()\n> > +{\n> > +     Status status = Valid;\n> > +\n> > +     if (config_.empty())\n> > +             return Invalid;\n> > +\n> > +     status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);\n> > +\n> > +     /*\n> > +      * Validate the requested transform against the sensor capabilities and\n> > +      * rotation and store the final combined transform that configure() will\n> > +      * need to apply to the sensor to save us working it out again.\n> > +      */\n> > +     Transform requestedTransform = transform;\n> > +     combinedTransform_ = data_->sensor_->validateTransform(&transform);\n> > +     if (transform != requestedTransform)\n> > +             status = Adjusted;\n> > +\n> > +     std::vector<CameraData::StreamParams> rawStreams, outStreams;\n> > +     for (const auto &[index, cfg] : utils::enumerate(config_)) {\n> > +             if (isRaw(cfg.pixelFormat))\n> > +                     rawStreams.emplace_back(index, &cfg);\n> > +             else\n> > +                     outStreams.emplace_back(index, &cfg);\n> > +     }\n> > +\n> > +     /* Sort the streams so the highest resolution is first. */\n> > +     std::sort(rawStreams.begin(), rawStreams.end(),\n> > +               [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> > +\n> > +     std::sort(outStreams.begin(), outStreams.end(),\n> > +               [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> > +\n> > +     /* Do any platform specific fixups. 
*/\n> > +     status = data_->platformValidate(rawStreams, outStreams);\n> > +     if (status == Invalid)\n> > +             return Invalid;\n> > +\n> > +     /* Further fixups on the RAW streams. */\n> > +     for (auto &raw : rawStreams) {\n> > +             StreamConfiguration &cfg = config_.at(raw.index);\n> > +             V4L2DeviceFormat rawFormat;\n> > +\n> > +             const PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat);\n> > +             unsigned int bitDepth = info.isValid() ? info.bitsPerPixel : defaultRawBitDepth;\n> > +             V4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth);\n> > +\n> > +             rawFormat.size = sensorFormat.size;\n> > +             rawFormat.fourcc = raw.dev->toV4L2PixelFormat(cfg.pixelFormat);\n> > +\n> > +             int ret = raw.dev->tryFormat(&rawFormat);\n> > +             if (ret)\n> > +                     return Invalid;\n> > +             /*\n> > +              * Some sensors change their Bayer order when they are h-flipped\n> > +              * or v-flipped, according to the transform. If this one does, we\n> > +              * must advertise the transformed Bayer order in the raw stream.\n> > +              * Note how we must fetch the \"native\" (i.e. 
untransformed) Bayer\n> > +              * order, because the sensor may currently be flipped!\n> > +              */\n> > +             V4L2PixelFormat fourcc = rawFormat.fourcc;\n> > +             if (data_->flipsAlterBayerOrder_) {\n> > +                     BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);\n> > +                     bayer.order = data_->nativeBayerOrder_;\n> > +                     bayer = bayer.transform(combinedTransform_);\n> > +                     fourcc = bayer.toV4L2PixelFormat();\n> > +             }\n> > +\n> > +             PixelFormat inputPixFormat = fourcc.toPixelFormat();\n> > +             if (raw.cfg->size != rawFormat.size || raw.cfg->pixelFormat != inputPixFormat) {\n> > +                     raw.cfg->size = rawFormat.size;\n> > +                     raw.cfg->pixelFormat = inputPixFormat;\n> > +                     status = Adjusted;\n> > +             }\n> > +\n> > +             raw.cfg->stride = rawFormat.planes[0].bpl;\n> > +             raw.cfg->frameSize = rawFormat.planes[0].size;\n> > +     }\n> > +\n> > +     /* Further fixups on the ISP output streams. */\n> > +     for (auto &out : outStreams) {\n> > +             StreamConfiguration &cfg = config_.at(out.index);\n> > +             PixelFormat &cfgPixFmt = cfg.pixelFormat;\n> > +             V4L2VideoDevice::Formats fmts = out.dev->formats();\n> > +\n> > +             if (fmts.find(out.dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) {\n> > +                     /* If we cannot find a native format, use a default one. */\n> > +                     cfgPixFmt = formats::NV12;\n> > +                     status = Adjusted;\n> > +             }\n> > +\n> > +             V4L2DeviceFormat format;\n> > +             format.fourcc = out.dev->toV4L2PixelFormat(cfg.pixelFormat);\n> > +             format.size = cfg.size;\n> > +             /* We want to send the associated YCbCr info through to the driver. 
*/\n> > +             format.colorSpace = yuvColorSpace_;\n> > +\n> > +             LOG(RPI, Debug)\n> > +                     << \"Try color space \" << ColorSpace::toString(cfg.colorSpace);\n> > +\n> > +             int ret = out.dev->tryFormat(&format);\n> > +             if (ret)\n> > +                     return Invalid;\n> > +\n> > +             /*\n> > +              * But for RGB streams, the YCbCr info gets overwritten on the way back\n> > +              * so we must check against what the stream cfg says, not what we actually\n> > +              * requested (which carefully included the YCbCr info)!\n> > +              */\n> > +             if (cfg.colorSpace != format.colorSpace) {\n> > +                     status = Adjusted;\n> > +                     LOG(RPI, Debug)\n> > +                             << \"Color space changed from \"\n> > +                             << ColorSpace::toString(cfg.colorSpace) << \" to \"\n> > +                             << ColorSpace::toString(format.colorSpace);\n> > +             }\n> > +\n> > +             cfg.colorSpace = format.colorSpace;\n> > +             cfg.stride = format.planes[0].bpl;\n> > +             cfg.frameSize = format.planes[0].size;\n> > +     }\n> > +\n> > +     return status;\n> > +}\n> > +\n> > +V4L2DeviceFormat PipelineHandlerBase::toV4L2DeviceFormat(const V4L2VideoDevice *dev,\n> > +                                                      const V4L2SubdeviceFormat &format,\n> > +                                                      BayerFormat::Packing packingReq)\n> > +{\n> > +     unsigned int mbus_code = format.mbus_code;\n> > +     const PixelFormat pix = mbusCodeToPixelFormat(mbus_code, packingReq);\n> > +     V4L2DeviceFormat deviceFormat;\n> > +\n> > +     deviceFormat.fourcc = dev->toV4L2PixelFormat(pix);\n> > +     deviceFormat.size = format.size;\n> > +     deviceFormat.colorSpace = format.colorSpace;\n> > +     return deviceFormat;\n> > +}\n> > +\n> > 
+std::unique_ptr<CameraConfiguration>\n> > +PipelineHandlerBase::generateConfiguration(Camera *camera, const StreamRoles &roles)\n> > +{\n> > +     CameraData *data = cameraData(camera);\n> > +     std::unique_ptr<CameraConfiguration> config =\n> > +             std::make_unique<RPiCameraConfiguration>(data);\n> > +     V4L2SubdeviceFormat sensorFormat;\n> > +     unsigned int bufferCount;\n> > +     PixelFormat pixelFormat;\n> > +     V4L2VideoDevice::Formats fmts;\n> > +     Size size;\n> > +     std::optional<ColorSpace> colorSpace;\n> > +\n> > +     if (roles.empty())\n> > +             return config;\n> > +\n> > +     Size sensorSize = data->sensor_->resolution();\n> > +     for (const StreamRole role : roles) {\n> > +             switch (role) {\n> > +             case StreamRole::Raw:\n> > +                     size = sensorSize;\n> > +                     sensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth);\n> > +                     pixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code,\n> > +                                                         BayerFormat::Packing::CSI2);\n> > +                     ASSERT(pixelFormat.isValid());\n> > +                     colorSpace = ColorSpace::Raw;\n> > +                     bufferCount = 2;\n> > +                     break;\n> > +\n> > +             case StreamRole::StillCapture:\n> > +                     fmts = data->ispFormats();\n> > +                     pixelFormat = formats::NV12;\n> > +                     /*\n> > +                      * Still image codecs usually expect the sYCC color space.\n> > +                      * Even RGB codecs will be fine as the RGB we get with the\n> > +                      * sYCC color space is the same as sRGB.\n> > +                      */\n> > +                     colorSpace = ColorSpace::Sycc;\n> > +                     /* Return the largest sensor resolution. 
*/\n> > +                     size = sensorSize;\n> > +                     bufferCount = 1;\n> > +                     break;\n> > +\n> > +             case StreamRole::VideoRecording:\n> > +                     /*\n> > +                      * The colour denoise algorithm requires the analysis\n> > +                      * image, produced by the second ISP output, to be in\n> > +                      * YUV420 format. Select this format as the default, to\n> > +                      * maximize chances that it will be picked by\n> > +                      * applications and enable usage of the colour denoise\n> > +                      * algorithm.\n> > +                      */\n> > +                     fmts = data->ispFormats();\n> > +                     pixelFormat = formats::YUV420;\n> > +                     /*\n> > +                      * Choose a color space appropriate for video recording.\n> > +                      * Rec.709 will be a good default for HD resolutions.\n> > +                      */\n> > +                     colorSpace = ColorSpace::Rec709;\n> > +                     size = { 1920, 1080 };\n> > +                     bufferCount = 4;\n> > +                     break;\n> > +\n> > +             case StreamRole::Viewfinder:\n> > +                     fmts = data->ispFormats();\n> > +                     pixelFormat = formats::ARGB8888;\n> > +                     colorSpace = ColorSpace::Sycc;\n> > +                     size = { 800, 600 };\n> > +                     bufferCount = 4;\n> > +                     break;\n> > +\n> > +             default:\n> > +                     LOG(RPI, Error) << \"Requested stream role not supported: \"\n> > +                                     << role;\n> > +                     return nullptr;\n> > +             }\n> > +\n> > +             std::map<PixelFormat, std::vector<SizeRange>> deviceFormats;\n> > +             if (role == StreamRole::Raw) {\n> > +                     /* Translate the MBUS codes 
to a PixelFormat. */\n> > +                     for (const auto &format : data->sensorFormats_) {\n> > +                             PixelFormat pf = mbusCodeToPixelFormat(format.first,\n> > +                                                                    BayerFormat::Packing::CSI2);\n> > +                             if (pf.isValid())\n> > +                                     deviceFormats.emplace(std::piecewise_construct, std::forward_as_tuple(pf),\n> > +                                                           std::forward_as_tuple(format.second.begin(), format.second.end()));\n> > +                     }\n> > +             } else {\n> > +                     /*\n> > +                      * Translate the V4L2PixelFormat to PixelFormat. Note that we\n> > +                      * limit the recommended largest ISP output size to match the\n> > +                      * sensor resolution.\n> > +                      */\n> > +                     for (const auto &format : fmts) {\n> > +                             PixelFormat pf = format.first.toPixelFormat();\n> > +                             if (pf.isValid()) {\n> > +                                     const SizeRange &ispSizes = format.second[0];\n> > +                                     deviceFormats[pf].emplace_back(ispSizes.min, sensorSize,\n> > +                                                                    ispSizes.hStep, ispSizes.vStep);\n> > +                             }\n> > +                     }\n> > +             }\n> > +\n> > +             /* Add the stream format based on the device node used for the use case. 
*/\n> > +             StreamFormats formats(deviceFormats);\n> > +             StreamConfiguration cfg(formats);\n> > +             cfg.size = size;\n> > +             cfg.pixelFormat = pixelFormat;\n> > +             cfg.colorSpace = colorSpace;\n> > +             cfg.bufferCount = bufferCount;\n> > +             config->addConfiguration(cfg);\n> > +     }\n> > +\n> > +     config->validate();\n> > +\n> > +     return config;\n> > +}\n> > +\n> > +int PipelineHandlerBase::configure(Camera *camera, CameraConfiguration *config)\n> > +{\n> > +     CameraData *data = cameraData(camera);\n> > +     int ret;\n> > +\n> > +     /* Start by freeing all buffers and reset the stream states. */\n> > +     data->freeBuffers();\n> > +     for (auto const stream : data->streams_)\n> > +             stream->setExternal(false);\n> > +\n> > +     std::vector<CameraData::StreamParams> rawStreams, ispStreams;\n> > +     std::optional<BayerFormat::Packing> packing;\n> > +     unsigned int bitDepth = defaultRawBitDepth;\n> > +\n> > +     for (unsigned i = 0; i < config->size(); i++) {\n> > +             StreamConfiguration *cfg = &config->at(i);\n> > +\n> > +             if (isRaw(cfg->pixelFormat))\n> > +                     rawStreams.emplace_back(i, cfg);\n> > +             else\n> > +                     ispStreams.emplace_back(i, cfg);\n> > +     }\n> > +\n> > +     /* Sort the streams so the highest resolution is first. 
*/\n> > +     std::sort(rawStreams.begin(), rawStreams.end(),\n> > +               [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> > +\n> > +     std::sort(ispStreams.begin(), ispStreams.end(),\n> > +               [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });\n> > +\n> > +     /*\n> > +      * Calculate the best sensor mode we can use based on the user's request,\n> > +      * and apply it to the sensor with the cached transform, if any.\n> > +      *\n> > +      * If we have been given a RAW stream, use that size for setting up the sensor.\n> > +      */\n> > +     if (!rawStreams.empty()) {\n> > +             BayerFormat bayerFormat = BayerFormat::fromPixelFormat(rawStreams[0].cfg->pixelFormat);\n> > +             /* Replace the user requested packing/bit-depth. */\n> > +             packing = bayerFormat.packing;\n> > +             bitDepth = bayerFormat.bitDepth;\n> > +     }\n> > +\n> > +     V4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_,\n> > +                                                       rawStreams.empty() ? ispStreams[0].cfg->size\n> > +                                                                          : rawStreams[0].cfg->size,\n> > +                                                       bitDepth);\n> > +     /* Apply any cached transform. */\n> > +     const RPiCameraConfiguration *rpiConfig = static_cast<const RPiCameraConfiguration *>(config);\n> > +\n> > +     /* Then apply the format on the sensor. */\n> > +     ret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     /*\n> > +      * Platform specific internal stream configuration. 
This also assigns\n> > +      * external streams which get configured below.\n> > +      */\n> > +     ret = data->platformConfigure(sensorFormat, packing, rawStreams, ispStreams);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     ipa::RPi::ConfigResult result;\n> > +     ret = data->configureIPA(config, &result);\n> > +     if (ret) {\n> > +             LOG(RPI, Error) << \"Failed to configure the IPA: \" << ret;\n> > +             return ret;\n> > +     }\n> > +\n> > +     /*\n> > +      * Set the scaler crop to the value we are using (scaled to native sensor\n> > +      * coordinates).\n> > +      */\n> > +     data->scalerCrop_ = data->scaleIspCrop(data->ispCrop_);\n> > +\n> > +     /*\n> > +      * Update the ScalerCropMaximum to the correct value for this camera mode.\n> > +      * For us, it's the same as the \"analogue crop\".\n> > +      *\n> > +      * \\todo Make this property the ScalerCrop maximum value when dynamic\n> > +      * controls are available and set it at validate() time\n> > +      */\n> > +     data->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop);\n> > +\n> > +     /* Store the mode sensitivity for the application. */\n> > +     data->properties_.set(properties::SensorSensitivity, result.modeSensitivity);\n> > +\n> > +     /* Update the controls that the Raspberry Pi IPA can handle. */\n> > +     ControlInfoMap::Map ctrlMap;\n> > +     for (auto const &c : result.controlInfo)\n> > +             ctrlMap.emplace(c.first, c.second);\n> > +\n> > +     /* Add the ScalerCrop control limits based on the current mode. */\n> > +     Rectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_));\n> > +     ctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, data->scalerCrop_);\n> > +\n> > +     data->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap());\n> > +\n> > +     /* Setup the Video Mux/Bridge entities. 
*/\n> > +     for (auto &[device, link] : data->bridgeDevices_) {\n> > +             /*\n> > +              * Start by disabling all the sink pad links on the devices in the\n> > +              * cascade, with the exception of the link connecting the device.\n> > +              */\n> > +             for (const MediaPad *p : device->entity()->pads()) {\n> > +                     if (!(p->flags() & MEDIA_PAD_FL_SINK))\n> > +                             continue;\n> > +\n> > +                     for (MediaLink *l : p->links()) {\n> > +                             if (l != link)\n> > +                                     l->setEnabled(false);\n> > +                     }\n> > +             }\n> > +\n> > +             /*\n> > +              * Next, enable the entity -> entity links, and setup the pad format.\n> > +              *\n> > +              * \\todo Some bridge devices may change the media bus code, so we\n> > +              * ought to read the source pad format and propagate it to the sink pad.\n> > +              */\n> > +             link->setEnabled(true);\n> > +             const MediaPad *sinkPad = link->sink();\n> > +             ret = device->setFormat(sinkPad->index(), &sensorFormat);\n> > +             if (ret) {\n> > +                     LOG(RPI, Error) << \"Failed to set format on \" << device->entity()->name()\n> > +                                     << \" pad \" << sinkPad->index()\n> > +                                     << \" with format \" << sensorFormat\n> > +                                     << \": \" << ret;\n> > +                     return ret;\n> > +             }\n> > +\n> > +             LOG(RPI, Debug) << \"Configured media link on device \" << device->entity()->name()\n> > +                             << \" on pad \" << sinkPad->index();\n> > +     }\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +int PipelineHandlerBase::exportFrameBuffers([[maybe_unused]] Camera *camera, libcamera::Stream *stream,\n> > +                
                         std::vector<std::unique_ptr<FrameBuffer>> *buffers)\n> > +{\n> > +     RPi::Stream *s = static_cast<RPi::Stream *>(stream);\n> > +     unsigned int count = stream->configuration().bufferCount;\n> > +     int ret = s->dev()->exportBuffers(count, buffers);\n> > +\n> > +     s->setExportedBuffers(buffers);\n> > +\n> > +     return ret;\n> > +}\n> > +\n> > +int PipelineHandlerBase::start(Camera *camera, const ControlList *controls)\n> > +{\n> > +     CameraData *data = cameraData(camera);\n> > +     int ret;\n> > +\n> > +     /* Check if a ScalerCrop control was specified. */\n> > +     if (controls)\n> > +             data->calculateScalerCrop(*controls);\n> > +\n> > +     /* Start the IPA. */\n> > +     ipa::RPi::StartResult result;\n> > +     data->ipa_->start(controls ? *controls : ControlList{ controls::controls },\n> > +                       &result);\n> > +\n> > +     /* Apply any gain/exposure settings that the IPA may have passed back. */\n> > +     if (!result.controls.empty())\n> > +             data->setSensorControls(result.controls);\n> > +\n> > +     /* Configure the number of dropped frames required on startup. */\n> > +     data->dropFrameCount_ = data->config_.disableStartupFrameDrops ? 0\n> > +                                                                    : result.dropFrameCount;\n>\n>         data->dropFrameCount_ = data->config_.disableStartupFrameDrops\n>                               ? 0 : result.dropFrameCount;\n>\n> > +\n> > +     for (auto const stream : data->streams_)\n> > +             stream->resetBuffers();\n> > +\n> > +     if (!data->buffersAllocated_) {\n> > +             /* Allocate buffers for internal pipeline usage. 
*/\n> > +             ret = prepareBuffers(camera);\n> > +             if (ret) {\n> > +                     LOG(RPI, Error) << \"Failed to allocate buffers\";\n> > +                     data->freeBuffers();\n> > +                     stop(camera);\n> > +                     return ret;\n> > +             }\n> > +             data->buffersAllocated_ = true;\n> > +     }\n> > +\n> > +     /* We need to set the dropFrameCount_ before queueing buffers. */\n> > +     ret = queueAllBuffers(camera);\n> > +     if (ret) {\n> > +             LOG(RPI, Error) << \"Failed to queue buffers\";\n> > +             stop(camera);\n> > +             return ret;\n> > +     }\n> > +\n> > +     /*\n> > +      * Reset the delayed controls with the gain and exposure values set by\n> > +      * the IPA.\n> > +      */\n> > +     data->delayedCtrls_->reset(0);\n> > +     data->state_ = CameraData::State::Idle;\n> > +\n> > +     /* Enable SOF event generation. */\n> > +     data->frontendDevice()->setFrameStartEnabled(true);\n> > +\n> > +     data->platformStart();\n> > +\n> > +     /* Start all streams. */\n> > +     for (auto const stream : data->streams_) {\n> > +             ret = stream->dev()->streamOn();\n> > +             if (ret) {\n> > +                     stop(camera);\n> > +                     return ret;\n> > +             }\n> > +     }\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +void PipelineHandlerBase::stopDevice(Camera *camera)\n> > +{\n> > +     CameraData *data = cameraData(camera);\n> > +\n> > +     data->state_ = CameraData::State::Stopped;\n> > +     data->platformStop();\n> > +\n> > +     for (auto const stream : data->streams_)\n> > +             stream->dev()->streamOff();\n> > +\n> > +     /* Disable SOF event generation. */\n> > +     data->frontendDevice()->setFrameStartEnabled(false);\n> > +\n> > +     data->clearIncompleteRequests();\n> > +\n> > +     /* Stop the IPA. 
*/\n> > +     data->ipa_->stop();\n> > +}\n> > +\n> > +void PipelineHandlerBase::releaseDevice(Camera *camera)\n> > +{\n> > +     CameraData *data = cameraData(camera);\n> > +     data->freeBuffers();\n> > +}\n> > +\n> > +int PipelineHandlerBase::queueRequestDevice(Camera *camera, Request *request)\n> > +{\n> > +     CameraData *data = cameraData(camera);\n> > +\n> > +     if (!data->isRunning())\n> > +             return -EINVAL;\n> > +\n> > +     LOG(RPI, Debug) << \"queueRequestDevice: New request.\";\n> > +\n> > +     /* Push all buffers supplied in the Request to the respective streams. */\n> > +     for (auto stream : data->streams_) {\n> > +             if (!stream->isExternal())\n> > +                     continue;\n> > +\n> > +             FrameBuffer *buffer = request->findBuffer(stream);\n> > +             if (buffer && !stream->getBufferId(buffer)) {\n> > +                     /*\n> > +                      * This buffer is not recognised, so it must have been allocated\n> > +                      * outside the v4l2 device. Store it in the stream buffer list\n> > +                      * so we can track it.\n> > +                      */\n> > +                     stream->setExternalBuffer(buffer);\n> > +             }\n> > +\n> > +             /*\n> > +              * If no buffer is provided by the request for this stream, we\n> > +              * queue a nullptr to the stream to signify that it must use an\n> > +              * internally allocated buffer for this capture request. 
This\n> > +              * buffer will not be given back to the application, but is used\n> > +              * to support the internal pipeline flow.\n> > +              *\n> > +              * The below queueBuffer() call will do nothing if there are not\n> > +              * enough internal buffers allocated, but this will be handled by\n> > +              * queuing the request for buffers in the RPiStream object.\n> > +              */\n> > +             int ret = stream->queueBuffer(buffer);\n> > +             if (ret)\n> > +                     return ret;\n> > +     }\n> > +\n> > +     /* Push the request to the back of the queue. */\n> > +     data->requestQueue_.push(request);\n> > +     data->handleState();\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +int PipelineHandlerBase::registerCamera(MediaDevice *frontend, const std::string &frontendName,\n> > +                                     MediaDevice *backend, MediaEntity *sensorEntity)\n> > +{\n> > +     std::unique_ptr<CameraData> cameraData = allocateCameraData();\n> > +     CameraData *data = cameraData.get();\n> > +     int ret;\n> > +\n> > +     data->sensor_ = std::make_unique<CameraSensor>(sensorEntity);\n> > +     if (!data->sensor_)\n> > +             return -EINVAL;\n> > +\n> > +     if (data->sensor_->init())\n> > +             return -EINVAL;\n> > +\n> > +     data->sensorFormats_ = populateSensorFormats(data->sensor_);\n> > +\n> > +     /*\n> > +      * Enumerate all the Video Mux/Bridge devices across the sensor -> Fr\n> > +      * chain. 
There may be a cascade of devices in this chain!\n> > +      */\n> > +     MediaLink *link = sensorEntity->getPadByIndex(0)->links()[0];\n> > +     data->enumerateVideoDevices(link, frontendName);\n> > +\n> > +     ipa::RPi::InitResult result;\n> > +     if (data->loadIPA(&result)) {\n> > +             LOG(RPI, Error) << \"Failed to load a suitable IPA library\";\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +     /*\n> > +      * Setup our delayed control writer with the sensor default\n> > +      * gain and exposure delays. Mark VBLANK for priority write.\n> > +      */\n> > +     std::unordered_map<uint32_t, RPi::DelayedControls::ControlParams> params = {\n> > +             { V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } },\n> > +             { V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } },\n> > +             { V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } },\n> > +             { V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } }\n> > +     };\n> > +     data->delayedCtrls_ = std::make_unique<RPi::DelayedControls>(data->sensor_->device(), params);\n> > +     data->sensorMetadata_ = result.sensorConfig.sensorMetadata;\n> > +\n> > +     /* Register initial controls that the Raspberry Pi IPA can handle. */\n> > +     data->controlInfo_ = std::move(result.controlInfo);\n> > +\n> > +     /* Initialize the camera properties. */\n> > +     data->properties_ = data->sensor_->properties();\n> > +\n> > +     /*\n> > +      * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the\n> > +      * sensor of the colour gains. 
It is defined to be a linear gain where\n> > +      * the default value represents a gain of exactly one.\n> > +      */\n> > +     auto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS);\n> > +     if (it != data->sensor_->controls().end())\n> > +             data->notifyGainsUnity_ = it->second.def().get<int32_t>();\n> > +\n> > +     /*\n> > +      * Set a default value for the ScalerCropMaximum property to show\n> > +      * that we support its use, however, initialise it to zero because\n> > +      * it's not meaningful until a camera mode has been chosen.\n> > +      */\n> > +     data->properties_.set(properties::ScalerCropMaximum, Rectangle{});\n> > +\n> > +     /*\n> > +      * We cache two things about the sensor in relation to transforms\n> > +      * (meaning horizontal and vertical flips): if they affect the Bayer\n> > +         * ordering, and what the \"native\" Bayer order is, when no transforms\n>\n> Wrong indentation.\n>\n> > +      * are applied.\n> > +      *\n> > +      * If flips are supported verify if they affect the Bayer ordering\n> > +      * and what the \"native\" Bayer order is, when no transforms are\n> > +      * applied.\n> > +      *\n> > +      * We note that the sensor's cached list of supported formats is\n> > +      * already in the \"native\" order, with any flips having been undone.\n> > +      */\n> > +     const V4L2Subdevice *sensor = data->sensor_->device();\n> > +     const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);\n> > +     if (hflipCtrl) {\n> > +             /* We assume it will support vflips too... */\n> > +             data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;\n> > +     }\n> > +\n> > +     /* Look for a valid Bayer format. 
*/\n> > +     BayerFormat bayerFormat;\n> > +     for (const auto &iter : data->sensorFormats_) {\n> > +             bayerFormat = BayerFormat::fromMbusCode(iter.first);\n> > +             if (bayerFormat.isValid())\n> > +                     break;\n> > +     }\n> > +\n> > +     if (!bayerFormat.isValid()) {\n> > +             LOG(RPI, Error) << \"No Bayer format found\";\n> > +             return -EINVAL;\n> > +     }\n> > +     data->nativeBayerOrder_ = bayerFormat.order;\n> > +\n> > +     ret = data->loadPipelineConfiguration();\n> > +     if (ret) {\n> > +             LOG(RPI, Error) << \"Unable to load pipeline configuration\";\n> > +             return ret;\n> > +     }\n> > +\n> > +     ret = platformRegister(cameraData, frontend, backend);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     /* Setup the general IPA signal handlers. */\n> > +     data->frontendDevice()->dequeueTimeout.connect(data, &RPi::CameraData::cameraTimeout);\n> > +     data->frontendDevice()->frameStart.connect(data, &RPi::CameraData::frameStarted);\n> > +     data->ipa_->setDelayedControls.connect(data, &CameraData::setDelayedControls);\n> > +     data->ipa_->setLensControls.connect(data, &CameraData::setLensControls);\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +void PipelineHandlerBase::mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask)\n> > +{\n> > +     CameraData *data = cameraData(camera);\n> > +     std::vector<IPABuffer> bufferIds;\n> > +     /*\n> > +      * Link the FrameBuffers with the id (key value) in the map stored in\n> > +      * the RPi stream object - along with an identifier mask.\n> > +      *\n> > +      * This will allow us to identify buffers passed between the pipeline\n> > +      * handler and the IPA.\n> > +      */\n> > +     for (auto const &it : buffers) {\n> > +             bufferIds.push_back(IPABuffer(mask | it.first,\n> > +                                           it.second->planes()));\n> > +             
data->bufferIds_.insert(mask | it.first);\n> > +     }\n> > +\n> > +     data->ipa_->mapBuffers(bufferIds);\n> > +}\n> > +\n> > +int PipelineHandlerBase::queueAllBuffers(Camera *camera)\n> > +{\n> > +     CameraData *data = cameraData(camera);\n> > +     int ret;\n> > +\n> > +     for (auto const stream : data->streams_) {\n> > +             if (!stream->isExternal()) {\n> > +                     ret = stream->queueAllBuffers();\n> > +                     if (ret < 0)\n> > +                             return ret;\n> > +             } else {\n> > +                     /*\n> > +                      * For external streams, we must queue up a set of internal\n> > +                      * buffers to handle the number of drop frames requested by\n> > +                      * the IPA. This is done by passing nullptr in queueBuffer().\n> > +                      *\n> > +                      * The below queueBuffer() call will do nothing if there\n> > +                      * are not enough internal buffers allocated, but this will\n> > +                      * be handled by queuing the request for buffers in the\n> > +                      * RPiStream object.\n> > +                      */\n> > +                     unsigned int i;\n> > +                     for (i = 0; i < data->dropFrameCount_; i++) {\n> > +                             ret = stream->queueBuffer(nullptr);\n> > +                             if (ret)\n> > +                                     return ret;\n> > +                     }\n> > +             }\n> > +     }\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +void CameraData::freeBuffers()\n> > +{\n> > +     if (ipa_) {\n> > +             /*\n> > +              * Copy the buffer ids from the unordered_set to a vector to\n> > +              * pass to the IPA.\n> > +              */\n> > +             std::vector<unsigned int> bufferIds(bufferIds_.begin(),\n> > +                                                 bufferIds_.end());\n> > +             
ipa_->unmapBuffers(bufferIds);\n> > +             bufferIds_.clear();\n> > +     }\n> > +\n> > +     for (auto const stream : streams_)\n> > +             stream->releaseBuffers();\n> > +\n> > +     platformFreeBuffers();\n> > +\n> > +     buffersAllocated_ = false;\n> > +}\n> > +\n> > +/*\n> > + * enumerateVideoDevices() iterates over the Media Controller topology, starting\n> > + * at the sensor and finishing at the frontend. For each sensor, CameraData stores\n> > + * a unique list of any intermediate video mux or bridge devices connected in a\n> > + * cascade, together with the entity to entity link.\n> > + *\n> > + * Entity pad configuration and link enabling happens at the end of configure().\n> > + * We first disable all pad links on each entity device in the chain, and then\n> > + * selectively enable the specific links to link the sensor to the frontend across\n> > + * all intermediate muxes and bridges.\n> > + *\n> > + * In the cascaded topology below, if Sensor1 is used, the Mux2 -> Mux1 link\n> > + * will be disabled, and Sensor1 -> Mux1 -> Frontend links enabled. Alternatively,\n> > + * if Sensor3 is used, the Sensor2 -> Mux2 and Sensor1 -> Mux1 links are disabled,\n> > + * and Sensor3 -> Mux2 -> Mux1 -> Frontend links are enabled. 
All other links will\n> > + * remain unchanged.\n> > + *\n> > + *  +----------+\n> > + *  |     FE   |\n> > + *  +-----^----+\n> > + *        |\n> > + *    +---+---+\n> > + *    | Mux1  |<------+\n> > + *    +--^----+       |\n> > + *       |            |\n> > + * +-----+---+    +---+---+\n> > + * | Sensor1 |    |  Mux2 |<--+\n> > + * +---------+    +-^-----+   |\n> > + *                  |         |\n> > + *          +-------+-+   +---+-----+\n> > + *          | Sensor2 |   | Sensor3 |\n> > + *          +---------+   +---------+\n> > + */\n> > +void CameraData::enumerateVideoDevices(MediaLink *link, const std::string &frontend)\n> > +{\n> > +     const MediaPad *sinkPad = link->sink();\n> > +     const MediaEntity *entity = sinkPad->entity();\n> > +     bool frontendFound = false;\n> > +\n> > +     /* We only deal with Video Mux and Bridge devices in cascade. */\n> > +     if (entity->function() != MEDIA_ENT_F_VID_MUX &&\n> > +         entity->function() != MEDIA_ENT_F_VID_IF_BRIDGE)\n> > +             return;\n> > +\n> > +     /* Find the source pad for this Video Mux or Bridge device. */\n> > +     const MediaPad *sourcePad = nullptr;\n> > +     for (const MediaPad *pad : entity->pads()) {\n> > +             if (pad->flags() & MEDIA_PAD_FL_SOURCE) {\n> > +                     /*\n> > +                      * We can only deal with devices that have a single source\n> > +                      * pad. 
If this device has multiple source pads, ignore it\n> > +                      * and this branch in the cascade.\n> > +                      */\n> > +                     if (sourcePad)\n> > +                             return;\n> > +\n> > +                     sourcePad = pad;\n> > +             }\n> > +     }\n> > +\n> > +     LOG(RPI, Debug) << \"Found video mux device \" << entity->name()\n> > +                     << \" linked to sink pad \" << sinkPad->index();\n> > +\n> > +     bridgeDevices_.emplace_back(std::make_unique<V4L2Subdevice>(entity), link);\n> > +     bridgeDevices_.back().first->open();\n> > +\n> > +     /*\n> > +      * Iterate through all the sink pad links down the cascade to find any\n> > +      * other Video Mux and Bridge devices.\n> > +      */\n> > +     for (MediaLink *l : sourcePad->links()) {\n> > +             enumerateVideoDevices(l, frontend);\n> > +             /* Once we reach the Frontend entity, we are done. */\n> > +             if (l->sink()->entity()->name() == frontend) {\n> > +                     frontendFound = true;\n> > +                     break;\n> > +             }\n> > +     }\n> > +\n> > +     /* This identifies the end of our entity enumeration recursion. 
*/\n> > +     if (link->source()->entity()->function() == MEDIA_ENT_F_CAM_SENSOR) {\n> > +             /*\n> > +             * If the frontend is not at the end of this cascade, we cannot\n> > +             * configure this topology automatically, so remove all entity references.\n>\n> Line wrap.\n>\n> > +             */\n> > +             if (!frontendFound) {\n> > +                     LOG(RPI, Warning) << \"Cannot automatically configure this MC topology!\";\n> > +                     bridgeDevices_.clear();\n> > +             }\n> > +     }\n> > +}\n> > +\n> > +int CameraData::loadPipelineConfiguration()\n> > +{\n> > +     config_ = {\n> > +             .disableStartupFrameDrops = false,\n> > +             .cameraTimeoutValue = 0,\n> > +     };\n> > +\n> > +     /* Initial configuration of the platform, in case no config file is present */\n> > +     platformPipelineConfigure({});\n> > +\n> > +     char const *configFromEnv = utils::secure_getenv(\"LIBCAMERA_RPI_CONFIG_FILE\");\n> > +     if (!configFromEnv || *configFromEnv == '\\0')\n> > +             return 0;\n> > +\n> > +     std::string filename = std::string(configFromEnv);\n> > +     File file(filename);\n> > +\n> > +     if (!file.open(File::OpenModeFlag::ReadOnly)) {\n> > +             LOG(RPI, Error) << \"Failed to open configuration file '\" << filename << \"'\";\n> > +             return -EIO;\n> > +     }\n> > +\n> > +     LOG(RPI, Info) << \"Using configuration file '\" << filename << \"'\";\n> > +\n> > +     std::unique_ptr<YamlObject> root = YamlParser::parse(file);\n> > +     if (!root) {\n> > +             LOG(RPI, Warning) << \"Failed to parse configuration file, using defaults\";\n> > +             return 0;\n> > +     }\n> > +\n> > +     std::optional<double> ver = (*root)[\"version\"].get<double>();\n> > +     if (!ver || *ver != 1.0) {\n> > +             LOG(RPI, Error) << \"Unexpected configuration file version reported\";\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +    
 const YamlObject &phConfig = (*root)[\"pipeline_handler\"];\n> > +\n> > +     config_.disableStartupFrameDrops =\n> > +             phConfig[\"disable_startup_frame_drops\"].get<bool>(config_.disableStartupFrameDrops);\n> > +\n> > +     config_.cameraTimeoutValue =\n> > +             phConfig[\"camera_timeout_value_ms\"].get<unsigned int>(config_.cameraTimeoutValue);\n> > +\n> > +     if (config_.cameraTimeoutValue) {\n> > +             /* Disable the IPA signal to control timeout and set the user requested value. */\n> > +             ipa_->setCameraTimeout.disconnect();\n> > +             frontendDevice()->setDequeueTimeout(config_.cameraTimeoutValue * 1ms);\n> > +     }\n> > +\n> > +     return platformPipelineConfigure(root);\n> > +}\n> > +\n> > +int CameraData::loadIPA(ipa::RPi::InitResult *result)\n> > +{\n> > +     ipa_ = IPAManager::createIPA<ipa::RPi::IPAProxyRPi>(pipe(), 1, 1);\n> > +\n> > +     if (!ipa_)\n> > +             return -ENOENT;\n> > +\n> > +     /*\n> > +      * The configuration (tuning file) is made from the sensor name unless\n> > +      * the environment variable overrides it.\n> > +      */\n> > +     std::string configurationFile;\n> > +     char const *configFromEnv = utils::secure_getenv(\"LIBCAMERA_RPI_TUNING_FILE\");\n> > +     if (!configFromEnv || *configFromEnv == '\\0') {\n> > +             std::string model = sensor_->model();\n> > +             if (isMonoSensor(sensor_))\n> > +                     model += \"_mono\";\n> > +             configurationFile = ipa_->configurationFile(model + \".json\", \"rpi\");\n> > +     } else {\n> > +             configurationFile = std::string(configFromEnv);\n> > +     }\n> > +\n> > +     IPASettings settings(configurationFile, sensor_->model());\n> > +     ipa::RPi::InitParams params;\n> > +\n> > +     params.lensPresent = !!sensor_->focusLens();\n> > +     int ret = platformInitIpa(params);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     return ipa_->init(settings, 
params, result);\n> > +}\n> > +\n> > +int CameraData::configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result)\n> > +{\n> > +     std::map<unsigned int, ControlInfoMap> entityControls;\n> > +     ipa::RPi::ConfigParams params;\n> > +     int ret;\n> > +\n> > +     params.sensorControls = sensor_->controls();\n> > +     if (sensor_->focusLens())\n> > +             params.lensControls = sensor_->focusLens()->controls();\n> > +\n> > +     ret = platformConfigureIpa(params);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     /* We store the IPACameraSensorInfo for digital zoom calculations. */\n> > +     ret = sensor_->sensorInfo(&sensorInfo_);\n> > +     if (ret) {\n> > +             LOG(RPI, Error) << \"Failed to retrieve camera sensor info\";\n> > +             return ret;\n> > +     }\n> > +\n> > +     /* Always send the user transform to the IPA. */\n> > +     params.transform = static_cast<unsigned int>(config->transform);\n> > +\n> > +     /* Ready the IPA - it must know about the sensor resolution. 
*/\n> > +     ret = ipa_->configure(sensorInfo_, params, result);\n> > +     if (ret < 0) {\n> > +             LOG(RPI, Error) << \"IPA configuration failed!\";\n> > +             return -EPIPE;\n> > +     }\n> > +\n> > +     if (!result->controls.empty())\n> > +             setSensorControls(result->controls);\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +void CameraData::setDelayedControls(const ControlList &controls, uint32_t delayContext)\n> > +{\n> > +     if (!delayedCtrls_->push(controls, delayContext))\n> > +             LOG(RPI, Error) << \"V4L2 DelayedControl set failed\";\n> > +}\n> > +\n> > +void CameraData::setLensControls(const ControlList &controls)\n> > +{\n> > +     CameraLens *lens = sensor_->focusLens();\n> > +\n> > +     if (lens && controls.contains(V4L2_CID_FOCUS_ABSOLUTE)) {\n> > +             ControlValue const &focusValue = controls.get(V4L2_CID_FOCUS_ABSOLUTE);\n> > +             lens->setFocusPosition(focusValue.get<int32_t>());\n> > +     }\n> > +}\n> > +\n> > +void CameraData::setSensorControls(ControlList &controls)\n> > +{\n> > +     /*\n> > +      * We need to ensure that if both VBLANK and EXPOSURE are present, the\n> > +      * former must be written ahead of, and separately from EXPOSURE to avoid\n> > +      * V4L2 rejecting the latter. 
This is identical to what DelayedControls\n> > +      * does with the priority write flag.\n> > +      *\n> > +      * As a consequence of the below logic, VBLANK gets set twice, and we\n> > +      * rely on the v4l2 framework to not pass the second control set to the\n> > +      * driver as the actual control value has not changed.\n> > +      */\n> > +     if (controls.contains(V4L2_CID_EXPOSURE) && controls.contains(V4L2_CID_VBLANK)) {\n> > +             ControlList vblank_ctrl;\n> > +\n> > +             vblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));\n> > +             sensor_->setControls(&vblank_ctrl);\n> > +     }\n> > +\n> > +     sensor_->setControls(&controls);\n> > +}\n> > +\n> > +Rectangle CameraData::scaleIspCrop(const Rectangle &ispCrop) const\n> > +{\n> > +     /*\n> > +      * Scale a crop rectangle defined in the ISP's coordinates into native sensor\n> > +      * coordinates.\n> > +      */\n> > +     Rectangle nativeCrop = ispCrop.scaledBy(sensorInfo_.analogCrop.size(),\n> > +                                             sensorInfo_.outputSize);\n> > +     nativeCrop.translateBy(sensorInfo_.analogCrop.topLeft());\n> > +     return nativeCrop;\n> > +}\n> > +\n> > +void CameraData::calculateScalerCrop(const ControlList &controls)\n> > +{\n> > +     const auto &scalerCrop = controls.get<Rectangle>(controls::ScalerCrop);\n> > +     if (scalerCrop) {\n> > +             Rectangle nativeCrop = *scalerCrop;\n> > +\n> > +             if (!nativeCrop.width || !nativeCrop.height)\n> > +                     nativeCrop = { 0, 0, 1, 1 };\n> > +\n> > +             /* Create a version of the crop scaled to ISP (camera mode) pixels. */\n> > +             Rectangle ispCrop = nativeCrop.translatedBy(-sensorInfo_.analogCrop.topLeft());\n> > +             ispCrop.scaleBy(sensorInfo_.outputSize, sensorInfo_.analogCrop.size());\n> > +\n> > +             /*\n> > +              * The crop that we set must be:\n> > +              * 1. 
At least as big as ispMinCropSize_, once that's been\n> > +              *    enlarged to the same aspect ratio.\n> > +              * 2. With the same mid-point, if possible.\n> > +              * 3. But it can't go outside the sensor area.\n> > +              */\n> > +             Size minSize = ispMinCropSize_.expandedToAspectRatio(nativeCrop.size());\n> > +             Size size = ispCrop.size().expandedTo(minSize);\n> > +             ispCrop = size.centeredTo(ispCrop.center()).enclosedIn(Rectangle(sensorInfo_.outputSize));\n> > +\n> > +             if (ispCrop != ispCrop_) {\n> > +                     ispCrop_ = ispCrop;\n> > +                     platformIspCrop();\n>\n> The CameraData::calculateScalerCrop() function was previously called\n> applyScalerCrop(). The new name implies it calculates the crop rectangle\n> only, without applying it.\n>\n> platformIspCrop() is a fairly non-descriptive name. As far as I\n> understand its purpose is to apply the ISP input crop rectangle, could\n> you rename it accordingly ?\n\nAck.  I'll rename calculateScalerCrop to applyScalerCrop and\nplatformIspCrop() to platformSetIspCrop().\n\n>\n> > +\n> > +                     /*\n> > +                      * Also update the ScalerCrop in the metadata with what we actually\n> > +                      * used. 
But we must first rescale that from ISP (camera mode) pixels\n> > +                      * back into sensor native pixels.\n> > +                      */\n> > +                     scalerCrop_ = scaleIspCrop(ispCrop_);\n> > +             }\n> > +     }\n> > +}\n> > +\n> > +void CameraData::cameraTimeout()\n> > +{\n> > +     LOG(RPI, Error) << \"Camera frontend has timed out!\";\n> > +     LOG(RPI, Error) << \"Please check that your camera sensor connector is attached securely.\";\n> > +     LOG(RPI, Error) << \"Alternatively, try another cable and/or sensor.\";\n> > +\n> > +     state_ = CameraData::State::Error;\n> > +     platformStop();\n> > +\n> > +     /*\n> > +      * To allow the application to attempt a recovery from this timeout,\n> > +      * stop all devices streaming, and return any outstanding requests as\n> > +      * incomplete and cancelled.\n> > +      */\n> > +     for (auto const stream : streams_)\n> > +             stream->dev()->streamOff();\n> > +\n> > +     clearIncompleteRequests();\n> > +}\n> > +\n> > +void CameraData::frameStarted(uint32_t sequence)\n> > +{\n> > +     LOG(RPI, Debug) << \"Frame start \" << sequence;\n> > +\n> > +     /* Write any controls for the next frame as soon as we can. */\n> > +     delayedCtrls_->applyControls(sequence);\n> > +}\n> > +\n> > +void CameraData::clearIncompleteRequests()\n> > +{\n> > +     /*\n> > +      * All outstanding requests (and associated buffers) must be returned\n> > +      * back to the application.\n> > +      */\n> > +     while (!requestQueue_.empty()) {\n> > +             Request *request = requestQueue_.front();\n> > +\n> > +             for (auto &b : request->buffers()) {\n> > +                     FrameBuffer *buffer = b.second;\n> > +                     /*\n> > +                      * Has the buffer already been handed back to the\n> > +                      * request? 
If not, do so now.\n> > +                      */\n> > +                     if (buffer->request()) {\n> > +                             buffer->_d()->cancel();\n> > +                             pipe()->completeBuffer(request, buffer);\n> > +                     }\n> > +             }\n> > +\n> > +             pipe()->completeRequest(request);\n> > +             requestQueue_.pop();\n> > +     }\n> > +}\n> > +\n> > +void CameraData::handleStreamBuffer(FrameBuffer *buffer, RPi::Stream *stream)\n> > +{\n> > +     /*\n> > +      * It is possible to be here without a pending request, so check\n> > +      * that we actually have one to action, otherwise we just return\n> > +      * buffer back to the stream.\n> > +      */\n> > +     Request *request = requestQueue_.empty() ? nullptr : requestQueue_.front();\n> > +     if (!dropFrameCount_ && request && request->findBuffer(stream) == buffer) {\n> > +             /*\n> > +              * Check if this is an externally provided buffer, and if\n> > +              * so, we must stop tracking it in the pipeline handler.\n> > +              */\n> > +             handleExternalBuffer(buffer, stream);\n> > +             /*\n> > +              * Tag the buffer as completed, returning it to the\n> > +              * application.\n> > +              */\n> > +             pipe()->completeBuffer(request, buffer);\n> > +     } else {\n> > +             /*\n> > +              * This buffer was not part of the Request (which happens if an\n> > +              * internal buffer was used for an external stream, or\n> > +              * unconditionally for internal streams), or there is no pending\n> > +              * request, so we can recycle it.\n> > +              */\n> > +             stream->returnBuffer(buffer);\n> > +     }\n> > +}\n> > +\n> > +void CameraData::handleState()\n> > +{\n> > +     switch (state_) {\n> > +     case State::Stopped:\n> > +     case State::Busy:\n> > +     case State::Error:\n> > +             break;\n> 
> +\n> > +     case State::IpaComplete:\n> > +             /* If the request is completed, we will switch to Idle state. */\n> > +             checkRequestCompleted();\n> > +             /*\n> > +              * No break here, we want to try running the pipeline again.\n> > +              * The fallthrough clause below suppresses compiler warnings.\n> > +              */\n> > +             [[fallthrough]];\n> > +\n> > +     case State::Idle:\n> > +             tryRunPipeline();\n> > +             break;\n> > +     }\n> > +}\n> > +\n> > +void CameraData::handleExternalBuffer(FrameBuffer *buffer, RPi::Stream *stream)\n> > +{\n> > +     unsigned int id = stream->getBufferId(buffer);\n> > +\n> > +     if (!(id & MaskExternalBuffer))\n> > +             return;\n> > +\n> > +     /* Stop the Stream object from tracking the buffer. */\n> > +     stream->removeExternalBuffer(buffer);\n> > +}\n> > +\n> > +void CameraData::checkRequestCompleted()\n> > +{\n> > +     bool requestCompleted = false;\n> > +     /*\n> > +      * If we are dropping this frame, do not touch the request, simply\n> > +      * change the state to IDLE when ready.\n> > +      */\n> > +     if (!dropFrameCount_) {\n> > +             Request *request = requestQueue_.front();\n> > +             if (request->hasPendingBuffers())\n> > +                     return;\n> > +\n> > +             /* Must wait for metadata to be filled in before completing. 
*/\n> > +             if (state_ != State::IpaComplete)\n> > +                     return;\n> > +\n> > +             pipe()->completeRequest(request);\n> > +             requestQueue_.pop();\n> > +             requestCompleted = true;\n> > +     }\n> > +\n> > +     /*\n> > +      * Make sure we have three outputs completed in the case of a dropped\n> > +      * frame.\n> > +      */\n> > +     if (state_ == State::IpaComplete &&\n> > +         ((ispOutputCount_ == ispOutputTotal_ && dropFrameCount_) || requestCompleted)) {\n>\n> Line wrap.\n>\n> > +             state_ = State::Idle;\n> > +             if (dropFrameCount_) {\n> > +                     dropFrameCount_--;\n> > +                     LOG(RPI, Debug) << \"Dropping frame at the request of the IPA (\"\n> > +                                     << dropFrameCount_ << \" left)\";\n> > +             }\n> > +     }\n> > +}\n> > +\n> > +void CameraData::fillRequestMetadata(const ControlList &bufferControls, Request *request)\n> > +{\n> > +     request->metadata().set(controls::SensorTimestamp,\n> > +                             bufferControls.get(controls::SensorTimestamp).value_or(0));\n> > +\n> > +     request->metadata().set(controls::ScalerCrop, scalerCrop_);\n> > +}\n> > +\n> > +} /* namespace libcamera */\n> > diff --git a/src/libcamera/pipeline/rpi/common/pipeline_base.h b/src/libcamera/pipeline/rpi/common/pipeline_base.h\n> > new file mode 100644\n> > index 000000000000..318266a6fb51\n> > --- /dev/null\n> > +++ b/src/libcamera/pipeline/rpi/common/pipeline_base.h\n> > @@ -0,0 +1,276 @@\n> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> > +/*\n> > + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> > + *\n> > + * pipeline_base.h - Pipeline handler base class for Raspberry Pi devices\n> > + */\n> > +\n> > +#include <map>\n> > +#include <memory>\n> > +#include <optional>\n> > +#include <queue>\n> > +#include <string>\n> > +#include <unordered_set>\n> > +#include <utility>\n> > +#include <vector>\n> > 
+\n> > +#include <libcamera/controls.h>\n> > +#include <libcamera/request.h>\n> > +\n> > +#include \"libcamera/internal/bayer_format.h\"\n> > +#include \"libcamera/internal/camera.h\"\n> > +#include \"libcamera/internal/camera_sensor.h\"\n> > +#include \"libcamera/internal/framebuffer.h\"\n> > +#include \"libcamera/internal/media_device.h\"\n> > +#include \"libcamera/internal/media_object.h\"\n> > +#include \"libcamera/internal/pipeline_handler.h\"\n> > +#include \"libcamera/internal/v4l2_videodevice.h\"\n> > +#include \"libcamera/internal/yaml_parser.h\"\n> > +\n> > +#include <libcamera/ipa/raspberrypi_ipa_interface.h>\n> > +#include <libcamera/ipa/raspberrypi_ipa_proxy.h>\n> > +\n> > +#include \"delayed_controls.h\"\n> > +#include \"rpi_stream.h\"\n> > +\n> > +using namespace std::chrono_literals;\n> > +\n> > +namespace libcamera {\n> > +\n> > +namespace RPi {\n> > +\n> > +/* Map of mbus codes to supported sizes reported by the sensor. */\n> > +using SensorFormats = std::map<unsigned int, std::vector<Size>>;\n> > +\n> > +class CameraData : public Camera::Private\n> > +{\n> > +public:\n> > +     CameraData(PipelineHandler *pipe)\n> > +             : Camera::Private(pipe), state_(State::Stopped),\n> > +               flipsAlterBayerOrder_(false), dropFrameCount_(0), buffersAllocated_(false),\n> > +               ispOutputCount_(0), ispOutputTotal_(0)\n> > +     {\n> > +     }\n> > +\n> > +     virtual ~CameraData()\n> > +     {\n> > +     }\n> > +\n> > +     struct StreamParams {\n> > +             StreamParams()\n> > +                     : index(0), cfg(nullptr), dev(nullptr)\n> > +             {\n> > +             }\n> > +\n> > +             StreamParams(unsigned int index_, StreamConfiguration *cfg_)\n> > +                     : index(index_), cfg(cfg_), dev(nullptr)\n> > +             {\n> > +             }\n> > +\n> > +             unsigned int index;\n> > +             StreamConfiguration *cfg;\n> > +             V4L2VideoDevice *dev;\n> > +     };\n> > +\n> 
> +     virtual CameraConfiguration::Status platformValidate(std::vector<StreamParams> &rawStreams,\n> > +                                                          std::vector<StreamParams> &outStreams) const = 0;\n> > +     virtual int platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> > +                                   std::optional<BayerFormat::Packing> packing,\n> > +                                   std::vector<StreamParams> &rawStreams,\n> > +                                   std::vector<StreamParams> &outStreams) = 0;\n>\n> As explained in the review of the corresponding refactoring for the IPA\n> module, all those abstract virtual platform*() functions are not nice.\n> It may be possible to remove some by overriding the PipelineHandler\n> functions in the derived class and calling PipelineHandlerBase functions\n> from there.\n\nplatformValidate and platformConfigure are slightly different in this case.\nThe base class does some general housekeeping before calling the\nabove two functions.\n\nThe only other option I can think of is not to have validate() and configure()\nin the base class, but implement them in the derived classes.  
However, this\nmight mean code duplication/maintenance overhead should one wish to implement\na new derived pipeline handler.\n\n>\n> > +     virtual void platformStart() = 0;\n> > +     virtual void platformStop() = 0;\n> > +\n> > +     void freeBuffers();\n> > +     virtual void platformFreeBuffers() = 0;\n> > +\n> > +     void enumerateVideoDevices(MediaLink *link, const std::string &frontend);\n> > +\n> > +     int loadPipelineConfiguration();\n> > +     int loadIPA(ipa::RPi::InitResult *result);\n> > +     int configureIPA(const CameraConfiguration *config, ipa::RPi::ConfigResult *result);\n> > +     virtual int platformInitIpa(ipa::RPi::InitParams &params) = 0;\n> > +     virtual int platformConfigureIpa(ipa::RPi::ConfigParams &params) = 0;\n> > +\n> > +     void setDelayedControls(const ControlList &controls, uint32_t delayContext);\n> > +     void setLensControls(const ControlList &controls);\n> > +     void setSensorControls(ControlList &controls);\n> > +\n> > +     Rectangle scaleIspCrop(const Rectangle &ispCrop) const;\n> > +     void calculateScalerCrop(const ControlList &controls);\n> > +     virtual void platformIspCrop() = 0;\n> > +\n> > +     void cameraTimeout();\n> > +     void frameStarted(uint32_t sequence);\n> > +\n> > +     void clearIncompleteRequests();\n> > +     void handleStreamBuffer(FrameBuffer *buffer, Stream *stream);\n> > +     void handleState();\n> > +\n> > +     virtual V4L2VideoDevice::Formats ispFormats() const = 0;\n> > +     virtual V4L2VideoDevice::Formats rawFormats() const = 0;\n> > +     virtual V4L2VideoDevice *frontendDevice() = 0;\n> > +\n> > +     virtual int platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) = 0;\n> > +\n> > +     std::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;\n> > +\n> > +     std::unique_ptr<CameraSensor> sensor_;\n> > +     SensorFormats sensorFormats_;\n> > +\n> > +     /* The vector below is just for convenience when iterating over all streams. 
*/\n> > +     std::vector<Stream *> streams_;\n> > +     /* Stores the ids of the buffers mapped in the IPA. */\n> > +     std::unordered_set<unsigned int> bufferIds_;\n> > +     /*\n> > +      * Stores a cascade of Video Mux or Bridge devices between the sensor and\n> > +      * Unicam together with media link across the entities.\n> > +      */\n> > +     std::vector<std::pair<std::unique_ptr<V4L2Subdevice>, MediaLink *>> bridgeDevices_;\n> > +\n> > +     std::unique_ptr<DelayedControls> delayedCtrls_;\n> > +     bool sensorMetadata_;\n> > +\n> > +     /*\n> > +      * All the functions in this class are called from a single calling\n> > +      * thread. So, we do not need to have any mutex to protect access to any\n> > +      * of the variables below.\n> > +      */\n> > +     enum class State { Stopped, Idle, Busy, IpaComplete, Error };\n> > +     State state_;\n> > +\n> > +     bool isRunning()\n> > +     {\n> > +             return state_ != State::Stopped && state_ != State::Error;\n> > +     }\n> > +\n> > +     std::queue<Request *> requestQueue_;\n> > +\n> > +     /* Store the \"native\" Bayer order (that is, with no transforms applied). */\n> > +     bool flipsAlterBayerOrder_;\n> > +     BayerFormat::Order nativeBayerOrder_;\n> > +\n> > +     /* For handling digital zoom. */\n> > +     IPACameraSensorInfo sensorInfo_;\n> > +     Rectangle ispCrop_; /* crop in ISP (camera mode) pixels */\n> > +     Rectangle scalerCrop_; /* crop in sensor native pixels */\n> > +     Size ispMinCropSize_;\n> > +\n> > +     unsigned int dropFrameCount_;\n> > +\n> > +     /*\n> > +      * If set, this stores the value that represents a gain of one for\n> > +      * the V4L2_CID_NOTIFY_GAINS control.\n> > +      */\n> > +     std::optional<int32_t> notifyGainsUnity_;\n> > +\n> > +     /* Have internal buffers been allocated? 
*/\n> > +     bool buffersAllocated_;\n> > +\n> > +     struct Config {\n> > +             /*\n> > +              * Override any request from the IPA to drop a number of startup\n> > +              * frames.\n> > +              */\n> > +             bool disableStartupFrameDrops;\n> > +             /*\n> > +              * Override the camera timeout value calculated by the IPA based\n> > +              * on frame durations.\n> > +              */\n> > +             unsigned int cameraTimeoutValue;\n> > +     };\n> > +\n> > +     Config config_;\n> > +\n> > +protected:\n> > +     void fillRequestMetadata(const ControlList &bufferControls,\n> > +                              Request *request);\n> > +\n> > +     virtual void tryRunPipeline() = 0;\n> > +\n> > +     unsigned int ispOutputCount_;\n> > +     unsigned int ispOutputTotal_;\n> > +\n> > +private:\n> > +     void handleExternalBuffer(FrameBuffer *buffer, Stream *stream);\n> > +     void checkRequestCompleted();\n> > +};\n> > +\n> > +class PipelineHandlerBase : public PipelineHandler\n> > +{\n> > +public:\n> > +     PipelineHandlerBase(CameraManager *manager)\n> > +             : PipelineHandler(manager)\n> > +     {\n> > +     }\n> > +\n> > +     virtual ~PipelineHandlerBase()\n> > +     {\n> > +     }\n> > +\n> > +     static V4L2DeviceFormat toV4L2DeviceFormat(const V4L2VideoDevice *dev,\n> > +                                                const V4L2SubdeviceFormat &format,\n> > +                                                BayerFormat::Packing packingReq);\n> > +\n> > +     std::unique_ptr<CameraConfiguration>\n> > +     generateConfiguration(Camera *camera, const StreamRoles &roles) override;\n> > +     int configure(Camera *camera, CameraConfiguration *config) override;\n> > +\n> > +     int exportFrameBuffers(Camera *camera, libcamera::Stream *stream,\n> > +                            std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;\n> > +\n> > +     int start(Camera *camera, const 
ControlList *controls) override;\n> > +     void stopDevice(Camera *camera) override;\n> > +     void releaseDevice(Camera *camera) override;\n> > +\n> > +     int queueRequestDevice(Camera *camera, Request *request) override;\n> > +\n> > +protected:\n> > +     int registerCamera(MediaDevice *frontent, const std::string &frontendName,\n> > +                        MediaDevice *backend, MediaEntity *sensorEntity);\n> > +\n> > +     void mapBuffers(Camera *camera, const BufferMap &buffers, unsigned int mask);\n> > +\n> > +     virtual std::unique_ptr<CameraData> allocateCameraData() = 0;\n> > +     virtual int platformRegister(std::unique_ptr<CameraData> &cameraData,\n> > +                                  MediaDevice *unicam, MediaDevice *isp) = 0;\n> > +\n> > +private:\n> > +     CameraData *cameraData(Camera *camera)\n> > +     {\n> > +             return static_cast<CameraData *>(camera->_d());\n> > +     }\n> > +\n> > +     int queueAllBuffers(Camera *camera);\n> > +     virtual int prepareBuffers(Camera *camera) = 0;\n> > +};\n> > +\n> > +class RPiCameraConfiguration final : public CameraConfiguration\n> > +{\n> > +public:\n> > +     RPiCameraConfiguration(const CameraData *data)\n> > +             : CameraConfiguration(), data_(data)\n> > +     {\n> > +     }\n> > +\n> > +     CameraConfiguration::Status validateColorSpaces(ColorSpaceFlags flags);\n> > +     Status validate() override;\n> > +\n> > +     /* Cache the combinedTransform_ that will be applied to the sensor */\n> > +     Transform combinedTransform_;\n> > +\n> > +private:\n> > +     const CameraData *data_;\n> > +\n> > +     /*\n> > +      * Store the colour spaces that all our streams will have. 
RGB format streams\n> > +      * will have the same colorspace as YUV streams, with YCbCr field cleared and\n> > +      * range set to full.\n> > +         */\n> > +     std::optional<ColorSpace> yuvColorSpace_;\n> > +     std::optional<ColorSpace> rgbColorSpace_;\n> > +};\n> > +\n> > +} /* namespace RPi */\n> > +\n> > +} /* namespace libcamera */\n> > diff --git a/src/libcamera/pipeline/rpi/vc4/data/example.yaml b/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> > index c90f518f8849..b8e01adeaf40 100644\n> > --- a/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> > +++ b/src/libcamera/pipeline/rpi/vc4/data/example.yaml\n> > @@ -34,13 +34,13 @@\n> >                  #\n> >                  # \"disable_startup_frame_drops\": false,\n> >\n> > -                # Custom timeout value (in ms) for Unicam to use. This overrides\n> > +                # Custom timeout value (in ms) for camera to use. This overrides\n> >                  # the value computed by the pipeline handler based on frame\n> >                  # durations.\n> >                  #\n> >                  # Set this value to 0 to use the pipeline handler computed\n> >                  # timeout value.\n> >                  #\n> > -                # \"unicam_timeout_value_ms\": 0,\n> > +                # \"camera_timeout_value_ms\": 0,\n> >          }\n> >  }\n> > diff --git a/src/libcamera/pipeline/rpi/vc4/meson.build b/src/libcamera/pipeline/rpi/vc4/meson.build\n> > index 228823f30922..cdb049c58d2c 100644\n> > --- a/src/libcamera/pipeline/rpi/vc4/meson.build\n> > +++ b/src/libcamera/pipeline/rpi/vc4/meson.build\n> > @@ -2,7 +2,7 @@\n> >\n> >  libcamera_sources += files([\n> >      'dma_heaps.cpp',\n> > -    'raspberrypi.cpp',\n> > +    'vc4.cpp',\n> >  ])\n> >\n> >  subdir('data')\n>\n> [snip]\n>\n> > diff --git a/src/libcamera/pipeline/rpi/vc4/vc4.cpp b/src/libcamera/pipeline/rpi/vc4/vc4.cpp\n> > new file mode 100644\n> > index 000000000000..a085a06376a8\n> > --- /dev/null\n> > +++ 
b/src/libcamera/pipeline/rpi/vc4/vc4.cpp\n> > @@ -0,0 +1,1001 @@\n> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> > +/*\n> > + * Copyright (C) 2019-2023, Raspberry Pi Ltd\n> > + *\n> > + * vc4.cpp - Pipeline handler for VC4 based Raspberry Pi devices\n> > + */\n> > +\n> > +#include <linux/bcm2835-isp.h>\n> > +#include <linux/v4l2-controls.h>\n> > +#include <linux/videodev2.h>\n> > +\n> > +#include <libcamera/formats.h>\n> > +\n> > +#include \"libcamera/internal/device_enumerator.h\"\n> > +\n> > +#include \"dma_heaps.h\"\n> > +#include \"pipeline_base.h\"\n> > +#include \"rpi_stream.h\"\n> > +\n> > +using namespace std::chrono_literals;\n> > +\n> > +namespace libcamera {\n> > +\n> > +LOG_DECLARE_CATEGORY(RPI)\n> > +\n> > +namespace {\n> > +\n> > +enum class Unicam : unsigned int { Image, Embedded };\n> > +enum class Isp : unsigned int { Input, Output0, Output1, Stats };\n> > +\n> > +} /* namespace */\n> > +\n> > +class Vc4CameraData final : public RPi::CameraData\n> > +{\n> > +public:\n> > +     Vc4CameraData(PipelineHandler *pipe)\n> > +             : RPi::CameraData(pipe)\n> > +     {\n> > +     }\n> > +\n> > +     ~Vc4CameraData()\n> > +     {\n> > +             freeBuffers();\n> > +     }\n> > +\n> > +     V4L2VideoDevice::Formats ispFormats() const override\n> > +     {\n> > +             return isp_[Isp::Output0].dev()->formats();\n> > +     }\n> > +\n> > +     V4L2VideoDevice::Formats rawFormats() const override\n> > +     {\n> > +             return unicam_[Unicam::Image].dev()->formats();\n> > +     }\n> > +\n> > +     V4L2VideoDevice *frontendDevice() override\n> > +     {\n> > +             return unicam_[Unicam::Image].dev();\n> > +     }\n> > +\n> > +     void platformFreeBuffers() override\n> > +     {\n> > +     }\n> > +\n> > +     CameraConfiguration::Status platformValidate(std::vector<StreamParams> &rawStreams,\n> > +                                                  std::vector<StreamParams> &outStreams) const override;\n> > +\n> > +     
int platformPipelineConfigure(const std::unique_ptr<YamlObject> &root) override;\n> > +\n> > +     void platformStart() override;\n> > +     void platformStop() override;\n> > +\n> > +     void unicamBufferDequeue(FrameBuffer *buffer);\n> > +     void ispInputDequeue(FrameBuffer *buffer);\n> > +     void ispOutputDequeue(FrameBuffer *buffer);\n> > +\n> > +     void processStatsComplete(const ipa::RPi::BufferIds &buffers);\n> > +     void prepareIspComplete(const ipa::RPi::BufferIds &buffers);\n> > +     void setIspControls(const ControlList &controls);\n> > +     void setCameraTimeout(uint32_t maxFrameLengthMs);\n> > +\n> > +     /* Array of Unicam and ISP device streams and associated buffers/streams. */\n> > +     RPi::Device<Unicam, 2> unicam_;\n> > +     RPi::Device<Isp, 4> isp_;\n> > +\n> > +     /* DMAHEAP allocation helper. */\n> > +     RPi::DmaHeap dmaHeap_;\n> > +     SharedFD lsTable_;\n> > +\n> > +     struct Config {\n> > +             /*\n> > +              * The minimum number of internal buffers to be allocated for\n> > +              * the Unicam Image stream.\n> > +              */\n> > +             unsigned int minUnicamBuffers;\n> > +             /*\n> > +              * The minimum total (internal + external) buffer count used for\n> > +              * the Unicam Image stream.\n> > +              *\n> > +              * Note that:\n> > +              * minTotalUnicamBuffers must be >= 1, and\n> > +              * minTotalUnicamBuffers >= minUnicamBuffers\n> > +              */\n> > +             unsigned int minTotalUnicamBuffers;\n> > +     };\n> > +\n> > +     Config config_;\n> > +\n> > +private:\n> > +     void platformIspCrop() override\n> > +     {\n> > +             isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &ispCrop_);\n> > +     }\n> > +\n> > +     int platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> > +                           std::optional<BayerFormat::Packing> packing,\n> > +                           
std::vector<StreamParams> &rawStreams,\n> > +                           std::vector<StreamParams> &outStreams) override;\n> > +     int platformConfigureIpa(ipa::RPi::ConfigParams &params) override;\n> > +\n> > +     int platformInitIpa([[maybe_unused]] ipa::RPi::InitParams &params) override\n> > +     {\n> > +             return 0;\n> > +     }\n> > +\n> > +     struct BayerFrame {\n> > +             FrameBuffer *buffer;\n> > +             ControlList controls;\n> > +             unsigned int delayContext;\n> > +     };\n> > +\n> > +     void tryRunPipeline() override;\n> > +     bool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer);\n> > +\n> > +     std::queue<BayerFrame> bayerQueue_;\n> > +     std::queue<FrameBuffer *> embeddedQueue_;\n> > +};\n> > +\n> > +class PipelineHandlerVc4 : public RPi::PipelineHandlerBase\n> > +{\n> > +public:\n> > +     PipelineHandlerVc4(CameraManager *manager)\n> > +             : RPi::PipelineHandlerBase(manager)\n> > +     {\n> > +     }\n> > +\n> > +     ~PipelineHandlerVc4()\n> > +     {\n> > +     }\n> > +\n> > +     bool match(DeviceEnumerator *enumerator) override;\n> > +\n> > +private:\n> > +     Vc4CameraData *cameraData(Camera *camera)\n> > +     {\n> > +             return static_cast<Vc4CameraData *>(camera->_d());\n> > +     }\n> > +\n> > +     std::unique_ptr<RPi::CameraData> allocateCameraData() override\n> > +     {\n> > +             return std::make_unique<Vc4CameraData>(this);\n> > +     }\n> > +\n> > +     int prepareBuffers(Camera *camera) override;\n> > +     int platformRegister(std::unique_ptr<RPi::CameraData> &cameraData,\n> > +                          MediaDevice *unicam, MediaDevice *isp) override;\n> > +};\n> > +\n> > +bool PipelineHandlerVc4::match(DeviceEnumerator *enumerator)\n> > +{\n> > +     constexpr unsigned int numUnicamDevices = 2;\n> > +\n> > +     /*\n> > +      * Loop over all Unicam instances, but return out once a match is found.\n> > +      * This is to ensure we 
correctly enumerate the camera when an instance\n> > +      * of Unicam has registered with media controller, but has not registered\n> > +      * device nodes due to a sensor subdevice failure.\n> > +      */\n> > +     for (unsigned int i = 0; i < numUnicamDevices; i++) {\n> > +             DeviceMatch unicam(\"unicam\");\n> > +             MediaDevice *unicamDevice = acquireMediaDevice(enumerator, unicam);\n> > +\n> > +             if (!unicamDevice) {\n> > +                     LOG(RPI, Debug) << \"Unable to acquire a Unicam instance\";\n> > +                     break;\n>\n> The existing code continues here instead of breaking. Same if\n> !ispDevice. Is this change intended?\n\nOops, that's a mistake.  I'll put that back.\n\nRegards,\nNaush\n\n>\n> > +             }\n> > +\n> > +             DeviceMatch isp(\"bcm2835-isp\");\n> > +             MediaDevice *ispDevice = acquireMediaDevice(enumerator, isp);\n> > +\n> > +             if (!ispDevice) {\n> > +                     LOG(RPI, Debug) << \"Unable to acquire ISP instance\";\n> > +                     break;\n> > +             }\n> > +\n> > +             /*\n> > +              * The loop below is used to register multiple cameras behind one or more\n> > +              * video mux devices that are attached to a particular Unicam instance.\n> > +              * Obviously these cameras cannot be used simultaneously.\n> > +              */\n> > +             unsigned int numCameras = 0;\n> > +             for (MediaEntity *entity : unicamDevice->entities()) {\n> > +                     if (entity->function() != MEDIA_ENT_F_CAM_SENSOR)\n> > +                             continue;\n> > +\n> > +                     int ret = RPi::PipelineHandlerBase::registerCamera(unicamDevice, \"unicam-image\",\n> > +                                                                        ispDevice, entity);\n> > +                     if (ret)\n> > +                             LOG(RPI, Error) << \"Failed to register camera \"\n> 
> +                                             << entity->name() << \": \" << ret;\n> > +                     else\n> > +                             numCameras++;\n> > +             }\n> > +\n> > +             if (numCameras)\n> > +                     return true;\n> > +     }\n> > +\n> > +     return false;\n> > +}\n> > +\n> > +int PipelineHandlerVc4::prepareBuffers(Camera *camera)\n> > +{\n> > +     Vc4CameraData *data = cameraData(camera);\n> > +     unsigned int numRawBuffers = 0;\n> > +     int ret;\n> > +\n> > +     for (Stream *s : camera->streams()) {\n> > +             if (BayerFormat::fromPixelFormat(s->configuration().pixelFormat).isValid()) {\n> > +                     numRawBuffers = s->configuration().bufferCount;\n> > +                     break;\n> > +             }\n> > +     }\n> > +\n> > +     /* Decide how many internal buffers to allocate. */\n> > +     for (auto const stream : data->streams_) {\n> > +             unsigned int numBuffers;\n> > +             /*\n> > +              * For Unicam, allocate a minimum number of buffers for internal\n> > +              * use as we want to avoid any frame drops.\n> > +              */\n> > +             const unsigned int minBuffers = data->config_.minTotalUnicamBuffers;\n> > +             if (stream == &data->unicam_[Unicam::Image]) {\n> > +                     /*\n> > +                      * If an application has configured a RAW stream, allocate\n> > +                      * additional buffers to make up the minimum, but ensure\n> > +                      * we have at least minUnicamBuffers of internal buffers\n> > +                      * to use to minimise frame drops.\n> > +                      */\n> > +                     numBuffers = std::max<int>(data->config_.minUnicamBuffers,\n> > +                                                minBuffers - numRawBuffers);\n> > +             } else if (stream == &data->isp_[Isp::Input]) {\n> > +                     /*\n> > +                      * ISP 
input buffers are imported from Unicam, so follow\n> > +                      * similar logic as above to count all the RAW buffers\n> > +                      * available.\n> > +                      */\n> > +                     numBuffers = numRawBuffers +\n> > +                                  std::max<int>(data->config_.minUnicamBuffers,\n> > +                                                minBuffers - numRawBuffers);\n> > +\n> > +             } else if (stream == &data->unicam_[Unicam::Embedded]) {\n> > +                     /*\n> > +                      * Embedded data buffers are (currently) for internal use,\n> > +                      * so allocate the minimum required to avoid frame drops.\n> > +                      */\n> > +                     numBuffers = minBuffers;\n> > +             } else {\n> > +                     /*\n> > +                      * Since the ISP runs synchronous with the IPA and requests,\n> > +                      * we only ever need one set of internal buffers. Any buffers\n> > +                      * the application wants to hold onto will already be exported\n> > +                      * through PipelineHandlerRPi::exportFrameBuffers().\n> > +                      */\n> > +                     numBuffers = 1;\n> > +             }\n> > +\n> > +             ret = stream->prepareBuffers(numBuffers);\n> > +             if (ret < 0)\n> > +                     return ret;\n> > +     }\n> > +\n> > +     /*\n> > +      * Pass the stats and embedded data buffers to the IPA. 
No other\n> > +      * buffers need to be passed.\n> > +      */\n> > +     mapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), RPi::MaskStats);\n> > +     if (data->sensorMetadata_)\n> > +             mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(),\n> > +                        RPi::MaskEmbeddedData);\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +int PipelineHandlerVc4::platformRegister(std::unique_ptr<RPi::CameraData> &cameraData, MediaDevice *unicam, MediaDevice *isp)\n> > +{\n> > +     Vc4CameraData *data = static_cast<Vc4CameraData *>(cameraData.get());\n> > +\n> > +     if (!data->dmaHeap_.isValid())\n> > +             return -ENOMEM;\n> > +\n> > +     MediaEntity *unicamImage = unicam->getEntityByName(\"unicam-image\");\n> > +     MediaEntity *ispOutput0 = isp->getEntityByName(\"bcm2835-isp0-output0\");\n> > +     MediaEntity *ispCapture1 = isp->getEntityByName(\"bcm2835-isp0-capture1\");\n> > +     MediaEntity *ispCapture2 = isp->getEntityByName(\"bcm2835-isp0-capture2\");\n> > +     MediaEntity *ispCapture3 = isp->getEntityByName(\"bcm2835-isp0-capture3\");\n> > +\n> > +     if (!unicamImage || !ispOutput0 || !ispCapture1 || !ispCapture2 || !ispCapture3)\n> > +             return -ENOENT;\n> > +\n> > +     /* Locate and open the unicam video streams. */\n> > +     data->unicam_[Unicam::Image] = RPi::Stream(\"Unicam Image\", unicamImage);\n> > +\n> > +     /* An embedded data node will not be present if the sensor does not support it. */\n> > +     MediaEntity *unicamEmbedded = unicam->getEntityByName(\"unicam-embedded\");\n> > +     if (unicamEmbedded) {\n> > +             data->unicam_[Unicam::Embedded] = RPi::Stream(\"Unicam Embedded\", unicamEmbedded);\n> > +             data->unicam_[Unicam::Embedded].dev()->bufferReady.connect(data,\n> > +                                                                        &Vc4CameraData::unicamBufferDequeue);\n> > +     }\n> > +\n> > +     /* Tag the ISP input stream as an import stream. 
*/\n> > +     data->isp_[Isp::Input] = RPi::Stream(\"ISP Input\", ispOutput0, true);\n> > +     data->isp_[Isp::Output0] = RPi::Stream(\"ISP Output0\", ispCapture1);\n> > +     data->isp_[Isp::Output1] = RPi::Stream(\"ISP Output1\", ispCapture2);\n> > +     data->isp_[Isp::Stats] = RPi::Stream(\"ISP Stats\", ispCapture3);\n> > +\n> > +     /* Wire up all the buffer connections. */\n> > +     data->unicam_[Unicam::Image].dev()->bufferReady.connect(data, &Vc4CameraData::unicamBufferDequeue);\n> > +     data->isp_[Isp::Input].dev()->bufferReady.connect(data, &Vc4CameraData::ispInputDequeue);\n> > +     data->isp_[Isp::Output0].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> > +     data->isp_[Isp::Output1].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> > +     data->isp_[Isp::Stats].dev()->bufferReady.connect(data, &Vc4CameraData::ispOutputDequeue);\n> > +\n> > +     if (data->sensorMetadata_ ^ !!data->unicam_[Unicam::Embedded].dev()) {\n> > +             LOG(RPI, Warning) << \"Mismatch between Unicam and CamHelper for embedded data usage!\";\n> > +             data->sensorMetadata_ = false;\n> > +             if (data->unicam_[Unicam::Embedded].dev())\n> > +                     data->unicam_[Unicam::Embedded].dev()->bufferReady.disconnect();\n> > +     }\n> > +\n> > +     /*\n> > +      * Open all Unicam and ISP streams. 
The exception is the embedded data\n> > +      * stream, which only gets opened below if the IPA reports that the sensor\n> > +      * supports embedded data.\n> > +      *\n> > +      * The below grouping is just for convenience so that we can easily\n> > +      * iterate over all streams in one go.\n> > +      */\n> > +     data->streams_.push_back(&data->unicam_[Unicam::Image]);\n> > +     if (data->sensorMetadata_)\n> > +             data->streams_.push_back(&data->unicam_[Unicam::Embedded]);\n> > +\n> > +     for (auto &stream : data->isp_)\n> > +             data->streams_.push_back(&stream);\n> > +\n> > +     for (auto stream : data->streams_) {\n> > +             int ret = stream->dev()->open();\n> > +             if (ret)\n> > +                     return ret;\n> > +     }\n> > +\n> > +     if (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) {\n> > +             LOG(RPI, Error) << \"Unicam driver does not use the MediaController, please update your kernel!\";\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +     /* Wire up all the IPA connections. */\n> > +     data->ipa_->processStatsComplete.connect(data, &Vc4CameraData::processStatsComplete);\n> > +     data->ipa_->prepareIspComplete.connect(data, &Vc4CameraData::prepareIspComplete);\n> > +     data->ipa_->setIspControls.connect(data, &Vc4CameraData::setIspControls);\n> > +     data->ipa_->setCameraTimeout.connect(data, &Vc4CameraData::setCameraTimeout);\n> > +\n> > +     /*\n> > +      * List the available streams an application may request. 
At present, we\n> > +      * do not advertise Unicam Embedded and ISP Statistics streams, as there\n> > +      * is no mechanism for the application to request non-image buffer formats.\n> > +      */\n> > +     std::set<Stream *> streams;\n> > +     streams.insert(&data->unicam_[Unicam::Image]);\n> > +     streams.insert(&data->isp_[Isp::Output0]);\n> > +     streams.insert(&data->isp_[Isp::Output1]);\n> > +\n> > +     /* Create and register the camera. */\n> > +     const std::string &id = data->sensor_->id();\n> > +     std::shared_ptr<Camera> camera =\n> > +             Camera::create(std::move(cameraData), id, streams);\n> > +     PipelineHandler::registerCamera(std::move(camera));\n> > +\n> > +     LOG(RPI, Info) << \"Registered camera \" << id\n> > +                    << \" to Unicam device \" << unicam->deviceNode()\n> > +                    << \" and ISP device \" << isp->deviceNode();\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +CameraConfiguration::Status Vc4CameraData::platformValidate(std::vector<StreamParams> &rawStreams,\n> > +                                                         std::vector<StreamParams> &outStreams) const\n> > +{\n> > +     CameraConfiguration::Status status = CameraConfiguration::Status::Valid;\n> > +\n> > +     /* Can only output 1 RAW stream, or 2 YUV/RGB streams. 
*/\n> > +     if (rawStreams.size() > 1 || outStreams.size() > 2) {\n> > +             LOG(RPI, Error) << \"Invalid number of streams requested\";\n> > +             return CameraConfiguration::Status::Invalid;\n> > +     }\n> > +\n> > +     if (!rawStreams.empty())\n> > +             rawStreams[0].dev = unicam_[Unicam::Image].dev();\n> > +\n> > +     /*\n> > +      * For the two ISP outputs, one stream must be equal or smaller than the\n> > +      * other in all dimensions.\n> > +      *\n> > +      * Index 0 contains the largest requested resolution.\n> > +      */\n> > +     for (unsigned int i = 0; i < outStreams.size(); i++) {\n> > +             Size size;\n> > +\n> > +             size.width = std::min(outStreams[i].cfg->size.width,\n> > +                                   outStreams[0].cfg->size.width);\n> > +             size.height = std::min(outStreams[i].cfg->size.height,\n> > +                                    outStreams[0].cfg->size.height);\n> > +\n> > +             if (outStreams[i].cfg->size != size) {\n> > +                     outStreams[i].cfg->size = size;\n> > +                     status = CameraConfiguration::Status::Adjusted;\n> > +             }\n> > +\n> > +             /*\n> > +              * Output 0 must be for the largest resolution. We will\n> > +              * have that fixed up in the code above.\n> > +              */\n> > +             outStreams[i].dev = isp_[i == 0 ? 
Isp::Output0 : Isp::Output1].dev();\n> > +     }\n> > +\n> > +     return status;\n> > +}\n> > +\n> > +int Vc4CameraData::platformPipelineConfigure(const std::unique_ptr<YamlObject> &root)\n> > +{\n> > +     config_ = {\n> > +             .minUnicamBuffers = 2,\n> > +             .minTotalUnicamBuffers = 4,\n> > +     };\n> > +\n> > +     if (!root)\n> > +             return 0;\n> > +\n> > +     std::optional<double> ver = (*root)[\"version\"].get<double>();\n> > +     if (!ver || *ver != 1.0) {\n> > +             LOG(RPI, Error) << \"Unexpected configuration file version reported\";\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +     std::optional<std::string> target = (*root)[\"target\"].get<std::string>();\n> > +     if (!target || *target != \"bcm2835\") {\n> > +             LOG(RPI, Error) << \"Unexpected target reported: expected \\\"bcm2835\\\", got \"\n> > +                             << target.value_or(\"(unset)\");\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +     const YamlObject &phConfig = (*root)[\"pipeline_handler\"];\n> > +     config_.minUnicamBuffers =\n> > +             phConfig[\"min_unicam_buffers\"].get<unsigned int>(config_.minUnicamBuffers);\n> > +     config_.minTotalUnicamBuffers =\n> > +             phConfig[\"min_total_unicam_buffers\"].get<unsigned int>(config_.minTotalUnicamBuffers);\n> > +\n> > +     if (config_.minTotalUnicamBuffers < config_.minUnicamBuffers) {\n> > +             LOG(RPI, Error) << \"Invalid configuration: min_total_unicam_buffers must be >= min_unicam_buffers\";\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +     if (config_.minTotalUnicamBuffers < 1) {\n> > +             LOG(RPI, Error) << \"Invalid configuration: min_total_unicam_buffers must be >= 1\";\n> > +             return -EINVAL;\n> > +     }\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +int Vc4CameraData::platformConfigure(const V4L2SubdeviceFormat &sensorFormat,\n> > +                                  
std::optional<BayerFormat::Packing> packing,\n> > +                                  std::vector<StreamParams> &rawStreams,\n> > +                                  std::vector<StreamParams> &outStreams)\n> > +{\n> > +     int ret;\n> > +\n> > +     if (!packing)\n> > +             packing = BayerFormat::Packing::CSI2;\n> > +\n> > +     V4L2VideoDevice *unicam = unicam_[Unicam::Image].dev();\n> > +     V4L2DeviceFormat unicamFormat = RPi::PipelineHandlerBase::toV4L2DeviceFormat(unicam, sensorFormat, *packing);\n> > +\n> > +     ret = unicam->setFormat(&unicamFormat);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     /*\n> > +      * See which streams are requested, and route the user\n> > +      * StreamConfiguration appropriately.\n> > +      */\n> > +     if (!rawStreams.empty()) {\n> > +             rawStreams[0].cfg->setStream(&unicam_[Unicam::Image]);\n> > +             unicam_[Unicam::Image].setExternal(true);\n> > +     }\n> > +\n> > +     ret = isp_[Isp::Input].dev()->setFormat(&unicamFormat);\n> > +     if (ret)\n> > +             return ret;\n> > +\n> > +     LOG(RPI, Info) << \"Sensor: \" << sensor_->id()\n> > +                    << \" - Selected sensor format: \" << sensorFormat\n> > +                    << \" - Selected unicam format: \" << unicamFormat;\n> > +\n> > +     /* Use a sensible small default size if no output streams are configured. */\n> > +     Size maxSize = outStreams.empty() ? Size(320, 240) : outStreams[0].cfg->size;\n> > +     V4L2DeviceFormat format;\n> > +\n> > +     for (unsigned int i = 0; i < outStreams.size(); i++) {\n> > +             StreamConfiguration *cfg = outStreams[i].cfg;\n> > +\n> > +             /* The largest resolution gets routed to the ISP Output 0 node. */\n> > +             RPi::Stream *stream = i == 0 ? 
&isp_[Isp::Output0] : &isp_[Isp::Output1];\n> > +\n> > +             V4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg->pixelFormat);\n> > +             format.size = cfg->size;\n> > +             format.fourcc = fourcc;\n> > +             format.colorSpace = cfg->colorSpace;\n> > +\n> > +             LOG(RPI, Debug) << \"Setting \" << stream->name() << \" to \"\n> > +                             << format;\n> > +\n> > +             ret = stream->dev()->setFormat(&format);\n> > +             if (ret)\n> > +                     return -EINVAL;\n> > +\n> > +             if (format.size != cfg->size || format.fourcc != fourcc) {\n> > +                     LOG(RPI, Error)\n> > +                             << \"Failed to set requested format on \" << stream->name()\n> > +                             << \", returned \" << format;\n> > +                     return -EINVAL;\n> > +             }\n> > +\n> > +             LOG(RPI, Debug)\n> > +                     << \"Stream \" << stream->name() << \" has color space \"\n> > +                     << ColorSpace::toString(cfg->colorSpace);\n> > +\n> > +             cfg->setStream(stream);\n> > +             stream->setExternal(true);\n> > +     }\n> > +\n> > +     ispOutputTotal_ = outStreams.size();\n> > +\n> > +     /*\n> > +      * If ISP::Output0 stream has not been configured by the application,\n> > +      * we must allow the hardware to generate an output so that the data\n> > +      * flow in the pipeline handler remains consistent, and we still generate\n> > +      * statistics for the IPA to use. 
So enable the output at a very low\n> > +      * resolution for internal use.\n> > +      *\n> > +      * \\todo Allow the pipeline to work correctly without Output0 and only\n> > +      * statistics coming from the hardware.\n> > +      */\n> > +     if (outStreams.empty()) {\n> > +             V4L2VideoDevice *dev = isp_[Isp::Output0].dev();\n> > +\n> > +             format = {};\n> > +             format.size = maxSize;\n> > +             format.fourcc = dev->toV4L2PixelFormat(formats::YUV420);\n> > +             /* No one asked for output, so the color space doesn't matter. */\n> > +             format.colorSpace = ColorSpace::Sycc;\n> > +             ret = dev->setFormat(&format);\n> > +             if (ret) {\n> > +                     LOG(RPI, Error)\n> > +                             << \"Failed to set default format on ISP Output0: \"\n> > +                             << ret;\n> > +                     return -EINVAL;\n> > +             }\n> > +\n> > +             ispOutputTotal_++;\n> > +\n> > +             LOG(RPI, Debug) << \"Defaulting ISP Output0 format to \"\n> > +                             << format;\n> > +     }\n> > +\n> > +     /*\n> > +      * If ISP::Output1 stream has not been requested by the application, we\n> > +      * set it up for internal use now. This second stream will be used for\n> > +      * fast colour denoise, and must be a quarter resolution of the ISP::Output0\n> > +      * stream. 
However, also limit the maximum size to 1200 pixels in the\n> > +      * larger dimension, just to avoid being wasteful with buffer allocations\n> > +      * and memory bandwidth.\n> > +      *\n> > +      * \\todo If Output 1 format is not YUV420, Output 1 ought to be disabled as\n> > +      * colour denoise will not run.\n> > +      */\n> > +     if (outStreams.size() == 1) {\n> > +             V4L2VideoDevice *dev = isp_[Isp::Output1].dev();\n> > +\n> > +             V4L2DeviceFormat output1Format;\n> > +             constexpr Size maxDimensions(1200, 1200);\n> > +             const Size limit = maxDimensions.boundedToAspectRatio(format.size);\n> > +\n> > +             output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2);\n> > +             output1Format.colorSpace = format.colorSpace;\n> > +             output1Format.fourcc = dev->toV4L2PixelFormat(formats::YUV420);\n> > +\n> > +             LOG(RPI, Debug) << \"Setting ISP Output1 (internal) to \"\n> > +                             << output1Format;\n> > +\n> > +             ret = dev->setFormat(&output1Format);\n> > +             if (ret) {\n> > +                     LOG(RPI, Error) << \"Failed to set format on ISP Output1: \"\n> > +                                     << ret;\n> > +                     return -EINVAL;\n> > +             }\n> > +\n> > +             ispOutputTotal_++;\n> > +     }\n> > +\n> > +     /* ISP statistics output format. 
*/\n> > +     format = {};\n> > +     format.fourcc = V4L2PixelFormat(V4L2_META_FMT_BCM2835_ISP_STATS);\n> > +     ret = isp_[Isp::Stats].dev()->setFormat(&format);\n> > +     if (ret) {\n> > +             LOG(RPI, Error) << \"Failed to set format on ISP stats stream: \"\n> > +                             << format;\n> > +             return ret;\n> > +     }\n> > +\n> > +     ispOutputTotal_++;\n> > +\n> > +     /*\n> > +      * Configure the Unicam embedded data output format only if the sensor\n> > +      * supports it.\n> > +      */\n> > +     if (sensorMetadata_) {\n> > +             V4L2SubdeviceFormat embeddedFormat;\n> > +\n> > +             sensor_->device()->getFormat(1, &embeddedFormat);\n> > +             format = {};\n> > +             format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> > +             format.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;\n> > +\n> > +             LOG(RPI, Debug) << \"Setting embedded data format \" << format.toString();\n> > +             ret = unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> > +             if (ret) {\n> > +                     LOG(RPI, Error) << \"Failed to set format on Unicam embedded: \"\n> > +                                     << format;\n> > +                     return ret;\n> > +             }\n> > +     }\n> > +\n> > +     /* Figure out the smallest selection the ISP will allow. */\n> > +     Rectangle testCrop(0, 0, 1, 1);\n> > +     isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop);\n> > +     ispMinCropSize_ = testCrop.size();\n> > +\n> > +     /* Adjust aspect ratio by providing crops on the input image. 
*/\n> > +     Size size = unicamFormat.size.boundedToAspectRatio(maxSize);\n> > +     ispCrop_ = size.centeredTo(Rectangle(unicamFormat.size).center());\n> > +\n> > +     platformIspCrop();\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +int Vc4CameraData::platformConfigureIpa(ipa::RPi::ConfigParams &params)\n> > +{\n> > +     params.ispControls = isp_[Isp::Input].dev()->controls();\n> > +\n> > +     /* Allocate the lens shading table via dmaHeap and pass to the IPA. */\n> > +     if (!lsTable_.isValid()) {\n> > +             lsTable_ = SharedFD(dmaHeap_.alloc(\"ls_grid\", ipa::RPi::MaxLsGridSize));\n> > +             if (!lsTable_.isValid())\n> > +                     return -ENOMEM;\n> > +\n> > +             /* Allow the IPA to mmap the LS table via the file descriptor. */\n> > +             /*\n> > +              * \\todo Investigate if mapping the lens shading table buffer\n> > +              * could be handled with mapBuffers().\n> > +              */\n> > +             params.lsTableHandle = lsTable_;\n> > +     }\n> > +\n> > +     return 0;\n> > +}\n> > +\n> > +void Vc4CameraData::platformStart()\n> > +{\n> > +}\n> > +\n> > +void Vc4CameraData::platformStop()\n> > +{\n> > +     bayerQueue_ = {};\n> > +     embeddedQueue_ = {};\n> > +}\n> > +\n> > +void Vc4CameraData::unicamBufferDequeue(FrameBuffer *buffer)\n> > +{\n> > +     RPi::Stream *stream = nullptr;\n> > +     unsigned int index;\n> > +\n> > +     if (!isRunning())\n> > +             return;\n> > +\n> > +     for (RPi::Stream &s : unicam_) {\n> > +             index = s.getBufferId(buffer);\n> > +             if (index) {\n> > +                     stream = &s;\n> > +                     break;\n> > +             }\n> > +     }\n> > +\n> > +     /* The buffer must belong to one of our streams. 
*/\n> > +     ASSERT(stream);\n> > +\n> > +     LOG(RPI, Debug) << \"Stream \" << stream->name() << \" buffer dequeue\"\n> > +                     << \", buffer id \" << index\n> > +                     << \", timestamp: \" << buffer->metadata().timestamp;\n> > +\n> > +     if (stream == &unicam_[Unicam::Image]) {\n> > +             /*\n> > +              * Lookup the sensor controls used for this frame sequence from\n> > +              * DelayedControl and queue them along with the frame buffer.\n> > +              */\n> > +             auto [ctrl, delayContext] = delayedCtrls_->get(buffer->metadata().sequence);\n> > +             /*\n> > +              * Add the frame timestamp to the ControlList for the IPA to use\n> > +              * as it does not receive the FrameBuffer object.\n> > +              */\n> > +             ctrl.set(controls::SensorTimestamp, buffer->metadata().timestamp);\n> > +             bayerQueue_.push({ buffer, std::move(ctrl), delayContext });\n> > +     } else {\n> > +             embeddedQueue_.push(buffer);\n> > +     }\n> > +\n> > +     handleState();\n> > +}\n> > +\n> > +void Vc4CameraData::ispInputDequeue(FrameBuffer *buffer)\n> > +{\n> > +     if (!isRunning())\n> > +             return;\n> > +\n> > +     LOG(RPI, Debug) << \"Stream ISP Input buffer complete\"\n> > +                     << \", buffer id \" << unicam_[Unicam::Image].getBufferId(buffer)\n> > +                     << \", timestamp: \" << buffer->metadata().timestamp;\n> > +\n> > +     /* The ISP input buffer gets re-queued into Unicam. 
*/\n> > +     handleStreamBuffer(buffer, &unicam_[Unicam::Image]);\n> > +     handleState();\n> > +}\n> > +\n> > +void Vc4CameraData::ispOutputDequeue(FrameBuffer *buffer)\n> > +{\n> > +     RPi::Stream *stream = nullptr;\n> > +     unsigned int index;\n> > +\n> > +     if (!isRunning())\n> > +             return;\n> > +\n> > +     for (RPi::Stream &s : isp_) {\n> > +             index = s.getBufferId(buffer);\n> > +             if (index) {\n> > +                     stream = &s;\n> > +                     break;\n> > +             }\n> > +     }\n> > +\n> > +     /* The buffer must belong to one of our ISP output streams. */\n> > +     ASSERT(stream);\n> > +\n> > +     LOG(RPI, Debug) << \"Stream \" << stream->name() << \" buffer complete\"\n> > +                     << \", buffer id \" << index\n> > +                     << \", timestamp: \" << buffer->metadata().timestamp;\n> > +\n> > +     /*\n> > +      * ISP statistics buffer must not be re-queued or sent back to the\n> > +      * application until after the IPA signals so.\n> > +      */\n> > +     if (stream == &isp_[Isp::Stats]) {\n> > +             ipa::RPi::ProcessParams params;\n> > +             params.buffers.stats = index | RPi::MaskStats;\n> > +             params.ipaContext = requestQueue_.front()->sequence();\n> > +             ipa_->processStats(params);\n> > +     } else {\n> > +             /* Any other ISP output can be handed back to the application now. 
*/\n> > +             handleStreamBuffer(buffer, stream);\n> > +     }\n> > +\n> > +     /*\n> > +      * Increment the number of ISP outputs generated.\n> > +      * This is needed to track dropped frames.\n> > +      */\n> > +     ispOutputCount_++;\n> > +\n> > +     handleState();\n> > +}\n> > +\n> > +void Vc4CameraData::processStatsComplete(const ipa::RPi::BufferIds &buffers)\n> > +{\n> > +     if (!isRunning())\n> > +             return;\n> > +\n> > +     FrameBuffer *buffer = isp_[Isp::Stats].getBuffers().at(buffers.stats & RPi::MaskID);\n> > +\n> > +     handleStreamBuffer(buffer, &isp_[Isp::Stats]);\n> > +\n> > +     /* Last thing to do is to fill up the request metadata. */\n> > +     Request *request = requestQueue_.front();\n> > +     ControlList metadata(controls::controls);\n> > +\n> > +     ipa_->reportMetadata(request->sequence(), &metadata);\n> > +     request->metadata().merge(metadata);\n> > +\n> > +     /*\n> > +      * Inform the sensor of the latest colour gains if it has the\n> > +      * V4L2_CID_NOTIFY_GAINS control (which means notifyGainsUnity_ is set).\n> > +      */\n> > +     const auto &colourGains = metadata.get(libcamera::controls::ColourGains);\n> > +     if (notifyGainsUnity_ && colourGains) {\n> > +             /* The control wants linear gains in the order B, Gb, Gr, R. 
*/\n> > +             ControlList ctrls(sensor_->controls());\n> > +             std::array<int32_t, 4> gains{\n> > +                     static_cast<int32_t>((*colourGains)[1] * *notifyGainsUnity_),\n> > +                     *notifyGainsUnity_,\n> > +                     *notifyGainsUnity_,\n> > +                     static_cast<int32_t>((*colourGains)[0] * *notifyGainsUnity_)\n> > +             };\n> > +             ctrls.set(V4L2_CID_NOTIFY_GAINS, Span<const int32_t>{ gains });\n> > +\n> > +             sensor_->setControls(&ctrls);\n> > +     }\n> > +\n> > +     state_ = State::IpaComplete;\n> > +     handleState();\n> > +}\n> > +\n> > +void Vc4CameraData::prepareIspComplete(const ipa::RPi::BufferIds &buffers)\n> > +{\n> > +     unsigned int embeddedId = buffers.embedded & RPi::MaskID;\n> > +     unsigned int bayer = buffers.bayer & RPi::MaskID;\n> > +     FrameBuffer *buffer;\n> > +\n> > +     if (!isRunning())\n> > +             return;\n> > +\n> > +     buffer = unicam_[Unicam::Image].getBuffers().at(bayer & RPi::MaskID);\n> > +     LOG(RPI, Debug) << \"Input re-queue to ISP, buffer id \" << (bayer & RPi::MaskID)\n> > +                     << \", timestamp: \" << buffer->metadata().timestamp;\n> > +\n> > +     isp_[Isp::Input].queueBuffer(buffer);\n> > +     ispOutputCount_ = 0;\n> > +\n> > +     if (sensorMetadata_ && embeddedId) {\n> > +             buffer = unicam_[Unicam::Embedded].getBuffers().at(embeddedId & RPi::MaskID);\n> > +             handleStreamBuffer(buffer, &unicam_[Unicam::Embedded]);\n> > +     }\n> > +\n> > +     handleState();\n> > +}\n> > +\n> > +void Vc4CameraData::setIspControls(const ControlList &controls)\n> > +{\n> > +     ControlList ctrls = controls;\n> > +\n> > +     if (ctrls.contains(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING)) {\n> > +             ControlValue &value =\n> > +                     const_cast<ControlValue &>(ctrls.get(V4L2_CID_USER_BCM2835_ISP_LENS_SHADING));\n> > +             Span<uint8_t> s = value.data();\n> > + 
            bcm2835_isp_lens_shading *ls =\n> > +                     reinterpret_cast<bcm2835_isp_lens_shading *>(s.data());\n> > +             ls->dmabuf = lsTable_.get();\n> > +     }\n> > +\n> > +     isp_[Isp::Input].dev()->setControls(&ctrls);\n> > +     handleState();\n> > +}\n> > +\n> > +void Vc4CameraData::setCameraTimeout(uint32_t maxFrameLengthMs)\n> > +{\n> > +     /*\n> > +      * Set the dequeue timeout to the larger of 5x the maximum reported\n> > +      * frame length advertised by the IPA over a number of frames. Allow\n> > +      * a minimum timeout value of 1s.\n> > +      */\n> > +     utils::Duration timeout =\n> > +             std::max<utils::Duration>(1s, 5 * maxFrameLengthMs * 1ms);\n> > +\n> > +     LOG(RPI, Debug) << \"Setting Unicam timeout to \" << timeout;\n> > +     unicam_[Unicam::Image].dev()->setDequeueTimeout(timeout);\n> > +}\n> > +\n> > +void Vc4CameraData::tryRunPipeline()\n> > +{\n> > +     FrameBuffer *embeddedBuffer;\n> > +     BayerFrame bayerFrame;\n> > +\n> > +     /* If any of our request or buffer queues are empty, we cannot proceed. */\n> > +     if (state_ != State::Idle || requestQueue_.empty() ||\n> > +         bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_))\n> > +             return;\n> > +\n> > +     if (!findMatchingBuffers(bayerFrame, embeddedBuffer))\n> > +             return;\n> > +\n> > +     /* Take the first request from the queue and action the IPA. */\n> > +     Request *request = requestQueue_.front();\n> > +\n> > +     /* See if a new ScalerCrop value needs to be applied. */\n> > +     calculateScalerCrop(request->controls());\n> > +\n> > +     /*\n> > +      * Clear the request metadata and fill it with some initial non-IPA\n> > +      * related controls. 
We clear it first because the request metadata\n> > +      * may have been populated if we have dropped the previous frame.\n> > +      */\n> > +     request->metadata().clear();\n> > +     fillRequestMetadata(bayerFrame.controls, request);\n> > +\n> > +     /* Set our state to say the pipeline is active. */\n> > +     state_ = State::Busy;\n> > +\n> > +     unsigned int bayer = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);\n> > +\n> > +     LOG(RPI, Debug) << \"Signalling prepareIsp:\"\n> > +                     << \" Bayer buffer id: \" << bayer;\n> > +\n> > +     ipa::RPi::PrepareParams params;\n> > +     params.buffers.bayer = RPi::MaskBayerData | bayer;\n> > +     params.sensorControls = std::move(bayerFrame.controls);\n> > +     params.requestControls = request->controls();\n> > +     params.ipaContext = request->sequence();\n> > +     params.delayContext = bayerFrame.delayContext;\n> > +\n> > +     if (embeddedBuffer) {\n> > +             unsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n> > +\n> > +             params.buffers.embedded = RPi::MaskEmbeddedData | embeddedId;\n> > +             LOG(RPI, Debug) << \"Signalling prepareIsp:\"\n> > +                             << \" Embedded buffer id: \" << embeddedId;\n> > +     }\n> > +\n> > +     ipa_->prepareIsp(params);\n> > +}\n> > +\n> > +bool Vc4CameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer)\n> > +{\n> > +     if (bayerQueue_.empty())\n> > +             return false;\n> > +\n> > +     /*\n> > +      * Find the embedded data buffer with a matching timestamp to pass to\n> > +      * the IPA. 
Any embedded buffers with a timestamp lower than the\n> > +      * current bayer buffer will be removed and re-queued to the driver.\n> > +      */\n> > +     uint64_t ts = bayerQueue_.front().buffer->metadata().timestamp;\n> > +     embeddedBuffer = nullptr;\n> > +     while (!embeddedQueue_.empty()) {\n> > +             FrameBuffer *b = embeddedQueue_.front();\n> > +             if (b->metadata().timestamp < ts) {\n> > +                     embeddedQueue_.pop();\n> > +                     unicam_[Unicam::Embedded].returnBuffer(b);\n> > +                     LOG(RPI, Debug) << \"Dropping unmatched input frame in stream \"\n> > +                                     << unicam_[Unicam::Embedded].name();\n> > +             } else if (b->metadata().timestamp == ts) {\n> > +                     /* Found a match! */\n> > +                     embeddedBuffer = b;\n> > +                     embeddedQueue_.pop();\n> > +                     break;\n> > +             } else {\n> > +                     break; /* Only higher timestamps from here. */\n> > +             }\n> > +     }\n> > +\n> > +     if (!embeddedBuffer && sensorMetadata_) {\n> > +             if (embeddedQueue_.empty()) {\n> > +                     /*\n> > +                      * If the embedded buffer queue is empty, wait for the next\n> > +                      * buffer to arrive - dequeue ordering may send the image\n> > +                      * buffer first.\n> > +                      */\n> > +                     LOG(RPI, Debug) << \"Waiting for next embedded buffer.\";\n> > +                     return false;\n> > +             }\n> > +\n> > +             /* Log if there is no matching embedded data buffer found. 
*/\n> > +             LOG(RPI, Debug) << \"Returning bayer frame without a matching embedded buffer.\";\n> > +     }\n> > +\n> > +     bayerFrame = std::move(bayerQueue_.front());\n> > +     bayerQueue_.pop();\n> > +\n> > +     return true;\n> > +}\n> > +\n> > +REGISTER_PIPELINE_HANDLER(PipelineHandlerVc4)\n> > +\n> > +} /* namespace libcamera */\n>\n> --\n> Regards,\n>\n> Laurent Pinchart","headers":{"MIME-Version":"1.0","References":"<20230426131057.21550-1-naush@raspberrypi.com>\n\t<20230426131057.21550-12-naush@raspberrypi.com>\n\t<20230427164014.GJ28489@pendragon.ideasonboard.com>","In-Reply-To":"<20230427164014.GJ28489@pendragon.ideasonboard.com>","Date":"Fri, 28 Apr 2023 10:03:11 +0100","Message-ID":"<CAEmqJPqf3x_SDtgzr=a3yWzqqs4MjgtSUOdpNnqJ6Cpz=cNeKQ@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH 11/13] pipeline: raspberrypi:\n\tIntroduce PipelineHandlerBase class","From":"Naushir Patuck via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Naushir Patuck <naush@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org"}}]
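The timestamp-matching loop in `Vc4CameraData::findMatchingBuffers()` above can be reduced to a standalone sketch: embedded-data buffers older than the current bayer frame are dropped (and would be returned to the driver), an equal timestamp is a match, and the loop stops early once only newer timestamps remain. The helper name `matchEmbedded` and the use of bare `uint64_t` timestamps in place of `FrameBuffer` metadata are illustrative assumptions, not part of the patch:

```cpp
#include <cassert>
#include <cstdint>
#include <optional>
#include <queue>

/*
 * Simplified model of the embedded-data matching in findMatchingBuffers():
 * pop stale buffers (timestamp < bayer), match on equality, stop as soon
 * as a newer timestamp is seen. Returns the matched timestamp, if any.
 */
std::optional<uint64_t> matchEmbedded(std::queue<uint64_t> &embeddedQueue,
				      uint64_t bayerTimestamp)
{
	while (!embeddedQueue.empty()) {
		uint64_t ts = embeddedQueue.front();
		if (ts < bayerTimestamp) {
			/* Stale: the real code re-queues this buffer to the driver. */
			embeddedQueue.pop();
		} else if (ts == bayerTimestamp) {
			embeddedQueue.pop();
			return ts; /* Found a match. */
		} else {
			break; /* Only higher timestamps from here on. */
		}
	}
	return std::nullopt;
}
```

If no match is found and the queue is empty, the caller waits for the next embedded buffer, since dequeue ordering may deliver the image buffer first.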
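Similarly, the internal Output1 sizing in `platformConfigure()` (half the Output0 resolution for fast colour denoise, capped at 1200 pixels in the larger dimension while preserving aspect ratio, aligned down to even dimensions) can be sketched without libcamera's `Size` class. The `Size` struct and `output1Size` helper below are stand-ins modelling `boundedToAspectRatio()`, `boundedTo()` and `alignedDownTo(2, 2)` under stated assumptions, not the library implementation:

```cpp
#include <algorithm>

/* Minimal stand-in for libcamera::Size, for illustration only. */
struct Size {
	unsigned int width;
	unsigned int height;
};

/*
 * Derive the internal ISP Output1 size from the Output0 size, mirroring
 * (size / 2).boundedTo(maxDimensions.boundedToAspectRatio(size))
 *	      .alignedDownTo(2, 2)
 * with maxDimensions = 1200x1200.
 */
Size output1Size(Size output0)
{
	/* Bound a 1200x1200 box to the aspect ratio of the Output0 size. */
	Size limit;
	if (output0.width >= output0.height) {
		limit.width = 1200;
		limit.height = 1200 * output0.height / output0.width;
	} else {
		limit.height = 1200;
		limit.width = 1200 * output0.width / output0.height;
	}

	/* Halve, clamp to the limit, then align each dimension down to even. */
	Size out{ output0.width / 2, output0.height / 2 };
	out.width = std::min(out.width, limit.width) & ~1u;
	out.height = std::min(out.height, limit.height) & ~1u;
	return out;
}
```

For a 1080p Output0 the quarter-resolution stream fits under the cap unchanged (960x540), while a full-resolution 4056x3040 sensor mode is clamped by the 1200-pixel limit.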