{"id":14384,"url":"https://patchwork.libcamera.org/api/1.1/patches/14384/?format=json","web_url":"https://patchwork.libcamera.org/patch/14384/","project":{"id":1,"url":"https://patchwork.libcamera.org/api/1.1/projects/1/?format=json","name":"libcamera","link_name":"libcamera","list_id":"libcamera_core","list_email":"libcamera-devel@lists.libcamera.org","web_url":"","scm_url":"","webscm_url":""},"msgid":"<20211028084646.453775-9-naush@raspberrypi.com>","date":"2021-10-28T08:46:44","name":"[libcamera-devel,v4,08/10] pipeline: raspberrypi: Convert the pipeline handler to use media controller","commit_ref":null,"pull_url":null,"state":"superseded","archived":false,"hash":"fcec838a36e28ba4e37b80178c007972f23ee932","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/1.1/people/34/?format=json","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"delegate":null,"mbox":"https://patchwork.libcamera.org/patch/14384/mbox/","series":[{"id":2671,"url":"https://patchwork.libcamera.org/api/1.1/series/2671/?format=json","web_url":"https://patchwork.libcamera.org/project/libcamera/list/?series=2671","date":"2021-10-28T08:46:36","name":"Raspberry Pi: Conversion to media controller","version":4,"mbox":"https://patchwork.libcamera.org/series/2671/mbox/"}],"comments":"https://patchwork.libcamera.org/api/patches/14384/comments/","check":"pending","checks":"https://patchwork.libcamera.org/api/patches/14384/checks/","tags":{},"headers":{"From":"Naushir Patuck <naush@raspberrypi.com>","To":"libcamera-devel@lists.libcamera.org","Date":"Thu, 28 Oct 2021 09:46:44 +0100","Message-Id":"<20211028084646.453775-9-naush@raspberrypi.com>","X-Mailer":"git-send-email 2.25.1","In-Reply-To":"<20211028084646.453775-1-naush@raspberrypi.com>","References":"<20211028084646.453775-1-naush@raspberrypi.com>","MIME-Version":"1.0","Content-Transfer-Encoding":"8bit","Subject":"[libcamera-devel] [PATCH v4 08/10] pipeline: raspberrypi: Convert\n\tthe pipeline handler to use media 
controller"},"content":"Switch the pipeline handler to use the new Unicam media-controller-based driver.\nWith this change, the pipeline handler talks directly to the sensor device driver\nto set controls and to set and get formats.\n\nThis change requires the accompanying Raspberry Pi Linux kernel change at\nhttps://github.com/raspberrypi/linux/pull/4645. 
If this kernel change is not\npresent, the pipeline handler will fail to run with an error message informing\nthe user to update the kernel build.\n\nSigned-off-by: Naushir Patuck <naush@raspberrypi.com>\n---\n .../pipeline/raspberrypi/raspberrypi.cpp      | 167 ++++++++++++------\n 1 file changed, 112 insertions(+), 55 deletions(-)","diff":"diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\nindex e078b8b9a811..8e3e96a4359f 100644\n--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n@@ -26,6 +26,7 @@\n #include <libcamera/base/utils.h>\n \n #include <linux/bcm2835-isp.h>\n+#include <linux/media-bus-format.h>\n #include <linux/videodev2.h>\n \n #include \"libcamera/internal/bayer_format.h\"\n@@ -48,6 +49,53 @@ LOG_DEFINE_CATEGORY(RPI)\n \n namespace {\n \n+/* Map of mbus codes to supported sizes reported by the sensor. */\n+using SensorFormats = std::map<unsigned int, std::vector<Size>>;\n+\n+SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)\n+{\n+\tSensorFormats formats;\n+\n+\tfor (auto const mbusCode : sensor->mbusCodes())\n+\t\tformats.emplace(mbusCode, sensor->sizes(mbusCode));\n+\n+\treturn formats;\n+}\n+\n+PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code,\n+\t\t\t\t  BayerFormat::Packing packingReq)\n+{\n+\tBayerFormat bayer = BayerFormat::fromMbusCode(mbus_code);\n+\n+\tASSERT(bayer.isValid());\n+\n+\tbayer.packing = packingReq;\n+\tPixelFormat pix = bayer.toPixelFormat();\n+\n+\t/*\n+\t * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed\n+\t * variants. 
So if the PixelFormat returns as invalid, use the non-packed\n+\t * conversion instead.\n+\t */\n+\tif (!pix.isValid()) {\n+\t\tbayer.packing = BayerFormat::Packing::None;\n+\t\tpix = bayer.toPixelFormat();\n+\t}\n+\n+\treturn pix;\n+}\n+\n+V4L2DeviceFormat toV4L2DeviceFormat(const V4L2SubdeviceFormat &format,\n+\t\t\t\t    BayerFormat::Packing packingReq)\n+{\n+\tconst PixelFormat pix = mbusCodeToPixelFormat(format.mbus_code, packingReq);\n+\tV4L2DeviceFormat deviceFormat;\n+\n+\tdeviceFormat.fourcc = V4L2PixelFormat::fromPixelFormat(pix);\n+\tdeviceFormat.size = format.size;\n+\treturn deviceFormat;\n+}\n+\n bool isRaw(PixelFormat &pixFmt)\n {\n \t/*\n@@ -74,11 +122,10 @@ double scoreFormat(double desired, double actual)\n \treturn score;\n }\n \n-V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,\n-\t\t\t      const Size &req)\n+V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req)\n {\n \tdouble bestScore = std::numeric_limits<double>::max(), score;\n-\tV4L2DeviceFormat bestMode;\n+\tV4L2SubdeviceFormat bestFormat;\n \n #define PENALTY_AR\t\t1500.0\n #define PENALTY_8BIT\t\t2000.0\n@@ -88,19 +135,19 @@ V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,\n \n \t/* Calculate the closest/best mode from the user requested size. */\n \tfor (const auto &iter : formatsMap) {\n-\t\tV4L2PixelFormat v4l2Format = iter.first;\n-\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(v4l2Format);\n+\t\tconst unsigned int mbusCode = iter.first;\n+\t\tconst PixelFormat format = mbusCodeToPixelFormat(mbusCode,\n+\t\t\t\t\t\t\t\t BayerFormat::Packing::None);\n+\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(format);\n \n-\t\tfor (const SizeRange &sz : iter.second) {\n-\t\t\tdouble modeWidth = sz.contains(req) ? req.width : sz.max.width;\n-\t\t\tdouble modeHeight = sz.contains(req) ? 
req.height : sz.max.height;\n+\t\tfor (const Size &size : iter.second) {\n \t\t\tdouble reqAr = static_cast<double>(req.width) / req.height;\n-\t\t\tdouble modeAr = modeWidth / modeHeight;\n+\t\t\tdouble fmtAr = size.width / size.height;\n \n \t\t\t/* Score the dimensions for closeness. */\n-\t\t\tscore = scoreFormat(req.width, modeWidth);\n-\t\t\tscore += scoreFormat(req.height, modeHeight);\n-\t\t\tscore += PENALTY_AR * scoreFormat(reqAr, modeAr);\n+\t\t\tscore = scoreFormat(req.width, size.width);\n+\t\t\tscore += scoreFormat(req.height, size.height);\n+\t\t\tscore += PENALTY_AR * scoreFormat(reqAr, fmtAr);\n \n \t\t\t/* Add any penalties... this is not an exact science! */\n \t\t\tif (!info.packed)\n@@ -115,18 +162,18 @@ V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,\n \n \t\t\tif (score <= bestScore) {\n \t\t\t\tbestScore = score;\n-\t\t\t\tbestMode.fourcc = v4l2Format;\n-\t\t\t\tbestMode.size = Size(modeWidth, modeHeight);\n+\t\t\t\tbestFormat.mbus_code = mbusCode;\n+\t\t\t\tbestFormat.size = size;\n \t\t\t}\n \n-\t\t\tLOG(RPI, Info) << \"Mode: \" << modeWidth << \"x\" << modeHeight\n-\t\t\t\t       << \" fmt \" << v4l2Format.toString()\n+\t\t\tLOG(RPI, Info) << \"Format: \" << size.toString()\n+\t\t\t\t       << \" fmt \" << format.toString()\n \t\t\t\t       << \" Score: \" << score\n \t\t\t\t       << \" (best \" << bestScore << \")\";\n \t\t}\n \t}\n \n-\treturn bestMode;\n+\treturn bestFormat;\n }\n \n enum class Unicam : unsigned int { Image, Embedded };\n@@ -170,6 +217,7 @@ public:\n \tstd::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;\n \n \tstd::unique_ptr<CameraSensor> sensor_;\n+\tSensorFormats sensorFormats_;\n \t/* Array of Unicam and ISP device streams and associated buffers/streams. 
*/\n \tRPi::Device<Unicam, 2> unicam_;\n \tRPi::Device<Isp, 4> isp_;\n@@ -352,9 +400,10 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n \t\t\t * Calculate the best sensor mode we can use based on\n \t\t\t * the user request.\n \t\t\t */\n-\t\t\tV4L2VideoDevice::Formats fmts = data_->unicam_[Unicam::Image].dev()->formats();\n-\t\t\tV4L2DeviceFormat sensorFormat = findBestMode(fmts, cfg.size);\n-\t\t\tint ret = data_->unicam_[Unicam::Image].dev()->tryFormat(&sensorFormat);\n+\t\t\tV4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size);\n+\t\t\tV4L2DeviceFormat unicamFormat = toV4L2DeviceFormat(sensorFormat,\n+\t\t\t\t\t\t\t\t\t   BayerFormat::Packing::CSI2);\n+\t\t\tint ret = data_->unicam_[Unicam::Image].dev()->tryFormat(&unicamFormat);\n \t\t\tif (ret)\n \t\t\t\treturn Invalid;\n \n@@ -366,7 +415,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n \t\t\t * fetch the \"native\" (i.e. untransformed) Bayer order,\n \t\t\t * because the sensor may currently be flipped!\n \t\t\t */\n-\t\t\tV4L2PixelFormat fourcc = sensorFormat.fourcc;\n+\t\t\tV4L2PixelFormat fourcc = unicamFormat.fourcc;\n \t\t\tif (data_->flipsAlterBayerOrder_) {\n \t\t\t\tBayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);\n \t\t\t\tbayer.order = data_->nativeBayerOrder_;\n@@ -375,15 +424,15 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n \t\t\t}\n \n \t\t\tPixelFormat sensorPixFormat = fourcc.toPixelFormat();\n-\t\t\tif (cfg.size != sensorFormat.size ||\n+\t\t\tif (cfg.size != unicamFormat.size ||\n \t\t\t    cfg.pixelFormat != sensorPixFormat) {\n-\t\t\t\tcfg.size = sensorFormat.size;\n+\t\t\t\tcfg.size = unicamFormat.size;\n \t\t\t\tcfg.pixelFormat = sensorPixFormat;\n \t\t\t\tstatus = Adjusted;\n \t\t\t}\n \n-\t\t\tcfg.stride = sensorFormat.planes[0].bpl;\n-\t\t\tcfg.frameSize = sensorFormat.planes[0].size;\n+\t\t\tcfg.stride = unicamFormat.planes[0].bpl;\n+\t\t\tcfg.frameSize = 
unicamFormat.planes[0].size;\n \n \t\t\trawCount++;\n \t\t} else {\n@@ -472,7 +521,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n {\n \tRPiCameraData *data = cameraData(camera);\n \tCameraConfiguration *config = new RPiCameraConfiguration(data);\n-\tV4L2DeviceFormat sensorFormat;\n+\tV4L2SubdeviceFormat sensorFormat;\n \tunsigned int bufferCount;\n \tPixelFormat pixelFormat;\n \tV4L2VideoDevice::Formats fmts;\n@@ -487,9 +536,9 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n \t\tswitch (role) {\n \t\tcase StreamRole::Raw:\n \t\t\tsize = data->sensor_->resolution();\n-\t\t\tfmts = data->unicam_[Unicam::Image].dev()->formats();\n-\t\t\tsensorFormat = findBestMode(fmts, size);\n-\t\t\tpixelFormat = sensorFormat.fourcc.toPixelFormat();\n+\t\t\tsensorFormat = findBestFormat(data->sensorFormats_, size);\n+\t\t\tpixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code,\n+\t\t\t\t\t\t\t    BayerFormat::Packing::CSI2);\n \t\t\tASSERT(pixelFormat.isValid());\n \t\t\tbufferCount = 2;\n \t\t\trawCount++;\n@@ -572,6 +621,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \tfor (auto const stream : data->streams_)\n \t\tstream->reset();\n \n+\tBayerFormat::Packing packing = BayerFormat::Packing::CSI2;\n \tSize maxSize, sensorSize;\n \tunsigned int maxIndex = 0;\n \tbool rawStream = false;\n@@ -590,6 +640,8 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \t\t\t */\n \t\t\tsensorSize = cfg.size;\n \t\t\trawStream = true;\n+\t\t\t/* Check if the user has explicitly set an unpacked format. 
*/\n+\t\t\tpacking = BayerFormat::fromPixelFormat(cfg.pixelFormat).packing;\n \t\t} else {\n \t\t\tif (cfg.size > maxSize) {\n \t\t\t\tmaxSize = config->at(i).size;\n@@ -615,24 +667,21 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \t}\n \n \t/* First calculate the best sensor mode we can use based on the user request. */\n-\tV4L2VideoDevice::Formats fmts = data->unicam_[Unicam::Image].dev()->formats();\n-\tV4L2DeviceFormat sensorFormat = findBestMode(fmts, rawStream ? sensorSize : maxSize);\n-\n-\tret = data->unicam_[Unicam::Image].dev()->setFormat(&sensorFormat);\n+\tV4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_, rawStream ? sensorSize : maxSize);\n+\tret = data->sensor_->setFormat(&sensorFormat);\n \tif (ret)\n \t\treturn ret;\n \n-\t/*\n-\t * The control ranges associated with the sensor may need updating\n-\t * after a format change.\n-\t * \\todo Use the CameraSensor::setFormat API instead.\n-\t */\n-\tdata->sensor_->updateControlInfo();\n+\tV4L2DeviceFormat unicamFormat = toV4L2DeviceFormat(sensorFormat, packing);\n+\tret = data->unicam_[Unicam::Image].dev()->setFormat(&unicamFormat);\n+\tif (ret)\n+\t\treturn ret;\n \n \tLOG(RPI, Info) << \"Sensor: \" << camera->id()\n-\t\t       << \" - Selected mode: \" << sensorFormat.toString();\n+\t\t       << \" - Selected sensor format: \" << sensorFormat.toString()\n+\t\t       << \" - Selected unicam format: \" << unicamFormat.toString();\n \n-\tret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);\n+\tret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat);\n \tif (ret)\n \t\treturn ret;\n \n@@ -754,8 +803,8 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \tdata->ispMinCropSize_ = testCrop.size();\n \n \t/* Adjust aspect ratio by providing crops on the input image. 
*/\n-\tSize size = sensorFormat.size.boundedToAspectRatio(maxSize);\n-\tRectangle crop = size.centeredTo(Rectangle(sensorFormat.size).center());\n+\tSize size = unicamFormat.size.boundedToAspectRatio(maxSize);\n+\tRectangle crop = size.centeredTo(Rectangle(unicamFormat.size).center());\n \tdata->ispCrop_ = crop;\n \n \tdata->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop);\n@@ -769,8 +818,11 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \t * supports it.\n \t */\n \tif (data->sensorMetadata_) {\n-\t\tformat = {};\n+\t\tV4L2SubdeviceFormat embeddedFormat;\n+\n+\t\tdata->sensor_->device()->getFormat(1, &embeddedFormat);\n \t\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n+\t\tformat.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;\n \n \t\tLOG(RPI, Debug) << \"Setting embedded data format.\";\n \t\tret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);\n@@ -1000,6 +1052,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n \tif (data->sensor_->init())\n \t\treturn false;\n \n+\tdata->sensorFormats_ = populateSensorFormats(data->sensor_);\n+\n \tipa::RPi::SensorConfig sensorConfig;\n \tif (data->loadIPA(&sensorConfig)) {\n \t\tLOG(RPI, Error) << \"Failed to load a suitable IPA library\";\n@@ -1026,6 +1080,11 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n \t\t\treturn false;\n \t}\n \n+\tif (!data->unicam_[Unicam::Image].dev()->caps().hasMediaController()) {\n+\t\tLOG(RPI, Error) << \"Unicam driver does not use the MediaController, please update your kernel!\";\n+\t\treturn false;\n+\t}\n+\n \t/*\n \t * Setup our delayed control writer with the sensor default\n \t * gain and exposure delays. 
Mark VBLANK for priority write.\n@@ -1035,7 +1094,7 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n \t\t{ V4L2_CID_EXPOSURE, { sensorConfig.exposureDelay, false } },\n \t\t{ V4L2_CID_VBLANK, { sensorConfig.vblankDelay, true } }\n \t};\n-\tdata->delayedCtrls_ = std::make_unique<DelayedControls>(data->unicam_[Unicam::Image].dev(), params);\n+\tdata->delayedCtrls_ = std::make_unique<DelayedControls>(data->sensor_->device(), params);\n \tdata->sensorMetadata_ = sensorConfig.sensorMetadata;\n \n \t/* Register the controls that the Raspberry Pi IPA can handle. */\n@@ -1062,15 +1121,14 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n \t * As part of answering the final question, we reset the camera to\n \t * no transform at all.\n \t */\n-\n-\tV4L2VideoDevice *dev = data->unicam_[Unicam::Image].dev();\n-\tconst struct v4l2_query_ext_ctrl *hflipCtrl = dev->controlInfo(V4L2_CID_HFLIP);\n+\tconst V4L2Subdevice *sensor = data->sensor_->device();\n+\tconst struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);\n \tif (hflipCtrl) {\n \t\t/* We assume it will support vflips too... */\n \t\tdata->supportsFlips_ = true;\n \t\tdata->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;\n \n-\t\tControlList ctrls(dev->controls());\n+\t\tControlList ctrls(data->sensor_->controls());\n \t\tctrls.set(V4L2_CID_HFLIP, 0);\n \t\tctrls.set(V4L2_CID_VFLIP, 0);\n \t\tdata->setSensorControls(ctrls);\n@@ -1078,9 +1136,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n \n \t/* Look for a valid Bayer format. 
*/\n \tBayerFormat bayerFormat;\n-\tfor (const auto &iter : dev->formats()) {\n-\t\tV4L2PixelFormat v4l2Format = iter.first;\n-\t\tbayerFormat = BayerFormat::fromV4L2PixelFormat(v4l2Format);\n+\tfor (const auto &iter : data->sensorFormats_) {\n+\t\tbayerFormat = BayerFormat::fromMbusCode(iter.first);\n \t\tif (bayerFormat.isValid())\n \t\t\tbreak;\n \t}\n@@ -1263,7 +1320,7 @@ int RPiCameraData::configureIPA(const CameraConfiguration *config)\n \t\t}\n \t}\n \n-\tentityControls.emplace(0, unicam_[Unicam::Image].dev()->controls());\n+\tentityControls.emplace(0, sensor_->controls());\n \tentityControls.emplace(1, isp_[Isp::Input].dev()->controls());\n \n \t/* Always send the user transform to the IPA. */\n@@ -1387,10 +1444,10 @@ void RPiCameraData::setSensorControls(ControlList &controls)\n \t\tControlList vblank_ctrl;\n \n \t\tvblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));\n-\t\tunicam_[Unicam::Image].dev()->setControls(&vblank_ctrl);\n+\t\tsensor_->setControls(&vblank_ctrl);\n \t}\n \n-\tunicam_[Unicam::Image].dev()->setControls(&controls);\n+\tsensor_->setControls(&controls);\n }\n \n void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)\n","prefixes":["libcamera-devel","v4","08/10"]}