[{"id":20383,"web_url":"https://patchwork.libcamera.org/comment/20383/","msgid":"<CAHW6GYLAEJwoTQKH=7pwExFMjuxAitzcwfGybsb8hn5Y4hC8Zg@mail.gmail.com>","date":"2021-10-22T13:09:39","subject":"Re: [libcamera-devel] [PATCH 2/6] pipeline: raspberrypi: Convert\n\tthe pipeline handler to use media controller","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Naush\n\nThanks for this patch!\n\nOn Fri, 22 Oct 2021 at 12:55, Naushir Patuck <naush@raspberrypi.com> wrote:\n>\n> Switch the pipeline handler to use the new Unicam media controller based driver.\n> With this change, we directly talk to the sensor device driver to set controls\n> and set/get formats in the pipeline handler.\n>\n> This change requires the accompanying Raspberry Pi linux kernel change at\n> https://github.com/raspberrypi/linux/pull/4645. If this kernel change is not\n> present, the pipeline handler will fail to run with an error message informing\n> the user to update the kernel build.\n>\n> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>\n> ---\n>  .../pipeline/raspberrypi/raspberrypi.cpp      | 113 +++++++++++-------\n>  1 file changed, 67 insertions(+), 46 deletions(-)\n>\n> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> index 1634ca98f481..730f1575095c 100644\n> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> @@ -48,6 +48,19 @@ LOG_DEFINE_CATEGORY(RPI)\n>\n>  namespace {\n>\n> +/* Map of mbus codes to supported sizes reported by the sensor. 
*/\n> +using SensorFormats = std::map<unsigned int, std::vector<Size>>;\n> +\n> +SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)\n> +{\n> +       SensorFormats formats;\n> +\n> +       for (auto const mbusCode : sensor->mbusCodes())\n> +               formats.emplace(mbusCode, sensor->sizes(mbusCode));\n> +\n> +       return formats;\n> +}\n> +\n>  bool isRaw(PixelFormat &pixFmt)\n>  {\n>         /*\n> @@ -74,8 +87,7 @@ double scoreFormat(double desired, double actual)\n>         return score;\n>  }\n>\n> -V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,\n> -                             const Size &req)\n> +V4L2DeviceFormat findBestMode(const SensorFormats &formatsMap, const Size &req)\n\nI wonder whether this function should return something more like an\nmbus code and a size. The packing is not really relevant to what comes\nout of the sensor and should, I suspect, actually be determined by any\nraw stream that has been requested by the application. This all starts\nto impinge a bit on the sensor-mode-hint discussions I've been having.\n\nOne thing I wonder at this point is how an application could express a\nwish for packed or unpacked raw streams. I think generateConfiguration\nshould probably fill in whatever it wants, probably \"packed\" as that's\nmore efficient. Then the application would have to be able to change\nthis to \"unpacked\" - which we could make less annoying by making the\nBayerFormat publicly available.\n\n>  {\n>         double bestScore = std::numeric_limits<double>::max(), score;\n>         V4L2DeviceFormat bestMode;\n> @@ -88,18 +100,17 @@ V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,\n>\n>         /* Calculate the closest/best mode from the user requested size. 
*/\n>         for (const auto &iter : formatsMap) {\n> -               V4L2PixelFormat v4l2Format = iter.first;\n> +               const unsigned int mbus_code = iter.first;\n> +               const V4L2PixelFormat v4l2Format = BayerFormat::fromMbusCode(mbus_code).toV4L2PixelFormat();\n>                 const PixelFormatInfo &info = PixelFormatInfo::info(v4l2Format);\n>\n> -               for (const SizeRange &sz : iter.second) {\n> -                       double modeWidth = sz.contains(req) ? req.width : sz.max.width;\n> -                       double modeHeight = sz.contains(req) ? req.height : sz.max.height;\n> +               for (const Size &sz : iter.second) {\n>                         double reqAr = static_cast<double>(req.width) / req.height;\n> -                       double modeAr = modeWidth / modeHeight;\n> +                       double modeAr = sz.width / sz.height;\n>\n>                         /* Score the dimensions for closeness. */\n> -                       score = scoreFormat(req.width, modeWidth);\n> -                       score += scoreFormat(req.height, modeHeight);\n> +                       score = scoreFormat(req.width, sz.width);\n> +                       score += scoreFormat(req.height, sz.height);\n>                         score += PENALTY_AR * scoreFormat(reqAr, modeAr);\n>\n>                         /* Add any penalties... this is not an exact science! 
*/\n> @@ -116,10 +127,10 @@ V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,\n>                         if (score <= bestScore) {\n>                                 bestScore = score;\n>                                 bestMode.fourcc = v4l2Format;\n> -                               bestMode.size = Size(modeWidth, modeHeight);\n> +                               bestMode.size = sz;\n>                         }\n>\n> -                       LOG(RPI, Info) << \"Mode: \" << modeWidth << \"x\" << modeHeight\n> +                       LOG(RPI, Info) << \"Mode: \" << sz.width << \"x\" << sz.height\n>                                        << \" fmt \" << v4l2Format.toString()\n>                                        << \" Score: \" << score\n>                                        << \" (best \" << bestScore << \")\";\n> @@ -170,6 +181,7 @@ public:\n>         std::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;\n>\n>         std::unique_ptr<CameraSensor> sensor_;\n> +       SensorFormats sensorFormats_;\n>         /* Array of Unicam and ISP device streams and associated buffers/streams. 
*/\n>         RPi::Device<Unicam, 2> unicam_;\n>         RPi::Device<Isp, 4> isp_;\n> @@ -352,8 +364,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n>                          * Calculate the best sensor mode we can use based on\n>                          * the user request.\n>                          */\n> -                       V4L2VideoDevice::Formats fmts = data_->unicam_[Unicam::Image].dev()->formats();\n> -                       V4L2DeviceFormat sensorFormat = findBestMode(fmts, cfg.size);\n> +                       V4L2DeviceFormat sensorFormat = findBestMode(data_->sensorFormats_, cfg.size);\n>                         int ret = data_->unicam_[Unicam::Image].dev()->tryFormat(&sensorFormat);\n>                         if (ret)\n>                                 return Invalid;\n> @@ -487,8 +498,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n>                 switch (role) {\n>                 case StreamRole::Raw:\n>                         size = data->sensor_->resolution();\n> -                       fmts = data->unicam_[Unicam::Image].dev()->formats();\n> -                       sensorFormat = findBestMode(fmts, size);\n> +                       sensorFormat = findBestMode(data->sensorFormats_, size);\n>                         pixelFormat = sensorFormat.fourcc.toPixelFormat();\n>                         ASSERT(pixelFormat.isValid());\n>                         bufferCount = 2;\n> @@ -599,32 +609,32 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>         }\n>\n>         /* First calculate the best sensor mode we can use based on the user request. */\n> -       V4L2VideoDevice::Formats fmts = data->unicam_[Unicam::Image].dev()->formats();\n> -       V4L2DeviceFormat sensorFormat = findBestMode(fmts, rawStream ? sensorSize : maxSize);\n> +       V4L2DeviceFormat unicamFormat = findBestMode(data->sensorFormats_, rawStream ? 
sensorSize : maxSize);\n> +\n> +       unsigned int mbus_code = BayerFormat::fromV4L2PixelFormat(unicamFormat.fourcc).toMbusCode();\n> +       V4L2SubdeviceFormat sensorFormat { .mbus_code = mbus_code, .size = unicamFormat.size };\n> +\n> +       ret = data->sensor_->setFormat(&sensorFormat);\n> +       if (ret)\n> +               return ret;\n\nAs discussed above, the unicamFormat probably wants to take account of\nthe packing (or not) in any raw stream (if present).\n\n>\n>         /*\n>          * Unicam image output format. The ISP input format gets set at start,\n>          * just in case we have swapped bayer orders due to flips.\n>          */\n> -       ret = data->unicam_[Unicam::Image].dev()->setFormat(&sensorFormat);\n> +       ret = data->unicam_[Unicam::Image].dev()->setFormat(&unicamFormat);\n>         if (ret)\n>                 return ret;\n>\n> -       /*\n> -        * The control ranges associated with the sensor may need updating\n> -        * after a format change.\n> -        * \\todo Use the CameraSensor::setFormat API instead.\n> -        */\n> -       data->sensor_->updateControlInfo();\n\nHmm, nice, I wonder if we could delete updateControlInfo entirely?\n\n> -\n>         LOG(RPI, Info) << \"Sensor: \" << camera->id()\n> -                      << \" - Selected mode: \" << sensorFormat.toString();\n> +                      << \" - Selected sensor mode: \" << sensorFormat.toString()\n> +                      << \" - Selected unicam mode: \" << unicamFormat.toString();\n>\n>         /*\n>          * This format may be reset on start() if the bayer order has changed\n>          * because of flips in the sensor.\n>          */\n> -       ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);\n> +       ret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat);\n>         if (ret)\n>                 return ret;\n>\n> @@ -746,8 +756,8 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>         
data->ispMinCropSize_ = testCrop.size();\n>\n>         /* Adjust aspect ratio by providing crops on the input image. */\n> -       Size size = sensorFormat.size.boundedToAspectRatio(maxSize);\n> -       Rectangle crop = size.centeredTo(Rectangle(sensorFormat.size).center());\n> +       Size size = unicamFormat.size.boundedToAspectRatio(maxSize);\n> +       Rectangle crop = size.centeredTo(Rectangle(unicamFormat.size).center());\n>         data->ispCrop_ = crop;\n>\n>         data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &crop);\n> @@ -761,10 +771,11 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>          * supports it.\n>          */\n>         if (data->sensorMetadata_) {\n> -               format = {};\n> -               format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> +               V4L2SubdeviceFormat embeddedFormat;\n>\n> -               LOG(RPI, Debug) << \"Setting embedded data format.\";\n> +               data->sensor_->device()->getFormat(1, &embeddedFormat);\n> +               format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> +               format.planes[0].size = embeddedFormat.size.width * embeddedFormat.size.height;\n>                 ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);\n>                 if (ret) {\n>                         LOG(RPI, Error) << \"Failed to set format on Unicam embedded: \"\n> @@ -847,9 +858,14 @@ int PipelineHandlerRPi::start(Camera *camera, const ControlList *controls)\n>          * IPA configure may have changed the sensor flips - hence the bayer\n>          * order. 
Get the sensor format and set the ISP input now.\n>          */\n> -       V4L2DeviceFormat sensorFormat;\n> -       data->unicam_[Unicam::Image].dev()->getFormat(&sensorFormat);\n> -       ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);\n> +       V4L2SubdeviceFormat sensorFormat;\n> +       data->sensor_->device()->getFormat(0, &sensorFormat);\n> +\n> +       V4L2DeviceFormat ispFormat;\n> +       ispFormat.fourcc = BayerFormat::fromMbusCode(sensorFormat.mbus_code).toV4L2PixelFormat();\n> +       ispFormat.size = sensorFormat.size;\n> +\n> +       ret = data->isp_[Isp::Input].dev()->setFormat(&ispFormat);\n>         if (ret) {\n>                 stop(camera);\n>                 return ret;\n> @@ -1004,6 +1020,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n>         if (data->sensor_->init())\n>                 return false;\n>\n> +       data->sensorFormats_ = populateSensorFormats(data->sensor_);\n> +\n>         ipa::RPi::SensorConfig sensorConfig;\n>         if (data->loadIPA(&sensorConfig)) {\n>                 LOG(RPI, Error) << \"Failed to load a suitable IPA library\";\n> @@ -1030,6 +1048,11 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n>                         return false;\n>         }\n>\n> +       if (!(data->unicam_[Unicam::Image].dev()->caps().device_caps() & V4L2_CAP_IO_MC)) {\n> +               LOG(RPI, Error) << \"Unicam driver did not advertise V4L2_CAP_IO_MC, please update your kernel!\";\n> +               return false;\n> +       }\n> +\n>         /*\n>          * Setup our delayed control writer with the sensor default\n>          * gain and exposure delays. 
Mark VBLANK for priority write.\n> @@ -1039,7 +1062,7 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n>                 { V4L2_CID_EXPOSURE, { sensorConfig.exposureDelay, false } },\n>                 { V4L2_CID_VBLANK, { sensorConfig.vblankDelay, true } }\n>         };\n> -       data->delayedCtrls_ = std::make_unique<DelayedControls>(data->unicam_[Unicam::Image].dev(), params);\n> +       data->delayedCtrls_ = std::make_unique<DelayedControls>(data->sensor_->device(), params);\n>         data->sensorMetadata_ = sensorConfig.sensorMetadata;\n>\n>         /* Register the controls that the Raspberry Pi IPA can handle. */\n> @@ -1066,15 +1089,14 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n>          * As part of answering the final question, we reset the camera to\n>          * no transform at all.\n>          */\n> -\n> -       V4L2VideoDevice *dev = data->unicam_[Unicam::Image].dev();\n> -       const struct v4l2_query_ext_ctrl *hflipCtrl = dev->controlInfo(V4L2_CID_HFLIP);\n> +       const V4L2Subdevice *sensor = data->sensor_->device();\n> +       const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);\n>         if (hflipCtrl) {\n>                 /* We assume it will support vflips too... */\n>                 data->supportsFlips_ = true;\n>                 data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;\n>\n> -               ControlList ctrls(dev->controls());\n> +               ControlList ctrls(data->sensor_->controls());\n>                 ctrls.set(V4L2_CID_HFLIP, 0);\n>                 ctrls.set(V4L2_CID_VFLIP, 0);\n>                 data->setSensorControls(ctrls);\n> @@ -1082,9 +1104,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n>\n>         /* Look for a valid Bayer format. 
*/\n>         BayerFormat bayerFormat;\n> -       for (const auto &iter : dev->formats()) {\n> -               V4L2PixelFormat v4l2Format = iter.first;\n> -               bayerFormat = BayerFormat::fromV4L2PixelFormat(v4l2Format);\n> +       for (const auto &iter : data->sensorFormats_) {\n> +               bayerFormat = BayerFormat::fromMbusCode(iter.first);\n>                 if (bayerFormat.isValid())\n>                         break;\n>         }\n> @@ -1271,7 +1292,7 @@ int RPiCameraData::configureIPA(const CameraConfiguration *config)\n>                 }\n>         }\n>\n> -       entityControls.emplace(0, unicam_[Unicam::Image].dev()->controls());\n> +       entityControls.emplace(0, sensor_->controls());\n>         entityControls.emplace(1, isp_[Isp::Input].dev()->controls());\n>\n>         /* Always send the user transform to the IPA. */\n> @@ -1406,10 +1427,10 @@ void RPiCameraData::setSensorControls(ControlList &controls)\n>                 ControlList vblank_ctrl;\n>\n>                 vblank_ctrl.set(V4L2_CID_VBLANK, controls.get(V4L2_CID_VBLANK));\n> -               unicam_[Unicam::Image].dev()->setControls(&vblank_ctrl);\n> +               sensor_->setControls(&vblank_ctrl);\n>         }\n>\n> -       unicam_[Unicam::Image].dev()->setControls(&controls);\n> +       sensor_->setControls(&controls);\n>  }\n>\n>  void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)\n> --\n> 2.25.1\n>\n\nThanks!\n\nDavid","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 7275BBDB1C\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 22 Oct 2021 13:09:52 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 77DE868F5B;\n\tFri, 
22 Oct 2021 15:09:51 +0200 (CEST)","from mail-wm1-x330.google.com (mail-wm1-x330.google.com\n\t[IPv6:2a00:1450:4864:20::330])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 883996012A\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 22 Oct 2021 15:09:50 +0200 (CEST)","by mail-wm1-x330.google.com with SMTP id\n\t193-20020a1c01ca000000b00327775075f7so3323357wmb.5\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 22 Oct 2021 06:09:50 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"QrQiPepr\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=xQkpfYunvZ5V/XXYotZcQ/WF2j5ZVXDpCz3HocE1OSA=;\n\tb=QrQiPepr/8ge7ozsSZJkgz64fD0DlMUCZQP3+3BeMun/ZWeIRoJkDx2ChQr7n6Zgso\n\tD/9xRFBsbuN7NcX60Vr1ZLljjYsw3B0CRXiViAa/+QGzEO0UXXaG7F+79sWjx75NhhET\n\tpTT3a0eNfu8AXyPYSMNNtyGePnNfbg+W3zluhMuxErboOchk5xvB9lc2BOIBpKZdDoyq\n\tCu8SE0HEdpN7rv32kQAXqqEM73eCslnQZciEaIrkWMFq+nMqWZ4XT3rA5XR8AO+T8LmE\n\tjhOCaB0IMFYriLcSEZEuLqeS5f8HMSfGqb3mzwHTSzFkCAEB7caWy+zsIkQuMBGgJTVh\n\tTzJQ==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20210112;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=xQkpfYunvZ5V/XXYotZcQ/WF2j5ZVXDpCz3HocE1OSA=;\n\tb=qUWIafXA3Jskq1k7hOqZEb1rJcvnmPyItxplPugjT8nz8Wbr+rIFHtgrT/KAuiRBts\n\tkXxsIIZnPtIAEcH/NHwOTF1nOVyq8/+0kdJenNWBJExUorIPML2D5stHFgGtuX0GrGAD\n\tk9Vn9xeuYYxF4deqLg83dJPjAqsb3cwGyQQeak8nDQa6JeMr163yeERoomhujpo/RBlI\n\ty/nM5ZBgCNHo3iqFhtu3ndQiOus90nFLQXLi7oiJCsxPeOsfTxIqh3oSXsVvVwhGJ5jR\n\tkcq9piF9TJ3HUUq2DSD6lxhEDLWpAowP0PkOfnI13VPEnB968UR5BPtCUwny40O3QUhx\n\tHs0Q==","X-Gm-Message-State":"AOAM530eoGNKQXttpF+IDvzz4wXuvfjsoB7SCSApiQ2lqxpyULdNoW2g\n\tgDJTQBQY1S3cbAa01fYSzWyto2GKW2KxAksAV3NZ+GY1yvk=","X-Google-Smtp-Source":"ABdhPJwVrKhmjcgVj/9WhsNd4L5i/TOkNVJpXNKP6dRvGi/CDE9h34clDRAaJQQY8PFkzckoKOctVvcsmfe46YYoIWM=","X-Received":"by 2002:a05:600c:1c21:: with SMTP id\n\tj33mr28840041wms.163.1634908189763; \n\tFri, 22 Oct 2021 06:09:49 -0700 (PDT)","MIME-Version":"1.0","References":"<20211022115537.2964533-1-naush@raspberrypi.com>\n\t<20211022115537.2964533-3-naush@raspberrypi.com>","In-Reply-To":"<20211022115537.2964533-3-naush@raspberrypi.com>","From":"David Plowman <david.plowman@raspberrypi.com>","Date":"Fri, 22 Oct 2021 14:09:39 +0100","Message-ID":"<CAHW6GYLAEJwoTQKH=7pwExFMjuxAitzcwfGybsb8hn5Y4hC8Zg@mail.gmail.com>","To":"Naushir Patuck <naush@raspberrypi.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH 2/6] pipeline: raspberrypi: Convert\n\tthe pipeline handler to use media 
controller","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":20389,"web_url":"https://patchwork.libcamera.org/comment/20389/","msgid":"<CAEmqJPpYH7EGoPwC_wTjdfpt=nvyJTEg8n_+CoT+CVp89qE01w@mail.gmail.com>","date":"2021-10-22T13:28:10","subject":"Re: [libcamera-devel] [PATCH 2/6] pipeline: raspberrypi: Convert\n\tthe pipeline handler to use media controller","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/people/34/","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"content":"Hi David,\n\nThank you for your feedback.\n\nOn Fri, 22 Oct 2021 at 14:09, David Plowman <david.plowman@raspberrypi.com>\nwrote:\n\n> Hi Naush\n>\n> Thanks for this patch!\n>\n> On Fri, 22 Oct 2021 at 12:55, Naushir Patuck <naush@raspberrypi.com>\n> wrote:\n> >\n> > Switch the pipeline handler to use the new Unicam media controller based\n> driver.\n> > With this change, we directly talk to the sensor device driver to set\n> controls\n> > and set/get formats in the pipeline handler.\n> >\n> > This change requires the accompanying Raspberry Pi linux kernel change at\n> > https://github.com/raspberrypi/linux/pull/4645. 
If this kernel change\n> is not\n> > present, the pipeline handler will fail to run with an error message\n> informing\n> > the user to update the kernel build.\n> >\n> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>\n> > ---\n> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 113 +++++++++++-------\n> >  1 file changed, 67 insertions(+), 46 deletions(-)\n> >\n> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > index 1634ca98f481..730f1575095c 100644\n> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > @@ -48,6 +48,19 @@ LOG_DEFINE_CATEGORY(RPI)\n> >\n> >  namespace {\n> >\n> > +/* Map of mbus codes to supported sizes reported by the sensor. */\n> > +using SensorFormats = std::map<unsigned int, std::vector<Size>>;\n> > +\n> > +SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor>\n> &sensor)\n> > +{\n> > +       SensorFormats formats;\n> > +\n> > +       for (auto const mbusCode : sensor->mbusCodes())\n> > +               formats.emplace(mbusCode, sensor->sizes(mbusCode));\n> > +\n> > +       return formats;\n> > +}\n> > +\n> >  bool isRaw(PixelFormat &pixFmt)\n> >  {\n> >         /*\n> > @@ -74,8 +87,7 @@ double scoreFormat(double desired, double actual)\n> >         return score;\n> >  }\n> >\n> > -V4L2DeviceFormat findBestMode(V4L2VideoDevice::Formats &formatsMap,\n> > -                             const Size &req)\n> > +V4L2DeviceFormat findBestMode(const SensorFormats &formatsMap, const\n> Size &req)\n>\n> I wonder whether this function should return something more like an\n> mbus code and a size. The packing is not really relevant to what comes\n> out of the sensor and should, I suspect, actually be determined by any\n> raw stream that has been requested by the application. This all starts\n> to impinge a bit on the sensor-mode-hint discussions I've been having.\n>\n\nYes, you are right! 
  And for convenience, a V4L2SubdeviceFormat\nis a struct that has these two fields.  I will return that struct from\nfindBestMode to make things simpler.\n\n\n> One thing I wonder at this point is how an application could express a\n> wish for packed or unpacked raw streams. I think generateConfiguration\n> should probably fill in whatever it wants, probably \"packed\" as that's\n> more efficient. Then the application would have to be able to change\n> this to \"unpacked\" - which we could make less annoying by making the\n> BayerFormat publicly available.\n>\n\nThat makes sense.  For this change I will adjust the \"packing\" param to\nbe packed for all cases, except if the application requested an unpacked raw\nstream pixelformat.\n\nWe can consider opening the BayerFormat class to the public later.\n\n\n>\n> >  {\n> >         double bestScore = std::numeric_limits<double>::max(), score;\n> >         V4L2DeviceFormat bestMode;\n> > @@ -88,18 +100,17 @@ V4L2DeviceFormat\n> findBestMode(V4L2VideoDevice::Formats &formatsMap,\n> >\n> >         /* Calculate the closest/best mode from the user requested size.\n> */\n> >         for (const auto &iter : formatsMap) {\n> > -               V4L2PixelFormat v4l2Format = iter.first;\n> > +               const unsigned int mbus_code = iter.first;\n> > +               const V4L2PixelFormat v4l2Format =\n> BayerFormat::fromMbusCode(mbus_code).toV4L2PixelFormat();\n> >                 const PixelFormatInfo &info =\n> PixelFormatInfo::info(v4l2Format);\n> >\n> > -               for (const SizeRange &sz : iter.second) {\n> > -                       double modeWidth = sz.contains(req) ? 
req.width\n> : sz.max.width;\n> > -                       double modeHeight = sz.contains(req) ?\n> req.height : sz.max.height;\n> > +               for (const Size &sz : iter.second) {\n> >                         double reqAr = static_cast<double>(req.width) /\n> req.height;\n> > -                       double modeAr = modeWidth / modeHeight;\n> > +                       double modeAr = sz.width / sz.height;\n> >\n> >                         /* Score the dimensions for closeness. */\n> > -                       score = scoreFormat(req.width, modeWidth);\n> > -                       score += scoreFormat(req.height, modeHeight);\n> > +                       score = scoreFormat(req.width, sz.width);\n> > +                       score += scoreFormat(req.height, sz.height);\n> >                         score += PENALTY_AR * scoreFormat(reqAr, modeAr);\n> >\n> >                         /* Add any penalties... this is not an exact\n> science! */\n> > @@ -116,10 +127,10 @@ V4L2DeviceFormat\n> findBestMode(V4L2VideoDevice::Formats &formatsMap,\n> >                         if (score <= bestScore) {\n> >                                 bestScore = score;\n> >                                 bestMode.fourcc = v4l2Format;\n> > -                               bestMode.size = Size(modeWidth,\n> modeHeight);\n> > +                               bestMode.size = sz;\n> >                         }\n> >\n> > -                       LOG(RPI, Info) << \"Mode: \" << modeWidth << \"x\"\n> << modeHeight\n> > +                       LOG(RPI, Info) << \"Mode: \" << sz.width << \"x\" <<\n> sz.height\n> >                                        << \" fmt \" <<\n> v4l2Format.toString()\n> >                                        << \" Score: \" << score\n> >                                        << \" (best \" << bestScore << \")\";\n> > @@ -170,6 +181,7 @@ public:\n> >         std::unique_ptr<ipa::RPi::IPAProxyRPi> ipa_;\n> >\n> >         std::unique_ptr<CameraSensor> sensor_;\n> > +       
SensorFormats sensorFormats_;\n> >         /* Array of Unicam and ISP device streams and associated\n> buffers/streams. */\n> >         RPi::Device<Unicam, 2> unicam_;\n> >         RPi::Device<Isp, 4> isp_;\n> > @@ -352,8 +364,7 @@ CameraConfiguration::Status\n> RPiCameraConfiguration::validate()\n> >                          * Calculate the best sensor mode we can use\n> based on\n> >                          * the user request.\n> >                          */\n> > -                       V4L2VideoDevice::Formats fmts =\n> data_->unicam_[Unicam::Image].dev()->formats();\n> > -                       V4L2DeviceFormat sensorFormat =\n> findBestMode(fmts, cfg.size);\n> > +                       V4L2DeviceFormat sensorFormat =\n> findBestMode(data_->sensorFormats_, cfg.size);\n> >                         int ret =\n> data_->unicam_[Unicam::Image].dev()->tryFormat(&sensorFormat);\n> >                         if (ret)\n> >                                 return Invalid;\n> > @@ -487,8 +498,7 @@ CameraConfiguration\n> *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n> >                 switch (role) {\n> >                 case StreamRole::Raw:\n> >                         size = data->sensor_->resolution();\n> > -                       fmts =\n> data->unicam_[Unicam::Image].dev()->formats();\n> > -                       sensorFormat = findBestMode(fmts, size);\n> > +                       sensorFormat =\n> findBestMode(data->sensorFormats_, size);\n> >                         pixelFormat =\n> sensorFormat.fourcc.toPixelFormat();\n> >                         ASSERT(pixelFormat.isValid());\n> >                         bufferCount = 2;\n> > @@ -599,32 +609,32 @@ int PipelineHandlerRPi::configure(Camera *camera,\n> CameraConfiguration *config)\n> >         }\n> >\n> >         /* First calculate the best sensor mode we can use based on the\n> user request. 
*/\n> > -       V4L2VideoDevice::Formats fmts =\n> data->unicam_[Unicam::Image].dev()->formats();\n> > -       V4L2DeviceFormat sensorFormat = findBestMode(fmts, rawStream ?\n> sensorSize : maxSize);\n> > +       V4L2DeviceFormat unicamFormat =\n> findBestMode(data->sensorFormats_, rawStream ? sensorSize : maxSize);\n> > +\n> > +       unsigned int mbus_code =\n> BayerFormat::fromV4L2PixelFormat(unicamFormat.fourcc).toMbusCode();\n> > +       V4L2SubdeviceFormat sensorFormat { .mbus_code = mbus_code, .size\n> = unicamFormat.size };\n> > +\n> > +       ret = data->sensor_->setFormat(&sensorFormat);\n> > +       if (ret)\n> > +               return ret;\n>\n> As discussed above, the unicamFormat probably wants to take account of\n> the packing (or not) in any raw stream (if present).\n>\n\nAck.\n\n\n>\n> >\n> >         /*\n> >          * Unicam image output format. The ISP input format gets set at\n> start,\n> >          * just in case we have swapped bayer orders due to flips.\n> >          */\n> > -       ret =\n> data->unicam_[Unicam::Image].dev()->setFormat(&sensorFormat);\n> > +       ret =\n> data->unicam_[Unicam::Image].dev()->setFormat(&unicamFormat);\n> >         if (ret)\n> >                 return ret;\n> >\n> > -       /*\n> > -        * The control ranges associated with the sensor may need\n> updating\n> > -        * after a format change.\n> > -        * \\todo Use the CameraSensor::setFormat API instead.\n> > -        */\n> > -       data->sensor_->updateControlInfo();\n>\n> Hmm, nice, I wonder if we could delete updateControlInfo entirely?\n>\n\nThere does not seem to be any other users of V4L2Device::updateControlInfo()\nso I could remove this.  
I recall this was mainly for our use, so that\nshould be fine.\n\nRegards,\nNaush\n\n\n> > -\n> >         LOG(RPI, Info) << \"Sensor: \" << camera->id()\n> > -                      << \" - Selected mode: \" <<\n> sensorFormat.toString();\n> > +                      << \" - Selected sensor mode: \" <<\n> sensorFormat.toString()\n> > +                      << \" - Selected unicam mode: \" <<\n> unicamFormat.toString();\n> >\n> >         /*\n> >          * This format may be reset on start() if the bayer order has\n> changed\n> >          * because of flips in the sensor.\n> >          */\n> > -       ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);\n> > +       ret = data->isp_[Isp::Input].dev()->setFormat(&unicamFormat);\n> >         if (ret)\n> >                 return ret;\n> >\n> > @@ -746,8 +756,8 @@ int PipelineHandlerRPi::configure(Camera *camera,\n> CameraConfiguration *config)\n> >         data->ispMinCropSize_ = testCrop.size();\n> >\n> >         /* Adjust aspect ratio by providing crops on the input image. 
*/\n> > -       Size size = sensorFormat.size.boundedToAspectRatio(maxSize);\n> > -       Rectangle crop =\n> size.centeredTo(Rectangle(sensorFormat.size).center());\n> > +       Size size = unicamFormat.size.boundedToAspectRatio(maxSize);\n> > +       Rectangle crop =\n> size.centeredTo(Rectangle(unicamFormat.size).center());\n> >         data->ispCrop_ = crop;\n> >\n> >         data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP,\n> &crop);\n> > @@ -761,10 +771,11 @@ int PipelineHandlerRPi::configure(Camera *camera,\n> CameraConfiguration *config)\n> >          * supports it.\n> >          */\n> >         if (data->sensorMetadata_) {\n> > -               format = {};\n> > -               format.fourcc =\n> V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> > +               V4L2SubdeviceFormat embeddedFormat;\n> >\n> > -               LOG(RPI, Debug) << \"Setting embedded data format.\";\n> > +               data->sensor_->device()->getFormat(1, &embeddedFormat);\n> > +               format.fourcc =\n> V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> > +               format.planes[0].size = embeddedFormat.size.width *\n> embeddedFormat.size.height;\n> >                 ret =\n> data->unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> >                 if (ret) {\n> >                         LOG(RPI, Error) << \"Failed to set format on\n> Unicam embedded: \"\n> > @@ -847,9 +858,14 @@ int PipelineHandlerRPi::start(Camera *camera, const\n> ControlList *controls)\n> >          * IPA configure may have changed the sensor flips - hence the\n> bayer\n> >          * order. 
Get the sensor format and set the ISP input now.\n> >          */\n> > -       V4L2DeviceFormat sensorFormat;\n> > -       data->unicam_[Unicam::Image].dev()->getFormat(&sensorFormat);\n> > -       ret = data->isp_[Isp::Input].dev()->setFormat(&sensorFormat);\n> > +       V4L2SubdeviceFormat sensorFormat;\n> > +       data->sensor_->device()->getFormat(0, &sensorFormat);\n> > +\n> > +       V4L2DeviceFormat ispFormat;\n> > +       ispFormat.fourcc =\n> BayerFormat::fromMbusCode(sensorFormat.mbus_code).toV4L2PixelFormat();\n> > +       ispFormat.size = sensorFormat.size;\n> > +\n> > +       ret = data->isp_[Isp::Input].dev()->setFormat(&ispFormat);\n> >         if (ret) {\n> >                 stop(camera);\n> >                 return ret;\n> > @@ -1004,6 +1020,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator\n> *enumerator)\n> >         if (data->sensor_->init())\n> >                 return false;\n> >\n> > +       data->sensorFormats_ = populateSensorFormats(data->sensor_);\n> > +\n> >         ipa::RPi::SensorConfig sensorConfig;\n> >         if (data->loadIPA(&sensorConfig)) {\n> >                 LOG(RPI, Error) << \"Failed to load a suitable IPA\n> library\";\n> > @@ -1030,6 +1048,11 @@ bool PipelineHandlerRPi::match(DeviceEnumerator\n> *enumerator)\n> >                         return false;\n> >         }\n> >\n> > +       if (!(data->unicam_[Unicam::Image].dev()->caps().device_caps() &\n> V4L2_CAP_IO_MC)) {\n> > +               LOG(RPI, Error) << \"Unicam driver did not advertise\n> V4L2_CAP_IO_MC, please update your kernel!\";\n> > +               return false;\n> > +       }\n> > +\n> >         /*\n> >          * Setup our delayed control writer with the sensor default\n> >          * gain and exposure delays. 
Mark VBLANK for priority write.\n> > @@ -1039,7 +1062,7 @@ bool PipelineHandlerRPi::match(DeviceEnumerator\n> *enumerator)\n> >                 { V4L2_CID_EXPOSURE, { sensorConfig.exposureDelay, false\n> } },\n> >                 { V4L2_CID_VBLANK, { sensorConfig.vblankDelay, true } }\n> >         };\n> > -       data->delayedCtrls_ =\n> std::make_unique<DelayedControls>(data->unicam_[Unicam::Image].dev(),\n> params);\n> > +       data->delayedCtrls_ =\n> std::make_unique<DelayedControls>(data->sensor_->device(), params);\n> >         data->sensorMetadata_ = sensorConfig.sensorMetadata;\n> >\n> >         /* Register the controls that the Raspberry Pi IPA can handle. */\n> > @@ -1066,15 +1089,14 @@ bool PipelineHandlerRPi::match(DeviceEnumerator\n> *enumerator)\n> >          * As part of answering the final question, we reset the camera\n> to\n> >          * no transform at all.\n> >          */\n> > -\n> > -       V4L2VideoDevice *dev = data->unicam_[Unicam::Image].dev();\n> > -       const struct v4l2_query_ext_ctrl *hflipCtrl =\n> dev->controlInfo(V4L2_CID_HFLIP);\n> > +       const V4L2Subdevice *sensor = data->sensor_->device();\n> > +       const struct v4l2_query_ext_ctrl *hflipCtrl =\n> sensor->controlInfo(V4L2_CID_HFLIP);\n> >         if (hflipCtrl) {\n> >                 /* We assume it will support vflips too... */\n> >                 data->supportsFlips_ = true;\n> >                 data->flipsAlterBayerOrder_ = hflipCtrl->flags &\n> V4L2_CTRL_FLAG_MODIFY_LAYOUT;\n> >\n> > -               ControlList ctrls(dev->controls());\n> > +               ControlList ctrls(data->sensor_->controls());\n> >                 ctrls.set(V4L2_CID_HFLIP, 0);\n> >                 ctrls.set(V4L2_CID_VFLIP, 0);\n> >                 data->setSensorControls(ctrls);\n> > @@ -1082,9 +1104,8 @@ bool PipelineHandlerRPi::match(DeviceEnumerator\n> *enumerator)\n> >\n> >         /* Look for a valid Bayer format. 
*/\n> >         BayerFormat bayerFormat;\n> > -       for (const auto &iter : dev->formats()) {\n> > -               V4L2PixelFormat v4l2Format = iter.first;\n> > -               bayerFormat =\n> BayerFormat::fromV4L2PixelFormat(v4l2Format);\n> > +       for (const auto &iter : data->sensorFormats_) {\n> > +               bayerFormat = BayerFormat::fromMbusCode(iter.first);\n> >                 if (bayerFormat.isValid())\n> >                         break;\n> >         }\n> > @@ -1271,7 +1292,7 @@ int RPiCameraData::configureIPA(const\n> CameraConfiguration *config)\n> >                 }\n> >         }\n> >\n> > -       entityControls.emplace(0,\n> unicam_[Unicam::Image].dev()->controls());\n> > +       entityControls.emplace(0, sensor_->controls());\n> >         entityControls.emplace(1, isp_[Isp::Input].dev()->controls());\n> >\n> >         /* Always send the user transform to the IPA. */\n> > @@ -1406,10 +1427,10 @@ void\n> RPiCameraData::setSensorControls(ControlList &controls)\n> >                 ControlList vblank_ctrl;\n> >\n> >                 vblank_ctrl.set(V4L2_CID_VBLANK,\n> controls.get(V4L2_CID_VBLANK));\n> > -               unicam_[Unicam::Image].dev()->setControls(&vblank_ctrl);\n> > +               sensor_->setControls(&vblank_ctrl);\n> >         }\n> >\n> > -       unicam_[Unicam::Image].dev()->setControls(&controls);\n> > +       sensor_->setControls(&controls);\n> >  }\n> >\n> >  void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)\n> > --\n> > 2.25.1\n> >\n>\n> Thanks!\n>\n> David\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 86F37BF415\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 22 Oct 2021 13:28:30 +0000 (UTC)","from lancelot.ideasonboard.com 
(localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id DE20068F59;\n\tFri, 22 Oct 2021 15:28:29 +0200 (CEST)"],"MIME-Version":"1.0","References":"<20211022115537.2964533-1-naush@raspberrypi.com>\n\t<20211022115537.2964533-3-naush@raspberrypi.com>\n\t<CAHW6GYLAEJwoTQKH=7pwExFMjuxAitzcwfGybsb8hn5Y4hC8Zg@mail.gmail.com>","In-Reply-To":"<CAHW6GYLAEJwoTQKH=7pwExFMjuxAitzcwfGybsb8hn5Y4hC8Zg@mail.gmail.com>","From":"Naushir Patuck <naush@raspberrypi.com>","Date":"Fri, 22 Oct 2021 14:28:10 +0100","Message-ID":"<CAEmqJPpYH7EGoPwC_wTjdfpt=nvyJTEg8n_+CoT+CVp89qE01w@mail.gmail.com>","To":"David Plowman <david.plowman@raspberrypi.com>","Subject":"Re: [libcamera-devel] [PATCH 2/6] pipeline: raspberrypi: Convert\n\tthe pipeline handler to use media controller","List-Id":"<libcamera-devel.lists.libcamera.org>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}}]