[{"id":15268,"web_url":"https://patchwork.libcamera.org/comment/15268/","msgid":"<YDLny6WKLkLIhnAJ@pendragon.ideasonboard.com>","date":"2021-02-21T23:07:55","subject":"Re: [libcamera-devel] [PATCH v3 3/4] pipeline: raspberrypi: Only\n\tenabled embedded stream when available","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Naush,\n\nThank you for the patch.\n\ns/enabled/enable/ in the subject ?\n\nOn Thu, Feb 18, 2021 at 05:01:25PM +0000, Naushir Patuck wrote:\n> The pipeline handler would enable and use the Unicam embedded data stream\n> even if the sensor did not support it. This was to allow a means to\n> pass exposure and gain values for the frame to the IPA in a synchronised\n> way.\n> \n> The recent changes to get the pipeline handler to pass a ControlList\n> with exposure and gain values means this is no longer required. Disable\n> the use of the embedded data stream when a sensor does not support it.\n\nNice :-)\n\n> This change also removes the mappedEmbeddedBuffers_ map as it is no\n> longer used.\n> \n> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>\n> ---\n>  .../pipeline/raspberrypi/raspberrypi.cpp      | 112 +++++++++++-------\n>  1 file changed, 70 insertions(+), 42 deletions(-)\n> \n> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> index b43d86166c63..d969c77993eb 100644\n> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> @@ -177,12 +177,6 @@ public:\n>  \t/* Stores the ids of the buffers mapped in the IPA. 
*/\n>  \tstd::unordered_set<unsigned int> ipaBuffers_;\n>  \n> -\t/*\n> -\t * Map of (internal) mmaped embedded data buffers, to avoid having to\n> -\t * map/unmap on every frame.\n> -\t */\n> -\tstd::map<unsigned int, MappedFrameBuffer> mappedEmbeddedBuffers_;\n> -\n>  \t/* DMAHEAP allocation helper. */\n>  \tRPi::DmaHeap dmaHeap_;\n>  \tFileDescriptor lsTable_;\n> @@ -636,14 +630,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \n>  \t\tif (isRaw(cfg.pixelFormat)) {\n>  \t\t\tcfg.setStream(&data->unicam_[Unicam::Image]);\n> -\t\t\t/*\n> -\t\t\t * We must set both Unicam streams as external, even\n> -\t\t\t * though the application may only request RAW frames.\n> -\t\t\t * This is because we match timestamps on both streams\n> -\t\t\t * to synchronise buffers.\n> -\t\t\t */\n>  \t\t\tdata->unicam_[Unicam::Image].setExternal(true);\n> -\t\t\tdata->unicam_[Unicam::Embedded].setExternal(true);\n>  \t\t\tcontinue;\n>  \t\t}\n>  \n> @@ -715,17 +702,6 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \t\treturn ret;\n>  \t}\n>  \n> -\t/* Unicam embedded data output format. */\n> -\tformat = {};\n> -\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> -\tLOG(RPI, Debug) << \"Setting embedded data format.\";\n> -\tret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> -\tif (ret) {\n> -\t\tLOG(RPI, Error) << \"Failed to set format on Unicam embedded: \"\n> -\t\t\t\t<< format.toString();\n> -\t\treturn ret;\n> -\t}\n> -\n>  \t/* Figure out the smallest selection the ISP will allow. 
*/\n>  \tRectangle testCrop(0, 0, 1, 1);\n>  \tdata->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP, &testCrop);\n> @@ -742,6 +718,41 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \tif (ret)\n>  \t\tLOG(RPI, Error) << \"Failed to configure the IPA: \" << ret;\n>  \n> +\t/*\n> +\t * The IPA will set data->sensorMetadata_ to true if embedded data is\n> +\t * supported on this sensor. If so, open the Unicam embedded data\n> +\t * node and configure the output format.\n> +\t */\n> +\tif (data->sensorMetadata_) {\n> +\t\tformat = {};\n> +\t\tformat.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> +\t\tLOG(RPI, Debug) << \"Setting embedded data format.\";\n> +\t\tdata->unicam_[Unicam::Embedded].dev()->open();\n\nThe device is opened here, but never closed. Should that be fixed ? What\nare the drawbacks in opening the device in match() as done today, even\nif not used ?\n\n> +\t\tret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> +\t\tif (ret) {\n> +\t\t\tLOG(RPI, Error) << \"Failed to set format on Unicam embedded: \"\n> +\t\t\t\t\t<< format.toString();\n> +\t\t\treturn ret;\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * If a RAW/Bayer stream has been requested by the application,\n> +\t\t * we must set both Unicam streams as external, even though the\n> +\t\t * application may only request RAW frames. This is because we\n> +\t\t * match timestamps on both streams to synchronise buffers.\n> +\t\t */\n> +\t\tif (rawStream)\n> +\t\t\tdata->unicam_[Unicam::Embedded].setExternal(true);\n> +\t} else {\n> +\t\t/*\n> +\t\t * No embedded data present, so we do not want to iterate over\n> +\t\t * the embedded data stream when starting and stopping.\n> +\t\t */\n> +\t\tdata->streams_.erase(std::remove(data->streams_.begin(), data->streams_.end(),\n> +\t\t\t\t\t\t &data->unicam_[Unicam::Embedded]),\n> +\t\t\t\t     data->streams_.end());\n\nHmmmm... 
Given that only one sensor, and thus one camera, is supported\nby this pipeline handler, this should work, but conceptually it's not\nvery nice. Isn't this a decision that should be made at match() time,\nnot configure() time ? It would be right to do this here if the code was\nmodelled to support multiple sensors, but in that case we would need to\nclose() the embedded data video node in appropriate locations, and also\nadd the embedded stream back to data->streams_ when metadata is present.\n\nOne option to move this to match() time would be for the IPA to return\nif the sensor supports metadata from init() instead of configure(). That\nwould require passing the sensor name to the IPA in init() too.\n\nIt's the mix n' match that bothers me I think. I don't mind if we decide\nto model the pipeline handler and IPA with the assumption that there can\nonly be a single camera, with a hardcoded pipeline known from the very\nbeginning, or in a more dynamic way that could allow runtime switching\nbetween sensors, but it would be nice if the architecture of the\npipeline handler and IPA was consistent relatively to the model we pick.\nDoes this make sense ?\n\n> +\t}\n> +\n>  \t/*\n>  \t * Update the ScalerCropMaximum to the correct value for this camera mode.\n>  \t * For us, it's the same as the \"analogue crop\".\n> @@ -949,10 +960,16 @@ bool PipelineHandlerRPi::match(DeviceEnumerator *enumerator)\n>  \tfor (auto &stream : data->isp_)\n>  \t\tdata->streams_.push_back(&stream);\n>  \n> -\t/* Open all Unicam and ISP streams. */\n> +\t/*\n> +\t * Open all Unicam and ISP streams. The exception is the embedded data\n> +\t * stream, which only gets opened if the IPA reports that the sensor\n> +\t * supports embedded data. 
This happens in RPiCameraData::configureIPA().\n> +\t */\n>  \tfor (auto const stream : data->streams_) {\n> -\t\tif (stream->dev()->open())\n> -\t\t\treturn false;\n> +\t\tif (stream != &data->unicam_[Unicam::Embedded]) {\n> +\t\t\tif (stream->dev()->open())\n> +\t\t\t\treturn false;\n> +\t\t}\n>  \t}\n>  \n>  \t/* Wire up all the buffer connections. */\n> @@ -1109,19 +1126,13 @@ int PipelineHandlerRPi::prepareBuffers(Camera *camera)\n>  \t\t\treturn ret;\n>  \t}\n>  \n> -\tif (!data->sensorMetadata_) {\n> -\t\tfor (auto const &it : data->unicam_[Unicam::Embedded].getBuffers()) {\n> -\t\t\tMappedFrameBuffer fb(it.second, PROT_READ | PROT_WRITE);\n> -\t\t\tdata->mappedEmbeddedBuffers_.emplace(it.first, std::move(fb));\n> -\t\t}\n> -\t}\n> -\n>  \t/*\n>  \t * Pass the stats and embedded data buffers to the IPA. No other\n>  \t * buffers need to be passed.\n>  \t */\n>  \tmapBuffers(camera, data->isp_[Isp::Stats].getBuffers(), ipa::RPi::MaskStats);\n> -\tmapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), ipa::RPi::MaskEmbeddedData);\n> +\tif (data->sensorMetadata_)\n> +\t\tmapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(), ipa::RPi::MaskEmbeddedData);\n\nMaybe a bit of line wrap ? :-)\n\n>  \n>  \treturn 0;\n>  }\n> @@ -1154,7 +1165,6 @@ void PipelineHandlerRPi::freeBuffers(Camera *camera)\n>  \tstd::vector<unsigned int> ipaBuffers(data->ipaBuffers_.begin(), data->ipaBuffers_.end());\n>  \tdata->ipa_->unmapBuffers(ipaBuffers);\n>  \tdata->ipaBuffers_.clear();\n> -\tdata->mappedEmbeddedBuffers_.clear();\n>  \n>  \tfor (auto const stream : data->streams_)\n>  \t\tstream->releaseBuffers();\n> @@ -1652,7 +1662,7 @@ void RPiCameraData::tryRunPipeline()\n>  \n>  \t/* If any of our request or buffer queues are empty, we cannot proceed. 
*/\n>  \tif (state_ != State::Idle || requestQueue_.empty() ||\n> -\t    bayerQueue_.empty() || embeddedQueue_.empty())\n> +\t    bayerQueue_.empty() || (embeddedQueue_.empty() && sensorMetadata_))\n>  \t\treturn;\n>  \n>  \tif (!findMatchingBuffers(bayerFrame, embeddedBuffer))\n> @@ -1675,17 +1685,24 @@ void RPiCameraData::tryRunPipeline()\n>  \tstate_ = State::Busy;\n>  \n>  \tunsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);\n> -\tunsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n>  \n>  \tLOG(RPI, Debug) << \"Signalling signalIspPrepare:\"\n> -\t\t\t<< \" Bayer buffer id: \" << bayerId\n> -\t\t\t<< \" Embedded buffer id: \" << embeddedId;\n> +\t\t\t<< \" Bayer buffer id: \" << bayerId;\n>  \n>  \tipa::RPi::ISPConfig ispPrepare;\n> -\tispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData | embeddedId;\n>  \tispPrepare.bayerBufferId = ipa::RPi::MaskBayerData | bayerId;\n> -\tispPrepare.embeddedBufferPresent = true;\n>  \tispPrepare.controls = std::move(bayerFrame.controls);\n> +\n> +\tif (embeddedBuffer) {\n> +\t\tunsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n> +\n> +\t\tispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData | embeddedId;\n> +\t\tispPrepare.embeddedBufferPresent = true;\n> +\n> +\t\tLOG(RPI, Debug) << \"Signalling signalIspPrepare:\"\n> +\t\t\t\t<< \" Bayer buffer id: \" << embeddedId;\n> +\t}\n> +\n>  \tipa_->signalIspPrepare(ispPrepare);\n>  }\n>  \n> @@ -1727,6 +1744,17 @@ bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em\n>  \n>  \t\t\tLOG(RPI, Debug) << \"Could not find matching embedded buffer\";\n>  \n> +\t\t\tif (!sensorMetadata_) {\n> +\t\t\t\t/*\n> +\t\t\t\t * If there is no sensor metadata, simply return the\n> +\t\t\t\t * first bayer frame in the queue.\n> +\t\t\t\t */\n> +\t\t\t\tLOG(RPI, Debug) << \"Returning bayer frame without a match\";\n> +\t\t\t\tbayerQueue_.pop();\n> +\t\t\t\tembeddedBuffer = 
nullptr;\n> +\t\t\t\treturn true;\n> +\t\t\t}\n> +\n>  \t\t\tif (!embeddedQueue_.empty()) {\n>  \t\t\t\t/*\n>  \t\t\t\t * Not found a matching embedded buffer for the bayer buffer in","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 76899BD1F1\n\tfor <parsemail@patchwork.libcamera.org>;\n\tSun, 21 Feb 2021 23:08:25 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id D34E0689EF;\n\tMon, 22 Feb 2021 00:08:24 +0100 (CET)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id AD0CE689B3\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 22 Feb 2021 00:08:22 +0100 (CET)","from pendragon.ideasonboard.com (62-78-145-57.bb.dnainternet.fi\n\t[62.78.145.57])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id E88FFA4E;\n\tMon, 22 Feb 2021 00:08:21 +0100 (CET)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"GNOtYDQw\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1613948902;\n\tbh=T2hmaKNgyV/O1pPG4kq0+gpJdFLyqic/L92O/Q5BnwA=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=GNOtYDQwMEn4BPfh6lJuSKXSg/K4FkOZMSSwR6TDyhWjdhUApzdGyHjHO4PnDfNAa\n\t+d8hrOyO2ZaVkvokk76cuT2LoOK2m9Erbt5ryMXdjR3N3pffxTxQTgAYDmZjbOkjd8\n\thIMieb3fhefYsieH1y1KAQkuyIXzkrhlN2UrsjwM=","Date":"Mon, 22 Feb 2021 01:07:55 +0200","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"Naushir Patuck 
<naush@raspberrypi.com>","Message-ID":"<YDLny6WKLkLIhnAJ@pendragon.ideasonboard.com>","References":"<20210218170126.2060783-1-naush@raspberrypi.com>\n\t<20210218170126.2060783-4-naush@raspberrypi.com>","MIME-Version":"1.0","Content-Disposition":"inline","In-Reply-To":"<20210218170126.2060783-4-naush@raspberrypi.com>","Subject":"Re: [libcamera-devel] [PATCH v3 3/4] pipeline: raspberrypi: Only\n\tenabled embedded stream when available","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera-devel@lists.libcamera.org","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":15290,"web_url":"https://patchwork.libcamera.org/comment/15290/","msgid":"<CAEmqJPr4sZ90bv1a=1JE=Chy58GGjWw5NccdaTSi=+kGu=UEcQ@mail.gmail.com>","date":"2021-02-22T13:40:38","subject":"Re: [libcamera-devel] [PATCH v3 3/4] pipeline: raspberrypi: Only\n\tenabled embedded stream when available","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/people/34/","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"content":"Hi Laurent,\n\nThank you for your review feedback.\n\nOn Sun, 21 Feb 2021 at 23:08, Laurent Pinchart <\nlaurent.pinchart@ideasonboard.com> wrote:\n\n> Hi Naush,\n>\n> Thank you for the 
patch.\n>\n> s/enabled/enable/ in the subject ?\n>\n> On Thu, Feb 18, 2021 at 05:01:25PM +0000, Naushir Patuck wrote:\n> > The pipeline handler would enable and use the Unicam embedded data stream\n> > even if the sensor did not support it. This was to allow a means to\n> > pass exposure and gain values for the frame to the IPA in a synchronised\n> > way.\n> >\n> > The recent changes to get the pipeline handler to pass a ControlList\n> > with exposure and gain values means this is no longer required. Disable\n> > the use of the embedded data stream when a sensor does not support it.\n>\n> Nice :-)\n>\n> > This change also removes the mappedEmbeddedBuffers_ map as it is no\n> > longer used.\n> >\n> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>\n> > ---\n> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 112 +++++++++++-------\n> >  1 file changed, 70 insertions(+), 42 deletions(-)\n> >\n> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > index b43d86166c63..d969c77993eb 100644\n> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> > @@ -177,12 +177,6 @@ public:\n> >       /* Stores the ids of the buffers mapped in the IPA. */\n> >       std::unordered_set<unsigned int> ipaBuffers_;\n> >\n> > -     /*\n> > -      * Map of (internal) mmaped embedded data buffers, to avoid having\n> to\n> > -      * map/unmap on every frame.\n> > -      */\n> > -     std::map<unsigned int, MappedFrameBuffer> mappedEmbeddedBuffers_;\n> > -\n> >       /* DMAHEAP allocation helper. 
*/\n> >       RPi::DmaHeap dmaHeap_;\n> >       FileDescriptor lsTable_;\n> > @@ -636,14 +630,7 @@ int PipelineHandlerRPi::configure(Camera *camera,\n> CameraConfiguration *config)\n> >\n> >               if (isRaw(cfg.pixelFormat)) {\n> >                       cfg.setStream(&data->unicam_[Unicam::Image]);\n> > -                     /*\n> > -                      * We must set both Unicam streams as external,\n> even\n> > -                      * though the application may only request RAW\n> frames.\n> > -                      * This is because we match timestamps on both\n> streams\n> > -                      * to synchronise buffers.\n> > -                      */\n> >                       data->unicam_[Unicam::Image].setExternal(true);\n> > -                     data->unicam_[Unicam::Embedded].setExternal(true);\n> >                       continue;\n> >               }\n> >\n> > @@ -715,17 +702,6 @@ int PipelineHandlerRPi::configure(Camera *camera,\n> CameraConfiguration *config)\n> >               return ret;\n> >       }\n> >\n> > -     /* Unicam embedded data output format. */\n> > -     format = {};\n> > -     format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> > -     LOG(RPI, Debug) << \"Setting embedded data format.\";\n> > -     ret = data->unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> > -     if (ret) {\n> > -             LOG(RPI, Error) << \"Failed to set format on Unicam\n> embedded: \"\n> > -                             << format.toString();\n> > -             return ret;\n> > -     }\n> > -\n> >       /* Figure out the smallest selection the ISP will allow. 
*/\n> >       Rectangle testCrop(0, 0, 1, 1);\n> >       data->isp_[Isp::Input].dev()->setSelection(V4L2_SEL_TGT_CROP,\n> &testCrop);\n> > @@ -742,6 +718,41 @@ int PipelineHandlerRPi::configure(Camera *camera,\n> CameraConfiguration *config)\n> >       if (ret)\n> >               LOG(RPI, Error) << \"Failed to configure the IPA: \" << ret;\n> >\n> > +     /*\n> > +      * The IPA will set data->sensorMetadata_ to true if embedded data\n> is\n> > +      * supported on this sensor. If so, open the Unicam embedded data\n> > +      * node and configure the output format.\n> > +      */\n> > +     if (data->sensorMetadata_) {\n> > +             format = {};\n> > +             format.fourcc = V4L2PixelFormat(V4L2_META_FMT_SENSOR_DATA);\n> > +             LOG(RPI, Debug) << \"Setting embedded data format.\";\n> > +             data->unicam_[Unicam::Embedded].dev()->open();\n>\n> The device is opened here, but never closed. Should that be fixed ? What\n> are the drawbacks in opening the device in match() as done today, even\n> if not used ?\n>\n\nWe rely on the V4L2VideoDevice destructor to close down all the device\nnodes.\nI take it this is not the way to do it? :-)  If not, where would you advise\nto put this\ncall?  Since we call open() in match(), it does not seem right to close the\nnode\nin stop().  I did have a quick scan of rkisp1 and ipu3 pipelines, and they\ndo not\nseem to call close() on devices either, so no hints there.\n\nThe reason why we don't open the embedded data node in match() like we used\nto\nis because of the kernel driver.  In order to synchronize buffers for\nembedded data\nand image data nodes in the driver, we must open the embedded data node if\nit is\nused, i.e. we cannot keep it open and not queue it buffers to \"disable\" the\nstream.\nOf course, this can change in the driver, but it does add mode complexity\nto the\nbuffer sync logic.  
Perhaps this should be noted as a \\todo to fix in the\nfuture.\n\n\n>\n> > +             ret =\n> data->unicam_[Unicam::Embedded].dev()->setFormat(&format);\n> > +             if (ret) {\n> > +                     LOG(RPI, Error) << \"Failed to set format on Unicam\n> embedded: \"\n> > +                                     << format.toString();\n> > +                     return ret;\n> > +             }\n> > +\n> > +             /*\n> > +              * If a RAW/Bayer stream has been requested by the\n> application,\n> > +              * we must set both Unicam streams as external, even\n> though the\n> > +              * application may only request RAW frames. This is\n> because we\n> > +              * match timestamps on both streams to synchronise buffers.\n> > +              */\n> > +             if (rawStream)\n> > +                     data->unicam_[Unicam::Embedded].setExternal(true);\n> > +     } else {\n> > +             /*\n> > +              * No embedded data present, so we do not want to iterate\n> over\n> > +              * the embedded data stream when starting and stopping.\n> > +              */\n> > +             data->streams_.erase(std::remove(data->streams_.begin(),\n> data->streams_.end(),\n> > +\n> &data->unicam_[Unicam::Embedded]),\n> > +                                  data->streams_.end());\n>\n> Hmmmm... Given that only one sensor, and thus one camera, is supported\n> by this pipeline handler, this should work, but conceptually it's not\n> very nice. Isn't this a decision that should be made at match() time,\n> not configure() time ? It would be right to do this here if the code was\n> modelled to support multiple sensors, but in that case we would need to\n> close() the embedded data video node in appropriate locations, and also\n> add the embedded stream back to data->streams_ when metadata is present.\n>\n\nFully agree.  
I did want this to be handled in match(), but as you said, we\ndo not\nhave the information about sensor metadata support there.\n\n\n>\n> One option to move this to match() time would be for the IPA to return\n> if the sensor supports metadata from init() instead of configure(). That\n> would require passing the sensor name to the IPA in init() too.\n>\n\n> It's the mix n' match that bothers me I think. I don't mind if we decide\n> to model the pipeline handler and IPA with the assumption that there can\n> only be a single camera, with a hardcoded pipeline known from the very\n> beginning, or in a more dynamic way that could allow runtime switching\n> between sensors, but it would be nice if the architecture of the\n> pipeline handler and IPA was consistent relatively to the model we pick.\n> Does this make sense ?\n>\n\nYes, I do agree with this as well.  I am happy for ipa->init() to pass back\nthe required\nparameters so match() can be used to open the embedded data node if\nrequired.\nI presume we have all the tools needed to do this with the IPA interfaces\nchange to\nuse mojom definitions?  If so I can update the signature of ipa->init() to\npass in the\nsensor name and return out the \"SensorConfig\" parameters.\n\nRegards,\nNaush\n\n\n\n>\n> > +     }\n> > +\n> >       /*\n> >        * Update the ScalerCropMaximum to the correct value for this\n> camera mode.\n> >        * For us, it's the same as the \"analogue crop\".\n> > @@ -949,10 +960,16 @@ bool PipelineHandlerRPi::match(DeviceEnumerator\n> *enumerator)\n> >       for (auto &stream : data->isp_)\n> >               data->streams_.push_back(&stream);\n> >\n> > -     /* Open all Unicam and ISP streams. */\n> > +     /*\n> > +      * Open all Unicam and ISP streams. The exception is the embedded\n> data\n> > +      * stream, which only gets opened if the IPA reports that the\n> sensor\n> > +      * supports embedded data. 
This happens in\n> RPiCameraData::configureIPA().\n> > +      */\n> >       for (auto const stream : data->streams_) {\n> > -             if (stream->dev()->open())\n> > -                     return false;\n> > +             if (stream != &data->unicam_[Unicam::Embedded]) {\n> > +                     if (stream->dev()->open())\n> > +                             return false;\n> > +             }\n> >       }\n> >\n> >       /* Wire up all the buffer connections. */\n> > @@ -1109,19 +1126,13 @@ int PipelineHandlerRPi::prepareBuffers(Camera\n> *camera)\n> >                       return ret;\n> >       }\n> >\n> > -     if (!data->sensorMetadata_) {\n> > -             for (auto const &it :\n> data->unicam_[Unicam::Embedded].getBuffers()) {\n> > -                     MappedFrameBuffer fb(it.second, PROT_READ |\n> PROT_WRITE);\n> > -                     data->mappedEmbeddedBuffers_.emplace(it.first,\n> std::move(fb));\n> > -             }\n> > -     }\n> > -\n> >       /*\n> >        * Pass the stats and embedded data buffers to the IPA. No other\n> >        * buffers need to be passed.\n> >        */\n> >       mapBuffers(camera, data->isp_[Isp::Stats].getBuffers(),\n> ipa::RPi::MaskStats);\n> > -     mapBuffers(camera, data->unicam_[Unicam::Embedded].getBuffers(),\n> ipa::RPi::MaskEmbeddedData);\n> > +     if (data->sensorMetadata_)\n> > +             mapBuffers(camera,\n> data->unicam_[Unicam::Embedded].getBuffers(), ipa::RPi::MaskEmbeddedData);\n>\n> Maybe a bit of line wrap ? 
:-)\n>\n> >\n> >       return 0;\n> >  }\n> > @@ -1154,7 +1165,6 @@ void PipelineHandlerRPi::freeBuffers(Camera\n> *camera)\n> >       std::vector<unsigned int> ipaBuffers(data->ipaBuffers_.begin(),\n> data->ipaBuffers_.end());\n> >       data->ipa_->unmapBuffers(ipaBuffers);\n> >       data->ipaBuffers_.clear();\n> > -     data->mappedEmbeddedBuffers_.clear();\n> >\n> >       for (auto const stream : data->streams_)\n> >               stream->releaseBuffers();\n> > @@ -1652,7 +1662,7 @@ void RPiCameraData::tryRunPipeline()\n> >\n> >       /* If any of our request or buffer queues are empty, we cannot\n> proceed. */\n> >       if (state_ != State::Idle || requestQueue_.empty() ||\n> > -         bayerQueue_.empty() || embeddedQueue_.empty())\n> > +         bayerQueue_.empty() || (embeddedQueue_.empty() &&\n> sensorMetadata_))\n> >               return;\n> >\n> >       if (!findMatchingBuffers(bayerFrame, embeddedBuffer))\n> > @@ -1675,17 +1685,24 @@ void RPiCameraData::tryRunPipeline()\n> >       state_ = State::Busy;\n> >\n> >       unsigned int bayerId =\n> unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);\n> > -     unsigned int embeddedId =\n> unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n> >\n> >       LOG(RPI, Debug) << \"Signalling signalIspPrepare:\"\n> > -                     << \" Bayer buffer id: \" << bayerId\n> > -                     << \" Embedded buffer id: \" << embeddedId;\n> > +                     << \" Bayer buffer id: \" << bayerId;\n> >\n> >       ipa::RPi::ISPConfig ispPrepare;\n> > -     ispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData |\n> embeddedId;\n> >       ispPrepare.bayerBufferId = ipa::RPi::MaskBayerData | bayerId;\n> > -     ispPrepare.embeddedBufferPresent = true;\n> >       ispPrepare.controls = std::move(bayerFrame.controls);\n> > +\n> > +     if (embeddedBuffer) {\n> > +             unsigned int embeddedId =\n> unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n> > +\n> > +             
ispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData |\n> embeddedId;\n> > +             ispPrepare.embeddedBufferPresent = true;\n> > +\n> > +             LOG(RPI, Debug) << \"Signalling signalIspPrepare:\"\n> > +                             << \" Bayer buffer id: \" << embeddedId;\n> > +     }\n> > +\n> >       ipa_->signalIspPrepare(ispPrepare);\n> >  }\n> >\n> > @@ -1727,6 +1744,17 @@ bool\n> RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&em\n> >\n> >                       LOG(RPI, Debug) << \"Could not find matching\n> embedded buffer\";\n> >\n> > +                     if (!sensorMetadata_) {\n> > +                             /*\n> > +                              * If there is no sensor metadata, simply\n> return the\n> > +                              * first bayer frame in the queue.\n> > +                              */\n> > +                             LOG(RPI, Debug) << \"Returning bayer frame\n> without a match\";\n> > +                             bayerQueue_.pop();\n> > +                             embeddedBuffer = nullptr;\n> > +                             return true;\n> > +                     }\n> > +\n> >                       if (!embeddedQueue_.empty()) {\n> >                               /*\n> >                                * Not found a matching embedded buffer\n> for the bayer buffer in\n>\n> --\n> Regards,\n>\n> Laurent Pinchart\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 6EC2EBD1F6\n\tfor <parsemail@patchwork.libcamera.org>;\n\tMon, 22 Feb 2021 13:40:56 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id EE7D668A1C;\n\tMon, 22 Feb 2021 14:40:55 +0100 
(CET)","from mail-lj1-x22d.google.com (mail-lj1-x22d.google.com\n\t[IPv6:2a00:1450:4864:20::22d])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id D06E368A07\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 22 Feb 2021 14:40:54 +0100 (CET)","by mail-lj1-x22d.google.com with SMTP id g1so52043651ljj.13\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 22 Feb 2021 05:40:54 -0800 (PST)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"SDLbfmJ0\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=PlYrzoLB92pWCKbkcyz6MBIBI7zEvkoSsCzrIwa8yr8=;\n\tb=SDLbfmJ0pWCulTQi6RhDnBFlelfvNwInpcJ2xIe4H1Ac3Me4m3Z5HABNzsretxJKGe\n\tmrMS0ezTdLYJutTnlpECHLFT7osRfDGUQ9Uv+Uxt/wCgqhMN+JFDeZdM8HLF5aZokQnA\n\tgeJ/cS2xxKUV7qpcg+L0cfW5f8K24xuS60bLbYlAgwyWtsVEWBW9mTvU0sR0pTO3+WvO\n\t7ZyG+zgahzbOLcEzsjgOckb2aCZ0zpJXuifC4guVa17JiN8N/Snu0scxC8cjnCH5V9sS\n\tEVhF3fcb2SFFXm6z09NKxz+JcSgH0aOc34XCpBATjWnJTqLD89KJDPij8PpaVz2eOcFR\n\tTj0Q==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=PlYrzoLB92pWCKbkcyz6MBIBI7zEvkoSsCzrIwa8yr8=;\n\tb=ac63Hp8/7iNZ8hBOOc58cQPM71H2w+2QxYYW5JvrxqmRbWi9Hk43vgMfAF8i4hKnVw\n\tJuNY9L653FpsR1eI2S9ZszOzahdBYR50LaW/2E+fI94p0LtDMpIMQWSxhMykeyioB68F\n\tOgnnx5Zq/5QbZdpUFe6iboGlIQBok8Xeb2sWPA9Jx55npRgMFBL4Bz9Fs8zeTWlcC9HP\n\teVMAodkioZnGlee90ytwk+6zw1ClB6bSycNguWJOD/euXUZ8ByY7tU+W6PgHuIWCB/yl\n\t7Gdmjk+wnCb3eJQr1OT9bTMEQJccabBCq5Ls4riU2npZbjJIrgkgzB+bdMGFXbj2XeXD\n\txlWQ==","X-Gm-Message-State":"AOAM532iIxRI5tOFoaHVfsxtrFPPtbASVBUkpfgJyNkSt1/+W2vqdGE3\n\t1qx/9DWrmQljlgdPe9wJ5mM262WXV8oI+jTlmWLKHA==","X-Google-Smtp-Source":"ABdhPJzYc2m0jhLr0vIcHAQhaPkuYX5UH3aG+ol2elcl5dranaWvFfHA5Evx83n2/EdP3WVfTAatSnoTF4wCOPL0QcU=","X-Received":"by 2002:a19:cd1:: with SMTP id\n\t200mr12056976lfm.171.1614001254237; \n\tMon, 22 Feb 2021 05:40:54 -0800 (PST)","MIME-Version":"1.0","References":"<20210218170126.2060783-1-naush@raspberrypi.com>\n\t<20210218170126.2060783-4-naush@raspberrypi.com>\n\t<YDLny6WKLkLIhnAJ@pendragon.ideasonboard.com>","In-Reply-To":"<YDLny6WKLkLIhnAJ@pendragon.ideasonboard.com>","From":"Naushir Patuck <naush@raspberrypi.com>","Date":"Mon, 22 Feb 2021 13:40:38 +0000","Message-ID":"<CAEmqJPr4sZ90bv1a=1JE=Chy58GGjWw5NccdaTSi=+kGu=UEcQ@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v3 3/4] pipeline: raspberrypi: Only\n\tenabled embedded stream when 
available","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"multipart/mixed;\n\tboundary=\"===============1833724446994623519==\"","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":15298,"web_url":"https://patchwork.libcamera.org/comment/15298/","msgid":"<CAEmqJPoSdTPgZ0kd8hb1Jim2t2cQbMi7D0jd4812oEMMWvTjGg@mail.gmail.com>","date":"2021-02-23T07:51:58","subject":"Re: [libcamera-devel] [PATCH v3 3/4] pipeline: raspberrypi: Only\n\tenabled embedded stream when available","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/people/34/","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"content":"Hi Laurent,\n\nOn Mon, 22 Feb 2021 at 13:40, Naushir Patuck <naush@raspberrypi.com> wrote:\n\n<snip>\n\nOne option to move this to match() time would be for the IPA to return\n>> if the sensor supports metadata from init() instead of configure(). That\n>> would require passing the sensor name to the IPA in init() too.\n>>\n>\n>> It's the mix n' match that bothers me I think. 
I don't mind if we decide\n>> to model the pipeline handler and IPA with the assumption that there can\n>> only be a single camera, with a hardcoded pipeline known from the very\n>> beginning, or in a more dynamic way that could allow runtime switching\n>> between sensors, but it would be nice if the architecture of the\n>> pipeline handler and IPA was consistent relatively to the model we pick.\n>> Does this make sense ?\n>>\n>\n> Yes, I do agree with this as well.  I am happy for ipa->init() to pass\n> back the required\n> parameters so match() can be used to open the embedded data node if\n> required.\n> I presume we have all the tools needed to do this with the IPA interfaces\n> change to\n> use mojom definitions?  If so I can update the signature of ipa->init() to\n> pass in the\n> sensor name and return out the \"SensorConfig\" parameters.\n>\n\nI had a very brief go at prototyping ipa->init() returning the SensorConfig\nparameters, but\nunfortunately I ran into a few issues.  It seems like the mojom interface\ngenerating script\n(mojom_libcamera_generator.py) requires the init() method to return out\nonly one integer\nparameter:\n\n    # Validate parameters to init()\n    ValidateSingleLength(f_init.parameters, 'input parameter to init()')\n    ValidateSingleLength(f_init.response_parameters, 'output parameter from init()')\n    if f_init.parameters[0].kind.mojom_name != 'IPASettings':\n        raise Exception('init() must have single IPASettings input parameter')\n    if f_init.response_parameters[0].kind.spec != 'i32':\n        raise Exception('init() must have single int32 output parameter')\n\nSimply commenting out these lines does not help me either, as I think the\nIPA proxy\ntemplate(?) 
requires a specific signature as well.\n\nAny advice on how to get around this?","headers":{"X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025","References":"<20210218170126.2060783-1-naush@raspberrypi.com>\n\t<20210218170126.2060783-4-naush@raspberrypi.com>\n\t<YDLny6WKLkLIhnAJ@pendragon.ideasonboard.com>\n\t<CAEmqJPr4sZ90bv1a=1JE=Chy58GGjWw5NccdaTSi=+kGu=UEcQ@mail.gmail.com>","In-Reply-To":"<CAEmqJPr4sZ90bv1a=1JE=Chy58GGjWw5NccdaTSi=+kGu=UEcQ@mail.gmail.com>","From":"Naushir Patuck <naush@raspberrypi.com>","Date":"Tue, 23 Feb 2021 07:51:58 +0000","Message-ID":"<CAEmqJPoSdTPgZ0kd8hb1Jim2t2cQbMi7D0jd4812oEMMWvTjGg@mail.gmail.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Subject":"Re: [libcamera-devel] [PATCH v3 3/4] pipeline: raspberrypi: Only\n\tenabled embedded stream when 
available","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>"}}]