[{"id":15463,"web_url":"https://patchwork.libcamera.org/comment/15463/","msgid":"<YEAvuFZy8Bbh29SU@pendragon.ideasonboard.com>","date":"2021-03-04T00:54:16","subject":"Re: [libcamera-devel] [PATCH v4 1/4] pipeline: ipa: raspberrypi:\n\tPass exposure/gain values to IPA through controls","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Naush,\n\nThank you for the patch.\n\nOn Tue, Mar 02, 2021 at 03:11:08PM +0000, Naushir Patuck wrote:\n> When running with sensors that had no embedded data, the pipeline handler\n> would fill a dummy embedded data buffer with gain/exposure values, and\n> pass this buffer to the IPA together with the bayer buffer. The IPA would\n> extract these values for use in the controller algorithms.\n> \n> Rework this logic entirely by having a new RPiCameraData::BayerFrame\n> queue to replace the existing bayer queue. In addition to storing the\n> FrameBuffer pointer, this also stores all the controls tracked by\n> DelayedControls for that frame in a ControlList. This includes the\n> exposure and gain values. 
On signalling the RPi::IPA_EVENT_SIGNAL_ISP_PREPARE\n> IPA event, the pipeline handler now passes this ControlList from the\n> RPiCameraData::BayerFrame queue.\n> \n> The IPA now extracts the gain and exposure values from the ControlList\n> instead of using RPiController::MdParserRPi to parse the embedded data\n> buffer.\n> \n> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>\n> Tested-by: David Plowman <david.plowman@raspberrypi.com>\n\nReviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n\n> ---\n>  include/libcamera/ipa/raspberrypi.mojom       |   2 +\n>  src/ipa/raspberrypi/raspberrypi.cpp           | 132 +++++++++++-------\n>  .../pipeline/raspberrypi/raspberrypi.cpp      |  62 ++++----\n>  3 files changed, 111 insertions(+), 85 deletions(-)\n> \n> diff --git a/include/libcamera/ipa/raspberrypi.mojom b/include/libcamera/ipa/raspberrypi.mojom\n> index 5a27b1e4fc2d..f733a2cd2a40 100644\n> --- a/include/libcamera/ipa/raspberrypi.mojom\n> +++ b/include/libcamera/ipa/raspberrypi.mojom\n> @@ -29,6 +29,8 @@ struct SensorConfig {\n>  struct ISPConfig {\n>  \tuint32 embeddedBufferId;\n>  \tuint32 bayerBufferId;\n> +\tbool embeddedBufferPresent;\n> +\tControlList controls;\n>  };\n>  \n>  struct ConfigInput {\n> diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp\n> index 1226ea514521..f9ab6417866e 100644\n> --- a/src/ipa/raspberrypi/raspberrypi.cpp\n> +++ b/src/ipa/raspberrypi/raspberrypi.cpp\n> @@ -1,6 +1,6 @@\n>  /* SPDX-License-Identifier: BSD-2-Clause */\n>  /*\n> - * Copyright (C) 2019-2020, Raspberry Pi (Trading) Ltd.\n> + * Copyright (C) 2019-2021, Raspberry Pi (Trading) Ltd.\n>   *\n>   * rpi.cpp - Raspberry Pi Image Processing Algorithms\n>   */\n> @@ -101,9 +101,11 @@ private:\n>  \tbool validateIspControls();\n>  \tvoid queueRequest(const ControlList &controls);\n>  \tvoid returnEmbeddedBuffer(unsigned int bufferId);\n> -\tvoid prepareISP(unsigned int bufferId);\n> +\tvoid prepareISP(const ipa::RPi::ISPConfig 
&data);\n>  \tvoid reportMetadata();\n>  \tbool parseEmbeddedData(unsigned int bufferId, struct DeviceStatus &deviceStatus);\n> +\tvoid fillDeviceStatus(uint32_t exposureLines, uint32_t gainCode,\n> +\t\t\t      struct DeviceStatus &deviceStatus);\n>  \tvoid processStats(unsigned int bufferId);\n>  \tvoid applyFrameDurations(double minFrameDuration, double maxFrameDuration);\n>  \tvoid applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls);\n> @@ -447,7 +449,7 @@ void IPARPi::signalIspPrepare(const ipa::RPi::ISPConfig &data)\n>  \t * avoid running the control algos for a few frames in case\n>  \t * they are \"unreliable\".\n>  \t */\n> -\tprepareISP(data.embeddedBufferId);\n> +\tprepareISP(data);\n>  \tframeCount_++;\n>  \n>  \t/* Ready to push the input buffer into the ISP. */\n> @@ -909,67 +911,84 @@ void IPARPi::returnEmbeddedBuffer(unsigned int bufferId)\n>  \tembeddedComplete.emit(bufferId & ipa::RPi::MaskID);\n>  }\n>  \n> -void IPARPi::prepareISP(unsigned int bufferId)\n> +void IPARPi::prepareISP(const ipa::RPi::ISPConfig &data)\n>  {\n>  \tstruct DeviceStatus deviceStatus = {};\n> -\tbool success = parseEmbeddedData(bufferId, deviceStatus);\n> +\tbool success = false;\n>  \n> -\t/* Done with embedded data now, return to pipeline handler asap. */\n> -\treturnEmbeddedBuffer(bufferId);\n> +\tif (data.embeddedBufferPresent) {\n> +\t\t/*\n> +\t\t * Pipeline handler has supplied us with an embedded data buffer,\n> +\t\t * so parse it and extract the exposure and gain.\n> +\t\t */\n> +\t\tsuccess = parseEmbeddedData(data.embeddedBufferId, deviceStatus);\n>  \n> -\tif (success) {\n> -\t\tControlList ctrls(ispCtrls_);\n> +\t\t/* Done with embedded data now, return to pipeline handler asap. 
*/\n> +\t\treturnEmbeddedBuffer(data.embeddedBufferId);\n> +\t}\n>  \n> -\t\trpiMetadata_.Clear();\n> -\t\trpiMetadata_.Set(\"device.status\", deviceStatus);\n> -\t\tcontroller_.Prepare(&rpiMetadata_);\n> +\tif (!success) {\n> +\t\t/*\n> +\t\t * Pipeline handler has not supplied an embedded data buffer,\n> +\t\t * or embedded data buffer parsing has failed for some reason,\n> +\t\t * so pull the exposure and gain values from the control list.\n> +\t\t */\n> +\t\tint32_t exposureLines = data.controls.get(V4L2_CID_EXPOSURE).get<int32_t>();\n> +\t\tint32_t gainCode = data.controls.get(V4L2_CID_ANALOGUE_GAIN).get<int32_t>();\n> +\t\tfillDeviceStatus(exposureLines, gainCode, deviceStatus);\n> +\t}\n>  \n> -\t\t/* Lock the metadata buffer to avoid constant locks/unlocks. */\n> -\t\tstd::unique_lock<RPiController::Metadata> lock(rpiMetadata_);\n> +\tControlList ctrls(ispCtrls_);\n>  \n> -\t\tAwbStatus *awbStatus = rpiMetadata_.GetLocked<AwbStatus>(\"awb.status\");\n> -\t\tif (awbStatus)\n> -\t\t\tapplyAWB(awbStatus, ctrls);\n> +\trpiMetadata_.Clear();\n> +\trpiMetadata_.Set(\"device.status\", deviceStatus);\n> +\tcontroller_.Prepare(&rpiMetadata_);\n>  \n> -\t\tCcmStatus *ccmStatus = rpiMetadata_.GetLocked<CcmStatus>(\"ccm.status\");\n> -\t\tif (ccmStatus)\n> -\t\t\tapplyCCM(ccmStatus, ctrls);\n> +\t/* Lock the metadata buffer to avoid constant locks/unlocks. 
*/\n> +\tstd::unique_lock<RPiController::Metadata> lock(rpiMetadata_);\n>  \n> -\t\tAgcStatus *dgStatus = rpiMetadata_.GetLocked<AgcStatus>(\"agc.status\");\n> -\t\tif (dgStatus)\n> -\t\t\tapplyDG(dgStatus, ctrls);\n> +\tAwbStatus *awbStatus = rpiMetadata_.GetLocked<AwbStatus>(\"awb.status\");\n> +\tif (awbStatus)\n> +\t\tapplyAWB(awbStatus, ctrls);\n>  \n> -\t\tAlscStatus *lsStatus = rpiMetadata_.GetLocked<AlscStatus>(\"alsc.status\");\n> -\t\tif (lsStatus)\n> -\t\t\tapplyLS(lsStatus, ctrls);\n> +\tCcmStatus *ccmStatus = rpiMetadata_.GetLocked<CcmStatus>(\"ccm.status\");\n> +\tif (ccmStatus)\n> +\t\tapplyCCM(ccmStatus, ctrls);\n>  \n> -\t\tContrastStatus *contrastStatus = rpiMetadata_.GetLocked<ContrastStatus>(\"contrast.status\");\n> -\t\tif (contrastStatus)\n> -\t\t\tapplyGamma(contrastStatus, ctrls);\n> +\tAgcStatus *dgStatus = rpiMetadata_.GetLocked<AgcStatus>(\"agc.status\");\n> +\tif (dgStatus)\n> +\t\tapplyDG(dgStatus, ctrls);\n>  \n> -\t\tBlackLevelStatus *blackLevelStatus = rpiMetadata_.GetLocked<BlackLevelStatus>(\"black_level.status\");\n> -\t\tif (blackLevelStatus)\n> -\t\t\tapplyBlackLevel(blackLevelStatus, ctrls);\n> +\tAlscStatus *lsStatus = rpiMetadata_.GetLocked<AlscStatus>(\"alsc.status\");\n> +\tif (lsStatus)\n> +\t\tapplyLS(lsStatus, ctrls);\n>  \n> -\t\tGeqStatus *geqStatus = rpiMetadata_.GetLocked<GeqStatus>(\"geq.status\");\n> -\t\tif (geqStatus)\n> -\t\t\tapplyGEQ(geqStatus, ctrls);\n> +\tContrastStatus *contrastStatus = rpiMetadata_.GetLocked<ContrastStatus>(\"contrast.status\");\n> +\tif (contrastStatus)\n> +\t\tapplyGamma(contrastStatus, ctrls);\n>  \n> -\t\tDenoiseStatus *denoiseStatus = rpiMetadata_.GetLocked<DenoiseStatus>(\"denoise.status\");\n> -\t\tif (denoiseStatus)\n> -\t\t\tapplyDenoise(denoiseStatus, ctrls);\n> +\tBlackLevelStatus *blackLevelStatus = rpiMetadata_.GetLocked<BlackLevelStatus>(\"black_level.status\");\n> +\tif (blackLevelStatus)\n> +\t\tapplyBlackLevel(blackLevelStatus, ctrls);\n>  \n> -\t\tSharpenStatus 
*sharpenStatus = rpiMetadata_.GetLocked<SharpenStatus>(\"sharpen.status\");\n> -\t\tif (sharpenStatus)\n> -\t\t\tapplySharpen(sharpenStatus, ctrls);\n> +\tGeqStatus *geqStatus = rpiMetadata_.GetLocked<GeqStatus>(\"geq.status\");\n> +\tif (geqStatus)\n> +\t\tapplyGEQ(geqStatus, ctrls);\n>  \n> -\t\tDpcStatus *dpcStatus = rpiMetadata_.GetLocked<DpcStatus>(\"dpc.status\");\n> -\t\tif (dpcStatus)\n> -\t\t\tapplyDPC(dpcStatus, ctrls);\n> +\tDenoiseStatus *denoiseStatus = rpiMetadata_.GetLocked<DenoiseStatus>(\"denoise.status\");\n> +\tif (denoiseStatus)\n> +\t\tapplyDenoise(denoiseStatus, ctrls);\n>  \n> -\t\tif (!ctrls.empty())\n> -\t\t\tsetIspControls.emit(ctrls);\n> -\t}\n> +\tSharpenStatus *sharpenStatus = rpiMetadata_.GetLocked<SharpenStatus>(\"sharpen.status\");\n> +\tif (sharpenStatus)\n> +\t\tapplySharpen(sharpenStatus, ctrls);\n> +\n> +\tDpcStatus *dpcStatus = rpiMetadata_.GetLocked<DpcStatus>(\"dpc.status\");\n> +\tif (dpcStatus)\n> +\t\tapplyDPC(dpcStatus, ctrls);\n> +\n> +\tif (!ctrls.empty())\n> +\t\tsetIspControls.emit(ctrls);\n>  }\n>  \n>  bool IPARPi::parseEmbeddedData(unsigned int bufferId, struct DeviceStatus &deviceStatus)\n> @@ -985,6 +1004,7 @@ bool IPARPi::parseEmbeddedData(unsigned int bufferId, struct DeviceStatus &devic\n>  \tRPiController::MdParser::Status status = helper_->Parser().Parse(mem.data());\n>  \tif (status != RPiController::MdParser::Status::OK) {\n>  \t\tLOG(IPARPI, Error) << \"Embedded Buffer parsing failed, error \" << status;\n> +\t\treturn false;\n>  \t} else {\n>  \t\tuint32_t exposureLines, gainCode;\n>  \t\tif (helper_->Parser().GetExposureLines(exposureLines) != RPiController::MdParser::Status::OK) {\n> @@ -992,21 +1012,29 @@ bool IPARPi::parseEmbeddedData(unsigned int bufferId, struct DeviceStatus &devic\n>  \t\t\treturn false;\n>  \t\t}\n>  \n> -\t\tdeviceStatus.shutter_speed = helper_->Exposure(exposureLines);\n>  \t\tif (helper_->Parser().GetGainCode(gainCode) != RPiController::MdParser::Status::OK) {\n>  
\t\t\tLOG(IPARPI, Error) << \"Gain failed\";\n>  \t\t\treturn false;\n>  \t\t}\n>  \n> -\t\tdeviceStatus.analogue_gain = helper_->Gain(gainCode);\n> -\t\tLOG(IPARPI, Debug) << \"Metadata - Exposure : \"\n> -\t\t\t\t   << deviceStatus.shutter_speed << \" Gain : \"\n> -\t\t\t\t   << deviceStatus.analogue_gain;\n> +\t\tfillDeviceStatus(exposureLines, gainCode, deviceStatus);\n>  \t}\n>  \n>  \treturn true;\n>  }\n>  \n> +void IPARPi::fillDeviceStatus(uint32_t exposureLines, uint32_t gainCode,\n> +\t\t\t      struct DeviceStatus &deviceStatus)\n> +{\n> +\tdeviceStatus.shutter_speed = helper_->Exposure(exposureLines);\n> +\tdeviceStatus.analogue_gain = helper_->Gain(gainCode);\n> +\n> +\tLOG(IPARPI, Debug) << \"Metadata - Exposure : \"\n> +\t\t\t   << deviceStatus.shutter_speed\n> +\t\t\t   << \" Gain : \"\n> +\t\t\t   << deviceStatus.analogue_gain;\n> +}\n> +\n>  void IPARPi::processStats(unsigned int bufferId)\n>  {\n>  \tauto it = buffers_.find(bufferId);\n> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> index ba74ace183db..d057241b9c76 100644\n> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> @@ -1,6 +1,6 @@\n>  /* SPDX-License-Identifier: LGPL-2.1-or-later */\n>  /*\n> - * Copyright (C) 2019-2020, Raspberry Pi (Trading) Ltd.\n> + * Copyright (C) 2019-2021, Raspberry Pi (Trading) Ltd.\n>   *\n>   * raspberrypi.cpp - Pipeline handler for Raspberry Pi devices\n>   */\n> @@ -197,7 +197,13 @@ public:\n>  \t */\n>  \tenum class State { Stopped, Idle, Busy, IpaComplete };\n>  \tState state_;\n> -\tstd::queue<FrameBuffer *> bayerQueue_;\n> +\n> +\tstruct BayerFrame {\n> +\t\tFrameBuffer *buffer;\n> +\t\tControlList controls;\n> +\t};\n> +\n> +\tstd::queue<BayerFrame> bayerQueue_;\n>  \tstd::queue<FrameBuffer *> embeddedQueue_;\n>  \tstd::deque<Request *> requestQueue_;\n>  \n> @@ -222,7 +228,7 @@ public:\n>  private:\n>  \tvoid 
checkRequestCompleted();\n>  \tvoid tryRunPipeline();\n> -\tbool findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *&embeddedBuffer);\n> +\tbool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer);\n>  \n>  \tunsigned int ispOutputCount_;\n>  };\n> @@ -1355,7 +1361,7 @@ void RPiCameraData::setIspControls(const ControlList &controls)\n>  void RPiCameraData::setDelayedControls(const ControlList &controls)\n>  {\n>  \tif (!delayedCtrls_->push(controls))\n> -\t\tLOG(RPI, Error) << \"V4L2 staggered set failed\";\n> +\t\tLOG(RPI, Error) << \"V4L2 DelayedControl set failed\";\n>  \thandleState();\n>  }\n>  \n> @@ -1383,29 +1389,14 @@ void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)\n>  \t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n>  \n>  \tif (stream == &unicam_[Unicam::Image]) {\n> -\t\tbayerQueue_.push(buffer);\n> -\t} else {\n> -\t\tembeddedQueue_.push(buffer);\n> -\n> -\t\tControlList ctrl = delayedCtrls_->get(buffer->metadata().sequence);\n> -\n>  \t\t/*\n> -\t\t * Sensor metadata is unavailable, so put the expected ctrl\n> -\t\t * values (accounting for the staggered delays) into the empty\n> -\t\t * metadata buffer.\n> +\t\t * Lookup the sensor controls used for this frame sequence from\n> +\t\t * DelayedControl and queue them along with the frame buffer.\n>  \t\t */\n> -\t\tif (!sensorMetadata_) {\n> -\t\t\tunsigned int bufferId = unicam_[Unicam::Embedded].getBufferId(buffer);\n> -\t\t\tauto it = mappedEmbeddedBuffers_.find(bufferId);\n> -\t\t\tif (it != mappedEmbeddedBuffers_.end()) {\n> -\t\t\t\tuint32_t *mem = reinterpret_cast<uint32_t *>(it->second.maps()[0].data());\n> -\t\t\t\tmem[0] = ctrl.get(V4L2_CID_EXPOSURE).get<int32_t>();\n> -\t\t\t\tmem[1] = ctrl.get(V4L2_CID_ANALOGUE_GAIN).get<int32_t>();\n> -\t\t\t} else {\n> -\t\t\t\tLOG(RPI, Warning) << \"Failed to find embedded buffer \"\n> -\t\t\t\t\t\t  << bufferId;\n> -\t\t\t}\n> -\t\t}\n> +\t\tControlList ctrl = 
delayedCtrls_->get(buffer->metadata().sequence);\n> +\t\tbayerQueue_.push({ buffer, std::move(ctrl) });\n> +\t} else {\n> +\t\tembeddedQueue_.push(buffer);\n>  \t}\n>  \n>  \thandleState();\n> @@ -1656,14 +1647,15 @@ void RPiCameraData::applyScalerCrop(const ControlList &controls)\n>  \n>  void RPiCameraData::tryRunPipeline()\n>  {\n> -\tFrameBuffer *bayerBuffer, *embeddedBuffer;\n> +\tFrameBuffer *embeddedBuffer;\n> +\tBayerFrame bayerFrame;\n>  \n>  \t/* If any of our request or buffer queues are empty, we cannot proceed. */\n>  \tif (state_ != State::Idle || requestQueue_.empty() ||\n>  \t    bayerQueue_.empty() || embeddedQueue_.empty())\n>  \t\treturn;\n>  \n> -\tif (!findMatchingBuffers(bayerBuffer, embeddedBuffer))\n> +\tif (!findMatchingBuffers(bayerFrame, embeddedBuffer))\n>  \t\treturn;\n>  \n>  \t/* Take the first request from the queue and action the IPA. */\n> @@ -1682,7 +1674,7 @@ void RPiCameraData::tryRunPipeline()\n>  \t/* Set our state to say the pipeline is active. */\n>  \tstate_ = State::Busy;\n>  \n> -\tunsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerBuffer);\n> +\tunsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);\n>  \tunsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n>  \n>  \tLOG(RPI, Debug) << \"Signalling signalIspPrepare:\"\n> @@ -1692,17 +1684,19 @@ void RPiCameraData::tryRunPipeline()\n>  \tipa::RPi::ISPConfig ispPrepare;\n>  \tispPrepare.embeddedBufferId = ipa::RPi::MaskEmbeddedData | embeddedId;\n>  \tispPrepare.bayerBufferId = ipa::RPi::MaskBayerData | bayerId;\n> +\tispPrepare.embeddedBufferPresent = sensorMetadata_;\n> +\tispPrepare.controls = std::move(bayerFrame.controls);\n>  \tipa_->signalIspPrepare(ispPrepare);\n>  }\n>  \n> -bool RPiCameraData::findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *&embeddedBuffer)\n> +bool RPiCameraData::findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer)\n>  {\n>  \tunsigned int 
embeddedRequeueCount = 0, bayerRequeueCount = 0;\n>  \n>  \t/* Loop until we find a matching bayer and embedded data buffer. */\n>  \twhile (!bayerQueue_.empty()) {\n>  \t\t/* Start with the front of the bayer queue. */\n> -\t\tbayerBuffer = bayerQueue_.front();\n> +\t\tFrameBuffer *bayerBuffer = bayerQueue_.front().buffer;\n>  \n>  \t\t/*\n>  \t\t * Find the embedded data buffer with a matching timestamp to pass to\n> @@ -1739,7 +1733,7 @@ bool RPiCameraData::findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *\n>  \t\t\t\t * the front of the queue. This buffer is now orphaned, so requeue\n>  \t\t\t\t * it back to the device.\n>  \t\t\t\t */\n> -\t\t\t\tunicam_[Unicam::Image].queueBuffer(bayerQueue_.front());\n> +\t\t\t\tunicam_[Unicam::Image].queueBuffer(bayerQueue_.front().buffer);\n>  \t\t\t\tbayerQueue_.pop();\n>  \t\t\t\tbayerRequeueCount++;\n>  \t\t\t\tLOG(RPI, Warning) << \"Dropping unmatched input frame in stream \"\n> @@ -1757,7 +1751,7 @@ bool RPiCameraData::findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *\n>  \n>  \t\t\t\tLOG(RPI, Warning) << \"Flushing bayer stream!\";\n>  \t\t\t\twhile (!bayerQueue_.empty()) {\n> -\t\t\t\t\tunicam_[Unicam::Image].queueBuffer(bayerQueue_.front());\n> +\t\t\t\t\tunicam_[Unicam::Image].queueBuffer(bayerQueue_.front().buffer);\n>  \t\t\t\t\tbayerQueue_.pop();\n>  \t\t\t\t}\n>  \t\t\t\tflushedBuffers = true;\n> @@ -1790,8 +1784,10 @@ bool RPiCameraData::findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *\n>  \t\t} else {\n>  \t\t\t/*\n>  \t\t\t * We have found a matching bayer and embedded data buffer, so\n> -\t\t\t * nothing more to do apart from popping the buffers from the queue.\n> +\t\t\t * nothing more to do apart from assigning the bayer frame and\n> +\t\t\t * popping the buffers from the queue.\n>  \t\t\t */\n> +\t\t\tbayerFrame = std::move(bayerQueue_.front());\n>  \t\t\tbayerQueue_.pop();\n>  \t\t\tembeddedQueue_.pop();\n>  \t\t\treturn 
true;","headers":{"Date":"Thu, 4 Mar 2021 02:54:16 +0200","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"Naushir Patuck <naush@raspberrypi.com>","Message-ID":"<YEAvuFZy8Bbh29SU@pendragon.ideasonboard.com>","References":"<20210302151111.212591-1-naush@raspberrypi.com>\n\t<20210302151111.212591-2-naush@raspberrypi.com>","MIME-Version":"1.0","In-Reply-To":"<20210302151111.212591-2-naush@raspberrypi.com>","Subject":"Re: [libcamera-devel] [PATCH v4 1/4] pipeline: ipa: raspberrypi:\n\tPass exposure/gain values to IPA through controls","Cc":"libcamera-devel@lists.libcamera.org","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit"}}]