{"id":11307,"url":"https://patchwork.libcamera.org/api/1.1/patches/11307/?format=json","web_url":"https://patchwork.libcamera.org/patch/11307/","project":{"id":1,"url":"https://patchwork.libcamera.org/api/1.1/projects/1/?format=json","name":"libcamera","link_name":"libcamera","list_id":"libcamera_core","list_email":"libcamera-devel@lists.libcamera.org","web_url":"","scm_url":"","webscm_url":""},"msgid":"<20210216103140.1077307-2-naush@raspberrypi.com>","date":"2021-02-16T10:31:37","name":"[libcamera-devel,1/4] pipeline: ipa: raspberrypi: Pass exposure/gain values to IPA though controls","commit_ref":null,"pull_url":null,"state":"changes-requested","archived":false,"hash":"b136a22e1c2cef4aca38fcb4952b5c6ebcc0721e","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/1.1/people/34/?format=json","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"delegate":null,"mbox":"https://patchwork.libcamera.org/patch/11307/mbox/","series":[{"id":1696,"url":"https://patchwork.libcamera.org/api/1.1/series/1696/?format=json","web_url":"https://patchwork.libcamera.org/project/libcamera/list/?series=1696","date":"2021-02-16T10:31:36","name":"Raspberry Pi: Embedded data usage","version":1,"mbox":"https://patchwork.libcamera.org/series/1696/mbox/"}],"comments":"https://patchwork.libcamera.org/api/patches/11307/comments/","check":"pending","checks":"https://patchwork.libcamera.org/api/patches/11307/checks/","tags":{},"headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id BDD51BD160\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue, 16 Feb 2021 10:32:14 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 72C70637E6;\n\tTue, 16 Feb 2021 
11:32:14 +0100 (CET)","from mail-wm1-x335.google.com (mail-wm1-x335.google.com\n\t[IPv6:2a00:1450:4864:20::335])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 5A4F5637D6\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 16 Feb 2021 11:32:12 +0100 (CET)","by mail-wm1-x335.google.com with SMTP id l17so8711461wmq.2\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 16 Feb 2021 02:32:12 -0800 (PST)","from naush-laptop.patuck.local ([88.97.76.4])\n\tby smtp.gmail.com with ESMTPSA id\n\ta186sm3018054wme.17.2021.02.16.02.32.10\n\t(version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256);\n\tTue, 16 Feb 2021 02:32:10 -0800 (PST)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"pyPvDH5w\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=from:to:cc:subject:date:message-id:in-reply-to:references\n\t:mime-version:content-transfer-encoding;\n\tbh=El5Uie/+IQnGOiQKboBAz6Cg/G6vkVYzBqWd3DSJDlc=;\n\tb=pyPvDH5wEFmX6y8IJsPRhvbSq6Gxu5NYXAm8/jGaLcc1ImlK1TwQwLceM4DiaS6rQS\n\tmXIwwV3LUw+DgsC9vpd/aszUZpSh4g4Rcf8Run2ZB1AUSoCjeg33JNlos7vK7bw5VCt5\n\t9Qi8u/g31FcXEGfhvrsIXOGBfQi8qXevCmkY2eWpykCDETgrR37nA0PBLeP01q+bFgk+\n\thUOOpm1e0ma07OwF0o36RXQIiefTGCy8NXiI8PDezm35HDU7MpdyDQPTloS0LtNwbWcS\n\tR4NBvix2alceVPJtZUQGptcaFAmzV7ltt9ds0Dr1F/T7baHR1tSf/OXAvyQOWSqA4KQq\n\tYNpQ==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025;\n\th=x-gm-message-state:from:to:cc:subject:date:message-id:in-reply-to\n\t:references:mime-version:content-transfer-encoding;\n\tbh=El5Uie/+IQnGOiQKboBAz6Cg/G6vkVYzBqWd3DSJDlc=;\n\tb=omSjduDJLJUBHpSX94pPteAWwOrrMntgBGpnDovw7OGDa5QbmGYcLxgUZshhn9wzuI\n\tXB/KDrmdmj43bGMhb8I2qBWV6lf+Lux2NdqCL8oOSuYRqgulmXU1GCPRnnXE22w10EfU\n\tP+t++D+Cw7IVVXTABgYJcE1RKArQa130j54Te/Egp9TGbOqI87ZpsYblmKPwRBB6109B\n\tItTdOGUO+kUiwUn8/6fr5WvvMZOM3nLnHSTvPfdKvOGPo45bRtMGzrVhF8D24T7cwppv\n\tNpcEQF2ho4+m7LwKgnmZSz5a0dadyT+Tj/ltUCim3NRQFqy64pqpWyjfmkwCGMEzKSEK\n\t/poQ==","X-Gm-Message-State":"AOAM5316sDUMpaoiv4pc/hYUXeyIjIgNmigRm3kA6U5V9K5xc2ybo18j\n\tS6BzzyJUlZrMLOAOdqVdGKfKbbpVW6cnbUmD","X-Google-Smtp-Source":"ABdhPJwdyGYsalnApzZkYEiLi+xG0UTZo/ASlYp8lhvNMJ+zYwABWWoUKUlu/8t/sneVc5G+fC7LyA==","X-Received":"by 2002:a1c:545d:: with SMTP id p29mr2689810wmi.54.1613471531445;\n\tTue, 16 Feb 2021 02:32:11 -0800 (PST)","From":"Naushir Patuck <naush@raspberrypi.com>","To":"libcamera-devel@lists.libcamera.org","Date":"Tue, 16 Feb 2021 10:31:37 +0000","Message-Id":"<20210216103140.1077307-2-naush@raspberrypi.com>","X-Mailer":"git-send-email 2.25.1","In-Reply-To":"<20210216103140.1077307-1-naush@raspberrypi.com>","References":"<20210216103140.1077307-1-naush@raspberrypi.com>","MIME-Version":"1.0","Subject":"[libcamera-devel] [PATCH 1/4] pipeline: ipa: raspberrypi: Pass\n\texposure/gain values to IPA though 
controls","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Content-Type":"text/plain; charset=\"us-ascii\"","Content-Transfer-Encoding":"7bit","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"},"content":"When running with sensors that had no embedded data, the pipeline handler\nwould fill a dummy embedded data buffer with gain/exposure values, and\npass this buffer to the IPA together with the bayer buffer. The IPA would\nextract these values for use in the controller algorithms.\n\nRework this logic entirely by having a new RPiCameraData::BayerFrame\nqueue to replace the existing bayer queue. In addition to storing the\nFrameBuffer pointer, this also stores all the controls tracked by\nDelayedControls for that frame in a ControlList. This includes\nexposure and gain values. 
On signalling the RPi::IPA_EVENT_SIGNAL_ISP_PREPARE\nIPA event, the pipeline handler now passes this ControlList from the\nRPiCameraData::BayerFrame queue.\n\nThe IPA now extracts the gain and exposure values from the ControlList\ninstead of using RPiController::MdParserRPi to parse the embedded data\nbuffer.\n\nSigned-off-by: Naushir Patuck <naush@raspberrypi.com>\n---\n src/ipa/raspberrypi/raspberrypi.cpp           | 142 +++++++++++-------\n .../pipeline/raspberrypi/raspberrypi.cpp      |  62 ++++----\n 2 files changed, 113 insertions(+), 91 deletions(-)","diff":"diff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp\nindex ef8b19baa754..30db1235953b 100644\n--- a/src/ipa/raspberrypi/raspberrypi.cpp\n+++ b/src/ipa/raspberrypi/raspberrypi.cpp\n@@ -1,6 +1,6 @@\n /* SPDX-License-Identifier: BSD-2-Clause */\n /*\n- * Copyright (C) 2019-2020, Raspberry Pi (Trading) Ltd.\n+ * Copyright (C) 2019-2021, Raspberry Pi (Trading) Ltd.\n  *\n  * rpi.cpp - Raspberry Pi Image Processing Algorithms\n  */\n@@ -99,9 +99,11 @@ private:\n \tbool validateIspControls();\n \tvoid queueRequest(const ControlList &controls);\n \tvoid returnEmbeddedBuffer(unsigned int bufferId);\n-\tvoid prepareISP(unsigned int bufferId);\n+\tvoid prepareISP(const IPAOperationData &event);\n \tvoid reportMetadata();\n \tbool parseEmbeddedData(unsigned int bufferId, struct DeviceStatus &deviceStatus);\n+\tvoid fillDeviceStatus(uint32_t exposureLines, uint32_t gainCode,\n+\t\t\t      struct DeviceStatus &deviceStatus);\n \tvoid processStats(unsigned int bufferId);\n \tvoid applyFrameDurations(double minFrameDuration, double maxFrameDuration);\n \tvoid applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls);\n@@ -451,15 +453,14 @@ void IPARPi::processEvent(const IPAOperationData &event)\n \t}\n \n \tcase RPi::IPA_EVENT_SIGNAL_ISP_PREPARE: {\n-\t\tunsigned int embeddedbufferId = event.data[0];\n-\t\tunsigned int bayerbufferId = event.data[1];\n+\t\tunsigned int bayerbufferId 
= event.data[0];\n \n \t\t/*\n \t\t * At start-up, or after a mode-switch, we may want to\n \t\t * avoid running the control algos for a few frames in case\n \t\t * they are \"unreliable\".\n \t\t */\n-\t\tprepareISP(embeddedbufferId);\n+\t\tprepareISP(event);\n \t\tframeCount_++;\n \n \t\t/* Ready to push the input buffer into the ISP. */\n@@ -939,70 +940,88 @@ void IPARPi::returnEmbeddedBuffer(unsigned int bufferId)\n \tqueueFrameAction.emit(0, op);\n }\n \n-void IPARPi::prepareISP(unsigned int bufferId)\n+void IPARPi::prepareISP(const IPAOperationData &event)\n {\n \tstruct DeviceStatus deviceStatus = {};\n-\tbool success = parseEmbeddedData(bufferId, deviceStatus);\n+\tbool success = false;\n \n-\t/* Done with embedded data now, return to pipeline handler asap. */\n-\treturnEmbeddedBuffer(bufferId);\n+\tif (event.data.size() == 2) {\n+\t\t/*\n+\t\t * Pipeline handler has supplied us with an embedded data buffer,\n+\t\t * so parse it and extract the exposure and gain.\n+\t\t */\n+\t\tunsigned int bufferId = event.data[1];\n+\t\tsuccess = parseEmbeddedData(bufferId, deviceStatus);\n \n-\tif (success) {\n-\t\tControlList ctrls(ispCtrls_);\n+\t\t/* Done with embedded data now, return to pipeline handler asap. */\n+\t\treturnEmbeddedBuffer(bufferId);\n+\t}\n \n-\t\trpiMetadata_.Clear();\n-\t\trpiMetadata_.Set(\"device.status\", deviceStatus);\n-\t\tcontroller_.Prepare(&rpiMetadata_);\n+\tif (!success) {\n+\t\t/*\n+\t\t * Pipeline handler has not supplied an embedded data buffer,\n+\t\t * or embedded data buffer parsing has failed for some reason,\n+\t\t * so pull the exposure and gain values from the control list.\n+\t\t */\n+\t\tint32_t exposureLines = event.controls[0].get(V4L2_CID_EXPOSURE).get<int32_t>();\n+\t\tint32_t gainCode = event.controls[0].get(V4L2_CID_ANALOGUE_GAIN).get<int32_t>();\n+\t\tfillDeviceStatus(exposureLines, gainCode, deviceStatus);\n+\t}\n \n-\t\t/* Lock the metadata buffer to avoid constant locks/unlocks. 
*/\n-\t\tstd::unique_lock<RPiController::Metadata> lock(rpiMetadata_);\n+\tControlList ctrls(ispCtrls_);\n \n-\t\tAwbStatus *awbStatus = rpiMetadata_.GetLocked<AwbStatus>(\"awb.status\");\n-\t\tif (awbStatus)\n-\t\t\tapplyAWB(awbStatus, ctrls);\n+\trpiMetadata_.Clear();\n+\trpiMetadata_.Set(\"device.status\", deviceStatus);\n+\tcontroller_.Prepare(&rpiMetadata_);\n \n-\t\tCcmStatus *ccmStatus = rpiMetadata_.GetLocked<CcmStatus>(\"ccm.status\");\n-\t\tif (ccmStatus)\n-\t\t\tapplyCCM(ccmStatus, ctrls);\n+\t/* Lock the metadata buffer to avoid constant locks/unlocks. */\n+\tstd::unique_lock<RPiController::Metadata> lock(rpiMetadata_);\n \n-\t\tAgcStatus *dgStatus = rpiMetadata_.GetLocked<AgcStatus>(\"agc.status\");\n-\t\tif (dgStatus)\n-\t\t\tapplyDG(dgStatus, ctrls);\n+\tAwbStatus *awbStatus = rpiMetadata_.GetLocked<AwbStatus>(\"awb.status\");\n+\tif (awbStatus)\n+\t\tapplyAWB(awbStatus, ctrls);\n \n-\t\tAlscStatus *lsStatus = rpiMetadata_.GetLocked<AlscStatus>(\"alsc.status\");\n-\t\tif (lsStatus)\n-\t\t\tapplyLS(lsStatus, ctrls);\n+\tCcmStatus *ccmStatus = rpiMetadata_.GetLocked<CcmStatus>(\"ccm.status\");\n+\tif (ccmStatus)\n+\t\tapplyCCM(ccmStatus, ctrls);\n \n-\t\tContrastStatus *contrastStatus = rpiMetadata_.GetLocked<ContrastStatus>(\"contrast.status\");\n-\t\tif (contrastStatus)\n-\t\t\tapplyGamma(contrastStatus, ctrls);\n+\tAgcStatus *dgStatus = rpiMetadata_.GetLocked<AgcStatus>(\"agc.status\");\n+\tif (dgStatus)\n+\t\tapplyDG(dgStatus, ctrls);\n \n-\t\tBlackLevelStatus *blackLevelStatus = rpiMetadata_.GetLocked<BlackLevelStatus>(\"black_level.status\");\n-\t\tif (blackLevelStatus)\n-\t\t\tapplyBlackLevel(blackLevelStatus, ctrls);\n+\tAlscStatus *lsStatus = rpiMetadata_.GetLocked<AlscStatus>(\"alsc.status\");\n+\tif (lsStatus)\n+\t\tapplyLS(lsStatus, ctrls);\n \n-\t\tGeqStatus *geqStatus = rpiMetadata_.GetLocked<GeqStatus>(\"geq.status\");\n-\t\tif (geqStatus)\n-\t\t\tapplyGEQ(geqStatus, ctrls);\n+\tContrastStatus *contrastStatus = 
rpiMetadata_.GetLocked<ContrastStatus>(\"contrast.status\");\n+\tif (contrastStatus)\n+\t\tapplyGamma(contrastStatus, ctrls);\n \n-\t\tDenoiseStatus *denoiseStatus = rpiMetadata_.GetLocked<DenoiseStatus>(\"denoise.status\");\n-\t\tif (denoiseStatus)\n-\t\t\tapplyDenoise(denoiseStatus, ctrls);\n+\tBlackLevelStatus *blackLevelStatus = rpiMetadata_.GetLocked<BlackLevelStatus>(\"black_level.status\");\n+\tif (blackLevelStatus)\n+\t\tapplyBlackLevel(blackLevelStatus, ctrls);\n \n-\t\tSharpenStatus *sharpenStatus = rpiMetadata_.GetLocked<SharpenStatus>(\"sharpen.status\");\n-\t\tif (sharpenStatus)\n-\t\t\tapplySharpen(sharpenStatus, ctrls);\n+\tGeqStatus *geqStatus = rpiMetadata_.GetLocked<GeqStatus>(\"geq.status\");\n+\tif (geqStatus)\n+\t\tapplyGEQ(geqStatus, ctrls);\n \n-\t\tDpcStatus *dpcStatus = rpiMetadata_.GetLocked<DpcStatus>(\"dpc.status\");\n-\t\tif (dpcStatus)\n-\t\t\tapplyDPC(dpcStatus, ctrls);\n+\tDenoiseStatus *denoiseStatus = rpiMetadata_.GetLocked<DenoiseStatus>(\"denoise.status\");\n+\tif (denoiseStatus)\n+\t\tapplyDenoise(denoiseStatus, ctrls);\n \n-\t\tif (!ctrls.empty()) {\n-\t\t\tIPAOperationData op;\n-\t\t\top.operation = RPi::IPA_ACTION_V4L2_SET_ISP;\n-\t\t\top.controls.push_back(ctrls);\n-\t\t\tqueueFrameAction.emit(0, op);\n-\t\t}\n+\tSharpenStatus *sharpenStatus = rpiMetadata_.GetLocked<SharpenStatus>(\"sharpen.status\");\n+\tif (sharpenStatus)\n+\t\tapplySharpen(sharpenStatus, ctrls);\n+\n+\tDpcStatus *dpcStatus = rpiMetadata_.GetLocked<DpcStatus>(\"dpc.status\");\n+\tif (dpcStatus)\n+\t\tapplyDPC(dpcStatus, ctrls);\n+\n+\tif (!ctrls.empty()) {\n+\t\tIPAOperationData op;\n+\t\top.operation = RPi::IPA_ACTION_V4L2_SET_ISP;\n+\t\top.controls.push_back(ctrls);\n+\t\tqueueFrameAction.emit(0, op);\n \t}\n }\n \n@@ -1019,6 +1038,7 @@ bool IPARPi::parseEmbeddedData(unsigned int bufferId, struct DeviceStatus &devic\n \tRPiController::MdParser::Status status = helper_->Parser().Parse(mem.data());\n \tif (status != RPiController::MdParser::Status::OK) {\n 
\t\tLOG(IPARPI, Error) << \"Embedded Buffer parsing failed, error \" << status;\n+\t\treturn false;\n \t} else {\n \t\tuint32_t exposureLines, gainCode;\n \t\tif (helper_->Parser().GetExposureLines(exposureLines) != RPiController::MdParser::Status::OK) {\n@@ -1026,21 +1046,29 @@ bool IPARPi::parseEmbeddedData(unsigned int bufferId, struct DeviceStatus &devic\n \t\t\treturn false;\n \t\t}\n \n-\t\tdeviceStatus.shutter_speed = helper_->Exposure(exposureLines);\n \t\tif (helper_->Parser().GetGainCode(gainCode) != RPiController::MdParser::Status::OK) {\n \t\t\tLOG(IPARPI, Error) << \"Gain failed\";\n \t\t\treturn false;\n \t\t}\n \n-\t\tdeviceStatus.analogue_gain = helper_->Gain(gainCode);\n-\t\tLOG(IPARPI, Debug) << \"Metadata - Exposure : \"\n-\t\t\t\t   << deviceStatus.shutter_speed << \" Gain : \"\n-\t\t\t\t   << deviceStatus.analogue_gain;\n+\t\tfillDeviceStatus(exposureLines, gainCode, deviceStatus);\n \t}\n \n \treturn true;\n }\n \n+void IPARPi::fillDeviceStatus(uint32_t exposureLines, uint32_t gainCode,\n+\t\t\t      struct DeviceStatus &deviceStatus)\n+{\n+\tdeviceStatus.shutter_speed = helper_->Exposure(exposureLines);\n+\tdeviceStatus.analogue_gain = helper_->Gain(gainCode);\n+\n+\tLOG(IPARPI, Debug) << \"Metadata - Exposure : \"\n+\t\t\t   << deviceStatus.shutter_speed\n+\t\t\t   << \" Gain : \"\n+\t\t\t   << deviceStatus.analogue_gain;\n+}\n+\n void IPARPi::processStats(unsigned int bufferId)\n {\n \tauto it = buffers_.find(bufferId);\ndiff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\nindex e47646818c73..7e744ce13172 100644\n--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n@@ -1,6 +1,6 @@\n /* SPDX-License-Identifier: LGPL-2.1-or-later */\n /*\n- * Copyright (C) 2019-2020, Raspberry Pi (Trading) Ltd.\n+ * Copyright (C) 2019-2021, Raspberry Pi (Trading) Ltd.\n  *\n  * raspberrypi.cpp - Pipeline handler for Raspberry Pi 
devices\n  */\n@@ -188,7 +188,13 @@ public:\n \t */\n \tenum class State { Stopped, Idle, Busy, IpaComplete };\n \tState state_;\n-\tstd::queue<FrameBuffer *> bayerQueue_;\n+\n+\tstruct BayerFrame {\n+\t\tFrameBuffer *buffer;\n+\t\tControlList controls;\n+\t};\n+\n+\tstd::queue<BayerFrame> bayerQueue_;\n \tstd::queue<FrameBuffer *> embeddedQueue_;\n \tstd::deque<Request *> requestQueue_;\n \n@@ -213,7 +219,7 @@ public:\n private:\n \tvoid checkRequestCompleted();\n \tvoid tryRunPipeline();\n-\tbool findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *&embeddedBuffer);\n+\tbool findMatchingBuffers(BayerFrame &bayerFrame, FrameBuffer *&embeddedBuffer);\n \n \tunsigned int ispOutputCount_;\n };\n@@ -1386,29 +1392,14 @@ void RPiCameraData::unicamBufferDequeue(FrameBuffer *buffer)\n \t\t\t<< \", timestamp: \" << buffer->metadata().timestamp;\n \n \tif (stream == &unicam_[Unicam::Image]) {\n-\t\tbayerQueue_.push(buffer);\n-\t} else {\n-\t\tembeddedQueue_.push(buffer);\n-\n-\t\tControlList ctrl = delayedCtrls_->get(buffer->metadata().sequence);\n-\n \t\t/*\n-\t\t * Sensor metadata is unavailable, so put the expected ctrl\n-\t\t * values (accounting for the staggered delays) into the empty\n-\t\t * metadata buffer.\n+\t\t * Lookup the sensor controls used for this frame sequence from\n+\t\t * StaggeredCtrl and queue them along with the frame buffer.\n \t\t */\n-\t\tif (!sensorMetadata_) {\n-\t\t\tunsigned int bufferId = unicam_[Unicam::Embedded].getBufferId(buffer);\n-\t\t\tauto it = mappedEmbeddedBuffers_.find(bufferId);\n-\t\t\tif (it != mappedEmbeddedBuffers_.end()) {\n-\t\t\t\tuint32_t *mem = reinterpret_cast<uint32_t *>(it->second.maps()[0].data());\n-\t\t\t\tmem[0] = ctrl.get(V4L2_CID_EXPOSURE).get<int32_t>();\n-\t\t\t\tmem[1] = ctrl.get(V4L2_CID_ANALOGUE_GAIN).get<int32_t>();\n-\t\t\t} else {\n-\t\t\t\tLOG(RPI, Warning) << \"Failed to find embedded buffer \"\n-\t\t\t\t\t\t  << bufferId;\n-\t\t\t}\n-\t\t}\n+\t\tControlList ctrl = 
delayedCtrls_->get(buffer->metadata().sequence);\n+\t\tbayerQueue_.push({ buffer, std::move(ctrl) });\n+\t} else {\n+\t\tembeddedQueue_.push(buffer);\n \t}\n \n \thandleState();\n@@ -1662,7 +1653,8 @@ void RPiCameraData::applyScalerCrop(const ControlList &controls)\n \n void RPiCameraData::tryRunPipeline()\n {\n-\tFrameBuffer *bayerBuffer, *embeddedBuffer;\n+\tFrameBuffer *embeddedBuffer;\n+\tBayerFrame bayerFrame;\n \tIPAOperationData op;\n \n \t/* If any of our request or buffer queues are empty, we cannot proceed. */\n@@ -1670,7 +1662,7 @@ void RPiCameraData::tryRunPipeline()\n \t    bayerQueue_.empty() || embeddedQueue_.empty())\n \t\treturn;\n \n-\tif (!findMatchingBuffers(bayerBuffer, embeddedBuffer))\n+\tif (!findMatchingBuffers(bayerFrame, embeddedBuffer))\n \t\treturn;\n \n \t/* Take the first request from the queue and action the IPA. */\n@@ -1691,7 +1683,7 @@ void RPiCameraData::tryRunPipeline()\n \t/* Set our state to say the pipeline is active. */\n \tstate_ = State::Busy;\n \n-\tunsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerBuffer);\n+\tunsigned int bayerId = unicam_[Unicam::Image].getBufferId(bayerFrame.buffer);\n \tunsigned int embeddedId = unicam_[Unicam::Embedded].getBufferId(embeddedBuffer);\n \n \tLOG(RPI, Debug) << \"Signalling RPi::IPA_EVENT_SIGNAL_ISP_PREPARE:\"\n@@ -1699,26 +1691,28 @@ void RPiCameraData::tryRunPipeline()\n \t\t\t<< \" Embedded buffer id: \" << embeddedId;\n \n \top.operation = RPi::IPA_EVENT_SIGNAL_ISP_PREPARE;\n-\top.data = { RPi::BufferMask::EMBEDDED_DATA | embeddedId,\n-\t\t    RPi::BufferMask::BAYER_DATA | bayerId };\n+\top.data.push_back(RPi::BufferMask::BAYER_DATA | bayerId);\n+\top.data.push_back(RPi::BufferMask::EMBEDDED_DATA | embeddedId);\n+\top.controls.emplace_back(std::move(bayerFrame.controls));\n+\n \tipa_->processEvent(op);\n }\n \n-bool RPiCameraData::findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *&embeddedBuffer)\n+bool RPiCameraData::findMatchingBuffers(BayerFrame 
&bayerFrame, FrameBuffer *&embeddedBuffer)\n {\n \tunsigned int embeddedRequeueCount = 0, bayerRequeueCount = 0;\n \n \t/* Loop until we find a matching bayer and embedded data buffer. */\n \twhile (!bayerQueue_.empty()) {\n \t\t/* Start with the front of the bayer queue. */\n-\t\tbayerBuffer = bayerQueue_.front();\n+\t\tbayerFrame = bayerQueue_.front();\n \n \t\t/*\n \t\t * Find the embedded data buffer with a matching timestamp to pass to\n \t\t * the IPA. Any embedded buffers with a timestamp lower than the\n \t\t * current bayer buffer will be removed and re-queued to the driver.\n \t\t */\n-\t\tuint64_t ts = bayerBuffer->metadata().timestamp;\n+\t\tuint64_t ts = bayerFrame.buffer->metadata().timestamp;\n \t\tembeddedBuffer = nullptr;\n \t\twhile (!embeddedQueue_.empty()) {\n \t\t\tFrameBuffer *b = embeddedQueue_.front();\n@@ -1748,7 +1742,7 @@ bool RPiCameraData::findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *\n \t\t\t\t * the front of the queue. This buffer is now orphaned, so requeue\n \t\t\t\t * it back to the device.\n \t\t\t\t */\n-\t\t\t\tunicam_[Unicam::Image].queueBuffer(bayerQueue_.front());\n+\t\t\t\tunicam_[Unicam::Image].queueBuffer(bayerQueue_.front().buffer);\n \t\t\t\tbayerQueue_.pop();\n \t\t\t\tbayerRequeueCount++;\n \t\t\t\tLOG(RPI, Warning) << \"Dropping unmatched input frame in stream \"\n@@ -1766,7 +1760,7 @@ bool RPiCameraData::findMatchingBuffers(FrameBuffer *&bayerBuffer, FrameBuffer *\n \n \t\t\t\tLOG(RPI, Warning) << \"Flushing bayer stream!\";\n \t\t\t\twhile (!bayerQueue_.empty()) {\n-\t\t\t\t\tunicam_[Unicam::Image].queueBuffer(bayerQueue_.front());\n+\t\t\t\t\tunicam_[Unicam::Image].queueBuffer(bayerQueue_.front().buffer);\n \t\t\t\t\tbayerQueue_.pop();\n \t\t\t\t}\n \t\t\t\tflushedBuffers = true;\n","prefixes":["libcamera-devel","1/4"]}