{"id":18274,"url":"https://patchwork.libcamera.org/api/patches/18274/?format=json","web_url":"https://patchwork.libcamera.org/patch/18274/","project":{"id":1,"url":"https://patchwork.libcamera.org/api/projects/1/?format=json","name":"libcamera","link_name":"libcamera","list_id":"libcamera_core","list_email":"libcamera-devel@lists.libcamera.org","web_url":"","scm_url":"","webscm_url":""},"msgid":"<Y+ac7Eg8UsbX14j5@duo.ucw.cz>","date":"2023-02-10T19:37:16","name":"[libcamera-devel,RFC] CPU-only auto-exposure, and where to put it","commit_ref":null,"pull_url":null,"state":"rfc","archived":false,"hash":"26924c8f24881d9d011ff51d05f7a189c9628eb1","submitter":{"id":49,"url":"https://patchwork.libcamera.org/api/people/49/?format=json","name":"Pavel Machek","email":"pavel@ucw.cz"},"delegate":null,"mbox":"https://patchwork.libcamera.org/patch/18274/mbox/","series":[{"id":3748,"url":"https://patchwork.libcamera.org/api/series/3748/?format=json","web_url":"https://patchwork.libcamera.org/project/libcamera/list/?series=3748","date":"2023-02-10T19:37:16","name":"[libcamera-devel,RFC] CPU-only auto-exposure, and where to put it","version":1,"mbox":"https://patchwork.libcamera.org/series/3748/mbox/"}],"comments":"https://patchwork.libcamera.org/api/patches/18274/comments/","check":"pending","checks":"https://patchwork.libcamera.org/api/patches/18274/checks/","tags":{},"headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 6232EBDB1C\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 10 Feb 2023 19:37:22 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 8ED7D625F4;\n\tFri, 10 Feb 2023 20:37:21 +0100 (CET)","from jabberwock.ucw.cz 
(jabberwock.ucw.cz [46.255.230.98])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id DD83D625CE\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 10 Feb 2023 20:37:19 +0100 (CET)","by jabberwock.ucw.cz (Postfix, from userid 1017)\n\tid B83FB1C0AB3; Fri, 10 Feb 2023 20:37:18 +0100 (CET)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1676057841;\n\tbh=Yzi+DiKForZTYOGO4VYPK2M+ZpC2d69AfqgOzMzmvx4=;\n\th=Date:To:Subject:List-Id:List-Unsubscribe:List-Archive:List-Post:\n\tList-Help:List-Subscribe:From:Reply-To:From;\n\tb=nyyPhADHnLzP56bOS1r0L5E1x4Sju+y908VfDQK82h9UqAVzr8/QQFbT9dZzuj0WH\n\tsbUBMxt4NvTfDzJLyMDT/JKE5TdKhmSBYPNSJtMxkoefP+svUvR3YNtyzCI2iQwo1t\n\tcxjhfeChhqt7an7zvSo0zTabVlRkluaZs3zz2dHxNuhyKnmB+HCxqP3Lplbt6d0ayy\n\tJNxWvpV+7PsoNqCWt6Nk6i/PTk9Vtb/QupcukrOExCx/G4Mh4p9M1oukWPP0FWC3US\n\thvqs3b6xW0YhpV9K6VQBV38oHayVJVy46di/JWF2l7a+tkmdbcn1FTQwIeFil1erNH\n\t76YJCVJMHy+QA==","v=1; a=rsa-sha256; c=relaxed/relaxed; d=ucw.cz; s=gen1;\n\tt=1676057838;\n\th=from:from:reply-to:subject:subject:date:date:message-id:message-id:\n\tto:to:cc:mime-version:mime-version:content-type:content-type;\n\tbh=BYxLhXQ5vJOW0HJQURd4vtw9aoN0s/mMsujPHC1RcCI=;\n\tb=Y45kRA1h03RbljHz4zHlzdPKqA6tByMrCRDokVrUU+qWUemkrTmyU2CfVlp5mCU4cBWBIs\n\t1oUavT+4puADmDrsNRELwO2YVTS6TcrmsEbYj6IINv5tEghTwQB+UXpBF3O0pZhU7fbDdl\n\tGGHt+IrnGCy6Q2720+MYfcqj5NiZs8c="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ucw.cz header.i=@ucw.cz\n\theader.b=\"Y45kRA1h\"; dkim-atps=neutral","Date":"Fri, 10 Feb 2023 20:37:16 +0100","To":"kieran.bingham@ideasonboard.com, libcamera-devel@lists.libcamera.org","Message-ID":"<Y+ac7Eg8UsbX14j5@duo.ucw.cz>","MIME-Version":"1.0","Content-Type":"multipart/signed; micalg=pgp-sha1;\n\tprotocol=\"application/pgp-signature\"; boundary=\"XHSpOnlcW+lN7osy\"","Content-Disposition":"inline","Subject":"[libcamera-devel] [RFC] CPU-only auto-exposure, and where to put 
it","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Pavel Machek via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Pavel Machek <pavel@ucw.cz>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"},"content":"Hi!\n\nSo I have this, which kind-of works on PinePhone and Librem 5. I\nstarted with autoexposure.\n\nAgcExposureMode and divideUpExposure are from RPi code, I'm not sure\nhow to reuse the code.\n\nI guess I should convert statistics to histograms.\n\nI have placed my hooks in SimpleCameraData::bufferReady. Is that\nreasonable or is there a better place?\n\nWhere should the code go? It is now in\nsrc/libcamera/pipeline/simple/simple.cpp, would something like\nsrc/libcamera/pipeline/simple/ae.cpp be suitable?\n\nDon't look at the code too much, it clearly needs... 
more work.\n\nBest regards,\n\t\t\t\t\t\t\tPavel","diff":"diff --git a/src/libcamera/pipeline/simple/simple.cpp b/src/libcamera/pipeline/simple/simple.cpp\nindex 24ded4db..92d2e8a5 100644\n--- a/src/libcamera/pipeline/simple/simple.cpp\n+++ b/src/libcamera/pipeline/simple/simple.cpp\n@@ -27,6 +28,7 @@\n #include <libcamera/control_ids.h>\n #include <libcamera/request.h>\n #include <libcamera/stream.h>\n+#include <libcamera/formats.h>\n \n #include \"libcamera/internal/camera.h\"\n #include \"libcamera/internal/camera_sensor.h\"\n@@ -36,7 +38,9 @@\n #include \"libcamera/internal/pipeline_handler.h\"\n #include \"libcamera/internal/v4l2_subdevice.h\"\n #include \"libcamera/internal/v4l2_videodevice.h\"\n+#include \"libcamera/internal/mapped_framebuffer.h\"\n \n+using libcamera::utils::Duration;\n \n namespace libcamera {\n \n@@ -213,6 +217,7 @@ public:\n \tint setupFormats(V4L2SubdeviceFormat *format,\n \t\t\t V4L2Subdevice::Whence whence);\n \tvoid bufferReady(FrameBuffer *buffer);\n+\tvoid autoProcessing(Request *request);\n \n \tunsigned int streamIndex(const Stream *stream) const\n \t{\n@@ -724,6 +729,372 @@ int SimpleCameraData::setupFormats(V4L2SubdeviceFormat *format,\n \treturn 0;\n }\n \n+class MappedPixels {\n+public:\n+\tunsigned char *data;\n+\tPixelFormat format;\n+\tSize size;\n+\tint bpp;\n+\tint maxval;\n+  \n+\tMappedPixels(unsigned char *_data, const struct StreamConfiguration &_config) {\n+\t\tdata = _data;\n+\t\tformat = _config.pixelFormat;\n+\t\tsize = _config.size;\n+\n+\t\tswitch (format) {\n+\t\tcase formats::SGRBG8:\n+\t\t\tbpp = 1;\n+\t\t\tmaxval = 255;\n+\t\t\tbreak;\n+\t\tcase formats::SBGGR8:\n+\t\t\tbpp = 1;\n+\t\t\tmaxval = 255;\n+\t\t\tbreak;\n+\t\tcase formats::SGRBG10:\n+\t\t\tbpp = 2;\n+\t\t\tmaxval = 1023;\n+\t\t\tbreak;\n+\t\tdefault:\n+\t\t\tLOG(SimplePipeline, Error) << \"Mapped pixels \" << format << \" is unknown format.\";\n+\t\t}\n+\t}\n+\n+\tint getValue(unsigned int x, unsigned int y) {\n+\t\tunsigned int v;\n+\t\tif (x 
>= size.width)\n+\t\t\tx = size.width - 1;\n+\t\tif (y >= size.height)\n+\t\t\ty = size.height - 1;\n+\t\tint i = (x + size.width * y) * bpp;\n+\t\tv = data[i];\n+\t\tif (bpp > 1)\n+\t\t\tv |= data[i+1] << 8;\n+\t\treturn v;\n+\t}\n+\n+\tint getMaxValue(unsigned int x, unsigned int y) {\n+\t\tint v, v2;\n+\n+\t\tv = getValue(x, y);\n+\t\tv2 = getValue(x+1, y);\n+\t\tif (v2 > v)\n+\t\t\tv = v2;\n+\t\tv2 = getValue(x, y+1);\n+\t\tif (v2 > v)\n+\t\t\tv = v2;\n+\t\tv2 = getValue(x+1, y+1);\n+\t\tif (v2 > v)\n+\t\t\tv = v2;\n+\t\treturn v;\n+\t}\n+\n+\tfloat getMaxValueR(float x, float y) {\n+\t\tfloat v = getMaxValue(x * size.width, y * size.height);\n+\t\treturn v/maxval;\n+\t}\n+\n+\tvoid debugPaint(void) {\n+\t\tchar map[] = \"  .,:;-+=*#\";\n+\t\tfor (float y = 0; y < 1; y += 1/25.) {\n+\t\t\tfor (float x = 0; x < 1; x += 1/80.) {\n+\t\t\t\tfloat v = getMaxValueR(x, y);\n+\t\t\t\tprintf(\"%c\", map[ int (v * (sizeof(map) - 2)) ]);\n+\t\t\t}\n+\t\t\tprintf(\"\\n\");\n+\t\t}\n+\t}\t\n+};\n+\n+LOG_DEFINE_CATEGORY(SimpleAgc)\n+LOG_DEFINE_CATEGORY(RPiAgc)\n+\n+// FIXME: from src/ipa/raspberrypi/controller/rpi/agc.h\n+struct AgcExposureMode {\n+        std::vector<libcamera::utils::Duration> shutter;\n+        std::vector<double> gain;\n+\n+\tAgcExposureMode(void) {\n+\t\tlibcamera::utils::Duration v1(1.0);\n+\t\tlibcamera::utils::Duration v2(1000.0);\n+\t\tlibcamera::utils::Duration v3(1000000.0);\n+\t\tshutter = { v1, v2, v2, v3 };\n+\t\tgain = { 1.0, 1.0, 16.0, 16.0 };\n+\t}\n+};\n+\n+\n+struct AgcStatus {\n+\tlibcamera::utils::Duration totalExposureValue; /* value for all exposure and gain for this image */\n+\tlibcamera::utils::Duration targetExposureValue; /* (unfiltered) target total exposure AGC is aiming for */\n+\tlibcamera::utils::Duration shutterTime;\n+\tdouble analogueGain;\n+\tchar exposureMode[32];\n+\tchar constraintMode[32];\n+\tchar meteringMode[32];\n+\tdouble ev;\n+\tlibcamera::utils::Duration flickerPeriod;\n+\tint 
floatingRegionEnable;\n+\tlibcamera::utils::Duration fixedShutter;\n+\tdouble fixedAnalogueGain;\n+\tdouble digitalGain;\n+\tint locked;\n+};\n+\n+class Agc {\n+public:\n+\tControlList ctrls;\n+\n+\tint exposure_min, exposure_max;\n+\tint again_min, again_max;\n+\tint dgain_min, dgain_max;\n+\n+        AgcStatus status_;\n+  \tAgcExposureMode *exposureMode_;\n+\n+  \tlibcamera::utils::Duration shutter_conv;\n+\n+        struct ExposureValues {\n+                ExposureValues();\n+  \n+                libcamera::utils::Duration shutter;\n+                double analogueGain;\n+                libcamera::utils::Duration totalExposure;\n+                libcamera::utils::Duration totalExposureNoDG; /* without digital gain */\n+        };\n+\n+\tstruct ExposureValues current_, filtered_;\n+        int have_ad_gain;\n+\tunsigned long cid_gain;\n+\n+\tAgc(std::unique_ptr<CameraSensor> & sensor_) {\n+\t  /*\n+\t    sudo yavta -w '0x009a0901 1' /dev/v4l-subdev0 # gc2145\n+\t    sudo yavta -w '0x009a0901 1' /dev/v4l-subdev1 # ae, ov\n+\t    sudo yavta -w '0x00980912 0' /dev/v4l-subdev1 # ag, ov\n+\t    sudo yavta -l /dev/v4l-subdev1\n+\t  */\n+\t\thave_ad_gain = 0;\n+\t\tif (have_ad_gain) {\n+\t\t\tctrls = sensor_->getControls({ V4L2_CID_EXPOSURE, V4L2_CID_ANALOGUE_GAIN, V4L2_CID_DIGITAL_GAIN });\n+\t\t\tcid_gain = V4L2_CID_ANALOGUE_GAIN;\n+\t\t} else {\n+\t\t\tctrls = sensor_->getControls({ V4L2_CID_EXPOSURE, V4L2_CID_GAIN });\n+\t\t\tcid_gain = V4L2_CID_GAIN;\n+\t\t}\n+\n+\t\tconst ControlInfoMap &infoMap = *ctrls.infoMap();\n+\n+\t\tconst ControlInfo &exposure_info = infoMap.find(V4L2_CID_EXPOSURE)->second;\n+\t\tconst ControlInfo &gain_info = infoMap.find(cid_gain)->second;\n+\t\tconst ControlInfo &dgain_info = infoMap.find(V4L2_CID_DIGITAL_GAIN)->second;\n+\n+\t        memset(&status_, 0, sizeof(status_));\n+        \tstatus_.ev = 1.0;\n+\n+\t\texposureMode_ = new AgcExposureMode();\n+\t\tlibcamera::utils::Duration msec(1);\n+\t\tshutter_conv = 
msec;\n+\n+\t\texposure_min = exposure_info.min().get<int>();\n+\t\tif (!exposure_min) {\n+\t\t\tLOG(SimplePipeline, Error) << \"Minimum exposure is zero, that can't be linear\";\n+\t\t\texposure_min = 1;\n+\t\t}\n+\t\texposure_max = exposure_info.max().get<int>();\n+\t\tagain_min = gain_info.min().get<int>();\n+\t\tif (!again_min) {\n+\t\t\tLOG(SimplePipeline, Error) << \"Minimum gain is zero, that can't be linear\";\n+\t\t\tagain_min = 100;\n+\t\t}\n+\t\t\n+\t\tagain_max = gain_info.max().get<int>();\n+\t\tif (have_ad_gain) {\n+\t\t\tdgain_min = dgain_info.min().get<int>();\n+\t\t\tdgain_max = dgain_info.max().get<int>();\n+\t\t} else {\n+\t\t\tdgain_min = 1;\n+\t\t\tdgain_max = 1;\n+\t\t}\n+\n+\t\tprintf(\"Exposure %d %d, gain %d %d, dgain %d %d\\n\", \n+\t\t\t\texposure_min, exposure_max,\n+\t\t\t\tagain_min, again_max,\n+\t\t\t\tdgain_min, dgain_max);\n+\t}\n+\n+\tvoid get_exposure() {\n+\t\tint exposure = ctrls.get(V4L2_CID_EXPOSURE).get<int>();\n+\t\tint gain = ctrls.get(cid_gain).get<int>();\n+\t\tint dgain;\n+\t\tif (have_ad_gain)\n+\t\t\tdgain = ctrls.get(V4L2_CID_DIGITAL_GAIN).get<int>();\n+\t\telse\n+\t\t\tdgain = 1;\n+\n+\t\tprintf(\"Old exp %d, gain %d, dgain %d\\n\", exposure, gain, dgain);\n+\n+\t\tcurrent_.shutter = (double) exposure * shutter_conv;\n+\t\tcurrent_.analogueGain = (double) gain / again_min;\n+\t}\n+\n+\tvoid set_exposure(std::unique_ptr<CameraSensor> & sensor_) {\n+\t\tint exposure = (int)(filtered_.shutter / shutter_conv);\n+\t\tint gain = (int)(filtered_.analogueGain * again_min);\n+\t\tprintf(\" new exp %d, gain %d, dgain %d \", exposure, gain, 0);\n+\t\tctrls.set(V4L2_CID_EXPOSURE, exposure);\n+\t\tctrls.set(cid_gain, (int)(filtered_.analogueGain * again_min));\n+\t\tif (have_ad_gain)\n+\t\t\tctrls.set(V4L2_CID_DIGITAL_GAIN, 768);\n+\t\tsensor_->setControls(&ctrls);\n+\t}\n+\n+\tvoid process(std::unique_ptr<CameraSensor> & sensor_, Request *request) {\n+        for (auto [stream, buffer] : request->buffers()) 
{\n+\t\tMappedFrameBuffer mappedBuffer(buffer, MappedFrameBuffer::MapFlag::Read);\n+\t\tconst std::vector<Span<uint8_t>> &planes = mappedBuffer.planes();\n+\t\tunsigned char *img = planes[0].data();\n+\t\tconst struct StreamConfiguration &config = stream->configuration();\n+\t\tMappedPixels pixels(img, config);\n+\n+ \t\t//LOG(SimplePipeline, Error) << config.pixelFormat << \" \" << config.size;\n+\n+\t\tpixels.debugPaint();\n+\n+\t\tint bright = 0, too_bright = 0, total = 0;\n+\n+\t\tfor (float y = 0; y < 1; y += 1/30.) {\n+\t\t\tfor (float x = 0; x < 1; x += 1/40.) {\n+\t\t\t\tfloat v = pixels.getMaxValueR(x, y);\n+\n+\t\t\t\ttotal++;\n+\t\t\t\tif (v > 240./255)\n+\t\t\t\t\ttoo_bright++;\n+\t\t\t\tif (v > 200./255)\n+\t\t\t\t\tbright++;\n+\t\t\t}\n+\t\t}\n+\n+\t\tget_exposure();\n+\t\tLOG(RPiAgc, Error) << \"Current values are \" << current_.shutter << \" and \" << current_.analogueGain;\n+\t\tfiltered_ = current_;\n+\t\tfiltered_.totalExposureNoDG = filtered_.analogueGain * filtered_.shutter;\n+\t\tif ((bright / (float) total) < 0.01) {\n+\t\t\tfiltered_.totalExposureNoDG  *= 1.1;\n+\t\t\tprintf(\"ADJ+\");\n+\t\t}\n+\t\tif ((too_bright / (float) total) > 0.08) {\n+\t\t\tfiltered_.totalExposureNoDG  *= 0.9;\n+\t\t\tprintf(\"ADJ-\");\n+\t\t}\n+\n+\t\tdivideUpExposure();\n+\t\tset_exposure(sensor_);\n+\t\t//LOG(SimpleAgc, Error) << \"Hello world\";\n+\t}\n+#if 0\n+\tconst ControlInfoMap &infoMap = controls();\n+\n+\tif (infoMap.find(V4L2_CID_BRIGHTNESS) != infoMap.end()) {\n+\t\t//const ControlInfo &brightness = infoMap.find(V4L2_CID_BRIGHTNESS)->second;\n+\t}\n+#endif\n+\t}\n+\n+\tvoid divideUpExposure();\n+\tDuration clipShutter(Duration shutter);\n+};\n+\n+/* from ...agc.cpp */\n+Agc::ExposureValues::ExposureValues()\n+\t: shutter(0), analogueGain(0),\n+\t  totalExposure(0), totalExposureNoDG(0)\n+{\n+}\n+\n+Duration Agc::clipShutter(Duration shutter)\n+{\n+  //if (maxShutter_)\n+  //\t\tshutter = std::min(shutter, maxShutter_);\n+\treturn shutter;\n+}\n+\n+void 
Agc::divideUpExposure()\n+{\n+\t/*\n+\t * Sending the fixed shutter/gain cases through the same code may seem\n+\t * unnecessary, but it will make more sense when we extend this to cover\n+\t * variable aperture.\n+\t */\n+\tDuration exposureValue = filtered_.totalExposureNoDG;\n+\tDuration shutterTime;\n+\tdouble analogueGain;\n+\tshutterTime = status_.fixedShutter ? status_.fixedShutter\n+\t\t\t\t\t   : exposureMode_->shutter[0];\n+\tshutterTime = clipShutter(shutterTime);\n+\tanalogueGain = status_.fixedAnalogueGain != 0.0 ? status_.fixedAnalogueGain\n+\t\t\t\t\t\t\t: exposureMode_->gain[0];\n+\tif (shutterTime * analogueGain < exposureValue) {\n+\t\tfor (unsigned int stage = 1;\n+\t\t     stage < exposureMode_->gain.size(); stage++) {\n+\t\t\tprintf(\"Stage %d\\n\", stage);\n+\t\t\tLOG(RPiAgc, Error) << \"Stage \" << stage << \" s/g is \" << shutterTime << \" and \"\n+\t\t\t   << analogueGain;\n+\n+\t\t\tif (!status_.fixedShutter) {\n+\t\t\t\tDuration stageShutter =\n+\t\t\t\t\tclipShutter(exposureMode_->shutter[stage]);\n+\t\t\t\tif (stageShutter * analogueGain >= exposureValue) {\n+\t\t\t\t\tshutterTime = exposureValue / analogueGain;\n+\t\t\t\t\tbreak;\n+\t\t\t\t}\n+\t\t\t\tshutterTime = stageShutter;\n+\t\t\t}\n+\t\t\tif (status_.fixedAnalogueGain == 0.0) {\n+\t\t\t\tif (exposureMode_->gain[stage] * shutterTime >= exposureValue) {\n+\t\t\t\t\tanalogueGain = exposureValue / shutterTime;\n+\t\t\t\t\tbreak;\n+\t\t\t\t}\n+\t\t\t\tanalogueGain = exposureMode_->gain[stage];\n+\t\t\t}\n+\t\t}\n+\t}\n+\tLOG(RPiAgc, Error) << \"Divided up shutter and gain are \" << shutterTime << \" and \"\n+\t\t\t   << analogueGain;\n+\t/*\n+\t * Finally adjust shutter time for flicker avoidance (require both\n+\t * shutter and gain not to be fixed).\n+\t */\n+\tif (!status_.fixedShutter && !status_.fixedAnalogueGain &&\n+\t    status_.flickerPeriod) {\n+\t\tint flickerPeriods = shutterTime / status_.flickerPeriod;\n+\t\tif (flickerPeriods) {\n+\t\t\tDuration newShutterTime = 
flickerPeriods * status_.flickerPeriod;\n+\t\t\tanalogueGain *= shutterTime / newShutterTime;\n+\t\t\t/*\n+\t\t\t * We should still not allow the ag to go over the\n+\t\t\t * largest value in the exposure mode. Note that this\n+\t\t\t * may force more of the total exposure into the digital\n+\t\t\t * gain as a side-effect.\n+\t\t\t */\n+\t\t\tanalogueGain = std::min(analogueGain, exposureMode_->gain.back());\n+\t\t\tshutterTime = newShutterTime;\n+\t\t}\n+\t\tLOG(RPiAgc, Error) << \"After flicker avoidance, shutter \"\n+\t\t\t\t   << shutterTime << \" gain \" << analogueGain;\n+\t}\n+\tfiltered_.shutter = shutterTime;\n+\tfiltered_.analogueGain = analogueGain;\n+}\n+\n+\n+void SimpleCameraData::autoProcessing(Request *request)\n+{\n+\tAgc agc = Agc(sensor_);\n+\n+\tagc.process(sensor_, request);\n+}\n+\n void SimpleCameraData::bufferReady(FrameBuffer *buffer)\n {\n \tSimplePipelineHandler *pipe = SimpleCameraData::pipe();\n@@ -823,8 +1197,10 @@ void SimpleCameraData::converterOutputDone(FrameBuffer *buffer)\n \n \t/* Complete the buffer and the request. */\n \tRequest *request = buffer->request();\n-\tif (pipe->completeBuffer(request, buffer))\n+\tif (pipe->completeBuffer(request, buffer)) {\n+\t\tautoProcessing(request);\n \t\tpipe->completeRequest(request);\n+\t}\n }\n \n /* Retrieve all source pads connected to a sink pad through active routes. */\n","prefixes":["libcamera-devel","RFC"]}