[{"id":36443,"web_url":"https://patchwork.libcamera.org/comment/36443/","msgid":"<e5a7a328-b188-4608-b1e4-9bacb4a81ca4@ideasonboard.com>","date":"2025-10-24T14:56:10","subject":"Re: [PATCH 0/4] Raspberry Pi AWB using neural networks","submitter":{"id":216,"url":"https://patchwork.libcamera.org/api/people/216/","name":"Barnabás Pőcze","email":"barnabas.pocze@ideasonboard.com"},"content":"Hi\n\n\nOn 2025. 10. 24. 16:15, David Plowman wrote:\n> Hi everyone\n> \n> Here's our first go at doing AWB using neural networks. For both the\n> PiSP and VC4 we take the image statistics and feed them to a neural\n> network which tells us what it thinks the colour temperature should\n> be.\n> \n> The first commit splits our existing Awb class into an Awb base class,\n> and an AwbBayes derived class, so that the subsequent AwbNN class (2nd\n> commit) can share code.\n> \n> Finally we update the tuning files for the new AWB and supply a TFLite\n> model for each platform, though by default it remains off for\n> now. Just switch the \"disable\" tag between the two AWB configurations\n> to enable the new algorithm.\n> \n> We'll shortly be making public our image capture/annotation/training\n> code which will include the full source for the model definitions. It\n> will allow users to train their own, either from scratch, or\n> incrementally using our datasets and models as a starting point.\n> \n> This is the work of our intern who was with us over the summer.\n> \n> Thanks!\n> David\n> \n\nThis definitely sounds interesting! Maybe I missed them, but are\nthere any numbers about performance, etc? 
Or images showcasing\nthe differences on the same scenes?\n\n\n> Peter Bailey (4):\n>    ipa: rpi: controller: awb: Separate Bayesian Awb into AwbBayes\n>    ipa: rpi: controller: awb: Add Neural Network Awb\n>    ipa: rpi: pisp: vc4: Update tuning files for new awb and add model\n>    ipa: rpi: controller: Ignore algorithms starting with disable\n> \n>   src/ipa/rpi/controller/controller.cpp        |   6 +\n>   src/ipa/rpi/controller/meson.build           |  10 +\n>   src/ipa/rpi/controller/rpi/awb.cpp           | 409 ++---------------\n>   src/ipa/rpi/controller/rpi/awb.h             |  99 ++---\n>   src/ipa/rpi/controller/rpi/awb_bayes.cpp     | 444 +++++++++++++++++++\n>   src/ipa/rpi/controller/rpi/awb_nn.cpp        | 442 ++++++++++++++++++\n>   src/ipa/rpi/pisp/data/awb_model.tflite       | Bin 0 -> 47624 bytes\n\nI'm sure the question of where to store these binary blobs will generate some discussion.\n\n\nRegards,\nBarnabás Pőcze\n\n>   src/ipa/rpi/pisp/data/imx219.json            |  65 ++-\n>   src/ipa/rpi/pisp/data/imx296.json            |  64 ++-\n>   src/ipa/rpi/pisp/data/imx296_16mm.json       |  64 ++-\n>   src/ipa/rpi/pisp/data/imx296_6mm.json        |  64 ++-\n>   src/ipa/rpi/pisp/data/imx477.json            |  63 +++\n>   src/ipa/rpi/pisp/data/imx477_16mm.json       |  65 ++-\n>   src/ipa/rpi/pisp/data/imx477_6mm.json        |  65 ++-\n>   src/ipa/rpi/pisp/data/imx477_scientific.json |  79 +++-\n>   src/ipa/rpi/pisp/data/imx500.json            |  67 +++\n>   src/ipa/rpi/pisp/data/imx708.json            |  64 ++-\n>   src/ipa/rpi/pisp/data/imx708_wide.json       |  62 +++\n>   src/ipa/rpi/pisp/data/meson.build            |   7 +\n>   src/ipa/rpi/pisp/data/ov5647.json            |  63 +++\n>   src/ipa/rpi/vc4/data/awb_model.tflite        | Bin 0 -> 42976 bytes\n>   src/ipa/rpi/vc4/data/imx219.json             |  64 +++\n>   src/ipa/rpi/vc4/data/imx296.json             |  64 +++\n>   src/ipa/rpi/vc4/data/imx477.json             |  69 +++\n>   
src/ipa/rpi/vc4/data/imx500.json             |  67 +++\n>   src/ipa/rpi/vc4/data/imx708.json             |  72 +++\n>   src/ipa/rpi/vc4/data/imx708_wide.json        |  62 +++\n>   src/ipa/rpi/vc4/data/meson.build             |   8 +\n>   src/ipa/rpi/vc4/data/ov5647.json             |  64 +++\n>   29 files changed, 2244 insertions(+), 428 deletions(-)\n>   create mode 100644 src/ipa/rpi/controller/rpi/awb_bayes.cpp\n>   create mode 100644 src/ipa/rpi/controller/rpi/awb_nn.cpp\n>   create mode 100644 src/ipa/rpi/pisp/data/awb_model.tflite\n>   create mode 100644 src/ipa/rpi/vc4/data/awb_model.tflite\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 710B2C3259\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 24 Oct 2025 14:56:15 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id C8F216097E;\n\tFri, 24 Oct 2025 16:56:14 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id E6F6460976\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Oct 2025 16:56:13 +0200 (CEST)","from [192.168.33.13] (185.221.141.231.nat.pool.zt.hu\n\t[185.221.141.231])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id C23FBEFF;\n\tFri, 24 Oct 2025 16:54:27 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"Z8yQBAp3\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; 
t=1761317668;\n\tbh=7Cifv9j2g5dInpSLAGdem4FTMi6aRxtRuGE41zFuIlY=;\n\th=Date:Subject:To:References:From:In-Reply-To:From;\n\tb=Z8yQBAp3OYwAQLjXlV/aWXz4xyVoQ0wgVMQDY/yW5E9xEdLi6p4zxi08NkzCg6tjW\n\tRDVwpqHKBiOzMZrr4SIy948Nsg1V1jTfR4FyQX28wDyM+lBnhI6oV9iFymJW4MCFA3\n\tzn6BhVVBiL/MZQ3MKFOrGLr7rC24fdTN6wThKbCs=","Message-ID":"<e5a7a328-b188-4608-b1e4-9bacb4a81ca4@ideasonboard.com>","Date":"Fri, 24 Oct 2025 16:56:10 +0200","MIME-Version":"1.0","User-Agent":"Mozilla Thunderbird","Subject":"Re: [PATCH 0/4] Raspberry Pi AWB using neural networks","To":"David Plowman <david.plowman@raspberrypi.com>,\n\tlibcamera-devel@lists.libcamera.org","References":"<20251024144049.3311-1-david.plowman@raspberrypi.com>","From":"=?utf-8?q?Barnab=C3=A1s_P=C5=91cze?= <barnabas.pocze@ideasonboard.com>","Content-Language":"en-US, hu-HU","In-Reply-To":"<20251024144049.3311-1-david.plowman@raspberrypi.com>","Content-Type":"text/plain; charset=UTF-8; format=flowed","Content-Transfer-Encoding":"8bit","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":36449,"web_url":"https://patchwork.libcamera.org/comment/36449/","msgid":"<CAHW6GYKpgXWp0eSe1J-CSNxY43q5dDLAkEUvpPJWR22p9ArzRQ@mail.gmail.com>","date":"2025-10-24T16:41:15","subject":"Re: [PATCH 0/4] Raspberry 
Pi AWB using neural networks","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"Hi Barnabas\n\nOn Fri, 24 Oct 2025 at 15:56, Barnabás Pőcze <\nbarnabas.pocze@ideasonboard.com> wrote:\n>\n> Hi\n>\n>\n> 2025. 10. 24. 16:15 keltezéssel, David Plowman írta:\n> > Hi everyone\n> >\n> > Here's our first go at doing AWB using neural networks. For both the\n> > PiSP and VC4 we take the image statistics and feed them to a neural\n> > network which tells us what it thinks the colour temperature should\n> > be.\n> >\n> > The first commit splits our existing Awb class into an Awb base class,\n> > and an AwbBayes derived class, so that the subsequent AwbNN class (2nd\n> > commit) can share code.\n> >\n> > Finally we update the tuning files for the new AWB and supply a TFLite\n> > model for each platform, though by default it remains off for\n> > now. Just switch the \"disable\" tag between the two AWB configurations\n> > to enable the new algorithm.\n> >\n> > We'll shortly be making public our image capture/annotation/training\n> > code which will include the full source for the model definitions. It\n> > will allow users to train their own, either from scratch, or\n> > incrementally using our datasets and models as a starting point.\n> >\n> > This is the work of our intern who was with us over the summer.\n> >\n> > Thanks!\n> > David\n> >\n>\n> This definitely sounds interesting! Maybe I missed them, but are\n> there are any numbers about performance, etc? Or images showcasing\n> the differences on the same scenes?\n\nThat's definitely a good question, and one which would require a rather\nlong answer to explain fully. 
I could probably get some numbers easily\nenough comparing the new algorithm against the old one for the data it's\nbeen trained on, but unsurprisingly that will show it to be much better.\n\nReally we need to be comparing on a fresh set of real images which is\ntrickier. We have problems related to: how to run both algorithms on the\nsame raw buffer or DNG, not having versions of AWB code that can run on a\nDNG, or at least not in quite the same way as it does on the hardware,\nhaving insufficient training data, risks of over-fitting, real-world images\nwith mixed lighting where the white balance is very subjective, the list\ngoes on.\n\nBut I do have a plan to increase the number of training images, and maybe\npart of that work could be capturing comparisons. I think there might be a\nway to get true direct comparisons, so I'll have a think about that.\n\nIf folks would like to see a comparison, here's an image taken just now\nthrough my window. The sun's going down outside so it's pretty bright, and\nindoors I have some low CT (very orange) strip LEDs.\n\nOld algorithm:\nhttps://drive.google.com/file/d/1f-n9RA25IEgf4si11Ruf5-W77tonRWg4/view?usp=sharing\n\nIt doesn't really know what to do, so the Bayes algorithm has neutralised\neverything (whilst sticking to the CT curve). Outside is a decidedly sickly\nblue, and indoors is a sickly off-white.\n\nNew algorithm:\nhttps://drive.google.com/file/d/147wzS6exZVTHNpFc8VrqKH6zeGYi7n1O/view?usp=sharing\n\nYou can see the neural network has balanced for outside, and it looks\npretty spot on (but being a neural network, you couldn't necessarily\npredict that's what it would do). Indoors has gone orange, though in truth\nit does look nearly that orange. So things are always a bit nuanced, this\nis definitely way closer, but perhaps there was something aesthetically\nacceptable just a bit more in between? Hard to say. 
(This image wasn't in\nthe training set, nor, I think, were there any I would obviously describe\nas very \"similar\".)\n\n>\n>\n> > Peter Bailey (4):\n> >    ipa: rpi: controller: awb: Separate Bayesian Awb into AwbBayes\n> >    ipa: rpi: controller: awb: Add Neural Network Awb\n> >    ipa: rpi: pisp: vc4: Update tuning files for new awb and add model\n> >    ipa: rpi: controller: Ignore algorithms starting with disable\n> >\n> >   src/ipa/rpi/controller/controller.cpp        |   6 +\n> >   src/ipa/rpi/controller/meson.build           |  10 +\n> >   src/ipa/rpi/controller/rpi/awb.cpp           | 409 ++---------------\n> >   src/ipa/rpi/controller/rpi/awb.h             |  99 ++---\n> >   src/ipa/rpi/controller/rpi/awb_bayes.cpp     | 444 +++++++++++++++++++\n> >   src/ipa/rpi/controller/rpi/awb_nn.cpp        | 442 ++++++++++++++++++\n> >   src/ipa/rpi/pisp/data/awb_model.tflite       | Bin 0 -> 47624 bytes\n>\n> I'm sure the question of where to store these binary blobs will generate\nsome discussion.\n\nThe sooner the better, I'm sure!\n\nThanks\nDavid\n\n>\n>\n> Regards,\n> Barnabás Pőcze\n>\n> >   src/ipa/rpi/pisp/data/imx219.json            |  65 ++-\n> >   src/ipa/rpi/pisp/data/imx296.json            |  64 ++-\n> >   src/ipa/rpi/pisp/data/imx296_16mm.json       |  64 ++-\n> >   src/ipa/rpi/pisp/data/imx296_6mm.json        |  64 ++-\n> >   src/ipa/rpi/pisp/data/imx477.json            |  63 +++\n> >   src/ipa/rpi/pisp/data/imx477_16mm.json       |  65 ++-\n> >   src/ipa/rpi/pisp/data/imx477_6mm.json        |  65 ++-\n> >   src/ipa/rpi/pisp/data/imx477_scientific.json |  79 +++-\n> >   src/ipa/rpi/pisp/data/imx500.json            |  67 +++\n> >   src/ipa/rpi/pisp/data/imx708.json            |  64 ++-\n> >   src/ipa/rpi/pisp/data/imx708_wide.json       |  62 +++\n> >   src/ipa/rpi/pisp/data/meson.build            |   7 +\n> >   src/ipa/rpi/pisp/data/ov5647.json            |  63 +++\n> >   src/ipa/rpi/vc4/data/awb_model.tflite        | Bin 0 -> 42976 bytes\n> >   
src/ipa/rpi/vc4/data/imx219.json             |  64 +++\n> >   src/ipa/rpi/vc4/data/imx296.json             |  64 +++\n> >   src/ipa/rpi/vc4/data/imx477.json             |  69 +++\n> >   src/ipa/rpi/vc4/data/imx500.json             |  67 +++\n> >   src/ipa/rpi/vc4/data/imx708.json             |  72 +++\n> >   src/ipa/rpi/vc4/data/imx708_wide.json        |  62 +++\n> >   src/ipa/rpi/vc4/data/meson.build             |   8 +\n> >   src/ipa/rpi/vc4/data/ov5647.json             |  64 +++\n> >   29 files changed, 2244 insertions(+), 428 deletions(-)\n> >   create mode 100644 src/ipa/rpi/controller/rpi/awb_bayes.cpp\n> >   create mode 100644 src/ipa/rpi/controller/rpi/awb_nn.cpp\n> >   create mode 100644 src/ipa/rpi/pisp/data/awb_model.tflite\n> >   create mode 100644 src/ipa/rpi/vc4/data/awb_model.tflite\n> >\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id B0734C3259\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 24 Oct 2025 16:41:30 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id F1F4F609C7;\n\tFri, 24 Oct 2025 18:41:29 +0200 (CEST)","from mail-qk1-x734.google.com (mail-qk1-x734.google.com\n\t[IPv6:2607:f8b0:4864:20::734])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id BD8CD608DC\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Oct 2025 18:41:28 +0200 (CEST)","by mail-qk1-x734.google.com with SMTP id\n\taf79cd13be357-88f2b29b651so231387785a.0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Oct 2025 09:41:28 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"hexgUBQm\"; 
dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google; t=1761324088; x=1761928888;\n\tdarn=lists.libcamera.org; \n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:from:to:cc:subject:date:message-id:reply-to;\n\tbh=SNsJysWTnR1JKltgGta608JvSeayNmVH1btxuhR8wwU=;\n\tb=hexgUBQmywuCy1B7tsE49omGCL3Hwh+JtkzbCaswPGe5iivKDWM+jl0crvba1yqo+K\n\t7Cz58z86axsn54LUmPK9jFOF6KgoaCeENa2n4JiMubahFMihWiBniaUcLfT2FN6BbR1O\n\t5iWIMAxkhUpLpk9OvddzNspQucA/OCIwxpN7ThXC+mXGEihZhxyNi0wS0poWqbeuNAiZ\n\tx5I/KAqz79TSV1ZFh/uwPQ5HYCAgWYgLIdfjw0NF2ejm+GLx9DAziPZPQ8yy9oyKJzKK\n\tNMStsw1Lyivq9B7CbcWowM791Yuplcpkds2hRi1jq4eVdwic0Sud72kq1Ss+5GFRQuqK\n\tzGDg==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20230601; t=1761324088; x=1761928888;\n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:x-gm-message-state:from:to:cc:subject:date:message-id\n\t:reply-to;\n\tbh=SNsJysWTnR1JKltgGta608JvSeayNmVH1btxuhR8wwU=;\n\tb=BgnsqiygFnagOVgyEJ2WrM8Grgspj6eDL1c6BFlUaPMbmvLIPxjEze6CtoOsfD/9bS\n\tdzAiRqzsLUbhhktGp54dLUo5KLUcybsAEzRel+vddwof68CMhyb4+zTmM1v7Hxpf0V2a\n\tobC2F1ouFxrkYH5T15u0tbmtks6X46LvtH1vBa4s3EeR9CT691EID2YZbVxo2Ejgh4II\n\ttihKLUbSFlQY1Ocrdgz6ANKFQwZAc3vRimkExT41Q1rOWP+L6/1XllzoI+APz7O5+o8F\n\tjhYQEksTsFf4sxLCZYSIr/AOF1Uh2GMfFcdlP+pLcEBi0tDVUMRcLGPbqYoT0bmWr79t\n\tzLWw==","X-Gm-Message-State":"AOJu0YyrfbBP0QFVld7CHsxk38qnobQJdMwx9HDUhXPAzuBnBa89fKL7\n\tAVIWjZURdlbUxkWNd5ulLfbq2bCgBpAkjY0H+OBhSLPjA3xMOFSyHAq3itp09lOWvv6Cwe2w5uu\n\tg6mTfYtpOlJMS72CC+y8GhGX3LytFJsl390OVQnLXDg==","X-Gm-Gg":"ASbGncv3epLliV4u2yM8FxfD3GYC5wZ6XVqf+du5q9Xa+TjcPtMjMrvwUzvoep++CtA\n\tjDGH8zRDhEiyM6Us85UZMUEYa5OX37tVsNIi7PJ/q+1DSRqC9w2z6kFJJsPoD69n02Se4l8/efD\n\tIpJFy08UjdferDE4DpKwyIc26VGR8vT59y5LxPMOzTa0RFlb+3W4pLrfumfp2RSrjQJTRGYNq3e\n\tc0WaIinhE9LTcHjiQuAPMwOBgTanALiD8aDSxzbRjMTu4QarkJ15RKymuMy5ctSyS/QrOXgAFbA\n\te/b4Ql7iU4CmNKEelI167p0Cb6nbBQxgm8ej3aaMUCPqpqfGJKfmauCuQxadBvymfNEr
","X-Google-Smtp-Source":"AGHT+IFI7KGl3cDw8V81Cd78JLKAZv31CPoj037EuUYGFRwQ2/zcJi5uyQdJmNCTJfzi+8SlJ/CeFFNRihQkg4GBIbM=","X-Received":"by 2002:a05:620a:4145:b0:891:89a7:d7cb with SMTP id\n\taf79cd13be357-89c105a8833mr931864485a.38.1761324087399;\n\tFri, 24 Oct 2025 09:41:27 -0700 (PDT)","MIME-Version":"1.0","References":"<20251024144049.3311-1-david.plowman@raspberrypi.com>\n\t<e5a7a328-b188-4608-b1e4-9bacb4a81ca4@ideasonboard.com>","In-Reply-To":"<e5a7a328-b188-4608-b1e4-9bacb4a81ca4@ideasonboard.com>","From":"David Plowman <david.plowman@raspberrypi.com>","Date":"Fri, 24 Oct 2025 17:41:15 +0100","X-Gm-Features":"AS18NWDfVuTGGAq0UJ_KC91WEPyMoGF6SmrqNczzJwOQyYtHA5UeKYAxfC9CJjQ","Message-ID":"<CAHW6GYKpgXWp0eSe1J-CSNxY43q5dDLAkEUvpPJWR22p9ArzRQ@mail.gmail.com>","Subject":"Re: [PATCH 0/4] Raspberry Pi AWB using neural networks","To":"=?utf-8?q?Barnab=C3=A1s_P=C5=91cze?= <barnabas.pocze@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org","Content-Type":"multipart/alternative; boundary=\"000000000000906e6a0641ea38ce\"","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" 
<libcamera-devel-bounces@lists.libcamera.org>"}},{"id":36452,"web_url":"https://patchwork.libcamera.org/comment/36452/","msgid":"<CAEmqJPqX2DJBYSnZehk6a5qezPmpwhZn2sgBjL4WFnx7cYW9zA@mail.gmail.com>","date":"2025-10-24T18:17:42","subject":"Re: [PATCH 0/4] Raspberry Pi AWB using neural networks","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/people/34/","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"content":"On Fri, 24 Oct 2025, 5:41 pm David Plowman, <david.plowman@raspberrypi.com>\nwrote:\n\n> Hi Barnabas\n>\n> On Fri, 24 Oct 2025 at 15:56, Barnabás Pőcze <\n> barnabas.pocze@ideasonboard.com> wrote:\n> >\n> > Hi\n> >\n> >\n> > 2025. 10. 24. 16:15 keltezéssel, David Plowman írta:\n> > > Hi everyone\n> > >\n> > > Here's our first go at doing AWB using neural networks. For both the\n> > > PiSP and VC4 we take the image statistics and feed them to a neural\n> > > network which tells us what it thinks the colour temperature should\n> > > be.\n> > >\n> > > The first commit splits our existing Awb class into an Awb base class,\n> > > and an AwbBayes derived class, so that the subsequent AwbNN class (2nd\n> > > commit) can share code.\n> > >\n> > > Finally we update the tuning files for the new AWB and supply a TFLite\n> > > model for each platform, though by default it remains off for\n> > > now. Just switch the \"disable\" tag between the two AWB configurations\n> > > to enable the new algorithm.\n> > >\n> > > We'll shortly be making public our image capture/annotation/training\n> > > code which will include the full source for the model definitions. It\n> > > will allow users to train their own, either from scratch, or\n> > > incrementally using our datasets and models as a starting point.\n> > >\n> > > This is the work of our intern who was with us over the summer.\n> > >\n> > > Thanks!\n> > > David\n> > >\n> >\n> > This definitely sounds interesting! Maybe I missed them, but are\n> > there are any numbers about performance, etc? 
Or images showcasing\n> > the differences on the same scenes?\n>\n> That's definitely a good question, and one which would require a rather\n> long answer to explain fully. I could probably get some numbers easily\n> enough comparing the new algorithm against the old one for the data it's\n> been trained on, but unsurprisingly that will show it to be much better.\n>\n> Really we need to be comparing on a fresh set of real images which is\n> trickier. We have problems related to: how to run both algorithms on the\n> same raw buffer or DNG, not having versions of AWB code that can run on a\n> DNG, or at least not in quite the same way as it does on the hardware,\n> having insufficient training data, risks of over-fitting, real-world images\n> with mixed lighting where the white balance is very subjective, the list\n> goes on.\n>\n> But I do have a plan to increase the number of training images, and maybe\n> part of that work could be capturing comparisons. I think there might be a\n> way to get true direct comparisons, so I'll have a think about that.\n>\n> If folks would like to see a comparison, here's an image taken just now\n> through my window. The sun's going down outside so it's pretty bright, and\n> indoors I have some low CT (very orange) strip LEDs.\n>\n> Old algorithm:\n>\n> https://drive.google.com/file/d/1f-n9RA25IEgf4si11Ruf5-W77tonRWg4/view?usp=sharing\n>\n> It doesn't really know what to do, so the Bayes algorithm has neutralised\n> everything (whilst sticking to the CT curve). Outside is a decidedly sickly\n> blue, and indoors is a sickly off-white.\n>\n> New algorithm:\n>\n> https://drive.google.com/file/d/147wzS6exZVTHNpFc8VrqKH6zeGYi7n1O/view?usp=sharing\n>\n> You can see the neural network has balanced for outside, and it looks\n> pretty spot on (but being a neural network, you couldn't necessarily\n> predict that's what it would do). Indoors has gone orange, though in truth\n> it does look nearly that orange. 
So things are always a bit nuanced, this\n> is definitely way closer, but perhaps there was something aesthetically\n> acceptable just a bit more in between? Hard to say. (This image wasn't in\n> the training set, nor, I think, were there any I would obviously describe\n> as very \"similar\".)\n>\n> >\n> >\n> > > Peter Bailey (4):\n> > >    ipa: rpi: controller: awb: Separate Bayesian Awb into AwbBayes\n> > >    ipa: rpi: controller: awb: Add Neural Network Awb\n> > >    ipa: rpi: pisp: vc4: Update tuning files for new awb and add model\n> > >    ipa: rpi: controller: Ignore algorithms starting with disable\n> > >\n> > >   src/ipa/rpi/controller/controller.cpp        |   6 +\n> > >   src/ipa/rpi/controller/meson.build           |  10 +\n> > >   src/ipa/rpi/controller/rpi/awb.cpp           | 409 ++---------------\n> > >   src/ipa/rpi/controller/rpi/awb.h             |  99 ++---\n> > >   src/ipa/rpi/controller/rpi/awb_bayes.cpp     | 444\n> +++++++++++++++++++\n> > >   src/ipa/rpi/controller/rpi/awb_nn.cpp        | 442 ++++++++++++++++++\n> > >   src/ipa/rpi/pisp/data/awb_model.tflite       | Bin 0 -> 47624 bytes\n> >\n> > I'm sure the question of where to store these binary blobs will generate\n> some discussion.\n>\n> The sooner the better, I'm sure!\n>\n\nWe package our other neural networks that get used in rpicam-apps/picamera2\nin a separate deb file (and git tree).\n\nPerhaps we do the same here and keep binary blobs out of the libcamera\ntree? 
We would need some sort of a graceful fallback if the AWB NN code\ncannot find the model file.\n\n\nNaush\n\n\n> Thanks\n> David\n>\n> >\n> >\n> > Regards,\n> > Barnabás Pőcze\n> >\n> > >   src/ipa/rpi/pisp/data/imx219.json            |  65 ++-\n> > >   src/ipa/rpi/pisp/data/imx296.json            |  64 ++-\n> > >   src/ipa/rpi/pisp/data/imx296_16mm.json       |  64 ++-\n> > >   src/ipa/rpi/pisp/data/imx296_6mm.json        |  64 ++-\n> > >   src/ipa/rpi/pisp/data/imx477.json            |  63 +++\n> > >   src/ipa/rpi/pisp/data/imx477_16mm.json       |  65 ++-\n> > >   src/ipa/rpi/pisp/data/imx477_6mm.json        |  65 ++-\n> > >   src/ipa/rpi/pisp/data/imx477_scientific.json |  79 +++-\n> > >   src/ipa/rpi/pisp/data/imx500.json            |  67 +++\n> > >   src/ipa/rpi/pisp/data/imx708.json            |  64 ++-\n> > >   src/ipa/rpi/pisp/data/imx708_wide.json       |  62 +++\n> > >   src/ipa/rpi/pisp/data/meson.build            |   7 +\n> > >   src/ipa/rpi/pisp/data/ov5647.json            |  63 +++\n> > >   src/ipa/rpi/vc4/data/awb_model.tflite        | Bin 0 -> 42976 bytes\n> > >   src/ipa/rpi/vc4/data/imx219.json             |  64 +++\n> > >   src/ipa/rpi/vc4/data/imx296.json             |  64 +++\n> > >   src/ipa/rpi/vc4/data/imx477.json             |  69 +++\n> > >   src/ipa/rpi/vc4/data/imx500.json             |  67 +++\n> > >   src/ipa/rpi/vc4/data/imx708.json             |  72 +++\n> > >   src/ipa/rpi/vc4/data/imx708_wide.json        |  62 +++\n> > >   src/ipa/rpi/vc4/data/meson.build             |   8 +\n> > >   src/ipa/rpi/vc4/data/ov5647.json             |  64 +++\n> > >   29 files changed, 2244 insertions(+), 428 deletions(-)\n> > >   create mode 100644 src/ipa/rpi/controller/rpi/awb_bayes.cpp\n> > >   create mode 100644 src/ipa/rpi/controller/rpi/awb_nn.cpp\n> > >   create mode 100644 src/ipa/rpi/pisp/data/awb_model.tflite\n> > >   create mode 100644 src/ipa/rpi/vc4/data/awb_model.tflite\n> > >\n> 
>\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id F13A8BE080\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 24 Oct 2025 18:17:55 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id CE524609DE;\n\tFri, 24 Oct 2025 20:17:54 +0200 (CEST)","from mail-vs1-xe33.google.com (mail-vs1-xe33.google.com\n\t[IPv6:2607:f8b0:4864:20::e33])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id BA85460990\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Oct 2025 20:17:53 +0200 (CEST)","by mail-vs1-xe33.google.com with SMTP id\n\tada2fe7eead31-5db209ddacaso179206137.1\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Oct 2025 11:17:53 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"Xb0oO0rH\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google; t=1761329872; x=1761934672;\n\tdarn=lists.libcamera.org; \n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:from:to:cc:subject:date:message-id:reply-to;\n\tbh=4oVtY9cAId9QsPGhmDZn3awdpQf8ZXP8mfTZx7Or5H4=;\n\tb=Xb0oO0rHIrG+1dLI5tFYnX3XM8SXst4FHjd8g85xfqPThxAo2rog/sNOIY1l4R3i61\n\tR32FVTkQgqMHxhRxgkCE5dkUjsGWdSt/B3L9OaMMewHB4u2ClHSPvK1wdurYz3Eg666S\n\tGXSmOFwDn4rRZpJVTSj4Grw5ybF3HXVK/Pe0RkwIEAkhYqdscsSr+AQbHfHq+xSj303D\n\tDDrWMHjxUBAdBreoOPnKO/Uu16cJwFAJH2/6x6n2nSIEq9JkcthZsLZtrG8pqdaGIAIp\n\tWqswqa3SgnqCPjf7fHn1uEYBHDYJpORRZnlMpmkXeXZJpKNLguNb78ZEimm1VEwMfENm\n\t41WQ==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20230601; 
t=1761329872; x=1761934672;\n\th=cc:to:subject:message-id:date:from:in-reply-to:references\n\t:mime-version:x-gm-message-state:from:to:cc:subject:date:message-id\n\t:reply-to;\n\tbh=4oVtY9cAId9QsPGhmDZn3awdpQf8ZXP8mfTZx7Or5H4=;\n\tb=YPCMunZr0hvqBHVmJWrRiaz9U2lG9IDLtiDMk6NAjYS1lY3U0ohfPbeoGlRzlHCh+B\n\tj6wuMM1/oLIzkjKlM8Pae/I7hXwBLbsjKTQNSLSKGTrxyApt2oq8az054eN5n5KLB/aP\n\tKQzL72W2SslW9ifMqWFvM4z/UAERTkHap25qOrImTguGc8dDT+Se1gQPVPMqj8h0dJKu\n\t7YQdQgZB7yYi+cVIcwadXw+W1CJTnnf8FRR8iq5Lw3NIo+bft/eZfFU6eywf9B2S31eR\n\tAVH5rkd8yRpXNBsaVyxqkutEYx+rHeZ7opSGSXXggTMgiJTtyBkgAjwd8ah5bSeTN86a\n\tIItw==","X-Forwarded-Encrypted":"i=1;\n\tAJvYcCWCPZLlES71ZMel1Do0kD8kBUCGThMg74i7ZRLacrwunCgikuCIUdc/DXIlel6YVtBh/KV2C3XnwG88rWXBCD4=@lists.libcamera.org","X-Gm-Message-State":"AOJu0Ywlf/Ki3rT/v+som2xrSczwo5HMrxrC2/V/t+PTBFe83q7/TIL5\n\tPjho5+K+0JTQGHopX+D8QRqyiPfxtt2SDVc92pS3QMvjb9f/JqvrKvfrN2UehVn/Q369giMg4RY\n\tHV4KpQ8AXgphQxCuyAmyQFLqQ67ISsh9hxEn2IinqXw==","X-Gm-Gg":"ASbGncs66aqRw6oD8pixcgvJ218b2z5Nzruh9OZcZo4lY7fJeOd71eWEfgsnCPCH0mJ\n\tOBLHp2yPKskpVyAAKLa+oUH7NjzIgtBSkO1nsN9F4UnLxDxx8FNtPTdlbctGMgWNuIpc/V1ZExN\n\tnlIbxk2PgpJbuBysROKBki7KgY037Q3i+T6kcaGIfeXOMRHZs8c1EyA9ElPNBD9svx8hm5Cj25L\n\tNVfZeLsez6nIPcIQq9NcPPJ1qbpgMtaAQmiNWovcocNvh0DVz4ms/20ZM6Am9LKmA43d5BcM+bw\n\tKlzrSOo=","X-Google-Smtp-Source":"AGHT+IGurgkaFgeDzc7mY+aAgm0F9an/Eq+zx7PxWaAOQEEeZr/1ohi/dgcspzyz2YqUL6N2/z3ppNdXillGZ/0uKwk=","X-Received":"by 2002:a05:6102:6b09:b0:5db:3585:78cb with SMTP id\n\tada2fe7eead31-5db35a3ffe4mr1180608137.5.1761329870385;\n\tFri, 24 Oct 2025 11:17:50 -0700 (PDT)","MIME-Version":"1.0","References":"<20251024144049.3311-1-david.plowman@raspberrypi.com>\n\t<e5a7a328-b188-4608-b1e4-9bacb4a81ca4@ideasonboard.com>\n\t<CAHW6GYKpgXWp0eSe1J-CSNxY43q5dDLAkEUvpPJWR22p9ArzRQ@mail.gmail.com>","In-Reply-To":"<CAHW6GYKpgXWp0eSe1J-CSNxY43q5dDLAkEUvpPJWR22p9ArzRQ@mail.gmail.com>","From":"Naushir Patuck <naush@raspberrypi.com>","Date":"Fri, 24 Oct 2025 19:17:42 
+0100","X-Gm-Features":"AWmQ_bmxQTQg6J0THDfkF3Pn_P736dXrvRuoLDcahgF8urAgE5Vtq_WNIFyIYhU","Message-ID":"<CAEmqJPqX2DJBYSnZehk6a5qezPmpwhZn2sgBjL4WFnx7cYW9zA@mail.gmail.com>","Subject":"Re: [PATCH 0/4] Raspberry Pi AWB using neural networks","To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"=?utf-8?q?Barnab=C3=A1s_P=C5=91cze?= <barnabas.pocze@ideasonboard.com>,\n\tlibcamera devel <libcamera-devel@lists.libcamera.org>","Content-Type":"multipart/alternative; boundary=\"00000000000041c9060641eb919c\"","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":36459,"web_url":"https://patchwork.libcamera.org/comment/36459/","msgid":"<CAHW6GYJR=NN3AToEcY5n9so3-NdRGA=O_KvQshvp-CY9HUuK3g@mail.gmail.com>","date":"2025-10-25T08:07:58","subject":"Re: [PATCH 0/4] Raspberry Pi AWB using neural networks","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"content":"I think it falls back to grey world if it fails to load the network. 
To be\nfair, failing to start the camera system at all (with a helpful message)\nmight even be preferable; it would definitely stop you running the \"wrong\"\nthing without noticing.\n\nDavid\n\nOn Fri, 24 Oct 2025 at 19:17, Naushir Patuck <naush@raspberrypi.com> wrote:\n\n>\n>\n> On Fri, 24 Oct 2025, 5:41 pm David Plowman, <david.plowman@raspberrypi.com>\n> wrote:\n>\n>> Hi Barnabas\n>>\n>> On Fri, 24 Oct 2025 at 15:56, Barnabás Pőcze <\n>> barnabas.pocze@ideasonboard.com> wrote:\n>> >\n>> > Hi\n>> >\n>> >\n>> > 2025. 10. 24. 16:15 keltezéssel, David Plowman írta:\n>> > > Hi everyone\n>> > >\n>> > > Here's our first go at doing AWB using neural networks. For both the\n>> > > PiSP and VC4 we take the image statistics and feed them to a neural\n>> > > network which tells us what it thinks the colour temperature should\n>> > > be.\n>> > >\n>> > > The first commit splits our existing Awb class into an Awb base class,\n>> > > and an AwbBayes derived class, so that the subsequent AwbNN class (2nd\n>> > > commit) can share code.\n>> > >\n>> > > Finally we update the tuning files for the new AWB and supply a TFLite\n>> > > model for each platform, though by default it remains off for\n>> > > now. Just switch the \"disable\" tag between the two AWB configurations\n>> > > to enable the new algorithm.\n>> > >\n>> > > We'll shortly be making public our image capture/annotation/training\n>> > > code which will include the full source for the model definitions. It\n>> > > will allow users to train their own, either from scratch, or\n>> > > incrementally using our datasets and models as a starting point.\n>> > >\n>> > > This is the work of our intern who was with us over the summer.\n>> > >\n>> > > Thanks!\n>> > > David\n>> > >\n>> >\n>> > This definitely sounds interesting! Maybe I missed them, but are\n>> > there are any numbers about performance, etc? 
Or images showcasing\n>> > the differences on the same scenes?\n>>\n>> That's definitely a good question, and one which would require a rather\n>> long answer to explain fully. I could probably get some numbers easily\n>> enough comparing the new algorithm against the old one for the data it's\n>> been trained on, but unsurprisingly that will show it to be much better.\n>>\n>> Really we need to be comparing on a fresh set of real images which is\n>> trickier. We have problems related to: how to run both algorithms on the\n>> same raw buffer or DNG, not having versions of AWB code that can run on a\n>> DNG, or at least not in quite the same way as it does on the hardware,\n>> having insufficient training data, risks of over-fitting, real-world images\n>> with mixed lighting where the white balance is very subjective, the list\n>> goes on.\n>>\n>> But I do have a plan to increase the number of training images, and maybe\n>> part of that work could be capturing comparisons. I think there might be a\n>> way to get true direct comparisons, so I'll have a think about that.\n>>\n>> If folks would like to see a comparison, here's an image taken just now\n>> through my window. The sun's going down outside so it's pretty bright, and\n>> indoors I have some low CT (very orange) strip LEDs.\n>>\n>> Old algorithm:\n>>\n>> https://drive.google.com/file/d/1f-n9RA25IEgf4si11Ruf5-W77tonRWg4/view?usp=sharing\n>>\n>> It doesn't really know what to do, so the Bayes algorithm has neutralised\n>> everything (whilst sticking to the CT curve). Outside is a decidedly sickly\n>> blue, and indoors is a sickly off-white.\n>>\n>> New algorithm:\n>>\n>> https://drive.google.com/file/d/147wzS6exZVTHNpFc8VrqKH6zeGYi7n1O/view?usp=sharing\n>>\n>> You can see the neural network has balanced for outside, and it looks\n>> pretty spot on (but being a neural network, you couldn't necessarily\n>> predict that's what it would do). 
Indoors has gone orange, though in truth\n>> it does look nearly that orange. So things are always a bit nuanced, this\n>> is definitely way closer, but perhaps there was something aesthetically\n>> acceptable just a bit more in between? Hard to say. (This image wasn't in\n>> the training set, nor, I think, were there any I would obviously describe\n>> as very \"similar\".)\n>>\n>> >\n>> >\n>> > > Peter Bailey (4):\n>> > >    ipa: rpi: controller: awb: Separate Bayesian Awb into AwbBayes\n>> > >    ipa: rpi: controller: awb: Add Neural Network Awb\n>> > >    ipa: rpi: pisp: vc4: Update tuning files for new awb and add model\n>> > >    ipa: rpi: controller: Ignore algorithms starting with disable\n>> > >\n>> > >   src/ipa/rpi/controller/controller.cpp        |   6 +\n>> > >   src/ipa/rpi/controller/meson.build           |  10 +\n>> > >   src/ipa/rpi/controller/rpi/awb.cpp           | 409 ++---------------\n>> > >   src/ipa/rpi/controller/rpi/awb.h             |  99 ++---\n>> > >   src/ipa/rpi/controller/rpi/awb_bayes.cpp     | 444\n>> +++++++++++++++++++\n>> > >   src/ipa/rpi/controller/rpi/awb_nn.cpp        | 442\n>> ++++++++++++++++++\n>> > >   src/ipa/rpi/pisp/data/awb_model.tflite       | Bin 0 -> 47624 bytes\n>> >\n>> > I'm sure the question of where to store these binary blobs will\n>> generate some discussion.\n>>\n>> The sooner the better, I'm sure!\n>>\n>\n> We package our other neural networks that get used in\n> rpicam-apps/picamera2 in a separate deb file (and git tree).\n>\n> Perhaps we do the same here and keep binary blobs out of the libcamera\n> tree? 
We would need some sort of a graceful fallback if the AWB NN code\n> cannot find the model file.\n>\n>\n> Naush\n>\n>\n>> Thanks\n>> David\n>>\n>> >\n>> >\n>> > Regards,\n>> > Barnabás Pőcze\n>> >\n>> > >   src/ipa/rpi/pisp/data/imx219.json            |  65 ++-\n>> > >   src/ipa/rpi/pisp/data/imx296.json            |  64 ++-\n>> > >   src/ipa/rpi/pisp/data/imx296_16mm.json       |  64 ++-\n>> > >   src/ipa/rpi/pisp/data/imx296_6mm.json        |  64 ++-\n>> > >   src/ipa/rpi/pisp/data/imx477.json            |  63 +++\n>> > >   src/ipa/rpi/pisp/data/imx477_16mm.json       |  65 ++-\n>> > >   src/ipa/rpi/pisp/data/imx477_6mm.json        |  65 ++-\n>> > >   src/ipa/rpi/pisp/data/imx477_scientific.json |  79 +++-\n>> > >   src/ipa/rpi/pisp/data/imx500.json            |  67 +++\n>> > >   src/ipa/rpi/pisp/data/imx708.json            |  64 ++-\n>> > >   src/ipa/rpi/pisp/data/imx708_wide.json       |  62 +++\n>> > >   src/ipa/rpi/pisp/data/meson.build            |   7 +\n>> > >   src/ipa/rpi/pisp/data/ov5647.json            |  63 +++\n>> > >   src/ipa/rpi/vc4/data/awb_model.tflite        | Bin 0 -> 42976 bytes\n>> > >   src/ipa/rpi/vc4/data/imx219.json             |  64 +++\n>> > >   src/ipa/rpi/vc4/data/imx296.json             |  64 +++\n>> > >   src/ipa/rpi/vc4/data/imx477.json             |  69 +++\n>> > >   src/ipa/rpi/vc4/data/imx500.json             |  67 +++\n>> > >   src/ipa/rpi/vc4/data/imx708.json             |  72 +++\n>> > >   src/ipa/rpi/vc4/data/imx708_wide.json        |  62 +++\n>> > >   src/ipa/rpi/vc4/data/meson.build             |   8 +\n>> > >   src/ipa/rpi/vc4/data/ov5647.json             |  64 +++\n>> > >   29 files changed, 2244 insertions(+), 428 deletions(-)\n>> > >   create mode 100644 src/ipa/rpi/controller/rpi/awb_bayes.cpp\n>> > >   create mode 100644 src/ipa/rpi/controller/rpi/awb_nn.cpp\n>> > >   create mode 100644 src/ipa/rpi/pisp/data/awb_model.tflite\n>> > >   create mode 100644 src/ipa/rpi/vc4/data/awb_model.tflite\n>> > >\n>> 
>\n>>\n>","headers":{"From":"David Plowman <david.plowman@raspberrypi.com>","Date":"Sat, 25 Oct 2025 09:07:58 +0100","Message-ID":"<CAHW6GYJR=NN3AToEcY5n9so3-NdRGA=O_KvQshvp-CY9HUuK3g@mail.gmail.com>","Subject":"Re: [PATCH 0/4] Raspberry Pi AWB using neural networks","To":"Naushir Patuck <naush@raspberrypi.com>","Cc":"Barnabás Pőcze <barnabas.pocze@ideasonboard.com>, libcamera devel <libcamera-devel@lists.libcamera.org>"}}]
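
The fallback behaviour discussed in the thread — use the neural network when its TFLite model loads, otherwise log a helpful message and drop back to grey world — can be sketched as below. This is an illustrative sketch only: the names `greyWorldGains`, `selectAwbMode` and `AwbMode` are invented for this example and are not libcamera's actual rpi IPA API, and the sketch stands in for real TFLite model loading with a simple file-open check.

```cpp
// Illustrative sketch only: these names are NOT libcamera's actual rpi IPA API.
#include <fstream>
#include <iostream>
#include <string>
#include <utility>

// Grey-world assumption: the scene averages out to neutral grey, so the
// red and blue gains are chosen to equalise the channel means with green.
std::pair<double, double> greyWorldGains(double rAvg, double gAvg, double bAvg)
{
	return { gAvg / rAvg, gAvg / bAvg };
}

enum class AwbMode { NeuralNetwork, GreyWorld };

// Use the neural network only if the model file can be opened; otherwise
// print a helpful message and fall back to grey world, as the thread suggests.
// (A real implementation would hand the file to the TFLite interpreter here.)
AwbMode selectAwbMode(const std::string &modelPath)
{
	std::ifstream model(modelPath, std::ios::binary);
	if (!model.good()) {
		std::cerr << "AWB: cannot load model '" << modelPath
			  << "', falling back to grey world\n";
		return AwbMode::GreyWorld;
	}
	return AwbMode::NeuralNetwork;
}
```

David's alternative in the thread — refusing to start the camera system at all, with a helpful message — would simply replace the grey-world return with a hard error at the same point.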