[0/4] Raspberry Pi AWB using neural networks

Message ID 20251024144049.3311-1-david.plowman@raspberrypi.com

Message

David Plowman Oct. 24, 2025, 2:15 p.m. UTC
Hi everyone

Here's our first go at doing AWB using neural networks. For both the
PiSP and VC4 we take the image statistics and feed them to a neural
network which tells us what it thinks the colour temperature should
be.

The first commit splits our existing Awb class into an Awb base class,
and an AwbBayes derived class, so that the subsequent AwbNN class (2nd
commit) can share code.

Finally we update the tuning files for the new AWB and supply a TFLite
model for each platform, though by default it remains off for
now. Just switch the "disable" tag between the two AWB configurations
to enable the new algorithm.
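By way of illustration only (the entry names and parameters here are guessed from the commit titles, not copied from the actual tuning files), the idea is that the controller skips any algorithm whose name starts with "disable", so swapping the prefix between the two AWB entries picks the active one:

```json
{
    "algorithms": [
        { "rpi.awb": { "bayes": 1 } },
        { "disable.rpi.awb": { "model_file": "awb_model.tflite" } }
    ]
}
```

As written, the Bayesian AWB runs and the neural network entry is ignored; moving the "disable." prefix to the first entry would flip that.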

We'll shortly be making public our image capture/annotation/training
code which will include the full source for the model definitions. It
will allow users to train their own, either from scratch, or
incrementally using our datasets and models as a starting point.

This is the work of our intern who was with us over the summer.

Thanks!
David

Peter Bailey (4):
  ipa: rpi: controller: awb: Separate Bayesian Awb into AwbBayes
  ipa: rpi: controller: awb: Add Neural Network Awb
  ipa: rpi: pisp: vc4: Update tuning files for new awb and add model
  ipa: rpi: controller: Ignore algorithms starting with disable

 src/ipa/rpi/controller/controller.cpp        |   6 +
 src/ipa/rpi/controller/meson.build           |  10 +
 src/ipa/rpi/controller/rpi/awb.cpp           | 409 ++---------------
 src/ipa/rpi/controller/rpi/awb.h             |  99 ++---
 src/ipa/rpi/controller/rpi/awb_bayes.cpp     | 444 +++++++++++++++++++
 src/ipa/rpi/controller/rpi/awb_nn.cpp        | 442 ++++++++++++++++++
 src/ipa/rpi/pisp/data/awb_model.tflite       | Bin 0 -> 47624 bytes
 src/ipa/rpi/pisp/data/imx219.json            |  65 ++-
 src/ipa/rpi/pisp/data/imx296.json            |  64 ++-
 src/ipa/rpi/pisp/data/imx296_16mm.json       |  64 ++-
 src/ipa/rpi/pisp/data/imx296_6mm.json        |  64 ++-
 src/ipa/rpi/pisp/data/imx477.json            |  63 +++
 src/ipa/rpi/pisp/data/imx477_16mm.json       |  65 ++-
 src/ipa/rpi/pisp/data/imx477_6mm.json        |  65 ++-
 src/ipa/rpi/pisp/data/imx477_scientific.json |  79 +++-
 src/ipa/rpi/pisp/data/imx500.json            |  67 +++
 src/ipa/rpi/pisp/data/imx708.json            |  64 ++-
 src/ipa/rpi/pisp/data/imx708_wide.json       |  62 +++
 src/ipa/rpi/pisp/data/meson.build            |   7 +
 src/ipa/rpi/pisp/data/ov5647.json            |  63 +++
 src/ipa/rpi/vc4/data/awb_model.tflite        | Bin 0 -> 42976 bytes
 src/ipa/rpi/vc4/data/imx219.json             |  64 +++
 src/ipa/rpi/vc4/data/imx296.json             |  64 +++
 src/ipa/rpi/vc4/data/imx477.json             |  69 +++
 src/ipa/rpi/vc4/data/imx500.json             |  67 +++
 src/ipa/rpi/vc4/data/imx708.json             |  72 +++
 src/ipa/rpi/vc4/data/imx708_wide.json        |  62 +++
 src/ipa/rpi/vc4/data/meson.build             |   8 +
 src/ipa/rpi/vc4/data/ov5647.json             |  64 +++
 29 files changed, 2244 insertions(+), 428 deletions(-)
 create mode 100644 src/ipa/rpi/controller/rpi/awb_bayes.cpp
 create mode 100644 src/ipa/rpi/controller/rpi/awb_nn.cpp
 create mode 100644 src/ipa/rpi/pisp/data/awb_model.tflite
 create mode 100644 src/ipa/rpi/vc4/data/awb_model.tflite

Comments

Barnabás Pőcze Oct. 24, 2025, 2:56 p.m. UTC | #1
Hi


On 2025-10-24 16:15, David Plowman wrote:
> [...]

This definitely sounds interesting! Maybe I missed them, but are
there any numbers about performance, etc? Or images showcasing
the differences on the same scenes?


> [...]
>   src/ipa/rpi/pisp/data/awb_model.tflite       | Bin 0 -> 47624 bytes

I'm sure the question of where to store these binary blobs will generate some discussion.


Regards,
Barnabás Pőcze

David Plowman Oct. 24, 2025, 4:41 p.m. UTC | #2
Hi Barnabas

On Fri, 24 Oct 2025 at 15:56, Barnabás Pőcze <barnabas.pocze@ideasonboard.com> wrote:
>
> [...]
>
> This definitely sounds interesting! Maybe I missed them, but are
> there any numbers about performance, etc? Or images showcasing
> the differences on the same scenes?

That's definitely a good question, and one which would require a rather
long answer to explain fully. I could probably get some numbers easily
enough comparing the new algorithm against the old one for the data it's
been trained on, but unsurprisingly that will show it to be much better.

Really we need to be comparing on a fresh set of real images, which is
trickier. We have problems related to: how to run both algorithms on the
same raw buffer or DNG; not having versions of the AWB code that can run
on a DNG, or at least not in quite the same way as it does on the
hardware; insufficient training data; risks of over-fitting; real-world
images with mixed lighting where the white balance is very subjective;
the list goes on.

But I do have a plan to increase the number of training images, and maybe
part of that work could be capturing comparisons. I think there might be a
way to get true direct comparisons, so I'll have a think about that.

If folks would like to see a comparison, here's an image taken just now
through my window. The sun's going down outside so it's pretty bright, and
indoors I have some low CT (very orange) strip LEDs.

Old algorithm:
https://drive.google.com/file/d/1f-n9RA25IEgf4si11Ruf5-W77tonRWg4/view?usp=sharing

It doesn't really know what to do, so the Bayes algorithm has neutralised
everything (whilst sticking to the CT curve). Outside is a decidedly sickly
blue, and indoors is a sickly off-white.

New algorithm:
https://drive.google.com/file/d/147wzS6exZVTHNpFc8VrqKH6zeGYi7n1O/view?usp=sharing

You can see the neural network has balanced for outside, and it looks
pretty spot on (but being a neural network, you couldn't necessarily
predict that's what it would do). Indoors has gone orange, though in truth
it does look nearly that orange. So things are always a bit nuanced; this
is definitely way closer, but perhaps there was something aesthetically
acceptable just a bit more in between? Hard to say. (This image wasn't in
the training set, nor, I think, were there any I would obviously describe
as very "similar".)

>
>
> > [...]
> >   src/ipa/rpi/pisp/data/awb_model.tflite       | Bin 0 -> 47624 bytes
>
> I'm sure the question of where to store these binary blobs will generate some discussion.

The sooner the better, I'm sure!

Thanks
David

Naushir Patuck Oct. 24, 2025, 6:17 p.m. UTC | #3
On Fri, 24 Oct 2025, 5:41 pm David Plowman <david.plowman@raspberrypi.com> wrote:

>
> [...]
>
> > I'm sure the question of where to store these binary blobs will generate some discussion.
>
> The sooner the better, I'm sure!

We package our other neural networks that get used in rpicam-apps/picamera2
in a separate deb file (and git tree).

Perhaps we do the same here and keep binary blobs out of the libcamera
tree? We would need some sort of a graceful fallback if the AWB NN code
cannot find the model file.


Naush


David Plowman Oct. 25, 2025, 8:07 a.m. UTC | #4
I think it falls back to grey world if it fails to load the network. To be
fair, failing to start the camera system at all (with a helpful message)
might even be preferable; it would definitely stop you running the "wrong"
thing without noticing.

David

On Fri, 24 Oct 2025 at 19:17, Naushir Patuck <naush@raspberrypi.com> wrote:

> [...]
>
> We package our other neural networks that get used in
> rpicam-apps/picamera2 in a separate deb file (and git tree).
>
> Perhaps we do the same here and keep binary blobs out of the libcamera
> tree? We would need some sort of a graceful fallback if the AWB NN code
> cannot find the model file.
>
>
> Naush