[libcamera-devel,v4,00/12] utils: tuning: Add a new tuning infrastructure

Message ID 20221124113550.2182342-1-paul.elder@ideasonboard.com
Series
  • utils: tuning: Add a new tuning infrastructure

Message

Paul Elder Nov. 24, 2022, 11:35 a.m. UTC
This patch series adds a new tuning infrastructure called libtuning,
inspired by ctt.

The new changes in v4 are fairly minor. One notable change is that the
less-accurate rounding copied from ctt was replaced with simple
rounding, so the output no longer matches *exactly* what ctt outputs.

Main non-cosmetic changes in v3:
- remove full raspberry pi tuning script
- split core libtuning implementation into three patches

The design modularizes common components of tuning tools such that
new tuning scripts for new platforms can be created easily, as shown in
patches 11/12 to 12/12 (v3: the full Raspberry Pi tuning script was
removed because it is DNI and clearly incomplete).

The common "core" components include file management, argument
parsing, image loading and validation, and macbeth chart detection, as
well as miscellaneous (but tedious) math utilities. It connects
everything together such that a platform's tuning script can very easily
customize tuning modules in a variety of ways, even including the format
of the input configuration file and the output tuning file. These are
all implemented in patches 1/12 to 3/12 (split from one to three patches
as of v3).

The input configuration file and output tuning file can also have
different formats, which is why they each have their own classes. As of v1,
only raspberrypi's formats were implemented, in patches 7/12 and 8/12
respectively. As of v2, output for yaml was added in patch 10/12. A
skeleton for yaml input is added as well in patch 9/12, though it is not
implemented yet as there is no specification for the yaml input
configuration format, and the only existing user of it doesn't actually
need a configuration file.
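
As an illustration of why the formats get their own classes, an output
generator for a new format mainly needs a write() method, which the Tuner
calls with the output path, a dict mapping each module to its results, and
the output order. The sketch below assumes the base class is named
Generator and uses a made-up plain-text format; the actual interface is in
generators/generator.py.

    # Hypothetical output generator sketch; the Generator base class name
    # and this trivial plain-text format are assumptions for illustration.
    from libtuning.generators.generator import Generator


    class PlainTextOutput(Generator):
        # Tuner.run() calls write(output_file, output_dict, output_order),
        # where output_dict maps module instances to their tuning results.
        def write(self, output_file: str, output_dict: dict, output_order: list):
            with open(output_file, 'w') as f:
                for module in output_order:
                    f.write(f'{module.out_name}: {output_dict[module]}\n')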

As of v2, it became apparent that it was infeasible to create an ALSC
module (that's what we're starting with) that could completely support
different platforms with configuration options alone. So, the ALSC
module was split into a base one (patch 4/12), with variations for
raspberrypi and rkisp1 implemented on top of it in patches 5/12 and 6/12
respectively. I think they came out quite nice, and they are still more
manageable than an entirely new tuning module per platform.
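
Concretely, a platform variation boils down to subclassing the base LSC
module and overriding its processing. A very rough sketch follows; the
class is hypothetical and the details are simplified, the real interface
is in patches 4/12 to 6/12.

    # Hypothetical platform LSC module sketch; details simplified.
    from libtuning.modules.lsc import LSC


    class LSCMyPlatform(LSC):
        # Key used for this module's results in the output tuning file,
        # e.g. 'LensShadingCorrection' for rkisp1 or 'rpi.alsc' for raspberrypi
        out_name = 'LensShadingCorrection'

        # Tuner.run() passes the parsed config, the loaded Image instances,
        # and the outputs of the modules that ran before this one
        def process(self, config: dict, images: list, outputs: dict) -> dict:
            # The platform-specific table computation goes here; an empty
            # dict stands in for the real result in this sketch
            return {}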

v2:
I have also since managed to get test images and so this entire thing
runs! Even though the rkisp1 tuning script (patch 12/12) says "WIP", it
does run and outputs a valid tuning file. I haven't tried using the
tuning file with libcamera though, as it is missing the /other/ algorithm
tuning results.

The output from the libtuning-based alsc-only raspberrypi tuning script
(patch 11/12) has been confirmed to be character-for-character exactly
the same as the output from ctt's alsc-only tuning script.


Paul Elder (12):
  utils: tuning: libtuning: Implement the core of libtuning
  utils: tuning: libtuning: Implement math helpers
  utils: tuning: libtuning: Implement extensible components of libtuning
  utils: libtuning: modules: Add base LSC module
  utils: libtuning: modules: alsc: Add raspberrypi ALSC module
  utils: libtuning: modules: alsc: Add rkisp1 LSC module
  utils: libtuning: parsers: Add raspberrypi parser
  utils: libtuning: generators: Add raspberrypi output
  utils: libtuning: parsers: Add yaml parser
  utils: libtuning: generators: Add yaml output
  utils: tuning: Add alsc-only libtuning raspberrypi tuning script
  utils: tuning: Add tuning script for rkisp1

 utils/tuning/README.rst                       |  11 +
 utils/tuning/libtuning/__init__.py            |  13 +
 utils/tuning/libtuning/average.py             |  21 +
 utils/tuning/libtuning/generators/__init__.py |   6 +
 .../tuning/libtuning/generators/generator.py  |  15 +
 .../generators/raspberrypi_output.py          | 114 ++++
 .../libtuning/generators/yaml_output.py       | 123 +++++
 utils/tuning/libtuning/gradient.py            |  75 +++
 utils/tuning/libtuning/image.py               | 136 +++++
 utils/tuning/libtuning/libtuning.py           | 208 +++++++
 utils/tuning/libtuning/macbeth.py             | 516 ++++++++++++++++++
 utils/tuning/libtuning/macbeth_ref.pgm        |   6 +
 utils/tuning/libtuning/modules/__init__.py    |   0
 .../tuning/libtuning/modules/lsc/__init__.py  |   7 +
 utils/tuning/libtuning/modules/lsc/lsc.py     |  72 +++
 .../libtuning/modules/lsc/raspberrypi.py      | 246 +++++++++
 utils/tuning/libtuning/modules/lsc/rkisp1.py  | 112 ++++
 utils/tuning/libtuning/modules/module.py      |  32 ++
 utils/tuning/libtuning/parsers/__init__.py    |   6 +
 utils/tuning/libtuning/parsers/parser.py      |  21 +
 .../libtuning/parsers/raspberrypi_parser.py   |  93 ++++
 utils/tuning/libtuning/parsers/yaml_parser.py |  17 +
 utils/tuning/libtuning/smoothing.py           |  24 +
 utils/tuning/libtuning/utils.py               | 125 +++++
 utils/tuning/raspberrypi/__init__.py          |   0
 utils/tuning/raspberrypi/alsc.py              |  19 +
 utils/tuning/raspberrypi_alsc_only.py         |  23 +
 utils/tuning/rkisp1.py                        |  40 ++
 28 files changed, 2081 insertions(+)
 create mode 100644 utils/tuning/README.rst
 create mode 100644 utils/tuning/libtuning/__init__.py
 create mode 100644 utils/tuning/libtuning/average.py
 create mode 100644 utils/tuning/libtuning/generators/__init__.py
 create mode 100644 utils/tuning/libtuning/generators/generator.py
 create mode 100644 utils/tuning/libtuning/generators/raspberrypi_output.py
 create mode 100644 utils/tuning/libtuning/generators/yaml_output.py
 create mode 100644 utils/tuning/libtuning/gradient.py
 create mode 100644 utils/tuning/libtuning/image.py
 create mode 100644 utils/tuning/libtuning/libtuning.py
 create mode 100644 utils/tuning/libtuning/macbeth.py
 create mode 100644 utils/tuning/libtuning/macbeth_ref.pgm
 create mode 100644 utils/tuning/libtuning/modules/__init__.py
 create mode 100644 utils/tuning/libtuning/modules/lsc/__init__.py
 create mode 100644 utils/tuning/libtuning/modules/lsc/lsc.py
 create mode 100644 utils/tuning/libtuning/modules/lsc/raspberrypi.py
 create mode 100644 utils/tuning/libtuning/modules/lsc/rkisp1.py
 create mode 100644 utils/tuning/libtuning/modules/module.py
 create mode 100644 utils/tuning/libtuning/parsers/__init__.py
 create mode 100644 utils/tuning/libtuning/parsers/parser.py
 create mode 100644 utils/tuning/libtuning/parsers/raspberrypi_parser.py
 create mode 100644 utils/tuning/libtuning/parsers/yaml_parser.py
 create mode 100644 utils/tuning/libtuning/smoothing.py
 create mode 100644 utils/tuning/libtuning/utils.py
 create mode 100644 utils/tuning/raspberrypi/__init__.py
 create mode 100644 utils/tuning/raspberrypi/alsc.py
 create mode 100755 utils/tuning/raspberrypi_alsc_only.py
 create mode 100755 utils/tuning/rkisp1.py

Comments

Laurent Pinchart Nov. 25, 2022, 12:56 a.m. UTC | #1
Hi Paul,

Thank you for the patch.

On Thu, Nov 24, 2022 at 08:35:39PM +0900, Paul Elder via libcamera-devel wrote:
> Implement the core of libtuning, our new tuning tool infrastructure. It
> leverages components from raspberrypi's ctt that could be reused for
> tuning tools for other platforms.
> 
> The core components include:
> - The Image class
> - libtuning (entry point and other core functions)
> - macbeth-related tools, including the macbeth reference image
> - utils
> 
> Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
> 
> ---
> Changes in v4:
> - change Image.name to property
> - fix saturated logic in get_patches
> - move lsc vs non-lsc image loading decision from utils to Tuner
>   - this means we have to import libtuning in libtuning.py... is this
>     fine?
> - remove raspberry pi's special rounding function, as it has worse
>   rounding accuracy than simple rounding
> - remove cli args from module.process parameters
> 
> Changes in v3:
> - *Split into separate patches*
>   - The following changes apply to the next two patches as well
> - fix style
> - rename Camera to Tuner
> - remove indirection from fake polymorphism
> - remove unused options property from Module
> - remove unimplemented gradients
> - convert readme to rst
> - fix readme license
> - reorder dependencies list
> - add file descriptions
> - remove indirection from Image loading
> - remove Image member variables that are unused due to dropping BRCM
>   support
> - remove G from Color enum
>   - Color was /not/ renamed to BayerComponent because it was much too
>     long for use in code
> - add @property getters to Param
> - fix undefined functions/variables
> 
> Changes in v2:
> - fix all python errors
> - fix style
> - add SPDX and copyright
> - remove validateConfig() from the base/abstract Module class
> - actually append the image after loading, even if it's alsc_only
> - s/average_functions/average/
> - remove separate params field for Average and Smoothing
> - move remainder parameter in Gradient to Linear, as it only applies to
>   that
> - from gradient.Linear, remove the remainders that I thought don't make
>   sense
> - add Float to gradient.Linear's remainder types, to divide everything
>   in as a float; useful for rkisp1's sector sizes (the x-size and y-size
>   tuning options)
> - add a map function to Gradient, for mapping values onto a curve
> - in Smoothing, move ksize to a constructor parameter
> - remove brcm image loading
> - move process_args from utils to libtuning
> - move Module's type string and human-readable module name to class
>   variable
> - move locate_macbeth from utils to macbeth
> - add out_name to Module, for the output to know what name to write for
>   the key in the tuning output (eg. rkisp1 uses "LensShadingCorrection"
>   while raspberrypi uses "rpi.alsc")
> ---
>  utils/tuning/README.rst                |  11 +
>  utils/tuning/libtuning/__init__.py     |  13 +
>  utils/tuning/libtuning/image.py        | 136 +++++++
>  utils/tuning/libtuning/libtuning.py    | 208 ++++++++++
>  utils/tuning/libtuning/macbeth.py      | 516 +++++++++++++++++++++++++
>  utils/tuning/libtuning/macbeth_ref.pgm |   6 +
>  utils/tuning/libtuning/utils.py        | 125 ++++++
>  7 files changed, 1015 insertions(+)
>  create mode 100644 utils/tuning/README.rst
>  create mode 100644 utils/tuning/libtuning/__init__.py
>  create mode 100644 utils/tuning/libtuning/image.py
>  create mode 100644 utils/tuning/libtuning/libtuning.py
>  create mode 100644 utils/tuning/libtuning/macbeth.py
>  create mode 100644 utils/tuning/libtuning/macbeth_ref.pgm
>  create mode 100644 utils/tuning/libtuning/utils.py
> 
> diff --git a/utils/tuning/README.rst b/utils/tuning/README.rst
> new file mode 100644
> index 00000000..ce533b2c
> --- /dev/null
> +++ b/utils/tuning/README.rst
> @@ -0,0 +1,11 @@
> +.. SPDX-License-Identifier: CC-BY-SA-4.0
> +
> +.. TODO: Write an overview of libtuning
> +
> +Dependencies
> +------------
> +
> +- cv2
> +- numpy
> +- pyexiv2
> +- rawpy
> diff --git a/utils/tuning/libtuning/__init__.py b/utils/tuning/libtuning/__init__.py
> new file mode 100644
> index 00000000..93049976
> --- /dev/null
> +++ b/utils/tuning/libtuning/__init__.py
> @@ -0,0 +1,13 @@
> +# SPDX-License-Identifier: GPL-2.0-or-later
> +#
> +# Copyright (C) 2022, Paul Elder <paul.elder@ideasonboard.com>
> +
> +from libtuning.utils import *
> +from libtuning.libtuning import *
> +
> +from libtuning.image import *
> +from libtuning.macbeth import *
> +
> +from libtuning.average import *
> +from libtuning.gradient import *
> +from libtuning.smoothing import *
> diff --git a/utils/tuning/libtuning/image.py b/utils/tuning/libtuning/image.py
> new file mode 100644
> index 00000000..aa9d20b5
> --- /dev/null
> +++ b/utils/tuning/libtuning/image.py
> @@ -0,0 +1,136 @@
> +# SPDX-License-Identifier: BSD-2-Clause
> +#
> +# Copyright (C) 2019, Raspberry Pi Ltd
> +#
> +# image.py - Container for an image and associated metadata
> +
> +import binascii
> +import numpy as np
> +from pathlib import Path
> +import pyexiv2 as pyexif
> +import rawpy as raw
> +import re
> +
> +import libtuning as lt
> +import libtuning.utils as utils
> +
> +
> +class Image:
> +    def __init__(self, path: Path):
> +        self.path = path
> +        self.lsc_only = False
> +        self.color = -1
> +        self.lux = -1
> +
> +        try:
> +            self._load_metadata_exif()
> +        except Exception as e:
> +            utils.eprint(f'Failed to load metadata from {self.path}: {e}')
> +            raise e
> +
> +        try:
> +            self._read_image_dng()
> +        except Exception as e:
> +            utils.eprint(f'Failed to load image data from {self.path}: {e}')
> +            raise e
> +
> +    @property
> +    def name(self):
> +        return self.path.name
> +
> +    # May raise KeyError as there are too many to check
> +    def _load_metadata_exif(self):
> +        # RawPy doesn't load all the image tags that we need, so we use py3exiv2
> +        metadata = pyexif.ImageMetadata(str(self.path))
> +        metadata.read()
> +
> +        # The DNG and TIFF/EP specifications use different IFDs to store the
> +        # raw image data and the Exif tags. DNG stores them in a SubIFD and in
> +        # an Exif IFD respectively (named "SubImage1" and "Photo" by pyexiv2),
> +        # while TIFF/EP stores them both in IFD0 (name "Image"). Both are used
> +        # in "DNG" files, with libcamera-apps following the DNG recommendation
> +        # and applications based on picamera2 following TIFF/EP.
> +        #
> +        # This code detects which tags are being used, and therefore extracts the
> +        # correct values.
> +        try:
> +            self.w = metadata['Exif.SubImage1.ImageWidth'].value
> +            subimage = 'SubImage1'
> +            photo = 'Photo'
> +        except KeyError:
> +            self.w = metadata['Exif.Image.ImageWidth'].value
> +            subimage = 'Image'
> +            photo = 'Image'
> +        self.pad = 0
> +        self.h = metadata[f'Exif.{subimage}.ImageLength'].value
> +        white = metadata[f'Exif.{subimage}.WhiteLevel'].value
> +        self.sigbits = int(white).bit_length()
> +        self.fmt = (self.sigbits - 4) // 2
> +        self.exposure = int(metadata[f'Exif.{photo}.ExposureTime'].value * 1000000)
> +        self.againQ8 = metadata[f'Exif.{photo}.ISOSpeedRatings'].value * 256 / 100
> +        self.againQ8_norm = self.againQ8 / 256
> +        self.camName = metadata['Exif.Image.Model'].value
> +        self.blacklevel = int(metadata[f'Exif.{subimage}.BlackLevel'].value[0])
> +        self.blacklevel_16 = self.blacklevel << (16 - self.sigbits)
> +
> +        # Channel order depending on bayer pattern
> +        # The key is the order given by exif, where 0 is R, 1 is G, and 2 is B
> +        # The value is the index where the color can be found, where the first
> +        # is R, then G, then G, then B.
> +        bayer_case = {
> +            '0 1 1 2': (lt.Color.R, lt.Color.GR, lt.Color.GB, lt.Color.B),
> +            '1 2 0 1': (lt.Color.GB, lt.Color.R, lt.Color.B, lt.Color.GR),
> +            '2 1 1 0': (lt.Color.B, lt.Color.GB, lt.Color.GR, lt.Color.R),
> +            '1 0 2 1': (lt.Color.GR, lt.Color.R, lt.Color.B, lt.Color.GB)
> +        }
> +        # Note: This needs to be in IFD0
> +        cfa_pattern = metadata[f'Exif.{subimage}.CFAPattern'].value
> +        self.order = bayer_case[cfa_pattern]
> +
> +    def _read_image_dng(self):
> +        raw_im = raw.imread(str(self.path))
> +        raw_data = raw_im.raw_image
> +        shift = 16 - self.sigbits
> +        c0 = np.left_shift(raw_data[0::2, 0::2].astype(np.int64), shift)
> +        c1 = np.left_shift(raw_data[0::2, 1::2].astype(np.int64), shift)
> +        c2 = np.left_shift(raw_data[1::2, 0::2].astype(np.int64), shift)
> +        c3 = np.left_shift(raw_data[1::2, 1::2].astype(np.int64), shift)
> +        self.channels = [c0, c1, c2, c3]
> +        # Reorder the channels into R, GR, GB, B
> +        self.channels = [self.channels[i] for i in self.order]
> +
> +    # \todo Move this to macbeth.py
> +    def get_patches(self, cen_coords, size=16):
> +        saturated = False
> +
> +        # Obtain channel widths and heights
> +        ch_w, ch_h = self.w, self.h
> +        cen_coords = list(np.array((cen_coords[0])).astype(np.int32))
> +        self.cen_coords = cen_coords
> +
> +        # Squares are ordered by stacking macbeth chart columns from left to
> +        # right. Some useful patch indices:
> +        #     white = 3
> +        #     black = 23
> +        #     'reds' = 9, 10
> +        #     'blues' = 2, 5, 8, 20, 22
> +        #     'greens' = 6, 12, 17
> +        #     greyscale = 3, 7, 11, 15, 19, 23
> +        all_patches = []
> +        for ch in self.channels:
> +            ch_patches = []
> +            for cen in cen_coords:
> +                # Macbeth centre is placed at top left of central 2x2 patch to
> +                # account for rounding. Patch pixels are sorted by pixel
> +                # brightness so spatial information is lost.
> +                patch = ch[cen[1] - 7:cen[1] + 9, cen[0] - 7:cen[0] + 9].flatten()
> +                patch.sort()
> +                if patch[-5] == (2**self.sigbits - 1) * 2**(16 - self.sigbits):
> +                    saturated = True
> +                ch_patches.append(patch)
> +
> +            all_patches.append(ch_patches)
> +
> +        self.patches = all_patches
> +
> +        return not saturated
> diff --git a/utils/tuning/libtuning/libtuning.py b/utils/tuning/libtuning/libtuning.py
> new file mode 100644
> index 00000000..469e6940
> --- /dev/null
> +++ b/utils/tuning/libtuning/libtuning.py
> @@ -0,0 +1,208 @@
> +# SPDX-License-Identifier: GPL-2.0-or-later
> +#
> +# Copyright (C) 2022, Paul Elder <paul.elder@ideasonboard.com>
> +#
> +# libtuning.py - An infrastructure for camera tuning tools
> +
> +import argparse
> +
> +import libtuning as lt
> +import libtuning.utils as utils
> +from libtuning.utils import eprint
> +
> +from enum import Enum, IntEnum
> +
> +
> +class Color(IntEnum):
> +    R = 0
> +    GR = 1
> +    GB = 2
> +    B = 3
> +
> +
> +class Debug(Enum):
> +    Plot = 1
> +
> +
> +# @brief What to do with the leftover pixels after dividing them into ALSC
> +#        sectors, when the division gradient is uniform
> +# @var Float Force floating point division so all sectors divide equally
> +# @var DistributeFront Divide the remainder equally (until running out,
> +#      obviously) into the existing sectors, starting from the front
> +# @var DistributeBack Same as DistributeFront but starting from the back
> +class Remainder(Enum):
> +    Float = 0
> +    DistributeFront = 1
> +    DistributeBack = 2
> +
> +
> +# @brief A helper class to contain a default value for a module configuration
> +# parameter
> +class Param(object):
> +    # @var Required The value contained in this instance is irrelevant, and the
> +    #      value must be provided by the tuning configuration file.
> +    # @var Optional If the value is not provided by the tuning configuration
> +    #      file, then the value contained in this instance will be used instead.
> +    # @var Hardcode The value contained in this instance will always be used
> +    class Mode(Enum):
> +        Required = 0
> +        Optional = 1
> +        Hardcode = 2
> +
> +    # @param name Name of the parameter. Shall match the name used in the
> +    #        configuration file for the parameter
> +    # @param required Whether or not a value is required in the config
> +    #        parameter of get_value()
> +    # @param val Default value (only relevant if mode is Optional)
> +    def __init__(self, name: str, required: Mode, val=None):
> +        self.name = name
> +        self.__required = required
> +        self.val = val
> +
> +    def get_value(self, config: dict):
> +        if self.__required is self.Mode.Hardcode:
> +            return self.val
> +
> +        if self.__required is self.Mode.Required and self.name not in config:
> +            raise ValueError(f'Parameter {self.name} is required but not provided in the configuration')
> +
> +        return config[self.name] if self.name in config else self.val
> +
> +    @property
> +    def required(self):
> +        return self.__required is self.Mode.Required
> +
> +    # @brief Used by libtuning to auto-generate help information for the tuning
> +    #        script on the available parameters for the configuration file
> +    # \todo Implement this
> +    @property
> +    def info(self):
> +        raise NotImplementedError
> +
> +
> +class Tuner(object):
> +
> +    # External functions
> +
> +    def __init__(self, platform_name):
> +        self.name = platform_name
> +        self.modules = []
> +        self.parser = None
> +        self.generator = None
> +        self.output_order = []
> +        self.config = {}
> +        self.output = {}
> +
> +    def add(self, module):
> +        self.modules.append(module)
> +
> +    def set_input_parser(self, parser):
> +        self.parser = parser
> +
> +    def set_output_formatter(self, output):
> +        self.generator = output
> +
> +    def set_output_order(self, modules):
> +        self.output_order = modules
> +
> +    # @brief Convert classes in self.output_order to the instances in self.modules
> +    def _prepare_output_order(self):
> +        output_order = self.output_order
> +        self.output_order = []
> +        for module_type in output_order:
> +            modules = [module for module in self.modules if module.type == module_type.type]
> +            if len(modules) > 1:
> +                eprint(f'Multiple modules found for module type "{module_type.type}"')
> +                return False
> +            if len(modules) < 1:
> +                eprint(f'No module found for module type "{module_type.type}"')
> +                return False
> +            self.output_order.append(modules[0])
> +
> +        return True
> +
> +    # \todo Validate parser and generator at Tuner construction time?
> +    def _validate_settings(self):
> +        if self.parser is None:
> +            eprint('Missing parser')
> +            return False
> +
> +        if self.generator is None:
> +            eprint('Missing generator')
> +            return False
> +
> +        if len(self.modules) == 0:
> +            eprint('No modules added')
> +            return False
> +
> +        if len(self.output_order) != len(self.modules):
> +            eprint('Number of outputs does not match number of modules')
> +            return False
> +
> +        return True
> +
> +    def _process_args(self, argv, platform_name):
> +        parser = argparse.ArgumentParser(description=f'Camera Tuning for {platform_name}')
> +        parser.add_argument('-i', '--input', type=str, required=True,
> +                            help='''Directory containing calibration images (required).
> +                                    Images for ALSC must be named "alsc_{Color Temperature}k_1[u].dng",
> +                                    and all other images must be named "{Color Temperature}k_{Lux Level}l.dng"''')
> +        parser.add_argument('-o', '--output', type=str, required=True,
> +                            help='Output file (required)')
> +        # It is not our duty to scan all modules to figure out their default
> +        # options, so simply return an empty configuration if none is provided.
> +        parser.add_argument('-c', '--config', type=str, default='',
> +                            help='Config file (optional)')
> +        # \todo Check if we really need this or if stderr is good enough, or if
> +        # we want a better logging infrastructure with log levels
> +        parser.add_argument('-l', '--log', type=str, default=None,
> +                            help='Output log file (optional)')
> +        return parser.parse_args(argv[1:])
> +
> +    def run(self, argv):
> +        args = self._process_args(argv, self.name)
> +        if args is None:
> +            return -1
> +
> +        if not self._validate_settings():
> +            return -1
> +
> +        if not self._prepare_output_order():
> +            return -1
> +
> +        if len(args.config) > 0:
> +            self.config, disable = self.parser.parse(args.config, self.modules)
> +        else:
> +            self.config = {'general': {}}
> +            disable = []
> +
> +        # Remove disabled modules
> +        for module in disable:
> +            if module in self.modules:
> +                self.modules.remove(module)
> +
> +        for module in self.modules:
> +            if not module.validate_config(self.config):
> +                eprint(f'Config is invalid for module {module.type}')
> +                return -1
> +
> +        has_lsc = any(isinstance(m, lt.modules.lsc.LSC) for m in self.modules)
> +        # Only one LSC module allowed
> +        has_only_lsc = has_lsc and len(self.modules) == 1

Nothing to address now, but I can see this evolving into checking that
there's only one module of each type.

Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>

> +
> +        images = utils.load_images(args.input, self.config, not has_only_lsc, has_lsc)
> +        if images is None or len(images) == 0:
> +            eprint('No images were found, or could be loaded')
> +            return -1
> +
> +        # Do the tuning
> +        for module in self.modules:
> +            out = module.process(self.config, images, self.output)
> +            if out is None:
> +                eprint(f'Module {module.name} failed to process, aborting')
> +                break
> +            self.output[module] = out
> +
> +        self.generator.write(args.output, self.output, self.output_order)
> +
> +        return 0
> diff --git a/utils/tuning/libtuning/macbeth.py b/utils/tuning/libtuning/macbeth.py
> new file mode 100644
> index 00000000..5faddf66
> --- /dev/null
> +++ b/utils/tuning/libtuning/macbeth.py
> @@ -0,0 +1,516 @@
> +# SPDX-License-Identifier: BSD-2-Clause
> +#
> +# Copyright (C) 2019, Raspberry Pi Ltd
> +#
> +# macbeth.py - Locate and extract Macbeth charts from images
> +# (Copied from: ctt_macbeth_locator.py)
> +
> +# \todo Add debugging
> +
> +import cv2
> +import os
> +from pathlib import Path
> +import numpy as np
> +import warnings
> +
> +from sklearn import cluster as cluster
> +
> +from libtuning.image import Image
> +
> +
> +# Reshape image to a fixed width without distortion; returns the image and
> +# the scale factor
> +def reshape(img, width):
> +    factor = width / img.shape[0]
> +    return cv2.resize(img, None, fx=factor, fy=factor), factor
> +
> +
> +# Correlation function to quantify match
> +def correlate(im1, im2):
> +    f1 = im1.flatten()
> +    f2 = im2.flatten()
> +    cor = np.corrcoef(f1, f2)
> +    return cor[0][1]
> +
> +
> +# @brief Compute coordinates of macbeth chart vertices and square centres
> +# @return (max_cor, best_map_col_norm, fit_coords, success)
> +#
> +# Also returns an error/success message for debugging purposes. Additionally,
> +# it scores the match with a confidence value.
> +#
> +#    Brief explanation of the macbeth chart locating algorithm:
> +#    - Find rectangles within image
> +#    - Take rectangles within percentage offset of median perimeter. The
> +#        assumption is that these will be the macbeth squares
> +#    - For each potential square, find the 24 possible macbeth centre locations
> +#        that would produce a square in that location
> +#    - Find clusters of potential macbeth chart centres to find the potential
> +#        macbeth centres with the most votes, i.e. the most likely ones
> +#    - For each potential macbeth centre, use the centres of the squares that
> +#        voted for it to find macbeth chart corners
> +#    - For each set of corners, transform the possible match into normalised
> +#        space and correlate with a reference chart to evaluate the match
> +#    - Select the highest correlation as the macbeth chart match, returning the
> +#        correlation as the confidence score
> +#
> +# \todo Clean this up
> +def get_macbeth_chart(img, ref_data):
> +    ref, ref_w, ref_h, ref_corns = ref_data
> +
> +    # The code will raise and catch a MacbethError in case of a problem, trying
> +    # to give some likely reasons why the problem occured, hence the try/except
> +    try:
> +        # Obtain image, convert to grayscale and normalise
> +        src = img
> +        src, factor = reshape(src, 200)
> +        original = src.copy()
> +        a = 125 / np.average(src)
> +        src_norm = cv2.convertScaleAbs(src, alpha=a, beta=0)
> +
> +        # This code checks if there are separate colour channels. In the past the
> +        # macbeth locator ran on jpgs and this makes it robust to different
> +        # filetypes. Note that running it on a jpg has 4x the pixels of the
> +        # average bayer channel so coordinates must be doubled.
> +
> +        # This is best done in image.py in the get_patches method. The
> +        # coordinates and image width, height must be divided by two if the
> +        # macbeth locator has been run on a demosaicked image.
> +        if len(src_norm.shape) == 3:
> +            src_bw = cv2.cvtColor(src_norm, cv2.COLOR_BGR2GRAY)
> +        else:
> +            src_bw = src_norm
> +        original_bw = src_bw.copy()
> +
> +        # Obtain image edges
> +        sigma = 2
> +        src_bw = cv2.GaussianBlur(src_bw, (0, 0), sigma)
> +        t1, t2 = 50, 100
> +        edges = cv2.Canny(src_bw, t1, t2)
> +
> +        # Dilate edges to prevent self-intersections in contours
> +        k_size = 2
> +        kernel = np.ones((k_size, k_size))
> +        its = 1
> +        edges = cv2.dilate(edges, kernel, iterations=its)
> +
> +        # Find contours in image
> +        conts, _ = cv2.findContours(edges, cv2.RETR_TREE,
> +                                    cv2.CHAIN_APPROX_NONE)
> +        if len(conts) == 0:
> +            raise MacbethError(
> +                '\nWARNING: No macbeth chart found!'
> +                '\nNo contours found in image\n'
> +                'Possible problems:\n'
> +                '- Macbeth chart is too dark or bright\n'
> +                '- Macbeth chart is occluded\n'
> +            )
> +
> +        # Find quadrilateral contours
> +        epsilon = 0.07
> +        conts_per = []
> +        for i in range(len(conts)):
> +            per = cv2.arcLength(conts[i], True)
> +            poly = cv2.approxPolyDP(conts[i], epsilon * per, True)
> +            if len(poly) == 4 and cv2.isContourConvex(poly):
> +                conts_per.append((poly, per))
> +
> +        if len(conts_per) == 0:
> +            raise MacbethError(
> +                '\nWARNING: No macbeth chart found!'
> +                '\nNo quadrilateral contours found'
> +                '\nPossible problems:\n'
> +                '- Macbeth chart is too dark or bright\n'
> +                '- Macbeth chart is occluded\n'
> +                '- Macbeth chart is out of camera plane\n'
> +            )
> +
> +        # Sort contours by perimeter and get perimeters within percent of median
> +        conts_per = sorted(conts_per, key=lambda x: x[1])
> +        med_per = conts_per[int(len(conts_per) / 2)][1]
> +        side = med_per / 4
> +        perc = 0.1
> +        med_low, med_high = med_per * (1 - perc), med_per * (1 + perc)
> +        squares = []
> +        for i in conts_per:
> +            if med_low <= i[1] and med_high >= i[1]:
> +                squares.append(i[0])
> +
> +        # Obtain coordinates of normalised macbeth and squares
> +        square_verts, mac_norm = get_square_verts(0.06)
> +        # For each square guess, find 24 possible macbeth chart centres
> +        mac_mids = []
> +        squares_raw = []
> +        for i in range(len(squares)):
> +            square = squares[i]
> +            squares_raw.append(square)
> +
> +            # Convert quads to rotated rectangles. This is required as the
> +            # 'squares' are usually quite irregular quadrilaterals, so
> +            # performing a transform would result in exaggerated warping and
> +            # inaccurate macbeth chart centre placement
> +            rect = cv2.minAreaRect(square)
> +            square = cv2.boxPoints(rect).astype(np.float32)
> +
> +            # Reorder vertices to prevent 'hourglass shape'
> +            square = sorted(square, key=lambda x: x[0])
> +            square_1 = sorted(square[:2], key=lambda x: x[1])
> +            square_2 = sorted(square[2:], key=lambda x: -x[1])
> +            square = np.array(np.concatenate((square_1, square_2)), np.float32)
> +            square = np.reshape(square, (4, 2)).astype(np.float32)
> +            squares[i] = square
> +
> +            # Find 24 possible macbeth chart centres by transforming normalised
> +            # macbeth square vertices onto candidate square vertices found in image
> +            for j in range(len(square_verts)):
> +                verts = square_verts[j]
> +                p_mat = cv2.getPerspectiveTransform(verts, square)
> +                mac_guess = cv2.perspectiveTransform(mac_norm, p_mat)
> +                mac_guess = np.round(mac_guess).astype(np.int32)
> +
> +                mac_mid = np.mean(mac_guess, axis=1)
> +                mac_mids.append([mac_mid, (i, j)])
> +
> +        if len(mac_mids) == 0:
> +            raise MacbethError(
> +                '\nWARNING: No macbeth chart found!'
> +                '\nNo possible macbeth charts found within image'
> +                '\nPossible problems:\n'
> +                '- Part of the macbeth chart is outside the image\n'
> +                '- Quadrilaterals in image background\n'
> +            )
> +
> +        # Reshape data
> +        for i in range(len(mac_mids)):
> +            mac_mids[i][0] = mac_mids[i][0][0]
> +
> +        # Find where midpoints cluster to identify most likely macbeth centres
> +        clustering = cluster.AgglomerativeClustering(
> +            n_clusters=None,
> +            compute_full_tree=True,
> +            distance_threshold=side * 2
> +        )
> +        mac_mids_list = [x[0] for x in mac_mids]
> +
> +        if len(mac_mids_list) == 1:
> +            # Special case of only one valid centre found (probably not needed)
> +            clus_list = []
> +            clus_list.append([mac_mids, len(mac_mids)])
> +
> +        else:
> +            clustering.fit(mac_mids_list)
> +
> +            # Create list of all clusters
> +            clus_list = []
> +            if clustering.n_clusters_ > 1:
> +                for i in range(clustering.labels_.max() + 1):
> +                    indices = [j for j, x in enumerate(clustering.labels_) if x == i]
> +                    clus = []
> +                    for index in indices:
> +                        clus.append(mac_mids[index])
> +                    clus_list.append([clus, len(clus)])
> +                clus_list.sort(key=lambda x: -x[1])
> +
> +            elif clustering.n_clusters_ == 1:
> +                # Special case of only one cluster found
> +                clus_list.append([mac_mids, len(mac_mids)])
> +            else:
> +                raise MacbethError(
> +                    '\nWARNING: No macbeth chart found!'
> +                    '\nNo clusters found'
> +                    '\nPossible problems:\n'
> +                    '- NA\n'
> +                )
> +
> +        # Keep only clusters with enough votes
> +        clus_len_max = clus_list[0][1]
> +        clus_tol = 0.7
> +        for i in range(len(clus_list)):
> +            if clus_list[i][1] < clus_len_max * clus_tol:
> +                clus_list = clus_list[:i]
> +                break
> +            cent = np.mean(clus_list[i][0], axis=0)[0]
> +            clus_list[i].append(cent)
> +
> +        # Get centres of each normalised square
> +        reference = get_square_centres(0.06)
> +
> +        # For each possible macbeth chart, transform image into
> +        # normalised space and find correlation with reference
> +        max_cor = 0
> +        best_map = None
> +        best_fit = None
> +        best_cen_fit = None
> +        best_ref_mat = None
> +
> +        for clus in clus_list:
> +            clus = clus[0]
> +            sq_cents = []
> +            ref_cents = []
> +            i_list = [p[1][0] for p in clus]
> +            for point in clus:
> +                i, j = point[1]
> +
> +                # Remove any square that voted for two different points within
> +                # the same cluster. This causes the same point in the image to be
> +                # mapped to two different reference square centres, resulting in
> +                # a very distorted perspective transform since cv2.findHomography
> +                # simply minimises error.
> +                # This phenomenon is not particularly likely to occur due to the
> +                # enforced distance threshold in the clustering fit but it is
> +                # best to keep this in just in case.
> +                if i_list.count(i) == 1:
> +                    square = squares_raw[i]
> +                    sq_cent = np.mean(square, axis=0)
> +                    ref_cent = reference[j]
> +                    sq_cents.append(sq_cent)
> +                    ref_cents.append(ref_cent)
> +
> +                    # At least four squares need to have voted for a centre in
> +                    # order for a transform to be found
> +            if len(sq_cents) < 4:
> +                raise MacbethError(
> +                    '\nWARNING: No macbeth chart found!'
> +                    '\nNot enough squares found'
> +                    '\nPossible problems:\n'
> +                    '- Macbeth chart is occluded\n'
> +                    '- Macbeth chart is too dark or bright\n'
> +                )
> +
> +            ref_cents = np.array(ref_cents)
> +            sq_cents = np.array(sq_cents)
> +
> +            # Find best fit transform from normalised centres to image
> +            h_mat, mask = cv2.findHomography(ref_cents, sq_cents)
> +            if h_mat is None:
> +                raise MacbethError(
> +                    '\nERROR\n'
> +                )
> +
> +            # Transform normalised corners and centres into image space
> +            mac_fit = cv2.perspectiveTransform(mac_norm, h_mat)
> +            mac_cen_fit = cv2.perspectiveTransform(np.array([reference]), h_mat)
> +
> +            # Transform located corners into reference space
> +            ref_mat = cv2.getPerspectiveTransform(
> +                mac_fit,
> +                np.array([ref_corns])
> +            )
> +            map_to_ref = cv2.warpPerspective(
> +                original_bw, ref_mat,
> +                (ref_w, ref_h)
> +            )
> +
> +            # Normalise brightness
> +            a = 125 / np.average(map_to_ref)
> +            map_to_ref = cv2.convertScaleAbs(map_to_ref, alpha=a, beta=0)
> +
> +            # Find correlation with bw reference macbeth
> +            cor = correlate(map_to_ref, ref)
> +
> +            # Keep only if best correlation
> +            if cor > max_cor:
> +                max_cor = cor
> +                best_map = map_to_ref
> +                best_fit = mac_fit
> +                best_cen_fit = mac_cen_fit
> +                best_ref_mat = ref_mat
> +
> +            # Rotate macbeth by pi and recorrelate in case macbeth chart is
> +            # upside-down
> +            mac_fit_inv = np.array(
> +                ([[mac_fit[0][2], mac_fit[0][3],
> +                  mac_fit[0][0], mac_fit[0][1]]])
> +            )
> +            mac_cen_fit_inv = np.flip(mac_cen_fit, axis=1)
> +            ref_mat = cv2.getPerspectiveTransform(
> +                mac_fit_inv,
> +                np.array([ref_corns])
> +            )
> +            map_to_ref = cv2.warpPerspective(
> +                original_bw, ref_mat,
> +                (ref_w, ref_h)
> +            )
> +            a = 125 / np.average(map_to_ref)
> +            map_to_ref = cv2.convertScaleAbs(map_to_ref, alpha=a, beta=0)
> +            cor = correlate(map_to_ref, ref)
> +            if cor > max_cor:
> +                max_cor = cor
> +                best_map = map_to_ref
> +                best_fit = mac_fit_inv
> +                best_cen_fit = mac_cen_fit_inv
> +                best_ref_mat = ref_mat
> +
> +        # Check best match is above threshold
> +        cor_thresh = 0.6
> +        if max_cor < cor_thresh:
> +            raise MacbethError(
> +                '\nWARNING: Correlation too low'
> +                '\nPossible problems:\n'
> +                '- Bad lighting conditions\n'
> +                '- Macbeth chart is occluded\n'
> +                '- Background is too noisy\n'
> +                '- Macbeth chart is out of camera plane\n'
> +            )
> +
> +        # Represent coloured macbeth in reference space
> +        best_map_col = cv2.warpPerspective(
> +            original, best_ref_mat, (ref_w, ref_h)
> +        )
> +        best_map_col = cv2.resize(
> +            best_map_col, None, fx=4, fy=4
> +        )
> +        a = 125 / np.average(best_map_col)
> +        best_map_col_norm = cv2.convertScaleAbs(
> +            best_map_col, alpha=a, beta=0
> +        )
> +
> +        # Rescale coordinates to original image size
> +        fit_coords = (best_fit / factor, best_cen_fit / factor)
> +
> +        return (max_cor, best_map_col_norm, fit_coords, True)
> +
> +    # Catch macbeth errors and continue with code
> +    except MacbethError as error:
> +        eprint(error)
> +        return (0, None, None, False)
> +
> +
> +def find_macbeth(img, mac_config):
> +    small_chart = mac_config['small']
> +    show = mac_config['show']
> +
> +    # Catch the warnings
> +    warnings.simplefilter("ignore")
> +    warnings.warn("runtime", RuntimeWarning)
> +
> +    # Reference macbeth chart is created that will be correlated with the
> +    # located macbeth chart guess to produce a confidence value for the match.
> +    script_dir = Path(os.path.realpath(os.path.dirname(__file__)))
> +    macbeth_ref_path = script_dir.joinpath('macbeth_ref.pgm')
> +    ref = cv2.imread(str(macbeth_ref_path), flags=cv2.IMREAD_GRAYSCALE)
> +    ref_w = 120
> +    ref_h = 80
> +    rc1 = (0, 0)
> +    rc2 = (0, ref_h)
> +    rc3 = (ref_w, ref_h)
> +    rc4 = (ref_w, 0)
> +    ref_corns = np.array((rc1, rc2, rc3, rc4), np.float32)
> +    ref_data = (ref, ref_w, ref_h, ref_corns)
> +
> +    # Locate macbeth chart
> +    cor, mac, coords, ret = get_macbeth_chart(img, ref_data)
> +
> +    # Following bits of code try to fix common problems with simple techniques.
> +    # If now or at any point the best correlation is above 0.75, then
> +    # nothing more is tried as this is a high enough confidence to ensure
> +    # reliable macbeth square centre placement.
> +
> +    for brightness in [2, 4]:
> +        if cor >= 0.75:
> +            break
> +        img_br = cv2.convertScaleAbs(img, alpha=brightness, beta=0)
> +        cor_b, mac_b, coords_b, ret_b = get_macbeth_chart(img_br, ref_data)
> +        if cor_b > cor:
> +            cor, mac, coords, ret = cor_b, mac_b, coords_b, ret_b
> +
> +    # In case macbeth chart is too small, take a selection of the image and
> +    # attempt to locate macbeth chart within that. The scale increment is
> +    # root 2
> +
> +    # These variables will be used to transform the found coordinates at
> +    # smaller scales back into the original. If ii is still -1 after this
> +    # section that means it was not successful
> +    ii = -1
> +    w_best = 0
> +    h_best = 0
> +    d_best = 100
> +
> +    # d_best records the scale of the best match. Macbeth charts are only looked
> +    # for at one scale increment smaller than the current best match in order to avoid
> +    # unnecessarily searching for macbeth charts at small scales.
> +    # If a macbeth chart has already been found then set d_best to 0
> +    if cor != 0:
> +        d_best = 0
> +
> +    for index, pair in enumerate([{'sel': 2 / 3, 'inc': 1 / 6},
> +                                  {'sel': 1 / 2, 'inc': 1 / 8},
> +                                  {'sel': 1 / 3, 'inc': 1 / 12},
> +                                  {'sel': 1 / 4, 'inc': 1 / 16}]):
> +        if cor >= 0.75:
> +            break
> +
> +        # Check if we need to check macbeth charts at even smaller scales. This
> +        # slows the code down significantly and has therefore been omitted by
> +        # default, however it is not unusably slow so it might be useful if the
> +        # macbeth chart is too small to be picked up by the current
> +        # subselections. Use this for macbeth charts with side lengths around
> +        # 1/5 of the image dimensions (and smaller...?). It is, however,
> +        # recommended that macbeth charts take up as large a proportion of the
> +        # image as possible.
> +        if index >= 2 and (not small_chart or d_best <= index - 1):
> +            break
> +
> +        w, h = list(img.shape[:2])
> +        # Set dimensions of the subselection and the step along each axis
> +        # between selections
> +        w_sel = int(w * pair['sel'])
> +        h_sel = int(h * pair['sel'])
> +        w_inc = int(w * pair['inc'])
> +        h_inc = int(h * pair['inc'])
> +
> +        loop = int(round(((1 - pair['sel']) / pair['inc']) + 1))
> +        # For each subselection, look for a macbeth chart
> +        for i in range(loop):
> +            for j in range(loop):
> +                w_s, h_s = i * w_inc, j * h_inc
> +                img_sel = img[w_s:w_s + w_sel, h_s:h_s + h_sel]
> +                cor_ij, mac_ij, coords_ij, ret_ij = get_macbeth_chart(img_sel, ref_data)
> +
> +                # If the correlation is better than the best then record the
> +                # scale and current subselection at which macbeth chart was
> +                # found. Also record the coordinates, macbeth chart and message.
> +                if cor_ij > cor:
> +                    cor = cor_ij
> +                    mac, coords, ret = mac_ij, coords_ij, ret_ij
> +                    ii, jj = i, j
> +                    w_best, h_best = w_inc, h_inc
> +                    d_best = index + 1
> +
> +    # Transform coordinates from subselection to original image
> +    if ii != -1:
> +        for a in range(len(coords)):
> +            for b in range(len(coords[a][0])):
> +                coords[a][0][b][1] += ii * w_best
> +                coords[a][0][b][0] += jj * h_best
> +
> +    if not ret:
> +        return None
> +
> +    coords_fit = coords
> +    if cor < 0.75:
> +        eprint(f'Warning: Low confidence {cor:.3f} for macbeth chart in {img.path.name}')
> +
> +    if show:
> +        draw_macbeth_results(img, coords_fit)
> +
> +    return coords_fit
> +
> +
> +def locate_macbeth(image: Image, config: dict):
> +    # Find macbeth centres
> +    av_chan = (np.mean(np.array(image.channels), axis=0) / (2**16))
> +    av_val = np.mean(av_chan)
> +    if av_val < image.blacklevel_16 / (2**16) + 1 / 64:
> +        eprint(f'Image {image.path.name} too dark')
> +        return None
> +
> +    macbeth = find_macbeth(av_chan, config['general']['macbeth'])
> +
> +    if macbeth is None:
> +        eprint(f'No macbeth chart found in {image.path.name}')
> +        return None
> +
> +    mac_cen_coords = macbeth[1]
> +    if not image.get_patches(mac_cen_coords):
> +        eprint(f'Macbeth patches have saturated in {image.path.name}')
> +        return None
> +
> +    return macbeth
> diff --git a/utils/tuning/libtuning/macbeth_ref.pgm b/utils/tuning/libtuning/macbeth_ref.pgm
> new file mode 100644
> index 00000000..37897140
> --- /dev/null
> +++ b/utils/tuning/libtuning/macbeth_ref.pgm
> @@ -0,0 +1,6 @@
> +# SPDX-License-Identifier: BSD-2-Clause
> +P5
> +# Reference macbeth chart
> +120 80
> +255
> +      !#!" #!"&&$#$#'"%&#+2///..../.........-()))))))))))))))))))(((-,*)'(&)#($%(%"###""!%""&"&&!$" #!$ !"! $&**"  !#5.,%+,-5"0<HBAA54" %##((()*+,---.........+*)))))))))))))))-.,,--+))('((''('%'%##"!""!"!""""#!     !  %?/v??z:????L??????c?,!#""%%''')**+)-../..../.-*)))))))))))))**,,)**'(''&'((&&%%##$! !!!! ! !     !   5*"-)&7(1.75Rnge`\`$ ""!"%%%'')())++--/---,-..,-.,++**))))())*)*)''%'%&%&'&%%"""""               !   !!$&$$&##(+*,,/10122126545./66402006486869650*.1.***)*+)()&((('('##)('&%%&%$$$#$%$%$ (((*))('((('('(&%V0;>>;@@>@AAAACBCB=&<?????????????????<5x???????????????|64RYVTSRRRMMNLKJJLH+&0gijgdeffmmnpnkji`#3????????????????bY! 3FHHIIIHIJIIJHIII@#??????????????????=7}????????????????:5Wcbcbdcb`^^`^^_^Y,'6????????????????r'<????????????????l%2FHHIIHJJJJJJIIJI?%;?????????????????>7|????????????????;8Xfeeegeccb`^aba]Z+)<????????????????r)>????????????????q#3GHIIIIJIIJJIHIJI@&5?????????????????=8~????????????????;8Zgghggedbdcbda^\Z+(;????????????????y)9????????????????z"3GIIJJJJJKJJJJJJJ@'4?????????????????>9|????????????????=8Zhighgeeeedeca__[/)B????????????????v&:????????????????|#3GJJIIJKKKJJJKKJK@&6?????????????????>9~????????????????<8Yghegggffihccab^\/*C????????????????z'9?????????????????$  6IKJJMMMKMKKMKKMLC&2?????????????????@9?????????????????<9Yghhhhijiegdcebc^0)G????????????????(7?????????????????%   6JLMMNMMKMMNMMMMMD&2?????????????????@:~????????????????=9Xfghhjiigdgddedc`1)M????????????????}(:??????????????¾?&  "8LNOONNOMONNMMNOND'3?????????????????@;????????????????=:Ziiigheegegegggdc1,Q????????????????~)8?????????????????%# "9NNNPPPQOOOOONNOOD'0??????????????????;????????????????=;[iigeeegghgdedgea0-P????????????????(8???????????????Ý' "#$:NNOQPPRPQPOOPQPPD*1?????????????????A;?????????????????;:Yfghgghgghghhdggc3.\????????????????~);???????????????¤(&%%;OQQQRSSRPQQQQSQQF)3?????????????????B<?????????????????=:Wfhghhhihggghfhee4/f?????????????????*:???????????????ä&%%%?RSSSSSTTTTSSSTTRE)5?????????????????B=?????????????????@:Ygiihhiiiihihiiif72p????????????????}(9???????????????Ʃ'#%&?TUTTTUUQSTTTTTVSF*3?????????????????F>?????????????????A;[ghjiihiiiihihije50r?????????????????)6???????????????ƫ& &#%?SVVVUUUUUTUUVVUUG*5?????????????????F=?????????????????A;Yhijiiijjiiiiijje81t????????????????~)5???????????????ư' '$$=OQRRQQPRSRSSSSSSG+6?????????????????D@??????????????????;Wefgggggfffgeeefc41x????????????????{*5?????????????????( &&&'++++,,*-,-00-0100*-SUX\]]`_ffgiooopo=;X\bedbadbca`]\]ZZ;;<::8:;9983433110/-,...1//12410/..--+)"",---,-./,,.-/-0-( &&%+/0103322011223233)(34534767::;;==:=B9;BFGEEGIKJKIJGIJCD=<:76566554111/0/1.*+00233300/00//..,+*#")(*)++,++))*++**'!!&$*w????¼???????????1-_addc`ceccdccedbb?A|????????????????B>=>?@@?====;<:;:<:11r?????????????????+.?????????????????( !'%*z???????????????ɠ42gjmllklomooonpopmHG?????????????????D>AEDEFEECEECCCDDEC46????????????????׿0:???????????????Ѿ,!!&&,|???????????????ʡ61inknnoopoppoqqrqoEE?????????????????FACGFFFFFFDFDDDDDDC57??????????????????09?????????????????+!"%%-~???????????????ʡ42inopppppoqqqrrsrnAB?????????????????C?DGGGGFFFFDFFDDEDC48??????????????????1;?????????????????+!!"#*|???????????????ʡ62imoppppqqqqrtrqtrGD?????????????????H?CGGGGGGGGFFFFFFDB38??????????????????1<???????????????Խ,  !)}???????????????ˢ63mooppqqqqqqrrtvtoDH?????????????????JACHHGGHGGFFFDDGGFD29??????????????????3>???????????????׽, $){???????????????ˢ53jpppqprqrrrttuvuo>H?????????????????JAFHHHHHGGHGGFGGFFE28??????????????????3:???????????????ڽ- 
"*{???????????????̣53loqpqsqrrrtrutsvrAH?????????????????HCGHIHHHHHHGFGHGGGD5;??????????????????28?????????????????, +}???????????????ʡ52mqoqpqrttttttuurpFI?????????????????OCEHHIHHHHGHGGFFIGF8<??????????????????48???????????????ۿ, (|???????????????ʢ41krqpqqqrrtrtuvtuoEH?????????????????PBHHIIIHIIHIHGHGHHE7<??????????????????58?????????????????*  (z???????????????ʡ63kpqprqqstttutrvvoFO?????????????????LEHHIIHIHHHIGHGIHGF4=??????????????????5<?????????????????*  'z???????????????ȡ62lppqrqrrrtttuttvpAG?????????????????MGHIIIIHIIIHHIIJHHG4<??????????????????4<?????????????????+ !){???????????????Ƞ62jopqqqqqrtttutttrEH?????????????????OHFIIIIIJIIIIHIHIHI7>??????????????????5;?????????????????, !)z???????????????Ɵ53lppqqrqrtttuuuutsFI?????????????????RHGJIJHJKJJJIIIIIIH9>??????????????????5;?????????????????+  !({???????????????Ŝ41joppprqrrrutttvvrIH?????????????????THCJJJJJIJIJJIJJJIH7=??????????????????5;?????????????????+  (u?????????????????65gjlmmmnoopnpprpqoIH?????????????????OIBIJJJIJJJJIIIHHHG89??????????????????29???????????????ʾ'  "&,-*)-01/,0/12102-+04448789<>>??AFAD@DBCIJNRWTSUXT[WUQUOKFEBBABA?>>=<<;;67942:<<<>9999864565363&(13335422./1/-+..+  !"&$$""$"&$%'()(''*+-0124688:<>>??A>?EBCHKOLJLNOSQOXQQVMLACGHGHIGFHGDCCBB@??7432233210111.,++,++%(++)*(''%%%$$#%&$#  ")0/001120024455520+-U]`addcdhefeekecYGFJRXYYVWWZWVXXVZTOBF}????????????????K7Ybccddfeg`^]^]\[Z[*)OTTPPQPOKOLLJJLIK   !1;:9:<<===;=???A@9*/?????????????????FJmxyxwyzzzxyzzz{zxLO?????????????????]=??????????????????.-???????????????y# !!2><=;==>=<<>@@@@A9-0?????????????????IKnz||{|{||{}}~}}{zLO?????????????????]>??????????????????..????????????????~%  $2==;<>>?===>@A@AB;+1?????????????????JJo{|y{||}{||}}}}}yMT?????????????????_>??????????????????-.????????????????}#  %2<=;=<@?>==>?A@AA9+3?????????????????FMlz{{y|}}}}||}|}}{MT?????????????????d>??????????????????-,????????????????# %1<<<;==<<=>?A?@AA:,3?????????????????INo{{y{||||}|}}|~}{RT?????????????????d=??????????????????/-????????????????}#!$0<<<=<<==>A@@>@AA:-2?????????????????HInzz{{||{{}~~}}|}zMR?????????????????d=??????????????????++????????????????~# "$/;<==>;===@@@@>AA:+2?????????????????KHn||y|||||{}~}|}|xMS?????????????????d=??????????????????+,????????????????}# ! "/:<=>@<<>=@@@@@AA;-3?????????????????MFs||{{{y}z}}|}|}}yMW?????????????????c>??????????????????,)????????????????|! !1;>?>><<>@>>=>ABB;,0?????????????????LHr{|{|}|y|}}}}}zNX?????????????????c???????????????????()????????????????z#  $/;;<=;<>>=>>>@@BB:,1?????????????????IInyz||||||{||}{~|{NV?????????????????c;?????????????????('????????????????}# $0:<==<;>@>>>>@ABB:,/?????????????????HLlx|}y{y{|y{|}}}}yMR?????????????????d>~?????????????????*(???????????????y" !&3:;<<;==@@=>AABBA;-3?????????????????KLqz{|||y{}|}{}|~{zRQ?????????????????c9w?????????????????)'????????????????y" !%1<<;=>===<=@@ABBC<.5?????????????????IIlz{|}~~~|}{||~}}zMU?????????????????d;p?????????????????)$???????????????x"  $2===<==@=<>=ABBBC?/0?????????????????IGkz}}{||}{||y||}zyOV?????????????????c7o?????????????????'&~??????????~?z"#"#/;<:<<?>;===@?AAA>07?????????????????GGgwxz{yyxyzzyz{yuuHO?????????????????\8v?????????????????'$w~~}|||{~|{zxxxxv!"""'*+(+)*))()+,,.../0398;=<=>DCCDDCBBDHBCJMMLMPNPOJPKPSJDICCNMPONMNNOKHIFDBHE3/46433323.....*+,)( !##!!!!!$#$$#$#&"!!"(+**,,*+.//1478:<:33ACDFGGIIHIJLPKNMQFIPTTRVXVUXUUTXUSTNEGGFDEFAA>==;94877520-,))*(((('&$#!!"  
&%'FQPQR]dq??????????=F?????????????????QN?????????????????LE????znki^[YTPUOS;.%-/12322221/10//,/%#0??????????????????@Q?????????????????QM?????????????????KE?????????????????H01NNQOQQOOMNNLKLJGB'&/??????????????????AW?????????????????OL?????????????????KE?????????????????F-,PQQPQPPQPOONMNNKE''0??????????????????CZ?????????????????RM?????????????????JE?????????????????F,*NSQPPQOOOOMNNMKID('2??????????????????D[?????????????????QK?????????????????IF?????????????????F,*NPPPPPPNOONMMMJIF!'(2??????????????????F]?????????????????RL?????????????????HD????????????????F+%MPPPPOOONONNMMKID)*4??????????????????D^?????????????????PL?????????????????IC?????????????????F+&NPOOOPPOONMMKMKHD**6??????????????????D_?????????????????QJ?????????????????FC~????????????????F,'MPOOOOONONNKKIIIG,+7??????????????????D^?????????????????QI?????????????????EB|????????????????E+&MONOOONNNNKMJKJHH,-8??????????????????D]?????????????????PI?????????????????HE????????????????C,#LOOOONONNNKKKMKJF,*6??????????????????Ca?????????????????MH?????????????????IF?????????????????D*%KONOMNMMKMKJJJIJE,,6??????????????????B^?????????????????MG?????????????????HB}????????????????D+&LONOOONNMMMMKLKIA,,6??????????????????A\?????????????????MF?????????????????IE????????????????E+&LNNMONNMMKKKKKIHF --6??????????????????A[?????????????????KF?????????????????JC????????????????F*&LMONMNMNKKJMKJJIF **5??????????????????>W?????????????????KE?????????????????F?}????????????????C*%KONNNJKKKMKJKJKID,*4??????????????????<W?????????????????MA?????????????????GCx????????????????B)%HKLKKJJJKIHIHHFGC!()*q????????????????o39v|}wwwwwwrqtuspn=9^gadcfgce`dbUY[\^>;DIJDB?FEGE=7>8634.(&&(%&*&%%'+*)+*#%()''03364443233222243/-+133423333423766645789:><<<;<;<?=?;<<:78673/001113--.-+*)&&#"&$#%&""$!! ))+rbPpAD9-*******+*++)++--.//./.0/21453469:=;98<;<>=;><7766666741012.-13/-+-/(''&&&%%&$.%0()-%-#-#' #&(% )))h?n?YQg?7(*))))*)**,--....../0/0001357666::;;>?>AA866666666656565300/20/.-*)(('((&&%)d=yoP?<???F?QFx;?2?1?0))*RQ.0*,,5*(*))))*,**,+/.../...02/22224456468;:>BB;>;:76666666666755303033/,.-*(())('&')#)"##(+$+*#)) & 
> diff --git a/utils/tuning/libtuning/utils.py b/utils/tuning/libtuning/utils.py
> new file mode 100644
> index 00000000..b60f2c9b
> --- /dev/null
> +++ b/utils/tuning/libtuning/utils.py
> @@ -0,0 +1,125 @@
> +# SPDX-License-Identifier: BSD-2-Clause
> +#
> +# Copyright (C) 2019, Raspberry Pi Ltd
> +# Copyright (C) 2022, Paul Elder <paul.elder@ideasonboard.com>
> +#
> +# utils.py - Utilities for libtuning
> +
> +import decimal
> +import math
> +import numpy as np
> +import os
> +from pathlib import Path
> +import re
> +import sys
> +
> +import libtuning as lt
> +from libtuning.image import Image
> +from libtuning.macbeth import locate_macbeth
> +
> +# Utility functions
> +
> +
> +def eprint(*args, **kwargs):
> +    print(*args, file=sys.stderr, **kwargs)
> +
> +
> +def get_module_by_type_name(modules, name):
> +    for module in modules:
> +        if module.type == name:
> +            return module
> +    return None
> +
> +
> +# Private utility functions
> +
> +
> +def _list_image_files(directory):
> +    d = Path(directory)
> +    files = [d.joinpath(f) for f in os.listdir(d)
> +             if re.search(r'\.(jpe?g|dng)$', f)]
> +    files.sort()
> +    return files
> +
> +
> +def _parse_image_filename(fn: Path):
> +    result = re.search(r'^(alsc_)?(\d+)[kK]_(\d+)?[lLuU]?.\w{3,4}$', fn.name)
> +    if result is None:
> +        eprint(f'The file name of {fn.name} is incorrectly formatted')
> +        return None, None, None
> +
> +    color = int(result.group(2))
> +    lsc_only = result.group(1) is not None
> +    lux = None if lsc_only else int(result.group(3))
> +
> +    return color, lux, lsc_only
> +
> +
> +# \todo Implement this from check_imgs() in ctt.py
> +def _validate_images(images):
> +    return True
> +
> +
> +# Public utility functions
> +
> +
> +# @brief Load images into a single list of Image instances
> +# @param input_dir Directory from which to load image files
> +# @param config Configuration dictionary
> +# @param load_nonlsc Whether or not to load non-lsc images
> +# @param load_lsc Whether or not to load lsc-only images
> +# @return A list of Image instances
> +def load_images(input_dir: str, config: dict, load_nonlsc: bool, load_lsc: bool) -> list:
> +    files = _list_image_files(input_dir)
> +    if len(files) == 0:
> +        eprint(f'No images found in {input_dir}')
> +        return None
> +
> +    images = []
> +    for f in files:
> +        color, lux, lsc_only = _parse_image_filename(f)
> +        if color is None:
> +            continue
> +
> +        # Skip lsc image if we don't need it
> +        if lsc_only and not load_lsc:
> +            eprint(f'Skipping {f.name} as this tuner has no LSC module')
> +            continue
> +
> +        # Skip non-lsc image if we don't need it
> +        if not lsc_only and not load_nonlsc:
> +            eprint(f'Skipping {f.name} as this tuner only has an LSC module')
> +            continue
> +
> +        # Load image
> +        try:
> +            image = Image(f)
> +        except Exception as e:
> +            eprint(f'Failed to load image {f.name}: {e}')
> +            continue
> +
> +        # Populate simple fields
> +        image.lsc_only = lsc_only
> +        image.color = color
> +        image.lux = lux
> +
> +        # The black level comes from the TIFF tags, but it can be overridden by
> +        # the config file.
> +        if 'blacklevel' in config['general']:
> +            image.blacklevel_16 = config['general']['blacklevel']
> +
> +        if lsc_only:
> +            images.append(image)
> +            continue
> +
> +        # Handle macbeth
> +        macbeth = locate_macbeth(image, config)
> +        if macbeth is None:
> +            continue
> +
> +        images.append(image)
> +
> +    if not _validate_images(images):
> +        return None
> +
> +    return images
Paul Elder Nov. 25, 2022, 5:43 a.m. UTC | #2
On Fri, Nov 25, 2022 at 02:56:43AM +0200, Laurent Pinchart wrote:
> Hi Paul,
> 
> Thank you for the patch.
> 
> On Thu, Nov 24, 2022 at 08:35:39PM +0900, Paul Elder via libcamera-devel wrote:
> > Implement the core of libtuning, our new tuning tool infrastructure. It
> > leverages components from raspberrypi's ctt that could be reused for
> > tuning tools for other platforms.
> > 
> > The core components include:
> > - The Image class
> > - libtuning (entry point and other core functions)
> > - macbeth-related tools, including the macbeth reference image
> > - utils
> > 
> > Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
> > 
> > ---
> > Changes in v4:
> > - change Image.name to property
> > - fix saturated logic in get_patches
> > - move lsc vs non-lsc image loading decision from utils to Tuner
> >   - this means we have to import libtuning in libtuning.py... is this
> >     fine?
> > - remove raspberry pi's special rounding function, as it has worse
> >   rounding accuracy than simple rounding
> > - remove cli args from module.process parameters
> > 
> > Changes in v3:
> > - *Split into separate patches*
> >   - The following changes apply to the next two patches as well
> > - fix style
> > - rename Camera to Tuner
> > - remove indirection from fake polymorphism
> > - remove unused options property from Module
> > - remove unimplemented gradients
> > - convert readme to rst
> > - fix readme license
> > - reorder dependencies list
> > - add file descriptions
> > - remove indirection from Image loading
> > - remove Image member variables that are unused due to dropping BRCM
> >   support
> > - remove G from Color enum
> >   - Color was /not/ renamed to BayerComponent because it was much too
> >     long for use in code
> > - add @property getters to Param
> > - fix undefined functions/variables
> > 
> > Changes in v2:
> > - fix all python errors
> > - fix style
> > - add SPDX and copyright
> > - remove validateConfig() from the base/abstract Module class
> > - actually append the image after loading, even if it's alsc_only
> > - s/average_functions/average/
> > - remove separate params field for Average and Smoothing
> > - move remainder parameter in Gradient to Linear, as it only applies to
> >   that
> > - from gradient.Linear, remove the remainders that I thought don't make
> >   sense
> > - add Float to gradient.Linear's remainder types, to divide everything
> >   in as a float; useful for rkisp1's sector sizes (the x-size and y-size
> >   tuning options)
> > - add a map function to Gradient, for mapping values onto a curve
> > - in Smoothing, move ksize to a constructor parameter
> > - remove brcm image loading
> > - move process_args from utils to libtuning
> > - move Module's type string and human-readble module name to class
> >   variable
> > - move locate_macbeth from utils to macbeth
> > - add out_name to Module, for the output to know what name to write for
> >   the key in the tuning output (eg. rkisp1 uses "LensShadingCorrection"
> >   while raspberrypi uses "rpi.alsc")
> > ---
> >  utils/tuning/README.rst                |  11 +
> >  utils/tuning/libtuning/__init__.py     |  13 +
> >  utils/tuning/libtuning/image.py        | 136 +++++++
> >  utils/tuning/libtuning/libtuning.py    | 208 ++++++++++
> >  utils/tuning/libtuning/macbeth.py      | 516 +++++++++++++++++++++++++
> >  utils/tuning/libtuning/macbeth_ref.pgm |   6 +
> >  utils/tuning/libtuning/utils.py        | 125 ++++++
> >  7 files changed, 1015 insertions(+)
> >  create mode 100644 utils/tuning/README.rst
> >  create mode 100644 utils/tuning/libtuning/__init__.py
> >  create mode 100644 utils/tuning/libtuning/image.py
> >  create mode 100644 utils/tuning/libtuning/libtuning.py
> >  create mode 100644 utils/tuning/libtuning/macbeth.py
> >  create mode 100644 utils/tuning/libtuning/macbeth_ref.pgm
> >  create mode 100644 utils/tuning/libtuning/utils.py
> > 
> > diff --git a/utils/tuning/README.rst b/utils/tuning/README.rst
> > new file mode 100644
> > index 00000000..ce533b2c
> > --- /dev/null
> > +++ b/utils/tuning/README.rst
> > @@ -0,0 +1,11 @@
> > +.. SPDX-License-Identifier: CC-BY-SA-4.0
> > +
> > +.. TODO: Write an overview of libtuning
> > +
> > +Dependencies
> > +------------
> > +
> > +- cv2
> > +- numpy
> > +- pyexiv2
> > +- rawpy
> > diff --git a/utils/tuning/libtuning/__init__.py b/utils/tuning/libtuning/__init__.py
> > new file mode 100644
> > index 00000000..93049976
> > --- /dev/null
> > +++ b/utils/tuning/libtuning/__init__.py
> > @@ -0,0 +1,13 @@
> > +# SPDX-License-Identifier: GPL-2.0-or-later
> > +#
> > +# Copyright (C) 2022, Paul Elder <paul.elder@ideasonboard.com>
> > +
> > +from libtuning.utils import *
> > +from libtuning.libtuning import *
> > +
> > +from libtuning.image import *
> > +from libtuning.macbeth import *
> > +
> > +from libtuning.average import *
> > +from libtuning.gradient import *
> > +from libtuning.smoothing import *
> > diff --git a/utils/tuning/libtuning/image.py b/utils/tuning/libtuning/image.py
> > new file mode 100644
> > index 00000000..aa9d20b5
> > --- /dev/null
> > +++ b/utils/tuning/libtuning/image.py
> > @@ -0,0 +1,136 @@
> > +# SPDX-License-Identifier: BSD-2-Clause
> > +#
> > +# Copyright (C) 2019, Raspberry Pi Ltd
> > +#
> > +# image.py - Container for an image and associated metadata
> > +
> > +import binascii
> > +import numpy as np
> > +from pathlib import Path
> > +import pyexiv2 as pyexif
> > +import rawpy as raw
> > +import re
> > +
> > +import libtuning as lt
> > +import libtuning.utils as utils
> > +
> > +
> > +class Image:
> > +    def __init__(self, path: Path):
> > +        self.path = path
> > +        self.lsc_only = False
> > +        self.color = -1
> > +        self.lux = -1
> > +
> > +        try:
> > +            self._load_metadata_exif()
> > +        except Exception as e:
> > +            utils.eprint(f'Failed to load metadata from {self.path}: {e}')
> > +            raise e
> > +
> > +        try:
> > +            self._read_image_dng()
> > +        except Exception as e:
> > +            utils.eprint(f'Failed to load image data from {self.path}: {e}')
> > +            raise e
> > +
> > +    @property
> > +    def name(self):
> > +        return self.path.name
> > +
> > +    # May raise KeyError as there are too many to check
> > +    def _load_metadata_exif(self):
> > +        # RawPy doesn't load all the image tags that we need, so we use py3exiv2
> > +        metadata = pyexif.ImageMetadata(str(self.path))
> > +        metadata.read()
> > +
> > +        # The DNG and TIFF/EP specifications use different IFDs to store the
> > +        # raw image data and the Exif tags. DNG stores them in a SubIFD and in
> > +        # an Exif IFD respectively (named "SubImage1" and "Photo" by pyexiv2),
> > +        # while TIFF/EP stores them both in IFD0 (name "Image"). Both are used
> > +        # in "DNG" files, with libcamera-apps following the DNG recommendation
> > +        # and applications based on picamera2 following TIFF/EP.
> > +        #
> > +        # This code detects which tags are being used, and therefore extracts the
> > +        # correct values.
> > +        try:
> > +            self.w = metadata['Exif.SubImage1.ImageWidth'].value
> > +            subimage = 'SubImage1'
> > +            photo = 'Photo'
> > +        except KeyError:
> > +            self.w = metadata['Exif.Image.ImageWidth'].value
> > +            subimage = 'Image'
> > +            photo = 'Image'
> > +        self.pad = 0
> > +        self.h = metadata[f'Exif.{subimage}.ImageLength'].value
> > +        white = metadata[f'Exif.{subimage}.WhiteLevel'].value
> > +        self.sigbits = int(white).bit_length()
> > +        self.fmt = (self.sigbits - 4) // 2
> > +        self.exposure = int(metadata[f'Exif.{photo}.ExposureTime'].value * 1000000)
> > +        self.againQ8 = metadata[f'Exif.{photo}.ISOSpeedRatings'].value * 256 / 100
> > +        self.againQ8_norm = self.againQ8 / 256
> > +        self.camName = metadata['Exif.Image.Model'].value
> > +        self.blacklevel = int(metadata[f'Exif.{subimage}.BlackLevel'].value[0])
> > +        self.blacklevel_16 = self.blacklevel << (16 - self.sigbits)
> > +
> > +        # Channel order depending on bayer pattern
> > +        # The key is the order given by exif, where 0 is R, 1 is G, and 2 is B
> > +        # The value is the index where the color can be found, where the first
> > +        # is R, then GR, then GB, then B.
> > +        bayer_case = {
> > +            '0 1 1 2': (lt.Color.R, lt.Color.GR, lt.Color.GB, lt.Color.B),
> > +            '1 2 0 1': (lt.Color.GB, lt.Color.R, lt.Color.B, lt.Color.GR),
> > +            '2 1 1 0': (lt.Color.B, lt.Color.GB, lt.Color.GR, lt.Color.R),
> > +            '1 0 2 1': (lt.Color.GR, lt.Color.R, lt.Color.B, lt.Color.GB)
> > +        }
> > +        # Note: This needs to be in IFD0
> > +        cfa_pattern = metadata[f'Exif.{subimage}.CFAPattern'].value
> > +        self.order = bayer_case[cfa_pattern]
> > +
> > +    def _read_image_dng(self):
> > +        raw_im = raw.imread(str(self.path))
> > +        raw_data = raw_im.raw_image
> > +        shift = 16 - self.sigbits
> > +        c0 = np.left_shift(raw_data[0::2, 0::2].astype(np.int64), shift)
> > +        c1 = np.left_shift(raw_data[0::2, 1::2].astype(np.int64), shift)
> > +        c2 = np.left_shift(raw_data[1::2, 0::2].astype(np.int64), shift)
> > +        c3 = np.left_shift(raw_data[1::2, 1::2].astype(np.int64), shift)
> > +        self.channels = [c0, c1, c2, c3]
> > +        # Reorder the channels into R, GR, GB, B
> > +        self.channels = [self.channels[i] for i in self.order]
> > +
> > +    # \todo Move this to macbeth.py
> > +    def get_patches(self, cen_coords, size=16):
> > +        saturated = False
> > +
> > +        # Obtain channel widths and heights
> > +        ch_w, ch_h = self.w, self.h
> > +        cen_coords = list(np.array((cen_coords[0])).astype(np.int32))
> > +        self.cen_coords = cen_coords
> > +
> > +        # Squares are ordered by stacking macbeth chart columns from left to
> > +        # right. Some useful patch indices:
> > +        #     white = 3
> > +        #     black = 23
> > +        #     'reds' = 9, 10
> > +        #     'blues' = 2, 5, 8, 20, 22
> > +        #     'greens' = 6, 12, 17
> > +        #     greyscale = 3, 7, 11, 15, 19, 23
> > +        all_patches = []
> > +        for ch in self.channels:
> > +            ch_patches = []
> > +            for cen in cen_coords:
> > +                # Macbeth centre is placed at top left of central 2x2 patch to
> > +                # account for rounding. Patch pixels are sorted by pixel
> > +                # brightness so spatial information is lost.
> > +                patch = ch[cen[1] - 7:cen[1] + 9, cen[0] - 7:cen[0] + 9].flatten()
> > +                patch.sort()
> > +                if patch[-5] == (2**self.sigbits - 1) * 2**(16 - self.sigbits):
> > +                    saturated = True
> > +                ch_patches.append(patch)
> > +
> > +            all_patches.append(ch_patches)
> > +
> > +        self.patches = all_patches
> > +
> > +        return not saturated
> > diff --git a/utils/tuning/libtuning/libtuning.py b/utils/tuning/libtuning/libtuning.py
> > new file mode 100644
> > index 00000000..469e6940
> > --- /dev/null
> > +++ b/utils/tuning/libtuning/libtuning.py
> > @@ -0,0 +1,208 @@
> > +# SPDX-License-Identifier: GPL-2.0-or-later
> > +#
> > +# Copyright (C) 2022, Paul Elder <paul.elder@ideasonboard.com>
> > +#
> > +# libtuning.py - An infrastructure for camera tuning tools
> > +
> > +import argparse
> > +
> > +import libtuning as lt
> > +import libtuning.utils as utils
> > +from libtuning.utils import eprint
> > +
> > +from enum import Enum, IntEnum
> > +
> > +
> > +class Color(IntEnum):
> > +    R = 0
> > +    GR = 1
> > +    GB = 2
> > +    B = 3
> > +
> > +
> > +class Debug(Enum):
> > +    Plot = 1
> > +
> > +
> > +# @brief What to do with the leftover pixels after dividing them into ALSC
> > +#        sectors, when the division gradient is uniform
> > +# @var Float Force floating point division so all sectors divide equally
> > +# @var DistributeFront Divide the remainder equally (until running out,
> > +#      obviously) into the existing sectors, starting from the front
> > +# @var DistributeBack Same as DistributeFront but starting from the back
> > +class Remainder(Enum):
> > +    Float = 0
> > +    DistributeFront = 1
> > +    DistributeBack = 2
> > +
> > +
> > +# @brief A helper class to contain a default value for a module configuration
> > +# parameter
> > +class Param(object):
> > +    # @var Required The value contained in this instance is irrelevant, and the
> > +    #      value must be provided by the tuning configuration file.
> > +    # @var Optional If the value is not provided by the tuning configuration
> > +    #      file, then the value contained in this instance will be used instead.
> > +    # @var Hardcode The value contained in this instance will be used
> > +    class Mode(Enum):
> > +        Required = 0
> > +        Optional = 1
> > +        Hardcode = 2
> > +
> > +    # @param name Name of the parameter. Shall match the name used in the
> > +    #        configuration file for the parameter
> > +    # @param required Mode of the parameter, determining whether a value is
> > +    #        required in the config dict passed to get_value()
> > +    # @param val Default value (only relevant if mode is Optional)
> > +    def __init__(self, name: str, required: Mode, val=None):
> > +        self.name = name
> > +        self.__required = required
> > +        self.val = val
> > +
> > +    def get_value(self, config: dict):
> > +        # Compare the stored Mode; the 'required' property only returns a bool
> > +        if self.__required is self.Mode.Hardcode:
> > +            return self.val
> > +
> > +        if self.__required is self.Mode.Required and self.name not in config:
> > +            raise ValueError(f'Parameter {self.name} is required but not provided in the configuration')
> > +
> > +        return config[self.name] if self.name in config else self.val
> > +
> > +    @property
> > +    def required(self):
> > +        return self.__required is self.Mode.Required
> > +
> > +    # @brief Used by libtuning to auto-generate help information for the tuning
> > +    #        script on the available parameters for the configuration file
> > +    # \todo Implement this
> > +    @property
> > +    def info(self):
> > +        raise NotImplementedError
> > +
> > +
> > +class Tuner(object):
> > +
> > +    # External functions
> > +
> > +    def __init__(self, platform_name):
> > +        self.name = platform_name
> > +        self.modules = []
> > +        self.parser = None
> > +        self.generator = None
> > +        self.output_order = []
> > +        self.config = {}
> > +        self.output = {}
> > +
> > +    def add(self, module):
> > +        self.modules.append(module)
> > +
> > +    def set_input_parser(self, parser):
> > +        self.parser = parser
> > +
> > +    def set_output_formatter(self, output):
> > +        self.generator = output
> > +
> > +    def set_output_order(self, modules):
> > +        self.output_order = modules
> > +
> > +    # @brief Convert classes in self.output_order to the instances in self.modules
> > +    def _prepare_output_order(self):
> > +        output_order = self.output_order
> > +        self.output_order = []
> > +        for module_type in output_order:
> > +            modules = [module for module in self.modules if module.type == module_type.type]
> > +            if len(modules) > 1:
> > +                eprint(f'Multiple modules found for module type "{module_type.type}"')
> > +                return False
> > +            if len(modules) < 1:
> > +                eprint(f'No module found for module type "{module_type.type}"')
> > +                return False
> > +            self.output_order.append(modules[0])
> > +
> > +        return True
> > +
> > +    # \todo Validate parser and generator at Tuner construction time?
> > +    def _validate_settings(self):
> > +        if self.parser is None:
> > +            eprint('Missing parser')
> > +            return False
> > +
> > +        if self.generator is None:
> > +            eprint('Missing generator')
> > +            return False
> > +
> > +        if len(self.modules) == 0:
> > +            eprint('No modules added')
> > +            return False
> > +
> > +        if len(self.output_order) != len(self.modules):
> > +            eprint('Number of outputs does not match number of modules')
> > +            return False
> > +
> > +        return True
> > +
> > +    def _process_args(self, argv, platform_name):
> > +        parser = argparse.ArgumentParser(description=f'Camera Tuning for {platform_name}')
> > +        parser.add_argument('-i', '--input', type=str, required=True,
> > +                            help='''Directory containing calibration images (required).
> > +                                    Images for ALSC must be named "alsc_{Color Temperature}k_1[u].dng",
> > +                                    and all other images must be named "{Color Temperature}k_{Lux Level}l.dng"''')
> > +        parser.add_argument('-o', '--output', type=str, required=True,
> > +                            help='Output file (required)')
> > +        # It is not our duty to scan all modules to figure out their default
> > +        # options, so simply return an empty configuration if none is provided.
> > +        parser.add_argument('-c', '--config', type=str, default='',
> > +                            help='Config file (optional)')
> > +        # \todo Check if we really need this or if stderr is good enough, or if
> > +        # we want a better logging infrastructure with log levels
> > +        parser.add_argument('-l', '--log', type=str, default=None,
> > +                            help='Output log file (optional)')
> > +        return parser.parse_args(argv[1:])
> > +
> > +    def run(self, argv):
> > +        args = self._process_args(argv, self.name)
> > +        if args is None:
> > +            return -1
> > +
> > +        if not self._validate_settings():
> > +            return -1
> > +
> > +        if not self._prepare_output_order():
> > +            return -1
> > +
> > +        if len(args.config) > 0:
> > +            self.config, disable = self.parser.parse(args.config, self.modules)
> > +        else:
> > +            self.config = {'general': {}}
> > +            disable = []
> > +
> > +        # Remove disabled modules
> > +        for module in disable:
> > +            if module in self.modules:
> > +                self.modules.remove(module)
> > +
> > +        for module in self.modules:
> > +            if not module.validate_config(self.config):
> > +                eprint(f'Config is invalid for module {module.type}')
> > +                return -1
> > +
> > +        has_lsc = any(isinstance(m, lt.modules.lsc.LSC) for m in self.modules)
> > +        # Only one LSC module allowed
> > +        has_only_lsc = has_lsc and len(self.modules) == 1
> 
> Nothing to address now, but I can see this evolving into checking that
> there's only one module of each type.

The evolution that I saw was having more than two different types of
images, but yeah that's another possible evolution too.

I decided to deal with it when the time came, and that the extra
complexity wasn't worth it for now.
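
For reference, a rough sketch of what such a per-type uniqueness check could
look like (hypothetical helper, not part of this series; it only relies on
the module.type string and eprint() that already exist here):

    from collections import Counter

    def _check_unique_module_types(modules):
        # Count how many modules were registered for each type string
        counts = Counter(module.type for module in modules)
        duplicates = [t for t, n in counts.items() if n > 1]
        for t in duplicates:
            eprint(f'Multiple modules found for module type "{t}"')
        return len(duplicates) == 0

That could then replace the has_lsc/has_only_lsc special-casing once more
than two kinds of input images are involved.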

> 
> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>


Thanks,

Paul

> 
> > +
> > +        images = utils.load_images(args.input, self.config, not has_only_lsc, has_lsc)
> > +        if images is None or len(images) == 0:
> > +            eprint('No images could be found or loaded')
> > +            return -1
> > +
> > +        # Do the tuning
> > +        for module in self.modules:
> > +            out = module.process(self.config, images, self.output)
> > +            if out is None:
> > +                eprint(f'Module {module.name} failed to process, aborting')
> > +                break
> > +            self.output[module] = out
> > +
> > +        self.generator.write(args.output, self.output, self.output_order)
> > +
> > +        return 0
> > diff --git a/utils/tuning/libtuning/macbeth.py b/utils/tuning/libtuning/macbeth.py
> > new file mode 100644
> > index 00000000..5faddf66
> > --- /dev/null
> > +++ b/utils/tuning/libtuning/macbeth.py
> > @@ -0,0 +1,516 @@
> > +# SPDX-License-Identifier: BSD-2-Clause
> > +#
> > +# Copyright (C) 2019, Raspberry Pi Ltd
> > +#
> > +# macbeth.py - Locate and extract Macbeth charts from images
> > +# (Copied from: ctt_macbeth_locator.py)
> > +
> > +# \todo Add debugging
> > +
> > +import cv2
> > +import os
> > +from pathlib import Path
> > +import numpy as np
> > +import warnings
> > +
> > +from sklearn import cluster
> > +
> > +from libtuning.image import Image
> > +
> > +
> > +# Reshape image to a fixed width without distorting it, and return the image
> > +# and the scale factor
> > +def reshape(img, width):
> > +    factor = width / img.shape[0]
> > +    return cv2.resize(img, None, fx=factor, fy=factor), factor
> > +
> > +
> > +# Correlation function to quantify match
> > +def correlate(im1, im2):
> > +    f1 = im1.flatten()
> > +    f2 = im2.flatten()
> > +    cor = np.corrcoef(f1, f2)
> > +    return cor[0][1]
> > +
> > +
> > +# @brief Compute coordinates of macbeth chart vertices and square centres
> > +# @return (max_cor, best_map_col_norm, fit_coords, success)
> > +#
> > +# Also returns an error/success message for debugging purposes. Additionally,
> > +# it scores the match with a confidence value.
> > +#
> > +#    Brief explanation of the macbeth chart locating algorithm:
> > +#    - Find rectangles within image
> > +#    - Take rectangles within percentage offset of median perimeter. The
> > +#        assumption is that these will be the macbeth squares
> > +#    - For each potential square, find the 24 possible macbeth centre locations
> > +#        that would produce a square in that location
> > +#    - Find clusters of potential macbeth chart centres to find the potential
> > +#        macbeth centres with the most votes, i.e. the most likely ones
> > +#    - For each potential macbeth centre, use the centres of the squares that
> > +#        voted for it to find macbeth chart corners
> > +#    - For each set of corners, transform the possible match into normalised
> > +#        space and correlate with a reference chart to evaluate the match
> > +#    - Select the highest correlation as the macbeth chart match, returning the
> > +#        correlation as the confidence score
> > +#
> > +# \todo Clean this up
> > +def get_macbeth_chart(img, ref_data):
> > +    ref, ref_w, ref_h, ref_corns = ref_data
> > +
> > +    # The code will raise and catch a MacbethError in case of a problem, trying
> > +    # to give some likely reasons why the problem occurred, hence the try/except
> > +    try:
> > +        # Obtain image, convert to grayscale and normalise
> > +        src = img
> > +        src, factor = reshape(src, 200)
> > +        original = src.copy()
> > +        a = 125 / np.average(src)
> > +        src_norm = cv2.convertScaleAbs(src, alpha=a, beta=0)
> > +
> > +        # This code checks if there are separate colour channels. In the past the
> > +        # macbeth locator ran on jpgs and this makes it robust to different
> > +        # filetypes. Note that running it on a jpg has 4x the pixels of the
> > +        # average bayer channel so coordinates must be doubled.
> > +
> > +        # This is best done in image.py in the get_patches method. The
> > +        # coordinates and image width, height must be divided by two if the
> > +        # macbeth locator has been run on a demosaicked image.
> > +        if len(src_norm.shape) == 3:
> > +            src_bw = cv2.cvtColor(src_norm, cv2.COLOR_BGR2GRAY)
> > +        else:
> > +            src_bw = src_norm
> > +        original_bw = src_bw.copy()
> > +
> > +        # Obtain image edges
> > +        sigma = 2
> > +        src_bw = cv2.GaussianBlur(src_bw, (0, 0), sigma)
> > +        t1, t2 = 50, 100
> > +        edges = cv2.Canny(src_bw, t1, t2)
> > +
> > +        # Dilate edges to prevent self-intersections in contours
> > +        k_size = 2
> > +        kernel = np.ones((k_size, k_size))
> > +        its = 1
> > +        edges = cv2.dilate(edges, kernel, iterations=its)
> > +
> > +        # Find contours in image
> > +        conts, _ = cv2.findContours(edges, cv2.RETR_TREE,
> > +                                    cv2.CHAIN_APPROX_NONE)
> > +        if len(conts) == 0:
> > +            raise MacbethError(
> > +                '\nWARNING: No macbeth chart found!'
> > +                '\nNo contours found in image\n'
> > +                'Possible problems:\n'
> > +                '- Macbeth chart is too dark or bright\n'
> > +                '- Macbeth chart is occluded\n'
> > +            )
> > +
> > +        # Find quadrilateral contours
> > +        epsilon = 0.07
> > +        conts_per = []
> > +        for i in range(len(conts)):
> > +            per = cv2.arcLength(conts[i], True)
> > +            poly = cv2.approxPolyDP(conts[i], epsilon * per, True)
> > +            if len(poly) == 4 and cv2.isContourConvex(poly):
> > +                conts_per.append((poly, per))
> > +
> > +        if len(conts_per) == 0:
> > +            raise MacbethError(
> > +                '\nWARNING: No macbeth chart found!'
> > +                '\nNo quadrilateral contours found'
> > +                '\nPossible problems:\n'
> > +                '- Macbeth chart is too dark or bright\n'
> > +                '- Macbeth chart is occluded\n'
> > +                '- Macbeth chart is out of camera plane\n'
> > +            )
> > +
> > +        # Sort contours by perimeter and get perimeters within percent of median
> > +        conts_per = sorted(conts_per, key=lambda x: x[1])
> > +        med_per = conts_per[int(len(conts_per) / 2)][1]
> > +        side = med_per / 4
> > +        perc = 0.1
> > +        med_low, med_high = med_per * (1 - perc), med_per * (1 + perc)
> > +        squares = []
> > +        for i in conts_per:
> > +            if med_low <= i[1] and med_high >= i[1]:
> > +                squares.append(i[0])
> > +
> > +        # Obtain coordinates of normalised macbeth and squares
> > +        square_verts, mac_norm = get_square_verts(0.06)
> > +        # For each square guess, find 24 possible macbeth chart centres
> > +        mac_mids = []
> > +        squares_raw = []
> > +        for i in range(len(squares)):
> > +            square = squares[i]
> > +            squares_raw.append(square)
> > +
> > +            # Convert quads to rotated rectangles. This is required as the
> > +            # 'squares' are usually quite irregular quadrilaterals, so
> > +            # performing a transform would result in exaggerated warping and
> > +            # inaccurate macbeth chart centre placement
> > +            rect = cv2.minAreaRect(square)
> > +            square = cv2.boxPoints(rect).astype(np.float32)
> > +
> > +            # Reorder vertices to prevent 'hourglass shape'
> > +            square = sorted(square, key=lambda x: x[0])
> > +            square_1 = sorted(square[:2], key=lambda x: x[1])
> > +            square_2 = sorted(square[2:], key=lambda x: -x[1])
> > +            square = np.array(np.concatenate((square_1, square_2)), np.float32)
> > +            square = np.reshape(square, (4, 2)).astype(np.float32)
> > +            squares[i] = square
> > +
> > +            # Find 24 possible macbeth chart centres by transforming normalised
> > +            # macbeth square vertices onto candidate square vertices found in image
> > +            for j in range(len(square_verts)):
> > +                verts = square_verts[j]
> > +                p_mat = cv2.getPerspectiveTransform(verts, square)
> > +                mac_guess = cv2.perspectiveTransform(mac_norm, p_mat)
> > +                mac_guess = np.round(mac_guess).astype(np.int32)
> > +
> > +                mac_mid = np.mean(mac_guess, axis=1)
> > +                mac_mids.append([mac_mid, (i, j)])
> > +
> > +        if len(mac_mids) == 0:
> > +            raise MacbethError(
> > +                '\nWARNING: No macbeth chart found!'
> > +                '\nNo possible macbeth charts found within image'
> > +                '\nPossible problems:\n'
> > +                '- Part of the macbeth chart is outside the image\n'
> > +                '- Quadrilaterals in image background\n'
> > +            )
> > +
> > +        # Reshape data
> > +        for i in range(len(mac_mids)):
> > +            mac_mids[i][0] = mac_mids[i][0][0]
> > +
> > +        # Find where midpoints cluster to identify most likely macbeth centres
> > +        clustering = cluster.AgglomerativeClustering(
> > +            n_clusters=None,
> > +            compute_full_tree=True,
> > +            distance_threshold=side * 2
> > +        )
> > +        mac_mids_list = [x[0] for x in mac_mids]
> > +
> > +        if len(mac_mids_list) == 1:
> > +            # Special case of only one valid centre found (probably not needed)
> > +            clus_list = []
> > +            clus_list.append([mac_mids, len(mac_mids)])
> > +
> > +        else:
> > +            clustering.fit(mac_mids_list)
> > +
> > +            # Create list of all clusters
> > +            clus_list = []
> > +            if clustering.n_clusters_ > 1:
> > +                for i in range(clustering.labels_.max() + 1):
> > +                    indices = [j for j, x in enumerate(clustering.labels_) if x == i]
> > +                    clus = []
> > +                    for index in indices:
> > +                        clus.append(mac_mids[index])
> > +                    clus_list.append([clus, len(clus)])
> > +                clus_list.sort(key=lambda x: -x[1])
> > +
> > +            elif clustering.n_clusters_ == 1:
> > +                # Special case of only one cluster found
> > +                clus_list.append([mac_mids, len(mac_mids)])
> > +            else:
> > +                raise MacbethError(
> > +                    '\nWARNING: No macbeth chart found!'
> > +                    '\nNo clusters found'
> > +                    '\nPossible problems:\n'
> > +                    '- NA\n'
> > +                )
> > +
> > +        # Keep only clusters with enough votes
> > +        clus_len_max = clus_list[0][1]
> > +        clus_tol = 0.7
> > +        for i in range(len(clus_list)):
> > +            if clus_list[i][1] < clus_len_max * clus_tol:
> > +                clus_list = clus_list[:i]
> > +                break
> > +            cent = np.mean(clus_list[i][0], axis=0)[0]
> > +            clus_list[i].append(cent)
> > +
> > +        # Get centres of each normalised square
> > +        reference = get_square_centres(0.06)
> > +
> > +        # For each possible macbeth chart, transform image into
> > +        # normalised space and find correlation with reference
> > +        max_cor = 0
> > +        best_map = None
> > +        best_fit = None
> > +        best_cen_fit = None
> > +        best_ref_mat = None
> > +
> > +        for clus in clus_list:
> > +            clus = clus[0]
> > +            sq_cents = []
> > +            ref_cents = []
> > +            i_list = [p[1][0] for p in clus]
> > +            for point in clus:
> > +                i, j = point[1]
> > +
> > +                # Remove any square that voted for two different points within
> > +                # the same cluster. This causes the same point in the image to be
> > +                # mapped to two different reference square centres, resulting in
> > +                # a very distorted perspective transform since cv2.findHomography
> > +                # simply minimises error.
> > +                # This phenomenon is not particularly likely to occur due to the
> > +                # enforced distance threshold in the clustering fit but it is
> > +                # best to keep this in just in case.
> > +                if i_list.count(i) == 1:
> > +                    square = squares_raw[i]
> > +                    sq_cent = np.mean(square, axis=0)
> > +                    ref_cent = reference[j]
> > +                    sq_cents.append(sq_cent)
> > +                    ref_cents.append(ref_cent)
> > +
> > +                    # At least four squares need to have voted for a centre in
> > +                    # order for a transform to be found
> > +            if len(sq_cents) < 4:
> > +                raise MacbethError(
> > +                    '\nWARNING: No macbeth chart found!'
> > +                    '\nNot enough squares found'
> > +                    '\nPossible problems:\n'
> > +                    '- Macbeth chart is occluded\n'
> > +                    '- Macbeth chart is too dark or bright\n'
> > +                )
> > +
> > +            ref_cents = np.array(ref_cents)
> > +            sq_cents = np.array(sq_cents)
> > +
> > +            # Find best fit transform from normalised centres to image
> > +            h_mat, mask = cv2.findHomography(ref_cents, sq_cents)
> > +            if h_mat is None:
> > +                raise MacbethError(
> > +                    '\nERROR\n'
> > +                )
> > +
> > +            # Transform normalised corners and centres into image space
> > +            mac_fit = cv2.perspectiveTransform(mac_norm, h_mat)
> > +            mac_cen_fit = cv2.perspectiveTransform(np.array([reference]), h_mat)
> > +
> > +            # Transform located corners into reference space
> > +            ref_mat = cv2.getPerspectiveTransform(
> > +                mac_fit,
> > +                np.array([ref_corns])
> > +            )
> > +            map_to_ref = cv2.warpPerspective(
> > +                original_bw, ref_mat,
> > +                (ref_w, ref_h)
> > +            )
> > +
> > +            # Normalise brightness
> > +            a = 125 / np.average(map_to_ref)
> > +            map_to_ref = cv2.convertScaleAbs(map_to_ref, alpha=a, beta=0)
> > +
> > +            # Find correlation with bw reference macbeth
> > +            cor = correlate(map_to_ref, ref)
> > +
> > +            # Keep only if best correlation
> > +            if cor > max_cor:
> > +                max_cor = cor
> > +                best_map = map_to_ref
> > +                best_fit = mac_fit
> > +                best_cen_fit = mac_cen_fit
> > +                best_ref_mat = ref_mat
> > +
> > +            # Rotate macbeth by pi and recorrelate in case macbeth chart is
> > +            # upside-down
> > +            mac_fit_inv = np.array(
> > +                ([[mac_fit[0][2], mac_fit[0][3],
> > +                  mac_fit[0][0], mac_fit[0][1]]])
> > +            )
> > +            mac_cen_fit_inv = np.flip(mac_cen_fit, axis=1)
> > +            ref_mat = cv2.getPerspectiveTransform(
> > +                mac_fit_inv,
> > +                np.array([ref_corns])
> > +            )
> > +            map_to_ref = cv2.warpPerspective(
> > +                original_bw, ref_mat,
> > +                (ref_w, ref_h)
> > +            )
> > +            a = 125 / np.average(map_to_ref)
> > +            map_to_ref = cv2.convertScaleAbs(map_to_ref, alpha=a, beta=0)
> > +            cor = correlate(map_to_ref, ref)
> > +            if cor > max_cor:
> > +                max_cor = cor
> > +                best_map = map_to_ref
> > +                best_fit = mac_fit_inv
> > +                best_cen_fit = mac_cen_fit_inv
> > +                best_ref_mat = ref_mat
> > +
> > +        # Check best match is above threshold
> > +        cor_thresh = 0.6
> > +        if max_cor < cor_thresh:
> > +            raise MacbethError(
> > +                '\nWARNING: Correlation too low'
> > +                '\nPossible problems:\n'
> > +                '- Bad lighting conditions\n'
> > +                '- Macbeth chart is occluded\n'
> > +                '- Background is too noisy\n'
> > +                '- Macbeth chart is out of camera plane\n'
> > +            )
> > +
> > +        # Represent coloured macbeth in reference space
> > +        best_map_col = cv2.warpPerspective(
> > +            original, best_ref_mat, (ref_w, ref_h)
> > +        )
> > +        best_map_col = cv2.resize(
> > +            best_map_col, None, fx=4, fy=4
> > +        )
> > +        a = 125 / np.average(best_map_col)
> > +        best_map_col_norm = cv2.convertScaleAbs(
> > +            best_map_col, alpha=a, beta=0
> > +        )
> > +
> > +        # Rescale coordinates to original image size
> > +        fit_coords = (best_fit / factor, best_cen_fit / factor)
> > +
> > +        return (max_cor, best_map_col_norm, fit_coords, True)
> > +
> > +    # Catch macbeth errors and continue with code
> > +    except MacbethError as error:
> > +        eprint(error)
> > +        return (0, None, None, False)
> > +
> > +
> > +def find_macbeth(img, mac_config):
> > +    small_chart = mac_config['small']
> > +    show = mac_config['show']
> > +
> > +    # Catch the warnings
> > +    warnings.simplefilter("ignore")
> > +    warnings.warn("runtime", RuntimeWarning)
> > +
> > +    # Reference macbeth chart is created that will be correlated with the
> > +    # located macbeth chart guess to produce a confidence value for the match.
> > +    script_dir = Path(os.path.realpath(os.path.dirname(__file__)))
> > +    macbeth_ref_path = script_dir.joinpath('macbeth_ref.pgm')
> > +    ref = cv2.imread(str(macbeth_ref_path), flags=cv2.IMREAD_GRAYSCALE)
> > +    ref_w = 120
> > +    ref_h = 80
> > +    rc1 = (0, 0)
> > +    rc2 = (0, ref_h)
> > +    rc3 = (ref_w, ref_h)
> > +    rc4 = (ref_w, 0)
> > +    ref_corns = np.array((rc1, rc2, rc3, rc4), np.float32)
> > +    ref_data = (ref, ref_w, ref_h, ref_corns)
> > +
> > +    # Locate macbeth chart
> > +    cor, mac, coords, ret = get_macbeth_chart(img, ref_data)
> > +
> > +    # Following bits of code try to fix common problems with simple techniques.
> > +    # If now or at any point the best correlation is above 0.75, then
> > +    # nothing more is tried as this is a high enough confidence to ensure
> > +    # reliable macbeth square centre placement.
> > +
> > +    for brightness in [2, 4]:
> > +        if cor >= 0.75:
> > +            break
> > +        img_br = cv2.convertScaleAbs(img, alpha=brightness, beta=0)
> > +        cor_b, mac_b, coords_b, ret_b = get_macbeth_chart(img_br, ref_data)
> > +        if cor_b > cor:
> > +            cor, mac, coords, ret = cor_b, mac_b, coords_b, ret_b
> > +
> > +    # In case macbeth chart is too small, take a selection of the image and
> > +    # attempt to locate macbeth chart within that. The scale increment is
> > +    # root 2
> > +
> > +    # These variables will be used to transform the found coordinates at
> > +    # smaller scales back into the original. If ii is still -1 after this
> > +    # section that means it was not successful
> > +    ii = -1
> > +    w_best = 0
> > +    h_best = 0
> > +    d_best = 100
> > +
> > +    # d_best records the scale of the best match. Macbeth charts are only looked
> > +    # for at one scale increment smaller than the current best match in order to avoid
> > +    # unnecessarily searching for macbeth charts at small scales.
> > +    # If a macbeth chart has already been found then set d_best to 0
> > +    if cor != 0:
> > +        d_best = 0
> > +
> > +    for index, pair in enumerate([{'sel': 2 / 3, 'inc': 1 / 6},
> > +                                  {'sel': 1 / 2, 'inc': 1 / 8},
> > +                                  {'sel': 1 / 3, 'inc': 1 / 12},
> > +                                  {'sel': 1 / 4, 'inc': 1 / 16}]):
> > +        if cor >= 0.75:
> > +            break
> > +
> > +        # Check if we need to check macbeth charts at even smaller scales. This
> > +        # slows the code down significantly and has therefore been omitted by
> > +        # default, however it is not unusably slow so might be useful if the
> > +        # macbeth chart is too small to be picked up by the current
> > +        # subselections. Use this for macbeth charts with side lengths around
> > +        # 1/5 of the image dimensions (and possibly smaller). It is, however,
> > +        # recommended that macbeth charts take up as large a proportion of the
> > +        # image as possible.
> > +        if index >= 2 and (not small_chart or d_best <= index - 1):
> > +            break
> > +
> > +        w, h = list(img.shape[:2])
> > +        # Set dimensions of the subselection and the step along each axis
> > +        # between selections
> > +        w_sel = int(w * pair['sel'])
> > +        h_sel = int(h * pair['sel'])
> > +        w_inc = int(w * pair['inc'])
> > +        h_inc = int(h * pair['inc'])
> > +
> > +        loop = int(((1 - pair['sel']) / pair['inc']) + 1)
> > +        # For each subselection, look for a macbeth chart
> > +        for i in range(loop):
> > +            for j in range(loop):
> > +                w_s, h_s = i * w_inc, j * h_inc
> > +                img_sel = img[w_s:w_s + w_sel, h_s:h_s + h_sel]
> > +                cor_ij, mac_ij, coords_ij, ret_ij = get_macbeth_chart(img_sel, ref_data)
> > +
> > +                # If the correlation is better than the best then record the
> > +                # scale and current subselection at which macbeth chart was
> > +                # found. Also record the coordinates, macbeth chart and message.
> > +                if cor_ij > cor:
> > +                    cor = cor_ij
> > +                    mac, coords, ret = mac_ij, coords_ij, ret_ij
> > +                    ii, jj = i, j
> > +                    w_best, h_best = w_inc, h_inc
> > +                    d_best = index + 1
> > +
> > +    # Transform coordinates from subselection to original image
> > +    if ii != -1:
> > +        for a in range(len(coords)):
> > +            for b in range(len(coords[a][0])):
> > +                coords[a][0][b][1] += ii * w_best
> > +                coords[a][0][b][0] += jj * h_best
> > +
> > +    if not ret:
> > +        return None
> > +
> > +    coords_fit = coords
> > +    if cor < 0.75:
> > +        eprint(f'Warning: Low confidence {cor:.3f} for macbeth chart in {img.path.name}')
> > +
> > +    if show:
> > +        draw_macbeth_results(img, coords_fit)
> > +
> > +    return coords_fit
> > +
> > +
> > +def locate_macbeth(image: Image, config: dict):
> > +    # Find macbeth centres
> > +    av_chan = (np.mean(np.array(image.channels), axis=0) / (2**16))
> > +    av_val = np.mean(av_chan)
> > +    if av_val < image.blacklevel_16 / (2**16) + 1 / 64:
> > +        eprint(f'Image {image.path.name} too dark')
> > +        return None
> > +
> > +    macbeth = find_macbeth(av_chan, config['general']['macbeth'])
> > +
> > +    if macbeth is None:
> > +        eprint(f'No macbeth chart found in {image.path.name}')
> > +        return None
> > +
> > +    mac_cen_coords = macbeth[1]
> > +    if not image.get_patches(mac_cen_coords):
> > +        eprint(f'Macbeth patches have saturated in {image.path.name}')
> > +        return None
> > +
> > +    return macbeth
> > diff --git a/utils/tuning/libtuning/macbeth_ref.pgm b/utils/tuning/libtuning/macbeth_ref.pgm
> > new file mode 100644
> > index 00000000..37897140
> > --- /dev/null
> > +++ b/utils/tuning/libtuning/macbeth_ref.pgm
> > @@ -0,0 +1,6 @@
> > +# SPDX-License-Identifier: BSD-2-Clause
> > +P5
> > +# Reference macbeth chart
> > +120 80
> > +255
> > +[binary PGM raster data omitted]
> > diff --git a/utils/tuning/libtuning/utils.py b/utils/tuning/libtuning/utils.py
> > new file mode 100644
> > index 00000000..b60f2c9b
> > --- /dev/null
> > +++ b/utils/tuning/libtuning/utils.py
> > @@ -0,0 +1,125 @@
> > +# SPDX-License-Identifier: BSD-2-Clause
> > +#
> > +# Copyright (C) 2019, Raspberry Pi Ltd
> > +# Copyright (C) 2022, Paul Elder <paul.elder@ideasonboard.com>
> > +#
> > +# utils.py - Utilities for libtuning
> > +
> > +import decimal
> > +import math
> > +import numpy as np
> > +import os
> > +from pathlib import Path
> > +import re
> > +import sys
> > +
> > +import libtuning as lt
> > +from libtuning.image import Image
> > +from libtuning.macbeth import locate_macbeth
> > +
> > +# Utility functions
> > +
> > +
> > +def eprint(*args, **kwargs):
> > +    print(*args, file=sys.stderr, **kwargs)
> > +
> > +
> > +def get_module_by_type_name(modules, name):
> > +    for module in modules:
> > +        if module.type == name:
> > +            return module
> > +    return None
> > +
> > +
> > +# Private utility functions
> > +
> > +
> > +def _list_image_files(directory):
> > +    d = Path(directory)
> > +    files = [d.joinpath(f) for f in os.listdir(d)
> > +             if re.search(r'\.(jpe?g|dng)$', f)]
> > +    files.sort()
> > +    return files
> > +
> > +
> > +def _parse_image_filename(fn: Path):
> > +    result = re.search(r'^(alsc_)?(\d+)[kK]_(\d+)?[lLuU]?.\w{3,4}$', fn.name)
> > +    if result is None:
> > +        eprint(f'The file name of {fn.name} is incorrectly formatted')
> > +        return None, None, None
> > +
> > +    color = int(result.group(2))
> > +    lsc_only = result.group(1) is not None
> > +    lux = None if lsc_only else int(result.group(3))
> > +
> > +    return color, lux, lsc_only
> > +
> > +
> > +# \todo Implement this from check_imgs() in ctt.py
> > +def _validate_images(images):
> > +    return True
> > +
> > +
> > +# Public utility functions
> > +
> > +
> > +# @brief Load images into a single list of Image instances
> > +# @param input_dir Directory from which to load image files
> > +# @param config Configuration dictionary
> > +# @param load_nonlsc Whether or not to load non-lsc images
> > +# @param load_lsc Whether or not to load lsc-only images
> > +# @return A list of Image instances
> > +def load_images(input_dir: str, config: dict, load_nonlsc: bool, load_lsc: bool) -> list:
> > +    files = _list_image_files(input_dir)
> > +    if len(files) == 0:
> > +        eprint(f'No images found in {input_dir}')
> > +        return None
> > +
> > +    images = []
> > +    for f in files:
> > +        color, lux, lsc_only = _parse_image_filename(f)
> > +        if color is None:
> > +            continue
> > +
> > +        # Skip lsc image if we don't need it
> > +        if lsc_only and not load_lsc:
> > +            eprint(f'Skipping {f.name} as this tuner has no LSC module')
> > +            continue
> > +
> > +        # Skip non-lsc image if we don't need it
> > +        if not lsc_only and not load_nonlsc:
> > +            eprint(f'Skipping {f.name} as this tuner only has an LSC module')
> > +            continue
> > +
> > +        # Load image
> > +        try:
> > +            image = Image(f)
> > +        except Exception as e:
> > +            eprint(f'Failed to load image {f.name}: {e}')
> > +            continue
> > +
> > +        # Populate simple fields
> > +        image.lsc_only = lsc_only
> > +        image.color = color
> > +        image.lux = lux
> > +
> > +        # The black level comes from the TIFF tags, but it can be overridden by
> > +        # the config file.
> > +        if 'blacklevel' in config['general']:
> > +            image.blacklevel_16 = config['general']['blacklevel']
> > +
> > +        if lsc_only:
> > +            images.append(image)
> > +            continue
> > +
> > +        # Handle macbeth
> > +        macbeth = locate_macbeth(image, config)
> > +        if macbeth is None:
> > +            continue
> > +
> > +        images.append(image)
> > +
> > +    if not _validate_images(images):
> > +        return None
> > +
> > +    return images