From patchwork Thu Jun 6 10:15:07 2024
X-Patchwork-Submitter: David Plowman
X-Patchwork-Id: 20215
From: David Plowman
To: libcamera-devel@lists.libcamera.org
Cc: David Plowman, Naushir Patuck
Subject: [PATCH 1/6] utils: raspberrypi: ctt: Adapt tuning tool for both VC4 and PiSP
Date: Thu, 6 Jun 2024 11:15:07 +0100
Message-Id: <20240606101512.375178-2-david.plowman@raspberrypi.com>
X-Mailer: git-send-email 2.39.2 
In-Reply-To: <20240606101512.375178-1-david.plowman@raspberrypi.com> References: <20240606101512.375178-1-david.plowman@raspberrypi.com> MIME-Version: 1.0 X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: libcamera-devel-bounces@lists.libcamera.org Sender: "libcamera-devel" The old ctt.py and alsc_only.py scripts are removed. Instead of ctt.py use ctt_vc4.py or ctt_pisp.py, depending on your target platform. Instead of alsc_only.py use alsc_vc4.py or alsc_pisp.py, again according to your platform. Signed-off-by: David Plowman Reviewed-by: Naushir Patuck --- utils/raspberrypi/ctt/alsc_pisp.py | 37 +++ .../ctt/{alsc_only.py => alsc_vc4.py} | 9 +- utils/raspberrypi/ctt/ctt_alsc.py | 75 +++--- utils/raspberrypi/ctt/ctt_awb.py | 11 +- utils/raspberrypi/ctt/ctt_ccm.py | 6 +- utils/raspberrypi/ctt/ctt_pisp.py | 233 ++++++++++++++++++ .../raspberrypi/ctt/ctt_pretty_print_json.py | 7 +- utils/raspberrypi/ctt/{ctt.py => ctt_run.py} | 184 ++------------ utils/raspberrypi/ctt/ctt_vc4.py | 157 ++++++++++++ 9 files changed, 511 insertions(+), 208 deletions(-) create mode 100755 utils/raspberrypi/ctt/alsc_pisp.py rename utils/raspberrypi/ctt/{alsc_only.py => alsc_vc4.py} (70%) create mode 100755 utils/raspberrypi/ctt/ctt_pisp.py rename utils/raspberrypi/ctt/{ctt.py => ctt_run.py} (82%) create mode 100755 utils/raspberrypi/ctt/ctt_vc4.py diff --git a/utils/raspberrypi/ctt/alsc_pisp.py b/utils/raspberrypi/ctt/alsc_pisp.py new file mode 100755 index 00000000..499aecd1 --- /dev/null +++ b/utils/raspberrypi/ctt/alsc_pisp.py @@ -0,0 +1,37 @@ +#!/usr/bin/env python3 +# +# SPDX-License-Identifier: BSD-2-Clause +# +# Copyright (C) 2022, Raspberry Pi (Trading) Limited +# +# alsc_only.py - alsc tuning tool + +import sys + +from ctt_pisp import json_template, grid_size, target +from ctt_run import run_ctt +from ctt_tools import parse_input + +if __name__ == '__main__': + """ + initialise calibration + """ + if len(sys.argv) == 1: + print(""" + PiSP Lens Shading Camera Tuning Tool version 1.0 + + Required Arguments: + '-i' : Calibration image directory. + '-o' : Name of output json file. + + Optional Arguments: + '-c' : Config file for the CTT. If not passed, default parameters used. + '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. + """) + quit(0) + else: + """ + parse input arguments + """ + json_output, directory, config, log_output = parse_input() + run_ctt(json_output, directory, config, log_output, json_template, grid_size, target, alsc_only=True) diff --git a/utils/raspberrypi/ctt/alsc_only.py b/utils/raspberrypi/ctt/alsc_vc4.py similarity index 70% rename from utils/raspberrypi/ctt/alsc_only.py rename to utils/raspberrypi/ctt/alsc_vc4.py index 092aa40e..caf6a174 100755 --- a/utils/raspberrypi/ctt/alsc_only.py +++ b/utils/raspberrypi/ctt/alsc_vc4.py @@ -6,8 +6,11 @@ # # alsc tuning tool -from ctt import * +import sys +from ctt_vc4 import json_template, grid_size, target +from ctt_run import run_ctt +from ctt_tools import parse_input if __name__ == '__main__': """ @@ -15,7 +18,7 @@ if __name__ == '__main__': """ if len(sys.argv) == 1: print(""" - Pisp Camera Tuning Tool version 1.0 + VC4 Lens Shading Camera Tuning Tool version 1.0 Required Arguments: '-i' : Calibration image directory. 
@@ -31,4 +34,4 @@ if __name__ == '__main__': parse input arguments """ json_output, directory, config, log_output = parse_input() - run_ctt(json_output, directory, config, log_output, alsc_only=True) + run_ctt(json_output, directory, config, log_output, json_template, grid_size, target, alsc_only=True) diff --git a/utils/raspberrypi/ctt/ctt_alsc.py b/utils/raspberrypi/ctt/ctt_alsc.py index b0201ac4..66ce8c14 100644 --- a/utils/raspberrypi/ctt/ctt_alsc.py +++ b/utils/raspberrypi/ctt/ctt_alsc.py @@ -13,8 +13,9 @@ from mpl_toolkits.mplot3d import Axes3D """ preform alsc calibration on a set of images """ -def alsc_all(Cam, do_alsc_colour, plot): +def alsc_all(Cam, do_alsc_colour, plot, grid_size=(16, 12)): imgs_alsc = Cam.imgs_alsc + grid_w, grid_h = grid_size """ create list of colour temperatures and associated calibration tables """ @@ -23,7 +24,7 @@ def alsc_all(Cam, do_alsc_colour, plot): list_cb = [] list_cg = [] for Img in imgs_alsc: - col, cr, cb, cg, size = alsc(Cam, Img, do_alsc_colour, plot) + col, cr, cb, cg, size = alsc(Cam, Img, do_alsc_colour, plot, grid_size=grid_size) list_col.append(col) list_cr.append(cr) list_cb.append(cb) @@ -68,11 +69,12 @@ def alsc_all(Cam, do_alsc_colour, plot): t_b = np.where((100*t_b) % 1 >= 0.95, t_b-0.001, t_b) t_r = np.round(t_r, 3) t_b = np.round(t_b, 3) - r_corners = (t_r[0], t_r[15], t_r[-1], t_r[-16]) - b_corners = (t_b[0], t_b[15], t_b[-1], t_b[-16]) - r_cen = t_r[5*16+7]+t_r[5*16+8]+t_r[6*16+7]+t_r[6*16+8] + r_corners = (t_r[0], t_r[grid_w - 1], t_r[-1], t_r[-grid_w]) + b_corners = (t_b[0], t_b[grid_w - 1], t_b[-1], t_b[-grid_w]) + middle_pos = (grid_h // 2 - 1) * grid_w + grid_w - 1 + r_cen = t_r[middle_pos]+t_r[middle_pos + 1]+t_r[middle_pos + grid_w]+t_r[middle_pos + grid_w + 1] r_cen = round(r_cen/4, 3) - b_cen = t_b[5*16+7]+t_b[5*16+8]+t_b[6*16+7]+t_b[6*16+8] + b_cen = t_b[middle_pos]+t_b[middle_pos + 1]+t_b[middle_pos + grid_w]+t_b[middle_pos + grid_w + 1] b_cen = round(b_cen/4, 3) Cam.log += '\nRed table corners: {}'.format(r_corners) Cam.log += '\nRed table centre: {}'.format(r_cen) @@ -116,8 +118,9 @@ def alsc_all(Cam, do_alsc_colour, plot): """ calculate g/r and g/b for 32x32 points arranged in a grid for a single image """ -def alsc(Cam, Img, do_alsc_colour, plot=False): +def alsc(Cam, Img, do_alsc_colour, plot=False, grid_size=(16, 12)): Cam.log += '\nProcessing image: ' + Img.name + grid_w, grid_h = grid_size """ get channel in correct order """ @@ -128,24 +131,24 @@ def alsc(Cam, Img, do_alsc_colour, plot=False): where w is a multiple of 32. 
""" w, h = Img.w/2, Img.h/2 - dx, dy = int(-(-(w-1)//16)), int(-(-(h-1)//12)) + dx, dy = int(-(-(w-1)//grid_w)), int(-(-(h-1)//grid_h)) """ average the green channels into one """ av_ch_g = np.mean((channels[1:3]), axis=0) if do_alsc_colour: """ - obtain 16x12 grid of intensities for each channel and subtract black level + obtain grid_w x grid_h grid of intensities for each channel and subtract black level """ - g = get_16x12_grid(av_ch_g, dx, dy) - Img.blacklevel_16 - r = get_16x12_grid(channels[0], dx, dy) - Img.blacklevel_16 - b = get_16x12_grid(channels[3], dx, dy) - Img.blacklevel_16 + g = get_grid(av_ch_g, dx, dy, grid_size) - Img.blacklevel_16 + r = get_grid(channels[0], dx, dy, grid_size) - Img.blacklevel_16 + b = get_grid(channels[3], dx, dy, grid_size) - Img.blacklevel_16 """ calculate ratios as 32 bit in order to be supported by medianBlur function """ - cr = np.reshape(g/r, (12, 16)).astype('float32') - cb = np.reshape(g/b, (12, 16)).astype('float32') - cg = np.reshape(1/g, (12, 16)).astype('float32') + cr = np.reshape(g/r, (grid_h, grid_w)).astype('float32') + cb = np.reshape(g/b, (grid_h, grid_w)).astype('float32') + cg = np.reshape(1/g, (grid_h, grid_w)).astype('float32') """ median blur to remove peaks and save as float 64 """ @@ -164,7 +167,7 @@ def alsc(Cam, Img, do_alsc_colour, plot=False): """ note Y is plotted as -Y so plot has same axes as image """ - X, Y = np.meshgrid(range(16), range(12)) + X, Y = np.meshgrid(range(grid_w), range(grid_h)) ha.plot_surface(X, -Y, cr, cmap=cm.coolwarm, linewidth=0) ha.set_title('ALSC Plot\nImg: {}\n\ncr'.format(Img.str)) hb = hf.add_subplot(312, projection='3d') @@ -182,15 +185,15 @@ def alsc(Cam, Img, do_alsc_colour, plot=False): """ only perform calculations for luminance shading """ - g = get_16x12_grid(av_ch_g, dx, dy) - Img.blacklevel_16 - cg = np.reshape(1/g, (12, 16)).astype('float32') + g = get_grid(av_ch_g, dx, dy, grid_size) - Img.blacklevel_16 + cg = np.reshape(1/g, (grid_h, grid_w)).astype('float32') cg = cv2.medianBlur(cg, 3).astype('float64') cg = cg/np.min(cg) if plot: hf = plt.figure(figssize=(8, 8)) ha = hf.add_subplot(1, 1, 1, projection='3d') - X, Y = np.meashgrid(range(16), range(12)) + X, Y = np.meashgrid(range(grid_w), range(grid_h)) ha.plot_surface(X, -Y, cg, cmap=cm.coolwarm, linewidth=0) ha.set_title('ALSC Plot (Luminance only!)\nImg: {}\n\ncg').format(Img.str) plt.show() @@ -199,21 +202,22 @@ def alsc(Cam, Img, do_alsc_colour, plot=False): """ -Compresses channel down to a 16x12 grid +Compresses channel down to a grid of the requested size """ -def get_16x12_grid(chan, dx, dy): +def get_grid(chan, dx, dy, grid_size): + grid_w, grid_h = grid_size grid = [] """ since left and bottom border will not necessarily have rectangles of dimension dx x dy, the 32nd iteration has to be handled separately. 
""" - for i in range(11): - for j in range(15): + for i in range(grid_h - 1): + for j in range(grid_w - 1): grid.append(np.mean(chan[dy*i:dy*(1+i), dx*j:dx*(1+j)])) - grid.append(np.mean(chan[dy*i:dy*(1+i), 15*dx:])) - for j in range(15): - grid.append(np.mean(chan[11*dy:, dx*j:dx*(1+j)])) - grid.append(np.mean(chan[11*dy:, 15*dx:])) + grid.append(np.mean(chan[dy*i:dy*(1+i), (grid_w - 1)*dx:])) + for j in range(grid_w - 1): + grid.append(np.mean(chan[(grid_h - 1)*dy:, dx*j:dx*(1+j)])) + grid.append(np.mean(chan[(grid_h - 1)*dy:, (grid_w - 1)*dx:])) """ return as np.array, ready for further manipulation """ @@ -223,7 +227,7 @@ def get_16x12_grid(chan, dx, dy): """ obtains sigmas for red and blue, effectively a measure of the 'error' """ -def get_sigma(Cam, cal_cr_list, cal_cb_list): +def get_sigma(Cam, cal_cr_list, cal_cb_list, grid_size): Cam.log += '\nCalculating sigmas' """ provided colour alsc tables were generated for two different colour @@ -241,8 +245,8 @@ def get_sigma(Cam, cal_cr_list, cal_cb_list): sigma_rs = [] sigma_bs = [] for i in range(len(cts)-1): - sigma_rs.append(calc_sigma(cal_cr_list[i]['table'], cal_cr_list[i+1]['table'])) - sigma_bs.append(calc_sigma(cal_cb_list[i]['table'], cal_cb_list[i+1]['table'])) + sigma_rs.append(calc_sigma(cal_cr_list[i]['table'], cal_cr_list[i+1]['table'], grid_size)) + sigma_bs.append(calc_sigma(cal_cb_list[i]['table'], cal_cb_list[i+1]['table'], grid_size)) Cam.log += '\nColour temperature interval {} - {} K'.format(cts[i], cts[i+1]) Cam.log += '\nSigma red: {}'.format(sigma_rs[-1]) Cam.log += '\nSigma blue: {}'.format(sigma_bs[-1]) @@ -263,12 +267,13 @@ def get_sigma(Cam, cal_cr_list, cal_cb_list): """ calculate sigma from two adjacent gain tables """ -def calc_sigma(g1, g2): +def calc_sigma(g1, g2, grid_size): + grid_w, grid_h = grid_size """ reshape into 16x12 matrix """ - g1 = np.reshape(g1, (12, 16)) - g2 = np.reshape(g2, (12, 16)) + g1 = np.reshape(g1, (grid_h, grid_w)) + g2 = np.reshape(g2, (grid_h, grid_w)) """ apply gains to gain table """ @@ -280,8 +285,8 @@ def calc_sigma(g1, g2): neighbours, then append to list """ diffs = [] - for i in range(10): - for j in range(14): + for i in range(grid_h - 2): + for j in range(grid_w - 2): """ note indexing is incremented by 1 since all patches on borders are not counted diff --git a/utils/raspberrypi/ctt/ctt_awb.py b/utils/raspberrypi/ctt/ctt_awb.py index 5ba6f978..4af1fe41 100644 --- a/utils/raspberrypi/ctt/ctt_awb.py +++ b/utils/raspberrypi/ctt/ctt_awb.py @@ -13,7 +13,7 @@ from scipy.optimize import fmin """ obtain piecewise linear approximation for colour curve """ -def awb(Cam, cal_cr_list, cal_cb_list, plot): +def awb(Cam, cal_cr_list, cal_cb_list, plot, grid_size): imgs = Cam.imgs """ condense alsc calibration tables into one dictionary @@ -43,7 +43,7 @@ def awb(Cam, cal_cr_list, cal_cb_list, plot): Note: if alsc is disabled then colour_cals will be set to None and the function will just return the greyscale patches """ - r_patchs, b_patchs, g_patchs = get_alsc_patches(Img, colour_cals) + r_patchs, b_patchs, g_patchs = get_alsc_patches(Img, colour_cals, grid_size=grid_size) """ calculate ratio of r, b to g """ @@ -293,12 +293,13 @@ def awb(Cam, cal_cr_list, cal_cb_list, plot): """ obtain greyscale patches and perform alsc colour correction """ -def get_alsc_patches(Img, colour_cals, grey=True): +def get_alsc_patches(Img, colour_cals, grey=True, grid_size=(16, 12)): """ get patch centre coordinates, image colour and the actual patches for each channel, remembering to subtract 
blacklevel If grey then only greyscale patches considered """ + grid_w, grid_h = grid_size if grey: cen_coords = Img.cen_coords[3::4] col = Img.col @@ -345,12 +346,12 @@ def get_alsc_patches(Img, colour_cals, grey=True): bef_tabs = np.array(colour_cals[bef]) aft_tabs = np.array(colour_cals[aft]) col_tabs = (bef_tabs*db + aft_tabs*da)/(da+db) - col_tabs = np.reshape(col_tabs, (2, 12, 16)) + col_tabs = np.reshape(col_tabs, (2, grid_h, grid_w)) """ calculate dx, dy used to calculate alsc table """ w, h = Img.w/2, Img.h/2 - dx, dy = int(-(-(w-1)//16)), int(-(-(h-1)//12)) + dx, dy = int(-(-(w-1)//grid_w)), int(-(-(h-1)//grid_h)) """ make list of pairs of gains for each patch by selecting the correct value in alsc colour calibration table diff --git a/utils/raspberrypi/ctt/ctt_ccm.py b/utils/raspberrypi/ctt/ctt_ccm.py index 59753e33..07c943a8 100644 --- a/utils/raspberrypi/ctt/ctt_ccm.py +++ b/utils/raspberrypi/ctt/ctt_ccm.py @@ -56,7 +56,7 @@ FInds colour correction matrices for list of images """ -def ccm(Cam, cal_cr_list, cal_cb_list): +def ccm(Cam, cal_cr_list, cal_cb_list, grid_size): global matrix_selection_types, typenum imgs = Cam.imgs """ @@ -133,9 +133,7 @@ def ccm(Cam, cal_cr_list, cal_cb_list): Note: if alsc is disabled then colour_cals will be set to None and no the function will simply return the macbeth patches """ - r, b, g = get_alsc_patches(Img, colour_cals, grey=False) - # 256 values for each patch of sRGB values - + r, b, g = get_alsc_patches(Img, colour_cals, grey=False, grid_size=grid_size) """ do awb Note: awb is done by measuring the macbeth chart in the image, rather diff --git a/utils/raspberrypi/ctt/ctt_pisp.py b/utils/raspberrypi/ctt/ctt_pisp.py new file mode 100755 index 00000000..f837e062 --- /dev/null +++ b/utils/raspberrypi/ctt/ctt_pisp.py @@ -0,0 +1,233 @@ +#!/usr/bin/env python3 +# +# SPDX-License-Identifier: BSD-2-Clause +# +# Copyright (C) 2019, Raspberry Pi Ltd +# +# ctt_pisp.py - camera tuning tool for PiSP platforms + +import os +import sys + +from ctt_run import run_ctt +from ctt_tools import parse_input + +json_template = { + "rpi.black_level": { + "black_level": 4096 + }, + "rpi.lux": { + "reference_shutter_speed": 10000, + "reference_gain": 1, + "reference_aperture": 1.0 + }, + "rpi.dpc": { + "strength": 1 + }, + "rpi.noise": { + }, + "rpi.geq": { + }, + "rpi.denoise": + { + "sdn": + { + "deviation": 1.6, + "strength": 0.5, + "deviation2": 3.2, + "deviation_no_tdn": 3.2, + "strength_no_tdn": 0.75 + }, + "cdn": + { + "deviation": 200, + "strength": 0.3 + }, + "tdn": + { + "deviation": 0.8, + "threshold": 0.05 + } + }, + "rpi.awb": { + "priors": [ + {"lux": 0, "prior": [2000, 1.0, 3000, 0.0, 13000, 0.0]}, + {"lux": 800, "prior": [2000, 0.0, 6000, 2.0, 13000, 2.0]}, + {"lux": 1500, "prior": [2000, 0.0, 4000, 1.0, 6000, 6.0, 6500, 7.0, 7000, 1.0, 13000, 1.0]} + ], + "modes": { + "auto": {"lo": 2500, "hi": 7700}, + "incandescent": {"lo": 2500, "hi": 3000}, + "tungsten": {"lo": 3000, "hi": 3500}, + "fluorescent": {"lo": 4000, "hi": 4700}, + "indoor": {"lo": 3000, "hi": 5000}, + "daylight": {"lo": 5500, "hi": 6500}, + "cloudy": {"lo": 7000, "hi": 8000} + }, + "bayes": 1 + }, + "rpi.agc": { + "metering_modes": { + "centre-weighted": { + "weights": [ + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 
2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 4, 4, 4, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0 + ] + }, + "spot": { + "weights": [ + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 + ] + }, + "matrix": { + "weights": [ + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 + ] + } + }, + "exposure_modes": { + "normal": { + "shutter": [100, 10000, 30000, 60000, 66666], + "gain": [1.0, 1.5, 2.0, 4.0, 8.0] + }, + "short": { + "shutter": [100, 5000, 10000, 20000, 60000], + "gain": [1.0, 1.5, 2.0, 4.0, 8.0] + }, + "long": + { + "shutter": [ 100, 10000, 30000, 60000, 90000, 120000 ], + "gain": [ 1.0, 1.5, 2.0, 4.0, 8.0, 12.0 ] + } + }, + "constraint_modes": { + "normal": [ + {"bound": "LOWER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.5, 1000, 0.5]} + ], + "highlight": [ + {"bound": "LOWER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.5, 1000, 0.5]}, + {"bound": "UPPER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.8, 1000, 0.8]} + ] + }, + "y_target": [0, 0.16, 1000, 0.165, 10000, 0.17] + }, + "rpi.alsc": { + 'omega': 1.3, + 'n_iter': 100, + 'luminance_strength': 0.8, + }, + "rpi.contrast": { + "ce_enable": 1, + "gamma_curve": [ + 0, 0, + 1024, 5040, + 2048, 9338, + 3072, 12356, + 4096, 15312, + 5120, 18051, + 6144, 20790, + 7168, 23193, + 8192, 25744, + 9216, 27942, + 10240, 30035, + 11264, 32005, + 12288, 33975, + 13312, 35815, + 14336, 37600, + 15360, 39168, + 16384, 40642, + 18432, 43379, + 20480, 45749, + 22528, 47753, + 24576, 49621, + 26624, 51253, + 28672, 52698, + 30720, 53796, + 32768, 54876, + 36864, 57012, + 40960, 58656, + 45056, 59954, + 49152, 61183, + 53248, 62355, + 57344, 63419, + 61440, 64476, + 65535, 65535 + ] + }, + "rpi.ccm": { + }, + "rpi.sharpen": { + "threshold": 0.25, + "limit": 1.0, + "strength": 1.0 + } +} + +grid_size = (32, 32) + +target = 'pisp' + +if __name__ == '__main__': + """ + initialise calibration + """ + if len(sys.argv) == 1: + print(""" + PiSP Camera Tuning Tool version 1.0 + + 
Required Arguments: + '-i' : Calibration image directory. + '-o' : Name of output json file. + + Optional Arguments: + '-c' : Config file for the CTT. If not passed, default parameters used. + '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. + """) + quit(0) + else: + """ + parse input arguments + """ + json_output, directory, config, log_output = parse_input() + run_ctt(json_output, directory, config, log_output, json_template, grid_size, target) diff --git a/utils/raspberrypi/ctt/ctt_pretty_print_json.py b/utils/raspberrypi/ctt/ctt_pretty_print_json.py index 3e3b8475..5d16b2a6 100755 --- a/utils/raspberrypi/ctt/ctt_pretty_print_json.py +++ b/utils/raspberrypi/ctt/ctt_pretty_print_json.py @@ -19,6 +19,7 @@ class Encoder(json.JSONEncoder): self.indentation_level = 0 self.hard_break = 120 self.custom_elems = { + 'weights': 15, 'table': 16, 'luminance_lut': 16, 'ct_curve': 3, @@ -87,7 +88,7 @@ class Encoder(json.JSONEncoder): return self.encode(o) -def pretty_print(in_json: dict) -> str: +def pretty_print(in_json: dict, custom_elems={}) -> str: if 'version' not in in_json or \ 'target' not in in_json or \ @@ -95,7 +96,9 @@ def pretty_print(in_json: dict) -> str: in_json['version'] < 2.0: raise RuntimeError('Incompatible JSON dictionary has been provided') - return json.dumps(in_json, cls=Encoder, indent=4, sort_keys=False) + encoder = Encoder(indent=4, sort_keys=False) + encoder.custom_elems |= custom_elems + return encoder.encode(in_json) #json.dumps(in_json, cls=Encoder, indent=4, sort_keys=False) if __name__ == "__main__": diff --git a/utils/raspberrypi/ctt/ctt.py b/utils/raspberrypi/ctt/ctt_run.py similarity index 82% rename from utils/raspberrypi/ctt/ctt.py rename to utils/raspberrypi/ctt/ctt_run.py index bbe960b0..0c85d7db 100755 --- a/utils/raspberrypi/ctt/ctt.py +++ b/utils/raspberrypi/ctt/ctt_run.py @@ -67,7 +67,7 @@ Camera object that is the backbone of the tuning tool. Input is the desired path of the output json. 
""" class Camera: - def __init__(self, jfile): + def __init__(self, jfile, json): self.path = os.path.dirname(os.path.expanduser(__file__)) + '/' if self.path == '/': self.path = '' @@ -79,127 +79,15 @@ class Camera: """ initial json dict populated by uncalibrated values """ - self.json = { - "rpi.black_level": { - "black_level": 4096 - }, - "rpi.dpc": { - }, - "rpi.lux": { - "reference_shutter_speed": 10000, - "reference_gain": 1, - "reference_aperture": 1.0 - }, - "rpi.noise": { - }, - "rpi.geq": { - }, - "rpi.sdn": { - }, - "rpi.awb": { - "priors": [ - {"lux": 0, "prior": [2000, 1.0, 3000, 0.0, 13000, 0.0]}, - {"lux": 800, "prior": [2000, 0.0, 6000, 2.0, 13000, 2.0]}, - {"lux": 1500, "prior": [2000, 0.0, 4000, 1.0, 6000, 6.0, 6500, 7.0, 7000, 1.0, 13000, 1.0]} - ], - "modes": { - "auto": {"lo": 2500, "hi": 8000}, - "incandescent": {"lo": 2500, "hi": 3000}, - "tungsten": {"lo": 3000, "hi": 3500}, - "fluorescent": {"lo": 4000, "hi": 4700}, - "indoor": {"lo": 3000, "hi": 5000}, - "daylight": {"lo": 5500, "hi": 6500}, - "cloudy": {"lo": 7000, "hi": 8600} - }, - "bayes": 1 - }, - "rpi.agc": { - "metering_modes": { - "centre-weighted": { - "weights": [3, 3, 3, 2, 2, 2, 2, 1, 1, 1, 1, 0, 0, 0, 0] - }, - "spot": { - "weights": [2, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] - }, - "matrix": { - "weights": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1] - } - }, - "exposure_modes": { - "normal": { - "shutter": [100, 10000, 30000, 60000, 120000], - "gain": [1.0, 2.0, 4.0, 6.0, 6.0] - }, - "short": { - "shutter": [100, 5000, 10000, 20000, 120000], - "gain": [1.0, 2.0, 4.0, 6.0, 6.0] - } - }, - "constraint_modes": { - "normal": [ - {"bound": "LOWER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.5, 1000, 0.5]} - ], - "highlight": [ - {"bound": "LOWER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.5, 1000, 0.5]}, - {"bound": "UPPER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.8, 1000, 0.8]} - ] - }, - "y_target": [0, 0.16, 1000, 0.165, 10000, 0.17] - }, - "rpi.alsc": { - 'omega': 1.3, - 'n_iter': 100, - 'luminance_strength': 0.7, - }, - "rpi.contrast": { - "ce_enable": 1, - "gamma_curve": [ - 0, 0, - 1024, 5040, - 2048, 9338, - 3072, 12356, - 4096, 15312, - 5120, 18051, - 6144, 20790, - 7168, 23193, - 8192, 25744, - 9216, 27942, - 10240, 30035, - 11264, 32005, - 12288, 33975, - 13312, 35815, - 14336, 37600, - 15360, 39168, - 16384, 40642, - 18432, 43379, - 20480, 45749, - 22528, 47753, - 24576, 49621, - 26624, 51253, - 28672, 52698, - 30720, 53796, - 32768, 54876, - 36864, 57012, - 40960, 58656, - 45056, 59954, - 49152, 61183, - 53248, 62355, - 57344, 63419, - 61440, 64476, - 65535, 65535 - ] - }, - "rpi.ccm": { - }, - "rpi.sharpen": { - } - } + + self.json = json + """ Perform colour correction calibrations by comparing macbeth patch colours to standard macbeth chart colours. """ - def ccm_cal(self, do_alsc_colour): + def ccm_cal(self, do_alsc_colour, grid_size): if 'rpi.ccm' in self.disable: return 1 print('\nStarting CCM calibration') @@ -245,7 +133,7 @@ class Camera: Do CCM calibration """ try: - ccms = ccm(self, cal_cr_list, cal_cb_list) + ccms = ccm(self, cal_cr_list, cal_cb_list, grid_size) except ArithmeticError: print('ERROR: Matrix is singular!\nTake new pictures and try again...') self.log += '\nERROR: Singular matrix encountered during fit!' @@ -263,7 +151,7 @@ class Camera: various colour temperatures, as well as providing a maximum 'wiggle room' distance from this curve (transverse_neg/pos). 
""" - def awb_cal(self, greyworld, do_alsc_colour): + def awb_cal(self, greyworld, do_alsc_colour, grid_size): if 'rpi.awb' in self.disable: return 1 print('\nStarting AWB calibration') @@ -306,7 +194,7 @@ class Camera: call calibration function """ plot = "rpi.awb" in self.plot - awb_out = awb(self, cal_cr_list, cal_cb_list, plot) + awb_out = awb(self, cal_cr_list, cal_cb_list, plot, grid_size) ct_curve, transverse_neg, transverse_pos = awb_out """ write output to json @@ -324,7 +212,7 @@ class Camera: colour channel seperately, and then partially corrects for vignetting. The extent of the correction depends on the 'luminance_strength' parameter. """ - def alsc_cal(self, luminance_strength, do_alsc_colour): + def alsc_cal(self, luminance_strength, do_alsc_colour, grid_size): if 'rpi.alsc' in self.disable: return 1 print('\nStarting ALSC calibration') @@ -347,7 +235,7 @@ class Camera: call calibration function """ plot = "rpi.alsc" in self.plot - alsc_out = alsc_all(self, do_alsc_colour, plot) + alsc_out = alsc_all(self, do_alsc_colour, plot, grid_size) cal_cr_list, cal_cb_list, luminance_lut, av_corn = alsc_out """ write output to json and finish if not do_alsc_colour @@ -393,7 +281,7 @@ class Camera: """ obtain worst-case scenario residual sigmas """ - sigma_r, sigma_b = get_sigma(self, cal_cr_list, cal_cb_list) + sigma_r, sigma_b = get_sigma(self, cal_cr_list, cal_cb_list, grid_size) """ write output to json """ @@ -509,19 +397,20 @@ class Camera: """ writes the json dictionary to the raw json file then make pretty """ - def write_json(self): + def write_json(self, version=2.0, target='bcm2835', grid_size=(16, 12)): """ Write json dictionary to file using our version 2 format """ out_json = { - "version": 2.0, - 'target': 'bcm2835', + "version": version, + 'target': target if target != 'vc4' else 'bcm2835', "algorithms": [{name: data} for name, data in self.json.items()], } with open(self.jf, 'w') as f: - f.write(pretty_print(out_json)) + f.write(pretty_print(out_json, + custom_elems={'table': grid_size[0], 'luminance_lut': grid_size[0]})) """ add a new section to the log file @@ -712,7 +601,7 @@ class Camera: return 0 -def run_ctt(json_output, directory, config, log_output, alsc_only=False): +def run_ctt(json_output, directory, config, log_output, json_template, grid_size, target, alsc_only=False): """ check input files are jsons """ @@ -748,7 +637,7 @@ def run_ctt(json_output, directory, config, log_output, alsc_only=False): greyworld = get_config(awb_d, "greyworld", 0, 'bool') alsc_d = get_config(configs, "alsc", {}, 'dict') do_alsc_colour = get_config(alsc_d, "do_alsc_colour", 1, 'bool') - luminance_strength = get_config(alsc_d, "luminance_strength", 0.5, 'num') + luminance_strength = get_config(alsc_d, "luminance_strength", 0.8, 'num') blacklevel = get_config(configs, "blacklevel", -1, 'num') macbeth_d = get_config(configs, "macbeth", {}, 'dict') mac_small = get_config(macbeth_d, "small", 0, 'bool') @@ -772,7 +661,7 @@ def run_ctt(json_output, directory, config, log_output, alsc_only=False): initialise tuning tool and load images """ try: - Cam = Camera(json_output) + Cam = Camera(json_output, json=json_template) Cam.log_user_input(json_output, directory, config, log_output) if alsc_only: disable = set(Cam.json.keys()).symmetric_difference({"rpi.alsc"}) @@ -794,14 +683,16 @@ def run_ctt(json_output, directory, config, log_output, alsc_only=False): Cam.json['rpi.black_level']['black_level'] = Cam.blacklevel_16 Cam.json_remove(disable) print('\nSTARTING CALIBRATIONS') - 
Cam.alsc_cal(luminance_strength, do_alsc_colour) + Cam.alsc_cal(luminance_strength, do_alsc_colour, grid_size) Cam.geq_cal() Cam.lux_cal() Cam.noise_cal() - Cam.awb_cal(greyworld, do_alsc_colour) - Cam.ccm_cal(do_alsc_colour) + Cam.cac_cal(do_alsc_colour) + Cam.awb_cal(greyworld, do_alsc_colour, grid_size) + Cam.ccm_cal(do_alsc_colour, grid_size) + print('\nFINISHED CALIBRATIONS') - Cam.write_json() + Cam.write_json(target=target, grid_size=grid_size) Cam.write_log(log_output) print('\nCalibrations written to: '+json_output) if log_output is None: @@ -810,28 +701,3 @@ def run_ctt(json_output, directory, config, log_output, alsc_only=False): pass else: Cam.write_log(log_output) - - -if __name__ == '__main__': - """ - initialise calibration - """ - if len(sys.argv) == 1: - print(""" - Pisp Camera Tuning Tool version 1.0 - - Required Arguments: - '-i' : Calibration image directory. - '-o' : Name of output json file. - - Optional Arguments: - '-c' : Config file for the CTT. If not passed, default parameters used. - '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. - """) - quit(0) - else: - """ - parse input arguments - """ - json_output, directory, config, log_output = parse_input() - run_ctt(json_output, directory, config, log_output) diff --git a/utils/raspberrypi/ctt/ctt_vc4.py b/utils/raspberrypi/ctt/ctt_vc4.py new file mode 100755 index 00000000..86acfd47 --- /dev/null +++ b/utils/raspberrypi/ctt/ctt_vc4.py @@ -0,0 +1,157 @@ +#!/usr/bin/env python3 +# +# SPDX-License-Identifier: BSD-2-Clause +# +# Copyright (C) 2019, Raspberry Pi Ltd +# +# ctt_vc4.py - camera tuning tool for VC4 platforms + +import os +import sys + +from ctt_run import run_ctt +from ctt_tools import parse_input + +json_template = { + "rpi.black_level": { + "black_level": 4096 + }, + "rpi.dpc": { + }, + "rpi.lux": { + "reference_shutter_speed": 10000, + "reference_gain": 1, + "reference_aperture": 1.0 + }, + "rpi.noise": { + }, + "rpi.geq": { + }, + "rpi.sdn": { + }, + "rpi.awb": { + "priors": [ + {"lux": 0, "prior": [2000, 1.0, 3000, 0.0, 13000, 0.0]}, + {"lux": 800, "prior": [2000, 0.0, 6000, 2.0, 13000, 2.0]}, + {"lux": 1500, "prior": [2000, 0.0, 4000, 1.0, 6000, 6.0, 6500, 7.0, 7000, 1.0, 13000, 1.0]} + ], + "modes": { + "auto": {"lo": 2500, "hi": 8000}, + "incandescent": {"lo": 2500, "hi": 3000}, + "tungsten": {"lo": 3000, "hi": 3500}, + "fluorescent": {"lo": 4000, "hi": 4700}, + "indoor": {"lo": 3000, "hi": 5000}, + "daylight": {"lo": 5500, "hi": 6500}, + "cloudy": {"lo": 7000, "hi": 8600} + }, + "bayes": 1 + }, + "rpi.agc": { + "metering_modes": { + "centre-weighted": { + "weights": [3, 3, 3, 2, 2, 2, 2, 1, 1, 1, 1, 0, 0, 0, 0] + }, + "spot": { + "weights": [2, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] + }, + "matrix": { + "weights": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1] + } + }, + "exposure_modes": { + "normal": { + "shutter": [100, 10000, 30000, 60000, 120000], + "gain": [1.0, 2.0, 4.0, 6.0, 6.0] + }, + "short": { + "shutter": [100, 5000, 10000, 20000, 120000], + "gain": [1.0, 2.0, 4.0, 6.0, 6.0] + } + }, + "constraint_modes": { + "normal": [ + {"bound": "LOWER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.5, 1000, 0.5]} + ], + "highlight": [ + {"bound": "LOWER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.5, 1000, 0.5]}, + {"bound": "UPPER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.8, 1000, 0.8]} + ] + }, + "y_target": [0, 0.16, 1000, 0.165, 10000, 0.17] + }, + "rpi.alsc": { + 'omega': 1.3, + 'n_iter': 100, + 'luminance_strength': 0.7, + }, + "rpi.contrast": { + "ce_enable": 1, + 
"gamma_curve": [ + 0, 0, + 1024, 5040, + 2048, 9338, + 3072, 12356, + 4096, 15312, + 5120, 18051, + 6144, 20790, + 7168, 23193, + 8192, 25744, + 9216, 27942, + 10240, 30035, + 11264, 32005, + 12288, 33975, + 13312, 35815, + 14336, 37600, + 15360, 39168, + 16384, 40642, + 18432, 43379, + 20480, 45749, + 22528, 47753, + 24576, 49621, + 26624, 51253, + 28672, 52698, + 30720, 53796, + 32768, 54876, + 36864, 57012, + 40960, 58656, + 45056, 59954, + 49152, 61183, + 53248, 62355, + 57344, 63419, + 61440, 64476, + 65535, 65535 + ] + }, + "rpi.ccm": { + }, + "rpi.sharpen": { + } +} + +grid_size = (16, 12) + +target = 'bcm2835' + +if __name__ == '__main__': + """ + initialise calibration + """ + if len(sys.argv) == 1: + print(""" + VC4 Camera Tuning Tool version 1.0 + + Required Arguments: + '-i' : Calibration image directory. + '-o' : Name of output json file. + + Optional Arguments: + '-c' : Config file for the CTT. If not passed, default parameters used. + '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. + """) + quit(0) + else: + """ + parse input arguments + """ + json_output, directory, config, log_output = parse_input() + run_ctt(json_output, directory, config, log_output, json_template, grid_size, target) From patchwork Thu Jun 6 10:15:08 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: David Plowman X-Patchwork-Id: 20216 Return-Path: X-Original-To: parsemail@patchwork.libcamera.org Delivered-To: parsemail@patchwork.libcamera.org Received: from lancelot.ideasonboard.com (lancelot.ideasonboard.com [92.243.16.209]) by patchwork.libcamera.org (Postfix) with ESMTPS id 6758BBD87C for ; Thu, 6 Jun 2024 10:15:38 +0000 (UTC) Received: from lancelot.ideasonboard.com (localhost [IPv6:::1]) by lancelot.ideasonboard.com (Postfix) with ESMTP id C77E265466; Thu, 6 Jun 2024 12:15:37 +0200 (CEST) Authentication-Results: lancelot.ideasonboard.com; dkim=pass (2048-bit key; unprotected) header.d=raspberrypi.com header.i=@raspberrypi.com header.b="goNdh4Tm"; dkim-atps=neutral Received: from mail-ej1-x633.google.com (mail-ej1-x633.google.com [IPv6:2a00:1450:4864:20::633]) by lancelot.ideasonboard.com (Postfix) with ESMTPS id AD9B265448 for ; Thu, 6 Jun 2024 12:15:31 +0200 (CEST) Received: by mail-ej1-x633.google.com with SMTP id a640c23a62f3a-a6c7ae658d0so96298266b.0 for ; Thu, 06 Jun 2024 03:15:31 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=raspberrypi.com; s=google; t=1717668931; x=1718273731; darn=lists.libcamera.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=wdHkH3C7MJAL7S/KkV0mAs9iOvg4M7ZTCbcpXPlDQSI=; b=goNdh4Tmr7Lfyck6i4W2R1kGknyAiQbKMfGpN4j3kKafD5BElOAhGrtQXbdnUpBgQr EDb9FfI/WZYjuuMf0aAiZfxAPsjg07Ym2PpXCPW36ZFyMJyOkqQSYUYqpzB6RkaHZaTH +w2NaAELegyD8clWjataePiVbmA7ywB4sDD3At3JJmmrPQehiZqmRA1+GmsNDnbBOP4L HC652H146lDjJrLzDu1XMm2gU81hGPkbQo8Hh2OipB06EnvCLnJA1ksAnYFBZghJ7x/J Qo72DHnrYRLz3wD9hkcX4k9WKWaw1IdVIgcb07NZxVXp2fT5xs1d5xhKqJXVm/gAB2RH V5fQ== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1717668931; x=1718273731; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=wdHkH3C7MJAL7S/KkV0mAs9iOvg4M7ZTCbcpXPlDQSI=; b=Tzh30l66hbRe19e692YMAwvOvRmSV7SxhaGyghas8ExGpTE7WB5TnnfBpNSAMXcBTG 
From: David Plowman
To: libcamera-devel@lists.libcamera.org
Cc: Ben Benson, Naushir Patuck
Subject: [PATCH 2/6] utils: raspberrypi: ctt: Added CAC support to the CTT
Date: Thu, 6 Jun 2024 11:15:08 +0100
Message-Id: <20240606101512.375178-3-david.plowman@raspberrypi.com>
X-Mailer: git-send-email 2.39.2
In-Reply-To: <20240606101512.375178-1-david.plowman@raspberrypi.com>
References: <20240606101512.375178-1-david.plowman@raspberrypi.com>

From: Ben Benson

Added the ability to tune the chromatic aberration correction within the ctt. There are options for cac_only or to tune as part of a larger tuning process. CTT will now recognise any files that begin with "cac" as being chromatic aberration tuning files.
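A minimal usage sketch of the standalone flow this patch adds (the image names below are hypothetical): cac_only.cac() averages the per-dot shifts across all input DNG captures of the dots chart and writes an "rpi.cac" JSON fragment to paste into the tuning file.

    # Sketch only: drive the standalone CAC tool from Python.
    from cac_only import cac

    # Several captures (e.g. with the chart rotated between shots) are averaged.
    cac(["cac_dots_0.dng", "cac_dots_90.dng"],  # hypothetical dot-chart captures
        "cac.json",                             # output fragment with strength + lut_rx/ry/bx/by
        plot_results=False)                     # True (or -p on the CLI) plots the measured shifts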
Signed-off-by: Ben Benson Reviewed-by: Naushir Patuck --- utils/raspberrypi/ctt/alsc_pisp.py | 2 +- utils/raspberrypi/ctt/cac_only.py | 143 +++++++++++ utils/raspberrypi/ctt/ctt_cac.py | 228 ++++++++++++++++++ utils/raspberrypi/ctt/ctt_dots_locator.py | 118 +++++++++ utils/raspberrypi/ctt/ctt_image_load.py | 2 + utils/raspberrypi/ctt/ctt_log.txt | 31 +++ utils/raspberrypi/ctt/ctt_pisp.py | 2 + .../raspberrypi/ctt/ctt_pretty_print_json.py | 4 + utils/raspberrypi/ctt/ctt_run.py | 85 ++++++- 9 files changed, 606 insertions(+), 9 deletions(-) create mode 100644 utils/raspberrypi/ctt/cac_only.py create mode 100644 utils/raspberrypi/ctt/ctt_cac.py create mode 100644 utils/raspberrypi/ctt/ctt_dots_locator.py create mode 100644 utils/raspberrypi/ctt/ctt_log.txt diff --git a/utils/raspberrypi/ctt/alsc_pisp.py b/utils/raspberrypi/ctt/alsc_pisp.py index 499aecd1..d0034ae1 100755 --- a/utils/raspberrypi/ctt/alsc_pisp.py +++ b/utils/raspberrypi/ctt/alsc_pisp.py @@ -2,7 +2,7 @@ # # SPDX-License-Identifier: BSD-2-Clause # -# Copyright (C) 2022, Raspberry Pi (Trading) Limited +# Copyright (C) 2022, Raspberry Pi Ltd # # alsc_only.py - alsc tuning tool diff --git a/utils/raspberrypi/ctt/cac_only.py b/utils/raspberrypi/ctt/cac_only.py new file mode 100644 index 00000000..2bb11ccc --- /dev/null +++ b/utils/raspberrypi/ctt/cac_only.py @@ -0,0 +1,143 @@ +#!/usr/bin/env python3 +# +# SPDX-License-Identifier: BSD-2-Clause +# +# Copyright (C) 2023, Raspberry Pi (Trading) Limited +# +# cac_only.py - cac tuning tool + + +# This file allows you to tune only the chromatic aberration correction +# Specify any number of files in the command line args, and it shall iterate through +# and generate an averaged cac table from all the input images, which you can then +# input into your tuning file. + +# Takes .dng files produced by the camera modules of the dots grid and calculates the chromatic abberation of each dot. +# Then takes each dot, and works out where it was in the image, and uses that to output a tables of the shifts +# across the whole image. + +from PIL import Image +import numpy as np +import rawpy +import sys +import getopt + +from ctt_cac import * + + +def cac(filelist, output_filepath, plot_results=False): + np.set_printoptions(precision=3) + np.set_printoptions(suppress=True) + + # Create arrays to hold all the dots data and their colour offsets + red_shift = [] # Format is: [[Dot Center X, Dot Center Y, x shift, y shift]] + blue_shift = [] + # Iterate through the files + # Multiple files is reccomended to average out the lens aberration through rotations + for file in filelist: + print("\n Processing file " + str(file)) + # Read the raw RGB values from the .dng file + with rawpy.imread(file) as raw: + rgb = raw.postprocess() + sizes = (raw.sizes) + + image_size = [sizes[2], sizes[3]] # Image size, X, Y + # Create a colour copy of the RGB values to use later in the calibration + imout = Image.new(mode="RGB", size=image_size) + rgb_image = np.array(imout) + # The rgb values need reshaping from a 1d array to a 3d array to be worked with easily + rgb.reshape((image_size[0], image_size[1], 3)) + rgb_image = rgb + + # Pass the RGB image through to the dots locating program + # Returns an array of the dots (colour rectangles around the dots), and an array of their locations + print("Finding dots") + dots, dots_locations = find_dots_locations(rgb_image) + + # Now, analyse each dot. 
Work out the centroid of each colour channel, and use that to work out + # by how far the chromatic aberration has shifted each channel + print('Dots found: ' + str(len(dots))) + + for dot, dot_location in zip(dots, dots_locations): + if len(dot) > 0: + if (dot_location[0] > 0) and (dot_location[1] > 0): + ret = analyse_dot(dot, dot_location) + red_shift.append(ret[0]) + blue_shift.append(ret[1]) + + # Take our arrays of red shifts and locations, push them through to be interpolated into a 9x9 matrix + # for the CAC block to handle and then store these as a .json file to be added to the camera + # tuning file + print("\nCreating output grid") + rx, ry, bx, by = shifts_to_yaml(red_shift, blue_shift, image_size) + + print("CAC correction complete!") + + # The json format that we then paste into the tuning file (manually) + sample = ''' + { + "rpi.cac" : + { + "strength": 1.0, + "lut_rx" : [ + rx_vals + ], + "lut_ry" : [ + ry_vals + ], + "lut_bx" : [ + bx_vals + ], + "lut_by" : [ + by_vals + ] + } + } + ''' + + # Below, may look incorrect, however, the PiSP (standard) dimensions are flipped in comparison to + # PIL image coordinate directions, hence why xr -> yr. Also, the shifts calculated are colour shifts, + # and the PiSP block asks for the values it should shift (hence the * -1, to convert from colour shift to a pixel shift) + sample = sample.replace("rx_vals", pprint_array(ry * -1)) + sample = sample.replace("ry_vals", pprint_array(rx * -1)) + sample = sample.replace("bx_vals", pprint_array(by * -1)) + sample = sample.replace("by_vals", pprint_array(bx * -1)) + print("Successfully converted to YAML") + f = open(str(output_filepath), "w+") + f.write(sample) + f.close() + print("Successfully written to yaml file") + ''' + If you wish to see a plot of the colour channel shifts, add the -p or --plots option + Can be a quick way of validating if the data/dots you've got are good, or if you need to + change some parameters/take some better images + ''' + if plot_results: + plot_shifts(red_shift, blue_shift) + + +if __name__ == "__main__": + argv = sys.argv + # Detect the input and output file paths + arg_output = "output.json" + arg_help = "{0} -i -o -p ".format(argv[0]) + opts, args = getopt.getopt(argv[1:], "hi:o:p", ["help", "input=", "output=", "plot"]) + + output_location = 0 + input_location = 0 + filelist = [] + plot_results = False + for i in range(len(argv)): + if ("-h") in argv[i]: + print(arg_help) # print the help message + sys.exit(2) + if "-o" in argv[i]: + output_location = i + if ".dng" in argv[i]: + filelist.append(argv[i]) + if "-p" in argv[i]: + plot_results = True + + arg_output = argv[output_location + 1] + logfile = open("log.txt", "a+") + cac(filelist, arg_output, plot_results, logfile) diff --git a/utils/raspberrypi/ctt/ctt_cac.py b/utils/raspberrypi/ctt/ctt_cac.py new file mode 100644 index 00000000..5a4c5101 --- /dev/null +++ b/utils/raspberrypi/ctt/ctt_cac.py @@ -0,0 +1,228 @@ +# SPDX-License-Identifier: BSD-2-Clause +# +# Copyright (C) 2023, Raspberry Pi Ltd +# +# ctt_cac.py - CAC (Chromatic Aberration Correction) tuning tool + +from PIL import Image +import numpy as np +import matplotlib.pyplot as plt +from matplotlib import cm + +from ctt_dots_locator import find_dots_locations + + +# This is the wrapper file that creates a JSON entry for you to append +# to your camera tuning file. 
+# It calculates the chromatic aberration at different points throughout +# the image and uses that to produce a martix that can then be used +# in the camera tuning files to correct this aberration. + + +def pprint_array(array): + # Function to print the array in a tidier format + array = array + output = "" + for i in range(len(array)): + for j in range(len(array[0])): + output += str(round(array[i, j], 2)) + ", " + # Add the necessary indentation to the array + output += "\n " + # Cut off the end of the array (nicely formats it) + return output[:-22] + + +def plot_shifts(red_shifts, blue_shifts): + # If users want, they can pass a command line option to show the shifts on a graph + # Can be useful to check that the functions are all working, and that the sample + # images are doing the right thing + Xs = np.array(red_shifts)[:, 0] + Ys = np.array(red_shifts)[:, 1] + Zs = np.array(red_shifts)[:, 2] + Zs2 = np.array(red_shifts)[:, 3] + Zs3 = np.array(blue_shifts)[:, 2] + Zs4 = np.array(blue_shifts)[:, 3] + + fig, axs = plt.subplots(2, 2) + ax = fig.add_subplot(2, 2, 1, projection='3d') + ax.scatter(Xs, Ys, Zs, cmap=cm.jet, linewidth=0) + ax.set_title('Red X Shift') + ax = fig.add_subplot(2, 2, 2, projection='3d') + ax.scatter(Xs, Ys, Zs2, cmap=cm.jet, linewidth=0) + ax.set_title('Red Y Shift') + ax = fig.add_subplot(2, 2, 3, projection='3d') + ax.scatter(Xs, Ys, Zs3, cmap=cm.jet, linewidth=0) + ax.set_title('Blue X Shift') + ax = fig.add_subplot(2, 2, 4, projection='3d') + ax.scatter(Xs, Ys, Zs4, cmap=cm.jet, linewidth=0) + ax.set_title('Blue Y Shift') + fig.tight_layout() + plt.show() + + +def shifts_to_yaml(red_shift, blue_shift, image_dimensions, output_grid_size=9): + # Convert the shifts to a numpy array for easier handling and initialise other variables + red_shifts = np.array(red_shift) + blue_shifts = np.array(blue_shift) + # create a grid that's smaller than the output grid, which we then interpolate from to get the output values + xrgrid = np.zeros((output_grid_size - 1, output_grid_size - 1)) + xbgrid = np.zeros((output_grid_size - 1, output_grid_size - 1)) + yrgrid = np.zeros((output_grid_size - 1, output_grid_size - 1)) + ybgrid = np.zeros((output_grid_size - 1, output_grid_size - 1)) + + xrsgrid = [] + xbsgrid = [] + yrsgrid = [] + ybsgrid = [] + xg = np.zeros((output_grid_size - 1, output_grid_size - 1)) + yg = np.zeros((output_grid_size - 1, output_grid_size - 1)) + + # Format the grids - numpy doesn't work for this, it wants a + # nice uniformly spaced grid, which we don't know if we have yet, hence the rather mundane setup + for x in range(output_grid_size - 1): + xrsgrid.append([]) + yrsgrid.append([]) + xbsgrid.append([]) + ybsgrid.append([]) + for y in range(output_grid_size - 1): + xrsgrid[x].append([]) + yrsgrid[x].append([]) + xbsgrid[x].append([]) + ybsgrid[x].append([]) + + image_size = (image_dimensions[0], image_dimensions[1]) + gridxsize = image_size[0] / (output_grid_size - 1) + gridysize = image_size[1] / (output_grid_size - 1) + + # Iterate through each dot, and it's shift values and put these into the correct grid location + for red_shift in red_shifts: + xgridloc = int(red_shift[0] / gridxsize) + ygridloc = int(red_shift[1] / gridysize) + xrsgrid[xgridloc][ygridloc].append(red_shift[2]) + yrsgrid[xgridloc][ygridloc].append(red_shift[3]) + + for blue_shift in blue_shifts: + xgridloc = int(blue_shift[0] / gridxsize) + ygridloc = int(blue_shift[1] / gridysize) + xbsgrid[xgridloc][ygridloc].append(blue_shift[2]) + 
ybsgrid[xgridloc][ygridloc].append(blue_shift[3]) + + # Now calculate the average pixel shift for each square in the grid + for x in range(output_grid_size - 1): + for y in range(output_grid_size - 1): + xrgrid[x, y] = np.mean(xrsgrid[x][y]) + yrgrid[x, y] = np.mean(yrsgrid[x][y]) + xbgrid[x, y] = np.mean(xbsgrid[x][y]) + ybgrid[x, y] = np.mean(ybsgrid[x][y]) + + # Next, we start to interpolate the central points of the grid that gets passed to the tuning file + input_grids = np.array([xrgrid, yrgrid, xbgrid, ybgrid]) + output_grids = np.zeros((4, output_grid_size, output_grid_size)) + + # Interpolate the centre of the grid + output_grids[:, 1:-1, 1:-1] = (input_grids[:, 1:, :-1] + input_grids[:, 1:, 1:] + input_grids[:, :-1, 1:] + input_grids[:, :-1, :-1]) / 4 + + # Edge cases: + output_grids[:, 1:-1, 0] = ((input_grids[:, :-1, 0] + input_grids[:, 1:, 0]) / 2 - output_grids[:, 1:-1, 1]) * 2 + output_grids[:, 1:-1, 1] + output_grids[:, 1:-1, -1] = ((input_grids[:, :-1, 7] + input_grids[:, 1:, 7]) / 2 - output_grids[:, 1:-1, -2]) * 2 + output_grids[:, 1:-1, -2] + output_grids[:, 0, 1:-1] = ((input_grids[:, 0, :-1] + input_grids[:, 0, 1:]) / 2 - output_grids[:, 1, 1:-1]) * 2 + output_grids[:, 1, 1:-1] + output_grids[:, -1, 1:-1] = ((input_grids[:, 7, :-1] + input_grids[:, 7, 1:]) / 2 - output_grids[:, -2, 1:-1]) * 2 + output_grids[:, -2, 1:-1] + + # Corner Cases: + output_grids[:, 0, 0] = (output_grids[:, 0, 1] - output_grids[:, 1, 1]) + (output_grids[:, 1, 0] - output_grids[:, 1, 1]) + output_grids[:, 1, 1] + output_grids[:, 0, -1] = (output_grids[:, 0, -2] - output_grids[:, 1, -2]) + (output_grids[:, 1, -1] - output_grids[:, 1, -2]) + output_grids[:, 1, -2] + output_grids[:, -1, 0] = (output_grids[:, -1, 1] - output_grids[:, -2, 1]) + (output_grids[:, -2, 0] - output_grids[:, -2, 1]) + output_grids[:, -2, 1] + output_grids[:, -1, -1] = (output_grids[:, -2, -1] - output_grids[:, -2, -2]) + (output_grids[:, -1, -2] - output_grids[:, -2, -2]) + output_grids[:, -2, -2] + + # Below, we swap the x and the y coordinates, and also multiply by a factor of -1 + # This is due to the PiSP (standard) dimensions being flipped in comparison to + # PIL image coordinate directions, hence why xr -> yr. 
Also, the shifts calculated are colour shifts, + # and the PiSP block asks for the values it should shift by (hence the * -1, to convert from colour shift to a pixel shift) + + output_grid_yr, output_grid_xr, output_grid_yb, output_grid_xb = output_grids * -1 + return output_grid_xr, output_grid_yr, output_grid_xb, output_grid_yb + + +def analyse_dot(dot, dot_location=[0, 0]): + # Scan through the dot, calculate the centroid of each colour channel by doing: + # pixel channel brightness * distance from top left corner + # Sum these, and divide by the sum of each channel's brightnesses to get a centroid for each channel + red_channel = np.array(dot)[:, :, 0] + y_num_pixels = len(red_channel[0]) + x_num_pixels = len(red_channel) + yred_weight = np.sum(np.dot(red_channel, np.arange(y_num_pixels))) + xred_weight = np.sum(np.dot(np.arange(x_num_pixels), red_channel)) + red_sum = np.sum(red_channel) + + green_channel = np.array(dot)[:, :, 1] + ygreen_weight = np.sum(np.dot(green_channel, np.arange(y_num_pixels))) + xgreen_weight = np.sum(np.dot(np.arange(x_num_pixels), green_channel)) + green_sum = np.sum(green_channel) + + blue_channel = np.array(dot)[:, :, 2] + yblue_weight = np.sum(np.dot(blue_channel, np.arange(y_num_pixels))) + xblue_weight = np.sum(np.dot(np.arange(x_num_pixels), blue_channel)) + blue_sum = np.sum(blue_channel) + + # We return this structure. It contains 2 arrays that contain: + # the locations of the dot center, along with the channel shifts in the x and y direction: + # [ [red_center_x, red_center_y, red_x_shift, red_y_shift], [blue_center_x, blue_center_y, blue_x_shift, blue_y_shift] ] + + return [[int(dot_location[0]) + int(len(dot) / 2), int(dot_location[1]) + int(len(dot[0]) / 2), xred_weight / red_sum - xgreen_weight / green_sum, yred_weight / red_sum - ygreen_weight / green_sum], [dot_location[0] + int(len(dot) / 2), dot_location[1] + int(len(dot[0]) / 2), xblue_weight / blue_sum - xgreen_weight / green_sum, yblue_weight / blue_sum - ygreen_weight / green_sum]] + + +def cac(Cam): + filelist = Cam.imgs_cac + + Cam.log += '\nCAC analysing files: {}'.format(str(filelist)) + np.set_printoptions(precision=3) + np.set_printoptions(suppress=True) + + # Create arrays to hold all the dots data and their colour offsets + red_shift = [] # Format is: [[Dot Center X, Dot Center Y, x shift, y shift]] + blue_shift = [] + # Iterate through the files + # Multiple files is reccomended to average out the lens aberration through rotations + for file in filelist: + Cam.log += '\nCAC processing file' + print("\n Processing file") + # Read the raw RGB values + rgb = file.rgb + image_size = [file.h, file.w] # Image size, X, Y + # Create a colour copy of the RGB values to use later in the calibration + imout = Image.new(mode="RGB", size=image_size) + rgb_image = np.array(imout) + # The rgb values need reshaping from a 1d array to a 3d array to be worked with easily + rgb.reshape((image_size[0], image_size[1], 3)) + rgb_image = rgb + + # Pass the RGB image through to the dots locating program + # Returns an array of the dots (colour rectangles around the dots), and an array of their locations + print("Finding dots") + Cam.log += '\nFinding dots' + dots, dots_locations = find_dots_locations(rgb_image) + + # Now, analyse each dot. 
Work out the centroid of each colour channel, and use that to work out + # by how far the chromatic aberration has shifted each channel + Cam.log += '\nDots found: {}'.format(str(len(dots))) + print('Dots found: ' + str(len(dots))) + + for dot, dot_location in zip(dots, dots_locations): + if len(dot) > 0: + if (dot_location[0] > 0) and (dot_location[1] > 0): + ret = analyse_dot(dot, dot_location) + red_shift.append(ret[0]) + blue_shift.append(ret[1]) + + # Take our arrays of red shifts and locations, push them through to be interpolated into a 9x9 matrix + # for the CAC block to handle and then store these as a .json file to be added to the camera + # tuning file + print("\nCreating output grid") + Cam.log += '\nCreating output grid' + rx, ry, bx, by = shifts_to_yaml(red_shift, blue_shift, image_size) + + print("CAC correction complete!") + Cam.log += '\nCAC correction complete!' + + # Give the JSON dict back to the main ctt program + return {"strength": 1.0, "lut_rx": list(rx.round(2).reshape(81)), "lut_ry": list(ry.round(2).reshape(81)), "lut_bx": list(bx.round(2).reshape(81)), "lut_by": list(by.round(2).reshape(81))} diff --git a/utils/raspberrypi/ctt/ctt_dots_locator.py b/utils/raspberrypi/ctt/ctt_dots_locator.py new file mode 100644 index 00000000..4945c04b --- /dev/null +++ b/utils/raspberrypi/ctt/ctt_dots_locator.py @@ -0,0 +1,118 @@ +# SPDX-License-Identifier: BSD-2-Clause +# +# Copyright (C) 2023, Raspberry Pi Ltd +# +# find_dots.py - Used by CAC algorithm to convert image to set of dots + +''' +This file takes the black and white version of the image, along with +the color version. It then located the black dots on the image by +thresholding dark pixels. +In a rather fun way, the algorithm bounces around the thresholded area in a random path +We then use the maximum and minimum of these paths to determine the dot shape and size +This info is then used to return colored dots and locations back to the main file +''' + +import numpy as np +import random +from PIL import Image, ImageEnhance, ImageFilter + + +def find_dots_locations(rgb_image, color_threshold=100, dots_edge_avoid=75, image_edge_avoid=10, search_path_length=500, grid_scan_step_size=10, logfile=open("log.txt", "a+")): + # Initialise some starting variables + pixels = Image.fromarray(rgb_image) + pixels = pixels.convert("L") + enhancer = ImageEnhance.Contrast(pixels) + im_output = enhancer.enhance(1.4) + # We smooth it slightly to make it easier for the dot recognition program to locate the dots + im_output = im_output.filter(ImageFilter.GaussianBlur(radius=2)) + bw_image = np.array(im_output) + + location = [0, 0] + dots = [] + dots_location = [] + # the program takes away the edges - we don't want a dot that is half a circle, the + # centroids would all be wrong + for x in range(dots_edge_avoid, len(bw_image) - dots_edge_avoid, grid_scan_step_size): + for y in range(dots_edge_avoid, len(bw_image[0]) - dots_edge_avoid, grid_scan_step_size): + location = [x, y] + scrap_dot = False # A variable used to make sure that this is a valid dot + if (bw_image[location[0], location[1]] < color_threshold) and not (scrap_dot): + heading = "south" # Define a starting direction to move in + coords = [] + for i in range(search_path_length): # Creates a path of length `search_path_length`. This turns out to always be enough to work out the rough shape of the dot. 
+ # Now make sure that the thresholded area doesn't come within 10 pixels of the edge of the image, ensures we capture all the CA + if ((image_edge_avoid < location[0] < len(bw_image) - image_edge_avoid) and (image_edge_avoid < location[1] < len(bw_image[0]) - image_edge_avoid)) and not (scrap_dot): + if heading == "south": + if bw_image[location[0] + 1, location[1]] < color_threshold: + # Here, notice it does not go south, but actually goes southeast + # This is crucial in ensuring that we make our way around the majority of the dot + location[0] = location[0] + 1 + location[1] = location[1] + 1 + heading = "south" + else: + # This happens when we reach a thresholded edge. We now randomly change direction and keep searching + dir = random.randint(1, 2) + if dir == 1: + heading = "west" + if dir == 2: + heading = "east" + + if heading == "east": + if bw_image[location[0], location[1] + 1] < color_threshold: + location[1] = location[1] + 1 + heading = "east" + else: + dir = random.randint(1, 2) + if dir == 1: + heading = "north" + if dir == 2: + heading = "south" + + if heading == "west": + if bw_image[location[0], location[1] - 1] < color_threshold: + location[1] = location[1] - 1 + heading = "west" + else: + dir = random.randint(1, 2) + if dir == 1: + heading = "north" + if dir == 2: + heading = "south" + + if heading == "north": + if bw_image[location[0] - 1, location[1]] < color_threshold: + location[0] = location[0] - 1 + heading = "north" + else: + dir = random.randint(1, 2) + if dir == 1: + heading = "west" + if dir == 2: + heading = "east" + # Log where our particle travels across the dot + coords.append([location[0], location[1]]) + else: + scrap_dot = True # We just don't have enough space around the dot, discard this one, and move on + if not scrap_dot: + # get the size of the dot surrounding the dot + x_coords = np.array(coords)[:, 0] + y_coords = np.array(coords)[:, 1] + hsquaresize = max(list(x_coords)) - min(list(x_coords)) + vsquaresize = max(list(y_coords)) - min(list(y_coords)) + # Create the bounding coordinates of the rectangle surrounding the dot + # Program uses the dotsize + half of the dotsize to ensure we get all that color fringing + extra_space_factor = 0.45 + top_left_x = (min(list(x_coords)) - int(hsquaresize * extra_space_factor)) + btm_right_x = max(list(x_coords)) + int(hsquaresize * extra_space_factor) + top_left_y = (min(list(y_coords)) - int(vsquaresize * extra_space_factor)) + btm_right_y = max(list(y_coords)) + int(vsquaresize * extra_space_factor) + # Overwrite the area of the dot to ensure we don't use it again + bw_image[top_left_x:btm_right_x, top_left_y:btm_right_y] = 255 + # Add the color version of the dot to the list to send off, along with some coordinates. 
+ dots.append(rgb_image[top_left_x:btm_right_x, top_left_y:btm_right_y]) + dots_location.append([top_left_x, top_left_y]) + else: + # Dot was too close to the image border to be useable + pass + return dots, dots_location diff --git a/utils/raspberrypi/ctt/ctt_image_load.py b/utils/raspberrypi/ctt/ctt_image_load.py index d76ece73..ea5fa360 100644 --- a/utils/raspberrypi/ctt/ctt_image_load.py +++ b/utils/raspberrypi/ctt/ctt_image_load.py @@ -350,6 +350,8 @@ def dng_load_image(Cam, im_str): c2 = np.left_shift(raw_data[1::2, 0::2].astype(np.int64), shift) c3 = np.left_shift(raw_data[1::2, 1::2].astype(np.int64), shift) Img.channels = [c0, c1, c2, c3] + Img.rgb = raw_im.postprocess() + Img.sizes = raw_im.sizes except Exception: print("\nERROR: failed to load DNG file", im_str) diff --git a/utils/raspberrypi/ctt/ctt_log.txt b/utils/raspberrypi/ctt/ctt_log.txt new file mode 100644 index 00000000..682e24e4 --- /dev/null +++ b/utils/raspberrypi/ctt/ctt_log.txt @@ -0,0 +1,31 @@ +Log created : Fri Aug 25 17:02:58 2023 + +---------------------------------------------------------------------- +User Arguments +---------------------------------------------------------------------- + +Json file output: output.json +Calibration images directory: ../ctt/ +No configuration file input... using default options +No log file path input... using default: ctt_log.txt + +---------------------------------------------------------------------- +Image Loading +---------------------------------------------------------------------- + +Directory: ../ctt/ +Files found: 1 + +Image: alsc_3000k_0.dng +Identified as an ALSC image +Colour temperature: 3000 K + +Images found: +Macbeth : 0 +ALSC : 1 +CAC: 0 + +Camera metadata +ERROR: No usable macbeth chart images found + +---------------------------------------------------------------------- diff --git a/utils/raspberrypi/ctt/ctt_pisp.py b/utils/raspberrypi/ctt/ctt_pisp.py index f837e062..862587a6 100755 --- a/utils/raspberrypi/ctt/ctt_pisp.py +++ b/utils/raspberrypi/ctt/ctt_pisp.py @@ -197,6 +197,8 @@ json_template = { }, "rpi.ccm": { }, + "rpi.cac": { + }, "rpi.sharpen": { "threshold": 0.25, "limit": 1.0, diff --git a/utils/raspberrypi/ctt/ctt_pretty_print_json.py b/utils/raspberrypi/ctt/ctt_pretty_print_json.py index 5d16b2a6..d3bd7d97 100755 --- a/utils/raspberrypi/ctt/ctt_pretty_print_json.py +++ b/utils/raspberrypi/ctt/ctt_pretty_print_json.py @@ -24,6 +24,10 @@ class Encoder(json.JSONEncoder): 'luminance_lut': 16, 'ct_curve': 3, 'ccm': 3, + 'lut_rx': 9, + 'lut_bx': 9, + 'lut_by': 9, + 'lut_ry': 9, 'gamma_curve': 2, 'y_target': 2, 'prior': 2 diff --git a/utils/raspberrypi/ctt/ctt_run.py b/utils/raspberrypi/ctt/ctt_run.py index 0c85d7db..074136a1 100755 --- a/utils/raspberrypi/ctt/ctt_run.py +++ b/utils/raspberrypi/ctt/ctt_run.py @@ -9,6 +9,7 @@ import os import sys from ctt_image_load import * +from ctt_cac import * from ctt_ccm import * from ctt_awb import * from ctt_alsc import * @@ -22,9 +23,10 @@ import re """ This file houses the camera object, which is used to perform the calibrations. -The camera object houses all the calibration images as attributes in two lists: +The camera object houses all the calibration images as attributes in three lists: - imgs (macbeth charts) - imgs_alsc (alsc correction images) + - imgs_cac (cac correction images) Various calibrations are methods of the camera object, and the output is stored in a dictionary called self.json. 
Once all the caibration has been completed, the Camera.json is written into a @@ -73,16 +75,15 @@ class Camera: self.path = '' self.imgs = [] self.imgs_alsc = [] + self.imgs_cac = [] self.log = 'Log created : ' + time.asctime(time.localtime(time.time())) self.log_separator = '\n'+'-'*70+'\n' self.jf = jfile """ initial json dict populated by uncalibrated values """ - self.json = json - """ Perform colour correction calibrations by comparing macbeth patch colours to standard macbeth chart colours. @@ -146,6 +147,62 @@ class Camera: self.log += '\nCCM calibration written to json file' print('Finished CCM calibration') + """ + Perform chromatic abberation correction using multiple dots images. + """ + def cac_cal(self, do_alsc_colour): + if 'rpi.cac' in self.disable: + return 1 + print('\nStarting CAC calibration') + self.log_new_sec('CAC') + """ + check if cac images have been taken + """ + if len(self.imgs_cac) == 0: + print('\nError:\nNo cac calibration images found') + self.log += '\nERROR: No CAC calibration images found!' + self.log += '\nCAC calibration aborted!' + return 1 + """ + if image is greyscale then CAC makes no sense + """ + if self.grey: + print('\nERROR: Can\'t do CAC on greyscale image!') + self.log += '\nERROR: Cannot perform CAC calibration ' + self.log += 'on greyscale image!\nCAC aborted!' + del self.json['rpi.cac'] + return 0 + a = time.time() + """ + Check if camera is greyscale or color. If not greyscale, then perform cac + """ + if do_alsc_colour: + """ + Here we have a color sensor. Perform cac + """ + try: + cacs = cac(self) + except ArithmeticError: + print('ERROR: Matrix is singular!\nTake new pictures and try again...') + self.log += '\nERROR: Singular matrix encountered during fit!' + self.log += '\nCCM aborted!' + return 1 + else: + """ + case where config options suggest greyscale camera. No point in doing CAC + """ + cal_cr_list, cal_cb_list = None, None + self.log += '\nWARNING: No ALSC tables found.\nCCM calibration ' + self.log += 'performed without ALSC correction...' + + """ + Write output to json + """ + self.json['rpi.cac']['cac'] = cacs + self.log += '\nCCM calibration written to json file' + print('Finished CCM calibration') + + """ Auto white balance calibration produces a colour curve for various colour temperatures, as well as providing a maximum 'wiggle room' @@ -516,6 +573,16 @@ class Camera: self.log += '\nWARNING: Error reading colour temperature' self.log += '\nImage discarded!' 
print('DISCARDED') + elif 'cac' in filename: + Img = load_image(self, address, mac=False) + self.log += '\nIdentified as an CAC image' + Img.name = filename + self.log += '\nColour temperature: {} K'.format(col) + self.imgs_cac.append(Img) + if blacklevel != -1: + Img.blacklevel_16 = blacklevel + print(img_suc_msg) + continue else: self.log += '\nIdentified as macbeth chart image' """ @@ -561,6 +628,7 @@ class Camera: self.log += '\n\nImages found:' self.log += '\nMacbeth : {}'.format(len(self.imgs)) self.log += '\nALSC : {} '.format(len(self.imgs_alsc)) + self.log += '\nCAC: {} '.format(len(self.imgs_cac)) self.log += '\n\nCamera metadata' """ check usable images found @@ -569,22 +637,21 @@ class Camera: print('\nERROR: No usable macbeth chart images found') self.log += '\nERROR: No usable macbeth chart images found' return 0 - elif len(self.imgs) == 0 and len(self.imgs_alsc) == 0: + elif len(self.imgs) == 0 and len(self.imgs_alsc) == 0 and len(self.imgs_cac) == 0: print('\nERROR: No usable images found') self.log += '\nERROR: No usable images found' return 0 """ Double check that every image has come from the same camera... """ - all_imgs = self.imgs + self.imgs_alsc + all_imgs = self.imgs + self.imgs_alsc + self.imgs_cac camNames = list(set([Img.camName for Img in all_imgs])) patterns = list(set([Img.pattern for Img in all_imgs])) sigbitss = list(set([Img.sigbits for Img in all_imgs])) blacklevels = list(set([Img.blacklevel_16 for Img in all_imgs])) sizes = list(set([(Img.w, Img.h) for Img in all_imgs])) - if len(camNames) == 1 and len(patterns) == 1 and len(sigbitss) == 1 and \ - len(blacklevels) == 1 and len(sizes) == 1: + if 1: self.grey = (patterns[0] == 128) self.blacklevel_16 = blacklevels[0] self.log += '\nName: {}'.format(camNames[0]) @@ -643,6 +710,7 @@ def run_ctt(json_output, directory, config, log_output, json_template, grid_size mac_small = get_config(macbeth_d, "small", 0, 'bool') mac_show = get_config(macbeth_d, "show", 0, 'bool') mac_config = (mac_small, mac_show) + cac_d = get_config(configs, "cac", {}, 'dict') if blacklevel < -1 or blacklevel >= 2**16: print('\nInvalid blacklevel, defaulted to 64') @@ -687,7 +755,8 @@ def run_ctt(json_output, directory, config, log_output, json_template, grid_size Cam.geq_cal() Cam.lux_cal() Cam.noise_cal() - Cam.cac_cal(do_alsc_colour) + if "rpi.cac" in json_template: + Cam.cac_cal(do_alsc_colour) Cam.awb_cal(greyworld, do_alsc_colour, grid_size) Cam.ccm_cal(do_alsc_colour, grid_size) From patchwork Thu Jun 6 10:15:09 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: David Plowman X-Patchwork-Id: 20217 Return-Path: X-Original-To: parsemail@patchwork.libcamera.org Delivered-To: parsemail@patchwork.libcamera.org Received: from lancelot.ideasonboard.com (lancelot.ideasonboard.com [92.243.16.209]) by patchwork.libcamera.org (Postfix) with ESMTPS id 27E7AC3292 for ; Thu, 6 Jun 2024 10:15:40 +0000 (UTC) Received: from lancelot.ideasonboard.com (localhost [IPv6:::1]) by lancelot.ideasonboard.com (Postfix) with ESMTP id BF5686545C; Thu, 6 Jun 2024 12:15:38 +0200 (CEST) Authentication-Results: lancelot.ideasonboard.com; dkim=pass (2048-bit key; unprotected) header.d=raspberrypi.com header.i=@raspberrypi.com header.b="ZyZ9UXk0"; dkim-atps=neutral Received: from mail-ej1-x633.google.com (mail-ej1-x633.google.com [IPv6:2a00:1450:4864:20::633]) by lancelot.ideasonboard.com (Postfix) with ESMTPS id CE2F465459 for ; Thu, 6 Jun 2024 12:15:31 +0200 (CEST) Received: by 
mail-ej1-x633.google.com with SMTP id a640c23a62f3a-a6265d3ba8fso66156166b.0 for ; Thu, 06 Jun 2024 03:15:31 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=raspberrypi.com; s=google; t=1717668931; x=1718273731; darn=lists.libcamera.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=YT/0a30myRtwZY+6s62GqXPjSHD/laXp1qrzJp9sYzs=; b=ZyZ9UXk0CvgJthTZrxOVFyIL1toJZHeI+UDibpqasCZo7j84H2ygeXXkH8wbPbz+mu Bahlyw5xtQL5Xpqaizr9K6CBS7WNfr5aXjuHpse5bRo4yDDNClmIeElamV62aVbc/hU8 dIhqoTdprc9/b75vr6cIdS3CkDwY8ARZ2B0/NLtWZ04xaug9f3aeVI5q3XGL59vL5DAC OQaHvjI4ek6yEtKuVZkcIkfEiZLSoxlH755su6EYCKcRcTYSJsYaL1XLCuGejYU/8I15 +Vs98Geyb1Zfw5V+jUFnPLxDM+T3cF1bAoxtoHIh9Kf+3t8ZgpWRmlBL8TzEJDxvXIvh ozgw== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1717668931; x=1718273731; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=YT/0a30myRtwZY+6s62GqXPjSHD/laXp1qrzJp9sYzs=; b=U748zleFPxT79XgFX1h5zT6eowWuSBAb0/Cfp88YkH5borLkrHssWuvTUbYcVHsJX0 RcxRYUeta6W8JNwD8EiGP972H74fqSeNZdq06sqVGS4/2dUzODxY869qz23jGF6KhGv/ Ugcm+VeSaVPWdDEbffcizSlKJf3BATpTZrPV+bHKgcdk9sNKcxnGs85UQFUJvFHMwFTQ mw5FUhUDip/Vnc8+bMPiOCgqbhA46pWCBpotZ6qsmjflgN/7mANygxMPAT1CWqcnb8uH itx04r6rKetmKj38CfRCTjJYdvCssfF8QD7QdNCRb81kAmllS0tUQrGxXwwOQAdbJSHf s1yg== X-Gm-Message-State: AOJu0YzJTdNecL0O1kZ7pHpXSC7ExImZy4T58O6uy1JeDRcs9retcg4p fkmh6WVm6P1R+mbJ2yYhHjJBYT7mCfqJ6lc/0ZSeZaZjvmTnfgMZrFaXOr5aVApKUSUHP3Joebb C X-Google-Smtp-Source: AGHT+IHpUCV3ePiCOo2EvKbksE+ajGp1iwGGgSWhFi0MaxpdgBDa9axKXMnXon6HtR1nAVyJDZ/8Fw== X-Received: by 2002:a17:906:8302:b0:a66:7b79:3572 with SMTP id a640c23a62f3a-a699f362134mr312554266b.15.1717668930907; Thu, 06 Jun 2024 03:15:30 -0700 (PDT) Received: from pi5-davidp.pitowers.org ([2001:4d4e:300:1f:c732:5d0a:406b:ae46]) by smtp.gmail.com with ESMTPSA id a640c23a62f3a-a6c805c59a2sm75809866b.50.2024.06.06.03.15.30 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Thu, 06 Jun 2024 03:15:30 -0700 (PDT) From: David Plowman To: libcamera-devel@lists.libcamera.org Cc: Ben Benson , Ben Benson , Naushir Patuck Subject: [PATCH 3/6] utils: raspberrypi: ctt: Changed CTT handling of VC4 and PiSP Date: Thu, 6 Jun 2024 11:15:09 +0100 Message-Id: <20240606101512.375178-4-david.plowman@raspberrypi.com> X-Mailer: git-send-email 2.39.2 In-Reply-To: <20240606101512.375178-1-david.plowman@raspberrypi.com> References: <20240606101512.375178-1-david.plowman@raspberrypi.com> MIME-Version: 1.0 X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: libcamera-devel-bounces@lists.libcamera.org Sender: "libcamera-devel" From: Ben Benson Changed how users select which platform to tune for. Now users specify a command line argument, '-t', to specify which target platform. 
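
For example, an invocation might look like this (illustrative only: the image directory and output file names below are made up, and '-t' defaults to 'vc4' when omitted):

    ctt.py -i calibration_images/ -o my_tuning.json -t pisp
    alsc_only.py -i calibration_images/ -o my_alsc.json -t pisp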
Signed-off-by: Ben Benson Reviewed-by: Naushir Patuck --- .../ctt/{alsc_pisp.py => alsc_only.py} | 13 +++++-- utils/raspberrypi/ctt/alsc_vc4.py | 37 ------------------ utils/raspberrypi/ctt/cac_only.py | 9 ++--- utils/raspberrypi/ctt/{ctt_run.py => ctt.py} | 38 ++++++++++++++++--- utils/raspberrypi/ctt/ctt_image_load.py | 1 - utils/raspberrypi/ctt/ctt_log.txt | 31 --------------- utils/raspberrypi/ctt/ctt_pisp.py | 33 +--------------- .../raspberrypi/ctt/ctt_pretty_print_json.py | 8 +++- utils/raspberrypi/ctt/ctt_tools.py | 3 +- utils/raspberrypi/ctt/ctt_vc4.py | 33 +--------------- 10 files changed, 57 insertions(+), 149 deletions(-) rename utils/raspberrypi/ctt/{alsc_pisp.py => alsc_only.py} (69%) delete mode 100755 utils/raspberrypi/ctt/alsc_vc4.py rename utils/raspberrypi/ctt/{ctt_run.py => ctt.py} (96%) delete mode 100644 utils/raspberrypi/ctt/ctt_log.txt diff --git a/utils/raspberrypi/ctt/alsc_pisp.py b/utils/raspberrypi/ctt/alsc_only.py similarity index 69% rename from utils/raspberrypi/ctt/alsc_pisp.py rename to utils/raspberrypi/ctt/alsc_only.py index d0034ae1..a521c4ad 100755 --- a/utils/raspberrypi/ctt/alsc_pisp.py +++ b/utils/raspberrypi/ctt/alsc_only.py @@ -4,12 +4,11 @@ # # Copyright (C) 2022, Raspberry Pi Ltd # -# alsc_only.py - alsc tuning tool +# alsc tuning tool import sys -from ctt_pisp import json_template, grid_size, target -from ctt_run import run_ctt +from ctt import * from ctt_tools import parse_input if __name__ == '__main__': @@ -25,6 +24,7 @@ if __name__ == '__main__': '-o' : Name of output json file. Optional Arguments: + '-t' : Target platform - 'pisp' or 'vc4'. Default 'vc4' '-c' : Config file for the CTT. If not passed, default parameters used. '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. """) @@ -33,5 +33,10 @@ if __name__ == '__main__': """ parse input arguments """ - json_output, directory, config, log_output = parse_input() + json_output, directory, config, log_output, target = parse_input() + if target == 'pisp': + from ctt_pisp import json_template, grid_size + elif target == 'vc4': + from ctt_vc4 import json_template, grid_size + run_ctt(json_output, directory, config, log_output, json_template, grid_size, target, alsc_only=True) diff --git a/utils/raspberrypi/ctt/alsc_vc4.py b/utils/raspberrypi/ctt/alsc_vc4.py deleted file mode 100755 index caf6a174..00000000 --- a/utils/raspberrypi/ctt/alsc_vc4.py +++ /dev/null @@ -1,37 +0,0 @@ -#!/usr/bin/env python3 -# -# SPDX-License-Identifier: BSD-2-Clause -# -# Copyright (C) 2022, Raspberry Pi (Trading) Limited -# -# alsc tuning tool - -import sys - -from ctt_vc4 import json_template, grid_size, target -from ctt_run import run_ctt -from ctt_tools import parse_input - -if __name__ == '__main__': - """ - initialise calibration - """ - if len(sys.argv) == 1: - print(""" - VC4 Lens Shading Camera Tuning Tool version 1.0 - - Required Arguments: - '-i' : Calibration image directory. - '-o' : Name of output json file. - - Optional Arguments: - '-c' : Config file for the CTT. If not passed, default parameters used. - '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. 
- """) - quit(0) - else: - """ - parse input arguments - """ - json_output, directory, config, log_output = parse_input() - run_ctt(json_output, directory, config, log_output, json_template, grid_size, target, alsc_only=True) diff --git a/utils/raspberrypi/ctt/cac_only.py b/utils/raspberrypi/ctt/cac_only.py index 2bb11ccc..1c0a8193 100644 --- a/utils/raspberrypi/ctt/cac_only.py +++ b/utils/raspberrypi/ctt/cac_only.py @@ -2,7 +2,7 @@ # # SPDX-License-Identifier: BSD-2-Clause # -# Copyright (C) 2023, Raspberry Pi (Trading) Limited +# Copyright (C) 2023, Raspberry Pi (Trading) Ltd. # # cac_only.py - cac tuning tool @@ -102,11 +102,11 @@ def cac(filelist, output_filepath, plot_results=False): sample = sample.replace("ry_vals", pprint_array(rx * -1)) sample = sample.replace("bx_vals", pprint_array(by * -1)) sample = sample.replace("by_vals", pprint_array(bx * -1)) - print("Successfully converted to YAML") + print("Successfully converted to JSON") f = open(str(output_filepath), "w+") f.write(sample) f.close() - print("Successfully written to yaml file") + print("Successfully written to json file") ''' If you wish to see a plot of the colour channel shifts, add the -p or --plots option Can be a quick way of validating if the data/dots you've got are good, or if you need to @@ -139,5 +139,4 @@ if __name__ == "__main__": plot_results = True arg_output = argv[output_location + 1] - logfile = open("log.txt", "a+") - cac(filelist, arg_output, plot_results, logfile) + cac(filelist, arg_output, plot_results) diff --git a/utils/raspberrypi/ctt/ctt_run.py b/utils/raspberrypi/ctt/ctt.py similarity index 96% rename from utils/raspberrypi/ctt/ctt_run.py rename to utils/raspberrypi/ctt/ctt.py index 074136a1..522933bd 100755 --- a/utils/raspberrypi/ctt/ctt_run.py +++ b/utils/raspberrypi/ctt/ctt.py @@ -185,22 +185,22 @@ class Camera: except ArithmeticError: print('ERROR: Matrix is singular!\nTake new pictures and try again...') self.log += '\nERROR: Singular matrix encountered during fit!' - self.log += '\nCCM aborted!' + self.log += '\nCAC aborted!' return 1 else: """ case where config options suggest greyscale camera. No point in doing CAC """ cal_cr_list, cal_cb_list = None, None - self.log += '\nWARNING: No ALSC tables found.\nCCM calibration ' + self.log += '\nWARNING: No ALSC tables found.\nCAC calibration ' self.log += 'performed without ALSC correction...' """ Write output to json """ self.json['rpi.cac']['cac'] = cacs - self.log += '\nCCM calibration written to json file' - print('Finished CCM calibration') + self.log += '\nCAC calibration written to json file' + print('Finished CAC calibration') """ @@ -710,7 +710,6 @@ def run_ctt(json_output, directory, config, log_output, json_template, grid_size mac_small = get_config(macbeth_d, "small", 0, 'bool') mac_show = get_config(macbeth_d, "show", 0, 'bool') mac_config = (mac_small, mac_show) - cac_d = get_config(configs, "cac", {}, 'dict') if blacklevel < -1 or blacklevel >= 2**16: print('\nInvalid blacklevel, defaulted to 64') @@ -770,3 +769,32 @@ def run_ctt(json_output, directory, config, log_output, json_template, grid_size pass else: Cam.write_log(log_output) + +if __name__ == '__main__': + """ + initialise calibration + """ + if len(sys.argv) == 1: + print(""" + PiSP Tuning Tool version 1.0 + Required Arguments: + '-i' : Calibration image directory. + '-o' : Name of output json file. + + Optional Arguments: + '-t' : Target platform - 'pisp' or 'vc4'. Default 'vc4' + '-c' : Config file for the CTT. If not passed, default parameters used. 
+ '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. + """) + quit(0) + else: + """ + parse input arguments + """ + json_output, directory, config, log_output, target = parse_input() + if target == 'pisp': + from ctt_pisp import json_template, grid_size + elif target == 'vc4': + from ctt_vc4 import json_template, grid_size + + run_ctt(json_output, directory, config, log_output, json_template, grid_size, target) diff --git a/utils/raspberrypi/ctt/ctt_image_load.py b/utils/raspberrypi/ctt/ctt_image_load.py index ea5fa360..531de328 100644 --- a/utils/raspberrypi/ctt/ctt_image_load.py +++ b/utils/raspberrypi/ctt/ctt_image_load.py @@ -351,7 +351,6 @@ def dng_load_image(Cam, im_str): c3 = np.left_shift(raw_data[1::2, 1::2].astype(np.int64), shift) Img.channels = [c0, c1, c2, c3] Img.rgb = raw_im.postprocess() - Img.sizes = raw_im.sizes except Exception: print("\nERROR: failed to load DNG file", im_str) diff --git a/utils/raspberrypi/ctt/ctt_log.txt b/utils/raspberrypi/ctt/ctt_log.txt deleted file mode 100644 index 682e24e4..00000000 --- a/utils/raspberrypi/ctt/ctt_log.txt +++ /dev/null @@ -1,31 +0,0 @@ -Log created : Fri Aug 25 17:02:58 2023 - ----------------------------------------------------------------------- -User Arguments ----------------------------------------------------------------------- - -Json file output: output.json -Calibration images directory: ../ctt/ -No configuration file input... using default options -No log file path input... using default: ctt_log.txt - ----------------------------------------------------------------------- -Image Loading ----------------------------------------------------------------------- - -Directory: ../ctt/ -Files found: 1 - -Image: alsc_3000k_0.dng -Identified as an ALSC image -Colour temperature: 3000 K - -Images found: -Macbeth : 0 -ALSC : 1 -CAC: 0 - -Camera metadata -ERROR: No usable macbeth chart images found - ----------------------------------------------------------------------- diff --git a/utils/raspberrypi/ctt/ctt_pisp.py b/utils/raspberrypi/ctt/ctt_pisp.py index 862587a6..4c432f17 100755 --- a/utils/raspberrypi/ctt/ctt_pisp.py +++ b/utils/raspberrypi/ctt/ctt_pisp.py @@ -4,13 +4,8 @@ # # Copyright (C) 2019, Raspberry Pi Ltd # -# ctt_pisp.py - camera tuning tool for PiSP platforms +# ctt_pisp.py - camera tuning tool data for PiSP platforms -import os -import sys - -from ctt_run import run_ctt -from ctt_tools import parse_input json_template = { "rpi.black_level": { @@ -207,29 +202,3 @@ json_template = { } grid_size = (32, 32) - -target = 'pisp' - -if __name__ == '__main__': - """ - initialise calibration - """ - if len(sys.argv) == 1: - print(""" - PiSP Camera Tuning Tool version 1.0 - - Required Arguments: - '-i' : Calibration image directory. - '-o' : Name of output json file. - - Optional Arguments: - '-c' : Config file for the CTT. If not passed, default parameters used. - '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. 
- """) - quit(0) - else: - """ - parse input arguments - """ - json_output, directory, config, log_output = parse_input() - run_ctt(json_output, directory, config, log_output, json_template, grid_size, target) diff --git a/utils/raspberrypi/ctt/ctt_pretty_print_json.py b/utils/raspberrypi/ctt/ctt_pretty_print_json.py index d3bd7d97..350cec65 100755 --- a/utils/raspberrypi/ctt/ctt_pretty_print_json.py +++ b/utils/raspberrypi/ctt/ctt_pretty_print_json.py @@ -108,6 +108,7 @@ def pretty_print(in_json: dict, custom_elems={}) -> str: if __name__ == "__main__": parser = argparse.ArgumentParser(formatter_class=argparse.RawTextHelpFormatter, description= 'Prettify a version 2.0 camera tuning config JSON file.') + parser.add_argument('-t', '--target', type=str, help='Target platform', choices=['pisp', 'vc4'], default='vc4') parser.add_argument('input', type=str, help='Input tuning file.') parser.add_argument('output', type=str, nargs='?', help='Output converted tuning file. If not provided, the input file will be updated in-place.', @@ -117,7 +118,12 @@ if __name__ == "__main__": with open(args.input, 'r') as f: in_json = json.load(f) - out_json = pretty_print(in_json) + if args.target == 'pisp': + from ctt_pisp import grid_size + elif args.target == 'vc4': + from ctt_vc4 import grid_size + + out_json = pretty_print(in_json, custom_elems={'table': grid_size[0], 'luminance_lut': grid_size[0]}) with open(args.output if args.output is not None else args.input, 'w') as f: f.write(out_json) diff --git a/utils/raspberrypi/ctt/ctt_tools.py b/utils/raspberrypi/ctt/ctt_tools.py index 27c52193..50b01ecf 100644 --- a/utils/raspberrypi/ctt/ctt_tools.py +++ b/utils/raspberrypi/ctt/ctt_tools.py @@ -65,11 +65,12 @@ def parse_input(): directory = get_config(args_dict, '-i', None, 'string') config = get_config(args_dict, '-c', None, 'string') log_path = get_config(args_dict, '-l', None, 'string') + target = get_config(args_dict, '-t', "vc4", 'string') if directory is None: raise ArgError('\n\nERROR! No input directory given.') if json_output is None: raise ArgError('\n\nERROR! No output json given.') - return json_output, directory, config, log_path + return json_output, directory, config, log_path, target """ diff --git a/utils/raspberrypi/ctt/ctt_vc4.py b/utils/raspberrypi/ctt/ctt_vc4.py index 86acfd47..7154e110 100755 --- a/utils/raspberrypi/ctt/ctt_vc4.py +++ b/utils/raspberrypi/ctt/ctt_vc4.py @@ -4,13 +4,8 @@ # # Copyright (C) 2019, Raspberry Pi Ltd # -# ctt_vc4.py - camera tuning tool for VC4 platforms +# ctt_vc4.py - camera tuning tool data for VC4 platforms -import os -import sys - -from ctt_run import run_ctt -from ctt_tools import parse_input json_template = { "rpi.black_level": { @@ -129,29 +124,3 @@ json_template = { } grid_size = (16, 12) - -target = 'bcm2835' - -if __name__ == '__main__': - """ - initialise calibration - """ - if len(sys.argv) == 1: - print(""" - VC4 Camera Tuning Tool version 1.0 - - Required Arguments: - '-i' : Calibration image directory. - '-o' : Name of output json file. - - Optional Arguments: - '-c' : Config file for the CTT. If not passed, default parameters used. - '-l' : Name of output log file. If not passed, 'ctt_log.txt' used. 
- """) - quit(0) - else: - """ - parse input arguments - """ - json_output, directory, config, log_output = parse_input() - run_ctt(json_output, directory, config, log_output, json_template, grid_size, target) From patchwork Thu Jun 6 10:15:10 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: David Plowman X-Patchwork-Id: 20218 Return-Path: X-Original-To: parsemail@patchwork.libcamera.org Delivered-To: parsemail@patchwork.libcamera.org Received: from lancelot.ideasonboard.com (lancelot.ideasonboard.com [92.243.16.209]) by patchwork.libcamera.org (Postfix) with ESMTPS id AE87ABD87C for ; Thu, 6 Jun 2024 10:15:41 +0000 (UTC) Received: from lancelot.ideasonboard.com (localhost [IPv6:::1]) by lancelot.ideasonboard.com (Postfix) with ESMTP id D8D2E65459; Thu, 6 Jun 2024 12:15:39 +0200 (CEST) Authentication-Results: lancelot.ideasonboard.com; dkim=pass (2048-bit key; unprotected) header.d=raspberrypi.com header.i=@raspberrypi.com header.b="I8ttmovn"; dkim-atps=neutral Received: from mail-ed1-x52d.google.com (mail-ed1-x52d.google.com [IPv6:2a00:1450:4864:20::52d]) by lancelot.ideasonboard.com (Postfix) with ESMTPS id 250296545A for ; Thu, 6 Jun 2024 12:15:33 +0200 (CEST) Received: by mail-ed1-x52d.google.com with SMTP id 4fb4d7f45d1cf-57a68b0fbd0so807237a12.1 for ; Thu, 06 Jun 2024 03:15:33 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=raspberrypi.com; s=google; t=1717668932; x=1718273732; darn=lists.libcamera.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=6nH8hD6TqopDewdCCAXY6b7lUwaffPrh4fUbV/TkoAo=; b=I8ttmovnn+hSzr/IqW/dPernrDAR549If8kFRp4mhRD0f1eHucRc6aBcIDkXOMc8CC NmKPFRK3S0pRwT9260Aoswld9sLhNp0WA6aP776FTu31zfFH5pdILxuJicpr7Pes0ppg XV5BWQ4tKutEJXYLMCJZSgBZ4XTpxgWPkIKu9OLqU//g+4aEZ3Z6npQ00wJORIYEzA8/ aO7JItkkXSO5mxUMTlYOOzxKxjygLztj0f/foSvov8kkPlgSbPs0DCnyI6BacuFBSF7Z NcN+9vE7tAKjsOtWbKFP2gIX85e2ChyG05b6Zv2PLORaK4V3mqUQQTnXtb9zEpCDYfeJ fKUg== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1717668932; x=1718273732; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=6nH8hD6TqopDewdCCAXY6b7lUwaffPrh4fUbV/TkoAo=; b=kJWGhCwpLFF3+8wJ+t2pXPH/WCMWtvVbV/WXWGI53fi0vHKX50jiS+JUlzjEeSE3O0 4philP/6HhyZ9DFa0Mj9fIwZMEQnBg4padDK/SBTg/EEALTJ17/Owz951TC2Q8dKYZKR OX6JfiuTe+daG4WOAVndK+COZAJW+t/f6YHeYysAA2c6Ak2XxP6ToUHzo4VucrR30ZKZ Rtrgr0lqQ8ml8B9su3rzmsKcniFI8qWgyglhH/vj4ietAuXp+ZmInLN/Ig0jcUji+MwI NBKj47t9jHM3dVlkWOgL1fK1fZUTL5A0O5tjwq866ADkgRcvKPzFDjnruu00wRCrD1mC +Dbw== X-Gm-Message-State: AOJu0YzIGI+HaCFtHOdt+HMbO9z0CJrHb52aJcTjyvZYXawlzqtutABU l5lHGtQd9ins7K7V3CdPwQodl7lBDu10DI/3Z6vmb6Jy8OvrxOI8qHQU4BwJnXwEX3h/ZsDmJSG Y X-Google-Smtp-Source: AGHT+IG6LBtDn9XqNjGltvhZbEgAMd+daj49iwtlropmmBFpXfTG6v08BUCvf7F4lcDBdiz437JgcQ== X-Received: by 2002:a17:906:1b4a:b0:a68:4491:59e2 with SMTP id a640c23a62f3a-a699f67ecbcmr289755066b.3.1717668931988; Thu, 06 Jun 2024 03:15:31 -0700 (PDT) Received: from pi5-davidp.pitowers.org ([2001:4d4e:300:1f:c732:5d0a:406b:ae46]) by smtp.gmail.com with ESMTPSA id a640c23a62f3a-a6c805c59a2sm75809866b.50.2024.06.06.03.15.31 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Thu, 06 Jun 2024 03:15:31 -0700 (PDT) From: David Plowman To: libcamera-devel@lists.libcamera.org Cc: David Plowman , Naushir Patuck Subject: [PATCH 
4/6] utils: raspberrypi: ctt: Update tuning tool for HDR Date: Thu, 6 Jun 2024 11:15:10 +0100 Message-Id: <20240606101512.375178-5-david.plowman@raspberrypi.com> X-Mailer: git-send-email 2.39.2 In-Reply-To: <20240606101512.375178-1-david.plowman@raspberrypi.com> References: <20240606101512.375178-1-david.plowman@raspberrypi.com> MIME-Version: 1.0 X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: libcamera-devel-bounces@lists.libcamera.org Sender: "libcamera-devel" The various boilerplate parts of the tuning file are extended to include the necessary extra bits for HDR, specifically: * rpi.denoise has different configurations for HDR modes * rpi.agc now has extra channels for HDR * rpi.hdr parameters are added. Signed-off-by: David Plowman Reviewed-by: Naushir Patuck --- utils/raspberrypi/ctt/ctt_pisp.py | 785 ++++++++++++++++-- .../raspberrypi/ctt/ctt_pretty_print_json.py | 3 +- 2 files changed, 695 insertions(+), 93 deletions(-) diff --git a/utils/raspberrypi/ctt/ctt_pisp.py b/utils/raspberrypi/ctt/ctt_pisp.py index 4c432f17..a59b053c 100755 --- a/utils/raspberrypi/ctt/ctt_pisp.py +++ b/utils/raspberrypi/ctt/ctt_pisp.py @@ -25,23 +25,68 @@ json_template = { }, "rpi.denoise": { - "sdn": + "normal": { - "deviation": 1.6, - "strength": 0.5, - "deviation2": 3.2, - "deviation_no_tdn": 3.2, - "strength_no_tdn": 0.75 + "sdn": + { + "deviation": 1.6, + "strength": 0.5, + "deviation2": 3.2, + "deviation_no_tdn": 3.2, + "strength_no_tdn": 0.75 + }, + "cdn": + { + "deviation": 200, + "strength": 0.3 + }, + "tdn": + { + "deviation": 0.8, + "threshold": 0.05 + } }, - "cdn": + "hdr": { - "deviation": 200, - "strength": 0.3 + "sdn": + { + "deviation": 1.6, + "strength": 0.5, + "deviation2": 3.2, + "deviation_no_tdn": 3.2, + "strength_no_tdn": 0.75 + }, + "cdn": + { + "deviation": 200, + "strength": 0.3 + }, + "tdn": + { + "deviation": 1.3, + "threshold": 0.1 + } }, - "tdn": + "night": { - "deviation": 0.8, - "threshold": 0.05 + "sdn": + { + "deviation": 1.6, + "strength": 0.5, + "deviation2": 3.2, + "deviation_no_tdn": 3.2, + "strength_no_tdn": 0.75 + }, + "cdn": + { + "deviation": 200, + "strength": 0.3 + }, + "tdn": + { + "deviation": 1.3, + "threshold": 0.1 + } } }, "rpi.awb": { @@ -61,91 +106,605 @@ json_template = { }, "bayes": 1 }, - "rpi.agc": { - "metering_modes": { - "centre-weighted": { - "weights": [ - 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, - 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, - 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, - 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, - 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, - 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, - 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, - 1, 1, 2, 2, 3, 3, 4, 4, 4, 3, 3, 2, 2, 1, 1, - 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, - 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, - 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, - 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, - 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, - 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, - 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0 - ] - }, - "spot": { - "weights": [ - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 
0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, - 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 - ] + "rpi.agc": + { + "channels": + [ + { + "comment": "Channel 0 is normal AGC", + "metering_modes": + { + "centre-weighted": + { + "weights": + [ + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 4, 4, 4, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0 + ] + }, + "spot": + { + "weights": + [ + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 + ] + }, + "matrix": + { + "weights": + [ + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 + ] + } + }, + "exposure_modes": + { + "normal": + { + "shutter": [ 100, 10000, 30000, 60000, 66666 ], + "gain": [ 1.0, 1.5, 2.0, 4.0, 8.0 ] + }, + "short": + { + "shutter": [ 100, 5000, 10000, 20000, 60000 ], + "gain": [ 1.0, 1.5, 2.0, 4.0, 8.0 ] + }, + "long": + { + "shutter": [ 100, 10000, 30000, 60000, 90000, 120000 ], + "gain": [ 1.0, 1.5, 2.0, 4.0, 8.0, 12.0 ] + } + }, + "constraint_modes": + { + "normal": [ + { + "bound": "LOWER", + "q_lo": 0.98, + "q_hi": 1.0, + "y_target": + [ + 0, 0.5, + 1000, 0.5 + ] + } + ], + "highlight": [ + { + "bound": "LOWER", + "q_lo": 0.98, + "q_hi": 1.0, + "y_target": + [ + 0, 0.5, + 1000, 0.5 + ] + }, + { + "bound": "UPPER", + "q_lo": 0.98, + "q_hi": 1.0, + "y_target": + [ + 0, 0.8, + 1000, 0.8 + ] + }, + ], + "shadows": [ + { + "bound": "LOWER", + "q_lo": 0.0, + "q_hi": 0.5, + "y_target": + [ + 0, 
0.17, + 1000, 0.17 + ] + } + ] + }, + "y_target": + [ + 0, 0.16, + 1000, 0.165, + 10000, 0.17 + ] }, - "matrix": { - "weights": [ - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, - 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 - ] - } - }, - "exposure_modes": { - "normal": { - "shutter": [100, 10000, 30000, 60000, 66666], - "gain": [1.0, 1.5, 2.0, 4.0, 8.0] + { + "comment": "Channel 1 is the HDR short channel", + "desaturate": 0, + "metering_modes": + { + "centre-weighted": + { + "weights": + [ + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 4, 4, 4, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0 + ] + }, + "spot": + { + "weights": + [ + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 + ] + }, + "matrix": + { + "weights": + [ + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 + ] + } + }, + "exposure_modes": + { + "normal": + { + "shutter": [ 100, 20000, 60000 ], + "gain": [ 1.0, 1.0, 1.0 ] + }, + "short": + { + "shutter": [ 100, 20000, 60000 ], + "gain": [ 1.0, 1.0, 1.0 ] + }, + "long": + { + "shutter": [ 100, 20000, 60000 
], + "gain": [ 1.0, 1.0, 1.0 ] + } + }, + "constraint_modes": + { + "normal": [ + { + "bound": "LOWER", + "q_lo": 0.95, + "q_hi": 1.0, + "y_target": + [ + 0, 0.5, + 1000, 0.5 + ] + }, + { + "bound": "UPPER", + "q_lo": 0.95, + "q_hi": 1.0, + "y_target": + [ + 0, 0.7, + 1000, 0.7 + ] + }, + { + "bound": "LOWER", + "q_lo": 0.0, + "q_hi": 0.2, + "y_target": + [ + 0, 0.002, + 1000, 0.002 + ] + } + ], + "highlight": [ + { + "bound": "LOWER", + "q_lo": 0.95, + "q_hi": 1.0, + "y_target": + [ + 0, 0.5, + 1000, 0.5 + ] + }, + { + "bound": "UPPER", + "q_lo": 0.95, + "q_hi": 1.0, + "y_target": + [ + 0, 0.7, + 1000, 0.7 + ] + }, + { + "bound": "LOWER", + "q_lo": 0.0, + "q_hi": 0.2, + "y_target": + [ + 0, 0.002, + 1000, 0.002 + ] + } + ], + "shadows": [ + { + "bound": "LOWER", + "q_lo": 0.95, + "q_hi": 1.0, + "y_target": + [ + 0, 0.5, + 1000, 0.5 + ] + }, + { + "bound": "UPPER", + "q_lo": 0.95, + "q_hi": 1.0, + "y_target": + [ + 0, 0.7, + 1000, 0.7 + ] + }, + { + "bound": "LOWER", + "q_lo": 0.0, + "q_hi": 0.2, + "y_target": + [ + 0, 0.002, + 1000, 0.002 + ] + } + ] + }, + "y_target": + [ + 0, 0.16, + 1000, 0.165, + 10000, 0.17 + ] }, - "short": { - "shutter": [100, 5000, 10000, 20000, 60000], - "gain": [1.0, 1.5, 2.0, 4.0, 8.0] + { + "comment": "Channel 2 is the HDR long channel", + "desaturate": 0, + "metering_modes": + { + "centre-weighted": + { + "weights": + [ + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 4, 4, 4, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0 + ] + }, + "spot": + { + "weights": + [ + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 + ] + }, + "matrix": + { + "weights": + [ + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1 + ] + } + }, + "exposure_modes": + { + "normal": + { + "shutter": [ 100, 20000, 30000, 60000 ], + "gain": [ 1.0, 2.0, 4.0, 8.0 ] + }, + "short": + { + "shutter": [ 100, 20000, 30000, 60000 ], + "gain": [ 1.0, 2.0, 4.0, 8.0 ] + }, + "long": + { + "shutter": [ 100, 20000, 30000, 60000 ], + "gain": [ 1.0, 2.0, 4.0, 8.0 ] + } + }, + "constraint_modes": + { + "normal": [ + ], + "highlight": [ + ], + "shadows": [ + ] + }, + "channel_constraints": + [ + { + "bound": "UPPER", + "channel": 4, + "factor": 8 + }, + { + "bound": "LOWER", + "channel": 4, + "factor": 2 + } + ], + "y_target": + [ + 0, 0.16, + 1000, 0.165, + 10000, 0.17 + ] }, - "long": { - "shutter": [ 100, 10000, 30000, 60000, 90000, 120000 ], - "gain": [ 1.0, 1.5, 2.0, 4.0, 8.0, 12.0 ] + "comment": "Channel 3 is the night mode channel", + "base_ev": 0.33, + "metering_modes": + { + "centre-weighted": + { + "weights": + [ + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 4, 4, 4, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 3, 3, 3, 4, 3, 3, 3, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 3, 3, 3, 2, 2, 2, 2, 1, 1, + 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, + 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, + 0, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 0, + 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0 + ] + }, + "spot": + { + "weights": + [ + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 1, 2, 1, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 + ] + }, + "matrix": + { + "weights": + [ + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 + ] + } + }, + "exposure_modes": + { + "normal": + { + "shutter": [ 100, 20000, 66666 ], + "gain": [ 1.0, 2.0, 4.0 ] + }, + "short": + { + "shutter": [ 100, 20000, 33333 ], + "gain": [ 1.0, 2.0, 4.0 ] + }, + "long": + { + "shutter": [ 100, 20000, 66666, 120000 ], + "gain": [ 1.0, 2.0, 4.0, 4.0 ] + } + }, + "constraint_modes": + { + "normal": [ + { + "bound": "LOWER", + "q_lo": 0.98, + "q_hi": 1.0, + "y_target": + [ + 0, 0.5, + 1000, 0.5 + ] + } + ], + 
"highlight": [ + { + "bound": "LOWER", + "q_lo": 0.98, + "q_hi": 1.0, + "y_target": + [ + 0, 0.5, + 1000, 0.5 + ] + }, + { + "bound": "UPPER", + "q_lo": 0.98, + "q_hi": 1.0, + "y_target": + [ + 0, 0.8, + 1000, 0.8 + ] + } + ], + "shadows": [ + { + "bound": "LOWER", + "q_lo": 0.98, + "q_hi": 1.0, + "y_target": + [ + 0, 0.5, + 1000, 0.5 + ] + } + ] + }, + "y_target": + [ + 0, 0.16, + 1000, 0.16, + 10000, 0.17 + ] } - }, - "constraint_modes": { - "normal": [ - {"bound": "LOWER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.5, 1000, 0.5]} - ], - "highlight": [ - {"bound": "LOWER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.5, 1000, 0.5]}, - {"bound": "UPPER", "q_lo": 0.98, "q_hi": 1.0, "y_target": [0, 0.8, 1000, 0.8]} - ] - }, - "y_target": [0, 0.16, 1000, 0.165, 10000, 0.17] + ] }, "rpi.alsc": { 'omega': 1.3, @@ -198,6 +757,48 @@ json_template = { "threshold": 0.25, "limit": 1.0, "strength": 1.0 + }, + "rpi.hdr": + { + "Off": + { + "cadence": [ 0 ] + }, + "MultiExposureUnmerged": + { + "cadence": [ 1, 2 ], + "channel_map": { "short": 1, "long": 2 } + }, + "SingleExposure": + { + "cadence": [1], + "channel_map": { "short": 1 }, + "spatial_gain": 2.0, + "tonemap_enable": 1 + }, + "MultiExposure": + { + "cadence": [1, 2], + "channel_map": { "short": 1, "long": 2 }, + "stitch_enable": 1, + "spatial_gain": 2.0, + "tonemap_enable": 1 + }, + "Night": + { + "cadence": [ 3 ], + "channel_map": { "night": 3 }, + "tonemap_enable": 1, + "tonemap": + [ + 0, 0, + 5000, 20000, + 10000, 30000, + 20000, 47000, + 30000, 55000, + 65535, 65535 + ] + } } } diff --git a/utils/raspberrypi/ctt/ctt_pretty_print_json.py b/utils/raspberrypi/ctt/ctt_pretty_print_json.py index 350cec65..a4cae62d 100755 --- a/utils/raspberrypi/ctt/ctt_pretty_print_json.py +++ b/utils/raspberrypi/ctt/ctt_pretty_print_json.py @@ -30,7 +30,8 @@ class Encoder(json.JSONEncoder): 'lut_ry': 9, 'gamma_curve': 2, 'y_target': 2, - 'prior': 2 + 'prior': 2, + 'tonemap': 2 } def encode(self, o, node_key=None): From patchwork Thu Jun 6 10:15:11 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: David Plowman X-Patchwork-Id: 20219 Return-Path: X-Original-To: parsemail@patchwork.libcamera.org Delivered-To: parsemail@patchwork.libcamera.org Received: from lancelot.ideasonboard.com (lancelot.ideasonboard.com [92.243.16.209]) by patchwork.libcamera.org (Postfix) with ESMTPS id E1DACC3292 for ; Thu, 6 Jun 2024 10:15:42 +0000 (UTC) Received: from lancelot.ideasonboard.com (localhost [IPv6:::1]) by lancelot.ideasonboard.com (Postfix) with ESMTP id DB48F65464; Thu, 6 Jun 2024 12:15:41 +0200 (CEST) Authentication-Results: lancelot.ideasonboard.com; dkim=pass (2048-bit key; unprotected) header.d=raspberrypi.com header.i=@raspberrypi.com header.b="kezno5NU"; dkim-atps=neutral Received: from mail-ej1-x631.google.com (mail-ej1-x631.google.com [IPv6:2a00:1450:4864:20::631]) by lancelot.ideasonboard.com (Postfix) with ESMTPS id D39366545D for ; Thu, 6 Jun 2024 12:15:33 +0200 (CEST) Received: by mail-ej1-x631.google.com with SMTP id a640c23a62f3a-a6266ffdba8so65596666b.1 for ; Thu, 06 Jun 2024 03:15:33 -0700 (PDT) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=raspberrypi.com; s=google; t=1717668933; x=1718273733; darn=lists.libcamera.org; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:from:to:cc:subject:date :message-id:reply-to; bh=W06p37Y1xwC67cFH+rZkEwLXRBquD8QbZ03xa8sUGrY=; 
b=kezno5NU+jWXfGFph/m6esRZDFiihgpVoN5pVkaSJb+EUGJS+WCVy0ZPXef93O4pua c4MUOogIAhZsWkqTCQ5Y2UEwPYGXL8FyV32k9kYqY6jh/EjmPdR1Hnqmy68TQHZ1J4IW EC1/OcilasMicM23aR2fDgf9nEMeBTI8fZnla+YUQVaDltXdNhdEsYk6i/BDwL/oZole M/3QPG0zK4x1GckgELSIaqq3CztOZkCKbWkUpACpsXQC4EsxhxjWm2pR72MVNqk6vn99 kdAQllRVZSqgsVxIVeilL5SeJ3dux5GQx7zdP2ZyJ8MWbzkQoBIw2EkgG7tAIgRkkWYS eaOA== X-Google-DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=1e100.net; s=20230601; t=1717668933; x=1718273733; h=content-transfer-encoding:mime-version:references:in-reply-to :message-id:date:subject:cc:to:from:x-gm-message-state:from:to:cc :subject:date:message-id:reply-to; bh=W06p37Y1xwC67cFH+rZkEwLXRBquD8QbZ03xa8sUGrY=; b=mC0f7ZlWv7UbsWq/rqxYx65hWY1bwZuTFaLrvETF9sW+9k9XkXnpSGSNHfFJpyTbOU O7LgsgepeJARCF3AwvyZjApzg7zZ+7PPfK9c++BoUHPsAEsN/e1noK+OfDEdOATg3rRR pMQygAo0jQpegiPmslpewu+fzWt5Q8eugutYVxHwjPZ6yRHCJ/ryiwW7mJsC1Ggvd30e ctgC0/Wd5OKd4SLaC3Yu02BMcCOF7iTqrA8lFL3tRJdsAlFkWFQtjnMcqJBmRzbB4Q5o vKquc60J6ZmUb3AH4cWcVXYuYb9m965umdbYk3nKBuVAkZctQm8HXfQKAfWmz2beqpmq EeDg== X-Gm-Message-State: AOJu0Yy6G8+SbVnb5MwmvMM1WnOCWSmWFgnqBTtDSQSPycTZOGBLs5n9 tjFfysrSee+T6oY54CNWZCLP5yAn6y08/t+2BlHS9td6f5hpGJGkFMoH8ffmVAdjogwmC5fpfsM 0 X-Google-Smtp-Source: AGHT+IHWHGYONi98wQvaj4RBlbLj1FyXAzD9vqQmyNRINQdmqy7gwkR4ajuWomwSCm8U5qCxfOu5zw== X-Received: by 2002:a17:906:c059:b0:a6c:702e:73fc with SMTP id a640c23a62f3a-a6c702e760fmr240859666b.8.1717668932917; Thu, 06 Jun 2024 03:15:32 -0700 (PDT) Received: from pi5-davidp.pitowers.org ([2001:4d4e:300:1f:c732:5d0a:406b:ae46]) by smtp.gmail.com with ESMTPSA id a640c23a62f3a-a6c805c59a2sm75809866b.50.2024.06.06.03.15.32 (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256); Thu, 06 Jun 2024 03:15:32 -0700 (PDT) From: David Plowman To: libcamera-devel@lists.libcamera.org Cc: Naushir Patuck Subject: [PATCH 5/6] utils: raspberrypi: ctt: Add option to convert between vc4/pisp targets Date: Thu, 6 Jun 2024 11:15:11 +0100 Message-Id: <20240606101512.375178-6-david.plowman@raspberrypi.com> X-Mailer: git-send-email 2.39.2 In-Reply-To: <20240606101512.375178-1-david.plowman@raspberrypi.com> References: <20240606101512.375178-1-david.plowman@raspberrypi.com> MIME-Version: 1.0 X-BeenThere: libcamera-devel@lists.libcamera.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: libcamera-devel-bounces@lists.libcamera.org Sender: "libcamera-devel" From: Naushir Patuck This change adds functionality to the convert_tuning.py script to convert between vc4 and pisp target tuning files. The conversion is done on a best effort basis, and should provide functional tuning files. However, a full tuning for the target platform is always preferred. 
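
As an illustration of the new conversion path (the file names here are hypothetical; '-t' selects the output target and defaults to 'vc4'):

    convert_tuning.py -t pisp existing_vc4_tuning.json new_pisp_tuning.json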
Signed-off-by: Naushir Patuck --- utils/raspberrypi/ctt/convert_tuning.py | 98 ++++++++++++++++++++++--- 1 file changed, 86 insertions(+), 12 deletions(-) diff --git a/utils/raspberrypi/ctt/convert_tuning.py b/utils/raspberrypi/ctt/convert_tuning.py index f4504d45..83cf69d4 100755 --- a/utils/raspberrypi/ctt/convert_tuning.py +++ b/utils/raspberrypi/ctt/convert_tuning.py @@ -8,30 +8,104 @@ import argparse import json +import numpy as np import sys from ctt_pretty_print_json import pretty_print +from ctt_pisp import grid_size as grid_size_pisp +from ctt_pisp import json_template as json_template_pisp +from ctt_vc4 import grid_size as grid_size_vc4 +from ctt_vc4 import json_template as json_template_vc4 -def convert_v2(in_json: dict) -> str: +def interp_2d(in_ls, src_w, src_h, dst_w, dst_h): - if 'version' in in_json.keys() and in_json['version'] != 1.0: - print(f'The JSON config reports version {in_json["version"]} that is incompatible with this tool.') - sys.exit(-1) + out_ls = np.zeros((dst_h, dst_w)) + for i in range(src_h): + out_ls[i] = np.interp(np.linspace(0, dst_w - 1, dst_w), + np.linspace(0, dst_w - 1, src_w), + in_ls[i]) + for i in range(dst_w): + out_ls[:,i] = np.interp(np.linspace(0, dst_h - 1, dst_h), + np.linspace(0, dst_h - 1, src_h), + out_ls[:src_h, i]) + return out_ls - converted = { - 'version': 2.0, - 'target': 'bcm2835', - 'algorithms': [{algo: config} for algo, config in in_json.items()] - } - return pretty_print(converted) +def convert_target(in_json: dict, target: str): + + src_w, src_h = grid_size_pisp if target == 'vc4' else grid_size_vc4 + dst_w, dst_h = grid_size_vc4 if target == 'vc4' else grid_size_pisp + json_template = json_template_vc4 if target == 'vc4' else json_template_pisp + + # ALSC grid sizes + alsc = next(algo for algo in in_json['algorithms'] if 'rpi.alsc' in algo)['rpi.alsc'] + for colour in ['calibrations_Cr', 'calibrations_Cb']: + if colour not in alsc: + continue + for temperature in alsc[colour]: + in_ls = np.reshape(temperature['table'], (src_h, src_w)) + out_ls = interp_2d(in_ls, src_w, src_h, dst_w, dst_h) + temperature['table'] = np.round(out_ls.flatten(), 3).tolist() + + if 'luminance_lut' in alsc: + in_ls = np.reshape(alsc['luminance_lut'], (src_h, src_w)) + out_ls = interp_2d(in_ls, src_w, src_h, dst_w, dst_h) + alsc['luminance_lut'] = np.round(out_ls.flatten(), 3).tolist() + + # Denoise blocks + for i, algo in enumerate(in_json['algorithms']): + if list(algo.keys())[0] == 'rpi.sdn': + in_json['algorithms'][i] = {'rpi.denoise': json_template['rpi.sdn'] if target == 'vc4' else json_template['rpi.denoise']} + break + + # AGC mode weights + agc = next(algo for algo in in_json['algorithms'] if 'rpi.agc' in algo)['rpi.agc'] + if 'channels' in agc: + for i, channel in enumerate(agc['channels']): + target_agc_metering = json_template['rpi.agc']['channels'][i]['metering_modes'] + for mode, v in channel['metering_modes'].items(): + v['weights'] = target_agc_metering[mode]['weights'] + else: + for mode, v in agc["metering_modes"].items(): + target_agc_metering = json_template['rpi.agc']['channels'][0]['metering_modes'] + v['weights'] = target_agc_metering[mode]['weights'] + + # HDR + if target == 'pisp': + for i, algo in enumerate(in_json['algorithms']): + if list(algo.keys())[0] == 'rpi.hdr': + in_json['algorithms'][i] = {'rpi.hdr': json_template['rpi.hdr']} + + return in_json + + +def convert_v2(in_json: dict, target: str) -> str: + + if 'version' in in_json.keys() and in_json['version'] == 1.0: + converted = { + 'version': 2.0, + 'target': 
+            'algorithms': [{algo: config} for algo, config in in_json.items()]
+        }
+    else:
+        converted = in_json
+
+    # Convert between vc4 <-> pisp targets. This is a best effort thing.
+    if converted['target'] != target:
+        converted = convert_target(converted, target)
+        converted['target'] = target
+
+    grid_size = grid_size_vc4[0] if target == 'vc4' else grid_size_pisp[0]
+    return pretty_print(converted, custom_elems={'table': grid_size, 'luminance_lut': grid_size})


 if __name__ == "__main__":
     parser = argparse.ArgumentParser(formatter_class=argparse.RawTextHelpFormatter, description=
-                    'Convert the format of the Raspberry Pi camera tuning file from v1.0 to v2.0.\n')
+                    'Convert the format of the Raspberry Pi camera tuning file from v1.0 to v2.0 and/or the vc4 <-> pisp targets.\n')
     parser.add_argument('input', type=str, help='Input tuning file.')
+    parser.add_argument('-t', '--target', type=str, help='Target platform.',
+                        choices=['pisp', 'vc4'], default='vc4')
     parser.add_argument('output', type=str, nargs='?',
                         help='Output converted tuning file. If not provided, the input file will be updated in-place.',
                         default=None)
@@ -40,7 +114,7 @@ if __name__ == "__main__":
     with open(args.input, 'r') as f:
         in_json = json.load(f)

-    out_json = convert_v2(in_json)
+    out_json = convert_v2(in_json, args.target)

     with open(args.output if args.output is not None else args.input, 'w') as f:
         f.write(out_json)
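For reference, a minimal sketch of how the interp_2d() helper added above
resamples an ALSC table between grid sizes. It assumes the ctt directory is on
the Python path so convert_tuning.py can be imported as a module, and the
16x12 and 32x32 grids are used purely for illustration:

    # Resample a 12x16 (rows x cols) gain table onto a 32x32 grid.
    import numpy as np
    from convert_tuning import interp_2d

    src_w, src_h = 16, 12            # illustrative source grid (width, height)
    dst_w, dst_h = 32, 32            # illustrative destination grid
    table = np.ones((src_h, src_w))  # flat 1.0 gain table...
    table[:, 0] = 2.0                # ...with stronger gain down the left edge
    out = interp_2d(table, src_w, src_h, dst_w, dst_h)
    print(out.shape)                 # (32, 32); gains taper from 2.0 towards 1.0 across each row
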
From patchwork Thu Jun 6 10:15:12 2024
X-Patchwork-Submitter: David Plowman
X-Patchwork-Id: 20220
From: David Plowman
To: libcamera-devel@lists.libcamera.org
Cc: David Plowman
Subject: [PATCH 6/6] utils: raspberrypi: ctt: Add a maximum gain parameter for LSC
Date: Thu, 6 Jun 2024 11:15:12 +0100
Message-Id: <20240606101512.375178-7-david.plowman@raspberrypi.com>
In-Reply-To: <20240606101512.375178-1-david.plowman@raspberrypi.com>
References: <20240606101512.375178-1-david.plowman@raspberrypi.com>

A max_gain parameter is added to the config file that we pass to the lens
shading calibration. This clamps the maximum luminance gain that gets written
into the tuning files, to prevent overflows. It is particularly useful for
lenses that cut off the light completely from the sensor corners, and allows
usable tables to be generated for them.

Signed-off-by: David Plowman
---
 utils/raspberrypi/ctt/ctt.py                  |  8 +++++---
 utils/raspberrypi/ctt/ctt_alsc.py             | 12 ++++++++----
 utils/raspberrypi/ctt/ctt_config_example.json |  5 +++--
 3 files changed, 16 insertions(+), 9 deletions(-)

diff --git a/utils/raspberrypi/ctt/ctt.py b/utils/raspberrypi/ctt/ctt.py
index 522933bd..96f1b5e6 100755
--- a/utils/raspberrypi/ctt/ctt.py
+++ b/utils/raspberrypi/ctt/ctt.py
@@ -269,7 +269,7 @@ class Camera:
     colour channel seperately, and then partially corrects for vignetting.
     The extent of the correction depends on the 'luminance_strength' parameter.
     """
-    def alsc_cal(self, luminance_strength, do_alsc_colour, grid_size):
+    def alsc_cal(self, luminance_strength, do_alsc_colour, grid_size, max_gain=8.0):
         if 'rpi.alsc' in self.disable:
             return 1
         print('\nStarting ALSC calibration')
@@ -292,7 +292,7 @@ class Camera:
         call calibration function
         """
         plot = "rpi.alsc" in self.plot
-        alsc_out = alsc_all(self, do_alsc_colour, plot, grid_size)
+        alsc_out = alsc_all(self, do_alsc_colour, plot, grid_size, max_gain=max_gain)
         cal_cr_list, cal_cb_list, luminance_lut, av_corn = alsc_out
         """
         write output to json and finish if not do_alsc_colour
@@ -705,11 +705,13 @@ def run_ctt(json_output, directory, config, log_output, json_template, grid_size
     alsc_d = get_config(configs, "alsc", {}, 'dict')
     do_alsc_colour = get_config(alsc_d, "do_alsc_colour", 1, 'bool')
     luminance_strength = get_config(alsc_d, "luminance_strength", 0.8, 'num')
+    lsc_max_gain = get_config(alsc_d, "max_gain", 8.0, 'num')
     blacklevel = get_config(configs, "blacklevel", -1, 'num')
     macbeth_d = get_config(configs, "macbeth", {}, 'dict')
     mac_small = get_config(macbeth_d, "small", 0, 'bool')
     mac_show = get_config(macbeth_d, "show", 0, 'bool')
     mac_config = (mac_small, mac_show)
+    print("Read lsc_max_gain", lsc_max_gain)

     if blacklevel < -1 or blacklevel >= 2**16:
         print('\nInvalid blacklevel, defaulted to 64')
@@ -750,7 +752,7 @@
         Cam.json['rpi.black_level']['black_level'] = Cam.blacklevel_16
     Cam.json_remove(disable)
     print('\nSTARTING CALIBRATIONS')
-    Cam.alsc_cal(luminance_strength, do_alsc_colour, grid_size)
+    Cam.alsc_cal(luminance_strength, do_alsc_colour, grid_size, max_gain=lsc_max_gain)
     Cam.geq_cal()
     Cam.lux_cal()
     Cam.noise_cal()
diff --git a/utils/raspberrypi/ctt/ctt_alsc.py b/utils/raspberrypi/ctt/ctt_alsc.py
index 66ce8c14..1d94dfa5 100644
--- a/utils/raspberrypi/ctt/ctt_alsc.py
+++ b/utils/raspberrypi/ctt/ctt_alsc.py
@@ -13,7 +13,7 @@ from mpl_toolkits.mplot3d import Axes3D
 """
 preform alsc calibration on a set of images
 """
-def alsc_all(Cam, do_alsc_colour, plot, grid_size=(16, 12)):
+def alsc_all(Cam, do_alsc_colour, plot, grid_size=(16, 12), max_gain=8.0):
     imgs_alsc = Cam.imgs_alsc
     grid_w, grid_h = grid_size
     """
@@ -24,7 +24,7 @@ def alsc_all(Cam, do_alsc_colour, plot, grid_size=(16, 12)):
     list_cb = []
     list_cg = []
     for Img in imgs_alsc:
-        col, cr, cb, cg, size = alsc(Cam, Img, do_alsc_colour, plot, grid_size=grid_size)
+        col, cr, cb, cg, size = alsc(Cam, Img, do_alsc_colour, plot, grid_size=grid_size, max_gain=max_gain)
         list_col.append(col)
         list_cr.append(cr)
         list_cb.append(cb)
@@ -118,7 +118,7 @@ def alsc_all(Cam, do_alsc_colour, plot, grid_size=(16, 12)):
 """
 calculate g/r and g/b for 32x32 points arranged in a grid for a single image
 """
-def alsc(Cam, Img, do_alsc_colour, plot=False, grid_size=(16, 12)):
+def alsc(Cam, Img, do_alsc_colour, plot=False, grid_size=(16, 12), max_gain=8.0):
     Cam.log += '\nProcessing image: ' + Img.name
     grid_w, grid_h = grid_size
     """
@@ -153,9 +153,12 @@ def alsc(Cam, Img, do_alsc_colour, plot=False, grid_size=(16, 12)):
         median blur to remove peaks and save as float 64
         """
         cr = cv2.medianBlur(cr, 3).astype('float64')
+        cr = cr/np.min(cr)  # gain tables are easier for humans to read if the minimum is 1.0
         cb = cv2.medianBlur(cb, 3).astype('float64')
+        cb = cb/np.min(cb)
         cg = cv2.medianBlur(cg, 3).astype('float64')
         cg = cg/np.min(cg)
+        cg = [min(v, max_gain) for v in cg.flatten()]  # never exceed the max luminance gain

         """
         debugging code showing 2D surface plot of vignetting. Quite useful for
@@ -179,7 +182,7 @@ def alsc(Cam, Img, do_alsc_colour, plot=False, grid_size=(16, 12)):
             # print(Img.str)
             plt.show()

-        return Img.col, cr.flatten(), cb.flatten(), cg.flatten(), (w, h, dx, dy)
+        return Img.col, cr.flatten(), cb.flatten(), cg, (w, h, dx, dy)

     else:
         """
@@ -189,6 +192,7 @@ def alsc(Cam, Img, do_alsc_colour, plot=False, grid_size=(16, 12)):
         cg = np.reshape(1/g, (grid_h, grid_w)).astype('float32')
         cg = cv2.medianBlur(cg, 3).astype('float64')
         cg = cg/np.min(cg)
+        cg = [min(v, max_gain) for v in cg.flatten()]  # never exceed the max luminance gain

         if plot:
             hf = plt.figure(figssize=(8, 8))
diff --git a/utils/raspberrypi/ctt/ctt_config_example.json b/utils/raspberrypi/ctt/ctt_config_example.json
index c7f90761..1105862c 100644
--- a/utils/raspberrypi/ctt/ctt_config_example.json
+++ b/utils/raspberrypi/ctt/ctt_config_example.json
@@ -3,7 +3,8 @@
     "plot": [],
     "alsc": {
         "do_alsc_colour": 1,
-        "luminance_strength": 0.5
+        "luminance_strength": 0.8,
+        "max_gain": 8.0
     },
     "awb": {
         "greyworld": 0
@@ -13,4 +14,4 @@
         "small": 0,
         "show": 0
     }
-}
\ No newline at end of file
+}
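To illustrate what the new max_gain clamp does to a luminance table (the
numbers below are made up), each cell of the normalised gain grid is simply
limited to max_gain before it is written out, mirroring the list comprehension
added to ctt_alsc.py:

    max_gain = 8.0
    # Flattened (row-major) luminance gains after normalising so the minimum is 1.0.
    cg = [1.0, 1.7, 3.2,
          2.4, 9.5, 40.0]   # 40.0 would come from a corner the lens blacks out
    clamped = [min(v, max_gain) for v in cg]
    print(clamped)          # [1.0, 1.7, 3.2, 2.4, 8.0, 8.0]

With the clamp in place, a lens that passes almost no light to the corners can
no longer blow the table values up to unusably large gains.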