{"id":16754,"url":"https://patchwork.libcamera.org/api/patches/16754/?format=json","web_url":"https://patchwork.libcamera.org/patch/16754/","project":{"id":1,"url":"https://patchwork.libcamera.org/api/projects/1/?format=json","name":"libcamera","link_name":"libcamera","list_id":"libcamera_core","list_email":"libcamera-devel@lists.libcamera.org","web_url":"","scm_url":"","webscm_url":""},"msgid":"<20220725134639.4572-2-naush@raspberrypi.com>","date":"2022-07-25T13:46:25","name":"[libcamera-devel,01/15] DNI: ipa: raspberrypi: Code refactoring to match style guidelines","commit_ref":null,"pull_url":null,"state":"superseded","archived":false,"hash":"12e65d86202afd903c28f4ac445312787b7c6a3d","submitter":{"id":34,"url":"https://patchwork.libcamera.org/api/people/34/?format=json","name":"Naushir Patuck","email":"naush@raspberrypi.com"},"delegate":null,"mbox":"https://patchwork.libcamera.org/patch/16754/mbox/","series":[{"id":3323,"url":"https://patchwork.libcamera.org/api/series/3323/?format=json","web_url":"https://patchwork.libcamera.org/project/libcamera/list/?series=3323","date":"2022-07-25T13:46:24","name":"Raspberry Pi IPA code refactor","version":1,"mbox":"https://patchwork.libcamera.org/series/3323/mbox/"}],"comments":"https://patchwork.libcamera.org/api/patches/16754/comments/","check":"pending","checks":"https://patchwork.libcamera.org/api/patches/16754/checks/","tags":{},"headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 7C6E2C3275\n\tfor <parsemail@patchwork.libcamera.org>;\n\tMon, 25 Jul 2022 13:46:48 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 5F43363319;\n\tMon, 25 Jul 2022 15:46:46 +0200 (CEST)","from 
mail-wm1-x32b.google.com (mail-wm1-x32b.google.com\n\t[IPv6:2a00:1450:4864:20::32b])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 2D96F6330A\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 25 Jul 2022 15:46:45 +0200 (CEST)","by mail-wm1-x32b.google.com with SMTP id id17so6854970wmb.1\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 25 Jul 2022 06:46:45 -0700 (PDT)","from naush-laptop.localdomain ([93.93.133.154])\n\tby smtp.gmail.com with ESMTPSA id\n\ta20-20020a05600c225400b003a32167b8d4sm18054320wmm.13.2022.07.25.06.46.43\n\t(version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256);\n\tMon, 25 Jul 2022 06:46:43 -0700 (PDT)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1658756806;\n\tbh=KjLAMdIsm+cekRarVqmhe2nrjBMHkg99wgymDLqBI+I=;\n\th=To:Date:In-Reply-To:References:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:\n\tFrom;\n\tb=F5unOqll+QL0ogUimU1os8OcGYfPYexwfyl1eUl1e3rtWXFiEJcWeg/apuP4cH989\n\tKFv9jNVniJlSjqpoEiwzIccSOIw22apO7QjjndSDDtxgQI1TvlzeqxmsaMagnHza6X\n\tls6qScK6ccKqRTvk+w7NvE9lc41YkxE5XnNlfq++bxfGAZ75TW02CvN84DDI53Xajf\n\tpkupZ3tQpNdsOHi3aYA/7EB452Qm9ySDdSFk83Cgkh4aemtkfuONldj0HPU6sjeRZ6\n\tnVtXid3RqwCO453qEsyRFMYNM5JE5K6CkMeeBLN7Xi0CCwZzas6jqX3He0GQMquP3n\n\tJ5F5gvvruecyA==","v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=from:to:cc:subject:date:message-id:in-reply-to:references\n\t:mime-version:content-transfer-encoding;\n\tbh=JyL9DKUh27vijUb4q/TxjkRL0vkpURqizhBXWDgtoLs=;\n\tb=ggYbMcIgRRcrgJqV7tJ2xy6aRskkQ/ybibZtka3DV3tyZ7HD5uEAdQ6Flz979ZgYGb\n\thE2qh2blkAPnmA5ElmkVKRdHz0ivp9jgUWAdmBEfKtHpS2hc5urK8TuTq9Z4FMmRxRSU\n\tQZfwRnLg9A7IGa6h2KHWzaG5bzGGd4v5zoY0TPMMHieDCDJuIUkXQoPTm9aNsJM/S0+z\n\t9E8xdf/jDiTPaf+n50lOd3egCpsZYBFfBLn8dRL1bTotyXJqX3YYzvCEkYBcj3yR5b16\n\t6pWpYdYufWrHT4/s407SqaSnEwgGk8dmUXiwku4e8KFvQCI6SCqb1F9xa2Bp7Fhrk1sJ\n\t0uPg=="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass 
(2048-bit key; \n\tunprotected) header.d=raspberrypi.com\n\theader.i=@raspberrypi.com\n\theader.b=\"ggYbMcIg\"; dkim-atps=neutral","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20210112;\n\th=x-gm-message-state:from:to:cc:subject:date:message-id:in-reply-to\n\t:references:mime-version:content-transfer-encoding;\n\tbh=JyL9DKUh27vijUb4q/TxjkRL0vkpURqizhBXWDgtoLs=;\n\tb=bFwf2YBQFaew+WArtaTLrmmJb6PWRxy0/oKZt87xaJg8Um6k9nem4PYd4mxLWvVKOT\n\tJzUxTpGcyoFhIWWjQnM2gIwlo8HqPMkcsD4ppIM3OZbxdaLpYX9aoBEEUYjpYc3CLtOq\n\tqwQJF8RFbSFUmfbRV8t7/pW463jP9o08oAdS1lqyXzi91vQPAuyaSeXMavoGOfEDnq67\n\t0+iZb/HMyZ1Z5jSCObTe/y+n+dkJSugn0uXBKYaAZRJsp6mKrgp8ZKIWootDRoO5NxaQ\n\t1/VKSX6je/yoaMpDN4ZqZ/2elBwj3xfvs6sAh+uWM6e3p0lhmr+vqIMQGjee1eV+Ic4C\n\tcg6A==","X-Gm-Message-State":"AJIora/7p/9VUrKQUvNCNSbIqljSrlIf5MeDGpYludP2tYXKJrhjYURW\n\tt6Q7R7//cNXcpIyH4itcZyslevJw29AS+w==","X-Google-Smtp-Source":"AGRyM1srWTQe5GLDuHjcUOwd0jWsFMasNZKLW4sddoAKP9V47TwZIM08wKsiyVKNMNNB0iKXFh5kbA==","X-Received":"by 2002:a7b:ca48:0:b0:3a3:365d:1089 with SMTP id\n\tm8-20020a7bca48000000b003a3365d1089mr8933753wml.153.1658756803924; \n\tMon, 25 Jul 2022 06:46:43 -0700 (PDT)","To":"libcamera-devel@lists.libcamera.org","Date":"Mon, 25 Jul 2022 14:46:25 +0100","Message-Id":"<20220725134639.4572-2-naush@raspberrypi.com>","X-Mailer":"git-send-email 2.25.1","In-Reply-To":"<20220725134639.4572-1-naush@raspberrypi.com>","References":"<20220725134639.4572-1-naush@raspberrypi.com>","MIME-Version":"1.0","Content-Transfer-Encoding":"8bit","Subject":"[libcamera-devel] [PATCH 01/15] DNI: ipa: raspberrypi: Code\n\trefactoring to match style 
guidelines","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Naushir Patuck via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Naushir Patuck <naush@raspberrypi.com>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"},"content":"Refactor the source files in src/ipa/raspberrypi/ to match the\nrecommended formatting guidelines for the libcamera project. 
The vast majority\nof changes in this commit consist of switching from snake_case to CamelCase,\nand starting class member functions with a lower case character.\n\nSigned-off-by: Naushir Patuck <naush@raspberrypi.com>\n---\n .../controller/sharpen_algorithm.hpp          |   2 +-\n .../raspberrypi/controller/sharpen_status.h   |   2 +-\n src/ipa/raspberrypi/md_parser.hpp             |  34 +--\n src/ipa/raspberrypi/md_parser_smia.cpp        |  96 +++----\n src/ipa/raspberrypi/raspberrypi.cpp           | 254 +++++++++---------\n 5 files changed, 194 insertions(+), 194 deletions(-)","diff":"diff --git a/src/ipa/raspberrypi/controller/sharpen_algorithm.hpp b/src/ipa/raspberrypi/controller/sharpen_algorithm.hpp\nindex ca800308fd6c..888f4569c56a 100644\n--- a/src/ipa/raspberrypi/controller/sharpen_algorithm.hpp\n+++ b/src/ipa/raspberrypi/controller/sharpen_algorithm.hpp\n@@ -15,7 +15,7 @@ class SharpenAlgorithm : public Algorithm\n public:\n \tSharpenAlgorithm(Controller *controller) : Algorithm(controller) {}\n \t// A sharpness control algorithm must provide the following:\n-\tvirtual void SetStrength(double strength) = 0;\n+\tvirtual void setStrength(double strength) = 0;\n };\n \n } // namespace RPiController\ndiff --git a/src/ipa/raspberrypi/controller/sharpen_status.h b/src/ipa/raspberrypi/controller/sharpen_status.h\nindex 7501b191d6f6..2b0490742fba 100644\n--- a/src/ipa/raspberrypi/controller/sharpen_status.h\n+++ b/src/ipa/raspberrypi/controller/sharpen_status.h\n@@ -20,7 +20,7 @@ struct SharpenStatus {\n \t// upper limit of the allowed sharpening response\n \tdouble limit;\n \t// The sharpening strength requested by the user or application.\n-\tdouble user_strength;\n+\tdouble userStrength;\n };\n \n #ifdef __cplusplus\ndiff --git a/src/ipa/raspberrypi/md_parser.hpp b/src/ipa/raspberrypi/md_parser.hpp\nindex d32d0f549b9c..e505108a7adc 100644\n--- a/src/ipa/raspberrypi/md_parser.hpp\n+++ b/src/ipa/raspberrypi/md_parser.hpp\n@@ -81,27 +81,27 @@ public:\n \n 
\tvirtual ~MdParser() = default;\n \n-\tvoid Reset()\n+\tvoid reset()\n \t{\n \t\treset_ = true;\n \t}\n \n-\tvoid SetBitsPerPixel(int bpp)\n+\tvoid setBitsPerPixel(int bpp)\n \t{\n \t\tbits_per_pixel_ = bpp;\n \t}\n \n-\tvoid SetNumLines(unsigned int num_lines)\n+\tvoid setNumLines(unsigned int numLines)\n \t{\n-\t\tnum_lines_ = num_lines;\n+\t\tnum_lines_ = numLines;\n \t}\n \n-\tvoid SetLineLengthBytes(unsigned int num_bytes)\n+\tvoid setLineLengthBytes(unsigned int numBytes)\n \t{\n-\t\tline_length_bytes_ = num_bytes;\n+\t\tline_length_bytes_ = numBytes;\n \t}\n \n-\tvirtual Status Parse(libcamera::Span<const uint8_t> buffer,\n+\tvirtual Status parse(libcamera::Span<const uint8_t> buffer,\n \t\t\t     RegisterMap &registers) = 0;\n \n protected:\n@@ -123,7 +123,7 @@ class MdParserSmia final : public MdParser\n public:\n \tMdParserSmia(std::initializer_list<uint32_t> registerList);\n \n-\tMdParser::Status Parse(libcamera::Span<const uint8_t> buffer,\n+\tMdParser::Status parse(libcamera::Span<const uint8_t> buffer,\n \t\t\t       RegisterMap &registers) override;\n \n private:\n@@ -133,18 +133,18 @@ private:\n \t/*\n \t * Note that error codes > 0 are regarded as non-fatal; codes < 0\n \t * indicate a bad data buffer. 
Status codes are:\n-\t * PARSE_OK     - found all registers, much happiness\n-\t * MISSING_REGS - some registers found; should this be a hard error?\n+\t * ParseOk     - found all registers, much happiness\n+\t * MissingRegs - some registers found; should this be a hard error?\n \t * The remaining codes are all hard errors.\n \t */\n \tenum ParseStatus {\n-\t\tPARSE_OK      =  0,\n-\t\tMISSING_REGS  =  1,\n-\t\tNO_LINE_START = -1,\n-\t\tILLEGAL_TAG   = -2,\n-\t\tBAD_DUMMY     = -3,\n-\t\tBAD_LINE_END  = -4,\n-\t\tBAD_PADDING   = -5\n+\t\tParseOk      =  0,\n+\t\tMissingRegs  =  1,\n+\t\tNoLineStart  = -1,\n+\t\tIllegalTag   = -2,\n+\t\tBadDummy     = -3,\n+\t\tBadLineEnd   = -4,\n+\t\tBadPadding   = -5\n \t};\n \n \tParseStatus findRegs(libcamera::Span<const uint8_t> buffer);\ndiff --git a/src/ipa/raspberrypi/md_parser_smia.cpp b/src/ipa/raspberrypi/md_parser_smia.cpp\nindex ea5eac414b36..f2b37cab4e97 100644\n--- a/src/ipa/raspberrypi/md_parser_smia.cpp\n+++ b/src/ipa/raspberrypi/md_parser_smia.cpp\n@@ -20,12 +20,12 @@ using namespace libcamera;\n  * sensors, I think.\n  */\n \n-constexpr unsigned int LINE_START = 0x0a;\n-constexpr unsigned int LINE_END_TAG = 0x07;\n-constexpr unsigned int REG_HI_BITS = 0xaa;\n-constexpr unsigned int REG_LOW_BITS = 0xa5;\n-constexpr unsigned int REG_VALUE = 0x5a;\n-constexpr unsigned int REG_SKIP = 0x55;\n+constexpr unsigned int LineStart = 0x0a;\n+constexpr unsigned int LineEndTag = 0x07;\n+constexpr unsigned int RegHiBits = 0xaa;\n+constexpr unsigned int RegLowBits = 0xa5;\n+constexpr unsigned int RegValue = 0x5a;\n+constexpr unsigned int RegSkip = 0x55;\n \n MdParserSmia::MdParserSmia(std::initializer_list<uint32_t> registerList)\n {\n@@ -33,7 +33,7 @@ MdParserSmia::MdParserSmia(std::initializer_list<uint32_t> registerList)\n \t\toffsets_[r] = {};\n }\n \n-MdParser::Status MdParserSmia::Parse(libcamera::Span<const uint8_t> buffer,\n+MdParser::Status MdParserSmia::parse(libcamera::Span<const uint8_t> buffer,\n \t\t\t\t     
RegisterMap &registers)\n {\n \tif (reset_) {\n@@ -53,7 +53,7 @@ MdParser::Status MdParserSmia::Parse(libcamera::Span<const uint8_t> buffer,\n \t\t *\n \t\t * In either case, we retry parsing on the next frame.\n \t\t */\n-\t\tif (ret != PARSE_OK)\n+\t\tif (ret != ParseOk)\n \t\t\treturn ERROR;\n \n \t\treset_ = false;\n@@ -76,74 +76,74 @@ MdParserSmia::ParseStatus MdParserSmia::findRegs(libcamera::Span<const uint8_t>\n {\n \tASSERT(offsets_.size());\n \n-\tif (buffer[0] != LINE_START)\n-\t\treturn NO_LINE_START;\n+\tif (buffer[0] != LineStart)\n+\t\treturn NoLineStart;\n \n-\tunsigned int current_offset = 1; /* after the LINE_START */\n-\tunsigned int current_line_start = 0, current_line = 0;\n-\tunsigned int reg_num = 0, regs_done = 0;\n+\tunsigned int currentOffset = 1; /* after the LINE_START */\n+\tunsigned int currentLineStart = 0, currentLine = 0;\n+\tunsigned int regNum = 0, regsDone = 0;\n \n \twhile (1) {\n-\t\tint tag = buffer[current_offset++];\n+\t\tint tag = buffer[currentOffset++];\n \n \t\tif ((bits_per_pixel_ == 10 &&\n-\t\t     (current_offset + 1 - current_line_start) % 5 == 0) ||\n+\t\t     (currentOffset + 1 - currentLineStart) % 5 == 0) ||\n \t\t    (bits_per_pixel_ == 12 &&\n-\t\t     (current_offset + 1 - current_line_start) % 3 == 0)) {\n-\t\t\tif (buffer[current_offset++] != REG_SKIP)\n-\t\t\t\treturn BAD_DUMMY;\n+\t\t     (currentOffset + 1 - currentLineStart) % 3 == 0)) {\n+\t\t\tif (buffer[currentOffset++] != RegSkip)\n+\t\t\t\treturn BadDummy;\n \t\t}\n \n-\t\tint data_byte = buffer[current_offset++];\n+\t\tint dataByte = buffer[currentOffset++];\n \n-\t\tif (tag == LINE_END_TAG) {\n-\t\t\tif (data_byte != LINE_END_TAG)\n-\t\t\t\treturn BAD_LINE_END;\n+\t\tif (tag == LineEndTag) {\n+\t\t\tif (dataByte != LineEndTag)\n+\t\t\t\treturn BadLineEnd;\n \n-\t\t\tif (num_lines_ && ++current_line == num_lines_)\n-\t\t\t\treturn MISSING_REGS;\n+\t\t\tif (num_lines_ && ++currentLine == num_lines_)\n+\t\t\t\treturn MissingRegs;\n \n \t\t\tif 
(line_length_bytes_) {\n-\t\t\t\tcurrent_offset = current_line_start + line_length_bytes_;\n+\t\t\t\tcurrentOffset = currentLineStart + line_length_bytes_;\n \n \t\t\t\t/* Require whole line to be in the buffer (if buffer size set). */\n \t\t\t\tif (buffer.size() &&\n-\t\t\t\t    current_offset + line_length_bytes_ > buffer.size())\n-\t\t\t\t\treturn MISSING_REGS;\n+\t\t\t\t    currentOffset + line_length_bytes_ > buffer.size())\n+\t\t\t\t\treturn MissingRegs;\n \n-\t\t\t\tif (buffer[current_offset] != LINE_START)\n-\t\t\t\t\treturn NO_LINE_START;\n+\t\t\t\tif (buffer[currentOffset] != LineStart)\n+\t\t\t\t\treturn NoLineStart;\n \t\t\t} else {\n \t\t\t\t/* allow a zero line length to mean \"hunt for the next line\" */\n-\t\t\t\twhile (current_offset < buffer.size() &&\n-\t\t\t\t       buffer[current_offset] != LINE_START)\n-\t\t\t\t\tcurrent_offset++;\n+\t\t\t\twhile (currentOffset < buffer.size() &&\n+\t\t\t\t       buffer[currentOffset] != LineStart)\n+\t\t\t\t\tcurrentOffset++;\n \n-\t\t\t\tif (current_offset == buffer.size())\n-\t\t\t\t\treturn NO_LINE_START;\n+\t\t\t\tif (currentOffset == buffer.size())\n+\t\t\t\t\treturn NoLineStart;\n \t\t\t}\n \n \t\t\t/* inc current_offset to after LINE_START */\n-\t\t\tcurrent_line_start = current_offset++;\n+\t\t\tcurrentLineStart = currentOffset++;\n \t\t} else {\n-\t\t\tif (tag == REG_HI_BITS)\n-\t\t\t\treg_num = (reg_num & 0xff) | (data_byte << 8);\n-\t\t\telse if (tag == REG_LOW_BITS)\n-\t\t\t\treg_num = (reg_num & 0xff00) | data_byte;\n-\t\t\telse if (tag == REG_SKIP)\n-\t\t\t\treg_num++;\n-\t\t\telse if (tag == REG_VALUE) {\n-\t\t\t\tauto reg = offsets_.find(reg_num);\n+\t\t\tif (tag == RegHiBits)\n+\t\t\t\tregNum = (regNum & 0xff) | (dataByte << 8);\n+\t\t\telse if (tag == RegLowBits)\n+\t\t\t\tregNum = (regNum & 0xff00) | dataByte;\n+\t\t\telse if (tag == RegSkip)\n+\t\t\t\tregNum++;\n+\t\t\telse if (tag == RegValue) {\n+\t\t\t\tauto reg = offsets_.find(regNum);\n \n \t\t\t\tif (reg != offsets_.end()) 
{\n-\t\t\t\t\toffsets_[reg_num] = current_offset - 1;\n+\t\t\t\t\toffsets_[regNum] = currentOffset - 1;\n \n-\t\t\t\t\tif (++regs_done == offsets_.size())\n-\t\t\t\t\t\treturn PARSE_OK;\n+\t\t\t\t\tif (++regsDone == offsets_.size())\n+\t\t\t\t\t\treturn ParseOk;\n \t\t\t\t}\n-\t\t\t\treg_num++;\n+\t\t\t\tregNum++;\n \t\t\t} else\n-\t\t\t\treturn ILLEGAL_TAG;\n+\t\t\t\treturn IllegalTag;\n \t\t}\n \t}\n }\ndiff --git a/src/ipa/raspberrypi/raspberrypi.cpp b/src/ipa/raspberrypi/raspberrypi.cpp\nindex c7492a77a3fd..5cd8af3f0305 100644\n--- a/src/ipa/raspberrypi/raspberrypi.cpp\n+++ b/src/ipa/raspberrypi/raspberrypi.cpp\n@@ -208,7 +208,7 @@ int IPARPi::init(const IPASettings &settings, IPAInitResult *result)\n \t * that the kernel driver doesn't. We only do this the first time; we don't need\n \t * to re-parse the metadata after a simple mode-switch for no reason.\n \t */\n-\thelper_ = std::unique_ptr<RPiController::CamHelper>(RPiController::CamHelper::Create(settings.sensorModel));\n+\thelper_ = std::unique_ptr<RPiController::CamHelper>(RPiController::CamHelper::create(settings.sensorModel));\n \tif (!helper_) {\n \t\tLOG(IPARPI, Error) << \"Could not create camera helper for \"\n \t\t\t\t   << settings.sensorModel;\n@@ -220,8 +220,8 @@ int IPARPi::init(const IPASettings &settings, IPAInitResult *result)\n \t * to setup the staggered writer class.\n \t */\n \tint gainDelay, exposureDelay, vblankDelay, sensorMetadata;\n-\thelper_->GetDelays(exposureDelay, gainDelay, vblankDelay);\n-\tsensorMetadata = helper_->SensorEmbeddedDataPresent();\n+\thelper_->getDelays(exposureDelay, gainDelay, vblankDelay);\n+\tsensorMetadata = helper_->sensorEmbeddedDataPresent();\n \n \tresult->sensorConfig.gainDelay = gainDelay;\n \tresult->sensorConfig.exposureDelay = exposureDelay;\n@@ -229,8 +229,8 @@ int IPARPi::init(const IPASettings &settings, IPAInitResult *result)\n \tresult->sensorConfig.sensorMetadata = sensorMetadata;\n \n \t/* Load the tuning file for this sensor. 
*/\n-\tcontroller_.Read(settings.configurationFile.c_str());\n-\tcontroller_.Initialise();\n+\tcontroller_.read(settings.configurationFile.c_str());\n+\tcontroller_.initialise();\n \n \t/* Return the controls handled by the IPA */\n \tControlInfoMap::Map ctrlMap = ipaControls;\n@@ -249,15 +249,15 @@ void IPARPi::start(const ControlList &controls, StartConfig *startConfig)\n \t\tqueueRequest(controls);\n \t}\n \n-\tcontroller_.SwitchMode(mode_, &metadata);\n+\tcontroller_.switchMode(mode_, &metadata);\n \n \t/* SwitchMode may supply updated exposure/gain values to use. */\n \tAgcStatus agcStatus;\n-\tagcStatus.shutter_time = 0.0s;\n-\tagcStatus.analogue_gain = 0.0;\n+\tagcStatus.shutterTime = 0.0s;\n+\tagcStatus.analogueGain = 0.0;\n \n-\tmetadata.Get(\"agc.status\", agcStatus);\n-\tif (agcStatus.shutter_time && agcStatus.analogue_gain) {\n+\tmetadata.get(\"agc.status\", agcStatus);\n+\tif (agcStatus.shutterTime && agcStatus.analogueGain) {\n \t\tControlList ctrls(sensorCtrls_);\n \t\tapplyAGC(&agcStatus, ctrls);\n \t\tstartConfig->controls = std::move(ctrls);\n@@ -271,8 +271,8 @@ void IPARPi::start(const ControlList &controls, StartConfig *startConfig)\n \tframeCount_ = 0;\n \tcheckCount_ = 0;\n \tif (firstStart_) {\n-\t\tdropFrameCount_ = helper_->HideFramesStartup();\n-\t\tmistrustCount_ = helper_->MistrustFramesStartup();\n+\t\tdropFrameCount_ = helper_->hideFramesStartup();\n+\t\tmistrustCount_ = helper_->mistrustFramesStartup();\n \n \t\t/*\n \t\t * Query the AGC/AWB for how many frames they may take to\n@@ -283,18 +283,18 @@ void IPARPi::start(const ControlList &controls, StartConfig *startConfig)\n \t\t */\n \t\tunsigned int agcConvergenceFrames = 0;\n \t\tRPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm *>(\n-\t\t\tcontroller_.GetAlgorithm(\"agc\"));\n+\t\t\tcontroller_.getAlgorithm(\"agc\"));\n \t\tif (agc) {\n-\t\t\tagcConvergenceFrames = agc->GetConvergenceFrames();\n+\t\t\tagcConvergenceFrames = agc->getConvergenceFrames();\n 
\t\t\tif (agcConvergenceFrames)\n \t\t\t\tagcConvergenceFrames += mistrustCount_;\n \t\t}\n \n \t\tunsigned int awbConvergenceFrames = 0;\n \t\tRPiController::AwbAlgorithm *awb = dynamic_cast<RPiController::AwbAlgorithm *>(\n-\t\t\tcontroller_.GetAlgorithm(\"awb\"));\n+\t\t\tcontroller_.getAlgorithm(\"awb\"));\n \t\tif (awb) {\n-\t\t\tawbConvergenceFrames = awb->GetConvergenceFrames();\n+\t\t\tawbConvergenceFrames = awb->getConvergenceFrames();\n \t\t\tif (awbConvergenceFrames)\n \t\t\t\tawbConvergenceFrames += mistrustCount_;\n \t\t}\n@@ -302,12 +302,12 @@ void IPARPi::start(const ControlList &controls, StartConfig *startConfig)\n \t\tdropFrameCount_ = std::max({ dropFrameCount_, agcConvergenceFrames, awbConvergenceFrames });\n \t\tLOG(IPARPI, Debug) << \"Drop \" << dropFrameCount_ << \" frames on startup\";\n \t} else {\n-\t\tdropFrameCount_ = helper_->HideFramesModeSwitch();\n-\t\tmistrustCount_ = helper_->MistrustFramesModeSwitch();\n+\t\tdropFrameCount_ = helper_->hideFramesModeSwitch();\n+\t\tmistrustCount_ = helper_->mistrustFramesModeSwitch();\n \t}\n \n \tstartConfig->dropFrameCount = dropFrameCount_;\n-\tconst Duration maxSensorFrameDuration = mode_.max_frame_length * mode_.line_length;\n+\tconst Duration maxSensorFrameDuration = mode_.maxFrameLength * mode_.lineLength;\n \tstartConfig->maxSensorFrameLengthMs = maxSensorFrameDuration.get<std::milli>();\n \n \tfirstStart_ = false;\n@@ -319,17 +319,17 @@ void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo)\n \tmode_.bitdepth = sensorInfo.bitsPerPixel;\n \tmode_.width = sensorInfo.outputSize.width;\n \tmode_.height = sensorInfo.outputSize.height;\n-\tmode_.sensor_width = sensorInfo.activeAreaSize.width;\n-\tmode_.sensor_height = sensorInfo.activeAreaSize.height;\n-\tmode_.crop_x = sensorInfo.analogCrop.x;\n-\tmode_.crop_y = sensorInfo.analogCrop.y;\n+\tmode_.sensorWidth = sensorInfo.activeAreaSize.width;\n+\tmode_.sensorHeight = sensorInfo.activeAreaSize.height;\n+\tmode_.cropX = 
sensorInfo.analogCrop.x;\n+\tmode_.cropY = sensorInfo.analogCrop.y;\n \n \t/*\n \t * Calculate scaling parameters. The scale_[xy] factors are determined\n \t * by the ratio between the crop rectangle size and the output size.\n \t */\n-\tmode_.scale_x = sensorInfo.analogCrop.width / sensorInfo.outputSize.width;\n-\tmode_.scale_y = sensorInfo.analogCrop.height / sensorInfo.outputSize.height;\n+\tmode_.scaleX = sensorInfo.analogCrop.width / sensorInfo.outputSize.width;\n+\tmode_.scaleY = sensorInfo.analogCrop.height / sensorInfo.outputSize.height;\n \n \t/*\n \t * We're not told by the pipeline handler how scaling is split between\n@@ -339,30 +339,30 @@ void IPARPi::setMode(const IPACameraSensorInfo &sensorInfo)\n \t *\n \t * \\todo Get the pipeline handle to provide the full data\n \t */\n-\tmode_.bin_x = std::min(2, static_cast<int>(mode_.scale_x));\n-\tmode_.bin_y = std::min(2, static_cast<int>(mode_.scale_y));\n+\tmode_.binX = std::min(2, static_cast<int>(mode_.scaleX));\n+\tmode_.binY = std::min(2, static_cast<int>(mode_.scaleY));\n \n \t/* The noise factor is the square root of the total binning factor. 
*/\n-\tmode_.noise_factor = sqrt(mode_.bin_x * mode_.bin_y);\n+\tmode_.noiseFactor = sqrt(mode_.binX * mode_.binY);\n \n \t/*\n \t * Calculate the line length as the ratio between the line length in\n \t * pixels and the pixel rate.\n \t */\n-\tmode_.line_length = sensorInfo.lineLength * (1.0s / sensorInfo.pixelRate);\n+\tmode_.lineLength = sensorInfo.lineLength * (1.0s / sensorInfo.pixelRate);\n \n \t/*\n \t * Set the frame length limits for the mode to ensure exposure and\n \t * framerate calculations are clipped appropriately.\n \t */\n-\tmode_.min_frame_length = sensorInfo.minFrameLength;\n-\tmode_.max_frame_length = sensorInfo.maxFrameLength;\n+\tmode_.minFrameLength = sensorInfo.minFrameLength;\n+\tmode_.maxFrameLength = sensorInfo.maxFrameLength;\n \n \t/*\n \t * Some sensors may have different sensitivities in different modes;\n \t * the CamHelper will know the correct value.\n \t */\n-\tmode_.sensitivity = helper_->GetModeSensitivity(mode_);\n+\tmode_.sensitivity = helper_->getModeSensitivity(mode_);\n }\n \n int IPARPi::configure(const IPACameraSensorInfo &sensorInfo,\n@@ -421,7 +421,7 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo,\n \t}\n \n \t/* Pass the camera mode to the CamHelper to setup algorithms. */\n-\thelper_->SetCameraMode(mode_);\n+\thelper_->setCameraMode(mode_);\n \n \t/*\n \t * Initialise this ControlList correctly, even if empty, in case the IPA is\n@@ -438,8 +438,8 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo,\n \n \t\t/* Supply initial values for gain and exposure. 
*/\n \t\tAgcStatus agcStatus;\n-\t\tagcStatus.shutter_time = defaultExposureTime;\n-\t\tagcStatus.analogue_gain = defaultAnalogueGain;\n+\t\tagcStatus.shutterTime = defaultExposureTime;\n+\t\tagcStatus.analogueGain = defaultAnalogueGain;\n \t\tapplyAGC(&agcStatus, ctrls);\n \t}\n \n@@ -451,25 +451,25 @@ int IPARPi::configure(const IPACameraSensorInfo &sensorInfo,\n \t * based on the current sensor mode.\n \t */\n \tControlInfoMap::Map ctrlMap = ipaControls;\n-\tconst Duration minSensorFrameDuration = mode_.min_frame_length * mode_.line_length;\n-\tconst Duration maxSensorFrameDuration = mode_.max_frame_length * mode_.line_length;\n+\tconst Duration minSensorFrameDuration = mode_.minFrameLength * mode_.lineLength;\n+\tconst Duration maxSensorFrameDuration = mode_.maxFrameLength * mode_.lineLength;\n \tctrlMap[&controls::FrameDurationLimits] =\n \t\tControlInfo(static_cast<int64_t>(minSensorFrameDuration.get<std::micro>()),\n \t\t\t    static_cast<int64_t>(maxSensorFrameDuration.get<std::micro>()));\n \n \tctrlMap[&controls::AnalogueGain] =\n-\t\tControlInfo(1.0f, static_cast<float>(helper_->Gain(maxSensorGainCode_)));\n+\t\tControlInfo(1.0f, static_cast<float>(helper_->gain(maxSensorGainCode_)));\n \n \t/*\n \t * Calculate the max exposure limit from the frame duration limit as V4L2\n \t * will limit the maximum control value based on the current VBLANK value.\n \t */\n \tDuration maxShutter = Duration::max();\n-\thelper_->GetVBlanking(maxShutter, minSensorFrameDuration, maxSensorFrameDuration);\n+\thelper_->getVBlanking(maxShutter, minSensorFrameDuration, maxSensorFrameDuration);\n \tconst uint32_t exposureMin = sensorCtrls_.at(V4L2_CID_EXPOSURE).min().get<int32_t>();\n \n \tctrlMap[&controls::ExposureTime] =\n-\t\tControlInfo(static_cast<int32_t>(helper_->Exposure(exposureMin).get<std::micro>()),\n+\t\tControlInfo(static_cast<int32_t>(helper_->exposure(exposureMin).get<std::micro>()),\n \t\t\t    static_cast<int32_t>(maxShutter.get<std::micro>()));\n \n 
\tresult->controlInfo = ControlInfoMap(std::move(ctrlMap), controls::controls);\n@@ -536,35 +536,35 @@ void IPARPi::reportMetadata()\n \t * processed can be extracted and placed into the libcamera metadata\n \t * buffer, where an application could query it.\n \t */\n-\tDeviceStatus *deviceStatus = rpiMetadata_.GetLocked<DeviceStatus>(\"device.status\");\n+\tDeviceStatus *deviceStatus = rpiMetadata_.getLocked<DeviceStatus>(\"device.status\");\n \tif (deviceStatus) {\n \t\tlibcameraMetadata_.set(controls::ExposureTime,\n-\t\t\t\t       deviceStatus->shutter_speed.get<std::micro>());\n-\t\tlibcameraMetadata_.set(controls::AnalogueGain, deviceStatus->analogue_gain);\n+\t\t\t\t       deviceStatus->shutterSpeed.get<std::micro>());\n+\t\tlibcameraMetadata_.set(controls::AnalogueGain, deviceStatus->analogueGain);\n \t\tlibcameraMetadata_.set(controls::FrameDuration,\n-\t\t\t\t       helper_->Exposure(deviceStatus->frame_length).get<std::micro>());\n-\t\tif (deviceStatus->sensor_temperature)\n-\t\t\tlibcameraMetadata_.set(controls::SensorTemperature, *deviceStatus->sensor_temperature);\n+\t\t\t\t       helper_->exposure(deviceStatus->frameLength).get<std::micro>());\n+\t\tif (deviceStatus->sensorTemperature)\n+\t\t\tlibcameraMetadata_.set(controls::SensorTemperature, *deviceStatus->sensorTemperature);\n \t}\n \n-\tAgcStatus *agcStatus = rpiMetadata_.GetLocked<AgcStatus>(\"agc.status\");\n+\tAgcStatus *agcStatus = rpiMetadata_.getLocked<AgcStatus>(\"agc.status\");\n \tif (agcStatus) {\n \t\tlibcameraMetadata_.set(controls::AeLocked, agcStatus->locked);\n-\t\tlibcameraMetadata_.set(controls::DigitalGain, agcStatus->digital_gain);\n+\t\tlibcameraMetadata_.set(controls::DigitalGain, agcStatus->digitalGain);\n \t}\n \n-\tLuxStatus *luxStatus = rpiMetadata_.GetLocked<LuxStatus>(\"lux.status\");\n+\tLuxStatus *luxStatus = rpiMetadata_.getLocked<LuxStatus>(\"lux.status\");\n \tif (luxStatus)\n \t\tlibcameraMetadata_.set(controls::Lux, luxStatus->lux);\n \n-\tAwbStatus *awbStatus = 
rpiMetadata_.GetLocked<AwbStatus>(\"awb.status\");\n+\tAwbStatus *awbStatus = rpiMetadata_.getLocked<AwbStatus>(\"awb.status\");\n \tif (awbStatus) {\n-\t\tlibcameraMetadata_.set(controls::ColourGains, { static_cast<float>(awbStatus->gain_r),\n-\t\t\t\t\t\t\t\tstatic_cast<float>(awbStatus->gain_b) });\n-\t\tlibcameraMetadata_.set(controls::ColourTemperature, awbStatus->temperature_K);\n+\t\tlibcameraMetadata_.set(controls::ColourGains, { static_cast<float>(awbStatus->gainR),\n+\t\t\t\t\t\t\t\tstatic_cast<float>(awbStatus->gainB) });\n+\t\tlibcameraMetadata_.set(controls::ColourTemperature, awbStatus->temperatureK);\n \t}\n \n-\tBlackLevelStatus *blackLevelStatus = rpiMetadata_.GetLocked<BlackLevelStatus>(\"black_level.status\");\n+\tBlackLevelStatus *blackLevelStatus = rpiMetadata_.getLocked<BlackLevelStatus>(\"black_level.status\");\n \tif (blackLevelStatus)\n \t\tlibcameraMetadata_.set(controls::SensorBlackLevels,\n \t\t\t\t       { static_cast<int32_t>(blackLevelStatus->black_level_r),\n@@ -572,18 +572,18 @@ void IPARPi::reportMetadata()\n \t\t\t\t\t static_cast<int32_t>(blackLevelStatus->black_level_g),\n \t\t\t\t\t static_cast<int32_t>(blackLevelStatus->black_level_b) });\n \n-\tFocusStatus *focusStatus = rpiMetadata_.GetLocked<FocusStatus>(\"focus.status\");\n+\tFocusStatus *focusStatus = rpiMetadata_.getLocked<FocusStatus>(\"focus.status\");\n \tif (focusStatus && focusStatus->num == 12) {\n \t\t/*\n \t\t * We get a 4x3 grid of regions by default. 
Calculate the average\n \t\t * FoM over the central two positions to give an overall scene FoM.\n \t\t * This can change later if it is not deemed suitable.\n \t\t */\n-\t\tint32_t focusFoM = (focusStatus->focus_measures[5] + focusStatus->focus_measures[6]) / 2;\n+\t\tint32_t focusFoM = (focusStatus->focusMeasures[5] + focusStatus->focusMeasures[6]) / 2;\n \t\tlibcameraMetadata_.set(controls::FocusFoM, focusFoM);\n \t}\n \n-\tCcmStatus *ccmStatus = rpiMetadata_.GetLocked<CcmStatus>(\"ccm.status\");\n+\tCcmStatus *ccmStatus = rpiMetadata_.getLocked<CcmStatus>(\"ccm.status\");\n \tif (ccmStatus) {\n \t\tfloat m[9];\n \t\tfor (unsigned int i = 0; i < 9; i++)\n@@ -695,7 +695,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tswitch (ctrl.first) {\n \t\tcase controls::AE_ENABLE: {\n-\t\t\tRPiController::Algorithm *agc = controller_.GetAlgorithm(\"agc\");\n+\t\t\tRPiController::Algorithm *agc = controller_.getAlgorithm(\"agc\");\n \t\t\tif (!agc) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set AE_ENABLE - no AGC algorithm\";\n@@ -703,9 +703,9 @@ void IPARPi::queueRequest(const ControlList &controls)\n \t\t\t}\n \n \t\t\tif (ctrl.second.get<bool>() == false)\n-\t\t\t\tagc->Pause();\n+\t\t\t\tagc->pause();\n \t\t\telse\n-\t\t\t\tagc->Resume();\n+\t\t\t\tagc->resume();\n \n \t\t\tlibcameraMetadata_.set(controls::AeEnable, ctrl.second.get<bool>());\n \t\t\tbreak;\n@@ -713,7 +713,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::EXPOSURE_TIME: {\n \t\t\tRPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"agc\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"agc\"));\n \t\t\tif (!agc) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set EXPOSURE_TIME - no AGC algorithm\";\n@@ -721,7 +721,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \t\t\t}\n \n \t\t\t/* The control provides units of microseconds. 
*/\n-\t\t\tagc->SetFixedShutter(ctrl.second.get<int32_t>() * 1.0us);\n+\t\t\tagc->setFixedShutter(ctrl.second.get<int32_t>() * 1.0us);\n \n \t\t\tlibcameraMetadata_.set(controls::ExposureTime, ctrl.second.get<int32_t>());\n \t\t\tbreak;\n@@ -729,14 +729,14 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::ANALOGUE_GAIN: {\n \t\t\tRPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"agc\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"agc\"));\n \t\t\tif (!agc) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set ANALOGUE_GAIN - no AGC algorithm\";\n \t\t\t\tbreak;\n \t\t\t}\n \n-\t\t\tagc->SetFixedAnalogueGain(ctrl.second.get<float>());\n+\t\t\tagc->setFixedAnalogueGain(ctrl.second.get<float>());\n \n \t\t\tlibcameraMetadata_.set(controls::AnalogueGain,\n \t\t\t\t\t       ctrl.second.get<float>());\n@@ -745,7 +745,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::AE_METERING_MODE: {\n \t\t\tRPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"agc\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"agc\"));\n \t\t\tif (!agc) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set AE_METERING_MODE - no AGC algorithm\";\n@@ -754,7 +754,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\t\tint32_t idx = ctrl.second.get<int32_t>();\n \t\t\tif (MeteringModeTable.count(idx)) {\n-\t\t\t\tagc->SetMeteringMode(MeteringModeTable.at(idx));\n+\t\t\t\tagc->setMeteringMode(MeteringModeTable.at(idx));\n \t\t\t\tlibcameraMetadata_.set(controls::AeMeteringMode, idx);\n \t\t\t} else {\n \t\t\t\tLOG(IPARPI, Error) << \"Metering mode \" << idx\n@@ -765,7 +765,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::AE_CONSTRAINT_MODE: {\n \t\t\tRPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm 
*>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"agc\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"agc\"));\n \t\t\tif (!agc) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set AE_CONSTRAINT_MODE - no AGC algorithm\";\n@@ -774,7 +774,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\t\tint32_t idx = ctrl.second.get<int32_t>();\n \t\t\tif (ConstraintModeTable.count(idx)) {\n-\t\t\t\tagc->SetConstraintMode(ConstraintModeTable.at(idx));\n+\t\t\t\tagc->setConstraintMode(ConstraintModeTable.at(idx));\n \t\t\t\tlibcameraMetadata_.set(controls::AeConstraintMode, idx);\n \t\t\t} else {\n \t\t\t\tLOG(IPARPI, Error) << \"Constraint mode \" << idx\n@@ -785,7 +785,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::AE_EXPOSURE_MODE: {\n \t\t\tRPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"agc\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"agc\"));\n \t\t\tif (!agc) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set AE_EXPOSURE_MODE - no AGC algorithm\";\n@@ -794,7 +794,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\t\tint32_t idx = ctrl.second.get<int32_t>();\n \t\t\tif (ExposureModeTable.count(idx)) {\n-\t\t\t\tagc->SetExposureMode(ExposureModeTable.at(idx));\n+\t\t\t\tagc->setExposureMode(ExposureModeTable.at(idx));\n \t\t\t\tlibcameraMetadata_.set(controls::AeExposureMode, idx);\n \t\t\t} else {\n \t\t\t\tLOG(IPARPI, Error) << \"Exposure mode \" << idx\n@@ -805,7 +805,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::EXPOSURE_VALUE: {\n \t\t\tRPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"agc\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"agc\"));\n \t\t\tif (!agc) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set EXPOSURE_VALUE - no AGC algorithm\";\n@@ -817,14 +817,14 @@ void IPARPi::queueRequest(const ControlList 
&controls)\n \t\t\t * So convert to 2^EV\n \t\t\t */\n \t\t\tdouble ev = pow(2.0, ctrl.second.get<float>());\n-\t\t\tagc->SetEv(ev);\n+\t\t\tagc->setEv(ev);\n \t\t\tlibcameraMetadata_.set(controls::ExposureValue,\n \t\t\t\t\t       ctrl.second.get<float>());\n \t\t\tbreak;\n \t\t}\n \n \t\tcase controls::AWB_ENABLE: {\n-\t\t\tRPiController::Algorithm *awb = controller_.GetAlgorithm(\"awb\");\n+\t\t\tRPiController::Algorithm *awb = controller_.getAlgorithm(\"awb\");\n \t\t\tif (!awb) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set AWB_ENABLE - no AWB algorithm\";\n@@ -832,9 +832,9 @@ void IPARPi::queueRequest(const ControlList &controls)\n \t\t\t}\n \n \t\t\tif (ctrl.second.get<bool>() == false)\n-\t\t\t\tawb->Pause();\n+\t\t\t\tawb->pause();\n \t\t\telse\n-\t\t\t\tawb->Resume();\n+\t\t\t\tawb->resume();\n \n \t\t\tlibcameraMetadata_.set(controls::AwbEnable,\n \t\t\t\t\t       ctrl.second.get<bool>());\n@@ -843,7 +843,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::AWB_MODE: {\n \t\t\tRPiController::AwbAlgorithm *awb = dynamic_cast<RPiController::AwbAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"awb\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"awb\"));\n \t\t\tif (!awb) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set AWB_MODE - no AWB algorithm\";\n@@ -852,7 +852,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\t\tint32_t idx = ctrl.second.get<int32_t>();\n \t\t\tif (AwbModeTable.count(idx)) {\n-\t\t\t\tawb->SetMode(AwbModeTable.at(idx));\n+\t\t\t\tawb->setMode(AwbModeTable.at(idx));\n \t\t\t\tlibcameraMetadata_.set(controls::AwbMode, idx);\n \t\t\t} else {\n \t\t\t\tLOG(IPARPI, Error) << \"AWB mode \" << idx\n@@ -864,14 +864,14 @@ void IPARPi::queueRequest(const ControlList &controls)\n \t\tcase controls::COLOUR_GAINS: {\n \t\t\tauto gains = ctrl.second.get<Span<const float>>();\n \t\t\tRPiController::AwbAlgorithm *awb = dynamic_cast<RPiController::AwbAlgorithm 
*>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"awb\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"awb\"));\n \t\t\tif (!awb) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set COLOUR_GAINS - no AWB algorithm\";\n \t\t\t\tbreak;\n \t\t\t}\n \n-\t\t\tawb->SetManualGains(gains[0], gains[1]);\n+\t\t\tawb->setManualGains(gains[0], gains[1]);\n \t\t\tif (gains[0] != 0.0f && gains[1] != 0.0f)\n \t\t\t\t/* A gain of 0.0f will switch back to auto mode. */\n \t\t\t\tlibcameraMetadata_.set(controls::ColourGains,\n@@ -881,14 +881,14 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::BRIGHTNESS: {\n \t\t\tRPiController::ContrastAlgorithm *contrast = dynamic_cast<RPiController::ContrastAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"contrast\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"contrast\"));\n \t\t\tif (!contrast) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set BRIGHTNESS - no contrast algorithm\";\n \t\t\t\tbreak;\n \t\t\t}\n \n-\t\t\tcontrast->SetBrightness(ctrl.second.get<float>() * 65536);\n+\t\t\tcontrast->setBrightness(ctrl.second.get<float>() * 65536);\n \t\t\tlibcameraMetadata_.set(controls::Brightness,\n \t\t\t\t\t       ctrl.second.get<float>());\n \t\t\tbreak;\n@@ -896,14 +896,14 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::CONTRAST: {\n \t\t\tRPiController::ContrastAlgorithm *contrast = dynamic_cast<RPiController::ContrastAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"contrast\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"contrast\"));\n \t\t\tif (!contrast) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set CONTRAST - no contrast algorithm\";\n \t\t\t\tbreak;\n \t\t\t}\n \n-\t\t\tcontrast->SetContrast(ctrl.second.get<float>());\n+\t\t\tcontrast->setContrast(ctrl.second.get<float>());\n \t\t\tlibcameraMetadata_.set(controls::Contrast,\n \t\t\t\t\t       ctrl.second.get<float>());\n \t\t\tbreak;\n@@ -911,14 +911,14 @@ void IPARPi::queueRequest(const ControlList 
&controls)\n \n \t\tcase controls::SATURATION: {\n \t\t\tRPiController::CcmAlgorithm *ccm = dynamic_cast<RPiController::CcmAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"ccm\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"ccm\"));\n \t\t\tif (!ccm) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set SATURATION - no ccm algorithm\";\n \t\t\t\tbreak;\n \t\t\t}\n \n-\t\t\tccm->SetSaturation(ctrl.second.get<float>());\n+\t\t\tccm->setSaturation(ctrl.second.get<float>());\n \t\t\tlibcameraMetadata_.set(controls::Saturation,\n \t\t\t\t\t       ctrl.second.get<float>());\n \t\t\tbreak;\n@@ -926,14 +926,14 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::SHARPNESS: {\n \t\t\tRPiController::SharpenAlgorithm *sharpen = dynamic_cast<RPiController::SharpenAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"sharpen\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"sharpen\"));\n \t\t\tif (!sharpen) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set SHARPNESS - no sharpen algorithm\";\n \t\t\t\tbreak;\n \t\t\t}\n \n-\t\t\tsharpen->SetStrength(ctrl.second.get<float>());\n+\t\t\tsharpen->setStrength(ctrl.second.get<float>());\n \t\t\tlibcameraMetadata_.set(controls::Sharpness,\n \t\t\t\t\t       ctrl.second.get<float>());\n \t\t\tbreak;\n@@ -952,7 +952,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \n \t\tcase controls::NOISE_REDUCTION_MODE: {\n \t\t\tRPiController::DenoiseAlgorithm *sdn = dynamic_cast<RPiController::DenoiseAlgorithm *>(\n-\t\t\t\tcontroller_.GetAlgorithm(\"SDN\"));\n+\t\t\t\tcontroller_.getAlgorithm(\"SDN\"));\n \t\t\tif (!sdn) {\n \t\t\t\tLOG(IPARPI, Warning)\n \t\t\t\t\t<< \"Could not set NOISE_REDUCTION_MODE - no SDN algorithm\";\n@@ -962,7 +962,7 @@ void IPARPi::queueRequest(const ControlList &controls)\n \t\t\tint32_t idx = ctrl.second.get<int32_t>();\n \t\t\tauto mode = DenoiseModeTable.find(idx);\n \t\t\tif (mode != DenoiseModeTable.end()) 
{\n-\t\t\t\tsdn->SetMode(mode->second);\n+\t\t\t\tsdn->setMode(mode->second);\n \n \t\t\t\t/*\n \t\t\t\t * \\todo If the colour denoise is not going to run due to an\n@@ -1014,7 +1014,7 @@ void IPARPi::prepareISP(const ISPConfig &data)\n \t * This may overwrite the DeviceStatus using values from the sensor\n \t * metadata, and may also do additional custom processing.\n \t */\n-\thelper_->Prepare(embeddedBuffer, rpiMetadata_);\n+\thelper_->prepare(embeddedBuffer, rpiMetadata_);\n \n \t/* Done with embedded data now, return to pipeline handler asap. */\n \tif (data.embeddedBufferPresent)\n@@ -1030,7 +1030,7 @@ void IPARPi::prepareISP(const ISPConfig &data)\n \t\t * current frame, or any other bits of metadata that were added\n \t\t * in helper_->Prepare().\n \t\t */\n-\t\trpiMetadata_.Merge(lastMetadata);\n+\t\trpiMetadata_.merge(lastMetadata);\n \t\tprocessPending_ = false;\n \t\treturn;\n \t}\n@@ -1040,48 +1040,48 @@ void IPARPi::prepareISP(const ISPConfig &data)\n \n \tControlList ctrls(ispCtrls_);\n \n-\tcontroller_.Prepare(&rpiMetadata_);\n+\tcontroller_.prepare(&rpiMetadata_);\n \n \t/* Lock the metadata buffer to avoid constant locks/unlocks. 
*/\n \tstd::unique_lock<RPiController::Metadata> lock(rpiMetadata_);\n \n-\tAwbStatus *awbStatus = rpiMetadata_.GetLocked<AwbStatus>(\"awb.status\");\n+\tAwbStatus *awbStatus = rpiMetadata_.getLocked<AwbStatus>(\"awb.status\");\n \tif (awbStatus)\n \t\tapplyAWB(awbStatus, ctrls);\n \n-\tCcmStatus *ccmStatus = rpiMetadata_.GetLocked<CcmStatus>(\"ccm.status\");\n+\tCcmStatus *ccmStatus = rpiMetadata_.getLocked<CcmStatus>(\"ccm.status\");\n \tif (ccmStatus)\n \t\tapplyCCM(ccmStatus, ctrls);\n \n-\tAgcStatus *dgStatus = rpiMetadata_.GetLocked<AgcStatus>(\"agc.status\");\n+\tAgcStatus *dgStatus = rpiMetadata_.getLocked<AgcStatus>(\"agc.status\");\n \tif (dgStatus)\n \t\tapplyDG(dgStatus, ctrls);\n \n-\tAlscStatus *lsStatus = rpiMetadata_.GetLocked<AlscStatus>(\"alsc.status\");\n+\tAlscStatus *lsStatus = rpiMetadata_.getLocked<AlscStatus>(\"alsc.status\");\n \tif (lsStatus)\n \t\tapplyLS(lsStatus, ctrls);\n \n-\tContrastStatus *contrastStatus = rpiMetadata_.GetLocked<ContrastStatus>(\"contrast.status\");\n+\tContrastStatus *contrastStatus = rpiMetadata_.getLocked<ContrastStatus>(\"contrast.status\");\n \tif (contrastStatus)\n \t\tapplyGamma(contrastStatus, ctrls);\n \n-\tBlackLevelStatus *blackLevelStatus = rpiMetadata_.GetLocked<BlackLevelStatus>(\"black_level.status\");\n+\tBlackLevelStatus *blackLevelStatus = rpiMetadata_.getLocked<BlackLevelStatus>(\"black_level.status\");\n \tif (blackLevelStatus)\n \t\tapplyBlackLevel(blackLevelStatus, ctrls);\n \n-\tGeqStatus *geqStatus = rpiMetadata_.GetLocked<GeqStatus>(\"geq.status\");\n+\tGeqStatus *geqStatus = rpiMetadata_.getLocked<GeqStatus>(\"geq.status\");\n \tif (geqStatus)\n \t\tapplyGEQ(geqStatus, ctrls);\n \n-\tDenoiseStatus *denoiseStatus = rpiMetadata_.GetLocked<DenoiseStatus>(\"denoise.status\");\n+\tDenoiseStatus *denoiseStatus = rpiMetadata_.getLocked<DenoiseStatus>(\"denoise.status\");\n \tif (denoiseStatus)\n \t\tapplyDenoise(denoiseStatus, ctrls);\n \n-\tSharpenStatus *sharpenStatus = 
rpiMetadata_.GetLocked<SharpenStatus>(\"sharpen.status\");\n+\tSharpenStatus *sharpenStatus = rpiMetadata_.getLocked<SharpenStatus>(\"sharpen.status\");\n \tif (sharpenStatus)\n \t\tapplySharpen(sharpenStatus, ctrls);\n \n-\tDpcStatus *dpcStatus = rpiMetadata_.GetLocked<DpcStatus>(\"dpc.status\");\n+\tDpcStatus *dpcStatus = rpiMetadata_.getLocked<DpcStatus>(\"dpc.status\");\n \tif (dpcStatus)\n \t\tapplyDPC(dpcStatus, ctrls);\n \n@@ -1097,13 +1097,13 @@ void IPARPi::fillDeviceStatus(const ControlList &sensorControls)\n \tint32_t gainCode = sensorControls.get(V4L2_CID_ANALOGUE_GAIN).get<int32_t>();\n \tint32_t vblank = sensorControls.get(V4L2_CID_VBLANK).get<int32_t>();\n \n-\tdeviceStatus.shutter_speed = helper_->Exposure(exposureLines);\n-\tdeviceStatus.analogue_gain = helper_->Gain(gainCode);\n-\tdeviceStatus.frame_length = mode_.height + vblank;\n+\tdeviceStatus.shutterSpeed = helper_->exposure(exposureLines);\n+\tdeviceStatus.analogueGain = helper_->gain(gainCode);\n+\tdeviceStatus.frameLength = mode_.height + vblank;\n \n \tLOG(IPARPI, Debug) << \"Metadata - \" << deviceStatus;\n \n-\trpiMetadata_.Set(\"device.status\", deviceStatus);\n+\trpiMetadata_.set(\"device.status\", deviceStatus);\n }\n \n void IPARPi::processStats(unsigned int bufferId)\n@@ -1117,11 +1117,11 @@ void IPARPi::processStats(unsigned int bufferId)\n \tSpan<uint8_t> mem = it->second.planes()[0];\n \tbcm2835_isp_stats *stats = reinterpret_cast<bcm2835_isp_stats *>(mem.data());\n \tRPiController::StatisticsPtr statistics = std::make_shared<bcm2835_isp_stats>(*stats);\n-\thelper_->Process(statistics, rpiMetadata_);\n-\tcontroller_.Process(statistics, &rpiMetadata_);\n+\thelper_->process(statistics, rpiMetadata_);\n+\tcontroller_.process(statistics, &rpiMetadata_);\n \n \tstruct AgcStatus agcStatus;\n-\tif (rpiMetadata_.Get(\"agc.status\", agcStatus) == 0) {\n+\tif (rpiMetadata_.get(\"agc.status\", agcStatus) == 0) {\n \t\tControlList ctrls(sensorCtrls_);\n \t\tapplyAGC(&agcStatus, ctrls);\n 
\n@@ -1131,19 +1131,19 @@ void IPARPi::processStats(unsigned int bufferId)\n \n void IPARPi::applyAWB(const struct AwbStatus *awbStatus, ControlList &ctrls)\n {\n-\tLOG(IPARPI, Debug) << \"Applying WB R: \" << awbStatus->gain_r << \" B: \"\n-\t\t\t   << awbStatus->gain_b;\n+\tLOG(IPARPI, Debug) << \"Applying WB R: \" << awbStatus->gainR << \" B: \"\n+\t\t\t   << awbStatus->gainB;\n \n \tctrls.set(V4L2_CID_RED_BALANCE,\n-\t\t  static_cast<int32_t>(awbStatus->gain_r * 1000));\n+\t\t  static_cast<int32_t>(awbStatus->gainR * 1000));\n \tctrls.set(V4L2_CID_BLUE_BALANCE,\n-\t\t  static_cast<int32_t>(awbStatus->gain_b * 1000));\n+\t\t  static_cast<int32_t>(awbStatus->gainB * 1000));\n }\n \n void IPARPi::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDuration)\n {\n-\tconst Duration minSensorFrameDuration = mode_.min_frame_length * mode_.line_length;\n-\tconst Duration maxSensorFrameDuration = mode_.max_frame_length * mode_.line_length;\n+\tconst Duration minSensorFrameDuration = mode_.minFrameLength * mode_.lineLength;\n+\tconst Duration maxSensorFrameDuration = mode_.maxFrameLength * mode_.lineLength;\n \n \t/*\n \t * This will only be applied once AGC recalculations occur.\n@@ -1164,20 +1164,20 @@ void IPARPi::applyFrameDurations(Duration minFrameDuration, Duration maxFrameDur\n \n \t/*\n \t * Calculate the maximum exposure time possible for the AGC to use.\n-\t * GetVBlanking() will update maxShutter with the largest exposure\n+\t * getVBlanking() will update maxShutter with the largest exposure\n \t * value possible.\n \t */\n \tDuration maxShutter = Duration::max();\n-\thelper_->GetVBlanking(maxShutter, minFrameDuration_, maxFrameDuration_);\n+\thelper_->getVBlanking(maxShutter, minFrameDuration_, maxFrameDuration_);\n \n \tRPiController::AgcAlgorithm *agc = dynamic_cast<RPiController::AgcAlgorithm 
*>(\n-\t\tcontroller_.GetAlgorithm(\"agc\"));\n-\tagc->SetMaxShutter(maxShutter);\n+\t\tcontroller_.getAlgorithm(\"agc\"));\n+\tagc->setMaxShutter(maxShutter);\n }\n \n void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls)\n {\n-\tint32_t gainCode = helper_->GainCode(agcStatus->analogue_gain);\n+\tint32_t gainCode = helper_->gainCode(agcStatus->analogueGain);\n \n \t/*\n \t * Ensure anything larger than the max gain code will not be passed to\n@@ -1186,15 +1186,15 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls)\n \t */\n \tgainCode = std::min<int32_t>(gainCode, maxSensorGainCode_);\n \n-\t/* GetVBlanking might clip exposure time to the fps limits. */\n-\tDuration exposure = agcStatus->shutter_time;\n-\tint32_t vblanking = helper_->GetVBlanking(exposure, minFrameDuration_, maxFrameDuration_);\n-\tint32_t exposureLines = helper_->ExposureLines(exposure);\n+\t/* getVBlanking might clip exposure time to the fps limits. */\n+\tDuration exposure = agcStatus->shutterTime;\n+\tint32_t vblanking = helper_->getVBlanking(exposure, minFrameDuration_, maxFrameDuration_);\n+\tint32_t exposureLines = helper_->exposureLines(exposure);\n \n \tLOG(IPARPI, Debug) << \"Applying AGC Exposure: \" << exposure\n \t\t\t   << \" (Shutter lines: \" << exposureLines << \", AGC requested \"\n-\t\t\t   << agcStatus->shutter_time << \") Gain: \"\n-\t\t\t   << agcStatus->analogue_gain << \" (Gain Code: \"\n+\t\t\t   << agcStatus->shutterTime << \") Gain: \"\n+\t\t\t   << agcStatus->analogueGain << \" (Gain Code: \"\n \t\t\t   << gainCode << \")\";\n \n \t/*\n@@ -1210,7 +1210,7 @@ void IPARPi::applyAGC(const struct AgcStatus *agcStatus, ControlList &ctrls)\n void IPARPi::applyDG(const struct AgcStatus *dgStatus, ControlList &ctrls)\n {\n \tctrls.set(V4L2_CID_DIGITAL_GAIN,\n-\t\t  static_cast<int32_t>(dgStatus->digital_gain * 1000));\n+\t\t  static_cast<int32_t>(dgStatus->digitalGain * 1000));\n }\n \n void IPARPi::applyCCM(const struct 
CcmStatus *ccmStatus, ControlList &ctrls)\n","prefixes":["libcamera-devel","01/15"]}