{"id":13214,"url":"https://patchwork.libcamera.org/api/patches/13214/?format=json","web_url":"https://patchwork.libcamera.org/patch/13214/","project":{"id":1,"url":"https://patchwork.libcamera.org/api/projects/1/?format=json","name":"libcamera","link_name":"libcamera","list_id":"libcamera_core","list_email":"libcamera-devel@lists.libcamera.org","web_url":"","scm_url":"","webscm_url":""},"msgid":"<20210805142154.20324-4-david.plowman@raspberrypi.com>","date":"2021-08-05T14:21:54","name":"[libcamera-devel,3/3] libcamera: pipeline: raspberrypi: Support colour spaces","commit_ref":null,"pull_url":null,"state":"superseded","archived":false,"hash":"f28bc3bb0274133b0bdd6ba9fdca3e18df51cfb9","submitter":{"id":42,"url":"https://patchwork.libcamera.org/api/people/42/?format=json","name":"David Plowman","email":"david.plowman@raspberrypi.com"},"delegate":null,"mbox":"https://patchwork.libcamera.org/patch/13214/mbox/","series":[{"id":2308,"url":"https://patchwork.libcamera.org/api/series/2308/?format=json","web_url":"https://patchwork.libcamera.org/project/libcamera/list/?series=2308","date":"2021-08-05T14:21:51","name":"Colour spaces","version":1,"mbox":"https://patchwork.libcamera.org/series/2308/mbox/"}],"comments":"https://patchwork.libcamera.org/api/patches/13214/comments/","check":"pending","checks":"https://patchwork.libcamera.org/api/patches/13214/checks/","tags":{},"headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id E9B2BC3239\n\tfor <parsemail@patchwork.libcamera.org>;\n\tThu,  5 Aug 2021 14:22:02 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id A64516884E;\n\tThu,  5 Aug 2021 16:22:02 +0200 (CEST)","from 
mail-wr1-x434.google.com (mail-wr1-x434.google.com\n\t[IPv6:2a00:1450:4864:20::434])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 93E3568811\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu,  5 Aug 2021 16:22:00 +0200 (CEST)","by mail-wr1-x434.google.com with SMTP id c9so6773294wri.8\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tThu, 05 Aug 2021 07:22:00 -0700 (PDT)","from pi4-davidp.pitowers.org\n\t([2a00:1098:3142:14:1ce1:9965:4328:89c4])\n\tby smtp.gmail.com with ESMTPSA id\n\tk17sm6410004wrw.53.2021.08.05.07.21.59\n\t(version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256);\n\tThu, 05 Aug 2021 07:21:59 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (2048-bit key;\n\tunprotected) header.d=raspberrypi.com header.i=@raspberrypi.com\n\theader.b=\"othEu7FX\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=raspberrypi.com; s=google;\n\th=from:to:cc:subject:date:message-id:in-reply-to:references\n\t:mime-version:content-transfer-encoding;\n\tbh=eUdwrCoEDCpXkt838qfY4LnGgP6uSf9WmosSrHwUl70=;\n\tb=othEu7FXGxXIca/+L5zyGJXmQvYw6zHNEd+JEqg9yW86cf2hudBu47jOfG8nwwE0r0\n\tNlx9aH9wYOIRsMBpEImOJ8dyV9hOKXLrJNnhMkJkUdSz9KA7VnbpOooomdZF1AjdVjMn\n\tqxfb2sfDWfv0O5hARoy9APmIQwvzGfWLU12a90h2Vnch+aKLMGyckARv8+V2WcC5hke7\n\tZPm5az6EZHDWQ16p4PmYzluDveAjOedtRoG1Ref/XGIYuSqAjMQHyI4D0phmGIMmPHKk\n\tesDN8djWFi/ewl9EYsCXFB53tZpatG9dQdo6zlOcIV1w6qrqVi5RtiDyWJedBdveOGpy\n\t10cg==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025;\n\th=x-gm-message-state:from:to:cc:subject:date:message-id:in-reply-to\n\t:references:mime-version:content-transfer-encoding;\n\tbh=eUdwrCoEDCpXkt838qfY4LnGgP6uSf9WmosSrHwUl70=;\n\tb=bFfg7wFZTQp5ObLR/0YNkNXf0Dg0onfA9LgCC/2ORskftR/VCpA9tom56dljcrGiKE\n\t7iF8CjN8Z1MkSc8Iy7rkhbrel/SMFlUs72vqF4B8m8L8aojpn64U19rkjEH3mrnRpBVN\n\trEsLHT6mFHP3DwHncggmBsHHfZubPsOZOALnQs9jwaSvJtkfJ9oq9X4JyuBzSQ3r1qsa\n\tz++vf9wiCrvDpw+bQgjU8/h3AqJTfXRVe+JkQenWRs+dmwAar5Xgr2QUoWyF4OwMD3up\n\t1bRWJpbbG+caxBOCY+HyaGTZd6g1P1Uen1SU26Z21Nwk3NMcTL5Zhiy7S43Rra3JDbmR\n\tGG3w==","X-Gm-Message-State":"AOAM532m/9JmhZMczGdOhRuhH1v1F89u2M9rRLEXYYeMqy+9u/1MMhyb\n\t5GM9Dz406KBcSLaivlcmL2wC+CBJZSt/kA==","X-Google-Smtp-Source":"ABdhPJwMLZ5rQZmd68idg43kOZBuKsx/xaGCnxCIQNu3yKF+BXbXoz/8eyWLfIa2IFLLloTSjufIUQ==","X-Received":"by 2002:a5d:6447:: with SMTP id d7mr5705367wrw.72.1628173319945; \n\tThu, 05 Aug 2021 07:21:59 -0700 (PDT)","From":"David Plowman <david.plowman@raspberrypi.com>","To":"libcamera-devel@lists.libcamera.org","Date":"Thu,  5 Aug 2021 15:21:54 +0100","Message-Id":"<20210805142154.20324-4-david.plowman@raspberrypi.com>","X-Mailer":"git-send-email 2.20.1","In-Reply-To":"<20210805142154.20324-1-david.plowman@raspberrypi.com>","References":"<20210805142154.20324-1-david.plowman@raspberrypi.com>","MIME-Version":"1.0","Content-Transfer-Encoding":"8bit","Subject":"[libcamera-devel] [PATCH 3/3] libcamera: pipeline: raspberrypi:\n\tSupport colour 
spaces","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"},"content":"The Raspberry Pi pipeline handler now sets colour spaces correctly.\n\nIn generateConfiguration() it sets them to reasonable default values\nbased on the stream role. Note how video recording streams use the\n\"VIDEO\" YCbCr encoding, which will later be fixed up according to the\nrequested resolution.\n\nvalidate() and configure() check that the colour space is valid, and also\nforce all (non-raw) output streams to share the same colour space\n(which is a hardware restriction). 
Finally, the \"VIDEO\" YCbCr encoding\nis corrected by configure() to be one of REC601, REC709 or REC2020.\n\nSigned-off-by: David Plowman <david.plowman@raspberrypi.com>\n---\n .../pipeline/raspberrypi/raspberrypi.cpp      | 84 +++++++++++++++++++\n 1 file changed, 84 insertions(+)","diff":"diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\nindex 0bab3bed..0119b1ca 100644\n--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n@@ -281,6 +281,24 @@ RPiCameraConfiguration::RPiCameraConfiguration(const RPiCameraData *data)\n {\n }\n \n+static bool validateColorSpace(ColorSpace &colorSpace)\n+{\n+\tconst std::vector<ColorSpace> validColorSpaces = {\n+\t\tColorSpace::JFIF,\n+\t\tColorSpace::SMPTE170M,\n+\t\tColorSpace::REC709,\n+\t\tColorSpace::REC2020,\n+\t\tColorSpace::VIDEO,\n+\t};\n+\tauto it = std::find_if(validColorSpaces.begin(), validColorSpaces.end(),\n+\t\t\t       [&colorSpace](const ColorSpace &item) { return colorSpace == item; });\n+\tif (it != validColorSpaces.end())\n+\t\treturn true;\n+\n+\tcolorSpace = ColorSpace::JFIF;\n+\treturn false;\n+}\n+\n CameraConfiguration::Status RPiCameraConfiguration::validate()\n {\n \tStatus status = Valid;\n@@ -346,6 +364,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n \tunsigned int rawCount = 0, outCount = 0, count = 0, maxIndex = 0;\n \tstd::pair<int, Size> outSize[2];\n \tSize maxSize;\n+\tColorSpace colorSpace; /* colour space for all non-raw output streams */\n \tfor (StreamConfiguration &cfg : config_) {\n \t\tif (isRaw(cfg.pixelFormat)) {\n \t\t\t/*\n@@ -354,6 +373,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n \t\t\t */\n \t\t\tV4L2VideoDevice::Formats fmts = data_->unicam_[Unicam::Image].dev()->formats();\n \t\t\tV4L2DeviceFormat sensorFormat = findBestMode(fmts, cfg.size);\n+\t\t\tsensorFormat.colorSpace = ColorSpace::RAW;\n \t\t\tint ret 
= data_->unicam_[Unicam::Image].dev()->tryFormat(&sensorFormat);\n \t\t\tif (ret)\n \t\t\t\treturn Invalid;\n@@ -392,6 +412,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n \t\t\tif (maxSize < cfg.size) {\n \t\t\t\tmaxSize = cfg.size;\n \t\t\t\tmaxIndex = outCount;\n+\t\t\t\tcolorSpace = cfg.colorSpace;\n \t\t\t}\n \t\t\toutCount++;\n \t\t}\n@@ -405,6 +426,10 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n \t\t}\n \t}\n \n+\t/* Ensure the output colour space (if present) is one we handle. */\n+\tif (outCount)\n+\t\tvalidateColorSpace(colorSpace);\n+\n \t/*\n \t * Now do any fixups needed. For the two ISP outputs, one stream must be\n \t * equal or smaller than the other in all dimensions.\n@@ -446,10 +471,26 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n \t\t\tstatus = Adjusted;\n \t\t}\n \n+\t\t/* Output streams must share the same colour space. */\n+\t\tif (cfg.colorSpace != colorSpace) {\n+\t\t\tLOG(RPI, Warning) << \"Output stream \" << (i == maxIndex ? 0 : 1)\n+\t\t\t\t\t  << \" colour space changed\";\n+\t\t\tcfg.colorSpace = colorSpace;\n+\t\t\tstatus = Adjusted;\n+\t\t}\n+\n \t\tV4L2DeviceFormat format;\n \t\tformat.fourcc = dev->toV4L2PixelFormat(cfg.pixelFormat);\n \t\tformat.size = cfg.size;\n \n+\t\t/*\n+\t\t * Request a sensible colour space. 
Note that \"VIDEO\" isn't a real\n+\t\t * encoding, so substitute something else sensible.\n+\t\t */\n+\t\tformat.colorSpace = colorSpace;\n+\t\tif (format.colorSpace.encoding == ColorSpace::Encoding::VIDEO)\n+\t\t\tformat.colorSpace.encoding = ColorSpace::Encoding::REC601;\n+\n \t\tint ret = dev->tryFormat(&format);\n \t\tif (ret)\n \t\t\treturn Invalid;\n@@ -475,6 +516,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n \tV4L2DeviceFormat sensorFormat;\n \tunsigned int bufferCount;\n \tPixelFormat pixelFormat;\n+\tColorSpace colorSpace;\n \tV4L2VideoDevice::Formats fmts;\n \tSize size;\n \n@@ -490,6 +532,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n \t\t\tfmts = data->unicam_[Unicam::Image].dev()->formats();\n \t\t\tsensorFormat = findBestMode(fmts, size);\n \t\t\tpixelFormat = sensorFormat.fourcc.toPixelFormat();\n+\t\t\tcolorSpace = ColorSpace::RAW;\n \t\t\tASSERT(pixelFormat.isValid());\n \t\t\tbufferCount = 2;\n \t\t\trawCount++;\n@@ -501,6 +544,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n \t\t\t/* Return the largest sensor resolution. 
*/\n \t\t\tsize = data->sensor_->resolution();\n \t\t\tbufferCount = 1;\n+\t\t\tcolorSpace = ColorSpace::JFIF;\n \t\t\toutCount++;\n \t\t\tbreak;\n \n@@ -517,6 +561,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n \t\t\tpixelFormat = formats::YUV420;\n \t\t\tsize = { 1920, 1080 };\n \t\t\tbufferCount = 4;\n+\t\t\tcolorSpace = ColorSpace::VIDEO;\n \t\t\toutCount++;\n \t\t\tbreak;\n \n@@ -525,6 +570,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n \t\t\tpixelFormat = formats::ARGB8888;\n \t\t\tsize = { 800, 600 };\n \t\t\tbufferCount = 4;\n+\t\t\tcolorSpace = ColorSpace::JFIF;\n \t\t\toutCount++;\n \t\t\tbreak;\n \n@@ -554,6 +600,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n \t\tStreamConfiguration cfg(formats);\n \t\tcfg.size = size;\n \t\tcfg.pixelFormat = pixelFormat;\n+\t\tcfg.colorSpace = colorSpace;\n \t\tcfg.bufferCount = bufferCount;\n \t\tconfig->addConfiguration(cfg);\n \t}\n@@ -575,6 +622,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \tSize maxSize, sensorSize;\n \tunsigned int maxIndex = 0;\n \tbool rawStream = false;\n+\tColorSpace colorSpace; /* colour space for all non-raw output streams */\n \n \t/*\n \t * Look for the RAW stream (if given) size as well as the largest\n@@ -593,14 +641,40 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \t\t} else {\n \t\t\tif (cfg.size > maxSize) {\n \t\t\t\tmaxSize = config->at(i).size;\n+\t\t\t\tcolorSpace = config->at(i).colorSpace;\n \t\t\t\tmaxIndex = i;\n \t\t\t}\n \t\t}\n \t}\n \n+\tif (maxSize.isNull()) {\n+\t\t/*\n+\t\t * No non-raw streams, so some will get made below. Doesn't matter\n+\t\t * what colour space we assign to them.\n+\t\t */\n+\t\tcolorSpace = ColorSpace::JFIF;\n+\t} else {\n+\t\t/* Make sure we can handle this colour space. 
*/\n+\t\tvalidateColorSpace(colorSpace);\n+\n+\t\t/*\n+\t\t * The \"VIDEO\" colour encoding means that we choose one of REC601,\n+\t\t * REC709 or REC2020 automatically according to the resolution.\n+\t\t */\n+\t\tif (colorSpace.encoding == ColorSpace::Encoding::VIDEO) {\n+\t\t\tif (maxSize.width >= 3840)\n+\t\t\t\tcolorSpace.encoding = ColorSpace::Encoding::REC2020;\n+\t\t\telse if (maxSize.width >= 1280)\n+\t\t\t\tcolorSpace.encoding = ColorSpace::Encoding::REC709;\n+\t\t\telse\n+\t\t\t\tcolorSpace.encoding = ColorSpace::Encoding::REC601;\n+\t\t}\n+\t}\n+\n \t/* First calculate the best sensor mode we can use based on the user request. */\n \tV4L2VideoDevice::Formats fmts = data->unicam_[Unicam::Image].dev()->formats();\n \tV4L2DeviceFormat sensorFormat = findBestMode(fmts, rawStream ? sensorSize : maxSize);\n+\tsensorFormat.colorSpace = ColorSpace::RAW;\n \n \t/*\n \t * Unicam image output format. The ISP input format gets set at start,\n@@ -638,11 +712,18 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \t\tStreamConfiguration &cfg = config->at(i);\n \n \t\tif (isRaw(cfg.pixelFormat)) {\n+\t\t\tcfg.colorSpace = ColorSpace::RAW;\n \t\t\tcfg.setStream(&data->unicam_[Unicam::Image]);\n \t\t\tdata->unicam_[Unicam::Image].setExternal(true);\n \t\t\tcontinue;\n \t\t}\n \n+\t\t/* All other streams share the same colour space. */\n+\t\tif (cfg.colorSpace != colorSpace) {\n+\t\t\tLOG(RPI, Warning) << \"Stream \" << i << \" colour space changed\";\n+\t\t\tcfg.colorSpace = colorSpace;\n+\t\t}\n+\n \t\t/* The largest resolution gets routed to the ISP Output 0 node. */\n \t\tRPi::Stream *stream = i == maxIndex ? 
&data->isp_[Isp::Output0]\n \t\t\t\t\t\t    : &data->isp_[Isp::Output1];\n@@ -650,6 +731,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \t\tV4L2PixelFormat fourcc = stream->dev()->toV4L2PixelFormat(cfg.pixelFormat);\n \t\tformat.size = cfg.size;\n \t\tformat.fourcc = fourcc;\n+\t\tformat.colorSpace = cfg.colorSpace;\n \n \t\tLOG(RPI, Debug) << \"Setting \" << stream->name() << \" to \"\n \t\t\t\t<< format.toString();\n@@ -689,6 +771,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \t\tformat = {};\n \t\tformat.size = maxSize;\n \t\tformat.fourcc = V4L2PixelFormat::fromPixelFormat(formats::YUV420, false);\n+\t\tformat.colorSpace = colorSpace;\n \t\tret = data->isp_[Isp::Output0].dev()->setFormat(&format);\n \t\tif (ret) {\n \t\t\tLOG(RPI, Error)\n@@ -718,6 +801,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n \t\tconst Size limit = maxDimensions.boundedToAspectRatio(format.size);\n \n \t\toutput1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2);\n+\t\toutput1Format.colorSpace = colorSpace;\n \n \t\tLOG(RPI, Debug) << \"Setting ISP Output1 (internal) to \"\n \t\t\t\t<< output1Format.toString();\n","prefixes":["libcamera-devel","3/3"]}