[{"id":20075,"web_url":"https://patchwork.libcamera.org/comment/20075/","msgid":"<YVzckwrQXcY8eG/g@pendragon.ideasonboard.com>","date":"2021-10-05T23:15:31","subject":"Re: [libcamera-devel] [PATCH v2 3/3] libcamera: pipeline:\n\traspberrypi: Support colour spaces","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi David,\n\nThank you for the patch.\n\nOn Mon, Sep 27, 2021 at 01:33:27PM +0100, David Plowman wrote:\n> The Raspberry Pi pipeline handler now sets colour spaces correctly.\n> \n> In generateConfiguration() is sets them to reasonable default values\n\ns/is/it/\n\n> based on the stream role. Note how video recording streams use the\n> \"VIDEO\" YCbCr encoding, which will later be fixed up according to the\n> requested resolution.\n> \n> validate() and configure() check the colour space is valid, and also\n> force all (non-raw) output streams to share the same colour space\n> (which is a hardware restriction). 
Finally, the \"VIDEO\" YCbCr encoding\n> is corrected by configure() to be one of REC601, REC709 or REC2020.\n> \n> Signed-off-by: David Plowman <david.plowman@raspberrypi.com>\n> ---\n>  .../pipeline/raspberrypi/raspberrypi.cpp      | 84 +++++++++++++++++++\n>  1 file changed, 84 insertions(+)\n> \n> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> index 0bdfa727..33ab49d6 100644\n> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp\n> @@ -281,6 +281,24 @@ RPiCameraConfiguration::RPiCameraConfiguration(const RPiCameraData *data)\n>  {\n>  }\n>  \n> +static bool validateColorSpace(ColorSpace &colorSpace)\n> +{\n> +\tconst std::vector<ColorSpace> validColorSpaces = {\n\nstatic ?\n\n> +\t\tColorSpace::JFIF,\n> +\t\tColorSpace::SMPTE170M,\n> +\t\tColorSpace::REC709,\n> +\t\tColorSpace::REC2020,\n> +\t\tColorSpace::VIDEO,\n> +\t};\n> +\tauto it = std::find_if(validColorSpaces.begin(), validColorSpaces.end(),\n> +\t\t\t       [&colorSpace](const ColorSpace &item) { return colorSpace == item; });\n\n\tauto it = std::find_if(validColorSpaces.begin(), validColorSpaces.end(),\n\t\t\t       [&colorSpace](const ColorSpace &item) {\n\t\t\t\t       return colorSpace == item;\n\t\t\t       });\n\n> +\tif (it != validColorSpaces.end())\n> +\t\treturn true;\n\nYou could use an std::unordered_set<ColorSpace> and just check\nvalidColorSpaces. 
You'll need to define a specialization of\nstd::hash<ColorSpace> though.\n\n> +\n> +\tcolorSpace = ColorSpace::JFIF;\n> +\treturn false;\n> +}\n> +\n>  CameraConfiguration::Status RPiCameraConfiguration::validate()\n>  {\n>  \tStatus status = Valid;\n> @@ -346,6 +364,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n>  \tunsigned int rawCount = 0, outCount = 0, count = 0, maxIndex = 0;\n>  \tstd::pair<int, Size> outSize[2];\n>  \tSize maxSize;\n> +\tColorSpace colorSpace; /* colour space for all non-raw output streams */\n>  \tfor (StreamConfiguration &cfg : config_) {\n>  \t\tif (isRaw(cfg.pixelFormat)) {\n>  \t\t\t/*\n> @@ -354,6 +373,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n>  \t\t\t */\n>  \t\t\tV4L2VideoDevice::Formats fmts = data_->unicam_[Unicam::Image].dev()->formats();\n>  \t\t\tV4L2DeviceFormat sensorFormat = findBestMode(fmts, cfg.size);\n> +\t\t\tsensorFormat.colorSpace = ColorSpace::RAW;\n>  \t\t\tint ret = data_->unicam_[Unicam::Image].dev()->tryFormat(&sensorFormat);\n>  \t\t\tif (ret)\n>  \t\t\t\treturn Invalid;\n\nShouldn't you set cfg.colorSpace to ColorSpace::RAW for raw stream\n(possibly setting the status to Adjusted) ?\n\n> @@ -392,6 +412,7 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n>  \t\t\tif (maxSize < cfg.size) {\n>  \t\t\t\tmaxSize = cfg.size;\n>  \t\t\t\tmaxIndex = outCount;\n> +\t\t\t\tcolorSpace = cfg.colorSpace;\n>  \t\t\t}\n>  \t\t\toutCount++;\n>  \t\t}\n> @@ -405,6 +426,10 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n>  \t\t}\n>  \t}\n>  \n> +\t/* Ensure the output colour space (if present) is one we handle. */\n> +\tif (outCount)\n> +\t\tvalidateColorSpace(colorSpace);\n> +\n>  \t/*\n>  \t * Now do any fixups needed. 
For the two ISP outputs, one stream must be\n>  \t * equal or smaller than the other in all dimensions.\n> @@ -446,10 +471,26 @@ CameraConfiguration::Status RPiCameraConfiguration::validate()\n>  \t\t\tstatus = Adjusted;\n>  \t\t}\n>  \n> +\t\t/* Output streams must share the same colour space. */\n> +\t\tif (cfg.colorSpace != colorSpace) {\n> +\t\t\tLOG(RPI, Warning) << \"Output stream \" << (i == maxIndex ? 0 : 1)\n> +\t\t\t\t\t  << \" colour space changed\";\n\nDoes that deserve a warning, or should it be a debug message ? Given\nthat there's no API to enumerate supported colour spaces (do we need one\n?), applications can only try different options, so it seems to be an\nexpected behaviour.\n\n> +\t\t\tcfg.colorSpace = colorSpace;\n> +\t\t\tstatus = Adjusted;\n> +\t\t}\n> +\n>  \t\tV4L2DeviceFormat format;\n>  \t\tformat.fourcc = V4L2PixelFormat::fromPixelFormat(cfg.pixelFormat);\n>  \t\tformat.size = cfg.size;\n>  \n> +\t\t/*\n> +\t\t * Request a sensible colour space. Note that \"VIDEO\" isn't a real\n> +\t\t * encoding, so substitute something else sensible.\n> +\t\t */\n> +\t\tformat.colorSpace = colorSpace;\n> +\t\tif (format.colorSpace.encoding == ColorSpace::Encoding::VIDEO)\n> +\t\t\tformat.colorSpace.encoding = ColorSpace::Encoding::REC601;\n\nI think the substitution could be done before the loop, possibly in\nvalidateColorSpace(). It raises the question of whether validate()\nshould return Adjusted or Valid in that case. 
This should be documented\nin patch 2/3.\n\n> +\n>  \t\tint ret = dev->tryFormat(&format);\n>  \t\tif (ret)\n>  \t\t\treturn Invalid;\n> @@ -475,6 +516,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n>  \tV4L2DeviceFormat sensorFormat;\n>  \tunsigned int bufferCount;\n>  \tPixelFormat pixelFormat;\n> +\tColorSpace colorSpace;\n>  \tV4L2VideoDevice::Formats fmts;\n>  \tSize size;\n>  \n> @@ -490,6 +532,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n>  \t\t\tfmts = data->unicam_[Unicam::Image].dev()->formats();\n>  \t\t\tsensorFormat = findBestMode(fmts, size);\n>  \t\t\tpixelFormat = sensorFormat.fourcc.toPixelFormat();\n> +\t\t\tcolorSpace = ColorSpace::RAW;\n>  \t\t\tASSERT(pixelFormat.isValid());\n>  \t\t\tbufferCount = 2;\n>  \t\t\trawCount++;\n> @@ -501,6 +544,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n>  \t\t\t/* Return the largest sensor resolution. */\n>  \t\t\tsize = data->sensor_->resolution();\n>  \t\t\tbufferCount = 1;\n> +\t\t\tcolorSpace = ColorSpace::JFIF;\n>  \t\t\toutCount++;\n>  \t\t\tbreak;\n>  \n> @@ -517,6 +561,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n>  \t\t\tpixelFormat = formats::YUV420;\n>  \t\t\tsize = { 1920, 1080 };\n>  \t\t\tbufferCount = 4;\n> +\t\t\tcolorSpace = ColorSpace::VIDEO;\n>  \t\t\toutCount++;\n>  \t\t\tbreak;\n>  \n> @@ -525,6 +570,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n>  \t\t\tpixelFormat = formats::ARGB8888;\n>  \t\t\tsize = { 800, 600 };\n>  \t\t\tbufferCount = 4;\n> +\t\t\tcolorSpace = ColorSpace::JFIF;\n>  \t\t\toutCount++;\n>  \t\t\tbreak;\n>  \n> @@ -554,6 +600,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,\n>  \t\tStreamConfiguration cfg(formats);\n>  \t\tcfg.size = size;\n>  \t\tcfg.pixelFormat = pixelFormat;\n> +\t\tcfg.colorSpace = colorSpace;\n>  \t\tcfg.bufferCount = 
bufferCount;\n>  \t\tconfig->addConfiguration(cfg);\n>  \t}\n> @@ -575,6 +622,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \tSize maxSize, sensorSize;\n>  \tunsigned int maxIndex = 0;\n>  \tbool rawStream = false;\n> +\tColorSpace colorSpace; /* colour space for all non-raw output streams */\n>  \n>  \t/*\n>  \t * Look for the RAW stream (if given) size as well as the largest\n> @@ -593,14 +641,40 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \t\t} else {\n>  \t\t\tif (cfg.size > maxSize) {\n>  \t\t\t\tmaxSize = config->at(i).size;\n> +\t\t\t\tcolorSpace = config->at(i).colorSpace;\n>  \t\t\t\tmaxIndex = i;\n>  \t\t\t}\n>  \t\t}\n>  \t}\n>  \n> +\tif (maxSize.isNull()) {\n> +\t\t/*\n> +\t\t * No non-raw streams, so some will get made below. Doesn't matter\n> +\t\t * what colour space we assign to them.\n> +\t\t */\n> +\t\tcolorSpace = ColorSpace::JFIF;\n> +\t} else {\n> +\t\t/* Make sure we can handle this colour space. */\n> +\t\tvalidateColorSpace(colorSpace);\n\nconfigure() is guaranteed to be called with a valid configuration, so I\nthink you can skip this.\n\n> +\n> +\t\t/*\n> +\t\t * The \"VIDEO\" colour encoding means that we choose one of REC601,\n> +\t\t * REC709 or REC2020 automatically according to the resolution.\n> +\t\t */\n> +\t\tif (colorSpace.encoding == ColorSpace::Encoding::VIDEO) {\n\nSame here, this should have been adjusted in validate().\n\n> +\t\t\tif (maxSize.width >= 3840)\n> +\t\t\t\tcolorSpace.encoding = ColorSpace::Encoding::REC2020;\n> +\t\t\telse if (maxSize.width >= 1280)\n> +\t\t\t\tcolorSpace.encoding = ColorSpace::Encoding::REC709;\n> +\t\t\telse\n> +\t\t\t\tcolorSpace.encoding = ColorSpace::Encoding::REC601;\n> +\t\t}\n> +\t}\n> +\n>  \t/* First calculate the best sensor mode we can use based on the user request. 
*/\n>  \tV4L2VideoDevice::Formats fmts = data->unicam_[Unicam::Image].dev()->formats();\n>  \tV4L2DeviceFormat sensorFormat = findBestMode(fmts, rawStream ? sensorSize : maxSize);\n> +\tsensorFormat.colorSpace = ColorSpace::RAW;\n>  \n>  \t/*\n>  \t * Unicam image output format. The ISP input format gets set at start,\n> @@ -638,11 +712,18 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \t\tStreamConfiguration &cfg = config->at(i);\n>  \n>  \t\tif (isRaw(cfg.pixelFormat)) {\n> +\t\t\tcfg.colorSpace = ColorSpace::RAW;\n\nDitto, and for the next change too.\n\n>  \t\t\tcfg.setStream(&data->unicam_[Unicam::Image]);\n>  \t\t\tdata->unicam_[Unicam::Image].setExternal(true);\n>  \t\t\tcontinue;\n>  \t\t}\n>  \n> +\t\t/* All other streams share the same colour space. */\n> +\t\tif (cfg.colorSpace != colorSpace) {\n> +\t\t\tLOG(RPI, Warning) << \"Stream \" << i << \" colour space changed\";\n> +\t\t\tcfg.colorSpace = colorSpace;\n> +\t\t}\n> +\n>  \t\t/* The largest resolution gets routed to the ISP Output 0 node. */\n>  \t\tRPi::Stream *stream = i == maxIndex ? 
&data->isp_[Isp::Output0]\n>  \t\t\t\t\t\t    : &data->isp_[Isp::Output1];\n> @@ -650,6 +731,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \t\tV4L2PixelFormat fourcc = V4L2PixelFormat::fromPixelFormat(cfg.pixelFormat);\n>  \t\tformat.size = cfg.size;\n>  \t\tformat.fourcc = fourcc;\n> +\t\tformat.colorSpace = cfg.colorSpace;\n>  \n>  \t\tLOG(RPI, Debug) << \"Setting \" << stream->name() << \" to \"\n>  \t\t\t\t<< format.toString();\n> @@ -689,6 +771,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \t\tformat = {};\n>  \t\tformat.size = maxSize;\n>  \t\tformat.fourcc = V4L2PixelFormat::fromPixelFormat(formats::YUV420);\n> +\t\tformat.colorSpace = colorSpace;\n>  \t\tret = data->isp_[Isp::Output0].dev()->setFormat(&format);\n>  \t\tif (ret) {\n>  \t\t\tLOG(RPI, Error)\n> @@ -718,6 +801,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)\n>  \t\tconst Size limit = maxDimensions.boundedToAspectRatio(format.size);\n>  \n>  \t\toutput1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2);\n> +\t\toutput1Format.colorSpace = colorSpace;\n>  \n>  \t\tLOG(RPI, Debug) << \"Setting ISP Output1 (internal) to \"\n>  \t\t\t\t<< output1Format.toString();","headers":{"Date":"Wed, 6 Oct 2021 02:15:31 +0300","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"David Plowman <david.plowman@raspberrypi.com>","Cc":"libcamera-devel@lists.libcamera.org","Message-ID":"<YVzckwrQXcY8eG/g@pendragon.ideasonboard.com>","References":"<20210927123327.14554-1-david.plowman@raspberrypi.com>\n\t<20210927123327.14554-4-david.plowman@raspberrypi.com>","In-Reply-To":"<20210927123327.14554-4-david.plowman@raspberrypi.com>","Subject":"Re: [libcamera-devel] [PATCH v2 3/3] libcamera: pipeline:\n\traspberrypi: Support colour spaces"}}]
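The review's std::unordered_set suggestion (with its required std::hash<ColorSpace> specialization) can be sketched as below. Note the ColorSpace struct here is a simplified stand-in, not the real libcamera::ColorSpace, which carries primaries, transfer function, YCbCr encoding and range plus predefined constants such as ColorSpace::JFIF and ColorSpace::SMPTE170M; only the hash-specialization pattern and the O(1) set-membership lookup that would replace the std::find_if() loop are the point.

```cpp
#include <cstdint>
#include <functional>
#include <unordered_set>

/* Simplified stand-in for libcamera::ColorSpace (illustration only). */
struct ColorSpace {
	enum class Encoding : uint8_t { REC601, REC709, REC2020, VIDEO };
	enum class Range : uint8_t { Full, Limited };

	Encoding encoding;
	Range range;

	bool operator==(const ColorSpace &other) const
	{
		return encoding == other.encoding && range == other.range;
	}
};

namespace std {

/* The specialization the review mentions: pack the enum fields into one
 * integer and defer to an existing std::hash instantiation. */
template<>
struct hash<ColorSpace> {
	size_t operator()(const ColorSpace &c) const
	{
		return hash<uint16_t>()(static_cast<uint16_t>(c.encoding) << 8 |
					static_cast<uint16_t>(c.range));
	}
};

} /* namespace std */

/* Hypothetical whitelist standing in for the validColorSpaces vector. */
static const std::unordered_set<ColorSpace> validColorSpaces = {
	{ ColorSpace::Encoding::REC601, ColorSpace::Range::Full },
	{ ColorSpace::Encoding::REC601, ColorSpace::Range::Limited },
	{ ColorSpace::Encoding::REC709, ColorSpace::Range::Limited },
	{ ColorSpace::Encoding::REC2020, ColorSpace::Range::Limited },
};

/* Membership test replacing the std::find_if() scan in the patch. */
bool isValidColorSpace(const ColorSpace &colorSpace)
{
	return validColorSpaces.count(colorSpace) != 0;
}
```

With this shape, validateColorSpace() would reduce to a single lookup plus the JFIF fallback, at the cost of the std::hash boilerplate, which is the trade-off the review points at.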