From: Jacopo Mondi <jacopo@jmondi.org>
Date: Tue, 21 Sep 2021 15:48:38 +0200
Subject: Re: [libcamera-devel] [RFC PATCH 0/2] Sensor mode hints

Hi David,

On Thu, Sep 16, 2021 at 02:20:13PM +0100, David Plowman wrote:
> Hi everyone
>
> Here's a first attempt at functionality that allows applications to
> provide "hints" as to what kind of camera mode they want.
>
> 1. Bit Depths and Sizes
>
> So far I'm allowing hints about bit depth and the image sizes that can
> be read out of a sensor, and I've gathered these together into
> something I've called a SensorMode.
>
> I've added a SensorMode field to the CameraConfiguration so that
> applications can optionally put in there what they think they want,
> and also a CameraSensor::getSensorModes function that returns a list
> of the supported modes (raw formats only...).

This is more about the theory, but do you think it's a good idea to
allow applications to be written based on sensor-specific parameters?
I mean, I understand it's entirely plausible to write a specialized
application for, say, RPi + imx477. This assumes the developer knows
the sensor modes, the sensor configuration used to produce them, and
low-level details of the sensor. Most of that information is assumed
to be known by the developer also because there's currently no way to
expose from V4L2 how a mode has been realized in the sensor, whether
through binning, skipping or cropping. It can probably be deduced by
looking at the analogue crop and the full pixel array size, but it's
a guessing exercise.

Now, if we assume that a very specialized application knows the sensor
it deals with, what is the use case for writing a generic application
that instead inspects the (limited) information about the sensor's
modes and selects its favourite one generically? Wouldn't it be better
to just assume the application precisely knows what RAW formats it
wants and add a StreamConfiguration for them?

> There are various ways an application could use this:
>
> * It might not care and would ignore the new field altogether. The
>   pipeline handler will stick to its current behaviour.
>
> * It might have some notion of what it wants, perhaps a larger bit
>   depth, and/or a range of sizes. It can fill some or all of those
>   into the SensorMode and the pipeline handler should respect it.
>
> * Or it could query the CameraSensor for its list of SensorModes and
>   then sift through them looking for the one that it likes best.

This is the part I fail to fully grasp. Could you summarize the
parameters that would drive the mode selection policy in the
application?

> 2. Field of View and Framerates
>
> The SensorMode should probably include FoV and framerate information

FoV is problematic, I understand. We have a property that describes
the pixel array size, and comparing the RAW output size is not enough,
as the same resolution can theoretically be obtained by cropping or
subsampling. I would not be opposed to reporting it somehow, as it
might represent an important selection criterion.

Duration, on the other hand: can't it be read from the limits of
controls::FrameDurationLimits? I do expect those controls to be
populated with the sensor's durations (see
https://git.linuxtv.org/libcamera.git/tree/src/ipa/ipu3/ipu3.cpp#n256).

Granted, applications would have to try all the supported RAW modes,
configure the camera with each, and read back the control value.

> so that applications can make intelligent choices automatically.
> However, this is a bit trickier for various reasons so I've left it
> out. There could be a later phase of work that adds these.
>
> Even without this, however, the implementation gets us out of our
> rather critical hole where we simply can't get 10-bit modes. It also

Help me out here: why can't you select a RAW format with 10bpp?

> provides a better alternative to the current nasty practice of
> requesting a raw stream specifically to bodge the camera mode
> selection, even when the raw stream is not actually wanted!

I see it the other way around, actually :) Tying applications to
sensor modes goes in the opposite direction of 'abstracting camera
implementation details from applications' (tm).

It also goes in the opposite direction of the long-term dream of
sensor drivers not being tied to a few fixed modes just because that's
what vendors gave us to start with, but I concede this is a bit
far-fetched.

Of course, the line between abstraction and control is, as usual,
drawn in the sand, and I might be too concerned about exposing sensor
details in our API, even in an opt-in way.

Thanks
   j

> 3. There are 2 commits here
>
> The first adds the SensorMode class, puts it into the
> CameraConfiguration, and allows the supported modes to be listed from
> the CameraSensor. (All the non-Pi stuff.)
>
> The second commit updates our mode selection code to select according
> to the hinted SensorMode (figuring out defaults if it was empty). But
> it essentially works just the same, if in a slightly more generic way.
>
> The code here is fully functional and seems to work fine. Would other
> pipeline handlers be able to adapt to the idea of a "hinted
> SensorMode" as easily?
>
> As always, I'm looking forward to people's thoughts!
>
> Thanks
> David
>
> David Plowman (2):
>   libcamera: Add SensorMode class
>   libcamera: pipeline_handler: raspberrypi: Handle the new SensorMode
>     hint
>
>  include/libcamera/camera.h                    |   3 +
>  include/libcamera/internal/camera_sensor.h    |   4 +
>  include/libcamera/meson.build                 |   1 +
>  include/libcamera/sensor_mode.h               |  50 +++++++++
>  src/libcamera/camera_sensor.cpp               |  15 +++
>  src/libcamera/meson.build                     |   1 +
>  .../pipeline/raspberrypi/raspberrypi.cpp      | 105 +++++++++++++-----
>  src/libcamera/sensor_mode.cpp                 |  60 ++++++++++
>  8 files changed, 212 insertions(+), 27 deletions(-)
>  create mode 100644 include/libcamera/sensor_mode.h
>  create mode 100644 src/libcamera/sensor_mode.cpp
>
> --
> 2.20.1
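Jacopo's question about what parameters would drive a mode selection
policy can be made concrete with a sketch. The `Mode` struct and
`selectMode()` below are hypothetical stand-ins, not the RFC's actual
SensorMode class or any libcamera API; they model only the two fields
the cover letter proposes (output size and bit depth), and show one
policy an application might implement:

```cpp
#include <cassert>
#include <cstdint>
#include <optional>
#include <vector>

// Hypothetical stand-in for the RFC's SensorMode, carrying only the
// two fields the cover letter mentions: output size and bit depth.
struct Mode {
	unsigned width;
	unsigned height;
	unsigned bitDepth;
};

// One possible selection policy: prefer the highest bit depth, and
// among those pick the smallest mode that still covers the requested
// output size.
std::optional<Mode> selectMode(const std::vector<Mode> &modes,
			       unsigned reqWidth, unsigned reqHeight)
{
	std::optional<Mode> best;

	for (const Mode &m : modes) {
		if (m.width < reqWidth || m.height < reqHeight)
			continue;
		if (!best || m.bitDepth > best->bitDepth ||
		    (m.bitDepth == best->bitDepth &&
		     uint64_t(m.width) * m.height <
		     uint64_t(best->width) * best->height))
			best = m;
	}

	return best;
}
```

For an imx477-style mode list (4056x3040 and 2028x1520 at 12 bits,
2028x1080 at 12 bits, 1332x990 at 10 bits), a request for at least
1920x1080 would pick the smallest 12-bit mode that covers it.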
-----------------------------------------------------------------------

From: Naushir Patuck <naush@raspberrypi.com>
Date: Wed, 22 Sep 2021 09:11:12 +0100
Subject: Re: [libcamera-devel] [RFC PATCH 0/2] Sensor mode hints

Hi Jacopo,

On Tue, 21 Sept 2021 at 14:47, Jacopo Mondi <jacopo@jmondi.org> wrote:
> Now, if we assume that a very specialized application knows the sensor
> it deals with, what is the use case for writing a generic application
> that instead inspects the (limited) information about the sensor's
> modes and selects its favourite one generically? Wouldn't it be better
> to just assume the application precisely knows what RAW formats it
> wants and add a StreamConfiguration for them?

This type of usage may be more relevant for Raspberry Pi than for
other vendors, though it does not need to be, of course. Our current
(non-libcamera) raspicam applications allow the user to select a
specific sensor mode with the "-md" command line parameter, an integer
giving the index of the mode to use. The modes for each of our sensors
are fully documented, so the user knows the exact
width/height/crop/bin/bit-depth used. This is one of those features
that our users have found extremely useful in a wide range of
situations, and we can't really live without it. Examples include
using a low bit-depth mode for fast fps operation, using a binned mode
for better noise performance, or forcing a specific FoV, to name a
few.

The idea behind this change is to empower users to do similar things
with our libcamera-based apps. Currently, there is no full substitute
for our existing mechanism. The goal is not really to write an
application that is specialised to a particular sensor; rather, we
give the user a mechanism to choose a particular advertised mode. This
can be done either via command line parameters or programmatically;
that's up to the user/app. So applications still remain generic, but
now allow the user a certain level of sensor-specific control. Of
course, an application may choose not to bother with any of this, and
then we revert to the pipeline handlers having full choice of the
sensor mode to use, as we do right now.

I'll let David respond to the remaining comments in more detail, but I
just wanted to provide some further context for what this change is
about. Hope that helps!

Regards,
Naush
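On Jacopo's FrameDurationLimits suggestion: libcamera expresses
controls::FrameDurationLimits as 64-bit frame durations in
microseconds, so an application comparing the modes it has configured
by achievable speed only needs a small conversion. The helper below is
ours, for illustration only, and is not part of libcamera:

```cpp
#include <cassert>
#include <cstdint>

// controls::FrameDurationLimits reports frame durations in
// microseconds; the minimum duration corresponds to the fastest
// achievable frame rate for the configured mode.
constexpr double maxFrameRate(int64_t minFrameDurationUs)
{
	return 1e6 / static_cast<double>(minFrameDurationUs);
}
```

For example, a mode whose minimum frame duration reads back as 8333 µs
can run at roughly 120 fps.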
-----------------------------------------------------------------------

From: David Plowman <david.plowman@raspberrypi.com>
Date: Wed, 22 Sep 2021 09:35:27 +0000
Subject: Re: [libcamera-devel] [RFC PATCH 0/2] Sensor mode hints

Hi Jacopo, Naush

Thanks for joining this discussion, I think that's great!

I'll address some of Jacopo's questions further down, but it's
probably worth talking about the context a bit more first, as that
will inform the answers I give.

The problem at hand is that of letting applications get hold of
"correct" sensor modes. This means things like "I want the fastest
frame rate possible", or "I only want full FoV modes", or even "I want
low and high resolution modes that share the same FoV". I see a couple
of approaches.

Approach A: here we provide the ability to select known sensor modes
exactly, for example by specifying the resolution, bit depth etc. This
does indeed work and is actually more or less where we are at the
moment.

Approach B: alternatively, libcamera can provide applications with a
list of the available sensor modes so that they can (optionally)
figure out for themselves what they want. In time this would ideally
contain information about FoV, framerates and so on. It's obviously
more complicated than approach A, but includes anything you could do
with approach A.

But I think the most important difference is this: approach A gives
applications no way to figure out anything about sensor modes for
themselves. It is the approach that will lead us to applications that
hardcode particular sensor mode choices, or keep lookup tables of
sensor modes for certain supported sensors.

To be honest, I think I could live with approach A, but approach B
feels preferable to me. What do others think?

On Tue, 21 Sept 2021 at 14:47, Jacopo Mondi <jacopo@jmondi.org> wrote:
>> Now, if we assume that a very specialized application knows the
>> sensor it deals with, what is the use case for writing a generic
>> application that instead inspects the (limited) information about
>> the sensor's modes and selects its favourite one generically?
>> Wouldn't it be better to just assume the application precisely knows
>> what RAW formats it wants and add a StreamConfiguration for them?

I agree this is the key question. Do we want applications that know
precisely what mode they want, or do we want there to be the option
(it's not compulsory!) for applications to make smarter mode choices
in a more general way?

>> This is the part I fail to fully grasp. Could you summarize the
>> parameters that would drive the mode selection policy in the
>> application?

Bit depth and size are the most obvious and are the initial ones I
proposed. But I think we should expect to extend this, if possible, to
FoV and framerate.

>> Help me out here: why can't you select a RAW format with 10bpp?

Mainly because I've been working on the assumption that we will
ultimately move to something more like "approach B", and so baking in
more of "approach A" now is bad. In particular, I don't really see any
future path from A to B. Also, I still have some niggles with the use
of the raw stream for this purpose:

* It still feels a bit kludgey to me if I have to ask for a raw stream
  to force a sensor mode choice, even when I actually have no interest
  in receiving the raw stream. But other opinions may differ there?

* The raw stream mixes up the sensor mode and its format in memory.
  For example, if I want a particular sensor mode, why do I have to
  specify the packing? I have no interest in the packing and want the
  pipeline handler to make the most efficient choice.

* On the other hand, there will be times when I am interested in the
  format in memory, presumably because I want to do something with the
  pixels. Here, requesting the raw stream and specifying a packing
  format (probably an unpacked one!) makes sense.

* As an aside, I see a future where we have compressed in-memory
  formats too. Perhaps we just treat that as (a particularly opaque
  form of) packing?

>> Of course, the line between abstraction and control is, as usual,
>> drawn in the sand, and I might be too concerned about exposing
>> sensor details in our API, even in an opt-in way.

Going back to my original paragraphs: my view is that approach A ties
an application to particular sensors and lists of known modes for
those sensors. Approach B gives applications the possibility (if they
want it) to be free of such look-up tables.

I hope that explains the context a bit better, but most importantly, I
would agree that figuring out what we actually want is the first step!

Thanks!
David
Would other\n>> > pipeline handlers be able to adapt to the idea of a \"hinted\n>> > SensorMode\" as easily?\n>> >\n>> >\n>> > As always, I'm looking forward to people's thoughts!\n>> >\n>> > Thanks\n>> > David\n>> >\n>> > David Plowman (2):\n>> >   libcamera: Add SensorMode class\n>> >   libcamera: pipeline_handler: raspberrypi: Handle the new SensorMode\n>> >     hint\n>> >\n>> >  include/libcamera/camera.h                    |   3 +\n>> >  include/libcamera/internal/camera_sensor.h    |   4 +\n>> >  include/libcamera/meson.build                 |   1 +\n>> >  include/libcamera/sensor_mode.h               |  50 +++++++++\n>> >  src/libcamera/camera_sensor.cpp               |  15 +++\n>> >  src/libcamera/meson.build                     |   1 +\n>> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 105 +++++++++++++-----\n>> >  src/libcamera/sensor_mode.cpp                 |  60 ++++++++++\n>> >  8 files changed, 212 insertions(+), 27 deletions(-)\n>> >  create mode 100644 include/libcamera/sensor_mode.h\n>> >  create mode 100644 src/libcamera/sensor_mode.cpp\n>> >\n>> > --\n>> > 2.20.1\n>> >","headers":{"MIME-Version":"1.0","References":"<20210916132015.1790-1-david.plowman@raspberrypi.com>\n\t<20210921134838.lkvcj6cwxtcik2fa@uno.localdomain>\n\t<CAEmqJPrFbLSg3KjEKPxzFx0d9fh91TkqXN8W6mkT-oH1AHCtWQ@mail.gmail.com>","In-Reply-To":"<CAEmqJPrFbLSg3KjEKPxzFx0d9fh91TkqXN8W6mkT-oH1AHCtWQ@mail.gmail.com>","From":"David Plowman <david.plowman@raspberrypi.com>","Date":"Wed, 22 Sep 2021 10:35:27 +0100","Message-ID":"<CAHW6GYLHQZ4nwRYCg2xP0KrJ5DaYTeFFghCLyxzujNWKupGVvg@mail.gmail.com>","To":"Naushir Patuck <naush@raspberrypi.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [RFC PATCH 0/2] Sensor mode hints","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>"}}]