[{"id":26744,"web_url":"https://patchwork.libcamera.org/comment/26744/","msgid":"<dc8e9d6a-a40e-7b20-d44c-eb9fb33bd01e@posteo.de>","date":"2023-03-24T18:24:26","subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","submitter":{"id":142,"url":"https://patchwork.libcamera.org/api/people/142/","name":"Robert Mader","email":"robert.mader@posteo.de"},"content":"Hi,\n\nOn 24.03.23 19:12, Nicolas Dufresne via libcamera-devel wrote:\n> From: Nicolas Dufresne<nicolas.dufresne@collabora.com>\n>\n> This add support for selecting the sensor mode. A new read-only\n> property called sensor-modes is filled when the element reaches\n> READY state. It contains the list of all available sensor modes,\n> including the supported framerate range. This is exposed as GstCaps\n> in the form of sensor/mode,width=X,height=Y,format=Y,framerate=[...].\n> The format string matches the libcamera format string representation.\n>\n> The application can then select a mode using the read/write sensor-mode\n> control. The selected mode is also a caps, it will be intersected with\n> the supported mode and the \"best\" match will be picked. This allows\n> application to use simple filter when they want to pick a mode for lets\n> say a specific framerate (e.g. 
sensor/mode,framerate=60/1).\n>\n> Signed-off-by: Nicolas Dufresne<nicolas.dufresne@collabora.com>\n> ---\n>   src/gstreamer/gstlibcamera-utils.cpp |   4 +\n>   src/gstreamer/gstlibcamerasrc.cpp    | 110 ++++++++++++++++++++++++++-\n>   2 files changed, 112 insertions(+), 2 deletions(-)\n>\n> diff --git a/src/gstreamer/gstlibcamera-utils.cpp b/src/gstreamer/gstlibcamera-utils.cpp\n> index 750ec351..c8a8df09 100644\n> --- a/src/gstreamer/gstlibcamera-utils.cpp\n> +++ b/src/gstreamer/gstlibcamera-utils.cpp\n> @@ -416,6 +416,10 @@ gst_libcamera_configure_stream_from_caps(StreamConfiguration &stream_cfg,\n>   \t\tstream_cfg.pixelFormat = gst_format_to_pixel_format(gst_format);\n>   \t} else if (gst_structure_has_name(s, \"image/jpeg\")) {\n>   \t\tstream_cfg.pixelFormat = formats::MJPEG;\n> +\t} else if (gst_structure_has_name(s, \"sensor/mode\")) {\n> +\t\tgst_structure_fixate_field(s, \"format\");\n> +\t\tconst gchar *format = gst_structure_get_string(s, \"format\");\n> +\t\tstream_cfg.pixelFormat = PixelFormat::fromString(format);\n>   \t} else {\n>   \t\tg_critical(\"Unsupported media type: %s\", gst_structure_get_name(s));\n>   \t}\n> diff --git a/src/gstreamer/gstlibcamerasrc.cpp b/src/gstreamer/gstlibcamerasrc.cpp\n> index a10cbd4f..2f05a03f 100644\n> --- a/src/gstreamer/gstlibcamerasrc.cpp\n> +++ b/src/gstreamer/gstlibcamerasrc.cpp\n> @@ -147,6 +147,9 @@ struct _GstLibcameraSrc {\n>   \n>   \tgchar *camera_name;\n>   \n> +\tGstCaps *sensor_modes;\n> +\tGstCaps *sensor_mode;\n> +\n>   \tGstLibcameraSrcState *state;\n>   \tGstLibcameraAllocator *allocator;\n>   \tGstFlowCombiner *flow_combiner;\n> @@ -154,7 +157,10 @@ struct _GstLibcameraSrc {\n>   \n>   enum {\n>   \tPROP_0,\n> -\tPROP_CAMERA_NAME\n> +\tPROP_CAMERA_NAME,\n> +\tPROP_SENSOR_MODES,\n> +\tPROP_SENSOR_MODE,\n> +\tPROP_LAST\n>   };\n>   \n>   G_DEFINE_TYPE_WITH_CODE(GstLibcameraSrc, gst_libcamera_src, GST_TYPE_ELEMENT,\n> @@ -318,6 +324,61 @@ int GstLibcameraSrcState::processRequest()\n>   \treturn 
err;\n>   }\n>   \n> +static GstCaps *\n> +gst_libcamera_src_enumerate_sensor_modes(GstLibcameraSrc *self)\n> +{\n> +\tGstCaps *modes = gst_caps_new_empty();\n> +\tGstLibcameraSrcState *state = self->state;\n> +\tauto config = state->cam_->generateConfiguration({ libcamera::StreamRole::Raw,\n> +\t\t\t\t\t\t\t   libcamera::StreamRole::VideoRecording });\n> +\tif (config == nullptr) {\n> +\t\tGST_DEBUG_OBJECT(self, \"Sensor mode selection is not supported, skipping enumeration.\");\n> +\t\treturn modes;\n> +\t}\n\nI gave this a short try on both my USB camera and on a PPP. In both \ncases enumeration gets skipped (as intended), but in the latter case the \nfollowing warning gets printed:\n\n > ERROR RkISP1 rkisp1.cpp:684 Can't capture both raw and processed streams\n\nSo I wonder if we can come up with a better check here.\n\nWhen allowing the enumeration by only selecting Raw or VideoRecording I \nmade the following observations:\n\n 1. enumeration on the USB camera is very slow, takes several seconds\n 2. on the PPP / RkISP1, even though there are many more modes, the\n    enumeration is super fast - probably a fraction of a second. So a\n    method like this could probably be used for framerate enumeration.\n 3. the mode selection via sensor-mode=sensor/mode,... 
usually fails\n    though (haven't dug deeper into why yet)\n\nRegards,\n\nRobert\n\n> +\n> +\tconst libcamera::StreamFormats &formats = config->at(0).formats();\n> +\n> +\tfor (const auto &pixfmt : formats.pixelformats()) {\n> +\t\tfor (const auto &size : formats.sizes(pixfmt)) {\n> +\t\t\tconfig->at(0).size = size;\n> +\t\t\tconfig->at(0).pixelFormat = pixfmt;\n> +\n> +\t\t\tif (config->validate() == CameraConfiguration::Invalid)\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tif (state->cam_->configure(config.get()))\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tauto fd_ctrl = state->cam_->controls().find(&controls::FrameDurationLimits);\n> +\t\t\tif (fd_ctrl == state->cam_->controls().end())\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tint minrate_num, minrate_denom;\n> +\t\t\tint maxrate_num, maxrate_denom;\n> +\t\t\tdouble min_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.max().get<int64_t>());\n> +\t\t\tdouble max_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.min().get<int64_t>());\n> +\t\t\tgst_util_double_to_fraction(min_framerate, &minrate_num, &minrate_denom);\n> +\t\t\tgst_util_double_to_fraction(max_framerate, &maxrate_num, &maxrate_denom);\n> +\n> +\t\t\tGstStructure *s = gst_structure_new(\"sensor/mode\",\n> +\t\t\t\t\t\t\t    \"format\", G_TYPE_STRING, pixfmt.toString().c_str(),\n> +\t\t\t\t\t\t\t    \"width\", G_TYPE_INT, size.width,\n> +\t\t\t\t\t\t\t    \"height\", G_TYPE_INT, size.height,\n> +\t\t\t\t\t\t\t    \"framerate\", GST_TYPE_FRACTION_RANGE,\n> +\t\t\t\t\t\t\t    minrate_num, minrate_denom,\n> +\t\t\t\t\t\t\t    maxrate_num, maxrate_denom,\n> +\t\t\t\t\t\t\t    nullptr);\n> +\t\t\tgst_caps_append_structure(modes, s);\n> +\t\t}\n> +\t}\n> +\n> +\tGST_DEBUG_OBJECT(self, \"Camera sensor modes: %\" GST_PTR_FORMAT, modes);\n> +\n> +\treturn modes;\n> +}\n> +\n>   static bool\n>   gst_libcamera_src_open(GstLibcameraSrc *self)\n>   {\n> @@ 
-375,6 +436,7 @@ gst_libcamera_src_open(GstLibcameraSrc *self)\n>   \t/* No need to lock here, we didn't start our threads yet. */\n>   \tself->state->cm_ = cm;\n>   \tself->state->cam_ = cam;\n> +\tself->sensor_modes = gst_libcamera_src_enumerate_sensor_modes(self);\n>   \n>   \treturn true;\n>   }\n> @@ -462,6 +524,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n>   \tGstLibcameraSrcState *state = self->state;\n>   \tGstFlowReturn flow_ret = GST_FLOW_OK;\n>   \tgint ret;\n> +\tg_autoptr(GstCaps) sensor_mode = nullptr;\n>   \n>   \tg_autoptr(GstStructure) element_caps = gst_structure_new_empty(\"caps\");\n>   \n> @@ -481,6 +544,16 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n>   \t\troles.push_back(gst_libcamera_pad_get_role(srcpad));\n>   \t}\n>   \n> +\tif (!gst_caps_is_any(self->sensor_mode)) {\n> +\t\tsensor_mode = gst_caps_intersect(self->sensor_mode, self->sensor_modes);\n> +\t\tif (!gst_caps_is_empty(sensor_mode)) {\n> +\t\t\troles.push_back(libcamera::StreamRole::Raw);\n> +\t\t} else {\n> +\t\t\tGST_WARNING_OBJECT(self, \"No sensor mode matching the selection, ignoring.\");\n> +\t\t\tgst_clear_caps(&sensor_mode);\n> +\t\t}\n> +\t}\n> +\n>   \t/* Generate the stream configurations, there should be one per pad. 
*/\n>   \tstate->config_ = state->cam_->generateConfiguration(roles);\n>   \tif (state->config_ == nullptr) {\n> @@ -490,7 +563,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n>   \t\tgst_task_stop(task);\n>   \t\treturn;\n>   \t}\n> -\tg_assert(state->config_->size() == state->srcpads_.size());\n> +\tg_assert(state->config_->size() == state->srcpads_.size() + (!!sensor_mode));\n>   \n>   \tfor (gsize i = 0; i < state->srcpads_.size(); i++) {\n>   \t\tGstPad *srcpad = state->srcpads_[i];\n> @@ -510,6 +583,12 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n>   \t\tgst_libcamera_get_framerate_from_caps(caps, element_caps);\n>   \t}\n>   \n> +\tif (sensor_mode) {\n> +\t\tStreamConfiguration &stream_cfg = state->config_->at(state->srcpads_.size());\n> +\t\tg_assert(gst_caps_is_writable(sensor_mode));\n> +\t\tgst_libcamera_configure_stream_from_caps(stream_cfg, sensor_mode);\n> +\t}\n> +\n>   \tif (flow_ret != GST_FLOW_OK)\n>   \t\tgoto done;\n>   \n> @@ -624,6 +703,7 @@ gst_libcamera_src_task_leave([[maybe_unused]] GstTask *task,\n>   \tg_clear_object(&self->allocator);\n>   \tg_clear_pointer(&self->flow_combiner,\n>   \t\t\t(GDestroyNotify)gst_flow_combiner_free);\n> +\tgst_clear_caps(&self->sensor_modes);\n>   }\n>   \n>   static void\n> @@ -659,6 +739,10 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id,\n>   \t\tg_free(self->camera_name);\n>   \t\tself->camera_name = g_value_dup_string(value);\n>   \t\tbreak;\n> +\tcase PROP_SENSOR_MODE:\n> +\t\tgst_clear_caps(&self->sensor_mode);\n> +\t\tself->sensor_mode = GST_CAPS(g_value_dup_boxed(value));\n> +\t\tbreak;\n>   \tdefault:\n>   \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n>   \t\tbreak;\n> @@ -676,6 +760,12 @@ gst_libcamera_src_get_property(GObject *object, guint prop_id, GValue *value,\n>   \tcase PROP_CAMERA_NAME:\n>   \t\tg_value_set_string(value, self->camera_name);\n>   \t\tbreak;\n> +\tcase 
PROP_SENSOR_MODES:\n> +\t\tg_value_set_boxed(value, self->sensor_modes);\n> +\t\tbreak;\n> +\tcase PROP_SENSOR_MODE:\n> +\t\tg_value_set_boxed(value, self->sensor_mode);\n> +\t\tbreak;\n>   \tdefault:\n>   \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n>   \t\tbreak;\n> @@ -763,6 +853,7 @@ gst_libcamera_src_init(GstLibcameraSrc *self)\n>   \t/* C-style friend. */\n>   \tstate->src_ = self;\n>   \tself->state = state;\n> +\tself->sensor_mode = gst_caps_new_any();\n>   }\n>   \n>   static GstPad *\n> @@ -844,4 +935,19 @@ gst_libcamera_src_class_init(GstLibcameraSrcClass *klass)\n>   \t\t\t\t\t\t\t     | G_PARAM_READWRITE\n>   \t\t\t\t\t\t\t     | G_PARAM_STATIC_STRINGS));\n>   \tg_object_class_install_property(object_class, PROP_CAMERA_NAME, spec);\n> +\n> +\tspec = g_param_spec_boxed(\"sensor-modes\", \"Sensor Modes\",\n> +\t\t\t\t  \"GstCaps representing available sensor modes.\",\n> +\t\t\t\t  GST_TYPE_CAPS,\n> +\t\t\t\t  (GParamFlags)(G_PARAM_READABLE\n> +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> +\tg_object_class_install_property(object_class, PROP_SENSOR_MODES, spec);\n> +\n> +\tspec = g_param_spec_boxed(\"sensor-mode\", \"Sensor Mode\",\n> +\t\t\t\t  \"GstCaps representing selected sensor mode.\",\n> +\t\t\t\t  GST_TYPE_CAPS,\n> +\t\t\t\t  (GParamFlags)(GST_PARAM_MUTABLE_READY\n> +\t\t\t\t\t        | G_PARAM_READWRITE\n> +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> +\tg_object_class_install_property(object_class, PROP_SENSOR_MODE, spec);\n>   }","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id B0F36C0F2A\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 24 Mar 2023 18:24:29 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby 
lancelot.ideasonboard.com (Postfix) with ESMTP id 00FF66271E;\n\tFri, 24 Mar 2023 19:24:28 +0100 (CET)","from mout02.posteo.de (mout02.posteo.de [185.67.36.66])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 6B75961ECF\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Mar 2023 19:24:27 +0100 (CET)","from submission (posteo.de [185.67.36.169]) \n\tby mout02.posteo.de (Postfix) with ESMTPS id E715B24039E\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Mar 2023 19:24:26 +0100 (CET)","from customer (localhost [127.0.0.1])\n\tby submission (posteo.de) with ESMTPSA id 4PjrG230Jmz6tmG\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tFri, 24 Mar 2023 19:24:26 +0100 (CET)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1679682269;\n\tbh=7G99DHRwI0xVV5Jxux23/5969nC+TWAySbOHT6dVNzc=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:\n\tFrom;\n\tb=VQAwfRi5c2hKPpJAHqU8SKAK9K/Gd6DVbhgxBNQwC63e2YGJ7r2ab+/RUAhAE1W+S\n\tSdt52GD0uCnKX3a6jVhpR71oxzACL2Nzk2BpcsL1GScM0yYgdV1OiMGP4t7nC3G22J\n\tLMvkO7ipnOJrrOuqt0UjhAktukwFKTPOAV+ljdKrIXOTBoDBLSvCjBmBdJ8NbQyKBI\n\tKbZXbiRR9bibXkxfaE46c00XGg7vQURhLc9qHp0uBJzi9K2pHMd3u6aI62+5bPbQqj\n\tbnCKbXC3KAB3XqCUzh1f7kYUywUrQJgDO4TXTU/ltGe6N0Z57n2MIZSrrXQ2y8mBeH\n\t2QvTD1AUp8KIg==","v=1; a=rsa-sha256; c=relaxed/simple; d=posteo.de; s=2017;\n\tt=1679682266; bh=7G99DHRwI0xVV5Jxux23/5969nC+TWAySbOHT6dVNzc=;\n\th=Date:Subject:To:From:From;\n\tb=ZpaokNXbnWGTTIQ6m7r6kTW8mRQrplGXhdRFKBPGO+yENFqdet9zi/Rxp5KzLzWxS\n\tnLCpQwDSuJEmetW5NIOWmbLj/DRRNgDVrTSTuE47RWUzL54xhYcxD6/7Rz3NRaVyT/\n\tOxn3uQjUsIORDjXY+6okv89q2+t+nW+MWp8L9k+jtmL+bQ1NK6FLxQqPh1dFp9maJJ\n\tgpZM5+EXpyzY7dIU6Ut/s56qXCxgrMawfF3XSyZGJw1lEDHrIvh9LijupQ81CLgyTF\n\tH4Lse0JFNdALfORyCIl1j+admhY3sjHwMkcoV5mpE55wOWamh7RQN25eHJVXfKCm+b\n\tKcN8nS5uKDm5w=="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (2048-bit key; \n\tunprotected) 
header.d=posteo.de header.i=@posteo.de\n\theader.b=\"ZpaokNXb\"; dkim-atps=neutral","Content-Type":"multipart/alternative;\n\tboundary=\"------------cshSCKs70jOlyRZEJJDVGxbh\"","Message-ID":"<dc8e9d6a-a40e-7b20-d44c-eb9fb33bd01e@posteo.de>","Date":"Fri, 24 Mar 2023 18:24:26 +0000","MIME-Version":"1.0","Content-Language":"en-US","To":"libcamera-devel@lists.libcamera.org","References":"<20230324181247.302586-1-nicolas@ndufresne.ca>\n\t<20230324181247.302586-2-nicolas@ndufresne.ca>","In-Reply-To":"<20230324181247.302586-2-nicolas@ndufresne.ca>","Subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Robert Mader via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Robert Mader <robert.mader@posteo.de>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":26745,"web_url":"https://patchwork.libcamera.org/comment/26745/","msgid":"<09160ea338e680f94bf3da112a50eb08c7d5b6dd.camel@ndufresne.ca>","date":"2023-03-24T19:11:08","subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","submitter":{"id":30,"url":"https://patchwork.libcamera.org/api/people/30/","name":"Nicolas Dufresne","email":"nicolas@ndufresne.ca"},"content":"Le vendredi 24 mars 
2023 à 18:24 +0000, Robert Mader via libcamera-devel a\nécrit :\n> Hi,\n> On 24.03.23 19:12, Nicolas Dufresne via libcamera-devel wrote:\n>  \n> > From: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> > \n> > This add support for selecting the sensor mode. A new read-only\n> > property called sensor-modes is filled when the element reaches\n> > READY state. It contains the list of all available sensor modes,\n> > including the supported framerate range. This is exposed as GstCaps\n> > in the form of sensor/mode,width=X,height=Y,format=Y,framerate=[...].\n> > The format string matches the libcamera format string representation.\n> > \n> > The application can then select a mode using the read/write sensor-mode\n> > control. The selected mode is also a caps, it will be intersected with\n> > the supported mode and the \"best\" match will be picked. This allows\n> > application to use simple filter when they want to pick a mode for lets\n> > say a specific framerate (e.g. sensor/mode,framerate=60/1).\n> > \n> > Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> > ---\n> >  src/gstreamer/gstlibcamera-utils.cpp | 4 +\n> >  src/gstreamer/gstlibcamerasrc.cpp | 110 ++++++++++++++++++++++++++-\n> >  2 files changed, 112 insertions(+), 2 deletions(-)\n> > \n> > diff --git a/src/gstreamer/gstlibcamera-utils.cpp b/src/gstreamer/gstlibcamera-utils.cpp\n> > index 750ec351..c8a8df09 100644\n> > --- a/src/gstreamer/gstlibcamera-utils.cpp\n> > +++ b/src/gstreamer/gstlibcamera-utils.cpp\n> > @@ -416,6 +416,10 @@ gst_libcamera_configure_stream_from_caps(StreamConfiguration &stream_cfg,\n> >  stream_cfg.pixelFormat = gst_format_to_pixel_format(gst_format);\n> >  } else if (gst_structure_has_name(s, \"image/jpeg\")) {\n> >  stream_cfg.pixelFormat = formats::MJPEG;\n> > + } else if (gst_structure_has_name(s, \"sensor/mode\")) {\n> > + gst_structure_fixate_field(s, \"format\");\n> > + const gchar *format = gst_structure_get_string(s, \"format\");\n> > + 
stream_cfg.pixelFormat = PixelFormat::fromString(format);\n> >  } else {\n> >  g_critical(\"Unsupported media type: %s\", gst_structure_get_name(s));\n> >  }\n> > diff --git a/src/gstreamer/gstlibcamerasrc.cpp b/src/gstreamer/gstlibcamerasrc.cpp\n> > index a10cbd4f..2f05a03f 100644\n> > --- a/src/gstreamer/gstlibcamerasrc.cpp\n> > +++ b/src/gstreamer/gstlibcamerasrc.cpp\n> > @@ -147,6 +147,9 @@ struct _GstLibcameraSrc {\n> >  gchar *camera_name;\n> > + GstCaps *sensor_modes;\n> > + GstCaps *sensor_mode;\n> > +\n> >  GstLibcameraSrcState *state;\n> >  GstLibcameraAllocator *allocator;\n> >  GstFlowCombiner *flow_combiner;\n> > @@ -154,7 +157,10 @@ struct _GstLibcameraSrc {\n> >  enum {\n> >  PROP_0,\n> > - PROP_CAMERA_NAME\n> > + PROP_CAMERA_NAME,\n> > + PROP_SENSOR_MODES,\n> > + PROP_SENSOR_MODE,\n> > + PROP_LAST\n> >  };\n> >  G_DEFINE_TYPE_WITH_CODE(GstLibcameraSrc, gst_libcamera_src, GST_TYPE_ELEMENT,\n> > @@ -318,6 +324,61 @@ int GstLibcameraSrcState::processRequest()\n> >  return err;\n> >  }\n> > +static GstCaps *\n> > +gst_libcamera_src_enumerate_sensor_modes(GstLibcameraSrc *self)\n> > +{\n> > + GstCaps *modes = gst_caps_new_empty();\n> > + GstLibcameraSrcState *state = self->state;\n> > + auto config = state->cam_->generateConfiguration({ libcamera::StreamRole::Raw,\n> > + libcamera::StreamRole::VideoRecording });\n> > + if (config == nullptr) {\n> > + GST_DEBUG_OBJECT(self, \"Sensor mode selection is not supported, skipping enumeration.\");\n> > + return modes;\n> > + }\n> I gave this a short try on both my USB camera and on a PPP. In both cases\n> enumeration gets skipped (as intended), but in the later case the following\n> warning gets printed:\n> > ERROR RkISP1 rkisp1.cpp:684 Can't capture both raw and processed streams\n> So I wonder if we can come a better check here.\n> \nThe fact that the IPAs / Pipeline manager throw stuff on the terminal should\nprobably not justify changing the API usage. 
The logging needs to be adapted so\nit does not pretend there has been an error on behalf of the application using\nthe API. To me it seems evident that apps will try different numbers of roles\nuntil the camera accepts the configuration.\n\n> When allowing the enumeration by only selecting Raw or VideoRecording I made\n> the following observations:\n>    1. enumeration on the USB camera is very slow, takes several seconds\n>    2. on the PPP / RkISP1, even though there are many more modes, the\n>       enumeration is super fast - probably a fraction of a second. So a method\n>       like this could probably be used for framerate enumeration.\n>    3. the mode selection via sensor-mode=sensor/mode,... usually fails though\n>       (haven't digged deeper why yet)\n\nIt fails because of \"ERROR RkISP1 rkisp1.cpp:684 Can't capture both raw and\nprocessed streams\".\n\nIn general, I don't think it is useful to enumerate the modes on PPP, because\nthe method to select sensor mode cannot be supported (you can't pass buffers\nthrough the ISP while capturing the raw). More generally, this just highlights that\nthe sensor mode selection API is currently just a hack, as selecting sensor modes on\nPPP seems rather important. 
Currently, only IPU3 and RPi have a sensor/isp model\nthat works with the API.\n\nNicolas\n\n> Regards,\n> Robert\n>  \n> > +\n> > + const libcamera::StreamFormats &formats = config->at(0).formats();\n> > +\n> > + for (const auto &pixfmt : formats.pixelformats()) {\n> > + for (const auto &size : formats.sizes(pixfmt)) {\n> > + config->at(0).size = size;\n> > + config->at(0).pixelFormat = pixfmt;\n> > +\n> > + if (config->validate() == CameraConfiguration::Invalid)\n> > + continue;\n> > +\n> > + if (state->cam_->configure(config.get()))\n> > + continue;\n> > +\n> > + auto fd_ctrl = state->cam_->controls().find(&controls::FrameDurationLimits);\n> > + if (fd_ctrl == state->cam_->controls().end())\n> > + continue;\n> > +\n> > + int minrate_num, minrate_denom;\n> > + int maxrate_num, maxrate_denom;\n> > + double min_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> > + gst_util_guint64_to_gdouble(fd_ctrl->second.max().get<int64_t>());\n> > + double max_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> > + gst_util_guint64_to_gdouble(fd_ctrl->second.min().get<int64_t>());\n> > + gst_util_double_to_fraction(min_framerate, &minrate_num, &minrate_denom);\n> > + gst_util_double_to_fraction(max_framerate, &maxrate_num, &maxrate_denom);\n> > +\n> > + GstStructure *s = gst_structure_new(\"sensor/mode\",\n> > + \"format\", G_TYPE_STRING, pixfmt.toString().c_str(),\n> > + \"width\", G_TYPE_INT, size.width,\n> > + \"height\", G_TYPE_INT, size.height,\n> > + \"framerate\", GST_TYPE_FRACTION_RANGE,\n> > + minrate_num, minrate_denom,\n> > + maxrate_num, maxrate_denom,\n> > + nullptr);\n> > + gst_caps_append_structure(modes, s);\n> > + }\n> > + }\n> > +\n> > + GST_DEBUG_OBJECT(self, \"Camera sensor modes: %\" GST_PTR_FORMAT, modes);\n> > +\n> > + return modes;\n> > +}\n> > +\n> >  static bool\n> >  gst_libcamera_src_open(GstLibcameraSrc *self)\n> >  {\n> > @@ -375,6 +436,7 @@ gst_libcamera_src_open(GstLibcameraSrc *self)\n> >  /* No need to lock here, we didn't start our threads 
yet. */\n> >  self->state->cm_ = cm;\n> >  self->state->cam_ = cam;\n> > + self->sensor_modes = gst_libcamera_src_enumerate_sensor_modes(self);\n> >  return true;\n> >  }\n> > @@ -462,6 +524,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> >  GstLibcameraSrcState *state = self->state;\n> >  GstFlowReturn flow_ret = GST_FLOW_OK;\n> >  gint ret;\n> > + g_autoptr(GstCaps) sensor_mode = nullptr;\n> >  g_autoptr(GstStructure) element_caps = gst_structure_new_empty(\"caps\");\n> > @@ -481,6 +544,16 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> >  roles.push_back(gst_libcamera_pad_get_role(srcpad));\n> >  }\n> > + if (!gst_caps_is_any(self->sensor_mode)) {\n> > + sensor_mode = gst_caps_intersect(self->sensor_mode, self->sensor_modes);\n> > + if (!gst_caps_is_empty(sensor_mode)) {\n> > + roles.push_back(libcamera::StreamRole::Raw);\n> > + } else {\n> > + GST_WARNING_OBJECT(self, \"No sensor mode matching the selection, ignoring.\");\n> > + gst_clear_caps(&sensor_mode);\n> > + }\n> > + }\n> > +\n> >  /* Generate the stream configurations, there should be one per pad. 
*/\n> >  state->config_ = state->cam_->generateConfiguration(roles);\n> >  if (state->config_ == nullptr) {\n> > @@ -490,7 +563,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> >  gst_task_stop(task);\n> >  return;\n> >  }\n> > - g_assert(state->config_->size() == state->srcpads_.size());\n> > + g_assert(state->config_->size() == state->srcpads_.size() + (!!sensor_mode));\n> >  for (gsize i = 0; i < state->srcpads_.size(); i++) {\n> >  GstPad *srcpad = state->srcpads_[i];\n> > @@ -510,6 +583,12 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> >  gst_libcamera_get_framerate_from_caps(caps, element_caps);\n> >  }\n> > + if (sensor_mode) {\n> > + StreamConfiguration &stream_cfg = state->config_->at(state->srcpads_.size());\n> > + g_assert(gst_caps_is_writable(sensor_mode));\n> > + gst_libcamera_configure_stream_from_caps(stream_cfg, sensor_mode);\n> > + }\n> > +\n> >  if (flow_ret != GST_FLOW_OK)\n> >  goto done;\n> > @@ -624,6 +703,7 @@ gst_libcamera_src_task_leave([[maybe_unused]] GstTask *task,\n> >  g_clear_object(&self->allocator);\n> >  g_clear_pointer(&self->flow_combiner,\n> >  (GDestroyNotify)gst_flow_combiner_free);\n> > + gst_clear_caps(&self->sensor_modes);\n> >  }\n> >  static void\n> > @@ -659,6 +739,10 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id,\n> >  g_free(self->camera_name);\n> >  self->camera_name = g_value_dup_string(value);\n> >  break;\n> > + case PROP_SENSOR_MODE:\n> > + gst_clear_caps(&self->sensor_mode);\n> > + self->sensor_mode = GST_CAPS(g_value_dup_boxed(value));\n> > + break;\n> >  default:\n> >  G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n> >  break;\n> > @@ -676,6 +760,12 @@ gst_libcamera_src_get_property(GObject *object, guint prop_id, GValue *value,\n> >  case PROP_CAMERA_NAME:\n> >  g_value_set_string(value, self->camera_name);\n> >  break;\n> > + case PROP_SENSOR_MODES:\n> > + g_value_set_boxed(value, self->sensor_modes);\n> 
> + break;\n> > + case PROP_SENSOR_MODE:\n> > + g_value_set_boxed(value, self->sensor_mode);\n> > + break;\n> >  default:\n> >  G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n> >  break;\n> > @@ -763,6 +853,7 @@ gst_libcamera_src_init(GstLibcameraSrc *self)\n> >  /* C-style friend. */\n> >  state->src_ = self;\n> >  self->state = state;\n> > + self->sensor_mode = gst_caps_new_any();\n> >  }\n> >  static GstPad *\n> > @@ -844,4 +935,19 @@ gst_libcamera_src_class_init(GstLibcameraSrcClass *klass)\n> >  | G_PARAM_READWRITE\n> >  | G_PARAM_STATIC_STRINGS));\n> >  g_object_class_install_property(object_class, PROP_CAMERA_NAME, spec);\n> > +\n> > + spec = g_param_spec_boxed(\"sensor-modes\", \"Sensor Modes\",\n> > + \"GstCaps representing available sensor modes.\",\n> > + GST_TYPE_CAPS,\n> > + (GParamFlags)(G_PARAM_READABLE\n> > + | G_PARAM_STATIC_STRINGS));\n> > + g_object_class_install_property(object_class, PROP_SENSOR_MODES, spec);\n> > +\n> > + spec = g_param_spec_boxed(\"sensor-mode\", \"Sensor Mode\",\n> > + \"GstCaps representing selected sensor mode.\",\n> > + GST_TYPE_CAPS,\n> > + (GParamFlags)(GST_PARAM_MUTABLE_READY\n> > + | G_PARAM_READWRITE\n> > + | G_PARAM_STATIC_STRINGS));\n> > + g_object_class_install_property(object_class, PROP_SENSOR_MODE, spec);\n> >  }","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id ADF36C0F2A\n\tfor <parsemail@patchwork.libcamera.org>;\n\tFri, 24 Mar 2023 19:11:12 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 022C7603AA;\n\tFri, 24 Mar 2023 20:11:12 +0100 (CET)","from mail-qt1-x833.google.com (mail-qt1-x833.google.com\n\t[IPv6:2607:f8b0:4864:20::833])\n\tby 
lancelot.ideasonboard.com"],"Message-ID":"<09160ea338e680f94bf3da112a50eb08c7d5b6dd.camel@ndufresne.ca>","To":"Robert Mader <robert.mader@posteo.de>,\n\tlibcamera-devel@lists.libcamera.org","Date":"Fri, 24 Mar 2023 15:11:08 -0400","In-Reply-To":"<dc8e9d6a-a40e-7b20-d44c-eb9fb33bd01e@posteo.de>","References":"<20230324181247.302586-1-nicolas@ndufresne.ca>\n\t<20230324181247.302586-2-nicolas@ndufresne.ca>\n\t<dc8e9d6a-a40e-7b20-d44c-eb9fb33bd01e@posteo.de>","Subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","From":"Nicolas Dufresne via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Nicolas Dufresne <nicolas@ndufresne.ca>"}},{"id":26754,"web_url":"https://patchwork.libcamera.org/comment/26754/","msgid":"<20230327092155.kzzwjsr5ffnh4y7k@uno.localdomain>","date":"2023-03-27T09:21:55","subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","submitter":{"id":143,"url":"https://patchwork.libcamera.org/api/people/143/","name":"Jacopo Mondi","email":"jacopo.mondi@ideasonboard.com"},"content":"Hi Nicolas,\n\nOn Fri, Mar 24, 2023 at 02:12:45PM -0400, Nicolas Dufresne via libcamera-devel wrote:\n> From: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n>\n> This add support for selecting the sensor mode. A new read-only\n> property called sensor-modes is filled when the element reaches\n> READY state. It contains the list of all available sensor modes,\n> including the supported framerate range. 
This is exposed as GstCaps\n> in the form of sensor/mode,width=X,height=Y,format=Y,framerate=[...].\n> The format string matches the libcamera format string representation.\n>\n> The application can then select a mode using the read/write sensor-mode\n> control. The selected mode is also a caps, it will be intersected with\n> the supported mode and the \"best\" match will be picked. This allows\n> application to use simple filter when they want to pick a mode for lets\n> say a specific framerate (e.g. sensor/mode,framerate=60/1).\n>\n> Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> ---\n>  src/gstreamer/gstlibcamera-utils.cpp |   4 +\n>  src/gstreamer/gstlibcamerasrc.cpp    | 110 ++++++++++++++++++++++++++-\n>  2 files changed, 112 insertions(+), 2 deletions(-)\n>\n> diff --git a/src/gstreamer/gstlibcamera-utils.cpp b/src/gstreamer/gstlibcamera-utils.cpp\n> index 750ec351..c8a8df09 100644\n> --- a/src/gstreamer/gstlibcamera-utils.cpp\n> +++ b/src/gstreamer/gstlibcamera-utils.cpp\n> @@ -416,6 +416,10 @@ gst_libcamera_configure_stream_from_caps(StreamConfiguration &stream_cfg,\n>  \t\tstream_cfg.pixelFormat = gst_format_to_pixel_format(gst_format);\n>  \t} else if (gst_structure_has_name(s, \"image/jpeg\")) {\n>  \t\tstream_cfg.pixelFormat = formats::MJPEG;\n> +\t} else if (gst_structure_has_name(s, \"sensor/mode\")) {\n> +\t\tgst_structure_fixate_field(s, \"format\");\n> +\t\tconst gchar *format = gst_structure_get_string(s, \"format\");\n> +\t\tstream_cfg.pixelFormat = PixelFormat::fromString(format);\n>  \t} else {\n>  \t\tg_critical(\"Unsupported media type: %s\", gst_structure_get_name(s));\n>  \t}\n> diff --git a/src/gstreamer/gstlibcamerasrc.cpp b/src/gstreamer/gstlibcamerasrc.cpp\n> index a10cbd4f..2f05a03f 100644\n> --- a/src/gstreamer/gstlibcamerasrc.cpp\n> +++ b/src/gstreamer/gstlibcamerasrc.cpp\n> @@ -147,6 +147,9 @@ struct _GstLibcameraSrc {\n>\n>  \tgchar *camera_name;\n>\n> +\tGstCaps *sensor_modes;\n> +\tGstCaps *sensor_mode;\n> 
+\n>  \tGstLibcameraSrcState *state;\n>  \tGstLibcameraAllocator *allocator;\n>  \tGstFlowCombiner *flow_combiner;\n> @@ -154,7 +157,10 @@ struct _GstLibcameraSrc {\n>\n>  enum {\n>  \tPROP_0,\n> -\tPROP_CAMERA_NAME\n> +\tPROP_CAMERA_NAME,\n> +\tPROP_SENSOR_MODES,\n> +\tPROP_SENSOR_MODE,\n> +\tPROP_LAST\n>  };\n>\n>  G_DEFINE_TYPE_WITH_CODE(GstLibcameraSrc, gst_libcamera_src, GST_TYPE_ELEMENT,\n> @@ -318,6 +324,61 @@ int GstLibcameraSrcState::processRequest()\n>  \treturn err;\n>  }\n>\n> +static GstCaps *\n> +gst_libcamera_src_enumerate_sensor_modes(GstLibcameraSrc *self)\n> +{\n> +\tGstCaps *modes = gst_caps_new_empty();\n> +\tGstLibcameraSrcState *state = self->state;\n> +\tauto config = state->cam_->generateConfiguration({ libcamera::StreamRole::Raw,\n> +\t\t\t\t\t\t\t   libcamera::StreamRole::VideoRecording });\n\nThis clearly shows our API falls short, as the presence of the RAW\nstream, and the ability to capture it alongside other streams, is\nconditional on the platform's capabilities. So this can't be\nconsidered a sufficiently stable way of retrieving information about\nthe sensor's configuration. Not your fault, we currently don't have\nanything better to offer.\n\nWe currently have a way to report all the possible configurations for\na stream using the StreamFormat class. 
StreamFormat is currently\nnothing but\n\n\tstd::map<PixelFormat, std::vector<SizeRange>> formats_;\n\nwhich gets populated by the pipeline handler while creating\nthe StreamConfiguration items that will be part of the generated\nCameraConfiguration.\n\nThat's the \"Camera-provided information to applications\" side.\n\nOn the \"application configures the Camera\" side, StreamConfiguration\nis used to tell the library how a Stream should be configured.\nThese are the fields a StreamConfiguration describes:\n\n\tPixelFormat pixelFormat;\n\tSize size;\n\tunsigned int stride;\n\tunsigned int frameSize;\n\n\tunsigned int bufferCount;\n\n\tstd::optional<ColorSpace> colorSpace;\n\nwith pixelFormat, size, bufferCount and (optionally) colorSpace\nexpected to be set by the application.\n\nWe have been discussing adding a frameDuration to StreamConfiguration\nfor a long time. It is needed for some parts of the Android HAL as\nwell and, as your patch shows, it is a desirable feature in general.\nA similar reasoning can be made for the sensor's configuration, even\nif this received several pushbacks in the past when RPi wanted\nsomething similar. 
Furthermore, I would be\ntempted to consider the possible FrameDuration a property of the\nsensor's configuration, but I presume that advanced use cases for\nStillCapture chain more stages onto the canonical frame processing,\nso it's probably better to have the FrameDurations separate from the\nsensor configuration to allow describing additional delays there.\n\nSumming it all up, we have StreamFormats to convey back to\napplications what a Stream can do, and StreamConfiguration to set the\nStream characteristics.\n\nAddressing the needs you have here would require:\n\n- Pipeline handlers to populate StreamFormats not only with\n  <PixelFormat, vector<SizeRange>> but also with the possible sensor\n  configurations that can generate that mode and the possible frame\n  durations\n\n- Applications to populate a StreamConfiguration with the desired\n  FrameDuration and sensor mode\n\n- Pipeline handlers to validate that the desired sensor mode and\n  durations are compatible with the stream characteristics\n  (as an example, if your pipeline can't upscale, you can't have a\n  sensor mode with a size smaller than the desired output).\n\nThis puts more complexity on the pipeline side, in both the generation\nof configurations and in their validation, but would such an API\naddress your needs in your opinion? Have you had any thoughts on how\nyou would like to have this handled?\n\nThat's really an open question for everyone, as this part is becoming\ncritical, as this patch shows, and the workaround we pushed RPi to is\nreally not generic enough.\n\nOne more point here below.\n\nThanks\n   j\n\n-------------------------------------------------------------------------------\nSlightly unrelated point, but while you're here I would like to pick\nyour brain on the issue: the way StreamFormats are currently created\nis slightly under-specified. 
The documentation goes to great lengths in\ndescribing how the sizes associated with a PixelFormat can be\ninspected [1], but it only superficially describes how a pipeline\nhandler should populate the formats associated with a stream\nconfiguration [2]. I'll give two examples:\n- if the StreamConfiguration is for a 1080p Viewfinder stream, should\n  the associated StreamFormats list resolutions > 1080p?\n- if the StreamConfiguration is for a !Raw role, should RAW\n  pixelformats be listed in the StreamFormats?\n\nThere are arguments for both cases, but I guess the big picture\nquestion is about how an application should expect to retrieve the\nfull camera capabilities:\n- Does an application expect to receive only the available formats for\n  the role it has asked a configuration for?\n- Does an application expect to receive all the possible formats a\n  camera can produce, regardless of the size/pixelformat of the\n  StreamConfiguration that has been generated?\n\nAs a concrete example, currently the RkISP1 pipeline handler lists a\nRAW StreamFormat for all configurations, regardless of the role. 
I had\na patch to \"fix\" this but I've been pointed to the fact the API was\nnot very well specified.\n\nWhat would your expectations be in this case ?\n\n[1] https://libcamera.org/api-html/classlibcamera_1_1StreamFormats.html\n[2] https://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html#a37725e6b9e179ca80be0e57ed97857d3\n-------------------------------------------------------------------------------\n\n\n> +\tif (config == nullptr) {\n> +\t\tGST_DEBUG_OBJECT(self, \"Sensor mode selection is not supported, skipping enumeration.\");\n> +\t\treturn modes;\n> +\t}\n> +\n> +\tconst libcamera::StreamFormats &formats = config->at(0).formats();\n> +\n> +\tfor (const auto &pixfmt : formats.pixelformats()) {\n> +\t\tfor (const auto &size : formats.sizes(pixfmt)) {\n> +\t\t\tconfig->at(0).size = size;\n> +\t\t\tconfig->at(0).pixelFormat = pixfmt;\n> +\n> +\t\t\tif (config->validate() == CameraConfiguration::Invalid)\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tif (state->cam_->configure(config.get()))\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tauto fd_ctrl = state->cam_->controls().find(&controls::FrameDurationLimits);\n> +\t\t\tif (fd_ctrl == state->cam_->controls().end())\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tint minrate_num, minrate_denom;\n> +\t\t\tint maxrate_num, maxrate_denom;\n> +\t\t\tdouble min_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.max().get<int64_t>());\n> +\t\t\tdouble max_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.min().get<int64_t>());\n> +\t\t\tgst_util_double_to_fraction(min_framerate, &minrate_num, &minrate_denom);\n> +\t\t\tgst_util_double_to_fraction(max_framerate, &maxrate_num, &maxrate_denom);\n> +\n> +\t\t\tGstStructure *s = gst_structure_new(\"sensor/mode\",\n> +\t\t\t\t\t\t\t    \"format\", G_TYPE_STRING, pixfmt.toString().c_str(),\n> +\t\t\t\t\t\t\t    \"width\", G_TYPE_INT, size.width,\n> 
+\t\t\t\t\t\t\t    \"height\", G_TYPE_INT, size.height,\n> +\t\t\t\t\t\t\t    \"framerate\", GST_TYPE_FRACTION_RANGE,\n> +\t\t\t\t\t\t\t    minrate_num, minrate_denom,\n> +\t\t\t\t\t\t\t    maxrate_num, maxrate_denom,\n> +\t\t\t\t\t\t\t    nullptr);\n> +\t\t\tgst_caps_append_structure(modes, s);\n> +\t\t}\n> +\t}\n> +\n> +\tGST_DEBUG_OBJECT(self, \"Camera sensor modes: %\" GST_PTR_FORMAT, modes);\n> +\n> +\treturn modes;\n> +}\n> +\n>  static bool\n>  gst_libcamera_src_open(GstLibcameraSrc *self)\n>  {\n> @@ -375,6 +436,7 @@ gst_libcamera_src_open(GstLibcameraSrc *self)\n>  \t/* No need to lock here, we didn't start our threads yet. */\n>  \tself->state->cm_ = cm;\n>  \tself->state->cam_ = cam;\n> +\tself->sensor_modes = gst_libcamera_src_enumerate_sensor_modes(self);\n>\n>  \treturn true;\n>  }\n> @@ -462,6 +524,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n>  \tGstLibcameraSrcState *state = self->state;\n>  \tGstFlowReturn flow_ret = GST_FLOW_OK;\n>  \tgint ret;\n> +\tg_autoptr(GstCaps) sensor_mode = nullptr;\n>\n>  \tg_autoptr(GstStructure) element_caps = gst_structure_new_empty(\"caps\");\n>\n> @@ -481,6 +544,16 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n>  \t\troles.push_back(gst_libcamera_pad_get_role(srcpad));\n>  \t}\n>\n> +\tif (!gst_caps_is_any(self->sensor_mode)) {\n> +\t\tsensor_mode = gst_caps_intersect(self->sensor_mode, self->sensor_modes);\n> +\t\tif (!gst_caps_is_empty(sensor_mode)) {\n> +\t\t\troles.push_back(libcamera::StreamRole::Raw);\n> +\t\t} else {\n> +\t\t\tGST_WARNING_OBJECT(self, \"No sensor mode matching the selection, ignoring.\");\n> +\t\t\tgst_clear_caps(&sensor_mode);\n> +\t\t}\n> +\t}\n> +\n>  \t/* Generate the stream configurations, there should be one per pad. 
*/\n>  \tstate->config_ = state->cam_->generateConfiguration(roles);\n>  \tif (state->config_ == nullptr) {\n> @@ -490,7 +563,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n>  \t\tgst_task_stop(task);\n>  \t\treturn;\n>  \t}\n> -\tg_assert(state->config_->size() == state->srcpads_.size());\n> +\tg_assert(state->config_->size() == state->srcpads_.size() + (!!sensor_mode));\n>\n>  \tfor (gsize i = 0; i < state->srcpads_.size(); i++) {\n>  \t\tGstPad *srcpad = state->srcpads_[i];\n> @@ -510,6 +583,12 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n>  \t\tgst_libcamera_get_framerate_from_caps(caps, element_caps);\n>  \t}\n>\n> +\tif (sensor_mode) {\n> +\t\tStreamConfiguration &stream_cfg = state->config_->at(state->srcpads_.size());\n> +\t\tg_assert(gst_caps_is_writable(sensor_mode));\n> +\t\tgst_libcamera_configure_stream_from_caps(stream_cfg, sensor_mode);\n> +\t}\n> +\n>  \tif (flow_ret != GST_FLOW_OK)\n>  \t\tgoto done;\n>\n> @@ -624,6 +703,7 @@ gst_libcamera_src_task_leave([[maybe_unused]] GstTask *task,\n>  \tg_clear_object(&self->allocator);\n>  \tg_clear_pointer(&self->flow_combiner,\n>  \t\t\t(GDestroyNotify)gst_flow_combiner_free);\n> +\tgst_clear_caps(&self->sensor_modes);\n>  }\n>\n>  static void\n> @@ -659,6 +739,10 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id,\n>  \t\tg_free(self->camera_name);\n>  \t\tself->camera_name = g_value_dup_string(value);\n>  \t\tbreak;\n> +\tcase PROP_SENSOR_MODE:\n> +\t\tgst_clear_caps(&self->sensor_mode);\n> +\t\tself->sensor_mode = GST_CAPS(g_value_dup_boxed(value));\n> +\t\tbreak;\n>  \tdefault:\n>  \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n>  \t\tbreak;\n> @@ -676,6 +760,12 @@ gst_libcamera_src_get_property(GObject *object, guint prop_id, GValue *value,\n>  \tcase PROP_CAMERA_NAME:\n>  \t\tg_value_set_string(value, self->camera_name);\n>  \t\tbreak;\n> +\tcase PROP_SENSOR_MODES:\n> +\t\tg_value_set_boxed(value, 
self->sensor_modes);\n> +\t\tbreak;\n> +\tcase PROP_SENSOR_MODE:\n> +\t\tg_value_set_boxed(value, self->sensor_mode);\n> +\t\tbreak;\n>  \tdefault:\n>  \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n>  \t\tbreak;\n> @@ -763,6 +853,7 @@ gst_libcamera_src_init(GstLibcameraSrc *self)\n>  \t/* C-style friend. */\n>  \tstate->src_ = self;\n>  \tself->state = state;\n> +\tself->sensor_mode = gst_caps_new_any();\n>  }\n>\n>  static GstPad *\n> @@ -844,4 +935,19 @@ gst_libcamera_src_class_init(GstLibcameraSrcClass *klass)\n>  \t\t\t\t\t\t\t     | G_PARAM_READWRITE\n>  \t\t\t\t\t\t\t     | G_PARAM_STATIC_STRINGS));\n>  \tg_object_class_install_property(object_class, PROP_CAMERA_NAME, spec);\n> +\n> +\tspec = g_param_spec_boxed(\"sensor-modes\", \"Sensor Modes\",\n> +\t\t\t\t  \"GstCaps representing available sensor modes.\",\n> +\t\t\t\t  GST_TYPE_CAPS,\n> +\t\t\t\t  (GParamFlags)(G_PARAM_READABLE\n> +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> +\tg_object_class_install_property(object_class, PROP_SENSOR_MODES, spec);\n> +\n> +\tspec = g_param_spec_boxed(\"sensor-mode\", \"Sensor Mode\",\n> +\t\t\t\t  \"GstCaps representing selected sensor mode.\",\n> +\t\t\t\t  GST_TYPE_CAPS,\n> +\t\t\t\t  (GParamFlags)(GST_PARAM_MUTABLE_READY\n> +\t\t\t\t\t        | G_PARAM_READWRITE\n> +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> +\tg_object_class_install_property(object_class, PROP_SENSOR_MODE, spec);\n>  }\n> --\n> 2.39.2\n>","headers":{"Message-ID":"<20230327092155.kzzwjsr5ffnh4y7k@uno.localdomain>","Date":"Mon, 27 Mar 2023 11:21:55 +0200","To":"Nicolas Dufresne <nicolas@ndufresne.ca>","References":"<20230324181247.302586-1-nicolas@ndufresne.ca>\n\t<20230324181247.302586-2-nicolas@ndufresne.ca>","In-Reply-To":"<20230324181247.302586-2-nicolas@ndufresne.ca>","Subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","From":"Jacopo Mondi via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org,\n\tNicolas Dufresne <nicolas.dufresne@collabora.com>"}},{"id":26770,"web_url":"https://patchwork.libcamera.org/comment/26770/","msgid":"<db46d6b2a3dc2f48b8a8323f32bebd522235c9bf.camel@collabora.com>","date":"2023-03-27T13:40:57","subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","submitter":{"id":31,"url":"https://patchwork.libcamera.org/api/people/31/","name":"Nicolas Dufresne","email":"nicolas.dufresne@collabora.com"},"content":"On Monday, 27 March 2023 at 11:21 +0200, Jacopo Mondi wrote:\n> Hi Nicolas,\n> \n> On Fri, Mar 24, 
2023 at 02:12:45PM -0400, Nicolas Dufresne via libcamera-devel wrote:\n> > From: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> > \n> > This add support for selecting the sensor mode. A new read-only\n> > property called sensor-modes is filled when the element reaches\n> > READY state. It contains the list of all available sensor modes,\n> > including the supported framerate range. This is exposed as GstCaps\n> > in the form of sensor/mode,width=X,height=Y,format=Y,framerate=[...].\n> > The format string matches the libcamera format string representation.\n> > \n> > The application can then select a mode using the read/write sensor-mode\n> > control. The selected mode is also a caps, it will be intersected with\n> > the supported mode and the \"best\" match will be picked. This allows\n> > application to use simple filter when they want to pick a mode for lets\n> > say a specific framerate (e.g. sensor/mode,framerate=60/1).\n> > \n> > Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> > ---\n> >  src/gstreamer/gstlibcamera-utils.cpp |   4 +\n> >  src/gstreamer/gstlibcamerasrc.cpp    | 110 ++++++++++++++++++++++++++-\n> >  2 files changed, 112 insertions(+), 2 deletions(-)\n> > \n> > diff --git a/src/gstreamer/gstlibcamera-utils.cpp b/src/gstreamer/gstlibcamera-utils.cpp\n> > index 750ec351..c8a8df09 100644\n> > --- a/src/gstreamer/gstlibcamera-utils.cpp\n> > +++ b/src/gstreamer/gstlibcamera-utils.cpp\n> > @@ -416,6 +416,10 @@ gst_libcamera_configure_stream_from_caps(StreamConfiguration &stream_cfg,\n> >  \t\tstream_cfg.pixelFormat = gst_format_to_pixel_format(gst_format);\n> >  \t} else if (gst_structure_has_name(s, \"image/jpeg\")) {\n> >  \t\tstream_cfg.pixelFormat = formats::MJPEG;\n> > +\t} else if (gst_structure_has_name(s, \"sensor/mode\")) {\n> > +\t\tgst_structure_fixate_field(s, \"format\");\n> > +\t\tconst gchar *format = gst_structure_get_string(s, \"format\");\n> > +\t\tstream_cfg.pixelFormat = PixelFormat::fromString(format);\n> 
>  \t} else {\n> >  \t\tg_critical(\"Unsupported media type: %s\", gst_structure_get_name(s));\n> >  \t}\n> > diff --git a/src/gstreamer/gstlibcamerasrc.cpp b/src/gstreamer/gstlibcamerasrc.cpp\n> > index a10cbd4f..2f05a03f 100644\n> > --- a/src/gstreamer/gstlibcamerasrc.cpp\n> > +++ b/src/gstreamer/gstlibcamerasrc.cpp\n> > @@ -147,6 +147,9 @@ struct _GstLibcameraSrc {\n> > \n> >  \tgchar *camera_name;\n> > \n> > +\tGstCaps *sensor_modes;\n> > +\tGstCaps *sensor_mode;\n> > +\n> >  \tGstLibcameraSrcState *state;\n> >  \tGstLibcameraAllocator *allocator;\n> >  \tGstFlowCombiner *flow_combiner;\n> > @@ -154,7 +157,10 @@ struct _GstLibcameraSrc {\n> > \n> >  enum {\n> >  \tPROP_0,\n> > -\tPROP_CAMERA_NAME\n> > +\tPROP_CAMERA_NAME,\n> > +\tPROP_SENSOR_MODES,\n> > +\tPROP_SENSOR_MODE,\n> > +\tPROP_LAST\n> >  };\n> > \n> >  G_DEFINE_TYPE_WITH_CODE(GstLibcameraSrc, gst_libcamera_src, GST_TYPE_ELEMENT,\n> > @@ -318,6 +324,61 @@ int GstLibcameraSrcState::processRequest()\n> >  \treturn err;\n> >  }\n> > \n> > +static GstCaps *\n> > +gst_libcamera_src_enumerate_sensor_modes(GstLibcameraSrc *self)\n> > +{\n> > +\tGstCaps *modes = gst_caps_new_empty();\n> > +\tGstLibcameraSrcState *state = self->state;\n> > +\tauto config = state->cam_->generateConfiguration({ libcamera::StreamRole::Raw,\n> > +\t\t\t\t\t\t\t   libcamera::StreamRole::VideoRecording });\n> \n> This clearly shows our API falls short, as the presence of the RAW\n> stream, and the ability to capture it alongside other streams, is\n> conditional on the platform's capabilities. So this can't be\n> considered a sufficiently stable way of retrieving information about\n> the sensor's configuration. Not your fault, we currently don't have\n> anything better to offer.\n> \n> We currently have a way to report all the possible configurations for\n> a stream using the StreamFormat class. 
StreamFormat is currently\n> nothing but\n> \n> \tstd::map<PixelFormat, std::vector<SizeRange>> formats_;\n> \n> which gets populated by the pipeline handler while creating\n> the StreamConfiguration items which will be part of the generated\n> CameraConfiguration.\n> \n> That's for the \"Camera-provided information to applications\" side.\n> \n> On the \"application to configure the Camera\" side, StreamConfiguration\n> is used to tell the library how a Stream should be configured.\n> These are the fields a StreamConfiguration describe:\n> \n> \tPixelFormat pixelFormat;\n> \tSize size;\n> \tunsigned int stride;\n> \tunsigned int frameSize;\n> \n> \tunsigned int bufferCount;\n> \n> \tstd::optional<ColorSpace> colorSpace;\n> \n> With pixelformat, size, bufferCount and (optionally) colorSpace which\n> are expected to be set by the application.\n> \n> We have been discussing about adding to StreamConfiguration a\n> frameDuration since a long time. It is needed for some parts of the\n> Android HAL as well, and in general, as your patch shows, it is a\n> desirable feature. A similar reasoning can be made for the sensor's\n> configuration, even if this has received several push backs in the\n> past when RPi wanted something similar. Furthermore, I would be\n> tempted to consider the possible FrameDuration a property of the\n> sensor's configuration, but I presume that advanced use cases for\n> StillCapture chain to the canonical frame processing more stages, so\n> I presume it's better to have the FrameDurations separate from the\n> sensor configuration to allow describe additional delays there.\n\nJust a note here: what matters to applications is not so much the final\nframe duration (framerate) but the maximum imposed by the selected camera\nmode. An application will simply want to advertise, let's say, 60 fps if\nthere is a mode that can theoretically achieve it. 
At the moment, it falls short: even if\nsuch a mode exists, if you aren't lucky enough, it may not work.\n\n> \n> Summing it all up, we have StreamFormats to convey back to application\n> what a Stream can do, and StreamConfiguration to set the Stream\n> characteristics.\n> \n> Addressing the needs you have here would require:\n> \n> - Pipeline handlers to populate StreamFormats not only with\n>   <PixelFormat, vector<SizeRange>> but also with the possible sensor's\n>   configuration that can generated that mode and the possible frame\n>   durations\n> \n> - Application to populate a StreamConfiguration with the desired\n>   FrameDuration and sensor mode\n> \n> - Pipeline handler to validate that the desired sensor mode and\n>   durations are compatible with the stream characteristics.\n>   (as an example, if your pipeline can't upscale, you can't have a\n>   sensor mode with a size smaller than the desired output).\n> \n> This puts more complexity on the pipeline side, in both the generation\n> of configurations and in their validation, but would such an API\n> address your needs in your opinion ? Have you had any thoughts on how\n> you would like to have this handled ?\n\nCurrently, stepping back from the implementation details, I identify two\nissues.\n\n1. The pipeline's inability to do the right thing.\n\nAs the pipeline is not aware of the desired frameDuration(s) at the time of\nvalidation, it is unable to do the right thing for an application that wants,\nsay, 60 fps when the \"best\" mode does not cover it.\n\n2. The lack of direct sensor mode enumeration.\n\nThis is what you describe, but my understanding is that a sensor mode\ndescription goes beyond format/width/height/frameDuration. For an\napplication, it is imperative to know what portion of the original field of\nview will remain, and what the initial field of view is. 
I'm not sure I have collected all the details,\nif any of this actually exists.\n\nSo, if I read your proposal well, you suggest adding all supported camera\nmodes to each of the enumerated format/size pairs. As the camera\nconfiguration becomes a thing, we then have the ability to introduce more\noptional metadata there to solve 2. I think that could work, as there is\ncertainly a strong correlation between the format/size and the available\nmodes. Making a list of modes like libcamera-apps does would be more\ndifficult though (apps have to deal with duplicates). Is that a concern?\n(It's not from my point of view, and we can use immutable shared pointers\nin C++ to do this at no cost; we could even define that pointer comparison\nis enough to match duplicates.)\n\n> \n> That's really an open question for everyone, as I guess this part is\n> becoming critical as this patch shows and the workaround we pushed RPi\n> to is really not generic enough.\n\nAgreed, and looking for feedback. I'd be happy to help with the subject. I\nknow this patch only makes an existing issue more visible, but I'm also\nsharing it as it may temporarily be useful to some users (as a downstream\npatch).\n\n> \n> One more point here below\n> \n> Thanks\n>    j\n> \n> -------------------------------------------------------------------------------\n> Slightly unrelated point, but while you're here I would like to pick\n> your brain on the issue: the way StreamFormat are currently created is\n> slightly under-specified. The documentation goes in great length in\n> describing how the sizes associated with PixelFormat can be inspected\n> [1] but it only superficially describe how a pipeline handler should\n> populate the formats associated with a stream configuration [2]. I'll\n> make two examples:\n> - if the StreamConfiguration is for a 1080p Viewfinder stream should\n>   the associated StreamFormats list resolutions > 1080p ?\n\nIf you have scaling in your ISP, I think it should. 
I notice that feeding the\nISP with a higher resolution often leads to a cleaner image. An application\ncan then decide to focus on reducing the bandwidth (and possibly improving\nbattery life as a side effect) by matching the resolution.\n\nThe other reason is that your viewFinder will likely match the display\nresolution. If you do viewFinder + still pictures, and you don't want the\nviewFinder going blank while taking a picture, you will have no choice but\nto set the sensor at the desired still picture resolution.\n\nFinally, if you don't expose the higher resolutions, a generic caps\nmechanism like GstCaps can't be used. When you have multiple streams, you\nwill want to intersect the sensor configurations for every stream in order\nto find the remaining available sensor modes you can pick from. GstCaps\nwill not allow you to select higher sensor resolutions if you don't include\nthem, as there will be no intersection. This is a mechanism you'll find in\nGStreamer and PipeWire, which depend on having capabilities that are close\nto the mathematical definition of sets (with their own salt).\n\n> - if the StreamConfiguration is for a !Raw role, should RAW\n>   pixelformats be listed in the StreamFormats ?\n\nI've been trying to answer this one, but I realize I don't know if I\nunderstand the full extent of the question. Is the Raw role going away? If\nyes, then we'll have to offer the raw formats in all the other roles. But\nwe then lose a bit, as the Raw role has a special meaning and can greatly\nhelp the pipeline when configuring (a Raw role configuration has precedence\nover all the others, making things a little simpler when fixating the\nconfiguration). 
I think we have to\ndecide, but as of today's thinking, the raw role could remain; it's just that\nits configuration would be 1:1 with the sensor configuration.\n\n> \n> There are arguments for both cases, but I guess the big picture\n> question is about how an application should expect to retrieve the\n> full camera capabilities:\n> - Does an application expect to receive only the available formats for\n>   the role it has asked a configuration for ?\n> - Does an application expect to receive all the possible formats a\n>   camera can produce regardless of the size/pixelformat of the\n>   StreamConfiguration that has been generated ?\n> \n> As a concrete example, currently the RkISP1 pipeline handler lists a\n> RAW StreamFormat for all configurations, regardless of the role. I had\n> a patch to \"fix\" this but I've been pointed to the fact the API was\n> not very well specified.\n> \n> What would your expectations be in this case ?\n\nThat is interesting, so today we have a bit of both. The most basic applications\nneed to match two roles already (ViewFinder + X). So in terms of sensor\nconfiguration, without going into slow validation loops, they have to cross-reference\nthe information across the roles.\n\nSo overall, I think my preference would be something structured like this:\n\nRole1:\n  + format/size\n\t- Sensor Config1\n\t- Sensor Config2\n\t- Sensor Config3\n  + format/size\n\t- Sensor Config2\nRole2:\n  ...\n\nThe stream configuration as defined today seems well suited. Keeping the fixated\nframeDuration away from it seems unavoidable. There are just too many variables\ninvolved in fixating the frameDuration, on top of which applications are much\nmore flexible than they pretend. As an example, when I see:\n\n   libcamerasrc ! video/x-raw,framerate=60/1 ! ...\n\nI see a very naive application. I'm very doubtful that 121/2 or 119/2 would not\nhave done the job equally well. And even 50/1 would likely be fine for most use\ncases. 
In GStreamer, you can (and should) use ranges when setting filters.\n\n  libcamerasrc ! video/x-raw,framerate=[40/1,70/1] ! ...\n\nIf you have a strong preference for 60, you should hint it this way (note, as\nI said in the review, the newly added caps negotiation is not working yet for\nthat):\n\n  libcamerasrc ! video/x-raw,framerate=60/1;video/x-raw,framerate=[40/1,70/1] ! ...\n\nAnd libcamerasrc is free to read the first structure, save all the fixed\nvalues as the \"default\", and later use a targeted fixation function\n(gst_structure_fixate_nearest_fraction()). This gets complex quickly, so no one\never implements this ahead of time, but it's all there and well defined. And it is quite\nrepresentative of how far an application can go to do the right thing.\n\nThe way the sensor configuration would be chosen, whether by the application or the\npipeline, would be to collect all sensor configurations for each stream's fixated\nstream configuration, intersect them, and finally score the remaining ones to pick the\nbest.\n\nHope this quick GstCaps intro can help,\nNicolas\n\n> \n> [1] https://libcamera.org/api-html/classlibcamera_1_1StreamFormats.html\n> [2] https://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html#a37725e6b9e179ca80be0e57ed97857d3\n> -------------------------------------------------------------------------------\n> \n> \n> > +\tif (config == nullptr) {\n> > +\t\tGST_DEBUG_OBJECT(self, \"Sensor mode selection is not supported, skipping enumeration.\");\n> > +\t\treturn modes;\n> > +\t}\n> > +\n> > +\tconst libcamera::StreamFormats &formats = config->at(0).formats();\n> > +\n> > +\tfor (const auto &pixfmt : formats.pixelformats()) {\n> > +\t\tfor (const auto &size : formats.sizes(pixfmt)) {\n> > +\t\t\tconfig->at(0).size = size;\n> > +\t\t\tconfig->at(0).pixelFormat = pixfmt;\n> > +\n> > +\t\t\tif (config->validate() == CameraConfiguration::Invalid)\n> > +\t\t\t\tcontinue;\n> > +\n> > +\t\t\tif (state->cam_->configure(config.get()))\n> > +\t\t\t\tcontinue;\n> > +\n> 
> +\t\t\tauto fd_ctrl = state->cam_->controls().find(&controls::FrameDurationLimits);\n> > +\t\t\tif (fd_ctrl == state->cam_->controls().end())\n> > +\t\t\t\tcontinue;\n> > +\n> > +\t\t\tint minrate_num, minrate_denom;\n> > +\t\t\tint maxrate_num, maxrate_denom;\n> > +\t\t\tdouble min_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> > +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.max().get<int64_t>());\n> > +\t\t\tdouble max_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> > +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.min().get<int64_t>());\n> > +\t\t\tgst_util_double_to_fraction(min_framerate, &minrate_num, &minrate_denom);\n> > +\t\t\tgst_util_double_to_fraction(max_framerate, &maxrate_num, &maxrate_denom);\n> > +\n> > +\t\t\tGstStructure *s = gst_structure_new(\"sensor/mode\",\n> > +\t\t\t\t\t\t\t    \"format\", G_TYPE_STRING, pixfmt.toString().c_str(),\n> > +\t\t\t\t\t\t\t    \"width\", G_TYPE_INT, size.width,\n> > +\t\t\t\t\t\t\t    \"height\", G_TYPE_INT, size.height,\n> > +\t\t\t\t\t\t\t    \"framerate\", GST_TYPE_FRACTION_RANGE,\n> > +\t\t\t\t\t\t\t    minrate_num, minrate_denom,\n> > +\t\t\t\t\t\t\t    maxrate_num, maxrate_denom,\n> > +\t\t\t\t\t\t\t    nullptr);\n> > +\t\t\tgst_caps_append_structure(modes, s);\n> > +\t\t}\n> > +\t}\n> > +\n> > +\tGST_DEBUG_OBJECT(self, \"Camera sensor modes: %\" GST_PTR_FORMAT, modes);\n> > +\n> > +\treturn modes;\n> > +}\n> > +\n> >  static bool\n> >  gst_libcamera_src_open(GstLibcameraSrc *self)\n> >  {\n> > @@ -375,6 +436,7 @@ gst_libcamera_src_open(GstLibcameraSrc *self)\n> >  \t/* No need to lock here, we didn't start our threads yet. 
*/\n> >  \tself->state->cm_ = cm;\n> >  \tself->state->cam_ = cam;\n> > +\tself->sensor_modes = gst_libcamera_src_enumerate_sensor_modes(self);\n> > \n> >  \treturn true;\n> >  }\n> > @@ -462,6 +524,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> >  \tGstLibcameraSrcState *state = self->state;\n> >  \tGstFlowReturn flow_ret = GST_FLOW_OK;\n> >  \tgint ret;\n> > +\tg_autoptr(GstCaps) sensor_mode = nullptr;\n> > \n> >  \tg_autoptr(GstStructure) element_caps = gst_structure_new_empty(\"caps\");\n> > \n> > @@ -481,6 +544,16 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> >  \t\troles.push_back(gst_libcamera_pad_get_role(srcpad));\n> >  \t}\n> > \n> > +\tif (!gst_caps_is_any(self->sensor_mode)) {\n> > +\t\tsensor_mode = gst_caps_intersect(self->sensor_mode, self->sensor_modes);\n> > +\t\tif (!gst_caps_is_empty(sensor_mode)) {\n> > +\t\t\troles.push_back(libcamera::StreamRole::Raw);\n> > +\t\t} else {\n> > +\t\t\tGST_WARNING_OBJECT(self, \"No sensor mode matching the selection, ignoring.\");\n> > +\t\t\tgst_clear_caps(&sensor_mode);\n> > +\t\t}\n> > +\t}\n> > +\n> >  \t/* Generate the stream configurations, there should be one per pad. 
*/\n> >  \tstate->config_ = state->cam_->generateConfiguration(roles);\n> >  \tif (state->config_ == nullptr) {\n> > @@ -490,7 +563,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> >  \t\tgst_task_stop(task);\n> >  \t\treturn;\n> >  \t}\n> > -\tg_assert(state->config_->size() == state->srcpads_.size());\n> > +\tg_assert(state->config_->size() == state->srcpads_.size() + (!!sensor_mode));\n> > \n> >  \tfor (gsize i = 0; i < state->srcpads_.size(); i++) {\n> >  \t\tGstPad *srcpad = state->srcpads_[i];\n> > @@ -510,6 +583,12 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> >  \t\tgst_libcamera_get_framerate_from_caps(caps, element_caps);\n> >  \t}\n> > \n> > +\tif (sensor_mode) {\n> > +\t\tStreamConfiguration &stream_cfg = state->config_->at(state->srcpads_.size());\n> > +\t\tg_assert(gst_caps_is_writable(sensor_mode));\n> > +\t\tgst_libcamera_configure_stream_from_caps(stream_cfg, sensor_mode);\n> > +\t}\n> > +\n> >  \tif (flow_ret != GST_FLOW_OK)\n> >  \t\tgoto done;\n> > \n> > @@ -624,6 +703,7 @@ gst_libcamera_src_task_leave([[maybe_unused]] GstTask *task,\n> >  \tg_clear_object(&self->allocator);\n> >  \tg_clear_pointer(&self->flow_combiner,\n> >  \t\t\t(GDestroyNotify)gst_flow_combiner_free);\n> > +\tgst_clear_caps(&self->sensor_modes);\n> >  }\n> > \n> >  static void\n> > @@ -659,6 +739,10 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id,\n> >  \t\tg_free(self->camera_name);\n> >  \t\tself->camera_name = g_value_dup_string(value);\n> >  \t\tbreak;\n> > +\tcase PROP_SENSOR_MODE:\n> > +\t\tgst_clear_caps(&self->sensor_mode);\n> > +\t\tself->sensor_mode = GST_CAPS(g_value_dup_boxed(value));\n> > +\t\tbreak;\n> >  \tdefault:\n> >  \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n> >  \t\tbreak;\n> > @@ -676,6 +760,12 @@ gst_libcamera_src_get_property(GObject *object, guint prop_id, GValue *value,\n> >  \tcase PROP_CAMERA_NAME:\n> >  \t\tg_value_set_string(value, 
self->camera_name);\n> >  \t\tbreak;\n> > +\tcase PROP_SENSOR_MODES:\n> > +\t\tg_value_set_boxed(value, self->sensor_modes);\n> > +\t\tbreak;\n> > +\tcase PROP_SENSOR_MODE:\n> > +\t\tg_value_set_boxed(value, self->sensor_mode);\n> > +\t\tbreak;\n> >  \tdefault:\n> >  \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n> >  \t\tbreak;\n> > @@ -763,6 +853,7 @@ gst_libcamera_src_init(GstLibcameraSrc *self)\n> >  \t/* C-style friend. */\n> >  \tstate->src_ = self;\n> >  \tself->state = state;\n> > +\tself->sensor_mode = gst_caps_new_any();\n> >  }\n> > \n> >  static GstPad *\n> > @@ -844,4 +935,19 @@ gst_libcamera_src_class_init(GstLibcameraSrcClass *klass)\n> >  \t\t\t\t\t\t\t     | G_PARAM_READWRITE\n> >  \t\t\t\t\t\t\t     | G_PARAM_STATIC_STRINGS));\n> >  \tg_object_class_install_property(object_class, PROP_CAMERA_NAME, spec);\n> > +\n> > +\tspec = g_param_spec_boxed(\"sensor-modes\", \"Sensor Modes\",\n> > +\t\t\t\t  \"GstCaps representing available sensor modes.\",\n> > +\t\t\t\t  GST_TYPE_CAPS,\n> > +\t\t\t\t  (GParamFlags)(G_PARAM_READABLE\n> > +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> > +\tg_object_class_install_property(object_class, PROP_SENSOR_MODES, spec);\n> > +\n> > +\tspec = g_param_spec_boxed(\"sensor-mode\", \"Sensor Mode\",\n> > +\t\t\t\t  \"GstCaps representing selected sensor mode.\",\n> > +\t\t\t\t  GST_TYPE_CAPS,\n> > +\t\t\t\t  (GParamFlags)(GST_PARAM_MUTABLE_READY\n> > +\t\t\t\t\t        | G_PARAM_READWRITE\n> > +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> > +\tg_object_class_install_property(object_class, PROP_SENSOR_MODE, spec);\n> >  }\n> > --\n> > 2.39.2\n> >","headers":{"Message-ID":"<db46d6b2a3dc2f48b8a8323f32bebd522235c9bf.camel@collabora.com>","To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Date":"Mon, 27 Mar 2023 09:40:57 -0400","In-Reply-To":"<20230327092155.kzzwjsr5ffnh4y7k@uno.localdomain>","References":"<20230324181247.302586-1-nicolas@ndufresne.ca>\n\t<20230324181247.302586-2-nicolas@ndufresne.ca>\n\t<20230327092155.kzzwjsr5ffnh4y7k@uno.localdomain>","Subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","From":"Nicolas Dufresne via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Nicolas Dufresne <nicolas.dufresne@collabora.com>","Cc":"libcamera-devel@lists.libcamera.org"}},{"id":26778,"web_url":"https://patchwork.libcamera.org/comment/26778/","msgid":"<20230327151916.bxhuho3gez7kkvz7@uno.localdomain>","date":"2023-03-27T15:19:16","subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","submitter":{"id":143,"url":"https://patchwork.libcamera.org/api/people/143/","name":"Jacopo Mondi","email":"jacopo.mondi@ideasonboard.com"},"content":"Hi Nicolas,\n\nOn Mon, Mar 27, 2023 at 09:40:57AM -0400, Nicolas Dufresne wrote:\n> On Monday, March 27, 2023 at 11:21 +0200, Jacopo Mondi wrote:\n> > Hi Nicolas,\n> >\n> > On Fri, Mar 24, 2023 at 02:12:45PM -0400, Nicolas Dufresne via libcamera-devel wrote:\n> > > From: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> > >\n> > > This add support for selecting the sensor mode. A new read-only\n> > > property called sensor-modes is filled when the element reaches\n> > > READY state. 
It contains the list of all available sensor modes,\n> > > including the supported framerate range. This is exposed as GstCaps\n> > > in the form of sensor/mode,width=X,height=Y,format=Y,framerate=[...].\n> > > The format string matches the libcamera format string representation.\n> > >\n> > > The application can then select a mode using the read/write sensor-mode\n> > > control. The selected mode is also a caps, it will be intersected with\n> > > the supported mode and the \"best\" match will be picked. This allows\n> > > application to use simple filter when they want to pick a mode for lets\n> > > say a specific framerate (e.g. sensor/mode,framerate=60/1).\n> > >\n> > > Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> > > ---\n> > >  src/gstreamer/gstlibcamera-utils.cpp |   4 +\n> > >  src/gstreamer/gstlibcamerasrc.cpp    | 110 ++++++++++++++++++++++++++-\n> > >  2 files changed, 112 insertions(+), 2 deletions(-)\n> > >\n> > > diff --git a/src/gstreamer/gstlibcamera-utils.cpp b/src/gstreamer/gstlibcamera-utils.cpp\n> > > index 750ec351..c8a8df09 100644\n> > > --- a/src/gstreamer/gstlibcamera-utils.cpp\n> > > +++ b/src/gstreamer/gstlibcamera-utils.cpp\n> > > @@ -416,6 +416,10 @@ gst_libcamera_configure_stream_from_caps(StreamConfiguration &stream_cfg,\n> > >  \t\tstream_cfg.pixelFormat = gst_format_to_pixel_format(gst_format);\n> > >  \t} else if (gst_structure_has_name(s, \"image/jpeg\")) {\n> > >  \t\tstream_cfg.pixelFormat = formats::MJPEG;\n> > > +\t} else if (gst_structure_has_name(s, \"sensor/mode\")) {\n> > > +\t\tgst_structure_fixate_field(s, \"format\");\n> > > +\t\tconst gchar *format = gst_structure_get_string(s, \"format\");\n> > > +\t\tstream_cfg.pixelFormat = PixelFormat::fromString(format);\n> > >  \t} else {\n> > >  \t\tg_critical(\"Unsupported media type: %s\", gst_structure_get_name(s));\n> > >  \t}\n> > > diff --git a/src/gstreamer/gstlibcamerasrc.cpp b/src/gstreamer/gstlibcamerasrc.cpp\n> > > index a10cbd4f..2f05a03f 
100644\n> > > --- a/src/gstreamer/gstlibcamerasrc.cpp\n> > > +++ b/src/gstreamer/gstlibcamerasrc.cpp\n> > > @@ -147,6 +147,9 @@ struct _GstLibcameraSrc {\n> > >\n> > >  \tgchar *camera_name;\n> > >\n> > > +\tGstCaps *sensor_modes;\n> > > +\tGstCaps *sensor_mode;\n> > > +\n> > >  \tGstLibcameraSrcState *state;\n> > >  \tGstLibcameraAllocator *allocator;\n> > >  \tGstFlowCombiner *flow_combiner;\n> > > @@ -154,7 +157,10 @@ struct _GstLibcameraSrc {\n> > >\n> > >  enum {\n> > >  \tPROP_0,\n> > > -\tPROP_CAMERA_NAME\n> > > +\tPROP_CAMERA_NAME,\n> > > +\tPROP_SENSOR_MODES,\n> > > +\tPROP_SENSOR_MODE,\n> > > +\tPROP_LAST\n> > >  };\n> > >\n> > >  G_DEFINE_TYPE_WITH_CODE(GstLibcameraSrc, gst_libcamera_src, GST_TYPE_ELEMENT,\n> > > @@ -318,6 +324,61 @@ int GstLibcameraSrcState::processRequest()\n> > >  \treturn err;\n> > >  }\n> > >\n> > > +static GstCaps *\n> > > +gst_libcamera_src_enumerate_sensor_modes(GstLibcameraSrc *self)\n> > > +{\n> > > +\tGstCaps *modes = gst_caps_new_empty();\n> > > +\tGstLibcameraSrcState *state = self->state;\n> > > +\tauto config = state->cam_->generateConfiguration({ libcamera::StreamRole::Raw,\n> > > +\t\t\t\t\t\t\t   libcamera::StreamRole::VideoRecording });\n> >\n> > This clearly shows our API falls short, as the presence of the RAW\n> > stream, and the ability to capture it alongside other streams, is\n> > conditional on the platform's capabilities. So this can't be\n> > considered a sufficiently stable way of retrieving information about\n> > the sensor's configuration. Not your fault, we currently don't have\n> > anything better to offer.\n> >\n> > We currently have a way to report all the possible configurations for\n> > a stream using the StreamFormat class. 
StreamFormat is currently\n> > nothing but\n> >\n> > \tstd::map<PixelFormat, std::vector<SizeRange>> formats_;\n> >\n> > which gets populated by the pipeline handler while creating\n> > the StreamConfiguration items which will be part of the generated\n> > CameraConfiguration.\n> >\n> > That's for the \"Camera-provided information to applications\" side.\n> >\n> > On the \"application to configure the Camera\" side, StreamConfiguration\n> > is used to tell the library how a Stream should be configured.\n> > These are the fields a StreamConfiguration describes:\n> >\n> > \tPixelFormat pixelFormat;\n> > \tSize size;\n> > \tunsigned int stride;\n> > \tunsigned int frameSize;\n> >\n> > \tunsigned int bufferCount;\n> >\n> > \tstd::optional<ColorSpace> colorSpace;\n> >\n> > With pixelformat, size, bufferCount and (optionally) colorSpace which\n> > are expected to be set by the application.\n> >\n> > We have been discussing adding a\n> > frameDuration to StreamConfiguration for a long time. It is needed for some parts of the\n> > Android HAL as well, and in general, as your patch shows, it is a\n> > desirable feature. A similar reasoning can be made for the sensor's\n> > configuration, even if this has received several pushbacks in the\n> > past when RPi wanted something similar. Furthermore, I would be\n> > tempted to consider the possible FrameDuration a property of the\n> > sensor's configuration, but I presume that advanced use cases for\n> > StillCapture chain more stages onto the canonical frame processing, so\n> > I presume it's better to have the FrameDurations separate from the\n> > sensor configuration to allow describing additional delays there.\n>\n> Just a note here, what matters to applications is not so much the final frame\n> duration (framerate) as the maximum imposed by the selected camera mode. An\n> application will simply want to advertise, let's say, 60fps, if there is a mode\n> that can theoretically achieve it. 
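[Editorial note: the mapping between the frame durations libcamera reports and the framerate range advertised to applications is a simple reciprocal, as the patch's enumeration code does with the FrameDurationLimits control. A sketch of just that arithmetic, with an illustrative function name:]

```cpp
#include <cstdint>
#include <utility>

/* Convert frame-duration limits (microseconds per frame, as reported by
 * libcamera's FrameDurationLimits control) into a { min, max } framerate
 * pair in frames per second. Note that min and max swap roles: the
 * longest duration yields the lowest rate. */
std::pair<double, double> durationLimitsToFramerate(int64_t minDurationUs,
						    int64_t maxDurationUs)
{
	double minFramerate = 1.0e6 / static_cast<double>(maxDurationUs);
	double maxFramerate = 1.0e6 / static_cast<double>(minDurationUs);
	return { minFramerate, maxFramerate };
}
```

A mode with limits of roughly [16667, 66667] µs would thus be advertised as framerate=[15/1,60/1], which a caps filter such as video/x-raw,framerate=[40/1,70/1] can then intersect.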
At the moment, it falls short, as even if\n> there is a mode, if you aren't lucky enough, it may not work.\n>\n\nCorrect. My point was about StillCapture potentially imposing additional\ndelays on top of the sensor's frame duration.\n\nAs far as I understand, advanced platforms might pass the frames\ndirected for still picture capture through other processing blocks.\nThis requires time and this latency should be reported somewhere.\n\n> >\n> > Summing it all up, we have StreamFormats to convey back to applications\n> > what a Stream can do, and StreamConfiguration to set the Stream\n> > characteristics.\n> >\n> > Addressing the needs you have here would require:\n> >\n> > - Pipeline handlers to populate StreamFormats not only with\n> >   <PixelFormat, vector<SizeRange>> but also with the possible sensor\n> >   configurations that can generate that mode and the possible frame\n> >   durations\n> >\n> > - Applications to populate a StreamConfiguration with the desired\n> >   FrameDuration and sensor mode\n> >\n> > - Pipeline handlers to validate that the desired sensor mode and\n> >   durations are compatible with the stream characteristics.\n> >   (as an example, if your pipeline can't upscale, you can't have a\n> >   sensor mode with a size smaller than the desired output).\n> >\n> > This puts more complexity on the pipeline side, in both the generation\n> > of configurations and in their validation, but would such an API\n> > address your needs in your opinion ? Have you had any thoughts on how\n> > you would like to have this handled ?\n>\n> Currently, stepping back from the implementation details, I identify two\n> issues. \n>\n> 1. Is the inability for the pipeline to do the right thing.\n>\n> As the pipeline is not aware of the desired frameDuration(s) at the time of\n> validation, it is unable to do the right thing for an application that wants,\n> let's say, 60fps but the \"best\" mode does not cover it.\n>\n> 2. 
Is the lack of direct sensor mode enumeration\n>\n> This is what you describe, but my learning is that sensor mode description goes\n> beyond format/width/height/frameDuration. For an application, it is imperative\n> to know what portion will remain from the original field of view, and what is\n> the initial field of view. I'm not sure I have collected all the details, if any\n> of this actually exists.\n\nWe have this information. The full pixel array size is a static\nproperty of the Camera, and we also know the per-mode analogue crop\nrectangle (the portion of the pixel array that gets processed). So we\ncan expose it alongside the sensor's output sizes, format and\nduration.\n\nI don't quite see how applications might be interested in the sensor's\noutput format, even if I know very specialized applications (like\nsome of the RPi users have) want to pick the mode with, for\nexample, the highest bit depth.\n\n>\n> So, if I read your proposal well, you suggest adding all supported camera modes\n> to each of the enumerated format/size. As the camera configuration becomes a\n> thing, we then have the ability to introduce more optional metadata there to\n> solve 2. I think that could work, as there is certainly a strong correlation\n> between the format/size and the available modes. Making a list of modes like\n> libcamera-apps do would be more difficult though (apps have to deal with\n> duplicates). Is that a concern ? 
(it's not from my point of view, and we can use\n> immutable shared pointers in C++ to do this at no cost, we could even define that\n> pointer comparison is enough to match duplicates).\n>\n\nDuplications (when it comes to efficiency) are not my primary concern\nas generateConfiguration() is not really in any hot path.\n\nMy struggle is with dependencies between Streams.\n\nLet's make a practical (simplified) example that only considers the\nsensor output size and the min duration a mode can achieve\n\nYour sensor can do the following modes\n\n        full res:       { 4144x3136, 66ms }\n        2x2 binned:     { 2072x1568, 16ms }\n\n        - generateConfiguration({ Viewfinder });\n\n          Easy, you can get 1080p from both modes, and applications\n          select the one they prefer looking at durations/FOV\n\n        - generateConfiguration({ Viewfinder, Still });\n\n          Still wants full resolution, so you can't use the fastest\n          { 2072x1568 } mode. As a result we can only associate\n          { 4144x3136 } to both Roles. Your camera now runs at ~15FPS\n          (one frame every 66ms; plus Still can have additional delays,\n          as explained above, but they only apply when the Still stream\n          is part of the Request).\n\nMy point here is that information like the duration depends on the overall\nCameraConfiguration and not on a single stream's capabilities. In\nother words, it cannot be estimated in advance, only after a\nconfiguration has been applied (or validated) by the Camera.\n\nAs a comparison, Android lists the per-stream information (format and\nduration) as if they were considered in isolation. 
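[Editorial note: the dependency this example describes can be made concrete with a small selection helper. The types and names below are hypothetical, and it assumes the ISP can downscale but never upscale:]

```cpp
#include <vector>

/* Toy model of the two-mode example above: a sensor mode with an output
 * size and the minimum frame duration (in ms) it can achieve. */
struct SensorMode {
	int width;
	int height;
	int minDurationMs;
};

/* Pick the fastest mode that can still cover the largest size any
 * requested stream needs (downscale-only ISP assumption). Returns
 * nullptr when no mode is large enough. */
const SensorMode *pickMode(const std::vector<SensorMode> &modes,
			   int neededWidth, int neededHeight)
{
	const SensorMode *best = nullptr;
	for (const SensorMode &m : modes) {
		if (m.width < neededWidth || m.height < neededHeight)
			continue;
		if (!best || m.minDurationMs < best->minDurationMs)
			best = &m;
	}
	return best;
}
```

With the two modes above, a 1080p viewfinder alone selects the 16ms binned mode, while adding a full-resolution still stream forces the 66ms mode, i.e. a ~15FPS camera.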
It's up to applications\nto understand that if they configure { Viewfinder, Still } the\nresulting duration will be max(Viewfinder.duration, Still.duration).\n\nThis directly connects to the below question of what StreamFormats\nshould report: all the modes a role can do, as if it were considered in\nisolation, or should they be tied to the combination of the requested\nstreams ?\n\nHonestly, generateConfiguration() is handy to make applications' lives\neasier when they want some pre-cooked configuration. But for anything\nmore than that I would be tempted to propose\n\n        StreamFormats Camera::streamsFormats()\n        StreamFormats Camera::stallingStreamsFormats()\n        StreamFormats Camera::rawStreamsFormats()\n\n        (Yes, that's the same model Android has)\n\nThe argument against this was that applications would then need to know how\nto combine the Streams in order to form a valid CameraConfiguration, and\nthis requires knowing the platform's capabilities (can the Camera do 2\nYUV streams ? Can it do 2 YUV + RAW ? etc).\n\nAgain for a comparison, Android has \"hw levels\" that describe what a\nplatform can do, so if your platform is \"FULL\" you are guaranteed that\nyou can combine up to a certain number of YUV + RAW, if it's \"LIMITED\"\nyou can't have more than 1 YUV + RAW etc etc so that apps know what\nthey can expect.\n\n> >\n> > That's really an open question for everyone, as I guess this part is\n> > becoming critical as this patch shows and the workaround we pushed RPi\n> > to is really not generic enough.\n>\n> Agreed, and looking for feedback. I'd be happy to help with the subject. 
I know\n> this patch was only making an issue more visible, but I'm also sharing it as it\n> may temporally be useful to some users (as a downstream patch).\n>\n> >\n> > One more point here below\n> >\n> > Thanks\n> >    j\n> >\n> > -------------------------------------------------------------------------------\n> > Slightly unrelated point, but while you're here I would like to pick\n> > your brain on the issue: the way StreamFormat are currently created is\n> > slightly under-specified. The documentation goes in great length in\n> > describing how the sizes associated with PixelFormat can be inspected\n> > [1] but it only superficially describe how a pipeline handler should\n> > populate the formats associated with a stream configuration [2]. I'll\n> > make two examples:\n> > - if the StreamConfiguration is for a 1080p Viewfinder stream should\n> >   the associated StreamFormats list resolutions > 1080p ?\n>\n> If you have scaling in your ISP, I think it should. I notice that feeding the\n> ISP with higher resolution often lead to a cleaner image. An application can\n> then decide to focus on reducing the bandwidth (and possibly the battery life as\n> a side effect) by matching the resolution.\n\nI was not talking about the sensor's mode, but about the Stream output size.\n\nIOW, if you ask for { Viewfinder } you (generally) get back a stream\nconfig with size (1920x1080). Should the StreamFormats associated with\nsuch config list output formats larger than 1080p ?\n\n>\n> The other reason is that your viewFinder will likely match the display\n> resolution. If you do viewFinder + still pictures, and you don't want the\n> viewFinder going blank while taking a picture, you will have no choice but to\n> set the sensor at the desired still picture resolution.\n>\n\nCan't disagree\n\n> Finally, if you don't expose the higher resolution, a generic caps mechanism\n> like GstCaps can't be used. 
When you have multiple streams, you will want to\n> intersect the sensor configurations for every stream in order to find the\n> remaining available sensor modes you can pick from. GstCaps will not allow you\n> to select a higher sensor resolution if you don't include them, as there will be\n> no intersection. This is a mechanism you'll find in GStreamer and PipeWire,\n> which depends on having capabilities that are close to the mathematical\n> definition of sets (with its own grain of salt).\n>\n\nI guess the main issue is that gst (like many other potential\nusers) wants to get a general view of what the Camera can do, and\nthen decide how to combine them. We have an API that tells you how the\ncamera will behave by considering the overall configuration of streams.\n\nBack to the above example, if you ask for { Viewfinder, Still } you get a ~15FPS camera.\n\n> > - if the StreamConfiguration is for a !Raw role, should RAW\n> >   pixelformats be listed in the StreamFormats ?\n>\n> I've been trying to answer this one, but I realize I don't know if I understand\n> the full extent of the question. Is the Raw role going away ? If yes, then we'll\n\nNo no, it's not :)\n\n> have to offer the raw formats in all the other roles. But we then lose a bit,\n> as the Raw role has a special meaning, and can greatly help the pipeline when\n> configuring (a raw role configuration has precedence over all the others, making\n> things a little simpler when fixating the configuration). I think we have to\n> decide, but as of today's thinking, the raw role could remain; it's just that\n> its configuration would be 1:1 with the sensor configuration.\n>\n\nSorry this wasn't clear. To me this is tightly related to how\nStreamFormats should be interpreted. Should they report what the\ncamera can do in the current configuration, or expose the general\ncapabilities of the Camera ? Today we do a bit of both. 
The example\nwith RAW was that if you ask\n\n        { Viewfinder, Still }\n\nCurrently on RkISP1 you also get back a few StreamFormat for the RAW\nconfiguration:\n\n        { SBGGR10, 4144x3136 }\n        { SBGGR10, 2072x1568 }\n\neven if you haven't asked for them.\n\nThis seems to highlight that StreamFormats should report all the\ncapabilities of the Camera, including formats you can't (or shouldn't)\nuse for a role, which seems just confusing and defeats the purpose of\nRoles in first place.\n\n> >\n> > There are arguments for both cases, but I guess the big picture\n> > question is about how an application should expect to retrieve the\n> > full camera capabilities:\n> > - Does an application expect to receive only the available formats for\n> >   the role it has asked a configuration for ?\n> > - Does an application expect to receive all the possible formats a\n> >   camera can produce regardless of the size/pixelformat of the\n> >   StreamConfiguration that has been generated ?\n> >\n> > As a concrete example, currently the RkISP1 pipeline handler list a\n> > RAW StreamFormat for all configurations, regardless of the role. I had\n> > a patch to \"fix\" this but I've been pointed to the fact the API was\n> > not very well specified.\n> >\n> > What would your expectations be in this case ?\n>\n> That is interesting, so today we have a bit of both. The most basic applications\n> needs to match two roles already (ViewFinder + X). So in term of sensor\n> configuration, without going into slow validation loops, have to cross over the\n> information across the roles.\n>\n> So overall, I think my preference would be something structured like this:\n>\n> Role1:\n>   + format/size\n> \t- Sensor Config1\n> \t- Sensor Config2\n> \t- Sensor Config3\n>   + format/size\n> \t- Sensor Config2\n> Role2:\n>   ...\n>\n> The stream configuration as defined today seems well suited. Keeping the fixated\n> frameDuration away from it seems unavoidable. 
There are just too many variables\n\nWhat do you mean by \"fixated frameDuration\" ?\n\n> toward the fixation of the frameDuration, on top of which, applications are much\n> more flexible than they pretend. As an example, when I see:\n>\n>    libcamerasrc ! video/x-raw,framerate=60/1 ! ...\n>\n> I see a very naive application. I'm very doubtful that 121/2 or 119/2 would not\n> have done the job equally well. And even 50/1 would likely be fine for most use\n> cases. In GStreamer, you can (and should) use ranges when setting filters.\n>\n>   libcamerasrc ! video/x-raw,framerate=[40/1,70/1] ! ...\n>\n> If you have a strong preference for 60, you should hint that this way (note, as\n> I said in the review, the newly added caps negotiation is not working yet for\n> that):\n>\n>   libcamerasrc ! video/x-raw,framerate=60/1;video/x-raw,framerate=[40/1,70/1] ! ...\n>\n> And libcamerasrc is free to read the first structure, save all the fixed\n> values as the \"default\", and later use a targeted fixation function\n> (gst_structure_fixate_field_nearest_fraction()). This gets complex quickly, so no one\n> ever implements this ahead of time, but it's all there and well defined. And quite\n> representative of how far an application can go to do the right thing.\n>\n> How the sensor configuration would be done, whether by the application or the\n> pipeline, would be to collect all sensorConfigurations for each stream's fixated\n> streamConfiguration. 
Intersect, and finally score the remaining to pick the\n> best.\n\nDo I read it right that you then expect to receive all information\nabout what streams can be produced by a Camera as if they were considered\nin isolation, and combine them appropriately yourself at the time of\nconstructing a CameraConfiguration before starting capture ?\n\nThanks for sticking with this long conversation\n\n>\n> Hope this quick GstCaps intro can help,\n> Nicolas\n>\n> >\n> > [1] https://libcamera.org/api-html/classlibcamera_1_1StreamFormats.html\n> > [2] https://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html#a37725e6b9e179ca80be0e57ed97857d3\n> > -------------------------------------------------------------------------------\n> >\n> >\n> > > +\tif (config == nullptr) {\n> > > +\t\tGST_DEBUG_OBJECT(self, \"Sensor mode selection is not supported, skipping enumeration.\");\n> > > +\t\treturn modes;\n> > > +\t}\n> > > +\n> > > +\tconst libcamera::StreamFormats &formats = config->at(0).formats();\n> > > +\n> > > +\tfor (const auto &pixfmt : formats.pixelformats()) {\n> > > +\t\tfor (const auto &size : formats.sizes(pixfmt)) {\n> > > +\t\t\tconfig->at(0).size = size;\n> > > +\t\t\tconfig->at(0).pixelFormat = pixfmt;\n> > > +\n> > > +\t\t\tif (config->validate() == CameraConfiguration::Invalid)\n> > > +\t\t\t\tcontinue;\n> > > +\n> > > +\t\t\tif (state->cam_->configure(config.get()))\n> > > +\t\t\t\tcontinue;\n> > > +\n> > > +\t\t\tauto fd_ctrl = state->cam_->controls().find(&controls::FrameDurationLimits);\n> > > +\t\t\tif (fd_ctrl == state->cam_->controls().end())\n> > > +\t\t\t\tcontinue;\n> > > +\n> > > +\t\t\tint minrate_num, minrate_denom;\n> > > +\t\t\tint maxrate_num, maxrate_denom;\n> > > +\t\t\tdouble min_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> > > +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.max().get<int64_t>());\n> > > +\t\t\tdouble max_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> > > +\t\t\t\t\t       
gst_util_guint64_to_gdouble(fd_ctrl->second.min().get<int64_t>());\n> > > +\t\t\tgst_util_double_to_fraction(min_framerate, &minrate_num, &minrate_denom);\n> > > +\t\t\tgst_util_double_to_fraction(max_framerate, &maxrate_num, &maxrate_denom);\n> > > +\n> > > +\t\t\tGstStructure *s = gst_structure_new(\"sensor/mode\",\n> > > +\t\t\t\t\t\t\t    \"format\", G_TYPE_STRING, pixfmt.toString().c_str(),\n> > > +\t\t\t\t\t\t\t    \"width\", G_TYPE_INT, size.width,\n> > > +\t\t\t\t\t\t\t    \"height\", G_TYPE_INT, size.height,\n> > > +\t\t\t\t\t\t\t    \"framerate\", GST_TYPE_FRACTION_RANGE,\n> > > +\t\t\t\t\t\t\t    minrate_num, minrate_denom,\n> > > +\t\t\t\t\t\t\t    maxrate_num, maxrate_denom,\n> > > +\t\t\t\t\t\t\t    nullptr);\n> > > +\t\t\tgst_caps_append_structure(modes, s);\n> > > +\t\t}\n> > > +\t}\n> > > +\n> > > +\tGST_DEBUG_OBJECT(self, \"Camera sensor modes: %\" GST_PTR_FORMAT, modes);\n> > > +\n> > > +\treturn modes;\n> > > +}\n> > > +\n> > >  static bool\n> > >  gst_libcamera_src_open(GstLibcameraSrc *self)\n> > >  {\n> > > @@ -375,6 +436,7 @@ gst_libcamera_src_open(GstLibcameraSrc *self)\n> > >  \t/* No need to lock here, we didn't start our threads yet. 
*/\n> > >  \tself->state->cm_ = cm;\n> > >  \tself->state->cam_ = cam;\n> > > +\tself->sensor_modes = gst_libcamera_src_enumerate_sensor_modes(self);\n> > >\n> > >  \treturn true;\n> > >  }\n> > > @@ -462,6 +524,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> > >  \tGstLibcameraSrcState *state = self->state;\n> > >  \tGstFlowReturn flow_ret = GST_FLOW_OK;\n> > >  \tgint ret;\n> > > +\tg_autoptr(GstCaps) sensor_mode = nullptr;\n> > >\n> > >  \tg_autoptr(GstStructure) element_caps = gst_structure_new_empty(\"caps\");\n> > >\n> > > @@ -481,6 +544,16 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> > >  \t\troles.push_back(gst_libcamera_pad_get_role(srcpad));\n> > >  \t}\n> > >\n> > > +\tif (!gst_caps_is_any(self->sensor_mode)) {\n> > > +\t\tsensor_mode = gst_caps_intersect(self->sensor_mode, self->sensor_modes);\n> > > +\t\tif (!gst_caps_is_empty(sensor_mode)) {\n> > > +\t\t\troles.push_back(libcamera::StreamRole::Raw);\n> > > +\t\t} else {\n> > > +\t\t\tGST_WARNING_OBJECT(self, \"No sensor mode matching the selection, ignoring.\");\n> > > +\t\t\tgst_clear_caps(&sensor_mode);\n> > > +\t\t}\n> > > +\t}\n> > > +\n> > >  \t/* Generate the stream configurations, there should be one per pad. 
*/\n> > >  \tstate->config_ = state->cam_->generateConfiguration(roles);\n> > >  \tif (state->config_ == nullptr) {\n> > > @@ -490,7 +563,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> > >  \t\tgst_task_stop(task);\n> > >  \t\treturn;\n> > >  \t}\n> > > -\tg_assert(state->config_->size() == state->srcpads_.size());\n> > > +\tg_assert(state->config_->size() == state->srcpads_.size() + (!!sensor_mode));\n> > >\n> > >  \tfor (gsize i = 0; i < state->srcpads_.size(); i++) {\n> > >  \t\tGstPad *srcpad = state->srcpads_[i];\n> > > @@ -510,6 +583,12 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> > >  \t\tgst_libcamera_get_framerate_from_caps(caps, element_caps);\n> > >  \t}\n> > >\n> > > +\tif (sensor_mode) {\n> > > +\t\tStreamConfiguration &stream_cfg = state->config_->at(state->srcpads_.size());\n> > > +\t\tg_assert(gst_caps_is_writable(sensor_mode));\n> > > +\t\tgst_libcamera_configure_stream_from_caps(stream_cfg, sensor_mode);\n> > > +\t}\n> > > +\n> > >  \tif (flow_ret != GST_FLOW_OK)\n> > >  \t\tgoto done;\n> > >\n> > > @@ -624,6 +703,7 @@ gst_libcamera_src_task_leave([[maybe_unused]] GstTask *task,\n> > >  \tg_clear_object(&self->allocator);\n> > >  \tg_clear_pointer(&self->flow_combiner,\n> > >  \t\t\t(GDestroyNotify)gst_flow_combiner_free);\n> > > +\tgst_clear_caps(&self->sensor_modes);\n> > >  }\n> > >\n> > >  static void\n> > > @@ -659,6 +739,10 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id,\n> > >  \t\tg_free(self->camera_name);\n> > >  \t\tself->camera_name = g_value_dup_string(value);\n> > >  \t\tbreak;\n> > > +\tcase PROP_SENSOR_MODE:\n> > > +\t\tgst_clear_caps(&self->sensor_mode);\n> > > +\t\tself->sensor_mode = GST_CAPS(g_value_dup_boxed(value));\n> > > +\t\tbreak;\n> > >  \tdefault:\n> > >  \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n> > >  \t\tbreak;\n> > > @@ -676,6 +760,12 @@ gst_libcamera_src_get_property(GObject *object, guint prop_id, 
GValue *value,\n> > >  \tcase PROP_CAMERA_NAME:\n> > >  \t\tg_value_set_string(value, self->camera_name);\n> > >  \t\tbreak;\n> > > +\tcase PROP_SENSOR_MODES:\n> > > +\t\tg_value_set_boxed(value, self->sensor_modes);\n> > > +\t\tbreak;\n> > > +\tcase PROP_SENSOR_MODE:\n> > > +\t\tg_value_set_boxed(value, self->sensor_mode);\n> > > +\t\tbreak;\n> > >  \tdefault:\n> > >  \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n> > >  \t\tbreak;\n> > > @@ -763,6 +853,7 @@ gst_libcamera_src_init(GstLibcameraSrc *self)\n> > >  \t/* C-style friend. */\n> > >  \tstate->src_ = self;\n> > >  \tself->state = state;\n> > > +\tself->sensor_mode = gst_caps_new_any();\n> > >  }\n> > >\n> > >  static GstPad *\n> > > @@ -844,4 +935,19 @@ gst_libcamera_src_class_init(GstLibcameraSrcClass *klass)\n> > >  \t\t\t\t\t\t\t     | G_PARAM_READWRITE\n> > >  \t\t\t\t\t\t\t     | G_PARAM_STATIC_STRINGS));\n> > >  \tg_object_class_install_property(object_class, PROP_CAMERA_NAME, spec);\n> > > +\n> > > +\tspec = g_param_spec_boxed(\"sensor-modes\", \"Sensor Modes\",\n> > > +\t\t\t\t  \"GstCaps representing available sensor modes.\",\n> > > +\t\t\t\t  GST_TYPE_CAPS,\n> > > +\t\t\t\t  (GParamFlags)(G_PARAM_READABLE\n> > > +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> > > +\tg_object_class_install_property(object_class, PROP_SENSOR_MODES, spec);\n> > > +\n> > > +\tspec = g_param_spec_boxed(\"sensor-mode\", \"Sensor Mode\",\n> > > +\t\t\t\t  \"GstCaps representing selected sensor mode.\",\n> > > +\t\t\t\t  GST_TYPE_CAPS,\n> > > +\t\t\t\t  (GParamFlags)(GST_PARAM_MUTABLE_READY\n> > > +\t\t\t\t\t        | G_PARAM_READWRITE\n> > > +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> > > +\tg_object_class_install_property(object_class, PROP_SENSOR_MODE, spec);\n> > >  }\n> > > --\n> > > 2.39.2\n> > 
>\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 1768EBD160\n\tfor <parsemail@patchwork.libcamera.org>;\n\tMon, 27 Mar 2023 15:19:23 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 5CDB96271C;\n\tMon, 27 Mar 2023 17:19:22 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 2BA41626D7\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tMon, 27 Mar 2023 17:19:20 +0200 (CEST)","from ideasonboard.com (93-61-96-190.ip145.fastwebnet.it\n\t[93.61.96.190])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 91C62AEC;\n\tMon, 27 Mar 2023 17:19:19 +0200 (CEST)"],"DKIM-Signature":["v=1; a=rsa-sha256; c=relaxed/simple; d=libcamera.org;\n\ts=mail; t=1679930362;\n\tbh=dEbrKDuvOdOTw0lIHetYQI9S6hwajbPNcwT0hEfRhyE=;\n\th=Date:To:References:In-Reply-To:Subject:List-Id:List-Unsubscribe:\n\tList-Archive:List-Post:List-Help:List-Subscribe:From:Reply-To:Cc:\n\tFrom;\n\tb=DDDw4Ig8bJvj4owJ/OeFuIF8HGYi6Dbi13J86zCW9PMIJjbam07GrW3GPU1sn20Bi\n\tTuwxX6ThKaKnF8cqFxuAMj4HmdQFMNhuyN24l79q50Ccx+WuCgA396Sa/PZSC/Rw7y\n\tl09xHorLJ/1fDP3GTBawukRFtQYBb41KGi+gDspbGpMJbWQrYSaFofwJimfp81qhl7\n\tSY7x1r+Mf4Yx8JiIXk3KkSHeXIeZwf7vKc8wpIVFG5JMp+eXZzyoltu0DDS5Qd0iRv\n\taN3NRM20i/ZOFlqfCLzuU6J+XN1pWW2Z0tNx16syRiCXbQRyafzJLFLrJkqJrcCtDR\n\tiNHEgG7hICAPw==","v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; 
t=1679930359;\n\tbh=dEbrKDuvOdOTw0lIHetYQI9S6hwajbPNcwT0hEfRhyE=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=H0VyyPmitnfgVYvGVPhKyR9VrvCV1yCwOXvZFT1Udi7GOvR/oVYKfXEwh5vVUQMWX\n\tfYMS2TtNLop4gjPUDpdVumt/SVYYwv/ITC5ABasEEQ3bONqWDVvHFecxT/EO3kCDH0\n\tbHCeTycJQLNLJfedXsY4mLD9A8kGB7Ej/WOMLz1w="],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key; \n\tunprotected) header.d=ideasonboard.com\n\theader.i=@ideasonboard.com\n\theader.b=\"H0VyyPmi\"; dkim-atps=neutral","Date":"Mon, 27 Mar 2023 17:19:16 +0200","To":"Nicolas Dufresne <nicolas.dufresne@collabora.com>","Message-ID":"<20230327151916.bxhuho3gez7kkvz7@uno.localdomain>","References":"<20230324181247.302586-1-nicolas@ndufresne.ca>\n\t<20230324181247.302586-2-nicolas@ndufresne.ca>\n\t<20230327092155.kzzwjsr5ffnh4y7k@uno.localdomain>\n\t<db46d6b2a3dc2f48b8a8323f32bebd522235c9bf.camel@collabora.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","Content-Transfer-Encoding":"8bit","In-Reply-To":"<db46d6b2a3dc2f48b8a8323f32bebd522235c9bf.camel@collabora.com>","Subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","From":"Jacopo Mondi via libcamera-devel <libcamera-devel@lists.libcamera.org>","Reply-To":"Jacopo Mondi 
<jacopo.mondi@ideasonboard.com>","Cc":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>,\n\tlibcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":26790,"web_url":"https://patchwork.libcamera.org/comment/26790/","msgid":"<fd3367c99599febbed2a99dbb01dd89a66c38a50.camel@collabora.com>","date":"2023-03-28T19:23:47","subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","submitter":{"id":31,"url":"https://patchwork.libcamera.org/api/people/31/","name":"Nicolas Dufresne","email":"nicolas.dufresne@collabora.com"},"content":"Le lundi 27 mars 2023 à 17:19 +0200, Jacopo Mondi via libcamera-devel a écrit :\n> Hi Nicolas,\n> \n> On Mon, Mar 27, 2023 at 09:40:57AM -0400, Nicolas Dufresne wrote:\n> > Le lundi 27 mars 2023 à 11:21 +0200, Jacopo Mondi a écrit :\n> > > Hi Nicolas,\n> > > \n> > > On Fri, Mar 24, 2023 at 02:12:45PM -0400, Nicolas Dufresne via libcamera-devel wrote:\n> > > > From: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> > > > \n> > > > This add support for selecting the sensor mode. A new read-only\n> > > > property called sensor-modes is filled when the element reaches\n> > > > READY state. It contains the list of all available sensor modes,\n> > > > including the supported framerate range. This is exposed as GstCaps\n> > > > in the form of sensor/mode,width=X,height=Y,format=Y,framerate=[...].\n> > > > The format string matches the libcamera format string representation.\n> > > > \n> > > > The application can then select a mode using the read/write sensor-mode\n> > > > control. The selected mode is also a caps, it will be intersected with\n> > > > the supported mode and the \"best\" match will be picked. This allows\n> > > > application to use simple filter when they want to pick a mode for lets\n> > > > say a specific framerate (e.g. 
sensor/mode,framerate=60/1).\n> > > > \n> > > > Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>\n> > > > ---\n> > > >  src/gstreamer/gstlibcamera-utils.cpp |   4 +\n> > > >  src/gstreamer/gstlibcamerasrc.cpp    | 110 ++++++++++++++++++++++++++-\n> > > >  2 files changed, 112 insertions(+), 2 deletions(-)\n> > > > \n> > > > diff --git a/src/gstreamer/gstlibcamera-utils.cpp b/src/gstreamer/gstlibcamera-utils.cpp\n> > > > index 750ec351..c8a8df09 100644\n> > > > --- a/src/gstreamer/gstlibcamera-utils.cpp\n> > > > +++ b/src/gstreamer/gstlibcamera-utils.cpp\n> > > > @@ -416,6 +416,10 @@ gst_libcamera_configure_stream_from_caps(StreamConfiguration &stream_cfg,\n> > > >  \t\tstream_cfg.pixelFormat = gst_format_to_pixel_format(gst_format);\n> > > >  \t} else if (gst_structure_has_name(s, \"image/jpeg\")) {\n> > > >  \t\tstream_cfg.pixelFormat = formats::MJPEG;\n> > > > +\t} else if (gst_structure_has_name(s, \"sensor/mode\")) {\n> > > > +\t\tgst_structure_fixate_field(s, \"format\");\n> > > > +\t\tconst gchar *format = gst_structure_get_string(s, \"format\");\n> > > > +\t\tstream_cfg.pixelFormat = PixelFormat::fromString(format);\n> > > >  \t} else {\n> > > >  \t\tg_critical(\"Unsupported media type: %s\", gst_structure_get_name(s));\n> > > >  \t}\n> > > > diff --git a/src/gstreamer/gstlibcamerasrc.cpp b/src/gstreamer/gstlibcamerasrc.cpp\n> > > > index a10cbd4f..2f05a03f 100644\n> > > > --- a/src/gstreamer/gstlibcamerasrc.cpp\n> > > > +++ b/src/gstreamer/gstlibcamerasrc.cpp\n> > > > @@ -147,6 +147,9 @@ struct _GstLibcameraSrc {\n> > > > \n> > > >  \tgchar *camera_name;\n> > > > \n> > > > +\tGstCaps *sensor_modes;\n> > > > +\tGstCaps *sensor_mode;\n> > > > +\n> > > >  \tGstLibcameraSrcState *state;\n> > > >  \tGstLibcameraAllocator *allocator;\n> > > >  \tGstFlowCombiner *flow_combiner;\n> > > > @@ -154,7 +157,10 @@ struct _GstLibcameraSrc {\n> > > > \n> > > >  enum {\n> > > >  \tPROP_0,\n> > > > -\tPROP_CAMERA_NAME\n> > > > +\tPROP_CAMERA_NAME,\n> > > > 
+\tPROP_SENSOR_MODES,\n> > > > +\tPROP_SENSOR_MODE,\n> > > > +\tPROP_LAST\n> > > >  };\n> > > > \n> > > >  G_DEFINE_TYPE_WITH_CODE(GstLibcameraSrc, gst_libcamera_src, GST_TYPE_ELEMENT,\n> > > > @@ -318,6 +324,61 @@ int GstLibcameraSrcState::processRequest()\n> > > >  \treturn err;\n> > > >  }\n> > > > \n> > > > +static GstCaps *\n> > > > +gst_libcamera_src_enumerate_sensor_modes(GstLibcameraSrc *self)\n> > > > +{\n> > > > +\tGstCaps *modes = gst_caps_new_empty();\n> > > > +\tGstLibcameraSrcState *state = self->state;\n> > > > +\tauto config = state->cam_->generateConfiguration({ libcamera::StreamRole::Raw,\n> > > > +\t\t\t\t\t\t\t   libcamera::StreamRole::VideoRecording });\n> > > \n> > > This clearly shows our API falls short, as the presence of the RAW\n> > > stream, and the ability to capture it alongside other streams, is\n> > > conditional on the platform's capabilities. So this can't be\n> > > considered a sufficiently stable way of retrieving information about\n> > > the sensor's configuration. Not your fault, we currently don't have\n> > > anything better to offer.\n> > > \n> > > We currently have a way to report all the possible configurations for\n> > > a stream using the StreamFormat class. 
StreamFormat is currently\n> > > nothing but\n> > > \n> > > \tstd::map<PixelFormat, std::vector<SizeRange>> formats_;\n> > > \n> > > which gets populated by the pipeline handler while creating\n> > > the StreamConfiguration items which will be part of the generated\n> > > CameraConfiguration.\n> > > \n> > > That's for the \"Camera-provided information to applications\" side.\n> > > \n> > > On the \"application to configure the Camera\" side, StreamConfiguration\n> > > is used to tell the library how a Stream should be configured.\n> > > These are the fields a StreamConfiguration describes:\n> > > \n> > > \tPixelFormat pixelFormat;\n> > > \tSize size;\n> > > \tunsigned int stride;\n> > > \tunsigned int frameSize;\n> > > \n> > > \tunsigned int bufferCount;\n> > > \n> > > \tstd::optional<ColorSpace> colorSpace;\n> > > \n> > > With pixelformat, size, bufferCount and (optionally) colorSpace which\n> > > are expected to be set by the application.\n> > > \n> > > We have been discussing adding a frameDuration to StreamConfiguration\n> > > for a long time. It is needed for some parts of the\n> > > Android HAL as well, and in general, as your patch shows, it is a\n> > > desirable feature. A similar reasoning can be made for the sensor's\n> > > configuration, even if this has received several pushbacks in the\n> > > past when RPi wanted something similar. Furthermore, I would be\n> > > tempted to consider the possible FrameDuration a property of the\n> > > sensor's configuration, but I presume that advanced use cases for\n> > > StillCapture chain more stages onto the canonical frame processing, so\n> > > I presume it's better to have the FrameDurations separate from the\n> > > sensor configuration to allow describing additional delays there.\n> > \n> > Just a note here, what matters to applications is not so much the final frame\n> > duration (framerate) but the maximum imposed by the selected camera mode. 
An\n> > application will simply want to advertise let's say 60fps, if there is a mode\n> > that can theoretically achieve it. At the moment, it falls short, as even if\n> > there is a mode, if you aren't lucky enough, it may not work.\n> > \n> \n> Correct. My point was about StillCapture potentially imposing additional\n> delays on top of the sensor's frame duration.\n> \n> As far as I understand, advanced platforms might pass the frames\n> directed for still picture capture through other processing blocks.\n> This requires time and this latency should be reported somewhere.\n> \n> > > \n> > > Summing it all up, we have StreamFormats to convey back to applications\n> > > what a Stream can do, and StreamConfiguration to set the Stream\n> > > characteristics.\n> > > \n> > > Addressing the needs you have here would require:\n> > > \n> > > - Pipeline handlers to populate StreamFormats not only with\n> > >   <PixelFormat, vector<SizeRange>> but also with the possible sensor\n> > >   configurations that can generate that mode and the possible frame\n> > >   durations\n> > > \n> > > - Applications to populate a StreamConfiguration with the desired\n> > >   FrameDuration and sensor mode\n> > > \n> > > - Pipeline handlers to validate that the desired sensor mode and\n> > >   durations are compatible with the stream characteristics.\n> > >   (as an example, if your pipeline can't upscale, you can't have a\n> > >   sensor mode with a size smaller than the desired output).\n> > > \n> > > This puts more complexity on the pipeline side, in both the generation\n> > > of configurations and in their validation, but would such an API\n> > > address your needs in your opinion ? Have you had any thoughts on how\n> > > you would like to have this handled ?\n> > \n> > Currently, stepping back from the implementation details, I identify two\n> > issues. \n> > \n> > 1. 
Is the inability for the pipeline to do the right thing.\n> > \n> > As the pipeline is not aware of the desired frameDuration(s) at the time of\n> > validation, it is unable to do the right thing for an application that wants\n> > let's say 60fps but the \"best\" mode does not cover it.\n> > \n> > 2. Is the lack of direct sensor mode enumeration\n> > \n> > This is what you describe, but my learning is that sensor mode description goes\n> > beyond format/width/height/frameDuration. For an application, it is imperative\n> > to know what portion will remain from the original field of view, and what is\n> > the initial field of view. I'm not sure I have collected all the details if any\n> > of this actually exists.\n> \n> We have this information. The full pixel array size is a static\n> property of the Camera, we also know the per-mode analogue crop\n> rectangle (the portion of the pixel array that gets processed). So we\n> can expose it alongside the sensor's output sizes, format and\n> duration.\n> \n> I'm a bit unclear on how applications might be interested in the sensor's\n> output format, even if I know very specialized applications (like\n> some of the RPi users have) want to pick the mode with, for\n> example, the highest bit resolution.\n> \n> > \n> > So, if I read your proposal well, you suggest adding all supported camera modes\n> > to each of the enumerated format/size. As the camera configuration becomes a\n> > thing, we then have the ability to introduce more optional metadata there to\n> > solve 2. I think that could work, as there is certainly a strong correlation\n> > between the format/size and the available modes. Making a list of modes like\n> > libcamera-apps does would be more difficult though (apps have to deal with\n> > duplicates). Is that a concern ? 
(it's not from my point of view, and we can use\nimmutable shared pointers in C++ to do this at no cost, we could even define that\npointer comparison is enough to match duplicates).\n> > \n> \n> Duplications (when it comes to efficiency) are not my primary concern\n> as generateConfiguration() is not really in any hot path.\n> \n> My struggle is with dependencies between Streams.\n> \n> Let's make a practical (simplified) example that only considers the\n> sensor output size and the min duration a mode can achieve\n> \n> Your sensor can do the following modes\n> \n>         full res:       { 4144x3136, 66ms }\n>         2x2 binned:     { 2072x1568, 16ms }\n> \n>         - generateConfiguration({ Viewfinder });\n> \n>           Easy, you can get 1080p from both modes and applications\n>           select the one they prefer looking at durations/FOV\n> \n>         - generateConfiguration({ Viewfinder, Still });\n> \n>           Still wants full resolution, so you can't use the fastest\n>           { 2072x1568 } mode. As a result we can only associate\n>           { 4144x3136 } to both Roles. Your camera now runs at 16FPS\n>           (plus Still can have additional delays, as explained above,\n>           but they only apply when the Still stream is part of the Request).\n> \n> My point here is that information like the duration depends on the overall\n> CameraConfiguration and not on the single stream's capabilities. In\n> other words, it cannot be estimated in advance, only after a\n> configuration has been applied (or validated) by the Camera.\n> \n> As a comparison, Android lists the per-stream information (format and\n> duration) as if they were considered in isolation. 
It's up to applications\n> to understand that if they configure { Viewfinder, Still } the\n> resulting duration will be max(Viewfinder.duration, Still.duration).\n> \n> \n> This directly connects to the below question of what StreamFormats\n> should report: all the modes a role can do as if it were considered in\n> isolation, or should they be tied to the combination of the requested\n> streams ?\n> \n\nMy original idea is that for each StreamConfig, in isolation, we'd have N\nSensorConfigs. The combination of a StreamConfig and SensorConfig is what could\npossibly offer a duration range. It makes no sense to me to give a range over\nall the possible SensorConfigs, since in many cases they do not produce the same\nstream (notably a different field of view). Or at least, these should be\nset up in a way that the duration range holds for a coherent group.\n\nAnd then the intersection of StreamConfig+SensorConfig for one role and another\nwill truncate the duration range accordingly, and we should have the closest\npossible approximation.\n\n\n> Honestly, generateConfiguration() is handy to make an application's life\n> easier when they want some pre-cooked configuration. But for anything\n> more than that I would be tempted to propose\n> \n>         StreamFormats Camera::streamsFormats()\n>         StreamFormats Camera::stallingStreamsFormats()\n>         StreamFormats Camera::rawStreamsFormats()\n> \n>         (Yes, that's the same model Android has)\n> \n> The argument against this was that applications would then need to know how to\n> combine the Streams in order to form a valid CameraConfiguration, and\n> this requires knowing the platform capabilities (can the Camera do 2\n> YUV streams ? Can it do 2 YUV + RAW ? etc).\n> \n\nI think camera apps will have use cases, and fallback use cases. 
They will try\nand fall back; they don't really care about enumerating the capabilities because there is\ntoo much complexity in generically adapting to various cameras' capabilities.\n\nApps are specialized pieces of software, and of course, the trial chain will be\nshort and fail if you use it with an inappropriate piece of equipment.\n\nSo in this regard, what Android exposes is right, the only improvement I see is\nto place the raw stream formats (I guess this is not just formats but\nconfigurations) into the list of other formats, because we know that the sensor will\nneed to share the same sensor/raw configuration (and this should be optional, so\nwe don't populate raw formats for UVC cameras, where it does not really make\nsense).\n\n> Again for a comparison, Android has \"hw levels\" that describe what a\n> platform can do, so if your platform is \"FULL\" you are guaranteed that\n> you can combine up to a certain number of YUV + RAW, if it's \"LIMITED\"\n> you can't have more than 1 YUV + RAW etc etc so that apps know what\n> they can expect.\n> \n\nSo a use case flag to allow avoiding trial and error, which is fine, but of\ncourse, the use cases are way larger in the context of libcamera, so it might be\na challenge to keep that. If we can express a little more, we can keep that list\ndown.\n\n> \n> > > \n> > > That's really an open question for everyone, as I guess this part is\n> > > becoming critical as this patch shows and the workaround we pushed RPi\n> > > to is really not generic enough.\n> > \n> > Agreed, and looking for feedback. I'd be happy to help with the subject. 
I know\n> > this patch was only making an issue more visible, but I'm also sharing it as it\n> > may temporally be useful to some users (as a downstream patch).\n> > \n> > > \n> > > One more point here below\n> > > \n> > > Thanks\n> > >    j\n> > > \n> > > -------------------------------------------------------------------------------\n> > > Slightly unrelated point, but while you're here I would like to pick\n> > > your brain on the issue: the way StreamFormat are currently created is\n> > > slightly under-specified. The documentation goes in great length in\n> > > describing how the sizes associated with PixelFormat can be inspected\n> > > [1] but it only superficially describe how a pipeline handler should\n> > > populate the formats associated with a stream configuration [2]. I'll\n> > > make two examples:\n> > > - if the StreamConfiguration is for a 1080p Viewfinder stream should\n> > >   the associated StreamFormats list resolutions > 1080p ?\n> > \n> > If you have scaling in your ISP, I think it should. I notice that feeding the\n> > ISP with higher resolution often lead to a cleaner image. An application can\n> > then decide to focus on reducing the bandwidth (and possibly the battery life as\n> > a side effect) by matching the resolution.\n> \n> I was not talking about the sensor's mode, but about the Stream output size.\n> \n> IOW, if you ask for { Viewfinder } you (generally) get back a stream\n> config with size (1920x1080). Should the StreamFormats associated with\n> such config list output formats larger than 1080p ?\n> \n\n\nNo, but it there is multiple SensorConfig (or Raw config in our current state)\nto achieve this 1920x1080. And all these sensor config will produce arguably\ndifferent streams. This is what I had in mind when special casing the sensor\nconfiguration. It is highly relevant to the use case.\n\n> \n> \n> > \n> > The other reason is that your viewFinder will likely match the display\n> > resolution. 
If you do viewFinder + still pictures, and you don't want the\n> > viewFinder going blank while taking a picture, you will have no choice but to\n> > set the sensor at the desired still picture resolution.\n> > \n> \n> Can't disagree\n> \n> > Finally, if you don't expose the higher resolution, a generic caps mechanism\n> > like GstCaps can't be used. When you have multiple streams, you will want to\n> > intersect the sensor configuration for every stream in order to find the\n> > remaining available sensor modes you can pick from. GstCaps will not allow you\n> > to select a higher sensor resolution if you don't include them, as there will be\n> > no intersection. This is a mechanism you'll find in GStreamer and Pipewire,\n> > which depends on having capabilities that are close to the mathematical\n> > definitions of sets (with its own grain of salt).\n> > \n> \n> I guess the main issue is that gst (like many other potential\n> users) wants to get a general view of what the Camera can do, and\n> then decide how to combine them. We have an API that tells you how the\n> camera will behave by considering the overall configuration of streams.\n> \n> Back to the above example, if you ask for { Viewfinder, Still } you get a 16FPS camera.\n> \n> > > - if the StreamConfiguration is for a !Raw role, should RAW\n> > >   pixelformats be listed in the StreamFormats ?\n> > \n> > I've been trying to answer this one, but I realize I don't know if I understand\n> > the full extent of the question. Is the raw role going away ? If yes, then we'll\n> \n> No no, it's not :)\n> \n> > have to offer the raw formats into all the other roles. But we then lose a bit,\n> > as the Raw role has a special meaning, and can greatly help the pipeline when\n> > configuring (a raw role configuration has precedence over all the others, making\n> > things a little simpler when fixating the configuration). 
I think we have to\n> > decide, but as of my today's thinking, the raw roles could remain, its just that\n> > the configuration will just be 1:1 with the sensor configuration.\n> > \n> \n> Sorry this wasn't clear. To me this is tightly related to how\n> StreamFormats should be interpreted. Should they report what the\n> camera can do in the current configuration, or expose the general\n> capabilities of the Camera ? Today we do a bit of both. The example\n> with RAW was that if you ask\n> \n>         { Viewfinder, Still }\n> \n> Currently on RkISP1 you also get back a few StreamFormat for the RAW\n> configuration:\n> \n>         { SBGGR10, 4144x3136 }\n>         { SBGGR10, 2072x1568 }\n> \n> even if you haven't asked for them.\n> \n> This seems to highlight that StreamFormats should report all the\n> capabilities of the Camera, including formats you can't (or shouldn't)\n> use for a role, which seems just confusing and defeats the purpose of\n> Roles in first place.\n\nYeah, I think the rkisp pipeline is wrong here, as you won't have any of the\nfeatures you'd have with the other formats if you pick them. It means RkISP\nproposes random choices that have random capabilities, in disregard of the\nroles. At the same time, can we say that? It depends on how the roles are truly\ndefined. 
But I think I can see how the RkISP interpretation of the roles is\nunusable by applications.\n\nIt basically assumes that the application will know, since bayer is associated with\nno processing, but what if you have a new shiny type of sensor that uses a\nformat commonly known as "normal"?\n\n> \n> > > \n> > > There are arguments for both cases, but I guess the big picture\n> > > question is about how an application should expect to retrieve the\n> > > full camera capabilities:\n> > > - Does an application expect to receive only the available formats for\n> > >   the role it has asked a configuration for ?\n> > > - Does an application expect to receive all the possible formats a\n> > >   camera can produce regardless of the size/pixelformat of the\n> > >   StreamConfiguration that has been generated ?\n> > > \n> > > As a concrete example, currently the RkISP1 pipeline handler list a\n> > > RAW StreamFormat for all configurations, regardless of the role. I had\n> > > a patch to "fix" this but I've been pointed to the fact the API was\n> > > not very well specified.\n> > > \n> > > What would your expectations be in this case ?\n> > \n> > That is interesting, so today we have a bit of both. The most basic applications\n> > needs to match two roles already (ViewFinder + X). So in term of sensor\n> > configuration, without going into slow validation loops, have to cross over the\n> > information across the roles.\n> > \n> > So overall, I think my preference would be something structured like this:\n> > \n> > Role1:\n> >   + format/size\n> > \t- Sensor Config1\n> > \t- Sensor Config2\n> > \t- Sensor Config3\n> >   + format/size\n> > \t- Sensor Config2\n> > Role2:\n> >   ...\n> > \n> > The stream configuration as defined today seems well suited. Keeping the fixated\n> > frameDuration away from it seems unavoidable. 
There is just too many variable\n> \n> What do you mean with "fixated frameDuration" ?\n\nA duration range is unfixed; a fixed duration is the final value you ask for. At\nleast in the sensors I've tested so far, we get some range, like 1/1 to 120/1\nfps, and if you ask for 60 fps, it will drop frames for you to match as closely as\npossible. At least, the RPi pipeline seems to do this, since we don't do any\nof this in GStreamer as far as I can remember.\n\nIn GStreamer, configurations like this are appended into GstCaps, which is a\nlist of structures whose fields can be fixed values, ranges or sets. This is our\ngo-to tool to deal with random complexity like this one.\n\n> \n> > toward the fixation of the frameDuration, on top of which, applications are much\n> > more flexible then they pretend. As an example, when I see:\n> > \n> >    libcamerasrc ! video/x-raw,framerate=60/1 ! ...\n> > \n> > I see a very naive application. I'm very doubtful that 121/2 or 119/2 would not\n> > have done the equally the job. And even 50/1 would likely be fine for most use\n> > cases. In GStreamer, you can (and should) use ranges when setting filters.\n> > \n> >   libcamerasrc ! video/x-raw,framerate=[40/1,70/1] ! ...\n> > \n> > If you have a strong preference for 60, you should hint that this way (note, as\n> > I said in the review, the newly added caps negotiation is not working yet for\n> > that):\n> > \n> >   libcamerasrc ! video/x-raw,framerate=60/1;video/x-raw,framerate=[40/1,70/1] ! ...\n> > \n> > And libcamerasrc is free to read the first structure, and same all the fixed\n> > values as the "default", and later use oriented fixation function\n> > (gst_structure_fixate_nearest_fraction()). This gets complex quickly, so no one\n> > ever implement this ahead of time, but its all there and well defined. 
And quite\n> > representative of how far an application can go to do the right thing.\n> > \n> > How the sensor configuration would go, done by application or pipeline, would be\n> > to collect all sensorConfigurations for all the streams fixated\n> > streamConfiguration. Intersect, and finally score the remaining to pick the\n> > best.\n> \n> Do I read it right that you then expect to receive all information\n> about what streams can be produced by a Camera as they were considered\n> in isolation, and combine them opportunely yourself at the time of\n> constructing a CameraConfiguration before starting capture ?\n> \n\nWhat was described here was indeed a list of streamConfigurations (as we have now, in\nisolation), but that does not include the frame duration range, since that depends on\nthe selected sensor configuration. And for each of these configurations, get a\nlist of sensor configurations. The app can walk through the fields of view and\nsuch, and may have constraints regarding that. So kind of like\nAndroid, but with the raw part put into relation with the streamConfiguration\nfor us. And also the raw part is now extended to include the frame duration\nrange (even if that could get reduced by specific processing included for specific\nroles like still).\n\nOverall, I think we at least need a method to select some sensor-specific\nconfig, which inherently will have to be common to every stream in our setup.\nAn alternative is maybe to go in two steps. Do like we do now, which is to\ncombine, in isolation, the constraints for each stream requested by the application.\nConfigure and validate the configuration. From there, enumerate the list of\nother "sensor configurations" that would have worked within the supported constraints\n(remember frame duration is still not part of it). 
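The collect, intersect, and score flow discussed in this thread can be modelled with plain ranges and sets. Below is a hypothetical toy sketch in Python (none of this is libcamera or GStreamer API; the mode sizes, framerate ranges, and function names are all made up for illustration), showing how per-stream sensor-mode framerate ranges intersect and how an empty intersection rules a mode out:

```python
from fractions import Fraction

# Toy model: each stream lists the sensor modes (size -> framerate range)
# it could be produced from. We intersect the per-mode framerate ranges
# across all streams, drop modes with an empty intersection, then score
# the survivors and pick the "best" one.

def intersect_ranges(a, b):
    """Intersect two (min_fps, max_fps) ranges; None if disjoint."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

def pick_sensor_mode(per_stream_modes, target_fps):
    # per_stream_modes: one dict per stream, {(w, h): (min_fps, max_fps)}.
    common = per_stream_modes[0]
    for stream in per_stream_modes[1:]:
        common = {
            mode: r
            for mode, rng in common.items()
            if mode in stream and (r := intersect_ranges(rng, stream[mode]))
        }
    if not common:
        return None
    # Score: prefer modes whose range contains target_fps, then larger area.
    def score(item):
        (w, h), (lo, hi) = item
        return (lo <= target_fps <= hi, w * h)
    return max(common.items(), key=score)

streams = [
    {(4144, 3136): (Fraction(1), Fraction(16)),
     (2072, 1568): (Fraction(1), Fraction(60))},   # e.g. still stream
    {(2072, 1568): (Fraction(30), Fraction(120))}, # e.g. viewfinder stream
]
best = pick_sensor_mode(streams, Fraction(60))
# The 4144x3136 mode vanishes (missing from the second stream); the
# surviving mode keeps only the framerate range both streams support.
```

This is essentially what GstCaps intersection gives you for free: a mode absent from one stream's caps, or with a disjoint framerate range, simply disappears from the intersection.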
So an app that just failed on\nthe frame duration constraints (or field of view constraints) could go through\nthe list and apply global constraints like the field of view, the maximum rate,\netc. When the app detects that its constraints on\nframe duration are not met, it could then pick a sensor configuration with a higher\nrate (there might be multiple valid options) and try them out in order, to see if\nthis issue can be resolved before giving up.\n\n> \n> Thanks for sticking to this long conversation\n> \n> > \n> > Hope this quick GstCaps intro can help,\n> > Nicolas\n> > \n> > > \n> > > [1] https://libcamera.org/api-html/classlibcamera_1_1StreamFormats.html\n> > > [2] https://libcamera.org/api-html/structlibcamera_1_1StreamConfiguration.html#a37725e6b9e179ca80be0e57ed97857d3\n> > > -------------------------------------------------------------------------------\n> > > \n> > > \n> > > > +\tif (config == nullptr) {\n> > > > +\t\tGST_DEBUG_OBJECT(self, "Sensor mode selection is not supported, skipping enumeration.");\n> > > > +\t\treturn modes;\n> > > > +\t}\n> > > > +\n> > > > +\tconst libcamera::StreamFormats &formats = config->at(0).formats();\n> > > > +\n> > > > +\tfor (const auto &pixfmt : formats.pixelformats()) {\n> > > > +\t\tfor (const auto &size : formats.sizes(pixfmt)) {\n> > > > +\t\t\tconfig->at(0).size = size;\n> > > > +\t\t\tconfig->at(0).pixelFormat = pixfmt;\n> > > > +\n> > > > +\t\t\tif (config->validate() == CameraConfiguration::Invalid)\n> > > > +\t\t\t\tcontinue;\n> > > > +\n> > > > +\t\t\tif (state->cam_->configure(config.get()))\n> > > > +\t\t\t\tcontinue;\n> > > > +\n> > > > +\t\t\tauto fd_ctrl = state->cam_->controls().find(&controls::FrameDurationLimits);\n> > > > +\t\t\tif (fd_ctrl == state->cam_->controls().end())\n> > > > +\t\t\t\tcontinue;\n> > > > +\n> > > > +\t\t\tint minrate_num, minrate_denom;\n> > > > +\t\t\tint maxrate_num, maxrate_denom;\n> > > > +\t\t\tdouble min_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> 
> > > +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.max().get<int64_t>());\n> > > > +\t\t\tdouble max_framerate = gst_util_guint64_to_gdouble(1.0e6) /\n> > > > +\t\t\t\t\t       gst_util_guint64_to_gdouble(fd_ctrl->second.min().get<int64_t>());\n> > > > +\t\t\tgst_util_double_to_fraction(min_framerate, &minrate_num, &minrate_denom);\n> > > > +\t\t\tgst_util_double_to_fraction(max_framerate, &maxrate_num, &maxrate_denom);\n> > > > +\n> > > > +\t\t\tGstStructure *s = gst_structure_new(\"sensor/mode\",\n> > > > +\t\t\t\t\t\t\t    \"format\", G_TYPE_STRING, pixfmt.toString().c_str(),\n> > > > +\t\t\t\t\t\t\t    \"width\", G_TYPE_INT, size.width,\n> > > > +\t\t\t\t\t\t\t    \"height\", G_TYPE_INT, size.height,\n> > > > +\t\t\t\t\t\t\t    \"framerate\", GST_TYPE_FRACTION_RANGE,\n> > > > +\t\t\t\t\t\t\t    minrate_num, minrate_denom,\n> > > > +\t\t\t\t\t\t\t    maxrate_num, maxrate_denom,\n> > > > +\t\t\t\t\t\t\t    nullptr);\n> > > > +\t\t\tgst_caps_append_structure(modes, s);\n> > > > +\t\t}\n> > > > +\t}\n> > > > +\n> > > > +\tGST_DEBUG_OBJECT(self, \"Camera sensor modes: %\" GST_PTR_FORMAT, modes);\n> > > > +\n> > > > +\treturn modes;\n> > > > +}\n> > > > +\n> > > >  static bool\n> > > >  gst_libcamera_src_open(GstLibcameraSrc *self)\n> > > >  {\n> > > > @@ -375,6 +436,7 @@ gst_libcamera_src_open(GstLibcameraSrc *self)\n> > > >  \t/* No need to lock here, we didn't start our threads yet. 
*/\n> > > >  \tself->state->cm_ = cm;\n> > > >  \tself->state->cam_ = cam;\n> > > > +\tself->sensor_modes = gst_libcamera_src_enumerate_sensor_modes(self);\n> > > > \n> > > >  \treturn true;\n> > > >  }\n> > > > @@ -462,6 +524,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> > > >  \tGstLibcameraSrcState *state = self->state;\n> > > >  \tGstFlowReturn flow_ret = GST_FLOW_OK;\n> > > >  \tgint ret;\n> > > > +\tg_autoptr(GstCaps) sensor_mode = nullptr;\n> > > > \n> > > >  \tg_autoptr(GstStructure) element_caps = gst_structure_new_empty(\"caps\");\n> > > > \n> > > > @@ -481,6 +544,16 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> > > >  \t\troles.push_back(gst_libcamera_pad_get_role(srcpad));\n> > > >  \t}\n> > > > \n> > > > +\tif (!gst_caps_is_any(self->sensor_mode)) {\n> > > > +\t\tsensor_mode = gst_caps_intersect(self->sensor_mode, self->sensor_modes);\n> > > > +\t\tif (!gst_caps_is_empty(sensor_mode)) {\n> > > > +\t\t\troles.push_back(libcamera::StreamRole::Raw);\n> > > > +\t\t} else {\n> > > > +\t\t\tGST_WARNING_OBJECT(self, \"No sensor mode matching the selection, ignoring.\");\n> > > > +\t\t\tgst_clear_caps(&sensor_mode);\n> > > > +\t\t}\n> > > > +\t}\n> > > > +\n> > > >  \t/* Generate the stream configurations, there should be one per pad. 
*/\n> > > >  \tstate->config_ = state->cam_->generateConfiguration(roles);\n> > > >  \tif (state->config_ == nullptr) {\n> > > > @@ -490,7 +563,7 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> > > >  \t\tgst_task_stop(task);\n> > > >  \t\treturn;\n> > > >  \t}\n> > > > -\tg_assert(state->config_->size() == state->srcpads_.size());\n> > > > +\tg_assert(state->config_->size() == state->srcpads_.size() + (!!sensor_mode));\n> > > > \n> > > >  \tfor (gsize i = 0; i < state->srcpads_.size(); i++) {\n> > > >  \t\tGstPad *srcpad = state->srcpads_[i];\n> > > > @@ -510,6 +583,12 @@ gst_libcamera_src_task_enter(GstTask *task, [[maybe_unused]] GThread *thread,\n> > > >  \t\tgst_libcamera_get_framerate_from_caps(caps, element_caps);\n> > > >  \t}\n> > > > \n> > > > +\tif (sensor_mode) {\n> > > > +\t\tStreamConfiguration &stream_cfg = state->config_->at(state->srcpads_.size());\n> > > > +\t\tg_assert(gst_caps_is_writable(sensor_mode));\n> > > > +\t\tgst_libcamera_configure_stream_from_caps(stream_cfg, sensor_mode);\n> > > > +\t}\n> > > > +\n> > > >  \tif (flow_ret != GST_FLOW_OK)\n> > > >  \t\tgoto done;\n> > > > \n> > > > @@ -624,6 +703,7 @@ gst_libcamera_src_task_leave([[maybe_unused]] GstTask *task,\n> > > >  \tg_clear_object(&self->allocator);\n> > > >  \tg_clear_pointer(&self->flow_combiner,\n> > > >  \t\t\t(GDestroyNotify)gst_flow_combiner_free);\n> > > > +\tgst_clear_caps(&self->sensor_modes);\n> > > >  }\n> > > > \n> > > >  static void\n> > > > @@ -659,6 +739,10 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id,\n> > > >  \t\tg_free(self->camera_name);\n> > > >  \t\tself->camera_name = g_value_dup_string(value);\n> > > >  \t\tbreak;\n> > > > +\tcase PROP_SENSOR_MODE:\n> > > > +\t\tgst_clear_caps(&self->sensor_mode);\n> > > > +\t\tself->sensor_mode = GST_CAPS(g_value_dup_boxed(value));\n> > > > +\t\tbreak;\n> > > >  \tdefault:\n> > > >  \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n> > > >  
\t\tbreak;\n> > > > @@ -676,6 +760,12 @@ gst_libcamera_src_get_property(GObject *object, guint prop_id, GValue *value,\n> > > >  \tcase PROP_CAMERA_NAME:\n> > > >  \t\tg_value_set_string(value, self->camera_name);\n> > > >  \t\tbreak;\n> > > > +\tcase PROP_SENSOR_MODES:\n> > > > +\t\tg_value_set_boxed(value, self->sensor_modes);\n> > > > +\t\tbreak;\n> > > > +\tcase PROP_SENSOR_MODE:\n> > > > +\t\tg_value_set_boxed(value, self->sensor_mode);\n> > > > +\t\tbreak;\n> > > >  \tdefault:\n> > > >  \t\tG_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);\n> > > >  \t\tbreak;\n> > > > @@ -763,6 +853,7 @@ gst_libcamera_src_init(GstLibcameraSrc *self)\n> > > >  \t/* C-style friend. */\n> > > >  \tstate->src_ = self;\n> > > >  \tself->state = state;\n> > > > +\tself->sensor_mode = gst_caps_new_any();\n> > > >  }\n> > > > \n> > > >  static GstPad *\n> > > > @@ -844,4 +935,19 @@ gst_libcamera_src_class_init(GstLibcameraSrcClass *klass)\n> > > >  \t\t\t\t\t\t\t     | G_PARAM_READWRITE\n> > > >  \t\t\t\t\t\t\t     | G_PARAM_STATIC_STRINGS));\n> > > >  \tg_object_class_install_property(object_class, PROP_CAMERA_NAME, spec);\n> > > > +\n> > > > +\tspec = g_param_spec_boxed(\"sensor-modes\", \"Sensor Modes\",\n> > > > +\t\t\t\t  \"GstCaps representing available sensor modes.\",\n> > > > +\t\t\t\t  GST_TYPE_CAPS,\n> > > > +\t\t\t\t  (GParamFlags)(G_PARAM_READABLE\n> > > > +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> > > > +\tg_object_class_install_property(object_class, PROP_SENSOR_MODES, spec);\n> > > > +\n> > > > +\tspec = g_param_spec_boxed(\"sensor-mode\", \"Sensor Mode\",\n> > > > +\t\t\t\t  \"GstCaps representing selected sensor mode.\",\n> > > > +\t\t\t\t  GST_TYPE_CAPS,\n> > > > +\t\t\t\t  (GParamFlags)(GST_PARAM_MUTABLE_READY\n> > > > +\t\t\t\t\t        | G_PARAM_READWRITE\n> > > > +\t\t\t\t\t        | G_PARAM_STATIC_STRINGS));\n> > > > +\tg_object_class_install_property(object_class, PROP_SENSOR_MODE, spec);\n> > > >  }\n> > > > --\n> > > > 2.39.2\n> > > > \n> 
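As a side note, the framerate math in the enumeration code quoted above (framerate = 1e6 / FrameDurationLimits, since the limits are expressed in microseconds per frame) can be sanity-checked with a small stdlib sketch. Here Fraction.limit_denominator merely stands in for gst_util_double_to_fraction, and the example limits are invented:

```python
from fractions import Fraction

# FrameDurationLimits is microseconds per frame, so the framerate range is
# [1e6 / max_duration, 1e6 / min_duration]: the shortest frame duration
# gives the highest framerate, and vice versa.
def duration_limits_to_framerate_range(min_us, max_us):
    # limit_denominator() plays the role of gst_util_double_to_fraction().
    min_fps = Fraction(1_000_000, max_us).limit_denominator(1000)
    max_fps = Fraction(1_000_000, min_us).limit_denominator(1000)
    return min_fps, max_fps

# Invented limits: 10 ms per frame (100 fps) up to 1 s per frame (1 fps).
min_fps, max_fps = duration_limits_to_framerate_range(10_000, 1_000_000)
```

Note the min/max swap: the patch divides by `fd_ctrl->second.max()` to get the minimum framerate, exactly as above.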
>","headers":{"Message-ID":"<fd3367c99599febbed2a99dbb01dd89a66c38a50.camel@collabora.com>","To":"Jacopo Mondi <jacopo.mondi@ideasonboard.com>","Date":"Tue, 28 Mar 2023 15:23:47 -0400","In-Reply-To":"<20230327151916.bxhuho3gez7kkvz7@uno.localdomain>","References":"<20230324181247.302586-1-nicolas@ndufresne.ca>\n\t<20230324181247.302586-2-nicolas@ndufresne.ca>\n\t<20230327092155.kzzwjsr5ffnh4y7k@uno.localdomain>\n\t<db46d6b2a3dc2f48b8a8323f32bebd522235c9bf.camel@collabora.com>\n\t<20230327151916.bxhuho3gez7kkvz7@uno.localdomain>","Subject":"Re: [libcamera-devel] [PATCH v2 1/3] gstreamer: Add sensor mode\n\tselection","From":"Nicolas Dufresne via libcamera-devel\n\t<libcamera-devel@lists.libcamera.org>","Reply-To":"Nicolas Dufresne <nicolas.dufresne@collabora.com>","Cc":"libcamera-devel@lists.libcamera.org"}}]