Message ID | 20220316141316.926035-1-kieran.bingham@ideasonboard.com |
---|---|
State | Accepted |
Hello Kieran,

On 3/16/22 19:43, Kieran Bingham via libcamera-devel wrote:
> There have been many reports of facing difficulties with the gstreamer
> element and getting the libcamerasrc to successfully negotiate with
> other gstreamer elements.
>
> This is often due to the current limitations on colorimetry and frame
> rate support in the element, and can usually be worked around by
> specifying those explicitly in the caps.

Yes, I was going to take a look at it earlier but alas didn't any
time-slot. Colorimetry is a low-hanging fruit between the two it seems.

> Provide a tested example to capture, encode, and stream images as jpeg
> to a remote device in the gstreamer section of the getting started
> readme.
>
> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
> ---
>
> Technically this pipeline will stream without specifying the colorimetry
> but the purpose is to add an example with both framerate, and
> colorimetry explicitly listed to help people debug their pipelines
> themselves.

Yes, great.

> README.rst | 19 +++++++++++++++++++
> 1 file changed, 19 insertions(+)
>
> diff --git a/README.rst b/README.rst
> index ca8a97cbd71b..7abbc9e7a9ae 100644
> --- a/README.rst
> +++ b/README.rst
> @@ -139,6 +139,25 @@ the video device provider) and libcamerasrc (for the operation of the camera).
>  All corresponding debug messages can be enabled by setting the ``GST_DEBUG``
>  environment variable to ``libcamera*:7``.
>
> +Presently to prevent element negotiation failures it is required to specify the

s/Presently/Presently,/ maybe?

> +colorimetry and framerate as part of your pipeline construction. For instance

s/instance/instance,/

Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>

> +to capture and encode as a JPEG stream and receive on another device the
> +following example could be used as a starting point:
> +
> +.. code::
> +
> +    gst-launch-1.0 libcamerasrc ! \
> +        video/x-raw,colorimetry=bt709,format=NV12,width=1280,height=720,framerate=30/1 ! \
> +        jpegenc ! multipartmux ! \
> +        tcpserversink host=0.0.0.0 port=5000
> +
> +Which can be received on another device over the network with:
> +
> +.. code::
> +
> +    gst-launch-1.0 tcpclientsrc host=$DEVICE_IP port=5000 ! \
> +        multipartdemux ! jpegdec ! autovideosink
> +
>  .. section-end-getting-started
>
>  Troubleshooting
On Wed, Mar 16, 2022 at 11:36:30PM +0530, Umang Jain via libcamera-devel wrote:
> On 3/16/22 19:43, Kieran Bingham via libcamera-devel wrote:
> > There have been many reports of facing difficulties with the gstreamer
> > element and getting the libcamerasrc to successfully negotiate with
> > other gstreamer elements.
> >
> > This is often due to the current limitations on colorimetry and frame
> > rate support in the element, and can usually be worked around by
> > specifying those explicitly in the caps.
>
> Yes, I was going to take a look at it earlier but alas didn't any
> time-slot. Colorimetry is a low-hanging fruit between the two it seems.
>
> > Provide a tested example to capture, encode, and stream images as jpeg
> > to a remote device in the gstreamer section of the getting started
> > readme.
> >
> > Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
> > ---
> >
> > Technically this pipeline will stream without specifying the colorimetry
> > but the purpose is to add an example with both framerate, and
> > colorimetry explicitly listed to help people debug their pipelines
> > themselves.
>
> Yes, great.
>
> > README.rst | 19 +++++++++++++++++++
> > 1 file changed, 19 insertions(+)
> >
> > diff --git a/README.rst b/README.rst
> > index ca8a97cbd71b..7abbc9e7a9ae 100644
> > --- a/README.rst
> > +++ b/README.rst
> > @@ -139,6 +139,25 @@ the video device provider) and libcamerasrc (for the operation of the camera).
> >  All corresponding debug messages can be enabled by setting the ``GST_DEBUG``
> >  environment variable to ``libcamera*:7``.
> >
> > +Presently to prevent element negotiation failures it is required to specify the
>
> s/Presently/Presently,/ maybe?
>
> > +colorimetry and framerate as part of your pipeline construction. For instance
>
> s/instance/instance,/
>
> Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>

Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>

We may want to move documentation for the GStreamer element to a separate
file at some point.

> > +to capture and encode as a JPEG stream and receive on another device the
> > +following example could be used as a starting point:
> > +
> > +.. code::
> > +
> > +    gst-launch-1.0 libcamerasrc ! \
> > +        video/x-raw,colorimetry=bt709,format=NV12,width=1280,height=720,framerate=30/1 ! \
> > +        jpegenc ! multipartmux ! \
> > +        tcpserversink host=0.0.0.0 port=5000
> > +
> > +Which can be received on another device over the network with:
> > +
> > +.. code::
> > +
> > +    gst-launch-1.0 tcpclientsrc host=$DEVICE_IP port=5000 ! \
> > +        multipartdemux ! jpegdec ! autovideosink
> > +
> >  .. section-end-getting-started
> >
> >  Troubleshooting
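Editor's note: for readers hitting similar negotiation failures, the README
text quoted in the review above already documents the ``GST_DEBUG``
environment variable. A minimal sketch of combining that debug setting with
the example pipeline from this patch (the caps string and debug level are
taken verbatim from the diff and the surrounding README text; only their
combination on one command line is added here):

    # Sketch: run the patch's example pipeline with libcamerasrc debug
    # output enabled, per the README's GST_DEBUG note.
    GST_DEBUG="libcamera*:7" gst-launch-1.0 libcamerasrc ! \
        video/x-raw,colorimetry=bt709,format=NV12,width=1280,height=720,framerate=30/1 ! \
        jpegenc ! multipartmux ! \
        tcpserversink host=0.0.0.0 port=5000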
diff --git a/README.rst b/README.rst
index ca8a97cbd71b..7abbc9e7a9ae 100644
--- a/README.rst
+++ b/README.rst
@@ -139,6 +139,25 @@ the video device provider) and libcamerasrc (for the operation of the camera).
 All corresponding debug messages can be enabled by setting the ``GST_DEBUG``
 environment variable to ``libcamera*:7``.
 
+Presently to prevent element negotiation failures it is required to specify the
+colorimetry and framerate as part of your pipeline construction. For instance
+to capture and encode as a JPEG stream and receive on another device the
+following example could be used as a starting point:
+
+.. code::
+
+    gst-launch-1.0 libcamerasrc ! \
+        video/x-raw,colorimetry=bt709,format=NV12,width=1280,height=720,framerate=30/1 ! \
+        jpegenc ! multipartmux ! \
+        tcpserversink host=0.0.0.0 port=5000
+
+Which can be received on another device over the network with:
+
+.. code::
+
+    gst-launch-1.0 tcpclientsrc host=$DEVICE_IP port=5000 ! \
+        multipartdemux ! jpegdec ! autovideosink
+
 .. section-end-getting-started
 
 Troubleshooting
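Editor's note: when trying the patch locally, it can help to confirm that the
caps negotiate on the capture device before involving the network. A minimal
sketch, not part of the patch, reusing the caps string from the diff with the
standard GStreamer videoconvert and autovideosink elements:

    # Sketch: local preview to verify that libcamerasrc negotiates the
    # requested colorimetry, format and framerate before streaming.
    gst-launch-1.0 libcamerasrc ! \
        video/x-raw,colorimetry=bt709,format=NV12,width=1280,height=720,framerate=30/1 ! \
        videoconvert ! autovideosink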
There have been many reports of facing difficulties with the gstreamer
element and getting the libcamerasrc to successfully negotiate with
other gstreamer elements.

This is often due to the current limitations on colorimetry and frame
rate support in the element, and can usually be worked around by
specifying those explicitly in the caps.

Provide a tested example to capture, encode, and stream images as jpeg
to a remote device in the gstreamer section of the getting started
readme.

Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
---

Technically this pipeline will stream without specifying the colorimetry
but the purpose is to add an example with both framerate, and
colorimetry explicitly listed to help people debug their pipelines
themselves.

 README.rst | 19 +++++++++++++++++++
 1 file changed, 19 insertions(+)
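Editor's note: the received MJPEG stream does not have to be decoded and
displayed. A hedged variation of the patch's receive pipeline, not part of
the patch itself, that writes each JPEG frame to disk with the standard
multifilesink element (frame-%05d.jpg is only an illustrative filename
pattern):

    # Sketch: save every received JPEG frame as a numbered file instead
    # of decoding and displaying it.
    gst-launch-1.0 tcpclientsrc host=$DEVICE_IP port=5000 ! \
        multipartdemux ! multifilesink location="frame-%05d.jpg"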