Message ID: 20210126162412.824033-3-naush@raspberrypi.com
State: Superseded
Hi Naush, Thank you for the patch. On Tue, Jan 26, 2021 at 04:24:08PM +0000, Naushir Patuck wrote: > In preparation for fast colour denoise, set the low resolution ISP > output stream (Output1) to a 1/4 resolution of the application requested > stream (Output0). This only happens if the application has not requested > an additional YUV or RGB stream. > > We also constrain this 1/4 resolution to at most 1200 pixels in the > largest dimension to avoid being too taxing on memory usage and system > bandwidth. > > Also switch the default StreamRole::VideoRecording to YUV420 to allow > fast colour denoise to run. Could you explain why all this is needed ? What happens if an application requests the second output with a higher resolution than this ? Will fast colour denoise still work correctly ? > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > Reviewed-by: David Plowman <david.plowman@raspberrypi.com> > Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> > --- > .../pipeline/raspberrypi/raspberrypi.cpp | 35 ++++++++++++++++++- > 1 file changed, 34 insertions(+), 1 deletion(-) > > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > index d9895c779725..fe4c75f09925 100644 > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > @@ -496,7 +496,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera, > > case StreamRole::VideoRecording: > fmts = data->isp_[Isp::Output0].dev()->formats(); > - pixelFormat = formats::NV12; > + pixelFormat = formats::YUV420; > size = { 1920, 1080 }; > bufferCount = 4; > outCount++; > @@ -608,6 +608,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > * StreamConfiguration appropriately. 
> */ > V4L2DeviceFormat format; > + bool output1Set = false; > for (unsigned i = 0; i < config->size(); i++) { > StreamConfiguration &cfg = config->at(i); > > @@ -632,6 +633,9 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > format.size = cfg.size; > format.fourcc = fourcc; > > + LOG(RPI, Info) << "Setting " << stream->name() << " to a resolution of " > + << format.toString(); > + > ret = stream->dev()->setFormat(&format); > if (ret) > return -EINVAL; > @@ -645,6 +649,35 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > > cfg.setStream(stream); > stream->setExternal(true); > + > + if (i != maxIndex) > + output1Set = true; > + } > + > + /* > + * If ISP::Output1 stream has not been requested by the application, we > + * set it up for internal use now. This second stream will be used for > + * fast colour denoise, and must be a quarter resolution of the ISP::Output0 > + * stream. However, also limit the maximum size to 1200 pixels in the > + * larger dimension, just to avoid being wasteful with buffer allocations > + * and memory bandwidth. > + */ > + if (!output1Set) { > + V4L2DeviceFormat output1Format = format; > + constexpr unsigned int maxDimensions = 1200; > + const Size limit = Size(maxDimensions, maxDimensions).boundedToAspectRatio(format.size); > + > + output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2); > + > + LOG(RPI, Info) << "Setting ISP Output1 (internal) to a resolution of " > + << output1Format.toString(); > + > + ret = data->isp_[Isp::Output1].dev()->setFormat(&output1Format); > + if (ret) { > + LOG(RPI, Error) << "Failed to set format on ISP Output1: " > + << ret; > + return -EINVAL; > + } > } > > /* ISP statistics output format. */
Hi Laurent, On Tue, 26 Jan 2021 at 22:53, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote: > Hi Naush, > > Thank you for the patch. > > On Tue, Jan 26, 2021 at 04:24:08PM +0000, Naushir Patuck wrote: > > In preparation for fast colour denoise, set the low resolution ISP > > output stream (Output1) to a 1/4 resolution of the application requested > > stream (Output0). This only happens if the application has not requested > > an additional YUV or RGB stream. > > > > We also constrain this 1/4 resolution to at most 1200 pixels in the > > largest dimension to avoid being too taxing on memory usage and system > > bandwidth. > > > > Also switch the default StreamRole::VideoRecording to YUV420 to allow > > fast colour denoise to run. > > Could you explain why all this is needed ? What happens if an > application requests the second output with a higher resolution that > this, will fast colour denoise still work correctly ? > Fast colour denoise essentially works by performing some analysis on an image at a 1:4 or 1:1 resolution ratio compared with the output image. It then applies correction on the output image directly based on this analysis. If the application were to require a second output stream that is not in a 1:4 or 1:1 resolution ratio, colour denoise will silently be bypassed. There is another constraint that the analysis image must be in YUV420 format. Unfortunately, this is a limitation of our current implementation. 
Regards, Naush > > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > > Reviewed-by: David Plowman <david.plowman@raspberrypi.com> > > Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> > > --- > > .../pipeline/raspberrypi/raspberrypi.cpp | 35 ++++++++++++++++++- > > 1 file changed, 34 insertions(+), 1 deletion(-) > > > > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > index d9895c779725..fe4c75f09925 100644 > > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > @@ -496,7 +496,7 @@ CameraConfiguration > *PipelineHandlerRPi::generateConfiguration(Camera *camera, > > > > case StreamRole::VideoRecording: > > fmts = data->isp_[Isp::Output0].dev()->formats(); > > - pixelFormat = formats::NV12; > > + pixelFormat = formats::YUV420; > > size = { 1920, 1080 }; > > bufferCount = 4; > > outCount++; > > @@ -608,6 +608,7 @@ int PipelineHandlerRPi::configure(Camera *camera, > CameraConfiguration *config) > > * StreamConfiguration appropriately. 
> > */ > > V4L2DeviceFormat format; > > + bool output1Set = false; > > for (unsigned i = 0; i < config->size(); i++) { > > StreamConfiguration &cfg = config->at(i); > > > > @@ -632,6 +633,9 @@ int PipelineHandlerRPi::configure(Camera *camera, > CameraConfiguration *config) > > format.size = cfg.size; > > format.fourcc = fourcc; > > > > + LOG(RPI, Info) << "Setting " << stream->name() << " to a > resolution of " > > + << format.toString(); > > + > > ret = stream->dev()->setFormat(&format); > > if (ret) > > return -EINVAL; > > @@ -645,6 +649,35 @@ int PipelineHandlerRPi::configure(Camera *camera, > CameraConfiguration *config) > > > > cfg.setStream(stream); > > stream->setExternal(true); > > + > > + if (i != maxIndex) > > + output1Set = true; > > + } > > + > > + /* > > + * If ISP::Output1 stream has not been requested by the > application, we > > + * set it up for internal use now. This second stream will be used > for > > + * fast colour denoise, and must be a quarter resolution of the > ISP::Output0 > > + * stream. However, also limit the maximum size to 1200 pixels in > the > > + * larger dimension, just to avoid being wasteful with buffer > allocations > > + * and memory bandwidth. > > + */ > > + if (!output1Set) { > > + V4L2DeviceFormat output1Format = format; > > + constexpr unsigned int maxDimensions = 1200; > > + const Size limit = Size(maxDimensions, > maxDimensions).boundedToAspectRatio(format.size); > > + > > + output1Format.size = (format.size / > 2).boundedTo(limit).alignedDownTo(2, 2); > > + > > + LOG(RPI, Info) << "Setting ISP Output1 (internal) to a > resolution of " > > + << output1Format.toString(); > > + > > + ret = > data->isp_[Isp::Output1].dev()->setFormat(&output1Format); > > + if (ret) { > > + LOG(RPI, Error) << "Failed to set format on ISP > Output1: " > > + << ret; > > + return -EINVAL; > > + } > > } > > > > /* ISP statistics output format. */ > > -- > Regards, > > Laurent Pinchart >
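Naush's explanation above boils down to a simple predicate: colour denoise only runs when the analysis stream is at a 1:1 or 1:4 resolution ratio to the output stream. The following is an illustrative reconstruction of that constraint, not libcamera code; the type and function names are assumptions made for the sketch.

```cpp
#include <cstdint>

/*
 * Hypothetical sketch of the constraint described in the thread: fast
 * colour denoise analyses a second image that must either match the
 * output image exactly (1:1) or be exactly half in each dimension (a
 * 1:4 area ratio). Any other ratio silently bypasses the algorithm.
 * All names here are illustrative, not the libcamera implementation.
 */
struct ImageSize {
	uint32_t width;
	uint32_t height;
};

bool colourDenoiseApplicable(const ImageSize &output,
			     const ImageSize &analysis)
{
	/* 1:1 - analysis image matches the output exactly. */
	if (analysis.width == output.width && analysis.height == output.height)
		return true;

	/* 1:4 area ratio - each dimension is exactly half the output's. */
	return analysis.width * 2 == output.width &&
	       analysis.height * 2 == output.height;
}
```

With a 1920x1080 output, a 960x540 analysis stream qualifies, while an application-requested 640x480 second stream would cause the denoise stage to be silently bypassed, as described above.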
Hi Naush, On Wed, Jan 27, 2021 at 08:14:29AM +0000, Naushir Patuck wrote: > On Tue, 26 Jan 2021 at 22:53, Laurent Pinchart wrote: > > On Tue, Jan 26, 2021 at 04:24:08PM +0000, Naushir Patuck wrote: > > > In preparation for fast colour denoise, set the low resolution ISP > > > output stream (Output1) to a 1/4 resolution of the application requested > > > stream (Output0). This only happens if the application has not requested > > > an additional YUV or RGB stream. > > > > > > We also constrain this 1/4 resolution to at most 1200 pixels in the > > > largest dimension to avoid being too taxing on memory usage and system > > > bandwidth. > > > > > > Also switch the default StreamRole::VideoRecording to YUV420 to allow > > > fast colour denoise to run. > > > > Could you explain why all this is needed ? What happens if an > > application requests the second output with a higher resolution that > > this, will fast colour denoise still work correctly ? > > Fast colour denoise essentially works by doing some analysis at a 1:4 or > 1:1 resolution image when compared with the output image resolution. It > then applies correction on the output image directly based on this > analysis. If the application were to require a second output stream that > is not 1:4 or 1:1 in resolution ratio, the colour denoise will silently be > bypassed. That's useful information. Maybe this could be captured in a comment in the code ? I also wonder if we shouldn't consider this in the metadata we return. When the CDN is silently bypassed, wouldn't it be better if the metadata reflected that ? > There is another constraint that this analysis image must be in > YUV420 format. Unfortunately, this is a limitation with our > implementation that we currently have. Only the analysis image, or also the main output image ? This would also be useful to capture in a comment, so that it doesn't get ignored later when the code is reworked. 
> > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > > > Reviewed-by: David Plowman <david.plowman@raspberrypi.com> > > > Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> > > > --- > > > .../pipeline/raspberrypi/raspberrypi.cpp | 35 ++++++++++++++++++- > > > 1 file changed, 34 insertions(+), 1 deletion(-) > > > > > > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > index d9895c779725..fe4c75f09925 100644 > > > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > @@ -496,7 +496,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera, > > > > > > case StreamRole::VideoRecording: > > > fmts = data->isp_[Isp::Output0].dev()->formats(); > > > - pixelFormat = formats::NV12; Here for instance, you could write /* * The colour denoise algorithm requires the analysis * image, produced by the second ISP output, to be in * YUV420 format. Select this format as the default, to * maximize chances that it will be picked by * applications and enable usage of the colour denoise * algorithm. */ Or something more accurate, as I'm not sure I have picked all the important information correctly :-) > > > + pixelFormat = formats::YUV420; > > > size = { 1920, 1080 }; > > > bufferCount = 4; > > > outCount++; > > > @@ -608,6 +608,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > > > * StreamConfiguration appropriately. 
> > > */ > > > V4L2DeviceFormat format; > > > + bool output1Set = false; > > > for (unsigned i = 0; i < config->size(); i++) { > > > StreamConfiguration &cfg = config->at(i); > > > > > > @@ -632,6 +633,9 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > > > format.size = cfg.size; > > > format.fourcc = fourcc; > > > > > > + LOG(RPI, Info) << "Setting " << stream->name() << " to a resolution of " > > > + << format.toString(); > > > + > > > ret = stream->dev()->setFormat(&format); > > > if (ret) > > > return -EINVAL; > > > @@ -645,6 +649,35 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > > > > > > cfg.setStream(stream); > > > stream->setExternal(true); > > > + > > > + if (i != maxIndex) > > > + output1Set = true; > > > + } > > > + > > > + /* > > > + * If ISP::Output1 stream has not been requested by the application, we > > > + * set it up for internal use now. This second stream will be used for > > > + * fast colour denoise, and must be a quarter resolution of the ISP::Output0 > > > + * stream. However, also limit the maximum size to 1200 pixels in the > > > + * larger dimension, just to avoid being wasteful with buffer allocations > > > + * and memory bandwidth. > > > + */ > > > + if (!output1Set) { > > > + V4L2DeviceFormat output1Format = format; If only the analysis image has to be in YUV420 format, should YUV420 be selected here in case output 0 uses a different format ? 
> > > + constexpr unsigned int maxDimensions = 1200; > > > + const Size limit = Size(maxDimensions, maxDimensions).boundedToAspectRatio(format.size); > > > + > > > + output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2); > > > + > > > + LOG(RPI, Info) << "Setting ISP Output1 (internal) to a resolution of " > > > + << output1Format.toString(); > > > + > > > + ret = data->isp_[Isp::Output1].dev()->setFormat(&output1Format); > > > + if (ret) { > > > + LOG(RPI, Error) << "Failed to set format on ISP Output1: " > > > + << ret; > > > + return -EINVAL; > > > + } > > > } > > > > > > /* ISP statistics output format. */
Hi Laurent, On Thu, 4 Feb 2021 at 21:57, Laurent Pinchart < laurent.pinchart@ideasonboard.com> wrote: > Hi Naush, > > On Wed, Jan 27, 2021 at 08:14:29AM +0000, Naushir Patuck wrote: > > On Tue, 26 Jan 2021 at 22:53, Laurent Pinchart wrote: > > > On Tue, Jan 26, 2021 at 04:24:08PM +0000, Naushir Patuck wrote: > > > > In preparation for fast colour denoise, set the low resolution ISP > > > > output stream (Output1) to a 1/4 resolution of the application > requested > > > > stream (Output0). This only happens if the application has not > requested > > > > an additional YUV or RGB stream. > > > > > > > > We also constrain this 1/4 resolution to at most 1200 pixels in the > > > > largest dimension to avoid being too taxing on memory usage and > system > > > > bandwidth. > > > > > > > > Also switch the default StreamRole::VideoRecording to YUV420 to allow > > > > fast colour denoise to run. > > > > > > Could you explain why all this is needed ? What happens if an > > > application requests the second output with a higher resolution that > > > this, will fast colour denoise still work correctly ? > > > > Fast colour denoise essentially works by doing some analysis at a 1:4 or > > 1:1 resolution image when compared with the output image resolution. It > > then applies correction on the output image directly based on this > > analysis. If the application were to require a second output stream that > > is not 1:4 or 1:1 in resolution ratio, the colour denoise will silently > be > > bypassed. > > That's useful information. Maybe this could be captured in a comment in > the code ? > Yes, that's a good idea. > > I also wonder if we shouldn't consider this in the metadata we return. > When the CDN is silently bypassed, wouldn't it be better if the metadata > reflected that ? > Yes, we could consider that. The options I would consider are: 1) Only return the metadata when/if the control is set by the application indicating what has been applied (i.e. 
either set or not based on the requested format). 2) Return the metadata on each frame indicating the CDN status. My preference would be for 1: it matches how metadata is returned when we set other ISP-related params, but I am not entirely against doing 2. What are your thoughts? > > > There is another constraint that this analysis image must be in > > YUV420 format. Unfortunately, this is a limitation with our > > implementation that we currently have. > > Only the analysis image, or also the main output image ? This would also > be useful to capture in a comment, so that it doesn't get ignored later > when the code is reworked. > Both images have to be in YUV420 format. > > > > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > > > > Reviewed-by: David Plowman <david.plowman@raspberrypi.com> > > > > Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> > > > > --- > > > > .../pipeline/raspberrypi/raspberrypi.cpp | 35 > ++++++++++++++++++- > > > > 1 file changed, 34 insertions(+), 1 deletion(-) > > > > > > > > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > > index d9895c779725..fe4c75f09925 100644 > > > > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > > @@ -496,7 +496,7 @@ CameraConfiguration > *PipelineHandlerRPi::generateConfiguration(Camera *camera, > > > > > > > > case StreamRole::VideoRecording: > > > > fmts = > data->isp_[Isp::Output0].dev()->formats(); > > > > - pixelFormat = formats::NV12; > > Here for instance, you could write > > /* > * The colour denoise algorithm require the > analysis > * image, produced by the second ISP output, to be > in > * YUV420 format. Select this format as the > default, to > * maximize chances that it will be picked by > * applications and enable usage of the colour > denoise > * algorithm. 
> */ > > Or something more accurate, as I'm not sure to have picked all the > important information correctly :-) > Sounds correct to me :) I will add a comment with that wording. > > > > > + pixelFormat = formats::YUV420; > > > > size = { 1920, 1080 }; > > > > bufferCount = 4; > > > > outCount++; > > > > @@ -608,6 +608,7 @@ int PipelineHandlerRPi::configure(Camera > *camera, CameraConfiguration *config) > > > > * StreamConfiguration appropriately. > > > > */ > > > > V4L2DeviceFormat format; > > > > + bool output1Set = false; > > > > for (unsigned i = 0; i < config->size(); i++) { > > > > StreamConfiguration &cfg = config->at(i); > > > > > > > > @@ -632,6 +633,9 @@ int PipelineHandlerRPi::configure(Camera > *camera, CameraConfiguration *config) > > > > format.size = cfg.size; > > > > format.fourcc = fourcc; > > > > > > > > + LOG(RPI, Info) << "Setting " << stream->name() << " to > a resolution of " > > > > + << format.toString(); > > > > + > > > > ret = stream->dev()->setFormat(&format); > > > > if (ret) > > > > return -EINVAL; > > > > @@ -645,6 +649,35 @@ int PipelineHandlerRPi::configure(Camera > *camera, CameraConfiguration *config) > > > > > > > > cfg.setStream(stream); > > > > stream->setExternal(true); > > > > + > > > > + if (i != maxIndex) > > > > + output1Set = true; > > > > + } > > > > + > > > > + /* > > > > + * If ISP::Output1 stream has not been requested by the > application, we > > > > + * set it up for internal use now. This second stream will be > used for > > > > + * fast colour denoise, and must be a quarter resolution of > the ISP::Output0 > > > > + * stream. However, also limit the maximum size to 1200 pixels > in the > > > > + * larger dimension, just to avoid being wasteful with buffer > allocations > > > > + * and memory bandwidth. 
> > > > + */ > > > > + if (!output1Set) { > > > > + V4L2DeviceFormat output1Format = format; > > If only the analysis image has to be in YUV420 format, should YUV420 be > selected here in case output 0 uses a different format ? > While this is true, I don't think it makes any practical difference. With the existing code, if the output 0 format is YUV420, output 1 will be as well, and we get colour denoise working. If output 0 format is not YUV420, colour denoise will not work, so the format of output 1 is somewhat irrelevant. However, as per the comment in a previous email, what I really want to happen here is to disable the Output 1 stream if Output 0 is not in YUV420 format. I've added a \todo for that in the above comment. Regards, Naush > > > > > + constexpr unsigned int maxDimensions = 1200; > > > > + const Size limit = Size(maxDimensions, > maxDimensions).boundedToAspectRatio(format.size); > > > > + > > > > + output1Format.size = (format.size / > 2).boundedTo(limit).alignedDownTo(2, 2); > > > > + > > > > + LOG(RPI, Info) << "Setting ISP Output1 (internal) to a > resolution of " > > > > + << output1Format.toString(); > > > > + > > > > + ret = > data->isp_[Isp::Output1].dev()->setFormat(&output1Format); > > > > + if (ret) { > > > > + LOG(RPI, Error) << "Failed to set format on > ISP Output1: " > > > > + << ret; > > > > + return -EINVAL; > > > > + } > > > > } > > > > > > > > /* ISP statistics output format. */ > > -- > Regards, > > Laurent Pinchart >
Hi Naush, On Fri, Feb 05, 2021 at 01:54:25PM +0000, Naushir Patuck wrote: > On Thu, 4 Feb 2021 at 21:57, Laurent Pinchart wrote: > > On Wed, Jan 27, 2021 at 08:14:29AM +0000, Naushir Patuck wrote: > > > On Tue, 26 Jan 2021 at 22:53, Laurent Pinchart wrote: > > > > On Tue, Jan 26, 2021 at 04:24:08PM +0000, Naushir Patuck wrote: > > > > > In preparation for fast colour denoise, set the low resolution ISP > > > > > output stream (Output1) to a 1/4 resolution of the application requested > > > > > stream (Output0). This only happens if the application has not requested > > > > > an additional YUV or RGB stream. > > > > > > > > > > We also constrain this 1/4 resolution to at most 1200 pixels in the > > > > > largest dimension to avoid being too taxing on memory usage and system > > > > > bandwidth. > > > > > > > > > > Also switch the default StreamRole::VideoRecording to YUV420 to allow > > > > > fast colour denoise to run. > > > > > > > > Could you explain why all this is needed ? What happens if an > > > > application requests the second output with a higher resolution that > > > > this, will fast colour denoise still work correctly ? > > > > > > Fast colour denoise essentially works by doing some analysis at a 1:4 or > > > 1:1 resolution image when compared with the output image resolution. It > > > then applies correction on the output image directly based on this > > > analysis. If the application were to require a second output stream that > > > is not 1:4 or 1:1 in resolution ratio, the colour denoise will silently be > > > bypassed. > > > > That's useful information. Maybe this could be captured in a comment in > > the code ? > > Yes, that's a good idea. > > > I also wonder if we shouldn't consider this in the metadata we return. > > When the CDN is silently bypassed, wouldn't it be better if the metadata > > reflected that ? > > Yes, we could consider that. 
The options I would consider are: > 1) Only return the metadata when/if the control is set by the > application indicating what has been applied (i.e. either set or not based > on the requested format). > 2) Return the metadata on each frame indicating the CDN status. > > My preference would be for 1, it matches how the metadata returns when we > set other ISP related params. but I am not entirely against doing 2. What > are your thoughts? I think we should standardize on the second option. The rationale is that an application should be able to easily figure out what exact parameters have been applied to a particular frame. This means that:
- Parameters that can change per-frame need to be returned in metadata at least every time they change
- Parameters that are fixed for the capture session may not need to be returned in every frame

We could minimize the amount of metadata returned to applications to only report diffs, but that means applications would need to accumulate the changes to get the current state, which is a bit of a burden. We could of course provide them with a helper to do so, but I'm also thinking about the issue of compliance testing of pipeline handlers and IPAs; I believe it would be easier to check for compliance if the full state was returned in every request. If there are good arguments for a diff-based metadata API, the same way we have a diff-based control API, I could be convinced otherwise. > > > There is another constraint that this analysis image must be in > > > YUV420 format. Unfortunately, this is a limitation with our > > > implementation that we currently have. > > > > Only the analysis image, or also the main output image ? This would also > > be useful to capture in a comment, so that it doesn't get ignored later > > when the code is reworked. > > Both images have to be YUV420 format. 
> > > > > > Signed-off-by: Naushir Patuck <naush@raspberrypi.com> > > > > > Reviewed-by: David Plowman <david.plowman@raspberrypi.com> > > > > > Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> > > > > > --- > > > > > .../pipeline/raspberrypi/raspberrypi.cpp | 35 ++++++++++++++++++- > > > > > 1 file changed, 34 insertions(+), 1 deletion(-) > > > > > > > > > > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > > > index d9895c779725..fe4c75f09925 100644 > > > > > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > > > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp > > > > > @@ -496,7 +496,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera, > > > > > > > > > > case StreamRole::VideoRecording: > > > > > fmts = data->isp_[Isp::Output0].dev()->formats(); > > > > > - pixelFormat = formats::NV12; > > > > Here for instance, you could write > > > > /* > > * The colour denoise algorithm require the analysis > > * image, produced by the second ISP output, to be in > > * YUV420 format. Select this format as the default, to > > * maximize chances that it will be picked by > > * applications and enable usage of the colour denoise > > * algorithm. > > */ > > > > Or something more accurate, as I'm not sure to have picked all the > > important information correctly :-) > > Sounds correct to me :) I will add a comment with that wording. > > > > > > + pixelFormat = formats::YUV420; > > > > > size = { 1920, 1080 }; > > > > > bufferCount = 4; > > > > > outCount++; > > > > > @@ -608,6 +608,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > > > > > * StreamConfiguration appropriately. 
> > > > > */ > > > > > V4L2DeviceFormat format; > > > > > + bool output1Set = false; > > > > > for (unsigned i = 0; i < config->size(); i++) { > > > > > StreamConfiguration &cfg = config->at(i); > > > > > > > > > > @@ -632,6 +633,9 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > > > > > format.size = cfg.size; > > > > > format.fourcc = fourcc; > > > > > > > > > > + LOG(RPI, Info) << "Setting " << stream->name() << " to a resolution of " > > > > > + << format.toString(); > > > > > + > > > > > ret = stream->dev()->setFormat(&format); > > > > > if (ret) > > > > > return -EINVAL; > > > > > @@ -645,6 +649,35 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config) > > > > > > > > > > cfg.setStream(stream); > > > > > stream->setExternal(true); > > > > > + > > > > > + if (i != maxIndex) > > > > > + output1Set = true; > > > > > + } > > > > > + > > > > > + /* > > > > > + * If ISP::Output1 stream has not been requested by the application, we > > > > > + * set it up for internal use now. This second stream will be used for > > > > > + * fast colour denoise, and must be a quarter resolution of the ISP::Output0 > > > > > + * stream. However, also limit the maximum size to 1200 pixels in the > > > > > + * larger dimension, just to avoid being wasteful with buffer allocations > > > > > + * and memory bandwidth. > > > > > + */ > > > > > + if (!output1Set) { > > > > > + V4L2DeviceFormat output1Format = format; > > > > If only the analysis image has to be in YUV420 format, should YUV420 be > > selected here in case output 0 uses a different format ? > > While this is true, I don't think it makes any practical difference. With > the existing code, if the output 0 format is YUV420, output 1 will be as > well, and we get colour denoise working. If output 1 format is not YUV420, > colour denoise will not work, so the format of output 1 is somewhat > irrelevant. 
However, as per the comment in a previous email, what I really > want to happen here is to disable the Output 1 stream if Output 0 is not in > YUV420 format. I've added a \todo for that in the above comment. > > > > > > + constexpr unsigned int maxDimensions = 1200; > > > > > + const Size limit = Size(maxDimensions, maxDimensions).boundedToAspectRatio(format.size); > > > > > + > > > > > + output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2); > > > > > + > > > > > + LOG(RPI, Info) << "Setting ISP Output1 (internal) to a resolution of " > > > > > + << output1Format.toString(); > > > > > + > > > > > + ret = data->isp_[Isp::Output1].dev()->setFormat(&output1Format); > > > > > + if (ret) { > > > > > + LOG(RPI, Error) << "Failed to set format on ISP Output1: " > > > > > + << ret; > > > > > + return -EINVAL; > > > > > + } > > > > > } > > > > > > > > > > /* ISP statistics output format. */
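Laurent's point about the burden of diff-based metadata can be illustrated with a toy helper: if each frame only reports changed parameters, the application must fold every frame's partial metadata into an accumulated state to know what was actually applied. This is a deliberately generic sketch using a plain map, not libcamera's control API.

```cpp
#include <map>
#include <string>

/*
 * Toy model of diff-based metadata accumulation (illustrative only,
 * not the libcamera control API). Each frame carries only parameters
 * that changed; the application merges them to recover full state.
 */
using Metadata = std::map<std::string, int>;

void accumulate(Metadata &state, const Metadata &frameDiff)
{
	for (const auto &[key, value] : frameDiff)
		state[key] = value; /* Later frames override earlier values. */
}
```

Returning the full state in every frame, as suggested above, removes the need for this kind of helper and makes per-frame compliance checks straightforward, at the cost of more metadata per request.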
diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
index d9895c779725..fe4c75f09925 100644
--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
@@ -496,7 +496,7 @@ CameraConfiguration *PipelineHandlerRPi::generateConfiguration(Camera *camera,
 
 	case StreamRole::VideoRecording:
 		fmts = data->isp_[Isp::Output0].dev()->formats();
-		pixelFormat = formats::NV12;
+		pixelFormat = formats::YUV420;
 		size = { 1920, 1080 };
 		bufferCount = 4;
 		outCount++;
@@ -608,6 +608,7 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
 	 * StreamConfiguration appropriately.
 	 */
 	V4L2DeviceFormat format;
+	bool output1Set = false;
 	for (unsigned i = 0; i < config->size(); i++) {
 		StreamConfiguration &cfg = config->at(i);
 
@@ -632,6 +633,9 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
 		format.size = cfg.size;
 		format.fourcc = fourcc;
 
+		LOG(RPI, Info) << "Setting " << stream->name() << " to a resolution of "
+			       << format.toString();
+
 		ret = stream->dev()->setFormat(&format);
 		if (ret)
 			return -EINVAL;
@@ -645,6 +649,35 @@ int PipelineHandlerRPi::configure(Camera *camera, CameraConfiguration *config)
 
 		cfg.setStream(stream);
 		stream->setExternal(true);
+
+		if (i != maxIndex)
+			output1Set = true;
+	}
+
+	/*
+	 * If ISP::Output1 stream has not been requested by the application, we
+	 * set it up for internal use now. This second stream will be used for
+	 * fast colour denoise, and must be a quarter resolution of the ISP::Output0
+	 * stream. However, also limit the maximum size to 1200 pixels in the
+	 * larger dimension, just to avoid being wasteful with buffer allocations
+	 * and memory bandwidth.
+	 */
+	if (!output1Set) {
+		V4L2DeviceFormat output1Format = format;
+		constexpr unsigned int maxDimensions = 1200;
+		const Size limit = Size(maxDimensions, maxDimensions).boundedToAspectRatio(format.size);
+
+		output1Format.size = (format.size / 2).boundedTo(limit).alignedDownTo(2, 2);
+
+		LOG(RPI, Info) << "Setting ISP Output1 (internal) to a resolution of "
+			       << output1Format.toString();
+
+		ret = data->isp_[Isp::Output1].dev()->setFormat(&output1Format);
+		if (ret) {
+			LOG(RPI, Error) << "Failed to set format on ISP Output1: "
+					<< ret;
+			return -EINVAL;
+		}
 	}
 
 	/* ISP statistics output format. */
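For reference, the internal Output1 sizing in the patch can be sketched without libcamera's Size helpers: halve the Output0 resolution, clamp it to a 1200-pixel box bounded to the same aspect ratio, then align down to even dimensions. This standalone re-implementation assumes the aspect-ratio bound uses simple integer arithmetic; libcamera's exact rounding in boundedToAspectRatio may differ.

```cpp
#include <algorithm>
#include <cstdint>

/*
 * Standalone sketch of the internal Output1 sizing from the patch.
 * The rounding of the aspect-ratio bound is an assumption here, not
 * libcamera's exact Size implementation.
 */
struct Dim {
	uint32_t width;
	uint32_t height;
};

Dim output1Size(const Dim &output0)
{
	constexpr uint32_t maxDimensions = 1200;

	/* Bound a maxDimensions square to output0's aspect ratio. */
	Dim limit;
	if (output0.width >= output0.height) {
		limit.width = maxDimensions;
		limit.height = maxDimensions * output0.height / output0.width;
	} else {
		limit.height = maxDimensions;
		limit.width = maxDimensions * output0.width / output0.height;
	}

	/* Halve, clamp to the limit, and align down to a multiple of 2. */
	Dim out;
	out.width = std::min(output0.width / 2, limit.width) & ~1u;
	out.height = std::min(output0.height / 2, limit.height) & ~1u;
	return out;
}
```

Under these assumptions, a 1920x1080 Output0 yields a 960x540 Output1 (an exact 1:4 area ratio), while a large sensor-sized output such as 4056x3040 is clamped by the 1200-pixel bound, keeping buffer allocations and memory bandwidth in check as the commit message describes.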