[libcamera-devel,1/3] android: capabilities: Use a throw-away config for YUV stream building

Message ID 20210715153241.63971-2-jacopo@jmondi.org
State Accepted
Series
  • android: Drive-by cleanups

Commit Message

Jacopo Mondi July 15, 2021, 3:32 p.m. UTC
When building the list of supported YUV streams in getYUVResolutions(),
the CameraConfiguration provided by the caller as the first parameter is
used.

As that CameraConfiguration will later be applied to the Camera, avoid
any possible overlap of configuration parameters by using a throw-away
CameraConfiguration, generated for the Viewfinder stream role, in
getYUVResolutions().

It's also nicer to avoid giving two functions with a similar purpose,
getYUVResolutions() and getRawResolutions(), different parameter lists,
as the presence of a CameraConfiguration as the first parameter might
confuse the reader.

Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
---
 src/android/camera_capabilities.cpp | 10 +++++-----
 src/android/camera_capabilities.h   |  3 +--
 2 files changed, 6 insertions(+), 7 deletions(-)
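
For context, a minimal sketch of how getYUVResolutions() reads after this
change. The body of the per-resolution loop is not shown in the hunks below;
the validation step is an assumption based on the throw-away configuration
approach the commit message describes, and the snippet assumes the includes
and libcamera using-declarations already present in camera_capabilities.cpp.

/*
 * Illustrative sketch only, not part of the patch: getYUVResolutions()
 * after the change, with an assumed loop body that validates each
 * candidate size against the throw-away configuration.
 */
std::vector<Size> CameraCapabilities::getYUVResolutions(const PixelFormat &pixelFormat,
							const std::vector<Size> &resolutions)
{
	std::vector<Size> supportedResolutions;

	/* Throw-away configuration, never applied to the camera. */
	std::unique_ptr<CameraConfiguration> cameraConfig =
		camera_->generateConfiguration({ StreamRole::Viewfinder });
	StreamConfiguration &cfg = cameraConfig->at(0);

	for (const Size &res : resolutions) {
		cfg.pixelFormat = pixelFormat;
		cfg.size = res;

		/*
		 * Only sizes the pipeline accepts without adjustment are
		 * reported as supported.
		 */
		if (cameraConfig->validate() != CameraConfiguration::Valid)
			continue;

		supportedResolutions.push_back(res);
	}

	return supportedResolutions;
}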

Comments

Laurent Pinchart July 23, 2021, 1:33 p.m. UTC | #1
Hi Jacopo,

Thank you for the patch.

On Thu, Jul 15, 2021 at 05:32:39PM +0200, Jacopo Mondi wrote:
> When building the list of supported YUV streams in getYUVResolutions(),
> the CameraConfiguration provided by the caller as the first parameter is
> used.
> 
> As that CameraConfiguration will later be applied to the Camera, avoid
> any possible overlap of configuration parameters by using a throw-away
> CameraConfiguration, generated for the Viewfinder stream role, in
> getYUVResolutions().
> 
> It's also nicer to avoid giving two functions with a similar purpose,
> getYUVResolutions() and getRawResolutions(), different parameter lists,
> as the presence of a CameraConfiguration as the first parameter might
> confuse the reader.
> 
> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>

Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>

Umang Jain July 23, 2021, 1:36 p.m. UTC | #2
Hi Jacopo,

Thanks for the patch

On 7/15/21 9:02 PM, Jacopo Mondi wrote:
> When building the list of supported YUV streams in getYUVResolutions(),
> the CameraConfiguration provided by the caller as the first parameter is
> used.
>
> As that CameraConfiguration will later be applied to the Camera, avoid
> any possible overlap of configuration parameters by using a throw-away
> CameraConfiguration, generated for the Viewfinder stream role, in
> getYUVResolutions().
>
> It's also nicer to avoid giving two functions with a similar purpose,
> getYUVResolutions() and getRawResolutions(), different parameter lists,
> as the presence of a CameraConfiguration as the first parameter might
> confuse the reader.
>
> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>

Patch

diff --git a/src/android/camera_capabilities.cpp b/src/android/camera_capabilities.cpp
index 6b5edb66fad2..6543c2bbc6c3 100644
--- a/src/android/camera_capabilities.cpp
+++ b/src/android/camera_capabilities.cpp
@@ -138,13 +138,14 @@  int CameraCapabilities::initialize(std::shared_ptr<libcamera::Camera> camera,
 	return initializeStaticMetadata();
 }
 
-std::vector<Size> CameraCapabilities::getYUVResolutions(CameraConfiguration *cameraConfig,
-							const PixelFormat &pixelFormat,
+std::vector<Size> CameraCapabilities::getYUVResolutions(const PixelFormat &pixelFormat,
 							const std::vector<Size> &resolutions)
 {
 	std::vector<Size> supportedResolutions;
-
+	std::unique_ptr<CameraConfiguration> cameraConfig =
+		camera_->generateConfiguration({ StreamRole::Viewfinder });
 	StreamConfiguration &cfg = cameraConfig->at(0);
+
 	for (const Size &res : resolutions) {
 		cfg.pixelFormat = pixelFormat;
 		cfg.size = res;
@@ -324,8 +325,7 @@  int CameraCapabilities::initializeStreamConfigurations()
 		if (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW)
 			resolutions = getRawResolutions(mappedFormat);
 		else
-			resolutions = getYUVResolutions(cameraConfig.get(),
-							mappedFormat,
+			resolutions = getYUVResolutions(mappedFormat,
 							cameraResolutions);
 
 		for (const Size &res : resolutions) {
diff --git a/src/android/camera_capabilities.h b/src/android/camera_capabilities.h
index 4f5be82595d6..e72bf084cd65 100644
--- a/src/android/camera_capabilities.h
+++ b/src/android/camera_capabilities.h
@@ -43,8 +43,7 @@  private:
 	};
 
 	std::vector<libcamera::Size>
-	getYUVResolutions(libcamera::CameraConfiguration *cameraConfig,
-			  const libcamera::PixelFormat &pixelFormat,
+	getYUVResolutions(const libcamera::PixelFormat &pixelFormat,
 			  const std::vector<libcamera::Size> &resolutions);
 	std::vector<libcamera::Size>
 	getRawResolutions(const libcamera::PixelFormat &pixelFormat);
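
To illustrate the symmetry argument in the commit message, a minimal sketch
of the getRawResolutions() counterpart follows. Its body is an assumption
and not part of this patch: it presumably builds its own throw-away
configuration for the Raw stream role and reports the sizes the stream
formats advertise for the given pixel format.

/*
 * Assumed shape of getRawResolutions(), shown only to illustrate the
 * now-symmetric parameter lists; not part of this patch.
 */
std::vector<Size> CameraCapabilities::getRawResolutions(const PixelFormat &pixelFormat)
{
	/* Throw-away configuration for the Raw stream role. */
	std::unique_ptr<CameraConfiguration> cameraConfig =
		camera_->generateConfiguration({ StreamRole::Raw });
	StreamConfiguration &cfg = cameraConfig->at(0);
	const StreamFormats &streamFormats = cfg.formats();

	/* RAW sizes come straight from the stream formats. */
	return streamFormats.sizes(pixelFormat);
}

With the CameraConfiguration parameter dropped, both helpers take only
format-related arguments and own their configuration internally.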