[libcamera-devel,v2,02/10] pipeline: raspberrypi: Add a pipeline config structure

Message ID 20221129134534.2933-3-naush@raspberrypi.com
State Superseded
Series
  • Raspberry Pi: Platform configuration and buffer allocation improvements

Commit Message

Naushir Patuck Nov. 29, 2022, 1:45 p.m. UTC
Add a configuration structure to store platform specific parameters used by
the pipeline handler. Currently, these only store Unicam buffer counts,
replacing the hardcoded static values in the source code.

In subsequent commits, more parameters will be added to the configuration
structure, and parameters will be read in through a config file.

Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
---
 .../pipeline/raspberrypi/raspberrypi.cpp      | 51 ++++++++++++++++---
 1 file changed, 44 insertions(+), 7 deletions(-)

Comments

Laurent Pinchart Dec. 2, 2022, 11:42 a.m. UTC | #1
Hi Naush,

Thank you for the patch.

On Tue, Nov 29, 2022 at 01:45:26PM +0000, Naushir Patuck via libcamera-devel wrote:
> Add a configuration structure to store platform specific parameters used by
> the pipeline handler. Currently, these only store Unicam buffer counts,
> replacing the hardcoded static values in the source code.

One question here, does the configuration apply to the pipeline, or to
the camera ? If multiple sensors are connected to the same Unicam
instance (through an external multiplexer), will all these cameras
always have the same configuration, or are there use cases for
per-camera configuration ? Looking at the series, the configuration is
stored per-camera, but is the same for all cameras handled by the
pipeline handler.

The answer to that question matters for the implementation in the
Raspberry Pi pipeline handler, but also in the common helpers in case
other platforms would need per-camera configurations.
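
For illustration only (simplified stand-in types, not part of this
patch), the two possible placements would look roughly like:

struct Config {
	unsigned int minUnicamBuffers;
	unsigned int minTotalUnicamBuffers;
};

/* Shared: one configuration owned by the pipeline handler for all cameras. */
struct PipelineHandlerSketch {
	Config config_;
};

/* Per-camera: one configuration in each camera's data, as this series does. */
struct CameraDataSketch {
	Config config_;
};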

> In subsequent commits, more parameters will be added to the configuration
> structure, and parameters will be read in through a config file.
> 
> Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
> ---
>  .../pipeline/raspberrypi/raspberrypi.cpp      | 51 ++++++++++++++++---
>  1 file changed, 44 insertions(+), 7 deletions(-)
> 
> diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> index 0e0b71945640..4486d31ea78d 100644
> --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> @@ -294,6 +294,21 @@ public:
>  	/* Have internal buffers been allocated? */
>  	bool buffersAllocated_;
>  
> +	struct Config {
> +		/*
> +		 * The minimum number of internal buffers to be allocated for
> +		 * the Unicam Image stream.
> +		 */
> +		unsigned int minUnicamBuffers;
> +		/*
> +		 * The minimum total (internal + external) buffer count used for
> +		 * the Unicam Image steram.

s/steram/stream/

> +		 */
> +		unsigned int minTotalUnicamBuffers;
> +	};
> +
> +	Config config_;
> +
>  private:
>  	void checkRequestCompleted();
>  	void fillRequestMetadata(const ControlList &bufferControls,
> @@ -346,6 +361,7 @@ private:
>  	}
>  
>  	int registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity);
> +	int configurePipelineHandler(RPiCameraData *data);

Wouldn't this be better placed in the RPiCameraData class ? I would also
name the function loadPipelineHandlerConfiguration() (or similar) to
make its purpose clearer.

I wonder if we shouldn't talk about "pipeline configuration" instead
of "pipeline handler configuration", especially if the configuration
is specific to a camera. What does everybody think ?
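
Roughly along the lines of the sketch below (it reuses the Config
struct from this patch; the method name follows the "pipeline
configuration" wording and is only a suggestion, not the actual
follow-up):

int RPiCameraData::loadPipelineConfiguration()
{
	/*
	 * Platform defaults, until the config file support arrives later
	 * in the series.
	 */
	config_ = {
		.minUnicamBuffers = 2,
		.minTotalUnicamBuffers = 4,
	};

	return 0;
}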

>  	int queueAllBuffers(Camera *camera);
>  	int prepareBuffers(Camera *camera);
>  	void mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask);
> @@ -1377,6 +1393,12 @@ int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, Me
>  	streams.insert(&data->isp_[Isp::Output0]);
>  	streams.insert(&data->isp_[Isp::Output1]);
>  
> +	int ret = configurePipelineHandler(data.get());
> +	if (ret) {
> +		LOG(RPI, Error) << "Unable to configure the pipeline handler!";

"Unable to load pipeline handler configuration"

> +		return ret;
> +	}
> +
>  	/* Create and register the camera. */
>  	const std::string &id = data->sensor_->id();
>  	std::shared_ptr<Camera> camera =
> @@ -1389,6 +1411,18 @@ int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, Me
>  	return 0;
>  }
>  
> +int PipelineHandlerRPi::configurePipelineHandler(RPiCameraData *data)
> +{
> +	RPiCameraData::Config &config = data->config_;
> +
> +	config = {
> +		.minUnicamBuffers = 2,
> +		.minTotalUnicamBuffers = 4,
> +	};
> +
> +	return 0;
> +}
> +
>  int PipelineHandlerRPi::queueAllBuffers(Camera *camera)
>  {
>  	RPiCameraData *data = cameraData(camera);
> @@ -1439,25 +1473,28 @@ int PipelineHandlerRPi::prepareBuffers(Camera *camera)
>  	for (auto const stream : data->streams_) {
>  		unsigned int numBuffers;
>  		/*
> -		 * For Unicam, allocate a minimum of 4 buffers as we want
> -		 * to avoid any frame drops.
> +		 * For Unicam, allocate a minimum number of buffers for internal
> +		 * use as we want to avoid any frame drops.
>  		 */
> -		constexpr unsigned int minBuffers = 4;
> +		const unsigned int minBuffers = data->config_.minTotalUnicamBuffers;
>  		if (stream == &data->unicam_[Unicam::Image]) {
>  			/*
>  			 * If an application has configured a RAW stream, allocate
>  			 * additional buffers to make up the minimum, but ensure
> -			 * we have at least 2 sets of internal buffers to use to
> -			 * minimise frame drops.
> +			 * we have at least minUnicamBuffers sets of internal

I'd drop 'sets' here as it's not two buffer sets but two buffers.

> +			 * buffers to use to minimise frame drops.
>  			 */
> -			numBuffers = std::max<int>(2, minBuffers - numRawBuffers);
> +			numBuffers = std::max<int>(data->config_.minUnicamBuffers,
> +						   minBuffers - numRawBuffers);
>  		} else if (stream == &data->isp_[Isp::Input]) {
>  			/*
>  			 * ISP input buffers are imported from Unicam, so follow
>  			 * similar logic as above to count all the RAW buffers
>  			 * available.
>  			 */
> -			numBuffers = numRawBuffers + std::max<int>(2, minBuffers - numRawBuffers);
> +			numBuffers = numRawBuffers +
> +				     std::max<int>(data->config_.minUnicamBuffers,
> +						   minBuffers - numRawBuffers);
>  
>  		} else if (stream == &data->unicam_[Unicam::Embedded]) {
>  			/*
Naushir Patuck Dec. 2, 2022, 1:01 p.m. UTC | #2
Hi Laurent,

Thanks for the feedback!


On Fri, 2 Dec 2022 at 11:42, Laurent Pinchart <laurent.pinchart@ideasonboard.com> wrote:

> Hi Naush,
>
> Thank you for the patch.
>
> On Tue, Nov 29, 2022 at 01:45:26PM +0000, Naushir Patuck via libcamera-devel wrote:
> > Add a configuration structure to store platform specific parameters used by
> > the pipeline handler. Currently, these only store Unicam buffer counts,
> > replacing the hardcoded static values in the source code.
>
> One question here, does the configuration apply to the pipeline, or to
> the camera ? If multiple sensors are connected to the same Unicam
> instance (through an external multiplexers), will all these cameras
> always have the same configuration, or are there use cases for
> per-camera configuration ? Looking at the series, the configuration is
> stored per-camera, but is the same for all cameras handled by the
> pipeline handler.
>

The intention is to allow us to run alternate configurations per-camera.
With this series, raspberrypi/default.json is loaded, unless the env
variable LIBCAMERA_RPI_CONFIG_FILE is set for the process.  Admittedly
this is not the neatest way of doing things.  We did always talk about a
mechanism for the application to set the IPA tuning file via some API.
Whatever that API may be, I expect the pipeline handler config to follow
a similar mechanism.
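
As a rough sketch of that lookup (simplified; the default path below is
only a placeholder and the actual file resolution in the series may
differ):

#include <cstdlib>
#include <string>

static std::string pipelineConfigFile()
{
	/* Per-process override via the environment, as described above. */
	const char *env = std::getenv("LIBCAMERA_RPI_CONFIG_FILE");
	if (env && *env)
		return env;

	/* Otherwise fall back to the platform default. */
	return "raspberrypi/default.json";
}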


>
> The answer to that question matters for the implementation in the
> Raspberry Pi pipeline handler, but also in the common helpers in case
> other platforms would need per-camera configurations.
>

Without any sort of override mechanism, this is indeed a problem.


>
> > In subsequent commits, more parameters will be added to the configuration
> > structure, and parameters will be read in through a config file.
> >
> > Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
> > Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
> > ---
> >  .../pipeline/raspberrypi/raspberrypi.cpp      | 51 ++++++++++++++++---
> >  1 file changed, 44 insertions(+), 7 deletions(-)
> >
> > diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > index 0e0b71945640..4486d31ea78d 100644
> > --- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > +++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
> > @@ -294,6 +294,21 @@ public:
> >       /* Have internal buffers been allocated? */
> >       bool buffersAllocated_;
> >
> > +     struct Config {
> > +             /*
> > +              * The minimum number of internal buffers to be allocated for
> > +              * the Unicam Image stream.
> > +              */
> > +             unsigned int minUnicamBuffers;
> > +             /*
> > +              * The minimum total (internal + external) buffer count used for
> > +              * the Unicam Image steram.
>
> s/steram/stream/
>
> > +              */
> > +             unsigned int minTotalUnicamBuffers;
> > +     };
> > +
> > +     Config config_;
> > +
> >  private:
> >       void checkRequestCompleted();
> >       void fillRequestMetadata(const ControlList &bufferControls,
> > @@ -346,6 +361,7 @@ private:
> >       }
> >
> >       int registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity);
> > +     int configurePipelineHandler(RPiCameraData *data);
>
> Wouldn't this be better placed in the RPiCameraData class ? I would also
> name the function loadPipelineHandlerConfiguration() (or similar) to
> make its purpose clearer.
>

Ack


>
> I wonder if we shouldn't talk about "pipeline configuration" instead
> of "pipeline handler configuration", especially if the
> configuration is specific to a camera. What does everybody think ?
>

Yes, I agree with that.  Pipeline configuration does sound more appropriate.

Regards,
Naush

Laurent Pinchart Dec. 2, 2022, 3:59 p.m. UTC | #3
Hi Naush,

On Fri, Dec 02, 2022 at 01:01:46PM +0000, Naushir Patuck wrote:
> On Fri, 2 Dec 2022 at 11:42, Laurent Pinchart wrote:
> > On Tue, Nov 29, 2022 at 01:45:26PM +0000, Naushir Patuck via libcamera-devel wrote:
> > > Add a configuration structure to store platform specific parameters used by
> > > the pipeline handler. Currently, these only store Unicam buffer counts,
> > > replacing the hardcoded static values in the source code.
> >
> > One question here, does the configuration apply to the pipeline, or to
> > the camera ? If multiple sensors are connected to the same Unicam
> > instance (through an external multiplexers), will all these cameras
> > always have the same configuration, or are there use cases for
> > per-camera configuration ? Looking at the series, the configuration is
> > stored per-camera, but is the same for all cameras handled by the
> > pipeline handler.
> 
> The intention is to allow us to run alternate configurations per-camera.
> With this series, raspberrypi/default.json is loaded, unless the env variable
> LIBCAMERA_RPI_CONFIG_FILE is set for the process.  Admittedly this
> is not the neatest way of doing things.  We did always talk about a mechanism
> for the application to set the IPA tuning file via some API.  Whatever that API
> may be, I expect the pipeline handler config to follow a similar mechanism.

OK, we can leave it as-is, implementing it per-camera in the pipeline
handler, but limiting the mechanism to a configuration file that will be
the same for all cameras for now.

Patch

diff --git a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
index 0e0b71945640..4486d31ea78d 100644
--- a/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
+++ b/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp
@@ -294,6 +294,21 @@  public:
 	/* Have internal buffers been allocated? */
 	bool buffersAllocated_;
 
+	struct Config {
+		/*
+		 * The minimum number of internal buffers to be allocated for
+		 * the Unicam Image stream.
+		 */
+		unsigned int minUnicamBuffers;
+		/*
+		 * The minimum total (internal + external) buffer count used for
+		 * the Unicam Image steram.
+		 */
+		unsigned int minTotalUnicamBuffers;
+	};
+
+	Config config_;
+
 private:
 	void checkRequestCompleted();
 	void fillRequestMetadata(const ControlList &bufferControls,
@@ -346,6 +361,7 @@  private:
 	}
 
 	int registerCamera(MediaDevice *unicam, MediaDevice *isp, MediaEntity *sensorEntity);
+	int configurePipelineHandler(RPiCameraData *data);
 	int queueAllBuffers(Camera *camera);
 	int prepareBuffers(Camera *camera);
 	void mapBuffers(Camera *camera, const RPi::BufferMap &buffers, unsigned int mask);
@@ -1377,6 +1393,12 @@  int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, Me
 	streams.insert(&data->isp_[Isp::Output0]);
 	streams.insert(&data->isp_[Isp::Output1]);
 
+	int ret = configurePipelineHandler(data.get());
+	if (ret) {
+		LOG(RPI, Error) << "Unable to configure the pipeline handler!";
+		return ret;
+	}
+
 	/* Create and register the camera. */
 	const std::string &id = data->sensor_->id();
 	std::shared_ptr<Camera> camera =
@@ -1389,6 +1411,18 @@  int PipelineHandlerRPi::registerCamera(MediaDevice *unicam, MediaDevice *isp, Me
 	return 0;
 }
 
+int PipelineHandlerRPi::configurePipelineHandler(RPiCameraData *data)
+{
+	RPiCameraData::Config &config = data->config_;
+
+	config = {
+		.minUnicamBuffers = 2,
+		.minTotalUnicamBuffers = 4,
+	};
+
+	return 0;
+}
+
 int PipelineHandlerRPi::queueAllBuffers(Camera *camera)
 {
 	RPiCameraData *data = cameraData(camera);
@@ -1439,25 +1473,28 @@  int PipelineHandlerRPi::prepareBuffers(Camera *camera)
 	for (auto const stream : data->streams_) {
 		unsigned int numBuffers;
 		/*
-		 * For Unicam, allocate a minimum of 4 buffers as we want
-		 * to avoid any frame drops.
+		 * For Unicam, allocate a minimum number of buffers for internal
+		 * use as we want to avoid any frame drops.
 		 */
-		constexpr unsigned int minBuffers = 4;
+		const unsigned int minBuffers = data->config_.minTotalUnicamBuffers;
 		if (stream == &data->unicam_[Unicam::Image]) {
 			/*
 			 * If an application has configured a RAW stream, allocate
 			 * additional buffers to make up the minimum, but ensure
-			 * we have at least 2 sets of internal buffers to use to
-			 * minimise frame drops.
+			 * we have at least minUnicamBuffers sets of internal
+			 * buffers to use to minimise frame drops.
 			 */
-			numBuffers = std::max<int>(2, minBuffers - numRawBuffers);
+			numBuffers = std::max<int>(data->config_.minUnicamBuffers,
+						   minBuffers - numRawBuffers);
 		} else if (stream == &data->isp_[Isp::Input]) {
 			/*
 			 * ISP input buffers are imported from Unicam, so follow
 			 * similar logic as above to count all the RAW buffers
 			 * available.
 			 */
-			numBuffers = numRawBuffers + std::max<int>(2, minBuffers - numRawBuffers);
+			numBuffers = numRawBuffers +
+				     std::max<int>(data->config_.minUnicamBuffers,
+						   minBuffers - numRawBuffers);
 
 		} else if (stream == &data->unicam_[Unicam::Embedded]) {
 			/*
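
A worked example of the allocation logic above, using the defaults from
configurePipelineHandler() (minUnicamBuffers = 2, minTotalUnicamBuffers = 4):
with an application-configured RAW stream providing numRawBuffers = 1, the
Unicam Image stream allocates max(2, 4 - 1) = 3 internal buffers and the ISP
Input stream counts 1 + max(2, 4 - 1) = 4 buffers; with numRawBuffers = 4,
the Unicam Image stream falls back to the minimum of 2 internal buffers and
the ISP Input counts 4 + 2 = 6.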