[v15,8/8] libcamera: simple: Make raw streams work

Message ID 20251104153501.34362-9-mzamazal@redhat.com
State New
Series
  • Enable raw streams with software ISP

Commit Message

Milan Zamazal Nov. 4, 2025, 3:34 p.m. UTC
When a raw stream is requested, whether alone or together with a
processed stream, its buffers must be handled outside the software ISP
machinery.  They serve as output buffers, even when a processed stream
is produced.

At most one raw stream and at most one processed stream are supported
and can be combined.  An example of producing both raw and processed
files using the `cam' application:

  cam -c1 -C100 -Ffile# \
    -s role=viewfinder,width=1920,height=1080,pixelformat=RGB888 \
    -s role=raw,width=3280,height=2464,pixelformat=SRGGB8 \

Note the difference between the viewfinder and raw stream sizes:
debayering requires enlarging the image width, which forces selecting a
larger sensor resolution in this case.

To track whether a raw stream is requested and which one it is, a
SimpleCameraData::rawStream_ member variable is introduced.

This is the final step to make raw streams work.

Signed-off-by: Milan Zamazal <mzamazal@redhat.com>
---
 src/libcamera/pipeline/simple/simple.cpp | 41 +++++++++++++++++-------
 1 file changed, 29 insertions(+), 12 deletions(-)
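
For illustration, a minimal application-side sketch of requesting the same
raw + viewfinder combination through the libcamera public API (roughly what
the `cam' invocation above does; error handling is trimmed, and the helper
name and structure are made up for this example):

  #include <memory>

  #include <libcamera/libcamera.h>

  using namespace libcamera;

  /* Hypothetical helper mirroring the cam command line above. */
  std::unique_ptr<CameraConfiguration>
  configureRawAndViewfinder(const std::shared_ptr<Camera> &camera)
  {
  	/* One processed stream plus one raw stream, as supported here. */
  	std::unique_ptr<CameraConfiguration> config =
  		camera->generateConfiguration({ StreamRole::Viewfinder,
  						StreamRole::Raw });
  	if (!config || config->size() != 2)
  		return nullptr;

  	StreamConfiguration &viewfinder = config->at(0);
  	viewfinder.pixelFormat = formats::RGB888;
  	viewfinder.size = Size(1920, 1080);

  	StreamConfiguration &raw = config->at(1);
  	raw.pixelFormat = formats::SRGGB8;
  	raw.size = Size(3280, 2464);

  	/* The pipeline may adjust the sizes, e.g. pick a larger sensor mode. */
  	if (config->validate() == CameraConfiguration::Invalid)
  		return nullptr;

  	if (camera->configure(config.get()) < 0)
  		return nullptr;

  	return config;
  }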

Comments

Umang Jain Nov. 24, 2025, 3:41 p.m. UTC | #1
Hi Milan,

Apologies for taking so long on this; I had been travelling the last
few weeks and didn't have the proper setup to test this patch.

This series works as expected; however, I do want to discuss one of
the points below:

On 2025-11-04 21:04, Milan Zamazal wrote:
> When a raw stream is requested, whether alone or together with a
> processed stream, its buffers must be handled outside the software ISP
> machinery.  They serve as output buffers, even when a processed stream
> is produced.
> 
> At most one raw stream and at most one processed stream are supported
> and can be combined.  An example of producing both raw and processed
> files using the `cam' application:
> 
>   cam -c1 -C100 -Ffile# \
>     -s role=viewfinder,width=1920,height=1080,pixelformat=RGB888 \
>     -s role=raw,width=3280,height=2464,pixelformat=SRGGB8 \
> 
> Note the difference between the viewfinder and raw stream sizes:
> debayering requires enlarging the image width, which forces selecting a
> larger sensor resolution in this case.
> 
> To track whether a raw stream is requested and which one it is, a
> SimpleCameraData::rawStream_ member variable is introduced.
> 
> This is the final step to make raw streams work.
> 
> Signed-off-by: Milan Zamazal <mzamazal@redhat.com>
> ---
>  src/libcamera/pipeline/simple/simple.cpp | 41 +++++++++++++++++-------
>  1 file changed, 29 insertions(+), 12 deletions(-)
> 
> diff --git a/src/libcamera/pipeline/simple/simple.cpp b/src/libcamera/pipeline/simple/simple.cpp
> index cb0b24f38..dd32b70f1 100644
> --- a/src/libcamera/pipeline/simple/simple.cpp
> +++ b/src/libcamera/pipeline/simple/simple.cpp
> @@ -333,6 +333,7 @@ public:
>  	};
>  
>  	std::vector<Stream> streams_;
> +	Stream *rawStream_;
>  
>  	/*
>  	 * All entities in the pipeline, from the camera sensor to the video
> @@ -467,7 +468,7 @@ private:
>  SimpleCameraData::SimpleCameraData(SimplePipelineHandler *pipe,
>  				   unsigned int numStreams,
>  				   MediaEntity *sensor)
> -	: Camera::Private(pipe), streams_(numStreams)
> +	: Camera::Private(pipe), streams_(numStreams), rawStream_(nullptr)
>  {
>  	/*
>  	 * Find the shortest path from the camera sensor to a video capture
> @@ -878,10 +879,13 @@ void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
>  	 * point converting an erroneous buffer.
>  	 */
>  	if (buffer->metadata().status != FrameMetadata::FrameSuccess) {
> -		if (!useConversion_) {
> +		if (!useConversion_ || rawStream_) {
>  			/* No conversion, just complete the request. */
>  			Request *request = buffer->request();
>  			pipe->completeBuffer(request, buffer);
> +			SimpleFrameInfo *info = frameInfo_.find(request->sequence());
> +			if (info)
> +				info->metadataRequired = false;
>  			tryCompleteRequest(request);
>  			return;
>  		}
> @@ -940,7 +944,8 @@ void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
>  	 */
>  	if (useConversion_) {
>  		if (conversionQueue_.empty()) {
> -			video_->queueBuffer(buffer);
> +			if (!rawStream_)
> +				video_->queueBuffer(buffer);
>  			return;
>  		}
>  
> @@ -956,6 +961,8 @@ void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
>  					     conversionQueue_.front().outputs);
>  
>  		conversionQueue_.pop();
> +		if (rawStream_)
> +			pipe->completeBuffer(request, buffer);

Context: we are in the case where `useConversion_ = true`, i.e. 1 RAW +
1 processed stream, and the RAW buffer is queued to the swISP/converter.

I am not sure which is the better place to complete the RAW buffer in
this case. Should we complete it here, as you have done, or rather in
conversionInputDone() as in [1]? I think the interaction between
completeBuffer() and swIsp_->inputBufferReady is not clear because it
probably depends on the case, but with your implementation I see a risk
of the client application having access to the completed RAW buffer
*while* it is still an input for the swISP/converter. This makes me
slightly inclined towards [1]. Do you have any opinions?

[1]:
https://gitlab.freedesktop.org/uajain/libcamera/-/commit/d1749c4f08f
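
For comparison, a minimal sketch of that alternative: complete the raw
buffer only once the swISP/converter has released it as its input.  It
reuses the names from the simple pipeline handler, but it is not the code
from [1], and the pipe() accessor is assumed to be usable here:

  void SimpleCameraData::conversionInputDone(FrameBuffer *buffer)
  {
  	if (rawStream_) {
  		/*
  		 * The raw buffer belongs to the application; hand it back
  		 * only after the swISP/converter has stopped reading from
  		 * it.  The request itself completes later, through the
  		 * conversion output path.
  		 */
  		pipe()->completeBuffer(buffer->request(), buffer);
  		return;
  	}

  	/* Internal buffer: queue it back for capture. */
  	video_->queueBuffer(buffer);
  }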

>  		return;
>  	}
>  
> @@ -993,7 +1000,8 @@ void SimpleCameraData::tryCompleteRequest(Request *request)
>  void SimpleCameraData::conversionInputDone(FrameBuffer *buffer)
>  {
>  	/* Queue the input buffer back for capture. */
> -	video_->queueBuffer(buffer);
> +	if (!rawStream_)
> +		video_->queueBuffer(buffer);
>  }
>  
>  void SimpleCameraData::conversionOutputDone(FrameBuffer *buffer)
> @@ -1518,13 +1526,18 @@ int SimplePipelineHandler::configure(Camera *camera, CameraConfiguration *c)
>  	std::vector<std::reference_wrapper<StreamConfiguration>> outputCfgs;
>  	data->useConversion_ = config->needConversion();
>  
> +	data->rawStream_ = nullptr;
>  	for (unsigned int i = 0; i < config->size(); ++i) {
>  		StreamConfiguration &cfg = config->at(i);
> +		bool rawStream = isRaw(cfg);
>  
>  		cfg.setStream(&data->streams_[i]);
>  
> -		if (data->useConversion_ && !isRaw(cfg))
> +		if (data->useConversion_ && !rawStream)
>  			outputCfgs.push_back(cfg);
> +
> +		if (rawStream)
> +			data->rawStream_ = &data->streams_[i];
>  	}
>  
>  	if (outputCfgs.empty())
> @@ -1555,7 +1568,7 @@ int SimplePipelineHandler::exportFrameBuffers(Camera *camera, Stream *stream,
>  	 * Export buffers on the converter or capture video node, depending on
>  	 * whether the converter is used or not.
>  	 */
> -	if (data->useConversion_)
> +	if (data->useConversion_ && stream != data->rawStream_)
>  		return data->converter_
>  			       ? data->converter_->exportBuffers(stream, count, buffers)
>  			       : data->swIsp_->exportBuffers(stream, count, buffers);
> @@ -1578,7 +1591,7 @@ int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
>  		return -EBUSY;
>  	}
>  
> -	if (data->useConversion_) {
> +	if (data->useConversion_ && !data->rawStream_) {
>  		/*
>  		 * When using the converter allocate a fixed number of internal
>  		 * buffers.
> @@ -1586,8 +1599,11 @@ int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
>  		ret = video->allocateBuffers(kNumInternalBuffers,
>  					     &data->conversionBuffers_);
>  	} else {
> -		/* Otherwise, prepare for using buffers from the only stream. */
> -		Stream *stream = &data->streams_[0];
> +		/*
> +		 * Otherwise, prepare for using buffers from either the raw stream, if
> +		 * requested, or the only stream configured.
> +		 */
> +		Stream *stream = (data->rawStream_ ? data->rawStream_ : &data->streams_[0]);
>  		ret = video->importBuffers(stream->configuration().bufferCount);
>  	}
>  	if (ret < 0) {
> @@ -1628,8 +1644,9 @@ int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
>  		}
>  
>  		/* Queue all internal buffers for capture. */
> -		for (std::unique_ptr<FrameBuffer> &buffer : data->conversionBuffers_)
> -			video->queueBuffer(buffer.get());
> +		if (!data->rawStream_)
> +			for (std::unique_ptr<FrameBuffer> &buffer : data->conversionBuffers_)
> +				video->queueBuffer(buffer.get());
>  	}
>  
>  	return 0;
> @@ -1680,7 +1697,7 @@ int SimplePipelineHandler::queueRequestDevice(Camera *camera, Request *request)
>  		 * queue, it will be handed to the converter in the capture
>  		 * completion handler.
>  		 */
> -		if (data->useConversion_) {
> +		if (data->useConversion_ && stream != data->rawStream_) {
>  			buffers.emplace(stream, buffer);
>  			metadataRequired = !!data->swIsp_;
>  		} else {
Milan Zamazal Nov. 24, 2025, 9:15 p.m. UTC | #2
Hi Umang,

Umang Jain <uajain@igalia.com> writes:

> Hi Milan,
>
> Apologies for taking so long on this; I had been travelling the last
> few weeks and didn't have the proper setup to test this patch.
>
> This series works as expected; however, I do want to discuss one of
> the points below:
>
> On 2025-11-04 21:04, Milan Zamazal wrote:
>> When a raw stream is requested, whether alone or together with a
>> processed stream, its buffers must be handled outside the software ISP
>> machinery.  They serve as output buffers, even when a processed stream
>> is produced.
>> 
>> At most one raw stream and at most one processed stream are supported
>> and can be combined.  An example of producing both raw and processed
>> files using the `cam' application:
>> 
>>   cam -c1 -C100 -Ffile# \
>>     -s role=viewfinder,width=1920,height=1080,pixelformat=RGB888 \
>>     -s role=raw,width=3280,height=2464,pixelformat=SRGGB8 \
>> 
>> Note the difference between the viewfinder and raw stream sizes:
>> debayering requires enlarging the image width, which forces selecting a
>> larger sensor resolution in this case.
>> 
>> To track whether a raw stream is requested and which one it is, a
>> SimpleCameraData::rawStream_ member variable is introduced.
>> 
>> This is the final step to make raw streams work.
>> 
>> Signed-off-by: Milan Zamazal <mzamazal@redhat.com>
>> ---
>>  src/libcamera/pipeline/simple/simple.cpp | 41 +++++++++++++++++-------
>>  1 file changed, 29 insertions(+), 12 deletions(-)
>> 
>> diff --git a/src/libcamera/pipeline/simple/simple.cpp b/src/libcamera/pipeline/simple/simple.cpp
>> index cb0b24f38..dd32b70f1 100644
>> --- a/src/libcamera/pipeline/simple/simple.cpp
>> +++ b/src/libcamera/pipeline/simple/simple.cpp
>> @@ -333,6 +333,7 @@ public:
>>  	};
>>  
>>  	std::vector<Stream> streams_;
>> +	Stream *rawStream_;
>>  
>>  	/*
>>  	 * All entities in the pipeline, from the camera sensor to the video
>> @@ -467,7 +468,7 @@ private:
>>  SimpleCameraData::SimpleCameraData(SimplePipelineHandler *pipe,
>>  				   unsigned int numStreams,
>>  				   MediaEntity *sensor)
>> -	: Camera::Private(pipe), streams_(numStreams)
>> +	: Camera::Private(pipe), streams_(numStreams), rawStream_(nullptr)
>>  {
>>  	/*
>>  	 * Find the shortest path from the camera sensor to a video capture
>> @@ -878,10 +879,13 @@ void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
>>  	 * point converting an erroneous buffer.
>>  	 */
>>  	if (buffer->metadata().status != FrameMetadata::FrameSuccess) {
>> -		if (!useConversion_) {
>> +		if (!useConversion_ || rawStream_) {
>>  			/* No conversion, just complete the request. */
>>  			Request *request = buffer->request();
>>  			pipe->completeBuffer(request, buffer);
>> +			SimpleFrameInfo *info = frameInfo_.find(request->sequence());
>> +			if (info)
>> +				info->metadataRequired = false;
>>  			tryCompleteRequest(request);
>>  			return;
>>  		}
>> @@ -940,7 +944,8 @@ void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
>>  	 */
>>  	if (useConversion_) {
>>  		if (conversionQueue_.empty()) {
>> -			video_->queueBuffer(buffer);
>> +			if (!rawStream_)
>> +				video_->queueBuffer(buffer);
>>  			return;
>>  		}
>>  
>> @@ -956,6 +961,8 @@ void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
>>  					     conversionQueue_.front().outputs);
>>  
>>  		conversionQueue_.pop();
>> +		if (rawStream_)
>> +			pipe->completeBuffer(request, buffer);
>
> Context: we are in the case where `useConversion_ = true`, i.e. 1 RAW +
> 1 processed stream, and the RAW buffer is queued to the swISP/converter.
>
> I am not sure which is the better place to complete the RAW buffer in
> this case. Should we complete it here, as you have done, or rather in
> conversionInputDone() as in [1]? I think the interaction between
> completeBuffer() and swIsp_->inputBufferReady is not clear because it
> probably depends on the case, but with your implementation I see a risk
> of the client application having access to the completed RAW buffer
> *while* it is still an input for the swISP/converter. This makes me
> slightly inclined towards [1]. Do you have any opinions?

Good point.  Yes, [1] is a better solution for this.  I'll re-check the
other differences with this context and update the patch series one way
or the other.

> [1]:
> https://gitlab.freedesktop.org/uajain/libcamera/-/commit/d1749c4f08f
>
>>  		return;
>>  	}
>>  
>> @@ -993,7 +1000,8 @@ void SimpleCameraData::tryCompleteRequest(Request *request)
>>  void SimpleCameraData::conversionInputDone(FrameBuffer *buffer)
>>  {
>>  	/* Queue the input buffer back for capture. */
>> -	video_->queueBuffer(buffer);
>> +	if (!rawStream_)
>> +		video_->queueBuffer(buffer);
>>  }
>>  
>>  void SimpleCameraData::conversionOutputDone(FrameBuffer *buffer)
>> @@ -1518,13 +1526,18 @@ int SimplePipelineHandler::configure(Camera *camera, CameraConfiguration *c)
>>  	std::vector<std::reference_wrapper<StreamConfiguration>> outputCfgs;
>>  	data->useConversion_ = config->needConversion();
>>  
>> +	data->rawStream_ = nullptr;
>>  	for (unsigned int i = 0; i < config->size(); ++i) {
>>  		StreamConfiguration &cfg = config->at(i);
>> +		bool rawStream = isRaw(cfg);
>>  
>>  		cfg.setStream(&data->streams_[i]);
>>  
>> -		if (data->useConversion_ && !isRaw(cfg))
>> +		if (data->useConversion_ && !rawStream)
>>  			outputCfgs.push_back(cfg);
>> +
>> +		if (rawStream)
>> +			data->rawStream_ = &data->streams_[i];
>>  	}
>>  
>>  	if (outputCfgs.empty())
>> @@ -1555,7 +1568,7 @@ int SimplePipelineHandler::exportFrameBuffers(Camera *camera, Stream *stream,
>>  	 * Export buffers on the converter or capture video node, depending on
>>  	 * whether the converter is used or not.
>>  	 */
>> -	if (data->useConversion_)
>> +	if (data->useConversion_ && stream != data->rawStream_)
>>  		return data->converter_
>>  			       ? data->converter_->exportBuffers(stream, count, buffers)
>>  			       : data->swIsp_->exportBuffers(stream, count, buffers);
>> @@ -1578,7 +1591,7 @@ int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
>>  		return -EBUSY;
>>  	}
>>  
>> -	if (data->useConversion_) {
>> +	if (data->useConversion_ && !data->rawStream_) {
>>  		/*
>>  		 * When using the converter allocate a fixed number of internal
>>  		 * buffers.
>> @@ -1586,8 +1599,11 @@ int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
>>  		ret = video->allocateBuffers(kNumInternalBuffers,
>>  					     &data->conversionBuffers_);
>>  	} else {
>> -		/* Otherwise, prepare for using buffers from the only stream. */
>> -		Stream *stream = &data->streams_[0];
>> +		/*
>> +		 * Otherwise, prepare for using buffers from either the raw stream, if
>> +		 * requested, or the only stream configured.
>> +		 */
>> +		Stream *stream = (data->rawStream_ ? data->rawStream_ : &data->streams_[0]);
>>  		ret = video->importBuffers(stream->configuration().bufferCount);
>>  	}
>>  	if (ret < 0) {
>> @@ -1628,8 +1644,9 @@ int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
>>  		}
>>  
>>  		/* Queue all internal buffers for capture. */
>> -		for (std::unique_ptr<FrameBuffer> &buffer : data->conversionBuffers_)
>> -			video->queueBuffer(buffer.get());
>> +		if (!data->rawStream_)
>> +			for (std::unique_ptr<FrameBuffer> &buffer : data->conversionBuffers_)
>> +				video->queueBuffer(buffer.get());
>>  	}
>>  
>>  	return 0;
>> @@ -1680,7 +1697,7 @@ int SimplePipelineHandler::queueRequestDevice(Camera *camera, Request *request)
>>  		 * queue, it will be handed to the converter in the capture
>>  		 * completion handler.
>>  		 */
>> -		if (data->useConversion_) {
>> +		if (data->useConversion_ && stream != data->rawStream_) {
>>  			buffers.emplace(stream, buffer);
>>  			metadataRequired = !!data->swIsp_;
>>  		} else {

Patch

diff --git a/src/libcamera/pipeline/simple/simple.cpp b/src/libcamera/pipeline/simple/simple.cpp
index cb0b24f38..dd32b70f1 100644
--- a/src/libcamera/pipeline/simple/simple.cpp
+++ b/src/libcamera/pipeline/simple/simple.cpp
@@ -333,6 +333,7 @@  public:
 	};
 
 	std::vector<Stream> streams_;
+	Stream *rawStream_;
 
 	/*
 	 * All entities in the pipeline, from the camera sensor to the video
@@ -467,7 +468,7 @@  private:
 SimpleCameraData::SimpleCameraData(SimplePipelineHandler *pipe,
 				   unsigned int numStreams,
 				   MediaEntity *sensor)
-	: Camera::Private(pipe), streams_(numStreams)
+	: Camera::Private(pipe), streams_(numStreams), rawStream_(nullptr)
 {
 	/*
 	 * Find the shortest path from the camera sensor to a video capture
@@ -878,10 +879,13 @@  void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
 	 * point converting an erroneous buffer.
 	 */
 	if (buffer->metadata().status != FrameMetadata::FrameSuccess) {
-		if (!useConversion_) {
+		if (!useConversion_ || rawStream_) {
 			/* No conversion, just complete the request. */
 			Request *request = buffer->request();
 			pipe->completeBuffer(request, buffer);
+			SimpleFrameInfo *info = frameInfo_.find(request->sequence());
+			if (info)
+				info->metadataRequired = false;
 			tryCompleteRequest(request);
 			return;
 		}
@@ -940,7 +944,8 @@  void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
 	 */
 	if (useConversion_) {
 		if (conversionQueue_.empty()) {
-			video_->queueBuffer(buffer);
+			if (!rawStream_)
+				video_->queueBuffer(buffer);
 			return;
 		}
 
@@ -956,6 +961,8 @@  void SimpleCameraData::imageBufferReady(FrameBuffer *buffer)
 					     conversionQueue_.front().outputs);
 
 		conversionQueue_.pop();
+		if (rawStream_)
+			pipe->completeBuffer(request, buffer);
 		return;
 	}
 
@@ -993,7 +1000,8 @@  void SimpleCameraData::tryCompleteRequest(Request *request)
 void SimpleCameraData::conversionInputDone(FrameBuffer *buffer)
 {
 	/* Queue the input buffer back for capture. */
-	video_->queueBuffer(buffer);
+	if (!rawStream_)
+		video_->queueBuffer(buffer);
 }
 
 void SimpleCameraData::conversionOutputDone(FrameBuffer *buffer)
@@ -1518,13 +1526,18 @@  int SimplePipelineHandler::configure(Camera *camera, CameraConfiguration *c)
 	std::vector<std::reference_wrapper<StreamConfiguration>> outputCfgs;
 	data->useConversion_ = config->needConversion();
 
+	data->rawStream_ = nullptr;
 	for (unsigned int i = 0; i < config->size(); ++i) {
 		StreamConfiguration &cfg = config->at(i);
+		bool rawStream = isRaw(cfg);
 
 		cfg.setStream(&data->streams_[i]);
 
-		if (data->useConversion_ && !isRaw(cfg))
+		if (data->useConversion_ && !rawStream)
 			outputCfgs.push_back(cfg);
+
+		if (rawStream)
+			data->rawStream_ = &data->streams_[i];
 	}
 
 	if (outputCfgs.empty())
@@ -1555,7 +1568,7 @@  int SimplePipelineHandler::exportFrameBuffers(Camera *camera, Stream *stream,
 	 * Export buffers on the converter or capture video node, depending on
 	 * whether the converter is used or not.
 	 */
-	if (data->useConversion_)
+	if (data->useConversion_ && stream != data->rawStream_)
 		return data->converter_
 			       ? data->converter_->exportBuffers(stream, count, buffers)
 			       : data->swIsp_->exportBuffers(stream, count, buffers);
@@ -1578,7 +1591,7 @@  int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
 		return -EBUSY;
 	}
 
-	if (data->useConversion_) {
+	if (data->useConversion_ && !data->rawStream_) {
 		/*
 		 * When using the converter allocate a fixed number of internal
 		 * buffers.
@@ -1586,8 +1599,11 @@  int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
 		ret = video->allocateBuffers(kNumInternalBuffers,
 					     &data->conversionBuffers_);
 	} else {
-		/* Otherwise, prepare for using buffers from the only stream. */
-		Stream *stream = &data->streams_[0];
+		/*
+		 * Otherwise, prepare for using buffers from either the raw stream, if
+		 * requested, or the only stream configured.
+		 */
+		Stream *stream = (data->rawStream_ ? data->rawStream_ : &data->streams_[0]);
 		ret = video->importBuffers(stream->configuration().bufferCount);
 	}
 	if (ret < 0) {
@@ -1628,8 +1644,9 @@  int SimplePipelineHandler::start(Camera *camera, [[maybe_unused]] const ControlL
 		}
 
 		/* Queue all internal buffers for capture. */
-		for (std::unique_ptr<FrameBuffer> &buffer : data->conversionBuffers_)
-			video->queueBuffer(buffer.get());
+		if (!data->rawStream_)
+			for (std::unique_ptr<FrameBuffer> &buffer : data->conversionBuffers_)
+				video->queueBuffer(buffer.get());
 	}
 
 	return 0;
@@ -1680,7 +1697,7 @@  int SimplePipelineHandler::queueRequestDevice(Camera *camera, Request *request)
 		 * queue, it will be handed to the converter in the capture
 		 * completion handler.
 		 */
-		if (data->useConversion_) {
+		if (data->useConversion_ && stream != data->rawStream_) {
 			buffers.emplace(stream, buffer);
 			metadataRequired = !!data->swIsp_;
 		} else {