Message ID: 20210831093439.853586-4-hiroh@chromium.org
State: Superseded
Hi Hiro,

Thank you for the patch.

On Tue, Aug 31, 2021 at 06:34:39PM +0900, Hirokazu Honda wrote:
> An Android HAL client may request identical stream requests. It is
> redundant that a native camera device produces a separate stream for
> each of the identical requests.
> Configure camera one stream configuration for the identical stream
> requests.
>
> Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
> ---
>  src/android/camera_device.cpp | 28 +++++++++++++++++++++++-----
>  1 file changed, 23 insertions(+), 5 deletions(-)
>
> diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp
> index 74a95a2a..1cb4e675 100644
> --- a/src/android/camera_device.cpp
> +++ b/src/android/camera_device.cpp
> @@ -618,14 +618,32 @@ int CameraDevice::configureStreams(camera3_stream_configuration_t *stream_list)
> 			continue;
> 		}
>
> +		/* This stream will be produced by hardware. */
> +		stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;

Strictly speaking this isn't true, given that the software stream will
not be produced by hardware, but by a memcpy, right ? Is this needed
because otherwise the gralloc module will not pick the right format ?
If so, could you capture it in the comment ?

> +
> +		/*
> +		 * If a CameraStream with the same size and format of the
> +		 * current stream has already been requested, associate the two.
> +		 */
> +		auto iter = std::find_if(
> +			streamConfigs.begin(), streamConfigs.end(),
> +			[size, format](const Camera3StreamConfig &streamConfig) {
> +				return streamConfig.config.size == size &&
> +				       streamConfig.config.pixelFormat == format;
> +			});
> +		if (iter != streamConfigs.end()) {
> +			/* Add usage to copy the buffer in streams[0] to stream. */
> +			iter->streams[0].stream->usage |= GRALLOC_USAGE_SW_READ_OFTEN;
> +			stream->usage |= GRALLOC_USAGE_SW_WRITE_OFTEN;
> +			iter->streams.push_back({ stream, CameraStream::Type::Mapped });
> +			continue;
> +		}
> +
> 		Camera3StreamConfig streamConfig;
> 		streamConfig.streams = { { stream, CameraStream::Type::Direct } };
> 		streamConfig.config.size = size;
> 		streamConfig.config.pixelFormat = format;
> 		streamConfigs.push_back(std::move(streamConfig));
> -
> -		/* This stream will be produced by hardware. */
> -		stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;
> 	}
>
> 	/* Now handle the MJPEG streams, adding a new stream if required. */
> @@ -1144,12 +1162,12 @@ void CameraDevice::requestComplete(Request *request)
> 		resultMetadata = std::make_unique<CameraMetadata>(0, 0);
> 	}
>
> -	/* Handle any JPEG compression. */
> +	/* Handle post-processing. */
> 	for (camera3_stream_buffer_t &buffer : descriptor.buffers_) {
> 		CameraStream *cameraStream =
> 			static_cast<CameraStream *>(buffer.stream->priv);
>
> -		if (cameraStream->camera3Stream().format != HAL_PIXEL_FORMAT_BLOB)
> +		if (cameraStream->type() == CameraStream::Type::Direct)
> 			continue;

I would have moved those changes to 2/3 as that's where you introduce
usage of the YUV post-processor, but it's not enabled before this patch,
so it doesn't matter much.

With the above comment expanded (or my understanding corrected),

Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>

>
> 		FrameBuffer *src = request->findBuffer(cameraStream->stream());
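[Editor's note] For readers skimming the thread, a minimal sketch of the point under discussion: a Mapped stream is not written by the capture hardware but by a CPU copy from the Direct stream's buffer, which is why the patch adds the SW_READ_OFTEN/SW_WRITE_OFTEN gralloc usage flags to the two sides of the copy. The MappedPlane type and copyToMappedStream() helper below are hypothetical simplifications, not libcamera or Android HAL API.

#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

/* Hypothetical, simplified view of one CPU-mapped buffer plane. */
struct MappedPlane {
	uint8_t *data;
	size_t length;
};

/*
 * The Direct stream's buffer is produced by the capture hardware; the
 * Mapped stream's buffer is then filled by the CPU, plane by plane.
 * Both buffers are accessed through CPU mappings, hence the SW usage
 * flags requested at stream configuration time.
 */
void copyToMappedStream(const std::vector<MappedPlane> &src,
			std::vector<MappedPlane> &dst)
{
	for (size_t i = 0; i < src.size() && i < dst.size(); ++i) {
		size_t bytes = std::min(src[i].length, dst[i].length);
		std::memcpy(dst[i].data, src[i].data, bytes);
	}
}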
Hi Laurent,

On Wed, Sep 1, 2021 at 5:56 AM Laurent Pinchart
<laurent.pinchart@ideasonboard.com> wrote:
>
> Hi Hiro,
>
> Thank you for the patch.
>
> On Tue, Aug 31, 2021 at 06:34:39PM +0900, Hirokazu Honda wrote:
> > An Android HAL client may request identical stream requests. It is
> > redundant that a native camera device produces a separate stream for
> > each of the identical requests.
> > Configure camera one stream configuration for the identical stream
> > requests.
> >
> > Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
> > Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
> > ---
> >  src/android/camera_device.cpp | 28 +++++++++++++++++++++++-----
> >  1 file changed, 23 insertions(+), 5 deletions(-)
> >
> > diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp
> > index 74a95a2a..1cb4e675 100644
> > --- a/src/android/camera_device.cpp
> > +++ b/src/android/camera_device.cpp
> > @@ -618,14 +618,32 @@ int CameraDevice::configureStreams(camera3_stream_configuration_t *stream_list)
> > 			continue;
> > 		}
> >
> > +		/* This stream will be produced by hardware. */
> > +		stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;
>
> Strictly speaking this isn't true, given that the software stream will
> not be produced by hardware, but by a memcpy, right ? Is this needed
> because otherwise the gralloc module will not pick the right format ?
> If so, could you capture it in the comment ?
>

Sure. I will confirm my understanding of why our gralloc implementation
doesn't work without this, and I will submit the next patch series with
the comment.

-Hiro

> > +
> > +		/*
> > +		 * If a CameraStream with the same size and format of the
> > +		 * current stream has already been requested, associate the two.
> > +		 */
> > +		auto iter = std::find_if(
> > +			streamConfigs.begin(), streamConfigs.end(),
> > +			[size, format](const Camera3StreamConfig &streamConfig) {
> > +				return streamConfig.config.size == size &&
> > +				       streamConfig.config.pixelFormat == format;
> > +			});
> > +		if (iter != streamConfigs.end()) {
> > +			/* Add usage to copy the buffer in streams[0] to stream. */
> > +			iter->streams[0].stream->usage |= GRALLOC_USAGE_SW_READ_OFTEN;
> > +			stream->usage |= GRALLOC_USAGE_SW_WRITE_OFTEN;
> > +			iter->streams.push_back({ stream, CameraStream::Type::Mapped });
> > +			continue;
> > +		}
> > +
> > 		Camera3StreamConfig streamConfig;
> > 		streamConfig.streams = { { stream, CameraStream::Type::Direct } };
> > 		streamConfig.config.size = size;
> > 		streamConfig.config.pixelFormat = format;
> > 		streamConfigs.push_back(std::move(streamConfig));
> > -
> > -		/* This stream will be produced by hardware. */
> > -		stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;
> > 	}
> >
> > 	/* Now handle the MJPEG streams, adding a new stream if required. */
> > @@ -1144,12 +1162,12 @@ void CameraDevice::requestComplete(Request *request)
> > 		resultMetadata = std::make_unique<CameraMetadata>(0, 0);
> > 	}
> >
> > -	/* Handle any JPEG compression. */
> > +	/* Handle post-processing. */
> > 	for (camera3_stream_buffer_t &buffer : descriptor.buffers_) {
> > 		CameraStream *cameraStream =
> > 			static_cast<CameraStream *>(buffer.stream->priv);
> >
> > -		if (cameraStream->camera3Stream().format != HAL_PIXEL_FORMAT_BLOB)
> > +		if (cameraStream->type() == CameraStream::Type::Direct)
> > 			continue;
>
> I would have moved those changes to 2/3 as that's where you introduce
> usage of the YUV post-processor, but it's not enabled before this patch,
> so it doesn't matter much.
>
> With the above comment expanded (or my understanding corrected),
>
> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
>
> >
> > 		FrameBuffer *src = request->findBuffer(cameraStream->stream());
>
> --
> Regards,
>
> Laurent Pinchart
Hi,

On 8/31/21 3:04 PM, Hirokazu Honda wrote:
> An Android HAL client may request identical stream requests. It is
> redundant that a native camera device produces a separate stream for
> each of the identical requests.
> Configure camera one stream configuration for the identical stream

s/camera one/one camera/ maybe? Not sure if sentence structure is correct.

"Configure only one camera stream configuration for identical stream..."

Does this mean the same thing too?

> requests.
>
> Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
> ---
>  src/android/camera_device.cpp | 28 +++++++++++++++++++++++-----
>  1 file changed, 23 insertions(+), 5 deletions(-)
>
> diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp
> index 74a95a2a..1cb4e675 100644
> --- a/src/android/camera_device.cpp
> +++ b/src/android/camera_device.cpp
> @@ -618,14 +618,32 @@ int CameraDevice::configureStreams(camera3_stream_configuration_t *stream_list)
> 			continue;
> 		}
>
> +		/* This stream will be produced by hardware. */
> +		stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;
> +
> +		/*
> +		 * If a CameraStream with the same size and format of the
> +		 * current stream has already been requested, associate the two.
> +		 */
> +		auto iter = std::find_if(
> +			streamConfigs.begin(), streamConfigs.end(),
> +			[size, format](const Camera3StreamConfig &streamConfig) {
> +				return streamConfig.config.size == size &&
> +				       streamConfig.config.pixelFormat == format;
> +			});
> +		if (iter != streamConfigs.end()) {
> +			/* Add usage to copy the buffer in streams[0] to stream. */
> +			iter->streams[0].stream->usage |= GRALLOC_USAGE_SW_READ_OFTEN;
> +			stream->usage |= GRALLOC_USAGE_SW_WRITE_OFTEN;
> +			iter->streams.push_back({ stream, CameraStream::Type::Mapped });
> +			continue;
> +		}
> +
> 		Camera3StreamConfig streamConfig;
> 		streamConfig.streams = { { stream, CameraStream::Type::Direct } };
> 		streamConfig.config.size = size;
> 		streamConfig.config.pixelFormat = format;
> 		streamConfigs.push_back(std::move(streamConfig));
> -
> -		/* This stream will be produced by hardware. */
> -		stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;
> 	}
>
> 	/* Now handle the MJPEG streams, adding a new stream if required. */
> @@ -1144,12 +1162,12 @@ void CameraDevice::requestComplete(Request *request)
> 		resultMetadata = std::make_unique<CameraMetadata>(0, 0);
> 	}
>
> -	/* Handle any JPEG compression. */
> +	/* Handle post-processing. */
> 	for (camera3_stream_buffer_t &buffer : descriptor.buffers_) {
> 		CameraStream *cameraStream =
> 			static_cast<CameraStream *>(buffer.stream->priv);
>
> -		if (cameraStream->camera3Stream().format != HAL_PIXEL_FORMAT_BLOB)
> +		if (cameraStream->type() == CameraStream::Type::Direct)
> 			continue;
>
> 		FrameBuffer *src = request->findBuffer(cameraStream->stream());
diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp
index 74a95a2a..1cb4e675 100644
--- a/src/android/camera_device.cpp
+++ b/src/android/camera_device.cpp
@@ -618,14 +618,32 @@ int CameraDevice::configureStreams(camera3_stream_configuration_t *stream_list)
 			continue;
 		}
 
+		/* This stream will be produced by hardware. */
+		stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;
+
+		/*
+		 * If a CameraStream with the same size and format of the
+		 * current stream has already been requested, associate the two.
+		 */
+		auto iter = std::find_if(
+			streamConfigs.begin(), streamConfigs.end(),
+			[size, format](const Camera3StreamConfig &streamConfig) {
+				return streamConfig.config.size == size &&
+				       streamConfig.config.pixelFormat == format;
+			});
+		if (iter != streamConfigs.end()) {
+			/* Add usage to copy the buffer in streams[0] to stream. */
+			iter->streams[0].stream->usage |= GRALLOC_USAGE_SW_READ_OFTEN;
+			stream->usage |= GRALLOC_USAGE_SW_WRITE_OFTEN;
+			iter->streams.push_back({ stream, CameraStream::Type::Mapped });
+			continue;
+		}
+
 		Camera3StreamConfig streamConfig;
 		streamConfig.streams = { { stream, CameraStream::Type::Direct } };
 		streamConfig.config.size = size;
 		streamConfig.config.pixelFormat = format;
 		streamConfigs.push_back(std::move(streamConfig));
-
-		/* This stream will be produced by hardware. */
-		stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;
 	}
 
 	/* Now handle the MJPEG streams, adding a new stream if required. */
@@ -1144,12 +1162,12 @@ void CameraDevice::requestComplete(Request *request)
 		resultMetadata = std::make_unique<CameraMetadata>(0, 0);
 	}
 
-	/* Handle any JPEG compression. */
+	/* Handle post-processing. */
 	for (camera3_stream_buffer_t &buffer : descriptor.buffers_) {
 		CameraStream *cameraStream =
 			static_cast<CameraStream *>(buffer.stream->priv);
 
-		if (cameraStream->camera3Stream().format != HAL_PIXEL_FORMAT_BLOB)
+		if (cameraStream->type() == CameraStream::Type::Direct)
 			continue;
 
 		FrameBuffer *src = request->findBuffer(cameraStream->stream());
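[Editor's note] The grouping step at the heart of the patch can be read in isolation. The sketch below reproduces the same idea with simplified stand-in types (StreamKey, StreamGroup and groupStreams() are illustrative names, not the HAL's or libcamera's types): the first request with a given size and format becomes the stream backed by the camera, and later identical requests are attached to it (the patch marks those CameraStream::Type::Mapped and fills them by copying from the first one).

#include <algorithm>
#include <cstdint>
#include <vector>

/* Simplified stand-ins for the stream identity used in the patch. */
struct StreamKey {
	uint32_t width;
	uint32_t height;
	int format;

	bool operator==(const StreamKey &o) const
	{
		return width == o.width && height == o.height &&
		       format == o.format;
	}
};

struct StreamGroup {
	StreamKey key;             /* one native stream configuration */
	std::vector<int> requests; /* indices of HAL streams sharing it */
};

/*
 * Group identical HAL stream requests: a request matching an existing
 * group is appended to it; otherwise it starts a new group.
 */
std::vector<StreamGroup> groupStreams(const std::vector<StreamKey> &requests)
{
	std::vector<StreamGroup> groups;

	for (size_t i = 0; i < requests.size(); ++i) {
		const StreamKey &key = requests[i];
		auto it = std::find_if(groups.begin(), groups.end(),
				       [&key](const StreamGroup &g) {
					       return g.key == key;
				       });
		if (it != groups.end()) {
			it->requests.push_back(static_cast<int>(i));
			continue;
		}

		groups.push_back({ key, { static_cast<int>(i) } });
	}

	return groups;
}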