[libcamera-devel,1/1] Add fake pipeline handler

Message ID 20221012075925.3971538-2-chenghaoyang@google.com
State Superseded
Series
  • Fake Pipeline Handler

Commit Message

Cheng-Hao Yang Oct. 12, 2022, 7:59 a.m. UTC
---
 meson_options.txt                       |   2 +-
 src/android/camera_capabilities.cpp     |   1 +
 src/libcamera/pipeline/fake/fake.cpp    | 518 ++++++++++++++++++++++++
 src/libcamera/pipeline/fake/meson.build |   3 +
 src/libcamera/pipeline_handler.cpp      |   2 +-
 test/camera/camera_reconfigure.cpp      |   2 +-
 6 files changed, 525 insertions(+), 3 deletions(-)
 create mode 100644 src/libcamera/pipeline/fake/fake.cpp
 create mode 100644 src/libcamera/pipeline/fake/meson.build

Comments

Jacopo Mondi Oct. 12, 2022, 3:06 p.m. UTC | #1
Hi Harvey,
   I had the same thought as Kieran had: why not vimc ? I understand
your reasons behind it and the requirement to backport patches for it
to older kernels (btw, does it need anything more than CAP_IO_MC ?)

I have some general questions

1) What does this pipeline handler match on ? It seems to me match()
always returns true, hence if the pipeline handler is compiled in, it
will always report one camera available ?

2) What are the requirements for this pipeline handler ?

   Is it enough to test the Android layer to implement stub methods,
   or should frames be generated somehow (I'm thinking of CTS frame
   rate checks, for example).

   Should it support multiple streams ? I guess it depends on the HW
   level one wants to test in CTS, or should it mimic the capabilities
   of a known device (i.e. IPU3, which can produce 2 YUV streams and
   one RAW)

   I guess however this can be started as a skeleton and be populated
   as required to complete CTS.

3) Should all mentions of IPU3 related components be removed ?

4) Possible bikeshedding, but I wonder if "dummy" isn't better than
   "fake"

On Wed, Oct 12, 2022 at 07:59:25AM +0000, Harvey Yang via libcamera-devel wrote:
> ---
>  meson_options.txt                       |   2 +-
>  src/android/camera_capabilities.cpp     |   1 +
>  src/libcamera/pipeline/fake/fake.cpp    | 518 ++++++++++++++++++++++++
>  src/libcamera/pipeline/fake/meson.build |   3 +
>  src/libcamera/pipeline_handler.cpp      |   2 +-
>  test/camera/camera_reconfigure.cpp      |   2 +-
>  6 files changed, 525 insertions(+), 3 deletions(-)
>  create mode 100644 src/libcamera/pipeline/fake/fake.cpp
>  create mode 100644 src/libcamera/pipeline/fake/meson.build
>
> diff --git a/meson_options.txt b/meson_options.txt
> index f1d67808..f08dfc5f 100644
> --- a/meson_options.txt
> +++ b/meson_options.txt
> @@ -37,7 +37,7 @@ option('lc-compliance',
>
>  option('pipelines',
>          type : 'array',
> -        choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc'],
> +        choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc', 'fake'],
>          description : 'Select which pipeline handlers to include')
>
>  option('qcam',
> diff --git a/src/android/camera_capabilities.cpp b/src/android/camera_capabilities.cpp
> index 64bd8dde..730ceafc 100644
> --- a/src/android/camera_capabilities.cpp
> +++ b/src/android/camera_capabilities.cpp
> @@ -1077,6 +1077,7 @@ int CameraCapabilities::initializeStaticMetadata()
>  	{
>  		const Span<const Rectangle> &rects =
>  			properties.get(properties::PixelArrayActiveAreas).value_or(Span<const Rectangle>{});
> +		// TODO: initialize at least one Rectangle as default.

Unrelated, please drop

>  		std::vector<int32_t> data{
>  			static_cast<int32_t>(rects[0].x),
>  			static_cast<int32_t>(rects[0].y),
> diff --git a/src/libcamera/pipeline/fake/fake.cpp b/src/libcamera/pipeline/fake/fake.cpp
> new file mode 100644
> index 00000000..518de6aa
> --- /dev/null
> +++ b/src/libcamera/pipeline/fake/fake.cpp
> @@ -0,0 +1,518 @@
> +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> +/*
> + * Copyright (C) 2019, Google Inc.
> + *
> + * ipu3.cpp - Pipeline handler for Intel Fake
> + */

Please update year and remove mentions of Intel ?

> +
> +#include <algorithm>
> +#include <iomanip>
> +#include <memory>
> +#include <queue>
> +#include <vector>

Seems like you're also using std::set

> +
> +#include <libcamera/base/log.h>
> +#include <libcamera/base/utils.h>
> +
> +#include <libcamera/camera.h>

all internal headers "libcamera/internal/something.h" should include
<libcamera/something.h>, so if you include internal/camera.h you
can probably remove this

> +#include <libcamera/control_ids.h>
> +#include <libcamera/formats.h>
> +#include <libcamera/property_ids.h>
> +#include <libcamera/request.h>
> +#include <libcamera/stream.h>
> +
> +#include "libcamera/internal/camera.h"
> +#include "libcamera/internal/camera_lens.h"
> +#include "libcamera/internal/camera_sensor.h"
> +#include "libcamera/internal/delayed_controls.h"

The three above are not used, it seems

> +#include "libcamera/internal/device_enumerator.h"
> +#include "libcamera/internal/framebuffer.h"
> +#include "libcamera/internal/media_device.h"
> +#include "libcamera/internal/pipeline_handler.h"
> +
> +namespace libcamera {
> +
> +LOG_DEFINE_CATEGORY(Fake)
> +
> +uint64_t CurrentTimestamp()
> +{
> +	struct timespec ts;
> +	if (clock_gettime(CLOCK_MONOTONIC, &ts) < 0) {
> +		LOG(Fake, Error) << "Get clock time fails";
> +		return 0;
> +	}
> +
> +	return ts.tv_sec * 1'000'000'000LL + ts.tv_nsec;
> +}
> +
> +static const ControlInfoMap::Map FakeControls = {

Missing #include <libcamera/controls.h> ?

> +	{ &controls::draft::PipelineDepth, ControlInfo(2, 3) },
> +};
> +
> +class FakeCameraData : public Camera::Private
> +{
> +public:
> +	struct Resolution {
> +		Size size;
> +		std::vector<int> frame_rates;
> +		std::vector<std::string> formats;

I would rather use libcamera::formats to verify the Android HAL does
the translation right.

> +	};
> +
> +	FakeCameraData(PipelineHandler *pipe)
> +		: Camera::Private(pipe), supportsFlips_(false)
> +	{
> +	}
> +
> +	std::vector<Resolution> supported_resolutions_;

supportedResolutions_

> +
> +	Stream outStream_;
> +	Stream vfStream_;
> +	Stream rawStream_;
> +
> +	// TODO: Check if we should support.
> +	bool supportsFlips_;
> +	Transform rotationTransform_;

For the sake of simplicity you can remove flip/rotation support from the
first skeleton

> +
> +	// TODO: remove

Please :)

> +	/* Requests for which no buffer has been queued to the CIO2 device yet. */
> +	std::queue<Request *> pendingRequests_;
> +	/* Requests queued to the CIO2 device but not yet processed by the ImgU. */
> +	std::queue<Request *> processingRequests_;
> +
> +	// ControlInfoMap ipaControls_;

This as well

> +	bool started_ = false;
> +};
> +
> +class FakeCameraConfiguration : public CameraConfiguration
> +{
> +public:
> +	static constexpr unsigned int kBufferCount = 4; // 4~6
> +	static constexpr unsigned int kMaxStreams = 3;
> +
> +	FakeCameraConfiguration(FakeCameraData *data);
> +
> +	Status validate() override;
> +
> +private:
> +	/*
> +	 * The FakeCameraData instance is guaranteed to be valid as long as the
> +	 * corresponding Camera instance is valid. In order to borrow a
> +	 * reference to the camera data, store a new reference to the camera.
> +	 */

This comes from the IPU3 pipeline handler, and I wonder if it still
applies here or should be removed.

> +	const FakeCameraData *data_;
> +};
> +
> +class PipelineHandlerFake : public PipelineHandler
> +{
> +public:
> +	static constexpr unsigned int V4L2_CID_Fake_PIPE_MODE = 0x009819c1;

Unused, please drop

> +	static constexpr Size kViewfinderSize{ 1280, 720 };
> +
> +	enum FakePipeModes {
> +		FakePipeModeVideo = 0,
> +		FakePipeModeStillCapture = 1,
> +	};

ditto

> +
> +	PipelineHandlerFake(CameraManager *manager);
> +
> +	CameraConfiguration *generateConfiguration(Camera *camera,
> +		const StreamRoles &roles) override;

Align to open (

> +	int configure(Camera *camera, CameraConfiguration *config) override;
> +
> +	int exportFrameBuffers(Camera *camera, Stream *stream,
> +			       std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;
> +
> +	int start(Camera *camera, const ControlList *controls) override;
> +	void stopDevice(Camera *camera) override;
> +
> +	int queueRequestDevice(Camera *camera, Request *request) override;
> +
> +	bool match(DeviceEnumerator *enumerator) override;
> +
> +private:
> +	FakeCameraData *cameraData(Camera *camera)
> +	{
> +		return static_cast<FakeCameraData *>(camera->_d());
> +	}
> +
> +	int registerCameras();
> +
> +	static bool registered_;
> +};
> +
> +bool PipelineHandlerFake::registered_ = false;

What purpose does this serve ? Do you get multiple calls to match()
so that you need to keep a flag ?

> +
> +FakeCameraConfiguration::FakeCameraConfiguration(FakeCameraData *data)
> +	: CameraConfiguration()
> +{
> +	data_ = data;
> +}
> +
> +CameraConfiguration::Status FakeCameraConfiguration::validate()
> +{
> +	Status status = Valid;
> +
> +	if (config_.empty())
> +		return Invalid;
> +
> +	// Transform combined = transform * data_->rotationTransform_;
> +
> +	// /*
> +	//  * We combine the platform and user transform, but must "adjust away"
> +	//  * any combined result that includes a transposition, as we can't do
> +	//  * those. In this case, flipping only the transpose bit is helpful to
> +	//  * applications - they either get the transform they requested, or have
> +	//  * to do a simple transpose themselves (they don't have to worry about
> +	//  * the other possible cases).
> +	//  */
> +	// if (!!(combined & Transform::Transpose)) {
> +	// 	/*
> +	// 	 * Flipping the transpose bit in "transform" flips it in the
> +	// 	 * combined result too (as it's the last thing that happens),
> +	// 	 * which is of course clearing it.
> +	// 	 */
> +	// 	transform ^= Transform::Transpose;
> +	// 	combined &= ~Transform::Transpose;
> +	// 	status = Adjusted;
> +	// }
> +
> +	// /*
> +	//  * We also check if the sensor doesn't do h/vflips at all, in which
> +	//  * case we clear them, and the application will have to do everything.
> +	//  */
> +	// if (!data_->supportsFlips_ && !!combined) {
> +	// 	/*
> +	// 	 * If the sensor can do no transforms, then combined must be
> +	// 	 * changed to the identity. The only user transform that gives
> +	// 	 * rise to this is the inverse of the rotation. (Recall that
> +	// 	 * combined = transform * rotationTransform.)
> +	// 	 */
> +	// 	transform = -data_->rotationTransform_;
> +	// 	combined = Transform::Identity;
> +	// 	status = Adjusted;
> +	// }
> +
> +	/*
> +	 * Store the final combined transform that configure() will need to
> +	 * apply to the sensor to save us working it out again.
> +	 */
> +	// combinedTransform_ = combined;

Please drop commented-out code :)

> +
> +	/* Cap the number of entries to the available streams. */
> +	if (config_.size() > kMaxStreams) {
> +		config_.resize(kMaxStreams);
> +		status = Adjusted;
> +	}
> +
> +	/*
> +	 * Validate the requested stream configuration and select the sensor
> +	 * format by collecting the maximum RAW stream width and height and
> +	 * picking the closest larger match.
> +	 *
> +	 * If no RAW stream is requested use the one of the largest YUV stream,
> +	 * plus margin pixels for the IF and BDS rectangle to downscale.
> +	 *
> +	 * \todo Clarify the IF and BDS margins requirements.

Please drop IPU3 related stuff

> +	 */
> +	unsigned int rawCount = 0;
> +	unsigned int yuvCount = 0;
> +	Size rawRequirement;
> +	Size maxYuvSize;
> +	Size rawSize;
> +
> +	for (const StreamConfiguration &cfg : config_) {
> +		const PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat);
> +
> +		if (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW) {
> +			rawCount++;
> +			rawSize = std::max(rawSize, cfg.size);
> +		} else {
> +			yuvCount++;
> +			maxYuvSize = std::max(maxYuvSize, cfg.size);
> +			rawRequirement.expandTo(cfg.size);
> +		}
> +	}
> +
> +	// TODO: Base on number of cameras?

Well, should this pipeline register more than one camera ?
And anyway, the number of streams is per-camera, so the number of
cameras should not be relevant here

> +	if (rawCount > 1 || yuvCount > 2) {
> +		LOG(Fake, Debug) << "Camera configuration not supported";
> +		return Invalid;
> +	}
> +
> +	/*
> +	 * Generate raw configuration from CIO2.
> +	 *
> +	 * The output YUV streams will be limited in size to the maximum frame
> +	 * size requested for the RAW stream, if present.
> +	 *
> +	 * If no raw stream is requested, generate a size from the largest YUV
> +	 * stream, aligned to the ImgU constraints and bound
> +	 * by the sensor's maximum resolution. See
> +	 * https://bugs.libcamera.org/show_bug.cgi?id=32
> +	 */
> +	// TODO
> +	if (rawSize.isNull())
> +		rawSize = rawRequirement;

All of this serves, on the IPU3, to generate a size suitable for the CIO2
and the sensor. You can remove it.

> +
> +	/*
> +	 * Adjust the configurations if needed and assign streams while
> +	 * iterating them.
> +	 */
> +	bool mainOutputAvailable = true;
> +	for (unsigned int i = 0; i < config_.size(); ++i) {
> +		const PixelFormatInfo &info = PixelFormatInfo::info(config_[i].pixelFormat);
> +		const StreamConfiguration originalCfg = config_[i];
> +		StreamConfiguration *cfg = &config_[i];
> +
> +		LOG(Fake, Debug) << "Validating stream: " << config_[i].toString();
> +
> +		if (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW) {
> +			/* Initialize the RAW stream with the CIO2 configuration. */
> +			cfg->size = rawSize;
> +			// TODO: check
> +			cfg->pixelFormat = formats::SBGGR10_IPU3;
> +			cfg->bufferCount = FakeCameraConfiguration::kBufferCount;
> +			cfg->stride = info.stride(cfg->size.width, 0, 64);
> +			cfg->frameSize = info.frameSize(cfg->size, 64);
> +			cfg->setStream(const_cast<Stream *>(&data_->rawStream_));
> +
> +			LOG(Fake, Debug) << "Assigned " << cfg->toString()
> +					 << " to the raw stream";
> +		} else {
> +			/* Assign and configure the main and viewfinder outputs. */
> +
> +			cfg->pixelFormat = formats::NV12;
> +			cfg->bufferCount = kBufferCount;
> +			cfg->stride = info.stride(cfg->size.width, 0, 1);
> +			cfg->frameSize = info.frameSize(cfg->size, 1);
> +
> +			/*
> +			 * Use the main output stream in case only one stream is
> +			 * requested or if the current configuration is the one
> +			 * with the maximum YUV output size.
> +			 */
> +			if (mainOutputAvailable &&
> +			    (originalCfg.size == maxYuvSize || yuvCount == 1)) {
> +				cfg->setStream(const_cast<Stream *>(&data_->outStream_));
> +				mainOutputAvailable = false;
> +
> +				LOG(Fake, Debug) << "Assigned " << cfg->toString()
> +						 << " to the main output";
> +			} else {
> +				cfg->setStream(const_cast<Stream *>(&data_->vfStream_));
> +
> +				LOG(Fake, Debug) << "Assigned " << cfg->toString()
> +						 << " to the viewfinder output";
> +			}
> +		}

Again, I'm not sure how much of the IPU3 this pipeline should mimic,
or whether it should establish its own requirements and constraints
(i.e. max 2 YUV streams of max size width x height, and 1 RAW stream
in format V4L2_PIX_FMT_..)

> +
> +		if (cfg->pixelFormat != originalCfg.pixelFormat ||
> +		    cfg->size != originalCfg.size) {
> +			LOG(Fake, Debug)
> +				<< "Stream " << i << " configuration adjusted to "
> +				<< cfg->toString();
> +			status = Adjusted;
> +		}
> +	}
> +
> +	return status;
> +}
> +
> +PipelineHandlerFake::PipelineHandlerFake(CameraManager *manager)
> +	: PipelineHandler(manager)
> +{
> +	// TODO: read the fake hal spec file.

If there's a design document with requirements, can you share it ?

> +}
> +
> +CameraConfiguration *PipelineHandlerFake::generateConfiguration(Camera *camera,
> +								const StreamRoles &roles)
> +{
> +	FakeCameraData *data = cameraData(camera);
> +	FakeCameraConfiguration *config = new FakeCameraConfiguration(data);
> +
> +	if (roles.empty())
> +		return config;
> +
> +	Size minSize, sensorResolution;
> +	for (const auto& resolution : data->supported_resolutions_) {
> +		if (minSize.isNull() || minSize > resolution.size)
> +			minSize = resolution.size;
> +
> +		sensorResolution = std::max(sensorResolution, resolution.size);
> +	}
> +

Those are hard-coded values, I think you can reuse them here.
Unless the idea is to make this information come from a configuration
file or something similar.

> +	for (const StreamRole role : roles) {
> +		std::map<PixelFormat, std::vector<SizeRange>> streamFormats;
> +		unsigned int bufferCount;
> +		PixelFormat pixelFormat;
> +		Size size;
> +
> +		switch (role) {
> +		case StreamRole::StillCapture:
> +			size = sensorResolution;
> +			pixelFormat = formats::NV12;
> +			bufferCount = FakeCameraConfiguration::kBufferCount;
> +			streamFormats[pixelFormat] = { { minSize, size } };
> +
> +			break;
> +
> +		case StreamRole::Raw: {
> +			// TODO: check
> +			pixelFormat = formats::SBGGR10_IPU3;
> +			size = sensorResolution;
> +			bufferCount = FakeCameraConfiguration::kBufferCount;
> +			streamFormats[pixelFormat] = { { minSize, size } };
> +
> +			break;
> +		}
> +
> +		case StreamRole::Viewfinder:
> +		case StreamRole::VideoRecording: {
> +			/*
> +			 * Default viewfinder and videorecording to 1280x720,
> +			 * capped to the maximum sensor resolution and aligned
> +			 * to the ImgU output constraints.
> +			 */
> +			size = sensorResolution;
> +			pixelFormat = formats::NV12;
> +			bufferCount = FakeCameraConfiguration::kBufferCount;
> +			streamFormats[pixelFormat] = { { minSize, size } };
> +
> +			break;
> +		}
> +
> +		default:
> +			LOG(Fake, Error)
> +				<< "Requested stream role not supported: " << role;
> +			delete config;
> +			return nullptr;
> +		}
> +
> +		StreamFormats formats(streamFormats);
> +		StreamConfiguration cfg(formats);
> +		cfg.size = size;
> +		cfg.pixelFormat = pixelFormat;
> +		cfg.bufferCount = bufferCount;
> +		config->addConfiguration(cfg);
> +	}
> +
> +	if (config->validate() == CameraConfiguration::Invalid)
> +		return {};
> +
> +	return config;
> +}
> +
> +int PipelineHandlerFake::configure(Camera *camera, CameraConfiguration *c)
> +{
> +	if (camera || c)
> +		return 0;
> +	return 0;
> +}
> +
> +int PipelineHandlerFake::exportFrameBuffers(Camera *camera, Stream *stream,
> +					    std::vector<std::unique_ptr<FrameBuffer>> *buffers)
> +{
> +	// Assume it's never called.
> +	LOG(Fake, Fatal) << "exportFrameBuffers should never be called";
> +	if (camera || stream || buffers)
> +		return -EINVAL;
> +	return -EINVAL;
> +}
> +
> +int PipelineHandlerFake::start(Camera *camera, [[maybe_unused]] const ControlList *controls)
> +{
> +	FakeCameraData *data = cameraData(camera);
> +	data->started_ = true;
> +
> +	return 0;
> +}
> +
> +void PipelineHandlerFake::stopDevice(Camera *camera)
> +{
> +	FakeCameraData *data = cameraData(camera);
> +
> +	data->started_ = false;
> +}
> +
> +int PipelineHandlerFake::queueRequestDevice(Camera *camera, Request *request)
> +{
> +	if (!camera)
> +		return -EINVAL;

Can this happen ?

> +
> +	for (auto it : request->buffers())
> +		completeBuffer(request, it.second);
> +
> +	// TODO: request.metadata()
> +	request->metadata().set(controls::SensorTimestamp, CurrentTimestamp());
> +	completeRequest(request);
> +
> +	return 0;
> +}
> +
> +bool PipelineHandlerFake::match(DeviceEnumerator *enumerator)
> +{
> +	// TODO: exhaust all devices in |enumerator|.
> +	if (!enumerator)
> +		LOG(Fake, Info) << "Invalid enumerator";

Can this happen ?

> +
> +	if (registered_)
> +		return false;
> +
> +	registered_ = true;
> +	return registerCameras() == 0;
> +}
> +
> +/**
> + * \brief Initialise ImgU and CIO2 devices associated with cameras
> + *
> + * Initialise the two ImgU instances and create cameras with an associated
> + * CIO2 device instance.
> + *
> + * \return 0 on success or a negative error code for error or if no camera
> + * has been created
> + * \retval -ENODEV no camera has been created
> + */
> +int PipelineHandlerFake::registerCameras()
> +{
> +	std::unique_ptr<FakeCameraData> data =
> +		std::make_unique<FakeCameraData>(this);
> +	std::set<Stream *> streams = {
> +		&data->outStream_,
> +		&data->vfStream_,
> +		&data->rawStream_,
> +	};
> +
> +	// TODO: Read from config or from IPC.
> +	// TODO: Check with Han-lin: Can this function be called more than once?
> +	data->supported_resolutions_.resize(2);
> +	data->supported_resolutions_[0].size = Size(1920, 1080);
> +	data->supported_resolutions_[0].frame_rates.push_back(30);
> +	data->supported_resolutions_[0].formats.push_back("YCbCr_420_888");
> +	data->supported_resolutions_[0].formats.push_back("BLOB");
> +	data->supported_resolutions_[1].size = Size(1280, 720);
> +	data->supported_resolutions_[1].frame_rates.push_back(30);
> +	data->supported_resolutions_[1].frame_rates.push_back(60);
> +	data->supported_resolutions_[1].formats.push_back("YCbCr_420_888");
> +	data->supported_resolutions_[1].formats.push_back("BLOB");
> +
> +	// TODO: Assign different locations for different cameras based on config.
> +	data->properties_.set(properties::Location, properties::CameraLocationFront);
> +	data->properties_.set(properties::PixelArrayActiveAreas, { Rectangle(Size(1920, 1080)) });
> +
> +	// TODO: Set FrameDurationLimits based on config.
> +	ControlInfoMap::Map controls{};
> +	int64_t min_frame_duration = 30, max_frame_duration = 60;
> +	controls[&controls::FrameDurationLimits] = ControlInfo(min_frame_duration, max_frame_duration);
> +	data->controlInfo_ = ControlInfoMap(std::move(controls), controls::controls);

FakeControls is ignored

> +
> +	std::shared_ptr<Camera> camera =
> +		Camera::create(std::move(data), "\\_SB_.PCI0.I2C4.CAM1" /* cameraId */, streams);
> +
> +	registerCamera(std::move(camera));
> +
> +	return 0;
> +}
> +
> +REGISTER_PIPELINE_HANDLER(PipelineHandlerFake)
> +
> +} /* namespace libcamera */
> diff --git a/src/libcamera/pipeline/fake/meson.build b/src/libcamera/pipeline/fake/meson.build
> new file mode 100644
> index 00000000..13980b4a
> --- /dev/null
> +++ b/src/libcamera/pipeline/fake/meson.build
> @@ -0,0 +1,3 @@
> +# SPDX-License-Identifier: CC0-1.0
> +
> +libcamera_sources += files([ 'fake.cpp' ])
> diff --git a/src/libcamera/pipeline_handler.cpp b/src/libcamera/pipeline_handler.cpp
> index e5cb751c..4261154d 100644
> --- a/src/libcamera/pipeline_handler.cpp
> +++ b/src/libcamera/pipeline_handler.cpp
> @@ -536,7 +536,7 @@ void PipelineHandler::registerCamera(std::shared_ptr<Camera> camera)
>  	cameras_.push_back(camera);
>
>  	if (mediaDevices_.empty())
> -		LOG(Pipeline, Fatal)
> +		LOG(Pipeline, Error)

I don't think we want that; you should probably open-code the

	manager_->addCamera(std::move(camera), devnums);

call, with an empty list of devnums, which should be fine for this use
case.

The PipelineHandler::cameras_ vector will remain unpopulated, but
since it is used only in case of a media device disconnection (afaict)
it should be fine in your case
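For illustration only, the open-coded registration could look roughly like
this (an untested sketch against libcamera internals; the empty devnum list
follows the suggestion above, and the elided body stays as in the patch):

```
int PipelineHandlerFake::registerCameras()
{
	/* ... create FakeCameraData and the streams set as before ... */

	std::shared_ptr<Camera> camera =
		Camera::create(std::move(data), "\\_SB_.PCI0.I2C4.CAM1", streams);

	/*
	 * Open-code the registration instead of calling
	 * PipelineHandler::registerCamera(), passing an empty devnum list;
	 * devnums are only used to handle media device disconnection.
	 */
	manager_->addCamera(std::move(camera), {});

	return 0;
}
```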


>  			<< "Registering camera with no media devices!";
>
>  	/*
> diff --git a/test/camera/camera_reconfigure.cpp b/test/camera/camera_reconfigure.cpp
> index d12e2413..06c87730 100644
> --- a/test/camera/camera_reconfigure.cpp
> +++ b/test/camera/camera_reconfigure.cpp
> @@ -98,7 +98,7 @@ private:
>  				return TestFail;
>  			}
>
> -			requests_.push_back(move(request));
> +			requests_.push_back(std::move(request));

Unrelated and possibly not necessary, as the test has
        using namespace std;

Looking forward to seeing this skeleton getting populated, it will help
testing the Android layer!

Thanks
   j

>  		}
>
>  		camera_->requestCompleted.connect(this, &CameraReconfigure::requestComplete);
> --
> 2.38.0.rc1.362.ged0d419d3c-goog
>
Cheng-Hao Yang Oct. 14, 2022, 10:36 a.m. UTC | #2
Thanks Jacopo!

The patch v2 is uploaded based on your comments.

On Wed, Oct 12, 2022 at 11:06 PM Jacopo Mondi <jacopo@jmondi.org> wrote:

> Hi Harvey,
>    I had the same thought as Kieran had, why not vimc ? I understan
> your reasons behind it and the requirement to backport patches for it
> to older kernel (btw, does it need anything more than CAP_IO_MC ?)
>
>
Is it a great effort to backport patches to older kernel versions?
Regarding CAP_IO_MC, I don't know :p. Can anyone help with this?


> I have some general questions
>
> 1) What does this pipeline handler match on ? It seems to me match()
> always returns true, hence if the pipeline handler is compiled in, it
> will always report one camera available ?
>
>
I think it'll eventually depend on the configuration file provided by the
tester.
We might come up with some fake media devices, without matching the real
ones. But I haven't made up my mind yet.


> 2) What are the requirements for this pipeline handler ?
>
>    Is it enough to test the Android layer to implement stub methods,
>    or should frames be generated somehow (I'm thinking of CTS frame
>    rate checks, for example).
>
>    Should it support multiple streams ? I guess it depends on the HW
>    level one wants to test in CTS, or should it mimic the capabilities
>    of a known device (i.e. IPU3, which can produce 2 YUV streams and one
>    RAW)
>
>    I guess however this can be started as a skeleton and be populated
>    as required to complete CTS.
>
Additional features like custom configurations (of cameras) and frames from
a video file will be added.

In CrOS, it helps applications test usage in different (fake) hardware
setups, while it also helps libcamera test the Android adapter.


> 3) Should all mentions of IPU3 related components be removed ?
>
Yes.


> 4) Possible bikeshedding, but I wonder if "dummy" isn't better than
>    "fake"
>
As we need some more features, like a custom configuration file/cameras
and frames from a video file, I believe "fake" is still more appropriate.


> On Wed, Oct 12, 2022 at 07:59:25AM +0000, Harvey Yang via libcamera-devel
> wrote:
> > ---
> >  meson_options.txt                       |   2 +-
> >  src/android/camera_capabilities.cpp     |   1 +
> >  src/libcamera/pipeline/fake/fake.cpp    | 518 ++++++++++++++++++++++++
> >  src/libcamera/pipeline/fake/meson.build |   3 +
> >  src/libcamera/pipeline_handler.cpp      |   2 +-
> >  test/camera/camera_reconfigure.cpp      |   2 +-
> >  6 files changed, 525 insertions(+), 3 deletions(-)
> >  create mode 100644 src/libcamera/pipeline/fake/fake.cpp
> >  create mode 100644 src/libcamera/pipeline/fake/meson.build
> >
> > diff --git a/meson_options.txt b/meson_options.txt
> > index f1d67808..f08dfc5f 100644
> > --- a/meson_options.txt
> > +++ b/meson_options.txt
> > @@ -37,7 +37,7 @@ option('lc-compliance',
> >
> >  option('pipelines',
> >          type : 'array',
> > -        choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple',
> 'uvcvideo', 'vimc'],
> > +        choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple',
> 'uvcvideo', 'vimc', 'fake'],
> >          description : 'Select which pipeline handlers to include')
> >
> >  option('qcam',
> > diff --git a/src/android/camera_capabilities.cpp
> b/src/android/camera_capabilities.cpp
> > index 64bd8dde..730ceafc 100644
> > --- a/src/android/camera_capabilities.cpp
> > +++ b/src/android/camera_capabilities.cpp
> > @@ -1077,6 +1077,7 @@ int CameraCapabilities::initializeStaticMetadata()
> >       {
> >               const Span<const Rectangle> &rects =
> >
>  properties.get(properties::PixelArrayActiveAreas).value_or(Span<const
> Rectangle>{});
> > +             // TODO: initialize at least one Rectangle as default.
>
> Unrelated, please drop
>
>
Done.


> >               std::vector<int32_t> data{
> >                       static_cast<int32_t>(rects[0].x),
> >                       static_cast<int32_t>(rects[0].y),
> > diff --git a/src/libcamera/pipeline/fake/fake.cpp
> b/src/libcamera/pipeline/fake/fake.cpp
> > new file mode 100644
> > index 00000000..518de6aa
> > --- /dev/null
> > +++ b/src/libcamera/pipeline/fake/fake.cpp
> > @@ -0,0 +1,518 @@
> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
> > +/*
> > + * Copyright (C) 2019, Google Inc.
> > + *
> > + * ipu3.cpp - Pipeline handler for Intel Fake
> > + */
>
> Please update year and remove mentions of Intel ?
>
>
Done.


> > +
> > +#include <algorithm>
> > +#include <iomanip>
> > +#include <memory>
> > +#include <queue>
> > +#include <vector>
>
> Seems like you're also using std::set
>
>
Done.


> > +
> > +#include <libcamera/base/log.h>
> > +#include <libcamera/base/utils.h>
> > +
> > +#include <libcamera/camera.h>
>
> all internal headers "libcamera/internal/something.h" should include
> <libcamera/something.h>, so if you include internal/camera.h you
> probably can remove this
>
>
Done.


> > +#include <libcamera/control_ids.h>
> > +#include <libcamera/formats.h>
> > +#include <libcamera/property_ids.h>
> > +#include <libcamera/request.h>
> > +#include <libcamera/stream.h>
> > +
> > +#include "libcamera/internal/camera.h"
> > +#include "libcamera/internal/camera_lens.h"
> > +#include "libcamera/internal/camera_sensor.h"
> > +#include "libcamera/internal/delayed_controls.h"
>
> The three above are not used it seems
>
>
Done.


> > +#include "libcamera/internal/device_enumerator.h"
> > +#include "libcamera/internal/framebuffer.h"
> > +#include "libcamera/internal/media_device.h"
> > +#include "libcamera/internal/pipeline_handler.h"
> > +
> > +namespace libcamera {
> > +
> > +LOG_DEFINE_CATEGORY(Fake)
> > +
> > +uint64_t CurrentTimestamp()
> > +{
> > +     struct timespec ts;
> > +     if (clock_gettime(CLOCK_MONOTONIC, &ts) < 0) {
> > +             LOG(Fake, Error) << "Get clock time fails";
> > +             return 0;
> > +     }
> > +
> > +     return ts.tv_sec * 1'000'000'000LL + ts.tv_nsec;
> > +}
> > +
> > +static const ControlInfoMap::Map FakeControls = {
>
> Missing #include <libcamera/controls.h> ?
>
>
Done.

> +     { &controls::draft::PipelineDepth, ControlInfo(2, 3) },
> > +};
> > +
> > +class FakeCameraData : public Camera::Private
> > +{
> > +public:
> > +     struct Resolution {
> > +             Size size;
> > +             std::vector<int> frame_rates;
> > +             std::vector<std::string> formats;
>
> I would rather use libcamera::formats to verify the Android HAL does
> the translation right.
>
I believe you mean libcamera::PixelFormat. Done.


> > +     };
> > +
> > +     FakeCameraData(PipelineHandler *pipe)
> > +             : Camera::Private(pipe), supportsFlips_(false)
> > +     {
> > +     }
> > +
> > +     std::vector<Resolution> supported_resolutions_;
>
> supportedResolutions_
>
Done.

> > +
> > +     Stream outStream_;
> > +     Stream vfStream_;
> > +     Stream rawStream_;
> > +
> > +     // TODO: Check if we should support.
> > +     bool supportsFlips_;
> > +     Transform rotationTransform_;
>
> For sake of simplicity you can remoev flip/rotation support from the
> first skeleton
>
>
Done.


> > +
> > +     // TODO: remove
>
> Please :)
>
Done.

> > +     /* Requests for which no buffer has been queued to the CIO2 device
> yet. */
> > +     std::queue<Request *> pendingRequests_;
> > +     /* Requests queued to the CIO2 device but not yet processed by the
> ImgU. */
> > +     std::queue<Request *> processingRequests_;
> > +
> > +     // ControlInfoMap ipaControls_;
>
> This as well
>
Done.

> > +     bool started_ = false;
> > +};
> > +
> > +class FakeCameraConfiguration : public CameraConfiguration
> > +{
> > +public:
> > +     static constexpr unsigned int kBufferCount = 4; // 4~6
> > +     static constexpr unsigned int kMaxStreams = 3;
> > +
> > +     FakeCameraConfiguration(FakeCameraData *data);
> > +
> > +     Status validate() override;
> > +
> > +private:
> > +     /*
> > +      * The FakeCameraData instance is guaranteed to be valid as long
> as the
> > +      * corresponding Camera instance is valid. In order to borrow a
> > +      * reference to the camera data, store a new reference to the
> camera.
> > +      */
>
> This comes from the IPU3 pipeline handler, and I wonder if it still
> applies there or it should be removed.
>
>
I believe it's still needed in |validate()| to set streams, right?


> > +     const FakeCameraData *data_;
> > +};
> > +
> > +class PipelineHandlerFake : public PipelineHandler
> > +{
> > +public:
> > +     static constexpr unsigned int V4L2_CID_Fake_PIPE_MODE = 0x009819c1;
>
> Unused, please drop
>
> Done.

> > +     static constexpr Size kViewfinderSize{ 1280, 720 };
> > +
> > +     enum FakePipeModes {
> > +             FakePipeModeVideo = 0,
> > +             FakePipeModeStillCapture = 1,
> > +     };
>
> ditto
>
>
Done.


> > +
> > +     PipelineHandlerFake(CameraManager *manager);
> > +
> > +     CameraConfiguration *generateConfiguration(Camera *camera,
> > +             const StreamRoles &roles) override;
>
> Align to open (
>
> Done.

> > +     int configure(Camera *camera, CameraConfiguration *config)
> override;
> > +
> > +     int exportFrameBuffers(Camera *camera, Stream *stream,
> > +                            std::vector<std::unique_ptr<FrameBuffer>>
> *buffers) override;
> > +
> > +     int start(Camera *camera, const ControlList *controls) override;
> > +     void stopDevice(Camera *camera) override;
> > +
> > +     int queueRequestDevice(Camera *camera, Request *request) override;
> > +
> > +     bool match(DeviceEnumerator *enumerator) override;
> > +
> > +private:
> > +     FakeCameraData *cameraData(Camera *camera)
> > +     {
> > +             return static_cast<FakeCameraData *>(camera->_d());
> > +     }
> > +
> > +     int registerCameras();
> > +
> > +     static bool registered_;
> > +};
> > +
> > +bool PipelineHandlerFake::registered_ = false;
>
> What purpose does this serve ? Do you get multiple calls to match()
> so that you need to keep a flag ?
>
> Yes, in `src/libcamera/camera_manager.cpp`, it intentionally calls match()
multiple times
until it returns false (which means no devices matched). Therefore, this
flag is a hack, as the fake media devices haven't been created yet.


> > +
> > +FakeCameraConfiguration::FakeCameraConfiguration(FakeCameraData *data)
> > +     : CameraConfiguration()
> > +{
> > +     data_ = data;
> > +}
> > +
> > +CameraConfiguration::Status FakeCameraConfiguration::validate()
> > +{
> > +     Status status = Valid;
> > +
> > +     if (config_.empty())
> > +             return Invalid;
> > +
> > +     // Transform combined = transform * data_->rotationTransform_;
> > +
> > +     // /*
> > +     //  * We combine the platform and user transform, but must "adjust
> away"
> > +     //  * any combined result that includes a transposition, as we
> can't do
> > +     //  * those. In this case, flipping only the transpose bit is
> helpful to
> > +     //  * applications - they either get the transform they requested,
> or have
> > +     //  * to do a simple transpose themselves (they don't have to
> worry about
> > +     //  * the other possible cases).
> > +     //  */
> > +     // if (!!(combined & Transform::Transpose)) {
> > +     //      /*
> > +     //       * Flipping the transpose bit in "transform" flips it in
> the
> > +     //       * combined result too (as it's the last thing that
> happens),
> > +     //       * which is of course clearing it.
> > +     //       */
> > +     //      transform ^= Transform::Transpose;
> > +     //      combined &= ~Transform::Transpose;
> > +     //      status = Adjusted;
> > +     // }
> > +
> > +     // /*
> > +     //  * We also check if the sensor doesn't do h/vflips at all, in
> which
> > +     //  * case we clear them, and the application will have to do
> everything.
> > +     //  */
> > +     // if (!data_->supportsFlips_ && !!combined) {
> > +     //      /*
> > +     //       * If the sensor can do no transforms, then combined must
> be
> > +     //       * changed to the identity. The only user transform that
> gives
> > +     //       * rise to this is the inverse of the rotation. (Recall
> that
> > +     //       * combined = transform * rotationTransform.)
> > +     //       */
> > +     //      transform = -data_->rotationTransform_;
> > +     //      combined = Transform::Identity;
> > +     //      status = Adjusted;
> > +     // }
> > +
> > +     /*
> > +      * Store the final combined transform that configure() will need to
> > +      * apply to the sensor to save us working it out again.
> > +      */
> > +     // combinedTransform_ = combined;
>
> Please drop commented out code :0
>
> > +
> > +     /* Cap the number of entries to the available streams. */
> > +     if (config_.size() > kMaxStreams) {
> > +             config_.resize(kMaxStreams);
> > +             status = Adjusted;
> > +     }
> > +
> > +     /*
> > +      * Validate the requested stream configuration and select the
> sensor
> > +      * format by collecting the maximum RAW stream width and height and
> > +      * picking the closest larger match.
> > +      *
> > +      * If no RAW stream is requested use the one of the largest YUV
> stream,
> > +      * plus margin pixels for the IF and BDS rectangle to downscale.
> > +      *
> > +      * \todo Clarify the IF and BDS margins requirements.
>
> Please drop IPU3 related stuff
>
> Will do. I'll update the number of supported streams as well.

> > +      */
> > +     unsigned int rawCount = 0;
> > +     unsigned int yuvCount = 0;
> > +     Size rawRequirement;
> > +     Size maxYuvSize;
> > +     Size rawSize;
> > +
> > +     for (const StreamConfiguration &cfg : config_) {
> > +             const PixelFormatInfo &info =
> PixelFormatInfo::info(cfg.pixelFormat);
> > +
> > +             if (info.colourEncoding ==
> PixelFormatInfo::ColourEncodingRAW) {
> > +                     rawCount++;
> > +                     rawSize = std::max(rawSize, cfg.size);
> > +             } else {
> > +                     yuvCount++;
> > +                     maxYuvSize = std::max(maxYuvSize, cfg.size);
> > +                     rawRequirement.expandTo(cfg.size);
> > +             }
> > +     }
> > +
> > +     // TODO: Base on number of cameras?
>
> Well, should this pipeline register more than one camera ?
> And anyway, the number of streams is per-camera so the number of
> cameras should not be relevant here
>
>
Yes, and yes. The configuration file should specify the number of supported
streams.


> > +     if (rawCount > 1 || yuvCount > 2) {
> > +             LOG(Fake, Debug) << "Camera configuration not supported";
> > +             return Invalid;
> > +     }
> > +
> > +     /*
> > +      * Generate raw configuration from CIO2.
> > +      *
> > +      * The output YUV streams will be limited in size to the maximum
> frame
> > +      * size requested for the RAW stream, if present.
> > +      *
> > +      * If no raw stream is requested, generate a size from the largest
> YUV
> > +      * stream, aligned to the ImgU constraints and bound
> > +      * by the sensor's maximum resolution. See
> > +      * https://bugs.libcamera.org/show_bug.cgi?id=32
> > +      */
> > +     // TODO
> > +     if (rawSize.isNull())
> > +             rawSize = rawRequirement;
>
> All of these serves on IPU3 to generate a size suitable for the CIO2
> and the sensor. You can remove it.
>
>
Right, thanks!

> +
> > +     /*
> > +      * Adjust the configurations if needed and assign streams while
> > +      * iterating them.
> > +      */
> > +     bool mainOutputAvailable = true;
> > +     for (unsigned int i = 0; i < config_.size(); ++i) {
> > +             const PixelFormatInfo &info =
> PixelFormatInfo::info(config_[i].pixelFormat);
> > +             const StreamConfiguration originalCfg = config_[i];
> > +             StreamConfiguration *cfg = &config_[i];
> > +
> > +             LOG(Fake, Debug) << "Validating stream: " <<
> config_[i].toString();
> > +
> > +             if (info.colourEncoding ==
> PixelFormatInfo::ColourEncodingRAW) {
> > +                     /* Initialize the RAW stream with the CIO2
> configuration. */
> > +                     cfg->size = rawSize;
> > +                     // TODO: check
> > +                     cfg->pixelFormat = formats::SBGGR10_IPU3;
> > +                     cfg->bufferCount =
> FakeCameraConfiguration::kBufferCount;
> > +                     cfg->stride = info.stride(cfg->size.width, 0, 64);
> > +                     cfg->frameSize = info.frameSize(cfg->size, 64);
> > +                     cfg->setStream(const_cast<Stream
> *>(&data_->rawStream_));
> > +
> > +                     LOG(Fake, Debug) << "Assigned " << cfg->toString()
> > +                                      << " to the raw stream";
> > +             } else {
> > +                     /* Assign and configure the main and viewfinder
> outputs. */
> > +
> > +                     cfg->pixelFormat = formats::NV12;
> > +                     cfg->bufferCount = kBufferCount;
> > +                     cfg->stride = info.stride(cfg->size.width, 0, 1);
> > +                     cfg->frameSize = info.frameSize(cfg->size, 1);
> > +
> > +                     /*
> > +                      * Use the main output stream in case only one
> stream is
> > +                      * requested or if the current configuration is
> the one
> > +                      * with the maximum YUV output size.
> > +                      */
> > +                     if (mainOutputAvailable &&
> > +                         (originalCfg.size == maxYuvSize || yuvCount ==
> 1)) {
> > +                             cfg->setStream(const_cast<Stream
> *>(&data_->outStream_));
> > +                             mainOutputAvailable = false;
> > +
> > +                             LOG(Fake, Debug) << "Assigned " <<
> cfg->toString()
> > +                                              << " to the main output";
> > +                     } else {
> > +                             cfg->setStream(const_cast<Stream
> *>(&data_->vfStream_));
> > +
> > +                             LOG(Fake, Debug) << "Assigned " <<
> cfg->toString()
> > +                                              << " to the viewfinder
> output";
> > +                     }
> > +             }
>
> Again I'm not sure how much of the IPU3 this pipeline should mimic,
> or it should establish requirements and constraints (ie max 2 YUV
> streams of max size width x height, and 1 RAW stream in format
> V4L2_PIX_FMT_..)
>
>
Exactly. Will do.


> > +
> > +             if (cfg->pixelFormat != originalCfg.pixelFormat ||
> > +                 cfg->size != originalCfg.size) {
> > +                     LOG(Fake, Debug)
> > +                             << "Stream " << i << " configuration
> adjusted to "
> > +                             << cfg->toString();
> > +                     status = Adjusted;
> > +             }
> > +     }
> > +
> > +     return status;
> > +}
> > +
> > +PipelineHandlerFake::PipelineHandlerFake(CameraManager *manager)
> > +     : PipelineHandler(manager)
> > +{
> > +     // TODO: read the fake hal spec file.
>
> If there's a design document with requirements, can you share it ?
>
>
Will do.


> > +}
> > +
> > +CameraConfiguration *PipelineHandlerFake::generateConfiguration(Camera
> *camera,
> > +                                                             const
> StreamRoles &roles)
> > +{
> > +     FakeCameraData *data = cameraData(camera);
> > +     FakeCameraConfiguration *config = new
> FakeCameraConfiguration(data);
> > +
> > +     if (roles.empty())
> > +             return config;
> > +
> > +     Size minSize, sensorResolution;
> > +     for (const auto& resolution : data->supported_resolutions_) {
> > +             if (minSize.isNull() || minSize > resolution.size)
> > +                     minSize = resolution.size;
> > +
> > +             sensorResolution = std::max(sensorResolution,
> resolution.size);
> > +     }
> > +
>
> Those are hard-coded values, I think you can reuse them here.
> Unless the idea is to make this information come from a configuration
> file or something similar.
>
> Yes, supportedResolutions_ should eventually come from a configuration
file, provided by the tester.


> > +     for (const StreamRole role : roles) {
> > +             std::map<PixelFormat, std::vector<SizeRange>>
> streamFormats;
> > +             unsigned int bufferCount;
> > +             PixelFormat pixelFormat;
> > +             Size size;
> > +
> > +             switch (role) {
> > +             case StreamRole::StillCapture:
> > +                     size = sensorResolution;
> > +                     pixelFormat = formats::NV12;
> > +                     bufferCount =
> FakeCameraConfiguration::kBufferCount;
> > +                     streamFormats[pixelFormat] = { { minSize, size } };
> > +
> > +                     break;
> > +
> > +             case StreamRole::Raw: {
> > +                     // TODO: check
> > +                     pixelFormat = formats::SBGGR10_IPU3;
> > +                     size = sensorResolution;
> > +                     bufferCount =
> FakeCameraConfiguration::kBufferCount;
> > +                     streamFormats[pixelFormat] = { { minSize, size } };
> > +
> > +                     break;
> > +             }
> > +
> > +             case StreamRole::Viewfinder:
> > +             case StreamRole::VideoRecording: {
> > +                     /*
> > +                      * Default viewfinder and videorecording to
> 1280x720,
> > +                      * capped to the maximum sensor resolution and
> aligned
> > +                      * to the ImgU output constraints.
> > +                      */
> > +                     size = sensorResolution;
> > +                     pixelFormat = formats::NV12;
> > +                     bufferCount =
> FakeCameraConfiguration::kBufferCount;
> > +                     streamFormats[pixelFormat] = { { minSize, size } };
> > +
> > +                     break;
> > +             }
> > +
> > +             default:
> > +                     LOG(Fake, Error)
> > +                             << "Requested stream role not supported: "
> << role;
> > +                     delete config;
> > +                     return nullptr;
> > +             }
> > +
> > +             StreamFormats formats(streamFormats);
> > +             StreamConfiguration cfg(formats);
> > +             cfg.size = size;
> > +             cfg.pixelFormat = pixelFormat;
> > +             cfg.bufferCount = bufferCount;
> > +             config->addConfiguration(cfg);
> > +     }
> > +
> > +     if (config->validate() == CameraConfiguration::Invalid)
> > +             return {};
> > +
> > +     return config;
> > +}
> > +
> > +int PipelineHandlerFake::configure(Camera *camera, CameraConfiguration
> *c)
> > +{
> > +     if (camera || c)
> > +             return 0;
> > +     return 0;
> > +}
> > +
> > +int PipelineHandlerFake::exportFrameBuffers(Camera *camera, Stream
> *stream,
> > +
>  std::vector<std::unique_ptr<FrameBuffer>> *buffers)
> > +{
> > +     // Assume it's never called.
> > +     LOG(Fake, Fatal) << "exportFrameBuffers should never be called";
> > +     if (camera || stream || buffers)
> > +             return -EINVAL;
> > +     return -EINVAL;
> > +}
> > +
> > +int PipelineHandlerFake::start(Camera *camera, [[maybe_unused]] const
> ControlList *controls)
> > +{
> > +     FakeCameraData *data = cameraData(camera);
> > +     data->started_ = true;
> > +
> > +     return 0;
> > +}
> > +
> > +void PipelineHandlerFake::stopDevice(Camera *camera)
> > +{
> > +     FakeCameraData *data = cameraData(camera);
> > +
> > +     data->started_ = false;
> > +}
> > +
> > +int PipelineHandlerFake::queueRequestDevice(Camera *camera, Request
> *request)
> > +{
> > +     if (!camera)
> > +             return -EINVAL;
>
> Can this happen ?
>
> Actually it's a hack: the CrOS compiler fails if there's any unused
argument...
I'll try to clean it up later.


> > +
> > +     for (auto it : request->buffers())
> > +             completeBuffer(request, it.second);
> > +
> > +     // TODO: request.metadata()
> > +     request->metadata().set(controls::SensorTimestamp,
> CurrentTimestamp());
> > +     completeRequest(request);
> > +
> > +     return 0;
> > +}
> > +
> > +bool PipelineHandlerFake::match(DeviceEnumerator *enumerator)
> > +{
> > +     // TODO: exhaust all devices in |enumerator|.
> > +     if (!enumerator)
> > +             LOG(Fake, Info) << "Invalid enumerator";
>
> Can this happen ?
>
> ditto

> > +
> > +     if (registered_)
> > +             return false;
> > +
> > +     registered_ = true;
> > +     return registerCameras() == 0;
> > +}
> > +
> > +/**
> > + * \brief Initialise ImgU and CIO2 devices associated with cameras
> > + *
> > + * Initialise the two ImgU instances and create cameras with an
> associated
> > + * CIO2 device instance.
> > + *
> > + * \return 0 on success or a negative error code for error or if no
> camera
> > + * has been created
> > + * \retval -ENODEV no camera has been created
> > + */
> > +int PipelineHandlerFake::registerCameras()
> > +{
> > +     std::unique_ptr<FakeCameraData> data =
> > +             std::make_unique<FakeCameraData>(this);
> > +     std::set<Stream *> streams = {
> > +             &data->outStream_,
> > +             &data->vfStream_,
> > +             &data->rawStream_,
> > +     };
> > +
> > +     // TODO: Read from config or from IPC.
> > +     // TODO: Check with Han-lin: Can this function be called more than
> once?
> > +     data->supported_resolutions_.resize(2);
> > +     data->supported_resolutions_[0].size = Size(1920, 1080);
> > +     data->supported_resolutions_[0].frame_rates.push_back(30);
> > +     data->supported_resolutions_[0].formats.push_back("YCbCr_420_888");
> > +     data->supported_resolutions_[0].formats.push_back("BLOB");
> > +     data->supported_resolutions_[1].size = Size(1280, 720);
> > +     data->supported_resolutions_[1].frame_rates.push_back(30);
> > +     data->supported_resolutions_[1].frame_rates.push_back(60);
> > +     data->supported_resolutions_[1].formats.push_back("YCbCr_420_888");
> > +     data->supported_resolutions_[1].formats.push_back("BLOB");
> > +
> > +     // TODO: Assign different locations for different cameras based on
> config.
> > +     data->properties_.set(properties::Location,
> properties::CameraLocationFront);
> > +     data->properties_.set(properties::PixelArrayActiveAreas, {
> Rectangle(Size(1920, 1080)) });
> > +
> > +     // TODO: Set FrameDurationLimits based on config.
> > +     ControlInfoMap::Map controls{};
> > +     int64_t min_frame_duration = 30, max_frame_duration = 60;
> > +     controls[&controls::FrameDurationLimits] =
> ControlInfo(min_frame_duration, max_frame_duration);
> > +     data->controlInfo_ = ControlInfoMap(std::move(controls),
> controls::controls);
>
> FakeControls is ignored
>
> Right. I'll use it to initialize |controls| here.


> > +
> > +     std::shared_ptr<Camera> camera =
> > +             Camera::create(std::move(data), "\\_SB_.PCI0.I2C4.CAM1" /*
> cameraId */, streams);
> > +
> > +     registerCamera(std::move(camera));
> > +
> > +     return 0;
> > +}
> > +
> > +REGISTER_PIPELINE_HANDLER(PipelineHandlerFake)
> > +
> > +} /* namespace libcamera */
> > diff --git a/src/libcamera/pipeline/fake/meson.build
> b/src/libcamera/pipeline/fake/meson.build
> > new file mode 100644
> > index 00000000..13980b4a
> > --- /dev/null
> > +++ b/src/libcamera/pipeline/fake/meson.build
> > @@ -0,0 +1,3 @@
> > +# SPDX-License-Identifier: CC0-1.0
> > +
> > +libcamera_sources += files([ 'fake.cpp' ])
> > diff --git a/src/libcamera/pipeline_handler.cpp
> b/src/libcamera/pipeline_handler.cpp
> > index e5cb751c..4261154d 100644
> > --- a/src/libcamera/pipeline_handler.cpp
> > +++ b/src/libcamera/pipeline_handler.cpp
> > @@ -536,7 +536,7 @@ void
> PipelineHandler::registerCamera(std::shared_ptr<Camera> camera)
> >       cameras_.push_back(camera);
> >
> >       if (mediaDevices_.empty())
> > -             LOG(Pipeline, Fatal)
> > +             LOG(Pipeline, Error)
>
> I don't think we want that, you should probably open code the
>
>         manager_->addCamera(std::move(camera), devnums);
>
> call, with an empty list of devnums, which should be fine for this use
> case.
>
> The PipelineHandler::cameras_ vector will remain unpopulated, but
> since it is used only in case of a media device disconnection (afaict)
> it should be fine in your case
>
>
Ah you're right. Thanks!


>
> >                       << "Registering camera with no media devices!";
> >
> >       /*
> > diff --git a/test/camera/camera_reconfigure.cpp
> b/test/camera/camera_reconfigure.cpp
> > index d12e2413..06c87730 100644
> > --- a/test/camera/camera_reconfigure.cpp
> > +++ b/test/camera/camera_reconfigure.cpp
> > @@ -98,7 +98,7 @@ private:
> >                               return TestFail;
> >                       }
> >
> > -                     requests_.push_back(move(request));
> > +                     requests_.push_back(std::move(request));
>
> Unrelated and possibly not necessary, as the test has
>         using namespace std;
>
>
Yeah... it's also a hack required by the CrOS compiler... It
doesn't
accept unqualified |move|...


> Looking forward to seeing this skeleton getting populated, it will help
> testing the Android layer!
>
> Thanks
>    j
>
> >               }
> >
> >               camera_->requestCompleted.connect(this,
> &CameraReconfigure::requestComplete);
> > --
> > 2.38.0.rc1.362.ged0d419d3c-goog
> >
>
Cheng-Hao Yang Oct. 21, 2022, 10:38 p.m. UTC | #3
On Fri, Oct 14, 2022 at 7:36 PM Cheng-Hao Yang <chenghaoyang@chromium.org>
wrote:

> Thanks Jacopo!
>
> The patch v2 is uploaded based on your comments.
>
> On Wed, Oct 12, 2022 at 11:06 PM Jacopo Mondi <jacopo@jmondi.org> wrote:
>
>> Hi Harvey,
>>    I had the same thought as Kieran had, why not vimc ? I understand
>> your reasons behind it and the requirement to backport patches for it
>> to older kernel (btw, does it need anything more than CAP_IO_MC ?)
>>
>>
> Is it a great effort to backport patches to older kernel versions?
> Regarding CAP_IO_MC, I don't know :p. Anyone can help with this?
>
>
>> I have some general questions
>>
>> 1) What does this pipeline handler matches on ? It seems to me match()
>> always returns true, hence if the pipeline handler is compiled in, it
>> will always report one camera available ?
>>
>>
> I think it'll eventually depend on the configuration file provided by the
> tester.
> We might come up with some fake media devices, without matching the real
> ones. But I haven't made up my mind yet.
>
>
>> 2) What are the requirements for this pipeline handler ?
>>
>>    Is it enough to test the Android layer to implement stub methods,
>>    or should frames be generated somehow (I'm thinking at CTS frame
>>    rate checks in example).
>>
>>    Should it support multiple streams ? I guess it depends on the HW
>>    level one wants to test in CTS, or should it mimic the capabilities
>>    of a known device (ie IPU3 that can produce 2 YUV streams and one
>>    RAW)
>>
>>    I guess however this can be started as a skeleton and be populated
>>    as required to complete CTS.
>>
>> Additional features like custom configurations (of cameras) and frames
> from
> a video file will be added.
>
> In CrOS, it helps applications to test usages in different (fake) hardware
> setup,
> while it also helps libcamera test the Android adapter.
>
>
>> 3) Should all mentions of IPU3 related components be removed ?
>>
>> Yes
>
>
>> 4) Possible bikeshedding, but I wonder if "dummy" isn't better than
>>    "fake"
>>
>> As we need some more features, like a custom configuration file/cameras
> and frames from a video file, I believe "fake" is still more appropriate.
>
>
>> On Wed, Oct 12, 2022 at 07:59:25AM +0000, Harvey Yang via libcamera-devel
>> wrote:
>> > ---
>> >  meson_options.txt                       |   2 +-
>> >  src/android/camera_capabilities.cpp     |   1 +
>> >  src/libcamera/pipeline/fake/fake.cpp    | 518 ++++++++++++++++++++++++
>> >  src/libcamera/pipeline/fake/meson.build |   3 +
>> >  src/libcamera/pipeline_handler.cpp      |   2 +-
>> >  test/camera/camera_reconfigure.cpp      |   2 +-
>> >  6 files changed, 525 insertions(+), 3 deletions(-)
>> >  create mode 100644 src/libcamera/pipeline/fake/fake.cpp
>> >  create mode 100644 src/libcamera/pipeline/fake/meson.build
>> >
>> > diff --git a/meson_options.txt b/meson_options.txt
>> > index f1d67808..f08dfc5f 100644
>> > --- a/meson_options.txt
>> > +++ b/meson_options.txt
>> > @@ -37,7 +37,7 @@ option('lc-compliance',
>> >
>> >  option('pipelines',
>> >          type : 'array',
>> > -        choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple',
>> 'uvcvideo', 'vimc'],
>> > +        choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple',
>> 'uvcvideo', 'vimc', 'fake'],
>> >          description : 'Select which pipeline handlers to include')
>> >
>> >  option('qcam',
>> > diff --git a/src/android/camera_capabilities.cpp
>> b/src/android/camera_capabilities.cpp
>> > index 64bd8dde..730ceafc 100644
>> > --- a/src/android/camera_capabilities.cpp
>> > +++ b/src/android/camera_capabilities.cpp
>> > @@ -1077,6 +1077,7 @@ int CameraCapabilities::initializeStaticMetadata()
>> >       {
>> >               const Span<const Rectangle> &rects =
>> >
>>  properties.get(properties::PixelArrayActiveAreas).value_or(Span<const
>> Rectangle>{});
>> > +             // TODO: initialize at least one Rectangle as default.
>>
>> Unrelated, please drop
>>
>>
> Done.
>
>
>> >               std::vector<int32_t> data{
>> >                       static_cast<int32_t>(rects[0].x),
>> >                       static_cast<int32_t>(rects[0].y),
>> > diff --git a/src/libcamera/pipeline/fake/fake.cpp
>> b/src/libcamera/pipeline/fake/fake.cpp
>> > new file mode 100644
>> > index 00000000..518de6aa
>> > --- /dev/null
>> > +++ b/src/libcamera/pipeline/fake/fake.cpp
>> > @@ -0,0 +1,518 @@
>> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */
>> > +/*
>> > + * Copyright (C) 2019, Google Inc.
>> > + *
>> > + * ipu3.cpp - Pipeline handler for Intel Fake
>> > + */
>>
>> Please update year and remove mentions of Intel ?
>>
>>
> Done.
>
>
>> > +
>> > +#include <algorithm>
>> > +#include <iomanip>
>> > +#include <memory>
>> > +#include <queue>
>> > +#include <vector>
>>
>> Seems like you're also using std::set
>>
>>
> Done.
>
>
>> > +
>> > +#include <libcamera/base/log.h>
>> > +#include <libcamera/base/utils.h>
>> > +
>> > +#include <libcamera/camera.h>
>>
>> all internal headers "libcamera/internal/something.h" should include
>> <libcamera/something.h>, so if you include internal/camera.h you
>> probably can remove this
>>
>>
> Done.
>
>
>> > +#include <libcamera/control_ids.h>
>> > +#include <libcamera/formats.h>
>> > +#include <libcamera/property_ids.h>
>> > +#include <libcamera/request.h>
>> > +#include <libcamera/stream.h>
>> > +
>> > +#include "libcamera/internal/camera.h"
>> > +#include "libcamera/internal/camera_lens.h"
>> > +#include "libcamera/internal/camera_sensor.h"
>> > +#include "libcamera/internal/delayed_controls.h"
>>
>> The three above are not used it seems
>>
>>
> Done.
>
>
>> > +#include "libcamera/internal/device_enumerator.h"
>> > +#include "libcamera/internal/framebuffer.h"
>> > +#include "libcamera/internal/media_device.h"
>> > +#include "libcamera/internal/pipeline_handler.h"
>> > +
>> > +namespace libcamera {
>> > +
>> > +LOG_DEFINE_CATEGORY(Fake)
>> > +
>> > +uint64_t CurrentTimestamp()
>> > +{
>> > +     struct timespec ts;
>> > +     if (clock_gettime(CLOCK_MONOTONIC, &ts) < 0) {
>> > +             LOG(Fake, Error) << "Get clock time fails";
>> > +             return 0;
>> > +     }
>> > +
>> > +     return ts.tv_sec * 1'000'000'000LL + ts.tv_nsec;
>> > +}
>> > +
>> > +static const ControlInfoMap::Map FakeControls = {
>>
>> Missing #include <libcamera/controls.h> ?
>>
>>
> Done.
>
> > +     { &controls::draft::PipelineDepth, ControlInfo(2, 3) },
>> > +};
>> > +
>> > +class FakeCameraData : public Camera::Private
>> > +{
>> > +public:
>> > +     struct Resolution {
>> > +             Size size;
>> > +             std::vector<int> frame_rates;
>> > +             std::vector<std::string> formats;
>>
>> I would rather use libcamera::formats to verify the Android HAL does
>> the translation right.
>>
>> I believe you mean libcamera::PixelFormat. Done.
>
>
>> > +     };
>> > +
>> > +     FakeCameraData(PipelineHandler *pipe)
>> > +             : Camera::Private(pipe), supportsFlips_(false)
>> > +     {
>> > +     }
>> > +
>> > +     std::vector<Resolution> supported_resolutions_;
>>
>> supportedResolutions_
>>
>> Done.
>
>> > +
>> > +     Stream outStream_;
>> > +     Stream vfStream_;
>> > +     Stream rawStream_;
>> > +
>> > +     // TODO: Check if we should support.
>> > +     bool supportsFlips_;
>> > +     Transform rotationTransform_;
>>
>> For the sake of simplicity you can remove flip/rotation support from the
>> first skeleton
>>
>>
> Done.
>
>
>> > +
>> > +     // TODO: remove
>>
>> Please :)
>>
>> Done.
>
>> > +     /* Requests for which no buffer has been queued to the CIO2
>> device yet. */
>> > +     std::queue<Request *> pendingRequests_;
>> > +     /* Requests queued to the CIO2 device but not yet processed by
>> the ImgU. */
>> > +     std::queue<Request *> processingRequests_;
>> > +
>> > +     // ControlInfoMap ipaControls_;
>>
>> This as well
>>
>> Done.
>
>> > +     bool started_ = false;
>> > +};
>> > +
>> > +class FakeCameraConfiguration : public CameraConfiguration
>> > +{
>> > +public:
>> > +     static constexpr unsigned int kBufferCount = 4; // 4~6
>> > +     static constexpr unsigned int kMaxStreams = 3;
>> > +
>> > +     FakeCameraConfiguration(FakeCameraData *data);
>> > +
>> > +     Status validate() override;
>> > +
>> > +private:
>> > +     /*
>> > +      * The FakeCameraData instance is guaranteed to be valid as long
>> as the
>> > +      * corresponding Camera instance is valid. In order to borrow a
>> > +      * reference to the camera data, store a new reference to the
>> camera.
>> > +      */
>>
>> This comes from the IPU3 pipeline handler, and I wonder if it still
>> applies there or it should be removed.
>>
>>
> I believe it's still needed in |validate()| to set streams, right?
>
>
>> > +     const FakeCameraData *data_;
>> > +};
>> > +
>> > +class PipelineHandlerFake : public PipelineHandler
>> > +{
>> > +public:
>> > +     static constexpr unsigned int V4L2_CID_Fake_PIPE_MODE =
>> 0x009819c1;
>>
>> Unused, please drop
>>
>> Done.
>
>> > +     static constexpr Size kViewfinderSize{ 1280, 720 };
>> > +
>> > +     enum FakePipeModes {
>> > +             FakePipeModeVideo = 0,
>> > +             FakePipeModeStillCapture = 1,
>> > +     };
>>
>> ditto
>>
>>
> Done.
>
>
>> > +
>> > +     PipelineHandlerFake(CameraManager *manager);
>> > +
>> > +     CameraConfiguration *generateConfiguration(Camera *camera,
>> > +             const StreamRoles &roles) override;
>>
>> Align to open (
>>
>> Done.
>
>> > +     int configure(Camera *camera, CameraConfiguration *config)
>> override;
>> > +
>> > +     int exportFrameBuffers(Camera *camera, Stream *stream,
>> > +                            std::vector<std::unique_ptr<FrameBuffer>>
>> *buffers) override;
>> > +
>> > +     int start(Camera *camera, const ControlList *controls) override;
>> > +     void stopDevice(Camera *camera) override;
>> > +
>> > +     int queueRequestDevice(Camera *camera, Request *request) override;
>> > +
>> > +     bool match(DeviceEnumerator *enumerator) override;
>> > +
>> > +private:
>> > +     FakeCameraData *cameraData(Camera *camera)
>> > +     {
>> > +             return static_cast<FakeCameraData *>(camera->_d());
>> > +     }
>> > +
>> > +     int registerCameras();
>> > +
>> > +     static bool registered_;
>> > +};
>> > +
>> > +bool PipelineHandlerFake::registered_ = false;
>>
>> What purpose does this serve ? Do you get multiple calls to match()
>> so that you need to keep a flag ?
>>
>> Yes, in `src/libcamera/camera_manager.cpp`, it intentionally calls
> match() multiple times
> until it returns false (which means no devices matched). Therefore, this
> flag is a hack, as the fake media devices haven't been created yet.
>
>
>> > +
>> > +FakeCameraConfiguration::FakeCameraConfiguration(FakeCameraData *data)
>> > +     : CameraConfiguration()
>> > +{
>> > +     data_ = data;
>> > +}
>> > +
>> > +CameraConfiguration::Status FakeCameraConfiguration::validate()
>> > +{
>> > +     Status status = Valid;
>> > +
>> > +     if (config_.empty())
>> > +             return Invalid;
>> > +
>> > +     // Transform combined = transform * data_->rotationTransform_;
>> > +
>> > +     // /*
>> > +     //  * We combine the platform and user transform, but must
>> "adjust away"
>> > +     //  * any combined result that includes a transposition, as we
>> can't do
>> > +     //  * those. In this case, flipping only the transpose bit is
>> helpful to
>> > +     //  * applications - they either get the transform they
>> requested, or have
>> > +     //  * to do a simple transpose themselves (they don't have to
>> worry about
>> > +     //  * the other possible cases).
>> > +     //  */
>> > +     // if (!!(combined & Transform::Transpose)) {
>> > +     //      /*
>> > +     //       * Flipping the transpose bit in "transform" flips it in
>> the
>> > +     //       * combined result too (as it's the last thing that
>> happens),
>> > +     //       * which is of course clearing it.
>> > +     //       */
>> > +     //      transform ^= Transform::Transpose;
>> > +     //      combined &= ~Transform::Transpose;
>> > +     //      status = Adjusted;
>> > +     // }
>> > +
>> > +     // /*
>> > +     //  * We also check if the sensor doesn't do h/vflips at all, in
>> which
>> > +     //  * case we clear them, and the application will have to do
>> everything.
>> > +     //  */
>> > +     // if (!data_->supportsFlips_ && !!combined) {
>> > +     //      /*
>> > +     //       * If the sensor can do no transforms, then combined must
>> be
>> > +     //       * changed to the identity. The only user transform that
>> gives
>> > +     //       * rise to this is the inverse of the rotation. (Recall
>> that
>> > +     //       * combined = transform * rotationTransform.)
>> > +     //       */
>> > +     //      transform = -data_->rotationTransform_;
>> > +     //      combined = Transform::Identity;
>> > +     //      status = Adjusted;
>> > +     // }
>> > +
>> > +     /*
>> > +      * Store the final combined transform that configure() will need
>> to
>> > +      * apply to the sensor to save us working it out again.
>> > +      */
>> > +     // combinedTransform_ = combined;
>>
>> Please drop commented out code :0
>>
>> > +
>> > +     /* Cap the number of entries to the available streams. */
>> > +     if (config_.size() > kMaxStreams) {
>> > +             config_.resize(kMaxStreams);
>> > +             status = Adjusted;
>> > +     }
>> > +
>> > +     /*
>> > +      * Validate the requested stream configuration and select the
>> sensor
>> > +      * format by collecting the maximum RAW stream width and height
>> and
>> > +      * picking the closest larger match.
>> > +      *
>> > +      * If no RAW stream is requested use the one of the largest YUV
>> stream,
>> > +      * plus margin pixels for the IF and BDS rectangle to downscale.
>> > +      *
>> > +      * \todo Clarify the IF and BDS margins requirements.
>>
>> Please drop IPU3 related stuff
>>
>> Will do. I'll update the number of supported streams as well.
>
>> > +      */
>> > +     unsigned int rawCount = 0;
>> > +     unsigned int yuvCount = 0;
>> > +     Size rawRequirement;
>> > +     Size maxYuvSize;
>> > +     Size rawSize;
>> > +
>> > +     for (const StreamConfiguration &cfg : config_) {
>> > +             const PixelFormatInfo &info =
>> PixelFormatInfo::info(cfg.pixelFormat);
>> > +
>> > +             if (info.colourEncoding ==
>> PixelFormatInfo::ColourEncodingRAW) {
>> > +                     rawCount++;
>> > +                     rawSize = std::max(rawSize, cfg.size);
>> > +             } else {
>> > +                     yuvCount++;
>> > +                     maxYuvSize = std::max(maxYuvSize, cfg.size);
>> > +                     rawRequirement.expandTo(cfg.size);
>> > +             }
>> > +     }
>> > +
>> > +     // TODO: Base on number of cameras?
>>
>> Well, should this pipeline register more than one camera ?
>> And anyway, the number of streams is per-camera so the number of
>> cameras should be not relevant here
>>
>>
> Yes, and yes. The configuration file should specify the number of
> supported streams.
>
>
>> > +     if (rawCount > 1 || yuvCount > 2) {
>> > +             LOG(Fake, Debug) << "Camera configuration not supported";
>> > +             return Invalid;
>> > +     }
>> > +
>> > +     /*
>> > +      * Generate raw configuration from CIO2.
>> > +      *
>> > +      * The output YUV streams will be limited in size to the maximum
>> frame
>> > +      * size requested for the RAW stream, if present.
>> > +      *
>> > +      * If no raw stream is requested, generate a size from the
>> largest YUV
>> > +      * stream, aligned to the ImgU constraints and bound
>> > +      * by the sensor's maximum resolution. See
>> > +      * https://bugs.libcamera.org/show_bug.cgi?id=32
>> > +      */
>> > +     // TODO
>> > +     if (rawSize.isNull())
>> > +             rawSize = rawRequirement;
>>
>> All of these serves on IPU3 to generate a size suitable for the CIO2
>> and the sensor. You can remove it.
>>
>>
> Right, thanks!
>
> > +
>> > +     /*
>> > +      * Adjust the configurations if needed and assign streams while
>> > +      * iterating them.
>> > +      */
>> > +     bool mainOutputAvailable = true;
>> > +     for (unsigned int i = 0; i < config_.size(); ++i) {
>> > +             const PixelFormatInfo &info =
>> PixelFormatInfo::info(config_[i].pixelFormat);
>> > +             const StreamConfiguration originalCfg = config_[i];
>> > +             StreamConfiguration *cfg = &config_[i];
>> > +
>> > +             LOG(Fake, Debug) << "Validating stream: " <<
>> config_[i].toString();
>> > +
>> > +             if (info.colourEncoding ==
>> PixelFormatInfo::ColourEncodingRAW) {
>> > +                     /* Initialize the RAW stream with the CIO2
>> configuration. */
>> > +                     cfg->size = rawSize;
>> > +                     // TODO: check
>> > +                     cfg->pixelFormat = formats::SBGGR10_IPU3;
>> > +                     cfg->bufferCount =
>> FakeCameraConfiguration::kBufferCount;
>> > +                     cfg->stride = info.stride(cfg->size.width, 0, 64);
>> > +                     cfg->frameSize = info.frameSize(cfg->size, 64);
>> > +                     cfg->setStream(const_cast<Stream
>> *>(&data_->rawStream_));
>> > +
>> > +                     LOG(Fake, Debug) << "Assigned " << cfg->toString()
>> > +                                      << " to the raw stream";
>> > +             } else {
>> > +                     /* Assign and configure the main and viewfinder
>> outputs. */
>> > +
>> > +                     cfg->pixelFormat = formats::NV12;
>> > +                     cfg->bufferCount = kBufferCount;
>> > +                     cfg->stride = info.stride(cfg->size.width, 0, 1);
>> > +                     cfg->frameSize = info.frameSize(cfg->size, 1);
>> > +
>> > +                     /*
>> > +                      * Use the main output stream in case only one
>> stream is
>> > +                      * requested or if the current configuration is
>> the one
>> > +                      * with the maximum YUV output size.
>> > +                      */
>> > +                     if (mainOutputAvailable &&
>> > +                         (originalCfg.size == maxYuvSize || yuvCount
>> == 1)) {
>> > +                             cfg->setStream(const_cast<Stream
>> *>(&data_->outStream_));
>> > +                             mainOutputAvailable = false;
>> > +
>> > +                             LOG(Fake, Debug) << "Assigned " <<
>> cfg->toString()
>> > +                                              << " to the main output";
>> > +                     } else {
>> > +                             cfg->setStream(const_cast<Stream
>> *>(&data_->vfStream_));
>> > +
>> > +                             LOG(Fake, Debug) << "Assigned " <<
>> cfg->toString()
>> > +                                              << " to the viewfinder
>> output";
>> > +                     }
>> > +             }
>>
>> Again I'm not sure how much of the IPU3 should this pipeline mimick,
>> or it should establish requirements and constraints (ie max 2 YUV
>> streams of max size width x height, and 1 RAW stream in format
>> V4L2_PIX_FMT_..)
>>
>>
> Exactly. Will do.
>
>
>> > +
>> > +             if (cfg->pixelFormat != originalCfg.pixelFormat ||
>> > +                 cfg->size != originalCfg.size) {
>> > +                     LOG(Fake, Debug)
>> > +                             << "Stream " << i << " configuration
>> adjusted to "
>> > +                             << cfg->toString();
>> > +                     status = Adjusted;
>> > +             }
>> > +     }
>> > +
>> > +     return status;
>> > +}
>> > +
>> > +PipelineHandlerFake::PipelineHandlerFake(CameraManager *manager)
>> > +     : PipelineHandler(manager)
>> > +{
>> > +     // TODO: read the fake hal spec file.
>>
>> If there's a design document with requirements, can you share it ?
>>
>>
> Will do.
>
>

Just shared it with you, Umang, Paul, and Kieran. Please check that you
received the invitation email.
It seems the CrOS Camera team is going ahead with its own plan, though:
modifying the existing USB HAL into the new fake HAL. We can therefore focus
on our own use cases and just take their design as a reference.


> > +}
>> > +
>> > +CameraConfiguration *PipelineHandlerFake::generateConfiguration(Camera
>> *camera,
>> > +                                                             const
>> StreamRoles &roles)
>> > +{
>> > +     FakeCameraData *data = cameraData(camera);
>> > +     FakeCameraConfiguration *config = new
>> FakeCameraConfiguration(data);
>> > +
>> > +     if (roles.empty())
>> > +             return config;
>> > +
>> > +     Size minSize, sensorResolution;
>> > +     for (const auto& resolution : data->supported_resolutions_) {
>> > +             if (minSize.isNull() || minSize > resolution.size)
>> > +                     minSize = resolution.size;
>> > +
>> > +             sensorResolution = std::max(sensorResolution,
>> resolution.size);
>> > +     }
>> > +
>>
>> Those are hard-coded values, I think you can reuse them here.
>> Unless the idea is to make this information come from a configuration
>> file or something similar.
>>
> Yes, supported_resolutions_ should eventually come from a configuration
> file provided by the tester.
>
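[Editor's note] No such tester-provided configuration file exists yet; a purely hypothetical sketch — every key name below is invented — might carry the stream count, locations, and resolution table discussed in this thread:

```yaml
# Hypothetical fake-camera config; all keys are illustrative only.
cameras:
  - location: front
    max_streams: 3
    supported_resolutions:
      - size: [1920, 1080]
        frame_rates: [30]
        formats: [YCbCr_420_888, BLOB]
      - size: [1280, 720]
        frame_rates: [30, 60]
        formats: [YCbCr_420_888, BLOB]
```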
>
>> > +     for (const StreamRole role : roles) {
>> > +             std::map<PixelFormat, std::vector<SizeRange>>
>> streamFormats;
>> > +             unsigned int bufferCount;
>> > +             PixelFormat pixelFormat;
>> > +             Size size;
>> > +
>> > +             switch (role) {
>> > +             case StreamRole::StillCapture:
>> > +                     size = sensorResolution;
>> > +                     pixelFormat = formats::NV12;
>> > +                     bufferCount =
>> FakeCameraConfiguration::kBufferCount;
>> > +                     streamFormats[pixelFormat] = { { minSize, size }
>> };
>> > +
>> > +                     break;
>> > +
>> > +             case StreamRole::Raw: {
>> > +                     // TODO: check
>> > +                     pixelFormat = formats::SBGGR10_IPU3;
>> > +                     size = sensorResolution;
>> > +                     bufferCount =
>> FakeCameraConfiguration::kBufferCount;
>> > +                     streamFormats[pixelFormat] = { { minSize, size }
>> };
>> > +
>> > +                     break;
>> > +             }
>> > +
>> > +             case StreamRole::Viewfinder:
>> > +             case StreamRole::VideoRecording: {
>> > +                     /*
>> > +                      * Default viewfinder and videorecording to
>> 1280x720,
>> > +                      * capped to the maximum sensor resolution and
>> aligned
>> > +                      * to the ImgU output constraints.
>> > +                      */
>> > +                     size = sensorResolution;
>> > +                     pixelFormat = formats::NV12;
>> > +                     bufferCount =
>> FakeCameraConfiguration::kBufferCount;
>> > +                     streamFormats[pixelFormat] = { { minSize, size }
>> };
>> > +
>> > +                     break;
>> > +             }
>> > +
>> > +             default:
>> > +                     LOG(Fake, Error)
>> > +                             << "Requested stream role not supported:
>> " << role;
>> > +                     delete config;
>> > +                     return nullptr;
>> > +             }
>> > +
>> > +             StreamFormats formats(streamFormats);
>> > +             StreamConfiguration cfg(formats);
>> > +             cfg.size = size;
>> > +             cfg.pixelFormat = pixelFormat;
>> > +             cfg.bufferCount = bufferCount;
>> > +             config->addConfiguration(cfg);
>> > +     }
>> > +
>> > +     if (config->validate() == CameraConfiguration::Invalid)
>> > +             return {};
>> > +
>> > +     return config;
>> > +}
>> > +
>> > +int PipelineHandlerFake::configure(Camera *camera, CameraConfiguration
>> *c)
>> > +{
>> > +     if (camera || c)
>> > +             return 0;
>> > +     return 0;
>> > +}
>> > +
>> > +int PipelineHandlerFake::exportFrameBuffers(Camera *camera, Stream
>> *stream,
>> > +
>>  std::vector<std::unique_ptr<FrameBuffer>> *buffers)
>> > +{
>> > +     // Assume it's never called.
>> > +     LOG(Fake, Fatal) << "exportFrameBuffers should never be called";
>> > +     if (camera || stream || buffers)
>> > +             return -EINVAL;
>> > +     return -EINVAL;
>> > +}
>> > +
>> > +int PipelineHandlerFake::start(Camera *camera, [[maybe_unused]] const
>> ControlList *controls)
>> > +{
>> > +     FakeCameraData *data = cameraData(camera);
>> > +     data->started_ = true;
>> > +
>> > +     return 0;
>> > +}
>> > +
>> > +void PipelineHandlerFake::stopDevice(Camera *camera)
>> > +{
>> > +     FakeCameraData *data = cameraData(camera);
>> > +
>> > +     data->started_ = false;
>> > +}
>> > +
>> > +int PipelineHandlerFake::queueRequestDevice(Camera *camera, Request
>> *request)
>> > +{
>> > +     if (!camera)
>> > +             return -EINVAL;
>>
>> Can this happen ?
>>
> Actually it's a hack: the CrOS compiler fails the build on any unused
> argument. I'll try to clean it up later.
>
>
>> > +
>> > +     for (auto it : request->buffers())
>> > +             completeBuffer(request, it.second);
>> > +
>> > +     // TODO: request.metadata()
>> > +     request->metadata().set(controls::SensorTimestamp,
>> CurrentTimestamp());
>> > +     completeRequest(request);
>> > +
>> > +     return 0;
>> > +}
>> > +
>> > +bool PipelineHandlerFake::match(DeviceEnumerator *enumerator)
>> > +{
>> > +     // TODO: exhaust all devices in |enumerator|.
>> > +     if (!enumerator)
>> > +             LOG(Fake, Info) << "Invalid enumerator";
>>
>> Can this happen ?
>>
>> ditto
>
>> > +
>> > +     if (registered_)
>> > +             return false;
>> > +
>> > +     registered_ = true;
>> > +     return registerCameras() == 0;
>> > +}
>> > +
>> > +/**
>> > + * \brief Initialise ImgU and CIO2 devices associated with cameras
>> > + *
>> > + * Initialise the two ImgU instances and create cameras with an
>> associated
>> > + * CIO2 device instance.
>> > + *
>> > + * \return 0 on success or a negative error code for error or if no
>> camera
>> > + * has been created
>> > + * \retval -ENODEV no camera has been created
>> > + */
>> > +int PipelineHandlerFake::registerCameras()
>> > +{
>> > +     std::unique_ptr<FakeCameraData> data =
>> > +             std::make_unique<FakeCameraData>(this);
>> > +     std::set<Stream *> streams = {
>> > +             &data->outStream_,
>> > +             &data->vfStream_,
>> > +             &data->rawStream_,
>> > +     };
>> > +
>> > +     // TODO: Read from config or from IPC.
>> > +     // TODO: Check with Han-lin: Can this function be called more
>> than once?
>> > +     data->supported_resolutions_.resize(2);
>> > +     data->supported_resolutions_[0].size = Size(1920, 1080);
>> > +     data->supported_resolutions_[0].frame_rates.push_back(30);
>> > +
>>  data->supported_resolutions_[0].formats.push_back("YCbCr_420_888");
>> > +     data->supported_resolutions_[0].formats.push_back("BLOB");
>> > +     data->supported_resolutions_[1].size = Size(1280, 720);
>> > +     data->supported_resolutions_[1].frame_rates.push_back(30);
>> > +     data->supported_resolutions_[1].frame_rates.push_back(60);
>> > +
>>  data->supported_resolutions_[1].formats.push_back("YCbCr_420_888");
>> > +     data->supported_resolutions_[1].formats.push_back("BLOB");
>> > +
>> > +     // TODO: Assign different locations for different cameras based
>> on config.
>> > +     data->properties_.set(properties::Location,
>> properties::CameraLocationFront);
>> > +     data->properties_.set(properties::PixelArrayActiveAreas, {
>> Rectangle(Size(1920, 1080)) });
>> > +
>> > +     // TODO: Set FrameDurationLimits based on config.
>> > +     ControlInfoMap::Map controls{};
>> > +     int64_t min_frame_duration = 30, max_frame_duration = 60;
>> > +     controls[&controls::FrameDurationLimits] =
>> ControlInfo(min_frame_duration, max_frame_duration);
>> > +     data->controlInfo_ = ControlInfoMap(std::move(controls),
>> controls::controls);
>>
>> FakeControls is ignored
>>
>> Right. Use it to initialize |controls| here.
>
>
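[Editor's note] Assuming the 30/60 values above are meant as frame rates, note that controls::FrameDurationLimits is expressed as frame durations in microseconds, so they would need converting. A minimal sketch — `frameDurationUs()` is an invented helper, not part of the patch:

```cpp
#include <cstdint>

// controls::FrameDurationLimits expects frame *durations* in microseconds,
// not frame rates, so configured fps values need converting first.
// Illustrative helper only; integer division truncates.
constexpr int64_t frameDurationUs(int64_t fps)
{
	return 1'000'000 / fps;
}
```

With that, the ControlInfo would span frameDurationUs(60) to frameDurationUs(30), i.e. the 16666–33333 µs range, rather than the raw 30/60 figures.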
>> > +
>> > +     std::shared_ptr<Camera> camera =
>> > +             Camera::create(std::move(data), "\\_SB_.PCI0.I2C4.CAM1"
>> /* cameraId */, streams);
>> > +
>> > +     registerCamera(std::move(camera));
>> > +
>> > +     return 0;
>> > +}
>> > +
>> > +REGISTER_PIPELINE_HANDLER(PipelineHandlerFake)
>> > +
>> > +} /* namespace libcamera */
>> > diff --git a/src/libcamera/pipeline/fake/meson.build
>> b/src/libcamera/pipeline/fake/meson.build
>> > new file mode 100644
>> > index 00000000..13980b4a
>> > --- /dev/null
>> > +++ b/src/libcamera/pipeline/fake/meson.build
>> > @@ -0,0 +1,3 @@
>> > +# SPDX-License-Identifier: CC0-1.0
>> > +
>> > +libcamera_sources += files([ 'fake.cpp' ])
>> > diff --git a/src/libcamera/pipeline_handler.cpp
>> b/src/libcamera/pipeline_handler.cpp
>> > index e5cb751c..4261154d 100644
>> > --- a/src/libcamera/pipeline_handler.cpp
>> > +++ b/src/libcamera/pipeline_handler.cpp
>> > @@ -536,7 +536,7 @@ void
>> PipelineHandler::registerCamera(std::shared_ptr<Camera> camera)
>> >       cameras_.push_back(camera);
>> >
>> >       if (mediaDevices_.empty())
>> > -             LOG(Pipeline, Fatal)
>> > +             LOG(Pipeline, Error)
>>
>> I don't think we want that, you should probably open code the
>>
>>         manager_->addCamera(std::move(camera), devnums);
>>
>> call, with an empty list of devnums, which should be fine for this use
>> case.
>>
>> The PipelineHandler::cameras_ vector will remain unpopulated, but
>> since it is used only in case of a media device disconnection (afaict)
>> it should be fine in your case
>>
>>
> Ah you're right. Thanks!
>
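[Editor's note] A sketch of what registerCameras() could do instead, based on the addCamera() call quoted in Jacopo's comment above — this fragment is not compiled against libcamera here, and the camera ID is the placeholder already used in the patch:

```cpp
/* Sketch only: register the camera directly with the manager, passing an
 * empty devnum list since the fake pipeline has no media devices. */
std::shared_ptr<Camera> camera =
	Camera::create(std::move(data), "\\_SB_.PCI0.I2C4.CAM1", streams);
manager_->addCamera(std::move(camera), {});
```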
>
>>
>> >                       << "Registering camera with no media devices!";
>> >
>> >       /*
>> > diff --git a/test/camera/camera_reconfigure.cpp
>> b/test/camera/camera_reconfigure.cpp
>> > index d12e2413..06c87730 100644
>> > --- a/test/camera/camera_reconfigure.cpp
>> > +++ b/test/camera/camera_reconfigure.cpp
>> > @@ -98,7 +98,7 @@ private:
>> >                               return TestFail;
>> >                       }
>> >
>> > -                     requests_.push_back(move(request));
>> > +                     requests_.push_back(std::move(request));
>>
>> Unrelated and possibly not necessary, as the test has
>>         using namespace std;
>>
>>
> Yeah, it's also a hack required by the CrOS compiler: it doesn't accept
> the unqualified |move|.
>
>
>> Looking forward to seeing this skeleton getting populated, it will help
>> testing the Android layer!
>>
>> Thanks
>>    j
>>
>> >               }
>> >
>> >               camera_->requestCompleted.connect(this,
>> &CameraReconfigure::requestComplete);
>> > --
>> > 2.38.0.rc1.362.ged0d419d3c-goog
>> >
>>
>

Patch

diff --git a/meson_options.txt b/meson_options.txt
index f1d67808..f08dfc5f 100644
--- a/meson_options.txt
+++ b/meson_options.txt
@@ -37,7 +37,7 @@  option('lc-compliance',
 
 option('pipelines',
         type : 'array',
-        choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc'],
+        choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc', 'fake'],
         description : 'Select which pipeline handlers to include')
 
 option('qcam',
diff --git a/src/android/camera_capabilities.cpp b/src/android/camera_capabilities.cpp
index 64bd8dde..730ceafc 100644
--- a/src/android/camera_capabilities.cpp
+++ b/src/android/camera_capabilities.cpp
@@ -1077,6 +1077,7 @@  int CameraCapabilities::initializeStaticMetadata()
 	{
 		const Span<const Rectangle> &rects =
 			properties.get(properties::PixelArrayActiveAreas).value_or(Span<const Rectangle>{});
+		// TODO: initialize at least one Rectangle as default.
 		std::vector<int32_t> data{
 			static_cast<int32_t>(rects[0].x),
 			static_cast<int32_t>(rects[0].y),
diff --git a/src/libcamera/pipeline/fake/fake.cpp b/src/libcamera/pipeline/fake/fake.cpp
new file mode 100644
index 00000000..518de6aa
--- /dev/null
+++ b/src/libcamera/pipeline/fake/fake.cpp
@@ -0,0 +1,518 @@ 
+/* SPDX-License-Identifier: LGPL-2.1-or-later */
+/*
+ * Copyright (C) 2019, Google Inc.
+ *
+ * fake.cpp - Pipeline handler for a fake camera
+ */
+
+#include <algorithm>
+#include <iomanip>
+#include <memory>
+#include <queue>
+#include <vector>
+
+#include <libcamera/base/log.h>
+#include <libcamera/base/utils.h>
+
+#include <libcamera/camera.h>
+#include <libcamera/control_ids.h>
+#include <libcamera/formats.h>
+#include <libcamera/property_ids.h>
+#include <libcamera/request.h>
+#include <libcamera/stream.h>
+
+#include "libcamera/internal/camera.h"
+#include "libcamera/internal/camera_lens.h"
+#include "libcamera/internal/camera_sensor.h"
+#include "libcamera/internal/delayed_controls.h"
+#include "libcamera/internal/device_enumerator.h"
+#include "libcamera/internal/framebuffer.h"
+#include "libcamera/internal/media_device.h"
+#include "libcamera/internal/pipeline_handler.h"
+
+namespace libcamera {
+
+LOG_DEFINE_CATEGORY(Fake)
+
+uint64_t CurrentTimestamp()
+{
+	struct timespec ts;
+	if (clock_gettime(CLOCK_MONOTONIC, &ts) < 0) {
+		LOG(Fake, Error) << "clock_gettime() failed";
+		return 0;
+	}
+
+	return ts.tv_sec * 1'000'000'000LL + ts.tv_nsec;
+}
+
+static const ControlInfoMap::Map FakeControls = {
+	{ &controls::draft::PipelineDepth, ControlInfo(2, 3) },
+};
+
+class FakeCameraData : public Camera::Private
+{
+public:
+	struct Resolution {
+		Size size;
+		std::vector<int> frame_rates;
+		std::vector<std::string> formats;
+	};
+
+	FakeCameraData(PipelineHandler *pipe)
+		: Camera::Private(pipe), supportsFlips_(false)
+	{
+	}
+
+	std::vector<Resolution> supported_resolutions_;
+
+	Stream outStream_;
+	Stream vfStream_;
+	Stream rawStream_;
+
+	// TODO: Check if we should support.
+	bool supportsFlips_;
+	Transform rotationTransform_;
+
+	// TODO: remove
+	/* Requests for which no buffer has been queued to the CIO2 device yet. */
+	std::queue<Request *> pendingRequests_;
+	/* Requests queued to the CIO2 device but not yet processed by the ImgU. */
+	std::queue<Request *> processingRequests_;
+
+	// ControlInfoMap ipaControls_;
+	bool started_ = false;
+};
+
+class FakeCameraConfiguration : public CameraConfiguration
+{
+public:
+	static constexpr unsigned int kBufferCount = 4; // 4~6
+	static constexpr unsigned int kMaxStreams = 3;
+
+	FakeCameraConfiguration(FakeCameraData *data);
+
+	Status validate() override;
+
+private:
+	/*
+	 * The FakeCameraData instance is guaranteed to be valid as long as the
+	 * corresponding Camera instance is valid. In order to borrow a
+	 * reference to the camera data, store a new reference to the camera.
+	 */
+	const FakeCameraData *data_;
+};
+
+class PipelineHandlerFake : public PipelineHandler
+{
+public:
+	static constexpr unsigned int V4L2_CID_Fake_PIPE_MODE = 0x009819c1;
+	static constexpr Size kViewfinderSize{ 1280, 720 };
+
+	enum FakePipeModes {
+		FakePipeModeVideo = 0,
+		FakePipeModeStillCapture = 1,
+	};
+
+	PipelineHandlerFake(CameraManager *manager);
+
+	CameraConfiguration *generateConfiguration(Camera *camera,
+		const StreamRoles &roles) override;
+	int configure(Camera *camera, CameraConfiguration *config) override;
+
+	int exportFrameBuffers(Camera *camera, Stream *stream,
+			       std::vector<std::unique_ptr<FrameBuffer>> *buffers) override;
+
+	int start(Camera *camera, const ControlList *controls) override;
+	void stopDevice(Camera *camera) override;
+
+	int queueRequestDevice(Camera *camera, Request *request) override;
+
+	bool match(DeviceEnumerator *enumerator) override;
+
+private:
+	FakeCameraData *cameraData(Camera *camera)
+	{
+		return static_cast<FakeCameraData *>(camera->_d());
+	}
+
+	int registerCameras();
+
+	static bool registered_;
+};
+
+bool PipelineHandlerFake::registered_ = false;
+
+FakeCameraConfiguration::FakeCameraConfiguration(FakeCameraData *data)
+	: CameraConfiguration()
+{
+	data_ = data;
+}
+
+CameraConfiguration::Status FakeCameraConfiguration::validate()
+{
+	Status status = Valid;
+
+	if (config_.empty())
+		return Invalid;
+
+	// Transform combined = transform * data_->rotationTransform_;
+
+	// /*
+	//  * We combine the platform and user transform, but must "adjust away"
+	//  * any combined result that includes a transposition, as we can't do
+	//  * those. In this case, flipping only the transpose bit is helpful to
+	//  * applications - they either get the transform they requested, or have
+	//  * to do a simple transpose themselves (they don't have to worry about
+	//  * the other possible cases).
+	//  */
+	// if (!!(combined & Transform::Transpose)) {
+	// 	/*
+	// 	 * Flipping the transpose bit in "transform" flips it in the
+	// 	 * combined result too (as it's the last thing that happens),
+	// 	 * which is of course clearing it.
+	// 	 */
+	// 	transform ^= Transform::Transpose;
+	// 	combined &= ~Transform::Transpose;
+	// 	status = Adjusted;
+	// }
+
+	// /*
+	//  * We also check if the sensor doesn't do h/vflips at all, in which
+	//  * case we clear them, and the application will have to do everything.
+	//  */
+	// if (!data_->supportsFlips_ && !!combined) {
+	// 	/*
+	// 	 * If the sensor can do no transforms, then combined must be
+	// 	 * changed to the identity. The only user transform that gives
+	// 	 * rise to this is the inverse of the rotation. (Recall that
+	// 	 * combined = transform * rotationTransform.)
+	// 	 */
+	// 	transform = -data_->rotationTransform_;
+	// 	combined = Transform::Identity;
+	// 	status = Adjusted;
+	// }
+
+	/*
+	 * Store the final combined transform that configure() will need to
+	 * apply to the sensor to save us working it out again.
+	 */
+	// combinedTransform_ = combined;
+
+	/* Cap the number of entries to the available streams. */
+	if (config_.size() > kMaxStreams) {
+		config_.resize(kMaxStreams);
+		status = Adjusted;
+	}
+
+	/*
+	 * Validate the requested stream configuration and select the sensor
+	 * format by collecting the maximum RAW stream width and height and
+	 * picking the closest larger match.
+	 *
+	 * If no RAW stream is requested use the one of the largest YUV stream,
+	 * plus margin pixels for the IF and BDS rectangle to downscale.
+	 *
+	 * \todo Clarify the IF and BDS margins requirements.
+	 */
+	unsigned int rawCount = 0;
+	unsigned int yuvCount = 0;
+	Size rawRequirement;
+	Size maxYuvSize;
+	Size rawSize;
+
+	for (const StreamConfiguration &cfg : config_) {
+		const PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat);
+
+		if (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW) {
+			rawCount++;
+			rawSize = std::max(rawSize, cfg.size);
+		} else {
+			yuvCount++;
+			maxYuvSize = std::max(maxYuvSize, cfg.size);
+			rawRequirement.expandTo(cfg.size);
+		}
+	}
+
+	// TODO: Base on number of cameras?
+	if (rawCount > 1 || yuvCount > 2) {
+		LOG(Fake, Debug) << "Camera configuration not supported";
+		return Invalid;
+	}
+
+	/*
+	 * Generate raw configuration from CIO2.
+	 *
+	 * The output YUV streams will be limited in size to the maximum frame
+	 * size requested for the RAW stream, if present.
+	 *
+	 * If no raw stream is requested, generate a size from the largest YUV
+	 * stream, aligned to the ImgU constraints and bound
+	 * by the sensor's maximum resolution. See
+	 * https://bugs.libcamera.org/show_bug.cgi?id=32
+	 */
+	// TODO
+	if (rawSize.isNull())
+		rawSize = rawRequirement;
+
+	/*
+	 * Adjust the configurations if needed and assign streams while
+	 * iterating them.
+	 */
+	bool mainOutputAvailable = true;
+	for (unsigned int i = 0; i < config_.size(); ++i) {
+		const PixelFormatInfo &info = PixelFormatInfo::info(config_[i].pixelFormat);
+		const StreamConfiguration originalCfg = config_[i];
+		StreamConfiguration *cfg = &config_[i];
+
+		LOG(Fake, Debug) << "Validating stream: " << config_[i].toString();
+
+		if (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW) {
+			/* Initialize the RAW stream with the CIO2 configuration. */
+			cfg->size = rawSize;
+			// TODO: check
+			cfg->pixelFormat = formats::SBGGR10_IPU3;
+			cfg->bufferCount = FakeCameraConfiguration::kBufferCount;
+			cfg->stride = info.stride(cfg->size.width, 0, 64);
+			cfg->frameSize = info.frameSize(cfg->size, 64);
+			cfg->setStream(const_cast<Stream *>(&data_->rawStream_));
+
+			LOG(Fake, Debug) << "Assigned " << cfg->toString()
+					 << " to the raw stream";
+		} else {
+			/* Assign and configure the main and viewfinder outputs. */
+
+			cfg->pixelFormat = formats::NV12;
+			cfg->bufferCount = kBufferCount;
+			cfg->stride = info.stride(cfg->size.width, 0, 1);
+			cfg->frameSize = info.frameSize(cfg->size, 1);
+
+			/*
+			 * Use the main output stream in case only one stream is
+			 * requested or if the current configuration is the one
+			 * with the maximum YUV output size.
+			 */
+			if (mainOutputAvailable &&
+			    (originalCfg.size == maxYuvSize || yuvCount == 1)) {
+				cfg->setStream(const_cast<Stream *>(&data_->outStream_));
+				mainOutputAvailable = false;
+
+				LOG(Fake, Debug) << "Assigned " << cfg->toString()
+						 << " to the main output";
+			} else {
+				cfg->setStream(const_cast<Stream *>(&data_->vfStream_));
+
+				LOG(Fake, Debug) << "Assigned " << cfg->toString()
+						 << " to the viewfinder output";
+			}
+		}
+
+		if (cfg->pixelFormat != originalCfg.pixelFormat ||
+		    cfg->size != originalCfg.size) {
+			LOG(Fake, Debug)
+				<< "Stream " << i << " configuration adjusted to "
+				<< cfg->toString();
+			status = Adjusted;
+		}
+	}
+
+	return status;
+}
+
+PipelineHandlerFake::PipelineHandlerFake(CameraManager *manager)
+	: PipelineHandler(manager)
+{
+	// TODO: read the fake hal spec file.
+}
+
+CameraConfiguration *PipelineHandlerFake::generateConfiguration(Camera *camera,
+								const StreamRoles &roles)
+{
+	FakeCameraData *data = cameraData(camera);
+	FakeCameraConfiguration *config = new FakeCameraConfiguration(data);
+
+	if (roles.empty())
+		return config;
+
+	Size minSize, sensorResolution;
+	for (const auto& resolution : data->supported_resolutions_) {
+		if (minSize.isNull() || minSize > resolution.size)
+			minSize = resolution.size;
+
+		sensorResolution = std::max(sensorResolution, resolution.size);
+	}
+
+	for (const StreamRole role : roles) {
+		std::map<PixelFormat, std::vector<SizeRange>> streamFormats;
+		unsigned int bufferCount;
+		PixelFormat pixelFormat;
+		Size size;
+
+		switch (role) {
+		case StreamRole::StillCapture:
+			size = sensorResolution;
+			pixelFormat = formats::NV12;
+			bufferCount = FakeCameraConfiguration::kBufferCount;
+			streamFormats[pixelFormat] = { { minSize, size } };
+
+			break;
+
+		case StreamRole::Raw: {
+			/* TODO: Check which raw format a fake camera should expose. */
+			pixelFormat = formats::SBGGR10_IPU3;
+			size = sensorResolution;
+			bufferCount = FakeCameraConfiguration::kBufferCount;
+			streamFormats[pixelFormat] = { { minSize, size } };
+
+			break;
+		}
+
+		case StreamRole::Viewfinder:
+		case StreamRole::VideoRecording: {
+			/*
+			 * Default the viewfinder and video recording streams
+			 * to the maximum sensor resolution.
+			 */
+			size = sensorResolution;
+			pixelFormat = formats::NV12;
+			bufferCount = FakeCameraConfiguration::kBufferCount;
+			streamFormats[pixelFormat] = { { minSize, size } };
+
+			break;
+		}
+
+		default:
+			LOG(Fake, Error)
+				<< "Requested stream role not supported: " << role;
+			delete config;
+			return nullptr;
+		}
+
+		StreamFormats formats(streamFormats);
+		StreamConfiguration cfg(formats);
+		cfg.size = size;
+		cfg.pixelFormat = pixelFormat;
+		cfg.bufferCount = bufferCount;
+		config->addConfiguration(cfg);
+	}
+
+	if (config->validate() == CameraConfiguration::Invalid) {
+		delete config;
+		return nullptr;
+	}
+
+	return config;
+}
+
+int PipelineHandlerFake::configure([[maybe_unused]] Camera *camera,
+				   [[maybe_unused]] CameraConfiguration *config)
+{
+	/* Nothing to configure on a fake device. */
+	return 0;
+}
+
+int PipelineHandlerFake::exportFrameBuffers([[maybe_unused]] Camera *camera,
+					    [[maybe_unused]] Stream *stream,
+					    [[maybe_unused]] std::vector<std::unique_ptr<FrameBuffer>> *buffers)
+{
+	/* Buffers are expected to be allocated by the caller, never here. */
+	LOG(Fake, Fatal) << "exportFrameBuffers() should never be called";
+
+	return -EINVAL;
+}
+
+int PipelineHandlerFake::start(Camera *camera, [[maybe_unused]] const ControlList *controls)
+{
+	FakeCameraData *data = cameraData(camera);
+	data->started_ = true;
+
+	return 0;
+}
+
+void PipelineHandlerFake::stopDevice(Camera *camera)
+{
+	FakeCameraData *data = cameraData(camera);
+
+	data->started_ = false;
+}
+
+int PipelineHandlerFake::queueRequestDevice([[maybe_unused]] Camera *camera,
+					    Request *request)
+{
+	/* Complete all buffers and the request immediately, without producing frames. */
+	for (auto it : request->buffers())
+		completeBuffer(request, it.second);
+
+	/* TODO: Fill in more of the request metadata. */
+	request->metadata().set(controls::SensorTimestamp, CurrentTimestamp());
+	completeRequest(request);
+
+	return 0;
+}
+
+bool PipelineHandlerFake::match([[maybe_unused]] DeviceEnumerator *enumerator)
+{
+	/*
+	 * The fake pipeline handler doesn't depend on any media device, so
+	 * ignore the enumerator and register the cameras only once.
+	 */
+
+	if (registered_)
+		return false;
+
+	registered_ = true;
+	return registerCameras() == 0;
+}
+
+/**
+ * \brief Register the fake cameras with the camera manager
+ *
+ * Create a camera with fake streams, properties and controls, without any
+ * backing media device, and register it with the camera manager.
+ *
+ * \return 0 on success or a negative error code if no camera has been created
+ * \retval -ENODEV no camera has been created
+ */
+int PipelineHandlerFake::registerCameras()
+{
+	std::unique_ptr<FakeCameraData> data =
+		std::make_unique<FakeCameraData>(this);
+	std::set<Stream *> streams = {
+		&data->outStream_,
+		&data->vfStream_,
+		&data->rawStream_,
+	};
+
+	/* TODO: Read the supported resolutions from a config file or over IPC. */
+	/* TODO: Check whether this function can be called more than once. */
+	data->supported_resolutions_.resize(2);
+	data->supported_resolutions_[0].size = Size(1920, 1080);
+	data->supported_resolutions_[0].frame_rates.push_back(30);
+	data->supported_resolutions_[0].formats.push_back("YCbCr_420_888");
+	data->supported_resolutions_[0].formats.push_back("BLOB");
+	data->supported_resolutions_[1].size = Size(1280, 720);
+	data->supported_resolutions_[1].frame_rates.push_back(30);
+	data->supported_resolutions_[1].frame_rates.push_back(60);
+	data->supported_resolutions_[1].formats.push_back("YCbCr_420_888");
+	data->supported_resolutions_[1].formats.push_back("BLOB");
+
+	/* TODO: Assign different locations for different cameras based on config. */
+	data->properties_.set(properties::Location, properties::CameraLocationFront);
+	data->properties_.set(properties::PixelArrayActiveAreas, { Rectangle(Size(1920, 1080)) });
+
+	/* TODO: Set FrameDurationLimits based on config. */
+	ControlInfoMap::Map controls{};
+	/* Frame durations are expressed in microseconds, here for 30-60 fps. */
+	int64_t minFrameDuration = 1000000 / 60, maxFrameDuration = 1000000 / 30;
+	controls[&controls::FrameDurationLimits] = ControlInfo(minFrameDuration, maxFrameDuration);
+	data->controlInfo_ = ControlInfoMap(std::move(controls), controls::controls);
+
+	std::shared_ptr<Camera> camera =
+		Camera::create(std::move(data), "\\_SB_.PCI0.I2C4.CAM1" /* cameraId */, streams);
+
+	registerCamera(std::move(camera));
+
+	return 0;
+}
+
+REGISTER_PIPELINE_HANDLER(PipelineHandlerFake)
+
+} /* namespace libcamera */
diff --git a/src/libcamera/pipeline/fake/meson.build b/src/libcamera/pipeline/fake/meson.build
new file mode 100644
index 00000000..13980b4a
--- /dev/null
+++ b/src/libcamera/pipeline/fake/meson.build
@@ -0,0 +1,3 @@ 
+# SPDX-License-Identifier: CC0-1.0
+
+libcamera_sources += files([ 'fake.cpp' ])
diff --git a/src/libcamera/pipeline_handler.cpp b/src/libcamera/pipeline_handler.cpp
index e5cb751c..4261154d 100644
--- a/src/libcamera/pipeline_handler.cpp
+++ b/src/libcamera/pipeline_handler.cpp
@@ -536,7 +536,7 @@  void PipelineHandler::registerCamera(std::shared_ptr<Camera> camera)
 	cameras_.push_back(camera);
 
 	if (mediaDevices_.empty())
-		LOG(Pipeline, Fatal)
+		LOG(Pipeline, Error)
 			<< "Registering camera with no media devices!";
 
 	/*
diff --git a/test/camera/camera_reconfigure.cpp b/test/camera/camera_reconfigure.cpp
index d12e2413..06c87730 100644
--- a/test/camera/camera_reconfigure.cpp
+++ b/test/camera/camera_reconfigure.cpp
@@ -98,7 +98,7 @@  private:
 				return TestFail;
 			}
 
-			requests_.push_back(move(request));
+			requests_.push_back(std::move(request));
 		}
 
 		camera_->requestCompleted.connect(this, &CameraReconfigure::requestComplete);