[{"id":28889,"web_url":"https://patchwork.libcamera.org/comment/28889/","msgid":"<20240306165507.hdzab73gkrobkd3z@jasper>","date":"2024-03-06T16:55:07","subject":"Re: [PATCH/RFC 20/32] libcamera: camera_sensor: Create abstract base\n\tclass","submitter":{"id":184,"url":"https://patchwork.libcamera.org/api/people/184/","name":"Stefan Klug","email":"stefan.klug@ideasonboard.com"},"content":"Hi Laurent,\n\nOn Fri, Mar 01, 2024 at 11:21:09PM +0200, Laurent Pinchart wrote:\n> With a camera sensor factory in place, the next step is to create an\n> abstract base class that all camera sensors implement, providing a\n> uniform API to pipeline handlers. Turn all public functions of the\n> CameraSensor class into pure virtual functions, and move the\n> implementation to the CameraSensorLegacy class.\n> \n> Part of the code is likely worth keeping as common helpers in a base\n> class. However, to follow the principle of not designing helpers with a\n> single user, this commit moves the whole implementation. Common helpers\n> will be introduced later, along with other CameraSensor subclasses.\n> \n> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>\n\nPhew, that's a big patch (in size, not in substance). 
\nWe should get this in sooner rather than later.\n\nReviewed-by: Stefan Klug <stefan.klug@ideasonboard.com>\n\nCheers,\nStefan\n\n> ---\n>  Documentation/Doxyfile.in                     |    1 +\n>  include/libcamera/internal/camera_sensor.h    |  126 +-\n>  src/libcamera/sensor/camera_sensor.cpp        |  981 +---------------\n>  src/libcamera/sensor/camera_sensor_legacy.cpp | 1015 +++++++++++++++++\n>  src/libcamera/sensor/meson.build              |    1 +\n>  5 files changed, 1090 insertions(+), 1034 deletions(-)\n>  create mode 100644 src/libcamera/sensor/camera_sensor_legacy.cpp\n> \n> diff --git a/Documentation/Doxyfile.in b/Documentation/Doxyfile.in\n> index a86ea6c1ec95..75326c1964e9 100644\n> --- a/Documentation/Doxyfile.in\n> +++ b/Documentation/Doxyfile.in\n> @@ -42,6 +42,7 @@ EXCLUDE                = @TOP_SRCDIR@/include/libcamera/base/span.h \\\n>                           @TOP_SRCDIR@/src/libcamera/device_enumerator_udev.cpp \\\n>                           @TOP_SRCDIR@/src/libcamera/ipc_pipe_unixsocket.cpp \\\n>                           @TOP_SRCDIR@/src/libcamera/pipeline/ \\\n> +                         @TOP_SRCDIR@/src/libcamera/sensor/camera_sensor_legacy.cpp \\\n>                           @TOP_SRCDIR@/src/libcamera/tracepoints.cpp \\\n>                           @TOP_BUILDDIR@/include/libcamera/internal/tracepoints.h \\\n>                           @TOP_BUILDDIR@/src/libcamera/proxy/\n> diff --git a/include/libcamera/internal/camera_sensor.h b/include/libcamera/internal/camera_sensor.h\n> index 577af779cd6e..d37e66773195 100644\n> --- a/include/libcamera/internal/camera_sensor.h\n> +++ b/include/libcamera/internal/camera_sensor.h\n> @@ -9,10 +9,10 @@\n>  \n>  #include <memory>\n>  #include <string>\n> +#include <variant>\n>  #include <vector>\n>  \n>  #include <libcamera/base/class.h>\n> -#include <libcamera/base/log.h>\n>  \n>  #include <libcamera/control_ids.h>\n>  #include <libcamera/controls.h>\n> @@ -20,8 +20,6 @@\n>  #include 
<libcamera/orientation.h>\n>  #include <libcamera/transform.h>\n>  \n> -#include <libcamera/ipa/core_ipa_interface.h>\n> -\n>  #include \"libcamera/internal/bayer_format.h\"\n>  #include \"libcamera/internal/formats.h\"\n>  #include \"libcamera/internal/v4l2_subdevice.h\"\n> @@ -32,95 +30,50 @@ class CameraLens;\n>  class MediaEntity;\n>  class SensorConfiguration;\n>  \n> -struct CameraSensorProperties;\n> -\n>  enum class Orientation;\n>  \n> -class CameraSensor : protected Loggable\n> +struct IPACameraSensorInfo;\n> +\n> +class CameraSensor\n>  {\n>  public:\n> -\t~CameraSensor();\n> +\tvirtual ~CameraSensor();\n>  \n> -\tint init();\n> +\tvirtual const std::string &model() const = 0;\n> +\tvirtual const std::string &id() const = 0;\n>  \n> -\tconst std::string &model() const { return model_; }\n> -\tconst std::string &id() const { return id_; }\n> +\tvirtual const MediaEntity *entity() const = 0;\n> +\tvirtual V4L2Subdevice *device() = 0;\n>  \n> -\tconst MediaEntity *entity() const { return entity_; }\n> -\tV4L2Subdevice *device() { return subdev_.get(); }\n> +\tvirtual CameraLens *focusLens() = 0;\n>  \n> -\tCameraLens *focusLens() { return focusLens_.get(); }\n> +\tvirtual const std::vector<unsigned int> &mbusCodes() const = 0;\n> +\tvirtual std::vector<Size> sizes(unsigned int mbusCode) const = 0;\n> +\tvirtual Size resolution() const = 0;\n>  \n> -\tconst std::vector<unsigned int> &mbusCodes() const { return mbusCodes_; }\n> -\tstd::vector<Size> sizes(unsigned int mbusCode) const;\n> -\tSize resolution() const;\n> +\tvirtual V4L2SubdeviceFormat\n> +\tgetFormat(const std::vector<unsigned int> &mbusCodes,\n> +\t\t  const Size &size) const = 0;\n> +\tvirtual int setFormat(V4L2SubdeviceFormat *format,\n> +\t\t\t      Transform transform = Transform::Identity) = 0;\n> +\tvirtual int tryFormat(V4L2SubdeviceFormat *format) const = 0;\n>  \n> -\tV4L2SubdeviceFormat getFormat(const std::vector<unsigned int> &mbusCodes,\n> -\t\t\t\t      const Size &size) const;\n> 
-\tint setFormat(V4L2SubdeviceFormat *format,\n> -\t\t      Transform transform = Transform::Identity);\n> -\tint tryFormat(V4L2SubdeviceFormat *format) const;\n> +\tvirtual int applyConfiguration(const SensorConfiguration &config,\n> +\t\t\t\t       Transform transform = Transform::Identity,\n> +\t\t\t\t       V4L2SubdeviceFormat *sensorFormat = nullptr) = 0;\n>  \n> -\tint applyConfiguration(const SensorConfiguration &config,\n> -\t\t\t       Transform transform = Transform::Identity,\n> -\t\t\t       V4L2SubdeviceFormat *sensorFormat = nullptr);\n> +\tvirtual const ControlList &properties() const = 0;\n> +\tvirtual int sensorInfo(IPACameraSensorInfo *info) const = 0;\n> +\tvirtual Transform computeTransform(Orientation *orientation) const = 0;\n> +\tvirtual BayerFormat::Order bayerOrder(Transform t) const = 0;\n>  \n> -\tconst ControlList &properties() const { return properties_; }\n> -\tint sensorInfo(IPACameraSensorInfo *info) const;\n> -\tTransform computeTransform(Orientation *orientation) const;\n> -\tBayerFormat::Order bayerOrder(Transform t) const;\n> +\tvirtual const ControlInfoMap &controls() const = 0;\n> +\tvirtual ControlList getControls(const std::vector<uint32_t> &ids) = 0;\n> +\tvirtual int setControls(ControlList *ctrls) = 0;\n>  \n> -\tconst ControlInfoMap &controls() const;\n> -\tControlList getControls(const std::vector<uint32_t> &ids);\n> -\tint setControls(ControlList *ctrls);\n> -\n> -\tconst std::vector<controls::draft::TestPatternModeEnum> &testPatternModes() const\n> -\t{\n> -\t\treturn testPatternModes_;\n> -\t}\n> -\tint setTestPatternMode(controls::draft::TestPatternModeEnum mode);\n> -\n> -protected:\n> -\texplicit CameraSensor(const MediaEntity *entity);\n> -\tstd::string logPrefix() const override;\n> -\n> -private:\n> -\tLIBCAMERA_DISABLE_COPY(CameraSensor)\n> -\n> -\tint generateId();\n> -\tint validateSensorDriver();\n> -\tvoid initVimcDefaultProperties();\n> -\tvoid initStaticProperties();\n> -\tvoid initTestPatternModes();\n> 
-\tint initProperties();\n> -\tint discoverAncillaryDevices();\n> -\tint applyTestPatternMode(controls::draft::TestPatternModeEnum mode);\n> -\n> -\tconst MediaEntity *entity_;\n> -\tstd::unique_ptr<V4L2Subdevice> subdev_;\n> -\tunsigned int pad_;\n> -\n> -\tconst CameraSensorProperties *staticProps_;\n> -\n> -\tstd::string model_;\n> -\tstd::string id_;\n> -\n> -\tV4L2Subdevice::Formats formats_;\n> -\tstd::vector<unsigned int> mbusCodes_;\n> -\tstd::vector<Size> sizes_;\n> -\tstd::vector<controls::draft::TestPatternModeEnum> testPatternModes_;\n> -\tcontrols::draft::TestPatternModeEnum testPatternMode_;\n> -\n> -\tSize pixelArraySize_;\n> -\tRectangle activeArea_;\n> -\tconst BayerFormat *bayerFormat_;\n> -\tbool supportFlips_;\n> -\tbool flipsAlterBayerOrder_;\n> -\tOrientation mountingOrientation_;\n> -\n> -\tControlList properties_;\n> -\n> -\tstd::unique_ptr<CameraLens> focusLens_;\n> +\tvirtual const std::vector<controls::draft::TestPatternModeEnum> &\n> +\ttestPatternModes() const = 0;\n> +\tvirtual int setTestPatternMode(controls::draft::TestPatternModeEnum mode) = 0;\n>  };\n>  \n>  class CameraSensorFactoryBase\n> @@ -138,10 +91,8 @@ private:\n>  \n>  \tstatic void registerFactory(CameraSensorFactoryBase *factory);\n>  \n> -\tvirtual bool match(const MediaEntity *entity) const = 0;\n> -\n> -\tvirtual std::unique_ptr<CameraSensor>\n> -\tcreateInstance(MediaEntity *entity) const = 0;\n> +\tvirtual std::variant<std::unique_ptr<CameraSensor>, int>\n> +\tmatch(MediaEntity *entity) const = 0;\n>  };\n>  \n>  template<typename _CameraSensor>\n> @@ -154,16 +105,11 @@ public:\n>  \t}\n>  \n>  private:\n> -\tbool match(const MediaEntity *entity) const override\n> +\tstd::variant<std::unique_ptr<CameraSensor>, int>\n> +\tmatch(MediaEntity *entity) const override\n>  \t{\n>  \t\treturn _CameraSensor::match(entity);\n>  \t}\n> -\n> -\tstd::unique_ptr<CameraSensor>\n> -\tcreateInstance(MediaEntity *entity) const override\n> -\t{\n> -\t\treturn 
_CameraSensor::create(entity);\n> -\t}\n> };\n>  \n>  #define REGISTER_CAMERA_SENSOR(sensor) \\\n> diff --git a/src/libcamera/sensor/camera_sensor.cpp b/src/libcamera/sensor/camera_sensor.cpp\n> index a35683eb4b58..7abbe2c76596 100644\n> --- a/src/libcamera/sensor/camera_sensor.cpp\n> +++ b/src/libcamera/sensor/camera_sensor.cpp\n> @@ -6,27 +6,13 @@\n>   */\n>  \n>  #include \"libcamera/internal/camera_sensor.h\"\n> -#include \"libcamera/internal/media_device.h\"\n>  \n> -#include <algorithm>\n> -#include <float.h>\n> -#include <iomanip>\n> -#include <limits.h>\n> -#include <map>\n> -#include <math.h>\n> -#include <string.h>\n> +#include <memory>\n> +#include <vector>\n>  \n> -#include <libcamera/camera.h>\n> -#include <libcamera/orientation.h>\n> -#include <libcamera/property_ids.h>\n> +#include <libcamera/base/log.h>\n>  \n> -#include <libcamera/base/utils.h>\n> -\n> -#include \"libcamera/internal/bayer_format.h\"\n> -#include \"libcamera/internal/camera_lens.h\"\n> -#include \"libcamera/internal/camera_sensor_properties.h\"\n> -#include \"libcamera/internal/formats.h\"\n> -#include \"libcamera/internal/sysfs.h\"\n> +#include \"libcamera/internal/media_object.h\"\n>  \n>  /**\n>   * \\file camera_sensor.h\n> @@ -39,537 +25,16 @@ LOG_DEFINE_CATEGORY(CameraSensor)\n>  \n>  /**\n>   * \\class CameraSensor\n> - * \\brief A camera sensor based on V4L2 subdevices\n> + * \\brief An abstract camera sensor\n>   *\n>   * The CameraSensor class eases handling of sensors for pipeline handlers by\n> - * hiding the details of the V4L2 subdevice kernel API and caching sensor\n> - * information.\n> - *\n> - * The implementation is currently limited to sensors that expose a single V4L2\n> - * subdevice with a single pad. 
It will be extended to support more complex\n> - * devices as the needs arise.\n> + * hiding the details of the kernel API and caching sensor information.\n>   */\n>  \n> -/**\n> - * \\brief Construct a CameraSensor\n> - * \\param[in] entity The media entity backing the camera sensor\n> - *\n> - * Once constructed the instance must be initialized with init().\n> - */\n> -CameraSensor::CameraSensor(const MediaEntity *entity)\n> -\t: entity_(entity), pad_(UINT_MAX), staticProps_(nullptr),\n> -\t  bayerFormat_(nullptr), supportFlips_(false),\n> -\t  flipsAlterBayerOrder_(false), properties_(properties::properties)\n> -{\n> -}\n> -\n>  /**\n>   * \\brief Destroy a CameraSensor\n>   */\n> -CameraSensor::~CameraSensor()\n> -{\n> -}\n> -\n> -/**\n> - * \\brief Initialize the camera sensor instance\n> - *\n> - * This function performs the initialisation steps of the CameraSensor that may\n> - * fail. It shall be called once and only once after constructing the instance.\n> - *\n> - * \\return 0 on success or a negative error code otherwise\n> - */\n> -int CameraSensor::init()\n> -{\n> -\tfor (const MediaPad *pad : entity_->pads()) {\n> -\t\tif (pad->flags() & MEDIA_PAD_FL_SOURCE) {\n> -\t\t\tpad_ = pad->index();\n> -\t\t\tbreak;\n> -\t\t}\n> -\t}\n> -\n> -\tif (pad_ == UINT_MAX) {\n> -\t\tLOG(CameraSensor, Error)\n> -\t\t\t<< \"Sensors with more than one pad are not supported\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tswitch (entity_->function()) {\n> -\tcase MEDIA_ENT_F_CAM_SENSOR:\n> -\tcase MEDIA_ENT_F_PROC_VIDEO_ISP:\n> -\t\tbreak;\n> -\n> -\tdefault:\n> -\t\tLOG(CameraSensor, Error)\n> -\t\t\t<< \"Invalid sensor function \"\n> -\t\t\t<< utils::hex(entity_->function());\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\t/* Create and open the subdev. */\n> -\tsubdev_ = std::make_unique<V4L2Subdevice>(entity_);\n> -\tint ret = subdev_->open();\n> -\tif (ret < 0)\n> -\t\treturn ret;\n> -\n> -\t/*\n> -\t * Clear any flips to be sure we get the \"native\" Bayer order. 
This is\n> -\t * harmless for sensors where the flips don't affect the Bayer order.\n> -\t */\n> -\tControlList ctrls(subdev_->controls());\n> -\tif (subdev_->controls().find(V4L2_CID_HFLIP) != subdev_->controls().end())\n> -\t\tctrls.set(V4L2_CID_HFLIP, 0);\n> -\tif (subdev_->controls().find(V4L2_CID_VFLIP) != subdev_->controls().end())\n> -\t\tctrls.set(V4L2_CID_VFLIP, 0);\n> -\tsubdev_->setControls(&ctrls);\n> -\n> -\t/* Enumerate, sort and cache media bus codes and sizes. */\n> -\tformats_ = subdev_->formats(pad_);\n> -\tif (formats_.empty()) {\n> -\t\tLOG(CameraSensor, Error) << \"No image format found\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tmbusCodes_ = utils::map_keys(formats_);\n> -\tstd::sort(mbusCodes_.begin(), mbusCodes_.end());\n> -\n> -\tfor (const auto &format : formats_) {\n> -\t\tconst std::vector<SizeRange> &ranges = format.second;\n> -\t\tstd::transform(ranges.begin(), ranges.end(), std::back_inserter(sizes_),\n> -\t\t\t       [](const SizeRange &range) { return range.max; });\n> -\t}\n> -\n> -\tstd::sort(sizes_.begin(), sizes_.end());\n> -\n> -\t/* Remove duplicates. */\n> -\tauto last = std::unique(sizes_.begin(), sizes_.end());\n> -\tsizes_.erase(last, sizes_.end());\n> -\n> -\t/*\n> -\t * VIMC is a bit special, as it does not yet support all the mandatory\n> -\t * requirements regular sensors have to respect.\n> -\t *\n> -\t * Do not validate the driver if it's VIMC and initialize the sensor\n> -\t * properties with static information.\n> -\t *\n> -\t * \\todo Remove the special case once the VIMC driver has been\n> -\t * updated in all test platforms.\n> -\t */\n> -\tif (entity_->device()->driver() == \"vimc\") {\n> -\t\tinitVimcDefaultProperties();\n> -\n> -\t\tret = initProperties();\n> -\t\tif (ret)\n> -\t\t\treturn ret;\n> -\n> -\t\treturn discoverAncillaryDevices();\n> -\t}\n> -\n> -\t/* Get the color filter array pattern (only for RAW sensors). 
*/\n> -\tfor (unsigned int mbusCode : mbusCodes_) {\n> -\t\tconst BayerFormat &bayerFormat = BayerFormat::fromMbusCode(mbusCode);\n> -\t\tif (bayerFormat.isValid()) {\n> -\t\t\tbayerFormat_ = &bayerFormat;\n> -\t\t\tbreak;\n> -\t\t}\n> -\t}\n> -\n> -\tret = validateSensorDriver();\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tret = initProperties();\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tret = discoverAncillaryDevices();\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\t/*\n> -\t * Set HBLANK to the minimum to start with a well-defined line length,\n> -\t * allowing IPA modules that do not modify HBLANK to use the sensor\n> -\t * minimum line length in their calculations.\n> -\t */\n> -\tconst struct v4l2_query_ext_ctrl *hblankInfo = subdev_->controlInfo(V4L2_CID_HBLANK);\n> -\tif (hblankInfo && !(hblankInfo->flags & V4L2_CTRL_FLAG_READ_ONLY)) {\n> -\t\tControlList ctrl(subdev_->controls());\n> -\n> -\t\tctrl.set(V4L2_CID_HBLANK, static_cast<int32_t>(hblankInfo->minimum));\n> -\t\tret = subdev_->setControls(&ctrl);\n> -\t\tif (ret)\n> -\t\t\treturn ret;\n> -\t}\n> -\n> -\treturn applyTestPatternMode(controls::draft::TestPatternModeEnum::TestPatternModeOff);\n> -}\n> -\n> -int CameraSensor::generateId()\n> -{\n> -\tconst std::string devPath = subdev_->devicePath();\n> -\n> -\t/* Try to get ID from firmware description. 
*/\n> -\tid_ = sysfs::firmwareNodePath(devPath);\n> -\tif (!id_.empty())\n> -\t\treturn 0;\n> -\n> -\t/*\n> -\t * Virtual sensors not described in firmware\n> -\t *\n> -\t * Verify it's a platform device and construct ID from the device path\n> -\t * and model of sensor.\n> -\t */\n> -\tif (devPath.find(\"/sys/devices/platform/\", 0) == 0) {\n> -\t\tid_ = devPath.substr(strlen(\"/sys/devices/\")) + \" \" + model();\n> -\t\treturn 0;\n> -\t}\n> -\n> -\tLOG(CameraSensor, Error) << \"Can't generate sensor ID\";\n> -\treturn -EINVAL;\n> -}\n> -\n> -int CameraSensor::validateSensorDriver()\n> -{\n> -\tint err = 0;\n> -\n> -\t/*\n> -\t * Optional controls are used to register optional sensor properties. If\n> -\t * not present, some values will be defaulted.\n> -\t */\n> -\tstatic constexpr uint32_t optionalControls[] = {\n> -\t\tV4L2_CID_CAMERA_SENSOR_ROTATION,\n> -\t};\n> -\n> -\tconst ControlIdMap &controls = subdev_->controls().idmap();\n> -\tfor (uint32_t ctrl : optionalControls) {\n> -\t\tif (!controls.count(ctrl))\n> -\t\t\tLOG(CameraSensor, Debug)\n> -\t\t\t\t<< \"Optional V4L2 control \" << utils::hex(ctrl)\n> -\t\t\t\t<< \" not supported\";\n> -\t}\n> -\n> -\t/*\n> -\t * Recommended controls are similar to optional controls, but will\n> -\t * become mandatory in the near future. 
Be loud if they're missing.\n> -\t */\n> -\tstatic constexpr uint32_t recommendedControls[] = {\n> -\t\tV4L2_CID_CAMERA_ORIENTATION,\n> -\t};\n> -\n> -\tfor (uint32_t ctrl : recommendedControls) {\n> -\t\tif (!controls.count(ctrl)) {\n> -\t\t\tLOG(CameraSensor, Warning)\n> -\t\t\t\t<< \"Recommended V4L2 control \" << utils::hex(ctrl)\n> -\t\t\t\t<< \" not supported\";\n> -\t\t\terr = -EINVAL;\n> -\t\t}\n> -\t}\n> -\n> -\t/*\n> -\t * Verify if sensor supports horizontal/vertical flips\n> -\t *\n> -\t * \\todo Handle horizontal and vertical flips independently.\n> -\t */\n> -\tconst struct v4l2_query_ext_ctrl *hflipInfo = subdev_->controlInfo(V4L2_CID_HFLIP);\n> -\tconst struct v4l2_query_ext_ctrl *vflipInfo = subdev_->controlInfo(V4L2_CID_VFLIP);\n> -\tif (hflipInfo && !(hflipInfo->flags & V4L2_CTRL_FLAG_READ_ONLY) &&\n> -\t    vflipInfo && !(vflipInfo->flags & V4L2_CTRL_FLAG_READ_ONLY)) {\n> -\t\tsupportFlips_ = true;\n> -\n> -\t\tif (hflipInfo->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT ||\n> -\t\t    vflipInfo->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT)\n> -\t\t\tflipsAlterBayerOrder_ = true;\n> -\t}\n> -\n> -\tif (!supportFlips_)\n> -\t\tLOG(CameraSensor, Debug)\n> -\t\t\t<< \"Camera sensor does not support horizontal/vertical flip\";\n> -\n> -\t/*\n> -\t * Make sure the required selection targets are supported.\n> -\t *\n> -\t * Failures in reading any of the targets are not deemed to be fatal,\n> -\t * but some properties and features, like constructing a\n> -\t * IPACameraSensorInfo for the IPA module, won't be supported.\n> -\t *\n> -\t * \\todo Make support for selection targets mandatory as soon as all\n> -\t * test platforms have been updated.\n> -\t */\n> -\tRectangle rect;\n> -\tint ret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP_BOUNDS, &rect);\n> -\tif (ret) {\n> -\t\t/*\n> -\t\t * Default the pixel array size to the largest size supported\n> -\t\t * by the sensor. 
The sizes_ vector is sorted in ascending\n> -\t\t * order, the largest size is thus the last element.\n> -\t\t */\n> -\t\tpixelArraySize_ = sizes_.back();\n> -\n> -\t\tLOG(CameraSensor, Warning)\n> -\t\t\t<< \"The PixelArraySize property has been defaulted to \"\n> -\t\t\t<< pixelArraySize_;\n> -\t\terr = -EINVAL;\n> -\t} else {\n> -\t\tpixelArraySize_ = rect.size();\n> -\t}\n> -\n> -\tret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP_DEFAULT, &activeArea_);\n> -\tif (ret) {\n> -\t\tactiveArea_ = Rectangle(pixelArraySize_);\n> -\t\tLOG(CameraSensor, Warning)\n> -\t\t\t<< \"The PixelArrayActiveAreas property has been defaulted to \"\n> -\t\t\t<< activeArea_;\n> -\t\terr = -EINVAL;\n> -\t}\n> -\n> -\tret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP, &rect);\n> -\tif (ret) {\n> -\t\tLOG(CameraSensor, Warning)\n> -\t\t\t<< \"Failed to retrieve the sensor crop rectangle\";\n> -\t\terr = -EINVAL;\n> -\t}\n> -\n> -\tif (err) {\n> -\t\tLOG(CameraSensor, Warning)\n> -\t\t\t<< \"The sensor kernel driver needs to be fixed\";\n> -\t\tLOG(CameraSensor, Warning)\n> -\t\t\t<< \"See Documentation/sensor_driver_requirements.rst in the libcamera sources for more information\";\n> -\t}\n> -\n> -\tif (!bayerFormat_)\n> -\t\treturn 0;\n> -\n> -\t/*\n> -\t * For raw sensors, make sure the sensor driver supports the controls\n> -\t * required by the CameraSensor class.\n> -\t */\n> -\tstatic constexpr uint32_t mandatoryControls[] = {\n> -\t\tV4L2_CID_ANALOGUE_GAIN,\n> -\t\tV4L2_CID_EXPOSURE,\n> -\t\tV4L2_CID_HBLANK,\n> -\t\tV4L2_CID_PIXEL_RATE,\n> -\t\tV4L2_CID_VBLANK,\n> -\t};\n> -\n> -\terr = 0;\n> -\tfor (uint32_t ctrl : mandatoryControls) {\n> -\t\tif (!controls.count(ctrl)) {\n> -\t\t\tLOG(CameraSensor, Error)\n> -\t\t\t\t<< \"Mandatory V4L2 control \" << utils::hex(ctrl)\n> -\t\t\t\t<< \" not available\";\n> -\t\t\terr = -EINVAL;\n> -\t\t}\n> -\t}\n> -\n> -\tif (err) {\n> -\t\tLOG(CameraSensor, Error)\n> -\t\t\t<< \"The sensor kernel driver needs to be fixed\";\n> 
-\t\tLOG(CameraSensor, Error)\n> -\t\t\t<< \"See Documentation/sensor_driver_requirements.rst in the libcamera sources for more information\";\n> -\t\treturn err;\n> -\t}\n> -\n> -\treturn 0;\n> -}\n> -\n> -/*\n> - * \\brief Initialize properties that cannot be intialized by the\n> - * regular initProperties() function for VIMC\n> - */\n> -void CameraSensor::initVimcDefaultProperties()\n> -{\n> -\t/* Use the largest supported size. */\n> -\tpixelArraySize_ = sizes_.back();\n> -\tactiveArea_ = Rectangle(pixelArraySize_);\n> -}\n> -\n> -void CameraSensor::initStaticProperties()\n> -{\n> -\tstaticProps_ = CameraSensorProperties::get(model_);\n> -\tif (!staticProps_)\n> -\t\treturn;\n> -\n> -\t/* Register the properties retrieved from the sensor database. */\n> -\tproperties_.set(properties::UnitCellSize, staticProps_->unitCellSize);\n> -\n> -\tinitTestPatternModes();\n> -}\n> -\n> -void CameraSensor::initTestPatternModes()\n> -{\n> -\tconst auto &v4l2TestPattern = controls().find(V4L2_CID_TEST_PATTERN);\n> -\tif (v4l2TestPattern == controls().end()) {\n> -\t\tLOG(CameraSensor, Debug) << \"V4L2_CID_TEST_PATTERN is not supported\";\n> -\t\treturn;\n> -\t}\n> -\n> -\tconst auto &testPatternModes = staticProps_->testPatternModes;\n> -\tif (testPatternModes.empty()) {\n> -\t\t/*\n> -\t\t * The camera sensor supports test patterns but we don't know\n> -\t\t * how to map them so this should be fixed.\n> -\t\t */\n> -\t\tLOG(CameraSensor, Debug) << \"No static test pattern map for \\'\"\n> -\t\t\t\t\t << model() << \"\\'\";\n> -\t\treturn;\n> -\t}\n> -\n> -\t/*\n> -\t * Create a map that associates the V4L2 control index to the test\n> -\t * pattern mode by reversing the testPatternModes map provided by the\n> -\t * camera sensor properties. 
This makes it easier to verify if the\n> -\t * control index is supported in the below for loop that creates the\n> -\t * list of supported test patterns.\n> -\t */\n> -\tstd::map<int32_t, controls::draft::TestPatternModeEnum> indexToTestPatternMode;\n> -\tfor (const auto &it : testPatternModes)\n> -\t\tindexToTestPatternMode[it.second] = it.first;\n> -\n> -\tfor (const ControlValue &value : v4l2TestPattern->second.values()) {\n> -\t\tconst int32_t index = value.get<int32_t>();\n> -\n> -\t\tconst auto it = indexToTestPatternMode.find(index);\n> -\t\tif (it == indexToTestPatternMode.end()) {\n> -\t\t\tLOG(CameraSensor, Debug)\n> -\t\t\t\t<< \"Test pattern mode \" << index << \" ignored\";\n> -\t\t\tcontinue;\n> -\t\t}\n> -\n> -\t\ttestPatternModes_.push_back(it->second);\n> -\t}\n> -}\n> -\n> -int CameraSensor::initProperties()\n> -{\n> -\tmodel_ = subdev_->model();\n> -\tproperties_.set(properties::Model, utils::toAscii(model_));\n> -\n> -\t/* Generate a unique ID for the sensor. */\n> -\tint ret = generateId();\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\t/* Initialize the static properties from the sensor database. */\n> -\tinitStaticProperties();\n> -\n> -\t/* Retrieve and register properties from the kernel interface. 
*/\n> -\tconst ControlInfoMap &controls = subdev_->controls();\n> -\n> -\tconst auto &orientation = controls.find(V4L2_CID_CAMERA_ORIENTATION);\n> -\tif (orientation != controls.end()) {\n> -\t\tint32_t v4l2Orientation = orientation->second.def().get<int32_t>();\n> -\t\tint32_t propertyValue;\n> -\n> -\t\tswitch (v4l2Orientation) {\n> -\t\tdefault:\n> -\t\t\tLOG(CameraSensor, Warning)\n> -\t\t\t\t<< \"Unsupported camera location \"\n> -\t\t\t\t<< v4l2Orientation << \", setting to External\";\n> -\t\t\t[[fallthrough]];\n> -\t\tcase V4L2_CAMERA_ORIENTATION_EXTERNAL:\n> -\t\t\tpropertyValue = properties::CameraLocationExternal;\n> -\t\t\tbreak;\n> -\t\tcase V4L2_CAMERA_ORIENTATION_FRONT:\n> -\t\t\tpropertyValue = properties::CameraLocationFront;\n> -\t\t\tbreak;\n> -\t\tcase V4L2_CAMERA_ORIENTATION_BACK:\n> -\t\t\tpropertyValue = properties::CameraLocationBack;\n> -\t\t\tbreak;\n> -\t\t}\n> -\t\tproperties_.set(properties::Location, propertyValue);\n> -\t} else {\n> -\t\tLOG(CameraSensor, Warning) << \"Failed to retrieve the camera location\";\n> -\t}\n> -\n> -\tconst auto &rotationControl = controls.find(V4L2_CID_CAMERA_SENSOR_ROTATION);\n> -\tif (rotationControl != controls.end()) {\n> -\t\tint32_t propertyValue = rotationControl->second.def().get<int32_t>();\n> -\n> -\t\t/*\n> -\t\t * Cache the Transform associated with the camera mounting\n> -\t\t * rotation for later use in computeTransform().\n> -\t\t */\n> -\t\tbool success;\n> -\t\tmountingOrientation_ = orientationFromRotation(propertyValue, &success);\n> -\t\tif (!success) {\n> -\t\t\tLOG(CameraSensor, Warning)\n> -\t\t\t\t<< \"Invalid rotation of \" << propertyValue\n> -\t\t\t\t<< \" degrees - ignoring\";\n> -\t\t\tmountingOrientation_ = Orientation::Rotate0;\n> -\t\t}\n> -\n> -\t\tproperties_.set(properties::Rotation, propertyValue);\n> -\t} else {\n> -\t\tLOG(CameraSensor, Warning)\n> -\t\t\t<< \"Rotation control not available, default to 0 degrees\";\n> -\t\tproperties_.set(properties::Rotation, 0);\n> 
-\t\tmountingOrientation_ = Orientation::Rotate0;\n> -\t}\n> -\n> -\tproperties_.set(properties::PixelArraySize, pixelArraySize_);\n> -\tproperties_.set(properties::PixelArrayActiveAreas, { activeArea_ });\n> -\n> -\t/* Color filter array pattern, register only for RAW sensors. */\n> -\tif (bayerFormat_) {\n> -\t\tint32_t cfa;\n> -\t\tswitch (bayerFormat_->order) {\n> -\t\tcase BayerFormat::BGGR:\n> -\t\t\tcfa = properties::draft::BGGR;\n> -\t\t\tbreak;\n> -\t\tcase BayerFormat::GBRG:\n> -\t\t\tcfa = properties::draft::GBRG;\n> -\t\t\tbreak;\n> -\t\tcase BayerFormat::GRBG:\n> -\t\t\tcfa = properties::draft::GRBG;\n> -\t\t\tbreak;\n> -\t\tcase BayerFormat::RGGB:\n> -\t\t\tcfa = properties::draft::RGGB;\n> -\t\t\tbreak;\n> -\t\tcase BayerFormat::MONO:\n> -\t\t\tcfa = properties::draft::MONO;\n> -\t\t\tbreak;\n> -\t\t}\n> -\n> -\t\tproperties_.set(properties::draft::ColorFilterArrangement, cfa);\n> -\t}\n> -\n> -\treturn 0;\n> -}\n> -\n> -/**\n> - * \\brief Check for and initialise any ancillary devices\n> - *\n> - * Sensors sometimes have ancillary devices such as a Lens or Flash that could\n> - * be linked to their MediaEntity by the kernel. 
Search for and handle any\n> - * such device.\n> - *\n> - * \\todo Handle MEDIA_ENT_F_FLASH too.\n> - */\n> -int CameraSensor::discoverAncillaryDevices()\n> -{\n> -\tint ret;\n> -\n> -\tfor (MediaEntity *ancillary : entity_->ancillaryEntities()) {\n> -\t\tswitch (ancillary->function()) {\n> -\t\tcase MEDIA_ENT_F_LENS:\n> -\t\t\tfocusLens_ = std::make_unique<CameraLens>(ancillary);\n> -\t\t\tret = focusLens_->init();\n> -\t\t\tif (ret) {\n> -\t\t\t\tLOG(CameraSensor, Error)\n> -\t\t\t\t\t<< \"Lens initialisation failed, lens disabled\";\n> -\t\t\t\tfocusLens_.reset();\n> -\t\t\t}\n> -\t\t\tbreak;\n> -\n> -\t\tdefault:\n> -\t\t\tLOG(CameraSensor, Warning)\n> -\t\t\t\t<< \"Unsupported ancillary entity function \"\n> -\t\t\t\t<< ancillary->function();\n> -\t\t\tbreak;\n> -\t\t}\n> -\t}\n> -\n> -\treturn 0;\n> -}\n> +CameraSensor::~CameraSensor() = default;\n>  \n>  /**\n>   * \\fn CameraSensor::model()\n> @@ -624,29 +89,15 @@ int CameraSensor::discoverAncillaryDevices()\n>   */\n>  \n>  /**\n> + * \\fn CameraSensor::sizes()\n>   * \\brief Retrieve the supported frame sizes for a media bus code\n>   * \\param[in] mbusCode The media bus code for which sizes are requested\n>   *\n>   * \\return The supported frame sizes for \\a mbusCode sorted in increasing order\n>   */\n> -std::vector<Size> CameraSensor::sizes(unsigned int mbusCode) const\n> -{\n> -\tstd::vector<Size> sizes;\n> -\n> -\tconst auto &format = formats_.find(mbusCode);\n> -\tif (format == formats_.end())\n> -\t\treturn sizes;\n> -\n> -\tconst std::vector<SizeRange> &ranges = format->second;\n> -\tstd::transform(ranges.begin(), ranges.end(), std::back_inserter(sizes),\n> -\t\t       [](const SizeRange &range) { return range.max; });\n> -\n> -\tstd::sort(sizes.begin(), sizes.end());\n> -\n> -\treturn sizes;\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::resolution()\n>   * \\brief Retrieve the camera sensor resolution\n>   *\n>   * The camera sensor resolution is the active pixel area size, clamped to the\n> @@ 
-659,12 +110,9 @@ std::vector<Size> CameraSensor::sizes(unsigned int mbusCode) const\n>   *\n>   * \\return The camera sensor resolution in pixels\n>   */\n> -Size CameraSensor::resolution() const\n> -{\n> -\treturn std::min(sizes_.back(), activeArea_.size());\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::getFormat()\n>   * \\brief Retrieve the best sensor format for a desired output\n>   * \\param[in] mbusCodes The list of acceptable media bus codes\n>   * \\param[in] size The desired size\n> @@ -700,59 +148,9 @@ Size CameraSensor::resolution() const\n>   * \\return The best sensor output format matching the desired media bus codes\n>   * and size on success, or an empty format otherwise.\n>   */\n> -V4L2SubdeviceFormat CameraSensor::getFormat(const std::vector<unsigned int> &mbusCodes,\n> -\t\t\t\t\t    const Size &size) const\n> -{\n> -\tunsigned int desiredArea = size.width * size.height;\n> -\tunsigned int bestArea = UINT_MAX;\n> -\tfloat desiredRatio = static_cast<float>(size.width) / size.height;\n> -\tfloat bestRatio = FLT_MAX;\n> -\tconst Size *bestSize = nullptr;\n> -\tuint32_t bestCode = 0;\n> -\n> -\tfor (unsigned int code : mbusCodes) {\n> -\t\tconst auto formats = formats_.find(code);\n> -\t\tif (formats == formats_.end())\n> -\t\t\tcontinue;\n> -\n> -\t\tfor (const SizeRange &range : formats->second) {\n> -\t\t\tconst Size &sz = range.max;\n> -\n> -\t\t\tif (sz.width < size.width || sz.height < size.height)\n> -\t\t\t\tcontinue;\n> -\n> -\t\t\tfloat ratio = static_cast<float>(sz.width) / sz.height;\n> -\t\t\tfloat ratioDiff = fabsf(ratio - desiredRatio);\n> -\t\t\tunsigned int area = sz.width * sz.height;\n> -\t\t\tunsigned int areaDiff = area - desiredArea;\n> -\n> -\t\t\tif (ratioDiff > bestRatio)\n> -\t\t\t\tcontinue;\n> -\n> -\t\t\tif (ratioDiff < bestRatio || areaDiff < bestArea) {\n> -\t\t\t\tbestRatio = ratioDiff;\n> -\t\t\t\tbestArea = areaDiff;\n> -\t\t\t\tbestSize = &sz;\n> -\t\t\t\tbestCode = code;\n> -\t\t\t}\n> -\t\t}\n> -\t}\n> -\n> 
-\tif (!bestSize) {\n> -\t\tLOG(CameraSensor, Debug) << \"No supported format or size found\";\n> -\t\treturn {};\n> -\t}\n> -\n> -\tV4L2SubdeviceFormat format{\n> -\t\t.code = bestCode,\n> -\t\t.size = *bestSize,\n> -\t\t.colorSpace = ColorSpace::Raw,\n> -\t};\n> -\n> -\treturn format;\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::setFormat()\n>   * \\brief Set the sensor output format\n>   * \\param[in] format The desired sensor output format\n>   * \\param[in] transform The transform to be applied on the sensor.\n> @@ -767,32 +165,9 @@ V4L2SubdeviceFormat CameraSensor::getFormat(const std::vector<unsigned int> &mbu\n>   *\n>   * \\return 0 on success or a negative error code otherwise\n>   */\n> -int CameraSensor::setFormat(V4L2SubdeviceFormat *format, Transform transform)\n> -{\n> -\t/* Configure flips if the sensor supports that. */\n> -\tif (supportFlips_) {\n> -\t\tControlList flipCtrls(subdev_->controls());\n> -\n> -\t\tflipCtrls.set(V4L2_CID_HFLIP,\n> -\t\t\t      static_cast<int32_t>(!!(transform & Transform::HFlip)));\n> -\t\tflipCtrls.set(V4L2_CID_VFLIP,\n> -\t\t\t      static_cast<int32_t>(!!(transform & Transform::VFlip)));\n> -\n> -\t\tint ret = subdev_->setControls(&flipCtrls);\n> -\t\tif (ret)\n> -\t\t\treturn ret;\n> -\t}\n> -\n> -\t/* Apply format on the subdev. 
*/\n> -\tint ret = subdev_->setFormat(pad_, format);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tsubdev_->updateControlInfo();\n> -\treturn 0;\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::tryFormat()\n>   * \\brief Try the sensor output format\n>   * \\param[in] format The desired sensor output format\n>   *\n> @@ -803,13 +178,9 @@ int CameraSensor::setFormat(V4L2SubdeviceFormat *format, Transform transform)\n>   *\n>   * \\return 0 on success or a negative error code otherwise\n>   */\n> -int CameraSensor::tryFormat(V4L2SubdeviceFormat *format) const\n> -{\n> -\treturn subdev_->setFormat(pad_, format,\n> -\t\t\t\t  V4L2Subdevice::Whence::TryFormat);\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::applyConfiguration()\n>   * \\brief Apply a sensor configuration to the camera sensor\n>   * \\param[in] config The sensor configuration\n>   * \\param[in] transform The transform to be applied on the sensor.\n> @@ -824,74 +195,6 @@ int CameraSensor::tryFormat(V4L2SubdeviceFormat *format) const\n>   * \\return 0 if \\a config is applied correctly to the camera sensor, a negative\n>   * error code otherwise\n>   */\n> -int CameraSensor::applyConfiguration(const SensorConfiguration &config,\n> -\t\t\t\t     Transform transform,\n> -\t\t\t\t     V4L2SubdeviceFormat *sensorFormat)\n> -{\n> -\tif (!config.isValid()) {\n> -\t\tLOG(CameraSensor, Error) << \"Invalid sensor configuration\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tstd::vector<unsigned int> filteredCodes;\n> -\tstd::copy_if(mbusCodes_.begin(), mbusCodes_.end(),\n> -\t\t     std::back_inserter(filteredCodes),\n> -\t\t     [&config](unsigned int mbusCode) {\n> -\t\t\t     BayerFormat bayer = BayerFormat::fromMbusCode(mbusCode);\n> -\t\t\t     if (bayer.bitDepth == config.bitDepth)\n> -\t\t\t\t     return true;\n> -\t\t\t     return false;\n> -\t\t     });\n> -\tif (filteredCodes.empty()) {\n> -\t\tLOG(CameraSensor, Error)\n> -\t\t\t<< \"Cannot find any format with bit depth \"\n> -\t\t\t<< config.bitDepth;\n> 
-\t\treturn -EINVAL;\n> -\t}\n> -\n> -\t/*\n> -\t * Compute the sensor's data frame size by applying the cropping\n> -\t * rectangle, subsampling and output crop to the sensor's pixel array\n> -\t * size.\n> -\t *\n> -\t * \\todo The actual size computation is for now ignored and only the\n> -\t * output size is considered. This implies that resolutions obtained\n> -\t * with two different cropping/subsampling will look identical and\n> -\t * only the first found one will be considered.\n> -\t */\n> -\tV4L2SubdeviceFormat subdevFormat = {};\n> -\tfor (unsigned int code : filteredCodes) {\n> -\t\tfor (const Size &size : sizes(code)) {\n> -\t\t\tif (size.width != config.outputSize.width ||\n> -\t\t\t    size.height != config.outputSize.height)\n> -\t\t\t\tcontinue;\n> -\n> -\t\t\tsubdevFormat.code = code;\n> -\t\t\tsubdevFormat.size = size;\n> -\t\t\tbreak;\n> -\t\t}\n> -\t}\n> -\tif (!subdevFormat.code) {\n> -\t\tLOG(CameraSensor, Error) << \"Invalid output size in sensor configuration\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tint ret = setFormat(&subdevFormat, transform);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\t/*\n> -\t * Return to the caller the format actually applied to the sensor.\n> -\t * This is relevant if transform has changed the bayer pattern order.\n> -\t */\n> -\tif (sensorFormat)\n> -\t\t*sensorFormat = subdevFormat;\n> -\n> -\t/* \\todo Handle AnalogCrop. Most sensors do not support set_selection */\n> -\t/* \\todo Handle scaling in the digital domain. 
*/\n> -\n> -\treturn 0;\n> -}\n>  \n>  /**\n>   * \\fn CameraSensor::properties()\n> @@ -900,6 +203,7 @@ int CameraSensor::applyConfiguration(const SensorConfiguration &config,\n>   */\n>  \n>  /**\n> + * \\fn CameraSensor::sensorInfo()\n>   * \\brief Assemble and return the camera sensor info\n>   * \\param[out] info The camera sensor info\n>   *\n> @@ -913,82 +217,9 @@ int CameraSensor::applyConfiguration(const SensorConfiguration &config,\n>   *\n>   * \\return 0 on success, a negative error code otherwise\n>   */\n> -int CameraSensor::sensorInfo(IPACameraSensorInfo *info) const\n> -{\n> -\tif (!bayerFormat_)\n> -\t\treturn -EINVAL;\n> -\n> -\tinfo->model = model();\n> -\n> -\t/*\n> -\t * The active area size is a static property, while the crop\n> -\t * rectangle needs to be re-read as it depends on the sensor\n> -\t * configuration.\n> -\t */\n> -\tinfo->activeAreaSize = { activeArea_.width, activeArea_.height };\n> -\n> -\t/*\n> -\t * \\todo Support for retreiving the crop rectangle is scheduled to\n> -\t * become mandatory. For the time being use the default value if it has\n> -\t * been initialized at sensor driver validation time.\n> -\t */\n> -\tint ret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP, &info->analogCrop);\n> -\tif (ret) {\n> -\t\tinfo->analogCrop = activeArea_;\n> -\t\tLOG(CameraSensor, Warning)\n> -\t\t\t<< \"The analogue crop rectangle has been defaulted to the active area size\";\n> -\t}\n> -\n> -\t/*\n> -\t * IPACameraSensorInfo::analogCrop::x and IPACameraSensorInfo::analogCrop::y\n> -\t * are defined relatively to the active pixel area, while V4L2's\n> -\t * TGT_CROP target is defined in respect to the full pixel array.\n> -\t *\n> -\t * Compensate it by subtracting the active area offset.\n> -\t */\n> -\tinfo->analogCrop.x -= activeArea_.x;\n> -\tinfo->analogCrop.y -= activeArea_.y;\n> -\n> -\t/* The bit depth and image size depend on the currently applied format. 
*/\n> -\tV4L2SubdeviceFormat format{};\n> -\tret = subdev_->getFormat(pad_, &format);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\tinfo->bitsPerPixel = MediaBusFormatInfo::info(format.code).bitsPerPixel;\n> -\tinfo->outputSize = format.size;\n> -\n> -\tstd::optional<int32_t> cfa = properties_.get(properties::draft::ColorFilterArrangement);\n> -\tinfo->cfaPattern = cfa ? *cfa : properties::draft::RGB;\n> -\n> -\t/*\n> -\t * Retrieve the pixel rate, line length and minimum/maximum frame\n> -\t * duration through V4L2 controls. Support for the V4L2_CID_PIXEL_RATE,\n> -\t * V4L2_CID_HBLANK and V4L2_CID_VBLANK controls is mandatory.\n> -\t */\n> -\tControlList ctrls = subdev_->getControls({ V4L2_CID_PIXEL_RATE,\n> -\t\t\t\t\t\t   V4L2_CID_HBLANK,\n> -\t\t\t\t\t\t   V4L2_CID_VBLANK });\n> -\tif (ctrls.empty()) {\n> -\t\tLOG(CameraSensor, Error)\n> -\t\t\t<< \"Failed to retrieve camera info controls\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tinfo->pixelRate = ctrls.get(V4L2_CID_PIXEL_RATE).get<int64_t>();\n> -\n> -\tconst ControlInfo hblank = ctrls.infoMap()->at(V4L2_CID_HBLANK);\n> -\tinfo->minLineLength = info->outputSize.width + hblank.min().get<int32_t>();\n> -\tinfo->maxLineLength = info->outputSize.width + hblank.max().get<int32_t>();\n> -\n> -\tconst ControlInfo vblank = ctrls.infoMap()->at(V4L2_CID_VBLANK);\n> -\tinfo->minFrameLength = info->outputSize.height + vblank.min().get<int32_t>();\n> -\tinfo->maxFrameLength = info->outputSize.height + vblank.max().get<int32_t>();\n> -\n> -\treturn 0;\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::computeTransform()\n>   * \\brief Compute the Transform that gives the requested \\a orientation\n>   * \\param[inout] orientation The desired image orientation\n>   *\n> @@ -1014,40 +245,9 @@ int CameraSensor::sensorInfo(IPACameraSensorInfo *info) const\n>   * \\return A Transform instance that applied to the CameraSensor produces images\n>   * with \\a orientation\n>   */\n> -Transform 
CameraSensor::computeTransform(Orientation *orientation) const\n> -{\n> -\t/*\n> -\t * If we cannot do any flips we cannot change the native camera mounting\n> -\t * orientation.\n> -\t */\n> -\tif (!supportFlips_) {\n> -\t\t*orientation = mountingOrientation_;\n> -\t\treturn Transform::Identity;\n> -\t}\n> -\n> -\t/*\n> -\t * Now compute the required transform to obtain 'orientation' starting\n> -\t * from the mounting rotation.\n> -\t *\n> -\t * As a note:\n> -\t * \torientation / mountingOrientation_ = transform\n> -\t * \tmountingOrientation_ * transform = orientation\n> -\t */\n> -\tTransform transform = *orientation / mountingOrientation_;\n> -\n> -\t/*\n> -\t * If transform contains any Transpose we cannot do it, so adjust\n> -\t * 'orientation' to report the image native orientation and return Identity.\n> -\t */\n> -\tif (!!(transform & Transform::Transpose)) {\n> -\t\t*orientation = mountingOrientation_;\n> -\t\treturn Transform::Identity;\n> -\t}\n> -\n> -\treturn transform;\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::bayerOrder()\n>   * \\brief Compute the Bayer order that results from the given Transform\n>   * \\param[in] t The Transform to apply to the sensor\n>   *\n> @@ -1059,23 +259,9 @@ Transform CameraSensor::computeTransform(Orientation *orientation) const\n>   *\n>   * \\return The Bayer order produced by the sensor when the Transform is applied\n>   */\n> -BayerFormat::Order CameraSensor::bayerOrder(Transform t) const\n> -{\n> -\t/* Return a defined by meaningless value for non-Bayer sensors. */\n> -\tif (!bayerFormat_)\n> -\t\treturn BayerFormat::Order::BGGR;\n> -\n> -\tif (!flipsAlterBayerOrder_)\n> -\t\treturn bayerFormat_->order;\n> -\n> -\t/*\n> -\t * Apply the transform to the native (i.e. 
untransformed) Bayer order,\n> -\t * using the rest of the Bayer format supplied by the caller.\n> -\t */\n> -\treturn bayerFormat_->transform(t).order;\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::controls()\n>   * \\brief Retrieve the supported V4L2 controls and their information\n>   *\n>   * Control information is updated automatically to reflect the current sensor\n> @@ -1084,12 +270,9 @@ BayerFormat::Order CameraSensor::bayerOrder(Transform t) const\n>   *\n>   * \\return A map of the V4L2 controls supported by the sensor\n>   */\n> -const ControlInfoMap &CameraSensor::controls() const\n> -{\n> -\treturn subdev_->controls();\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::getControls()\n>   * \\brief Read V4L2 controls from the sensor\n>   * \\param[in] ids The list of controls to read, specified by their ID\n>   *\n> @@ -1107,12 +290,9 @@ const ControlInfoMap &CameraSensor::controls() const\n>   * \\return The control values in a ControlList on success, or an empty list on\n>   * error\n>   */\n> -ControlList CameraSensor::getControls(const std::vector<uint32_t> &ids)\n> -{\n> -\treturn subdev_->getControls(ids);\n> -}\n>  \n>  /**\n> + * \\fn CameraSensor::setControls()\n>   * \\brief Write V4L2 controls to the sensor\n>   * \\param[in] ctrls The list of controls to write\n>   *\n> @@ -1137,10 +317,6 @@ ControlList CameraSensor::getControls(const std::vector<uint32_t> &ids)\n>   * \\retval -EINVAL One of the control is not supported or not accessible\n>   * \\retval i The index of the control that failed\n>   */\n> -int CameraSensor::setControls(ControlList *ctrls)\n> -{\n> -\treturn subdev_->setControls(ctrls);\n> -}\n>  \n>  /**\n>   * \\fn CameraSensor::testPatternModes()\n> @@ -1151,6 +327,7 @@ int CameraSensor::setControls(ControlList *ctrls)\n>   */\n>  \n>  /**\n> + * \\fn CameraSensor::setTestPatternMode()\n>   * \\brief Set the test pattern mode for the camera sensor\n>   * \\param[in] mode The test pattern mode\n>   *\n> @@ -1158,84 +335,6 @@ int 
CameraSensor::setControls(ControlList *ctrls)\n>   * pattern mode. Otherwise, this function is a no-op. Setting the same test\n>   * pattern mode for every frame thus incurs no performance penalty.\n>   */\n> -int CameraSensor::setTestPatternMode(controls::draft::TestPatternModeEnum mode)\n> -{\n> -\tif (testPatternMode_ == mode)\n> -\t\treturn 0;\n> -\n> -\tif (testPatternModes_.empty()) {\n> -\t\tLOG(CameraSensor, Error)\n> -\t\t\t<< \"Camera sensor does not support test pattern modes.\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\treturn applyTestPatternMode(mode);\n> -}\n> -\n> -int CameraSensor::applyTestPatternMode(controls::draft::TestPatternModeEnum mode)\n> -{\n> -\tif (testPatternModes_.empty())\n> -\t\treturn 0;\n> -\n> -\tauto it = std::find(testPatternModes_.begin(), testPatternModes_.end(),\n> -\t\t\t    mode);\n> -\tif (it == testPatternModes_.end()) {\n> -\t\tLOG(CameraSensor, Error) << \"Unsupported test pattern mode \"\n> -\t\t\t\t\t << mode;\n> -\t\treturn -EINVAL;\n> -\t}\n> -\n> -\tLOG(CameraSensor, Debug) << \"Apply test pattern mode \" << mode;\n> -\n> -\tint32_t index = staticProps_->testPatternModes.at(mode);\n> -\tControlList ctrls{ controls() };\n> -\tctrls.set(V4L2_CID_TEST_PATTERN, index);\n> -\n> -\tint ret = setControls(&ctrls);\n> -\tif (ret)\n> -\t\treturn ret;\n> -\n> -\ttestPatternMode_ = mode;\n> -\n> -\treturn 0;\n> -}\n> -\n> -std::string CameraSensor::logPrefix() const\n> -{\n> -\treturn \"'\" + entity_->name() + \"'\";\n> -}\n> -\n> -namespace {\n> -\n> -/* Transitory default camera sensor implementation */\n> -class CameraSensorDefault : public CameraSensor\n> -{\n> -public:\n> -\tCameraSensorDefault(MediaEntity *entity)\n> -\t\t: CameraSensor(entity)\n> -\t{\n> -\t}\n> -\n> -\tstatic bool match([[maybe_unused]] const MediaEntity *entity)\n> -\t{\n> -\t\treturn true;\n> -\t}\n> -\n> -\tstatic std::unique_ptr<CameraSensorDefault> create(MediaEntity *entity)\n> -\t{\n> -\t\tstd::unique_ptr<CameraSensorDefault> sensor =\n> 
-\t\t\tstd::make_unique<CameraSensorDefault>(entity);\n> -\n> -\t\tif (sensor->init())\n> -\t\t\treturn nullptr;\n> -\n> -\t\treturn sensor;\n> -\t}\n> -};\n> -\n> -REGISTER_CAMERA_SENSOR(CameraSensorDefault)\n> -\n> -}; /* namespace */\n>  \n>  /**\n>   * \\class CameraSensorFactoryBase\n> @@ -1271,18 +370,18 @@ std::unique_ptr<CameraSensor> CameraSensorFactoryBase::create(MediaEntity *entit\n>  \t\tCameraSensorFactoryBase::factories();\n>  \n>  \tfor (const CameraSensorFactoryBase *factory : factories) {\n> -\t\tif (!factory->match(entity))\n> -\t\t\tcontinue;\n> +\t\tstd::variant<std::unique_ptr<CameraSensor>, int> result =\n> +\t\t\tfactory->match(entity);\n>  \n> -\t\tstd::unique_ptr<CameraSensor> sensor = factory->createInstance(entity);\n> -\t\tif (!sensor) {\n> +\t\tif (std::holds_alternative<std::unique_ptr<CameraSensor>>(result))\n> +\t\t\treturn std::get<std::unique_ptr<CameraSensor>>(std::move(result));\n> +\n> +\t\tif (std::get<int>(result)) {\n>  \t\t\tLOG(CameraSensor, Error)\n>  \t\t\t\t<< \"Failed to create sensor for '\"\n> -\t\t\t\t<< entity->name();\n> +\t\t\t\t<< entity->name() << \": \" << std::get<int>(result);\n>  \t\t\treturn nullptr;\n>  \t\t}\n> -\n> -\t\treturn sensor;\n>  \t}\n>  \n>  \treturn nullptr;\n> @@ -1337,33 +436,27 @@ void CameraSensorFactoryBase::registerFactory(CameraSensorFactoryBase *factory)\n>   * function.\n>   */\n>  \n> -/**\n> - * \\fn CameraSensorFactory::createInstance() const\n> - * \\brief Create an instance of the CameraSensor corresponding to the factory\n> - *\n> - * \\return A unique pointer to a newly constructed instance of the CameraSensor\n> - * subclass corresponding to the factory\n> - */\n> -\n>  /**\n>   * \\def REGISTER_CAMERA_SENSOR(sensor)\n>   * \\brief Register a camera sensor type to the sensor factory\n>   * \\param[in] sensor Class name of the CameraSensor derived class to register\n>   *\n>   * Register a CameraSensor subclass with the factory and make it available to\n> - * try and match 
sensors. The subclass needs to implement two static functions:\n> + * try and match sensors. The subclass needs to implement a static match\n> + * function:\n>   *\n>   * \\code{.cpp}\n> - * static bool match(const MediaEntity *entity);\n> - * static std::unique_ptr<sensor> create(MediaEntity *entity);\n> + * static std::variant<std::unique_ptr<CameraSensor>, int> match(MediaEntity *entity);\n>   * \\endcode\n>   *\n> - * The match() function tests if the sensor class supports the camera sensor\n> - * identified by a MediaEntity.\n> + * The function tests if the sensor class supports the camera sensor identified\n> + * by a MediaEntity. If so, it creates a new instance of the sensor class. The\n> + * return value is a variant that contains\n>   *\n> - * The create() function creates a new instance of the sensor class. It may\n> - * return a null pointer if initialization of the instance fails. It will only\n> - * be called if the match() function has returned true for the given entity.\n> + * - A new instance of the camera sensor class if the entity matched and\n> + *   creation succeeded ;\n> + * - A non-zero error code if the entity matched and the creation failed ; or\n> + * - A zero error code if the entity didn't match.\n>   */\n>  \n>  } /* namespace libcamera */\n> diff --git a/src/libcamera/sensor/camera_sensor_legacy.cpp b/src/libcamera/sensor/camera_sensor_legacy.cpp\n> new file mode 100644\n> index 000000000000..34677339241c\n> --- /dev/null\n> +++ b/src/libcamera/sensor/camera_sensor_legacy.cpp\n> @@ -0,0 +1,1015 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2019, Google Inc.\n> + *\n> + * camera_sensor_legacy.cpp - A V4L2-backed camera sensor\n> + */\n> +\n> +#include <algorithm>\n> +#include <float.h>\n> +#include <iomanip>\n> +#include <limits.h>\n> +#include <map>\n> +#include <math.h>\n> +#include <memory>\n> +#include <string.h>\n> +#include <string>\n> +#include <vector>\n> +\n> +#include 
<libcamera/base/class.h>\n> +#include <libcamera/base/log.h>\n> +#include <libcamera/base/utils.h>\n> +\n> +#include <libcamera/camera.h>\n> +#include <libcamera/control_ids.h>\n> +#include <libcamera/controls.h>\n> +#include <libcamera/geometry.h>\n> +#include <libcamera/orientation.h>\n> +#include <libcamera/property_ids.h>\n> +#include <libcamera/transform.h>\n> +\n> +#include <libcamera/ipa/core_ipa_interface.h>\n> +\n> +#include \"libcamera/internal/bayer_format.h\"\n> +#include \"libcamera/internal/camera_lens.h\"\n> +#include \"libcamera/internal/camera_sensor.h\"\n> +#include \"libcamera/internal/camera_sensor_properties.h\"\n> +#include \"libcamera/internal/formats.h\"\n> +#include \"libcamera/internal/media_device.h\"\n> +#include \"libcamera/internal/sysfs.h\"\n> +#include \"libcamera/internal/v4l2_subdevice.h\"\n> +\n> +namespace libcamera {\n> +\n> +class BayerFormat;\n> +class CameraLens;\n> +class MediaEntity;\n> +class SensorConfiguration;\n> +\n> +struct CameraSensorProperties;\n> +\n> +enum class Orientation;\n> +\n> +LOG_DECLARE_CATEGORY(CameraSensor)\n> +\n> +class CameraSensorLegacy : public CameraSensor, protected Loggable\n> +{\n> +public:\n> +\tCameraSensorLegacy(const MediaEntity *entity);\n> +\t~CameraSensorLegacy();\n> +\n> +\tstatic std::variant<std::unique_ptr<CameraSensor>, int>\n> +\tmatch(MediaEntity *entity);\n> +\n> +\tconst std::string &model() const override { return model_; }\n> +\tconst std::string &id() const override { return id_; }\n> +\n> +\tconst MediaEntity *entity() const override { return entity_; }\n> +\tV4L2Subdevice *device() override { return subdev_.get(); }\n> +\n> +\tCameraLens *focusLens() override { return focusLens_.get(); }\n> +\n> +\tconst std::vector<unsigned int> &mbusCodes() const override { return mbusCodes_; }\n> +\tstd::vector<Size> sizes(unsigned int mbusCode) const override;\n> +\tSize resolution() const override;\n> +\n> +\tV4L2SubdeviceFormat getFormat(const std::vector<unsigned int> &mbusCodes,\n> 
+\t\t\t\t      const Size &size) const override;\n> +\tint setFormat(V4L2SubdeviceFormat *format,\n> +\t\t      Transform transform = Transform::Identity) override;\n> +\tint tryFormat(V4L2SubdeviceFormat *format) const override;\n> +\n> +\tint applyConfiguration(const SensorConfiguration &config,\n> +\t\t\t       Transform transform = Transform::Identity,\n> +\t\t\t       V4L2SubdeviceFormat *sensorFormat = nullptr) override;\n> +\n> +\tconst ControlList &properties() const override { return properties_; }\n> +\tint sensorInfo(IPACameraSensorInfo *info) const override;\n> +\tTransform computeTransform(Orientation *orientation) const override;\n> +\tBayerFormat::Order bayerOrder(Transform t) const override;\n> +\n> +\tconst ControlInfoMap &controls() const override;\n> +\tControlList getControls(const std::vector<uint32_t> &ids) override;\n> +\tint setControls(ControlList *ctrls) override;\n> +\n> +\tconst std::vector<controls::draft::TestPatternModeEnum> &\n> +\ttestPatternModes() const override { return testPatternModes_; }\n> +\tint setTestPatternMode(controls::draft::TestPatternModeEnum mode) override;\n> +\n> +protected:\n> +\tstd::string logPrefix() const override;\n> +\n> +private:\n> +\tLIBCAMERA_DISABLE_COPY(CameraSensorLegacy)\n> +\n> +\tint init();\n> +\tint generateId();\n> +\tint validateSensorDriver();\n> +\tvoid initVimcDefaultProperties();\n> +\tvoid initStaticProperties();\n> +\tvoid initTestPatternModes();\n> +\tint initProperties();\n> +\tint applyTestPatternMode(controls::draft::TestPatternModeEnum mode);\n> +\tint discoverAncillaryDevices();\n> +\n> +\tconst MediaEntity *entity_;\n> +\tstd::unique_ptr<V4L2Subdevice> subdev_;\n> +\tunsigned int pad_;\n> +\n> +\tconst CameraSensorProperties *staticProps_;\n> +\n> +\tstd::string model_;\n> +\tstd::string id_;\n> +\n> +\tV4L2Subdevice::Formats formats_;\n> +\tstd::vector<unsigned int> mbusCodes_;\n> +\tstd::vector<Size> sizes_;\n> +\tstd::vector<controls::draft::TestPatternModeEnum> 
testPatternModes_;\n> +\tcontrols::draft::TestPatternModeEnum testPatternMode_;\n> +\n> +\tSize pixelArraySize_;\n> +\tRectangle activeArea_;\n> +\tconst BayerFormat *bayerFormat_;\n> +\tbool supportFlips_;\n> +\tbool flipsAlterBayerOrder_;\n> +\tOrientation mountingOrientation_;\n> +\n> +\tControlList properties_;\n> +\n> +\tstd::unique_ptr<CameraLens> focusLens_;\n> +};\n> +\n> +/**\n> + * \\class CameraSensorLegacy\n> + * \\brief A camera sensor based on V4L2 subdevices\n> + *\n> + * The implementation is currently limited to sensors that expose a single V4L2\n> + * subdevice with a single pad. It will be extended to support more complex\n> + * devices as the needs arise.\n> + */\n> +\n> +CameraSensorLegacy::CameraSensorLegacy(const MediaEntity *entity)\n> +\t: entity_(entity), pad_(UINT_MAX), staticProps_(nullptr),\n> +\t  bayerFormat_(nullptr), supportFlips_(false),\n> +\t  flipsAlterBayerOrder_(false), properties_(properties::properties)\n> +{\n> +}\n> +\n> +CameraSensorLegacy::~CameraSensorLegacy() = default;\n> +\n> +std::variant<std::unique_ptr<CameraSensor>, int>\n> +CameraSensorLegacy::match(MediaEntity *entity)\n> +{\n> +\tstd::unique_ptr<CameraSensorLegacy> sensor =\n> +\t\tstd::make_unique<CameraSensorLegacy>(entity);\n> +\n> +\tint ret = sensor->init();\n> +\tif (ret)\n> +\t\treturn { ret };\n> +\n> +\treturn { std::move(sensor) };\n> +}\n> +\n> +int CameraSensorLegacy::init()\n> +{\n> +\tfor (const MediaPad *pad : entity_->pads()) {\n> +\t\tif (pad->flags() & MEDIA_PAD_FL_SOURCE) {\n> +\t\t\tpad_ = pad->index();\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\tif (pad_ == UINT_MAX) {\n> +\t\tLOG(CameraSensor, Error)\n> +\t\t\t<< \"Sensors with more than one pad are not supported\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tswitch (entity_->function()) {\n> +\tcase MEDIA_ENT_F_CAM_SENSOR:\n> +\tcase MEDIA_ENT_F_PROC_VIDEO_ISP:\n> +\t\tbreak;\n> +\n> +\tdefault:\n> +\t\tLOG(CameraSensor, Error)\n> +\t\t\t<< \"Invalid sensor function \"\n> +\t\t\t<< 
utils::hex(entity_->function());\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\t/* Create and open the subdev. */\n> +\tsubdev_ = std::make_unique<V4L2Subdevice>(entity_);\n> +\tint ret = subdev_->open();\n> +\tif (ret < 0)\n> +\t\treturn ret;\n> +\n> +\t/*\n> +\t * Clear any flips to be sure we get the \"native\" Bayer order. This is\n> +\t * harmless for sensors where the flips don't affect the Bayer order.\n> +\t */\n> +\tControlList ctrls(subdev_->controls());\n> +\tif (subdev_->controls().find(V4L2_CID_HFLIP) != subdev_->controls().end())\n> +\t\tctrls.set(V4L2_CID_HFLIP, 0);\n> +\tif (subdev_->controls().find(V4L2_CID_VFLIP) != subdev_->controls().end())\n> +\t\tctrls.set(V4L2_CID_VFLIP, 0);\n> +\tsubdev_->setControls(&ctrls);\n> +\n> +\t/* Enumerate, sort and cache media bus codes and sizes. */\n> +\tformats_ = subdev_->formats(pad_);\n> +\tif (formats_.empty()) {\n> +\t\tLOG(CameraSensor, Error) << \"No image format found\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tmbusCodes_ = utils::map_keys(formats_);\n> +\tstd::sort(mbusCodes_.begin(), mbusCodes_.end());\n> +\n> +\tfor (const auto &format : formats_) {\n> +\t\tconst std::vector<SizeRange> &ranges = format.second;\n> +\t\tstd::transform(ranges.begin(), ranges.end(), std::back_inserter(sizes_),\n> +\t\t\t       [](const SizeRange &range) { return range.max; });\n> +\t}\n> +\n> +\tstd::sort(sizes_.begin(), sizes_.end());\n> +\n> +\t/* Remove duplicates. 
*/\n> +\tauto last = std::unique(sizes_.begin(), sizes_.end());\n> +\tsizes_.erase(last, sizes_.end());\n> +\n> +\t/*\n> +\t * VIMC is a bit special, as it does not yet support all the mandatory\n> +\t * requirements regular sensors have to respect.\n> +\t *\n> +\t * Do not validate the driver if it's VIMC and initialize the sensor\n> +\t * properties with static information.\n> +\t *\n> +\t * \\todo Remove the special case once the VIMC driver has been\n> +\t * updated in all test platforms.\n> +\t */\n> +\tif (entity_->device()->driver() == \"vimc\") {\n> +\t\tinitVimcDefaultProperties();\n> +\n> +\t\tret = initProperties();\n> +\t\tif (ret)\n> +\t\t\treturn ret;\n> +\n> +\t\treturn discoverAncillaryDevices();\n> +\t}\n> +\n> +\t/* Get the color filter array pattern (only for RAW sensors). */\n> +\tfor (unsigned int mbusCode : mbusCodes_) {\n> +\t\tconst BayerFormat &bayerFormat = BayerFormat::fromMbusCode(mbusCode);\n> +\t\tif (bayerFormat.isValid()) {\n> +\t\t\tbayerFormat_ = &bayerFormat;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\tret = validateSensorDriver();\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tret = initProperties();\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tret = discoverAncillaryDevices();\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/*\n> +\t * Set HBLANK to the minimum to start with a well-defined line length,\n> +\t * allowing IPA modules that do not modify HBLANK to use the sensor\n> +\t * minimum line length in their calculations.\n> +\t */\n> +\tconst struct v4l2_query_ext_ctrl *hblankInfo = subdev_->controlInfo(V4L2_CID_HBLANK);\n> +\tif (hblankInfo && !(hblankInfo->flags & V4L2_CTRL_FLAG_READ_ONLY)) {\n> +\t\tControlList ctrl(subdev_->controls());\n> +\n> +\t\tctrl.set(V4L2_CID_HBLANK, static_cast<int32_t>(hblankInfo->minimum));\n> +\t\tret = subdev_->setControls(&ctrl);\n> +\t\tif (ret)\n> +\t\t\treturn ret;\n> +\t}\n> +\n> +\treturn applyTestPatternMode(controls::draft::TestPatternModeEnum::TestPatternModeOff);\n> +}\n> +\n> 
+int CameraSensorLegacy::generateId()\n> +{\n> +\tconst std::string devPath = subdev_->devicePath();\n> +\n> +\t/* Try to get ID from firmware description. */\n> +\tid_ = sysfs::firmwareNodePath(devPath);\n> +\tif (!id_.empty())\n> +\t\treturn 0;\n> +\n> +\t/*\n> +\t * Virtual sensors not described in firmware\n> +\t *\n> +\t * Verify it's a platform device and construct ID from the device path\n> +\t * and model of sensor.\n> +\t */\n> +\tif (devPath.find(\"/sys/devices/platform/\", 0) == 0) {\n> +\t\tid_ = devPath.substr(strlen(\"/sys/devices/\")) + \" \" + model();\n> +\t\treturn 0;\n> +\t}\n> +\n> +\tLOG(CameraSensor, Error) << \"Can't generate sensor ID\";\n> +\treturn -EINVAL;\n> +}\n> +\n> +int CameraSensorLegacy::validateSensorDriver()\n> +{\n> +\tint err = 0;\n> +\n> +\t/*\n> +\t * Optional controls are used to register optional sensor properties. If\n> +\t * not present, some values will be defaulted.\n> +\t */\n> +\tstatic constexpr uint32_t optionalControls[] = {\n> +\t\tV4L2_CID_CAMERA_SENSOR_ROTATION,\n> +\t};\n> +\n> +\tconst ControlIdMap &controls = subdev_->controls().idmap();\n> +\tfor (uint32_t ctrl : optionalControls) {\n> +\t\tif (!controls.count(ctrl))\n> +\t\t\tLOG(CameraSensor, Debug)\n> +\t\t\t\t<< \"Optional V4L2 control \" << utils::hex(ctrl)\n> +\t\t\t\t<< \" not supported\";\n> +\t}\n> +\n> +\t/*\n> +\t * Recommended controls are similar to optional controls, but will\n> +\t * become mandatory in the near future. 
Be loud if they're missing.\n> +\t */\n> +\tstatic constexpr uint32_t recommendedControls[] = {\n> +\t\tV4L2_CID_CAMERA_ORIENTATION,\n> +\t};\n> +\n> +\tfor (uint32_t ctrl : recommendedControls) {\n> +\t\tif (!controls.count(ctrl)) {\n> +\t\t\tLOG(CameraSensor, Warning)\n> +\t\t\t\t<< \"Recommended V4L2 control \" << utils::hex(ctrl)\n> +\t\t\t\t<< \" not supported\";\n> +\t\t\terr = -EINVAL;\n> +\t\t}\n> +\t}\n> +\n> +\t/*\n> +\t * Verify if sensor supports horizontal/vertical flips\n> +\t *\n> +\t * \\todo Handle horizontal and vertical flips independently.\n> +\t */\n> +\tconst struct v4l2_query_ext_ctrl *hflipInfo = subdev_->controlInfo(V4L2_CID_HFLIP);\n> +\tconst struct v4l2_query_ext_ctrl *vflipInfo = subdev_->controlInfo(V4L2_CID_VFLIP);\n> +\tif (hflipInfo && !(hflipInfo->flags & V4L2_CTRL_FLAG_READ_ONLY) &&\n> +\t    vflipInfo && !(vflipInfo->flags & V4L2_CTRL_FLAG_READ_ONLY)) {\n> +\t\tsupportFlips_ = true;\n> +\n> +\t\tif (hflipInfo->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT ||\n> +\t\t    vflipInfo->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT)\n> +\t\t\tflipsAlterBayerOrder_ = true;\n> +\t}\n> +\n> +\tif (!supportFlips_)\n> +\t\tLOG(CameraSensor, Debug)\n> +\t\t\t<< \"Camera sensor does not support horizontal/vertical flip\";\n> +\n> +\t/*\n> +\t * Make sure the required selection targets are supported.\n> +\t *\n> +\t * Failures in reading any of the targets are not deemed to be fatal,\n> +\t * but some properties and features, like constructing a\n> +\t * IPACameraSensorInfo for the IPA module, won't be supported.\n> +\t *\n> +\t * \\todo Make support for selection targets mandatory as soon as all\n> +\t * test platforms have been updated.\n> +\t */\n> +\tRectangle rect;\n> +\tint ret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP_BOUNDS, &rect);\n> +\tif (ret) {\n> +\t\t/*\n> +\t\t * Default the pixel array size to the largest size supported\n> +\t\t * by the sensor. 
The sizes_ vector is sorted in ascending\n> +\t\t * order, the largest size is thus the last element.\n> +\t\t */\n> +\t\tpixelArraySize_ = sizes_.back();\n> +\n> +\t\tLOG(CameraSensor, Warning)\n> +\t\t\t<< \"The PixelArraySize property has been defaulted to \"\n> +\t\t\t<< pixelArraySize_;\n> +\t\terr = -EINVAL;\n> +\t} else {\n> +\t\tpixelArraySize_ = rect.size();\n> +\t}\n> +\n> +\tret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP_DEFAULT, &activeArea_);\n> +\tif (ret) {\n> +\t\tactiveArea_ = Rectangle(pixelArraySize_);\n> +\t\tLOG(CameraSensor, Warning)\n> +\t\t\t<< \"The PixelArrayActiveAreas property has been defaulted to \"\n> +\t\t\t<< activeArea_;\n> +\t\terr = -EINVAL;\n> +\t}\n> +\n> +\tret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP, &rect);\n> +\tif (ret) {\n> +\t\tLOG(CameraSensor, Warning)\n> +\t\t\t<< \"Failed to retrieve the sensor crop rectangle\";\n> +\t\terr = -EINVAL;\n> +\t}\n> +\n> +\tif (err) {\n> +\t\tLOG(CameraSensor, Warning)\n> +\t\t\t<< \"The sensor kernel driver needs to be fixed\";\n> +\t\tLOG(CameraSensor, Warning)\n> +\t\t\t<< \"See Documentation/sensor_driver_requirements.rst in the libcamera sources for more information\";\n> +\t}\n> +\n> +\tif (!bayerFormat_)\n> +\t\treturn 0;\n> +\n> +\t/*\n> +\t * For raw sensors, make sure the sensor driver supports the controls\n> +\t * required by the CameraSensor class.\n> +\t */\n> +\tstatic constexpr uint32_t mandatoryControls[] = {\n> +\t\tV4L2_CID_ANALOGUE_GAIN,\n> +\t\tV4L2_CID_EXPOSURE,\n> +\t\tV4L2_CID_HBLANK,\n> +\t\tV4L2_CID_PIXEL_RATE,\n> +\t\tV4L2_CID_VBLANK,\n> +\t};\n> +\n> +\terr = 0;\n> +\tfor (uint32_t ctrl : mandatoryControls) {\n> +\t\tif (!controls.count(ctrl)) {\n> +\t\t\tLOG(CameraSensor, Error)\n> +\t\t\t\t<< \"Mandatory V4L2 control \" << utils::hex(ctrl)\n> +\t\t\t\t<< \" not available\";\n> +\t\t\terr = -EINVAL;\n> +\t\t}\n> +\t}\n> +\n> +\tif (err) {\n> +\t\tLOG(CameraSensor, Error)\n> +\t\t\t<< \"The sensor kernel driver needs to be fixed\";\n> 
+\t\tLOG(CameraSensor, Error)\n> +\t\t\t<< \"See Documentation/sensor_driver_requirements.rst in the libcamera sources for more information\";\n> +\t\treturn err;\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +void CameraSensorLegacy::initVimcDefaultProperties()\n> +{\n> +\t/* Use the largest supported size. */\n> +\tpixelArraySize_ = sizes_.back();\n> +\tactiveArea_ = Rectangle(pixelArraySize_);\n> +}\n> +\n> +void CameraSensorLegacy::initStaticProperties()\n> +{\n> +\tstaticProps_ = CameraSensorProperties::get(model_);\n> +\tif (!staticProps_)\n> +\t\treturn;\n> +\n> +\t/* Register the properties retrieved from the sensor database. */\n> +\tproperties_.set(properties::UnitCellSize, staticProps_->unitCellSize);\n> +\n> +\tinitTestPatternModes();\n> +}\n> +\n> +void CameraSensorLegacy::initTestPatternModes()\n> +{\n> +\tconst auto &v4l2TestPattern = controls().find(V4L2_CID_TEST_PATTERN);\n> +\tif (v4l2TestPattern == controls().end()) {\n> +\t\tLOG(CameraSensor, Debug) << \"V4L2_CID_TEST_PATTERN is not supported\";\n> +\t\treturn;\n> +\t}\n> +\n> +\tconst auto &testPatternModes = staticProps_->testPatternModes;\n> +\tif (testPatternModes.empty()) {\n> +\t\t/*\n> +\t\t * The camera sensor supports test patterns but we don't know\n> +\t\t * how to map them so this should be fixed.\n> +\t\t */\n> +\t\tLOG(CameraSensor, Debug) << \"No static test pattern map for \\'\"\n> +\t\t\t\t\t << model() << \"\\'\";\n> +\t\treturn;\n> +\t}\n> +\n> +\t/*\n> +\t * Create a map that associates the V4L2 control index to the test\n> +\t * pattern mode by reversing the testPatternModes map provided by the\n> +\t * camera sensor properties. 
This makes it easier to verify if the\n> +\t * control index is supported in the below for loop that creates the\n> +\t * list of supported test patterns.\n> +\t */\n> +\tstd::map<int32_t, controls::draft::TestPatternModeEnum> indexToTestPatternMode;\n> +\tfor (const auto &it : testPatternModes)\n> +\t\tindexToTestPatternMode[it.second] = it.first;\n> +\n> +\tfor (const ControlValue &value : v4l2TestPattern->second.values()) {\n> +\t\tconst int32_t index = value.get<int32_t>();\n> +\n> +\t\tconst auto it = indexToTestPatternMode.find(index);\n> +\t\tif (it == indexToTestPatternMode.end()) {\n> +\t\t\tLOG(CameraSensor, Debug)\n> +\t\t\t\t<< \"Test pattern mode \" << index << \" ignored\";\n> +\t\t\tcontinue;\n> +\t\t}\n> +\n> +\t\ttestPatternModes_.push_back(it->second);\n> +\t}\n> +}\n> +\n> +int CameraSensorLegacy::initProperties()\n> +{\n> +\tmodel_ = subdev_->model();\n> +\tproperties_.set(properties::Model, utils::toAscii(model_));\n> +\n> +\t/* Generate a unique ID for the sensor. */\n> +\tint ret = generateId();\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/* Initialize the static properties from the sensor database. */\n> +\tinitStaticProperties();\n> +\n> +\t/* Retrieve and register properties from the kernel interface. 
*/\n> +\tconst ControlInfoMap &controls = subdev_->controls();\n> +\n> +\tconst auto &orientation = controls.find(V4L2_CID_CAMERA_ORIENTATION);\n> +\tif (orientation != controls.end()) {\n> +\t\tint32_t v4l2Orientation = orientation->second.def().get<int32_t>();\n> +\t\tint32_t propertyValue;\n> +\n> +\t\tswitch (v4l2Orientation) {\n> +\t\tdefault:\n> +\t\t\tLOG(CameraSensor, Warning)\n> +\t\t\t\t<< \"Unsupported camera location \"\n> +\t\t\t\t<< v4l2Orientation << \", setting to External\";\n> +\t\t\t[[fallthrough]];\n> +\t\tcase V4L2_CAMERA_ORIENTATION_EXTERNAL:\n> +\t\t\tpropertyValue = properties::CameraLocationExternal;\n> +\t\t\tbreak;\n> +\t\tcase V4L2_CAMERA_ORIENTATION_FRONT:\n> +\t\t\tpropertyValue = properties::CameraLocationFront;\n> +\t\t\tbreak;\n> +\t\tcase V4L2_CAMERA_ORIENTATION_BACK:\n> +\t\t\tpropertyValue = properties::CameraLocationBack;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t\tproperties_.set(properties::Location, propertyValue);\n> +\t} else {\n> +\t\tLOG(CameraSensor, Warning) << \"Failed to retrieve the camera location\";\n> +\t}\n> +\n> +\tconst auto &rotationControl = controls.find(V4L2_CID_CAMERA_SENSOR_ROTATION);\n> +\tif (rotationControl != controls.end()) {\n> +\t\tint32_t propertyValue = rotationControl->second.def().get<int32_t>();\n> +\n> +\t\t/*\n> +\t\t * Cache the Transform associated with the camera mounting\n> +\t\t * rotation for later use in computeTransform().\n> +\t\t */\n> +\t\tbool success;\n> +\t\tmountingOrientation_ = orientationFromRotation(propertyValue, &success);\n> +\t\tif (!success) {\n> +\t\t\tLOG(CameraSensor, Warning)\n> +\t\t\t\t<< \"Invalid rotation of \" << propertyValue\n> +\t\t\t\t<< \" degrees - ignoring\";\n> +\t\t\tmountingOrientation_ = Orientation::Rotate0;\n> +\t\t}\n> +\n> +\t\tproperties_.set(properties::Rotation, propertyValue);\n> +\t} else {\n> +\t\tLOG(CameraSensor, Warning)\n> +\t\t\t<< \"Rotation control not available, default to 0 degrees\";\n> +\t\tproperties_.set(properties::Rotation, 0);\n> 
+\t\tmountingOrientation_ = Orientation::Rotate0;\n> +\t}\n> +\n> +\tproperties_.set(properties::PixelArraySize, pixelArraySize_);\n> +\tproperties_.set(properties::PixelArrayActiveAreas, { activeArea_ });\n> +\n> +\t/* Color filter array pattern, register only for RAW sensors. */\n> +\tif (bayerFormat_) {\n> +\t\tint32_t cfa;\n> +\t\tswitch (bayerFormat_->order) {\n> +\t\tcase BayerFormat::BGGR:\n> +\t\t\tcfa = properties::draft::BGGR;\n> +\t\t\tbreak;\n> +\t\tcase BayerFormat::GBRG:\n> +\t\t\tcfa = properties::draft::GBRG;\n> +\t\t\tbreak;\n> +\t\tcase BayerFormat::GRBG:\n> +\t\t\tcfa = properties::draft::GRBG;\n> +\t\t\tbreak;\n> +\t\tcase BayerFormat::RGGB:\n> +\t\t\tcfa = properties::draft::RGGB;\n> +\t\t\tbreak;\n> +\t\tcase BayerFormat::MONO:\n> +\t\t\tcfa = properties::draft::MONO;\n> +\t\t\tbreak;\n> +\t\t}\n> +\n> +\t\tproperties_.set(properties::draft::ColorFilterArrangement, cfa);\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +int CameraSensorLegacy::discoverAncillaryDevices()\n> +{\n> +\tint ret;\n> +\n> +\tfor (MediaEntity *ancillary : entity_->ancillaryEntities()) {\n> +\t\tswitch (ancillary->function()) {\n> +\t\tcase MEDIA_ENT_F_LENS:\n> +\t\t\tfocusLens_ = std::make_unique<CameraLens>(ancillary);\n> +\t\t\tret = focusLens_->init();\n> +\t\t\tif (ret) {\n> +\t\t\t\tLOG(CameraSensor, Error)\n> +\t\t\t\t\t<< \"Lens initialisation failed, lens disabled\";\n> +\t\t\t\tfocusLens_.reset();\n> +\t\t\t}\n> +\t\t\tbreak;\n> +\n> +\t\tdefault:\n> +\t\t\tLOG(CameraSensor, Warning)\n> +\t\t\t\t<< \"Unsupported ancillary entity function \"\n> +\t\t\t\t<< ancillary->function();\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +std::vector<Size> CameraSensorLegacy::sizes(unsigned int mbusCode) const\n> +{\n> +\tstd::vector<Size> sizes;\n> +\n> +\tconst auto &format = formats_.find(mbusCode);\n> +\tif (format == formats_.end())\n> +\t\treturn sizes;\n> +\n> +\tconst std::vector<SizeRange> &ranges = format->second;\n> 
+\tstd::transform(ranges.begin(), ranges.end(), std::back_inserter(sizes),\n> +\t\t       [](const SizeRange &range) { return range.max; });\n> +\n> +\tstd::sort(sizes.begin(), sizes.end());\n> +\n> +\treturn sizes;\n> +}\n> +\n> +Size CameraSensorLegacy::resolution() const\n> +{\n> +\treturn std::min(sizes_.back(), activeArea_.size());\n> +}\n> +\n> +V4L2SubdeviceFormat\n> +CameraSensorLegacy::getFormat(const std::vector<unsigned int> &mbusCodes,\n> +\t\t\t      const Size &size) const\n> +{\n> +\tunsigned int desiredArea = size.width * size.height;\n> +\tunsigned int bestArea = UINT_MAX;\n> +\tfloat desiredRatio = static_cast<float>(size.width) / size.height;\n> +\tfloat bestRatio = FLT_MAX;\n> +\tconst Size *bestSize = nullptr;\n> +\tuint32_t bestCode = 0;\n> +\n> +\tfor (unsigned int code : mbusCodes) {\n> +\t\tconst auto formats = formats_.find(code);\n> +\t\tif (formats == formats_.end())\n> +\t\t\tcontinue;\n> +\n> +\t\tfor (const SizeRange &range : formats->second) {\n> +\t\t\tconst Size &sz = range.max;\n> +\n> +\t\t\tif (sz.width < size.width || sz.height < size.height)\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tfloat ratio = static_cast<float>(sz.width) / sz.height;\n> +\t\t\tfloat ratioDiff = fabsf(ratio - desiredRatio);\n> +\t\t\tunsigned int area = sz.width * sz.height;\n> +\t\t\tunsigned int areaDiff = area - desiredArea;\n> +\n> +\t\t\tif (ratioDiff > bestRatio)\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tif (ratioDiff < bestRatio || areaDiff < bestArea) {\n> +\t\t\t\tbestRatio = ratioDiff;\n> +\t\t\t\tbestArea = areaDiff;\n> +\t\t\t\tbestSize = &sz;\n> +\t\t\t\tbestCode = code;\n> +\t\t\t}\n> +\t\t}\n> +\t}\n> +\n> +\tif (!bestSize) {\n> +\t\tLOG(CameraSensor, Debug) << \"No supported format or size found\";\n> +\t\treturn {};\n> +\t}\n> +\n> +\tV4L2SubdeviceFormat format{\n> +\t\t.code = bestCode,\n> +\t\t.size = *bestSize,\n> +\t\t.colorSpace = ColorSpace::Raw,\n> +\t};\n> +\n> +\treturn format;\n> +}\n> +\n> +int 
CameraSensorLegacy::setFormat(V4L2SubdeviceFormat *format, Transform transform)\n> +{\n> +\t/* Configure flips if the sensor supports that. */\n> +\tif (supportFlips_) {\n> +\t\tControlList flipCtrls(subdev_->controls());\n> +\n> +\t\tflipCtrls.set(V4L2_CID_HFLIP,\n> +\t\t\t      static_cast<int32_t>(!!(transform & Transform::HFlip)));\n> +\t\tflipCtrls.set(V4L2_CID_VFLIP,\n> +\t\t\t      static_cast<int32_t>(!!(transform & Transform::VFlip)));\n> +\n> +\t\tint ret = subdev_->setControls(&flipCtrls);\n> +\t\tif (ret)\n> +\t\t\treturn ret;\n> +\t}\n> +\n> +\t/* Apply format on the subdev. */\n> +\tint ret = subdev_->setFormat(pad_, format);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\tsubdev_->updateControlInfo();\n> +\treturn 0;\n> +}\n> +\n> +int CameraSensorLegacy::tryFormat(V4L2SubdeviceFormat *format) const\n> +{\n> +\treturn subdev_->setFormat(pad_, format,\n> +\t\t\t\t  V4L2Subdevice::Whence::TryFormat);\n> +}\n> +\n> +int CameraSensorLegacy::applyConfiguration(const SensorConfiguration &config,\n> +\t\t\t\t\t   Transform transform,\n> +\t\t\t\t\t   V4L2SubdeviceFormat *sensorFormat)\n> +{\n> +\tif (!config.isValid()) {\n> +\t\tLOG(CameraSensor, Error) << \"Invalid sensor configuration\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tstd::vector<unsigned int> filteredCodes;\n> +\tstd::copy_if(mbusCodes_.begin(), mbusCodes_.end(),\n> +\t\t     std::back_inserter(filteredCodes),\n> +\t\t     [&config](unsigned int mbusCode) {\n> +\t\t\t     BayerFormat bayer = BayerFormat::fromMbusCode(mbusCode);\n> +\t\t\t     if (bayer.bitDepth == config.bitDepth)\n> +\t\t\t\t     return true;\n> +\t\t\t     return false;\n> +\t\t     });\n> +\tif (filteredCodes.empty()) {\n> +\t\tLOG(CameraSensor, Error)\n> +\t\t\t<< \"Cannot find any format with bit depth \"\n> +\t\t\t<< config.bitDepth;\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\t/*\n> +\t * Compute the sensor's data frame size by applying the cropping\n> +\t * rectangle, subsampling and output crop to the sensor's pixel 
array\n> +\t * size.\n> +\t *\n> +\t * \todo The actual size computation is for now ignored and only the\n> +\t * output size is considered. This implies that resolutions obtained\n> +\t * with two different cropping/subsampling will look identical and\n> +\t * only the first found one will be considered.\n> +\t */\n> +\tV4L2SubdeviceFormat subdevFormat = {};\n> +\tfor (unsigned int code : filteredCodes) {\n> +\t\tfor (const Size &size : sizes(code)) {\n> +\t\t\tif (size.width != config.outputSize.width ||\n> +\t\t\t    size.height != config.outputSize.height)\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tsubdevFormat.code = code;\n> +\t\t\tsubdevFormat.size = size;\n> +\t\t\tbreak;\n> +\t\t}\n> +\t}\n> +\tif (!subdevFormat.code) {\n> +\t\tLOG(CameraSensor, Error) << \"Invalid output size in sensor configuration\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tint ret = setFormat(&subdevFormat, transform);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\t/*\n> +\t * Return to the caller the format actually applied to the sensor.\n> +\t * This is relevant if transform has changed the bayer pattern order.\n> +\t */\n> +\tif (sensorFormat)\n> +\t\t*sensorFormat = subdevFormat;\n> +\n> +\t/* \todo Handle AnalogCrop. Most sensors do not support set_selection */\n> +\t/* \todo Handle scaling in the digital domain. */\n> +\n> +\treturn 0;\n> +}\n> +\n> +int CameraSensorLegacy::sensorInfo(IPACameraSensorInfo *info) const\n> +{\n> +\tif (!bayerFormat_)\n> +\t\treturn -EINVAL;\n> +\n> +\tinfo->model = model();\n> +\n> +\t/*\n> +\t * The active area size is a static property, while the crop\n> +\t * rectangle needs to be re-read as it depends on the sensor\n> +\t * configuration.\n> +\t */\n> +\tinfo->activeAreaSize = { activeArea_.width, activeArea_.height };\n> +\n> +\t/*\n> +\t * \todo Support for retrieving the crop rectangle is scheduled to\n> +\t * become mandatory. 
For the time being, use the default value if it has\n> +\t * been initialized at sensor driver validation time.\n> +\t */\n> +\tint ret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP, &info->analogCrop);\n> +\tif (ret) {\n> +\t\tinfo->analogCrop = activeArea_;\n> +\t\tLOG(CameraSensor, Warning)\n> +\t\t\t<< \"The analogue crop rectangle has been defaulted to the active area size\";\n> +\t}\n> +\n> +\t/*\n> +\t * IPACameraSensorInfo::analogCrop::x and IPACameraSensorInfo::analogCrop::y\n> +\t * are defined relative to the active pixel area, while V4L2's\n> +\t * TGT_CROP target is defined with respect to the full pixel array.\n> +\t *\n> +\t * Compensate for this by subtracting the active area offset.\n> +\t */\n> +\tinfo->analogCrop.x -= activeArea_.x;\n> +\tinfo->analogCrop.y -= activeArea_.y;\n> +\n> +\t/* The bit depth and image size depend on the currently applied format. */\n> +\tV4L2SubdeviceFormat format{};\n> +\tret = subdev_->getFormat(pad_, &format);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\tinfo->bitsPerPixel = MediaBusFormatInfo::info(format.code).bitsPerPixel;\n> +\tinfo->outputSize = format.size;\n> +\n> +\tstd::optional<int32_t> cfa = properties_.get(properties::draft::ColorFilterArrangement);\n> +\tinfo->cfaPattern = cfa ? *cfa : properties::draft::RGB;\n> +\n> +\t/*\n> +\t * Retrieve the pixel rate, line length and minimum/maximum frame\n> +\t * duration through V4L2 controls. 
Support for the V4L2_CID_PIXEL_RATE,\n> +\t * V4L2_CID_HBLANK and V4L2_CID_VBLANK controls is mandatory.\n> +\t */\n> +\tControlList ctrls = subdev_->getControls({ V4L2_CID_PIXEL_RATE,\n> +\t\t\t\t\t\t   V4L2_CID_HBLANK,\n> +\t\t\t\t\t\t   V4L2_CID_VBLANK });\n> +\tif (ctrls.empty()) {\n> +\t\tLOG(CameraSensor, Error)\n> +\t\t\t<< \"Failed to retrieve camera info controls\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tinfo->pixelRate = ctrls.get(V4L2_CID_PIXEL_RATE).get<int64_t>();\n> +\n> +\tconst ControlInfo hblank = ctrls.infoMap()->at(V4L2_CID_HBLANK);\n> +\tinfo->minLineLength = info->outputSize.width + hblank.min().get<int32_t>();\n> +\tinfo->maxLineLength = info->outputSize.width + hblank.max().get<int32_t>();\n> +\n> +\tconst ControlInfo vblank = ctrls.infoMap()->at(V4L2_CID_VBLANK);\n> +\tinfo->minFrameLength = info->outputSize.height + vblank.min().get<int32_t>();\n> +\tinfo->maxFrameLength = info->outputSize.height + vblank.max().get<int32_t>();\n> +\n> +\treturn 0;\n> +}\n> +\n> +Transform CameraSensorLegacy::computeTransform(Orientation *orientation) const\n> +{\n> +\t/*\n> +\t * If we cannot do any flips we cannot change the native camera mounting\n> +\t * orientation.\n> +\t */\n> +\tif (!supportFlips_) {\n> +\t\t*orientation = mountingOrientation_;\n> +\t\treturn Transform::Identity;\n> +\t}\n> +\n> +\t/*\n> +\t * Now compute the required transform to obtain 'orientation' starting\n> +\t * from the mounting rotation.\n> +\t *\n> +\t * As a note:\n> +\t * \torientation / mountingOrientation_ = transform\n> +\t * \tmountingOrientation_ * transform = orientation\n> +\t */\n> +\tTransform transform = *orientation / mountingOrientation_;\n> +\n> +\t/*\n> +\t * If transform contains any Transpose we cannot do it, so adjust\n> +\t * 'orientation' to report the image native orientation and return Identity.\n> +\t */\n> +\tif (!!(transform & Transform::Transpose)) {\n> +\t\t*orientation = mountingOrientation_;\n> +\t\treturn Transform::Identity;\n> +\t}\n> 
+\n> +\treturn transform;\n> +}\n> +\n> +BayerFormat::Order CameraSensorLegacy::bayerOrder(Transform t) const\n> +{\n> +\t/* Return a defined but meaningless value for non-Bayer sensors. */\n> +\tif (!bayerFormat_)\n> +\t\treturn BayerFormat::Order::BGGR;\n> +\n> +\tif (!flipsAlterBayerOrder_)\n> +\t\treturn bayerFormat_->order;\n> +\n> +\t/*\n> +\t * Apply the transform to the native (i.e. untransformed) Bayer order,\n> +\t * using the rest of the Bayer format supplied by the caller.\n> +\t */\n> +\treturn bayerFormat_->transform(t).order;\n> +}\n> +\n> +const ControlInfoMap &CameraSensorLegacy::controls() const\n> +{\n> +\treturn subdev_->controls();\n> +}\n> +\n> +ControlList CameraSensorLegacy::getControls(const std::vector<uint32_t> &ids)\n> +{\n> +\treturn subdev_->getControls(ids);\n> +}\n> +\n> +int CameraSensorLegacy::setControls(ControlList *ctrls)\n> +{\n> +\treturn subdev_->setControls(ctrls);\n> +}\n> +\n> +int CameraSensorLegacy::setTestPatternMode(controls::draft::TestPatternModeEnum mode)\n> +{\n> +\tif (testPatternMode_ == mode)\n> +\t\treturn 0;\n> +\n> +\tif (testPatternModes_.empty()) {\n> +\t\tLOG(CameraSensor, Error)\n> +\t\t\t<< \"Camera sensor does not support test pattern modes.\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\treturn applyTestPatternMode(mode);\n> +}\n> +\n> +int CameraSensorLegacy::applyTestPatternMode(controls::draft::TestPatternModeEnum mode)\n> +{\n> +\tif (testPatternModes_.empty())\n> +\t\treturn 0;\n> +\n> +\tauto it = std::find(testPatternModes_.begin(), testPatternModes_.end(),\n> +\t\t\t    mode);\n> +\tif (it == testPatternModes_.end()) {\n> +\t\tLOG(CameraSensor, Error) << \"Unsupported test pattern mode \"\n> +\t\t\t\t\t << mode;\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tLOG(CameraSensor, Debug) << \"Apply test pattern mode \" << mode;\n> +\n> +\tint32_t index = staticProps_->testPatternModes.at(mode);\n> +\tControlList ctrls{ controls() };\n> +\tctrls.set(V4L2_CID_TEST_PATTERN, index);\n> +\n> +\tint ret = 
setControls(&ctrls);\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\ttestPatternMode_ = mode;\n> +\n> +\treturn 0;\n> +}\n> +\n> +std::string CameraSensorLegacy::logPrefix() const\n> +{\n> +\treturn \"'\" + entity_->name() + \"'\";\n> +}\n> +\n> +REGISTER_CAMERA_SENSOR(CameraSensorLegacy)\n> +\n> +} /* namespace libcamera */\n> diff --git a/src/libcamera/sensor/meson.build b/src/libcamera/sensor/meson.build\n> index bf4b131a94b1..e83020fc22c3 100644\n> --- a/src/libcamera/sensor/meson.build\n> +++ b/src/libcamera/sensor/meson.build\n> @@ -2,5 +2,6 @@\n>  \n>  libcamera_sources += files([\n>      'camera_sensor.cpp',\n> +    'camera_sensor_legacy.cpp',\n>      'camera_sensor_properties.cpp',\n>  ])\n> -- \n> Regards,\n> \n> Laurent Pinchart\n>","headers":{"Date":"Wed, 6 Mar 2024 17:55:07 +0100","From":"Stefan Klug <stefan.klug@ideasonboard.com>","To":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","Subject":"Re: [PATCH/RFC 20/32] libcamera: camera_sensor: Create abstract base\n\tclass","Message-ID":"<20240306165507.hdzab73gkrobkd3z@jasper>","References":"<20240301212121.9072-1-laurent.pinchart@ideasonboard.com>\n\t<20240301212121.9072-21-laurent.pinchart@ideasonboard.com>","In-Reply-To":"<20240301212121.9072-21-laurent.pinchart@ideasonboard.com>","Cc":"libcamera-devel@lists.libcamera.org, Sakari Ailus <sakari.ailus@iki.fi>"}}]