[{"id":17678,"web_url":"https://patchwork.libcamera.org/comment/17678/","msgid":"<20210622074107.6oeneo3z4yluohsz@uno.localdomain>","date":"2021-06-22T07:41:07","subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","submitter":{"id":3,"url":"https://patchwork.libcamera.org/api/people/3/","name":"Jacopo Mondi","email":"jacopo@jmondi.org"},"content":"Hi Hiro,\n\nOn Tue, Jun 22, 2021 at 10:34:27AM +0900, Hirokazu Honda wrote:\n> Hi Jacopo, thank you for the patch.\n>\n> I failed to apply the patch on the top of the latest tree to review.\n> Could you tell me the parent commit which I can apply this patch?\n\nweird, I just applied these two patches cleanly on the latest\nmaster which for me is\n969da3189439 (\"libcamera: utils: Support systems that lack secure_getenv and issetugid\"\n\n>\n> -Hiro\n> On Tue, Jun 22, 2021 at 12:29 AM Jacopo Mondi <jacopo@jmondi.org> wrote:\n>\n> > The camera_device.cpp has grown a little too much, and it has quickly\n> > become hard to maintain. 
Break out the handling of the static\n> > information collected at camera initialization time to a new\n> > CameraCapabilities class.\n> >\n> > Break out from the camera_device.cpp file all the functions relative to:\n> > - Initialization of supported stream configurations\n> > - Initialization of static metadata\n> > - Initialization of request templates\n> >\n> > Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>\n> > Acked-by: Paul Elder <paul.elder@ideasonboard.com>\n> > Tested-by: Paul Elder <paul.elder@ideasonboard.com>\n> > ---\n> >  src/android/camera_capabilities.cpp | 1164 +++++++++++++++++++++++++++\n> >  src/android/camera_capabilities.h   |   65 ++\n> >  src/android/camera_device.cpp       | 1147 +-------------------------\n> >  src/android/camera_device.h         |   27 +-\n> >  src/android/meson.build             |    1 +\n> >  5 files changed, 1245 insertions(+), 1159 deletions(-)\n> >  create mode 100644 src/android/camera_capabilities.cpp\n> >  create mode 100644 src/android/camera_capabilities.h\n> >\n> > diff --git a/src/android/camera_capabilities.cpp\n> > b/src/android/camera_capabilities.cpp\n> > new file mode 100644\n> > index 000000000000..311a2c839586\n> > --- /dev/null\n> > +++ b/src/android/camera_capabilities.cpp\n> > @@ -0,0 +1,1164 @@\n> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> > +/*\n> > + * Copyright (C) 2021, Google Inc.\n> > + *\n> > + * camera_capabilities.cpp - Camera static properties manager\n> > + */\n> > +\n> > +#include \"camera_capabilities.h\"\n> > +\n> > +#include <array>\n> > +#include <cmath>\n> > +\n> > +#include <hardware/camera3.h>\n> > +\n> > +#include <libcamera/control_ids.h>\n> > +#include <libcamera/controls.h>\n> > +#include <libcamera/property_ids.h>\n> > +\n> > +#include \"libcamera/internal/formats.h\"\n> > +#include \"libcamera/internal/log.h\"\n> > +\n> > +using namespace libcamera;\n> > +\n> > +LOG_DECLARE_CATEGORY(HAL)\n> > +\n> > +namespace {\n> > +\n> > +/*\n> > + * \\var 
camera3Resolutions\n> > + * \\brief The list of image resolutions defined as mandatory to be\n> > supported by\n> > + * the Android Camera3 specification\n> > + */\n> > +const std::vector<Size> camera3Resolutions = {\n> > +       { 320, 240 },\n> > +       { 640, 480 },\n> > +       { 1280, 720 },\n> > +       { 1920, 1080 }\n> > +};\n> > +\n> > +/*\n> > + * \\struct Camera3Format\n> > + * \\brief Data associated with an Android format identifier\n> > + * \\var libcameraFormats List of libcamera pixel formats compatible with\n> > the\n> > + * Android format\n> > + * \\var name The human-readable representation of the Android format code\n> > + */\n> > +struct Camera3Format {\n> > +       std::vector<PixelFormat> libcameraFormats;\n> > +       bool mandatory;\n> > +       const char *name;\n> > +};\n> > +\n> > +/*\n> > + * \\var camera3FormatsMap\n> > + * \\brief Associate Android format code with ancillary data\n> > + */\n> > +const std::map<int, const Camera3Format> camera3FormatsMap = {\n> > +       {\n> > +               HAL_PIXEL_FORMAT_BLOB, {\n> > +                       { formats::MJPEG },\n> > +                       true,\n> > +                       \"BLOB\"\n> > +               }\n> > +       }, {\n> > +               HAL_PIXEL_FORMAT_YCbCr_420_888, {\n> > +                       { formats::NV12, formats::NV21 },\n> > +                       true,\n> > +                       \"YCbCr_420_888\"\n> > +               }\n> > +       }, {\n> > +               /*\n> > +                * \\todo Translate IMPLEMENTATION_DEFINED inspecting the\n> > gralloc\n> > +                * usage flag. 
For now, copy the YCbCr_420 configuration.\n> > +                */\n> > +               HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> > +                       { formats::NV12, formats::NV21 },\n> > +                       true,\n> > +                       \"IMPLEMENTATION_DEFINED\"\n> > +               }\n> > +       }, {\n> > +               HAL_PIXEL_FORMAT_RAW10, {\n> > +                       {\n> > +                               formats::SBGGR10_CSI2P,\n> > +                               formats::SGBRG10_CSI2P,\n> > +                               formats::SGRBG10_CSI2P,\n> > +                               formats::SRGGB10_CSI2P\n> > +                       },\n> > +                       false,\n> > +                       \"RAW10\"\n> > +               }\n> > +       }, {\n> > +               HAL_PIXEL_FORMAT_RAW12, {\n> > +                       {\n> > +                               formats::SBGGR12_CSI2P,\n> > +                               formats::SGBRG12_CSI2P,\n> > +                               formats::SGRBG12_CSI2P,\n> > +                               formats::SRGGB12_CSI2P\n> > +                       },\n> > +                       false,\n> > +                       \"RAW12\"\n> > +               }\n> > +       }, {\n> > +               HAL_PIXEL_FORMAT_RAW16, {\n> > +                       {\n> > +                               formats::SBGGR16,\n> > +                               formats::SGBRG16,\n> > +                               formats::SGRBG16,\n> > +                               formats::SRGGB16\n> > +                       },\n> > +                       false,\n> > +                       \"RAW16\"\n> > +               }\n> > +       },\n> > +};\n> > +\n> > +} /* namespace */\n> > +\n> > +int CameraCapabilities::initialize(std::shared_ptr<libcamera::Camera>\n> > camera,\n> > +                                  int orientation, int facing)\n> > +{\n> > +       camera_ = camera;\n> > +       orientation_ = orientation;\n> > + 
      facing_ = facing;\n> > +\n> > +       /* Acquire the camera and initialize available stream\n> > configurations. */\n> > +       int ret = camera_->acquire();\n> > +       if (ret) {\n> > +               LOG(HAL, Error) << \"Failed to temporarily acquire the\n> > camera\";\n> > +               return ret;\n> > +       }\n> > +\n> > +       ret = initializeStreamConfigurations();\n> > +       camera_->release();\n> > +       if (ret)\n> > +               return ret;\n> > +\n> > +       return initializeStaticMetadata();\n> > +}\n> > +\n> > +std::vector<Size>\n> > CameraCapabilities::getYUVResolutions(CameraConfiguration *cameraConfig,\n> > +                                                       const PixelFormat\n> > &pixelFormat,\n> > +                                                       const\n> > std::vector<Size> &resolutions)\n> > +{\n> > +       std::vector<Size> supportedResolutions;\n> > +\n> > +       StreamConfiguration &cfg = cameraConfig->at(0);\n> > +       for (const Size &res : resolutions) {\n> > +               cfg.pixelFormat = pixelFormat;\n> > +               cfg.size = res;\n> > +\n> > +               CameraConfiguration::Status status =\n> > cameraConfig->validate();\n> > +               if (status != CameraConfiguration::Valid) {\n> > +                       LOG(HAL, Debug) << cfg.toString() << \" not\n> > supported\";\n> > +                       continue;\n> > +               }\n> > +\n> > +               LOG(HAL, Debug) << cfg.toString() << \" supported\";\n> > +\n> > +               supportedResolutions.push_back(res);\n> > +       }\n> > +\n> > +       return supportedResolutions;\n> > +}\n> > +\n> > +std::vector<Size> CameraCapabilities::getRawResolutions(const\n> > libcamera::PixelFormat &pixelFormat)\n> > +{\n> > +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > +               camera_->generateConfiguration({ StreamRole::Raw });\n> > +       StreamConfiguration &cfg = cameraConfig->at(0);\n> > +       const 
StreamFormats &formats = cfg.formats();\n> > +       std::vector<Size> supportedResolutions =\n> > formats.sizes(pixelFormat);\n> > +\n> > +       return supportedResolutions;\n> > +}\n> > +\n> > +/*\n> > + * Initialize the format conversion map to translate from Android format\n> > + * identifier to libcamera pixel formats and fill in the list of supported\n> > + * stream configurations to be reported to the Android camera framework\n> > through\n> > + * the Camera static metadata.\n> > + */\n> > +int CameraCapabilities::initializeStreamConfigurations()\n> > +{\n> > +       /*\n> > +        * Get the maximum output resolutions\n> > +        * \\todo Get this from the camera properties once defined\n> > +        */\n> > +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > +               camera_->generateConfiguration({ StillCapture });\n> > +       if (!cameraConfig) {\n> > +               LOG(HAL, Error) << \"Failed to get maximum resolution\";\n> > +               return -EINVAL;\n> > +       }\n> > +       StreamConfiguration &cfg = cameraConfig->at(0);\n> > +\n> > +       /*\n> > +        * \\todo JPEG - Adjust the maximum available resolution by taking\n> > the\n> > +        * JPEG encoder requirements into account (alignment and aspect\n> > ratio).\n> > +        */\n> > +       const Size maxRes = cfg.size;\n> > +       LOG(HAL, Debug) << \"Maximum supported resolution: \" <<\n> > maxRes.toString();\n> > +\n> > +       /*\n> > +        * Build the list of supported image resolutions.\n> > +        *\n> > +        * The resolutions listed in camera3Resolutions are mandatory to\n> > be\n> > +        * supported, up to the camera maximum resolution.\n> > +        *\n> > +        * Augment the list by adding resolutions calculated from the\n> > camera\n> > +        * maximum one.\n> > +        */\n> > +       std::vector<Size> cameraResolutions;\n> > +       std::copy_if(camera3Resolutions.begin(), camera3Resolutions.end(),\n> > +                    
std::back_inserter(cameraResolutions),\n> > +                    [&](const Size &res) { return res < maxRes; });\n> > +\n> > +       /*\n> > +        * The Camera3 specification suggests adding 1/2 and 1/4 of the\n> > maximum\n> > +        * resolution.\n> > +        */\n> > +       for (unsigned int divider = 2;; divider <<= 1) {\n> > +               Size derivedSize{\n> > +                       maxRes.width / divider,\n> > +                       maxRes.height / divider,\n> > +               };\n> > +\n> > +               if (derivedSize.width < 320 ||\n> > +                   derivedSize.height < 240)\n> > +                       break;\n> > +\n> > +               cameraResolutions.push_back(derivedSize);\n> > +       }\n> > +       cameraResolutions.push_back(maxRes);\n> > +\n> > +       /* Remove duplicated entries from the list of supported\n> > resolutions. */\n> > +       std::sort(cameraResolutions.begin(), cameraResolutions.end());\n> > +       auto last = std::unique(cameraResolutions.begin(),\n> > cameraResolutions.end());\n> > +       cameraResolutions.erase(last, cameraResolutions.end());\n> > +\n> > +       /*\n> > +        * Build the list of supported camera formats.\n> > +        *\n> > +        * To each Android format a list of compatible libcamera formats is\n> > +        * associated. 
The first libcamera format that tests successful is\n> > added\n> > +        * to the format translation map used when configuring the streams.\n> > +        * It is then tested against the list of supported camera\n> > resolutions to\n> > +        * build the stream configuration map reported through the camera\n> > static\n> > +        * metadata.\n> > +        */\n> > +       Size maxJpegSize;\n> > +       for (const auto &format : camera3FormatsMap) {\n> > +               int androidFormat = format.first;\n> > +               const Camera3Format &camera3Format = format.second;\n> > +               const std::vector<PixelFormat> &libcameraFormats =\n> > +                       camera3Format.libcameraFormats;\n> > +\n> > +               LOG(HAL, Debug) << \"Trying to map Android format \"\n> > +                               << camera3Format.name;\n> > +\n> > +               /*\n> > +                * JPEG is always supported, either produced directly by\n> > the\n> > +                * camera, or encoded in the HAL.\n> > +                */\n> > +               if (androidFormat == HAL_PIXEL_FORMAT_BLOB) {\n> > +                       formatsMap_[androidFormat] = formats::MJPEG;\n> > +                       LOG(HAL, Debug) << \"Mapped Android format \"\n> > +                                       << camera3Format.name << \" to \"\n> > +                                       << formats::MJPEG.toString()\n> > +                                       << \" (fixed mapping)\";\n> > +                       continue;\n> > +               }\n> > +\n> > +               /*\n> > +                * Test the libcamera formats that can produce images\n> > +                * compatible with the format defined by Android.\n> > +                */\n> > +               PixelFormat mappedFormat;\n> > +               for (const PixelFormat &pixelFormat : libcameraFormats) {\n> > +\n> > +                       LOG(HAL, Debug) << \"Testing \" <<\n> > pixelFormat.toString();\n> > +\n> 
> +                       /*\n> > +                        * The stream configuration size can be adjusted,\n> > +                        * not the pixel format.\n> > +                        *\n> > +                        * \\todo This could be simplified once all pipeline\n> > +                        * handlers will report the StreamFormats list of\n> > +                        * supported formats.\n> > +                        */\n> > +                       cfg.pixelFormat = pixelFormat;\n> > +\n> > +                       CameraConfiguration::Status status =\n> > cameraConfig->validate();\n> > +                       if (status != CameraConfiguration::Invalid &&\n> > +                           cfg.pixelFormat == pixelFormat) {\n> > +                               mappedFormat = pixelFormat;\n> > +                               break;\n> > +                       }\n> > +               }\n> > +\n> > +               if (!mappedFormat.isValid()) {\n> > +                       /* If the format is not mandatory, skip it. 
*/\n> > +                       if (!camera3Format.mandatory)\n> > +                               continue;\n> > +\n> > +                       LOG(HAL, Error)\n> > +                               << \"Failed to map mandatory Android format\n> > \"\n> > +                               << camera3Format.name << \" (\"\n> > +                               << utils::hex(androidFormat) << \"):\n> > aborting\";\n> > +                       return -EINVAL;\n> > +               }\n> > +\n> > +               /*\n> > +                * Record the mapping and then proceed to generate the\n> > +                * stream configurations map, by testing the image\n> > resolutions.\n> > +                */\n> > +               formatsMap_[androidFormat] = mappedFormat;\n> > +               LOG(HAL, Debug) << \"Mapped Android format \"\n> > +                               << camera3Format.name << \" to \"\n> > +                               << mappedFormat.toString();\n> > +\n> > +               std::vector<Size> resolutions;\n> > +               const PixelFormatInfo &info =\n> > PixelFormatInfo::info(mappedFormat);\n> > +               if (info.colourEncoding ==\n> > PixelFormatInfo::ColourEncodingRAW)\n> > +                       resolutions = getRawResolutions(mappedFormat);\n> > +               else\n> > +                       resolutions = getYUVResolutions(cameraConfig.get(),\n> > +                                                       mappedFormat,\n> > +                                                       cameraResolutions);\n> > +\n> > +               for (const Size &res : resolutions) {\n> > +                       streamConfigurations_.push_back({ res,\n> > androidFormat });\n> > +\n> > +                       /*\n> > +                        * If the format is HAL_PIXEL_FORMAT_YCbCr_420_888\n> > +                        * from which JPEG is produced, add an entry for\n> > +                        * the JPEG stream.\n> > +                        *\n> > +            
            * \\todo Wire the JPEG encoder to query the\n> > supported\n> > +                        * sizes provided a list of formats it can encode.\n> > +                        *\n> > +                        * \\todo Support JPEG streams produced by the\n> > Camera\n> > +                        * natively.\n> > +                        */\n> > +                       if (androidFormat ==\n> > HAL_PIXEL_FORMAT_YCbCr_420_888) {\n> > +                               streamConfigurations_.push_back(\n> > +                                       { res, HAL_PIXEL_FORMAT_BLOB });\n> > +                               maxJpegSize = std::max(maxJpegSize, res);\n> > +                       }\n> > +               }\n> > +\n> > +               /*\n> > +                * \\todo Calculate the maximum JPEG buffer size by asking\n> > the\n> > +                * encoder giving the maximum frame size required.\n> > +                */\n> > +               maxJpegBufferSize_ = maxJpegSize.width *\n> > maxJpegSize.height * 1.5;\n> > +       }\n> > +\n> > +       LOG(HAL, Debug) << \"Collected stream configuration map: \";\n> > +       for (const auto &entry : streamConfigurations_)\n> > +               LOG(HAL, Debug) << \"{ \" << entry.resolution.toString() <<\n> > \" - \"\n> > +                               << utils::hex(entry.androidFormat) << \" }\";\n> > +\n> > +       return 0;\n> > +}\n> > +\n> > +int CameraCapabilities::initializeStaticMetadata()\n> > +{\n> > +       staticMetadata_ = std::make_unique<CameraMetadata>(64, 1024);\n> > +       if (!staticMetadata_->isValid()) {\n> > +               LOG(HAL, Error) << \"Failed to allocate static metadata\";\n> > +               staticMetadata_.reset();\n> > +               return -EINVAL;\n> > +       }\n> > +\n> > +       const ControlInfoMap &controlsInfo = camera_->controls();\n> > +       const ControlList &properties = camera_->properties();\n> > +\n> > +       /* Color correction static metadata. 
*/\n> > +       {\n> > +               std::vector<uint8_t> data;\n> > +               data.reserve(3);\n> > +               const auto &infoMap =\n> > controlsInfo.find(&controls::draft::ColorCorrectionAberrationMode);\n> > +               if (infoMap != controlsInfo.end()) {\n> > +                       for (const auto &value : infoMap->second.values())\n> > +                               data.push_back(value.get<int32_t>());\n> > +               } else {\n> > +\n> >  data.push_back(ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF);\n> > +               }\n> > +\n> >  staticMetadata_->addEntry(ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> > +                                         data);\n> > +       }\n> > +\n> > +       /* Control static metadata. */\n> > +       std::vector<uint8_t> aeAvailableAntiBandingModes = {\n> > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_OFF,\n> > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_50HZ,\n> > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_60HZ,\n> > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO,\n> > +       };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> > +                                 aeAvailableAntiBandingModes);\n> > +\n> > +       std::vector<uint8_t> aeAvailableModes = {\n> > +               ANDROID_CONTROL_AE_MODE_ON,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> > +                                 aeAvailableModes);\n> > +\n> > +       int64_t minFrameDurationNsec = -1;\n> > +       int64_t maxFrameDurationNsec = -1;\n> > +       const auto frameDurationsInfo =\n> > controlsInfo.find(&controls::FrameDurationLimits);\n> > +       if (frameDurationsInfo != controlsInfo.end()) {\n> > +               minFrameDurationNsec =\n> > frameDurationsInfo->second.min().get<int64_t>() * 1000;\n> > +               maxFrameDurationNsec =\n> > frameDurationsInfo->second.max().get<int64_t>() * 1000;\n> 
> +\n> > +               /*\n> > +                * Adjust the minimum frame duration to comply with Android\n> > +                * requirements. The camera service mandates all\n> > preview/record\n> > +                * streams to have a minimum frame duration < 33,366\n> > microseconds\n> > +                * (see MAX_PREVIEW_RECORD_DURATION_NS in the camera\n> > service\n> > +                * implementation).\n> > +                *\n> > +                * If we're close enough (+ 500 useconds) to that value,\n> > round\n> > +                * the minimum frame duration of the camera to an accepted\n> > +                * value.\n> > +                */\n> > +               static constexpr int64_t MAX_PREVIEW_RECORD_DURATION_NS =\n> > 1e9 / 29.97;\n> > +               if (minFrameDurationNsec > MAX_PREVIEW_RECORD_DURATION_NS\n> > &&\n> > +                   minFrameDurationNsec < MAX_PREVIEW_RECORD_DURATION_NS\n> > + 500000)\n> > +                       minFrameDurationNsec =\n> > MAX_PREVIEW_RECORD_DURATION_NS - 1000;\n> > +\n> > +               /*\n> > +                * The AE routine frame rate limits are computed using the\n> > frame\n> > +                * duration limits, as libcamera clips the AE routine to\n> > the\n> > +                * frame durations.\n> > +                */\n> > +               int32_t maxFps = std::round(1e9 / minFrameDurationNsec);\n> > +               int32_t minFps = std::round(1e9 / maxFrameDurationNsec);\n> > +               minFps = std::max(1, minFps);\n> > +\n> > +               /*\n> > +                * Recompute the frame durations from the rounded rates so\n> > +                * that the values stay consistent when reused later.\n> > +                */\n> > +               minFrameDurationNsec = 1e9 / maxFps;\n> > +               maxFrameDurationNsec = 1e9 / minFps;\n> > +\n> > +               /*\n> > +                * Register to the camera service {min, max} and {max, max}\n> > +                * intervals as 
requested by the metadata documentation.\n> > +                */\n> > +               int32_t availableAeFpsTarget[] = {\n> > +                       minFps, maxFps, maxFps, maxFps\n> > +               };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > +                                         availableAeFpsTarget);\n> > +       }\n> > +\n> > +       std::vector<int32_t> aeCompensationRange = {\n> > +               0, 0,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> > +                                 aeCompensationRange);\n> > +\n> > +       const camera_metadata_rational_t aeCompensationStep[] = {\n> > +               { 0, 1 }\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> > +                                 aeCompensationStep);\n> > +\n> > +       std::vector<uint8_t> availableAfModes = {\n> > +               ANDROID_CONTROL_AF_MODE_OFF,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> > +                                 availableAfModes);\n> > +\n> > +       std::vector<uint8_t> availableEffects = {\n> > +               ANDROID_CONTROL_EFFECT_MODE_OFF,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> > +                                 availableEffects);\n> > +\n> > +       std::vector<uint8_t> availableSceneModes = {\n> > +               ANDROID_CONTROL_SCENE_MODE_DISABLED,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> > +                                 availableSceneModes);\n> > +\n> > +       std::vector<uint8_t> availableStabilizationModes = {\n> > +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE_OFF,\n> > +       };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> > +                                 
availableStabilizationModes);\n> > +\n> > +       /*\n> > +        * \\todo Inspect the Camera capabilities to report the available\n> > +        * AWB modes. Default to AUTO as CTS tests require it.\n> > +        */\n> > +       std::vector<uint8_t> availableAwbModes = {\n> > +               ANDROID_CONTROL_AWB_MODE_AUTO,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> > +                                 availableAwbModes);\n> > +\n> > +       std::vector<int32_t> availableMaxRegions = {\n> > +               0, 0, 0,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n> > +                                 availableMaxRegions);\n> > +\n> > +       std::vector<uint8_t> sceneModesOverride = {\n> > +               ANDROID_CONTROL_AE_MODE_ON,\n> > +               ANDROID_CONTROL_AWB_MODE_AUTO,\n> > +               ANDROID_CONTROL_AF_MODE_OFF,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> > +                                 sceneModesOverride);\n> > +\n> > +       uint8_t aeLockAvailable = ANDROID_CONTROL_AE_LOCK_AVAILABLE_FALSE;\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> > +                                 aeLockAvailable);\n> > +\n> > +       uint8_t awbLockAvailable =\n> > ANDROID_CONTROL_AWB_LOCK_AVAILABLE_FALSE;\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> > +                                 awbLockAvailable);\n> > +\n> > +       uint8_t availableControlModes = ANDROID_CONTROL_MODE_AUTO;\n> > +       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_MODES,\n> > +                                 availableControlModes);\n> > +\n> > +       /* JPEG static metadata. 
*/\n> > +\n> > +       /*\n> > +        * Create the list of supported thumbnail sizes by inspecting the\n> > +        * available JPEG resolutions collected in streamConfigurations_\n> > and\n> > +        * generating one entry for each aspect ratio.\n> > +        *\n> > +        * The JPEG thumbnailer can freely scale, so pick an arbitrary\n> > +        * (160, 160) size as the bounding rectangle, which is then\n> > cropped to\n> > +        * the different supported aspect ratios.\n> > +        */\n> > +       constexpr Size maxJpegThumbnail(160, 160);\n> > +       std::vector<Size> thumbnailSizes;\n> > +       thumbnailSizes.push_back({ 0, 0 });\n> > +       for (const auto &entry : streamConfigurations_) {\n> > +               if (entry.androidFormat != HAL_PIXEL_FORMAT_BLOB)\n> > +                       continue;\n> > +\n> > +               Size thumbnailSize = maxJpegThumbnail\n> > +                                    .boundedToAspectRatio({\n> > entry.resolution.width,\n> > +\n> > entry.resolution.height });\n> > +               thumbnailSizes.push_back(thumbnailSize);\n> > +       }\n> > +\n> > +       std::sort(thumbnailSizes.begin(), thumbnailSizes.end());\n> > +       auto last = std::unique(thumbnailSizes.begin(),\n> > thumbnailSizes.end());\n> > +       thumbnailSizes.erase(last, thumbnailSizes.end());\n> > +\n> > +       /* Transform sizes into a list of integers that can be consumed.\n> > */\n> > +       std::vector<int32_t> thumbnailEntries;\n> > +       thumbnailEntries.reserve(thumbnailSizes.size() * 2);\n> > +       for (const auto &size : thumbnailSizes) {\n> > +               thumbnailEntries.push_back(size.width);\n> > +               thumbnailEntries.push_back(size.height);\n> > +       }\n> > +       staticMetadata_->addEntry(ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> > +                                 thumbnailEntries);\n> > +\n> > +       staticMetadata_->addEntry(ANDROID_JPEG_MAX_SIZE,\n> > maxJpegBufferSize_);\n> > +\n> > +       /* 
Sensor static metadata. */\n> > +       std::array<int32_t, 2> pixelArraySize;\n> > +       {\n> > +               const Size &size =\n> > properties.get(properties::PixelArraySize);\n> > +               pixelArraySize[0] = size.width;\n> > +               pixelArraySize[1] = size.height;\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> > +                                         pixelArraySize);\n> > +       }\n> > +\n> > +       if (properties.contains(properties::UnitCellSize)) {\n> > +               const Size &cellSize =\n> > properties.get<Size>(properties::UnitCellSize);\n> > +               std::array<float, 2> physicalSize{\n> > +                       cellSize.width * pixelArraySize[0] / 1e6f,\n> > +                       cellSize.height * pixelArraySize[1] / 1e6f\n> > +               };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> > +                                         physicalSize);\n> > +       }\n> > +\n> > +       {\n> > +               const Span<const Rectangle> &rects =\n> > +                       properties.get(properties::PixelArrayActiveAreas);\n> > +               std::vector<int32_t> data{\n> > +                       static_cast<int32_t>(rects[0].x),\n> > +                       static_cast<int32_t>(rects[0].y),\n> > +                       static_cast<int32_t>(rects[0].width),\n> > +                       static_cast<int32_t>(rects[0].height),\n> > +               };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> > +                                         data);\n> > +       }\n> > +\n> > +       int32_t sensitivityRange[] = {\n> > +               32, 2400,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> > +                                 sensitivityRange);\n> > +\n> > +       /* Report the color filter arrangement if the camera reports it. 
*/\n> > +       if\n> > (properties.contains(properties::draft::ColorFilterArrangement)) {\n> > +               uint8_t filterArr =\n> > properties.get(properties::draft::ColorFilterArrangement);\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> > +                                         filterArr);\n> > +       }\n> > +\n> > +       const auto &exposureInfo =\n> > controlsInfo.find(&controls::ExposureTime);\n> > +       if (exposureInfo != controlsInfo.end()) {\n> > +               int64_t exposureTimeRange[2] = {\n> > +                       exposureInfo->second.min().get<int32_t>() * 1000LL,\n> > +                       exposureInfo->second.max().get<int32_t>() * 1000LL,\n> > +               };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> > +                                         exposureTimeRange, 2);\n> > +       }\n> > +\n> > +       staticMetadata_->addEntry(ANDROID_SENSOR_ORIENTATION,\n> > orientation_);\n> > +\n> > +       std::vector<int32_t> testPatternModes = {\n> > +               ANDROID_SENSOR_TEST_PATTERN_MODE_OFF\n> > +       };\n> > +       const auto &testPatternsInfo =\n> > +               controlsInfo.find(&controls::draft::TestPatternMode);\n> > +       if (testPatternsInfo != controlsInfo.end()) {\n> > +               const auto &values = testPatternsInfo->second.values();\n> > +               ASSERT(!values.empty());\n> > +               for (const auto &value : values) {\n> > +                       switch (value.get<int32_t>()) {\n> > +                       case controls::draft::TestPatternModeOff:\n> > +                               /*\n> > +                                * ANDROID_SENSOR_TEST_PATTERN_MODE_OFF is\n> > +                                * already in testPatternModes.\n> > +                                */\n> > +                               break;\n> > +\n> > +                       case controls::draft::TestPatternModeSolidColor:\n> > +     
                          testPatternModes.push_back(\n> > +\n> >  ANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR);\n> > +                               break;\n> > +\n> > +                       case controls::draft::TestPatternModeColorBars:\n> > +                               testPatternModes.push_back(\n> > +\n> >  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS);\n> > +                               break;\n> > +\n> > +                       case\n> > controls::draft::TestPatternModeColorBarsFadeToGray:\n> > +                               testPatternModes.push_back(\n> > +\n> >  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS_FADE_TO_GRAY);\n> > +                               break;\n> > +\n> > +                       case controls::draft::TestPatternModePn9:\n> > +                               testPatternModes.push_back(\n> > +\n> >  ANDROID_SENSOR_TEST_PATTERN_MODE_PN9);\n> > +                               break;\n> > +\n> > +                       case controls::draft::TestPatternModeCustom1:\n> > +                               /* We don't support this yet. 
*/\n> > +                               break;\n> > +\n> > +                       default:\n> > +                               LOG(HAL, Error) << \"Unknown test pattern\n> > mode: \"\n> > +                                               << value.get<int32_t>();\n> > +                               continue;\n> > +                       }\n> > +               }\n> > +       }\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> > +                                 testPatternModes);\n> > +\n> > +       uint8_t timestampSource =\n> > ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN;\n> > +       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> > +                                 timestampSource);\n> > +\n> > +       if (maxFrameDurationNsec > 0)\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> > +                                         maxFrameDurationNsec);\n> > +\n> > +       /* Statistics static metadata. 
*/\n> > +       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> > +\n> >  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> > +                                 faceDetectMode);\n> > +\n> > +       int32_t maxFaceCount = 0;\n> > +       staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> > +                                 maxFaceCount);\n> > +\n> > +       {\n> > +               std::vector<uint8_t> data;\n> > +               data.reserve(2);\n> > +               const auto &infoMap =\n> > controlsInfo.find(&controls::draft::LensShadingMapMode);\n> > +               if (infoMap != controlsInfo.end()) {\n> > +                       for (const auto &value : infoMap->second.values())\n> > +                               data.push_back(value.get<int32_t>());\n> > +               } else {\n> > +\n> >  data.push_back(ANDROID_STATISTICS_LENS_SHADING_MAP_MODE_OFF);\n> > +               }\n> > +\n> >  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_LENS_SHADING_MAP_MODES,\n> > +                                         data);\n> > +       }\n> > +\n> > +       /* Sync static metadata. */\n> > +       int32_t maxLatency = ANDROID_SYNC_MAX_LATENCY_UNKNOWN;\n> > +       staticMetadata_->addEntry(ANDROID_SYNC_MAX_LATENCY, maxLatency);\n> > +\n> > +       /* Flash static metadata. */\n> > +       char flashAvailable = ANDROID_FLASH_INFO_AVAILABLE_FALSE;\n> > +       staticMetadata_->addEntry(ANDROID_FLASH_INFO_AVAILABLE,\n> > +                                 flashAvailable);\n> > +\n> > +       /* Lens static metadata. 
*/\n> > +       std::vector<float> lensApertures = {\n> > +               2.53 / 100,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> > +                                 lensApertures);\n> > +\n> > +       uint8_t lensFacing;\n> > +       switch (facing_) {\n> > +       default:\n> > +       case CAMERA_FACING_FRONT:\n> > +               lensFacing = ANDROID_LENS_FACING_FRONT;\n> > +               break;\n> > +       case CAMERA_FACING_BACK:\n> > +               lensFacing = ANDROID_LENS_FACING_BACK;\n> > +               break;\n> > +       case CAMERA_FACING_EXTERNAL:\n> > +               lensFacing = ANDROID_LENS_FACING_EXTERNAL;\n> > +               break;\n> > +       }\n> > +       staticMetadata_->addEntry(ANDROID_LENS_FACING, lensFacing);\n> > +\n> > +       std::vector<float> lensFocalLengths = {\n> > +               1,\n> > +       };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> > +                                 lensFocalLengths);\n> > +\n> > +       std::vector<uint8_t> opticalStabilizations = {\n> > +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF,\n> > +       };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> > +                                 opticalStabilizations);\n> > +\n> > +       float hyperFocalDistance = 0;\n> > +       staticMetadata_->addEntry(ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> > +                                 hyperFocalDistance);\n> > +\n> > +       float minFocusDistance = 0;\n> > +       staticMetadata_->addEntry(ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> > +                                 minFocusDistance);\n> > +\n> > +       /* Noise reduction modes. 
*/\n> > +       {\n> > +               std::vector<uint8_t> data;\n> > +               data.reserve(5);\n> > +               const auto &infoMap =\n> > controlsInfo.find(&controls::draft::NoiseReductionMode);\n> > +               if (infoMap != controlsInfo.end()) {\n> > +                       for (const auto &value : infoMap->second.values())\n> > +                               data.push_back(value.get<int32_t>());\n> > +               } else {\n> > +                       data.push_back(ANDROID_NOISE_REDUCTION_MODE_OFF);\n> > +               }\n> > +\n> >  staticMetadata_->addEntry(ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> > +                                         data);\n> > +       }\n> > +\n> > +       /* Scaler static metadata. */\n> > +\n> > +       /*\n> > +        * \\todo The digital zoom factor is a property that depends on the\n> > +        * desired output configuration and the sensor frame size input to\n> > the\n> > +        * ISP. This information is not available to the Android HAL, not\n> > at\n> > +        * initialization time at least.\n> > +        *\n> > +        * As a workaround rely on pipeline handlers initializing the\n> > +        * ScalerCrop control with the camera default configuration and\n> > use the\n> > +        * maximum and minimum crop rectangles to calculate the digital\n> > zoom\n> > +        * factor.\n> > +        */\n> > +       float maxZoom = 1.0f;\n> > +       const auto scalerCrop = controlsInfo.find(&controls::ScalerCrop);\n> > +       if (scalerCrop != controlsInfo.end()) {\n> > +               Rectangle min = scalerCrop->second.min().get<Rectangle>();\n> > +               Rectangle max = scalerCrop->second.max().get<Rectangle>();\n> > +               maxZoom = std::min(1.0f * max.width / min.width,\n> > +                                  1.0f * max.height / min.height);\n> > +       }\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> > +                       
          maxZoom);\n> > +\n> > +       std::vector<uint32_t> availableStreamConfigurations;\n> > +       availableStreamConfigurations.reserve(streamConfigurations_.size()\n> > * 4);\n> > +       for (const auto &entry : streamConfigurations_) {\n> > +\n> >  availableStreamConfigurations.push_back(entry.androidFormat);\n> > +\n> >  availableStreamConfigurations.push_back(entry.resolution.width);\n> > +\n> >  availableStreamConfigurations.push_back(entry.resolution.height);\n> > +               availableStreamConfigurations.push_back(\n> > +\n> >  ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT);\n> > +       }\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> > +                                 availableStreamConfigurations);\n> > +\n> > +       std::vector<int64_t> availableStallDurations = {\n> > +               ANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920,\n> > 33333333,\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> > +                                 availableStallDurations);\n> > +\n> > +       /* Use the minimum frame duration for all the YUV/RGB formats. 
*/\n> > +       if (minFrameDurationNsec > 0) {\n> > +               std::vector<int64_t> minFrameDurations;\n> > +               minFrameDurations.reserve(streamConfigurations_.size() *\n> > 4);\n> > +               for (const auto &entry : streamConfigurations_) {\n> > +                       minFrameDurations.push_back(entry.androidFormat);\n> > +\n> >  minFrameDurations.push_back(entry.resolution.width);\n> > +\n> >  minFrameDurations.push_back(entry.resolution.height);\n> > +                       minFrameDurations.push_back(minFrameDurationNsec);\n> > +               }\n> > +\n> >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> > +                                         minFrameDurations);\n> > +       }\n> > +\n> > +       uint8_t croppingType = ANDROID_SCALER_CROPPING_TYPE_CENTER_ONLY;\n> > +       staticMetadata_->addEntry(ANDROID_SCALER_CROPPING_TYPE,\n> > croppingType);\n> > +\n> > +       /* Info static metadata. */\n> > +       uint8_t supportedHWLevel =\n> > ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;\n> > +       staticMetadata_->addEntry(ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> > +                                 supportedHWLevel);\n> > +\n> > +       /* Request static metadata. */\n> > +       int32_t partialResultCount = 1;\n> > +       staticMetadata_->addEntry(ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> > +                                 partialResultCount);\n> > +\n> > +       {\n> > +               /* Default the value to 2 if not reported by the camera. 
*/\n> > +               uint8_t maxPipelineDepth = 2;\n> > +               const auto &infoMap =\n> > controlsInfo.find(&controls::draft::PipelineDepth);\n> > +               if (infoMap != controlsInfo.end())\n> > +                       maxPipelineDepth =\n> > infoMap->second.max().get<int32_t>();\n> > +\n> >  staticMetadata_->addEntry(ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> > +                                         maxPipelineDepth);\n> > +       }\n> > +\n> > +       /* LIMITED does not support reprocessing. */\n> > +       uint32_t maxNumInputStreams = 0;\n> > +       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> > +                                 maxNumInputStreams);\n> > +\n> > +       std::vector<uint8_t> availableCapabilities = {\n> > +               ANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE,\n> > +       };\n> > +\n> > +       /* Report if camera supports RAW. */\n> > +       bool rawStreamAvailable = false;\n> > +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > +               camera_->generateConfiguration({ StreamRole::Raw });\n> > +       if (cameraConfig && !cameraConfig->empty()) {\n> > +               const PixelFormatInfo &info =\n> > +\n> >  PixelFormatInfo::info(cameraConfig->at(0).pixelFormat);\n> > +               /* Only advertise RAW support if RAW16 is possible. 
*/\n> > +               if (info.colourEncoding ==\n> > PixelFormatInfo::ColourEncodingRAW &&\n> > +                   info.bitsPerPixel == 16) {\n> > +                       rawStreamAvailable = true;\n> > +\n> >  availableCapabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);\n> > +               }\n> > +       }\n> > +\n> > +       /* Number of { RAW, YUV, JPEG } supported output streams */\n> > +       int32_t numOutStreams[] = { rawStreamAvailable, 2, 1 };\n> > +       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> > +                                 numOutStreams);\n> > +\n> > +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> > +                                 availableCapabilities);\n> > +\n> > +       std::vector<int32_t> availableCharacteristicsKeys = {\n> > +               ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> > +               ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> > +               ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> > +               ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > +               ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> > +               ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> > +               ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> > +               ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> > +               ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> > +               ANDROID_CONTROL_AVAILABLE_MODES,\n> > +               ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> > +               ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> > +               ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> > +               ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> > +               ANDROID_CONTROL_MAX_REGIONS,\n> > +               ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> > +               ANDROID_FLASH_INFO_AVAILABLE,\n> > +               ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> > +               ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> > +               
ANDROID_JPEG_MAX_SIZE,\n> > +               ANDROID_LENS_FACING,\n> > +               ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> > +               ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> > +               ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> > +               ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> > +               ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> > +               ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> > +               ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> > +               ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> > +               ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> > +               ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> > +               ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> > +               ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> > +               ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> > +               ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> > +               ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> > +               ANDROID_SCALER_CROPPING_TYPE,\n> > +               ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> > +               ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> > +               ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> > +               ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> > +               ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> > +               ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> > +               ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> > +               ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> > +               ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> > +               ANDROID_SENSOR_ORIENTATION,\n> > +               ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> > +               ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> > +               ANDROID_SYNC_MAX_LATENCY,\n> > +       };\n> > +\n> >  staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CHARACTERISTICS_KEYS,\n> > +                                 
availableCharacteristicsKeys);\n> > +\n> > +       std::vector<int32_t> availableRequestKeys = {\n> > +               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > +               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > +               ANDROID_CONTROL_AE_LOCK,\n> > +               ANDROID_CONTROL_AE_MODE,\n> > +               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > +               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > +               ANDROID_CONTROL_AF_MODE,\n> > +               ANDROID_CONTROL_AF_TRIGGER,\n> > +               ANDROID_CONTROL_AWB_LOCK,\n> > +               ANDROID_CONTROL_AWB_MODE,\n> > +               ANDROID_CONTROL_CAPTURE_INTENT,\n> > +               ANDROID_CONTROL_EFFECT_MODE,\n> > +               ANDROID_CONTROL_MODE,\n> > +               ANDROID_CONTROL_SCENE_MODE,\n> > +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> > +               ANDROID_FLASH_MODE,\n> > +               ANDROID_JPEG_ORIENTATION,\n> > +               ANDROID_JPEG_QUALITY,\n> > +               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> > +               ANDROID_JPEG_THUMBNAIL_SIZE,\n> > +               ANDROID_LENS_APERTURE,\n> > +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > +               ANDROID_NOISE_REDUCTION_MODE,\n> > +               ANDROID_SCALER_CROP_REGION,\n> > +               ANDROID_STATISTICS_FACE_DETECT_MODE\n> > +       };\n> > +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_REQUEST_KEYS,\n> > +                                 availableRequestKeys);\n> > +\n> > +       std::vector<int32_t> availableResultKeys = {\n> > +               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > +               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > +               ANDROID_CONTROL_AE_LOCK,\n> > +               ANDROID_CONTROL_AE_MODE,\n> > +               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > + 
              ANDROID_CONTROL_AE_STATE,\n> > +               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > +               ANDROID_CONTROL_AF_MODE,\n> > +               ANDROID_CONTROL_AF_STATE,\n> > +               ANDROID_CONTROL_AF_TRIGGER,\n> > +               ANDROID_CONTROL_AWB_LOCK,\n> > +               ANDROID_CONTROL_AWB_MODE,\n> > +               ANDROID_CONTROL_AWB_STATE,\n> > +               ANDROID_CONTROL_CAPTURE_INTENT,\n> > +               ANDROID_CONTROL_EFFECT_MODE,\n> > +               ANDROID_CONTROL_MODE,\n> > +               ANDROID_CONTROL_SCENE_MODE,\n> > +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> > +               ANDROID_FLASH_MODE,\n> > +               ANDROID_FLASH_STATE,\n> > +               ANDROID_JPEG_GPS_COORDINATES,\n> > +               ANDROID_JPEG_GPS_PROCESSING_METHOD,\n> > +               ANDROID_JPEG_GPS_TIMESTAMP,\n> > +               ANDROID_JPEG_ORIENTATION,\n> > +               ANDROID_JPEG_QUALITY,\n> > +               ANDROID_JPEG_SIZE,\n> > +               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> > +               ANDROID_JPEG_THUMBNAIL_SIZE,\n> > +               ANDROID_LENS_APERTURE,\n> > +               ANDROID_LENS_FOCAL_LENGTH,\n> > +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > +               ANDROID_LENS_STATE,\n> > +               ANDROID_NOISE_REDUCTION_MODE,\n> > +               ANDROID_REQUEST_PIPELINE_DEPTH,\n> > +               ANDROID_SCALER_CROP_REGION,\n> > +               ANDROID_SENSOR_EXPOSURE_TIME,\n> > +               ANDROID_SENSOR_FRAME_DURATION,\n> > +               ANDROID_SENSOR_ROLLING_SHUTTER_SKEW,\n> > +               ANDROID_SENSOR_TEST_PATTERN_MODE,\n> > +               ANDROID_SENSOR_TIMESTAMP,\n> > +               ANDROID_STATISTICS_FACE_DETECT_MODE,\n> > +               ANDROID_STATISTICS_LENS_SHADING_MAP_MODE,\n> > +               ANDROID_STATISTICS_HOT_PIXEL_MAP_MODE,\n> > +               ANDROID_STATISTICS_SCENE_FLICKER,\n> > +       };\n> > +       
staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_RESULT_KEYS,\n> > +                                 availableResultKeys);\n> > +\n> > +       if (!staticMetadata_->isValid()) {\n> > +               LOG(HAL, Error) << \"Failed to construct static metadata\";\n> > +               staticMetadata_.reset();\n> > +               return -EINVAL;\n> > +       }\n> > +\n> > +       if (staticMetadata_->resized()) {\n> > +               auto [entryCount, dataCount] = staticMetadata_->usage();\n> > +               LOG(HAL, Info)\n> > +                       << \"Static metadata resized: \" << entryCount\n> > +                       << \" entries and \" << dataCount << \" bytes used\";\n> > +       }\n> > +\n> > +       return 0;\n> > +}\n> > +\n> > +/* Translate Android format code to libcamera pixel format. */\n> > +PixelFormat CameraCapabilities::toPixelFormat(int format) const\n> > +{\n> > +       auto it = formatsMap_.find(format);\n> > +       if (it == formatsMap_.end()) {\n> > +               LOG(HAL, Error) << \"Requested format \" <<\n> > utils::hex(format)\n> > +                               << \" not supported\";\n> > +               return PixelFormat();\n> > +       }\n> > +\n> > +       return it->second;\n> > +}\n> > +\n> > +std::unique_ptr<CameraMetadata>\n> > CameraCapabilities::requestTemplatePreview() const\n> > +{\n> > +       /*\n> > +        * \\todo Keep this in sync with the actual number of entries.\n> > +        * Currently: 20 entries, 35 bytes\n> > +        */\n> > +       auto requestTemplate = std::make_unique<CameraMetadata>(21, 36);\n> > +       if (!requestTemplate->isValid()) {\n> > +               return nullptr;\n> > +       }\n> > +\n> > +       /* Get the FPS range registered in the static metadata. 
*/\n> > +       camera_metadata_ro_entry_t entry;\n> > +       bool found =\n> > staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > +                                              &entry);\n> > +       if (!found) {\n> > +               LOG(HAL, Error) << \"Cannot create capture template without\n> > FPS range\";\n> > +               return nullptr;\n> > +       }\n> > +\n> > +       /*\n> > +        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> > +        * has been assembled as {{min, max} {max, max}}.\n> > +        */\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > +                                 entry.data.i32, 2);\n> > +\n> > +       uint8_t aeMode = ANDROID_CONTROL_AE_MODE_ON;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_MODE, aeMode);\n> > +\n> > +       int32_t aeExposureCompensation = 0;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > +                                 aeExposureCompensation);\n> > +\n> > +       uint8_t aePrecaptureTrigger =\n> > ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_IDLE;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > +                                 aePrecaptureTrigger);\n> > +\n> > +       uint8_t aeLock = ANDROID_CONTROL_AE_LOCK_OFF;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_LOCK, aeLock);\n> > +\n> > +       uint8_t aeAntibandingMode =\n> > ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > +                                 aeAntibandingMode);\n> > +\n> > +       uint8_t afMode = ANDROID_CONTROL_AF_MODE_OFF;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AF_MODE, afMode);\n> > +\n> > +       uint8_t afTrigger = ANDROID_CONTROL_AF_TRIGGER_IDLE;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AF_TRIGGER, afTrigger);\n> > +\n> > +       uint8_t awbMode = 
ANDROID_CONTROL_AWB_MODE_AUTO;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AWB_MODE, awbMode);\n> > +\n> > +       uint8_t awbLock = ANDROID_CONTROL_AWB_LOCK_OFF;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_AWB_LOCK, awbLock);\n> > +\n> > +       uint8_t flashMode = ANDROID_FLASH_MODE_OFF;\n> > +       requestTemplate->addEntry(ANDROID_FLASH_MODE, flashMode);\n> > +\n> > +       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> > +       requestTemplate->addEntry(ANDROID_STATISTICS_FACE_DETECT_MODE,\n> > +                                 faceDetectMode);\n> > +\n> > +       uint8_t noiseReduction = ANDROID_NOISE_REDUCTION_MODE_OFF;\n> > +       requestTemplate->addEntry(ANDROID_NOISE_REDUCTION_MODE,\n> > +                                 noiseReduction);\n> > +\n> > +       uint8_t aberrationMode =\n> > ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF;\n> > +       requestTemplate->addEntry(ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > +                                 aberrationMode);\n> > +\n> > +       uint8_t controlMode = ANDROID_CONTROL_MODE_AUTO;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_MODE, controlMode);\n> > +\n> > +       float lensAperture = 2.53 / 100;\n> > +       requestTemplate->addEntry(ANDROID_LENS_APERTURE, lensAperture);\n> > +\n> > +       uint8_t opticalStabilization =\n> > ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF;\n> > +       requestTemplate->addEntry(ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > +                                 opticalStabilization);\n> > +\n> > +       uint8_t captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> > +       requestTemplate->addEntry(ANDROID_CONTROL_CAPTURE_INTENT,\n> > +                                 captureIntent);\n> > +\n> > +       return requestTemplate;\n> > +}\n> > +\n> > +std::unique_ptr<CameraMetadata>\n> > CameraCapabilities::requestTemplateVideo() const\n> > +{\n> > +       std::unique_ptr<CameraMetadata> previewTemplate =\n> > 
requestTemplatePreview();\n> > +       if (!previewTemplate)\n> > +               return nullptr;\n> > +\n> > +       /*\n> > +        * The video template requires a fixed FPS range. Everything else\n> > +        * stays the same as the preview template.\n> > +        */\n> > +       camera_metadata_ro_entry_t entry;\n> > +\n> >  staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > +                                 &entry);\n> > +\n> > +       /*\n> > +        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> > +        * has been assembled as {{min, max} {max, max}}.\n> > +        */\n> > +       previewTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > +                                    entry.data.i32 + 2, 2);\n> > +\n> > +       return previewTemplate;\n> > +}\n> > diff --git a/src/android/camera_capabilities.h\n> > b/src/android/camera_capabilities.h\n> > new file mode 100644\n> > index 000000000000..f511607bbd90\n> > --- /dev/null\n> > +++ b/src/android/camera_capabilities.h\n> > @@ -0,0 +1,65 @@\n> > +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> > +/*\n> > + * Copyright (C) 2021, Google Inc.\n> > + *\n> > + * camera_capabilities.h - Camera static properties manager\n> > + */\n> > +#ifndef __ANDROID_CAMERA_CAPABILITIES_H__\n> > +#define __ANDROID_CAMERA_CAPABILITIES_H__\n> > +\n> > +#include <map>\n> > +#include <memory>\n> > +#include <vector>\n> > +\n> > +#include <libcamera/camera.h>\n> > +#include <libcamera/class.h>\n> > +#include <libcamera/formats.h>\n> > +#include <libcamera/geometry.h>\n> > +\n> > +#include \"camera_metadata.h\"\n> > +\n> > +class CameraCapabilities\n> > +{\n> > +public:\n> > +       CameraCapabilities() = default;\n> > +\n> > +       int initialize(std::shared_ptr<libcamera::Camera> camera,\n> > +                      int orientation, int facing);\n> > +\n> > +       CameraMetadata *staticMetadata() const { return\n> > staticMetadata_.get(); }\n> > +       
libcamera::PixelFormat toPixelFormat(int format) const;\n> > +       unsigned int maxJpegBufferSize() const { return\n> > maxJpegBufferSize_; }\n> > +\n> > +       std::unique_ptr<CameraMetadata> requestTemplatePreview() const;\n> > +       std::unique_ptr<CameraMetadata> requestTemplateVideo() const;\n> > +\n> > +private:\n> > +       LIBCAMERA_DISABLE_COPY_AND_MOVE(CameraCapabilities)\n> > +\n> > +       struct Camera3StreamConfiguration {\n> > +               libcamera::Size resolution;\n> > +               int androidFormat;\n> > +       };\n> > +\n> > +       std::vector<libcamera::Size>\n> > +       getYUVResolutions(libcamera::CameraConfiguration *cameraConfig,\n> > +                         const libcamera::PixelFormat &pixelFormat,\n> > +                         const std::vector<libcamera::Size> &resolutions);\n> > +       std::vector<libcamera::Size>\n> > +       getRawResolutions(const libcamera::PixelFormat &pixelFormat);\n> > +       int initializeStreamConfigurations();\n> > +\n> > +       int initializeStaticMetadata();\n> > +\n> > +       std::shared_ptr<libcamera::Camera> camera_;\n> > +\n> > +       int facing_;\n> > +       int orientation_;\n> > +\n> > +       std::vector<Camera3StreamConfiguration> streamConfigurations_;\n> > +       std::map<int, libcamera::PixelFormat> formatsMap_;\n> > +       std::unique_ptr<CameraMetadata> staticMetadata_;\n> > +       unsigned int maxJpegBufferSize_;\n> > +};\n> > +\n> > +#endif /* __ANDROID_CAMERA_CAPABILITIES_H__ */\n> > diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp\n> > index 8c71fd0675d3..4bd125d7020a 100644\n> > --- a/src/android/camera_device.cpp\n> > +++ b/src/android/camera_device.cpp\n> > @@ -10,11 +10,8 @@\n> >  #include \"camera_ops.h\"\n> >  #include \"post_processor.h\"\n> >\n> > -#include <array>\n> > -#include <cmath>\n> >  #include <fstream>\n> >  #include <sys/mman.h>\n> > -#include <tuple>\n> >  #include <unistd.h>\n> >  #include <vector>\n> >\n> > @@ -23,7 
+20,6 @@\n> >  #include <libcamera/formats.h>\n> >  #include <libcamera/property_ids.h>\n> >\n> > -#include \"libcamera/internal/formats.h\"\n> >  #include \"libcamera/internal/log.h\"\n> >  #include \"libcamera/internal/thread.h\"\n> >  #include \"libcamera/internal/utils.h\"\n> > @@ -36,94 +32,6 @@ LOG_DECLARE_CATEGORY(HAL)\n> >\n> >  namespace {\n> >\n> > -/*\n> > - * \\var camera3Resolutions\n> > - * \\brief The list of image resolutions defined as mandatory to be\n> > supported by\n> > - * the Android Camera3 specification\n> > - */\n> > -const std::vector<Size> camera3Resolutions = {\n> > -       { 320, 240 },\n> > -       { 640, 480 },\n> > -       { 1280, 720 },\n> > -       { 1920, 1080 }\n> > -};\n> > -\n> > -/*\n> > - * \\struct Camera3Format\n> > - * \\brief Data associated with an Android format identifier\n> > - * \\var libcameraFormats List of libcamera pixel formats compatible with\n> > the\n> > - * Android format\n> > - * \\var name The human-readable representation of the Android format code\n> > - */\n> > -struct Camera3Format {\n> > -       std::vector<PixelFormat> libcameraFormats;\n> > -       bool mandatory;\n> > -       const char *name;\n> > -};\n> > -\n> > -/*\n> > - * \\var camera3FormatsMap\n> > - * \\brief Associate Android format code with ancillary data\n> > - */\n> > -const std::map<int, const Camera3Format> camera3FormatsMap = {\n> > -       {\n> > -               HAL_PIXEL_FORMAT_BLOB, {\n> > -                       { formats::MJPEG },\n> > -                       true,\n> > -                       \"BLOB\"\n> > -               }\n> > -       }, {\n> > -               HAL_PIXEL_FORMAT_YCbCr_420_888, {\n> > -                       { formats::NV12, formats::NV21 },\n> > -                       true,\n> > -                       \"YCbCr_420_888\"\n> > -               }\n> > -       }, {\n> > -               /*\n> > -                * \\todo Translate IMPLEMENTATION_DEFINED inspecting the\n> > gralloc\n> > -                * usage 
flag. For now, copy the YCbCr_420 configuration.\n> > -                */\n> > -               HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> > -                       { formats::NV12, formats::NV21 },\n> > -                       true,\n> > -                       \"IMPLEMENTATION_DEFINED\"\n> > -               }\n> > -       }, {\n> > -               HAL_PIXEL_FORMAT_RAW10, {\n> > -                       {\n> > -                               formats::SBGGR10_CSI2P,\n> > -                               formats::SGBRG10_CSI2P,\n> > -                               formats::SGRBG10_CSI2P,\n> > -                               formats::SRGGB10_CSI2P\n> > -                       },\n> > -                       false,\n> > -                       \"RAW10\"\n> > -               }\n> > -       }, {\n> > -               HAL_PIXEL_FORMAT_RAW12, {\n> > -                       {\n> > -                               formats::SBGGR12_CSI2P,\n> > -                               formats::SGBRG12_CSI2P,\n> > -                               formats::SGRBG12_CSI2P,\n> > -                               formats::SRGGB12_CSI2P\n> > -                       },\n> > -                       false,\n> > -                       \"RAW12\"\n> > -               }\n> > -       }, {\n> > -               HAL_PIXEL_FORMAT_RAW16, {\n> > -                       {\n> > -                               formats::SBGGR16,\n> > -                               formats::SGBRG16,\n> > -                               formats::SGRBG16,\n> > -                               formats::SRGGB16\n> > -                       },\n> > -                       false,\n> > -                       \"RAW16\"\n> > -               }\n> > -       },\n> > -};\n> > -\n> >  /*\n> >   * \\struct Camera3StreamConfig\n> >   * \\brief Data to store StreamConfiguration associated with\n> > camera3_stream(s)\n> > @@ -512,242 +420,7 @@ int CameraDevice::initialize(const CameraConfigData\n> > *cameraConfigData)\n> >                 
orientation_ = 0;\n> >         }\n> >\n> > -       /* Acquire the camera and initialize available stream\n> > configurations. */\n> > -       int ret = camera_->acquire();\n> > -       if (ret) {\n> > -               LOG(HAL, Error) << \"Failed to temporarily acquire the\n> > camera\";\n> > -               return ret;\n> > -       }\n> > -\n> > -       ret = initializeStreamConfigurations();\n> > -       camera_->release();\n> > -       return ret;\n> > -}\n> > -\n> > -std::vector<Size> CameraDevice::getYUVResolutions(CameraConfiguration\n> > *cameraConfig,\n> > -                                                 const PixelFormat\n> > &pixelFormat,\n> > -                                                 const std::vector<Size>\n> > &resolutions)\n> > -{\n> > -       std::vector<Size> supportedResolutions;\n> > -\n> > -       StreamConfiguration &cfg = cameraConfig->at(0);\n> > -       for (const Size &res : resolutions) {\n> > -               cfg.pixelFormat = pixelFormat;\n> > -               cfg.size = res;\n> > -\n> > -               CameraConfiguration::Status status =\n> > cameraConfig->validate();\n> > -               if (status != CameraConfiguration::Valid) {\n> > -                       LOG(HAL, Debug) << cfg.toString() << \" not\n> > supported\";\n> > -                       continue;\n> > -               }\n> > -\n> > -               LOG(HAL, Debug) << cfg.toString() << \" supported\";\n> > -\n> > -               supportedResolutions.push_back(res);\n> > -       }\n> > -\n> > -       return supportedResolutions;\n> > -}\n> > -\n> > -std::vector<Size> CameraDevice::getRawResolutions(const\n> > libcamera::PixelFormat &pixelFormat)\n> > -{\n> > -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > -               camera_->generateConfiguration({ StreamRole::Raw });\n> > -       StreamConfiguration &cfg = cameraConfig->at(0);\n> > -       const StreamFormats &formats = cfg.formats();\n> > -       std::vector<Size> supportedResolutions =\n> > 
formats.sizes(pixelFormat);\n> > -\n> > -       return supportedResolutions;\n> > -}\n> > -\n> > -/*\n> > - * Initialize the format conversion map to translate from Android format\n> > - * identifier to libcamera pixel formats and fill in the list of supported\n> > - * stream configurations to be reported to the Android camera framework\n> > through\n> > - * the static stream configuration metadata.\n> > - */\n> > -int CameraDevice::initializeStreamConfigurations()\n> > -{\n> > -       /*\n> > -        * Get the maximum output resolutions\n> > -        * \\todo Get this from the camera properties once defined\n> > -        */\n> > -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > -               camera_->generateConfiguration({ StillCapture });\n> > -       if (!cameraConfig) {\n> > -               LOG(HAL, Error) << \"Failed to get maximum resolution\";\n> > -               return -EINVAL;\n> > -       }\n> > -       StreamConfiguration &cfg = cameraConfig->at(0);\n> > -\n> > -       /*\n> > -        * \\todo JPEG - Adjust the maximum available resolution by taking\n> > the\n> > -        * JPEG encoder requirements into account (alignment and aspect\n> > ratio).\n> > -        */\n> > -       const Size maxRes = cfg.size;\n> > -       LOG(HAL, Debug) << \"Maximum supported resolution: \" <<\n> > maxRes.toString();\n> > -\n> > -       /*\n> > -        * Build the list of supported image resolutions.\n> > -        *\n> > -        * The resolutions listed in camera3Resolution are mandatory to be\n> > -        * supported, up to the camera maximum resolution.\n> > -        *\n> > -        * Augment the list by adding resolutions calculated from the\n> > camera\n> > -        * maximum one.\n> > -        */\n> > -       std::vector<Size> cameraResolutions;\n> > -       std::copy_if(camera3Resolutions.begin(), camera3Resolutions.end(),\n> > -                    std::back_inserter(cameraResolutions),\n> > -                    [&](const Size &res) { return 
res < maxRes; });\n> > -\n> > -       /*\n> > -        * The Camera3 specification suggests adding 1/2 and 1/4 of the\n> > maximum\n> > -        * resolution.\n> > -        */\n> > -       for (unsigned int divider = 2;; divider <<= 1) {\n> > -               Size derivedSize{\n> > -                       maxRes.width / divider,\n> > -                       maxRes.height / divider,\n> > -               };\n> > -\n> > -               if (derivedSize.width < 320 ||\n> > -                   derivedSize.height < 240)\n> > -                       break;\n> > -\n> > -               cameraResolutions.push_back(derivedSize);\n> > -       }\n> > -       cameraResolutions.push_back(maxRes);\n> > -\n> > -       /* Remove duplicated entries from the list of supported\n> > resolutions. */\n> > -       std::sort(cameraResolutions.begin(), cameraResolutions.end());\n> > -       auto last = std::unique(cameraResolutions.begin(),\n> > cameraResolutions.end());\n> > -       cameraResolutions.erase(last, cameraResolutions.end());\n> > -\n> > -       /*\n> > -        * Build the list of supported camera formats.\n> > -        *\n> > -        * To each Android format a list of compatible libcamera formats is\n> > -        * associated. 
The first libcamera format that tests successful is\n> > added\n> > -        * to the format translation map used when configuring the streams.\n> > -        * It is then tested against the list of supported camera\n> > resolutions to\n> > -        * build the stream configuration map reported through the camera\n> > static\n> > -        * metadata.\n> > -        */\n> > -       Size maxJpegSize;\n> > -       for (const auto &format : camera3FormatsMap) {\n> > -               int androidFormat = format.first;\n> > -               const Camera3Format &camera3Format = format.second;\n> > -               const std::vector<PixelFormat> &libcameraFormats =\n> > -                       camera3Format.libcameraFormats;\n> > -\n> > -               LOG(HAL, Debug) << \"Trying to map Android format \"\n> > -                               << camera3Format.name;\n> > -\n> > -               /*\n> > -                * JPEG is always supported, either produced directly by\n> > the\n> > -                * camera, or encoded in the HAL.\n> > -                */\n> > -               if (androidFormat == HAL_PIXEL_FORMAT_BLOB) {\n> > -                       formatsMap_[androidFormat] = formats::MJPEG;\n> > -                       LOG(HAL, Debug) << \"Mapped Android format \"\n> > -                                       << camera3Format.name << \" to \"\n> > -                                       << formats::MJPEG.toString()\n> > -                                       << \" (fixed mapping)\";\n> > -                       continue;\n> > -               }\n> > -\n> > -               /*\n> > -                * Test the libcamera formats that can produce images\n> > -                * compatible with the format defined by Android.\n> > -                */\n> > -               PixelFormat mappedFormat;\n> > -               for (const PixelFormat &pixelFormat : libcameraFormats) {\n> > -\n> > -                       LOG(HAL, Debug) << \"Testing \" <<\n> > pixelFormat.toString();\n> > -\n> 
> -                       /*\n> > -                        * The stream configuration size can be adjusted,\n> > -                        * not the pixel format.\n> > -                        *\n> > -                        * \\todo This could be simplified once all pipeline\n> > -                        * handlers will report the StreamFormats list of\n> > -                        * supported formats.\n> > -                        */\n> > -                       cfg.pixelFormat = pixelFormat;\n> > -\n> > -                       CameraConfiguration::Status status =\n> > cameraConfig->validate();\n> > -                       if (status != CameraConfiguration::Invalid &&\n> > -                           cfg.pixelFormat == pixelFormat) {\n> > -                               mappedFormat = pixelFormat;\n> > -                               break;\n> > -                       }\n> > -               }\n> > -\n> > -               if (!mappedFormat.isValid()) {\n> > -                       /* If the format is not mandatory, skip it. 
*/\n> > -                       if (!camera3Format.mandatory)\n> > -                               continue;\n> > -\n> > -                       LOG(HAL, Error)\n> > -                               << \"Failed to map mandatory Android format\n> > \"\n> > -                               << camera3Format.name << \" (\"\n> > -                               << utils::hex(androidFormat) << \"):\n> > aborting\";\n> > -                       return -EINVAL;\n> > -               }\n> > -\n> > -               /*\n> > -                * Record the mapping and then proceed to generate the\n> > -                * stream configurations map, by testing the image\n> > resolutions.\n> > -                */\n> > -               formatsMap_[androidFormat] = mappedFormat;\n> > -               LOG(HAL, Debug) << \"Mapped Android format \"\n> > -                               << camera3Format.name << \" to \"\n> > -                               << mappedFormat.toString();\n> > -\n> > -               std::vector<Size> resolutions;\n> > -               const PixelFormatInfo &info =\n> > PixelFormatInfo::info(mappedFormat);\n> > -               if (info.colourEncoding ==\n> > PixelFormatInfo::ColourEncodingRAW)\n> > -                       resolutions = getRawResolutions(mappedFormat);\n> > -               else\n> > -                       resolutions = getYUVResolutions(cameraConfig.get(),\n> > -                                                       mappedFormat,\n> > -                                                       cameraResolutions);\n> > -\n> > -               for (const Size &res : resolutions) {\n> > -                       streamConfigurations_.push_back({ res,\n> > androidFormat });\n> > -\n> > -                       /*\n> > -                        * If the format is HAL_PIXEL_FORMAT_YCbCr_420_888\n> > -                        * from which JPEG is produced, add an entry for\n> > -                        * the JPEG stream.\n> > -                        *\n> > -            
            * \\todo Wire the JPEG encoder to query the\n> > supported\n> > -                        * sizes provided a list of formats it can encode.\n> > -                        *\n> > -                        * \\todo Support JPEG streams produced by the\n> > Camera\n> > -                        * natively.\n> > -                        */\n> > -                       if (androidFormat ==\n> > HAL_PIXEL_FORMAT_YCbCr_420_888) {\n> > -                               streamConfigurations_.push_back(\n> > -                                       { res, HAL_PIXEL_FORMAT_BLOB });\n> > -                               maxJpegSize = std::max(maxJpegSize, res);\n> > -                       }\n> > -               }\n> > -\n> > -               /*\n> > -                * \\todo Calculate the maximum JPEG buffer size by asking\n> > the\n> > -                * encoder giving the maximum frame size required.\n> > -                */\n> > -               maxJpegBufferSize_ = maxJpegSize.width *\n> > maxJpegSize.height * 1.5;\n> > -       }\n> > -\n> > -       LOG(HAL, Debug) << \"Collected stream configuration map: \";\n> > -       for (const auto &entry : streamConfigurations_)\n> > -               LOG(HAL, Debug) << \"{ \" << entry.resolution.toString() <<\n> > \" - \"\n> > -                               << utils::hex(entry.androidFormat) << \" }\";\n> > -\n> > -       return 0;\n> > +       return capabilities_.initialize(camera_, orientation_, facing_);\n> >  }\n> >\n> >  /*\n> > @@ -817,802 +490,19 @@ void CameraDevice::stop()\n> >         state_ = State::Stopped;\n> >  }\n> >\n> > -void CameraDevice::setCallbacks(const camera3_callback_ops_t *callbacks)\n> > +unsigned int CameraDevice::maxJpegBufferSize() const\n> >  {\n> > -       callbacks_ = callbacks;\n> > +       return capabilities_.maxJpegBufferSize();\n> >  }\n> >\n> > -/*\n> > - * Return static information for the camera.\n> > - */\n> > -const camera_metadata_t *CameraDevice::getStaticMetadata()\n> > -{\n> > -     
  if (staticMetadata_)\n> > -               return staticMetadata_->get();\n> > -\n> > -       staticMetadata_ = std::make_unique<CameraMetadata>(64, 1024);\n> > -       if (!staticMetadata_->isValid()) {\n> > -               LOG(HAL, Error) << \"Failed to allocate static metadata\";\n> > -               staticMetadata_.reset();\n> > -               return nullptr;\n> > -       }\n> > -\n> > -       const ControlInfoMap &controlsInfo = camera_->controls();\n> > -       const ControlList &properties = camera_->properties();\n> > -\n> > -       /* Color correction static metadata. */\n> > -       {\n> > -               std::vector<uint8_t> data;\n> > -               data.reserve(3);\n> > -               const auto &infoMap =\n> > controlsInfo.find(&controls::draft::ColorCorrectionAberrationMode);\n> > -               if (infoMap != controlsInfo.end()) {\n> > -                       for (const auto &value : infoMap->second.values())\n> > -                               data.push_back(value.get<int32_t>());\n> > -               } else {\n> > -\n> >  data.push_back(ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF);\n> > -               }\n> > -\n> >  staticMetadata_->addEntry(ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> > -                                         data);\n> > -       }\n> > -\n> > -       /* Control static metadata. 
*/\n> > -       std::vector<uint8_t> aeAvailableAntiBandingModes = {\n> > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_OFF,\n> > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_50HZ,\n> > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_60HZ,\n> > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO,\n> > -       };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> > -                                 aeAvailableAntiBandingModes);\n> > -\n> > -       std::vector<uint8_t> aeAvailableModes = {\n> > -               ANDROID_CONTROL_AE_MODE_ON,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> > -                                 aeAvailableModes);\n> > -\n> > -       int64_t minFrameDurationNsec = -1;\n> > -       int64_t maxFrameDurationNsec = -1;\n> > -       const auto frameDurationsInfo =\n> > controlsInfo.find(&controls::FrameDurationLimits);\n> > -       if (frameDurationsInfo != controlsInfo.end()) {\n> > -               minFrameDurationNsec =\n> > frameDurationsInfo->second.min().get<int64_t>() * 1000;\n> > -               maxFrameDurationNsec =\n> > frameDurationsInfo->second.max().get<int64_t>() * 1000;\n> > -\n> > -               /*\n> > -                * Adjust the minimum frame duration to comply with Android\n> > -                * requirements. 
The camera service mandates all\n> > preview/record\n> > -                * streams to have a minimum frame duration < 33,366\n> > milliseconds\n> > -                * (see MAX_PREVIEW_RECORD_DURATION_NS in the camera\n> > service\n> > -                * implementation).\n> > -                *\n> > -                * If we're close enough (+ 500 useconds) to that value,\n> > round\n> > -                * the minimum frame duration of the camera to an accepted\n> > -                * value.\n> > -                */\n> > -               static constexpr int64_t MAX_PREVIEW_RECORD_DURATION_NS =\n> > 1e9 / 29.97;\n> > -               if (minFrameDurationNsec > MAX_PREVIEW_RECORD_DURATION_NS\n> > &&\n> > -                   minFrameDurationNsec < MAX_PREVIEW_RECORD_DURATION_NS\n> > + 500000)\n> > -                       minFrameDurationNsec =\n> > MAX_PREVIEW_RECORD_DURATION_NS - 1000;\n> > -\n> > -               /*\n> > -                * The AE routine frame rate limits are computed using the\n> > frame\n> > -                * duration limits, as libcamera clips the AE routine to\n> > the\n> > -                * frame durations.\n> > -                */\n> > -               int32_t maxFps = std::round(1e9 / minFrameDurationNsec);\n> > -               int32_t minFps = std::round(1e9 / maxFrameDurationNsec);\n> > -               minFps = std::max(1, minFps);\n> > -\n> > -               /*\n> > -                * Force rounding errors so that we have the proper frame\n> > -                * durations for when we reuse these variables later\n> > -                */\n> > -               minFrameDurationNsec = 1e9 / maxFps;\n> > -               maxFrameDurationNsec = 1e9 / minFps;\n> > -\n> > -               /*\n> > -                * Register to the camera service {min, max} and {max, max}\n> > -                * intervals as requested by the metadata documentation.\n> > -                */\n> > -               int32_t availableAeFpsTarget[] = {\n> > -                     
  minFps, maxFps, maxFps, maxFps\n> > -               };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > -                                         availableAeFpsTarget);\n> > -       }\n> > -\n> > -       std::vector<int32_t> aeCompensationRange = {\n> > -               0, 0,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> > -                                 aeCompensationRange);\n> > -\n> > -       const camera_metadata_rational_t aeCompensationStep[] = {\n> > -               { 0, 1 }\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> > -                                 aeCompensationStep);\n> > -\n> > -       std::vector<uint8_t> availableAfModes = {\n> > -               ANDROID_CONTROL_AF_MODE_OFF,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> > -                                 availableAfModes);\n> > -\n> > -       std::vector<uint8_t> availableEffects = {\n> > -               ANDROID_CONTROL_EFFECT_MODE_OFF,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> > -                                 availableEffects);\n> > -\n> > -       std::vector<uint8_t> availableSceneModes = {\n> > -               ANDROID_CONTROL_SCENE_MODE_DISABLED,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> > -                                 availableSceneModes);\n> > -\n> > -       std::vector<uint8_t> availableStabilizationModes = {\n> > -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE_OFF,\n> > -       };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> > -                                 availableStabilizationModes);\n> > -\n> > -       /*\n> > -        * \\todo Inspect the Camera capabilities to report the available\n> > -        * AWB modes. 
Default to AUTO as CTS tests require it.\n> > -        */\n> > -       std::vector<uint8_t> availableAwbModes = {\n> > -               ANDROID_CONTROL_AWB_MODE_AUTO,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> > -                                 availableAwbModes);\n> > -\n> > -       std::vector<int32_t> availableMaxRegions = {\n> > -               0, 0, 0,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n> > -                                 availableMaxRegions);\n> > -\n> > -       std::vector<uint8_t> sceneModesOverride = {\n> > -               ANDROID_CONTROL_AE_MODE_ON,\n> > -               ANDROID_CONTROL_AWB_MODE_AUTO,\n> > -               ANDROID_CONTROL_AF_MODE_OFF,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> > -                                 sceneModesOverride);\n> > -\n> > -       uint8_t aeLockAvailable = ANDROID_CONTROL_AE_LOCK_AVAILABLE_FALSE;\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> > -                                 aeLockAvailable);\n> > -\n> > -       uint8_t awbLockAvailable =\n> > ANDROID_CONTROL_AWB_LOCK_AVAILABLE_FALSE;\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> > -                                 awbLockAvailable);\n> > -\n> > -       char availableControlModes = ANDROID_CONTROL_MODE_AUTO;\n> > -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_MODES,\n> > -                                 availableControlModes);\n> > -\n> > -       /* JPEG static metadata. 
*/\n> > -\n> > -       /*\n> > -        * Create the list of supported thumbnail sizes by inspecting the\n> > -        * available JPEG resolutions collected in streamConfigurations_\n> > and\n> > -        * generate one entry for each aspect ratio.\n> > -        *\n> > -        * The JPEG thumbnailer can freely scale, so pick an arbitrary\n> > -        * (160, 160) size as the bounding rectangle, which is then\n> > cropped to\n> > -        * the different supported aspect ratios.\n> > -        */\n> > -       constexpr Size maxJpegThumbnail(160, 160);\n> > -       std::vector<Size> thumbnailSizes;\n> > -       thumbnailSizes.push_back({ 0, 0 });\n> > -       for (const auto &entry : streamConfigurations_) {\n> > -               if (entry.androidFormat != HAL_PIXEL_FORMAT_BLOB)\n> > -                       continue;\n> > -\n> > -               Size thumbnailSize = maxJpegThumbnail\n> > -                                    .boundedToAspectRatio({\n> > entry.resolution.width,\n> > -\n> > entry.resolution.height });\n> > -               thumbnailSizes.push_back(thumbnailSize);\n> > -       }\n> > -\n> > -       std::sort(thumbnailSizes.begin(), thumbnailSizes.end());\n> > -       auto last = std::unique(thumbnailSizes.begin(),\n> > thumbnailSizes.end());\n> > -       thumbnailSizes.erase(last, thumbnailSizes.end());\n> > -\n> > -       /* Transform sizes in to a list of integers that can be consumed.\n> > */\n> > -       std::vector<int32_t> thumbnailEntries;\n> > -       thumbnailEntries.reserve(thumbnailSizes.size() * 2);\n> > -       for (const auto &size : thumbnailSizes) {\n> > -               thumbnailEntries.push_back(size.width);\n> > -               thumbnailEntries.push_back(size.height);\n> > -       }\n> > -       staticMetadata_->addEntry(ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> > -                                 thumbnailEntries);\n> > -\n> > -       staticMetadata_->addEntry(ANDROID_JPEG_MAX_SIZE,\n> > maxJpegBufferSize_);\n> > -\n> > -       /* 
Sensor static metadata. */\n> > -       std::array<int32_t, 2> pixelArraySize;\n> > -       {\n> > -               const Size &size =\n> > properties.get(properties::PixelArraySize);\n> > -               pixelArraySize[0] = size.width;\n> > -               pixelArraySize[1] = size.height;\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> > -                                         pixelArraySize);\n> > -       }\n> > -\n> > -       if (properties.contains(properties::UnitCellSize)) {\n> > -               const Size &cellSize =\n> > properties.get<Size>(properties::UnitCellSize);\n> > -               std::array<float, 2> physicalSize{\n> > -                       cellSize.width * pixelArraySize[0] / 1e6f,\n> > -                       cellSize.height * pixelArraySize[1] / 1e6f\n> > -               };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> > -                                         physicalSize);\n> > -       }\n> > -\n> > -       {\n> > -               const Span<const Rectangle> &rects =\n> > -                       properties.get(properties::PixelArrayActiveAreas);\n> > -               std::vector<int32_t> data{\n> > -                       static_cast<int32_t>(rects[0].x),\n> > -                       static_cast<int32_t>(rects[0].y),\n> > -                       static_cast<int32_t>(rects[0].width),\n> > -                       static_cast<int32_t>(rects[0].height),\n> > -               };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> > -                                         data);\n> > -       }\n> > -\n> > -       int32_t sensitivityRange[] = {\n> > -               32, 2400,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> > -                                 sensitivityRange);\n> > -\n> > -       /* Report the color filter arrangement if the camera reports it. 
*/\n> > -       if\n> > (properties.contains(properties::draft::ColorFilterArrangement)) {\n> > -               uint8_t filterArr =\n> > properties.get(properties::draft::ColorFilterArrangement);\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> > -                                         filterArr);\n> > -       }\n> > -\n> > -       const auto &exposureInfo =\n> > controlsInfo.find(&controls::ExposureTime);\n> > -       if (exposureInfo != controlsInfo.end()) {\n> > -               int64_t exposureTimeRange[2] = {\n> > -                       exposureInfo->second.min().get<int32_t>() * 1000LL,\n> > -                       exposureInfo->second.max().get<int32_t>() * 1000LL,\n> > -               };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> > -                                         exposureTimeRange, 2);\n> > -       }\n> > -\n> > -       staticMetadata_->addEntry(ANDROID_SENSOR_ORIENTATION,\n> > orientation_);\n> > -\n> > -       std::vector<int32_t> testPatternModes = {\n> > -               ANDROID_SENSOR_TEST_PATTERN_MODE_OFF\n> > -       };\n> > -       const auto &testPatternsInfo =\n> > -               controlsInfo.find(&controls::draft::TestPatternMode);\n> > -       if (testPatternsInfo != controlsInfo.end()) {\n> > -               const auto &values = testPatternsInfo->second.values();\n> > -               ASSERT(!values.empty());\n> > -               for (const auto &value : values) {\n> > -                       switch (value.get<int32_t>()) {\n> > -                       case controls::draft::TestPatternModeOff:\n> > -                               /*\n> > -                                * ANDROID_SENSOR_TEST_PATTERN_MODE_OFF is\n> > -                                * already in testPatternModes.\n> > -                                */\n> > -                               break;\n> > -\n> > -                       case controls::draft::TestPatternModeSolidColor:\n> > -     
                          testPatternModes.push_back(\n> > -\n> >  ANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR);\n> > -                               break;\n> > -\n> > -                       case controls::draft::TestPatternModeColorBars:\n> > -                               testPatternModes.push_back(\n> > -\n> >  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS);\n> > -                               break;\n> > -\n> > -                       case\n> > controls::draft::TestPatternModeColorBarsFadeToGray:\n> > -                               testPatternModes.push_back(\n> > -\n> >  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS_FADE_TO_GRAY);\n> > -                               break;\n> > -\n> > -                       case controls::draft::TestPatternModePn9:\n> > -                               testPatternModes.push_back(\n> > -\n> >  ANDROID_SENSOR_TEST_PATTERN_MODE_PN9);\n> > -                               break;\n> > -\n> > -                       case controls::draft::TestPatternModeCustom1:\n> > -                               /* We don't support this yet. 
*/\n> > -                               break;\n> > -\n> > -                       default:\n> > -                               LOG(HAL, Error) << \"Unknown test pattern\n> > mode: \"\n> > -                                               << value.get<int32_t>();\n> > -                               continue;\n> > -                       }\n> > -               }\n> > -       }\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> > -                                 testPatternModes);\n> > -\n> > -       uint8_t timestampSource =\n> > ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN;\n> > -       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> > -                                 timestampSource);\n> > -\n> > -       if (maxFrameDurationNsec > 0)\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> > -                                         maxFrameDurationNsec);\n> > -\n> > -       /* Statistics static metadata. 
*/\n> > -       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> > -\n> >  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> > -                                 faceDetectMode);\n> > -\n> > -       int32_t maxFaceCount = 0;\n> > -       staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> > -                                 maxFaceCount);\n> > -\n> > -       {\n> > -               std::vector<uint8_t> data;\n> > -               data.reserve(2);\n> > -               const auto &infoMap =\n> > controlsInfo.find(&controls::draft::LensShadingMapMode);\n> > -               if (infoMap != controlsInfo.end()) {\n> > -                       for (const auto &value : infoMap->second.values())\n> > -                               data.push_back(value.get<int32_t>());\n> > -               } else {\n> > -\n> >  data.push_back(ANDROID_STATISTICS_LENS_SHADING_MAP_MODE_OFF);\n> > -               }\n> > -\n> >  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_LENS_SHADING_MAP_MODES,\n> > -                                         data);\n> > -       }\n> > -\n> > -       /* Sync static metadata. */\n> > -       int32_t maxLatency = ANDROID_SYNC_MAX_LATENCY_UNKNOWN;\n> > -       staticMetadata_->addEntry(ANDROID_SYNC_MAX_LATENCY, maxLatency);\n> > -\n> > -       /* Flash static metadata. */\n> > -       char flashAvailable = ANDROID_FLASH_INFO_AVAILABLE_FALSE;\n> > -       staticMetadata_->addEntry(ANDROID_FLASH_INFO_AVAILABLE,\n> > -                                 flashAvailable);\n> > -\n> > -       /* Lens static metadata. 
*/\n> > -       std::vector<float> lensApertures = {\n> > -               2.53 / 100,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> > -                                 lensApertures);\n> > -\n> > -       uint8_t lensFacing;\n> > -       switch (facing_) {\n> > -       default:\n> > -       case CAMERA_FACING_FRONT:\n> > -               lensFacing = ANDROID_LENS_FACING_FRONT;\n> > -               break;\n> > -       case CAMERA_FACING_BACK:\n> > -               lensFacing = ANDROID_LENS_FACING_BACK;\n> > -               break;\n> > -       case CAMERA_FACING_EXTERNAL:\n> > -               lensFacing = ANDROID_LENS_FACING_EXTERNAL;\n> > -               break;\n> > -       }\n> > -       staticMetadata_->addEntry(ANDROID_LENS_FACING, lensFacing);\n> > -\n> > -       std::vector<float> lensFocalLengths = {\n> > -               1,\n> > -       };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> > -                                 lensFocalLengths);\n> > -\n> > -       std::vector<uint8_t> opticalStabilizations = {\n> > -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF,\n> > -       };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> > -                                 opticalStabilizations);\n> > -\n> > -       float hypeFocalDistance = 0;\n> > -       staticMetadata_->addEntry(ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> > -                                 hypeFocalDistance);\n> > -\n> > -       float minFocusDistance = 0;\n> > -       staticMetadata_->addEntry(ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> > -                                 minFocusDistance);\n> > -\n> > -       /* Noise reduction modes. 
*/\n> > -       {\n> > -               std::vector<uint8_t> data;\n> > -               data.reserve(5);\n> > -               const auto &infoMap =\n> > controlsInfo.find(&controls::draft::NoiseReductionMode);\n> > -               if (infoMap != controlsInfo.end()) {\n> > -                       for (const auto &value : infoMap->second.values())\n> > -                               data.push_back(value.get<int32_t>());\n> > -               } else {\n> > -                       data.push_back(ANDROID_NOISE_REDUCTION_MODE_OFF);\n> > -               }\n> > -\n> >  staticMetadata_->addEntry(ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> > -                                         data);\n> > -       }\n> > -\n> > -       /* Scaler static metadata. */\n> > -\n> > -       /*\n> > -        * \\todo The digital zoom factor is a property that depends on the\n> > -        * desired output configuration and the sensor frame size input to\n> > the\n> > -        * ISP. This information is not available to the Android HAL, not\n> > at\n> > -        * initialization time at least.\n> > -        *\n> > -        * As a workaround rely on pipeline handlers initializing the\n> > -        * ScalerCrop control with the camera default configuration and\n> > use the\n> > -        * maximum and minimum crop rectangles to calculate the digital\n> > zoom\n> > -        * factor.\n> > -        */\n> > -       float maxZoom = 1.0f;\n> > -       const auto scalerCrop = controlsInfo.find(&controls::ScalerCrop);\n> > -       if (scalerCrop != controlsInfo.end()) {\n> > -               Rectangle min = scalerCrop->second.min().get<Rectangle>();\n> > -               Rectangle max = scalerCrop->second.max().get<Rectangle>();\n> > -               maxZoom = std::min(1.0f * max.width / min.width,\n> > -                                  1.0f * max.height / min.height);\n> > -       }\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> > -                       
          maxZoom);\n> > -\n> > -       std::vector<uint32_t> availableStreamConfigurations;\n> > -       availableStreamConfigurations.reserve(streamConfigurations_.size()\n> > * 4);\n> > -       for (const auto &entry : streamConfigurations_) {\n> > -\n> >  availableStreamConfigurations.push_back(entry.androidFormat);\n> > -\n> >  availableStreamConfigurations.push_back(entry.resolution.width);\n> > -\n> >  availableStreamConfigurations.push_back(entry.resolution.height);\n> > -               availableStreamConfigurations.push_back(\n> > -\n> >  ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT);\n> > -       }\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> > -                                 availableStreamConfigurations);\n> > -\n> > -       std::vector<int64_t> availableStallDurations = {\n> > -               ANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920,\n> > 33333333,\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> > -                                 availableStallDurations);\n> > -\n> > -       /* Use the minimum frame duration for all the YUV/RGB formats. 
*/\n> > -       if (minFrameDurationNsec > 0) {\n> > -               std::vector<int64_t> minFrameDurations;\n> > -               minFrameDurations.reserve(streamConfigurations_.size() *\n> > 4);\n> > -               for (const auto &entry : streamConfigurations_) {\n> > -                       minFrameDurations.push_back(entry.androidFormat);\n> > -\n> >  minFrameDurations.push_back(entry.resolution.width);\n> > -\n> >  minFrameDurations.push_back(entry.resolution.height);\n> > -                       minFrameDurations.push_back(minFrameDurationNsec);\n> > -               }\n> > -\n> >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> > -                                         minFrameDurations);\n> > -       }\n> > -\n> > -       uint8_t croppingType = ANDROID_SCALER_CROPPING_TYPE_CENTER_ONLY;\n> > -       staticMetadata_->addEntry(ANDROID_SCALER_CROPPING_TYPE,\n> > croppingType);\n> > -\n> > -       /* Info static metadata. */\n> > -       uint8_t supportedHWLevel =\n> > ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;\n> > -       staticMetadata_->addEntry(ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> > -                                 supportedHWLevel);\n> > -\n> > -       /* Request static metadata. */\n> > -       int32_t partialResultCount = 1;\n> > -       staticMetadata_->addEntry(ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> > -                                 partialResultCount);\n> > -\n> > -       {\n> > -               /* Default the value to 2 if not reported by the camera. 
*/\n> > -               uint8_t maxPipelineDepth = 2;\n> > -               const auto &infoMap =\n> > controlsInfo.find(&controls::draft::PipelineDepth);\n> > -               if (infoMap != controlsInfo.end())\n> > -                       maxPipelineDepth =\n> > infoMap->second.max().get<int32_t>();\n> > -\n> >  staticMetadata_->addEntry(ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> > -                                         maxPipelineDepth);\n> > -       }\n> > -\n> > -       /* LIMITED does not support reprocessing. */\n> > -       uint32_t maxNumInputStreams = 0;\n> > -       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> > -                                 maxNumInputStreams);\n> > -\n> > -       std::vector<uint8_t> availableCapabilities = {\n> > -               ANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE,\n> > -       };\n> > -\n> > -       /* Report if camera supports RAW. */\n> > -       bool rawStreamAvailable = false;\n> > -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > -               camera_->generateConfiguration({ StreamRole::Raw });\n> > -       if (cameraConfig && !cameraConfig->empty()) {\n> > -               const PixelFormatInfo &info =\n> > -\n> >  PixelFormatInfo::info(cameraConfig->at(0).pixelFormat);\n> > -               /* Only advertise RAW support if RAW16 is possible. 
*/\n> > -               if (info.colourEncoding ==\n> > PixelFormatInfo::ColourEncodingRAW &&\n> > -                   info.bitsPerPixel == 16) {\n> > -                       rawStreamAvailable = true;\n> > -\n> >  availableCapabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);\n> > -               }\n> > -       }\n> > -\n> > -       /* Number of { RAW, YUV, JPEG } supported output streams */\n> > -       int32_t numOutStreams[] = { rawStreamAvailable, 2, 1 };\n> > -       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> > -                                 numOutStreams);\n> > -\n> > -       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> > -                                 availableCapabilities);\n> > -\n> > -       std::vector<int32_t> availableCharacteristicsKeys = {\n> > -               ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> > -               ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> > -               ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> > -               ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > -               ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> > -               ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> > -               ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> > -               ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> > -               ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> > -               ANDROID_CONTROL_AVAILABLE_MODES,\n> > -               ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> > -               ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> > -               ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> > -               ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> > -               ANDROID_CONTROL_MAX_REGIONS,\n> > -               ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> > -               ANDROID_FLASH_INFO_AVAILABLE,\n> > -               ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> > -               ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> > -               
ANDROID_JPEG_MAX_SIZE,\n> > -               ANDROID_LENS_FACING,\n> > -               ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> > -               ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> > -               ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> > -               ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> > -               ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> > -               ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> > -               ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> > -               ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> > -               ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> > -               ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> > -               ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> > -               ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> > -               ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> > -               ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> > -               ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> > -               ANDROID_SCALER_CROPPING_TYPE,\n> > -               ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> > -               ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> > -               ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> > -               ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> > -               ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> > -               ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> > -               ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> > -               ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> > -               ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> > -               ANDROID_SENSOR_ORIENTATION,\n> > -               ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> > -               ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> > -               ANDROID_SYNC_MAX_LATENCY,\n> > -       };\n> > -\n> >  staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CHARACTERISTICS_KEYS,\n> > -                                 
availableCharacteristicsKeys);\n> > -\n> > -       std::vector<int32_t> availableRequestKeys = {\n> > -               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > -               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > -               ANDROID_CONTROL_AE_LOCK,\n> > -               ANDROID_CONTROL_AE_MODE,\n> > -               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > -               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > -               ANDROID_CONTROL_AF_MODE,\n> > -               ANDROID_CONTROL_AF_TRIGGER,\n> > -               ANDROID_CONTROL_AWB_LOCK,\n> > -               ANDROID_CONTROL_AWB_MODE,\n> > -               ANDROID_CONTROL_CAPTURE_INTENT,\n> > -               ANDROID_CONTROL_EFFECT_MODE,\n> > -               ANDROID_CONTROL_MODE,\n> > -               ANDROID_CONTROL_SCENE_MODE,\n> > -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> > -               ANDROID_FLASH_MODE,\n> > -               ANDROID_JPEG_ORIENTATION,\n> > -               ANDROID_JPEG_QUALITY,\n> > -               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> > -               ANDROID_JPEG_THUMBNAIL_SIZE,\n> > -               ANDROID_LENS_APERTURE,\n> > -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > -               ANDROID_NOISE_REDUCTION_MODE,\n> > -               ANDROID_SCALER_CROP_REGION,\n> > -               ANDROID_STATISTICS_FACE_DETECT_MODE\n> > -       };\n> > -       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_REQUEST_KEYS,\n> > -                                 availableRequestKeys);\n> > -\n> > -       std::vector<int32_t> availableResultKeys = {\n> > -               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > -               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > -               ANDROID_CONTROL_AE_LOCK,\n> > -               ANDROID_CONTROL_AE_MODE,\n> > -               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > - 
              ANDROID_CONTROL_AE_STATE,\n> > -               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > -               ANDROID_CONTROL_AF_MODE,\n> > -               ANDROID_CONTROL_AF_STATE,\n> > -               ANDROID_CONTROL_AF_TRIGGER,\n> > -               ANDROID_CONTROL_AWB_LOCK,\n> > -               ANDROID_CONTROL_AWB_MODE,\n> > -               ANDROID_CONTROL_AWB_STATE,\n> > -               ANDROID_CONTROL_CAPTURE_INTENT,\n> > -               ANDROID_CONTROL_EFFECT_MODE,\n> > -               ANDROID_CONTROL_MODE,\n> > -               ANDROID_CONTROL_SCENE_MODE,\n> > -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> > -               ANDROID_FLASH_MODE,\n> > -               ANDROID_FLASH_STATE,\n> > -               ANDROID_JPEG_GPS_COORDINATES,\n> > -               ANDROID_JPEG_GPS_PROCESSING_METHOD,\n> > -               ANDROID_JPEG_GPS_TIMESTAMP,\n> > -               ANDROID_JPEG_ORIENTATION,\n> > -               ANDROID_JPEG_QUALITY,\n> > -               ANDROID_JPEG_SIZE,\n> > -               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> > -               ANDROID_JPEG_THUMBNAIL_SIZE,\n> > -               ANDROID_LENS_APERTURE,\n> > -               ANDROID_LENS_FOCAL_LENGTH,\n> > -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > -               ANDROID_LENS_STATE,\n> > -               ANDROID_NOISE_REDUCTION_MODE,\n> > -               ANDROID_REQUEST_PIPELINE_DEPTH,\n> > -               ANDROID_SCALER_CROP_REGION,\n> > -               ANDROID_SENSOR_EXPOSURE_TIME,\n> > -               ANDROID_SENSOR_FRAME_DURATION,\n> > -               ANDROID_SENSOR_ROLLING_SHUTTER_SKEW,\n> > -               ANDROID_SENSOR_TEST_PATTERN_MODE,\n> > -               ANDROID_SENSOR_TIMESTAMP,\n> > -               ANDROID_STATISTICS_FACE_DETECT_MODE,\n> > -               ANDROID_STATISTICS_LENS_SHADING_MAP_MODE,\n> > -               ANDROID_STATISTICS_HOT_PIXEL_MAP_MODE,\n> > -               ANDROID_STATISTICS_SCENE_FLICKER,\n> > -       };\n> > -       
staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_RESULT_KEYS,\n> > -                                 availableResultKeys);\n> > -\n> > -       if (!staticMetadata_->isValid()) {\n> > -               LOG(HAL, Error) << \"Failed to construct static metadata\";\n> > -               staticMetadata_.reset();\n> > -               return nullptr;\n> > -       }\n> > -\n> > -       if (staticMetadata_->resized()) {\n> > -               auto [entryCount, dataCount] = staticMetadata_->usage();\n> > -               LOG(HAL, Info)\n> > -                       << \"Static metadata resized: \" << entryCount\n> > -                       << \" entries and \" << dataCount << \" bytes used\";\n> > -       }\n> > -\n> > -       return staticMetadata_->get();\n> > -}\n> > -\n> > -std::unique_ptr<CameraMetadata> CameraDevice::requestTemplatePreview()\n> > +void CameraDevice::setCallbacks(const camera3_callback_ops_t *callbacks)\n> >  {\n> > -       /*\n> > -        * \\todo Keep this in sync with the actual number of entries.\n> > -        * Currently: 20 entries, 35 bytes\n> > -        */\n> > -       auto requestTemplate = std::make_unique<CameraMetadata>(21, 36);\n> > -       if (!requestTemplate->isValid()) {\n> > -               return nullptr;\n> > -       }\n> > -\n> > -       /* Get the FPS range registered in the static metadata. 
*/\n> > -       camera_metadata_ro_entry_t entry;\n> > -       bool found =\n> > staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > -                                              &entry);\n> > -       if (!found) {\n> > -               LOG(HAL, Error) << \"Cannot create capture template without\n> > FPS range\";\n> > -               return nullptr;\n> > -       }\n> > -\n> > -       /*\n> > -        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> > -        * has been assembled as {{min, max} {max, max}}.\n> > -        */\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > -                                 entry.data.i32, 2);\n> > -\n> > -       uint8_t aeMode = ANDROID_CONTROL_AE_MODE_ON;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_MODE, aeMode);\n> > -\n> > -       int32_t aeExposureCompensation = 0;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > -                                 aeExposureCompensation);\n> > -\n> > -       uint8_t aePrecaptureTrigger =\n> > ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_IDLE;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > -                                 aePrecaptureTrigger);\n> > -\n> > -       uint8_t aeLock = ANDROID_CONTROL_AE_LOCK_OFF;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_LOCK, aeLock);\n> > -\n> > -       uint8_t aeAntibandingMode =\n> > ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > -                                 aeAntibandingMode);\n> > -\n> > -       uint8_t afMode = ANDROID_CONTROL_AF_MODE_OFF;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AF_MODE, afMode);\n> > -\n> > -       uint8_t afTrigger = ANDROID_CONTROL_AF_TRIGGER_IDLE;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AF_TRIGGER, afTrigger);\n> > -\n> > -       uint8_t awbMode = 
ANDROID_CONTROL_AWB_MODE_AUTO;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AWB_MODE, awbMode);\n> > -\n> > -       uint8_t awbLock = ANDROID_CONTROL_AWB_LOCK_OFF;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_AWB_LOCK, awbLock);\n> > -\n> > -       uint8_t flashMode = ANDROID_FLASH_MODE_OFF;\n> > -       requestTemplate->addEntry(ANDROID_FLASH_MODE, flashMode);\n> > -\n> > -       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> > -       requestTemplate->addEntry(ANDROID_STATISTICS_FACE_DETECT_MODE,\n> > -                                 faceDetectMode);\n> > -\n> > -       uint8_t noiseReduction = ANDROID_NOISE_REDUCTION_MODE_OFF;\n> > -       requestTemplate->addEntry(ANDROID_NOISE_REDUCTION_MODE,\n> > -                                 noiseReduction);\n> > -\n> > -       uint8_t aberrationMode =\n> > ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF;\n> > -       requestTemplate->addEntry(ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > -                                 aberrationMode);\n> > -\n> > -       uint8_t controlMode = ANDROID_CONTROL_MODE_AUTO;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_MODE, controlMode);\n> > -\n> > -       float lensAperture = 2.53 / 100;\n> > -       requestTemplate->addEntry(ANDROID_LENS_APERTURE, lensAperture);\n> > -\n> > -       uint8_t opticalStabilization =\n> > ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF;\n> > -       requestTemplate->addEntry(ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > -                                 opticalStabilization);\n> > -\n> > -       uint8_t captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> > -       requestTemplate->addEntry(ANDROID_CONTROL_CAPTURE_INTENT,\n> > -                                 captureIntent);\n> > -\n> > -       return requestTemplate;\n> > +       callbacks_ = callbacks;\n> >  }\n> >\n> > -std::unique_ptr<CameraMetadata> CameraDevice::requestTemplateVideo()\n> > +const camera_metadata_t 
*CameraDevice::getStaticMetadata()\n> >  {\n> > -       std::unique_ptr<CameraMetadata> previewTemplate =\n> > requestTemplatePreview();\n> > -       if (!previewTemplate)\n> > -               return nullptr;\n> > -\n> > -       /*\n> > -        * The video template requires a fixed FPS range. Everything else\n> > -        * stays the same as the preview template.\n> > -        */\n> > -       camera_metadata_ro_entry_t entry;\n> > -\n> >  staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > -                                 &entry);\n> > -\n> > -       /*\n> > -        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> > -        * has been assembled as {{min, max} {max, max}}.\n> > -        */\n> > -       previewTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > -                                    entry.data.i32 + 2, 2);\n> > -\n> > -       return previewTemplate;\n> > +       return capabilities_.staticMetadata()->get();\n> >  }\n> >\n> >  /*\n> > @@ -1630,7 +520,7 @@ const camera_metadata_t\n> > *CameraDevice::constructDefaultRequestSettings(int type)\n> >         switch (type) {\n> >         case CAMERA3_TEMPLATE_PREVIEW:\n> >                 captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> > -               requestTemplate = requestTemplatePreview();\n> > +               requestTemplate = capabilities_.requestTemplatePreview();\n> >                 break;\n> >         case CAMERA3_TEMPLATE_STILL_CAPTURE:\n> >                 /*\n> > @@ -1638,15 +528,15 @@ const camera_metadata_t\n> > *CameraDevice::constructDefaultRequestSettings(int type)\n> >                  * for the torch mode we currently do not support.\n> >                  */\n> >                 captureIntent =\n> > ANDROID_CONTROL_CAPTURE_INTENT_STILL_CAPTURE;\n> > -               requestTemplate = requestTemplatePreview();\n> > +               requestTemplate = capabilities_.requestTemplatePreview();\n> >                 break;\n> >     
    case CAMERA3_TEMPLATE_VIDEO_RECORD:\n> >                 captureIntent =\n> > ANDROID_CONTROL_CAPTURE_INTENT_VIDEO_RECORD;\n> > -               requestTemplate = requestTemplateVideo();\n> > +               requestTemplate = capabilities_.requestTemplateVideo();\n> >                 break;\n> >         case CAMERA3_TEMPLATE_VIDEO_SNAPSHOT:\n> >                 captureIntent =\n> > ANDROID_CONTROL_CAPTURE_INTENT_VIDEO_SNAPSHOT;\n> > -               requestTemplate = requestTemplateVideo();\n> > +               requestTemplate = capabilities_.requestTemplateVideo();\n> >                 break;\n> >         /* \\todo Implement templates generation for the remaining use\n> > cases. */\n> >         case CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG:\n> > @@ -1668,19 +558,6 @@ const camera_metadata_t\n> > *CameraDevice::constructDefaultRequestSettings(int type)\n> >         return requestTemplates_[type]->get();\n> >  }\n> >\n> > -PixelFormat CameraDevice::toPixelFormat(int format) const\n> > -{\n> > -       /* Translate Android format code to libcamera pixel format. 
*/\n> > -       auto it = formatsMap_.find(format);\n> > -       if (it == formatsMap_.end()) {\n> > -               LOG(HAL, Error) << \"Requested format \" <<\n> > utils::hex(format)\n> > -                               << \" not supported\";\n> > -               return PixelFormat();\n> > -       }\n> > -\n> > -       return it->second;\n> > -}\n> > -\n> >  /*\n> >   * Inspect the stream_list to produce a list of StreamConfiguration to\n> >   * be use to configure the Camera.\n> > @@ -1727,7 +604,7 @@ int\n> > CameraDevice::configureStreams(camera3_stream_configuration_t *stream_list)\n> >                 camera3_stream_t *stream = stream_list->streams[i];\n> >                 Size size(stream->width, stream->height);\n> >\n> > -               PixelFormat format = toPixelFormat(stream->format);\n> > +               PixelFormat format =\n> > capabilities_.toPixelFormat(stream->format);\n> >\n> >                 LOG(HAL, Info) << \"Stream #\" << i\n> >                                << \", direction: \" << stream->stream_type\n> > diff --git a/src/android/camera_device.h b/src/android/camera_device.h\n> > index 4aadb27c562c..090fe28a551e 100644\n> > --- a/src/android/camera_device.h\n> > +++ b/src/android/camera_device.h\n> > @@ -10,14 +10,12 @@\n> >  #include <map>\n> >  #include <memory>\n> >  #include <mutex>\n> > -#include <tuple>\n> >  #include <vector>\n> >\n> >  #include <hardware/camera3.h>\n> >\n> >  #include <libcamera/buffer.h>\n> >  #include <libcamera/camera.h>\n> > -#include <libcamera/geometry.h>\n> >  #include <libcamera/request.h>\n> >  #include <libcamera/stream.h>\n> >\n> > @@ -26,6 +24,7 @@\n> >  #include \"libcamera/internal/message.h\"\n> >  #include \"libcamera/internal/thread.h\"\n> >\n> > +#include \"camera_capabilities.h\"\n> >  #include \"camera_metadata.h\"\n> >  #include \"camera_stream.h\"\n> >  #include \"camera_worker.h\"\n> > @@ -57,7 +56,7 @@ public:\n> >         const std::string &model() const { return model_; }\n> >         int 
facing() const { return facing_; }\n> >         int orientation() const { return orientation_; }\n> > -       unsigned int maxJpegBufferSize() const { return\n> > maxJpegBufferSize_; }\n> > +       unsigned int maxJpegBufferSize() const;\n> >\n> >         void setCallbacks(const camera3_callback_ops_t *callbacks);\n> >         const camera_metadata_t *getStaticMetadata();\n> > @@ -86,11 +85,6 @@ private:\n> >                 std::unique_ptr<CaptureRequest> request_;\n> >         };\n> >\n> > -       struct Camera3StreamConfiguration {\n> > -               libcamera::Size resolution;\n> > -               int androidFormat;\n> > -       };\n> > -\n> >         enum class State {\n> >                 Stopped,\n> >                 Flushing,\n> > @@ -99,22 +93,11 @@ private:\n> >\n> >         void stop();\n> >\n> > -       int initializeStreamConfigurations();\n> > -       std::vector<libcamera::Size>\n> > -       getYUVResolutions(libcamera::CameraConfiguration *cameraConfig,\n> > -                         const libcamera::PixelFormat &pixelFormat,\n> > -                         const std::vector<libcamera::Size> &resolutions);\n> > -       std::vector<libcamera::Size>\n> > -       getRawResolutions(const libcamera::PixelFormat &pixelFormat);\n> > -\n> >         libcamera::FrameBuffer *createFrameBuffer(const buffer_handle_t\n> > camera3buffer);\n> >         void abortRequest(camera3_capture_request_t *request);\n> >         void notifyShutter(uint32_t frameNumber, uint64_t timestamp);\n> >         void notifyError(uint32_t frameNumber, camera3_stream_t *stream,\n> >                          camera3_error_msg_code code);\n> > -       std::unique_ptr<CameraMetadata> requestTemplatePreview();\n> > -       std::unique_ptr<CameraMetadata> requestTemplateVideo();\n> > -       libcamera::PixelFormat toPixelFormat(int format) const;\n> >         int processControls(Camera3RequestDescriptor *descriptor);\n> >         std::unique_ptr<CameraMetadata> getResultMetadata(\n> >       
          const Camera3RequestDescriptor &descriptor) const;\n> > @@ -129,13 +112,11 @@ private:\n> >\n> >         std::shared_ptr<libcamera::Camera> camera_;\n> >         std::unique_ptr<libcamera::CameraConfiguration> config_;\n> > +       CameraCapabilities capabilities_;\n> >\n> > -       std::unique_ptr<CameraMetadata> staticMetadata_;\n> >         std::map<unsigned int, std::unique_ptr<CameraMetadata>>\n> > requestTemplates_;\n> >         const camera3_callback_ops_t *callbacks_;\n> >\n> > -       std::vector<Camera3StreamConfiguration> streamConfigurations_;\n> > -       std::map<int, libcamera::PixelFormat> formatsMap_;\n> >         std::vector<CameraStream> streams_;\n> >\n> >         libcamera::Mutex descriptorsMutex_; /* Protects descriptors_. */\n> > @@ -147,8 +128,6 @@ private:\n> >         int facing_;\n> >         int orientation_;\n> >\n> > -       unsigned int maxJpegBufferSize_;\n> > -\n> >         CameraMetadata lastSettings_;\n> >  };\n> >\n> > diff --git a/src/android/meson.build b/src/android/meson.build\n> > index f27fd5316705..6270fb201338 100644\n> > --- a/src/android/meson.build\n> > +++ b/src/android/meson.build\n> > @@ -44,6 +44,7 @@ subdir('cros')\n> >\n> >  android_hal_sources = files([\n> >      'camera3_hal.cpp',\n> > +    'camera_capabilities.cpp',\n> >      'camera_device.cpp',\n> >      'camera_hal_config.cpp',\n> >      'camera_hal_manager.cpp',\n> > --\n> > 2.31.1\n> >\n> >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 7A8DDC321B\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue, 22 Jun 2021 07:40:21 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id AD6CC6893B;\n\tTue, 22 Jun 2021 
09:40:20 +0200 (CEST)","from relay7-d.mail.gandi.net (relay7-d.mail.gandi.net\n\t[217.70.183.200])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 0992C68932\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 22 Jun 2021 09:40:19 +0200 (CEST)","(Authenticated sender: jacopo@jmondi.org)\n\tby relay7-d.mail.gandi.net (Postfix) with ESMTPSA id C231320005;\n\tTue, 22 Jun 2021 07:40:17 +0000 (UTC)"],"Date":"Tue, 22 Jun 2021 09:41:07 +0200","From":"Jacopo Mondi <jacopo@jmondi.org>","To":"Hirokazu Honda <hiroh@chromium.org>","Message-ID":"<20210622074107.6oeneo3z4yluohsz@uno.localdomain>","References":"<20210621152954.40299-1-jacopo@jmondi.org>\n\t<20210621152954.40299-3-jacopo@jmondi.org>\n\t<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" 
<libcamera-devel-bounces@lists.libcamera.org>"}},{"id":17682,"web_url":"https://patchwork.libcamera.org/comment/17682/","msgid":"<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>","date":"2021-06-22T01:34:27","subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","submitter":{"id":63,"url":"https://patchwork.libcamera.org/api/people/63/","name":"Hirokazu Honda","email":"hiroh@chromium.org"},"content":"Hi Jacopo, thank you for the patch.\n\nI failed to apply the patch on the top of the latest tree to review.\nCould you tell me the parent commit which I can apply this patch?\n\n-Hiro\nOn Tue, Jun 22, 2021 at 12:29 AM Jacopo Mondi <jacopo@jmondi.org> wrote:\n\n> The camera_device.cpp has grown a little too much, and it has quickly\n> become hard to maintain. Break out the handling of the static\n> information collected at camera initialization time to a new\n> CameraCapabilities class.\n>\n> Break out from the camera_device.cpp file all the functions relative to:\n> - Initialization of supported stream configurations\n> - Initialization of static metadata\n> - Initialization of request templates\n>\n> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>\n> Acked-by: Paul Elder <paul.elder@ideasonboard.com>\n> Tested-by: Paul Elder <paul.elder@ideasonboard.com>\n> ---\n>  src/android/camera_capabilities.cpp | 1164 +++++++++++++++++++++++++++\n>  src/android/camera_capabilities.h   |   65 ++\n>  src/android/camera_device.cpp       | 1147 +-------------------------\n>  src/android/camera_device.h         |   27 +-\n>  src/android/meson.build             |    1 +\n>  5 files changed, 1245 insertions(+), 1159 deletions(-)\n>  create mode 100644 src/android/camera_capabilities.cpp\n>  create mode 100644 src/android/camera_capabilities.h\n>\n> diff --git a/src/android/camera_capabilities.cpp\n> b/src/android/camera_capabilities.cpp\n> new file mode 100644\n> index 000000000000..311a2c839586\n> --- /dev/null\n> +++ 
b/src/android/camera_capabilities.cpp\n> @@ -0,0 +1,1164 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2021, Google Inc.\n> + *\n> + * camera_capabilities.cpp - Camera static properties manager\n> + */\n> +\n> +#include \"camera_capabilities.h\"\n> +\n> +#include <array>\n> +#include <cmath>\n> +\n> +#include <hardware/camera3.h>\n> +\n> +#include <libcamera/control_ids.h>\n> +#include <libcamera/controls.h>\n> +#include <libcamera/property_ids.h>\n> +\n> +#include \"libcamera/internal/formats.h\"\n> +#include \"libcamera/internal/log.h\"\n> +\n> +using namespace libcamera;\n> +\n> +LOG_DECLARE_CATEGORY(HAL)\n> +\n> +namespace {\n> +\n> +/*\n> + * \\var camera3Resolutions\n> + * \\brief The list of image resolutions defined as mandatory to be\n> supported by\n> + * the Android Camera3 specification\n> + */\n> +const std::vector<Size> camera3Resolutions = {\n> +       { 320, 240 },\n> +       { 640, 480 },\n> +       { 1280, 720 },\n> +       { 1920, 1080 }\n> +};\n> +\n> +/*\n> + * \\struct Camera3Format\n> + * \\brief Data associated with an Android format identifier\n> + * \\var libcameraFormats List of libcamera pixel formats compatible with\n> the\n> + * Android format\n> + * \\var name The human-readable representation of the Android format code\n> + */\n> +struct Camera3Format {\n> +       std::vector<PixelFormat> libcameraFormats;\n> +       bool mandatory;\n> +       const char *name;\n> +};\n> +\n> +/*\n> + * \\var camera3FormatsMap\n> + * \\brief Associate Android format code with ancillary data\n> + */\n> +const std::map<int, const Camera3Format> camera3FormatsMap = {\n> +       {\n> +               HAL_PIXEL_FORMAT_BLOB, {\n> +                       { formats::MJPEG },\n> +                       true,\n> +                       \"BLOB\"\n> +               }\n> +       }, {\n> +               HAL_PIXEL_FORMAT_YCbCr_420_888, {\n> +                       { formats::NV12, formats::NV21 },\n> +                       
true,\n> +                       \"YCbCr_420_888\"\n> +               }\n> +       }, {\n> +               /*\n> +                * \\todo Translate IMPLEMENTATION_DEFINED inspecting the\n> gralloc\n> +                * usage flag. For now, copy the YCbCr_420 configuration.\n> +                */\n> +               HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> +                       { formats::NV12, formats::NV21 },\n> +                       true,\n> +                       \"IMPLEMENTATION_DEFINED\"\n> +               }\n> +       }, {\n> +               HAL_PIXEL_FORMAT_RAW10, {\n> +                       {\n> +                               formats::SBGGR10_CSI2P,\n> +                               formats::SGBRG10_CSI2P,\n> +                               formats::SGRBG10_CSI2P,\n> +                               formats::SRGGB10_CSI2P\n> +                       },\n> +                       false,\n> +                       \"RAW10\"\n> +               }\n> +       }, {\n> +               HAL_PIXEL_FORMAT_RAW12, {\n> +                       {\n> +                               formats::SBGGR12_CSI2P,\n> +                               formats::SGBRG12_CSI2P,\n> +                               formats::SGRBG12_CSI2P,\n> +                               formats::SRGGB12_CSI2P\n> +                       },\n> +                       false,\n> +                       \"RAW12\"\n> +               }\n> +       }, {\n> +               HAL_PIXEL_FORMAT_RAW16, {\n> +                       {\n> +                               formats::SBGGR16,\n> +                               formats::SGBRG16,\n> +                               formats::SGRBG16,\n> +                               formats::SRGGB16\n> +                       },\n> +                       false,\n> +                       \"RAW16\"\n> +               }\n> +       },\n> +};\n> +\n> +} /* namespace */\n> +\n> +int CameraCapabilities::initialize(std::shared_ptr<libcamera::Camera>\n> camera,\n> +            
                      int orientation, int facing)\n> +{\n> +       camera_ = camera;\n> +       orientation_ = orientation;\n> +       facing_ = facing;\n> +\n> +       /* Acquire the camera and initialize available stream\n> configurations. */\n> +       int ret = camera_->acquire();\n> +       if (ret) {\n> +               LOG(HAL, Error) << \"Failed to temporarily acquire the\n> camera\";\n> +               return ret;\n> +       }\n> +\n> +       ret = initializeStreamConfigurations();\n> +       camera_->release();\n> +       if (ret)\n> +               return ret;\n> +\n> +       return initializeStaticMetadata();\n> +}\n> +\n> +std::vector<Size>\n> CameraCapabilities::getYUVResolutions(CameraConfiguration *cameraConfig,\n> +                                                       const PixelFormat\n> &pixelFormat,\n> +                                                       const\n> std::vector<Size> &resolutions)\n> +{\n> +       std::vector<Size> supportedResolutions;\n> +\n> +       StreamConfiguration &cfg = cameraConfig->at(0);\n> +       for (const Size &res : resolutions) {\n> +               cfg.pixelFormat = pixelFormat;\n> +               cfg.size = res;\n> +\n> +               CameraConfiguration::Status status =\n> cameraConfig->validate();\n> +               if (status != CameraConfiguration::Valid) {\n> +                       LOG(HAL, Debug) << cfg.toString() << \" not\n> supported\";\n> +                       continue;\n> +               }\n> +\n> +               LOG(HAL, Debug) << cfg.toString() << \" supported\";\n> +\n> +               supportedResolutions.push_back(res);\n> +       }\n> +\n> +       return supportedResolutions;\n> +}\n> +\n> +std::vector<Size> CameraCapabilities::getRawResolutions(const\n> libcamera::PixelFormat &pixelFormat)\n> +{\n> +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> +               camera_->generateConfiguration({ StreamRole::Raw });\n> +       StreamConfiguration &cfg = 
cameraConfig->at(0);\n> +       const StreamFormats &formats = cfg.formats();\n> +       std::vector<Size> supportedResolutions =\n> formats.sizes(pixelFormat);\n> +\n> +       return supportedResolutions;\n> +}\n> +\n> +/*\n> + * Initialize the format conversion map to translate from Android format\n> + * identifier to libcamera pixel formats and fill in the list of supported\n> + * stream configurations to be reported to the Android camera framework\n> through\n> + * the Camera static metadata.\n> + */\n> +int CameraCapabilities::initializeStreamConfigurations()\n> +{\n> +       /*\n> +        * Get the maximum output resolutions\n> +        * \\todo Get this from the camera properties once defined\n> +        */\n> +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> +               camera_->generateConfiguration({ StillCapture });\n> +       if (!cameraConfig) {\n> +               LOG(HAL, Error) << \"Failed to get maximum resolution\";\n> +               return -EINVAL;\n> +       }\n> +       StreamConfiguration &cfg = cameraConfig->at(0);\n> +\n> +       /*\n> +        * \\todo JPEG - Adjust the maximum available resolution by taking\n> the\n> +        * JPEG encoder requirements into account (alignment and aspect\n> ratio).\n> +        */\n> +       const Size maxRes = cfg.size;\n> +       LOG(HAL, Debug) << \"Maximum supported resolution: \" <<\n> maxRes.toString();\n> +\n> +       /*\n> +        * Build the list of supported image resolutions.\n> +        *\n> +        * The resolutions listed in camera3Resolution are mandatory to be\n> +        * supported, up to the camera maximum resolution.\n> +        *\n> +        * Augment the list by adding resolutions calculated from the\n> camera\n> +        * maximum one.\n> +        */\n> +       std::vector<Size> cameraResolutions;\n> +       std::copy_if(camera3Resolutions.begin(), camera3Resolutions.end(),\n> +                    std::back_inserter(cameraResolutions),\n> +                    
[&](const Size &res) { return res < maxRes; });\n> +\n> +       /*\n> +        * The Camera3 specification suggests adding 1/2 and 1/4 of the\n> maximum\n> +        * resolution.\n> +        */\n> +       for (unsigned int divider = 2;; divider <<= 1) {\n> +               Size derivedSize{\n> +                       maxRes.width / divider,\n> +                       maxRes.height / divider,\n> +               };\n> +\n> +               if (derivedSize.width < 320 ||\n> +                   derivedSize.height < 240)\n> +                       break;\n> +\n> +               cameraResolutions.push_back(derivedSize);\n> +       }\n> +       cameraResolutions.push_back(maxRes);\n> +\n> +       /* Remove duplicated entries from the list of supported\n> resolutions. */\n> +       std::sort(cameraResolutions.begin(), cameraResolutions.end());\n> +       auto last = std::unique(cameraResolutions.begin(),\n> cameraResolutions.end());\n> +       cameraResolutions.erase(last, cameraResolutions.end());\n> +\n> +       /*\n> +        * Build the list of supported camera formats.\n> +        *\n> +        * To each Android format a list of compatible libcamera formats is\n> +        * associated. 
The first libcamera format that tests successful is\n> added\n> +        * to the format translation map used when configuring the streams.\n> +        * It is then tested against the list of supported camera\n> resolutions to\n> +        * build the stream configuration map reported through the camera\n> static\n> +        * metadata.\n> +        */\n> +       Size maxJpegSize;\n> +       for (const auto &format : camera3FormatsMap) {\n> +               int androidFormat = format.first;\n> +               const Camera3Format &camera3Format = format.second;\n> +               const std::vector<PixelFormat> &libcameraFormats =\n> +                       camera3Format.libcameraFormats;\n> +\n> +               LOG(HAL, Debug) << \"Trying to map Android format \"\n> +                               << camera3Format.name;\n> +\n> +               /*\n> +                * JPEG is always supported, either produced directly by\n> the\n> +                * camera, or encoded in the HAL.\n> +                */\n> +               if (androidFormat == HAL_PIXEL_FORMAT_BLOB) {\n> +                       formatsMap_[androidFormat] = formats::MJPEG;\n> +                       LOG(HAL, Debug) << \"Mapped Android format \"\n> +                                       << camera3Format.name << \" to \"\n> +                                       << formats::MJPEG.toString()\n> +                                       << \" (fixed mapping)\";\n> +                       continue;\n> +               }\n> +\n> +               /*\n> +                * Test the libcamera formats that can produce images\n> +                * compatible with the format defined by Android.\n> +                */\n> +               PixelFormat mappedFormat;\n> +               for (const PixelFormat &pixelFormat : libcameraFormats) {\n> +\n> +                       LOG(HAL, Debug) << \"Testing \" <<\n> pixelFormat.toString();\n> +\n> +                       /*\n> +                        * The stream configuration 
size can be adjusted,\n> +                        * not the pixel format.\n> +                        *\n> +                        * \\todo This could be simplified once all pipeline\n> +                        * handlers will report the StreamFormats list of\n> +                        * supported formats.\n> +                        */\n> +                       cfg.pixelFormat = pixelFormat;\n> +\n> +                       CameraConfiguration::Status status =\n> cameraConfig->validate();\n> +                       if (status != CameraConfiguration::Invalid &&\n> +                           cfg.pixelFormat == pixelFormat) {\n> +                               mappedFormat = pixelFormat;\n> +                               break;\n> +                       }\n> +               }\n> +\n> +               if (!mappedFormat.isValid()) {\n> +                       /* If the format is not mandatory, skip it. */\n> +                       if (!camera3Format.mandatory)\n> +                               continue;\n> +\n> +                       LOG(HAL, Error)\n> +                               << \"Failed to map mandatory Android format\n> \"\n> +                               << camera3Format.name << \" (\"\n> +                               << utils::hex(androidFormat) << \"):\n> aborting\";\n> +                       return -EINVAL;\n> +               }\n> +\n> +               /*\n> +                * Record the mapping and then proceed to generate the\n> +                * stream configurations map, by testing the image\n> resolutions.\n> +                */\n> +               formatsMap_[androidFormat] = mappedFormat;\n> +               LOG(HAL, Debug) << \"Mapped Android format \"\n> +                               << camera3Format.name << \" to \"\n> +                               << mappedFormat.toString();\n> +\n> +               std::vector<Size> resolutions;\n> +               const PixelFormatInfo &info =\n> PixelFormatInfo::info(mappedFormat);\n> +           
    if (info.colourEncoding ==\n> PixelFormatInfo::ColourEncodingRAW)\n> +                       resolutions = getRawResolutions(mappedFormat);\n> +               else\n> +                       resolutions = getYUVResolutions(cameraConfig.get(),\n> +                                                       mappedFormat,\n> +                                                       cameraResolutions);\n> +\n> +               for (const Size &res : resolutions) {\n> +                       streamConfigurations_.push_back({ res,\n> androidFormat });\n> +\n> +                       /*\n> +                        * If the format is HAL_PIXEL_FORMAT_YCbCr_420_888\n> +                        * from which JPEG is produced, add an entry for\n> +                        * the JPEG stream.\n> +                        *\n> +                        * \\todo Wire the JPEG encoder to query the\n> supported\n> +                        * sizes provided a list of formats it can encode.\n> +                        *\n> +                        * \\todo Support JPEG streams produced by the\n> Camera\n> +                        * natively.\n> +                        */\n> +                       if (androidFormat ==\n> HAL_PIXEL_FORMAT_YCbCr_420_888) {\n> +                               streamConfigurations_.push_back(\n> +                                       { res, HAL_PIXEL_FORMAT_BLOB });\n> +                               maxJpegSize = std::max(maxJpegSize, res);\n> +                       }\n> +               }\n> +\n> +               /*\n> +                * \\todo Calculate the maximum JPEG buffer size by asking\n> the\n> +                * encoder giving the maximum frame size required.\n> +                */\n> +               maxJpegBufferSize_ = maxJpegSize.width *\n> maxJpegSize.height * 1.5;\n> +       }\n> +\n> +       LOG(HAL, Debug) << \"Collected stream configuration map: \";\n> +       for (const auto &entry : streamConfigurations_)\n> +               LOG(HAL, Debug) << 
\"{ \" << entry.resolution.toString() <<\n> \" - \"\n> +                               << utils::hex(entry.androidFormat) << \" }\";\n> +\n> +       return 0;\n> +}\n> +\n> +int CameraCapabilities::initializeStaticMetadata()\n> +{\n> +       staticMetadata_ = std::make_unique<CameraMetadata>(64, 1024);\n> +       if (!staticMetadata_->isValid()) {\n> +               LOG(HAL, Error) << \"Failed to allocate static metadata\";\n> +               staticMetadata_.reset();\n> +               return -EINVAL;\n> +       }\n> +\n> +       const ControlInfoMap &controlsInfo = camera_->controls();\n> +       const ControlList &properties = camera_->properties();\n> +\n> +       /* Color correction static metadata. */\n> +       {\n> +               std::vector<uint8_t> data;\n> +               data.reserve(3);\n> +               const auto &infoMap =\n> controlsInfo.find(&controls::draft::ColorCorrectionAberrationMode);\n> +               if (infoMap != controlsInfo.end()) {\n> +                       for (const auto &value : infoMap->second.values())\n> +                               data.push_back(value.get<int32_t>());\n> +               } else {\n> +\n>  data.push_back(ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF);\n> +               }\n> +\n>  staticMetadata_->addEntry(ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> +                                         data);\n> +       }\n> +\n> +       /* Control static metadata. 
*/\n> +       std::vector<uint8_t> aeAvailableAntiBandingModes = {\n> +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_OFF,\n> +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_50HZ,\n> +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_60HZ,\n> +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO,\n> +       };\n> +\n>  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> +                                 aeAvailableAntiBandingModes);\n> +\n> +       std::vector<uint8_t> aeAvailableModes = {\n> +               ANDROID_CONTROL_AE_MODE_ON,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> +                                 aeAvailableModes);\n> +\n> +       int64_t minFrameDurationNsec = -1;\n> +       int64_t maxFrameDurationNsec = -1;\n> +       const auto frameDurationsInfo =\n> controlsInfo.find(&controls::FrameDurationLimits);\n> +       if (frameDurationsInfo != controlsInfo.end()) {\n> +               minFrameDurationNsec =\n> frameDurationsInfo->second.min().get<int64_t>() * 1000;\n> +               maxFrameDurationNsec =\n> frameDurationsInfo->second.max().get<int64_t>() * 1000;\n> +\n> +               /*\n> +                * Adjust the minimum frame duration to comply with Android\n> +                * requirements. 
The camera service mandates all\n preview/record\n> +                * streams to have a minimum frame duration < 33,366\n microseconds\n> +                * (see MAX_PREVIEW_RECORD_DURATION_NS in the camera\n service\n> +                * implementation).\n> +                *\n> +                * If we're close enough (+ 500 useconds) to that value,\n round\n> +                * the minimum frame duration of the camera to an accepted\n> +                * value.\n> +                */\n> +               static constexpr int64_t MAX_PREVIEW_RECORD_DURATION_NS =\n 1e9 / 29.97;\n> +               if (minFrameDurationNsec > MAX_PREVIEW_RECORD_DURATION_NS\n &&\n> +                   minFrameDurationNsec < MAX_PREVIEW_RECORD_DURATION_NS\n + 500000)\n> +                       minFrameDurationNsec =\n MAX_PREVIEW_RECORD_DURATION_NS - 1000;\n> +\n> +               /*\n> +                * The AE routine frame rate limits are computed using the\n frame\n> +                * duration limits, as libcamera clips the AE routine to\n the\n> +                * frame durations.\n> +                */\n> +               int32_t maxFps = std::round(1e9 / minFrameDurationNsec);\n> +               int32_t minFps = std::round(1e9 / maxFrameDurationNsec);\n> +               minFps = std::max(1, minFps);\n> +\n> +               /*\n> +                * Force rounding errors so that we have the proper frame\n> +                * durations for when we reuse these variables later\n> +                */\n> +               minFrameDurationNsec = 1e9 / maxFps;\n> +               maxFrameDurationNsec = 1e9 / minFps;\n> +\n> +               /*\n> +                * Register to the camera service {min, max} and {max, max}\n> +                * intervals as requested by the metadata documentation.\n> +                */\n> +               int32_t availableAeFpsTarget[] = {\n> +                       minFps, maxFps, maxFps, maxFps\n> +               };\n> +\n>  
staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> +                                         availableAeFpsTarget);\n> +       }\n> +\n> +       std::vector<int32_t> aeCompensationRange = {\n> +               0, 0,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> +                                 aeCompensationRange);\n> +\n> +       const camera_metadata_rational_t aeCompensationStep[] = {\n> +               { 0, 1 }\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> +                                 aeCompensationStep);\n> +\n> +       std::vector<uint8_t> availableAfModes = {\n> +               ANDROID_CONTROL_AF_MODE_OFF,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> +                                 availableAfModes);\n> +\n> +       std::vector<uint8_t> availableEffects = {\n> +               ANDROID_CONTROL_EFFECT_MODE_OFF,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> +                                 availableEffects);\n> +\n> +       std::vector<uint8_t> availableSceneModes = {\n> +               ANDROID_CONTROL_SCENE_MODE_DISABLED,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> +                                 availableSceneModes);\n> +\n> +       std::vector<uint8_t> availableStabilizationModes = {\n> +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE_OFF,\n> +       };\n> +\n>  staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> +                                 availableStabilizationModes);\n> +\n> +       /*\n> +        * \\todo Inspect the Camera capabilities to report the available\n> +        * AWB modes. 
Default to AUTO as CTS tests require it.\n> +        */\n> +       std::vector<uint8_t> availableAwbModes = {\n> +               ANDROID_CONTROL_AWB_MODE_AUTO,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> +                                 availableAwbModes);\n> +\n> +       std::vector<int32_t> availableMaxRegions = {\n> +               0, 0, 0,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n> +                                 availableMaxRegions);\n> +\n> +       std::vector<uint8_t> sceneModesOverride = {\n> +               ANDROID_CONTROL_AE_MODE_ON,\n> +               ANDROID_CONTROL_AWB_MODE_AUTO,\n> +               ANDROID_CONTROL_AF_MODE_OFF,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> +                                 sceneModesOverride);\n> +\n> +       uint8_t aeLockAvailable = ANDROID_CONTROL_AE_LOCK_AVAILABLE_FALSE;\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> +                                 aeLockAvailable);\n> +\n> +       uint8_t awbLockAvailable =\n> ANDROID_CONTROL_AWB_LOCK_AVAILABLE_FALSE;\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> +                                 awbLockAvailable);\n> +\n> +       char availableControlModes = ANDROID_CONTROL_MODE_AUTO;\n> +       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_MODES,\n> +                                 availableControlModes);\n> +\n> +       /* JPEG static metadata. 
*/\n> +\n> +       /*\n> +        * Create the list of supported thumbnail sizes by inspecting the\n> +        * available JPEG resolutions collected in streamConfigurations_\n and\n> +        * generate one entry for each aspect ratio.\n> +        *\n> +        * The JPEG thumbnailer can freely scale, so pick an arbitrary\n> +        * (160, 160) size as the bounding rectangle, which is then\n cropped to\n> +        * the different supported aspect ratios.\n> +        */\n> +       constexpr Size maxJpegThumbnail(160, 160);\n> +       std::vector<Size> thumbnailSizes;\n> +       thumbnailSizes.push_back({ 0, 0 });\n> +       for (const auto &entry : streamConfigurations_) {\n> +               if (entry.androidFormat != HAL_PIXEL_FORMAT_BLOB)\n> +                       continue;\n> +\n> +               Size thumbnailSize = maxJpegThumbnail\n> +                                    .boundedToAspectRatio({\n entry.resolution.width,\n> +\n entry.resolution.height });\n> +               thumbnailSizes.push_back(thumbnailSize);\n> +       }\n> +\n> +       std::sort(thumbnailSizes.begin(), thumbnailSizes.end());\n> +       auto last = std::unique(thumbnailSizes.begin(),\n thumbnailSizes.end());\n> +       thumbnailSizes.erase(last, thumbnailSizes.end());\n> +\n> +       /* Transform sizes into a list of integers that can be consumed.\n */\n> +       std::vector<int32_t> thumbnailEntries;\n> +       thumbnailEntries.reserve(thumbnailSizes.size() * 2);\n> +       for (const auto &size : thumbnailSizes) {\n> +               thumbnailEntries.push_back(size.width);\n> +               thumbnailEntries.push_back(size.height);\n> +       }\n> +       staticMetadata_->addEntry(ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> +                                 thumbnailEntries);\n> +\n> +       staticMetadata_->addEntry(ANDROID_JPEG_MAX_SIZE,\n maxJpegBufferSize_);\n> +\n> +       /* Sensor static metadata. 
*/\n> +       std::array<int32_t, 2> pixelArraySize;\n> +       {\n> +               const Size &size =\n> properties.get(properties::PixelArraySize);\n> +               pixelArraySize[0] = size.width;\n> +               pixelArraySize[1] = size.height;\n> +\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> +                                         pixelArraySize);\n> +       }\n> +\n> +       if (properties.contains(properties::UnitCellSize)) {\n> +               const Size &cellSize =\n> properties.get<Size>(properties::UnitCellSize);\n> +               std::array<float, 2> physicalSize{\n> +                       cellSize.width * pixelArraySize[0] / 1e6f,\n> +                       cellSize.height * pixelArraySize[1] / 1e6f\n> +               };\n> +\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> +                                         physicalSize);\n> +       }\n> +\n> +       {\n> +               const Span<const Rectangle> &rects =\n> +                       properties.get(properties::PixelArrayActiveAreas);\n> +               std::vector<int32_t> data{\n> +                       static_cast<int32_t>(rects[0].x),\n> +                       static_cast<int32_t>(rects[0].y),\n> +                       static_cast<int32_t>(rects[0].width),\n> +                       static_cast<int32_t>(rects[0].height),\n> +               };\n> +\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> +                                         data);\n> +       }\n> +\n> +       int32_t sensitivityRange[] = {\n> +               32, 2400,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> +                                 sensitivityRange);\n> +\n> +       /* Report the color filter arrangement if the camera reports it. 
*/\n> +       if\n> (properties.contains(properties::draft::ColorFilterArrangement)) {\n> +               uint8_t filterArr =\n> properties.get(properties::draft::ColorFilterArrangement);\n> +\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> +                                         filterArr);\n> +       }\n> +\n> +       const auto &exposureInfo =\n> controlsInfo.find(&controls::ExposureTime);\n> +       if (exposureInfo != controlsInfo.end()) {\n> +               int64_t exposureTimeRange[2] = {\n> +                       exposureInfo->second.min().get<int32_t>() * 1000LL,\n> +                       exposureInfo->second.max().get<int32_t>() * 1000LL,\n> +               };\n> +\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> +                                         exposureTimeRange, 2);\n> +       }\n> +\n> +       staticMetadata_->addEntry(ANDROID_SENSOR_ORIENTATION,\n> orientation_);\n> +\n> +       std::vector<int32_t> testPatternModes = {\n> +               ANDROID_SENSOR_TEST_PATTERN_MODE_OFF\n> +       };\n> +       const auto &testPatternsInfo =\n> +               controlsInfo.find(&controls::draft::TestPatternMode);\n> +       if (testPatternsInfo != controlsInfo.end()) {\n> +               const auto &values = testPatternsInfo->second.values();\n> +               ASSERT(!values.empty());\n> +               for (const auto &value : values) {\n> +                       switch (value.get<int32_t>()) {\n> +                       case controls::draft::TestPatternModeOff:\n> +                               /*\n> +                                * ANDROID_SENSOR_TEST_PATTERN_MODE_OFF is\n> +                                * already in testPatternModes.\n> +                                */\n> +                               break;\n> +\n> +                       case controls::draft::TestPatternModeSolidColor:\n> +                               testPatternModes.push_back(\n> +\n>  
ANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR);\n> +                               break;\n> +\n> +                       case controls::draft::TestPatternModeColorBars:\n> +                               testPatternModes.push_back(\n> +\n>  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS);\n> +                               break;\n> +\n> +                       case\n> controls::draft::TestPatternModeColorBarsFadeToGray:\n> +                               testPatternModes.push_back(\n> +\n>  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS_FADE_TO_GRAY);\n> +                               break;\n> +\n> +                       case controls::draft::TestPatternModePn9:\n> +                               testPatternModes.push_back(\n> +\n>  ANDROID_SENSOR_TEST_PATTERN_MODE_PN9);\n> +                               break;\n> +\n> +                       case controls::draft::TestPatternModeCustom1:\n> +                               /* We don't support this yet. */\n> +                               break;\n> +\n> +                       default:\n> +                               LOG(HAL, Error) << \"Unknown test pattern\n> mode: \"\n> +                                               << value.get<int32_t>();\n> +                               continue;\n> +                       }\n> +               }\n> +       }\n> +\n>  staticMetadata_->addEntry(ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> +                                 testPatternModes);\n> +\n> +       uint8_t timestampSource =\n> ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN;\n> +       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> +                                 timestampSource);\n> +\n> +       if (maxFrameDurationNsec > 0)\n> +\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> +                                         maxFrameDurationNsec);\n> +\n> +       /* Statistics static metadata. 
*/\n> +       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> +\n>  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> +                                 faceDetectMode);\n> +\n> +       int32_t maxFaceCount = 0;\n> +       staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> +                                 maxFaceCount);\n> +\n> +       {\n> +               std::vector<uint8_t> data;\n> +               data.reserve(2);\n> +               const auto &infoMap =\n> controlsInfo.find(&controls::draft::LensShadingMapMode);\n> +               if (infoMap != controlsInfo.end()) {\n> +                       for (const auto &value : infoMap->second.values())\n> +                               data.push_back(value.get<int32_t>());\n> +               } else {\n> +\n>  data.push_back(ANDROID_STATISTICS_LENS_SHADING_MAP_MODE_OFF);\n> +               }\n> +\n>  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_LENS_SHADING_MAP_MODES,\n> +                                         data);\n> +       }\n> +\n> +       /* Sync static metadata. */\n> +       int32_t maxLatency = ANDROID_SYNC_MAX_LATENCY_UNKNOWN;\n> +       staticMetadata_->addEntry(ANDROID_SYNC_MAX_LATENCY, maxLatency);\n> +\n> +       /* Flash static metadata. */\n> +       char flashAvailable = ANDROID_FLASH_INFO_AVAILABLE_FALSE;\n> +       staticMetadata_->addEntry(ANDROID_FLASH_INFO_AVAILABLE,\n> +                                 flashAvailable);\n> +\n> +       /* Lens static metadata. 
*/\n> +       std::vector<float> lensApertures = {\n> +               2.53 / 100,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> +                                 lensApertures);\n> +\n> +       uint8_t lensFacing;\n> +       switch (facing_) {\n> +       default:\n> +       case CAMERA_FACING_FRONT:\n> +               lensFacing = ANDROID_LENS_FACING_FRONT;\n> +               break;\n> +       case CAMERA_FACING_BACK:\n> +               lensFacing = ANDROID_LENS_FACING_BACK;\n> +               break;\n> +       case CAMERA_FACING_EXTERNAL:\n> +               lensFacing = ANDROID_LENS_FACING_EXTERNAL;\n> +               break;\n> +       }\n> +       staticMetadata_->addEntry(ANDROID_LENS_FACING, lensFacing);\n> +\n> +       std::vector<float> lensFocalLengths = {\n> +               1,\n> +       };\n> +\n>  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> +                                 lensFocalLengths);\n> +\n> +       std::vector<uint8_t> opticalStabilizations = {\n> +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF,\n> +       };\n> +\n>  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> +                                 opticalStabilizations);\n> +\n> +       float hyperFocalDistance = 0;\n> +       staticMetadata_->addEntry(ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> +                                 hyperFocalDistance);\n> +\n> +       float minFocusDistance = 0;\n> +       staticMetadata_->addEntry(ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> +                                 minFocusDistance);\n> +\n> +       /* Noise reduction modes. 
*/\n> +       {\n> +               std::vector<uint8_t> data;\n> +               data.reserve(5);\n> +               const auto &infoMap =\n> controlsInfo.find(&controls::draft::NoiseReductionMode);\n> +               if (infoMap != controlsInfo.end()) {\n> +                       for (const auto &value : infoMap->second.values())\n> +                               data.push_back(value.get<int32_t>());\n> +               } else {\n> +                       data.push_back(ANDROID_NOISE_REDUCTION_MODE_OFF);\n> +               }\n> +\n>  staticMetadata_->addEntry(ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> +                                         data);\n> +       }\n> +\n> +       /* Scaler static metadata. */\n> +\n> +       /*\n> +        * \\todo The digital zoom factor is a property that depends on the\n> +        * desired output configuration and the sensor frame size input to\n> the\n> +        * ISP. This information is not available to the Android HAL, not\n> at\n> +        * initialization time at least.\n> +        *\n> +        * As a workaround rely on pipeline handlers initializing the\n> +        * ScalerCrop control with the camera default configuration and\n> use the\n> +        * maximum and minimum crop rectangles to calculate the digital\n> zoom\n> +        * factor.\n> +        */\n> +       float maxZoom = 1.0f;\n> +       const auto scalerCrop = controlsInfo.find(&controls::ScalerCrop);\n> +       if (scalerCrop != controlsInfo.end()) {\n> +               Rectangle min = scalerCrop->second.min().get<Rectangle>();\n> +               Rectangle max = scalerCrop->second.max().get<Rectangle>();\n> +               maxZoom = std::min(1.0f * max.width / min.width,\n> +                                  1.0f * max.height / min.height);\n> +       }\n> +\n>  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> +                                 maxZoom);\n> +\n> +       std::vector<uint32_t> 
availableStreamConfigurations;\n> +       availableStreamConfigurations.reserve(streamConfigurations_.size()\n> * 4);\n> +       for (const auto &entry : streamConfigurations_) {\n> +\n>  availableStreamConfigurations.push_back(entry.androidFormat);\n> +\n>  availableStreamConfigurations.push_back(entry.resolution.width);\n> +\n>  availableStreamConfigurations.push_back(entry.resolution.height);\n> +               availableStreamConfigurations.push_back(\n> +\n>  ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT);\n> +       }\n> +\n>  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> +                                 availableStreamConfigurations);\n> +\n> +       std::vector<int64_t> availableStallDurations = {\n> +               ANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920,\n> 33333333,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> +                                 availableStallDurations);\n> +\n> +       /* Use the minimum frame duration for all the YUV/RGB formats. */\n> +       if (minFrameDurationNsec > 0) {\n> +               std::vector<int64_t> minFrameDurations;\n> +               minFrameDurations.reserve(streamConfigurations_.size() *\n> 4);\n> +               for (const auto &entry : streamConfigurations_) {\n> +                       minFrameDurations.push_back(entry.androidFormat);\n> +\n>  minFrameDurations.push_back(entry.resolution.width);\n> +\n>  minFrameDurations.push_back(entry.resolution.height);\n> +                       minFrameDurations.push_back(minFrameDurationNsec);\n> +               }\n> +\n>  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> +                                         minFrameDurations);\n> +       }\n> +\n> +       uint8_t croppingType = ANDROID_SCALER_CROPPING_TYPE_CENTER_ONLY;\n> +       staticMetadata_->addEntry(ANDROID_SCALER_CROPPING_TYPE,\n> croppingType);\n> +\n> +       /* Info static metadata. 
*/\n> +       uint8_t supportedHWLevel =\n> ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;\n> +       staticMetadata_->addEntry(ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> +                                 supportedHWLevel);\n> +\n> +       /* Request static metadata. */\n> +       int32_t partialResultCount = 1;\n> +       staticMetadata_->addEntry(ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> +                                 partialResultCount);\n> +\n> +       {\n> +               /* Default the value to 2 if not reported by the camera. */\n> +               uint8_t maxPipelineDepth = 2;\n> +               const auto &infoMap =\n> controlsInfo.find(&controls::draft::PipelineDepth);\n> +               if (infoMap != controlsInfo.end())\n> +                       maxPipelineDepth =\n> infoMap->second.max().get<int32_t>();\n> +\n>  staticMetadata_->addEntry(ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> +                                         maxPipelineDepth);\n> +       }\n> +\n> +       /* LIMITED does not support reprocessing. */\n> +       uint32_t maxNumInputStreams = 0;\n> +       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> +                                 maxNumInputStreams);\n> +\n> +       std::vector<uint8_t> availableCapabilities = {\n> +               ANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE,\n> +       };\n> +\n> +       /* Report if camera supports RAW. */\n> +       bool rawStreamAvailable = false;\n> +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> +               camera_->generateConfiguration({ StreamRole::Raw });\n> +       if (cameraConfig && !cameraConfig->empty()) {\n> +               const PixelFormatInfo &info =\n> +\n>  PixelFormatInfo::info(cameraConfig->at(0).pixelFormat);\n> +               /* Only advertise RAW support if RAW16 is possible. 
*/\n> +               if (info.colourEncoding ==\n> PixelFormatInfo::ColourEncodingRAW &&\n> +                   info.bitsPerPixel == 16) {\n> +                       rawStreamAvailable = true;\n> +\n>  availableCapabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);\n> +               }\n> +       }\n> +\n> +       /* Number of { RAW, YUV, JPEG } supported output streams */\n> +       int32_t numOutStreams[] = { rawStreamAvailable, 2, 1 };\n> +       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> +                                 numOutStreams);\n> +\n> +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> +                                 availableCapabilities);\n> +\n> +       std::vector<int32_t> availableCharacteristicsKeys = {\n> +               ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> +               ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> +               ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> +               ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> +               ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> +               ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> +               ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> +               ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> +               ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> +               ANDROID_CONTROL_AVAILABLE_MODES,\n> +               ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> +               ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> +               ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> +               ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> +               ANDROID_CONTROL_MAX_REGIONS,\n> +               ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> +               ANDROID_FLASH_INFO_AVAILABLE,\n> +               ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> +               ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> +               ANDROID_JPEG_MAX_SIZE,\n> +               ANDROID_LENS_FACING,\n> +           
    ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> +               ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> +               ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> +               ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> +               ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> +               ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> +               ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> +               ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> +               ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> +               ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> +               ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> +               ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> +               ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> +               ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> +               ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> +               ANDROID_SCALER_CROPPING_TYPE,\n> +               ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> +               ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> +               ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> +               ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> +               ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> +               ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> +               ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> +               ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> +               ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> +               ANDROID_SENSOR_ORIENTATION,\n> +               ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> +               ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> +               ANDROID_SYNC_MAX_LATENCY,\n> +       };\n> +\n>  staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CHARACTERISTICS_KEYS,\n> +                                 availableCharacteristicsKeys);\n> +\n> +       std::vector<int32_t> availableRequestKeys = {\n> +               
ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> +               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> +               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> +               ANDROID_CONTROL_AE_LOCK,\n> +               ANDROID_CONTROL_AE_MODE,\n> +               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> +               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> +               ANDROID_CONTROL_AF_MODE,\n> +               ANDROID_CONTROL_AF_TRIGGER,\n> +               ANDROID_CONTROL_AWB_LOCK,\n> +               ANDROID_CONTROL_AWB_MODE,\n> +               ANDROID_CONTROL_CAPTURE_INTENT,\n> +               ANDROID_CONTROL_EFFECT_MODE,\n> +               ANDROID_CONTROL_MODE,\n> +               ANDROID_CONTROL_SCENE_MODE,\n> +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> +               ANDROID_FLASH_MODE,\n> +               ANDROID_JPEG_ORIENTATION,\n> +               ANDROID_JPEG_QUALITY,\n> +               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> +               ANDROID_JPEG_THUMBNAIL_SIZE,\n> +               ANDROID_LENS_APERTURE,\n> +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> +               ANDROID_NOISE_REDUCTION_MODE,\n> +               ANDROID_SCALER_CROP_REGION,\n> +               ANDROID_STATISTICS_FACE_DETECT_MODE\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_REQUEST_KEYS,\n> +                                 availableRequestKeys);\n> +\n> +       std::vector<int32_t> availableResultKeys = {\n> +               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> +               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> +               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> +               ANDROID_CONTROL_AE_LOCK,\n> +               ANDROID_CONTROL_AE_MODE,\n> +               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> +               ANDROID_CONTROL_AE_STATE,\n> +               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> +               ANDROID_CONTROL_AF_MODE,\n> +               ANDROID_CONTROL_AF_STATE,\n> +   
            ANDROID_CONTROL_AF_TRIGGER,\n> +               ANDROID_CONTROL_AWB_LOCK,\n> +               ANDROID_CONTROL_AWB_MODE,\n> +               ANDROID_CONTROL_AWB_STATE,\n> +               ANDROID_CONTROL_CAPTURE_INTENT,\n> +               ANDROID_CONTROL_EFFECT_MODE,\n> +               ANDROID_CONTROL_MODE,\n> +               ANDROID_CONTROL_SCENE_MODE,\n> +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> +               ANDROID_FLASH_MODE,\n> +               ANDROID_FLASH_STATE,\n> +               ANDROID_JPEG_GPS_COORDINATES,\n> +               ANDROID_JPEG_GPS_PROCESSING_METHOD,\n> +               ANDROID_JPEG_GPS_TIMESTAMP,\n> +               ANDROID_JPEG_ORIENTATION,\n> +               ANDROID_JPEG_QUALITY,\n> +               ANDROID_JPEG_SIZE,\n> +               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> +               ANDROID_JPEG_THUMBNAIL_SIZE,\n> +               ANDROID_LENS_APERTURE,\n> +               ANDROID_LENS_FOCAL_LENGTH,\n> +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> +               ANDROID_LENS_STATE,\n> +               ANDROID_NOISE_REDUCTION_MODE,\n> +               ANDROID_REQUEST_PIPELINE_DEPTH,\n> +               ANDROID_SCALER_CROP_REGION,\n> +               ANDROID_SENSOR_EXPOSURE_TIME,\n> +               ANDROID_SENSOR_FRAME_DURATION,\n> +               ANDROID_SENSOR_ROLLING_SHUTTER_SKEW,\n> +               ANDROID_SENSOR_TEST_PATTERN_MODE,\n> +               ANDROID_SENSOR_TIMESTAMP,\n> +               ANDROID_STATISTICS_FACE_DETECT_MODE,\n> +               ANDROID_STATISTICS_LENS_SHADING_MAP_MODE,\n> +               ANDROID_STATISTICS_HOT_PIXEL_MAP_MODE,\n> +               ANDROID_STATISTICS_SCENE_FLICKER,\n> +       };\n> +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_RESULT_KEYS,\n> +                                 availableResultKeys);\n> +\n> +       if (!staticMetadata_->isValid()) {\n> +               LOG(HAL, Error) << \"Failed to construct static metadata\";\n> +               
staticMetadata_.reset();\n> +               return -EINVAL;\n> +       }\n> +\n> +       if (staticMetadata_->resized()) {\n> +               auto [entryCount, dataCount] = staticMetadata_->usage();\n> +               LOG(HAL, Info)\n> +                       << \"Static metadata resized: \" << entryCount\n> +                       << \" entries and \" << dataCount << \" bytes used\";\n> +       }\n> +\n> +       return 0;\n> +}\n> +\n> +/* Translate Android format code to libcamera pixel format. */\n> +PixelFormat CameraCapabilities::toPixelFormat(int format) const\n> +{\n> +       auto it = formatsMap_.find(format);\n> +       if (it == formatsMap_.end()) {\n> +               LOG(HAL, Error) << \"Requested format \" <<\n> utils::hex(format)\n> +                               << \" not supported\";\n> +               return PixelFormat();\n> +       }\n> +\n> +       return it->second;\n> +}\n> +\n> +std::unique_ptr<CameraMetadata>\n> CameraCapabilities::requestTemplatePreview() const\n> +{\n> +       /*\n> +        * \\todo Keep this in sync with the actual number of entries.\n> +        * Currently: 20 entries, 35 bytes\n> +        */\n> +       auto requestTemplate = std::make_unique<CameraMetadata>(21, 36);\n> +       if (!requestTemplate->isValid()) {\n> +               return nullptr;\n> +       }\n> +\n> +       /* Get the FPS range registered in the static metadata. 
*/\n> +       camera_metadata_ro_entry_t entry;\n> +       bool found =\n> staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> +                                              &entry);\n> +       if (!found) {\n> +               LOG(HAL, Error) << \"Cannot create capture template without FPS range\";\n> +               return nullptr;\n> +       }\n> +\n> +       /*\n> +        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> +        * has been assembled as {{min, max} {max, max}}.\n> +        */\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> +                                 entry.data.i32, 2);\n> +\n> +       uint8_t aeMode = ANDROID_CONTROL_AE_MODE_ON;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AE_MODE, aeMode);\n> +\n> +       int32_t aeExposureCompensation = 0;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> +                                 aeExposureCompensation);\n> +\n> +       uint8_t aePrecaptureTrigger =\n> ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_IDLE;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> +                                 aePrecaptureTrigger);\n> +\n> +       uint8_t aeLock = ANDROID_CONTROL_AE_LOCK_OFF;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AE_LOCK, aeLock);\n> +\n> +       uint8_t aeAntibandingMode =\n> ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> +                                 aeAntibandingMode);\n> +\n> +       uint8_t afMode = ANDROID_CONTROL_AF_MODE_OFF;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AF_MODE, afMode);\n> +\n> +       uint8_t afTrigger = ANDROID_CONTROL_AF_TRIGGER_IDLE;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AF_TRIGGER, afTrigger);\n> +\n> +       uint8_t awbMode = ANDROID_CONTROL_AWB_MODE_AUTO;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AWB_MODE, awbMode);\n> 
+\n> +       uint8_t awbLock = ANDROID_CONTROL_AWB_LOCK_OFF;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_AWB_LOCK, awbLock);\n> +\n> +       uint8_t flashMode = ANDROID_FLASH_MODE_OFF;\n> +       requestTemplate->addEntry(ANDROID_FLASH_MODE, flashMode);\n> +\n> +       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> +       requestTemplate->addEntry(ANDROID_STATISTICS_FACE_DETECT_MODE,\n> +                                 faceDetectMode);\n> +\n> +       uint8_t noiseReduction = ANDROID_NOISE_REDUCTION_MODE_OFF;\n> +       requestTemplate->addEntry(ANDROID_NOISE_REDUCTION_MODE,\n> +                                 noiseReduction);\n> +\n> +       uint8_t aberrationMode =\n> ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF;\n> +       requestTemplate->addEntry(ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> +                                 aberrationMode);\n> +\n> +       uint8_t controlMode = ANDROID_CONTROL_MODE_AUTO;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_MODE, controlMode);\n> +\n> +       float lensAperture = 2.53 / 100;\n> +       requestTemplate->addEntry(ANDROID_LENS_APERTURE, lensAperture);\n> +\n> +       uint8_t opticalStabilization =\n> ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF;\n> +       requestTemplate->addEntry(ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> +                                 opticalStabilization);\n> +\n> +       uint8_t captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> +       requestTemplate->addEntry(ANDROID_CONTROL_CAPTURE_INTENT,\n> +                                 captureIntent);\n> +\n> +       return requestTemplate;\n> +}\n> +\n> +std::unique_ptr<CameraMetadata>\n> CameraCapabilities::requestTemplateVideo() const\n> +{\n> +       std::unique_ptr<CameraMetadata> previewTemplate =\n> requestTemplatePreview();\n> +       if (!previewTemplate)\n> +               return nullptr;\n> +\n> +       /*\n> +        * The video template requires a fixed FPS range. 
Everything else\n> +        * stays the same as the preview template.\n> +        */\n> +       camera_metadata_ro_entry_t entry;\n> +\n>  staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> +                                 &entry);\n> +\n> +       /*\n> +        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> +        * has been assembled as {{min, max} {max, max}}.\n> +        */\n> +       previewTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> +                                    entry.data.i32 + 2, 2);\n> +\n> +       return previewTemplate;\n> +}\n> diff --git a/src/android/camera_capabilities.h\n> b/src/android/camera_capabilities.h\n> new file mode 100644\n> index 000000000000..f511607bbd90\n> --- /dev/null\n> +++ b/src/android/camera_capabilities.h\n> @@ -0,0 +1,65 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2021, Google Inc.\n> + *\n> + * camera_capabilities.h - Camera static properties manager\n> + */\n> +#ifndef __ANDROID_CAMERA_CAPABILITIES_H__\n> +#define __ANDROID_CAMERA_CAPABILITIES_H__\n> +\n> +#include <map>\n> +#include <memory>\n> +#include <vector>\n> +\n> +#include <libcamera/camera.h>\n> +#include <libcamera/class.h>\n> +#include <libcamera/formats.h>\n> +#include <libcamera/geometry.h>\n> +\n> +#include \"camera_metadata.h\"\n> +\n> +class CameraCapabilities\n> +{\n> +public:\n> +       CameraCapabilities() = default;\n> +\n> +       int initialize(std::shared_ptr<libcamera::Camera> camera,\n> +                      int orientation, int facing);\n> +\n> +       CameraMetadata *staticMetadata() const { return\n> staticMetadata_.get(); }\n> +       libcamera::PixelFormat toPixelFormat(int format) const;\n> +       unsigned int maxJpegBufferSize() const { return\n> maxJpegBufferSize_; }\n> +\n> +       std::unique_ptr<CameraMetadata> requestTemplatePreview() const;\n> +       std::unique_ptr<CameraMetadata> requestTemplateVideo() const;\n> +\n> 
+private:\n> +       LIBCAMERA_DISABLE_COPY_AND_MOVE(CameraCapabilities)\n> +\n> +       struct Camera3StreamConfiguration {\n> +               libcamera::Size resolution;\n> +               int androidFormat;\n> +       };\n> +\n> +       std::vector<libcamera::Size>\n> +       getYUVResolutions(libcamera::CameraConfiguration *cameraConfig,\n> +                         const libcamera::PixelFormat &pixelFormat,\n> +                         const std::vector<libcamera::Size> &resolutions);\n> +       std::vector<libcamera::Size>\n> +       getRawResolutions(const libcamera::PixelFormat &pixelFormat);\n> +       int initializeStreamConfigurations();\n> +\n> +       int initializeStaticMetadata();\n> +\n> +       std::shared_ptr<libcamera::Camera> camera_;\n> +\n> +       int facing_;\n> +       int orientation_;\n> +\n> +       std::vector<Camera3StreamConfiguration> streamConfigurations_;\n> +       std::map<int, libcamera::PixelFormat> formatsMap_;\n> +       std::unique_ptr<CameraMetadata> staticMetadata_;\n> +       unsigned int maxJpegBufferSize_;\n> +};\n> +\n> +#endif /* __ANDROID_CAMERA_CAPABILITIES_H__ */\n> diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp\n> index 8c71fd0675d3..4bd125d7020a 100644\n> --- a/src/android/camera_device.cpp\n> +++ b/src/android/camera_device.cpp\n> @@ -10,11 +10,8 @@\n>  #include \"camera_ops.h\"\n>  #include \"post_processor.h\"\n>\n> -#include <array>\n> -#include <cmath>\n>  #include <fstream>\n>  #include <sys/mman.h>\n> -#include <tuple>\n>  #include <unistd.h>\n>  #include <vector>\n>\n> @@ -23,7 +20,6 @@\n>  #include <libcamera/formats.h>\n>  #include <libcamera/property_ids.h>\n>\n> -#include \"libcamera/internal/formats.h\"\n>  #include \"libcamera/internal/log.h\"\n>  #include \"libcamera/internal/thread.h\"\n>  #include \"libcamera/internal/utils.h\"\n> @@ -36,94 +32,6 @@ LOG_DECLARE_CATEGORY(HAL)\n>\n>  namespace {\n>\n> -/*\n> - * \\var camera3Resolutions\n> - * \\brief The list of image 
resolutions defined as mandatory to be\n> supported by\n> - * the Android Camera3 specification\n> - */\n> -const std::vector<Size> camera3Resolutions = {\n> -       { 320, 240 },\n> -       { 640, 480 },\n> -       { 1280, 720 },\n> -       { 1920, 1080 }\n> -};\n> -\n> -/*\n> - * \\struct Camera3Format\n> - * \\brief Data associated with an Android format identifier\n> - * \\var libcameraFormats List of libcamera pixel formats compatible with\n> the\n> - * Android format\n> - * \\var name The human-readable representation of the Android format code\n> - */\n> -struct Camera3Format {\n> -       std::vector<PixelFormat> libcameraFormats;\n> -       bool mandatory;\n> -       const char *name;\n> -};\n> -\n> -/*\n> - * \\var camera3FormatsMap\n> - * \\brief Associate Android format code with ancillary data\n> - */\n> -const std::map<int, const Camera3Format> camera3FormatsMap = {\n> -       {\n> -               HAL_PIXEL_FORMAT_BLOB, {\n> -                       { formats::MJPEG },\n> -                       true,\n> -                       \"BLOB\"\n> -               }\n> -       }, {\n> -               HAL_PIXEL_FORMAT_YCbCr_420_888, {\n> -                       { formats::NV12, formats::NV21 },\n> -                       true,\n> -                       \"YCbCr_420_888\"\n> -               }\n> -       }, {\n> -               /*\n> -                * \\todo Translate IMPLEMENTATION_DEFINED inspecting the\n> gralloc\n> -                * usage flag. 
For now, copy the YCbCr_420 configuration.\n> -                */\n> -               HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> -                       { formats::NV12, formats::NV21 },\n> -                       true,\n> -                       \"IMPLEMENTATION_DEFINED\"\n> -               }\n> -       }, {\n> -               HAL_PIXEL_FORMAT_RAW10, {\n> -                       {\n> -                               formats::SBGGR10_CSI2P,\n> -                               formats::SGBRG10_CSI2P,\n> -                               formats::SGRBG10_CSI2P,\n> -                               formats::SRGGB10_CSI2P\n> -                       },\n> -                       false,\n> -                       \"RAW10\"\n> -               }\n> -       }, {\n> -               HAL_PIXEL_FORMAT_RAW12, {\n> -                       {\n> -                               formats::SBGGR12_CSI2P,\n> -                               formats::SGBRG12_CSI2P,\n> -                               formats::SGRBG12_CSI2P,\n> -                               formats::SRGGB12_CSI2P\n> -                       },\n> -                       false,\n> -                       \"RAW12\"\n> -               }\n> -       }, {\n> -               HAL_PIXEL_FORMAT_RAW16, {\n> -                       {\n> -                               formats::SBGGR16,\n> -                               formats::SGBRG16,\n> -                               formats::SGRBG16,\n> -                               formats::SRGGB16\n> -                       },\n> -                       false,\n> -                       \"RAW16\"\n> -               }\n> -       },\n> -};\n> -\n>  /*\n>   * \\struct Camera3StreamConfig\n>   * \\brief Data to store StreamConfiguration associated with\n> camera3_stream(s)\n> @@ -512,242 +420,7 @@ int CameraDevice::initialize(const CameraConfigData\n> *cameraConfigData)\n>                 orientation_ = 0;\n>         }\n>\n> -       /* Acquire the camera and initialize available stream\n> 
configurations. */\n> -       int ret = camera_->acquire();\n> -       if (ret) {\n> -               LOG(HAL, Error) << \"Failed to temporarily acquire the\n> camera\";\n> -               return ret;\n> -       }\n> -\n> -       ret = initializeStreamConfigurations();\n> -       camera_->release();\n> -       return ret;\n> -}\n> -\n> -std::vector<Size> CameraDevice::getYUVResolutions(CameraConfiguration\n> *cameraConfig,\n> -                                                 const PixelFormat\n> &pixelFormat,\n> -                                                 const std::vector<Size>\n> &resolutions)\n> -{\n> -       std::vector<Size> supportedResolutions;\n> -\n> -       StreamConfiguration &cfg = cameraConfig->at(0);\n> -       for (const Size &res : resolutions) {\n> -               cfg.pixelFormat = pixelFormat;\n> -               cfg.size = res;\n> -\n> -               CameraConfiguration::Status status =\n> cameraConfig->validate();\n> -               if (status != CameraConfiguration::Valid) {\n> -                       LOG(HAL, Debug) << cfg.toString() << \" not\n> supported\";\n> -                       continue;\n> -               }\n> -\n> -               LOG(HAL, Debug) << cfg.toString() << \" supported\";\n> -\n> -               supportedResolutions.push_back(res);\n> -       }\n> -\n> -       return supportedResolutions;\n> -}\n> -\n> -std::vector<Size> CameraDevice::getRawResolutions(const\n> libcamera::PixelFormat &pixelFormat)\n> -{\n> -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> -               camera_->generateConfiguration({ StreamRole::Raw });\n> -       StreamConfiguration &cfg = cameraConfig->at(0);\n> -       const StreamFormats &formats = cfg.formats();\n> -       std::vector<Size> supportedResolutions =\n> formats.sizes(pixelFormat);\n> -\n> -       return supportedResolutions;\n> -}\n> -\n> -/*\n> - * Initialize the format conversion map to translate from Android format\n> - * identifier to libcamera pixel formats and 
fill in the list of supported\n> - * stream configurations to be reported to the Android camera framework through\n> - * the static stream configuration metadata.\n> - */\n> -int CameraDevice::initializeStreamConfigurations()\n> -{\n> -       /*\n> -        * Get the maximum output resolutions\n> -        * \todo Get this from the camera properties once defined\n> -        */\n> -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> -               camera_->generateConfiguration({ StillCapture });\n> -       if (!cameraConfig) {\n> -               LOG(HAL, Error) << \"Failed to get maximum resolution\";\n> -               return -EINVAL;\n> -       }\n> -       StreamConfiguration &cfg = cameraConfig->at(0);\n> -\n> -       /*\n> -        * \todo JPEG - Adjust the maximum available resolution by taking the\n> -        * JPEG encoder requirements into account (alignment and aspect ratio).\n> -        */\n> -       const Size maxRes = cfg.size;\n> -       LOG(HAL, Debug) << \"Maximum supported resolution: \" << maxRes.toString();\n> -\n> -       /*\n> -        * Build the list of supported image resolutions.\n> -        *\n> -        * The resolutions listed in camera3Resolution are mandatory to be\n> -        * supported, up to the camera maximum resolution.\n> -        *\n> -        * Augment the list by adding resolutions calculated from the camera\n> -        * maximum one.\n> -        */\n> -       std::vector<Size> cameraResolutions;\n> -       std::copy_if(camera3Resolutions.begin(), camera3Resolutions.end(),\n> -                    std::back_inserter(cameraResolutions),\n> -                    [&](const Size &res) { return res < maxRes; });\n> -\n> -       /*\n> -        * The Camera3 specification suggests adding 1/2 and 1/4 of the maximum\n> -        * resolution.\n> -        */\n> -       for (unsigned int divider = 2;; divider <<= 1) {\n> -               Size derivedSize{\n> -                       maxRes.width / divider,\n> -                       maxRes.height / divider,\n> -               };\n> -\n> -               if (derivedSize.width < 320 ||\n> -                   derivedSize.height < 240)\n> -                       break;\n> -\n> -               cameraResolutions.push_back(derivedSize);\n> -       }\n> -       cameraResolutions.push_back(maxRes);\n> -\n> -       /* Remove duplicated entries from the list of supported resolutions. */\n> -       std::sort(cameraResolutions.begin(), cameraResolutions.end());\n> -       auto last = std::unique(cameraResolutions.begin(), cameraResolutions.end());\n> -       cameraResolutions.erase(last, cameraResolutions.end());\n> -\n> -       /*\n> -        * Build the list of supported camera formats.\n> -        *\n> -        * To each Android format a list of compatible libcamera formats is\n> -        * associated. The first libcamera format that tests successful is added\n> -        * to the format translation map used when configuring the streams.\n> -        * It is then tested against the list of supported camera resolutions to\n> -        * build the stream configuration map reported through the camera static\n> -        * metadata.\n> -        */\n> -       Size maxJpegSize;\n> -       for (const auto &format : camera3FormatsMap) {\n> -               int androidFormat = format.first;\n> -               const Camera3Format &camera3Format = format.second;\n> -               const std::vector<PixelFormat> &libcameraFormats =\n> -                       camera3Format.libcameraFormats;\n> -\n> -               LOG(HAL, Debug) << \"Trying to map Android format \"\n> -                               << camera3Format.name;\n> -\n> -               /*\n> -                * JPEG is always supported, either produced directly by the\n> -                * camera, or encoded in the HAL.\n> -                */\n> -               if (androidFormat == HAL_PIXEL_FORMAT_BLOB) {\n> -                       formatsMap_[androidFormat] = formats::MJPEG;\n> -                       LOG(HAL, Debug) << \"Mapped Android format \"\n> -                                       << camera3Format.name << \" to \"\n> -                                       << formats::MJPEG.toString()\n> -                                       << \" (fixed mapping)\";\n> -                       continue;\n> -               }\n> -\n> -               /*\n> -                * Test the libcamera formats that can produce images\n> -                * compatible with the format defined by Android.\n> -                */\n> -               PixelFormat mappedFormat;\n> -               for (const PixelFormat &pixelFormat : libcameraFormats) {\n> -\n> -                       LOG(HAL, Debug) << \"Testing \" << pixelFormat.toString();\n> -\n> -                       /*\n> -                        * The stream configuration size can be adjusted,\n> -                        * not the pixel format.\n> -                        *\n> -                        * \todo This could be simplified once all pipeline\n> -                        * handlers will report the StreamFormats list of\n> -                        * supported formats.\n> -                        */\n> -                       cfg.pixelFormat = pixelFormat;\n> -\n> -                       CameraConfiguration::Status status = cameraConfig->validate();\n> -                       if (status != CameraConfiguration::Invalid &&\n> -                           cfg.pixelFormat == pixelFormat) {\n> -                               mappedFormat = pixelFormat;\n> -                               break;\n> -                       }\n> -               }\n> -\n> -               if (!mappedFormat.isValid()) {\n> -                       /* If the format is not mandatory, skip it. */\n> -                       if (!camera3Format.mandatory)\n> -                               continue;\n> -\n> -                       LOG(HAL, Error)\n> -                               << \"Failed to map mandatory Android format \"\n> -                               << camera3Format.name << \" (\"\n> -                               << utils::hex(androidFormat) << \"): aborting\";\n> -                       return -EINVAL;\n> -               }\n> -\n> -               /*\n> -                * Record the mapping and then proceed to generate the\n> -                * stream configurations map, by testing the image resolutions.\n> -                */\n> -               formatsMap_[androidFormat] = mappedFormat;\n> -               LOG(HAL, Debug) << \"Mapped Android format \"\n> -                               << camera3Format.name << \" to \"\n> -                               << mappedFormat.toString();\n> -\n> -               std::vector<Size> resolutions;\n> -               const PixelFormatInfo &info = PixelFormatInfo::info(mappedFormat);\n> -               if (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW)\n> -                       resolutions = getRawResolutions(mappedFormat);\n> -               else\n> -                       resolutions = getYUVResolutions(cameraConfig.get(),\n> -                                                       mappedFormat,\n> -                                                       cameraResolutions);\n> -\n> -               for (const Size &res : resolutions) {\n> -                       streamConfigurations_.push_back({ res, androidFormat });\n> -\n> -                       /*\n> -                        * If the format is HAL_PIXEL_FORMAT_YCbCr_420_888\n> -                        * from which JPEG is produced, add an entry for\n> -                        * the JPEG stream.\n> -                        *\n> -                        * \todo Wire the JPEG encoder to query the supported\n> -                        * sizes provided a list of formats it can encode.\n> -                        *\n> -                        * \todo Support JPEG streams produced by the Camera\n> -                        * natively.\n> -                        */\n> -                       if (androidFormat == HAL_PIXEL_FORMAT_YCbCr_420_888) {\n> -                               streamConfigurations_.push_back(\n> -                                       { res, HAL_PIXEL_FORMAT_BLOB });\n> -                               maxJpegSize = std::max(maxJpegSize, res);\n> -                       }\n> -               }\n> -\n> -               /*\n> -                * \todo Calculate the maximum JPEG buffer size by asking the\n> -                * encoder giving the maximum frame size required.\n> -                */\n> -               maxJpegBufferSize_ = maxJpegSize.width * maxJpegSize.height * 1.5;\n> -       }\n> -\n> -       LOG(HAL, Debug) << \"Collected stream configuration map: \";\n> -       for (const auto &entry : streamConfigurations_)\n> -               LOG(HAL, Debug) << \"{ \" << entry.resolution.toString() << \" - \"\n> -                               << utils::hex(entry.androidFormat) << \" }\";\n> -\n> -       return 0;\n> +       return capabilities_.initialize(camera_, orientation_, facing_);\n>  }\n>\n>  /*\n> @@ -817,802 +490,19 @@ void CameraDevice::stop()\n>         state_ = State::Stopped;\n>  }\n>\n> -void CameraDevice::setCallbacks(const camera3_callback_ops_t *callbacks)\n> +unsigned int CameraDevice::maxJpegBufferSize() const\n>  {\n> -       callbacks_ = callbacks;\n> +       return capabilities_.maxJpegBufferSize();\n>  }\n>\n> -/*\n> - * Return static information for the camera.\n> - */\n> -const camera_metadata_t *CameraDevice::getStaticMetadata()\n> -{\n> -       if (staticMetadata_)\n> -               return staticMetadata_->get();\n> -\n> -       staticMetadata_ = std::make_unique<CameraMetadata>(64, 1024);\n> -       if 
(!staticMetadata_->isValid()) {\n> -               LOG(HAL, Error) << \"Failed to allocate static metadata\";\n> -               staticMetadata_.reset();\n> -               return nullptr;\n> -       }\n> -\n> -       const ControlInfoMap &controlsInfo = camera_->controls();\n> -       const ControlList &properties = camera_->properties();\n> -\n> -       /* Color correction static metadata. */\n> -       {\n> -               std::vector<uint8_t> data;\n> -               data.reserve(3);\n> -               const auto &infoMap = controlsInfo.find(&controls::draft::ColorCorrectionAberrationMode);\n> -               if (infoMap != controlsInfo.end()) {\n> -                       for (const auto &value : infoMap->second.values())\n> -                               data.push_back(value.get<int32_t>());\n> -               } else {\n> -                       data.push_back(ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF);\n> -               }\n> -               staticMetadata_->addEntry(ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> -                                         data);\n> -       }\n> -\n> -       /* Control static metadata. */\n> -       std::vector<uint8_t> aeAvailableAntiBandingModes = {\n> -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_OFF,\n> -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_50HZ,\n> -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_60HZ,\n> -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> -                                 aeAvailableAntiBandingModes);\n> -\n> -       std::vector<uint8_t> aeAvailableModes = {\n> -               ANDROID_CONTROL_AE_MODE_ON,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> -                                 aeAvailableModes);\n> -\n> -       int64_t minFrameDurationNsec = -1;\n> -       int64_t maxFrameDurationNsec = -1;\n> -       const auto frameDurationsInfo = controlsInfo.find(&controls::FrameDurationLimits);\n> -       if (frameDurationsInfo != controlsInfo.end()) {\n> -               minFrameDurationNsec = frameDurationsInfo->second.min().get<int64_t>() * 1000;\n> -               maxFrameDurationNsec = frameDurationsInfo->second.max().get<int64_t>() * 1000;\n> -\n> -               /*\n> -                * Adjust the minimum frame duration to comply with Android\n> -                * requirements. The camera service mandates all preview/record\n> -                * streams to have a minimum frame duration < 33,366,667\n> -                * nanoseconds (~33.37 ms, see MAX_PREVIEW_RECORD_DURATION_NS\n> -                * in the camera service implementation).\n> -                *\n> -                * If we're close enough (+ 500 useconds) to that value, round\n> -                * the minimum frame duration of the camera to an accepted\n> -                * value.\n> -                */\n> -               static constexpr int64_t MAX_PREVIEW_RECORD_DURATION_NS = 1e9 / 29.97;\n> -               if (minFrameDurationNsec > MAX_PREVIEW_RECORD_DURATION_NS &&\n> -                   minFrameDurationNsec < MAX_PREVIEW_RECORD_DURATION_NS + 500000)\n> -                       minFrameDurationNsec = MAX_PREVIEW_RECORD_DURATION_NS - 1000;\n> -\n> -               /*\n> -                * The AE routine frame rate limits are computed using the frame\n> -                * duration limits, as libcamera clips the AE routine to the\n> -                * frame durations.\n> -                */\n> -               int32_t maxFps = std::round(1e9 / minFrameDurationNsec);\n> -               int32_t minFps = std::round(1e9 / maxFrameDurationNsec);\n> -               minFps = std::max(1, minFps);\n> -\n> -               /*\n> -                * Force rounding errors so that we have the proper frame\n> -                * durations for when we reuse these variables later\n> -                */\n> -               minFrameDurationNsec = 1e9 / maxFps;\n> -               maxFrameDurationNsec = 1e9 / minFps;\n> -\n> -               /*\n> -                * Register to the camera service {min, max} and {max, max}\n> -                * intervals as requested by the metadata documentation.\n> -                */\n> -               int32_t availableAeFpsTarget[] = {\n> -                       minFps, maxFps, maxFps, maxFps\n> -               };\n> -\n>  
staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> -                                         availableAeFpsTarget);\n> -       }\n> -\n> -       std::vector<int32_t> aeCompensationRange = {\n> -               0, 0,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> -                                 aeCompensationRange);\n> -\n> -       const camera_metadata_rational_t aeCompensationStep[] = {\n> -               { 0, 1 }\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> -                                 aeCompensationStep);\n> -\n> -       std::vector<uint8_t> availableAfModes = {\n> -               ANDROID_CONTROL_AF_MODE_OFF,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> -                                 availableAfModes);\n> -\n> -       std::vector<uint8_t> availableEffects = {\n> -               ANDROID_CONTROL_EFFECT_MODE_OFF,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> -                                 availableEffects);\n> -\n> -       std::vector<uint8_t> availableSceneModes = {\n> -               ANDROID_CONTROL_SCENE_MODE_DISABLED,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> -                                 availableSceneModes);\n> -\n> -       std::vector<uint8_t> availableStabilizationModes = {\n> -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE_OFF,\n> -       };\n> -\n>  staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> -                                 availableStabilizationModes);\n> -\n> -       /*\n> -        * \\todo Inspect the Camera capabilities to report the available\n> -        * AWB modes. 
Default to AUTO as CTS tests require it.\n> -        */\n> -       std::vector<uint8_t> availableAwbModes = {\n> -               ANDROID_CONTROL_AWB_MODE_AUTO,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> -                                 availableAwbModes);\n> -\n> -       std::vector<int32_t> availableMaxRegions = {\n> -               0, 0, 0,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n> -                                 availableMaxRegions);\n> -\n> -       std::vector<uint8_t> sceneModesOverride = {\n> -               ANDROID_CONTROL_AE_MODE_ON,\n> -               ANDROID_CONTROL_AWB_MODE_AUTO,\n> -               ANDROID_CONTROL_AF_MODE_OFF,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> -                                 sceneModesOverride);\n> -\n> -       uint8_t aeLockAvailable = ANDROID_CONTROL_AE_LOCK_AVAILABLE_FALSE;\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> -                                 aeLockAvailable);\n> -\n> -       uint8_t awbLockAvailable =\n> ANDROID_CONTROL_AWB_LOCK_AVAILABLE_FALSE;\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> -                                 awbLockAvailable);\n> -\n> -       char availableControlModes = ANDROID_CONTROL_MODE_AUTO;\n> -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_MODES,\n> -                                 availableControlModes);\n> -\n> -       /* JPEG static metadata. 
*/\n> -\n> -       /*\n> -        * Create the list of supported thumbnail sizes by inspecting the\n> -        * available JPEG resolutions collected in streamConfigurations_\n> and\n> -        * generate one entry for each aspect ratio.\n> -        *\n> -        * The JPEG thumbnailer can freely scale, so pick an arbitrary\n> -        * (160, 160) size as the bounding rectangle, which is then\n> cropped to\n> -        * the different supported aspect ratios.\n> -        */\n> -       constexpr Size maxJpegThumbnail(160, 160);\n> -       std::vector<Size> thumbnailSizes;\n> -       thumbnailSizes.push_back({ 0, 0 });\n> -       for (const auto &entry : streamConfigurations_) {\n> -               if (entry.androidFormat != HAL_PIXEL_FORMAT_BLOB)\n> -                       continue;\n> -\n> -               Size thumbnailSize = maxJpegThumbnail\n> -                                    .boundedToAspectRatio({\n> entry.resolution.width,\n> -\n> entry.resolution.height });\n> -               thumbnailSizes.push_back(thumbnailSize);\n> -       }\n> -\n> -       std::sort(thumbnailSizes.begin(), thumbnailSizes.end());\n> -       auto last = std::unique(thumbnailSizes.begin(),\n> thumbnailSizes.end());\n> -       thumbnailSizes.erase(last, thumbnailSizes.end());\n> -\n> -       /* Transform sizes in to a list of integers that can be consumed.\n> */\n> -       std::vector<int32_t> thumbnailEntries;\n> -       thumbnailEntries.reserve(thumbnailSizes.size() * 2);\n> -       for (const auto &size : thumbnailSizes) {\n> -               thumbnailEntries.push_back(size.width);\n> -               thumbnailEntries.push_back(size.height);\n> -       }\n> -       staticMetadata_->addEntry(ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> -                                 thumbnailEntries);\n> -\n> -       staticMetadata_->addEntry(ANDROID_JPEG_MAX_SIZE,\n> maxJpegBufferSize_);\n> -\n> -       /* Sensor static metadata. 
*/\n> -       std::array<int32_t, 2> pixelArraySize;\n> -       {\n> -               const Size &size =\n> properties.get(properties::PixelArraySize);\n> -               pixelArraySize[0] = size.width;\n> -               pixelArraySize[1] = size.height;\n> -\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> -                                         pixelArraySize);\n> -       }\n> -\n> -       if (properties.contains(properties::UnitCellSize)) {\n> -               const Size &cellSize =\n> properties.get<Size>(properties::UnitCellSize);\n> -               std::array<float, 2> physicalSize{\n> -                       cellSize.width * pixelArraySize[0] / 1e6f,\n> -                       cellSize.height * pixelArraySize[1] / 1e6f\n> -               };\n> -\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> -                                         physicalSize);\n> -       }\n> -\n> -       {\n> -               const Span<const Rectangle> &rects =\n> -                       properties.get(properties::PixelArrayActiveAreas);\n> -               std::vector<int32_t> data{\n> -                       static_cast<int32_t>(rects[0].x),\n> -                       static_cast<int32_t>(rects[0].y),\n> -                       static_cast<int32_t>(rects[0].width),\n> -                       static_cast<int32_t>(rects[0].height),\n> -               };\n> -\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> -                                         data);\n> -       }\n> -\n> -       int32_t sensitivityRange[] = {\n> -               32, 2400,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> -                                 sensitivityRange);\n> -\n> -       /* Report the color filter arrangement if the camera reports it. 
*/\n> -       if\n> (properties.contains(properties::draft::ColorFilterArrangement)) {\n> -               uint8_t filterArr =\n> properties.get(properties::draft::ColorFilterArrangement);\n> -\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> -                                         filterArr);\n> -       }\n> -\n> -       const auto &exposureInfo =\n> controlsInfo.find(&controls::ExposureTime);\n> -       if (exposureInfo != controlsInfo.end()) {\n> -               int64_t exposureTimeRange[2] = {\n> -                       exposureInfo->second.min().get<int32_t>() * 1000LL,\n> -                       exposureInfo->second.max().get<int32_t>() * 1000LL,\n> -               };\n> -\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> -                                         exposureTimeRange, 2);\n> -       }\n> -\n> -       staticMetadata_->addEntry(ANDROID_SENSOR_ORIENTATION,\n> orientation_);\n> -\n> -       std::vector<int32_t> testPatternModes = {\n> -               ANDROID_SENSOR_TEST_PATTERN_MODE_OFF\n> -       };\n> -       const auto &testPatternsInfo =\n> -               controlsInfo.find(&controls::draft::TestPatternMode);\n> -       if (testPatternsInfo != controlsInfo.end()) {\n> -               const auto &values = testPatternsInfo->second.values();\n> -               ASSERT(!values.empty());\n> -               for (const auto &value : values) {\n> -                       switch (value.get<int32_t>()) {\n> -                       case controls::draft::TestPatternModeOff:\n> -                               /*\n> -                                * ANDROID_SENSOR_TEST_PATTERN_MODE_OFF is\n> -                                * already in testPatternModes.\n> -                                */\n> -                               break;\n> -\n> -                       case controls::draft::TestPatternModeSolidColor:\n> -                               testPatternModes.push_back(\n> -\n>  
ANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR);\n> -                               break;\n> -\n> -                       case controls::draft::TestPatternModeColorBars:\n> -                               testPatternModes.push_back(\n> -\n>  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS);\n> -                               break;\n> -\n> -                       case\n> controls::draft::TestPatternModeColorBarsFadeToGray:\n> -                               testPatternModes.push_back(\n> -\n>  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS_FADE_TO_GRAY);\n> -                               break;\n> -\n> -                       case controls::draft::TestPatternModePn9:\n> -                               testPatternModes.push_back(\n> -\n>  ANDROID_SENSOR_TEST_PATTERN_MODE_PN9);\n> -                               break;\n> -\n> -                       case controls::draft::TestPatternModeCustom1:\n> -                               /* We don't support this yet. */\n> -                               break;\n> -\n> -                       default:\n> -                               LOG(HAL, Error) << \"Unknown test pattern\n> mode: \"\n> -                                               << value.get<int32_t>();\n> -                               continue;\n> -                       }\n> -               }\n> -       }\n> -\n>  staticMetadata_->addEntry(ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> -                                 testPatternModes);\n> -\n> -       uint8_t timestampSource =\n> ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN;\n> -       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> -                                 timestampSource);\n> -\n> -       if (maxFrameDurationNsec > 0)\n> -\n>  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> -                                         maxFrameDurationNsec);\n> -\n> -       /* Statistics static metadata. 
*/\n> -       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> -\n>  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> -                                 faceDetectMode);\n> -\n> -       int32_t maxFaceCount = 0;\n> -       staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> -                                 maxFaceCount);\n> -\n> -       {\n> -               std::vector<uint8_t> data;\n> -               data.reserve(2);\n> -               const auto &infoMap =\n> controlsInfo.find(&controls::draft::LensShadingMapMode);\n> -               if (infoMap != controlsInfo.end()) {\n> -                       for (const auto &value : infoMap->second.values())\n> -                               data.push_back(value.get<int32_t>());\n> -               } else {\n> -\n>  data.push_back(ANDROID_STATISTICS_LENS_SHADING_MAP_MODE_OFF);\n> -               }\n> -\n>  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_LENS_SHADING_MAP_MODES,\n> -                                         data);\n> -       }\n> -\n> -       /* Sync static metadata. */\n> -       int32_t maxLatency = ANDROID_SYNC_MAX_LATENCY_UNKNOWN;\n> -       staticMetadata_->addEntry(ANDROID_SYNC_MAX_LATENCY, maxLatency);\n> -\n> -       /* Flash static metadata. */\n> -       char flashAvailable = ANDROID_FLASH_INFO_AVAILABLE_FALSE;\n> -       staticMetadata_->addEntry(ANDROID_FLASH_INFO_AVAILABLE,\n> -                                 flashAvailable);\n> -\n> -       /* Lens static metadata. 
*/\n> -       std::vector<float> lensApertures = {\n> -               2.53 / 100,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> -                                 lensApertures);\n> -\n> -       uint8_t lensFacing;\n> -       switch (facing_) {\n> -       default:\n> -       case CAMERA_FACING_FRONT:\n> -               lensFacing = ANDROID_LENS_FACING_FRONT;\n> -               break;\n> -       case CAMERA_FACING_BACK:\n> -               lensFacing = ANDROID_LENS_FACING_BACK;\n> -               break;\n> -       case CAMERA_FACING_EXTERNAL:\n> -               lensFacing = ANDROID_LENS_FACING_EXTERNAL;\n> -               break;\n> -       }\n> -       staticMetadata_->addEntry(ANDROID_LENS_FACING, lensFacing);\n> -\n> -       std::vector<float> lensFocalLengths = {\n> -               1,\n> -       };\n> -\n>  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> -                                 lensFocalLengths);\n> -\n> -       std::vector<uint8_t> opticalStabilizations = {\n> -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF,\n> -       };\n> -\n>  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> -                                 opticalStabilizations);\n> -\n> -       float hypeFocalDistance = 0;\n> -       staticMetadata_->addEntry(ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> -                                 hypeFocalDistance);\n> -\n> -       float minFocusDistance = 0;\n> -       staticMetadata_->addEntry(ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> -                                 minFocusDistance);\n> -\n> -       /* Noise reduction modes. 
*/\n> -       {\n> -               std::vector<uint8_t> data;\n> -               data.reserve(5);\n> -               const auto &infoMap = controlsInfo.find(&controls::draft::NoiseReductionMode);\n> -               if (infoMap != controlsInfo.end()) {\n> -                       for (const auto &value : infoMap->second.values())\n> -                               data.push_back(value.get<int32_t>());\n> -               } else {\n> -                       data.push_back(ANDROID_NOISE_REDUCTION_MODE_OFF);\n> -               }\n> -               staticMetadata_->addEntry(ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> -                                         data);\n> -       }\n> -\n> -       /* Scaler static metadata. */\n> -\n> -       /*\n> -        * \todo The digital zoom factor is a property that depends on the\n> -        * desired output configuration and the sensor frame size input to the\n> -        * ISP. This information is not available to the Android HAL, not at\n> -        * initialization time at least.\n> -        *\n> -        * As a workaround rely on pipeline handlers initializing the\n> -        * ScalerCrop control with the camera default configuration and use the\n> -        * maximum and minimum crop rectangles to calculate the digital zoom\n> -        * factor.\n> -        */\n> -       float maxZoom = 1.0f;\n> -       const auto scalerCrop = controlsInfo.find(&controls::ScalerCrop);\n> -       if (scalerCrop != controlsInfo.end()) {\n> -               Rectangle min = scalerCrop->second.min().get<Rectangle>();\n> -               Rectangle max = scalerCrop->second.max().get<Rectangle>();\n> -               maxZoom = std::min(1.0f * max.width / min.width,\n> -                                  1.0f * max.height / min.height);\n> -       }\n> -       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> -                                 maxZoom);\n> -\n> -       std::vector<uint32_t> 
availableStreamConfigurations;\n> -       availableStreamConfigurations.reserve(streamConfigurations_.size()\n> * 4);\n> -       for (const auto &entry : streamConfigurations_) {\n> -\n>  availableStreamConfigurations.push_back(entry.androidFormat);\n> -\n>  availableStreamConfigurations.push_back(entry.resolution.width);\n> -\n>  availableStreamConfigurations.push_back(entry.resolution.height);\n> -               availableStreamConfigurations.push_back(\n> -\n>  ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT);\n> -       }\n> -\n>  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> -                                 availableStreamConfigurations);\n> -\n> -       std::vector<int64_t> availableStallDurations = {\n> -               ANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920,\n> 33333333,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> -                                 availableStallDurations);\n> -\n> -       /* Use the minimum frame duration for all the YUV/RGB formats. */\n> -       if (minFrameDurationNsec > 0) {\n> -               std::vector<int64_t> minFrameDurations;\n> -               minFrameDurations.reserve(streamConfigurations_.size() *\n> 4);\n> -               for (const auto &entry : streamConfigurations_) {\n> -                       minFrameDurations.push_back(entry.androidFormat);\n> -\n>  minFrameDurations.push_back(entry.resolution.width);\n> -\n>  minFrameDurations.push_back(entry.resolution.height);\n> -                       minFrameDurations.push_back(minFrameDurationNsec);\n> -               }\n> -\n>  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> -                                         minFrameDurations);\n> -       }\n> -\n> -       uint8_t croppingType = ANDROID_SCALER_CROPPING_TYPE_CENTER_ONLY;\n> -       staticMetadata_->addEntry(ANDROID_SCALER_CROPPING_TYPE,\n> croppingType);\n> -\n> -       /* Info static metadata. 
*/\n> -       uint8_t supportedHWLevel =\n> ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;\n> -       staticMetadata_->addEntry(ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> -                                 supportedHWLevel);\n> -\n> -       /* Request static metadata. */\n> -       int32_t partialResultCount = 1;\n> -       staticMetadata_->addEntry(ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> -                                 partialResultCount);\n> -\n> -       {\n> -               /* Default the value to 2 if not reported by the camera. */\n> -               uint8_t maxPipelineDepth = 2;\n> -               const auto &infoMap =\n> controlsInfo.find(&controls::draft::PipelineDepth);\n> -               if (infoMap != controlsInfo.end())\n> -                       maxPipelineDepth =\n> infoMap->second.max().get<int32_t>();\n> -\n>  staticMetadata_->addEntry(ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> -                                         maxPipelineDepth);\n> -       }\n> -\n> -       /* LIMITED does not support reprocessing. */\n> -       uint32_t maxNumInputStreams = 0;\n> -       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> -                                 maxNumInputStreams);\n> -\n> -       std::vector<uint8_t> availableCapabilities = {\n> -               ANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE,\n> -       };\n> -\n> -       /* Report if camera supports RAW. */\n> -       bool rawStreamAvailable = false;\n> -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> -               camera_->generateConfiguration({ StreamRole::Raw });\n> -       if (cameraConfig && !cameraConfig->empty()) {\n> -               const PixelFormatInfo &info =\n> -\n>  PixelFormatInfo::info(cameraConfig->at(0).pixelFormat);\n> -               /* Only advertise RAW support if RAW16 is possible. 
*/\n> -               if (info.colourEncoding ==\n> PixelFormatInfo::ColourEncodingRAW &&\n> -                   info.bitsPerPixel == 16) {\n> -                       rawStreamAvailable = true;\n> -\n>  availableCapabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);\n> -               }\n> -       }\n> -\n> -       /* Number of { RAW, YUV, JPEG } supported output streams */\n> -       int32_t numOutStreams[] = { rawStreamAvailable, 2, 1 };\n> -       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> -                                 numOutStreams);\n> -\n> -       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> -                                 availableCapabilities);\n> -\n> -       std::vector<int32_t> availableCharacteristicsKeys = {\n> -               ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> -               ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> -               ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> -               ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> -               ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> -               ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> -               ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> -               ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> -               ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> -               ANDROID_CONTROL_AVAILABLE_MODES,\n> -               ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> -               ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> -               ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> -               ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> -               ANDROID_CONTROL_MAX_REGIONS,\n> -               ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> -               ANDROID_FLASH_INFO_AVAILABLE,\n> -               ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> -               ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> -               ANDROID_JPEG_MAX_SIZE,\n> -               ANDROID_LENS_FACING,\n> -           
    ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> -               ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> -               ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> -               ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> -               ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> -               ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> -               ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> -               ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> -               ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> -               ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> -               ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> -               ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> -               ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> -               ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> -               ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> -               ANDROID_SCALER_CROPPING_TYPE,\n> -               ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> -               ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> -               ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> -               ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> -               ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> -               ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> -               ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> -               ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> -               ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> -               ANDROID_SENSOR_ORIENTATION,\n> -               ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> -               ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> -               ANDROID_SYNC_MAX_LATENCY,\n> -       };\n> -\n>  staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CHARACTERISTICS_KEYS,\n> -                                 availableCharacteristicsKeys);\n> -\n> -       std::vector<int32_t> availableRequestKeys = {\n> -               
ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> -               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> -               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> -               ANDROID_CONTROL_AE_LOCK,\n> -               ANDROID_CONTROL_AE_MODE,\n> -               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> -               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> -               ANDROID_CONTROL_AF_MODE,\n> -               ANDROID_CONTROL_AF_TRIGGER,\n> -               ANDROID_CONTROL_AWB_LOCK,\n> -               ANDROID_CONTROL_AWB_MODE,\n> -               ANDROID_CONTROL_CAPTURE_INTENT,\n> -               ANDROID_CONTROL_EFFECT_MODE,\n> -               ANDROID_CONTROL_MODE,\n> -               ANDROID_CONTROL_SCENE_MODE,\n> -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> -               ANDROID_FLASH_MODE,\n> -               ANDROID_JPEG_ORIENTATION,\n> -               ANDROID_JPEG_QUALITY,\n> -               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> -               ANDROID_JPEG_THUMBNAIL_SIZE,\n> -               ANDROID_LENS_APERTURE,\n> -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> -               ANDROID_NOISE_REDUCTION_MODE,\n> -               ANDROID_SCALER_CROP_REGION,\n> -               ANDROID_STATISTICS_FACE_DETECT_MODE\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_REQUEST_KEYS,\n> -                                 availableRequestKeys);\n> -\n> -       std::vector<int32_t> availableResultKeys = {\n> -               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> -               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> -               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> -               ANDROID_CONTROL_AE_LOCK,\n> -               ANDROID_CONTROL_AE_MODE,\n> -               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> -               ANDROID_CONTROL_AE_STATE,\n> -               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> -               ANDROID_CONTROL_AF_MODE,\n> -               ANDROID_CONTROL_AF_STATE,\n> -   
            ANDROID_CONTROL_AF_TRIGGER,\n> -               ANDROID_CONTROL_AWB_LOCK,\n> -               ANDROID_CONTROL_AWB_MODE,\n> -               ANDROID_CONTROL_AWB_STATE,\n> -               ANDROID_CONTROL_CAPTURE_INTENT,\n> -               ANDROID_CONTROL_EFFECT_MODE,\n> -               ANDROID_CONTROL_MODE,\n> -               ANDROID_CONTROL_SCENE_MODE,\n> -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> -               ANDROID_FLASH_MODE,\n> -               ANDROID_FLASH_STATE,\n> -               ANDROID_JPEG_GPS_COORDINATES,\n> -               ANDROID_JPEG_GPS_PROCESSING_METHOD,\n> -               ANDROID_JPEG_GPS_TIMESTAMP,\n> -               ANDROID_JPEG_ORIENTATION,\n> -               ANDROID_JPEG_QUALITY,\n> -               ANDROID_JPEG_SIZE,\n> -               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> -               ANDROID_JPEG_THUMBNAIL_SIZE,\n> -               ANDROID_LENS_APERTURE,\n> -               ANDROID_LENS_FOCAL_LENGTH,\n> -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> -               ANDROID_LENS_STATE,\n> -               ANDROID_NOISE_REDUCTION_MODE,\n> -               ANDROID_REQUEST_PIPELINE_DEPTH,\n> -               ANDROID_SCALER_CROP_REGION,\n> -               ANDROID_SENSOR_EXPOSURE_TIME,\n> -               ANDROID_SENSOR_FRAME_DURATION,\n> -               ANDROID_SENSOR_ROLLING_SHUTTER_SKEW,\n> -               ANDROID_SENSOR_TEST_PATTERN_MODE,\n> -               ANDROID_SENSOR_TIMESTAMP,\n> -               ANDROID_STATISTICS_FACE_DETECT_MODE,\n> -               ANDROID_STATISTICS_LENS_SHADING_MAP_MODE,\n> -               ANDROID_STATISTICS_HOT_PIXEL_MAP_MODE,\n> -               ANDROID_STATISTICS_SCENE_FLICKER,\n> -       };\n> -       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_RESULT_KEYS,\n> -                                 availableResultKeys);\n> -\n> -       if (!staticMetadata_->isValid()) {\n> -               LOG(HAL, Error) << \"Failed to construct static metadata\";\n> -               
staticMetadata_.reset();\n> -               return nullptr;\n> -       }\n> -\n> -       if (staticMetadata_->resized()) {\n> -               auto [entryCount, dataCount] = staticMetadata_->usage();\n> -               LOG(HAL, Info)\n> -                       << \"Static metadata resized: \" << entryCount\n> -                       << \" entries and \" << dataCount << \" bytes used\";\n> -       }\n> -\n> -       return staticMetadata_->get();\n> -}\n> -\n> -std::unique_ptr<CameraMetadata> CameraDevice::requestTemplatePreview()\n> +void CameraDevice::setCallbacks(const camera3_callback_ops_t *callbacks)\n>  {\n> -       /*\n> -        * \\todo Keep this in sync with the actual number of entries.\n> -        * Currently: 20 entries, 35 bytes\n> -        */\n> -       auto requestTemplate = std::make_unique<CameraMetadata>(21, 36);\n> -       if (!requestTemplate->isValid()) {\n> -               return nullptr;\n> -       }\n> -\n> -       /* Get the FPS range registered in the static metadata. 
*/\n> -       camera_metadata_ro_entry_t entry;\n> -       bool found =\n> staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> -                                              &entry);\n> -       if (!found) {\n> -               LOG(HAL, Error) << \"Cannot create capture template without\n> FPS range\";\n> -               return nullptr;\n> -       }\n> -\n> -       /*\n> -        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> -        * has been assembled as {{min, max} {max, max}}.\n> -        */\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> -                                 entry.data.i32, 2);\n> -\n> -       uint8_t aeMode = ANDROID_CONTROL_AE_MODE_ON;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AE_MODE, aeMode);\n> -\n> -       int32_t aeExposureCompensation = 0;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> -                                 aeExposureCompensation);\n> -\n> -       uint8_t aePrecaptureTrigger =\n> ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_IDLE;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> -                                 aePrecaptureTrigger);\n> -\n> -       uint8_t aeLock = ANDROID_CONTROL_AE_LOCK_OFF;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AE_LOCK, aeLock);\n> -\n> -       uint8_t aeAntibandingMode =\n> ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> -                                 aeAntibandingMode);\n> -\n> -       uint8_t afMode = ANDROID_CONTROL_AF_MODE_OFF;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AF_MODE, afMode);\n> -\n> -       uint8_t afTrigger = ANDROID_CONTROL_AF_TRIGGER_IDLE;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AF_TRIGGER, afTrigger);\n> -\n> -       uint8_t awbMode = ANDROID_CONTROL_AWB_MODE_AUTO;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AWB_MODE, awbMode);\n> 
-\n> -       uint8_t awbLock = ANDROID_CONTROL_AWB_LOCK_OFF;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_AWB_LOCK, awbLock);\n> -\n> -       uint8_t flashMode = ANDROID_FLASH_MODE_OFF;\n> -       requestTemplate->addEntry(ANDROID_FLASH_MODE, flashMode);\n> -\n> -       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> -       requestTemplate->addEntry(ANDROID_STATISTICS_FACE_DETECT_MODE,\n> -                                 faceDetectMode);\n> -\n> -       uint8_t noiseReduction = ANDROID_NOISE_REDUCTION_MODE_OFF;\n> -       requestTemplate->addEntry(ANDROID_NOISE_REDUCTION_MODE,\n> -                                 noiseReduction);\n> -\n> -       uint8_t aberrationMode =\n> ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF;\n> -       requestTemplate->addEntry(ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> -                                 aberrationMode);\n> -\n> -       uint8_t controlMode = ANDROID_CONTROL_MODE_AUTO;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_MODE, controlMode);\n> -\n> -       float lensAperture = 2.53 / 100;\n> -       requestTemplate->addEntry(ANDROID_LENS_APERTURE, lensAperture);\n> -\n> -       uint8_t opticalStabilization =\n> ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF;\n> -       requestTemplate->addEntry(ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> -                                 opticalStabilization);\n> -\n> -       uint8_t captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> -       requestTemplate->addEntry(ANDROID_CONTROL_CAPTURE_INTENT,\n> -                                 captureIntent);\n> -\n> -       return requestTemplate;\n> +       callbacks_ = callbacks;\n>  }\n>\n> -std::unique_ptr<CameraMetadata> CameraDevice::requestTemplateVideo()\n> +const camera_metadata_t *CameraDevice::getStaticMetadata()\n>  {\n> -       std::unique_ptr<CameraMetadata> previewTemplate =\n> requestTemplatePreview();\n> -       if (!previewTemplate)\n> -               return nullptr;\n> -\n> -       /*\n> -    
    * The video template requires a fixed FPS range. Everything else\n> -        * stays the same as the preview template.\n> -        */\n> -       camera_metadata_ro_entry_t entry;\n> -\n>  staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> -                                 &entry);\n> -\n> -       /*\n> -        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> -        * has been assembled as {{min, max} {max, max}}.\n> -        */\n> -       previewTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> -                                    entry.data.i32 + 2, 2);\n> -\n> -       return previewTemplate;\n> +       return capabilities_.staticMetadata()->get();\n>  }\n>\n>  /*\n> @@ -1630,7 +520,7 @@ const camera_metadata_t\n> *CameraDevice::constructDefaultRequestSettings(int type)\n>         switch (type) {\n>         case CAMERA3_TEMPLATE_PREVIEW:\n>                 captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> -               requestTemplate = requestTemplatePreview();\n> +               requestTemplate = capabilities_.requestTemplatePreview();\n>                 break;\n>         case CAMERA3_TEMPLATE_STILL_CAPTURE:\n>                 /*\n> @@ -1638,15 +528,15 @@ const camera_metadata_t\n> *CameraDevice::constructDefaultRequestSettings(int type)\n>                  * for the torch mode we currently do not support.\n>                  */\n>                 captureIntent =\n> ANDROID_CONTROL_CAPTURE_INTENT_STILL_CAPTURE;\n> -               requestTemplate = requestTemplatePreview();\n> +               requestTemplate = capabilities_.requestTemplatePreview();\n>                 break;\n>         case CAMERA3_TEMPLATE_VIDEO_RECORD:\n>                 captureIntent =\n> ANDROID_CONTROL_CAPTURE_INTENT_VIDEO_RECORD;\n> -               requestTemplate = requestTemplateVideo();\n> +               requestTemplate = capabilities_.requestTemplateVideo();\n>                 break;\n>         case 
CAMERA3_TEMPLATE_VIDEO_SNAPSHOT:\n>                 captureIntent =\n> ANDROID_CONTROL_CAPTURE_INTENT_VIDEO_SNAPSHOT;\n> -               requestTemplate = requestTemplateVideo();\n> +               requestTemplate = capabilities_.requestTemplateVideo();\n>                 break;\n>         /* \\todo Implement templates generation for the remaining use\n> cases. */\n>         case CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG:\n> @@ -1668,19 +558,6 @@ const camera_metadata_t\n> *CameraDevice::constructDefaultRequestSettings(int type)\n>         return requestTemplates_[type]->get();\n>  }\n>\n> -PixelFormat CameraDevice::toPixelFormat(int format) const\n> -{\n> -       /* Translate Android format code to libcamera pixel format. */\n> -       auto it = formatsMap_.find(format);\n> -       if (it == formatsMap_.end()) {\n> -               LOG(HAL, Error) << \"Requested format \" <<\n> utils::hex(format)\n> -                               << \" not supported\";\n> -               return PixelFormat();\n> -       }\n> -\n> -       return it->second;\n> -}\n> -\n>  /*\n>   * Inspect the stream_list to produce a list of StreamConfiguration to\n>   * be use to configure the Camera.\n> @@ -1727,7 +604,7 @@ int\n> CameraDevice::configureStreams(camera3_stream_configuration_t *stream_list)\n>                 camera3_stream_t *stream = stream_list->streams[i];\n>                 Size size(stream->width, stream->height);\n>\n> -               PixelFormat format = toPixelFormat(stream->format);\n> +               PixelFormat format =\n> capabilities_.toPixelFormat(stream->format);\n>\n>                 LOG(HAL, Info) << \"Stream #\" << i\n>                                << \", direction: \" << stream->stream_type\n> diff --git a/src/android/camera_device.h b/src/android/camera_device.h\n> index 4aadb27c562c..090fe28a551e 100644\n> --- a/src/android/camera_device.h\n> +++ b/src/android/camera_device.h\n> @@ -10,14 +10,12 @@\n>  #include <map>\n>  #include <memory>\n>  #include <mutex>\n> 
-#include <tuple>\n>  #include <vector>\n>\n>  #include <hardware/camera3.h>\n>\n>  #include <libcamera/buffer.h>\n>  #include <libcamera/camera.h>\n> -#include <libcamera/geometry.h>\n>  #include <libcamera/request.h>\n>  #include <libcamera/stream.h>\n>\n> @@ -26,6 +24,7 @@\n>  #include \"libcamera/internal/message.h\"\n>  #include \"libcamera/internal/thread.h\"\n>\n> +#include \"camera_capabilities.h\"\n>  #include \"camera_metadata.h\"\n>  #include \"camera_stream.h\"\n>  #include \"camera_worker.h\"\n> @@ -57,7 +56,7 @@ public:\n>         const std::string &model() const { return model_; }\n>         int facing() const { return facing_; }\n>         int orientation() const { return orientation_; }\n> -       unsigned int maxJpegBufferSize() const { return\n> maxJpegBufferSize_; }\n> +       unsigned int maxJpegBufferSize() const;\n>\n>         void setCallbacks(const camera3_callback_ops_t *callbacks);\n>         const camera_metadata_t *getStaticMetadata();\n> @@ -86,11 +85,6 @@ private:\n>                 std::unique_ptr<CaptureRequest> request_;\n>         };\n>\n> -       struct Camera3StreamConfiguration {\n> -               libcamera::Size resolution;\n> -               int androidFormat;\n> -       };\n> -\n>         enum class State {\n>                 Stopped,\n>                 Flushing,\n> @@ -99,22 +93,11 @@ private:\n>\n>         void stop();\n>\n> -       int initializeStreamConfigurations();\n> -       std::vector<libcamera::Size>\n> -       getYUVResolutions(libcamera::CameraConfiguration *cameraConfig,\n> -                         const libcamera::PixelFormat &pixelFormat,\n> -                         const std::vector<libcamera::Size> &resolutions);\n> -       std::vector<libcamera::Size>\n> -       getRawResolutions(const libcamera::PixelFormat &pixelFormat);\n> -\n>         libcamera::FrameBuffer *createFrameBuffer(const buffer_handle_t\n> camera3buffer);\n>         void abortRequest(camera3_capture_request_t *request);\n>         void 
notifyShutter(uint32_t frameNumber, uint64_t timestamp);\n>         void notifyError(uint32_t frameNumber, camera3_stream_t *stream,\n>                          camera3_error_msg_code code);\n> -       std::unique_ptr<CameraMetadata> requestTemplatePreview();\n> -       std::unique_ptr<CameraMetadata> requestTemplateVideo();\n> -       libcamera::PixelFormat toPixelFormat(int format) const;\n>         int processControls(Camera3RequestDescriptor *descriptor);\n>         std::unique_ptr<CameraMetadata> getResultMetadata(\n>                 const Camera3RequestDescriptor &descriptor) const;\n> @@ -129,13 +112,11 @@ private:\n>\n>         std::shared_ptr<libcamera::Camera> camera_;\n>         std::unique_ptr<libcamera::CameraConfiguration> config_;\n> +       CameraCapabilities capabilities_;\n>\n> -       std::unique_ptr<CameraMetadata> staticMetadata_;\n>         std::map<unsigned int, std::unique_ptr<CameraMetadata>>\n> requestTemplates_;\n>         const camera3_callback_ops_t *callbacks_;\n>\n> -       std::vector<Camera3StreamConfiguration> streamConfigurations_;\n> -       std::map<int, libcamera::PixelFormat> formatsMap_;\n>         std::vector<CameraStream> streams_;\n>\n>         libcamera::Mutex descriptorsMutex_; /* Protects descriptors_. 
*/\n> @@ -147,8 +128,6 @@ private:\n>         int facing_;\n>         int orientation_;\n>\n> -       unsigned int maxJpegBufferSize_;\n> -\n>         CameraMetadata lastSettings_;\n>  };\n>\n> diff --git a/src/android/meson.build b/src/android/meson.build\n> index f27fd5316705..6270fb201338 100644\n> --- a/src/android/meson.build\n> +++ b/src/android/meson.build\n> @@ -44,6 +44,7 @@ subdir('cros')\n>\n>  android_hal_sources = files([\n>      'camera3_hal.cpp',\n> +    'camera_capabilities.cpp',\n>      'camera_device.cpp',\n>      'camera_hal_config.cpp',\n>      'camera_hal_manager.cpp',\n> --\n> 2.31.1\n>\n>","headers":{"References":"<20210621152954.40299-1-jacopo@jmondi.org>\n\t<20210621152954.40299-3-jacopo@jmondi.org>","In-Reply-To":"<20210621152954.40299-3-jacopo@jmondi.org>","From":"Hirokazu Honda <hiroh@chromium.org>","Date":"Tue, 22 Jun 2021 10:34:27 +0900","Message-ID":"<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>","To":"Jacopo Mondi <jacopo@jmondi.org>","Subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>"}},{"id":17686,"web_url":"https://patchwork.libcamera.org/comment/17686/","msgid":"<YNG1BSZCI3pECBhE@pendragon.ideasonboard.com>","date":"2021-06-22T10:01:41","subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","submitter":{"id":2,"url":"https://patchwork.libcamera.org/api/people/2/","name":"Laurent Pinchart","email":"laurent.pinchart@ideasonboard.com"},"content":"Hi Jacopo,\n\nThank you for the patch.\n\nOn Mon, Jun 21, 2021 at 05:29:54PM +0200, Jacopo Mondi wrote:\n> The camera_device.cpp has grown a little too much, and it has quickly\n> become hard to maintain. 
Break out the handling of the static\n> information collected at camera initialization time to a new\n> CameraCapabilities class.\n> \n> Break out from the camera_device.cpp file all the functions relative to:\n\ns/relative/related/\n\n> - Initialization of supported stream configurations\n> - Initialization of static metadata\n> - Initialization of request templates\n> \n> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>\n> Acked-by: Paul Elder <paul.elder@ideasonboard.com>\n> Tested-by: Paul Elder <paul.elder@ideasonboard.com>\n> ---\n>  src/android/camera_capabilities.cpp | 1164 +++++++++++++++++++++++++++\n>  src/android/camera_capabilities.h   |   65 ++\n>  src/android/camera_device.cpp       | 1147 +-------------------------\n>  src/android/camera_device.h         |   27 +-\n>  src/android/meson.build             |    1 +\n>  5 files changed, 1245 insertions(+), 1159 deletions(-)\n>  create mode 100644 src/android/camera_capabilities.cpp\n>  create mode 100644 src/android/camera_capabilities.h\n> \n> diff --git a/src/android/camera_capabilities.cpp b/src/android/camera_capabilities.cpp\n> new file mode 100644\n> index 000000000000..311a2c839586\n> --- /dev/null\n> +++ b/src/android/camera_capabilities.cpp\n> @@ -0,0 +1,1164 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2021, Google Inc.\n> + *\n> + * camera_capabilities.cpp - Camera static properties manager\n> + */\n> +\n> +#include \"camera_capabilities.h\"\n> +\n> +#include <array>\n> +#include <cmath>\n> +\n> +#include <hardware/camera3.h>\n> +\n> +#include <libcamera/control_ids.h>\n> +#include <libcamera/controls.h>\n> +#include <libcamera/property_ids.h>\n> +\n> +#include \"libcamera/internal/formats.h\"\n> +#include \"libcamera/internal/log.h\"\n> +\n> +using namespace libcamera;\n> +\n> +LOG_DECLARE_CATEGORY(HAL)\n> +\n> +namespace {\n> +\n> +/*\n> + * \\var camera3Resolutions\n> + * \\brief The list of image resolutions defined as mandatory to be supported by\n> 
+ * the Android Camera3 specification\n> + */\n> +const std::vector<Size> camera3Resolutions = {\n> +\t{ 320, 240 },\n> +\t{ 640, 480 },\n> +\t{ 1280, 720 },\n> +\t{ 1920, 1080 }\n> +};\n> +\n> +/*\n> + * \\struct Camera3Format\n> + * \\brief Data associated with an Android format identifier\n> + * \\var libcameraFormats List of libcamera pixel formats compatible with the\n> + * Android format\n> + * \\var name The human-readable representation of the Android format code\n> + */\n> +struct Camera3Format {\n> +\tstd::vector<PixelFormat> libcameraFormats;\n> +\tbool mandatory;\n> +\tconst char *name;\n> +};\n> +\n> +/*\n> + * \\var camera3FormatsMap\n> + * \\brief Associate Android format code with ancillary data\n> + */\n> +const std::map<int, const Camera3Format> camera3FormatsMap = {\n> +\t{\n> +\t\tHAL_PIXEL_FORMAT_BLOB, {\n> +\t\t\t{ formats::MJPEG },\n> +\t\t\ttrue,\n> +\t\t\t\"BLOB\"\n> +\t\t}\n> +\t}, {\n> +\t\tHAL_PIXEL_FORMAT_YCbCr_420_888, {\n> +\t\t\t{ formats::NV12, formats::NV21 },\n> +\t\t\ttrue,\n> +\t\t\t\"YCbCr_420_888\"\n> +\t\t}\n> +\t}, {\n> +\t\t/*\n> +\t\t * \\todo Translate IMPLEMENTATION_DEFINED inspecting the gralloc\n> +\t\t * usage flag. 
For now, copy the YCbCr_420 configuration.\n> +\t\t */\n> +\t\tHAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> +\t\t\t{ formats::NV12, formats::NV21 },\n> +\t\t\ttrue,\n> +\t\t\t\"IMPLEMENTATION_DEFINED\"\n> +\t\t}\n> +\t}, {\n> +\t\tHAL_PIXEL_FORMAT_RAW10, {\n> +\t\t\t{\n> +\t\t\t\tformats::SBGGR10_CSI2P,\n> +\t\t\t\tformats::SGBRG10_CSI2P,\n> +\t\t\t\tformats::SGRBG10_CSI2P,\n> +\t\t\t\tformats::SRGGB10_CSI2P\n> +\t\t\t},\n> +\t\t\tfalse,\n> +\t\t\t\"RAW10\"\n> +\t\t}\n> +\t}, {\n> +\t\tHAL_PIXEL_FORMAT_RAW12, {\n> +\t\t\t{\n> +\t\t\t\tformats::SBGGR12_CSI2P,\n> +\t\t\t\tformats::SGBRG12_CSI2P,\n> +\t\t\t\tformats::SGRBG12_CSI2P,\n> +\t\t\t\tformats::SRGGB12_CSI2P\n> +\t\t\t},\n> +\t\t\tfalse,\n> +\t\t\t\"RAW12\"\n> +\t\t}\n> +\t}, {\n> +\t\tHAL_PIXEL_FORMAT_RAW16, {\n> +\t\t\t{\n> +\t\t\t\tformats::SBGGR16,\n> +\t\t\t\tformats::SGBRG16,\n> +\t\t\t\tformats::SGRBG16,\n> +\t\t\t\tformats::SRGGB16\n> +\t\t\t},\n> +\t\t\tfalse,\n> +\t\t\t\"RAW16\"\n> +\t\t}\n> +\t},\n> +};\n> +\n> +} /* namespace */\n> +\n> +int CameraCapabilities::initialize(std::shared_ptr<libcamera::Camera> camera,\n> +\t\t\t\t   int orientation, int facing)\n> +{\n> +\tcamera_ = camera;\n> +\torientation_ = orientation;\n> +\tfacing_ = facing;\n> +\n> +\t/* Acquire the camera and initialize available stream configurations. 
*/\n> +\tint ret = camera_->acquire();\n> +\tif (ret) {\n> +\t\tLOG(HAL, Error) << \"Failed to temporarily acquire the camera\";\n> +\t\treturn ret;\n> +\t}\n> +\n> +\tret = initializeStreamConfigurations();\n> +\tcamera_->release();\n> +\tif (ret)\n> +\t\treturn ret;\n> +\n> +\treturn initializeStaticMetadata();\n> +}\n> +\n> +std::vector<Size> CameraCapabilities::getYUVResolutions(CameraConfiguration *cameraConfig,\n> +\t\t\t\t\t\t\tconst PixelFormat &pixelFormat,\n> +\t\t\t\t\t\t\tconst std::vector<Size> &resolutions)\n> +{\n> +\tstd::vector<Size> supportedResolutions;\n> +\n> +\tStreamConfiguration &cfg = cameraConfig->at(0);\n> +\tfor (const Size &res : resolutions) {\n> +\t\tcfg.pixelFormat = pixelFormat;\n> +\t\tcfg.size = res;\n> +\n> +\t\tCameraConfiguration::Status status = cameraConfig->validate();\n> +\t\tif (status != CameraConfiguration::Valid) {\n> +\t\t\tLOG(HAL, Debug) << cfg.toString() << \" not supported\";\n> +\t\t\tcontinue;\n> +\t\t}\n> +\n> +\t\tLOG(HAL, Debug) << cfg.toString() << \" supported\";\n> +\n> +\t\tsupportedResolutions.push_back(res);\n> +\t}\n> +\n> +\treturn supportedResolutions;\n> +}\n> +\n> +std::vector<Size> CameraCapabilities::getRawResolutions(const libcamera::PixelFormat &pixelFormat)\n> +{\n> +\tstd::unique_ptr<CameraConfiguration> cameraConfig =\n> +\t\tcamera_->generateConfiguration({ StreamRole::Raw });\n> +\tStreamConfiguration &cfg = cameraConfig->at(0);\n> +\tconst StreamFormats &formats = cfg.formats();\n> +\tstd::vector<Size> supportedResolutions = formats.sizes(pixelFormat);\n> +\n> +\treturn supportedResolutions;\n> +}\n> +\n> +/*\n> + * Initialize the format conversion map to translate from Android format\n> + * identifier to libcamera pixel formats and fill in the list of supported\n> + * stream configurations to be reported to the Android camera framework through\n> + * the Camera static metadata.\n\nDid you mean s/Camera/camera/ as it's not a class name ?\n\nReviewed-by: Laurent Pinchart 
<laurent.pinchart@ideasonboard.com>\n\n> + */\n> +int CameraCapabilities::initializeStreamConfigurations()\n> +{\n> +\t/*\n> +\t * Get the maximum output resolutions\n> +\t * \\todo Get this from the camera properties once defined\n> +\t */\n> +\tstd::unique_ptr<CameraConfiguration> cameraConfig =\n> +\t\tcamera_->generateConfiguration({ StillCapture });\n> +\tif (!cameraConfig) {\n> +\t\tLOG(HAL, Error) << \"Failed to get maximum resolution\";\n> +\t\treturn -EINVAL;\n> +\t}\n> +\tStreamConfiguration &cfg = cameraConfig->at(0);\n> +\n> +\t/*\n> +\t * \\todo JPEG - Adjust the maximum available resolution by taking the\n> +\t * JPEG encoder requirements into account (alignment and aspect ratio).\n> +\t */\n> +\tconst Size maxRes = cfg.size;\n> +\tLOG(HAL, Debug) << \"Maximum supported resolution: \" << maxRes.toString();\n> +\n> +\t/*\n> +\t * Build the list of supported image resolutions.\n> +\t *\n> +\t * The resolutions listed in camera3Resolution are mandatory to be\n> +\t * supported, up to the camera maximum resolution.\n> +\t *\n> +\t * Augment the list by adding resolutions calculated from the camera\n> +\t * maximum one.\n> +\t */\n> +\tstd::vector<Size> cameraResolutions;\n> +\tstd::copy_if(camera3Resolutions.begin(), camera3Resolutions.end(),\n> +\t\t     std::back_inserter(cameraResolutions),\n> +\t\t     [&](const Size &res) { return res < maxRes; });\n> +\n> +\t/*\n> +\t * The Camera3 specification suggests adding 1/2 and 1/4 of the maximum\n> +\t * resolution.\n> +\t */\n> +\tfor (unsigned int divider = 2;; divider <<= 1) {\n> +\t\tSize derivedSize{\n> +\t\t\tmaxRes.width / divider,\n> +\t\t\tmaxRes.height / divider,\n> +\t\t};\n> +\n> +\t\tif (derivedSize.width < 320 ||\n> +\t\t    derivedSize.height < 240)\n> +\t\t\tbreak;\n> +\n> +\t\tcameraResolutions.push_back(derivedSize);\n> +\t}\n> +\tcameraResolutions.push_back(maxRes);\n> +\n> +\t/* Remove duplicated entries from the list of supported resolutions. 
*/\n> +\tstd::sort(cameraResolutions.begin(), cameraResolutions.end());\n> +\tauto last = std::unique(cameraResolutions.begin(), cameraResolutions.end());\n> +\tcameraResolutions.erase(last, cameraResolutions.end());\n> +\n> +\t/*\n> +\t * Build the list of supported camera formats.\n> +\t *\n> +\t * Each Android format is associated with a list of compatible\n> +\t * libcamera formats. The first libcamera format that validates\n> +\t * successfully is added to the format translation map used when\n> +\t * configuring the streams. It is then tested against the list of\n> +\t * supported camera resolutions to build the stream configuration map\n> +\t * reported through the camera static metadata.\n> +\t */\n> +\tSize maxJpegSize;\n> +\tfor (const auto &format : camera3FormatsMap) {\n> +\t\tint androidFormat = format.first;\n> +\t\tconst Camera3Format &camera3Format = format.second;\n> +\t\tconst std::vector<PixelFormat> &libcameraFormats =\n> +\t\t\tcamera3Format.libcameraFormats;\n> +\n> +\t\tLOG(HAL, Debug) << \"Trying to map Android format \"\n> +\t\t\t\t<< camera3Format.name;\n> +\n> +\t\t/*\n> +\t\t * JPEG is always supported, either produced directly by the\n> +\t\t * camera, or encoded in the HAL.\n> +\t\t */\n> +\t\tif (androidFormat == HAL_PIXEL_FORMAT_BLOB) {\n> +\t\t\tformatsMap_[androidFormat] = formats::MJPEG;\n> +\t\t\tLOG(HAL, Debug) << \"Mapped Android format \"\n> +\t\t\t\t\t<< camera3Format.name << \" to \"\n> +\t\t\t\t\t<< formats::MJPEG.toString()\n> +\t\t\t\t\t<< \" (fixed mapping)\";\n> +\t\t\tcontinue;\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * Test the libcamera formats that can produce images\n> +\t\t * compatible with the format defined by Android.\n> +\t\t */\n> +\t\tPixelFormat mappedFormat;\n> +\t\tfor (const PixelFormat &pixelFormat : libcameraFormats) {\n> +\n> +\t\t\tLOG(HAL, Debug) << \"Testing \" << pixelFormat.toString();\n> +\n> +\t\t\t/*\n> +\t\t\t * The stream configuration size can be adjusted,\n> +\t\t\t * not the pixel format.\n> +\t\t\t 
*\n> +\t\t\t * \\todo This could be simplified once all pipeline\n> +\t\t\t * handlers report the StreamFormats list of\n> +\t\t\t * supported formats.\n> +\t\t\t */\n> +\t\t\tcfg.pixelFormat = pixelFormat;\n> +\n> +\t\t\tCameraConfiguration::Status status = cameraConfig->validate();\n> +\t\t\tif (status != CameraConfiguration::Invalid &&\n> +\t\t\t    cfg.pixelFormat == pixelFormat) {\n> +\t\t\t\tmappedFormat = pixelFormat;\n> +\t\t\t\tbreak;\n> +\t\t\t}\n> +\t\t}\n> +\n> +\t\tif (!mappedFormat.isValid()) {\n> +\t\t\t/* If the format is not mandatory, skip it. */\n> +\t\t\tif (!camera3Format.mandatory)\n> +\t\t\t\tcontinue;\n> +\n> +\t\t\tLOG(HAL, Error)\n> +\t\t\t\t<< \"Failed to map mandatory Android format \"\n> +\t\t\t\t<< camera3Format.name << \" (\"\n> +\t\t\t\t<< utils::hex(androidFormat) << \"): aborting\";\n> +\t\t\treturn -EINVAL;\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * Record the mapping and then proceed to generate the\n> +\t\t * stream configurations map, by testing the image resolutions.\n> +\t\t */\n> +\t\tformatsMap_[androidFormat] = mappedFormat;\n> +\t\tLOG(HAL, Debug) << \"Mapped Android format \"\n> +\t\t\t\t<< camera3Format.name << \" to \"\n> +\t\t\t\t<< mappedFormat.toString();\n> +\n> +\t\tstd::vector<Size> resolutions;\n> +\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(mappedFormat);\n> +\t\tif (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW)\n> +\t\t\tresolutions = getRawResolutions(mappedFormat);\n> +\t\telse\n> +\t\t\tresolutions = getYUVResolutions(cameraConfig.get(),\n> +\t\t\t\t\t\t\tmappedFormat,\n> +\t\t\t\t\t\t\tcameraResolutions);\n> +\n> +\t\tfor (const Size &res : resolutions) {\n> +\t\t\tstreamConfigurations_.push_back({ res, androidFormat });\n> +\n> +\t\t\t/*\n> +\t\t\t * If the format is HAL_PIXEL_FORMAT_YCbCr_420_888\n> +\t\t\t * from which JPEG is produced, add an entry for\n> +\t\t\t * the JPEG stream.\n> +\t\t\t *\n> +\t\t\t * \\todo Wire the JPEG encoder to query the supported\n> +\t\t\t * sizes 
provided a list of formats it can encode.\n> +\t\t\t *\n> +\t\t\t * \\todo Support JPEG streams produced by the camera\n> +\t\t\t * natively.\n> +\t\t\t */\n> +\t\t\tif (androidFormat == HAL_PIXEL_FORMAT_YCbCr_420_888) {\n> +\t\t\t\tstreamConfigurations_.push_back(\n> +\t\t\t\t\t{ res, HAL_PIXEL_FORMAT_BLOB });\n> +\t\t\t\tmaxJpegSize = std::max(maxJpegSize, res);\n> +\t\t\t}\n> +\t\t}\n> +\n> +\t\t/*\n> +\t\t * \\todo Calculate the maximum JPEG buffer size by asking the\n> +\t\t * encoder for the maximum frame size required.\n> +\t\t */\n> +\t\tmaxJpegBufferSize_ = maxJpegSize.width * maxJpegSize.height * 1.5;\n> +\t}\n> +\n> +\tLOG(HAL, Debug) << \"Collected stream configuration map: \";\n> +\tfor (const auto &entry : streamConfigurations_)\n> +\t\tLOG(HAL, Debug) << \"{ \" << entry.resolution.toString() << \" - \"\n> +\t\t\t\t<< utils::hex(entry.androidFormat) << \" }\";\n> +\n> +\treturn 0;\n> +}\n> +\n> +int CameraCapabilities::initializeStaticMetadata()\n> +{\n> +\tstaticMetadata_ = std::make_unique<CameraMetadata>(64, 1024);\n> +\tif (!staticMetadata_->isValid()) {\n> +\t\tLOG(HAL, Error) << \"Failed to allocate static metadata\";\n> +\t\tstaticMetadata_.reset();\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tconst ControlInfoMap &controlsInfo = camera_->controls();\n> +\tconst ControlList &properties = camera_->properties();\n> +\n> +\t/* Color correction static metadata. */\n> +\t{\n> +\t\tstd::vector<uint8_t> data;\n> +\t\tdata.reserve(3);\n> +\t\tconst auto &infoMap = controlsInfo.find(&controls::draft::ColorCorrectionAberrationMode);\n> +\t\tif (infoMap != controlsInfo.end()) {\n> +\t\t\tfor (const auto &value : infoMap->second.values())\n> +\t\t\t\tdata.push_back(value.get<int32_t>());\n> +\t\t} else {\n> +\t\t\tdata.push_back(ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF);\n> +\t\t}\n> +\t\tstaticMetadata_->addEntry(ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> +\t\t\t\t\t  data);\n> +\t}\n> +\n> +\t/* Control static metadata. 
*/\n> +\tstd::vector<uint8_t> aeAvailableAntiBandingModes = {\n> +\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE_OFF,\n> +\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE_50HZ,\n> +\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE_60HZ,\n> +\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> +\t\t\t\t  aeAvailableAntiBandingModes);\n> +\n> +\tstd::vector<uint8_t> aeAvailableModes = {\n> +\t\tANDROID_CONTROL_AE_MODE_ON,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> +\t\t\t\t  aeAvailableModes);\n> +\n> +\tint64_t minFrameDurationNsec = -1;\n> +\tint64_t maxFrameDurationNsec = -1;\n> +\tconst auto frameDurationsInfo = controlsInfo.find(&controls::FrameDurationLimits);\n> +\tif (frameDurationsInfo != controlsInfo.end()) {\n> +\t\tminFrameDurationNsec = frameDurationsInfo->second.min().get<int64_t>() * 1000;\n> +\t\tmaxFrameDurationNsec = frameDurationsInfo->second.max().get<int64_t>() * 1000;\n> +\n> +\t\t/*\n> +\t\t * Adjust the minimum frame duration to comply with Android\n> +\t\t * requirements. 
The camera service mandates all preview/record\n> +\t\t * streams to have a minimum frame duration < 33,366 microseconds\n> +\t\t * (see MAX_PREVIEW_RECORD_DURATION_NS in the camera service\n> +\t\t * implementation).\n> +\t\t *\n> +\t\t * If we're close enough (+ 500 useconds) to that value, round\n> +\t\t * the minimum frame duration of the camera to an accepted\n> +\t\t * value.\n> +\t\t */\n> +\t\tstatic constexpr int64_t MAX_PREVIEW_RECORD_DURATION_NS = 1e9 / 29.97;\n> +\t\tif (minFrameDurationNsec > MAX_PREVIEW_RECORD_DURATION_NS &&\n> +\t\t    minFrameDurationNsec < MAX_PREVIEW_RECORD_DURATION_NS + 500000)\n> +\t\t\tminFrameDurationNsec = MAX_PREVIEW_RECORD_DURATION_NS - 1000;\n> +\n> +\t\t/*\n> +\t\t * The AE routine frame rate limits are computed using the frame\n> +\t\t * duration limits, as libcamera clips the AE routine to the\n> +\t\t * frame durations.\n> +\t\t */\n> +\t\tint32_t maxFps = std::round(1e9 / minFrameDurationNsec);\n> +\t\tint32_t minFps = std::round(1e9 / maxFrameDurationNsec);\n> +\t\tminFps = std::max(1, minFps);\n> +\n> +\t\t/*\n> +\t\t * Force rounding errors so that we have the proper frame\n> +\t\t * durations for when we reuse these variables later\n> +\t\t */\n> +\t\tminFrameDurationNsec = 1e9 / maxFps;\n> +\t\tmaxFrameDurationNsec = 1e9 / minFps;\n> +\n> +\t\t/*\n> +\t\t * Register to the camera service {min, max} and {max, max}\n> +\t\t * intervals as requested by the metadata documentation.\n> +\t\t */\n> +\t\tint32_t availableAeFpsTarget[] = {\n> +\t\t\tminFps, maxFps, maxFps, maxFps\n> +\t\t};\n> +\t\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> +\t\t\t\t\t  availableAeFpsTarget);\n> +\t}\n> +\n> +\tstd::vector<int32_t> aeCompensationRange = {\n> +\t\t0, 0,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> +\t\t\t\t  aeCompensationRange);\n> +\n> +\tconst camera_metadata_rational_t aeCompensationStep[] = {\n> +\t\t{ 0, 1 }\n> +\t};\n> 
+\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> +\t\t\t\t  aeCompensationStep);\n> +\n> +\tstd::vector<uint8_t> availableAfModes = {\n> +\t\tANDROID_CONTROL_AF_MODE_OFF,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> +\t\t\t\t  availableAfModes);\n> +\n> +\tstd::vector<uint8_t> availableEffects = {\n> +\t\tANDROID_CONTROL_EFFECT_MODE_OFF,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> +\t\t\t\t  availableEffects);\n> +\n> +\tstd::vector<uint8_t> availableSceneModes = {\n> +\t\tANDROID_CONTROL_SCENE_MODE_DISABLED,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> +\t\t\t\t  availableSceneModes);\n> +\n> +\tstd::vector<uint8_t> availableStabilizationModes = {\n> +\t\tANDROID_CONTROL_VIDEO_STABILIZATION_MODE_OFF,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> +\t\t\t\t  availableStabilizationModes);\n> +\n> +\t/*\n> +\t * \\todo Inspect the Camera capabilities to report the available\n> +\t * AWB modes. 
Default to AUTO as CTS tests require it.\n> +\t */\n> +\tstd::vector<uint8_t> availableAwbModes = {\n> +\t\tANDROID_CONTROL_AWB_MODE_AUTO,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> +\t\t\t\t  availableAwbModes);\n> +\n> +\tstd::vector<int32_t> availableMaxRegions = {\n> +\t\t0, 0, 0,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n> +\t\t\t\t  availableMaxRegions);\n> +\n> +\tstd::vector<uint8_t> sceneModesOverride = {\n> +\t\tANDROID_CONTROL_AE_MODE_ON,\n> +\t\tANDROID_CONTROL_AWB_MODE_AUTO,\n> +\t\tANDROID_CONTROL_AF_MODE_OFF,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> +\t\t\t\t  sceneModesOverride);\n> +\n> +\tuint8_t aeLockAvailable = ANDROID_CONTROL_AE_LOCK_AVAILABLE_FALSE;\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> +\t\t\t\t  aeLockAvailable);\n> +\n> +\tuint8_t awbLockAvailable = ANDROID_CONTROL_AWB_LOCK_AVAILABLE_FALSE;\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> +\t\t\t\t  awbLockAvailable);\n> +\n> +\tchar availableControlModes = ANDROID_CONTROL_MODE_AUTO;\n> +\tstaticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_MODES,\n> +\t\t\t\t  availableControlModes);\n> +\n> +\t/* JPEG static metadata. 
*/\n> +\n> +\t/*\n> +\t * Create the list of supported thumbnail sizes by inspecting the\n> +\t * available JPEG resolutions collected in streamConfigurations_ and\n> +\t * generating one entry for each aspect ratio.\n> +\t *\n> +\t * The JPEG thumbnailer can freely scale, so pick an arbitrary\n> +\t * (160, 160) size as the bounding rectangle, which is then cropped to\n> +\t * the different supported aspect ratios.\n> +\t */\n> +\tconstexpr Size maxJpegThumbnail(160, 160);\n> +\tstd::vector<Size> thumbnailSizes;\n> +\tthumbnailSizes.push_back({ 0, 0 });\n> +\tfor (const auto &entry : streamConfigurations_) {\n> +\t\tif (entry.androidFormat != HAL_PIXEL_FORMAT_BLOB)\n> +\t\t\tcontinue;\n> +\n> +\t\tSize thumbnailSize = maxJpegThumbnail\n> +\t\t\t\t     .boundedToAspectRatio({ entry.resolution.width,\n> +\t\t\t\t\t\t\t     entry.resolution.height });\n> +\t\tthumbnailSizes.push_back(thumbnailSize);\n> +\t}\n> +\n> +\tstd::sort(thumbnailSizes.begin(), thumbnailSizes.end());\n> +\tauto last = std::unique(thumbnailSizes.begin(), thumbnailSizes.end());\n> +\tthumbnailSizes.erase(last, thumbnailSizes.end());\n> +\n> +\t/* Transform sizes into a list of integers that can be consumed. */\n> +\tstd::vector<int32_t> thumbnailEntries;\n> +\tthumbnailEntries.reserve(thumbnailSizes.size() * 2);\n> +\tfor (const auto &size : thumbnailSizes) {\n> +\t\tthumbnailEntries.push_back(size.width);\n> +\t\tthumbnailEntries.push_back(size.height);\n> +\t}\n> +\tstaticMetadata_->addEntry(ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> +\t\t\t\t  thumbnailEntries);\n> +\n> +\tstaticMetadata_->addEntry(ANDROID_JPEG_MAX_SIZE, maxJpegBufferSize_);\n> +\n> +\t/* Sensor static metadata. 
*/\n> +\tstd::array<int32_t, 2> pixelArraySize;\n> +\t{\n> +\t\tconst Size &size = properties.get(properties::PixelArraySize);\n> +\t\tpixelArraySize[0] = size.width;\n> +\t\tpixelArraySize[1] = size.height;\n> +\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> +\t\t\t\t\t  pixelArraySize);\n> +\t}\n> +\n> +\tif (properties.contains(properties::UnitCellSize)) {\n> +\t\tconst Size &cellSize = properties.get<Size>(properties::UnitCellSize);\n> +\t\tstd::array<float, 2> physicalSize{\n> +\t\t\tcellSize.width * pixelArraySize[0] / 1e6f,\n> +\t\t\tcellSize.height * pixelArraySize[1] / 1e6f\n> +\t\t};\n> +\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> +\t\t\t\t\t  physicalSize);\n> +\t}\n> +\n> +\t{\n> +\t\tconst Span<const Rectangle> &rects =\n> +\t\t\tproperties.get(properties::PixelArrayActiveAreas);\n> +\t\tstd::vector<int32_t> data{\n> +\t\t\tstatic_cast<int32_t>(rects[0].x),\n> +\t\t\tstatic_cast<int32_t>(rects[0].y),\n> +\t\t\tstatic_cast<int32_t>(rects[0].width),\n> +\t\t\tstatic_cast<int32_t>(rects[0].height),\n> +\t\t};\n> +\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> +\t\t\t\t\t  data);\n> +\t}\n> +\n> +\tint32_t sensitivityRange[] = {\n> +\t\t32, 2400,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> +\t\t\t\t  sensitivityRange);\n> +\n> +\t/* Report the color filter arrangement if the camera reports it. 
*/\n> +\tif (properties.contains(properties::draft::ColorFilterArrangement)) {\n> +\t\tuint8_t filterArr = properties.get(properties::draft::ColorFilterArrangement);\n> +\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> +\t\t\t\t\t  filterArr);\n> +\t}\n> +\n> +\tconst auto &exposureInfo = controlsInfo.find(&controls::ExposureTime);\n> +\tif (exposureInfo != controlsInfo.end()) {\n> +\t\tint64_t exposureTimeRange[2] = {\n> +\t\t\texposureInfo->second.min().get<int32_t>() * 1000LL,\n> +\t\t\texposureInfo->second.max().get<int32_t>() * 1000LL,\n> +\t\t};\n> +\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> +\t\t\t\t\t  exposureTimeRange, 2);\n> +\t}\n> +\n> +\tstaticMetadata_->addEntry(ANDROID_SENSOR_ORIENTATION, orientation_);\n> +\n> +\tstd::vector<int32_t> testPatternModes = {\n> +\t\tANDROID_SENSOR_TEST_PATTERN_MODE_OFF\n> +\t};\n> +\tconst auto &testPatternsInfo =\n> +\t\tcontrolsInfo.find(&controls::draft::TestPatternMode);\n> +\tif (testPatternsInfo != controlsInfo.end()) {\n> +\t\tconst auto &values = testPatternsInfo->second.values();\n> +\t\tASSERT(!values.empty());\n> +\t\tfor (const auto &value : values) {\n> +\t\t\tswitch (value.get<int32_t>()) {\n> +\t\t\tcase controls::draft::TestPatternModeOff:\n> +\t\t\t\t/*\n> +\t\t\t\t * ANDROID_SENSOR_TEST_PATTERN_MODE_OFF is\n> +\t\t\t\t * already in testPatternModes.\n> +\t\t\t\t */\n> +\t\t\t\tbreak;\n> +\n> +\t\t\tcase controls::draft::TestPatternModeSolidColor:\n> +\t\t\t\ttestPatternModes.push_back(\n> +\t\t\t\t\tANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR);\n> +\t\t\t\tbreak;\n> +\n> +\t\t\tcase controls::draft::TestPatternModeColorBars:\n> +\t\t\t\ttestPatternModes.push_back(\n> +\t\t\t\t\tANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS);\n> +\t\t\t\tbreak;\n> +\n> +\t\t\tcase controls::draft::TestPatternModeColorBarsFadeToGray:\n> +\t\t\t\ttestPatternModes.push_back(\n> +\t\t\t\t\tANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS_FADE_TO_GRAY);\n> 
+\t\t\t\tbreak;\n> +\n> +\t\t\tcase controls::draft::TestPatternModePn9:\n> +\t\t\t\ttestPatternModes.push_back(\n> +\t\t\t\t\tANDROID_SENSOR_TEST_PATTERN_MODE_PN9);\n> +\t\t\t\tbreak;\n> +\n> +\t\t\tcase controls::draft::TestPatternModeCustom1:\n> +\t\t\t\t/* We don't support this yet. */\n> +\t\t\t\tbreak;\n> +\n> +\t\t\tdefault:\n> +\t\t\t\tLOG(HAL, Error) << \"Unknown test pattern mode: \"\n> +\t\t\t\t\t\t<< value.get<int32_t>();\n> +\t\t\t\tcontinue;\n> +\t\t\t}\n> +\t\t}\n> +\t}\n> +\tstaticMetadata_->addEntry(ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> +\t\t\t\t  testPatternModes);\n> +\n> +\tuint8_t timestampSource = ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN;\n> +\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> +\t\t\t\t  timestampSource);\n> +\n> +\tif (maxFrameDurationNsec > 0)\n> +\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> +\t\t\t\t\t  maxFrameDurationNsec);\n> +\n> +\t/* Statistics static metadata. */\n> +\tuint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> +\tstaticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> +\t\t\t\t  faceDetectMode);\n> +\n> +\tint32_t maxFaceCount = 0;\n> +\tstaticMetadata_->addEntry(ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> +\t\t\t\t  maxFaceCount);\n> +\n> +\t{\n> +\t\tstd::vector<uint8_t> data;\n> +\t\tdata.reserve(2);\n> +\t\tconst auto &infoMap = controlsInfo.find(&controls::draft::LensShadingMapMode);\n> +\t\tif (infoMap != controlsInfo.end()) {\n> +\t\t\tfor (const auto &value : infoMap->second.values())\n> +\t\t\t\tdata.push_back(value.get<int32_t>());\n> +\t\t} else {\n> +\t\t\tdata.push_back(ANDROID_STATISTICS_LENS_SHADING_MAP_MODE_OFF);\n> +\t\t}\n> +\t\tstaticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_LENS_SHADING_MAP_MODES,\n> +\t\t\t\t\t  data);\n> +\t}\n> +\n> +\t/* Sync static metadata. 
*/\n> +\tint32_t maxLatency = ANDROID_SYNC_MAX_LATENCY_UNKNOWN;\n> +\tstaticMetadata_->addEntry(ANDROID_SYNC_MAX_LATENCY, maxLatency);\n> +\n> +\t/* Flash static metadata. */\n> +\tchar flashAvailable = ANDROID_FLASH_INFO_AVAILABLE_FALSE;\n> +\tstaticMetadata_->addEntry(ANDROID_FLASH_INFO_AVAILABLE,\n> +\t\t\t\t  flashAvailable);\n> +\n> +\t/* Lens static metadata. */\n> +\tstd::vector<float> lensApertures = {\n> +\t\t2.53 / 100,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> +\t\t\t\t  lensApertures);\n> +\n> +\tuint8_t lensFacing;\n> +\tswitch (facing_) {\n> +\tdefault:\n> +\tcase CAMERA_FACING_FRONT:\n> +\t\tlensFacing = ANDROID_LENS_FACING_FRONT;\n> +\t\tbreak;\n> +\tcase CAMERA_FACING_BACK:\n> +\t\tlensFacing = ANDROID_LENS_FACING_BACK;\n> +\t\tbreak;\n> +\tcase CAMERA_FACING_EXTERNAL:\n> +\t\tlensFacing = ANDROID_LENS_FACING_EXTERNAL;\n> +\t\tbreak;\n> +\t}\n> +\tstaticMetadata_->addEntry(ANDROID_LENS_FACING, lensFacing);\n> +\n> +\tstd::vector<float> lensFocalLengths = {\n> +\t\t1,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> +\t\t\t\t  lensFocalLengths);\n> +\n> +\tstd::vector<uint8_t> opticalStabilizations = {\n> +\t\tANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> +\t\t\t\t  opticalStabilizations);\n> +\n> +\tfloat hyperFocalDistance = 0;\n> +\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> +\t\t\t\t  hyperFocalDistance);\n> +\n> +\tfloat minFocusDistance = 0;\n> +\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> +\t\t\t\t  minFocusDistance);\n> +\n> +\t/* Noise reduction modes. 
*/\n> +\t{\n> +\t\tstd::vector<uint8_t> data;\n> +\t\tdata.reserve(5);\n> +\t\tconst auto &infoMap = controlsInfo.find(&controls::draft::NoiseReductionMode);\n> +\t\tif (infoMap != controlsInfo.end()) {\n> +\t\t\tfor (const auto &value : infoMap->second.values())\n> +\t\t\t\tdata.push_back(value.get<int32_t>());\n> +\t\t} else {\n> +\t\t\tdata.push_back(ANDROID_NOISE_REDUCTION_MODE_OFF);\n> +\t\t}\n> +\t\tstaticMetadata_->addEntry(ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> +\t\t\t\t\t  data);\n> +\t}\n> +\n> +\t/* Scaler static metadata. */\n> +\n> +\t/*\n> +\t * \\todo The digital zoom factor is a property that depends on the\n> +\t * desired output configuration and the sensor frame size input to the\n> +\t * ISP. This information is not available to the Android HAL, not at\n> +\t * initialization time at least.\n> +\t *\n> +\t * As a workaround rely on pipeline handlers initializing the\n> +\t * ScalerCrop control with the camera default configuration and use the\n> +\t * maximum and minimum crop rectangles to calculate the digital zoom\n> +\t * factor.\n> +\t */\n> +\tfloat maxZoom = 1.0f;\n> +\tconst auto scalerCrop = controlsInfo.find(&controls::ScalerCrop);\n> +\tif (scalerCrop != controlsInfo.end()) {\n> +\t\tRectangle min = scalerCrop->second.min().get<Rectangle>();\n> +\t\tRectangle max = scalerCrop->second.max().get<Rectangle>();\n> +\t\tmaxZoom = std::min(1.0f * max.width / min.width,\n> +\t\t\t\t   1.0f * max.height / min.height);\n> +\t}\n> +\tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> +\t\t\t\t  maxZoom);\n> +\n> +\tstd::vector<uint32_t> availableStreamConfigurations;\n> +\tavailableStreamConfigurations.reserve(streamConfigurations_.size() * 4);\n> +\tfor (const auto &entry : streamConfigurations_) {\n> +\t\tavailableStreamConfigurations.push_back(entry.androidFormat);\n> +\t\tavailableStreamConfigurations.push_back(entry.resolution.width);\n> 
+\t\tavailableStreamConfigurations.push_back(entry.resolution.height);\n> +\t\tavailableStreamConfigurations.push_back(\n> +\t\t\tANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT);\n> +\t}\n> +\tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> +\t\t\t\t  availableStreamConfigurations);\n> +\n> +\tstd::vector<int64_t> availableStallDurations = {\n> +\t\tANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920, 33333333,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> +\t\t\t\t  availableStallDurations);\n> +\n> +\t/* Use the minimum frame duration for all the YUV/RGB formats. */\n> +\tif (minFrameDurationNsec > 0) {\n> +\t\tstd::vector<int64_t> minFrameDurations;\n> +\t\tminFrameDurations.reserve(streamConfigurations_.size() * 4);\n> +\t\tfor (const auto &entry : streamConfigurations_) {\n> +\t\t\tminFrameDurations.push_back(entry.androidFormat);\n> +\t\t\tminFrameDurations.push_back(entry.resolution.width);\n> +\t\t\tminFrameDurations.push_back(entry.resolution.height);\n> +\t\t\tminFrameDurations.push_back(minFrameDurationNsec);\n> +\t\t}\n> +\t\tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> +\t\t\t\t\t  minFrameDurations);\n> +\t}\n> +\n> +\tuint8_t croppingType = ANDROID_SCALER_CROPPING_TYPE_CENTER_ONLY;\n> +\tstaticMetadata_->addEntry(ANDROID_SCALER_CROPPING_TYPE, croppingType);\n> +\n> +\t/* Info static metadata. */\n> +\tuint8_t supportedHWLevel = ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;\n> +\tstaticMetadata_->addEntry(ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> +\t\t\t\t  supportedHWLevel);\n> +\n> +\t/* Request static metadata. */\n> +\tint32_t partialResultCount = 1;\n> +\tstaticMetadata_->addEntry(ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> +\t\t\t\t  partialResultCount);\n> +\n> +\t{\n> +\t\t/* Default the value to 2 if not reported by the camera. 
*/\n> +\t\tuint8_t maxPipelineDepth = 2;\n> +\t\tconst auto &infoMap = controlsInfo.find(&controls::draft::PipelineDepth);\n> +\t\tif (infoMap != controlsInfo.end())\n> +\t\t\tmaxPipelineDepth = infoMap->second.max().get<int32_t>();\n> +\t\tstaticMetadata_->addEntry(ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> +\t\t\t\t\t  maxPipelineDepth);\n> +\t}\n> +\n> +\t/* LIMITED does not support reprocessing. */\n> +\tuint32_t maxNumInputStreams = 0;\n> +\tstaticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> +\t\t\t\t  maxNumInputStreams);\n> +\n> +\tstd::vector<uint8_t> availableCapabilities = {\n> +\t\tANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE,\n> +\t};\n> +\n> +\t/* Report if camera supports RAW. */\n> +\tbool rawStreamAvailable = false;\n> +\tstd::unique_ptr<CameraConfiguration> cameraConfig =\n> +\t\tcamera_->generateConfiguration({ StreamRole::Raw });\n> +\tif (cameraConfig && !cameraConfig->empty()) {\n> +\t\tconst PixelFormatInfo &info =\n> +\t\t\tPixelFormatInfo::info(cameraConfig->at(0).pixelFormat);\n> +\t\t/* Only advertise RAW support if RAW16 is possible. 
*/\n> +\t\tif (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW &&\n> +\t\t    info.bitsPerPixel == 16) {\n> +\t\t\trawStreamAvailable = true;\n> +\t\t\tavailableCapabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);\n> +\t\t}\n> +\t}\n> +\n> +\t/* Number of { RAW, YUV, JPEG } supported output streams */\n> +\tint32_t numOutStreams[] = { rawStreamAvailable, 2, 1 };\n> +\tstaticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> +\t\t\t\t  numOutStreams);\n> +\n> +\tstaticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> +\t\t\t\t  availableCapabilities);\n> +\n> +\tstd::vector<int32_t> availableCharacteristicsKeys = {\n> +\t\tANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> +\t\tANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> +\t\tANDROID_CONTROL_AE_AVAILABLE_MODES,\n> +\t\tANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> +\t\tANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> +\t\tANDROID_CONTROL_AE_COMPENSATION_STEP,\n> +\t\tANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> +\t\tANDROID_CONTROL_AF_AVAILABLE_MODES,\n> +\t\tANDROID_CONTROL_AVAILABLE_EFFECTS,\n> +\t\tANDROID_CONTROL_AVAILABLE_MODES,\n> +\t\tANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> +\t\tANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> +\t\tANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> +\t\tANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> +\t\tANDROID_CONTROL_MAX_REGIONS,\n> +\t\tANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> +\t\tANDROID_FLASH_INFO_AVAILABLE,\n> +\t\tANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> +\t\tANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> +\t\tANDROID_JPEG_MAX_SIZE,\n> +\t\tANDROID_LENS_FACING,\n> +\t\tANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> +\t\tANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> +\t\tANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> +\t\tANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> +\t\tANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> +\t\tANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> +\t\tANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> 
+\t\tANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> +\t\tANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> +\t\tANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> +\t\tANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> +\t\tANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> +\t\tANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> +\t\tANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> +\t\tANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> +\t\tANDROID_SCALER_CROPPING_TYPE,\n> +\t\tANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> +\t\tANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> +\t\tANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> +\t\tANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> +\t\tANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> +\t\tANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> +\t\tANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> +\t\tANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> +\t\tANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> +\t\tANDROID_SENSOR_ORIENTATION,\n> +\t\tANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> +\t\tANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> +\t\tANDROID_SYNC_MAX_LATENCY,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CHARACTERISTICS_KEYS,\n> +\t\t\t\t  availableCharacteristicsKeys);\n> +\n> +\tstd::vector<int32_t> availableRequestKeys = {\n> +\t\tANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> +\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> +\t\tANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> +\t\tANDROID_CONTROL_AE_LOCK,\n> +\t\tANDROID_CONTROL_AE_MODE,\n> +\t\tANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> +\t\tANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> +\t\tANDROID_CONTROL_AF_MODE,\n> +\t\tANDROID_CONTROL_AF_TRIGGER,\n> +\t\tANDROID_CONTROL_AWB_LOCK,\n> +\t\tANDROID_CONTROL_AWB_MODE,\n> +\t\tANDROID_CONTROL_CAPTURE_INTENT,\n> +\t\tANDROID_CONTROL_EFFECT_MODE,\n> +\t\tANDROID_CONTROL_MODE,\n> +\t\tANDROID_CONTROL_SCENE_MODE,\n> +\t\tANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> +\t\tANDROID_FLASH_MODE,\n> +\t\tANDROID_JPEG_ORIENTATION,\n> +\t\tANDROID_JPEG_QUALITY,\n> 
+\t\tANDROID_JPEG_THUMBNAIL_QUALITY,\n> +\t\tANDROID_JPEG_THUMBNAIL_SIZE,\n> +\t\tANDROID_LENS_APERTURE,\n> +\t\tANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> +\t\tANDROID_NOISE_REDUCTION_MODE,\n> +\t\tANDROID_SCALER_CROP_REGION,\n> +\t\tANDROID_STATISTICS_FACE_DETECT_MODE\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_REQUEST_KEYS,\n> +\t\t\t\t  availableRequestKeys);\n> +\n> +\tstd::vector<int32_t> availableResultKeys = {\n> +\t\tANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> +\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> +\t\tANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> +\t\tANDROID_CONTROL_AE_LOCK,\n> +\t\tANDROID_CONTROL_AE_MODE,\n> +\t\tANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> +\t\tANDROID_CONTROL_AE_STATE,\n> +\t\tANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> +\t\tANDROID_CONTROL_AF_MODE,\n> +\t\tANDROID_CONTROL_AF_STATE,\n> +\t\tANDROID_CONTROL_AF_TRIGGER,\n> +\t\tANDROID_CONTROL_AWB_LOCK,\n> +\t\tANDROID_CONTROL_AWB_MODE,\n> +\t\tANDROID_CONTROL_AWB_STATE,\n> +\t\tANDROID_CONTROL_CAPTURE_INTENT,\n> +\t\tANDROID_CONTROL_EFFECT_MODE,\n> +\t\tANDROID_CONTROL_MODE,\n> +\t\tANDROID_CONTROL_SCENE_MODE,\n> +\t\tANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> +\t\tANDROID_FLASH_MODE,\n> +\t\tANDROID_FLASH_STATE,\n> +\t\tANDROID_JPEG_GPS_COORDINATES,\n> +\t\tANDROID_JPEG_GPS_PROCESSING_METHOD,\n> +\t\tANDROID_JPEG_GPS_TIMESTAMP,\n> +\t\tANDROID_JPEG_ORIENTATION,\n> +\t\tANDROID_JPEG_QUALITY,\n> +\t\tANDROID_JPEG_SIZE,\n> +\t\tANDROID_JPEG_THUMBNAIL_QUALITY,\n> +\t\tANDROID_JPEG_THUMBNAIL_SIZE,\n> +\t\tANDROID_LENS_APERTURE,\n> +\t\tANDROID_LENS_FOCAL_LENGTH,\n> +\t\tANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> +\t\tANDROID_LENS_STATE,\n> +\t\tANDROID_NOISE_REDUCTION_MODE,\n> +\t\tANDROID_REQUEST_PIPELINE_DEPTH,\n> +\t\tANDROID_SCALER_CROP_REGION,\n> +\t\tANDROID_SENSOR_EXPOSURE_TIME,\n> +\t\tANDROID_SENSOR_FRAME_DURATION,\n> +\t\tANDROID_SENSOR_ROLLING_SHUTTER_SKEW,\n> +\t\tANDROID_SENSOR_TEST_PATTERN_MODE,\n> +\t\tANDROID_SENSOR_TIMESTAMP,\n> 
+\t\tANDROID_STATISTICS_FACE_DETECT_MODE,\n> +\t\tANDROID_STATISTICS_LENS_SHADING_MAP_MODE,\n> +\t\tANDROID_STATISTICS_HOT_PIXEL_MAP_MODE,\n> +\t\tANDROID_STATISTICS_SCENE_FLICKER,\n> +\t};\n> +\tstaticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_RESULT_KEYS,\n> +\t\t\t\t  availableResultKeys);\n> +\n> +\tif (!staticMetadata_->isValid()) {\n> +\t\tLOG(HAL, Error) << \"Failed to construct static metadata\";\n> +\t\tstaticMetadata_.reset();\n> +\t\treturn -EINVAL;\n> +\t}\n> +\n> +\tif (staticMetadata_->resized()) {\n> +\t\tauto [entryCount, dataCount] = staticMetadata_->usage();\n> +\t\tLOG(HAL, Info)\n> +\t\t\t<< \"Static metadata resized: \" << entryCount\n> +\t\t\t<< \" entries and \" << dataCount << \" bytes used\";\n> +\t}\n> +\n> +\treturn 0;\n> +}\n> +\n> +/* Translate Android format code to libcamera pixel format. */\n> +PixelFormat CameraCapabilities::toPixelFormat(int format) const\n> +{\n> +\tauto it = formatsMap_.find(format);\n> +\tif (it == formatsMap_.end()) {\n> +\t\tLOG(HAL, Error) << \"Requested format \" << utils::hex(format)\n> +\t\t\t\t<< \" not supported\";\n> +\t\treturn PixelFormat();\n> +\t}\n> +\n> +\treturn it->second;\n> +}\n> +\n> +std::unique_ptr<CameraMetadata> CameraCapabilities::requestTemplatePreview() const\n> +{\n> +\t/*\n> +\t * \\todo Keep this in sync with the actual number of entries.\n> +\t * Currently: 20 entries, 35 bytes\n> +\t */\n> +\tauto requestTemplate = std::make_unique<CameraMetadata>(21, 36);\n> +\tif (!requestTemplate->isValid()) {\n> +\t\treturn nullptr;\n> +\t}\n> +\n> +\t/* Get the FPS range registered in the static metadata. 
*/\n> +\tcamera_metadata_ro_entry_t entry;\n> +\tbool found = staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> +\t\t\t\t\t       &entry);\n> +\tif (!found) {\n> +\t\tLOG(HAL, Error) << \"Cannot create capture template without FPS range\";\n> +\t\treturn nullptr;\n> +\t}\n> +\n> +\t/*\n> +\t * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> +\t * has been assembled as {{min, max} {max, max}}.\n> +\t */\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> +\t\t\t\t  entry.data.i32, 2);\n> +\n> +\tuint8_t aeMode = ANDROID_CONTROL_AE_MODE_ON;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AE_MODE, aeMode);\n> +\n> +\tint32_t aeExposureCompensation = 0;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> +\t\t\t\t  aeExposureCompensation);\n> +\n> +\tuint8_t aePrecaptureTrigger = ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_IDLE;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> +\t\t\t\t  aePrecaptureTrigger);\n> +\n> +\tuint8_t aeLock = ANDROID_CONTROL_AE_LOCK_OFF;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AE_LOCK, aeLock);\n> +\n> +\tuint8_t aeAntibandingMode = ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> +\t\t\t\t  aeAntibandingMode);\n> +\n> +\tuint8_t afMode = ANDROID_CONTROL_AF_MODE_OFF;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AF_MODE, afMode);\n> +\n> +\tuint8_t afTrigger = ANDROID_CONTROL_AF_TRIGGER_IDLE;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AF_TRIGGER, afTrigger);\n> +\n> +\tuint8_t awbMode = ANDROID_CONTROL_AWB_MODE_AUTO;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AWB_MODE, awbMode);\n> +\n> +\tuint8_t awbLock = ANDROID_CONTROL_AWB_LOCK_OFF;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_AWB_LOCK, awbLock);\n> +\n> +\tuint8_t flashMode = ANDROID_FLASH_MODE_OFF;\n> +\trequestTemplate->addEntry(ANDROID_FLASH_MODE, flashMode);\n> +\n> +\tuint8_t faceDetectMode = 
ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> +\trequestTemplate->addEntry(ANDROID_STATISTICS_FACE_DETECT_MODE,\n> +\t\t\t\t  faceDetectMode);\n> +\n> +\tuint8_t noiseReduction = ANDROID_NOISE_REDUCTION_MODE_OFF;\n> +\trequestTemplate->addEntry(ANDROID_NOISE_REDUCTION_MODE,\n> +\t\t\t\t  noiseReduction);\n> +\n> +\tuint8_t aberrationMode = ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF;\n> +\trequestTemplate->addEntry(ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> +\t\t\t\t  aberrationMode);\n> +\n> +\tuint8_t controlMode = ANDROID_CONTROL_MODE_AUTO;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_MODE, controlMode);\n> +\n> +\tfloat lensAperture = 2.53 / 100;\n> +\trequestTemplate->addEntry(ANDROID_LENS_APERTURE, lensAperture);\n> +\n> +\tuint8_t opticalStabilization = ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF;\n> +\trequestTemplate->addEntry(ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> +\t\t\t\t  opticalStabilization);\n> +\n> +\tuint8_t captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> +\trequestTemplate->addEntry(ANDROID_CONTROL_CAPTURE_INTENT,\n> +\t\t\t\t  captureIntent);\n> +\n> +\treturn requestTemplate;\n> +}\n> +\n> +std::unique_ptr<CameraMetadata> CameraCapabilities::requestTemplateVideo() const\n> +{\n> +\tstd::unique_ptr<CameraMetadata> previewTemplate = requestTemplatePreview();\n> +\tif (!previewTemplate)\n> +\t\treturn nullptr;\n> +\n> +\t/*\n> +\t * The video template requires a fixed FPS range. 
Everything else\n> +\t * stays the same as the preview template.\n> +\t */\n> +\tcamera_metadata_ro_entry_t entry;\n> +\tstaticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> +\t\t\t\t  &entry);\n> +\n> +\t/*\n> +\t * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> +\t * has been assembled as {{min, max} {max, max}}.\n> +\t */\n> +\tpreviewTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> +\t\t\t\t     entry.data.i32 + 2, 2);\n> +\n> +\treturn previewTemplate;\n> +}\n> diff --git a/src/android/camera_capabilities.h b/src/android/camera_capabilities.h\n> new file mode 100644\n> index 000000000000..f511607bbd90\n> --- /dev/null\n> +++ b/src/android/camera_capabilities.h\n> @@ -0,0 +1,65 @@\n> +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> +/*\n> + * Copyright (C) 2021, Google Inc.\n> + *\n> + * camera_capabilities.h - Camera static properties manager\n> + */\n> +#ifndef __ANDROID_CAMERA_CAPABILITIES_H__\n> +#define __ANDROID_CAMERA_CAPABILITIES_H__\n> +\n> +#include <map>\n> +#include <memory>\n> +#include <vector>\n> +\n> +#include <libcamera/camera.h>\n> +#include <libcamera/class.h>\n> +#include <libcamera/formats.h>\n> +#include <libcamera/geometry.h>\n> +\n> +#include \"camera_metadata.h\"\n> +\n> +class CameraCapabilities\n> +{\n> +public:\n> +\tCameraCapabilities() = default;\n> +\n> +\tint initialize(std::shared_ptr<libcamera::Camera> camera,\n> +\t\t       int orientation, int facing);\n> +\n> +\tCameraMetadata *staticMetadata() const { return staticMetadata_.get(); }\n> +\tlibcamera::PixelFormat toPixelFormat(int format) const;\n> +\tunsigned int maxJpegBufferSize() const { return maxJpegBufferSize_; }\n> +\n> +\tstd::unique_ptr<CameraMetadata> requestTemplatePreview() const;\n> +\tstd::unique_ptr<CameraMetadata> requestTemplateVideo() const;\n> +\n> +private:\n> +\tLIBCAMERA_DISABLE_COPY_AND_MOVE(CameraCapabilities)\n> +\n> +\tstruct Camera3StreamConfiguration {\n> +\t\tlibcamera::Size resolution;\n> 
+\t\tint androidFormat;\n> +\t};\n> +\n> +\tstd::vector<libcamera::Size>\n> +\tgetYUVResolutions(libcamera::CameraConfiguration *cameraConfig,\n> +\t\t\t  const libcamera::PixelFormat &pixelFormat,\n> +\t\t\t  const std::vector<libcamera::Size> &resolutions);\n> +\tstd::vector<libcamera::Size>\n> +\tgetRawResolutions(const libcamera::PixelFormat &pixelFormat);\n> +\tint initializeStreamConfigurations();\n> +\n> +\tint initializeStaticMetadata();\n> +\n> +\tstd::shared_ptr<libcamera::Camera> camera_;\n> +\n> +\tint facing_;\n> +\tint orientation_;\n> +\n> +\tstd::vector<Camera3StreamConfiguration> streamConfigurations_;\n> +\tstd::map<int, libcamera::PixelFormat> formatsMap_;\n> +\tstd::unique_ptr<CameraMetadata> staticMetadata_;\n> +\tunsigned int maxJpegBufferSize_;\n> +};\n> +\n> +#endif /* __ANDROID_CAMERA_CAPABILITIES_H__ */\n> diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp\n> index 8c71fd0675d3..4bd125d7020a 100644\n> --- a/src/android/camera_device.cpp\n> +++ b/src/android/camera_device.cpp\n> @@ -10,11 +10,8 @@\n>  #include \"camera_ops.h\"\n>  #include \"post_processor.h\"\n>  \n> -#include <array>\n> -#include <cmath>\n>  #include <fstream>\n>  #include <sys/mman.h>\n> -#include <tuple>\n>  #include <unistd.h>\n>  #include <vector>\n>  \n> @@ -23,7 +20,6 @@\n>  #include <libcamera/formats.h>\n>  #include <libcamera/property_ids.h>\n>  \n> -#include \"libcamera/internal/formats.h\"\n>  #include \"libcamera/internal/log.h\"\n>  #include \"libcamera/internal/thread.h\"\n>  #include \"libcamera/internal/utils.h\"\n> @@ -36,94 +32,6 @@ LOG_DECLARE_CATEGORY(HAL)\n>  \n>  namespace {\n>  \n> -/*\n> - * \\var camera3Resolutions\n> - * \\brief The list of image resolutions defined as mandatory to be supported by\n> - * the Android Camera3 specification\n> - */\n> -const std::vector<Size> camera3Resolutions = {\n> -\t{ 320, 240 },\n> -\t{ 640, 480 },\n> -\t{ 1280, 720 },\n> -\t{ 1920, 1080 }\n> -};\n> -\n> -/*\n> - * \\struct 
Camera3Format\n> - * \\brief Data associated with an Android format identifier\n> - * \\var libcameraFormats List of libcamera pixel formats compatible with the\n> - * Android format\n> - * \\var name The human-readable representation of the Android format code\n> - */\n> -struct Camera3Format {\n> -\tstd::vector<PixelFormat> libcameraFormats;\n> -\tbool mandatory;\n> -\tconst char *name;\n> -};\n> -\n> -/*\n> - * \\var camera3FormatsMap\n> - * \\brief Associate Android format code with ancillary data\n> - */\n> -const std::map<int, const Camera3Format> camera3FormatsMap = {\n> -\t{\n> -\t\tHAL_PIXEL_FORMAT_BLOB, {\n> -\t\t\t{ formats::MJPEG },\n> -\t\t\ttrue,\n> -\t\t\t\"BLOB\"\n> -\t\t}\n> -\t}, {\n> -\t\tHAL_PIXEL_FORMAT_YCbCr_420_888, {\n> -\t\t\t{ formats::NV12, formats::NV21 },\n> -\t\t\ttrue,\n> -\t\t\t\"YCbCr_420_888\"\n> -\t\t}\n> -\t}, {\n> -\t\t/*\n> -\t\t * \\todo Translate IMPLEMENTATION_DEFINED inspecting the gralloc\n> -\t\t * usage flag. For now, copy the YCbCr_420 configuration.\n> -\t\t */\n> -\t\tHAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> -\t\t\t{ formats::NV12, formats::NV21 },\n> -\t\t\ttrue,\n> -\t\t\t\"IMPLEMENTATION_DEFINED\"\n> -\t\t}\n> -\t}, {\n> -\t\tHAL_PIXEL_FORMAT_RAW10, {\n> -\t\t\t{\n> -\t\t\t\tformats::SBGGR10_CSI2P,\n> -\t\t\t\tformats::SGBRG10_CSI2P,\n> -\t\t\t\tformats::SGRBG10_CSI2P,\n> -\t\t\t\tformats::SRGGB10_CSI2P\n> -\t\t\t},\n> -\t\t\tfalse,\n> -\t\t\t\"RAW10\"\n> -\t\t}\n> -\t}, {\n> -\t\tHAL_PIXEL_FORMAT_RAW12, {\n> -\t\t\t{\n> -\t\t\t\tformats::SBGGR12_CSI2P,\n> -\t\t\t\tformats::SGBRG12_CSI2P,\n> -\t\t\t\tformats::SGRBG12_CSI2P,\n> -\t\t\t\tformats::SRGGB12_CSI2P\n> -\t\t\t},\n> -\t\t\tfalse,\n> -\t\t\t\"RAW12\"\n> -\t\t}\n> -\t}, {\n> -\t\tHAL_PIXEL_FORMAT_RAW16, {\n> -\t\t\t{\n> -\t\t\t\tformats::SBGGR16,\n> -\t\t\t\tformats::SGBRG16,\n> -\t\t\t\tformats::SGRBG16,\n> -\t\t\t\tformats::SRGGB16\n> -\t\t\t},\n> -\t\t\tfalse,\n> -\t\t\t\"RAW16\"\n> -\t\t}\n> -\t},\n> -};\n> -\n>  /*\n>   * \\struct 
Camera3StreamConfig\n>   * \\brief Data to store StreamConfiguration associated with camera3_stream(s)\n> @@ -512,242 +420,7 @@ int CameraDevice::initialize(const CameraConfigData *cameraConfigData)\n>  \t\torientation_ = 0;\n>  \t}\n>  \n> -\t/* Acquire the camera and initialize available stream configurations. */\n> -\tint ret = camera_->acquire();\n> -\tif (ret) {\n> -\t\tLOG(HAL, Error) << \"Failed to temporarily acquire the camera\";\n> -\t\treturn ret;\n> -\t}\n> -\n> -\tret = initializeStreamConfigurations();\n> -\tcamera_->release();\n> -\treturn ret;\n> -}\n> -\n> -std::vector<Size> CameraDevice::getYUVResolutions(CameraConfiguration *cameraConfig,\n> -\t\t\t\t\t\t  const PixelFormat &pixelFormat,\n> -\t\t\t\t\t\t  const std::vector<Size> &resolutions)\n> -{\n> -\tstd::vector<Size> supportedResolutions;\n> -\n> -\tStreamConfiguration &cfg = cameraConfig->at(0);\n> -\tfor (const Size &res : resolutions) {\n> -\t\tcfg.pixelFormat = pixelFormat;\n> -\t\tcfg.size = res;\n> -\n> -\t\tCameraConfiguration::Status status = cameraConfig->validate();\n> -\t\tif (status != CameraConfiguration::Valid) {\n> -\t\t\tLOG(HAL, Debug) << cfg.toString() << \" not supported\";\n> -\t\t\tcontinue;\n> -\t\t}\n> -\n> -\t\tLOG(HAL, Debug) << cfg.toString() << \" supported\";\n> -\n> -\t\tsupportedResolutions.push_back(res);\n> -\t}\n> -\n> -\treturn supportedResolutions;\n> -}\n> -\n> -std::vector<Size> CameraDevice::getRawResolutions(const libcamera::PixelFormat &pixelFormat)\n> -{\n> -\tstd::unique_ptr<CameraConfiguration> cameraConfig =\n> -\t\tcamera_->generateConfiguration({ StreamRole::Raw });\n> -\tStreamConfiguration &cfg = cameraConfig->at(0);\n> -\tconst StreamFormats &formats = cfg.formats();\n> -\tstd::vector<Size> supportedResolutions = formats.sizes(pixelFormat);\n> -\n> -\treturn supportedResolutions;\n> -}\n> -\n> -/*\n> - * Initialize the format conversion map to translate from Android format\n> - * identifier to libcamera pixel formats and fill in the list of 
supported\n> - * stream configurations to be reported to the Android camera framework through\n> - * the static stream configuration metadata.\n> - */\n> -int CameraDevice::initializeStreamConfigurations()\n> -{\n> -\t/*\n> -\t * Get the maximum output resolutions\n> -\t * \\todo Get this from the camera properties once defined\n> -\t */\n> -\tstd::unique_ptr<CameraConfiguration> cameraConfig =\n> -\t\tcamera_->generateConfiguration({ StillCapture });\n> -\tif (!cameraConfig) {\n> -\t\tLOG(HAL, Error) << \"Failed to get maximum resolution\";\n> -\t\treturn -EINVAL;\n> -\t}\n> -\tStreamConfiguration &cfg = cameraConfig->at(0);\n> -\n> -\t/*\n> -\t * \\todo JPEG - Adjust the maximum available resolution by taking the\n> -\t * JPEG encoder requirements into account (alignment and aspect ratio).\n> -\t */\n> -\tconst Size maxRes = cfg.size;\n> -\tLOG(HAL, Debug) << \"Maximum supported resolution: \" << maxRes.toString();\n> -\n> -\t/*\n> -\t * Build the list of supported image resolutions.\n> -\t *\n> -\t * The resolutions listed in camera3Resolution are mandatory to be\n> -\t * supported, up to the camera maximum resolution.\n> -\t *\n> -\t * Augment the list by adding resolutions calculated from the camera\n> -\t * maximum one.\n> -\t */\n> -\tstd::vector<Size> cameraResolutions;\n> -\tstd::copy_if(camera3Resolutions.begin(), camera3Resolutions.end(),\n> -\t\t     std::back_inserter(cameraResolutions),\n> -\t\t     [&](const Size &res) { return res < maxRes; });\n> -\n> -\t/*\n> -\t * The Camera3 specification suggests adding 1/2 and 1/4 of the maximum\n> -\t * resolution.\n> -\t */\n> -\tfor (unsigned int divider = 2;; divider <<= 1) {\n> -\t\tSize derivedSize{\n> -\t\t\tmaxRes.width / divider,\n> -\t\t\tmaxRes.height / divider,\n> -\t\t};\n> -\n> -\t\tif (derivedSize.width < 320 ||\n> -\t\t    derivedSize.height < 240)\n> -\t\t\tbreak;\n> -\n> -\t\tcameraResolutions.push_back(derivedSize);\n> -\t}\n> -\tcameraResolutions.push_back(maxRes);\n> -\n> -\t/* Remove 
duplicated entries from the list of supported resolutions. */\n> -\tstd::sort(cameraResolutions.begin(), cameraResolutions.end());\n> -\tauto last = std::unique(cameraResolutions.begin(), cameraResolutions.end());\n> -\tcameraResolutions.erase(last, cameraResolutions.end());\n> -\n> -\t/*\n> -\t * Build the list of supported camera formats.\n> -\t *\n> -\t * To each Android format a list of compatible libcamera formats is\n> -\t * associated. The first libcamera format that tests successful is added\n> -\t * to the format translation map used when configuring the streams.\n> -\t * It is then tested against the list of supported camera resolutions to\n> -\t * build the stream configuration map reported through the camera static\n> -\t * metadata.\n> -\t */\n> -\tSize maxJpegSize;\n> -\tfor (const auto &format : camera3FormatsMap) {\n> -\t\tint androidFormat = format.first;\n> -\t\tconst Camera3Format &camera3Format = format.second;\n> -\t\tconst std::vector<PixelFormat> &libcameraFormats =\n> -\t\t\tcamera3Format.libcameraFormats;\n> -\n> -\t\tLOG(HAL, Debug) << \"Trying to map Android format \"\n> -\t\t\t\t<< camera3Format.name;\n> -\n> -\t\t/*\n> -\t\t * JPEG is always supported, either produced directly by the\n> -\t\t * camera, or encoded in the HAL.\n> -\t\t */\n> -\t\tif (androidFormat == HAL_PIXEL_FORMAT_BLOB) {\n> -\t\t\tformatsMap_[androidFormat] = formats::MJPEG;\n> -\t\t\tLOG(HAL, Debug) << \"Mapped Android format \"\n> -\t\t\t\t\t<< camera3Format.name << \" to \"\n> -\t\t\t\t\t<< formats::MJPEG.toString()\n> -\t\t\t\t\t<< \" (fixed mapping)\";\n> -\t\t\tcontinue;\n> -\t\t}\n> -\n> -\t\t/*\n> -\t\t * Test the libcamera formats that can produce images\n> -\t\t * compatible with the format defined by Android.\n> -\t\t */\n> -\t\tPixelFormat mappedFormat;\n> -\t\tfor (const PixelFormat &pixelFormat : libcameraFormats) {\n> -\n> -\t\t\tLOG(HAL, Debug) << \"Testing \" << pixelFormat.toString();\n> -\n> -\t\t\t/*\n> -\t\t\t * The stream configuration size can 
be adjusted,\n> -\t\t\t * not the pixel format.\n> -\t\t\t *\n> -\t\t\t * \\todo This could be simplified once all pipeline\n> -\t\t\t * handlers will report the StreamFormats list of\n> -\t\t\t * supported formats.\n> -\t\t\t */\n> -\t\t\tcfg.pixelFormat = pixelFormat;\n> -\n> -\t\t\tCameraConfiguration::Status status = cameraConfig->validate();\n> -\t\t\tif (status != CameraConfiguration::Invalid &&\n> -\t\t\t    cfg.pixelFormat == pixelFormat) {\n> -\t\t\t\tmappedFormat = pixelFormat;\n> -\t\t\t\tbreak;\n> -\t\t\t}\n> -\t\t}\n> -\n> -\t\tif (!mappedFormat.isValid()) {\n> -\t\t\t/* If the format is not mandatory, skip it. */\n> -\t\t\tif (!camera3Format.mandatory)\n> -\t\t\t\tcontinue;\n> -\n> -\t\t\tLOG(HAL, Error)\n> -\t\t\t\t<< \"Failed to map mandatory Android format \"\n> -\t\t\t\t<< camera3Format.name << \" (\"\n> -\t\t\t\t<< utils::hex(androidFormat) << \"): aborting\";\n> -\t\t\treturn -EINVAL;\n> -\t\t}\n> -\n> -\t\t/*\n> -\t\t * Record the mapping and then proceed to generate the\n> -\t\t * stream configurations map, by testing the image resolutions.\n> -\t\t */\n> -\t\tformatsMap_[androidFormat] = mappedFormat;\n> -\t\tLOG(HAL, Debug) << \"Mapped Android format \"\n> -\t\t\t\t<< camera3Format.name << \" to \"\n> -\t\t\t\t<< mappedFormat.toString();\n> -\n> -\t\tstd::vector<Size> resolutions;\n> -\t\tconst PixelFormatInfo &info = PixelFormatInfo::info(mappedFormat);\n> -\t\tif (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW)\n> -\t\t\tresolutions = getRawResolutions(mappedFormat);\n> -\t\telse\n> -\t\t\tresolutions = getYUVResolutions(cameraConfig.get(),\n> -\t\t\t\t\t\t\tmappedFormat,\n> -\t\t\t\t\t\t\tcameraResolutions);\n> -\n> -\t\tfor (const Size &res : resolutions) {\n> -\t\t\tstreamConfigurations_.push_back({ res, androidFormat });\n> -\n> -\t\t\t/*\n> -\t\t\t * If the format is HAL_PIXEL_FORMAT_YCbCr_420_888\n> -\t\t\t * from which JPEG is produced, add an entry for\n> -\t\t\t * the JPEG stream.\n> -\t\t\t *\n> -\t\t\t * \\todo Wire 
the JPEG encoder to query the supported\n> -\t\t\t * sizes provided a list of formats it can encode.\n> -\t\t\t *\n> -\t\t\t * \\todo Support JPEG streams produced by the Camera\n> -\t\t\t * natively.\n> -\t\t\t */\n> -\t\t\tif (androidFormat == HAL_PIXEL_FORMAT_YCbCr_420_888) {\n> -\t\t\t\tstreamConfigurations_.push_back(\n> -\t\t\t\t\t{ res, HAL_PIXEL_FORMAT_BLOB });\n> -\t\t\t\tmaxJpegSize = std::max(maxJpegSize, res);\n> -\t\t\t}\n> -\t\t}\n> -\n> -\t\t/*\n> -\t\t * \\todo Calculate the maximum JPEG buffer size by asking the\n> -\t\t * encoder giving the maximum frame size required.\n> -\t\t */\n> -\t\tmaxJpegBufferSize_ = maxJpegSize.width * maxJpegSize.height * 1.5;\n> -\t}\n> -\n> -\tLOG(HAL, Debug) << \"Collected stream configuration map: \";\n> -\tfor (const auto &entry : streamConfigurations_)\n> -\t\tLOG(HAL, Debug) << \"{ \" << entry.resolution.toString() << \" - \"\n> -\t\t\t\t<< utils::hex(entry.androidFormat) << \" }\";\n> -\n> -\treturn 0;\n> +\treturn capabilities_.initialize(camera_, orientation_, facing_);\n>  }\n>  \n>  /*\n> @@ -817,802 +490,19 @@ void CameraDevice::stop()\n>  \tstate_ = State::Stopped;\n>  }\n>  \n> -void CameraDevice::setCallbacks(const camera3_callback_ops_t *callbacks)\n> +unsigned int CameraDevice::maxJpegBufferSize() const\n>  {\n> -\tcallbacks_ = callbacks;\n> +\treturn capabilities_.maxJpegBufferSize();\n>  }\n>  \n> -/*\n> - * Return static information for the camera.\n> - */\n> -const camera_metadata_t *CameraDevice::getStaticMetadata()\n> -{\n> -\tif (staticMetadata_)\n> -\t\treturn staticMetadata_->get();\n> -\n> -\tstaticMetadata_ = std::make_unique<CameraMetadata>(64, 1024);\n> -\tif (!staticMetadata_->isValid()) {\n> -\t\tLOG(HAL, Error) << \"Failed to allocate static metadata\";\n> -\t\tstaticMetadata_.reset();\n> -\t\treturn nullptr;\n> -\t}\n> -\n> -\tconst ControlInfoMap &controlsInfo = camera_->controls();\n> -\tconst ControlList &properties = camera_->properties();\n> -\n> -\t/* Color correction static 
metadata. */\n> -\t{\n> -\t\tstd::vector<uint8_t> data;\n> -\t\tdata.reserve(3);\n> -\t\tconst auto &infoMap = controlsInfo.find(&controls::draft::ColorCorrectionAberrationMode);\n> -\t\tif (infoMap != controlsInfo.end()) {\n> -\t\t\tfor (const auto &value : infoMap->second.values())\n> -\t\t\t\tdata.push_back(value.get<int32_t>());\n> -\t\t} else {\n> -\t\t\tdata.push_back(ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF);\n> -\t\t}\n> -\t\tstaticMetadata_->addEntry(ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> -\t\t\t\t\t  data);\n> -\t}\n> -\n> -\t/* Control static metadata. */\n> -\tstd::vector<uint8_t> aeAvailableAntiBandingModes = {\n> -\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE_OFF,\n> -\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE_50HZ,\n> -\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE_60HZ,\n> -\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> -\t\t\t\t  aeAvailableAntiBandingModes);\n> -\n> -\tstd::vector<uint8_t> aeAvailableModes = {\n> -\t\tANDROID_CONTROL_AE_MODE_ON,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> -\t\t\t\t  aeAvailableModes);\n> -\n> -\tint64_t minFrameDurationNsec = -1;\n> -\tint64_t maxFrameDurationNsec = -1;\n> -\tconst auto frameDurationsInfo = controlsInfo.find(&controls::FrameDurationLimits);\n> -\tif (frameDurationsInfo != controlsInfo.end()) {\n> -\t\tminFrameDurationNsec = frameDurationsInfo->second.min().get<int64_t>() * 1000;\n> -\t\tmaxFrameDurationNsec = frameDurationsInfo->second.max().get<int64_t>() * 1000;\n> -\n> -\t\t/*\n> -\t\t * Adjust the minimum frame duration to comply with Android\n> -\t\t * requirements. 
The camera service mandates all preview/record\n> -\t\t * streams to have a minimum frame duration < 33,366 milliseconds\n> -\t\t * (see MAX_PREVIEW_RECORD_DURATION_NS in the camera service\n> -\t\t * implementation).\n> -\t\t *\n> -\t\t * If we're close enough (+ 500 useconds) to that value, round\n> -\t\t * the minimum frame duration of the camera to an accepted\n> -\t\t * value.\n> -\t\t */\n> -\t\tstatic constexpr int64_t MAX_PREVIEW_RECORD_DURATION_NS = 1e9 / 29.97;\n> -\t\tif (minFrameDurationNsec > MAX_PREVIEW_RECORD_DURATION_NS &&\n> -\t\t    minFrameDurationNsec < MAX_PREVIEW_RECORD_DURATION_NS + 500000)\n> -\t\t\tminFrameDurationNsec = MAX_PREVIEW_RECORD_DURATION_NS - 1000;\n> -\n> -\t\t/*\n> -\t\t * The AE routine frame rate limits are computed using the frame\n> -\t\t * duration limits, as libcamera clips the AE routine to the\n> -\t\t * frame durations.\n> -\t\t */\n> -\t\tint32_t maxFps = std::round(1e9 / minFrameDurationNsec);\n> -\t\tint32_t minFps = std::round(1e9 / maxFrameDurationNsec);\n> -\t\tminFps = std::max(1, minFps);\n> -\n> -\t\t/*\n> -\t\t * Force rounding errors so that we have the proper frame\n> -\t\t * durations for when we reuse these variables later\n> -\t\t */\n> -\t\tminFrameDurationNsec = 1e9 / maxFps;\n> -\t\tmaxFrameDurationNsec = 1e9 / minFps;\n> -\n> -\t\t/*\n> -\t\t * Register to the camera service {min, max} and {max, max}\n> -\t\t * intervals as requested by the metadata documentation.\n> -\t\t */\n> -\t\tint32_t availableAeFpsTarget[] = {\n> -\t\t\tminFps, maxFps, maxFps, maxFps\n> -\t\t};\n> -\t\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> -\t\t\t\t\t  availableAeFpsTarget);\n> -\t}\n> -\n> -\tstd::vector<int32_t> aeCompensationRange = {\n> -\t\t0, 0,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> -\t\t\t\t  aeCompensationRange);\n> -\n> -\tconst camera_metadata_rational_t aeCompensationStep[] = {\n> -\t\t{ 0, 1 }\n> -\t};\n> 
-\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> -\t\t\t\t  aeCompensationStep);\n> -\n> -\tstd::vector<uint8_t> availableAfModes = {\n> -\t\tANDROID_CONTROL_AF_MODE_OFF,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> -\t\t\t\t  availableAfModes);\n> -\n> -\tstd::vector<uint8_t> availableEffects = {\n> -\t\tANDROID_CONTROL_EFFECT_MODE_OFF,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> -\t\t\t\t  availableEffects);\n> -\n> -\tstd::vector<uint8_t> availableSceneModes = {\n> -\t\tANDROID_CONTROL_SCENE_MODE_DISABLED,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> -\t\t\t\t  availableSceneModes);\n> -\n> -\tstd::vector<uint8_t> availableStabilizationModes = {\n> -\t\tANDROID_CONTROL_VIDEO_STABILIZATION_MODE_OFF,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> -\t\t\t\t  availableStabilizationModes);\n> -\n> -\t/*\n> -\t * \\todo Inspect the Camera capabilities to report the available\n> -\t * AWB modes. 
Default to AUTO as CTS tests require it.\n> -\t */\n> -\tstd::vector<uint8_t> availableAwbModes = {\n> -\t\tANDROID_CONTROL_AWB_MODE_AUTO,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> -\t\t\t\t  availableAwbModes);\n> -\n> -\tstd::vector<int32_t> availableMaxRegions = {\n> -\t\t0, 0, 0,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n> -\t\t\t\t  availableMaxRegions);\n> -\n> -\tstd::vector<uint8_t> sceneModesOverride = {\n> -\t\tANDROID_CONTROL_AE_MODE_ON,\n> -\t\tANDROID_CONTROL_AWB_MODE_AUTO,\n> -\t\tANDROID_CONTROL_AF_MODE_OFF,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> -\t\t\t\t  sceneModesOverride);\n> -\n> -\tuint8_t aeLockAvailable = ANDROID_CONTROL_AE_LOCK_AVAILABLE_FALSE;\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> -\t\t\t\t  aeLockAvailable);\n> -\n> -\tuint8_t awbLockAvailable = ANDROID_CONTROL_AWB_LOCK_AVAILABLE_FALSE;\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> -\t\t\t\t  awbLockAvailable);\n> -\n> -\tchar availableControlModes = ANDROID_CONTROL_MODE_AUTO;\n> -\tstaticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_MODES,\n> -\t\t\t\t  availableControlModes);\n> -\n> -\t/* JPEG static metadata. 
*/\n> -\n> -\t/*\n> -\t * Create the list of supported thumbnail sizes by inspecting the\n> -\t * available JPEG resolutions collected in streamConfigurations_ and\n> -\t * generate one entry for each aspect ratio.\n> -\t *\n> -\t * The JPEG thumbnailer can freely scale, so pick an arbitrary\n> -\t * (160, 160) size as the bounding rectangle, which is then cropped to\n> -\t * the different supported aspect ratios.\n> -\t */\n> -\tconstexpr Size maxJpegThumbnail(160, 160);\n> -\tstd::vector<Size> thumbnailSizes;\n> -\tthumbnailSizes.push_back({ 0, 0 });\n> -\tfor (const auto &entry : streamConfigurations_) {\n> -\t\tif (entry.androidFormat != HAL_PIXEL_FORMAT_BLOB)\n> -\t\t\tcontinue;\n> -\n> -\t\tSize thumbnailSize = maxJpegThumbnail\n> -\t\t\t\t     .boundedToAspectRatio({ entry.resolution.width,\n> -\t\t\t\t\t\t\t     entry.resolution.height });\n> -\t\tthumbnailSizes.push_back(thumbnailSize);\n> -\t}\n> -\n> -\tstd::sort(thumbnailSizes.begin(), thumbnailSizes.end());\n> -\tauto last = std::unique(thumbnailSizes.begin(), thumbnailSizes.end());\n> -\tthumbnailSizes.erase(last, thumbnailSizes.end());\n> -\n> -\t/* Transform sizes in to a list of integers that can be consumed. */\n> -\tstd::vector<int32_t> thumbnailEntries;\n> -\tthumbnailEntries.reserve(thumbnailSizes.size() * 2);\n> -\tfor (const auto &size : thumbnailSizes) {\n> -\t\tthumbnailEntries.push_back(size.width);\n> -\t\tthumbnailEntries.push_back(size.height);\n> -\t}\n> -\tstaticMetadata_->addEntry(ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> -\t\t\t\t  thumbnailEntries);\n> -\n> -\tstaticMetadata_->addEntry(ANDROID_JPEG_MAX_SIZE, maxJpegBufferSize_);\n> -\n> -\t/* Sensor static metadata. 
*/\n> -\tstd::array<int32_t, 2> pixelArraySize;\n> -\t{\n> -\t\tconst Size &size = properties.get(properties::PixelArraySize);\n> -\t\tpixelArraySize[0] = size.width;\n> -\t\tpixelArraySize[1] = size.height;\n> -\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> -\t\t\t\t\t  pixelArraySize);\n> -\t}\n> -\n> -\tif (properties.contains(properties::UnitCellSize)) {\n> -\t\tconst Size &cellSize = properties.get<Size>(properties::UnitCellSize);\n> -\t\tstd::array<float, 2> physicalSize{\n> -\t\t\tcellSize.width * pixelArraySize[0] / 1e6f,\n> -\t\t\tcellSize.height * pixelArraySize[1] / 1e6f\n> -\t\t};\n> -\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> -\t\t\t\t\t  physicalSize);\n> -\t}\n> -\n> -\t{\n> -\t\tconst Span<const Rectangle> &rects =\n> -\t\t\tproperties.get(properties::PixelArrayActiveAreas);\n> -\t\tstd::vector<int32_t> data{\n> -\t\t\tstatic_cast<int32_t>(rects[0].x),\n> -\t\t\tstatic_cast<int32_t>(rects[0].y),\n> -\t\t\tstatic_cast<int32_t>(rects[0].width),\n> -\t\t\tstatic_cast<int32_t>(rects[0].height),\n> -\t\t};\n> -\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> -\t\t\t\t\t  data);\n> -\t}\n> -\n> -\tint32_t sensitivityRange[] = {\n> -\t\t32, 2400,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> -\t\t\t\t  sensitivityRange);\n> -\n> -\t/* Report the color filter arrangement if the camera reports it. 
*/\n> -\tif (properties.contains(properties::draft::ColorFilterArrangement)) {\n> -\t\tuint8_t filterArr = properties.get(properties::draft::ColorFilterArrangement);\n> -\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> -\t\t\t\t\t  filterArr);\n> -\t}\n> -\n> -\tconst auto &exposureInfo = controlsInfo.find(&controls::ExposureTime);\n> -\tif (exposureInfo != controlsInfo.end()) {\n> -\t\tint64_t exposureTimeRange[2] = {\n> -\t\t\texposureInfo->second.min().get<int32_t>() * 1000LL,\n> -\t\t\texposureInfo->second.max().get<int32_t>() * 1000LL,\n> -\t\t};\n> -\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> -\t\t\t\t\t  exposureTimeRange, 2);\n> -\t}\n> -\n> -\tstaticMetadata_->addEntry(ANDROID_SENSOR_ORIENTATION, orientation_);\n> -\n> -\tstd::vector<int32_t> testPatternModes = {\n> -\t\tANDROID_SENSOR_TEST_PATTERN_MODE_OFF\n> -\t};\n> -\tconst auto &testPatternsInfo =\n> -\t\tcontrolsInfo.find(&controls::draft::TestPatternMode);\n> -\tif (testPatternsInfo != controlsInfo.end()) {\n> -\t\tconst auto &values = testPatternsInfo->second.values();\n> -\t\tASSERT(!values.empty());\n> -\t\tfor (const auto &value : values) {\n> -\t\t\tswitch (value.get<int32_t>()) {\n> -\t\t\tcase controls::draft::TestPatternModeOff:\n> -\t\t\t\t/*\n> -\t\t\t\t * ANDROID_SENSOR_TEST_PATTERN_MODE_OFF is\n> -\t\t\t\t * already in testPatternModes.\n> -\t\t\t\t */\n> -\t\t\t\tbreak;\n> -\n> -\t\t\tcase controls::draft::TestPatternModeSolidColor:\n> -\t\t\t\ttestPatternModes.push_back(\n> -\t\t\t\t\tANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR);\n> -\t\t\t\tbreak;\n> -\n> -\t\t\tcase controls::draft::TestPatternModeColorBars:\n> -\t\t\t\ttestPatternModes.push_back(\n> -\t\t\t\t\tANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS);\n> -\t\t\t\tbreak;\n> -\n> -\t\t\tcase controls::draft::TestPatternModeColorBarsFadeToGray:\n> -\t\t\t\ttestPatternModes.push_back(\n> -\t\t\t\t\tANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS_FADE_TO_GRAY);\n> 
-\t\t\t\tbreak;\n> -\n> -\t\t\tcase controls::draft::TestPatternModePn9:\n> -\t\t\t\ttestPatternModes.push_back(\n> -\t\t\t\t\tANDROID_SENSOR_TEST_PATTERN_MODE_PN9);\n> -\t\t\t\tbreak;\n> -\n> -\t\t\tcase controls::draft::TestPatternModeCustom1:\n> -\t\t\t\t/* We don't support this yet. */\n> -\t\t\t\tbreak;\n> -\n> -\t\t\tdefault:\n> -\t\t\t\tLOG(HAL, Error) << \"Unknown test pattern mode: \"\n> -\t\t\t\t\t\t<< value.get<int32_t>();\n> -\t\t\t\tcontinue;\n> -\t\t\t}\n> -\t\t}\n> -\t}\n> -\tstaticMetadata_->addEntry(ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> -\t\t\t\t  testPatternModes);\n> -\n> -\tuint8_t timestampSource = ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN;\n> -\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> -\t\t\t\t  timestampSource);\n> -\n> -\tif (maxFrameDurationNsec > 0)\n> -\t\tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> -\t\t\t\t\t  maxFrameDurationNsec);\n> -\n> -\t/* Statistics static metadata. */\n> -\tuint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> -\tstaticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> -\t\t\t\t  faceDetectMode);\n> -\n> -\tint32_t maxFaceCount = 0;\n> -\tstaticMetadata_->addEntry(ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> -\t\t\t\t  maxFaceCount);\n> -\n> -\t{\n> -\t\tstd::vector<uint8_t> data;\n> -\t\tdata.reserve(2);\n> -\t\tconst auto &infoMap = controlsInfo.find(&controls::draft::LensShadingMapMode);\n> -\t\tif (infoMap != controlsInfo.end()) {\n> -\t\t\tfor (const auto &value : infoMap->second.values())\n> -\t\t\t\tdata.push_back(value.get<int32_t>());\n> -\t\t} else {\n> -\t\t\tdata.push_back(ANDROID_STATISTICS_LENS_SHADING_MAP_MODE_OFF);\n> -\t\t}\n> -\t\tstaticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_LENS_SHADING_MAP_MODES,\n> -\t\t\t\t\t  data);\n> -\t}\n> -\n> -\t/* Sync static metadata. 
*/\n> -\tint32_t maxLatency = ANDROID_SYNC_MAX_LATENCY_UNKNOWN;\n> -\tstaticMetadata_->addEntry(ANDROID_SYNC_MAX_LATENCY, maxLatency);\n> -\n> -\t/* Flash static metadata. */\n> -\tchar flashAvailable = ANDROID_FLASH_INFO_AVAILABLE_FALSE;\n> -\tstaticMetadata_->addEntry(ANDROID_FLASH_INFO_AVAILABLE,\n> -\t\t\t\t  flashAvailable);\n> -\n> -\t/* Lens static metadata. */\n> -\tstd::vector<float> lensApertures = {\n> -\t\t2.53 / 100,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> -\t\t\t\t  lensApertures);\n> -\n> -\tuint8_t lensFacing;\n> -\tswitch (facing_) {\n> -\tdefault:\n> -\tcase CAMERA_FACING_FRONT:\n> -\t\tlensFacing = ANDROID_LENS_FACING_FRONT;\n> -\t\tbreak;\n> -\tcase CAMERA_FACING_BACK:\n> -\t\tlensFacing = ANDROID_LENS_FACING_BACK;\n> -\t\tbreak;\n> -\tcase CAMERA_FACING_EXTERNAL:\n> -\t\tlensFacing = ANDROID_LENS_FACING_EXTERNAL;\n> -\t\tbreak;\n> -\t}\n> -\tstaticMetadata_->addEntry(ANDROID_LENS_FACING, lensFacing);\n> -\n> -\tstd::vector<float> lensFocalLengths = {\n> -\t\t1,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> -\t\t\t\t  lensFocalLengths);\n> -\n> -\tstd::vector<uint8_t> opticalStabilizations = {\n> -\t\tANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> -\t\t\t\t  opticalStabilizations);\n> -\n> -\tfloat hypeFocalDistance = 0;\n> -\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> -\t\t\t\t  hypeFocalDistance);\n> -\n> -\tfloat minFocusDistance = 0;\n> -\tstaticMetadata_->addEntry(ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> -\t\t\t\t  minFocusDistance);\n> -\n> -\t/* Noise reduction modes. 
*/\n> -\t{\n> -\t\tstd::vector<uint8_t> data;\n> -\t\tdata.reserve(5);\n> -\t\tconst auto &infoMap = controlsInfo.find(&controls::draft::NoiseReductionMode);\n> -\t\tif (infoMap != controlsInfo.end()) {\n> -\t\t\tfor (const auto &value : infoMap->second.values())\n> -\t\t\t\tdata.push_back(value.get<int32_t>());\n> -\t\t} else {\n> -\t\t\tdata.push_back(ANDROID_NOISE_REDUCTION_MODE_OFF);\n> -\t\t}\n> -\t\tstaticMetadata_->addEntry(ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> -\t\t\t\t\t  data);\n> -\t}\n> -\n> -\t/* Scaler static metadata. */\n> -\n> -\t/*\n> -\t * \\todo The digital zoom factor is a property that depends on the\n> -\t * desired output configuration and the sensor frame size input to the\n> -\t * ISP. This information is not available to the Android HAL, not at\n> -\t * initialization time at least.\n> -\t *\n> -\t * As a workaround rely on pipeline handlers initializing the\n> -\t * ScalerCrop control with the camera default configuration and use the\n> -\t * maximum and minimum crop rectangles to calculate the digital zoom\n> -\t * factor.\n> -\t */\n> -\tfloat maxZoom = 1.0f;\n> -\tconst auto scalerCrop = controlsInfo.find(&controls::ScalerCrop);\n> -\tif (scalerCrop != controlsInfo.end()) {\n> -\t\tRectangle min = scalerCrop->second.min().get<Rectangle>();\n> -\t\tRectangle max = scalerCrop->second.max().get<Rectangle>();\n> -\t\tmaxZoom = std::min(1.0f * max.width / min.width,\n> -\t\t\t\t   1.0f * max.height / min.height);\n> -\t}\n> -\tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> -\t\t\t\t  maxZoom);\n> -\n> -\tstd::vector<uint32_t> availableStreamConfigurations;\n> -\tavailableStreamConfigurations.reserve(streamConfigurations_.size() * 4);\n> -\tfor (const auto &entry : streamConfigurations_) {\n> -\t\tavailableStreamConfigurations.push_back(entry.androidFormat);\n> -\t\tavailableStreamConfigurations.push_back(entry.resolution.width);\n> 
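The ScalerCrop workaround above derives the maximum digital zoom from the ratio between the largest and smallest crop rectangles, taking the more conservative axis. A minimal standalone sketch of that computation, with a stand-in `Rect` instead of `libcamera::Rectangle` and hypothetical sizes:

```cpp
#include <algorithm>
#include <cassert>

/* Minimal stand-in for libcamera::Rectangle; only width/height matter here. */
struct Rect {
	unsigned int width;
	unsigned int height;
};

/*
 * Maximum digital zoom as computed in the workaround: the ratio of the
 * largest to the smallest ScalerCrop rectangle, per axis, taking the
 * smaller of the two ratios.
 */
float maxDigitalZoom(const Rect &min, const Rect &max)
{
	return std::min(1.0f * max.width / min.width,
			1.0f * max.height / min.height);
}
```

For example, with a minimum crop of 1280x480 and a maximum of 2560x1920 the width only allows 2x, so 2x is reported even though the height would allow 4x.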
-\t\tavailableStreamConfigurations.push_back(entry.resolution.height);\n> -\t\tavailableStreamConfigurations.push_back(\n> -\t\t\tANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT);\n> -\t}\n> -\tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> -\t\t\t\t  availableStreamConfigurations);\n> -\n> -\tstd::vector<int64_t> availableStallDurations = {\n> -\t\tANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920, 33333333,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> -\t\t\t\t  availableStallDurations);\n> -\n> -\t/* Use the minimum frame duration for all the YUV/RGB formats. */\n> -\tif (minFrameDurationNsec > 0) {\n> -\t\tstd::vector<int64_t> minFrameDurations;\n> -\t\tminFrameDurations.reserve(streamConfigurations_.size() * 4);\n> -\t\tfor (const auto &entry : streamConfigurations_) {\n> -\t\t\tminFrameDurations.push_back(entry.androidFormat);\n> -\t\t\tminFrameDurations.push_back(entry.resolution.width);\n> -\t\t\tminFrameDurations.push_back(entry.resolution.height);\n> -\t\t\tminFrameDurations.push_back(minFrameDurationNsec);\n> -\t\t}\n> -\t\tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> -\t\t\t\t\t  minFrameDurations);\n> -\t}\n> -\n> -\tuint8_t croppingType = ANDROID_SCALER_CROPPING_TYPE_CENTER_ONLY;\n> -\tstaticMetadata_->addEntry(ANDROID_SCALER_CROPPING_TYPE, croppingType);\n> -\n> -\t/* Info static metadata. */\n> -\tuint8_t supportedHWLevel = ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;\n> -\tstaticMetadata_->addEntry(ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> -\t\t\t\t  supportedHWLevel);\n> -\n> -\t/* Request static metadata. */\n> -\tint32_t partialResultCount = 1;\n> -\tstaticMetadata_->addEntry(ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> -\t\t\t\t  partialResultCount);\n> -\n> -\t{\n> -\t\t/* Default the value to 2 if not reported by the camera. 
*/\n> -\t\tuint8_t maxPipelineDepth = 2;\n> -\t\tconst auto &infoMap = controlsInfo.find(&controls::draft::PipelineDepth);\n> -\t\tif (infoMap != controlsInfo.end())\n> -\t\t\tmaxPipelineDepth = infoMap->second.max().get<int32_t>();\n> -\t\tstaticMetadata_->addEntry(ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> -\t\t\t\t\t  maxPipelineDepth);\n> -\t}\n> -\n> -\t/* LIMITED does not support reprocessing. */\n> -\tuint32_t maxNumInputStreams = 0;\n> -\tstaticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> -\t\t\t\t  maxNumInputStreams);\n> -\n> -\tstd::vector<uint8_t> availableCapabilities = {\n> -\t\tANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE,\n> -\t};\n> -\n> -\t/* Report if camera supports RAW. */\n> -\tbool rawStreamAvailable = false;\n> -\tstd::unique_ptr<CameraConfiguration> cameraConfig =\n> -\t\tcamera_->generateConfiguration({ StreamRole::Raw });\n> -\tif (cameraConfig && !cameraConfig->empty()) {\n> -\t\tconst PixelFormatInfo &info =\n> -\t\t\tPixelFormatInfo::info(cameraConfig->at(0).pixelFormat);\n> -\t\t/* Only advertise RAW support if RAW16 is possible. 
*/\n> -\t\tif (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW &&\n> -\t\t    info.bitsPerPixel == 16) {\n> -\t\t\trawStreamAvailable = true;\n> -\t\t\tavailableCapabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);\n> -\t\t}\n> -\t}\n> -\n> -\t/* Number of { RAW, YUV, JPEG } supported output streams */\n> -\tint32_t numOutStreams[] = { rawStreamAvailable, 2, 1 };\n> -\tstaticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> -\t\t\t\t  numOutStreams);\n> -\n> -\tstaticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> -\t\t\t\t  availableCapabilities);\n> -\n> -\tstd::vector<int32_t> availableCharacteristicsKeys = {\n> -\t\tANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> -\t\tANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> -\t\tANDROID_CONTROL_AE_AVAILABLE_MODES,\n> -\t\tANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> -\t\tANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> -\t\tANDROID_CONTROL_AE_COMPENSATION_STEP,\n> -\t\tANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> -\t\tANDROID_CONTROL_AF_AVAILABLE_MODES,\n> -\t\tANDROID_CONTROL_AVAILABLE_EFFECTS,\n> -\t\tANDROID_CONTROL_AVAILABLE_MODES,\n> -\t\tANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> -\t\tANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> -\t\tANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> -\t\tANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> -\t\tANDROID_CONTROL_MAX_REGIONS,\n> -\t\tANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> -\t\tANDROID_FLASH_INFO_AVAILABLE,\n> -\t\tANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> -\t\tANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> -\t\tANDROID_JPEG_MAX_SIZE,\n> -\t\tANDROID_LENS_FACING,\n> -\t\tANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> -\t\tANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> -\t\tANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> -\t\tANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> -\t\tANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> -\t\tANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> -\t\tANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> 
-\t\tANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> -\t\tANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> -\t\tANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> -\t\tANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> -\t\tANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> -\t\tANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> -\t\tANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> -\t\tANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> -\t\tANDROID_SCALER_CROPPING_TYPE,\n> -\t\tANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> -\t\tANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> -\t\tANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> -\t\tANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> -\t\tANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> -\t\tANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> -\t\tANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> -\t\tANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> -\t\tANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> -\t\tANDROID_SENSOR_ORIENTATION,\n> -\t\tANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> -\t\tANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> -\t\tANDROID_SYNC_MAX_LATENCY,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CHARACTERISTICS_KEYS,\n> -\t\t\t\t  availableCharacteristicsKeys);\n> -\n> -\tstd::vector<int32_t> availableRequestKeys = {\n> -\t\tANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> -\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> -\t\tANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> -\t\tANDROID_CONTROL_AE_LOCK,\n> -\t\tANDROID_CONTROL_AE_MODE,\n> -\t\tANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> -\t\tANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> -\t\tANDROID_CONTROL_AF_MODE,\n> -\t\tANDROID_CONTROL_AF_TRIGGER,\n> -\t\tANDROID_CONTROL_AWB_LOCK,\n> -\t\tANDROID_CONTROL_AWB_MODE,\n> -\t\tANDROID_CONTROL_CAPTURE_INTENT,\n> -\t\tANDROID_CONTROL_EFFECT_MODE,\n> -\t\tANDROID_CONTROL_MODE,\n> -\t\tANDROID_CONTROL_SCENE_MODE,\n> -\t\tANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> -\t\tANDROID_FLASH_MODE,\n> -\t\tANDROID_JPEG_ORIENTATION,\n> -\t\tANDROID_JPEG_QUALITY,\n> 
-\t\tANDROID_JPEG_THUMBNAIL_QUALITY,\n> -\t\tANDROID_JPEG_THUMBNAIL_SIZE,\n> -\t\tANDROID_LENS_APERTURE,\n> -\t\tANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> -\t\tANDROID_NOISE_REDUCTION_MODE,\n> -\t\tANDROID_SCALER_CROP_REGION,\n> -\t\tANDROID_STATISTICS_FACE_DETECT_MODE\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_REQUEST_KEYS,\n> -\t\t\t\t  availableRequestKeys);\n> -\n> -\tstd::vector<int32_t> availableResultKeys = {\n> -\t\tANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> -\t\tANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> -\t\tANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> -\t\tANDROID_CONTROL_AE_LOCK,\n> -\t\tANDROID_CONTROL_AE_MODE,\n> -\t\tANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> -\t\tANDROID_CONTROL_AE_STATE,\n> -\t\tANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> -\t\tANDROID_CONTROL_AF_MODE,\n> -\t\tANDROID_CONTROL_AF_STATE,\n> -\t\tANDROID_CONTROL_AF_TRIGGER,\n> -\t\tANDROID_CONTROL_AWB_LOCK,\n> -\t\tANDROID_CONTROL_AWB_MODE,\n> -\t\tANDROID_CONTROL_AWB_STATE,\n> -\t\tANDROID_CONTROL_CAPTURE_INTENT,\n> -\t\tANDROID_CONTROL_EFFECT_MODE,\n> -\t\tANDROID_CONTROL_MODE,\n> -\t\tANDROID_CONTROL_SCENE_MODE,\n> -\t\tANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> -\t\tANDROID_FLASH_MODE,\n> -\t\tANDROID_FLASH_STATE,\n> -\t\tANDROID_JPEG_GPS_COORDINATES,\n> -\t\tANDROID_JPEG_GPS_PROCESSING_METHOD,\n> -\t\tANDROID_JPEG_GPS_TIMESTAMP,\n> -\t\tANDROID_JPEG_ORIENTATION,\n> -\t\tANDROID_JPEG_QUALITY,\n> -\t\tANDROID_JPEG_SIZE,\n> -\t\tANDROID_JPEG_THUMBNAIL_QUALITY,\n> -\t\tANDROID_JPEG_THUMBNAIL_SIZE,\n> -\t\tANDROID_LENS_APERTURE,\n> -\t\tANDROID_LENS_FOCAL_LENGTH,\n> -\t\tANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> -\t\tANDROID_LENS_STATE,\n> -\t\tANDROID_NOISE_REDUCTION_MODE,\n> -\t\tANDROID_REQUEST_PIPELINE_DEPTH,\n> -\t\tANDROID_SCALER_CROP_REGION,\n> -\t\tANDROID_SENSOR_EXPOSURE_TIME,\n> -\t\tANDROID_SENSOR_FRAME_DURATION,\n> -\t\tANDROID_SENSOR_ROLLING_SHUTTER_SKEW,\n> -\t\tANDROID_SENSOR_TEST_PATTERN_MODE,\n> -\t\tANDROID_SENSOR_TIMESTAMP,\n> 
-\t\tANDROID_STATISTICS_FACE_DETECT_MODE,\n> -\t\tANDROID_STATISTICS_LENS_SHADING_MAP_MODE,\n> -\t\tANDROID_STATISTICS_HOT_PIXEL_MAP_MODE,\n> -\t\tANDROID_STATISTICS_SCENE_FLICKER,\n> -\t};\n> -\tstaticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_RESULT_KEYS,\n> -\t\t\t\t  availableResultKeys);\n> -\n> -\tif (!staticMetadata_->isValid()) {\n> -\t\tLOG(HAL, Error) << \"Failed to construct static metadata\";\n> -\t\tstaticMetadata_.reset();\n> -\t\treturn nullptr;\n> -\t}\n> -\n> -\tif (staticMetadata_->resized()) {\n> -\t\tauto [entryCount, dataCount] = staticMetadata_->usage();\n> -\t\tLOG(HAL, Info)\n> -\t\t\t<< \"Static metadata resized: \" << entryCount\n> -\t\t\t<< \" entries and \" << dataCount << \" bytes used\";\n> -\t}\n> -\n> -\treturn staticMetadata_->get();\n> -}\n> -\n> -std::unique_ptr<CameraMetadata> CameraDevice::requestTemplatePreview()\n> +void CameraDevice::setCallbacks(const camera3_callback_ops_t *callbacks)\n>  {\n> -\t/*\n> -\t * \\todo Keep this in sync with the actual number of entries.\n> -\t * Currently: 20 entries, 35 bytes\n> -\t */\n> -\tauto requestTemplate = std::make_unique<CameraMetadata>(21, 36);\n> -\tif (!requestTemplate->isValid()) {\n> -\t\treturn nullptr;\n> -\t}\n> -\n> -\t/* Get the FPS range registered in the static metadata. 
*/\n> -\tcamera_metadata_ro_entry_t entry;\n> -\tbool found = staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> -\t\t\t\t\t       &entry);\n> -\tif (!found) {\n> -\t\tLOG(HAL, Error) << \"Cannot create capture template without FPS range\";\n> -\t\treturn nullptr;\n> -\t}\n> -\n> -\t/*\n> -\t * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> -\t * has been assembled as {{min, max} {max, max}}.\n> -\t */\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> -\t\t\t\t  entry.data.i32, 2);\n> -\n> -\tuint8_t aeMode = ANDROID_CONTROL_AE_MODE_ON;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AE_MODE, aeMode);\n> -\n> -\tint32_t aeExposureCompensation = 0;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> -\t\t\t\t  aeExposureCompensation);\n> -\n> -\tuint8_t aePrecaptureTrigger = ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_IDLE;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> -\t\t\t\t  aePrecaptureTrigger);\n> -\n> -\tuint8_t aeLock = ANDROID_CONTROL_AE_LOCK_OFF;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AE_LOCK, aeLock);\n> -\n> -\tuint8_t aeAntibandingMode = ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> -\t\t\t\t  aeAntibandingMode);\n> -\n> -\tuint8_t afMode = ANDROID_CONTROL_AF_MODE_OFF;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AF_MODE, afMode);\n> -\n> -\tuint8_t afTrigger = ANDROID_CONTROL_AF_TRIGGER_IDLE;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AF_TRIGGER, afTrigger);\n> -\n> -\tuint8_t awbMode = ANDROID_CONTROL_AWB_MODE_AUTO;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AWB_MODE, awbMode);\n> -\n> -\tuint8_t awbLock = ANDROID_CONTROL_AWB_LOCK_OFF;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_AWB_LOCK, awbLock);\n> -\n> -\tuint8_t flashMode = ANDROID_FLASH_MODE_OFF;\n> -\trequestTemplate->addEntry(ANDROID_FLASH_MODE, flashMode);\n> -\n> -\tuint8_t faceDetectMode = 
ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> -\trequestTemplate->addEntry(ANDROID_STATISTICS_FACE_DETECT_MODE,\n> -\t\t\t\t  faceDetectMode);\n> -\n> -\tuint8_t noiseReduction = ANDROID_NOISE_REDUCTION_MODE_OFF;\n> -\trequestTemplate->addEntry(ANDROID_NOISE_REDUCTION_MODE,\n> -\t\t\t\t  noiseReduction);\n> -\n> -\tuint8_t aberrationMode = ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF;\n> -\trequestTemplate->addEntry(ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> -\t\t\t\t  aberrationMode);\n> -\n> -\tuint8_t controlMode = ANDROID_CONTROL_MODE_AUTO;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_MODE, controlMode);\n> -\n> -\tfloat lensAperture = 2.53 / 100;\n> -\trequestTemplate->addEntry(ANDROID_LENS_APERTURE, lensAperture);\n> -\n> -\tuint8_t opticalStabilization = ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF;\n> -\trequestTemplate->addEntry(ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> -\t\t\t\t  opticalStabilization);\n> -\n> -\tuint8_t captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> -\trequestTemplate->addEntry(ANDROID_CONTROL_CAPTURE_INTENT,\n> -\t\t\t\t  captureIntent);\n> -\n> -\treturn requestTemplate;\n> +\tcallbacks_ = callbacks;\n>  }\n>  \n> -std::unique_ptr<CameraMetadata> CameraDevice::requestTemplateVideo()\n> +const camera_metadata_t *CameraDevice::getStaticMetadata()\n>  {\n> -\tstd::unique_ptr<CameraMetadata> previewTemplate = requestTemplatePreview();\n> -\tif (!previewTemplate)\n> -\t\treturn nullptr;\n> -\n> -\t/*\n> -\t * The video template requires a fixed FPS range. 
Everything else\n> -\t * stays the same as the preview template.\n> -\t */\n> -\tcamera_metadata_ro_entry_t entry;\n> -\tstaticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> -\t\t\t\t  &entry);\n> -\n> -\t/*\n> -\t * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> -\t * has been assembled as {{min, max} {max, max}}.\n> -\t */\n> -\tpreviewTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> -\t\t\t\t     entry.data.i32 + 2, 2);\n> -\n> -\treturn previewTemplate;\n> +\treturn capabilities_.staticMetadata()->get();\n>  }\n>  \n>  /*\n> @@ -1630,7 +520,7 @@ const camera_metadata_t *CameraDevice::constructDefaultRequestSettings(int type)\n>  \tswitch (type) {\n>  \tcase CAMERA3_TEMPLATE_PREVIEW:\n>  \t\tcaptureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> -\t\trequestTemplate = requestTemplatePreview();\n> +\t\trequestTemplate = capabilities_.requestTemplatePreview();\n>  \t\tbreak;\n>  \tcase CAMERA3_TEMPLATE_STILL_CAPTURE:\n>  \t\t/*\n> @@ -1638,15 +528,15 @@ const camera_metadata_t *CameraDevice::constructDefaultRequestSettings(int type)\n>  \t\t * for the torch mode we currently do not support.\n>  \t\t */\n>  \t\tcaptureIntent = ANDROID_CONTROL_CAPTURE_INTENT_STILL_CAPTURE;\n> -\t\trequestTemplate = requestTemplatePreview();\n> +\t\trequestTemplate = capabilities_.requestTemplatePreview();\n>  \t\tbreak;\n>  \tcase CAMERA3_TEMPLATE_VIDEO_RECORD:\n>  \t\tcaptureIntent = ANDROID_CONTROL_CAPTURE_INTENT_VIDEO_RECORD;\n> -\t\trequestTemplate = requestTemplateVideo();\n> +\t\trequestTemplate = capabilities_.requestTemplateVideo();\n>  \t\tbreak;\n>  \tcase CAMERA3_TEMPLATE_VIDEO_SNAPSHOT:\n>  \t\tcaptureIntent = ANDROID_CONTROL_CAPTURE_INTENT_VIDEO_SNAPSHOT;\n> -\t\trequestTemplate = requestTemplateVideo();\n> +\t\trequestTemplate = capabilities_.requestTemplateVideo();\n>  \t\tbreak;\n>  \t/* \\todo Implement templates generation for the remaining use cases. 
*/\n>  \tcase CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG:\n> @@ -1668,19 +558,6 @@ const camera_metadata_t *CameraDevice::constructDefaultRequestSettings(int type)\n>  \treturn requestTemplates_[type]->get();\n>  }\n>  \n> -PixelFormat CameraDevice::toPixelFormat(int format) const\n> -{\n> -\t/* Translate Android format code to libcamera pixel format. */\n> -\tauto it = formatsMap_.find(format);\n> -\tif (it == formatsMap_.end()) {\n> -\t\tLOG(HAL, Error) << \"Requested format \" << utils::hex(format)\n> -\t\t\t\t<< \" not supported\";\n> -\t\treturn PixelFormat();\n> -\t}\n> -\n> -\treturn it->second;\n> -}\n> -\n>  /*\n>   * Inspect the stream_list to produce a list of StreamConfiguration to\n>   * be use to configure the Camera.\n> @@ -1727,7 +604,7 @@ int CameraDevice::configureStreams(camera3_stream_configuration_t *stream_list)\n>  \t\tcamera3_stream_t *stream = stream_list->streams[i];\n>  \t\tSize size(stream->width, stream->height);\n>  \n> -\t\tPixelFormat format = toPixelFormat(stream->format);\n> +\t\tPixelFormat format = capabilities_.toPixelFormat(stream->format);\n>  \n>  \t\tLOG(HAL, Info) << \"Stream #\" << i\n>  \t\t\t       << \", direction: \" << stream->stream_type\n> diff --git a/src/android/camera_device.h b/src/android/camera_device.h\n> index 4aadb27c562c..090fe28a551e 100644\n> --- a/src/android/camera_device.h\n> +++ b/src/android/camera_device.h\n> @@ -10,14 +10,12 @@\n>  #include <map>\n>  #include <memory>\n>  #include <mutex>\n> -#include <tuple>\n>  #include <vector>\n>  \n>  #include <hardware/camera3.h>\n>  \n>  #include <libcamera/buffer.h>\n>  #include <libcamera/camera.h>\n> -#include <libcamera/geometry.h>\n>  #include <libcamera/request.h>\n>  #include <libcamera/stream.h>\n>  \n> @@ -26,6 +24,7 @@\n>  #include \"libcamera/internal/message.h\"\n>  #include \"libcamera/internal/thread.h\"\n>  \n> +#include \"camera_capabilities.h\"\n>  #include \"camera_metadata.h\"\n>  #include \"camera_stream.h\"\n>  #include \"camera_worker.h\"\n> @@ 
-57,7 +56,7 @@ public:\n>  \tconst std::string &model() const { return model_; }\n>  \tint facing() const { return facing_; }\n>  \tint orientation() const { return orientation_; }\n> -\tunsigned int maxJpegBufferSize() const { return maxJpegBufferSize_; }\n> +\tunsigned int maxJpegBufferSize() const;\n>  \n>  \tvoid setCallbacks(const camera3_callback_ops_t *callbacks);\n>  \tconst camera_metadata_t *getStaticMetadata();\n> @@ -86,11 +85,6 @@ private:\n>  \t\tstd::unique_ptr<CaptureRequest> request_;\n>  \t};\n>  \n> -\tstruct Camera3StreamConfiguration {\n> -\t\tlibcamera::Size resolution;\n> -\t\tint androidFormat;\n> -\t};\n> -\n>  \tenum class State {\n>  \t\tStopped,\n>  \t\tFlushing,\n> @@ -99,22 +93,11 @@ private:\n>  \n>  \tvoid stop();\n>  \n> -\tint initializeStreamConfigurations();\n> -\tstd::vector<libcamera::Size>\n> -\tgetYUVResolutions(libcamera::CameraConfiguration *cameraConfig,\n> -\t\t\t  const libcamera::PixelFormat &pixelFormat,\n> -\t\t\t  const std::vector<libcamera::Size> &resolutions);\n> -\tstd::vector<libcamera::Size>\n> -\tgetRawResolutions(const libcamera::PixelFormat &pixelFormat);\n> -\n>  \tlibcamera::FrameBuffer *createFrameBuffer(const buffer_handle_t camera3buffer);\n>  \tvoid abortRequest(camera3_capture_request_t *request);\n>  \tvoid notifyShutter(uint32_t frameNumber, uint64_t timestamp);\n>  \tvoid notifyError(uint32_t frameNumber, camera3_stream_t *stream,\n>  \t\t\t camera3_error_msg_code code);\n> -\tstd::unique_ptr<CameraMetadata> requestTemplatePreview();\n> -\tstd::unique_ptr<CameraMetadata> requestTemplateVideo();\n> -\tlibcamera::PixelFormat toPixelFormat(int format) const;\n>  \tint processControls(Camera3RequestDescriptor *descriptor);\n>  \tstd::unique_ptr<CameraMetadata> getResultMetadata(\n>  \t\tconst Camera3RequestDescriptor &descriptor) const;\n> @@ -129,13 +112,11 @@ private:\n>  \n>  \tstd::shared_ptr<libcamera::Camera> camera_;\n>  \tstd::unique_ptr<libcamera::CameraConfiguration> config_;\n> 
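The hunk above turns `maxJpegBufferSize()` from an inline member access into an out-of-line accessor, so CameraDevice can forward it to the new CameraCapabilities member. A simplified sketch of that delegation pattern; the class shapes and the buffer size are assumptions for illustration, not the real HAL classes:

```cpp
#include <cassert>

/* Holder of static camera information, as introduced by the patch. */
class CameraCapabilities {
public:
	unsigned int maxJpegBufferSize() const { return maxJpegBufferSize_; }
private:
	unsigned int maxJpegBufferSize_ = 13 << 20; /* placeholder value */
};

class CameraDevice {
public:
	/* Previously returned a CameraDevice member; now delegates. */
	unsigned int maxJpegBufferSize() const
	{
		return capabilities_.maxJpegBufferSize();
	}
private:
	CameraCapabilities capabilities_;
};
```

Keeping the accessor on CameraDevice preserves its public interface while the state moves out, which is why the header diff only changes the declaration, not the callers.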
+\tCameraCapabilities capabilities_;\n>  \n> -\tstd::unique_ptr<CameraMetadata> staticMetadata_;\n>  \tstd::map<unsigned int, std::unique_ptr<CameraMetadata>> requestTemplates_;\n>  \tconst camera3_callback_ops_t *callbacks_;\n>  \n> -\tstd::vector<Camera3StreamConfiguration> streamConfigurations_;\n> -\tstd::map<int, libcamera::PixelFormat> formatsMap_;\n>  \tstd::vector<CameraStream> streams_;\n>  \n>  \tlibcamera::Mutex descriptorsMutex_; /* Protects descriptors_. */\n> @@ -147,8 +128,6 @@ private:\n>  \tint facing_;\n>  \tint orientation_;\n>  \n> -\tunsigned int maxJpegBufferSize_;\n> -\n>  \tCameraMetadata lastSettings_;\n>  };\n>  \n> diff --git a/src/android/meson.build b/src/android/meson.build\n> index f27fd5316705..6270fb201338 100644\n> --- a/src/android/meson.build\n> +++ b/src/android/meson.build\n> @@ -44,6 +44,7 @@ subdir('cros')\n>  \n>  android_hal_sources = files([\n>      'camera3_hal.cpp',\n> +    'camera_capabilities.cpp',\n>      'camera_device.cpp',\n>      'camera_hal_config.cpp',\n>      'camera_hal_manager.cpp',\n>","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 460C3C321A\n\tfor <parsemail@patchwork.libcamera.org>;\n\tTue, 22 Jun 2021 10:02:13 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 9250068935;\n\tTue, 22 Jun 2021 12:02:11 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[213.167.242.64])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 9740B60292\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 22 Jun 2021 12:02:10 +0200 (CEST)","from pendragon.ideasonboard.com (62-78-145-57.bb.dnainternet.fi\n\t[62.78.145.57])\n\tby perceval.ideasonboard.com 
(Postfix) with ESMTPSA id C89B7A66;\n\tTue, 22 Jun 2021 12:02:09 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"JrNTuTSK\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; t=1624356130;\n\tbh=xiHfqcK65R5EaznTk9tT+SLTTnCxtGF9lwaoQiebpzk=;\n\th=Date:From:To:Cc:Subject:References:In-Reply-To:From;\n\tb=JrNTuTSKsfLeatnvYM9fMCSYSCE8z6THLTM9dUvaKQjjXGF0zbeTBrWoOnR680zjr\n\tT11BzmqXrNYeYWBUlPAzdASA0e8Obvs0KRmjkoPgNvCYXDz+flHCQshIlHpBeqJ/Je\n\t2YPmkoAbiLxIHZm9SGV4XtiD0/pNr/DFsI8yGGus=","Date":"Tue, 22 Jun 2021 13:01:41 +0300","From":"Laurent Pinchart <laurent.pinchart@ideasonboard.com>","To":"Jacopo Mondi <jacopo@jmondi.org>","Message-ID":"<YNG1BSZCI3pECBhE@pendragon.ideasonboard.com>","References":"<20210621152954.40299-1-jacopo@jmondi.org>\n\t<20210621152954.40299-3-jacopo@jmondi.org>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<20210621152954.40299-3-jacopo@jmondi.org>","Subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties 
class","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera-devel@lists.libcamera.org","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":17703,"web_url":"https://patchwork.libcamera.org/comment/17703/","msgid":"<CAO5uPHP-KNPPFDXWph7D5xqaNt-o4TBA-Ta_ksM7LFx=uC3mtA@mail.gmail.com>","date":"2021-06-23T03:25:01","subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","submitter":{"id":63,"url":"https://patchwork.libcamera.org/api/people/63/","name":"Hirokazu Honda","email":"hiroh@chromium.org"},"content":"Hi Jacopo,\n\nOn Tue, Jun 22, 2021 at 4:40 PM Jacopo Mondi <jacopo@jmondi.org> wrote:\n>\n> Hi Hiro,\n>\n> On Tue, Jun 22, 2021 at 10:34:27AM +0900, Hirokazu Honda wrote:\n> > Hi Jacopo, thank you for the patch.\n> >\n> > I failed to apply the patch on the top of the latest tree to review.\n> > Could you tell me the parent commit which I can apply this patch?\n>\n> weird, I just applied these two patches cleanly on the latest\n> master which for me is\n> 969da3189439 (\"libcamera: utils: Support systems that lack secure_getenv and issetugid\"\n>\n\nI tried applying the series whose id is 2162. 
But the series doesn't\ncontain \"[PATCH 1/2] android: Sort source files alphabetically\".\nIn fact, the patch is not shown in patchwork. It is strange.\nI managed to apply this patch after manually applying 1/2.\n>\n> >\n> > -Hiro\n> > On Tue, Jun 22, 2021 at 12:29 AM Jacopo Mondi <jacopo@jmondi.org> wrote:\n> >\n> > > The camera_device.cpp has grown a little too much, and it has quickly\n> > > become hard to maintain. Break out the handling of the static\n> > > information collected at camera initialization time to a new\n> > > CameraCapabilities class.\n> > >\n> > > Break out from the camera_device.cpp file all the functions relative to:\n> > > - Initialization of supported stream configurations\n> > > - Initialization of static metadata\n> > > - Initialization of request templates\n> > >\n> > > Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>\n> > > Acked-by: Paul Elder <paul.elder@ideasonboard.com>\n> > > Tested-by: Paul Elder <paul.elder@ideasonboard.com>\n> > > ---\n> > >  src/android/camera_capabilities.cpp | 1164 +++++++++++++++++++++++++++\n> > >  src/android/camera_capabilities.h   |   65 ++\n> > >  src/android/camera_device.cpp       | 1147 +-------------------------\n> > >  src/android/camera_device.h         |   27 +-\n> > >  src/android/meson.build             |    1 +\n> > >  5 files changed, 1245 insertions(+), 1159 deletions(-)\n> > >  create mode 100644 src/android/camera_capabilities.cpp\n> > >  create mode 100644 src/android/camera_capabilities.h\n> > >\n> > > diff --git a/src/android/camera_capabilities.cpp\n> > > b/src/android/camera_capabilities.cpp\n> > > new file mode 100644\n> > > index 000000000000..311a2c839586\n> > > --- /dev/null\n> > > +++ b/src/android/camera_capabilities.cpp\n> > > @@ -0,0 +1,1164 @@\n> > > +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> > > +/*\n> > > + * Copyright (C) 2021, Google Inc.\n> > > + *\n> > > + * camera_capabilities.cpp - Camera static properties manager\n> > > + */\n> > > +\n> > > +#include 
\"camera_capabilities.h\"\n> > > +\n> > > +#include <array>\n> > > +#include <cmath>\n> > > +\n> > > +#include <hardware/camera3.h>\n> > > +\n> > > +#include <libcamera/control_ids.h>\n> > > +#include <libcamera/controls.h>\n> > > +#include <libcamera/property_ids.h>\n> > > +\n> > > +#include \"libcamera/internal/formats.h\"\n> > > +#include \"libcamera/internal/log.h\"\n> > > +\n> > > +using namespace libcamera;\n> > > +\n> > > +LOG_DECLARE_CATEGORY(HAL)\n> > > +\n> > > +namespace {\n> > > +\n> > > +/*\n> > > + * \\var camera3Resolutions\n> > > + * \\brief The list of image resolutions defined as mandatory to be\n> > > supported by\n> > > + * the Android Camera3 specification\n> > > + */\n> > > +const std::vector<Size> camera3Resolutions = {\n> > > +       { 320, 240 },\n> > > +       { 640, 480 },\n> > > +       { 1280, 720 },\n> > > +       { 1920, 1080 }\n> > > +};\n> > > +\n> > > +/*\n> > > + * \\struct Camera3Format\n> > > + * \\brief Data associated with an Android format identifier\n> > > + * \\var libcameraFormats List of libcamera pixel formats compatible with\n> > > the\n> > > + * Android format\n> > > + * \\var name The human-readable representation of the Android format code\n> > > + */\n> > > +struct Camera3Format {\n> > > +       std::vector<PixelFormat> libcameraFormats;\n> > > +       bool mandatory;\n> > > +       const char *name;\n> > > +};\n> > > +\n> > > +/*\n> > > + * \\var camera3FormatsMap\n> > > + * \\brief Associate Android format code with ancillary data\n> > > + */\n> > > +const std::map<int, const Camera3Format> camera3FormatsMap = {\n> > > +       {\n> > > +               HAL_PIXEL_FORMAT_BLOB, {\n> > > +                       { formats::MJPEG },\n> > > +                       true,\n> > > +                       \"BLOB\"\n> > > +               }\n> > > +       }, {\n> > > +               HAL_PIXEL_FORMAT_YCbCr_420_888, {\n> > > +                       { formats::NV12, formats::NV21 },\n> > > +                       true,\n> > > +     
                  \"YCbCr_420_888\"\n> > > +               }\n> > > +       }, {\n> > > +               /*\n> > > +                * \\todo Translate IMPLEMENTATION_DEFINED inspecting the\n> > > gralloc\n> > > +                * usage flag. For now, copy the YCbCr_420 configuration.\n> > > +                */\n> > > +               HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> > > +                       { formats::NV12, formats::NV21 },\n> > > +                       true,\n> > > +                       \"IMPLEMENTATION_DEFINED\"\n> > > +               }\n> > > +       }, {\n> > > +               HAL_PIXEL_FORMAT_RAW10, {\n> > > +                       {\n> > > +                               formats::SBGGR10_CSI2P,\n> > > +                               formats::SGBRG10_CSI2P,\n> > > +                               formats::SGRBG10_CSI2P,\n> > > +                               formats::SRGGB10_CSI2P\n> > > +                       },\n> > > +                       false,\n> > > +                       \"RAW10\"\n> > > +               }\n> > > +       }, {\n> > > +               HAL_PIXEL_FORMAT_RAW12, {\n> > > +                       {\n> > > +                               formats::SBGGR12_CSI2P,\n> > > +                               formats::SGBRG12_CSI2P,\n> > > +                               formats::SGRBG12_CSI2P,\n> > > +                               formats::SRGGB12_CSI2P\n> > > +                       },\n> > > +                       false,\n> > > +                       \"RAW12\"\n> > > +               }\n> > > +       }, {\n> > > +               HAL_PIXEL_FORMAT_RAW16, {\n> > > +                       {\n> > > +                               formats::SBGGR16,\n> > > +                               formats::SGBRG16,\n> > > +                               formats::SGRBG16,\n> > > +                               formats::SRGGB16\n> > > +                       },\n> > > +                       false,\n> > > +                       \"RAW16\"\n> > > +      
         }\n> > > +       },\n> > > +};\n> > > +\n> > > +} /* namespace */\n> > > +\n> > > +int CameraCapabilities::initialize(std::shared_ptr<libcamera::Camera>\n> > > camera,\n> > > +                                  int orientation, int facing)\n> > > +{\n> > > +       camera_ = camera;\n> > > +       orientation_ = orientation;\n> > > +       facing_ = facing;\n> > > +\n> > > +       /* Acquire the camera and initialize available stream\n> > > configurations. */\n> > > +       int ret = camera_->acquire();\n> > > +       if (ret) {\n> > > +               LOG(HAL, Error) << \"Failed to temporarily acquire the\n> > > camera\";\n> > > +               return ret;\n> > > +       }\n> > > +\n> > > +       ret = initializeStreamConfigurations();\n> > > +       camera_->release();\n> > > +       if (ret)\n> > > +               return ret;\n> > > +\n> > > +       return initializeStaticMetadata();\n> > > +}\n> > > +\n> > > +std::vector<Size>\n> > > CameraCapabilities::getYUVResolutions(CameraConfiguration *cameraConfig,\n> > > +                                                       const PixelFormat\n> > > &pixelFormat,\n> > > +                                                       const\n> > > std::vector<Size> &resolutions)\n> > > +{\n> > > +       std::vector<Size> supportedResolutions;\n> > > +\n> > > +       StreamConfiguration &cfg = cameraConfig->at(0);\n> > > +       for (const Size &res : resolutions) {\n> > > +               cfg.pixelFormat = pixelFormat;\n> > > +               cfg.size = res;\n> > > +\n> > > +               CameraConfiguration::Status status =\n> > > cameraConfig->validate();\n> > > +               if (status != CameraConfiguration::Valid) {\n> > > +                       LOG(HAL, Debug) << cfg.toString() << \" not\n> > > supported\";\n> > > +                       continue;\n> > > +               }\n> > > +\n> > > +               LOG(HAL, Debug) << cfg.toString() << \" supported\";\n> > > +\n> > > +               
supportedResolutions.push_back(res);\n> > > +       }\n> > > +\n> > > +       return supportedResolutions;\n> > > +}\n> > > +\n> > > +std::vector<Size> CameraCapabilities::getRawResolutions(const\n> > > libcamera::PixelFormat &pixelFormat)\n> > > +{\n> > > +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > > +               camera_->generateConfiguration({ StreamRole::Raw });\n> > > +       StreamConfiguration &cfg = cameraConfig->at(0);\n> > > +       const StreamFormats &formats = cfg.formats();\n> > > +       std::vector<Size> supportedResolutions =\n> > > formats.sizes(pixelFormat);\n> > > +\n> > > +       return supportedResolutions;\n> > > +}\n> > > +\n> > > +/*\n> > > + * Initialize the format conversion map to translate from Android format\n> > > + * identifier to libcamera pixel formats and fill in the list of supported\n> > > + * stream configurations to be reported to the Android camera framework\n> > > through\n> > > + * the Camera static metadata.\n> > > + */\n> > > +int CameraCapabilities::initializeStreamConfigurations()\n> > > +{\n> > > +       /*\n> > > +        * Get the maximum output resolutions\n> > > +        * \\todo Get this from the camera properties once defined\n> > > +        */\n> > > +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > > +               camera_->generateConfiguration({ StillCapture });\n> > > +       if (!cameraConfig) {\n> > > +               LOG(HAL, Error) << \"Failed to get maximum resolution\";\n> > > +               return -EINVAL;\n> > > +       }\n> > > +       StreamConfiguration &cfg = cameraConfig->at(0);\n> > > +\n> > > +       /*\n> > > +        * \\todo JPEG - Adjust the maximum available resolution by taking\n> > > the\n> > > +        * JPEG encoder requirements into account (alignment and aspect\n> > > ratio).\n> > > +        */\n> > > +       const Size maxRes = cfg.size;\n> > > +       LOG(HAL, Debug) << \"Maximum supported resolution: \" <<\n> > > maxRes.toString();\n> > > 
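As a side note for reviewers: the resolution list built right after this point (mandatory sizes below the maximum, plus 1/2, 1/4, ... scalings of the maximum down to 320x240, de-duplicated with sort + unique) can be exercised standalone. This is a minimal illustration only, not part of the patch: `Size` here is a `std::pair` stand-in for `libcamera::Size`, compared component-wise rather than with libcamera's own ordering, and `buildResolutionList()` is a hypothetical helper name.

```cpp
#include <algorithm>
#include <cstdint>
#include <utility>
#include <vector>

/* Stand-in for libcamera::Size: {width, height}. */
using Size = std::pair<uint32_t, uint32_t>;

/*
 * Build the supported resolution list from the camera maximum resolution:
 * keep the mandatory Camera3 sizes strictly smaller than the maximum, add
 * 1/2, 1/4, ... scalings of the maximum down to 320x240, append the maximum
 * itself, then de-duplicate via sort + unique.
 */
std::vector<Size> buildResolutionList(const Size &maxRes)
{
	static const std::vector<Size> mandatory = {
		{ 320, 240 }, { 640, 480 }, { 1280, 720 }, { 1920, 1080 }
	};

	std::vector<Size> out;
	std::copy_if(mandatory.begin(), mandatory.end(),
		     std::back_inserter(out),
		     [&](const Size &res) {
			     return res.first < maxRes.first &&
				    res.second < maxRes.second;
		     });

	for (uint32_t divider = 2;; divider <<= 1) {
		Size derived{ maxRes.first / divider, maxRes.second / divider };
		if (derived.first < 320 || derived.second < 240)
			break;

		out.push_back(derived);
	}
	out.push_back(maxRes);

	/* Remove duplicated entries from the list. */
	std::sort(out.begin(), out.end());
	out.erase(std::unique(out.begin(), out.end()), out.end());

	return out;
}
```

For a 1920x1080 sensor this yields the three smaller mandatory sizes, the 960x540 and 480x270 scalings, and the maximum itself; 240x135 is rejected by the 320x240 floor.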
+\n> > > +       /*\n> > > +        * Build the list of supported image resolutions.\n> > > +        *\n> > > +        * The resolutions listed in camera3Resolutions are mandatory to be\n> > > +        * supported, up to the camera maximum resolution.\n> > > +        *\n> > > +        * Augment the list by adding resolutions calculated from the camera\n> > > +        * maximum one.\n> > > +        */\n> > > +       std::vector<Size> cameraResolutions;\n> > > +       std::copy_if(camera3Resolutions.begin(), camera3Resolutions.end(),\n> > > +                    std::back_inserter(cameraResolutions),\n> > > +                    [&](const Size &res) { return res < maxRes; });\n> > > +\n> > > +       /*\n> > > +        * The Camera3 specification suggests adding 1/2 and 1/4 of the maximum\n> > > +        * resolution.\n> > > +        */\n> > > +       for (unsigned int divider = 2;; divider <<= 1) {\n> > > +               Size derivedSize{\n> > > +                       maxRes.width / divider,\n> > > +                       maxRes.height / divider,\n> > > +               };\n> > > +\n> > > +               if (derivedSize.width < 320 ||\n> > > +                   derivedSize.height < 240)\n> > > +                       break;\n> > > +\n> > > +               cameraResolutions.push_back(derivedSize);\n> > > +       }\n> > > +       cameraResolutions.push_back(maxRes);\n> > > +\n> > > +       /* Remove duplicated entries from the list of supported resolutions. */\n> > > +       std::sort(cameraResolutions.begin(), cameraResolutions.end());\n> > > +       auto last = std::unique(cameraResolutions.begin(), cameraResolutions.end());\n> > > +       cameraResolutions.erase(last, cameraResolutions.end());\n> > > +\n> > > +       /*\n> > > +        * Build the list of supported camera formats.\n> > > +        *\n> > > +        * Each Android format is associated with a list of compatible\n> > > +        * libcamera formats. 
The first libcamera format that validates successfully is added to the\n> > > +        * format translation map used when configuring the streams. It is\n> > > +        * then tested against the list of supported camera resolutions to\n> > > +        * build the stream configuration map reported through the camera\n> > > +        * static metadata.\n> > > +        */\n> > > +       Size maxJpegSize;\n> > > +       for (const auto &format : camera3FormatsMap) {\n> > > +               int androidFormat = format.first;\n> > > +               const Camera3Format &camera3Format = format.second;\n> > > +               const std::vector<PixelFormat> &libcameraFormats =\n> > > +                       camera3Format.libcameraFormats;\n> > > +\n> > > +               LOG(HAL, Debug) << \"Trying to map Android format \"\n> > > +                               << camera3Format.name;\n> > > +\n> > > +               /*\n> > > +                * JPEG is always supported, either produced directly by the\n> > > +                * camera, or encoded in the HAL.\n> > > +                */\n> > > +               if (androidFormat == HAL_PIXEL_FORMAT_BLOB) {\n> > > +                       formatsMap_[androidFormat] = formats::MJPEG;\n> > > +                       LOG(HAL, Debug) << \"Mapped Android format \"\n> > > +                                       << camera3Format.name << \" to \"\n> > > +                                       << formats::MJPEG.toString()\n> > > +                                       << \" (fixed mapping)\";\n> > > +                       continue;\n> > > +               }\n> > > +\n> > > +               /*\n> > > +                * Test the libcamera formats that can produce images\n> > > +                * compatible with the format defined by Android.\n> > > +                */\n> > > +               PixelFormat mappedFormat;\n> > > +               for (const PixelFormat &pixelFormat : libcameraFormats) {\n> > > +\n> > > +                  
      LOG(HAL, Debug) << \"Testing \" << pixelFormat.toString();\n> > > +\n> > > +                       /*\n> > > +                        * The stream configuration size can be adjusted,\n> > > +                        * not the pixel format.\n> > > +                        *\n> > > +                        * \todo This could be simplified once all pipeline\n> > > +                        * handlers report the StreamFormats list of\n> > > +                        * supported formats.\n> > > +                        */\n> > > +                       cfg.pixelFormat = pixelFormat;\n> > > +\n> > > +                       CameraConfiguration::Status status = cameraConfig->validate();\n> > > +                       if (status != CameraConfiguration::Invalid &&\n> > > +                           cfg.pixelFormat == pixelFormat) {\n> > > +                               mappedFormat = pixelFormat;\n> > > +                               break;\n> > > +                       }\n> > > +               }\n> > > +\n> > > +               if (!mappedFormat.isValid()) {\n> > > +                       /* If the format is not mandatory, skip it. 
*/\n> > > +                       if (!camera3Format.mandatory)\n> > > +                               continue;\n> > > +\n> > > +                       LOG(HAL, Error)\n> > > +                               << \"Failed to map mandatory Android format\n> > > \"\n> > > +                               << camera3Format.name << \" (\"\n> > > +                               << utils::hex(androidFormat) << \"):\n> > > aborting\";\n> > > +                       return -EINVAL;\n> > > +               }\n> > > +\n> > > +               /*\n> > > +                * Record the mapping and then proceed to generate the\n> > > +                * stream configurations map, by testing the image\n> > > resolutions.\n> > > +                */\n> > > +               formatsMap_[androidFormat] = mappedFormat;\n> > > +               LOG(HAL, Debug) << \"Mapped Android format \"\n> > > +                               << camera3Format.name << \" to \"\n> > > +                               << mappedFormat.toString();\n> > > +\n> > > +               std::vector<Size> resolutions;\n> > > +               const PixelFormatInfo &info =\n> > > PixelFormatInfo::info(mappedFormat);\n> > > +               if (info.colourEncoding ==\n> > > PixelFormatInfo::ColourEncodingRAW)\n> > > +                       resolutions = getRawResolutions(mappedFormat);\n> > > +               else\n> > > +                       resolutions = getYUVResolutions(cameraConfig.get(),\n> > > +                                                       mappedFormat,\n> > > +                                                       cameraResolutions);\n> > > +\n> > > +               for (const Size &res : resolutions) {\n> > > +                       streamConfigurations_.push_back({ res,\n> > > androidFormat });\n> > > +\n> > > +                       /*\n> > > +                        * If the format is HAL_PIXEL_FORMAT_YCbCr_420_888\n> > > +                        * from which JPEG is produced, add an entry for\n> > > +           
             * the JPEG stream.\n> > > +                        *\n> > > +                        * \todo Wire the JPEG encoder to query the supported\n> > > +                        * sizes, provided a list of formats it can encode.\n> > > +                        *\n> > > +                        * \todo Support JPEG streams produced by the Camera\n> > > +                        * natively.\n> > > +                        */\n> > > +                       if (androidFormat == HAL_PIXEL_FORMAT_YCbCr_420_888) {\n> > > +                               streamConfigurations_.push_back(\n> > > +                                       { res, HAL_PIXEL_FORMAT_BLOB });\n> > > +                               maxJpegSize = std::max(maxJpegSize, res);\n> > > +                       }\n> > > +               }\n> > > +\n> > > +               /*\n> > > +                * \todo Calculate the maximum JPEG buffer size by asking the\n> > > +                * encoder, given the maximum frame size required.\n> > > +                */\n> > > +               maxJpegBufferSize_ = maxJpegSize.width * maxJpegSize.height * 1.5;\n> > > +       }\n> > > +\n> > > +       LOG(HAL, Debug) << \"Collected stream configuration map: \";\n> > > +       for (const auto &entry : streamConfigurations_)\n> > > +               LOG(HAL, Debug) << \"{ \" << entry.resolution.toString() << \" - \"\n> > > +                               << utils::hex(entry.androidFormat) << \" }\";\n> > > +\n> > > +       return 0;\n> > > +}\n> > > +\n> > > +int CameraCapabilities::initializeStaticMetadata()\n> > > +{\n> > > +       staticMetadata_ = std::make_unique<CameraMetadata>(64, 1024);\n> > > +       if (!staticMetadata_->isValid()) {\n> > > +               LOG(HAL, Error) << \"Failed to allocate static metadata\";\n> > > +               staticMetadata_.reset();\n> > > +               return -EINVAL;\n> > > +       }\n> > > +\n> > > +       const ControlInfoMap &controlsInfo = 
camera_->controls();\n> > > +       const ControlList &properties = camera_->properties();\n> > > +\n> > > +       /* Color correction static metadata. */\n> > > +       {\n> > > +               std::vector<uint8_t> data;\n> > > +               data.reserve(3);\n> > > +               const auto &infoMap =\n> > > controlsInfo.find(&controls::draft::ColorCorrectionAberrationMode);\n> > > +               if (infoMap != controlsInfo.end()) {\n> > > +                       for (const auto &value : infoMap->second.values())\n> > > +                               data.push_back(value.get<int32_t>());\n> > > +               } else {\n> > > +\n> > >  data.push_back(ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF);\n> > > +               }\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> > > +                                         data);\n> > > +       }\n> > > +\n> > > +       /* Control static metadata. */\n> > > +       std::vector<uint8_t> aeAvailableAntiBandingModes = {\n> > > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_OFF,\n> > > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_50HZ,\n> > > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_60HZ,\n> > > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO,\n> > > +       };\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> > > +                                 aeAvailableAntiBandingModes);\n> > > +\n> > > +       std::vector<uint8_t> aeAvailableModes = {\n> > > +               ANDROID_CONTROL_AE_MODE_ON,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> > > +                                 aeAvailableModes);\n> > > +\n> > > +       int64_t minFrameDurationNsec = -1;\n> > > +       int64_t maxFrameDurationNsec = -1;\n> > > +       const auto frameDurationsInfo =\n> > > controlsInfo.find(&controls::FrameDurationLimits);\n> > > +       if (frameDurationsInfo != 
controlsInfo.end()) {\n> > > +               minFrameDurationNsec = frameDurationsInfo->second.min().get<int64_t>() * 1000;\n> > > +               maxFrameDurationNsec = frameDurationsInfo->second.max().get<int64_t>() * 1000;\n> > > +\n> > > +               /*\n> > > +                * Adjust the minimum frame duration to comply with Android\n> > > +                * requirements. The camera service mandates all preview/record\n> > > +                * streams to have a minimum frame duration < 33.366 ms\n> > > +                * (1e9 / 29.97 ns, see MAX_PREVIEW_RECORD_DURATION_NS in the\n> > > +                * camera service implementation).\n> > > +                *\n> > > +                * If we're close enough (+ 500 useconds) to that value, round\n> > > +                * the minimum frame duration of the camera to an accepted\n> > > +                * value.\n> > > +                */\n> > > +               static constexpr int64_t MAX_PREVIEW_RECORD_DURATION_NS = 1e9 / 29.97;\n> > > +               if (minFrameDurationNsec > MAX_PREVIEW_RECORD_DURATION_NS &&\n> > > +                   minFrameDurationNsec < MAX_PREVIEW_RECORD_DURATION_NS + 500000)\n> > > +                       minFrameDurationNsec = MAX_PREVIEW_RECORD_DURATION_NS - 1000;\n> > > +\n> > > +               /*\n> > > +                * The AE routine frame rate limits are computed using the frame\n> > > +                * duration limits, as libcamera clips the AE routine to the\n> > > +                * frame durations.\n> > > +                */\n> > > +               int32_t maxFps = std::round(1e9 / minFrameDurationNsec);\n> > > +               int32_t minFps = std::round(1e9 / maxFrameDurationNsec);\n> > > +               minFps = std::max(1, minFps);\n> > > +\n> > > +               /*\n> > > +                * Force rounding errors so that we have the proper frame\n> > > +                * durations for when 
we reuse these variables later\n> > > +                */\n> > > +               minFrameDurationNsec = 1e9 / maxFps;\n> > > +               maxFrameDurationNsec = 1e9 / minFps;\n> > > +\n> > > +               /*\n> > > +                * Register to the camera service {min, max} and {max, max}\n> > > +                * intervals as requested by the metadata documentation.\n> > > +                */\n> > > +               int32_t availableAeFpsTarget[] = {\n> > > +                       minFps, maxFps, maxFps, maxFps\n> > > +               };\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > > +                                         availableAeFpsTarget);\n> > > +       }\n> > > +\n> > > +       std::vector<int32_t> aeCompensationRange = {\n> > > +               0, 0,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> > > +                                 aeCompensationRange);\n> > > +\n> > > +       const camera_metadata_rational_t aeCompensationStep[] = {\n> > > +               { 0, 1 }\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> > > +                                 aeCompensationStep);\n> > > +\n> > > +       std::vector<uint8_t> availableAfModes = {\n> > > +               ANDROID_CONTROL_AF_MODE_OFF,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> > > +                                 availableAfModes);\n> > > +\n> > > +       std::vector<uint8_t> availableEffects = {\n> > > +               ANDROID_CONTROL_EFFECT_MODE_OFF,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> > > +                                 availableEffects);\n> > > +\n> > > +       std::vector<uint8_t> availableSceneModes = {\n> > > +               ANDROID_CONTROL_SCENE_MODE_DISABLED,\n> > > +       };\n> > > +       
staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> > > +                                 availableSceneModes);\n> > > +\n> > > +       std::vector<uint8_t> availableStabilizationModes = {\n> > > +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE_OFF,\n> > > +       };\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> > > +                                 availableStabilizationModes);\n> > > +\n> > > +       /*\n> > > +        * \\todo Inspect the Camera capabilities to report the available\n> > > +        * AWB modes. Default to AUTO as CTS tests require it.\n> > > +        */\n> > > +       std::vector<uint8_t> availableAwbModes = {\n> > > +               ANDROID_CONTROL_AWB_MODE_AUTO,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> > > +                                 availableAwbModes);\n> > > +\n> > > +       std::vector<int32_t> availableMaxRegions = {\n> > > +               0, 0, 0,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n> > > +                                 availableMaxRegions);\n> > > +\n> > > +       std::vector<uint8_t> sceneModesOverride = {\n> > > +               ANDROID_CONTROL_AE_MODE_ON,\n> > > +               ANDROID_CONTROL_AWB_MODE_AUTO,\n> > > +               ANDROID_CONTROL_AF_MODE_OFF,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> > > +                                 sceneModesOverride);\n> > > +\n> > > +       uint8_t aeLockAvailable = ANDROID_CONTROL_AE_LOCK_AVAILABLE_FALSE;\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> > > +                                 aeLockAvailable);\n> > > +\n> > > +       uint8_t awbLockAvailable =\n> > > ANDROID_CONTROL_AWB_LOCK_AVAILABLE_FALSE;\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> > > +                            
     awbLockAvailable);\n> > > +\n> > > +       char availableControlModes = ANDROID_CONTROL_MODE_AUTO;\n> > > +       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_MODES,\n> > > +                                 availableControlModes);\n> > > +\n> > > +       /* JPEG static metadata. */\n> > > +\n> > > +       /*\n> > > +        * Create the list of supported thumbnail sizes by inspecting the\n> > > +        * available JPEG resolutions collected in streamConfigurations_ and\n> > > +        * generating one entry for each aspect ratio.\n> > > +        *\n> > > +        * The JPEG thumbnailer can freely scale, so pick an arbitrary\n> > > +        * (160, 160) size as the bounding rectangle, which is then cropped to\n> > > +        * the different supported aspect ratios.\n> > > +        */\n> > > +       constexpr Size maxJpegThumbnail(160, 160);\n> > > +       std::vector<Size> thumbnailSizes;\n> > > +       thumbnailSizes.push_back({ 0, 0 });\n> > > +       for (const auto &entry : streamConfigurations_) {\n> > > +               if (entry.androidFormat != HAL_PIXEL_FORMAT_BLOB)\n> > > +                       continue;\n> > > +\n> > > +               Size thumbnailSize = maxJpegThumbnail\n> > > +                                    .boundedToAspectRatio({ entry.resolution.width,\n> > > +                                                            entry.resolution.height });\n> > > +               thumbnailSizes.push_back(thumbnailSize);\n> > > +       }\n> > > +\n> > > +       std::sort(thumbnailSizes.begin(), thumbnailSizes.end());\n> > > +       auto last = std::unique(thumbnailSizes.begin(), thumbnailSizes.end());\n> > > +       thumbnailSizes.erase(last, thumbnailSizes.end());\n> > > +\n> > > +       /* Transform sizes into a list of integers that can be consumed. */\n> > > +       std::vector<int32_t> thumbnailEntries;\n> > > +       thumbnailEntries.reserve(thumbnailSizes.size() * 2);\n> > > +       for (const auto &size : thumbnailSizes) {\n> > > +               
thumbnailEntries.push_back(size.width);\n> > > +               thumbnailEntries.push_back(size.height);\n> > > +       }\n> > > +       staticMetadata_->addEntry(ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> > > +                                 thumbnailEntries);\n> > > +\n> > > +       staticMetadata_->addEntry(ANDROID_JPEG_MAX_SIZE,\n> > > maxJpegBufferSize_);\n> > > +\n> > > +       /* Sensor static metadata. */\n> > > +       std::array<int32_t, 2> pixelArraySize;\n> > > +       {\n> > > +               const Size &size =\n> > > properties.get(properties::PixelArraySize);\n> > > +               pixelArraySize[0] = size.width;\n> > > +               pixelArraySize[1] = size.height;\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> > > +                                         pixelArraySize);\n> > > +       }\n> > > +\n> > > +       if (properties.contains(properties::UnitCellSize)) {\n> > > +               const Size &cellSize =\n> > > properties.get<Size>(properties::UnitCellSize);\n> > > +               std::array<float, 2> physicalSize{\n> > > +                       cellSize.width * pixelArraySize[0] / 1e6f,\n> > > +                       cellSize.height * pixelArraySize[1] / 1e6f\n> > > +               };\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> > > +                                         physicalSize);\n> > > +       }\n> > > +\n> > > +       {\n> > > +               const Span<const Rectangle> &rects =\n> > > +                       properties.get(properties::PixelArrayActiveAreas);\n> > > +               std::vector<int32_t> data{\n> > > +                       static_cast<int32_t>(rects[0].x),\n> > > +                       static_cast<int32_t>(rects[0].y),\n> > > +                       static_cast<int32_t>(rects[0].width),\n> > > +                       static_cast<int32_t>(rects[0].height),\n> > > +               };\n> > > +\n> > >  
staticMetadata_->addEntry(ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> > > +                                         data);\n> > > +       }\n> > > +\n> > > +       int32_t sensitivityRange[] = {\n> > > +               32, 2400,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> > > +                                 sensitivityRange);\n> > > +\n> > > +       /* Report the color filter arrangement if the camera reports it. */\n> > > +       if\n> > > (properties.contains(properties::draft::ColorFilterArrangement)) {\n> > > +               uint8_t filterArr =\n> > > properties.get(properties::draft::ColorFilterArrangement);\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> > > +                                         filterArr);\n> > > +       }\n> > > +\n> > > +       const auto &exposureInfo =\n> > > controlsInfo.find(&controls::ExposureTime);\n> > > +       if (exposureInfo != controlsInfo.end()) {\n> > > +               int64_t exposureTimeRange[2] = {\n> > > +                       exposureInfo->second.min().get<int32_t>() * 1000LL,\n> > > +                       exposureInfo->second.max().get<int32_t>() * 1000LL,\n> > > +               };\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> > > +                                         exposureTimeRange, 2);\n> > > +       }\n> > > +\n> > > +       staticMetadata_->addEntry(ANDROID_SENSOR_ORIENTATION,\n> > > orientation_);\n> > > +\n> > > +       std::vector<int32_t> testPatternModes = {\n> > > +               ANDROID_SENSOR_TEST_PATTERN_MODE_OFF\n> > > +       };\n> > > +       const auto &testPatternsInfo =\n> > > +               controlsInfo.find(&controls::draft::TestPatternMode);\n> > > +       if (testPatternsInfo != controlsInfo.end()) {\n> > > +               const auto &values = testPatternsInfo->second.values();\n> > > +               ASSERT(!values.empty());\n> > > +             
  for (const auto &value : values) {\n> > > +                       switch (value.get<int32_t>()) {\n> > > +                       case controls::draft::TestPatternModeOff:\n> > > +                               /*\n> > > +                                * ANDROID_SENSOR_TEST_PATTERN_MODE_OFF is\n> > > +                                * already in testPatternModes.\n> > > +                                */\n> > > +                               break;\n> > > +\n> > > +                       case controls::draft::TestPatternModeSolidColor:\n> > > +                               testPatternModes.push_back(\n> > > +\n> > >  ANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR);\n> > > +                               break;\n> > > +\n> > > +                       case controls::draft::TestPatternModeColorBars:\n> > > +                               testPatternModes.push_back(\n> > > +\n> > >  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS);\n> > > +                               break;\n> > > +\n> > > +                       case\n> > > controls::draft::TestPatternModeColorBarsFadeToGray:\n> > > +                               testPatternModes.push_back(\n> > > +\n> > >  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS_FADE_TO_GRAY);\n> > > +                               break;\n> > > +\n> > > +                       case controls::draft::TestPatternModePn9:\n> > > +                               testPatternModes.push_back(\n> > > +\n> > >  ANDROID_SENSOR_TEST_PATTERN_MODE_PN9);\n> > > +                               break;\n> > > +\n> > > +                       case controls::draft::TestPatternModeCustom1:\n> > > +                               /* We don't support this yet. 
*/\n> > > +                               break;\n> > > +\n> > > +                       default:\n> > > +                               LOG(HAL, Error) << \"Unknown test pattern\n> > > mode: \"\n> > > +                                               << value.get<int32_t>();\n> > > +                               continue;\n> > > +                       }\n> > > +               }\n> > > +       }\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> > > +                                 testPatternModes);\n> > > +\n> > > +       uint8_t timestampSource =\n> > > ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN;\n> > > +       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> > > +                                 timestampSource);\n> > > +\n> > > +       if (maxFrameDurationNsec > 0)\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> > > +                                         maxFrameDurationNsec);\n> > > +\n> > > +       /* Statistics static metadata. 
*/\n> > > +       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> > > +                                 faceDetectMode);\n> > > +\n> > > +       int32_t maxFaceCount = 0;\n> > > +       staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> > > +                                 maxFaceCount);\n> > > +\n> > > +       {\n> > > +               std::vector<uint8_t> data;\n> > > +               data.reserve(2);\n> > > +               const auto &infoMap =\n> > > controlsInfo.find(&controls::draft::LensShadingMapMode);\n> > > +               if (infoMap != controlsInfo.end()) {\n> > > +                       for (const auto &value : infoMap->second.values())\n> > > +                               data.push_back(value.get<int32_t>());\n> > > +               } else {\n> > > +\n> > >  data.push_back(ANDROID_STATISTICS_LENS_SHADING_MAP_MODE_OFF);\n> > > +               }\n> > > +\n> > >  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_LENS_SHADING_MAP_MODES,\n> > > +                                         data);\n> > > +       }\n> > > +\n> > > +       /* Sync static metadata. */\n> > > +       int32_t maxLatency = ANDROID_SYNC_MAX_LATENCY_UNKNOWN;\n> > > +       staticMetadata_->addEntry(ANDROID_SYNC_MAX_LATENCY, maxLatency);\n> > > +\n> > > +       /* Flash static metadata. */\n> > > +       char flashAvailable = ANDROID_FLASH_INFO_AVAILABLE_FALSE;\n> > > +       staticMetadata_->addEntry(ANDROID_FLASH_INFO_AVAILABLE,\n> > > +                                 flashAvailable);\n> > > +\n> > > +       /* Lens static metadata. 
*/\n> > > +       std::vector<float> lensApertures = {\n> > > +               2.53 / 100,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> > > +                                 lensApertures);\n> > > +\n> > > +       uint8_t lensFacing;\n> > > +       switch (facing_) {\n> > > +       default:\n> > > +       case CAMERA_FACING_FRONT:\n> > > +               lensFacing = ANDROID_LENS_FACING_FRONT;\n> > > +               break;\n> > > +       case CAMERA_FACING_BACK:\n> > > +               lensFacing = ANDROID_LENS_FACING_BACK;\n> > > +               break;\n> > > +       case CAMERA_FACING_EXTERNAL:\n> > > +               lensFacing = ANDROID_LENS_FACING_EXTERNAL;\n> > > +               break;\n> > > +       }\n> > > +       staticMetadata_->addEntry(ANDROID_LENS_FACING, lensFacing);\n> > > +\n> > > +       std::vector<float> lensFocalLengths = {\n> > > +               1,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> > > +                                 lensFocalLengths);\n> > > +\n> > > +       std::vector<uint8_t> opticalStabilizations = {\n> > > +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> > > +                                 opticalStabilizations);\n> > > +\n> > > +       float hyperFocalDistance = 0;\n> > > +       staticMetadata_->addEntry(ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> > > +                                 hyperFocalDistance);\n> > > +\n> > > +       float minFocusDistance = 0;\n> > > +       staticMetadata_->addEntry(ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> > > +                                 minFocusDistance);\n> > > +\n> > > +       /* Noise reduction modes. 
*/\n> > > +       {\n> > > +               std::vector<uint8_t> data;\n> > > +               data.reserve(5);\n> > > +               const auto &infoMap = controlsInfo.find(&controls::draft::NoiseReductionMode);\n> > > +               if (infoMap != controlsInfo.end()) {\n> > > +                       for (const auto &value : infoMap->second.values())\n> > > +                               data.push_back(value.get<int32_t>());\n> > > +               } else {\n> > > +                       data.push_back(ANDROID_NOISE_REDUCTION_MODE_OFF);\n> > > +               }\n> > > +\n> > > +               staticMetadata_->addEntry(ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> > > +                                         data);\n> > > +       }\n> > > +\n> > > +       /* Scaler static metadata. */\n> > > +\n> > > +       /*\n> > > +        * \\todo The digital zoom factor is a property that depends on the\n> > > +        * desired output configuration and the sensor frame size input to the\n> > > +        * ISP. This information is not available to the Android HAL, not at\n> > > +        * initialization time at least.\n> > > +        *\n> > > +        * As a workaround rely on pipeline handlers initializing the\n> > > +        * ScalerCrop control with the camera default configuration and use the\n> > > +        * maximum and minimum crop rectangles to calculate the digital zoom\n> > > +        * factor.\n> > > +        */\n> > > +       float maxZoom = 1.0f;\n> > > +       const auto scalerCrop = controlsInfo.find(&controls::ScalerCrop);\n> > > +       if (scalerCrop != controlsInfo.end()) {\n> > > +               Rectangle min = scalerCrop->second.min().get<Rectangle>();\n> > > +               Rectangle max = scalerCrop->second.max().get<Rectangle>();\n> > > +               maxZoom = std::min(1.0f * max.width / min.width,\n> > > +                                  1.0f * max.height / min.height);\n> > > +       }\n> > > +\n> > > +       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> > > +                                 maxZoom);\n> > > +\n> > > +       std::vector<uint32_t> availableStreamConfigurations;\n> > > +       availableStreamConfigurations.reserve(streamConfigurations_.size() * 4);\n> > > +       for (const auto &entry : streamConfigurations_) {\n> > > +               availableStreamConfigurations.push_back(entry.androidFormat);\n> > > +               availableStreamConfigurations.push_back(entry.resolution.width);\n> > > +               availableStreamConfigurations.push_back(entry.resolution.height);\n> > > +               availableStreamConfigurations.push_back(\n> > > +                       ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT);\n> > > +       }\n> > > +\n> > > +       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> > > +                                 availableStreamConfigurations);\n> > > +\n> > > +       std::vector<int64_t> availableStallDurations = {\n> > > +               ANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920, 33333333,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> > > +                                 availableStallDurations);\n> > > +\n> > > +       /* Use the minimum frame duration for all the YUV/RGB formats. */\n> > > +       if (minFrameDurationNsec > 0) {\n> > > +               std::vector<int64_t> minFrameDurations;\n> > > +               minFrameDurations.reserve(streamConfigurations_.size() * 4);\n> > > +               for (const auto &entry : streamConfigurations_) {\n> > > +                       minFrameDurations.push_back(entry.androidFormat);\n> > > +                       minFrameDurations.push_back(entry.resolution.width);\n> > > +                       minFrameDurations.push_back(entry.resolution.height);\n> > > +                       minFrameDurations.push_back(minFrameDurationNsec);\n> > > +               }\n> > > +\n> > > +               staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> > > +                                         minFrameDurations);\n> > > +       }\n> > > +\n> > > +       uint8_t croppingType = ANDROID_SCALER_CROPPING_TYPE_CENTER_ONLY;\n> > > +       staticMetadata_->addEntry(ANDROID_SCALER_CROPPING_TYPE, croppingType);\n> > > +\n> > > +       /* Info static metadata. */\n> > > +       uint8_t supportedHWLevel = ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;\n> > > +       staticMetadata_->addEntry(ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> > > +                                 supportedHWLevel);\n> > > +\n> > > +       /* Request static metadata. */\n> > > +       int32_t partialResultCount = 1;\n> > > +       staticMetadata_->addEntry(ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> > > +                                 partialResultCount);\n> > > +\n> > > +       {\n> > > +               /* Default the value to 2 if not reported by the camera. 
*/\n> > > +               uint8_t maxPipelineDepth = 2;\n> > > +               const auto &infoMap = controlsInfo.find(&controls::draft::PipelineDepth);\n> > > +               if (infoMap != controlsInfo.end())\n> > > +                       maxPipelineDepth = infoMap->second.max().get<int32_t>();\n> > > +\n> > > +               staticMetadata_->addEntry(ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> > > +                                         maxPipelineDepth);\n> > > +       }\n> > > +\n> > > +       /* LIMITED does not support reprocessing. */\n> > > +       uint32_t maxNumInputStreams = 0;\n> > > +       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> > > +                                 maxNumInputStreams);\n> > > +\n> > > +       std::vector<uint8_t> availableCapabilities = {\n> > > +               ANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE,\n> > > +       };\n> > > +\n> > > +       /* Report if camera supports RAW. */\n> > > +       bool rawStreamAvailable = false;\n> > > +       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > > +               camera_->generateConfiguration({ StreamRole::Raw });\n> > > +       if (cameraConfig && !cameraConfig->empty()) {\n> > > +               const PixelFormatInfo &info =\n> > > +                       PixelFormatInfo::info(cameraConfig->at(0).pixelFormat);\n> > > +               /* Only advertise RAW support if RAW16 is possible. */\n> > > +               if (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW &&\n> > > +                   info.bitsPerPixel == 16) {\n> > > +                       rawStreamAvailable = true;\n> > > +                       availableCapabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);\n> > > +               }\n> > > +       }\n> > > +\n> > > +       /* Number of { RAW, YUV, JPEG } supported output streams */\n> > > +       int32_t numOutStreams[] = { rawStreamAvailable, 2, 1 };\n> > > +       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> > > +                                 numOutStreams);\n> > > +\n> > > +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> > > +                                 availableCapabilities);\n> > > +\n> > > +       std::vector<int32_t> availableCharacteristicsKeys = {\n> > > +               ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> > > +               ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> > > +               ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> > > +               ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > > +               ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> > > +               ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> > > +               ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> > > +               ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> > > +               ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> > > +               ANDROID_CONTROL_AVAILABLE_MODES,\n> > > +               ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> > > +               ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> > > +               ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> > > +               ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> > > +               ANDROID_CONTROL_MAX_REGIONS,\n> > > +               ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> > > +               ANDROID_FLASH_INFO_AVAILABLE,\n> > > +               ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> > > +    
           ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> > > +               ANDROID_JPEG_MAX_SIZE,\n> > > +               ANDROID_LENS_FACING,\n> > > +               ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> > > +               ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> > > +               ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> > > +               ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> > > +               ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> > > +               ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> > > +               ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> > > +               ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> > > +               ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> > > +               ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> > > +               ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> > > +               ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> > > +               ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> > > +               ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> > > +               ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> > > +               ANDROID_SCALER_CROPPING_TYPE,\n> > > +               ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> > > +               ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> > > +               ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> > > +               ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> > > +               ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> > > +               ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> > > +               ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> > > +               ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> > > +               ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> > > +               ANDROID_SENSOR_ORIENTATION,\n> > > +               ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> > > +               ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> > > +               ANDROID_SYNC_MAX_LATENCY,\n> > > +   
    };\n> > > +\n> > > +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CHARACTERISTICS_KEYS,\n> > > +                                 availableCharacteristicsKeys);\n> > > +\n> > > +       std::vector<int32_t> availableRequestKeys = {\n> > > +               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > > +               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > > +               ANDROID_CONTROL_AE_LOCK,\n> > > +               ANDROID_CONTROL_AE_MODE,\n> > > +               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > > +               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > > +               ANDROID_CONTROL_AF_MODE,\n> > > +               ANDROID_CONTROL_AF_TRIGGER,\n> > > +               ANDROID_CONTROL_AWB_LOCK,\n> > > +               ANDROID_CONTROL_AWB_MODE,\n> > > +               ANDROID_CONTROL_CAPTURE_INTENT,\n> > > +               ANDROID_CONTROL_EFFECT_MODE,\n> > > +               ANDROID_CONTROL_MODE,\n> > > +               ANDROID_CONTROL_SCENE_MODE,\n> > > +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> > > +               ANDROID_FLASH_MODE,\n> > > +               ANDROID_JPEG_ORIENTATION,\n> > > +               ANDROID_JPEG_QUALITY,\n> > > +               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> > > +               ANDROID_JPEG_THUMBNAIL_SIZE,\n> > > +               ANDROID_LENS_APERTURE,\n> > > +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > > +               ANDROID_NOISE_REDUCTION_MODE,\n> > > +               ANDROID_SCALER_CROP_REGION,\n> > > +               ANDROID_STATISTICS_FACE_DETECT_MODE\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_REQUEST_KEYS,\n> > > +                                 availableRequestKeys);\n> > > +\n> > > +       std::vector<int32_t> availableResultKeys = {\n> > > +               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > > +               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > > +     
          ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > > +               ANDROID_CONTROL_AE_LOCK,\n> > > +               ANDROID_CONTROL_AE_MODE,\n> > > +               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > > +               ANDROID_CONTROL_AE_STATE,\n> > > +               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > > +               ANDROID_CONTROL_AF_MODE,\n> > > +               ANDROID_CONTROL_AF_STATE,\n> > > +               ANDROID_CONTROL_AF_TRIGGER,\n> > > +               ANDROID_CONTROL_AWB_LOCK,\n> > > +               ANDROID_CONTROL_AWB_MODE,\n> > > +               ANDROID_CONTROL_AWB_STATE,\n> > > +               ANDROID_CONTROL_CAPTURE_INTENT,\n> > > +               ANDROID_CONTROL_EFFECT_MODE,\n> > > +               ANDROID_CONTROL_MODE,\n> > > +               ANDROID_CONTROL_SCENE_MODE,\n> > > +               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> > > +               ANDROID_FLASH_MODE,\n> > > +               ANDROID_FLASH_STATE,\n> > > +               ANDROID_JPEG_GPS_COORDINATES,\n> > > +               ANDROID_JPEG_GPS_PROCESSING_METHOD,\n> > > +               ANDROID_JPEG_GPS_TIMESTAMP,\n> > > +               ANDROID_JPEG_ORIENTATION,\n> > > +               ANDROID_JPEG_QUALITY,\n> > > +               ANDROID_JPEG_SIZE,\n> > > +               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> > > +               ANDROID_JPEG_THUMBNAIL_SIZE,\n> > > +               ANDROID_LENS_APERTURE,\n> > > +               ANDROID_LENS_FOCAL_LENGTH,\n> > > +               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > > +               ANDROID_LENS_STATE,\n> > > +               ANDROID_NOISE_REDUCTION_MODE,\n> > > +               ANDROID_REQUEST_PIPELINE_DEPTH,\n> > > +               ANDROID_SCALER_CROP_REGION,\n> > > +               ANDROID_SENSOR_EXPOSURE_TIME,\n> > > +               ANDROID_SENSOR_FRAME_DURATION,\n> > > +               ANDROID_SENSOR_ROLLING_SHUTTER_SKEW,\n> > > +               ANDROID_SENSOR_TEST_PATTERN_MODE,\n> > > +               
ANDROID_SENSOR_TIMESTAMP,\n> > > +               ANDROID_STATISTICS_FACE_DETECT_MODE,\n> > > +               ANDROID_STATISTICS_LENS_SHADING_MAP_MODE,\n> > > +               ANDROID_STATISTICS_HOT_PIXEL_MAP_MODE,\n> > > +               ANDROID_STATISTICS_SCENE_FLICKER,\n> > > +       };\n> > > +       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_RESULT_KEYS,\n> > > +                                 availableResultKeys);\n> > > +\n> > > +       if (!staticMetadata_->isValid()) {\n> > > +               LOG(HAL, Error) << \"Failed to construct static metadata\";\n> > > +               staticMetadata_.reset();\n> > > +               return -EINVAL;\n> > > +       }\n> > > +\n> > > +       if (staticMetadata_->resized()) {\n> > > +               auto [entryCount, dataCount] = staticMetadata_->usage();\n> > > +               LOG(HAL, Info)\n> > > +                       << \"Static metadata resized: \" << entryCount\n> > > +                       << \" entries and \" << dataCount << \" bytes used\";\n> > > +       }\n> > > +\n> > > +       return 0;\n> > > +}\n> > > +\n> > > +/* Translate Android format code to libcamera pixel format. 
*/\n> > > +PixelFormat CameraCapabilities::toPixelFormat(int format) const\n> > > +{\n> > > +       auto it = formatsMap_.find(format);\n> > > +       if (it == formatsMap_.end()) {\n> > > +               LOG(HAL, Error) << \"Requested format \" << utils::hex(format)\n> > > +                               << \" not supported\";\n> > > +               return PixelFormat();\n> > > +       }\n> > > +\n> > > +       return it->second;\n> > > +}\n> > > +\n> > > +std::unique_ptr<CameraMetadata> CameraCapabilities::requestTemplatePreview() const\n> > > +{\n> > > +       /*\n> > > +        * \\todo Keep this in sync with the actual number of entries.\n> > > +        * Currently: 20 entries, 35 bytes\n> > > +        */\n> > > +       auto requestTemplate = std::make_unique<CameraMetadata>(21, 36);\n> > > +       if (!requestTemplate->isValid()) {\n> > > +               return nullptr;\n> > > +       }\n> > > +\n> > > +       /* Get the FPS range registered in the static metadata. */\n> > > +       camera_metadata_ro_entry_t entry;\n> > > +       bool found = staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > > +                                              &entry);\n> > > +       if (!found) {\n> > > +               LOG(HAL, Error) << \"Cannot create capture template without FPS range\";\n> > > +               return nullptr;\n> > > +       }\n> > > +\n> > > +       /*\n> > > +        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> > > +        * has been assembled as {{min, max} {max, max}}.\n> > > +        */\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > > +                                 entry.data.i32, 2);\n> > > +\n> > > +       uint8_t aeMode = ANDROID_CONTROL_AE_MODE_ON;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_MODE, aeMode);\n> > > +\n> > > +       int32_t aeExposureCompensation = 0;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > > +                                 aeExposureCompensation);\n> > > +\n> > > +       uint8_t aePrecaptureTrigger = ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_IDLE;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > > +                                 aePrecaptureTrigger);\n> > > +\n> > > +       uint8_t aeLock = ANDROID_CONTROL_AE_LOCK_OFF;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_LOCK, aeLock);\n> > > +\n> > > +       uint8_t aeAntibandingMode = ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > > +                                 aeAntibandingMode);\n> > > +\n> > > +       uint8_t afMode = ANDROID_CONTROL_AF_MODE_OFF;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AF_MODE, afMode);\n> > > +\n> > > +       uint8_t afTrigger = ANDROID_CONTROL_AF_TRIGGER_IDLE;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AF_TRIGGER, afTrigger);\n> > > +\n> > > +       uint8_t awbMode = ANDROID_CONTROL_AWB_MODE_AUTO;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AWB_MODE, awbMode);\n> > > +\n> > > +       uint8_t awbLock = ANDROID_CONTROL_AWB_LOCK_OFF;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_AWB_LOCK, awbLock);\n> > > +\n> > > +       uint8_t flashMode = ANDROID_FLASH_MODE_OFF;\n> > > +       requestTemplate->addEntry(ANDROID_FLASH_MODE, flashMode);\n> > > +\n> > > +       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> > > +       requestTemplate->addEntry(ANDROID_STATISTICS_FACE_DETECT_MODE,\n> > > +                                 faceDetectMode);\n> > > +\n> > > +       uint8_t noiseReduction = ANDROID_NOISE_REDUCTION_MODE_OFF;\n> > > +       requestTemplate->addEntry(ANDROID_NOISE_REDUCTION_MODE,\n> > > +                                 noiseReduction);\n> > > +\n> > > +       uint8_t aberrationMode = ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF;\n> > > +       requestTemplate->addEntry(ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > > +                                 aberrationMode);\n> > > +\n> > > +       uint8_t controlMode = ANDROID_CONTROL_MODE_AUTO;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_MODE, controlMode);\n> > > +\n> > > +       float lensAperture = 2.53 / 100;\n> > > +       requestTemplate->addEntry(ANDROID_LENS_APERTURE, lensAperture);\n> > > +\n> > > +       uint8_t opticalStabilization = ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF;\n> > > +       requestTemplate->addEntry(ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > > +                                 opticalStabilization);\n> > > +\n> > > +       uint8_t captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> > > +       requestTemplate->addEntry(ANDROID_CONTROL_CAPTURE_INTENT,\n> > > +                                 captureIntent);\n> > > +\n> > > +       return requestTemplate;\n> > > +}\n> > > +\n> > > +std::unique_ptr<CameraMetadata> CameraCapabilities::requestTemplateVideo() const\n> > > +{\n> > > +       std::unique_ptr<CameraMetadata> previewTemplate = requestTemplatePreview();\n> > > +       if (!previewTemplate)\n> > > +               return nullptr;\n> > > +\n> > > +       /*\n> > > +        * The video template requires a fixed FPS range. 
Everything else\n> > > +        * stays the same as the preview template.\n> > > +        */\n> > > +       camera_metadata_ro_entry_t entry;\n> > > +       staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > > +                                 &entry);\n> > > +\n> > > +       /*\n> > > +        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> > > +        * has been assembled as {{min, max} {max, max}}.\n> > > +        */\n> > > +       previewTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > > +                                    entry.data.i32 + 2, 2);\n> > > +\n> > > +       return previewTemplate;\n> > > +}\n> > > diff --git a/src/android/camera_capabilities.h b/src/android/camera_capabilities.h\n> > > new file mode 100644\n> > > index 000000000000..f511607bbd90\n> > > --- /dev/null\n> > > +++ b/src/android/camera_capabilities.h\n> > > @@ -0,0 +1,65 @@\n> > > +/* SPDX-License-Identifier: LGPL-2.1-or-later */\n> > > +/*\n> > > + * Copyright (C) 2021, Google Inc.\n> > > + *\n> > > + * camera_capabilities.h - Camera static properties manager\n> > > + */\n> > > +#ifndef __ANDROID_CAMERA_CAPABILITIES_H__\n> > > +#define __ANDROID_CAMERA_CAPABILITIES_H__\n> > > +\n> > > +#include <map>\n> > > +#include <memory>\n> > > +#include <vector>\n> > > +\n> > > +#include <libcamera/camera.h>\n> > > +#include <libcamera/class.h>\n> > > +#include <libcamera/formats.h>\n> > > +#include <libcamera/geometry.h>\n> > > +\n> > > +#include \"camera_metadata.h\"\n> > > +\n> > > +class CameraCapabilities\n> > > +{\n> > > +public:\n> > > +       CameraCapabilities() = default;\n> > > +\n> > > +       int initialize(std::shared_ptr<libcamera::Camera> camera,\n> > > +                      int orientation, int facing);\n> > > +\n> > > +       CameraMetadata *staticMetadata() const { return staticMetadata_.get(); }\n> > > +       libcamera::PixelFormat toPixelFormat(int format) const;\n> > > +       unsigned int maxJpegBufferSize() const { return maxJpegBufferSize_; }\n> > > +\n> > > +       std::unique_ptr<CameraMetadata> requestTemplatePreview() const;\n> > > +       std::unique_ptr<CameraMetadata> requestTemplateVideo() const;\n> > > +\n> > > +private:\n> > > +       LIBCAMERA_DISABLE_COPY_AND_MOVE(CameraCapabilities)\n> > > +\n> > > +       struct Camera3StreamConfiguration {\n> > > +               libcamera::Size resolution;\n> > > +               int androidFormat;\n> > > +       };\n> > > +\n> > > +       std::vector<libcamera::Size>\n> > > +       getYUVResolutions(libcamera::CameraConfiguration *cameraConfig,\n> > > +                         const libcamera::PixelFormat &pixelFormat,\n> > > +                         const std::vector<libcamera::Size> &resolutions);\n> > > +       std::vector<libcamera::Size>\n> > > +       getRawResolutions(const libcamera::PixelFormat &pixelFormat);\n> > > +       int initializeStreamConfigurations();\n> > > +\n> > > +       int initializeStaticMetadata();\n> > > +\n> > > +       std::shared_ptr<libcamera::Camera> camera_;\n> > > +\n> > > +       int facing_;\n> > > +       int orientation_;\n> > > +\n> > > +       std::vector<Camera3StreamConfiguration> streamConfigurations_;\n> > > +       std::map<int, libcamera::PixelFormat> formatsMap_;\n> > > +       std::unique_ptr<CameraMetadata> staticMetadata_;\n> > > +       unsigned int maxJpegBufferSize_;\n> > > +};\n> > > +\n> > > +#endif /* __ANDROID_CAMERA_CAPABILITIES_H__ */\n> > > diff --git a/src/android/camera_device.cpp b/src/android/camera_device.cpp\n> > > index 8c71fd0675d3..4bd125d7020a 100644\n> > > --- a/src/android/camera_device.cpp\n> > > +++ b/src/android/camera_device.cpp\n> > > @@ -10,11 +10,8 @@\n> > >  #include \"camera_ops.h\"\n> > >  #include \"post_processor.h\"\n> > >\n> > > -#include <array>\n> > > -#include <cmath>\n> > >  #include <fstream>\n> > >  #include <sys/mman.h>\n> > > -#include <tuple>\n> > >  #include <unistd.h>\n> > >  #include <vector>\n> > >\n> > > @@ -23,7 +20,6 @@\n> > >  #include <libcamera/formats.h>\n> > >  #include <libcamera/property_ids.h>\n> > >\n> > > -#include \"libcamera/internal/formats.h\"\n> > >  #include \"libcamera/internal/log.h\"\n> > >  #include \"libcamera/internal/thread.h\"\n> > >  #include \"libcamera/internal/utils.h\"\n> > > @@ -36,94 +32,6 @@ LOG_DECLARE_CATEGORY(HAL)\n> > >\n> > >  namespace {\n> > >\n> > > -/*\n> > > - * \\var camera3Resolutions\n> > > - * \\brief The list of image resolutions defined as mandatory to be supported by\n> > > - * the Android Camera3 specification\n> > > - */\n> > > -const std::vector<Size> camera3Resolutions = {\n> > > -       { 320, 240 },\n> > > -       { 640, 480 },\n> > > -       { 1280, 720 },\n> > > -       { 1920, 1080 }\n> > > -};\n> > > -\n> > > -/*\n> > > - * \\struct Camera3Format\n> > > - * \\brief Data associated with an Android format identifier\n> > > - * \\var libcameraFormats List of libcamera pixel formats compatible with the\n> > > - * Android format\n> > > - * \\var name The human-readable representation of the Android format code\n> > > - */\n> > > -struct Camera3Format {\n> > > -       std::vector<PixelFormat> libcameraFormats;\n> > > -       bool mandatory;\n> > > -       const char *name;\n> > > -};\n> > > -\n> > > -/*\n> > > - * \\var camera3FormatsMap\n> > > - * \\brief Associate Android format code with ancillary data\n> > > - */\n> > > -const std::map<int, const Camera3Format> camera3FormatsMap = {\n> > > -       {\n> > > -               HAL_PIXEL_FORMAT_BLOB, {\n> > > -                       { formats::MJPEG },\n> > > -                       true,\n> > > -                       \"BLOB\"\n> > > -               }\n> > > -       }, {\n> > > -               HAL_PIXEL_FORMAT_YCbCr_420_888, {\n> > > -                       { formats::NV12, formats::NV21 },\n> > > -                       true,\n> > > -                       \"YCbCr_420_888\"\n> > > -               }\n> > > -       }, {\n> > > 
-               /*\n> > > -                * \\todo Translate IMPLEMENTATION_DEFINED inspecting the gralloc\n> > > -                * usage flag. For now, copy the YCbCr_420 configuration.\n> > > -                */\n> > > -               HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> > > -                       { formats::NV12, formats::NV21 },\n> > > -                       true,\n> > > -                       \"IMPLEMENTATION_DEFINED\"\n> > > -               }\n> > > -       }, {\n> > > -               HAL_PIXEL_FORMAT_RAW10, {\n> > > -                       {\n> > > -                               formats::SBGGR10_CSI2P,\n> > > -                               formats::SGBRG10_CSI2P,\n> > > -                               formats::SGRBG10_CSI2P,\n> > > -                               formats::SRGGB10_CSI2P\n> > > -                       },\n> > > -                       false,\n> > > -                       \"RAW10\"\n> > > -               }\n> > > -       }, {\n> > > -               HAL_PIXEL_FORMAT_RAW12, {\n> > > -                       {\n> > > -                               formats::SBGGR12_CSI2P,\n> > > -                               formats::SGBRG12_CSI2P,\n> > > -                               formats::SGRBG12_CSI2P,\n> > > -                               formats::SRGGB12_CSI2P\n> > > -                       },\n> > > -                       false,\n> > > -                       \"RAW12\"\n> > > -               }\n> > > -       }, {\n> > > -               HAL_PIXEL_FORMAT_RAW16, {\n> > > -                       {\n> > > -                               formats::SBGGR16,\n> > > -                               formats::SGBRG16,\n> > > -                               formats::SGRBG16,\n> > > -                               formats::SRGGB16\n> > > -                       },\n> > > -                       false,\n> > > -                       \"RAW16\"\n> > > -               }\n> > > -       },\n> > > -};\n> > > -\n> > >  /*\n> > >   * \\struct Camera3StreamConfig\n> > >   * \\brief Data to store StreamConfiguration associated with camera3_stream(s)\n> > > @@ -512,242 +420,7 @@ int CameraDevice::initialize(const CameraConfigData *cameraConfigData)\n> > >                 orientation_ = 0;\n> > >         }\n> > >\n> > > -       /* Acquire the camera and initialize available stream configurations. */\n> > > -       int ret = camera_->acquire();\n> > > -       if (ret) {\n> > > -               LOG(HAL, Error) << \"Failed to temporarily acquire the camera\";\n> > > -               return ret;\n> > > -       }\n> > > -\n> > > -       ret = initializeStreamConfigurations();\n> > > -       camera_->release();\n> > > -       return ret;\n> > > -}\n> > > -\n> > > -std::vector<Size> CameraDevice::getYUVResolutions(CameraConfiguration *cameraConfig,\n> > > -                                                 const PixelFormat &pixelFormat,\n> > > -                                                 const std::vector<Size> &resolutions)\n> > > -{\n> > > -       std::vector<Size> supportedResolutions;\n> > > -\n> > > -       StreamConfiguration &cfg = cameraConfig->at(0);\n> > > -       for (const Size &res : resolutions) {\n> > > -               cfg.pixelFormat = pixelFormat;\n> > > -               cfg.size = res;\n> > > -\n> > > -               CameraConfiguration::Status status = cameraConfig->validate();\n> > > -               if (status != CameraConfiguration::Valid) {\n> > > -                       LOG(HAL, Debug) << cfg.toString() << \" not supported\";\n> > > -                       continue;\n> > > -               }\n> > > -\n> > > -               LOG(HAL, Debug) << cfg.toString() << \" supported\";\n> > > -\n> > > -               supportedResolutions.push_back(res);\n> > > -       }\n> > > -\n> > > -       return supportedResolutions;\n> > > -}\n> > > -\n> > > -std::vector<Size> CameraDevice::getRawResolutions(const libcamera::PixelFormat &pixelFormat)\n> > > -{\n> > > -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > > -               camera_->generateConfiguration({ StreamRole::Raw });\n> > > -       StreamConfiguration &cfg = cameraConfig->at(0);\n> > > -       const StreamFormats &formats = cfg.formats();\n> > > -       std::vector<Size> supportedResolutions = formats.sizes(pixelFormat);\n> > > -\n> > > -       return supportedResolutions;\n> > > -}\n> > > -\n> > > -/*\n> > > - * Initialize the format conversion map to translate from Android format\n> > > - * identifier to libcamera pixel formats and fill in the list of supported\n> > > - * stream configurations to be reported to the Android camera framework through\n> > > - * the static stream configuration metadata.\n> > > - */\n> > > -int CameraDevice::initializeStreamConfigurations()\n> > > -{\n> > > -       /*\n> > > -        * Get the maximum output resolutions\n> > > -        * \\todo Get this from the camera properties once defined\n> > > -        */\n> > > -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > > -               camera_->generateConfiguration({ StillCapture });\n> > > -       if (!cameraConfig) {\n> > > -               LOG(HAL, Error) << \"Failed to get maximum resolution\";\n> > > -               return -EINVAL;\n> > > -       }\n> > > -       StreamConfiguration &cfg = cameraConfig->at(0);\n> > > -\n> > > -       /*\n> > > -        * \\todo JPEG - Adjust the maximum available resolution by taking the\n> > > -        * JPEG encoder requirements into account (alignment and aspect ratio).\n> > > -        */\n> > > -       const Size maxRes = cfg.size;\n> > > -       LOG(HAL, Debug) << \"Maximum supported resolution: \" << maxRes.toString();\n> > > -\n> > > -       /*\n> > > -        * Build the list of supported image resolutions.\n> > > -        *\n> > > -        * The resolutions listed in camera3Resolution are mandatory to be\n> > > -        * supported, up to 
the camera maximum resolution.\n> > > -        *\n> > > -        * Augment the list by adding resolutions calculated from the\n> > > camera\n> > > -        * maximum one.\n> > > -        */\n> > > -       std::vector<Size> cameraResolutions;\n> > > -       std::copy_if(camera3Resolutions.begin(), camera3Resolutions.end(),\n> > > -                    std::back_inserter(cameraResolutions),\n> > > -                    [&](const Size &res) { return res < maxRes; });\n> > > -\n> > > -       /*\n> > > -        * The Camera3 specification suggests adding 1/2 and 1/4 of the\n> > > maximum\n> > > -        * resolution.\n> > > -        */\n> > > -       for (unsigned int divider = 2;; divider <<= 1) {\n> > > -               Size derivedSize{\n> > > -                       maxRes.width / divider,\n> > > -                       maxRes.height / divider,\n> > > -               };\n> > > -\n> > > -               if (derivedSize.width < 320 ||\n> > > -                   derivedSize.height < 240)\n> > > -                       break;\n> > > -\n> > > -               cameraResolutions.push_back(derivedSize);\n> > > -       }\n> > > -       cameraResolutions.push_back(maxRes);\n> > > -\n> > > -       /* Remove duplicated entries from the list of supported\n> > > resolutions. */\n> > > -       std::sort(cameraResolutions.begin(), cameraResolutions.end());\n> > > -       auto last = std::unique(cameraResolutions.begin(),\n> > > cameraResolutions.end());\n> > > -       cameraResolutions.erase(last, cameraResolutions.end());\n> > > -\n> > > -       /*\n> > > -        * Build the list of supported camera formats.\n> > > -        *\n> > > -        * To each Android format a list of compatible libcamera formats is\n> > > -        * associated. 
The first libcamera format that tests successful is\n> > > added\n> > > -        * to the format translation map used when configuring the streams.\n> > > -        * It is then tested against the list of supported camera\n> > > resolutions to\n> > > -        * build the stream configuration map reported through the camera\n> > > static\n> > > -        * metadata.\n> > > -        */\n> > > -       Size maxJpegSize;\n> > > -       for (const auto &format : camera3FormatsMap) {\n> > > -               int androidFormat = format.first;\n> > > -               const Camera3Format &camera3Format = format.second;\n> > > -               const std::vector<PixelFormat> &libcameraFormats =\n> > > -                       camera3Format.libcameraFormats;\n> > > -\n> > > -               LOG(HAL, Debug) << \"Trying to map Android format \"\n> > > -                               << camera3Format.name;\n> > > -\n> > > -               /*\n> > > -                * JPEG is always supported, either produced directly by\n> > > the\n> > > -                * camera, or encoded in the HAL.\n> > > -                */\n> > > -               if (androidFormat == HAL_PIXEL_FORMAT_BLOB) {\n> > > -                       formatsMap_[androidFormat] = formats::MJPEG;\n> > > -                       LOG(HAL, Debug) << \"Mapped Android format \"\n> > > -                                       << camera3Format.name << \" to \"\n> > > -                                       << formats::MJPEG.toString()\n> > > -                                       << \" (fixed mapping)\";\n> > > -                       continue;\n> > > -               }\n> > > -\n> > > -               /*\n> > > -                * Test the libcamera formats that can produce images\n> > > -                * compatible with the format defined by Android.\n> > > -                */\n> > > -               PixelFormat mappedFormat;\n> > > -               for (const PixelFormat &pixelFormat : libcameraFormats) {\n> > > -\n> > > -                  
     LOG(HAL, Debug) << \"Testing \" <<\n> > > pixelFormat.toString();\n> > > -\n> > > -                       /*\n> > > -                        * The stream configuration size can be adjusted,\n> > > -                        * not the pixel format.\n> > > -                        *\n> > > -                        * \\todo This could be simplified once all pipeline\n> > > -                        * handlers will report the StreamFormats list of\n> > > -                        * supported formats.\n> > > -                        */\n> > > -                       cfg.pixelFormat = pixelFormat;\n> > > -\n> > > -                       CameraConfiguration::Status status =\n> > > cameraConfig->validate();\n> > > -                       if (status != CameraConfiguration::Invalid &&\n> > > -                           cfg.pixelFormat == pixelFormat) {\n> > > -                               mappedFormat = pixelFormat;\n> > > -                               break;\n> > > -                       }\n> > > -               }\n> > > -\n> > > -               if (!mappedFormat.isValid()) {\n> > > -                       /* If the format is not mandatory, skip it. 
*/\n> > > -                       if (!camera3Format.mandatory)\n> > > -                               continue;\n> > > -\n> > > -                       LOG(HAL, Error)\n> > > -                               << \"Failed to map mandatory Android format\n> > > \"\n> > > -                               << camera3Format.name << \" (\"\n> > > -                               << utils::hex(androidFormat) << \"):\n> > > aborting\";\n> > > -                       return -EINVAL;\n> > > -               }\n> > > -\n> > > -               /*\n> > > -                * Record the mapping and then proceed to generate the\n> > > -                * stream configurations map, by testing the image\n> > > resolutions.\n> > > -                */\n> > > -               formatsMap_[androidFormat] = mappedFormat;\n> > > -               LOG(HAL, Debug) << \"Mapped Android format \"\n> > > -                               << camera3Format.name << \" to \"\n> > > -                               << mappedFormat.toString();\n> > > -\n> > > -               std::vector<Size> resolutions;\n> > > -               const PixelFormatInfo &info =\n> > > PixelFormatInfo::info(mappedFormat);\n> > > -               if (info.colourEncoding ==\n> > > PixelFormatInfo::ColourEncodingRAW)\n> > > -                       resolutions = getRawResolutions(mappedFormat);\n> > > -               else\n> > > -                       resolutions = getYUVResolutions(cameraConfig.get(),\n> > > -                                                       mappedFormat,\n> > > -                                                       cameraResolutions);\n> > > -\n> > > -               for (const Size &res : resolutions) {\n> > > -                       streamConfigurations_.push_back({ res,\n> > > androidFormat });\n> > > -\n> > > -                       /*\n> > > -                        * If the format is HAL_PIXEL_FORMAT_YCbCr_420_888\n> > > -                        * from which JPEG is produced, add an entry for\n> > > -           
             * the JPEG stream.\n> > > -                        *\n> > > -                        * \\todo Wire the JPEG encoder to query the\n> > > supported\n> > > -                        * sizes provided a list of formats it can encode.\n> > > -                        *\n> > > -                        * \\todo Support JPEG streams produced by the\n> > > Camera\n> > > -                        * natively.\n> > > -                        */\n> > > -                       if (androidFormat ==\n> > > HAL_PIXEL_FORMAT_YCbCr_420_888) {\n> > > -                               streamConfigurations_.push_back(\n> > > -                                       { res, HAL_PIXEL_FORMAT_BLOB });\n> > > -                               maxJpegSize = std::max(maxJpegSize, res);\n> > > -                       }\n> > > -               }\n> > > -\n> > > -               /*\n> > > -                * \\todo Calculate the maximum JPEG buffer size by asking\n> > > the\n> > > -                * encoder giving the maximum frame size required.\n> > > -                */\n> > > -               maxJpegBufferSize_ = maxJpegSize.width *\n> > > maxJpegSize.height * 1.5;\n> > > -       }\n> > > -\n> > > -       LOG(HAL, Debug) << \"Collected stream configuration map: \";\n> > > -       for (const auto &entry : streamConfigurations_)\n> > > -               LOG(HAL, Debug) << \"{ \" << entry.resolution.toString() <<\n> > > \" - \"\n> > > -                               << utils::hex(entry.androidFormat) << \" }\";\n> > > -\n> > > -       return 0;\n> > > +       return capabilities_.initialize(camera_, orientation_, facing_);\n> > >  }\n> > >\n> > >  /*\n> > > @@ -817,802 +490,19 @@ void CameraDevice::stop()\n> > >         state_ = State::Stopped;\n> > >  }\n> > >\n> > > -void CameraDevice::setCallbacks(const camera3_callback_ops_t *callbacks)\n> > > +unsigned int CameraDevice::maxJpegBufferSize() const\n> > >  {\n> > > -       callbacks_ = callbacks;\n> > > +       return 
capabilities_.maxJpegBufferSize();\n> > >  }\n> > >\n> > > -/*\n> > > - * Return static information for the camera.\n> > > - */\n> > > -const camera_metadata_t *CameraDevice::getStaticMetadata()\n> > > -{\n> > > -       if (staticMetadata_)\n> > > -               return staticMetadata_->get();\n> > > -\n> > > -       staticMetadata_ = std::make_unique<CameraMetadata>(64, 1024);\n> > > -       if (!staticMetadata_->isValid()) {\n> > > -               LOG(HAL, Error) << \"Failed to allocate static metadata\";\n> > > -               staticMetadata_.reset();\n> > > -               return nullptr;\n> > > -       }\n> > > -\n> > > -       const ControlInfoMap &controlsInfo = camera_->controls();\n> > > -       const ControlList &properties = camera_->properties();\n> > > -\n> > > -       /* Color correction static metadata. */\n> > > -       {\n> > > -               std::vector<uint8_t> data;\n> > > -               data.reserve(3);\n> > > -               const auto &infoMap =\n> > > controlsInfo.find(&controls::draft::ColorCorrectionAberrationMode);\n> > > -               if (infoMap != controlsInfo.end()) {\n> > > -                       for (const auto &value : infoMap->second.values())\n> > > -                               data.push_back(value.get<int32_t>());\n> > > -               } else {\n> > > -\n> > >  data.push_back(ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF);\n> > > -               }\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> > > -                                         data);\n> > > -       }\n> > > -\n> > > -       /* Control static metadata. 
*/\n> > > -       std::vector<uint8_t> aeAvailableAntiBandingModes = {\n> > > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_OFF,\n> > > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_50HZ,\n> > > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_60HZ,\n> > > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO,\n> > > -       };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> > > -                                 aeAvailableAntiBandingModes);\n> > > -\n> > > -       std::vector<uint8_t> aeAvailableModes = {\n> > > -               ANDROID_CONTROL_AE_MODE_ON,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> > > -                                 aeAvailableModes);\n> > > -\n> > > -       int64_t minFrameDurationNsec = -1;\n> > > -       int64_t maxFrameDurationNsec = -1;\n> > > -       const auto frameDurationsInfo =\n> > > controlsInfo.find(&controls::FrameDurationLimits);\n> > > -       if (frameDurationsInfo != controlsInfo.end()) {\n> > > -               minFrameDurationNsec =\n> > > frameDurationsInfo->second.min().get<int64_t>() * 1000;\n> > > -               maxFrameDurationNsec =\n> > > frameDurationsInfo->second.max().get<int64_t>() * 1000;\n> > > -\n> > > -               /*\n> > > -                * Adjust the minimum frame duration to comply with Android\n> > > -                * requirements. 
The camera service mandates all\n> > > preview/record\n> > > -                * streams to have a minimum frame duration < 33,366\n> > > milliseconds\n> > > -                * (see MAX_PREVIEW_RECORD_DURATION_NS in the camera\n> > > service\n> > > -                * implementation).\n> > > -                *\n> > > -                * If we're close enough (+ 500 useconds) to that value,\n> > > round\n> > > -                * the minimum frame duration of the camera to an accepted\n> > > -                * value.\n> > > -                */\n> > > -               static constexpr int64_t MAX_PREVIEW_RECORD_DURATION_NS =\n> > > 1e9 / 29.97;\n> > > -               if (minFrameDurationNsec > MAX_PREVIEW_RECORD_DURATION_NS\n> > > &&\n> > > -                   minFrameDurationNsec < MAX_PREVIEW_RECORD_DURATION_NS\n> > > + 500000)\n> > > -                       minFrameDurationNsec =\n> > > MAX_PREVIEW_RECORD_DURATION_NS - 1000;\n> > > -\n> > > -               /*\n> > > -                * The AE routine frame rate limits are computed using the\n> > > frame\n> > > -                * duration limits, as libcamera clips the AE routine to\n> > > the\n> > > -                * frame durations.\n> > > -                */\n> > > -               int32_t maxFps = std::round(1e9 / minFrameDurationNsec);\n> > > -               int32_t minFps = std::round(1e9 / maxFrameDurationNsec);\n> > > -               minFps = std::max(1, minFps);\n> > > -\n> > > -               /*\n> > > -                * Force rounding errors so that we have the proper frame\n> > > -                * durations for when we reuse these variables later\n> > > -                */\n> > > -               minFrameDurationNsec = 1e9 / maxFps;\n> > > -               maxFrameDurationNsec = 1e9 / minFps;\n> > > -\n> > > -               /*\n> > > -                * Register to the camera service {min, max} and {max, max}\n> > > -                * intervals as requested by the metadata documentation.\n> > > -                
*/\n> > > -               int32_t availableAeFpsTarget[] = {\n> > > -                       minFps, maxFps, maxFps, maxFps\n> > > -               };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > > -                                         availableAeFpsTarget);\n> > > -       }\n> > > -\n> > > -       std::vector<int32_t> aeCompensationRange = {\n> > > -               0, 0,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> > > -                                 aeCompensationRange);\n> > > -\n> > > -       const camera_metadata_rational_t aeCompensationStep[] = {\n> > > -               { 0, 1 }\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> > > -                                 aeCompensationStep);\n> > > -\n> > > -       std::vector<uint8_t> availableAfModes = {\n> > > -               ANDROID_CONTROL_AF_MODE_OFF,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> > > -                                 availableAfModes);\n> > > -\n> > > -       std::vector<uint8_t> availableEffects = {\n> > > -               ANDROID_CONTROL_EFFECT_MODE_OFF,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> > > -                                 availableEffects);\n> > > -\n> > > -       std::vector<uint8_t> availableSceneModes = {\n> > > -               ANDROID_CONTROL_SCENE_MODE_DISABLED,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> > > -                                 availableSceneModes);\n> > > -\n> > > -       std::vector<uint8_t> availableStabilizationModes = {\n> > > -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE_OFF,\n> > > -       };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> > > -                                
 availableStabilizationModes);\n> > > -\n> > > -       /*\n> > > -        * \\todo Inspect the Camera capabilities to report the available\n> > > -        * AWB modes. Default to AUTO as CTS tests require it.\n> > > -        */\n> > > -       std::vector<uint8_t> availableAwbModes = {\n> > > -               ANDROID_CONTROL_AWB_MODE_AUTO,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> > > -                                 availableAwbModes);\n> > > -\n> > > -       std::vector<int32_t> availableMaxRegions = {\n> > > -               0, 0, 0,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n> > > -                                 availableMaxRegions);\n> > > -\n> > > -       std::vector<uint8_t> sceneModesOverride = {\n> > > -               ANDROID_CONTROL_AE_MODE_ON,\n> > > -               ANDROID_CONTROL_AWB_MODE_AUTO,\n> > > -               ANDROID_CONTROL_AF_MODE_OFF,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> > > -                                 sceneModesOverride);\n> > > -\n> > > -       uint8_t aeLockAvailable = ANDROID_CONTROL_AE_LOCK_AVAILABLE_FALSE;\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> > > -                                 aeLockAvailable);\n> > > -\n> > > -       uint8_t awbLockAvailable =\n> > > ANDROID_CONTROL_AWB_LOCK_AVAILABLE_FALSE;\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> > > -                                 awbLockAvailable);\n> > > -\n> > > -       char availableControlModes = ANDROID_CONTROL_MODE_AUTO;\n> > > -       staticMetadata_->addEntry(ANDROID_CONTROL_AVAILABLE_MODES,\n> > > -                                 availableControlModes);\n> > > -\n> > > -       /* JPEG static metadata. 
*/\n> > > -\n> > > -       /*\n> > > -        * Create the list of supported thumbnail sizes by inspecting the\n> > > -        * available JPEG resolutions collected in streamConfigurations_\n> > > and\n> > > -        * generate one entry for each aspect ratio.\n> > > -        *\n> > > -        * The JPEG thumbnailer can freely scale, so pick an arbitrary\n> > > -        * (160, 160) size as the bounding rectangle, which is then\n> > > cropped to\n> > > -        * the different supported aspect ratios.\n> > > -        */\n> > > -       constexpr Size maxJpegThumbnail(160, 160);\n> > > -       std::vector<Size> thumbnailSizes;\n> > > -       thumbnailSizes.push_back({ 0, 0 });\n> > > -       for (const auto &entry : streamConfigurations_) {\n> > > -               if (entry.androidFormat != HAL_PIXEL_FORMAT_BLOB)\n> > > -                       continue;\n> > > -\n> > > -               Size thumbnailSize = maxJpegThumbnail\n> > > -                                    .boundedToAspectRatio({\n> > > entry.resolution.width,\n> > > -\n> > > entry.resolution.height });\n> > > -               thumbnailSizes.push_back(thumbnailSize);\n> > > -       }\n> > > -\n> > > -       std::sort(thumbnailSizes.begin(), thumbnailSizes.end());\n> > > -       auto last = std::unique(thumbnailSizes.begin(),\n> > > thumbnailSizes.end());\n> > > -       thumbnailSizes.erase(last, thumbnailSizes.end());\n> > > -\n> > > -       /* Transform sizes in to a list of integers that can be consumed.\n> > > */\n> > > -       std::vector<int32_t> thumbnailEntries;\n> > > -       thumbnailEntries.reserve(thumbnailSizes.size() * 2);\n> > > -       for (const auto &size : thumbnailSizes) {\n> > > -               thumbnailEntries.push_back(size.width);\n> > > -               thumbnailEntries.push_back(size.height);\n> > > -       }\n> > > -       staticMetadata_->addEntry(ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> > > -                                 thumbnailEntries);\n> > > -\n> > > -       
staticMetadata_->addEntry(ANDROID_JPEG_MAX_SIZE,\n> > > maxJpegBufferSize_);\n> > > -\n> > > -       /* Sensor static metadata. */\n> > > -       std::array<int32_t, 2> pixelArraySize;\n> > > -       {\n> > > -               const Size &size =\n> > > properties.get(properties::PixelArraySize);\n> > > -               pixelArraySize[0] = size.width;\n> > > -               pixelArraySize[1] = size.height;\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> > > -                                         pixelArraySize);\n> > > -       }\n> > > -\n> > > -       if (properties.contains(properties::UnitCellSize)) {\n> > > -               const Size &cellSize =\n> > > properties.get<Size>(properties::UnitCellSize);\n> > > -               std::array<float, 2> physicalSize{\n> > > -                       cellSize.width * pixelArraySize[0] / 1e6f,\n> > > -                       cellSize.height * pixelArraySize[1] / 1e6f\n> > > -               };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> > > -                                         physicalSize);\n> > > -       }\n> > > -\n> > > -       {\n> > > -               const Span<const Rectangle> &rects =\n> > > -                       properties.get(properties::PixelArrayActiveAreas);\n> > > -               std::vector<int32_t> data{\n> > > -                       static_cast<int32_t>(rects[0].x),\n> > > -                       static_cast<int32_t>(rects[0].y),\n> > > -                       static_cast<int32_t>(rects[0].width),\n> > > -                       static_cast<int32_t>(rects[0].height),\n> > > -               };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> > > -                                         data);\n> > > -       }\n> > > -\n> > > -       int32_t sensitivityRange[] = {\n> > > -               32, 2400,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> > > 
-                                 sensitivityRange);\n> > > -\n> > > -       /* Report the color filter arrangement if the camera reports it. */\n> > > -       if\n> > > (properties.contains(properties::draft::ColorFilterArrangement)) {\n> > > -               uint8_t filterArr =\n> > > properties.get(properties::draft::ColorFilterArrangement);\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> > > -                                         filterArr);\n> > > -       }\n> > > -\n> > > -       const auto &exposureInfo =\n> > > controlsInfo.find(&controls::ExposureTime);\n> > > -       if (exposureInfo != controlsInfo.end()) {\n> > > -               int64_t exposureTimeRange[2] = {\n> > > -                       exposureInfo->second.min().get<int32_t>() * 1000LL,\n> > > -                       exposureInfo->second.max().get<int32_t>() * 1000LL,\n> > > -               };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> > > -                                         exposureTimeRange, 2);\n> > > -       }\n> > > -\n> > > -       staticMetadata_->addEntry(ANDROID_SENSOR_ORIENTATION,\n> > > orientation_);\n> > > -\n> > > -       std::vector<int32_t> testPatternModes = {\n> > > -               ANDROID_SENSOR_TEST_PATTERN_MODE_OFF\n> > > -       };\n> > > -       const auto &testPatternsInfo =\n> > > -               controlsInfo.find(&controls::draft::TestPatternMode);\n> > > -       if (testPatternsInfo != controlsInfo.end()) {\n> > > -               const auto &values = testPatternsInfo->second.values();\n> > > -               ASSERT(!values.empty());\n> > > -               for (const auto &value : values) {\n> > > -                       switch (value.get<int32_t>()) {\n> > > -                       case controls::draft::TestPatternModeOff:\n> > > -                               /*\n> > > -                                * ANDROID_SENSOR_TEST_PATTERN_MODE_OFF is\n> > > -                          
      * already in testPatternModes.\n> > > -                                */\n> > > -                               break;\n> > > -\n> > > -                       case controls::draft::TestPatternModeSolidColor:\n> > > -                               testPatternModes.push_back(\n> > > -\n> > >  ANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR);\n> > > -                               break;\n> > > -\n> > > -                       case controls::draft::TestPatternModeColorBars:\n> > > -                               testPatternModes.push_back(\n> > > -\n> > >  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS);\n> > > -                               break;\n> > > -\n> > > -                       case\n> > > controls::draft::TestPatternModeColorBarsFadeToGray:\n> > > -                               testPatternModes.push_back(\n> > > -\n> > >  ANDROID_SENSOR_TEST_PATTERN_MODE_COLOR_BARS_FADE_TO_GRAY);\n> > > -                               break;\n> > > -\n> > > -                       case controls::draft::TestPatternModePn9:\n> > > -                               testPatternModes.push_back(\n> > > -\n> > >  ANDROID_SENSOR_TEST_PATTERN_MODE_PN9);\n> > > -                               break;\n> > > -\n> > > -                       case controls::draft::TestPatternModeCustom1:\n> > > -                               /* We don't support this yet. 
*/\n> > > -                               break;\n> > > -\n> > > -                       default:\n> > > -                               LOG(HAL, Error) << \"Unknown test pattern\n> > > mode: \"\n> > > -                                               << value.get<int32_t>();\n> > > -                               continue;\n> > > -                       }\n> > > -               }\n> > > -       }\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> > > -                                 testPatternModes);\n> > > -\n> > > -       uint8_t timestampSource =\n> > > ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN;\n> > > -       staticMetadata_->addEntry(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> > > -                                 timestampSource);\n> > > -\n> > > -       if (maxFrameDurationNsec > 0)\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> > > -                                         maxFrameDurationNsec);\n> > > -\n> > > -       /* Statistics static metadata. 
*/\n> > > -       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> > > -                                 faceDetectMode);\n> > > -\n> > > -       int32_t maxFaceCount = 0;\n> > > -       staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> > > -                                 maxFaceCount);\n> > > -\n> > > -       {\n> > > -               std::vector<uint8_t> data;\n> > > -               data.reserve(2);\n> > > -               const auto &infoMap =\n> > > controlsInfo.find(&controls::draft::LensShadingMapMode);\n> > > -               if (infoMap != controlsInfo.end()) {\n> > > -                       for (const auto &value : infoMap->second.values())\n> > > -                               data.push_back(value.get<int32_t>());\n> > > -               } else {\n> > > -\n> > >  data.push_back(ANDROID_STATISTICS_LENS_SHADING_MAP_MODE_OFF);\n> > > -               }\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_STATISTICS_INFO_AVAILABLE_LENS_SHADING_MAP_MODES,\n> > > -                                         data);\n> > > -       }\n> > > -\n> > > -       /* Sync static metadata. */\n> > > -       int32_t maxLatency = ANDROID_SYNC_MAX_LATENCY_UNKNOWN;\n> > > -       staticMetadata_->addEntry(ANDROID_SYNC_MAX_LATENCY, maxLatency);\n> > > -\n> > > -       /* Flash static metadata. */\n> > > -       char flashAvailable = ANDROID_FLASH_INFO_AVAILABLE_FALSE;\n> > > -       staticMetadata_->addEntry(ANDROID_FLASH_INFO_AVAILABLE,\n> > > -                                 flashAvailable);\n> > > -\n> > > -       /* Lens static metadata. 
*/\n> > > -       std::vector<float> lensApertures = {\n> > > -               2.53 / 100,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> > > -                                 lensApertures);\n> > > -\n> > > -       uint8_t lensFacing;\n> > > -       switch (facing_) {\n> > > -       default:\n> > > -       case CAMERA_FACING_FRONT:\n> > > -               lensFacing = ANDROID_LENS_FACING_FRONT;\n> > > -               break;\n> > > -       case CAMERA_FACING_BACK:\n> > > -               lensFacing = ANDROID_LENS_FACING_BACK;\n> > > -               break;\n> > > -       case CAMERA_FACING_EXTERNAL:\n> > > -               lensFacing = ANDROID_LENS_FACING_EXTERNAL;\n> > > -               break;\n> > > -       }\n> > > -       staticMetadata_->addEntry(ANDROID_LENS_FACING, lensFacing);\n> > > -\n> > > -       std::vector<float> lensFocalLengths = {\n> > > -               1,\n> > > -       };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> > > -                                 lensFocalLengths);\n> > > -\n> > > -       std::vector<uint8_t> opticalStabilizations = {\n> > > -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF,\n> > > -       };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> > > -                                 opticalStabilizations);\n> > > -\n> > > -       float hypeFocalDistance = 0;\n> > > -       staticMetadata_->addEntry(ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> > > -                                 hypeFocalDistance);\n> > > -\n> > > -       float minFocusDistance = 0;\n> > > -       staticMetadata_->addEntry(ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> > > -                                 minFocusDistance);\n> > > -\n> > > -       /* Noise reduction modes. 
*/\n> > > -       {\n> > > -               std::vector<uint8_t> data;\n> > > -               data.reserve(5);\n> > > -               const auto &infoMap =\n> > > controlsInfo.find(&controls::draft::NoiseReductionMode);\n> > > -               if (infoMap != controlsInfo.end()) {\n> > > -                       for (const auto &value : infoMap->second.values())\n> > > -                               data.push_back(value.get<int32_t>());\n> > > -               } else {\n> > > -                       data.push_back(ANDROID_NOISE_REDUCTION_MODE_OFF);\n> > > -               }\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> > > -                                         data);\n> > > -       }\n> > > -\n> > > -       /* Scaler static metadata. */\n> > > -\n> > > -       /*\n> > > -        * \\todo The digital zoom factor is a property that depends on the\n> > > -        * desired output configuration and the sensor frame size input to\n> > > the\n> > > -        * ISP. 
This information is not available to the Android HAL, not\n> > > at\n> > > -        * initialization time at least.\n> > > -        *\n> > > -        * As a workaround rely on pipeline handlers initializing the\n> > > -        * ScalerCrop control with the camera default configuration and\n> > > use the\n> > > -        * maximum and minimum crop rectangles to calculate the digital\n> > > zoom\n> > > -        * factor.\n> > > -        */\n> > > -       float maxZoom = 1.0f;\n> > > -       const auto scalerCrop = controlsInfo.find(&controls::ScalerCrop);\n> > > -       if (scalerCrop != controlsInfo.end()) {\n> > > -               Rectangle min = scalerCrop->second.min().get<Rectangle>();\n> > > -               Rectangle max = scalerCrop->second.max().get<Rectangle>();\n> > > -               maxZoom = std::min(1.0f * max.width / min.width,\n> > > -                                  1.0f * max.height / min.height);\n> > > -       }\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> > > -                                 maxZoom);\n> > > -\n> > > -       std::vector<uint32_t> availableStreamConfigurations;\n> > > -       availableStreamConfigurations.reserve(streamConfigurations_.size()\n> > > * 4);\n> > > -       for (const auto &entry : streamConfigurations_) {\n> > > -\n> > >  availableStreamConfigurations.push_back(entry.androidFormat);\n> > > -\n> > >  availableStreamConfigurations.push_back(entry.resolution.width);\n> > > -\n> > >  availableStreamConfigurations.push_back(entry.resolution.height);\n> > > -               availableStreamConfigurations.push_back(\n> > > -\n> > >  ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT);\n> > > -       }\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> > > -                                 availableStreamConfigurations);\n> > > -\n> > > -       std::vector<int64_t> availableStallDurations = {\n> > > -               
ANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920,\n> > > 33333333,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> > > -                                 availableStallDurations);\n> > > -\n> > > -       /* Use the minimum frame duration for all the YUV/RGB formats. */\n> > > -       if (minFrameDurationNsec > 0) {\n> > > -               std::vector<int64_t> minFrameDurations;\n> > > -               minFrameDurations.reserve(streamConfigurations_.size() *\n> > > 4);\n> > > -               for (const auto &entry : streamConfigurations_) {\n> > > -                       minFrameDurations.push_back(entry.androidFormat);\n> > > -\n> > >  minFrameDurations.push_back(entry.resolution.width);\n> > > -\n> > >  minFrameDurations.push_back(entry.resolution.height);\n> > > -                       minFrameDurations.push_back(minFrameDurationNsec);\n> > > -               }\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> > > -                                         minFrameDurations);\n> > > -       }\n> > > -\n> > > -       uint8_t croppingType = ANDROID_SCALER_CROPPING_TYPE_CENTER_ONLY;\n> > > -       staticMetadata_->addEntry(ANDROID_SCALER_CROPPING_TYPE,\n> > > croppingType);\n> > > -\n> > > -       /* Info static metadata. */\n> > > -       uint8_t supportedHWLevel =\n> > > ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;\n> > > -       staticMetadata_->addEntry(ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> > > -                                 supportedHWLevel);\n> > > -\n> > > -       /* Request static metadata. */\n> > > -       int32_t partialResultCount = 1;\n> > > -       staticMetadata_->addEntry(ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> > > -                                 partialResultCount);\n> > > -\n> > > -       {\n> > > -               /* Default the value to 2 if not reported by the camera. 
*/\n> > > -               uint8_t maxPipelineDepth = 2;\n> > > -               const auto &infoMap =\n> > > controlsInfo.find(&controls::draft::PipelineDepth);\n> > > -               if (infoMap != controlsInfo.end())\n> > > -                       maxPipelineDepth =\n> > > infoMap->second.max().get<int32_t>();\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> > > -                                         maxPipelineDepth);\n> > > -       }\n> > > -\n> > > -       /* LIMITED does not support reprocessing. */\n> > > -       uint32_t maxNumInputStreams = 0;\n> > > -       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> > > -                                 maxNumInputStreams);\n> > > -\n> > > -       std::vector<uint8_t> availableCapabilities = {\n> > > -               ANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE,\n> > > -       };\n> > > -\n> > > -       /* Report if camera supports RAW. */\n> > > -       bool rawStreamAvailable = false;\n> > > -       std::unique_ptr<CameraConfiguration> cameraConfig =\n> > > -               camera_->generateConfiguration({ StreamRole::Raw });\n> > > -       if (cameraConfig && !cameraConfig->empty()) {\n> > > -               const PixelFormatInfo &info =\n> > > -\n> > >  PixelFormatInfo::info(cameraConfig->at(0).pixelFormat);\n> > > -               /* Only advertise RAW support if RAW16 is possible. 
*/\n> > > -               if (info.colourEncoding ==\n> > > PixelFormatInfo::ColourEncodingRAW &&\n> > > -                   info.bitsPerPixel == 16) {\n> > > -                       rawStreamAvailable = true;\n> > > -\n> > >  availableCapabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);\n> > > -               }\n> > > -       }\n> > > -\n> > > -       /* Number of { RAW, YUV, JPEG } supported output streams */\n> > > -       int32_t numOutStreams[] = { rawStreamAvailable, 2, 1 };\n> > > -       staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> > > -                                 numOutStreams);\n> > > -\n> > > -       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> > > -                                 availableCapabilities);\n> > > -\n> > > -       std::vector<int32_t> availableCharacteristicsKeys = {\n> > > -               ANDROID_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES,\n> > > -               ANDROID_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES,\n> > > -               ANDROID_CONTROL_AE_AVAILABLE_MODES,\n> > > -               ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > > -               ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n> > > -               ANDROID_CONTROL_AE_COMPENSATION_STEP,\n> > > -               ANDROID_CONTROL_AE_LOCK_AVAILABLE,\n> > > -               ANDROID_CONTROL_AF_AVAILABLE_MODES,\n> > > -               ANDROID_CONTROL_AVAILABLE_EFFECTS,\n> > > -               ANDROID_CONTROL_AVAILABLE_MODES,\n> > > -               ANDROID_CONTROL_AVAILABLE_SCENE_MODES,\n> > > -               ANDROID_CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES,\n> > > -               ANDROID_CONTROL_AWB_AVAILABLE_MODES,\n> > > -               ANDROID_CONTROL_AWB_LOCK_AVAILABLE,\n> > > -               ANDROID_CONTROL_MAX_REGIONS,\n> > > -               ANDROID_CONTROL_SCENE_MODE_OVERRIDES,\n> > > -               ANDROID_FLASH_INFO_AVAILABLE,\n> > > -               ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL,\n> > > -    
           ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES,\n> > > -               ANDROID_JPEG_MAX_SIZE,\n> > > -               ANDROID_LENS_FACING,\n> > > -               ANDROID_LENS_INFO_AVAILABLE_APERTURES,\n> > > -               ANDROID_LENS_INFO_AVAILABLE_FOCAL_LENGTHS,\n> > > -               ANDROID_LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION,\n> > > -               ANDROID_LENS_INFO_HYPERFOCAL_DISTANCE,\n> > > -               ANDROID_LENS_INFO_MINIMUM_FOCUS_DISTANCE,\n> > > -               ANDROID_NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES,\n> > > -               ANDROID_REQUEST_AVAILABLE_CAPABILITIES,\n> > > -               ANDROID_REQUEST_MAX_NUM_INPUT_STREAMS,\n> > > -               ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,\n> > > -               ANDROID_REQUEST_PARTIAL_RESULT_COUNT,\n> > > -               ANDROID_REQUEST_PIPELINE_MAX_DEPTH,\n> > > -               ANDROID_SCALER_AVAILABLE_MAX_DIGITAL_ZOOM,\n> > > -               ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS,\n> > > -               ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n> > > -               ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS,\n> > > -               ANDROID_SCALER_CROPPING_TYPE,\n> > > -               ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES,\n> > > -               ANDROID_SENSOR_INFO_ACTIVE_ARRAY_SIZE,\n> > > -               ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT,\n> > > -               ANDROID_SENSOR_INFO_EXPOSURE_TIME_RANGE,\n> > > -               ANDROID_SENSOR_INFO_MAX_FRAME_DURATION,\n> > > -               ANDROID_SENSOR_INFO_PHYSICAL_SIZE,\n> > > -               ANDROID_SENSOR_INFO_PIXEL_ARRAY_SIZE,\n> > > -               ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n> > > -               ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE,\n> > > -               ANDROID_SENSOR_ORIENTATION,\n> > > -               ANDROID_STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES,\n> > > -               ANDROID_STATISTICS_INFO_MAX_FACE_COUNT,\n> > > -               ANDROID_SYNC_MAX_LATENCY,\n> > > -   
    };\n> > > -\n> > >  staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_CHARACTERISTICS_KEYS,\n> > > -                                 availableCharacteristicsKeys);\n> > > -\n> > > -       std::vector<int32_t> availableRequestKeys = {\n> > > -               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > > -               ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > > -               ANDROID_CONTROL_AE_LOCK,\n> > > -               ANDROID_CONTROL_AE_MODE,\n> > > -               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > > -               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > > -               ANDROID_CONTROL_AF_MODE,\n> > > -               ANDROID_CONTROL_AF_TRIGGER,\n> > > -               ANDROID_CONTROL_AWB_LOCK,\n> > > -               ANDROID_CONTROL_AWB_MODE,\n> > > -               ANDROID_CONTROL_CAPTURE_INTENT,\n> > > -               ANDROID_CONTROL_EFFECT_MODE,\n> > > -               ANDROID_CONTROL_MODE,\n> > > -               ANDROID_CONTROL_SCENE_MODE,\n> > > -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> > > -               ANDROID_FLASH_MODE,\n> > > -               ANDROID_JPEG_ORIENTATION,\n> > > -               ANDROID_JPEG_QUALITY,\n> > > -               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> > > -               ANDROID_JPEG_THUMBNAIL_SIZE,\n> > > -               ANDROID_LENS_APERTURE,\n> > > -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > > -               ANDROID_NOISE_REDUCTION_MODE,\n> > > -               ANDROID_SCALER_CROP_REGION,\n> > > -               ANDROID_STATISTICS_FACE_DETECT_MODE\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_REQUEST_KEYS,\n> > > -                                 availableRequestKeys);\n> > > -\n> > > -       std::vector<int32_t> availableResultKeys = {\n> > > -               ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > > -               ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > > -     
          ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > > -               ANDROID_CONTROL_AE_LOCK,\n> > > -               ANDROID_CONTROL_AE_MODE,\n> > > -               ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > > -               ANDROID_CONTROL_AE_STATE,\n> > > -               ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > > -               ANDROID_CONTROL_AF_MODE,\n> > > -               ANDROID_CONTROL_AF_STATE,\n> > > -               ANDROID_CONTROL_AF_TRIGGER,\n> > > -               ANDROID_CONTROL_AWB_LOCK,\n> > > -               ANDROID_CONTROL_AWB_MODE,\n> > > -               ANDROID_CONTROL_AWB_STATE,\n> > > -               ANDROID_CONTROL_CAPTURE_INTENT,\n> > > -               ANDROID_CONTROL_EFFECT_MODE,\n> > > -               ANDROID_CONTROL_MODE,\n> > > -               ANDROID_CONTROL_SCENE_MODE,\n> > > -               ANDROID_CONTROL_VIDEO_STABILIZATION_MODE,\n> > > -               ANDROID_FLASH_MODE,\n> > > -               ANDROID_FLASH_STATE,\n> > > -               ANDROID_JPEG_GPS_COORDINATES,\n> > > -               ANDROID_JPEG_GPS_PROCESSING_METHOD,\n> > > -               ANDROID_JPEG_GPS_TIMESTAMP,\n> > > -               ANDROID_JPEG_ORIENTATION,\n> > > -               ANDROID_JPEG_QUALITY,\n> > > -               ANDROID_JPEG_SIZE,\n> > > -               ANDROID_JPEG_THUMBNAIL_QUALITY,\n> > > -               ANDROID_JPEG_THUMBNAIL_SIZE,\n> > > -               ANDROID_LENS_APERTURE,\n> > > -               ANDROID_LENS_FOCAL_LENGTH,\n> > > -               ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > > -               ANDROID_LENS_STATE,\n> > > -               ANDROID_NOISE_REDUCTION_MODE,\n> > > -               ANDROID_REQUEST_PIPELINE_DEPTH,\n> > > -               ANDROID_SCALER_CROP_REGION,\n> > > -               ANDROID_SENSOR_EXPOSURE_TIME,\n> > > -               ANDROID_SENSOR_FRAME_DURATION,\n> > > -               ANDROID_SENSOR_ROLLING_SHUTTER_SKEW,\n> > > -               ANDROID_SENSOR_TEST_PATTERN_MODE,\n> > > -               
ANDROID_SENSOR_TIMESTAMP,\n> > > -               ANDROID_STATISTICS_FACE_DETECT_MODE,\n> > > -               ANDROID_STATISTICS_LENS_SHADING_MAP_MODE,\n> > > -               ANDROID_STATISTICS_HOT_PIXEL_MAP_MODE,\n> > > -               ANDROID_STATISTICS_SCENE_FLICKER,\n> > > -       };\n> > > -       staticMetadata_->addEntry(ANDROID_REQUEST_AVAILABLE_RESULT_KEYS,\n> > > -                                 availableResultKeys);\n> > > -\n> > > -       if (!staticMetadata_->isValid()) {\n> > > -               LOG(HAL, Error) << \"Failed to construct static metadata\";\n> > > -               staticMetadata_.reset();\n> > > -               return nullptr;\n> > > -       }\n> > > -\n> > > -       if (staticMetadata_->resized()) {\n> > > -               auto [entryCount, dataCount] = staticMetadata_->usage();\n> > > -               LOG(HAL, Info)\n> > > -                       << \"Static metadata resized: \" << entryCount\n> > > -                       << \" entries and \" << dataCount << \" bytes used\";\n> > > -       }\n> > > -\n> > > -       return staticMetadata_->get();\n> > > -}\n> > > -\n> > > -std::unique_ptr<CameraMetadata> CameraDevice::requestTemplatePreview()\n> > > +void CameraDevice::setCallbacks(const camera3_callback_ops_t *callbacks)\n> > >  {\n> > > -       /*\n> > > -        * \\todo Keep this in sync with the actual number of entries.\n> > > -        * Currently: 20 entries, 35 bytes\n> > > -        */\n> > > -       auto requestTemplate = std::make_unique<CameraMetadata>(21, 36);\n> > > -       if (!requestTemplate->isValid()) {\n> > > -               return nullptr;\n> > > -       }\n> > > -\n> > > -       /* Get the FPS range registered in the static metadata. 
*/\n> > > -       camera_metadata_ro_entry_t entry;\n> > > -       bool found =\n> > > staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > > -                                              &entry);\n> > > -       if (!found) {\n> > > -               LOG(HAL, Error) << \"Cannot create capture template without\n> > > FPS range\";\n> > > -               return nullptr;\n> > > -       }\n> > > -\n> > > -       /*\n> > > -        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> > > -        * has been assembled as {{min, max} {max, max}}.\n> > > -        */\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > > -                                 entry.data.i32, 2);\n> > > -\n> > > -       uint8_t aeMode = ANDROID_CONTROL_AE_MODE_ON;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_MODE, aeMode);\n> > > -\n> > > -       int32_t aeExposureCompensation = 0;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION,\n> > > -                                 aeExposureCompensation);\n> > > -\n> > > -       uint8_t aePrecaptureTrigger =\n> > > ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_IDLE;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER,\n> > > -                                 aePrecaptureTrigger);\n> > > -\n> > > -       uint8_t aeLock = ANDROID_CONTROL_AE_LOCK_OFF;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_LOCK, aeLock);\n> > > -\n> > > -       uint8_t aeAntibandingMode =\n> > > ANDROID_CONTROL_AE_ANTIBANDING_MODE_AUTO;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AE_ANTIBANDING_MODE,\n> > > -                                 aeAntibandingMode);\n> > > -\n> > > -       uint8_t afMode = ANDROID_CONTROL_AF_MODE_OFF;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AF_MODE, afMode);\n> > > -\n> > > -       uint8_t afTrigger = ANDROID_CONTROL_AF_TRIGGER_IDLE;\n> > > -       
requestTemplate->addEntry(ANDROID_CONTROL_AF_TRIGGER, afTrigger);\n> > > -\n> > > -       uint8_t awbMode = ANDROID_CONTROL_AWB_MODE_AUTO;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AWB_MODE, awbMode);\n> > > -\n> > > -       uint8_t awbLock = ANDROID_CONTROL_AWB_LOCK_OFF;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_AWB_LOCK, awbLock);\n> > > -\n> > > -       uint8_t flashMode = ANDROID_FLASH_MODE_OFF;\n> > > -       requestTemplate->addEntry(ANDROID_FLASH_MODE, flashMode);\n> > > -\n> > > -       uint8_t faceDetectMode = ANDROID_STATISTICS_FACE_DETECT_MODE_OFF;\n> > > -       requestTemplate->addEntry(ANDROID_STATISTICS_FACE_DETECT_MODE,\n> > > -                                 faceDetectMode);\n> > > -\n> > > -       uint8_t noiseReduction = ANDROID_NOISE_REDUCTION_MODE_OFF;\n> > > -       requestTemplate->addEntry(ANDROID_NOISE_REDUCTION_MODE,\n> > > -                                 noiseReduction);\n> > > -\n> > > -       uint8_t aberrationMode =\n> > > ANDROID_COLOR_CORRECTION_ABERRATION_MODE_OFF;\n> > > -       requestTemplate->addEntry(ANDROID_COLOR_CORRECTION_ABERRATION_MODE,\n> > > -                                 aberrationMode);\n> > > -\n> > > -       uint8_t controlMode = ANDROID_CONTROL_MODE_AUTO;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_MODE, controlMode);\n> > > -\n> > > -       float lensAperture = 2.53 / 100;\n> > > -       requestTemplate->addEntry(ANDROID_LENS_APERTURE, lensAperture);\n> > > -\n> > > -       uint8_t opticalStabilization =\n> > > ANDROID_LENS_OPTICAL_STABILIZATION_MODE_OFF;\n> > > -       requestTemplate->addEntry(ANDROID_LENS_OPTICAL_STABILIZATION_MODE,\n> > > -                                 opticalStabilization);\n> > > -\n> > > -       uint8_t captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> > > -       requestTemplate->addEntry(ANDROID_CONTROL_CAPTURE_INTENT,\n> > > -                                 captureIntent);\n> > > -\n> > > -       return requestTemplate;\n> > > 
+       callbacks_ = callbacks;\n> > >  }\n> > >\n> > > -std::unique_ptr<CameraMetadata> CameraDevice::requestTemplateVideo()\n> > > +const camera_metadata_t *CameraDevice::getStaticMetadata()\n> > >  {\n> > > -       std::unique_ptr<CameraMetadata> previewTemplate =\n> > > requestTemplatePreview();\n> > > -       if (!previewTemplate)\n> > > -               return nullptr;\n> > > -\n> > > -       /*\n> > > -        * The video template requires a fixed FPS range. Everything else\n> > > -        * stays the same as the preview template.\n> > > -        */\n> > > -       camera_metadata_ro_entry_t entry;\n> > > -\n> > >  staticMetadata_->getEntry(ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES,\n> > > -                                 &entry);\n> > > -\n> > > -       /*\n> > > -        * Assume the AE_AVAILABLE_TARGET_FPS_RANGE static metadata\n> > > -        * has been assembled as {{min, max} {max, max}}.\n> > > -        */\n> > > -       previewTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,\n> > > -                                    entry.data.i32 + 2, 2);\n> > > -\n> > > -       return previewTemplate;\n> > > +       return capabilities_.staticMetadata()->get();\n> > >  }\n> > >\n> > >  /*\n> > > @@ -1630,7 +520,7 @@ const camera_metadata_t\n> > > *CameraDevice::constructDefaultRequestSettings(int type)\n> > >         switch (type) {\n> > >         case CAMERA3_TEMPLATE_PREVIEW:\n> > >                 captureIntent = ANDROID_CONTROL_CAPTURE_INTENT_PREVIEW;\n> > > -               requestTemplate = requestTemplatePreview();\n> > > +               requestTemplate = capabilities_.requestTemplatePreview();\n> > >                 break;\n> > >         case CAMERA3_TEMPLATE_STILL_CAPTURE:\n> > >                 /*\n> > > @@ -1638,15 +528,15 @@ const camera_metadata_t\n> > > *CameraDevice::constructDefaultRequestSettings(int type)\n> > >                  * for the torch mode we currently do not support.\n> > >                  */\n> > >                 
captureIntent =\n> > > ANDROID_CONTROL_CAPTURE_INTENT_STILL_CAPTURE;\n> > > -               requestTemplate = requestTemplatePreview();\n> > > +               requestTemplate = capabilities_.requestTemplatePreview();\n> > >                 break;\n> > >         case CAMERA3_TEMPLATE_VIDEO_RECORD:\n> > >                 captureIntent =\n> > > ANDROID_CONTROL_CAPTURE_INTENT_VIDEO_RECORD;\n> > > -               requestTemplate = requestTemplateVideo();\n> > > +               requestTemplate = capabilities_.requestTemplateVideo();\n> > >                 break;\n> > >         case CAMERA3_TEMPLATE_VIDEO_SNAPSHOT:\n> > >                 captureIntent =\n> > > ANDROID_CONTROL_CAPTURE_INTENT_VIDEO_SNAPSHOT;\n> > > -               requestTemplate = requestTemplateVideo();\n> > > +               requestTemplate = capabilities_.requestTemplateVideo();\n> > >                 break;\n> > >         /* \\todo Implement templates generation for the remaining use\n> > > cases. */\n> > >         case CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG:\n> > > @@ -1668,19 +558,6 @@ const camera_metadata_t\n> > > *CameraDevice::constructDefaultRequestSettings(int type)\n> > >         return requestTemplates_[type]->get();\n> > >  }\n> > >\n> > > -PixelFormat CameraDevice::toPixelFormat(int format) const\n> > > -{\n> > > -       /* Translate Android format code to libcamera pixel format. 
*/\n> > > -       auto it = formatsMap_.find(format);\n> > > -       if (it == formatsMap_.end()) {\n> > > -               LOG(HAL, Error) << \"Requested format \" <<\n> > > utils::hex(format)\n> > > -                               << \" not supported\";\n> > > -               return PixelFormat();\n> > > -       }\n> > > -\n> > > -       return it->second;\n> > > -}\n> > > -\n> > >  /*\n> > >   * Inspect the stream_list to produce a list of StreamConfiguration to\n> > >   * be use to configure the Camera.\n> > > @@ -1727,7 +604,7 @@ int\n> > > CameraDevice::configureStreams(camera3_stream_configuration_t *stream_list)\n> > >                 camera3_stream_t *stream = stream_list->streams[i];\n> > >                 Size size(stream->width, stream->height);\n> > >\n> > > -               PixelFormat format = toPixelFormat(stream->format);\n> > > +               PixelFormat format =\n> > > capabilities_.toPixelFormat(stream->format);\n> > >\n> > >                 LOG(HAL, Info) << \"Stream #\" << i\n> > >                                << \", direction: \" << stream->stream_type\n> > > diff --git a/src/android/camera_device.h b/src/android/camera_device.h\n> > > index 4aadb27c562c..090fe28a551e 100644\n> > > --- a/src/android/camera_device.h\n> > > +++ b/src/android/camera_device.h\n> > > @@ -10,14 +10,12 @@\n> > >  #include <map>\n> > >  #include <memory>\n> > >  #include <mutex>\n> > > -#include <tuple>\n> > >  #include <vector>\n> > >\n> > >  #include <hardware/camera3.h>\n> > >\n> > >  #include <libcamera/buffer.h>\n> > >  #include <libcamera/camera.h>\n> > > -#include <libcamera/geometry.h>\n> > >  #include <libcamera/request.h>\n> > >  #include <libcamera/stream.h>\n> > >\n> > > @@ -26,6 +24,7 @@\n> > >  #include \"libcamera/internal/message.h\"\n> > >  #include \"libcamera/internal/thread.h\"\n> > >\n> > > +#include \"camera_capabilities.h\"\n> > >  #include \"camera_metadata.h\"\n> > >  #include \"camera_stream.h\"\n> > >  #include \"camera_worker.h\"\n> > > 
@@ -57,7 +56,7 @@ public:\n> > >         const std::string &model() const { return model_; }\n> > >         int facing() const { return facing_; }\n> > >         int orientation() const { return orientation_; }\n> > > -       unsigned int maxJpegBufferSize() const { return\n> > > maxJpegBufferSize_; }\n> > > +       unsigned int maxJpegBufferSize() const;\n> > >\n> > >         void setCallbacks(const camera3_callback_ops_t *callbacks);\n> > >         const camera_metadata_t *getStaticMetadata();\n> > > @@ -86,11 +85,6 @@ private:\n> > >                 std::unique_ptr<CaptureRequest> request_;\n> > >         };\n> > >\n> > > -       struct Camera3StreamConfiguration {\n> > > -               libcamera::Size resolution;\n> > > -               int androidFormat;\n> > > -       };\n> > > -\n> > >         enum class State {\n> > >                 Stopped,\n> > >                 Flushing,\n> > > @@ -99,22 +93,11 @@ private:\n> > >\n> > >         void stop();\n> > >\n> > > -       int initializeStreamConfigurations();\n> > > -       std::vector<libcamera::Size>\n> > > -       getYUVResolutions(libcamera::CameraConfiguration *cameraConfig,\n> > > -                         const libcamera::PixelFormat &pixelFormat,\n> > > -                         const std::vector<libcamera::Size> &resolutions);\n> > > -       std::vector<libcamera::Size>\n> > > -       getRawResolutions(const libcamera::PixelFormat &pixelFormat);\n> > > -\n> > >         libcamera::FrameBuffer *createFrameBuffer(const buffer_handle_t\n> > > camera3buffer);\n> > >         void abortRequest(camera3_capture_request_t *request);\n> > >         void notifyShutter(uint32_t frameNumber, uint64_t timestamp);\n> > >         void notifyError(uint32_t frameNumber, camera3_stream_t *stream,\n> > >                          camera3_error_msg_code code);\n> > > -       std::unique_ptr<CameraMetadata> requestTemplatePreview();\n> > > -       std::unique_ptr<CameraMetadata> requestTemplateVideo();\n> > > -       
libcamera::PixelFormat toPixelFormat(int format) const;\n> > >         int processControls(Camera3RequestDescriptor *descriptor);\n> > >         std::unique_ptr<CameraMetadata> getResultMetadata(\n> > >                 const Camera3RequestDescriptor &descriptor) const;\n> > > @@ -129,13 +112,11 @@ private:\n> > >\n> > >         std::shared_ptr<libcamera::Camera> camera_;\n> > >         std::unique_ptr<libcamera::CameraConfiguration> config_;\n> > > +       CameraCapabilities capabilities_;\n> > >\n> > > -       std::unique_ptr<CameraMetadata> staticMetadata_;\n> > >         std::map<unsigned int, std::unique_ptr<CameraMetadata>>\n> > > requestTemplates_;\n> > >         const camera3_callback_ops_t *callbacks_;\n> > >\n> > > -       std::vector<Camera3StreamConfiguration> streamConfigurations_;\n> > > -       std::map<int, libcamera::PixelFormat> formatsMap_;\n> > >         std::vector<CameraStream> streams_;\n> > >\n> > >         libcamera::Mutex descriptorsMutex_; /* Protects descriptors_. 
*/\n> > > @@ -147,8 +128,6 @@ private:\n> > >         int facing_;\n> > >         int orientation_;\n> > >\n> > > -       unsigned int maxJpegBufferSize_;\n> > > -\n> > >         CameraMetadata lastSettings_;\n> > >  };\n> > >\n> > > diff --git a/src/android/meson.build b/src/android/meson.build\n> > > index f27fd5316705..6270fb201338 100644\n> > > --- a/src/android/meson.build\n> > > +++ b/src/android/meson.build\n> > > @@ -44,6 +44,7 @@ subdir('cros')\n> > >\n> > >  android_hal_sources = files([\n> > >      'camera3_hal.cpp',\n> > > +    'camera_capabilities.cpp',\n> > >      'camera_device.cpp',\n> > >      'camera_hal_config.cpp',\n> > >      'camera_hal_manager.cpp',\n\nSince the code is copied as-is,\nReviewed-by: Hirokazu Honda <hiroh@chromium.org>\n\nAlthough stylecheck script complains some part of code, do you think fixing it?\n-Hiro\n> > > --\n> > > 2.31.1\n> > >\n> > >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 48F92C321B\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 23 Jun 2021 03:25:15 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 7F60D68935;\n\tWed, 23 Jun 2021 05:25:14 +0200 (CEST)","from mail-ej1-x62e.google.com (mail-ej1-x62e.google.com\n\t[IPv6:2a00:1450:4864:20::62e])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id F3DD76050A\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 23 Jun 2021 05:25:12 +0200 (CEST)","by mail-ej1-x62e.google.com with SMTP id bu12so1737981ejb.0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tTue, 22 Jun 2021 20:25:12 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=chromium.org 
header.i=@chromium.org\n\theader.b=\"ecS91327\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed; d=chromium.org;\n\ts=google; \n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=iIyGiwaAQetfyz9rRoC5661TohpvBna0vFMDwn3jTyA=;\n\tb=ecS91327oejFumVAsJyGfGDynbH97DBRSNc89hzv3Fh75hQPIrL5vc9WqcdTXUxuMw\n\t9Jlh6T5Jrtk0iLFrf4RKP0+mtULHXfpVOlL9H3JHL+1S0Hz2Uz8zoGDGmkTxJzjkImmT\n\tUUwJZ4yQbKBHZuPcR4Ht2kuMueOB5s5Jq52Kg=","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20161025;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=iIyGiwaAQetfyz9rRoC5661TohpvBna0vFMDwn3jTyA=;\n\tb=Ku8grn3qS9OJXb2shschslCNWg/SG4z7tKX7buaRsUa9rSoV4ifepAe5ZWDnv3IfQI\n\to6fc9q9JtQ8ActCZBTtv6UIJL3PM1EA+DnEogCu2/1regoUynp116l5vb5dz9Ai+/Hld\n\tgthb+zhVsPXJS7JPDFvZjCDU0NFog0CDoiacFfsPp2e1rSXayz/Gx5AY8ANno7/hMf4I\n\tvAsYsqGKbecz5P1DylhONAajHri8v69JVZTfPUrWfim89sstmhBUvywye0L1cdBHX15P\n\tGyevX6uLGbH4wHAQhVVGbVqyrj6YbQfMvTjkTGYgLPHxuHcdSs3ur9Y7ilJn3rrnD5Mp\n\tp12A==","X-Gm-Message-State":"AOAM533O7ffo1qd5bPIfebLgBfozzxjsR3YLA7TQ6ufFy5keXCCz0uwX\n\tdSsixoWyc9a2MTmTlemrpeBUFFRdgNnsOP/qjg5WSg==","X-Google-Smtp-Source":"ABdhPJw5837ASszS1ajyngG+8CettF136oCxa6YEjUM9EM0faLBkd0gL8B1YxBbFoRSeH/0BOWiEOniyneZd9pbuo6o=","X-Received":"by 2002:a17:906:dbd8:: with SMTP id\n\tyc24mr7505345ejb.55.1624418711740; \n\tTue, 22 Jun 2021 20:25:11 -0700 (PDT)","MIME-Version":"1.0","References":"<20210621152954.40299-1-jacopo@jmondi.org>\n\t<20210621152954.40299-3-jacopo@jmondi.org>\n\t<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>\n\t<20210622074107.6oeneo3z4yluohsz@uno.localdomain>","In-Reply-To":"<20210622074107.6oeneo3z4yluohsz@uno.localdomain>","From":"Hirokazu Honda <hiroh@chromium.org>","Date":"Wed, 23 Jun 2021 12:25:01 +0900","Message-ID":"<CAO5uPHP-KNPPFDXWph7D5xqaNt-o4TBA-Ta_ksM7LFx=uC3mtA@mail.gmail.com>","To":"Jacopo Mondi 
<jacopo@jmondi.org>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":17705,"web_url":"https://patchwork.libcamera.org/comment/17705/","msgid":"<20210623070342.nbh7mrqoiqdrq7kv@uno.localdomain>","date":"2021-06-23T07:03:42","subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","submitter":{"id":3,"url":"https://patchwork.libcamera.org/api/people/3/","name":"Jacopo Mondi","email":"jacopo@jmondi.org"},"content":"Hi Hiro,\n\nOn Wed, Jun 23, 2021 at 12:25:01PM +0900, Hirokazu Honda wrote:\n> Hi Jacopo,\n>\n\n[snip]\n\n> > > >  android_hal_sources = files([\n> > > >      'camera3_hal.cpp',\n> > > > +    'camera_capabilities.cpp',\n> > > >      'camera_device.cpp',\n> > > >      'camera_hal_config.cpp',\n> > > >      'camera_hal_manager.cpp',\n>\n> Since the code is copied as-is,\n> Reviewed-by: Hirokazu Honda <hiroh@chromium.org>\n\nThanks\n\n>\n> Although stylecheck script complains some part of code, do you think fixing it?\n\nTo be honest, none of the suggested changes make much sense. 
I'll copy\nthem here for reference.\n\n--- src/android/camera_capabilities.cpp\n+++ src/android/camera_capabilities.cpp\n@@ -55,62 +55,19 @@\n  * \\brief Associate Android format code with ancillary data\n  */\n const std::map<int, const Camera3Format> camera3FormatsMap = {\n-\t{\n-\t\tHAL_PIXEL_FORMAT_BLOB, {\n-\t\t\t{ formats::MJPEG },\n-\t\t\ttrue,\n-\t\t\t\"BLOB\"\n-\t\t}\n-\t}, {\n-\t\tHAL_PIXEL_FORMAT_YCbCr_420_888, {\n-\t\t\t{ formats::NV12, formats::NV21 },\n-\t\t\ttrue,\n-\t\t\t\"YCbCr_420_888\"\n-\t\t}\n-\t}, {\n-\t\t/*\n+\t{ HAL_PIXEL_FORMAT_BLOB, { { formats::MJPEG }, true, \"BLOB\" } },\n+\t{ HAL_PIXEL_FORMAT_YCbCr_420_888, { { formats::NV12, formats::NV21 }, true, \"YCbCr_420_888\" } },\n+\t{ /*\n \t\t * \\todo Translate IMPLEMENTATION_DEFINED inspecting the gralloc\n \t\t * usage flag. For now, copy the YCbCr_420 configuration.\n \t\t */\n-\t\tHAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n-\t\t\t{ formats::NV12, formats::NV21 },\n-\t\t\ttrue,\n-\t\t\t\"IMPLEMENTATION_DEFINED\"\n-\t\t}\n-\t}, {\n-\t\tHAL_PIXEL_FORMAT_RAW10, {\n-\t\t\t{\n-\t\t\t\tformats::SBGGR10_CSI2P,\n-\t\t\t\tformats::SGBRG10_CSI2P,\n-\t\t\t\tformats::SGRBG10_CSI2P,\n-\t\t\t\tformats::SRGGB10_CSI2P\n-\t\t\t},\n-\t\t\tfalse,\n-\t\t\t\"RAW10\"\n-\t\t}\n-\t}, {\n-\t\tHAL_PIXEL_FORMAT_RAW12, {\n-\t\t\t{\n-\t\t\t\tformats::SBGGR12_CSI2P,\n-\t\t\t\tformats::SGBRG12_CSI2P,\n-\t\t\t\tformats::SGRBG12_CSI2P,\n-\t\t\t\tformats::SRGGB12_CSI2P\n-\t\t\t},\n-\t\t\tfalse,\n-\t\t\t\"RAW12\"\n-\t\t}\n-\t}, {\n-\t\tHAL_PIXEL_FORMAT_RAW16, {\n-\t\t\t{\n-\t\t\t\tformats::SBGGR16,\n-\t\t\t\tformats::SGBRG16,\n-\t\t\t\tformats::SGRBG16,\n-\t\t\t\tformats::SRGGB16\n-\t\t\t},\n-\t\t\tfalse,\n-\t\t\t\"RAW16\"\n-\t\t}\n-\t},\n+\t  HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED,\n+\t  { { formats::NV12, formats::NV21 },\n+\t    true,\n+\t    \"IMPLEMENTATION_DEFINED\" } },\n+\t{ HAL_PIXEL_FORMAT_RAW10, { { formats::SBGGR10_CSI2P, formats::SGBRG10_CSI2P, formats::SGRBG10_CSI2P, formats::SRGGB10_CSI2P }, false, 
\"RAW10\" } },\n+\t{ HAL_PIXEL_FORMAT_RAW12, { { formats::SBGGR12_CSI2P, formats::SGBRG12_CSI2P, formats::SGRBG12_CSI2P, formats::SRGGB12_CSI2P }, false, \"RAW12\" } },\n+\t{ HAL_PIXEL_FORMAT_RAW16, { { formats::SBGGR16, formats::SGBRG16, formats::SGRBG16, formats::SRGGB16 }, false, \"RAW16\" } },\n };\n\n } /* namespace */\n@@ -276,7 +233,6 @@\n \t\t */\n \t\tPixelFormat mappedFormat;\n \t\tfor (const PixelFormat &pixelFormat : libcameraFormats) {\n-\n \t\t\tLOG(HAL, Debug) << \"Testing \" << pixelFormat.toString();\n\n \t\t\t/*\n@@ -457,7 +413,8 @@\n \t}\n\n \tstd::vector<int32_t> aeCompensationRange = {\n-\t\t0, 0,\n+\t\t0,\n+\t\t0,\n \t};\n \tstaticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n \t\t\t\t  aeCompensationRange);\n@@ -503,7 +460,9 @@\n \t\t\t\t  availableAwbModes);\n\n \tstd::vector<int32_t> availableMaxRegions = {\n-\t\t0, 0, 0,\n+\t\t0,\n+\t\t0,\n+\t\t0,\n \t};\n \tstaticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n \t\t\t\t  availableMaxRegions);\n@@ -547,8 +506,8 @@\n \t\t\tcontinue;\n\n \t\tSize thumbnailSize = maxJpegThumbnail\n-\t\t\t\t     .boundedToAspectRatio({ entry.resolution.width,\n-\t\t\t\t\t\t\t     entry.resolution.height });\n+\t\t\t\t\t     .boundedToAspectRatio({ entry.resolution.width,\n+\t\t\t\t\t\t\t\t     entry.resolution.height });\n \t\tthumbnailSizes.push_back(thumbnailSize);\n \t}\n\n@@ -602,7 +561,8 @@\n \t}\n\n \tint32_t sensitivityRange[] = {\n-\t\t32, 2400,\n+\t\t32,\n+\t\t2400,\n \t};\n \tstaticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n \t\t\t\t  sensitivityRange);\n@@ -811,7 +771,10 @@\n \t\t\t\t  availableStreamConfigurations);\n\n \tstd::vector<int64_t> availableStallDurations = {\n-\t\tANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920, 33333333,\n+\t\tANDROID_SCALER_AVAILABLE_FORMATS_BLOB,\n+\t\t2560,\n+\t\t1920,\n+\t\t33333333,\n \t};\n \tstaticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n \t\t\t\t  availableStallDurations);\n---\n7 potential issues 
detected, please review\n\nI think I'll ignore them all.\nThanks\n   j\n\n> -Hiro\n> > > > --\n> > > > 2.31.1\n> > > >\n> > > >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 6E9ABC321A\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 23 Jun 2021 07:02:58 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id AF3AB68935;\n\tWed, 23 Jun 2021 09:02:57 +0200 (CEST)","from relay5-d.mail.gandi.net (relay5-d.mail.gandi.net\n\t[217.70.183.197])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id A69B268931\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 23 Jun 2021 09:02:55 +0200 (CEST)","(Authenticated sender: jacopo@jmondi.org)\n\tby relay5-d.mail.gandi.net (Postfix) with ESMTPSA id 901321C0005;\n\tWed, 23 Jun 2021 07:02:53 +0000 (UTC)"],"Date":"Wed, 23 Jun 2021 09:03:42 +0200","From":"Jacopo Mondi <jacopo@jmondi.org>","To":"Hirokazu Honda <hiroh@chromium.org>","Message-ID":"<20210623070342.nbh7mrqoiqdrq7kv@uno.localdomain>","References":"<20210621152954.40299-1-jacopo@jmondi.org>\n\t<20210621152954.40299-3-jacopo@jmondi.org>\n\t<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>\n\t<20210622074107.6oeneo3z4yluohsz@uno.localdomain>\n\t<CAO5uPHP-KNPPFDXWph7D5xqaNt-o4TBA-Ta_ksM7LFx=uC3mtA@mail.gmail.com>","MIME-Version":"1.0","Content-Type":"text/plain; charset=utf-8","Content-Disposition":"inline","In-Reply-To":"<CAO5uPHP-KNPPFDXWph7D5xqaNt-o4TBA-Ta_ksM7LFx=uC3mtA@mail.gmail.com>","Subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties 
class","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":17706,"web_url":"https://patchwork.libcamera.org/comment/17706/","msgid":"<CAO5uPHMpEoPkW_PPQ3cCU9Q59edg77aEc=e1ysir68QtQTm4cA@mail.gmail.com>","date":"2021-06-23T07:08:00","subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","submitter":{"id":63,"url":"https://patchwork.libcamera.org/api/people/63/","name":"Hirokazu Honda","email":"hiroh@chromium.org"},"content":"Hi Jacopo,\n\nOn Wed, Jun 23, 2021 at 4:02 PM Jacopo Mondi <jacopo@jmondi.org> wrote:\n>\n> Hi Hiro,\n>\n> On Wed, Jun 23, 2021 at 12:25:01PM +0900, Hirokazu Honda wrote:\n> > Hi Jacopo,\n> >\n>\n> [snip]\n>\n> > > > >  android_hal_sources = files([\n> > > > >      'camera3_hal.cpp',\n> > > > > +    'camera_capabilities.cpp',\n> > > > >      'camera_device.cpp',\n> > > > >      'camera_hal_config.cpp',\n> > > > >      'camera_hal_manager.cpp',\n> >\n> > Since the code is copied as-is,\n> > Reviewed-by: Hirokazu Honda <hiroh@chromium.org>\n>\n> Thanks\n>\n> >\n> > Although stylecheck script complains some part of code, do you think fixing it?\n>\n> To be honest, none of the suggested changes make much sense. 
I'll copy\n> them here for reference.\n>\n> --- src/android/camera_capabilities.cpp\n> +++ src/android/camera_capabilities.cpp\n> @@ -55,62 +55,19 @@\n>   * \\brief Associate Android format code with ancillary data\n>   */\n>  const std::map<int, const Camera3Format> camera3FormatsMap = {\n> -       {\n> -               HAL_PIXEL_FORMAT_BLOB, {\n> -                       { formats::MJPEG },\n> -                       true,\n> -                       \"BLOB\"\n> -               }\n> -       }, {\n> -               HAL_PIXEL_FORMAT_YCbCr_420_888, {\n> -                       { formats::NV12, formats::NV21 },\n> -                       true,\n> -                       \"YCbCr_420_888\"\n> -               }\n> -       }, {\n> -               /*\n> +       { HAL_PIXEL_FORMAT_BLOB, { { formats::MJPEG }, true, \"BLOB\" } },\n> +       { HAL_PIXEL_FORMAT_YCbCr_420_888, { { formats::NV12, formats::NV21 }, true, \"YCbCr_420_888\" } },\n> +       { /*\n>                  * \\todo Translate IMPLEMENTATION_DEFINED inspecting the gralloc\n>                  * usage flag. 
For now, copy the YCbCr_420 configuration.\n>                  */\n> -               HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {\n> -                       { formats::NV12, formats::NV21 },\n> -                       true,\n> -                       \"IMPLEMENTATION_DEFINED\"\n> -               }\n> -       }, {\n> -               HAL_PIXEL_FORMAT_RAW10, {\n> -                       {\n> -                               formats::SBGGR10_CSI2P,\n> -                               formats::SGBRG10_CSI2P,\n> -                               formats::SGRBG10_CSI2P,\n> -                               formats::SRGGB10_CSI2P\n> -                       },\n> -                       false,\n> -                       \"RAW10\"\n> -               }\n> -       }, {\n> -               HAL_PIXEL_FORMAT_RAW12, {\n> -                       {\n> -                               formats::SBGGR12_CSI2P,\n> -                               formats::SGBRG12_CSI2P,\n> -                               formats::SGRBG12_CSI2P,\n> -                               formats::SRGGB12_CSI2P\n> -                       },\n> -                       false,\n> -                       \"RAW12\"\n> -               }\n> -       }, {\n> -               HAL_PIXEL_FORMAT_RAW16, {\n> -                       {\n> -                               formats::SBGGR16,\n> -                               formats::SGBRG16,\n> -                               formats::SGRBG16,\n> -                               formats::SRGGB16\n> -                       },\n> -                       false,\n> -                       \"RAW16\"\n> -               }\n> -       },\n> +         HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED,\n> +         { { formats::NV12, formats::NV21 },\n> +           true,\n> +           \"IMPLEMENTATION_DEFINED\" } },\n> +       { HAL_PIXEL_FORMAT_RAW10, { { formats::SBGGR10_CSI2P, formats::SGBRG10_CSI2P, formats::SGRBG10_CSI2P, formats::SRGGB10_CSI2P }, false, \"RAW10\" } },\n> +       { HAL_PIXEL_FORMAT_RAW12, { 
{ formats::SBGGR12_CSI2P, formats::SGBRG12_CSI2P, formats::SGRBG12_CSI2P, formats::SRGGB12_CSI2P }, false, \"RAW12\" } },\n> +       { HAL_PIXEL_FORMAT_RAW16, { { formats::SBGGR16, formats::SGBRG16, formats::SGRBG16, formats::SRGGB16 }, false, \"RAW16\" } },\n>  };\n>\n>  } /* namespace */\n> @@ -276,7 +233,6 @@\n>                  */\n>                 PixelFormat mappedFormat;\n>                 for (const PixelFormat &pixelFormat : libcameraFormats) {\n> -\n>                         LOG(HAL, Debug) << \"Testing \" << pixelFormat.toString();\n>\n>                         /*\n> @@ -457,7 +413,8 @@\n>         }\n>\n>         std::vector<int32_t> aeCompensationRange = {\n> -               0, 0,\n> +               0,\n> +               0,\n>         };\n>         staticMetadata_->addEntry(ANDROID_CONTROL_AE_COMPENSATION_RANGE,\n>                                   aeCompensationRange);\n> @@ -503,7 +460,9 @@\n>                                   availableAwbModes);\n>\n>         std::vector<int32_t> availableMaxRegions = {\n> -               0, 0, 0,\n> +               0,\n> +               0,\n> +               0,\n>         };\n>         staticMetadata_->addEntry(ANDROID_CONTROL_MAX_REGIONS,\n>                                   availableMaxRegions);\n> @@ -547,8 +506,8 @@\n>                         continue;\n>\n>                 Size thumbnailSize = maxJpegThumbnail\n> -                                    .boundedToAspectRatio({ entry.resolution.width,\n> -                                                            entry.resolution.height });\n> +                                            .boundedToAspectRatio({ entry.resolution.width,\n> +                                                                    entry.resolution.height });\n>                 thumbnailSizes.push_back(thumbnailSize);\n>         }\n>\n> @@ -602,7 +561,8 @@\n>         }\n>\n>         int32_t sensitivityRange[] = {\n> -               32, 2400,\n> +               32,\n> +               
2400,\n>         };\n>         staticMetadata_->addEntry(ANDROID_SENSOR_INFO_SENSITIVITY_RANGE,\n>                                   sensitivityRange);\n> @@ -811,7 +771,10 @@\n>                                   availableStreamConfigurations);\n>\n>         std::vector<int64_t> availableStallDurations = {\n> -               ANDROID_SCALER_AVAILABLE_FORMATS_BLOB, 2560, 1920, 33333333,\n> +               ANDROID_SCALER_AVAILABLE_FORMATS_BLOB,\n> +               2560,\n> +               1920,\n> +               33333333,\n>         };\n>         staticMetadata_->addEntry(ANDROID_SCALER_AVAILABLE_STALL_DURATIONS,\n>                                   availableStallDurations);\n> ---\n> 7 potential issues detected, please review\n>\n> I think I'll ignore them all.\n> Thanks\n\nAck.\n>    j\n>\n> > -Hiro\n> > > > > --\n> > > > > 2.31.1\n> > > > >\n> > > > >","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id D576AC321B\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 23 Jun 2021 07:08:13 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id AA55B68935;\n\tWed, 23 Jun 2021 09:08:12 +0200 (CEST)","from mail-ed1-x534.google.com (mail-ed1-x534.google.com\n\t[IPv6:2a00:1450:4864:20::534])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id 00D8668931\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 23 Jun 2021 09:08:10 +0200 (CEST)","by mail-ed1-x534.google.com with SMTP id d7so2062287edx.0\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 23 Jun 2021 00:08:10 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=chromium.org 
header.i=@chromium.org\n\theader.b=\"c18uPSsx\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed; d=chromium.org;\n\ts=google; \n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=hi9eLV15p+tnztrV8+z29nxSM8bwGMD6YtxUnrEwlGI=;\n\tb=c18uPSsxyR9jNQIi35Dn94ikG3p98OnUqku2CH2uBD9m+yMUVey94sRrDVL1rQyHa6\n\tWvvIvoEPKeXbtHbFfnIJOTyQxGsUghP2Mc1Cbn4bp6sYlzZt4d+smMIn9rADGaZBy5lG\n\tE8nmAelv+xUZ36o79SiT80fv1T2bt5RUGikZc=","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; s=20161025;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=hi9eLV15p+tnztrV8+z29nxSM8bwGMD6YtxUnrEwlGI=;\n\tb=Ddx3ScCgOYMyeKR4pQqBCBoNWMiccaPdWgOzQNrGJySohC3vddZiZ+RHKoq3XxFY1z\n\t2hYlI/luhBqjgwxnYwPrXJzmBq+LAbmMCR6M4pVYV4KgGZ9t1lA7yj5GrAE9GQ8H0j9X\n\t9NnT6I3F8K0F30gvgNvBY+zSSltjTNztSrQ5fMWPR0hr8AULu1/+ramgUyXxzSuHiRHN\n\tY97348R2qYtatTP0uxXBLn/8wWDWUV/7s96zxWNwWGqqxqlcYmRXAdLCgedmV01N/vCn\n\tsCJYVpUGWHoRIcphxvt7p8eKIbmYd9FUENCpE7LotGkvghxg2BAKHOz66KFD669tJDkG\n\t3SsQ==","X-Gm-Message-State":"AOAM5324H0EjnHKwP3UK9bxOzVngI8SB0V7er/68mdrwtI/4Cy2C5Y8K\n\t6UVE9KNwkNBTHAjUsbSmf+aVKkAm+qJg54qJPDo0Hw==","X-Google-Smtp-Source":"ABdhPJxpYo9YlrNR/S4x4JJNj4LE4UbXa1ujg67tdOPBNTrMR/7+TTxlkyxme7wzWnUjCivil0fAMiz8cQi902BS3Sw=","X-Received":"by 2002:a05:6402:151:: with SMTP id\n\ts17mr10126073edu.233.1624432090590; \n\tWed, 23 Jun 2021 00:08:10 -0700 (PDT)","MIME-Version":"1.0","References":"<20210621152954.40299-1-jacopo@jmondi.org>\n\t<20210621152954.40299-3-jacopo@jmondi.org>\n\t<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>\n\t<20210622074107.6oeneo3z4yluohsz@uno.localdomain>\n\t<CAO5uPHP-KNPPFDXWph7D5xqaNt-o4TBA-Ta_ksM7LFx=uC3mtA@mail.gmail.com>\n\t<20210623070342.nbh7mrqoiqdrq7kv@uno.localdomain>","In-Reply-To":"<20210623070342.nbh7mrqoiqdrq7kv@uno.localdomain>","From":"Hirokazu Honda <hiroh@chromium.org>","Date":"Wed, 23 Jun 2021 16:08:00 
+0900","Message-ID":"<CAO5uPHMpEoPkW_PPQ3cCU9Q59edg77aEc=e1ysir68QtQTm4cA@mail.gmail.com>","To":"Jacopo Mondi <jacopo@jmondi.org>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":17721,"web_url":"https://patchwork.libcamera.org/comment/17721/","msgid":"<e555e2da-c347-4652-9296-1f5f7950a290@ideasonboard.com>","date":"2021-06-23T09:51:34","subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","submitter":{"id":4,"url":"https://patchwork.libcamera.org/api/people/4/","name":"Kieran Bingham","email":"kieran.bingham@ideasonboard.com"},"content":"Hi Hiro,\n\nOn 23/06/2021 04:25, Hirokazu Honda wrote:\n> Hi Jacopo,\n> \n> On Tue, Jun 22, 2021 at 4:40 PM Jacopo Mondi <jacopo@jmondi.org> wrote:\n>>\n>> Hi Hiro,\n>>\n>> On Tue, Jun 22, 2021 at 10:34:27AM +0900, Hirokazu Honda wrote:\n>>> Hi Jacopo, thank you for the patch.\n>>>\n>>> I failed to apply the patch on the top of the latest tree to review.\n>>> Could you tell me the parent commit which I can apply this patch?\n>>\n>> weird, I just 
applied these two patches cleanly on the latest\n>> master which for me is\n>> 969da3189439 (\"libcamera: utils: Support systems that lack secure_getenv and issetugid\"\n>>\n> \n> I tried applying the series whose id is 2162. But the series doesn't\n> contain \"[PATCH 1/2] android: Sort source files alphabetically\".\n> In fact, the patch is not shown in patchwork. It is strange.\n> I managed to apply this patch after manually applying 1/2.\n\nI'm sorry about this, indeed - I think we have noticed that at times\npatchwork is either rejecting or not receiving some patches. I have been\ntrying to identify why, but haven't got to the bottom of it yet.\n--\nKieran","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 32E6CC321A\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 23 Jun 2021 09:51:40 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id A76DA68935;\n\tWed, 23 Jun 2021 11:51:39 +0200 (CEST)","from perceval.ideasonboard.com (perceval.ideasonboard.com\n\t[IPv6:2001:4b98:dc2:55:216:3eff:fef7:d647])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id AC05268931\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 23 Jun 2021 11:51:37 +0200 (CEST)","from [192.168.0.20]\n\t(cpc89244-aztw30-2-0-cust3082.18-1.cable.virginm.net [86.31.172.11])\n\tby perceval.ideasonboard.com (Postfix) with ESMTPSA id 2150CEE;\n\tWed, 23 Jun 2021 11:51:37 +0200 (CEST)"],"Authentication-Results":"lancelot.ideasonboard.com; dkim=pass (1024-bit key;\n\tunprotected) header.d=ideasonboard.com header.i=@ideasonboard.com\n\theader.b=\"mDYbvXH5\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/simple; d=ideasonboard.com;\n\ts=mail; 
t=1624441897;\n\tbh=cRK9lVe3GXscDPW2VhkQ7u1AXnvn2yMdZpMQ4oJDbFw=;\n\th=Subject:To:Cc:References:From:Date:In-Reply-To:From;\n\tb=mDYbvXH5Cauqd2U4B2BG2Qjovfmrn+IP/JStpnEyUacZAPP/MehcwO7Xw182Bm5Dq\n\tyKJhT5w9zI7HFRzyQCM7lxLrdLfDpslbGZm04RymdQAqCT/Zk8pODxCAvchcFH/glE\n\tmAYcoyK1v4hrjxTVPxXZGKDRiIEoS4gmmCHnZx5c=","To":"Hirokazu Honda <hiroh@chromium.org>, Jacopo Mondi <jacopo@jmondi.org>","References":"<20210621152954.40299-1-jacopo@jmondi.org>\n\t<20210621152954.40299-3-jacopo@jmondi.org>\n\t<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>\n\t<20210622074107.6oeneo3z4yluohsz@uno.localdomain>\n\t<CAO5uPHP-KNPPFDXWph7D5xqaNt-o4TBA-Ta_ksM7LFx=uC3mtA@mail.gmail.com>","From":"Kieran Bingham <kieran.bingham@ideasonboard.com>","Message-ID":"<e555e2da-c347-4652-9296-1f5f7950a290@ideasonboard.com>","Date":"Wed, 23 Jun 2021 10:51:34 +0100","User-Agent":"Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101\n\tThunderbird/78.8.1","MIME-Version":"1.0","In-Reply-To":"<CAO5uPHP-KNPPFDXWph7D5xqaNt-o4TBA-Ta_ksM7LFx=uC3mtA@mail.gmail.com>","Content-Type":"text/plain; charset=utf-8","Content-Language":"en-GB","Content-Transfer-Encoding":"7bit","Subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel 
<libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}},{"id":17722,"web_url":"https://patchwork.libcamera.org/comment/17722/","msgid":"<CAO5uPHP=Dkkki6N1W=aPC2x6xKqUiCob=Wpo55SDvZBWpP6Y+w@mail.gmail.com>","date":"2021-06-23T09:57:42","subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties class","submitter":{"id":63,"url":"https://patchwork.libcamera.org/api/people/63/","name":"Hirokazu Honda","email":"hiroh@chromium.org"},"content":"Hi Kieran,\n\nOn Wed, Jun 23, 2021 at 6:51 PM Kieran Bingham\n<kieran.bingham@ideasonboard.com> wrote:\n>\n> Hi Hiro,\n>\n> On 23/06/2021 04:25, Hirokazu Honda wrote:\n> > Hi Jacopo,\n> >\n> > On Tue, Jun 22, 2021 at 4:40 PM Jacopo Mondi <jacopo@jmondi.org> wrote:\n> >>\n> >> Hi Hiro,\n> >>\n> >> On Tue, Jun 22, 2021 at 10:34:27AM +0900, Hirokazu Honda wrote:\n> >>> Hi Jacopo, thank you for the patch.\n> >>>\n> >>> I failed to apply the patch on the top of the latest tree to review.\n> >>> Could you tell me the parent commit which I can apply this patch?\n> >>\n> >> weird, I just applied these two patches cleanly on the latest\n> >> master which for me is\n> >> 969da3189439 (\"libcamera: utils: Support systems that lack secure_getenv and issetugid\"\n> >>\n> >\n> > I tried applying the series whose id is 2162. But the series doesn't\n> > contain \"[PATCH 1/2] android: Sort source files alphabetically\".\n> > In fact, the patch is not shown in patchwork. It is strange.\n> > I managed to apply this patch after manually applying 1/2.\n>\n> I'm sorry about this, indeed - I think we have noticed that at times\n> patchwork is either rejecting or not receiving some patches. 
I have been\n> trying to identify why, but haven't got to the bottom of it yet.\n\nOh, that's good to know.\nThanks for letting me know.\n\n-Hiro\n> --\n> Kieran","headers":{"Return-Path":"<libcamera-devel-bounces@lists.libcamera.org>","X-Original-To":"parsemail@patchwork.libcamera.org","Delivered-To":"parsemail@patchwork.libcamera.org","Received":["from lancelot.ideasonboard.com (lancelot.ideasonboard.com\n\t[92.243.16.209])\n\tby patchwork.libcamera.org (Postfix) with ESMTPS id 0D22DC321B\n\tfor <parsemail@patchwork.libcamera.org>;\n\tWed, 23 Jun 2021 09:57:54 +0000 (UTC)","from lancelot.ideasonboard.com (localhost [IPv6:::1])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTP id 7F60168935;\n\tWed, 23 Jun 2021 11:57:53 +0200 (CEST)","from mail-ej1-x62b.google.com (mail-ej1-x62b.google.com\n\t[IPv6:2a00:1450:4864:20::62b])\n\tby lancelot.ideasonboard.com (Postfix) with ESMTPS id BCE6B68931\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 23 Jun 2021 11:57:52 +0200 (CEST)","by mail-ej1-x62b.google.com with SMTP id nb6so3058555ejc.10\n\tfor <libcamera-devel@lists.libcamera.org>;\n\tWed, 23 Jun 2021 02:57:52 -0700 (PDT)"],"Authentication-Results":"lancelot.ideasonboard.com;\n\tdkim=fail reason=\"signature verification failed\" (1024-bit key;\n\tunprotected) header.d=chromium.org header.i=@chromium.org\n\theader.b=\"FXamFRlj\"; dkim-atps=neutral","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed; d=chromium.org;\n\ts=google; \n\th=mime-version:references:in-reply-to:from:date:message-id:subject:to\n\t:cc; bh=yqKe2S+WpHGpGLroIZh6xwKA3vEicBY7TXrQUeq5neI=;\n\tb=FXamFRljAO1qXMF+eBQmcFAf+DPP6WaZRnSidK6yjenZZi1a80N5WlTxG39AKLLYML\n\t25XD0JekGBVEsaVkkCQXk59n+wlsr2pbBcd2nLB6FuVwS1W1dZf89mbGXPAvIKClpK0K\n\tAzsIZwJAyjJl7ChtvD40Gi3TB2Rr5BQQ3FN+0=","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n\td=1e100.net; 
s=20161025;\n\th=x-gm-message-state:mime-version:references:in-reply-to:from:date\n\t:message-id:subject:to:cc;\n\tbh=yqKe2S+WpHGpGLroIZh6xwKA3vEicBY7TXrQUeq5neI=;\n\tb=nLb/mtTD4fdPwZCD/0hOiKymdQPGwsxanhHaXItk/8VRk8Oue1cC6ynp6S/RnfIvQp\n\t3e8H7WmQEnF5rj7Mwa28Fem0xJGllDDaAPUhU314CtiRF2MjDx4u9RxhPjBNoQ+zFB2p\n\tvPDasIIT4Tx9qJDRRgWEXEA+3iDh3ByV2QylX/NmvvSy7zNoLUezxyqeqXeesJkVccyL\n\tI/GP5c8Kp35Ht5X3CnhOiNQ4GF2/+gkwjdpgWpo/G8uIdem3ri7kclwJrYHYxxtyRy9a\n\tMt9jdB+tPJWToVx8JXKcUt7G9CJlfjjuvrBdw9f0p+1sK7MjKR8/DCkaKRb8Om3pBpDz\n\tkJzw==","X-Gm-Message-State":"AOAM530YajkcvBmrLN0LQ/k1mVyCwjmsj1Z7ePoBTrrXml+vpVQvkZEd\n\tDOqN1imp5+5HiL8SjsGYPx02COTpWnLSrMM0nZ+z0w==","X-Google-Smtp-Source":"ABdhPJwNf3yqEWvZ7bHrZYsTG5JL87J8R9YbYArKuo7z36MsDPSlkUltfNJHaVG/zTTGCuU7rxNOS0UeRlzMhZ2X0CE=","X-Received":"by 2002:a17:906:dbd8:: with SMTP id\n\tyc24mr9068703ejb.55.1624442272417; \n\tWed, 23 Jun 2021 02:57:52 -0700 (PDT)","MIME-Version":"1.0","References":"<20210621152954.40299-1-jacopo@jmondi.org>\n\t<20210621152954.40299-3-jacopo@jmondi.org>\n\t<CAO5uPHP1B=8DWZOpthDew7pjsyA_Tau6ttLAZnqmsi_wmqd=YA@mail.gmail.com>\n\t<20210622074107.6oeneo3z4yluohsz@uno.localdomain>\n\t<CAO5uPHP-KNPPFDXWph7D5xqaNt-o4TBA-Ta_ksM7LFx=uC3mtA@mail.gmail.com>\n\t<e555e2da-c347-4652-9296-1f5f7950a290@ideasonboard.com>","In-Reply-To":"<e555e2da-c347-4652-9296-1f5f7950a290@ideasonboard.com>","From":"Hirokazu Honda <hiroh@chromium.org>","Date":"Wed, 23 Jun 2021 18:57:42 +0900","Message-ID":"<CAO5uPHP=Dkkki6N1W=aPC2x6xKqUiCob=Wpo55SDvZBWpP6Y+w@mail.gmail.com>","To":"Kieran Bingham <kieran.bingham@ideasonboard.com>","Content-Type":"text/plain; charset=\"UTF-8\"","Subject":"Re: [libcamera-devel] [PATCH 2/2] android: Introduce\n\tCameraCapabilties 
class","X-BeenThere":"libcamera-devel@lists.libcamera.org","X-Mailman-Version":"2.1.29","Precedence":"list","List-Id":"<libcamera-devel.lists.libcamera.org>","List-Unsubscribe":"<https://lists.libcamera.org/options/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=unsubscribe>","List-Archive":"<https://lists.libcamera.org/pipermail/libcamera-devel/>","List-Post":"<mailto:libcamera-devel@lists.libcamera.org>","List-Help":"<mailto:libcamera-devel-request@lists.libcamera.org?subject=help>","List-Subscribe":"<https://lists.libcamera.org/listinfo/libcamera-devel>,\n\t<mailto:libcamera-devel-request@lists.libcamera.org?subject=subscribe>","Cc":"libcamera devel <libcamera-devel@lists.libcamera.org>","Errors-To":"libcamera-devel-bounces@lists.libcamera.org","Sender":"\"libcamera-devel\" <libcamera-devel-bounces@lists.libcamera.org>"}}]