Message ID: 20240805100038.11972-3-jaslo@ziska.de
State: Superseded
Hi, I started inspecting the generated documentation. It is generally hard to get a perfectly clean result, but let me comment on a few things. We can at least try some improvement. > Factory Details: > Rank primary (256) > Long-name libcamera Source > Klass Source/Video > Description Linux Camera source using libcamera > Author Nicolas Dufresne <nicolas.dufresne@collabora.com> Thanks for this one! > > Plugin Details: > Name libcamera > Description libcamera capture plugin > Filename /home/nicolas/Sources/libcamera/build/src/gstreamer/libgstlibcamera.so > Version 0.3.1+20-80a202bf > License LGPL > Source module libcamera > Binary package libcamera > Origin URL https://libcamera.org > > GObject > +----GInitiallyUnowned > +----GstObject > +----GstElement > +----GstLibcameraSrc > > Implemented Interfaces: > GstChildProxy > > Pad Templates: > SRC template: 'src' > Availability: Always > Capabilities: > video/x-raw > image/jpeg > video/x-bayer > Type: GstLibcameraPad > Pad Properties: > > stream-role : The selected stream role > flags: readable, writable, changeable only in NULL or READY state > Enum "GstLibcameraStreamRole" Default: 2, "video-recording" > (1): still-capture - libcamera::StillCapture > (2): video-recording - libcamera::VideoRecording > (3): view-finder - libcamera::Viewfinder > > > SRC template: 'src_%u' > Availability: On request > Capabilities: > video/x-raw > image/jpeg > video/x-bayer > Type: GstLibcameraPad > Pad Properties: > > stream-role : The selected stream role > flags: readable, writable, changeable only in NULL or READY state > Enum "GstLibcameraStreamRole" Default: 2, "video-recording" > (1): still-capture - libcamera::StillCapture > (2): video-recording - libcamera::VideoRecording > (3): view-finder - libcamera::Viewfinder > > > Element has no clocking capabilities. > Element has no URI handling capabilities. > > Pads: > SRC: 'src' > Pad Template: 'src' > > Element Properties: > > ae-constraint-mode : Specify a constraint mode for the AE algorithm to use. These determine how the measured scene brightness is adjusted to reach the desired target exposure. Constraint modes may be platform specific, and not all constraint modes may be supported. Inspect generally allows for pretty short descriptions, but I do like having the full description, especially since, as long as this element lives here, we are unlikely to pull in the GStreamer doc generator for it. Might be worth giving line breaks a try, though. > flags: readable, writable, controllable > Enum "AeConstraintMode" Default: 0, "normal" > (0): normal - Default constraint mode. This mode aims to balance the exposure of different parts of the image so as to reach a reasonable average level. However, highlights in the image may appear over-exposed and lowlights may appear under-exposed. > (1): highlight - Highlight constraint mode. This mode adjusts the exposure levels in order to try and avoid over-exposing the brightest parts (highlights) of an image. Other non-highlight parts of the image may appear under-exposed. > (2): shadows - Shadows constraint mode. This mode adjusts the exposure levels in order to try and avoid under-exposing the dark parts (shadows) of an image. Other normally exposed parts of the image may appear over-exposed. > (3): custom - Custom constraint mode. Same here, it would be nice to at least "try" something, but this is part of things that may or may not be easy. Everything else seems good. > > ae-enable : Enable or disable the AE. See also: exposure-time, analogue-gain. 
> flags: readable, writable, controllable > Boolean. Default: false > > ae-exposure-mode : Specify an exposure mode for the AE algorithm to use. These specify how the desired total exposure is divided between the shutter time and the sensor's analogue gain. The exposure modes are platform specific, and not all exposure modes may be supported. > flags: readable, writable, controllable > Enum "AeExposureMode" Default: 0, "normal" > (0): normal - Default exposure mode. > (1): short - Exposure mode allowing only short exposure times. > (2): long - Exposure mode allowing long exposure times. > (3): custom - Custom exposure mode. > > ae-flicker-detected : Flicker period detected in microseconds. The value reported here indicates the currently detected flicker period, or zero if no flicker at all is detected. When AeFlickerMode is set to FlickerAuto, there may be a period during which the value reported here remains zero. Once a non-zero value is reported, then this is the flicker period that has been detected and is now being cancelled. In the case of 50Hz mains flicker, the value would be 10000 (corresponding to 100Hz), or 8333 (120Hz) for 60Hz mains flicker. It is implementation dependent whether the system can continue to detect flicker of different periods when another frequency is already being cancelled. See also: ae-flicker-mode. > flags: readable, writable, controllable > Integer. Range: -2147483648 - 2147483647 Default: 0 > > ae-flicker-mode : Set the flicker mode, which determines whether, and how, the AGC/AEC algorithm attempts to hide flicker effects caused by the duty cycle of artificial lighting. Although implementation dependent, many algorithms for "flicker avoidance" work by restricting this exposure time to integer multiples of the cycle period, wherever possible. Implementations may not support all of the flicker modes listed below. By default the system will start in FlickerAuto mode if this is supported, otherwise the flicker mode will be set to FlickerOff. > flags: readable, writable, controllable > Enum "AeFlickerMode" Default: 0, "off" Is that accurate or accidental default ? > (0): off - No flicker avoidance is performed. > (1): manual - Manual flicker avoidance. Suppress flicker effects caused by lighting running with a period specified by the AeFlickerPeriod control. See also: ae-flicker-period. > (2): auto - Automatic flicker period detection and avoidance. The system will automatically determine the most likely value of flicker period, and avoid flicker of this frequency. Once flicker is being corrected, it is implementation dependent whether the system is still able to detect a change in the flicker period. See also: ae-flicker-detected. > > ae-flicker-period : Manual flicker period in microseconds. This value sets the current flicker period to avoid. It is used when AeFlickerMode is set to FlickerManual. To cancel 50Hz mains flicker, this should be set to 10000 (corresponding to 100Hz), or 8333 (120Hz) for 60Hz mains. Setting the mode to FlickerManual when no AeFlickerPeriod has ever been set means that no flicker cancellation occurs (until the value of this control is updated). Switching to modes other than FlickerManual has no effect on the value of the AeFlickerPeriod control. See also: ae-flicker-mode. > flags: readable, writable, controllable > Integer. Range: -2147483648 - 2147483647 Default: 0 > > ae-locked : Report the lock status of a running AE algorithm. 
If the AE algorithm is locked the value shall be set to true, if it's converging it shall be set to false. If the AE algorithm is not running the control shall not be present in the metadata control list. See also: ae-enable. > flags: readable, writable, controllable > Boolean. Default: false As this one is a status, it should only be readable, without the writable/controllable. If I'm not mistaken, you get this value within the finalized request; it would be a nice enhancement to notify our users of the change using g_notify() and/or bus messages. > > ae-metering-mode : Specify a metering mode for the AE algorithm to use. The metering modes determine which parts of the image are used to determine the scene brightness. Metering modes may be platform specific and not all metering modes may be supported. > flags: readable, writable, controllable > Enum "AeMeteringMode" Default: 0, "centre-weighted" > (0): centre-weighted - Centre-weighted metering mode. > (1): spot - Spot metering mode. > (2): matrix - Matrix metering mode. > (3): custom - Custom metering mode. From the doc, this seems pretty obscure. Shouldn't "custom" come with another control ? Perhaps we could introduce a mask to hide the controls that are not yet usable, or whose usage isn't obvious and may need special casing or rework ? > > af-metering : Instruct the AF algorithm how it should decide which parts of the image should be used to measure focus. > flags: readable, writable, controllable > Enum "AfMetering" Default: 0, "auto" > (0): auto - The AF algorithm should decide for itself where it will measure focus. > (1): windows - The AF algorithm should use the rectangles defined by the AfWindows control to measure focus. If no windows are specified the behaviour is platform dependent. > > af-mode : Control to set the mode of the AF (autofocus) algorithm. An implementation may choose not to implement all the modes. > flags: readable, writable, controllable > Enum "AfMode" Default: 0, "manual" Is that an appropriate default ? Of course it does not matter if there is no focus on the target. I wonder how you get to know if focus is there or not ... > (0): manual - The AF algorithm is in manual mode. In this mode it will never perform any action nor move the lens of its own accord, but an application can specify the desired lens position using the LensPosition control. In this mode the AfState will always report AfStateIdle. If the camera is started in AfModeManual, it will move the focus lens to the position specified by the LensPosition control. This mode is the recommended default value for the AfMode control. External cameras (as reported by the Location property set to CameraLocationExternal) may use a different default value. > (1): auto - The AF algorithm is in auto mode. This means that the algorithm will never move the lens or change state unless the AfTrigger control is used. The AfTrigger control can be used to initiate a focus scan, the results of which will be reported by AfState. If the autofocus algorithm is moved from AfModeAuto to another mode while a scan is in progress, the scan is cancelled immediately, without waiting for the scan to finish. When first entering this mode the AfState will report AfStateIdle. When a trigger control is sent, AfState will report AfStateScanning for a period before spontaneously changing to AfStateFocused or AfStateFailed, depending on the outcome of the scan. It will remain in this state until another scan is initiated by the AfTrigger control. 
If a scan is cancelled (without changing to another mode), AfState will return to AfStateIdle. > (2): continuous - The AF algorithm is in continuous mode. This means that the lens can re-start a scan spontaneously at any moment, without any user intervention. The AfState still reports whether the algorithm is currently scanning or not, though the application has no ability to initiate or cancel scans, nor to move the lens for itself. However, applications can pause the AF algorithm from continuously scanning by using the AfPause control. This allows video or still images to be captured whilst guaranteeing that the focus is fixed. When set to AfModeContinuous, the system will immediately initiate a scan so AfState will report AfStateScanning, and will settle on one of AfStateFocused or AfStateFailed, depending on the scan result. > > af-pause : This control has no effect except when in continuous autofocus mode (AfModeContinuous). It can be used to pause any lens movements while (for example) images are captured. The algorithm remains inactive until it is instructed to resume. > flags: readable, writable, controllable > Enum "AfPause" Default: 0, "immediate" > (0): immediate - Pause the continuous autofocus algorithm immediately, whether or not any kind of scan is underway. AfPauseState will subsequently report AfPauseStatePaused. AfState may report any of AfStateScanning, AfStateFocused or AfStateFailed, depending on the algorithm's state when it received this control. > (1): deferred - This is similar to AfPauseImmediate, and if the AfState is currently reporting AfStateFocused or AfStateFailed it will remain in that state and AfPauseState will report AfPauseStatePaused. However, if the algorithm is scanning (AfStateScanning), AfPauseState will report AfPauseStatePausing until the scan is finished, at which point AfState will report one of AfStateFocused or AfStateFailed, and AfPauseState will change to AfPauseStatePaused. > (2): resume - Resume continuous autofocus operation. The algorithm starts again from exactly where it left off, and AfPauseState will report AfPauseStateRunning. > > af-pause-state : Only applicable in continuous (AfModeContinuous) mode, this reports whether the algorithm is currently running, paused or pausing (that is, will pause as soon as any in-progress scan completes). Any change to AfMode will cause AfPauseStateRunning to be reported. > flags: readable, writable, controllable Does not seem writable. > Enum "AfPauseState" Default: 0, "running" > (0): running - Continuous AF is running and the algorithm may restart a scan spontaneously. > (1): pausing - Continuous AF has been sent an AfPauseDeferred control, and will pause as soon as any in-progress scan completes (and then report AfPauseStatePaused). No new scans will be start spontaneously until the AfPauseResume control is sent. > (2): paused - Continuous AF is paused. No further state changes or lens movements will occur until the AfPauseResume control is sent. > > af-range : Control to set the range of focus distances that is scanned. An implementation may choose not to implement all the options here. > flags: readable, writable, controllable > Enum "AfRange" Default: 0, "normal" > (0): normal - A wide range of focus distances is scanned, all the way from infinity down to close distances, though depending on the implementation, possibly not including the very closest macro positions. > (1): macro - Only close distances are scanned. 
> (2): full - The full range of focus distances is scanned just as with AfRangeNormal but this time including the very closest macro positions. > > af-speed : Control that determines whether the AF algorithm is to move the lens as quickly as possible or more steadily. For example, during video recording it may be desirable not to move the lens too abruptly, but when in a preview mode (waiting for a still capture) it may be helpful to move the lens as quickly as is reasonably possible. > flags: readable, writable, controllable > Enum "AfSpeed" Default: 0, "normal" > (0): normal - Move the lens at its usual speed. > (1): fast - Move the lens more quickly. > > af-state : Reports the current state of the AF algorithm in conjunction with the reported AfMode value and (in continuous AF mode) the AfPauseState value. The possible state changes are described below, though we note the following state transitions that occur when the AfMode is changed. If the AfMode is set to AfModeManual, then the AfState will always report AfStateIdle (even if the lens is subsequently moved). Changing to the AfModeManual state does not initiate any lens movement. If the AfMode is set to AfModeAuto then the AfState will report AfStateIdle. However, if AfModeAuto and AfTriggerStart are sent together then AfState will omit AfStateIdle and move straight to AfStateScanning (and start a scan). If the AfMode is set to AfModeContinuous then the AfState will initially report AfStateScanning. > flags: readable, writable, controllable Seems read-only. > Enum "AfState" Default: 0, "idle" > (0): idle - The AF algorithm is in manual mode (AfModeManual) or in auto mode (AfModeAuto) and a scan has not yet been triggered, or an in-progress scan was cancelled. > (1): scanning - The AF algorithm is in auto mode (AfModeAuto), and a scan has been started using the AfTrigger control. The scan can be cancelled by sending AfTriggerCancel at which point the algorithm will either move back to AfStateIdle or, if the scan actually completes before the cancel request is processed, to one of AfStateFocused or AfStateFailed. Alternatively the AF algorithm could be in continuous mode (AfModeContinuous) at which point it may enter this state spontaneously whenever it determines that a rescan is needed. > (2): focused - The AF algorithm is in auto (AfModeAuto) or continuous (AfModeContinuous) mode and a scan has completed with the result that the algorithm believes the image is now in focus. > (3): failed - The AF algorithm is in auto (AfModeAuto) or continuous (AfModeContinuous) mode and a scan has completed with the result that the algorithm did not find a good focus position. > > af-trigger : This control starts an autofocus scan when AfMode is set to AfModeAuto, and can also be used to terminate a scan early. It is ignored if AfMode is set to AfModeManual or AfModeContinuous. > flags: readable, writable, controllable > Enum "AfTrigger" Default: 0, "start" > (0): start - Start an AF scan. Ignored if a scan is in progress. > (1): cancel - Cancel an AF scan. This does not cause the lens to move anywhere else. Ignored if no scan is in progress. This one is missing something to make sense, at least from a GStreamer point of view. When you are not starting or cancelling, what value should this control change to ? That question might be relevant for libcamera globally. Without this third state, I would say this one should not be there, and be replaced with action signals. > > analogue-gain : Analogue gain value applied in the sensor device. 
The value of the control specifies the gain multiplier applied to all colour channels. This value cannot be lower than 1.0. Setting this value means that it is now fixed and the AE algorithm may not change it. Setting it back to zero returns it to the control of the AE algorithm. See also: exposure-time, ae-enable. Todo: Document the interactions between AeEnable and setting a fixed value for this control. Consider interactions with other AE features, such as aperture and aperture/shutter priority mode, and decide if control of which features should be automatically adjusted shouldn't better be handled through a separate AE mode control. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 I'm not confident there is no range; it would be nice if we could align to something sensible (might need yaml changes of course). > > awb-enable : Enable or disable the AWB. See also: colour-gains. > flags: readable, writable, controllable > Boolean. Default: false Should be on by default (if not, we should do that in gst ;-P). > > awb-locked : Report the lock status of a running AWB algorithm. If the AWB algorithm is locked the value shall be set to true, if it's converging it shall be set to false. If the AWB algorithm is not running the control shall not be present in the metadata control list. See also: awb-enable. > flags: readable, writable, controllable > Boolean. Default: false Read-only. > > awb-mode : Specify the range of illuminants to use for the AWB algorithm. The modes supported are platform specific, and not all modes may be supported. > flags: readable, writable, controllable > Enum "AwbMode" Default: 0, "auto" > (0): auto - Search over the whole colour temperature range. > (1): incandescent - Incandescent AWB lamp mode. > (2): tungsten - Tungsten AWB lamp mode. > (3): fluorescent - Fluorescent AWB lamp mode. > (4): indoor - Indoor AWB lighting mode. > (5): daylight - Daylight AWB lighting mode. > (6): cloudy - Cloudy AWB lighting mode. > (7): custom - Custom AWB mode. Again, unclear how custom can be used. > > brightness : Specify a fixed brightness parameter. Positive values (up to 1.0) produce brighter images; negative values (up to -1.0) produce darker images and 0.0 leaves pixels unchanged. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 So this time the range is documented; we need a machine-readable form. > > camera-name : Select by name which camera to use. > flags: readable, writable, changeable only in NULL or READY state > String. Default: null > > colour-correction-matrix: The 3x3 matrix that converts camera RGB to sRGB within the imaging pipeline. This should describe the matrix that is used after pixels have been white-balanced, but before any gamma transformation. The 3x3 matrix is stored in conventional reading order in an array of 9 floating point values. > flags: readable, writable, controllable > Default: "< >" > GstValueArray of GValues of type "gfloat" The type only allows a flat list, but as it's a matrix I was expecting "< < 1, 2, 3 >, < 4, 5, 6 >, < 7, 8, 9 > >" That being said, I would be fine with flat raster data: "< 1, 2, 3, 4, 5, 6, 7, 8, 9 >" The downside is for bindings. Consider Python: the first would create a collection that you can access with matrix[Y][X] and, more importantly, can store into numpy.matrix. A comment for all the "< >" list syntax: the usage and serialization syntax is not easy to guess. 
For matrix, if we had the identity matrix as default instead of an empty list, that would help. Otherwise we need a way to extend the doc for GStreamer usage. > > colour-gains : Pair of gain values for the Red and Blue colour channels, in that order. ColourGains can only be applied in a Request when the AWB is disabled. See also: awb-enable. > flags: readable, writable, controllable > Default: "< >" > GstValueArray of GValues of type "gfloat" Again the type allow for flat list, but I think we have "< < r1, r2 >, < b1, b2 > >" ? > > colour-temperature : Report the current estimate of the colour temperature, in kelvin, for this frame. The ColourTemperature control can only be returned in metadata. > flags: readable, writable, controllable > Integer. Range: -2147483648 - 2147483647 Default: 0 read-only. > > contrast : Specify a fixed contrast parameter. Normal contrast is given by the value 1.0; larger values produce images with more contrast. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 Range needed programmatically. > > digital-gain : Digital gain value applied during the processing steps applied to the image as captured from the sensor. The global digital gain factor is applied to all the colour channels of the RAW image. Different pipeline models are free to specify how the global gain factor applies to each separate channel. If an imaging pipeline applies digital gain in distinct processing steps, this value indicates their total sum. Pipelines are free to decide how to adjust each processing step to respect the received gain factor and shall report their total value in the request metadata. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > draft-ae-precapture-trigger: Control for AE metering trigger. Currently identical to ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER. Whether the camera device will trigger a precapture metering sequence when it processes this request. > flags: readable, writable, controllable > Enum "AePrecaptureTrigger" Default: 0, "idle" > (0): idle - The trigger is idle. > (1): start - The pre-capture AE metering is started by the camera. > (2): cancel - The camera will cancel any active or completed metering sequence. The AE algorithm is reset to its initial state. Perhaps we can skip over draft- as this could later break our interface. We can also add an env to include them ? > > draft-ae-state : Control to report the current AE algorithm state. Currently identical to ANDROID_CONTROL_AE_STATE. Current state of the AE algorithm. > flags: readable, writable, controllable > Enum "AeState" Default: 0, "inactive" > (0): inactive - The AE algorithm is inactive. > (1): searching - The AE algorithm has not converged yet. > (2): converged - The AE algorithm has converged. > (3): locked - The AE algorithm is locked. > (4): flash-required - The AE algorithm would need a flash for good results > (5): precapture - The AE algorithm has started a pre-capture metering session. See also: ae-precapture-trigger. > > draft-awb-state : Control to report the current AWB algorithm state. Currently identical to ANDROID_CONTROL_AWB_STATE. Current state of the AWB algorithm. > flags: readable, writable, controllable > Enum "AwbState" Default: 0, "state-inactive" > (0): state-inactive - The AWB algorithm is inactive. > (1): state-searching - The AWB algorithm has not converged yet. > (2): converged - The AWB algorithm has converged. > (3): locked - The AWB algorithm is locked. 
> > draft-color-correction-aberration-mode: Control to select the color correction aberration mode. Currently identical to ANDROID_COLOR_CORRECTION_ABERRATION_MODE. Mode of operation for the chromatic aberration correction algorithm. > flags: readable, writable, controllable > Enum "ColorCorrectionAberrationMode" Default: 0, "off" > (0): off - No aberration correction is applied. > (1): fast - Aberration correction will not slow down the frame rate. > (2): high-quality - High quality aberration correction which might reduce the frame rate. > > draft-lens-shading-map-mode: Control to report if the lens shading map is available. Currently identical to ANDROID_STATISTICS_LENS_SHADING_MAP_MODE. > flags: readable, writable, controllable > Enum "LensShadingMapMode" Default: 0, "ff" > (0): ff - No lens shading map mode is available. > (1): n - The lens shading map mode is available. > > draft-max-latency : The maximum number of frames that can occur after a request (different than the previous) has been submitted, and before the result's state becomes synchronized. A value of -1 indicates unknown latency, and 0 indicates per-frame control. Currently identical to ANDROID_SYNC_MAX_LATENCY. > flags: readable, writable, controllable > Integer. Range: -2147483648 - 2147483647 Default: 0 > > draft-noise-reduction-mode: Control to select the noise reduction algorithm mode. Currently identical to ANDROID_NOISE_REDUCTION_MODE. Mode of operation for the noise reduction algorithm. > flags: readable, writable, controllable > Enum "NoiseReductionMode" Default: 0, "off" > (0): off - No noise reduction is applied > (1): fast - Noise reduction is applied without reducing the frame rate. > (2): high-quality - High quality noise reduction at the expense of frame rate. > (3): minimal - Minimal noise reduction is applied without reducing the frame rate. > (4): z-s-l - Noise reduction is applied at different levels to different streams. > > draft-pipeline-depth: Specifies the number of pipeline stages the frame went through from when it was exposed to when the final completed result was available to the framework. Always less than or equal to PipelineMaxDepth. Currently identical to ANDROID_REQUEST_PIPELINE_DEPTH. The typical value for this control is 3 as a frame is first exposed, captured and then processed in a single pass through the ISP. Any additional processing step performed after the ISP pass (in example face detection, additional format conversions etc) count as an additional pipeline stage. > flags: readable, writable, controllable > Integer. Range: -2147483648 - 2147483647 Default: 0 > > draft-sensor-rolling-shutter-skew: Control to report the time between the start of exposure of the first row and the start of exposure of the last row. Currently identical to ANDROID_SENSOR_ROLLING_SHUTTER_SKEW > flags: readable, writable, controllable > Integer64. Range: -9223372036854775808 - 9223372036854775807 Default: 0 > > draft-test-pattern-mode: Control to select the test pattern mode. Currently identical to ANDROID_SENSOR_TEST_PATTERN_MODE. > flags: readable, writable, controllable > Enum "TestPatternMode" Default: 0, "off" > (0): off - No test pattern mode is used. The camera device returns frames from the image sensor. > (1): solid-color - Each pixel in [R, G_even, G_odd, B] is replaced by its respective color channel provided in test pattern data. Todo: Add control for test pattern data. > (2): color-bars - All pixel data is replaced with an 8-bar color pattern. 
The vertical bars (left-to-right) are as follows; white, yellow, cyan, green, magenta, red, blue and black. Each bar should take up 1/8 of the sensor pixel array width. When this is not possible, the bar size should be rounded down to the nearest integer and the pattern can repeat on the right side. Each bar's height must always take up the full sensor pixel array height. > (3): color-bars-fade-to-gray - The test pattern is similar to TestPatternModeColorBars, except that each bar should start at its specified color at the top and fade to gray at the bottom. Furthermore each bar is further subdevided into a left and right half. The left half should have a smooth gradient, and the right half should have a quantized gradient. In particular, the right half's should consist of blocks of the same color for 1/16th active sensor pixel array width. The least significant bits in the quantized gradient should be copied from the most significant bits of the smooth gradient. The height of each bar should always be a multiple of 128. When this is not the case, the pattern should repeat at the bottom of the image. > (4): pn9 - All pixel data is replaced by a pseudo-random sequence generated from a PN9 512-bit sequence (typically implemented in hardware with a linear feedback shift register). The generator should be reset at the beginning of each frame, and thus each subsequent raw frame with this test pattern should be exactly the same as the last. > (256): custom1 - The first custom test pattern. All custom patterns that are available only on this camera device are at least this numeric value. All of the custom test patterns will be static (that is the raw image must not vary from frame to frame). > > exposure-time : Exposure time (shutter speed) for the frame applied in the sensor device. This value is specified in micro-seconds. Setting this value means that it is now fixed and the AE algorithm may not change it. Setting it back to zero returns it to the control of the AE algorithm. See also: analogue-gain, ae-enable. Todo: Document the interactions between AeEnable and setting a fixed value for this control. Consider interactions with other AE features, such as aperture and aperture/shutter priority mode, and decide if control of which features should be automatically adjusted shouldn't better be handled through a separate AE mode control. > flags: readable, writable, controllable > Integer. Range: -2147483648 - 2147483647 Default: 0 > > exposure-value : Specify an Exposure Value (EV) parameter. The EV parameter will only be applied if the AE algorithm is currently enabled. By convention EV adjusts the exposure as log2. For example EV = [-2, -1, 0.5, 0, 0.5, 1, 2] results in an exposure adjustment of [1/4x, 1/2x, 1/sqrt(2)x, 1x, sqrt(2)x, 2x, 4x]. See also: ae-enable. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > focus-fo-m : Reports a Figure of Merit (FoM) to indicate how in-focus the frame is. A larger FocusFoM value indicates a more in-focus frame. This singular value may be based on a combination of statistics gathered from multiple focus regions within an image. The number of focus regions and method of combination is platform dependent. In this respect, it is not necessarily aimed at providing a way to implement a focus algorithm by the application, rather an indication of how in-focus a frame is. > flags: readable, writable, controllable > Integer. 
Range: -2147483648 - 2147483647 Default: 0 > > frame-duration : The instantaneous frame duration from start of frame exposure to start of next exposure, expressed in microseconds. This control is meant to be returned in metadata. > flags: readable, writable, controllable > Integer64. Range: -9223372036854775808 - 9223372036854775807 Default: 0 > > frame-duration-limits: The minimum and maximum (in that order) frame duration, expressed in microseconds. When provided by applications, the control specifies the sensor frame duration interval the pipeline has to use. This limits the largest exposure time the sensor can use. For example, if a maximum frame duration of 33ms is requested (corresponding to 30 frames per second), the sensor will not be able to raise the exposure time above 33ms. A fixed frame duration is achieved by setting the minimum and maximum values to be the same. Setting both values to 0 reverts to using the camera defaults. The maximum frame duration provides the absolute limit to the shutter speed computed by the AE algorithm and it overrides any exposure mode setting specified with controls::AeExposureMode. Similarly, when a manual exposure time is set through controls::ExposureTime, it also gets clipped to the limits set by this control. When reported in metadata, the control expresses the minimum and maximum frame durations used after being clipped to the sensor provided frame duration limits. See also: ae-exposure-mode. See also: exposure-time. Todo: Define how to calculate the capture frame rate by defining controls to report additional delays introduced by the capture pipeline or post-processing stages (ie JPEG conversion, frame scaling). Todo: Provide an explicit definition of default control values, for this and all other controls. > flags: readable, writable, controllable > Default: "< >" > GstValueArray of GValues of type "gint64" > > gamma : Specify a fixed gamma value. Default must be 2.2 which closely mimics sRGB gamma. Note that this is camera gamma, so it is applied as 1.0/gamma. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > hdr-channel : This value is reported back to the application so that it can discover whether this capture corresponds to the short or long exposure image (or any other image used by the HDR procedure). An application can monitor the HDR channel to discover when the differently exposed images have arrived. This metadata is only available when an HDR mode has been enabled. See also: hdr-mode. > flags: readable, writable, controllable > Enum "HdrChannel" Default: 0, "none" > (0): none - This image does not correspond to any of the captures used to create an HDR image. > (1): short - This is a short exposure image. > (2): medium - This is a medium exposure image. > (3): long - This is a long exposure image. > > hdr-mode : Control to set the mode to be used for High Dynamic Range (HDR) imaging. HDR techniques typically include multiple exposure, image fusion and tone mapping techniques to improve the dynamic range of the resulting images. When using an HDR mode, images are captured with different sets of AGC settings called HDR channels. Channels indicate in particular the type of exposure (short, medium or long) used to capture the raw image, before fusion. Each HDR image is tagged with the corresponding channel using the HdrChannel control. See also: hdr-channel. > flags: readable, writable, controllable > Enum "HdrMode" Default: 0, "off" > (0): off - HDR is disabled. 
Metadata for this frame will not include the HdrChannel control. > (1): multi-exposure-unmerged - Multiple exposures will be generated in an alternating fashion. However, they will not be merged together and will be returned to the application as they are. Each image will be tagged with the correct HDR channel, indicating what kind of exposure it is. The tag should be the same as in the HdrModeMultiExposure case. The expectation is that an application using this mode would merge the frames to create HDR images for itself if it requires them. > (2): multi-exposure - Multiple exposures will be generated and merged to create HDR images. Each image will be tagged with the HDR channel (long, medium or short) that arrived and which caused this image to be output. Systems that use two channels for HDR will return images tagged alternately as the short and long channel. Systems that use three channels for HDR will cycle through the short, medium and long channel before repeating. > (3): single-exposure - Multiple frames all at a single exposure will be used to create HDR images. These images should be reported as all corresponding to the HDR short channel. > (4): night - Multiple frames will be combined to produce "night mode" images. It is up to the implementation exactly which HDR channels it uses, and the images will all be tagged accordingly with the correct HDR channel information. > > lens-position : Acts as a control to instruct the lens to move to a particular position and also reports back the position of the lens for each frame. The LensPosition control is ignored unless the AfMode is set to AfModeManual, though the value is reported back unconditionally in all modes. This value, which is generally a non-integer, is the reciprocal of the focal distance in metres, also known as dioptres. That is, to set a focal distance D, the lens position LP is given by \f$LP = \frac{1\mathrm{m}}{D}\f$ For example: 0 moves the lens to infinity. 0.5 moves the lens to focus on objects 2m away. 2 moves the lens to focus on objects 50cm away. And larger values will focus the lens closer. The default value of the control should indicate a good general position for the lens, often corresponding to the hyperfocal distance (the closest position for which objects at infinity are still acceptably sharp). The minimum will often be zero (meaning infinity), and the maximum value defines the closest focus position. Todo: Define a property to report the Hyperfocal distance of calibrated lenses. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > lux : Report an estimate of the current illuminance level in lux. The Lux control can only be returned in metadata. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > name : The name of the object > flags: readable, writable > String. Default: "libcamerasrc0" > > parent : The parent of the object > flags: readable, writable > Object of type "GstObject" > > saturation : Specify a fixed saturation parameter. Normal saturation is given by the value 1.0; larger values produce more saturated colours; 0.0 produces a greyscale image. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > sensor-black-levels : Reports the sensor black levels used for processing a frame, in the order R, Gr, Gb, B. These values are returned as numbers out of a 16-bit pixel range (as if pixels ranged from 0 to 65535). 
The SensorBlackLevels control can only be returned in metadata. > flags: readable, writable, controllable > Default: "< >" > GstValueArray of GValues of type "gint" > > sensor-temperature : Temperature measure from the camera sensor in Celsius. This is typically obtained by a thermal sensor present on-die or in the camera module. The range of reported temperatures is device dependent. The SensorTemperature control will only be returned in metadata if a thermal sensor is present. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > sensor-timestamp : The time when the first row of the image sensor active array is exposed. The timestamp, expressed in nanoseconds, represents a monotonically increasing counter since the system boot time, as defined by the Linux-specific CLOCK_BOOTTIME clock id. The SensorTimestamp control can only be returned in metadata. Todo: Define how the sensor timestamp has to be used in the reprocessing use case. > flags: readable, writable, controllable > Integer64. Range: -9223372036854775808 - 9223372036854775807 Default: 0 > > sharpness : A value of 0.0 means no sharpening. The minimum value means minimal sharpening, and shall be 0.0 unless the camera can't disable sharpening completely. The default value shall give a "reasonable" level of sharpening, suitable for most use cases. The maximum value may apply extremely high levels of sharpening, higher than anyone could reasonably want. Negative values are not allowed. Note also that sharpening is not applied to raw streams. > flags: readable, writable, controllable > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 I think my comments are beginning to be repetitive, and if we solve the earlier issues we will solve them all. The main thing is that the yaml should explicitly define more stuff rather than relying on text. And using that, we can easily adjust the GStreamer properties. A final note: not all of these will be supported. I was wondering if we should add an extra property that tells us which ones are actually supported ? We don't need anything fancy, a "/"-separated list in a string or similar could do. We can also go for a GstStructure. Nicolas
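
For reference, the set of controls a given camera actually supports can already be enumerated at runtime through libcamera's ControlInfoMap, which is the information such a "supported controls" property (or GstStructure) would presumably be generated from. A minimal sketch, not part of this series, assuming the public libcamera C++ API and simply using the first enumerated camera:

/*
 * Sketch: list the controls a camera actually supports at runtime, with
 * their runtime minimum/maximum/default values.
 */
#include <iostream>
#include <memory>

#include <libcamera/libcamera.h>

int main()
{
        libcamera::CameraManager cm;
        cm.start();

        if (cm.cameras().empty()) {
                std::cerr << "No camera found" << std::endl;
                return 1;
        }

        std::shared_ptr<libcamera::Camera> camera = cm.cameras()[0];

        /*
         * Camera::controls() only contains the controls the pipeline
         * handler implements for this device.
         */
        for (const auto &[id, info] : camera->controls())
                std::cout << id->name() << ": " << info.toString() << std::endl;

        cm.stop();
        return 0;
}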
Hi Nicolas, Jaslo, Adding comments where I think I can add helpful (I hope) feedback. Jaslo - Thanks again for starting this work - It's certainly worthwhile - even if there might be a few challenges below! And I've already seen one person on IRC who was able to solve their issues by using this series! \o/ Quoting Nicolas Dufresne (2024-08-07 20:13:08) > Hi, > > I started inspecting the generated documentation. It is generally hard to get a > perfectly clean results, but let me comment few things. We can at least try some > improvement. > <snip> > > ae-flicker-period : Manual flicker period in microseconds. This value sets the current flicker period to avoid. It is used when AeFlickerMode is set to FlickerManual. To cancel 50Hz mains flicker, this should be set to 10000 (corresponding to 100Hz), or 8333 (120Hz) for 60Hz mains. Setting the mode to FlickerManual when no AeFlickerPeriod has ever been set means that no flicker cancellation occurs (until the value of this control is updated). Switching to modes other than FlickerManual has no effect on the value of the AeFlickerPeriod control. See also: ae-flicker-mode. > > flags: readable, writable, controllable > > Integer. Range: -2147483648 - 2147483647 Default: 0 > > > > ae-locked : Report the lock status of a running AE algorithm. If the AE algorithm is locked the value shall be set to true, if it's converging it shall be set to false. If the AE algorithm is not running the control shall not be present in the metadata control list. See also: ae-enable. > > flags: readable, writable, controllable > > Boolean. Default: false > > As this one is a status, it should only be readable, without the writable/controllable. If I'm not mistaken, you get this value withing the finalized request, would be nice enhancement to notify our users of the change using g_notify() and/or bus messages. This comes down to the fact that readable controls / properties are only reported in metadata and not exposed as a control by the camera... > > > > ae-metering-mode : Specify a metering mode for the AE algorithm to use. The metering modes determine which parts of the image are used to determine the scene brightness. Metering modes may be platform specific and not all metering modes may be supported. > > flags: readable, writable, controllable > > Enum "AeMeteringMode" Default: 0, "centre-weighted" > > (0): centre-weighted - Centre-weighted metering mode. > > (1): spot - Spot metering mode. > > (2): matrix - Matrix metering mode. > > (3): custom - Custom metering mode. > > From the doc, this seems pretty obscure. Shouldn't "custom" comes with another control ? Perhaps we introduce a mask to hide the controls that are not yet usable or that the usage isn't obvious and may need special case or rework ? > > > > > af-metering : Instruct the AF algorithm how it should decide which parts of the image should be used to measure focus. > > flags: readable, writable, controllable > > Enum "AfMetering" Default: 0, "auto" > > (0): auto - The AF algorithm should decide for itself where it will measure focus. > > (1): windows - The AF algorithm should use the rectangles defined by the AfWindows control to measure focus. If no windows are specified the behaviour is platform dependent. > > > > af-mode : Control to set the mode of the AF (autofocus) algorithm. An implementation may choose not to implement all the modes. > > flags: readable, writable, controllable > > Enum "AfMode" Default: 0, "manual" > > Is that appropriate default ? 
Of course does not matter if there is no focus on the target. I wonder how you get to know if focus is there or not ... Cameras without autofocus should not expose the controls, as they do not 'exist' and they will not be handled... However - that doesn't fit in this GStreamer model where all controls are going to be 'always' exposed. (Even if they are actually 'metadata only' as well). It's up to the Camera at runtime to report what the pipeline handler will accept based on the underlying hardware capabilities or configuration. In fact, setting them would likely / potentially cause an assertion? (or at least a failed request - potentially causing a frame drop... ?). <snip> > > Enum "AfState" Default: 0, "idle" > > (0): idle - The AF algorithm is in manual mode (AfModeManual) or in auto mode (AfModeAuto) and a scan has not yet been triggered, or an in-progress scan was cancelled. > > (1): scanning - The AF algorithm is in auto mode (AfModeAuto), and a scan has been started using the AfTrigger control. The scan can be cancelled by sending AfTriggerCancel at which point the algorithm will either move back to AfStateIdle or, if the scan actually completes before the cancel request is processed, to one of AfStateFocused or AfStateFailed. Alternatively the AF algorithm could be in continuous mode (AfModeContinuous) at which point it may enter this state spontaneously whenever it determines that a rescan is needed. > > (2): focused - The AF algorithm is in auto (AfModeAuto) or continuous (AfModeContinuous) mode and a scan has completed with the result that the algorithm believes the image is now in focus. > > (3): failed - The AF algorithm is in auto (AfModeAuto) or continuous (AfModeContinuous) mode and a scan has completed with the result that the algorithm did not find a good focus position. > > > > af-trigger : This control starts an autofocus scan when AfMode is set to AfModeAuto, and can also be used to terminate a scan early. It is ignored if AfMode is set to AfModeManual or AfModeContinuous. > > flags: readable, writable, controllable > > Enum "AfTrigger" Default: 0, "start" > > (0): start - Start an AF scan. Ignored if a scan is in progress. > > (1): cancel - Cancel an AF scan. This does not cause the lens to move anywhere else. Ignored if no scan is in progress. > > This one is missing something to make sense, at least from GStreamer point of view. When you are not starting or cancelling, what value this control should change to ? That question might be relevant for libcamera globally. > > Without this third state, I would say this one should not be there, and replace with action signals. In libcamera - controls only 'exist' in a single request. What is an 'action signal' in GStreamer? So - AfTrigger for start and cancel are, in this case, 'events' - "Start an AF scan at this request" - "Cancel any active scan" There are no 'events' in between. Controls are only 'set' in Requests. Any 'response' is only available in the metadata from a completed Request. You can not 'read' the state of a control in the same sense as a V4L2 control. (Maybe this should change for some use cases, or for an ability to set controls directly on a camera, which I believe Stefan has explored, and not through a request, but it's the current model). > > analogue-gain : Analogue gain value applied in the sensor device. The value of the control specifies the gain multiplier applied to all colour channels. This value cannot be lower than 1.0. 
Setting this value means that it is now fixed and the AE algorithm may not change it. Setting it back to zero returns it to the control of the AE algorithm. See also: exposure-time, ae-enable. Todo: Document the interactions between AeEnable and setting a fixed value for this control. Consider interactions with other AE features, such as aperture and aperture/shutter priority mode, and decide if control of which features should be automatically adjusted shouldn't better be handled through a separate AE mode control. > > flags: readable, writable, controllable > > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > I'm not confident there is no range, would be nice if we can align to something sensible (might need yaml changes of course). Different cameras will have different ranges ... only determinable at runtime. We can't specify this in YAML. Though - it should always be positive at least! (and as you've mentioned elsewhere, if it was machine readable to say always above 1.0 would be beneficial too). > > awb-enable : Enable or disable the AWB. See also: colour-gains. > > flags: readable, writable, controllable > > Boolean. Default: false > > Should be on by default (if not, we should do that in gst ;-P). Maybe that's a GST decision. It is likely use case dependent. But the same could be said for either default ... (and 'on' seems more likely to be useful than 'off' in the majority case?) > > awb-locked : Report the lock status of a running AWB algorithm. If the AWB algorithm is locked the value shall be set to true, if it's converging it shall be set to false. If the AWB algorithm is not running the control shall not be present in the metadata control list. See also: awb-enable. > > flags: readable, writable, controllable > > Boolean. Default: false > > Read-only. Reported through metadata() same for all the other Read-only 'control ids' I suspect. > > > > > awb-mode : Specify the range of illuminants to use for the AWB algorithm. The modes supported are platform specific, and not all modes may be supported. > > flags: readable, writable, controllable > > Enum "AwbMode" Default: 0, "auto" > > (0): auto - Search over the whole colour temperature range. > > (1): incandescent - Incandescent AWB lamp mode. > > (2): tungsten - Tungsten AWB lamp mode. > > (3): fluorescent - Fluorescent AWB lamp mode. > > (4): indoor - Indoor AWB lighting mode. > > (5): daylight - Daylight AWB lighting mode. > > (6): cloudy - Cloudy AWB lighting mode. > > (7): custom - Custom AWB mode. > > Again, unclear how custom can be used. I think/suspect this was added by Raspberry Pi. We should certainly revisit this I think. > > brightness : Specify a fixed brightness parameter. Positive values (up to 1.0) produce brighter images; negative values (up to -1.0) produce darker images and 0.0 leaves pixels unchanged. > > flags: readable, writable, controllable > > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > So this time the range is documented, we need a machine readable form. Indeed. > > > > camera-name : Select by name which camera to use. > > flags: readable, writable, changeable only in NULL or READY state > > String. Default: null > > > > colour-correction-matrix: The 3x3 matrix that converts camera RGB to sRGB within the imaging pipeline. This should describe the matrix that is used after pixels have been white-balanced, but before any gamma transformation. The 3x3 matrix is stored in conventional reading order in an array of 9 floating point values. 
> > flags: readable, writable, controllable > > Default: "< >" > > GstValueArray of GValues of type "gfloat" > > The type only allow flat list, but as its a matrix I was expecting > > "< < 1, 2, 3 >, > < 4, 5, 6 >, > < 7, 8, 9 > >" > > That being said I would be fine with flat raster data: > > "< 1, 2, 3, > 4, 5, 6, > 7, 8, 9 >" > > The down side is for bindings. Consider Python, the first would create a collection that you can access with matrix[Y][X] and more importantly can store into numpy.matrix. > > A comment for all list "< >" syntax. The usage and serialization syntax is not easy to guess. For matrix, if we had the identity matrix as default instead of an empty list, that would help. Otherwise we need a way to extend the doc for GStreamer usage. > I recall various bits of work on that recently... But I think this is already handled through python. Stefan/Tomi ? <snip> > > draft-ae-precapture-trigger: Control for AE metering trigger. Currently identical to ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER. Whether the camera device will trigger a precapture metering sequence when it processes this request. > > flags: readable, writable, controllable > > Enum "AePrecaptureTrigger" Default: 0, "idle" > > (0): idle - The trigger is idle. > > (1): start - The pre-capture AE metering is started by the camera. > > (2): cancel - The camera will cancel any active or completed metering sequence. The AE algorithm is reset to its initial state. > > Perhaps we can skip over draft- as this could later break our interface. We can also add an env to include them ? Skipping them is an option. I think most of the 'draft' were only added to match the Android interface. But I wonder how we'll handle other namespaced controls like 'RPi' which I guess will be below... <snip> > I think my comments begins to be repetitive, and if we solve the earlier issues > we will solve them all. The main thing is that the yaml should explicitly define > more stuff rather then relying to text. And using that, we can easily adjust Agreed, I think at least the yaml files could specify if a control can only be accessed as metadata which would simplify all of the 'read-only' issues. > GStreamer properties. A final note, not all of these will be supported. I was > wondering if we should add an extra property that tells use which one are > actually supported ? We don't need anything fency, a "/" seperate list in a > string or similar could do. We can also go for GstStructure. I don't understand this part fully. Do you mean a new control/property that gstreamer libcamerasrc will read at runtime from libcamera? It already has a Camera->controls() that it can read that from. But you mention GstStructure - do you mean that the libcamerasrc will generate this ? (Perhaps from the Camera->controls()?) > Nicolas Kieran
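
To answer the question above: an "action signal" in GObject/GStreamer terms is a signal registered with the G_SIGNAL_ACTION flag that an application invokes like a method via g_signal_emit_by_name(), instead of a stateful property. A rough sketch of what that could look like for AfTrigger; the signal name "trigger-autofocus" is hypothetical and not part of this series:

#include <gst/gst.h>

/* Registration, e.g. in the element's class_init(). */
static void
register_trigger_autofocus(GObjectClass *object_class)
{
        g_signal_new("trigger-autofocus", G_TYPE_FROM_CLASS(object_class),
                     static_cast<GSignalFlags>(G_SIGNAL_RUN_LAST | G_SIGNAL_ACTION),
                     0, nullptr, nullptr, nullptr, G_TYPE_NONE, 0);
}

/*
 * Application side: invoked like a method. The handler would queue an
 * AfTrigger::Start control into the next request, so no sticky enum value
 * is left behind once the request completes.
 */
static void
trigger_af(GstElement *libcamerasrc)
{
        g_signal_emit_by_name(libcamerasrc, "trigger-autofocus");
}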
Hi Nicolas, thanks for the review. Nicolas Dufresne <nicolas@ndufresne.ca> writes: > Hi, > > I started inspecting the generated documentation. It is > generally hard to get a > perfectly clean results, but let me comment few things. We can > at least try some > improvement. > >> Factory Details: >> Rank primary (256) >> Long-name libcamera Source >> Klass Source/Video >> Description Linux Camera source using libcamera >> Author Nicolas Dufresne >> <nicolas.dufresne@collabora.com> > > Thanks for this one! >> >> Plugin Details: >> Name libcamera >> Description libcamera capture plugin >> Filename >> /home/nicolas/Sources/libcamera/build/src/gstreamer/libgstlibcamera.so >> Version 0.3.1+20-80a202bf >> License LGPL >> Source module libcamera >> Binary package libcamera >> Origin URL https://libcamera.org >> >> GObject >> +----GInitiallyUnowned >> +----GstObject >> +----GstElement >> +----GstLibcameraSrc >> >> Implemented Interfaces: >> GstChildProxy >> >> Pad Templates: >> SRC template: 'src' >> Availability: Always >> Capabilities: >> video/x-raw >> image/jpeg >> video/x-bayer >> Type: GstLibcameraPad >> Pad Properties: >> >> stream-role : The selected stream role >> flags: readable, writable, >> changeable only in NULL or READY >> state >> Enum "GstLibcameraStreamRole" >> Default: 2, "video-recording" >> (1): still-capture - >> libcamera::StillCapture >> (2): video-recording - >> libcamera::VideoRecording >> (3): view-finder - >> libcamera::Viewfinder >> >> >> SRC template: 'src_%u' >> Availability: On request >> Capabilities: >> video/x-raw >> image/jpeg >> video/x-bayer >> Type: GstLibcameraPad >> Pad Properties: >> >> stream-role : The selected stream role >> flags: readable, writable, >> changeable only in NULL or READY >> state >> Enum "GstLibcameraStreamRole" >> Default: 2, "video-recording" >> (1): still-capture - >> libcamera::StillCapture >> (2): video-recording - >> libcamera::VideoRecording >> (3): view-finder - >> libcamera::Viewfinder >> >> >> Element has no clocking capabilities. >> Element has no URI handling capabilities. >> >> Pads: >> SRC: 'src' >> Pad Template: 'src' >> >> Element Properties: >> >> ae-constraint-mode : Specify a constraint mode for the AE >> algorithm to use. These determine how the measured scene >> brightness is adjusted to reach the desired target exposure. >> Constraint modes may be platform specific, and not all >> constraint modes may be supported. > > Inspect generally allow for pretty short description, but I do > like having the full description, specially that as long as this > element lives here, we are unlikely to pull in the GStreamer doc > generator for it. Might be worth giving a try to line breaks, > though. I wasn't sure if the whole description should be included but I also like having the whole description there as one can then use the gstreamer element without consulting the libcamera documentation. I looked at a few gstreamer elements (h264parse and webrtcbin) with long descriptions (it was surprisingly hard to find some) and they both did not include line breaks in the description. Should I really try to add some? >> flags: readable, writable, controllable >> Enum "AeConstraintMode" Default: 0, >> "normal" >> (0): normal - Default constraint mode. This mode aims to >> balance the exposure of different parts of the image so as to >> reach a reasonable average level. However, highlights in the >> image may appear over-exposed and lowlights may appear >> under-exposed. >> (1): highlight - Highlight >> constraint mode. 
This mode adjusts >> the exposure levels in order to try >> and avoid over-exposing the >> brightest parts (highlights) of an >> image. Other non-highlight parts of >> the image may appear under-exposed. >> (2): shadows - Shadows >> constraint mode. This mode adjusts >> the exposure levels in order to try >> and avoid under-exposing the dark >> parts (shadows) of an image. Other >> normally exposed parts of the image >> may appear over-exposed. >> (3): custom - Custom >> constraint mode. > > Same here, would be nice to at least "try" something, but this > is part of things that may of may not be easy. Everything else > seems good. > >> >> ae-enable : Enable or disable the AE. See also: >> exposure-time, analogue-gain. >> flags: readable, writable, controllable >> Boolean. Default: false >> >> ae-exposure-mode : Specify an exposure mode for the AE >> algorithm to use. These specify how the desired total exposure >> is divided between the shutter time and the sensor's analogue >> gain. The exposure modes are platform specific, and not >> all exposure modes may be supported. >> flags: readable, writable, controllable >> Enum "AeExposureMode" Default: 0, >> "normal" >> (0): normal - Default >> exposure mode. >> (1): short - Exposure >> mode allowing only short exposure >> times. >> (2): long - Exposure >> mode allowing long exposure times. >> (3): custom - Custom >> exposure mode. >> >> ae-flicker-detected : Flicker period detected in microseconds. >> The value reported here indicates the currently >> detected flicker period, or zero if no flicker at all is >> detected. When AeFlickerMode is set to FlickerAuto, there may >> be a period during which the value reported here remains zero. >> Once a non-zero value is reported, then this is the >> flicker period that has been detected and is now being >> cancelled. In the case of 50Hz mains flicker, the value would >> be >> 10000 (corresponding to 100Hz), or 8333 (120Hz) for 60Hz mains >> flicker. It is implementation dependent whether the >> system can continue to detect flicker of different periods when >> another frequency is already being cancelled. See also: >> ae-flicker-mode. >> flags: readable, writable, controllable >> Integer. Range: -2147483648 - >> 2147483647 Default: 0 >> >> ae-flicker-mode : Set the flicker mode, which determines >> whether, and how, the AGC/AEC algorithm attempts to hide >> flicker effects caused by the duty cycle of artificial >> lighting. Although implementation dependent, many algorithms >> for >> "flicker avoidance" work by restricting this exposure time to >> integer multiples of the cycle period, wherever possible. >> Implementations may not support all of the flicker modes listed >> below. By default the system will start in FlickerAuto >> mode if this is supported, otherwise the flicker mode will be >> set to FlickerOff. >> flags: readable, writable, controllable >> Enum "AeFlickerMode" Default: 0, "off" > > Is that accurate or accidental default ? All the control defaults are set to zero as a default in the spec at the moment. That's why the first enum variant is the default for all enums. So this is kind of accidental. The thing is that (at the moment, that can of course be changed) none of these default controls even get applied to requests. Only once you do g_object_set_property() will the control be added to the ControlList. If we would want to have custom defaults for controls then these would need to be specified somewhere either in the control_ids_*.yaml or somewhere else. 
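To make that concrete, here is a simplified sketch of what one generated setProperty() case might look like; nothing enters the ControlList until the corresponding GObject property is actually set, so the zero "defaults" printed by gst-inspect are never applied on their own. The case labels are illustrative only and ignore the PROP_LAST offset translation that the real code produced by gen-gst-controls.py performs.

    bool GstCameraControls::setProperty(guint propId, const GValue *value,
                                        [[maybe_unused]] GParamSpec *pspec)
    {
            switch (propId) {
            case controls::AE_ENABLE:
                    /* Only now does the control become part of controls_. */
                    controls_.set(controls::AeEnable,
                                  static_cast<bool>(g_value_get_boolean(value)));
                    return true;
            case controls::ANALOGUE_GAIN:
                    controls_.set(controls::AnalogueGain, g_value_get_float(value));
                    return true;
            /* ... one case per control generated by gen-gst-controls.py ... */
            default:
                    return false;
            }
    }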
>> (0): off - No flicker >> avoidance is performed. >> (1): manual - Manual >> flicker avoidance. Suppress flicker >> effects caused by lighting running >> with a period specified by the >> AeFlickerPeriod control. See also: >> ae-flicker-period. >> (2): auto - Automatic flicker period detection and avoidance. >> The system will automatically determine the most likely >> value of flicker period, and avoid flicker of this frequency. >> Once flicker is being corrected, it is implementation >> dependent whether the system is still able to detect a change >> in the flicker period. See also: ae-flicker-detected. >> >> ae-flicker-period : Manual flicker period in microseconds. This >> value sets the current flicker period to avoid. It is >> used when AeFlickerMode is set to FlickerManual. To cancel 50Hz >> mains flicker, this should be set to 10000 >> (corresponding to 100Hz), or 8333 (120Hz) for 60Hz mains. >> Setting the mode to FlickerManual when no AeFlickerPeriod has >> ever been set means that no flicker cancellation occurs (until >> the value of this control is updated). Switching to modes >> other than FlickerManual has no effect on the value of the >> AeFlickerPeriod control. See also: ae-flicker-mode. >> flags: readable, writable, controllable >> Integer. Range: -2147483648 - >> 2147483647 Default: 0 >> >> ae-locked : Report the lock status of a running AE algorithm. >> If the AE algorithm is locked the value shall be set to >> true, if it's converging it shall be set to false. If the AE >> algorithm is not running the control shall not be present >> in the metadata control list. See also: ae-enable. >> flags: readable, writable, controllable >> Boolean. Default: false > > As this one is a status, it should only be readable, without the > writable/controllable. If I'm not mistaken, you get this value > withing the finalized request, would be nice enhancement to > notify our users of the change using g_notify() and/or bus > messages. The readable/writeable situation can hopefully be solved by adding some parsable information to the yaml. A notification about a change would be nice. Maybe these type of controls can also be marked in the yaml somehow. >> ae-metering-mode : Specify a metering mode for the AE >> algorithm to use. The metering modes determine which parts of >> the image are used to determine the scene brightness. >> Metering modes may be platform specific and not all metering >> modes may be supported. >> flags: readable, writable, controllable >> Enum "AeMeteringMode" Default: 0, >> "centre-weighted" >> (0): centre-weighted - >> Centre-weighted metering mode. >> (1): spot - Spot >> metering mode. >> (2): matrix - Matrix >> metering mode. >> (3): custom - Custom >> metering mode. > > From the doc, this seems pretty obscure. Shouldn't "custom" > comes with another control ? Perhaps we introduce a mask to hide > the controls that are not yet usable or that the usage isn't > obvious and may need special case or rework ? Hmmm yeah I don't understand this either, maybe custom is hardware specific? Stuff like this could be hidden of course. >> af-metering : Instruct the AF algorithm how it should >> decide which parts of the image should be used to measure >> focus. >> flags: readable, writable, controllable >> Enum "AfMetering" Default: 0, "auto" >> (0): auto - The AF >> algorithm should decide for itself >> where it will measure focus. >> (1): windows - The AF >> algorithm should use the rectangles >> defined by the AfWindows control to >> measure focus. 
If no windows are >> specified the behaviour is platform >> dependent. >> >> af-mode : Control to set the mode of the AF >> (autofocus) algorithm. An implementation may choose not to >> implement all the modes. >> flags: readable, writable, controllable >> Enum "AfMode" Default: 0, "manual" > > Is that appropriate default ? Of course does not matter if there > is no focus on the target. I wonder how you get to know if focus > is there or not ... > >> (0): manual - The AF algorithm is in manual mode. In this mode >> it will never perform any action nor move the lens of >> its own accord, but an application can specify the desired lens >> position using the LensPosition control. In this mode >> the AfState will always report AfStateIdle. If the camera is >> started in AfModeManual, it will move the focus lens to the >> position specified by the LensPosition control. This mode is >> the recommended default value for the AfMode control. >> External cameras (as reported by the Location property set to >> CameraLocationExternal) may use a different default value. >> (1): auto - The AF algorithm is in auto mode. This means that >> the algorithm will never move the lens or change state >> unless the AfTrigger control is used. The AfTrigger control can >> be used to initiate a focus scan, the results of which >> will be reported by AfState. If the autofocus algorithm is >> moved from AfModeAuto to another mode while a scan is in >> progress, the scan is cancelled immediately, without waiting >> for the scan to finish. When first entering this mode the >> AfState will report AfStateIdle. When a trigger control is >> sent, AfState will report AfStateScanning for a period before >> spontaneously changing to AfStateFocused or AfStateFailed, >> depending on the outcome of the scan. It will remain in this >> state until another scan is initiated by the AfTrigger control. >> If a scan is cancelled (without changing to another >> mode), AfState will return to AfStateIdle. >> (2): continuous - The AF algorithm is in continuous mode. This >> means that the lens can re-start a scan spontaneously >> at any moment, without any user intervention. The AfState still >> reports whether the algorithm is currently scanning or >> not, though the application has no ability to initiate or >> cancel scans, nor to move the lens for itself. However, >> applications can pause the AF algorithm from continuously >> scanning by using the AfPause control. This allows video or >> still images to be captured whilst guaranteeing that the focus >> is fixed. When set to AfModeContinuous, the system will >> immediately initiate a scan so AfState will report >> AfStateScanning, and will settle on one of AfStateFocused or >> AfStateFailed, depending on the scan result. >> >> af-pause : This control has no effect except when >> in continuous autofocus mode (AfModeContinuous). It can be >> used to pause any lens movements while (for example) images >> are captured. The algorithm remains inactive until it is >> instructed to resume. >> flags: readable, writable, controllable >> Enum "AfPause" Default: 0, "immediate" >> (0): immediate - Pause the continuous autofocus algorithm >> immediately, whether or not any kind of scan is underway. >> AfPauseState will subsequently report AfPauseStatePaused. >> AfState may report any of AfStateScanning, AfStateFocused or >> AfStateFailed, depending on the algorithm's state when it >> received this control. 
>> (1): deferred - This is similar to AfPauseImmediate, and if the >> AfState is currently reporting AfStateFocused or >> AfStateFailed it will remain in that state and AfPauseState >> will report AfPauseStatePaused. However, if the algorithm is >> scanning (AfStateScanning), AfPauseState will report >> AfPauseStatePausing until the scan is finished, at which point >> AfState will report one of AfStateFocused or AfStateFailed, and >> AfPauseState will change to AfPauseStatePaused. >> (2): resume - Resume >> continuous autofocus operation. The >> algorithm starts again from exactly >> where it left off, and AfPauseState >> will report AfPauseStateRunning. >> >> af-pause-state : Only applicable in continuous >> (AfModeContinuous) mode, this reports whether the algorithm is >> currently running, paused or pausing (that is, will pause as >> soon as any in-progress scan completes). Any change to >> AfMode will cause AfPauseStateRunning to be reported. >> flags: readable, writable, controllable > > Does not seem writable. > >> Enum "AfPauseState" Default: 0, >> "running" >> (0): running - Continuous >> AF is running and the algorithm may >> restart a scan spontaneously. >> (1): pausing - Continuous AF has been sent an AfPauseDeferred >> control, and will pause as soon as any in-progress scan >> completes (and then report AfPauseStatePaused). No new scans >> will be start spontaneously until the AfPauseResume control >> is sent. >> (2): paused - Continuous >> AF is paused. No further state >> changes or lens movements will occur >> until the AfPauseResume control is >> sent. >> >> af-range : Control to set the range of focus >> distances that is scanned. An implementation may choose not >> to implement all the options here. >> flags: readable, writable, controllable >> Enum "AfRange" Default: 0, "normal" >> (0): normal - A wide range >> of focus distances is scanned, all >> the way from infinity down to close >> distances, though depending on the >> implementation, possibly not >> including the very closest macro >> positions. >> (1): macro - Only close >> distances are scanned. >> (2): full - The full >> range of focus distances is scanned >> just as with AfRangeNormal but this >> time including the very closest >> macro positions. >> >> af-speed : Control that determines whether the AF algorithm is >> to move the lens as quickly as possible or more >> steadily. For example, during video recording it may be >> desirable not to move the lens too abruptly, but when in a >> preview mode (waiting for a still capture) it may be helpful to >> move the lens as quickly as is reasonably possible. >> flags: readable, writable, controllable >> Enum "AfSpeed" Default: 0, "normal" >> (0): normal - Move the >> lens at its usual speed. >> (1): fast - Move the >> lens more quickly. >> >> af-state : Reports the current state of the AF algorithm in >> conjunction with the reported AfMode value and (in >> continuous AF mode) the AfPauseState value. The possible state >> changes are described below, though we note the following >> state transitions that occur when the AfMode is changed. If the >> AfMode is set to AfModeManual, then the AfState will >> always report AfStateIdle (even if the lens is subsequently >> moved). Changing to the AfModeManual state does not initiate >> any lens movement. If the AfMode is set to AfModeAuto then the >> AfState will report AfStateIdle. 
However, if AfModeAuto >> and AfTriggerStart are sent together then AfState will omit >> AfStateIdle and move straight to AfStateScanning (and start >> a scan). If the AfMode is set to AfModeContinuous then the >> AfState will initially report AfStateScanning. >> flags: readable, writable, controllable > > Seems read-only. > >> Enum "AfState" Default: 0, "idle" >> (0): idle - The AF >> algorithm is in manual mode >> (AfModeManual) or in auto mode >> (AfModeAuto) and a scan has not yet >> been triggered, or an in-progress >> scan was cancelled. >> (1): scanning - The AF algorithm is in auto mode (AfModeAuto), >> and a scan has been started using the AfTrigger >> control. The scan can be cancelled by sending AfTriggerCancel >> at which point the algorithm will either move back to >> AfStateIdle or, if the scan actually completes before the >> cancel request is processed, to one of AfStateFocused or >> AfStateFailed. Alternatively the AF algorithm could be in >> continuous mode (AfModeContinuous) at which point it may enter >> this state spontaneously whenever it determines that a rescan >> is needed. >> (2): focused - The AF >> algorithm is in auto (AfModeAuto) or >> continuous (AfModeContinuous) mode >> and a scan has completed with the >> result that the algorithm believes >> the image is now in focus. >> (3): failed - The AF >> algorithm is in auto (AfModeAuto) or >> continuous (AfModeContinuous) mode >> and a scan has completed with the >> result that the algorithm did not >> find a good focus position. >> >> af-trigger : This control starts an autofocus scan >> when AfMode is set to AfModeAuto, and can also be used to >> terminate a scan early. It is ignored if AfMode is set to >> AfModeManual or AfModeContinuous. >> flags: readable, writable, controllable >> Enum "AfTrigger" Default: 0, "start" >> (0): start - Start an AF >> scan. Ignored if a scan is in >> progress. >> (1): cancel - Cancel an AF >> scan. This does not cause the lens >> to move anywhere else. Ignored if no >> scan is in progress. > > This one is missing something to make sense, at least from > GStreamer point of view. When you are not starting or > cancelling, what value this control should change to ? That > question might be relevant for libcamera globally. > > Without this third state, I would say this one should not be > there, and replace with action signals. True, that would make more sense. Maybe controls like these can also be marked in the yaml somehow? >> analogue-gain : Analogue gain value applied in the sensor >> device. The value of the control specifies the gain >> multiplier applied to all colour channels. This value cannot be >> lower than 1.0. Setting this value means that it is now >> fixed and the AE algorithm may not change it. Setting it back >> to zero returns it to the control of the AE algorithm. See >> also: exposure-time, ae-enable. Todo: Document the interactions >> between AeEnable and setting a fixed value for this >> control. Consider interactions with other AE features, such as >> aperture and aperture/shutter priority mode, and decide >> if control of which features should be automatically adjusted >> shouldn't better be handled through a separate AE mode >> control. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 > > I'm not confident there is no range, would be nice if we can > align to something sensible (might need yaml changes of course). > >> >> awb-enable : Enable or disable the AWB. See also: >> colour-gains. 
>> flags: readable, writable, controllable >> Boolean. Default: false > > Should be on by default (if not, we should do that in gst ;-P). > >> >> awb-locked : Report the lock status of a running AWB algorithm. >> If the AWB algorithm is locked the value shall be set >> to true, if it's converging it shall be set to false. If the >> AWB algorithm is not running the control shall not be >> present in the metadata control list. See also: awb-enable. >> flags: readable, writable, controllable >> Boolean. Default: false > > Read-only. > >> >> awb-mode : Specify the range of illuminants to use >> for the AWB algorithm. The modes supported are platform >> specific, and not all modes may be supported. >> flags: readable, writable, controllable >> Enum "AwbMode" Default: 0, "auto" >> (0): auto - Search over >> the whole colour temperature range. >> (1): incandescent - Incandescent >> AWB lamp mode. >> (2): tungsten - Tungsten AWB >> lamp mode. >> (3): fluorescent - Fluorescent >> AWB lamp mode. >> (4): indoor - Indoor AWB >> lighting mode. >> (5): daylight - Daylight AWB >> lighting mode. >> (6): cloudy - Cloudy AWB >> lighting mode. >> (7): custom - Custom AWB >> mode. > > Again, unclear how custom can be used. > >> >> brightness : Specify a fixed brightness parameter. >> Positive values (up to 1.0) produce brighter images; negative >> values (up to -1.0) produce darker images and 0.0 leaves >> pixels unchanged. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 > > So this time the range is documented, we need a machine readable > form. > >> >> camera-name : Select by name which camera to use. >> flags: readable, writable, changeable >> only in NULL or READY state >> String. Default: null >> >> colour-correction-matrix: The 3x3 matrix that converts camera >> RGB to sRGB within the imaging pipeline. This should >> describe the matrix that is used after pixels have been >> white-balanced, but before any gamma transformation. The 3x3 >> matrix is stored in conventional reading order in an array of 9 >> floating point values. >> flags: readable, writable, controllable >> Default: "< >" >> GstValueArray of GValues of type >> "gfloat" > > The type only allow flat list, but as its a matrix I was > expecting > > "< < 1, 2, 3 >, > < 4, 5, 6 >, > < 7, 8, 9 > >" > > That being said I would be fine with flat raster data: > > "< 1, 2, 3, > 4, 5, 6, > 7, 8, 9 >" > > The down side is for bindings. Consider Python, the first would > create a collection that you can access with matrix[Y][X] and > more importantly can store into numpy.matrix. > > A comment for all list "< >" syntax. The usage and serialization > syntax is not easy to guess. For matrix, if we had the identity > matrix as default instead of an empty list, that would help. > Otherwise we need a way to extend the doc for GStreamer usage. I thought it would be best to use a flat array here because the documentation mentions that it is an array of 9 values. An identity matrix as default would work but I think this is the same discussion as the default values for the enums: that the default might be hardware specific. That's why I was kind of happy that you could leave arrays empty as a default. >> colour-gains : Pair of gain values for the Red and >> Blue colour channels, in that order. ColourGains can only be >> applied in a Request when the AWB is disabled. See also: >> awb-enable. 
>> flags: readable, writable, controllable >> Default: "< >" >> GstValueArray of GValues of type >> "gfloat" > > Again the type allow for flat list, but I think we have "< < r1, > r2 >, < b1, b2 > >" ? I think "< r b >" would actually be correct here. >> colour-temperature : Report the current estimate of the >> colour temperature, in kelvin, for this frame. The >> ColourTemperature control can only be returned in metadata. >> flags: readable, writable, controllable >> Integer. Range: -2147483648 - >> 2147483647 Default: 0 > > read-only. > >> >> contrast : Specify a fixed contrast parameter. >> Normal contrast is given by the value 1.0; larger values >> produce images with more contrast. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 > > Range needed programmatically. > >> >> digital-gain : Digital gain value applied during the processing >> steps applied to the image as captured from the >> sensor. The global digital gain factor is applied to all the >> colour channels of the RAW image. Different pipeline models >> are free to specify how the global gain factor applies to each >> separate channel. If an imaging pipeline applies digital >> gain in distinct processing steps, this value indicates their >> total sum. Pipelines are free to decide how to adjust each >> processing step to respect the received gain factor and shall >> report their total value in the request metadata. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 >> >> draft-ae-precapture-trigger: Control for AE metering trigger. >> Currently identical to ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER. >> Whether the camera device will trigger a precapture metering >> sequence when it processes this request. >> flags: readable, writable, controllable >> Enum "AePrecaptureTrigger" Default: 0, >> "idle" >> (0): idle - The trigger >> is idle. >> (1): start - The >> pre-capture AE metering is started >> by the camera. >> (2): cancel - The camera >> will cancel any active or completed >> metering sequence. The AE algorithm >> is reset to its initial state. > > Perhaps we can skip over draft- as this could later break our > interface. We can also add an env to include them ? Sure, I could hide that behind an env variable. What about other vendor controls, should they be included? >> draft-ae-state : Control to report the current AE >> algorithm state. Currently identical to >> ANDROID_CONTROL_AE_STATE. Current state of the AE algorithm. >> flags: readable, writable, controllable >> Enum "AeState" Default: 0, "inactive" >> (0): inactive - The AE >> algorithm is inactive. >> (1): searching - The AE >> algorithm has not converged yet. >> (2): converged - The AE >> algorithm has converged. >> (3): locked - The AE >> algorithm is locked. >> (4): flash-required - The AE >> algorithm would need a flash for >> good results >> (5): precapture - The AE >> algorithm has started a pre-capture >> metering session. See also: >> ae-precapture-trigger. >> >> draft-awb-state : Control to report the current AWB >> algorithm state. Currently identical to >> ANDROID_CONTROL_AWB_STATE. Current state of the AWB >> algorithm. >> flags: readable, writable, controllable >> Enum "AwbState" Default: 0, >> "state-inactive" >> (0): state-inactive - The AWB >> algorithm is inactive. >> (1): state-searching - The AWB >> algorithm has not converged yet. >> (2): converged - The AWB >> algorithm has converged. >> (3): locked - The AWB >> algorithm is locked. 
>> >> draft-color-correction-aberration-mode: Control to select the >> color correction aberration mode. Currently identical to >> ANDROID_COLOR_CORRECTION_ABERRATION_MODE. Mode of operation >> for the chromatic aberration correction algorithm. >> flags: readable, writable, controllable >> Enum "ColorCorrectionAberrationMode" >> Default: 0, "off" >> (0): off - No >> aberration correction is applied. >> (1): fast - Aberration >> correction will not slow down the >> frame rate. >> (2): high-quality - High quality >> aberration correction which might >> reduce the frame rate. >> >> draft-lens-shading-map-mode: Control to report if the lens >> shading map is available. Currently identical to >> ANDROID_STATISTICS_LENS_SHADING_MAP_MODE. >> flags: readable, writable, controllable >> Enum "LensShadingMapMode" Default: 0, >> "ff" >> (0): ff - No lens >> shading map mode is available. >> (1): n - The lens >> shading map mode is available. >> >> draft-max-latency : The maximum number of frames that can occur >> after a request (different than the previous) has been >> submitted, and before the result's state becomes synchronized. >> A value of -1 indicates unknown latency, and 0 indicates >> per-frame control. Currently identical to >> ANDROID_SYNC_MAX_LATENCY. >> flags: readable, writable, controllable >> Integer. Range: -2147483648 - >> 2147483647 Default: 0 >> >> draft-noise-reduction-mode: Control to select the noise >> reduction algorithm mode. Currently identical to >> ANDROID_NOISE_REDUCTION_MODE. Mode of operation for the >> noise reduction algorithm. >> flags: readable, writable, controllable >> Enum "NoiseReductionMode" Default: 0, >> "off" >> (0): off - No noise >> reduction is applied >> (1): fast - Noise >> reduction is applied without >> reducing the frame rate. >> (2): high-quality - High quality >> noise reduction at the expense of >> frame rate. >> (3): minimal - Minimal >> noise reduction is applied without >> reducing the frame rate. >> (4): z-s-l - Noise >> reduction is applied at different >> levels to different streams. >> >> draft-pipeline-depth: Specifies the number of pipeline stages >> the frame went through from when it was exposed to when >> the final completed result was available to the framework. >> Always less than or equal to PipelineMaxDepth. Currently >> identical to ANDROID_REQUEST_PIPELINE_DEPTH. The typical value >> for this control is 3 as a frame is first exposed, >> captured and then processed in a single pass through the ISP. >> Any additional processing step performed after the ISP >> pass (in example face detection, additional format conversions >> etc) count as an additional pipeline stage. >> flags: readable, writable, controllable >> Integer. Range: -2147483648 - >> 2147483647 Default: 0 >> >> draft-sensor-rolling-shutter-skew: Control to report the time >> between the start of exposure of the first row and the start >> of exposure of the last row. Currently identical to >> ANDROID_SENSOR_ROLLING_SHUTTER_SKEW >> flags: readable, writable, controllable >> Integer64. Range: -9223372036854775808 >> - 9223372036854775807 Default: 0 >> >> draft-test-pattern-mode: Control to select the test pattern >> mode. Currently identical to >> ANDROID_SENSOR_TEST_PATTERN_MODE. >> flags: readable, writable, controllable >> Enum "TestPatternMode" Default: 0, >> "off" >> (0): off - No test >> pattern mode is used. The camera >> device returns frames from the image >> sensor. 
>> (1): solid-color - Each pixel >> in [R, G_even, G_odd, B] is replaced >> by its respective color channel >> provided in test pattern data. Todo: >> Add control for test pattern data. >> (2): color-bars - All pixel data is replaced with an 8-bar >> color pattern. The vertical bars (left-to-right) are as >> follows; white, yellow, cyan, green, magenta, red, blue and >> black. Each bar should take up 1/8 of the sensor pixel array >> width. When this is not possible, the bar size should be >> rounded down to the nearest integer and the pattern can repeat >> on the right side. Each bar's height must always take up the >> full sensor pixel array height. >> (3): color-bars-fade-to-gray - The test pattern is similar to >> TestPatternModeColorBars, except that each bar should >> start at its specified color at the top and fade to gray at the >> bottom. Furthermore each bar is further subdevided into >> a left and right half. The left half should have a smooth >> gradient, and the right half should have a quantized gradient. >> In particular, the right half's should consist of blocks of the >> same color for 1/16th active sensor pixel array width. >> The least significant bits in the quantized gradient should be >> copied from the most significant bits of the smooth >> gradient. The height of each bar should always be a multiple of >> 128. When this is not the case, the pattern should >> repeat at the bottom of the image. >> (4): pn9 - All pixel data is replaced by a pseudo-random >> sequence generated from a PN9 512-bit sequence (typically >> implemented in hardware with a linear feedback shift register). >> The generator should be reset at the beginning of each >> frame, and thus each subsequent raw frame with this test >> pattern should be exactly the same as the last. >> (256): custom1 - The first custom test pattern. All custom >> patterns that are available only on this camera device are >> at least this numeric value. All of the custom test patterns >> will be static (that is the raw image must not vary from >> frame to frame). >> >> exposure-time : Exposure time (shutter speed) for the frame >> applied in the sensor device. This value is specified in >> micro-seconds. Setting this value means that it is now fixed >> and the AE algorithm may not change it. Setting it back to >> zero returns it to the control of the AE algorithm. See also: >> analogue-gain, ae-enable. Todo: Document the interactions >> between AeEnable and setting a fixed value for this control. >> Consider interactions with other AE features, such as >> aperture and aperture/shutter priority mode, and decide if >> control of which features should be automatically adjusted >> shouldn't better be handled through a separate AE mode control. >> flags: readable, writable, controllable >> Integer. Range: -2147483648 - >> 2147483647 Default: 0 >> >> exposure-value : Specify an Exposure Value (EV) parameter. The >> EV parameter will only be applied if the AE algorithm >> is currently enabled. By convention EV adjusts the exposure as >> log2. For example EV = [-2, -1, 0.5, 0, 0.5, 1, 2] >> results in an exposure adjustment of [1/4x, 1/2x, 1/sqrt(2)x, >> 1x, sqrt(2)x, 2x, 4x]. See also: ae-enable. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 >> >> focus-fo-m : Reports a Figure of Merit (FoM) to indicate how >> in-focus the frame is. A larger FocusFoM value indicates >> a more in-focus frame. 
This singular value may be based on a >> combination of statistics gathered from multiple focus >> regions within an image. The number of focus regions and method >> of combination is platform dependent. In this respect, >> it is not necessarily aimed at providing a way to implement a >> focus algorithm by the application, rather an indication >> of how in-focus a frame is. >> flags: readable, writable, controllable >> Integer. Range: -2147483648 - >> 2147483647 Default: 0 >> >> frame-duration : The instantaneous frame duration from >> start of frame exposure to start of next exposure, expressed >> in microseconds. This control is meant to be returned in >> metadata. >> flags: readable, writable, controllable >> Integer64. Range: -9223372036854775808 >> - 9223372036854775807 Default: 0 >> >> frame-duration-limits: The minimum and maximum (in that order) >> frame duration, expressed in microseconds. When >> provided by applications, the control specifies the sensor >> frame duration interval the pipeline has to use. This limits >> the largest exposure time the sensor can use. For example, if a >> maximum frame duration of 33ms is requested >> (corresponding to 30 frames per second), the sensor will not be >> able to raise the exposure time above 33ms. A fixed >> frame duration is achieved by setting the minimum and maximum >> values to be the same. Setting both values to 0 reverts to >> using the camera defaults. The maximum frame duration provides >> the absolute limit to the shutter speed computed by the >> AE algorithm and it overrides any exposure mode setting >> specified with controls::AeExposureMode. Similarly, when a >> manual exposure time is set through controls::ExposureTime, it >> also gets clipped to the limits set by this control. When >> reported in metadata, the control expresses the minimum and >> maximum frame durations used after being clipped to the >> sensor provided frame duration limits. See also: >> ae-exposure-mode. See also: exposure-time. Todo: Define how to >> calculate the capture frame rate by defining controls to report >> additional delays introduced by the capture pipeline or >> post-processing stages (ie JPEG conversion, frame scaling). >> Todo: Provide an explicit definition of default control >> values, for this and all other controls. >> flags: readable, writable, controllable >> Default: "< >" >> GstValueArray of GValues of type >> "gint64" >> >> gamma : Specify a fixed gamma value. Default >> must be 2.2 which closely mimics sRGB gamma. Note that this >> is camera gamma, so it is applied as 1.0/gamma. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 >> >> hdr-channel : This value is reported back to the application so >> that it can discover whether this capture corresponds >> to the short or long exposure image (or any other image used by >> the HDR procedure). An application can monitor the HDR >> channel to discover when the differently exposed images have >> arrived. This metadata is only available when an HDR mode >> has been enabled. See also: hdr-mode. >> flags: readable, writable, controllable >> Enum "HdrChannel" Default: 0, "none" >> (0): none - This image >> does not correspond to any of the >> captures used to create an HDR >> image. >> (1): short - This is a >> short exposure image. >> (2): medium - This is a >> medium exposure image. >> (3): long - This is a >> long exposure image. >> >> hdr-mode : Control to set the mode to be used for High Dynamic >> Range (HDR) imaging. 
HDR techniques typically include >> multiple exposure, image fusion and tone mapping techniques to >> improve the dynamic range of the resulting images. When >> using an HDR mode, images are captured with different sets of >> AGC settings called HDR channels. Channels indicate in >> particular the type of exposure (short, medium or long) used to >> capture the raw image, before fusion. Each HDR image is >> tagged with the corresponding channel using the HdrChannel >> control. See also: hdr-channel. >> flags: readable, writable, controllable >> Enum "HdrMode" Default: 0, "off" >> (0): off - HDR is >> disabled. Metadata for this frame >> will not include the HdrChannel >> control. >> (1): multi-exposure-unmerged - Multiple exposures will be >> generated in an alternating fashion. However, they will not >> be merged together and will be returned to the application as >> they are. Each image will be tagged with the correct HDR >> channel, indicating what kind of exposure it is. The tag should >> be the same as in the HdrModeMultiExposure case. The >> expectation is that an application using this mode would merge >> the frames to create HDR images for itself if it requires >> them. >> (2): multi-exposure - Multiple exposures will be generated and >> merged to create HDR images. Each image will be tagged >> with the HDR channel (long, medium or short) that arrived and >> which caused this image to be output. Systems that use two >> channels for HDR will return images tagged alternately as the >> short and long channel. Systems that use three channels >> for HDR will cycle through the short, medium and long channel >> before repeating. >> (3): single-exposure - Multiple >> frames all at a single exposure will >> be used to create HDR images. These >> images should be reported as all >> corresponding to the HDR short >> channel. >> (4): night - Multiple >> frames will be combined to produce >> "night mode" images. It is up to the >> implementation exactly which HDR >> channels it uses, and the images >> will all be tagged accordingly with >> the correct HDR channel information. >> >> lens-position : Acts as a control to instruct the lens to move >> to a particular position and also reports back the >> position of the lens for each frame. The LensPosition control >> is ignored unless the AfMode is set to AfModeManual, >> though the value is reported back unconditionally in all modes. >> This value, which is generally a non-integer, is the >> reciprocal of the focal distance in metres, also known as >> dioptres. That is, to set a focal distance D, the lens >> position LP is given by \f$LP = \frac{1\mathrm{m}}{D}\f$ For >> example: 0 moves the lens to infinity. 0.5 moves the lens >> to focus on objects 2m away. 2 moves the lens to focus on >> objects 50cm away. And larger values will focus the lens >> closer. The default value of the control should indicate a good >> general position for the lens, often corresponding to >> the hyperfocal distance (the closest position for which objects >> at infinity are still acceptably sharp). The minimum >> will often be zero (meaning infinity), and the maximum value >> defines the closest focus position. Todo: Define a property >> to report the Hyperfocal distance of calibrated lenses. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 >> >> lux : Report an estimate of the current >> illuminance level in lux. The Lux control can only be >> returned in metadata. 
>> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 >> >> name : The name of the object >> flags: readable, writable >> String. Default: "libcamerasrc0" >> >> parent : The parent of the object >> flags: readable, writable >> Object of type "GstObject" >> >> saturation : Specify a fixed saturation parameter. >> Normal saturation is given by the value 1.0; larger values >> produce more saturated colours; 0.0 produces a greyscale >> image. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 >> >> sensor-black-levels : Reports the sensor black levels used for >> processing a frame, in the order R, Gr, Gb, B. These >> values are returned as numbers out of a 16-bit pixel range (as >> if pixels ranged from 0 to 65535). The SensorBlackLevels >> control can only be returned in metadata. >> flags: readable, writable, controllable >> Default: "< >" >> GstValueArray of GValues of type "gint" >> >> sensor-temperature : Temperature measure from the camera sensor >> in Celsius. This is typically obtained by a thermal >> sensor present on-die or in the camera module. The range of >> reported temperatures is device dependent. The >> SensorTemperature control will only be returned in metadata if >> a thermal sensor is present. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 >> >> sensor-timestamp : The time when the first row of the image >> sensor active array is exposed. The timestamp, expressed >> in nanoseconds, represents a monotonically increasing counter >> since the system boot time, as defined by the >> Linux-specific CLOCK_BOOTTIME clock id. The SensorTimestamp >> control can only be returned in metadata. Todo: Define how >> the sensor timestamp has to be used in the reprocessing use >> case. >> flags: readable, writable, controllable >> Integer64. Range: -9223372036854775808 >> - 9223372036854775807 Default: 0 >> >> sharpness : A value of 0.0 means no sharpening. The minimum >> value means minimal sharpening, and shall be 0.0 unless >> the camera can't disable sharpening completely. The default >> value shall give a "reasonable" level of sharpening, >> suitable for most use cases. The maximum value may apply >> extremely high levels of sharpening, higher than anyone could >> reasonably want. Negative values are not allowed. Note also >> that sharpening is not applied to raw streams. >> flags: readable, writable, controllable >> Float. Range: -3.402823e+38 - >> 3.402823e+38 Default: 0 > > > I think my comments begins to be repetitive, and if we solve the > earlier issues > we will solve them all. The main thing is that the yaml should > explicitly define > more stuff rather then relying to text. And using that, we can > easily adjust > GStreamer properties. A final note, not all of these will be > supported. I was > wondering if we should add an extra property that tells use > which one are > actually supported ? We don't need anything fency, a "/" > seperate list in a > string or similar could do. We can also go for GstStructure. > > Nicolas Best regards, Jaslo
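A side note for readers less familiar with libcamera: several of the controls questioned above (ae-locked, awb-locked, colour-temperature, sensor-timestamp, ...) are only ever reported back in the metadata of a completed request, never set on one. A minimal sketch of how such values are read today follows; the helper name is invented, and how libcamerasrc should eventually surface these values (g_notify(), bus messages, ...) is precisely what is still being discussed.

    /*
     * Minimal sketch: metadata-only controls are read from the completed
     * request. The helper is hypothetical; the libcamera calls are real API.
     */
    #include <gst/gst.h>

    #include <libcamera/control_ids.h>
    #include <libcamera/request.h>

    static void log_frame_metadata(libcamera::Request *request)
    {
            const libcamera::ControlList &metadata = request->metadata();

            if (auto locked = metadata.get(libcamera::controls::AeLocked))
                    GST_DEBUG("AE is %s", *locked ? "locked" : "still converging");

            if (auto temperature = metadata.get(libcamera::controls::ColourTemperature))
                    GST_DEBUG("Estimated colour temperature: %d K", *temperature);
    }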
Hi Kieran, thanks for the review. Kieran Bingham <kieran.bingham@ideasonboard.com> writes: > Hi Nicolas, Jaslo, > > Adding comments where I think I can add helpful (I hope) > feedback. > > Jaslo - Thanks again for starting this work - It's certainly > worthwhile > - even if there might be a few challenges below! It seems that *few* might be an understatement ;) > And I've already seen one person on IRC who was able to solve > their > issues by using this series! \o/ I'm very happy to hear that! > > Quoting Nicolas Dufresne (2024-08-07 20:13:08) >> Hi, >> >> I started inspecting the generated documentation. It is >> generally hard to get a >> perfectly clean results, but let me comment few things. We can >> at least try some >> improvement. >> > > <snip> > >> > ae-flicker-period : Manual flicker period in microseconds. >> > This value sets the current flicker period to avoid. It >> > is used when AeFlickerMode is set to FlickerManual. To cancel >> > 50Hz mains flicker, this should be set to 10000 >> > (corresponding to 100Hz), or 8333 (120Hz) for 60Hz mains. >> > Setting the mode to FlickerManual when no AeFlickerPeriod >> > has ever been set means that no flicker cancellation occurs >> > (until the value of this control is updated). Switching >> > to modes other than FlickerManual has no effect on the value >> > of the AeFlickerPeriod control. See also: >> > ae-flicker-mode. >> > flags: readable, writable, >> > controllable >> > Integer. Range: -2147483648 - >> > 2147483647 Default: 0 >> > >> > ae-locked : Report the lock status of a running AE >> > algorithm. If the AE algorithm is locked the value shall be >> > set >> > to true, if it's converging it shall be set to false. If the >> > AE algorithm is not running the control shall not be >> > present in the metadata control list. See also: ae-enable. >> > flags: readable, writable, >> > controllable >> > Boolean. Default: false >> >> As this one is a status, it should only be readable, without >> the writable/controllable. If I'm not mistaken, you get this >> value withing the finalized request, would be nice enhancement >> to notify our users of the change using g_notify() and/or bus >> messages. > > > This comes down to the fact that readable controls / properties > are only > reported in metadata and not exposed as a control by the > camera... > >> > >> > ae-metering-mode : Specify a metering mode for the AE >> > algorithm to use. The metering modes determine which parts >> > of the image are used to determine the scene brightness. >> > Metering modes may be platform specific and not all >> > metering modes may be supported. >> > flags: readable, writable, >> > controllable >> > Enum "AeMeteringMode" Default: 0, >> > "centre-weighted" >> > (0): centre-weighted - >> > Centre-weighted metering mode. >> > (1): spot - Spot >> > metering mode. >> > (2): matrix - Matrix >> > metering mode. >> > (3): custom - Custom >> > metering mode. >> >> From the doc, this seems pretty obscure. Shouldn't "custom" >> comes with another control ? Perhaps we introduce a mask to >> hide the controls that are not yet usable or that the usage >> isn't obvious and may need special case or rework ? >> >> > >> > af-metering : Instruct the AF algorithm how it >> > should decide which parts of the image should be used to >> > measure focus. >> > flags: readable, writable, >> > controllable >> > Enum "AfMetering" Default: 0, "auto" >> > (0): auto - The AF >> > algorithm should decide for itself >> > where it will measure focus. 
>> > (1): windows - The AF >> > algorithm should use the >> > rectangles defined by the >> > AfWindows control to measure >> > focus. If no windows are specified >> > the behaviour is platform >> > dependent. >> > >> > af-mode : Control to set the mode of the AF >> > (autofocus) algorithm. An implementation may choose not to >> > implement all the modes. >> > flags: readable, writable, >> > controllable >> > Enum "AfMode" Default: 0, "manual" >> >> Is that appropriate default ? Of course does not matter if >> there is no focus on the target. I wonder how you get to know >> if focus is there or not ... > > Cameras without autofocus should not expose the controls, as > they do not > 'exist' and they will not be handled... > > However - that doesn't fit in this gstreamer model where all > controls > are going to be 'always' exposed. (Even if they are actually > 'metadata > only' as well). It's up to the Camera at runtime to report what > the > pipeline handler will accept based on the underlying hardware > capabilities or configuration. > > In fact setting them would likely / potentially cause an > assertion? (or > at least a failed request - potentially causing a frame drop... > ?). At the moment the control gets set anyway, whether it is supported or not, and if it isn't there is an error printed on every request. Whether a control is supported should probably already be checked and handled in the gstreamer element. Best regards, Jaslo > > <snip> > > >> > Enum "AfState" Default: 0, "idle" >> > (0): idle - The AF >> > algorithm is in manual mode >> > (AfModeManual) or in auto mode >> > (AfModeAuto) and a scan has not >> > yet been triggered, or an >> > in-progress scan was cancelled. >> > (1): scanning - The AF >> > algorithm is in auto mode >> > (AfModeAuto), and a scan has been >> > started using the AfTrigger >> > control. The scan can be cancelled >> > by sending AfTriggerCancel at >> > which point the algorithm will >> > either move back to AfStateIdle >> > or, if the scan actually completes >> > before the cancel request is >> > processed, to one of >> > AfStateFocused or AfStateFailed. >> > Alternatively the AF algorithm >> > could be in continuous mode >> > (AfModeContinuous) at which point >> > it may enter this state >> > spontaneously whenever it >> > determines that a rescan is >> > needed. >> > (2): focused - The AF >> > algorithm is in auto (AfModeAuto) >> > or continuous (AfModeContinuous) >> > mode and a scan has completed with >> > the result that the algorithm >> > believes the image is now in >> > focus. >> > (3): failed - The AF >> > algorithm is in auto (AfModeAuto) >> > or continuous (AfModeContinuous) >> > mode and a scan has completed with >> > the result that the algorithm did >> > not find a good focus position. >> > >> > af-trigger : This control starts an autofocus scan >> > when AfMode is set to AfModeAuto, and can also be used to >> > terminate a scan early. It is ignored if AfMode is set to >> > AfModeManual or AfModeContinuous. >> > flags: readable, writable, >> > controllable >> > Enum "AfTrigger" Default: 0, "start" >> > (0): start - Start an >> > AF scan. Ignored if a scan is in >> > progress. >> > (1): cancel - Cancel an >> > AF scan. This does not cause the >> > lens to move anywhere else. >> > Ignored if no scan is in progress. >> >> This one is missing something to make sense, at least from >> GStreamer point of view. When you are not starting or >> cancelling, what value this control should change to ? 
That >> question might be relevant for libcamera globally. >> >> Without this third state, I would say this one should not be >> there, and replace with action signals. > > > In libcamera - controls only 'exist' in a single request. What > is an > 'action signal' in Gstreamer? > > > So - AfTrigger for start and cancel are, in this case 'events' > > - "Start an AFscan at this request" > - "Cancel any active scan" > > > There are no 'events' in between. Controls are only 'set' in > Requests. > Any 'response' is only available in the metadata from a > completed > Request. > > You can not 'read' the state of a control in the same sense of a > V4L2 > control. > > (Maybe this should change for some use cases, or for an abilty > to set > controls directly on a camera, which I believe Stefan has > explored, and > not through a request, but it's the current model). > > > >> > analogue-gain : Analogue gain value applied in the sensor >> > device. The value of the control specifies the gain >> > multiplier applied to all colour channels. This value cannot >> > be lower than 1.0. Setting this value means that it is >> > now fixed and the AE algorithm may not change it. Setting it >> > back to zero returns it to the control of the AE >> > algorithm. See also: exposure-time, ae-enable. Todo: Document >> > the interactions between AeEnable and setting a fixed >> > value for this control. Consider interactions with other AE >> > features, such as aperture and aperture/shutter priority >> > mode, and decide if control of which features should be >> > automatically adjusted shouldn't better be handled through a >> > separate AE mode control. >> > flags: readable, writable, >> > controllable >> > Float. Range: -3.402823e+38 - >> > 3.402823e+38 Default: 0 >> >> I'm not confident there is no range, would be nice if we can >> align to something sensible (might need yaml changes of >> course). > > Different cameras will have different ranges ... only > determinable at > runtime. We can't specify this in YAML. Though - it should > always be > positive at least! (and as you've mentioned elsewhere, if it was > machine readable to say always above 1.0 would be beneficial > too). > > > >> > awb-enable : Enable or disable the AWB. See also: >> > colour-gains. >> > flags: readable, writable, >> > controllable >> > Boolean. Default: false >> >> Should be on by default (if not, we should do that in gst ;-P). > > > Maybe that's a GST decision. It is likely use case dependent. > But the same could be said for either default ... (and 'on' > seems more > likely to be useful than 'off' in the majority case?) > > >> > awb-locked : Report the lock status of a running AWB >> > algorithm. If the AWB algorithm is locked the value shall >> > be >> > set to true, if it's converging it shall be set to false. If >> > the AWB algorithm is not running the control shall not >> > be present in the metadata control list. See also: >> > awb-enable. >> > flags: readable, writable, >> > controllable >> > Boolean. Default: false >> >> Read-only. > > Reported through metadata() same for all the other Read-only > 'control > ids' I suspect. > > > >> >> > >> > awb-mode : Specify the range of illuminants to >> > use for the AWB algorithm. The modes supported are platform >> > specific, and not all modes may be supported. >> > flags: readable, writable, >> > controllable >> > Enum "AwbMode" Default: 0, "auto" >> > (0): auto - Search >> > over the whole colour temperature >> > range. >> > (1): incandescent - >> > Incandescent AWB lamp mode. 
>> > (2): tungsten - Tungsten >> > AWB lamp mode. >> > (3): fluorescent - >> > Fluorescent AWB lamp mode. >> > (4): indoor - Indoor AWB >> > lighting mode. >> > (5): daylight - Daylight >> > AWB lighting mode. >> > (6): cloudy - Cloudy AWB >> > lighting mode. >> > (7): custom - Custom AWB >> > mode. >> >> Again, unclear how custom can be used. > > I think/suspect this was added by Raspberry Pi. We should > certainly > revisit this I think. > >> > brightness : Specify a fixed brightness parameter. >> > Positive values (up to 1.0) produce brighter images; >> > negative values (up to -1.0) produce darker images and 0.0 >> > leaves pixels unchanged. >> > flags: readable, writable, >> > controllable >> > Float. Range: -3.402823e+38 - >> > 3.402823e+38 Default: 0 >> >> So this time the range is documented, we need a machine >> readable form. > > Indeed. > > >> > >> > camera-name : Select by name which camera to use. >> > flags: readable, writable, changeable >> > only in NULL or READY state >> > String. Default: null >> > >> > colour-correction-matrix: The 3x3 matrix that converts >> > camera RGB to sRGB within the imaging pipeline. This should >> > describe the matrix that is used after pixels have been >> > white-balanced, but before any gamma transformation. The 3x3 >> > matrix is stored in conventional reading order in an array of >> > 9 floating point values. >> > flags: readable, writable, >> > controllable >> > Default: "< >" >> > GstValueArray of GValues of type >> > "gfloat" >> >> The type only allow flat list, but as its a matrix I was >> expecting >> >> "< < 1, 2, 3 >, >> < 4, 5, 6 >, >> < 7, 8, 9 > >" >> >> That being said I would be fine with flat raster data: >> >> "< 1, 2, 3, >> 4, 5, 6, >> 7, 8, 9 >" >> >> The down side is for bindings. Consider Python, the first would >> create a collection that you can access with matrix[Y][X] and >> more importantly can store into numpy.matrix. >> >> A comment for all list "< >" syntax. The usage and >> serialization syntax is not easy to guess. For matrix, if we >> had the identity matrix as default instead of an empty list, >> that would help. Otherwise we need a way to extend the doc for >> GStreamer usage. >> > > I recall various bits of work on that recently... > > But I think this is already handled through python. Stefan/Tomi > ? > > > > <snip> > >> > draft-ae-precapture-trigger: Control for AE metering >> > trigger. Currently identical to >> > ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER. Whether the camera >> > device will trigger a precapture metering sequence when it >> > processes this request. >> > flags: readable, writable, >> > controllable >> > Enum "AePrecaptureTrigger" Default: >> > 0, "idle" >> > (0): idle - The >> > trigger is idle. >> > (1): start - The >> > pre-capture AE metering is started >> > by the camera. >> > (2): cancel - The camera >> > will cancel any active or >> > completed metering sequence. The >> > AE algorithm is reset to its >> > initial state. >> >> Perhaps we can skip over draft- as this could later break our >> interface. We can also add an env to include them ? > > > Skipping them is an option. I think most of the 'draft' were > only added > to match the Android interface. But I wonder how we'll handle > other > namespaced controls like 'RPi' which I guess will be below... > > > <snip> > >> I think my comments begins to be repetitive, and if we solve >> the earlier issues >> we will solve them all. The main thing is that the yaml should >> explicitly define >> more stuff rather then relying to text. 
And using that, we can >> easily adjust > > Agreed, I think at least the yaml files could specify if a > control can > only be accessed as metadata which would simplify all of the > 'read-only' > issues. > >> GStreamer properties. A final note, not all of these will be >> supported. I was >> wondering if we should add an extra property that tells use >> which one are >> actually supported ? We don't need anything fency, a "/" >> seperate list in a >> string or similar could do. We can also go for GstStructure. > > I don't understand this part fully. Do you mean a new > control/property > that gstreamer libcamerasrc will read at runtime from libcamera? > It > already has a Camera->controls() that it can read that from. > > But you mention GstStructure - do you mean that the libcamerasrc > will > generate this ? (Perhaps from the Camera->controls()?) > >> Nicolas > > Kieran
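The check Jaslo mentions could take roughly the following shape: before queueing a request, merge only the user-set controls that the camera actually advertises and warn about the rest. This is just a sketch of the idea under discussion, not what the posted patch does; the free helper and its parameters are invented, while the ControlList, ControlInfoMap and Request calls are existing libcamera API.

    #include <gst/gst.h>

    #include <libcamera/controls.h>
    #include <libcamera/request.h>

    using namespace libcamera;

    /*
     * Sketch only: drop user-set controls the camera does not support instead
     * of letting every queued request report an error for them.
     */
    static void apply_supported_controls(const ControlList &userControls,
                                         const ControlInfoMap &infoMap,
                                         Request *request)
    {
            for (const auto &[id, value] : userControls) {
                    if (infoMap.count(id))
                            request->controls().set(id, value);
                    else
                            GST_WARNING("Control %u is not supported by the camera, dropping it", id);
            }
    }

As Kieran notes, libcamerasrc already has access to Camera::controls(), so the map could simply come from there at the point where requests are built.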
Hi Jaslo, Thank you for the patch. On Mon, Aug 05, 2024 at 11:28:37AM +0200, Jaslo Ziska wrote: > This commit implements gstreamer controls for the libcamera element by > generating the controls from the control_ids_*.yaml files using a new > gen-gst-controls.py script. The appropriate meson files are also changed > to automatically run the script when building. > > The gen-gst-controls.py script works similar to the gen-controls.py > script by parsing the control_ids_*.yaml files and generating C++ code > for each control. > For the controls to be used as gstreamer properties the type for each > control needs to be translated to the appropriate glib type and a > GEnumValue is generated for each enum control. Then a > g_object_install_property(), _get_property() and _set_property() > function is generated for each control. > The vendor controls get prefixed with "$vendor-" in the final gstreamer > property name. > > The C++ code generated by the gen-gst-controls.py script is written into > the template gstlibcamerasrc-controls.cpp.in file. The matching > gstlibcamerasrc-controls.h header defines the GstCameraControls class > which handles the installation of the gstreamer properties as well as > keeping track of the control values and setting and getting the > controls. The content of these functions is generated in the Python > script. > > Finally the libcamerasrc element itself is edited to make use of the new > GstCameraControls class. The way this works is by defining a PROP_LAST > enum variant which is passed to the installProperties() function so the > properties are defined with the appropriate offset. When getting or > setting a property PROP_LAST is subtracted from the requested property > to translate the control back into a libcamera::controls:: enum > variant. Nice explanation, that's very helpful for all of us who are not gstreamer specialists (which includes myself). > Signed-off-by: Jaslo Ziska <jaslo@ziska.de> > --- > src/gstreamer/gstlibcamera-controls.cpp.in | 46 +++ > src/gstreamer/gstlibcamera-controls.h | 36 ++ > src/gstreamer/gstlibcamerasrc.cpp | 17 +- > src/gstreamer/meson.build | 14 + > utils/gen-gst-controls.py | 398 +++++++++++++++++++++ > utils/meson.build | 1 + > 6 files changed, 509 insertions(+), 3 deletions(-) > create mode 100644 src/gstreamer/gstlibcamera-controls.cpp.in > create mode 100644 src/gstreamer/gstlibcamera-controls.h > create mode 100755 utils/gen-gst-controls.py > > diff --git a/src/gstreamer/gstlibcamera-controls.cpp.in b/src/gstreamer/gstlibcamera-controls.cpp.in > new file mode 100644 > index 00000000..ff93f5c3 > --- /dev/null > +++ b/src/gstreamer/gstlibcamera-controls.cpp.in > @@ -0,0 +1,46 @@ > +/* SPDX-License-Identifier: LGPL-2.1-or-later */ > +/* > + * Copyright (C) 2023, Collabora Ltd. > + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> > + * > + * gstlibcamera-controls.cpp - GStreamer Camera Controls You can drop the file name part and just keep the description. There were a few remaining file names in templates, I have posted a patch to drop them. > + * > + * This file is auto-generated. Do not edit. > + */ > + > +#include "gstlibcamera-controls.h" > + > +#include <libcamera/control_ids.h> > + > +using namespace libcamera; > + > +${enum_def} I've posted "[PATCH 00/10] libcamera: Improve code generation for controls" (https://patchwork.libcamera.org/project/libcamera/list/?series=4506) yesterday to convert our Python string templates to jinja2 templates. 
jinja2 allows better separation of the template logic and the data logic, making the code much more readable. Would you be able to rebase on top of that series ? I expect it will be merged soon. I'm sorry for the extra work. I had conversion to jinja2 in mind for a while, and your series triggered me to complete it, in order to share more code between the control code generation scripts. Hopefully you will like the new templates better as well. > + > +void GstCameraControls::installProperties(GObjectClass *klass, int lastPropId) > +{ > +${install_properties} > +} > + > +bool GstCameraControls::getProperty(guint propId, GValue *value, GParamSpec *pspec) > +{ > + switch (propId) { > +${get_properties} > + default: > + return false; > + } > +} > + > +bool GstCameraControls::setProperty(guint propId, const GValue *value, > + [[maybe_unused]] GParamSpec *pspec) > +{ > + switch (propId) { > +${set_properties} > + default: > + return false; > + } > +} > + > +void GstCameraControls::applyControls(std::unique_ptr<libcamera::Request> &request) > +{ > + request->controls().merge(controls_); > +} > diff --git a/src/gstreamer/gstlibcamera-controls.h b/src/gstreamer/gstlibcamera-controls.h > new file mode 100644 > index 00000000..4e1d5bf9 > --- /dev/null > +++ b/src/gstreamer/gstlibcamera-controls.h > @@ -0,0 +1,36 @@ > +/* SPDX-License-Identifier: LGPL-2.1-or-later */ > +/* > + * Copyright (C) 2023, Collabora Ltd. > + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> > + * > + * gstlibcamera-controls.h - GStreamer Camera Controls Here too please drop the file name and keep the description only. > + */ > + > +#pragma once #include <memory> for std::unique_ptr<> > + > +#include <libcamera/controls.h> > +#include <libcamera/request.h> > + > +#include "gstlibcamerasrc.h" > + > +namespace libcamera { > + > +class GstCameraControls > +{ > +public: > + GstCameraControls() {}; > + ~GstCameraControls() {}; GstCameraControls() = default; ~GstCameraControls() = default; or just drop them, as they will be implicitly declared and defined by default. 
> + > + static void installProperties(GObjectClass *klass, int lastProp); > + > + bool getProperty(guint propId, GValue *value, GParamSpec *pspec); > + bool setProperty(guint propId, const GValue *value, GParamSpec *pspec); > + > + void applyControls(std::unique_ptr<libcamera::Request> &request); > + > +private: > + /* set of user modified controls */ > + ControlList controls_; > +}; > + > +} /* namespace libcamera */ > diff --git a/src/gstreamer/gstlibcamerasrc.cpp b/src/gstreamer/gstlibcamerasrc.cpp > index 5a3e2989..85dab67f 100644 > --- a/src/gstreamer/gstlibcamerasrc.cpp > +++ b/src/gstreamer/gstlibcamerasrc.cpp > @@ -37,10 +37,11 @@ > > #include <gst/base/base.h> > > +#include "gstlibcamera-controls.h" > +#include "gstlibcamera-utils.h" > #include "gstlibcameraallocator.h" > #include "gstlibcamerapad.h" > #include "gstlibcamerapool.h" > -#include "gstlibcamera-utils.h" > > using namespace libcamera; > > @@ -128,6 +129,7 @@ struct GstLibcameraSrcState { > > ControlList initControls_; > guint group_id_; > + GstCameraControls controls_; > > int queueRequest(); > void requestCompleted(Request *request); > @@ -153,6 +155,7 @@ struct _GstLibcameraSrc { > enum { > PROP_0, > PROP_CAMERA_NAME, > + PROP_LAST > }; > > static void gst_libcamera_src_child_proxy_init(gpointer g_iface, > @@ -183,6 +186,9 @@ int GstLibcameraSrcState::queueRequest() > if (!request) > return -ENOMEM; > > + /* Apply controls */ > + controls_.applyControls(request); > + > std::unique_ptr<RequestWrap> wrap = > std::make_unique<RequestWrap>(std::move(request)); > > @@ -722,6 +728,7 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id, > { > GLibLocker lock(GST_OBJECT(object)); > GstLibcameraSrc *self = GST_LIBCAMERA_SRC(object); > + GstLibcameraSrcState *state = self->state; > > switch (prop_id) { > case PROP_CAMERA_NAME: > @@ -729,7 +736,8 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id, > self->camera_name = g_value_dup_string(value); > break; > default: > - G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); > + if (!state->controls_.setProperty(prop_id - PROP_LAST, value, pspec)) > + G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); > break; > } > } > @@ -740,13 +748,15 @@ gst_libcamera_src_get_property(GObject *object, guint prop_id, GValue *value, > { > GLibLocker lock(GST_OBJECT(object)); > GstLibcameraSrc *self = GST_LIBCAMERA_SRC(object); > + GstLibcameraSrcState *state = self->state; > > switch (prop_id) { > case PROP_CAMERA_NAME: > g_value_set_string(value, self->camera_name); > break; > default: > - G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); > + if (!state->controls_.getProperty(prop_id - PROP_LAST, value, pspec)) > + G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); > break; > } > } > @@ -947,6 +957,7 @@ gst_libcamera_src_class_init(GstLibcameraSrcClass *klass) > | G_PARAM_STATIC_STRINGS)); > g_object_class_install_property(object_class, PROP_CAMERA_NAME, spec); > > + GstCameraControls::installProperties(object_class, PROP_LAST); > } > > /* GstChildProxy implementation */ > diff --git a/src/gstreamer/meson.build b/src/gstreamer/meson.build > index c2a01e7b..e6c20124 100644 > --- a/src/gstreamer/meson.build > +++ b/src/gstreamer/meson.build > @@ -25,6 +25,20 @@ libcamera_gst_sources = [ > 'gstlibcamerasrc.cpp', > ] > > +# Generate gstreamer control properties > + > +gen_gst_controls_input_files = [] > +gen_gst_controls_template = files('gstlibcamera-controls.cpp.in') > +foreach file : controls_files > + gen_gst_controls_input_files += 
files('../libcamera/' + file) > +endforeach > + > +libcamera_gst_sources += custom_target('gstlibcamera-controls.cpp', > + input : gen_gst_controls_input_files, > + output : 'gstlibcamera-controls.cpp', > + command : [gen_gst_controls, '-o', '@OUTPUT@', > + '-t', gen_gst_controls_template, '@INPUT@']) > + > libcamera_gst_cpp_args = [ > '-DVERSION="@0@"'.format(libcamera_git_version), > '-DPACKAGE="@0@"'.format(meson.project_name()), > diff --git a/utils/gen-gst-controls.py b/utils/gen-gst-controls.py > new file mode 100755 > index 00000000..d0c12b50 > --- /dev/null > +++ b/utils/gen-gst-controls.py > @@ -0,0 +1,398 @@ > +#!/usr/bin/env python3 > + > +import argparse > +import re > +import string > +import sys > +import yaml > + > + > +class ControlEnum(object): > + def __init__(self, data): > + self.__data = data > + > + @property > + def description(self): > + """The enum description""" > + return self.__data.get('description') > + > + @property > + def name(self): > + """The enum name""" > + return self.__data.get('name') > + > + @property > + def value(self): > + """The enum value""" > + return self.__data.get('value') > + > + > +class Control(object): > + def __init__(self, name, data, vendor): > + self.__name = name > + self.__data = data > + self.__enum_values = None > + self.__size = None > + self.__vendor = vendor > + > + enum_values = data.get('enum') > + if enum_values is not None: > + self.__enum_values = [ControlEnum(enum) for enum in enum_values] > + > + size = self.__data.get('size') > + if size is not None: > + if len(size) == 0: > + raise RuntimeError(f'Control `{self.__name}` size must have at least one dimension') > + > + # Compute the total number of elements in the array. If any of the > + # array dimension is a string, the array is variable-sized. 
> + num_elems = 1 > + for dim in size: > + if type(dim) is str: > + num_elems = 0 > + break > + > + dim = int(dim) > + if dim <= 0: > + raise RuntimeError(f'Control `{self.__name}` size must have positive values only') > + > + num_elems *= dim > + > + self.__size = num_elems > + > + @property > + def description(self): > + """The control description""" > + return self.__data.get('description') > + > + @property > + def enum_values(self): > + """The enum values, if the control is an enumeration""" > + if self.__enum_values is None: > + return > + for enum in self.__enum_values: > + yield enum > + > + @property > + def is_enum(self): > + """Is the control an enumeration""" > + return self.__enum_values is not None > + > + @property > + def vendor(self): > + """The vendor string, or None""" > + return self.__vendor > + > + @property > + def name(self): > + """The control name (CamelCase)""" > + return self.__name > + > + @property > + def type(self): > + typ = self.__data.get('type') > + size = self.__data.get('size') > + > + if typ == 'string': > + return 'std::string' > + > + if self.__size is None: > + return typ > + > + if self.__size: > + return f"Span<const {typ}, {self.__size}>" > + else: > + return f"Span<const {typ}>" > + > + @property > + def element_type(self): > + typ = self.__data.get('type') > + return typ > + > + @property > + def size(self): > + return self.__size > + > + > +def find_common_prefix(strings): > + prefix = strings[0] > + > + for string in strings[1:]: > + while string[:len(prefix)] != prefix and prefix: > + prefix = prefix[:len(prefix) - 1] > + if not prefix: > + break > + > + return prefix > + > + > +def format_description(description, indent = 0): > + # Substitute doxygen keywords \sa (see also) and \todo > + description = re.sub(r'\\sa((?: \w+)+)', > + lambda match: 'See also: ' + ', '.join(map(kebab_case, match.group(1).strip().split(' '))) + '.', description) > + description = re.sub(r'\\todo', 'Todo:', description) > + > + description = description.strip().split('\n') > + return '\n'.join([indent * '\t' + '"' + line.replace('\\', r'\\').replace('"', r'\"') + ' "' for line in description if line]).rstrip() > + > + > +def snake_case(s): > + return ''.join([c.isupper() and ('_' + c.lower()) or c for c in s]).strip('_') > + > + > +def kebab_case(s): > + return snake_case(s).replace('_', '-') > + > + > +def indent(s, n): > + lines = s.split('\n') > + return '\n'.join([n * '\t' + line for line in lines]) > + > + > +def generate_cpp(controls): > + # Beginning of a GEnumValue definition for enum controls > + enum_values_start = string.Template('static const GEnumValue ${name_snake_case}_types[] = {') > + # Definition of the GEnumValue variant for each enum control variant > + # Because the description might have multiple lines it will get indented below > + enum_values_values = string.Template('''{ > +\tcontrols${vendor_namespace}::${name}, > +${description}, > +\t"${nick}" > +},''') > + # End of a GEnumValue definition and definition of the matching _get_type() function > + enum_values_end = string.Template('''\t{0, NULL, NULL} > +}; > + > +#define TYPE_${name_upper} (${name_snake_case}_get_type()) > +static GType ${name_snake_case}_get_type(void) > +{ > +\tstatic GType ${name_snake_case}_type = 0; > + > +\tif (!${name_snake_case}_type) > +\t\t${name_snake_case}_type = g_enum_register_static("${name}", ${name_snake_case}_types); > + > +\treturn ${name_snake_case}_type; > +} > +''') > + > + # Creation of the type spec for the different types of controls > + # The 
description (and the element_spec for the array) might have multiple lines and will get indented below > + spec_array = string.Template('''gst_param_spec_array( > +\t"${spec_name}", > +\t"${nick}", > +${description}, > +${element_spec}, > +\t(GParamFlags) (GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS) > +)''') > + spec_bool = string.Template('''g_param_spec_boolean( > +\t"${spec_name}", > +\t"${nick}", > +${description}, > +\t${default}, > +\t(GParamFlags) (GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS) > +)''') > + spec_enum = string.Template('''g_param_spec_enum( > +\t"${spec_name}", > +\t"${nick}", > +${description}, > +\tTYPE_${enum}, > +\t${default}, > +\t(GParamFlags) (GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS) > +)''') > + spec_numeric = string.Template('''g_param_spec_${gtype}( > +\t"${spec_name}", > +\t"${nick}", > +${description}, > +\t${min}, > +\t${max}, > +\t${default}, > +\t(GParamFlags) (GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS) > +)''') > + > + # The g_object_class_install_property() function call for each control > + install_property = string.Template('''\tg_object_class_install_property( > +\t\tklass, > +\t\tlastPropId + controls${vendor_namespace}::${name_upper}, > +${spec} > +\t);''') > + > + # The _get_property() switch cases for each control > + get_property = string.Template('''case controls${vendor_namespace}::${name_upper}: { > +\tauto val = controls_.get(controls${vendor_namespace}::${name}); > +\tauto spec = G_PARAM_SPEC_${gtype_upper}(pspec); > +\tg_value_set_${gtype}(value, val.value_or(spec->default_value)); > +\treturn true; > +}''') > + get_array_property = string.Template('''case controls${vendor_namespace}::${name_upper}: { > +\tauto val = controls_.get(controls${vendor_namespace}::${name}); > +\tif (val) { > +\t\tfor (size_t i = 0; i < val->size(); ++i) { > +\t\t\tGValue v = G_VALUE_INIT; > +\t\t\tg_value_init(&v, G_TYPE_${gtype_upper}); > +\t\t\tg_value_set_${gtype}(&v, (*val)[i]); > +\t\t\tgst_value_array_append_and_take_value(value, &v); > +\t\t} > +\t} > +\treturn true; > +}''') > + > + # The _set_property() switch cases for each control > + set_property = string.Template('''case controls${vendor_namespace}::${name_upper}: > +\tcontrols_.set(controls${vendor_namespace}::${name}, g_value_get_${gtype}(value)); > +\treturn true;''') > + set_array_property = string.Template('''case controls${vendor_namespace}::${name_upper}: { > +\tsize_t size = gst_value_array_get_size(value); > +\tstd::vector<${type}> val(size); > +\tfor (size_t i = 0; i < size; ++i) { > +\t\tconst GValue *v = gst_value_array_get_value(value, i); > +\t\tval[i] = g_value_get_${gtype}(v); > +\t} > +\tcontrols_.set(controls${vendor_namespace}::${name}, Span<const ${type}, ${size}>(val.data(), size)); > +\treturn true; > +}''') > + > + enum_def = [] > + install_properties = [] > + get_properties = [] > + set_properties = [] > + > + for ctrl in controls: > + is_array = ctrl.size is not None > + size = ctrl.size > + if size == 0: > + size = 'dynamic_extent' > + > + # Determine the matching glib type for each C++ type used in the controls > + gtype = '' > + if ctrl.is_enum: > + gtype = 'enum' > + elif ctrl.element_type == 'bool': > + gtype = 'boolean' > + elif ctrl.element_type == 'float': > + gtype = 'float' > + elif ctrl.element_type == 'int32_t': > + gtype = 'int' > + elif ctrl.element_type == 'int64_t': > + gtype = 'int64' > + elif ctrl.element_type == 'uint8_t': > + gtype = 'uchar' > + elif 
ctrl.element_type == 'Rectangle': > + # TODO: Handle Rectangle > + continue > + else: > + raise RuntimeError(f'The type `{ctrl.element_type}` is unknown') > + > + vendor_prefix = '' > + vendor_namespace = '' > + if ctrl.vendor != 'libcamera': > + vendor_prefix = ctrl.vendor + '-' > + vendor_namespace = '::' + ctrl.vendor > + > + name_snake_case = snake_case(ctrl.name) > + name_upper = name_snake_case.upper() > + > + info = { > + 'name': ctrl.name, > + 'vendor_namespace': vendor_namespace, > + 'name_snake_case': name_snake_case, > + 'name_upper': name_upper, > + 'spec_name': vendor_prefix + kebab_case(ctrl.name), > + 'nick': ''.join([c.isupper() and (' ' + c) or c for c in ctrl.name]).strip(' '), > + 'description': format_description(ctrl.description, indent=1), > + 'gtype': gtype, > + 'gtype_upper': gtype.upper(), > + 'type': ctrl.element_type, > + 'size': size, > + } > + > + if ctrl.is_enum: > + enum_def.append(enum_values_start.substitute(info)) > + > + common_prefix = find_common_prefix([enum.name for enum in ctrl.enum_values]) > + > + for enum in ctrl.enum_values: > + values_info = { > + 'name': enum.name, > + 'vendor_namespace': vendor_namespace, > + 'description': format_description(enum.description, indent=1), > + 'nick': kebab_case(enum.name.removeprefix(common_prefix)), > + } > + enum_def.append(indent(enum_values_values.substitute(values_info), 1)) > + > + enum_def.append(enum_values_end.substitute(info)) > + > + spec = '' > + if ctrl.is_enum: > + spec = spec_enum.substitute({'enum': name_upper, 'default': 0, **info}) > + elif gtype == 'boolean': > + spec = spec_bool.substitute({'default': 'false', **info}) > + elif gtype == 'float': > + spec = spec_numeric.substitute({'min': '-G_MAXFLOAT', 'max': 'G_MAXFLOAT', 'default': 0, **info}) > + elif gtype == 'int': > + spec = spec_numeric.substitute({'min': 'G_MININT', 'max': 'G_MAXINT', 'default': 0, **info}) > + elif gtype == 'int64': > + spec = spec_numeric.substitute({'min': 'G_MININT64', 'max': 'G_MAXINT64', 'default': 0, **info}) > + elif gtype == 'uchar': > + spec = spec_numeric.substitute({'min': '0', 'max': 'G_MAXUINT8', 'default': 0, **info}) > + > + if is_array: > + spec = spec_array.substitute({'element_spec': indent(spec, 1), **info}) > + > + install_properties.append(install_property.substitute({'spec': indent(spec, 2), **info})) > + > + if is_array: > + get_properties.append(indent(get_array_property.substitute(info), 1)) > + set_properties.append(indent(set_array_property.substitute(info), 1)) > + else: > + get_properties.append(indent(get_property.substitute(info), 1)) > + set_properties.append(indent(set_property.substitute(info), 1)) > + > + return { > + 'enum_def': '\n'.join(enum_def), > + 'install_properties': '\n'.join(install_properties), > + 'get_properties': '\n'.join(get_properties), > + 'set_properties': '\n'.join(set_properties), > + } > + > + > +def fill_template(template, data): > + template = open(template, 'rb').read() > + template = template.decode('utf-8') > + template = string.Template(template) > + return template.substitute(data) > + > + > +def main(argv): > + # Parse command line arguments > + parser = argparse.ArgumentParser() > + parser.add_argument('--output', '-o', metavar='file', type=str, > + help='Output file name. 
Defaults to standard output if not specified.') > + parser.add_argument('--template', '-t', dest='template', type=str, required=True, > + help='Template file name.') > + parser.add_argument('input', type=str, nargs='+', > + help='Input file name.') > + args = parser.parse_args(argv[1:]) > + > + controls = [] > + for input in args.input: > + data = open(input, 'rb').read() > + vendor = yaml.safe_load(data)['vendor'] > + ctrls = yaml.safe_load(data)['controls'] > + controls = controls + [Control(*ctrl.popitem(), vendor) for ctrl in ctrls] > + > + data = generate_cpp(controls) > + > + data = fill_template(args.template, data) > + > + if args.output: > + output = open(args.output, 'wb') > + output.write(data.encode('utf-8')) > + output.close() > + else: > + sys.stdout.write(data) > + > + return 0 > + > + > +if __name__ == '__main__': > + sys.exit(main(sys.argv)) > diff --git a/utils/meson.build b/utils/meson.build > index 8e28ada7..89597f7f 100644 > --- a/utils/meson.build > +++ b/utils/meson.build > @@ -9,6 +9,7 @@ py_modules += ['yaml'] > gen_controls = files('gen-controls.py') > gen_formats = files('gen-formats.py') > gen_header = files('gen-header.sh') > +gen_gst_controls = files('gen-gst-controls.py') > > ## Module signing > gen_ipa_priv_key = files('gen-ipa-priv-key.sh')
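As a usage illustration (not part of the patch itself), the generated controls behave like any other GObject property once installed; a minimal sketch, assuming the element is built with this series applied and using the ae-enable and brightness properties visible in the gst-inspect-1.0 output quoted in this thread:

#include <gst/gst.h>

int main(int argc, char **argv)
{
	gst_init(&argc, &argv);

	GstElement *src = gst_element_factory_make("libcamerasrc", NULL);
	if (!src)
		return 1;

	/* Generated control properties are set like any other property. */
	g_object_set(src, "ae-enable", TRUE, "brightness", 0.5, NULL);

	gst_object_unref(src);

	return 0;
}

The same properties are reachable from gst-launch-1.0, e.g. "libcamerasrc ae-enable=true brightness=0.5 ! ...", since they go through the regular _set_property() path shown in gstlibcamerasrc.cpp above.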
On Thu, Aug 08, 2024 at 02:25:40PM +0200, Jaslo Ziska wrote: > Nicolas Dufresne writes: > > Hi, > > > > I started inspecting the generated documentation. It is > > generally hard to get a > > perfectly clean results, but let me comment few things. We can > > at least try some > > improvement. > > > >> Factory Details: > >> Rank primary (256) > >> Long-name libcamera Source > >> Klass Source/Video > >> Description Linux Camera source using libcamera > >> Author Nicolas Dufresne > >> <nicolas.dufresne@collabora.com> > > > > Thanks for this one! > >> > >> Plugin Details: > >> Name libcamera > >> Description libcamera capture plugin > >> Filename > >> /home/nicolas/Sources/libcamera/build/src/gstreamer/libgstlibcamera.so > >> Version 0.3.1+20-80a202bf > >> License LGPL > >> Source module libcamera > >> Binary package libcamera > >> Origin URL https://libcamera.org > >> > >> GObject > >> +----GInitiallyUnowned > >> +----GstObject > >> +----GstElement > >> +----GstLibcameraSrc > >> > >> Implemented Interfaces: > >> GstChildProxy > >> > >> Pad Templates: > >> SRC template: 'src' > >> Availability: Always > >> Capabilities: > >> video/x-raw > >> image/jpeg > >> video/x-bayer > >> Type: GstLibcameraPad > >> Pad Properties: > >> > >> stream-role : The selected stream role > >> flags: readable, writable, > >> changeable only in NULL or READY > >> state > >> Enum "GstLibcameraStreamRole" > >> Default: 2, "video-recording" > >> (1): still-capture - > >> libcamera::StillCapture > >> (2): video-recording - > >> libcamera::VideoRecording > >> (3): view-finder - > >> libcamera::Viewfinder > >> > >> > >> SRC template: 'src_%u' > >> Availability: On request > >> Capabilities: > >> video/x-raw > >> image/jpeg > >> video/x-bayer > >> Type: GstLibcameraPad > >> Pad Properties: > >> > >> stream-role : The selected stream role > >> flags: readable, writable, > >> changeable only in NULL or READY > >> state > >> Enum "GstLibcameraStreamRole" > >> Default: 2, "video-recording" > >> (1): still-capture - > >> libcamera::StillCapture > >> (2): video-recording - > >> libcamera::VideoRecording > >> (3): view-finder - > >> libcamera::Viewfinder > >> > >> > >> Element has no clocking capabilities. > >> Element has no URI handling capabilities. > >> > >> Pads: > >> SRC: 'src' > >> Pad Template: 'src' > >> > >> Element Properties: > >> > >> ae-constraint-mode : Specify a constraint mode for the AE > >> algorithm to use. These determine how the measured scene > >> brightness is adjusted to reach the desired target exposure. > >> Constraint modes may be platform specific, and not all > >> constraint modes may be supported. > > > > Inspect generally allow for pretty short description, but I do > > like having the full description, specially that as long as this > > element lives here, we are unlikely to pull in the GStreamer doc > > generator for it. Might be worth giving a try to line breaks, > > though. > > I wasn't sure if the whole description should be included but I > also like having the whole description there as one can then use > the gstreamer element without consulting the libcamera > documentation. > > I looked at a few gstreamer elements (h264parse and webrtcbin) > with long descriptions (it was surprisingly hard to find some) and > they both did not include line breaks in the description. Should I > really try to add some? I've sent a patch that fixes formatting issues in the YAML files. 
You should now be able to extract multi-line documentation with the line break being preserved, and the first line should be a short description. > >> flags: readable, writable, controllable > >> Enum "AeConstraintMode" Default: 0, > >> "normal" > >> (0): normal - Default constraint mode. This mode aims to > >> balance the exposure of different parts of the image so as to > >> reach a reasonable average level. However, highlights in the > >> image may appear over-exposed and lowlights may appear > >> under-exposed. > >> (1): highlight - Highlight > >> constraint mode. This mode adjusts > >> the exposure levels in order to try > >> and avoid over-exposing the > >> brightest parts (highlights) of an > >> image. Other non-highlight parts of > >> the image may appear under-exposed. > >> (2): shadows - Shadows > >> constraint mode. This mode adjusts > >> the exposure levels in order to try > >> and avoid under-exposing the dark > >> parts (shadows) of an image. Other > >> normally exposed parts of the image > >> may appear over-exposed. > >> (3): custom - Custom > >> constraint mode. > > > > Same here, would be nice to at least "try" something, but this > > is part of things that may of may not be easy. Everything else > > seems good. > > > >> > >> ae-enable : Enable or disable the AE. See also: > >> exposure-time, analogue-gain. > >> flags: readable, writable, controllable > >> Boolean. Default: false > >> > >> ae-exposure-mode : Specify an exposure mode for the AE > >> algorithm to use. These specify how the desired total exposure > >> is divided between the shutter time and the sensor's analogue > >> gain. The exposure modes are platform specific, and not > >> all exposure modes may be supported. > >> flags: readable, writable, controllable > >> Enum "AeExposureMode" Default: 0, > >> "normal" > >> (0): normal - Default > >> exposure mode. > >> (1): short - Exposure > >> mode allowing only short exposure > >> times. > >> (2): long - Exposure > >> mode allowing long exposure times. > >> (3): custom - Custom > >> exposure mode. > >> > >> ae-flicker-detected : Flicker period detected in microseconds. > >> The value reported here indicates the currently > >> detected flicker period, or zero if no flicker at all is > >> detected. When AeFlickerMode is set to FlickerAuto, there may > >> be a period during which the value reported here remains zero. > >> Once a non-zero value is reported, then this is the > >> flicker period that has been detected and is now being > >> cancelled. In the case of 50Hz mains flicker, the value would > >> be > >> 10000 (corresponding to 100Hz), or 8333 (120Hz) for 60Hz mains > >> flicker. It is implementation dependent whether the > >> system can continue to detect flicker of different periods when > >> another frequency is already being cancelled. See also: > >> ae-flicker-mode. > >> flags: readable, writable, controllable > >> Integer. Range: -2147483648 - > >> 2147483647 Default: 0 > >> > >> ae-flicker-mode : Set the flicker mode, which determines > >> whether, and how, the AGC/AEC algorithm attempts to hide > >> flicker effects caused by the duty cycle of artificial > >> lighting. Although implementation dependent, many algorithms > >> for > >> "flicker avoidance" work by restricting this exposure time to > >> integer multiples of the cycle period, wherever possible. > >> Implementations may not support all of the flicker modes listed > >> below. 
By default the system will start in FlickerAuto > >> mode if this is supported, otherwise the flicker mode will be > >> set to FlickerOff. > >> flags: readable, writable, controllable > >> Enum "AeFlickerMode" Default: 0, "off" > > > > Is that accurate or accidental default ? > > All the control defaults are set to zero as a default in the spec > at the moment. That's why the first enum variant is the default > for all enums. So this is kind of accidental. > > The thing is that (at the moment, that can of course be changed) > none of these default controls even get applied to requests. Only > once you do g_object_set_property() will the control be added to > the ControlList. > > If we would want to have custom defaults for controls then these > would need to be specified somewhere either in the > control_ids_*.yaml or somewhere else. > > >> (0): off - No flicker > >> avoidance is performed. > >> (1): manual - Manual > >> flicker avoidance. Suppress flicker > >> effects caused by lighting running > >> with a period specified by the > >> AeFlickerPeriod control. See also: > >> ae-flicker-period. > >> (2): auto - Automatic flicker period detection and avoidance. > >> The system will automatically determine the most likely > >> value of flicker period, and avoid flicker of this frequency. > >> Once flicker is being corrected, it is implementation > >> dependent whether the system is still able to detect a change > >> in the flicker period. See also: ae-flicker-detected. > >> > >> ae-flicker-period : Manual flicker period in microseconds. This > >> value sets the current flicker period to avoid. It is > >> used when AeFlickerMode is set to FlickerManual. To cancel 50Hz > >> mains flicker, this should be set to 10000 > >> (corresponding to 100Hz), or 8333 (120Hz) for 60Hz mains. > >> Setting the mode to FlickerManual when no AeFlickerPeriod has > >> ever been set means that no flicker cancellation occurs (until > >> the value of this control is updated). Switching to modes > >> other than FlickerManual has no effect on the value of the > >> AeFlickerPeriod control. See also: ae-flicker-mode. > >> flags: readable, writable, controllable > >> Integer. Range: -2147483648 - > >> 2147483647 Default: 0 > >> > >> ae-locked : Report the lock status of a running AE algorithm. > >> If the AE algorithm is locked the value shall be set to > >> true, if it's converging it shall be set to false. If the AE > >> algorithm is not running the control shall not be present > >> in the metadata control list. See also: ae-enable. > >> flags: readable, writable, controllable > >> Boolean. Default: false > > > > As this one is a status, it should only be readable, without the > > writable/controllable. If I'm not mistaken, you get this value > > withing the finalized request, would be nice enhancement to > > notify our users of the change using g_notify() and/or bus > > messages. > > The readable/writeable situation can hopefully be solved by adding > some parsable information to the yaml. If someone wants to give it a try I'll be happy to review patches. I've exhausted my spare time with the YAML formatting cleanups and code generator reworks the past few days. > A notification about a change would be nice. Maybe these type of > controls can also be marked in the yaml somehow. All controls should be reported in metadata, so that would be lots of notifications. I'm not sure how to best handle that on the GStreamer side. > >> ae-metering-mode : Specify a metering mode for the AE > >> algorithm to use. 
The metering modes determine which parts of > >> the image are used to determine the scene brightness. > >> Metering modes may be platform specific and not all metering > >> modes may be supported. > >> flags: readable, writable, controllable > >> Enum "AeMeteringMode" Default: 0, > >> "centre-weighted" > >> (0): centre-weighted - > >> Centre-weighted metering mode. > >> (1): spot - Spot > >> metering mode. > >> (2): matrix - Matrix > >> metering mode. > >> (3): custom - Custom > >> metering mode. > > > > From the doc, this seems pretty obscure. Shouldn't "custom" > > comes with another control ? Perhaps we introduce a mask to hide > > the controls that are not yet usable or that the usage isn't > > obvious and may need special case or rework ? > > Hmmm yeah I don't understand this either, maybe custom is hardware > specific? Stuff like this could be hidden of course. This was introduced for Raspberry Pi, the idea is to let OEMs define custom modes. At the moment we support a single custom mode. I think we need to rework this mechanism and come up with something better. > >> af-metering : Instruct the AF algorithm how it should > >> decide which parts of the image should be used to measure > >> focus. > >> flags: readable, writable, controllable > >> Enum "AfMetering" Default: 0, "auto" > >> (0): auto - The AF > >> algorithm should decide for itself > >> where it will measure focus. > >> (1): windows - The AF > >> algorithm should use the rectangles > >> defined by the AfWindows control to > >> measure focus. If no windows are > >> specified the behaviour is platform > >> dependent. > >> > >> af-mode : Control to set the mode of the AF > >> (autofocus) algorithm. An implementation may choose not to > >> implement all the modes. > >> flags: readable, writable, controllable > >> Enum "AfMode" Default: 0, "manual" > > > > Is that appropriate default ? Of course does not matter if there > > is no focus on the target. I wonder how you get to know if focus > > is there or not ... > > > >> (0): manual - The AF algorithm is in manual mode. In this mode > >> it will never perform any action nor move the lens of > >> its own accord, but an application can specify the desired lens > >> position using the LensPosition control. In this mode > >> the AfState will always report AfStateIdle. If the camera is > >> started in AfModeManual, it will move the focus lens to the > >> position specified by the LensPosition control. This mode is > >> the recommended default value for the AfMode control. > >> External cameras (as reported by the Location property set to > >> CameraLocationExternal) may use a different default value. > >> (1): auto - The AF algorithm is in auto mode. This means that > >> the algorithm will never move the lens or change state > >> unless the AfTrigger control is used. The AfTrigger control can > >> be used to initiate a focus scan, the results of which > >> will be reported by AfState. If the autofocus algorithm is > >> moved from AfModeAuto to another mode while a scan is in > >> progress, the scan is cancelled immediately, without waiting > >> for the scan to finish. When first entering this mode the > >> AfState will report AfStateIdle. When a trigger control is > >> sent, AfState will report AfStateScanning for a period before > >> spontaneously changing to AfStateFocused or AfStateFailed, > >> depending on the outcome of the scan. It will remain in this > >> state until another scan is initiated by the AfTrigger control. 
> >> If a scan is cancelled (without changing to another > >> mode), AfState will return to AfStateIdle. > >> (2): continuous - The AF algorithm is in continuous mode. This > >> means that the lens can re-start a scan spontaneously > >> at any moment, without any user intervention. The AfState still > >> reports whether the algorithm is currently scanning or > >> not, though the application has no ability to initiate or > >> cancel scans, nor to move the lens for itself. However, > >> applications can pause the AF algorithm from continuously > >> scanning by using the AfPause control. This allows video or > >> still images to be captured whilst guaranteeing that the focus > >> is fixed. When set to AfModeContinuous, the system will > >> immediately initiate a scan so AfState will report > >> AfStateScanning, and will settle on one of AfStateFocused or > >> AfStateFailed, depending on the scan result. > >> > >> af-pause : This control has no effect except when > >> in continuous autofocus mode (AfModeContinuous). It can be > >> used to pause any lens movements while (for example) images > >> are captured. The algorithm remains inactive until it is > >> instructed to resume. > >> flags: readable, writable, controllable > >> Enum "AfPause" Default: 0, "immediate" > >> (0): immediate - Pause the continuous autofocus algorithm > >> immediately, whether or not any kind of scan is underway. > >> AfPauseState will subsequently report AfPauseStatePaused. > >> AfState may report any of AfStateScanning, AfStateFocused or > >> AfStateFailed, depending on the algorithm's state when it > >> received this control. > >> (1): deferred - This is similar to AfPauseImmediate, and if the > >> AfState is currently reporting AfStateFocused or > >> AfStateFailed it will remain in that state and AfPauseState > >> will report AfPauseStatePaused. However, if the algorithm is > >> scanning (AfStateScanning), AfPauseState will report > >> AfPauseStatePausing until the scan is finished, at which point > >> AfState will report one of AfStateFocused or AfStateFailed, and > >> AfPauseState will change to AfPauseStatePaused. > >> (2): resume - Resume > >> continuous autofocus operation. The > >> algorithm starts again from exactly > >> where it left off, and AfPauseState > >> will report AfPauseStateRunning. > >> > >> af-pause-state : Only applicable in continuous > >> (AfModeContinuous) mode, this reports whether the algorithm is > >> currently running, paused or pausing (that is, will pause as > >> soon as any in-progress scan completes). Any change to > >> AfMode will cause AfPauseStateRunning to be reported. > >> flags: readable, writable, controllable > > > > Does not seem writable. > > > >> Enum "AfPauseState" Default: 0, > >> "running" > >> (0): running - Continuous > >> AF is running and the algorithm may > >> restart a scan spontaneously. > >> (1): pausing - Continuous AF has been sent an AfPauseDeferred > >> control, and will pause as soon as any in-progress scan > >> completes (and then report AfPauseStatePaused). No new scans > >> will be start spontaneously until the AfPauseResume control > >> is sent. > >> (2): paused - Continuous > >> AF is paused. No further state > >> changes or lens movements will occur > >> until the AfPauseResume control is > >> sent. > >> > >> af-range : Control to set the range of focus > >> distances that is scanned. An implementation may choose not > >> to implement all the options here. 
> >> flags: readable, writable, controllable > >> Enum "AfRange" Default: 0, "normal" > >> (0): normal - A wide range > >> of focus distances is scanned, all > >> the way from infinity down to close > >> distances, though depending on the > >> implementation, possibly not > >> including the very closest macro > >> positions. > >> (1): macro - Only close > >> distances are scanned. > >> (2): full - The full > >> range of focus distances is scanned > >> just as with AfRangeNormal but this > >> time including the very closest > >> macro positions. > >> > >> af-speed : Control that determines whether the AF algorithm is > >> to move the lens as quickly as possible or more > >> steadily. For example, during video recording it may be > >> desirable not to move the lens too abruptly, but when in a > >> preview mode (waiting for a still capture) it may be helpful to > >> move the lens as quickly as is reasonably possible. > >> flags: readable, writable, controllable > >> Enum "AfSpeed" Default: 0, "normal" > >> (0): normal - Move the > >> lens at its usual speed. > >> (1): fast - Move the > >> lens more quickly. > >> > >> af-state : Reports the current state of the AF algorithm in > >> conjunction with the reported AfMode value and (in > >> continuous AF mode) the AfPauseState value. The possible state > >> changes are described below, though we note the following > >> state transitions that occur when the AfMode is changed. If the > >> AfMode is set to AfModeManual, then the AfState will > >> always report AfStateIdle (even if the lens is subsequently > >> moved). Changing to the AfModeManual state does not initiate > >> any lens movement. If the AfMode is set to AfModeAuto then the > >> AfState will report AfStateIdle. However, if AfModeAuto > >> and AfTriggerStart are sent together then AfState will omit > >> AfStateIdle and move straight to AfStateScanning (and start > >> a scan). If the AfMode is set to AfModeContinuous then the > >> AfState will initially report AfStateScanning. > >> flags: readable, writable, controllable > > > > Seems read-only. > > > >> Enum "AfState" Default: 0, "idle" > >> (0): idle - The AF > >> algorithm is in manual mode > >> (AfModeManual) or in auto mode > >> (AfModeAuto) and a scan has not yet > >> been triggered, or an in-progress > >> scan was cancelled. > >> (1): scanning - The AF algorithm is in auto mode (AfModeAuto), > >> and a scan has been started using the AfTrigger > >> control. The scan can be cancelled by sending AfTriggerCancel > >> at which point the algorithm will either move back to > >> AfStateIdle or, if the scan actually completes before the > >> cancel request is processed, to one of AfStateFocused or > >> AfStateFailed. Alternatively the AF algorithm could be in > >> continuous mode (AfModeContinuous) at which point it may enter > >> this state spontaneously whenever it determines that a rescan > >> is needed. > >> (2): focused - The AF > >> algorithm is in auto (AfModeAuto) or > >> continuous (AfModeContinuous) mode > >> and a scan has completed with the > >> result that the algorithm believes > >> the image is now in focus. > >> (3): failed - The AF > >> algorithm is in auto (AfModeAuto) or > >> continuous (AfModeContinuous) mode > >> and a scan has completed with the > >> result that the algorithm did not > >> find a good focus position. > >> > >> af-trigger : This control starts an autofocus scan > >> when AfMode is set to AfModeAuto, and can also be used to > >> terminate a scan early. 
It is ignored if AfMode is set to > >> AfModeManual or AfModeContinuous. > >> flags: readable, writable, controllable > >> Enum "AfTrigger" Default: 0, "start" > >> (0): start - Start an AF > >> scan. Ignored if a scan is in > >> progress. > >> (1): cancel - Cancel an AF > >> scan. This does not cause the lens > >> to move anywhere else. Ignored if no > >> scan is in progress. > > > > This one is missing something to make sense, at least from > > GStreamer point of view. When you are not starting or > > cancelling, what value this control should change to ? That > > question might be relevant for libcamera globally. > > > > Without this third state, I would say this one should not be > > there, and replace with action signals. > > True, that would make more sense. Maybe controls like these can > also be marked in the yaml somehow? There's a quite high chance we'll rework the AE, AWB and AF controls, so this may change. > >> analogue-gain : Analogue gain value applied in the sensor > >> device. The value of the control specifies the gain > >> multiplier applied to all colour channels. This value cannot be > >> lower than 1.0. Setting this value means that it is now > >> fixed and the AE algorithm may not change it. Setting it back > >> to zero returns it to the control of the AE algorithm. See > >> also: exposure-time, ae-enable. Todo: Document the interactions > >> between AeEnable and setting a fixed value for this > >> control. Consider interactions with other AE features, such as > >> aperture and aperture/shutter priority mode, and decide > >> if control of which features should be automatically adjusted > >> shouldn't better be handled through a separate AE mode > >> control. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > > > > I'm not confident there is no range, would be nice if we can > > align to something sensible (might need yaml changes of course). > > > >> > >> awb-enable : Enable or disable the AWB. See also: > >> colour-gains. > >> flags: readable, writable, controllable > >> Boolean. Default: false > > > > Should be on by default (if not, we should do that in gst ;-P). > > > >> > >> awb-locked : Report the lock status of a running AWB algorithm. > >> If the AWB algorithm is locked the value shall be set > >> to true, if it's converging it shall be set to false. If the > >> AWB algorithm is not running the control shall not be > >> present in the metadata control list. See also: awb-enable. > >> flags: readable, writable, controllable > >> Boolean. Default: false > > > > Read-only. > > > >> > >> awb-mode : Specify the range of illuminants to use > >> for the AWB algorithm. The modes supported are platform > >> specific, and not all modes may be supported. > >> flags: readable, writable, controllable > >> Enum "AwbMode" Default: 0, "auto" > >> (0): auto - Search over > >> the whole colour temperature range. > >> (1): incandescent - Incandescent > >> AWB lamp mode. > >> (2): tungsten - Tungsten AWB > >> lamp mode. > >> (3): fluorescent - Fluorescent > >> AWB lamp mode. > >> (4): indoor - Indoor AWB > >> lighting mode. > >> (5): daylight - Daylight AWB > >> lighting mode. > >> (6): cloudy - Cloudy AWB > >> lighting mode. > >> (7): custom - Custom AWB > >> mode. > > > > Again, unclear how custom can be used. > > > >> > >> brightness : Specify a fixed brightness parameter. 
> >> Positive values (up to 1.0) produce brighter images; negative > >> values (up to -1.0) produce darker images and 0.0 leaves > >> pixels unchanged. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > > > > So this time the range is documented, we need a machine readable > > form. > > > >> > >> camera-name : Select by name which camera to use. > >> flags: readable, writable, changeable > >> only in NULL or READY state > >> String. Default: null > >> > >> colour-correction-matrix: The 3x3 matrix that converts camera > >> RGB to sRGB within the imaging pipeline. This should > >> describe the matrix that is used after pixels have been > >> white-balanced, but before any gamma transformation. The 3x3 > >> matrix is stored in conventional reading order in an array of 9 > >> floating point values. > >> flags: readable, writable, controllable > >> Default: "< >" > >> GstValueArray of GValues of type > >> "gfloat" > > > > The type only allow flat list, but as its a matrix I was > > expecting > > > > "< < 1, 2, 3 >, > > < 4, 5, 6 >, > > < 7, 8, 9 > >" > > > > That being said I would be fine with flat raster data: > > > > "< 1, 2, 3, > > 4, 5, 6, > > 7, 8, 9 >" > > > > The down side is for bindings. Consider Python, the first would > > create a collection that you can access with matrix[Y][X] and > > more importantly can store into numpy.matrix. > > > > A comment for all list "< >" syntax. The usage and serialization > > syntax is not easy to guess. For matrix, if we had the identity > > matrix as default instead of an empty list, that would help. > > Otherwise we need a way to extend the doc for GStreamer usage. > > I thought it would be best to use a flat array here because the > documentation mentions that it is an array of 9 values. > > An identity matrix as default would work but I think this is the > same discussion as the default values for the enums: that the > default might be hardware specific. That's why I was kind of happy > that you could leave arrays empty as a default. > > >> colour-gains : Pair of gain values for the Red and > >> Blue colour channels, in that order. ColourGains can only be > >> applied in a Request when the AWB is disabled. See also: > >> awb-enable. > >> flags: readable, writable, controllable > >> Default: "< >" > >> GstValueArray of GValues of type > >> "gfloat" > > > > Again the type allow for flat list, but I think we have "< < r1, > > r2 >, < b1, b2 > >" ? > > I think "< r b >" would actually be correct here. That's correct. > >> colour-temperature : Report the current estimate of the > >> colour temperature, in kelvin, for this frame. The > >> ColourTemperature control can only be returned in metadata. > >> flags: readable, writable, controllable > >> Integer. Range: -2147483648 - > >> 2147483647 Default: 0 > > > > read-only. > > > >> > >> contrast : Specify a fixed contrast parameter. > >> Normal contrast is given by the value 1.0; larger values > >> produce images with more contrast. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > > > > Range needed programmatically. > > > >> > >> digital-gain : Digital gain value applied during the processing > >> steps applied to the image as captured from the > >> sensor. The global digital gain factor is applied to all the > >> colour channels of the RAW image. Different pipeline models > >> are free to specify how the global gain factor applies to each > >> separate channel. 
If an imaging pipeline applies digital > >> gain in distinct processing steps, this value indicates their > >> total sum. Pipelines are free to decide how to adjust each > >> processing step to respect the received gain factor and shall > >> report their total value in the request metadata. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > >> > >> draft-ae-precapture-trigger: Control for AE metering trigger. > >> Currently identical to ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER. > >> Whether the camera device will trigger a precapture metering > >> sequence when it processes this request. > >> flags: readable, writable, controllable > >> Enum "AePrecaptureTrigger" Default: 0, > >> "idle" > >> (0): idle - The trigger > >> is idle. > >> (1): start - The > >> pre-capture AE metering is started > >> by the camera. > >> (2): cancel - The camera > >> will cancel any active or completed > >> metering sequence. The AE algorithm > >> is reset to its initial state. > > > > Perhaps we can skip over draft- as this could later break our > > interface. We can also add an env to include them ? > > Sure, I could hide that behind an env variable. What about other > vendor controls, should they be included? I'd recommend starting small, covering the core controls only for instance, and then extending that in a second step. Draft controls should disappear, they were added to get something working quickly when testing with Android, but they need to be reworked and standardized properly. The same is probably true for standard controls :-) > >> draft-ae-state : Control to report the current AE > >> algorithm state. Currently identical to > >> ANDROID_CONTROL_AE_STATE. Current state of the AE algorithm. > >> flags: readable, writable, controllable > >> Enum "AeState" Default: 0, "inactive" > >> (0): inactive - The AE > >> algorithm is inactive. > >> (1): searching - The AE > >> algorithm has not converged yet. > >> (2): converged - The AE > >> algorithm has converged. > >> (3): locked - The AE > >> algorithm is locked. > >> (4): flash-required - The AE > >> algorithm would need a flash for > >> good results > >> (5): precapture - The AE > >> algorithm has started a pre-capture > >> metering session. See also: > >> ae-precapture-trigger. > >> > >> draft-awb-state : Control to report the current AWB > >> algorithm state. Currently identical to > >> ANDROID_CONTROL_AWB_STATE. Current state of the AWB > >> algorithm. > >> flags: readable, writable, controllable > >> Enum "AwbState" Default: 0, > >> "state-inactive" > >> (0): state-inactive - The AWB > >> algorithm is inactive. > >> (1): state-searching - The AWB > >> algorithm has not converged yet. > >> (2): converged - The AWB > >> algorithm has converged. > >> (3): locked - The AWB > >> algorithm is locked. > >> > >> draft-color-correction-aberration-mode: Control to select the > >> color correction aberration mode. Currently identical to > >> ANDROID_COLOR_CORRECTION_ABERRATION_MODE. Mode of operation > >> for the chromatic aberration correction algorithm. > >> flags: readable, writable, controllable > >> Enum "ColorCorrectionAberrationMode" > >> Default: 0, "off" > >> (0): off - No > >> aberration correction is applied. > >> (1): fast - Aberration > >> correction will not slow down the > >> frame rate. > >> (2): high-quality - High quality > >> aberration correction which might > >> reduce the frame rate. > >> > >> draft-lens-shading-map-mode: Control to report if the lens > >> shading map is available. 
Currently identical to > >> ANDROID_STATISTICS_LENS_SHADING_MAP_MODE. > >> flags: readable, writable, controllable > >> Enum "LensShadingMapMode" Default: 0, > >> "ff" > >> (0): ff - No lens > >> shading map mode is available. > >> (1): n - The lens > >> shading map mode is available. > >> > >> draft-max-latency : The maximum number of frames that can occur > >> after a request (different than the previous) has been > >> submitted, and before the result's state becomes synchronized. > >> A value of -1 indicates unknown latency, and 0 indicates > >> per-frame control. Currently identical to > >> ANDROID_SYNC_MAX_LATENCY. > >> flags: readable, writable, controllable > >> Integer. Range: -2147483648 - > >> 2147483647 Default: 0 > >> > >> draft-noise-reduction-mode: Control to select the noise > >> reduction algorithm mode. Currently identical to > >> ANDROID_NOISE_REDUCTION_MODE. Mode of operation for the > >> noise reduction algorithm. > >> flags: readable, writable, controllable > >> Enum "NoiseReductionMode" Default: 0, > >> "off" > >> (0): off - No noise > >> reduction is applied > >> (1): fast - Noise > >> reduction is applied without > >> reducing the frame rate. > >> (2): high-quality - High quality > >> noise reduction at the expense of > >> frame rate. > >> (3): minimal - Minimal > >> noise reduction is applied without > >> reducing the frame rate. > >> (4): z-s-l - Noise > >> reduction is applied at different > >> levels to different streams. > >> > >> draft-pipeline-depth: Specifies the number of pipeline stages > >> the frame went through from when it was exposed to when > >> the final completed result was available to the framework. > >> Always less than or equal to PipelineMaxDepth. Currently > >> identical to ANDROID_REQUEST_PIPELINE_DEPTH. The typical value > >> for this control is 3 as a frame is first exposed, > >> captured and then processed in a single pass through the ISP. > >> Any additional processing step performed after the ISP > >> pass (in example face detection, additional format conversions > >> etc) count as an additional pipeline stage. > >> flags: readable, writable, controllable > >> Integer. Range: -2147483648 - > >> 2147483647 Default: 0 > >> > >> draft-sensor-rolling-shutter-skew: Control to report the time > >> between the start of exposure of the first row and the start > >> of exposure of the last row. Currently identical to > >> ANDROID_SENSOR_ROLLING_SHUTTER_SKEW > >> flags: readable, writable, controllable > >> Integer64. Range: -9223372036854775808 > >> - 9223372036854775807 Default: 0 > >> > >> draft-test-pattern-mode: Control to select the test pattern > >> mode. Currently identical to > >> ANDROID_SENSOR_TEST_PATTERN_MODE. > >> flags: readable, writable, controllable > >> Enum "TestPatternMode" Default: 0, > >> "off" > >> (0): off - No test > >> pattern mode is used. The camera > >> device returns frames from the image > >> sensor. > >> (1): solid-color - Each pixel > >> in [R, G_even, G_odd, B] is replaced > >> by its respective color channel > >> provided in test pattern data. Todo: > >> Add control for test pattern data. > >> (2): color-bars - All pixel data is replaced with an 8-bar > >> color pattern. The vertical bars (left-to-right) are as > >> follows; white, yellow, cyan, green, magenta, red, blue and > >> black. Each bar should take up 1/8 of the sensor pixel array > >> width. When this is not possible, the bar size should be > >> rounded down to the nearest integer and the pattern can repeat > >> on the right side. 
Each bar's height must always take up the > >> full sensor pixel array height. > >> (3): color-bars-fade-to-gray - The test pattern is similar to > >> TestPatternModeColorBars, except that each bar should > >> start at its specified color at the top and fade to gray at the > >> bottom. Furthermore each bar is further subdevided into > >> a left and right half. The left half should have a smooth > >> gradient, and the right half should have a quantized gradient. > >> In particular, the right half's should consist of blocks of the > >> same color for 1/16th active sensor pixel array width. > >> The least significant bits in the quantized gradient should be > >> copied from the most significant bits of the smooth > >> gradient. The height of each bar should always be a multiple of > >> 128. When this is not the case, the pattern should > >> repeat at the bottom of the image. > >> (4): pn9 - All pixel data is replaced by a pseudo-random > >> sequence generated from a PN9 512-bit sequence (typically > >> implemented in hardware with a linear feedback shift register). > >> The generator should be reset at the beginning of each > >> frame, and thus each subsequent raw frame with this test > >> pattern should be exactly the same as the last. > >> (256): custom1 - The first custom test pattern. All custom > >> patterns that are available only on this camera device are > >> at least this numeric value. All of the custom test patterns > >> will be static (that is the raw image must not vary from > >> frame to frame). > >> > >> exposure-time : Exposure time (shutter speed) for the frame > >> applied in the sensor device. This value is specified in > >> micro-seconds. Setting this value means that it is now fixed > >> and the AE algorithm may not change it. Setting it back to > >> zero returns it to the control of the AE algorithm. See also: > >> analogue-gain, ae-enable. Todo: Document the interactions > >> between AeEnable and setting a fixed value for this control. > >> Consider interactions with other AE features, such as > >> aperture and aperture/shutter priority mode, and decide if > >> control of which features should be automatically adjusted > >> shouldn't better be handled through a separate AE mode control. > >> flags: readable, writable, controllable > >> Integer. Range: -2147483648 - > >> 2147483647 Default: 0 > >> > >> exposure-value : Specify an Exposure Value (EV) parameter. The > >> EV parameter will only be applied if the AE algorithm > >> is currently enabled. By convention EV adjusts the exposure as > >> log2. For example EV = [-2, -1, 0.5, 0, 0.5, 1, 2] > >> results in an exposure adjustment of [1/4x, 1/2x, 1/sqrt(2)x, > >> 1x, sqrt(2)x, 2x, 4x]. See also: ae-enable. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > >> > >> focus-fo-m : Reports a Figure of Merit (FoM) to indicate how > >> in-focus the frame is. A larger FocusFoM value indicates > >> a more in-focus frame. This singular value may be based on a > >> combination of statistics gathered from multiple focus > >> regions within an image. The number of focus regions and method > >> of combination is platform dependent. In this respect, > >> it is not necessarily aimed at providing a way to implement a > >> focus algorithm by the application, rather an indication > >> of how in-focus a frame is. > >> flags: readable, writable, controllable > >> Integer. 
Range: -2147483648 - > >> 2147483647 Default: 0 > >> > >> frame-duration : The instantaneous frame duration from > >> start of frame exposure to start of next exposure, expressed > >> in microseconds. This control is meant to be returned in > >> metadata. > >> flags: readable, writable, controllable > >> Integer64. Range: -9223372036854775808 > >> - 9223372036854775807 Default: 0 > >> > >> frame-duration-limits: The minimum and maximum (in that order) > >> frame duration, expressed in microseconds. When > >> provided by applications, the control specifies the sensor > >> frame duration interval the pipeline has to use. This limits > >> the largest exposure time the sensor can use. For example, if a > >> maximum frame duration of 33ms is requested > >> (corresponding to 30 frames per second), the sensor will not be > >> able to raise the exposure time above 33ms. A fixed > >> frame duration is achieved by setting the minimum and maximum > >> values to be the same. Setting both values to 0 reverts to > >> using the camera defaults. The maximum frame duration provides > >> the absolute limit to the shutter speed computed by the > >> AE algorithm and it overrides any exposure mode setting > >> specified with controls::AeExposureMode. Similarly, when a > >> manual exposure time is set through controls::ExposureTime, it > >> also gets clipped to the limits set by this control. When > >> reported in metadata, the control expresses the minimum and > >> maximum frame durations used after being clipped to the > >> sensor provided frame duration limits. See also: > >> ae-exposure-mode. See also: exposure-time. Todo: Define how to > >> calculate the capture frame rate by defining controls to report > >> additional delays introduced by the capture pipeline or > >> post-processing stages (ie JPEG conversion, frame scaling). > >> Todo: Provide an explicit definition of default control > >> values, for this and all other controls. > >> flags: readable, writable, controllable > >> Default: "< >" > >> GstValueArray of GValues of type > >> "gint64" > >> > >> gamma : Specify a fixed gamma value. Default > >> must be 2.2 which closely mimics sRGB gamma. Note that this > >> is camera gamma, so it is applied as 1.0/gamma. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > >> > >> hdr-channel : This value is reported back to the application so > >> that it can discover whether this capture corresponds > >> to the short or long exposure image (or any other image used by > >> the HDR procedure). An application can monitor the HDR > >> channel to discover when the differently exposed images have > >> arrived. This metadata is only available when an HDR mode > >> has been enabled. See also: hdr-mode. > >> flags: readable, writable, controllable > >> Enum "HdrChannel" Default: 0, "none" > >> (0): none - This image > >> does not correspond to any of the > >> captures used to create an HDR > >> image. > >> (1): short - This is a > >> short exposure image. > >> (2): medium - This is a > >> medium exposure image. > >> (3): long - This is a > >> long exposure image. > >> > >> hdr-mode : Control to set the mode to be used for High Dynamic > >> Range (HDR) imaging. HDR techniques typically include > >> multiple exposure, image fusion and tone mapping techniques to > >> improve the dynamic range of the resulting images. When > >> using an HDR mode, images are captured with different sets of > >> AGC settings called HDR channels. 
Channels indicate in > >> particular the type of exposure (short, medium or long) used to > >> capture the raw image, before fusion. Each HDR image is > >> tagged with the corresponding channel using the HdrChannel > >> control. See also: hdr-channel. > >> flags: readable, writable, controllable > >> Enum "HdrMode" Default: 0, "off" > >> (0): off - HDR is > >> disabled. Metadata for this frame > >> will not include the HdrChannel > >> control. > >> (1): multi-exposure-unmerged - Multiple exposures will be > >> generated in an alternating fashion. However, they will not > >> be merged together and will be returned to the application as > >> they are. Each image will be tagged with the correct HDR > >> channel, indicating what kind of exposure it is. The tag should > >> be the same as in the HdrModeMultiExposure case. The > >> expectation is that an application using this mode would merge > >> the frames to create HDR images for itself if it requires > >> them. > >> (2): multi-exposure - Multiple exposures will be generated and > >> merged to create HDR images. Each image will be tagged > >> with the HDR channel (long, medium or short) that arrived and > >> which caused this image to be output. Systems that use two > >> channels for HDR will return images tagged alternately as the > >> short and long channel. Systems that use three channels > >> for HDR will cycle through the short, medium and long channel > >> before repeating. > >> (3): single-exposure - Multiple > >> frames all at a single exposure will > >> be used to create HDR images. These > >> images should be reported as all > >> corresponding to the HDR short > >> channel. > >> (4): night - Multiple > >> frames will be combined to produce > >> "night mode" images. It is up to the > >> implementation exactly which HDR > >> channels it uses, and the images > >> will all be tagged accordingly with > >> the correct HDR channel information. > >> > >> lens-position : Acts as a control to instruct the lens to move > >> to a particular position and also reports back the > >> position of the lens for each frame. The LensPosition control > >> is ignored unless the AfMode is set to AfModeManual, > >> though the value is reported back unconditionally in all modes. > >> This value, which is generally a non-integer, is the > >> reciprocal of the focal distance in metres, also known as > >> dioptres. That is, to set a focal distance D, the lens > >> position LP is given by \f$LP = \frac{1\mathrm{m}}{D}\f$ For > >> example: 0 moves the lens to infinity. 0.5 moves the lens > >> to focus on objects 2m away. 2 moves the lens to focus on > >> objects 50cm away. And larger values will focus the lens > >> closer. The default value of the control should indicate a good > >> general position for the lens, often corresponding to > >> the hyperfocal distance (the closest position for which objects > >> at infinity are still acceptably sharp). The minimum > >> will often be zero (meaning infinity), and the maximum value > >> defines the closest focus position. Todo: Define a property > >> to report the Hyperfocal distance of calibrated lenses. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > >> > >> lux : Report an estimate of the current > >> illuminance level in lux. The Lux control can only be > >> returned in metadata. > >> flags: readable, writable, controllable > >> Float. 
Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > >> > >> name : The name of the object > >> flags: readable, writable > >> String. Default: "libcamerasrc0" > >> > >> parent : The parent of the object > >> flags: readable, writable > >> Object of type "GstObject" > >> > >> saturation : Specify a fixed saturation parameter. > >> Normal saturation is given by the value 1.0; larger values > >> produce more saturated colours; 0.0 produces a greyscale > >> image. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > >> > >> sensor-black-levels : Reports the sensor black levels used for > >> processing a frame, in the order R, Gr, Gb, B. These > >> values are returned as numbers out of a 16-bit pixel range (as > >> if pixels ranged from 0 to 65535). The SensorBlackLevels > >> control can only be returned in metadata. > >> flags: readable, writable, controllable > >> Default: "< >" > >> GstValueArray of GValues of type "gint" > >> > >> sensor-temperature : Temperature measure from the camera sensor > >> in Celsius. This is typically obtained by a thermal > >> sensor present on-die or in the camera module. The range of > >> reported temperatures is device dependent. The > >> SensorTemperature control will only be returned in metadata if > >> a thermal sensor is present. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > >> > >> sensor-timestamp : The time when the first row of the image > >> sensor active array is exposed. The timestamp, expressed > >> in nanoseconds, represents a monotonically increasing counter > >> since the system boot time, as defined by the > >> Linux-specific CLOCK_BOOTTIME clock id. The SensorTimestamp > >> control can only be returned in metadata. Todo: Define how > >> the sensor timestamp has to be used in the reprocessing use > >> case. > >> flags: readable, writable, controllable > >> Integer64. Range: -9223372036854775808 > >> - 9223372036854775807 Default: 0 > >> > >> sharpness : A value of 0.0 means no sharpening. The minimum > >> value means minimal sharpening, and shall be 0.0 unless > >> the camera can't disable sharpening completely. The default > >> value shall give a "reasonable" level of sharpening, > >> suitable for most use cases. The maximum value may apply > >> extremely high levels of sharpening, higher than anyone could > >> reasonably want. Negative values are not allowed. Note also > >> that sharpening is not applied to raw streams. > >> flags: readable, writable, controllable > >> Float. Range: -3.402823e+38 - > >> 3.402823e+38 Default: 0 > > > > > > I think my comments begins to be repetitive, and if we solve the > > earlier issues > > we will solve them all. The main thing is that the yaml should > > explicitly define > > more stuff rather then relying to text. And using that, we can > > easily adjust > > GStreamer properties. A final note, not all of these will be > > supported. I was > > wondering if we should add an extra property that tells use > > which one are > > actually supported ? We don't need anything fency, a "/" > > seperate list in a > > string or similar could do. We can also go for GstStructure.
Le jeudi 08 août 2024 à 10:50 +0100, Kieran Bingham a écrit : > Hi Nicolas, Jaslo, > > [...] > > > > > > af-trigger : This control starts an autofocus scan when AfMode is set to AfModeAuto, and can also be used to terminate a scan early. It is ignored if AfMode is set to AfModeManual or AfModeContinuous. > > > flags: readable, writable, controllable > > > Enum "AfTrigger" Default: 0, "start" > > > (0): start - Start an AF scan. Ignored if a scan is in progress. > > > (1): cancel - Cancel an AF scan. This does not cause the lens to move anywhere else. Ignored if no scan is in progress. > > > > This one is missing something to make sense, at least from GStreamer point of view. When you are not starting or cancelling, what value this control should change to ? That question might be relevant for libcamera globally. > > > > Without this third state, I would say this one should not be there, and replace with action signals. > > > In libcamera - controls only 'exist' in a single request. What is an > 'action signal' in Gstreamer? It's more or less a method call. The main difference is that you don't need a .h and a static link or dlopen to call it, and bindings let you call them without the need for per-element code. This allows GStreamer plugins to offer advanced APIs. You can check multiudpsink as an example (a minimal sketch of invoking these actions follows this message). It has 4 actions, "add", "remove" and "clear", which relate to the transmission addresses, and "get-stats", which lets you get the statistics for a specific destination rather than the combined statistics (which are offered as a property). It also adds 2 signals, "client-added" and "client-removed", which is a pretty common way to let a second component monitor the state while the application adds/removes clients. https://gstreamer.freedesktop.org/documentation/udp/multiudpsink.html?gi-language=c#action-signals This discussion kind of reveals that libcamera controls are many different mechanisms under a single umbrella. You have the controls that have a known state, that you can write in one request and read back in the response of that request, and whose state you don't expect to change unless requested. These are a 1:1 match with properties (properties can be asynchronous, and we have a notify signal to let the application know when they are applied). The difference is that the yaml that serves as a spec does not abstract the ranges and default values. This is manageable: the libcamera way of specifying controls will just pass through the GStreamer element, and libcamera-based GStreamer applications will implement per-camera glue to make things work, just like they would have to do when using the libcamera API directly. In reality, folks will hard-code their app to the Raspberry Pi way, and it will break on other platforms, but that is not my battle ;-P There is metadata (the states in the documentation), which is not expected to be set in a request, but only read. This is also a good fit for read-only properties. Though, I kind of suspect that the specification leaves it open as to whether these are per-stream (in which case they should be pad properties) or for the entire camera (a libcamerasrc property). > > > So - AfTrigger for start and cancel are, in this case 'events' > > - "Start an AFscan at this request" > - "Cancel any active scan" And finally, there are triggers. This is the same concept as V4L2 buttons, except that you group multiple actions in the same button by passing an argument. That is not a property. 
If we introduce a classification for settables, metadata and triggers, that will be enough for the python generator to avoid this. My recommendation, Jaslo, would be to try and reduce the set of properties we expose in a first pass, or make them opt-in. > > > There are no 'events' in between. Controls are only 'set' in Requests. > Any 'response' is only available in the metadata from a completed > Request. > > You can not 'read' the state of a control in the same sense of a V4L2 > control. > > (Maybe this should change for some use cases, or for an abilty to set > controls directly on a camera, which I believe Stefan has explored, and > not through a request, but it's the current model). I haven't read the implementation yet, but indeed, this requires a bit of glue. There will be a config that holds what has been set since the last request (this is most likely the case already). As soon as we have a response, we can save everything and use that to reply to the get() calls. It is always good, though, to filter unsupported controls before making a request, otherwise the warning message will repeat on every request, and that will be very spammy. > > > > analogue-gain : Analogue gain value applied in the sensor device. The value of the control specifies the gain multiplier applied to all colour channels. This value cannot be lower than 1.0. Setting this value means that it is now fixed and the AE algorithm may not change it. Setting it back to zero returns it to the control of the AE algorithm. See also: exposure-time, ae-enable. Todo: Document the interactions between AeEnable and setting a fixed value for this control. Consider interactions with other AE features, such as aperture and aperture/shutter priority mode, and decide if control of which features should be automatically adjusted shouldn't better be handled through a separate AE mode control. > > > flags: readable, writable, controllable > > > Float. Range: -3.402823e+38 - 3.402823e+38 Default: 0 > > I'm not confident there is no range, would be nice if we can align to something sensible (might need yaml changes of course). > > Different cameras will have different ranges ... only determinable at > runtime. We can't specify this in YAML. Though - it should always be > positive at least! (and as you've mentioned elsewhere, if it was Well, that would already be an improvement: signed / unsigned. Will it fit in 32 bits? 64 bits? Is it float or double? etc. > machine readable to say always above 1.0 would be beneficial too). Notice that with proper types, there is always -INF and INF defined, and worst case, we can define that. > [...] > > > I think my comments begins to be repetitive, and if we solve the earlier issues > > we will solve them all. The main thing is that the yaml should explicitly define > > more stuff rather then relying to text. And using that, we can easily adjust > > Agreed, I think at least the yaml files could specify if a control can > only be accessed as metadata which would simplify all of the 'read-only' issues. Ack, we should propose some improvements to the yaml. > > > GStreamer properties. A final note, not all of these will be supported. I was > > wondering if we should add an extra property that tells use which one are > > actually supported ? We don't need anything fency, a "/" seperate list in a > > string or similar could do. We can also go for GstStructure. > > I don't understand this part fully. Do you mean a new control/property > that gstreamer libcamerasrc will read at runtime from libcamera?
It > already has a Camera->controls() that it can read that from. > > But you mention GstStructure - do you mean that the libcamerasrc will > generate this ? (Perhaps from the Camera->controls()?) I think you see core libcamera changes in every comment (that must feel discouraging :-D). This is about how we glue the two worlds together. I'd like to see a "property" that reports the list of supported controls, or the map of supported controls (with the run-time limits). > > > Nicolas > > Kieran
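
To make the action-signal mechanism discussed in this exchange concrete, here is a minimal sketch of how an application invokes them, using the multiudpsink actions named above. The destination address is arbitrary, and the commented-out "af-trigger" call at the end is purely hypothetical; no such action exists in this series.

#include <gst/gst.h>

/* Assumes gst_init() has already been called. */
static void action_signal_demo(void)
{
	GstElement *sink = gst_element_factory_make("multiudpsink", NULL);

	/* Action signals are invoked by name, without linking against the
	 * plugin or including any of its headers. */
	g_signal_emit_by_name(sink, "add", "192.168.1.10", 5004);

	/* "get-stats" returns a GstStructure for one destination. */
	GstStructure *stats = NULL;
	g_signal_emit_by_name(sink, "get-stats", "192.168.1.10", 5004, &stats);
	if (stats) {
		gchar *str = gst_structure_to_string(stats);
		g_print("stats: %s\n", str);
		g_free(str);
		gst_structure_free(stats);
	}

	g_signal_emit_by_name(sink, "remove", "192.168.1.10", 5004);
	gst_object_unref(sink);

	/* Hypothetical libcamerasrc equivalent, not part of this patch:
	 * g_signal_emit_by_name(libcamerasrc, "af-trigger", AF_TRIGGER_START); */
}
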
Le jeudi 08 août 2024 à 14:25 +0200, Jaslo Ziska a écrit : > > Inspect generally allow for pretty short description, but I do > like having the full description, specially that as long as this > element lives here, we are unlikely to pull in the GStreamer doc > generator for it. Might be worth giving a try to line breaks, > though. > > I wasn't sure if the whole description should be included but I > also like having the whole description there as one can then use > the gstreamer element without consulting the libcamera > documentation. > > I looked at a few gstreamer elements (h264parse and webrtcbin) > with long descriptions (it was surprisingly hard to find some) and > they both did not include line breaks in the description. Should I > really try to add some? Yeah, let's leave that alone; it will be pretty finicky to get it right. Nicolas
Le samedi 10 août 2024 à 05:49 +0300, Laurent Pinchart a écrit : > > A notification about a change would be nice. Maybe these type of > > controls can also be marked in the yaml somehow. > > All controls should be reported in metadata, so that would be lots of > notifications. I'm not sure how to best handle that on the GStreamer > side. If there is no handler, the signalling is a no-op of course. Only "changed" values are expected to be notified. Nicolas
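
On the application side, such change notifications would arrive through the standard GObject notify machinery. The following is a minimal sketch, assuming the "exposure-time" property generated by this series and assuming the element emits notify for values that changed in the completed request, which this patch does not implement yet.

#include <gst/gst.h>

static void exposure_time_changed(GObject *obj, GParamSpec *pspec, gpointer user_data)
{
	gint exposure_time = 0;

	g_object_get(obj, "exposure-time", &exposure_time, NULL);
	g_print("exposure-time is now %d us\n", exposure_time);
}

static void watch_exposure_time(GstElement *libcamerasrc)
{
	/* As noted above, emitting "notify" costs next to nothing when no
	 * handler is connected. */
	g_signal_connect(libcamerasrc, "notify::exposure-time",
			 G_CALLBACK(exposure_time_changed), NULL);
}
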
On Tue, Aug 13, 2024 at 02:40:55PM -0400, Nicolas Dufresne wrote: > Le jeudi 08 août 2024 à 14:25 +0200, Jaslo Ziska a écrit : > > > Inspect generally allow for pretty short description, but I do > > > like having the full description, specially that as long as this > > > element lives here, we are unlikely to pull in the GStreamer doc > > > generator for it. Might be worth giving a try to line breaks, > > > though. > > > > I wasn't sure if the whole description should be included but I > > also like having the whole description there as one can then use > > the gstreamer element without consulting the libcamera > > documentation. > > > > I looked at a few gstreamer elements (h264parse and webrtcbin) > > with long descriptions (it was surprisingly hard to find some) and > > they both did not include line breaks in the description. Should I > > really try to add some? > > Yeah, lets leave that alone, it will be pretty finicky to get it right. It should now be easier with "[PATCH] libcamera: controls: Improve formatting of control descriptions in YAML", which I'll merge in a few days to give it a chance of receiving a second review. I'm also fine leaving line breaks for later.
Hi Nicolas, Nicolas Dufresne <nicolas@ndufresne.ca> writes: > Le jeudi 08 août 2024 à 10:50 +0100, Kieran Bingham a écrit : >> Hi Nicolas, Jaslo, >> >> > [...] > >> > > >> > > af-trigger : This control starts an autofocus >> > > scan when AfMode is set to AfModeAuto, and can also be >> > > used to terminate a scan early. It is ignored if AfMode >> > > is set to AfModeManual or AfModeContinuous. >> > > flags: readable, writable, >> > > controllable >> > > Enum "AfTrigger" Default: 0, >> > > "start" >> > > (0): start - Start an >> > > AF scan. Ignored if a scan is in >> > > progress. >> > > (1): cancel - Cancel >> > > an AF scan. This does not cause >> > > the lens to move anywhere else. >> > > Ignored if no scan is in >> > > progress. >> > >> > This one is missing something to make sense, at least from >> > GStreamer point of view. When you are not starting or >> > cancelling, what value this control should change to ? That >> > question might be relevant for libcamera globally. >> > >> > Without this third state, I would say this one should not be >> > there, and replace with action signals. >> >> >> In libcamera - controls only 'exist' in a single request. What >> is an >> 'action signal' in Gstreamer? > > Its more or less a method call. The main difference is that you > don't need a .h > and a static link or dlopen to call it, and bindings let you > call them without > the need for per-element code. This allow GStreamer plugins to > offer advance > API. You can check multiudpsink as an example. It has 4 actions, > "add", > "remove", "clear", this is related to the transmission address, > and "get-stats", > which let you get the statistics but for a specific destination > rather then the > combination (which is offered as property). It also adds 2 > signals, "client- > added" and "client-remove", which is pretty common to allow a > second component > monitor the sate while application adds/remove clients. > > https://gstreamer.freedesktop.org/documentation/udp/multiudpsink.html?gi-language=c#action-signals > > This discussion kind of reveals that libcamera controls are many > different > mechanisms under a single umbrella. > > You have the controls that has a known state, that you can write > in one request, > and read back on the response of that request. And that you > don't expect the > state to change unless requested. These are 1:1 match with > properties > (properties can be asynchronous, and we have a notifiy signal to > let application > know when applied). The difference is that the yaml that serves > as a spec is not > abstract the ranges and default values. This is manageable, > libcamera way of > specifying control will just pass through the GStreamer element, > and the > libcamera based GStreamer applications will implement per camera > glue to make > things work, just like they would have to do by using the > libcamera API > directly. In reality, folks will hard code their app to the > Raspberry Pi way, > and it will break on other platform, but that is not my battle > ;-P > > There is metadata (the states in their documentation), which are > not expected to > be set in a request, but only read. This is also good fit for > read-only > properties. Though, I kind of suspect that the specification > leave it open in > regard to these being per-stream (which which case they should > be pad > properties), or for the entire camera (libcamerasrc property). 
> >> >> So - AfTrigger for start and cancel are, in this case 'events' >> >> - "Start an AFscan at this request" >> - "Cancel any active scan" > > And finally, there is triggers. This is the same concept as V4L2 > buttons, except > that you group multiple actions in the same button by passing an > argument. That > is not a property. If we introduce a classification for > settables, metadata and > triggers, that will be enough for the python generator to avoid > this. My > recommandation Jaslo, would be to try and reduce the set of > properties we expose > in a first pass, or make them opt-in. Could you maybe elaborate on which controls should become which kind of GStreamer property? I think I understand it this way: the plain normal libcamera controls should be GStreamer properties and the libcamera metadata should be read-only GStreamer properties. But what about the other controls? Which kind of libcamera controls do you imagine to be GStreamer actions? And what should be done about the libcamera "triggers"? And also, how should we specify which libcamera control is which kind of GStreamer property/action/...? In the main yaml (if this would be allowed just for GStreamer) or somewhere else? Best regards, Jaslo >> There are no 'events' in between. Controls are only 'set' in >> Requests. >> Any 'response' is only available in the metadata from a >> completed >> Request. >> >> You can not 'read' the state of a control in the same sense of >> a V4L2 >> control. >> >> (Maybe this should change for some use cases, or for an abilty >> to set >> controls directly on a camera, which I believe Stefan has >> explored, and >> not through a request, but it's the current model). > > I haven't read the implementation yet, but indeed, this requires > a bit of glue. > There will be a config that is what has been set since the last > request (this is > mostly likely the case already). As soon as we have a response, > we can save > everything, and use that to reply to the get() calls. It always > good though to > filter unsupported controls before making a request, otherwise > the warning > message will repeat every request, and that will be very spammy. > >> >> > > analogue-gain : Analogue gain value applied in the sensor >> > > device. The value of the control specifies the gain >> > > multiplier applied to all colour channels. This value >> > > cannot be lower than 1.0. Setting this value means that it >> > > is now fixed and the AE algorithm may not change it. >> > > Setting it back to zero returns it to the control of the AE >> > > algorithm. See also: exposure-time, ae-enable. Todo: >> > > Document the interactions between AeEnable and setting a >> > > fixed value for this control. Consider interactions with >> > > other AE features, such as aperture and aperture/shutter >> > > priority mode, and decide if control of which features >> > > should be automatically adjusted shouldn't better be >> > > handled through a separate AE mode control. >> > > flags: readable, writable, >> > > controllable >> > > Float. Range: -3.402823e+38 - >> > > 3.402823e+38 Default: >> > > 0 >> > >> > I'm not confident there is no range, would be nice if we can >> > align to something sensible (might need yaml changes of >> > course). >> >> Different cameras will have different ranges ... only >> determinable at >> runtime. We can't specify this in YAML. Though - it should >> always be >> positive at least! (and as you've mentioned elsewhere, if it >> was > > Well, that would already be an improvement, signed / unsigned. 
> Will it fit 32bit > ? 64bit, it is float or double ? etc. > >> machine readable to say always above 1.0 would be beneficial >> too). > > Notice that with proper types, there is always -INF and INF > defined, and worse > case, we can define that. > >> > > [...] > >> >> > I think my comments begins to be repetitive, and if we solve >> > the earlier issues >> > we will solve them all. The main thing is that the yaml >> > should explicitly define >> > more stuff rather then relying to text. And using that, we >> > can easily adjust >> >> Agreed, I think at least the yaml files could specify if a >> control can >> only be accessed as metadata which would simplify all of the >> 'read-only' >> issues. > > Ack, we should propose some improvement to the yaml. > >> >> > GStreamer properties. A final note, not all of these will be >> > supported. I was >> > wondering if we should add an extra property that tells use >> > which one are >> > actually supported ? We don't need anything fency, a "/" >> > seperate list in a >> > string or similar could do. We can also go for GstStructure. >> >> I don't understand this part fully. Do you mean a new >> control/property >> that gstreamer libcamerasrc will read at runtime from >> libcamera? It >> already has a Camera->controls() that it can read that from. >> >> But you mention GstStructure - do you mean that the >> libcamerasrc will >> generate this ? (Perhaps from the Camera->controls()?) > > I think you see core libcamera changes in every comments (must > feel discouraging > :-D), This is in regard to how we glue the two worlds. I'd like > to see > "property" that reports the list of supported controls, or the > map of supported > controls (with the run-time limits). > >> >> > Nicolas >> >> Kieran
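
For illustration, if the "which controls are actually supported" idea quoted above were exposed as a read-only GstStructure property, the application side could look roughly as follows. The property name "supported-controls" and its layout are purely hypothetical; nothing in this series implements them.

#include <gst/gst.h>

static void list_supported_controls(GstElement *libcamerasrc)
{
	/* Hypothetical read-only property describing the controls the current
	 * camera supports, together with their run-time limits. */
	GstStructure *supported = NULL;

	g_object_get(libcamerasrc, "supported-controls", &supported, NULL);
	if (!supported)
		return;

	if (gst_structure_has_field(supported, "exposure-time"))
		g_print("exposure-time can be set on this camera\n");

	gst_structure_free(supported);
}
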
Hi Jaslo, Le lundi 30 septembre 2024 à 11:31 +0200, Jaslo Ziska a écrit : > Hi Nicolas, > > [...] > > And finally, there is triggers. This is the same concept as V4L2 > > buttons, except > > that you group multiple actions in the same button by passing an > > argument. That > > is not a property. If we introduce a classification for > > settables, metadata and > > triggers, that will be enough for the python generator to avoid > > this. My > > recommandation Jaslo, would be to try and reduce the set of > > properties we expose > > in a first pass, or make them opt-in. > > Could you maybe elaborate which controls should be which GStreamer > kind property? > > I think I understand it this way: the plain normal libcamera > controls should be GStreamer properties and the libcamera metadata > should be read-only GStreamer properties. But what about the other > controls? > > Which kind of libcamera controls do you image to be GStreamer > actions? And what should be done about the libcamera "triggers"? > > And also how should we specify which libcamera control is which > kind of GStreamer property/action/...? In the main yaml (if this > would be allowed just for GStreamer) or somewhere else? In order to move on, maybe you can build a list of obvious controls from libcamera that require no wrapping to be useful. We need to limit the generator in some ways, and that would be a good start and a way to get v3 included. For the rest, we have to stop thinking generically and go back to the items one by one, and eventually we'll have a common understanding of what goes where. Some of them will have to be per-pad, some of them might end up in a stats property (rather than millions of little props), etc. Nicolas > > Best regards, > > Jaslo
diff --git a/src/gstreamer/gstlibcamera-controls.cpp.in b/src/gstreamer/gstlibcamera-controls.cpp.in new file mode 100644 index 00000000..ff93f5c3 --- /dev/null +++ b/src/gstreamer/gstlibcamera-controls.cpp.in @@ -0,0 +1,46 @@ +/* SPDX-License-Identifier: LGPL-2.1-or-later */ +/* + * Copyright (C) 2023, Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * gstlibcamera-controls.cpp - GStreamer Camera Controls + * + * This file is auto-generated. Do not edit. + */ + +#include "gstlibcamera-controls.h" + +#include <libcamera/control_ids.h> + +using namespace libcamera; + +${enum_def} + +void GstCameraControls::installProperties(GObjectClass *klass, int lastPropId) +{ +${install_properties} +} + +bool GstCameraControls::getProperty(guint propId, GValue *value, GParamSpec *pspec) +{ + switch (propId) { +${get_properties} + default: + return false; + } +} + +bool GstCameraControls::setProperty(guint propId, const GValue *value, + [[maybe_unused]] GParamSpec *pspec) +{ + switch (propId) { +${set_properties} + default: + return false; + } +} + +void GstCameraControls::applyControls(std::unique_ptr<libcamera::Request> &request) +{ + request->controls().merge(controls_); +} diff --git a/src/gstreamer/gstlibcamera-controls.h b/src/gstreamer/gstlibcamera-controls.h new file mode 100644 index 00000000..4e1d5bf9 --- /dev/null +++ b/src/gstreamer/gstlibcamera-controls.h @@ -0,0 +1,36 @@ +/* SPDX-License-Identifier: LGPL-2.1-or-later */ +/* + * Copyright (C) 2023, Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * gstlibcamera-controls.h - GStreamer Camera Controls + */ + +#pragma once + +#include <libcamera/controls.h> +#include <libcamera/request.h> + +#include "gstlibcamerasrc.h" + +namespace libcamera { + +class GstCameraControls +{ +public: + GstCameraControls() {}; + ~GstCameraControls() {}; + + static void installProperties(GObjectClass *klass, int lastProp); + + bool getProperty(guint propId, GValue *value, GParamSpec *pspec); + bool setProperty(guint propId, const GValue *value, GParamSpec *pspec); + + void applyControls(std::unique_ptr<libcamera::Request> &request); + +private: + /* set of user modified controls */ + ControlList controls_; +}; + +} /* namespace libcamera */ diff --git a/src/gstreamer/gstlibcamerasrc.cpp b/src/gstreamer/gstlibcamerasrc.cpp index 5a3e2989..85dab67f 100644 --- a/src/gstreamer/gstlibcamerasrc.cpp +++ b/src/gstreamer/gstlibcamerasrc.cpp @@ -37,10 +37,11 @@ #include <gst/base/base.h> +#include "gstlibcamera-controls.h" +#include "gstlibcamera-utils.h" #include "gstlibcameraallocator.h" #include "gstlibcamerapad.h" #include "gstlibcamerapool.h" -#include "gstlibcamera-utils.h" using namespace libcamera; @@ -128,6 +129,7 @@ struct GstLibcameraSrcState { ControlList initControls_; guint group_id_; + GstCameraControls controls_; int queueRequest(); void requestCompleted(Request *request); @@ -153,6 +155,7 @@ struct _GstLibcameraSrc { enum { PROP_0, PROP_CAMERA_NAME, + PROP_LAST }; static void gst_libcamera_src_child_proxy_init(gpointer g_iface, @@ -183,6 +186,9 @@ int GstLibcameraSrcState::queueRequest() if (!request) return -ENOMEM; + /* Apply controls */ + controls_.applyControls(request); + std::unique_ptr<RequestWrap> wrap = std::make_unique<RequestWrap>(std::move(request)); @@ -722,6 +728,7 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id, { GLibLocker lock(GST_OBJECT(object)); GstLibcameraSrc *self = GST_LIBCAMERA_SRC(object); + GstLibcameraSrcState *state = self->state; switch 
(prop_id) { case PROP_CAMERA_NAME: @@ -729,7 +736,8 @@ gst_libcamera_src_set_property(GObject *object, guint prop_id, self->camera_name = g_value_dup_string(value); break; default: - G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); + if (!state->controls_.setProperty(prop_id - PROP_LAST, value, pspec)) + G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); break; } } @@ -740,13 +748,15 @@ gst_libcamera_src_get_property(GObject *object, guint prop_id, GValue *value, { GLibLocker lock(GST_OBJECT(object)); GstLibcameraSrc *self = GST_LIBCAMERA_SRC(object); + GstLibcameraSrcState *state = self->state; switch (prop_id) { case PROP_CAMERA_NAME: g_value_set_string(value, self->camera_name); break; default: - G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); + if (!state->controls_.getProperty(prop_id - PROP_LAST, value, pspec)) + G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); break; } } @@ -947,6 +957,7 @@ gst_libcamera_src_class_init(GstLibcameraSrcClass *klass) | G_PARAM_STATIC_STRINGS)); g_object_class_install_property(object_class, PROP_CAMERA_NAME, spec); + GstCameraControls::installProperties(object_class, PROP_LAST); } /* GstChildProxy implementation */ diff --git a/src/gstreamer/meson.build b/src/gstreamer/meson.build index c2a01e7b..e6c20124 100644 --- a/src/gstreamer/meson.build +++ b/src/gstreamer/meson.build @@ -25,6 +25,20 @@ libcamera_gst_sources = [ 'gstlibcamerasrc.cpp', ] +# Generate gstreamer control properties + +gen_gst_controls_input_files = [] +gen_gst_controls_template = files('gstlibcamera-controls.cpp.in') +foreach file : controls_files + gen_gst_controls_input_files += files('../libcamera/' + file) +endforeach + +libcamera_gst_sources += custom_target('gstlibcamera-controls.cpp', + input : gen_gst_controls_input_files, + output : 'gstlibcamera-controls.cpp', + command : [gen_gst_controls, '-o', '@OUTPUT@', + '-t', gen_gst_controls_template, '@INPUT@']) + libcamera_gst_cpp_args = [ '-DVERSION="@0@"'.format(libcamera_git_version), '-DPACKAGE="@0@"'.format(meson.project_name()), diff --git a/utils/gen-gst-controls.py b/utils/gen-gst-controls.py new file mode 100755 index 00000000..d0c12b50 --- /dev/null +++ b/utils/gen-gst-controls.py @@ -0,0 +1,398 @@ +#!/usr/bin/env python3 + +import argparse +import re +import string +import sys +import yaml + + +class ControlEnum(object): + def __init__(self, data): + self.__data = data + + @property + def description(self): + """The enum description""" + return self.__data.get('description') + + @property + def name(self): + """The enum name""" + return self.__data.get('name') + + @property + def value(self): + """The enum value""" + return self.__data.get('value') + + +class Control(object): + def __init__(self, name, data, vendor): + self.__name = name + self.__data = data + self.__enum_values = None + self.__size = None + self.__vendor = vendor + + enum_values = data.get('enum') + if enum_values is not None: + self.__enum_values = [ControlEnum(enum) for enum in enum_values] + + size = self.__data.get('size') + if size is not None: + if len(size) == 0: + raise RuntimeError(f'Control `{self.__name}` size must have at least one dimension') + + # Compute the total number of elements in the array. If any of the + # array dimension is a string, the array is variable-sized. 
+ num_elems = 1 + for dim in size: + if type(dim) is str: + num_elems = 0 + break + + dim = int(dim) + if dim <= 0: + raise RuntimeError(f'Control `{self.__name}` size must have positive values only') + + num_elems *= dim + + self.__size = num_elems + + @property + def description(self): + """The control description""" + return self.__data.get('description') + + @property + def enum_values(self): + """The enum values, if the control is an enumeration""" + if self.__enum_values is None: + return + for enum in self.__enum_values: + yield enum + + @property + def is_enum(self): + """Is the control an enumeration""" + return self.__enum_values is not None + + @property + def vendor(self): + """The vendor string, or None""" + return self.__vendor + + @property + def name(self): + """The control name (CamelCase)""" + return self.__name + + @property + def type(self): + typ = self.__data.get('type') + size = self.__data.get('size') + + if typ == 'string': + return 'std::string' + + if self.__size is None: + return typ + + if self.__size: + return f"Span<const {typ}, {self.__size}>" + else: + return f"Span<const {typ}>" + + @property + def element_type(self): + typ = self.__data.get('type') + return typ + + @property + def size(self): + return self.__size + + +def find_common_prefix(strings): + prefix = strings[0] + + for string in strings[1:]: + while string[:len(prefix)] != prefix and prefix: + prefix = prefix[:len(prefix) - 1] + if not prefix: + break + + return prefix + + +def format_description(description, indent = 0): + # Substitute doxygen keywords \sa (see also) and \todo + description = re.sub(r'\\sa((?: \w+)+)', + lambda match: 'See also: ' + ', '.join(map(kebab_case, match.group(1).strip().split(' '))) + '.', description) + description = re.sub(r'\\todo', 'Todo:', description) + + description = description.strip().split('\n') + return '\n'.join([indent * '\t' + '"' + line.replace('\\', r'\\').replace('"', r'\"') + ' "' for line in description if line]).rstrip() + + +def snake_case(s): + return ''.join([c.isupper() and ('_' + c.lower()) or c for c in s]).strip('_') + + +def kebab_case(s): + return snake_case(s).replace('_', '-') + + +def indent(s, n): + lines = s.split('\n') + return '\n'.join([n * '\t' + line for line in lines]) + + +def generate_cpp(controls): + # Beginning of a GEnumValue definition for enum controls + enum_values_start = string.Template('static const GEnumValue ${name_snake_case}_types[] = {') + # Definition of the GEnumValue variant for each enum control variant + # Because the description might have multiple lines it will get indented below + enum_values_values = string.Template('''{ +\tcontrols${vendor_namespace}::${name}, +${description}, +\t"${nick}" +},''') + # End of a GEnumValue definition and definition of the matching _get_type() function + enum_values_end = string.Template('''\t{0, NULL, NULL} +}; + +#define TYPE_${name_upper} (${name_snake_case}_get_type()) +static GType ${name_snake_case}_get_type(void) +{ +\tstatic GType ${name_snake_case}_type = 0; + +\tif (!${name_snake_case}_type) +\t\t${name_snake_case}_type = g_enum_register_static("${name}", ${name_snake_case}_types); + +\treturn ${name_snake_case}_type; +} +''') + + # Creation of the type spec for the different types of controls + # The description (and the element_spec for the array) might have multiple lines and will get indented below + spec_array = string.Template('''gst_param_spec_array( +\t"${spec_name}", +\t"${nick}", +${description}, +${element_spec}, +\t(GParamFlags) 
(GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS) +)''') + spec_bool = string.Template('''g_param_spec_boolean( +\t"${spec_name}", +\t"${nick}", +${description}, +\t${default}, +\t(GParamFlags) (GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS) +)''') + spec_enum = string.Template('''g_param_spec_enum( +\t"${spec_name}", +\t"${nick}", +${description}, +\tTYPE_${enum}, +\t${default}, +\t(GParamFlags) (GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS) +)''') + spec_numeric = string.Template('''g_param_spec_${gtype}( +\t"${spec_name}", +\t"${nick}", +${description}, +\t${min}, +\t${max}, +\t${default}, +\t(GParamFlags) (GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS) +)''') + + # The g_object_class_install_property() function call for each control + install_property = string.Template('''\tg_object_class_install_property( +\t\tklass, +\t\tlastPropId + controls${vendor_namespace}::${name_upper}, +${spec} +\t);''') + + # The _get_property() switch cases for each control + get_property = string.Template('''case controls${vendor_namespace}::${name_upper}: { +\tauto val = controls_.get(controls${vendor_namespace}::${name}); +\tauto spec = G_PARAM_SPEC_${gtype_upper}(pspec); +\tg_value_set_${gtype}(value, val.value_or(spec->default_value)); +\treturn true; +}''') + get_array_property = string.Template('''case controls${vendor_namespace}::${name_upper}: { +\tauto val = controls_.get(controls${vendor_namespace}::${name}); +\tif (val) { +\t\tfor (size_t i = 0; i < val->size(); ++i) { +\t\t\tGValue v = G_VALUE_INIT; +\t\t\tg_value_init(&v, G_TYPE_${gtype_upper}); +\t\t\tg_value_set_${gtype}(&v, (*val)[i]); +\t\t\tgst_value_array_append_and_take_value(value, &v); +\t\t} +\t} +\treturn true; +}''') + + # The _set_property() switch cases for each control + set_property = string.Template('''case controls${vendor_namespace}::${name_upper}: +\tcontrols_.set(controls${vendor_namespace}::${name}, g_value_get_${gtype}(value)); +\treturn true;''') + set_array_property = string.Template('''case controls${vendor_namespace}::${name_upper}: { +\tsize_t size = gst_value_array_get_size(value); +\tstd::vector<${type}> val(size); +\tfor (size_t i = 0; i < size; ++i) { +\t\tconst GValue *v = gst_value_array_get_value(value, i); +\t\tval[i] = g_value_get_${gtype}(v); +\t} +\tcontrols_.set(controls${vendor_namespace}::${name}, Span<const ${type}, ${size}>(val.data(), size)); +\treturn true; +}''') + + enum_def = [] + install_properties = [] + get_properties = [] + set_properties = [] + + for ctrl in controls: + is_array = ctrl.size is not None + size = ctrl.size + if size == 0: + size = 'dynamic_extent' + + # Determine the matching glib type for each C++ type used in the controls + gtype = '' + if ctrl.is_enum: + gtype = 'enum' + elif ctrl.element_type == 'bool': + gtype = 'boolean' + elif ctrl.element_type == 'float': + gtype = 'float' + elif ctrl.element_type == 'int32_t': + gtype = 'int' + elif ctrl.element_type == 'int64_t': + gtype = 'int64' + elif ctrl.element_type == 'uint8_t': + gtype = 'uchar' + elif ctrl.element_type == 'Rectangle': + # TODO: Handle Rectangle + continue + else: + raise RuntimeError(f'The type `{ctrl.element_type}` is unknown') + + vendor_prefix = '' + vendor_namespace = '' + if ctrl.vendor != 'libcamera': + vendor_prefix = ctrl.vendor + '-' + vendor_namespace = '::' + ctrl.vendor + + name_snake_case = snake_case(ctrl.name) + name_upper = name_snake_case.upper() + + info = { + 'name': ctrl.name, + 'vendor_namespace': 
vendor_namespace, + 'name_snake_case': name_snake_case, + 'name_upper': name_upper, + 'spec_name': vendor_prefix + kebab_case(ctrl.name), + 'nick': ''.join([c.isupper() and (' ' + c) or c for c in ctrl.name]).strip(' '), + 'description': format_description(ctrl.description, indent=1), + 'gtype': gtype, + 'gtype_upper': gtype.upper(), + 'type': ctrl.element_type, + 'size': size, + } + + if ctrl.is_enum: + enum_def.append(enum_values_start.substitute(info)) + + common_prefix = find_common_prefix([enum.name for enum in ctrl.enum_values]) + + for enum in ctrl.enum_values: + values_info = { + 'name': enum.name, + 'vendor_namespace': vendor_namespace, + 'description': format_description(enum.description, indent=1), + 'nick': kebab_case(enum.name.removeprefix(common_prefix)), + } + enum_def.append(indent(enum_values_values.substitute(values_info), 1)) + + enum_def.append(enum_values_end.substitute(info)) + + spec = '' + if ctrl.is_enum: + spec = spec_enum.substitute({'enum': name_upper, 'default': 0, **info}) + elif gtype == 'boolean': + spec = spec_bool.substitute({'default': 'false', **info}) + elif gtype == 'float': + spec = spec_numeric.substitute({'min': '-G_MAXFLOAT', 'max': 'G_MAXFLOAT', 'default': 0, **info}) + elif gtype == 'int': + spec = spec_numeric.substitute({'min': 'G_MININT', 'max': 'G_MAXINT', 'default': 0, **info}) + elif gtype == 'int64': + spec = spec_numeric.substitute({'min': 'G_MININT64', 'max': 'G_MAXINT64', 'default': 0, **info}) + elif gtype == 'uchar': + spec = spec_numeric.substitute({'min': '0', 'max': 'G_MAXUINT8', 'default': 0, **info}) + + if is_array: + spec = spec_array.substitute({'element_spec': indent(spec, 1), **info}) + + install_properties.append(install_property.substitute({'spec': indent(spec, 2), **info})) + + if is_array: + get_properties.append(indent(get_array_property.substitute(info), 1)) + set_properties.append(indent(set_array_property.substitute(info), 1)) + else: + get_properties.append(indent(get_property.substitute(info), 1)) + set_properties.append(indent(set_property.substitute(info), 1)) + + return { + 'enum_def': '\n'.join(enum_def), + 'install_properties': '\n'.join(install_properties), + 'get_properties': '\n'.join(get_properties), + 'set_properties': '\n'.join(set_properties), + } + + +def fill_template(template, data): + template = open(template, 'rb').read() + template = template.decode('utf-8') + template = string.Template(template) + return template.substitute(data) + + +def main(argv): + # Parse command line arguments + parser = argparse.ArgumentParser() + parser.add_argument('--output', '-o', metavar='file', type=str, + help='Output file name. 
Defaults to standard output if not specified.') + parser.add_argument('--template', '-t', dest='template', type=str, required=True, + help='Template file name.') + parser.add_argument('input', type=str, nargs='+', + help='Input file name.') + args = parser.parse_args(argv[1:]) + + controls = [] + for input in args.input: + data = open(input, 'rb').read() + vendor = yaml.safe_load(data)['vendor'] + ctrls = yaml.safe_load(data)['controls'] + controls = controls + [Control(*ctrl.popitem(), vendor) for ctrl in ctrls] + + data = generate_cpp(controls) + + data = fill_template(args.template, data) + + if args.output: + output = open(args.output, 'wb') + output.write(data.encode('utf-8')) + output.close() + else: + sys.stdout.write(data) + + return 0 + + +if __name__ == '__main__': + sys.exit(main(sys.argv)) diff --git a/utils/meson.build b/utils/meson.build index 8e28ada7..89597f7f 100644 --- a/utils/meson.build +++ b/utils/meson.build @@ -9,6 +9,7 @@ py_modules += ['yaml'] gen_controls = files('gen-controls.py') gen_formats = files('gen-formats.py') gen_header = files('gen-header.sh') +gen_gst_controls = files('gen-gst-controls.py') ## Module signing gen_ipa_priv_key = files('gen-ipa-priv-key.sh')
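
As a usage sketch of the generated properties (names and types taken from the gst-inspect-1.0 output earlier in the thread; whether a given control is honoured still depends on the camera at run time), an application can set them like any other element property:

#include <gst/gst.h>

int main(int argc, char **argv)
{
	gst_init(&argc, &argv);

	GstElement *pipeline = gst_parse_launch(
		"libcamerasrc name=src ! videoconvert ! autovideosink", NULL);
	GstElement *src = gst_bin_get_by_name(GST_BIN(pipeline), "src");

	/* Fix exposure and gain manually through the generated control
	 * properties; these end up in the ControlList merged into each
	 * request by GstCameraControls::applyControls(). */
	g_object_set(src,
		     "ae-enable", FALSE,
		     "exposure-time", 16666, /* microseconds */
		     "analogue-gain", 2.0,
		     NULL);

	gst_element_set_state(pipeline, GST_STATE_PLAYING);
	g_usleep(3 * G_USEC_PER_SEC);
	gst_element_set_state(pipeline, GST_STATE_NULL);

	gst_object_unref(src);
	gst_object_unref(pipeline);

	return 0;
}

The same properties work from gst-launch-1.0, for example: libcamerasrc ae-enable=false exposure-time=16666 ! ...
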
This commit implements gstreamer controls for the libcamera element by generating the controls from the control_ids_*.yaml files using a new gen-gst-controls.py script. The appropriate meson files are also changed to automatically run the script when building. The gen-gst-controls.py script works similarly to the gen-controls.py script by parsing the control_ids_*.yaml files and generating C++ code for each control. For the controls to be used as gstreamer properties, the type of each control needs to be translated to the appropriate glib type, and a GEnumValue table is generated for each enum control. Then a g_object_class_install_property() call and the matching _get_property() and _set_property() cases are generated for each control. The vendor controls get prefixed with "$vendor-" in the final gstreamer property name. The C++ code generated by the gen-gst-controls.py script is written into the gstlibcamera-controls.cpp.in template file. The matching gstlibcamera-controls.h header defines the GstCameraControls class which handles the installation of the gstreamer properties as well as keeping track of the control values and setting and getting the controls. The bodies of these functions are generated by the Python script. Finally the libcamerasrc element itself is edited to make use of the new GstCameraControls class. The way this works is by defining a PROP_LAST enum variant which is passed to the installProperties() function so the properties are defined with the appropriate offset. When getting or setting a property, PROP_LAST is subtracted from the requested property ID to translate it back into a libcamera::controls:: enum variant. Signed-off-by: Jaslo Ziska <jaslo@ziska.de> --- src/gstreamer/gstlibcamera-controls.cpp.in | 46 +++ src/gstreamer/gstlibcamera-controls.h | 36 ++ src/gstreamer/gstlibcamerasrc.cpp | 17 +- src/gstreamer/meson.build | 14 + utils/gen-gst-controls.py | 398 +++++++++++++++++++++ utils/meson.build | 1 + 6 files changed, 509 insertions(+), 3 deletions(-) create mode 100644 src/gstreamer/gstlibcamera-controls.cpp.in create mode 100644 src/gstreamer/gstlibcamera-controls.h create mode 100755 utils/gen-gst-controls.py
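
For reference, the code the generator emits for a simple boolean control such as AeEnable looks roughly like this. It is hand-expanded from the templates in the patch above, so the exact formatting of the real output may differ, and the description string is elided.

/* Emitted into GstCameraControls::installProperties(), which class_init
 * calls with PROP_LAST as lastPropId: */
g_object_class_install_property(
	klass,
	lastPropId + controls::AE_ENABLE,
	g_param_spec_boolean(
		"ae-enable",
		"Ae Enable",
		"..." /* full description text from control_ids_*.yaml */,
		false,
		(GParamFlags) (GST_PARAM_CONTROLLABLE | G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)
	)
);

/* Matching case in GstCameraControls::setProperty(), reached after the
 * PROP_LAST offset has been subtracted in gst_libcamera_src_set_property(): */
case controls::AE_ENABLE:
	controls_.set(controls::AeEnable, g_value_get_boolean(value));
	return true;
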