1<?xml version="1.0" encoding="utf-8"?> 2<!-- Copyright (C) 2012 The Android Open Source Project 3 4 Licensed under the Apache License, Version 2.0 (the "License"); 5 you may not use this file except in compliance with the License. 6 You may obtain a copy of the License at 7 8 http://www.apache.org/licenses/LICENSE-2.0 9 10 Unless required by applicable law or agreed to in writing, software 11 distributed under the License is distributed on an "AS IS" BASIS, 12 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 See the License for the specific language governing permissions and 14 limitations under the License. 15--> 16<metadata xmlns="http://schemas.android.com/service/camera/metadata/" 17xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 18xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata_properties.xsd"> 19 20 <tags> 21 <tag id="BC"> 22 Needed for backwards compatibility with old Java API 23 </tag> 24 <tag id="V1"> 25 New features for first camera 2 release (API1) 26 </tag> 27 <tag id="RAW"> 28 Needed for useful RAW image processing and DNG file support 29 </tag> 30 <tag id="HAL2"> 31 Entry is only used by camera device HAL 2.x 32 </tag> 33 <tag id="FULL"> 34 Entry is required for full hardware level devices, and optional for other hardware levels 35 </tag> 36 <tag id="DEPTH"> 37 Entry is required for the depth capability. 38 </tag> 39 <tag id="REPROC"> 40 Entry is required for the YUV or PRIVATE reprocessing capability. 41 </tag> 42 <tag id="FUTURE"> 43 Entry is under-specified and is not required for now. This is for book-keeping purpose, 44 do not implement or use it, it may be revised for future. 45 </tag> 46 </tags> 47 48 <types> 49 <typedef name="pairFloatFloat"> 50 <language name="java">android.util.Pair<Float,Float></language> 51 </typedef> 52 <typedef name="pairDoubleDouble"> 53 <language name="java">android.util.Pair<Double,Double></language> 54 </typedef> 55 <typedef name="rectangle"> 56 <language name="java">android.graphics.Rect</language> 57 </typedef> 58 <typedef name="size"> 59 <language name="java">android.util.Size</language> 60 </typedef> 61 <typedef name="string"> 62 <language name="java">String</language> 63 </typedef> 64 <typedef name="boolean"> 65 <language name="java">boolean</language> 66 </typedef> 67 <typedef name="imageFormat"> 68 <language name="java">int</language> 69 </typedef> 70 <typedef name="streamConfigurationMap"> 71 <language name="java">android.hardware.camera2.params.StreamConfigurationMap</language> 72 </typedef> 73 <typedef name="streamConfiguration"> 74 <language name="java">android.hardware.camera2.params.StreamConfiguration</language> 75 </typedef> 76 <typedef name="streamConfigurationDuration"> 77 <language name="java">android.hardware.camera2.params.StreamConfigurationDuration</language> 78 </typedef> 79 <typedef name="face"> 80 <language name="java">android.hardware.camera2.params.Face</language> 81 </typedef> 82 <typedef name="meteringRectangle"> 83 <language name="java">android.hardware.camera2.params.MeteringRectangle</language> 84 </typedef> 85 <typedef name="rangeFloat"> 86 <language name="java">android.util.Range<Float></language> 87 </typedef> 88 <typedef name="rangeInt"> 89 <language name="java">android.util.Range<Integer></language> 90 </typedef> 91 <typedef name="rangeLong"> 92 <language name="java">android.util.Range<Long></language> 93 </typedef> 94 <typedef name="colorSpaceTransform"> 95 <language 
name="java">android.hardware.camera2.params.ColorSpaceTransform</language> 96 </typedef> 97 <typedef name="rggbChannelVector"> 98 <language name="java">android.hardware.camera2.params.RggbChannelVector</language> 99 </typedef> 100 <typedef name="blackLevelPattern"> 101 <language name="java">android.hardware.camera2.params.BlackLevelPattern</language> 102 </typedef> 103 <typedef name="enumList"> 104 <language name="java">int</language> 105 </typedef> 106 <typedef name="sizeF"> 107 <language name="java">android.util.SizeF</language> 108 </typedef> 109 <typedef name="point"> 110 <language name="java">android.graphics.Point</language> 111 </typedef> 112 <typedef name="tonemapCurve"> 113 <language name="java">android.hardware.camera2.params.TonemapCurve</language> 114 </typedef> 115 <typedef name="lensShadingMap"> 116 <language name="java">android.hardware.camera2.params.LensShadingMap</language> 117 </typedef> 118 <typedef name="location"> 119 <language name="java">android.location.Location</language> 120 </typedef> 121 <typedef name="highSpeedVideoConfiguration"> 122 <language name="java">android.hardware.camera2.params.HighSpeedVideoConfiguration</language> 123 </typedef> 124 <typedef name="reprocessFormatsMap"> 125 <language name="java">android.hardware.camera2.params.ReprocessFormatsMap</language> 126 </typedef> 127 </types> 128 129 <namespace name="android"> 130 <section name="colorCorrection"> 131 <controls> 132 <entry name="mode" type="byte" visibility="public" enum="true" hwlevel="full"> 133 <enum> 134 <value>TRANSFORM_MATRIX 135 <notes>Use the android.colorCorrection.transform matrix 136 and android.colorCorrection.gains to do color conversion. 137 138 All advanced white balance adjustments (not specified 139 by our white balance pipeline) must be disabled. 140 141 If AWB is enabled with `android.control.awbMode != OFF`, then 142 TRANSFORM_MATRIX is ignored. The camera device will override 143 this value to either FAST or HIGH_QUALITY. 144 </notes> 145 </value> 146 <value>FAST 147 <notes>Color correction processing must not slow down 148 capture rate relative to sensor raw output. 149 150 Advanced white balance adjustments above and beyond 151 the specified white balance pipeline may be applied. 152 153 If AWB is enabled with `android.control.awbMode != OFF`, then 154 the camera device uses the last frame's AWB values 155 (or defaults if AWB has never been run). 156 </notes> 157 </value> 158 <value>HIGH_QUALITY 159 <notes>Color correction processing operates at improved 160 quality but the capture rate might be reduced (relative to sensor 161 raw output rate) 162 163 Advanced white balance adjustments above and beyond 164 the specified white balance pipeline may be applied. 165 166 If AWB is enabled with `android.control.awbMode != OFF`, then 167 the camera device uses the last frame's AWB values 168 (or defaults if AWB has never been run). 169 </notes> 170 </value> 171 </enum> 172 173 <description> 174 The mode control selects how the image data is converted from the 175 sensor's native color into linear sRGB color. 176 </description> 177 <details> 178 When auto-white balance (AWB) is enabled with android.control.awbMode, this 179 control is overridden by the AWB routine. When AWB is disabled, the 180 application controls how the color mapping is performed. 181 182 We define the expected processing pipeline below. For consistency 183 across devices, this is always the case with TRANSFORM_MATRIX. 
184 185 When either FAST or HIGH_QUALITY is used, the camera device may 186 do additional processing but android.colorCorrection.gains and 187 android.colorCorrection.transform will still be provided by the 188 camera device (in the results) and be roughly correct. 189 190 Switching to TRANSFORM_MATRIX and using the data provided from 191 FAST or HIGH_QUALITY will yield a picture with the same white point 192 as what was produced by the camera device in the earlier frame. 193 194 The expected processing pipeline is as follows: 195 196 ![White balance processing pipeline](android.colorCorrection.mode/processing_pipeline.png) 197 198 The white balance is encoded by two values, a 4-channel white-balance 199 gain vector (applied in the Bayer domain), and a 3x3 color transform 200 matrix (applied after demosaic). 201 202 The 4-channel white-balance gains are defined as: 203 204 android.colorCorrection.gains = [ R G_even G_odd B ] 205 206 where `G_even` is the gain for green pixels on even rows of the 207 output, and `G_odd` is the gain for green pixels on the odd rows. 208 These may be identical for a given camera device implementation; if 209 the camera device does not support a separate gain for even/odd green 210 channels, it will use the `G_even` value, and write `G_odd` equal to 211 `G_even` in the output result metadata. 212 213 The matrices for color transforms are defined as a 9-entry vector: 214 215 android.colorCorrection.transform = [ I0 I1 I2 I3 I4 I5 I6 I7 I8 ] 216 217 which defines a transform from input sensor colors, `P_in = [ r g b ]`, 218 to output linear sRGB, `P_out = [ r' g' b' ]`, 219 220 with colors as follows: 221 222 r' = I0r + I1g + I2b 223 g' = I3r + I4g + I5b 224 b' = I6r + I7g + I8b 225 226 Both the input and output value ranges must match. Overflow/underflow 227 values are clipped to fit within the range. 228 </details> 229 <hal_details> 230 HAL must support both FAST and HIGH_QUALITY if color correction control is available 231 on the camera device, but the underlying implementation can be the same for both modes. 232 That is, if the highest quality implementation on the camera device does not slow down 233 capture rate, then FAST and HIGH_QUALITY should generate the same output. 234 </hal_details> 235 </entry> 236 <entry name="transform" type="rational" visibility="public" 237 type_notes="3x3 rational matrix in row-major order" 238 container="array" typedef="colorSpaceTransform" hwlevel="full"> 239 <array> 240 <size>3</size> 241 <size>3</size> 242 </array> 243 <description>A color transform matrix to use to transform 244 from sensor RGB color space to output linear sRGB color space. 245 </description> 246 <units>Unitless scale factors</units> 247 <details>This matrix is either set by the camera device when the request 248 android.colorCorrection.mode is not TRANSFORM_MATRIX, or 249 directly by the application in the request when the 250 android.colorCorrection.mode is TRANSFORM_MATRIX. 251 252 In the latter case, the camera device may round the matrix to account 253 for precision issues; the final rounded matrix should be reported back 254 in this matrix result metadata. The transform should keep the magnitude 255 of the output color values within `[0, 1.0]` (assuming input color 256 values are within the normalized range `[0, 1.0]`), or clipping may occur. 257 258 The valid range of each matrix element varies on different devices, but 259 values within [-1.5, 3.0] are guaranteed not to be clipped.
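As a purely illustrative, non-normative sketch (assuming the application already holds a `CaptureRequest.Builder` named `builder`), a manual white-balance request that supplies this matrix together with android.colorCorrection.gains might look like the following; the identity values are placeholders, not recommended settings:

    // AWB must be disabled, or TRANSFORM_MATRIX is ignored
    builder.set(CaptureRequest.CONTROL_AWB_MODE,
            CameraMetadata.CONTROL_AWB_MODE_OFF);
    builder.set(CaptureRequest.COLOR_CORRECTION_MODE,
            CameraMetadata.COLOR_CORRECTION_MODE_TRANSFORM_MATRIX);
    // 3x3 identity transform, nine rationals in row-major order
    builder.set(CaptureRequest.COLOR_CORRECTION_TRANSFORM,
            new ColorSpaceTransform(new Rational[] {
                new Rational(1, 1), new Rational(0, 1), new Rational(0, 1),
                new Rational(0, 1), new Rational(1, 1), new Rational(0, 1),
                new Rational(0, 1), new Rational(0, 1), new Rational(1, 1)
            }));
    // Unity gains for [R, G_even, G_odd, B]
    builder.set(CaptureRequest.COLOR_CORRECTION_GAINS,
            new RggbChannelVector(1.0f, 1.0f, 1.0f, 1.0f));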
260 </details> 261 </entry> 262 <entry name="gains" type="float" visibility="public" 263 type_notes="A 1D array of floats for 4 color channel gains" 264 container="array" typedef="rggbChannelVector" hwlevel="full"> 265 <array> 266 <size>4</size> 267 </array> 268 <description>Gains applying to Bayer raw color channels for 269 white-balance.</description> 270 <units>Unitless gain factors</units> 271 <details> 272 These per-channel gains are either set by the camera device 273 when the request android.colorCorrection.mode is not 274 TRANSFORM_MATRIX, or directly by the application in the 275 request when the android.colorCorrection.mode is 276 TRANSFORM_MATRIX. 277 278 The gains in the result metadata are the gains actually 279 applied by the camera device to the current frame. 280 281 The valid range of gains varies on different devices, but gains 282 between [1.0, 3.0] are guaranteed not to be clipped. Even if a given 283 device allows gains below 1.0, this is usually not recommended because 284 this can create color artifacts. 285 </details> 286 <hal_details> 287 The 4-channel white-balance gains are defined in 288 the order of `[R G_even G_odd B]`, where `G_even` is the gain 289 for green pixels on even rows of the output, and `G_odd` 290 is the gain for green pixels on the odd rows. 291 292 If a HAL does not support a separate gain for even/odd green 293 channels, it must use the `G_even` value, and write 294 `G_odd` equal to `G_even` in the output result metadata. 295 </hal_details> 296 </entry> 297 <entry name="aberrationMode" type="byte" visibility="public" enum="true" hwlevel="legacy"> 298 <enum> 299 <value>OFF 300 <notes> 301 No aberration correction is applied. 302 </notes> 303 </value> 304 <value>FAST 305 <notes> 306 Aberration correction will not slow down capture rate 307 relative to sensor raw output. 308 </notes> 309 </value> 310 <value>HIGH_QUALITY 311 <notes> 312 Aberration correction operates at improved quality but the capture rate might be 313 reduced (relative to sensor raw output rate) 314 </notes> 315 </value> 316 </enum> 317 <description> 318 Mode of operation for the chromatic aberration correction algorithm. 319 </description> 320 <range>android.colorCorrection.availableAberrationModes</range> 321 <details> 322 Chromatic (color) aberration is caused by the fact that different wavelengths of light 323 can not focus on the same point after exiting from the lens. This metadata defines 324 the high level control of chromatic aberration correction algorithm, which aims to 325 minimize the chromatic artifacts that may occur along the object boundaries in an 326 image. 327 328 FAST/HIGH_QUALITY both mean that camera device determined aberration 329 correction will be applied. HIGH_QUALITY mode indicates that the camera device will 330 use the highest-quality aberration correction algorithms, even if it slows down 331 capture rate. FAST means the camera device will not slow down capture rate when 332 applying aberration correction. 333 334 LEGACY devices will always be in FAST mode. 
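A short, non-normative sketch of how an application might pick an aberration correction mode (assuming it holds the camera's `CameraCharacteristics` as `characteristics` and a `CaptureRequest.Builder` as `builder`; null checks omitted):

    int[] modes = characteristics.get(
            CameraCharacteristics.COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES);
    // Start from the first advertised mode, then prefer HIGH_QUALITY if listed
    int chosen = modes[0];
    for (int mode : modes) {
        if (mode == CameraMetadata.COLOR_CORRECTION_ABERRATION_MODE_HIGH_QUALITY) {
            chosen = mode;
        }
    }
    builder.set(CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE, chosen);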
335 </details> 336 </entry> 337 </controls> 338 <dynamic> 339 <clone entry="android.colorCorrection.mode" kind="controls"> 340 </clone> 341 <clone entry="android.colorCorrection.transform" kind="controls"> 342 </clone> 343 <clone entry="android.colorCorrection.gains" kind="controls"> 344 </clone> 345 <clone entry="android.colorCorrection.aberrationMode" kind="controls"> 346 </clone> 347 </dynamic> 348 <static> 349 <entry name="availableAberrationModes" type="byte" visibility="public" 350 type_notes="list of enums" container="array" typedef="enumList" hwlevel="legacy"> 351 <array> 352 <size>n</size> 353 </array> 354 <description> 355 List of aberration correction modes for android.colorCorrection.aberrationMode that are 356 supported by this camera device. 357 </description> 358 <range>Any value listed in android.colorCorrection.aberrationMode</range> 359 <details> 360 This key lists the valid modes for android.colorCorrection.aberrationMode. If no 361 aberration correction modes are available for a device, this list will solely include 362 OFF mode. All camera devices will support either OFF or FAST mode. 363 364 Camera devices that support the MANUAL_POST_PROCESSING capability will always list 365 OFF mode. This includes all FULL level devices. 366 367 LEGACY devices will always only support FAST mode. 368 </details> 369 <hal_details> 370 HAL must support both FAST and HIGH_QUALITY if chromatic aberration control is available 371 on the camera device, but the underlying implementation can be the same for both modes. 372 That is, if the highest quality implementation on the camera device does not slow down 373 capture rate, then FAST and HIGH_QUALITY will generate the same output. 374 </hal_details> 375 <tag id="V1" /> 376 </entry> 377 </static> 378 </section> 379 <section name="control"> 380 <controls> 381 <entry name="aeAntibandingMode" type="byte" visibility="public" 382 enum="true" hwlevel="legacy"> 383 <enum> 384 <value>OFF 385 <notes> 386 The camera device will not adjust exposure duration to 387 avoid banding problems. 388 </notes> 389 </value> 390 <value>50HZ 391 <notes> 392 The camera device will adjust exposure duration to 393 avoid banding problems with 50Hz illumination sources. 394 </notes> 395 </value> 396 <value>60HZ 397 <notes> 398 The camera device will adjust exposure duration to 399 avoid banding problems with 60Hz illumination 400 sources. 401 </notes> 402 </value> 403 <value>AUTO 404 <notes> 405 The camera device will automatically adapt its 406 antibanding routine to the current illumination 407 condition. This is the default mode if AUTO is 408 available on given camera device. 409 </notes> 410 </value> 411 </enum> 412 <description> 413 The desired setting for the camera device's auto-exposure 414 algorithm's antibanding compensation. 415 </description> 416 <range> 417 android.control.aeAvailableAntibandingModes 418 </range> 419 <details> 420 Some kinds of lighting fixtures, such as some fluorescent 421 lights, flicker at the rate of the power supply frequency 422 (60Hz or 50Hz, depending on country). While this is 423 typically not noticeable to a person, it can be visible to 424 a camera device. If a camera sets its exposure time to the 425 wrong value, the flicker may become visible in the 426 viewfinder as flicker or in a final captured image, as a 427 set of variable-brightness bands across the image. 
428 429 Therefore, the auto-exposure routines of camera devices 430 include antibanding routines that ensure that the chosen 431 exposure value will not cause such banding. The choice of 432 exposure time depends on the rate of flicker, which the 433 camera device can detect automatically, or the expected 434 rate can be selected by the application using this 435 control. 436 437 A given camera device may not support all of the possible 438 options for the antibanding mode. The 439 android.control.aeAvailableAntibandingModes key contains 440 the available modes for a given camera device. 441 442 AUTO mode is the default if it is available on given 443 camera device. When AUTO mode is not available, the 444 default will be either 50HZ or 60HZ, and both 50HZ 445 and 60HZ will be available. 446 447 If manual exposure control is enabled (by setting 448 android.control.aeMode or android.control.mode to OFF), 449 then this setting has no effect, and the application must 450 ensure it selects exposure times that do not cause banding 451 issues. The android.statistics.sceneFlicker key can assist 452 the application in this. 453 </details> 454 <hal_details> 455 For all capture request templates, this field must be set 456 to AUTO if AUTO mode is available. If AUTO is not available, 457 the default must be either 50HZ or 60HZ, and both 50HZ and 458 60HZ must be available. 459 460 If manual exposure control is enabled (by setting 461 android.control.aeMode or android.control.mode to OFF), 462 then the exposure values provided by the application must not be 463 adjusted for antibanding. 464 </hal_details> 465 <tag id="BC" /> 466 </entry> 467 <entry name="aeExposureCompensation" type="int32" visibility="public" hwlevel="legacy"> 468 <description>Adjustment to auto-exposure (AE) target image 469 brightness.</description> 470 <units>Compensation steps</units> 471 <range>android.control.aeCompensationRange</range> 472 <details> 473 The adjustment is measured as a count of steps, with the 474 step size defined by android.control.aeCompensationStep and the 475 allowed range by android.control.aeCompensationRange. 476 477 For example, if the exposure value (EV) step is 0.333, '6' 478 will mean an exposure compensation of +2 EV; -3 will mean an 479 exposure compensation of -1 EV. One EV represents a doubling 480 of image brightness. Note that this control will only be 481 effective if android.control.aeMode `!=` OFF. This control 482 will take effect even when android.control.aeLock `== true`. 483 484 In the event of exposure compensation value being changed, camera device 485 may take several frames to reach the newly requested exposure target. 486 During that time, android.control.aeState field will be in the SEARCHING 487 state. Once the new exposure target is reached, android.control.aeState will 488 change from SEARCHING to either CONVERGED, LOCKED (if AE lock is enabled), or 489 FLASH_REQUIRED (if the scene is too dark for still capture). 490 </details> 491 <tag id="BC" /> 492 </entry> 493 <entry name="aeLock" type="byte" visibility="public" enum="true" 494 typedef="boolean" hwlevel="legacy"> 495 <enum> 496 <value>OFF 497 <notes>Auto-exposure lock is disabled; the AE algorithm 498 is free to update its parameters.</notes></value> 499 <value>ON 500 <notes>Auto-exposure lock is enabled; the AE algorithm 501 must not update the exposure and sensitivity parameters 502 while the lock is active. 
503 504 android.control.aeExposureCompensation setting changes 505 will still take effect while auto-exposure is locked. 506 507 Some rare LEGACY devices may not support 508 this, in which case the value will always be overridden to OFF. 509 </notes></value> 510 </enum> 511 <description>Whether auto-exposure (AE) is currently locked to its latest 512 calculated values.</description> 513 <details> 514 When set to `true` (ON), the AE algorithm is locked to its latest parameters, 515 and will not change exposure settings until the lock is set to `false` (OFF). 516 517 Note that even when AE is locked, the flash may be fired if 518 the android.control.aeMode is ON_AUTO_FLASH / 519 ON_ALWAYS_FLASH / ON_AUTO_FLASH_REDEYE. 520 521 When android.control.aeExposureCompensation is changed, even if the AE lock 522 is ON, the camera device will still adjust its exposure value. 523 524 If AE precapture is triggered (see android.control.aePrecaptureTrigger) 525 when AE is already locked, the camera device will not change the exposure time 526 (android.sensor.exposureTime) and sensitivity (android.sensor.sensitivity) 527 parameters. The flash may be fired if the android.control.aeMode 528 is ON_AUTO_FLASH/ON_AUTO_FLASH_REDEYE and the scene is too dark. If the 529 android.control.aeMode is ON_ALWAYS_FLASH, the scene may become overexposed. 530 Similarly, AE precapture trigger CANCEL has no effect when AE is already locked. 531 532 When an AE precapture sequence is triggered, AE unlock will not be able to unlock 533 the AE if AE is locked by the camera device internally during precapture metering 534 sequence In other words, submitting requests with AE unlock has no effect for an 535 ongoing precapture metering sequence. Otherwise, the precapture metering sequence 536 will never succeed in a sequence of preview requests where AE lock is always set 537 to `false`. 538 539 Since the camera device has a pipeline of in-flight requests, the settings that 540 get locked do not necessarily correspond to the settings that were present in the 541 latest capture result received from the camera device, since additional captures 542 and AE updates may have occurred even before the result was sent out. If an 543 application is switching between automatic and manual control and wishes to eliminate 544 any flicker during the switch, the following procedure is recommended: 545 546 1. Starting in auto-AE mode: 547 2. Lock AE 548 3. Wait for the first result to be output that has the AE locked 549 4. Copy exposure settings from that result into a request, set the request to manual AE 550 5. Submit the capture request, proceed to run manual AE as desired. 551 552 See android.control.aeState for AE lock related state transition details. 553 </details> 554 <tag id="BC" /> 555 </entry> 556 <entry name="aeMode" type="byte" visibility="public" enum="true" hwlevel="legacy"> 557 <enum> 558 <value>OFF 559 <notes> 560 The camera device's autoexposure routine is disabled. 561 562 The application-selected android.sensor.exposureTime, 563 android.sensor.sensitivity and 564 android.sensor.frameDuration are used by the camera 565 device, along with android.flash.* fields, if there's 566 a flash unit for this camera device. 567 568 Note that auto-white balance (AWB) and auto-focus (AF) 569 behavior is device dependent when AE is in OFF mode. 570 To have consistent behavior across different devices, 571 it is recommended to either set AWB and AF to OFF mode 572 or lock AWB and AF before setting AE to OFF. 
573 See android.control.awbMode, android.control.afMode, 574 android.control.awbLock, and android.control.afTrigger 575 for more details. 576 577 LEGACY devices do not support the OFF mode and will 578 override attempts to use this value to ON. 579 </notes> 580 </value> 581 <value>ON 582 <notes> 583 The camera device's autoexposure routine is active, 584 with no flash control. 585 586 The application's values for 587 android.sensor.exposureTime, 588 android.sensor.sensitivity, and 589 android.sensor.frameDuration are ignored. The 590 application has control over the various 591 android.flash.* fields. 592 </notes> 593 </value> 594 <value>ON_AUTO_FLASH 595 <notes> 596 Like ON, except that the camera device also controls 597 the camera's flash unit, firing it in low-light 598 conditions. 599 600 The flash may be fired during a precapture sequence 601 (triggered by android.control.aePrecaptureTrigger) and 602 may be fired for captures for which the 603 android.control.captureIntent field is set to 604 STILL_CAPTURE 605 </notes> 606 </value> 607 <value>ON_ALWAYS_FLASH 608 <notes> 609 Like ON, except that the camera device also controls 610 the camera's flash unit, always firing it for still 611 captures. 612 613 The flash may be fired during a precapture sequence 614 (triggered by android.control.aePrecaptureTrigger) and 615 will always be fired for captures for which the 616 android.control.captureIntent field is set to 617 STILL_CAPTURE 618 </notes> 619 </value> 620 <value>ON_AUTO_FLASH_REDEYE 621 <notes> 622 Like ON_AUTO_FLASH, but with automatic red eye 623 reduction. 624 625 If deemed necessary by the camera device, a red eye 626 reduction flash will fire during the precapture 627 sequence. 628 </notes> 629 </value> 630 </enum> 631 <description>The desired mode for the camera device's 632 auto-exposure routine.</description> 633 <range>android.control.aeAvailableModes</range> 634 <details> 635 This control is only effective if android.control.mode is 636 AUTO. 637 638 When set to any of the ON modes, the camera device's 639 auto-exposure routine is enabled, overriding the 640 application's selected exposure time, sensor sensitivity, 641 and frame duration (android.sensor.exposureTime, 642 android.sensor.sensitivity, and 643 android.sensor.frameDuration). If one of the FLASH modes 644 is selected, the camera device's flash unit controls are 645 also overridden. 646 647 The FLASH modes are only available if the camera device 648 has a flash unit (android.flash.info.available is `true`). 649 650 If flash TORCH mode is desired, this field must be set to 651 ON or OFF, and android.flash.mode set to TORCH. 652 653 When set to any of the ON modes, the values chosen by the 654 camera device auto-exposure routine for the overridden 655 fields for a given capture will be available in its 656 CaptureResult. 657 </details> 658 <tag id="BC" /> 659 </entry> 660 <entry name="aeRegions" type="int32" visibility="public" 661 optional="true" container="array" typedef="meteringRectangle"> 662 <array> 663 <size>5</size> 664 <size>area_count</size> 665 </array> 666 <description>List of metering areas to use for auto-exposure adjustment.</description> 667 <units>Pixel coordinates within android.sensor.info.activeArraySize</units> 668 <range>Coordinates must be between `[(0,0), (width, height))` of 669 android.sensor.info.activeArraySize</range> 670 <details> 671 Not available if android.control.maxRegionsAe is 0. 672 Otherwise will always be present. 
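For illustration only (assuming the application holds the camera's `CameraCharacteristics` as `characteristics` and a `CaptureRequest.Builder` as `builder`, and that android.control.maxRegionsAe is at least 1), a single centered metering rectangle using the coordinate system and weight semantics described in the rest of this section might be set as:

    Rect active = characteristics.get(
            CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
    // Meter on the middle half of the active array with maximum weight
    MeteringRectangle region = new MeteringRectangle(
            active.width() / 4, active.height() / 4,
            active.width() / 2, active.height() / 2,
            MeteringRectangle.METERING_WEIGHT_MAX);
    builder.set(CaptureRequest.CONTROL_AE_REGIONS,
            new MeteringRectangle[] { region });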
673 674 The maximum number of regions supported by the device is determined by the value 675 of android.control.maxRegionsAe. 676 677 The coordinate system is based on the active pixel array, 678 with (0,0) being the top-left pixel in the active pixel array, and 679 (android.sensor.info.activeArraySize.width - 1, 680 android.sensor.info.activeArraySize.height - 1) being the 681 bottom-right pixel in the active pixel array. 682 683 The weight must be within `[0, 1000]`, and represents a weight 684 for every pixel in the area. This means that a large metering area 685 with the same weight as a smaller area will have more effect in 686 the metering result. Metering areas can partially overlap and the 687 camera device will add the weights in the overlap region. 688 689 The weights are relative to weights of other exposure metering regions, so if only one 690 region is used, all non-zero weights will have the same effect. A region with 0 691 weight is ignored. 692 693 If all regions have 0 weight, then no specific metering area needs to be used by the 694 camera device. 695 696 If the metering region is outside the used android.scaler.cropRegion returned in 697 capture result metadata, the camera device will ignore the sections outside the crop 698 region and output only the intersection rectangle as the metering region in the result 699 metadata. If the region is entirely outside the crop region, it will be ignored and 700 not reported in the result metadata. 701 </details> 702 <hal_details> 703 The HAL level representation of MeteringRectangle[] is a 704 int[5 * area_count]. 705 Every five elements represent a metering region of 706 (xmin, ymin, xmax, ymax, weight). 707 The rectangle is defined to be inclusive on xmin and ymin, but 708 exclusive on xmax and ymax. 709 </hal_details> 710 <tag id="BC" /> 711 </entry> 712 <entry name="aeTargetFpsRange" type="int32" visibility="public" 713 container="array" typedef="rangeInt" hwlevel="legacy"> 714 <array> 715 <size>2</size> 716 </array> 717 <description>Range over which the auto-exposure routine can 718 adjust the capture frame rate to maintain good 719 exposure.</description> 720 <units>Frames per second (FPS)</units> 721 <range>Any of the entries in android.control.aeAvailableTargetFpsRanges</range> 722 <details>Only constrains auto-exposure (AE) algorithm, not 723 manual control of android.sensor.exposureTime and 724 android.sensor.frameDuration.</details> 725 <tag id="BC" /> 726 </entry> 727 <entry name="aePrecaptureTrigger" type="byte" visibility="public" 728 enum="true" hwlevel="limited"> 729 <enum> 730 <value>IDLE 731 <notes>The trigger is idle.</notes> 732 </value> 733 <value>START 734 <notes>The precapture metering sequence will be started 735 by the camera device. 736 737 The exact effect of the precapture trigger depends on 738 the current AE mode and state.</notes> 739 </value> 740 <value>CANCEL 741 <notes>The camera device will cancel any currently active or completed 742 precapture metering sequence, the auto-exposure routine will return to its 743 initial state.</notes> 744 </value> 745 </enum> 746 <description>Whether the camera device will trigger a precapture 747 metering sequence when it processes this request.</description> 748 <details>This entry is normally set to IDLE, or is not 749 included at all in the request settings. When included and 750 set to START, the camera device will trigger the auto-exposure (AE) 751 precapture metering sequence. 
752 753 When set to CANCEL, the camera device will cancel any active 754 precapture metering trigger, and return to its initial AE state. 755 If a precapture metering sequence is already completed, and the camera 756 device has implicitly locked the AE for subsequent still capture, the 757 CANCEL trigger will unlock the AE and return to its initial AE state. 758 759 The precapture sequence should be triggered before starting a 760 high-quality still capture for final metering decisions to 761 be made, and for firing pre-capture flash pulses to estimate 762 scene brightness and required final capture flash power, when 763 the flash is enabled. 764 765 Normally, this entry should be set to START for only a 766 single request, and the application should wait until the 767 sequence completes before starting a new one. 768 769 When a precapture metering sequence is finished, the camera device 770 may lock the auto-exposure routine internally to be able to accurately expose the 771 subsequent still capture image (`android.control.captureIntent == STILL_CAPTURE`). 772 For this case, the AE may not resume normal scan if no subsequent still capture is 773 submitted. To ensure that the AE routine restarts normal scan, the application should 774 submit a request with `android.control.aeLock == true`, followed by a request 775 with `android.control.aeLock == false`, if the application decides not to submit a 776 still capture request after the precapture sequence completes. Alternatively, for 777 API level 23 or newer devices, the CANCEL can be used to unlock the camera device 778 internally locked AE if the application doesn't submit a still capture request after 779 the AE precapture trigger. Note that the CANCEL was added in API level 23, and must not 780 be used in devices that have earlier API levels. 781 782 The exact effect of the auto-exposure (AE) precapture trigger 783 depends on the current AE mode and state; see 784 android.control.aeState for AE precapture state transition 785 details. 786 787 On LEGACY-level devices, the precapture trigger is not supported; 788 capturing a high-resolution JPEG image will automatically trigger a 789 precapture sequence before the high-resolution capture, including 790 potentially firing a pre-capture flash. 791 792 Using the precapture trigger and the auto-focus trigger android.control.afTrigger 793 simultaneously is allowed. However, since these triggers often require cooperation between 794 the auto-focus and auto-exposure routines (for example, the flash may need to be enabled for a 795 focus sweep), the camera device may delay acting on a later trigger until the previous 796 trigger has been fully handled. This may lead to longer intervals between the trigger and 797 changes to android.control.aeState indicating the start of the precapture sequence, for 798 example. 799 800 If both the precapture and the auto-focus trigger are activated on the same request, then 801 the camera device will complete them in the optimal order for that device. 802 </details> 803 <hal_details> 804 The HAL must support triggering the AE precapture trigger while an AF trigger is active 805 (and vice versa), or at the same time as the AF trigger. It is acceptable for the HAL to 806 treat these as two consecutive triggers, for example handling the AF trigger and then the 807 AE trigger. Or the HAL may choose to optimize the case with both triggers fired at once, 808 to minimize the latency for converging both focus and exposure/flash usage.
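For reference, the application-side sequence that exercises this trigger typically looks like the following sketch (illustrative only, not a requirement; it assumes the application holds a configured `CameraCaptureSession` `session`, a `Handler` `handler`, a `CameraCaptureSession.CaptureCallback` `captureCallback`, and the repeating preview's `CaptureRequest.Builder` `builder`):

    // Fire the precapture trigger in a single capture
    builder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
            CameraMetadata.CONTROL_AE_PRECAPTURE_TRIGGER_START);
    session.capture(builder.build(), captureCallback, handler);
    // Reset the trigger so subsequent repeating requests stay IDLE
    builder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
            CameraMetadata.CONTROL_AE_PRECAPTURE_TRIGGER_IDLE);
    // The application then waits for android.control.aeState to report
    // that precapture has finished before issuing the still capture.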
809 </hal_details> 810 <tag id="BC" /> 811 </entry> 812 <entry name="afMode" type="byte" visibility="public" enum="true" 813 hwlevel="legacy"> 814 <enum> 815 <value>OFF 816 <notes>The auto-focus routine does not control the lens; 817 android.lens.focusDistance is controlled by the 818 application.</notes></value> 819 <value>AUTO 820 <notes>Basic automatic focus mode. 821 822 In this mode, the lens does not move unless 823 the autofocus trigger action is called. When that trigger 824 is activated, AF will transition to ACTIVE_SCAN, then to 825 the outcome of the scan (FOCUSED or NOT_FOCUSED). 826 827 Always supported if lens is not fixed focus. 828 829 Use android.lens.info.minimumFocusDistance to determine if lens 830 is fixed-focus. 831 832 Triggering AF_CANCEL resets the lens position to default, 833 and sets the AF state to INACTIVE.</notes></value> 834 <value>MACRO 835 <notes>Close-up focusing mode. 836 837 In this mode, the lens does not move unless the 838 autofocus trigger action is called. When that trigger is 839 activated, AF will transition to ACTIVE_SCAN, then to 840 the outcome of the scan (FOCUSED or NOT_FOCUSED). This 841 mode is optimized for focusing on objects very close to 842 the camera. 843 844 When that trigger is activated, AF will transition to 845 ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or 846 NOT_FOCUSED). Triggering cancel AF resets the lens 847 position to default, and sets the AF state to 848 INACTIVE.</notes></value> 849 <value>CONTINUOUS_VIDEO 850 <notes>In this mode, the AF algorithm modifies the lens 851 position continually to attempt to provide a 852 constantly-in-focus image stream. 853 854 The focusing behavior should be suitable for good quality 855 video recording; typically this means slower focus 856 movement and no overshoots. When the AF trigger is not 857 involved, the AF algorithm should start in INACTIVE state, 858 and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED 859 states as appropriate. When the AF trigger is activated, 860 the algorithm should immediately transition into 861 AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the 862 lens position until a cancel AF trigger is received. 863 864 Once cancel is received, the algorithm should transition 865 back to INACTIVE and resume passive scan. Note that this 866 behavior is not identical to CONTINUOUS_PICTURE, since an 867 ongoing PASSIVE_SCAN must immediately be 868 canceled.</notes></value> 869 <value>CONTINUOUS_PICTURE 870 <notes>In this mode, the AF algorithm modifies the lens 871 position continually to attempt to provide a 872 constantly-in-focus image stream. 873 874 The focusing behavior should be suitable for still image 875 capture; typically this means focusing as fast as 876 possible. When the AF trigger is not involved, the AF 877 algorithm should start in INACTIVE state, and then 878 transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as 879 appropriate as it attempts to maintain focus. When the AF 880 trigger is activated, the algorithm should finish its 881 PASSIVE_SCAN if active, and then transition into 882 AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the 883 lens position until a cancel AF trigger is received. 884 885 When the AF cancel trigger is activated, the algorithm 886 should transition back to INACTIVE and then act as if it 887 has just been started.</notes></value> 888 <value>EDOF 889 <notes>Extended depth of field (digital focus) mode. 
890 891 The camera device will produce images with an extended 892 depth of field automatically; no special focusing 893 operations need to be done before taking a picture. 894 895 AF triggers are ignored, and the AF state will always be 896 INACTIVE.</notes></value> 897 </enum> 898 <description>Whether auto-focus (AF) is currently enabled, and what 899 mode it is set to.</description> 900 <range>android.control.afAvailableModes</range> 901 <details>Only effective if android.control.mode = AUTO and the lens is not fixed focus 902 (i.e. `android.lens.info.minimumFocusDistance > 0`). Also note that 903 when android.control.aeMode is OFF, the behavior of AF is device 904 dependent. It is recommended to lock AF by using android.control.afTrigger before 905 setting android.control.aeMode to OFF, or set AF mode to OFF when AE is OFF. 906 907 If the lens is controlled by the camera device auto-focus algorithm, 908 the camera device will report the current AF status in android.control.afState 909 in result metadata.</details> 910 <hal_details> 911 When afMode is AUTO or MACRO, the lens must not move until an AF trigger is sent in a 912 request (android.control.afTrigger `==` START). After an AF trigger, the afState will end 913 up with either FOCUSED_LOCKED or NOT_FOCUSED_LOCKED state (see 914 android.control.afState for detailed state transitions), which indicates that the lens is 915 locked and will not move. If camera movement (e.g. tilting camera) causes the lens to move 916 after the lens is locked, the HAL must compensate this movement appropriately such that 917 the same focal plane remains in focus. 918 919 When afMode is one of the continuous auto focus modes, the HAL is free to start a AF 920 scan whenever it's not locked. When the lens is locked after an AF trigger 921 (see android.control.afState for detailed state transitions), the HAL should maintain the 922 same lock behavior as above. 923 924 When afMode is OFF, the application controls focus manually. The accuracy of the 925 focus distance control depends on the android.lens.info.focusDistanceCalibration. 926 However, the lens must not move regardless of the camera movement for any focus distance 927 manual control. 928 929 To put this in concrete terms, if the camera has lens elements which may move based on 930 camera orientation or motion (e.g. due to gravity), then the HAL must drive the lens to 931 remain in a fixed position invariant to the camera's orientation or motion, for example, 932 by using accelerometer measurements in the lens control logic. This is a typical issue 933 that will arise on camera modules with open-loop VCMs. 934 </hal_details> 935 <tag id="BC" /> 936 </entry> 937 <entry name="afRegions" type="int32" visibility="public" 938 optional="true" container="array" typedef="meteringRectangle"> 939 <array> 940 <size>5</size> 941 <size>area_count</size> 942 </array> 943 <description>List of metering areas to use for auto-focus.</description> 944 <units>Pixel coordinates within android.sensor.info.activeArraySize</units> 945 <range>Coordinates must be between `[(0,0), (width, height))` of 946 android.sensor.info.activeArraySize</range> 947 <details> 948 Not available if android.control.maxRegionsAf is 0. 949 Otherwise will always be present. 950 951 The maximum number of focus areas supported by the device is determined by the value 952 of android.control.maxRegionsAf. 
953 954 The coordinate system is based on the active pixel array, 955 with (0,0) being the top-left pixel in the active pixel array, and 956 (android.sensor.info.activeArraySize.width - 1, 957 android.sensor.info.activeArraySize.height - 1) being the 958 bottom-right pixel in the active pixel array. 959 960 The weight must be within `[0, 1000]`, and represents a weight 961 for every pixel in the area. This means that a large metering area 962 with the same weight as a smaller area will have more effect in 963 the metering result. Metering areas can partially overlap and the 964 camera device will add the weights in the overlap region. 965 966 The weights are relative to weights of other metering regions, so if only one region 967 is used, all non-zero weights will have the same effect. A region with 0 weight is 968 ignored. 969 970 If all regions have 0 weight, then no specific metering area needs to be used by the 971 camera device. 972 973 If the metering region is outside the used android.scaler.cropRegion returned in 974 capture result metadata, the camera device will ignore the sections outside the crop 975 region and output only the intersection rectangle as the metering region in the result 976 metadata. If the region is entirely outside the crop region, it will be ignored and 977 not reported in the result metadata. 978 </details> 979 <hal_details> 980 The HAL level representation of MeteringRectangle[] is an 981 int[5 * area_count]. 982 Every five elements represent a metering region of 983 (xmin, ymin, xmax, ymax, weight). 984 The rectangle is defined to be inclusive on xmin and ymin, but 985 exclusive on xmax and ymax. 986 </hal_details> 987 <tag id="BC" /> 988 </entry> 989 <entry name="afTrigger" type="byte" visibility="public" enum="true" 990 hwlevel="legacy"> 991 <enum> 992 <value>IDLE 993 <notes>The trigger is idle.</notes> 994 </value> 995 <value>START 996 <notes>Autofocus will trigger now.</notes> 997 </value> 998 <value>CANCEL 999 <notes>Autofocus will return to its initial 1000 state, and cancel any currently active trigger.</notes> 1001 </value> 1002 </enum> 1003 <description> 1004 Whether the camera device will trigger autofocus for this request. 1005 </description> 1006 <details>This entry is normally set to IDLE, or is not 1007 included at all in the request settings. 1008 1009 When included and set to START, the camera device will trigger the 1010 autofocus algorithm. If autofocus is disabled, this trigger has no effect. 1011 1012 When set to CANCEL, the camera device will cancel any active trigger, 1013 and return to its initial AF state. 1014 1015 Generally, applications should set this entry to START or CANCEL for only a 1016 single capture, and then return it to IDLE (or not set at all). Specifying 1017 START for multiple captures in a row means restarting the AF operation over 1018 and over again. 1019 1020 See android.control.afState for what the trigger means for each AF mode. 1021 1022 Using the autofocus trigger and the precapture trigger android.control.aePrecaptureTrigger 1023 simultaneously is allowed. However, since these triggers often require cooperation between 1024 the auto-focus and auto-exposure routines (for example, the flash may need to be enabled for a 1025 focus sweep), the camera device may delay acting on a later trigger until the previous 1026 trigger has been fully handled. This may lead to longer intervals between the trigger and 1027 changes to android.control.afState, for example.
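A minimal, non-normative sketch of a one-shot autofocus trigger from the application side (assuming an already-configured `CameraCaptureSession` `session`, a `Handler` `handler`, a `CameraCaptureSession.CaptureCallback` `captureCallback`, and a preview `CaptureRequest.Builder` `builder` whose AF mode is AUTO or MACRO):

    // Submit a single capture carrying the START trigger
    builder.set(CaptureRequest.CONTROL_AF_TRIGGER,
            CameraMetadata.CONTROL_AF_TRIGGER_START);
    session.capture(builder.build(), captureCallback, handler);
    // Return the trigger to IDLE so repeating requests do not re-trigger AF
    builder.set(CaptureRequest.CONTROL_AF_TRIGGER,
            CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
    // Then watch android.control.afState in the capture results for
    // FOCUSED_LOCKED or NOT_FOCUSED_LOCKED before taking the still.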
1028 </details> 1029 <hal_details> 1030 The HAL must support triggering the AF trigger while an AE precapture trigger is active 1031 (and vice versa), or at the same time as the AE trigger. It is acceptable for the HAL to 1032 treat these as two consecutive triggers, for example handling the AF trigger and then the 1033 AE trigger. Or the HAL may choose to optimize the case with both triggers fired at once, 1034 to minimize the latency for converging both focus and exposure/flash usage. 1035 </hal_details> 1036 <tag id="BC" /> 1037 </entry> 1038 <entry name="awbLock" type="byte" visibility="public" enum="true" 1039 typedef="boolean" hwlevel="legacy"> 1040 <enum> 1041 <value>OFF 1042 <notes>Auto-white balance lock is disabled; the AWB 1043 algorithm is free to update its parameters if in AUTO 1044 mode.</notes></value> 1045 <value>ON 1046 <notes>Auto-white balance lock is enabled; the AWB 1047 algorithm will not update its parameters while the lock 1048 is active.</notes></value> 1049 </enum> 1050 <description>Whether auto-white balance (AWB) is currently locked to its 1051 latest calculated values.</description> 1052 <details> 1053 When set to `true` (ON), the AWB algorithm is locked to its latest parameters, 1054 and will not change color balance settings until the lock is set to `false` (OFF). 1055 1056 Since the camera device has a pipeline of in-flight requests, the settings that 1057 get locked do not necessarily correspond to the settings that were present in the 1058 latest capture result received from the camera device, since additional captures 1059 and AWB updates may have occurred even before the result was sent out. If an 1060 application is switching between automatic and manual control and wishes to eliminate 1061 any flicker during the switch, the following procedure is recommended: 1062 1063 1. Starting in auto-AWB mode: 1064 2. Lock AWB 1065 3. Wait for the first result to be output that has the AWB locked 1066 4. Copy AWB settings from that result into a request, set the request to manual AWB 1067 5. Submit the capture request, proceed to run manual AWB as desired. 1068 1069 Note that AWB lock is only meaningful when 1070 android.control.awbMode is in the AUTO mode; in other modes, 1071 AWB is already fixed to a specific setting. 1072 1073 Some LEGACY devices may not support ON; the value is then overridden to OFF. 1074 </details> 1075 <tag id="BC" /> 1076 </entry> 1077 <entry name="awbMode" type="byte" visibility="public" enum="true" 1078 hwlevel="legacy"> 1079 <enum> 1080 <value>OFF 1081 <notes> 1082 The camera device's auto-white balance routine is disabled. 1083 1084 The application-selected color transform matrix 1085 (android.colorCorrection.transform) and gains 1086 (android.colorCorrection.gains) are used by the camera 1087 device for manual white balance control. 1088 </notes> 1089 </value> 1090 <value>AUTO 1091 <notes> 1092 The camera device's auto-white balance routine is active. 1093 1094 The application's values for android.colorCorrection.transform 1095 and android.colorCorrection.gains are ignored. 1096 For devices that support the MANUAL_POST_PROCESSING capability, the 1097 values used by the camera device for the transform and gains 1098 will be available in the capture result for this request. 1099 </notes> 1100 </value> 1101 <value>INCANDESCENT 1102 <notes> 1103 The camera device's auto-white balance routine is disabled; 1104 the camera device uses incandescent light as the assumed scene 1105 illumination for white balance. 
1106 1107 While the exact white balance transforms are up to the 1108 camera device, they will approximately match the CIE 1109 standard illuminant A. 1110 1111 The application's values for android.colorCorrection.transform 1112 and android.colorCorrection.gains are ignored. 1113 For devices that support the MANUAL_POST_PROCESSING capability, the 1114 values used by the camera device for the transform and gains 1115 will be available in the capture result for this request. 1116 </notes> 1117 </value> 1118 <value>FLUORESCENT 1119 <notes> 1120 The camera device's auto-white balance routine is disabled; 1121 the camera device uses fluorescent light as the assumed scene 1122 illumination for white balance. 1123 1124 While the exact white balance transforms are up to the 1125 camera device, they will approximately match the CIE 1126 standard illuminant F2. 1127 1128 The application's values for android.colorCorrection.transform 1129 and android.colorCorrection.gains are ignored. 1130 For devices that support the MANUAL_POST_PROCESSING capability, the 1131 values used by the camera device for the transform and gains 1132 will be available in the capture result for this request. 1133 </notes> 1134 </value> 1135 <value>WARM_FLUORESCENT 1136 <notes> 1137 The camera device's auto-white balance routine is disabled; 1138 the camera device uses warm fluorescent light as the assumed scene 1139 illumination for white balance. 1140 1141 While the exact white balance transforms are up to the 1142 camera device, they will approximately match the CIE 1143 standard illuminant F4. 1144 1145 The application's values for android.colorCorrection.transform 1146 and android.colorCorrection.gains are ignored. 1147 For devices that support the MANUAL_POST_PROCESSING capability, the 1148 values used by the camera device for the transform and gains 1149 will be available in the capture result for this request. 1150 </notes> 1151 </value> 1152 <value>DAYLIGHT 1153 <notes> 1154 The camera device's auto-white balance routine is disabled; 1155 the camera device uses daylight light as the assumed scene 1156 illumination for white balance. 1157 1158 While the exact white balance transforms are up to the 1159 camera device, they will approximately match the CIE 1160 standard illuminant D65. 1161 1162 The application's values for android.colorCorrection.transform 1163 and android.colorCorrection.gains are ignored. 1164 For devices that support the MANUAL_POST_PROCESSING capability, the 1165 values used by the camera device for the transform and gains 1166 will be available in the capture result for this request. 1167 </notes> 1168 </value> 1169 <value>CLOUDY_DAYLIGHT 1170 <notes> 1171 The camera device's auto-white balance routine is disabled; 1172 the camera device uses cloudy daylight light as the assumed scene 1173 illumination for white balance. 1174 1175 The application's values for android.colorCorrection.transform 1176 and android.colorCorrection.gains are ignored. 1177 For devices that support the MANUAL_POST_PROCESSING capability, the 1178 values used by the camera device for the transform and gains 1179 will be available in the capture result for this request. 1180 </notes> 1181 </value> 1182 <value>TWILIGHT 1183 <notes> 1184 The camera device's auto-white balance routine is disabled; 1185 the camera device uses twilight light as the assumed scene 1186 illumination for white balance. 1187 1188 The application's values for android.colorCorrection.transform 1189 and android.colorCorrection.gains are ignored. 
1190 For devices that support the MANUAL_POST_PROCESSING capability, the 1191 values used by the camera device for the transform and gains 1192 will be available in the capture result for this request. 1193 </notes> 1194 </value> 1195 <value>SHADE 1196 <notes> 1197 The camera device's auto-white balance routine is disabled; 1198 the camera device uses shade light as the assumed scene 1199 illumination for white balance. 1200 1201 The application's values for android.colorCorrection.transform 1202 and android.colorCorrection.gains are ignored. 1203 For devices that support the MANUAL_POST_PROCESSING capability, the 1204 values used by the camera device for the transform and gains 1205 will be available in the capture result for this request. 1206 </notes> 1207 </value> 1208 </enum> 1209 <description>Whether auto-white balance (AWB) is currently setting the color 1210 transform fields, and what its illumination target 1211 is.</description> 1212 <range>android.control.awbAvailableModes</range> 1213 <details> 1214 This control is only effective if android.control.mode is AUTO. 1215 1216 When set to the AUTO mode, the camera device's auto-white balance 1217 routine is enabled, overriding the application's selected 1218 android.colorCorrection.transform, android.colorCorrection.gains and 1219 android.colorCorrection.mode. Note that when android.control.aeMode 1220 is OFF, the behavior of AWB is device dependent. It is recommended to 1221 also set AWB mode to OFF or lock AWB by using android.control.awbLock before 1222 setting AE mode to OFF. 1223 1224 When set to the OFF mode, the camera device's auto-white balance 1225 routine is disabled. The application manually controls the white 1226 balance by android.colorCorrection.transform, android.colorCorrection.gains 1227 and android.colorCorrection.mode. 1228 1229 When set to any other mode, the camera device's auto-white 1230 balance routine is disabled. The camera device uses each 1231 particular illumination target for white balance 1232 adjustment. The application's values for 1233 android.colorCorrection.transform, 1234 android.colorCorrection.gains and 1235 android.colorCorrection.mode are ignored. 1236 </details> 1237 <tag id="BC" /> 1238 </entry> 1239 <entry name="awbRegions" type="int32" visibility="public" 1240 optional="true" container="array" typedef="meteringRectangle"> 1241 <array> 1242 <size>5</size> 1243 <size>area_count</size> 1244 </array> 1245 <description>List of metering areas to use for auto-white-balance illuminant 1246 estimation.</description> 1247 <units>Pixel coordinates within android.sensor.info.activeArraySize</units> 1248 <range>Coordinates must be between `[(0,0), (width, height))` of 1249 android.sensor.info.activeArraySize</range> 1250 <details> 1251 Not available if android.control.maxRegionsAwb is 0. 1252 Otherwise will always be present. 1253 1254 The maximum number of regions supported by the device is determined by the value 1255 of android.control.maxRegionsAwb. 1256 1257 The coordinate system is based on the active pixel array, 1258 with (0,0) being the top-left pixel in the active pixel array, and 1259 (android.sensor.info.activeArraySize.width - 1, 1260 android.sensor.info.activeArraySize.height - 1) being the 1261 bottom-right pixel in the active pixel array. 1262 1263 The weight must range from 0 to 1000, and represents a weight 1264 for every pixel in the area. This means that a large metering area 1265 with the same weight as a smaller area will have more effect in 1266 the metering result.
Metering areas can partially overlap and the 1267 camera device will add the weights in the overlap region. 1268 1269 The weights are relative to weights of other white balance metering regions, so if 1270 only one region is used, all non-zero weights will have the same effect. A region with 1271 0 weight is ignored. 1272 1273 If all regions have 0 weight, then no specific metering area needs to be used by the 1274 camera device. 1275 1276 If the metering region is outside the used android.scaler.cropRegion returned in 1277 capture result metadata, the camera device will ignore the sections outside the crop 1278 region and output only the intersection rectangle as the metering region in the result 1279 metadata. If the region is entirely outside the crop region, it will be ignored and 1280 not reported in the result metadata. 1281 </details> 1282 <hal_details> 1283 The HAL level representation of MeteringRectangle[] is a 1284 int[5 * area_count]. 1285 Every five elements represent a metering region of 1286 (xmin, ymin, xmax, ymax, weight). 1287 The rectangle is defined to be inclusive on xmin and ymin, but 1288 exclusive on xmax and ymax. 1289 </hal_details> 1290 <tag id="BC" /> 1291 </entry> 1292 <entry name="captureIntent" type="byte" visibility="public" enum="true" 1293 hwlevel="legacy"> 1294 <enum> 1295 <value>CUSTOM 1296 <notes>The goal of this request doesn't fall into the other 1297 categories. The camera device will default to preview-like 1298 behavior.</notes></value> 1299 <value>PREVIEW 1300 <notes>This request is for a preview-like use case. 1301 1302 The precapture trigger may be used to start off a metering 1303 w/flash sequence. 1304 </notes></value> 1305 <value>STILL_CAPTURE 1306 <notes>This request is for a still capture-type 1307 use case. 1308 1309 If the flash unit is under automatic control, it may fire as needed. 1310 </notes></value> 1311 <value>VIDEO_RECORD 1312 <notes>This request is for a video recording 1313 use case.</notes></value> 1314 <value>VIDEO_SNAPSHOT 1315 <notes>This request is for a video snapshot (still 1316 image while recording video) use case. 1317 1318 The camera device should take the highest-quality image 1319 possible (given the other settings) without disrupting the 1320 frame rate of video recording. </notes></value> 1321 <value>ZERO_SHUTTER_LAG 1322 <notes>This request is for a ZSL usecase; the 1323 application will stream full-resolution images and 1324 reprocess one or several later for a final 1325 capture. 1326 </notes></value> 1327 <value>MANUAL 1328 <notes>This request is for manual capture use case where 1329 the applications want to directly control the capture parameters. 1330 1331 For example, the application may wish to manually control 1332 android.sensor.exposureTime, android.sensor.sensitivity, etc. 1333 </notes></value> 1334 </enum> 1335 <description>Information to the camera device 3A (auto-exposure, 1336 auto-focus, auto-white balance) routines about the purpose 1337 of this capture, to help the camera device to decide optimal 3A 1338 strategy.</description> 1339 <details>This control (except for MANUAL) is only effective if 1340 `android.control.mode != OFF` and any 3A routine is active. 1341 1342 ZERO_SHUTTER_LAG will be supported if android.request.availableCapabilities 1343 contains PRIVATE_REPROCESSING or YUV_REPROCESSING. MANUAL will be supported if 1344 android.request.availableCapabilities contains MANUAL_SENSOR. Other intent values are 1345 always supported. 
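As a non-normative illustration (assuming an open `CameraDevice` named `camera`): the intent is normally pre-filled by the template passed to createCaptureRequest, but the application may also override it explicitly:

    // TEMPLATE_RECORD pre-populates android.control.captureIntent with
    // VIDEO_RECORD; createCaptureRequest may throw CameraAccessException.
    CaptureRequest.Builder builder =
            camera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
    // Override the intent, e.g. for a video snapshot taken while recording
    builder.set(CaptureRequest.CONTROL_CAPTURE_INTENT,
            CameraMetadata.CONTROL_CAPTURE_INTENT_VIDEO_SNAPSHOT);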
1346 </details> 1347 <tag id="BC" /> 1348 </entry> 1349 <entry name="effectMode" type="byte" visibility="public" enum="true" 1350 hwlevel="legacy"> 1351 <enum> 1352 <value>OFF 1353 <notes> 1354 No color effect will be applied. 1355 </notes> 1356 </value> 1357 <value optional="true">MONO 1358 <notes> 1359 A "monocolor" effect where the image is mapped into 1360 a single color. 1361 1362 This will typically be grayscale. 1363 </notes> 1364 </value> 1365 <value optional="true">NEGATIVE 1366 <notes> 1367 A "photo-negative" effect where the image's colors 1368 are inverted. 1369 </notes> 1370 </value> 1371 <value optional="true">SOLARIZE 1372 <notes> 1373 A "solarisation" effect (Sabattier effect) where the 1374 image is wholly or partially reversed in 1375 tone. 1376 </notes> 1377 </value> 1378 <value optional="true">SEPIA 1379 <notes> 1380 A "sepia" effect where the image is mapped into warm 1381 gray, red, and brown tones. 1382 </notes> 1383 </value> 1384 <value optional="true">POSTERIZE 1385 <notes> 1386 A "posterization" effect where the image uses 1387 discrete regions of tone rather than a continuous 1388 gradient of tones. 1389 </notes> 1390 </value> 1391 <value optional="true">WHITEBOARD 1392 <notes> 1393 A "whiteboard" effect where the image is typically displayed 1394 as regions of white, with black or grey details. 1395 </notes> 1396 </value> 1397 <value optional="true">BLACKBOARD 1398 <notes> 1399 A "blackboard" effect where the image is typically displayed 1400 as regions of black, with white or grey details. 1401 </notes> 1402 </value> 1403 <value optional="true">AQUA 1404 <notes> 1405 An "aqua" effect where a blue hue is added to the image. 1406 </notes> 1407 </value> 1408 </enum> 1409 <description>A special color effect to apply.</description> 1410 <range>android.control.availableEffects</range> 1411 <details> 1412 When this mode is set, a color effect will be applied 1413 to images produced by the camera device. The interpretation 1414 and implementation of these color effects is left to the 1415 implementor of the camera device, and should not be 1416 depended on to be consistent (or present) across all 1417 devices. 1418 </details> 1419 <tag id="BC" /> 1420 </entry> 1421 <entry name="mode" type="byte" visibility="public" enum="true" 1422 hwlevel="legacy"> 1423 <enum> 1424 <value>OFF 1425 <notes>Full application control of pipeline. 1426 1427 All control by the device's metering and focusing (3A) 1428 routines is disabled, and no other settings in 1429 android.control.* have any effect, except that 1430 android.control.captureIntent may be used by the camera 1431 device to select post-processing values for processing 1432 blocks that do not allow for manual control, or are not 1433 exposed by the camera API. 1434 1435 However, the camera device's 3A routines may continue to 1436 collect statistics and update their internal state so that 1437 when control is switched to AUTO mode, good control values 1438 can be immediately applied. 1439 </notes></value> 1440 <value>AUTO 1441 <notes>Use settings for each individual 3A routine. 1442 1443 Manual control of capture parameters is disabled. All 1444 controls in android.control.* besides sceneMode take 1445 effect.</notes></value> 1446 <value optional="true">USE_SCENE_MODE 1447 <notes>Use a specific scene mode. 1448 1449 Enabling this disables control.aeMode, control.awbMode and 1450 control.afMode controls; the camera device will ignore 1451 those settings while USE_SCENE_MODE is active (except for 1452 FACE_PRIORITY scene mode). 
Other control entries are still active. 1453 This setting can only be used if scene mode is supported (i.e. 1454 android.control.availableSceneModes 1455 contain some modes other than DISABLED).</notes></value> 1456 <value optional="true">OFF_KEEP_STATE 1457 <notes>Same as OFF mode, except that this capture will not be 1458 used by camera device background auto-exposure, auto-white balance and 1459 auto-focus algorithms (3A) to update their statistics. 1460 1461 Specifically, the 3A routines are locked to the last 1462 values set from a request with AUTO, OFF, or 1463 USE_SCENE_MODE, and any statistics or state updates 1464 collected from manual captures with OFF_KEEP_STATE will be 1465 discarded by the camera device. 1466 </notes></value> 1467 </enum> 1468 <description>Overall mode of 3A (auto-exposure, auto-white-balance, auto-focus) control 1469 routines.</description> 1470 <range>android.control.availableModes</range> 1471 <details> 1472 This is a top-level 3A control switch. When set to OFF, all 3A control 1473 by the camera device is disabled. The application must set the fields for 1474 capture parameters itself. 1475 1476 When set to AUTO, the individual algorithm controls in 1477 android.control.* are in effect, such as android.control.afMode. 1478 1479 When set to USE_SCENE_MODE, the individual controls in 1480 android.control.* are mostly disabled, and the camera device implements 1481 one of the scene mode settings (such as ACTION, SUNSET, or PARTY) 1482 as it wishes. The camera device scene mode 3A settings are provided by 1483 {@link android.hardware.camera2.CaptureResult capture results}. 1484 1485 When set to OFF_KEEP_STATE, it is similar to OFF mode, the only difference 1486 is that this frame will not be used by camera device background 3A statistics 1487 update, as if this frame is never captured. This mode can be used in the scenario 1488 where the application doesn't want a 3A manual control capture to affect 1489 the subsequent auto 3A capture results. 1490 </details> 1491 <tag id="BC" /> 1492 </entry> 1493 <entry name="sceneMode" type="byte" visibility="public" enum="true" 1494 hwlevel="legacy"> 1495 <enum> 1496 <value id="0">DISABLED 1497 <notes> 1498 Indicates that no scene modes are set for a given capture request. 1499 </notes> 1500 </value> 1501 <value>FACE_PRIORITY 1502 <notes>If face detection support exists, use face 1503 detection data for auto-focus, auto-white balance, and 1504 auto-exposure routines. 1505 1506 If face detection statistics are disabled 1507 (i.e. android.statistics.faceDetectMode is set to OFF), 1508 this should still operate correctly (but will not return 1509 face detection statistics to the framework). 1510 1511 Unlike the other scene modes, android.control.aeMode, 1512 android.control.awbMode, and android.control.afMode 1513 remain active when FACE_PRIORITY is set. 1514 </notes> 1515 </value> 1516 <value optional="true">ACTION 1517 <notes> 1518 Optimized for photos of quickly moving objects. 1519 1520 Similar to SPORTS. 1521 </notes> 1522 </value> 1523 <value optional="true">PORTRAIT 1524 <notes> 1525 Optimized for still photos of people. 1526 </notes> 1527 </value> 1528 <value optional="true">LANDSCAPE 1529 <notes> 1530 Optimized for photos of distant macroscopic objects. 1531 </notes> 1532 </value> 1533 <value optional="true">NIGHT 1534 <notes> 1535 Optimized for low-light settings. 1536 </notes> 1537 </value> 1538 <value optional="true">NIGHT_PORTRAIT 1539 <notes> 1540 Optimized for still photos of people in low-light 1541 settings. 
          </notes>
        </value>
        <value optional="true">THEATRE
          <notes>
          Optimized for dim, indoor settings where flash must remain off.
          </notes>
        </value>
        <value optional="true">BEACH
          <notes>
          Optimized for bright, outdoor beach settings.
          </notes>
        </value>
        <value optional="true">SNOW
          <notes>
          Optimized for bright, outdoor settings containing snow.
          </notes>
        </value>
        <value optional="true">SUNSET
          <notes>
          Optimized for scenes of the setting sun.
          </notes>
        </value>
        <value optional="true">STEADYPHOTO
          <notes>
          Optimized to avoid blurry photos due to small amounts of device motion (for example: due to hand shake).
          </notes>
        </value>
        <value optional="true">FIREWORKS
          <notes>
          Optimized for nighttime photos of fireworks.
          </notes>
        </value>
        <value optional="true">SPORTS
          <notes>
          Optimized for photos of quickly moving people.

          Similar to ACTION.
          </notes>
        </value>
        <value optional="true">PARTY
          <notes>
          Optimized for dim, indoor settings with multiple moving people.
          </notes>
        </value>
        <value optional="true">CANDLELIGHT
          <notes>
          Optimized for dim settings where the main light source is a flame.
          </notes>
        </value>
        <value optional="true">BARCODE
          <notes>
          Optimized for accurately capturing a photo of a barcode for use by camera applications that wish to read the barcode value.
          </notes>
        </value>
        <value deprecated="true" optional="true">HIGH_SPEED_VIDEO
          <notes>
          This is deprecated; please use {@link android.hardware.camera2.CameraDevice#createConstrainedHighSpeedCaptureSession} and {@link android.hardware.camera2.CameraConstrainedHighSpeedCaptureSession#createHighSpeedRequestList} for high speed video recording.

          Optimized for high speed video recording (frame rate >=60fps) use case.

          The supported high speed video sizes and fps ranges are specified in android.control.availableHighSpeedVideoConfigurations. To get desired output frame rates, the application is only allowed to select video size and fps range combinations listed in this static metadata. The fps range can be controlled via android.control.aeTargetFpsRange.

          In this mode, the camera device will override aeMode, awbMode, and afMode to ON, ON, and CONTINUOUS_VIDEO, respectively. All post-processing block mode controls will be overridden to be FAST. Therefore, no manual control of capture and post-processing parameters is possible. All other controls operate the same as when android.control.mode == AUTO.
          This means that all other android.control.* fields continue to work, such as

          * android.control.aeTargetFpsRange
          * android.control.aeExposureCompensation
          * android.control.aeLock
          * android.control.awbLock
          * android.control.effectMode
          * android.control.aeRegions
          * android.control.afRegions
          * android.control.awbRegions
          * android.control.afTrigger
          * android.control.aePrecaptureTrigger

          Outside of android.control.*, the following controls will work:

          * android.flash.mode (automatic flash for still capture will not work since aeMode is ON)
          * android.lens.opticalStabilizationMode (if it is supported)
          * android.scaler.cropRegion
          * android.statistics.faceDetectMode

          For the high speed recording use case, the actual maximum supported frame rate may be lower than what the camera can output, depending on the destination Surfaces for the image data. For example, if the destination surface is from a video encoder, the application needs to check if the video encoder is capable of supporting the high frame rate for a given video size, or it will end up with a lower recording frame rate. If the destination surface is from a preview window, the preview frame rate will be bounded by the screen refresh rate.

          The camera device will only support up to 2 output high speed streams (processed non-stalling format defined in android.request.maxNumOutputStreams) in this mode. This control will be effective only if all of the below conditions are true:

          * The application created no more than maxNumHighSpeedStreams processed non-stalling format output streams, where maxNumHighSpeedStreams is calculated as min(2, android.request.maxNumOutputStreams[Processed (but not-stalling)]).
          * The stream sizes are selected from the sizes reported by android.control.availableHighSpeedVideoConfigurations.
          * No processed non-stalling or raw streams are configured.

          When the above conditions are NOT satisfied, the controls of this mode and android.control.aeTargetFpsRange will be ignored by the camera device, the camera device will fall back to android.control.mode `==` AUTO, and the returned capture result metadata will give the fps range chosen by the camera device.

          Switching into or out of this mode may trigger some camera ISP/sensor reconfigurations, which may introduce extra latency. It is recommended that the application avoid unnecessary scene mode switches as much as possible.
          </notes>
        </value>
        <value optional="true">HDR
          <notes>
          Turn on a device-specific high dynamic range (HDR) mode.

          In this scene mode, the camera device captures images that keep a larger range of scene illumination levels visible in the final image. For example, when taking a picture of an object in front of a bright window, both the object and the scene through the window may be visible when using HDR mode, while in normal AUTO mode, one or the other may be poorly exposed. As a tradeoff, HDR mode generally takes much longer to capture a single image, has no user control, and may have other artifacts depending on the HDR method used.

          Therefore, HDR captures operate at a much slower rate than regular captures.
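          <!-- Illustrative sketch: requesting an HDR still capture, assuming HDR is listed in
               android.control.availableSceneModes for this device. The helper name is
               hypothetical. Java:

                 import android.hardware.camera2.CameraAccessException;
                 import android.hardware.camera2.CameraDevice;
                 import android.hardware.camera2.CameraMetadata;
                 import android.hardware.camera2.CaptureRequest;

                 static CaptureRequest.Builder buildHdrStillRequest(CameraDevice device)
                         throws CameraAccessException {
                     CaptureRequest.Builder builder =
                         device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
                     // Scene modes only take effect while CONTROL_MODE is USE_SCENE_MODE
                     builder.set(CaptureRequest.CONTROL_MODE,
                         CameraMetadata.CONTROL_MODE_USE_SCENE_MODE);
                     builder.set(CaptureRequest.CONTROL_SCENE_MODE,
                         CameraMetadata.CONTROL_SCENE_MODE_HDR);
                     return builder;
                 }
          -->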

          In this mode, on LIMITED or FULL devices, when a request is made with an android.control.captureIntent of STILL_CAPTURE, the camera device will capture an image using a high dynamic range capture technique. On LEGACY devices, captures that target a JPEG-format output will be captured with HDR, and the capture intent is not relevant.

          The HDR capture may involve the device capturing a burst of images internally and combining them into one, or it may involve the device using specialized high dynamic range capture hardware. In all cases, a single image is produced in response to a capture request submitted while in HDR mode.

          Since substantial post-processing is generally needed to produce an HDR image, only YUV, PRIVATE, and JPEG outputs are supported for LIMITED/FULL device HDR captures, and only JPEG outputs are supported for LEGACY HDR captures. Using a RAW output for HDR capture is not supported.

          Some devices may also support always-on HDR, which applies HDR processing at full frame rate. For these devices, intents other than STILL_CAPTURE will also produce an HDR output with no frame rate impact compared to normal operation, though the quality may be lower than for STILL_CAPTURE intents.

          If SCENE_MODE_HDR is used with unsupported output types or capture intents, the images captured will be as if the SCENE_MODE was not enabled at all.
          </notes>
        </value>
        <value optional="true" hidden="true">FACE_PRIORITY_LOW_LIGHT
          <notes>Same as FACE_PRIORITY scene mode, except that the camera device will choose higher sensitivity values (android.sensor.sensitivity) under low light conditions.

          The camera device may be tuned to expose the images in a reduced sensitivity range to produce the best quality images. For example, if the android.sensor.info.sensitivityRange gives a range of [100, 1600], the camera device auto-exposure routine tuning process may limit the actual exposure sensitivity range to [100, 1200] to ensure that the noise level isn't excessive in order to preserve the image quality. In this situation, the image under low light may be under-exposed when the sensor max exposure time (bounded by the android.control.aeTargetFpsRange when android.control.aeMode is one of the ON_* modes) and effective max sensitivity are reached. This scene mode allows the camera device auto-exposure routine to increase the sensitivity up to the max sensitivity specified by android.sensor.info.sensitivityRange when the scene is too dark and the max exposure time is reached. The captured images may be noisier compared with the images captured in normal FACE_PRIORITY mode; therefore, it is recommended that the application only use this scene mode when it is capable of reducing the noise level of the captured images.

          Unlike the other scene modes, android.control.aeMode, android.control.awbMode, and android.control.afMode remain active when FACE_PRIORITY_LOW_LIGHT is set.
          </notes>
        </value>
        <value optional="true" hidden="true" id="100">DEVICE_CUSTOM_START
          <notes>
          Scene mode values within the range of `[DEVICE_CUSTOM_START, DEVICE_CUSTOM_END]` are reserved for device specific customized scene modes.
          </notes>
        </value>
        <value optional="true" hidden="true" id="127">DEVICE_CUSTOM_END
          <notes>
          Scene mode values within the range of `[DEVICE_CUSTOM_START, DEVICE_CUSTOM_END]` are reserved for device specific customized scene modes.
          </notes>
        </value>
      </enum>
      <description>
      Control for which scene mode is currently active.
      </description>
      <range>android.control.availableSceneModes</range>
      <details>
      Scene modes are custom camera modes optimized for a certain set of conditions and capture settings.

      This is the mode that is active when `android.control.mode == USE_SCENE_MODE`. Aside from FACE_PRIORITY, these modes will disable android.control.aeMode, android.control.awbMode, and android.control.afMode while in use.

      The interpretation and implementation of these scene modes is left to the implementor of the camera device. Their behavior will not be consistent across all devices, and any given device may only implement a subset of these modes.
      </details>
      <hal_details>
      HAL implementations that include scene modes are expected to provide the per-scene settings to use for android.control.aeMode, android.control.awbMode, and android.control.afMode in android.control.sceneModeOverrides.

      For HIGH_SPEED_VIDEO mode, if it is included in android.control.availableSceneModes, the HAL must list the supported video sizes and fps ranges in android.control.availableHighSpeedVideoConfigurations. For a given size, e.g. 1280x720, if the HAL has two different sensor configurations for normal streaming mode and high speed streaming, when this scene mode is set/reset in a sequence of capture requests, the HAL may have to switch between different sensor modes. This mode is deprecated in HAL3.3. To support high speed video recording, please implement android.control.availableHighSpeedVideoConfigurations and the CONSTRAINED_HIGH_SPEED_VIDEO capability defined in android.request.availableCapabilities.
      </hal_details>
      <tag id="BC" />
    </entry>
    <entry name="videoStabilizationMode" type="byte" visibility="public" enum="true" hwlevel="legacy">
      <enum>
        <value>OFF
        <notes>
        Video stabilization is disabled.
        </notes></value>
        <value>ON
        <notes>
        Video stabilization is enabled.
        </notes></value>
      </enum>
      <description>Whether video stabilization is active.</description>
      <details>
      Video stabilization automatically warps images from the camera in order to stabilize motion between consecutive frames.

      If enabled, video stabilization can modify the android.scaler.cropRegion to keep the video stream stabilized.

      Switching between different video stabilization modes may take several frames to initialize; the camera device will report the current mode in capture result metadata. For example, when "ON" mode is requested, the video stabilization modes in the first several capture results may still be "OFF", and it will become "ON" when the initialization is done.

      In addition, not all recording sizes or frame rates may be supported for stabilization by a device that reports stabilization support.
It is guaranteed 1832 that an output targeting a MediaRecorder or MediaCodec will be stabilized if 1833 the recording resolution is less than or equal to 1920 x 1080 (width less than 1834 or equal to 1920, height less than or equal to 1080), and the recording 1835 frame rate is less than or equal to 30fps. At other sizes, the CaptureResult 1836 android.control.videoStabilizationMode field will return 1837 OFF if the recording output is not stabilized, or if there are no output 1838 Surface types that can be stabilized. 1839 1840 If a camera device supports both this mode and OIS 1841 (android.lens.opticalStabilizationMode), turning both modes on may 1842 produce undesirable interaction, so it is recommended not to enable 1843 both at the same time. 1844 </details> 1845 <tag id="BC" /> 1846 </entry> 1847 </controls> 1848 <static> 1849 <entry name="aeAvailableAntibandingModes" type="byte" visibility="public" 1850 type_notes="list of enums" container="array" typedef="enumList" 1851 hwlevel="legacy"> 1852 <array> 1853 <size>n</size> 1854 </array> 1855 <description> 1856 List of auto-exposure antibanding modes for android.control.aeAntibandingMode that are 1857 supported by this camera device. 1858 </description> 1859 <range>Any value listed in android.control.aeAntibandingMode</range> 1860 <details> 1861 Not all of the auto-exposure anti-banding modes may be 1862 supported by a given camera device. This field lists the 1863 valid anti-banding modes that the application may request 1864 for this camera device with the 1865 android.control.aeAntibandingMode control. 1866 </details> 1867 <tag id="BC" /> 1868 </entry> 1869 <entry name="aeAvailableModes" type="byte" visibility="public" 1870 type_notes="list of enums" container="array" typedef="enumList" 1871 hwlevel="legacy"> 1872 <array> 1873 <size>n</size> 1874 </array> 1875 <description> 1876 List of auto-exposure modes for android.control.aeMode that are supported by this camera 1877 device. 1878 </description> 1879 <range>Any value listed in android.control.aeMode</range> 1880 <details> 1881 Not all the auto-exposure modes may be supported by a 1882 given camera device, especially if no flash unit is 1883 available. This entry lists the valid modes for 1884 android.control.aeMode for this camera device. 1885 1886 All camera devices support ON, and all camera devices with flash 1887 units support ON_AUTO_FLASH and ON_ALWAYS_FLASH. 1888 1889 FULL mode camera devices always support OFF mode, 1890 which enables application control of camera exposure time, 1891 sensitivity, and frame duration. 1892 1893 LEGACY mode camera devices never support OFF mode. 1894 LIMITED mode devices support OFF if they support the MANUAL_SENSOR 1895 capability. 
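      <!-- Illustrative sketch: checking the advertised AE modes before attempting manual
           exposure (AE_MODE_OFF). The helper name is hypothetical. Java:

             import android.hardware.camera2.CameraCharacteristics;
             import android.hardware.camera2.CameraMetadata;

             static boolean supportsManualAe(CameraCharacteristics chars) {
                 int[] modes = chars.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_MODES);
                 if (modes == null) {
                     return false;
                 }
                 for (int mode : modes) {
                     if (mode == CameraMetadata.CONTROL_AE_MODE_OFF) {
                         return true; // OFF is listed, so manual exposure control is available
                     }
                 }
                 return false;
             }
      -->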
1896 </details> 1897 <tag id="BC" /> 1898 </entry> 1899 <entry name="aeAvailableTargetFpsRanges" type="int32" visibility="public" 1900 type_notes="list of pairs of frame rates" 1901 container="array" typedef="rangeInt" 1902 hwlevel="legacy"> 1903 <array> 1904 <size>2</size> 1905 <size>n</size> 1906 </array> 1907 <description>List of frame rate ranges for android.control.aeTargetFpsRange supported by 1908 this camera device.</description> 1909 <units>Frames per second (FPS)</units> 1910 <details> 1911 For devices at the LEGACY level or above: 1912 1913 * For constant-framerate recording, for each normal 1914 {@link android.media.CamcorderProfile CamcorderProfile}, that is, a 1915 {@link android.media.CamcorderProfile CamcorderProfile} that has 1916 {@link android.media.CamcorderProfile#quality quality} in 1917 the range [{@link android.media.CamcorderProfile#QUALITY_LOW QUALITY_LOW}, 1918 {@link android.media.CamcorderProfile#QUALITY_2160P QUALITY_2160P}], if the profile is 1919 supported by the device and has 1920 {@link android.media.CamcorderProfile#videoFrameRate videoFrameRate} `x`, this list will 1921 always include (`x`,`x`). 1922 1923 * Also, a camera device must either not support any 1924 {@link android.media.CamcorderProfile CamcorderProfile}, 1925 or support at least one 1926 normal {@link android.media.CamcorderProfile CamcorderProfile} that has 1927 {@link android.media.CamcorderProfile#videoFrameRate videoFrameRate} `x` >= 24. 1928 1929 For devices at the LIMITED level or above: 1930 1931 * For YUV_420_888 burst capture use case, this list will always include (`min`, `max`) 1932 and (`max`, `max`) where `min` <= 15 and `max` = the maximum output frame rate of the 1933 maximum YUV_420_888 output size. 1934 </details> 1935 <tag id="BC" /> 1936 </entry> 1937 <entry name="aeCompensationRange" type="int32" visibility="public" 1938 container="array" typedef="rangeInt" 1939 hwlevel="legacy"> 1940 <array> 1941 <size>2</size> 1942 </array> 1943 <description>Maximum and minimum exposure compensation values for 1944 android.control.aeExposureCompensation, in counts of android.control.aeCompensationStep, 1945 that are supported by this camera device.</description> 1946 <range> 1947 Range [0,0] indicates that exposure compensation is not supported. 1948 1949 For LIMITED and FULL devices, range must follow below requirements if exposure 1950 compensation is supported (`range != [0, 0]`): 1951 1952 `Min.exposure compensation * android.control.aeCompensationStep <= -2 EV` 1953 1954 `Max.exposure compensation * android.control.aeCompensationStep >= 2 EV` 1955 1956 LEGACY devices may support a smaller range than this. 1957 </range> 1958 <tag id="BC" /> 1959 </entry> 1960 <entry name="aeCompensationStep" type="rational" visibility="public" 1961 hwlevel="legacy"> 1962 <description>Smallest step by which the exposure compensation 1963 can be changed.</description> 1964 <units>Exposure Value (EV)</units> 1965 <details> 1966 This is the unit for android.control.aeExposureCompensation. For example, if this key has 1967 a value of `1/2`, then a setting of `-2` for android.control.aeExposureCompensation means 1968 that the target EV offset for the auto-exposure routine is -1 EV. 1969 1970 One unit of EV compensation changes the brightness of the captured image by a factor 1971 of two. +1 EV doubles the image brightness, while -1 EV halves the image brightness. 1972 </details> 1973 <hal_details> 1974 This must be less than or equal to 1/2. 
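      <!-- Illustrative sketch: converting a desired EV offset into an aeExposureCompensation
           setting using aeCompensationStep and aeCompensationRange. The helper name is
           hypothetical. Java:

             import android.hardware.camera2.CameraCharacteristics;
             import android.hardware.camera2.CaptureRequest;
             import android.util.Range;
             import android.util.Rational;

             static void applyEvOffset(CameraCharacteristics chars, CaptureRequest.Builder builder,
                     float desiredEv) {
                 Rational step = chars.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_STEP);
                 Range<Integer> range = chars.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE);
                 // With a step of 1/2 EV, a desired offset of -1 EV becomes a setting of -2
                 int setting = Math.round(desiredEv / step.floatValue());
                 builder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, range.clamp(setting));
             }
      -->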
1975 </hal_details> 1976 <tag id="BC" /> 1977 </entry> 1978 <entry name="afAvailableModes" type="byte" visibility="public" 1979 type_notes="List of enums" container="array" typedef="enumList" 1980 hwlevel="legacy"> 1981 <array> 1982 <size>n</size> 1983 </array> 1984 <description> 1985 List of auto-focus (AF) modes for android.control.afMode that are 1986 supported by this camera device. 1987 </description> 1988 <range>Any value listed in android.control.afMode</range> 1989 <details> 1990 Not all the auto-focus modes may be supported by a 1991 given camera device. This entry lists the valid modes for 1992 android.control.afMode for this camera device. 1993 1994 All LIMITED and FULL mode camera devices will support OFF mode, and all 1995 camera devices with adjustable focuser units 1996 (`android.lens.info.minimumFocusDistance > 0`) will support AUTO mode. 1997 1998 LEGACY devices will support OFF mode only if they support 1999 focusing to infinity (by also setting android.lens.focusDistance to 2000 `0.0f`). 2001 </details> 2002 <tag id="BC" /> 2003 </entry> 2004 <entry name="availableEffects" type="byte" visibility="public" 2005 type_notes="List of enums (android.control.effectMode)." container="array" 2006 typedef="enumList" hwlevel="legacy"> 2007 <array> 2008 <size>n</size> 2009 </array> 2010 <description> 2011 List of color effects for android.control.effectMode that are supported by this camera 2012 device. 2013 </description> 2014 <range>Any value listed in android.control.effectMode</range> 2015 <details> 2016 This list contains the color effect modes that can be applied to 2017 images produced by the camera device. 2018 Implementations are not expected to be consistent across all devices. 2019 If no color effect modes are available for a device, this will only list 2020 OFF. 2021 2022 A color effect will only be applied if 2023 android.control.mode != OFF. OFF is always included in this list. 2024 2025 This control has no effect on the operation of other control routines such 2026 as auto-exposure, white balance, or focus. 2027 </details> 2028 <tag id="BC" /> 2029 </entry> 2030 <entry name="availableSceneModes" type="byte" visibility="public" 2031 type_notes="List of enums (android.control.sceneMode)." 2032 container="array" typedef="enumList" hwlevel="legacy"> 2033 <array> 2034 <size>n</size> 2035 </array> 2036 <description> 2037 List of scene modes for android.control.sceneMode that are supported by this camera 2038 device. 2039 </description> 2040 <range>Any value listed in android.control.sceneMode</range> 2041 <details> 2042 This list contains scene modes that can be set for the camera device. 2043 Only scene modes that have been fully implemented for the 2044 camera device may be included here. Implementations are not expected 2045 to be consistent across all devices. 2046 2047 If no scene modes are supported by the camera device, this 2048 will be set to DISABLED. Otherwise DISABLED will not be listed. 2049 2050 FACE_PRIORITY is always listed if face detection is 2051 supported (i.e.`android.statistics.info.maxFaceCount > 2052 0`). 2053 </details> 2054 <tag id="BC" /> 2055 </entry> 2056 <entry name="availableVideoStabilizationModes" type="byte" 2057 visibility="public" type_notes="List of enums." container="array" 2058 typedef="enumList" hwlevel="legacy"> 2059 <array> 2060 <size>n</size> 2061 </array> 2062 <description> 2063 List of video stabilization modes for android.control.videoStabilizationMode 2064 that are supported by this camera device. 
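      <!-- Illustrative sketch: enabling video stabilization only when ON is listed in this key.
           The helper name is hypothetical. Java:

             import android.hardware.camera2.CameraCharacteristics;
             import android.hardware.camera2.CameraMetadata;
             import android.hardware.camera2.CaptureRequest;

             static void enableStabilizationIfAvailable(CameraCharacteristics chars,
                     CaptureRequest.Builder builder) {
                 int[] modes = chars.get(
                     CameraCharacteristics.CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES);
                 if (modes == null) {
                     return;
                 }
                 for (int mode : modes) {
                     if (mode == CameraMetadata.CONTROL_VIDEO_STABILIZATION_MODE_ON) {
                         builder.set(CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE,
                             CameraMetadata.CONTROL_VIDEO_STABILIZATION_MODE_ON);
                         return;
                     }
                 }
             }
      -->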
      </description>
      <range>Any value listed in android.control.videoStabilizationMode</range>
      <details>
      OFF will always be listed.
      </details>
      <tag id="BC" />
    </entry>
    <entry name="awbAvailableModes" type="byte" visibility="public" type_notes="List of enums" container="array" typedef="enumList" hwlevel="legacy">
      <array>
        <size>n</size>
      </array>
      <description>
      List of auto-white-balance modes for android.control.awbMode that are supported by this camera device.
      </description>
      <range>Any value listed in android.control.awbMode</range>
      <details>
      Not all the auto-white-balance modes may be supported by a given camera device. This entry lists the valid modes for android.control.awbMode for this camera device.

      All camera devices will support ON mode.

      Camera devices that support the MANUAL_POST_PROCESSING capability will always support OFF mode, which enables application control of white balance, by using android.colorCorrection.transform and android.colorCorrection.gains (android.colorCorrection.mode must be set to TRANSFORM_MATRIX). This includes all FULL mode camera devices.
      </details>
      <tag id="BC" />
    </entry>
    <entry name="maxRegions" type="int32" visibility="ndk_public" container="array" hwlevel="legacy">
      <array>
        <size>3</size>
      </array>
      <description>
      List of the maximum number of regions that can be used for metering in auto-exposure (AE), auto-white balance (AWB), and auto-focus (AF); this corresponds to the maximum number of elements in android.control.aeRegions, android.control.awbRegions, and android.control.afRegions.
      </description>
      <range>
      Value must be &gt;= 0 for each element. For full-capability devices this value must be &gt;= 1 for AE and AF. The order of the elements is: `(AE, AWB, AF)`.</range>
      <tag id="BC" />
    </entry>
    <entry name="maxRegionsAe" type="int32" visibility="java_public" synthetic="true" hwlevel="legacy">
      <description>
      The maximum number of metering regions that can be used by the auto-exposure (AE) routine.
      </description>
      <range>Value will be &gt;= 0. For FULL-capability devices, this value will be &gt;= 1.
      </range>
      <details>
      This corresponds to the maximum allowed number of elements in android.control.aeRegions.
      </details>
      <hal_details>This entry is private to the framework. Fill in maxRegions to have this entry be automatically populated.
      </hal_details>
    </entry>
    <entry name="maxRegionsAwb" type="int32" visibility="java_public" synthetic="true" hwlevel="legacy">
      <description>
      The maximum number of metering regions that can be used by the auto-white balance (AWB) routine.
      </description>
      <range>Value will be &gt;= 0.
      </range>
      <details>
      This corresponds to the maximum allowed number of elements in android.control.awbRegions.
      </details>
      <hal_details>This entry is private to the framework. Fill in maxRegions to have this entry be automatically populated.
      </hal_details>
    </entry>
    <entry name="maxRegionsAf" type="int32" visibility="java_public" synthetic="true" hwlevel="legacy">
      <description>
      The maximum number of metering regions that can be used by the auto-focus (AF) routine.
      </description>
      <range>Value will be &gt;= 0.
      For FULL-capability devices, this value will be &gt;= 1.
      </range>
      <details>
      This corresponds to the maximum allowed number of elements in android.control.afRegions.
      </details>
      <hal_details>This entry is private to the framework. Fill in maxRegions to have this entry be automatically populated.
      </hal_details>
    </entry>
    <entry name="sceneModeOverrides" type="byte" visibility="system" container="array" hwlevel="limited">
      <array>
        <size>3</size>
        <size>length(availableSceneModes)</size>
      </array>
      <description>
      Ordered list of auto-exposure, auto-white balance, and auto-focus settings to use with each available scene mode.
      </description>
      <range>
      For each available scene mode, the list must contain three entries containing the android.control.aeMode, android.control.awbMode, and android.control.afMode values used by the camera device. The entry order is `(aeMode, awbMode, afMode)` where aeMode has the lowest index position.
      </range>
      <details>
      When a scene mode is enabled, the camera device is expected to override android.control.aeMode, android.control.awbMode, and android.control.afMode with its preferred settings for that scene mode.

      The order of this list matches that of availableSceneModes, with 3 entries for each mode. The overrides listed for FACE_PRIORITY and FACE_PRIORITY_LOW_LIGHT (if supported) are ignored, since for those modes the application-set android.control.aeMode, android.control.awbMode, and android.control.afMode values are used instead, matching the behavior when android.control.mode is set to AUTO. It is recommended that the FACE_PRIORITY and FACE_PRIORITY_LOW_LIGHT (if supported) overrides be set to 0.

      For example, if availableSceneModes contains `(FACE_PRIORITY, ACTION, NIGHT)`, then the camera framework expects sceneModeOverrides to have 9 entries formatted like: `(0, 0, 0, ON_AUTO_FLASH, AUTO, CONTINUOUS_PICTURE, ON_AUTO_FLASH, INCANDESCENT, AUTO)`.
      </details>
      <hal_details>
      To maintain backward compatibility, this list will be made available in the static metadata of the camera service. The camera service will use these values to set android.control.aeMode, android.control.awbMode, and android.control.afMode when using a scene mode other than FACE_PRIORITY and FACE_PRIORITY_LOW_LIGHT (if supported).
      </hal_details>
      <tag id="BC" />
    </entry>
  </static>
  <dynamic>
    <entry name="aePrecaptureId" type="int32" visibility="system" deprecated="true">
      <description>The ID sent with the latest CAMERA2_TRIGGER_PRECAPTURE_METERING call</description>
      <details>Must be 0 if no CAMERA2_TRIGGER_PRECAPTURE_METERING trigger has been received yet by the HAL.
Always updated even if AE algorithm ignores the 2220 trigger</details> 2221 </entry> 2222 <clone entry="android.control.aeAntibandingMode" kind="controls"> 2223 </clone> 2224 <clone entry="android.control.aeExposureCompensation" kind="controls"> 2225 </clone> 2226 <clone entry="android.control.aeLock" kind="controls"> 2227 </clone> 2228 <clone entry="android.control.aeMode" kind="controls"> 2229 </clone> 2230 <clone entry="android.control.aeRegions" kind="controls"> 2231 </clone> 2232 <clone entry="android.control.aeTargetFpsRange" kind="controls"> 2233 </clone> 2234 <clone entry="android.control.aePrecaptureTrigger" kind="controls"> 2235 </clone> 2236 <entry name="aeState" type="byte" visibility="public" enum="true" 2237 hwlevel="limited"> 2238 <enum> 2239 <value>INACTIVE 2240 <notes>AE is off or recently reset. 2241 2242 When a camera device is opened, it starts in 2243 this state. This is a transient state, the camera device may skip reporting 2244 this state in capture result.</notes></value> 2245 <value>SEARCHING 2246 <notes>AE doesn't yet have a good set of control values 2247 for the current scene. 2248 2249 This is a transient state, the camera device may skip 2250 reporting this state in capture result.</notes></value> 2251 <value>CONVERGED 2252 <notes>AE has a good set of control values for the 2253 current scene.</notes></value> 2254 <value>LOCKED 2255 <notes>AE has been locked.</notes></value> 2256 <value>FLASH_REQUIRED 2257 <notes>AE has a good set of control values, but flash 2258 needs to be fired for good quality still 2259 capture.</notes></value> 2260 <value>PRECAPTURE 2261 <notes>AE has been asked to do a precapture sequence 2262 and is currently executing it. 2263 2264 Precapture can be triggered through setting 2265 android.control.aePrecaptureTrigger to START. Currently 2266 active and completed (if it causes camera device internal AE lock) precapture 2267 metering sequence can be canceled through setting 2268 android.control.aePrecaptureTrigger to CANCEL. 2269 2270 Once PRECAPTURE completes, AE will transition to CONVERGED 2271 or FLASH_REQUIRED as appropriate. This is a transient 2272 state, the camera device may skip reporting this state in 2273 capture result.</notes></value> 2274 </enum> 2275 <description>Current state of the auto-exposure (AE) algorithm.</description> 2276 <details>Switching between or enabling AE modes (android.control.aeMode) always 2277 resets the AE state to INACTIVE. Similarly, switching between android.control.mode, 2278 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 2279 the algorithm states to INACTIVE. 2280 2281 The camera device can do several state transitions between two results, if it is 2282 allowed by the state transition table. For example: INACTIVE may never actually be 2283 seen in a result. 2284 2285 The state in the result is the state for this image (in sync with this image): if 2286 AE state becomes CONVERGED, then the image data associated with this result should 2287 be good to use. 2288 2289 Below are state transition tables for different AE modes. 
2290 2291 State | Transition Cause | New State | Notes 2292 :------------:|:----------------:|:---------:|:-----------------------: 2293 INACTIVE | | INACTIVE | Camera device auto exposure algorithm is disabled 2294 2295 When android.control.aeMode is AE_MODE_ON_*: 2296 2297 State | Transition Cause | New State | Notes 2298 :-------------:|:--------------------------------------------:|:--------------:|:-----------------: 2299 INACTIVE | Camera device initiates AE scan | SEARCHING | Values changing 2300 INACTIVE | android.control.aeLock is ON | LOCKED | Values locked 2301 SEARCHING | Camera device finishes AE scan | CONVERGED | Good values, not changing 2302 SEARCHING | Camera device finishes AE scan | FLASH_REQUIRED | Converged but too dark w/o flash 2303 SEARCHING | android.control.aeLock is ON | LOCKED | Values locked 2304 CONVERGED | Camera device initiates AE scan | SEARCHING | Values changing 2305 CONVERGED | android.control.aeLock is ON | LOCKED | Values locked 2306 FLASH_REQUIRED | Camera device initiates AE scan | SEARCHING | Values changing 2307 FLASH_REQUIRED | android.control.aeLock is ON | LOCKED | Values locked 2308 LOCKED | android.control.aeLock is OFF | SEARCHING | Values not good after unlock 2309 LOCKED | android.control.aeLock is OFF | CONVERGED | Values good after unlock 2310 LOCKED | android.control.aeLock is OFF | FLASH_REQUIRED | Exposure good, but too dark 2311 PRECAPTURE | Sequence done. android.control.aeLock is OFF | CONVERGED | Ready for high-quality capture 2312 PRECAPTURE | Sequence done. android.control.aeLock is ON | LOCKED | Ready for high-quality capture 2313 LOCKED | aeLock is ON and aePrecaptureTrigger is START | LOCKED | Precapture trigger is ignored when AE is already locked 2314 LOCKED | aeLock is ON and aePrecaptureTrigger is CANCEL| LOCKED | Precapture trigger is ignored when AE is already locked 2315 Any state (excluding LOCKED) | android.control.aePrecaptureTrigger is START | PRECAPTURE | Start AE precapture metering sequence 2316 Any state (excluding LOCKED) | android.control.aePrecaptureTrigger is CANCEL| INACTIVE | Currently active precapture metering sequence is canceled 2317 2318 For the above table, the camera device may skip reporting any state changes that happen 2319 without application intervention (i.e. mode switch, trigger, locking). Any state that 2320 can be skipped in that manner is called a transient state. 2321 2322 For example, for above AE modes (AE_MODE_ON_*), in addition to the state transitions 2323 listed in above table, it is also legal for the camera device to skip one or more 2324 transient states between two results. See below table for examples: 2325 2326 State | Transition Cause | New State | Notes 2327 :-------------:|:-----------------------------------------------------------:|:--------------:|:-----------------: 2328 INACTIVE | Camera device finished AE scan | CONVERGED | Values are already good, transient states are skipped by camera device. 2329 Any state (excluding LOCKED) | android.control.aePrecaptureTrigger is START, sequence done | FLASH_REQUIRED | Converged but too dark w/o flash after a precapture sequence, transient states are skipped by camera device. 2330 Any state (excluding LOCKED) | android.control.aePrecaptureTrigger is START, sequence done | CONVERGED | Converged after a precapture sequence, transient states are skipped by camera device. 
      Any state (excluding LOCKED) | android.control.aePrecaptureTrigger is CANCEL, converged | FLASH_REQUIRED | Converged but too dark w/o flash after a precapture sequence is canceled, transient states are skipped by camera device.
      Any state (excluding LOCKED) | android.control.aePrecaptureTrigger is CANCEL, converged | CONVERGED      | Converged after a precapture sequence is canceled, transient states are skipped by camera device.
      CONVERGED      | Camera device finished AE scan | FLASH_REQUIRED | Converged but too dark w/o flash after a new scan, transient states are skipped by camera device.
      FLASH_REQUIRED | Camera device finished AE scan | CONVERGED      | Converged after a new scan, transient states are skipped by camera device.
      </details>
    </entry>
    <clone entry="android.control.afMode" kind="controls">
    </clone>
    <clone entry="android.control.afRegions" kind="controls">
    </clone>
    <clone entry="android.control.afTrigger" kind="controls">
    </clone>
    <entry name="afState" type="byte" visibility="public" enum="true" hwlevel="legacy">
      <enum>
        <value>INACTIVE
        <notes>AF is off or has not yet tried to scan/been asked to scan.

        When a camera device is opened, it starts in this state. This is a transient state, the camera device may skip reporting this state in capture result.</notes></value>
        <value>PASSIVE_SCAN
        <notes>AF is currently performing an AF scan initiated by the camera device in a continuous autofocus mode.

        Only used by CONTINUOUS_* AF modes. This is a transient state, the camera device may skip reporting this state in capture result.</notes></value>
        <value>PASSIVE_FOCUSED
        <notes>AF currently believes it is in focus, but may restart scanning at any time.

        Only used by CONTINUOUS_* AF modes. This is a transient state, the camera device may skip reporting this state in capture result.</notes></value>
        <value>ACTIVE_SCAN
        <notes>AF is performing an AF scan because it was triggered by AF trigger.

        Only used by AUTO or MACRO AF modes. This is a transient state, the camera device may skip reporting this state in capture result.</notes></value>
        <value>FOCUSED_LOCKED
        <notes>AF believes it is focused correctly and has locked focus.

        This state is reached only after an explicit START AF trigger has been sent (android.control.afTrigger), when good focus has been obtained.

        The lens will remain stationary until the AF mode (android.control.afMode) is changed or a new AF trigger is sent to the camera device (android.control.afTrigger).
        </notes></value>
        <value>NOT_FOCUSED_LOCKED
        <notes>AF has failed to focus successfully and has locked focus.

        This state is reached only after an explicit START AF trigger has been sent (android.control.afTrigger), when good focus cannot be obtained.

        The lens will remain stationary until the AF mode (android.control.afMode) is changed or a new AF trigger is sent to the camera device (android.control.afTrigger).
        </notes></value>
        <value>PASSIVE_UNFOCUSED
        <notes>AF finished a passive scan without finding focus, and may restart scanning at any time.

        Only used by CONTINUOUS_* AF modes. This is a transient state, the camera device may skip reporting this state in capture result.

        LEGACY camera devices do not support this state.
When a passive 2403 scan has finished, it will always go to PASSIVE_FOCUSED. 2404 </notes></value> 2405 </enum> 2406 <description>Current state of auto-focus (AF) algorithm.</description> 2407 <details> 2408 Switching between or enabling AF modes (android.control.afMode) always 2409 resets the AF state to INACTIVE. Similarly, switching between android.control.mode, 2410 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 2411 the algorithm states to INACTIVE. 2412 2413 The camera device can do several state transitions between two results, if it is 2414 allowed by the state transition table. For example: INACTIVE may never actually be 2415 seen in a result. 2416 2417 The state in the result is the state for this image (in sync with this image): if 2418 AF state becomes FOCUSED, then the image data associated with this result should 2419 be sharp. 2420 2421 Below are state transition tables for different AF modes. 2422 2423 When android.control.afMode is AF_MODE_OFF or AF_MODE_EDOF: 2424 2425 State | Transition Cause | New State | Notes 2426 :------------:|:----------------:|:---------:|:-----------: 2427 INACTIVE | | INACTIVE | Never changes 2428 2429 When android.control.afMode is AF_MODE_AUTO or AF_MODE_MACRO: 2430 2431 State | Transition Cause | New State | Notes 2432 :-----------------:|:----------------:|:------------------:|:--------------: 2433 INACTIVE | AF_TRIGGER | ACTIVE_SCAN | Start AF sweep, Lens now moving 2434 ACTIVE_SCAN | AF sweep done | FOCUSED_LOCKED | Focused, Lens now locked 2435 ACTIVE_SCAN | AF sweep done | NOT_FOCUSED_LOCKED | Not focused, Lens now locked 2436 ACTIVE_SCAN | AF_CANCEL | INACTIVE | Cancel/reset AF, Lens now locked 2437 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF 2438 FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving 2439 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF 2440 NOT_FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving 2441 Any state | Mode change | INACTIVE | 2442 2443 For the above table, the camera device may skip reporting any state changes that happen 2444 without application intervention (i.e. mode switch, trigger, locking). Any state that 2445 can be skipped in that manner is called a transient state. 2446 2447 For example, for these AF modes (AF_MODE_AUTO and AF_MODE_MACRO), in addition to the 2448 state transitions listed in above table, it is also legal for the camera device to skip 2449 one or more transient states between two results. See below table for examples: 2450 2451 State | Transition Cause | New State | Notes 2452 :-----------------:|:----------------:|:------------------:|:--------------: 2453 INACTIVE | AF_TRIGGER | FOCUSED_LOCKED | Focus is already good or good after a scan, lens is now locked. 2454 INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | Focus failed after a scan, lens is now locked. 2455 FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | Focus is already good or good after a scan, lens is now locked. 2456 NOT_FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | Focus is good after a scan, lens is not locked. 
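      <!-- Illustrative sketch: driving an active AF sweep in AF_MODE_AUTO and watching
           CONTROL_AF_STATE in the capture results to detect when the sweep has locked.
           The helper name is hypothetical. Java:

             import android.hardware.camera2.CameraCaptureSession;
             import android.hardware.camera2.CameraMetadata;
             import android.hardware.camera2.CaptureRequest;
             import android.hardware.camera2.CaptureResult;
             import android.hardware.camera2.TotalCaptureResult;

             // With CONTROL_AF_MODE = AF_MODE_AUTO in the repeating request, a sweep is started
             // by submitting one capture whose CONTROL_AF_TRIGGER is CONTROL_AF_TRIGGER_START,
             // using a callback such as the one returned here to observe the resulting AF state:
             static CameraCaptureSession.CaptureCallback newAfStateWatcher() {
                 return new CameraCaptureSession.CaptureCallback() {
                     @Override
                     public void onCaptureCompleted(CameraCaptureSession session,
                             CaptureRequest request, TotalCaptureResult result) {
                         Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                         if (afState != null
                                 && (afState == CameraMetadata.CONTROL_AF_STATE_FOCUSED_LOCKED
                                     || afState == CameraMetadata.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED)) {
                             // Sweep ended; the lens stays locked until a new trigger or mode change
                         }
                     }
                 };
             }
      -->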

      When android.control.afMode is AF_MODE_CONTINUOUS_VIDEO:

      State              | Transition Cause                     | New State          | Notes
      :-----------------:|:------------------------------------:|:------------------:|:--------------:
      INACTIVE           | Camera device initiates new scan     | PASSIVE_SCAN       | Start AF scan, Lens now moving
      INACTIVE           | AF_TRIGGER                           | NOT_FOCUSED_LOCKED | AF state query, Lens now locked
      PASSIVE_SCAN       | Camera device completes current scan | PASSIVE_FOCUSED    | End AF scan, Lens now locked
      PASSIVE_SCAN       | Camera device fails current scan     | PASSIVE_UNFOCUSED  | End AF scan, Lens now locked
      PASSIVE_SCAN       | AF_TRIGGER                           | FOCUSED_LOCKED     | Immediate transition, if focus is good. Lens now locked
      PASSIVE_SCAN       | AF_TRIGGER                           | NOT_FOCUSED_LOCKED | Immediate transition, if focus is bad. Lens now locked
      PASSIVE_SCAN       | AF_CANCEL                            | INACTIVE           | Reset lens position, Lens now locked
      PASSIVE_FOCUSED    | Camera device initiates new scan     | PASSIVE_SCAN       | Start AF scan, Lens now moving
      PASSIVE_UNFOCUSED  | Camera device initiates new scan     | PASSIVE_SCAN       | Start AF scan, Lens now moving
      PASSIVE_FOCUSED    | AF_TRIGGER                           | FOCUSED_LOCKED     | Immediate transition, lens now locked
      PASSIVE_UNFOCUSED  | AF_TRIGGER                           | NOT_FOCUSED_LOCKED | Immediate transition, lens now locked
      FOCUSED_LOCKED     | AF_TRIGGER                           | FOCUSED_LOCKED     | No effect
      FOCUSED_LOCKED     | AF_CANCEL                            | INACTIVE           | Restart AF scan
      NOT_FOCUSED_LOCKED | AF_TRIGGER                           | NOT_FOCUSED_LOCKED | No effect
      NOT_FOCUSED_LOCKED | AF_CANCEL                            | INACTIVE           | Restart AF scan

      When android.control.afMode is AF_MODE_CONTINUOUS_PICTURE:

      State              | Transition Cause                     | New State          | Notes
      :-----------------:|:------------------------------------:|:------------------:|:--------------:
      INACTIVE           | Camera device initiates new scan     | PASSIVE_SCAN       | Start AF scan, Lens now moving
      INACTIVE           | AF_TRIGGER                           | NOT_FOCUSED_LOCKED | AF state query, Lens now locked
      PASSIVE_SCAN       | Camera device completes current scan | PASSIVE_FOCUSED    | End AF scan, Lens now locked
      PASSIVE_SCAN       | Camera device fails current scan     | PASSIVE_UNFOCUSED  | End AF scan, Lens now locked
      PASSIVE_SCAN       | AF_TRIGGER                           | FOCUSED_LOCKED     | Eventual transition once the focus is good. Lens now locked
      PASSIVE_SCAN       | AF_TRIGGER                           | NOT_FOCUSED_LOCKED | Eventual transition if cannot find focus. Lens now locked
      PASSIVE_SCAN       | AF_CANCEL                            | INACTIVE           | Reset lens position, Lens now locked
      PASSIVE_FOCUSED    | Camera device initiates new scan     | PASSIVE_SCAN       | Start AF scan, Lens now moving
      PASSIVE_UNFOCUSED  | Camera device initiates new scan     | PASSIVE_SCAN       | Start AF scan, Lens now moving
      PASSIVE_FOCUSED    | AF_TRIGGER                           | FOCUSED_LOCKED     | Immediate transition. Lens now locked
      PASSIVE_UNFOCUSED  | AF_TRIGGER                           | NOT_FOCUSED_LOCKED | Immediate transition. Lens now locked
      FOCUSED_LOCKED     | AF_TRIGGER                           | FOCUSED_LOCKED     | No effect
      FOCUSED_LOCKED     | AF_CANCEL                            | INACTIVE           | Restart AF scan
      NOT_FOCUSED_LOCKED | AF_TRIGGER                           | NOT_FOCUSED_LOCKED | No effect
      NOT_FOCUSED_LOCKED | AF_CANCEL                            | INACTIVE           | Restart AF scan

      When switching between AF_MODE_CONTINUOUS_* (CAF modes) and AF_MODE_AUTO/AF_MODE_MACRO (AUTO modes), the initial INACTIVE or PASSIVE_SCAN states may be skipped by the camera device. When a trigger is included in a mode switch request, the trigger will be evaluated in the context of the new mode in the request.
2503 See below table for examples: 2504 2505 State | Transition Cause | New State | Notes 2506 :-----------:|:--------------------------------------:|:----------------------------------------:|:--------------: 2507 any state | CAF-->AUTO mode switch | INACTIVE | Mode switch without trigger, initial state must be INACTIVE 2508 any state | CAF-->AUTO mode switch with AF_TRIGGER | trigger-reachable states from INACTIVE | Mode switch with trigger, INACTIVE is skipped 2509 any state | AUTO-->CAF mode switch | passively reachable states from INACTIVE | Mode switch without trigger, passive transient state is skipped 2510 </details> 2511 </entry> 2512 <entry name="afTriggerId" type="int32" visibility="system" deprecated="true"> 2513 <description>The ID sent with the latest 2514 CAMERA2_TRIGGER_AUTOFOCUS call</description> 2515 <details>Must be 0 if no CAMERA2_TRIGGER_AUTOFOCUS trigger 2516 received yet by HAL. Always updated even if AF algorithm 2517 ignores the trigger</details> 2518 </entry> 2519 <clone entry="android.control.awbLock" kind="controls"> 2520 </clone> 2521 <clone entry="android.control.awbMode" kind="controls"> 2522 </clone> 2523 <clone entry="android.control.awbRegions" kind="controls"> 2524 </clone> 2525 <clone entry="android.control.captureIntent" kind="controls"> 2526 </clone> 2527 <entry name="awbState" type="byte" visibility="public" enum="true" 2528 hwlevel="limited"> 2529 <enum> 2530 <value>INACTIVE 2531 <notes>AWB is not in auto mode, or has not yet started metering. 2532 2533 When a camera device is opened, it starts in this 2534 state. This is a transient state, the camera device may 2535 skip reporting this state in capture 2536 result.</notes></value> 2537 <value>SEARCHING 2538 <notes>AWB doesn't yet have a good set of control 2539 values for the current scene. 2540 2541 This is a transient state, the camera device 2542 may skip reporting this state in capture result.</notes></value> 2543 <value>CONVERGED 2544 <notes>AWB has a good set of control values for the 2545 current scene.</notes></value> 2546 <value>LOCKED 2547 <notes>AWB has been locked. 2548 </notes></value> 2549 </enum> 2550 <description>Current state of auto-white balance (AWB) algorithm.</description> 2551 <details>Switching between or enabling AWB modes (android.control.awbMode) always 2552 resets the AWB state to INACTIVE. Similarly, switching between android.control.mode, 2553 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 2554 the algorithm states to INACTIVE. 2555 2556 The camera device can do several state transitions between two results, if it is 2557 allowed by the state transition table. So INACTIVE may never actually be seen in 2558 a result. 2559 2560 The state in the result is the state for this image (in sync with this image): if 2561 AWB state becomes CONVERGED, then the image data associated with this result should 2562 be good to use. 2563 2564 Below are state transition tables for different AWB modes. 
2565 2566 When `android.control.awbMode != AWB_MODE_AUTO`: 2567 2568 State | Transition Cause | New State | Notes 2569 :------------:|:----------------:|:---------:|:-----------------------: 2570 INACTIVE | |INACTIVE |Camera device auto white balance algorithm is disabled 2571 2572 When android.control.awbMode is AWB_MODE_AUTO: 2573 2574 State | Transition Cause | New State | Notes 2575 :-------------:|:--------------------------------:|:-------------:|:-----------------: 2576 INACTIVE | Camera device initiates AWB scan | SEARCHING | Values changing 2577 INACTIVE | android.control.awbLock is ON | LOCKED | Values locked 2578 SEARCHING | Camera device finishes AWB scan | CONVERGED | Good values, not changing 2579 SEARCHING | android.control.awbLock is ON | LOCKED | Values locked 2580 CONVERGED | Camera device initiates AWB scan | SEARCHING | Values changing 2581 CONVERGED | android.control.awbLock is ON | LOCKED | Values locked 2582 LOCKED | android.control.awbLock is OFF | SEARCHING | Values not good after unlock 2583 2584 For the above table, the camera device may skip reporting any state changes that happen 2585 without application intervention (i.e. mode switch, trigger, locking). Any state that 2586 can be skipped in that manner is called a transient state. 2587 2588 For example, for this AWB mode (AWB_MODE_AUTO), in addition to the state transitions 2589 listed in above table, it is also legal for the camera device to skip one or more 2590 transient states between two results. See below table for examples: 2591 2592 State | Transition Cause | New State | Notes 2593 :-------------:|:--------------------------------:|:-------------:|:-----------------: 2594 INACTIVE | Camera device finished AWB scan | CONVERGED | Values are already good, transient states are skipped by camera device. 2595 LOCKED | android.control.awbLock is OFF | CONVERGED | Values good after unlock, transient states are skipped by camera device. 2596 </details> 2597 </entry> 2598 <clone entry="android.control.effectMode" kind="controls"> 2599 </clone> 2600 <clone entry="android.control.mode" kind="controls"> 2601 </clone> 2602 <clone entry="android.control.sceneMode" kind="controls"> 2603 </clone> 2604 <clone entry="android.control.videoStabilizationMode" kind="controls"> 2605 </clone> 2606 </dynamic> 2607 <static> 2608 <entry name="availableHighSpeedVideoConfigurations" type="int32" visibility="hidden" 2609 container="array" typedef="highSpeedVideoConfiguration" hwlevel="limited"> 2610 <array> 2611 <size>5</size> 2612 <size>n</size> 2613 </array> 2614 <description> 2615 List of available high speed video size, fps range and max batch size configurations 2616 supported by the camera device, in the format of (width, height, fps_min, fps_max, batch_size_max). 2617 </description> 2618 <range> 2619 For each configuration, the fps_max &gt;= 120fps. 2620 </range> 2621 <details> 2622 When CONSTRAINED_HIGH_SPEED_VIDEO is supported in android.request.availableCapabilities, 2623 this metadata will list the supported high speed video size, fps range and max batch size 2624 configurations. All the sizes listed in this configuration will be a subset of the sizes 2625 reported by {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputSizes} 2626 for processed non-stalling formats. 2627 2628 For the high speed video use case, the application must 2629 select the video size and fps range from this metadata to configure the recording and 2630 preview streams and setup the recording requests. 
For example, if the application intends to do high speed recording, it can select the maximum size
reported by this metadata to configure output streams. Once the size is selected, the application can
filter this metadata by the selected size and get the supported fps ranges, and use these fps ranges to
set up the recording requests. Note that for the use case of multiple output streams, the application
must select one unique size from this metadata to use (e.g., preview and recording streams
must have the same size). Otherwise, the high speed capture session creation will fail.

The min and max fps will be multiples of 30fps.

High speed video streaming puts significant performance pressure on the camera hardware. To achieve
efficient high speed streaming, the camera device may have to aggregate multiple frames together and
process them as a batch, where the request controls are the same for all the frames in the batch.
Max batch size indicates the maximum possible number of frames the camera device will group together
for this high speed stream configuration. This max batch size will be used to generate a high speed
recording request list by
{@link android.hardware.camera2.CameraConstrainedHighSpeedCaptureSession#createHighSpeedRequestList}.
The max batch size for each configuration will satisfy the following conditions:

* Each max batch size will be a divisor of its corresponding fps_max / 30. For example,
if max_fps is 300, max batch size will only be 1, 2, 5, or 10.
* The camera device may choose a smaller internal batch size for each configuration, but
the actual batch size will be a divisor of the max batch size. For example, if the max batch
size is 8, the actual batch size used by the camera device will only be 1, 2, 4, or 8.
* The max batch size in each configuration entry must be no larger than 32.

The camera device doesn't have to support batch mode to achieve high speed video recording;
in that case, batch_size_max will be reported as 1 in each configuration entry.

The fps ranges in this configuration list can only be used to create requests
that are submitted to a high speed camera capture session created by
{@link android.hardware.camera2.CameraDevice#createConstrainedHighSpeedCaptureSession}.
The fps ranges reported in this metadata must not be used to set up capture requests for a
normal capture session, or they will cause a request error.
</details>
<hal_details>
All the sizes listed in this configuration will be a subset of the sizes reported by
android.scaler.availableStreamConfigurations for processed non-stalling output formats.
Note that for all high speed video configurations, the HAL must be able to support a minimum
of two streams, though the application might choose to configure just one stream.

The HAL may support multiple sensor modes for high speed outputs, for example, a 120fps
sensor mode for 120fps recording and a 240fps sensor mode for 240fps recording. The application
usually starts preview first, then starts recording. To minimize the stutter caused by a sensor
mode switch when recording starts, the application may want to ensure
the same sensor mode is used for preview and recording. Therefore, the HAL must advertise
the variable fps range [30, fps_max] for each fixed fps range in this configuration list.
For example, if the HAL advertises [120, 120] and [240, 240], the HAL must also advertise
[30, 120] and [30, 240] for each configuration. In doing so, if the application intends to
do 120fps recording, it can select [30, 120] to start preview, and [120, 120] to start
recording. For these variable fps ranges, it's up to the HAL to decide the actual fps
values that are suitable for smooth preview streaming. If the HAL sees different max_fps
values that fall into different sensor modes in a sequence of requests, the HAL must
switch the sensor mode as quickly as possible to minimize the stutter caused by the mode switch.
</hal_details>
<tag id="V1" />
</entry>
<entry name="aeLockAvailable" type="byte" visibility="public" enum="true"
typedef="boolean" hwlevel="legacy">
<enum>
<value>FALSE</value>
<value>TRUE</value>
</enum>
<description>Whether the camera device supports android.control.aeLock</description>
<details>
Devices with MANUAL_SENSOR capability or BURST_CAPTURE capability will always
list `true`. This includes FULL devices.
</details>
<tag id="BC"/>
</entry>
<entry name="awbLockAvailable" type="byte" visibility="public" enum="true"
typedef="boolean" hwlevel="legacy">
<enum>
<value>FALSE</value>
<value>TRUE</value>
</enum>
<description>Whether the camera device supports android.control.awbLock</description>
<details>
Devices with MANUAL_POST_PROCESSING capability or BURST_CAPTURE capability will
always list `true`. This includes FULL devices.
</details>
<tag id="BC"/>
</entry>
<entry name="availableModes" type="byte" visibility="public"
type_notes="List of enums (android.control.mode)." container="array"
typedef="enumList" hwlevel="legacy">
<array>
<size>n</size>
</array>
<description>
List of control modes for android.control.mode that are supported by this camera
device.
</description>
<range>Any value listed in android.control.mode</range>
<details>
This list contains control modes that can be set for the camera device.
LEGACY mode devices will always support AUTO mode. LIMITED and FULL
devices will always support the OFF and AUTO modes.
</details>
</entry>
<entry name="postRawSensitivityBoostRange" type="int32" visibility="public"
type_notes="Range of supported post RAW sensitivity boosts"
container="array" typedef="rangeInt">
<array>
<size>2</size>
</array>
<description>Range of boosts for android.control.postRawSensitivityBoost supported
by this camera device.
</description>
<units>ISO arithmetic units, the same as android.sensor.sensitivity</units>
<details>
Devices that support post RAW sensitivity boost will advertise the
android.control.postRawSensitivityBoost key for controlling
post RAW sensitivity boost.

This key will be `null` for devices that do not support any RAW format
outputs. For devices that do support RAW format outputs, this key will always
be present, and if a device does not support post RAW sensitivity boost, it will
list `(100, 100)` in this key.
</details>
<hal_details>
This key is added in HAL3.4. For HAL3.3 or earlier devices, the camera framework will
generate this key as `(100, 100)` if the device supports any of the RAW output formats.
2754 All HAL3.4 and above devices should list this key if device supports any of RAW 2755 output formats. 2756 </hal_details> 2757 </entry> 2758 </static> 2759 <controls> 2760 <entry name="postRawSensitivityBoost" type="int32" visibility="public"> 2761 <description>The amount of additional sensitivity boost applied to output images 2762 after RAW sensor data is captured. 2763 </description> 2764 <units>ISO arithmetic units, the same as android.sensor.sensitivity</units> 2765 <range>android.control.postRawSensitivityBoostRange</range> 2766 <details> 2767 Some camera devices support additional digital sensitivity boosting in the 2768 camera processing pipeline after sensor RAW image is captured. 2769 Such a boost will be applied to YUV/JPEG format output images but will not 2770 have effect on RAW output formats like RAW_SENSOR, RAW10, RAW12 or RAW_OPAQUE. 2771 2772 This key will be `null` for devices that do not support any RAW format 2773 outputs. For devices that do support RAW format outputs, this key will always 2774 present, and if a device does not support post RAW sensitivity boost, it will 2775 list `100` in this key. 2776 2777 If the camera device cannot apply the exact boost requested, it will reduce the 2778 boost to the nearest supported value. 2779 The final boost value used will be available in the output capture result. 2780 2781 For devices that support post RAW sensitivity boost, the YUV/JPEG output images 2782 of such device will have the total sensitivity of 2783 `android.sensor.sensitivity * android.control.postRawSensitivityBoost / 100` 2784 The sensitivity of RAW format images will always be `android.sensor.sensitivity` 2785 2786 This control is only effective if android.control.aeMode or android.control.mode is set to 2787 OFF; otherwise the auto-exposure algorithm will override this value. 2788 </details> 2789 </entry> 2790 </controls> 2791 <dynamic> 2792 <clone entry="android.control.postRawSensitivityBoost" kind="controls"> 2793 </clone> 2794 </dynamic> 2795 <controls> 2796 <entry name="enableZsl" type="byte" visibility="public" enum="true" typedef="boolean"> 2797 <enum> 2798 <value>FALSE 2799 <notes>Requests with android.control.captureIntent == STILL_CAPTURE must be captured 2800 after previous requests.</notes></value> 2801 <value>TRUE 2802 <notes>Requests with android.control.captureIntent == STILL_CAPTURE may or may not be 2803 captured before previous requests.</notes></value> 2804 </enum> 2805 <description>Allow camera device to enable zero-shutter-lag mode for requests with 2806 android.control.captureIntent == STILL_CAPTURE. 2807 </description> 2808 <details> 2809 If enableZsl is `true`, the camera device may enable zero-shutter-lag mode for requests with 2810 STILL_CAPTURE capture intent. The camera device may use images captured in the past to 2811 produce output images for a zero-shutter-lag request. The result metadata including the 2812 android.sensor.timestamp reflects the source frames used to produce output images. 2813 Therefore, the contents of the output images and the result metadata may be out of order 2814 compared to previous regular requests. enableZsl does not affect requests with other 2815 capture intents. 
2816 2817 For example, when requests are submitted in the following order: 2818 Request A: enableZsl is ON, android.control.captureIntent is PREVIEW 2819 Request B: enableZsl is ON, android.control.captureIntent is STILL_CAPTURE 2820 2821 The output images for request B may have contents captured before the output images for 2822 request A, and the result metadata for request B may be older than the result metadata for 2823 request A. 2824 2825 Note that when enableZsl is `true`, it is not guaranteed to get output images captured in 2826 the past for requests with STILL_CAPTURE capture intent. 2827 2828 For applications targeting SDK versions O and newer, the value of enableZsl in 2829 TEMPLATE_STILL_CAPTURE template may be `true`. The value in other templates is always 2830 `false` if present. 2831 2832 For applications targeting SDK versions older than O, the value of enableZsl in all 2833 capture templates is always `false` if present. 2834 2835 For application-operated ZSL, use CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG template. 2836 </details> 2837 <hal_details> 2838 It is valid for HAL to produce regular output images for requests with STILL_CAPTURE 2839 capture intent. 2840 </hal_details> 2841 </entry> 2842 </controls> 2843 <dynamic> 2844 <clone entry="android.control.enableZsl" kind="controls"> 2845 </clone> 2846 </dynamic> 2847 </section> 2848 <section name="demosaic"> 2849 <controls> 2850 <entry name="mode" type="byte" enum="true"> 2851 <enum> 2852 <value>FAST 2853 <notes>Minimal or no slowdown of frame rate compared to 2854 Bayer RAW output.</notes></value> 2855 <value>HIGH_QUALITY 2856 <notes>Improved processing quality but the frame rate might be slowed down 2857 relative to raw output.</notes></value> 2858 </enum> 2859 <description>Controls the quality of the demosaicing 2860 processing.</description> 2861 <tag id="FUTURE" /> 2862 </entry> 2863 </controls> 2864 </section> 2865 <section name="edge"> 2866 <controls> 2867 <entry name="mode" type="byte" visibility="public" enum="true" hwlevel="full"> 2868 <enum> 2869 <value>OFF 2870 <notes>No edge enhancement is applied.</notes></value> 2871 <value>FAST 2872 <notes>Apply edge enhancement at a quality level that does not slow down frame rate 2873 relative to sensor output. It may be the same as OFF if edge enhancement will 2874 slow down frame rate relative to sensor.</notes></value> 2875 <value>HIGH_QUALITY 2876 <notes>Apply high-quality edge enhancement, at a cost of possibly reduced output frame rate. 2877 </notes></value> 2878 <value optional="true">ZERO_SHUTTER_LAG 2879 <notes>Edge enhancement is applied at different levels for different output streams, 2880 based on resolution. Streams at maximum recording resolution (see {@link 2881 android.hardware.camera2.CameraDevice#createCaptureSession}) or below have 2882 edge enhancement applied, while higher-resolution streams have no edge enhancement 2883 applied. The level of edge enhancement for low-resolution streams is tuned so that 2884 frame rate is not impacted, and the quality is equal to or better than FAST (since it 2885 is only applied to lower-resolution outputs, quality may improve from FAST). 2886 2887 This mode is intended to be used by applications operating in a zero-shutter-lag mode 2888 with YUV or PRIVATE reprocessing, where the application continuously captures 2889 high-resolution intermediate buffers into a circular buffer, from which a final image is 2890 produced via reprocessing when a user takes a picture. 
For such a use case, the 2891 high-resolution buffers must not have edge enhancement applied to maximize efficiency of 2892 preview and to avoid double-applying enhancement when reprocessed, while low-resolution 2893 buffers (used for recording or preview, generally) need edge enhancement applied for 2894 reasonable preview quality. 2895 2896 This mode is guaranteed to be supported by devices that support either the 2897 YUV_REPROCESSING or PRIVATE_REPROCESSING capabilities 2898 (android.request.availableCapabilities lists either of those capabilities) and it will 2899 be the default mode for CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG template. 2900 </notes></value> 2901 </enum> 2902 <description>Operation mode for edge 2903 enhancement.</description> 2904 <range>android.edge.availableEdgeModes</range> 2905 <details>Edge enhancement improves sharpness and details in the captured image. OFF means 2906 no enhancement will be applied by the camera device. 2907 2908 FAST/HIGH_QUALITY both mean camera device determined enhancement 2909 will be applied. HIGH_QUALITY mode indicates that the 2910 camera device will use the highest-quality enhancement algorithms, 2911 even if it slows down capture rate. FAST means the camera device will 2912 not slow down capture rate when applying edge enhancement. FAST may be the same as OFF if 2913 edge enhancement will slow down capture rate. Every output stream will have a similar 2914 amount of enhancement applied. 2915 2916 ZERO_SHUTTER_LAG is meant to be used by applications that maintain a continuous circular 2917 buffer of high-resolution images during preview and reprocess image(s) from that buffer 2918 into a final capture when triggered by the user. In this mode, the camera device applies 2919 edge enhancement to low-resolution streams (below maximum recording resolution) to 2920 maximize preview quality, but does not apply edge enhancement to high-resolution streams, 2921 since those will be reprocessed later if necessary. 2922 2923 For YUV_REPROCESSING, these FAST/HIGH_QUALITY modes both mean that the camera 2924 device will apply FAST/HIGH_QUALITY YUV-domain edge enhancement, respectively. 2925 The camera device may adjust its internal edge enhancement parameters for best 2926 image quality based on the android.reprocess.effectiveExposureFactor, if it is set. 2927 </details> 2928 <hal_details> 2929 For YUV_REPROCESSING The HAL can use android.reprocess.effectiveExposureFactor to 2930 adjust the internal edge enhancement reduction parameters appropriately to get the best 2931 quality images. 2932 </hal_details> 2933 <tag id="V1" /> 2934 <tag id="REPROC" /> 2935 </entry> 2936 <entry name="strength" type="byte"> 2937 <description>Control the amount of edge enhancement 2938 applied to the images</description> 2939 <units>1-10; 10 is maximum sharpening</units> 2940 <tag id="FUTURE" /> 2941 </entry> 2942 </controls> 2943 <static> 2944 <entry name="availableEdgeModes" type="byte" visibility="public" 2945 type_notes="list of enums" container="array" typedef="enumList" 2946 hwlevel="full"> 2947 <array> 2948 <size>n</size> 2949 </array> 2950 <description> 2951 List of edge enhancement modes for android.edge.mode that are supported by this camera 2952 device. 2953 </description> 2954 <range>Any value listed in android.edge.mode</range> 2955 <details> 2956 Full-capability camera devices must always support OFF; camera devices that support 2957 YUV_REPROCESSING or PRIVATE_REPROCESSING will list ZERO_SHUTTER_LAG; all devices will 2958 list FAST. 
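
As a purely illustrative sketch (not part of this specification), an application might choose a
mode from this list as follows; `chooseEdgeMode` is a hypothetical helper and `c` is assumed to be
the device's CameraCharacteristics:

    // Illustrative helper: choose an edge enhancement mode for a still capture
    // from the advertised list, preferring quality over frame rate.
    private int chooseEdgeMode(CameraCharacteristics c) {
        // May be null on devices that do not advertise this key.
        int[] modes = c.get(CameraCharacteristics.EDGE_AVAILABLE_EDGE_MODES);
        int chosen = CameraMetadata.EDGE_MODE_FAST; // FAST is listed by all devices reporting the key
        if (modes == null) return chosen;
        for (int mode : modes) {
            if (mode == CameraMetadata.EDGE_MODE_HIGH_QUALITY) {
                chosen = CameraMetadata.EDGE_MODE_HIGH_QUALITY; // prefer quality for stills
            }
        }
        return chosen;
    }

The chosen value would then be set as android.edge.mode on the still-capture request.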
</details>
<hal_details>
HAL must support both FAST and HIGH_QUALITY if edge enhancement control is available
on the camera device, but the underlying implementation can be the same for both modes.
That is, if the highest quality implementation on the camera device does not slow down
capture rate, then FAST and HIGH_QUALITY will generate the same output.
</hal_details>
<tag id="V1" />
<tag id="REPROC" />
</entry>
</static>
<dynamic>
<clone entry="android.edge.mode" kind="controls">
<tag id="V1" />
<tag id="REPROC" />
</clone>
</dynamic>
</section>
<section name="flash">
<controls>
<entry name="firingPower" type="byte">
<description>Power for flash firing/torch</description>
<units>10 is max power; 0 is no flash. Linear</units>
<range>0 - 10</range>
<details>Power for snapshot may use a different scale than
for torch mode. Only one entry for torch mode will be
used</details>
<tag id="FUTURE" />
</entry>
<entry name="firingTime" type="int64">
<description>Firing time of flash relative to start of
exposure</description>
<units>nanoseconds</units>
<range>0-(exposure time-flash duration)</range>
<details>Clamped to (0, exposure time - flash
duration).</details>
<tag id="FUTURE" />
</entry>
<entry name="mode" type="byte" visibility="public" enum="true" hwlevel="legacy">
<enum>
<value>OFF
<notes>
Do not fire the flash for this capture.
</notes>
</value>
<value>SINGLE
<notes>
If the flash is available and charged, fire the flash
for this capture.
</notes>
</value>
<value>TORCH
<notes>
Transition the flash to continuously on.
</notes>
</value>
</enum>
<description>The desired mode for the camera device's flash control.</description>
<details>
This control is only effective when the flash unit is available
(`android.flash.info.available == true`).

When this control is used, the android.control.aeMode must be set to ON or OFF.
Otherwise, the camera device auto-exposure related flash control (ON_AUTO_FLASH,
ON_ALWAYS_FLASH, or ON_AUTO_FLASH_REDEYE) will override this control.

When set to OFF, the camera device will not fire the flash for this capture.

When set to SINGLE, the camera device will fire the flash regardless of the camera
device's auto-exposure routine's result. When used in the still capture case, this
control should be used along with the auto-exposure (AE) precapture metering sequence
(android.control.aePrecaptureTrigger); otherwise, the image may be incorrectly exposed.

When set to TORCH, the flash will be on continuously. This mode can be used
for use cases such as preview, auto-focus assist, still capture, or video recording.

The flash status will be reported by android.flash.state in the capture result metadata.
</details>
<tag id="BC" />
</entry>
</controls>
<static>
<namespace name="info">
<entry name="available" type="byte" visibility="public" enum="true"
typedef="boolean" hwlevel="legacy">
<enum>
<value>FALSE</value>
<value>TRUE</value>
</enum>
<description>Whether this camera device has a
flash unit.</description>
<details>
Will be `false` if no flash is available.
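
For example, a minimal illustrative sketch of guarding flash use on this key (assuming an
existing CameraCharacteristics `c` and CaptureRequest.Builder `builder`, and that
android.control.aeMode has been set to ON or OFF as required by android.flash.mode):

    // Illustrative only: request a single flash fire only if a flash unit exists.
    Boolean hasFlash = c.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
    if (Boolean.TRUE.equals(hasFlash)) {
        builder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_SINGLE);
    }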
3052 3053 If there is no flash unit, none of the flash controls do 3054 anything.</details> 3055 <tag id="BC" /> 3056 </entry> 3057 <entry name="chargeDuration" type="int64"> 3058 <description>Time taken before flash can fire 3059 again</description> 3060 <units>nanoseconds</units> 3061 <range>0-1e9</range> 3062 <details>1 second too long/too short for recharge? Should 3063 this be power-dependent?</details> 3064 <tag id="FUTURE" /> 3065 </entry> 3066 </namespace> 3067 <entry name="colorTemperature" type="byte"> 3068 <description>The x,y whitepoint of the 3069 flash</description> 3070 <units>pair of floats</units> 3071 <range>0-1 for both</range> 3072 <tag id="FUTURE" /> 3073 </entry> 3074 <entry name="maxEnergy" type="byte"> 3075 <description>Max energy output of the flash for a full 3076 power single flash</description> 3077 <units>lumen-seconds</units> 3078 <range>&gt;= 0</range> 3079 <tag id="FUTURE" /> 3080 </entry> 3081 </static> 3082 <dynamic> 3083 <clone entry="android.flash.firingPower" kind="controls"> 3084 </clone> 3085 <clone entry="android.flash.firingTime" kind="controls"> 3086 </clone> 3087 <clone entry="android.flash.mode" kind="controls"></clone> 3088 <entry name="state" type="byte" visibility="public" enum="true" 3089 hwlevel="limited"> 3090 <enum> 3091 <value>UNAVAILABLE 3092 <notes>No flash on camera.</notes></value> 3093 <value>CHARGING 3094 <notes>Flash is charging and cannot be fired.</notes></value> 3095 <value>READY 3096 <notes>Flash is ready to fire.</notes></value> 3097 <value>FIRED 3098 <notes>Flash fired for this capture.</notes></value> 3099 <value>PARTIAL 3100 <notes>Flash partially illuminated this frame. 3101 3102 This is usually due to the next or previous frame having 3103 the flash fire, and the flash spilling into this capture 3104 due to hardware limitations.</notes></value> 3105 </enum> 3106 <description>Current state of the flash 3107 unit.</description> 3108 <details> 3109 When the camera device doesn't have flash unit 3110 (i.e. `android.flash.info.available == false`), this state will always be UNAVAILABLE. 3111 Other states indicate the current flash status. 3112 3113 In certain conditions, this will be available on LEGACY devices: 3114 3115 * Flash-less cameras always return UNAVAILABLE. 3116 * Using android.control.aeMode `==` ON_ALWAYS_FLASH 3117 will always return FIRED. 3118 * Using android.flash.mode `==` TORCH 3119 will always return FIRED. 3120 3121 In all other conditions the state will not be available on 3122 LEGACY devices (i.e. it will be `null`). 3123 </details> 3124 </entry> 3125 </dynamic> 3126 </section> 3127 <section name="hotPixel"> 3128 <controls> 3129 <entry name="mode" type="byte" visibility="public" enum="true"> 3130 <enum> 3131 <value>OFF 3132 <notes> 3133 No hot pixel correction is applied. 3134 3135 The frame rate must not be reduced relative to sensor raw output 3136 for this option. 3137 3138 The hotpixel map may be returned in android.statistics.hotPixelMap. 3139 </notes> 3140 </value> 3141 <value>FAST 3142 <notes> 3143 Hot pixel correction is applied, without reducing frame 3144 rate relative to sensor raw output. 3145 3146 The hotpixel map may be returned in android.statistics.hotPixelMap. 3147 </notes> 3148 </value> 3149 <value>HIGH_QUALITY 3150 <notes> 3151 High-quality hot pixel correction is applied, at a cost 3152 of possibly reduced frame rate relative to sensor raw output. 3153 3154 The hotpixel map may be returned in android.statistics.hotPixelMap. 
3155 </notes> 3156 </value> 3157 </enum> 3158 <description> 3159 Operational mode for hot pixel correction. 3160 </description> 3161 <range>android.hotPixel.availableHotPixelModes</range> 3162 <details> 3163 Hotpixel correction interpolates out, or otherwise removes, pixels 3164 that do not accurately measure the incoming light (i.e. pixels that 3165 are stuck at an arbitrary value or are oversensitive). 3166 </details> 3167 <tag id="V1" /> 3168 <tag id="RAW" /> 3169 </entry> 3170 </controls> 3171 <static> 3172 <entry name="availableHotPixelModes" type="byte" visibility="public" 3173 type_notes="list of enums" container="array" typedef="enumList"> 3174 <array> 3175 <size>n</size> 3176 </array> 3177 <description> 3178 List of hot pixel correction modes for android.hotPixel.mode that are supported by this 3179 camera device. 3180 </description> 3181 <range>Any value listed in android.hotPixel.mode</range> 3182 <details> 3183 FULL mode camera devices will always support FAST. 3184 </details> 3185 <hal_details> 3186 To avoid performance issues, there will be significantly fewer hot 3187 pixels than actual pixels on the camera sensor. 3188 HAL must support both FAST and HIGH_QUALITY if hot pixel correction control is available 3189 on the camera device, but the underlying implementation can be the same for both modes. 3190 That is, if the highest quality implementation on the camera device does not slow down 3191 capture rate, then FAST and HIGH_QUALITY will generate the same output. 3192 </hal_details> 3193 <tag id="V1" /> 3194 <tag id="RAW" /> 3195 </entry> 3196 </static> 3197 <dynamic> 3198 <clone entry="android.hotPixel.mode" kind="controls"> 3199 <tag id="V1" /> 3200 <tag id="RAW" /> 3201 </clone> 3202 </dynamic> 3203 </section> 3204 <section name="jpeg"> 3205 <controls> 3206 <entry name="gpsLocation" type="byte" visibility="java_public" synthetic="true" 3207 typedef="location" hwlevel="legacy"> 3208 <description> 3209 A location object to use when generating image GPS metadata. 3210 </description> 3211 <details> 3212 Setting a location object in a request will include the GPS coordinates of the location 3213 into any JPEG images captured based on the request. These coordinates can then be 3214 viewed by anyone who receives the JPEG image. 3215 </details> 3216 </entry> 3217 <entry name="gpsCoordinates" type="double" visibility="ndk_public" 3218 type_notes="latitude, longitude, altitude. 
First two in degrees, the third in meters" 3219 container="array" hwlevel="legacy"> 3220 <array> 3221 <size>3</size> 3222 </array> 3223 <description>GPS coordinates to include in output JPEG 3224 EXIF.</description> 3225 <range>(-180 - 180], [-90,90], [-inf, inf]</range> 3226 <tag id="BC" /> 3227 </entry> 3228 <entry name="gpsProcessingMethod" type="byte" visibility="ndk_public" 3229 typedef="string" hwlevel="legacy"> 3230 <description>32 characters describing GPS algorithm to 3231 include in EXIF.</description> 3232 <units>UTF-8 null-terminated string</units> 3233 <tag id="BC" /> 3234 </entry> 3235 <entry name="gpsTimestamp" type="int64" visibility="ndk_public" hwlevel="legacy"> 3236 <description>Time GPS fix was made to include in 3237 EXIF.</description> 3238 <units>UTC in seconds since January 1, 1970</units> 3239 <tag id="BC" /> 3240 </entry> 3241 <entry name="orientation" type="int32" visibility="public" hwlevel="legacy"> 3242 <description>The orientation for a JPEG image.</description> 3243 <units>Degrees in multiples of 90</units> 3244 <range>0, 90, 180, 270</range> 3245 <details> 3246 The clockwise rotation angle in degrees, relative to the orientation 3247 to the camera, that the JPEG picture needs to be rotated by, to be viewed 3248 upright. 3249 3250 Camera devices may either encode this value into the JPEG EXIF header, or 3251 rotate the image data to match this orientation. When the image data is rotated, 3252 the thumbnail data will also be rotated. 3253 3254 Note that this orientation is relative to the orientation of the camera sensor, given 3255 by android.sensor.orientation. 3256 3257 To translate from the device orientation given by the Android sensor APIs, the following 3258 sample code may be used: 3259 3260 private int getJpegOrientation(CameraCharacteristics c, int deviceOrientation) { 3261 if (deviceOrientation == android.view.OrientationEventListener.ORIENTATION_UNKNOWN) return 0; 3262 int sensorOrientation = c.get(CameraCharacteristics.SENSOR_ORIENTATION); 3263 3264 // Round device orientation to a multiple of 90 3265 deviceOrientation = (deviceOrientation + 45) / 90 * 90; 3266 3267 // Reverse device orientation for front-facing cameras 3268 boolean facingFront = c.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT; 3269 if (facingFront) deviceOrientation = -deviceOrientation; 3270 3271 // Calculate desired JPEG orientation relative to camera orientation to make 3272 // the image upright relative to the device orientation 3273 int jpegOrientation = (sensorOrientation + deviceOrientation + 360) % 360; 3274 3275 return jpegOrientation; 3276 } 3277 </details> 3278 <tag id="BC" /> 3279 </entry> 3280 <entry name="quality" type="byte" visibility="public" hwlevel="legacy"> 3281 <description>Compression quality of the final JPEG 3282 image.</description> 3283 <range>1-100; larger is higher quality</range> 3284 <details>85-95 is typical usage range.</details> 3285 <tag id="BC" /> 3286 </entry> 3287 <entry name="thumbnailQuality" type="byte" visibility="public" hwlevel="legacy"> 3288 <description>Compression quality of JPEG 3289 thumbnail.</description> 3290 <range>1-100; larger is higher quality</range> 3291 <tag id="BC" /> 3292 </entry> 3293 <entry name="thumbnailSize" type="int32" visibility="public" 3294 container="array" typedef="size" hwlevel="legacy"> 3295 <array> 3296 <size>2</size> 3297 </array> 3298 <description>Resolution of embedded JPEG thumbnail.</description> 3299 <range>android.jpeg.availableThumbnailSizes</range> 3300 
<details>When set to (0, 0), the JPEG EXIF will not contain a thumbnail,
but the captured JPEG will still be a valid image.

For best results, when issuing a request for a JPEG image, the thumbnail size selected
should have the same aspect ratio as the main JPEG output.

If the thumbnail image aspect ratio differs from the JPEG primary image aspect
ratio, the camera device creates the thumbnail by cropping it from the primary image.
For example, if the primary image has a 4:3 aspect ratio and the thumbnail image has a
16:9 aspect ratio, the primary image will be cropped vertically (letterboxed) to
generate the thumbnail image. The thumbnail image will always have a smaller Field
Of View (FOV) than the primary image when aspect ratios differ.

When a non-zero android.jpeg.orientation is requested,
the camera device will handle thumbnail rotation in one of the following ways:

* Set the {@link android.media.ExifInterface#TAG_ORIENTATION EXIF orientation flag}
and keep jpeg and thumbnail image data unrotated.
* Rotate the jpeg and thumbnail image data and not set the
{@link android.media.ExifInterface#TAG_ORIENTATION EXIF orientation flag}. In this
case, LIMITED or FULL hardware level devices will report the rotated thumbnail size in
the capture result, so the width and height will be interchanged if 90 or 270 degree
orientation is requested. LEGACY devices will always report the unrotated thumbnail
size.
</details>
<hal_details>
The HAL must not squeeze or stretch the downscaled primary image to generate the thumbnail.
The cropping must be done on the primary jpeg image rather than the sensor active array.
The stream cropping rule specified by "S5. Cropping" in camera3.h doesn't apply to the
thumbnail image cropping.
</hal_details>
<tag id="BC" />
</entry>
</controls>
<static>
<entry name="availableThumbnailSizes" type="int32" visibility="public"
container="array" typedef="size" hwlevel="legacy">
<array>
<size>2</size>
<size>n</size>
</array>
<description>List of JPEG thumbnail sizes for android.jpeg.thumbnailSize supported by this
camera device.</description>
<details>
This list will include at least one non-zero resolution, plus `(0,0)` for indicating no
thumbnail should be generated.

The following conditions will be satisfied for this size list:

* The sizes will be sorted by increasing pixel area (width x height).
If several resolutions have the same area, they will be sorted by increasing width.
* The aspect ratio of the largest thumbnail size will be the same as the
aspect ratio of the largest JPEG output size in android.scaler.availableStreamConfigurations.
The largest size is defined as the size that has the largest pixel area
in a given size list.
* Each output JPEG size in android.scaler.availableStreamConfigurations will have at least
one corresponding size that has the same aspect ratio in availableThumbnailSizes,
and vice versa.
3358 * All non-`(0, 0)` sizes will have non-zero widths and heights.</details> 3359 <tag id="BC" /> 3360 </entry> 3361 <entry name="maxSize" type="int32" visibility="system"> 3362 <description>Maximum size in bytes for the compressed 3363 JPEG buffer</description> 3364 <range>Must be large enough to fit any JPEG produced by 3365 the camera</range> 3366 <details>This is used for sizing the gralloc buffers for 3367 JPEG</details> 3368 </entry> 3369 </static> 3370 <dynamic> 3371 <clone entry="android.jpeg.gpsLocation" kind="controls"> 3372 </clone> 3373 <clone entry="android.jpeg.gpsCoordinates" kind="controls"> 3374 </clone> 3375 <clone entry="android.jpeg.gpsProcessingMethod" 3376 kind="controls"></clone> 3377 <clone entry="android.jpeg.gpsTimestamp" kind="controls"> 3378 </clone> 3379 <clone entry="android.jpeg.orientation" kind="controls"> 3380 </clone> 3381 <clone entry="android.jpeg.quality" kind="controls"> 3382 </clone> 3383 <entry name="size" type="int32"> 3384 <description>The size of the compressed JPEG image, in 3385 bytes</description> 3386 <range>&gt;= 0</range> 3387 <details>If no JPEG output is produced for the request, 3388 this must be 0. 3389 3390 Otherwise, this describes the real size of the compressed 3391 JPEG image placed in the output stream. More specifically, 3392 if android.jpeg.maxSize = 1000000, and a specific capture 3393 has android.jpeg.size = 500000, then the output buffer from 3394 the JPEG stream will be 1000000 bytes, of which the first 3395 500000 make up the real data.</details> 3396 <tag id="FUTURE" /> 3397 </entry> 3398 <clone entry="android.jpeg.thumbnailQuality" 3399 kind="controls"></clone> 3400 <clone entry="android.jpeg.thumbnailSize" kind="controls"> 3401 </clone> 3402 </dynamic> 3403 </section> 3404 <section name="lens"> 3405 <controls> 3406 <entry name="aperture" type="float" visibility="public" hwlevel="full"> 3407 <description>The desired lens aperture size, as a ratio of lens focal length to the 3408 effective aperture diameter.</description> 3409 <units>The f-number (f/N)</units> 3410 <range>android.lens.info.availableApertures</range> 3411 <details>Setting this value is only supported on the camera devices that have a variable 3412 aperture lens. 3413 3414 When this is supported and android.control.aeMode is OFF, 3415 this can be set along with android.sensor.exposureTime, 3416 android.sensor.sensitivity, and android.sensor.frameDuration 3417 to achieve manual exposure control. 3418 3419 The requested aperture value may take several frames to reach the 3420 requested value; the camera device will report the current (intermediate) 3421 aperture size in capture result metadata while the aperture is changing. 3422 While the aperture is still changing, android.lens.state will be set to MOVING. 3423 3424 When this is supported and android.control.aeMode is one of 3425 the ON modes, this will be overridden by the camera device 3426 auto-exposure algorithm, the overridden values are then provided 3427 back to the user in the corresponding result.</details> 3428 <tag id="V1" /> 3429 </entry> 3430 <entry name="filterDensity" type="float" visibility="public" hwlevel="full"> 3431 <description> 3432 The desired setting for the lens neutral density filter(s). 3433 </description> 3434 <units>Exposure Value (EV)</units> 3435 <range>android.lens.info.availableFilterDensities</range> 3436 <details> 3437 This control will not be supported on most camera devices. 
3438 3439 Lens filters are typically used to lower the amount of light the 3440 sensor is exposed to (measured in steps of EV). As used here, an EV 3441 step is the standard logarithmic representation, which are 3442 non-negative, and inversely proportional to the amount of light 3443 hitting the sensor. For example, setting this to 0 would result 3444 in no reduction of the incoming light, and setting this to 2 would 3445 mean that the filter is set to reduce incoming light by two stops 3446 (allowing 1/4 of the prior amount of light to the sensor). 3447 3448 It may take several frames before the lens filter density changes 3449 to the requested value. While the filter density is still changing, 3450 android.lens.state will be set to MOVING. 3451 </details> 3452 <tag id="V1" /> 3453 </entry> 3454 <entry name="focalLength" type="float" visibility="public" hwlevel="legacy"> 3455 <description> 3456 The desired lens focal length; used for optical zoom. 3457 </description> 3458 <units>Millimeters</units> 3459 <range>android.lens.info.availableFocalLengths</range> 3460 <details> 3461 This setting controls the physical focal length of the camera 3462 device's lens. Changing the focal length changes the field of 3463 view of the camera device, and is usually used for optical zoom. 3464 3465 Like android.lens.focusDistance and android.lens.aperture, this 3466 setting won't be applied instantaneously, and it may take several 3467 frames before the lens can change to the requested focal length. 3468 While the focal length is still changing, android.lens.state will 3469 be set to MOVING. 3470 3471 Optical zoom will not be supported on most devices. 3472 </details> 3473 <tag id="V1" /> 3474 </entry> 3475 <entry name="focusDistance" type="float" visibility="public" hwlevel="full"> 3476 <description>Desired distance to plane of sharpest focus, 3477 measured from frontmost surface of the lens.</description> 3478 <units>See android.lens.info.focusDistanceCalibration for details</units> 3479 <range>&gt;= 0</range> 3480 <details> 3481 This control can be used for setting manual focus, on devices that support 3482 the MANUAL_SENSOR capability and have a variable-focus lens (see 3483 android.lens.info.minimumFocusDistance). 3484 3485 A value of `0.0f` means infinity focus. The value set will be clamped to 3486 `[0.0f, android.lens.info.minimumFocusDistance]`. 3487 3488 Like android.lens.focalLength, this setting won't be applied 3489 instantaneously, and it may take several frames before the lens 3490 can move to the requested focus distance. While the lens is still moving, 3491 android.lens.state will be set to MOVING. 3492 3493 LEGACY devices support at most setting this to `0.0f` 3494 for infinity focus. 3495 </details> 3496 <tag id="BC" /> 3497 <tag id="V1" /> 3498 </entry> 3499 <entry name="opticalStabilizationMode" type="byte" visibility="public" 3500 enum="true" hwlevel="limited"> 3501 <enum> 3502 <value>OFF 3503 <notes>Optical stabilization is unavailable.</notes> 3504 </value> 3505 <value optional="true">ON 3506 <notes>Optical stabilization is enabled.</notes> 3507 </value> 3508 </enum> 3509 <description> 3510 Sets whether the camera device uses optical image stabilization (OIS) 3511 when capturing images. 3512 </description> 3513 <range>android.lens.info.availableOpticalStabilization</range> 3514 <details> 3515 OIS is used to compensate for motion blur due to small 3516 movements of the camera during capture. 
Unlike digital image 3517 stabilization (android.control.videoStabilizationMode), OIS 3518 makes use of mechanical elements to stabilize the camera 3519 sensor, and thus allows for longer exposure times before 3520 camera shake becomes apparent. 3521 3522 Switching between different optical stabilization modes may take several 3523 frames to initialize, the camera device will report the current mode in 3524 capture result metadata. For example, When "ON" mode is requested, the 3525 optical stabilization modes in the first several capture results may still 3526 be "OFF", and it will become "ON" when the initialization is done. 3527 3528 If a camera device supports both OIS and digital image stabilization 3529 (android.control.videoStabilizationMode), turning both modes on may produce undesirable 3530 interaction, so it is recommended not to enable both at the same time. 3531 3532 Not all devices will support OIS; see 3533 android.lens.info.availableOpticalStabilization for 3534 available controls. 3535 </details> 3536 <tag id="V1" /> 3537 </entry> 3538 </controls> 3539 <static> 3540 <namespace name="info"> 3541 <entry name="availableApertures" type="float" visibility="public" 3542 container="array" hwlevel="full"> 3543 <array> 3544 <size>n</size> 3545 </array> 3546 <description>List of aperture size values for android.lens.aperture that are 3547 supported by this camera device.</description> 3548 <units>The aperture f-number</units> 3549 <details>If the camera device doesn't support a variable lens aperture, 3550 this list will contain only one value, which is the fixed aperture size. 3551 3552 If the camera device supports a variable aperture, the aperture values 3553 in this list will be sorted in ascending order.</details> 3554 <tag id="V1" /> 3555 </entry> 3556 <entry name="availableFilterDensities" type="float" visibility="public" 3557 container="array" hwlevel="full"> 3558 <array> 3559 <size>n</size> 3560 </array> 3561 <description> 3562 List of neutral density filter values for 3563 android.lens.filterDensity that are supported by this camera device. 3564 </description> 3565 <units>Exposure value (EV)</units> 3566 <range> 3567 Values are &gt;= 0 3568 </range> 3569 <details> 3570 If a neutral density filter is not supported by this camera device, 3571 this list will contain only 0. Otherwise, this list will include every 3572 filter density supported by the camera device, in ascending order. 3573 </details> 3574 <tag id="V1" /> 3575 </entry> 3576 <entry name="availableFocalLengths" type="float" visibility="public" 3577 type_notes="The list of available focal lengths" 3578 container="array" hwlevel="legacy"> 3579 <array> 3580 <size>n</size> 3581 </array> 3582 <description> 3583 List of focal lengths for android.lens.focalLength that are supported by this camera 3584 device. 3585 </description> 3586 <units>Millimeters</units> 3587 <range> 3588 Values are &gt; 0 3589 </range> 3590 <details> 3591 If optical zoom is not supported, this list will only contain 3592 a single value corresponding to the fixed focal length of the 3593 device. Otherwise, this list will include every focal length supported 3594 by the camera device, in ascending order. 
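
As an illustrative sketch only (assuming `c` is the device's CameraCharacteristics), an
application that wants the maximum optical zoom can take the last entry of this ascending
list and apply it via android.lens.focalLength:

    // Illustrative helper: the list is sorted in ascending order, so the last
    // entry is the longest focal length (maximum optical zoom).
    private float longestFocalLength(CameraCharacteristics c) {
        float[] focalLengths = c.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
        return focalLengths[focalLengths.length - 1];
    }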
3595 </details> 3596 <tag id="BC" /> 3597 <tag id="V1" /> 3598 </entry> 3599 <entry name="availableOpticalStabilization" type="byte" 3600 visibility="public" type_notes="list of enums" container="array" 3601 typedef="enumList" hwlevel="limited"> 3602 <array> 3603 <size>n</size> 3604 </array> 3605 <description> 3606 List of optical image stabilization (OIS) modes for 3607 android.lens.opticalStabilizationMode that are supported by this camera device. 3608 </description> 3609 <range>Any value listed in android.lens.opticalStabilizationMode</range> 3610 <details> 3611 If OIS is not supported by a given camera device, this list will 3612 contain only OFF. 3613 </details> 3614 <tag id="V1" /> 3615 </entry> 3616 <entry name="hyperfocalDistance" type="float" visibility="public" optional="true" 3617 hwlevel="limited"> 3618 <description>Hyperfocal distance for this lens.</description> 3619 <units>See android.lens.info.focusDistanceCalibration for details</units> 3620 <range>If lens is fixed focus, &gt;= 0. If lens has focuser unit, the value is 3621 within `(0.0f, android.lens.info.minimumFocusDistance]`</range> 3622 <details> 3623 If the lens is not fixed focus, the camera device will report this 3624 field when android.lens.info.focusDistanceCalibration is APPROXIMATE or CALIBRATED. 3625 </details> 3626 </entry> 3627 <entry name="minimumFocusDistance" type="float" visibility="public" optional="true" 3628 hwlevel="limited"> 3629 <description>Shortest distance from frontmost surface 3630 of the lens that can be brought into sharp focus.</description> 3631 <units>See android.lens.info.focusDistanceCalibration for details</units> 3632 <range>&gt;= 0</range> 3633 <details>If the lens is fixed-focus, this will be 3634 0.</details> 3635 <hal_details>Mandatory for FULL devices; LIMITED devices 3636 must always set this value to 0 for fixed-focus; and may omit 3637 the minimum focus distance otherwise. 3638 3639 This field is also mandatory for all devices advertising 3640 the MANUAL_SENSOR capability.</hal_details> 3641 <tag id="V1" /> 3642 </entry> 3643 <entry name="shadingMapSize" type="int32" visibility="ndk_public" 3644 type_notes="width and height (N, M) of lens shading map provided by the camera device." 3645 container="array" typedef="size" hwlevel="full"> 3646 <array> 3647 <size>2</size> 3648 </array> 3649 <description>Dimensions of lens shading map.</description> 3650 <range>Both values &gt;= 1</range> 3651 <details> 3652 The map should be on the order of 30-40 rows and columns, and 3653 must be smaller than 64x64. 3654 </details> 3655 <tag id="V1" /> 3656 </entry> 3657 <entry name="focusDistanceCalibration" type="byte" visibility="public" 3658 enum="true" hwlevel="limited"> 3659 <enum> 3660 <value>UNCALIBRATED 3661 <notes> 3662 The lens focus distance is not accurate, and the units used for 3663 android.lens.focusDistance do not correspond to any physical units. 3664 3665 Setting the lens to the same focus distance on separate occasions may 3666 result in a different real focus distance, depending on factors such 3667 as the orientation of the device, the age of the focusing mechanism, 3668 and the device temperature. The focus distance value will still be 3669 in the range of `[0, android.lens.info.minimumFocusDistance]`, where 0 3670 represents the farthest focus. 3671 </notes> 3672 </value> 3673 <value>APPROXIMATE 3674 <notes> 3675 The lens focus distance is measured in diopters. 
3676 3677 However, setting the lens to the same focus distance 3678 on separate occasions may result in a different real 3679 focus distance, depending on factors such as the 3680 orientation of the device, the age of the focusing 3681 mechanism, and the device temperature. 3682 </notes> 3683 </value> 3684 <value>CALIBRATED 3685 <notes> 3686 The lens focus distance is measured in diopters, and 3687 is calibrated. 3688 3689 The lens mechanism is calibrated so that setting the 3690 same focus distance is repeatable on multiple 3691 occasions with good accuracy, and the focus distance 3692 corresponds to the real physical distance to the plane 3693 of best focus. 3694 </notes> 3695 </value> 3696 </enum> 3697 <description>The lens focus distance calibration quality.</description> 3698 <details> 3699 The lens focus distance calibration quality determines the reliability of 3700 focus related metadata entries, i.e. android.lens.focusDistance, 3701 android.lens.focusRange, android.lens.info.hyperfocalDistance, and 3702 android.lens.info.minimumFocusDistance. 3703 3704 APPROXIMATE and CALIBRATED devices report the focus metadata in 3705 units of diopters (1/meter), so `0.0f` represents focusing at infinity, 3706 and increasing positive numbers represent focusing closer and closer 3707 to the camera device. The focus distance control also uses diopters 3708 on these devices. 3709 3710 UNCALIBRATED devices do not use units that are directly comparable 3711 to any real physical measurement, but `0.0f` still represents farthest 3712 focus, and android.lens.info.minimumFocusDistance represents the 3713 nearest focus the device can achieve. 3714 </details> 3715 <hal_details> 3716 For devices advertise APPROXIMATE quality or higher, diopters 0 (infinity 3717 focus) must work. When autofocus is disabled (android.control.afMode == OFF) 3718 and the lens focus distance is set to 0 diopters 3719 (android.lens.focusDistance == 0), the lens will move to focus at infinity 3720 and is stably focused at infinity even if the device tilts. It may take the 3721 lens some time to move; during the move the lens state should be MOVING and 3722 the output diopter value should be changing toward 0. 3723 </hal_details> 3724 <tag id="V1" /> 3725 </entry> 3726 </namespace> 3727 <entry name="facing" type="byte" visibility="public" enum="true" hwlevel="legacy"> 3728 <enum> 3729 <value>FRONT 3730 <notes> 3731 The camera device faces the same direction as the device's screen. 3732 </notes></value> 3733 <value>BACK 3734 <notes> 3735 The camera device faces the opposite direction as the device's screen. 3736 </notes></value> 3737 <value>EXTERNAL 3738 <notes> 3739 The camera device is an external camera, and has no fixed facing relative to the 3740 device's screen. 3741 </notes></value> 3742 </enum> 3743 <description>Direction the camera faces relative to 3744 device screen.</description> 3745 </entry> 3746 <entry name="poseRotation" type="float" visibility="public" 3747 container="array"> 3748 <array> 3749 <size>4</size> 3750 </array> 3751 <description> 3752 The orientation of the camera relative to the sensor 3753 coordinate system. 
3754 </description> 3755 <units> 3756 Quaternion coefficients 3757 </units> 3758 <details> 3759 The four coefficients that describe the quaternion 3760 rotation from the Android sensor coordinate system to a 3761 camera-aligned coordinate system where the X-axis is 3762 aligned with the long side of the image sensor, the Y-axis 3763 is aligned with the short side of the image sensor, and 3764 the Z-axis is aligned with the optical axis of the sensor. 3765 3766 To convert from the quaternion coefficients `(x,y,z,w)` 3767 to the axis of rotation `(a_x, a_y, a_z)` and rotation 3768 amount `theta`, the following formulas can be used: 3769 3770 theta = 2 * acos(w) 3771 a_x = x / sin(theta/2) 3772 a_y = y / sin(theta/2) 3773 a_z = z / sin(theta/2) 3774 3775 To create a 3x3 rotation matrix that applies the rotation 3776 defined by this quaternion, the following matrix can be 3777 used: 3778 3779 R = [ 1 - 2y^2 - 2z^2, 2xy - 2zw, 2xz + 2yw, 3780 2xy + 2zw, 1 - 2x^2 - 2z^2, 2yz - 2xw, 3781 2xz - 2yw, 2yz + 2xw, 1 - 2x^2 - 2y^2 ] 3782 3783 This matrix can then be used to apply the rotation to a 3784 column vector point with 3785 3786 `p' = Rp` 3787 3788 where `p` is in the device sensor coordinate system, and 3789 `p'` is in the camera-oriented coordinate system. 3790 </details> 3791 <tag id="DEPTH" /> 3792 </entry> 3793 <entry name="poseTranslation" type="float" visibility="public" 3794 container="array"> 3795 <array> 3796 <size>3</size> 3797 </array> 3798 <description>Position of the camera optical center.</description> 3799 <units>Meters</units> 3800 <details> 3801 The position of the camera device's lens optical center, 3802 as a three-dimensional vector `(x,y,z)`, relative to the 3803 optical center of the largest camera device facing in the 3804 same direction as this camera, in the {@link 3805 android.hardware.SensorEvent Android sensor coordinate 3806 axes}. Note that only the axis definitions are shared with 3807 the sensor coordinate system, but not the origin. 3808 3809 If this device is the largest or only camera device with a 3810 given facing, then this position will be `(0, 0, 0)`; a 3811 camera device with a lens optical center located 3 cm from 3812 the main sensor along the +X axis (to the right from the 3813 user's perspective) will report `(0.03, 0, 0)`. 3814 3815 To transform a pixel coordinates between two cameras 3816 facing the same direction, first the source camera 3817 android.lens.radialDistortion must be corrected for. Then 3818 the source camera android.lens.intrinsicCalibration needs 3819 to be applied, followed by the android.lens.poseRotation 3820 of the source camera, the translation of the source camera 3821 relative to the destination camera, the 3822 android.lens.poseRotation of the destination camera, and 3823 finally the inverse of android.lens.intrinsicCalibration 3824 of the destination camera. This obtains a 3825 radial-distortion-free coordinate in the destination 3826 camera pixel coordinates. 3827 3828 To compare this against a real image from the destination 3829 camera, the destination camera image then needs to be 3830 corrected for radial distortion before comparison or 3831 sampling. 
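
As an illustration of the android.lens.poseRotation step referenced above, the following
sketch expands the quaternion `(x,y,z,w)` into the 3x3 rotation matrix `R` defined in the
android.lens.poseRotation documentation; the row-major `float[9]` layout and the helper
name are arbitrary choices for this example:

    // Illustrative helper: convert the android.lens.poseRotation quaternion
    // (x, y, z, w) into a row-major 3x3 rotation matrix R.
    private float[] rotationMatrixFromPose(float[] q) {
        float x = q[0], y = q[1], z = q[2], w = q[3];
        return new float[] {
            1 - 2*y*y - 2*z*z,  2*x*y - 2*z*w,      2*x*z + 2*y*w,
            2*x*y + 2*z*w,      1 - 2*x*x - 2*z*z,  2*y*z - 2*x*w,
            2*x*z - 2*y*w,      2*y*z + 2*x*w,      1 - 2*x*x - 2*y*y
        };
    }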
3832 </details> 3833 <tag id="DEPTH" /> 3834 </entry> 3835 </static> 3836 <dynamic> 3837 <clone entry="android.lens.aperture" kind="controls"> 3838 <tag id="V1" /> 3839 </clone> 3840 <clone entry="android.lens.filterDensity" kind="controls"> 3841 <tag id="V1" /> 3842 </clone> 3843 <clone entry="android.lens.focalLength" kind="controls"> 3844 <tag id="BC" /> 3845 </clone> 3846 <clone entry="android.lens.focusDistance" kind="controls"> 3847 <details>Should be zero for fixed-focus cameras</details> 3848 <tag id="BC" /> 3849 </clone> 3850 <entry name="focusRange" type="float" visibility="public" 3851 type_notes="Range of scene distances that are in focus" 3852 container="array" typedef="pairFloatFloat" hwlevel="limited"> 3853 <array> 3854 <size>2</size> 3855 </array> 3856 <description>The range of scene distances that are in 3857 sharp focus (depth of field).</description> 3858 <units>A pair of focus distances in diopters: (near, 3859 far); see android.lens.info.focusDistanceCalibration for details.</units> 3860 <range>&gt;=0</range> 3861 <details>If variable focus not supported, can still report 3862 fixed depth of field range</details> 3863 <tag id="BC" /> 3864 </entry> 3865 <clone entry="android.lens.opticalStabilizationMode" 3866 kind="controls"> 3867 <tag id="V1" /> 3868 </clone> 3869 <entry name="state" type="byte" visibility="public" enum="true" hwlevel="limited"> 3870 <enum> 3871 <value>STATIONARY 3872 <notes> 3873 The lens parameters (android.lens.focalLength, android.lens.focusDistance, 3874 android.lens.filterDensity and android.lens.aperture) are not changing. 3875 </notes> 3876 </value> 3877 <value>MOVING 3878 <notes> 3879 One or several of the lens parameters 3880 (android.lens.focalLength, android.lens.focusDistance, 3881 android.lens.filterDensity or android.lens.aperture) is 3882 currently changing. 3883 </notes> 3884 </value> 3885 </enum> 3886 <description>Current lens status.</description> 3887 <details> 3888 For lens parameters android.lens.focalLength, android.lens.focusDistance, 3889 android.lens.filterDensity and android.lens.aperture, when changes are requested, 3890 they may take several frames to reach the requested values. This state indicates 3891 the current status of the lens parameters. 3892 3893 When the state is STATIONARY, the lens parameters are not changing. This could be 3894 either because the parameters are all fixed, or because the lens has had enough 3895 time to reach the most recently-requested values. 3896 If all these lens parameters are not changable for a camera device, as listed below: 3897 3898 * Fixed focus (`android.lens.info.minimumFocusDistance == 0`), which means 3899 android.lens.focusDistance parameter will always be 0. 3900 * Fixed focal length (android.lens.info.availableFocalLengths contains single value), 3901 which means the optical zoom is not supported. 3902 * No ND filter (android.lens.info.availableFilterDensities contains only 0). 3903 * Fixed aperture (android.lens.info.availableApertures contains single value). 3904 3905 Then this state will always be STATIONARY. 3906 3907 When the state is MOVING, it indicates that at least one of the lens parameters 3908 is changing. 
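
For example, an illustrative sketch of a CameraCaptureSession.CaptureCallback that checks
whether the lens has settled before trusting the lens parameters reported in a result:

    // Illustrative only: STATIONARY means the lens parameters in this result
    // have reached their most recently requested values.
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
            CaptureRequest request, TotalCaptureResult result) {
        Integer lensState = result.get(CaptureResult.LENS_STATE);
        if (Integer.valueOf(CameraMetadata.LENS_STATE_STATIONARY).equals(lensState)) {
            // Safe to read android.lens.focusDistance, aperture, etc. from this result.
        }
    }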
</details>
<tag id="V1" />
</entry>
<clone entry="android.lens.poseRotation" kind="static">
</clone>
<clone entry="android.lens.poseTranslation" kind="static">
</clone>
</dynamic>
<static>
<entry name="intrinsicCalibration" type="float" visibility="public"
container="array">
<array>
<size>5</size>
</array>
<description>
The parameters for this camera device's intrinsic
calibration.
</description>
<units>
Pixels in the
android.sensor.info.preCorrectionActiveArraySize
coordinate system.
</units>
<details>
The five calibration parameters that describe the
transform from camera-centric 3D coordinates to sensor
pixel coordinates:

    [f_x, f_y, c_x, c_y, s]

Where `f_x` and `f_y` are the horizontal and vertical
focal lengths, `[c_x, c_y]` is the position of the optical
axis, and `s` is a skew parameter for the sensor plane not
being aligned with the lens plane.

These are typically used within a transformation matrix K:

    K = [ f_x,   s, c_x,
            0, f_y, c_y,
            0,   0,   1 ]

which can then be combined with the camera pose rotation
`R` and translation `t` (android.lens.poseRotation and
android.lens.poseTranslation, respectively) to calculate the
complete transform from world coordinates to pixel
coordinates:

    P = [ K 0   * [ R t
          0 1 ]     0 1 ]

and with `p_w` being a point in the world coordinate system
and `p_s` being a point in the camera active pixel array
coordinate system, and with the mapping including the
homogeneous division by z:

    p_h = (x_h, y_h, z_h) = P p_w
    p_s = p_h / z_h

so `[x_s, y_s]` are the pixel coordinates of the world
point, `z_s = 1`, and `w_s` is a measurement of disparity
(depth) in pixel coordinates.

Note that the coordinate system for this transform is the
android.sensor.info.preCorrectionActiveArraySize system,
where `(0,0)` is the top-left of the
preCorrectionActiveArraySize rectangle. Once the pose and
intrinsic calibration transforms have been applied to a
world point, then the android.lens.radialDistortion
transform needs to be applied, and the result adjusted to
be in the android.sensor.info.activeArraySize coordinate
system (where `(0, 0)` is the top-left of the
activeArraySize rectangle), to determine the final pixel
coordinate of the world point for processed (non-RAW)
output buffers.
</details>
<tag id="DEPTH" />
</entry>
<entry name="radialDistortion" type="float" visibility="public"
container="array">
<array>
<size>6</size>
</array>
<description>
The correction coefficients to correct for this camera device's
radial and tangential lens distortion.
</description>
<units>
Unitless coefficients.
3997 </units> 3998 <details> 3999 Four radial distortion coefficients `[kappa_0, kappa_1, kappa_2, 4000 kappa_3]` and two tangential distortion coefficients 4001 `[kappa_4, kappa_5]` that can be used to correct the 4002 lens's geometric distortion with the mapping equations: 4003 4004 x_c = x_i * ( kappa_0 + kappa_1 * r^2 + kappa_2 * r^4 + kappa_3 * r^6 ) + 4005 kappa_4 * (2 * x_i * y_i) + kappa_5 * ( r^2 + 2 * x_i^2 ) 4006 y_c = y_i * ( kappa_0 + kappa_1 * r^2 + kappa_2 * r^4 + kappa_3 * r^6 ) + 4007 kappa_5 * (2 * x_i * y_i) + kappa_4 * ( r^2 + 2 * y_i^2 ) 4008 4009 Here, `[x_c, y_c]` are the coordinates to sample in the 4010 input image that correspond to the pixel values in the 4011 corrected image at the coordinate `[x_i, y_i]`: 4012 4013 correctedImage(x_i, y_i) = sample_at(x_c, y_c, inputImage) 4014 4015 The pixel coordinates are defined in a normalized 4016 coordinate system related to the 4017 android.lens.intrinsicCalibration calibration fields. 4018 Both `[x_i, y_i]` and `[x_c, y_c]` have `(0,0)` at the 4019 lens optical center `[c_x, c_y]`. The maximum magnitudes 4020 of both x and y coordinates are normalized to be 1 at the 4021 edge further from the optical center, so the range 4022 for both dimensions is `-1 <= x <= 1`. 4023 4024 Finally, `r` represents the radial distance from the 4025 optical center, `r^2 = x_i^2 + y_i^2`, and its magnitude 4026 is therefore no larger than `|r| <= sqrt(2)`. 4027 4028 The distortion model used is the Brown-Conrady model. 4029 </details> 4030 <tag id="DEPTH" /> 4031 </entry> 4032 </static> 4033 <dynamic> 4034 <clone entry="android.lens.intrinsicCalibration" kind="static"> 4035 </clone> 4036 <clone entry="android.lens.radialDistortion" kind="static"> 4037 </clone> 4038 </dynamic> 4039 </section> 4040 <section name="noiseReduction"> 4041 <controls> 4042 <entry name="mode" type="byte" visibility="public" enum="true" hwlevel="full"> 4043 <enum> 4044 <value>OFF 4045 <notes>No noise reduction is applied.</notes></value> 4046 <value>FAST 4047 <notes>Noise reduction is applied without reducing frame rate relative to sensor 4048 output. It may be the same as OFF if noise reduction will reduce frame rate 4049 relative to sensor.</notes></value> 4050 <value>HIGH_QUALITY 4051 <notes>High-quality noise reduction is applied, at the cost of possibly reduced frame 4052 rate relative to sensor output.</notes></value> 4053 <value optional="true">MINIMAL 4054 <notes>MINIMAL noise reduction is applied without reducing frame rate relative to 4055 sensor output. </notes></value> 4056 <value optional="true">ZERO_SHUTTER_LAG 4057 4058 <notes>Noise reduction is applied at different levels for different output streams, 4059 based on resolution. Streams at maximum recording resolution (see {@link 4060 android.hardware.camera2.CameraDevice#createCaptureSession}) or below have noise 4061 reduction applied, while higher-resolution streams have MINIMAL (if supported) or no 4062 noise reduction applied (if MINIMAL is not supported.) The degree of noise reduction 4063 for low-resolution streams is tuned so that frame rate is not impacted, and the quality 4064 is equal to or better than FAST (since it is only applied to lower-resolution outputs, 4065 quality may improve from FAST). 
4066 4067 This mode is intended to be used by applications operating in a zero-shutter-lag mode 4068 with YUV or PRIVATE reprocessing, where the application continuously captures 4069 high-resolution intermediate buffers into a circular buffer, from which a final image is 4070 produced via reprocessing when a user takes a picture. For such a use case, the 4071 high-resolution buffers must not have noise reduction applied to maximize efficiency of 4072 preview and to avoid over-applying noise filtering when reprocessing, while 4073 low-resolution buffers (used for recording or preview, generally) need noise reduction 4074 applied for reasonable preview quality. 4075 4076 This mode is guaranteed to be supported by devices that support either the 4077 YUV_REPROCESSING or PRIVATE_REPROCESSING capabilities 4078 (android.request.availableCapabilities lists either of those capabilities) and it will 4079 be the default mode for the CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG template. 4080 </notes></value> 4081 </enum> 4082 <description>Mode of operation for the noise reduction algorithm.</description> 4083 <range>android.noiseReduction.availableNoiseReductionModes</range> 4084 <details>The noise reduction algorithm attempts to improve image quality by removing 4085 excessive noise added by the capture process, especially in dark conditions. 4086 4087 OFF means no noise reduction will be applied by the camera device, in both the raw and 4088 YUV domains. 4089 4090 MINIMAL means that only sensor raw domain basic noise reduction is enabled, to remove 4091 demosaicing or other processing artifacts. For YUV_REPROCESSING, MINIMAL is the same as OFF. 4092 This mode is optional and may not be supported by all devices. The application should check 4093 android.noiseReduction.availableNoiseReductionModes before using it. 4094 4095 FAST/HIGH_QUALITY both mean camera device-determined noise filtering 4096 will be applied. HIGH_QUALITY mode indicates that the camera device 4097 will use the highest-quality noise filtering algorithms, 4098 even if it slows down capture rate. FAST means the camera device will not 4099 slow down capture rate when applying noise filtering. FAST may be the same as MINIMAL if 4100 MINIMAL is listed, or the same as OFF if any noise filtering will slow down capture rate. 4101 Every output stream will have a similar amount of enhancement applied. 4102 4103 ZERO_SHUTTER_LAG is meant to be used by applications that maintain a continuous circular 4104 buffer of high-resolution images during preview and reprocess image(s) from that buffer 4105 into a final capture when triggered by the user. In this mode, the camera device applies 4106 noise reduction to low-resolution streams (below maximum recording resolution) to maximize 4107 preview quality, but does not apply noise reduction to high-resolution streams, since 4108 those will be reprocessed later if necessary. 4109 4110 For YUV_REPROCESSING, these FAST/HIGH_QUALITY modes both mean that the camera device 4111 will apply FAST/HIGH_QUALITY YUV domain noise reduction, respectively. The camera device 4112 may adjust the noise reduction parameters for best image quality based on the 4113 android.reprocess.effectiveExposureFactor if it is set. 4114 </details> 4115 <hal_details> 4116 For YUV_REPROCESSING, the HAL can use android.reprocess.effectiveExposureFactor to 4117 adjust the internal noise reduction parameters appropriately to get the best quality 4118 images.
4119 </hal_details> 4120 <tag id="V1" /> 4121 <tag id="REPROC" /> 4122 </entry> 4123 <entry name="strength" type="byte"> 4124 <description>Control the amount of noise reduction 4125 applied to the images</description> 4126 <units>1-10; 10 is max noise reduction</units> 4127 <range>1 - 10</range> 4128 <tag id="FUTURE" /> 4129 </entry> 4130 </controls> 4131 <static> 4132 <entry name="availableNoiseReductionModes" type="byte" visibility="public" 4133 type_notes="list of enums" container="array" typedef="enumList" hwlevel="limited"> 4134 <array> 4135 <size>n</size> 4136 </array> 4137 <description> 4138 List of noise reduction modes for android.noiseReduction.mode that are supported 4139 by this camera device. 4140 </description> 4141 <range>Any value listed in android.noiseReduction.mode</range> 4142 <details> 4143 Full-capability camera devices will always support OFF and FAST. 4144 4145 Camera devices that support YUV_REPROCESSING or PRIVATE_REPROCESSING will support 4146 ZERO_SHUTTER_LAG. 4147 4148 Legacy-capability camera devices will only support FAST mode. 4149 </details> 4150 <hal_details> 4151 HAL must support both FAST and HIGH_QUALITY if noise reduction control is available 4152 on the camera device, but the underlying implementation can be the same for both modes. 4153 That is, if the highest quality implementation on the camera device does not slow down 4154 capture rate, then FAST and HIGH_QUALITY will generate the same output. 4155 </hal_details> 4156 <tag id="V1" /> 4157 <tag id="REPROC" /> 4158 </entry> 4159 </static> 4160 <dynamic> 4161 <clone entry="android.noiseReduction.mode" kind="controls"> 4162 <tag id="V1" /> 4163 <tag id="REPROC" /> 4164 </clone> 4165 </dynamic> 4166 </section> 4167 <section name="quirks"> 4168 <static> 4169 <entry name="meteringCropRegion" type="byte" visibility="system" deprecated="true" optional="true"> 4170 <description>If set to 1, the camera service does not 4171 scale 'normalized' coordinates with respect to the crop 4172 region. This applies to metering input (a{e,f,wb}Region 4173 and output (face rectangles).</description> 4174 <details>Normalized coordinates refer to those in the 4175 (-1000,1000) range mentioned in the 4176 android.hardware.Camera API. 4177 4178 HAL implementations should instead always use and emit 4179 sensor array-relative coordinates for all region data. Does 4180 not need to be listed in static metadata. Support will be 4181 removed in future versions of camera service.</details> 4182 </entry> 4183 <entry name="triggerAfWithAuto" type="byte" visibility="system" deprecated="true" optional="true"> 4184 <description>If set to 1, then the camera service always 4185 switches to FOCUS_MODE_AUTO before issuing a AF 4186 trigger.</description> 4187 <details>HAL implementations should implement AF trigger 4188 modes for AUTO, MACRO, CONTINUOUS_FOCUS, and 4189 CONTINUOUS_PICTURE modes instead of using this flag. Does 4190 not need to be listed in static metadata. Support will be 4191 removed in future versions of camera service</details> 4192 </entry> 4193 <entry name="useZslFormat" type="byte" visibility="system" deprecated="true" optional="true"> 4194 <description>If set to 1, the camera service uses 4195 CAMERA2_PIXEL_FORMAT_ZSL instead of 4196 HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED for the zero 4197 shutter lag stream</description> 4198 <details>HAL implementations should use gralloc usage flags 4199 to determine that a stream will be used for 4200 zero-shutter-lag, instead of relying on an explicit 4201 format setting. 
Does not need to be listed in static 4202 metadata. Support will be removed in future versions of 4203 camera service.</details> 4204 </entry> 4205 <entry name="usePartialResult" type="byte" visibility="hidden" deprecated="true" optional="true"> 4206 <description> 4207 If set to 1, the HAL will always split result 4208 metadata for a single capture into multiple buffers, 4209 returned using multiple process_capture_result calls. 4210 </description> 4211 <details> 4212 Does not need to be listed in static 4213 metadata. Support for partial results will be reworked in 4214 future versions of camera service. This quirk will stop 4215 working at that point; DO NOT USE without careful 4216 consideration of future support. 4217 </details> 4218 <hal_details> 4219 Refer to `camera3_capture_result::partial_result` 4220 for information on how to implement partial results. 4221 </hal_details> 4222 </entry> 4223 </static> 4224 <dynamic> 4225 <entry name="partialResult" type="byte" visibility="hidden" deprecated="true" optional="true" enum="true" typedef="boolean"> 4226 <enum> 4227 <value>FINAL 4228 <notes>The last or only metadata result buffer 4229 for this capture.</notes> 4230 </value> 4231 <value>PARTIAL 4232 <notes>A partial buffer of result metadata for this 4233 capture. More result buffers for this capture will be sent 4234 by the camera device, the last of which will be marked 4235 FINAL.</notes> 4236 </value> 4237 </enum> 4238 <description> 4239 Whether a result given to the framework is the 4240 final one for the capture, or only a partial that contains a 4241 subset of the full set of dynamic metadata 4242 values.</description> 4243 <range>Optional. Default value is FINAL.</range> 4244 <details> 4245 The entries in the result metadata buffers for a 4246 single capture may not overlap, except for this entry. The 4247 FINAL buffers must retain FIFO ordering relative to the 4248 requests that generate them, so the FINAL buffer for frame 3 must 4249 always be sent to the framework after the FINAL buffer for frame 2, and 4250 before the FINAL buffer for frame 4. PARTIAL buffers may be returned 4251 in any order relative to other frames, but all PARTIAL buffers for a given 4252 capture must arrive before the FINAL buffer for that capture. This entry may 4253 only be used by the camera device if quirks.usePartialResult is set to 1. 4254 </details> 4255 <hal_details> 4256 Refer to `camera3_capture_result::partial_result` 4257 for information on how to implement partial results. 4258 </hal_details> 4259 </entry> 4260 </dynamic> 4261 </section> 4262 <section name="request"> 4263 <controls> 4264 <entry name="frameCount" type="int32" visibility="system" deprecated="true"> 4265 <description>A frame counter set by the framework. Must 4266 be maintained unchanged in output frame. This value monotonically 4267 increases with every new result (that is, each new result has a unique 4268 frameCount value). 4269 </description> 4270 <units>incrementing integer</units> 4271 <range>Any int.</range> 4272 </entry> 4273 <entry name="id" type="int32" visibility="hidden"> 4274 <description>An application-specified ID for the current 4275 request. 
Must be maintained unchanged in output 4276 frame</description> 4277 <units>arbitrary integer assigned by application</units> 4278 <range>Any int</range> 4279 <tag id="V1" /> 4280 </entry> 4281 <entry name="inputStreams" type="int32" visibility="system" deprecated="true" 4282 container="array"> 4283 <array> 4284 <size>n</size> 4285 </array> 4286 <description>List which camera reprocess stream is used 4287 for the source of reprocessing data.</description> 4288 <units>List of camera reprocess stream IDs</units> 4289 <range> 4290 Typically, only one entry allowed, must be a valid reprocess stream ID. 4291 </range> 4292 <details>Only meaningful when android.request.type == 4293 REPROCESS. Ignored otherwise</details> 4294 <tag id="HAL2" /> 4295 </entry> 4296 <entry name="metadataMode" type="byte" visibility="system" 4297 enum="true"> 4298 <enum> 4299 <value>NONE 4300 <notes>No metadata should be produced on output, except 4301 for application-bound buffer data. If no 4302 application-bound streams exist, no frame should be 4303 placed in the output frame queue. If such streams 4304 exist, a frame should be placed on the output queue 4305 with null metadata but with the necessary output buffer 4306 information. Timestamp information should still be 4307 included with any output stream buffers</notes></value> 4308 <value>FULL 4309 <notes>All metadata should be produced. Statistics will 4310 only be produced if they are separately 4311 enabled</notes></value> 4312 </enum> 4313 <description>How much metadata to produce on 4314 output</description> 4315 <tag id="FUTURE" /> 4316 </entry> 4317 <entry name="outputStreams" type="int32" visibility="system" deprecated="true" 4318 container="array"> 4319 <array> 4320 <size>n</size> 4321 </array> 4322 <description>Lists which camera output streams image data 4323 from this capture must be sent to</description> 4324 <units>List of camera stream IDs</units> 4325 <range>List must only include streams that have been 4326 created</range> 4327 <details>If no output streams are listed, then the image 4328 data should simply be discarded. The image data must 4329 still be captured for metadata and statistics production, 4330 and the lens and flash must operate as requested.</details> 4331 <tag id="HAL2" /> 4332 </entry> 4333 <entry name="type" type="byte" visibility="system" deprecated="true" enum="true"> 4334 <enum> 4335 <value>CAPTURE 4336 <notes>Capture a new image from the imaging hardware, 4337 and process it according to the 4338 settings</notes></value> 4339 <value>REPROCESS 4340 <notes>Process previously captured data; the 4341 android.request.inputStreams parameter determines the 4342 source reprocessing stream. TODO: Mark dynamic metadata 4343 needed for reprocessing with [RP]</notes></value> 4344 </enum> 4345 <description>The type of the request; either CAPTURE or 4346 REPROCESS. For HAL3, this tag is redundant. 4347 </description> 4348 <tag id="HAL2" /> 4349 </entry> 4350 </controls> 4351 <static> 4352 <entry name="maxNumOutputStreams" type="int32" visibility="ndk_public" 4353 container="array" hwlevel="legacy"> 4354 <array> 4355 <size>3</size> 4356 </array> 4357 <description>The maximum numbers of different types of output streams 4358 that can be configured and used simultaneously by a camera device. 4359 </description> 4360 <range> 4361 For processed (and stalling) format streams, &gt;= 1. 4362 4363 For Raw format (either stalling or non-stalling) streams, &gt;= 0. 
4364 4365 For processed (but not stalling) format streams, &gt;= 3 4366 for FULL mode devices (`android.info.supportedHardwareLevel == FULL`); 4367 &gt;= 2 for LIMITED mode devices (`android.info.supportedHardwareLevel == LIMITED`). 4368 </range> 4369 <details> 4370 This is a 3-element tuple that contains the max number of output simultaneous 4371 streams for raw sensor, processed (but not stalling), and processed (and stalling) 4372 formats respectively. For example, assuming that JPEG is typically a processed and 4373 stalling stream, if the max raw sensor format output stream number is 1, the max YUV streams 4374 number is 3, and the max JPEG stream number is 2, then this tuple should be `(1, 3, 2)`. 4375 4376 This lists the upper bound of the number of output streams supported by 4377 the camera device. Using more streams simultaneously may require more hardware and 4378 CPU resources that will consume more power. The image format for an output stream can 4379 be any supported format provided by android.scaler.availableStreamConfigurations. 4380 The formats defined in android.scaler.availableStreamConfigurations can be categorized 4381 into the 3 stream types below: 4382 4383 * Processed (and stalling): any non-RAW format with a stallDurations &gt; 0. 4384 Typically {@link android.graphics.ImageFormat#JPEG JPEG format}. 4385 * Raw formats: {@link android.graphics.ImageFormat#RAW_SENSOR RAW_SENSOR}, {@link 4386 android.graphics.ImageFormat#RAW10 RAW10}, or {@link android.graphics.ImageFormat#RAW12 4387 RAW12}. 4388 * Processed (but not-stalling): any non-RAW format without a stall duration. 4389 Typically {@link android.graphics.ImageFormat#YUV_420_888 YUV_420_888}, 4390 {@link android.graphics.ImageFormat#NV21 NV21}, or 4391 {@link android.graphics.ImageFormat#YV12 YV12}. 4392 </details> 4393 <tag id="BC" /> 4394 </entry> 4395 <entry name="maxNumOutputRaw" type="int32" visibility="java_public" synthetic="true" 4396 hwlevel="legacy"> 4397 <description>The maximum numbers of different types of output streams 4398 that can be configured and used simultaneously by a camera device 4399 for any `RAW` formats. 4400 </description> 4401 <range> 4402 &gt;= 0 4403 </range> 4404 <details> 4405 This value contains the max number of output simultaneous 4406 streams from the raw sensor. 4407 4408 This lists the upper bound of the number of output streams supported by 4409 the camera device. Using more streams simultaneously may require more hardware and 4410 CPU resources that will consume more power. The image format for this kind of output stream can 4411 be any supported `RAW` format provided by android.scaler.streamConfigurationMap. 4412 4413 In particular, a `RAW` format is typically one of: 4414 4415 * {@link android.graphics.ImageFormat#RAW_SENSOR RAW_SENSOR} 4416 * {@link android.graphics.ImageFormat#RAW10 RAW10} 4417 * {@link android.graphics.ImageFormat#RAW12 RAW12} 4418 4419 LEGACY mode devices (android.info.supportedHardwareLevel `==` LEGACY) 4420 never support raw streams. 4421 </details> 4422 </entry> 4423 <entry name="maxNumOutputProc" type="int32" visibility="java_public" synthetic="true" 4424 hwlevel="legacy"> 4425 <description>The maximum numbers of different types of output streams 4426 that can be configured and used simultaneously by a camera device 4427 for any processed (but not-stalling) formats.
4428 </description> 4429 <range> 4430 &gt;= 3 4431 for FULL mode devices (`android.info.supportedHardwareLevel == FULL`); 4432 &gt;= 2 for LIMITED mode devices (`android.info.supportedHardwareLevel == LIMITED`). 4433 </range> 4434 <details> 4435 This value contains the max number of output simultaneous 4436 streams for any processed (but not-stalling) formats. 4437 4438 This lists the upper bound of the number of output streams supported by 4439 the camera device. Using more streams simultaneously may require more hardware and 4440 CPU resources that will consume more power. The image format for this kind of output stream can 4441 be any supported non-`RAW` format provided by android.scaler.streamConfigurationMap. 4442 4443 Processed (but not-stalling) is defined as any non-RAW format without a stall duration. 4444 Typically: 4445 4446 * {@link android.graphics.ImageFormat#YUV_420_888 YUV_420_888} 4447 * {@link android.graphics.ImageFormat#NV21 NV21} 4448 * {@link android.graphics.ImageFormat#YV12 YV12} 4449 * Implementation-defined formats, i.e. {@link 4450 android.hardware.camera2.params.StreamConfigurationMap#isOutputSupportedFor(Class)} 4451 4452 For full guarantees, query {@link 4453 android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration} with a 4454 processed format -- it will return 0 for a non-stalling stream. 4455 4456 LEGACY devices will support at least 2 processing/non-stalling streams. 4457 </details> 4458 </entry> 4459 <entry name="maxNumOutputProcStalling" type="int32" visibility="java_public" synthetic="true" 4460 hwlevel="legacy"> 4461 <description>The maximum numbers of different types of output streams 4462 that can be configured and used simultaneously by a camera device 4463 for any processed (and stalling) formats. 4464 </description> 4465 <range> 4466 &gt;= 1 4467 </range> 4468 <details> 4469 This value contains the max number of output simultaneous 4470 streams for any processed (and stalling) formats. 4471 4472 This lists the upper bound of the number of output streams supported by 4473 the camera device. Using more streams simultaneously may require more hardware and 4474 CPU resources that will consume more power. The image format for this kind of output stream can 4475 be any supported non-`RAW` format provided by android.scaler.streamConfigurationMap. 4476 4477 A processed and stalling format is defined as any non-RAW format with a stallDurations 4478 &gt; 0. Typically only the {@link android.graphics.ImageFormat#JPEG JPEG format} is a 4479 stalling format. 4480 4481 For full guarantees, query {@link 4482 android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration} with a 4483 processed format -- it will return a non-0 value for a stalling stream. 4484 4485 LEGACY devices will support up to 1 processing/stalling stream. 4486 </details> 4487 </entry> 4488 <entry name="maxNumReprocessStreams" type="int32" visibility="system" 4489 deprecated="true" container="array"> 4490 <array> 4491 <size>1</size> 4492 </array> 4493 <description>How many reprocessing streams of any type 4494 can be allocated at the same time.</description> 4495 <range>&gt;= 0</range> 4496 <details> 4497 Only used by HAL2.x. 4498 4499 When set to 0, it means no reprocess stream is supported.
4500 </details> 4501 <tag id="HAL2" /> 4502 </entry> 4503 <entry name="maxNumInputStreams" type="int32" visibility="public" hwlevel="full"> 4504 <description> 4505 The maximum numbers of any type of input streams 4506 that can be configured and used simultaneously by a camera device. 4507 </description> 4508 <range> 4509 0 or 1. 4510 </range> 4511 <details>When set to 0, it means no input stream is supported. 4512 4513 The image format for an input stream can be any supported format returned by {@link 4514 android.hardware.camera2.params.StreamConfigurationMap#getInputFormats}. When using an 4515 input stream, there must be at least one output stream configured to receive the 4516 reprocessed images. 4517 4518 When an input stream and some output streams are used in a reprocessing request, 4519 only the input buffer will be used to produce these output stream buffers, and a 4520 new sensor image will not be captured. 4521 4522 For example, for the Zero Shutter Lag (ZSL) still capture use case, the input 4523 stream image format will be PRIVATE and the associated output stream image format 4524 should be JPEG. 4525 </details> 4526 <hal_details> 4527 For the reprocessing flow and controls, see 4528 hardware/libhardware/include/hardware/camera3.h Section 10 for more details. 4529 </hal_details> 4530 <tag id="REPROC" /> 4531 </entry> 4532 </static> 4533 <dynamic> 4534 <entry name="frameCount" type="int32" visibility="hidden" deprecated="true"> 4535 <description>A frame counter set by the framework. This value monotonically 4536 increases with every new result (that is, each new result has a unique 4537 frameCount value).</description> 4538 <units>count of frames</units> 4539 <range>&gt; 0</range> 4540 <details>Reset on release()</details> 4541 </entry> 4542 <clone entry="android.request.id" kind="controls"></clone> 4543 <clone entry="android.request.metadataMode" 4544 kind="controls"></clone> 4545 <clone entry="android.request.outputStreams" 4546 kind="controls"></clone> 4547 <entry name="pipelineDepth" type="byte" visibility="public" hwlevel="legacy"> 4548 <description>Specifies the number of pipeline stages the frame went 4549 through from when it was exposed to when the final completed result 4550 was available to the framework.</description> 4551 <range>&lt;= android.request.pipelineMaxDepth</range> 4552 <details>Depending on what settings are used in the request, and 4553 what streams are configured, the data may undergo less processing, 4554 and some pipeline stages may be skipped. 4555 4556 See android.request.pipelineMaxDepth for more details. 4557 </details> 4558 <hal_details> 4559 This value must always represent the accurate count of how many 4560 pipeline stages were actually used. 4561 </hal_details> 4562 </entry> 4563 </dynamic> 4564 <static> 4565 <entry name="pipelineMaxDepth" type="byte" visibility="public" hwlevel="legacy"> 4566 <description>Specifies the maximum number of pipeline stages a frame 4567 has to go through from when it's exposed to when it's available 4568 to the framework.</description> 4569 <details>A typical minimum value for this is 2 (one stage to expose, 4570 one stage to read out) from the sensor. The ISP then usually adds 4571 its own stages to do custom HW processing. Further stages may be 4572 added by SW processing. 4573 4574 Depending on what settings are used (e.g. YUV, JPEG) and what 4575 processing is enabled (e.g. face detection), the actual pipeline 4576 depth (specified by android.request.pipelineDepth) may be less than 4577 the max pipeline depth.
4578 4579 A pipeline depth of X stages is equivalent to a pipeline latency of 4580 X frame intervals. 4581 4582 This value will normally be 8 or less; however, for a high speed capture session, 4583 the max pipeline depth will be up to 8 x the size of the high speed capture request list. 4584 </details> 4585 <hal_details> 4586 This value should be 4 or less, except for high speed recording sessions, where the 4587 max batch sizes may be larger than 1. 4588 </hal_details> 4589 </entry> 4590 <entry name="partialResultCount" type="int32" visibility="public" optional="true"> 4591 <description>Defines how many sub-components 4592 a result will be composed of. 4593 </description> 4594 <range>&gt;= 1</range> 4595 <details>In order to combat the pipeline latency, partial results 4596 may be delivered to the application layer from the camera device as 4597 soon as they are available. 4598 4599 Optional; defaults to 1. A value of 1 means that partial 4600 results are not supported, and only the final TotalCaptureResult will 4601 be produced by the camera device. 4602 4603 A typical use case for this might be: after requesting an 4604 auto-focus (AF) lock, the new AF state might be available 50% 4605 of the way through the pipeline. The camera device could 4606 then immediately dispatch this state via a partial result to 4607 the application, and the rest of the metadata via later 4608 partial results. 4609 </details> 4610 </entry> 4611 <entry name="availableCapabilities" type="byte" visibility="public" 4612 enum="true" container="array" hwlevel="legacy"> 4613 <array> 4614 <size>n</size> 4615 </array> 4616 <enum> 4617 <value>BACKWARD_COMPATIBLE 4618 <notes>The minimal set of capabilities that every camera 4619 device (regardless of android.info.supportedHardwareLevel) 4620 supports. 4621 4622 This capability is listed by all normal devices, and 4623 indicates that the camera device has a feature set 4624 that's comparable to the baseline requirements for the 4625 older android.hardware.Camera API. 4626 4627 Devices with the DEPTH_OUTPUT capability might not list this 4628 capability, indicating that they support only depth measurement, 4629 not standard color output. 4630 </notes> 4631 </value> 4632 <value optional="true">MANUAL_SENSOR 4633 <notes> 4634 The camera device can be manually controlled (3A algorithms such 4635 as auto-exposure and auto-focus can be bypassed). 4636 The camera device supports basic manual control of the sensor image 4637 acquisition-related stages. This means the following controls are 4638 guaranteed to be supported: 4639 4640 * Manual frame duration control 4641 * android.sensor.frameDuration 4642 * android.sensor.info.maxFrameDuration 4643 * Manual exposure control 4644 * android.sensor.exposureTime 4645 * android.sensor.info.exposureTimeRange 4646 * Manual sensitivity control 4647 * android.sensor.sensitivity 4648 * android.sensor.info.sensitivityRange 4649 * Manual lens control (if the lens is adjustable) 4650 * android.lens.* 4651 * Manual flash control (if a flash unit is present) 4652 * android.flash.* 4653 * Manual black level locking 4654 * android.blackLevel.lock 4655 * Auto exposure lock 4656 * android.control.aeLock 4657 4658 If any of the above 3A algorithms are enabled, then the camera 4659 device will accurately report the values applied by 3A in the 4660 result. 4661 4662 A given camera device may also support additional manual sensor controls, 4663 but this capability only covers the above list of controls.
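As a non-normative illustration, when this capability is listed an application can disable auto-exposure and supply the sensor settings directly in a request. The fragment below is a minimal sketch using the public camera2 Java API; the `device`, `surface`, and `session` objects and the numeric values are assumptions for the example, and exception handling is omitted:

    // Manual exposure, sensitivity and frame duration (MANUAL_SENSOR advertised).
    CaptureRequest.Builder builder =
        device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
    builder.addTarget(surface);
    builder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF);
    builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 10_000_000L);  // 10 ms, in nanoseconds
    builder.set(CaptureRequest.SENSOR_SENSITIVITY, 200);            // ISO 200 (example value)
    builder.set(CaptureRequest.SENSOR_FRAME_DURATION, 33_333_333L); // ~30 fps, in nanoseconds
    session.capture(builder.build(), /*listener*/ null, /*handler*/ null);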
4664 4665 If this is supported, android.scaler.streamConfigurationMap will 4666 additionally return a min frame duration that is greater than 4667 zero for each supported size-format combination. 4668 </notes> 4669 </value> 4670 <value optional="true">MANUAL_POST_PROCESSING 4671 <notes> 4672 The camera device post-processing stages can be manually controlled. 4673 The camera device supports basic manual control of the image post-processing 4674 stages. This means the following controls are guaranteed to be supported: 4675 4676 * Manual tonemap control 4677 * android.tonemap.curve 4678 * android.tonemap.mode 4679 * android.tonemap.maxCurvePoints 4680 * android.tonemap.gamma 4681 * android.tonemap.presetCurve 4682 4683 * Manual white balance control 4684 * android.colorCorrection.transform 4685 * android.colorCorrection.gains 4686 * Manual lens shading map control 4687 * android.shading.mode 4688 * android.statistics.lensShadingMapMode 4689 * android.statistics.lensShadingMap 4690 * android.lens.info.shadingMapSize 4691 * Manual aberration correction control (if aberration correction is supported) 4692 * android.colorCorrection.aberrationMode 4693 * android.colorCorrection.availableAberrationModes 4694 * Auto white balance lock 4695 * android.control.awbLock 4696 4697 If auto white balance is enabled, then the camera device 4698 will accurately report the values applied by AWB in the result. 4699 4700 A given camera device may also support additional post-processing 4701 controls, but this capability only covers the above list of controls. 4702 </notes> 4703 </value> 4704 <value optional="true">RAW 4705 <notes> 4706 The camera device supports outputting RAW buffers and 4707 metadata for interpreting them. 4708 4709 Devices supporting the RAW capability allow both for 4710 saving DNG files, and for direct application processing of 4711 raw sensor images. 4712 4713 * RAW_SENSOR is supported as an output format. 4714 * The maximum available resolution for RAW_SENSOR streams 4715 will match either the value in 4716 android.sensor.info.pixelArraySize or 4717 android.sensor.info.preCorrectionActiveArraySize. 4718 * All DNG-related optional metadata entries are provided 4719 by the camera device. 4720 </notes> 4721 </value> 4722 <value optional="true" ndk_hidden="true">PRIVATE_REPROCESSING 4723 <notes> 4724 The camera device supports the Zero Shutter Lag reprocessing use case. 4725 4726 * One input stream is supported, that is, `android.request.maxNumInputStreams == 1`. 4727 * {@link android.graphics.ImageFormat#PRIVATE} is supported as an output/input format, 4728 that is, {@link android.graphics.ImageFormat#PRIVATE} is included in the lists of 4729 formats returned by {@link 4730 android.hardware.camera2.params.StreamConfigurationMap#getInputFormats} and {@link 4731 android.hardware.camera2.params.StreamConfigurationMap#getOutputFormats}. 4732 * {@link android.hardware.camera2.params.StreamConfigurationMap#getValidOutputFormatsForInput} 4733 returns non empty int[] for each supported input format returned by {@link 4734 android.hardware.camera2.params.StreamConfigurationMap#getInputFormats}. 
4735 * Each size returned by {@link 4736 android.hardware.camera2.params.StreamConfigurationMap#getInputSizes 4737 getInputSizes(ImageFormat.PRIVATE)} is also included in {@link 4738 android.hardware.camera2.params.StreamConfigurationMap#getOutputSizes 4739 getOutputSizes(ImageFormat.PRIVATE)} 4740 * Using {@link android.graphics.ImageFormat#PRIVATE} does not cause a frame rate drop 4741 relative to the sensor's maximum capture rate (at that resolution). 4742 * {@link android.graphics.ImageFormat#PRIVATE} will be reprocessable into both 4743 {@link android.graphics.ImageFormat#YUV_420_888} and 4744 {@link android.graphics.ImageFormat#JPEG} formats. 4745 * The maximum available resolution for PRIVATE streams 4746 (both input/output) will match the maximum available 4747 resolution of JPEG streams. 4748 * Static metadata android.reprocess.maxCaptureStall. 4749 * Only below controls are effective for reprocessing requests and 4750 will be present in capture results, other controls in reprocess 4751 requests will be ignored by the camera device. 4752 * android.jpeg.* 4753 * android.noiseReduction.mode 4754 * android.edge.mode 4755 * android.noiseReduction.availableNoiseReductionModes and 4756 android.edge.availableEdgeModes will both list ZERO_SHUTTER_LAG as a supported mode. 4757 </notes> 4758 </value> 4759 <value optional="true">READ_SENSOR_SETTINGS 4760 <notes> 4761 The camera device supports accurately reporting the sensor settings for many of 4762 the sensor controls while the built-in 3A algorithm is running. This allows 4763 reporting of sensor settings even when these settings cannot be manually changed. 4764 4765 The values reported for the following controls are guaranteed to be available 4766 in the CaptureResult, including when 3A is enabled: 4767 4768 * Exposure control 4769 * android.sensor.exposureTime 4770 * Sensitivity control 4771 * android.sensor.sensitivity 4772 * Lens controls (if the lens is adjustable) 4773 * android.lens.focusDistance 4774 * android.lens.aperture 4775 4776 This capability is a subset of the MANUAL_SENSOR control capability, and will 4777 always be included if the MANUAL_SENSOR capability is available. 4778 </notes> 4779 </value> 4780 <value optional="true">BURST_CAPTURE 4781 <notes> 4782 The camera device supports capturing high-resolution images at >= 20 frames per 4783 second, in at least the uncompressed YUV format, when post-processing settings are set 4784 to FAST. Additionally, maximum-resolution images can be captured at >= 10 frames 4785 per second. Here, 'high resolution' means at least 8 megapixels, or the maximum 4786 resolution of the device, whichever is smaller. 4787 4788 More specifically, this means that a size matching the camera device's active array 4789 size is listed as a supported size for the {@link 4790 android.graphics.ImageFormat#YUV_420_888} format in either {@link 4791 android.hardware.camera2.params.StreamConfigurationMap#getOutputSizes} or {@link 4792 android.hardware.camera2.params.StreamConfigurationMap#getHighResolutionOutputSizes}, 4793 with a minimum frame duration for that format and size of either <= 1/20 s, or 4794 <= 1/10 s, respectively; and the android.control.aeAvailableTargetFpsRanges entry 4795 lists at least one FPS range where the minimum FPS is >= 1 / minimumFrameDuration 4796 for the maximum-size YUV_420_888 format. 
If that maximum size is listed in {@link 4797 android.hardware.camera2.params.StreamConfigurationMap#getHighResolutionOutputSizes}, 4798 then the list of resolutions for YUV_420_888 from {@link 4799 android.hardware.camera2.params.StreamConfigurationMap#getOutputSizes} contains at 4800 least one resolution >= 8 megapixels, with a minimum frame duration of <= 1/20 4801 s. 4802 4803 If the device supports the {@link android.graphics.ImageFormat#RAW10} or {@link 4804 android.graphics.ImageFormat#RAW12} formats, then those can also be captured at the same rate 4805 as the maximum-size YUV_420_888 resolution. 4806 4807 If the device supports the PRIVATE_REPROCESSING capability, then the same guarantees 4808 as for the YUV_420_888 format also apply to the {@link 4809 android.graphics.ImageFormat#PRIVATE} format. 4810 4811 In addition, the android.sync.maxLatency field is guaranteed to have a value between 0 4812 and 4, inclusive. android.control.aeLockAvailable and android.control.awbLockAvailable 4813 are also guaranteed to be `true` so burst capture with these two locks ON yields 4814 consistent image output. 4815 </notes> 4816 </value> 4817 <value optional="true" ndk_hidden="true">YUV_REPROCESSING 4818 <notes> 4819 The camera device supports the YUV_420_888 reprocessing use case, similar to 4820 PRIVATE_REPROCESSING. This capability requires the camera device to support the 4821 following: 4822 4823 * One input stream is supported, that is, `android.request.maxNumInputStreams == 1`. 4824 * {@link android.graphics.ImageFormat#YUV_420_888} is supported as an output/input format, that is, 4825 YUV_420_888 is included in the lists of formats returned by 4826 {@link android.hardware.camera2.params.StreamConfigurationMap#getInputFormats} and 4827 {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputFormats}. 4828 * {@link 4829 android.hardware.camera2.params.StreamConfigurationMap#getValidOutputFormatsForInput} 4830 returns a non-empty int[] for each supported input format returned by {@link 4831 android.hardware.camera2.params.StreamConfigurationMap#getInputFormats}. 4832 * Each size returned by {@link 4833 android.hardware.camera2.params.StreamConfigurationMap#getInputSizes 4834 getInputSizes(YUV_420_888)} is also included in {@link 4835 android.hardware.camera2.params.StreamConfigurationMap#getOutputSizes 4836 getOutputSizes(YUV_420_888)} 4837 * Using {@link android.graphics.ImageFormat#YUV_420_888} does not cause a frame rate drop 4838 relative to the sensor's maximum capture rate (at that resolution). 4839 * {@link android.graphics.ImageFormat#YUV_420_888} will be reprocessable into both 4840 {@link android.graphics.ImageFormat#YUV_420_888} and {@link 4841 android.graphics.ImageFormat#JPEG} formats. 4842 * The maximum available resolution for {@link 4843 android.graphics.ImageFormat#YUV_420_888} streams (both input/output) will match the 4844 maximum available resolution of {@link android.graphics.ImageFormat#JPEG} streams. 4845 * Static metadata android.reprocess.maxCaptureStall is listed. 4846 * Only the below controls are effective for reprocessing requests and will be present 4847 in capture results. The reprocess requests are from the original capture results that 4848 are associated with the intermediate {@link android.graphics.ImageFormat#YUV_420_888} 4849 output buffers. All other controls in the reprocess requests will be ignored by the 4850 camera device.
4851 * android.jpeg.* 4852 * android.noiseReduction.mode 4853 * android.edge.mode 4854 * android.reprocess.effectiveExposureFactor 4855 * android.noiseReduction.availableNoiseReductionModes and 4856 android.edge.availableEdgeModes will both list ZERO_SHUTTER_LAG as a supported mode. 4857 </notes> 4858 </value> 4859 <value optional="true">DEPTH_OUTPUT 4860 <notes> 4861 The camera device can produce depth measurements from its field of view. 4862 4863 This capability requires the camera device to support the following: 4864 4865 * {@link android.graphics.ImageFormat#DEPTH16} is supported as an output format. 4866 * {@link android.graphics.ImageFormat#DEPTH_POINT_CLOUD} is optionally supported as an 4867 output format. 4868 * This camera device, and all camera devices with the same android.lens.facing, 4869 will list the following calibration entries in both 4870 {@link android.hardware.camera2.CameraCharacteristics} and 4871 {@link android.hardware.camera2.CaptureResult}: 4872 - android.lens.poseTranslation 4873 - android.lens.poseRotation 4874 - android.lens.intrinsicCalibration 4875 - android.lens.radialDistortion 4876 * The android.depth.depthIsExclusive entry is listed by this device. 4877 * A LIMITED camera with only the DEPTH_OUTPUT capability does not have to support 4878 normal YUV_420_888, JPEG, and PRIV-format outputs. It only has to support the DEPTH16 4879 format. 4880 4881 Generally, depth output operates at a slower frame rate than standard color capture, 4882 so the DEPTH16 and DEPTH_POINT_CLOUD formats will commonly have a stall duration that 4883 should be accounted for (see 4884 {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration}). 4885 On a device that supports both depth and color-based output, to enable smooth preview, 4886 using a repeating burst is recommended, where a depth-output target is only included 4887 once every N frames, where N is the ratio between preview output rate and depth output 4888 rate, including depth stall time. 4889 </notes> 4890 </value> 4891 <value optional="true" ndk_hidden="true">CONSTRAINED_HIGH_SPEED_VIDEO 4892 <notes> 4893 The device supports constrained high speed video recording (frame rate >=120fps) 4894 use case. The camera device will support high speed capture session created by 4895 {@link android.hardware.camera2.CameraDevice#createConstrainedHighSpeedCaptureSession}, which 4896 only accepts high speed request lists created by 4897 {@link android.hardware.camera2.CameraConstrainedHighSpeedCaptureSession#createHighSpeedRequestList}. 4898 4899 A camera device can still support high speed video streaming by advertising the high speed 4900 FPS ranges in android.control.aeAvailableTargetFpsRanges. For this case, all normal 4901 capture request per frame control and synchronization requirements will apply to 4902 the high speed fps ranges, the same as all other fps ranges. This capability describes 4903 the capability of a specialized operating mode with many limitations (see below), which 4904 is only targeted at high speed video recording. 4905 4906 The supported high speed video sizes and fps ranges are specified in 4907 {@link android.hardware.camera2.params.StreamConfigurationMap#getHighSpeedVideoFpsRanges}. 4908 To get desired output frame rates, the application is only allowed to select video size 4909 and FPS range combinations provided by 4910 {@link android.hardware.camera2.params.StreamConfigurationMap#getHighSpeedVideoSizes}. 
4911 The FPS range can be controlled via android.control.aeTargetFpsRange. 4912 4913 In this capability, the camera device will override aeMode, awbMode, and afMode to 4914 ON, AUTO, and CONTINUOUS_VIDEO, respectively. All post-processing block mode 4915 controls will be overridden to be FAST. Therefore, no manual control of capture 4916 and post-processing parameters is possible. All other controls operate the 4917 same as when android.control.mode == AUTO. This means that all other 4918 android.control.* fields continue to work, such as 4919 4920 * android.control.aeTargetFpsRange 4921 * android.control.aeExposureCompensation 4922 * android.control.aeLock 4923 * android.control.awbLock 4924 * android.control.effectMode 4925 * android.control.aeRegions 4926 * android.control.afRegions 4927 * android.control.awbRegions 4928 * android.control.afTrigger 4929 * android.control.aePrecaptureTrigger 4930 4931 Outside of android.control.*, the following controls will work: 4932 4933 * android.flash.mode (TORCH mode only, automatic flash for still capture will not 4934 work since aeMode is ON) 4935 * android.lens.opticalStabilizationMode (if it is supported) 4936 * android.scaler.cropRegion 4937 * android.statistics.faceDetectMode (if it is supported) 4938 4939 For the high speed recording use case, the actual maximum supported frame rate may 4940 be lower than what the camera can output, depending on the destination Surfaces for 4941 the image data. For example, if the destination surface is from a video encoder, 4942 the application needs to check whether the video encoder is capable of supporting the 4943 high frame rate for a given video size, or it will end up with a lower recording 4944 frame rate. If the destination surface is from a preview window, the actual preview frame 4945 rate will be bounded by the screen refresh rate. 4946 4947 The camera device will only support up to 2 high speed simultaneous output surfaces 4948 (preview and recording surfaces) 4949 in this mode. The above controls will be effective only if all of the below conditions are true: 4950 4951 * The application creates a camera capture session with no more than 2 surfaces via 4952 {@link android.hardware.camera2.CameraDevice#createConstrainedHighSpeedCaptureSession}. The 4953 targeted surfaces must be a preview surface (either from 4954 {@link android.view.SurfaceView} or {@link android.graphics.SurfaceTexture}) or 4955 a recording surface (either from {@link android.media.MediaRecorder#getSurface} or 4956 {@link android.media.MediaCodec#createInputSurface}). 4957 * The stream sizes are selected from the sizes reported by 4958 {@link android.hardware.camera2.params.StreamConfigurationMap#getHighSpeedVideoSizes}. 4959 * The FPS ranges are selected from 4960 {@link android.hardware.camera2.params.StreamConfigurationMap#getHighSpeedVideoFpsRanges}. 4961 4962 When the above conditions are NOT satisfied, 4963 {@link android.hardware.camera2.CameraDevice#createConstrainedHighSpeedCaptureSession} 4964 will fail. 4965 4966 Switching to an FPS range that has a different maximum FPS may trigger some camera device 4967 reconfigurations, which may introduce extra latency. It is recommended that 4968 the application avoids unnecessary maximum target FPS changes as much as possible 4969 during high speed streaming.
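As a non-normative illustration, a typical application flow for this capability might look like the following sketch using the public camera2 Java API. The `characteristics`, `device`, `previewSurface`, `recorderSurface`, and `handler` objects are assumed to exist, index 0 is an arbitrary example choice, and most error handling is omitted:

    // Pick a supported high speed size and FPS range, then start a repeating burst.
    StreamConfigurationMap map = characteristics.get(
        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Size videoSize = map.getHighSpeedVideoSizes()[0];
    Range<Integer> fpsRange = map.getHighSpeedVideoFpsRangesFor(videoSize)[0];

    device.createConstrainedHighSpeedCaptureSession(
        Arrays.asList(previewSurface, recorderSurface),
        new CameraCaptureSession.StateCallback() {
          @Override public void onConfigured(CameraCaptureSession session) {
            try {
              CameraConstrainedHighSpeedCaptureSession hsSession =
                  (CameraConstrainedHighSpeedCaptureSession) session;
              CaptureRequest.Builder builder =
                  device.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
              builder.addTarget(previewSurface);
              builder.addTarget(recorderSurface);
              builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);
              List<CaptureRequest> burst =
                  hsSession.createHighSpeedRequestList(builder.build());
              hsSession.setRepeatingBurst(burst, /*listener*/ null, /*handler*/ null);
            } catch (CameraAccessException e) {
              // Handle the error.
            }
          }
          @Override public void onConfigureFailed(CameraCaptureSession session) { }
        }, handler);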
4970 </notes> 4971 </value> 4972 </enum> 4973 <description>List of capabilities that this camera device 4974 advertises as fully supporting.</description> 4975 <details> 4976 A capability is a contract that the camera device makes in order 4977 to be able to satisfy one or more use cases. 4978 4979 Listing a capability guarantees that the whole set of features 4980 required to support a common use will all be available. 4981 4982 Using a subset of the functionality provided by an unsupported 4983 capability may be possible on a specific camera device implementation; 4984 to do this query each of android.request.availableRequestKeys, 4985 android.request.availableResultKeys, 4986 android.request.availableCharacteristicsKeys. 4987 4988 The following capabilities are guaranteed to be available on 4989 android.info.supportedHardwareLevel `==` FULL devices: 4990 4991 * MANUAL_SENSOR 4992 * MANUAL_POST_PROCESSING 4993 4994 Other capabilities may be available on either FULL or LIMITED 4995 devices, but the application should query this key to be sure. 4996 </details> 4997 <hal_details> 4998 Additional constraint details per-capability will be available 4999 in the Compatibility Test Suite. 5000 5001 Minimum baseline requirements required for the 5002 BACKWARD_COMPATIBLE capability are not explicitly listed. 5003 Instead refer to "BC" tags and the camera CTS tests in the 5004 android.hardware.camera2.cts package. 5005 5006 Listed controls that can be either request or result (e.g. 5007 android.sensor.exposureTime) must be available both in the 5008 request and the result in order to be considered to be 5009 capability-compliant. 5010 5011 For example, if the HAL claims to support MANUAL control, 5012 then exposure time must be configurable via the request _and_ 5013 the actual exposure applied must be available via 5014 the result. 5015 5016 If MANUAL_SENSOR is omitted, the HAL may choose to omit the 5017 android.scaler.availableMinFrameDurations static property entirely. 5018 5019 For PRIVATE_REPROCESSING and YUV_REPROCESSING capabilities, see 5020 hardware/libhardware/include/hardware/camera3.h Section 10 for more information. 5021 5022 Devices that support the MANUAL_SENSOR capability must support the 5023 CAMERA3_TEMPLATE_MANUAL template defined in camera3.h. 5024 5025 Devices that support the PRIVATE_REPROCESSING capability or the 5026 YUV_REPROCESSING capability must support the 5027 CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG template defined in camera3.h. 5028 5029 For DEPTH_OUTPUT, the depth-format keys 5030 android.depth.availableDepthStreamConfigurations, 5031 android.depth.availableDepthMinFrameDurations, 5032 android.depth.availableDepthStallDurations must be available, in 5033 addition to the other keys explicitly mentioned in the DEPTH_OUTPUT 5034 enum notes. The entry android.depth.maxDepthSamples must be available 5035 if the DEPTH_POINT_CLOUD format is supported (HAL pixel format BLOB, dataspace 5036 DEPTH). 5037 </hal_details> 5038 </entry> 5039 <entry name="availableRequestKeys" type="int32" visibility="ndk_public" 5040 container="array" hwlevel="legacy"> 5041 <array> 5042 <size>n</size> 5043 </array> 5044 <description>A list of all keys that the camera device has available 5045 to use with {@link android.hardware.camera2.CaptureRequest}.</description> 5046 5047 <details>Attempting to set a key into a CaptureRequest that is not 5048 listed here will result in an invalid request and will be rejected 5049 by the camera device. 
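As a non-normative illustration, an application can check for a specific optional key before using it. The fragment below is a minimal sketch using the public camera2 Java API; the `manager` and `cameraId` values are assumptions, and exception handling is omitted:

    // Check whether a request key is supported before setting it in a CaptureRequest.
    CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
    List<CaptureRequest.Key<?>> requestKeys =
        characteristics.getAvailableCaptureRequestKeys();
    boolean manualExposureSupported =
        requestKeys.contains(CaptureRequest.SENSOR_EXPOSURE_TIME);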
5050 5051 This field can be used to query the feature set of a camera device 5052 at a more granular level than capabilities. This is especially 5053 important for optional keys that are not listed under any capability 5054 in android.request.availableCapabilities. 5055 </details> 5056 <hal_details> 5057 Vendor tags must not be listed here. Use the vendor tag metadata 5058 extensions C api instead (refer to camera3.h for more details). 5059 5060 Setting/getting vendor tags will be checked against the metadata 5061 vendor extensions API and not against this field. 5062 5063 The HAL must not consume any request tags that are not listed either 5064 here or in the vendor tag list. 5065 5066 The public camera2 API will always make the vendor tags visible 5067 via 5068 {@link android.hardware.camera2.CameraCharacteristics#getAvailableCaptureRequestKeys}. 5069 </hal_details> 5070 </entry> 5071 <entry name="availableResultKeys" type="int32" visibility="ndk_public" 5072 container="array" hwlevel="legacy"> 5073 <array> 5074 <size>n</size> 5075 </array> 5076 <description>A list of all keys that the camera device has available 5077 to use with {@link android.hardware.camera2.CaptureResult}.</description> 5078 5079 <details>Attempting to get a key from a CaptureResult that is not 5080 listed here will always return a `null` value. Getting a key from 5081 a CaptureResult that is listed here will generally never return a `null` 5082 value. 5083 5084 The following keys may return `null` unless they are enabled: 5085 5086 * android.statistics.lensShadingMap (non-null iff android.statistics.lensShadingMapMode == ON) 5087 5088 (Those sometimes-null keys will nevertheless be listed here 5089 if they are available.) 5090 5091 This field can be used to query the feature set of a camera device 5092 at a more granular level than capabilities. This is especially 5093 important for optional keys that are not listed under any capability 5094 in android.request.availableCapabilities. 5095 </details> 5096 <hal_details> 5097 Tags listed here must always have an entry in the result metadata, 5098 even if that size is 0 elements. Only array-type tags (e.g. lists, 5099 matrices, strings) are allowed to have 0 elements. 5100 5101 Vendor tags must not be listed here. Use the vendor tag metadata 5102 extensions C api instead (refer to camera3.h for more details). 5103 5104 Setting/getting vendor tags will be checked against the metadata 5105 vendor extensions API and not against this field. 5106 5107 The HAL must not produce any result tags that are not listed either 5108 here or in the vendor tag list. 5109 5110 The public camera2 API will always make the vendor tags visible via {@link 5111 android.hardware.camera2.CameraCharacteristics#getAvailableCaptureResultKeys}. 5112 </hal_details> 5113 </entry> 5114 <entry name="availableCharacteristicsKeys" type="int32" visibility="ndk_public" 5115 container="array" hwlevel="legacy"> 5116 <array> 5117 <size>n</size> 5118 </array> 5119 <description>A list of all keys that the camera device has available 5120 to use with {@link android.hardware.camera2.CameraCharacteristics}.</description> 5121 <details>This entry follows the same rules as 5122 android.request.availableResultKeys (except that it applies for 5123 CameraCharacteristics instead of CaptureResult). See above for more 5124 details. 5125 </details> 5126 <hal_details> 5127 Keys listed here must always have an entry in the static info metadata, 5128 even if that size is 0 elements. Only array-type tags (e.g. 
lists, 5129 matrices, strings) are allowed to have 0 elements. 5130 5131 Vendor tags must not be listed here. Use the vendor tag metadata 5132 extensions C api instead (refer to camera3.h for more details). 5133 5134 Setting/getting vendor tags will be checked against the metadata 5135 vendor extensions API and not against this field. 5136 5137 The HAL must not have any tags in its static info that are not listed 5138 either here or in the vendor tag list. 5139 5140 The public camera2 API will always make the vendor tags visible 5141 via {@link android.hardware.camera2.CameraCharacteristics#getKeys}. 5142 </hal_details> 5143 </entry> 5144 </static> 5145 </section> 5146 <section name="scaler"> 5147 <controls> 5148 <entry name="cropRegion" type="int32" visibility="public" 5149 container="array" typedef="rectangle" hwlevel="legacy"> 5150 <array> 5151 <size>4</size> 5152 </array> 5153 <description>The desired region of the sensor to read out for this capture.</description> 5154 <units>Pixel coordinates relative to 5155 android.sensor.info.activeArraySize</units> 5156 <details> 5157 This control can be used to implement digital zoom. 5158 5159 The crop region coordinate system is based off 5160 android.sensor.info.activeArraySize, with `(0, 0)` being the 5161 top-left corner of the sensor active array. 5162 5163 Output streams use this rectangle to produce their output, 5164 cropping to a smaller region if necessary to maintain the 5165 stream's aspect ratio, then scaling the sensor input to 5166 match the output's configured resolution. 5167 5168 The crop region is applied after the RAW to other color 5169 space (e.g. YUV) conversion. Since raw streams 5170 (e.g. RAW16) don't have the conversion stage, they are not 5171 croppable. The crop region will be ignored by raw streams. 5172 5173 For non-raw streams, any additional per-stream cropping will 5174 be done to maximize the final pixel area of the stream. 5175 5176 For example, if the crop region is set to a 4:3 aspect 5177 ratio, then 4:3 streams will use the exact crop 5178 region. 16:9 streams will further crop vertically 5179 (letterbox). 5180 5181 Conversely, if the crop region is set to a 16:9, then 4:3 5182 outputs will crop horizontally (pillarbox), and 16:9 5183 streams will match exactly. These additional crops will 5184 be centered within the crop region. 5185 5186 The width and height of the crop region cannot 5187 be set to be smaller than 5188 `floor( activeArraySize.width / android.scaler.availableMaxDigitalZoom )` and 5189 `floor( activeArraySize.height / android.scaler.availableMaxDigitalZoom )`, respectively. 5190 5191 The camera device may adjust the crop region to account 5192 for rounding and other hardware requirements; the final 5193 crop region used will be included in the output capture 5194 result. 5195 </details> 5196 <hal_details> 5197 The output streams must maintain square pixels at all 5198 times, no matter what the relative aspect ratios of the 5199 crop region and the stream are. Negative values for 5200 corner are allowed for raw output if full pixel array is 5201 larger than active pixel array. Width and height may be 5202 rounded to nearest larger supportable width, especially 5203 for raw output, where only a few fixed scales may be 5204 possible. 
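As a non-normative illustration of the application-visible behavior described in the details above, a centered crop region for a given digital zoom factor can be computed as in the following Java sketch; the `characteristics` and `requestBuilder` objects and the 2x zoom value are assumptions for the example:

    // Compute a centered crop region for a requested digital zoom factor.
    Rect active = characteristics.get(
        CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
    float maxZoom = characteristics.get(
        CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM);
    float zoom = Math.min(2.0f, maxZoom); // example 2x zoom, clamped to the supported maximum
    int cropW = (int) Math.floor(active.width() / zoom);
    int cropH = (int) Math.floor(active.height() / zoom);
    int left = (active.width() - cropW) / 2;  // (0, 0) is the top-left of the active array
    int top  = (active.height() - cropH) / 2;
    requestBuilder.set(CaptureRequest.SCALER_CROP_REGION,
        new Rect(left, top, left + cropW, top + cropH));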
        For a set of output streams configured, if the sensor output is cropped to a smaller
        size than the active array size, the HAL needs to follow the cropping rules below:

        * The HAL needs to handle the cropRegion as if the sensor crop size is the effective active
          array size. More specifically, the HAL must transform the requested cropRegion from
          android.sensor.info.activeArraySize to the sensor cropped pixel area size in this way:
            1. Translate the requested cropRegion w.r.t. the top-left corner of the sensor
               cropped pixel area by (tx, ty),
               where `ty = sensorCrop.top * (sensorCrop.height / activeArraySize.height)`
               and `tx = sensorCrop.left * (sensorCrop.width / activeArraySize.width)`. The
               (sensorCrop.top, sensorCrop.left) coordinate is based off the
               android.sensor.info.activeArraySize.
            2. Scale the width and height of the requested cropRegion with scaling factors of
               sensorCrop.width/activeArraySize.width and sensorCrop.height/activeArraySize.height,
               respectively.
          Once this new cropRegion is calculated, the HAL must use this region to crop the image
          with regard to the sensor crop size (effective active array size). The HAL still needs to
          follow the general cropping rule for this new cropRegion and effective active
          array size.

        * The HAL must report the cropRegion with regard to android.sensor.info.activeArraySize.
          The HAL needs to convert the new cropRegion generated above back w.r.t. the full active
          array size. The reported cropRegion may be slightly different from the requested
          cropRegion since the HAL may adjust the crop region to account for rounding, conversion
          error, or other hardware limitations.

        HAL2.x uses only (x, y, width)
        </hal_details>
        <tag id="BC" />
      </entry>
    </controls>
    <static>
      <entry name="availableFormats" type="int32"
      visibility="hidden" deprecated="true" enum="true"
      container="array" typedef="imageFormat">
        <array>
          <size>n</size>
        </array>
        <enum>
          <value optional="true" id="0x20">RAW16
            <notes>
            RAW16 is a standard, cross-platform format for raw image
            buffers with 16-bit pixels.

            Buffers of this format are typically expected to have a
            Bayer Color Filter Array (CFA) layout, which is given in
            android.sensor.info.colorFilterArrangement. Sensors with
            CFAs that are not representable by a format in
            android.sensor.info.colorFilterArrangement should not
            use this format.

            Buffers of this format will also follow the constraints given for
            RAW_OPAQUE buffers, but with relaxed performance constraints.

            This format is intended to give users access to the full contents
            of the buffers coming directly from the image sensor prior to any
            cropping or scaling operations, and all coordinate systems for
            metadata used for this format are relative to the size of the
            active region of the image sensor before any geometric distortion
            correction has been applied (i.e.
            android.sensor.info.preCorrectionActiveArraySize). Supported
            dimensions for this format are limited to the full dimensions of
            the sensor (e.g. either android.sensor.info.pixelArraySize or
            android.sensor.info.preCorrectionActiveArraySize will be the
            only supported output size).

            See android.scaler.availableInputOutputFormatsMap for
            the full set of performance guarantees.
5274 </notes> 5275 </value> 5276 <value optional="true" id="0x24">RAW_OPAQUE 5277 <notes> 5278 RAW_OPAQUE (or 5279 {@link android.graphics.ImageFormat#RAW_PRIVATE RAW_PRIVATE} 5280 as referred in public API) is a format for raw image buffers 5281 coming from an image sensor. 5282 5283 The actual structure of buffers of this format is 5284 platform-specific, but must follow several constraints: 5285 5286 1. No image post-processing operations may have been applied to 5287 buffers of this type. These buffers contain raw image data coming 5288 directly from the image sensor. 5289 1. If a buffer of this format is passed to the camera device for 5290 reprocessing, the resulting images will be identical to the images 5291 produced if the buffer had come directly from the sensor and was 5292 processed with the same settings. 5293 5294 The intended use for this format is to allow access to the native 5295 raw format buffers coming directly from the camera sensor without 5296 any additional conversions or decrease in framerate. 5297 5298 See android.scaler.availableInputOutputFormatsMap for the full set of 5299 performance guarantees. 5300 </notes> 5301 </value> 5302 <value optional="true" id="0x32315659">YV12 5303 <notes>YCrCb 4:2:0 Planar</notes> 5304 </value> 5305 <value optional="true" id="0x11">YCrCb_420_SP 5306 <notes>NV21</notes> 5307 </value> 5308 <value id="0x22">IMPLEMENTATION_DEFINED 5309 <notes>System internal format, not application-accessible</notes> 5310 </value> 5311 <value id="0x23">YCbCr_420_888 5312 <notes>Flexible YUV420 Format</notes> 5313 </value> 5314 <value id="0x21">BLOB 5315 <notes>JPEG format</notes> 5316 </value> 5317 </enum> 5318 <description>The list of image formats that are supported by this 5319 camera device for output streams.</description> 5320 <details> 5321 All camera devices will support JPEG and YUV_420_888 formats. 5322 5323 When set to YUV_420_888, application can access the YUV420 data directly. 5324 </details> 5325 <hal_details> 5326 These format values are from HAL_PIXEL_FORMAT_* in 5327 system/core/include/system/graphics.h. 5328 5329 When IMPLEMENTATION_DEFINED is used, the platform 5330 gralloc module will select a format based on the usage flags provided 5331 by the camera HAL device and the other endpoint of the stream. It is 5332 usually used by preview and recording streams, where the application doesn't 5333 need access the image data. 5334 5335 YCbCr_420_888 format must be supported by the HAL. When an image stream 5336 needs CPU/application direct access, this format will be used. 5337 5338 The BLOB format must be supported by the HAL. This is used for the JPEG stream. 5339 5340 A RAW_OPAQUE buffer should contain only pixel data. It is strongly 5341 recommended that any information used by the camera device when 5342 processing images is fully expressed by the result metadata 5343 for that image buffer. 5344 </hal_details> 5345 <tag id="BC" /> 5346 </entry> 5347 <entry name="availableJpegMinDurations" type="int64" visibility="hidden" deprecated="true" 5348 container="array"> 5349 <array> 5350 <size>n</size> 5351 </array> 5352 <description>The minimum frame duration that is supported 5353 for each resolution in android.scaler.availableJpegSizes. 
5354 </description> 5355 <units>Nanoseconds</units> 5356 <range>TODO: Remove property.</range> 5357 <details> 5358 This corresponds to the minimum steady-state frame duration when only 5359 that JPEG stream is active and captured in a burst, with all 5360 processing (typically in android.*.mode) set to FAST. 5361 5362 When multiple streams are configured, the minimum 5363 frame duration will be &gt;= max(individual stream min 5364 durations)</details> 5365 <tag id="BC" /> 5366 </entry> 5367 <entry name="availableJpegSizes" type="int32" visibility="hidden" 5368 deprecated="true" container="array" typedef="size"> 5369 <array> 5370 <size>n</size> 5371 <size>2</size> 5372 </array> 5373 <description>The JPEG resolutions that are supported by this camera device.</description> 5374 <range>TODO: Remove property.</range> 5375 <details> 5376 The resolutions are listed as `(width, height)` pairs. All camera devices will support 5377 sensor maximum resolution (defined by android.sensor.info.activeArraySize). 5378 </details> 5379 <hal_details> 5380 The HAL must include sensor maximum resolution 5381 (defined by android.sensor.info.activeArraySize), 5382 and should include half/quarter of sensor maximum resolution. 5383 </hal_details> 5384 <tag id="BC" /> 5385 </entry> 5386 <entry name="availableMaxDigitalZoom" type="float" visibility="public" 5387 hwlevel="legacy"> 5388 <description>The maximum ratio between both active area width 5389 and crop region width, and active area height and 5390 crop region height, for android.scaler.cropRegion. 5391 </description> 5392 <units>Zoom scale factor</units> 5393 <range>&gt;=1</range> 5394 <details> 5395 This represents the maximum amount of zooming possible by 5396 the camera device, or equivalently, the minimum cropping 5397 window size. 5398 5399 Crop regions that have a width or height that is smaller 5400 than this ratio allows will be rounded up to the minimum 5401 allowed size by the camera device. 5402 </details> 5403 <tag id="BC" /> 5404 </entry> 5405 <entry name="availableProcessedMinDurations" type="int64" visibility="hidden" deprecated="true" 5406 container="array"> 5407 <array> 5408 <size>n</size> 5409 </array> 5410 <description>For each available processed output size (defined in 5411 android.scaler.availableProcessedSizes), this property lists the 5412 minimum supportable frame duration for that size. 5413 </description> 5414 <units>Nanoseconds</units> 5415 <details> 5416 This should correspond to the frame duration when only that processed 5417 stream is active, with all processing (typically in android.*.mode) 5418 set to FAST. 5419 5420 When multiple streams are configured, the minimum frame duration will 5421 be &gt;= max(individual stream min durations). 5422 </details> 5423 <tag id="BC" /> 5424 </entry> 5425 <entry name="availableProcessedSizes" type="int32" visibility="hidden" 5426 deprecated="true" container="array" typedef="size"> 5427 <array> 5428 <size>n</size> 5429 <size>2</size> 5430 </array> 5431 <description>The resolutions available for use with 5432 processed output streams, such as YV12, NV12, and 5433 platform opaque YUV/RGB streams to the GPU or video 5434 encoders.</description> 5435 <details> 5436 The resolutions are listed as `(width, height)` pairs. 5437 5438 For a given use case, the actual maximum supported resolution 5439 may be lower than what is listed here, depending on the destination 5440 Surface for the image data. For example, for recording video, 5441 the video encoder chosen may have a maximum size limit (e.g. 
1080p) 5442 smaller than what the camera (e.g. maximum resolution is 3264x2448) 5443 can provide. 5444 5445 Please reference the documentation for the image data destination to 5446 check if it limits the maximum size for image data. 5447 </details> 5448 <hal_details> 5449 For FULL capability devices (`android.info.supportedHardwareLevel == FULL`), 5450 the HAL must include all JPEG sizes listed in android.scaler.availableJpegSizes 5451 and each below resolution if it is smaller than or equal to the sensor 5452 maximum resolution (if they are not listed in JPEG sizes already): 5453 5454 * 240p (320 x 240) 5455 * 480p (640 x 480) 5456 * 720p (1280 x 720) 5457 * 1080p (1920 x 1080) 5458 5459 For LIMITED capability devices (`android.info.supportedHardwareLevel == LIMITED`), 5460 the HAL only has to list up to the maximum video size supported by the devices. 5461 </hal_details> 5462 <tag id="BC" /> 5463 </entry> 5464 <entry name="availableRawMinDurations" type="int64" deprecated="true" 5465 container="array"> 5466 <array> 5467 <size>n</size> 5468 </array> 5469 <description> 5470 For each available raw output size (defined in 5471 android.scaler.availableRawSizes), this property lists the minimum 5472 supportable frame duration for that size. 5473 </description> 5474 <units>Nanoseconds</units> 5475 <details> 5476 Should correspond to the frame duration when only the raw stream is 5477 active. 5478 5479 When multiple streams are configured, the minimum 5480 frame duration will be &gt;= max(individual stream min 5481 durations)</details> 5482 <tag id="BC" /> 5483 </entry> 5484 <entry name="availableRawSizes" type="int32" deprecated="true" 5485 container="array" typedef="size"> 5486 <array> 5487 <size>n</size> 5488 <size>2</size> 5489 </array> 5490 <description>The resolutions available for use with raw 5491 sensor output streams, listed as width, 5492 height</description> 5493 </entry> 5494 </static> 5495 <dynamic> 5496 <clone entry="android.scaler.cropRegion" kind="controls"> 5497 </clone> 5498 </dynamic> 5499 <static> 5500 <entry name="availableInputOutputFormatsMap" type="int32" visibility="hidden" 5501 typedef="reprocessFormatsMap"> 5502 <description>The mapping of image formats that are supported by this 5503 camera device for input streams, to their corresponding output formats. 5504 </description> 5505 <details> 5506 All camera devices with at least 1 5507 android.request.maxNumInputStreams will have at least one 5508 available input format. 5509 5510 The camera device will support the following map of formats, 5511 if its dependent capability (android.request.availableCapabilities) is supported: 5512 5513 Input Format | Output Format | Capability 5514 :-------------------------------------------------|:--------------------------------------------------|:---------- 5515 {@link android.graphics.ImageFormat#PRIVATE} | {@link android.graphics.ImageFormat#JPEG} | PRIVATE_REPROCESSING 5516 {@link android.graphics.ImageFormat#PRIVATE} | {@link android.graphics.ImageFormat#YUV_420_888} | PRIVATE_REPROCESSING 5517 {@link android.graphics.ImageFormat#YUV_420_888} | {@link android.graphics.ImageFormat#JPEG} | YUV_REPROCESSING 5518 {@link android.graphics.ImageFormat#YUV_420_888} | {@link android.graphics.ImageFormat#YUV_420_888} | YUV_REPROCESSING 5519 5520 PRIVATE refers to a device-internal format that is not directly application-visible. 
A 5521 PRIVATE input surface can be acquired by {@link android.media.ImageReader#newInstance} 5522 with {@link android.graphics.ImageFormat#PRIVATE} as the format. 5523 5524 For a PRIVATE_REPROCESSING-capable camera device, using the PRIVATE format as either input 5525 or output will never hurt maximum frame rate (i.e. {@link 5526 android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration 5527 getOutputStallDuration(ImageFormat.PRIVATE, size)} is always 0), 5528 5529 Attempting to configure an input stream with output streams not 5530 listed as available in this map is not valid. 5531 </details> 5532 <hal_details> 5533 For the formats, see `system/core/include/system/graphics.h` for a definition 5534 of the image format enumerations. The PRIVATE format refers to the 5535 HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED format. The HAL could determine 5536 the actual format by using the gralloc usage flags. 5537 For ZSL use case in particular, the HAL could choose appropriate format (partially 5538 processed YUV or RAW based format) by checking the format and GRALLOC_USAGE_HW_CAMERA_ZSL. 5539 See camera3.h for more details. 5540 5541 This value is encoded as a variable-size array-of-arrays. 5542 The inner array always contains `[format, length, ...]` where 5543 `...` has `length` elements. An inner array is followed by another 5544 inner array if the total metadata entry size hasn't yet been exceeded. 5545 5546 A code sample to read/write this encoding (with a device that 5547 supports reprocessing IMPLEMENTATION_DEFINED to YUV_420_888, and JPEG, 5548 and reprocessing YUV_420_888 to YUV_420_888 and JPEG): 5549 5550 // reading 5551 int32_t* contents = &entry.i32[0]; 5552 for (size_t i = 0; i < entry.count; ) { 5553 int32_t format = contents[i++]; 5554 int32_t length = contents[i++]; 5555 int32_t output_formats[length]; 5556 memcpy(&output_formats[0], &contents[i], 5557 length * sizeof(int32_t)); 5558 i += length; 5559 } 5560 5561 // writing (static example, PRIVATE_REPROCESSING + YUV_REPROCESSING) 5562 int32_t[] contents = { 5563 IMPLEMENTATION_DEFINED, 2, YUV_420_888, BLOB, 5564 YUV_420_888, 2, YUV_420_888, BLOB, 5565 }; 5566 update_camera_metadata_entry(metadata, index, &contents[0], 5567 sizeof(contents)/sizeof(contents[0]), &updated_entry); 5568 5569 If the HAL claims to support any of the capabilities listed in the 5570 above details, then it must also support all the input-output 5571 combinations listed for that capability. It can optionally support 5572 additional formats if it so chooses. 5573 </hal_details> 5574 <tag id="REPROC" /> 5575 </entry> 5576 <entry name="availableStreamConfigurations" type="int32" visibility="ndk_public" 5577 enum="true" container="array" typedef="streamConfiguration" hwlevel="legacy"> 5578 <array> 5579 <size>n</size> 5580 <size>4</size> 5581 </array> 5582 <enum> 5583 <value>OUTPUT</value> 5584 <value>INPUT</value> 5585 </enum> 5586 <description>The available stream configurations that this 5587 camera device supports 5588 (i.e. format, width, height, output/input stream). 5589 </description> 5590 <details> 5591 The configurations are listed as `(format, width, height, input?)` 5592 tuples. 5593 5594 For a given use case, the actual maximum supported resolution 5595 may be lower than what is listed here, depending on the destination 5596 Surface for the image data. For example, for recording video, 5597 the video encoder chosen may have a maximum size limit (e.g. 1080p) 5598 smaller than what the camera (e.g. 
maximum resolution is 3264x2448) 5599 can provide. 5600 5601 Please reference the documentation for the image data destination to 5602 check if it limits the maximum size for image data. 5603 5604 Not all output formats may be supported in a configuration with 5605 an input stream of a particular format. For more details, see 5606 android.scaler.availableInputOutputFormatsMap. 5607 5608 The following table describes the minimum required output stream 5609 configurations based on the hardware level 5610 (android.info.supportedHardwareLevel): 5611 5612 Format | Size | Hardware Level | Notes 5613 :-------------:|:--------------------------------------------:|:--------------:|:--------------: 5614 JPEG | android.sensor.info.activeArraySize | Any | 5615 JPEG | 1920x1080 (1080p) | Any | if 1080p <= activeArraySize 5616 JPEG | 1280x720 (720) | Any | if 720p <= activeArraySize 5617 JPEG | 640x480 (480p) | Any | if 480p <= activeArraySize 5618 JPEG | 320x240 (240p) | Any | if 240p <= activeArraySize 5619 YUV_420_888 | all output sizes available for JPEG | FULL | 5620 YUV_420_888 | all output sizes available for JPEG, up to the maximum video size | LIMITED | 5621 IMPLEMENTATION_DEFINED | same as YUV_420_888 | Any | 5622 5623 Refer to android.request.availableCapabilities for additional 5624 mandatory stream configurations on a per-capability basis. 5625 </details> 5626 <hal_details> 5627 It is recommended (but not mandatory) to also include half/quarter 5628 of sensor maximum resolution for JPEG formats (regardless of hardware 5629 level). 5630 5631 (The following is a rewording of the above required table): 5632 5633 For JPEG format, the sizes may be restricted by below conditions: 5634 5635 * The HAL may choose the aspect ratio of each Jpeg size to be one of well known ones 5636 (e.g. 4:3, 16:9, 3:2 etc.). If the sensor maximum resolution 5637 (defined by android.sensor.info.activeArraySize) has an aspect ratio other than these, 5638 it does not have to be included in the supported JPEG sizes. 5639 * Some hardware JPEG encoders may have pixel boundary alignment requirements, such as 5640 the dimensions being a multiple of 16. 5641 5642 Therefore, the maximum JPEG size may be smaller than sensor maximum resolution. 5643 However, the largest JPEG size must be as close as possible to the sensor maximum 5644 resolution given above constraints. It is required that after aspect ratio adjustments, 5645 additional size reduction due to other issues must be less than 3% in area. For example, 5646 if the sensor maximum resolution is 3280x2464, if the maximum JPEG size has aspect 5647 ratio 4:3, the JPEG encoder alignment requirement is 16, the maximum JPEG size will be 5648 3264x2448. 5649 5650 For FULL capability devices (`android.info.supportedHardwareLevel == FULL`), 5651 the HAL must include all YUV_420_888 sizes that have JPEG sizes listed 5652 here as output streams. 5653 5654 It must also include each below resolution if it is smaller than or 5655 equal to the sensor maximum resolution (for both YUV_420_888 and JPEG 5656 formats), as output streams: 5657 5658 * 240p (320 x 240) 5659 * 480p (640 x 480) 5660 * 720p (1280 x 720) 5661 * 1080p (1920 x 1080) 5662 5663 For LIMITED capability devices 5664 (`android.info.supportedHardwareLevel == LIMITED`), 5665 the HAL only has to list up to the maximum video size 5666 supported by the device. 5667 5668 Regardless of hardware level, every output resolution available for 5669 YUV_420_888 must also be available for IMPLEMENTATION_DEFINED. 
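The guarantee above can be cross-checked from the framework side. The sketch below is illustrative only (not part of the HAL contract; `characteristics` is assumed to be an already-obtained CameraCharacteristics) and verifies that every JPEG output size is also offered for YUV_420_888, as required for FULL devices:

    // Illustrative framework-side check of the FULL-level guarantee above.
    StreamConfigurationMap map = characteristics.get(
            CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Set<Size> yuvSizes = new HashSet<>(
            Arrays.asList(map.getOutputSizes(ImageFormat.YUV_420_888)));
    for (Size jpegSize : map.getOutputSizes(ImageFormat.JPEG)) {
        boolean coveredByYuv = yuvSizes.contains(jpegSize);
        // On a FULL device, coveredByYuv is expected to be true for every JPEG size.
    }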
5670 5671 This supercedes the following fields, which are now deprecated: 5672 5673 * availableFormats 5674 * available[Processed,Raw,Jpeg]Sizes 5675 </hal_details> 5676 </entry> 5677 <entry name="availableMinFrameDurations" type="int64" visibility="ndk_public" 5678 container="array" typedef="streamConfigurationDuration" hwlevel="legacy"> 5679 <array> 5680 <size>4</size> 5681 <size>n</size> 5682 </array> 5683 <description>This lists the minimum frame duration for each 5684 format/size combination. 5685 </description> 5686 <units>(format, width, height, ns) x n</units> 5687 <details> 5688 This should correspond to the frame duration when only that 5689 stream is active, with all processing (typically in android.*.mode) 5690 set to either OFF or FAST. 5691 5692 When multiple streams are used in a request, the minimum frame 5693 duration will be max(individual stream min durations). 5694 5695 The minimum frame duration of a stream (of a particular format, size) 5696 is the same regardless of whether the stream is input or output. 5697 5698 See android.sensor.frameDuration and 5699 android.scaler.availableStallDurations for more details about 5700 calculating the max frame rate. 5701 5702 (Keep in sync with 5703 {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration}) 5704 </details> 5705 <tag id="V1" /> 5706 </entry> 5707 <entry name="availableStallDurations" type="int64" visibility="ndk_public" 5708 container="array" typedef="streamConfigurationDuration" hwlevel="legacy"> 5709 <array> 5710 <size>4</size> 5711 <size>n</size> 5712 </array> 5713 <description>This lists the maximum stall duration for each 5714 output format/size combination. 5715 </description> 5716 <units>(format, width, height, ns) x n</units> 5717 <details> 5718 A stall duration is how much extra time would get added 5719 to the normal minimum frame duration for a repeating request 5720 that has streams with non-zero stall. 5721 5722 For example, consider JPEG captures which have the following 5723 characteristics: 5724 5725 * JPEG streams act like processed YUV streams in requests for which 5726 they are not included; in requests in which they are directly 5727 referenced, they act as JPEG streams. This is because supporting a 5728 JPEG stream requires the underlying YUV data to always be ready for 5729 use by a JPEG encoder, but the encoder will only be used (and impact 5730 frame duration) on requests that actually reference a JPEG stream. 5731 * The JPEG processor can run concurrently to the rest of the camera 5732 pipeline, but cannot process more than 1 capture at a time. 5733 5734 In other words, using a repeating YUV request would result 5735 in a steady frame rate (let's say it's 30 FPS). If a single 5736 JPEG request is submitted periodically, the frame rate will stay 5737 at 30 FPS (as long as we wait for the previous JPEG to return each 5738 time). If we try to submit a repeating YUV + JPEG request, then 5739 the frame rate will drop from 30 FPS. 5740 5741 In general, submitting a new request with a non-0 stall time 5742 stream will _not_ cause a frame rate drop unless there are still 5743 outstanding buffers for that stream from previous requests. 5744 5745 Submitting a repeating request with streams (call this `S`) 5746 is the same as setting the minimum frame duration from 5747 the normal minimum frame duration corresponding to `S`, added with 5748 the maximum stall duration for `S`. 
5749 5750 If interleaving requests with and without a stall duration, 5751 a request will stall by the maximum of the remaining times 5752 for each can-stall stream with outstanding buffers. 5753 5754 This means that a stalling request will not have an exposure start 5755 until the stall has completed. 5756 5757 This should correspond to the stall duration when only that stream is 5758 active, with all processing (typically in android.*.mode) set to FAST 5759 or OFF. Setting any of the processing modes to HIGH_QUALITY 5760 effectively results in an indeterminate stall duration for all 5761 streams in a request (the regular stall calculation rules are 5762 ignored). 5763 5764 The following formats may always have a stall duration: 5765 5766 * {@link android.graphics.ImageFormat#JPEG} 5767 * {@link android.graphics.ImageFormat#RAW_SENSOR} 5768 5769 The following formats will never have a stall duration: 5770 5771 * {@link android.graphics.ImageFormat#YUV_420_888} 5772 * {@link android.graphics.ImageFormat#RAW10} 5773 5774 All other formats may or may not have an allowed stall duration on 5775 a per-capability basis; refer to android.request.availableCapabilities 5776 for more details. 5777 5778 See android.sensor.frameDuration for more information about 5779 calculating the max frame rate (absent stalls). 5780 5781 (Keep up to date with 5782 {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration} ) 5783 </details> 5784 <hal_details> 5785 If possible, it is recommended that all non-JPEG formats 5786 (such as RAW16) should not have a stall duration. RAW10, RAW12, RAW_OPAQUE 5787 and IMPLEMENTATION_DEFINED must not have stall durations. 5788 </hal_details> 5789 <tag id="V1" /> 5790 </entry> 5791 <entry name="streamConfigurationMap" type="int32" visibility="java_public" 5792 synthetic="true" typedef="streamConfigurationMap" 5793 hwlevel="legacy"> 5794 <description>The available stream configurations that this 5795 camera device supports; also includes the minimum frame durations 5796 and the stall durations for each format/size combination. 5797 </description> 5798 <details> 5799 All camera devices will support sensor maximum resolution (defined by 5800 android.sensor.info.activeArraySize) for the JPEG format. 5801 5802 For a given use case, the actual maximum supported resolution 5803 may be lower than what is listed here, depending on the destination 5804 Surface for the image data. For example, for recording video, 5805 the video encoder chosen may have a maximum size limit (e.g. 1080p) 5806 smaller than what the camera (e.g. maximum resolution is 3264x2448) 5807 can provide. 5808 5809 Please reference the documentation for the image data destination to 5810 check if it limits the maximum size for image data. 
5811 5812 The following table describes the minimum required output stream 5813 configurations based on the hardware level 5814 (android.info.supportedHardwareLevel): 5815 5816 Format | Size | Hardware Level | Notes 5817 :-------------------------------------------------:|:--------------------------------------------:|:--------------:|:--------------: 5818 {@link android.graphics.ImageFormat#JPEG} | android.sensor.info.activeArraySize (*1) | Any | 5819 {@link android.graphics.ImageFormat#JPEG} | 1920x1080 (1080p) | Any | if 1080p <= activeArraySize 5820 {@link android.graphics.ImageFormat#JPEG} | 1280x720 (720p) | Any | if 720p <= activeArraySize 5821 {@link android.graphics.ImageFormat#JPEG} | 640x480 (480p) | Any | if 480p <= activeArraySize 5822 {@link android.graphics.ImageFormat#JPEG} | 320x240 (240p) | Any | if 240p <= activeArraySize 5823 {@link android.graphics.ImageFormat#YUV_420_888} | all output sizes available for JPEG | FULL | 5824 {@link android.graphics.ImageFormat#YUV_420_888} | all output sizes available for JPEG, up to the maximum video size | LIMITED | 5825 {@link android.graphics.ImageFormat#PRIVATE} | same as YUV_420_888 | Any | 5826 5827 Refer to android.request.availableCapabilities and {@link 5828 android.hardware.camera2.CameraDevice#createCaptureSession} for additional mandatory 5829 stream configurations on a per-capability basis. 5830 5831 *1: For JPEG format, the sizes may be restricted by below conditions: 5832 5833 * The HAL may choose the aspect ratio of each Jpeg size to be one of well known ones 5834 (e.g. 4:3, 16:9, 3:2 etc.). If the sensor maximum resolution 5835 (defined by android.sensor.info.activeArraySize) has an aspect ratio other than these, 5836 it does not have to be included in the supported JPEG sizes. 5837 * Some hardware JPEG encoders may have pixel boundary alignment requirements, such as 5838 the dimensions being a multiple of 16. 5839 Therefore, the maximum JPEG size may be smaller than sensor maximum resolution. 5840 However, the largest JPEG size will be as close as possible to the sensor maximum 5841 resolution given above constraints. It is required that after aspect ratio adjustments, 5842 additional size reduction due to other issues must be less than 3% in area. For example, 5843 if the sensor maximum resolution is 3280x2464, if the maximum JPEG size has aspect 5844 ratio 4:3, and the JPEG encoder alignment requirement is 16, the maximum JPEG size will be 5845 3264x2448. 5846 </details> 5847 <hal_details> 5848 Do not set this property directly 5849 (it is synthetic and will not be available at the HAL layer); 5850 set the android.scaler.availableStreamConfigurations instead. 5851 5852 Not all output formats may be supported in a configuration with 5853 an input stream of a particular format. For more details, see 5854 android.scaler.availableInputOutputFormatsMap. 5855 5856 It is recommended (but not mandatory) to also include half/quarter 5857 of sensor maximum resolution for JPEG formats (regardless of hardware 5858 level). 5859 5860 (The following is a rewording of the above required table): 5861 5862 The HAL must include sensor maximum resolution (defined by 5863 android.sensor.info.activeArraySize). 5864 5865 For FULL capability devices (`android.info.supportedHardwareLevel == FULL`), 5866 the HAL must include all YUV_420_888 sizes that have JPEG sizes listed 5867 here as output streams. 
5868 5869 It must also include each below resolution if it is smaller than or 5870 equal to the sensor maximum resolution (for both YUV_420_888 and JPEG 5871 formats), as output streams: 5872 5873 * 240p (320 x 240) 5874 * 480p (640 x 480) 5875 * 720p (1280 x 720) 5876 * 1080p (1920 x 1080) 5877 5878 For LIMITED capability devices 5879 (`android.info.supportedHardwareLevel == LIMITED`), 5880 the HAL only has to list up to the maximum video size 5881 supported by the device. 5882 5883 Regardless of hardware level, every output resolution available for 5884 YUV_420_888 must also be available for IMPLEMENTATION_DEFINED. 5885 5886 This supercedes the following fields, which are now deprecated: 5887 5888 * availableFormats 5889 * available[Processed,Raw,Jpeg]Sizes 5890 </hal_details> 5891 </entry> 5892 <entry name="croppingType" type="byte" visibility="public" enum="true" 5893 hwlevel="legacy"> 5894 <enum> 5895 <value>CENTER_ONLY 5896 <notes> 5897 The camera device only supports centered crop regions. 5898 </notes> 5899 </value> 5900 <value>FREEFORM 5901 <notes> 5902 The camera device supports arbitrarily chosen crop regions. 5903 </notes> 5904 </value> 5905 </enum> 5906 <description>The crop type that this camera device supports.</description> 5907 <details> 5908 When passing a non-centered crop region (android.scaler.cropRegion) to a camera 5909 device that only supports CENTER_ONLY cropping, the camera device will move the 5910 crop region to the center of the sensor active array (android.sensor.info.activeArraySize) 5911 and keep the crop region width and height unchanged. The camera device will return the 5912 final used crop region in metadata result android.scaler.cropRegion. 5913 5914 Camera devices that support FREEFORM cropping will support any crop region that 5915 is inside of the active array. The camera device will apply the same crop region and 5916 return the final used crop region in capture result metadata android.scaler.cropRegion. 5917 5918 LEGACY capability devices will only support CENTER_ONLY cropping. 5919 </details> 5920 </entry> 5921 </static> 5922 </section> 5923 <section name="sensor"> 5924 <controls> 5925 <entry name="exposureTime" type="int64" visibility="public" hwlevel="full"> 5926 <description>Duration each pixel is exposed to 5927 light.</description> 5928 <units>Nanoseconds</units> 5929 <range>android.sensor.info.exposureTimeRange</range> 5930 <details>If the sensor can't expose this exact duration, it will shorten the 5931 duration exposed to the nearest possible value (rather than expose longer). 5932 The final exposure time used will be available in the output capture result. 5933 5934 This control is only effective if android.control.aeMode or android.control.mode is set to 5935 OFF; otherwise the auto-exposure algorithm will override this value. 5936 </details> 5937 <tag id="V1" /> 5938 </entry> 5939 <entry name="frameDuration" type="int64" visibility="public" hwlevel="full"> 5940 <description>Duration from start of frame exposure to 5941 start of next frame exposure.</description> 5942 <units>Nanoseconds</units> 5943 <range>See android.sensor.info.maxFrameDuration, 5944 android.scaler.streamConfigurationMap. 
The duration 5945 is capped to `max(duration, exposureTime + overhead)`.</range> 5946 <details> 5947 The maximum frame rate that can be supported by a camera subsystem is 5948 a function of many factors: 5949 5950 * Requested resolutions of output image streams 5951 * Availability of binning / skipping modes on the imager 5952 * The bandwidth of the imager interface 5953 * The bandwidth of the various ISP processing blocks 5954 5955 Since these factors can vary greatly between different ISPs and 5956 sensors, the camera abstraction tries to represent the bandwidth 5957 restrictions with as simple a model as possible. 5958 5959 The model presented has the following characteristics: 5960 5961 * The image sensor is always configured to output the smallest 5962 resolution possible given the application's requested output stream 5963 sizes. The smallest resolution is defined as being at least as large 5964 as the largest requested output stream size; the camera pipeline must 5965 never digitally upsample sensor data when the crop region covers the 5966 whole sensor. In general, this means that if only small output stream 5967 resolutions are configured, the sensor can provide a higher frame 5968 rate. 5969 * Since any request may use any or all the currently configured 5970 output streams, the sensor and ISP must be configured to support 5971 scaling a single capture to all the streams at the same time. This 5972 means the camera pipeline must be ready to produce the largest 5973 requested output size without any delay. Therefore, the overall 5974 frame rate of a given configured stream set is governed only by the 5975 largest requested stream resolution. 5976 * Using more than one output stream in a request does not affect the 5977 frame duration. 5978 * Certain format-streams may need to do additional background processing 5979 before data is consumed/produced by that stream. These processors 5980 can run concurrently to the rest of the camera pipeline, but 5981 cannot process more than 1 capture at a time. 5982 5983 The necessary information for the application, given the model above, 5984 is provided via the android.scaler.streamConfigurationMap field using 5985 {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration}. 5986 These are used to determine the maximum frame rate / minimum frame 5987 duration that is possible for a given stream configuration. 5988 5989 Specifically, the application can use the following rules to 5990 determine the minimum frame duration it can request from the camera 5991 device: 5992 5993 1. Let the set of currently configured input/output streams 5994 be called `S`. 5995 1. Find the minimum frame durations for each stream in `S`, by looking 5996 it up in android.scaler.streamConfigurationMap using {@link 5997 android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration} 5998 (with its respective size/format). Let this set of frame durations be 5999 called `F`. 6000 1. For any given request `R`, the minimum frame duration allowed 6001 for `R` is the maximum out of all values in `F`. Let the streams 6002 used in `R` be called `S_r`. 6003 6004 If none of the streams in `S_r` have a stall time (listed in {@link 6005 android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration} 6006 using its respective size/format), then the frame duration in `F` 6007 determines the steady state frame rate that the application will get 6008 if it uses `R` as a repeating request. 
Let this special kind of 6009 request be called `Rsimple`. 6010 6011 A repeating request `Rsimple` can be _occasionally_ interleaved 6012 by a single capture of a new request `Rstall` (which has at least 6013 one in-use stream with a non-0 stall time) and if `Rstall` has the 6014 same minimum frame duration this will not cause a frame rate loss 6015 if all buffers from the previous `Rstall` have already been 6016 delivered. 6017 6018 For more details about stalling, see 6019 {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration}. 6020 6021 This control is only effective if android.control.aeMode or android.control.mode is set to 6022 OFF; otherwise the auto-exposure algorithm will override this value. 6023 </details> 6024 <hal_details> 6025 For more details about stalling, see 6026 android.scaler.availableStallDurations. 6027 </hal_details> 6028 <tag id="V1" /> 6029 </entry> 6030 <entry name="sensitivity" type="int32" visibility="public" hwlevel="full"> 6031 <description>The amount of gain applied to sensor data 6032 before processing.</description> 6033 <units>ISO arithmetic units</units> 6034 <range>android.sensor.info.sensitivityRange</range> 6035 <details> 6036 The sensitivity is the standard ISO sensitivity value, 6037 as defined in ISO 12232:2006. 6038 6039 The sensitivity must be within android.sensor.info.sensitivityRange, and 6040 if if it less than android.sensor.maxAnalogSensitivity, the camera device 6041 is guaranteed to use only analog amplification for applying the gain. 6042 6043 If the camera device cannot apply the exact sensitivity 6044 requested, it will reduce the gain to the nearest supported 6045 value. The final sensitivity used will be available in the 6046 output capture result. 6047 6048 This control is only effective if android.control.aeMode or android.control.mode is set to 6049 OFF; otherwise the auto-exposure algorithm will override this value. 6050 </details> 6051 <hal_details>ISO 12232:2006 REI method is acceptable.</hal_details> 6052 <tag id="V1" /> 6053 </entry> 6054 </controls> 6055 <static> 6056 <namespace name="info"> 6057 <entry name="activeArraySize" type="int32" visibility="public" 6058 type_notes="Four ints defining the active pixel rectangle" 6059 container="array" typedef="rectangle" hwlevel="legacy"> 6060 <array> 6061 <size>4</size> 6062 </array> 6063 <description> 6064 The area of the image sensor which corresponds to active pixels after any geometric 6065 distortion correction has been applied. 6066 </description> 6067 <units>Pixel coordinates on the image sensor</units> 6068 <details> 6069 This is the rectangle representing the size of the active region of the sensor (i.e. 6070 the region that actually receives light from the scene) after any geometric correction 6071 has been applied, and should be treated as the maximum size in pixels of any of the 6072 image output formats aside from the raw formats. 6073 6074 This rectangle is defined relative to the full pixel array; (0,0) is the top-left of 6075 the full pixel array, and the size of the full pixel array is given by 6076 android.sensor.info.pixelArraySize. 6077 6078 The coordinate system for most other keys that list pixel coordinates, including 6079 android.scaler.cropRegion, is defined relative to the active array rectangle given in 6080 this field, with `(0, 0)` being the top-left of this rectangle. 
6081 6082 The active array may be smaller than the full pixel array, since the full array may 6083 include black calibration pixels or other inactive regions, and geometric correction 6084 resulting in scaling or cropping may have been applied. 6085 </details> 6086 <hal_details> 6087 This array contains `(xmin, ymin, width, height)`. The `(xmin, ymin)` must be 6088 &gt;= `(0,0)`. 6089 The `(width, height)` must be &lt;= `android.sensor.info.pixelArraySize`. 6090 </hal_details> 6091 <tag id="RAW" /> 6092 </entry> 6093 <entry name="sensitivityRange" type="int32" visibility="public" 6094 type_notes="Range of supported sensitivities" 6095 container="array" typedef="rangeInt" 6096 hwlevel="full"> 6097 <array> 6098 <size>2</size> 6099 </array> 6100 <description>Range of sensitivities for android.sensor.sensitivity supported by this 6101 camera device.</description> 6102 <range>Min <= 100, Max &gt;= 800</range> 6103 <details> 6104 The values are the standard ISO sensitivity values, 6105 as defined in ISO 12232:2006. 6106 </details> 6107 6108 <tag id="BC" /> 6109 <tag id="V1" /> 6110 </entry> 6111 <entry name="colorFilterArrangement" type="byte" visibility="public" enum="true" 6112 hwlevel="full"> 6113 <enum> 6114 <value>RGGB</value> 6115 <value>GRBG</value> 6116 <value>GBRG</value> 6117 <value>BGGR</value> 6118 <value>RGB 6119 <notes>Sensor is not Bayer; output has 3 16-bit 6120 values for each pixel, instead of just 1 16-bit value 6121 per pixel.</notes></value> 6122 </enum> 6123 <description>The arrangement of color filters on sensor; 6124 represents the colors in the top-left 2x2 section of 6125 the sensor, in reading order.</description> 6126 <tag id="RAW" /> 6127 </entry> 6128 <entry name="exposureTimeRange" type="int64" visibility="public" 6129 type_notes="nanoseconds" container="array" typedef="rangeLong" 6130 hwlevel="full"> 6131 <array> 6132 <size>2</size> 6133 </array> 6134 <description>The range of image exposure times for android.sensor.exposureTime supported 6135 by this camera device. 6136 </description> 6137 <units>Nanoseconds</units> 6138 <range>The minimum exposure time will be less than 100 us. For FULL 6139 capability devices (android.info.supportedHardwareLevel == FULL), 6140 the maximum exposure time will be greater than 100ms.</range> 6141 <hal_details>For FULL capability devices (android.info.supportedHardwareLevel == FULL), 6142 The maximum of the range SHOULD be at least 1 second (1e9), MUST be at least 6143 100ms. 6144 </hal_details> 6145 <tag id="V1" /> 6146 </entry> 6147 <entry name="maxFrameDuration" type="int64" visibility="public" 6148 hwlevel="full"> 6149 <description>The maximum possible frame duration (minimum frame rate) for 6150 android.sensor.frameDuration that is supported this camera device.</description> 6151 <units>Nanoseconds</units> 6152 <range>For FULL capability devices 6153 (android.info.supportedHardwareLevel == FULL), at least 100ms. 6154 </range> 6155 <details>Attempting to use frame durations beyond the maximum will result in the frame 6156 duration being clipped to the maximum. See that control for a full definition of frame 6157 durations. 6158 6159 Refer to {@link 6160 android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration} 6161 for the minimum frame duration values. 6162 </details> 6163 <hal_details> 6164 For FULL capability devices (android.info.supportedHardwareLevel == FULL), 6165 The maximum of the range SHOULD be at least 6166 1 second (1e9), MUST be at least 100ms (100e6). 
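For reference, an application exercising manual sensor control typically clamps its requested values against these characteristics before submitting a request. A minimal sketch (illustrative only; `builder` and `characteristics` are hypothetical names, and auto-exposure is assumed to be disabled):

    // Illustrative manual-exposure request, clamped to the advertised ranges.
    Range<Long> exposureRange =
            characteristics.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE);
    Range<Integer> isoRange =
            characteristics.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
    long maxFrameDuration =
            characteristics.get(CameraCharacteristics.SENSOR_INFO_MAX_FRAME_DURATION);

    builder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF);
    builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureRange.clamp(33000000L));
    builder.set(CaptureRequest.SENSOR_SENSITIVITY, isoRange.clamp(400));
    builder.set(CaptureRequest.SENSOR_FRAME_DURATION,
            Math.min(33333333L, maxFrameDuration));  // target ~30 fps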
6167 6168 android.sensor.info.maxFrameDuration must be greater or 6169 equal to the android.sensor.info.exposureTimeRange max 6170 value (since exposure time overrides frame duration). 6171 6172 Available minimum frame durations for JPEG must be no greater 6173 than that of the YUV_420_888/IMPLEMENTATION_DEFINED 6174 minimum frame durations (for that respective size). 6175 6176 Since JPEG processing is considered offline and can take longer than 6177 a single uncompressed capture, refer to 6178 android.scaler.availableStallDurations 6179 for details about encoding this scenario. 6180 </hal_details> 6181 <tag id="V1" /> 6182 </entry> 6183 <entry name="physicalSize" type="float" visibility="public" 6184 type_notes="width x height" 6185 container="array" typedef="sizeF" hwlevel="legacy"> 6186 <array> 6187 <size>2</size> 6188 </array> 6189 <description>The physical dimensions of the full pixel 6190 array.</description> 6191 <units>Millimeters</units> 6192 <details>This is the physical size of the sensor pixel 6193 array defined by android.sensor.info.pixelArraySize. 6194 </details> 6195 <hal_details>Needed for FOV calculation for old API</hal_details> 6196 <tag id="V1" /> 6197 <tag id="BC" /> 6198 </entry> 6199 <entry name="pixelArraySize" type="int32" visibility="public" 6200 container="array" typedef="size" hwlevel="legacy"> 6201 <array> 6202 <size>2</size> 6203 </array> 6204 <description>Dimensions of the full pixel array, possibly 6205 including black calibration pixels.</description> 6206 <units>Pixels</units> 6207 <details>The pixel count of the full pixel array of the image sensor, which covers 6208 android.sensor.info.physicalSize area. This represents the full pixel dimensions of 6209 the raw buffers produced by this sensor. 6210 6211 If a camera device supports raw sensor formats, either this or 6212 android.sensor.info.preCorrectionActiveArraySize is the maximum dimensions for the raw 6213 output formats listed in android.scaler.streamConfigurationMap (this depends on 6214 whether or not the image sensor returns buffers containing pixels that are not 6215 part of the active array region for blacklevel calibration or other purposes). 6216 6217 Some parts of the full pixel array may not receive light from the scene, 6218 or be otherwise inactive. The android.sensor.info.preCorrectionActiveArraySize key 6219 defines the rectangle of active pixels that will be included in processed image 6220 formats. 6221 </details> 6222 <tag id="RAW" /> 6223 <tag id="BC" /> 6224 </entry> 6225 <entry name="whiteLevel" type="int32" visibility="public"> 6226 <description> 6227 Maximum raw value output by sensor. 6228 </description> 6229 <range>&gt; 255 (8-bit output)</range> 6230 <details> 6231 This specifies the fully-saturated encoding level for the raw 6232 sample values from the sensor. This is typically caused by the 6233 sensor becoming highly non-linear or clipping. The minimum for 6234 each channel is specified by the offset in the 6235 android.sensor.blackLevelPattern key. 6236 6237 The white level is typically determined either by sensor bit depth 6238 (8-14 bits is expected), or by the point where the sensor response 6239 becomes too non-linear to be useful. The default value for this is 6240 maximum representable value for a 16-bit raw sample (2^16 - 1). 6241 6242 The white level values of captured images may vary for different 6243 capture settings (e.g., android.sensor.sensitivity). This key 6244 represents a coarse approximation for such case. 
It is recommended to use android.sensor.dynamicWhiteLevel for captures when supported
        by the camera device, which provides more accurate white level values.
        </details>
        <hal_details>
        The full bit depth of the sensor must be available in the raw data,
        so the value for linear sensors should not be significantly lower
        than the maximum raw value supported, i.e. 2^(sensor bits per pixel).
        </hal_details>
        <tag id="RAW" />
      </entry>
      <entry name="timestampSource" type="byte" visibility="public"
             enum="true" hwlevel="legacy">
        <enum>
          <value>UNKNOWN
            <notes>
            Timestamps from android.sensor.timestamp are in nanoseconds and monotonic,
            but cannot be compared to timestamps from other subsystems
            (e.g. accelerometer, gyro etc.), or other instances of the same or different
            camera devices in the same system. Timestamps between streams and results for
            a single camera instance are comparable, and the timestamps for all buffers
            and the result metadata generated by a single capture are identical.
            </notes>
          </value>
          <value>REALTIME
            <notes>
            Timestamps from android.sensor.timestamp are in the same timebase as
            {@link android.os.SystemClock#elapsedRealtimeNanos},
            and they can be compared to other timestamps using that base.
            </notes>
          </value>
        </enum>
        <description>The time base source for sensor capture start timestamps.</description>
        <details>
        The timestamps provided for captures are always in nanoseconds and monotonic, but
        may not be based on a time source that can be compared to other system time sources.

        This characteristic defines the source for the timestamps, and therefore whether they
        can be compared against other system time sources/timestamps.
        </details>
        <hal_details>
        For camera devices that implement UNKNOWN, the camera framework expects the timestamp
        source to be SYSTEM_TIME_MONOTONIC. For camera devices that implement REALTIME, the camera
        framework expects the timestamp source to be SYSTEM_TIME_BOOTTIME. See
        system/core/include/utils/Timers.h for the definition of SYSTEM_TIME_MONOTONIC and
        SYSTEM_TIME_BOOTTIME. Note that the HAL must follow the above expectation; otherwise video
        recording might suffer unexpected behavior.

        Also, camera devices that implement REALTIME must pass the ITS sensor fusion test, which
        tests the alignment between camera timestamps and gyro sensor timestamps.
        </hal_details>
        <tag id="V1" />
      </entry>
      <entry name="lensShadingApplied" type="byte" visibility="public" enum="true"
             typedef="boolean">
        <enum>
          <value>FALSE</value>
          <value>TRUE</value>
        </enum>
        <description>Whether the RAW images output from this camera device are subject to
        lens shading correction.</description>
        <details>
        If TRUE, all images produced by the camera device in the RAW image formats will
        have lens shading correction already applied to them. If FALSE, the images will
        not be adjusted for lens shading correction.
        See android.request.maxNumOutputRaw for a list of RAW image formats.

        This key will be `null` for all devices that do not report this information.
        Devices with RAW capability will always report this information in this key.
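As an illustration (a non-normative sketch; `characteristics` is an already-obtained CameraCharacteristics), an application can probe this key and handle its absence:

    // Check whether RAW output already has lens shading correction applied.
    Boolean shadingApplied =
            characteristics.get(CameraCharacteristics.SENSOR_INFO_LENS_SHADING_APPLIED);
    if (shadingApplied == null) {
        // The device does not report this information.
    } else if (shadingApplied) {
        // RAW frames are already shading-corrected; do not apply the shading map again.
    }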
6313 </details> 6314 </entry> 6315 <entry name="preCorrectionActiveArraySize" type="int32" visibility="public" 6316 type_notes="Four ints defining the active pixel rectangle" container="array" 6317 typedef="rectangle" hwlevel="legacy"> 6318 <array> 6319 <size>4</size> 6320 </array> 6321 <description> 6322 The area of the image sensor which corresponds to active pixels prior to the 6323 application of any geometric distortion correction. 6324 </description> 6325 <units>Pixel coordinates on the image sensor</units> 6326 <details> 6327 This is the rectangle representing the size of the active region of the sensor (i.e. 6328 the region that actually receives light from the scene) before any geometric correction 6329 has been applied, and should be treated as the active region rectangle for any of the 6330 raw formats. All metadata associated with raw processing (e.g. the lens shading 6331 correction map, and radial distortion fields) treats the top, left of this rectangle as 6332 the origin, (0,0). 6333 6334 The size of this region determines the maximum field of view and the maximum number of 6335 pixels that an image from this sensor can contain, prior to the application of 6336 geometric distortion correction. The effective maximum pixel dimensions of a 6337 post-distortion-corrected image is given by the android.sensor.info.activeArraySize 6338 field, and the effective maximum field of view for a post-distortion-corrected image 6339 can be calculated by applying the geometric distortion correction fields to this 6340 rectangle, and cropping to the rectangle given in android.sensor.info.activeArraySize. 6341 6342 E.g. to calculate position of a pixel, (x,y), in a processed YUV output image with the 6343 dimensions in android.sensor.info.activeArraySize given the position of a pixel, 6344 (x', y'), in the raw pixel array with dimensions give in 6345 android.sensor.info.pixelArraySize: 6346 6347 1. Choose a pixel (x', y') within the active array region of the raw buffer given in 6348 android.sensor.info.preCorrectionActiveArraySize, otherwise this pixel is considered 6349 to be outside of the FOV, and will not be shown in the processed output image. 6350 1. Apply geometric distortion correction to get the post-distortion pixel coordinate, 6351 (x_i, y_i). When applying geometric correction metadata, note that metadata for raw 6352 buffers is defined relative to the top, left of the 6353 android.sensor.info.preCorrectionActiveArraySize rectangle. 6354 1. If the resulting corrected pixel coordinate is within the region given in 6355 android.sensor.info.activeArraySize, then the position of this pixel in the 6356 processed output image buffer is `(x_i - activeArray.left, y_i - activeArray.top)`, 6357 when the top, left coordinate of that buffer is treated as (0, 0). 6358 6359 Thus, for pixel x',y' = (25, 25) on a sensor where android.sensor.info.pixelArraySize 6360 is (100,100), android.sensor.info.preCorrectionActiveArraySize is (10, 10, 100, 100), 6361 android.sensor.info.activeArraySize is (20, 20, 80, 80), and the geometric distortion 6362 correction doesn't change the pixel coordinate, the resulting pixel selected in 6363 pixel coordinates would be x,y = (25, 25) relative to the top,left of the raw buffer 6364 with dimensions given in android.sensor.info.pixelArraySize, and would be (5, 5) 6365 relative to the top,left of post-processed YUV output buffer with dimensions given in 6366 android.sensor.info.activeArraySize. 
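When the distortion correction is a no-op, the mapping in the example above reduces to simple offset arithmetic. A minimal sketch under that identity-distortion assumption (illustrative only; `characteristics` is an already-obtained CameraCharacteristics):

    // Map a raw-buffer pixel to processed-output coordinates, assuming the
    // geometric distortion correction is the identity (no-op).
    Rect preCorrection = characteristics.get(
            CameraCharacteristics.SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE);
    Rect active = characteristics.get(
            CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
    Point raw = new Point(25, 25);  // (x', y') in full pixel array coordinates
    if (preCorrection.contains(raw.x, raw.y)) {
        Point corrected = raw;  // identity distortion: coordinate is unchanged
        if (active.contains(corrected.x, corrected.y)) {
            Point processed =
                    new Point(corrected.x - active.left, corrected.y - active.top);
            // For the example values above, processed is (5, 5).
        }
    }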
6367 6368 The currently supported fields that correct for geometric distortion are: 6369 6370 1. android.lens.radialDistortion. 6371 6372 If all of the geometric distortion fields are no-ops, this rectangle will be the same 6373 as the post-distortion-corrected rectangle given in 6374 android.sensor.info.activeArraySize. 6375 6376 This rectangle is defined relative to the full pixel array; (0,0) is the top-left of 6377 the full pixel array, and the size of the full pixel array is given by 6378 android.sensor.info.pixelArraySize. 6379 6380 The pre-correction active array may be smaller than the full pixel array, since the 6381 full array may include black calibration pixels or other inactive regions. 6382 </details> 6383 <hal_details> 6384 This array contains `(xmin, ymin, width, height)`. The `(xmin, ymin)` must be 6385 &gt;= `(0,0)`. 6386 The `(width, height)` must be &lt;= `android.sensor.info.pixelArraySize`. 6387 6388 If omitted by the HAL implementation, the camera framework will assume that this is 6389 the same as the post-correction active array region given in 6390 android.sensor.info.activeArraySize. 6391 </hal_details> 6392 <tag id="RAW" /> 6393 </entry> 6394 </namespace> 6395 <entry name="referenceIlluminant1" type="byte" visibility="public" 6396 enum="true"> 6397 <enum> 6398 <value id="1">DAYLIGHT</value> 6399 <value id="2">FLUORESCENT</value> 6400 <value id="3">TUNGSTEN 6401 <notes>Incandescent light</notes> 6402 </value> 6403 <value id="4">FLASH</value> 6404 <value id="9">FINE_WEATHER</value> 6405 <value id="10">CLOUDY_WEATHER</value> 6406 <value id="11">SHADE</value> 6407 <value id="12">DAYLIGHT_FLUORESCENT 6408 <notes>D 5700 - 7100K</notes> 6409 </value> 6410 <value id="13">DAY_WHITE_FLUORESCENT 6411 <notes>N 4600 - 5400K</notes> 6412 </value> 6413 <value id="14">COOL_WHITE_FLUORESCENT 6414 <notes>W 3900 - 4500K</notes> 6415 </value> 6416 <value id="15">WHITE_FLUORESCENT 6417 <notes>WW 3200 - 3700K</notes> 6418 </value> 6419 <value id="17">STANDARD_A</value> 6420 <value id="18">STANDARD_B</value> 6421 <value id="19">STANDARD_C</value> 6422 <value id="20">D55</value> 6423 <value id="21">D65</value> 6424 <value id="22">D75</value> 6425 <value id="23">D50</value> 6426 <value id="24">ISO_STUDIO_TUNGSTEN</value> 6427 </enum> 6428 <description> 6429 The standard reference illuminant used as the scene light source when 6430 calculating the android.sensor.colorTransform1, 6431 android.sensor.calibrationTransform1, and 6432 android.sensor.forwardMatrix1 matrices. 6433 </description> 6434 <details> 6435 The values in this key correspond to the values defined for the 6436 EXIF LightSource tag. These illuminants are standard light sources 6437 that are often used calibrating camera devices. 6438 6439 If this key is present, then android.sensor.colorTransform1, 6440 android.sensor.calibrationTransform1, and 6441 android.sensor.forwardMatrix1 will also be present. 6442 6443 Some devices may choose to provide a second set of calibration 6444 information for improved quality, including 6445 android.sensor.referenceIlluminant2 and its corresponding matrices. 6446 </details> 6447 <hal_details> 6448 The first reference illuminant (android.sensor.referenceIlluminant1) 6449 and corresponding matrices must be present to support the RAW capability 6450 and DNG output. 
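From the framework side, the presence of the optional second calibration set can be detected as follows (an illustrative sketch; `characteristics` is an already-obtained CameraCharacteristics):

    // Read the first-illuminant calibration set; the second set may be absent.
    Integer illuminant1 = characteristics.get(
            CameraCharacteristics.SENSOR_REFERENCE_ILLUMINANT1);
    ColorSpaceTransform colorTransform1 = characteristics.get(
            CameraCharacteristics.SENSOR_COLOR_TRANSFORM1);
    ColorSpaceTransform forwardMatrix1 = characteristics.get(
            CameraCharacteristics.SENSOR_FORWARD_MATRIX1);
    Byte illuminant2 = characteristics.get(
            CameraCharacteristics.SENSOR_REFERENCE_ILLUMINANT2);
    boolean hasSecondCalibration = (illuminant2 != null);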
6451 6452 When producing raw images with a color profile that has only been 6453 calibrated against a single light source, it is valid to omit 6454 android.sensor.referenceIlluminant2 along with the 6455 android.sensor.colorTransform2, android.sensor.calibrationTransform2, 6456 and android.sensor.forwardMatrix2 matrices. 6457 6458 If only android.sensor.referenceIlluminant1 is included, it should be 6459 chosen so that it is representative of typical scene lighting. In 6460 general, D50 or DAYLIGHT will be chosen for this case. 6461 6462 If both android.sensor.referenceIlluminant1 and 6463 android.sensor.referenceIlluminant2 are included, they should be 6464 chosen to represent the typical range of scene lighting conditions. 6465 In general, low color temperature illuminant such as Standard-A will 6466 be chosen for the first reference illuminant and a higher color 6467 temperature illuminant such as D65 will be chosen for the second 6468 reference illuminant. 6469 </hal_details> 6470 <tag id="RAW" /> 6471 </entry> 6472 <entry name="referenceIlluminant2" type="byte" visibility="public"> 6473 <description> 6474 The standard reference illuminant used as the scene light source when 6475 calculating the android.sensor.colorTransform2, 6476 android.sensor.calibrationTransform2, and 6477 android.sensor.forwardMatrix2 matrices. 6478 </description> 6479 <range>Any value listed in android.sensor.referenceIlluminant1</range> 6480 <details> 6481 See android.sensor.referenceIlluminant1 for more details. 6482 6483 If this key is present, then android.sensor.colorTransform2, 6484 android.sensor.calibrationTransform2, and 6485 android.sensor.forwardMatrix2 will also be present. 6486 </details> 6487 <tag id="RAW" /> 6488 </entry> 6489 <entry name="calibrationTransform1" type="rational" 6490 visibility="public" optional="true" 6491 type_notes="3x3 matrix in row-major-order" container="array" 6492 typedef="colorSpaceTransform"> 6493 <array> 6494 <size>3</size> 6495 <size>3</size> 6496 </array> 6497 <description> 6498 A per-device calibration transform matrix that maps from the 6499 reference sensor colorspace to the actual device sensor colorspace. 6500 </description> 6501 <details> 6502 This matrix is used to correct for per-device variations in the 6503 sensor colorspace, and is used for processing raw buffer data. 6504 6505 The matrix is expressed as a 3x3 matrix in row-major-order, and 6506 contains a per-device calibration transform that maps colors 6507 from reference sensor color space (i.e. the "golden module" 6508 colorspace) into this camera device's native sensor color 6509 space under the first reference illuminant 6510 (android.sensor.referenceIlluminant1). 6511 </details> 6512 <tag id="RAW" /> 6513 </entry> 6514 <entry name="calibrationTransform2" type="rational" 6515 visibility="public" optional="true" 6516 type_notes="3x3 matrix in row-major-order" container="array" 6517 typedef="colorSpaceTransform"> 6518 <array> 6519 <size>3</size> 6520 <size>3</size> 6521 </array> 6522 <description> 6523 A per-device calibration transform matrix that maps from the 6524 reference sensor colorspace to the actual device sensor colorspace 6525 (this is the colorspace of the raw buffer data). 6526 </description> 6527 <details> 6528 This matrix is used to correct for per-device variations in the 6529 sensor colorspace, and is used for processing raw buffer data. 
6530 6531 The matrix is expressed as a 3x3 matrix in row-major-order, and 6532 contains a per-device calibration transform that maps colors 6533 from reference sensor color space (i.e. the "golden module" 6534 colorspace) into this camera device's native sensor color 6535 space under the second reference illuminant 6536 (android.sensor.referenceIlluminant2). 6537 6538 This matrix will only be present if the second reference 6539 illuminant is present. 6540 </details> 6541 <tag id="RAW" /> 6542 </entry> 6543 <entry name="colorTransform1" type="rational" 6544 visibility="public" optional="true" 6545 type_notes="3x3 matrix in row-major-order" container="array" 6546 typedef="colorSpaceTransform"> 6547 <array> 6548 <size>3</size> 6549 <size>3</size> 6550 </array> 6551 <description> 6552 A matrix that transforms color values from CIE XYZ color space to 6553 reference sensor color space. 6554 </description> 6555 <details> 6556 This matrix is used to convert from the standard CIE XYZ color 6557 space to the reference sensor colorspace, and is used when processing 6558 raw buffer data. 6559 6560 The matrix is expressed as a 3x3 matrix in row-major-order, and 6561 contains a color transform matrix that maps colors from the CIE 6562 XYZ color space to the reference sensor color space (i.e. the 6563 "golden module" colorspace) under the first reference illuminant 6564 (android.sensor.referenceIlluminant1). 6565 6566 The white points chosen in both the reference sensor color space 6567 and the CIE XYZ colorspace when calculating this transform will 6568 match the standard white point for the first reference illuminant 6569 (i.e. no chromatic adaptation will be applied by this transform). 6570 </details> 6571 <tag id="RAW" /> 6572 </entry> 6573 <entry name="colorTransform2" type="rational" 6574 visibility="public" optional="true" 6575 type_notes="3x3 matrix in row-major-order" container="array" 6576 typedef="colorSpaceTransform"> 6577 <array> 6578 <size>3</size> 6579 <size>3</size> 6580 </array> 6581 <description> 6582 A matrix that transforms color values from CIE XYZ color space to 6583 reference sensor color space. 6584 </description> 6585 <details> 6586 This matrix is used to convert from the standard CIE XYZ color 6587 space to the reference sensor colorspace, and is used when processing 6588 raw buffer data. 6589 6590 The matrix is expressed as a 3x3 matrix in row-major-order, and 6591 contains a color transform matrix that maps colors from the CIE 6592 XYZ color space to the reference sensor color space (i.e. the 6593 "golden module" colorspace) under the second reference illuminant 6594 (android.sensor.referenceIlluminant2). 6595 6596 The white points chosen in both the reference sensor color space 6597 and the CIE XYZ colorspace when calculating this transform will 6598 match the standard white point for the second reference illuminant 6599 (i.e. no chromatic adaptation will be applied by this transform). 6600 6601 This matrix will only be present if the second reference 6602 illuminant is present. 6603 </details> 6604 <tag id="RAW" /> 6605 </entry> 6606 <entry name="forwardMatrix1" type="rational" 6607 visibility="public" optional="true" 6608 type_notes="3x3 matrix in row-major-order" container="array" 6609 typedef="colorSpaceTransform"> 6610 <array> 6611 <size>3</size> 6612 <size>3</size> 6613 </array> 6614 <description> 6615 A matrix that transforms white balanced camera colors from the reference 6616 sensor colorspace to the CIE XYZ colorspace with a D50 whitepoint. 
6617 </description> 6618 <details> 6619 This matrix is used to convert to the standard CIE XYZ colorspace, and 6620 is used when processing raw buffer data. 6621 6622 This matrix is expressed as a 3x3 matrix in row-major-order, and contains 6623 a color transform matrix that maps white balanced colors from the 6624 reference sensor color space to the CIE XYZ color space with a D50 white 6625 point. 6626 6627 Under the first reference illuminant (android.sensor.referenceIlluminant1) 6628 this matrix is chosen so that the standard white point for this reference 6629 illuminant in the reference sensor colorspace is mapped to D50 in the 6630 CIE XYZ colorspace. 6631 </details> 6632 <tag id="RAW" /> 6633 </entry> 6634 <entry name="forwardMatrix2" type="rational" 6635 visibility="public" optional="true" 6636 type_notes="3x3 matrix in row-major-order" container="array" 6637 typedef="colorSpaceTransform"> 6638 <array> 6639 <size>3</size> 6640 <size>3</size> 6641 </array> 6642 <description> 6643 A matrix that transforms white balanced camera colors from the reference 6644 sensor colorspace to the CIE XYZ colorspace with a D50 whitepoint. 6645 </description> 6646 <details> 6647 This matrix is used to convert to the standard CIE XYZ colorspace, and 6648 is used when processing raw buffer data. 6649 6650 This matrix is expressed as a 3x3 matrix in row-major-order, and contains 6651 a color transform matrix that maps white balanced colors from the 6652 reference sensor color space to the CIE XYZ color space with a D50 white 6653 point. 6654 6655 Under the second reference illuminant (android.sensor.referenceIlluminant2) 6656 this matrix is chosen so that the standard white point for this reference 6657 illuminant in the reference sensor colorspace is mapped to D50 in the 6658 CIE XYZ colorspace. 6659 6660 This matrix will only be present if the second reference 6661 illuminant is present. 6662 </details> 6663 <tag id="RAW" /> 6664 </entry> 6665 <entry name="baseGainFactor" type="rational" 6666 optional="true"> 6667 <description>Gain factor from electrons to raw units when 6668 ISO=100</description> 6669 <tag id="FUTURE" /> 6670 </entry> 6671 <entry name="blackLevelPattern" type="int32" visibility="public" 6672 optional="true" type_notes="2x2 raw count block" container="array" 6673 typedef="blackLevelPattern"> 6674 <array> 6675 <size>4</size> 6676 </array> 6677 <description> 6678 A fixed black level offset for each of the color filter arrangement 6679 (CFA) mosaic channels. 6680 </description> 6681 <range>&gt;= 0 for each.</range> 6682 <details> 6683 This key specifies the zero light value for each of the CFA mosaic 6684 channels in the camera sensor. The maximal value output by the 6685 sensor is represented by the value in android.sensor.info.whiteLevel. 6686 6687 The values are given in the same order as channels listed for the CFA 6688 layout key (see android.sensor.info.colorFilterArrangement), i.e. the 6689 nth value given corresponds to the black level offset for the nth 6690 color channel listed in the CFA. 6691 6692 The black level values of captured images may vary for different 6693 capture settings (e.g., android.sensor.sensitivity). This key 6694 represents a coarse approximation for such case. It is recommended to 6695 use android.sensor.dynamicBlackLevel or use pixels from 6696 android.sensor.opticalBlackRegions directly for captures when 6697 supported by the camera device, which provides more accurate black 6698 level values. 
For raw capture in particular, it is recommended to use 6699 pixels from android.sensor.opticalBlackRegions to calculate black 6700 level values for each frame. 6701 </details> 6702 <hal_details> 6703 The values are given in row-column scan order, with the first value 6704 corresponding to the element of the CFA in row=0, column=0. 6705 </hal_details> 6706 <tag id="RAW" /> 6707 </entry> 6708 <entry name="maxAnalogSensitivity" type="int32" visibility="public" 6709 optional="true" hwlevel="full"> 6710 <description>Maximum sensitivity that is implemented 6711 purely through analog gain.</description> 6712 <details>For android.sensor.sensitivity values less than or 6713 equal to this, all applied gain must be analog. For 6714 values above this, the gain applied can be a mix of analog and 6715 digital.</details> 6716 <tag id="V1" /> 6717 <tag id="FULL" /> 6718 </entry> 6719 <entry name="orientation" type="int32" visibility="public" 6720 hwlevel="legacy"> 6721 <description>Clockwise angle through which the output image needs to be rotated to be 6722 upright on the device screen in its native orientation. 6723 </description> 6724 <units>Degrees of clockwise rotation; always a multiple of 6725 90</units> 6726 <range>0, 90, 180, 270</range> 6727 <details> 6728 Also defines the direction of rolling shutter readout, which is from top to bottom in 6729 the sensor's coordinate system. 6730 </details> 6731 <tag id="BC" /> 6732 </entry> 6733 <entry name="profileHueSatMapDimensions" type="int32" 6734 visibility="system" optional="true" 6735 type_notes="Number of samples for hue, saturation, and value" 6736 container="array"> 6737 <array> 6738 <size>3</size> 6739 </array> 6740 <description> 6741 The number of input samples for each dimension of 6742 android.sensor.profileHueSatMap. 6743 </description> 6744 <range> 6745 Hue &gt;= 1, 6746 Saturation &gt;= 2, 6747 Value &gt;= 1 6748 </range> 6749 <details> 6750 The number of input samples for the hue, saturation, and value 6751 dimension of android.sensor.profileHueSatMap. The order of the 6752 dimensions given is hue, saturation, value; where hue is the 0th 6753 element. 6754 </details> 6755 <tag id="RAW" /> 6756 </entry> 6757 </static> 6758 <dynamic> 6759 <clone entry="android.sensor.exposureTime" kind="controls"> 6760 </clone> 6761 <clone entry="android.sensor.frameDuration" 6762 kind="controls"></clone> 6763 <clone entry="android.sensor.sensitivity" kind="controls"> 6764 </clone> 6765 <entry name="timestamp" type="int64" visibility="public" 6766 hwlevel="legacy"> 6767 <description>Time at start of exposure of first 6768 row of the image sensor active array, in nanoseconds.</description> 6769 <units>Nanoseconds</units> 6770 <range>&gt; 0</range> 6771 <details>The timestamps are also included in all image 6772 buffers produced for the same capture, and will be identical 6773 on all the outputs. 6774 6775 When android.sensor.info.timestampSource `==` UNKNOWN, 6776 the timestamps measure time since an unspecified starting point, 6777 and are monotonically increasing. They can be compared with the 6778 timestamps for other captures from the same camera device, but are 6779 not guaranteed to be comparable to any other time source. 6780 6781 When android.sensor.info.timestampSource `==` REALTIME, the 6782 timestamps measure time in the same timebase as {@link 6783 android.os.SystemClock#elapsedRealtimeNanos}, and they can 6784 be compared to other timestamps from other subsystems that 6785 are using that base. 
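As a rough application-side sketch (not part of the HAL contract), a sensor timestamp can be related to the system clock only when the timestamp source is REALTIME; `characteristics` and `result` are assumed to be the CameraCharacteristics and TotalCaptureResult for this device and capture:

    // Only meaningful when SENSOR_INFO_TIMESTAMP_SOURCE is REALTIME.
    Integer source = characteristics.get(
        CameraCharacteristics.SENSOR_INFO_TIMESTAMP_SOURCE);
    long sensorTimestampNs = result.get(CaptureResult.SENSOR_TIMESTAMP);
    if (source != null
            && source == CameraMetadata.SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME) {
        // Approximate time elapsed since the first row began exposing.
        long ageNs = SystemClock.elapsedRealtimeNanos() - sensorTimestampNs;
    }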
6786 6787 For reprocessing, the timestamp will match the start of exposure of 6788 the input image, i.e. {@link CaptureResult#SENSOR_TIMESTAMP the 6789 timestamp} in the TotalCaptureResult that was used to create the 6790 reprocess capture request. 6791 </details> 6792 <hal_details> 6793 All timestamps must be in reference to the kernel's 6794 CLOCK_BOOTTIME monotonic clock, which properly accounts for 6795 time spent asleep. This allows for synchronization with 6796 sensors that continue to operate while the system is 6797 otherwise asleep. 6798 6799 If android.sensor.info.timestampSource `==` REALTIME, 6800 The timestamp must be synchronized with the timestamps from other 6801 sensor subsystems that are using the same timebase. 6802 6803 For reprocessing, the input image's start of exposure can be looked up 6804 with android.sensor.timestamp from the metadata included in the 6805 capture request. 6806 </hal_details> 6807 <tag id="BC" /> 6808 </entry> 6809 <entry name="temperature" type="float" 6810 optional="true"> 6811 <description>The temperature of the sensor, sampled at the time 6812 exposure began for this frame. 6813 6814 The thermal diode being queried should be inside the sensor PCB, or 6815 somewhere close to it. 6816 </description> 6817 6818 <units>Celsius</units> 6819 <range>Optional. This value is missing if no temperature is available.</range> 6820 <tag id="FUTURE" /> 6821 </entry> 6822 <entry name="neutralColorPoint" type="rational" visibility="public" 6823 optional="true" container="array"> 6824 <array> 6825 <size>3</size> 6826 </array> 6827 <description> 6828 The estimated camera neutral color in the native sensor colorspace at 6829 the time of capture. 6830 </description> 6831 <details> 6832 This value gives the neutral color point encoded as an RGB value in the 6833 native sensor color space. The neutral color point indicates the 6834 currently estimated white point of the scene illumination. It can be 6835 used to interpolate between the provided color transforms when 6836 processing raw sensor data. 6837 6838 The order of the values is R, G, B; where R is in the lowest index. 6839 </details> 6840 <tag id="RAW" /> 6841 </entry> 6842 <entry name="noiseProfile" type="double" visibility="public" 6843 optional="true" type_notes="Pairs of noise model coefficients" 6844 container="array" typedef="pairDoubleDouble"> 6845 <array> 6846 <size>2</size> 6847 <size>CFA Channels</size> 6848 </array> 6849 <description> 6850 Noise model coefficients for each CFA mosaic channel. 6851 </description> 6852 <details> 6853 This key contains two noise model coefficients for each CFA channel 6854 corresponding to the sensor amplification (S) and sensor readout 6855 noise (O). These are given as pairs of coefficients for each channel 6856 in the same order as channels listed for the CFA layout key 6857 (see android.sensor.info.colorFilterArrangement). This is 6858 represented as an array of Pair&lt;Double, Double&gt;, where 6859 the first member of the Pair at index n is the S coefficient and the 6860 second member is the O coefficient for the nth color channel in the CFA. 6861 6862 These coefficients are used in a two parameter noise model to describe 6863 the amount of noise present in the image for each CFA channel. The 6864 noise model used here is: 6865 6866 N(x) = sqrt(Sx + O) 6867 6868 Where x represents the recorded signal of a CFA channel normalized to 6869 the range [0, 1], and S and O are the noise model coeffiecients for 6870 that channel. 
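As a rough sketch of evaluating the two-parameter model above (illustration only):

    // s and o are the android.sensor.noiseProfile coefficients for one
    // CFA channel; x is the recorded signal normalized to [0, 1].
    double expectedNoise(double x, double s, double o) {
        // N(x) = sqrt(S*x + O)
        return Math.sqrt(s * x + o);
    }

This only evaluates the model; see the DNG NoiseProfile documentation referenced below for its full interpretation.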
6871 6872 A more detailed description of the noise model can be found in the 6873 Adobe DNG specification for the NoiseProfile tag. 6874 </details> 6875 <hal_details> 6876 For a CFA layout of RGGB, the list of coefficients would be given as 6877 an array of doubles S0,O0,S1,O1,..., where S0 and O0 are the coefficients 6878 for the red channel, S1 and O1 are the coefficients for the first green 6879 channel, etc. 6880 </hal_details> 6881 <tag id="RAW" /> 6882 </entry> 6883 <entry name="profileHueSatMap" type="float" 6884 visibility="system" optional="true" 6885 type_notes="Mapping for hue, saturation, and value" 6886 container="array"> 6887 <array> 6888 <size>hue_samples</size> 6889 <size>saturation_samples</size> 6890 <size>value_samples</size> 6891 <size>3</size> 6892 </array> 6893 <description> 6894 A mapping containing a hue shift, saturation scale, and value scale 6895 for each pixel. 6896 </description> 6897 <units> 6898 The hue shift is given in degrees; saturation and value scale factors are 6899 unitless and are between 0 and 1 inclusive 6900 </units> 6901 <details> 6902 hue_samples, saturation_samples, and value_samples are given in 6903 android.sensor.profileHueSatMapDimensions. 6904 6905 Each entry of this map contains three floats corresponding to the 6906 hue shift, saturation scale, and value scale, respectively; where the 6907 hue shift has the lowest index. The map entries are stored in the key 6908 in nested loop order, with the value divisions in the outer loop, the 6909 hue divisions in the middle loop, and the saturation divisions in the 6910 inner loop. All zero input saturation entries are required to have a 6911 value scale factor of 1.0. 6912 </details> 6913 <tag id="RAW" /> 6914 </entry> 6915 <entry name="profileToneCurve" type="float" 6916 visibility="system" optional="true" 6917 type_notes="Samples defining a spline for a tone-mapping curve" 6918 container="array"> 6919 <array> 6920 <size>samples</size> 6921 <size>2</size> 6922 </array> 6923 <description> 6924 A list of x,y samples defining a tone-mapping curve for gamma adjustment. 6925 </description> 6926 <range> 6927 Each sample has an input range of `[0, 1]` and an output range of 6928 `[0, 1]`. The first sample is required to be `(0, 0)`, and the last 6929 sample is required to be `(1, 1)`. 6930 </range> 6931 <details> 6932 This key contains a default tone curve that can be applied while 6933 processing the image as a starting point for user adjustments. 6934 The curve is specified as a list of value pairs in linear gamma. 6935 The curve is interpolated using a cubic spline. 6936 </details> 6937 <tag id="RAW" /> 6938 </entry> 6939 <entry name="greenSplit" type="float" visibility="public" optional="true"> 6940 <description> 6941 The worst-case divergence between Bayer green channels. 6942 </description> 6943 <range> 6944 &gt;= 0 6945 </range> 6946 <details> 6947 This value is an estimate of the worst case split between the 6948 Bayer green channels in the red and blue rows in the sensor color 6949 filter array. 6950 6951 The green split is calculated as follows: 6952 6953 1. A 5x5 pixel (or larger) window W within the active sensor array is 6954 chosen. The term 'pixel' here is taken to mean a group of 4 Bayer 6955 mosaic channels (R, Gr, Gb, B). The location and size of the window 6956 chosen is implementation defined, and should be chosen to provide a 6957 green split estimate that is both representative of the entire image 6958 for this camera sensor, and can be calculated quickly. 6959 1. 
The arithmetic mean of the green channels from the red 6960 rows (mean_Gr) within W is computed. 6961 1. The arithmetic mean of the green channels from the blue 6962 rows (mean_Gb) within W is computed. 6963 1. The maximum ratio R of the two means is computed as follows: 6964 `R = max((mean_Gr + 1)/(mean_Gb + 1), (mean_Gb + 1)/(mean_Gr + 1))` 6965 6966 The ratio R is the green split divergence reported for this property, 6967 which represents how much the green channels differ in the mosaic 6968 pattern. This value is typically used to determine the treatment of 6969 the green mosaic channels when demosaicing. 6970 6971 The green split value can be roughly interpreted as follows: 6972 6973 * R &lt; 1.03 is a negligible split (&lt;3% divergence). 6974 * 1.03 &lt;= R &lt;= 1.20 will require some software 6975 correction to avoid demosaic errors (3-20% divergence). 6976 * R &gt; 1.20 will require strong software correction to produce 6977 a usable image (&gt;20% divergence). 6978 </details> 6979 <hal_details> 6980 The green split given may be a static value based on prior 6981 characterization of the camera sensor using the green split 6982 calculation method given here over a large, representative, sample 6983 set of images. Other methods of calculation that produce equivalent 6984 results, and can be interpreted in the same manner, may be used. 6985 </hal_details> 6986 <tag id="RAW" /> 6987 </entry> 6988 </dynamic> 6989 <controls> 6990 <entry name="testPatternData" type="int32" visibility="public" optional="true" container="array"> 6991 <array> 6992 <size>4</size> 6993 </array> 6994 <description> 6995 A pixel `[R, G_even, G_odd, B]` that supplies the test pattern 6996 when android.sensor.testPatternMode is SOLID_COLOR. 6997 </description> 6998 <details> 6999 Each color channel is treated as an unsigned 32-bit integer. 7000 The camera device then uses the most significant X bits 7001 that correspond to how many bits are in its Bayer raw sensor 7002 output. 7003 7004 For example, a sensor with RAW10 Bayer output would use the 7005 10 most significant bits from each color channel. 7006 </details> 7007 <hal_details> 7008 </hal_details> 7009 </entry> 7010 <entry name="testPatternMode" type="int32" visibility="public" optional="true" 7011 enum="true"> 7012 <enum> 7013 <value>OFF 7014 <notes>No test pattern mode is used, and the camera 7015 device returns captures from the image sensor. 7016 7017 This is the default if the key is not set.</notes> 7018 </value> 7019 <value>SOLID_COLOR 7020 <notes> 7021 Each pixel in `[R, G_even, G_odd, B]` is replaced by its 7022 respective color channel provided in 7023 android.sensor.testPatternData. 7024 7025 For example: 7026 7027 android.sensor.testPatternData = [0, 0xFFFFFFFF, 0xFFFFFFFF, 0] 7028 7029 All green pixels are 100% green. All red/blue pixels are black. 7030 7031 android.sensor.testPatternData = [0xFFFFFFFF, 0, 0xFFFFFFFF, 0] 7032 7033 All red pixels are 100% red. Only the odd green pixels 7034 are 100% green. All blue pixels are 100% black. 7035 </notes> 7036 </value> 7037 <value>COLOR_BARS 7038 <notes> 7039 All pixel data is replaced with an 8-bar color pattern. 7040 7041 The vertical bars (left-to-right) are as follows: 7042 7043 * 100% white 7044 * yellow 7045 * cyan 7046 * green 7047 * magenta 7048 * red 7049 * blue 7050 * black 7051 7052 In general the image would look like the following: 7053 7054 W Y C G M R B K 7055 W Y C G M R B K 7056 W Y C G M R B K 7057 W Y C G M R B K 7058 W Y C G M R B K 7059 . . . . . . . . 7060 . . . . . . . . 7061 . . . . . .
. . 7062 7063 (B = Blue, K = Black) 7064 7065 Each bar should take up 1/8 of the sensor pixel array width. 7066 When this is not possible, the bar size should be rounded 7067 down to the nearest integer and the pattern can repeat 7068 on the right side. 7069 7070 Each bar's height must always take up the full sensor 7071 pixel array height. 7072 7073 Each pixel in this test pattern must be set to either 7074 0% intensity or 100% intensity. 7075 </notes> 7076 </value> 7077 <value>COLOR_BARS_FADE_TO_GRAY 7078 <notes> 7079 The test pattern is similar to COLOR_BARS, except that 7080 each bar should start at its specified color at the top, 7081 and fade to gray at the bottom. 7082 7083 Furthermore each bar is further subdivided into a left and 7084 right half. The left half should have a smooth gradient, 7085 and the right half should have a quantized gradient. 7086 7087 In particular, the right half's should consist of blocks of the 7088 same color for 1/16th active sensor pixel array width. 7089 7090 The least significant bits in the quantized gradient should 7091 be copied from the most significant bits of the smooth gradient. 7092 7093 The height of each bar should always be a multiple of 128. 7094 When this is not the case, the pattern should repeat at the bottom 7095 of the image. 7096 </notes> 7097 </value> 7098 <value>PN9 7099 <notes> 7100 All pixel data is replaced by a pseudo-random sequence 7101 generated from a PN9 512-bit sequence (typically implemented 7102 in hardware with a linear feedback shift register). 7103 7104 The generator should be reset at the beginning of each frame, 7105 and thus each subsequent raw frame with this test pattern should 7106 be exactly the same as the last. 7107 </notes> 7108 </value> 7109 <value id="256">CUSTOM1 7110 <notes>The first custom test pattern. All custom patterns that are 7111 available only on this camera device are at least this numeric 7112 value. 7113 7114 All of the custom test patterns will be static 7115 (that is the raw image must not vary from frame to frame). 7116 </notes> 7117 </value> 7118 </enum> 7119 <description>When enabled, the sensor sends a test pattern instead of 7120 doing a real exposure from the camera. 7121 </description> 7122 <range>android.sensor.availableTestPatternModes</range> 7123 <details> 7124 When a test pattern is enabled, all manual sensor controls specified 7125 by android.sensor.* will be ignored. All other controls should 7126 work as normal. 7127 7128 For example, if manual flash is enabled, flash firing should still 7129 occur (and that the test pattern remain unmodified, since the flash 7130 would not actually affect it). 7131 7132 Defaults to OFF. 7133 </details> 7134 <hal_details> 7135 All test patterns are specified in the Bayer domain. 7136 7137 The HAL may choose to substitute test patterns from the sensor 7138 with test patterns from on-device memory. In that case, it should be 7139 indistinguishable to the ISP whether the data came from the 7140 sensor interconnect bus (such as CSI2) or memory. 
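As an illustrative sketch only (not a required implementation), the SOLID_COLOR quantization described for android.sensor.testPatternData amounts to keeping the most significant bits of each unsigned 32-bit channel value:

    // rawBits is the sensor's Bayer raw bit depth (e.g. 10 for RAW10).
    int solidColorSample(long channel32, int rawBits) {
        // Keep the rawBits most significant bits of the 32-bit value.
        return (int) ((channel32 & 0xFFFFFFFFL) >> (32 - rawBits));
    }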
7141 </hal_details> 7142 </entry> 7143 </controls> 7144 <dynamic> 7145 <clone entry="android.sensor.testPatternData" kind="controls"> 7146 </clone> 7147 <clone entry="android.sensor.testPatternMode" kind="controls"> 7148 </clone> 7149 </dynamic> 7150 <static> 7151 <entry name="availableTestPatternModes" type="int32" visibility="public" optional="true" 7152 type_notes="list of enums" container="array"> 7153 <array> 7154 <size>n</size> 7155 </array> 7156 <description>List of sensor test pattern modes for android.sensor.testPatternMode 7157 supported by this camera device. 7158 </description> 7159 <range>Any value listed in android.sensor.testPatternMode</range> 7160 <details> 7161 Defaults to OFF, and always includes OFF if defined. 7162 </details> 7163 <hal_details> 7164 All custom modes must be &gt;= CUSTOM1. 7165 </hal_details> 7166 </entry> 7167 </static> 7168 <dynamic> 7169 <entry name="rollingShutterSkew" type="int64" visibility="public" hwlevel="limited"> 7170 <description>Duration between the start of first row exposure 7171 and the start of last row exposure.</description> 7172 <units>Nanoseconds</units> 7173 <range> &gt;= 0 and &lt; 7174 {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration}.</range> 7175 <details> 7176 This is the exposure time skew between the first and last 7177 row exposure start times. The first row and the last row are 7178 the first and last rows inside of the 7179 android.sensor.info.activeArraySize. 7180 7181 For typical camera sensors that use rolling shutters, this is also equivalent 7182 to the frame readout time. 7183 </details> 7184 <hal_details> 7185 The HAL must report `0` if the sensor is using global shutter, where all pixels begin 7186 exposure at the same time. 7187 </hal_details> 7188 <tag id="V1" /> 7189 </entry> 7190 </dynamic> 7191 <static> 7192 <entry name="opticalBlackRegions" type="int32" visibility="public" optional="true" 7193 container="array" typedef="rectangle"> 7194 <array> 7195 <size>4</size> 7196 <size>num_regions</size> 7197 </array> 7198 <description>List of disjoint rectangles indicating the sensor 7199 optically shielded black pixel regions. 7200 </description> 7201 <details> 7202 In most camera sensors, the active array is surrounded by some 7203 optically shielded pixel areas. By blocking light, these pixels 7204 provide a reliable black reference for black level compensation 7205 in the active array region. 7206 7207 This key provides a list of disjoint rectangles specifying the 7208 regions of optically shielded (with metal shield) black pixel 7209 regions if the camera device is capable of reading out these black 7210 pixels in the output raw images. In comparison to the fixed black 7211 level values reported by android.sensor.blackLevelPattern, this key 7212 may provide a more accurate way for the application to calculate 7213 the black level of each captured raw image. 7214 7215 When this key is reported, the android.sensor.dynamicBlackLevel and 7216 android.sensor.dynamicWhiteLevel will also be reported. 7217 </details> 7218 <hal_details> 7219 This array contains (xmin, ymin, width, height). The (xmin, ymin) 7220 must be &gt;= (0,0) and &lt;= 7221 android.sensor.info.pixelArraySize. The (width, height) must be 7222 &lt;= android.sensor.info.pixelArraySize. Each region must be 7223 outside the region reported by 7224 android.sensor.info.preCorrectionActiveArraySize. 7225 7226 The HAL must report the minimal number of disjoint regions for the 7227 optically shielded black pixel regions.
For example, if a region can 7228 be covered by one rectangle, the HAL must not split this region into 7229 multiple rectangles. 7230 </hal_details> 7231 </entry> 7232 </static> 7233 <dynamic> 7234 <entry name="dynamicBlackLevel" type="float" visibility="public" 7235 optional="true" type_notes="2x2 raw count block" container="array"> 7236 <array> 7237 <size>4</size> 7238 </array> 7239 <description> 7240 A per-frame dynamic black level offset for each of the color filter 7241 arrangement (CFA) mosaic channels. 7242 </description> 7243 <range>&gt;= 0 for each.</range> 7244 <details> 7245 Camera sensor black levels may vary dramatically for different 7246 capture settings (e.g. android.sensor.sensitivity). The fixed black 7247 level reported by android.sensor.blackLevelPattern may be too 7248 inaccurate to represent the actual value on a per-frame basis. The 7249 camera device internal pipeline relies on reliable black level values 7250 to process the raw images appropriately. To get the best image 7251 quality, the camera device may choose to estimate the per frame black 7252 level values either based on optically shielded black regions 7253 (android.sensor.opticalBlackRegions) or its internal model. 7254 7255 This key reports the camera device estimated per-frame zero light 7256 value for each of the CFA mosaic channels in the camera sensor. The 7257 android.sensor.blackLevelPattern may only represent a coarse 7258 approximation of the actual black level values. This value is the 7259 black level used in camera device internal image processing pipeline 7260 and generally more accurate than the fixed black level values. 7261 However, since they are estimated values by the camera device, they 7262 may not be as accurate as the black level values calculated from the 7263 optical black pixels reported by android.sensor.opticalBlackRegions. 7264 7265 The values are given in the same order as channels listed for the CFA 7266 layout key (see android.sensor.info.colorFilterArrangement), i.e. the 7267 nth value given corresponds to the black level offset for the nth 7268 color channel listed in the CFA. 7269 7270 This key will be available if android.sensor.opticalBlackRegions is 7271 available or the camera device advertises this key via 7272 {@link android.hardware.camera2.CameraCharacteristics#getAvailableCaptureResultKeys}. 7273 </details> 7274 <hal_details> 7275 The values are given in row-column scan order, with the first value 7276 corresponding to the element of the CFA in row=0, column=0. 7277 </hal_details> 7278 <tag id="RAW" /> 7279 </entry> 7280 <entry name="dynamicWhiteLevel" type="int32" visibility="public" 7281 optional="true" > 7282 <description> 7283 Maximum raw value output by sensor for this frame. 7284 </description> 7285 <range> &gt;= 0</range> 7286 <details> 7287 Since the android.sensor.blackLevelPattern may change for different 7288 capture settings (e.g., android.sensor.sensitivity), the white 7289 level will change accordingly. This key is similar to 7290 android.sensor.info.whiteLevel, but specifies the camera device 7291 estimated white level for each frame. 7292 7293 This key will be available if android.sensor.opticalBlackRegions is 7294 available or the camera device advertises this key via 7295 {@link android.hardware.camera2.CameraCharacteristics#getAvailableCaptureRequestKeys}. 
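As a rough application-side sketch, a raw sample can be normalized with these per-frame levels, falling back to the static android.sensor.blackLevelPattern and android.sensor.info.whiteLevel values when the dynamic keys are absent; `channel`, `rawValue`, `staticBlack`, and `staticWhite` are placeholders supplied by the application:

    float[] dynBlack = result.get(CaptureResult.SENSOR_DYNAMIC_BLACK_LEVEL);
    Integer dynWhite = result.get(CaptureResult.SENSOR_DYNAMIC_WHITE_LEVEL);
    float black = (dynBlack != null) ? dynBlack[channel] : staticBlack[channel];
    float white = (dynWhite != null) ? dynWhite : staticWhite;
    // Normalize the raw sample for this CFA channel to [0, 1].
    float normalized = (rawValue - black) / (white - black);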
7296 </details> 7297 <hal_details> 7298 The full bit depth of the sensor must be available in the raw data, 7299 so the value for linear sensors should not be significantly lower 7300 than maximum raw value supported, i.e. 2^(sensor bits per pixel). 7301 </hal_details> 7302 <tag id="RAW" /> 7303 </entry> 7304 </dynamic> 7305 <static> 7306 <entry name="opaqueRawSize" type="int32" visibility="system" container="array"> 7307 <array> 7308 <size>n</size> 7309 <size>3</size> 7310 </array> 7311 <description>Size in bytes for all the listed opaque RAW buffer sizes</description> 7312 <range>Must be large enough to fit the opaque RAW of corresponding size produced by 7313 the camera</range> 7314 <details> 7315 This configurations are listed as `(width, height, size_in_bytes)` tuples. 7316 This is used for sizing the gralloc buffers for opaque RAW buffers. 7317 All RAW_OPAQUE output stream configuration listed in 7318 android.scaler.availableStreamConfigurations will have a corresponding tuple in 7319 this key. 7320 </details> 7321 <hal_details> 7322 This key is added in HAL3.4. 7323 For HAL3.4 or above: devices advertising RAW_OPAQUE format output must list this key. 7324 For HAL3.3 or earlier devices: if RAW_OPAQUE ouput is advertised, camera framework 7325 will derive this key by assuming each pixel takes two bytes and no padding bytes 7326 between rows. 7327 </hal_details> 7328 </entry> 7329 </static> 7330 </section> 7331 <section name="shading"> 7332 <controls> 7333 <entry name="mode" type="byte" visibility="public" enum="true" hwlevel="full"> 7334 <enum> 7335 <value>OFF 7336 <notes>No lens shading correction is applied.</notes></value> 7337 <value>FAST 7338 <notes>Apply lens shading corrections, without slowing 7339 frame rate relative to sensor raw output</notes></value> 7340 <value>HIGH_QUALITY 7341 <notes>Apply high-quality lens shading correction, at the 7342 cost of possibly reduced frame rate.</notes></value> 7343 </enum> 7344 <description>Quality of lens shading correction applied 7345 to the image data.</description> 7346 <range>android.shading.availableModes</range> 7347 <details> 7348 When set to OFF mode, no lens shading correction will be applied by the 7349 camera device, and an identity lens shading map data will be provided 7350 if `android.statistics.lensShadingMapMode == ON`. For example, for lens 7351 shading map with size of `[ 4, 3 ]`, 7352 the output android.statistics.lensShadingCorrectionMap for this case will be an identity 7353 map shown below: 7354 7355 [ 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 7356 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 7357 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 7358 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 7359 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 7360 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0 ] 7361 7362 When set to other modes, lens shading correction will be applied by the camera 7363 device. Applications can request lens shading map data by setting 7364 android.statistics.lensShadingMapMode to ON, and then the camera device will provide lens 7365 shading map data in android.statistics.lensShadingCorrectionMap; the returned shading map 7366 data will be the one applied by the camera device for this capture request. 7367 7368 The shading map data may depend on the auto-exposure (AE) and AWB statistics, therefore 7369 the reliability of the map data may be affected by the AE and AWB algorithms. 
When AE and 7370 AWB are in AUTO modes(android.control.aeMode `!=` OFF and android.control.awbMode `!=` 7371 OFF), to get best results, it is recommended that the applications wait for the AE and AWB 7372 to be converged before using the returned shading map data. 7373 </details> 7374 </entry> 7375 <entry name="strength" type="byte"> 7376 <description>Control the amount of shading correction 7377 applied to the images</description> 7378 <units>unitless: 1-10; 10 is full shading 7379 compensation</units> 7380 <tag id="FUTURE" /> 7381 </entry> 7382 </controls> 7383 <dynamic> 7384 <clone entry="android.shading.mode" kind="controls"> 7385 </clone> 7386 </dynamic> 7387 <static> 7388 <entry name="availableModes" type="byte" visibility="public" 7389 type_notes="List of enums (android.shading.mode)." container="array" 7390 typedef="enumList" hwlevel="legacy"> 7391 <array> 7392 <size>n</size> 7393 </array> 7394 <description> 7395 List of lens shading modes for android.shading.mode that are supported by this camera device. 7396 </description> 7397 <range>Any value listed in android.shading.mode</range> 7398 <details> 7399 This list contains lens shading modes that can be set for the camera device. 7400 Camera devices that support the MANUAL_POST_PROCESSING capability will always 7401 list OFF and FAST mode. This includes all FULL level devices. 7402 LEGACY devices will always only support FAST mode. 7403 </details> 7404 <hal_details> 7405 HAL must support both FAST and HIGH_QUALITY if lens shading correction control is 7406 available on the camera device, but the underlying implementation can be the same for 7407 both modes. That is, if the highest quality implementation on the camera device does not 7408 slow down capture rate, then FAST and HIGH_QUALITY will generate the same output. 7409 </hal_details> 7410 </entry> 7411 </static> 7412 </section> 7413 <section name="statistics"> 7414 <controls> 7415 <entry name="faceDetectMode" type="byte" visibility="public" enum="true" 7416 hwlevel="legacy"> 7417 <enum> 7418 <value>OFF 7419 <notes>Do not include face detection statistics in capture 7420 results.</notes></value> 7421 <value optional="true">SIMPLE 7422 <notes>Return face rectangle and confidence values only. 7423 </notes></value> 7424 <value optional="true">FULL 7425 <notes>Return all face 7426 metadata. 7427 7428 In this mode, face rectangles, scores, landmarks, and face IDs are all valid. 7429 </notes></value> 7430 </enum> 7431 <description>Operating mode for the face detector 7432 unit.</description> 7433 <range>android.statistics.info.availableFaceDetectModes</range> 7434 <details>Whether face detection is enabled, and whether it 7435 should output just the basic fields or the full set of 7436 fields.</details> 7437 <hal_details> 7438 SIMPLE mode must fill in android.statistics.faceRectangles and 7439 android.statistics.faceScores. 7440 FULL mode must also fill in android.statistics.faceIds, and 7441 android.statistics.faceLandmarks. 
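For reference, the application-side view of these modes is roughly as follows (illustrative sketch only):

    requestBuilder.set(CaptureRequest.STATISTICS_FACE_DETECT_MODE,
            CameraMetadata.STATISTICS_FACE_DETECT_MODE_SIMPLE);
    // ... later, in the corresponding capture result:
    Face[] faces = result.get(CaptureResult.STATISTICS_FACES);
    for (Face face : faces) {
        Rect bounds = face.getBounds(); // valid whenever the mode is not OFF
        int score = face.getScore();    // 1-100
        // face.getId() and the landmark getters are only valid in FULL mode.
    }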
7442 </hal_details> 7443 <tag id="BC" /> 7444 </entry> 7445 <entry name="histogramMode" type="byte" enum="true" typedef="boolean"> 7446 <enum> 7447 <value>OFF</value> 7448 <value>ON</value> 7449 </enum> 7450 <description>Operating mode for histogram 7451 generation</description> 7452 <tag id="FUTURE" /> 7453 </entry> 7454 <entry name="sharpnessMapMode" type="byte" enum="true" typedef="boolean"> 7455 <enum> 7456 <value>OFF</value> 7457 <value>ON</value> 7458 </enum> 7459 <description>Operating mode for sharpness map 7460 generation</description> 7461 <tag id="FUTURE" /> 7462 </entry> 7463 <entry name="hotPixelMapMode" type="byte" visibility="public" enum="true" 7464 typedef="boolean"> 7465 <enum> 7466 <value>OFF 7467 <notes>Hot pixel map production is disabled. 7468 </notes></value> 7469 <value>ON 7470 <notes>Hot pixel map production is enabled. 7471 </notes></value> 7472 </enum> 7473 <description> 7474 Operating mode for hot pixel map generation. 7475 </description> 7476 <range>android.statistics.info.availableHotPixelMapModes</range> 7477 <details> 7478 If set to `true`, a hot pixel map is returned in android.statistics.hotPixelMap. 7479 If set to `false`, no hot pixel map will be returned. 7480 </details> 7481 <tag id="V1" /> 7482 <tag id="RAW" /> 7483 </entry> 7484 </controls> 7485 <static> 7486 <namespace name="info"> 7487 <entry name="availableFaceDetectModes" type="byte" 7488 visibility="public" 7489 type_notes="List of enums from android.statistics.faceDetectMode" 7490 container="array" 7491 typedef="enumList" 7492 hwlevel="legacy"> 7493 <array> 7494 <size>n</size> 7495 </array> 7496 <description>List of face detection modes for android.statistics.faceDetectMode that are 7497 supported by this camera device. 7498 </description> 7499 <range>Any value listed in android.statistics.faceDetectMode</range> 7500 <details>OFF is always supported. 7501 </details> 7502 </entry> 7503 <entry name="histogramBucketCount" type="int32"> 7504 <description>Number of histogram buckets 7505 supported</description> 7506 <range>&gt;= 64</range> 7507 <tag id="FUTURE" /> 7508 </entry> 7509 <entry name="maxFaceCount" type="int32" visibility="public" hwlevel="legacy"> 7510 <description>The maximum number of simultaneously detectable 7511 faces.</description> 7512 <range>0 for cameras without available face detection; otherwise: 7513 `>=4` for LIMITED or FULL hwlevel devices or 7514 `>0` for LEGACY devices.</range> 7515 <tag id="BC" /> 7516 </entry> 7517 <entry name="maxHistogramCount" type="int32"> 7518 <description>Maximum value possible for a histogram 7519 bucket</description> 7520 <tag id="FUTURE" /> 7521 </entry> 7522 <entry name="maxSharpnessMapValue" type="int32"> 7523 <description>Maximum value possible for a sharpness map 7524 region.</description> 7525 <tag id="FUTURE" /> 7526 </entry> 7527 <entry name="sharpnessMapSize" type="int32" 7528 type_notes="width x height" container="array" typedef="size"> 7529 <array> 7530 <size>2</size> 7531 </array> 7532 <description>Dimensions of the sharpness 7533 map</description> 7534 <range>Must be at least 32 x 32</range> 7535 <tag id="FUTURE" /> 7536 </entry> 7537 <entry name="availableHotPixelMapModes" type="byte" visibility="public" 7538 type_notes="list of enums" container="array" typedef="boolean"> 7539 <array> 7540 <size>n</size> 7541 </array> 7542 <description> 7543 List of hot pixel map output modes for android.statistics.hotPixelMapMode that are 7544 supported by this camera device. 
7545 </description> 7546 <range>Any value listed in android.statistics.hotPixelMapMode</range> 7547 <details> 7548 If no hotpixel map output is available for this camera device, this will contain only 7549 `false`. 7550 7551 ON is always supported on devices with the RAW capability. 7552 </details> 7553 <tag id="V1" /> 7554 <tag id="RAW" /> 7555 </entry> 7556 <entry name="availableLensShadingMapModes" type="byte" visibility="public" 7557 type_notes="list of enums" container="array" typedef="enumList"> 7558 <array> 7559 <size>n</size> 7560 </array> 7561 <description> 7562 List of lens shading map output modes for android.statistics.lensShadingMapMode that 7563 are supported by this camera device. 7564 </description> 7565 <range>Any value listed in android.statistics.lensShadingMapMode</range> 7566 <details> 7567 If no lens shading map output is available for this camera device, this key will 7568 contain only OFF. 7569 7570 ON is always supported on devices with the RAW capability. 7571 LEGACY mode devices will always only support OFF. 7572 </details> 7573 </entry> 7574 </namespace> 7575 </static> 7576 <dynamic> 7577 <clone entry="android.statistics.faceDetectMode" 7578 kind="controls"></clone> 7579 <entry name="faceIds" type="int32" visibility="ndk_public" 7580 container="array" hwlevel="legacy"> 7581 <array> 7582 <size>n</size> 7583 </array> 7584 <description>List of unique IDs for detected faces.</description> 7585 <details> 7586 Each detected face is given a unique ID that is valid for as long as the face is visible 7587 to the camera device. A face that leaves the field of view and later returns may be 7588 assigned a new ID. 7589 7590 Only available if android.statistics.faceDetectMode == FULL</details> 7591 <tag id="BC" /> 7592 </entry> 7593 <entry name="faceLandmarks" type="int32" visibility="ndk_public" 7594 type_notes="(leftEyeX, leftEyeY, rightEyeX, rightEyeY, mouthX, mouthY)" 7595 container="array" hwlevel="legacy"> 7596 <array> 7597 <size>n</size> 7598 <size>6</size> 7599 </array> 7600 <description>List of landmarks for detected 7601 faces.</description> 7602 <details> 7603 The coordinate system is that of android.sensor.info.activeArraySize, with 7604 `(0, 0)` being the top-left pixel of the active array. 7605 7606 Only available if android.statistics.faceDetectMode == FULL</details> 7607 <tag id="BC" /> 7608 </entry> 7609 <entry name="faceRectangles" type="int32" visibility="ndk_public" 7610 type_notes="(xmin, ymin, xmax, ymax). (0,0) is top-left of active pixel area" 7611 container="array" typedef="rectangle" hwlevel="legacy"> 7612 <array> 7613 <size>n</size> 7614 <size>4</size> 7615 </array> 7616 <description>List of the bounding rectangles for detected 7617 faces.</description> 7618 <details> 7619 The coordinate system is that of android.sensor.info.activeArraySize, with 7620 `(0, 0)` being the top-left pixel of the active array. 7621 7622 Only available if android.statistics.faceDetectMode != OFF</details> 7623 <tag id="BC" /> 7624 </entry> 7625 <entry name="faceScores" type="byte" visibility="ndk_public" 7626 container="array" hwlevel="legacy"> 7627 <array> 7628 <size>n</size> 7629 </array> 7630 <description>List of the face confidence scores for 7631 detected faces</description> 7632 <range>1-100</range> 7633 <details>Only available if android.statistics.faceDetectMode != OFF. 
7634 </details> 7635 <hal_details> 7636 The value should be meaningful (for example, setting 100 at 7637 all times is illegal).</hal_details> 7638 <tag id="BC" /> 7639 </entry> 7640 <entry name="faces" type="int32" visibility="java_public" synthetic="true" 7641 container="array" typedef="face" hwlevel="legacy"> 7642 <array> 7643 <size>n</size> 7644 </array> 7645 <description>List of the faces detected through camera face detection 7646 in this capture.</description> 7647 <details> 7648 Only available if android.statistics.faceDetectMode `!=` OFF. 7649 </details> 7650 </entry> 7651 <entry name="histogram" type="int32" 7652 type_notes="count of pixels for each color channel that fall into each histogram bucket, scaled to be between 0 and maxHistogramCount" 7653 container="array"> 7654 <array> 7655 <size>n</size> 7656 <size>3</size> 7657 </array> 7658 <description>A 3-channel histogram based on the raw 7659 sensor data</description> 7660 <details>The k'th bucket (0-based) covers the input range 7661 (with w = android.sensor.info.whiteLevel) of [ k * w/N, 7662 (k + 1) * w / N ). If only a monochrome sharpness map is 7663 supported, all channels should have the same data</details> 7664 <tag id="FUTURE" /> 7665 </entry> 7666 <clone entry="android.statistics.histogramMode" 7667 kind="controls"></clone> 7668 <entry name="sharpnessMap" type="int32" 7669 type_notes="estimated sharpness for each region of the input image. Normalized to be between 0 and maxSharpnessMapValue. Higher values mean sharper (better focused)" 7670 container="array"> 7671 <array> 7672 <size>n</size> 7673 <size>m</size> 7674 <size>3</size> 7675 </array> 7676 <description>A 3-channel sharpness map, based on the raw 7677 sensor data</description> 7678 <details>If only a monochrome sharpness map is supported, 7679 all channels should have the same data</details> 7680 <tag id="FUTURE" /> 7681 </entry> 7682 <clone entry="android.statistics.sharpnessMapMode" 7683 kind="controls"></clone> 7684 <entry name="lensShadingCorrectionMap" type="byte" visibility="java_public" 7685 typedef="lensShadingMap" hwlevel="full"> 7686 <description>The shading map is a low-resolution floating-point map 7687 that lists the coefficients used to correct for vignetting, for each 7688 Bayer color channel.</description> 7689 <range>Each gain factor is &gt;= 1</range> 7690 <details> 7691 The map provided here is the same map that is used by the camera device to 7692 correct both color shading and vignetting for output non-RAW images. 7693 7694 When there is no lens shading correction applied to RAW 7695 output images (android.sensor.info.lensShadingApplied `==` 7696 false), this map is the complete lens shading correction 7697 map; when there is some lens shading correction applied to 7698 the RAW output image (android.sensor.info.lensShadingApplied 7699 `==` true), this map reports the remaining lens shading 7700 correction map that needs to be applied to get shading 7701 corrected images that match the camera device's output for 7702 non-RAW formats. 7703 7704 For a complete shading correction map, the least shaded 7705 section of the image will have a gain factor of 1; all 7706 other sections will have gains above 1. 7707 7708 When android.colorCorrection.mode = TRANSFORM_MATRIX, the map 7709 will take into account the colorCorrection settings. 7710 7711 The shading map is for the entire active pixel array, and is not 7712 affected by the crop region specified in the request. 
Each shading map 7713 entry is the value of the shading compensation map over a specific 7714 pixel on the sensor. Specifically, with a (N x M) resolution shading 7715 map, and an active pixel array size (W x H), shading map entry 7716 (x,y) ϵ (0 ... N-1, 0 ... M-1) is the value of the shading map at 7717 pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y) for the four color channels. 7718 The map is assumed to be bilinearly interpolated between the sample points. 7719 7720 The channel order is [R, Geven, Godd, B], where Geven is the green 7721 channel for the even rows of a Bayer pattern, and Godd is the odd rows. 7722 The shading map is stored in a fully interleaved format. 7723 7724 The shading map will generally have on the order of 30-40 rows and columns, 7725 and will be smaller than 64x64. 7726 7727 As an example, given a very small map defined as: 7728 7729 width,height = [ 4, 3 ] 7730 values = 7731 [ 1.3, 1.2, 1.15, 1.2, 1.2, 1.2, 1.15, 1.2, 7732 1.1, 1.2, 1.2, 1.2, 1.3, 1.2, 1.3, 1.3, 7733 1.2, 1.2, 1.25, 1.1, 1.1, 1.1, 1.1, 1.0, 7734 1.0, 1.0, 1.0, 1.0, 1.2, 1.3, 1.25, 1.2, 7735 1.3, 1.2, 1.2, 1.3, 1.2, 1.15, 1.1, 1.2, 7736 1.2, 1.1, 1.0, 1.2, 1.3, 1.15, 1.2, 1.3 ] 7737 7738 The low-resolution scaling map images for each channel are 7739 (displayed using nearest-neighbor interpolation): 7740 7741 ![Red lens shading map](android.statistics.lensShadingMap/red_shading.png) 7742 ![Green (even rows) lens shading map](android.statistics.lensShadingMap/green_e_shading.png) 7743 ![Green (odd rows) lens shading map](android.statistics.lensShadingMap/green_o_shading.png) 7744 ![Blue lens shading map](android.statistics.lensShadingMap/blue_shading.png) 7745 7746 As a visualization only, inverting the full-color map to recover an 7747 image of a gray wall (using bicubic interpolation for visual quality) as captured by the sensor gives: 7748 7749 ![Image of a uniform white wall (inverse shading map)](android.statistics.lensShadingMap/inv_shading.png) 7750 </details> 7751 </entry> 7752 <entry name="lensShadingMap" type="float" visibility="ndk_public" 7753 type_notes="2D array of float gain factors per channel to correct lens shading" 7754 container="array" hwlevel="full"> 7755 <array> 7756 <size>4</size> 7757 <size>n</size> 7758 <size>m</size> 7759 </array> 7760 <description>The shading map is a low-resolution floating-point map 7761 that lists the coefficients used to correct for vignetting and color shading, 7762 for each Bayer color channel of RAW image data.</description> 7763 <range>Each gain factor is &gt;= 1</range> 7764 <details> 7765 The map provided here is the same map that is used by the camera device to 7766 correct both color shading and vignetting for output non-RAW images. 7767 7768 When there is no lens shading correction applied to RAW 7769 output images (android.sensor.info.lensShadingApplied `==` 7770 false), this map is the complete lens shading correction 7771 map; when there is some lens shading correction applied to 7772 the RAW output image (android.sensor.info.lensShadingApplied 7773 `==` true), this map reports the remaining lens shading 7774 correction map that needs to be applied to get shading 7775 corrected images that match the camera device's output for 7776 non-RAW formats. 7777 7778 For a complete shading correction map, the least shaded 7779 section of the image will have a gain factor of 1; all 7780 other sections will have gains above 1. 
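As a rough sketch only, following the interpolation rule given below and assuming the map is stored row-major with `n` columns and `m` rows, an application could sample the interleaved map at an active-array pixel like this:

    // map holds 4 * n * m gains in [R, Geven, Godd, B] interleaved order;
    // (w, h) is the active pixel array size; channel is 0..3.
    float sampleShadingMap(float[] map, int n, int m, int w, int h,
                           float px, float py, int channel) {
        float fx = px * (n - 1) / (w - 1);
        float fy = py * (m - 1) / (h - 1);
        int x0 = (int) Math.floor(fx), y0 = (int) Math.floor(fy);
        int x1 = Math.min(x0 + 1, n - 1), y1 = Math.min(y0 + 1, m - 1);
        float tx = fx - x0, ty = fy - y0;
        float g00 = map[4 * (y0 * n + x0) + channel];
        float g10 = map[4 * (y0 * n + x1) + channel];
        float g01 = map[4 * (y1 * n + x0) + channel];
        float g11 = map[4 * (y1 * n + x1) + channel];
        // Bilinear interpolation between the four surrounding samples.
        float top = g00 + tx * (g10 - g00);
        float bottom = g01 + tx * (g11 - g01);
        return top + ty * (bottom - top);
    }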
7781 7782 When android.colorCorrection.mode = TRANSFORM_MATRIX, the map 7783 will take into account the colorCorrection settings. 7784 7785 The shading map is for the entire active pixel array, and is not 7786 affected by the crop region specified in the request. Each shading map 7787 entry is the value of the shading compensation map over a specific 7788 pixel on the sensor. Specifically, with a (N x M) resolution shading 7789 map, and an active pixel array size (W x H), shading map entry 7790 (x,y) ϵ (0 ... N-1, 0 ... M-1) is the value of the shading map at 7791 pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y) for the four color channels. 7792 The map is assumed to be bilinearly interpolated between the sample points. 7793 7794 The channel order is [R, Geven, Godd, B], where Geven is the green 7795 channel for the even rows of a Bayer pattern, and Godd is the odd rows. 7796 The shading map is stored in a fully interleaved format, and its size 7797 is provided in the camera static metadata by android.lens.info.shadingMapSize. 7798 7799 The shading map will generally have on the order of 30-40 rows and columns, 7800 and will be smaller than 64x64. 7801 7802 As an example, given a very small map defined as: 7803 7804 android.lens.info.shadingMapSize = [ 4, 3 ] 7805 android.statistics.lensShadingMap = 7806 [ 1.3, 1.2, 1.15, 1.2, 1.2, 1.2, 1.15, 1.2, 7807 1.1, 1.2, 1.2, 1.2, 1.3, 1.2, 1.3, 1.3, 7808 1.2, 1.2, 1.25, 1.1, 1.1, 1.1, 1.1, 1.0, 7809 1.0, 1.0, 1.0, 1.0, 1.2, 1.3, 1.25, 1.2, 7810 1.3, 1.2, 1.2, 1.3, 1.2, 1.15, 1.1, 1.2, 7811 1.2, 1.1, 1.0, 1.2, 1.3, 1.15, 1.2, 1.3 ] 7812 7813 The low-resolution scaling map images for each channel are 7814 (displayed using nearest-neighbor interpolation): 7815 7816 ![Red lens shading map](android.statistics.lensShadingMap/red_shading.png) 7817 ![Green (even rows) lens shading map](android.statistics.lensShadingMap/green_e_shading.png) 7818 ![Green (odd rows) lens shading map](android.statistics.lensShadingMap/green_o_shading.png) 7819 ![Blue lens shading map](android.statistics.lensShadingMap/blue_shading.png) 7820 7821 As a visualization only, inverting the full-color map to recover an 7822 image of a gray wall (using bicubic interpolation for visual quality) 7823 as captured by the sensor gives: 7824 7825 ![Image of a uniform white wall (inverse shading map)](android.statistics.lensShadingMap/inv_shading.png) 7826 7827 Note that the RAW image data might be subject to lens shading 7828 correction not reported on this map. Query 7829 android.sensor.info.lensShadingApplied to see if the RAW image data has been subject 7830 to lens shading correction. If android.sensor.info.lensShadingApplied 7831 is TRUE, the RAW image data is subject to partial or full lens shading 7832 correction. In the case that full lens shading correction is applied to RAW 7833 images, the gain factor map reported in this key will contain all 1.0 gains. 7834 In other words, the map reported in this key is the remaining lens shading 7835 that needs to be applied to the RAW image to get images without lens shading 7836 artifacts. See android.request.maxNumOutputRaw for a list of RAW image 7837 formats. 7838 </details> 7839 <hal_details> 7840 The lens shading map calculation may depend on exposure and white balance statistics. 7841 When AE and AWB are in AUTO modes 7842 (android.control.aeMode `!=` OFF and android.control.awbMode `!=` OFF), the HAL 7843 may have all the information it needs to generate the most accurate lens shading map.
When 7844 AE or AWB are in manual mode 7845 (android.control.aeMode `==` OFF or android.control.awbMode `==` OFF), the shading map 7846 may be adversely impacted by manual exposure or white balance parameters. To avoid 7847 generating unreliable shading map data, the HAL may choose to lock the shading map with 7848 the latest known good map generated when the AE and AWB are in AUTO modes. 7849 </hal_details> 7850 </entry> 7851 <entry name="predictedColorGains" type="float" 7852 visibility="hidden" 7853 deprecated="true" 7854 optional="true" 7855 type_notes="A 1D array of floats for 4 color channel gains" 7856 container="array"> 7857 <array> 7858 <size>4</size> 7859 </array> 7860 <description>The best-fit color channel gains calculated 7861 by the camera device's statistics units for the current output frame. 7862 </description> 7863 <details> 7864 This may be different than the gains used for this frame, 7865 since statistics processing on data from a new frame 7866 typically completes after the transform has already been 7867 applied to that frame. 7868 7869 The 4 channel gains are defined in Bayer domain, 7870 see android.colorCorrection.gains for details. 7871 7872 This value should always be calculated by the auto-white balance (AWB) block, 7873 regardless of the android.control.* current values. 7874 </details> 7875 </entry> 7876 <entry name="predictedColorTransform" type="rational" 7877 visibility="hidden" 7878 deprecated="true" 7879 optional="true" 7880 type_notes="3x3 rational matrix in row-major order" 7881 container="array"> 7882 <array> 7883 <size>3</size> 7884 <size>3</size> 7885 </array> 7886 <description>The best-fit color transform matrix estimate 7887 calculated by the camera device's statistics units for the current 7888 output frame.</description> 7889 <details>The camera device will provide the estimate from its 7890 statistics unit on the white balance transforms to use 7891 for the next frame. These are the values the camera device believes 7892 are the best fit for the current output frame. This may 7893 be different than the transform used for this frame, since 7894 statistics processing on data from a new frame typically 7895 completes after the transform has already been applied to 7896 that frame. 7897 7898 These estimates must be provided for all frames, even if 7899 capture settings and color transforms are set by the application. 7900 7901 This value should always be calculated by the auto-white balance (AWB) block, 7902 regardless of the android.control.* current values. 7903 </details> 7904 </entry> 7905 <entry name="sceneFlicker" type="byte" visibility="public" enum="true" 7906 hwlevel="full"> 7907 <enum> 7908 <value>NONE 7909 <notes>The camera device does not detect any flickering illumination 7910 in the current scene.</notes></value> 7911 <value>50HZ 7912 <notes>The camera device detects illumination flickering at 50Hz 7913 in the current scene.</notes></value> 7914 <value>60HZ 7915 <notes>The camera device detects illumination flickering at 60Hz 7916 in the current scene.</notes></value> 7917 </enum> 7918 <description>The camera device estimated scene illumination lighting 7919 frequency.</description> 7920 <details> 7921 Many light sources, such as most fluorescent lights, flicker at a rate 7922 that depends on the local utility power standards. This flicker must be 7923 accounted for by auto-exposure routines to avoid artifacts in captured images. 
7924 The camera device uses this entry to tell the application what the scene 7925 illuminant frequency is. 7926 7927 When manual exposure control is enabled 7928 (`android.control.aeMode == OFF` or `android.control.mode == 7929 OFF`), the android.control.aeAntibandingMode doesn't perform 7930 antibanding, and the application can ensure it selects 7931 exposure times that do not cause banding issues by looking 7932 into this metadata field. See 7933 android.control.aeAntibandingMode for more details. 7934 7935 Reports NONE if there doesn't appear to be flickering illumination. 7936 </details> 7937 </entry> 7938 <clone entry="android.statistics.hotPixelMapMode" kind="controls"> 7939 </clone> 7940 <entry name="hotPixelMap" type="int32" visibility="public" 7941 type_notes="list of coordinates based on android.sensor.pixelArraySize" 7942 container="array" typedef="point"> 7943 <array> 7944 <size>2</size> 7945 <size>n</size> 7946 </array> 7947 <description> 7948 List of `(x, y)` coordinates of hot/defective pixels on the sensor. 7949 </description> 7950 <range> 7951 n <= number of pixels on the sensor. 7952 The `(x, y)` coordinates must be bounded by 7953 android.sensor.info.pixelArraySize. 7954 </range> 7955 <details> 7956 A coordinate `(x, y)` must lie between `(0, 0)`, and 7957 `(width - 1, height - 1)` (inclusive), which are the top-left and 7958 bottom-right of the pixel array, respectively. The width and 7959 height dimensions are given in android.sensor.info.pixelArraySize. 7960 This may include hot pixels that lie outside of the active array 7961 bounds given by android.sensor.info.activeArraySize. 7962 </details> 7963 <hal_details> 7964 A hotpixel map contains the coordinates of pixels on the camera 7965 sensor that do not report valid values (usually due to defects in 7966 the camera sensor). This includes pixels that are stuck at certain 7967 values, or have a response that does not accurately encode the 7968 incoming light from the scene. 7969 7970 To avoid performance issues, there should be significantly fewer hot 7971 pixels than actual pixels on the camera sensor. 7972 </hal_details> 7973 <tag id="V1" /> 7974 <tag id="RAW" /> 7975 </entry> 7976 </dynamic> 7977 <controls> 7978 <entry name="lensShadingMapMode" type="byte" visibility="public" enum="true" hwlevel="full"> 7979 <enum> 7980 <value>OFF 7981 <notes>Do not include a lens shading map in the capture result.</notes></value> 7982 <value>ON 7983 <notes>Include a lens shading map in the capture result.</notes></value> 7984 </enum> 7985 <description>Whether the camera device will output the lens 7986 shading map in output result metadata.</description> 7987 <range>android.statistics.info.availableLensShadingMapModes</range> 7988 <details>When set to ON, 7989 android.statistics.lensShadingMap will be provided in 7990 the output result metadata. 7991 7992 ON is always supported on devices with the RAW capability. 7993 </details> 7994 <tag id="RAW" /> 7995 </entry> 7996 </controls> 7997 <dynamic> 7998 <clone entry="android.statistics.lensShadingMapMode" kind="controls"> 7999 </clone> 8000 </dynamic> 8001 </section> 8002 <section name="tonemap"> 8003 <controls> 8004 <entry name="curveBlue" type="float" visibility="ndk_public" 8005 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints."
8006 container="array" hwlevel="full"> 8007 <array> 8008 <size>n</size> 8009 <size>2</size> 8010 </array> 8011 <description>Tonemapping / contrast / gamma curve for the blue 8012 channel, to use when android.tonemap.mode is 8013 CONTRAST_CURVE.</description> 8014 <details>See android.tonemap.curveRed for more details.</details> 8015 </entry> 8016 <entry name="curveGreen" type="float" visibility="ndk_public" 8017 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints." 8018 container="array" hwlevel="full"> 8019 <array> 8020 <size>n</size> 8021 <size>2</size> 8022 </array> 8023 <description>Tonemapping / contrast / gamma curve for the green 8024 channel, to use when android.tonemap.mode is 8025 CONTRAST_CURVE.</description> 8026 <details>See android.tonemap.curveRed for more details.</details> 8027 </entry> 8028 <entry name="curveRed" type="float" visibility="ndk_public" 8029 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints." 8030 container="array" hwlevel="full"> 8031 <array> 8032 <size>n</size> 8033 <size>2</size> 8034 </array> 8035 <description>Tonemapping / contrast / gamma curve for the red 8036 channel, to use when android.tonemap.mode is 8037 CONTRAST_CURVE.</description> 8038 <range>0-1 on both input and output coordinates, normalized 8039 as a floating-point value such that 0 == black and 1 == white. 8040 </range> 8041 <details> 8042 Each channel's curve is defined by an array of control points: 8043 8044 android.tonemap.curveRed = 8045 [ P0in, P0out, P1in, P1out, P2in, P2out, P3in, P3out, ..., PNin, PNout ] 8046 2 <= N <= android.tonemap.maxCurvePoints 8047 8048 These are sorted in order of increasing `Pin`; it is 8049 required that input values 0.0 and 1.0 are included in the list to 8050 define a complete mapping. For input values between control points, 8051 the camera device must linearly interpolate between the control 8052 points. 8053 8054 Each curve can have an independent number of points, and the number 8055 of points can be less than max (that is, the request doesn't have to 8056 always provide a curve with number of points equivalent to 8057 android.tonemap.maxCurvePoints). 8058 8059 A few examples, and their corresponding graphical mappings; these 8060 only specify the red channel and the precision is limited to 4 8061 digits, for conciseness. 
8062 8063 Linear mapping: 8064 8065 android.tonemap.curveRed = [ 0, 0, 1.0, 1.0 ] 8066 8067 ![Linear mapping curve](android.tonemap.curveRed/linear_tonemap.png) 8068 8069 Invert mapping: 8070 8071 android.tonemap.curveRed = [ 0, 1.0, 1.0, 0 ] 8072 8073 ![Inverting mapping curve](android.tonemap.curveRed/inverse_tonemap.png) 8074 8075 Gamma 1/2.2 mapping, with 16 control points: 8076 8077 android.tonemap.curveRed = [ 8078 0.0000, 0.0000, 0.0667, 0.2920, 0.1333, 0.4002, 0.2000, 0.4812, 8079 0.2667, 0.5484, 0.3333, 0.6069, 0.4000, 0.6594, 0.4667, 0.7072, 8080 0.5333, 0.7515, 0.6000, 0.7928, 0.6667, 0.8317, 0.7333, 0.8685, 8081 0.8000, 0.9035, 0.8667, 0.9370, 0.9333, 0.9691, 1.0000, 1.0000 ] 8082 8083 ![Gamma = 1/2.2 tonemapping curve](android.tonemap.curveRed/gamma_tonemap.png) 8084 8085 Standard sRGB gamma mapping, per IEC 61966-2-1:1999, with 16 control points: 8086 8087 android.tonemap.curveRed = [ 8088 0.0000, 0.0000, 0.0667, 0.2864, 0.1333, 0.4007, 0.2000, 0.4845, 8089 0.2667, 0.5532, 0.3333, 0.6125, 0.4000, 0.6652, 0.4667, 0.7130, 8090 0.5333, 0.7569, 0.6000, 0.7977, 0.6667, 0.8360, 0.7333, 0.8721, 8091 0.8000, 0.9063, 0.8667, 0.9389, 0.9333, 0.9701, 1.0000, 1.0000 ] 8092 8093 ![sRGB tonemapping curve](android.tonemap.curveRed/srgb_tonemap.png) 8094 </details> 8095 <hal_details> 8096 For good quality of mapping, at least 128 control points are 8097 preferred. 8098 8099 A typical use case of this would be a gamma-1/2.2 curve, with as many 8100 control points used as are available. 8101 </hal_details> 8102 </entry> 8103 <entry name="curve" type="float" visibility="java_public" synthetic="true" 8104 typedef="tonemapCurve" 8105 hwlevel="full"> 8106 <description>Tonemapping / contrast / gamma curve to use when android.tonemap.mode 8107 is CONTRAST_CURVE.</description> 8108 <details> 8109 The tonemapCurve consists of three curves, one each for the red, green, and blue 8110 channels. The following example uses the red channel as an 8111 example. The same logic applies to the green and blue channels. 8112 Each channel's curve is defined by an array of control points: 8113 8114 curveRed = 8115 [ P0(in, out), P1(in, out), P2(in, out), P3(in, out), ..., PN(in, out) ] 8116 2 <= N <= android.tonemap.maxCurvePoints 8117 8118 These are sorted in order of increasing `Pin`; it is always 8119 guaranteed that input values 0.0 and 1.0 are included in the list to 8120 define a complete mapping. For input values between control points, 8121 the camera device must linearly interpolate between the control 8122 points. 8123 8124 Each curve can have an independent number of points, and the number 8125 of points can be less than max (that is, the request doesn't have to 8126 always provide a curve with number of points equivalent to 8127 android.tonemap.maxCurvePoints). 8128 8129 A few examples, and their corresponding graphical mappings; these 8130 only specify the red channel and the precision is limited to 4 8131 digits, for conciseness.
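As a hedged illustration only (not part of this entry's definition), the first of the example mappings below could be supplied from an application through the public camera2 API roughly as follows, assuming CONTRAST_CURVE is listed in android.tonemap.availableToneMapModes:

    // Sketch: request a linear contrast curve on all three channels.
    void setLinearTonemap(CaptureRequest.Builder builder) {
        float[] linear = { 0.0f, 0.0f, 1.0f, 1.0f };  // (Pin, Pout) pairs
        TonemapCurve curve = new TonemapCurve(linear, linear, linear);
        builder.set(CaptureRequest.TONEMAP_MODE, CameraMetadata.TONEMAP_MODE_CONTRAST_CURVE);
        builder.set(CaptureRequest.TONEMAP_CURVE, curve);
    }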
8132 8133 Linear mapping: 8134 8135 curveRed = [ (0, 0), (1.0, 1.0) ] 8136 8137 ![Linear mapping curve](android.tonemap.curveRed/linear_tonemap.png) 8138 8139 Invert mapping: 8140 8141 curveRed = [ (0, 1.0), (1.0, 0) ] 8142 8143 ![Inverting mapping curve](android.tonemap.curveRed/inverse_tonemap.png) 8144 8145 Gamma 1/2.2 mapping, with 16 control points: 8146 8147 curveRed = [ 8148 (0.0000, 0.0000), (0.0667, 0.2920), (0.1333, 0.4002), (0.2000, 0.4812), 8149 (0.2667, 0.5484), (0.3333, 0.6069), (0.4000, 0.6594), (0.4667, 0.7072), 8150 (0.5333, 0.7515), (0.6000, 0.7928), (0.6667, 0.8317), (0.7333, 0.8685), 8151 (0.8000, 0.9035), (0.8667, 0.9370), (0.9333, 0.9691), (1.0000, 1.0000) ] 8152 8153 ![Gamma = 1/2.2 tonemapping curve](android.tonemap.curveRed/gamma_tonemap.png) 8154 8155 Standard sRGB gamma mapping, per IEC 61966-2-1:1999, with 16 control points: 8156 8157 curveRed = [ 8158 (0.0000, 0.0000), (0.0667, 0.2864), (0.1333, 0.4007), (0.2000, 0.4845), 8159 (0.2667, 0.5532), (0.3333, 0.6125), (0.4000, 0.6652), (0.4667, 0.7130), 8160 (0.5333, 0.7569), (0.6000, 0.7977), (0.6667, 0.8360), (0.7333, 0.8721), 8161 (0.8000, 0.9063), (0.8667, 0.9389), (0.9333, 0.9701), (1.0000, 1.0000) ] 8162 8163 ![sRGB tonemapping curve](android.tonemap.curveRed/srgb_tonemap.png) 8164 </details> 8165 <hal_details> 8166 This entry is created by the framework from the curveRed, curveGreen and 8167 curveBlue entries. 8168 </hal_details> 8169 </entry> 8170 <entry name="mode" type="byte" visibility="public" enum="true" 8171 hwlevel="full"> 8172 <enum> 8173 <value>CONTRAST_CURVE 8174 <notes>Use the tone mapping curve specified in 8175 the android.tonemap.curve* entries. 8176 8177 All color enhancement and tonemapping must be disabled, except 8178 for applying the tonemapping curve specified by 8179 android.tonemap.curve. 8180 8181 Must not slow down frame rate relative to raw 8182 sensor output. 8183 </notes> 8184 </value> 8185 <value>FAST 8186 <notes> 8187 Advanced gamma mapping and color enhancement may be applied, without 8188 reducing frame rate compared to raw sensor output. 8189 </notes> 8190 </value> 8191 <value>HIGH_QUALITY 8192 <notes> 8193 High-quality gamma mapping and color enhancement will be applied, at 8194 the cost of possibly reduced frame rate compared to raw sensor output. 8195 </notes> 8196 </value> 8197 <value>GAMMA_VALUE 8198 <notes> 8199 Use the gamma value specified in android.tonemap.gamma to perform 8200 tonemapping. 8201 8202 All color enhancement and tonemapping must be disabled, except 8203 for applying the tonemapping curve specified by android.tonemap.gamma. 8204 8205 Must not slow down frame rate relative to raw sensor output. 8206 </notes> 8207 </value> 8208 <value>PRESET_CURVE 8209 <notes> 8210 Use the preset tonemapping curve specified in 8211 android.tonemap.presetCurve to perform tonemapping. 8212 8213 All color enhancement and tonemapping must be disabled, except 8214 for applying the tonemapping curve specified by 8215 android.tonemap.presetCurve. 8216 8217 Must not slow down frame rate relative to raw sensor output. 8218 </notes> 8219 </value> 8220 </enum> 8221 <description>High-level global contrast/gamma/tonemapping control.
8222 </description> 8223 <range>android.tonemap.availableToneMapModes</range> 8224 <details> 8225 When switching to an application-defined contrast curve by setting 8226 android.tonemap.mode to CONTRAST_CURVE, the curve is defined 8227 per-channel with a set of `(in, out)` points that specify the 8228 mapping from input high-bit-depth pixel value to the output 8229 low-bit-depth value. Since the actual pixel ranges of both input 8230 and output may change depending on the camera pipeline, the values 8231 are specified by normalized floating-point numbers. 8232 8233 More-complex color mapping operations such as 3D color look-up 8234 tables, selective chroma enhancement, or other non-linear color 8235 transforms will be disabled when android.tonemap.mode is 8236 CONTRAST_CURVE. 8237 8238 When using either FAST or HIGH_QUALITY, the camera device will 8239 emit its own tonemap curve in android.tonemap.curve. 8240 These values are always available, and as close as possible to the 8241 actually used nonlinear/nonglobal transforms. 8242 8243 If a request is sent with CONTRAST_CURVE with the camera device's 8244 provided curve in FAST or HIGH_QUALITY, the image's tonemap will be 8245 roughly the same.</details> 8246 </entry> 8247 </controls> 8248 <static> 8249 <entry name="maxCurvePoints" type="int32" visibility="public" 8250 hwlevel="full"> 8251 <description>Maximum number of supported points in the 8252 tonemap curve that can be used for android.tonemap.curve. 8253 </description> 8254 <details> 8255 If the actual number of points provided by the application (in android.tonemap.curve*) is 8256 less than this maximum, the camera device will resample the curve to its internal 8257 representation, using linear interpolation. 8258 8259 The output curves in the result metadata may have a different number 8260 of points than the input curves, and will represent the actual 8261 hardware curves used as closely as possible when linearly interpolated. 8262 </details> 8263 <hal_details> 8264 This value must be at least 64. This should be at least 128. 8265 </hal_details> 8266 </entry> 8267 <entry name="availableToneMapModes" type="byte" visibility="public" 8268 type_notes="list of enums" container="array" typedef="enumList" hwlevel="full"> 8269 <array> 8270 <size>n</size> 8271 </array> 8272 <description> 8273 List of tonemapping modes for android.tonemap.mode that are supported by this camera 8274 device. 8275 </description> 8276 <range>Any value listed in android.tonemap.mode</range> 8277 <details> 8278 Camera devices that support the MANUAL_POST_PROCESSING capability will always contain 8279 at least one of below mode combinations: 8280 8281 * CONTRAST_CURVE, FAST and HIGH_QUALITY 8282 * GAMMA_VALUE, PRESET_CURVE, FAST and HIGH_QUALITY 8283 8284 This includes all FULL level devices. 8285 </details> 8286 <hal_details> 8287 HAL must support both FAST and HIGH_QUALITY if automatic tonemap control is available 8288 on the camera device, but the underlying implementation can be the same for both modes. 8289 That is, if the highest quality implementation on the camera device does not slow down 8290 capture rate, then FAST and HIGH_QUALITY will generate the same output. 
8291 </hal_details> 8292 </entry> 8293 </static> 8294 <dynamic> 8295 <clone entry="android.tonemap.curveBlue" kind="controls"> 8296 </clone> 8297 <clone entry="android.tonemap.curveGreen" kind="controls"> 8298 </clone> 8299 <clone entry="android.tonemap.curveRed" kind="controls"> 8300 </clone> 8301 <clone entry="android.tonemap.curve" kind="controls"> 8302 </clone> 8303 <clone entry="android.tonemap.mode" kind="controls"> 8304 </clone> 8305 </dynamic> 8306 <controls> 8307 <entry name="gamma" type="float" visibility="public"> 8308 <description> Tonemapping curve to use when android.tonemap.mode is 8309 GAMMA_VALUE 8310 </description> 8311 <details> 8312 The tonemap curve will be defined by the following formula: 8313 * OUT = pow(IN, 1.0 / gamma) 8314 where IN and OUT are the input and output pixel values scaled to the range [0.0, 1.0], 8315 pow is the power function and gamma is the gamma value specified by this 8316 key. 8317 8318 The same curve will be applied to all color channels. The camera device 8319 may clip the input gamma value to its supported range. The actual applied 8320 value will be returned in the capture result. 8321 8322 The valid range of the gamma value varies across devices, but values 8323 within [1.0, 5.0] are guaranteed not to be clipped. 8324 </details> 8325 </entry> 8326 <entry name="presetCurve" type="byte" visibility="public" enum="true"> 8327 <enum> 8328 <value>SRGB 8329 <notes>Tonemapping curve is defined by sRGB</notes> 8330 </value> 8331 <value>REC709 8332 <notes>Tonemapping curve is defined by ITU-R BT.709</notes> 8333 </value> 8334 </enum> 8335 <description> Tonemapping curve to use when android.tonemap.mode is 8336 PRESET_CURVE 8337 </description> 8338 <details> 8339 The tonemap curve will be defined by the specified standard. 8340 8341 sRGB (approximated by 16 control points): 8342 8343 ![sRGB tonemapping curve](android.tonemap.curveRed/srgb_tonemap.png) 8344 8345 Rec. 709 (approximated by 16 control points): 8346 8347 ![Rec. 709 tonemapping curve](android.tonemap.curveRed/rec709_tonemap.png) 8348 8349 Note that the above figures show a 16-control-point approximation of the preset 8350 curves. Camera devices may apply a different approximation to the curve. 8351 </details> 8352 </entry> 8353 </controls> 8354 <dynamic> 8355 <clone entry="android.tonemap.gamma" kind="controls"> 8356 </clone> 8357 <clone entry="android.tonemap.presetCurve" kind="controls"> 8358 </clone> 8359 </dynamic> 8360 </section> 8361 <section name="led"> 8362 <controls> 8363 <entry name="transmit" type="byte" visibility="hidden" optional="true" 8364 enum="true" typedef="boolean"> 8365 <enum> 8366 <value>OFF</value> 8367 <value>ON</value> 8368 </enum> 8369 <description>This LED is nominally used to indicate to the user 8370 that the camera is powered on and may be streaming images back to the 8371 Application Processor. In certain rare circumstances, the OS may 8372 disable this when video is processed locally and not transmitted to 8373 any untrusted applications. 8374 8375 In particular, the LED *must* always be on when the data could be 8376 transmitted off the device. The LED *should* always be on whenever 8377 data is stored locally on the device. 8378 8379 The LED *may* be off if a trusted application is using the data that 8380 doesn't violate the above rules.
8381 </description> 8382 </entry> 8383 </controls> 8384 <dynamic> 8385 <clone entry="android.led.transmit" kind="controls"></clone> 8386 </dynamic> 8387 <static> 8388 <entry name="availableLeds" type="byte" visibility="hidden" optional="true" 8389 enum="true" 8390 container="array"> 8391 <array> 8392 <size>n</size> 8393 </array> 8394 <enum> 8395 <value>TRANSMIT 8396 <notes>android.led.transmit control is used.</notes> 8397 </value> 8398 </enum> 8399 <description>A list of camera LEDs that are available on this system. 8400 </description> 8401 </entry> 8402 </static> 8403 </section> 8404 <section name="info"> 8405 <static> 8406 <entry name="supportedHardwareLevel" type="byte" visibility="public" 8407 enum="true" hwlevel="legacy"> 8408 <enum> 8409 <value> 8410 LIMITED 8411 <notes> 8412 This camera device does not have enough capabilities to qualify as a `FULL` device or 8413 better. 8414 8415 Only the stream configurations listed in the `LEGACY` and `LIMITED` tables in the 8416 {@link android.hardware.camera2.CameraDevice#createCaptureSession 8417 createCaptureSession} documentation are guaranteed to be supported. 8418 8419 All `LIMITED` devices support the `BACKWARDS_COMPATIBLE` capability, indicating basic 8420 support for color image capture. The only exception is that the device may 8421 alternatively support only the `DEPTH_OUTPUT` capability, if it can only output depth 8422 measurements and not color images. 8423 8424 `LIMITED` devices and above require the use of android.control.aePrecaptureTrigger 8425 to lock exposure metering (and calculate flash power, for cameras with flash) before 8426 capturing a high-quality still image. 8427 8428 A `LIMITED` device that only lists the `BACKWARDS_COMPATIBLE` capability is only 8429 required to support full-automatic operation and post-processing (`OFF` is not 8430 supported for android.control.aeMode, android.control.afMode, or 8431 android.control.awbMode) 8432 8433 Additional capabilities may optionally be supported by a `LIMITED`-level device, and 8434 can be checked for in android.request.availableCapabilities. 8435 </notes> 8436 </value> 8437 <value> 8438 FULL 8439 <notes> 8440 This camera device is capable of supporting advanced imaging applications. 8441 8442 The stream configurations listed in the `FULL`, `LEGACY` and `LIMITED` tables in the 8443 {@link android.hardware.camera2.CameraDevice#createCaptureSession 8444 createCaptureSession} documentation are guaranteed to be supported. 8445 8446 A `FULL` device will support below capabilities: 8447 8448 * `BURST_CAPTURE` capability (android.request.availableCapabilities contains 8449 `BURST_CAPTURE`) 8450 * Per frame control (android.sync.maxLatency `==` PER_FRAME_CONTROL) 8451 * Manual sensor control (android.request.availableCapabilities contains `MANUAL_SENSOR`) 8452 * Manual post-processing control (android.request.availableCapabilities contains 8453 `MANUAL_POST_PROCESSING`) 8454 * The required exposure time range defined in android.sensor.info.exposureTimeRange 8455 * The required maxFrameDuration defined in android.sensor.info.maxFrameDuration 8456 8457 Note: 8458 Pre-API level 23, FULL devices also supported arbitrary cropping region 8459 (android.scaler.croppingType `== FREEFORM`); this requirement was relaxed in API level 8460 23, and `FULL` devices may only support `CENTERED` cropping. 8461 </notes> 8462 </value> 8463 <value> 8464 LEGACY 8465 <notes> 8466 This camera device is running in backward compatibility mode. 
8467 8468 Only the stream configurations listed in the `LEGACY` table in the {@link 8469 android.hardware.camera2.CameraDevice#createCaptureSession createCaptureSession} 8470 documentation are supported. 8471 8472 A `LEGACY` device does not support per-frame control, manual sensor control, manual 8473 post-processing, arbitrary cropping regions, and has relaxed performance constraints. 8474 No additional capabilities beyond `BACKWARD_COMPATIBLE` will ever be listed by a 8475 `LEGACY` device in android.request.availableCapabilities. 8476 8477 In addition, the android.control.aePrecaptureTrigger is not functional on `LEGACY` 8478 devices. Instead, every request that includes a JPEG-format output target is treated 8479 as triggering a still capture, internally executing a precapture trigger. This may 8480 fire the flash for flash power metering during precapture, and then fire the flash 8481 for the final capture, if a flash is available on the device and the AE mode is set to 8482 enable the flash. 8483 </notes> 8484 </value> 8485 <value> 8486 3 8487 <notes> 8488 This camera device is capable of YUV reprocessing and RAW data capture, in addition to 8489 FULL-level capabilities. 8490 8491 The stream configurations listed in the `LEVEL_3`, `RAW`, `FULL`, `LEGACY` and 8492 `LIMITED` tables in the {@link 8493 android.hardware.camera2.CameraDevice#createCaptureSession createCaptureSession} 8494 documentation are guaranteed to be supported. 8495 8496 The following additional capabilities are guaranteed to be supported: 8497 8498 * `YUV_REPROCESSING` capability (android.request.availableCapabilities contains 8499 `YUV_REPROCESSING`) 8500 * `RAW` capability (android.request.availableCapabilities contains 8501 `RAW`) 8502 </notes> 8503 </value> 8504 </enum> 8505 <description> 8506 Generally classifies the overall set of the camera device functionality. 8507 </description> 8508 <details> 8509 The supported hardware level is a high-level description of the camera device's 8510 capabilities, summarizing several capabilities into one field. Each level adds additional 8511 features to the previous one, and is always a strict superset of the previous level. 8512 The ordering is `LEGACY < LIMITED < FULL < LEVEL_3`. 8513 8514 Starting from `LEVEL_3`, the level enumerations are guaranteed to be in increasing 8515 numerical value as well. To check if a given device is at least at a given hardware level, 8516 the following code snippet can be used: 8517 8518 // Returns true if the device supports the required hardware level, or better. 8519 boolean isHardwareLevelSupported(CameraCharacteristics c, int requiredLevel) { 8520 int deviceLevel = c.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL); 8521 if (deviceLevel == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) { 8522 return requiredLevel == deviceLevel; 8523 } 8524 // deviceLevel is not LEGACY, can use numerical sort 8525 return requiredLevel <= deviceLevel; 8526 } 8527 8528 At a high level, the levels are: 8529 8530 * `LEGACY` devices operate in a backwards-compatibility mode for older 8531 Android devices, and have very limited capabilities. 8532 * `LIMITED` devices represent the 8533 baseline feature set, and may also include additional capabilities that are 8534 subsets of `FULL`. 8535 * `FULL` devices additionally support per-frame manual control of sensor, flash, lens and 8536 post-processing settings, and image capture at a high rate. 
8537 * `LEVEL_3` devices additionally support YUV reprocessing and RAW image capture, along 8538 with additional output stream configurations. 8539 8540 See the individual level enums for full descriptions of the supported capabilities. The 8541 android.request.availableCapabilities entry describes the device's capabilities at a 8542 finer-grain level, if needed. In addition, many controls have their available settings or 8543 ranges defined in individual {@link android.hardware.camera2.CameraCharacteristics} entries. 8544 8545 Some features are not part of any particular hardware level or capability and must be 8546 queried separately. These include: 8547 8548 * Calibrated timestamps (android.sensor.info.timestampSource `==` REALTIME) 8549 * Precision lens control (android.lens.info.focusDistanceCalibration `==` CALIBRATED) 8550 * Face detection (android.statistics.info.availableFaceDetectModes) 8551 * Optical or electrical image stabilization 8552 (android.lens.info.availableOpticalStabilization, 8553 android.control.availableVideoStabilizationModes) 8554 8555 </details> 8556 <hal_details> 8557 The camera 3 HAL device can implement one of three possible operational modes; LIMITED, 8558 FULL, and LEVEL_3. 8559 8560 FULL support or better is expected from new higher-end devices. Limited 8561 mode has hardware requirements roughly in line with those for a camera HAL device v1 8562 implementation, and is expected from older or inexpensive devices. Each level is a strict 8563 superset of the previous level, and they share the same essential operational flow. 8564 8565 For full details refer to "S3. Operational Modes" in camera3.h 8566 8567 Camera HAL3+ must not implement LEGACY mode. It is there for backwards compatibility in 8568 the `android.hardware.camera2` user-facing API only on HALv1 devices, and is implemented 8569 by the camera framework code. 8570 </hal_details> 8571 </entry> 8572 </static> 8573 </section> 8574 <section name="blackLevel"> 8575 <controls> 8576 <entry name="lock" type="byte" visibility="public" enum="true" 8577 typedef="boolean" hwlevel="full"> 8578 <enum> 8579 <value>OFF</value> 8580 <value>ON</value> 8581 </enum> 8582 <description> Whether black-level compensation is locked 8583 to its current values, or is free to vary.</description> 8584 <details>When set to `true` (ON), the values used for black-level 8585 compensation will not change until the lock is set to 8586 `false` (OFF). 8587 8588 Since changes to certain capture parameters (such as 8589 exposure time) may require resetting of black level 8590 compensation, the camera device must report whether setting 8591 the black level lock was successful in the output result 8592 metadata. 
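As an illustrative sketch only, an application using the public camera2 API might request the lock and verify it per frame roughly as follows; the example sequence below then shows when the device may report OFF despite the lock having been requested:

    // Sketch: request the black level lock, then check whether it held
    // for a particular frame by reading the same key from the result.
    void requestBlackLevelLock(CaptureRequest.Builder builder) {
        builder.set(CaptureRequest.BLACK_LEVEL_LOCK, true);
    }

    boolean blackLevelWasLocked(CaptureResult result) {
        Boolean locked = result.get(CaptureResult.BLACK_LEVEL_LOCK);
        // false (OFF) means the device had to recalculate black levels for this frame
        return locked != null && locked;
    }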
8593 8594 For example, if a sequence of requests is as follows: 8595 8596 * Request 1: Exposure = 10ms, Black level lock = OFF 8597 * Request 2: Exposure = 10ms, Black level lock = ON 8598 * Request 3: Exposure = 10ms, Black level lock = ON 8599 * Request 4: Exposure = 20ms, Black level lock = ON 8600 * Request 5: Exposure = 20ms, Black level lock = ON 8601 * Request 6: Exposure = 20ms, Black level lock = ON 8602 8603 And the exposure change in Request 4 requires the camera 8604 device to reset the black level offsets, then the output 8605 result metadata is expected to be: 8606 8607 * Result 1: Exposure = 10ms, Black level lock = OFF 8608 * Result 2: Exposure = 10ms, Black level lock = ON 8609 * Result 3: Exposure = 10ms, Black level lock = ON 8610 * Result 4: Exposure = 20ms, Black level lock = OFF 8611 * Result 5: Exposure = 20ms, Black level lock = ON 8612 * Result 6: Exposure = 20ms, Black level lock = ON 8613 8614 This indicates to the application that on frame 4, black 8615 levels were reset due to exposure value changes, and pixel 8616 values may not be consistent across captures. 8617 8618 The camera device will maintain the lock to the extent 8619 possible, only overriding the lock to OFF when changes to 8620 other request parameters require a black level recalculation 8621 or reset. 8622 </details> 8623 <hal_details> 8624 If for some reason black level locking is no longer possible 8625 (for example, the analog gain has changed, which forces 8626 black level offsets to be recalculated), then the HAL must 8627 override this request (and it must report 'OFF' when this 8628 does happen) until the next capture for which locking is 8629 possible again.</hal_details> 8630 <tag id="HAL2" /> 8631 </entry> 8632 </controls> 8633 <dynamic> 8634 <clone entry="android.blackLevel.lock" 8635 kind="controls"> 8636 <details> 8637 Whether the black level offset was locked for this frame. Should be 8638 ON if android.blackLevel.lock was ON in the capture request, unless 8639 a change in other capture settings forced the camera device to 8640 perform a black level reset. 8641 </details> 8642 </clone> 8643 </dynamic> 8644 </section> 8645 <section name="sync"> 8646 <dynamic> 8647 <entry name="frameNumber" type="int64" visibility="ndk_public" 8648 enum="true" hwlevel="legacy"> 8649 <enum> 8650 <value id="-1">CONVERGING 8651 <notes> 8652 The current result is not yet fully synchronized to any request. 8653 8654 Synchronization is in progress, and reading metadata from this 8655 result may include a mix of data that have taken effect since the 8656 last synchronization time. 8657 8658 In some future result, within android.sync.maxLatency frames, 8659 this value will update to the actual frame number 8660 that the result is guaranteed to be synchronized to (as long as the 8661 request settings remain constant). 8662 </notes> 8663 </value> 8664 <value id="-2">UNKNOWN 8665 <notes> 8666 The current result's synchronization status is unknown. 8667 8668 The result may have already converged, or it may be in 8669 progress. Reading from this result may include some mix 8670 of settings from past requests. 8671 8672 After a settings change, the new settings will eventually all 8673 take effect for the output buffers and results. However, this 8674 value will not change when that happens. Altering settings 8675 rapidly may provide outcomes using mixes of settings from recent 8676 requests.
8677 8678 This value is intended primarily for backwards compatibility with 8679 the older camera implementations (for android.hardware.Camera). 8680 </notes> 8681 </value> 8682 </enum> 8683 <description>The frame number corresponding to the last request 8684 with which the output result (metadata + buffers) has been fully 8685 synchronized.</description> 8686 <range>Either a non-negative value corresponding to a 8687 `frame_number`, or one of the two enums (CONVERGING / UNKNOWN). 8688 </range> 8689 <details> 8690 When a request is submitted to the camera device, there is usually a 8691 delay of several frames before the controls get applied. A camera 8692 device may either choose to account for this delay by implementing a 8693 pipeline and carefully submit well-timed atomic control updates, or 8694 it may start streaming control changes that span over several frame 8695 boundaries. 8696 8697 In the latter case, whenever a request's settings change relative to 8698 the previous submitted request, the full set of changes may take 8699 multiple frame durations to fully take effect. Some settings may 8700 take effect sooner (in less frame durations) than others. 8701 8702 While a set of control changes are being propagated, this value 8703 will be CONVERGING. 8704 8705 Once it is fully known that a set of control changes have been 8706 finished propagating, and the resulting updated control settings 8707 have been read back by the camera device, this value will be set 8708 to a non-negative frame number (corresponding to the request to 8709 which the results have synchronized to). 8710 8711 Older camera device implementations may not have a way to detect 8712 when all camera controls have been applied, and will always set this 8713 value to UNKNOWN. 8714 8715 FULL capability devices will always have this value set to the 8716 frame number of the request corresponding to this result. 8717 8718 _Further details_: 8719 8720 * Whenever a request differs from the last request, any future 8721 results not yet returned may have this value set to CONVERGING (this 8722 could include any in-progress captures not yet returned by the camera 8723 device, for more details see pipeline considerations below). 8724 * Submitting a series of multiple requests that differ from the 8725 previous request (e.g. r1, r2, r3 s.t. r1 != r2 != r3) 8726 moves the new synchronization frame to the last non-repeating 8727 request (using the smallest frame number from the contiguous list of 8728 repeating requests). 8729 * Submitting the same request repeatedly will not change this value 8730 to CONVERGING, if it was already a non-negative value. 8731 * When this value changes to non-negative, that means that all of the 8732 metadata controls from the request have been applied, all of the 8733 metadata controls from the camera device have been read to the 8734 updated values (into the result), and all of the graphics buffers 8735 corresponding to this result are also synchronized to the request. 8736 8737 _Pipeline considerations_: 8738 8739 Submitting a request with updated controls relative to the previously 8740 submitted requests may also invalidate the synchronization state 8741 of all the results corresponding to currently in-flight requests. 8742 8743 In other words, results for this current request and up to 8744 android.request.pipelineMaxDepth prior requests may have their 8745 android.sync.frameNumber change to CONVERGING. 
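As an illustrative sketch only: since this entry is not exposed as a public Java key, a Java application can instead bound how many results to discard after a settings change by reading the public android.sync.maxLatency characteristic; the fallback value below is an arbitrary, hypothetical choice by the application:

    // Sketch: how many results an app might skip after submitting changed settings.
    int framesToSkipAfterSettingsChange(CameraCharacteristics c) {
        Integer latency = c.get(CameraCharacteristics.SYNC_MAX_LATENCY);
        if (latency == null || latency == CameraMetadata.SYNC_MAX_LATENCY_UNKNOWN) {
            return 30; // hypothetical conservative fallback chosen by the application
        }
        return latency; // 0 (PER_FRAME_CONTROL) means results are already synchronized
    }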
8746 </details> 8747 <hal_details> 8748 Using UNKNOWN here is illegal unless android.sync.maxLatency 8749 is also UNKNOWN. 8750 8751 FULL capability devices should simply set this value to the 8752 `frame_number` of the request this result corresponds to. 8753 </hal_details> 8754 <tag id="V1" /> 8755 </entry> 8756 </dynamic> 8757 <static> 8758 <entry name="maxLatency" type="int32" visibility="public" enum="true" 8759 hwlevel="legacy"> 8760 <enum> 8761 <value id="0">PER_FRAME_CONTROL 8762 <notes> 8763 Every frame has the requests immediately applied. 8764 8765 Changing controls over multiple requests one after another will 8766 produce results that have those controls applied atomically 8767 each frame. 8768 8769 All FULL capability devices will have this as their maxLatency. 8770 </notes> 8771 </value> 8772 <value id="-1">UNKNOWN 8773 <notes> 8774 Each new frame has some subset (potentially the entire set) 8775 of the past requests applied to the camera settings. 8776 8777 By submitting a series of identical requests, the camera device 8778 will eventually have the camera settings applied, but it is 8779 unknown when that exact point will be. 8780 8781 All LEGACY capability devices will have this as their maxLatency. 8782 </notes> 8783 </value> 8784 </enum> 8785 <description> 8786 The maximum number of frames that can occur after a request 8787 (different than the previous) has been submitted, and before the 8788 result's state becomes synchronized. 8789 </description> 8790 <units>Frame counts</units> 8791 <range>A positive value, PER_FRAME_CONTROL, or UNKNOWN.</range> 8792 <details> 8793 This defines the maximum distance (in number of metadata results), 8794 between the frame number of the request that has new controls to apply 8795 and the frame number of the result that has all the controls applied. 8796 8797 In other words this acts as an upper boundary for how many frames 8798 must occur before the camera device knows for a fact that the new 8799 submitted camera settings have been applied in outgoing frames. 8800 </details> 8801 <hal_details> 8802 For example if maxLatency was 2, 8803 8804 initial request = X (repeating) 8805 request1 = X 8806 request2 = Y 8807 request3 = Y 8808 request4 = Y 8809 8810 where requestN has frameNumber N, and the first of the repeating 8811 initial request's has frameNumber F (and F < 1). 8812 8813 initial result = X' + { android.sync.frameNumber == F } 8814 result1 = X' + { android.sync.frameNumber == F } 8815 result2 = X' + { android.sync.frameNumber == CONVERGING } 8816 result3 = X' + { android.sync.frameNumber == CONVERGING } 8817 result4 = X' + { android.sync.frameNumber == 2 } 8818 8819 where resultN has frameNumber N. 8820 8821 Since `result4` has a `frameNumber == 4` and 8822 `android.sync.frameNumber == 2`, the distance is clearly 8823 `4 - 2 = 2`. 8824 8825 Use `frame_count` from camera3_request_t instead of 8826 android.request.frameCount or 8827 `{@link android.hardware.camera2.CaptureResult#getFrameNumber}`. 8828 8829 LIMITED devices are strongly encouraged to use a non-negative 8830 value. If UNKNOWN is used here then app developers do not have a way 8831 to know when sensor settings have been applied. 
8832 </hal_details> 8833 <tag id="V1" /> 8834 </entry> 8835 </static> 8836 </section> 8837 <section name="reprocess"> 8838 <controls> 8839 <entry name="effectiveExposureFactor" type="float" visibility="java_public" hwlevel="limited"> 8840 <description> 8841 The amount of exposure time increase factor applied to the original output 8842 frame by the application processing before sending for reprocessing. 8843 </description> 8844 <units>Relative exposure time increase factor.</units> 8845 <range> &gt;= 1.0</range> 8846 <details> 8847 This is optional, and will be supported if the camera device supports the YUV_REPROCESSING 8848 capability (android.request.availableCapabilities contains YUV_REPROCESSING). 8849 8850 For some YUV reprocessing use cases, the application may choose to filter the original 8851 output frames to effectively reduce the noise to the same level as a frame that was 8852 captured with a longer exposure time. To be more specific, assuming the original captured 8853 images were captured with a sensitivity of S and an exposure time of T, the model in 8854 the camera device is that the amount of noise in the image would be approximately what 8855 would be expected if the original capture parameters had been a sensitivity of 8856 S/effectiveExposureFactor and an exposure time of T*effectiveExposureFactor, rather 8857 than S and T respectively. If the captured images were processed by the application 8858 before being sent for reprocessing, then the application may have used image processing 8859 algorithms and/or multi-frame image fusion to reduce the noise in the 8860 application-processed images (input images). By using the effectiveExposureFactor 8861 control, the application can communicate to the camera device the actual noise level 8862 improvement in the application-processed image. With this information, the camera 8863 device can select appropriate noise reduction and edge enhancement parameters to avoid 8864 excessive noise reduction (android.noiseReduction.mode) and insufficient edge 8865 enhancement (android.edge.mode) being applied to the reprocessed frames. 8866 8867 For example, for the multi-frame image fusion use case, the application may fuse 8868 multiple output frames together to a final frame for reprocessing. When N images are 8869 fused into 1 image for reprocessing, the exposure time increase factor could be up to 8870 the square root of N (based on a simple photon shot noise model). The camera device will 8871 adjust the reprocessing noise reduction and edge enhancement parameters accordingly to 8872 produce the best quality images. 8873 8874 This is a relative factor; 1.0 indicates the application hasn't processed the input 8875 buffer in a way that affects its effective exposure time. 8876 8877 This control is only effective for YUV reprocessing capture requests. For noise 8878 reduction reprocessing, it is only effective when `android.noiseReduction.mode != OFF`. 8879 Similarly, for edge enhancement reprocessing, it is only effective when 8880 `android.edge.mode != OFF`. 8881 </details> 8882 <tag id="REPROC" /> 8883 </entry> 8884 </controls> 8885 <dynamic> 8886 <clone entry="android.reprocess.effectiveExposureFactor" kind="controls"> 8887 </clone> 8888 </dynamic> 8889 <static> 8890 <entry name="maxCaptureStall" type="int32" visibility="java_public" hwlevel="limited"> 8891 <description> 8892 The maximal camera capture pipeline stall (in units of frame count) introduced by a 8893 reprocess capture request.
8894 </description> 8895 <units>Number of frames.</units> 8896 <range> &lt;= 4</range> 8897 <details> 8898 The key describes the maximal interference that one reprocess (input) request 8899 can introduce to the camera simultaneous streaming of regular (output) capture 8900 requests, including repeating requests. 8901 8902 When a reprocessing capture request is submitted while a camera output repeating request 8903 (e.g. preview) is being served by the camera device, it may preempt the camera capture 8904 pipeline for at least one frame duration so that the camera device is unable to process 8905 the following capture request in time for the next sensor start of exposure boundary. 8906 When this happens, the application may observe a capture time gap (longer than one frame 8907 duration) between adjacent capture output frames, which usually exhibits as preview 8908 glitch if the repeating request output targets include a preview surface. This key gives 8909 the worst-case number of frame stall introduced by one reprocess request with any kind of 8910 formats/sizes combination. 8911 8912 If this key reports 0, it means a reprocess request doesn't introduce any glitch to the 8913 ongoing camera repeating request outputs, as if this reprocess request is never issued. 8914 8915 This key is supported if the camera device supports PRIVATE or YUV reprocessing ( 8916 i.e. android.request.availableCapabilities contains PRIVATE_REPROCESSING or 8917 YUV_REPROCESSING). 8918 </details> 8919 <tag id="REPROC" /> 8920 </entry> 8921 </static> 8922 </section> 8923 <section name="depth"> 8924 <static> 8925 <entry name="maxDepthSamples" type="int32" visibility="system" hwlevel="limited"> 8926 <description>Maximum number of points that a depth point cloud may contain. 8927 </description> 8928 <details> 8929 If a camera device supports outputting depth range data in the form of a depth point 8930 cloud ({@link android.graphics.ImageFormat#DEPTH_POINT_CLOUD}), this is the maximum 8931 number of points an output buffer may contain. 8932 8933 Any given buffer may contain between 0 and maxDepthSamples points, inclusive. 8934 If output in the depth point cloud format is not supported, this entry will 8935 not be defined. 8936 </details> 8937 <tag id="DEPTH" /> 8938 </entry> 8939 <entry name="availableDepthStreamConfigurations" type="int32" visibility="ndk_public" 8940 enum="true" container="array" typedef="streamConfiguration" hwlevel="limited"> 8941 <array> 8942 <size>n</size> 8943 <size>4</size> 8944 </array> 8945 <enum> 8946 <value>OUTPUT</value> 8947 <value>INPUT</value> 8948 </enum> 8949 <description>The available depth dataspace stream 8950 configurations that this camera device supports 8951 (i.e. format, width, height, output/input stream). 8952 </description> 8953 <details> 8954 These are output stream configurations for use with 8955 dataSpace HAL_DATASPACE_DEPTH. The configurations are 8956 listed as `(format, width, height, input?)` tuples. 8957 8958 Only devices that support depth output for at least 8959 the HAL_PIXEL_FORMAT_Y16 dense depth map may include 8960 this entry. 8961 8962 A device that also supports the HAL_PIXEL_FORMAT_BLOB 8963 sparse depth point cloud must report a single entry for 8964 the format in this list as `(HAL_PIXEL_FORMAT_BLOB, 8965 android.depth.maxDepthSamples, 1, OUTPUT)` in addition to 8966 the entries for HAL_PIXEL_FORMAT_Y16. 
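As an illustrative, application-side sketch only (this entry itself is consumed through the stream configuration map rather than read directly), a Java client might check for depth support roughly as follows:

    // Sketch: check whether the device advertises depth output and lists
    // at least one DEPTH16 output size.
    boolean supportsDepth16(CameraCharacteristics c) {
        int[] caps = c.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
        boolean hasDepth = false;
        if (caps != null) {
            for (int cap : caps) {
                if (cap == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT) {
                    hasDepth = true;
                }
            }
        }
        if (!hasDepth) {
            return false;
        }
        StreamConfigurationMap map = c.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        android.util.Size[] sizes = (map == null) ? null : map.getOutputSizes(ImageFormat.DEPTH16);
        return sizes != null && sizes.length > 0;
    }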
8967 </details> 8968 <tag id="DEPTH" /> 8969 </entry> 8970 <entry name="availableDepthMinFrameDurations" type="int64" visibility="ndk_public" 8971 container="array" typedef="streamConfigurationDuration" hwlevel="limited"> 8972 <array> 8973 <size>4</size> 8974 <size>n</size> 8975 </array> 8976 <description>This lists the minimum frame duration for each 8977 format/size combination for depth output formats. 8978 </description> 8979 <units>(format, width, height, ns) x n</units> 8980 <details> 8981 This should correspond to the frame duration when only that 8982 stream is active, with all processing (typically in android.*.mode) 8983 set to either OFF or FAST. 8984 8985 When multiple streams are used in a request, the minimum frame 8986 duration will be max(individual stream min durations). 8987 8988 The minimum frame duration of a stream (of a particular format, size) 8989 is the same regardless of whether the stream is input or output. 8990 8991 See android.sensor.frameDuration and 8992 android.scaler.availableStallDurations for more details about 8993 calculating the max frame rate. 8994 8995 (Keep in sync with {@link 8996 android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration}) 8997 </details> 8998 <tag id="DEPTH" /> 8999 </entry> 9000 <entry name="availableDepthStallDurations" type="int64" visibility="ndk_public" 9001 container="array" typedef="streamConfigurationDuration" hwlevel="limited"> 9002 <array> 9003 <size>4</size> 9004 <size>n</size> 9005 </array> 9006 <description>This lists the maximum stall duration for each 9007 output format/size combination for depth streams. 9008 </description> 9009 <units>(format, width, height, ns) x n</units> 9010 <details> 9011 A stall duration is how much extra time would get added 9012 to the normal minimum frame duration for a repeating request 9013 that has streams with non-zero stall. 9014 9015 This functions similarly to 9016 android.scaler.availableStallDurations for depth 9017 streams. 9018 9019 All depth output stream formats may have a nonzero stall 9020 duration. 9021 </details> 9022 <tag id="DEPTH" /> 9023 </entry> 9024 <entry name="depthIsExclusive" type="byte" visibility="public" 9025 enum="true" typedef="boolean" hwlevel="limited"> 9026 <enum> 9027 <value>FALSE</value> 9028 <value>TRUE</value> 9029 </enum> 9030 <description>Indicates whether a capture request may target both a 9031 DEPTH16 / DEPTH_POINT_CLOUD output, and normal color outputs (such as 9032 YUV_420_888, JPEG, or RAW) simultaneously. 9033 </description> 9034 <details> 9035 If TRUE, including both depth and color outputs in a single 9036 capture request is not supported. An application must interleave color 9037 and depth requests. If FALSE, a single request can target both types 9038 of output. 9039 9040 Typically, this restriction exists on camera devices that 9041 need to emit a specific pattern or wavelength of light to 9042 measure depth values, which causes the color image to be 9043 corrupted during depth measurement. 9044 </details> 9045 </entry> 9046 </static> 9047 </section> 9048 </namespace> 9049</metadata> 9050