High-dynamic-range imaging


Tone mapped high-dynamic-range (HDR) image of St. Kentigern's Roman Catholic Church in Blackpool, Lancashire, England, UK

High-dynamic-range imaging (HDRI or HDR) is a technique used in imaging and photography to reproduce a greater dynamic range of luminosity than is possible with standard digital imaging or photographic techniques. The aim is to present the human eye with a similar range of luminance to that which, through the visual system, is familiar in everyday life. The human eye, through adaptation of the iris and other methods, adjusts constantly to the broad dynamic changes ubiquitous in our environment. The brain continuously interprets this information so that most of us can see in a wide range of light conditions. Most cameras, on the other hand, cannot.

HDR images can represent a greater range of luminance levels than can be achieved using more 'traditional' methods, allowing them to capture scenes that range from very bright, direct sunlight to extreme shade, or very faint nebulae. This is often achieved by capturing and then combining several different, narrower-range exposures of the same subject matter.[1][2][3][4] Non-HDR cameras take photographs with a limited exposure range, resulting in the loss of detail in highlights or shadows.

The two primary types of HDR images are computer renderings and images resulting from merging multiple low-dynamic-range (LDR)[5] or standard-dynamic-range (SDR)[6] photographs. HDR images can also be acquired using special image sensors, like an oversampled binary image sensor.

Due to the limitations of printing and display contrast, acquiring an HDR image is only half the story; one must also develop methods of displaying the results. The method of rendering an HDR image to a standard monitor or printing device is called tone mapping. This method reduces the overall contrast of an HDR image to facilitate display on devices or printouts with lower dynamic range, and can be applied to produce images with preserved or exaggerated local contrast for artistic effect.

Photography

In photography, dynamic range is measured in exposure value (EV) differences, known as stops. An increase of one EV, or 'one stop', represents a doubling of the amount of light; a decrease of one EV represents a halving. Revealing detail in the darkest shadows requires a high exposure, while preventing the 'bleaching out' of detail in very bright areas requires a very low exposure. Most cameras cannot provide this range of exposure values within a single exposure, due to their limited dynamic range.

Dynamic ranges of common devices
Device                              Stops      Contrast
LCD                                 9.5        700:1 (250:1 – 1750:1)
Negative film (Kodak VISION3)       13[7]      8000:1
Human eye (static)                  10–14[8]   1000:1 – 15000:1
High-end DSLR camera (Nikon D810)   14.8[9]    28500:1
Human eye (dynamic)                 20         1000000:1
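The two columns of the table are related by a power of two: n stops correspond to a contrast ratio of 2^n : 1. A small sketch of that conversion (function names are illustrative):

```python
import math

def stops_to_contrast(stops: float) -> float:
    """Each stop doubles the light, so n stops give a 2**n : 1 ratio."""
    return 2.0 ** stops

def contrast_to_stops(contrast: float) -> float:
    """Inverse mapping: a contrast ratio back to stops (EV difference)."""
    return math.log2(contrast)

# Cross-check against the table: 13 stops -> 8192:1, listed as ~8000:1,
# and the dynamic eye's 1000000:1 corresponds to roughly 20 stops.
print(stops_to_contrast(13))                      # 8192.0
print(round(contrast_to_stops(1_000_000), 1))     # 19.9
```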

High-dynamic-range photographs are generally achieved by capturing multiple standard-exposure images, often using exposure bracketing, and then merging them (usually within a photo manipulation program) into a single HDR image. Digital images are often encoded in a camera's raw image format, because 8-bit JPEG encoding does not offer a great enough range of values to allow fine transitions (and, for HDR work, lossy compression later introduces undesirable artifacts).
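A minimal sketch of the merging step, assuming the input images have already been linearised (real pipelines first recover the camera response curve, e.g. Debevec and Malik's method); the hat-shaped weighting and the helper name `merge_exposures` are illustrative choices, not a standard API:

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge linearised exposures of one scene into an HDR radiance map.

    images: list of same-shaped float arrays with values in [0, 1].
    exposure_times: shutter time in seconds for each image.
    """
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # Hat-shaped weight: trust mid-tones, down-weight pixels near
        # clipping (1.0) or the noise floor (0.0).
        w = 1.0 - np.abs(2.0 * img - 1.0)
        num += w * (img / t)       # per-exposure estimate of scene radiance
        den += w
    return num / np.maximum(den, 1e-8)

# A pixel of true radiance 2.0 reads 0.5 at 1/4 s and 0.2 at 1/10 s;
# the merge recovers ~2.0 from the pair.
hdr = merge_exposures([np.array([0.5]), np.array([0.2])], [0.25, 0.1])
print(hdr)   # ~[2.0]
```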

Any camera that allows manual exposure control can make images for HDR work, although one equipped with auto exposure bracketing (AEB) is far better suited. Images from film cameras are less suitable as they often must be digitized first, so that they can later be processed using software HDR methods.

In most imaging devices, the degree of exposure to light applied to the active element (be it film or CCD) can be altered in one of two ways: by increasing or decreasing the size of the aperture, or by increasing or decreasing the time of each exposure. Exposure variation in an HDR set is done only by altering the exposure time and not the aperture size; altering the aperture also changes the depth of field, so the resulting images would differ too much to be combined into a single HDR image.
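The exposure-time-only variation described above can be sketched as a helper that generates a shutter-speed bracket at fixed aperture (the function name and default values are illustrative):

```python
def bracket_shutter_speeds(base_time: float, ev_step: float = 2.0, frames: int = 3):
    """Shutter times for an HDR bracket centred on base_time.

    The aperture stays fixed (constant depth of field), so each EV step
    is a power-of-two change in exposure *time* only.
    """
    half = frames // 2
    return [base_time * 2.0 ** (ev_step * i) for i in range(-half, half + 1)]

# A three-frame, plus/minus 2 EV bracket around 1/60 s:
print(bracket_shutter_speeds(1 / 60))   # ~[1/240, 1/60, 1/15]
```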

An important limitation of HDR photography is that any movement between successive images will impede or prevent success in combining them afterwards. Also, because one must capture several images (often three or five, sometimes more) to obtain the desired luminance range, such a full 'set' of images takes extra time. HDR photographers have developed calculation methods and techniques to partially overcome these problems, but the use of a sturdy tripod is, at a minimum, advised.

Some cameras have an auto exposure bracketing (AEB) feature with a far greater dynamic range than others, from the 3 EV of the Canon EOS 40D to the 18 EV of the Canon EOS-1D Mark II.[10] As the popularity of this imaging method has grown, several camera manufacturers now offer built-in HDR features. For example, the Pentax K-7 DSLR has an HDR mode that captures an HDR image and outputs (only) a tone mapped JPEG file.[11] The Canon PowerShot G12, Canon PowerShot S95 and Canon PowerShot S100 offer similar features in a smaller format.[12] Some smartphones provide HDR modes, and most mobile platforms have apps that provide HDR picture taking.[13]

Camera characteristics such as gamma curves, sensor resolution, noise, photometric calibration and color calibration affect resulting high-dynamic-range images.[14]

Color film negatives and slides consist of multiple film layers that respond to light differently. As a consequence, transparent originals (especially positive slides) feature a very high dynamic range.[15]

Tone mapping

Tone mapping reduces the dynamic range, or contrast ratio, of an entire image while retaining localized contrast. Although it is a distinct operation, tone mapping is often applied to HDRI files by the same software package.
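As an illustration, one of the best-known global tone mapping operators is Reinhard's L/(1+L) curve; the sketch below applies it to a single-channel luminance image, omitting the colour handling and local-adaptation variants of the full method.

```python
import numpy as np

def reinhard_tonemap(luminance, key=0.18):
    """Global Reinhard operator: compress HDR luminance into [0, 1).

    The image is scaled so its log-average (geometric mean) luminance
    maps to `key`, a conventional middle grey, then run through L/(1+L).
    """
    lum = np.asarray(luminance, dtype=np.float64)
    log_avg = np.exp(np.mean(np.log(lum + 1e-8)))
    scaled = key * lum / log_avg
    return scaled / (1.0 + scaled)

# Four decades of input luminance end up in displayable [0, 1):
print(reinhard_tonemap(np.array([0.01, 1.0, 100.0])))
```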

Several software applications are available on the PC, Mac and Linux platforms for producing HDR files and tone mapped images.

Comparison with traditional digital images

Information stored in high-dynamic-range images typically corresponds to the physical values of luminance or radiance that can be observed in the real world. This is different from traditional digital images, which represent colors that should appear on a monitor or a paper print. Therefore, HDR image formats are often called scene-referred, in contrast to traditional digital images, which are device-referred or output-referred. Furthermore, traditional images are usually encoded for the human visual system (maximizing the visual information stored in the fixed number of bits), which is usually called gamma encoding or gamma correction. The values stored for HDR images are often gamma compressed (power law) or logarithmically encoded, or floating-point linear values, since fixed-point linear encodings are increasingly inefficient over higher dynamic ranges.[16][17][18]
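The gamma encoding mentioned above can be sketched with a pure power law (a simplification: real transfer functions such as sRGB add a short linear segment near black):

```python
def gamma_encode(linear: float, gamma: float = 2.2) -> float:
    """Linear light -> perceptually spaced code value."""
    return linear ** (1.0 / gamma)

def gamma_decode(encoded: float, gamma: float = 2.2) -> float:
    """Code value -> linear light."""
    return encoded ** gamma

# Half of the code range covers only the darkest ~22% of linear light,
# which is why gamma encoding uses the bits of an 8-bit image so well:
print(round(gamma_decode(0.5), 3))   # 0.218
```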

Unlike traditional images, HDR images often do not use fixed ranges per color channel; this lets them represent many more colors over a much wider dynamic range. Instead of integer values for each color channel (e.g., 0–255 in an 8-bit-per-channel encoding for red, green and blue), they typically use a floating-point representation, commonly 16-bit (half precision) or 32-bit floating-point numbers per HDR pixel. However, when an appropriate transfer function is used, HDR pixels for some applications can be represented with as few as 10–12 bits for luminance and 8 bits for chrominance without introducing any visible quantization artifacts.[16][19]
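The difference between the two representations can be seen in a few lines of NumPy: half-precision floats (as used by formats such as OpenEXR) preserve values far above 1.0, while an 8-bit integer channel clips them.

```python
import numpy as np

# Luminance values spanning many orders of magnitude:
values = np.array([0.001, 0.5, 100.0, 20000.0])

half = values.astype(np.float16)                               # 16-bit half floats
eight_bit = np.clip(values * 255, 0, 255).astype(np.uint8) / 255.0

print(half)        # all four magnitudes survive (half max ~= 65504)
print(eight_bit)   # everything above 1.0 clips to 1.0
```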

History of HDR photography

Mid-nineteenth century

Photo by Gustave Le Gray

The idea of using several exposures to fix a too-extreme range of luminance was pioneered as early as the 1850s by Gustave Le Gray to render seascapes showing both the sky and the sea. Such rendering was impossible at the time using standard methods, the luminosity range being too extreme. Le Gray used one negative for the sky, and another one with a longer exposure for the sea, and combined the two into one picture in positive.[20]

Mid-twentieth century

External image: Schweitzer at the Lamp, by W. Eugene Smith[21][22]

Manual tone mapping was accomplished by dodging and burning – selectively increasing or decreasing the exposure of regions of the photograph to yield better tonality reproduction. This is effective because the dynamic range of the negative is significantly higher than would be available on the finished positive paper print when that is exposed via the negative in a uniform manner. An excellent example is the photograph Schweitzer at the Lamp by W. Eugene Smith, from his 1954 photo essay A Man of Mercy on Dr. Albert Schweitzer and his humanitarian work in French Equatorial Africa. The print took five days to produce, in order to reproduce the tonal range of the scene, which ranges from a bright lamp (relative to the scene) to a dark shadow.[22]

Ansel Adams elevated dodging and burning to an art form. Many of his famous prints were manipulated in the darkroom with these two methods. Adams wrote a comprehensive book on producing prints called The Print, which prominently features dodging and burning, in the context of his Zone System.

With the advent of color photography, tone mapping in the darkroom was no longer possible due to the specific timing needed during the developing process of color film. Photographers looked to film manufacturers to design new film stocks with improved response, or continued to shoot in black and white to use tone mapping methods.[citation needed]

Exposure/Density Characteristics of Wyckoff's Extended Exposure Response Film

Color film capable of directly recording high-dynamic-range images was developed by Charles Wyckoff and EG&G "in the course of a contract with the Department of the Air Force".[23] This XR film had three emulsion layers, an upper layer having an ASA speed rating of 400, a middle layer with an intermediate rating, and a lower layer with an ASA rating of 0.004. The film was processed in a manner similar to color films, and each layer produced a different color.[24] The dynamic range of this extended-range film has been estimated as 1:10^8.[25] It has been used to photograph nuclear explosions,[26] for astronomical photography,[27] for spectrographic research,[28] and for medical imaging.[29] Wyckoff's detailed pictures of nuclear explosions appeared on the cover of Life magazine in the mid-1950s.

Late-twentieth century

The concept of neighborhood tone mapping was applied to video cameras in 1988 by a group from the Technion in Israel, led by Dr. Oliver Hilsenrath and Prof. Y. Y. Zeevi, who filed for a patent on the concept.[30] In 1993 the same group introduced the first commercial medical camera to capture multiple images with different exposures in real time and produce an HDR video image.[31]

Modern HDR imaging uses a completely different approach, based on making a high-dynamic-range luminance or light map using only global image operations (applied across the entire image), and then tone mapping the result. Global HDR was first introduced in 1993,[1] and a mathematical theory of differently exposed pictures of the same subject matter was published in 1995 by Steve Mann and Rosalind Picard.[2]

On October 28, 1998, Ben Sarao created one of the first night-time HDR+G (High Dynamic Range + Graphic) images, of STS-95 on the launch pad at NASA's Kennedy Space Center. It consisted of four film images of the shuttle at night that were digitally composited with additional digital graphic elements. The image was first exhibited at the NASA Headquarters Great Hall, Washington, D.C., in 1999, and then published in Hasselblad Forum, Issue 3 1993, Volume 35, ISSN 0282-5449.[32]

The advent of consumer digital cameras produced a new demand for HDR imaging to improve the light response of digital camera sensors, which had a much smaller dynamic range than film. Steve Mann developed and patented the global-HDR method for producing digital images having extended dynamic range at the MIT Media Laboratory.[33] Mann's method involved a two-step procedure: first, generate one floating-point image array by global-only image operations (operations that affect all pixels identically, without regard to their local neighborhoods); then, convert this image array, using local neighborhood processing (tone remapping, etc.), into an HDR image. The image array generated by the first step is called a lightspace image, lightspace picture, or radiance map. A further benefit of global-HDR imaging is that it provides access to this intermediate light or radiance map, which has been used for computer vision and other image processing operations.[33]

In 2005, Adobe Systems introduced several new features in Photoshop CS2, including Merge to HDR, 32-bit floating point image support, and HDR tone mapping.[34]

Video

Example of HDR time-lapse video

While custom high-dynamic-range digital video solutions had been developed for industrial manufacturing during the 1980s, it was not until the early 2000s that several scholarly research efforts used consumer-grade sensors and cameras.[35] A few companies such as RED[36] and Arri[37] have been developing digital sensors capable of a higher dynamic range. RED EPIC-X can capture HDRx images with a user selectable 1-3 stops of additional highlight latitude in the 'x' channel. The 'x' channel can be merged with the normal channel in post production software.

With the advent of low-cost consumer digital cameras, many amateurs began posting tone mapped HDR time-lapse videos on the Internet, essentially a sequence of still photographs in quick succession. In 2010 the independent studio Soviet Montage produced an example of HDR video from disparately exposed video streams using a beam splitter and consumer grade HD video cameras.[38] Similar methods have been described in the academic literature in 2001[39] and 2007.[40]

Modern movies are often filmed with cameras featuring a higher dynamic range, and legacy movies can be upgraded even if manual intervention is needed for some frames (as with the conversion of older black-and-white films to color). Special effects, especially those in which real and synthetic footage are seamlessly mixed, require both HDR shooting and rendering. HDR video is also needed in applications that demand high accuracy in capturing temporal changes in the scene, such as the monitoring of industrial processes like welding, predictive driver assistance systems in the automotive industry, and surveillance video systems. HDR video can also speed image acquisition in applications that need a large number of static HDR images, for example in image-based methods in computer graphics.

Finally, with the spread of TV sets with enhanced dynamic range, broadcasting HDR video may become important, but may take a long time to occur due to standardization issues. For this particular application, having intelligent TV sets enhance the current low-dynamic-range (LDR) video signal to HDR seems to be a more viable near-term solution.[41]

On August 27, 2015, the Consumer Electronics Association announced its definition of HDR-compatible displays, which must be able to process HDR10 Media Profile video, which uses the Rec. 2020 color space, SMPTE ST 2084, and a bit depth of 10 bits.[42]
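For illustration, SMPTE ST 2084 ('PQ') defines a transfer function from absolute luminance (up to 10000 cd/m²) to signal values; a sketch of its inverse EOTF follows, using the constants published in the standard (the 10-bit rounding at the end mirrors HDR10's bit depth):

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance in cd/m^2 -> PQ signal in [0, 1] (inverse EOTF)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

# Typical SDR white (100 cd/m^2) lands near mid-range of the 10-bit code:
print(round(pq_encode(100.0) * 1023))   # ~520
print(pq_encode(10000.0))               # 1.0 at the 10000-nit ceiling
```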

Example

This is an example of four standard dynamic range images that are combined to produce two resulting tone mapped images.

(Image gallery: original images, followed by results after processing.)

HDR sensors

More and more CMOS image sensors now have high-dynamic-range capability within the pixels themselves. Such pixels are intrinsically non-linear (by design), so that the wide dynamic range of the scene is non-linearly compressed into a smaller dynamic range electronic representation inside the pixel.[43] Such sensors are used in extreme-dynamic-range applications such as welding and automotive imaging.
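A toy model of such in-pixel compression, assuming a logarithmic response (the `full_scale` figure below is an illustrative assumption, not a datasheet value):

```python
import math

def log_response(photocurrent: float, full_scale: float = 1e6) -> float:
    """Compress ~6 decades of photocurrent into a [0, 1] pixel signal."""
    return math.log1p(photocurrent) / math.log1p(full_scale)

# 120 dB of scene range maps to roughly evenly spaced signal levels:
for x in (1.0, 1e3, 1e6):
    print(round(log_response(x), 2))   # 0.05, 0.5, 1.0
```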

Some other sensors, designed for use in security applications, can automatically provide two or more images for each frame with different exposures. For example, a sensor for 30 fps video can output 60 fps, with the odd frames at a short exposure time and the even frames at a longer exposure time. Some of these sensors may even combine the two images on-chip, so that a wider dynamic range without in-pixel compression is directly available to the user for display or processing.
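The alternating-exposure scheme can be sketched as a per-pixel merge of each short/long frame pair (the clip threshold and function name are illustrative):

```python
import numpy as np

def merge_frame_pair(short_img, long_img, t_short, t_long, clip=0.95):
    """Merge one short/long exposure pair into an HDR frame.

    Prefer the cleaner long exposure, but fall back to the short one
    wherever the long exposure has clipped. Inputs are in [0, 1].
    """
    short_rad = short_img / t_short     # radiance estimate, short frame
    long_rad = long_img / t_long        # radiance estimate, long frame
    return np.where(long_img < clip, long_rad, short_rad)

# A bright pixel clips in the 1/25 s frame but not in the 1/200 s frame:
hdr = merge_frame_pair(np.array([0.1, 0.5]), np.array([0.8, 1.0]), 0.005, 0.04)
print(hdr)   # ~[20.0, 100.0]
```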

See also

References

  1. Steve Mann, "Compositing Multiple Pictures of the Same Scene", in IS&T's 46th Annual Conference, Cambridge, Massachusetts, May 9–14, 1993.
  2. (citation details not recoverable from source)
  3. (citation details not recoverable from source)
  4. (citation details not recoverable from source)
  5. (citation details not recoverable from source)
  6. (citation details not recoverable from source)
  7. (citation details not recoverable from source)
  8. (citation details not recoverable from source)
  9. (citation details not recoverable from source)
  10. (citation details not recoverable from source)
  11. (citation details not recoverable from source)
  12. (citation details not recoverable from source)
  13. HDR apps for Android, Google Play.
  14. (citation details not recoverable from source)
  15. (citation details not recoverable from source)
  16. (citation details not recoverable from source)
  17. (citation details not recoverable from source)
  18. (citation details not recoverable from source)
  19. (citation details not recoverable from source)
  20. J. Paul Getty Museum, Gustave Le Gray, Photographer, July 9 – September 29, 2002. Retrieved September 14, 2008.
  21. Jon Meyer, The Future of Digital Imaging – High Dynamic Range Photography, February 2004.
  22. Frédo Durand and Julie Dorsey, 4.209: The Art and Science of Depiction – Limitations of the Medium: Compensation and Accentuation – The Contrast is Limited, lecture of Monday, April 9, 2001, slides 57–59; image on slide 57, depiction of dodging and burning on slide 58.
  23. (citation details not recoverable from source)
  24. C. W. Wyckoff, "Experimental extended exposure response film", Society of Photographic Instrumentation Engineers Newsletter, June–July 1962, pp. 16–20.
  25. Michael Goesele et al., "High Dynamic Range Techniques in Graphics: from Acquisition to Display", Eurographics 2005 Tutorial T7.
  26. The Militarily Critical Technologies List (1998), pages II-5-100 and II-5-107.
  27. Andrew T. Young and Harold Boeschenstein, Jr., "Isotherms in the region of Proclus at a phase angle of 9.8 degrees", Scientific Report No. 5, Harvard College Observatory, Cambridge, Massachusetts, 1964.
  28. (citation details not recoverable from source)
  29. (citation details not recoverable from source)
  30. US granted patent 5144442, Ginosar, R., Hilsenrath, O., Zeevi, Y., "Wide dynamic range camera", published 1992-09-01.
  31. (citation details not recoverable from source)
  32. (citation details not recoverable from source)
  33. US application 5828793, Steve Mann, "Method and apparatus for producing digital images having extended dynamic ranges", published 1998-10-27.
  34. (citation details not recoverable from source)
  35. (citation details not recoverable from source)
  36. https://www.red.com/epic_scarlet/ [dead link]
  37. http://www.arridigital.com/alexa [dead link]
  38. (citation details not recoverable from source)
  39. (citation details not recoverable from source)
  40. (citation details not recoverable from source)
  41. (citation details not recoverable from source)
  42. (citation details not recoverable from source)
  43. (citation details not recoverable from source)