
WWDC 2023: HDR images

Find below a detailed summary of two videos that belong to a taxonomy of WWDC sessions.

The original videos are available on Apple's official website (session 10113, session 10181).

"Learn how to identify, load, display, and create High Dynamic Range (HDR) still images in your app. Explore common HDR concepts and find out about the latest updates to the ISO specification. Learn how to identify and display HDR images with SwiftUI and UIKit, create them from ProRAW and RAW captures, and display them in CALayers. We'll also take you through CoreGraphics support for ISO HDR and share best practices for HDR adoption."

"EDR is Apple's High Dynamic Range representation and rendering pipeline. Explore how you can render HDR content using EDR in your app and unleash the dynamic range capabilities of HDR displays on iPhone and iPad. We'll show how you can take advantage of the native EDR APIs on iOS, provide best practices to help you decide when HDR is appropriate, and share tips for tone-mapping and HDR content rendering. We'll also introduce you to Reference Mode and highlight how it provides a reference response to enable color-critical workflows such as color grading, editing, and content review."

In what follows, the underlined elements link directly to the playback of the WWDC video at the relevant moment.

To keep the text concise, the following acronyms are used throughout this document:

  • SDR = Standard Dynamic Range
  • HDR = High Dynamic Range
  • EDR = Extended Dynamic Range

The basics #

An HDR display provides a higher-contrast image that brings out its brightest parts and improves the perceived definition.
Thanks to additional data embedded in iPhone images, gain map HDR can build an HDR image from an SDR one.

Besides providing a better user experience and a cross-platform implementation, HDR support is built on a partnership with the International Organization for Standardization (ISO): an ISO HDR specification for HDR images that introduces new content, including additional optional metadata fields.

The brightness range available above SDR on an HDR display is called the headroom; it conditions how bright and dynamic the content can be, and it is what allows EDR (Apple's HDR technology) to affect both the SDR and HDR representations.

EDR depends on the headroom, whose value relates the display peak brightness to the SDR brightness; this means it also depends on the device's capabilities.
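
As a rough illustration, the headroom can be seen as the ratio between those two quantities. The figures below are hypothetical; on iOS 16 and later the system reports the actual value, as shown in the "Query headroom" section.

```swift
// Illustrative sketch only: real values come from the system, not from the app.
let displayPeakBrightness = 1000.0  // nits, hypothetical device capability
let sdrBrightness = 250.0           // nits, hypothetical current SDR brightness

// Headroom: how much brighter than SDR white the display can currently go.
let headroom = displayPeakBrightness / sdrBrightness  // 4.0 in this example
```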

EDR #

New features #

Besides the availability of the EDR API on iOS 16 and iPadOS 16, two other features were introduced at WWDC22.

🎬

Supported by LumaFusion since WWDC22, Reference Mode enables a color rendering that matches what the content's specification describes.

🎬

Sidecar, the technology that lets users turn their iPad into a secondary display, now supports reference-level SDR and HDR content.

Read the content #

The process of reading HDR content and converting it to a renderable format consists of four steps, each of which must be performed correctly.
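
As an illustrative sketch only (not necessarily the session's exact decomposition), those steps could look like this with Core Image and Metal: load the image, create a context, allocate an extended-range destination, and render in an extended linear color space.

```swift
import CoreImage
import Metal

// Hypothetical input: a URL pointing to an HDR image file.
let url = URL(fileURLWithPath: "photo.heic")

// 1. Load the file as a CIImage.
let image = CIImage(contentsOf: url)!

// 2. Create a Core Image context backed by the default Metal device.
let device = MTLCreateSystemDefaultDevice()!
let context = CIContext(mtlDevice: device)

// 3. Allocate a half-float texture so extended-range values are preserved.
let descriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba16Float,
    width: Int(image.extent.width),
    height: Int(image.extent.height),
    mipmapped: false)
descriptor.usage = [.shaderRead, .shaderWrite]
let texture = device.makeTexture(descriptor: descriptor)!

// 4. Render into the texture using an extended linear color space.
context.render(image,
               to: texture,
               commandBuffer: nil,
               bounds: image.extent,
               colorSpace: CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)!)
```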

Stick to EDR #

Supporting EDR rendering relies on a two-step process.
A precise combination of pixel format and color space is essential to support EDR; if it is not respected, the content is clipped and downgraded to SDR.
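
With a CAMetalLayer, for example, that combination could look like the following sketch (the property names are the public CAMetalLayer API; the choice of Display P3 is an assumption):

```swift
import QuartzCore
import Metal

// Minimal sketch: configure a CAMetalLayer so its content is treated as EDR.
let layer = CAMetalLayer()

// Opt the layer into extended dynamic range rendering.
layer.wantsExtendedDynamicRangeContent = true

// A floating-point pixel format keeps values above 1.0...
layer.pixelFormat = .rgba16Float

// ...and must be paired with an extended (linear) color space.
layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
```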

Query headroom #

Whether through NSScreen or UIScreen, the headroom queries are quite different and address various situations depending on the needs.

The new iOS APIs lead to accurate conclusions about how EDR content will render...

... and indicate the four possible states of the display.
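
A sketch of the current-headroom query on iOS (the macOS counterparts on NSScreen are listed as comments):

```swift
import UIKit

// Headroom the display can use right now, given the current conditions.
let currentHeadroom = UIScreen.main.currentEDRHeadroom

// macOS equivalents on NSScreen:
// maximumExtendedDynamicRangeColorComponentValue            (current)
// maximumPotentialExtendedDynamicRangeColorComponentValue   (potential)
// maximumReferenceExtendedDynamicRangeColorComponentValue   (reference)
```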

Tone-mapping #

This technique is used to map the pixel levels of high dynamic range images to a reduced dynamic range, while trying to retain the original appearance as closely as possible.

Instead of using currentEDRHeadroom for a customized rendering, the built-in tone mapping may also be used by following three milestones, including dedicated constructors for specific color spaces.
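
One way to opt into that built-in tone mapping is to attach EDR metadata to the layer, as in this sketch (the HDR10 luminance figures are hypothetical):

```swift
import QuartzCore

// Minimal sketch: let the system tone-map the layer's content.
let layer = CAMetalLayer()
layer.wantsExtendedDynamicRangeContent = true

// HLG content: the dedicated constructor takes no parameters.
layer.edrMetadata = CAEDRMetadata.hlg

// HDR10 content: hypothetical mastering values, in nits.
// layer.edrMetadata = CAEDRMetadata.hdr10(minLuminance: 0.005,
//                                         maxLuminance: 1000,
//                                         opticalOutputScale: 100)
```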

HDR #

Display images #

🎬

🎬

The way HDR content is handled is controlled by the three options of the dynamic range properties (high, standard and constrainedHigh), for which detailed examples make their respective use cases clear (ex1 & ex2).
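
A minimal sketch of those options (iOS 17 APIs; "sunset" is a hypothetical asset name):

```swift
import UIKit
import SwiftUI

// UIKit: choose how much of the image's dynamic range the view may use.
let imageView = UIImageView(image: UIImage(named: "sunset"))
imageView.preferredImageDynamicRange = .constrainedHigh  // or .high / .standard

// SwiftUI: the equivalent modifier on Image.
struct HDRPhoto: View {
    var body: some View {
        Image("sunset")
            .allowedDynamicRange(.high)
    }
}
```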

It's crucial to keep up with the latest HDR features so as not to add unnecessary workload.

Even though importing an image into an app is fairly simple and knowing whether it is HDR is not strictly necessary, each platform provides an API for that purpose when needed.
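
On iOS, for example, that check could be as simple as the sketch below (iOS 17; "photo" is a hypothetical asset name):

```swift
import UIKit

// Ask UIKit whether a decoded image actually carries HDR data.
if let image = UIImage(named: "photo"), image.isHighDynamicRange {
    // HDR-specific handling (e.g. pick an HDR rendering path).
} else {
    // Plain SDR handling.
}
```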

iOS and iPadOS provide APIs to get the maximum display headroom of a device, so that HDR is used only when appropriate.
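
A sketch of that decision (iOS 16 and later; the threshold is an assumption, not a system rule):

```swift
import UIKit

// Maximum headroom the display could reach under ideal conditions.
let maxHeadroom = UIScreen.main.potentialEDRHeadroom

// Only take an HDR rendering path when the display can actually show
// values brighter than SDR white.
let shouldRenderHDR = maxHeadroom > 1.0
```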

Pipeline details #

Handling HDR images involves specific steps between the source and its renderer: the image processing pipeline.

Read #

Reading an image depends on its type and on the technical way it is described.
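
As one example of those reading paths, Core Image can expand a gain map photo into HDR at load time (a sketch; the file name is hypothetical and the option is only honored on recent systems):

```swift
import CoreImage

// Hypothetical gain map (SDR + gain map) photo.
let url = URL(fileURLWithPath: "IMG_0001.heic")

// Ask Core Image to combine the SDR rendition and its gain map into HDR.
let hdrImage = CIImage(contentsOf: url, options: [.expandToHDR: true])
```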

🎬

🎬

🎬

🎬

🎬

RAW files can be displayed as HDR thanks to the dynamic range they contain.
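
With Core Image, the RAW pipeline already exposes that range, as in this short sketch (file name hypothetical):

```swift
import CoreImage

// Decode a RAW capture; the resulting CIImage keeps the extended range
// the sensor recorded, so it can feed an HDR rendering path directly.
let rawFilter = CIRAWFilter(imageURL: URL(fileURLWithPath: "IMG_0002.dng"))
let hdrImage = rawFilter?.outputImage
```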

Modify & write #

🎬

Modifying an image relies on the native HDR CIFilters.
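
A sketch of that flow, assuming the chosen filter preserves extended-range values (the input, output path and color space choices are illustrative):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Hypothetical extended-range input (see the reading step above).
let hdrImage = CIImage(contentsOf: URL(fileURLWithPath: "IMG_0001.heic"))

// Apply an HDR-capable adjustment.
let filter = CIFilter.exposureAdjust()
filter.inputImage = hdrImage
filter.ev = 0.5  // hypothetical exposure tweak

// Write the result as a 10-bit HEIF in a PQ color space so the HDR survives.
if let edited = filter.outputImage {
    let context = CIContext()
    do {
        try context.writeHEIF10Representation(
            of: edited,
            to: URL(fileURLWithPath: "edited.heic"),
            colorSpace: CGColorSpace(name: CGColorSpace.itur_2100_PQ)!,
            options: [:])
    } catch {
        print("Export failed: \(error)")
    }
}
```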

🎬

Convert #

🎬

Converting an image into an IOSurface or a CVPixelBuffer object can be useful when it serves as a CALayer's contents.
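
For instance, rendering into a half-float, IOSurface-backed CVPixelBuffer keeps the extended range available for a layer, as in this sketch (input file and buffer attributes are illustrative):

```swift
import CoreImage
import CoreVideo

// Hypothetical extended-range source image.
let hdrImage = CIImage(contentsOf: URL(fileURLWithPath: "IMG_0001.heic"))!

// Create a 64-bit half-float, IOSurface-backed pixel buffer matching the image.
var pixelBuffer: CVPixelBuffer?
let attributes = [kCVPixelBufferIOSurfacePropertiesKey as String: NSDictionary()] as CFDictionary
CVPixelBufferCreate(kCFAllocatorDefault,
                    Int(hdrImage.extent.width),
                    Int(hdrImage.extent.height),
                    kCVPixelFormatType_64RGBAHalf,
                    attributes,
                    &pixelBuffer)

// Render the image into the buffer; the buffer (or its IOSurface) can then
// back a CALayer's contents.
if let buffer = pixelBuffer {
    CIContext().render(hdrImage, to: buffer)
}
```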

🎬

🎬

Advanced display #

It's important to use the proper classes to render HDR with CALayer.

It's crucial to take into account the appropriate pixel format and CGImage flags when handling HDR data.
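
For 16-bit float (half) HDR pixel data, those flags typically combine float components, 16-bit byte order and an alpha setting, as in this sketch (the exact combination depends on the source buffer):

```swift
import CoreGraphics

// Bitmap info for 16-bit-per-channel floating-point pixel data.
let bitmapInfo: CGBitmapInfo = [
    .floatComponents,    // samples are floating point
    .byteOrder16Little,  // 16-bit little-endian components
    CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
]
```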

A few elements should be considered to keep HDR content compatible with older versions of iOS and macOS.
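
In practice this largely comes down to gating the HDR-specific calls behind availability checks, as in this sketch (reusing the UIKit property shown earlier):

```swift
import UIKit

let imageView = UIImageView()

// Only touch HDR-specific API on systems that provide it; older releases
// simply keep the standard (SDR) rendering path.
if #available(iOS 17.0, *) {
    imageView.preferredImageDynamicRange = .high
}
```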
