
Apple’s Worldwide Developer Conference (WWDC) kicked off this week with the announcement of a new MacBook Air and first looks at macOS 13 Ventura, iOS 16, iPadOS 16, and watchOS 9. It’s a giant stew of features and technologies meant to excite developers and prepare them for the software releases later this year.

But what about photographers? Several photo-related changes are coming, including improvements that take advantage of computational photography. Given this column’s interest in AI and ML technologies, that’s what I’m mostly going to focus on here.

Keep in mind that the operating system releases are currently available only as betas to developers, with full versions coming likely in September or October. As such, it’s possible that some announced features may be delayed or canceled before then. Also, Apple usually saves some details in reserve, particularly regarding the hardware capabilities of new iPhone models.

That said, here are the things that stood out to me.

The M2-Based MacBook Air and MacBook Pro

Photographers’ infamous Gear Acquisition Syndrome isn’t limited to camera bodies and lenses. The redesigned MacBook Air was the noteworthy hardware announcement, specifically because it’s powered by a new M2 processor.

The new MacBook Air uses Apple’s latest M2 chip. Apple


In short, the M2 is faster and better than the M1, which itself was a marked improvement over the Intel processors Apple used before transitioning to its own silicon. A few standout specs will interest photographers. The memory bandwidth is 100 GB/s, 50 percent more than the M1’s, which speeds up operations in general. (The M-series architecture uses a unified pool of memory shared by the CPU and GPU rather than separate memory for each, increasing performance; the M2 can be configured with up to 24 GB.)

The M2’s 20 billion transistors require a physically larger chip than the M1. Apple

Photographers and videographers will also see improvements from the 10 GPU cores, up from 8 on the M1, and an improved onboard media engine: it supports high-bandwidth 8K H.264 and HEVC video decoding, includes a ProRes video engine that enables playback of multiple 8K and 4K video streams, and adds a new image signal processor (ISP) with improved image noise reduction.

In short, the M2 offers more power while also being highly efficient and battery-friendly. (The battery life I get on my 2021 MacBook Pro with M1 Max processor is unreal compared to my 2019 Intel-based model, and I’ve heard the fan spin up only on a handful of occasions over the past 6 months.)

The MacBook Air’s design reflects the new MacBook Pro’s flattened profile—goodbye to the distinctive wedge shape that defined the Air since its introduction—and includes two Thunderbolt ports and a MagSafe charging port. The screen is now a 13.6-inch Liquid Retina display that supports 1 billion colors and can go up to 500 nits of brightness.

The MacBook Air is just as slim as its predecessor and available in four colors. Apple

Apple also announced a 13-inch MacBook Pro with an M2 processor in the same older design, which includes a Touch Bar but no MagSafe connector. The slight advantage of this model over the new MacBook Air is the inclusion of a fan for active cooling, which allows for longer sustained processing.

The M2 MacBook Air starts at $1199, and the M2 MacBook Pro starts at $1299. The M1-powered MacBook Air remains available as the $999 entry-level option.

Continuity Camera

Next on my list of interests is the Continuity Camera feature. Continuity refers to technologies that let you pass information between nearby Apple devices, such as copying text on the Mac and pasting it on an iPad. The Continuity Camera lets you use an iPhone 11 or later as a webcam.

Using a phone as a webcam isn’t new; I’ve long used Reincubate Camo software for this (and, full disclosure, have written a few articles for them). Apple brings its Center Stage technology, which keeps you framed as you move, and Portrait Mode, which artificially softens the background. There’s also a Studio Light setting that boosts the exposure on the subject (you) and darkens the background to simulate external illumination like a ring light. Apple does all of this by using machine learning to identify the subject.
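
Because the iPhone simply shows up as another camera on the Mac, supporting it shouldn’t require anything exotic. Here’s a minimal Swift sketch of listing available cameras with AVFoundation; I’m assuming a Continuity Camera iPhone appears alongside the built-in camera in a standard discovery session, and the exact device-type constants may differ by macOS version.

```swift
import AVFoundation

// List the video capture devices a Mac app can see.
// Assumption: a Continuity Camera iPhone shows up here like any other camera.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .externalUnknown],
    mediaType: .video,
    position: .unspecified
)

for device in discovery.devices {
    print("\(device.localizedName) [\(device.uniqueID)]")
}
```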

But more intriguing is a new Desk View mode: it uses the iPhone’s Ultra-Wide camera and likely some AI technology to apply extreme distortion correction, displaying what’s on your desk as if a downward-facing camera were mounted above you. Other participants on the video call still see you in another frame, presumably captured by the normal Wide camera at the same time.
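
Apple hasn’t detailed the processing behind Desk View, but the general idea of un-warping part of an ultra-wide frame can be sketched with Core Image’s perspective-correction filter. Treat this as a toy illustration rather than Apple’s pipeline; the corner points are hypothetical placeholders you’d normally derive from detecting the desk area.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Toy example: remap a skewed quadrilateral (say, the desk area in an
// ultra-wide frame) to a rectangle, approximating an overhead view.
// The corner coordinates are hypothetical placeholders.
func flattenDeskRegion(in image: CIImage) -> CIImage? {
    let filter = CIFilter.perspectiveCorrection()
    filter.inputImage = image
    filter.topLeft = CGPoint(x: 400, y: 1200)
    filter.topRight = CGPoint(x: 3600, y: 1200)
    filter.bottomLeft = CGPoint(x: 0, y: 0)
    filter.bottomRight = CGPoint(x: 4000, y: 0)
    return filter.outputImage
}
```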

Continuity Camera uses the iPhone’s cameras as webcams and to show a top-down view of the desktop. Apple

Acting on Photo Content

A few new features take advantage of the software’s ability to identify content within images and act on it.

The iPhone in iOS 16 will have a configurable lock screen with options for changing the typeface of the time and adding widgets for quick, glanceable information. If the wallpaper image includes depth information, such as a Portrait Mode photo of someone, the screen automatically places the time behind them (a feature introduced in last year’s watchOS 8 update). It can also suggest photos from your library that would work well as lock screen images.
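
That depth-aware layering depends on the wallpaper photo actually carrying depth (or disparity) data, as Portrait Mode shots do. As an illustration of what “includes depth information” means, here’s how an app could check a file for that auxiliary data with Image I/O; this isn’t how the lock screen itself decides.

```swift
import Foundation
import ImageIO

// Returns true if an image file carries depth or disparity auxiliary data,
// the kind of information Portrait Mode photos embed.
// Illustrative only; not Apple's lock screen logic.
func photoHasDepthData(at url: URL) -> Bool {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return false }
    let auxTypes = [kCGImageAuxiliaryDataTypeDisparity, kCGImageAuxiliaryDataTypeDepth]
    return auxTypes.contains { type in
        CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, type) != nil
    }
}
```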

Awareness of subjects in a photo enables the new iOS 16 lock screen to simulate depth by obscuring the time. Apple

Another clever bit of subject recognition is the ability to lift a subject from the background. You can touch and hold a subject, which is automatically identified and extracted using machine learning, and then drag or copy it to another app, such as Messages.
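
Apple didn’t say which APIs power this, but the underlying idea of isolating a subject with a segmentation mask can be sketched with Vision’s person-segmentation request, which has been available since iOS 15 (and, unlike the system feature, only handles people). Consider it a conceptual stand-in, not the actual mechanism.

```swift
import Vision
import CoreVideo

// Generate a mask separating a person from the background, conceptually
// similar to (but not the same as) the system's subject lifting.
func personMask(for image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate               // favor mask quality over speed
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results?.first?.pixelBuffer
}
```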

Touch to select a subject and then drag it to another app. Apple

The previous iOS and iPadOS updates added Live Text, which lets you select any text that appears in an image. In the next version, you can also pause any frame of video and interact with the text. Developers will be able to add quick actions to do things like convert currency or translate text.
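
Developers have already been able to tap comparable text recognition for a few releases now through the Vision framework. Here’s a minimal sketch that pulls recognized strings out of a still image; the video pausing and the quick actions Apple demonstrated are system-level behavior and aren’t part of this snippet.

```swift
import Vision

// Recognize text in a still image and return the best candidate strings,
// similar in spirit to what Live Text surfaces for selection.
func recognizeText(in image: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    return (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
}
```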

Photos App Improvements

Apple’s Photos app has always occupied an odd space: it’s the default place for saving and organizing images on each platform, but it needs broad enough appeal that it doesn’t put off average users who aren’t looking for complexity. I suspect many photographers turn to apps such as Lightroom or Capture One, but we all still rely on Photos as the gatekeeper for iPhone photos.

In the next update, Apple is introducing iCloud Shared Photo Library, a way for people with iCloud family plans to share a separate photo library with up to six members. Each person can share and receive all the photos, bringing photos from family events together in one library without encroaching on individual personal libraries.

An iCloud Shared Photo Library collects photos from every family member. Apple

You can populate the library manually, use person recognition to include photos where two or more people are together, or set things up so that when family members are together, new photos are automatically sent to the shared library.

Other Photos improvements include duplicate detection, the ability to copy and paste adjustments between photos (individually or in batches), and more granular undo and redo options while editing.

Reference Mode on iPad Pro

The last thing I want to mention isn’t related to computational photography, but it’s cool nonetheless. Currently, the Sidecar feature in macOS lets you use an iPad as an additional display, which is great when you need more screen real estate.

In macOS Ventura and iPadOS 16, an iPad Pro can be set up as a reference monitor to view color-consistent photos and videos as you edit. The catch is that according to Apple’s footnotes, only the 12.9-inch iPad Pro with its gorgeous Liquid Retina XDR display will work, and the Mac must have an M1 or M2 processor. (I added “gorgeous” there; it’s not in the footnotes.)

Use the 12.9-inch M1 iPad Pro as a color-accurate reference monitor. Apple

Speaking of screen real estate, iPadOS 16 finally—finally!—enables you to connect a single external display (up to 6K resolution) and use it to extend the iPad desktop, not just mirror the image. Again, that’s limited to models with the M1 processor, which currently includes the iPad Pro and the iPad Air. But if you’re the type who does a lot of work or photo editing on the iPad, external display support will give you more breathing room.

Extend the iPad Pro’s desktop by connecting an external display. Apple

A new feature called Stage Manager breaks apps out of their full-screen modes to enable up to four simultaneous app windows on the iPad and on the external display. If you’ve ever felt constrained running apps like Lightroom and Photoshop side-by-side in Split View on the same iPad screen, Stage Manager should open things up nicely. Another feature, Display Zoom, can also increase the pixel density to reveal more information on the M1-based iPad’s screen.

More to Come

I’ve focused mostly on features that affect photographers, but there are plenty of other new things coming in the fall. If nothing else, the iPad finally has its own Weather app and the Mac has a full Clock app. That may not sound like much, but it helps when you’re huddled in your car wondering if the rain will let up enough to capture dramatic clouds before sundown, or when you want a timer to remind you to get to bed at a respectable hour while you’re lost in editing.