I Used Eye Tracking on Apple Vision Pro to Edit Lightroom Photos

A few weeks ago, I was flying across the country with a big batch of digital photos I wanted to edit and organize with my preferred tool, Adobe Lightroom. But the seat in front of me was so close I could barely open my laptop lid. Defeated, I ended up doing some crossword puzzles and watching a disappointing movie instead.

But what if I’d had a virtual or augmented reality headset, such as the Apple Vision Pro, that showed my photos on a big screen only I could see?

This week, I got the chance to try exactly that technology, using Adobe’s new Lightroom app for the Apple Vision Pro. I controlled the whole process just by looking at what I wanted and tapping my fingertips together.

And it works. In this early look at the app, I can say it took only a few minutes to learn how to use the headset for standard Lightroom tasks like adjusting a photo’s exposure, applying editing presets, or easing the haze out of a sky.

I came away impressed. The experience also helped convince me not only that Apple has done a good job figuring out an effective interface for what it calls “spatial computing,” but also that developers should have a reasonably easy time bringing at least their iPad apps to the platform.

And that is promising for the long-term potential of Apple’s headset. The more productivity and fun the Apple Vision Pro and its successors offer, the better the chance they’ll appeal to a sizable population, not just some narrow niche like Beat Saber fans.

For me, the most compelling possibility with the Apple Vision Pro is using the virtual and augmented reality headset to have a private workspace in a public area. Lightroom fits well into that concept. I’m not embarrassed by my photos, but I don’t exactly relish sharing them with everyone on a plane flight.

Lightroom support isn’t enough to convince me to buy an Apple Vision Pro, which starts at $3,499, but if I eventually made the investment, I would most certainly use Lightroom on it.

For a broader perspective, see my CNET colleague Scott Stein’s review of the Apple Vision Pro. He calls it the “best wearable display I’ve ever put on,” with impressive technology but also a somewhat unfinished feel.

How Lightroom looks on the Apple Vision Pro

When you launch Lightroom on the Vision Pro, a virtual window opens with your photo catalog. It can occupy a significant portion of your field of view, which is fantastic. You can also keep other windows nearby if you want to multitask more easily than you can on an iPad.

The screen quality on the Apple Vision Pro struck me as very good. I wasn’t bothered by low resolution or any resulting pixelation in my photos. Colors were vibrant, and tonal gradations looked good, although I wasn’t running any calibration tests, so take that with a grain of salt. I was able to use Lightroom’s exciting new HDR mode, though I had to set the headset’s background to very dark to get enough brightness headroom.

Apple’s foveated rendering technology, a power-saving measure that shows high-resolution imagery only for the part of your field of view where you’re directly looking, functioned well for me. I never noticed low-resolution rendering.

I used the headset for a bit under an hour, and it wasn’t custom-fitted for my head, so I can’t say much about weight and comfort. That said, I didn’t think about the weight until I took the headset off and realized I hadn’t noticed it.

How Lightroom works on the Apple Vision Pro

If you’ve used Lightroom on an iPad, you know what Lightroom on an Apple Vision Pro looks like. It’s essentially identical.

On an iPad, you can tap on a slider and move your finger back and forth to brighten or darken a photo’s exposure, for instance. That works the same way on an Apple Vision Pro, except instead of placing your finger on a screen, you direct your gaze at the control, and instead of tapping on the screen, you hold your thumb and index finger together.

Want to open a photo for editing? Look at it and tap your fingers together. Open up the effects editing panel? Look at the effects button and tap your fingers together. Apply a preset? Look at the preset button and tap your fingers together. Notice a pattern here?

I’d never used an Apple Vision Pro before, but it took me only a few moments to get the hang of this look-and-tap interaction. Double-tapping your fingertips to zoom in on a Lightroom photo, for instance, is the obvious analog of double-clicking a mouse or double-tapping a trackpad.

Dragging was also easy to pick up: look at one spot, tap your fingers together, move your hand sideways, up or down, then spread your fingers apart again. Tapping fingers together on both hands and then moving my hands apart worked well for zooming in on a photo.
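For developers wondering why iPad apps carry over so readily, here’s a minimal SwiftUI sketch of the idea. It isn’t Adobe’s code, and the ExposureControl view and its exposure value are made up for illustration; it simply shows that a control written for iPadOS needs no special handling on visionOS, because the system resolves your gaze to the control you’re looking at and delivers a thumb-and-index pinch as the same tap or drag a touch screen would send.

    import SwiftUI

    // A minimal sketch, not Adobe's code: the ExposureControl view and its
    // exposure value are hypothetical. A standard SwiftUI control written for
    // iPadOS runs on visionOS unchanged; the system turns gaze plus a
    // thumb-and-index pinch into the taps and drags a touch screen would send.
    struct ExposureControl: View {
        @State private var exposure = 0.0   // hypothetical exposure offset, in stops
        @State private var zoomedIn = false

        var body: some View {
            VStack(spacing: 12) {
                Text("Exposure: \(exposure, specifier: "%+.2f") EV")
                    // A double pinch arrives as a double tap, the analog of
                    // double-clicking a mouse or double-tapping a trackpad.
                    .onTapGesture(count: 2) { zoomedIn.toggle() }

                // Look at the slider, pinch, and move your hand to drag it,
                // just as you would touch and drag the same slider on an iPad.
                Slider(value: $exposure, in: -5...5)
            }
            .padding()
        }
    }

The same pattern applies to the rest of the app’s controls: a glance becomes the target and a pinch becomes the tap, so the app code largely doesn’t have to care which kind of input it received.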

Some drawbacks

I was able to be productive with Lightroom on the Apple Vision Pro quickly, but it wasn’t perfect.

I ran into some issues with eye-tracking accuracy. Sometimes the headset couldn’t tell which control I was looking at, but I hope that will improve with better calibration and with Apple’s hardware and software updates.

I also sometimes found myself locating the control I wanted to use, then looking away before I tapped my fingers. I think that’s because on computers, I’m used to aiming my mouse and then looking elsewhere as I click. So I had to learn to move at a more methodical pace.

A larger problem from my perspective is that I use Lightroom Classic, the version of the editing and cataloging software that stores photos on my local hard drive and has a number of advanced features I enjoy. The Apple Vision Pro app is in the non-Classic Lightroom family, a more stripped-down version that stores photos in the cloud.

Others like me could still use Lightroom Classic on a Mac and then use the Vision Pro as a large virtual monitor, although the interface might not be as polished.

Also, for editing photos on that airplane flight, internet access would be an issue for cloud-based Lightroom. However, you can get Lightroom to download a group of photos ahead of time, so as long as you planned ahead, you’d probably be OK.

And lastly, some features don’t work, like merging different shots into a single HDR photo. And if you want to take advantage of the Vision Pro’s ability to view a panoramic photo in all its wrap-around glory, you’ll have to export it to Apple Photos. That’s easy to do, but I’d prefer a more immersive option in the Lightroom app itself.

Final Thoughts

Adobe also released three other apps for Apple’s Vision Pro. Its Firefly app lets you create imagery with Adobe’s generative AI tool (though that won’t work without an internet connection). Fresco is a version of its sketching app. And the Behance app lets you use the online portfolio tool with a slight social networking flavor.

Adobe has plenty of other apps, such as Photoshop, Illustrator, and Express, but it started with these four because of how well they fit Apple’s headset, according to Adobe’s vice president of design, Eric Snowden.

Software like Photoshop or Illustrator requires precise control over both the interface and the creative work itself, often including typing numbers into dialog boxes.

“It’s something we would want to rethink,” Snowden said. “It’s not that it couldn’t work, but I think there’s a less direct translation.”

For me, Lightroom’s sliders and buttons were a natural fit for the Vision Pro. Maybe someday I’ll be wearing one to edit my photos on a cramped plane flight.
