SAN FRANCISCO, June 18, 2025
Adobe’s New Camera App: A Pixel Legacy
Adobe unveils “Project Indigo,” a new camera app that harnesses the expertise behind the Pixel camera, though it’s currently exclusive to iPhones.
- Adobe’s “Project Indigo” camera app offers advanced computational photography features.
- The app captures more frames and utilizes local tone mapping for improved HDR.
- It features pro controls and a “Removing Reflections” function.
Adobe has launched “Project Indigo,” a new camera app. This app, developed by the same engineers who built the Pixel camera, offers a suite of computational photography features. Project Indigo promises to deliver enhanced image quality and a more professional photography experience.
The minds behind the Pixel camera, Marc Levoy and Florian Kainz, formerly of Google, have brought their expertise to Adobe. They were instrumental in developing the computational photography technology found in Pixel phones from 2014 through 2020. Kainz, in particular, was responsible for the remarkable nighttime photography capabilities of Pixel devices.
Levoy and Kainz, now at Adobe, are showcasing their work with “Project Indigo.” In a post, Marc Levoy says his team, called “Nextcam,” launched the app after five years of development. Levoy describes it as a “computational photography camera app” that “offers a natural SLR-like look, full manual controls, the highest possible image quality, and new photographic experiences, including on-device removal of window reflections.”
Computational Photography and Enhanced Features
The “Project Indigo” app captures and combines more frames than a typical camera app. It also underexposes each shot to avoid blown-out highlights, while the multi-frame merge keeps noise in the shadows low. The app can capture up to 32 frames.
Frame Averaging: By capturing multiple frames, “Project Indigo” reduces noise and increases dynamic range, similar to techniques used in astrophotography.
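As a rough illustration of the principle (a generic sketch, not Adobe’s actual pipeline), the Python snippet below averages a simulated 32-frame burst of already-aligned frames with NumPy. With independent noise, averaging N frames cuts the noise standard deviation by roughly a factor of √N, which is why a deep burst can look far cleaner than a single exposure.

```python
import numpy as np

def frame_average(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of pre-aligned frames to reduce noise."""
    stack = np.stack([f.astype(np.float64) for f in frames], axis=0)
    return stack.mean(axis=0)

# Simulated example: one clean scene, 32 noisy captures of it.
rng = np.random.default_rng(0)
scene = rng.uniform(0.2, 0.8, size=(480, 640))              # "ground truth"
burst = [scene + rng.normal(0, 0.05, scene.shape) for _ in range(32)]

print(f"single-frame noise:    {np.std(burst[0] - scene):.4f}")
print(f"32-frame merged noise: {np.std(frame_average(burst) - scene):.4f}")
# The merged result is roughly sqrt(32) ≈ 5.7x less noisy.
```

A real pipeline also has to align frames (to handle hand shake and subject motion) and merge robustly, but this averaging step is where the noise and dynamic-range gains come from.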
Additionally, the app has “local tone mapping” for better HDR and is designed to be compatible with Adobe Camera Raw and Lightroom. It offers “Photo” and “Night” modes. “Night” mode requires a tripod.
Local Tone Mapping: This technique adjusts the contrast and brightness of different regions of an image independently, resulting in a more balanced and detailed final photo.
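As a concrete, hand-rolled sketch of the concept (not Indigo’s actual algorithm), the snippet below splits log luminance into a blurred “base” layer and a “detail” layer, compresses only the base, and recombines them. It assumes a single-channel linear luminance image and uses NumPy and SciPy; a production tone mapper would use an edge-aware filter (bilateral or guided) rather than a plain Gaussian to avoid halos.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_tone_map(lum: np.ndarray, compression: float = 0.6,
                   sigma: float = 25.0) -> np.ndarray:
    """Toy local tone mapping for a linear luminance image (values > 0)."""
    log_lum = np.log2(np.maximum(lum, 1e-6))
    base = gaussian_filter(log_lum, sigma=sigma)   # large-scale brightness
    detail = log_lum - base                        # local texture and contrast
    compressed = compression * base + detail       # squeeze only the base layer
    out = np.power(2.0, compressed)
    return np.clip(out / out.max(), 0.0, 1.0)

# Example: a scene spanning a wide brightness range keeps its fine texture
# after compression, instead of crushing shadows or clipping highlights.
rng = np.random.default_rng(1)
hdr = np.linspace(0.01, 8.0, 512)[None, :] * np.ones((256, 1))
hdr *= 1.0 + 0.1 * rng.standard_normal(hdr.shape)
ldr = local_tone_map(hdr)
print(ldr.min(), ldr.max())
```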
The app’s “multi-frame super-resolution” technology enhances detail when zooming in, ensuring that the added detail is authentic. “Indigo” provides professional controls for focus, shutter speed, ISO, exposure, and white balance. Users can also manage the number of frames taken in a burst.
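For intuition, one classic recipe for multi-frame super-resolution is “shift-and-add”: estimate each frame’s sub-pixel offset from a reference, upsample, re-align, and average. The sketch below is a generic textbook version of that idea (not Indigo’s implementation), assuming grayscale frames and using scikit-image and SciPy for registration and resampling.

```python
import numpy as np
from scipy.ndimage import shift as subpixel_shift, zoom
from skimage.registration import phase_cross_correlation

def shift_and_add_sr(frames: list[np.ndarray], scale: int = 2) -> np.ndarray:
    """Generic shift-and-add multi-frame super-resolution (grayscale)."""
    ref = frames[0]
    accum = np.zeros((ref.shape[0] * scale, ref.shape[1] * scale))
    for frame in frames:
        # Sub-pixel offset needed to register this frame to the reference.
        offset, _, _ = phase_cross_correlation(ref, frame, upsample_factor=20)
        up = zoom(frame, scale, order=3)               # resample to the target grid
        accum += subpixel_shift(up, offset * scale)    # undo the offset, accumulate
    return accum / len(frames)

# Tiny demo: synthesize a hand-shake burst by shifting one frame by sub-pixel amounts.
rng = np.random.default_rng(2)
base = rng.random((64, 64))
burst = [subpixel_shift(base, rng.uniform(-1.0, 1.0, size=2)) for _ in range(8)]
print(shift_and_add_sr(burst).shape)   # (128, 128)
```

Because each handheld frame samples the scene at slightly different sub-pixel positions, the merged grid gains genuinely new information rather than interpolated guesses, which is what keeps the added detail authentic.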

The app also includes a “Removing Reflections” feature that can eliminate reflections from glass and windows after an image is taken.
Reflection Removal: This feature uses computational techniques to identify and remove unwanted reflections, resulting in clearer and more professional-looking photos.
Availability and Future Plans
Currently, “Project Indigo” is an experimental camera app available on the App Store for the iPhone 14 and newer (or the iPhone 12 Pro and newer for Pro models). An Android version is planned.
The Future of Mobile Photography: Beyond “Project Indigo”
As “Project Indigo” enters the scene, it’s natural to ponder the bigger picture: what’s *next* for mobile photography? Adobe’s new camera app has already made waves with its cutting-edge computational photography features, a testament to the expertise of former Pixel engineers. Beyond the initial launch, there’s a whole world of innovation on the horizon, shaping how we capture, edit, and share photos.
The Role of Computational Photography
Computational photography is central to “Project Indigo,” as it is to other innovations in the space. These techniques combine hardware and software to enhance images, in contrast to traditional photography, which depended heavily on the lens and sensor alone. The progress of computational photography has enabled remarkable advancements in mobile imaging, and Adobe is poised to continue its contribution.
In a world of ever-evolving technology, the mobile camera is expected to keep advancing. We’re likely going to see improved image processing, better low-light performance, and more elegant tools for editing and manipulating shots. It’s logical to anticipate a deeper integration of computational techniques, similar to the “Removing Reflections” effect, to address more common photographic challenges. For example, advanced noise reduction, improved dynamic range, and even automatic object removal could become standard features.
What to Expect Next
Expectations for the future of mobile camera apps are high. The success of “Project Indigo” and its feature set will likely serve as a blueprint for future app development. Here’s a peek into what we might see:
- Enhanced AI Integration: AI algorithms will play a more notable role in image processing, with the goal of more realistic results. Imagine AI-powered scene recognition delivering optimal image settings automatically.
- Improved Video Capabilities: Mobile cameras already record impressive video. We can anticipate improved stabilization, higher-resolution capture, and advanced editing tools integrated directly into camera apps.
- Augmented Reality (AR) Integration: Expect a further blending of photography and AR. “Project Indigo” could eventually add AR features that give additional context to a photo’s story when it’s shared.
- Cross-Platform Consistency: While “Project Indigo” starts on iOS, the goal is cross-platform functionality. We might see more apps that provide consistent features and user experiences across iOS, Android, and possibly other platforms.