Development Notes

What Worked

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident.

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sunt in culpa qui officia deserunt mollit anim id est laborum sed perspiciatis unde omnis iste natus.

Dead Ends

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sed ut perspiciatis unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam eaque ipsa quae ab illo inventore veritatis et quasi architecto.

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nemo enim ipsam voluptatem quia voluptas sit aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos qui ratione voluptatem sequi nesciunt.

Lorem ipsum dolor sit amet, consectetur adipiscing elit. At vero eos et accusamus et iusto odio dignissimos ducimus qui blanditiis praesentium voluptatum deleniti atque corrupti.

Limits of Apple's Camera System

Portrait Mode

Apple's native Portrait Mode bokeh is private API; it can't be accessed directly. What is publicly available:

  - per-pixel depth capture via AVDepthData, delivered alongside the photo by AVCapturePhotoOutput
  - the CIDepthBlurEffect Core Image filter, which renders a depth-driven blur from that data
  - Vision's person-segmentation requests, which can supplement the depth map with an ML subject mask

The result is good but visually different from Apple's portrait mode: less refined edge detection, and no ML-based subject segmentation unless Vision is added on top.
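Layering Vision on top could look roughly like this. A minimal sketch: `personMask(for:)` is an illustrative name, and VNGeneratePersonSegmentationRequest requires iOS 15+.

```swift
import Vision

// Sketch: generate an ML person mask to supplement the depth map.
// `cgImage` is assumed to be the captured photo.
func personMask(for cgImage: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate          // .fast or .balanced for live preview
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return request.results?.first?.pixelBuffer // 8-bit mask, person = high values
}
```

The returned mask can then be used to refine the blur's subject edges, which is exactly the ML segmentation the bare depth pipeline lacks.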

Depth of Field in the Pipeline

CIDepthBlurEffect slots in as a post-capture step after existing filters and before the crop. The main constraints: depth data must be requested before capture (it can't be reconstructed afterwards), it is only available on camera configurations that support depth delivery, and the depth map is far lower resolution than the photo, so subject edges are only as good as the upscaled map.

The tricky part is getting CIDepthBlurEffect tuned well — aperture, focal point, and the depth data format (disparity vs. depth) require experimentation.
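A sketch of the filter application. Caveat: CIDepthBlurEffect's parameter keys are undocumented; the key names below (`inputDisparityImage`, `inputAperture`) are assumptions that may vary between iOS versions and should be verified against `filter.inputKeys` at runtime.

```swift
import CoreImage

// Sketch of applying CIDepthBlurEffect. The filter is created by name because
// it has no typed Swift wrapper; parameter keys are assumptions (see above).
func applyDepthBlur(image: CIImage, disparity: CIImage, aperture: Double) -> CIImage? {
    guard let filter = CIFilter(name: "CIDepthBlurEffect") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(disparity, forKey: "inputDisparityImage") // expects disparity, not depth
    filter.setValue(aperture, forKey: "inputAperture")        // simulated f-number
    return filter.outputImage
}
```

If the captured AVDepthData arrives as depth rather than disparity, `converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)` handles the conversion before this step.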

Implementation

The main steps would be:

  1. Enable photoOutput.isDepthDataDeliveryEnabled in setup
  2. Set photoSettings.isDepthDataDeliveryEnabled = true at capture time
  3. Extract AVDepthData from the captured photo in photoOutput(_:didFinishProcessingPhoto:)
  4. Apply CIDepthBlurEffect in the processing pipeline
  5. Add a toggle to settings
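Steps 1–3 above can be sketched as follows, assuming a standard AVCapturePhotoOutput session; class and method names are illustrative.

```swift
import AVFoundation

// Sketch: enable depth delivery in setup, request it per capture,
// then read AVDepthData off the delivered photo.
final class DepthCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        // Step 1 — only valid if the current camera supports depth delivery.
        if photoOutput.isDepthDataDeliverySupported {
            photoOutput.isDepthDataDeliveryEnabled = true
        }
    }

    func capture() {
        // Step 2 — request depth for this specific capture.
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // Step 3 — extract the depth map from the delivered photo.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depth = photo.depthData else { return }
        // Convert to disparity for CIDepthBlurEffect (step 4).
        let disparity = depth.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
        _ = disparity // hand off to the processing pipeline here
    }
}
```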

The iPhone 15 Pro Camera System

The iPhone 15 Pro has three physical lenses plus one virtual one:

  - 0.5× ultra-wide: 13mm, f/2.2, 12MP
  - 1× main: 24mm, f/1.78, 48MP quad-pixel sensor
  - 3× telephoto: 77mm, f/2.8, 12MP

The fourth entry in Apple's spec sheet, a "2× telephoto" at 48mm, is not a separate lens. It is a crop mode of the 48MP main sensor: instead of binning the full sensor 2×2 down to 12MP (the normal 1× readout on what Apple calls the quad-pixel sensor), iOS reads the central 12MP of the sensor one-to-one, delivering a lossless 2× image with no interpolation and no lens switch.

This mode is used transparently by AVFoundation. When the zoom factor is set to 2×, the system switches the main camera to this crop mode without any visible transition. The app never sees it as a lens change — there is no bridging artifact at 2×, unlike the true lens crossovers at 0.87× and ~3×.

The practical implication: a discrete 2× preset would be essentially free in image quality — no interpolation, no lens switch, the same aperture as 1×.
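A 2× preset would then just be a zoom-factor change on the capture device; AVFoundation selects the sensor-crop mode internally. One caveat worth flagging as an assumption: on a virtual multi-camera device, zoom factor 1.0 corresponds to the widest module, so the UI's "2×" is not necessarily `videoZoomFactor = 2.0` (on builtInTripleCamera, where the ultra-wide is 1.0, it would be 4.0). Verify the mapping against `virtualDeviceSwitchOverVideoZoomFactors` for the device actually in use.

```swift
import AVFoundation

// Sketch: jump to the UI's 2× on a triple-camera virtual device.
// The factor below assumes builtInTripleCamera (ultra-wide = 1.0, main = 2.0).
func setTwoTimesPreset(on device: AVCaptureDevice) throws {
    let uiTwoTimes: CGFloat = 4.0 // assumption — 2× relative to the main camera
    try device.lockForConfiguration()
    device.videoZoomFactor = uiTwoTimes // instant jump; ramp(toVideoZoomFactor:withRate:) would animate
    device.unlockForConfiguration()
}
```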

Saving and Deleting Photos

Both are possible, but only within limits set by iOS.

Saving photos — use PHPhotoLibrary with addOnly or readWrite access. You can request permission once via PHPhotoLibrary.requestAuthorization. After that, your app can save photos to the camera roll without further prompts.
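A minimal sketch of the save path, using add-only authorization (iOS 14+); `save(_:)` is an illustrative name.

```swift
import Photos
import UIKit

// Sketch: request add-only access once, then save without further prompts.
func save(_ image: UIImage) {
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        guard status == .authorized else { return } // user declined or restricted
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }, completionHandler: { success, error in
            if !success { print("Save failed: \(String(describing: error))") }
        })
    }
}
```

Add-only access is the better request here: the system prompt is less alarming than full library access, and saving is all this path needs.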

Deleting photos — deleting always requires user confirmation via a system dialog. Even with full readWrite permission, you must call:

PHPhotoLibrary.shared().performChanges({
    PHAssetChangeRequest.deleteAssets([asset] as NSArray)
}, completionHandler: { success, error in
    // success is false if the user cancels the system dialog
})

iOS will show a confirmation popup every time.

No way around it — there is no permission you can request that allows silent deletion of photos from the user's library. This is enforced by iOS privacy rules.

Workarounds — store temporary images in your app's sandbox instead of the camera roll. Or save to a custom album and let users manage deletion manually (still prompts if you delete).

Summary: save silently after permission → yes. Delete silently → not possible on iOS.

Photos App Delay After Capture

After taking a photo, there is always a 2–3 second delay before it appears at the top of the camera roll in the Photos app. This is a system-level limitation with no workaround.

The delay comes from the photolibraryd background daemon, which processes every incoming photo before Photos will show it: indexing, thumbnail generation at multiple resolutions, face and scene recognition, and metadata writing. Until this pipeline finishes, the photo exists in the library but the Photos app won't display it.

Format doesn't help much — switching from HEIC to JPEG would skip the codec overhead but not the indexing pipeline. The difference is under a second and imperceptible in practice, while the trade-off is larger file sizes (2–3×) and loss of HDR/wide color information.

What we do instead — Phase keeps a live in-memory copy of the last captured image and makes it immediately available via the last-image viewer, bypassing the Photos library entirely for that use case.
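A hypothetical sketch of that cache; the type and property names are illustrative, not Phase's actual implementation.

```swift
import UIKit

// Sketch: hold the last captured image in memory so the last-image viewer
// never has to wait on photolibraryd's ingest pipeline.
final class LastImageStore {
    static let shared = LastImageStore()
    private let queue = DispatchQueue(label: "last-image-store")
    private var image: UIImage?

    func store(_ newImage: UIImage) {
        queue.async { self.image = newImage } // called right after capture
    }

    func latest() -> UIImage? {
        queue.sync { image }                  // read by the last-image viewer
    }
}
```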