Building a Photo Gallery app in SwiftUI Part 1: Memory Management using PhotoKit

– Written by Iñaki Narciso, 20 Sept 2022

Table of Contents

  1. Introduction
  2. Part 1: Memory Management using PhotoKit
  3. Part 2: Memory Management practices for a photo-grid UI
  4. Part 3: Creating the photo gallery app (coming Oct 4)
  5. Part 4: Adding Multiple Gestures (coming Oct 11)

Hello, and welcome back! This is part 1 of 4 of the Building a Photo Gallery app in SwiftUI series.

In this post, we’re going to talk about memory management. Memory is a very important resource, as it is shared by every running app, including the system itself. Photos can consume memory quickly if you’re not careful, and loading an entire library all at once can exhaust it. Crashes and sluggish performance are among the most common symptoms of poor memory management.

The Anatomy of a Digital Photo

The memory consumption of a single photo depends on the following:

  1. Resolution – the total number of pixels required to show the photo
  2. Compression – commonly determined by file type, compression determines how many bytes are needed to store the image given its colour space and opacity, and the strategy used to reduce that size. The most common compression types are JPEG (lossy compression) and PNG (lossless compression). I won’t discuss the differences between the two in greater detail, but image compression is one of the factors that determine the total size of an image in memory.

Suppose we capture a single photo from a modern 12MP camera (such as the iPhone 13 camera). The resulting photo will have a resolution of 4290 x 2800, which alone means roughly 12 million pixels are needed to fully render the photo. Assume we apply no compression to reduce the size, but support alpha transparency (ARGB): we’ll need 8 bits for the alpha transparency, 8 bits for the red colour information, 8 bits for the green, and 8 bits for the blue. The colour and transparency information together would need a total of 32 bits (8 bits for each ARGB channel) per pixel to fully render its colour.

So to render the fully coloured photo, we need 12 million pixels multiplied by 32 bits of colour per pixel = 384Mbit (megabits), which is around 48MB (megabytes; 8 bits = 1 byte). Theoretically, a modern photo stored without compression would cost 48MB each. Ten of these photos would cost 480MB. A hundred would cost 4.8GB. You get the idea.

Assuming that an average iPhone user has around 25,000 photos in the photo library, loading all of them uncompressed at the resolution above would theoretically require 1.2TB of memory. No modern iPhone can handle that if we choose to load it all at once.
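The arithmetic above can be sketched in a few lines, using the same figures as the worked example (uncompressed 32-bit ARGB):

```swift
// Back-of-the-envelope memory cost of uncompressed ARGB photos.
let width = 4_290
let height = 2_800
let bytesPerPixel = 4 // 32 bits of ARGB = 4 bytes

let bytesPerPhoto = width * height * bytesPerPixel
print(bytesPerPhoto)                // 48_048_000 bytes, i.e. ~48MB

let librarySize = 25_000            // photos in an average library
print(bytesPerPhoto * librarySize)  // ~1.2TB for the whole library
```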

In reality, photos with modern compression formats such as JPEG or PNG cost far less, but loading an entire library is still prohibitively expensive in memory even at those smaller compressed sizes.

Managing Memory in a Photo Gallery App

  1. Load Photos on Demand – Store photos in memory only when necessary. In a photo gallery app, only a small number of photos are visible at a time, even when the user’s photo library is large. As the user scrolls through the gallery, photos that leave the visible area should be freed from memory, and new photos should be loaded only as they are about to be shown.

  2. Avoid Referencing Photos Directly – Since photos consume memory very quickly, we should avoid holding direct references to them in our code, and reference them by ID or by URL instead. Holding direct references can lead to duplicates in memory, and we don’t want a single photo to exist multiple times there. Since we acquire photos from the user’s library using PhotoKit, we can reference them indirectly by asset ID. When we need to show a photo on screen, we tell PhotoKit to fetch and cache it, and let the framework handle the memory management for us.

  3. Avoid Duplication of Photos in Memory – When passing photos from one screen to another, we should not pass a copy of the photo itself. Pass a reference instead: an asset ID (if using PhotoKit) or a URL (if fetching from a cache or from the network).

  4. Make Effective Use of Caching – Fetching photos from the library or from the network takes time. To speed up loading the next time you need to show a photo, you should cache it. In PhotoKit, this is handled by the photo caching manager: when you tell PhotoKit to fetch a photo for the first time, the caching manager caches it automatically, so it takes less time to load the next time you need it. But beware! This cache is in-memory and volatile – it only persists the photos while the app is running. If the app is closed or terminated, the photos in the cache will be lost. Caching photos fetched from the network is another story, as there you have the flexibility to adopt your own disk-backed caching strategy.

Using PhotoKit

PhotoKit is an Apple framework that allows our apps to work with photo and video assets managed by the Photos app. This also includes photos and videos managed by iCloud Photos and Live Photos, but we don’t need those in our photo gallery app.

To abstract all PhotoKit code, I created a service that will do the following for us:

  1. Request Access to the Photo Library – In order to fetch photos from the photo library, the user needs to grant this access privilege to the app.
  2. Fetch Photos from the Library – Once access is granted, we can fetch photos that we can show in the app. Fetching would include setting limitations (such as excluding hidden photos for privacy reasons) and setting the sort order (such as showing the most recent photos first).
  3. Caching Fetched Photos – All fetched photos should be cached so they take less time to load the next time we need to show them to the user.

Requesting Access to the User’s Photo Library

To request access, we first need to store the current permission granted by the user. We can ask PhotoKit about the exact permission later, but for now, let’s start with .notDetermined.

import Foundation
import Photos

class PhotoLibraryService: ObservableObject {
    ...
    /// The permission status granted by the user.
    /// This property determines whether we need to request
    /// library access or not.
    var authorizationStatus: PHAuthorizationStatus = .notDetermined
    ...
}

Next, add the function that will present the request to access the user’s photo library. The function takes an optional error closure parameter so the UI can show an error when the user decides not to grant photo library access.

class PhotoLibraryService: ObservableObject {
    ...
    func requestAuthorization(
        handleError: ((AuthorizationError?) -> Void)? = nil
    ) {
        /// This is the code that does the permission request.
        /// requestAuthorization(for:) is the current (iOS 14+) API;
        /// .readWrite asks for full library access.
        PHPhotoLibrary.requestAuthorization(for: .readWrite) { [weak self] status in
            self?.authorizationStatus = status
            /// We can determine the permission granted by the status
            switch status {
            /// Fetch all photos if the user granted us access.
            /// This won't fetch the photos themselves, only their
            /// references.
            case .authorized, .limited:
                self?.fetchAllPhotos()

            /// For a denied response, we should show an error
            case .denied, .notDetermined, .restricted:
                handleError?(.restrictedAccess)

            @unknown default:
                break
            }
        }
    }
}

Using the authorizationStatus property, we can determine whether the app needs to request access to the photo library, and the requestAuthorization() function performs the actual permission request itself.
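Note that AuthorizationError is not a PhotoKit type – it’s a small custom error of our own. A minimal sketch might look like this (the case name matches the one used above; the description string is an assumption):

```swift
import Foundation

/// Custom error type (not part of PhotoKit) describing why
/// photo library access could not be granted.
enum AuthorizationError: LocalizedError {
    /// Access was denied, restricted, or otherwise unavailable.
    case restrictedAccess

    var errorDescription: String? {
        switch self {
        case .restrictedAccess:
            return "Access to the photo library is denied or restricted."
        }
    }
}
```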

Fetching Photos from the Library

Now that we have a way to ask for user permission to access the photo library, the next thing that we need to do is to find a way to fetch photo references from the photo library.

Let’s begin by adding a variable to store our fetch results.

class PhotoLibraryService: ObservableObject {
    ...
    /// https://stackoverflow.com/a/69755543
    /// A collection that adds subscript support to
    /// PHFetchResult<PHAsset>.
    ///
    /// The results property will store all of the photo asset ids
    /// that we requested, and will be used by our views to request
    /// a copy of the photo itself.
    ///
    /// We don't want to store a copy of the actual photo as it would
    /// cost too much memory, especially if we show the photos in a
    /// grid.
    @Published var results = PHFetchResultCollection(
        fetchResult: .init()
    )
    ...
}

PHFetchResultCollection is a collection wrapper for PHFetchResult<PHAsset> but with support for subscript access. We’ll be using subscript access later on when we need to create the grid UI.
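For reference, the linked Stack Overflow answer achieves this by wrapping the fetch result in a RandomAccessCollection. A sketch of that wrapper looks like the following (treat it as an approximation of the linked answer, not a verbatim copy):

```swift
import Photos

/// A wrapper that gives PHFetchResult<PHAsset> Collection
/// semantics, including subscript access, so SwiftUI grids
/// can iterate over it directly.
struct PHFetchResultCollection: RandomAccessCollection, Equatable {
    typealias Element = PHAsset
    typealias Index = Int

    var fetchResult: PHFetchResult<PHAsset>

    var startIndex: Int { 0 }
    var endIndex: Int { fetchResult.count }

    subscript(position: Int) -> PHAsset {
        fetchResult.object(at: position)
    }
}
```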

We’ll also need a PHCachingImageManager to do the actual photo fetching and caching for us. Let’s add it next to our list of property declarations:

class PhotoLibraryService: ObservableObject {
    ...
    /// The manager that will fetch and cache photos for us
    var imageCachingManager = PHCachingImageManager()
    ...
}

We now have a property to store our fetch results and a manager that will fetch and cache photos for us. Let’s proceed by writing the function that will tell the manager to fetch all photo references from the library, and store them in our results property.

class PhotoLibraryService: ObservableObject {
    ...
    /// Function that will tell the image caching manager to fetch
    /// all photos from the user's photo library. We don't want to
    /// include hidden assets for obvious privacy reasons.
    ///
    /// We also need to sort the photos being fetched by the most
    /// recent first, mimicking the behaviour of the Recents album
    /// from the Photos app.
    private func fetchAllPhotos() {
        imageCachingManager.allowsCachingHighQualityImages = false
        let fetchOptions = PHFetchOptions()
        fetchOptions.includeHiddenAssets = false
        fetchOptions.sortDescriptors = [
            NSSortDescriptor(key: "creationDate", ascending: false)
        ]
        DispatchQueue.main.async {
            self.results.fetchResult = PHAsset.fetchAssets(with: .image, options: fetchOptions)
        }
    }
    ...
}

The function above only fetches photo references, not the actual photos themselves. That is one of the memory management mechanisms of PhotoKit: it loads a photo’s image data only when requested. To do the actual loading of an image using its asset ID, we need to write another function that accepts an image asset ID and returns an image type. Since the photo gallery app will be written in 100% SwiftUI, we’ll be using SwiftUI Image to render the photos in the UI, and it’s a good thing that PhotoKit returns a UIImage instance, which SwiftUI Image can wrap directly via Image(uiImage:).

import Foundation
import Photos
import UIKit // Don't forget to add this

class PhotoLibraryService: ObservableObject {
    /// Requests an image copy given a photo asset id.
    ///
    /// The image caching manager performs the fetching, and will 
    /// cache the photo fetched for later use. Please know that the 
    /// cache is temporary – all photos cached will be lost when the
    /// app is terminated.
    func fetchImage(
        byLocalIdentifier localId: PHAssetLocalIdentifier,
        targetSize: CGSize = PHImageManagerMaximumSize,
        contentMode: PHImageContentMode = .default
    ) async throws -> UIImage? {
        let results = PHAsset.fetchAssets(
            withLocalIdentifiers: [localId],
            options: nil
        )
        guard let asset = results.firstObject else {
            throw QueryError.phAssetNotFound
        }
        let options = PHImageRequestOptions()
        options.deliveryMode = .opportunistic
        options.resizeMode = .fast
        options.isNetworkAccessAllowed = true
        /// isSynchronous must be true here: with opportunistic
        /// delivery, an asynchronous request may call the result
        /// handler more than once (a degraded image first, then the
        /// full-quality one), which would resume the continuation
        /// twice and crash. A synchronous request calls the handler
        /// exactly once.
        options.isSynchronous = true
        return try await withCheckedThrowingContinuation { [weak self] continuation in
            /// Use the imageCachingManager to fetch the image
            self?.imageCachingManager.requestImage(
                for: asset,
                targetSize: targetSize,
                contentMode: contentMode,
                options: options,
                resultHandler: { image, info in
                    /// image is of type UIImage
                    if let error = info?[PHImageErrorKey] as? Error {
                        continuation.resume(throwing: error)
                        return
                    }
                    continuation.resume(returning: image)
                }
            )
        }
    }
}
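Two more helper types appear in fetchImage that aren’t part of PhotoKit. PHAsset local identifiers are plain String values, and QueryError is another small custom error; minimal definitions might be:

```swift
/// Asset local identifiers are plain strings; the alias only
/// exists to make the fetchImage signature read more clearly.
typealias PHAssetLocalIdentifier = String

/// Custom error thrown when no asset matches the given identifier.
enum QueryError: Error {
    case phAssetNotFound
}
```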

The great thing about using the PHCachingImageManager instance is that it caches the photo the first time it is fetched, so loading the same photo again later is a whole lot faster! We also don’t need to worry about caching strategy or the cache’s memory management, since PhotoKit handles it all for us.
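As a usage sketch, a view or view model could call fetchImage from an async context like this (the function name and target size here are hypothetical):

```swift
import SwiftUI

/// Hypothetical call site: load a thumbnail for one asset id
/// and wrap the resulting UIImage in a SwiftUI Image.
func loadThumbnail(
    for assetId: String,
    using service: PhotoLibraryService
) async -> Image? {
    do {
        let uiImage = try await service.fetchImage(
            byLocalIdentifier: assetId,
            targetSize: CGSize(width: 300, height: 300),
            contentMode: .aspectFill
        )
        return uiImage.map(Image.init(uiImage:))
    } catch {
        print("Failed to fetch image: \(error)")
        return nil
    }
}
```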

We now have a PhotoLibraryService class that we can readily use. Let’s add it to our App file as an environmentObject dependency, so we can easily access it from our views.

import SwiftUI

@main
struct PhotoGalleryApp: App {
    let photoLibraryService = PhotoLibraryService()

    var body: some Scene {
        WindowGroup {
            PhotoLibraryView() // This is what we'll do next
                .environmentObject(photoLibraryService)
        }
    }
}

In this post, you have learned the basic memory management requirements of a photo gallery app, and how to use PhotoKit to handle the fetching, caching, and memory management work for us.

In the next post, we’re going to continue our discussion about memory management from the perspective of the user interface, and we’ll also start building the UI for the photo grid. That’s all for now. I’ll see you at the next one!
