In recent years, iOS apps have become more and more visually appealing. Displaying images is a key part of that, which is why most apps use images that need to be downloaded and rendered. Most developers have faced the need to populate table views or collection views with images. Downloading images is resource consuming (cellular data, battery, CPU, etc.); so, in order to minimize this, the caching model was developed.
To achieve a great user experience, it's important to understand what is going on under the iOS hood when we cache and load images.
Also, benchmarks of the most widely used open source image caching libraries can be of great help when choosing your solution.
2. Classical approach
download the images asynchronously
process images (scale, remove red eyes, remove borders, …) so they are ready to be displayed
write them on the flash drive (internal storage unit)
read from flash drive and display them when needed
// Assuming we have an NSURL *imageUrl and UIImageView *imageView, we need to load the image from the URL and display it in the imageView.
if ([self hasImageDataForURL:imageUrl]) {
    NSData *imageData = [self imageDataForUrl:imageUrl];
    UIImage *image = [UIImage imageWithData:imageData];
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = image;
    });
} else {
    [self downloadImageFromURL:imageUrl withCompletion:^(NSData *imageData, …)
    {
        [self storeImageData:imageData …];
        UIImage *image = [UIImage imageWithData:imageData];
        dispatch_async(dispatch_get_main_queue(), ^{
            imageView.image = image;
        });
    }];
}
FPS simple math
60 FPS is the ideal for any UI update, so the experience is flawless
60 FPS => 1000 ms / 60 ≈ 16.7 ms per frame. This means that if any main-queue operation takes longer than 16.7 ms, the scrolling FPS will drop, since the CPU will be busy doing something other than rendering the UI.
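To check whether a given piece of main-queue work fits into that budget, you can simply time it. A minimal sketch (here timing `UIImage` creation from an assumed `NSData *imageData`, but any main-thread operation applies):

```objc
#import <QuartzCore/QuartzCore.h> // for CACurrentMediaTime()

// Sketch: measure a main-thread operation against the 60 FPS frame budget.
CFTimeInterval start = CACurrentMediaTime();
UIImage *image = [UIImage imageWithData:imageData]; // any main-thread work
CFTimeInterval elapsedMs = (CACurrentMediaTime() - start) * 1000.0;
if (elapsedMs > 1000.0 / 60.0) { // ~16.7 ms budget per frame
    NSLog(@"Over frame budget: %.1f ms - dropped frames likely", elapsedMs);
}
```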
3. Downsides of the classical variant
loading images or any other file from the flash drive is expensive (flash drive access is significantly slower than accessing RAM)
creating the UIImage instance maps a compressed version of the image into a memory section. The compressed data is small but cannot be rendered directly; if it comes from the flash drive, it may not even be read into memory yet. Decompressing an image is also expensive.
setting the image property of the imageView in this case will create a CATransaction that will be committed on the run loop. On the next run loop iteration, the CATransaction involves (depending on the images) creating a copy of any images which have been set as layer contents. Copying images includes:
allocating buffers for file IO and decompression
reading flash drive data into memory
decompressing the image data (the raw bitmap is the result) - high CPU usage
CoreAnimation uses the decompressed data and renders it
improperly byte-aligned images are copied by CoreAnimation so that their byte alignment is fixed and they can be rendered. This isn't stated in Apple's docs, but profiling apps with Instruments shows CA::Render::copy_image even when the Core Animation instrument shows no copied images
starting with iOS 7, the JPEG hardware decoder is no longer accessible to 3rd party apps. This means our apps are relying on a software decoder which is significantly slower. This was documented by the FastImageCache team on their Github page and also by Nick Lockwood on a Twitter post.
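Since the software decoder runs on the CPU, a common mitigation (used in various forms by caching libraries; this is only a sketch, not production code) is to force decompression on a background queue by drawing the image into a properly byte-aligned bitmap context, then handing the decompressed copy to the main queue:

```objc
#import <UIKit/UIKit.h>

// Sketch: force-decompress a UIImage by drawing it into a bitmap context,
// so the expensive decode happens here (off the main queue) instead of at
// render time. The context format (BGRA, premultiplied alpha) matches what
// CoreAnimation expects, avoiding an extra byte-alignment copy.
- (UIImage *)decompressedImageWithImage:(UIImage *)image {
    CGImageRef imageRef = image.CGImage;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 CGImageGetWidth(imageRef),
                                                 CGImageGetHeight(imageRef),
                                                 8,          // bits per component
                                                 0,          // bytes per row: let CG choose an aligned value
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(colorSpace);
    if (!context) return image; // fall back to the original on failure

    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef)),
                       imageRef); // this triggers the actual decompression
    CGImageRef decompressedRef = CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    UIImage *decompressed = [UIImage imageWithCGImage:decompressedRef
                                                scale:image.scale
                                          orientation:image.imageOrientation];
    CGImageRelease(decompressedRef);
    return decompressed;
}
```

Typical usage: call this from a background queue after download, and dispatch only the final `imageView.image = …` assignment to the main queue.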
4. A strong iOS image cache component must:
download images asynchronously, so the main queue is used as little as possible
decompress images on a background queue. This is far from trivial. See details: http://www.cocoanetics.com/2011/10/avoiding-image-decompression-sickness/
cache images in memory and on the flash drive. Caching on the flash drive is important because the app might be closed or need to purge memory under low-memory conditions. In this case, reloading the images from the flash drive is a lot faster than downloading them. Note: if you use NSCache for the memory cache, this class will purge all its contents when a memory warning is issued. Details about NSCache here: http://nshipster.com/nscache/
store the decompressed image on flash drive and in memory to avoid redoing the decompression
use GCD and blocks. This makes the code more performant and easier to read and write. Nowadays, GCD and blocks are a must for async operations
nice to have: category over UIImageView for trivial integration.
nice to have: ability to process the image after download and before storing it into the cache.
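The memory + flash drive layering can be sketched roughly as follows. This is a simplified illustration, not a real library's implementation; `SimpleImageCache` and `cachePathForKey:` are hypothetical names, and a real cache would hash the key rather than use it directly as a file name:

```objc
#import <UIKit/UIKit.h>

// Sketch of a two-level cache: NSCache in memory, files on the flash drive.
@interface SimpleImageCache : NSObject
@property (nonatomic, strong) NSCache *memoryCache;
@property (nonatomic, strong) dispatch_queue_t ioQueue;
@end

@implementation SimpleImageCache

- (instancetype)init {
    if ((self = [super init])) {
        _memoryCache = [NSCache new]; // purges its contents on memory warnings
        _ioQueue = dispatch_queue_create("com.example.imagecache.io", DISPATCH_QUEUE_SERIAL);
    }
    return self;
}

- (NSString *)cachePathForKey:(NSString *)key {
    // Hypothetical helper; real libraries hash the key into a safe file name.
    NSString *dir = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES).firstObject;
    return [dir stringByAppendingPathComponent:key];
}

- (void)storeImage:(UIImage *)image data:(NSData *)data forKey:(NSString *)key {
    [self.memoryCache setObject:image forKey:key]; // ideally the decompressed image
    dispatch_async(self.ioQueue, ^{               // flash drive write off the main queue
        [data writeToFile:[self cachePathForKey:key] atomically:YES];
    });
}

- (void)imageForKey:(NSString *)key completion:(void (^)(UIImage *image))completion {
    UIImage *cached = [self.memoryCache objectForKey:key];
    if (cached) { completion(cached); return; }    // memory hit: no IO at all
    dispatch_async(self.ioQueue, ^{                // flash drive hit: read + decode in background
        NSData *data = [NSData dataWithContentsOfFile:[self cachePathForKey:key]];
        UIImage *image = data ? [UIImage imageWithData:data] : nil;
        dispatch_async(dispatch_get_main_queue(), ^{ completion(image); });
    });
}

@end
```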
Advanced imaging on iOS
To find out more about imaging on iOS, how the SDK frameworks work (CoreGraphics, Image IO, CoreAnimation, CoreImage), CPU vs GPU and more, go through this great article by @rsebbe.
Is Core Data a good candidate?
Here is a benchmark of image caching using Core Data versus the file system. The results recommend the file system (as we are already accustomed to).
5. Benchmarks
Just looking at the concepts listed above makes it clear that writing such a component on your own is hard, time consuming and painful. That's why we turn to open source image caching solutions. Most of you have heard of SDWebImage or the new FastImageCache. In order to decide which one fits you best, I've benchmarked them and analyzed how they match our list of requirements.
Libraries tested:
SDWebImage
FastImageCache
AFNetworking
TMCache
Haneke
Note: AFNetworking was added to the comparison since, starting with iOS 7, it benefits from flash drive caching via NSURLCache.
Scenario
For each library, I made a clean install of the benchmark app, then started the app, scrolled slowly while all images were loading, then scrolled back and forth with different intensities (from slow to fast). I closed the app to force loading from the flash drive cache (where available), then ran the same scrolling scenario.
Benchmark app - project
the demo project source can be found on Github under the name ImageCachingBenchmark, together with the charts, collected data tables and more.
please note the project from Github had to be modified, as had the image caching libraries, so that we know the cache source of each image loaded. Because I didn't want to check in the Cocoapods source files (not a good practice) and because the project code must compile after a clean install of the Cocoapods, the current version of the Github project is slightly different from the one I used for the benchmarks.
if some of you want to rerun the benchmarks, you need to add a similar completion block for image loading to all the libraries, like the default one in SDWebImage that returns the SDImageCacheType.
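For reference, SDWebImage's completion block already reports where the image came from via `SDImageCacheType`. The exact method name and block signature vary between SDWebImage versions; this sketch follows the 3.x-era API:

```objc
// Sketch (SDWebImage 3.x-style API): the completion block exposes the cache source.
[imageView setImageWithURL:imageUrl
                 completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType) {
    // SDImageCacheTypeNone = downloaded,
    // SDImageCacheTypeMemory / SDImageCacheTypeDisk = served from cache
    NSLog(@"Image loaded with cache type: %d", (int)cacheType);
}];
```

A comparable block has to be wired into the other libraries so that all of them log the cache source the same way.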
Fastest vs slowest device results
Complete benchmark results can be found on the Github project. Since those tables are big, I decided to create charts using the fastest (iPhone 5s) and the slowest device data (iPhone 4).
iPhone 5s results
Note: disk ~ flash drive (device storage unit)
iPhone 4 results
Legend
async download = support for asynchronous downloads directly into the library
backgr decompr = image decompression executed on a background queue/thread
store decompr = images are stored in their decompressed version
memory/flash drive cache = support for memory/flash drive cache
UIImageView categ = category for UIImageView directly into the library
from memory/flash drive = top results for the average retrieve times from memory/flash drive cache
6. Conclusions
writing an iOS image caching component from scratch is hard
SDWebImage and AFNetworking are solid projects, with many contributors, that are maintained properly. FastImageCache is catching up pretty fast.
looking at all the data provided above, I think we can all agree SDWebImage is the best solution at this time, even if for some projects AFNetworking or FastImageCache might fit better. It all depends on the project's requirements.