Autocomplete in SwiftUI using async/await
With Swift 5.5 released, I want to offer a look at how the new Swift concurrency model can be used to build an autocomplete feature in SwiftUI.
Text autocomplete is a common feature that typically involves a database lookup or networking. These operations must be asynchronous, so they don't block user input, and can include an in-memory cache to speed up consecutive lookups. This makes it a perfect problem to battle-test the new Swift concurrency model.
Note: this problem is similar to search, and the code can be adapted with slight modifications.
Let’s say we have an app that shows information about a city. When the user types a city name in a text field, we want to offer autocomplete suggestions.
Here is the SwiftUI code of a view prototype, with a hardcoded list of suggestions.
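A minimal sketch could look like this, with the suggestions hardcoded for now (the CityAutocompleteView name and the exact layout are just placeholders):

```swift
import SwiftUI

struct CityAutocompleteView: View {
    @State private var input = ""

    // Hardcoded suggestions for the prototype.
    private let suggestions = ["Amstelveen", "Amsterdam", "Amsterdam-Zuidoost", "Amstetten"]

    var body: some View {
        VStack(alignment: .leading) {
            TextField("City", text: $input)
                .textFieldStyle(.roundedBorder)

            List(suggestions, id: \.self) { suggestion in
                Text(suggestion)
            }
        }
        .padding()
    }
}
```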
Creating the source of suggestions
Suggestions can come from a server or be bundled with the app. For simplicity, in the example we store suggestions as plain text, with each city name on its own line.
...
Amstelveen
Amsterdam
Amsterdam-Zuidoost
Amstetten
...
To load the file into memory we use the CitiesSource protocol and the CitiesFile object that implements it. You may choose not to declare a protocol and use the object directly, but I find that having a protocol creates a simple-to-understand abstraction that is also useful for unit testing.
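A sketch of these two types could look like the following; the loadCities method name and the cities.txt resource name are assumptions for illustration:

```swift
import Foundation

protocol CitiesSource {
    func loadCities() -> [String]
}

struct CitiesFile: CitiesSource {
    let location: URL

    init(location: URL) {
        self.location = location
    }

    /// Convenience initializer that looks up "cities.txt" in the main bundle.
    init?() {
        guard let location = Bundle.main.url(forResource: "cities", withExtension: "txt") else {
            return nil
        }
        self.init(location: location)
    }

    func loadCities() -> [String] {
        guard let text = try? String(contentsOf: location) else {
            return []
        }
        // One city name per line; drop empty lines.
        return text.components(separatedBy: .newlines).filter { !$0.isEmpty }
    }
}
```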
Caching
Next we need to build a cache. In our example, CitiesCache keeps the complete list of cities in memory. For a real app you should consider something smarter; here we focus on concurrency instead. A good cache should be thread-safe, and this is where the new Swift concurrency model comes to life.
CitiesCache is an actor. An actor protects its own data, ensuring that only a single thread accesses that data at a given time. Precisely what we need.
CitiesCache stores the list of cities in cachedCities, loaded lazily on first access to the computed cities property.
Cache lookup is a straightforward enumeration comparing prefixes. In the example we do a case- and diacritic-insensitive comparison, so characters like o and ó match.
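Putting it together, a sketch of the actor might look like this (the lookup method name and the exact comparison options are assumptions):

```swift
import Foundation

actor CitiesCache {
    let source: CitiesSource

    // Backing storage, populated on first access.
    private var cachedCities: [String]?

    init(source: CitiesSource) {
        self.source = source
    }

    // Loads the list lazily the first time it is accessed.
    var cities: [String] {
        if let cachedCities = cachedCities {
            return cachedCities
        }
        let cities = source.loadCities()
        cachedCities = cities
        return cities
    }

    // Case- and diacritic-insensitive prefix lookup.
    func lookup(prefix: String) -> [String] {
        cities.filter {
            $0.range(of: prefix,
                     options: [.caseInsensitive, .diacriticInsensitive, .anchored]) != nil
        }
    }
}
```

Inside the actor, cities and lookup are plain synchronous code; only callers outside the actor will need to await them.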
Notice one thing: so far we haven't written a single line of synchronization code. Actors allow only one task to access their state at a time, so we don't need to worry about data races.
Delay and cancellation
The pieces are almost ready to connect. One small autocomplete feature to consider is a slight delay between user input and the autocomplete routine, to limit the number of calls. This is especially useful if autocomplete makes heavy use of I/O, like database lookups or network requests.
The AutocompleteObject object implements autocomplete and notifies SwiftUI using a @Published var suggestions: [String] property. To execute autocomplete asynchronously we use Task, new in the Swift standard library. A Task can execute concurrent routines and supports cancellation.
You can also notice that AutocompleteObject uses @MainActor to always execute its code on the main thread.
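A sketch of the object could look like this; the 0.3-second delay and the task-cancellation details are illustrative choices:

```swift
import Foundation

@MainActor
final class AutocompleteObject: ObservableObject {
    @Published var suggestions: [String] = []

    private let delay: TimeInterval = 0.3
    private let citiesCache: CitiesCache
    private var task: Task<Void, Never>?

    init(citiesCache: CitiesCache) {
        self.citiesCache = citiesCache
    }

    func autocomplete(_ text: String) {
        guard !text.isEmpty else {
            suggestions = []
            task?.cancel()
            return
        }

        // Cancel the previous lookup before starting a new one.
        task?.cancel()

        task = Task {
            // Small delay to limit the number of lookups while the user types.
            try? await Task.sleep(nanoseconds: UInt64(delay * 1_000_000_000))
            guard !Task.isCancelled else { return }

            // Calling into the actor requires await.
            let newSuggestions = await citiesCache.lookup(prefix: text)
            guard !Task.isCancelled else { return }

            suggestions = newSuggestions
        }
    }
}
```

Cancelling the previous task before starting a new one ensures that a stale lookup never overwrites fresher suggestions.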
It is important that asynchronous calls, such as Task.sleep to add a delay, as well as calls into the CitiesCache actor, are marked with await.
What await does is indicate that the routine must suspend and wait for the asynchronous subroutine (a function marked with the async keyword, or an actor) to complete.
You may have previously used semaphores or asyncAndWait in GCD to achieve similar behaviour. The difference is that await doesn't block the calling thread; execution simply resumes when the asynchronous call completes. Note: even though AutocompleteObject always uses the main thread, await Task.sleep won't block it.
The view
Inside the view we create an AutocompleteObject and observe its suggestions property, which is pretty standard. When input changes we call the autocomplete function, and the property will update.
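A sketch of the final view, wiring the pieces together (onChange is used here to react to input changes; the construction of CitiesFile and CitiesCache is simplified):

```swift
import SwiftUI

struct CityAutocompleteView: View {
    @State private var input = ""

    // Force unwrapping for brevity; a real app should handle a missing resource.
    @StateObject private var autocomplete = AutocompleteObject(
        citiesCache: CitiesCache(source: CitiesFile()!)
    )

    var body: some View {
        VStack(alignment: .leading) {
            TextField("City", text: $input)
                .textFieldStyle(.roundedBorder)
                .onChange(of: input) { newValue in
                    autocomplete.autocomplete(newValue)
                }

            List(autocomplete.suggestions, id: \.self) { suggestion in
                Text(suggestion)
            }
        }
        .padding()
    }
}
```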
That’s it. I hope you find this example useful and the new API simple to use. You can find the full source code on my GitHub: Autocomplete.
And if you like the article, please share it. Hope to see you next time.
Cheers!