Practical solutions to problems with Swift Concurrency (Task, async, await, etc.).
Structured concurrency is a new term for most Swift developers. This is an attempt to decipher its meaning.
Thread.sleep() and Task.sleep()
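A minimal sketch of the distinction, assuming a Darwin/Foundation environment: `Thread.sleep` blocks its thread outright, while `Task.sleep` only suspends the current task.

```swift
import Foundation

// Thread.sleep blocks the current thread: nothing else can run on it
// until the interval elapses. Inside the cooperative thread pool this
// starves other tasks.
Thread.sleep(forTimeInterval: 0.1)

// Task.sleep suspends only the current task; the underlying thread is
// freed to run other work. It is async and throws if the task is
// cancelled while sleeping.
func pause() async throws {
    try await Task.sleep(nanoseconds: 100_000_000) // 0.1 seconds
}
```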
So here’s a quick and simple example that showcases some of the nice features of the new concurrency model without going into much detail.
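In that spirit, here is a hedged sketch of what such an example might look like; the `User` type, the endpoint URL, and the `fetchUser` helper are invented for illustration. It shows `async`/`await` plus `async let` for structured, concurrent child tasks:

```swift
import Foundation

// Hypothetical model and endpoint, purely for illustration.
struct User: Decodable { let name: String }

func fetchUser(id: Int) async throws -> User {
    let url = URL(string: "https://example.com/users/\(id)")!
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(User.self, from: data)
}

// `async let` starts both requests concurrently; `await` collects the
// results, and errors propagate with ordinary `try`.
func loadProfile() async throws -> (User, User) {
    async let author = fetchUser(id: 1)
    async let editor = fetchUser(id: 2)
    return try await (author, editor)
}
```

Note how little ceremony is involved compared to completion handlers: no dispatch queues, no nested closures, and cancellation and error propagation come for free.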
However, with great power comes great responsibility. If you learn from tutorials, or even from the documentation, it’s really hard to find details on how it works under the hood.
Since WWDC21, we have talked extensively about all the new concurrency features introduced in Swift 5.5. We covered a lot of topics, so I decided to finish off this series by writing a summary article where we cover the most important points of each article.
Swift Concurrency promises to make it possible to write correct, performant code designed for today’s world of asynchronous events and ubiquitous hardware parallelism. And indeed, when wielded appropriately, it does exactly that. However, much like an iceberg, the simple APIs it exposes hide a staggering amount of complexity underneath. Unfortunately, concurrency is a challenging topic to reason about when compared to straight-line, synchronous code, and it is difficult for any programming model to paper over all of its subtleties.
libdispatch is one of the most misused APIs, both because of the way it was presented to us when it was introduced (and for many years after that) and because of its confusing documentation and API. This page is a compilation of important things to know if you’re going to use this library. Many references are available at the end of this document, pointing to comments from Apple’s own libdispatch maintainer (Pierre Habouzit).
Underused GCD patterns:
- Making a serial queue that’s less aggressive about creating threads (“non-overcommit”) […]
- Multiplexing work onto a single serial queue efficiently […]
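The first two patterns above can be sketched roughly like this. The queue label is a placeholder, and the overcommit behavior of default queue targets is an implementation detail described by the libdispatch maintainer rather than a documented guarantee:

```swift
import Dispatch

// A serial queue that explicitly targets a global queue is
// "non-overcommit": the system will not spin up an extra thread just
// to service it, which keeps thread explosion in check. (A queue
// created with the plain initializer targets an overcommit root
// queue instead.)
let workQueue = DispatchQueue(label: "com.example.work",
                              target: .global(qos: .utility))

// Multiplexing: many components funnel work onto this one serial
// queue instead of each creating its own.
workQueue.async { /* component A's work */ }
workQueue.async { /* component B's work */ }
```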
Probably overused GCD patterns:
- Global queues as anything but targets
- Almost any use of concurrent queues
- Queues as locks; os_unfair_lock is more efficient (sadly a little trickier to use in Swift; no ideal solution here yet)
- Turning async into sync with semaphores
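On the os_unfair_lock point: the trickiness in Swift is that the lock must live at a stable memory address, so passing `&lock` on a stored property is unsafe. One workable shape, heap-allocating the lock behind a small wrapper, is sketched below; it is one approach among several, not the definitive one:

```swift
import os

// os_unfair_lock needs a stable address, so it is allocated manually
// rather than stored as a plain Swift property.
final class UnfairLock {
    private let _lock: UnsafeMutablePointer<os_unfair_lock>

    init() {
        _lock = .allocate(capacity: 1)
        _lock.initialize(to: os_unfair_lock())
    }

    deinit {
        _lock.deinitialize(count: 1)
        _lock.deallocate()
    }

    // Run `body` while holding the lock; the lock is always released,
    // even if `body` throws.
    func withLock<T>(_ body: () throws -> T) rethrows -> T {
        os_unfair_lock_lock(_lock)
        defer { os_unfair_lock_unlock(_lock) }
        return try body()
    }
}
```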
Queues with .background Quality of Service (QoS) may never be executed (e.g. in Low Power Mode), so plan accordingly.
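To make that concrete, here is a small sketch (labels are placeholders): reserve `.background` for work that can be deferred indefinitely, and consider `.utility` as a floor for work the user still indirectly waits on.

```swift
import Dispatch

// May be starved indefinitely (e.g. Low Power Mode); use only for
// truly deferrable work such as cleanup or prefetching.
let backgroundQueue = DispatchQueue(label: "com.example.maintenance",
                                    qos: .background)

// A safer floor for work the user indirectly waits on.
let utilityQueue = DispatchQueue(label: "com.example.sync",
                                 qos: .utility)
```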
Fast-forward to 2020, and most consumer machines have about 4 cores while pro machines have about 8 to 12. Something must have gone wrong along the way. Spoiler: multithreading is hard.
[…]
How would serial queues help us with concurrency? Well, various program components would each have their own private queue used to ensure thread safety (locks would not even be needed anymore), and those components would be concurrent between themselves. They told us these were “islands of serialization in a sea of concurrency”.
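An “island of serialization” might look like the sketch below, with a hypothetical `ImageCache` component guarding its mutable state behind a private serial queue:

```swift
import Dispatch
import Foundation

// All access to `storage` goes through one private serial queue, so
// no lock is needed. Different components' queues still run
// concurrently with one another.
final class ImageCache {
    private let queue = DispatchQueue(label: "com.example.image-cache")
    private var storage: [String: Data] = [:]

    func set(_ data: Data, forKey key: String) {
        queue.async { self.storage[key] = data }
    }

    func data(forKey key: String, completion: @escaping (Data?) -> Void) {
        queue.async { completion(self.storage[key]) }
    }
}
```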
Let’s take a closer look at this feature and see how we can use it to define atomic properties in Swift.
Swift 5 turns on exclusivity checking by default. This has some interesting interactions with atomics, especially when running under the Thread Sanitizer (TSAN). If you’ve ever seen a TSAN report on some simple Swift code that looks obviously correct, then you’re probably running into this issue:
Generally, you can summarize atomic as “one at a time”.
For example, when accessing or mutating a property is atomic, it means that only one read or write operation can be performed at a time. If you have a program that reads a property atomically, this means that the property cannot change during this read operation.
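A minimal sketch of one way to get that “one at a time” behavior, here using an `NSLock` (a private serial queue or `os_unfair_lock` would work just as well):

```swift
import Foundation

final class Counter {
    private let lock = NSLock()
    private var _value = 0

    // Reads and writes each take the lock, so only one access to
    // `_value` happens at a time.
    var value: Int {
        lock.lock()
        defer { lock.unlock() }
        return _value
    }

    func increment() {
        lock.lock()
        defer { lock.unlock() }
        _value += 1
    }
}
```

Note the design choice: `increment()` is its own locked method because a read-modify-write like `counter.value += 1` would otherwise be two separate atomic operations, with room for another thread to interleave between them.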
NetNewsWire is mostly not multi-threaded. Here’s what we do…
In my previous post I describe how NetNewsWire handles threading, and I touch on some of the benefits, but I want to be more explicit about them.