Sync vs Async, Queues, and Effective Multithreading Techniques

Explore Swift's concurrency: differentiate synchronous/asynchronous tasks, understand DispatchQueue types, and learn multithreading strategies for efficient, responsive iOS app development.

12/27/2023 · 4 min read

Concurrency is key in modern iOS development for a simple reason: it allows apps to handle multiple tasks simultaneously, dramatically enhancing performance and user experience. By adeptly managing synchronous and asynchronous operations, effectively utilizing DispatchQueues, and mastering multithreading techniques, developers can build responsive, efficient, and powerful applications.

Join us as we delve into the nuances of Swift concurrency, unlocking the full potential of your apps in an environment where speed and responsiveness are paramount.

Next, we'll explore the cornerstones of concurrency in Swift: asynchronous and synchronous tasks, and the distinct dynamics of serial and parallel (concurrent) queues. This exploration will provide a clearer understanding of how tasks are executed and managed, laying the foundation for advanced concurrency techniques in iOS app development.

In Swift, particularly when dealing with concurrency, understanding synchronous and asynchronous tasks is crucial. These concepts dictate how code execution is managed, affecting the performance and responsiveness of an application.

Synchronous Tasks

Definition: Synchronous tasks are executed one at a time, and the execution of subsequent tasks or code lines is blocked until the current task completes. This means that the code waits for the task to finish before moving to the next one.


1. Blocking: The calling thread is blocked, meaning it does nothing else while waiting for the task to complete.

2. Predictability: Since tasks are completed one after the other, the order of execution is predictable.

3. Simplicity: Easier to understand and debug because the code runs in a straightforward, sequential manner.
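To make the blocking behaviour concrete, here is a minimal sketch using `DispatchQueue.sync` (the queue label is an arbitrary placeholder):

```swift
import Foundation

// A serial queue used to demonstrate synchronous execution.
// The label "com.example.demo" is just an illustrative identifier.
let queue = DispatchQueue(label: "com.example.demo")
var log: [String] = []

log.append("Before sync")
queue.sync {
    // The caller is blocked here until this closure returns.
    log.append("Inside sync task")
}
log.append("After sync")   // runs only once the task has completed

print(log)   // ["Before sync", "Inside sync task", "After sync"]
```

Because `sync` blocks the caller, the three entries always appear in exactly this order, which is the predictability described above.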

Asynchronous Tasks

Definition: Asynchronous tasks allow the execution of other tasks to continue without waiting for the current task to finish. When you execute something asynchronously, you can move on to another task before it finishes.


1. Non-blocking: The calling thread is not blocked and can continue executing other tasks.

2. Concurrency: Multiple tasks can be in progress at the same time, making better use of the available cores.

3. Complexity: More complex to handle, as it requires dealing with tasks that finish at different times and potentially in an unpredictable order.
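The non-blocking behaviour can be sketched as follows (again, the queue label is a placeholder):

```swift
import Foundation

// A serial queue used to demonstrate asynchronous execution.
// The label "com.example.demo" is an illustrative placeholder.
let queue = DispatchQueue(label: "com.example.demo")

print("Before async")
queue.async {
    // Runs on the queue's thread; the caller does not wait for it.
    print("Inside async task")
}
print("After async")   // typically prints before the task above has run

// Submitting an empty sync block drains the queue, so the program
// doesn't exit before the async task has had a chance to run.
queue.sync { }
```

Note that "After async" is usually printed before "Inside async task": the caller moves on immediately, which is exactly the unpredictable completion order mentioned above.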

Exploring Dispatch Queue in Swift Concurrency

A Dispatch Queue in Swift is a powerful tool for managing how work is executed. It's akin to a real-life queue at a coffee shop, where customers (or tasks) line up to be served (or executed). These queues ensure tasks are performed in an organized manner, either one after the other or simultaneously.

Serial Dispatch Queue

A Serial Dispatch Queue is like a single barista serving one customer at a time. Each task is completed before the next one begins, ensuring tasks are executed in the order they are added. This is particularly useful when tasks need to be performed sequentially to avoid conflicts, such as when modifying a shared resource.

Code Example:
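The sketch below creates a custom serial queue and submits two tasks to it (the queue label is a placeholder; the final `sync` block is there only to show the result once both tasks are done):

```swift
import Foundation

// A custom serial queue: tasks run strictly one after another,
// in the order they were submitted (FIFO).
let serialQueue = DispatchQueue(label: "com.example.serialQueue")
var order: [String] = []

serialQueue.async {
    order.append("Task 1")   // always runs first
}
serialQueue.async {
    order.append("Task 2")   // only starts once Task 1 has finished
}

// A sync block submitted last runs after both tasks,
// so by the time it returns the order is final.
serialQueue.sync {
    print(order)   // ["Task 1", "Task 2"] — always this order
}
```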

In this example, Task 2 will only commence after Task 1 is complete.

Parallel (Concurrent) Dispatch Queue

In contrast, a Parallel Dispatch Queue is akin to having multiple baristas, where several customers are served at the same time. Tasks in a concurrent queue start in the order they are added, but because they are executed simultaneously, they can finish in any order.

Code Example:
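A minimal sketch of a concurrent queue (the label is a placeholder; the lock and `DispatchGroup` are there only to record and wait for the results safely):

```swift
import Foundation

// A custom concurrent queue: tasks may run at the same time.
let concurrentQueue = DispatchQueue(label: "com.example.concurrentQueue",
                                    attributes: .concurrent)
let group = DispatchGroup()
let lock = NSLock()
var finished: Set<String> = []

concurrentQueue.async(group: group) {
    Thread.sleep(forTimeInterval: 0.2)   // simulate slower work
    lock.lock(); finished.insert("Task 1"); lock.unlock()
}
concurrentQueue.async(group: group) {
    lock.lock(); finished.insert("Task 2"); lock.unlock()
}

group.wait()   // block until both tasks have completed
print(finished.sorted())   // both tasks ran, in whichever order
```

Because Task 1 sleeps, Task 2 will usually finish first even though it was submitted second, illustrating the out-of-order completion described above.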

Here, Task 1 and Task 2 can run and complete independently of each other, potentially at the same time.

After understanding serial and parallel dispatch queues, it's crucial to delve into two specific types of queues provided by Swift: the Main Queue and the Global Queue. These queues play distinct roles in the orchestration of tasks in an iOS application.

The Main Queue

The Main Queue is a special serial queue that runs on the main thread of the application. It's primarily responsible for handling UI updates and user interactions. Since the user interface can only be updated on the main thread, the Main Queue becomes a critical component in ensuring a smooth and responsive UI.

Key Point: All UI updates must be performed on the Main Queue. Failing to do so can lead to inconsistent UI states and even app crashes.

Code Example:
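A sketch of the standard pattern, assuming a UIKit context: `statusLabel` is a hypothetical label and is commented out so the snippet runs outside an app (the final run-loop call only matters in a command-line context, where it gives the main-queue block a chance to execute):

```swift
import Foundation

// Do the heavy work in the background, then hop to the Main Queue
// before touching the UI.
DispatchQueue.global(qos: .userInitiated).async {
    let title = "Data loaded"   // placeholder for real fetched data

    DispatchQueue.main.async {
        // statusLabel.text = title   // UIKit calls: main thread only
        print("UI updated with: \(title)")
    }
}

// In a command-line context, pump the main run loop briefly so the
// main-queue block above gets a chance to execute.
RunLoop.main.run(until: Date().addingTimeInterval(0.5))
```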

In this snippet, a UI update is safely performed on the Main Queue.

The Global Queue

The Global Queue is a system-provided concurrent queue. It's used to perform non-UI work in the background, like downloading data, processing images, or performing calculations.

Code Example:
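A minimal sketch of background work on the global queue; the semaphore is only there so the snippet can wait for the result before exiting (in a real app you would typically report back via `DispatchQueue.main.async` instead):

```swift
import Foundation

let semaphore = DispatchSemaphore(value: 0)
var result = 0

// Non-UI work on the system-provided concurrent global queue.
DispatchQueue.global(qos: .background).async {
    result = (1...1_000).reduce(0, +)   // stand-in for real background work
    semaphore.signal()
}

semaphore.wait()   // for demonstration only: wait for the work to finish
print("Background result: \(result)")   // 500500
```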

Here, a background task is executed on the Global Queue with the appropriate QoS.

Quality of Service in Swift's DispatchQueue

In Swift concurrency, Quality of Service (QoS) is a pivotal concept that helps prioritize tasks. It's akin to assigning different levels of urgency to different types of work. By setting the QoS for a task, you inform the system of its importance, which influences how the system prioritizes and schedules these tasks.

Understanding Quality of Service Classes

Swift offers several predefined QoS classes, ordered here from highest to lowest priority. Each class serves a unique purpose, ensuring that tasks are handled according to their intended user impact.

1. `userInteractive`: Work that directly affects the UI, such as animations or event handling; it must complete virtually instantly.

2. `userInitiated`: Work the user explicitly started and is actively waiting on, such as opening a saved document.

3. `default`: The fallback priority, falling between `userInitiated` and `utility`.

4. `utility`: Long-running work that doesn't need an immediate result, often with visible progress, such as downloads.

5. `background`: Work the user isn't directly aware of, such as prefetching, indexing, or backups.

6. `unspecified`: The absence of QoS information; the system infers an appropriate level from the context.

Applying QoS in DispatchQueue

When creating your own queues or submitting tasks to a global queue, specifying the QoS can help the system manage resources more effectively.

Code Example:
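The sketch below shows both ways of applying QoS: per-task via the global queue, and per-queue at creation time (the custom queue label is a placeholder, and the `DispatchGroup` is only there so the snippet can wait for the task to finish):

```swift
import Foundation

let group = DispatchGroup()
var completed = false

// Submit work at .userInitiated QoS: the system schedules it ahead
// of .utility and .background work.
DispatchQueue.global(qos: .userInitiated).async(group: group) {
    completed = true   // stand-in for prompt, user-driven work
}

// QoS can also be fixed at creation time for a custom queue.
let utilityQueue = DispatchQueue(label: "com.example.imageProcessing",
                                 qos: .utility)

group.wait()   // for demonstration: block until the task finishes
print("User-initiated task completed: \(completed)")
```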

In this example, a task is dispatched on a global queue with the `.userInitiated` QoS, indicating work the user has explicitly started and is waiting on, so the system schedules it promptly.

In the next article of our series, we will dive into more advanced aspects of Swift concurrency. We'll explore the concept of Target Queues, which offer sophisticated control over task execution, and we'll unravel the functionality of Dispatch Barriers and Dispatch Groups. These tools provide us with the means to handle even more complex concurrency scenarios, allowing for synchronized access to resources and coordinated task completion.