Concurrent vs Serial Queues in GCD

A simple example: you have a block that takes a minute to execute. You add it to a queue from the main thread. Let's look at the four cases.

  • async - concurrent: the code runs on a background thread. Control returns immediately to the main thread (and UI). The block can't assume that it's the only block running on that queue
  • async - serial: the code runs on a background thread. Control returns immediately to the main thread. The block can assume that it's the only block running on that queue
  • sync - concurrent: the main thread waits for the block to finish, blocking any updates to the UI (in practice, GCD often runs a sync block on the calling thread itself as an optimization). The block can't assume that it's the only block running on that queue (I could have added another block using async a few seconds previously)
  • sync - serial: the main thread waits for the block to finish, blocking any updates to the UI (again, the block may well run on the calling thread). The block can assume that it's the only block running on that queue

Obviously you wouldn't use either of the last two for long-running processes. You normally see sync used when updating the UI (always on the main thread) from code that may be running on another thread.
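The four cases above can be sketched in a few lines of Swift. This is a minimal illustration (the queue labels are made up), using a DispatchGroup to rendezvous with the async work:

```swift
import Dispatch

// Hypothetical queue labels, for illustration only
let serialQueue = DispatchQueue(label: "demo.serial")
let concurrentQueue = DispatchQueue(label: "demo.concurrent", attributes: .concurrent)

var order: [String] = []

// sync: the caller blocks, so "after sync" is always appended second
serialQueue.sync { order.append("inside sync block") }
order.append("after sync")

// async: control returns immediately; the caller must rendezvous explicitly
// (here with a DispatchGroup) if it needs to know when the work is done
let group = DispatchGroup()
concurrentQueue.async(group: group) {
    order.append("inside async block")
}
group.wait()  // without this, "inside async block" may not have run yet
order.append("after group.wait")
```

Because the caller waits at `group.wait()`, the four appends happen strictly in the order shown, even though the third one runs on a pool thread.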

Dispatch_barrier_async and serial queue in GCD: what are the differences between them?

dispatch_barrier_[a]sync are meant to be used with a concurrent queue. They are also meant to be used along with calls to dispatch_[a]sync.

The common usage is the "multi-readers, one writer" pattern. You set up a concurrent queue. For "reader" blocks, you use dispatch_[a]sync. For "writer" blocks, you use dispatch_barrier_[a]sync.

This setup allows concurrent reading but only allows one writer at a time and no reading while the writing is happening.

Compare this with a serial queue where only one block at a time can ever happen.
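A minimal sketch of the reader-writer pattern in Swift (the type name and queue label here are hypothetical):

```swift
import Dispatch

// A hypothetical thread-safe store using the multi-reader, one-writer pattern
final class SynchronizedStore {
    private var storage: [String: Int] = [:]
    private let queue = DispatchQueue(label: "demo.rw", attributes: .concurrent)

    // Reader: sync on the concurrent queue — many reads can run in parallel
    func value(for key: String) -> Int? {
        queue.sync { storage[key] }
    }

    // Writer: async with a barrier — waits for in-flight reads to finish,
    // then runs alone on the queue, so no read overlaps the write
    func set(_ value: Int, for key: String) {
        queue.async(flags: .barrier) { self.storage[key] = value }
    }
}

let store = SynchronizedStore()
store.set(1, for: "a")
let a = store.value(for: "a")  // this sync read is queued behind the barrier write
```

Note that the read is guaranteed to see the write here only because the sync read is enqueued after the barrier block, so it waits for the barrier to complete.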

Serial vs concurrent blocking main queue in similar fashion

let queue = DispatchQueue(label: "queue_label", attributes: .concurrent) // concurrent queue

If this queue has multiple items queued, then it may use multiple threads to run them in parallel. It does not promise that it will, but it may. And it will only do so if it has multiple items queued at the same time.

for i in 1..<11 {
    queue.sync { ... }
}

This loop queues a single item, blocks until the item is scheduled and completed, and then queues another item. If this is all the code, then at no point are there multiple items on the queue. Of course if there is other code running in parallel, that enqueues items on queue, then there may be parallel items running.

As written, this code is legal, but seems pretty useless. If printNumber is time-consuming, and this runs on the main queue, it could crash the app (or at least beachball it on the Mac).

Nothing you've done here is a "read-write" problem, so I don't think that's related. Queues can be used to do all kinds of things. In this particular example, there doesn't seem to be any reason to use the queue at all, so I would delete that code. If you have a different problem, you can open a question asking about that.

Why does a concurrent queue with sync act like a serial queue?

If you want to demonstrate them running concurrently, you should dispatch the 10 tasks individually:

let cq = DispatchQueue(label: "downloadQueue", attributes: .concurrent)

for i in 0..<10 {
    cq.async {
        sleep(2)
        print(i)
    }
}
print("all finished queuing them!")

Note:

  • There are 10 dispatches to the concurrent queue, not one.

    Each dispatched task runs concurrently with respect to other tasks dispatched to that queue (which is why we need multiple dispatches to illustrate the concurrency).

  • Also note that we dispatch asynchronously because we do not want the calling thread to wait for each dispatched task before dispatching the next.


You ask:

So my original thought was: the 1-10 printing should be done concurrently, not necessarily in the serial order.

Because they are inside a single dispatch, they will run as a single task, running in order. You need to put them in separate dispatches to see them run concurrently.

You go on to ask:

Could anyone explain the purpose of sync call on concurrent queue and give me an example why and when we need it?

The sync has nothing to do with whether the destination queue is serial or concurrent. The sync only dictates the behavior of the calling thread, namely, should the caller wait for the dispatched task to finish or not. In this case, you really do not want to wait, so you should use async.

As a general rule, you should avoid calling sync unless (a) you absolutely have to; and (b) you are willing to have the calling thread blocked until the sync task runs. So, with very few exceptions, one should use async. And, perhaps needless to say, we should never block the main thread for more than a few milliseconds.
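One legitimate use of sync is to fetch a value that is guarded by a queue, where the caller genuinely needs the result before it can continue. A minimal sketch (the queue label and variable are made up for illustration):

```swift
import Dispatch

let stateQueue = DispatchQueue(label: "demo.state")  // serial queue guarding `counter`
var counter = 0

stateQueue.async { counter += 1 }  // mutate asynchronously; the caller does not wait

// sync blocks the caller only for as long as the read takes; because the
// queue is serial, this read is guaranteed to run after the increment above
let snapshot = stateQueue.sync { counter }
```

The block returns a value directly from `sync`, which is precisely the case where waiting is unavoidable: you cannot use the result before it exists.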

While using sync on a concurrent dispatch queue is generally avoided, one example you might encounter is the “reader-writer” synchronization pattern. In this case, “reads” happen synchronously (because you need to wait for the result), but “writes” happen asynchronously with a barrier (because you do not need to wait, but you do not want the write to happen concurrently with respect to anything else on that queue). A detailed discussion of using GCD for synchronization (especially the reader-writer pattern) is probably beyond the scope of this question. But search the web or StackOverflow for “GCD reader-writer” and you will find discussions on the topic.


Let us graphically illustrate my revamped rendition of your code, using OSLog to create intervals in Instruments’ “Points of Interest” tool:

import os.log
import QuartzCore  // for CACurrentMediaTime()

private let log = OSLog(subsystem: "Foo", category: .pointsOfInterest)

class Foo {
    func demonstration() {
        let queue = DispatchQueue(label: "downloadQueue", attributes: .concurrent)

        for i in 0..<10 {
            queue.async { [self] in
                let id = OSSignpostID(log: log)
                os_signpost(.begin, log: log, name: "async", signpostID: id, "%d", i)
                spin(for: 2)
                os_signpost(.end, log: log, name: "async", signpostID: id)
            }
        }
        print("all finished queuing them!")
    }

    func spin(for seconds: TimeInterval) {
        let start = CACurrentMediaTime()
        while CACurrentMediaTime() - start < seconds { }
    }
}

When I profile this in Instruments (e.g. “Product” » “Profile”), choosing the “Time Profiler” template (which includes the “Points of Interest” tool), I see a graphical timeline of what is happening:

[Sample image: Instruments “Points of Interest” timeline]

So let me draw your attention to two interesting aspects of the above:

  1. The concurrent queue runs tasks concurrently, but because my iPhone’s CPU only has six cores, only six of them can actually run at the same time. The next four will have to wait until a core is available for that particular worker thread.

    Note, this demonstration only works because I am not just calling sleep, but rather spinning for the desired time interval, to more accurately simulate some slow, blocking task. Spinning is a better proxy for a slow synchronous task than sleep is.

  2. This illustrates, as you noted, that concurrent tasks may not run in the precise order in which they were submitted. This is because (a) they were all queued in quick succession; and (b) they run concurrently: there is a “race” as to which concurrently running thread reaches the logging statements (or “Points of Interest” intervals) first.

    Bottom line, for those tasks running concurrently, because of races, they may not appear to run in order. That is how concurrent execution works.

GCD serial async queue vs serial sync queue nested in async

In the first approach, the calling thread is not blocked and the task (critical section) passed in the async block will be executed in background.

In the second approach, the calling thread is not blocked either, but the "background" thread waits while the sync block (the critical section) executes, possibly on another thread.

I don't know what you do in your critical section, but the first approach seems the best one. Note that the background QoS is quite slow; consider the default QoS for your queue unless you know what you are doing. Also note that the convention is to use the bundle identifier as the label for your queue. So something like this:

private let someQueue = DispatchQueue(label: "\(Bundle.main.bundleIdentifier ?? "").\(type(of: self)).someQueue")
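The two approaches can be sketched side by side. This is an illustration only (the queue labels and logging are made up); in both cases the critical section is serialized, but the second ties up an extra worker thread while it waits:

```swift
import Dispatch

// Hypothetical queues illustrating the two approaches
let workQueue = DispatchQueue(label: "com.example.app.work")
let innerQueue = DispatchQueue(label: "com.example.app.inner")
let group = DispatchGroup()
var log: [String] = []

// Approach 1: a single async on a serial queue — the queue itself
// serializes the critical section; no thread waits on another
workQueue.async(group: group) {
    log.append("approach 1 critical section")
}

// Approach 2: async, then a nested sync — the block on workQueue is
// tied up waiting until the critical section finishes on innerQueue
// (GCD may run the sync block on the waiting thread as an optimization)
workQueue.async(group: group) {
    innerQueue.sync {
        log.append("approach 2 critical section")
    }
}

group.wait()
```

Because workQueue is serial, the two entries always land in the order they were dispatched; the difference is purely in how many threads are occupied while the work runs.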

How do serial queues differ from concurrent queues?

"serial" means that the blocks submitted to the queue are executed sequentially, i.e. the second block is not executed before the first one has finished. It does not mean that the blocks are executed on the same thread.

"concurrent" means that the blocks submitted to the queue may execute concurrently (on different threads).

In both cases, GCD uses a "thread pool" to execute blocks, so you cannot know on which thread a block will be executed. The only exception is the "main queue", which executes all blocks on the main thread.
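Both properties are easy to observe. A minimal sketch (queue label made up): blocks dispatched async to a serial queue finish strictly in FIFO order, even though GCD is free to run each block on a different pool thread:

```swift
import Dispatch

let serial = DispatchQueue(label: "demo.serial")
let group = DispatchGroup()
var results: [Int] = []

for i in 0..<5 {
    serial.async(group: group) {
        // Each block may run on a different pool thread,
        // but never before the previous block has finished
        results.append(i)
    }
}

group.wait()
// On a serial queue, results is always [0, 1, 2, 3, 4]
```

Appending to the array from the blocks is safe here precisely because the serial queue guarantees that no two blocks run at the same time; on a concurrent queue the same code would be a data race.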


