Check If on Correct Dispatch Queue in Swift 3


Answering my own question:

Based on KFDoom's comments, I'm now using setSpecific and getSpecific.

This creates a key, sets it on the test queue, and later on, gets it again:

let testQueueLabel = "com.example.my-test-queue"
let testQueue = DispatchQueue(label: testQueueLabel, attributes: [])
let testQueueKey = DispatchSpecificKey<Void>()

testQueue.setSpecific(key: testQueueKey, value: ())

// ... later on, to test:

XCTAssertNotNil(DispatchQueue.getSpecific(key: testQueueKey), "callback should be called on specified queue")

Note that there's no value associated with the key (its type is Void): I'm only interested in the existence of the specific, not in its value.

Important!
Make sure to keep a reference to the key, or cleanup after you're done using it. Otherwise a newly created key could use the same memory address, leading to weird behaviour. See: http://tom.lokhorst.eu/2018/02/leaky-abstractions-in-swift-with-dispatchqueue
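One way to keep that reference is to tie the key's lifetime to a small helper object (a sketch; the `QueueIdentity` name is mine, not from the linked article):

```swift
import Dispatch

// Holding the key for the helper's lifetime guarantees it is never
// deallocated, so its memory address can't be reused by a new key.
final class QueueIdentity {
    private let key = DispatchSpecificKey<Void>()

    init(queue: DispatchQueue) {
        queue.setSpecific(key: key, value: ())
    }

    var isCurrent: Bool {
        return DispatchQueue.getSpecific(key: key) != nil
    }
}

let queue = DispatchQueue(label: "com.example.my-test-queue")
let identity = QueueIdentity(queue: queue)
queue.sync { print(identity.isCurrent) } // true
print(identity.isCurrent)                // false: not on the queue here
```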

How to get the current queue name in Swift 3

Now DispatchQueue has label property.

The label you assigned to the dispatch queue at creation time.

var label: String { get } 

It seems to have existed from the beginning, but was perhaps not exposed via the public API.

macOS 10.10+

And please use this only to obtain human-readable labels, not to identify each GCD queue.

If you want to check whether your code is running on certain GCDQ, you can use dispatchPrecondition(...) function.
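A minimal sketch of that check (the queue label is a placeholder):

```swift
import Dispatch

let myQueue = DispatchQueue(label: "com.example.my-queue")

func doWork() {
    // Traps (in debug builds) if this is called from any other queue.
    dispatchPrecondition(condition: .onQueue(myQueue))
    // ... work that must run on myQueue ...
}

myQueue.sync {
    doWork() // fine: we are on myQueue
}
```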

Priority of Dispatch Queues in Swift 3

You say:

The outcome shows that we ignore the asynchronous execution. ...

No, it just means that you didn't give the asynchronously dispatched code enough time to get started.

I know queue2 should be completed before queue1 since it's synchronous execution ...

First, queue2 might not complete before queue1. It just happens to. Make queue2 do something much slower (e.g. loop through a few thousand iterations rather than just five) and you'll see that queue1 can actually run concurrently with respect to what's on queue2. It just takes a few milliseconds to get going and the stuff on your very simple queue2 is finishing before the stuff on queue1 gets a chance to start.
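A sketch of that experiment (the original question's code isn't shown here, so the queue names and workloads below are assumptions):

```swift
import Dispatch
import Foundation

let queue1 = DispatchQueue(label: "com.example.queue1")
let queue2 = DispatchQueue(label: "com.example.queue2")

queue1.async {
    for i in 0..<5 { print("queue1:", i) }
}

// A slower synchronous workload: by the time it is underway, queue1's
// async block has had time to start, so the output can interleave.
queue2.sync {
    for i in 0..<5 {
        usleep(1_000) // ~1 ms of "work" per iteration
        print("queue2:", i)
    }
}
```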

Second, this behavior is not technically because it's synchronous execution. It's just that async takes a few milliseconds to get its work running on some worker thread, whereas the synchronous call, because of optimizations I won't bore you with, gets started more quickly.

but why we ignore the asynchronous execution ...

We don't "ignore" it. It just takes a few milliseconds to get started.

and what is the actual difference between async, sync and so-called main queue?

"Async" merely means that the current thread may carry on and not wait for the dispatched code to run on some other thread. "Sync" means that the current thread should wait for the dispatched code to finish.

The "main thread" is a different topic and simply refers to the primary thread that is created for driving your UI. In practice, the main thread is where most of your code runs, basically running everything except that which you manually dispatch to some background queue (or code that is dispatched there for you, e.g. URLSession completion handlers).
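A tiny illustration of the difference (the queue label is a placeholder):

```swift
import Dispatch

let background = DispatchQueue(label: "com.example.background")

background.async {
    print("async: the caller did not wait for this")
}
print("after async: the caller continues immediately")

background.sync {
    print("sync: the caller is blocked until this finishes")
}
print("after sync: runs only once the block above has returned")
```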

Get current dispatch queue?

You do have the option of "dispatch_get_current_queue()", however the iOS 6.1 SDK defines this API with these disclaimers:

"Recommended for debugging and logging purposes only:"

and

"This function is deprecated and will be removed in a future release.".

Here's another related question with some alternatives you can consider if you want code that's future-proof.

Dispatch queues: How to tell if they're running and how to stop them

This is a semi-common question when programming with GCD.

The short answer is that GCD does not have a cancelation API for queues. The rationale:

  1. memory management would become vastly more complicated, because a given block might be responsible for free()ing a given allocation of memory. By always running the block, GCD ensures that memory management remains easy.
  2. It is practically impossible to halt a running block without corrupting state.
  3. Most code that needs cancellation logic is already tracking that state in private data structures.

Given all of these cases, it is far more efficient and powerful to write code like this:

dispatch_async(my_obj->queue, ^{
    bool done = false;
    // do_full_update() takes too long, therefore:
    while ( !my_obj->cancelled && !done ) {
        done = do_partial_update(my_obj);
    }
});

Oh, and to know if a queue has finished running all of the enqueued blocks, your code can simply execute an empty block with the synchronous API:

dispatch_sync(my_obj->queue, ^{});

As mentioned in the comments, a better way of knowing when your work is done is to use dispatch groups. Dispatch all your blocks to the group and then you can add a completion handler to the group. Once the work is complete, the completion block will run.

dispatch_group_t myGroup = dispatch_group_create();
dispatch_group_async(myGroup, my_obj->queue, ^{
    bool done = false;
    while ( !my_obj->cancelled && !done ) {
        done = do_partial_update(my_obj);
    }
});
dispatch_group_notify(myGroup, my_obj->queue, ^{
    NSLog(@"Work is done!");
    dispatch_release(myGroup);
});

Once all of your blocks have completed, the group will be empty and trigger the notification block. From there, you can update UI, etc.
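For completeness, a rough Swift 3 equivalent of that Objective-C pattern (`doPartialUpdate()` and the `cancelled` flag are hypothetical, and in real code the flag would itself need synchronization):

```swift
import Dispatch

var cancelled = false

func doPartialUpdate() -> Bool {
    // Pretend one slice of the full update happens here.
    return true
}

let queue = DispatchQueue(label: "com.example.work")
let group = DispatchGroup()

queue.async(group: group) {
    var done = false
    while !cancelled && !done {
        done = doPartialUpdate()
    }
}

group.notify(queue: .main) {
    print("Work is done!")
}
```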

Good luck and have fun!

How to create dispatch queue in Swift 3

Creating a concurrent queue

let concurrentQueue = DispatchQueue(label: "queuename", attributes: .concurrent)
concurrentQueue.sync {

}

Create a serial queue

let serialQueue = DispatchQueue(label: "queuename")
serialQueue.sync {

}

Get main queue asynchronously

DispatchQueue.main.async {

}

Get main queue synchronously

DispatchQueue.main.sync {

}

To get one of the background threads

DispatchQueue.global(qos: .background).async {

}

Xcode 8.2 beta 2:

To get one of the background threads

DispatchQueue.global(qos: .default).async {

}

DispatchQueue.global().async {
    // qos defaults to `DispatchQoS.QoSClass.default`
}

If you want to learn about using these queues, see this answer.

What happens if dispatch on same queue?

I highly recommend you watch these videos, then go through the examples I provided, change the code, and play around with it as much as you can. It took me 3 years to feel fully comfortable with iOS multi-threading, so take your time :D

Watch the first 3 minutes of this RWDevCon video and more if you like.

Also watch from 3:45 until 6:15, though I recommend watching this video in its entirety.


To summarize the points the videos make in the duration I mentioned:

Threading and concurrency are all about the source queue and the destination queue.

sync vs. async is specifically a matter of the source queue.

Think of the source and destination queues as highways where your work gets done.

If you do async, it's like sending a car (that has to deliver stuff) off the highway at an exit, and then letting the other cars keep driving on the highway.

If you do sync, it's like sending that car off at the exit and then halting all the other cars on the highway until the car delivers all its stuff.

Think of a car delivering stuff as a block of code, starting and finishing execution.


What happens on the main queue is identical to what happens on any other serial queue; they're both serial queues.

So if you're already on the main thread and you dispatch asynchronously to the main queue, anything you dispatch will go to the end of the queue.

To show you what I mean: In what order do you think this would print? You can easily test this in Playground:

DispatchQueue.main.async {
    print("1")
    print("2")
    print("3")
    DispatchQueue.main.async {
        DispatchQueue.main.async {
            print("4")
        }
        print("5")
        print("6")
        print("7")
    }
    print("8")
}

DispatchQueue.main.async {
    print("9")
    print("10")
}

It will print:

1
2
3
8
9
10
5
6
7
4

Why?

It's mainly because every time you dispatch to main from main, the block will be placed at the end of the main queue.

Dispatching to main while you're already on the main queue is a subtle, hidden reason for many of the tiny delays you see in an app's user interaction.


What happens if you dispatch to the same serial queue using sync?

Deadlock! See here
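The classic self-deadlock looks like this (do not actually run it; the inner sync waits on a queue that is busy running the very block that is waiting):

```swift
import Dispatch

let serial = DispatchQueue(label: "com.example.serial")

serial.async {
    // We are on `serial` now. The inner sync cannot start until the
    // queue is free, but the queue is busy running this block: deadlock.
    serial.sync {
        print("never reached")
    }
}
```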

If you dispatch to the same concurrent queue using sync, then you won't have a deadlock; the calling thread just blocks until that one block finishes. I've discussed that below.


Now if you dispatch sync to a concurrent queue, the caller waits until that one block finishes, but other blocks on the queue keep running concurrently; a plain sync does not block the whole 5-lane highway. It's kinda useless to do sync on a concurrent queue, unless you're using a .barrier block (which does block the entire highway while it runs) and are trying to solve a read-write problem.
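A minimal sketch of that barrier-based read-write pattern (the type and names are mine):

```swift
import Dispatch

final class SynchronizedDictionary {
    private var storage: [String: Int] = [:]
    private let queue = DispatchQueue(label: "com.example.rw", attributes: .concurrent)

    // Reads may run concurrently with each other.
    func value(for key: String) -> Int? {
        return queue.sync { storage[key] }
    }

    // A barrier write waits for in-flight reads, runs alone,
    // then lets reads resume.
    func set(_ value: Int, for key: String) {
        queue.async(flags: .barrier) {
            self.storage[key] = value
        }
    }
}
```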

But to just see what happens if you do sync on a concurrent queue:

let queue = DispatchQueue(label: "aConcurrentQueue", attributes: .concurrent)

for i in 0...4 {
    if i == 3 {
        queue.sync {
            someOperation(iteration: UInt32(i))
        }
    } else {
        queue.async {
            someOperation(iteration: UInt32(i))
        }
    }
}

func someOperation(iteration: UInt32) {
    sleep(1)
    print("iteration", iteration)
}

will log:

'3' will USUALLY (not always) be first, or closer to the first, because sync blocks get executed on the source queue's thread. As the docs on sync say:

As a performance optimization, this function executes blocks on the current thread whenever possible

The other iterations happen concurrently. Each time you run the app, the sequence may be different; that's the inherent unpredictability of concurrency. 4 will tend to finish last and 0 will tend to finish sooner. So something like this:

iteration 3     
iteration 0
iteration 2
iteration 1
iteration 4

If you do async on a concurrent queue then, assuming a limited number of concurrent threads, e.g. 5, up to 5 tasks get executed at once; each new task still goes to the end of the queue. This makes sense for logging: you can have multiple log threads, one logging location events, another logging purchases, etc.

A good playground example would be:

let queue = DispatchQueue(label: "aConcurrentQueue", attributes: .concurrent)

func delay(seconds: UInt32) {
    queue.async {
        sleep(seconds)
        print(seconds)
    }
}

for i in (1...5).reversed() {
    delay(seconds: UInt32(i))
}

Even though you dispatched the 5-second task first, this would print

1
2
3
4
5

Swift DispatchQueue global and main in variable

The key issue is to ensure thread-safety. For example, the following is not thread-safe:

func addUpValuesNotThreadSafe() {
    var total = 0

    getMyFirstItem { value in
        total += value // on main thread
    }

    getMySecondItem { value in
        total += value // on some GCD worker thread!!!
    }

    getMyThirdItem { value in
        total += value // on main thread
    }

    ...
}

One could solve this problem by not allowing these tasks to run in parallel, but then you lose all the benefits of asynchronous processing and the concurrency it offers.

Needless to say, when you do allow them to run in parallel, you would likely add some mechanism (such as dispatch groups) to know when all of these asynchronous tasks are done. But I did not want to complicate this example; I'd rather keep our focus on the thread-safety issue. (I show how to use dispatch groups later in this answer.)

Anyway, if you have closures called from multiple threads, you must not increment the same total without adding some synchronization. You could add synchronization with a serial dispatch queue, for example:

func addUpValues() {
    var total = 0
    let queue = DispatchQueue(label: Bundle.main.bundleIdentifier! + ".synchronized")

    getMyFirstItem { value in
        queue.async {
            total += value // on serial queue
        }
    }

    getMySecondItem { value in
        queue.async {
            total += value // on serial queue
        }
    }

    getMyThirdItem { value in
        queue.async {
            total += value // on serial queue
        }
    }

    ...
}

There are a variety of alternative synchronization mechanisms (locks, GCD reader-writer, actor, etc.). But I start with the serial queue example to observe that, actually, any serial queue would accomplish the same thing. Many use the main queue (which is a serial queue) for this sort of trivial synchronization where the performance impact is negligible, such as in this example.

For example, one could therefore either refactor getMySecondItem to also call its completion handler on the main queue, like getMyFirstItem and getMyThirdItem already do. Or if you cannot do that, you could simply have the getMySecondItem caller dispatch the code that needs to be synchronized to the main queue:

func addUpValues() {
    var total = 0

    getMyFirstItem { value in
        total += value // on main thread
    }

    getMySecondItem { value in
        DispatchQueue.main.async {
            total += value // now on main thread, too
        }
    }

    getMyThirdItem { value in
        total += value // on main thread
    }

    // ...
}

That is also thread-safe. This is why many libraries will ensure that all of their completion handlers are called on the main thread, as it minimizes the amount of time the app developer needs to manually synchronize values.


While I have illustrated the use of serial dispatch queues for synchronization, there are a multitude of alternatives. E.g., one might use locks or the GCD reader-writer pattern.

The key is that one should never mutate a variable from multiple threads without some synchronization.


Above I mention that you need to know when the three asynchronous tasks are done. You can use a DispatchGroup, e.g.:

func addUpValues(complete: @escaping (Int) -> Void) {
    let total = Synchronized(0)
    let group = DispatchGroup()

    group.enter()
    getMyFirstItem { first in
        total.synchronized { value in
            value += first
        }
        group.leave()
    }

    group.enter()
    getMySecondItem { second in
        total.synchronized { value in
            value += second
        }
        group.leave()
    }

    group.enter()
    getMyThirdItem { third in
        total.synchronized { value in
            value += third
        }
        group.leave()
    }

    group.notify(queue: .main) {
        let value = total.synchronized { $0 }
        complete(value)
    }
}

And in this example, I abstracted the synchronization details out of addUpValues:

class Synchronized<T> {
    private var value: T
    private let lock = NSLock()

    init(_ value: T) {
        self.value = value
    }

    func synchronized<U>(block: (inout T) throws -> U) rethrows -> U {
        lock.lock()
        defer { lock.unlock() }
        return try block(&value)
    }
}

Obviously, use whatever synchronization mechanism you want (e.g., GCD or os_unfair_lock or whatever).

But the idea is that in the GCD world, dispatch groups can notify you when a series of asynchronous tasks are done.


I know that this was a GCD question, but for the sake of completeness, the Swift concurrency async-await pattern renders much of this moot.

func getMyFirstItem() async -> Int {
    return 10
}

func getMySecondItem() async -> Int {
    await Task.detached(priority: .background) {
        return 10
    }.value
}

func getMyThirdItem() async -> Int {
    return 10
}

func addUpValues() {
    Task {
        async let value1 = getMyFirstItem()
        async let value2 = getMySecondItem()
        async let value3 = getMyThirdItem()
        let total = await value1 + value2 + value3
        print(total)
    }
}

Or, if your async methods were updating some shared property, you would use an actor to synchronize access. See Protect mutable state with Swift actors.
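A minimal actor sketch of that idea:

```swift
actor Total {
    private var value = 0

    func add(_ amount: Int) {
        value += amount
    }

    var current: Int { value }
}

// Concurrent increments are serialized by the actor; no lock needed.
func addUpValuesWithActor() async -> Int {
    let total = Total()

    await withTaskGroup(of: Void.self) { group in
        for amount in [10, 20, 30] {
            group.addTask { await total.add(amount) }
        }
    }

    return await total.current // always 60
}
```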

How to check current thread in Swift 3?

Looks like it's simply Thread.isMainThread in Swift 3.
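A quick illustration (the `updateUI` function is a placeholder):

```swift
import Foundation

func updateUI() {
    assert(Thread.isMainThread, "updateUI() must be called on the main thread")
    // ... UI work ...
}

DispatchQueue.main.async {
    print(Thread.isMainThread) // true
    updateUI()                 // assertion passes
}

DispatchQueue.global().async {
    print(Thread.isMainThread) // false
}
```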


