How to create dispatch queue in Swift 3
Creating a concurrent queue
let concurrentQueue = DispatchQueue(label: "queuename", attributes: .concurrent)
concurrentQueue.async {
    // blocks dispatched with async may run concurrently on this queue
}
Create a serial queue
let serialQueue = DispatchQueue(label: "queuename")
serialQueue.sync {
}
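A minimal sketch of how the serial queue above behaves (the label and the DispatchGroup bookkeeping are illustrative): blocks dispatched to a serial queue run one at a time, in the order they were added.

```swift
import Dispatch

// Serial queues execute one block at a time, FIFO, so no extra
// synchronization is needed for appends made from the queue's blocks.
let serial = DispatchQueue(label: "demo.serial")
var order: [Int] = []
let group = DispatchGroup()

for i in 1...3 {
    serial.async(group: group) {
        order.append(i)   // runs strictly after the previous block finished
    }
}
group.wait()              // block here until all dispatched work is done
print(order)              // [1, 2, 3] — serial queues preserve FIFO order
```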
Get main queue asynchronously
DispatchQueue.main.async {
}
Get main queue synchronously
DispatchQueue.main.sync {
}
To run work on one of the global background queues
DispatchQueue.global(qos: .background).async {
}
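A runnable sketch of the background-queue pattern above (the semaphore is just illustrative bookkeeping so top-level code can wait for the background block):

```swift
import Dispatch

// Dispatch some work to the global .background queue and wait for it.
let semaphore = DispatchSemaphore(value: 0)
var result = 0

DispatchQueue.global(qos: .background).async {
    result = (1...10).reduce(0, +)   // some background work
    semaphore.signal()               // tell the waiting thread we're done
}

semaphore.wait()                     // block until the background block signals
print(result)                        // 55
```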
Xcode 8.2 beta 2:
To run work on one of the global background queues
DispatchQueue.global(qos: .default).async {
}
DispatchQueue.global().async {
// qos' default value is DispatchQoS.QoSClass.default
}
If you want to learn more about using these queues, see this answer.
How to create queue and start it manually
In my application I have to implement refresh-token logic. I would like all requests sent during the token refresh to be held in a queue, and as soon as the refresh finishes, to start the queue.
If you want to create a queue and delay the starting of its tasks, just suspend it, e.g.:
let queue = DispatchQueue(label: "myQueue", attributes: .concurrent)
queue.suspend()
queue.async {
    // request One
}
queue.async {
    // request Two
}
fetchToken { result in
    switch result {
    case .success(let token):
        // do something with token
        print(token)
        queue.resume()
    case .failure(let error):
        // handle the error
        print(error)
    }
}
That’s how you suspend and resume dispatch queues. Note that suspend only prevents items from starting on a queue; it has no effect on tasks that are already running. That is why I suspended the queue before dispatching items to it.
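A minimal sketch of this suspend-before-dispatch behavior (the queue label and the semaphore bookkeeping are illustrative): work dispatched to a suspended queue is held until the queue is resumed.

```swift
import Dispatch

// A block dispatched to a suspended queue is queued but cannot start.
let queue = DispatchQueue(label: "demo.suspended", attributes: .concurrent)
queue.suspend()

var ran = false
let done = DispatchSemaphore(value: 0)
queue.async {
    ran = true
    done.signal()
}

let ranWhileSuspended = ran      // false — the queue is still suspended
queue.resume()                   // now the queued block may start
done.wait()
print(ranWhileSuspended, ran)    // false true
```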
But the above begs the question of what you want to do in that failure scenario. You just have a queue sitting there with a bunch of scheduled tasks. You could, theoretically, keep references to those dispatched blocks (by using the DispatchWorkItem pattern rather than just simple closures, and you could cancel those items), but I’d probably reach for an operation queue, e.g.
let queue = OperationQueue()
queue.isSuspended = true
queue.addOperation {
    // request One
}
queue.addOperation {
    // request Two
}
fetchToken { result in
    switch result {
    case .success(let token):
        // do something with token
        print(token)
        queue.isSuspended = false
    case .failure(let error):
        // handle the error
        print(error)
        queue.cancelAllOperations()
    }
}
This is the same as the above, but we can cancel all of those queued operations with cancelAllOperations.
By the way, you can create a custom Operation subclass that handles tasks that are, themselves, asynchronous. And I’m presuming your “request One” and “request Two” are asynchronous network requests. See looking for a specific example where Operation is preferred over GCD or vice-versa for a discussion of when one might prefer OperationQueue over DispatchQueue.
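For completeness, here is a sketch of the DispatchWorkItem alternative mentioned above (labels and item names are illustrative): by keeping references to the dispatched items, the failure path can cancel them before they start.

```swift
import Dispatch

// Suspend the queue, enqueue named work items, then simulate the
// failure path by cancelling them instead of letting them run.
let queue = DispatchQueue(label: "demo.workitems")
queue.suspend()

var executed: [String] = []
let requestOne = DispatchWorkItem { executed.append("one") }
let requestTwo = DispatchWorkItem { executed.append("two") }
queue.async(execute: requestOne)
queue.async(execute: requestTwo)

requestOne.cancel()
requestTwo.cancel()
queue.resume()

queue.sync { }      // drain: runs only after the cancelled items are skipped
print(executed)     // []
```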
How do I create a dispatch_queue with QoS attributes in Swift 3?
DispatchQueue is now a class, and you can use its init(label:qos:attributes:autoreleaseFrequency:target:) initializer. Queues are serial by default, and the quality of service is passed as the qos: parameter. (Early Swift 3 betas exposed a DispatchQueueAttributes option set with members such as .serial and .qosUtility, but that API was replaced before release.)
Putting it together:
let myQueue = DispatchQueue(label: "com.example.serial-queue", qos: .utility)
priority of Dispatch Queues in swift 3
You say:
The outcome shows that we ignore the asynchronous execution. ...
No, it just means that you didn't give the asynchronously dispatched code enough time to get started.
I know queue2 should be completed before queue1 since it's synchronous execution ...
First, queue2 might not complete before queue1. It just happens to. Make queue2 do something much slower (e.g. loop through a few thousand iterations rather than just five) and you'll see that queue1 can actually run concurrently with respect to what's on queue2. It just takes a few milliseconds to get going and the stuff on your very simple queue2 is finishing before the stuff on queue1 gets a chance to start.
Second, this behavior is not technically because it's synchronous execution. It's just that async takes a few milliseconds to get its stuff running on some worker thread, whereas the synchronous call, because of optimizations that I won't bore you with, gets started more quickly.
but why we ignore the asynchronous execution ...
We don't "ignore" it. It just takes a few milliseconds to get started.
and what is the actual difference between async, sync and so-called main queue?
"Async" merely means that the current thread may carry on and not wait for the dispatched code to run on some other thread. "Sync" means that the current thread should wait for the dispatched code to finish.
The "main thread" is a different topic and simply refers to the primary thread that is created for driving your UI. In practice, the main thread is where most of your code runs, basically running everything except that which you manually dispatch to some background queue (or code that is dispatched there for you, e.g. URLSession
completion handlers).
Using Dispatch queue in swift 3 perfect 2.0
OK, now I understand: I can't block the main queue in Swift Perfect.
Solution:
let backgroundQueue = DispatchQueue(label: "com.app.queue", qos: .background, target: nil)
let when = DispatchTime.now() + 10
backgroundQueue.asyncAfter(deadline: when) {
    // call function
}
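A runnable sketch of this asyncAfter pattern, using a short 0.2-second delay instead of 10 so it finishes quickly (the semaphore is illustrative bookkeeping so top-level code can wait for the delayed block):

```swift
import Dispatch
import Foundation

// Schedule a block on a background queue after a delay, then verify
// that at least that much wall-clock time elapsed before it ran.
let backgroundQueue = DispatchQueue(label: "com.app.queue", qos: .background)
let start = Date()
let done = DispatchSemaphore(value: 0)

backgroundQueue.asyncAfter(deadline: .now() + 0.2) {
    done.signal()                      // the delayed block has run
}
done.wait()

let elapsed = Date().timeIntervalSince(start)
print(elapsed >= 0.15)                 // true — the block ran after the delay
```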
What is the difference in approach to create DispatchQueue Swift3
The queue you create in your first example is your own custom serial queue. As the somewhat dated, yet still relevant, Concurrency Programming Guide says:
Serial queues (also known as private dispatch queues) execute one task at a time in the order in which they are added to the queue. The currently executing task runs on a distinct thread (which can vary from task to task) that is managed by the dispatch queue. Serial queues are often used to synchronize access to a specific resource.
You can create as many serial queues as you need, and each queue operates concurrently with respect to all other queues. In other words, if you create four serial queues, each queue executes only one task at a time but up to four tasks could still execute concurrently, one from each queue.
Whereas your latter examples are simply retrieving system-provided global queues, which are concurrent:
Concurrent queues (also known as a type of global dispatch queue) execute one or more tasks concurrently, but tasks are still started in the order in which they were added to the queue. The currently executing tasks run on distinct threads that are managed by the dispatch queue. The exact number of tasks executing at any given point is variable and depends on system conditions.
Now, you can nowadays create your own custom concurrent queue, but a global queue is simply a concurrent queue that was created for us.
So, what does this mean to you?
If you dispatch blocks to serial queue (your first example), only one block can run at any time. This makes it really useful for synchronizing memory access in multi-threaded apps, but can be used in any environment where you need a background queue, but you want dispatched blocks to be run serially (i.e. sequentially) with respect to other blocks on that queue.
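A minimal sketch of that synchronization use case (the label is illustrative): a serial queue guarding a counter that many concurrent tasks increment.

```swift
import Dispatch

// A serial queue serializes access to `counter`, so concurrent
// increments from many threads never step on each other.
let syncQueue = DispatchQueue(label: "demo.isolation")   // serial by default

var counter = 0
DispatchQueue.concurrentPerform(iterations: 100) { _ in
    syncQueue.sync { counter += 1 }   // only one increment runs at a time
}
print(counter)                        // 100 — no lost updates
```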
The global queues that you are using in your latter examples are concurrent queues. This means that if you dispatch four separate tasks to this global queue, those blocks may run concurrently with respect to each other. This is ideal when you not only want background execution, but also don't care whether these blocks run at the same time as other dispatched blocks.
In your latter examples, where you're accessing a global queue, recognize that because those are system-provided, you have some modest limitations on your interaction with these queues, namely:
You cannot suspend global queues;
You cannot use barriers on global queues;
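For instance, barriers require a custom concurrent queue. A minimal sketch (the label is illustrative): a barrier block waits for earlier blocks to finish, runs alone, and holds back later blocks, giving reader/writer behavior.

```swift
import Dispatch

// On a custom concurrent queue, a .barrier block runs exclusively;
// blocks enqueued after it cannot start until it completes.
let rw = DispatchQueue(label: "demo.rw", attributes: .concurrent)
var value = 0

rw.async(flags: .barrier) { value = 42 }   // exclusive write
let read = rw.sync { value }               // enqueued after the barrier
print(read)                                // 42
```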
But, that notwithstanding, if you are just looking for a simple way of dispatching blocks to run in the background (and you don't care whether those dispatched blocks run at the same time as each other), then global queues are an incredibly simple and efficient way to do that.
By the way, the difference between your second example (for which I assume you intended let backgroundQueue = DispatchQueue.global()) and the third example is merely that in your third example you assigned an explicit quality of service (qos), whereas in your second example you're using the default qos. FWIW, it's generally advisable to specify a qos so that the OS can prioritize threads contending for limited system resources accordingly.
How to create a background priority SERIAL queue WITH A NAME, in Swift?
You can use:
DispatchQueue(label: "name.of.your.queue")
or
let processingQueue = DispatchQueue(label: "your.queue",
                                    qos: .background,
                                    attributes: [],
                                    autoreleaseFrequency: .inherit,
                                    target: nil)
Note that DispatchQueue.Attributes is an OptionSet: you can pass an empty [], or combine several values in it.
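A small sketch of combining attribute values (the label is illustrative): pairing .concurrent with .initiallyInactive creates a queue that holds its work until activate() is called.

```swift
import Dispatch

// An .initiallyInactive queue queues work but runs nothing until
// activate() is called (the semaphore is just bookkeeping to wait).
let inactive = DispatchQueue(label: "demo.inactive",
                             attributes: [.concurrent, .initiallyInactive])
var started = false
let done = DispatchSemaphore(value: 0)

inactive.async { started = true; done.signal() }
let beforeActivate = started      // false — the queue is not yet active
inactive.activate()               // now queued work may start
done.wait()
print(beforeActivate, started)    // false true
```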
Server Side Swift - Dispatch Queues
Perhaps you are thinking that DispatchQueue is middleware that runs in a separate process. DispatchQueue runs in the same process as the rest of your application and so if your server process crashes, it would crash along with it. To work around this, you would want to either build your own queue that runs in a separate process (and probably uses a shared backend for queuing/messaging) or use a package like SwiftQ.