
In my application, I have to implement refresh-token logic. I would like all requests sent during the token refresh to be held in a queue, and as soon as the refresh process finishes, I start the queue.

For example, I want something like this:

let queue = DispatchQueue(label: "myQueue", attributes: .concurrent)  

queue.async {
    // request One
}

queue.async {
    // request Two
}

And when the refresh token process finished:

queue.send()
sandpat
Mickael Belhassen
    Are you perhaps looking for a dispatch group? – Paulw11 Jan 01 '20 at 12:16
  • You can get away with it by having a function such as: `public func getAccessToken(completion: @escaping (token?) -> Void) {...}` This is far simpler than queuing up and holding things. – mfaani Jan 02 '20 at 20:42

3 Answers


In my application, I have to implement refresh-token logic. I would like all requests sent during the token refresh to be held in a queue, and as soon as the refresh process finishes, I start the queue.

If you want to create a queue and delay the starting of its tasks, just suspend it, e.g.:

let queue = DispatchQueue(label: "myQueue", attributes: .concurrent)  
queue.suspend()

queue.async {
    // request One
}

queue.async {
    // request Two
}

fetchToken { result in
    switch result {
    case .success(let token):
        // do something with token
        print(token)
        queue.resume()

    case .failure(let error):
        // handle the error
        print(error)
    }
}

That’s how you suspend and resume dispatch queues. Note that suspend only prevents items from starting on a queue; it has no effect on tasks that are already running. That is why I suspended the queue before dispatching items to it.

But the above raises the question of what you want to do in the failure scenario: you just have a queue sitting there with a bunch of scheduled tasks. You could, theoretically, keep references to those dispatched blocks (by using the DispatchWorkItem pattern rather than simple closures, so that you could cancel those items), but I’d probably reach for an operation queue, e.g.

let queue = OperationQueue()
queue.isSuspended = true

queue.addOperation {
    // request One
}

queue.addOperation {
    // request Two
}

fetchToken { result in
    switch result {
    case .success(let token):
        // do something with token
        print(token)
        queue.isSuspended = false

    case .failure(let error):
        // handle the error
        print(error)
        queue.cancelAllOperations()
    }
}

This is the same as the above, but we can cancel all of those queued operations with cancelAllOperations.
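For completeness, the DispatchWorkItem approach mentioned above might be sketched like this (my own sketch; the fetchToken stub and the request bodies are placeholders for your real networking code):

```swift
import Foundation

// Hypothetical stand-in for a real token-refresh call
func fetchToken(completion: @escaping (Result<String, Error>) -> Void) {
    DispatchQueue.global().asyncAfter(deadline: .now() + 0.1) {
        completion(.success("fake-token"))
    }
}

let queue = DispatchQueue(label: "myQueue", attributes: .concurrent)
queue.suspend()

var completed: [String] = []
let resultsQueue = DispatchQueue(label: "results")   // synchronizes `completed`

// Keep references to the dispatched blocks so they can be cancelled on failure
let items = [
    DispatchWorkItem { resultsQueue.sync { completed.append("request One") } },
    DispatchWorkItem { resultsQueue.sync { completed.append("request Two") } }
]
items.forEach { queue.async(execute: $0) }

let done = DispatchSemaphore(value: 0)
fetchToken { result in
    switch result {
    case .success:
        queue.resume()                   // queued work items start now
    case .failure:
        items.forEach { $0.cancel() }    // cancelled items never start
        queue.resume()                   // a suspended queue must still be resumed
    }
    done.signal()
}

done.wait()                 // demo only: wait for the token refresh…
items.forEach { $0.wait() } // …and for both work items to finish
```

Note that the failure branch still calls resume() after cancelling: GCD crashes if the last reference to a still-suspended queue is released.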


By the way, you can create a custom Operation subclass that handles tasks that are, themselves, asynchronous. And I’m presuming your “request One” and “request Two” are asynchronous network requests. See looking for a specific example where Operation is preferred over GCD or vice-versa for a discussion of when one might prefer OperationQueue over DispatchQueue.
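To give a rough idea, the standard pattern for such an asynchronous Operation subclass looks something like this (a sketch only; the AsyncOperation and RequestOperation names and the simulated request are mine, and a production version should also guard `state` with a lock):

```swift
import Foundation

// An Operation that stays "executing" until finish() is called,
// so asynchronous work can run inside an OperationQueue.
class AsyncOperation: Operation {
    private enum State: String {
        case ready = "isReady", executing = "isExecuting", finished = "isFinished"
    }

    private var state = State.ready {
        willSet { willChangeValue(forKey: newValue.rawValue); willChangeValue(forKey: state.rawValue) }
        didSet  { didChangeValue(forKey: oldValue.rawValue);  didChangeValue(forKey: state.rawValue) }
    }

    override var isReady: Bool        { super.isReady && state == .ready }
    override var isExecuting: Bool    { state == .executing }
    override var isFinished: Bool     { state == .finished }
    override var isAsynchronous: Bool { true }

    override func start() {
        if isCancelled { state = .finished; return }
        state = .executing
        main()
    }

    override func main() {
        // Subclasses start their asynchronous work here and call finish() when done.
    }

    func finish() { state = .finished }
}

// Hypothetical subclass simulating an asynchronous network request
class RequestOperation: AsyncOperation {
    let name: String
    init(name: String) { self.name = name }

    override func main() {
        DispatchQueue.global().asyncAfter(deadline: .now() + 0.1) {
            print("\(self.name) finished")
            self.finish()
        }
    }
}

let opQueue = OperationQueue()
opQueue.isSuspended = true
opQueue.addOperation(RequestOperation(name: "request One"))
opQueue.addOperation(RequestOperation(name: "request Two"))
opQueue.isSuspended = false                  // e.g. after the token refresh succeeds
opQueue.waitUntilAllOperationsAreFinished()  // demo only
```

Because the operation does not report isFinished until finish() is called, the queue correctly treats the in-flight request as still running, which is what makes dependencies and maxConcurrentOperationCount work with asynchronous tasks.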

Rob

You can build a class like this

class ConcurrentQueue {

    typealias Task = () -> ()
    private var tasks: [Task] = []
    private let serialQueue = DispatchQueue(label: "Serial queue")

    func enqueueTask(_ task: @escaping Task) {
        serialQueue.sync {
            tasks.append(task)
        }
    }

    func runAndRemoveAllTasks() {
        serialQueue.sync {
            tasks.forEach { task in
                task()
            }
            tasks.removeAll()
        }
    }
}

This class allows you to enqueue several Task(s).

Every Task is a closure of type () -> ().

When you enqueue a Task it is not executed, it is just appended to the internal array.

For example:

let concurrentQueue = ConcurrentQueue()

concurrentQueue.enqueueTask {
    print("1")
}

concurrentQueue.enqueueTask {
    print("2")
}

concurrentQueue.enqueueTask {
    print("3")
}

This code just saves the 3 Tasks; none of them is executed yet.

Then when you call

concurrentQueue.runAndRemoveAllTasks()

All the tasks are executed and in the console you get

1
2
3

Thread-safety

The methods enqueueTask(_) and runAndRemoveAllTasks() are thread-safe.

In fact, they interact with the internal tasks array (which is not itself thread-safe) only within

serialQueue.sync {
   ...
}

This guarantees consistent access to the tasks array.

Luca Angeletti
  • Your answer is good, but I found the solution using `BlockOperation` It is made specifically for this kind of case. – Mickael Belhassen Jan 01 '20 at 12:55
  • Why sync and not async? As this is a serial queue, operation order is kept, and this won't block the main thread if these methods are called from it (GCD queues are thread-safe) – giorashc Jan 01 '20 at 13:01
  • @giorashc Good point. I used `sync` because I wanted the caller of `concurrentQueue.enqueueTask { }` to be sure the task is appended before the next line is executed. Does it make sense? – Luca Angeletti Jan 01 '20 at 13:03
  • @giorashc - You are correct: If this `serialQueue` is to synchronize interaction with this local tasks queue, then `async` is the right pattern here. The deeper problem is that `runAndRemoveAllTasks` is synchronous, and if these tasks can be slow, then you’re going to block the calling queue for a prolonged period of time. Also note that the above assumes that the two network requests will not run concurrently with respect to each other and the OP explicitly specified queue to be concurrent. – Rob Jan 02 '20 at 20:39

The solution is to use BlockOperation:

let pendingRequests = BlockOperation()

pendingRequests.addExecutionBlock {
    //Adding first request
}

pendingRequests.addExecutionBlock {
    //Adding second request
}

When the token refresh is finished:

pendingRequests.start()
Mickael Belhassen
  • I’d be very wary with `addExecutionBlock`. If you use `maxConcurrentOperationCount`, on your queue, for example, that’s at the operation level, not the execution block level (i.e. these two execution blocks will run concurrently regardless of queue's `maxConcurrentOperationCount`). Or, if you just `start` the operation without an `OperationQueue`, as shown here, note that `start` is a blocking call. Execution blocks have their uses, but this might not be the best use case. – Rob Jan 02 '20 at 20:23