2

I have to pause my code for one second so that the server databases get synced before continuing.

All code snippets below are run from the main thread.

I used this first:

// code 1
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    // dummy
}
// stuff

The outcome is obviously not as desired, since `stuff` is executed immediately. I'd like `stuff` to run after the delay, below the code block (not nice, but there's a reason for it).

// code 2
let group = DispatchGroup()
group.enter()
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    group.leave()
}
group.wait()
// stuff

Deadlock!

Question 1: Why does the main queue work without `DispatchGroup`, but deadlock in conjunction with `DispatchGroup`?

// code 3
let group = DispatchGroup()
group.enter()
DispatchQueue.global().asyncAfter(deadline: .now() + 1.0) {
    group.leave()
}
group.wait()
// stuff

This works as desired!

Question 2: Why is a global queue required with `DispatchGroup` to make `stuff` run delayed?

I read these:

  • https://stackoverflow.com/a/42773392/3657691/
  • https://stackoverflow.com/a/28821805/3657691/
  • https://stackoverflow.com/a/49287115/3657691/

geohei
  • Well, do you execute the second snippet (the one generating a deadlock) from the main thread? – Vlad Rusu Jun 30 '21 at 09:41
  • 2
    You are misusing `DispatchGroup` to force an asynchronous function to become synchronous. This is pretty bad practice. Put `// stuff` into the closure of `asyncAfter` replacing `group.leave()` – vadian Jun 30 '21 at 09:49
  • @VladRusu: It's indeed from the main thread as you correctly assumed below. I'll correct my OP. – geohei Jun 30 '21 at 12:25
  • @vadian: Yes, as I said the code is not nice, but it's more for testing at this time. The final productive code will hold the delayed stuff inside the completion. Nevertheless I didn't understand why the OP code behaved like it did - hence the question. – geohei Jun 30 '21 at 12:30
  • 1
    If the group runs also on the main thread (which it apparently does) `wait` blocks the thread. It’s highly recommended to run groups on their own thread. And once again - as the name implies - `DispatchGroup` is for managing multiple tasks. – vadian Jun 30 '21 at 12:40

1 Answer

2

I am going to assume that you are running these snippets on the main thread, as this is most probably the case from the issue description.

Dispatch queues are basically task queues, so a queue has some tasks enqueued. Let's see what is on the main queue when you are executing the snippet generating a deadlock.

  1. The main queue has a task executing (the one executing the snippet)
  2. Then you call asyncAfter which will enqueue another task (the closure containing group.leave()) on the main queue, after the specified deadline.

Now the task being executed (1.) is blocked by the call to group.wait(), which blocks the whole main queue until the group is released. This means the enqueued task (2.) has to wait until the first one finishes. You can see that the two tasks block each other:

  • the first one (1.) will wait until the second one (2.) releases the dispatch group
  • the second one (2.) will wait until the first one (1.) finishes execution so it can be scheduled
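As vadian suggests in the comments, the idiomatic way out is not to block at all, but to move the delayed work into the closure. A minimal sketch (`syncDatabases` is a hypothetical stand-in for the real work; `exit`/`dispatchMain` are only needed to run this as a standalone command-line demo):

```swift
import Dispatch
import Foundation

// Hypothetical placeholder for the actual database sync work.
func syncDatabases() {
    print("databases synced")
}

DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    syncDatabases()   // "stuff" runs here, ~1 second later
    exit(0)           // end this command-line demo once the work ran
}
dispatchMain()        // keeps the process alive; not needed inside an app
```

The main queue is never blocked this way, so no deadlock is possible.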

For question number 2: using a global queue (or literally any queue other than the one the current code is executing on, which in this example is the main queue) will not block the asyncAfter task. It is scheduled on another queue that is not blocked, so it gets the chance to be executed.
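To illustrate the "any other queue" point: a private serial queue works just as well as the global queue, because the thread being blocked and the queue running the asyncAfter block are different. A sketch (the label "demo.timer" is arbitrary):

```swift
import Dispatch

let group = DispatchGroup()
group.enter()

// Any queue other than the one we are about to block will do; here a
// private serial queue takes the place of DispatchQueue.global().
DispatchQueue(label: "demo.timer").asyncAfter(deadline: .now() + 1.0) {
    group.leave()
}

group.wait()       // blocks the current thread, not "demo.timer"
print("stuff")     // runs ~1 second later
```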

This is true for serial dispatch queues (the main queue is a serial queue as well). Serial dispatch queues execute their tasks serially, that is, only one at a time.

On the other hand, for a concurrent dispatch queue, this scenario won't result in a deadlock, because the asyncAfter task won't be blocked by the waiting task. That's because concurrent dispatch queues don't wait for the task being executed to be finished to schedule the next enqueued task.

This would even be a good exercise: run this scenario on a serial queue, and then on a concurrent queue, and observe the differences.
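A sketch of that exercise, using a hypothetical helper `waitSucceeds(on:)` that reproduces the question's enter/asyncAfter/wait pattern entirely on a given queue and reports whether the wait was released in time:

```swift
import Dispatch

// Hypothetical helper: runs the question's pattern on `queue` and
// returns true if group.wait() was released before its timeout.
func waitSucceeds(on queue: DispatchQueue) -> Bool {
    let group = DispatchGroup()
    let done = DispatchSemaphore(value: 0)
    var succeeded = false
    group.enter()
    queue.async {
        // Schedule the leave on the *same* queue, as in the question.
        queue.asyncAfter(deadline: .now() + 0.2) {
            group.leave()
        }
        // Block this queue's current task, like group.wait() on main.
        succeeded = group.wait(timeout: .now() + 1.0) == .success
        done.signal()
    }
    done.wait()
    return succeeded
}

let serial = DispatchQueue(label: "demo.serial")
let concurrent = DispatchQueue(label: "demo.concurrent", attributes: .concurrent)

print(waitSucceeds(on: serial))      // false: the leave cannot run while the queue is blocked
print(waitSucceeds(on: concurrent))  // true: the leave runs on another thread
```

On the serial queue the wait times out, because the block containing group.leave() can never be scheduled while the only lane of the queue is blocked; on the concurrent queue it is scheduled on another thread and releases the group.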

Vlad Rusu