I've searched the Swift book, but can't find the Swift version of @synchronized. How do I do mutual exclusion in Swift?
-
I would use a dispatch barrier. Barriers provide very cheap synchronization: dispatch_barrier_async(), etc. – Frederick C. Lee Aug 07 '15 at 18:10
-
@FrederickC.Lee, what if you need a **write** to be synchronized though, such as when creating a wrapper for `removeFirst()`? – ScottyBlades Jun 26 '18 at 15:49
24 Answers
You can use GCD. It is a little more verbose than @synchronized, but works as a replacement:
let serialQueue = DispatchQueue(label: "com.test.mySerialQueue")
serialQueue.sync {
// code
}

-
This is great, but it lacks the re-entrancy that you have with @synchronized. – Michael Waterfall Oct 23 '14 at 16:46
-
I have tried this method, but it is not locked properly. I don't know why. objc_sync_enter/objc_sync_exit seems to work properly. – derjohng Nov 20 '14 at 09:19
-
With this approach you need to be careful. Your block might be executed on some other thread. API docs say: "As an optimization, this function invokes the block on the current thread when possible." – bio Sep 24 '15 at 17:57
-
Great article from Matt Gallagher about this: http://www.cocoawithlove.com/blog/2016/06/02/threads-and-mutexes.html – wuf810 Jul 29 '16 at 11:50
-
No, no and no. Nice try, but it works imperfectly. Why? Essential reading (a comprehensive comparison of alternatives, with cautions) and a great utility framework from Matt Gallagher, here: https://www.cocoawithlove.com/blog/2016/06/02/threads-and-mutexes.html @wuf810 mentioned this first (HT), but understated how good this article is. All should read. (Please upvote this by the minimum to make it initially visible, but no more.) – t0rst Nov 24 '16 at 10:13
-
It's strange to me that someone considers this an incorrect answer. It looks like the example from Matt Galloway's book "Effective Objective-C": _syncQueue = dispatch_queue_create("com.effectiveobjectivec.syncQueue", NULL); - (NSString*)someString { __block NSString *localSomeString; dispatch_sync(_syncQueue, ^{ localSomeString = _someString; }); return localSomeString; } – akozin Feb 14 '18 at 13:11
-
Can someone clarify why this answer could cause deadlocks? Matt Gallagher's article makes it clear why this will be slower than `@synchronized` was, but why would it cause deadlocks? @TomKraina @bio @t0rst – Bill Nov 30 '19 at 12:36
-
AFAICT, using GCD is not the same as `@synchronized` when using `Thread` directly. – WestCoastProjects May 20 '20 at 21:36
-
Smells fishy, hacky and incorrect to me, as we are looking for "primitive thread locking" behaviour, not single-thread execution behaviour. – Cheok Yan Cheng Mar 25 '21 at 05:56
-
[Dispatch Queues](https://developer.apple.com/library/archive/documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html) From the official doc, there is another side of the story: "This type of queue-based synchronization is more efficient than locks because locks always require an expensive kernel trap in both the contested and uncontested cases, whereas a dispatch queue works primarily in your application's process space and only calls down to the kernel when absolutely necessary." – Alex Bin Zhao Jun 16 '21 at 07:13
I was looking for this myself and came to the conclusion there's no native construct inside of Swift for this yet.
I did make up this small helper function based on some of the code I've seen from Matt Bridges and others.
func synced(_ lock: Any, closure: () -> ()) {
objc_sync_enter(lock)
closure()
objc_sync_exit(lock)
}
Usage is pretty straightforward:
synced(self) {
print("This is a synchronized closure")
}
There is one problem I've found with this. Passing in an array as the lock argument seems to cause a very obtuse compiler error at this point. Otherwise though it seems to work as desired.
Bitcast requires both operands to be pointer or neither
%26 = bitcast i64 %25 to %objc_object*, !dbg !378
LLVM ERROR: Broken function found, compilation aborted!


-
This is pretty useful and preserves the syntax of the `@synchronized` block nicely, but note that it is not identical to a real builtin block statement like the `@synchronized` block in Objective-C, because `return` and `break` statements no longer work to jump out of the surrounding function/loop like they would if this were an ordinary statement. – newacct Oct 11 '14 at 05:11
-
The error is likely due to the fact that arrays are passed as values, not references. – James Alvarez Feb 16 '15 at 09:52
-
This would probably be a great place to use the new `defer` keyword to ensure `objc_sync_exit` gets called even if `closure` throws. – devios1 Apr 18 '16 at 19:26
-
Awesome! I want to sync static methods in my class. Will this work if I use my class as the lock object? `synced(MyClass, () -> { /* ... */ })`? – Steven Wexler May 04 '16 at 16:54
-
What will happen if we pass a struct type like Dictionary to it? Since it's taking AnyObject as a param type, will the compiler convert it to NSDictionary, and will there be any side-effects to it? – RandomGuy Jul 07 '16 at 01:12
-
@t0rst Calling this answer "flawed" based on the linked-to article is not valid. The article says this method is "a little slower than ideal" and "is limited to Apple platforms". That doesn't make it "flawed" by a long shot. – RenniePet Dec 12 '16 at 16:57
-
@RenniePet it has flaws. Based on what you need, the flaws may be of no relevance or be significant. The article from Matt Gallagher is a helpful start to making a good choice for each case. – t0rst Dec 12 '16 at 17:04
-
If you use this on a class method in Swift 3.0 and Alamofire then your network calls might stop working after some requests. Look at the following issue: https://github.com/Alamofire/Alamofire/issues/1075 – Ankit Srivastava Mar 18 '17 at 06:23
-
This very interesting article explains a pitfall with `objc_sync_xxx`: https://straypixels.net/swift-dictionary-locking/ – Mike Taverne Apr 30 '19 at 18:17
I like and use many of the answers here, so I'd choose whichever works best for you. That said, the method I prefer when I need something like Objective-C's @synchronized uses the defer statement introduced in Swift 2.
{
objc_sync_enter(lock)
defer { objc_sync_exit(lock) }
//
// code of critical section goes here
//
} // <-- lock released when this block is exited
The nice thing about this method is that your critical section can exit the containing block in any fashion desired (e.g., return, break, continue, throw), and "the statements within the defer statement are executed no matter how program control is transferred."1

-
I think this is probably the most elegant solution provided here. Thanks for your feedback. – Scott D Jun 02 '16 at 14:24
-
Excellent! I had written some lock helper methods when Swift 1 was introduced and hadn't revisited these in a while. Completely forgot about defer; this is the way to go! – Randy Sep 17 '16 at 14:33
-
I like this but get a compiler error "Braced block of statements is an unused closure" in Xcode 8. Ah, I get it, they are just the function braces - took a while to find your "1" reference link - thanks! – Duncan Groenewald Nov 01 '16 at 09:40
-
This is the only solution that worked for me when I was trying to make my dictionary manipulation thread safe. Thank you!! – Ajji Jul 29 '20 at 14:10
-
https://straypixels.net/swift-dictionary-locking/ makes objc_sync_enter() in Swift essentially useless. It's just too tricky. The article describes how a Swift optimization blew up objc_sync_enter by causing deadlock. Also hard to debug. – Tom Andersen Jul 29 '20 at 18:49
-
Great answer. It would help to say that one can use do { ... } to define a block of code, so `do { objc_sync_enter(lock); defer { objc_sync_exit(lock); }; ...code... }` achieves the same as `@synchronized { ...code... }` – Gibezynu Nu Dec 31 '20 at 14:51
You can sandwich statements between objc_sync_enter(obj: AnyObject?) and objc_sync_exit(obj: AnyObject?). The @synchronized keyword uses those functions under the covers, i.e.
objc_sync_enter(self)
... synchronized code ...
objc_sync_exit(self)

-
No, `objc_sync_enter` and `objc_sync_exit` are functions defined in objc-sync.h and are open source: http://opensource.apple.com/source/objc4/objc4-371.2/runtime/objc-sync.h – bontoJR Aug 05 '15 at 11:48
-
What happens if multiple threads try to access the same resource, does the second one wait, retry, or crash? – TruMan1 Apr 26 '16 at 20:39
-
Adding onto what @bontoJR said, `objc_sync_enter(…)` & `objc_sync_exit(…)` are public headers provided by the iOS/macOS/etc. APIs _(looks like they're inside the `….sdk` at the path `usr/include/objc/objc-sync.h`)_. The easiest way to find out if something is a public API or not is to _(in Xcode)_ type the function name _(e.g. `objc_sync_enter()`; arguments don't need to be specified for C functions)_, then try to command-click it. If it shows you the header file for that API, then you're good _(since you wouldn't be able to see the header if it weren't public)_. – Slipp D. Thompson Jul 26 '18 at 09:19
An analog of the @synchronized directive from Objective-C can have an arbitrary return type and nice rethrows behaviour in Swift.
// Swift 3
func synchronized<T>(_ lock: AnyObject, _ body: () throws -> T) rethrows -> T {
objc_sync_enter(lock)
defer { objc_sync_exit(lock) }
return try body()
}
The use of the defer statement lets you directly return a value without introducing a temporary variable.
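For example (a hypothetical usage; the cache and key names are assumptions, not part of the original answer), the closure's result becomes the caller's value directly:
// Hypothetical usage: no temporary variable needed for the result.
let cachedImage = synchronized(self) { cache[key] }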
In Swift 2, add the @noescape attribute to the closure to allow more optimisations:
// Swift 2
func synchronized<T>(lock: AnyObject, @noescape _ body: () throws -> T) rethrows -> T {
objc_sync_enter(lock)
defer { objc_sync_exit(lock) }
return try body()
}
Based on the answers from GNewc [1] (where I like the arbitrary return type) and Tod Cunningham [2] (where I like defer).
-
Xcode is telling me that @noescape is now default and is deprecated in Swift 3. – RenniePet Dec 12 '16 at 20:11
-
That's right, the code in this answer is for Swift 2 and requires some adaptation for Swift 3. I'll update it when I have time to. – werediver Dec 12 '16 at 20:56
-
Can you explain the usage? Maybe with an example.. thanks in advance! In my case, I have a Set that I need to synchronize, because I manipulate its content in a DispatchQueue. – sancho Jun 30 '17 at 21:21
-
@sancho I'd prefer to keep this post concise. You seem to ask about general concurrent programming guidelines, that's a broad question. Try to ask it as a separate question! – werediver Jul 01 '17 at 10:16
SWIFT 4
In Swift 4 you can use GCD's dispatch queues to lock resources.
class MyObject {
private var internalState: Int = 0
private let internalQueue: DispatchQueue = DispatchQueue(label:"LockingQueue") // Serial by default
var state: Int {
get {
return internalQueue.sync { internalState }
}
set (newState) {
internalQueue.sync { internalState = newState }
}
}
}

-
This doesn't seem to work with Xcode 8.1. `.serial` seems to be unavailable. But `.concurrent` is available. :/ – Travis Griggs Oct 26 '16 at 16:19
-
Note that this pattern does not guard properly against most common multi-thread issues. For example, if you'd run `myObject.state = myObject.state + 1` concurrently, it would not count the total operations but instead yield a nondeterministic value. To solve that problem, the calling code should be wrapped in a serial queue so that both the read and the write happen atomically. Of course Obj-C's `@synchronized` has the same problem, so in that sense your implementation is correct. – Berik Jul 26 '18 at 16:12
-
Yes, `myObject.state += 1` is a combination of a read and then a write operation. Some other thread can still come in between to set/write a value. As per https://www.objc.io/blog/2018/12/18/atomic-variables/, it would be easier to run the `set` in a sync block/closure instead and not under the variable itself. – CyberMew Sep 18 '19 at 10:17
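To illustrate the point raised in the two comments above, a hedged sketch (the mutateState method is an illustrative addition, not part of the original answer, and assumes the extension lives in the same file as MyObject): performing the whole read-modify-write inside a single sync block prevents interleaving.
extension MyObject {
    // Illustrative addition: the read and the write happen in one sync block,
    // so concurrent increments cannot interleave with each other.
    func mutateState(_ transform: (Int) -> Int) {
        internalQueue.sync {
            internalState = transform(internalState)
        }
    }
}

// Usage, instead of the racy `myObject.state += 1`:
// myObject.mutateState { $0 + 1 }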
In modern Swift 5, with return capability:
/**
Makes sure no other thread reenters the closure before the one currently running has returned
*/
@discardableResult
public func synchronized<T>(_ lock: AnyObject, closure:() -> T) -> T {
objc_sync_enter(lock)
defer { objc_sync_exit(lock) }
return closure()
}
Use it like this, to take advantage of the return value capability:
let returnedValue = synchronized(self) {
// Your code here
return yourCode()
}
Or like this, when you don't need the returned value:
synchronized(self) {
// Your code here
yourCode()
}

-
This is the correct answer and not the accepted and highly upvoted one (which depends on `GCD`). It seems essentially _no one_ uses or understands how to use `Thread`. I am very happy with it - whereas `GCD` is fraught with gotchas and limitations. – WestCoastProjects May 20 '20 at 21:37
-
The correct answer needs to use a recursive lock, as does `objc_sync_enter`. I prefer to hide the `lock` parameter in a private let or iVar instead of using `self`, unless it needs to be published to allow others to sync too. That is a very rare case, but if that happens the use of `objc_sync_enter` permits cooperation between Swift and Objective-C. This answer also permits returning a value. For these reasons I have chosen this answer for use in my projects. – Rik Renich Oct 15 '20 at 13:20
To add return functionality, you could do this:
func synchronize<T>(lockObj: AnyObject!, closure: ()->T) -> T
{
objc_sync_enter(lockObj)
let retVal: T = closure()
objc_sync_exit(lockObj)
return retVal
}
Subsequently, you can call it using:
func importantMethod(...) -> Bool {
return synchronize(self) {
if(feelLikeReturningTrue) { return true }
// do other things
if(feelLikeReturningTrueNow) { return true }
// more things
return whatIFeelLike ? true : false
}
}

Using Bryan McLemore's answer, I extended it to support objects that throw in a safe manner with the Swift 2.0 defer ability.
func synchronized( lock:AnyObject, block:() throws -> Void ) rethrows
{
objc_sync_enter(lock)
defer {
objc_sync_exit(lock)
}
try block()
}

-
It would be better to use `rethrows` to simplify usage with non-throwing closures (no need to use `try`), as shown in [my answer](http://stackoverflow.com/a/34173952/3541063). – werediver Dec 22 '15 at 10:06
In the "Understanding Crashes and Crash Logs" session 414 of the 2018 WWDC they show the following way using DispatchQueues with sync.
In swift 4 should be something like the following:
class ImageCache {
private let queue = DispatchQueue(label: "sync queue")
private var storage: [String: UIImage] = [:]
public subscript(key: String) -> UIImage? {
get {
return queue.sync {
return storage[key]
}
}
set {
queue.sync {
storage[key] = newValue
}
}
}
}
Anyway, you can also make reads faster using concurrent queues with barriers. Sync and async reads are performed concurrently, and writing a new value waits for previous operations to finish.
class ImageCache {
private let queue = DispatchQueue(label: "with barriers", attributes: .concurrent)
private var storage: [String: UIImage] = [:]
func get(_ key: String) -> UIImage? {
return queue.sync { [weak self] in
guard let self = self else { return nil }
return self.storage[key]
}
}
func set(_ image: UIImage, for key: String) {
queue.async(flags: .barrier) { [weak self] in
guard let self = self else { return }
self.storage[key] = image
}
}
}

-
You probably don't need to block reads and slow down the queue using sync. You can just use sync for serial writing. – Basheer_CAD Sep 11 '18 at 20:16
Try: NSRecursiveLock
A lock that may be acquired multiple times by the same thread without causing a deadlock.
let lock = NSRecursiveLock()
func f() {
lock.lock()
//Your Code
lock.unlock()
}
func f2() {
lock.lock()
defer {
lock.unlock()
}
//Your Code
}
The Objective-C synchronization feature supports recursive and reentrant code. A thread can use a single semaphore several times in a recursive manner; other threads are blocked from using it until the thread releases all the locks obtained with it; that is, every @synchronized() block is exited normally or through an exception. Source
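For example (an illustrative sketch, not from the original answer), recursion through the locked function would deadlock with a plain NSLock, but NSRecursiveLock lets the same thread re-acquire it:
import Foundation

let recursiveLock = NSRecursiveLock()

func countdown(_ n: Int) {
    recursiveLock.lock()
    defer { recursiveLock.unlock() }
    guard n > 0 else { return }
    print(n)
    countdown(n - 1)   // the same thread re-acquires the lock; no deadlock
}

countdown(3)           // prints 3, 2, 1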

Swift 3
This code has re-entrancy and can work with asynchronous function calls. In this code, after someAsyncFunc() is called, another function closure on the serial queue will be processed but will be blocked by semaphore.wait() until signal() is called. internalQueue.sync shouldn't be used, as it will block the main thread if I'm not mistaken.
let internalQueue = DispatchQueue(label: "serialQueue")
let semaphore = DispatchSemaphore(value: 1)
internalQueue.async {
self.semaphore.wait()
// Critical section
someAsyncFunc() {
// Do some work here
self.semaphore.signal()
}
}
objc_sync_enter/objc_sync_exit isn't a good idea without error handling.

-
What error handling? The compiler won't allow anything that throws. On the other hand, by not using objc_sync_enter/exit, you give up on some substantial performance gains. – gnasher729 Aug 25 '17 at 20:02
Use NSLock in Swift 4:
let lock = NSLock()

lock.lock()
if isRunning == true {
    print("Service IS running ==> please wait")
    lock.unlock()   // release the lock before the early return
    return
} else {
    print("Service not running")
}
isRunning = true
lock.unlock()
Warning The NSLock class uses POSIX threads to implement its locking behavior. When sending an unlock message to an NSLock object, you must be sure that message is sent from the same thread that sent the initial lock message. Unlocking a lock from a different thread can result in undefined behavior.

-
1[Eliminating Lock-Based Code](https://developer.apple.com/library/content/documentation/General/Conceptual/ConcurrencyProgrammingGuide/ThreadMigration/ThreadMigration.html#//apple_ref/doc/uid/TP40008091-CH105-SW3) – Gobe Mar 18 '18 at 16:20
With Swift's property wrappers, this is what I'm using now:
@propertyWrapper public struct NCCSerialized<Wrapped> {
private let queue = DispatchQueue(label: "com.nuclearcyborg.NCCSerialized_\(UUID().uuidString)")
private var _wrappedValue: Wrapped
public var wrappedValue: Wrapped {
get { queue.sync { _wrappedValue } }
set { queue.sync { _wrappedValue = newValue } }
}
public init(wrappedValue: Wrapped) {
self._wrappedValue = wrappedValue
}
}
Then you can just do:
@NCCSerialized var foo: Int = 10
or
@NCCSerialized var myData: [SomeStruct] = []
Then access the variable as you normally would.

-
I like this solution, but was curious about the cost of folks @Decorating since doing so has the side effect of creating a `DispatchQueue` which is hidden from the user. I found this SO reference to put my mind at ease: https://stackoverflow.com/a/35022486/1060314 – AJ Venturella Apr 30 '20 at 20:59
-
The property wrapper itself is quite light -- just a struct, so, one of the lightest things you can make. Thanks for the link on DispatchQueue though. I've had in the back of my mind to do some performance testing on the queue.sync wrap versus other solutions (and versus no queue), but hadn't done so. – drewster May 03 '20 at 05:13
With the advent of Swift concurrency, we would use actors.
You can use tasks to break up your program into isolated, concurrent pieces. Tasks are isolated from each other, which is what makes it safe for them to run at the same time, but sometimes you need to share some information between tasks. Actors let you safely share information between concurrent code.
Like classes, actors are reference types, so the comparison of value types and reference types in Classes Are Reference Types applies to actors as well as classes. Unlike classes, actors allow only one task to access their mutable state at a time, which makes it safe for code in multiple tasks to interact with the same instance of an actor. For example, here’s an actor that records temperatures:
actor TemperatureLogger {
    let label: String
    var measurements: [Int]
    private(set) var max: Int

    init(label: String, measurement: Int) {
        self.label = label
        self.measurements = [measurement]
        self.max = measurement
    }
}
You introduce an actor with the actor keyword, followed by its definition in a pair of braces. The TemperatureLogger actor has properties that other code outside the actor can access, and restricts the max property so only code inside the actor can update the maximum value.
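A brief usage sketch (the update(with:) method is an illustrative addition in the spirit of the book's example, not something defined above): code outside the actor reads or mutates its state asynchronously with await, and the actor lets only one task in at a time.
extension TemperatureLogger {
    // Illustrative addition: mutation runs inside the actor, so only one
    // task at a time can touch `measurements` and `max`.
    func update(with measurement: Int) {
        measurements.append(measurement)
        if measurement > max {
            max = measurement
        }
    }
}

// From outside the actor, access is asynchronous:
// let logger = TemperatureLogger(label: "Outdoors", measurement: 25)
// await logger.update(with: 27)
// print(await logger.max)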
For more information, see WWDC video Protect mutable state with Swift actors.
For the sake of completeness, the historical alternatives include:
GCD serial queue: This is a simple pre-concurrency approach to ensure that only one thread at a time will interact with the shared resource.
Reader-writer pattern with concurrent GCD queue: In reader-writer patterns, one uses a concurrent dispatch queue to perform synchronous, but concurrent, reads (but concurrent with other reads only, not writes) but perform writes asynchronously with a barrier (forcing writes to not be performed concurrently with anything else on that queue). This can offer a performance improvement over a simple GCD serial solution, but in practice, the advantage is modest and comes at the cost of additional complexity (e.g., you have to be careful about thread-explosion scenarios). IMHO, I tend to avoid this pattern, either sticking with the simplicity of the serial queue pattern, or, when the performance difference is critical, using a completely different pattern.
Locks: In my Swift tests, lock-based synchronization tends to be substantially faster than either of the GCD approaches. Locks come in a few flavors:
- NSLock is a nice, relatively efficient lock mechanism (see the sketch below this list).
- In those cases where performance is of paramount concern, I use “unfair locks”, but you must be careful when using them from Swift (see https://stackoverflow.com/a/66525671/1271826).
- For the sake of completeness, there is also the recursive lock. IMHO, I would favor simple NSLock over NSRecursiveLock. Recursive locks are subject to abuse and often indicate code smell.
- You might see references to “spin locks”. Many years ago, they used to be employed where performance was of paramount concern, but they are now deprecated in favor of unfair locks.
Technically, one can use semaphores for synchronization, but it tends to be the slowest of all the alternatives.
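As a concrete sketch of the NSLock-based approach from the list above (the Counter type and the synchronized helper are illustrative assumptions, not a definitive implementation):
import Foundation

final class Counter {
    private let lock = NSLock()
    private var count = 0

    // Illustrative helper: acquire the lock, run the critical section,
    // and release the lock on every exit path.
    private func synchronized<T>(_ body: () throws -> T) rethrows -> T {
        lock.lock()
        defer { lock.unlock() }
        return try body()
    }

    func increment() {
        synchronized { count += 1 }
    }

    var value: Int {
        synchronized { count }
    }
}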
I outline a few of my benchmark results here.
In short, nowadays I use actors for contemporary codebases, GCD serial queues for simple scenarios in non-async-await code, and locks in those rare cases where performance is essential.
And, needless to say, we often try to reduce the number of synchronizations altogether. If we can, we often use value types, where each thread gets its own copy. And where synchronization cannot be avoided, we try to minimize the number of those synchronizations where possible.

Figure I'll post my Swift 5 implementation, built off of the prior answers. Thanks guys! I found it helpful to have one that returns a value too, so I have two methods.
Here is a simple class to make first:
import Foundation
class Sync {
public class func synced(_ lock: Any, closure: () -> ()) {
objc_sync_enter(lock)
defer { objc_sync_exit(lock) }
closure()
}
public class func syncedReturn(_ lock: Any, closure: () -> (Any?)) -> Any? {
objc_sync_enter(lock)
defer { objc_sync_exit(lock) }
return closure()
}
}
Then use it like so if needing a return value:
return Sync.syncedReturn(self, closure: {
// some code here
return "hello world"
})
Or:
Sync.synced(self, closure: {
// do some work synchronously
})

-
Try `public class func synced<T>(_ lock: Any, closure: () -> T)`, works for both, void and any other type. There is also the regrows stuff. – hnh Apr 02 '20 at 16:03
-
@hnh what do you mean by the regrows stuff? Also, if you'd be willing to share an example call to the generic method with type <T>, that would help me update the answer - I like where you're going with that. – TheJeff May 11 '20 at 16:45
You can create a property wrapper, Synchronised. Here is an example with NSLock under the hood. You can use whatever you want for synchronisation: GCD, POSIX locks, etc.
@propertyWrapper public struct Synchronised<T> {
private let lock = NSLock()
private var _wrappedValue: T
public var wrappedValue: T {
get {
lock.lock()
defer {
lock.unlock()
}
return _wrappedValue
}
set {
lock.lock()
defer {
lock.unlock()
}
_wrappedValue = newValue
}
}
public init(wrappedValue: T) {
self._wrappedValue = wrappedValue
}
}
@Synchronised var example: String = "testing"
Based on @drewster's answer.

In conclusion, here is a more general way that supports a return value (or void) and throws:
import Foundation
extension NSObject {
func synchronized<T>(lockObj: AnyObject!, closure: () throws -> T) rethrows -> T
{
objc_sync_enter(lockObj)
defer {
objc_sync_exit(lockObj)
}
return try closure()
}
}


-
Why is `defer { sync_exit }` after `sync_enter` and not before? In a Development session I heard that defer should be placed before all code inside the function :) – iTux Jun 07 '21 at 08:32
-
Because it is reasonable that objc_sync_exit must happen after objc_sync_enter. – Victor Choy Jun 08 '21 at 06:35
-
But it still exits after enter even if you put the defer before - it exits on exit from the scope, am I right? :) – iTux Jun 13 '21 at 21:27
Details
Xcode 8.3.1, Swift 3.1
Task
Read and write a value from different threads (async).
Code
class AsyncObject<T>:CustomStringConvertible {
private var _value: T
public private(set) var dispatchQueueName: String
let dispatchQueue: DispatchQueue
init (value: T, dispatchQueueName: String) {
_value = value
self.dispatchQueueName = dispatchQueueName
dispatchQueue = DispatchQueue(label: dispatchQueueName)
}
func setValue(with closure: @escaping (_ currentValue: T)->(T) ) {
dispatchQueue.sync { [weak self] in
if let _self = self {
_self._value = closure(_self._value)
}
}
}
func getValue(with closure: @escaping (_ currentValue: T)->() ) {
dispatchQueue.sync { [weak self] in
if let _self = self {
closure(_self._value)
}
}
}
var value: T {
get {
return dispatchQueue.sync { _value }
}
set (newValue) {
dispatchQueue.sync { _value = newValue }
}
}
var description: String {
return "\(_value)"
}
}
Usage
print("Single read/write action")
// Use it when when you need to make single action
let obj = AsyncObject<Int>(value: 0, dispatchQueueName: "Dispatch0")
obj.value = 100
let x = obj.value
print(x)
print("Write action in block")
// Use it when when you need to make many action
obj.setValue{ (current) -> (Int) in
let newValue = current*2
print("previous: \(current), new: \(newValue)")
return newValue
}
Full Sample
extension DispatchGroup
extension DispatchGroup {
class func loop(repeatNumber: Int, action: @escaping (_ index: Int)->(), completion: @escaping ()->()) {
let group = DispatchGroup()
for index in 0...repeatNumber {
group.enter()
DispatchQueue.global(qos: .utility).async {
action(index)
group.leave()
}
}
group.notify(queue: DispatchQueue.global(qos: .userInitiated)) {
completion()
}
}
}
class ViewController
import UIKit
class ViewController: UIViewController {
override func viewDidLoad() {
super.viewDidLoad()
//sample1()
sample2()
}
func sample1() {
print("=================================================\nsample with variable")
let obj = AsyncObject<Int>(value: 0, dispatchQueueName: "Dispatch1")
DispatchGroup.loop(repeatNumber: 5, action: { index in
obj.value = index
}) {
print("\(obj.value)")
}
}
func sample2() {
print("\n=================================================\nsample with array")
let arr = AsyncObject<[Int]>(value: [], dispatchQueueName: "Dispatch2")
DispatchGroup.loop(repeatNumber: 15, action: { index in
arr.setValue{ (current) -> ([Int]) in
var array = current
array.append(index*index)
print("index: \(index), value \(array[array.count-1])")
return array
}
}) {
print("\(arr.value)")
}
}
}


What about
final class SpinLock {
private let lock = NSRecursiveLock()
func sync<T>(action: () -> T) -> T {
lock.lock()
defer { lock.unlock() }
return action()
}
}

Why make it difficult and hassle with locks? Use Dispatch Barriers.
A dispatch barrier creates a synchronization point within a concurrent queue.
While it’s running, no other block on the queue is allowed to run, even if it’s concurrent and other cores are available.
If that sounds like an exclusive (write) lock, it is. Non-barrier blocks can be thought of as shared (read) locks.
As long as all access to the resource is performed through the queue, barriers provide very cheap synchronization.
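A minimal sketch of that pattern in Swift (the type name and queue label are illustrative, not from the original answer): reads use sync on a concurrent queue, while writes use a barrier so they run exclusively.
import Foundation

final class SynchronizedDictionary<Key: Hashable, Value> {
    private let queue = DispatchQueue(label: "sync.dictionary", attributes: .concurrent)
    private var storage: [Key: Value] = [:]

    subscript(key: Key) -> Value? {
        get {
            // Plain (non-barrier) reads may run concurrently with each other.
            queue.sync { storage[key] }
        }
        set {
            // The barrier waits for in-flight reads and runs the write exclusively.
            queue.async(flags: .barrier) { self.storage[key] = newValue }
        }
    }
}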

-
I mean, you're assuming the use of a GCD queue to synchronize access, but that's not mentioned in the original question. And a barrier is only necessary with a concurrent queue - you can simply use a serial queue to queue up mutually excluded blocks to emulate a lock. – Bill Aug 07 '15 at 19:10
-
My question, why emulate a lock? From what I read, locks are discouraged due to the overhead vs a barrier within a queue. – Frederick C. Lee Aug 07 '15 at 19:18
dispatch_barrier_async is the better way, as it does not block the current thread.
dispatch_barrier_async(accessQueue, { dictionary[object.ID] = object })
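On Swift 3 and later, the same idea would be expressed with the DispatchQueue API; a sketch, assuming accessQueue is a concurrent queue and dictionary/object are the same values as above:
accessQueue.async(flags: .barrier) {
    dictionary[object.ID] = object
}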

Based on ɲeuroburɳ's answer, here is a test of the sub-class case:
class Foo: NSObject {
func test() {
print("1")
objc_sync_enter(self)
defer {
objc_sync_exit(self)
print("3")
}
print("2")
}
}
class Foo2: Foo {
override func test() {
super.test()
print("11")
objc_sync_enter(self)
defer {
print("33")
objc_sync_exit(self)
}
print("22")
}
}
let test = Foo2()
test.test()
Output:
1
2
3
11
22
33
Another method is to create a superclass and then inherit from it. This way you can use GCD more directly:
class Lockable {
let lockableQ:dispatch_queue_t
init() {
lockableQ = dispatch_queue_create("com.blah.blah.\(self.dynamicType)", DISPATCH_QUEUE_SERIAL)
}
func lock(closure: () -> ()) {
dispatch_sync(lockableQ, closure)
}
}
class Foo: Lockable {
func boo() {
lock {
// ... do something
}
}
}

-
-1 Inheritance gives you subtype polymorphism in return for increasing coupling. Avoid the latter if you don't need the former. Don't be lazy. Prefer composition for code reuse. – Jano Nov 23 '15 at 22:20