
I'm looking for a way to prevent an array from having duplicate references to the same object in it. I don't mean duplicate values - that would be fine.

For example, I know in Apple's SpriteKit framework, an SKNode object stores its children in an array (var children: [SKNode]), and adding a node to a parent twice causes the program to crash.

let parent = SKNode()
let child1 = SKNode()
let child2 = SKNode()
parent.addChild(child1)
parent.addChild(child2)    // Allowed, even though child2 is identical in value to child1.
parent.addChild(child1)    // Causes a crash.

This is exactly the kind of behavior I want to emulate. How would I manage this? Is it possible without incurring O(n) complexity from comparing each reference?

Jared

1 Answer


I'm not sure why exactly you would want to do this, but here is how you can go about doing it:

...

var childReferences = Set<ObjectIdentifier>()

private func validateUniqueness(_ node: SKNode) {
    // `insert` returns `(inserted: Bool, memberAfterInsert: Element)`;
    // `inserted` is false if the identifier was already in the set.
    guard childReferences.insert(ObjectIdentifier(node)).inserted else {
        fatalError("An attempt was made to add a duplicate child")
    }
}

override func addChild(_ node: SKNode) {
    validateUniqueness(node)
    super.addChild(node)
}

override func insertChild(_ node: SKNode, at index: Int) {
    validateUniqueness(node)
    super.insertChild(node, at: index)
}

override func removeChildren(in nodes: [SKNode]) {
    childReferences.subtract(nodes.map(ObjectIdentifier.init))
    super.removeChildren(in: nodes)
}

override func removeAllChildren() {
    childReferences.removeAll(keepingCapacity: false)
    super.removeAllChildren()
}
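The same technique works outside SpriteKit with any class type. Here is a minimal, framework-free sketch; the `UniqueChildren` container and its `add` method are made up for illustration, and it returns `false` on a duplicate instead of trapping:

```swift
final class Node {}

// Hypothetical container that rejects duplicate *references* while
// allowing distinct-but-equal objects, using O(1) Set lookups on
// each object's ObjectIdentifier.
final class UniqueChildren {
    private(set) var children: [Node] = []
    private var childReferences = Set<ObjectIdentifier>()

    /// Returns false (rather than crashing) when `node` is already a child.
    @discardableResult
    func add(_ node: Node) -> Bool {
        guard childReferences.insert(ObjectIdentifier(node)).inserted else {
            return false
        }
        children.append(node)
        return true
    }
}

let container = UniqueChildren()
let child1 = Node()
let child2 = Node()
print(container.add(child1))  // true
print(container.add(child2))  // true — a distinct object
print(container.add(child1))  // false — same reference rejected
```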
Alexander
  • I think OP asked if it is possible to do it without O(n) complexity. But you're just using a `Set`, which just checks for duplicates before adding, so the worst-case complexity is still O(n). – ebby94 Jun 13 '17 at 04:12
  • @ebby94 Lookups in `Set`s are `O(1)`. – Alexander Jun 13 '17 at 04:19
  • Didn't know that. But how is a `Set` lookup O(1)? Accessing with a subscript has O(1) complexity, but finding duplicates should take O(n) complexity, right? Any links that explain this would be cool :) – ebby94 Jun 13 '17 at 04:27
  • `Accessing with a subscript has a O(1) complexity` Subscripts can implement any arbitrary behaviour, so there's no guarantee they're O(1). For `Array` in particular, the index is multiplied by the element size, and added to the base pointer. This happens to be a constant time operation, but other types can implement subscripts that have higher time complexity. For example, a linked list could implement a subscript with O(N). – Alexander Jun 13 '17 at 04:37
  • @ebby94 Are you familiar with how Dictionaries work? A `Set` is essentially a wrapper around a `Dictionary` that maps each key `x: T` to itself as the value, so it benefits from the same constant-time `contains`, `insert`, and `remove` as a Dictionary does. – Alexander Jun 13 '17 at 04:41
  • @ebby94 https://stackoverflow.com/questions/28904014/how-is-it-possible-for-java-hashmap-to-perform-constant-time-lookup-o1-for-ge – Alexander Jun 13 '17 at 04:42
  • Didn't know Set was a wrapper around a dictionary. Thanks. Much appreciated. – ebby94 Jun 13 '17 at 04:49
  • @ebby94 https://github.com/apple/swift/blob/master/stdlib/public/core/HashedCollections.swift.gyb#L582 – Alexander Jun 13 '17 at 05:41
  • @Alexander Although note that `Collection` semantically requires that implementations of its subscript requirement(s) operate in O(1) time – even for a linked list (in that case, the index advancement needs to walk through the linked list, rather than the subscript doing it). Otherwise a loop in the form `var index = startIndex; while index < endIndex { let element = self[index]; formIndex(after: &index) }` looks linear, but would be accidentally quadratic if the subscript was O(n). – Hamish Jun 13 '17 at 09:05
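To tie the discussion back to the question's values-vs-references distinction: `ObjectIdentifier` hashes on identity, not on `==`, so two equal-valued objects still count as distinct set members. A small sketch (the `Label` class is made up for illustration):

```swift
// A class with value-based equality: two Labels with the same text are ==.
final class Label: Equatable {
    let text: String
    init(_ text: String) { self.text = text }
    static func == (lhs: Label, rhs: Label) -> Bool { lhs.text == rhs.text }
}

let a = Label("hello")
let b = Label("hello")

print(a == b)   // true  — equal values
print(a === b)  // false — distinct references

// A Set of ObjectIdentifiers distinguishes by identity, in O(1) per lookup.
var seen = Set<ObjectIdentifier>()
print(seen.insert(ObjectIdentifier(a)).inserted)  // true
print(seen.insert(ObjectIdentifier(b)).inserted)  // true — different object, same value
print(seen.insert(ObjectIdentifier(a)).inserted)  // false — same object again
```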