
Follow-up to my previous Objective-C question here (specifically, I'm trying to translate the Objective-C answer I provided into Swift):

Drawing a line between two points using SceneKit

Problem: SceneKit statistics show that line is being drawn, but it doesn't appear on screen.

Minimal example:

  • Start a new Game project in Xcode using Swift and SceneKit.
  • Remove boilerplate:
    • let scene = SCNScene() instead of SCNScene(named: "art.scnassets/ship.dae")!
    • comment out let ship = ... and ship.runAction(...)
  • Change scnView.backgroundColor = UIColor.blackColor() to scnView.backgroundColor = UIColor.grayColor() (I originally thought maybe the lines were just black on a black background, so changed the background colour accordingly)
  • Add the following code:

    var positions: [SCNVector3] = [SCNVector3Make(0.0,0.0,0.0),SCNVector3Make(10.0,10.0,10.0)]
    
    // using Swift's Int gives an error with unsupported "SceneKit: error, C3DRendererContextBindMeshElement unsupported byte per index (8)"
    // when constructing element: SCNGeometryElement below, so switched to CInt
    var indicies: [CInt] = [0,1]
    
    // for whatever reason, the following doesn't work on iOS:
    // let vertexSource = SCNGeometrySource(vertices: &positions, count: positions.count)
    // see https://stackoverflow.com/questions/26890739/how-to-solve-scenekit-double-not-supported-error
    // so using more complex SCNGeometrySource() initializer instead
    
    let vertexData = NSData(bytes: &positions, length: positions.count)
    let vertexSource = SCNGeometrySource(data: vertexData, semantic: SCNGeometrySourceSemanticVertex,
        vectorCount: positions.count, floatComponents: true, componentsPerVector: 3,
        bytesPerComponent: sizeof(Float), dataOffset: 0, dataStride: sizeof(SCNVector3))
    
    let indexData = NSData(bytes: &indicies, length: indicies.count)
    let element = SCNGeometryElement(data: indexData, primitiveType: SCNGeometryPrimitiveType.Line,
        primitiveCount: indicies.count, bytesPerIndex: sizeof(CInt))
    
    let lines = SCNGeometry(sources: [vertexSource], elements: [element])
    
    let cellNode = SCNNode(geometry: lines)
    scene.rootNode.addChildNode(cellNode)
    

As mentioned, the SceneKit statistics suggest the line is being drawn, but it doesn't seem to appear, and there are no compilation or runtime errors, which makes it a little tricky to track down.

Edit: Using the GPU 'Analyse' tool in the debug navigator gave the following errors:

Draw call exceeded element array buffer bounds

Draw call exceeded array buffer bounds

in glDrawElements(GL_LINES, 4, GL_UNSIGNED_INT, NULL), which suggests there's something wrong with my SCNGeometryElement construction?

Matthew

1 Answer


I haven't had a chance to debug the code live, but most likely these lines are your problem:

let vertexData = NSData(bytes: &positions, length: positions.count)
// ...
let indexData = NSData(bytes: &indicies, length: indicies.count)

The length of your data is the number of entries times the size (in bytes) of each entry. Each element in positions is an SCNVector3, which is three Floats on iOS, so each position is 3 * sizeof(Float), or 12 bytes. Each element in indicies (sic) is a CInt, which I believe is a 32-bit integer. So your data buffers are much shorter than your SCNGeometrySource and SCNGeometryElement constructors claim they should be, which means that at render time SceneKit is asking OpenGL to take a long draw off a short buffer.

Generally, if you're constructing an NSData from an array, you want the length parameter to be the array count times the sizeof the array's element type.

rickster
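
For anyone following along, here is a minimal sketch of the corrected buffer setup the answer describes, assuming the same Swift 1.x / iOS 8-era SceneKit APIs used in the question. The only change the answer calls for is computing each NSData length as element count times element size; the change to primitiveCount (one line primitive per pair of indices) is an extra assumption on my part, suggested by the glDrawElements(GL_LINES, 4, ...) call in the question's edit, and is not part of the answer above.

    // Sketch only: same APIs as the question, with each NSData length
    // computed as count * size-of-element rather than just count.
    var positions: [SCNVector3] = [SCNVector3Make(0, 0, 0), SCNVector3Make(10, 10, 10)]
    var indices: [CInt] = [0, 1]
    
    let vertexData = NSData(bytes: &positions,
                            length: positions.count * sizeof(SCNVector3))   // 2 * 12 bytes
    let vertexSource = SCNGeometrySource(data: vertexData, semantic: SCNGeometrySourceSemanticVertex,
        vectorCount: positions.count, floatComponents: true, componentsPerVector: 3,
        bytesPerComponent: sizeof(Float), dataOffset: 0, dataStride: sizeof(SCNVector3))
    
    let indexData = NSData(bytes: &indices,
                           length: indices.count * sizeof(CInt))            // 2 * 4 bytes
    // Assumption (not from the answer): for the Line primitive type,
    // primitiveCount is the number of line segments, i.e. index pairs.
    let element = SCNGeometryElement(data: indexData, primitiveType: SCNGeometryPrimitiveType.Line,
        primitiveCount: indices.count / 2, bytesPerIndex: sizeof(CInt))
    
    let lines = SCNGeometry(sources: [vertexSource], elements: [element])
    let cellNode = SCNNode(geometry: lines)
    scene.rootNode.addChildNode(cellNode)
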
  • I can confirm this works, thank you (and thanks for catching my indicies typo too). I can also confirm that the code now works with a Swift Int instead of a CInt; the error I was getting before with the Swift Int must have been related to my misuse of NSData. Out of interest, do you know why NSData needs to be given the length explicitly, rather than just using the entire buffer? Also, do you know why, in Objective-C, it was possible to give `sizeof(indices)` directly, but in Swift this isn't possible (`[Int] is not convertible to 'T.Type'`)? Thanks again for your help. – Matthew Jan 15 '15 at 22:58
  • Ah, I spoke too soon about `Int` vs `CInt`. The line is drawn with `indices: [Int]` and then `let indexData = NSData(bytes: &indices, length: sizeof(Int)*indices.count)`, but *only if* the `let element = SCNGeometryElement(...)` line has `bytesPerIndex: sizeof(CInt)`, and it *doesn't work* with `bytesPerIndex: sizeof(Int)`. Since I imagine it's better to be consistent, I'm going to stick with `indices: [CInt]` in my code even though `indices: [Int]` now technically works... – Matthew Jan 15 '15 at 23:25
  • `NSData` needs to be given an explicit length because you're handing it an arbitrary bag of bytes and it needs to know how big that bag is. In Swift, `NSData` could, in theory, infer the length from the size of the array you've given it (because Swift arrays know their own size)... but currently `NSData`'s Swift initializers are just imported from the ObjC versions, which need to be told a length because C arrays don't know their own size. – rickster Jan 15 '15 at 23:30
  • In C, `sizeof` is a compiler directive that can take either a type name or a value. In Swift, `sizeof` takes a type, and `sizeofValue` takes a value... however, `sizeofValue` just returns the size of a pointer when you give it an array, so you should instead pass `sizeof(MyType) * arrayOfMyType.count` if you want the total length of the array's data. – rickster Jan 15 '15 at 23:34
  • Also, depending on how big your `indices` array is, you might want to consider a smaller integer type. Are you really stuffing 2^31 indices in that buffer? I don't know if SceneKit will optimize that down for you before sending it to the GPU. – rickster Jan 15 '15 at 23:37
  • My indices array isn't huge, it's just used to draw a fairly simple wireframe, but I may as well use a smaller integer type (there's no reason not to). Thanks for the advice. – Matthew Jan 15 '15 at 23:46
  • As for the problem with the Swift `Int`, I'm guessing this is because it's actually an `Int64` on my device, but 8 `bytesPerIndex` isn't supported by SCNGeometryElement. If I gave `bytesPerIndex: sizeof(CInt)` it technically 'worked', but I imagine it was reading incorrect integers from `indexData`, since it was expecting `Int32`s and I was actually giving it `Int64`s by mistake... I'm now explicitly using `Int8` everywhere anyway, and it works correctly. – Matthew Jan 15 '15 at 23:50
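
Pulling the points from these comments together, here is a rough sketch of the index setup with one explicit fixed-width index type, again assuming the Swift 1.x-era APIs used above. Deriving both the NSData length and bytesPerIndex from that same type keeps the two from drifting apart; `Int8` follows the last comment, but any fixed-width type SceneKit supports (e.g. `CInt`/`Int32`) works the same way. The `primitiveCount` value is the same assumption flagged in the sketch after the answer.

    // Rough sketch: pick one explicit index type and derive the buffer
    // length and bytesPerIndex from it, so they can never disagree.
    var indices: [Int8] = [0, 1]   // two vertices, so a 1-byte index type is plenty
    let indexData = NSData(bytes: &indices, length: indices.count * sizeof(Int8))
    let element = SCNGeometryElement(data: indexData, primitiveType: SCNGeometryPrimitiveType.Line,
        primitiveCount: indices.count / 2, bytesPerIndex: sizeof(Int8))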