
I have scanned and trained multiple real-world objects. I have the ARReferenceObjects, and the app detects them fine.

The issue that I'm facing is that when an object does not have distinct, vibrant features, it takes a few seconds to return a detection result, which I can understand. Now, I want the app to show a bounding box and an activity indicator on top of the object while it is trying to detect it.

I do not see any information regarding this. Also, is there any way to get the time when detection starts, or the confidence percentage of the object being detected?

Any help is appreciated.

  • Any source code you could share? See: [How to Ask](https://stackoverflow.com/help/how-to-ask) and [How to write a Minimal, Complete, and Verifiable example](https://stackoverflow.com/help/mcve). – sɐunıɔןɐqɐp Jul 27 '18 at 07:49
  • Are you talking about scanning an ARReference Object and showing a bounding box? Or show a bounding box of an existing ARReference Object before it is detected? – BlackMirrorz Jul 28 '18 at 08:04
  • show a bounding box of an existing ARReference Object before it is detected. – Tech Jul 28 '18 at 08:42
  • Please check my answer, as this does what you have asked. I have updated it, with additional images to clarify everything. – BlackMirrorz Jul 28 '18 at 09:15

1 Answer


It is possible to show a boundingBox in regard to the ARReferenceObject prior to it being detected; although I am not sure why you would want to do that (in advance anyway).

For example, assuming your referenceObject was on a horizontal surface, you would first need to place your estimated bounding box on the plane (or use some other method to place it in advance); in the time it took to detect the ARPlaneAnchor and place the boundingBox, your model would most likely already have been detected.

Possible Approach:

As you are no doubt aware, an ARReferenceObject has center, extent and scale properties, as well as a set of rawFeaturePoints associated with the object.

As such, we can create our own boundingBox node based on some of the sample code from Apple's Scanning & Detecting 3D Objects project: our own SCNNode which will display a bounding box of the approximate size of the locally stored ARReferenceObject, prior to it being detected.

Note that you will need to locate the `wireframe_shader` from the Apple sample code for the boundingBox to render as a transparent wireframe:

import Foundation
import ARKit
import SceneKit

class BlackMirrorzBoundingBox: SCNNode {

    //-----------------------
    // MARK: - Initialization
    //-----------------------

    /// Creates A WireFrame Bounding Box From The Data Retrieved From The ARReferenceObject
    ///
    /// - Parameters:
    ///   - points: [float3]
    ///   - scale: CGFloat
    ///   - color: UIColor
    init(points: [float3], scale: CGFloat, color: UIColor = .cyan) {
        super.init()

        var localMin = float3(Float.greatestFiniteMagnitude)
        var localMax = float3(-Float.greatestFiniteMagnitude)

        // Apply The Object's Scale When Calculating The Local Extents
        for point in points {
            localMin = min(localMin, point * Float(scale))
            localMax = max(localMax, point * Float(scale))
        }

        self.simdPosition += (localMax + localMin) / 2
        let extent = localMax - localMin

        let wireFrame = SCNNode()
        let box = SCNBox(width: CGFloat(extent.x), height: CGFloat(extent.y), length: CGFloat(extent.z), chamferRadius: 0)
        box.firstMaterial?.diffuse.contents = color
        box.firstMaterial?.isDoubleSided = true
        wireFrame.geometry = box
        setupShaderOnGeometry(box)
        self.addChildNode(wireFrame)
    }

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) Has Not Been Implemented") }

    //----------------
    // MARK: - Shaders
    //----------------

    /// Sets A Shader To Render The Cube As A Wireframe
    ///
    /// - Parameter geometry: SCNBox
    func setupShaderOnGeometry(_ geometry: SCNBox) {
        guard let path = Bundle.main.path(forResource: "wireframe_shader", ofType: "metal", inDirectory: "art.scnassets"),
            let shader = try? String(contentsOfFile: path, encoding: .utf8) else {

                return
        }

        geometry.firstMaterial?.shaderModifiers = [.surface: shader]
    }

}

To display the bounding box, you would then do something like the following, noting that in my example I have the following variables:

 @IBOutlet var augmentedRealityView: ARSCNView!
 let configuration = ARWorldTrackingConfiguration()
 let augmentedRealitySession = ARSession()

To display the boundingBox prior to detection of the actual object itself, you would call the func loadBoundingBox in viewDidLoad e.g:

/// Creates A Bounding Box From The Data Available From The ARObject In The Local Bundle
func loadBoundingBox(){

    //1. Run Our Session
    augmentedRealityView.session = augmentedRealitySession
    augmentedRealityView.delegate = self

    //2. Load A Single ARReferenceObject From The Main Bundle
    if let objectURL = Bundle.main.url(forResource: "fox", withExtension: "arobject"){

        do{
            var referenceObjects = [ARReferenceObject]()
            let object = try ARReferenceObject(archiveURL: objectURL)

            //3. Log Its Properties
            print("""
                Object Center = \(object.center)
                Object Extent = \(object.extent)
                Object Scale = \(object.scale)
                """)

            //4. Get Its Scale
            let scale = CGFloat(object.scale.x)

            //5. Create A Bounding Box
            let boundingBoxNode = BlackMirrorzBoundingBox(points: object.rawFeaturePoints.points, scale: scale)

            //6. Add It To The ARSCNView
            self.augmentedRealityView.scene.rootNode.addChildNode(boundingBoxNode)

            //7. Position It 0.5m Below & 0.5m In Front Of The Camera
            boundingBoxNode.position = SCNVector3(0, -0.5, -0.5)

            //8. Add It To The Configuration
            referenceObjects.append(object)
            configuration.detectionObjects = Set(referenceObjects)

        }catch{
            print(error)
        }

    }

    //9. Run The Session
    augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    augmentedRealityView.automaticallyUpdatesLighting = true
}

The above example simply creates a boundingBox from the non-detected ARReferenceObject and places it 0.5m below and 0.5m in front of the camera, which yields something like this:

[Image: the wireframe bounding box displayed in front of the camera, prior to detection]

You would of course need to handle the initial position of the boundingBox, as well as how to handle the removal of the boundingBox 'indicator'.
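As a minimal sketch of the removal step, assuming the placeholder node is given a name when it is created (the name `preDetectionBoundingBox` is my own, not part of the code above), you could look it up and remove it once the real object has been detected:

```swift
import ARKit
import SceneKit

extension ViewController {

    /// Removes The Placeholder Bounding Box Once The Real Object Has Been Detected
    ///
    /// Assumes The Node Created In loadBoundingBox() Was Given The Name
    /// "preDetectionBoundingBox" e.g. boundingBoxNode.name = "preDetectionBoundingBox"
    func removePlaceholderBoundingBox() {

        //1. Look For The Placeholder Node In The Scene
        //2. Remove It From Its Parent If It Exists
        augmentedRealityView.scene.rootNode
            .childNode(withName: "preDetectionBoundingBox", recursively: false)?
            .removeFromParentNode()
    }
}
```

You would then call this from `renderer(_:didAdd:for:)` as soon as the anchor is a valid `ARObjectAnchor`.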

The method below simply shows a boundingBox when the actual object is detected e.g:

//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor 
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }

        //2. Create A Bounding Box Around Our Object
        let scale = CGFloat(objectAnchor.referenceObject.scale.x)
        let boundingBoxNode = BlackMirrorzBoundingBox(points: objectAnchor.referenceObject.rawFeaturePoints.points, scale: scale)
        node.addChildNode(boundingBoxNode)

    }

}

Which yields something like this:

[Image: the wireframe bounding box rendered around the detected object]

In regard to the detection timer, there is an example in the Apple sample code which shows how long it takes to detect the model.

In its crudest form (not accounting for milliseconds) you can do something like so:

Firstly, create a Timer and a var to store the detection time e.g:

var detectionTimer = Timer()

var detectionTime: Int = 0

Then, when you run your session configuration, initialise the timer e.g:

/// Starts The Detection Timer
func startDetectionTimer(){

     detectionTimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(logDetectionTime), userInfo: nil, repeats: true)
}

/// Increments The Total Detection Time Before The ARReference Object Is Detected
@objc func logDetectionTime(){
    detectionTime += 1

}

Then when an ARReferenceObject has been detected invalidate the timer and log the time e.g:

//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor
        guard let _ = anchor as? ARObjectAnchor else { return }

        //2. Stop The Timer
        detectionTimer.invalidate()

        //3. Log The Detection Time
        print("Total Detection Time = \(detectionTime) Seconds")

        //4. Reset The Detection Time
        detectionTime = 0

    }

}

This should be more than enough to get you started...

And please note that this example doesn't provide a boundingBox when scanning an object (look at the Apple sample code for that); it provides one based on an existing ARReferenceObject, which is what your question implies (assuming I interpreted it correctly).
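Regarding a confidence percentage: ARKit doesn't expose one directly, but since an ARReferenceObject is detected by matching its rawFeaturePoints, one purely illustrative (and untested) approach would be to compare the reference object's points against the current ARFrame's point cloud. The 2cm threshold and the naive nearest-point check below are my own assumptions, not anything ARKit provides:

```swift
import ARKit
import simd

/// A Rough Sketch: Estimates What Percentage Of The Reference Object's Feature Points
/// Have A Nearby Match In The Current Frame's Point Cloud
///
/// Note: The Threshold Is An Assumption, And A Real Implementation Would Need To
/// Transform The Reference Points Into World Space Before Comparing Them
func approximateMatchPercentage(of object: ARReferenceObject, in frame: ARFrame, threshold: Float = 0.02) -> Float {

    //1. Get The Feature Points From The Current Frame & The Reference Object
    guard let framePoints = frame.rawFeaturePoints?.points else { return 0 }
    let referencePoints = object.rawFeaturePoints.points
    guard !referencePoints.isEmpty else { return 0 }

    //2. Count Reference Points Which Have A Neighbour In The Current Frame
    let matched = referencePoints.filter { referencePoint in
        framePoints.contains { simd_distance($0, referencePoint) < threshold }
    }.count

    //3. Convert The Match Count To A Percentage
    return Float(matched) / Float(referencePoints.count) * 100
}
```

You could call this from `session(_:didUpdate:)` while detection is in progress, although as noted it is only a starting point.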

  • This is in Swift For IOS, not Unity :) – BlackMirrorz Jul 28 '18 at 04:50
  • ...https://stackoverflow.com/questions/51568109/how-object-tracking-is-done-in-arkit-2-using-unity – zyonneo Jul 28 '18 at 05:31
  • Hello Josh, Thanks for answer. Currently, i draw bounding box when detecting object, not on detected object. – Tech Jul 28 '18 at 07:27
  • The first part of my answer shows a bounding box from the arrefernce object you have created not around the object itself. The bounding box is created from the feature points saved in the arreference object. – BlackMirrorz Jul 28 '18 at 07:38
  • If you actually mean you want to show a bounding box when you are scanning an object then you can use the Apple example. Although a bounding box is drawn based on raw feature points – BlackMirrorz Jul 28 '18 at 07:40
  • Yes but, i mean not after detected object, but when object is detection, that time how to define this object is detecting. thanks for reply. – Tech Jul 28 '18 at 07:40
  • I have tried to help you as best I can. Since you posted no Code, I think I have done more than enough to point you in the right direction :) Try the code and see. And I dont use WhatsApp... – BlackMirrorz Jul 28 '18 at 07:52
  • Thanks Josh. Actually i can't share code publikly, because our project code is very Private code. but my requirement is like object detection with percentage, i mean when i detecting object its show me how many percent detected at this time. – Tech Jul 28 '18 at 09:18
  • My answer shows you how to show the boundingBox of the ARReferenceObject prior to detection as well as a way to show a crude timer :) As for the percentage I will leave that up to you. I think my answer is sufficient enough (being tested and working) to point you and or your team in the right direction :) – BlackMirrorz Jul 28 '18 at 09:20
  • Thanks Josh. But is it possible?, What you think? – Tech Jul 28 '18 at 09:28
  • Well if you think about it... an ARReference object is detected based on a match to its raw feature points. So you could theoretically get the raw feature points of the current frame and calculate a percentage based on a match to those of the object itself. – BlackMirrorz Jul 28 '18 at 09:47
  • Also if my solution has answered your question which was show the bounding box while detecting an ARReference object then please mark it as correct :) – BlackMirrorz Jul 28 '18 at 09:47
  • @BlackMirrorz great answer, a shame that the poster did not mark it as the right one, 1+ – Juan Boero Mar 13 '19 at 19:28
  • To save you time looking for the metal shader - this is a similar shader(made by @BlackMirrorz) - https://gist.github.com/BlackMirrorz/cc3d6fa5a6746b89b9606643a59bc41f – Nativ May 27 '20 at 11:05