Experience A Super Power - The Journey of Augmented Reality With ARKit Application

  • By: Akshay Agarwal
  • Date: 07-08-2018

Apple’s newest API, ARKit, makes the exciting world of Augmented Reality available to every iOS developer, but where do you get started? Come with us on an Augmented Reality journey to build your first ARKit application.

What does ARKit do?

ARKit recognizes notable features in the scene image, tracks differences in the positions of those features across video frames, and compares that information with motion sensing data. The result is a high-precision model of the device’s position and motion that also analyzes and understands the contents of a scene.
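Once a session is running, that model is exposed through the session's current frame. As a hedged sketch (assuming the ARSCNView named sceneView that we set up below is already running), you can read the device's tracked position like this:

// Read the camera's world transform from the most recent ARFrame.
if let frame = sceneView.session.currentFrame {
    let transform = frame.camera.transform   // simd_float4x4
    let position = transform.columns.3       // device position in world coordinates
    print("Device is at x: \(position.x), y: \(position.y), z: \(position.z)")
}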

What do I need to get started?

-Create a new Xcode project and select the Augmented Reality App template.

[Screenshot: the Augmented Reality App template in Xcode]

There are some key points every AR app needs:

-You will need an ARSCNView. Most people name their instance sceneView. This is where all the AR magic happens. You can set it to occupy the whole screen or just part of the UI.
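If you start from the Augmented Reality App template, this view is already wired up in the storyboard. A minimal sketch of the corresponding outlet (the property name follows the template's convention) looks like this:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController {
    // Connected to the ARSCNView in the storyboard by the template.
    @IBOutlet var sceneView: ARSCNView!
}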

-You need to implement the ARSCNViewDelegate protocol, which includes the methods used to render models into the view. The view controller implements this protocol and becomes the delegate of the sceneView:

sceneView.delegate = self
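As an illustrative sketch of a delegate method (this body is our own assumption, not part of the template), the following renderer(_:didAdd:for:) implementation attaches a translucent box to every detected plane, assuming plane detection is enabled in the configuration (see below):

extension ViewController: ARSCNViewDelegate {
    // Called when ARKit adds an anchor (for example a detected plane) to the session.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Illustrative only: visualise the detected plane with a thin translucent box.
        let plane = SCNBox(width: CGFloat(planeAnchor.extent.x),
                           height: 0.005,
                           length: CGFloat(planeAnchor.extent.z),
                           chamferRadius: 0)
        plane.firstMaterial?.diffuse.contents = UIColor.yellow.withAlphaComponent(0.3)
        node.addChildNode(SCNNode(geometry: plane))
    }
}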

-An ARConfiguration needs to be set up with the kind of plane detection you want and then passed to the sceneView session's run() method to actually start the AR scene.

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a session configuration
    let configuration = ARWorldTrackingConfiguration()

    // Run the view's session
    sceneView.session.run(configuration)
}
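The bare configuration above does not turn on plane detection. If you want ARKit to report horizontal surfaces (which is what the plane-related delegate callback sketched earlier relies on), a small, hedged addition before running the session looks like this:

let configuration = ARWorldTrackingConfiguration()
// Ask ARKit to detect horizontal surfaces such as floors and table tops.
configuration.planeDetection = .horizontal
sceneView.session.run(configuration)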

-On viewWillDisappear we pause the sceneView session to stop the world tracking and device motion tracking the phone performs while AR is running. This allows the device to free up resources.

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the view's session
    sceneView.session.pause()
}

This is the basic configuration you need for every AR scene. None of this code adds any AR object just yet; it only sets up the view.

You get all of this by default when you select the Augmented Reality App template.

Before we move forward I highly recommend you add this to the viewDidLoad method of your view controller:

sceneView.showsStatistics = true
sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                          ARSCNDebugOptions.showWorldOrigin]

Enabling these options will let you see the recognized feature points and the XYZ axes of the AR scene. If there is a bug with your model, these options are one of the few ways you can debug AR. We’ll dig deeper into how you can test and debug AR and ML applications in an upcoming article of this series.

[Screenshot: feature points and the XYZ axes of the AR scene]

With the feature points debug option enabled, you can see what ARKit is recognizing as you move around your plane (the yellow dots).

Displaying a 3D Object in the Scene

ARKit SceneKit View supports several file formats, namely .dae (digital asset exchange), .abc (alembic), and .scn (SceneKit archive). When .dae or .abc files are added to an Xcode project, however, the editor automatically converts them to SceneKit’s compressed files that retain the same extensions. To serialize an SCNScene object, create a .scn file by converting the initial .dae or .abc file. Since we’re using SceneKit, a 3D object model must be a subclass of SCNNode, so we need to create a new class (we’ve called it Chair, though you may call it whatever you like) and load the converted scene file containing the object (in our case Chair.scn, converted from Chair.dae). Here’s the code we used to add the chair to the scene:

class Chair: SCNNode {
    func loadModel() {
        // Load the converted SceneKit scene that contains the chair model.
        guard let virtualObjectScene = SCNScene(named: "Chair.scn") else { return }

        // Wrap the model's nodes in a single parent node and attach it to self.
        let wrapperNode = SCNNode()
        for child in virtualObjectScene.rootNode.childNodes {
            wrapperNode.addChildNode(child)
        }
        addChildNode(wrapperNode)
    }
}

Having configured this SCNNode, we must initialize an object of the Chair class and add it to the scene after the setup configuration:

func addChair() {
    chair.loadModel()
    sceneView.scene.rootNode.addChildNode(chair)
}
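The snippet above assumes a chair property on the view controller. One way to wire everything together (the property and the viewDidLoad call site are our assumptions, not the only option) is:

let chair = Chair()

override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.delegate = self
    sceneView.showsStatistics = true

    // Load the 3D model and attach it to the scene's root node.
    addChair()
}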

If we run our augmented reality mobile app now, we’ll see a chair in its default position on our iPhone’s screen. Needless to say, if you can’t see a 3D object, go back and check whether you’ve done everything correctly. In our sample app, the chair in the default position looked like this:

[Screenshot: the chair rendered at its default position]

The default position of the 3D object looks unnatural, so we decided to change it. For the placement of this chair, we experimented a lot until we found a position, scale, and rotation that looked right.


If your 3D object doesn’t look the way you want, feel free to experiment with its position.
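As a starting point for that experimentation, here is a hedged sketch of adjusting the node's placement; the numbers are placeholders, not the values we settled on:

// Illustrative values only - tune them for your own model.
chair.position = SCNVector3(0, -0.5, -1.0)           // 0.5 m below and 1 m in front of the world origin
chair.scale = SCNVector3(0.01, 0.01, 0.01)           // many .dae exports are modelled in centimetres
chair.eulerAngles = SCNVector3(0, Float.pi / 2, 0)   // rotate 90 degrees around the y-axis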
