Category: Blog, iOS, Development

How to Create a Measuring App With ARKit In iOS 11

ARKit iOS 11 measuring app

I’m sure that many of you (like me) were really excited when Apple announced a new framework – ARKit – at WWDC 2017. The demo was really interesting and most of us started wondering: what else can we achieve by using that framework? In this post, I would like to show you how to create a basic measuring app. That’s correct – your device is able to measure things located in the real world! But, for now, let’s start from the beginning.

What is ARKit, really?

It’s a framework in iOS 11 which allows us to blend digital objects with the real world. Is it really something new? Of course not.

We have seen things like this before, for example in the Pokémon GO game or the IKEA app. Yet now it's much easier to achieve the same (or even better!) results, with much less effort.

You need to keep in mind, though, that ARKit requires a device with an Apple A9 or A10 processor.

For presenting the results of "world" detection, we are able to use SpriteKit, Metal or SceneKit in the case of Apple's frameworks. ARKit is also available for third-party tools like Unity or Unreal Engine. That's about all you need to know for this brief description. Time to play!

Let’s code!

At the beginning, we have to create a new empty project that supports ARKit. Xcode 9 now has a separate template for this.

(Screenshot: the project template selection in Xcode 9)

Why is using a template better than creating everything from scratch? During the configuration, we can choose the content technology: SpriteKit, SceneKit or Metal.

In the case of SceneKit (my choice for this demo), we get a sample art.scnassets folder with the ship scene (well known from WWDC17), the default configuration and setup for the scene, methods for running and pausing the session, and also the session error handling methods (didFailWithError, sessionWasInterrupted and sessionInterruptionEnded). Our storyboard also contains an ARSCNView object already connected to an IBOutlet of the view controller.

As for the UI stuff for this demo, I've created some labels which show the final result, tell us when the app is ready to detect the world, and prompt us to simply aim at the center of the screen.

OK, the next thing we have to do is set up the session and its configuration. So far, we are able to detect only horizontal planes, but that's not a problem because, in our app, we will rely on feature points. Wait, whaaaat!?

Feature points – points identified by the framework as parts of a surface. Based on these, we are able to detect things.

For this demo, we don't need to detect planes (so far, only horizontal plane detection is available anyway), so the default configuration will be enough.
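The setup can be sketched like this – a minimal version, assuming the template's sceneView outlet (note that in the early iOS 11 betas the configuration class had a slightly different name):

```swift
import UIKit
import ARKit

class ViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // The default world-tracking configuration is enough for this demo –
        // we skip planeDetection, since we will rely on raw feature points.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view goes away to save battery
        sceneView.session.pause()
    }
}
```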

Where are my feature points?!

Well, we have to start from the end. For proper detection, we need a function that detects a feature point and translates it into 3D coordinates, so that later we can calculate the distance between two of them. OK, that's a brief description of our algorithm. Below is an extension of ARSCNView:
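A sketch of that extension might look like this (realWorldVector(for:) is a hypothetical helper name, not ARKit API – the hit test and the matrix translation are the two steps of the algorithm):

```swift
import ARKit

extension ARSCNView {
    // Hit-test a screen point (e.g. the center of the view) against the
    // feature points ARKit has detected and return the 3D world position
    // of the closest match, if any.
    func realWorldVector(for screenPosition: CGPoint) -> SCNVector3? {
        let results = hitTest(screenPosition, types: [.featurePoint])
        guard let result = results.first else { return nil }

        // The translation lives in the 4th column of the transform matrix
        let t = result.worldTransform.columns.3
        return SCNVector3(t.x, t.y, t.z)
    }
}
```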

As you probably noticed, we have set the session delegate before.

We need to implement the method updateAtTime and call our detection function there. Also, don't forget that this has to be done on the main thread.
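In a sketch (assuming the view controller adopts ARSCNViewDelegate and detectObjects() is our hypothetical detection helper):

```swift
// SCNSceneRendererDelegate callback, fired once per rendered frame.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    // The renderer calls this on a background thread,
    // so hop back to the main thread before touching any UI or state.
    DispatchQueue.main.async { [weak self] in
        self?.detectObjects()
    }
}
```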

How does it work? We call hitTest on the ARSCNView with the expected result type set to featurePoint. Once we get the results, we take the first one from the array and call a helper function that extracts a position from its transformation matrix.

Basically, the hardest part is done. Uff…

Right now we have to "save" the beginning and the end of the thing we want to measure. These two points are of type SCNVector3, so we need to calculate the distance between them.

It’s quite easy:
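A minimal sketch of the distance helper – plain Euclidean distance between the two vectors:

```swift
import SceneKit

extension SCNVector3 {
    // Euclidean distance between two points in 3D space:
    // sqrt(dx² + dy² + dz²)
    func distance(to vector: SCNVector3) -> Float {
        let dx = Float(x - vector.x)
        let dy = Float(y - vector.y)
        let dz = Float(z - vector.z)
        return sqrtf(dx * dx + dy * dy + dz * dz)
    }
}
```

So a "measurement" from (0, 0, 0) to (3, 4, 0) gives 5 – the classic 3-4-5 triangle.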

That’s it! Now, we just need to display the results.

How do we do this? Once you tap and hold, the application starts measuring from the aim point at the center of the screen. Once you release your finger, the measurement is done.
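That flow can be sketched with the standard touch callbacks (startVector, resultLabel and the helpers realWorldVector(for:)/distance(to:) are assumed names from this demo, not system API):

```swift
// Begin measuring when the finger goes down…
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Remember the 3D point currently under the crosshair
    startVector = sceneView.realWorldVector(for: view.center)
}

// …and finish when it is released.
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let start = startVector,
          let end = sceneView.realWorldVector(for: view.center) else { return }
    resultLabel.text = String(format: "%.2f m", start.distance(to: end))
}
```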

Conclusion:

Don't be surprised when, sometimes, you get some strange results – that's because of the feature points. We get a lot of them at once, so it's possible that we did not start measuring with the best one; if that happens, try to move the device a bit. For this simple demo, I wanted to show you how to implement it in the fastest way possible, while still getting really good results.

The project is available on our GitHub account – have a glance!

Below you can see an example of it in use:

measuring demo app ARKit iOS 11