
ARKit vs ARCore

May 2, 2018 9:05:00 AM

As you’ve probably gathered by reading this blog, we’re really excited about the future of augmented reality (AR) technology. That’s especially true now that the two biggest mobile ecosystems, iOS and Android, have development kits (ARKit and ARCore, respectively) that enable developers to bring AR apps to the mass market, without having to fuss around learning the science behind AR.

The idea of AR was first floated over 20 years ago, when the computational resources needed to make it happen would have involved tons of gear. We've finally reached the point where devices have the portability, ubiquity, sensors, and computing power to make AR practical.

How ARKit and ARCore Are Alike—and Different

ARKit and ARCore are both in their infancy, and from the perspective of the applications built on them, there is little to differentiate the two (a brief code sketch of these capabilities follows the list below):

  • They both recognize horizontal and vertical surfaces on which virtual objects can be placed.
  • They both take lighting conditions into account in order to generate shading and shadows.
  • They both construct three-dimensional maps of the environment in the camera’s field of view so that virtual objects stay anchored in place as the device moves (a process known as tracking).
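
To make this concrete, here is a minimal sketch of how an iOS app might opt into these capabilities with ARKit. The view controller and outlet names are illustrative rather than part of either SDK, and the ARCore setup is analogous but goes through Google’s Java/Kotlin APIs.

```swift
import UIKit
import SceneKit
import ARKit

class ARViewController: UIViewController, ARSCNViewDelegate {

    // Illustrative outlet: an ARSCNView placed in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking builds the 3D map of the environment described above.
        let configuration = ARWorldTrackingConfiguration()

        // Detect both horizontal and vertical surfaces
        // (vertical detection requires iOS 11.3 or later).
        configuration.planeDetection = [.horizontal, .vertical]

        // Estimate ambient lighting so virtual objects can be shaded to match.
        configuration.isLightEstimationEnabled = true

        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called when ARKit recognizes a new surface; a real app would attach
    // virtual content to the plane anchor here.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Found a \(plane.alignment == .horizontal ? "horizontal" : "vertical") surface")
    }
}
```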

How they differ is in the approaches they take under the hood to enable these features. For example, ARKit keeps a smaller data set for its 3D map, throwing away data according to the amount of time since a given point was last seen in the camera’s field of view. This reduces the amount of memory needed to maintain the map, but means that when a user returns to a previous point, the map for that area has to be rediscovered. ARCore “remembers” a much larger area, at the cost of more memory and computing power needed to match the current camera image with the map.
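
As a rough illustration of that trade-off (and emphatically not a description of ARKit’s actual internals, which Apple does not publish), a map that prunes points by time-since-last-seen might look something like this toy sketch:

```swift
import Foundation
import simd

// Toy illustration only: models the "forget points you haven't seen lately"
// strategy described above.
struct FeaturePoint {
    var position: SIMD3<Float>
    var lastSeen: Date
}

struct SparseWorldMap {
    private(set) var points: [UUID: FeaturePoint] = [:]
    let maxAge: TimeInterval  // seconds a point may go unobserved before being dropped

    // Record (or refresh) a point each time it appears in the camera frame.
    mutating func observe(_ id: UUID, at position: SIMD3<Float>) {
        points[id] = FeaturePoint(position: position, lastSeen: Date())
    }

    // Trade memory for re-discovery work: anything unseen for too long is
    // forgotten and must be re-detected if the camera returns to that area.
    mutating func prune(now: Date = Date()) {
        points = points.filter { now.timeIntervalSince($0.value.lastSeen) <= maxAge }
    }
}
```

Dropping stale points keeps memory bounded, but any area the camera revisits after the cutoff has to be rediscovered from scratch, which is exactly the cost described above.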

The other differences have more to do with their respective business models. Apple, being vertically integrated, controls both the hardware and the operating system and can develop them in tandem, which lets it bring innovations to market faster. Google, by contrast, depends in large part on independent device manufacturers to incorporate ARCore and its periodic enhancements; they will do so on their own schedules and budgets, and may not be as quick to invest in the hardware improvements needed to make AR apps more realistic.

Useful AR and Its Effect on User Interface

But let’s imagine a future in which AR has evolved from gee-whiz demonstrations to applications that solve real-world problems: applications that enable users to do things better, faster, or more easily (or at all) than they could without AR. From a user interface perspective, what would that look like?

In his groundbreaking 1998 book The Invisible Computer, Donald Norman argues that computers are difficult to use, in part because they are separate chunks of hardware; they are not a natural, integrated part of our environment. It’s just as true today as it was in the late ‘90s: our computers, as small and portable as they have become, are still separate objects that need specific, persistent care and feeding. Norman’s argument is that for computers to be truly usable, they should be invisible—such a natural part of the other objects that we interact with that we don’t notice that there are computers involved.

AR on mobile is a step in that direction, but it’s a small one. Holding a screen in front of your face wherever you go and manipulating virtual objects by tapping, pinching, and sliding your fingers across that screen is a less-than-compelling AR experience. Being able to reach out with your hands to accomplish the same thing would be much more natural and realistic, but that’s hard to do when at least one of those hands is holding a smartphone.

AR, ARKit, and ARCore: Means to an End

At the moment, AR on mobile is being treated as an end in itself. But in the broader view, the development and refinement of AR technology is a stepping stone on the path to the “invisible computer” that Norman proposed so long ago. ARKit and ARCore are necessary, but not sufficient, components of that evolution. Their continued development will make the user experience more natural, lifelike, and realistic, but they need to move beyond the confines of the smartphone. It is hardware development, then, that will take AR to the next level, where we can dispense with the smartphone and deliver an AR experience that’s nearly indistinguishable from the real world.

How will this hardware evolution happen? I don’t know—after all, I’m a software guy. But it’s fun to imagine where AR can take us when it’s untethered from the five-inch smartphone screen. What do you think?


Written by Brian Geary

Brian is a true believer in the Agile process. He often assists the development process by performing the product owner role. In addition to his technical background, he is an experienced account manager with a background in design and marketing.
