AR Location Anchors in ARKit 4

Bring augmented reality into real world locations with geo-tracking from WWDC 2020

Ethan Saadia
3 min read · Jun 23, 2020

With ARKit 4, it is now possible to anchor AR content at a specific coordinate in the real world. AR apps that previously relied on image recognition or scanning a code to launch location-specific experiences can now upgrade to take advantage of location anchors.

How geo-tracking works

As part of rebuilding the data backend for Apple Maps, Apple collected camera and 3D LiDAR data from city streets around the world. When using location anchors, ARKit downloads the virtual map surrounding your device from the cloud and matches it with the device’s camera feed. Combined with GPS, ARKit can quickly and precisely determine your location in the real world. All this processing happens with on-device machine learning, so the camera feed never leaves your device.

Availability

Geo-tracking is supported on all devices with GPS and an A12 chip or later. Because this feature requires Apple to have mapped the area in advance, it is only available in certain cities. ARKit currently supports geo-tracking in over 50 US cities, and Apple will almost certainly keep expanding availability.

Find out if your city is supported in the Apple documentation.

Geo-tracking uses a new ARConfiguration subclass called ARGeoTrackingConfiguration that makes it easy to check for device compatibility and availability.

First, check for device support (A12 or later and GPS).

guard ARGeoTrackingConfiguration.isSupported else { return }

Next, check whether the device is in a supported city. If so, run a geo-tracking configuration on the ARView’s session. If you are using RealityKit, automatic configuration does not apply here, so run the configuration on the session manually.

ARGeoTrackingConfiguration.checkAvailability { (available, error) in
    guard available else { return }
    arView.session.run(ARGeoTrackingConfiguration())
}

Creating location anchors

From the WWDC 2020 session “Explore ARKit 4”

ARKit uses its own coordinate system relative to the device, while real-world locations are described with latitude and longitude. With geo-tracking, ARKit provides a unified coordinate system so you do not need to convert between them yourself: the axes automatically line up with the compass, so the X axis points east and the Z axis points south.
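That alignment means you can offset content in real-world directions with plain XYZ math. Here is a small sketch, assuming a `geoAnchorEntity` backed by an ARGeoAnchor like the one created later in this article; the sphere and the 10 m / 5 m distances are just illustrative.

```swift
// With geo-tracking, +X is east and +Z is south, so negative Z is north.
// This places a sphere 10 m east and 5 m north of the geo anchor.
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.5))
sphere.position = SIMD3<Float>(10, 0, -5)  // x: east, z: south (so -5 is north)
geoAnchorEntity.addChild(sphere)
```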

All you need to create an ARGeoAnchor is a single GPS coordinate. Here is how to create a location anchor for the center of Apple Park. For the best precision, I’m using six decimal places for the coordinates.

let coordinate = CLLocationCoordinate2D(latitude: 37.334525, longitude: -122.008898)
let geoAnchor = ARGeoAnchor(name: "Apple Park", coordinate: coordinate)

Optionally, you can specify an altitude in meters. By default, the altitude is at ground level.

let geoAnchor = ARGeoAnchor(name: "Apple Park", coordinate: coordinate, altitude: 72)

Now let’s add the anchor to the session. In RealityKit, you attach content by creating an AnchorEntity from the ARGeoAnchor.

arView.session.add(anchor: geoAnchor)
let geoAnchorEntity = AnchorEntity(anchor: geoAnchor)
arView.scene.addAnchor(geoAnchorEntity)

Now you can add other entities to your location anchor, rotating and positioning them to match the real world space.
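For example, here is a hedged sketch of orienting a child entity; the box dimensions, 1.5 m height, and 90° rotation are all illustrative, not from the original article.

```swift
// Attach a simple sign-like box to the geo anchor entity from above.
// A model authored facing -Z faces north under geo-tracking; rotating
// 90° around the vertical (Y) axis turns it to face west.
let sign = ModelEntity(mesh: .generateBox(size: [1, 0.5, 0.05]))
sign.orientation = simd_quatf(angle: .pi / 2, axis: [0, 1, 0])
sign.position.y = 1.5  // raise it 1.5 m above the anchor's altitude
geoAnchorEntity.addChild(sign)
```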

Converting between coordinate spaces

If you want the GPS coordinates for a point in your scene, ARKit makes it easy to get them from an XYZ ARKit coordinate. This makes it possible to create a location anchor from a tap on the screen.

let point = SIMD3<Float>([0, 1, -2])
arView.session.getGeoLocation(forPoint: point) { (coordinate, altitude, error) in
    let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: altitude)
}
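Putting the pieces together, a tap-to-anchor flow might look like the sketch below. The raycast, getGeoLocation(forPoint:), and session.add(anchor:) calls are real ARKit/RealityKit APIs; the gesture handler wiring and `arView` property are assumed context.

```swift
// Sketch: create a location anchor wherever the user taps,
// assuming `arView` is an ARView running ARGeoTrackingConfiguration.
@objc func handleTap(_ sender: UITapGestureRecognizer) {
    let screenPoint = sender.location(in: arView)
    // Raycast from the tap to find a real-world surface point.
    guard let result = arView.raycast(from: screenPoint,
                                      allowing: .estimatedPlane,
                                      alignment: .any).first else { return }
    let worldPoint = SIMD3<Float>(result.worldTransform.columns.3.x,
                                  result.worldTransform.columns.3.y,
                                  result.worldTransform.columns.3.z)
    // Convert the ARKit coordinate to latitude/longitude and anchor there.
    arView.session.getGeoLocation(forPoint: worldPoint) { coordinate, altitude, error in
        guard error == nil else { return }
        let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: altitude)
        self.arView.session.add(anchor: geoAnchor)
    }
}
```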

That’s it! For more ARKit coverage at WWDC 2020, see my other articles here. Thank you, and I can’t wait to see what you build.


Ethan Saadia

Founder at PhotoCatch. Developing 3D content creation tools for everyone. 2x WWDC Scholar. Developer, Engineer, and Entrepreneur.