Kodeco Forums

Augmented Reality iOS Tutorial: Location Based

In this augmented reality tutorial, you'll learn how to use your iOS user's location to create compelling augmented reality experiences.


This is a companion discussion topic for the original entry at https://www.raywenderlich.com/764-augmented-reality-ios-tutorial-location-based

I am having trouble. The viewForAnnotation delegate method always returns the same object.

public func ar(_ arViewController: ARViewController, viewForAnnotation: ARAnnotation) -> ARAnnotationView {

    let pokemonAnnotationView = PokemonAnnotationView()
    pokemonAnnotationView.annotation = viewForAnnotation
    pokemonAnnotationView.frame = CGRect(x: 0, y: 0, width: 150, height: 150)
    return pokemonAnnotationView
}

Here is how I load the pokemons from a web service:

private func populatePokemons() {

    let url = URL(string: "https://still-wave-26435.herokuapp.com/pokemon/all")
    
    URLSession.shared.dataTask(with: url!) { (data, response, error) in
        
        let dictionaries = try! JSONSerialization.jsonObject(with: data!, options: []) as! [[String:Any]]
        
        self.pokemons = dictionaries.flatMap(Pokemon.init)
        
        DispatchQueue.main.async {
            
            for pokemon in self.pokemons {
                
                let annotation = MKPointAnnotation()
                annotation.title = pokemon.name
                annotation.coordinate = CLLocationCoordinate2D(latitude: pokemon.latitude, longitude: pokemon.longitude)
                
                // create pokemon annotation 
                let pokemonAnnotation = PokemonAnnotation()
                pokemonAnnotation.title = pokemon.name
                pokemonAnnotation.location = CLLocation(latitude: pokemon.latitude, longitude: pokemon.longitude)
                pokemonAnnotation.imageURL = pokemon.imageURL
                
                self.pokemonAnnotations.append(pokemonAnnotation)
                self.mapView.addAnnotation(annotation)
            }
            
        }
        
    }.resume()
}

Hey, I was following the Objective-C version of this tutorial, and the original link now points to this Swift version.

How do I get to see the original tutorial?

This method is called from ARViewController's createAnnotationViews() and getAnyAnnotationView(). I suggest you add a breakpoint inside ar(_:viewForAnnotation:), and if it gets hit, move up the call stack into the calling method and check the activeAnnotations property of ARViewController.
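
If a breakpoint is awkward, a quick print inside the delegate also shows whether distinct annotations are actually coming through. This is just a debugging sketch based on the delegate method posted above; it assumes the title is set on your ARAnnotations:

public func ar(_ arViewController: ARViewController, viewForAnnotation: ARAnnotation) -> ARAnnotationView {
    // Log the identity and title of every annotation the library hands us.
    // If the same instance keeps appearing here, the problem is upstream,
    // in the annotations array you passed to the ARViewController.
    print("annotation:", ObjectIdentifier(viewForAnnotation), viewForAnnotation.title ?? "untitled")

    let pokemonAnnotationView = PokemonAnnotationView()
    pokemonAnnotationView.annotation = viewForAnnotation
    pokemonAnnotationView.frame = CGRect(x: 0, y: 0, width: 150, height: 150)
    return pokemonAnnotationView
}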

Hi, sorry to hear that. Unfortunately I don't know if there is a way to see the old version of this tutorial, but maybe @raywenderlich or @samdavies can give you an answer.

Great tutorial. Is there a way to add a given angle in degrees (elevation) for each POI so that the user has to tilt the phone up to the sky to see the POI? Stars, for example.

Great tutorial but …

AVCaptureDevice.devices is deprecated in iOS 10…

/Users/robert/Documents/XCode Swift Projects/Places/Places/HDAugmentedReality/Classes/ARViewController.swift:915:44: ‘devices(withMediaType:)’ was deprecated in iOS 10.0: Use AVCaptureDeviceDiscoverySession instead.
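
For reference, the replacement the warning points to is AVCaptureDeviceDiscoverySession. A minimal sketch of the lookup in current Swift spelling (just an illustration of the replacement API, not the library's official fix; the full createCaptureSession replacement appears later in this thread):

import AVFoundation

// Discover the back wide-angle camera instead of calling the deprecated
// AVCaptureDevice.devices(withMediaType:). In the Swift 3 / iOS 10 SDK the class
// is spelled AVCaptureDeviceDiscoverySession and the media type is AVMediaTypeVideo.
let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                 mediaType: .video,
                                                 position: .back)
let backVideoDevice = discovery.devices.first  // AVCaptureDevice?, nil if no back camera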

I can really use this idea. Thanks for the tutorial.

Well, I hate to say this, but this tutorial, or at least the downloadable project, does not run “out of the box”.

Firstly, it has that deprecation warning in it. And I'm using the very latest iOS and Xcode etc., and when I run it on my iPhone SE, I get this:

2017-01-25 11:39:50.537577 Places[4443:4704805] [LogMessageLogging] 6.1 Unable to retrieve CarrierName. CTError: domain-2, code-5, errStr:((os/kern) failure)
Load pois
fatal error: unexpectedly found nil while unwrapping an Optional value
2017-01-25 11:39:56.705175 Places[4443:4704750] fatal error: unexpectedly found nil while unwrapping an Optional value
(lldb)

OK, solved. Put in my API key … doh!

Just a little technical correction:

“There are 360 meridian lines and 360 lines of parallel, one for every degree out of 360 degrees.”

True, there are 360 meridian lines (longitude) but only 180 lines of latitude (90 to -90).

Otherwise, I speed-read through the article and cannot wait to implement it!

How do I change this line in ARViewController.swift?

915 let videoDevices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo)

to the correct syntax, as this is the line that is deprecated in iOS 10.

: ‘devices(withMediaType:)’ was deprecated in iOS 10.0: Use AVCaptureDeviceDiscoverySession instead.

But I can’t work out how to change it.
Thanks

Lol, I'll check if I can emit a warning or compile error for the key.

I'm not at home today, but I'll try to give you a solution tomorrow.

Right, I'll correct this later.

Very interesting tutorial, +1

I use it and it works very well. For now I'm working on an astronomical app. Using a complex astronomy lib, I can now determine the position of a planet (or the sun, moon, etc.) relative to the position of the device and the current Date(). I have an azimuth and an altitude. My goal is to show this position on an AR view.

I read that the lib used in this project doesn't use altitude. Do you have any suggestions about this?

Regards
Seb
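
One rough way to handle altitude yourself, since the lib only positions annotations horizontally by bearing, is to skip it for celestial objects and project the azimuth/altitude pair onto the screen using the device's heading and pitch (from CLLocationManager and CMMotionManager). A minimal math sketch follows; the field-of-view values and the linear mapping are assumptions you would calibrate for the device camera, not library code:

import CoreGraphics

/// Maps a celestial direction (azimuth/altitude, in degrees) to a screen point,
/// given the device's current heading and pitch and an assumed camera field of view.
/// Returns nil when the object is outside the visible area.
func screenPoint(azimuth: Double, altitude: Double,
                 deviceHeading: Double, devicePitch: Double,
                 screenSize: CGSize,
                 horizontalFOV: Double = 60, verticalFOV: Double = 45) -> CGPoint? {
    // Signed angular difference, normalized to -180...180 degrees.
    func delta(_ a: Double, _ b: Double) -> Double {
        var d = (a - b).truncatingRemainder(dividingBy: 360)
        if d > 180 { d -= 360 }
        if d < -180 { d += 360 }
        return d
    }

    let dAz = delta(azimuth, deviceHeading)   // how far left/right of screen center
    let dAlt = altitude - devicePitch         // how far above/below screen center

    guard abs(dAz) <= horizontalFOV / 2, abs(dAlt) <= verticalFOV / 2 else { return nil }

    // Simple linear angle-to-pixels mapping; good enough for an overlay sketch.
    let x = screenSize.width / 2 + CGFloat(dAz / horizontalFOV) * screenSize.width
    let y = screenSize.height / 2 - CGFloat(dAlt / verticalFOV) * screenSize.height
    return CGPoint(x: x, y: y)
}

You would recompute the point on every device-motion update (converting the attitude pitch from radians to degrees) and move your star view there; anything more precise needs a real camera projection rather than this linear mapping.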

@pierredrks Excellent tutorial! I have a use case: instead of Google Places POIs, I have a collection of lat/long coordinates for fire hydrants. So I have a question:

  1. If the lat/long of a marker is offset by a few meters from the actual fire hydrant, and the user drags and drops the marker on top of the actual hydrant, how do I fetch the new (correct) lat/long of the marker?
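
Not from the tutorial, but with plain MapKit you can make the pin draggable and read the new coordinate when the drag ends. A minimal sketch, assuming the hydrants are shown as MKPointAnnotations and your view controller (called ViewController here) is the map view's delegate:

import MapKit

extension ViewController: MKMapViewDelegate {

    // Return a draggable pin for each hydrant annotation.
    func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
        guard !(annotation is MKUserLocation) else { return nil }
        let identifier = "hydrant"
        let view = mapView.dequeueReusableAnnotationView(withIdentifier: identifier) as? MKPinAnnotationView
            ?? MKPinAnnotationView(annotation: annotation, reuseIdentifier: identifier)
        view.annotation = annotation
        view.isDraggable = true
        return view
    }

    // When the drag finishes, the annotation's coordinate is the corrected lat/long.
    func mapView(_ mapView: MKMapView, annotationView view: MKAnnotationView,
                 didChange newState: MKAnnotationViewDragState,
                 fromOldState oldState: MKAnnotationViewDragState) {
        if newState == .ending, let coordinate = view.annotation?.coordinate {
            print("corrected hydrant location:", coordinate.latitude, coordinate.longitude)
            // Save the corrected coordinate back to your hydrant model or web service here.
        }
    }
}

MapKit updates the dragged MKPointAnnotation's coordinate for you, so reading it in the .ending state gives the corrected position.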

@pierredrks
Nice tutorial, I am learning a lot here, but… am I the only one who does not get any POIs in the augmented view?
There are several POIs in the map view, but when I hit the camera button none of them shows up.
I followed the tutorial, downloaded the final project, and added my API key.
Thx, Harry


I just finished it and learned a lot. Still so much more to learn, though.

Notes:

  • “annotaionView” (typo)
  • “Xcode complains the line where…” (needs an “about”)

I’ll stop nitpicking for now because this was an awesome tutorial. So, thank you!

PS: Don't give up and cheat by grabbing the final project if you get errors. Use print statements to learn the structure and program flow; it helps.
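
For example, a tiny helper that dumps the shape of a JSON payload goes a long way when the POI request doesn't behave (JSONSerialization is standard Foundation; the function name here is just made up for illustration):

import Foundation

/// Print the shape of a JSON payload so you can see what the web service really returns.
func inspectJSON(_ data: Data) {
    guard let json = try? JSONSerialization.jsonObject(with: data, options: []) else {
        print("not valid JSON")
        return
    }
    print("top-level type:", type(of: json))
    if let array = json as? [[String: Any]] {
        print("array of \(array.count) objects, first:", array.first ?? [:])
    }
}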

I agree, but do you get the right augmented view with POIs in it?

Indeed, I do see the individual POIs in AR with their correct annotations. I also changed to miles for here in the US. I went out for a drive later and noticed a scaling problem, so a bit of debugging still, as I might have left in some code I was playing with…

Hi, can you replace createCaptureSession inside ARViewController with the following and ping me if everything works?

func createCaptureSession() -> (session: AVCaptureSession?, error: NSError?) {
  var error: NSError?
  var captureSession: AVCaptureSession?
  var backVideoDevice: AVCaptureDevice?
  backVideoDevice = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .back)

  if backVideoDevice != nil {
    var videoInput: AVCaptureDeviceInput!
    do {
      videoInput = try AVCaptureDeviceInput(device: backVideoDevice)
    } catch let error1 as NSError {
      error = error1
      videoInput = nil
    }

    if error == nil {
      captureSession = AVCaptureSession()

      if captureSession!.canAddInput(videoInput) {
        captureSession!.addInput(videoInput)
      } else {
        error = NSError(domain: "", code: 0, userInfo: ["description": "Error adding video input."])
      }
    } else {
      error = NSError(domain: "", code: 1, userInfo: ["description": "Error creating capture device input."])
    }
  } else {
    error = NSError(domain: "", code: 2, userInfo: ["description": "Back video device not found."])
  }

  return (session: captureSession, error: error)
}