The app itself doesn’t have access to your location - only the location provided to it from the intent. Siri has separate permissions to use your location, see the privacy settings on your device for details.
Yes, Siri told me to turn on location settings for Siri and Dictation, but she kept repeating herself about this, and I had to build and run again. Then it worked.
It probably (still?) helps to have an American accent ;]
Later: I can’t get Siri to ask me for a pickup location. I’ve deleted the app from my phone, then built and run again, but she just says WenderLoon can be there and asks whether I want to request it. She’s assuming my current location is the pickup location: the latitude is -37.76
I’m sorry the tutorial didn’t work out for you. Extensions, permissions and SiriKit itself make the whole package very hard to work on and debug, which is why we added that note at the start of the tutorial :] There are so many moving parts relying on so many different variables that it’s unlikely I’ll be able to offer any specific help to you, sorry.
Thanks for the tutorial. I am stuck at a point where I am trying to bring the WenderLoonCore framework into the extension. I keep getting a “Use of undeclared type WenderLoonSimulator” error. I get the same error in the final project as well. Any help would be appreciated.
Try selecting the WenderLoonCore target on its own, then building that. Afterwards you should be OK to switch back and build the rest of the project. I just tried that with the final project in Xcode 8.3.2 and it was OK (before then I got a build error)
Hi, I’m trying to play with the new intents available in iOS 11, like INSearchForAccountsIntent.
It is sometimes hard to find what sentence to use to fill the right parameter. For example, INSearchForAccountsIntent has an accountNickname parameter that I can’t manage to reach.
Do you have a tip for figuring out how to reach a particular parameter of an intent, or maybe somewhere to find a (complete) list of sentences for each intent?
Also, is it possible to easily print the content of an intent for debugging purposes? intentDescription doesn’t seem to return anything interesting.
Hi, thanks for the great tutorial.
I have a question: my model is a Core Data entity which sort of matches INTask. Should I add a derived property to the entity which returns an INTask instance, or do you suggest a different approach for this case?
Sorry, I can’t help you with that. The docs themselves provide a few examples but then say that “many other sentences may be recognised” — they’re continually modifying and improving Siri’s behaviour, so you’re not going to find a complete list.
If an intent parameter isn’t present then Siri has decided that the user didn’t include that information and there’s nothing you can do about it. I’d guess the account nickname would be something like “checking account” or “joint account”.
As for printing out the content of an intent: if there’s nothing useful in intentDescription then you should file a radar, because that’s exactly what that method is for. I have found writing and debugging intents to be very frustrating.
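In the meantime, one quick way to eyeball an intent’s contents is Swift’s Mirror reflection, which dumps the stored properties of any object. A minimal sketch — SampleIntent here is just a hypothetical stand-in for a real INIntent subclass; in your extension you’d call the helper on the actual intent inside your handler:

```swift
// Generic debugging helper: list an object's stored properties.
func dumpProperties(of subject: Any) -> String {
    let mirror = Mirror(reflecting: subject)
    return mirror.children
        .map { "\($0.label ?? "?"): \($0.value)" }
        .joined(separator: "\n")
}

// Hypothetical stand-in for e.g. INSearchForAccountsIntent,
// so the sketch runs anywhere.
struct SampleIntent {
    let accountNickname: String?
    let requestedBalanceType: String?
}

let intent = SampleIntent(accountNickname: "checking account",
                          requestedBalanceType: nil)
print(dumpProperties(of: intent))
```

It won’t pretty-print nested INSpeakableString values, but it’s usually enough to see which parameters Siri actually filled in.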
That sounds reasonable. INTask wasn’t around when I wrote this tutorial so I didn’t look into that intent specifically, but it would seem to be a good way to make your existing data model “siriable”.
Thanks Richard.
Regarding some implementation details, and considering my model is made of Core Data entities, do you have any suggestion? Meaning: shall I implement a derived INTask property on that particular entity class, or would it be better to resort to a sort of trampoline pattern, with a different object that’s responsible for bridging between the two data representations?
You’re absolutely right. I’m just having some concerns about the derived property, since my app also targets iOS 10.x (INTask is available starting with iOS 11), so I thought about leaving the Core Data entity alone and implementing a sort of trampoline pattern class only for iOS 11.
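Something along these lines is what I had in mind — the entity stays untouched and a separate bridge type owns the iOS 11-only conversion. All names are hypothetical, and I’ve guarded the Intents part behind canImport so the sketch compiles on any platform:

```swift
// Hypothetical stand-in for the Core Data entity (the real one
// would be an NSManagedObject subclass).
struct TaskRecord {
    let title: String
    let isDone: Bool
}

#if canImport(Intents)
import Intents

// The "trampoline": a separate type that bridges the entity to
// INTask, so the entity itself has no iOS 11-only dependencies.
@available(iOS 11.0, *)
enum TaskBridge {
    static func makeINTask(from record: TaskRecord) -> INTask {
        INTask(title: INSpeakableString(spokenPhrase: record.title),
               status: record.isDone ? .completed : .notCompleted,
               taskType: .completable,
               spatialEventTrigger: nil,
               temporalEventTrigger: nil,
               createdDateComponents: nil,
               modifiedDateComponents: nil,
               identifier: nil)
    }
}
#endif
```

That way the iOS 10 code path never even sees the INTask type.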
But I’ll follow your suggestion.
Thanks a lot.
PS: sadly, INTask doesn’t take amounts into account, which is what I was most interested in for letting the user also manipulate my model via Siri — it’s based on amounts, not on text for tasks.
This tutorial is more than six months old, so questions are no longer supported for it at the moment. We will update it as soon as possible. Thank you! :]