toggleAnswerLabels

Hi,

I’m in Chapter 3, pages 57/58, and I Control-dragged the Tap Gesture Recognizer object to the Question View Controller, but instead of showing/hiding the answer and hint labels, the simulator displays:

[screenshot attached]

Any idea what I’m not doing right?

Thanks in advance

h

@jrg.developer Can you please help with this when you get a chance? Thank you - much appreciated! :]

Hey @zwolf !

At a high level, here’s what you’re doing in this part:

  1. Add the tap gesture to the view.
  2. Connect the IBAction.
  3. Build and run the app.
  4. Tap on the view to show/hide the hint and answer labels.

Here’s an animated GIF showing this entire process:

[animated GIF: Add_Tap_Gesture]
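If the gesture is connected but the action body itself is unclear, here’s a rough sketch of what a `toggleAnswerLabels`-style IBAction could look like. The outlet names are assumptions; match them to whatever your Question View Controller actually uses:

```swift
import UIKit

class QuestionViewController: UIViewController {

  // Assumed outlet names; yours may differ.
  @IBOutlet var hintLabel: UILabel!
  @IBOutlet var answerLabel: UILabel!

  // Connect this action to the Tap Gesture Recognizer in Interface Builder.
  @IBAction func toggleAnswerLabels(_ sender: Any) {
    // Flip the visibility of both labels on each tap.
    hintLabel.isHidden.toggle()
    answerLabel.isHidden.toggle()
  }
}
```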

Hopefully, this helps you to understand this part. :]

Hi,

Thank you for the prompt reply. Yes, I made the connection as per your example and got the first hiragana (or katakana?) page, but it wasn’t showing the Prompt-Hint-Answer UI. I played around, dragged from the Outlet Collections in the Connections Inspector to the Tap Gesture Recognizer object, and now it works.

I was also wondering: when I’m finished with these language app lessons (Chapters 3-9), I would like to build one for learning Indonesian, but I would also like to embed audio so the user hears the pronunciation. To do that, will I need to import AVFoundation into QuestionGroupData (after adding it in the main target under Build Phases > Link Binary With Libraries > AVFoundation.framework), add a “play” label, make all the connections, and add audio files into the public static func array (e.g. Question(answer: “a”, hint: nil, prompt: “あ”, play: “audio1.mp3”))? Would that work?

h

No, you shouldn’t import AVFoundation into QuestionGroupData, Question, etc. These are models, and they shouldn’t be responsible for actually playing the audio, which is what AVFoundation gives you.

You will, however, need to add a file path/name or URL for where the audio is located on the Question model, e.g. a new property for url or fileName (depending on whether you intend to bundle the audio files locally or fetch them from a remote URL).
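For example (just a sketch — the property name `audioFileName` is my own, and your Question type may be declared differently in the chapter), it might look like this:

```swift
import Foundation

public struct Question {
  public let answer: String
  public let hint: String?
  public let prompt: String

  // New: where the pronunciation audio lives. A file name works for
  // audio bundled with the app; swap in a URL if you fetch it remotely.
  public let audioFileName: String?
}

// An entry in your question array could then look like:
let question = Question(answer: "a", hint: nil, prompt: "あ", audioFileName: "audio1.mp3")
```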

You’ll need to import AVFoundation into a controller. Unlike video, where AVKit provides AVPlayerViewController as a ready-made view controller for playback, there isn’t a built-in view controller for playing audio, unfortunately.

However, you can use AVPlayer, which provides support for playing audio, and write your own view controller for the UI that interacts with it (play button, etc.). See this answer on StackOverflow for good suggestions on how to do this.
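As a minimal sketch (the type and method names here are mine, not from the book), playing a bundled audio file could look something like this:

```swift
import AVFoundation

// A tiny helper your view controller could own; call play(fileNamed:)
// from a "play" button's IBAction.
final class PronunciationPlayer {

  private var player: AVPlayer?

  func play(fileNamed fileName: String) {
    // Looks up an audio file bundled with the app, e.g. "audio1.mp3".
    guard let url = Bundle.main.url(forResource: fileName, withExtension: nil) else {
      return
    }
    player = AVPlayer(url: url)
    player?.play()
  }
}
```

Keeping the AVPlayer in a stored property (rather than a local variable) matters, because a local player would be deallocated before the audio finishes playing.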

Note that at some point, you’ll find it’s beneficial to get the models from a remote service, instead of having them bundled within the app or provided via ad-hoc URLs. I’d suggest using the MVC-N pattern to separate your networking out into its own client, instead of putting this logic in the view controller. You’d also need a web service capable of providing these models as well. This is out of scope for the Foundation chapters here, though.
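Very roughly, and only as an illustration of the idea (the URL would come from your own service, and this assumes your model types conform to Decodable), an MVC-N style client might look like:

```swift
import Foundation

// Keeps networking out of the view controllers: they just call the client
// and receive decoded models back.
final class QuestionGroupClient {

  func fetchQuestionGroups(from url: URL,
                           completion: @escaping (Result<[QuestionGroup], Error>) -> Void) {
    URLSession.shared.dataTask(with: url) { data, _, error in
      if let error = error {
        completion(.failure(error))
        return
      }
      guard let data = data else { return }
      do {
        let groups = try JSONDecoder().decode([QuestionGroup].self, from: data)
        completion(.success(groups))
      } catch {
        completion(.failure(error))
      }
    }.resume()
  }
}
```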

Hi Joshua,

Thank you kindly.

This is plenty for me to work and experiment with.

h