
I am using AVSpeechSynthesizer in Swift to read directions and heading instructions to visually impaired users.

Yet, after some time the app crashes with:

    -[AXSpeechAction retain]: message sent to deallocated instance 0x1c37e2b0

Of course that class is not part of my project, and I do not even know what creates and releases it.
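
One common cause of this kind of over-release crash (an assumption here, since the question includes no code) is keeping the synthesizer in a local variable, so that it is deallocated while the speech machinery is still running. A minimal sketch, using a hypothetical DirectionsReader class, of keeping a strong reference instead:

    import AVFoundation

    class DirectionsReader {
        // Keep the synthesizer alive as a property: a local instance may be
        // deallocated mid-speech, which can surface as crashes like
        // "message sent to deallocated instance" in Apple's speech classes.
        let speechSynth = AVSpeechSynthesizer()

        func read(text: String) {
            let utterance = AVSpeechUtterance(string: text)
            speechSynth.speakUtterance(utterance) // utterances are queued by default
        }
    }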

  • Thanks for the correction; please take away the -2 evaluation, as this subject is quite new. – Fabrizio Bartolomucci Aug 04 '15 at 11:15
  • Have you considered using `UIAccessibilityAnnouncementNotification` (e.g. see [here](http://stackoverflow.com/questions/31776907/is-there-any-way-to-have-voiceover-read-a-label-on-command))? Otherwise, assuming you have already googled and found no answer, without a more complete code sample, it is hard to help. I would assume you might be setting an object to a weak property which then gets deallocated, or maybe you are calling some `AVSpeech*` methods from wrong threads (see e.g. [here](http://stackoverflow.com/questions/28544201/uiimageasset-retain-message-sent-to-deallocated-instance))? – Boris Dušek Aug 04 '15 at 12:28
  • Actually I have no closures in this specific file. One option would be using UIAccessibilityAnnouncementNotification as I did everywhere else, of course, but in that case I would surrender to the system any control over the connection between the different spoken sentences, with no idea of what happens when an announcement arrives while the system is still playing a previous one. – Fabrizio Bartolomucci Aug 04 '15 at 13:39
  • My tests showed that all new messages get discarded, instead of being queued as in AVSpeechSynthesizer. Of course, if I do not find a better solution, I shall switch to that. As a matter of fact, speaking the directions, which is also available to other users, does not seem to crash, but that could depend on the sheer number of successive requests I make either way. – Fabrizio Bartolomucci Aug 04 '15 at 13:39
  • You can use [`UIAccessibilityAnnouncementDidFinishNotification`](https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIAccessibility_Protocol/#//apple_ref/c/data/UIAccessibilityAnnouncementDidFinishNotification) to get notified when an announcement you have posted has finished speaking, so that you know not to submit another announcement before the previous has finished speaking. Not saying that it would be easy to implement it that way, but it would be possible. – Boris Dušek Aug 04 '15 at 13:44

1 Answer


OK, I turned to UIAccessibilityAnnouncementNotification for handling the accessibility announcements; of course the issue with uttering the turn-by-turn directions for the other users remains. This is how I implemented the function:

    // Properties assumed on the containing class:
    // var activeAnnouncement = false
    // let speechSynth = AVSpeechSynthesizer()
    // var utterance = AVSpeechUtterance()

    // Registered for UIAccessibilityAnnouncementDidFinishNotification, so the
    // flag is cleared once VoiceOver has finished speaking an announcement.
    func announcementFinished(notification: NSNotification) {
        activeAnnouncement = false
    }

    func read(text: String, onlyAccessible: Bool) {
        println("reading \(text)")
        // With VoiceOver running, post an announcement, but only if the previous
        // one has finished: new announcements are discarded, not queued.
        if UIAccessibilityIsVoiceOverRunning() && !activeAnnouncement {
            UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification, text)
            activeAnnouncement = true
        }
        // Without VoiceOver, fall back to the synthesizer, which queues
        // utterances on its own.
        if !onlyAccessible && !UIAccessibilityIsVoiceOverRunning() {
            utterance = AVSpeechUtterance(string: text)
            speechSynth.speakUtterance(utterance)
            println("at the end of the if")
        }
    }
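
For this to work, announcementFinished also has to be registered as an observer, which the snippet above does not show; a sketch, assuming the code lives in a UIViewController and using the same Swift 1.x-era APIs:

    override func viewDidLoad() {
        super.viewDidLoad()
        // Reset activeAnnouncement when VoiceOver finishes speaking
        // the announcement we posted.
        NSNotificationCenter.defaultCenter().addObserver(self,
            selector: "announcementFinished:",
            name: UIAccessibilityAnnouncementDidFinishNotification,
            object: nil)
    }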
  • In fact I see this string in the AVSpeechSynthesizer definition that I did not totally grasp: /* AVSpeechUtterances are queued by default. If an AVSpeechUtterance is already enqueued or is speaking, this method will raise an exception. */ – Fabrizio Bartolomucci Aug 04 '15 at 14:43
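
That header comment means you cannot hand the same AVSpeechUtterance instance to the synthesizer while it is still enqueued or speaking; to repeat a phrase right away, create a fresh instance. A small illustration (not from the original post):

    let synth = AVSpeechSynthesizer()
    let utterance = AVSpeechUtterance(string: "Turn left")
    synth.speakUtterance(utterance)   // fine: enqueued and spoken
    // synth.speakUtterance(utterance) // would raise: this instance is still enqueued
    synth.speakUtterance(AVSpeechUtterance(string: "Turn left")) // fine: new instance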