3

Problem:

I have a UITextField side by side with a UIButton that sends the message. When the user presses the send button I perform this simple action:

- (IBAction)sendMessage: (id)sender {
   [self.chatService sendMessage: self.messageTextField.text];
   self.messageTextField.text = @""; // here I get exception
}

Now, when the user starts dictation from the keyboard, presses "Done" on the dictation view (keyboard) and immediately presses the send button, I get the exception "Range or index out of bounds".

Possible solution:

I've noticed that other applications disable their "send" button while the speech recognition server is processing data, i.e. exactly between two events: the user pressing "Done" and the results appearing in the text field. I'd like to solve it in the same manner.

I have trouble finding in the documentation where such a notification can be received. I've found the UITextInput protocol, but this is not what I need.

What have I tried:

  1. Simply catch and ignore the exception. The crash didn't occur, but the virtual keyboard became completely unresponsive.
  2. Disabling the send button when [UITextInputMode currentInputMode].primaryLanguage is equal to @"dictation" (a sketch of this wiring appears after this list). The UITextInputCurrentInputModeDidChangeNotification notification, which reports the end of dictation mode, arrives before the dictation service commits the new value, so I'm still able to click the send button and cause the exception. I could add a delay after primaryLanguage loses the @"dictation" value, but I don't like this approach; most probably the required delay depends on how responsive the speech recognition service is.
  3. I've added a bunch of actions for different events (these events looked promising: UIControlEventEditingDidBegin, UIControlEventEditingChanged, UIControlEventEditingDidEnd, UIControlEventEditingDidEndOnExit). The good thing is that UIControlEventEditingChanged looks like it fires at exactly the desired moments: when the user presses "Done" on the dictation view and when the service commits or ends dictation. So this is my best concept so far. The bad thing is that it also fires in other cases, and there is no information to distinguish which case triggered it, so I don't know whether I should disable the button, enable it, or do nothing.
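
A minimal sketch of how concept 2 can be wired up, assuming the same view controller and sendButton outlet used in the answers below; as described above, this alone is not enough, because the notification arrives before the dictation result is committed:

-(void)viewDidLoad {
  [super viewDidLoad];

  // Concept 2: watch for input-mode changes and disable the send button
  // while the current primary language is "dictation".
  [[NSNotificationCenter defaultCenter]
      addObserver: self
         selector: @selector(inputModeDidChange:)
             name: UITextInputCurrentInputModeDidChangeNotification
           object: nil];
}

-(void)inputModeDidChange:(NSNotification *)notification {
  NSString *primaryLanguage = [UITextInputMode currentInputMode].primaryLanguage;
  // Re-enabled too early: the mode leaves "dictation" before the text is updated.
  self.sendButton.enabled = ![primaryLanguage isEqualToString: @"dictation"];
}

-(void)dealloc {
  [[NSNotificationCenter defaultCenter] removeObserver: self];
}
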
  • Your notes section indicates that UITextInputCurrentInputModeDidChangeNotification fails because the end is reported too early. What if you use that approach, but upon input mode changing back from "dictation", do a performSelector with 0.0 delay to clear the text. The notification is probably blocking in the SDK. Maybe you just need to delay till the next turn of the run loop? – danh Jun 04 '14 at 17:07
  • Yes, I've thought about this. I don't like it (see edit), but it looks like the only solution. – Marek R Jun 04 '14 at 18:27
  • Maybe you could submit the call to the main queue via GCD, rather than performSelector with a delay? That may put it on the queue after the dictation finishes its update, without suffering the speculative delay. – DavidA Jun 04 '14 at 18:35
  • Yeah, if you need a delay, like N seconds, then I agree that's asking for trouble, because you have a real race condition and the delay could either be too long or sometimes crash-inducing. But I'm suggesting a 0.0 delay, just to get your message to the back of the line. If that works at all, I think it will work reliably. – danh Jun 04 '14 at 18:35
  • Like I wrote in the edit, this test case already provides that delay (the user has to do two actions, and the first action changes `primaryLanguage`), so a zero delay is not a solution. – Marek R Jun 04 '14 at 18:38

2 Answers

6

I finally found the ultimate solution.

It is simple and elegant, it will pass Apple review, and it always works. Just react to UIControlEventEditingChanged and detect the presence of the Unicode object replacement character (U+FFFC), which dictation temporarily inserts into the text as a placeholder while results are pending, like this:

-(void)viewDidLoad {
  [super viewDidLoad];

  [self.textField addTarget: self
                     action: @selector(eventEditingChanged:)
           forControlEvents: UIControlEventEditingChanged];
}

-(IBAction)eventEditingChanged:(UITextField *)sender {
  // U+FFFC is the object replacement character used as the dictation placeholder;
  // keep the send button disabled while it is still present in the text.
  NSRange range = [sender.text rangeOfString: @"\uFFFC"];
  self.sendButton.enabled = range.location == NSNotFound;
}


Old approach

Finally I found some solution. This is improved concept no. 3 mixed with concept no. 2 (based on that answer).

-(void)viewDidLoad {
  [super viewDidLoad];

  [self.textField addTarget: self
                     action: @selector(eventEditingChanged:)
           forControlEvents: UIControlEventEditingChanged];
}

-(IBAction)eventEditingChanged:(UITextField *)sender {
  NSString *primaryLanguage = [UITextInputMode currentInputMode].primaryLanguage;

  if ([primaryLanguage isEqualToString: @"dictation"]) {
    self.sendButton.enabled = NO;
  } else {
    // restore normal text field state
    self.sendButton.enabled = self.textField.text.length>0;
  }
}

- (IBAction)sendMessage: (id)sender {
   [self.chatService sendMessage: self.messageTextField.text];
   self.messageTextField.text = @"";
}

- (BOOL)textFieldShouldReturn:(UITextField *)textField {
  if (self.textField.text.length == 0 || !self.sendButton.enabled) {
    return NO;
  }
  [self sendMessage: textField];
  return YES;
}

// other UITextFieldDelegate methods ...

Now the problem doesn't appear, since the user is blocked exactly when it could happen (between the user pressing the "Done" button on the dictation view and the results coming back from the speech recognition service).
The good thing is that only public API is used (only the @"dictation" literal could be a problem, but I think it should be accepted by Apple).

  • Hello Marek, Have you encountered any issues with your final solution? Are you running this code in a production app? – josh-fuggle Jul 27 '15 at 01:00
  • The application is already on the App Store and I haven't seen any new bugs for that, but the application is not supported anymore (the client decided to launch a new, more advanced product), so I don't get new bug reports. It should work in all cases except when you add some graphics to the text (for example emoticons); if you do, the detection logic has to be more complicated (detection is based on the fact that speech recognition signals its progress by temporarily adding an icon to the text, and every icon/graphic is marked in the text by a replacement character). – Marek R Jul 27 '15 at 08:07
  • Your new approach no longer works in iOS 9. The old approach of adding a target for `UIControlEvents.EditingChanged` and then checking `sender.textInputMode?.primaryLanguage == "dictation"` does work though. Haven't gone through review yet, but I don't see why it would be an issue. – Nick Yap Jul 12 '16 at 19:03
  • At least they fixed the crash which could happen when you modify text during speech recognition in iOS 8. That was the reason I needed this solution. – Marek R Jul 13 '16 at 07:03
1

In iOS 7 Apple introduced TextKit, so there is new information relevant to this question: NSAttachmentCharacter = 0xFFFC, "used to denote an attachment", as the documentation says.

So, if your deployment target is 7.0 or later, a better approach is to check the attributed string for attachments.
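
A minimal sketch of that check, assuming the same target/action setup and sendButton outlet as in the accepted answer (whether the dictation placeholder is exposed as a text attachment here is an assumption of this sketch):

-(IBAction)eventEditingChanged:(UITextField *)sender {
  // Look for any text attachment in the attributed text instead of
  // searching the plain text for the raw U+FFFC character.
  __block BOOL hasAttachment = NO;
  NSAttributedString *text = sender.attributedText;
  [text enumerateAttribute: NSAttachmentAttributeName
                   inRange: NSMakeRange(0, text.length)
                   options: 0
                usingBlock: ^(id value, NSRange range, BOOL *stop) {
                  if (value != nil) {
                    hasAttachment = YES;
                    *stop = YES;
                  }
                }];
  self.sendButton.enabled = !hasAttachment;
}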

riskpp
  • This is not an answer to this question, but more like an improvement comment on my answer. Thanks anyway. – Marek R Jan 22 '15 at 14:53
  • That constant requires the use of a more complex API (`NSCharacterSet`, `NSRange`); I prefer to use a string literal, it's simpler. – Marek R Jan 22 '15 at 16:10