
I am optimizing my code to support both iOS 8 and iOS 7, and the app works perfectly on iOS 8. It was originally built to support only iOS 8, but I have been adjusting minor settings so it supports iOS 7 as well. The only problem I am experiencing now is that on iOS 7 the controls in the bottom right corner are ignored. The home button in the bottom left corner works without problems, and if I move the two bottom-right controls to the bottom left, they work as well. It is as if there is a "mask" on top of the right side of the page view controller. It is a book application, so the page view controller supports swiping between pages like a normal book app layout.

So my question is basically whether there is a way to remove this disabled overlay, or whether I should adjust some other settings to receive these inputs. I have also tried to NSLog taps when the user hits the bottom-right controls, but no logs appear; they only appear when the controls are on the left side of the screen.
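A minimal sketch of how one could log which view actually ends up receiving a touch (the class name below is just an example, not something from my project) is to subclass the container view and override hitTest:withEvent: on it:

#import <UIKit/UIKit.h>

// Debug-only view subclass: logs which view each touch is delivered to,
// so an invisible view "masking" the right side would show up in the log.
@interface TouchLoggingView : UIView
@end

@implementation TouchLoggingView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hit = [super hitTest:point withEvent:event];
    NSLog(@"Touch at %@ delivered to: %@", NSStringFromCGPoint(point), hit);
    return hit;
}

@end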

I also tried the following: removing the tap recognizer from the page view controller, but that did not work either.

self.view.gestureRecognizers = self.pageViewController.gestureRecognizers;

// Find the tap gesture recognizer so we can remove it!
UIGestureRecognizer* tapRecognizer = nil;
for (UIGestureRecognizer* recognizer in self.pageViewController.gestureRecognizers) {
    if ( [recognizer isKindOfClass:[UITapGestureRecognizer class]] ) {
        tapRecognizer = recognizer;
        break;
    }
}

if ( tapRecognizer ) {
    [self.view removeGestureRecognizer:tapRecognizer];
    [self.pageViewController.view removeGestureRecognizer:tapRecognizer];
}


EDIT

@Rory McKinnel - Here is the screenshot of the view hierarchy.

[Screenshot of the view hierarchy]

Here is a test where I put the close button of another view in the same position as the play button; that button still works. It seems the problem only occurs in the page view.

[Screenshot of the test with the close button placed at the play button's position]

UPDATE TO ISSUE

I played around with some logs to see which areas are affected. The app has a function that lets the user tap a word in the book to have it highlighted and read aloud from an audio file. This tap event also only works on the left side of the screen. If I tap a word in the "ignored" area, the event is not executed and the function's log statement never fires.

I have measured an estimate of how much of the screen each side covers. When I tap in the "ignored" part of the screen, the only thing that happens is the swipe to the next page.

[Screenshot showing the estimated working and "ignored" areas of the screen]

For example, I can tap and hear the word "Adipisicing" but not the word "elit" in the first line in the image.

Corfitz.
  • Hard to tell without being able to see your view hierarchy. If the buttons are inside a view, it could be that that view has 0 width due to constraints not working for some reason, but you can still see the buttons if it's not clipping. Hence the events do not get captured but you see the buttons. Easy test is to set the background of any containing view to something like red. If you see it then it's OK, otherwise not. – Rory McKinnel Jun 09 '15 at 18:43
  • @RoryMcKinnel I have added a screenshot of the view hierarchy. As far as I can see, the two buttons should be on the very top. It is underneath the navigation item though, but I don't seem to be able to move it above it. - I also tried to put a red background as you suggested and it applied the background behind the button image correctly, so it seems that the view should be wide enough. `View mode: Scale to fill` – Corfitz. Jun 10 '15 at 05:49
  • It could relate to orientation, and the input is assuming the hardware is portrait which would show as button presses not working. Have you got any code that adjusts/sets orientation specific to iOS7 versus iOS8, adjusting the screen etc.... Hence why moving the buttons left suddenly works. My guess is that they start working when they get portrait width from the left. – Rory McKinnel Jun 10 '15 at 06:48
  • @RoryMcKinnel - I have multiple views in the app and I tried to place a back button in the bottom right corner on some of the other views. Those buttons are working fine without problems - I don't know if this would still work if the orientation was a problem. Check update. – Corfitz. Jun 10 '15 at 07:37
  • @Dimser Did you check the `userInteractionEnabled` of the right buttons? – Bannings Jun 10 '15 at 07:55
  • @Bannings Yes.. Besides the buttons are working fine in the left side of the screen. – Corfitz. Jun 10 '15 at 08:23
  • @Dimser Can you post your demo project on GitHub? – Bannings Jun 10 '15 at 08:27
  • @Bannings Unfortunately I am not allowed to do so.. It is not my code, I am simply trying to help optimizing it for a client. – Corfitz. Jun 10 '15 at 08:46
  • @Dimser It is likely a single screen issue. See how far to the left and up you have to move the buttons to get them to work. That will tell you the affected area. Also check if there is any code implementing anything like `pointInside` to make decisions on which subviews receive events. – Rory McKinnel Jun 10 '15 at 09:47
  • @Dimser Saw your update. So the code that is masking off the area may well be using [[UIScreen mainScreen] bounds] or something similar to work out the touch frame. In your one it looks like it has confused Portrait with Landscape. In iOS8 UIScreen bounds changed to be interface orientated rather than device orientated. This might explain why iOS7 is the one not working. See http://stackoverflow.com/questions/24150359/is-uiscreen-mainscreen-bounds-size-becoming-orientation-dependent-in-ios8 – Rory McKinnel Jun 10 '15 at 10:07
  • @RoryMcKinnel it might actually be like that. should I use something like this then? http://stackoverflow.com/a/25088478/1421945 And sorry for asking: "Where should I put it?" – Corfitz. Jun 10 '15 at 10:49
  • @Dimser Yes that is the kind of thing you need. There must be a piece of code somewhere which is intercepting the touches and deciding if they should be passed on. It is most likely to be in code for the outermost view. Use the search tool and search for `UIScreen`, `pointInside` and see if anything comes up that looks related. You need to correct the code where it works out the frame area to intercept touches. Assuming that is what is wrong. – Rory McKinnel Jun 10 '15 at 10:59
  • @RoryMcKinnel `pointInside` gives nothing, but `UIScreen` gives three results. Although the three results are in `AFHTTPClient.m` and `AFImageRequestOperation.m`, which is not code I have written though. – Corfitz. Jun 10 '15 at 11:12
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/80167/discussion-between-rory-mckinnel-and-dimser). – Rory McKinnel Jun 10 '15 at 11:13

1 Answer


Following the discussion, the issue seems to relate to adding a child controller while in landscape mode, with the child by default using a frame which is in portrait. This seems to stem from the difference between iOS 7 and iOS 8 in how screen bounds are defined: device-oriented on iOS 7, interface-oriented on iOS 8. Hence on iOS 7, input was restricted to the portrait-sized area of the child controller.

The solution was to explicitly set the child controller's frame with the width and height swapped for iOS 7:

if ([[[UIDevice currentDevice] systemVersion] floatValue] < 8.0) {
    // On iOS 7 the child's default frame is portrait-sized, so swap
    // width and height to cover the full landscape area.
    self.pageViewController.view.frame = CGRectMake(self.view.frame.origin.x,
                                                    self.view.frame.origin.y,
                                                    self.view.frame.size.height,
                                                    self.view.frame.size.width);
}
Rory McKinnel
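As a side note (not from the original answer), the same iOS 7 / iOS 8 difference can also be handled wherever [[UIScreen mainScreen] bounds] is read, instead of swapping the child controller's frame. A rough helper sketch, with a made-up name:

#import <UIKit/UIKit.h>

// Hypothetical helper: returns screen bounds in interface orientation on both
// iOS 7 (bounds are device-oriented, i.e. always portrait) and iOS 8+ (bounds
// are already interface-oriented).
static CGRect InterfaceOrientedScreenBounds(void) {
    CGRect bounds = [[UIScreen mainScreen] bounds];
    BOOL preiOS8 = [[[UIDevice currentDevice] systemVersion] floatValue] < 8.0;
    BOOL landscape = UIInterfaceOrientationIsLandscape(
        [[UIApplication sharedApplication] statusBarOrientation]);
    if (preiOS8 && landscape) {
        // Swap width and height so landscape code sees the full width on iOS 7.
        bounds.size = CGSizeMake(bounds.size.height, bounds.size.width);
    }
    return bounds;
}

Unlike the unconditional swap in the answer, this checks the status-bar orientation, so it would also behave correctly if the book were ever shown in portrait.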