4

I want to know when a user has touched anywhere on the screen of my app.

I have looked into using -(UIResponder *)nextResponder, but unfortunately this will not work, as I also reload a table automatically, so this gets triggered when that occurs.

I have also tried a gesture recognizer with the following code, but this will only recognise touches on the view, whereas I have many buttons the user will be using to operate the app. I would like to avoid adding a gesture recogniser, or code for this, to every button and segmented control I have on the screen.

UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapOnView:)];
[self.mainView addGestureRecognizer:tap];

- (void)tapOnView:(UITapGestureRecognizer *)sender
{
    //do something
}

I have also tried -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, but this has the same issue as the gesture recognizer.

I was wondering if there is any way I could achieve this. I was hoping I might be able to recognise the type of event from within nextResponder, and then detect whether it came from a button, for example.

EDIT: The reason I am working on this is that my app needs to stay active and the screen cannot be locked (so I have disabled screen locking). To avoid excessive use of power, I need to dim the screen, but then return the brightness to the original level once the app is touched. I need this feature to occur on only one of my view controllers.

Remixed123

4 Answers

11

As mentioned by Ian MacDonald, using hitTest:withEvent: is a great solution for detecting user interaction on an app-wide scale, including when buttons, text fields, etc. are selected.

My solution was to subclass UIWindow and override the hitTest:withEvent: method.

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {

   // do your stuff here

   // return nil if you want to prevent interaction with UI elements
   return [super hitTest:point withEvent:event];
}
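
A minimal sketch of one way to wire up such a subclass (the class name TouchSensingWindow is illustrative; this assumes the standard window property on the app delegate, which also works with a storyboard-based app):

// TouchSensingWindow.h
@interface TouchSensingWindow : UIWindow
@end
// TouchSensingWindow.m contains the hitTest:withEvent: override shown above.

// AppDelegate.m
#import "TouchSensingWindow.h"

// Return the custom window from the lazily created window property,
// so UIKit uses it instead of a plain UIWindow.
- (UIWindow *)window {
    if (!_window) {
        _window = [[TouchSensingWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    }
    return _window;
}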
danfordham
  • Since views "bubble up" touch events, this was the only viable solution I could find. Thank you! – Josh Bernfeld Nov 26 '17 at 06:44
  • Can you provide more info about how to use the UIWindow subclass, like initializing it and showing it over the current window? – Satyam Oct 07 '18 at 12:02
3

You could attach your UITapGestureRecognizer to your [[UIApplication sharedApplication] keyWindow].
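
For example, a minimal sketch of that first suggestion (the selector name tapAnywhere: is illustrative):

UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapAnywhere:)];
tap.cancelsTouchesInView = NO; // so buttons and other controls still receive their touches
[[[UIApplication sharedApplication] keyWindow] addGestureRecognizer:tap];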

Alternatively, you could override hitTest: of your root UIView.

Is there a particular task you are hoping to accomplish? There may be a better way than assigning an "anywhere" gesture.

Edit: Use hitTest:.

@interface PassthroughView : UIView
@property (readonly) id target;
@property (readonly) SEL selector;
@end
@implementation PassthroughView
- (void)setTarget:(id)target selector:(SEL)selector {
  _target = target;
  _selector = selector;
}
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
  // Report the touch, then return nil so it passes through to the views underneath.
  [_target performSelector:_selector];
  return nil;
}
@end


@implementation YourUIViewController {
  PassthroughView *anytouchView;
}
- (void)viewDidLoad {
  [super viewDidLoad];
  // Add this at the end so it's above all other views.
  anytouchView = [[PassthroughView alloc] initWithFrame:self.view.bounds];
  [anytouchView setAutoresizingMask:UIViewAutoresizingFlexibleWidth|UIViewAutoresizingFlexibleHeight];
  [anytouchView setTarget:self selector:@selector(undim)];
  [anytouchView setHidden:YES];
  [self.view addSubview:anytouchView];
}
- (void)undim {
  [anytouchView setHidden:YES];
}
- (void)dim {
  [anytouchView setHidden:NO];
}
@end
Ian MacDonald
  • I have added extra detail to my question – Remixed123 Oct 14 '14 at 15:55
  • [[UIApplication sharedApplication] keyWindow] does not function with buttons – Remixed123 Oct 14 '14 at 15:59
  • I have updated my answer to show you how to use `hitTest:` to listen for touch events while still letting your application behave as normal. – Ian MacDonald Oct 14 '14 at 16:11
  • Thanks, this works, but I will need to look into the other question, as it looks like it may fit better with my solution. – Remixed123 Oct 14 '14 at 16:23
  • I think this solution might work only in "YourUIViewController". How can I capture tap events in the whole application? We can't keep adding PassthroughView in all the view controllers. Any solution? – Satyam Oct 07 '18 at 12:04
  • @Satyam There aren't many use cases for having an application-wide tap detector because event handlers are usually context-dependent. If you really need this, you could add it to the root view controller. – Ian MacDonald Oct 09 '18 at 13:12
2

Your edit adds more clarity to your question.

The reason I am working on this is that my app needs to stay active and the screen cannot be locked (so I have disabled screen locking). To avoid excessive use of power, I need to dim the screen, but then return the brightness back to the original level once the app is touched.

Since you are controlling the screen brightness, you can present a transparent view controller on top of your root controller before dimming the screen. It does only one job: listen for a tap with a tap gesture recognizer. On tap, you can dismiss the view controller and restore the brightness to its previous level.

By doing so you don't have to worry about buttons being tapped, as they will sit below the transparent view controller. And since it is a whole new view controller sitting on top of the stack, you don't have to modify your existing code either.
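
A rough sketch of that idea (the class name and brightness values are illustrative; UIModalPresentationOverFullScreen, available from iOS 8, keeps the underlying views visible through the clear background):

@interface BrightnessOverlayViewController : UIViewController
@end

@implementation BrightnessOverlayViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    self.view.backgroundColor = [UIColor clearColor];
    // A single tap anywhere dismisses the overlay and restores brightness.
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    [self.view addGestureRecognizer:tap];
}

- (void)handleTap:(UITapGestureRecognizer *)sender {
    [[UIScreen mainScreen] setBrightness:1.0f]; // restore whatever level you saved before dimming
    [self dismissViewControllerAnimated:NO completion:nil];
}
@end

// Presenting it from the view controller that needs this behaviour:
BrightnessOverlayViewController *overlay = [[BrightnessOverlayViewController alloc] init];
overlay.modalPresentationStyle = UIModalPresentationOverFullScreen;
[self presentViewController:overlay animated:NO completion:^{
    [[UIScreen mainScreen] setBrightness:0.1f]; // dim once the overlay is in place
}];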

GoodSp33d
1

OK, I have had a similar problem before.

As I remember, I subclassed UIWindow for full-screen touch detection and made it the first responder.

Then I overrode the touch-handling methods in the subclass.
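
A rough sketch of what that might look like, overriding sendEvent: in the UIWindow subclass so every touch in the app, including touches on buttons, passes through it (the class name is illustrative):

@interface TouchObservingWindow : UIWindow
@end

@implementation TouchObservingWindow
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        // Every touch in the application funnels through here,
        // so react to it (e.g. restore brightness) before forwarding.
    }
    [super sendEvent:event]; // keep normal event delivery intact
}
@end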

You can also use code like the following to identify the control that has been touched.

#import <QuartzCore/QuartzCore.h>

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self.view setMultipleTouchEnabled:YES];
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {

    // Enumerate over all the touches 
    [touches enumerateObjectsUsingBlock:^(id obj, BOOL *stop) {
        // Get a single touch and its location
        UITouch *touch = obj;
        CGPoint touchPoint = [touch locationInView:self.view];
        ...
    }];
}

To disable locking of the screen, I used the code below:

[[UIApplication sharedApplication] setIdleTimerDisabled:YES];

I used the following calls to dim or restore the screen brightness:

[[UIScreen mainScreen] setBrightness:0.0f]; //and
[[UIScreen mainScreen] setBrightness:1.0f];
bllakjakk
  • -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event does not get triggered when a button is pressed. Am I missing something? – Remixed123 Oct 14 '14 at 16:30
  • I hope you are not forgetting the concept of passing event through responder chain. [self.nextResponder touchesBegan:touches withEvent:event]; – bllakjakk Oct 14 '14 at 16:35
  • Also, since [NSWindow makeFirstResponder:] is a window method, it only makes sense after you've added the corresponding view to the window. – bllakjakk Oct 14 '14 at 16:40
  • Pressing a button never enters -(void)touchesBegan: so I do not get the chance to pass the event through responder chain. It does enter when I press on views. – Remixed123 Oct 14 '14 at 16:45
  • Can you add a log in the subclassed UIWindow - (void)sendEvent:(UIEvent *)event { }. To verify the reception of touch event. – bllakjakk Oct 14 '14 at 16:56
  • I added the NSLog as suggested, but it does not get logged. I do not have a UIWindow in my app though. – Remixed123 Oct 14 '14 at 17:16
  • Did you mistype it? Every application has a UIWindow. My approach was based on subclassing UIWindow so you can capture full-screen events. I worked on an application similar to Rise Alarm and used this approach for handling the screen interaction. – bllakjakk Oct 14 '14 at 17:20
  • UIWindow is a property and instance in my appdelegate.h, but it is not used anywhere. My app delegate looks like this: @interface SSAppDelegate : UIResponder. Perhaps this is a storyboard thing. – Remixed123 Oct 14 '14 at 17:31