
I have a custom UIControl that contains a few other controls. In between those controls there is empty space, and the background of my UIControl needs to be transparent.

I need to catch all touch events that happen on my custom UIControl, even if they occur between the other controls (over transparent areas). I cannot use gesture recognizers; I need more control than they provide. Instead, I would like to register touch-handling functions like this:

myControl.addTarget(self, action: "handleTouchDown:event:", forControlEvents: UIControlEvents.TouchDown)

With this approach I receive touches that happen over non-transparent areas of myControl, but not those that happen over the transparent background.

I tried overriding hitTest:withEvent: in my custom control so that it does not check the alpha value. But hitTest:withEvent: is not even called when a touch happens over a transparent area of the control. I also replaced my control's layer with a custom CALayer and overrode hitTest: on that too, with no result (hitTest: on the layer does not seem to be called at all).
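For reference, a simplified version of the override I tried (it deliberately ignores alpha and only tests the bounds):

```objc
// Never gets called when the touch lands on a fully transparent area.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Only test the bounds; do not check the alpha value.
    return CGRectContainsPoint(self.bounds, point) ? self : nil;
}
```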


More details (EDIT)

To provide a perfect answer (and win the bounty), all you need to do is:

  1. Create simple app, add one UIControl (for example UIButton).
  2. Remove all content from the UIControl (the text from the UIButton) and make its background transparent (either set it to clear color or set the alpha channel to 0).
  3. Use the addTarget:action:forControlEvents: method to register for UIControlEvents.TouchDown events on the control. In the handler method, print something to the console.
  4. Run the app and press the control. Nothing gets printed to the console. Make it work. Do not use gesture recognizers; I need the granularity provided by addTarget:action:forControlEvents:. No hacky solutions, please. I know that setting the control's background alpha channel to 0.01 suddenly makes it work, but that is the kind of hack I do not want. Describe here what you did.
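For concreteness, the steps above can be sketched like this (Objective-C; the handler name is just illustrative):

```objc
// Steps 2-3: a contentless, fully transparent control.
UIControl *control = [[UIControl alloc] initWithFrame:CGRectMake(20, 100, 280, 80)];
control.backgroundColor = [UIColor clearColor];
control.alpha = 0.0; // the problematic case
[control addTarget:self
            action:@selector(handleTouchDown:event:)
  forControlEvents:UIControlEventTouchDown];
[self.view addSubview:control];
// Step 4: tapping the control never triggers handleTouchDown:event:.
```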
Rasto
  • If I set the background color of my `UIControl` to some color with alpha `0.01`, everything works fine and I get all the touch events I need. But that is a hack, and it influences the colors of views that are behind that transparent control. – Rasto Oct 30 '14 at 04:01
  • Have you tried `view.backgroundColor = [UIColor clearColor]`? – Michael Oct 30 '14 at 04:24
  • @Nikita Yes, I did indeed. It does not help. – Rasto Oct 30 '14 at 04:26
  • Are you sure that your `UIControl` is in the proper place in the view hierarchy? Check the order of your view hierarchy. – Michael Oct 30 '14 at 04:32
  • @Nikita If some other view were covering it, I would receive no touches for the opaque areas either. But I do receive touches for the opaque areas. This is not some strange behaviour - the controls normally behave like this. Try a simple project: place some `UIControl`, make it completely transparent and try `addTarget` to register for some touches. It will not work. Just stupid by design... – Rasto Oct 30 '14 at 04:41
  • I tried it using a `UIControl` and I am noticing the same problem. I also tried `hitTest:withEvent:` as per [this](http://stackoverflow.com/questions/17083102/can-i-create-a-totally-transparent-uiview-that-receives-touches) SO post; it also didn't work properly. Perhaps you can set the background color to white and set a very low alpha value? Or even try a `UIButton`? – Michael Oct 30 '14 at 05:06
  • This post may solve your problem - [Allowing interaction with a UIView under another UIView](http://stackoverflow.com/questions/1694529/allowing-interaction-with-a-uiview-under-another-uiview). – Michael Oct 30 '14 at 05:22
  • @Nikita That post is unrelated to my problem. I do not have overcovering view - imagine there is only one transparent view in hierarchy and I want to detect touches on it. – Rasto Oct 30 '14 at 10:19
    With the Gesture Recognizer Delegate method `- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch` you can specify precisely where the gesture will be passed to the recognizer. Couldn't you specify only the view of your control to receive the touch? (and deny the touch if it is over one of your sub-controllers.) – Tim Quinn Oct 31 '14 at 00:37
  • @TimQuinn In the *title* of my question it is stated that the solution may not use gesture recognisers (see the edit for the reason). – Rasto Nov 01 '14 at 22:27

3 Answers


Following your EDIT section:

https://github.com/soxjke/TransparentControl

1) If I set the background colour to +[UIColor clearColor], the touches work fine. So you have no need to do anything more; go ahead with clear color. (top button)
2) If I set alpha = 0, touches are not handled. OK (middle button)
3) To handle these touches there's a simple solution (bottom button): subclass UIButton (actually you can go with anything in the hierarchy up to UIView) and override hitTest:withEvent::

- (UIView*)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    return CGRectContainsPoint(self.bounds, point) ? self : nil;
}

PROFIT
4) If you need to go deeper, use

touchesBegan:withEvent:
touchesMoved:withEvent:
touchesEnded:withEvent:
touchesCancelled:withEvent:

on your UIResponder subclass, as Rob Glassey proposed in his answer.
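If the control contains sub-controls that should keep handling their own touches (as in the question), a variant of the hitTest:withEvent: override above, which first defers to the default hit-testing, might look like this (a sketch, not tested against the asker's exact hierarchy):

```objc
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Let the default implementation find a visible subview first,
    // so sub-controls still receive their own touches.
    UIView *hit = [super hitTest:point withEvent:event];
    if (hit != nil) {
        return hit;
    }
    // Fall back to the control itself for the transparent gaps.
    return CGRectContainsPoint(self.bounds, point) ? self : nil;
}
```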

P.S. I'll end with something off topic. I don't know what your task actually is, but saying that you can't use recognizers because you need "more control" over all touch events suggests that you may not know the full possibilities of gesture recognisers. So, judging from my experience, I'd say you're reinventing the wheel rather than building a good solution.

P.P.S. If the methods proposed by me and the other guys here don't work for you, check your control's userInteractionEnabled.

Appendix (view controller code to test with):

#import "ViewController.h"
#import "TransparentControl.h"

@interface ViewController ()

@property (weak, nonatomic) IBOutlet UIButton *buttonClearColor;
@property (weak, nonatomic) IBOutlet UIButton *buttonAlpha0;
@property (weak, nonatomic) IBOutlet TransparentControl *customButtonAlpha0;

@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self.buttonClearColor addTarget:self action:@selector(touchUpInside:) forControlEvents:UIControlEventTouchUpInside];
    [self.buttonClearColor addTarget:self action:@selector(touchUpOutside:) forControlEvents:UIControlEventTouchUpOutside];
    [self.buttonClearColor addTarget:self action:@selector(touchDown:) forControlEvents:UIControlEventTouchDown];

    [self.buttonAlpha0 addTarget:self action:@selector(touchUpInside:) forControlEvents:UIControlEventTouchUpInside];
    [self.buttonAlpha0 addTarget:self action:@selector(touchUpOutside:) forControlEvents:UIControlEventTouchUpOutside];
    [self.buttonAlpha0 addTarget:self action:@selector(touchDown:) forControlEvents:UIControlEventTouchDown];

    [self.customButtonAlpha0 addTarget:self action:@selector(touchUpInside:) forControlEvents:UIControlEventTouchUpInside];
    [self.customButtonAlpha0 addTarget:self action:@selector(touchUpOutside:) forControlEvents:UIControlEventTouchUpOutside];
    [self.customButtonAlpha0 addTarget:self action:@selector(touchDown:) forControlEvents:UIControlEventTouchDown];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
}

- (void)touchUpInside:(id)sender
{
    NSLog(@"%s", __PRETTY_FUNCTION__);
}

- (void)touchDown:(id)sender
{
    NSLog(@"%s", __PRETTY_FUNCTION__);
}

- (void)touchUpOutside:(id)sender
{
    NSLog(@"%s", __PRETTY_FUNCTION__);
}

@end
Petro Korienev

The documentation for hitTest:withEvent: mentions that totally transparent views are ignored, so it is possible that overriding hitTest:withEvent: itself is not enough to get around this, as whatever calls hitTest:withEvent: isn't calling it when the view is transparent.

Instead, I'd suggest you try dropping down to the lower-level UIResponder touch methods if you need access to the raw touch events no matter what (these are inherited by UIView and UIControl, so they are available to you).

They are:

touchesBegan:withEvent:
touchesMoved:withEvent:
touchesEnded:withEvent:
touchesCancelled:withEvent:

The first parameter is an NSSet of touches; the second is a UIEvent like the one you've been referring to in your other methods...

With this you don't need to add target-action; instead you override these methods on your custom control. These are lower level (and old-school in the extreme) but should give you total control over the touch events.
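A minimal sketch of what such overrides could look like on a UIControl subclass (the class name and logging are just illustrative):

```objc
// Sketch: overriding the UIResponder touch methods on a UIControl subclass.
@interface RawTouchControl : UIControl
@end

@implementation RawTouchControl

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self];
    NSLog(@"touch began at %@", NSStringFromCGPoint(location));
    [super touchesBegan:touches withEvent:event]; // keep target-action working
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touch ended");
    [super touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
}

@end
```

Calling super in each override keeps UIControl's own touch tracking (and thus target-action events) working alongside your custom handling.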

Rob Glassey
  • Thank you for the answer. I did try that as well. The `UIResponder` touch methods (`touchesBegan:withEvent:` etc.) are not called either (!) on transparent `UIControl`s. While falling down to `UIResponder` is a good idea, and the one I would expect to work, it unfortunately does not. – Rasto Nov 01 '14 at 23:41
  • In that case I think the problem might be touches getting blocked by the controls above (where you say 'contains a few other controls')? Do they handle their own touches? Does it still fail if you just have the one background control (removing those other controls)? What [Petro](http://stackoverflow.com/users/2392973/petro-korienev) mentioned in his answer about `userInteractionEnabled` needs to be looked at - it is very easy for views of any sort above to block touches, even if those views are transparent, if they aren't quite configured perfectly. – Rob Glassey Nov 02 '14 at 00:22

I subclassed UIControl with an empty drawRect method and it worked.

According to the docs, opaque is ignored by UIButton and some other controls, so it can't be used as the control point for this technique. Curiously, however, the default background color for a view is transparent (nil).

By subclassing UIControl and setting opaque = NO, you can write a drawRect: method that doesn't fully fill the frame, which allows for "transparent" regions without setting alpha = 0, so hitTest:withEvent: still picks up events. Since the element is a UIView, you should be able to add subviews and then implement your own drawRect: that calls the subviews' equivalent methods while not drawing the regions that are supposed to be transparent.

My basic ViewController elements; the ImageView is there to prove the control is transparent.

@implementation MyViewController
- (void)viewDidLoad {
    [ super viewDidLoad ];

    TransparentControl * transparentControl = [ [ TransparentControl alloc ] initWithFrame:CGRectMake( 0, 0, 400, 400 ) ];
    [ transparentControl addTarget:self action:@selector(printText) forControlEvents:UIControlEventTouchUpInside];

    // Create an image view below the button for proof the control is transparent
    UIImageView * imageView = [ [ UIImageView alloc ] initWithImage:[ UIImage imageNamed:@"BGImage.jpg" ] ];
    imageView.frame = self.view.frame;

    [ self.view addSubview:imageView ];
    [ self.view addSubview:transparentControl ];
}

- ( void )printText {
    NSLog( @"Hello, this is a transparent button." );
}
@end

And my transparent control.

@implementation TransparentControl

- ( instancetype )initWithFrame:( CGRect )frame {
    if( self = [ super initWithFrame:frame ] ) {
        self.opaque = NO;
        self.userInteractionEnabled = YES;
    }
    return self;
}

- ( void )drawRect:(CGRect)rect {
}
@end
pyj