I have a custom `UIControl` that contains a few other controls. In between those controls there is empty space, and the background of my `UIControl` needs to be transparent.
I need to catch all touch events that happen on my custom `UIControl`, even if they occur between the other controls (over the transparent areas). I cannot use gesture recognizers; I need more control than they provide. Instead I would like to register touch handling methods like this:
myControl.addTarget(self, action: "handleTouchDown:event:", forControlEvents: UIControlEvents.TouchDown)
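The corresponding action method on the target would look roughly like this (just a sketch; the method name only has to match the selector string used above):

```swift
// Hypothetical handler matching the "handleTouchDown:event:" selector
func handleTouchDown(sender: UIControl, event: UIEvent) {
    print("touch down on custom control")
}
```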
With this approach I receive the touches that happen over the non-transparent areas of myControl, but not those that happen over the transparent background.
I tried overriding `hitTest(_:withEvent:)` in my custom control so that it does not check the alpha value, but `hitTest(_:withEvent:)` is not even called when the touch happens over a transparent area of the control. I also replaced my control's layer with a custom `CALayer` and overrode `hitTest(_:)` on it as well, with no result (`hitTest(_:)` on the layer does not seem to be called at all).
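For clarity, this is roughly the kind of override I attempted (a minimal sketch; the class name is a placeholder):

```swift
import UIKit

class MyControl: UIControl {
    // Attempted override: treat any point inside the bounds as a hit,
    // regardless of transparency, so touches over clear areas are not dropped.
    override func hitTest(point: CGPoint, withEvent event: UIEvent?) -> UIView? {
        if CGRectContainsPoint(bounds, point) {
            return super.hitTest(point, withEvent: event) ?? self
        }
        return nil
    }
}
```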
More details (EDIT)
To provide a perfect answer (and win the bounty), all you need to do is:

- Create a simple app and add one `UIControl` (for example a `UIButton`).
- Remove all content from the `UIControl` (the text from the `UIButton`) and make its background transparent (either set it to the clear color or set the alpha channel to 0).
- Use the `addTarget(_:action:forControlEvents:)` method to register for `UIControlEvents.TouchDown` events on the control, and print something to the console in the handler method (see the sketch after this list).
- Run the app and press the control. Nothing gets printed to the console.

Make it work. Do not use gesture recognisers; I need the granularity provided by `addTarget(_:action:forControlEvents:)`. No hacky solutions, please. I know that setting the control's background alpha channel to 0.01 suddenly makes it work, but that is the kind of hack I do not want. Describe here what you did.
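A minimal sketch of the setup described above, assuming a plain single-view app (class and method names are placeholders):

```swift
import UIKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // A button with no title and a fully transparent background.
        let button = UIButton(frame: CGRect(x: 50, y: 100, width: 200, height: 100))
        button.backgroundColor = UIColor.clearColor()
        view.addSubview(button)

        // Register for TouchDown; with the transparent background the
        // handler below is never called, which is the problem.
        button.addTarget(self, action: "handleTouchDown:event:",
            forControlEvents: UIControlEvents.TouchDown)
    }

    func handleTouchDown(sender: UIControl, event: UIEvent) {
        print("touch down")
    }
}
```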