Context: I'm trying to create a fader-like widget that can have multiple instances in the same view, each of which can be controlled simultaneously by a different finger. I want to use Qt's gesture recognition system, but I also need some functionality above and beyond the standard Qt::PanGesture. To this end, I've subclassed both QGesture and QGestureRecognizer. In FooGestureRecognizer::recognize(...), I'm currently intercepting both QMouseEvents and QTouchEvents (for the time being, at least).
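To make the shape of this concrete, here is a minimal sketch of the kind of recognize() override I'm describing (simplified and hypothetical, not my exact code; FooGestureRecognizer is my QGestureRecognizer subclass):

```cpp
#include <QGestureRecognizer>
#include <QGesture>
#include <QEvent>

class FooGestureRecognizer : public QGestureRecognizer
{
public:
    QGestureRecognizer::Result recognize(QGesture *state,
                                         QObject *watched,
                                         QEvent *event) override
    {
        Q_UNUSED(state);
        Q_UNUSED(watched);

        // Handle both touch and mouse variants of each phase, since the
        // platforms deliver different mixes of the two.
        switch (event->type()) {
        case QEvent::TouchBegin:
        case QEvent::MouseButtonPress:
            return QGestureRecognizer::MayBeGesture;
        case QEvent::TouchUpdate:
        case QEvent::MouseMove:
            // Update the fader position from the event coordinates here.
            return QGestureRecognizer::TriggerGesture;
        case QEvent::TouchEnd:
        case QEvent::MouseButtonRelease:
            return QGestureRecognizer::FinishGesture;
        default:
            return QGestureRecognizer::Ignore;
        }
    }
};
```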
On Windows I only receive QMouseEvents - I handle them and everything works as expected (though obviously I don't have to deal with the multitouch problem when my input is from a physical mouse). The events I receive, in order:

- QEvent::MouseButtonPress
- A string of QEvent::MouseMoves
- QEvent::MouseButtonRelease
On Android, I receive a strange mix of QMouseEvents and QTouchEvents, in order:

- QEvent::TouchBegin
- QEvent::MouseButtonPress
- QEvent::MouseMove (with no actual change in position)
- Another QEvent::MouseButtonPress (not sure why I need another one)
- My actual string of QEvent::MouseMoves, as expected
- QEvent::MouseButtonRelease
The global attribute Qt::AA_SynthesizeMouseForUnhandledTouchEvents is true by default. Turning it off changes the events I receive to:

- QEvent::TouchBegin
- ...nothing else.
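For completeness, this is how I'm turning the attribute off (the standard QCoreApplication::setAttribute call, made before the application object is constructed):

```cpp
#include <QGuiApplication>

int main(int argc, char *argv[])
{
    // Stop Qt from synthesizing mouse events for touch events that no
    // widget handles. With this set to false, I only ever see TouchBegin.
    QCoreApplication::setAttribute(
        Qt::AA_SynthesizeMouseForUnhandledTouchEvents, false);

    QGuiApplication app(argc, argv);
    // ... create the view containing the fader widgets here ...
    return app.exec();
}
```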
Here's a precursor question, then: what can I do inside QGestureRecognizer::recognize() to tell Qt that I'm handling the QEvent::TouchBegin, and that it doesn't need to synthesize a QEvent::MouseButtonPress for me? Calling event->accept() doesn't appear to make any difference.
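Specifically, this is the sort of thing I've tried for the TouchBegin case (a simplified sketch of the attempt, with no visible effect on the synthesized mouse events):

```cpp
// Inside FooGestureRecognizer::recognize(...):
if (event->type() == QEvent::TouchBegin) {
    event->accept();  // mark the event as handled - seems to change nothing
    return QGestureRecognizer::MayBeGesture;
}
```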
The actual question: if (as it appears) Qt is synthesizing QMouseEvents from QTouchEvents, why do I see QEvent::MouseMove and QEvent::MouseButtonRelease but not QEvent::TouchUpdate or QEvent::TouchEnd?
Code is available, but in the interests of conciseness I've not included it here. Please ask if needed.