
I am developing a touch application for Windows 7 with Qt/QML. The end-user device has Windows 7's native touch behavior, i.e.: when touching the screen, a pointer appears at the last-touched point, and when the physical touch ends, Windows moves that pointer to the now-touched point and only then triggers the clicked event.

Compared to the behavior one knows from standard Windows mouse usage, this leads to a difference as soon as it comes to e.g. clicking a button: a mouse user expects the button to change to its pressed-down color when the mouse button goes down, and back to its default color when the mouse button goes up.

In my application, I want a customized kind of touch feedback: whatever is currently being touched should be highlighted by changed button colors, imitating a "mouse down" when the physical touch begins and a "mouse up" when the physical touch ends.

My application will run fullscreen, so one viable option would be to change the system's behavior on application start and restore the default on application exit.

Such behavior would effectively match the standard behavior on, for example, all Android devices I know.

I searched through the MouseArea and MultiPointTouchArea elements, trying to find a way to make the click-reaction behavior differ from the standard behavior. However, I did not even find a way to capture the beginning of the actual touch: everything I want to happen at the start of the touch actually happens when the touch ends.

Edit: It does not matter whether I use a QML Button or a MouseArea plus the MouseArea.pressed property: nothing is "pressed" before the finger leaves the screen and the onClicked() event is called.

Possibly related: Adobe AIR: touch screen doesn't trigger mouse down event correctly - but I did not find a way to access functions like Multitouch.inputMode (mentioned in the first reply there) from a native Qt application.

How can I achieve the described behavior for my application?

FourtyTwo
  • I don't know if I get your point but, following the sentence "In my application, I want [...]": what about using standard QML `Button` types with custom `Style`s? That way you can get the desired colors/effects. As for `MouseArea`, you have `pressed` and `released`; is that granularity in event handling not enough for your needs? Or is the generation of the point, which you mention in the first sentence, the main problem? – BaCaRoZzo Dec 12 '14 at 18:00
  • Added one sentence for clarification: It does not matter whether I use a QML button or a MouseArea plus the MouseArea.pressed property: nothing is "pressed" before the finger leaves the touch and the onClicked() event is called. And this is what I want to change ;-) Also added that snippet in the text above for future readers. – FourtyTwo Dec 15 '14 at 11:05
  • It's not "plus the pressed". It's just the `onPressed` and `onReleased` event handlers that you are interested in. The `onClicked` event handler is surely out of scope here and should not be defined/used in your `MouseArea` code. Still, the question is: does the combination of the two handlers and styles solve your problem? Probably not? – BaCaRoZzo Dec 15 '14 at 11:19
  • No, it does not: Windows 7's native touch behavior triggers the following sequence of events when stopping the actual touch: onPressed -> onReleased -> onClicked. – FourtyTwo Dec 16 '14 at 06:03
  • I see, finally. Having a `MouseArea` over the app and using its `onReleased` to generate events for the underlying GUI could be a (rough, I know) solution. Event acceptance and forwarding are quite straightforward. – BaCaRoZzo Dec 16 '14 at 08:35
  • But that would mean the same behavior as with a simple QML Button, so it doesn't make the changes I want, as far as I can see? – FourtyTwo Dec 17 '14 at 06:28
  • I don't know. :) Once you have clicked the `MouseArea` you can play around with the other handlers underneath in whichever way you want. But maybe I'm still missing something, sorry if I'm not helpful. – BaCaRoZzo Dec 17 '14 at 10:32
  • Windows generates mouse messages for touch input to provide basic touch support for applications that weren't specifically prepared for touch input. In your specific scenario it may be helpful to filter out those generated mouse messages (see [System Events and Mouse Messages](https://msdn.microsoft.com/en-us/library/windows/desktop/ms703320.aspx)). Qt5 can - in theory - distinguish between generated mouse messages and messages coming from hardware input. As always, Qt screws up every now and again, so you should install your own native message filter. – IInspectable Feb 09 '15 at 15:40

1 Answer


The solution for this issue is to disable "Press and Hold" for the application. As a system-wide setting, this can be done via Control Panel -> Pen and Touch -> Touch -> Press and Hold -> Settings -> uncheck 'Enable press and hold for right-clicking'.

The only solution I found to do this in native code is described here: http://msdn.microsoft.com/en-us/library/ms812373.aspx

I checked that this still works at least on Windows 7. To get it working with QML, I searched for the QWindow* in QQmlApplicationEngine::rootObjects() and used its winId() as an HWND. With that HWND, I called the TogglePressAndHold function from the link above, before app.exec().
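For future readers, here is a minimal sketch of that glue code. It is an assumption-laden illustration, not the exact code from the answer: it assumes the root object in `main.qml` is an `ApplicationWindow` (so the first root object is a `QWindow`), and instead of copying `TogglePressAndHold` verbatim from the MSDN article it inlines the documented `SetProp` approach with the `MicrosoftTabletPenServiceProperty` window property:

```cpp
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QWindow>
#include <windows.h>

// Disable the press-and-hold (right-click) gesture for one window by
// setting the Tablet PC window property described in the MSDN article.
static void disablePressAndHold(HWND hwnd)
{
    // The property name is documented by Microsoft; the value 1
    // corresponds to the TABLET_DISABLE_PRESSANDHOLD flag.
    LPCTSTR tabletProperty = TEXT("MicrosoftTabletPenServiceProperty");
    if (GlobalAddAtom(tabletProperty))
        SetProp(hwnd, tabletProperty, reinterpret_cast<HANDLE>(1));
    // For a fullscreen app that owns this window for its whole lifetime,
    // RemoveProp/GlobalDeleteAtom cleanup on exit is optional.
}

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QQmlApplicationEngine engine;
    engine.load(QUrl(QStringLiteral("qrc:/main.qml"))); // hypothetical path

    // With an ApplicationWindow as the QML root, the first root object
    // is a QWindow whose winId() is the native HWND.
    const auto roots = engine.rootObjects();
    if (!roots.isEmpty()) {
        if (QWindow *window = qobject_cast<QWindow *>(roots.first()))
            disablePressAndHold(reinterpret_cast<HWND>(window->winId()));
    }
    return app.exec();
}
```

Note that the property must be set on the top-level native window; if the fullscreen window is created lazily, make sure it exists (e.g. after `engine.load()`) before querying `winId()`.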

FourtyTwo