I have an Ubuntu machine with a 24" touchscreen, and the hardware works fine: I can move the mouse pointer, perform multi-finger gestures, and so on. Now I wonder whether it is possible to make a browser interpret these events as touch events rather than as mousedown, mousemove, etc. HTML5 has good support for touch and multi-touch, and I would like to develop web applications for this setup. Does anyone know how to do this?

I've tried the --enable-touch-events switch with no success, though it seems that this is only implemented in the MS Windows version.
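To make the question concrete, a minimal probe page like the one below is roughly what I want to get working; it just logs whether touch or mouse events reach the page (the element ids and styling are arbitrary placeholders, nothing specific to my setup):

<!DOCTYPE html>
<meta charset="utf-8">
<div id="pad" style="width:300px;height:300px;background:#ddd">touch or click here</div>
<pre id="log"></pre>
<script>
  // Log every touch/mouse event that reaches the pad, newest entry first.
  var pad = document.getElementById('pad');
  var log = document.getElementById('log');

  function report(e) {
    var line = e.type;
    if (e.touches) {                       // TouchEvent: count active touch points
      line += ' (' + e.touches.length + ' touches)';
    } else {                               // MouseEvent: log pointer position
      line += ' at ' + e.clientX + ',' + e.clientY;
    }
    log.textContent = line + '\n' + log.textContent;
  }

  ['touchstart', 'touchmove', 'touchend',
   'mousedown', 'mousemove', 'mouseup'].forEach(function (type) {
    pad.addEventListener(type, report, false);
  });
</script>

The idea is simply to see whether touchstart/touchmove ever fire, or whether everything arrives as mouse events. For reference, here is my xinput setup: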
~$ xinput -version
xinput version 1.6.0
XI version on server: 2.2
~$ xinput
⎡ Virtual core pointer id=2 [master pointer (3)]
⎜ ↳ Virtual core XTEST pointer id=4 [slave pointer (2)]
⎜ ↳ Advanced Silicon S.A CoolTouch(TM) System id=9 [slave pointer (2)]
⎜ ↳ USBest Technology SiS HID Touch Controller id=10 [slave pointer (2)]
⎜ ↳ Logitech USB Optical Mouse id=11 [slave pointer (2)]
⎜ ↳ MCE IR Keyboard/Mouse (nuvoton-cir) id=14 [slave pointer (2)]
⎣ Virtual core keyboard id=3 [master keyboard (2)]
↳ Virtual core XTEST keyboard id=5 [slave keyboard (3)]
↳ Power Button id=6 [slave keyboard (3)]
↳ Video Bus id=7 [slave keyboard (3)]
↳ Power Button id=8 [slave keyboard (3)]
↳ CHICONY HP Basic USB Keyboard id=12 [slave keyboard (3)]
↳ Nuvoton w836x7hg Infrared Remote Transceiver id=13 [slave keyboard (3)]
I've also read about building with the touch-UI flag, but I'm not sure whether that would help.