Re: Lenovo Yoga 700-11ISK touchscreen



On 3/18/18 10:36 PM, Carlos Garnacho wrote:

2. The touchscreen seems not to understand multitouch at all, so it also
does not allow scrolling most of the time. In Firefox or a terminal,
moving a finger over the window selects text. No two-finger scrolling is
possible (it is enabled and works with the touchpad).

Ultimately, reaction to (multi)touch is implemented by each specific
application. Any issues there should thus be reported in each
application's bug reporting system. NB, such issues are very probably
already filed, so look out for duplicates.

Is this different from the touchpad? I am asking because the touchpad settings
are system-wide and it works the same everywhere, but there is no such control
over the touchscreen behaviour, and I wonder why... I expected the touchpad and
touchscreen to share the same settings, or each to have its own settings, and
neither seems to be the case.

It is a whole lot different :). From the perspective of the
application, a touchpad is essentially the same as a mouse, and
applications have been able to deal with those for many years. As the
actions apply to a single concrete point (where you see the mouse
cursor), the windowing system (xorg/wayland) is able to send specific
events describing what happened (moved to these coordinates, pressed
this mouse button, scrolled by this much).
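
To illustrate that single-point path, here is a rough sketch of what such an
application sees, assuming GTK+ 3 and plain GDK pointer events (the handlers
are illustrative only, not anything a real browser does):

#include <gtk/gtk.h>

/* Sketch: the single-point mouse/touchpad path applications already
 * understand; the windowing system has resolved everything to one
 * concrete point and a ready-made delta. */

static gboolean
on_button_press (GtkWidget *widget, GdkEventButton *event, gpointer user_data)
{
  g_print ("button %u pressed at %.1f, %.1f\n",
           event->button, event->x, event->y);
  return FALSE;
}

static gboolean
on_scroll (GtkWidget *widget, GdkEventScroll *event, gpointer user_data)
{
  gdouble dx, dy;

  /* "Scrolled by this much" arrives precomputed. */
  if (gdk_event_get_scroll_deltas ((GdkEvent *) event, &dx, &dy))
    g_print ("scrolled by %.1f, %.1f\n", dx, dy);

  return FALSE;
}

static void
attach_pointer_handlers (GtkWidget *widget)
{
  g_signal_connect (widget, "button-press-event",
                    G_CALLBACK (on_button_press), NULL);
  g_signal_connect (widget, "scroll-event",
                    G_CALLBACK (on_scroll), NULL);
}

The widget needs GDK_BUTTON_PRESS_MASK and GDK_SMOOTH_SCROLL_MASK in its event
mask (gtk_widget_add_events()) for those signals to fire.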

Multitouch is peculiar in that the application is only given basic
information (new touchpoint at these coordinates, touchpoint moved
somewhere else, touchpoint was lifted) for *every* touchpoint. The
application is in charge of working out where each one applies and
giving them a meaning (a rough sketch of that bookkeeping follows the
examples below). Some examples:
- A touchpoint happening in a browser tab might eventually tap some
button in the webpage, or be used by the browser to start scrolling if
it moves past a certain distance (effectively cancelling other actions).
- Say you actually meant to zoom on the page, so a second touchpoint
on the webpage arrives a couple of milliseconds later. All previous
actions should be rendered ineffective, and the browser should see how
far apart those touchpoints are in order to start zooming based on that
initial distance, using the center of those 2 points as the zooming
anchor.
- But say you missed the webpage with that second touch, and it ends up
on the location bar. What should the app do in that case? I dunno, but
what it shouldn't do is zoom :).
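
To make that bookkeeping concrete, here is a rough sketch of what an
application has to do with raw touch events. It assumes GTK+ 3 and its
"touch-event" signal; the threshold value is made up for illustration:

#include <gtk/gtk.h>

#define DRAG_THRESHOLD 8.0   /* pixels before a touchpoint counts as "moving" */

typedef struct {
  GdkEventSequence *seq;
  gdouble start_x, start_y;
  gdouble x, y;
} TouchPoint;

static GList *active_points = NULL;   /* TouchPoints currently on the screen */

static TouchPoint *
find_point (GdkEventSequence *seq)
{
  for (GList *l = active_points; l; l = l->next)
    if (((TouchPoint *) l->data)->seq == seq)
      return l->data;
  return NULL;
}

static gboolean
on_touch_event (GtkWidget *widget, GdkEventTouch *event, gpointer user_data)
{
  TouchPoint *tp;

  switch (event->type)
    {
    case GDK_TOUCH_BEGIN:
      /* New touchpoint at these coordinates. */
      tp = g_new0 (TouchPoint, 1);
      tp->seq = event->sequence;
      tp->start_x = tp->x = event->x;
      tp->start_y = tp->y = event->y;
      active_points = g_list_append (active_points, tp);
      break;

    case GDK_TOUCH_UPDATE:
      /* Touchpoint moved somewhere else. */
      tp = find_point (event->sequence);
      if (!tp)
        break;
      tp->x = event->x;
      tp->y = event->y;

      if (g_list_length (active_points) == 2)
        {
          /* Two touchpoints: compare their current distance with the
           * distance when the second one landed, and zoom around the
           * midpoint; this cancels any pending tap or scroll. */
          g_print ("candidate pinch/zoom\n");
        }
      else if (ABS (tp->x - tp->start_x) > DRAG_THRESHOLD ||
               ABS (tp->y - tp->start_y) > DRAG_THRESHOLD)
        {
          /* One touchpoint moved past the threshold: treat it as a
           * scroll and cancel any pending tap. */
          g_print ("candidate scroll\n");
        }
      break;

    case GDK_TOUCH_END:
    case GDK_TOUCH_CANCEL:
      /* Touchpoint was lifted (or taken over by the compositor). */
      tp = find_point (event->sequence);
      if (tp)
        {
          active_points = g_list_remove (active_points, tp);
          g_free (tp);
        }
      break;

    default:
      break;
    }

  return TRUE;
}

The widget needs GDK_TOUCH_MASK in its event mask (gtk_widget_add_events())
and the handler connected to its "touch-event" signal. A real application
would additionally have to decide which candidate wins and cancel the others,
which is exactly the location- and context-dependent part described above.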

Oh. Ok. Seems weird, as the touchpad seems to have to solve the exact same
problems, but ok :)


Toolkits like GTK+ may help to some extent, but nothing forces apps
to use that fully/correctly. And in the case of Firefox, I think
their use of GTK+ is a lot more superficial than that, so there's more
on their plate to handle.
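
For comparison, this is roughly what the GTK+ 3 gesture helpers buy an
application that does use them fully; "drawing_area" is just a placeholder
for whatever widget handles the content:

#include <gtk/gtk.h>

static void
on_scale_changed (GtkGestureZoom *gesture, gdouble scale, gpointer user_data)
{
  /* "scale" is relative to the distance between the two touchpoints
   * when the gesture was first recognized. */
  g_print ("zoom factor: %.2f\n", scale);
}

static void
on_drag_update (GtkGestureDrag *gesture, gdouble offset_x, gdouble offset_y,
                gpointer user_data)
{
  g_print ("scroll by: %.1f, %.1f\n", offset_x, offset_y);
}

static void
attach_gestures (GtkWidget *drawing_area)
{
  GtkGesture *zoom = gtk_gesture_zoom_new (drawing_area);
  GtkGesture *drag = gtk_gesture_drag_new (drawing_area);

  /* React to touch only, leaving the mouse/touchpad path alone. */
  gtk_gesture_single_set_touch_only (GTK_GESTURE_SINGLE (drag), TRUE);

  g_signal_connect (zoom, "scale-changed",
                    G_CALLBACK (on_scale_changed), NULL);
  g_signal_connect (drag, "drag-update",
                    G_CALLBACK (on_drag_update), NULL);
}

The gesture objects do the per-touchpoint tracking and the distance/center
math internally, but the application still decides what the resulting scale
or offset means for its content.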

TL;DR: Touch input is highly location- and context-dependent; there's
no way "correct" multitouch-y behavior can be done for free without
app involvement.

Got it. Thanks for the explanations.


-- 
Alexey

