Re: Lenovo Yoga 700-11ISK touchscreen



Hey,

On Sun, Mar 18, 2018 at 3:06 AM, Alexey Kardashevskiy <aik@ozlabs.ru> wrote:
On 3/18/18 4:29 AM, Carlos Garnacho wrote:
Hi!,

On Sat, Mar 17, 2018 at 2:49 PM, Alexey Kardashevskiy <aik@ozlabs.ru> wrote:
Hi,

Fedora 27, Gnome desktop 2.32, gnome 3.26.2, kernel 4.15.8-300.fc27.x86_64.
Lenovo Yoga 700-11ISK, ELAN touchscreen, Synaptic touchpad.

The Yoga is a convertible laptop which I am trying to use as a tablet,
but it is confusing. I'll describe the issues and hopefully you can give
me some pointers on what is a bug and what is a feature :)

1. When switched to tablet mode, the keyboard gets disabled (good) but the
touchpad is not.

The keyboard being disabled is something specific to the Yoga models. Other
2-in-1s are smart enough to also disable the touchpad, and some
disable nothing.

We have https://bugzilla.gnome.org/show_bug.cgi?id=744036 open to
track/expose SW_TABLET_MODE state, which may be used to implement such
behavior in software. But arguably this could belong in libinput,
which has a better grasp of the devices to be disabled and their
topology.

Ok, thanks, I'll look into that. Quick question - is libinput
scriptable, or do I need to recompile it anyway?

Libinput is written in C. It does learn about the available devices
through udev, so once the support is added, presumably you should be
able to opt devices in or out through the right udev rules. But
support needs to be added first.
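
In case it helps while exploring: libinput already reports the
SW_TABLET_MODE switch through its C API as LIBINPUT_SWITCH_TABLET_MODE
events; what is missing is the part that reacts to it by disabling
devices. Below is a rough, untested sketch (assuming libinput >= 1.9,
the udev backend, and permission to open the devices under /dev/input)
that only logs the switch state:

/* Rough sketch: log tablet-mode switch changes via libinput.
 * Build with: gcc sw-tablet-mode.c $(pkg-config --cflags --libs libinput libudev)
 */
#include <errno.h>
#include <fcntl.h>
#include <poll.h>
#include <stdio.h>
#include <unistd.h>
#include <libinput.h>
#include <libudev.h>

static int open_restricted(const char *path, int flags, void *user_data)
{
    int fd = open(path, flags);
    return fd < 0 ? -errno : fd;
}

static void close_restricted(int fd, void *user_data)
{
    close(fd);
}

static const struct libinput_interface iface = {
    .open_restricted = open_restricted,
    .close_restricted = close_restricted,
};

int main(void)
{
    struct udev *udev = udev_new();
    struct libinput *li = libinput_udev_create_context(&iface, NULL, udev);
    libinput_udev_assign_seat(li, "seat0");

    struct pollfd fds = { .fd = libinput_get_fd(li), .events = POLLIN };
    while (poll(&fds, 1, -1) >= 0) {
        libinput_dispatch(li);
        struct libinput_event *ev;
        while ((ev = libinput_get_event(li)) != NULL) {
            if (libinput_event_get_type(ev) == LIBINPUT_EVENT_SWITCH_TOGGLE) {
                struct libinput_event_switch *sw =
                    libinput_event_get_switch_event(ev);
                if (libinput_event_switch_get_switch(sw) ==
                    LIBINPUT_SWITCH_TABLET_MODE)
                    /* This is where a touchpad could be disabled/enabled. */
                    printf("tablet mode: %s\n",
                           libinput_event_switch_get_switch_state(sw) ==
                           LIBINPUT_SWITCH_STATE_ON ? "on" : "off");
            }
            libinput_event_destroy(ev);
        }
    }

    libinput_unref(li);
    udev_unref(udev);
    return 0;
}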


2. The touchscreen seems not to understand multitouch at all, so it also
does not allow scrolling most of the time. In Firefox or a terminal,
moving a finger over the window selects text. No two-finger scrolling is
possible (it is enabled and works with the touchpad).

Ultimately, reaction to (multi)touch is implemented by each specific
application. Any issues there should thus be reported on each
application's bug reporting system. NB, such issues are very probably
already filed, so look out for duplicates.

Is this different from the touchpad? I am asking because the touchpad settings
are system-wide and it works the same everywhere, but there is no such control
over the touchscreen behaviour, and I wonder why... I expected the touchpad and
touchscreen either to share the same settings or each to have its own settings,
and neither seems to be the case.

It is a whole lot different :). From the perspective of the
application, a touchpad is essentially the same as a mouse, and
applications have been able to deal with those for many years. As the
actions apply to a single concrete point (where you see the mouse
cursor), the windowing system (xorg/wayland) is able to send specific
events about what happened (moved to these coordinates, pressed this
mouse button, scrolled by this much).

Multitouch is peculiar in that the application is only made aware of basic
information (new touchpoint at these coordinates, touchpoint moved
somewhere else, touchpoint was lifted) for *every* touchpoint. The
application is in charge of seeing where each of them applies and giving
them a meaning. Some examples:
- A touchpoint happening in a browser tab might eventually tap some
button in the webpage, or be used by the browser to start scrolling if
it moves past a certain distance (effectively cancelling the other
actions).
- Say you actually meant to zoom on the page, so a second touchpoint
on the webpage arrives a couple of milliseconds later. All previous
actions should be rendered ineffective, and the browser should see how
far apart those touchpoints are in order to start zooming based on that
initial distance, using the center of those 2 points as the zooming
anchor.
- But say you missed the webpage with that second touch, and it ends up
on the location bar. What should the app do in that case? I dunno, but
what it shouldn't do is zoom :).

Toolkits like GTK+ may help to some extent, but nothing forces apps
to use that fully/correctly. And in the case of Firefox, I think
their use of GTK+ is a lot more superficial than that, so there's more
on their plate to handle.

TL;DR: Touch input is highly location- and context-dependent; there's
no way "correct" multitouch behavior can come for free without
app involvement.
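
Just to illustrate the "app involvement" part, here is a minimal,
untested sketch (my own example, not taken from any particular app) of
a GTK+ 3 program that opts into interpreting two touchpoints as a zoom
via GtkGestureZoom; the toolkit tracks the touchpoints, but the app
still decides what the resulting scale means:

/* Rough sketch: two-finger zoom handling in a GTK+ 3 app.
 * Build with: gcc zoom-demo.c $(pkg-config --cflags --libs gtk+-3.0)
 */
#include <gtk/gtk.h>

static void on_zoom(GtkGestureZoom *gesture, gdouble scale, gpointer data)
{
    /* "scale" is the ratio between the current and the initial distance
     * of the two touchpoints; a real app would scale its content here. */
    g_print("zoom scale: %f\n", scale);
}

int main(int argc, char **argv)
{
    gtk_init(&argc, &argv);

    GtkWidget *window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *area = gtk_drawing_area_new();
    gtk_container_add(GTK_CONTAINER(window), area);

    /* The widget must receive touch events for the gesture to see them. */
    gtk_widget_add_events(area, GDK_TOUCH_MASK);

    GtkGesture *zoom = gtk_gesture_zoom_new(area);
    g_signal_connect(zoom, "scale-changed", G_CALLBACK(on_zoom), NULL);

    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}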



3. Tapping with the touchscreen on a task in the taskbar does not bring it up;
I can see GNOME notices the click but it still does nothing. Tapping on the
touchpad works, clicking the left button on the touchpad works, so it is only
a touchscreen problem.

This should be reported against the specific gnome-shell extension
providing the taskbar; if this is the "Window list" extension, that'd
be https://gitlab.gnome.org/GNOME/gnome-shell-extensions/issues/

Ok!

4. Orientation sensor - when enabled, constantly rotates the screen.

iio-sensor-proxy is in charge of reading the accelerometer orientation;
some of the open issues at
https://github.com/hadess/iio-sensor-proxy/issues might fit your
model, if not file a new one.

I did that already, actually. This thing stays silent for a while and then
starts sending events constantly; it may be a hardware problem.

Accelerometers may be rather verbose. Not all agree on the same default
X,Y,Z orientation, which may result in the display staying rotated to one
side even after you physically rotate the device; that can be fixed
up (see [1]). If the symptom doesn't match, I guess a kernel driver
problem shouldn't be entirely discarded.
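
For reference, the fix-up in [1] boils down to a local udev hwdb entry
that overrides the accelerometer mount matrix, e.g. in
/etc/udev/hwdb.d/61-sensor-local.hwdb. A sketch of what such an entry
could look like - the modalias match and the matrix values below are
placeholders for illustration; the real ones have to be worked out for
this exact device, as the README describes:

# Hypothetical entry; replace the modalias match and the matrix with
# the values derived for the actual device.
sensor:modalias:acpi:BOSC0200*:dmi:*:svnLENOVO:*:pvrLenovoYOGA70011ISK:*
 ACCEL_MOUNT_MATRIX=0, 1, 0; -1, 0, 0; 0, 0, 1

After that, the hwdb typically needs rebuilding (systemd-hwdb update)
and the IIO device re-triggering with udevadm before iio-sensor-proxy
picks the matrix up.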

Cheers,
  Carlos

[1] https://github.com/hadess/iio-sensor-proxy#accelerometer-orientation



--
Alexey

