This fixes a problem where touches can get stuck because the
driver and the framework have different ideas of what the
initial slot index is. The framework assumed it was slot 0
but it could in principle be any slot, such as slot 1. When
that happened, the framework would start tracking the first
touch as slot 0, but it might never receive an "up" for that slot.
Change-Id: Idaffc4534b275d66b9d4360987b28dc2d0f63218
Fixed some issues where inconsistent streams of events could
be generated by the dispatcher, particularly when switching from
hovering with one device to hovering with another.
Fixed a bug where the touch pad would fail to generate a new
HOVER_MOVE following a tap event. As a result, the hover event
stream would not resume until the user touched the touch pad
again.
Change-Id: I444dce84641fb12e56a0af84c931520771d6c493
Made it possible for individual windows to disable pointer gestures
while they have focus, using a private API.
Cleaned up the InputReader configuration code to enable in-place
reconfiguration of input devices without having to reopen them all.
This change makes changing the pointer speed somewhat nicer since the
pointer doesn't jump back to the origin after each change.
Change-Id: I9727419c2f4cb39e16acb4b15fd7fd84526b1239
When a window registers to listen for outside touches, it does not need
the position information for touches that land outside of its activity
in normal use cases.
This patch uses the foreground window's UID as a filter to determine
whether to pass the position information. This will allow applications
to continue to rely on touch information for inputs that were directed
at one of their other windows.
Bug: 4541250
Change-Id: If16eb1ec8404b797d991859eef55ac0a20a355a3
Fix bug where the pointer presentation would be updated on
any input reader timeout rather than only when a pointer gesture
is in progress.
Bug: 4124987
Change-Id: Ie9bba4a0b3228d55e45e65fa2ede5cd6ba887a08
Some drivers report individual finger updates one at a time
instead of all at once. When 10 fingers are down, this can
cause the framework to have to handle 10 times as many events
each with 10 times as much data. Applications like
PointerLocation would get significantly bogged down by all
of the redundant samples.
This change coalesces samples that are closely spaced in time,
before they are dispatched, as part of the motion event batching
protocol.
Increased the size of the InputChannel shared memory buffer so
that applications can catch up faster if they accumulate a
backlog of samples.
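As a rough illustration of the time-based coalescing described above, here is
a minimal sketch; the Sample type, class name, and 3 ms threshold are all
illustrative assumptions, not the actual native batching code.

    // Hypothetical sketch only: coalesce samples that arrive closer together
    // than a small time window, keeping the most recent sample in each window.
    import java.util.ArrayList;
    import java.util.List;

    final class SampleCoalescer {
        // Assumed coalescing window; the real threshold lives in native code.
        private static final long COALESCE_WINDOW_NANOS = 3_000_000L; // 3 ms

        static final class Sample {
            final long eventTimeNanos;
            final float[] pointerCoords; // packed per-pointer coordinates

            Sample(long eventTimeNanos, float[] pointerCoords) {
                this.eventTimeNanos = eventTimeNanos;
                this.pointerCoords = pointerCoords;
            }
        }

        // Returns a list where closely spaced samples have been merged.
        static List<Sample> coalesce(List<Sample> samples) {
            List<Sample> out = new ArrayList<Sample>();
            for (Sample s : samples) {
                int last = out.size() - 1;
                if (last >= 0 && s.eventTimeNanos - out.get(last).eventTimeNanos
                        < COALESCE_WINDOW_NANOS) {
                    out.set(last, s); // replace the older sample with the newer one
                } else {
                    out.add(s);
                }
            }
            return out;
        }
    }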
Change-Id: Ibc6abf8af027d9003011ac75caa12941080caba3
This enlarges the window of opportunity for batching to
encompass time spent waiting for the window to become ready (while it is
busy processing the last event).
Change-Id: I8870cc3081d27a4de659fb4e375f888fe966460b
Also be more careful about canceling fallback keys during focus
transitions, when the application handles the key, or when the
policy decides to do something different.
Fixed a crash due to JNI CallObjectMethod returning an undefined
value (not null) when an exception is thrown.
Fixed a crash due to the policy trying to create a Dialog for
recent apps on the dispatcher thread. It should happen on the
policy's Looper instead.
Bug: 4187302
Change-Id: I043f82913830f411b3bb4018d6422467b6ca454f
Fix a bug where we were always setting the focused application
handle to NULL. This broke ANR processing and caused input events
to be dropped while starting applications.
Bug: 4174573
Change-Id: Ice7ce3a2b65219568a8227fc1383bafb294666b5
Added a timeout mechanism to EventHub and InputReader so that
InputMappers can request timeouts to perform delayed processing of
input when needed.
Change-Id: I89c1171c9326c6e413042e3ee13aa9f7f1fc0454
1. Single finger tap performs a click.
2. Single finger movement moves the pointer (hovers).
3. Button press plus movement performs click or drag.
While dragging, the pointer follows the finger that is moving
fastest. This is important if there are additional fingers
down on the touch pad for the purpose of applying force
to an integrated button underneath.
4. Two fingers near each other moving in the same direction
are coalesced as a swipe gesture under the pointer.
5. Two or more fingers moving in arbitrary directions are
transformed into touches in the vicinity of the pointer.
This makes scale/zoom and rotate gestures possible.
Added a native VelocityTracker implementation to enable intelligent
switching of the active pointer during drags.
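For illustration, a minimal sketch of how the fastest-moving pointer can be
chosen using the public VelocityTracker API; the class name is made up for
the example, and the actual gesture code uses the new native implementation
rather than this Java path.

    // Illustrative only: pick the pointer id with the highest speed.
    import android.view.MotionEvent;
    import android.view.VelocityTracker;

    final class FastestPointerChooser {
        private final VelocityTracker mVelocityTracker = VelocityTracker.obtain();

        // Feed every motion event, then ask which pointer is moving fastest.
        int pickFastestPointer(MotionEvent event) {
            mVelocityTracker.addMovement(event);
            mVelocityTracker.computeCurrentVelocity(1000); // pixels per second

            int fastestId = -1;
            float fastestSpeedSq = 0;
            for (int i = 0; i < event.getPointerCount(); i++) {
                final int id = event.getPointerId(i);
                final float vx = mVelocityTracker.getXVelocity(id);
                final float vy = mVelocityTracker.getYVelocity(id);
                final float speedSq = vx * vx + vy * vy;
                if (speedSq > fastestSpeedSq) {
                    fastestSpeedSq = speedSq;
                    fastestId = id;
                }
            }
            return fastestId; // pointer id, or -1 if nothing has moved yet
        }

        void release() {
            mVelocityTracker.recycle();
        }
    }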
Change-Id: I7b7ddacc724fb1306e1590dbaebb740d3130d7cd
Added the concept of pointer properties in a MotionEvent.
This is currently used to track the pointer tool type to enable
applications to distinguish finger touches from a stylus.
Button states are also reported to applications as part of touch events.
There are no new actions for detecting changes in button states.
The application should instead query the button state from the
MotionEvent and take appropriate action as needed.
A good time to check the button state is on ACTION_DOWN.
As a side-effect, applications that do not support multiple buttons
will treat primary, secondary and tertiary buttons identically
for all touch events.
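For example, an application might check the tool type and button state on
ACTION_DOWN along these lines (a hedged sketch; the handler class and the
context-menu reaction are illustrative):

    // Illustrative handler: distinguish stylus input and secondary-button
    // presses using the new MotionEvent fields.
    import android.view.MotionEvent;

    final class ButtonAwareTouchHandler {
        boolean onTouch(MotionEvent event) {
            if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
                final boolean stylus =
                        event.getToolType(0) == MotionEvent.TOOL_TYPE_STYLUS;
                final boolean secondaryButton =
                        (event.getButtonState() & MotionEvent.BUTTON_SECONDARY) != 0;
                if (secondaryButton) {
                    // For example, show a context menu instead of starting a drag.
                    return true;
                }
                if (stylus) {
                    // Stylus-specific handling could go here.
                }
                return true;
            }
            return false;
        }
    }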
The back button on the mouse is mapped to KEYCODE_BACK
and the forward button is mapped to KEYCODE_FORWARD.
Added basic plumbing for the secondary mouse button to invoke
the context menu, particularly in lists.
Added clamp and split methods on MotionEvent to take care of
common filtering operations so we don't have them scattered
in multiple places across the framework.
Bug: 4260011
Change-Id: Ie992b4d4e00c8f2e76b961da0a902145b27f6d83
First step of improving app screen size compatibility mode. When
running in compat mode, an application's windows are scaled up on
the screen rather than being small with 1:1 pixels.
Currently we scale the application to fill the entire screen, so we
don't use an even pixel scaling. Though this may have some
negative impact on the appearance (it looks okay to me), it has the
big benefit of allowing us to now treat these apps as normal
full-screen apps and do the normal transition animations as you
move in and out and around in them.
This introduces fun stuff in the input system to take care of
modifying pointer coordinates to account for the app window
surface scaling. The input dispatcher is told about the scale
that is being applied to each window and, when there is one,
adjusts pointer events appropriately as they are being sent
to the transport.
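As a rough sketch of the coordinate adjustment involved (names here are
hypothetical; the real work happens in the native dispatcher as events are
written to the transport):

    // Hypothetical sketch: map a screen-space pointer coordinate back into the
    // window's own (unscaled) coordinate space when a compat scale is applied.
    final class CompatScaleSketch {
        private final float mWindowScale; // e.g. 1.5f when scaled up to fill the screen

        CompatScaleSketch(float windowScale) {
            mWindowScale = windowScale;
        }

        float screenToWindow(float screenCoord) {
            return mWindowScale != 1.0f ? screenCoord / mWindowScale : screenCoord;
        }
    }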
Also modified is CompatibilityInfo, which has been greatly
simplified to not be so insane and incomprehensible. It is
now simple -- when constructed it determines if the given app
is compatible with the current screen size and density, and
that is that.
There are new APIs on ActivityManagerService to put applications
that we would traditionally consider compatible with larger screens
in compatibility mode. This is the start of a facility to have
a UI affordance for a user to switch apps in and out of
compatibility.
To test switching of modes, there is a new variation of the "am"
command to do this: am screen-compat [on|off] [package]
This mode switching has the fundamentals of restarting activities
when it is changed, though the state still needs to be persisted
and the overall mode switch cleaned up.
For the few small apps I have tested, things mostly seem to be
working well. I know of one problem with the text selection
handles being drawn at the wrong position because at some point
the window offset is being scaled incorrectly. There are
probably other similar issues around the interaction between
two windows because the different window coordinate spaces are
done in a hacky way instead of being formally integrated into
the window manager layout process.
Change-Id: Ie038e3746b448135117bd860859d74e360938557
Some drivers report individual finger updates one at a time
instead of all at once. When 10 fingers are down, this can
cause the framework to have to handle 10 times as many events
each with 10 times as much data. Applications like
PointerLocation would get significantly bogged down by all
of the redundant samples.
This change coalesces samples that are closely spaced in time,
before they are dispatched, as part of the motion event batching
protocol.
Increased the size of the InputChannel shared memory buffer so
that applications can catch up faster if they accumulate a
backlog of samples.
Added logging code to help measure input dispatch and drawing
latency issues in the view hierarchy. See ViewDebug.DEBUG_LATENCY.
Change-Id: Ia5898f781f19901d2225c529a910c32bdf4f504f
This enlarges the window of opportunity for batching to
encompass time spent waiting for the window to become ready (while it is
busy processing the last event).
Change-Id: I3fb5a394ab1b85d6591192678168ca6e35dd9d53
This patch adds a mechanism for capturing, filtering, transforming
and injecting input events at a very low level before the input
dispatcher attempts to deliver them to applications. At this time,
the mechanism is only intended to be used by the accessibility
system to implement built-in system-level accessibility affordances.
The accessibility input filter is currently just a stub.
It logs the input events it receives and reinjects them unchanged,
except that it transforms KEYCODE_Q into KEYCODE_Z.
Currently, the accessibility input filter is installed whenever
accessibility is enabled. We'll probably want to change that so
the input filter is only enabled when a screen reader is installed
and touch exploration is wanted.
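A purely hypothetical sketch of what the stub's behavior looks like (the
real filter is an internal framework component, not a public API, so the
hooks shown here are assumptions):

    // Hypothetical sketch: receive each event, optionally transform it, and
    // reinject it toward the dispatcher. The onInputEvent/sendInputEvent hooks
    // are assumed for illustration only.
    import android.view.InputEvent;
    import android.view.KeyEvent;

    abstract class StubInputFilterSketch {
        void onInputEvent(InputEvent event) {
            if (event instanceof KeyEvent) {
                final KeyEvent key = (KeyEvent) event;
                if (key.getKeyCode() == KeyEvent.KEYCODE_Q) {
                    // Reinject Q as Z, as the stub described above does.
                    event = new KeyEvent(key.getDownTime(), key.getEventTime(),
                            key.getAction(), KeyEvent.KEYCODE_Z,
                            key.getRepeatCount(), key.getMetaState());
                }
            }
            sendInputEvent(event);
        }

        // Assumed reinjection hook provided by the framework side of the filter.
        abstract void sendInputEvent(InputEvent event);
    }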
Change-Id: I35764fdf75522b69d09ebd78c9766eb7593c1afe
Also be more careful about canceling fallback keys during focus
transitions, when the application handles the key, or when the
policy decides to do something different.
Fixed a crash due to JNI CallObjectMethod returning an undefined
value (not null) when an exception is thrown.
Fixed a crash due to the policy trying to create a Dialog for
recent apps on the dispatcher thread. It should happen on the
policy's Looper instead.
Bug: 4187302
Change-Id: I659a3fd1bd2325ed36d965f9beb75dacb89790c9
Fix a bug where we were always setting the focused application
handle to NULL. This broke ANR processing and caused input events
to be dropped while starting applications.
Bug: 4174573
Change-Id: Icd7b8c4c49ed73c41978f3ff076c2e5cd839a802
The input dispatcher sends a HOVER_ENTER to a window before dispatching
it any HOVER_MOVE events. For compatibility reasons, the window will
*also* receive the HOVER_MOVE. When the pointer moves into a different
window or the pointer goes down or when events are canceled for some reason,
the input dispatcher sends a HOVER_EXIT to the previously hovered window.
The view hierarchy behavior is similar. All views under the pointer
receive onHoverEvent with HOVER_ENTER followed by any number of HOVER_MOVE
events. When the pointer leaves a view, the view receives HOVER_EXIT.
Similarly, if a parent view decides to capture hover by returning true
from onHoverEvent, the hovered descendants will receive HOVER_EXIT.
The default behavior of onHoverEvent is to update the view's hovered
state by calling setHovered(true/false). Views can query their current
hovered state using isHovered().
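For example, a custom view can observe the hover stream like this (the class
and logging are illustrative; returning the superclass result keeps the
default hovered-state bookkeeping intact):

    // Illustrative custom view that reacts to the hover event stream.
    import android.content.Context;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;

    class HoverAwareView extends View {
        HoverAwareView(Context context) {
            super(context);
        }

        @Override
        public boolean onHoverEvent(MotionEvent event) {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_HOVER_ENTER:
                    Log.d("Hover", "pointer entered");
                    break;
                case MotionEvent.ACTION_HOVER_MOVE:
                    Log.d("Hover", "hover at " + event.getX() + ", " + event.getY());
                    break;
                case MotionEvent.ACTION_HOVER_EXIT:
                    Log.d("Hover", "pointer left");
                    break;
            }
            // The default implementation updates the hovered state, so
            // isHovered() stays accurate.
            return super.onHoverEvent(event);
        }
    }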
For testing purposes, the hovered state is mapped to the pressed
drawable state. This will change in a subsequent commit with the
introduction of a new hovered drawable state.
Change-Id: Ib76a7a90236c8f2c7336e55773acade6346cacbe
Added a timeout mechanism to EventHub and InputReader so that
InputMappers can request timeouts to perform delayed processing of
input when needed.
Change-Id: Iec2045baaf4e67690b15eef3c09a58d5cac76897
1. Single finger tap performs a click.
2. Single finger movement moves the pointer (hovers).
3. Button press plus movement performs click or drag.
While dragging, the pointer follows the finger that is moving
fastest. This is important if there are additional fingers
down on the touch pad for the purpose of applying force
to an integrated button underneath.
4. Two fingers near each other moving in the same direction
are coalesced as a swipe gesture under the pointer.
5. Two or more fingers moving in arbitrary directions are
transformed into touches in the vicinity of the pointer.
This makes scale/zoom and rotate gestures possible.
Added a native VelocityTracker implementation to enable intelligent
switching of the active pointer during drags.
Change-Id: I5ada57e7f2bdb9b0a791843eb354a8c706b365dc
Associate each motion axis with the source from which it comes.
It is possible for multiple sources of the same device to define
the same axis. This fixes a new API that was introduced in MR1.
(Bug: 4066146)
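Presumably this refers to the per-source InputDevice.getMotionRange overload
added in MR1; a hedged example of querying it (the helper itself is just for
illustration):

    // Query the range of AXIS_X as reported by a specific source of a device.
    import android.view.InputDevice;
    import android.view.MotionEvent;

    final class AxisRangeQuery {
        static Float getMaxX(int deviceId, int source) {
            final InputDevice device = InputDevice.getDevice(deviceId);
            if (device == null) {
                return null;
            }
            final InputDevice.MotionRange range =
                    device.getMotionRange(MotionEvent.AXIS_X, source);
            return range != null ? range.getMax() : null;
        }
    }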
Fixed a bug that might cause a segfault when using a trackball.
Only fade out the mouse pointer when touching the touch screen;
ignore other touch pads.
Changed the plural "sources" to "source" in several places in
the InputReader where we intend to refer to a particular source
rather than to a combination of sources.
Improved the batching code to support batching events from different
sources of the same device in parallel. (Bug: 3391564)
Change-Id: I0189e18e464338f126f7bf94370b928e1b1695f2
Determination of the last event time: previously, this used the time at
which the first event in the previous batch was sent from the hardware,
which produced inconsistent timing intervals for event delivery to apps.
Now, it uses the time at which the previous batch was delivered to the
application.
Original Author: Stephen Moore <steve.moore@motorola.com>
Signed-off-by: makarand.karvekar <makarand.karvekar@motorola.com>
Change-Id: I2a3701915702d622dc04fbf4bbd4918a9ebe8856
Added some plumbing to enable the policy to intercept motion
events when the screen is off to handle wakeup if needed.
Added a basic concept of an external device to limit the scope
of the wakeup policy to external devices only. The wakeup policy
for internal devices should be based on explicit rules such as
policy flags in key layout files.
Moved isTouchEvent to native.
Ensure the dispatcher sends the right event type to userActivity
for non-touch pointer events like HOVER_MOVE and SCROLL.
Bug: 3193114
Change-Id: I15dbd48a16810dfaf226ff7ad117d46908ca4f86
Dispatch ACTION_HOVER_MOVE and ACTION_SCROLL through the View
hierarchy as onGenericMotionEvent. Pointer events dispatched
this way are delivered to the view under the pointer. Non-pointer
events continue to be delivered to the focused view.
Added scroll wheel support to AbsListView, ScrollView,
HorizontalScrollView and WebView. Shift+VSCROLL is translated
to HSCROLL as appropriate.
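As an illustration, a custom view could consume scroll wheel events along
these lines (the 48-pixel-per-detent factor is an arbitrary choice for the
example):

    // Illustrative custom view that scrolls its content on ACTION_SCROLL.
    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.View;

    class ScrollWheelView extends View {
        ScrollWheelView(Context context) {
            super(context);
        }

        @Override
        public boolean onGenericMotionEvent(MotionEvent event) {
            if (event.getAction() == MotionEvent.ACTION_SCROLL) {
                final float vscroll = event.getAxisValue(MotionEvent.AXIS_VSCROLL);
                final float hscroll = event.getAxisValue(MotionEvent.AXIS_HSCROLL);
                // Move the content by an arbitrary number of pixels per detent.
                scrollBy((int) (-hscroll * 48), (int) (-vscroll * 48));
                return true;
            }
            return super.onGenericMotionEvent(event);
        }
    }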
Added logging of new pointer events in PointerLocationView.
Fixed a problem in EventHub where removing a USB device resulted
in a long stream of ENODEV errors being logged until INotify
noticed the device was gone.
Note that the new events are not supported by wallpapers at this time
because the wallpaper engine only delivers touch events.
Make all mouse buttons behave identically. (Effectively we only
support one button.)
Change-Id: I9ab445ffb63c813fcb07db6693987b02475f3756
Only initiate fallback key handling if the first key down was
not handled and there is no other fallback key already in progress.
This prevents spurious fallbacks from being generated when
applications handle the initial down but not repeated downs or the up.
Change-Id: I8a513896cf96b16dc502cd72291926d5532aa2ab
Added support for tracking the mouse position even when the mouse button
is not pressed. To avoid confusing existing applications, mouse movements
are reported using the new ACTION_HOVER_MOVE action when the mouse button
is not pressed.
Added some more plumbing for the scroll wheel axes. The values are
reported to Views but they are not yet handled by the framework.
Change-Id: I1706be850d25cf34e5adf880bbed5cc3265cf4b1
This change makes it possible to extend the set of axes that
are reported in MotionEvents by defining new axis constants.
The MotionEvent object is now backed by its C++ counterpart
to avoid having to maintain multiple representations of the
same data.
Change-Id: Ibe93c90d4b390d43c176cce48d558d20869ee608
The touch screen sometimes reports more than 10 pointers even though that's
all we asked for. When this happens, we start dropping events with more
than 10 pointers. This confuses applications and causes them to crash.
Raised the limit to 16 pointers.
Bug: 3331247
The default behavior was to identify all touch devices as touch screens.
External devices that are plugged in are more likely to be touch pads
not attached to a screen. Changed the default to be a touch pad
and renamed some internal constants to avoid confusion.
A certain mouse happens to also behave like a touch pad. That caused
problems because we would see multiple concurrent traces of motion events
coming from the same input device so we would batch them up.
Added code to ensure that we don't batch events unless they come from
the same *source* in addition to coming from the same *device*.
Due to batching or misbehaving drivers, it's possible for the set of
pointer ids to be different from what we expect when it comes time to
split motion events across windows. As a result, we can generate motion
events with 0 pointers. When we try to deliver those events, we cause
an error in the InputTransport so we tear down the InputChannel and kill
the application.
Added code to check our assumption about pointer ids and drop the
event gracefully instead.
Patched up the tests to take into account the change in default behavior
for identifying touch screens and touch pads.
Change-Id: Ic364bd4cb4cc6335d4a1213a26d6bdadc7e33505
This enables the system bar to carve out a region through which
events will be sent to the IME behind it.
Bug: 3238092
Change-Id: I69b855a8d9b5b3ee525266c0861826e53e5b5028
This change implements two heuristics.
1. When events are older than 10 seconds, they are dropped.
2. If the application is currently busy processing an event and
the user touches a window belonging to a different application,
then we drop the currently queued events so the other application
can start processing the gesture immediately.
Note that the system takes care of synthesizing cancelation events
automatically for any events that it drops.
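As a rough sketch of the first heuristic (the 10-second figure is the one
stated above; everything else is illustrative):

    // Simplified staleness check for illustration only.
    final class StaleEventPolicySketch {
        private static final long STALE_EVENT_TIMEOUT_NANOS = 10_000_000_000L; // 10 s

        static boolean shouldDrop(long eventTimeNanos, long nowNanos) {
            return nowNanos - eventTimeNanos >= STALE_EVENT_TIMEOUT_NANOS;
        }
    }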
Added some new handle types to allow the native dispatcher to
indirectly refer to the WindowManager's window state and app window
token. This was done to enable the dispatcher to identify the
application to which each window belongs but it also eliminates
some lookup tables and linear searches through the window list
on each key press.
Bug: 3224911
Change-Id: I9dae8dfe23d195d76865f97011fe2f1d351e2940
Reject movements from other devices when one device is already down.
This fixes jittery behavior when the user moves the mouse and touches
the screen at the same time.
Change-Id: I99151c8f2596a3139720f776bcbc559d4b314878
Added support for loading the pointer icon from a resource.
Moved the system server related bits of the input manager out
of libui and into libinput since they do not need to be linked into
applications.
Change-Id: Iec11e0725b3add2b905c51f8ea2c3b4b0d1a2d67