The TouchExplorer was not taking into account the case of inactive
pointers while dragging. If one puts a finger down and then performs
a dragging gesture, the explorer tries to inject an UP event for the
end of the gesture when each of the two dragging pointers goes up,
instead of only when the first one goes up.
bug:5476098
Change-Id: I20d2dd7bde7e016b0678a35d14cd068d9ff37023
In touch exploration two fingers moving in the same direction drag, and if
one of them goes up the other starts to touch explore. However, this causes
inadvertent touch exploring on almost every scroll, which is confusing. Now
two fingers drag and both have to go up before exploring is allowed again.
This way the inadvertent exploring is gone and the user experience is much
better.
bug:5440411
Change-Id: Id8aaece92e5dea1fc740400d2adc9dd63a1674e4
1. Due to a previous change that disables accessibility if no enabled
and installed services are present, the automation APIs stopped working
since they use a fake automation service that is not installed.
2. Added clean up of death recipients when binders die.
bug:5374662
bug:5239044
Change-Id: I1f3c8cd1d1c79753a4a64e2b8b2963025abb2939
The system disables accessibility if no accessibility services are enabled
to avoid sending events across processes when no recipients are present. The
check considered enabled services which may not have been installed. Now the
check is made against services that are both enabled and installed.
bug:5347273
Change-Id: Iad391a1a5bf0bbca470584bc8392f35821ba768c
The touch explorer was using the id of the last pointer that
went up while injecting up and down events to tap through the last
touch explore event, incorrectly assuming that the last pointer
that went up did the touch exploring. This was leading to a system crash.
bug:5319315
Change-Id: Iffe8ef753795ad685abe6f493cc09adac8bfea94
1. Added flags to the search method to specify whether to match text or
content description or both.
2. Added a test case for the search by content description.
3. Updated the code in the AccessibilityManager service to reflect the latest
changes so that the test automation service works - this is the fake
service used for UI automation.
Change-Id: I14a6779a920ff0430e78947ea5aaf876c2e66076
The package monitor in the AccessibilityManagerService was not
watching for packages that are removed. This is needed since
1) we need to remove the package from the enabled accessibility
services and clean up after the removed service; 2) we need to
disable accessibility if the last accessibility service went away.
Change-Id: I06d33b411ce60703e5a2843107323ffc87046c16
Accessibility was kept enabled even if all accessibility services
were disabled (explicitly by the user or by being removed), which was
causing the system to fire accessibility events that would never be consumed.
Change-Id: Ifb03e786ac0106687252bd1979725ffd724ad1c5
1. The touch explorer was not canceling the long press runnable when a finger
goes down. This was causing a system crash in the scenario of one pointer
down and not moving followed by another pointer down. Since the long press
runnable posted when the first pointer went down was not removed, it was
sending events with the wrong pointer id, leading to a crash.
bug:5271592
Change-Id: I40dd7dd21d465ecedd9413f00b3cedc6066fa22d
1. Tuned the max angle between two moving fingers in touch
exploration mode for a gesture to be considered a drag.
The previous value was too aggressive and it was fairly
easy for the user to get out of the dragging state if she
increases the distance between her fingers.
bug:5223787
2. Before clicking, the explorer was sending hover enter and
exit events, firing the corresponding accessibility
events, which leads to an announcement of the content under
the tap that triggered the click. However, the click is
actually performed on the last touch explored location
(if within the distance slop, of course) instead of the actual
tapping pointer location. Before this fix the user was
confused, since he was hearing an announcement of one piece of
content but was actually clicking on something else.
bug:5225721
Change-Id: I79fec704878f98c95f181bf8a9647e0bb1bd10ef
1. The downTime of the first down event was zero but it should be the event time.
2. Hover exit events were not injected while transitioning to delegating
state and when tapping.
3. Differentiation between the dragging and delegating states, previously
based on the direction and distance of the two moving pointers, is now
based only on the direction (see the sketch after this list). Hence, two
pointers moving in the same direction are dragging; otherwise the event
stream is delegated unmodified. The reason is that blind people cannot
easily determine and control the distance between their fingers, resulting
in different behavior for gestures which the user thinks are the same,
which creates confusion. Also, in some cases delegation and
dragging yield the same result, for example in a list view, further
adding to the confusion. This was also causing the status bar to
be opened and closed unreliably, creating frustration.
4. Refactored the code such that now there is only one method that
injects motion events and all requests go through it. Some bugs
had been introduced by inconsistent implementations in the different
injection methods.
5. Fixed a couple of event stream inconsistencies reported by the
event consistency verifier.
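The sketch below illustrates the direction-only classification from item 3;
the names and the angle threshold are illustrative, not the actual
TouchExplorer code.

    // Illustrative sketch: two pointers are considered to be dragging only
    // if their movement vectors point in roughly the same direction.
    final class DragDirectionSketch {
        // Hypothetical threshold: cosine of the largest allowed angle.
        static final float MAX_DRAG_ANGLE_COS = 0.5f; // about 60 degrees

        static boolean isDraggingGesture(float firstDeltaX, float firstDeltaY,
                float secondDeltaX, float secondDeltaY) {
            final float firstMagnitude =
                    (float) Math.hypot(firstDeltaX, firstDeltaY);
            final float secondMagnitude =
                    (float) Math.hypot(secondDeltaX, secondDeltaY);
            if (firstMagnitude == 0 || secondMagnitude == 0) {
                // A pointer that has not moved yet cannot break the drag.
                return true;
            }
            // Cosine of the angle between the two movement vectors.
            final float angleCos =
                    (firstDeltaX * secondDeltaX + firstDeltaY * secondDeltaY)
                            / (firstMagnitude * secondMagnitude);
            return angleCos >= MAX_DRAG_ANGLE_COS;
        }
    }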
bug:5224183
bug:5223787
bug:5214829
Change-Id: I16c9be3562ad093017af5b974a41ab525b73453f
The content retrieval APIs are synchronous from a client's
perspective but internally they are asynchronous. The client thread
calls into the system requesting an action and providing a callback
to receive the result after which it waits up to a timeout for that
result. The system enforces security and then delegates the request
to a given view hierarchy where a message is posted (from a binder
thread) describing what is to be performed by the main UI thread, the
result of which is delivered via the mentioned callback. However,
the blocked client thread and the main UI thread of the target view
hierarchy can be the same one, for example an accessibility service
and an activity run in the same process, thus they are executed on the
same main thread. In such a case the retrieval will fail since the UI
thread that has to process the message describing the work to be done
is blocked waiting for a result it has to compute! To avoid this scenario,
when making a call the client also passes its process and thread ids so
the accessed view hierarchy can detect if the client making the request
is running in its main UI thread. In such a case the view hierarchy,
specifically the binder thread performing the IPC to it, does not post a
message to be run on the UI thread but passes it to the singleton
interaction client through which all interactions occur, and the latter is
responsible for executing the message before starting to wait for the
asynchronous result delivered via the callback. In this case the expected
result is already received so no waiting is performed.
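The sketch below shows the idea of the same-thread detection in a minimal
form; the class and method names are hypothetical and only illustrate the
mechanism described above.

    import android.os.Handler;
    import android.os.Process;

    // Illustrative only: before blocking for the asynchronous result, detect
    // whether the caller is the very UI thread that would have to produce it.
    final class InteractionClientSketch {
        // Hypothetical: the client passed its process and thread ids with the call.
        static boolean isCallerTheUiThread(int callingPid, int callingTid,
                int uiThreadId) {
            return callingPid == Process.myPid() && callingTid == uiThreadId;
        }

        // Hypothetical flow: run the work inline instead of posting a message
        // that the (blocked) UI thread could never process.
        static void performOrPost(Runnable work, Handler uiHandler,
                boolean callerIsUiThread) {
            if (callerIsUiThread) {
                work.run();           // execute now; the result arrives before we wait
            } else {
                uiHandler.post(work); // normal case: let the UI thread do the work
            }
        }
    }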
bug:5138933
Change-Id: I382e2d8689f5189110226613c2387f553df98bd3
1. The touch explorer uses delayed injection of events,
which can happen after its hosting accessibility
input filter has been unregistered; thus the explorer
was trying to inject events when this is not allowed.
Now, upon unregistration, the accessibility input filter
resets the state of the touch explorer it hosts.
bug:5105956
Change-Id: I720682abf93382aedf4f431eaac90fd2c781e442
1. The code for detecting the end of a touch exploration gesture
was not injecting the hover exit event upon detection of the
gesture end.
bug:5091758
Change-Id: I468164617d6677cd2a2a2815e1756c826d49f3a9
1. Events not generated by the user can change the interrogation allowing window
unpredictably. For example, when a ListView lays out its children it fires an
accessibility event and changes the currently active window, while the user
interaction may be happening in another window, say a dialog. Now the interrogation
allowing window is changed when a new window is shown or the user has touch
explored it.
bug:5074116
Change-Id: I8dde12bbec807d32445a781eedced9b95312b3e2
1. The first problem manifests on Prime. Apparently the Prime screen driver
is very aggressive in filtering move events that originate from almost the same
location. Hence, the framework doesn't see a constant stream of events. However,
the TouchExplorer implementation was assuming a constant event stream to detect
long press. Refactored the code such that no assumptions about the event stream
are made.
2. Touch exploring an item and then tapping far away from that item was activating
it, hence not respecting the distance slop (see the sketch below). This was due
to an incorrect check of the latter.
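The sketch below shows the kind of distance slop check involved; class and
field names are illustrative, not the actual TouchExplorer code.

    import android.content.Context;
    import android.view.ViewConfiguration;

    // Illustrative only: a tap activates the last touch explored location
    // only if it lands within the scaled double-tap slop of that location.
    final class SlopCheckSketch {
        private final int mDoubleTapSlop;

        SlopCheckSketch(Context context) {
            mDoubleTapSlop = ViewConfiguration.get(context).getScaledDoubleTapSlop();
        }

        boolean isTapWithinSlop(float lastExploredX, float lastExploredY,
                float tapX, float tapY) {
            final double distance =
                    Math.hypot(tapX - lastExploredX, tapY - lastExploredY);
            return distance <= mDoubleTapSlop;
        }
    }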
bug:5070917
Change-Id: I3627a2feeb3712133f58f8f8f1ab7a2ec50cdc9a
1. Upon registration of an accessibility client, the latter received only
the accessibility state and waited for the touch exploration state
to be sent by the system asynchronously. This led to a wrong value
being reported at the very first check of the touch exploration state.
Now the state of the accessibility layer is returned to the client
upon registration.
2. Removed the dependency on a talking accessibility service being enabled
for getting into touch exploration mode. What if the user wants to use
an accessibility service that shows a dialog with the text of the touched
view?
bug:5051546
Change-Id: Ib377babb3f560929ee73bd3d8b0d277341ba23f7
1. Separated touch exploration into its own setting rather than having it
magically enabled by the system when accessibility is on and there
is at least one enabled accessibility service that speaks. Now
there is a setting for requesting touch exploration, but the
system will still enable it only if that makes sense, i.e. accessibility
is on and an accessibility service that speaks is enabled.
2. Added a public API for checking whether touch exploration is enabled
(see the sketch after this list).
3. Added a description attribute to the accessibility service declaration
which will be shown to the user before enabling the service.
4. Added API for quick cloning of AccessibilityNodeInfo.
5. Added clone functionality to SparseArray, SparseIntArray, and
SparseBooleanArray.
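A minimal usage sketch of the check from item 2, assuming the public
AccessibilityManager.isTouchExplorationEnabled() API added here:

    import android.content.Context;
    import android.view.accessibility.AccessibilityManager;

    // Illustrative only: query whether touch exploration is currently enabled.
    final class TouchExplorationCheck {
        static boolean isTouchExplorationOn(Context context) {
            AccessibilityManager manager = (AccessibilityManager)
                    context.getSystemService(Context.ACCESSIBILITY_SERVICE);
            return manager != null && manager.isTouchExplorationEnabled();
        }
    }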
bug:5034010
bug:5033928
Change-Id: Ia442edbe55c20309244061cd9d24e0545c01b54f
1. If an accessibility service does not request access to the window
content but attempts to access it anyway, an exception is thrown to
point the developer to the reason.
bug:5038284
Change-Id: Ibf08f4d2c8ad8939c4f4c2e288048a4f8ff1e31b
1. Touch exploration start and end events are generated
by the system to provide additional information for
accessibility services. Since such events do not come
from any particular window, they should not change the
id of the window that currently allows exploring its
content.
2. Touch exploration start and end events were leaking the
touch explorer class, which is private.
bug:5026258
Change-Id: Icaf3e2bd9566716f2afb876cf8e0d50813b0c76e
1. Due to thread interleaving it was possible that
two messages were sent requesting dispatch of
the same accessibility event; since the first
one sends the event and removes it from the pending
list, the second message pulls null during the event
lookup. Look at the patch's comments for a detailed
scenario and the rationale of the fix.
bug:4886129
Change-Id: If8b272ceaec7709c659ae502c3a730e63c939172
1. The explorer was injecting up/down touch events to
click with the id of the last pointer that went up,
but the prototype, i.e. the last touch explore event, may
not contain this pointer. Since we click on the last
touch explored location, using the action pointer
index of that event is the right approach.
bug:4551506
Change-Id: I73428b09dc014417096a52e667f58768a2871dc8
1. Added a scrolling accessibility event to provide feedback
when a view is scrolled.
Note: We need scroll events for ICS since even though we have
touch exploration the user does not know when something
is scrollable and no feedback is provided while scrolling.
bug:4902097
2. Added a text selection change event to provide feedback
for selection changes including cursor movement.
Note: We need the text selection change events for ICS since
even though the IME supports navigation in text fields
the user receives no feedback for the current selection/
cursor position.
bug:4586186
3. Added a scrollable property to both AccessibilityEvent and
AccessibilityNodeInfo (see the sketch after this list). The info has
to describe the source in terms of all properties that make sense for
accessibility purposes, and the event has this property (somewhat
duplicated) since clients will always want to know if the source is
scrollable to provide a clue to the user, and we want to avoid
pulling the info of the source for every accessibility event.
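A minimal sketch of how a client could use the scrollable property on an
incoming event, assuming the isScrollable() getter added by this change:

    import android.view.accessibility.AccessibilityEvent;

    // Illustrative only: give the user a hint when the source of a scroll
    // event is scrollable, without pulling the full AccessibilityNodeInfo.
    final class ScrollFeedbackSketch {
        static String describe(AccessibilityEvent event) {
            if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_SCROLLED
                    && event.isScrollable()) {
                return "Scrollable content, item " + event.getFromIndex()
                        + " of " + event.getItemCount();
            }
            return "";
        }
    }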
Change-Id: I232d6825da78e6a12d52125f51320217e6fadb11
1. In compatibility mode a window-wide scaling is applied to stretch
the content. However, AccessibilityNodeInfos retrieved from that
window contain bounds in the application's view of the world and need
to be scaled to properly reflect what a sighted user sees.
Change-Id: Iebbb99526fc327f45b5cede89ba8c32e6ebd8845
1. Enabling accessibility and disabling all enabled
accessibility services when a test client connects
to the AccessibilityManagerService.
Change-Id: I2f40cccaa0035ac1454d8c5ac84678c1542a0229
1. Added a new event type for notifying client accessibility
services of changes in the layout. The event is fired at
most once for a given time frame and is delivered to clients
only if it originates from the window that can be interrogated.
2. Exposed the findByText functionality in AccessibilityNodeInfo
(see the sketch after this list). This is very useful for an
accessibility service since it allows searching for something the
user knows is on the screen, thus avoiding touch exploring the
content. Touch exploring is excellent for learning an app, but once
the user knows it, search is much faster.
3. Fixed a bug causing an accessibility service not to receive
the event source in case more than one service is registered
and one of them does not have permission to interrogate the window.
The same event was dispatched to multiple services, but if one
of them did not have interrogation permission the event was
modified to remove the source, causing subsequent services not
to get the latter.
4. Moved the getSource/setSource methods to AccessibilityRecord
instead of AccessibilityEvent.
5. Hid some protected members in AccessibilityRecord which should
not be made public since getters exist.
6. Added the View's absolute screen coordinates to AccessibilityNodeInfo.
This is needed for fast computation of relative positions of
views from accessibility - a common use case for the latter.
7. Fixed a couple of marshalling bugs.
8. Added a test for the object contract of AccessibilityNodeInfo.
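A minimal usage sketch of the search in item 2, assuming the
findAccessibilityNodeInfosByText() form of the API on AccessibilityNodeInfo:

    import java.util.List;

    import android.view.accessibility.AccessibilityNodeInfo;

    // Illustrative only: search the subtree rooted at 'root' for nodes whose
    // text matches the given string instead of touch exploring the screen.
    final class FindByTextSketch {
        static AccessibilityNodeInfo findFirst(AccessibilityNodeInfo root,
                String text) {
            if (root == null) {
                return null;
            }
            List<AccessibilityNodeInfo> matches =
                    root.findAccessibilityNodeInfosByText(text);
            return matches.isEmpty() ? null : matches.get(0);
        }
    }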
Change-Id: Id9dc50c33aff441e4c93d25ea316c9bbc4bd7a35
1. Not clearing the last touch explore event in all cases
when transitioning to another mode.
2. Incorrectly assuming that the action index of up/down
events is 0.
bug:4551506
Change-Id: I43f8e800b54a340968489dc924a539795a9195cb
1. Views are represented as AccessibilityNodeInfos to AccessibilityServices.
2. An accessibility service receives AccessibilityEvents and can ask
for an event's source, getting an AccessibilityNodeInfo which can be
used to get its parent and child infos and so on (see the sketch
after this list).
3. AccessibilityNodeInfo contains some attributes and actions that
can be performed on the source.
4. An AccessibilityService can request the system to perform an action
on the source of an AccessibilityNodeInfo.
5. ViewAncestor provides an interaction connection to the
AccessibilityManagerService and an accessibility service uses
its connection to the latter to interact with screen content.
6. AccessibilityService can interact ONLY with the focused window
and all calls are routed through the AccessibilityManagerService
which imposes security.
7. Hidden APIs on AccessibilityService can find AccessibilityNodeInfos
based on some criteria. These APIs go through the AccessibilityManagerService
for a security check.
8. Some actions are hidden and are exposed only on eng builds for UI testing.
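A minimal sketch of the interaction model in items 2-4; the service class
is hypothetical and recycling/error handling are omitted:

    import android.accessibilityservice.AccessibilityService;
    import android.view.accessibility.AccessibilityEvent;
    import android.view.accessibility.AccessibilityNodeInfo;

    // Illustrative only: get the source of an event, look at its parent,
    // and ask the system to perform a click action on the source.
    public class ExampleAccessibilityService extends AccessibilityService {
        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) {
            AccessibilityNodeInfo source = event.getSource();
            if (source == null) {
                return; // no interrogation permission or the source is gone
            }
            // Item 2: the info can be used to reach its parent and children.
            AccessibilityNodeInfo parent = source.getParent();
            if (parent != null) {
                // Inspect the surrounding hierarchy here if needed.
            }
            // Item 4: all such calls are routed through the AccessibilityManagerService.
            source.performAction(AccessibilityNodeInfo.ACTION_CLICK);
        }

        @Override
        public void onInterrupt() {
            // Nothing to do in this sketch.
        }
    }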
Change-Id: Ie34fa4219f350eb3f4f6f9f45b24f709bd98783c
1. AccessibilityManagerService was keeping handles to dead
IAccessibilityManagerClients - no longer doing so.
2. AccessibilityManagerService was lazily cleaning up dead
IAccessibilityServiceConnections - now using a callback.
3. Cleaned up the bookkeeping of enabled services.
4. Fixed a bug where the input filter was still enabled
when disabling accessibility.
Change-Id: I5e9af7ab684a3b71e8ee51125b1262a17e960eb0
Note: This is part of a two-CL change and contains the
system changes without updates to the settings.
1. Added a mechanism for configuring an accessibility service via
XML file specified in a meta-data tag (similar to IMEs).
2. Added property for specifying a settings activity for an
accessibility service.
3. Refactored the APIs in AccessibilityManager to return
lists of AccessibilityServiceInfo instead of ServiceInfo
since the former describes an AccessibilityService in
particular (similar to IMEs).
Change-Id: Ie8781bb7e0cdb329e583b6702a612a507367ad7b
1. Refactored the code to avoid code duplication.
2. Fixed a bug in removing unused pointers from the event.
3. Fixed a bug that was crashing the explorer.
4. Sending hover exit immediately at the end of a touch exploration
gesture rather than with a delay.
Change-Id: Ie288cb8090d6fb5e5c715afa6ea5660b17c019e0
Added the concept of pointer properties in a MotionEvent.
This is currently used to track the pointer tool type to enable
applications to distinguish finger touches from a stylus.
Button states are also reported to applications as part of touch events.
There are no new actions for detecting changes in button states.
The application should instead query the button state from the
MotionEvent and take appropriate action as needed.
A good time to check the button state is on ACTION_DOWN.
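A minimal sketch of checking the tool type and button state on ACTION_DOWN,
using the getToolType() and getButtonState() accessors added by this change:

    import android.view.MotionEvent;

    // Illustrative only: on ACTION_DOWN, distinguish a stylus from a finger
    // and check whether the secondary button is pressed.
    final class PointerPropertiesSketch {
        static boolean isStylusSecondaryPress(MotionEvent event) {
            if (event.getActionMasked() != MotionEvent.ACTION_DOWN) {
                return false;
            }
            final boolean stylus =
                    event.getToolType(0) == MotionEvent.TOOL_TYPE_STYLUS;
            final boolean secondary =
                    (event.getButtonState() & MotionEvent.BUTTON_SECONDARY) != 0;
            return stylus && secondary;
        }
    }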
As a side-effect, applications that do not support multiple buttons
will treat primary, secondary and tertiary buttons identically
for all touch events.
The back button on the mouse is mapped to KEYCODE_BACK
and the forward button is mapped to KEYCODE_FORWARD.
Added basic plumbing for the secondary mouse button to invoke
the context menu, particularly in lists.
Added clamp and split methods on MotionEvent to take care of
common filtering operations so we don't have them scattered
in multiple places across the framework.
Bug: 4260011
Change-Id: Ie992b4d4e00c8f2e76b961da0a902145b27f6d83
1. Added an Input Filter that interprets the touch screen motion
events to perform accessibility exploration. One finger explores.
Tapping or long pressing within a given time and distance slop
of the last explored location performs a click or a long press,
respectively. Two fingers close together and moving in the same
direction drag. Multiple fingers, or two fingers moving in
different directions, or two fingers too far apart are delegated
to the view hierarchy. Non-moving fingers, as when the device is
accidentally grabbed by the screen, are ignored.
2. Added accessibility events for hover enter, hover exit, touch
exploration gesture start and end. Accessibility hover events
are fired by the hover pipeline. An accessibility event is
dispatched up the view tree and the topmost view fires it.
Thus predecessors can augment the fired event. An accessibility
event has several records and a predecessor can optionally
modify, delete, and add such to the event.
3. Added onPopulateAccessibilityEvent and refactored the existing
accessibility code to use it.
4. Added API for querying the currently enabled accessibility services
by feedback type.
Change-Id: Iea2258c07ffae9491071825d966dc453b07e5134
This reverts commit ac84d3ba81f08036308b17e1ab919e43987a3df5.
There seems to be a problem with this API change. Reverting for now to
fix the build.
Change-Id: Ifa7426b080651b59afbcec2d3ede09a3ec49644c
1. Added an Input Filter that interprets the touch screen motion
events to perform accessibility exploration. One finger explores.
Tapping or long pressing within a given time and distance slop
of the last explored location performs a click or a long press,
respectively. Two fingers close together and moving in the same
direction drag. Multiple fingers, or two fingers moving in
different directions, or two fingers too far apart are delegated
to the view hierarchy. Non-moving fingers, as when the device is
accidentally grabbed by the screen, are ignored.
2. Added accessibility events for hover enter, hover exit, touch
exploration gesture start and end. Accessibility hover events
are fired by the hover pipeline. An accessibility event is
dispatched up the view tree and the topmost view fires it.
Thus predecessors can augment the fired event. An accessibility
event has several records and a predecessor can optionally
modify, delete, and add such to the event.
3. Added onPopulateAccessibilityEvent and refactored the existing
accessibility code to use it.
4. Added API for querying the currently enabled accessibility services
by feedback type.
Change-Id: Iec03c6c3fe298de3f14cb6efdbb9b198cd531a0c
This patch adds a mechanism for capturing, filtering, transforming
and injecting input events at a very low level before the input
dispatcher attempts to deliver them to applications. At this time,
the mechanism is only intended to be used by the accessibility
system to implement built-in system-level accessibility affordances.
The accessibility input filter is currently just a stub.
It logs the input events it receives and reinjects them unchanged,
except that it transforms KEYCODE_Q into KEYCODE_Z.
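The sketch below shows the keycode substitution as a standalone helper
rather than the actual (hidden) input filter class:

    import android.view.KeyEvent;

    // Illustrative only: rewrite KEYCODE_Q key events as KEYCODE_Z, leaving
    // everything else untouched, mirroring what the stub filter does.
    final class KeySwapSketch {
        static KeyEvent maybeSwap(KeyEvent event) {
            if (event.getKeyCode() != KeyEvent.KEYCODE_Q) {
                return event;
            }
            return new KeyEvent(event.getDownTime(), event.getEventTime(),
                    event.getAction(), KeyEvent.KEYCODE_Z,
                    event.getRepeatCount(), event.getMetaState());
        }
    }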
Currently, the accessibility input filter is installed whenever
accessibility is enabled. We'll probably want to change that
so it only enables the input filter when a screen reader is
installed and we want touch exploration.
Change-Id: I35764fdf75522b69d09ebd78c9766eb7593c1afe