1. Changed all references from granularity to movement
granularity. To be more precise, it should be
text movement granularity.
bug:6435232
Change-Id: If6366b002ca3390f74918995b342baff2cbcfd01
1. Setting accessibility focus on a view should not
make the host window the currently active one.
bug:6400648
Change-Id: Ib45c255f441c38489ee9d4ab5f284550ac5f6b01
1. The checks for action arguments are not needed since they
may cause trouble for developers if we add more args to
an action.
bug:6414006
Change-Id: Ia4212b52be183b1ef1cfd2561ce618cef2b015e4
1. The granularities for traversing the text content of an accessibility
node info are now predefined constants and custom ones will not be
supported. This is the simplest solution - we can always add namespaced
user defined ones (unlikely).
2. Added actions for traversing web content. These actions can be used by
an accessibility service to transparently drive the JavaScript based
screen reader that is used for handling web content.
3. Added a new accessibility event type for traversing the content of a
view. This event is needed to announce to the user the next
element, i.e. the one next to the cursor, after the view's text has
been traversed.
bug:5932640
bug:6389591
Change-Id: I144647da55bc4005c64f89865ef333af8359e145
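As an illustration only (the constant names below are taken from the public API
as it eventually shipped, after the later rename to movement granularity, and may
differ from this change), an accessibility service could drive text traversal
roughly like this:

    // Sketch: ask the focused node to move its text cursor forward by one word.
    // Assumes "node" is an AccessibilityNodeInfo obtained from an event source.
    Bundle arguments = new Bundle();
    arguments.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_MOVEMENT_GRANULARITY_INT,
            AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD);
    node.performAction(AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY, arguments);
    // The framework then fires TYPE_VIEW_TEXT_TRAVERSED_AT_MOVEMENT_GRANULARITY
    // so the service can announce the text the cursor just passed over.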
1. The accessibility focus directions are not needed since an
accessibility service can just get the root, first child, next
sibling, or previous sibling and execute the action to
give it accessibility focus. Now the accessibility node
info tree is properly ordered, taking into account the layout
direction both for layout managers that we report
and for ones that we have determined as not important for
accessibility. Also, node infos are now ordered
properly based on their coordinates after all transformations,
as opposed to child index.
bug:5932640
Change-Id: I994a8297cb1e57c829ecbac73a937c2bcbe0bac7
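A minimal sketch of the intended usage pattern, assuming the service already
holds the root AccessibilityNodeInfo of the active window (method names are
from the public API; isVisibleToUser may postdate this particular change):

    // Walk the (already ordered) node tree depth-first and move accessibility
    // focus to the first node that is visible to the user.
    static boolean focusFirstVisible(AccessibilityNodeInfo node) {
        if (node == null) {
            return false;
        }
        if (node.isVisibleToUser()
                && node.performAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS)) {
            return true;
        }
        for (int i = 0; i < node.getChildCount(); i++) {
            if (focusFirstVisible(node.getChild(i))) {
                return true;
            }
        }
        return false;
    }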
1. A view that creates an accessibility node info may add to the info
a list of granularity labels. These are granularities by which the
source view can iterate over its content. For example, a text view
may support character, word, link while a web view may additionally
support buttons, tables, etc. There are actions on accessibility
node info to go to the next/previous element at a given granularity which
is passed as an argument.
2. Added Bundle argument to the APIs for performing accessibility
actions. This is generic and extensible.
bug:5932640
Change-Id: I328cbbb4cddfdee082ab2a8b7ff1bd7477d8d6f9
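For illustration, in the API as it finally shipped the per-view granularity
labels became a bit mask that a custom view sets on its node info (names below
are the final public ones, not necessarily the ones in this change):

    @Override
    public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
        super.onInitializeAccessibilityNodeInfo(info);
        // Advertise which granularities this view can iterate its content by.
        info.setMovementGranularities(
                AccessibilityNodeInfo.MOVEMENT_GRANULARITY_CHARACTER
                | AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD
                | AccessibilityNodeInfo.MOVEMENT_GRANULARITY_LINE);
    }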
1. The gesture dispatcher thread was not waiting in a loop
that checks for complete initialization. Therefore it was
susceptible to missed signals and unexpected interrupts.
2. In the gesture processing message handler the interaction id
was read from the wrong message argument.
bug:5932640
Change-Id: Ic65ecc01a7fe7d43929c6c07d0759ae9001cf515
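The fix for (1) is the standard guarded-wait idiom, roughly (illustrative only,
not the actual class):

    // Wait for initialization in a loop so that a missed signal or a spurious
    // wakeup cannot leave the thread running against uninitialized state.
    synchronized (mLock) {
        while (!mInitialized) {
            try {
                mLock.wait();
            } catch (InterruptedException ie) {
                /* ignore and re-check the condition */
            }
        }
    }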
1. An accessibility service has to explicitly opt in to be notified
for gestures by the system. There is only one accessibility service
that handles gestures and in case it does not handle a gesture
the system performs default handling. This default handling ensures
that we have gesture navigation even if no accessibility service
would like to participate/customize the interaction model.
bug:5932640
Change-Id: Id8194293bd94097b455e9388b68134a45dc3b8fa
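In the public API this opt-in surfaces as a service flag plus a gesture
callback. The sketch below uses the names that shipped, which may differ from
the internal plumbing in this change; returning false from onGesture leaves the
gesture to the system's default handling:

    public class GestureAwareService extends AccessibilityService {
        @Override
        protected void onServiceConnected() {
            AccessibilityServiceInfo info = getServiceInfo();
            // Opt in: put the device in touch exploration mode and receive gestures.
            info.flags |= AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
            setServiceInfo(info);
        }

        @Override
        protected boolean onGesture(int gestureId) {
            // false == not handled, so the system performs its default handling.
            return false;
        }

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) { /* not used here */ }

        @Override
        public void onInterrupt() { /* not used here */ }
    }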
1. Added more gestures for accessibility. After a meeting
with the access-eng team we have decided that the current
set of gestures may be smaller than needed considering
that we will use four gestures for home, back, recents,
and notifications.
2. Adding actions for going back, home, opening the recents,
and opening the notifications.
3. Added preliminary mapping from some of the new gestures
to the new actions.
4. Fixed a bug in the accessibility interaction controller
which was trying to create a handler on the main looper,
which may be null if the queried UI is in the
system process. Now the context looper of the root view
is used.
5. Fixed a bug of using an incorrect constant.
6. Added a missing locking in a couple of places.
7. Fixed view comparison for accessibility since it was
not antisymmetric.
bug:5932640
bug:5605641
Change-Id: Icc983bf4eafefa42b65920b3782ed8a25518e94f
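The new global actions in (2) are plain service calls; a hypothetical mapping
for (3) could look like this (the actual gesture-to-action assignment in this
change may differ):

    @Override
    protected boolean onGesture(int gestureId) {
        switch (gestureId) {
            case AccessibilityService.GESTURE_SWIPE_DOWN_AND_LEFT:
                return performGlobalAction(AccessibilityService.GLOBAL_ACTION_BACK);
            case AccessibilityService.GESTURE_SWIPE_DOWN_AND_RIGHT:
                return performGlobalAction(AccessibilityService.GLOBAL_ACTION_HOME);
            case AccessibilityService.GESTURE_SWIPE_LEFT_AND_UP:
                return performGlobalAction(AccessibilityService.GLOBAL_ACTION_RECENTS);
            case AccessibilityService.GESTURE_SWIPE_RIGHT_AND_DOWN:
                return performGlobalAction(AccessibilityService.GLOBAL_ACTION_NOTIFICATIONS);
        }
        return false;
    }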
Usefulness: Keep track of the current user location on the screen when
traversing it, enabling structural and directional
navigation over all elements on the screen. This enables
blind users that know the application layout to efficiently
locate desired elements as opposed to touch exploring the
region where the element should be - very tedious.
Rationale: There are two ways to implement accessibility focus. One is
to let accessibility services keep track of it since they
have access to the screen content, and another to let the view
hierarchy keep track of it. While the first approach would
require almost no work on our part it poses several challenges
which make it a sub-optimal choice. Having the accessibility focus
in the accessibility service would require that service to scrape
the window content every time it changes to sync the view tree
state and the accessibility focus location. Pretty much the service
will have to keep an off screen model of the screen content. This
could be quite challenging to get right and would incur performance
cost for the multiple IPCs to repeatedly fetch the screen content.
Further, keeping virtual accessibility focus (i.e. in the service)
would require sync of the input and accessibility focus. This could
be challenging to implement right as well. Also, since there can be an
unlimited number of accessibility services, we cannot guarantee that they will
have a proper implementation, if any, to allow users to perform structural
navigation of the screen content. Assuming two accessibility
services implement structural navigation via accessibility focus,
there is no guarantee that they will behave similarly by default,
i.e. provide some standard way to navigate the screen content.
Also, feedback from experienced accessibility researchers, specifically
T.V. Raman, provides evidence that having virtual accessibility focus
creates many issues and is very hard to get right.
Therefore, keeping accessibility focus in the system avoids
keeping an off-screen model in accessibility services, and it will always
be in sync with the state of the view hierarchy and the input focus.
Also this will allow having a default behavior for traversing the
screen via this accessibility focus that is consistent in all
accessibility services. We provide accessibility services with APIs to
override this behavior but all of them will perform screen traversal
in a consistent way by default.
Behavior: If accessibility is enabled the accessibility focus is the leading one
and the input follows it. Putting accessibility focus on a view moves
the input focus there. Clearing the accessibility focus of a view, clears
the input focus of this view. If accessibility focus is on a view that
cannot take input focus, then no other view should have input focus.
In accessibility mode we initially give accessibility focus to the topmost
view and no view has input focus. This ensures consistent behavior across
all apps. Note that accessibility focus can move hierarchically in the
view tree and having it at the root is better than putting it where the
input focus would be - at the first input focusable which could be at
an arbitrary depth in the view tree. By default not all views are reported
for accessibility, only the important ones. A view may be explicitly labeled
as important or not important for accessibility, or by default the system
determines which views are important. Important views for accessibility are all views that are
not dumb layout managers used only to arrange their children. Since the same
content arrangement can be obtained via different combinations of layout
managers, such managers cannot be used to reliably determine the application
structure. For example, a user should see a list as a list view with several
list items and each list item as a text view and a button as opposed to seeing
all the layout managers used to arrange the list item's content.
By default only important for accessibility views are regarded for accessibility
purposes. Views not regarded for accessibility neither fire accessibility events,
nor are reported being on the screen. An accessibility service may request the
system to regard all views. If the target SDK of an accessibility service is
less than JellyBean, then all views are regarded for accessibility.
Note that an accessibility service that requires all views to be regarded for
accessibility may put accessibility focus on any view. Hence, it may implement
any navigational paradigm if desired, especially considering the fact that
the system detects some standard gestures and delegates their processing
to an accessibility service. The default implementation of an accessibility
service performs the default navigation.
bug:5932640
bug:5605641
Change-Id: Ieac461d480579d706a847b9325720cb254736ebe
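In terms of the public API surface that eventually shipped (names assumed),
the two sides of the important-for-accessibility contract look roughly like this:

    // App side: a view may be explicitly marked, or left to the default heuristic.
    view.setImportantForAccessibility(View.IMPORTANT_FOR_ACCESSIBILITY_YES);

    // Service side: a service that wants all views reported, not only the
    // important ones, opts in on its AccessibilityServiceInfo.
    AccessibilityServiceInfo info = getServiceInfo();
    info.flags |= AccessibilityServiceInfo.FLAG_INCLUDE_NOT_IMPORTANT_VIEWS;
    setServiceInfo(info);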
Extracted the input system from the window manager service into
a new input manager service. This will make it easier to
offer new input-related features to applications.
Cleaned up the input manager service JNI layer somewhat to get rid
of all of the unnecessary checks for whether the input manager
had been initialized. Simplified the callback layer as well.
Change-Id: I3175d01307aed1420780d3c093d2694b41edf66e
1. Before there were two caches: one in the app process that
kept track of only the ids of infos that were given to a
querying client and one in the querying client that
holds the infos. This design requires precise sync
between the caches. Doing that is somewhat complicated
since the app has a cache for each window and it has
to intercept all accessibility events from that window
to manage the cache. Each app has to have a cache for
each querying client. This approach would guarantee that
no infos are fetched twice but due to its stateful nature
and the two caches it is tricky to implement and adds
unnecessary complexity. Now there is only one cache in
the client and the apps are stateless. The client is
passing flags to the app that hint at which nodes to
prefetch. This approach may occasionally fetch a node
twice but it is considerably simpler and stateless
from the app perspective - there is only one cache.
Fetching a node more than once does not cause much
overhead compared to the IPC.
Change-Id: Ia02f6fe4f82cff9a9c2e21f4a36747de0f414c6f
1. UiTestAutomationBridge was accessing the root node in the
active window by tracking the accessibility event stream
and keeping the last active window changing event. Now
the bridge is stateless and the root node is fetched by
passing a special window and view id with the request to
the system.
2. AccessibilityNodeInfos that are cached were not finished,
i.e. not sealed, causing an exception when trying to access
their children or predecessors.
3. AccessibilityManagerService was not properly restoring its
state after the UI automation bridge disconnects from it.
In particular, the device was still in explore-by-touch mode
even if no services were enabled and the automation bridge
was disconnected.
4. ViewRootImpl for the focused window now fires accessibility
events when accessibility is enabled to allow accessibility
services to determine the current user location.
5. Several missing null checks in ViewRootImpl are fixed since
there were scenarios in which an NPE could occur.
6. Updated the internal window content querying tests.
7. ViewRootImpl was firing one extra focus event.
bug:6009813
bug:6026952
Change-Id: Ib2e058d64538ecc268f9ef7a8f36ead047868a05
1. The AccessibilityManagerService used to disable the UI
automation service on package change. This behavior
was incorrect since the automation service has to
survive package installations.
bug:5975207
Change-Id: Idb5e76d02625c333a5842a6b5c5bc90c9b9634c9
1. Now when an interrogating client requires an AccessibilityNodeInfo
we aggressively prefetch all the predecessors of that node and its
descendants. The number of fetched nodes in one call is limited to
keep the APIs responsive. The prefetched nodes infos are cached in
the client process. The node info cache is invalidated partially or
completely based on the fired accessibility events. For example,
TYPE_WINDOW_STATE_CHANGED event clears the cache while
TYPE_VIEW_FOCUSED removes the focused node from the cache, etc.
Note that the cache is only for the currently active window.
The ViewRootImpl also keeps track of only the ids of the node
infos it has sent to each querying process to avoid duplicating
work. Usually only one process will query the screen content
but we support the general case. Also all the caches are
automatically invalidated so no additional bookkeeping is
required. This simple strategy leads to a 10X improvement in the
speed of the querying APIs.
2. The Monkey and the UI test automation framework were registering a
raw event listener for accessibility events and hence performed
connection and cache management in a similar way to an AccessibilityService.
This is fragile and requires the implementer to know internal framework
details. Now the functionality required by the Monkey and the UI automation
is encapsulated in a new UiTestAutomationBridge class. Enabling this
required some refactoring of AccessibilityService.
3. Removed the *doSomething*InActiveWindow methods from the
AccessibilityInteractionClient and the AccessibilityInteractionConnection.
The function of these methods is implemented by the non-*InActiveWindow
versions while passing appropriate constants.
4. Updated the internal window Querying tests to use the new
UiTestAutomationBridge.
5. If the ViewRootImpl was not initialized the querying APIs of
the IAccessibilityInteractionConnection implementation were
returning immediately without calling the callback with null.
This was causing the client side to wait until it timed out. Now
the client is notified as soon as the call fails.
6. Added a check to guarantee that Views with AccessibilityNodeProvider
do not have children.
bug:5879530
Change-Id: I3ee43718748fec6e570992c7073c8f6f1fc269b3
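A condensed sketch of the event-driven invalidation in (1); mNodeCache and
sourceIdOf are hypothetical stand-ins for the real cache and id plumbing:

    void onAccessibilityEvent(AccessibilityEvent event) {
        switch (event.getEventType()) {
            case AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED:
                // The active window changed: nothing cached is trustworthy anymore.
                mNodeCache.clear();
                break;
            case AccessibilityEvent.TYPE_VIEW_FOCUSED:
                // Only the node whose state changed is stale: evict just that one.
                mNodeCache.remove(sourceIdOf(event));
                break;
        }
    }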
1. AccessibilityInteractionConnections were removed from the
AccessibilityManagerService but their DeathRecipients were
not unregistered, thus every removed interaction connection
was essentially leaking. Such a connection is registered in
the system for every ViewRootImpl when accessibility is
enabled and unregistered when it is disabled.
2. Every AccessibilityEvent and AccessibilityNodeInfo obtained
from a window content querying accessibility service had a
handle to a binder proxy over which to make queries. However,
holding a proxy to a remote binder prevents the latter from
being garbage collected. Therefore, now the events and infos
have a connection id instead and the hidden singleton
AccessibilityInteractionClient via which queries are made
has a registry with the connections. This class looks up
the connection given its id before making an IPC. Now the
connection is stored in one place and when an accessibility
service is disconnected the system sets the connection to
null so the binder object in the system process can be GCed.
Note that before this change a badly implemented accessibility
service could cache events or infos causing a leak in the
system process. This should never happen.
3. SparseArray was not clearing the reference to the last moved
element while garbage collecting thus causing a leak.
bug:5664337
Change-Id: Id397f614b026d43bd7b57bb7f8186bca5cdfcff9
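The fix for (1) amounts to pairing every linkToDeath with an unlinkToDeath when
the connection is removed; illustratively (removeConnection and connectionId
are hypothetical stand-ins for the real bookkeeping):

    IBinder binder = connection.asBinder();
    IBinder.DeathRecipient recipient = new IBinder.DeathRecipient() {
        @Override
        public void binderDied() {
            removeConnection(connectionId);
        }
    };
    try {
        binder.linkToDeath(recipient, 0);
    } catch (RemoteException re) {
        // The remote process died before we could link; treat it as already gone.
    }

    // When the connection is explicitly removed, the recipient must be
    // unregistered too, otherwise the recipient (and what it references) leaks:
    binder.unlinkToDeath(recipient, 0);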
Added an interface that is the contract for a client to expose a virtual
view hierarchy to accessibility services. Clients implement this interface
and set it on the View that is the root of the virtual sub-tree. Adding
this functionality via composition as opposed to inheritance enables apps
to maintain backwards compatibility by setting the accessibility virtual
hierarchy provider on the View only if the API version is high enough.
bug:5382859
Change-Id: I7e3927b71a5517943c6cb071be2e87fba23132bf
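In the shape the API finally took (the shipped version has the View return the
provider rather than exposing a setter; names assumed), exposing a virtual
sub-tree looks roughly like this:

    public class PieChartView extends View {
        // Composition: accessibility support lives in a provider object instead
        // of requiring the view to inherit from an accessibility-specific class.
        private final AccessibilityNodeProvider mProvider = new AccessibilityNodeProvider() {
            @Override
            public AccessibilityNodeInfo createAccessibilityNodeInfo(int virtualViewId) {
                if (virtualViewId == View.NO_ID) {
                    // The host view itself.
                    return AccessibilityNodeInfo.obtain(PieChartView.this);
                }
                // One virtual node per slice; a real implementation would also
                // set bounds, parent/child links, supported actions, etc.
                AccessibilityNodeInfo info =
                        AccessibilityNodeInfo.obtain(PieChartView.this, virtualViewId);
                info.setText("Slice " + virtualViewId);
                return info;
            }
        };

        public PieChartView(Context context) {
            super(context);
        }

        @Override
        public AccessibilityNodeProvider getAccessibilityNodeProvider() {
            return mProvider;
        }
    }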
The TouchExplorer was not taking into account the case with inactive
pointers while dragging. If one puts a finger down and then performs
a dragging gesture the explorer tried to inject an UP event for the end
of the gesture when each of the two dragging pointers went up, instead
of only when the first one went up.
bug:5476098
Change-Id: I20d2dd7bde7e016b0678a35d14cd068d9ff37023
In touch exploration, two fingers moving in the same direction drag, and if one of them
goes up the other starts to touch explore. This however causes inadvertent touch
exploring to happen on almost every scroll, causing confusion. Now two fingers
drag and they should both go up to allow exploring. This way the inadvertent
exploring is gone and the user experience is much better.
bug:5440411
Change-Id: Id8aaece92e5dea1fc740400d2adc9dd63a1674e4
1. Due to a previous change that disabled accessibility if no enabled
and installed services are present, the automation APIs stopped working
since they use a fake automation service that is not installed.
2. Added clean up of death recipients when binders die.
bug:5374662
bug:5239044
Change-Id: I1f3c8cd1d1c79753a4a64e2b8b2963025abb2939
The system disables accessibility if no accessibility services are enabled
to avoid sending events across processes if no recipients are present. The
check considered enabled services which may not have been installed. Now the
check is made against enabled and installed services.
bug:5347273
Change-Id: Iad391a1a5bf0bbca470584bc8392f35821ba768c
The touch explorer was using the id of the last pointer that
went up while injecting up and down events to tap through the last
touch explore event, incorrectly assuming that the last up
pointer did the touch exploring. This was leading to a system crash.
bug:5319315
Change-Id: Iffe8ef753795ad685abe6f493cc09adac8bfea94
1. Added flags to the search method to specify whether to match text or
content description or both.
2. Added a test case for the search by content description.
3. Updated the code in the AccessibilityManagerService to reflect the latest
changes there so the test automation service works - this is the fake
service used for UI automation.
Change-Id: I14a6779a920ff0430e78947ea5aaf876c2e66076
The package monitor in the AccessibilityManagerService was not
watching for packages that are removed. This is needed since
1) we need to remove the package from the enabled accessibility
services and clean up after the removed service; 2) we need to
disable accessibility if the last accessibility service went away.
Change-Id: I06d33b411ce60703e5a2843107323ffc87046c16
Accessibility was kept enabled even if all accessibility services
are disabled (explicitly by the user or removed) which was causing
the system to fire accessibility events that will never be consumed.
Change-Id: Ifb03e786ac0106687252bd1979725ffd724ad1c5
1. The touch explorer was not canceling long press runnable when a finger
goes down. This was causing system crash in the scenario of one pointer
down and not moving followed by another pointer down. Since the long press
runnable posted when the first pointer went down was not removed, it was
sending events with a wrong pointer id, leading to a crash.
bug:5271592
Change-Id: I40dd7dd21d465ecedd9413f00b3cedc6066fa22d
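The fix follows the usual cancel-pending-runnable pattern (illustrative names):

    // Posted when the first pointer goes down and holds still:
    mHandler.postDelayed(mLongPressRunnable, ViewConfiguration.getLongPressTimeout());

    // Must be removed as soon as another pointer goes down, otherwise the
    // runnable later fires with a stale pointer id and crashes the system:
    mHandler.removeCallbacks(mLongPressRunnable);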
1. Tuned the max angle between two moving fingers in touch
exploration mode for a gesture to be considered a drag.
The previous value was too aggressive and it was fairly
easy for the user to get out of dragging state if she
increases the distance between her fingers.
bug:5223787
2. Before clicking the explorer was sending hover enter and
exit resulting in firing the corresponding accessibility
events which leads to announcement of the content under
the tap that triggered the click. However, the click is
actually performed on the last touch explored location
(if in the distance slop of course) instead of the actual
tapping pointer location. Before fixing that the user was
confused since he was hearing announcement of one content
but actually was clicking on something else.
bug:5225721
Change-Id: I79fec704878f98c95f181bf8a9647e0bb1bd10ef
1. The downTime of the first down event was zero but it should be the event time.
2. Hover exit events were not injected while transitioning to delegating
state and when tapping.
3. Differentiation between the dragging and delegating states, previously
based on the direction and distance of the two moving pointers, is now based only on
the direction. Hence, two pointers moving in the same direction
are dragging, otherwise the event stream is delegated unmodified.
The reason for that is that blind people cannot easily determine
and control the distance between their fingers, resulting in
different behavior for gestures which the user thinks are the same,
which creates confusion. Also in some cases delegation and
dragging yield the same result, for example in a list view, further
adding to the confusion. This was also causing the status bar to
be opened and closed unreliably, creating frustration.
4. Refactored the code such that now there is only one method that
injects motion events and all requests go through it. Some bugs
were introduced by inconsistent implementation in the different
injection methods.
5. Fixed a couple of event stream inconsistencies reported by the
event consistency verifier.
bug:5224183
bug:5223787
bug:5214829
Change-Id: I16c9be3562ad093017af5b974a41ab525b73453f
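The direction-only decision in (3) boils down to comparing the two pointers'
movement vectors; a simplified version of the idea (not the actual TouchExplorer
code):

    // Returns true if the two pointers move in roughly the same direction, in
    // which case the gesture is treated as a drag; otherwise the raw event
    // stream is delegated to the view hierarchy unmodified.
    static boolean isDraggingGesture(float firstDeltaX, float firstDeltaY,
            float secondDeltaX, float secondDeltaY, float minDirectionCosine) {
        double firstMagnitude = Math.hypot(firstDeltaX, firstDeltaY);
        double secondMagnitude = Math.hypot(secondDeltaX, secondDeltaY);
        if (firstMagnitude == 0 || secondMagnitude == 0) {
            // A pointer that has not moved cannot contradict the other one.
            return true;
        }
        // Cosine of the angle between the two movement vectors.
        double cosine = (firstDeltaX * secondDeltaX + firstDeltaY * secondDeltaY)
                / (firstMagnitude * secondMagnitude);
        return cosine >= minDirectionCosine;
    }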
The content retrieval APIs are synchronous from a client's
perspective but internally they are asynchronous. The client thread
calls into the system requesting an action and providing a callback
to receive the result after which it waits up to a timeout for that
result. The system enforces security and then delegates the request
to a given view hierarchy where a message is posted (from a binder
thread) describing what is to be performed by the main UI thread, the
result of which is delivered via the mentioned callback. However,
the blocked client thread and the main UI thread of the target view
hierarchy can be the same one, for example an accessibility service
and an activity run in the same process, thus they are executed on the
same main thread. In such a case the retrieval will fail since the UI
thread that has to process the message describing the work to be done
is blocked waiting for a result it has to compute! To avoid this scenario
when making a call the client also passes its process and thread ids so
the accessed view hierarchy can detect if the client making the request
is running in its main UI thread. In such a case the view hierarchy,
specifically the binder thread performing the IPC to it, does not post a
message to be run on the UI thread but passes it to the singleton
interaction client through which all interactions occur, and the latter is
responsible for executing the message before starting to wait for the
asynchronous result delivered via the callback. In this case the expected
result is already received so no waiting is performed.
bug:5138933
Change-Id: I382e2d8689f5189110226613c2387f553df98bd3
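The essence of the same-thread escape hatch, heavily simplified and with
hypothetical names (mUiThreadId, executeBeforeWaiting):

    // Client side: the caller sends its process and thread ids with the request.
    final int interrogatingPid = Process.myPid();
    final long interrogatingTid = Thread.currentThread().getId();

    // View-hierarchy side, running on a binder thread:
    if (interrogatingPid == Process.myPid() && interrogatingTid == mUiThreadId) {
        // The caller *is* this hierarchy's UI thread, which is currently blocked
        // waiting for the result. Posting the message would deadlock, so hand it
        // back to the interaction client to execute before it starts waiting.
        interactionClient.executeBeforeWaiting(message);
    } else {
        mHandler.sendMessage(message);  // normal asynchronous path
    }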
1. The touch explorer uses delayed injection of events
which can happen after its hosting accessibility
input filter has been unregistered, thus the explorer
was trying to inject events when this is not allowed.
Now upon unregistration of the accessibility input filter
it resets the state of the touch explorer it hosts.
bug:5105956
Change-Id: I720682abf93382aedf4f431eaac90fd2c781e442
1. The code for detecting the end of a touch exploration gesture
was not injecting the hover exit event upon detection of the
gesture end.
bug:5091758
Change-Id: I468164617d6677cd2a2a2815e1756c826d49f3a9
1. Events not generated by the user can change the interrogation allowing window
unpredictably. For example when a ListView lays out its children it fires
accessibility events and changes the currently active window while the user
interaction may be happening in another window say a dialog. Now the interrogation
allowing window is changed when a new window is shown or the user has touch
explored it.
bug:5074116
Change-Id: I8dde12bbec807d32445a781eedced9b95312b3e2
1. The first problem is manifested on Prime. Apparently the Prime screen driver
is very aggressive in filtering move events that originate from almost the same
location. Hence, the framework doesn't see a constant stream of events. However,
the TouchExplorer implementation was assuming a constant event stream to detect
long press. Refactored the code such that no assumptions for the event stream
are made.
2. Touch exploring an item and then tapping far away from that item was activating
it, hence not respecting the distance slop. This was due to an incorrect check of
the latter.
bug:5070917
Change-Id: I3627a2feeb3712133f58f8f8f1ab7a2ec50cdc9a
1. Upon registration of an accessibility client the latter received only
the accessibility state and waited for the touch exploration state
to be sent by the system in an async manner. This led to the very first
check of the touch exploration state reporting a wrong value.
Now the state of the accessibility layer is returned to the client
upon registration.
2. Removed the dependency on a talking accessibility service being enabled
for getting into touch exploration mode. What if the user wants to use
an accessibility service that shows a dialog with the text of the touched
view?
bug:5051546
Change-Id: Ib377babb3f560929ee73bd3d8b0d277341ba23f7
1. Separated touch exploration into its own setting rather than being
magically enabled by the system if accessibility is on and there
is at least one enabled accessibility service that speaks. Now
there is a setting for requesting touch exploration but still the
system will enable it only if that makes sense, i.e. accessibility
is on and an accessibility service that speaks is enabled.
2. Added a public API for checking whether touch exploration is enabled.
3. Added description attribute in accessibility service declaration
which will be shown to the user before enabling the service.
4. Added API for quick cloning of AccessibilityNodeInfo.
5. Added clone functionality to SparseArray, SparseIntArray, and
SparseBooleanArray.
bug:5034010
bug:5033928
Change-Id: Ia442edbe55c20309244061cd9d24e0545c01b54f
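Public-facing pieces of (2) and (3), with the names they ended up with
(assumed here):

    // 2. Anyone can now query whether explore-by-touch is active:
    AccessibilityManager manager = (AccessibilityManager)
            context.getSystemService(Context.ACCESSIBILITY_SERVICE);
    if (manager.isTouchExplorationEnabled()) {
        // Adjust touch handling for explore-by-touch users.
    }

    // 3. The user-visible description is declared in the service's
    //    accessibility-service XML via the android:description attribute.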
1. If an accessibility service does not request access to the window
content but tries to retrieve it, an exception is thrown to point the developer
to the reason.
bug:5038284
Change-Id: Ibf08f4d2c8ad8939c4f4c2e288048a4f8ff1e31b
1. Touch exploration start and end events are generated
by the system to provide additional information for
accessibility services. Since such events do not come
from any particular window they should not change the
id of the window that currently allows exploring its
content.
2. Touch exploration start and end events were leaking the
touch explorer class which is private.
bug:5026258
Change-Id: Icaf3e2bd9566716f2afb876cf8e0d50813b0c76e
1. Due to thread interleaving it was possible that
two messages are sent for requesting dispatch of
the same accessibility event and since the first
one sends the event and removes it from the pending
list the second message pulls null during the event
lookup. Look at the patch's comments for a detailed
scenario and rationale of the fix.
bug:4886129
Change-Id: If8b272ceaec7709c659ae502c3a730e63c939172
1. The explorer was injecting up/down touch events to
click with the id of the last pointer that went up
but the prototype, i.e. the last touch explore event, may
not contain this pointer. Since we click on the last
touch explored location, using the action pointer
index of that event is the right approach.
bug:4551506
Change-Id: I73428b09dc014417096a52e667f58768a2871dc8