1. Since the API version has been finalized, this change
updates the SDK version checks to use the JellyBean
version number.
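For illustration, a minimal sketch of the kind of check this refers to,
assuming the public Build.VERSION constants (the actual call sites are
not shown in this message):

    import android.os.Build;

    public final class VersionChecks {
        // Gate behavior on the finalized Jelly Bean (API 16) version
        // instead of a provisional/development version check.
        public static boolean isJellyBeanOrLater() {
            return Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN;
        }
    }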
bug:5947249
Change-Id: Ie22fa7e18a7ea7b0c7077d80246a26c17f327ceb
The window manager now performs the crop internally, evaluating
it every animation frame, so that it can be updated along with
the surface position.
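A rough sketch of the idea only; the SurfaceProxy interface below is a
hypothetical stand-in for the real window manager surface handle, not an
actual API from this change:

    import android.graphics.Rect;

    // Hypothetical surface handle used purely for illustration.
    interface SurfaceProxy {
        void setPosition(float x, float y);
        void setCrop(Rect crop);
    }

    final class WindowAnimatorSketch {
        // Invoked once per animation step so the crop is re-evaluated
        // together with the surface position instead of being applied once.
        void stepAnimation(SurfaceProxy surface, Rect clipRect, float left, float top) {
            surface.setPosition(left, top);
            surface.setCrop(new Rect(clipRect));
        }
    }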
Change-Id: I960a2161b9defb6fba4840fa35aee4e411c39b32
1. The initial design was to have some accessibility gestures
handled by the system if the gesture-handling accessibility
service does not consume the gesture. However, we are not
sure what a good default is, and once we add a default handler
we cannot remove it since people may rely on it. Thus, we
take the simplest approach and let the accessibility service
handle the gestures. If no gestures are handled, the system
will work in explore by touch as before.
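A minimal sketch of the service side, using the AccessibilityService
gesture callback; the specific gesture handling is illustrative only:

    import android.accessibilityservice.AccessibilityService;
    import android.view.accessibility.AccessibilityEvent;

    public class GestureHandlingService extends AccessibilityService {
        @Override
        protected boolean onGesture(int gestureId) {
            // Returning true means the service consumed the gesture;
            // otherwise the system behaves as in plain explore by touch.
            switch (gestureId) {
                case GESTURE_SWIPE_RIGHT:
                    // Illustrative: e.g. move accessibility focus forward.
                    return true;
                default:
                    return false;
            }
        }

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) { }

        @Override
        public void onInterrupt() { }
    }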
bug:5932640
Change-Id: I865a83549fa03b0141d27ce9713e9b7bb45a57b4
- During the transition, fade the bg to black
- Exiting activity fades to black
- Recents background no longer fades away, because
then it would fight against the fade to black
happening behind it
The CertBlacklister is a mechanism that allows us to use gservices
to blacklist certs by serial number or public key.
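A hedged sketch of the general pattern only; the setting key, output path,
and class shape below are assumptions, not details taken from this change:
watch a secure setting pushed through gservices and persist its value as
the blacklist.

    import android.content.ContentResolver;
    import android.database.ContentObserver;
    import android.os.Handler;
    import android.provider.Settings;
    import java.io.FileWriter;
    import java.io.IOException;

    // Illustrative only: key name and output file are assumed.
    final class BlacklistObserverSketch extends ContentObserver {
        private static final String KEY = "cert_blacklist_serial";
        private static final String OUT = "/data/misc/keychain/serial_blacklist.txt";

        private final ContentResolver resolver;

        BlacklistObserverSketch(Handler handler, ContentResolver resolver) {
            super(handler);
            this.resolver = resolver;
        }

        void register() {
            resolver.registerContentObserver(Settings.Secure.getUriFor(KEY), false, this);
        }

        @Override
        public void onChange(boolean selfChange) {
            String value = Settings.Secure.getString(resolver, KEY);
            if (value == null) return;
            try (FileWriter writer = new FileWriter(OUT)) {
                // Persist the pushed blacklist for consumers to read.
                writer.write(value);
            } catch (IOException ignored) {
            }
        }
    }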
Change-Id: Ie4b0c966a8a43c9823fb550c0b1691204f133ae7
This refactoring sets the stage for a follow-on change that
will make use of additional functions of the power HAL.
Moved functionality from android.os.Power into PowerManagerService.
None of these functions make sense being called outside of the
system server. Moving them to the PowerManagerService makes it
easier to ensure that the power HAL is initialized exactly once.
Similarly, moved ShutdownThread out of the policy package and into
the services package where it can tie into the PowerManagerService
as needed.
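A minimal sketch of the initialize-exactly-once intent described above;
the class and the placeholder HAL call are hypothetical, since the real
code goes through JNI into the power HAL:

    final class PowerHalSketch {
        private static boolean sInitialized;

        // Keeping this entry point inside the system server makes it easy
        // to guarantee the HAL is initialized exactly once, even if the
        // method is reached from more than one code path during boot.
        static synchronized void initOnce() {
            if (sInitialized) {
                return;
            }
            // Placeholder for the JNI call that initializes the power HAL.
            sInitialized = true;
        }
    }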
Bug: 6435382
Change-Id: I958241bb124fb4410d96f5d5eb00ed68d60b29e5
1. Delegating activation gestures has several issues that we should
decide how to handle, if possible, before allowing an accessibility
service to take them over:
A) Every view that can be clicked or long pressed would need to react
to such input in response to calls to performClick and performLongClick,
which is not necessarily true since the view may watch the touch
events and do its own click/long-click detection. As a result, there
may be views a user cannot interact with in touch exploration mode
but can interact with outside that mode.
B) Clicking or long pressing at different locations in a view may yield
different results, for example in NumberPicker. Ideally such views should
implement an AccessibilityNodeProvider that correctly handles click and
long-click requests on virtual nodes (see the sketch after this list).
Some apps, however, just fire different hover accessibility events when
the user is over a specific semantic portion of the view but do not
provide virtual nodes. Hence, a user will not be able to interact with
such semantic regions, but the system can achieve that by sending the
click/long click at the precise location in the view that was last
touch explored.
2. Adding a flag on the accessibility service info to request explore by
touch mode. There is no need to put the device in this mode if none of the
currently enabled accessibility services supports it. Now the problem is
inverted and the service has to explicitly state its capability.
3. Fixing a bug where includeImportantViews was ignored for automation
services.
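As referenced in point B above, a minimal sketch of a view exposing virtual
nodes through AccessibilityNodeProvider so click/long-click requests on
semantic regions can be handled; the virtual IDs and the region semantics
are illustrative assumptions:

    import android.os.Bundle;
    import android.view.accessibility.AccessibilityNodeInfo;
    import android.view.accessibility.AccessibilityNodeProvider;

    // Illustrative provider for a view with two semantic regions,
    // e.g. the increment/decrement areas of a picker-like widget.
    final class VirtualRegionProviderSketch extends AccessibilityNodeProvider {
        static final int REGION_INCREMENT = 1; // assumed virtual ids
        static final int REGION_DECREMENT = 2;

        @Override
        public AccessibilityNodeInfo createAccessibilityNodeInfo(int virtualViewId) {
            AccessibilityNodeInfo info = AccessibilityNodeInfo.obtain();
            info.addAction(AccessibilityNodeInfo.ACTION_CLICK);
            info.addAction(AccessibilityNodeInfo.ACTION_LONG_CLICK);
            return info;
        }

        @Override
        public boolean performAction(int virtualViewId, int action, Bundle arguments) {
            switch (action) {
                case AccessibilityNodeInfo.ACTION_CLICK:
                    // Perform the same work a touch on this region would trigger.
                    return true;
                case AccessibilityNodeInfo.ACTION_LONG_CLICK:
                    return true;
                default:
                    return false;
            }
        }
    }

For point 2, the requested mode would be declared via the service's
AccessibilityServiceInfo flags (presumably something like
FLAG_REQUEST_TOUCH_EXPLORATION_MODE), so only services declaring the
capability put the device into explore-by-touch mode.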
Change-Id: I3b29a19f24ab5e26ee29f974bbac2197614c9e2a
Conflicts:
api/current.txt
1. Scrolling actions are crucial for enabling a gesture-based
traversal of the UI, specifically of scrollable containers such as
lists and anything backed by an adapter. Since accessibility focus
can land only on attached views, it cannot visit views for adapter
items not shown on the screen. Auto-scrolling the list as a result
of putting accessibility focus on a list item does not work well,
since the user may get trapped in a long list. Adding an accessibility
node provider to emit virtual views for one view before the first and
one after the last is complex and suffers the same limitation of
trapping the user. Accessibility services need explicit scroll actions
that can be performed upon an explicit user action. Hence, the user is
informed about reaching the start/end of the visible part of the list
and makes a deliberate choice to scroll. This will also benefit people
developing Braille devices, since they can scroll the content without
telling the user to stop using the Braille controller, take the device
out of their pocket to scroll, and then go back to the Braille controller.
NOTE: Without these actions, large portions of the screen would be hard
to access, since users would have to touch explore to find and scroll
the list.
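A minimal sketch of how an accessibility service could dispatch such a
scroll action on the nearest scrollable ancestor in response to an
explicit user action; the surrounding service plumbing (and node
recycling) is omitted for brevity:

    import android.view.accessibility.AccessibilityNodeInfo;

    final class ScrollActionSketch {
        // Walk up from the given node and scroll the first scrollable
        // container found. Returns true when the action was dispatched.
        static boolean scrollForward(AccessibilityNodeInfo node) {
            AccessibilityNodeInfo current = node;
            while (current != null) {
                if (current.isScrollable()) {
                    return current.performAction(
                            AccessibilityNodeInfo.ACTION_SCROLL_FORWARD);
                }
                current = current.getParent();
            }
            return false;
        }
    }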
Change-Id: Iafcf54d4967893205872b3649025a4e347a299ed