This can be used by apps that won't work without an account of that
type in the limited user environment. This way we can avoid letting
users select these apps when setting up a limited user.
Bug: 8600261
Change-Id: Iaa0dd5ff88e89fa7a1d8a4e70317290268411bdb
Add the encrypted flag for the KeyPairGenerator and the KeyStore so that
applications can choose to allow entries when there is no lockscreen.
Bug: 8122243
Change-Id: Ia802afe965f2377ad3f282dab8c512388c705850
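A minimal sketch of how an application might opt in, assuming the flag
is exposed as KeyPairGeneratorSpec.Builder#setEncryptionRequired();
omitting the call keeps entries usable without a lockscreen:

    import java.math.BigInteger;
    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.util.Calendar;

    import javax.security.auth.x500.X500Principal;

    import android.content.Context;
    import android.security.KeyPairGeneratorSpec;

    public class EncryptedKeyExample {
        // Generate an RSA key pair inside the Android KeyStore. With
        // setEncryptionRequired(), generation fails unless the keystore
        // is encrypted, i.e. a secure lockscreen has been set up.
        static KeyPair generate(Context context) throws Exception {
            Calendar start = Calendar.getInstance();
            Calendar end = Calendar.getInstance();
            end.add(Calendar.YEAR, 1);

            KeyPairGeneratorSpec spec = new KeyPairGeneratorSpec.Builder(context)
                    .setAlias("my_key")
                    .setSubject(new X500Principal("CN=my_key"))
                    .setSerialNumber(BigInteger.ONE)
                    .setStartDate(start.getTime())
                    .setEndDate(end.getTime())
                    .setEncryptionRequired() // require an encrypted keystore
                    .build();

            KeyPairGenerator generator =
                    KeyPairGenerator.getInstance("RSA", "AndroidKeyStore");
            generator.initialize(spec);
            return generator.generateKeyPair();
        }
    }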
Avoid confusion with user restrictions which can be applied to
non-limited users as well.
Updated the java docs.
Change-Id: I4097c50b528b01a49cebcb0832d09f2b06998faa
Add features that round out the animation APIs (missing
getters, etc.). Also fix doc typos.
Issue #8350510 Add APIs needed for future animation capabilities
Change-Id: I063736848ba26e6d6c809b15fc3a103c74222f46
1. This allows UiAutomation-type tests to run as if an
Android monkey test is running, so that applications which
recognize they are being driven by a test framework can
avoid performing certain actions, such as calling 911
(see the sketch below).
2. Fixed a bug where UiAutomation#disconnect() was not
called when the instrumentation is shut down.
bug: 8588857
Change-Id: I9e3624dfbe2b8f81f27805711de1098ea2edd03d
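For the first point above, a minimal sketch of the app-side check this
enables, assuming the automation run is reported through the existing
ActivityManager.isUserAMonkey() API; placeEmergencyCall() is a
hypothetical app helper, not a platform method:

    import android.app.ActivityManager;

    public class DialerHelper {
        // Skip dangerous or destructive actions when the app is being
        // driven by a monkey or a UiAutomation-based test.
        public static boolean maybePlaceEmergencyCall() {
            if (ActivityManager.isUserAMonkey()) {
                return false; // driven by a test framework; do nothing
            }
            placeEmergencyCall();
            return true;
        }

        private static void placeEmergencyCall() {
            // App-specific dialing logic would go here.
        }
    }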
...to allow for orientation locking
This doesn't add an API to get the current orientation, since that is
inherently racy. Instead, there is a new "locked" orientation mode that
locks the screen into whatever the current rotation is.
While at it, added a few other useful orientation modes.
Change-Id: I5c369e6511cb72294e9e922ea8acffd770df9440
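A minimal sketch of locking an activity into its current rotation,
assuming the new mode is exposed as ActivityInfo.SCREEN_ORIENTATION_LOCKED:

    import android.app.Activity;
    import android.content.pm.ActivityInfo;
    import android.os.Bundle;

    public class VideoActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Lock the screen into whatever rotation it currently has;
            // the current rotation itself is never queried, which avoids
            // the race described above.
            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LOCKED);
        }

        void allowRotationAgain() {
            // Return to the default rotation behavior.
            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_UNSPECIFIED);
        }
    }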
Created constants in current.txt and UserManager.java, modified restrictions access in UserManagerService.java.
Change-Id: If8d778d84af81dcbf5784f6e0afd9ef966cc8ecf
Now that we have gestures which are detected by the system and
interpreted by an accessibility service, there is inconsistent
behavior between using gestures and using the keyboard. Some devices
have both. Therefore, an accessibility service should be able to
interpret keys in addition to gestures to provide a consistent user
experience. Now an accessibility service can expose shortcuts for
each gestural action.
This change adds APIs for an accessibility service to observe and,
at will, intercept key events before they are dispatched to the
rest of the system. The service can return true or false from its
onKeyEvent to either consume the event or to let it be delivered
to the rest of the system. However, the service will *not* be
able to inject key events or modify the observed ones.
Previous ideas of allowing the service to say it "tracks" an event,
so that the event is not delivered to the system until a subsequent
event is marked as either "handled" or "not handled", will not work.
If the service tracks a key but no other key is pressed, that key is
essentially never delivered to the app, and the stashed event would
be delivered at a potentially much later point, in maybe a completely
different context. The correct way of implementing shortcuts is a
combination of modifier keys plus some other key or key sequence.
Key events already contain information about which modifier keys are
down, and the service can also track them itself.
bug:8088812
Change-Id: I81ba9a7de9f19ca6662661f27fdc852323e38c00
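A minimal sketch of a service consuming a modifier-plus-key shortcut,
assuming onKeyEvent has the contract described above (return true to
consume, false to let the event be delivered); the mapping of ALT+A to
the back action is made up, and any service configuration needed to
request key filtering is omitted:

    import android.accessibilityservice.AccessibilityService;
    import android.view.KeyEvent;
    import android.view.accessibility.AccessibilityEvent;

    public class ShortcutService extends AccessibilityService {
        @Override
        protected boolean onKeyEvent(KeyEvent event) {
            // Shortcuts are a modifier key plus another key, so the
            // decision can be made from this single event.
            if (event.getAction() == KeyEvent.ACTION_DOWN
                    && event.isAltPressed()
                    && event.getKeyCode() == KeyEvent.KEYCODE_A) {
                performGlobalAction(GLOBAL_ACTION_BACK);
                return true; // consumed: not delivered to the rest of the system
            }
            return false; // not consumed: delivered as usual
        }

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) { }

        @Override
        public void onInterrupt() { }
    }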
1. Add uncalibrated gyroscope and magnetic field sensors.
2. Change max number of events from 3 to 16.
3. Add new APIs for trigger sensors.
Change-Id: Ifac5c0024c8e5f88b721e5cd97ff26afaaa36717
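For the trigger-sensor APIs, a minimal sketch of one-shot delivery,
assuming the SensorManager#requestTriggerSensor / TriggerEventListener
shape and the significant-motion sensor as the trigger source:

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorManager;
    import android.hardware.TriggerEvent;
    import android.hardware.TriggerEventListener;

    public class MotionTrigger {
        private final SensorManager sensorManager;
        private final Sensor significantMotion;

        // Trigger sensors fire once and are then automatically canceled,
        // unlike continuously registered sensors.
        private final TriggerEventListener listener = new TriggerEventListener() {
            @Override
            public void onTrigger(TriggerEvent event) {
                // React to the motion, then re-arm for the next trigger.
                sensorManager.requestTriggerSensor(this, significantMotion);
            }
        };

        public MotionTrigger(Context context) {
            sensorManager =
                    (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
            significantMotion =
                    sensorManager.getDefaultSensor(Sensor.TYPE_SIGNIFICANT_MOTION);
        }

        public void arm() {
            if (significantMotion != null) {
                sensorManager.requestTriggerSensor(listener, significantMotion);
            }
        }
    }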