* commit 'c97fa8f8a02f1850779cc1e6ff6b67bcdbeecb1b':
Always accept API calls from processes which have INTERACT_ACROSS_USERS_FULL in InputMethodManagerService
This takes the easy way around notifying the correct users
about GPS state transitions by notifying ALL the users(!).
I've also laid groundwork for proper multiuser support in
LocationManager and did a tiny bit of cleanup in
GpsNetInitiatedHandler while I was looking at notifications.
Bug: 7213552
Change-Id: I2d6dc65c459e55d110ac0f5f79ae7a87ad638ede
* commit '135e5fb71242b1151929e2ea7bf221ff421e6ad2':
Always accept API calls from processes which have INTERACT_ACROSS_USERS_FULL in InputMethodManagerService
Headsets are now detected from calls coming in from the input switch
subsystem if a config.xml value is set to true.
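A boolean like this would typically be read straight from resources; a
minimal sketch, with illustrative names and the resource id left as a
parameter since the message does not say which config.xml value is used:

    import android.content.Context;

    class HeadsetDetectionConfigSketch {
        // When the overlayable boolean is true, headset plug/unplug state is
        // taken from input switch events instead of the legacy detection path.
        static boolean useInputSwitchForHeadsets(Context context, int configBoolResId) {
            return context.getResources().getBoolean(configBoolResId);
        }
    }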
Bug: 6548391
Change-Id: I79259d2742e157b106a746474f32ffd1c171ddf3
...#testScreenLayout failures on JO
This doesn't actually fix it; I have concluded that the test is broken
(the platform is correctly reporting that this is a NOT LONG device
because in portrait once you account for the status bar and system
bar our size is 880dp high and 600dp wide, which is not enough for us
to be in the LONG config).
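For reference, the long/notlong decision comes down to an aspect-ratio
check; a minimal sketch, assuming the roughly 5:3 (WVGA) threshold the
framework uses, with illustrative names:

    class ScreenLayoutLongSketch {
        // A screen counts as LONG only when the long side is at least ~5/3
        // of the short side. For 880dp x 600dp: 880 * 3 / 5 = 528 < 600,
        // so the device is NOT LONG.
        static boolean isLong(int longSideDp, int shortSideDp) {
            return longSideDp * 3 / 5 >= shortSideDp;
        }
    }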
However, while working on this I noticed that the code for computing
the configuration of the external display was wrong. I have fixed
that by moving the code that computes these parts of the configuration
into a common place that both the window manager and the external
display code can use.
Change-Id: Ic6a84b955e9ec345a87f725203a29e4712dac0ad
1. A recently added check was preventing touch exploration from being
disabled when the last touch exploring service was turned off.
As a consequence, enabling explore by touch was initializing the
input filter with the magnification and the not yet disabled
screen magnification features.
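Roughly, the fix amounts to recomputing the touch exploration flag from the
currently enabled services instead of keeping a stale value; a hypothetical
sketch (names and structure are illustrative, not the actual
AccessibilityManagerService code):

    import java.util.List;

    class TouchExplorationStateSketch {
        interface ServiceInfo {
            boolean isEnabled();
            boolean requestsTouchExploration();
        }

        // Recompute from scratch: the flag drops to false as soon as the
        // last enabled touch exploring service is turned off, so the input
        // filter is no longer initialized with features that should be off.
        static boolean computeTouchExplorationEnabled(List<ServiceInfo> services) {
            for (ServiceInfo service : services) {
                if (service.isEnabled() && service.requestsTouchExploration()) {
                    return true;
                }
            }
            return false;
        }
    }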
bug:7256223
Change-Id: I9ed5457705d625805462e4d316b2c8a5af9aabca
1. In explore-by-touch, when the user slides two fingers in the same
direction we consider it a drag gesture. We merge the pointers into
one and deliver a touch event. When one of the pointers went up
we were transitioning into the touch exploring state. This means we
were transitioning to another state in the middle of a gesture, which
creates complications and leads to the interaction end event not
being sent.
This change transitions out of the dragging state only when all
pointers go up, which keeps the logic simple and ensures all events
are properly sent. Consequently, after starting a drag the user has
to lift all pointers before touch exploring. Since users usually
either drag or touch explore, this seems the simplest and
*least risky* fix.
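A minimal sketch of the revised rule (illustrative names, not the actual
TouchExplorer code): stay in the dragging state on ACTION_POINTER_UP and
leave it, after sending the end event, only on ACTION_UP:

    import android.view.MotionEvent;

    class DragStateSketch {
        static final int STATE_TOUCH_EXPLORING = 1;
        static final int STATE_DRAGGING = 2;

        int mCurrentState = STATE_DRAGGING;

        void onMotionEventWhileDragging(MotionEvent event) {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_POINTER_UP:
                    // One of the two fingers lifted: keep dragging instead
                    // of switching state in the middle of the gesture.
                    break;
                case MotionEvent.ACTION_UP:
                    // The last pointer went up: the gesture is over, so send
                    // the interaction end event and only then leave dragging.
                    sendInteractionEndEvent();
                    mCurrentState = STATE_TOUCH_EXPLORING;
                    break;
            }
        }

        void sendInteractionEndEvent() {
            // In the framework this would dispatch an event of type
            // AccessibilityEvent.TYPE_TOUCH_INTERACTION_END.
        }
    }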
bug:7253731
Change-Id: Ie8588fbe9b26cb81312bd7fd377c94732e41e3f8
Issue #7211769: Crash dialog from background user has non-working "report"
The report button now launches the issue reporter for the correct user.
Also, for crashes on background users, either disable the report button
or simply don't show the dialog, depending on the build config.
Issue #7244492: Bugreport button in Quick Settings doesn't actually do anything
Now they do.
Issue #7226656: second user seeing primary user's apps
I haven't had any success at reproducing this. I have tried to tighten up
the path where we create the user to ensure nothing could cause the
user's applications to be accessed before the user is fully created and
thus cause them to be marked as installed... but I can't convince myself
that is the actual problem.
Also tightened up the user switch code to use foreground broadcasts for all
of the updates about the switch (since this is really a foreground operation),
added a facility to have BOOT_COMPLETED broadcasts not get launched for
secondary users and used that on a few key system receivers, and fixed some
debug output.
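For the foreground broadcast part, a minimal sketch assuming the standard
Intent.FLAG_RECEIVER_FOREGROUND mechanism; ACTION_USER_SWITCHED and
EXTRA_USER_HANDLE are framework-internal constants, and the wrapper method
is illustrative rather than the real user switch code:

    import android.content.Context;
    import android.content.Intent;

    class UserSwitchBroadcastSketch {
        static void sendUserSwitchedBroadcast(Context context, int newUserId) {
            Intent intent = new Intent(Intent.ACTION_USER_SWITCHED);
            intent.putExtra(Intent.EXTRA_USER_HANDLE, newUserId);
            // FLAG_RECEIVER_FOREGROUND puts the broadcast on the foreground
            // queue, since a user switch is really a foreground operation.
            intent.addFlags(Intent.FLAG_RECEIVER_REGISTERED_ONLY
                    | Intent.FLAG_RECEIVER_FOREGROUND);
            context.sendBroadcast(intent);
        }
    }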
Change-Id: Iadf8f8e4878a86def2e495e9d0dc40c4fb347021