149 Commits

Svetoslav Ganov
ec33d56300 Exception in the touch explorer when dragging.
1. During a drag in touch exploration we have two pointers moving in the same
   direction but inject only one of them. If the dragging pointer goes up we
   send an up event to the view system and wait for all pointers to go up before
   transitioning to the touch exploring state. At that point the dragging pointer
   id is cleared, so if a new pointer goes down we try to send an up event for
   the dragging pointer (rather than doing nothing) even though we already did,
   and the now invalid pointer id causes an exception when splitting the motion event.
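
A minimal sketch of the resulting guard, with hypothetical field and helper names rather than the actual TouchExplorer code:

    import android.view.MotionEvent;

    /** Hypothetical sketch, not the actual TouchExplorer code. */
    class DragStateSketch {
        private static final int INVALID_POINTER_ID = -1;
        private int mDraggingPointerId = INVALID_POINTER_ID;

        void onPointerDownWhileWaitingForAllPointersUp(MotionEvent event) {
            if (mDraggingPointerId == INVALID_POINTER_ID) {
                // The up event for the dragging pointer was already injected when
                // it lifted; returning here avoids splitting the motion event with
                // a stale pointer id, which is what threw the exception.
                return;
            }
            injectUpEventForPointer(event, mDraggingPointerId); // hypothetical helper
            mDraggingPointerId = INVALID_POINTER_ID;
        }

        private void injectUpEventForPointer(MotionEvent event, int pointerId) {
            // Event splitting and injection details are omitted in this sketch.
        }
    }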

bug:7282053

Change-Id: I690bf8bdf6e2e5851ee46a322c4a1bb7d484b53a
2012-10-03 17:05:57 -07:00
Svetoslav Ganov
9bfb8bcfeb Merge "Up motion event not injected by the touch explorer at the end of a drag." into jb-mr1-dev 2012-10-02 16:36:29 -07:00
Svetoslav Ganov
aeb8d0ed0d Up motion event not injected by the touch explorer at the end of a drag.
1. The up event was not injected when the last pointer went up, i.e.
   at the end of the drag. This patch sends an up event when the dragging
   pointer goes up, regardless of whether it goes up first or second.

bug:7272830

Change-Id: I708a2b93ee2d0a4c46dbeea002841666e919602d
2012-10-02 14:33:27 -07:00
Jeff Sharkey
6e2bee75ce Migrate more System and Secure settings to Global.
Includes telephony, WindowManager, PackageManager, and debugging
settings.  Update API to point towards moved values.
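
For callers this mostly means reading the moved values through Settings.Global; a minimal illustration using one of the migrated debugging settings:

    import android.content.ContentResolver;
    import android.provider.Settings;

    class GlobalSettingsExample {
        // Reads ADB_ENABLED, one of the debugging settings that moved to Global.
        static boolean isAdbEnabled(ContentResolver resolver) {
            return Settings.Global.getInt(resolver, Settings.Global.ADB_ENABLED, 0) != 0;
        }
    }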

Bug: 7231764, 7231252, 7231156
Change-Id: I5828747205708872f19f83a5bc821ed0a801cb79
2012-10-02 13:55:15 -07:00
Svetoslav Ganov
0944d62544 Merge "Touch explorer and magnifier do not work well together." into jb-mr1-dev 2012-10-02 13:14:06 -07:00
Svetoslav Ganov
45af84a483 Touch explorer and magnifier do not work well together.
1. If touch exploration and screen magnification are enabled and the screen
   is currently magnified, gesture detection does not work well. The reason
   is that when the screen is magnified we transform the events before passing
   them to the touch explorer to compensate for the magnification, so the
   user touches what he thinks he touches. However, when doing gesture
   detection and velocity computation this compensation shrinks the gestured
   shape and decreases the velocity, leading to poor gesture recognition and
   an incorrect velocity.

   This change adds an onRawMotionEvent method to the event transformation chain
   which processes the raw touch events. In this method the touch explorer
   passes events to the gesture recognizer and the velocity tracker.

2. The velocity tracker was not cleared on transitions out of the touch exploring
   state, which is the only state that uses velocity.
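
A rough sketch of the idea of a separate raw-event hook in the transformation chain; the interface and class names here are illustrative, not the real framework types:

    import android.view.MotionEvent;
    import android.view.VelocityTracker;

    /** Illustrative only: a transformation stage that also sees raw events. */
    interface EventTransformation {
        /** Receives events after earlier stages (e.g. magnification) transformed them. */
        void onMotionEvent(MotionEvent transformedEvent, int policyFlags);

        /** Receives the untransformed events so gesture/velocity math is not skewed. */
        void onRawMotionEvent(MotionEvent rawEvent, int policyFlags);
    }

    class TouchExplorerSketch implements EventTransformation {
        private final VelocityTracker mVelocityTracker = VelocityTracker.obtain();

        @Override
        public void onMotionEvent(MotionEvent transformedEvent, int policyFlags) {
            // The state machine uses the transformed coordinates so the user
            // touches what he thinks he touches while magnified.
        }

        @Override
        public void onRawMotionEvent(MotionEvent rawEvent, int policyFlags) {
            // Gesture recognition and velocity are fed the raw stream, so the
            // magnification compensation does not shrink shapes or velocities.
            mVelocityTracker.addMovement(rawEvent);
        }
    }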

bug:7266617

Change-Id: I7887fe5f3c3bb6cfa203b7866a145c7341098a02
2012-10-02 12:02:05 -07:00
Svetoslav Ganov
59f07690c2 Owner should not be announced as a user switch.
1. The accessibility layer announces user switches. Even though
   the initial switch to the owner on a single-user device is a
   valid user switch, we should not announce it for accessibility.

bug:7264693

Change-Id: Idf022fab6b74c84b7a96bc4ed7c7fee2b83029a6
2012-10-01 19:22:43 -07:00
Svetoslav Ganov
9ea8f390db Explore by touch enabled when screen magnification is on.
1. A recently added check was preventing touch exploration from being
   disabled when the last touch exploring service was turned off.
   As a consequence, enabling explore by touch was initializing the
   input filter with both the touch exploration and the not yet
   disabled screen magnification features.

bug:7256223

Change-Id: I9ed5457705d625805462e4d316b2c8a5af9aabca
2012-09-29 10:46:16 -07:00
Svetoslav Ganov
46824214bb Sending interaction end event at the end of a drag.
1. In explore-by-touch, when the user slides two fingers in the same
   direction we consider it a drag gesture. We merge the pointers into
   one and deliver a touch event. When one of the pointers went up
   we were transitioning into touch exploring state. This means we were
   transitioning to another state in the middle of a gesture, which
   creates complications and leads to the interaction end event not
   being sent.

   This change transitions out of the dragging state only when all pointers
   go up - simple, and all events are properly sent. Consequently, after
   starting a drag the user has to lift all pointers to touch explore. Since
   a user usually either drags or touch explores, this seems the simplest and
   *least risky* fix.
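
A condensed sketch of the simplified state handling, with hypothetical state constants and helpers (the real TouchExplorer tracks considerably more):

    import android.view.MotionEvent;

    class DraggingStateSketch {
        private static final int STATE_TOUCH_EXPLORING = 1;
        private static final int STATE_DRAGGING = 2;

        private int mCurrentState = STATE_DRAGGING;

        void onMotionEventWhileDragging(MotionEvent event) {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_POINTER_UP:
                    // One of the two dragging fingers lifted: stay in the
                    // dragging state rather than switching mid-gesture.
                    break;
                case MotionEvent.ACTION_UP:
                    // Only when the last pointer goes up do we leave dragging,
                    // so the interaction end event is always sent exactly once.
                    sendInteractionEndEventIfNeeded(); // hypothetical helper
                    mCurrentState = STATE_TOUCH_EXPLORING;
                    break;
            }
        }

        private void sendInteractionEndEventIfNeeded() { /* omitted */ }
    }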

bug:7253731

Change-Id: Ie8588fbe9b26cb81312bd7fd377c94732e41e3f8
2012-09-28 17:04:14 -07:00
Svetoslav Ganov
fe304b8939 Some accessibility events not sent from touch explorer if apps misbehave.
1. The touch explorer is relying on the hover exit accessibility event to be sent
   from the app's view tree before sending the exploration end and last touch
   accessibility events. However, if the app is buggy and does not send the hover
   exit event, then the interaction ending events are never sent. Now there is a
   timeout in which we wait for the hover exit accessibility event before sending
   the gesture end and last touch accessibility events. Hence, we are making a
   best effort to have a consistent event stream.

2. Sneaking in the new nine patch for the border around the magnified region
   since the current one is engineering art.
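
A minimal sketch of the timeout from point 1, using a Handler-posted runnable; the timeout value and helper names are assumptions:

    import android.os.Handler;
    import android.os.Looper;

    class SendHoverExitTimeoutSketch {
        private static final long WAIT_FOR_HOVER_EXIT_TIMEOUT_MILLIS = 200; // assumed value
        private final Handler mHandler = new Handler(Looper.getMainLooper());
        private final Runnable mSendInteractionEndEvents = new Runnable() {
            @Override public void run() {
                // The app never delivered the hover exit event; send the gesture
                // end and last touch events anyway to keep the stream consistent.
                sendGestureEndAndTouchInteractionEndEvents();
            }
        };

        void onHoverExitInjectedIntoApp() {
            mHandler.postDelayed(mSendInteractionEndEvents, WAIT_FOR_HOVER_EXIT_TIMEOUT_MILLIS);
        }

        void onHoverExitAccessibilityEventReceived() {
            // Well-behaved app: cancel the timeout and send the ending events now.
            mHandler.removeCallbacks(mSendInteractionEndEvents);
            sendGestureEndAndTouchInteractionEndEvents();
        }

        private void sendGestureEndAndTouchInteractionEndEvents() { /* omitted */ }
    }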

bug:7233616

Change-Id: Ie64f23659c25ab914565d50537b9a82bdc6a44a0
2012-09-28 11:23:24 -07:00
Svetoslav Ganov
95841ac3c2 Merge "Inconsistent events on transition from gesture detection to touch exploration." into jb-mr1-dev 2012-09-28 10:39:38 -07:00
Svetoslav Ganov
ca8688207b Merge "Accessibility services that do not accept events are mismanaged." into jb-mr1-dev 2012-09-28 10:36:53 -07:00
Daniel Sandler
0dc2b81ce1 Merge "Cleanup internal status bar APIs." into jb-mr1-dev 2012-09-28 10:07:16 -07:00
Svetoslav Ganov
aed4b6f812 Inconsistent events on transition from gesture detection to touch exploration.
1. The problem is that we have a gesture detection timeout after which we transition
   to touch exploration state. This handles the case where the user is moving with
   too high a velocity while trying to touch explore. The delayed command that
   transitions from gesture detection state to touch exploration state was not firing
   events for the end of gesture detection and the beginning of touch exploration
   before doing its main work of transitioning to the touch exploring state.

bug:7233819

Change-Id: I5c4855231aa3826dadbee324e74a3c9e52c96cd9
2012-09-28 10:06:24 -07:00
Svetoslav Ganov
1f22b6a25d Accessibility services that do not accept events are mismanaged.
1. If an accessibility service does not specify that it handles any
   event types, it was never added to the list of services even though
   the system is bound to it. Since the service is not in the list
   of enabled services we never unbind it, hence it consumes
   resources without doing anything. This is also semantically
   incorrect because a service may not want to receive events while
   handling only gestures.

bug:5648345

Change-Id: Id478a4704cdeeb1729330f6ae4b8ff9e06320952
2012-09-28 09:45:15 -07:00
Svetoslav Ganov
7befb7deb2 Global gesture to toggle Accessibility system-wide.
1. This change adds a global gesture for enabling accessibility.
   To enable this gesture the user has to allow it from the
   accessibility settings or use the setup wizard to enable
   accessibility. When the global gesture is enabled the user
   can long press on power to bring up the global actions dialog
   and then hold with two fingers for a few seconds to enable
   accessibility. The appropriate feedback is also provided.

2. The global gesture writes directly into the settings for
   the current user if performed when the keyguard is not on. If
   the keyguard is on and the current user has no accessibility
   enabled, the gesture will temporarily enable accessibility
   for the current user, i.e. no settings are changed, to allow
   a blind user to log into his account. As soon as a user
   switch happens the new user's settings are inherited. If no
   user switch happens after temporarily enabling accessibility,
   the temporary changes are undone when the keyguard goes
   away and the device works as configured by the current user.

bug:6171929

3. The initialization code for the owner was not executed due
   to a redundant check, thus putting the accessibility layer in
   an inconsistent state which breaks pretty much everything.

bug:7240414
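
A rough sketch of the decision described in point 2 - persist the setting only when the keyguard is not showing, otherwise enable accessibility temporarily; all types and method names here are hypothetical:

    class GlobalGestureSketch {
        // Hypothetical collaborators standing in for the real keyguard and
        // accessibility state; none of these types exist in the framework.
        interface KeyguardState { boolean isShowing(); }
        interface AccessibilityState {
            boolean isAccessibilityEnabledInSettings(int userId);
            void writeAccessibilityEnabledSetting(int userId, boolean enabled);
            void setTemporaryAccessibilityEnabled(boolean enabled);
        }

        void onGlobalAccessibilityGesture(KeyguardState keyguard,
                AccessibilityState state, int currentUserId) {
            if (!keyguard.isShowing()) {
                // Normal case: persist the change for the current user.
                state.writeAccessibilityEnabledSetting(currentUserId, true);
            } else if (!state.isAccessibilityEnabledInSettings(currentUserId)) {
                // Keyguard is up and the user has accessibility off: enable it
                // in memory only so a blind user can log in; this is undone
                // when the keyguard goes away unless a user switch happens.
                state.setTemporaryAccessibilityEnabled(true);
            }
        }
    }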

Change-Id: Ie7d7aba80f5867b7f88d5893b848b53fb02a7537
2012-09-27 20:33:20 -07:00
Daniel Sandler
11cf178100 Cleanup internal status bar APIs.
IStatusBarService.collapseQuickSettings is gone;
collapseNotifications is now collapsePanels, which does what
collapse() used to do. Similarly,
IStatusBar.animateCollapseQuickSettings is now simply
IStatusBar.animateCollapse().

Bug: 7245229
Change-Id: Id157d2fdf34926d3c85ffa8b81c741a5359aede4
2012-09-27 14:03:08 -04:00
Svetoslav Ganov
c91fb5875b Merge "Adding a global accessibility action to open quick settings." into jb-mr1-dev 2012-09-25 16:47:06 -07:00
Svetoslav Ganov
e20a177d3f Adding a global accessibility action to open quick settings.
1. Added APIs for opening the quick settings to the StatusBarManagerService
   and the local StatusBarManager. The new APIs are protected by the old
   EXPAND_STATUS_BAR permission.
   Renamed the non-public expand* and collapse* APIs that expand the
   notifications to expandNotifications* / collapseNotifications* to
   better convey what they do, given that this change adds
   expandQuickSettings* and collapseQuickSettings*.
   Added a global action to the accessibility layer to expand the quick
   settings, which calls into the new status bar manager APIs.
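
A sketch of how such a status bar method could be guarded by the existing permission; the method name and body are illustrative, not the actual service code:

    import android.Manifest;
    import android.content.Context;

    class StatusBarManagerServiceSketch {
        private final Context mContext;

        StatusBarManagerServiceSketch(Context context) {
            mContext = context;
        }

        // Illustrative only; the real service also talks to the status bar process.
        public void expandQuickSettingsPanel() {
            mContext.enforceCallingOrSelfPermission(
                    Manifest.permission.EXPAND_STATUS_BAR,
                    "StatusBarManagerService");
            // ... ask the status bar to expand the quick settings panel ...
        }
    }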

bug:7030487

Change-Id: Ic7b46e1a132f1c0d71355f18e7c5a9a2424171c3
2012-09-25 16:07:59 -07:00
Svetoslav Ganov
1c9766e32a Merge "The active window for accessibility incorrectly tracked." into jb-mr1-dev 2012-09-25 14:46:58 -07:00
Svetoslav Ganov
a8afa694d6 Regression in screen introspection APIs due to the multi-user change.
1. The initial user was set to USER_NULL but some clients were registering
   before the user change callback happened. Since the initial user is
   the owner, the current user id now defaults to USER_OWNER.

2. The check for global clients and window connections was using the
   calling UID, but there are processes that run on a per-user basis
   as the system UID (Settings, for example). Now the check is stronger,
   comparing the caller's PID with that of the system process.

3. The code for finding the focused window id was not checking the
   global window token list in addition to that of the current user.

4. The code updating the active window id was calling out into the
   window manager with a lock held.
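
A minimal sketch of the stronger check from point 2, identifying the system process by PID rather than UID:

    import android.os.Binder;
    import android.os.Process;

    class CallerCheckSketch {
        /**
         * UID checks are not enough because some per-user processes (Settings,
         * for example) also run as the system UID; only the system process
         * itself shares our PID.
         */
        static boolean isCallerTheSystemProcess() {
            return Binder.getCallingPid() == Process.myPid();
        }
    }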

bug:7224670

Change-Id: I9f4b7ea67eb5598b30ee7d1b68a1d3ce0cf8cfb4
2012-09-25 13:59:37 -07:00
Jean-Baptiste Queru
fe3632bcbe Merge into jb-mr1-dev
Change-Id: Ib0523ded92e2fe4be6a32d092baa55b527229c07
2012-09-25 09:42:01 -07:00
Jean-Baptiste Queru
56d8cc1031 Merge into jb-mr1-dev
Change-Id: I6504b000be7e3b6e770af99c5a922fd1e9ec73de
2012-09-25 09:41:25 -07:00
Svetoslav Ganov
76c0dd4827 The active window for accessibility incorrectly tracked.
1. The active window for accessibility purposes is either the
   window the user is touching or the window that has input focus. We
   were using the touch exploration gesture end event to figure out
   when the user stops touching the screen so we can set the active
   window to the input-focused one. However, we do not send such a
   gesture end event if the user does not touch explore; if the user only
   taps we do not consider this touch exploring. We now have dedicated
   accessibility events for the first and last touch and this change uses
   them as a guide for when to update the active window.
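
A sketch of driving the active window from the dedicated touch interaction events; the constants are the real AccessibilityEvent ones, the surrounding class is illustrative:

    import android.view.accessibility.AccessibilityEvent;

    class ActiveWindowTrackerSketch {
        private int mActiveWindowId;

        void onAccessibilityEvent(AccessibilityEvent event, int inputFocusedWindowId) {
            switch (event.getEventType()) {
                case AccessibilityEvent.TYPE_TOUCH_INTERACTION_START:
                    // While the user is touching the screen the touched window
                    // is the active one.
                    mActiveWindowId = event.getWindowId();
                    break;
                case AccessibilityEvent.TYPE_TOUCH_INTERACTION_END:
                    // All fingers lifted: fall back to the input focused window.
                    mActiveWindowId = inputFocusedWindowId;
                    break;
            }
        }
    }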

bug:6523219

Change-Id: I6262c0c5f408b02dbaa127664e4b426935d7f81f
2012-09-24 19:16:20 -07:00
Svetoslav Ganov
03e7b88815 More than one finger at a time can trigger a system crash.
1. The crash was happening when two active pointers are performing a drag,
   there are some inactive pointers down, the main dragging pointer (we are
   merging the dragging pointers into one) goes up, and then an inactive
   pointer goes up: the explorer tries to inject an up event for the dragging
   pointer which is no longer in the event, resulting in a crash. Basically two
   problems: 1) inactive pointers were not ignored; 2) being left with only one
   active pointer should not only send the up event but also transition
   the explorer into touch exploring state.

bug:6874128

Change-Id: I341fc360ebc074fe3919d5ba3b98ee5cb08dd71e
2012-09-24 18:43:30 -07:00
Svetoslav Ganov
187f3f9490 Magnified frame not properly computed when keyguard goes away.
1. The keyguard force hides some windows when it is shown and as soon
   as the keyguard goes away these windows are made visible. However,
   the window transition for the keyguard moving away is reported
   before the force-hidden windows are shown, which makes the screen
   magnifier compute the magnified region from an incomplete list of
   windows of interest.

bug:7215285

Change-Id: I3abc4d97b7a74de8183ad20477dadf66c82da037
2012-09-24 16:34:26 -07:00
Svetoslav Ganov
657968a65f UI test automation service should not be auto reconnected.
1. Since adb is restarted on a user switch, it makes no sense to
   try to reconnect the UI automation service since it will
   be killed on a user switch.

   Also disabling touch exploration on UI automation service
   connect, since the service can explicitly put the device in this
   state if needed.

bug:6967373

Change-Id: I8cfde74f28f3f03d4ccf24746d43b8178ae2b5ef
2012-09-24 13:50:44 -07:00
Svetoslav Ganov
9371a0a0c0 Fixing a regression in the UI test automation.
bug:6967373

Change-Id: I28f01a2bfe44febcb1a519028dab82fb1da9753e
2012-09-21 18:20:42 -07:00
Svetoslav Ganov
58d37b55bd Multi-user support for the accessibility layer.
1. This change converts the accessibility manager service to
   maintain a state per user. When the user changes, the services
   for the user that is going away are disconnected, the local
   accessibility managers in the processes for this user are
   disabled, the state is swapped with the new user's one, and
   the new user state is refreshed.

   This change updates all calls into the system to use their
   user-specific versions when applicable, for example registering
   content observers and package monitors, calls into other system
   services, etc.

   There are some components that are shared across users such
   as UI created by the system process and the SystemUI package.
   Such components are managed as a global state shared across
   all users and are updated accordingly on a user switch. Since
   the SystemUI is running in a normal app process this change
   adds hidden APIs on the local window manager to allow the
   SystemUI to notify the accessibility layer that it will run
   across users.

   Calls to AccessibilityManager's isEnabled(), isTouchExplorationEnabled()
   and sendAccessibilityEvent() return false or are a no-op for a
   background user, since such a user should not send accessibility
   events and should not perform touch exploration.

   Update the internal accessibility tests due to changes in the
   AccessibilityManager.

   This change also fixes several issues that were encountered
   such as calling out of the accessibility manager service with a
   lock held.

   Removed some incorrect debugging code from the TouchExplorer
   that was leading to a system crash.
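
A condensed sketch of the per-user state bookkeeping described above; the real AccessibilityManagerService keeps far more per-user data, so names and fields here are illustrative:

    import android.util.SparseArray;

    class MultiUserStateSketch {
        static class UserState {
            final int userId;
            boolean accessibilityEnabled;
            boolean touchExplorationEnabled;
            UserState(int userId) { this.userId = userId; }
        }

        private final SparseArray<UserState> mUserStates = new SparseArray<UserState>();
        private int mCurrentUserId; // defaults to the owner before any switch

        UserState getCurrentUserState() {
            UserState state = mUserStates.get(mCurrentUserId);
            if (state == null) {
                state = new UserState(mCurrentUserId);
                mUserStates.put(mCurrentUserId, state);
            }
            return state;
        }

        void switchUser(int newUserId) {
            // Disconnect services of the user going away, swap the state, then
            // refresh (rebind services, re-read settings) for the new user.
            unbindAllServicesLocked(getCurrentUserState()); // hypothetical helper
            mCurrentUserId = newUserId;
            refreshStateLocked(getCurrentUserState());      // hypothetical helper
        }

        private void unbindAllServicesLocked(UserState state) { /* omitted */ }
        private void refreshStateLocked(UserState state) { /* omitted */ }
    }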

bug:6967373

Change-Id: I2cf32ffdee1d827a8197ae4ce717dc0ff798b259
2012-09-21 16:48:07 -07:00
Svetoslav Ganov
8b681cb881 Some formatting missed in the previous patch
Change-Id: I299090ca67b1d90cf75a46dc85b13970d32511e5
2012-09-14 15:20:45 -07:00
Svetoslav Ganov
77276b6085 Adding accessibility events for touch and gesture detection states.
1. Currently the system fires accessibility events to announce the
   start and end of a touch exploration gesture. However, such a
   gesture starts only after we have decided that the user is not
   performing a gesture, which is determined by measuring the speed
   of movement over a threshold distance. Announcing these states
   allows an accessibility service to provide some feedback to the
   user so he knows that he is touch exploring.

   This change adds event types for the first and last touches
   of the user. Note that the first touch does not coincide with
   the start of a touch exploration gesture since some time
   or distance has to pass before we know whether the user explores
   or gestures. However, it is very useful for an accessibility
   service to know when the user starts to interact with the
   touch screen so it can turn speech off, to name one
   compelling use case.

   This change also provides event types for the start and end
   of gesture detection. If the user has moved over the threshold
   distance with a speed greater than X, then the system detects gestures.
   It is useful for an accessibility service to know the beginning
   and end of gesture detection so it can provide a given type of
   feedback for such a gesture, say haptic feedback or a sound
   that differs from the one for touch exploration.

   The main benefit of announcing these new events is that an
   accessibility service can provide feedback for each touch
   state allowing the user to always know what he is doing.
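
The states described above are announced through new AccessibilityEvent types; a minimal sketch of firing one of them (the placement of this call is illustrative):

    import android.content.Context;
    import android.view.accessibility.AccessibilityEvent;
    import android.view.accessibility.AccessibilityManager;

    class TouchStateEventsSketch {
        static void sendTouchInteractionStart(Context context) {
            AccessibilityManager manager = (AccessibilityManager)
                    context.getSystemService(Context.ACCESSIBILITY_SERVICE);
            if (!manager.isEnabled()) {
                return;
            }
            // The other states are announced the same way with
            // TYPE_TOUCH_INTERACTION_END, TYPE_GESTURE_DETECTION_START and
            // TYPE_GESTURE_DETECTION_END.
            manager.sendAccessibilityEvent(
                    AccessibilityEvent.obtain(AccessibilityEvent.TYPE_TOUCH_INTERACTION_START));
        }
    }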

bug:7166935

Change-Id: I26270d774cc059cb921d6a4254bc0aab0530c1dd
2012-09-14 15:12:54 -07:00
Svetoslav Ganov
19f4a29fa4 Enforcing BIND_ACCESSIBILITY_SERVICE for connecting to an accessibility service.
1. This change enforces an accessibility service to require the system
   defined BIND_ACCESSIBILITY_SERVICE permission.
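
A sketch of the kind of validation this enforcement implies - only binding to services whose declaration requires the permission; the helper shown is illustrative:

    import android.Manifest;
    import android.content.pm.ServiceInfo;

    class BindPermissionCheckSketch {
        /** Returns true if the service may be bound as an accessibility service. */
        static boolean hasRequiredBindPermission(ServiceInfo serviceInfo) {
            return Manifest.permission.BIND_ACCESSIBILITY_SERVICE
                    .equals(serviceInfo.permission);
        }
    }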

bug:6507771

Change-Id: If5e16bb4fa97891be0ccbb35e343773712e33b98
2012-09-12 20:26:30 -07:00
Svetoslav Ganov
3e1476a697 Adding a scaling threshold in ScreenMagnifier
Change-Id: I1fdd7c93de571a61d88d7386c5c2a423a6b83fb9
2012-09-11 18:15:17 -07:00
Svetoslav Ganov
d420e3ac94 Refactoring the scale and pan detection in the ScreenMagnifier.
Change-Id: I8560f53f88ef0c9244e2b48d40119574cacb544f
2012-09-11 17:48:28 -07:00
Svetoslav Ganov
9b4125e435 Screen magnifier should handle window rebuilds correctly.
1. The way of computing the magnified region was simplistic and
   incorrect: it ignored window layering, resulting in broken
   behavior. For example, if the IME is up, everything else
   is magnified and the IME is not. Now the keyguard appears and covers
   the IME, but the magnified region does not expand although it should,
   since the keyguard completely covers the non-magnified IME window.

bug:7138937

Change-Id: I21414635aefab700ce75d40f3e913c1472cba202
2012-09-11 15:50:58 -07:00
Svetoslav Ganov
36e614c110 Screen magnification should disengage on screen off.
1. When the screen goes off the user will be in a completely
   different context upon turning the screen on. Therefore,
   if magnification auto update is enabled magnification
   will be disengaged on screen off.
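
A minimal sketch of disengaging magnification on screen off via a broadcast receiver; the controller interface is hypothetical:

    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.content.IntentFilter;

    class ScreenOffMagnificationSketch {
        interface MagnificationController { void reset(boolean animate); } // hypothetical

        static void register(Context context, final MagnificationController controller) {
            BroadcastReceiver receiver = new BroadcastReceiver() {
                @Override public void onReceive(Context ctx, Intent intent) {
                    // The user will be in a different context when the screen
                    // comes back on, so zoom out now (no animation needed while
                    // the screen is off).
                    controller.reset(false);
                }
            };
            context.registerReceiver(receiver, new IntentFilter(Intent.ACTION_SCREEN_OFF));
        }
    }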

bug:7139088

Change-Id: I790cfa5b3cf31d34e95fc9548e6246a84096c37b
2012-09-10 18:16:05 -07:00
Svetoslav Ganov
86fe9e14f1 Reducing the click delay while screen magnification is enabled.
1. If screen magnification is enabled the user has to triple tap
   and lift or triple tap and hold to engage magnification. Hence,
   we delay the touch events until we are sure that it is no longer
   possible for the user to perform a multi-tap to engage
   magnification. While such a delay is unavoidable it feels a
   bit longer than it should be. This change reduces the delay
   between taps to be considered a multi-tap, essentially making
   the click delay shorter.

bug:7139918

Change-Id: I2100945171fff99600766193f0effdaef1f1db8f
2012-09-10 17:35:35 -07:00
Svetoslav Ganov
662538957f Scaling in viewport moving state locks into a magnified state.
1. If the user changes the magnification level while moving the
   viewport, the magnification gets locked. The gesture handler has
   to put the device back into viewport moving state if that was
   the last state.

bug:7139363

Change-Id: I24992b973bb15624580114353b004efdb35c2faa
2012-09-10 16:41:07 -07:00
Svetoslav Ganov
6d04712d15 Allow simultaneous scale and pan in magnified state.
1. Before, in the magnified state the user was able to only scale or
   pan. Based on user input, this change allows performing a pan
   or a scale or both. If the user scales more than a threshold
   we perform a scale, and independently of that, if the
   user pans more than a threshold we perform a pan.
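
A sketch of the two independent thresholds; the threshold values and apply* helpers are made up for illustration:

    class ScaleAndPanSketch {
        private static final float MIN_SCALE_DELTA = 0.05f;   // assumed threshold
        private static final float MIN_PAN_DISTANCE_PX = 20f; // assumed threshold

        void onGesture(float scaleFactor, float panDeltaX, float panDeltaY) {
            // Scale and pan are checked independently, so the user can do
            // either one or both in the same gesture.
            if (Math.abs(scaleFactor - 1.0f) > MIN_SCALE_DELTA) {
                applyScale(scaleFactor);
            }
            if (Math.hypot(panDeltaX, panDeltaY) > MIN_PAN_DISTANCE_PX) {
                applyPan(panDeltaX, panDeltaY);
            }
        }

        private void applyScale(float scaleFactor) { /* omitted */ }
        private void applyPan(float deltaX, float deltaY) { /* omitted */ }
    }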

bug:7138928

Change-Id: Ic1511500ba3369091dcfd070669d3e4f0286fbe5
2012-09-10 15:57:13 -07:00
Svetoslav Ganov
0c381504a8 Improve scaling vs pan in screen magnifier.
1. Due to frequent changes in the behavior of ScaleGestureDetector
   this patch rolls in a gesture detector used for changing the
   screen magnification level. It has an improved algorithm which
   uses the diameter of the minimal circle around the pointers as the span,
   the center of this circle as the focal point, and the average slope
   of the lines from each pointer to the center to determine the
   angle of the diameter used when computing the span x and y.
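
A rough sketch of the span and focal point computation, approximating the minimal enclosing circle by the pointer centroid and the farthest pointer (the real detector is more precise):

    import android.view.MotionEvent;

    class MagnificationSpanSketch {
        /** Returns { focalX, focalY, span } for the current pointers. */
        static float[] computeFocalPointAndSpan(MotionEvent event) {
            final int pointerCount = event.getPointerCount();
            float centerX = 0f;
            float centerY = 0f;
            for (int i = 0; i < pointerCount; i++) {
                centerX += event.getX(i);
                centerY += event.getY(i);
            }
            centerX /= pointerCount;
            centerY /= pointerCount;

            // Approximate the enclosing circle's radius by the farthest pointer
            // from the center; the span is the diameter of that circle.
            float radius = 0f;
            for (int i = 0; i < pointerCount; i++) {
                radius = Math.max(radius,
                        (float) Math.hypot(event.getX(i) - centerX, event.getY(i) - centerY));
            }
            return new float[] { centerX, centerY, 2f * radius };
        }
    }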

Change-Id: I5cee8dba84032a0702016b8f9632f78139024bbe
2012-09-10 10:49:04 -07:00
Svetoslav Ganov
add52a975a Viewport should zoom out when screen magnification is disabled.
1. If screen magnification is disabled while the screen is in a
   magnified state, we have to zoom out since otherwise the user
   is stuck in a magnified state without the ability to pan, zoom, or
   toggle magnification, which renders the device useless.

bug:7131030

Change-Id: I8f3339f31310448ec8742f3101c1fdc61a6a5f83
2012-09-09 11:13:58 -07:00
Svetoslav Ganov
5b1720e11d Merge "Viewport should zoom out when screen magnification is disabled." into jb-mr1-dev 2012-09-09 10:50:34 -07:00
Svetoslav Ganov
6d0df874ce Viewport should zoom out when screen magnification is disabled.
1. If screen magnification is disabled while the screen is in a
   magnified state, we have to zoom out since otherwise the user
   is stuck in a magnified state without the ability to pan, zoom, or
   toggle magnification, which renders the device useless.

bug:7131030

Change-Id: Ia620954fbd594e7cd470e43b89d9ed04c0397c3c
2012-09-09 10:46:10 -07:00
Svetoslav Ganov
2cee686498 Fixing an off-by-one error in the ScreenMagnifier.
Change-Id: Ia0ccfb6b354b7a18633e7cf26647c6436ebf5c08
2012-09-07 17:33:09 -07:00
Svetoslav Ganov
1cf70bbf96 Screen magnification - feature - framework.
This change is the initial check in of the screen magnification
feature. This feature enables magnification of the screen via
global gestures (assuming it has been enabled from settings)
to allow a low vision user to efficiently use an Android device.

Interaction model:

1. Triple tap toggles permanent screen magnification which is magnifying
   the area around the location of the triple tap. One can think of the
   location of the triple tap as the center of the magnified viewport.
   For example, a triple tap when not magnified would magnify the screen
   and leave it in a magnified state. A triple tap when magnified would
   clear magnification and leave the screen in a not magnified state.

2. Triple tap and hold would magnify the screen if not magnified and enable
   viewport dragging mode until the finger goes up. One can think of this
   mode as a way to move the magnified viewport since the area around the
   moving finger will be magnified to fit the screen. For example, if the
   screen was not magnified and the user triple taps and holds the screen
   would magnify and the viewport will follow the user's finger. When the
   finger goes up the screen will zoom out. If the same user interaction
   is performed when the screen is magnified, the viewport movement will
   be the same but when the finger goes up the screen will stay magnified.
   In other words, the initial magnified state is sticky.

3. Pinching with any number of additional fingers when viewport dragging
   is enabled, i.e. the user triple tapped and holds, would adjust the
   magnification scale which will become the current default magnification
   scale. The next time the user magnifies the same magnification scale
   would be used.

4. When in a permanent magnified state the user can use two or more fingers
   to pan the viewport. Note that in this mode the content is panned as
   opposed to the viewport dragging mode in which the viewport is moved.

5. When in a permanent magnified state the user can use three or more
   fingers to change the magnification scale which will become the current
   default magnification scale. The next time the user magnifies the same
   magnification scale would be used.

6. The magnification scale will be persisted in settings and in the cloud.

Note: Since two fingers are used to pan the content in a permanently magnified
   state no other two finger gestures in touch exploration or applications
   will work unless the user zooms out to the normal state where all gestures
   work as expected. This is an intentional tradeoff to allow efficient
   panning since in a permanently magnified state this would be the dominant
   action to be performed.

Design:

1. The window manager exposes APIs for setting accessibility transformation
   which is a scale and offsets for X and Y axis. The window manager queries
   the window policy for which windows will not be magnified. For example,
   the IME windows and the navigation bar are not magnified including windows
   that are attached to them.

2. The accessibility features such as screen magnification and touch
   exploration are now implemented as a sequence of transformations on the
   event stream. The accessibility manager service may request each
   of these features or both. The behavior of the features is not changed
   based on the fact that another one is enabled.

3. The screen magnifier keeps a viewport of the content that is magnified
   which is surrounded by a glow in a magnified state. Interactions outside
   of the viewport are delegated directly to the application without
   interpretation. For example, a triple tap on the letter 'a' of the IME
   would type three letters instead of toggling magnified state. The viewport
   is updated on screen rotation and on window transitions. For example,
   when the IME pops up the viewport shrinks.

4. The glow around the viewport is implemented as a special type of window
   that does not take input focus, cannot be touched, is laid out in the
   screen coordinates with width and height matching those of the screen.
   When the magnified region changes, the root view of the window draws the
   highlight but the size of the window does not change - unless a rotation
   happens. All changes in the viewport size or showing or hiding it are
   animated.

5. The viewport is encapsulated in a class that knows how to show,
   hide, and resize the viewport - potentially animating that.
   This class uses the new animation framework for animations.

6. The magnification is handled by a magnification controller that
   keeps track of the current transformation applied to the screen
   content and the desired one. If the two are not the same, it is the
   responsibility of the magnification controller to reconcile them,
   potentially animating the transition from one to the other.

7. A display content observer watches for window transitions, screen
   rotations, and requests to make a rectangle on the screen visible. This
   class is responsible for handling interesting state changes such
   as changing the viewport bounds on IME pop up or screen rotation,
   panning the content to make a requested rectangle visible on the
   screen, etc.

8. To implement viewport updates the window manager was updated with APIs
   to watch for window transitions and for requests to make a rectangle
   visible on the screen. These APIs are protected by a signature-level
   permission. Also, a parcelable and poolable window info class has been
   added with APIs for getting the window info given the window token.
   This enables getting some useful information about a window. These
   APIs are also signature protected.
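
A tiny sketch of the transformation from design point 1 - a uniform scale plus X/Y offsets mapping between screen and content coordinates; field names and the mapping direction are illustrative:

    class MagnificationSpecSketch {
        float scale = 2.0f;   // example magnification level
        float offsetX = 0f;   // example offsets, chosen so the viewport stays on screen
        float offsetY = 0f;

        /** Maps a point on the physical screen to the corresponding content point. */
        float[] screenToContent(float screenX, float screenY) {
            return new float[] { (screenX - offsetX) / scale, (screenY - offsetY) / scale };
        }

        /** Maps a content point to where it appears on the magnified screen. */
        float[] contentToScreen(float contentX, float contentY) {
            return new float[] { contentX * scale + offsetX, contentY * scale + offsetY };
        }
    }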

bug:6795382

Change-Id: Iec93da8bf6376beebbd4f5167ab7723dc7d9bd00
2012-09-06 18:56:17 -07:00
Svetoslav Ganov
c9c9a48e7b Removing a workaround for incorrect window position on window move.
1. The window manager was not notifying a window when the latter
   has been moved. This was causing incorrect coordinates of the
   nodes reported to accessibility services. To work around that
   we have carried the correct window location when making a
   call from the accessibility layer into a window. Now the
   window manager notifies the window when it is moved and the
   workaround is no longer needed. This change takes it out.

2. The left and right in the attach info were not updated properly
   after a report that the window has moved.

3. The accessibility manager service was calling directly methods
   on the window manager service without going through the interface
   of the latter. This leads to unnecessary coupling and in the
   long run increases system complexity and reduces maintainability.

bug:6623031

Change-Id: Iacb734b1bf337a47fad02c827ece45bb2f53a79d
2012-07-16 08:46:11 -07:00
Svetoslav Ganov
a43ef3d1c5 Gestures don't work when turning on Explore by Touch programmatically.
1. There was a misspelled duplicate member in the accessibility service
   class which was causing inconsistent behavior because one field was
   updated and another checked.

2. When the set of services that can put the device in explore by touch
   mode changes we were disconnecting and reconnecting all services
   and this is not correct. Now only the state of explore by touch is
   updated appropriately.

bug:6798860

Change-Id: Ib3c119cef8e71c3458d56e4ce6fbde2c2f750dcd
2012-07-12 13:07:59 -07:00
Casey Burkhardt
ea6fbc0981 Fixing gesture recognition configuration in TouchExplorer.
This fix adjusts the sensitivity of the gesture recognizer by
eliminating gesture rotation in the recognition process.

Bug:6697119
Change-Id: Ic767f513c05210b27e583338c4f0adcaa1c4c625
2012-06-19 16:31:54 -07:00
Svetoslav Ganov
5d043ce8cc Active window not updated properly.
1. Accessibility allows querying only of the active window.
   The active window is the one that has input focus or the
   one the user is touching. Hence, if the user is touching
   a window that does not have input focus this window is
   the active one and as soon as the user stops touching
   it the active window becomes the one that has input
   focus. Currently the active window is not updated properly
   when the user lifts his finger. This leads to a scenario
   of traversal actions sent to the wrong window and the user
   being stuck.

   The reason is that the last touch explored event that is
   used to determine where to click is cleared when accessibility
   focus moves but this event is also used to determine when to
   send the hover exit and touch exploration gesture end events.
   The problem is that the last hover event is cleared before
   it is used for sending the right exit events, thus the event
   stream is inconsistent and the accessibility manager service
   relies on this stream to update the active window. Now we
   are keeping separate copies of the last touch event - one
   for clicking and one for determining which events to
   inject, to ensure a consistent stream.

bug:6666041

Change-Id: Ie9961e562a42ef8a9463afacfff2246adcb66303
2012-06-14 10:40:12 -07:00
Svetoslav Ganov
95068e5d1b If a gesture cannot be detected the device should transition to touch exploration state.
1. We decide whether the user is performing a gesture or an exploration based
   on the gesture velocity. If we are detecting a gesture we do the recognition
   at the gesture end, which is when the finger goes up. This is better than
   having a mode toggle gesture for switching between exploring and gesture
   detection. However, it is possible that the user really wanted to perform an
   exploration but was moving too fast, and unless he lifts his finger the device
   stays in gesture detection mode. This is frustrating since the user has no
   feedback and assumes exploration does not work.

   We want to perform gesture detection only for a maximal time frame, and if the
   user has not lifted his finger by then we transition into the touch exploring state.

bug:6663173

Change-Id: I954ff937cca902e31b51325d1e1dfce84d239624
2012-06-13 21:14:16 -07:00