The voice interaction UI can now switch from a pure overlay at the
top of the screen to an interactive mode where it draws at the bottom
and pushes its target activity up, like an IME.
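
Roughly, interactive mode looks like this from the session side. This
is a minimal sketch against the public VoiceInteractionSession API;
the layout and view ids are placeholders and the actual plumbing in
this change differs in detail:

    import android.content.Context;
    import android.service.voice.VoiceInteractionSession;
    import android.view.View;

    public class MainInteractionSession extends VoiceInteractionSession {
        private View mContent;      // whole window content
        private View mBottomPanel;  // the part actually drawn at the bottom

        public MainInteractionSession(Context context) {
            super(context);
        }

        @Override
        public View onCreateContentView() {
            // R.layout.voice_panel / R.id.bottom_content are placeholders.
            mContent = getLayoutInflater().inflate(R.layout.voice_panel, null);
            mBottomPanel = mContent.findViewById(R.id.bottom_content);
            return mContent;
        }

        @Override
        public void onComputeInsets(Insets outInsets) {
            super.onComputeInsets(outInsets);
            if (mBottomPanel != null) {
                // Claim only the area from the top of the bottom panel down,
                // so the target activity can be resized above it, much like
                // an IME pushes the app window up.
                outInsets.contentInsets.top = mBottomPanel.getTop();
                outInsets.touchableInsets = Insets.TOUCHABLE_INSETS_CONTENT;
            }
        }
    }
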
Add a mechanism to deliver assist data to the voice interaction UI.
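
On the session side the data arrives through a callback; the sketch
below uses the onHandleAssist() signature from the public SDK, which
this change may not match exactly:

    import android.app.assist.AssistContent;
    import android.app.assist.AssistStructure;
    import android.content.Context;
    import android.os.Bundle;
    import android.service.voice.VoiceInteractionSession;

    public class AssistReceivingSession extends VoiceInteractionSession {
        public AssistReceivingSession(Context context) {
            super(context);
        }

        @Override
        public void onHandleAssist(Bundle data, AssistStructure structure,
                AssistContent content) {
            // structure describes the view hierarchy of the activity the
            // user was in when assist was invoked; it can be null if the
            // app opted out or nothing was captured.
            if (structure != null) {
                onAssistStructure(structure);
            }
        }

        // Hook for the session UI; see the visualization sketch below.
        protected void onAssistStructure(AssistStructure structure) {
        }
    }
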
Add a basic visualization of the assist data, outlining where text
appears on the screen.
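
One way to build that outline is to walk the AssistStructure and
collect a rect for every view node that carries text. Sketch only;
the class below is illustrative, not part of the SDK:

    import android.app.assist.AssistStructure;
    import android.app.assist.AssistStructure.ViewNode;
    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import android.graphics.Rect;
    import android.util.AttributeSet;
    import android.view.View;
    import java.util.ArrayList;

    public class AssistVisualizer extends View {
        private final ArrayList<Rect> mTextRects = new ArrayList<>();
        private final Paint mPaint = new Paint();

        public AssistVisualizer(Context context, AttributeSet attrs) {
            super(context, attrs);
            mPaint.setStyle(Paint.Style.STROKE);
            mPaint.setStrokeWidth(2);
            mPaint.setColor(Color.RED);
        }

        public void setAssistStructure(AssistStructure structure) {
            mTextRects.clear();
            final int windows = structure.getWindowNodeCount();
            for (int i = 0; i < windows; i++) {
                AssistStructure.WindowNode window = structure.getWindowNodeAt(i);
                // Window coordinates are in screen space; node coordinates
                // are relative to their parent, so accumulate as we descend.
                collectTextRects(window.getRootViewNode(),
                        window.getLeft(), window.getTop());
            }
            invalidate();
        }

        private void collectTextRects(ViewNode node, int parentLeft,
                int parentTop) {
            final int left = parentLeft + node.getLeft();
            final int top = parentTop + node.getTop();
            if (node.getText() != null) {
                mTextRects.add(new Rect(left, top,
                        left + node.getWidth(), top + node.getHeight()));
            }
            final int count = node.getChildCount();
            for (int i = 0; i < count; i++) {
                collectTextRects(node.getChildAt(i),
                        left - node.getScrollX(), top - node.getScrollY());
            }
        }

        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);
            // Assumes this view fills the screen, so screen coordinates map
            // directly onto the canvas.
            for (int i = 0; i < mTextRects.size(); i++) {
                canvas.drawRect(mTextRects.get(i), mPaint);
            }
        }
    }
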
Add a test ACTION_ASSIST handler that propagates the assist data it
receives to the voice interaction session, so you can see what kind
of data different apps provide.
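
The test handler is just an activity registered for
android.intent.action.ASSIST that pulls the assist extras out of its
launch intent and hands them off. The AssistDataSink hand-off below
is a hypothetical in-process hook, not an SDK API; the real change
may route the data differently:

    import android.app.Activity;
    import android.content.Intent;
    import android.os.Bundle;

    public class TestAssistActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            Intent intent = getIntent();
            // The system attaches the foreground app's assist context to
            // the ACTION_ASSIST intent it sends to the current assist app.
            Bundle assistContext =
                    intent.getBundleExtra(Intent.EXTRA_ASSIST_CONTEXT);
            String assistPackage =
                    intent.getStringExtra(Intent.EXTRA_ASSIST_PACKAGE);
            if (assistContext != null) {
                // Hypothetical hook: forward the raw data to the voice
                // interaction session so it can display what this app sent.
                AssistDataSink.deliver(assistPackage, assistContext);
            }
            finish();  // nothing to show here; the session UI does the work
        }
    }
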
Change-Id: I18312fe1601d7926d1fb96a817638d60f6263771