This column doesn't actually exist in the corresponding table, and
never has, so the presence of this definition is confusing.
Change-Id: I199f9a8effbdc9f45d51060830e3ad83675a0dff
Add a method on ViewGroup to determine whether it supports scrolling.
This allows us to show the pressed feedback immediately in many cases,
improving responsiveness of buttons, etc.
This patch also lengthens the timeout in order to reduce flashes
when the user is scrolling.
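The commit does not name the new method; a minimal sketch, assuming it is the hook that later shipped as ViewGroup.shouldDelayChildPressedState(), might look like this for a custom container that never scrolls (BadgeRow is hypothetical):

import android.content.Context;
import android.view.ViewGroup;

// Hypothetical non-scrolling container: reporting that it never scrolls
// lets the framework show the pressed state of its children immediately.
public class BadgeRow extends ViewGroup {
    public BadgeRow(Context context) {
        super(context);
    }

    @Override
    public boolean shouldDelayChildPressedState() {
        return false;  // this container does not scroll
    }

    @Override
    protected void onLayout(boolean changed, int l, int t, int r, int b) {
        // Layout logic omitted for brevity.
    }
}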
Change-Id: Ieb91ae7a1f8e8f7e87448f2a730381a53947996f
Apply (or extend) the theme Theme.Holo.SplitActionBarWhenNarrow or
Theme.Holo.Light.SplitActionBarWhenNarrow to enable splitting the
action bar across both the top and bottom of the screen. This places
the action menu along the bottom, leaving more room at the top for
titles, navigation, and custom views, and more room along the bottom
for menu items.
TODO: Refine layout of the action menu when placed at the bottom of
the screen. Make action modes split as well.
Change-Id: I92c91f99c533f26aecf6b828ed041386c4f16922
Bug #4343984
TextureView can be used to render media content (video, OpenGL,
RenderScript) inside a View.
The key difference from SurfaceView is that TextureView does
not create a new Surface. This gives the ability to seamlessly
transform, animate, fade, etc. a TextureView, which was hard
if not impossible to do with a SurfaceView.
A TextureView also interacts perfectly with ScrollView,
ListView, etc. It allows applications to embed media content
in a much more flexible way than before.
For instance, to render the camera preview at 50% opacity,
all you need to do is the following:
mTextureView.setAlpha(0.5f);
Camera c = Camera.open();
try {
    c.setPreviewTexture(mTextureView.getSurfaceTexture());
} catch (IOException e) {
    // Handle the error (e.g. release the camera).
}
c.startPreview();
TextureView uses a SurfaceTexture to get the job done. More
APIs are required to make it easy to create OpenGL contexts
for a TextureView. It can currently be done with a bit of
JNI code.
Change-Id: Iaa7953097ab5beb8437bcbbfa03b2df5b7f80cd7
Currently the counterpart to FrameLayout.setMeasureAllChildren() is
FrameLayout.getConsiderGoneChildrenWhenMeasuring(). This change
deprecates FrameLayout.getConsiderGoneChildrenWhenMeasuring() in favor
of a new FrameLayout.getMeasureAllChildren() method.
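A minimal sketch of the renamed accessor pair, assuming the usual FrameLayout API:

import android.content.Context;
import android.widget.FrameLayout;

class MeasureAllExample {
    // Sketch: the new getter mirrors the setter's name and replaces
    // getConsiderGoneChildrenWhenMeasuring().
    static boolean configure(Context context) {
        FrameLayout frame = new FrameLayout(context);
        frame.setMeasureAllChildren(true);
        return frame.getMeasureAllChildren();
    }
}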
Change-Id: Id26ec8b3966bd1553d5fd708ad76049069fcaeef
This is a new API for writing text-to-speech engines.
The existing API for apps that use TTS remains the same,
with some minor additions.
Change-Id: Id577db449ae0e5baec40621d4a08387dbd755342
Drop tabs to a second row at < w480dp
Make 9-patches for the contextual action bar's "done" button thinner
Add a "disable home" display option to the action bar to turn off
focus and touch feedback when tapping home would do nothing
Change-Id: Ib2eedf311655f02055357321e2e9ad5b9037fed1
1. Added an input filter that interprets touch screen motion
events to perform accessibility exploration. One finger explores.
Tapping within a given time and distance slop of the last explored
location performs click and long press, respectively. Two fingers
that are close together and moving in the same direction drag.
Multiple fingers, two fingers moving in different directions, or
two fingers too far apart are delegated to the view hierarchy.
Non-moving fingers that accidentally grabbed the device by the
screen are ignored.
2. Added accessibility events for hover enter, hover exit, and
touch exploration gesture start and end. Accessibility hover
events are fired by the hover pipeline. An accessibility event is
dispatched up the view tree and the topmost view fires it.
Thus predecessors can augment the fired event. An accessibility
event has several records, and a predecessor can optionally
modify, delete, or add records to the event.
3. Added onPopulateAccessibilityEvent and refactored the existing
accessibility code to use it (see the sketch after this list).
4. Added API for querying the currently enabled accessibility services
by feedback type.
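For illustration, a minimal sketch of the onPopulateAccessibilityEvent hook in a custom view (ChartView and its announcement text are hypothetical):

import android.content.Context;
import android.view.View;
import android.view.accessibility.AccessibilityEvent;

// Hypothetical custom view contributing text to an accessibility event.
public class ChartView extends View {
    public ChartView(Context context) {
        super(context);
    }

    @Override
    public void onPopulateAccessibilityEvent(AccessibilityEvent event) {
        super.onPopulateAccessibilityEvent(event);
        // Contribute text for accessibility services to announce;
        // predecessors dispatching the event may add further records.
        event.getText().add("Chart with 3 data points");
    }
}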
Change-Id: Iea2258c07ffae9491071825d966dc453b07e5134
This reverts commit ac84d3ba81f08036308b17e1ab919e43987a3df5.
There seems to be a problem with this API change. Reverting for now to
fix the build.
Change-Id: Ifa7426b080651b59afbcec2d3ede09a3ec49644c
1. Added an input filter that interprets touch screen motion
events to perform accessibility exploration. One finger explores.
Tapping within a given time and distance slop of the last explored
location performs click and long press, respectively. Two fingers
that are close together and moving in the same direction drag.
Multiple fingers, two fingers moving in different directions, or
two fingers too far apart are delegated to the view hierarchy.
Non-moving fingers that accidentally grabbed the device by the
screen are ignored.
2. Added accessibility events for hover enter, hover exit, and
touch exploration gesture start and end. Accessibility hover
events are fired by the hover pipeline. An accessibility event is
dispatched up the view tree and the topmost view fires it.
Thus predecessors can augment the fired event. An accessibility
event has several records, and a predecessor can optionally
modify, delete, or add records to the event.
3. Added onPopulateAccessibilityEvent and refactored the existing
accessibility code to use it.
4. Added API for querying the currently enabled accessibility services
by feedback type (see the sketch after this list).
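A minimal sketch of the query by feedback type, assuming the AccessibilityManager API that later shipped:

import java.util.List;

import android.accessibilityservice.AccessibilityServiceInfo;
import android.content.Context;
import android.view.accessibility.AccessibilityManager;

class EnabledServicesExample {
    // Sketch: list the enabled services that provide spoken feedback.
    static List<AccessibilityServiceInfo> spokenFeedbackServices(Context context) {
        AccessibilityManager manager = (AccessibilityManager)
                context.getSystemService(Context.ACCESSIBILITY_SERVICE);
        return manager.getEnabledAccessibilityServiceList(
                AccessibilityServiceInfo.FEEDBACK_SPOKEN);
    }
}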
Change-Id: Iec03c6c3fe298de3f14cb6efdbb9b198cd531a0c
Applications now get the display size from the window manager. No
behavior should change yet; this is just prep for some real
changes.
Change-Id: I2958a6660895c1cba2b670509600014e55ee9273
In the old world, MenuBuilder and MenuItemImpl were responsible for
generating views for any presentation of a menu. MenuBuilder needed to
know any types and resources involved, and the implied caching
semantics did not work well for menus presented within AdapterViews.
In the new world, the MenuPresenter interface takes over the
responsibility of generating views or adapters for menu
items. MenuBuilder/MenuItemImpl still provide extra metadata tracking
used by these presenters. Multiple presenters may be active for a
single menu at a time. All of this remains an internal framework
implementation detail.
BaseMenuPresenter provides a simple base for presenters that treats
the host MenuView more like an AdapterView. This allows for less
rebuilding of views when items are added/removed.
Callbacks have been restructured. Calls that relate to the menu itself
are still handled by MenuBuilder.Callback, but calls related to a
specific presentation of a menu are handled by MenuPresenter.Callback
objects attached to a MenuPresenter.
Also add API to programmatically set divider options for LinearLayout,
and a hidden API so that ActionBarView can have finer-grained control
over divider placement.
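A minimal sketch of the new public LinearLayout divider options (the drawable and sizes are illustrative):

import android.graphics.Color;
import android.graphics.drawable.ShapeDrawable;
import android.graphics.drawable.shapes.RectShape;
import android.widget.LinearLayout;

class DividerExample {
    // Sketch: draw a 2px divider between children and after the last child.
    static void applyDividers(LinearLayout layout) {
        ShapeDrawable divider = new ShapeDrawable(new RectShape());
        divider.getPaint().setColor(Color.LTGRAY);
        divider.setIntrinsicWidth(2);
        divider.setIntrinsicHeight(2);
        layout.setDividerDrawable(divider);
        layout.setShowDividers(LinearLayout.SHOW_DIVIDER_MIDDLE
                | LinearLayout.SHOW_DIVIDER_END);
    }
}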
Change-Id: I2265b86a084279822908021aec20dfbadc1bb56b
The FragmentManager/ListFragment impl was restoring the list
state before setting its adapter. This caused the list view to
lose the state, since it gets cleared as part of setting the
adapter. Now the fragment manager waits to restore the view
hierarchy state until after onActivityCreated() has run, at which
point the adapter has been set.
It would be nice to make list view less fragile in this regard,
but that is for a different change.
Change-Id: I032d6fe0fefc0dabfae95d44152146029ef5db8e
You can remove sub-tasks inside of a task, or an entire task.
When removing an entire task, you can have its process killed
as well.
When the process is killed, any running services get an
onTaskRemoved() callback so they can do cleanup before the
process dies (and the service is possibly restarted).
Or they can set a new android:stopWithTask attribute to just
have the service automatically (cleanly) stopped at this point.
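A minimal sketch of a service using the new callback (SyncService and its cleanup are hypothetical); alternatively, the new android:stopWithTask attribute can be set on the service's manifest entry:

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

// Hypothetical service that flushes state when its task is removed.
public class SyncService extends Service {
    @Override
    public IBinder onBind(Intent intent) {
        return null;  // not a bound service
    }

    @Override
    public void onTaskRemoved(Intent rootIntent) {
        flushPendingWork();  // clean up before the process is killed
        super.onTaskRemoved(rootIntent);
    }

    private void flushPendingWork() {
        // Hypothetical: persist in-flight state here.
    }
}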
Change-Id: I1891bc2da006fa53b99c52f9040f1145650e6808