Changes:
- Move the declaration of kClassPathName to the top of the file so it can be used
in more than one place, instead of repeating the literal "android/media/AudioSystem".
- Make private methods static.
- Add comment to stream_type, audio_mode, force_use types that they must match
values in AudioSystem.java.
- Add comment about unused types mp3_sub_format and vorbis_sub_format.
- Fix typos.
- Use @ in javadoc comments.
- Delete the dead APIs setMode, getMode, setRouting, and getRouting in AudioSystem.java
(they are all hidden, deprecated, and unused by the rest of the framework).
- Delete unused private log method.
- Fix pathname for android_media_AudioSystem.cpp.
- Improve code formatting for space after == and !=.
- Add logging of delta for changing audio policy manager ref count.
Change-Id: I18037c7beb8ab76d1fda08c11e589f6e591d36e1
Use the mime type to determine whether we should do upfront buffering at the start of
playback, and skip it for audio streams so that playback starts fairly instantly.
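For illustration, the check boils down to something like the following sketch (the
function name is made up here, not the actual player code):

    #include <strings.h>

    // Buffer ahead for video containers; pure audio streams start right away.
    static bool shouldBufferUpfront(const char *mime) {
        // Assumption: audio-only streams report an "audio/..." mime type.
        return strncasecmp(mime, "audio/", 6) != 0;
    }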
Change-Id: If21e36d1b024f0e5c723911bceadaa2e0307ab42
related-to-bug: 4090916
Methods getNumberOfVideoBuffers() and getVideoBuffer() as well as struct
image_rect_struct are no longer used (instead, the necessary information is
passed through ANativeWindow.)
Change-Id: If4b11446fc9ccbde1f6b45bc70c0d0b8e54376eb
Signed-off-by: Iliyan Malchev <malchev@google.com>
Modifies Stagefright to verify that there is a hardware-protected path
to the video sink for DRM content.
Change-Id: I18b8741390e803a05a88c7f180b860a24ba88a10
This patch fixes a crash caused by using a NULL DecryptHandle pointer.
The fix is to use sp<DecryptHandle> instead.
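A rough sketch of the pattern, assuming DecryptHandle derives from RefBase (the
surrounding struct and field names are only illustrative):

    #include <utils/RefBase.h>

    // Illustrative stand-in; the real DecryptHandle lives in the DRM framework.
    struct DecryptHandle : public android::RefBase {
        int decryptId;
    };

    struct Player {
        // Was: DecryptHandle *mDecryptHandle; a raw pointer could be freed (or
        // left NULL) while still referenced, which caused the crash.
        android::sp<DecryptHandle> mDecryptHandle;

        void useHandle() {
            if (mDecryptHandle != NULL) {   // still NULL-check before use
                // ... use mDecryptHandle->decryptId ...
            }
        }
    };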
Change-Id: Icbd59858385e8256125a615a3c82656b25319d44
This change fixes the stability problems experienced when using
a bluetooth headset supporting both A2DP and SCO. Problems occur
when starting the video chat at which time the A2DP output is being
stopped to start SCO. At that time, active AudioTracks are invalidated
by AudioFlinger so that a new AudioTrack binder interface can be
recreated by the client process on the new mixer thread with correct parameters.
The problem was that the process of restoring the binder interface was not
protected against concurrent requests, which sometimes caused two binder interfaces
to be created. This could lead to a permanent client deadlock
if one of the client threads was waiting for a condition of the first
created binder interface while the second one was created (as the AudioFlinger
would only signal conditions on the last one created).
This concurrent request situation is more likely to happen when a client
uses the Java AudioTrack, as the JNI implementation uses the native AudioTrack
callback and write push mechanisms simultaneously. As a result, the code
that checks whether the binder interface should be restored (in obtainBuffer()) is
much more likely to be called concurrently from two different threads.
The fix consists of protecting the critical binder interface restore phase
with a flag in the AudioTrack control block. The first thread acting upon the binder
interface restore request will raise the flag and the second thread will just wait for
a condition to be signaled when the restore process is complete.
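A minimal sketch of that idea using standard C++ primitives (the real change works
on the shared AudioTrack control block; the names below are illustrative):

    #include <condition_variable>
    #include <functional>
    #include <mutex>

    struct ControlBlock {
        std::mutex lock;
        std::condition_variable restoredCond;
        bool restoring = false;   // the "restore in progress" flag
    };

    // Only the first thread that detects a dead binder interface recreates it;
    // any other thread just waits until the restore is complete.
    void restoreTrackIfNeeded(ControlBlock &cblk,
                              const std::function<void()> &recreateBinder) {
        std::unique_lock<std::mutex> l(cblk.lock);
        if (cblk.restoring) {
            cblk.restoredCond.wait(l, [&] { return !cblk.restoring; });
            return;
        }
        cblk.restoring = true;
        l.unlock();
        recreateBinder();         // create the new binder interface exactly once
        l.lock();
        cblk.restoring = false;
        cblk.restoredCond.notify_all();
    }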
Also protected all accesses to the AudioTrack control block with a mutex to prevent
access while the track is being destroyed or restored. If the mutex cannot be held
(e.g., because we call a callback function), acquire a strong reference on the IAudioTrack
to prevent its destruction while the cblk is being accessed.
Modified AudioTrack JNI to use GetByteArrayElements() instead of
GetPrimitiveArrayCritical() when writing audio buffers. Entering a critical section would
cause the JNI to abort if a mediaserver crash occurs during a write due to the AudioSystem
callback being called during the critical section when media server process restarts.
In any case, with the current JNI implementation neither version copies data most of
the time, and the critical version does not guarantee zero-copy anyway.
The same modifications have been made to AudioRecord.
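For illustration, the non-critical array access looks roughly like this simplified
sketch (not the actual android_media_AudioTrack.cpp code):

    #include <jni.h>

    // Copy the Java byte[] with GetByteArrayElements() instead of
    // GetPrimitiveArrayCritical(): no JNI critical section is entered, so a
    // callback running while the buffer is held (e.g. after a mediaserver
    // restart) cannot abort the VM.
    static jint writeAudioBuffer(JNIEnv *env, jbyteArray javaBytes,
                                 jint offsetInBytes, jint sizeInBytes,
                                 jint (*nativeWrite)(const void *, jint)) {
        jbyte *data = env->GetByteArrayElements(javaBytes, NULL);
        if (data == NULL) {
            return -1;  // allocation failed or an exception is pending
        }
        jint written = nativeWrite(data + offsetInBytes, sizeInBytes);
        // JNI_ABORT: the array was only read, no need to copy it back.
        env->ReleaseByteArrayElements(javaBytes, data, JNI_ABORT);
        return written;
    }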
Change-Id: Idc5aa711a04c3eee180cdd03f44fe17f3c4dcb52
- Track the currently used audio device
- The devices are separated into speaker and other audio devices
- Provide the collected data to the battery application through pullBatteryData()
Change-Id: I374c755266b5ac6b1c6c630400f4daf901ea8acc
This change defines an OpenMAX IL API for querying from the IL component
the gralloc buffer usage flags that should be used to allocate the
buffers. It also adds the Stagefright plumbing for using the new OMX IL
API.
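A hedged sketch of what such a query could look like (the extension string and the
parameter struct follow the pattern of Android's OpenMAX extensions, but treat the
exact names and layout as assumptions):

    #include <string.h>
    #include <OMX_Core.h>

    // Assumed layout, mirroring the Android native-buffer extension structs.
    struct GetAndroidNativeBufferUsageParams {
        OMX_U32 nSize;
        OMX_VERSIONTYPE nVersion;
        OMX_U32 nPortIndex;
        OMX_U32 nUsage;           // gralloc usage bits requested by the component
    };

    static OMX_ERRORTYPE queryGrallocUsage(OMX_HANDLETYPE component,
                                           OMX_U32 portIndex, OMX_U32 *usage) {
        OMX_INDEXTYPE index;
        OMX_ERRORTYPE err = OMX_GetExtensionIndex(
                component,
                (OMX_STRING)"OMX.google.android.index.getAndroidNativeBufferUsage",
                &index);
        if (err != OMX_ErrorNone) {
            return err;
        }
        GetAndroidNativeBufferUsageParams params;
        memset(&params, 0, sizeof(params));
        params.nSize = sizeof(params);
        params.nPortIndex = portIndex;
        err = OMX_GetParameter(component, index, &params);
        if (err == OMX_ErrorNone) {
            *usage = params.nUsage;   // pass these bits to gralloc at allocation time
        }
        return err;
    }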
Change-Id: I046b5e7be70ce61e2a921dcdc6e3aa9324d19ea6
Related-Bug: 3479027
This change enables the use of a SurfaceTexture in place of a Surface
as the video sink for an android.media.MediaPlayer. The new API
MediaPlayer.setTexture is currently hidden.
This includes:
- New Java and C++ interfaces
- C++ plumbing and implementation (JNI, Binder)
- Stagefright AwesomePlayer and NuPlayer use ANativeWindow
(either Surface or SurfaceTextureClient)
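A rough sketch of why this works on the native side (header paths and the
constructor shown are assumptions, not verified API): both sinks end up as an
ANativeWindow, so the player code is sink-agnostic.

    #include <gui/SurfaceTexture.h>         // header paths assumed for this sketch
    #include <gui/SurfaceTextureClient.h>

    static android::sp<ANativeWindow> makeVideoSink(
            const android::sp<android::SurfaceTexture> &texture) {
        // SurfaceTextureClient wraps the SurfaceTexture as an ANativeWindow,
        // so the player can use it exactly like a Surface-backed window.
        return new android::SurfaceTextureClient(texture);
    }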
Change-Id: I2b568bee143d9eaf3dfc6cc4533c1bebbd5afc51
related-to-bug: 3474610
Change-Id: I6dab40e8b465922c62be9ee7f168718822c6caac
Now skipping an extra header that the spec claimed shouldn't be present in LATM...
o do not use edts/elst boxes since these optional boxes are ignored
o manipulate the first video/audio frame duration to make sure that the rest
of the audio/video is in sync (ideally, we should only manipulate
the video frame duration, not the audio)
o reduce the initial audio mute/suppression period, which is used to
eliminate the "recording" sound.
bug - 3405882 and 3362703
Change-Id: Ib0acfb4f3843b365157288951dc122b006299c18
These were exposed by the new code that previews the seek frame while paused.
In particular, the codec may have been in state RECONFIGURING when attempting
to seek, or we may have initiated flushing of the output port and this may not
have completed yet by the time we want to reconfigure the output port.
Change-Id: Id7640ade11dbc7205a22f648ea0b5e3e9b49cf4b
related-to-bug: 3392259
- Make sure ACodec reverts its state when it's shut down
- Defer "resume" to after handling the OutputPortSettingsChange
- If the OMX_EventPortSettingsChanged event comes in while we're flushing, defer it
and make sure the output port can be disabled by deleting all buffers not already
owned by the component.
Change-Id: I1f8cdffa71237b57d4275a48b834647a7b263e8b
Note: dependent on external/flac for libFLAC
Implemented and tested:
* FLAC container
* mono and stereo
* standard sample rates
* standard bit depths
* sniffer
* media scanner
* Vorbis comment metadata including album art
* random access seeking with "torture test"
* web browser integration for audio/flac (not audio/x-flac), but
note that most web servers don't correctly report the MIME type
Not implemented:
* 24-bit to 16-bit dither or noise shaping in AudioFlinger
* 96 kHz to 44.1 or 48 kHz downsampling low pass filter in AudioFlinger
* replay gain is better done in AudioFlinger
* multi-channel, would need AudioFlinger support
* Ogg container, does not seem to be very popular yet
Change-Id: I300873e8c0cfc2e95403d9adb5064d16a2923f17
Modified default volume control logic in AudioService:
1 IN_CALL volume if in video/audio chat
2 NOTIFICATION if notification is playing or was playing less than 5s ago.
3 MUSIC
Modified silent mode:
- now also affects the MUSIC stream type
- enter silent mode when the VOL- hard key is pressed once while the selected
stream's volume is already at 0 (except for the VOICE_CALL stream).
- exit silent mode when the VOL+ hard key is pressed while in silent mode
Play sound FX (audible selections, keyboard clicks) at a fixed volume.
Modified audio framework:
- the isStreamActive() method is now implemented in AudioPolicyManagerBase (previously in AudioFlinger)
- isStreamActive() now takes a time window during which the stream is still considered
active after it actually stopped (see the sketch below).
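A minimal sketch of that time-window rule (names are illustrative, not the actual
AudioPolicyManagerBase members):

    #include <stdint.h>

    // A stream counts as active if it is playing right now, or if it stopped
    // less than inPastMs milliseconds ago.
    static bool isStreamActiveWithin(bool playingNow, int64_t nowMs,
                                     int64_t stopTimeMs, uint32_t inPastMs) {
        if (playingNow) {
            return true;
        }
        return (nowMs - stopTimeMs) < (int64_t)inPastMs;
    }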
Change-Id: I7e5a0724099450b9fc90825224180ac97322785f
This will fix the stop failure issue where we have to wait n * time_interval before a key frame can be received by the file writer, where
o n is the actual number of buffers advertised by the video encoder
o time_interval is the interval setting for timelapse video recording,
specifying the time distance between neighboring input video frames
The fix includes two parts:
o OMXCodec will not submit all n buffers at once, but will instead submit one input
frame at a time as it becomes available.
o The timelapse camera source makes the first two input frames available and does not skip
them, so that the first compressed output frame can be received regardless of
the specified time_interval.
bug - 3367659
Change-Id: Ia68cc2cb0d71aa7dc54540e9ad82fae911ad530b
This avoids the race condition where notifications are dispatched to a NULL receiver
after notifications have been disabled.
Change-Id: I6d351ffbee97616e2c35559c132a6c5e6a66948a
related-to-bug: 3394139
This prevents the mediaserver from leaking a file descriptor after
the media scanner runs.
BUG: 3373546
Change-Id: I82a8bae82306de3da56a5c7da5b03ecf106a4efc
Signed-off-by: Mike Lockwood <lockwood@android.com>
- play audio-only streams again
- workaround for malformed streams that switch PIDs across bandwidths
- attempt to pick a different bandwidth stream if the previously chosen one appears
to be malformed/unsupported.
Change-Id: I426d0a40dc725aa242f619d4c9d048b69aca55c9
related-to-bug: 2368598
o Removed setMode() methods and related mode constants
o Removed some of the unused metadata keys
o Updated the javadoc
o Part of a multi-project change.
bug - 2433195
Change-Id: I5ed167f1fd6a53cb143b7dc385b149431d434438