Compare commits

...

113 Commits

Author SHA1 Message Date
1c864a88eb Use --pause-on-exit from launchers
The terminal opened by scrcpy-console (.bat or .desktop) must not close
if scrcpy terminates with an error, so that error messages can be read.

Refs #3817 <https://github.com/Genymobile/scrcpy/pull/3817>
Refs #3822 <https://github.com/Genymobile/scrcpy/pull/3822>
PR #4130 <https://github.com/Genymobile/scrcpy/pull/4130>
2023-10-11 09:43:44 +02:00
1650b7c058 Add --pause-on-exit
Add an option to make scrcpy pause on exit.

Three behaviors are possible:
 - always pause on exit:
    --pause-on-exit
    --pause-on-exit=true
 - never pause on exit:
    (no option)
    --pause-on-exit=false
 - pause when scrcpy returns with an error (a non-zero exit code):
    --pause-on-exit=if-error

This is useful to prevent the terminal window from automatically
closing, so that error messages can be read.
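
The decision reduces to a simple check before the process exits. A minimal sketch in C, with hypothetical names (not the actual scrcpy code):

    #include <stdbool.h>
    #include <stdio.h>

    enum sc_pause_on_exit_mode { PAUSE_NEVER, PAUSE_ALWAYS, PAUSE_IF_ERROR };

    // Block until the user presses Enter when the configured mode requires
    // it, so that the terminal (and its error messages) stays visible.
    static void pause_on_exit(enum sc_pause_on_exit_mode mode, int exit_code) {
        bool must_pause = mode == PAUSE_ALWAYS
                       || (mode == PAUSE_IF_ERROR && exit_code != 0);
        if (must_pause) {
            printf("Press Enter to continue...\n");
            getchar();
        }
    }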

Refs #3817 <https://github.com/Genymobile/scrcpy/pull/3817>
Refs #3822 <https://github.com/Genymobile/scrcpy/pull/3822>
PR #4130 <https://github.com/Genymobile/scrcpy/pull/4130>
2023-10-11 09:43:44 +02:00
a7c3c9a54c Make fillBaseContext() method private
This is consistent with fillAppInfo() and fillAppContext(), which are
also private.
2023-08-22 20:10:06 +02:00
111d02fca4 Add missing 'final' in Java classes
For consistency.
2023-08-22 18:52:29 +02:00
36670dda40 Fix warning typo
A parenthesis was missing.
2023-08-07 20:22:17 +02:00
0983f0a194 Report device disconnection on audio EOS
If --no-video was set, then device disconnection was not reported. To
avoid the problem, report device disconnection also on audio
end-of-stream (EOS).

If both video and audio are enabled, then a device disconnection event
will be sent twice, but only the first one will be handled (since it
makes scrcpy exit).
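
A rough sketch of the idea in C, using a hypothetical push_disconnect_event() helper (the real client reports this through its event loop):

    #include <stdio.h>

    // Hypothetical: post a "device disconnected" event to the main loop.
    // Posting it twice is harmless, since the first one makes scrcpy exit.
    static void push_disconnect_event(void) {
        printf("device disconnected\n");
    }

    static void on_video_eos(void) {
        push_disconnect_event();
    }

    static void on_audio_eos(void) {
        // Also report here, so that --no-video is covered.
        push_disconnect_event();
    }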

Fixes #4207 <https://github.com/Genymobile/scrcpy/issues/4207>
2023-08-01 12:05:16 +02:00
110b3a16f6 Do not disable controls without video playback
Some control messages can still be used even when video playback is
disabled (i.e. there is no window), for example to turn the screen off.

This reverts commit 92483fe11b
(semantically).

Fixes #4175 <https://github.com/Genymobile/scrcpy/issues/4175>
2023-07-28 14:45:33 +02:00
1ee46970e3 Fix TCP/IP link in README
Refs 328ed3650d
PR #4173 <https://github.com/Genymobile/scrcpy/pull/4173>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-07-26 19:58:13 +02:00
fcdf847dd3 Add missing syntax highlighting in audio doc 2023-07-14 23:37:19 +02:00
ad05a01800 Add Encoder section
This will allow referencing the encoder section directly in issues.
2023-07-14 23:36:21 +02:00
328ed3650d Extract device connection to a separate doc page
Create a new "Connection" documentation page.
2023-07-14 23:31:26 +02:00
c14668b177 Move display section to video documentation 2023-07-14 23:26:52 +02:00
637f48f360 Update links to v2.1.1 2023-07-14 23:09:44 +02:00
d391fc3b69 Bump version to 2.1.1 2023-07-14 18:58:58 +02:00
75ad925423 Merge branch 'master' into release 2023-07-14 18:58:34 +02:00
7e936fa879 Fix meizu deadlock
Some devices (Meizu) assume that the video encoding thread has a
Looper. By moving video encoding to a separate thread, commit
feab87053a broke this assumption.

Call Looper.prepare() from this thread to fix the problem.

Fixes #4143 <https://github.com/Genymobile/scrcpy/issues/4143>
2023-07-13 21:46:50 +02:00
01d785d9a3 Increase attempts to start AudioRecord
Making the shell app foreground (specific to Android 11) may take more
than 300ms on some devices, so increase the number of attempts from 3 to
5 (separated by 100ms).

Fixes #4147 <https://github.com/Genymobile/scrcpy/issues/4147>
Refs #3796 <https://github.com/Genymobile/scrcpy/issues/3796>
Refs 02f4ff7534
2023-07-07 18:21:17 +02:00
fe6e9acb36 Log device selection at INFO level
The selected device should be logged by default.
2023-07-04 18:22:33 +02:00
625934fb1b Fix fedora package in build instructions
In Fedora, the package is libusb1-devel.

Fixes #4131 <https://github.com/Genymobile/scrcpy/issues/4131>
PR #4132 <https://github.com/Genymobile/scrcpy/pull/4132>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-06-30 21:21:13 +02:00
85b55b3c4e Fix possible division by zero
On sway (a window manager), SDL_WINDOWEVENT_EXPOSED and
SDL_WINDOWEVENT_SIZE_CHANGED might not be called before a mouse event is
triggered. As a consequence, the "content rectangle" might not be
initialized when the mouse event is processed, causing a division by
zero.

To avoid the problem, initialize the content rect immediately when the
window is shown.
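
A sketch of the guard in C, with hypothetical names (the real logic lives in the screen code): derive the content rectangle from the window size as soon as the window is shown, so a mouse event arriving before the first EXPOSED or SIZE_CHANGED event never divides by a zero size.

    #include <SDL2/SDL.h>

    struct content_rect {
        int w;
        int h;
    };

    // Hypothetical initializer: use the current window size as a first value.
    static void init_content_rect(SDL_Window *window, struct content_rect *rect) {
        SDL_GetWindowSize(window, &rect->w, &rect->h);
    }

    static void handle_window_event(const SDL_WindowEvent *event,
                                    SDL_Window *window,
                                    struct content_rect *rect) {
        if (event->event == SDL_WINDOWEVENT_SHOWN) {
            // Do not wait for EXPOSED/SIZE_CHANGED: initialize immediately.
            init_content_rect(window, rect);
        }
    }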

Fixes #4115 <https://github.com/Genymobile/scrcpy/issues/4115>
2023-06-29 19:18:32 +02:00
7b7076ef85 Add direct links to donations 2023-06-28 10:58:00 +02:00
808bd14e30 Ignore fold change events for other display ids
Scrcpy mirrors a specific display id, so it must ignore events for other
display ids.

Fixes #4120 <https://github.com/Genymobile/scrcpy/issues/4120>
2023-06-27 18:43:22 +02:00
0049b3ce07 Remove superfluous log
This line was committed by mistake in commit
a52053421a.
2023-06-23 08:23:29 +02:00
5764f47fee Update links to v2.1 2023-06-22 01:18:17 +02:00
2dab1f7024 Bump version to 2.1 2023-06-22 01:15:44 +02:00
744312ec64 Merge branch 'master' into release 2023-06-22 01:15:39 +02:00
b9315620e2 Fix adb forward initialization
In forward mode, the dummy byte must be written immediately after the
first accept(), otherwise the client will wait indefinitely, causing a
deadlock (or a timeout).
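
The idea, shown as a POSIX-socket sketch in C (the actual server is Java and uses an Android local socket): write the dummy byte immediately after the first accept(), before doing anything else, so the client's blocking read returns.

    #include <sys/socket.h>
    #include <unistd.h>

    // Accept the first connection and immediately write the dummy byte.
    static int accept_first_and_signal(int server_fd) {
        int fd = accept(server_fd, NULL, NULL);
        if (fd == -1) {
            return -1;
        }
        const char dummy = 0;
        if (write(fd, &dummy, 1) != 1) { // must happen before anything else
            close(fd);
            return -1;
        }
        return fd;
    }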

Regression introduced by 8c650e53cd.
2023-06-22 01:13:53 +02:00
ea59d525bd Fix code style
The code should fit in 80 columns.
2023-06-22 01:07:09 +02:00
0ffcfa0f5c Accept failure in rotation or fold registration
Do not make scrcpy fail if rotation or display fold listeners could not
be registered.
2023-06-22 00:52:54 +02:00
c0f3c080b6 Register DisplayFoldListener only for Android 10+
This listener does not exist on Android < 10, and registering it makes scrcpy fail.
2023-06-22 00:49:11 +02:00
d046678f85 Upgrade platform-tools (34.0.3) for Windows
Include the latest version of adb in Windows releases.
2023-06-22 00:10:37 +02:00
fae3fbc934 Update developer documentation 2023-06-22 00:03:26 +02:00
5061b7e02c Fix build without gradle
Add missing class generation from IDisplayFoldListener.aidl.

Refs 24999d0d32
2023-06-22 00:03:26 +02:00
09009c2aa7 Upgrade SDL (2.28.0) for Windows
Include the latest version of SDL in Windows releases.

Fixes #3825 <https://github.com/Genymobile/scrcpy/issues/3825>
Refs libsdl/#7478 <https://github.com/libsdl-org/SDL/issues/7478>
2023-06-20 21:45:14 +02:00
fb21bbf763 Add workarounds for Honor devices
Audio did not work on Honor devices.

To make it work, a system context must be set as a base context of
FakeContext (so that a PackageManager is available), and a current
Application and ActivityThread must be set.

These workarounds must not be applied for all devices, because they
might cause other issues.

Fixes #4015 <https://github.com/Genymobile/scrcpy/issues/4015>
Refs #3805 <https://github.com/Genymobile/scrcpy/issues/3805>

Co-authored-by: Simon Chan <1330321+yume-chan@users.noreply.github.com>
2023-06-19 18:45:40 +02:00
0f1afff7a6 Move workarounds execution
Expose a single public static method in the Workarounds class to apply
all necessary workarounds.
2023-06-19 18:45:11 +02:00
48a00fb481 Log device BRAND
The BRAND value is not always the same as the MANUFACTURER value.
2023-06-17 00:25:01 +02:00
3b7e2ca9c8 Fix lint warning
Suppress lint "DiscouragedPrivateApi" in Workarounds.java.
2023-06-16 23:24:08 +02:00
5bd7514871 Add InputManagerGlobal for Android 14 beta 3
Parts of the InputManager class have been moved to a new
InputManagerGlobal class in Android 14 preview.

Fixes #4074 <https://github.com/Genymobile/scrcpy/issues/4074>
PR #4075 <https://github.com/Genymobile/scrcpy/pull/4075>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-06-13 15:19:24 +02:00
d3c2955fb9 Add --time-limit
Add an option to stop scrcpy automatically after a given delay.

PR #4052 <https://github.com/Genymobile/scrcpy/pull/4052>
Fixes #3752 <https://github.com/Genymobile/scrcpy/issues/3752>
2023-06-10 16:04:51 +02:00
5042f8de93 Improve recording documentation 2023-06-10 16:03:27 +02:00
7536f95d1c Rename raw_video_stream to raw_stream
This server-specific option impacts both the video and audio streams.
2023-06-10 12:09:43 +02:00
6832e8d629 Remove spurious empty line 2023-06-10 12:07:35 +02:00
28313631e5 Reformat Java code
Fix code style.
2023-06-09 22:28:01 +02:00
fdbc9397a7 Name Java threads
Give a user-friendly name to Java threads created by the server.
2023-06-09 22:27:35 +02:00
4ad7479425 Add missing shortcut in documentation
MOD+Backspace also triggers BACK.
2023-06-08 08:52:21 +02:00
a3cdf1a6b8 Add option to kill adb on close
Killing adb on close by default would be incorrect, since it would break
any other usage of adb in parallel.

It could be easily done manually by calling "adb kill-server" once
scrcpy terminates, but add an option --kill-adb-on-close for
convenience.

Fixes #205 <https://github.com/Genymobile/scrcpy/issues/205>
Fixes #2580 <https://github.com/Genymobile/scrcpy/issues/2580>
Fixes #4049 <https://github.com/Genymobile/scrcpy/issues/4049>
2023-06-05 19:48:21 +02:00
b16d4d1835 Fix adb server vs adb daemon confusion
The adb daemon runs on the device; the adb server runs as a background
process on the computer.
2023-06-05 19:45:20 +02:00
b8d43866d2 Fix options alphabetical order
Commit fc52b24503 missed this one.
2023-06-05 19:44:15 +02:00
2d79aeb117 Simplify command in documentation
If --no-video is passed, --no-playback is equivalent to
--no-audio-playback.
2023-06-04 18:43:35 +02:00
888a5aae7d Fix typo in recording documentation
The option is --record, not --record-file.
2023-06-04 18:40:55 +02:00
323ea2f1d9 Fix PTS when not monotonically increasing
Some decoders fail to guarantee that PTS is strictly monotonically
increasing. Fix the (rescaled) PTS when it does not respect this
constraint.
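
The fix boils down to a clamp. A minimal sketch with hypothetical names:

    #include <stdbool.h>
    #include <stdint.h>

    struct pts_fixer {
        bool has_last;
        int64_t last; // last PTS written, in the output time base
    };

    // Ensure the (rescaled) PTS is strictly monotonically increasing.
    static int64_t fix_pts(struct pts_fixer *f, int64_t pts) {
        if (f->has_last && pts <= f->last) {
            pts = f->last + 1;
        }
        f->last = pts;
        f->has_last = true;
        return pts;
    }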

Fixes #4054 <https://github.com/Genymobile/scrcpy/issues/4054>
2023-06-03 18:50:28 +02:00
9ca554ca41 Extract stream-specific structure in recorder
For now, it only contains the stream index, but more fields will be
added.
2023-06-03 18:48:01 +02:00
9d3c656414 Fix recorder waiting when stream disabled
In the recorder, if the video or audio stream is disabled, do not wait
for its initialization (it will never happen) to process the header.

In that case (scrcpy --no-audio --record=file.mp4), this caused the
whole content to be buffered in memory, and written only on exit.
2023-06-03 18:46:39 +02:00
379caf8551 Use a single condvar in recorder
The sc_cond_wait() in sc_recorder_process_header() needs to be notified
of changes to video_init/audio_init (protected by stream_cond) and
video_queue/audio_queue (protected by queue_cond).

Use only one condition variable to simplify.
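
A condensed sketch with pthreads (the real recorder uses scrcpy's sc_mutex/sc_cond wrappers and also watches the packet queues): with a single condition variable, every producer broadcasts after any state change and the waiter simply re-checks all its predicates.

    #include <pthread.h>
    #include <stdbool.h>

    struct recorder_state {
        pthread_mutex_t mutex; // initialized with pthread_mutex_init()
        pthread_cond_t cond;   // single condvar for every state change
        bool video_init;
        bool audio_init;
        bool stopped;
    };

    // Wait until both streams are initialized (or the recorder is stopped).
    static void wait_for_init(struct recorder_state *r) {
        pthread_mutex_lock(&r->mutex);
        while (!r->stopped && !(r->video_init && r->audio_init)) {
            pthread_cond_wait(&r->cond, &r->mutex);
        }
        pthread_mutex_unlock(&r->mutex);
    }

    // Every producer signals the same condvar after changing any state.
    static void set_video_init(struct recorder_state *r) {
        pthread_mutex_lock(&r->mutex);
        r->video_init = true;
        pthread_cond_broadcast(&r->cond);
        pthread_mutex_unlock(&r->mutex);
    }
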
2023-06-03 15:10:42 +02:00
2aec7b4c9d Mention how to interrupt scrcpy without video
There is no window to close if video playback is disabled.
2023-06-02 09:00:33 +02:00
fc52b24503 Reorder options in alphabetical order
Fix the options order, using the short option as key first (if any) in
all cases for consistency.
2023-06-01 12:52:48 +02:00
ff5ffc892f Add option to select audio source
Pass --audio-source=mic to capture the microphone instead of the device
audio output.
2023-06-01 09:21:09 +02:00
360f2fea1e Extract AudioCapture creation
This will allow passing capture options without code duplication.
2023-06-01 09:21:09 +02:00
24999d0d32 Reset video capture on folding event
Handle folding events the same way as rotation events.

Fixes #3960 <https://github.com/Genymobile/scrcpy/issues/3960>
PR #3979 <https://github.com/Genymobile/scrcpy/pull/3979>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-06-01 09:20:00 +02:00
8e2c0d6407 Rename rotationChanged to resetCapture
The flag is used to reset the capture (restart the encoding) on rotation
change. It will also be used for other events (on folding change), so
rename it.

PR #3979 <https://github.com/Genymobile/scrcpy/pull/3979>
2023-06-01 09:20:00 +02:00
9a2abba098 Update demuxer comment
The comment was outdated:
 - the "meta" header is now always present (not only when recording is
   enabled);
 - it is not only used for the video stream, but also for the audio
   stream.
2023-06-01 09:04:00 +02:00
b2d860382f Fix stream offset on audio buffer underflow
The `read` variable is in number of samples, while the offset must be in
bytes.
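
A reduced sketch of the corrected arithmetic (the one-line fix itself appears in the diff further down), assuming 16-bit stereo samples:

    #include <stdint.h>
    #include <string.h>

    #define CHANNELS 2
    #define BYTES_PER_SAMPLE 2 // 16-bit PCM
    #define TO_BYTES(samples) ((samples) * CHANNELS * BYTES_PER_SAMPLE)

    // Fill the tail of the SDL output buffer with silence after a short read.
    static void insert_silence(uint8_t *stream, uint32_t read_samples,
                               uint32_t silence_samples) {
        // The offset must be in bytes, not in samples.
        memset(stream + TO_BYTES(read_samples), 0, TO_BYTES(silence_samples));
    }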

PR #4045 <https://github.com/Genymobile/scrcpy/pull/4045>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-05-31 10:09:56 +02:00
4c4a03ebe1 Reorder options to maintain alphabetical order 2023-05-30 21:36:48 +02:00
Yan
798dfd240e Turn device screen off after set up
Sometimes it can take quite a while for everything to get set up and
the screen to appear.

PR #3902 <https://github.com/Genymobile/scrcpy/pull/3902>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-05-27 10:11:42 +02:00
c4caa6b81d Document --no-{video,audio}-playback
PR #4033 <https://github.com/Genymobile/scrcpy/pull/4033>
2023-05-27 10:08:10 +02:00
1efbfe1175 Add separate video and audio playback options
Add --no-video-playback and --no-audio-playback. The option
--no-playback is now an alias for both.

PR #4033 <https://github.com/Genymobile/scrcpy/pull/4033>
2023-05-27 10:08:10 +02:00
751c09f47a Simplify V4L2/USB ifdefs
Define local variables whose value depends on ifdefs, to avoid
cluttering all conditions with ifdefs.
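
The pattern looks roughly like this (illustrative names and condition, not the actual scrcpy checks):

    #include <stdbool.h>
    #include <stddef.h>

    struct options {
        const char *v4l2_device;
        bool otg;
    };

    static bool check_options(const struct options *opts) {
    #ifdef HAVE_V4L2
        bool v4l2 = opts->v4l2_device != NULL;
    #else
        bool v4l2 = false;
    #endif
        // Subsequent conditions stay free of #ifdef clutter.
        if (v4l2 && opts->otg) {
            return false; // illustrative incompatibility
        }
        return true;
    }
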
2023-05-27 09:55:49 +02:00
6ad46d70b8 Define v4l2_buffer only if HAVE_V4L2
If V4L2 support is disabled, there is no v4l2 buffer option.
2023-05-27 09:55:49 +02:00
f46758d1c5 Fix V4L2 error message when disabled
For consistency, use the same error message for --v4l2-sink and
--v4l2-buffer.
2023-05-27 09:55:49 +02:00
e71f5358b3 Reorder command line options checks
Perform checks that impact the options first.
2023-05-27 09:55:49 +02:00
a2c8910006 Rename --no-mirror to --no-playback
This option impacts video and audio _playback_. For example, if we use
V4L2, the device is still "mirrored" (via V4L2), even if playback is
disabled. Therefore, "playback" is more appropriate than "mirror".

The initial --no-display option was renamed to --no-mirror by
commit 6928acdeac, but this has never been
released, so it is ok to rename it one more time.

Refs #3978 <https://github.com/Genymobile/scrcpy/pull/3978#issuecomment-1549420103>
PR #4033 <https://github.com/Genymobile/scrcpy/pull/4033>
2023-05-27 09:55:38 +02:00
cab354102d Create AudioRecord by reflection as a fallback
Some devices (Vivo phones) fail to create an AudioRecord from an
AudioRecord.Builder (which throws a NullPointerException).

In that case, create an AudioRecord instance directly by reflection.

The AOSP version of AudioRecord constructor code can be found at:
 - Android 11 (R):
   <https://cs.android.com/android/platform/superproject/+/android-11.0.0_r1:frameworks/base/media/java/android/media/AudioRecord.java;l=335;drc=64ed2ec38a511bbbd048985fe413268335e072f8>
 - Android 12 (S):
   <https://cs.android.com/android/platform/superproject/+/android-12.0.0_r1:frameworks/base/media/java/android/media/AudioRecord.java;l=388;drc=2eebf929650e0d320a21f0d13677a27d7ab278e9>
 - Android 13 (T, functionally identical to Android 12):
   <https://cs.android.com/android/platform/superproject/+/android-13.0.0_r1:frameworks/base/media/java/android/media/AudioRecord.java;l=382;drc=ed242da52f975a1dd18671afb346b18853d729f2>
 - Android 14 (U): Not released, but expected to change

PR #3862 <https://github.com/Genymobile/scrcpy/pull/3862>
Fixes #3805 <https://github.com/Genymobile/scrcpy/issues/3805>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-05-26 18:56:43 +02:00
597d2ccc01 Rename FORMAT to ENCODING
The AudioFormat contains several properties. This specific value is
named "encoding".
2023-05-24 22:13:09 +02:00
38900d7730 Extract audio source to a static constant
For consistency with the other parameters.
2023-05-24 22:13:09 +02:00
e926bf1fe8 Delay window resize when minimized
On some window managers (e.g. on Windows), performing a resize while the
window is minimized does nothing (the restored window keeps its old
size).

Therefore, like for maximized and fullscreen states, wait for the window
to be restored to apply a resize.
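
A sketch of the pattern with hypothetical fields (the real code also handles the maximized and fullscreen cases): remember the requested size while the window is minimized and apply it on SDL_WINDOWEVENT_RESTORED.

    #include <SDL2/SDL.h>
    #include <stdbool.h>

    struct screen {
        SDL_Window *window;
        bool resize_pending;
        int pending_w;
        int pending_h;
    };

    static void request_resize(struct screen *s, int w, int h) {
        if (SDL_GetWindowFlags(s->window) & SDL_WINDOW_MINIMIZED) {
            // Resizing now would be ignored; postpone it.
            s->resize_pending = true;
            s->pending_w = w;
            s->pending_h = h;
        } else {
            SDL_SetWindowSize(s->window, w, h);
        }
    }

    // Called on SDL_WINDOWEVENT_RESTORED.
    static void on_restored(struct screen *s) {
        if (s->resize_pending) {
            SDL_SetWindowSize(s->window, s->pending_w, s->pending_h);
            s->resize_pending = false;
        }
    }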

Refs #3947 <https://github.com/Genymobile/scrcpy/issues/3947>
2023-05-22 18:22:45 +02:00
6298ef095f Accept texture failures
When the scrcpy window is minimized on Windows with D3D9, texture
creation and update fail.

In that case, do not terminate scrcpy. Instead, store the pending size
or frame to update, to attempt again during the next update or
rendering.
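
A sketch of the retry logic with a hypothetical structure: keep the size that failed and recreate the texture on a later update instead of exiting.

    #include <SDL2/SDL.h>
    #include <stdbool.h>

    struct display {
        SDL_Renderer *renderer;
        SDL_Texture *texture;
        bool pending;      // a texture (re)creation is still needed
        int pending_w;
        int pending_h;
    };

    static bool set_texture_size(struct display *d, int w, int h) {
        SDL_Texture *tex = SDL_CreateTexture(d->renderer, SDL_PIXELFORMAT_YV12,
                                             SDL_TEXTUREACCESS_STREAMING, w, h);
        if (!tex) {
            // Happens on Windows with D3D9 while minimized: do not fail hard,
            // remember the size and retry on the next update or rendering.
            d->pending = true;
            d->pending_w = w;
            d->pending_h = h;
            return false;
        }
        if (d->texture) {
            SDL_DestroyTexture(d->texture);
        }
        d->texture = tex;
        d->pending = false;
        return true;
    }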

Fixes #3947 <https://github.com/Genymobile/scrcpy/issues/3947>
2023-05-22 18:21:10 +02:00
958f22490b Document installation via winget on Windows
PR #4005 <https://github.com/Genymobile/scrcpy/pull/4005>
Refs #1444 <https://github.com/Genymobile/scrcpy/issues/1444>
Refs #3932 <https://github.com/Genymobile/scrcpy/issues/3932>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-05-17 19:36:39 +02:00
7d33798b40 Upgrade FFmpeg build to 6.0-scrcpy-4
Use FFmpeg DLLs which do not depend on zlib1.dll.
2023-05-15 21:55:22 +02:00
d500550212 Update audio recording documentation
Document how to record audio-only to .opus and .aac files.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-15 14:28:55 +02:00
a166eee909 Upgrade FFmpeg build to 6.0-scrcpy-3
Use a build which includes the opus muxer, to support recording to .opus
files.

Refs <https://github.com/rom1v/scrcpy-deps/commits/6.0-scrcpy-3>
PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-15 14:28:53 +02:00
b11b363e8e Add recording to aac file
It is just an alias for mp4.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 17:11:34 +02:00
7321db6f28 Add recording to opus file
Use the FFmpeg opus muxer to record an opus file.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 17:11:34 +02:00
d6bcde565f Accept .m4a and .mka
These are just aliases for mp4 and mkv when there is no video stream.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 17:11:34 +02:00
98f4f4e68a Refactor command line checks
Several checks are performed when opts->record_filename is not NULL.
Group them in a single block.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 17:11:34 +02:00
be86e14e05 Factorize record format parsing
Convert either the filename extension or the explicit record format
to a sc_record_format using the same function.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 17:11:34 +02:00
8c650e53cd Add --no-video
Similar to --no-audio, add --no-video to play audio only.

Fixes #3842 <https://github.com/Genymobile/scrcpy/issues/3842>
PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 17:11:34 +02:00
e89e772c7c Remove unnecessary 'else'
Some server parameters may depend on one another. For example,
audio_bit_rate is meaningless if audio is false.

But it is inconsistent to disable some parameters based on these
dependency checks, but not others. Handling all dependencies between
parameters would add too much complexity for no benefit.

So just pass individual parameters independently.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 16:41:12 +02:00
feab87053a Convert screen encoder to async processor
Contrary to the other tasks (controller and audio capture/encoding), the
screen encoder was executed synchronously. As a consequence,
scrcpy-server could not terminate until the screen encoder returned.

Convert it to an async processor. This allows terminating on controller
error, and paves the way to disabling video mirroring.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 16:41:09 +02:00
751a3653a0 Add missing @Override annotations
PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 16:41:06 +02:00
9c08eb79cb Close connection at the end of finally-block
The async processors use the socket file descriptors from the
connection. Therefore, the connection must not be closed before all
async processor threads are joined.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 16:41:04 +02:00
92483fe11b Disable controls on --no-mirror
If mirroring is disabled, control must also be disabled.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 16:41:01 +02:00
6928acdeac Rename --no-display to --no-mirror
The option impacts both video and audio playback, so "no display" is not
an appropriate name.

PR #3978 <https://github.com/Genymobile/scrcpy/pull/3978>
2023-05-08 16:40:58 +02:00
cb20bcb16f Clarify API versions that support Audio Forwarding
Reword the sentence about supported API versions for audio forwarding to
clarify that it requires API >= 30

PR #3949 <https://github.com/Genymobile/scrcpy/pull/3949>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-04-26 19:33:28 +02:00
0f3af2d20b Fix build for FFmpeg < 3.3
The constant AV_CODEC_ID_AV1 was introduced in FFmpeg 3.3. Add an ifdef
to support older versions.

Fixes #3939 <https://github.com/Genymobile/scrcpy/issues/3939>
2023-04-23 12:26:46 +02:00
Yan
c083a7cc90 Force OpenGL Core Profile context on macOS
By default, SDL creates an OpenGL 2.1 context on macOS for an OpenGL
renderer. As a consequence, mipmapping is not supported.

Force a core profile context, to get a higher version.

Before:

    INFO: Renderer: opengl
    INFO: OpenGL version: 2.1 NVIDIA-14.0.32 355.11.11.10.10.143
    WARN: Trilinear filtering disabled (OpenGL 3.0+ or ES 2.0+ required)

After:

    INFO: Renderer: opengl
    DEBUG: Creating OpenGL Core Profile context
    INFO: OpenGL version: 4.1 NVIDIA-14.0.32 355.11.11.10.10.143
    INFO: Trilinear filtering enabled

when running with:

    scrcpy --verbosity=debug --render-driver=opengl

Note: Since SDL_CreateRenderer() causes a fallback to OpenGL 2.1, the
profile and version attributes have to be set and the context created
_after_.
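
For reference, the SDL calls involved look roughly like this (a sketch, not the exact scrcpy code), issued only once the renderer exists, as the note explains:

    #include <SDL2/SDL.h>

    // Request a Core Profile context (here 3.2) instead of the default 2.1
    // context, then create the GL context.
    static SDL_GLContext create_core_profile_context(SDL_Window *window) {
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
                            SDL_GL_CONTEXT_PROFILE_CORE);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
        return SDL_GL_CreateContext(window);
    }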

PR #3895 <https://github.com/Genymobile/scrcpy/pull/3895>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-04-12 21:26:24 +02:00
9eb6591913 Add missing --no-audio option in manpage 2023-04-10 19:30:04 +02:00
9cfea347d0 Remove Options setters
Now that options parsing is performed from the Options class, setters
are not necessary anymore.
2023-04-09 20:02:39 +02:00
ce064fb5e0 Move options parsing to Options class 2023-04-09 20:02:39 +02:00
afcdfc7fd7 Fix checkstyle violation
Checkstyle reported this error:

    [ant:checkstyle] [ERROR] AudioCapture.java:89:145: '+' should be on
    a new line. [OperatorWrap]
2023-04-09 20:01:58 +02:00
051b74c883 Extract sc_display from sc_screen
Move the display code to a separate component.
2023-04-06 19:48:26 +02:00
2e532afd2b Pass const pointers to events
SDL_Events are only read.
2023-04-06 19:48:01 +02:00
fdf465851c Add Android version check in raw audio recorder
Do not attempt to capture audio below Android 11, as this may cause a
segfault on the device.

PR #3889 <https://github.com/Genymobile/scrcpy/pull/3889>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-04-04 18:37:30 +02:00
8f0b38cc4f Specify in README that OTG does not require adb 2023-03-31 07:55:13 +02:00
a1e8a34001 Fix documentation link in FAQ 2023-03-28 08:32:07 +02:00
00534b0b2d Fix typo in FAQ 2023-03-28 08:31:01 +02:00
21df2c240e Mention necessary reboot
After setting "USB debugging (security settings)", a reboot is
necessary.
2023-03-23 19:06:44 +01:00
2d3059e1ab Reference FAQ from HID/OTG documentation
Reference the FAQ section about "HID/OTG issues on Windows" from the
HID/OTG documentation.
2023-03-20 19:42:23 +01:00
478aece68f Replace "bit-rate" with "bit rate" 2023-03-20 08:35:13 +01:00
55899c091e Fix typo in doc/audio.md
The documentation is about audio bit rate, not video bit rate.

PR #3839 <https://github.com/Genymobile/scrcpy/pull/3839>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-03-20 08:32:58 +01:00
d9a644df9c Clarify V4L2 feature in README
The previous formulation could suggest that the device camera could be
used as a webcam. This is not the case (yet?).
2023-03-15 10:46:36 +01:00
45717733a1 Document missing Opus encoder error
And how to solve it.
2023-03-15 00:41:07 +01:00
6ad037ea04 Update Gentoo instructions
scrcpy is available directly in the distro, so drop the link to the overlay
(which only contains older versions).

PR #3816 <https://github.com/Genymobile/scrcpy/pull/3816>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-03-14 22:03:21 +01:00
80 changed files with 2818 additions and 1314 deletions

8
FAQ.md
View File

@@ -159,6 +159,8 @@ In developer options, enable:
 > **USB debugging (Security settings)**
 > _Allow granting permissions and simulating input via USB debugging_
+Rebooting the device is necessary once this option is set.
 [simulating input]: https://github.com/Genymobile/scrcpy/issues/70#issuecomment-373286323
@@ -168,12 +170,12 @@ The default text injection method is [limited to ASCII characters][text-input].
 A trick allows to also inject some [accented characters][accented-characters],
 but that's all. See [#37].
-Since scrcpy v1.20, it is possible to simulate a [physical keyboard][hid] (HID).
+It is also possible to simulate a [physical keyboard][hid] (HID).
 [text-input]: https://github.com/Genymobile/scrcpy/issues?q=is%3Aopen+is%3Aissue+label%3Aunicode
 [accented-characters]: https://blog.rom1v.com/2018/03/introducing-scrcpy/#handle-accented-characters
 [#37]: https://github.com/Genymobile/scrcpy/issues/37
-[hid]: README.md#physical-keyboard-simulation-hid
+[hid]: doc/hid-otg.md
 ## Client issues
@@ -229,4 +231,4 @@ Translations of this FAQ in other languages are available in the [wiki].
 [wiki]: https://github.com/Genymobile/scrcpy/wiki
-Only this README file is guaranteed to be up-to-date.
+Only this FAQ file is guaranteed to be up-to-date.

View File

@@ -1,11 +1,11 @@
-# scrcpy (v2.0)
+# scrcpy (v2.1.1)
 <img src="app/data/icon.svg" width="128" height="128" alt="scrcpy" align="right" />
 _pronounced "**scr**een **c**o**py**"_
 This application mirrors Android devices (video and audio) connected via
-USB or [over TCP/IP](doc/device.md#tcpip-wireless), and allows to control the
+USB or [over TCP/IP](doc/connection.md#tcpip-wireless), and allows to control the
 device with the keyboard and the mouse of the computer. It does not require any
 _root_ access. It works on _Linux_, _Windows_ and _macOS_.
@@ -30,7 +30,7 @@ Its features include:
 - mirroring with [Android device screen off](doc/device.md#turn-screen-off)
 - [copy-paste](doc/control.md#copy-paste) in both directions
 - [configurable quality](doc/video.md)
-- Android device [as a webcam (V4L2)](doc/v4l2.md) (Linux-only)
+- Android device screen [as a webcam (V4L2)](doc/v4l2.md) (Linux-only)
 - [physical keyboard/mouse simulation (HID)](doc/hid-otg.md)
 - [OTG mode](doc/hid-otg.md#otg)
 - and more…
@@ -39,7 +39,7 @@ Its features include:
 The Android device requires at least API 21 (Android 5.0).
-[Audio forwarding](doc/audio.md) is supported from API 30 (Android 11).
+[Audio forwarding](doc/audio.md) is supported for API >= 30 (Android 11+).
 Make sure you [enabled USB debugging][enable-adb] on your device(s).
@@ -47,10 +47,14 @@ Make sure you [enabled USB debugging][enable-adb] on your device(s).
 On some devices, you also need to enable [an additional option][control] `USB
 debugging (Security Settings)` (this is an item different from `USB debugging`)
-to control it using a keyboard and mouse.
+to control it using a keyboard and mouse. Rebooting the device is necessary once
+this option is set.
 [control]: https://github.com/Genymobile/scrcpy/issues/70#issuecomment-373286323
+Note that USB debugging is not required to run scrcpy in [OTG
+mode](doc/hid-otg.md#otg).
 ## Get the app
@@ -64,10 +68,11 @@ to control it using a keyboard and mouse.
 The application provides a lot of features and configuration options. They are
 documented in the following pages:
-- [Device](doc/device.md)
+- [Connection](doc/connection.md)
 - [Video](doc/video.md)
 - [Audio](doc/audio.md)
 - [Control](doc/control.md)
+- [Device](doc/device.md)
 - [Window](doc/window.md)
 - [Recording](doc/recording.md)
 - [Tunnels](doc/tunnels.md)
@@ -113,7 +118,10 @@ For general questions or discussions, you can also use:
 I'm [@rom1v](https://github.com/rom1v), the author and maintainer of _scrcpy_.
 If you appreciate this application, you can [support my open source
-work][donate].
+work][donate]:
+- [GitHub Sponsors](https://github.com/sponsors/rom1v)
+- [Liberapay](https://liberapay.com/rom1v/)
+- [PayPal](https://paypal.me/rom2v)
 [donate]: https://blog.rom1v.com/about/#support-my-open-source-work

View File

@ -7,6 +7,7 @@ _scrcpy() {
--audio-codec= --audio-codec=
--audio-codec-options= --audio-codec-options=
--audio-encoder= --audio-encoder=
--audio-source=
--audio-output-buffer= --audio-output-buffer=
-b --video-bit-rate= -b --video-bit-rate=
--crop= --crop=
@ -15,52 +16,59 @@ _scrcpy() {
--display= --display=
--display-buffer= --display-buffer=
-e --select-tcpip -e --select-tcpip
-f --fullscreen
--force-adb-forward --force-adb-forward
--forward-all-clicks --forward-all-clicks
-f --fullscreen
-K --hid-keyboard
-h --help -h --help
--kill-adb-on-close
-K --hid-keyboard
--legacy-paste --legacy-paste
--list-displays --list-displays
--list-encoders --list-encoders
--lock-video-orientation --lock-video-orientation
--lock-video-orientation= --lock-video-orientation=
--max-fps=
-M --hid-mouse
-m --max-size= -m --max-size=
-M --hid-mouse
--max-fps=
-n --no-control
-N --no-playback
--no-audio --no-audio
--no-audio-playback
--no-cleanup --no-cleanup
--no-clipboard-autosync --no-clipboard-autosync
--no-downsize-on-error --no-downsize-on-error
-n --no-control
-N --no-display
--no-key-repeat --no-key-repeat
--no-mipmaps --no-mipmaps
--no-power-on --no-power-on
--no-video
--no-video-playback
--otg --otg
-p --port= -p --port=
--pause-on-exit
--pause-on-exit=
--power-off-on-close --power-off-on-close
--prefer-text --prefer-text
--print-fps --print-fps
--push-target= --push-target=
--raw-key-events
-r --record= -r --record=
--raw-key-events
--record-format= --record-format=
--render-driver= --render-driver=
--require-audio --require-audio
--rotation= --rotation=
-s --serial= -s --serial=
--shortcut-mod=
-S --turn-screen-off -S --turn-screen-off
--shortcut-mod=
-t --show-touches -t --show-touches
--tcpip --tcpip
--tcpip= --tcpip=
--time-limit=
--tunnel-host= --tunnel-host=
--tunnel-port= --tunnel-port=
--v4l2-buffer= --v4l2-buffer=
--v4l2-sink= --v4l2-sink=
-V --verbosity=
-v --version -v --version
-V --verbosity=
--video-codec= --video-codec=
--video-codec-options= --video-codec-options=
--video-encoder= --video-encoder=
@ -83,10 +91,18 @@ _scrcpy() {
COMPREPLY=($(compgen -W 'opus aac raw' -- "$cur")) COMPREPLY=($(compgen -W 'opus aac raw' -- "$cur"))
return return
;; ;;
--audio-source)
COMPREPLY=($(compgen -W 'output mic' -- "$cur"))
return
;;
--lock-video-orientation) --lock-video-orientation)
COMPREPLY=($(compgen -W 'unlocked initial 0 1 2 3' -- "$cur")) COMPREPLY=($(compgen -W 'unlocked initial 0 1 2 3' -- "$cur"))
return return
;; ;;
--pause-on-exit)
COMPREPLY=($(compgen -W 'true false if-error' -- "$cur"))
return
;;
-r|--record) -r|--record)
COMPREPLY=($(compgen -f -- "$cur")) COMPREPLY=($(compgen -f -- "$cur"))
return return

View File

@@ -1,4 +1,2 @@
 @echo off
-scrcpy.exe %*
-:: if the exit code is >= 1, then pause
-if errorlevel 1 pause
+scrcpy.exe --pause-on-exit=if-error %*

View File

@@ -5,7 +5,7 @@ Comment=Display and control your Android device
 # For some users, the PATH or ADB environment variables are set from the shell
 # startup file, like .bashrc or .zshrc… Run an interactive shell to get
 # environment correctly initialized.
-Exec=/bin/bash --norc --noprofile -i -c "\"\\$SHELL\" -i -c scrcpy || read -p 'Press Enter to quit...'"
+Exec=/bin/sh -c "\"\\$SHELL\" -i -c scrcpy --pause-on-exit=if-error"
 Icon=scrcpy
 Terminal=true
 Type=Application

View File

@ -14,6 +14,7 @@ arguments=(
'--audio-codec=[Select the audio codec]:codec:(opus aac raw)' '--audio-codec=[Select the audio codec]:codec:(opus aac raw)'
'--audio-codec-options=[Set a list of comma-separated key\:type=value options for the device audio encoder]' '--audio-codec-options=[Set a list of comma-separated key\:type=value options for the device audio encoder]'
'--audio-encoder=[Use a specific MediaCodec audio encoder]' '--audio-encoder=[Use a specific MediaCodec audio encoder]'
'--audio-source=[Select the audio source]:source:(output mic)'
'--audio-output-buffer=[Configure the size of the SDL audio output buffer (in milliseconds)]' '--audio-output-buffer=[Configure the size of the SDL audio output buffer (in milliseconds)]'
{-b,--video-bit-rate=}'[Encode the video at the given bit-rate]' {-b,--video-bit-rate=}'[Encode the video at the given bit-rate]'
'--crop=[\[width\:height\:x\:y\] Crop the device screen on the server]' '--crop=[\[width\:height\:x\:y\] Crop the device screen on the server]'
@ -22,50 +23,56 @@ arguments=(
'--display=[Specify the display id to mirror]' '--display=[Specify the display id to mirror]'
'--display-buffer=[Add a buffering delay \(in milliseconds\) before displaying]' '--display-buffer=[Add a buffering delay \(in milliseconds\) before displaying]'
{-e,--select-tcpip}'[Use TCP/IP device]' {-e,--select-tcpip}'[Use TCP/IP device]'
{-f,--fullscreen}'[Start in fullscreen]'
'--force-adb-forward[Do not attempt to use \"adb reverse\" to connect to the device]' '--force-adb-forward[Do not attempt to use \"adb reverse\" to connect to the device]'
'--forward-all-clicks[Forward clicks to device]' '--forward-all-clicks[Forward clicks to device]'
{-f,--fullscreen}'[Start in fullscreen]'
{-K,--hid-keyboard}'[Simulate a physical keyboard by using HID over AOAv2]'
{-h,--help}'[Print the help]' {-h,--help}'[Print the help]'
'--kill-adb-on-close[Kill adb when scrcpy terminates]'
{-K,--hid-keyboard}'[Simulate a physical keyboard by using HID over AOAv2]'
'--legacy-paste[Inject computer clipboard text as a sequence of key events on Ctrl+v]' '--legacy-paste[Inject computer clipboard text as a sequence of key events on Ctrl+v]'
'--list-displays[List displays available on the device]' '--list-displays[List displays available on the device]'
'--list-encoders[List video and audio encoders available on the device]' '--list-encoders[List video and audio encoders available on the device]'
'--lock-video-orientation=[Lock video orientation]:orientation:(unlocked initial 0 1 2 3)' '--lock-video-orientation=[Lock video orientation]:orientation:(unlocked initial 0 1 2 3)'
'--max-fps=[Limit the frame rate of screen capture]'
{-M,--hid-mouse}'[Simulate a physical mouse by using HID over AOAv2]'
{-m,--max-size=}'[Limit both the width and height of the video to value]' {-m,--max-size=}'[Limit both the width and height of the video to value]'
{-M,--hid-mouse}'[Simulate a physical mouse by using HID over AOAv2]'
'--max-fps=[Limit the frame rate of screen capture]'
{-n,--no-control}'[Disable device control \(mirror the device in read only\)]'
{-N,--no-playback}'[Disable video and audio playback]'
'--no-audio[Disable audio forwarding]' '--no-audio[Disable audio forwarding]'
'--no-audio-playback[Disable audio playback]'
'--no-cleanup[Disable device cleanup actions on exit]' '--no-cleanup[Disable device cleanup actions on exit]'
'--no-clipboard-autosync[Disable automatic clipboard synchronization]' '--no-clipboard-autosync[Disable automatic clipboard synchronization]'
'--no-downsize-on-error[Disable lowering definition on MediaCodec error]' '--no-downsize-on-error[Disable lowering definition on MediaCodec error]'
{-n,--no-control}'[Disable device control \(mirror the device in read only\)]'
{-N,--no-display}'[Do not display device \(during screen recording or when V4L2 sink is enabled\)]'
'--no-key-repeat[Do not forward repeated key events when a key is held down]' '--no-key-repeat[Do not forward repeated key events when a key is held down]'
'--no-mipmaps[Disable the generation of mipmaps]' '--no-mipmaps[Disable the generation of mipmaps]'
'--no-power-on[Do not power on the device on start]' '--no-power-on[Do not power on the device on start]'
'--no-video[Disable video forwarding]'
'--no-video-playback[Disable video playback]'
'--otg[Run in OTG mode \(simulating physical keyboard and mouse\)]' '--otg[Run in OTG mode \(simulating physical keyboard and mouse\)]'
{-p,--port=}'[\[port\[\:port\]\] Set the TCP port \(range\) used by the client to listen]' {-p,--port=}'[\[port\[\:port\]\] Set the TCP port \(range\) used by the client to listen]'
'--pause-on-exit=[Make scrcpy pause before exiting]:mode:(true false if-error)'
'--power-off-on-close[Turn the device screen off when closing scrcpy]' '--power-off-on-close[Turn the device screen off when closing scrcpy]'
'--prefer-text[Inject alpha characters and space as text events instead of key events]' '--prefer-text[Inject alpha characters and space as text events instead of key events]'
'--print-fps[Start FPS counter, to print frame logs to the console]' '--print-fps[Start FPS counter, to print frame logs to the console]'
'--push-target=[Set the target directory for pushing files to the device by drag and drop]' '--push-target=[Set the target directory for pushing files to the device by drag and drop]'
'--raw-key-events[Inject key events for all input keys, and ignore text events]'
{-r,--record=}'[Record screen to file]:record file:_files' {-r,--record=}'[Record screen to file]:record file:_files'
'--raw-key-events[Inject key events for all input keys, and ignore text events]'
'--record-format=[Force recording format]:format:(mp4 mkv)' '--record-format=[Force recording format]:format:(mp4 mkv)'
'--render-driver=[Request SDL to use the given render driver]:driver name:(direct3d opengl opengles2 opengles metal software)' '--render-driver=[Request SDL to use the given render driver]:driver name:(direct3d opengl opengles2 opengles metal software)'
'--require-audio=[Make scrcpy fail if audio is enabled but does not work]' '--require-audio=[Make scrcpy fail if audio is enabled but does not work]'
'--rotation=[Set the initial display rotation]:rotation values:(0 1 2 3)' '--rotation=[Set the initial display rotation]:rotation values:(0 1 2 3)'
{-s,--serial=}'[The device serial number \(mandatory for multiple devices only\)]:serial:($("${ADB-adb}" devices | awk '\''$2 == "device" {print $1}'\''))' {-s,--serial=}'[The device serial number \(mandatory for multiple devices only\)]:serial:($("${ADB-adb}" devices | awk '\''$2 == "device" {print $1}'\''))'
'--shortcut-mod=[\[key1,key2+key3,...\] Specify the modifiers to use for scrcpy shortcuts]:shortcut mod:(lctrl rctrl lalt ralt lsuper rsuper)'
{-S,--turn-screen-off}'[Turn the device screen off immediately]' {-S,--turn-screen-off}'[Turn the device screen off immediately]'
'--shortcut-mod=[\[key1,key2+key3,...\] Specify the modifiers to use for scrcpy shortcuts]:shortcut mod:(lctrl rctrl lalt ralt lsuper rsuper)'
{-t,--show-touches}'[Show physical touches]' {-t,--show-touches}'[Show physical touches]'
'--tcpip[\(optional \[ip\:port\]\) Configure and connect the device over TCP/IP]' '--tcpip[\(optional \[ip\:port\]\) Configure and connect the device over TCP/IP]'
'--time-limit=[Set the maximum mirroring time, in seconds]'
'--tunnel-host=[Set the IP address of the adb tunnel to reach the scrcpy server]' '--tunnel-host=[Set the IP address of the adb tunnel to reach the scrcpy server]'
'--tunnel-port=[Set the TCP port of the adb tunnel to reach the scrcpy server]' '--tunnel-port=[Set the TCP port of the adb tunnel to reach the scrcpy server]'
'--v4l2-buffer=[Add a buffering delay \(in milliseconds\) before pushing frames]' '--v4l2-buffer=[Add a buffering delay \(in milliseconds\) before pushing frames]'
'--v4l2-sink=[\[\/dev\/videoN\] Output to v4l2loopback device]' '--v4l2-sink=[\[\/dev\/videoN\] Output to v4l2loopback device]'
{-V,--verbosity=}'[Set the log level]:verbosity:(verbose debug info warn error)'
{-v,--version}'[Print the version of scrcpy]' {-v,--version}'[Print the version of scrcpy]'
{-V,--verbosity=}'[Set the log level]:verbosity:(verbose debug info warn error)'
'--video-codec=[Select the video codec]:codec:(h264 h265 av1)' '--video-codec=[Select the video codec]:codec:(h264 h265 av1)'
'--video-codec-options=[Set a list of comma-separated key\:type=value options for the device video encoder]' '--video-codec-options=[Set a list of comma-separated key\:type=value options for the device video encoder]'
'--video-encoder=[Use a specific MediaCodec video encoder]' '--video-encoder=[Use a specific MediaCodec video encoder]'

View File

@@ -14,6 +14,7 @@ src = [
 'src/delay_buffer.c',
 'src/demuxer.c',
 'src/device_msg.c',
+'src/display.c',
 'src/icon.c',
 'src/file_pusher.c',
 'src/fps_counter.c',
@@ -50,6 +51,7 @@ src = [
 'src/util/term.c',
 'src/util/thread.c',
 'src/util/tick.c',
+'src/util/timeout.c',
 ]
 conf = configuration_data()

View File

@@ -6,10 +6,10 @@ cd "$DIR"
 mkdir -p "$PREBUILT_DATA_DIR"
 cd "$PREBUILT_DATA_DIR"
-DEP_DIR=platform-tools-34.0.1
-FILENAME=platform-tools_r34.0.1-windows.zip
-SHA256SUM=5dd9c2be744c224fa3a7cbe30ba02d2cb378c763bd0f797a7e47e9f3156a5daa
+DEP_DIR=platform-tools-34.0.3
+FILENAME=platform-tools_r34.0.3-windows.zip
+SHA256SUM=fce992e93eb786fc9f47df93d83a7b912c46742d45c39d712c02e06d05b72e2b
 if [[ -d "$DEP_DIR" ]]
 then

View File

@@ -6,11 +6,11 @@ cd "$DIR"
 mkdir -p "$PREBUILT_DATA_DIR"
 cd "$PREBUILT_DATA_DIR"
-VERSION=6.0-scrcpy-2
+VERSION=6.0-scrcpy-4
 DEP_DIR="ffmpeg-$VERSION"
 FILENAME="$DEP_DIR".7z
-SHA256SUM=98ef97f8607c97a5c4f9c5a0a991b78f105d002a3619145011d16ffb92501b14
+SHA256SUM=39274b321491ce83e76cab5d24e7cbe3f402d3ccf382f739b13be5651c146b60
 if [[ -d "$DEP_DIR" ]]
 then

View File

@@ -6,10 +6,10 @@ cd "$DIR"
 mkdir -p "$PREBUILT_DATA_DIR"
 cd "$PREBUILT_DATA_DIR"
-DEP_DIR=SDL2-2.26.4
-FILENAME=SDL2-devel-2.26.4-mingw.tar.gz
-SHA256SUM=fe899c8642caac2f180b1ee6f786857ddcaa0adc1fa82474312b09dd47d74712
+DEP_DIR=SDL2-2.28.0
+FILENAME=SDL2-devel-2.28.0-mingw.tar.gz
+SHA256SUM=b91ce59eeacd4a9db403f976fd2337d9360b21ada374124417d716065c380e20
 if [[ -d "$DEP_DIR" ]]
 then

View File

@@ -13,7 +13,7 @@ BEGIN
 VALUE "LegalCopyright", "Romain Vimont, Genymobile"
 VALUE "OriginalFilename", "scrcpy.exe"
 VALUE "ProductName", "scrcpy"
-VALUE "ProductVersion", "2.0"
+VALUE "ProductVersion", "2.1.1"
 END
 END
 BLOCK "VarFileInfo"

View File

@ -21,7 +21,7 @@ Make scrcpy window always on top (above other windows).
.TP .TP
.BI "\-\-audio\-bit\-rate " value .BI "\-\-audio\-bit\-rate " value
Encode the audio at the given bit\-rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000). Encode the audio at the given bit rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Default is 128K (128000). Default is 128K (128000).
@ -33,14 +33,6 @@ Lower values decrease the latency, but increase the likelyhood of buffer underru
Default is 50. Default is 50.
.TP
.BI "\-\-audio\-output\-buffer ms
Configure the size of the SDL audio output buffer (in milliseconds).
If you get "robotic" audio playback, you should test with a higher value (10). Do not change this setting otherwise.
Default is 5.
.TP .TP
.BI "\-\-audio\-codec " name .BI "\-\-audio\-codec " name
Select an audio codec (opus, aac or raw). Select an audio codec (opus, aac or raw).
@ -63,9 +55,23 @@ Use a specific MediaCodec audio encoder (depending on the codec provided by \fB\
The available encoders can be listed by \-\-list\-encoders. The available encoders can be listed by \-\-list\-encoders.
.TP
.BI "\-\-audio\-source " source
Select the audio source (output or mic).
Default is output.
.TP
.BI "\-\-audio\-output\-buffer ms
Configure the size of the SDL audio output buffer (in milliseconds).
If you get "robotic" audio playback, you should test with a higher value (10). Do not change this setting otherwise.
Default is 5.
.TP .TP
.BI "\-b, \-\-video\-bit\-rate " value .BI "\-b, \-\-video\-bit\-rate " value
Encode the video at the given bit\-rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000). Encode the video at the given bit rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Default is 8M (8000000). Default is 8M (8000000).
@ -107,6 +113,10 @@ Use TCP/IP device (if there is exactly one, like adb -e).
Also see \fB\-d\fR (\fB\-\-select\-usb\fR). Also see \fB\-d\fR (\fB\-\-select\-usb\fR).
.TP
.B \-f, \-\-fullscreen
Start in fullscreen.
.TP .TP
.B \-\-force\-adb\-forward .B \-\-force\-adb\-forward
Do not attempt to use "adb reverse" to connect to the device. Do not attempt to use "adb reverse" to connect to the device.
@ -115,14 +125,14 @@ Do not attempt to use "adb reverse" to connect to the device.
.B \-\-forward\-all\-clicks .B \-\-forward\-all\-clicks
By default, right-click triggers BACK (or POWER on) and middle-click triggers HOME. This option disables these shortcuts and forward the clicks to the device instead. By default, right-click triggers BACK (or POWER on) and middle-click triggers HOME. This option disables these shortcuts and forward the clicks to the device instead.
.TP
.B \-f, \-\-fullscreen
Start in fullscreen.
.TP .TP
.B \-h, \-\-help .B \-h, \-\-help
Print this help. Print this help.
.TP
.B \-\-kill\-adb\-on\-close
Kill adb when scrcpy terminates.
.TP .TP
.B \-K, \-\-hid\-keyboard .B \-K, \-\-hid\-keyboard
Simulate a physical keyboard by using HID over AOAv2. Simulate a physical keyboard by using HID over AOAv2.
@ -161,10 +171,6 @@ Default is "unlocked".
Passing the option without argument is equivalent to passing "initial". Passing the option without argument is equivalent to passing "initial".
.TP
.BI "\-\-max\-fps " value
Limit the framerate of screen capture (officially supported since Android 10, but may work on earlier versions).
.TP .TP
.BI "\-m, \-\-max\-size " value .BI "\-m, \-\-max\-size " value
Limit both the width and height of the video to \fIvalue\fR. The other dimension is computed so that the device aspect\-ratio is preserved. Limit both the width and height of the video to \fIvalue\fR. The other dimension is computed so that the device aspect\-ratio is preserved.
@ -183,6 +189,26 @@ It may only work over USB.
Also see \fB\-\-hid\-keyboard\fR. Also see \fB\-\-hid\-keyboard\fR.
.TP
.BI "\-\-max\-fps " value
Limit the framerate of screen capture (officially supported since Android 10, but may work on earlier versions).
.TP
.B \-n, \-\-no\-control
Disable device control (mirror the device in read\-only).
.TP
.B \-N, \-\-no\-playback
Disable video and audio playback on the computer (equivalent to --no-video-playback --no-audio-playback).
.TP
.B \-\-no\-audio
Disable audio forwarding.
.TP
.B \-\-no\-audio\-playback
Disable audio playback on the computer.
.TP .TP
.B \-\-no\-cleanup .B \-\-no\-cleanup
By default, scrcpy removes the server binary from the device and restores the device state (show touches, stay awake and power mode) on exit. By default, scrcpy removes the server binary from the device and restores the device state (show touches, stay awake and power mode) on exit.
@ -201,14 +227,6 @@ By default, on MediaCodec error, scrcpy automatically tries again with a lower d
This option disables this behavior. This option disables this behavior.
.TP
.B \-n, \-\-no\-control
Disable device control (mirror the device in read\-only).
.TP
.B \-N, \-\-no\-display
Do not display device (only when screen recording is enabled).
.TP .TP
.B \-\-no\-key\-repeat .B \-\-no\-key\-repeat
Do not forward repeated key events when a key is held down. Do not forward repeated key events when a key is held down.
@ -221,6 +239,14 @@ If the renderer is OpenGL 3.0+ or OpenGL ES 2.0+, then mipmaps are automatically
.B \-\-no\-power\-on .B \-\-no\-power\-on
Do not power on the device on start. Do not power on the device on start.
.TP
.B \-\-no\-video
Disable video forwarding.
.TP
.B \-\-no\-video\-playback
Disable video playback on the computer.
.TP .TP
.B \-\-otg .B \-\-otg
Run in OTG mode: simulate physical keyboard and mouse, as if the computer keyboard and mouse were plugged directly to the device via an OTG cable. Run in OTG mode: simulate physical keyboard and mouse, as if the computer keyboard and mouse were plugged directly to the device via an OTG cable.
@ -241,6 +267,16 @@ Set the TCP port (range) used by the client to listen.
Default is 27183:27199. Default is 27183:27199.
.TP
\fB\-\-pause\-on\-exit\fR[=\fImode\fR]
Configure pause on exit. Possible values are "true" (always pause on exit), "false" (never pause on exit) and "if-error" (pause only if an error occured).
This is useful to prevent the terminal window from automatically closing, so that error messages can be read.
Default is "false".
Passing the option without argument is equivalent to passing "true".
.TP .TP
.B \-\-power\-off\-on\-close .B \-\-power\-off\-on\-close
Turn the device screen off when closing scrcpy. Turn the device screen off when closing scrcpy.
@ -262,10 +298,6 @@ Set the target directory for pushing files to the device by drag & drop. It is p
Default is "/sdcard/Download/". Default is "/sdcard/Download/".
.TP
.B \-\-raw\-key\-events
Inject key events for all input keys, and ignore text events.
.TP .TP
.BI "\-r, \-\-record " file .BI "\-r, \-\-record " file
Record screen to Record screen to
@ -275,6 +307,10 @@ The format is determined by the
.B \-\-record\-format .B \-\-record\-format
option if set, or by the file extension (.mp4 or .mkv). option if set, or by the file extension (.mp4 or .mkv).
.TP
.B \-\-raw\-key\-events
Inject key events for all input keys, and ignore text events.
.TP .TP
.BI "\-\-record\-format " format .BI "\-\-record\-format " format
Force recording format (either mp4 or mkv). Force recording format (either mp4 or mkv).
@ -300,6 +336,10 @@ Set the initial display rotation. Possibles values are 0, 1, 2 and 3. Each incre
.BI "\-s, \-\-serial " number .BI "\-s, \-\-serial " number
The device serial number. Mandatory only if several devices are connected to adb. The device serial number. Mandatory only if several devices are connected to adb.
.TP
.B \-S, \-\-turn\-screen\-off
Turn the device screen off immediately.
.TP .TP
.BI "\-\-shortcut\-mod " key\fR[+...]][,...] .BI "\-\-shortcut\-mod " key\fR[+...]][,...]
Specify the modifiers to use for scrcpy shortcuts. Possible keys are "lctrl", "rctrl", "lalt", "ralt", "lsuper" and "rsuper". Specify the modifiers to use for scrcpy shortcuts. Possible keys are "lctrl", "rctrl", "lalt", "ralt", "lsuper" and "rsuper".
@ -310,6 +350,12 @@ For example, to use either LCtrl+LAlt or LSuper for scrcpy shortcuts, pass "lctr
Default is "lalt,lsuper" (left-Alt or left-Super). Default is "lalt,lsuper" (left-Alt or left-Super).
.TP
.B \-t, \-\-show\-touches
Enable "show touches" on start, restore the initial value on exit.
It only shows physical touches (not clicks from scrcpy).
.TP .TP
.BI "\-\-tcpip\fR[=\fIip\fR[:\fIport\fR]] .BI "\-\-tcpip\fR[=\fIip\fR[:\fIport\fR]]
Configure and reconnect the device over TCP/IP. Configure and reconnect the device over TCP/IP.
@ -319,14 +365,8 @@ If a destination address is provided, then scrcpy connects to this address befor
If no destination address is provided, then scrcpy attempts to find the IP address and adb port of the current device (typically connected over USB), enables TCP/IP mode if necessary, then connects to this address before starting. If no destination address is provided, then scrcpy attempts to find the IP address and adb port of the current device (typically connected over USB), enables TCP/IP mode if necessary, then connects to this address before starting.
.TP .TP
.B \-S, \-\-turn\-screen\-off .BI "\-\-time\-limit " seconds
Turn the device screen off immediately. Set the maximum mirroring time, in seconds.
.TP
.B \-t, \-\-show\-touches
Enable "show touches" on start, restore the initial value on exit.
It only shows physical touches (not clicks from scrcpy).
.TP .TP
.BI "\-\-tunnel\-host " ip .BI "\-\-tunnel\-host " ip
@ -340,6 +380,16 @@ Set the TCP port of the adb tunnel to reach the scrcpy server. This option autom
Default is 0 (not forced): the local port used for establishing the tunnel will be used. Default is 0 (not forced): the local port used for establishing the tunnel will be used.
.TP
.B \-v, \-\-version
Print the version of scrcpy.
.TP
.BI "\-V, \-\-verbosity " value
Set the log level ("verbose", "debug", "info", "warn" or "error").
Default is "info" for release builds, "debug" for debug builds.
.TP .TP
.BI "\-\-v4l2-sink " /dev/videoN .BI "\-\-v4l2-sink " /dev/videoN
Output to v4l2loopback device. Output to v4l2loopback device.
@ -354,16 +404,6 @@ This option is similar to \fB\-\-display\-buffer\fR, but specific to V4L2 sink.
Default is 0 (no buffering). Default is 0 (no buffering).
.TP
.BI "\-V, \-\-verbosity " value
Set the log level ("verbose", "debug", "info", "warn" or "error").
Default is "info" for release builds, "debug" for debug builds.
.TP
.B \-v, \-\-version
Print the version of scrcpy.
.TP .TP
.BI "\-\-video\-codec " name .BI "\-\-video\-codec " name
Select a video codec (h264, h265 or av1). Select a video codec (h264, h265 or av1).

View File

@@ -628,8 +628,8 @@ sc_adb_select_device(struct sc_intr *intr,
 return false;
 }
-LOGD("ADB device found:");
-sc_adb_devices_log(SC_LOG_LEVEL_DEBUG, vec.data, vec.size);
+LOGI("ADB device found:");
+sc_adb_devices_log(SC_LOG_LEVEL_INFO, vec.data, vec.size);
 // Move devics into out_device (do not destroy device)
 sc_adb_device_move(out_device, device);

View File

@ -107,7 +107,7 @@ sc_audio_player_sdl_callback(void *userdata, uint8_t *stream, int len_int) {
// latency.
LOGD("[Audio] Buffer underflow, inserting silence: %" PRIu32 " samples",
silence);
memset(stream + read, 0, TO_BYTES(silence));
memset(stream + TO_BYTES(read), 0, TO_BYTES(silence));
if (ap->received) {
// Inserting additional samples immediately increases buffering
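The memset change above is a units fix: "read" is a sample count, but pointer arithmetic on the byte stream needs a byte offset, so the silence used to be written at the wrong position. Below is a minimal standalone sketch of the distinction; it assumes a TO_BYTES() that multiplies by 4 bytes per sample frame (16-bit stereo), whereas the real macro in the SDL audio player derives the factor from the configured audio format.

// Illustration only: sample counts vs byte offsets in an interleaved buffer.
// Assumes 4 bytes per sample frame (16-bit stereo), unlike the real macro
// which uses the actual audio format.
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define BYTES_PER_FRAME 4u
#define TO_BYTES(samples) ((samples) * BYTES_PER_FRAME)

int main(void) {
    uint8_t stream[TO_BYTES(8)];
    memset(stream, 0xAA, sizeof(stream)); // pretend 8 frames of valid audio
    uint32_t read = 5;                    // frames actually read
    uint32_t silence = 3;                 // frames to pad with silence
    // Buggy form: "stream + read" skips 5 *bytes*, clobbering valid audio.
    // Fixed form: skip 5 *frames* (20 bytes) before zeroing the tail.
    memset(stream + TO_BYTES(read), 0, TO_BYTES(silence));
    printf("silence starts at byte offset %u\n", TO_BYTES(read)); // 20
    return 0;
}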


@ -72,6 +72,14 @@ enum {
OPT_REQUIRE_AUDIO,
OPT_AUDIO_BUFFER,
OPT_AUDIO_OUTPUT_BUFFER,
OPT_NO_DISPLAY,
OPT_NO_VIDEO,
OPT_NO_AUDIO_PLAYBACK,
OPT_NO_VIDEO_PLAYBACK,
OPT_AUDIO_SOURCE,
OPT_KILL_ADB_ON_CLOSE,
OPT_TIME_LIMIT,
OPT_PAUSE_ON_EXIT,
};
struct sc_option {
@ -117,7 +125,7 @@ static const struct sc_option options[] = {
.longopt_id = OPT_AUDIO_BIT_RATE,
.longopt = "audio-bit-rate",
.argdesc = "value",
.text = "Encode the audio at the given bit-rate, expressed in bits/s. "
.text = "Encode the audio at the given bit rate, expressed in bits/s. "
"Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n"
"Default is 128K (128000).",
},
@ -130,16 +138,6 @@ static const struct sc_option options[] = {
"likelihood of buffer underrun (causing audio glitches).\n"
"Default is 50.",
},
{
.longopt_id = OPT_AUDIO_OUTPUT_BUFFER,
.longopt = "audio-output-buffer",
.argdesc = "ms",
.text = "Configure the size of the SDL audio output buffer (in "
"milliseconds).\n"
"If you get \"robotic\" audio playback, you should test with "
"a higher value (10). Do not change this setting otherwise.\n"
"Default is 5.",
},
{ {
.longopt_id = OPT_AUDIO_CODEC, .longopt_id = OPT_AUDIO_CODEC,
.longopt = "audio-codec", .longopt = "audio-codec",
@ -167,11 +165,28 @@ static const struct sc_option options[] = {
"codec provided by --audio-codec).\n" "codec provided by --audio-codec).\n"
"The available encoders can be listed by --list-encoders.", "The available encoders can be listed by --list-encoders.",
}, },
{
.longopt_id = OPT_AUDIO_SOURCE,
.longopt = "audio-source",
.argdesc = "source",
.text = "Select the audio source (output or mic).\n"
"Default is output.",
},
{
.longopt_id = OPT_AUDIO_OUTPUT_BUFFER,
.longopt = "audio-output-buffer",
.argdesc = "ms",
.text = "Configure the size of the SDL audio output buffer (in "
"milliseconds).\n"
"If you get \"robotic\" audio playback, you should test with "
"a higher value (10). Do not change this setting otherwise.\n"
"Default is 5.",
},
{
.shortopt = 'b',
.longopt = "video-bit-rate",
.argdesc = "value",
.text = "Encode the video at the given bit-rate, expressed in bits/s. "
.text = "Encode the video at the given bit rate, expressed in bits/s. "
"Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n"
"Default is 8M (8000000).",
},
@ -245,6 +260,11 @@ static const struct sc_option options[] = {
.longopt = "encoder", .longopt = "encoder",
.argdesc = "name", .argdesc = "name",
}, },
{
.shortopt = 'f',
.longopt = "fullscreen",
.text = "Start in fullscreen.",
},
{ {
.longopt_id = OPT_FORCE_ADB_FORWARD, .longopt_id = OPT_FORCE_ADB_FORWARD,
.longopt = "force-adb-forward", .longopt = "force-adb-forward",
@ -259,9 +279,14 @@ static const struct sc_option options[] = {
"shortcuts and forwards the clicks to the device instead.", "shortcuts and forwards the clicks to the device instead.",
}, },
{ {
.shortopt = 'f', .shortopt = 'h',
.longopt = "fullscreen", .longopt = "help",
.text = "Start in fullscreen.", .text = "Print this help.",
},
{
.longopt_id = OPT_KILL_ADB_ON_CLOSE,
.longopt = "kill-adb-on-close",
.text = "Kill adb when scrcpy terminates.",
}, },
{ {
.shortopt = 'K', .shortopt = 'K',
@ -280,11 +305,6 @@ static const struct sc_option options[] = {
"is enabled (or a physical keyboard is connected).\n" "is enabled (or a physical keyboard is connected).\n"
"Also see --hid-mouse.", "Also see --hid-mouse.",
}, },
{
.shortopt = 'h',
.longopt = "help",
.text = "Print this help.",
},
{ {
.longopt_id = OPT_LEGACY_PASTE, .longopt_id = OPT_LEGACY_PASTE,
.longopt = "legacy-paste", .longopt = "legacy-paste",
@ -318,11 +338,13 @@ static const struct sc_option options[] = {
"\"initial\".", "\"initial\".",
}, },
{ {
.longopt_id = OPT_MAX_FPS, .shortopt = 'm',
.longopt = "max-fps", .longopt = "max-size",
.argdesc = "value", .argdesc = "value",
.text = "Limit the frame rate of screen capture (officially supported " .text = "Limit both the width and height of the video to value. The "
"since Android 10, but may work on earlier versions).", "other dimension is computed so that the device aspect-ratio "
"is preserved.\n"
"Default is 0 (unlimited).",
}, },
{ {
.shortopt = 'M', .shortopt = 'M',
@ -336,19 +358,33 @@ static const struct sc_option options[] = {
"Also see --hid-keyboard.", "Also see --hid-keyboard.",
}, },
{ {
.shortopt = 'm', .longopt_id = OPT_MAX_FPS,
.longopt = "max-size", .longopt = "max-fps",
.argdesc = "value", .argdesc = "value",
.text = "Limit both the width and height of the video to value. The " .text = "Limit the frame rate of screen capture (officially supported "
"other dimension is computed so that the device aspect-ratio " "since Android 10, but may work on earlier versions).",
"is preserved.\n" },
"Default is 0 (unlimited).", {
.shortopt = 'n',
.longopt = "no-control",
.text = "Disable device control (mirror the device in read-only).",
},
{
.shortopt = 'N',
.longopt = "no-playback",
.text = "Disable video and audio playback on the computer (equivalent "
"to --no-video-playback --no-audio-playback).",
}, },
{ {
.longopt_id = OPT_NO_AUDIO, .longopt_id = OPT_NO_AUDIO,
.longopt = "no-audio", .longopt = "no-audio",
.text = "Disable audio forwarding.", .text = "Disable audio forwarding.",
}, },
{
.longopt_id = OPT_NO_AUDIO_PLAYBACK,
.longopt = "no-audio-playback",
.text = "Disable audio playback on the computer.",
},
{ {
.longopt_id = OPT_NO_CLEANUP, .longopt_id = OPT_NO_CLEANUP,
.longopt = "no-cleanup", .longopt = "no-cleanup",
@ -374,15 +410,9 @@ static const struct sc_option options[] = {
"This option disables this behavior.", "This option disables this behavior.",
}, },
{ {
.shortopt = 'n', // deprecated
.longopt = "no-control", .longopt_id = OPT_NO_DISPLAY,
.text = "Disable device control (mirror the device in read-only).",
},
{
.shortopt = 'N',
.longopt = "no-display", .longopt = "no-display",
.text = "Do not display device (only when screen recording or V4L2 "
"sink is enabled).",
}, },
{ {
.longopt_id = OPT_NO_KEY_REPEAT, .longopt_id = OPT_NO_KEY_REPEAT,
@ -401,6 +431,16 @@ static const struct sc_option options[] = {
.longopt = "no-power-on", .longopt = "no-power-on",
.text = "Do not power on the device on start.", .text = "Do not power on the device on start.",
}, },
{
.longopt_id = OPT_NO_VIDEO,
.longopt = "no-video",
.text = "Disable video forwarding.",
},
{
.longopt_id = OPT_NO_VIDEO_PLAYBACK,
.longopt = "no-video-playback",
.text = "Disable video playback on the computer.",
},
{ {
.longopt_id = OPT_OTG, .longopt_id = OPT_OTG,
.longopt = "otg", .longopt = "otg",
@ -424,6 +464,20 @@ static const struct sc_option options[] = {
"Default is " STR(DEFAULT_LOCAL_PORT_RANGE_FIRST) ":" "Default is " STR(DEFAULT_LOCAL_PORT_RANGE_FIRST) ":"
STR(DEFAULT_LOCAL_PORT_RANGE_LAST) ".", STR(DEFAULT_LOCAL_PORT_RANGE_LAST) ".",
}, },
{
.longopt_id = OPT_PAUSE_ON_EXIT,
.longopt = "pause-on-exit",
.argdesc = "mode",
.optional_arg = true,
.text = "Configure pause on exit. Possible values are \"true\" (always "
"pause on exit), \"false\" (never pause on exit) and "
"\"if-error\" (pause only if an error occured).\n"
"This is useful to prevent the terminal window from "
"automatically closing, so that error messages can be read.\n"
"Default is \"false\".\n"
"Passing the option without argument is equivalent to passing "
"\"true\".",
},
{ {
.longopt_id = OPT_POWER_OFF_ON_CLOSE, .longopt_id = OPT_POWER_OFF_ON_CLOSE,
.longopt = "power-off-on-close", .longopt = "power-off-on-close",
@ -452,11 +506,6 @@ static const struct sc_option options[] = {
"drag & drop. It is passed as is to \"adb push\".\n" "drag & drop. It is passed as is to \"adb push\".\n"
"Default is \"/sdcard/Download/\".", "Default is \"/sdcard/Download/\".",
}, },
{
.longopt_id = OPT_RAW_KEY_EVENTS,
.longopt = "raw-key-events",
.text = "Inject key events for all input keys, and ignore text events."
},
{ {
.shortopt = 'r', .shortopt = 'r',
.longopt = "record", .longopt = "record",
@ -465,6 +514,11 @@ static const struct sc_option options[] = {
"The format is determined by the --record-format option if " "The format is determined by the --record-format option if "
"set, or by the file extension (.mp4 or .mkv).", "set, or by the file extension (.mp4 or .mkv).",
}, },
{
.longopt_id = OPT_RAW_KEY_EVENTS,
.longopt = "raw-key-events",
.text = "Inject key events for all input keys, and ignore text events."
},
{ {
.longopt_id = OPT_RECORD_FORMAT, .longopt_id = OPT_RECORD_FORMAT,
.longopt = "record-format", .longopt = "record-format",
@ -503,6 +557,11 @@ static const struct sc_option options[] = {
.text = "The device serial number. Mandatory only if several devices " .text = "The device serial number. Mandatory only if several devices "
"are connected to adb.", "are connected to adb.",
}, },
{
.shortopt = 'S',
.longopt = "turn-screen-off",
.text = "Turn the device screen off immediately.",
},
{ {
.longopt_id = OPT_SHORTCUT_MOD, .longopt_id = OPT_SHORTCUT_MOD,
.longopt = "shortcut-mod", .longopt = "shortcut-mod",
@ -516,11 +575,6 @@ static const struct sc_option options[] = {
"shortcuts, pass \"lctrl+lalt,lsuper\".\n" "shortcuts, pass \"lctrl+lalt,lsuper\".\n"
"Default is \"lalt,lsuper\" (left-Alt or left-Super).", "Default is \"lalt,lsuper\" (left-Alt or left-Super).",
}, },
{
.shortopt = 'S',
.longopt = "turn-screen-off",
.text = "Turn the device screen off immediately.",
},
{ {
.shortopt = 't', .shortopt = 't',
.longopt = "show-touches", .longopt = "show-touches",
@ -542,6 +596,12 @@ static const struct sc_option options[] = {
"connected over USB), enables TCP/IP mode, then connects to " "connected over USB), enables TCP/IP mode, then connects to "
"this address before starting.", "this address before starting.",
}, },
{
.longopt_id = OPT_TIME_LIMIT,
.longopt = "time-limit",
.argdesc = "seconds",
.text = "Set the maximum mirroring time, in seconds.",
},
{ {
.longopt_id = OPT_TUNNEL_HOST, .longopt_id = OPT_TUNNEL_HOST,
.longopt = "tunnel-host", .longopt = "tunnel-host",
@ -561,6 +621,22 @@ static const struct sc_option options[] = {
"Default is 0 (not forced): the local port used for " "Default is 0 (not forced): the local port used for "
"establishing the tunnel will be used.", "establishing the tunnel will be used.",
}, },
{
.shortopt = 'v',
.longopt = "version",
.text = "Print the version of scrcpy.",
},
{
.shortopt = 'V',
.longopt = "verbosity",
.argdesc = "value",
.text = "Set the log level (verbose, debug, info, warn or error).\n"
#ifndef NDEBUG
"Default is debug.",
#else
"Default is info.",
#endif
},
{ {
.longopt_id = OPT_V4L2_SINK, .longopt_id = OPT_V4L2_SINK,
.longopt = "v4l2-sink", .longopt = "v4l2-sink",
@ -581,22 +657,6 @@ static const struct sc_option options[] = {
"Default is 0 (no buffering).\n" "Default is 0 (no buffering).\n"
"This option is only available on Linux.", "This option is only available on Linux.",
}, },
{
.shortopt = 'V',
.longopt = "verbosity",
.argdesc = "value",
.text = "Set the log level (verbose, debug, info, warn or error).\n"
#ifndef NDEBUG
"Default is debug.",
#else
"Default is info.",
#endif
},
{
.shortopt = 'v',
.longopt = "version",
.text = "Print the version of scrcpy.",
},
{ {
.longopt_id = OPT_VIDEO_CODEC, .longopt_id = OPT_VIDEO_CODEC,
.longopt = "video-codec", .longopt = "video-codec",
@ -1467,18 +1527,39 @@ sc_parse_shortcut_mods(const char *s, struct sc_shortcut_mods *mods) {
} }
#endif #endif
static enum sc_record_format
get_record_format(const char *name) {
if (!strcmp(name, "mp4")) {
return SC_RECORD_FORMAT_MP4;
}
if (!strcmp(name, "mkv")) {
return SC_RECORD_FORMAT_MKV;
}
if (!strcmp(name, "m4a")) {
return SC_RECORD_FORMAT_M4A;
}
if (!strcmp(name, "mka")) {
return SC_RECORD_FORMAT_MKA;
}
if (!strcmp(name, "opus")) {
return SC_RECORD_FORMAT_OPUS;
}
if (!strcmp(name, "aac")) {
return SC_RECORD_FORMAT_AAC;
}
return 0;
}
static bool
parse_record_format(const char *optarg, enum sc_record_format *format) {
if (!strcmp(optarg, "mp4")) {
*format = SC_RECORD_FORMAT_MP4;
return true;
}
if (!strcmp(optarg, "mkv")) {
*format = SC_RECORD_FORMAT_MKV;
return true;
}
LOGE("Unsupported format: %s (expected mp4 or mkv)", optarg);
return false;
}
static bool
parse_record_format(const char *optarg, enum sc_record_format *format) {
enum sc_record_format fmt = get_record_format(optarg);
if (!fmt) {
LOGE("Unsupported format: %s (expected mp4 or mkv)", optarg);
return false;
}
*format = fmt;
return true;
}
static bool static bool
@ -1498,18 +1579,13 @@ parse_port(const char *optarg, uint16_t *port) {
static enum sc_record_format
guess_record_format(const char *filename) {
size_t len = strlen(filename);
if (len < 4) {
return 0;
}
const char *ext = &filename[len - 4];
if (!strcmp(ext, ".mp4")) {
return SC_RECORD_FORMAT_MP4;
}
if (!strcmp(ext, ".mkv")) {
return SC_RECORD_FORMAT_MKV;
}
return 0;
}
static enum sc_record_format
guess_record_format(const char *filename) {
const char *dot = strrchr(filename, '.');
if (!dot) {
return 0;
}
const char *ext = dot + 1;
return get_record_format(ext);
}
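The rewritten guess_record_format() above keys off the last '.' rather than the last four characters, so longer extensions such as ".opus" or ".m4a" are recognized too. A small hypothetical sketch of just the extension lookup follows; file_extension() is not a scrcpy function, it only mirrors the strrchr logic.

// Hypothetical helper mirroring the strrchr-based lookup above.
#include <stdio.h>
#include <string.h>

static const char *
file_extension(const char *filename) {
    const char *dot = strrchr(filename, '.');
    return dot ? dot + 1 : NULL; // NULL when there is no extension
}

int main(void) {
    printf("%s\n", file_extension("screen.mkv"));               // mkv
    printf("%s\n", file_extension("capture.opus"));             // opus
    printf("%s\n", file_extension("a.tar.gz"));                 // gz (last dot wins)
    printf("%s\n", file_extension("noext") ? "ext" : "none");   // none
    return 0;
}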
static bool static bool
@ -1548,6 +1624,57 @@ parse_audio_codec(const char *optarg, enum sc_codec *codec) {
return false; return false;
} }
static bool
parse_audio_source(const char *optarg, enum sc_audio_source *source) {
if (!strcmp(optarg, "mic")) {
*source = SC_AUDIO_SOURCE_MIC;
return true;
}
if (!strcmp(optarg, "output")) {
*source = SC_AUDIO_SOURCE_OUTPUT;
return true;
}
LOGE("Unsupported audio source: %s (expected output or mic)", optarg);
return false;
}
static bool
parse_time_limit(const char *s, sc_tick *tick) {
long value;
bool ok = parse_integer_arg(s, &value, false, 0, 0x7FFFFFFF, "time limit");
if (!ok) {
return false;
}
*tick = SC_TICK_FROM_SEC(value);
return true;
}
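parse_time_limit() above validates the value (0 to 2^31-1) and converts it to scrcpy's internal tick unit with SC_TICK_FROM_SEC. Here is a hedged sketch of that conversion, assuming ticks are microseconds; the typedef and macro below are stand-ins, the real definitions live in util/tick.h.

// Stand-in for the tick conversion, assuming microsecond ticks.
#include <stdint.h>
#include <stdio.h>

typedef int64_t sc_tick_demo;
#define DEMO_TICK_FROM_SEC(sec) ((sc_tick_demo) (sec) * 1000000)

int main(void) {
    long seconds = 120; // e.g. --time-limit=120
    sc_tick_demo limit = DEMO_TICK_FROM_SEC(seconds);
    printf("%lld\n", (long long) limit); // 120000000
    return 0;
}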
static bool
parse_pause_on_exit(const char *s, enum sc_pause_on_exit *pause_on_exit) {
if (!s || !strcmp(s, "true")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_TRUE;
return true;
}
if (!strcmp(s, "false")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_FALSE;
return true;
}
if (!strcmp(s, "if-error")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_IF_ERROR;
return true;
}
LOGE("Unsupported pause on exit mode: %s "
"(expected true, false or if-error)", optarg);
return false;
}
static bool static bool
parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[], parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
const char *optstring, const struct option *longopts) { const char *optstring, const struct option *longopts) {
@ -1642,8 +1769,18 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
case 'n':
opts->control = false;
break;
case OPT_NO_DISPLAY:
LOGW("--no-display is deprecated, use --no-playback instead.");
// fall through
case 'N':
opts->display = false;
opts->video_playback = false;
opts->audio_playback = false;
break;
case OPT_NO_VIDEO_PLAYBACK:
opts->video_playback = false;
break;
case OPT_NO_AUDIO_PLAYBACK:
opts->audio_playback = false;
break;
case 'p': case 'p':
if (!parse_port_range(optarg, &opts->port_range)) { if (!parse_port_range(optarg, &opts->port_range)) {
@ -1788,6 +1925,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
case OPT_NO_DOWNSIZE_ON_ERROR: case OPT_NO_DOWNSIZE_ON_ERROR:
opts->downsize_on_error = false; opts->downsize_on_error = false;
break; break;
case OPT_NO_VIDEO:
opts->video = false;
break;
case OPT_NO_AUDIO: case OPT_NO_AUDIO:
opts->audio = false; opts->audio = false;
break; break;
@ -1838,7 +1978,8 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
} }
break; break;
#else #else
LOGE("V4L2 (--v4l2-buffer) is only available on Linux."); LOGE("V4L2 (--v4l2-buffer) is disabled (or unsupported on this "
"platform).");
return false; return false;
#endif #endif
case OPT_LIST_ENCODERS: case OPT_LIST_ENCODERS:
@ -1861,6 +2002,24 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false; return false;
} }
break; break;
case OPT_AUDIO_SOURCE:
if (!parse_audio_source(optarg, &opts->audio_source)) {
return false;
}
break;
case OPT_KILL_ADB_ON_CLOSE:
opts->kill_adb_on_close = true;
break;
case OPT_TIME_LIMIT:
if (!parse_time_limit(optarg, &opts->time_limit)) {
return false;
}
break;
case OPT_PAUSE_ON_EXIT:
if (!parse_pause_on_exit(optarg, &args->pause_on_exit)) {
return false;
}
break;
default: default:
// getopt prints the error message on stderr // getopt prints the error message on stderr
return false; return false;
@ -1889,14 +2048,46 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false; return false;
} }
bool otg = false;
bool v4l2 = false;
#ifdef HAVE_USB
otg = opts->otg;
#endif
#ifdef HAVE_V4L2 #ifdef HAVE_V4L2
if (!opts->display && !opts->record_filename && !opts->v4l2_device) { v4l2 = !!opts->v4l2_device;
LOGE("-N/--no-display requires either screen recording (-r/--record)" #endif
" or sink to v4l2loopback device (--v4l2-sink)");
if (!opts->video) {
opts->video_playback = false;
}
if (!opts->audio) {
opts->audio_playback = false;
}
if (opts->video && !opts->video_playback && !opts->record_filename
&& !v4l2) {
LOGI("No video playback, no recording, no V4L2 sink: video disabled");
opts->video = false;
}
if (opts->audio && !opts->audio_playback && !opts->record_filename) {
LOGI("No audio playback, no recording: audio disabled");
opts->audio = false;
}
if (!opts->video && !opts->audio && !otg) {
LOGE("No video, no audio, no OTG: nothing to do");
return false; return false;
} }
if (opts->v4l2_device) { if (!opts->video && !otg) {
// If video is disabled, then scrcpy must exit on audio failure.
opts->require_audio = true;
}
#ifdef HAVE_V4L2
if (v4l2) {
if (opts->lock_video_orientation == if (opts->lock_video_orientation ==
SC_LOCK_VIDEO_ORIENTATION_UNLOCKED) { SC_LOCK_VIDEO_ORIENTATION_UNLOCKED) {
LOGI("Video orientation is locked for v4l2 sink. " LOGI("Video orientation is locked for v4l2 sink. "
@ -1914,18 +2105,8 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
LOGE("V4L2 buffer value without V4L2 sink\n"); LOGE("V4L2 buffer value without V4L2 sink\n");
return false; return false;
} }
#else
if (!opts->display && !opts->record_filename) {
LOGE("-N/--no-display requires screen recording (-r/--record)");
return false;
}
#endif #endif
if (opts->audio && !opts->display && !opts->record_filename) {
LOGI("No display and no recording: audio disabled");
opts->audio = false;
}
if ((opts->tunnel_host || opts->tunnel_port) && !opts->force_adb_forward) { if ((opts->tunnel_host || opts->tunnel_port) && !opts->force_adb_forward) {
LOGI("Tunnel host/port is set, " LOGI("Tunnel host/port is set, "
"--force-adb-forward automatically enabled."); "--force-adb-forward automatically enabled.");
@ -1937,19 +2118,41 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false; return false;
} }
if (opts->record_filename && !opts->record_format) { if (opts->record_filename) {
opts->record_format = guess_record_format(opts->record_filename);
if (!opts->record_format) { if (!opts->record_format) {
LOGE("No format specified for \"%s\" " opts->record_format = guess_record_format(opts->record_filename);
"(try with --record-format=mkv)", if (!opts->record_format) {
opts->record_filename); LOGE("No format specified for \"%s\" "
"(try with --record-format=mkv)",
opts->record_filename);
return false;
}
}
if (opts->audio_codec == SC_CODEC_RAW) {
LOGW("Recording does not support RAW audio codec");
return false; return false;
} }
}
if (opts->record_filename && opts->audio_codec == SC_CODEC_RAW) { if (opts->video
LOGW("Recording does not support RAW audio codec"); && sc_record_format_is_audio_only(opts->record_format)) {
return false; LOGE("Audio container does not support video stream");
return false;
}
if (opts->record_format == SC_RECORD_FORMAT_OPUS
&& opts->audio_codec != SC_CODEC_OPUS) {
LOGE("Recording to OPUS file requires an OPUS audio stream "
"(try with --audio-codec=opus)");
return false;
}
if (opts->record_format == SC_RECORD_FORMAT_AAC
&& opts->audio_codec != SC_CODEC_AAC) {
LOGE("Recording to AAC file requires an AAC audio stream "
"(try with --audio-codec=aac)");
return false;
}
} }
if (opts->audio_codec == SC_CODEC_RAW) { if (opts->audio_codec == SC_CODEC_RAW) {
@ -1983,11 +2186,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
} }
} }
#ifdef HAVE_USB
# ifdef _WIN32 # ifdef _WIN32
if (!opts->otg && (opts->keyboard_input_mode == SC_KEYBOARD_INPUT_MODE_HID if (!otg && (opts->keyboard_input_mode == SC_KEYBOARD_INPUT_MODE_HID
|| opts->mouse_input_mode == SC_MOUSE_INPUT_MODE_HID)) { || opts->mouse_input_mode == SC_MOUSE_INPUT_MODE_HID)) {
LOGE("On Windows, it is not possible to open a USB device already open " LOGE("On Windows, it is not possible to open a USB device already open "
"by another process (like adb)."); "by another process (like adb).");
LOGE("Therefore, -K/--hid-keyboard and -M/--hid-mouse may only work in " LOGE("Therefore, -K/--hid-keyboard and -M/--hid-mouse may only work in "
@ -1996,7 +2197,7 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
} }
# endif # endif
if (opts->otg) { if (otg) {
// OTG mode is compatible with only very few options. // OTG mode is compatible with only very few options.
// Only report obvious errors. // Only report obvious errors.
if (opts->record_filename) { if (opts->record_filename) {
@ -2023,18 +2224,46 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
LOGE("OTG mode: could not select display"); LOGE("OTG mode: could not select display");
return false; return false;
} }
# ifdef HAVE_V4L2 if (v4l2) {
if (opts->v4l2_device) {
LOGE("OTG mode: could not sink to V4L2 device"); LOGE("OTG mode: could not sink to V4L2 device");
return false; return false;
} }
# endif
} }
#endif
return true; return true;
} }
static enum sc_pause_on_exit
sc_get_pause_on_exit(int argc, char *argv[]) {
// Read arguments backwards so that the last --pause-on-exit is considered
// (same behavior as getopt())
for (int i = argc - 1; i >= 1; --i) {
const char *arg = argv[i];
// Starts with "--pause-on-exit"
if (!strncmp("--pause-on-exit", arg, 15)) {
if (arg[15] == '\0') {
// No argument
return SC_PAUSE_ON_EXIT_TRUE;
}
if (arg[15] != '=') {
// Invalid parameter, ignore
return SC_PAUSE_ON_EXIT_FALSE;
}
const char *value = &arg[16];
if (!strcmp(value, "true")) {
return SC_PAUSE_ON_EXIT_TRUE;
}
if (!strcmp(value, "if-error")) {
return SC_PAUSE_ON_EXIT_IF_ERROR;
}
// Set to false, including when the value is invalid
return SC_PAUSE_ON_EXIT_FALSE;
}
}
return SC_PAUSE_ON_EXIT_FALSE;
}
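sc_get_pause_on_exit() scans argv from the end so the last --pause-on-exit occurrence wins (matching getopt), and, as the scrcpy_parse_args() hunk further down shows, it is consulted even when regular option parsing fails, so error output can still be read before the terminal closes. A simplified standalone sketch of the same backward scan follows, using a boolean result instead of the enum.

// Simplified sketch: does the command line ask for a pause on exit?
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

static bool
wants_pause(int argc, const char *argv[]) {
    for (int i = argc - 1; i >= 1; --i) {
        if (!strncmp(argv[i], "--pause-on-exit", 15)) {
            const char *v = argv[i] + 15;
            // no value, "=true" or "=if-error" count as "may pause"
            return *v == '\0' || !strcmp(v, "=true") || !strcmp(v, "=if-error");
        }
    }
    return false;
}

int main(void) {
    const char *argv[] = {"scrcpy", "--bogus-option", "--pause-on-exit=if-error"};
    printf("%d\n", wants_pause(3, argv)); // 1
    return 0;
}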
bool bool
scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) { scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
struct sc_getopt_adapter adapter; struct sc_getopt_adapter adapter;
@ -2048,5 +2277,11 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
sc_getopt_adapter_destroy(&adapter); sc_getopt_adapter_destroy(&adapter);
if (!ret && args->pause_on_exit == SC_PAUSE_ON_EXIT_FALSE) {
// Check if "--pause-on-exit" is present in the arguments list, because
// it must be taken into account even if command line parsing failed
args->pause_on_exit = sc_get_pause_on_exit(argc, argv);
}
return ret; return ret;
} }


@ -7,10 +7,17 @@
#include "options.h" #include "options.h"
enum sc_pause_on_exit {
SC_PAUSE_ON_EXIT_TRUE,
SC_PAUSE_ON_EXIT_FALSE,
SC_PAUSE_ON_EXIT_IF_ERROR,
};
struct scrcpy_cli_args { struct scrcpy_cli_args {
struct scrcpy_options opts; struct scrcpy_options opts;
bool help; bool help;
bool version; bool version;
enum sc_pause_on_exit pause_on_exit;
}; };
void void


@ -25,6 +25,12 @@
# define SCRCPY_LAVF_REQUIRES_REGISTER_ALL # define SCRCPY_LAVF_REQUIRES_REGISTER_ALL
#endif #endif
// Not documented in ffmpeg/doc/APIchanges, but AV_CODEC_ID_AV1 has been added
// by FFmpeg commit d42809f9835a4e9e5c7c63210abb09ad0ef19cfb (included in tag
// n3.3).
#if LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 89, 100)
# define SCRCPY_LAVC_HAS_AV1
#endif
// In ffmpeg/doc/APIchanges: // In ffmpeg/doc/APIchanges:
// 2018-01-28 - ea3672b7d6 - lavf 58.7.100 - avformat.h // 2018-01-28 - ea3672b7d6 - lavf 58.7.100 - avformat.h


@ -33,7 +33,12 @@ sc_demuxer_to_avcodec_id(uint32_t codec_id) {
case SC_CODEC_ID_H265: case SC_CODEC_ID_H265:
return AV_CODEC_ID_HEVC; return AV_CODEC_ID_HEVC;
case SC_CODEC_ID_AV1: case SC_CODEC_ID_AV1:
#ifdef SCRCPY_LAVC_HAS_AV1
return AV_CODEC_ID_AV1; return AV_CODEC_ID_AV1;
#else
LOGE("AV1 not supported by this FFmpeg version");
return AV_CODEC_ID_NONE;
#endif
case SC_CODEC_ID_OPUS: case SC_CODEC_ID_OPUS:
return AV_CODEC_ID_OPUS; return AV_CODEC_ID_OPUS;
case SC_CODEC_ID_AAC: case SC_CODEC_ID_AAC:
@ -74,9 +79,8 @@ sc_demuxer_recv_video_size(struct sc_demuxer *demuxer, uint32_t *width,
static bool static bool
sc_demuxer_recv_packet(struct sc_demuxer *demuxer, AVPacket *packet) { sc_demuxer_recv_packet(struct sc_demuxer *demuxer, AVPacket *packet) {
// The video stream contains raw packets, without time information. When we // The video and audio streams contain a sequence of raw packets (as
// record, we retrieve the timestamps separately, from a "meta" header // provided by MediaCodec), each prefixed with a "meta" header.
// added by the server before each raw packet.
// //
// The "meta" header length is 12 bytes: // The "meta" header length is 12 bytes:
// [. . . . . . . .|. . . .]. . . . . . . . . . . . . . . ... // [. . . . . . . .|. . . .]. . . . . . . . . . . . . . . ...
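The comment above describes the framing used for both streams: every raw MediaCodec packet is preceded by a 12-byte "meta" header. Below is a hedged parsing sketch, assuming the 8-byte field carries the PTS (plus flag bits) and the 4-byte field the payload length, both in network byte order; the exact flag layout is defined by the server, and the real demuxer reads these fields with its own byte-order helpers.

// Assumed layout of the 12-byte meta header: [8 bytes PTS/flags][4 bytes len].
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

static uint64_t read_be64(const uint8_t *p) {
    uint64_t v = 0;
    for (int i = 0; i < 8; ++i) v = (v << 8) | p[i];
    return v;
}

static uint32_t read_be32(const uint8_t *p) {
    uint32_t v = 0;
    for (int i = 0; i < 4; ++i) v = (v << 8) | p[i];
    return v;
}

int main(void) {
    uint8_t header[12] = {0, 0, 0, 0, 0, 0, 0, 42,  0, 0, 4, 0};
    uint64_t pts_and_flags = read_be64(header);   // 42
    uint32_t payload_len = read_be32(header + 8); // 1024
    printf("pts/flags=%" PRIu64 " len=%" PRIu32 "\n", pts_and_flags, payload_len);
    return 0;
}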

app/src/display.c (new file, 285 lines)

@ -0,0 +1,285 @@
#include "display.h"
#include <assert.h>
#include "util/log.h"
bool
sc_display_init(struct sc_display *display, SDL_Window *window, bool mipmaps) {
display->renderer =
SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);
if (!display->renderer) {
LOGE("Could not create renderer: %s", SDL_GetError());
return false;
}
SDL_RendererInfo renderer_info;
int r = SDL_GetRendererInfo(display->renderer, &renderer_info);
const char *renderer_name = r ? NULL : renderer_info.name;
LOGI("Renderer: %s", renderer_name ? renderer_name : "(unknown)");
display->mipmaps = false;
// starts with "opengl"
bool use_opengl = renderer_name && !strncmp(renderer_name, "opengl", 6);
if (use_opengl) {
#ifdef SC_DISPLAY_FORCE_OPENGL_CORE_PROFILE
// Persuade macOS to give us something better than OpenGL 2.1.
// If we create a Core Profile context, we get the best OpenGL version.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
SDL_GL_CONTEXT_PROFILE_CORE);
LOGD("Creating OpenGL Core Profile context");
display->gl_context = SDL_GL_CreateContext(window);
if (!display->gl_context) {
LOGE("Could not create OpenGL context: %s", SDL_GetError());
SDL_DestroyRenderer(display->renderer);
return false;
}
#endif
struct sc_opengl *gl = &display->gl;
sc_opengl_init(gl);
LOGI("OpenGL version: %s", gl->version);
if (mipmaps) {
bool supports_mipmaps =
sc_opengl_version_at_least(gl, 3, 0, /* OpenGL 3.0+ */
2, 0 /* OpenGL ES 2.0+ */);
if (supports_mipmaps) {
LOGI("Trilinear filtering enabled");
display->mipmaps = true;
} else {
LOGW("Trilinear filtering disabled "
"(OpenGL 3.0+ or ES 2.0+ required)");
}
} else {
LOGI("Trilinear filtering disabled");
}
} else if (mipmaps) {
LOGD("Trilinear filtering disabled (not an OpenGL renderer");
}
display->pending.flags = 0;
display->pending.frame = NULL;
return true;
}
void
sc_display_destroy(struct sc_display *display) {
if (display->pending.frame) {
av_frame_free(&display->pending.frame);
}
#ifdef SC_DISPLAY_FORCE_OPENGL_CORE_PROFILE
SDL_GL_DeleteContext(display->gl_context);
#endif
if (display->texture) {
SDL_DestroyTexture(display->texture);
}
SDL_DestroyRenderer(display->renderer);
}
static SDL_Texture *
sc_display_create_texture(struct sc_display *display,
struct sc_size size) {
SDL_Renderer *renderer = display->renderer;
SDL_Texture *texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_YV12,
SDL_TEXTUREACCESS_STREAMING,
size.width, size.height);
if (!texture) {
LOGD("Could not create texture: %s", SDL_GetError());
return NULL;
}
if (display->mipmaps) {
struct sc_opengl *gl = &display->gl;
SDL_GL_BindTexture(texture, NULL, NULL);
// Enable trilinear filtering for downscaling
gl->TexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
GL_LINEAR_MIPMAP_LINEAR);
gl->TexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, -1.f);
SDL_GL_UnbindTexture(texture);
}
return texture;
}
static inline void
sc_display_set_pending_size(struct sc_display *display, struct sc_size size) {
assert(!display->texture);
display->pending.size = size;
display->pending.flags |= SC_DISPLAY_PENDING_FLAG_SIZE;
}
static bool
sc_display_set_pending_frame(struct sc_display *display, const AVFrame *frame) {
if (!display->pending.frame) {
display->pending.frame = av_frame_alloc();
if (!display->pending.frame) {
LOG_OOM();
return false;
}
}
int r = av_frame_ref(display->pending.frame, frame);
if (r) {
LOGE("Could not ref frame: %d", r);
return false;
}
display->pending.flags |= SC_DISPLAY_PENDING_FLAG_FRAME;
return true;
}
static bool
sc_display_apply_pending(struct sc_display *display) {
if (display->pending.flags & SC_DISPLAY_PENDING_FLAG_SIZE) {
assert(!display->texture);
display->texture =
sc_display_create_texture(display, display->pending.size);
if (!display->texture) {
return false;
}
display->pending.flags &= ~SC_DISPLAY_PENDING_FLAG_SIZE;
}
if (display->pending.flags & SC_DISPLAY_PENDING_FLAG_FRAME) {
assert(display->pending.frame);
bool ok = sc_display_update_texture(display, display->pending.frame);
if (!ok) {
return false;
}
av_frame_unref(display->pending.frame);
display->pending.flags &= ~SC_DISPLAY_PENDING_FLAG_FRAME;
}
return true;
}
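The pending mechanism above lets a texture creation or upload fail transiently and be retried: the size or frame is stashed, the caller gets SC_DISPLAY_RESULT_PENDING, and sc_display_render() re-applies whatever is pending before drawing the next frame. Below is a sketch of how a caller might drive that flow, using only the functions declared in the new display.h further down; it is an assumption about caller behavior and only compiles inside the scrcpy tree, the real screen code is more involved.

// Sketch of a caller retrying on SC_DISPLAY_RESULT_PENDING (assumption: a
// pending frame is simply picked up by the next sc_display_render() call).
#include <stdbool.h>
#include "display.h"

static bool
push_frame(struct sc_display *display, const AVFrame *frame,
           const SDL_Rect *geometry, unsigned rotation) {
    enum sc_display_result res = sc_display_update_texture(display, frame);
    if (res == SC_DISPLAY_RESULT_ERROR) {
        return false; // unrecoverable
    }
    // OK or PENDING: render anyway; a pending frame is applied in render()
    res = sc_display_render(display, geometry, rotation);
    return res != SC_DISPLAY_RESULT_ERROR;
}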
static bool
sc_display_set_texture_size_internal(struct sc_display *display,
struct sc_size size) {
assert(size.width && size.height);
if (display->texture) {
SDL_DestroyTexture(display->texture);
}
display->texture = sc_display_create_texture(display, size);
if (!display->texture) {
return false;
}
LOGI("Texture: %" PRIu16 "x%" PRIu16, size.width, size.height);
return true;
}
enum sc_display_result
sc_display_set_texture_size(struct sc_display *display, struct sc_size size) {
bool ok = sc_display_set_texture_size_internal(display, size);
if (!ok) {
sc_display_set_pending_size(display, size);
return SC_DISPLAY_RESULT_PENDING;
}
return SC_DISPLAY_RESULT_OK;
}
static bool
sc_display_update_texture_internal(struct sc_display *display,
const AVFrame *frame) {
int ret = SDL_UpdateYUVTexture(display->texture, NULL,
frame->data[0], frame->linesize[0],
frame->data[1], frame->linesize[1],
frame->data[2], frame->linesize[2]);
if (ret) {
LOGD("Could not update texture: %s", SDL_GetError());
return false;
}
if (display->mipmaps) {
SDL_GL_BindTexture(display->texture, NULL, NULL);
display->gl.GenerateMipmap(GL_TEXTURE_2D);
SDL_GL_UnbindTexture(display->texture);
}
return true;
}
enum sc_display_result
sc_display_update_texture(struct sc_display *display, const AVFrame *frame) {
bool ok = sc_display_update_texture_internal(display, frame);
if (!ok) {
ok = sc_display_set_pending_frame(display, frame);
if (!ok) {
LOGE("Could not set pending frame");
return SC_DISPLAY_RESULT_ERROR;
}
return SC_DISPLAY_RESULT_PENDING;
}
return SC_DISPLAY_RESULT_OK;
}
enum sc_display_result
sc_display_render(struct sc_display *display, const SDL_Rect *geometry,
unsigned rotation) {
SDL_RenderClear(display->renderer);
if (display->pending.flags) {
bool ok = sc_display_apply_pending(display);
if (!ok) {
return SC_DISPLAY_RESULT_PENDING;
}
}
SDL_Renderer *renderer = display->renderer;
SDL_Texture *texture = display->texture;
if (rotation == 0) {
int ret = SDL_RenderCopy(renderer, texture, NULL, geometry);
if (ret) {
LOGE("Could not render texture: %s", SDL_GetError());
return SC_DISPLAY_RESULT_ERROR;
}
} else {
// rotation in RenderCopyEx() is clockwise, while screen->rotation is
// counterclockwise (to be consistent with --lock-video-orientation)
int cw_rotation = (4 - rotation) % 4;
double angle = 90 * cw_rotation;
const SDL_Rect *dstrect = NULL;
SDL_Rect rect;
if (rotation & 1) {
rect.x = geometry->x + (geometry->w - geometry->h) / 2;
rect.y = geometry->y + (geometry->h - geometry->w) / 2;
rect.w = geometry->h;
rect.h = geometry->w;
dstrect = &rect;
} else {
assert(rotation == 2);
dstrect = geometry;
}
int ret = SDL_RenderCopyEx(renderer, texture, NULL, dstrect, angle,
NULL, 0);
if (ret) {
LOGE("Could not render texture: %s", SDL_GetError());
return SC_DISPLAY_RESULT_ERROR;
}
}
SDL_RenderPresent(display->renderer);
return SC_DISPLAY_RESULT_OK;
}
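A quick worked check of the rotation conversion above: the internal rotation counts counterclockwise quarter-turns, while SDL_RenderCopyEx() takes clockwise degrees, hence cw_rotation = (4 - rotation) % 4. Rotation 1 (90 degrees counterclockwise) becomes 270 degrees clockwise, rotation 2 becomes 180 degrees, and odd rotations additionally swap the destination width and height.

// Check of the counterclockwise-to-clockwise angle mapping used above.
#include <stdio.h>

int main(void) {
    for (unsigned rotation = 0; rotation < 4; ++rotation) {
        unsigned cw = (4 - rotation) % 4;
        printf("rotation=%u ccw -> angle=%u deg cw\n", rotation, 90 * cw);
    }
    return 0; // prints 0, 270, 180, 90
}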

app/src/display.h (new file, 59 lines)

@ -0,0 +1,59 @@
#ifndef SC_DISPLAY_H
#define SC_DISPLAY_H
#include "common.h"
#include <stdbool.h>
#include <libavformat/avformat.h>
#include <SDL2/SDL.h>
#include "coords.h"
#include "opengl.h"
#ifdef __APPLE__
# define SC_DISPLAY_FORCE_OPENGL_CORE_PROFILE
#endif
struct sc_display {
SDL_Renderer *renderer;
SDL_Texture *texture;
struct sc_opengl gl;
#ifdef SC_DISPLAY_FORCE_OPENGL_CORE_PROFILE
SDL_GLContext *gl_context;
#endif
bool mipmaps;
struct {
#define SC_DISPLAY_PENDING_FLAG_SIZE 1
#define SC_DISPLAY_PENDING_FLAG_FRAME 2
int8_t flags;
struct sc_size size;
AVFrame *frame;
} pending;
};
enum sc_display_result {
SC_DISPLAY_RESULT_OK,
SC_DISPLAY_RESULT_PENDING,
SC_DISPLAY_RESULT_ERROR,
};
bool
sc_display_init(struct sc_display *display, SDL_Window *window, bool mipmaps);
void
sc_display_destroy(struct sc_display *display);
enum sc_display_result
sc_display_set_texture_size(struct sc_display *display, struct sc_size size);
enum sc_display_result
sc_display_update_texture(struct sc_display *display, const AVFrame *frame);
enum sc_display_result
sc_display_render(struct sc_display *display, const SDL_Rect *geometry,
unsigned rotation);
#endif


@ -6,3 +6,4 @@
#define SC_EVENT_DEMUXER_ERROR (SDL_USEREVENT + 5) #define SC_EVENT_DEMUXER_ERROR (SDL_USEREVENT + 5)
#define SC_EVENT_RECORDER_ERROR (SDL_USEREVENT + 6) #define SC_EVENT_RECORDER_ERROR (SDL_USEREVENT + 6)
#define SC_EVENT_SCREEN_INIT_SIZE (SDL_USEREVENT + 7) #define SC_EVENT_SCREEN_INIT_SIZE (SDL_USEREVENT + 7)
#define SC_EVENT_TIME_LIMIT_REACHED (SDL_USEREVENT + 8)


@ -797,7 +797,8 @@ sc_input_manager_process_file(struct sc_input_manager *im,
} }
void void
sc_input_manager_handle_event(struct sc_input_manager *im, SDL_Event *event) { sc_input_manager_handle_event(struct sc_input_manager *im,
const SDL_Event *event) {
bool control = im->controller; bool control = im->controller;
switch (event->type) { switch (event->type) {
case SDL_TEXTINPUT: case SDL_TEXTINPUT:


@ -61,6 +61,7 @@ sc_input_manager_init(struct sc_input_manager *im,
const struct sc_input_manager_params *params); const struct sc_input_manager_params *params);
void void
sc_input_manager_handle_event(struct sc_input_manager *im, SDL_Event *event); sc_input_manager_handle_event(struct sc_input_manager *im,
const SDL_Event *event);
#endif #endif


@ -39,26 +39,32 @@ main_scrcpy(int argc, char *argv[]) {
.opts = scrcpy_options_default, .opts = scrcpy_options_default,
.help = false, .help = false,
.version = false, .version = false,
.pause_on_exit = SC_PAUSE_ON_EXIT_FALSE,
}; };
#ifndef NDEBUG #ifndef NDEBUG
args.opts.log_level = SC_LOG_LEVEL_DEBUG; args.opts.log_level = SC_LOG_LEVEL_DEBUG;
#endif #endif
enum scrcpy_exit_code ret;
if (!scrcpy_parse_args(&args, argc, argv)) { if (!scrcpy_parse_args(&args, argc, argv)) {
return SCRCPY_EXIT_FAILURE; ret = SCRCPY_EXIT_FAILURE;
goto end;
} }
sc_set_log_level(args.opts.log_level); sc_set_log_level(args.opts.log_level);
if (args.help) { if (args.help) {
scrcpy_print_usage(argv[0]); scrcpy_print_usage(argv[0]);
return SCRCPY_EXIT_SUCCESS; ret = SCRCPY_EXIT_SUCCESS;
goto end;
} }
if (args.version) { if (args.version) {
scrcpy_print_version(); scrcpy_print_version();
return SCRCPY_EXIT_SUCCESS; ret = SCRCPY_EXIT_SUCCESS;
goto end;
} }
#ifdef SCRCPY_LAVF_REQUIRES_REGISTER_ALL #ifdef SCRCPY_LAVF_REQUIRES_REGISTER_ALL
@ -72,18 +78,26 @@ main_scrcpy(int argc, char *argv[]) {
#endif #endif
if (!net_init()) { if (!net_init()) {
return SCRCPY_EXIT_FAILURE; ret = SCRCPY_EXIT_FAILURE;
goto end;
} }
sc_log_configure(); sc_log_configure();
#ifdef HAVE_USB #ifdef HAVE_USB
enum scrcpy_exit_code ret = args.opts.otg ? scrcpy_otg(&args.opts) ret = args.opts.otg ? scrcpy_otg(&args.opts) : scrcpy(&args.opts);
: scrcpy(&args.opts);
#else #else
enum scrcpy_exit_code ret = scrcpy(&args.opts); ret = scrcpy(&args.opts);
#endif #endif
end:
if (args.pause_on_exit == SC_PAUSE_ON_EXIT_TRUE ||
(args.pause_on_exit == SC_PAUSE_ON_EXIT_IF_ERROR &&
ret != SCRCPY_EXIT_SUCCESS)) {
printf("Press Enter to continue...\n");
getchar();
}
return ret; return ret;
} }


@ -11,12 +11,10 @@ const struct scrcpy_options scrcpy_options_default = {
.audio_codec_options = NULL, .audio_codec_options = NULL,
.video_encoder = NULL, .video_encoder = NULL,
.audio_encoder = NULL, .audio_encoder = NULL,
#ifdef HAVE_V4L2
.v4l2_device = NULL,
#endif
.log_level = SC_LOG_LEVEL_INFO, .log_level = SC_LOG_LEVEL_INFO,
.video_codec = SC_CODEC_H264, .video_codec = SC_CODEC_H264,
.audio_codec = SC_CODEC_OPUS, .audio_codec = SC_CODEC_OPUS,
.audio_source = SC_AUDIO_SOURCE_OUTPUT,
.record_format = SC_RECORD_FORMAT_AUTO, .record_format = SC_RECORD_FORMAT_AUTO,
.keyboard_input_mode = SC_KEYBOARD_INPUT_MODE_INJECT, .keyboard_input_mode = SC_KEYBOARD_INPUT_MODE_INJECT,
.mouse_input_mode = SC_MOUSE_INPUT_MODE_INJECT, .mouse_input_mode = SC_MOUSE_INPUT_MODE_INJECT,
@ -42,9 +40,13 @@ const struct scrcpy_options scrcpy_options_default = {
.window_height = 0, .window_height = 0,
.display_id = 0, .display_id = 0,
.display_buffer = 0, .display_buffer = 0,
.v4l2_buffer = 0,
.audio_buffer = SC_TICK_FROM_MS(50), .audio_buffer = SC_TICK_FROM_MS(50),
.audio_output_buffer = SC_TICK_FROM_MS(5), .audio_output_buffer = SC_TICK_FROM_MS(5),
.time_limit = 0,
#ifdef HAVE_V4L2
.v4l2_device = NULL,
.v4l2_buffer = 0,
#endif
#ifdef HAVE_USB #ifdef HAVE_USB
.otg = false, .otg = false,
#endif #endif
@ -52,7 +54,8 @@ const struct scrcpy_options scrcpy_options_default = {
.fullscreen = false, .fullscreen = false,
.always_on_top = false, .always_on_top = false,
.control = true, .control = true,
.display = true, .video_playback = true,
.audio_playback = true,
.turn_screen_off = false, .turn_screen_off = false,
.key_inject_mode = SC_KEY_INJECT_MODE_MIXED, .key_inject_mode = SC_KEY_INJECT_MODE_MIXED,
.window_borderless = false, .window_borderless = false,
@ -73,8 +76,10 @@ const struct scrcpy_options scrcpy_options_default = {
.cleanup = true, .cleanup = true,
.start_fps_counter = false, .start_fps_counter = false,
.power_on = true, .power_on = true,
.video = true,
.audio = true, .audio = true,
.require_audio = false, .require_audio = false,
.list_encoders = false, .list_encoders = false,
.list_displays = false, .list_displays = false,
.kill_adb_on_close = false,
}; };


@ -21,8 +21,20 @@ enum sc_record_format {
SC_RECORD_FORMAT_AUTO, SC_RECORD_FORMAT_AUTO,
SC_RECORD_FORMAT_MP4, SC_RECORD_FORMAT_MP4,
SC_RECORD_FORMAT_MKV, SC_RECORD_FORMAT_MKV,
SC_RECORD_FORMAT_M4A,
SC_RECORD_FORMAT_MKA,
SC_RECORD_FORMAT_OPUS,
SC_RECORD_FORMAT_AAC,
}; };
static inline bool
sc_record_format_is_audio_only(enum sc_record_format fmt) {
return fmt == SC_RECORD_FORMAT_M4A
|| fmt == SC_RECORD_FORMAT_MKA
|| fmt == SC_RECORD_FORMAT_OPUS
|| fmt == SC_RECORD_FORMAT_AAC;
}
enum sc_codec { enum sc_codec {
SC_CODEC_H264, SC_CODEC_H264,
SC_CODEC_H265, SC_CODEC_H265,
@ -32,6 +44,11 @@ enum sc_codec {
SC_CODEC_RAW, SC_CODEC_RAW,
}; };
enum sc_audio_source {
SC_AUDIO_SOURCE_OUTPUT,
SC_AUDIO_SOURCE_MIC,
};
enum sc_lock_video_orientation { enum sc_lock_video_orientation {
SC_LOCK_VIDEO_ORIENTATION_UNLOCKED = -1, SC_LOCK_VIDEO_ORIENTATION_UNLOCKED = -1,
// lock the current orientation when scrcpy starts // lock the current orientation when scrcpy starts
@ -100,12 +117,10 @@ struct scrcpy_options {
const char *audio_codec_options; const char *audio_codec_options;
const char *video_encoder; const char *video_encoder;
const char *audio_encoder; const char *audio_encoder;
#ifdef HAVE_V4L2
const char *v4l2_device;
#endif
enum sc_log_level log_level; enum sc_log_level log_level;
enum sc_codec video_codec; enum sc_codec video_codec;
enum sc_codec audio_codec; enum sc_codec audio_codec;
enum sc_audio_source audio_source;
enum sc_record_format record_format; enum sc_record_format record_format;
enum sc_keyboard_input_mode keyboard_input_mode; enum sc_keyboard_input_mode keyboard_input_mode;
enum sc_mouse_input_mode mouse_input_mode; enum sc_mouse_input_mode mouse_input_mode;
@ -125,9 +140,13 @@ struct scrcpy_options {
uint16_t window_height; uint16_t window_height;
uint32_t display_id; uint32_t display_id;
sc_tick display_buffer; sc_tick display_buffer;
sc_tick v4l2_buffer;
sc_tick audio_buffer; sc_tick audio_buffer;
sc_tick audio_output_buffer; sc_tick audio_output_buffer;
sc_tick time_limit;
#ifdef HAVE_V4L2
const char *v4l2_device;
sc_tick v4l2_buffer;
#endif
#ifdef HAVE_USB #ifdef HAVE_USB
bool otg; bool otg;
#endif #endif
@ -135,7 +154,8 @@ struct scrcpy_options {
bool fullscreen; bool fullscreen;
bool always_on_top; bool always_on_top;
bool control; bool control;
bool display; bool video_playback;
bool audio_playback;
bool turn_screen_off; bool turn_screen_off;
enum sc_key_inject_mode key_inject_mode; enum sc_key_inject_mode key_inject_mode;
bool window_borderless; bool window_borderless;
@ -156,10 +176,12 @@ struct scrcpy_options {
bool cleanup; bool cleanup;
bool start_fps_counter; bool start_fps_counter;
bool power_on; bool power_on;
bool video;
bool audio; bool audio;
bool require_audio; bool require_audio;
bool list_encoders; bool list_encoders;
bool list_displays; bool list_displays;
bool kill_adb_on_close;
}; };
extern const struct scrcpy_options scrcpy_options_default; extern const struct scrcpy_options scrcpy_options_default;


@ -60,9 +60,17 @@ sc_recorder_queue_clear(struct sc_recorder_queue *queue) {
static const char * static const char *
sc_recorder_get_format_name(enum sc_record_format format) { sc_recorder_get_format_name(enum sc_record_format format) {
switch (format) { switch (format) {
case SC_RECORD_FORMAT_MP4: return "mp4"; case SC_RECORD_FORMAT_MP4:
case SC_RECORD_FORMAT_MKV: return "matroska"; case SC_RECORD_FORMAT_M4A:
default: return NULL; case SC_RECORD_FORMAT_AAC:
return "mp4";
case SC_RECORD_FORMAT_MKV:
case SC_RECORD_FORMAT_MKA:
return "matroska";
case SC_RECORD_FORMAT_OPUS:
return "opus";
default:
return NULL;
} }
} }
@ -88,23 +96,30 @@ sc_recorder_rescale_packet(AVStream *stream, AVPacket *packet) {
} }
static bool static bool
sc_recorder_write_stream(struct sc_recorder *recorder, int stream_index, sc_recorder_write_stream(struct sc_recorder *recorder,
AVPacket *packet) { struct sc_recorder_stream *st, AVPacket *packet) {
AVStream *stream = recorder->ctx->streams[stream_index]; AVStream *stream = recorder->ctx->streams[st->index];
sc_recorder_rescale_packet(stream, packet); sc_recorder_rescale_packet(stream, packet);
if (st->last_pts != AV_NOPTS_VALUE && packet->pts <= st->last_pts) {
LOGW("Fixing PTS non monotonically increasing in stream %d "
"(%" PRIi64 " >= %" PRIi64 ")",
st->index, st->last_pts, packet->pts);
packet->pts = ++st->last_pts;
packet->dts = packet->pts;
} else {
st->last_pts = packet->pts;
}
return av_interleaved_write_frame(recorder->ctx, packet) >= 0; return av_interleaved_write_frame(recorder->ctx, packet) >= 0;
} }
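The new check above enforces strictly increasing PTS per stream, which the muxer requires: a packet whose PTS is not greater than the previous one is nudged to last_pts + 1 and its DTS is aligned with it. Below is a standalone illustration of that fix-up with hypothetical timestamps; the real code operates on AVPacket fields after rescaling.

// Standalone illustration of the PTS fix-up above (hypothetical values).
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    int64_t last_pts = INT64_MIN;            // stands in for AV_NOPTS_VALUE
    int64_t incoming[] = {100, 200, 200, 150, 300};
    for (size_t i = 0; i < 5; ++i) {
        int64_t pts = incoming[i];
        if (last_pts != INT64_MIN && pts <= last_pts) {
            pts = ++last_pts;                // nudge to keep PTS increasing
        } else {
            last_pts = pts;
        }
        printf("packet %zu: pts=%" PRIi64 "\n", i, pts);
    }
    return 0; // prints 100, 200, 201, 202, 300
}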
static inline bool static inline bool
sc_recorder_write_video(struct sc_recorder *recorder, AVPacket *packet) { sc_recorder_write_video(struct sc_recorder *recorder, AVPacket *packet) {
return sc_recorder_write_stream(recorder, recorder->video_stream_index, return sc_recorder_write_stream(recorder, &recorder->video_stream, packet);
packet);
} }
static inline bool static inline bool
sc_recorder_write_audio(struct sc_recorder *recorder, AVPacket *packet) { sc_recorder_write_audio(struct sc_recorder *recorder, AVPacket *packet) {
return sc_recorder_write_stream(recorder, recorder->audio_stream_index, return sc_recorder_write_stream(recorder, &recorder->audio_stream, packet);
packet);
} }
static bool static bool
@ -152,7 +167,7 @@ sc_recorder_close_output_file(struct sc_recorder *recorder) {
static inline bool static inline bool
sc_recorder_has_empty_queues(struct sc_recorder *recorder) { sc_recorder_has_empty_queues(struct sc_recorder *recorder) {
if (sc_vecdeque_is_empty(&recorder->video_queue)) { if (recorder->video && sc_vecdeque_is_empty(&recorder->video_queue)) {
// The video queue is empty // The video queue is empty
return true; return true;
} }
@ -170,13 +185,14 @@ static bool
sc_recorder_process_header(struct sc_recorder *recorder) { sc_recorder_process_header(struct sc_recorder *recorder) {
sc_mutex_lock(&recorder->mutex); sc_mutex_lock(&recorder->mutex);
while (!recorder->stopped && (!recorder->video_init while (!recorder->stopped &&
|| !recorder->audio_init ((recorder->video && !recorder->video_init)
|| sc_recorder_has_empty_queues(recorder))) { || (recorder->audio && !recorder->audio_init)
sc_cond_wait(&recorder->stream_cond, &recorder->mutex); || sc_recorder_has_empty_queues(recorder))) {
sc_cond_wait(&recorder->cond, &recorder->mutex);
} }
if (sc_vecdeque_is_empty(&recorder->video_queue)) { if (recorder->video && sc_vecdeque_is_empty(&recorder->video_queue)) {
assert(recorder->stopped); assert(recorder->stopped);
// If the recorder is stopped, don't process anything if there are not // If the recorder is stopped, don't process anything if there are not
// at least video packets // at least video packets
@ -184,7 +200,11 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
return false; return false;
} }
AVPacket *video_pkt = sc_vecdeque_pop(&recorder->video_queue); AVPacket *video_pkt = NULL;
if (!sc_vecdeque_is_empty(&recorder->video_queue)) {
assert(recorder->video);
video_pkt = sc_vecdeque_pop(&recorder->video_queue);
}
AVPacket *audio_pkt = NULL; AVPacket *audio_pkt = NULL;
if (!sc_vecdeque_is_empty(&recorder->audio_queue)) { if (!sc_vecdeque_is_empty(&recorder->audio_queue)) {
@ -196,17 +216,19 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
int ret = false; int ret = false;
if (video_pkt->pts != AV_NOPTS_VALUE) { if (video_pkt) {
LOGE("The first video packet is not a config packet"); if (video_pkt->pts != AV_NOPTS_VALUE) {
goto end; LOGE("The first video packet is not a config packet");
} goto end;
}
assert(recorder->video_stream_index >= 0); assert(recorder->video_stream.index >= 0);
AVStream *video_stream = AVStream *video_stream =
recorder->ctx->streams[recorder->video_stream_index]; recorder->ctx->streams[recorder->video_stream.index];
bool ok = sc_recorder_set_extradata(video_stream, video_pkt); bool ok = sc_recorder_set_extradata(video_stream, video_pkt);
if (!ok) { if (!ok) {
goto end; goto end;
}
} }
if (audio_pkt) { if (audio_pkt) {
@ -215,16 +237,16 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
goto end; goto end;
} }
assert(recorder->audio_stream_index >= 0); assert(recorder->audio_stream.index >= 0);
AVStream *audio_stream = AVStream *audio_stream =
recorder->ctx->streams[recorder->audio_stream_index]; recorder->ctx->streams[recorder->audio_stream.index];
ok = sc_recorder_set_extradata(audio_stream, audio_pkt); bool ok = sc_recorder_set_extradata(audio_stream, audio_pkt);
if (!ok) { if (!ok) {
goto end; goto end;
} }
} }
ok = avformat_write_header(recorder->ctx, NULL) >= 0; bool ok = avformat_write_header(recorder->ctx, NULL) >= 0;
if (!ok) { if (!ok) {
LOGE("Failed to write header to %s", recorder->filename); LOGE("Failed to write header to %s", recorder->filename);
goto end; goto end;
@ -233,7 +255,9 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
ret = true; ret = true;
end: end:
av_packet_free(&video_pkt); if (video_pkt) {
av_packet_free(&video_pkt);
}
if (audio_pkt) { if (audio_pkt) {
av_packet_free(&audio_pkt); av_packet_free(&audio_pkt);
} }
@ -263,7 +287,8 @@ sc_recorder_process_packets(struct sc_recorder *recorder) {
sc_mutex_lock(&recorder->mutex); sc_mutex_lock(&recorder->mutex);
while (!recorder->stopped) { while (!recorder->stopped) {
if (!video_pkt && !sc_vecdeque_is_empty(&recorder->video_queue)) { if (recorder->video && !video_pkt &&
!sc_vecdeque_is_empty(&recorder->video_queue)) {
// A new packet may be assigned to video_pkt and be processed // A new packet may be assigned to video_pkt and be processed
break; break;
} }
@ -272,12 +297,17 @@ sc_recorder_process_packets(struct sc_recorder *recorder) {
// A new packet may be assigned to audio_pkt and be processed // A new packet may be assigned to audio_pkt and be processed
break; break;
} }
sc_cond_wait(&recorder->queue_cond, &recorder->mutex); sc_cond_wait(&recorder->cond, &recorder->mutex);
} }
// If stopped is set, continue to process the remaining events (to // If stopped is set, continue to process the remaining events (to
// finish the recording) before actually stopping. // finish the recording) before actually stopping.
// If there is no video, then the video_queue will remain empty forever
// and video_pkt will always be NULL.
assert(recorder->video || (!video_pkt
&& sc_vecdeque_is_empty(&recorder->video_queue)));
// If there is no audio, then the audio_queue will remain empty forever // If there is no audio, then the audio_queue will remain empty forever
// and audio_pkt will always be NULL. // and audio_pkt will always be NULL.
assert(recorder->audio || (!audio_pkt assert(recorder->audio || (!audio_pkt
@ -319,6 +349,9 @@ sc_recorder_process_packets(struct sc_recorder *recorder) {
if (!recorder->audio) { if (!recorder->audio) {
assert(video_pkt); assert(video_pkt);
pts_origin = video_pkt->pts; pts_origin = video_pkt->pts;
} else if (!recorder->video) {
assert(audio_pkt);
pts_origin = audio_pkt->pts;
} else if (video_pkt && audio_pkt) { } else if (video_pkt && audio_pkt) {
pts_origin = MIN(video_pkt->pts, audio_pkt->pts); pts_origin = MIN(video_pkt->pts, audio_pkt->pts);
} else if (recorder->stopped) { } else if (recorder->stopped) {
@ -479,10 +512,10 @@ sc_recorder_video_packet_sink_open(struct sc_packet_sink *sink,
return false; return false;
} }
recorder->video_stream_index = stream->index; recorder->video_stream.index = stream->index;
recorder->video_init = true; recorder->video_init = true;
sc_cond_signal(&recorder->stream_cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
return true; return true;
@ -497,7 +530,7 @@ sc_recorder_video_packet_sink_close(struct sc_packet_sink *sink) {
sc_mutex_lock(&recorder->mutex); sc_mutex_lock(&recorder->mutex);
// EOS also stops the recorder // EOS also stops the recorder
recorder->stopped = true; recorder->stopped = true;
sc_cond_signal(&recorder->queue_cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
} }
@ -523,7 +556,7 @@ sc_recorder_video_packet_sink_push(struct sc_packet_sink *sink,
return false; return false;
} }
rec->stream_index = recorder->video_stream_index; rec->stream_index = recorder->video_stream.index;
bool ok = sc_vecdeque_push(&recorder->video_queue, rec); bool ok = sc_vecdeque_push(&recorder->video_queue, rec);
if (!ok) { if (!ok) {
@ -532,7 +565,7 @@ sc_recorder_video_packet_sink_push(struct sc_packet_sink *sink,
return false; return false;
} }
sc_cond_signal(&recorder->queue_cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
return true; return true;
@ -560,10 +593,10 @@ sc_recorder_audio_packet_sink_open(struct sc_packet_sink *sink,
return false; return false;
} }
recorder->audio_stream_index = stream->index; recorder->audio_stream.index = stream->index;
recorder->audio_init = true; recorder->audio_init = true;
sc_cond_signal(&recorder->stream_cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
return true; return true;
@ -579,7 +612,7 @@ sc_recorder_audio_packet_sink_close(struct sc_packet_sink *sink) {
sc_mutex_lock(&recorder->mutex); sc_mutex_lock(&recorder->mutex);
// EOS also stops the recorder // EOS also stops the recorder
recorder->stopped = true; recorder->stopped = true;
sc_cond_signal(&recorder->queue_cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
} }
@ -606,7 +639,7 @@ sc_recorder_audio_packet_sink_push(struct sc_packet_sink *sink,
return false; return false;
} }
rec->stream_index = recorder->audio_stream_index; rec->stream_index = recorder->audio_stream.index;
bool ok = sc_vecdeque_push(&recorder->audio_queue, rec); bool ok = sc_vecdeque_push(&recorder->audio_queue, rec);
if (!ok) { if (!ok) {
@ -615,7 +648,7 @@ sc_recorder_audio_packet_sink_push(struct sc_packet_sink *sink,
return false; return false;
} }
sc_cond_signal(&recorder->queue_cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
return true; return true;
@ -633,13 +666,19 @@ sc_recorder_audio_packet_sink_disable(struct sc_packet_sink *sink) {
sc_mutex_lock(&recorder->mutex); sc_mutex_lock(&recorder->mutex);
recorder->audio = false; recorder->audio = false;
recorder->audio_init = true; recorder->audio_init = true;
sc_cond_signal(&recorder->stream_cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
} }
static void
sc_recorder_stream_init(struct sc_recorder_stream *stream) {
stream->index = -1;
stream->last_pts = AV_NOPTS_VALUE;
}
bool bool
sc_recorder_init(struct sc_recorder *recorder, const char *filename, sc_recorder_init(struct sc_recorder *recorder, const char *filename,
enum sc_record_format format, bool audio, enum sc_record_format format, bool video, bool audio,
const struct sc_recorder_callbacks *cbs, void *cbs_userdata) { const struct sc_recorder_callbacks *cbs, void *cbs_userdata) {
recorder->filename = strdup(filename); recorder->filename = strdup(filename);
if (!recorder->filename) { if (!recorder->filename) {
@ -652,16 +691,13 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
goto error_free_filename; goto error_free_filename;
} }
ok = sc_cond_init(&recorder->queue_cond); ok = sc_cond_init(&recorder->cond);
if (!ok) { if (!ok) {
goto error_mutex_destroy; goto error_mutex_destroy;
} }
ok = sc_cond_init(&recorder->stream_cond); assert(video || audio);
if (!ok) { recorder->video = video;
goto error_queue_cond_destroy;
}
recorder->audio = audio; recorder->audio = audio;
sc_vecdeque_init(&recorder->video_queue); sc_vecdeque_init(&recorder->video_queue);
@ -671,8 +707,8 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
recorder->video_init = false; recorder->video_init = false;
recorder->audio_init = false; recorder->audio_init = false;
recorder->video_stream_index = -1; sc_recorder_stream_init(&recorder->video_stream);
recorder->audio_stream_index = -1; sc_recorder_stream_init(&recorder->audio_stream);
recorder->format = format; recorder->format = format;
@ -680,13 +716,15 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
recorder->cbs = cbs; recorder->cbs = cbs;
recorder->cbs_userdata = cbs_userdata; recorder->cbs_userdata = cbs_userdata;
static const struct sc_packet_sink_ops video_ops = { if (video) {
.open = sc_recorder_video_packet_sink_open, static const struct sc_packet_sink_ops video_ops = {
.close = sc_recorder_video_packet_sink_close, .open = sc_recorder_video_packet_sink_open,
.push = sc_recorder_video_packet_sink_push, .close = sc_recorder_video_packet_sink_close,
}; .push = sc_recorder_video_packet_sink_push,
};
recorder->video_packet_sink.ops = &video_ops; recorder->video_packet_sink.ops = &video_ops;
}
if (audio) { if (audio) {
static const struct sc_packet_sink_ops audio_ops = { static const struct sc_packet_sink_ops audio_ops = {
@ -701,8 +739,6 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
return true; return true;
error_queue_cond_destroy:
sc_cond_destroy(&recorder->queue_cond);
error_mutex_destroy: error_mutex_destroy:
sc_mutex_destroy(&recorder->mutex); sc_mutex_destroy(&recorder->mutex);
error_free_filename: error_free_filename:
@ -727,8 +763,7 @@ void
sc_recorder_stop(struct sc_recorder *recorder) { sc_recorder_stop(struct sc_recorder *recorder) {
sc_mutex_lock(&recorder->mutex); sc_mutex_lock(&recorder->mutex);
recorder->stopped = true; recorder->stopped = true;
sc_cond_signal(&recorder->queue_cond); sc_cond_signal(&recorder->cond);
sc_cond_signal(&recorder->stream_cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
} }
@ -739,8 +774,7 @@ sc_recorder_join(struct sc_recorder *recorder) {
void void
sc_recorder_destroy(struct sc_recorder *recorder) { sc_recorder_destroy(struct sc_recorder *recorder) {
sc_cond_destroy(&recorder->stream_cond); sc_cond_destroy(&recorder->cond);
sc_cond_destroy(&recorder->queue_cond);
sc_mutex_destroy(&recorder->mutex); sc_mutex_destroy(&recorder->mutex);
free(recorder->filename); free(recorder->filename);
} }


@ -14,6 +14,11 @@
struct sc_recorder_queue SC_VECDEQUE(AVPacket *); struct sc_recorder_queue SC_VECDEQUE(AVPacket *);
struct sc_recorder_stream {
int index;
int64_t last_pts;
};
struct sc_recorder { struct sc_recorder {
struct sc_packet_sink video_packet_sink; struct sc_packet_sink video_packet_sink;
struct sc_packet_sink audio_packet_sink; struct sc_packet_sink audio_packet_sink;
@ -27,6 +32,7 @@ struct sc_recorder {
* may access it without data races. * may access it without data races.
*/ */
bool audio; bool audio;
bool video;
char *filename; char *filename;
enum sc_record_format format; enum sc_record_format format;
@ -34,19 +40,18 @@ struct sc_recorder {
sc_thread thread; sc_thread thread;
sc_mutex mutex; sc_mutex mutex;
sc_cond queue_cond; sc_cond cond;
// set on sc_recorder_stop(), packet_sink close or recording failure // set on sc_recorder_stop(), packet_sink close or recording failure
bool stopped; bool stopped;
struct sc_recorder_queue video_queue; struct sc_recorder_queue video_queue;
struct sc_recorder_queue audio_queue; struct sc_recorder_queue audio_queue;
// wake up the recorder thread once the video or audio codec is known // wake up the recorder thread once the video or audio codec is known
sc_cond stream_cond;
bool video_init; bool video_init;
bool audio_init; bool audio_init;
int video_stream_index; struct sc_recorder_stream video_stream;
int audio_stream_index; struct sc_recorder_stream audio_stream;
const struct sc_recorder_callbacks *cbs; const struct sc_recorder_callbacks *cbs;
void *cbs_userdata; void *cbs_userdata;
@ -59,7 +64,7 @@ struct sc_recorder_callbacks {
bool bool
sc_recorder_init(struct sc_recorder *recorder, const char *filename, sc_recorder_init(struct sc_recorder *recorder, const char *filename,
enum sc_record_format format, bool audio, enum sc_record_format format, bool video, bool audio,
const struct sc_recorder_callbacks *cbs, void *cbs_userdata); const struct sc_recorder_callbacks *cbs, void *cbs_userdata);
bool bool


@ -35,6 +35,7 @@
#include "util/log.h" #include "util/log.h"
#include "util/net.h" #include "util/net.h"
#include "util/rand.h" #include "util/rand.h"
#include "util/timeout.h"
#ifdef HAVE_V4L2 #ifdef HAVE_V4L2
# include "v4l2_sink.h" # include "v4l2_sink.h"
#endif #endif
@ -73,6 +74,7 @@ struct scrcpy {
struct sc_hid_mouse mouse_hid; struct sc_hid_mouse mouse_hid;
#endif #endif
}; };
struct sc_timeout timeout;
}; };
static inline void static inline void
@ -137,7 +139,7 @@ sdl_set_hints(const char *render_driver) {
} }
static void static void
sdl_configure(bool display, bool disable_screensaver) { sdl_configure(bool video_playback, bool disable_screensaver) {
#ifdef _WIN32 #ifdef _WIN32
// Clean up properly on Ctrl+C on Windows // Clean up properly on Ctrl+C on Windows
bool ok = SetConsoleCtrlHandler(windows_ctrl_handler, TRUE); bool ok = SetConsoleCtrlHandler(windows_ctrl_handler, TRUE);
@ -146,7 +148,7 @@ sdl_configure(bool display, bool disable_screensaver) {
} }
#endif // _WIN32 #endif // _WIN32
if (!display) { if (!video_playback) {
return; return;
} }
@ -171,6 +173,9 @@ event_loop(struct scrcpy *s) {
case SC_EVENT_RECORDER_ERROR: case SC_EVENT_RECORDER_ERROR:
LOGE("Recorder error"); LOGE("Recorder error");
return SCRCPY_EXIT_FAILURE; return SCRCPY_EXIT_FAILURE;
case SC_EVENT_TIME_LIMIT_REACHED:
LOGI("Time limit reached");
return SCRCPY_EXIT_SUCCESS;
case SDL_QUIT: case SDL_QUIT:
LOGD("User requested to quit"); LOGD("User requested to quit");
return SCRCPY_EXIT_SUCCESS; return SCRCPY_EXIT_SUCCESS;
@ -247,7 +252,9 @@ sc_audio_demuxer_on_ended(struct sc_demuxer *demuxer,
// Contrary to the video demuxer, keep mirroring if only the audio fails // Contrary to the video demuxer, keep mirroring if only the audio fails
// (unless --require-audio is set). // (unless --require-audio is set).
if (status == SC_DEMUXER_STATUS_ERROR if (status == SC_DEMUXER_STATUS_EOS) {
PUSH_EVENT(SC_EVENT_DEVICE_DISCONNECTED);
} else if (status == SC_DEMUXER_STATUS_ERROR
|| (status == SC_DEMUXER_STATUS_DISABLED || (status == SC_DEMUXER_STATUS_DISABLED
&& options->require_audio)) { && options->require_audio)) {
PUSH_EVENT(SC_EVENT_DEMUXER_ERROR); PUSH_EVENT(SC_EVENT_DEMUXER_ERROR);
@ -280,6 +287,14 @@ sc_server_on_disconnected(struct sc_server *server, void *userdata) {
// event // event
} }
static void
sc_timeout_on_timeout(struct sc_timeout *timeout, void *userdata) {
(void) timeout;
(void) userdata;
PUSH_EVENT(SC_EVENT_TIME_LIMIT_REACHED);
}
// Generate a scrcpy id to differentiate multiple running scrcpy instances // Generate a scrcpy id to differentiate multiple running scrcpy instances
static uint32_t static uint32_t
scrcpy_generate_scid() { scrcpy_generate_scid() {
@ -321,6 +336,8 @@ scrcpy(struct scrcpy_options *options) {
bool controller_initialized = false; bool controller_initialized = false;
bool controller_started = false; bool controller_started = false;
bool screen_initialized = false; bool screen_initialized = false;
bool timeout_initialized = false;
bool timeout_started = false;
struct sc_acksync *acksync = NULL; struct sc_acksync *acksync = NULL;
@ -334,6 +351,7 @@ scrcpy(struct scrcpy_options *options) {
.log_level = options->log_level, .log_level = options->log_level,
.video_codec = options->video_codec, .video_codec = options->video_codec,
.audio_codec = options->audio_codec, .audio_codec = options->audio_codec,
.audio_source = options->audio_source,
.crop = options->crop, .crop = options->crop,
.port_range = options->port_range, .port_range = options->port_range,
.tunnel_host = options->tunnel_host, .tunnel_host = options->tunnel_host,
@ -345,6 +363,7 @@ scrcpy(struct scrcpy_options *options) {
.lock_video_orientation = options->lock_video_orientation, .lock_video_orientation = options->lock_video_orientation,
.control = options->control, .control = options->control,
.display_id = options->display_id, .display_id = options->display_id,
.video = options->video,
.audio = options->audio, .audio = options->audio,
.show_touches = options->show_touches, .show_touches = options->show_touches,
.stay_awake = options->stay_awake, .stay_awake = options->stay_awake,
@ -362,6 +381,7 @@ scrcpy(struct scrcpy_options *options) {
.power_on = options->power_on, .power_on = options->power_on,
.list_encoders = options->list_encoders, .list_encoders = options->list_encoders,
.list_displays = options->list_displays, .list_displays = options->list_displays,
.kill_adb_on_close = options->kill_adb_on_close,
}; };
static const struct sc_server_callbacks cbs = { static const struct sc_server_callbacks cbs = {
@ -385,24 +405,26 @@ scrcpy(struct scrcpy_options *options) {
goto end; goto end;
} }
if (options->display) { // playback implies capture
sdl_set_hints(options->render_driver); assert(!options->video_playback || options->video);
} assert(!options->audio_playback || options->audio);
// Initialize SDL video in addition if display is enabled if (options->video_playback) {
if (options->display) { sdl_set_hints(options->render_driver);
if (SDL_Init(SDL_INIT_VIDEO)) { if (SDL_Init(SDL_INIT_VIDEO)) {
LOGE("Could not initialize SDL video: %s", SDL_GetError()); LOGE("Could not initialize SDL video: %s", SDL_GetError());
goto end; goto end;
} }
}
if (options->audio && SDL_Init(SDL_INIT_AUDIO)) { if (options->audio_playback) {
if (SDL_Init(SDL_INIT_AUDIO)) {
LOGE("Could not initialize SDL audio: %s", SDL_GetError()); LOGE("Could not initialize SDL audio: %s", SDL_GetError());
goto end; goto end;
} }
} }
sdl_configure(options->display, options->disable_screensaver); sdl_configure(options->video_playback, options->disable_screensaver);
// Await for server without blocking Ctrl+C handling // Await for server without blocking Ctrl+C handling
bool connected; bool connected;
@ -428,7 +450,7 @@ scrcpy(struct scrcpy_options *options) {
struct sc_file_pusher *fp = NULL; struct sc_file_pusher *fp = NULL;
if (options->display && options->control) { if (options->video_playback && options->control) {
if (!sc_file_pusher_init(&s->file_pusher, serial, if (!sc_file_pusher_init(&s->file_pusher, serial,
options->push_target)) { options->push_target)) {
goto end; goto end;
@ -437,11 +459,13 @@ scrcpy(struct scrcpy_options *options) {
file_pusher_initialized = true; file_pusher_initialized = true;
} }
static const struct sc_demuxer_callbacks video_demuxer_cbs = { if (options->video) {
.on_ended = sc_video_demuxer_on_ended, static const struct sc_demuxer_callbacks video_demuxer_cbs = {
}; .on_ended = sc_video_demuxer_on_ended,
sc_demuxer_init(&s->video_demuxer, "video", s->server.video_socket, };
&video_demuxer_cbs, NULL); sc_demuxer_init(&s->video_demuxer, "video", s->server.video_socket,
&video_demuxer_cbs, NULL);
}
if (options->audio) { if (options->audio) {
static const struct sc_demuxer_callbacks audio_demuxer_cbs = { static const struct sc_demuxer_callbacks audio_demuxer_cbs = {
@ -451,8 +475,8 @@ scrcpy(struct scrcpy_options *options) {
&audio_demuxer_cbs, options); &audio_demuxer_cbs, options);
} }
bool needs_video_decoder = options->display; bool needs_video_decoder = options->video_playback;
bool needs_audio_decoder = options->audio && options->display; bool needs_audio_decoder = options->audio_playback;
#ifdef HAVE_V4L2 #ifdef HAVE_V4L2
needs_video_decoder |= !!options->v4l2_device; needs_video_decoder |= !!options->v4l2_device;
#endif #endif
@ -472,8 +496,8 @@ scrcpy(struct scrcpy_options *options) {
.on_ended = sc_recorder_on_ended, .on_ended = sc_recorder_on_ended,
}; };
if (!sc_recorder_init(&s->recorder, options->record_filename, if (!sc_recorder_init(&s->recorder, options->record_filename,
options->record_format, options->audio, options->record_format, options->video,
&recorder_cbs, NULL)) { options->audio, &recorder_cbs, NULL)) {
goto end; goto end;
} }
recorder_initialized = true; recorder_initialized = true;
@ -483,8 +507,10 @@ scrcpy(struct scrcpy_options *options) {
} }
recorder_started = true; recorder_started = true;
sc_packet_source_add_sink(&s->video_demuxer.packet_source, if (options->video) {
&s->recorder.video_packet_sink); sc_packet_source_add_sink(&s->video_demuxer.packet_source,
&s->recorder.video_packet_sink);
}
if (options->audio) { if (options->audio) {
sc_packet_source_add_sink(&s->audio_demuxer.packet_source, sc_packet_source_add_sink(&s->audio_demuxer.packet_source,
&s->recorder.audio_packet_sink); &s->recorder.audio_packet_sink);
@ -630,23 +656,12 @@ aoa_hid_end:
} }
controller_started = true; controller_started = true;
controller = &s->controller; controller = &s->controller;
if (options->turn_screen_off) {
struct sc_control_msg msg;
msg.type = SC_CONTROL_MSG_TYPE_SET_SCREEN_POWER_MODE;
msg.set_screen_power_mode.mode = SC_SCREEN_POWER_MODE_OFF;
if (!sc_controller_push_msg(&s->controller, &msg)) {
LOGW("Could not request 'set screen power mode'");
}
}
} }
// There is a controller if and only if control is enabled // There is a controller if and only if control is enabled
assert(options->control == !!controller); assert(options->control == !!controller);
if (options->display) { if (options->video_playback) {
const char *window_title = const char *window_title =
options->window_title ? options->window_title : info->device_name; options->window_title ? options->window_title : info->device_name;
@ -672,11 +687,6 @@ aoa_hid_end:
.start_fps_counter = options->start_fps_counter, .start_fps_counter = options->start_fps_counter,
}; };
if (!sc_screen_init(&s->screen, &screen_params)) {
goto end;
}
screen_initialized = true;
struct sc_frame_source *src = &s->video_decoder.frame_source; struct sc_frame_source *src = &s->video_decoder.frame_source;
if (options->display_buffer) { if (options->display_buffer) {
sc_delay_buffer_init(&s->display_buffer, options->display_buffer, sc_delay_buffer_init(&s->display_buffer, options->display_buffer,
@ -685,14 +695,19 @@ aoa_hid_end:
src = &s->display_buffer.frame_source; src = &s->display_buffer.frame_source;
} }
sc_frame_source_add_sink(src, &s->screen.frame_sink); if (!sc_screen_init(&s->screen, &screen_params)) {
goto end;
if (options->audio) {
sc_audio_player_init(&s->audio_player, options->audio_buffer,
options->audio_output_buffer);
sc_frame_source_add_sink(&s->audio_decoder.frame_source,
&s->audio_player.frame_sink);
} }
screen_initialized = true;
sc_frame_source_add_sink(src, &s->screen.frame_sink);
}
if (options->audio_playback) {
sc_audio_player_init(&s->audio_player, options->audio_buffer,
options->audio_output_buffer);
sc_frame_source_add_sink(&s->audio_decoder.frame_source,
&s->audio_player.frame_sink);
} }
#ifdef HAVE_V4L2 #ifdef HAVE_V4L2
@ -714,12 +729,15 @@ aoa_hid_end:
} }
#endif #endif
// now we consumed the header values, the socket receives the video stream // Now that the header values have been consumed, the socket(s) will
// start the video demuxer // receive the stream(s). Start the demuxer(s).
if (!sc_demuxer_start(&s->video_demuxer)) {
goto end; if (options->video) {
if (!sc_demuxer_start(&s->video_demuxer)) {
goto end;
}
video_demuxer_started = true;
} }
video_demuxer_started = true;
if (options->audio) { if (options->audio) {
if (!sc_demuxer_start(&s->audio_demuxer)) { if (!sc_demuxer_start(&s->audio_demuxer)) {
@ -728,6 +746,39 @@ aoa_hid_end:
audio_demuxer_started = true; audio_demuxer_started = true;
} }
// If the device screen is to be turned off, send the control message after
// everything is set up
if (options->control && options->turn_screen_off) {
struct sc_control_msg msg;
msg.type = SC_CONTROL_MSG_TYPE_SET_SCREEN_POWER_MODE;
msg.set_screen_power_mode.mode = SC_SCREEN_POWER_MODE_OFF;
if (!sc_controller_push_msg(&s->controller, &msg)) {
LOGW("Could not request 'set screen power mode'");
}
}
if (options->time_limit) {
bool ok = sc_timeout_init(&s->timeout);
if (!ok) {
goto end;
}
timeout_initialized = true;
sc_tick deadline = sc_tick_now() + options->time_limit;
static const struct sc_timeout_callbacks cbs = {
.on_timeout = sc_timeout_on_timeout,
};
ok = sc_timeout_start(&s->timeout, deadline, &cbs, NULL);
if (!ok) {
goto end;
}
timeout_started = true;
}
ret = event_loop(s); ret = event_loop(s);
LOGD("quit..."); LOGD("quit...");
@ -736,6 +787,10 @@ aoa_hid_end:
sc_screen_hide_window(&s->screen); sc_screen_hide_window(&s->screen);
end: end:
if (timeout_started) {
sc_timeout_stop(&s->timeout);
}
// The demuxer is not stopped explicitly, because it will stop by itself on // The demuxer is not stopped explicitly, because it will stop by itself on
// end-of-stream // end-of-stream
#ifdef HAVE_USB #ifdef HAVE_USB
@ -771,6 +826,13 @@ end:
sc_server_stop(&s->server); sc_server_stop(&s->server);
} }
if (timeout_started) {
sc_timeout_join(&s->timeout);
}
if (timeout_initialized) {
sc_timeout_destroy(&s->timeout);
}
// now that the sockets are shutdown, the demuxer and controller are // now that the sockets are shutdown, the demuxer and controller are
// interrupted, we can join them // interrupted, we can join them
if (video_demuxer_started) { if (video_demuxer_started) {


@ -56,6 +56,7 @@ static void
set_window_size(struct sc_screen *screen, struct sc_size new_size) { set_window_size(struct sc_screen *screen, struct sc_size new_size) {
assert(!screen->fullscreen); assert(!screen->fullscreen);
assert(!screen->maximized); assert(!screen->maximized);
assert(!screen->minimized);
SDL_SetWindowSize(screen->window, new_size.width, new_size.height); SDL_SetWindowSize(screen->window, new_size.width, new_size.height);
} }
@ -239,35 +240,6 @@ sc_screen_update_content_rect(struct sc_screen *screen) {
} }
} }
static bool
create_texture(struct sc_screen *screen) {
SDL_Renderer *renderer = screen->renderer;
struct sc_size size = screen->frame_size;
SDL_Texture *texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_YV12,
SDL_TEXTUREACCESS_STREAMING,
size.width, size.height);
if (!texture) {
LOGE("Could not create texture: %s", SDL_GetError());
return false;
}
if (screen->mipmaps) {
struct sc_opengl *gl = &screen->gl;
SDL_GL_BindTexture(texture, NULL, NULL);
// Enable trilinear filtering for downscaling
gl->TexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
GL_LINEAR_MIPMAP_LINEAR);
gl->TexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, -1.f);
SDL_GL_UnbindTexture(texture);
}
screen->texture = texture;
return true;
}
// render the texture to the renderer // render the texture to the renderer
// //
// Set the update_content_rect flag if the window or content size may have // Set the update_content_rect flag if the window or content size may have
@ -278,35 +250,11 @@ sc_screen_render(struct sc_screen *screen, bool update_content_rect) {
sc_screen_update_content_rect(screen); sc_screen_update_content_rect(screen);
} }
SDL_RenderClear(screen->renderer); enum sc_display_result res =
if (screen->rotation == 0) { sc_display_render(&screen->display, &screen->rect, screen->rotation);
SDL_RenderCopy(screen->renderer, screen->texture, NULL, &screen->rect); (void) res; // any error already logged
} else {
// rotation in RenderCopyEx() is clockwise, while screen->rotation is
// counterclockwise (to be consistent with --lock-video-orientation)
int cw_rotation = (4 - screen->rotation) % 4;
double angle = 90 * cw_rotation;
SDL_Rect *dstrect = NULL;
SDL_Rect rect;
if (screen->rotation & 1) {
rect.x = screen->rect.x + (screen->rect.w - screen->rect.h) / 2;
rect.y = screen->rect.y + (screen->rect.h - screen->rect.w) / 2;
rect.w = screen->rect.h;
rect.h = screen->rect.w;
dstrect = &rect;
} else {
assert(screen->rotation == 2);
dstrect = &screen->rect;
}
SDL_RenderCopyEx(screen->renderer, screen->texture, NULL, dstrect,
angle, NULL, 0);
}
SDL_RenderPresent(screen->renderer);
} }
#if defined(__APPLE__) || defined(__WINDOWS__) #if defined(__APPLE__) || defined(__WINDOWS__)
# define CONTINUOUS_RESIZING_WORKAROUND # define CONTINUOUS_RESIZING_WORKAROUND
#endif #endif
@ -412,6 +360,7 @@ sc_screen_init(struct sc_screen *screen,
screen->has_frame = false; screen->has_frame = false;
screen->fullscreen = false; screen->fullscreen = false;
screen->maximized = false; screen->maximized = false;
screen->minimized = false;
screen->mouse_capture_key_pressed = 0; screen->mouse_capture_key_pressed = 0;
screen->req.x = params->window_x; screen->req.x = params->window_x;
@ -453,46 +402,11 @@ sc_screen_init(struct sc_screen *screen,
goto error_destroy_fps_counter; goto error_destroy_fps_counter;
} }
screen->renderer = SDL_CreateRenderer(screen->window, -1, ok = sc_display_init(&screen->display, screen->window, params->mipmaps);
SDL_RENDERER_ACCELERATED); if (!ok) {
if (!screen->renderer) {
LOGE("Could not create renderer: %s", SDL_GetError());
goto error_destroy_window; goto error_destroy_window;
} }
SDL_RendererInfo renderer_info;
int r = SDL_GetRendererInfo(screen->renderer, &renderer_info);
const char *renderer_name = r ? NULL : renderer_info.name;
LOGI("Renderer: %s", renderer_name ? renderer_name : "(unknown)");
screen->mipmaps = false;
// starts with "opengl"
bool use_opengl = renderer_name && !strncmp(renderer_name, "opengl", 6);
if (use_opengl) {
struct sc_opengl *gl = &screen->gl;
sc_opengl_init(gl);
LOGI("OpenGL version: %s", gl->version);
if (params->mipmaps) {
bool supports_mipmaps =
sc_opengl_version_at_least(gl, 3, 0, /* OpenGL 3.0+ */
2, 0 /* OpenGL ES 2.0+ */);
if (supports_mipmaps) {
LOGI("Trilinear filtering enabled");
screen->mipmaps = true;
} else {
LOGW("Trilinear filtering disabled "
"(OpenGL 3.0+ or ES 2.0+ required)");
}
} else {
LOGI("Trilinear filtering disabled");
}
} else if (params->mipmaps) {
LOGD("Trilinear filtering disabled (not an OpenGL renderer)");
}
SDL_Surface *icon = scrcpy_icon_load(); SDL_Surface *icon = scrcpy_icon_load();
if (icon) { if (icon) {
SDL_SetWindowIcon(screen->window, icon); SDL_SetWindowIcon(screen->window, icon);
@ -504,7 +418,7 @@ sc_screen_init(struct sc_screen *screen,
screen->frame = av_frame_alloc(); screen->frame = av_frame_alloc();
if (!screen->frame) { if (!screen->frame) {
LOG_OOM(); LOG_OOM();
goto error_destroy_renderer; goto error_destroy_display;
} }
struct sc_input_manager_params im_params = { struct sc_input_manager_params im_params = {
@ -539,8 +453,8 @@ sc_screen_init(struct sc_screen *screen,
return true; return true;
error_destroy_renderer: error_destroy_display:
SDL_DestroyRenderer(screen->renderer); sc_display_destroy(&screen->display);
error_destroy_window: error_destroy_window:
SDL_DestroyWindow(screen->window); SDL_DestroyWindow(screen->window);
error_destroy_fps_counter: error_destroy_fps_counter:
@ -574,6 +488,7 @@ sc_screen_show_initial_window(struct sc_screen *screen) {
} }
SDL_ShowWindow(screen->window); SDL_ShowWindow(screen->window);
sc_screen_update_content_rect(screen);
} }
void void
@ -596,11 +511,8 @@ sc_screen_destroy(struct sc_screen *screen) {
#ifndef NDEBUG #ifndef NDEBUG
assert(!screen->open); assert(!screen->open);
#endif #endif
sc_display_destroy(&screen->display);
av_frame_free(&screen->frame); av_frame_free(&screen->frame);
if (screen->texture) {
SDL_DestroyTexture(screen->texture);
}
SDL_DestroyRenderer(screen->renderer);
SDL_DestroyWindow(screen->window); SDL_DestroyWindow(screen->window);
sc_fps_counter_destroy(&screen->fps_counter); sc_fps_counter_destroy(&screen->fps_counter);
sc_frame_buffer_destroy(&screen->fb); sc_frame_buffer_destroy(&screen->fb);
@ -622,11 +534,11 @@ resize_for_content(struct sc_screen *screen, struct sc_size old_content_size,
static void static void
set_content_size(struct sc_screen *screen, struct sc_size new_content_size) { set_content_size(struct sc_screen *screen, struct sc_size new_content_size) {
if (!screen->fullscreen && !screen->maximized) { if (!screen->fullscreen && !screen->maximized && !screen->minimized) {
resize_for_content(screen, screen->content_size, new_content_size); resize_for_content(screen, screen->content_size, new_content_size);
} else if (!screen->resize_pending) { } else if (!screen->resize_pending) {
// Store the windowed size to be able to compute the optimal size once // Store the windowed size to be able to compute the optimal size once
// fullscreen and maximized are disabled // fullscreen/maximized/minimized are disabled
screen->windowed_content_size = screen->content_size; screen->windowed_content_size = screen->content_size;
screen->resize_pending = true; screen->resize_pending = true;
} }
@ -638,6 +550,7 @@ static void
apply_pending_resize(struct sc_screen *screen) { apply_pending_resize(struct sc_screen *screen) {
assert(!screen->fullscreen); assert(!screen->fullscreen);
assert(!screen->maximized); assert(!screen->maximized);
assert(!screen->minimized);
if (screen->resize_pending) { if (screen->resize_pending) {
resize_for_content(screen, screen->windowed_content_size, resize_for_content(screen, screen->windowed_content_size,
screen->content_size); screen->content_size);
@ -667,7 +580,6 @@ static bool
sc_screen_init_size(struct sc_screen *screen) { sc_screen_init_size(struct sc_screen *screen) {
// Before first frame // Before first frame
assert(!screen->has_frame); assert(!screen->has_frame);
assert(!screen->texture);
// The requested size is passed via screen->frame_size // The requested size is passed via screen->frame_size
@ -675,48 +587,29 @@ sc_screen_init_size(struct sc_screen *screen) {
get_rotated_size(screen->frame_size, screen->rotation); get_rotated_size(screen->frame_size, screen->rotation);
screen->content_size = content_size; screen->content_size = content_size;
LOGI("Initial texture: %" PRIu16 "x%" PRIu16, enum sc_display_result res =
screen->frame_size.width, screen->frame_size.height); sc_display_set_texture_size(&screen->display, screen->frame_size);
return create_texture(screen); return res != SC_DISPLAY_RESULT_ERROR;
} }
// recreate the texture and resize the window if the frame size has changed // recreate the texture and resize the window if the frame size has changed
static bool static enum sc_display_result
prepare_for_frame(struct sc_screen *screen, struct sc_size new_frame_size) { prepare_for_frame(struct sc_screen *screen, struct sc_size new_frame_size) {
if (screen->frame_size.width != new_frame_size.width if (screen->frame_size.width == new_frame_size.width
|| screen->frame_size.height != new_frame_size.height) { && screen->frame_size.height == new_frame_size.height) {
// frame dimension changed, destroy texture return SC_DISPLAY_RESULT_OK;
SDL_DestroyTexture(screen->texture);
screen->frame_size = new_frame_size;
struct sc_size new_content_size =
get_rotated_size(new_frame_size, screen->rotation);
set_content_size(screen, new_content_size);
sc_screen_update_content_rect(screen);
LOGI("New texture: %" PRIu16 "x%" PRIu16,
screen->frame_size.width, screen->frame_size.height);
return create_texture(screen);
} }
return true; // frame dimension changed
} screen->frame_size = new_frame_size;
// write the frame into the texture struct sc_size new_content_size =
static void get_rotated_size(new_frame_size, screen->rotation);
update_texture(struct sc_screen *screen, const AVFrame *frame) { set_content_size(screen, new_content_size);
SDL_UpdateYUVTexture(screen->texture, NULL,
frame->data[0], frame->linesize[0],
frame->data[1], frame->linesize[1],
frame->data[2], frame->linesize[2]);
if (screen->mipmaps) { sc_screen_update_content_rect(screen);
SDL_GL_BindTexture(screen->texture, NULL, NULL);
screen->gl.GenerateMipmap(GL_TEXTURE_2D); return sc_display_set_texture_size(&screen->display, screen->frame_size);
SDL_GL_UnbindTexture(screen->texture);
}
} }
static bool static bool
@ -728,10 +621,23 @@ sc_screen_update_frame(struct sc_screen *screen) {
sc_fps_counter_add_rendered_frame(&screen->fps_counter); sc_fps_counter_add_rendered_frame(&screen->fps_counter);
struct sc_size new_frame_size = {frame->width, frame->height}; struct sc_size new_frame_size = {frame->width, frame->height};
if (!prepare_for_frame(screen, new_frame_size)) { enum sc_display_result res = prepare_for_frame(screen, new_frame_size);
if (res == SC_DISPLAY_RESULT_ERROR) {
return false; return false;
} }
update_texture(screen, frame); if (res == SC_DISPLAY_RESULT_PENDING) {
// Not an error, but do not continue
return true;
}
res = sc_display_update_texture(&screen->display, frame);
if (res == SC_DISPLAY_RESULT_ERROR) {
return false;
}
if (res == SC_DISPLAY_RESULT_PENDING) {
// Not an error, but do not continue
return true;
}
if (!screen->has_frame) { if (!screen->has_frame) {
screen->has_frame = true; screen->has_frame = true;
@ -757,7 +663,7 @@ sc_screen_switch_fullscreen(struct sc_screen *screen) {
} }
screen->fullscreen = !screen->fullscreen; screen->fullscreen = !screen->fullscreen;
if (!screen->fullscreen && !screen->maximized) { if (!screen->fullscreen && !screen->maximized && !screen->minimized) {
apply_pending_resize(screen); apply_pending_resize(screen);
} }
@ -767,7 +673,7 @@ sc_screen_switch_fullscreen(struct sc_screen *screen) {
void void
sc_screen_resize_to_fit(struct sc_screen *screen) { sc_screen_resize_to_fit(struct sc_screen *screen) {
if (screen->fullscreen || screen->maximized) { if (screen->fullscreen || screen->maximized || screen->minimized) {
return; return;
} }
@ -791,7 +697,7 @@ sc_screen_resize_to_fit(struct sc_screen *screen) {
void void
sc_screen_resize_to_pixel_perfect(struct sc_screen *screen) { sc_screen_resize_to_pixel_perfect(struct sc_screen *screen) {
if (screen->fullscreen) { if (screen->fullscreen || screen->minimized) {
return; return;
} }
@ -812,7 +718,7 @@ sc_screen_is_mouse_capture_key(SDL_Keycode key) {
} }
bool bool
sc_screen_handle_event(struct sc_screen *screen, SDL_Event *event) { sc_screen_handle_event(struct sc_screen *screen, const SDL_Event *event) {
bool relative_mode = sc_screen_is_relative_mode(screen); bool relative_mode = sc_screen_is_relative_mode(screen);
switch (event->type) { switch (event->type) {
@ -848,6 +754,9 @@ sc_screen_handle_event(struct sc_screen *screen, SDL_Event *event) {
case SDL_WINDOWEVENT_MAXIMIZED: case SDL_WINDOWEVENT_MAXIMIZED:
screen->maximized = true; screen->maximized = true;
break; break;
case SDL_WINDOWEVENT_MINIMIZED:
screen->minimized = true;
break;
case SDL_WINDOWEVENT_RESTORED: case SDL_WINDOWEVENT_RESTORED:
if (screen->fullscreen) { if (screen->fullscreen) {
// On Windows, in maximized+fullscreen, disabling // On Windows, in maximized+fullscreen, disabling
@ -858,6 +767,7 @@ sc_screen_handle_event(struct sc_screen *screen, SDL_Event *event) {
break; break;
} }
screen->maximized = false; screen->maximized = false;
screen->minimized = false;
apply_pending_resize(screen); apply_pending_resize(screen);
sc_screen_render(screen, true); sc_screen_render(screen, true);
break; break;
@ -939,6 +849,8 @@ sc_screen_convert_drawable_to_frame_coords(struct sc_screen *screen,
int32_t w = screen->content_size.width; int32_t w = screen->content_size.width;
int32_t h = screen->content_size.height; int32_t h = screen->content_size.height;
// screen->rect must be initialized to avoid a division by zero
assert(screen->rect.w && screen->rect.h);
x = (int64_t) (x - screen->rect.x) * w / screen->rect.w; x = (int64_t) (x - screen->rect.x) * w / screen->rect.w;
y = (int64_t) (y - screen->rect.y) * h / screen->rect.h; y = (int64_t) (y - screen->rect.y) * h / screen->rect.h;


@ -9,6 +9,7 @@
#include "controller.h" #include "controller.h"
#include "coords.h" #include "coords.h"
#include "display.h"
#include "fps_counter.h" #include "fps_counter.h"
#include "frame_buffer.h" #include "frame_buffer.h"
#include "input_manager.h" #include "input_manager.h"
@ -24,6 +25,7 @@ struct sc_screen {
bool open; // track the open/close state to assert correct behavior bool open; // track the open/close state to assert correct behavior
#endif #endif
struct sc_display display;
struct sc_input_manager im; struct sc_input_manager im;
struct sc_frame_buffer fb; struct sc_frame_buffer fb;
struct sc_fps_counter fps_counter; struct sc_fps_counter fps_counter;
@ -39,9 +41,6 @@ struct sc_screen {
} req; } req;
SDL_Window *window; SDL_Window *window;
SDL_Renderer *renderer;
SDL_Texture *texture;
struct sc_opengl gl;
struct sc_size frame_size; struct sc_size frame_size;
struct sc_size content_size; // rotated frame_size struct sc_size content_size; // rotated frame_size
@ -57,7 +56,7 @@ struct sc_screen {
bool has_frame; bool has_frame;
bool fullscreen; bool fullscreen;
bool maximized; bool maximized;
bool mipmaps; bool minimized;
// To enable/disable mouse capture, a mouse capture key (LALT, LGUI or // To enable/disable mouse capture, a mouse capture key (LALT, LGUI or
// RGUI) must be pressed. This variable tracks the pressed capture key. // RGUI) must be pressed. This variable tracks the pressed capture key.
@ -137,7 +136,7 @@ sc_screen_set_rotation(struct sc_screen *screen, unsigned rotation);
// react to SDL events // react to SDL events
// If this function returns false, scrcpy must exit with an error. // If this function returns false, scrcpy must exit with an error.
bool bool
sc_screen_handle_event(struct sc_screen *screen, SDL_Event *event); sc_screen_handle_event(struct sc_screen *screen, const SDL_Event *event);
// convert point from window coordinates to frame coordinates // convert point from window coordinates to frame coordinates
// x and y are expressed in pixels // x and y are expressed in pixels


@ -226,12 +226,16 @@ execute_server(struct sc_server *server,
ADD_PARAM("scid=%08x", params->scid); ADD_PARAM("scid=%08x", params->scid);
ADD_PARAM("log_level=%s", log_level_to_server_string(params->log_level)); ADD_PARAM("log_level=%s", log_level_to_server_string(params->log_level));
if (!params->video) {
ADD_PARAM("video=false");
}
if (params->video_bit_rate) { if (params->video_bit_rate) {
ADD_PARAM("video_bit_rate=%" PRIu32, params->video_bit_rate); ADD_PARAM("video_bit_rate=%" PRIu32, params->video_bit_rate);
} }
if (!params->audio) { if (!params->audio) {
ADD_PARAM("audio=false"); ADD_PARAM("audio=false");
} else if (params->audio_bit_rate) { }
if (params->audio_bit_rate) {
ADD_PARAM("audio_bit_rate=%" PRIu32, params->audio_bit_rate); ADD_PARAM("audio_bit_rate=%" PRIu32, params->audio_bit_rate);
} }
if (params->video_codec != SC_CODEC_H264) { if (params->video_codec != SC_CODEC_H264) {
@ -242,6 +246,10 @@ execute_server(struct sc_server *server,
ADD_PARAM("audio_codec=%s", ADD_PARAM("audio_codec=%s",
sc_server_get_codec_name(params->audio_codec)); sc_server_get_codec_name(params->audio_codec));
} }
if (params->audio_source != SC_AUDIO_SOURCE_OUTPUT) {
assert(params->audio_source == SC_AUDIO_SOURCE_MIC);
ADD_PARAM("audio_source=mic");
}
if (params->max_size) { if (params->max_size) {
ADD_PARAM("max_size=%" PRIu16, params->max_size); ADD_PARAM("max_size=%" PRIu16, params->max_size);
} }
@ -463,6 +471,7 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
const char *serial = server->serial; const char *serial = server->serial;
assert(serial); assert(serial);
bool video = server->params.video;
bool audio = server->params.audio; bool audio = server->params.audio;
bool control = server->params.control; bool control = server->params.control;
@ -470,9 +479,12 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
sc_socket audio_socket = SC_SOCKET_NONE; sc_socket audio_socket = SC_SOCKET_NONE;
sc_socket control_socket = SC_SOCKET_NONE; sc_socket control_socket = SC_SOCKET_NONE;
if (!tunnel->forward) { if (!tunnel->forward) {
video_socket = net_accept_intr(&server->intr, tunnel->server_socket); if (video) {
if (video_socket == SC_SOCKET_NONE) { video_socket =
goto fail; net_accept_intr(&server->intr, tunnel->server_socket);
if (video_socket == SC_SOCKET_NONE) {
goto fail;
}
} }
if (audio) { if (audio) {
@ -503,35 +515,45 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
unsigned attempts = 100; unsigned attempts = 100;
sc_tick delay = SC_TICK_FROM_MS(100); sc_tick delay = SC_TICK_FROM_MS(100);
video_socket = connect_to_server(server, attempts, delay, tunnel_host, sc_socket first_socket = connect_to_server(server, attempts, delay,
tunnel_port); tunnel_host, tunnel_port);
if (video_socket == SC_SOCKET_NONE) { if (first_socket == SC_SOCKET_NONE) {
goto fail; goto fail;
} }
if (video) {
video_socket = first_socket;
}
if (audio) { if (audio) {
audio_socket = net_socket(); if (!video) {
if (audio_socket == SC_SOCKET_NONE) { audio_socket = first_socket;
goto fail; } else {
} audio_socket = net_socket();
bool ok = net_connect_intr(&server->intr, audio_socket, tunnel_host, if (audio_socket == SC_SOCKET_NONE) {
tunnel_port); goto fail;
if (!ok) { }
goto fail; bool ok = net_connect_intr(&server->intr, audio_socket,
tunnel_host, tunnel_port);
if (!ok) {
goto fail;
}
} }
} }
if (control) { if (control) {
// we know that the device is listening, we don't need several if (!video && !audio) {
// attempts control_socket = first_socket;
control_socket = net_socket(); } else {
if (control_socket == SC_SOCKET_NONE) { control_socket = net_socket();
goto fail; if (control_socket == SC_SOCKET_NONE) {
} goto fail;
bool ok = net_connect_intr(&server->intr, control_socket, }
tunnel_host, tunnel_port); bool ok = net_connect_intr(&server->intr, control_socket,
if (!ok) { tunnel_host, tunnel_port);
goto fail; if (!ok) {
goto fail;
}
} }
} }
} }
@ -540,13 +562,17 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
sc_adb_tunnel_close(tunnel, &server->intr, serial, sc_adb_tunnel_close(tunnel, &server->intr, serial,
server->device_socket_name); server->device_socket_name);
sc_socket first_socket = video ? video_socket
: audio ? audio_socket
: control_socket;
// The sockets will be closed on stop if device_read_info() fails // The sockets will be closed on stop if device_read_info() fails
bool ok = device_read_info(&server->intr, video_socket, info); bool ok = device_read_info(&server->intr, first_socket, info);
if (!ok) { if (!ok) {
goto fail; goto fail;
} }
assert(video_socket != SC_SOCKET_NONE); assert(!video || video_socket != SC_SOCKET_NONE);
assert(!audio || audio_socket != SC_SOCKET_NONE); assert(!audio || audio_socket != SC_SOCKET_NONE);
assert(!control || control_socket != SC_SOCKET_NONE); assert(!control || control_socket != SC_SOCKET_NONE);
@ -768,6 +794,15 @@ sc_server_configure_tcpip_unknown_address(struct sc_server *server,
return sc_server_connect_to_tcpip(server, ip_port); return sc_server_connect_to_tcpip(server, ip_port);
} }
static void
sc_server_kill_adb_if_requested(struct sc_server *server) {
if (server->params.kill_adb_on_close) {
LOGI("Killing adb server...");
unsigned flags = SC_ADB_NO_STDOUT | SC_ADB_NO_STDERR | SC_ADB_NO_LOGERR;
sc_adb_kill_server(&server->intr, flags);
}
}
static int static int
run_server(void *data) { run_server(void *data) {
struct sc_server *server = data; struct sc_server *server = data;
@ -779,7 +814,7 @@ run_server(void *data) {
// is parsed, so it is not output) // is parsed, so it is not output)
bool ok = sc_adb_start_server(&server->intr, 0); bool ok = sc_adb_start_server(&server->intr, 0);
if (!ok) { if (!ok) {
LOGE("Could not start adb daemon"); LOGE("Could not start adb server");
goto error_connection_failed; goto error_connection_failed;
} }
@ -930,8 +965,11 @@ run_server(void *data) {
sc_mutex_unlock(&server->mutex); sc_mutex_unlock(&server->mutex);
// Interrupt sockets to wake up socket blocking calls on the server // Interrupt sockets to wake up socket blocking calls on the server
assert(server->video_socket != SC_SOCKET_NONE);
net_interrupt(server->video_socket); if (server->video_socket != SC_SOCKET_NONE) {
// There is no video_socket if --no-video is set
net_interrupt(server->video_socket);
}
if (server->audio_socket != SC_SOCKET_NONE) { if (server->audio_socket != SC_SOCKET_NONE) {
// There is no audio_socket if --no-audio is set // There is no audio_socket if --no-audio is set
@ -964,9 +1002,12 @@ run_server(void *data) {
sc_process_close(pid); sc_process_close(pid);
sc_server_kill_adb_if_requested(server);
return 0; return 0;
error_connection_failed: error_connection_failed:
sc_server_kill_adb_if_requested(server);
server->cbs->on_connection_failed(server, server->cbs_userdata); server->cbs->on_connection_failed(server, server->cbs_userdata);
return -1; return -1;
} }


@ -26,6 +26,7 @@ struct sc_server_params {
enum sc_log_level log_level; enum sc_log_level log_level;
enum sc_codec video_codec; enum sc_codec video_codec;
enum sc_codec audio_codec; enum sc_codec audio_codec;
enum sc_audio_source audio_source;
const char *crop; const char *crop;
const char *video_codec_options; const char *video_codec_options;
const char *audio_codec_options; const char *audio_codec_options;
@ -41,6 +42,7 @@ struct sc_server_params {
int8_t lock_video_orientation; int8_t lock_video_orientation;
bool control; bool control;
uint32_t display_id; uint32_t display_id;
bool video;
bool audio; bool audio;
bool show_touches; bool show_touches;
bool stay_awake; bool stay_awake;
@ -56,6 +58,7 @@ struct sc_server_params {
bool power_on; bool power_on;
bool list_encoders; bool list_encoders;
bool list_displays; bool list_displays;
bool kill_adb_on_close;
}; };
struct sc_server { struct sc_server {


@ -83,7 +83,7 @@ scrcpy_otg(struct scrcpy_options *options) {
#ifdef _WIN32 #ifdef _WIN32
// On Windows, only one process could open a USB device // On Windows, only one process could open a USB device
// <https://github.com/Genymobile/scrcpy/issues/2773> // <https://github.com/Genymobile/scrcpy/issues/2773>
LOGI("Killing adb daemon (if any)..."); LOGI("Killing adb server (if any)...");
unsigned flags = SC_ADB_NO_STDOUT | SC_ADB_NO_STDERR | SC_ADB_NO_LOGERR; unsigned flags = SC_ADB_NO_STDOUT | SC_ADB_NO_STDERR | SC_ADB_NO_LOGERR;
// uninterruptible (intr == NULL), but in practice it's very quick // uninterruptible (intr == NULL), but in practice it's very quick
sc_adb_kill_server(NULL, flags); sc_adb_kill_server(NULL, flags);
@ -105,10 +105,6 @@ scrcpy_otg(struct scrcpy_options *options) {
usb_device_initialized = true; usb_device_initialized = true;
LOGI("USB device: %s (%04x:%04x) %s %s", usb_device.serial,
(unsigned) usb_device.vid, (unsigned) usb_device.pid,
usb_device.manufacturer, usb_device.product);
ok = sc_usb_connect(&s->usb, usb_device.device, &cbs, NULL); ok = sc_usb_connect(&s->usb, usb_device.device, &cbs, NULL);
if (!ok) { if (!ok) {
goto end; goto end;


@ -213,8 +213,8 @@ sc_usb_select_device(struct sc_usb *usb, const char *serial,
assert(sel_count == 1); // sel_idx is valid only if sel_count == 1 assert(sel_count == 1); // sel_idx is valid only if sel_count == 1
struct sc_usb_device *device = &vec.data[sel_idx]; struct sc_usb_device *device = &vec.data[sel_idx];
LOGD("USB device found:"); LOGI("USB device found:");
sc_usb_devices_log(SC_LOG_LEVEL_DEBUG, vec.data, vec.size); sc_usb_devices_log(SC_LOG_LEVEL_INFO, vec.data, vec.size);
// Move device into out_device (do not destroy device) // Move device into out_device (do not destroy device)
sc_usb_device_move(out_device, device); sc_usb_device_move(out_device, device);

app/src/util/timeout.c (new file)

@ -0,0 +1,77 @@
#include "timeout.h"
#include <assert.h>
#include "log.h"
bool
sc_timeout_init(struct sc_timeout *timeout) {
bool ok = sc_mutex_init(&timeout->mutex);
if (!ok) {
return false;
}
ok = sc_cond_init(&timeout->cond);
if (!ok) {
return false;
}
timeout->stopped = false;
return true;
}
static int
run_timeout(void *data) {
struct sc_timeout *timeout = data;
sc_tick deadline = timeout->deadline;
sc_mutex_lock(&timeout->mutex);
bool timed_out = false;
while (!timeout->stopped && !timed_out) {
timed_out = !sc_cond_timedwait(&timeout->cond, &timeout->mutex,
deadline);
}
sc_mutex_unlock(&timeout->mutex);
timeout->cbs->on_timeout(timeout, timeout->cbs_userdata);
return 0;
}
bool
sc_timeout_start(struct sc_timeout *timeout, sc_tick deadline,
const struct sc_timeout_callbacks *cbs, void *cbs_userdata) {
bool ok = sc_thread_create(&timeout->thread, run_timeout, "scrcpy-timeout",
timeout);
if (!ok) {
LOGE("Timeout: could not start thread");
return false;
}
timeout->deadline = deadline;
assert(cbs && cbs->on_timeout);
timeout->cbs = cbs;
timeout->cbs_userdata = cbs_userdata;
return true;
}
void
sc_timeout_stop(struct sc_timeout *timeout) {
sc_mutex_lock(&timeout->mutex);
timeout->stopped = true;
sc_mutex_unlock(&timeout->mutex);
}
void
sc_timeout_join(struct sc_timeout *timeout) {
sc_thread_join(&timeout->thread, NULL);
}
void
sc_timeout_destroy(struct sc_timeout *timeout) {
sc_mutex_destroy(&timeout->mutex);
sc_cond_destroy(&timeout->cond);
}

app/src/util/timeout.h (new file)

@ -0,0 +1,43 @@
#ifndef SC_TIMEOUT_H
#define SC_TIMEOUT_H
#include "common.h"
#include <stdbool.h>
#include "thread.h"
#include "tick.h"
struct sc_timeout {
sc_thread thread;
sc_tick deadline;
sc_mutex mutex;
sc_cond cond;
bool stopped;
const struct sc_timeout_callbacks *cbs;
void *cbs_userdata;
};
struct sc_timeout_callbacks {
void (*on_timeout)(struct sc_timeout *timeout, void *userdata);
};
bool
sc_timeout_init(struct sc_timeout *timeout);
void
sc_timeout_destroy(struct sc_timeout *timeout);
bool
sc_timeout_start(struct sc_timeout *timeout, sc_tick deadline,
const struct sc_timeout_callbacks *cbs, void *cbs_userdata);
void
sc_timeout_stop(struct sc_timeout *timeout);
void
sc_timeout_join(struct sc_timeout *timeout);
#endif
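
For reference, a hedged usage sketch of this new API, mirroring how `scrcpy.c` consumes it for `--time-limit` in the hunks above (the helper names `start_time_limit()` and `on_timeout()` are hypothetical, not part of the diff):

```c
// Minimal sketch, assuming the sc_timeout API declared above.
#include "util/timeout.h"

static void
on_timeout(struct sc_timeout *timeout, void *userdata) {
    (void) timeout;
    (void) userdata;
    // scrcpy pushes an SC_EVENT_TIME_LIMIT_REACHED event to its event loop here
}

static bool
start_time_limit(struct sc_timeout *timeout, sc_tick time_limit) {
    if (!sc_timeout_init(timeout)) {
        return false;
    }

    static const struct sc_timeout_callbacks cbs = {
        .on_timeout = on_timeout,
    };

    // The deadline is absolute, computed from the current time
    sc_tick deadline = sc_tick_now() + time_limit;
    if (!sc_timeout_start(timeout, deadline, &cbs, NULL)) {
        sc_timeout_destroy(timeout);
        return false;
    }

    return true;
}
```

Teardown follows the order used in `scrcpy.c`: `sc_timeout_stop()`, then `sc_timeout_join()`, then `sc_timeout_destroy()`. Note that the callback only pushes an event; the actual handling happens in the main event loop (`SC_EVENT_TIME_LIMIT_REACHED` above).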


@ -53,7 +53,7 @@ static void test_options(void) {
"--max-size", "1024", "--max-size", "1024",
"--lock-video-orientation=2", // optional arguments require '=' "--lock-video-orientation=2", // optional arguments require '='
// "--no-control" is not compatible with "--turn-screen-off" // "--no-control" is not compatible with "--turn-screen-off"
// "--no-display" is not compatible with "--fulscreen" // "--no-playback" is not compatible with "--fulscreen"
"--port", "1234:1236", "--port", "1234:1236",
"--push-target", "/sdcard/Movies", "--push-target", "/sdcard/Movies",
"--record", "file", "--record", "file",
@ -108,8 +108,8 @@ static void test_options2(void) {
char *argv[] = { char *argv[] = {
"scrcpy", "scrcpy",
"--no-control", "--no-control",
"--no-display", "--no-playback",
"--record", "file.mp4", // cannot enable --no-display without recording "--record", "file.mp4", // cannot enable --no-playback without recording
}; };
bool ok = scrcpy_parse_args(&args, ARRAY_LEN(argv), argv); bool ok = scrcpy_parse_args(&args, ARRAY_LEN(argv), argv);
@ -117,7 +117,8 @@ static void test_options2(void) {
const struct scrcpy_options *opts = &args.opts; const struct scrcpy_options *opts = &args.opts;
assert(!opts->control); assert(!opts->control);
assert(!opts->display); assert(!opts->video_playback);
assert(!opts->audio_playback);
assert(!strcmp(opts->record_filename, "file.mp4")); assert(!strcmp(opts->record_filename, "file.mp4"));
assert(opts->record_format == SC_RECORD_FORMAT_MP4); assert(opts->record_format == SC_RECORD_FORMAT_MP4);
} }


@ -16,6 +16,6 @@ cpu = 'i686'
endian = 'little' endian = 'little'
[properties] [properties]
prebuilt_ffmpeg = 'ffmpeg-6.0-scrcpy-2/win32' prebuilt_ffmpeg = 'ffmpeg-6.0-scrcpy-4/win32'
prebuilt_sdl2 = 'SDL2-2.26.4/i686-w64-mingw32' prebuilt_sdl2 = 'SDL2-2.28.0/i686-w64-mingw32'
prebuilt_libusb = 'libusb-1.0.26/libusb-MinGW-Win32' prebuilt_libusb = 'libusb-1.0.26/libusb-MinGW-Win32'


@ -16,6 +16,6 @@ cpu = 'x86_64'
endian = 'little' endian = 'little'
[properties] [properties]
prebuilt_ffmpeg = 'ffmpeg-6.0-scrcpy-2/win64' prebuilt_ffmpeg = 'ffmpeg-6.0-scrcpy-4/win64'
prebuilt_sdl2 = 'SDL2-2.26.4/x86_64-w64-mingw32' prebuilt_sdl2 = 'SDL2-2.28.0/x86_64-w64-mingw32'
prebuilt_libusb = 'libusb-1.0.26/libusb-MinGW-x64' prebuilt_libusb = 'libusb-1.0.26/libusb-MinGW-x64'


@ -24,6 +24,42 @@ To disable audio:

scrcpy --no-audio
```

To disable only the audio playback, see [no playback](video.md#no-playback).

## Audio only

To play audio only, disable the video:

```bash
scrcpy --no-video
# interrupt with Ctrl+C
```

Without video, the audio latency is typically not critical, so it might be
interesting to add [buffering](#buffering) to minimize glitches:

```
scrcpy --no-video --audio-buffer=200
```

## Source

By default, the device audio output is forwarded.

It is possible to capture the device microphone instead:

```
scrcpy --audio-source=mic
```

For example, to use the device as a dictaphone and record a capture directly on
the computer:

```
scrcpy --audio-source=mic --no-video --no-playback --record=file.opus
```

## Codec

The audio codec can be selected. The possible values are `opus` (default), `aac`
@ -35,6 +71,20 @@ scrcpy --audio-codec=aac
scrcpy --audio-codec=raw
```

In particular, if you get the following error:

> Failed to initialize audio/opus, error 0xfffffffe

then your device has no Opus encoder: try `scrcpy --audio-codec=aac`.

For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--audio-codec-options` in the manpage or in `scrcpy --help`.

[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat

## Encoder

Several encoders may be available on the device. They can be listed by:

```bash
@ -43,19 +93,14 @@ scrcpy --list-encoders

To select a specific encoder:

```bash
scrcpy --audio-codec=opus --audio-encoder='c2.android.opus.encoder'
```

## Bit rate

The default audio bit rate is 128Kbps. To change it:

```bash
scrcpy --audio-bit-rate=64K


@ -77,7 +77,7 @@ pip3 install meson
sudo dnf install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm

# client build dependencies
sudo dnf install SDL2-devel ffms2-devel libusb1-devel meson gcc make

# server build dependencies
sudo dnf install java-devel

@ -233,10 +233,10 @@ install` must be run as root)._

#### Option 2: Use prebuilt server

- [`scrcpy-server-v2.1.1`][direct-scrcpy-server]
  <sub>SHA-256: `9558db6c56743a1dc03b38f59801fb40e91cc891f8fc0c89e5b0b067761f148e`</sub>

[direct-scrcpy-server]: https://github.com/Genymobile/scrcpy/releases/download/v2.1.1/scrcpy-server-v2.1.1

Download the prebuilt server somewhere, and specify its path during the Meson
configuration:

doc/connection.md (new file)

@ -0,0 +1,125 @@
# Connection
## Selection
If exactly one device is connected (i.e. listed by `adb devices`), then it is
automatically selected.
However, if there are multiple devices connected, you must specify the one to
use in one of 4 ways:
- by its serial:
```bash
scrcpy --serial=0123456789abcdef
scrcpy -s 0123456789abcdef # short version
# the serial is the ip:port if connected over TCP/IP (same behavior as adb)
scrcpy --serial=192.168.1.1:5555
```
- the one connected over USB (if there is exactly one):
```bash
scrcpy --select-usb
scrcpy -d # short version
```
- the one connected over TCP/IP (if there is exactly one):
```bash
scrcpy --select-tcpip
scrcpy -e # short version
```
- a device already listening on TCP/IP (see [below](#tcpip-wireless)):
```bash
scrcpy --tcpip=192.168.1.1:5555
scrcpy --tcpip=192.168.1.1 # default port is 5555
```
The serial may also be provided via the environment variable `ANDROID_SERIAL`
(also used by `adb`):
```bash
# in bash
export ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```cmd
:: in cmd
set ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```powershell
# in PowerShell
$env:ANDROID_SERIAL = '0123456789abcdef'
scrcpy
```
## TCP/IP (wireless)
_Scrcpy_ uses `adb` to communicate with the device, and `adb` can [connect] to a
device over TCP/IP. The device must be connected on the same network as the
computer.
[connect]: https://developer.android.com/studio/command-line/adb.html#wireless
### Automatic
The `--tcpip` option allows the connection to be configured automatically. There
are two variants.
If the device (accessible at 192.168.1.1 in this example) already listens on a
port (typically 5555) for incoming _adb_ connections, then run:
```bash
scrcpy --tcpip=192.168.1.1 # default port is 5555
scrcpy --tcpip=192.168.1.1:5555
```
If _adb_ TCP/IP mode is disabled on the device (or if you don't know the IP
address), connect the device over USB, then run:
```bash
scrcpy --tcpip # without arguments
```
It will automatically find the device IP address and adb port, enable TCP/IP
mode if necessary, then connect to the device before starting.
### Manual
Alternatively, it is possible to enable the TCP/IP connection manually using
`adb`:
1. Plug the device into a USB port on your computer.
2. Connect the device to the same Wi-Fi network as your computer.
3. Get your device IP address, in Settings → About phone → Status, or by
executing this command:
```bash
adb shell ip route | awk '{print $9}'
```
4. Enable `adb` over TCP/IP on your device: `adb tcpip 5555`.
5. Unplug your device.
6. Connect to your device: `adb connect DEVICE_IP:5555` _(replace `DEVICE_IP`
with the device IP address you found)_.
7. Run `scrcpy` as usual.
8. Run `adb disconnect` once you're done.
Since Android 11, a [wireless debugging option][adb-wireless] lets you bypass
having to physically connect your device directly to your computer.
[adb-wireless]: https://developer.android.com/studio/command-line/adb#wireless-android11-command-line
## Autostart
A small tool (by the scrcpy author), [AutoAdb], runs arbitrary commands whenever
a new Android device is connected. It can be used to start scrcpy:
```bash
autoadb scrcpy -s '{}'
```
[AutoAdb]: https://github.com/rom1v/autoadb


@ -9,16 +9,52 @@ This application is composed of two parts:
The client is responsible for pushing the server to the device and starting its
execution.

The client and the server establish communication using separate sockets for
video, audio and controls. Any of them may be disabled (but not all), so
there are 1, 2 or 3 socket(s).

The server initially sends the device name on the first socket (it is used for
the scrcpy window title), then each socket is used for its own purpose. All
reads and writes are performed from a dedicated thread for each socket, both on
the client and on the server.
If video is enabled, then the server sends a raw video stream (H.264 by default)
of the device screen, with some additional headers for each packet. The client
decodes the video frames, and displays them as soon as possible, without
buffering (unless `--display-buffer=delay` is specified) to minimize latency.
The client is not aware of the device rotation (which is handled by the server);
it just knows the dimensions of the video frames it receives.
Similarly, if audio is enabled, then the server sends a raw audio stream (OPUS
by default) of the device audio output (or the microphone if
`--audio-source=mic` is specified), with some additional headers for each
packet. The client decodes the stream and attempts to keep latency minimal by
maintaining an average buffering level. The [blog post][scrcpy2] about the
scrcpy v2.0 release gives more details about the audio feature.
If control is enabled, then the client captures relevant keyboard and mouse
events, which it transmits to the server, which injects them into the device.
This is the only socket used in both directions: input events are sent from the
client to the device, and when the device clipboard changes, the new content is
sent from the device to the client to support seamless copy-paste.

[scrcpy2]: https://blog.rom1v.com/2023/03/scrcpy-2-0-with-audio/
Note that the client-server roles are expressed at the application level:

- the server _serves_ video and audio streams, and handles requests from the
  client,
- the client _controls_ the device through the server.

However, by default (when `--force-adb-forward` is not set), the roles are
reversed at the network level:

- the client opens a server socket and listens on a port before starting the
  server,
- the server connects to the client.

This role inversion guarantees that the connection will not fail due to race
conditions, without requiring polling.
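
To make this inversion concrete, here is a hedged sketch (a hypothetical `accept_streams()` helper, adapted from the `sc_server_connect_to()` hunk earlier on this page; the real code interleaves error handling and also covers the `--force-adb-forward` case, where the client connects to the device instead):

```c
// Sketch only: with the default "adb reverse" tunnel, the client listens on a
// local socket and the device-side server connects back, one connection per
// enabled stream, in the order video, audio, control.
#include "util/net_intr.h"

static bool
accept_streams(struct sc_intr *intr, sc_socket listen_socket,
               bool video, bool audio, bool control,
               sc_socket *video_socket, sc_socket *audio_socket,
               sc_socket *control_socket) {
    if (video) {
        *video_socket = net_accept_intr(intr, listen_socket);
        if (*video_socket == SC_SOCKET_NONE) {
            return false;
        }
    }
    if (audio) {
        *audio_socket = net_accept_intr(intr, listen_socket);
        if (*audio_socket == SC_SOCKET_NONE) {
            return false;
        }
    }
    if (control) {
        *control_socket = net_accept_intr(intr, listen_socket);
        if (*control_socket == SC_SOCKET_NONE) {
            return false;
        }
    }
    return true;
}
```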
## Server
@ -32,15 +68,14 @@ The server is a Java application (with a [`public static void main(String...
args)`][main] method), compiled against the Android framework, and executed as
`shell` on the Android device.

[main]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Server.java#L193

To run such a Java application, the classes must be [_dexed_][dex] (typically,
to `classes.dex`). If `my.package.MainClass` is the main class, compiled to
`classes.dex`, pushed to the device in `/data/local/tmp`, then it can be run
with:

    adb shell CLASSPATH=/data/local/tmp/classes.dex app_process / my.package.MainClass

_The path `/data/local/tmp` is a good candidate to push the server, since it's
readable and writable by `shell`, but not world-writable, so a malicious
@ -49,7 +84,7 @@ application may not replace the server just before the client executes it._
Instead of a raw _dex_ file, `app_process` accepts a _jar_ containing
`classes.dex` (e.g. an [APK]). For simplicity, and to benefit from the gradle
build system, the server is built to an (unsigned) APK (renamed to
`scrcpy-server.jar`).

[dex]: https://en.wikipedia.org/wiki/Dalvik_(software)
[apk]: https://en.wikipedia.org/wiki/Android_application_package
@ -65,42 +100,77 @@ They can be called using reflection though. The communication with hidden
components is provided by [_wrappers_ classes][wrappers] and [aidl].

[hidden]: https://stackoverflow.com/a/31908373/1987178
[wrappers]: https://github.com/Genymobile/scrcpy/tree/master/server/src/main/java/com/genymobile/scrcpy/wrappers
[aidl]: https://github.com/Genymobile/scrcpy/tree/master/server/src/main/aidl
### Execution

The server is started by the client basically by executing the following
commands:

```bash
adb push scrcpy-server /data/local/tmp/scrcpy-server.jar
adb forward tcp:27183 localabstract:scrcpy
adb shell CLASSPATH=/data/local/tmp/scrcpy-server.jar app_process / com.genymobile.scrcpy.Server 2.1
```

The first argument (`2.1` in the example) is the client scrcpy version. The
server fails if the client and the server do not have the exact same version.
The protocol between the client and the server may change from version to
version (see [protocol](#protocol) below), and there is no backward or forward
compatibility (there is no point in using different client and server versions).
This check allows detecting misconfiguration (running an older or newer server
by mistake).

It is followed by any number of arguments, in the form of `key=value` pairs.
Their order is irrelevant. The possible keys and associated value types can be
found in the [server][server-options] and [client][client-options] code.

[server-options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Options.java#L181
[client-options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/server.c#L226

For example, if we execute `scrcpy -m1920 --no-audio`, then the server
execution will look like this:

```bash
# scid is a random number to identify different clients running on the same device
adb shell CLASSPATH=/data/local/tmp/scrcpy-server.jar app_process / com.genymobile.scrcpy.Server 2.1 scid=12345678 log_level=info audio=false max_size=1920
```
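
As an illustration only, the `key=value` form is trivial to parse; the sketch
below is hypothetical (the actual parsing is in the server code linked above)
and `ArgsSketch` is not a real scrcpy class:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch (not the actual scrcpy code): args[0] is the version string,
// the remaining arguments are key=value pairs in any order.
final class ArgsSketch {
    static Map<String, String> parse(String... args) {
        Map<String, String> options = new HashMap<>();
        for (int i = 1; i < args.length; i++) {
            int eq = args[i].indexOf('=');
            if (eq == -1) {
                throw new IllegalArgumentException("Invalid option: " + args[i]);
            }
            options.put(args[i].substring(0, eq), args[i].substring(eq + 1));
        }
        return options;
    }

    public static void main(String... args) {
        // Prints the parsed pairs (scid, log_level, audio, max_size)
        System.out.println(parse("2.1", "scid=12345678", "log_level=info", "audio=false", "max_size=1920"));
    }
}
```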
### Components

When executed, its [`main()`][main] method is executed (on the "main" thread).
It parses the arguments, establishes the connection with the client and starts
the other "components":

- the **video** streamer: it captures the video screen and sends encoded video
  packets on the _video_ socket (from the _video_ thread).
- the **audio** streamer: it uses several threads to capture raw packets,
  submits them for encoding and retrieves encoded packets, which it sends on
  the _audio_ socket.
- the **controller**: it receives _control messages_ (typically input events)
  on the _control_ socket from one thread, and sends _device messages_ (e.g. to
  transmit the device clipboard content to the client) on the same _control
  socket_ from another thread. Thus, the _control_ socket is used in both
  directions (contrary to the _video_ and _audio_ sockets).
### Screen video encoding

The encoding is managed by [`ScreenEncoder`].

The video is encoded using the [`MediaCodec`] API. The codec encodes the content
of a `Surface` associated to the display, and writes the encoded packets to the
client (on the _video_ socket).

[`ScreenEncoder`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java
[`MediaCodec`]: https://developer.android.com/reference/android/media/MediaCodec.html

On device rotation (or folding), the encoding session is [reset] and restarted.

New frames are produced only when changes occur on the surface. This avoids
sending unnecessary frames, but by default there might be drawbacks:

- it does not send any frame on start if the device screen does not change,
- after fast motion changes, the last frame may have poor quality.

Both problems are [solved][repeat] by the flag
[`KEY_REPEAT_PREVIOUS_FRAME_AFTER`][repeat-flag].

[reset]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L179
[repeat]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L246-L247
[repeat-flag]: https://developer.android.com/reference/android/media/MediaFormat.html#KEY_REPEAT_PREVIOUS_FRAME_AFTER
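
For reference, here is a minimal sketch (not scrcpy's actual code; the values
are arbitrary) of a video `MediaFormat` configured for Surface input with this
flag:

```java
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

// Minimal sketch (not the actual scrcpy code): an H.264 format for Surface input
// that repeats the previous frame if nothing new is drawn within 100 ms.
final class VideoFormatSketch {
    static MediaFormat createFormat(int width, int height, int bitRate) {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
        // Value in microseconds: repeat the previous frame after 100 ms without new content
        format.setLong(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 100_000);
        return format;
    }
}
```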
### Audio encoding
Similarly, the audio is [captured] using an [`AudioRecord`], and [encoded] using
the [`MediaCodec`] asynchronous API.
More details are available in the [blog post][scrcpy2] introducing the audio feature.
[captured]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/AudioCapture.java
[encoded]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/AudioEncoder.java
[`AudioRecord`]: https://developer.android.com/reference/android/media/AudioRecord
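
As a rough sketch of the asynchronous `MediaCodec` API used here (not the actual
scrcpy code; the class and method names are made up, and error handling is
omitted), a callback is registered before `configure()` and `start()`:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.os.Handler;

// Rough sketch of the MediaCodec asynchronous API (not the actual scrcpy code):
// input buffers are filled with raw PCM from the capture, output buffers contain
// encoded packets to be sent on the audio socket.
final class AsyncCodecSketch {
    static void configureAsync(MediaCodec codec, MediaFormat format, Handler handler) {
        codec.setCallback(new MediaCodec.Callback() {
            @Override
            public void onInputBufferAvailable(MediaCodec mc, int index) {
                // read PCM samples from the capture into mc.getInputBuffer(index),
                // then call mc.queueInputBuffer(index, 0, size, pts, 0)
            }

            @Override
            public void onOutputBufferAvailable(MediaCodec mc, int index, MediaCodec.BufferInfo info) {
                // send mc.getOutputBuffer(index) on the audio socket,
                // then mc.releaseOutputBuffer(index, false)
            }

            @Override
            public void onError(MediaCodec mc, MediaCodec.CodecException e) {
                // report the error and terminate
            }

            @Override
            public void onOutputFormatChanged(MediaCodec mc, MediaFormat newFormat) {
                // nothing specific to do here
            }
        }, handler);
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        codec.start();
    }
}
```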
### Input events injection

_Control messages_ are received from the client by the [`Controller`] (run in a
@ -124,13 +207,13 @@ separate thread). There are several types of input events:
- other commands (e.g. to switch the screen on or to copy the clipboard).

Some of them need to inject input events into the system. To do so, they use the
_hidden_ method [`InputManager.injectInputEvent()`] (exposed by the
[`InputManager` wrapper][inject-wrapper]).

[`Controller`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Controller.java
[`KeyEvent`]: https://developer.android.com/reference/android/view/KeyEvent.html
[`MotionEvent`]: https://developer.android.com/reference/android/view/MotionEvent.html
[`InputManager.injectInputEvent()`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L34
[inject-wrapper]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L27
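
For illustration only (this is not the wrapper's actual implementation, and
hidden APIs may change across Android versions), invoking such a hidden method
through reflection looks roughly like this:

```java
import android.view.InputEvent;

import java.lang.reflect.Method;

// Illustration only: invoke the hidden InputManager.injectInputEvent(InputEvent, int)
// through reflection. The real scrcpy wrappers obtain the service differently and
// cache the Method object; signatures may vary across Android versions.
final class InjectSketch {
    static boolean inject(InputEvent event, int mode) throws ReflectiveOperationException {
        Class<?> cls = Class.forName("android.hardware.input.InputManager");
        Object inputManager = cls.getMethod("getInstance").invoke(null);
        Method inject = cls.getMethod("injectInputEvent", InputEvent.class, int.class);
        return (Boolean) inject.invoke(inputManager, event, mode);
    }
}
```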
@ -140,126 +223,222 @@ _hidden_ method [`InputManager.injectInputEvent`] (exposed by our
The client relies on [SDL], which provides a cross-platform API for UI, input
events, threading, etc.

The video and audio streams are decoded by [FFmpeg].

[SDL]: https://www.libsdl.org
[ffmpeg]: https://ffmpeg.org/
### Initialization

The client parses the command line arguments, then [runs one of two code
paths][run]:

- scrcpy in "normal" mode ([`scrcpy.c`])
- scrcpy in [OTG mode](hid-otg.md) ([`scrcpy_otg.c`])

[run]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/main.c#L81-L82
[`scrcpy.c`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/scrcpy.c#L292-L293
[`scrcpy_otg.c`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/usb/scrcpy_otg.c#L51-L52

In the remainder of this document, we assume that the "normal" mode is used
(read the code for the OTG mode).

On startup, the client:

- opens the _video_, _audio_ and _control_ sockets;
- pushes and starts the server on the device;
- initializes its components (demuxers, decoders, recorder…).


### Video and audio streams

Depending on the arguments passed to `scrcpy`, several components may be used.
Here is an overview of the video and audio components:

```
                                             V4L2 sink
                                            /
                                    decoder
                                   /        \
      VIDEO -------------> demuxer           display
                                   \
                                    recorder
                                   /
      AUDIO -------------> demuxer
                                   \
                                    decoder --- audio player
```
The _demuxer_ is responsible for extracting video and audio packets (reading
some headers, splitting the video stream into packets at correct boundaries,
etc.).

The demuxed packets may be sent to a _decoder_ (one per stream, to produce
frames) and to a recorder (receiving both video and audio streams to record a
single file). The packets are encoded on the device (by `MediaCodec`), but when
recording, they are _muxed_ (asynchronously) into a container (MKV or MP4) on
the client side.
Video frames are sent to the screen/display to be rendered in the scrcpy window.
They may also be sent to a [V4L2 sink](v4l2.md).
Audio "frames" (an array of decoded samples) are sent to the audio player.
### Controller

The _controller_ is responsible for sending _control messages_ to the device. It
runs in a separate thread, to avoid I/O on the main thread.

On SDL events, received on the main thread, the _input manager_ creates
appropriate _control messages_. It is responsible for converting SDL events to
Android events. It then pushes the _control messages_ to a queue held by the
controller. On its own thread, the controller takes messages from the queue,
serializes them and sends them to the device.
## Protocol

The protocol between the client and the server must be considered _internal_: it
may (and will) change at any time for any reason. Everything may change (the
number of sockets, the order in which the sockets must be opened, the data
format on the wire…) from version to version. A client must always be run with a
matching server version.

This section documents the current protocol in scrcpy v2.1.

### Connection

Firstly, the client sets up an adb tunnel:
```bash
# By default, a reverse redirection: the computer listens, the device connects
adb reverse localabstract:scrcpy_<SCID> tcp:27183
# As a fallback (or if --force-adb-forward is set), a forward redirection:
# the device listens, the computer connects
adb forward tcp:27183 localabstract:scrcpy_<SCID>
```
(`<SCID>` is a 31-bit random number, so that it does not fail when several
scrcpy instances start "at the same time" for the same device.)
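
For example, generating a 31-bit SCID could look like the sketch below; the
textual encoding of `<SCID>` in the socket name is an implementation detail
(shown here as 8 hex digits, which is an assumption):

```java
import java.security.SecureRandom;

final class ScidSketch {
    public static void main(String... args) {
        // Keep 31 bits so the value is non-negative; the exact textual encoding of
        // <SCID> in the socket name is an implementation detail (8 hex digits here).
        int scid = new SecureRandom().nextInt() & 0x7FFFFFFF;
        String socketName = String.format("scrcpy_%08x", scid);
        System.out.println(socketName);
    }
}
```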
Then, up to 3 sockets are opened, in that order:
- a _video_ socket
- an _audio_ socket
- a _control_ socket
Each one may be disabled (respectively by `--no-video`, `--no-audio` and
`--no-control`, directly or indirectly). For example, if `--no-audio` is set,
then the _video_ socket is opened first, then the _control_ socket.
On the _first_ socket opened (whichever it is), if the tunnel is _forward_, then
a [dummy byte] is sent from the device to the client. This allows detecting a
connection error (the client connection does not fail as long as there is an adb
forward redirection, even if nothing is listening on the device side).
Still on this _first_ socket, the device sends some [metadata][device meta] to
the client (currently only the device name, used as the window title, but there
might be other fields in the future).
[dummy byte]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L93
[device meta]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L151
You can read the [client][client-connection] and [server][server-connection]
code for more details.
[client-connection]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/server.c#L465-L466
[server-connection]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L63
Then each socket is used for its intended purpose.
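
To make the sequence concrete, here is a hedged sketch of a third-party client
connecting through a _forward_ tunnel with all three sockets enabled; the size
of the device-name field is an assumption (check the `DesktopConnection` code
linked above for the exact format):

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Hedged sketch: connect video, audio and control sockets through
// `adb forward tcp:27183 localabstract:scrcpy_<SCID>`, then read the dummy byte
// and the device name on the first socket. DEVICE_NAME_LENGTH is an assumption.
final class ConnectSketch {
    private static final int DEVICE_NAME_LENGTH = 64;

    public static void main(String... args) throws IOException {
        try (Socket video = new Socket("localhost", 27183);
             Socket audio = new Socket("localhost", 27183);
             Socket control = new Socket("localhost", 27183)) {
            DataInputStream first = new DataInputStream(video.getInputStream());
            int dummy = first.readUnsignedByte(); // only sent on forward tunnels

            byte[] nameBytes = new byte[DEVICE_NAME_LENGTH];
            first.readFully(nameBytes);
            String deviceName = new String(nameBytes, StandardCharsets.UTF_8).trim();
            System.out.println("Device: " + deviceName + " (dummy=" + dummy + ")");
            // ... then read codec metadata and packets on each socket (see below)
        }
    }
}
```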
### Video and audio
On the _video_ and _audio_ sockets, the device first sends some [codec
metadata]:
- On the _video_ socket, 12 bytes:
- the codec id (`u32`) (H264, H265 or AV1)
- the initial video width (`u32`)
- the initial video height (`u32`)
- On the _audio_ socket, 4 bytes:
- the codec id (`u32`) (OPUS, AAC or RAW)
[codec metadata]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Streamer.java#L33-L51
Then each packet produced by `MediaCodec` is sent, prefixed by a 12-byte [frame
header]:
- config packet flag (`u1`)
- key frame flag (`u1`)
- PTS (`u62`)
- packet size (`u32`)
Here is a schema describing the frame header:
```
 [. . . . . . . .|. . . .]. . . . . . . . . . . . . . . ...
  <-------------> <-----> <-----------------------------...
        PTS        packet        raw packet
                    size
  <--------------------->
       frame header
```

The most significant bits of the PTS are used for packet flags:

```
 byte 7   byte 6   byte 5   byte 4   byte 3   byte 2   byte 1   byte 0
CK...... ........ ........ ........ ........ ........ ........ ........
^^<------------------------------------------------------------------->
||                                PTS
| `- key frame
 `-- config packet
```
[frame header]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Streamer.java#L83
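
As an illustration of this wire format (not scrcpy's actual client code, which
is written in C), reading the video codec metadata and then demuxing packets
could look like this, assuming all integers are big-endian:

```java
import java.io.DataInputStream;
import java.io.IOException;

// Illustration of the wire format described above (not the actual client code).
final class DemuxSketch {
    static void readVideoSocket(DataInputStream in) throws IOException {
        // 12-byte codec metadata (video socket only)
        int codecId = in.readInt();   // u32 codec id (H264, H265 or AV1)
        int width = in.readInt();     // u32 initial video width
        int height = in.readInt();    // u32 initial video height

        while (true) {
            // 12-byte frame header: 2 flag bits + 62-bit PTS, then a u32 packet size
            long ptsAndFlags = in.readLong();
            boolean config = (ptsAndFlags & (1L << 63)) != 0;   // config packet flag
            boolean keyFrame = (ptsAndFlags & (1L << 62)) != 0; // key frame flag
            long pts = ptsAndFlags & ((1L << 62) - 1);          // 62-bit PTS
            int size = in.readInt();
            byte[] packet = new byte[size];
            in.readFully(packet); // raw packet produced by MediaCodec
            // feed (config, keyFrame, pts, packet) to a decoder and/or a recorder
        }
    }
}
```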
### Controls
Control messages are sent via a custom binary protocol.
The only documentation for this protocol is the set of unit tests on both sides:
- `ControlMessage` (from client to device): [serialization](https://github.com/Genymobile/scrcpy/blob/master/app/tests/test_control_msg_serialize.c) | [deserialization](https://github.com/Genymobile/scrcpy/blob/master/server/src/test/java/com/genymobile/scrcpy/ControlMessageReaderTest.java)
- `DeviceMessage` (from device to client) [serialization](https://github.com/Genymobile/scrcpy/blob/master/server/src/test/java/com/genymobile/scrcpy/DeviceMessageWriterTest.java) | [deserialization](https://github.com/Genymobile/scrcpy/blob/master/app/tests/test_device_msg_deserialize.c)
## Standalone server
Although the server is designed to work for the scrcpy client, it can be used
with any client which uses the same protocol.
For simplicity, some [server-specific options] have been added to produce raw
streams easily:
- `send_device_meta=false`: disable the device metadata (in practice, the device
  name) sent on the _first_ socket
- `send_frame_meta=false`: disable the 12-byte header for each packet
- `send_dummy_byte`: disable the dummy byte sent on forward connections
- `send_codec_meta`: disable the codec information (and initial device size for
video)
- `raw_stream`: disable all the above
[server-specific options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Options.java#L309-L329
Concretely, here is how to expose a raw H.264 stream on a TCP socket:
```bash
adb push scrcpy-server-v2.1 /data/local/tmp/scrcpy-server-manual.jar
adb forward tcp:1234 localabstract:scrcpy
adb shell CLASSPATH=/data/local/tmp/scrcpy-server-manual.jar \
app_process / com.genymobile.scrcpy.Server 2.1 \
tunnel_forward=true audio=false control=false cleanup=false \
raw_stream=true max_size=1920
```
As soon as a client connects over TCP on port 1234, the device will start
streaming the video. For example, VLC can play the video (although you will
experience a very high latency, more details [here][vlc-0latency]):
```
vlc -Idummy --demux=h264 --network-caching=0 tcp://localhost:1234
```
[vlc-0latency]: https://code.videolan.org/rom1v/vlc/-/merge_requests/20
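
Instead of playing the stream, a minimal client can simply connect and consume
the raw bytes; for example, this hypothetical helper dumps the raw H.264
elementary stream to a file (which can then be remuxed or analyzed):

```java
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.Socket;

// Hypothetical helper: connect to the forwarded port and dump the raw H.264
// elementary stream to a file (e.g. to remux it later with ffmpeg).
final class RawStreamDump {
    public static void main(String... args) throws Exception {
        try (Socket socket = new Socket("localhost", 1234);
             InputStream in = socket.getInputStream();
             FileOutputStream out = new FileOutputStream("raw.h264")) {
            byte[] buf = new byte[65536];
            int r;
            while ((r = in.read(buf)) != -1) {
                out.write(buf, 0, r);
            }
        }
    }
}
```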
## Hack

View File

@ -1,156 +1,9 @@
# Device
## Selection
If exactly one device is connected (i.e. listed by `adb devices`), then it is
automatically selected.
However, if there are multiple devices connected, you must specify the one to
use in one of 4 ways:
- by its serial:
```bash
scrcpy --serial=0123456789abcdef
scrcpy -s 0123456789abcdef # short version
# the serial is the ip:port if connected over TCP/IP (same behavior as adb)
scrcpy --serial=192.168.1.1:5555
```
- the one connected over USB (if there is exactly one):
```bash
scrcpy --select-usb
scrcpy -d # short version
```
- the one connected over TCP/IP (if there is exactly one):
```bash
scrcpy --select-tcpip
scrcpy -e # short version
```
- a device already listening on TCP/IP (see [below](#tcpip-wireless)):
```bash
scrcpy --tcpip=192.168.1.1:5555
scrcpy --tcpip=192.168.1.1 # default port is 5555
```
The serial may also be provided via the environment variable `ANDROID_SERIAL`
(also used by `adb`):
```bash
# in bash
export ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```cmd
:: in cmd
set ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```powershell
# in PowerShell
$env:ANDROID_SERIAL = '0123456789abcdef'
scrcpy
```
## TCP/IP (wireless)
_Scrcpy_ uses `adb` to communicate with the device, and `adb` can [connect] to a
device over TCP/IP. The device must be connected on the same network as the
computer.
[connect]: https://developer.android.com/studio/command-line/adb.html#wireless
### Automatic
The `--tcpip` option allows configuring the connection automatically. There are
two variants.
If the device (accessible at 192.168.1.1 in this example) already listens on a
port (typically 5555) for incoming _adb_ connections, then run:
```bash
scrcpy --tcpip=192.168.1.1 # default port is 5555
scrcpy --tcpip=192.168.1.1:5555
```
If _adb_ TCP/IP mode is disabled on the device (or if you don't know the IP
address), connect the device over USB, then run:
```bash
scrcpy --tcpip # without arguments
```
It will automatically find the device IP address and adb port, enable TCP/IP
mode if necessary, then connect to the device before starting.
### Manual
Alternatively, it is possible to enable the TCP/IP connection manually using
`adb`:
1. Plug the device into a USB port on your computer.
2. Connect the device to the same Wi-Fi network as your computer.
3. Get your device IP address, in Settings → About phone → Status, or by
executing this command:
```bash
adb shell ip route | awk '{print $9}'
```
4. Enable `adb` over TCP/IP on your device: `adb tcpip 5555`.
5. Unplug your device.
6. Connect to your device: `adb connect DEVICE_IP:5555` _(replace `DEVICE_IP`
with the device IP address you found)_.
7. Run `scrcpy` as usual.
8. Run `adb disconnect` once you're done.
Since Android 11, a [wireless debugging option][adb-wireless] allows you to
bypass having to physically connect your device to your computer.
[adb-wireless]: https://developer.android.com/studio/command-line/adb#wireless-android11-command-line
## Autostart
A small tool (by the scrcpy author) allows running arbitrary commands whenever a
new Android device is connected: [AutoAdb]. It can be used to start scrcpy:
```bash
autoadb scrcpy -s '{}'
```
[AutoAdb]: https://github.com/rom1v/autoadb
## Display
If several displays are available on the Android device, it is possible to
select the display to mirror:
```bash
scrcpy --display=1
```
The list of display ids can be retrieved by:
```bash
scrcpy --list-displays
```
A secondary display may only be controlled if the device runs at least Android
10 (otherwise it is mirrored as read-only).
## Actions
Some command line arguments perform actions on the device itself while scrcpy is
running.

## Stay awake

To prevent the device from sleeping after a delay **when the device is plugged
in**:
@ -166,7 +19,7 @@ If the device is not plugged in (i.e. only connected over TCP/IP),
`--stay-awake` has no effect (this is the Android behavior).
## Turn screen off

It is possible to turn the device screen off while mirroring on start with a
command-line option:
@ -194,7 +47,7 @@ scrcpy -Sw # short version
```
## Show touches

For presentations, it may be useful to show physical touches (on the physical
device). Android exposes this feature in _Developer options_.
@ -210,7 +63,7 @@ scrcpy -t # short version
Note that it only shows _physical_ touches (by a finger on the device).

## Power off on close

To turn the device screen off when closing _scrcpy_:
@ -218,11 +71,10 @@ To turn the device screen off when closing _scrcpy_:
scrcpy --power-off-on-close
```

## Power on on start

By default, on start, the device is powered on. To prevent this behavior:

```bash
scrcpy --no-power-on
```

View File

@ -106,3 +106,7 @@ scrcpy --otg # keyboard and mouse
Like `--hid-keyboard` and `--hid-mouse`, it only works if the device is
connected over USB.
## HID/OTG issues on Windows
See [FAQ](/FAQ.md#hidotg-issues-on-windows).

View File

@ -9,12 +9,10 @@ Scrcpy is packaged in several distributions and package managers:
- Debian/Ubuntu: `apt install scrcpy`
- Arch Linux: `pacman -S scrcpy`
- Fedora: `dnf copr enable zeno/scrcpy && dnf install scrcpy`
- Gentoo: `emerge scrcpy`
- Snap: `snap install scrcpy`
- … (see [repology](https://repology.org/project/scrcpy/versions))
### Latest version

However, the packaged version is not always the latest release. To install the
View File

@ -13,14 +13,12 @@ To record only the video:
scrcpy --no-audio --record=file.mp4
```

To record only the audio:

```bash
scrcpy --no-video --record=file.opus
scrcpy --no-video --audio-codec=aac --record=file.aac
# .m4a/.mp4 and .mka/.mkv are also supported for both opus and aac
```
Timestamps are captured on the device, so [packet delay variation] does not
@ -29,6 +27,9 @@ course, not if you capture your scrcpy window and audio output on the computer).
[packet delay variation]: https://en.wikipedia.org/wiki/Packet_delay_variation
## Format
The video and audio streams are encoded on the device, but are muxed on the
client side. Two formats (containers) are supported:

- Matroska (`.mkv`)
@ -42,3 +43,36 @@ needs not end with `.mkv` or `.mp4`):
```
scrcpy --record=file --record-format=mkv
```
## No playback
To disable playback while recording:
```bash
scrcpy --no-playback --record=file.mp4
scrcpy -Nr file.mkv
# interrupt recording with Ctrl+C
```
It is also possible to disable video and audio playback separately:
```bash
# Record both video and audio, but only play video
scrcpy --record=file.mkv --no-audio-playback
```
## Time limit
To limit the recording time:
```bash
scrcpy --record=file.mkv --time-limit=20 # in seconds
```
The `--time-limit` option is not limited to recording; it also applies to simple
mirroring:
```
scrcpy --time-limit=20
```

View File

@ -29,7 +29,7 @@ _<kbd>[Super]</kbd> is typically the <kbd>Windows</kbd> or <kbd>Cmd</kbd> key._
| Resize window to 1:1 (pixel-perfect)  | <kbd>MOD</kbd>+<kbd>g</kbd>
| Resize window to remove black borders | <kbd>MOD</kbd>+<kbd>w</kbd> \| _Double-left-click¹_
| Click on `HOME`                       | <kbd>MOD</kbd>+<kbd>h</kbd> \| _Middle-click_
| Click on `BACK`                       | <kbd>MOD</kbd>+<kbd>b</kbd> \| <kbd>MOD</kbd>+<kbd>Backspace</kbd> \| _Right-click²_
| Click on `APP_SWITCH`                 | <kbd>MOD</kbd>+<kbd>s</kbd> \| _4th-click³_
| Click on `MENU` (unlock screen)⁴      | <kbd>MOD</kbd>+<kbd>m</kbd>
| Click on `VOLUME_UP`                  | <kbd>MOD</kbd>+<kbd>↑</kbd> _(up)_

View File

@ -35,7 +35,7 @@ To start `scrcpy` using a v4l2 sink:
```bash
scrcpy --v4l2-sink=/dev/videoN
scrcpy --v4l2-sink=/dev/videoN --no-video-playback # disable playback window
```

(replace `N` with the device ID, check with `ls /dev/video*`)

View File

@ -21,7 +21,7 @@ If encoding fails, scrcpy automatically tries again with a lower definition
## Bit rate

The default video bit rate is 8 Mbps. To change it:

```bash
scrcpy --video-bit-rate=2M
@ -66,6 +66,14 @@ scrcpy --video-codec=av1
H265 may provide better quality, but H264 should provide lower latency.

AV1 encoders are not common on current Android devices.
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--video-codec-options` in the manpage or in `scrcpy --help`.
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat
## Encoder
Several encoders may be available on the device. They can be listed by:

```bash
@ -79,11 +87,6 @@ try another one:
scrcpy --video-codec=h264 --video-encoder='OMX.qcom.video.encoder.avc'
```
## Rotation
@ -134,6 +137,25 @@ phone, landscape for a tablet).
If `--max-size` is also specified, resizing is applied after cropping.
## Display
If several displays are available on the Android device, it is possible to
select the display to mirror:
```bash
scrcpy --display=1
```
The list of display ids can be retrieved by:
```bash
scrcpy --list-displays
```
A secondary display may only be controlled if the device runs at least Android
10 (otherwise it is mirrored as read-only).
## Buffering

By default, there is no video buffering, to get the lowest possible latency.
@ -159,17 +181,38 @@ scrcpy --display-buffer=50 --v4l2-buffer=300
```
## No playback

It is possible to capture an Android device without playing video or audio on
the computer. This option is useful when [recording](recording.md) or when
[v4l2](#video4linux) is enabled:

```bash
scrcpy --v4l2-sink=/dev/video2 --no-playback
scrcpy --record=file.mkv --no-playback
```
It is also possible to disable video and audio playback separately:
```bash
# Send video to V4L2 sink without playing it, but keep audio playback
scrcpy --v4l2-sink=/dev/video2 --no-video-playback
# Record both video and audio, but only play video
scrcpy --record=file.mkv --no-audio-playback
```
## No video
To disable video forwarding completely, so that only audio is forwarded:
```
scrcpy --no-video
```
## Video4Linux

See the dedicated [Video4Linux](v4l2.md) page.

View File

@ -4,18 +4,24 @@
Download the [latest release]:

- [`scrcpy-win64-v2.1.1.zip`][direct-win64] (64-bit)
  <sub>SHA-256: `f77281e1bce2f9934617699c581f063d5b327f012eff602ee98fb2ef550c25c2`</sub>
- [`scrcpy-win32-v2.1.1.zip`][direct-win32] (32-bit)
  <sub>SHA-256: `ef7ae7fbe9449f2643febdc2244fb186d1a746a3c736394150cfd14f06d3c943`</sub>

[latest release]: https://github.com/Genymobile/scrcpy/releases/latest
[direct-win64]: https://github.com/Genymobile/scrcpy/releases/download/v2.1.1/scrcpy-win64-v2.1.1.zip
[direct-win32]: https://github.com/Genymobile/scrcpy/releases/download/v2.1.1/scrcpy-win32-v2.1.1.zip

and extract it.

Alternatively, you could install it from a package manager, like [Winget]:
```bash
winget install scrcpy
```
or [Chocolatey]:
```bash
choco install scrcpy
@ -30,6 +36,7 @@ scoop install scrcpy
scoop install adb # if you don't have it yet
```
[Winget]: https://github.com/microsoft/winget-cli
[Chocolatey]: https://chocolatey.org/
[Scoop]: https://scoop.sh

View File

@ -2,8 +2,8 @@
set -e
BUILDDIR=build-auto
PREBUILT_SERVER_URL=https://github.com/Genymobile/scrcpy/releases/download/v2.1.1/scrcpy-server-v2.1.1
PREBUILT_SERVER_SHA256=9558db6c56743a1dc03b38f59801fb40e91cc891f8fc0c89e5b0b067761f148e

echo "[scrcpy] Downloading prebuilt server..."
wget "$PREBUILT_SERVER_URL" -O scrcpy-server

View File

@ -1,5 +1,5 @@
project('scrcpy', 'c',
    version: '2.1.1',
    meson_version: '>= 0.48',
    default_options: [
        'c_std=c11',

View File

@ -94,15 +94,14 @@ dist-win32: build-server build-win32
cp app/data/scrcpy-noconsole.vbs "$(DIST)/$(WIN32_TARGET_DIR)"
cp app/data/icon.png "$(DIST)/$(WIN32_TARGET_DIR)"
cp app/data/open_a_terminal_here.bat "$(DIST)/$(WIN32_TARGET_DIR)"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/avutil-58.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/avcodec-60.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/avformat-60.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/swresample-4.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.3/adb.exe "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.3/AdbWinApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.3/AdbWinUsbApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/SDL2-2.28.0/i686-w64-mingw32/bin/SDL2.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/libusb-1.0.26/libusb-MinGW-Win32/bin/msys-usb-1.0.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
dist-win64: build-server build-win64
@ -113,15 +112,14 @@ dist-win64: build-server build-win64
cp app/data/scrcpy-noconsole.vbs "$(DIST)/$(WIN64_TARGET_DIR)"
cp app/data/icon.png "$(DIST)/$(WIN64_TARGET_DIR)"
cp app/data/open_a_terminal_here.bat "$(DIST)/$(WIN64_TARGET_DIR)"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/avutil-58.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/avcodec-60.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/avformat-60.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/swresample-4.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.3/adb.exe "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.3/AdbWinApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.3/AdbWinUsbApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/SDL2-2.28.0/x86_64-w64-mingw32/bin/SDL2.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/libusb-1.0.26/libusb-MinGW-x64/bin/msys-usb-1.0.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
zip-win32: dist-win32

View File

@ -7,8 +7,8 @@ android {
        applicationId "com.genymobile.scrcpy"
        minSdkVersion 21
        targetSdkVersion 33
        versionCode 20101
        versionName "2.1.1"
        testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
    }
    buildTypes {

View File

@ -12,7 +12,7 @@
set -e

SCRCPY_DEBUG=false
SCRCPY_VERSION_NAME=2.1.1

PLATFORM=${ANDROID_PLATFORM:-33}
BUILD_TOOLS=${ANDROID_BUILD_TOOLS:-33.0.0}
@ -48,6 +48,7 @@ cd "$SERVER_DIR/src/main/aidl"
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" android/view/IRotationWatcher.aidl "$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" android/view/IRotationWatcher.aidl
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" \ "$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" \
android/content/IOnPrimaryClipChangedListener.aidl android/content/IOnPrimaryClipChangedListener.aidl
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" android/view/IDisplayFoldListener.aidl
echo "Compiling java sources..." echo "Compiling java sources..."
cd ../java cd ../java

View File

@ -0,0 +1,26 @@
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package android.view;
/**
* {@hide}
*/
oneway interface IDisplayFoldListener
{
/** Called when the foldedness of a display changes */
void onDisplayFoldChanged(int displayId, boolean folded);
}

View File

@ -1,7 +1,16 @@
package com.genymobile.scrcpy;

public interface AsyncProcessor {
    interface TerminationListener {
        /**
         * Notify processor termination
         *
         * @param fatalError {@code true} if this must cause the termination of the whole scrcpy-server.
         */
        void onTerminated(boolean fatalError);
    }

    void start(TerminationListener listener);
    void stop();
    void join() throws InterruptedException;
}
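
For context, here is a hedged sketch (a hypothetical class, not part of this
diff) of how an implementation is expected to use the new `TerminationListener`:

```java
// Hypothetical implementation sketch: each processor runs its work on its own
// thread and reports whether its termination is fatal for the whole scrcpy-server.
public final class DummyProcessor implements AsyncProcessor {
    private Thread thread;

    @Override
    public void start(TerminationListener listener) {
        thread = new Thread(() -> {
            boolean fatalError = false;
            try {
                // ... do the actual work ...
            } catch (Exception e) {
                fatalError = true;
            } finally {
                listener.onTerminated(fatalError);
            }
        }, "dummy");
        thread.start();
    }

    @Override
    public void stop() {
        // wake up any blocking call so that the thread can terminate
    }

    @Override
    public void join() throws InterruptedException {
        if (thread != null) {
            thread.join();
        }
    }
}
```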

View File

@ -10,7 +10,6 @@ import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.AudioTimestamp;
import android.media.MediaCodec;
import android.os.Build;
import android.os.SystemClock;
@ -21,22 +20,29 @@ public final class AudioCapture {
    public static final int SAMPLE_RATE = 48000;
    public static final int CHANNEL_CONFIG = AudioFormat.CHANNEL_IN_STEREO;
    public static final int CHANNELS = 2;
    public static final int CHANNEL_MASK = AudioFormat.CHANNEL_IN_LEFT | AudioFormat.CHANNEL_IN_RIGHT;
    public static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT;
    public static final int BYTES_PER_SAMPLE = 2;

    private final int audioSource;

    private AudioRecord recorder;
    private final AudioTimestamp timestamp = new AudioTimestamp();
    private long previousPts = 0;
    private long nextPts = 0;

    public AudioCapture(AudioSource audioSource) {
        this.audioSource = audioSource.value();
    }

    public static int millisToBytes(int millis) {
        return SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE * millis / 1000;
    }

    private static AudioFormat createAudioFormat() {
        AudioFormat.Builder builder = new AudioFormat.Builder();
        builder.setEncoding(ENCODING);
        builder.setSampleRate(SAMPLE_RATE);
        builder.setChannelMask(CHANNEL_CONFIG);
        return builder.build();
@ -44,15 +50,15 @@ public final class AudioCapture {
    @TargetApi(Build.VERSION_CODES.M)
    @SuppressLint({"WrongConstant", "MissingPermission"})
    private static AudioRecord createAudioRecord(int audioSource) {
        AudioRecord.Builder builder = new AudioRecord.Builder();
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
            // On older APIs, Workarounds.fillAppInfo() must be called beforehand
            builder.setContext(FakeContext.get());
        }
        builder.setAudioSource(audioSource);
        builder.setAudioFormat(createAudioFormat());
        int minBufferSize = AudioRecord.getMinBufferSize(SAMPLE_RATE, CHANNEL_CONFIG, ENCODING);
        // This buffer size does not impact latency
        builder.setBufferSizeInBytes(8 * minBufferSize);
        return builder.build();
@ -86,8 +92,8 @@ public final class AudioCapture {
        } catch (UnsupportedOperationException e) {
            if (attempts == 0) {
                Ln.e("Failed to start audio capture");
                Ln.e("On Android 11, audio capture must be started in the foreground, make sure that the device is unlocked when starting "
                        + "scrcpy.");
                throw new AudioCaptureForegroundException();
            } else {
                Ln.d("Failed to start audio capture, retrying...");
@ -97,7 +103,14 @@ public final class AudioCapture {
    }

    private void startRecording() {
        try {
            recorder = createAudioRecord(audioSource);
        } catch (NullPointerException e) {
            // Creating an AudioRecord using an AudioRecord.Builder does not work on Vivo phones:
            // - <https://github.com/Genymobile/scrcpy/issues/3805>
            // - <https://github.com/Genymobile/scrcpy/pull/3862>
            recorder = Workarounds.createAudioRecord(audioSource, SAMPLE_RATE, CHANNEL_CONFIG, CHANNELS, CHANNEL_MASK, ENCODING);
        }
        recorder.startRecording();
    }
@ -105,7 +118,7 @@ public final class AudioCapture {
        if (Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
            startWorkaroundAndroid11();
            try {
                tryStartRecording(5, 100);
            } finally {
                stopWorkaroundAndroid11();
            }

View File

@ -40,6 +40,7 @@ public final class AudioEncoder implements AsyncProcessor {
    private static final int READ_MS = 5; // milliseconds
    private static final int READ_SIZE = AudioCapture.millisToBytes(READ_MS);

    private final AudioCapture capture;
    private final Streamer streamer;
    private final int bitRate;
    private final List<CodecOption> codecOptions;
@ -58,7 +59,8 @@ public final class AudioEncoder implements AsyncProcessor {
    private boolean ended;

    public AudioEncoder(AudioCapture capture, Streamer streamer, int bitRate, List<CodecOption> codecOptions, String encoderName) {
        this.capture = capture;
        this.streamer = streamer;
        this.bitRate = bitRate;
        this.codecOptions = codecOptions;
@ -114,21 +116,29 @@ public final class AudioEncoder implements AsyncProcessor {
        }
    }

    @Override
    public void start(TerminationListener listener) {
        thread = new Thread(() -> {
            boolean fatalError = false;
            try {
                encode();
            } catch (ConfigurationException e) {
                // Do not print stack trace, a user-friendly error-message has already been logged
                fatalError = true;
            } catch (AudioCaptureForegroundException e) {
                // Do not print stack trace, a user-friendly error-message has already been logged
            } catch (IOException e) {
                Ln.e("Audio encoding error", e);
                fatalError = true;
            } finally {
                Ln.d("Audio encoder stopped");
                listener.onTerminated(fatalError);
            }
        }, "audio-encoder");
        thread.start();
    }

    @Override
    public void stop() {
        if (thread != null) {
            // Just wake up the blocking wait from the thread, so that it properly releases all its resources and terminates
@ -136,6 +146,7 @@ public final class AudioEncoder implements AsyncProcessor {
} }
} }
@Override
public void join() throws InterruptedException { public void join() throws InterruptedException {
if (thread != null) { if (thread != null) {
thread.join(); thread.join();
@ -166,14 +177,13 @@ public final class AudioEncoder implements AsyncProcessor {
} }
MediaCodec mediaCodec = null; MediaCodec mediaCodec = null;
AudioCapture capture = new AudioCapture();
boolean mediaCodecStarted = false; boolean mediaCodecStarted = false;
try { try {
Codec codec = streamer.getCodec(); Codec codec = streamer.getCodec();
mediaCodec = createMediaCodec(codec, encoderName); mediaCodec = createMediaCodec(codec, encoderName);
mediaCodecThread = new HandlerThread("AudioEncoder"); mediaCodecThread = new HandlerThread("media-codec");
mediaCodecThread.start(); mediaCodecThread.start();
MediaFormat format = createFormat(codec.getMimeType(), bitRate, codecOptions); MediaFormat format = createFormat(codec.getMimeType(), bitRate, codecOptions);
@ -183,16 +193,15 @@ public final class AudioEncoder implements AsyncProcessor {
capture.start(); capture.start();
final MediaCodec mediaCodecRef = mediaCodec; final MediaCodec mediaCodecRef = mediaCodec;
final AudioCapture captureRef = capture;
inputThread = new Thread(() -> { inputThread = new Thread(() -> {
try { try {
inputThread(mediaCodecRef, captureRef); inputThread(mediaCodecRef, capture);
} catch (IOException | InterruptedException e) { } catch (IOException | InterruptedException e) {
Ln.e("Audio capture error", e); Ln.e("Audio capture error", e);
} finally { } finally {
end(); end();
} }
}); }, "audio-in");
outputThread = new Thread(() -> { outputThread = new Thread(() -> {
try { try {
@ -207,7 +216,7 @@ public final class AudioEncoder implements AsyncProcessor {
} finally { } finally {
end(); end();
} }
}); }, "audio-out");
mediaCodec.start(); mediaCodec.start();
mediaCodecStarted = true; mediaCodecStarted = true;
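
The new start(TerminationListener) / stop() / join() overrides above (and the same pattern in AudioRawRecorder, Controller and ScreenEncoder below) implement a shared AsyncProcessor contract. The interface itself is not included in these hunks; judging from the method names and the listener.onTerminated(fatalError) callback visible above, it presumably looks roughly like this sketch:

    public interface AsyncProcessor {
        interface TerminationListener {
            // fatalError is true when the processor stopped because of an unrecoverable
            // error, letting the server shut down instead of waiting for the others.
            void onTerminated(boolean fatalError);
        }

        void start(TerminationListener listener);
        void stop();
        void join() throws InterruptedException;
    }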

View File

@ -1,12 +1,14 @@
package com.genymobile.scrcpy; package com.genymobile.scrcpy;
import android.media.MediaCodec; import android.media.MediaCodec;
import android.os.Build;
import java.io.IOException; import java.io.IOException;
import java.nio.ByteBuffer; import java.nio.ByteBuffer;
public final class AudioRawRecorder implements AsyncProcessor { public final class AudioRawRecorder implements AsyncProcessor {
private final AudioCapture capture;
private final Streamer streamer; private final Streamer streamer;
private Thread thread; private Thread thread;
@ -14,15 +16,21 @@ public final class AudioRawRecorder implements AsyncProcessor {
private static final int READ_MS = 5; // milliseconds private static final int READ_MS = 5; // milliseconds
private static final int READ_SIZE = AudioCapture.millisToBytes(READ_MS); private static final int READ_SIZE = AudioCapture.millisToBytes(READ_MS);
public AudioRawRecorder(Streamer streamer) { public AudioRawRecorder(AudioCapture capture, Streamer streamer) {
this.capture = capture;
this.streamer = streamer; this.streamer = streamer;
} }
private void record() throws IOException, AudioCaptureForegroundException { private void record() throws IOException, AudioCaptureForegroundException {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.R) {
Ln.w("Audio disabled: it is not supported before Android 11");
streamer.writeDisableStream(false);
return;
}
final ByteBuffer buffer = ByteBuffer.allocateDirect(READ_SIZE); final ByteBuffer buffer = ByteBuffer.allocateDirect(READ_SIZE);
final MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo(); final MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
AudioCapture capture = new AudioCapture();
try { try {
capture.start(); capture.start();
@ -46,27 +54,33 @@ public final class AudioRawRecorder implements AsyncProcessor {
} }
} }
public void start() { @Override
public void start(TerminationListener listener) {
thread = new Thread(() -> { thread = new Thread(() -> {
boolean fatalError = false;
try { try {
record(); record();
} catch (AudioCaptureForegroundException e) { } catch (AudioCaptureForegroundException e) {
// Do not print stack trace, a user-friendly error-message has already been logged // Do not print stack trace, a user-friendly error-message has already been logged
} catch (IOException e) { } catch (IOException e) {
Ln.e("Audio recording error", e); Ln.e("Audio recording error", e);
fatalError = true;
} finally { } finally {
Ln.d("Audio recorder stopped"); Ln.d("Audio recorder stopped");
listener.onTerminated(fatalError);
} }
}); }, "audio-raw");
thread.start(); thread.start();
} }
@Override
public void stop() { public void stop() {
if (thread != null) { if (thread != null) {
thread.interrupt(); thread.interrupt();
} }
} }
@Override
public void join() throws InterruptedException { public void join() throws InterruptedException {
if (thread != null) { if (thread != null) {
thread.join(); thread.join();

View File

@ -0,0 +1,30 @@
package com.genymobile.scrcpy;
import android.media.MediaRecorder;
public enum AudioSource {
OUTPUT("output", MediaRecorder.AudioSource.REMOTE_SUBMIX),
MIC("mic", MediaRecorder.AudioSource.MIC);
private final String name;
private final int value;
AudioSource(String name, int value) {
this.name = name;
this.value = value;
}
int value() {
return value;
}
static AudioSource findByName(String name) {
for (AudioSource audioSource : AudioSource.values()) {
if (name.equals(audioSource.name)) {
return audioSource;
}
}
return null;
}
}
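
A short usage sketch for this new enum, mirroring how Options.parse (further down) turns the audio_source=... server argument into an Android MediaRecorder constant. This is an illustrative fragment, not a complete program:

    AudioSource audioSource = AudioSource.findByName("mic");
    if (audioSource == null) {
        throw new IllegalArgumentException("Audio source mic not supported");
    }
    int androidSource = audioSource.value(); // MediaRecorder.AudioSource.MIC
    // "output" maps to MediaRecorder.AudioSource.REMOTE_SUBMIX, the default in Options.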

View File

@ -84,7 +84,8 @@ public class Controller implements AsyncProcessor {
} }
} }
public void start() { @Override
public void start(TerminationListener listener) {
thread = new Thread(() -> { thread = new Thread(() -> {
try { try {
control(); control();
@ -92,12 +93,14 @@ public class Controller implements AsyncProcessor {
// this is expected on close // this is expected on close
} finally { } finally {
Ln.d("Controller stopped"); Ln.d("Controller stopped");
listener.onTerminated(true);
} }
}); }, "control-recv");
thread.start(); thread.start();
sender.start(); sender.start();
} }
@Override
public void stop() { public void stop() {
if (thread != null) { if (thread != null) {
thread.interrupt(); thread.interrupt();
@ -105,6 +108,7 @@ public class Controller implements AsyncProcessor {
sender.stop(); sender.stop();
} }
@Override
public void join() throws InterruptedException { public void join() throws InterruptedException {
if (thread != null) { if (thread != null) {
thread.join(); thread.join();

View File

@ -41,7 +41,7 @@ public final class DesktopConnection implements Closeable {
controlInputStream = null; controlInputStream = null;
controlOutputStream = null; controlOutputStream = null;
} }
videoFd = videoSocket.getFileDescriptor(); videoFd = videoSocket != null ? videoSocket.getFileDescriptor() : null;
audioFd = audioSocket != null ? audioSocket.getFileDescriptor() : null; audioFd = audioSocket != null ? audioSocket.getFileDescriptor() : null;
} }
@ -60,7 +60,8 @@ public final class DesktopConnection implements Closeable {
return SOCKET_NAME_PREFIX + String.format("_%08x", scid); return SOCKET_NAME_PREFIX + String.format("_%08x", scid);
} }
public static DesktopConnection open(int scid, boolean tunnelForward, boolean audio, boolean control, boolean sendDummyByte) throws IOException { public static DesktopConnection open(int scid, boolean tunnelForward, boolean video, boolean audio, boolean control, boolean sendDummyByte)
throws IOException {
String socketName = getSocketName(scid); String socketName = getSocketName(scid);
LocalSocket videoSocket = null; LocalSocket videoSocket = null;
@ -69,20 +70,35 @@ public final class DesktopConnection implements Closeable {
try { try {
if (tunnelForward) { if (tunnelForward) {
try (LocalServerSocket localServerSocket = new LocalServerSocket(socketName)) { try (LocalServerSocket localServerSocket = new LocalServerSocket(socketName)) {
videoSocket = localServerSocket.accept(); if (video) {
if (sendDummyByte) { videoSocket = localServerSocket.accept();
// send one byte so the client may read() to detect a connection error if (sendDummyByte) {
videoSocket.getOutputStream().write(0); // send one byte so the client may read() to detect a connection error
videoSocket.getOutputStream().write(0);
sendDummyByte = false;
}
} }
if (audio) { if (audio) {
audioSocket = localServerSocket.accept(); audioSocket = localServerSocket.accept();
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
audioSocket.getOutputStream().write(0);
sendDummyByte = false;
}
} }
if (control) { if (control) {
controlSocket = localServerSocket.accept(); controlSocket = localServerSocket.accept();
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
controlSocket.getOutputStream().write(0);
sendDummyByte = false;
}
} }
} }
} else { } else {
videoSocket = connect(socketName); if (video) {
videoSocket = connect(socketName);
}
if (audio) { if (audio) {
audioSocket = connect(socketName); audioSocket = connect(socketName);
} }
@ -106,10 +122,22 @@ public final class DesktopConnection implements Closeable {
return new DesktopConnection(videoSocket, audioSocket, controlSocket); return new DesktopConnection(videoSocket, audioSocket, controlSocket);
} }
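
As the comments above indicate, the dummy byte exists so that the client can detect a connection error right away: in tunnel-forward mode the local connect() can succeed even if nothing is listening on the device side, so the client read()s one byte on whichever socket it opens first and treats end-of-stream as a failure. The actual client is written in C; the snippet below is only a Java illustration of the same check:

    import java.io.IOException;
    import java.io.InputStream;

    final class DummyByteCheck {
        // Returns true if the dummy byte (0) arrived, false on end-of-stream, which
        // means the tunnel was not actually connected to the server.
        static boolean serverReachable(InputStream firstSocketInput) throws IOException {
            return firstSocketInput.read() != -1;
        }
    }
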
private LocalSocket getFirstSocket() {
if (videoSocket != null) {
return videoSocket;
}
if (audioSocket != null) {
return audioSocket;
}
return controlSocket;
}
public void close() throws IOException { public void close() throws IOException {
videoSocket.shutdownInput(); if (videoSocket != null) {
videoSocket.shutdownOutput(); videoSocket.shutdownInput();
videoSocket.close(); videoSocket.shutdownOutput();
videoSocket.close();
}
if (audioSocket != null) { if (audioSocket != null) {
audioSocket.shutdownInput(); audioSocket.shutdownInput();
audioSocket.shutdownOutput(); audioSocket.shutdownOutput();
@ -130,7 +158,8 @@ public final class DesktopConnection implements Closeable {
System.arraycopy(deviceNameBytes, 0, buffer, 0, len); System.arraycopy(deviceNameBytes, 0, buffer, 0, len);
// byte[] are always 0-initialized in java, no need to set '\0' explicitly // byte[] are always 0-initialized in java, no need to set '\0' explicitly
IO.writeFully(videoFd, buffer, 0, buffer.length); FileDescriptor fd = getFirstSocket().getFileDescriptor();
IO.writeFully(fd, buffer, 0, buffer.length);
} }
public FileDescriptor getVideoFd() { public FileDescriptor getVideoFd() {

View File

@ -12,6 +12,7 @@ import android.os.Build;
import android.os.IBinder; import android.os.IBinder;
import android.os.SystemClock; import android.os.SystemClock;
import android.view.IRotationWatcher; import android.view.IRotationWatcher;
import android.view.IDisplayFoldListener;
import android.view.InputDevice; import android.view.InputDevice;
import android.view.InputEvent; import android.view.InputEvent;
import android.view.KeyCharacterMap; import android.view.KeyCharacterMap;
@ -35,6 +36,10 @@ public final class Device {
void onRotationChanged(int rotation); void onRotationChanged(int rotation);
} }
public interface FoldListener {
void onFoldChanged(int displayId, boolean folded);
}
public interface ClipboardListener { public interface ClipboardListener {
void onClipboardTextChanged(String text); void onClipboardTextChanged(String text);
} }
@ -46,6 +51,7 @@ public final class Device {
private ScreenInfo screenInfo; private ScreenInfo screenInfo;
private RotationListener rotationListener; private RotationListener rotationListener;
private FoldListener foldListener;
private ClipboardListener clipboardListener; private ClipboardListener clipboardListener;
private final AtomicBoolean isSettingClipboard = new AtomicBoolean(); private final AtomicBoolean isSettingClipboard = new AtomicBoolean();
@ -93,6 +99,33 @@ public final class Device {
} }
}, displayId); }, displayId);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
ServiceManager.getWindowManager().registerDisplayFoldListener(new IDisplayFoldListener.Stub() {
@Override
public void onDisplayFoldChanged(int displayId, boolean folded) {
if (Device.this.displayId != displayId) {
// Ignore events related to other display ids
return;
}
synchronized (Device.this) {
DisplayInfo displayInfo = ServiceManager.getDisplayManager().getDisplayInfo(displayId);
if (displayInfo == null) {
Ln.e("Display " + displayId + " not found\n" + LogUtils.buildDisplayListMessage());
return;
}
screenInfo = ScreenInfo.computeScreenInfo(displayInfo.getRotation(), displayInfo.getSize(), options.getCrop(),
options.getMaxSize(), options.getLockVideoOrientation());
// notify
if (foldListener != null) {
foldListener.onFoldChanged(displayId, folded);
}
}
}
});
}
if (options.getControl() && options.getClipboardAutosync()) { if (options.getControl() && options.getClipboardAutosync()) {
// If control and autosync are enabled, synchronize Android clipboard to the computer automatically // If control and autosync are enabled, synchronize Android clipboard to the computer automatically
ClipboardManager clipboardManager = ServiceManager.getClipboardManager(); ClipboardManager clipboardManager = ServiceManager.getClipboardManager();
@ -224,6 +257,10 @@ public final class Device {
this.rotationListener = rotationListener; this.rotationListener = rotationListener;
} }
public synchronized void setFoldListener(FoldListener foldlistener) {
this.foldListener = foldlistener;
}
public synchronized void setClipboardListener(ClipboardListener clipboardListener) { public synchronized void setClipboardListener(ClipboardListener clipboardListener) {
this.clipboardListener = clipboardListener; this.clipboardListener = clipboardListener;
} }

View File

@ -60,7 +60,7 @@ public final class DeviceMessageSender {
} finally { } finally {
Ln.d("Device message sender stopped"); Ln.d("Device message sender stopped");
} }
}); }, "control-send");
thread.start(); thread.start();
} }

View File

@ -2,11 +2,11 @@ package com.genymobile.scrcpy;
import android.annotation.TargetApi; import android.annotation.TargetApi;
import android.content.AttributionSource; import android.content.AttributionSource;
import android.content.ContextWrapper; import android.content.MutableContextWrapper;
import android.os.Build; import android.os.Build;
import android.os.Process; import android.os.Process;
public final class FakeContext extends ContextWrapper { public final class FakeContext extends MutableContextWrapper {
public static final String PACKAGE_NAME = "com.android.shell"; public static final String PACKAGE_NAME = "com.android.shell";
public static final int ROOT_UID = 0; // Like android.os.Process.ROOT_UID, but before API 29 public static final int ROOT_UID = 0; // Like android.os.Process.ROOT_UID, but before API 29

View File

@ -3,15 +3,18 @@ package com.genymobile.scrcpy;
import android.graphics.Rect; import android.graphics.Rect;
import java.util.List; import java.util.List;
import java.util.Locale;
public class Options { public class Options {
private Ln.Level logLevel = Ln.Level.DEBUG; private Ln.Level logLevel = Ln.Level.DEBUG;
private int scid = -1; // 31-bit non-negative value, or -1 private int scid = -1; // 31-bit non-negative value, or -1
private boolean video = true;
private boolean audio = true; private boolean audio = true;
private int maxSize; private int maxSize;
private VideoCodec videoCodec = VideoCodec.H264; private VideoCodec videoCodec = VideoCodec.H264;
private AudioCodec audioCodec = AudioCodec.OPUS; private AudioCodec audioCodec = AudioCodec.OPUS;
private AudioSource audioSource = AudioSource.OUTPUT;
private int videoBitRate = 8000000; private int videoBitRate = 8000000;
private int audioBitRate = 128000; private int audioBitRate = 128000;
private int maxFps; private int maxFps;
@ -46,166 +49,90 @@ public class Options {
return logLevel; return logLevel;
} }
public void setLogLevel(Ln.Level logLevel) {
this.logLevel = logLevel;
}
public int getScid() { public int getScid() {
return scid; return scid;
} }
public void setScid(int scid) { public boolean getVideo() {
this.scid = scid; return video;
} }
public boolean getAudio() { public boolean getAudio() {
return audio; return audio;
} }
public void setAudio(boolean audio) {
this.audio = audio;
}
public int getMaxSize() { public int getMaxSize() {
return maxSize; return maxSize;
} }
public void setMaxSize(int maxSize) {
this.maxSize = maxSize;
}
public VideoCodec getVideoCodec() { public VideoCodec getVideoCodec() {
return videoCodec; return videoCodec;
} }
public void setVideoCodec(VideoCodec videoCodec) {
this.videoCodec = videoCodec;
}
public AudioCodec getAudioCodec() { public AudioCodec getAudioCodec() {
return audioCodec; return audioCodec;
} }
public void setAudioCodec(AudioCodec audioCodec) { public AudioSource getAudioSource() {
this.audioCodec = audioCodec; return audioSource;
} }
public int getVideoBitRate() { public int getVideoBitRate() {
return videoBitRate; return videoBitRate;
} }
public void setVideoBitRate(int videoBitRate) {
this.videoBitRate = videoBitRate;
}
public int getAudioBitRate() { public int getAudioBitRate() {
return audioBitRate; return audioBitRate;
} }
public void setAudioBitRate(int audioBitRate) {
this.audioBitRate = audioBitRate;
}
public int getMaxFps() { public int getMaxFps() {
return maxFps; return maxFps;
} }
public void setMaxFps(int maxFps) {
this.maxFps = maxFps;
}
public int getLockVideoOrientation() { public int getLockVideoOrientation() {
return lockVideoOrientation; return lockVideoOrientation;
} }
public void setLockVideoOrientation(int lockVideoOrientation) {
this.lockVideoOrientation = lockVideoOrientation;
}
public boolean isTunnelForward() { public boolean isTunnelForward() {
return tunnelForward; return tunnelForward;
} }
public void setTunnelForward(boolean tunnelForward) {
this.tunnelForward = tunnelForward;
}
public Rect getCrop() { public Rect getCrop() {
return crop; return crop;
} }
public void setCrop(Rect crop) {
this.crop = crop;
}
public boolean getControl() { public boolean getControl() {
return control; return control;
} }
public void setControl(boolean control) {
this.control = control;
}
public int getDisplayId() { public int getDisplayId() {
return displayId; return displayId;
} }
public void setDisplayId(int displayId) {
this.displayId = displayId;
}
public boolean getShowTouches() { public boolean getShowTouches() {
return showTouches; return showTouches;
} }
public void setShowTouches(boolean showTouches) {
this.showTouches = showTouches;
}
public boolean getStayAwake() { public boolean getStayAwake() {
return stayAwake; return stayAwake;
} }
public void setStayAwake(boolean stayAwake) {
this.stayAwake = stayAwake;
}
public List<CodecOption> getVideoCodecOptions() { public List<CodecOption> getVideoCodecOptions() {
return videoCodecOptions; return videoCodecOptions;
} }
public void setVideoCodecOptions(List<CodecOption> videoCodecOptions) {
this.videoCodecOptions = videoCodecOptions;
}
public List<CodecOption> getAudioCodecOptions() { public List<CodecOption> getAudioCodecOptions() {
return audioCodecOptions; return audioCodecOptions;
} }
public void setAudioCodecOptions(List<CodecOption> audioCodecOptions) {
this.audioCodecOptions = audioCodecOptions;
}
public String getVideoEncoder() { public String getVideoEncoder() {
return videoEncoder; return videoEncoder;
} }
public void setVideoEncoder(String videoEncoder) {
this.videoEncoder = videoEncoder;
}
public String getAudioEncoder() { public String getAudioEncoder() {
return audioEncoder; return audioEncoder;
} }
public void setAudioEncoder(String audioEncoder) {
this.audioEncoder = audioEncoder;
}
public void setPowerOffScreenOnClose(boolean powerOffScreenOnClose) {
this.powerOffScreenOnClose = powerOffScreenOnClose;
}
public boolean getPowerOffScreenOnClose() { public boolean getPowerOffScreenOnClose() {
return this.powerOffScreenOnClose; return this.powerOffScreenOnClose;
} }
@ -214,79 +141,214 @@ public class Options {
return clipboardAutosync; return clipboardAutosync;
} }
public void setClipboardAutosync(boolean clipboardAutosync) {
this.clipboardAutosync = clipboardAutosync;
}
public boolean getDownsizeOnError() { public boolean getDownsizeOnError() {
return downsizeOnError; return downsizeOnError;
} }
public void setDownsizeOnError(boolean downsizeOnError) {
this.downsizeOnError = downsizeOnError;
}
public boolean getCleanup() { public boolean getCleanup() {
return cleanup; return cleanup;
} }
public void setCleanup(boolean cleanup) {
this.cleanup = cleanup;
}
public boolean getPowerOn() { public boolean getPowerOn() {
return powerOn; return powerOn;
} }
public void setPowerOn(boolean powerOn) {
this.powerOn = powerOn;
}
public boolean getListEncoders() { public boolean getListEncoders() {
return listEncoders; return listEncoders;
} }
public void setListEncoders(boolean listEncoders) {
this.listEncoders = listEncoders;
}
public boolean getListDisplays() { public boolean getListDisplays() {
return listDisplays; return listDisplays;
} }
public void setListDisplays(boolean listDisplays) {
this.listDisplays = listDisplays;
}
public boolean getSendDeviceMeta() { public boolean getSendDeviceMeta() {
return sendDeviceMeta; return sendDeviceMeta;
} }
public void setSendDeviceMeta(boolean sendDeviceMeta) {
this.sendDeviceMeta = sendDeviceMeta;
}
public boolean getSendFrameMeta() { public boolean getSendFrameMeta() {
return sendFrameMeta; return sendFrameMeta;
} }
public void setSendFrameMeta(boolean sendFrameMeta) {
this.sendFrameMeta = sendFrameMeta;
}
public boolean getSendDummyByte() { public boolean getSendDummyByte() {
return sendDummyByte; return sendDummyByte;
} }
public void setSendDummyByte(boolean sendDummyByte) {
this.sendDummyByte = sendDummyByte;
}
public boolean getSendCodecMeta() { public boolean getSendCodecMeta() {
return sendCodecMeta; return sendCodecMeta;
} }
public void setSendCodecMeta(boolean sendCodecMeta) { @SuppressWarnings("MethodLength")
this.sendCodecMeta = sendCodecMeta; public static Options parse(String... args) {
if (args.length < 1) {
throw new IllegalArgumentException("Missing client version");
}
String clientVersion = args[0];
if (!clientVersion.equals(BuildConfig.VERSION_NAME)) {
throw new IllegalArgumentException(
"The server version (" + BuildConfig.VERSION_NAME + ") does not match the client " + "(" + clientVersion + ")");
}
Options options = new Options();
for (int i = 1; i < args.length; ++i) {
String arg = args[i];
int equalIndex = arg.indexOf('=');
if (equalIndex == -1) {
throw new IllegalArgumentException("Invalid key=value pair: \"" + arg + "\"");
}
String key = arg.substring(0, equalIndex);
String value = arg.substring(equalIndex + 1);
switch (key) {
case "scid":
int scid = Integer.parseInt(value, 0x10);
if (scid < -1) {
throw new IllegalArgumentException("scid may not be negative (except -1 for 'none'): " + scid);
}
options.scid = scid;
break;
case "log_level":
options.logLevel = Ln.Level.valueOf(value.toUpperCase(Locale.ENGLISH));
break;
case "video":
options.video = Boolean.parseBoolean(value);
break;
case "audio":
options.audio = Boolean.parseBoolean(value);
break;
case "video_codec":
VideoCodec videoCodec = VideoCodec.findByName(value);
if (videoCodec == null) {
throw new IllegalArgumentException("Video codec " + value + " not supported");
}
options.videoCodec = videoCodec;
break;
case "audio_codec":
AudioCodec audioCodec = AudioCodec.findByName(value);
if (audioCodec == null) {
throw new IllegalArgumentException("Audio codec " + value + " not supported");
}
options.audioCodec = audioCodec;
break;
case "audio_source":
AudioSource audioSource = AudioSource.findByName(value);
if (audioSource == null) {
throw new IllegalArgumentException("Audio source " + value + " not supported");
}
options.audioSource = audioSource;
break;
case "max_size":
options.maxSize = Integer.parseInt(value) & ~7; // multiple of 8
break;
case "video_bit_rate":
options.videoBitRate = Integer.parseInt(value);
break;
case "audio_bit_rate":
options.audioBitRate = Integer.parseInt(value);
break;
case "max_fps":
options.maxFps = Integer.parseInt(value);
break;
case "lock_video_orientation":
options.lockVideoOrientation = Integer.parseInt(value);
break;
case "tunnel_forward":
options.tunnelForward = Boolean.parseBoolean(value);
break;
case "crop":
options.crop = parseCrop(value);
break;
case "control":
options.control = Boolean.parseBoolean(value);
break;
case "display_id":
options.displayId = Integer.parseInt(value);
break;
case "show_touches":
options.showTouches = Boolean.parseBoolean(value);
break;
case "stay_awake":
options.stayAwake = Boolean.parseBoolean(value);
break;
case "video_codec_options":
options.videoCodecOptions = CodecOption.parse(value);
break;
case "audio_codec_options":
options.audioCodecOptions = CodecOption.parse(value);
break;
case "video_encoder":
if (!value.isEmpty()) {
options.videoEncoder = value;
}
break;
case "audio_encoder":
if (!value.isEmpty()) {
options.audioEncoder = value;
}
case "power_off_on_close":
options.powerOffScreenOnClose = Boolean.parseBoolean(value);
break;
case "clipboard_autosync":
options.clipboardAutosync = Boolean.parseBoolean(value);
break;
case "downsize_on_error":
options.downsizeOnError = Boolean.parseBoolean(value);
break;
case "cleanup":
options.cleanup = Boolean.parseBoolean(value);
break;
case "power_on":
options.powerOn = Boolean.parseBoolean(value);
break;
case "list_encoders":
options.listEncoders = Boolean.parseBoolean(value);
break;
case "list_displays":
options.listDisplays = Boolean.parseBoolean(value);
break;
case "send_device_meta":
options.sendDeviceMeta = Boolean.parseBoolean(value);
break;
case "send_frame_meta":
options.sendFrameMeta = Boolean.parseBoolean(value);
break;
case "send_dummy_byte":
options.sendDummyByte = Boolean.parseBoolean(value);
break;
case "send_codec_meta":
options.sendCodecMeta = Boolean.parseBoolean(value);
break;
case "raw_stream":
boolean rawStream = Boolean.parseBoolean(value);
if (rawStream) {
options.sendDeviceMeta = false;
options.sendFrameMeta = false;
options.sendDummyByte = false;
options.sendCodecMeta = false;
}
break;
default:
Ln.w("Unknown server option: " + key);
break;
}
}
return options;
}
private static Rect parseCrop(String crop) {
if (crop.isEmpty()) {
return null;
}
// input format: "width:height:x:y"
String[] tokens = crop.split(":");
if (tokens.length != 4) {
throw new IllegalArgumentException("Crop must contains 4 values separated by colons: \"" + crop + "\"");
}
int width = Integer.parseInt(tokens[0]);
int height = Integer.parseInt(tokens[1]);
int x = Integer.parseInt(tokens[2]);
int y = Integer.parseInt(tokens[3]);
return new Rect(x, y, x + width, y + height);
} }
} }
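
A worked example of the relocated parser, as an illustrative fragment: the version string is a placeholder that must match BuildConfig.VERSION_NAME of the deployed server, and the other values are arbitrary.

    Options options = Options.parse(
            "2.1.1",            // placeholder client version, must equal BuildConfig.VERSION_NAME
            "scid=1abc4567",    // parsed as hexadecimal (radix 16)
            "log_level=debug",  // upper-cased to Ln.Level.DEBUG
            "audio_source=mic", // resolved via AudioSource.findByName()
            "max_size=1927");   // masked to a multiple of 8, so getMaxSize() returns 1920

    // Crop strings are "width:height:x:y"; parseCrop("1224:1440:0:0") yields
    // new Rect(0, 0, 1224, 1440).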

View File

@ -8,6 +8,7 @@ import android.media.MediaCodecInfo;
import android.media.MediaFormat; import android.media.MediaFormat;
import android.os.Build; import android.os.Build;
import android.os.IBinder; import android.os.IBinder;
import android.os.Looper;
import android.os.SystemClock; import android.os.SystemClock;
import android.view.Surface; import android.view.Surface;
@ -16,7 +17,7 @@ import java.nio.ByteBuffer;
import java.util.List; import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean; import java.util.concurrent.atomic.AtomicBoolean;
public class ScreenEncoder implements Device.RotationListener { public class ScreenEncoder implements Device.RotationListener, Device.FoldListener, AsyncProcessor {
private static final int DEFAULT_I_FRAME_INTERVAL = 10; // seconds private static final int DEFAULT_I_FRAME_INTERVAL = 10; // seconds
private static final int REPEAT_FRAME_DELAY_US = 100_000; // repeat after 100ms private static final int REPEAT_FRAME_DELAY_US = 100_000; // repeat after 100ms
@ -26,7 +27,7 @@ public class ScreenEncoder implements Device.RotationListener {
private static final int[] MAX_SIZE_FALLBACK = {2560, 1920, 1600, 1280, 1024, 800}; private static final int[] MAX_SIZE_FALLBACK = {2560, 1920, 1600, 1280, 1024, 800};
private static final int MAX_CONSECUTIVE_ERRORS = 3; private static final int MAX_CONSECUTIVE_ERRORS = 3;
private final AtomicBoolean rotationChanged = new AtomicBoolean(); private final AtomicBoolean resetCapture = new AtomicBoolean();
private final Device device; private final Device device;
private final Streamer streamer; private final Streamer streamer;
@ -39,6 +40,9 @@ public class ScreenEncoder implements Device.RotationListener {
private boolean firstFrameSent; private boolean firstFrameSent;
private int consecutiveErrors; private int consecutiveErrors;
private Thread thread;
private final AtomicBoolean stopped = new AtomicBoolean();
public ScreenEncoder(Device device, Streamer streamer, int videoBitRate, int maxFps, List<CodecOption> codecOptions, String encoderName, public ScreenEncoder(Device device, Streamer streamer, int videoBitRate, int maxFps, List<CodecOption> codecOptions, String encoderName,
boolean downsizeOnError) { boolean downsizeOnError) {
this.device = device; this.device = device;
@ -50,21 +54,27 @@ public class ScreenEncoder implements Device.RotationListener {
this.downsizeOnError = downsizeOnError; this.downsizeOnError = downsizeOnError;
} }
@Override
public void onFoldChanged(int displayId, boolean folded) {
resetCapture.set(true);
}
@Override @Override
public void onRotationChanged(int rotation) { public void onRotationChanged(int rotation) {
rotationChanged.set(true); resetCapture.set(true);
} }
public boolean consumeRotationChange() { private boolean consumeResetCapture() {
return rotationChanged.getAndSet(false); return resetCapture.getAndSet(false);
} }
public void streamScreen() throws IOException, ConfigurationException { private void streamScreen() throws IOException, ConfigurationException {
Codec codec = streamer.getCodec(); Codec codec = streamer.getCodec();
MediaCodec mediaCodec = createMediaCodec(codec, encoderName); MediaCodec mediaCodec = createMediaCodec(codec, encoderName);
MediaFormat format = createFormat(codec.getMimeType(), videoBitRate, maxFps, codecOptions); MediaFormat format = createFormat(codec.getMimeType(), videoBitRate, maxFps, codecOptions);
IBinder display = createDisplay(); IBinder display = createDisplay();
device.setRotationListener(this); device.setRotationListener(this);
device.setFoldListener(this);
streamer.writeVideoHeader(device.getScreenInfo().getVideoSize()); streamer.writeVideoHeader(device.getScreenInfo().getVideoSize());
@ -112,6 +122,7 @@ public class ScreenEncoder implements Device.RotationListener {
} finally { } finally {
mediaCodec.release(); mediaCodec.release();
device.setRotationListener(null); device.setRotationListener(null);
device.setFoldListener(null);
SurfaceControl.destroyDisplay(display); SurfaceControl.destroyDisplay(display);
} }
} }
@ -137,7 +148,6 @@ public class ScreenEncoder implements Device.RotationListener {
// Downsizing on error is only enabled if an encoding failure occurs before the first frame (downsizing later could be surprising) // Downsizing on error is only enabled if an encoding failure occurs before the first frame (downsizing later could be surprising)
int newMaxSize = chooseMaxSizeFallback(screenInfo.getVideoSize()); int newMaxSize = chooseMaxSizeFallback(screenInfo.getVideoSize());
Ln.i("newMaxSize = " + newMaxSize);
if (newMaxSize == 0) { if (newMaxSize == 0) {
// Must definitively fail // Must definitively fail
return false; return false;
@ -163,12 +173,17 @@ public class ScreenEncoder implements Device.RotationListener {
private boolean encode(MediaCodec codec, Streamer streamer) throws IOException { private boolean encode(MediaCodec codec, Streamer streamer) throws IOException {
boolean eof = false; boolean eof = false;
boolean alive = true;
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo(); MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
while (!consumeRotationChange() && !eof) { while (!consumeResetCapture() && !eof) {
if (stopped.get()) {
alive = false;
break;
}
int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1); int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1);
try { try {
if (consumeRotationChange()) { if (consumeResetCapture()) {
// must restart encoding with new size // must restart encoding with new size
break; break;
} }
@ -193,7 +208,7 @@ public class ScreenEncoder implements Device.RotationListener {
} }
} }
return !eof; return !eof && alive;
} }
private static MediaCodec createMediaCodec(Codec codec, String encoderName) throws IOException, ConfigurationException { private static MediaCodec createMediaCodec(Codec codec, String encoderName) throws IOException, ConfigurationException {
@ -267,4 +282,42 @@ public class ScreenEncoder implements Device.RotationListener {
SurfaceControl.closeTransaction(); SurfaceControl.closeTransaction();
} }
} }
@Override
public void start(TerminationListener listener) {
thread = new Thread(() -> {
// Some devices (Meizu) deadlock if the video encoding thread has no Looper
// <https://github.com/Genymobile/scrcpy/issues/4143>
Looper.prepare();
try {
streamScreen();
} catch (ConfigurationException e) {
// Do not print stack trace, a user-friendly error-message has already been logged
} catch (IOException e) {
// Broken pipe is expected on close, because the socket is closed by the client
if (!IO.isBrokenPipe(e)) {
Ln.e("Video encoding error", e);
}
} finally {
Ln.d("Screen streaming stopped");
listener.onTerminated(true);
}
}, "video");
thread.start();
}
@Override
public void stop() {
if (thread != null) {
stopped.set(true);
}
}
@Override
public void join() throws InterruptedException {
if (thread != null) {
thread.join();
}
}
} }

View File

@ -1,16 +1,43 @@
package com.genymobile.scrcpy; package com.genymobile.scrcpy;
import android.graphics.Rect;
import android.os.BatteryManager; import android.os.BatteryManager;
import android.os.Build; import android.os.Build;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.List; import java.util.List;
import java.util.Locale;
public final class Server { public final class Server {
private static class Completion {
private int running;
private boolean fatalError;
Completion(int running) {
this.running = running;
}
synchronized void addCompleted(boolean fatalError) {
--running;
if (fatalError) {
this.fatalError = true;
}
if (running == 0 || this.fatalError) {
notify();
}
}
synchronized void await() {
try {
while (running > 0 && !fatalError) {
wait();
}
} catch (InterruptedException e) {
// ignore
}
}
}
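
The wiring for this helper appears further down in scrcpy(): each AsyncProcessor reports back through its TerminationListener. As a minimal illustration of the semantics, await() returns once every processor has completed, or as soon as any of them reports a fatal error:

    Completion completion = new Completion(2);
    new Thread(() -> completion.addCompleted(false)).start(); // normal termination
    new Thread(() -> completion.addCompleted(true)).start();  // fatal error unblocks await() early
    completion.await();
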
private Server() { private Server() {
// not instantiable // not instantiable
} }
@ -60,7 +87,7 @@ public final class Server {
} }
private static void scrcpy(Options options) throws IOException, ConfigurationException { private static void scrcpy(Options options) throws IOException, ConfigurationException {
Ln.i("Device: " + Build.MANUFACTURER + " " + Build.MODEL + " (Android " + Build.VERSION.RELEASE + ")"); Ln.i("Device: [" + Build.MANUFACTURER + "] " + Build.BRAND + " " + Build.MODEL + " (Android " + Build.VERSION.RELEASE + ")");
final Device device = new Device(options); final Device device = new Device(options);
Thread initThread = startInitThread(options); Thread initThread = startInitThread(options);
@ -68,33 +95,16 @@ public final class Server {
int scid = options.getScid(); int scid = options.getScid();
boolean tunnelForward = options.isTunnelForward(); boolean tunnelForward = options.isTunnelForward();
boolean control = options.getControl(); boolean control = options.getControl();
boolean video = options.getVideo();
boolean audio = options.getAudio(); boolean audio = options.getAudio();
boolean sendDummyByte = options.getSendDummyByte(); boolean sendDummyByte = options.getSendDummyByte();
Workarounds.prepareMainLooper(); Workarounds.apply(audio);
// Workarounds must be applied for Meizu phones:
// - <https://github.com/Genymobile/scrcpy/issues/240>
// - <https://github.com/Genymobile/scrcpy/issues/365>
// - <https://github.com/Genymobile/scrcpy/issues/2656>
//
// But only apply when strictly necessary, since workarounds can cause other issues:
// - <https://github.com/Genymobile/scrcpy/issues/940>
// - <https://github.com/Genymobile/scrcpy/issues/994>
if (Build.BRAND.equalsIgnoreCase("meizu")) {
Workarounds.fillAppInfo();
}
// Before Android 11, audio is not supported.
// Since Android 12, we can properly set a context on the AudioRecord.
// Only on Android 11 we must fill the application context for the AudioRecord to work.
if (audio && Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
Workarounds.fillAppContext();
}
List<AsyncProcessor> asyncProcessors = new ArrayList<>(); List<AsyncProcessor> asyncProcessors = new ArrayList<>();
try (DesktopConnection connection = DesktopConnection.open(scid, tunnelForward, audio, control, sendDummyByte)) { DesktopConnection connection = DesktopConnection.open(scid, tunnelForward, video, audio, control, sendDummyByte);
try {
if (options.getSendDeviceMeta()) { if (options.getSendDeviceMeta()) {
connection.sendDeviceMeta(Device.getDeviceName()); connection.sendDeviceMeta(Device.getDeviceName());
} }
@ -107,38 +117,35 @@ public final class Server {
if (audio) { if (audio) {
AudioCodec audioCodec = options.getAudioCodec(); AudioCodec audioCodec = options.getAudioCodec();
Streamer audioStreamer = new Streamer(connection.getAudioFd(), audioCodec, options.getSendCodecMeta(), AudioCapture audioCapture = new AudioCapture(options.getAudioSource());
options.getSendFrameMeta()); Streamer audioStreamer = new Streamer(connection.getAudioFd(), audioCodec, options.getSendCodecMeta(), options.getSendFrameMeta());
AsyncProcessor audioRecorder; AsyncProcessor audioRecorder;
if (audioCodec == AudioCodec.RAW) { if (audioCodec == AudioCodec.RAW) {
audioRecorder = new AudioRawRecorder(audioStreamer); audioRecorder = new AudioRawRecorder(audioCapture, audioStreamer);
} else { } else {
audioRecorder = new AudioEncoder(audioStreamer, options.getAudioBitRate(), options.getAudioCodecOptions(), audioRecorder = new AudioEncoder(audioCapture, audioStreamer, options.getAudioBitRate(), options.getAudioCodecOptions(),
options.getAudioEncoder()); options.getAudioEncoder());
} }
asyncProcessors.add(audioRecorder); asyncProcessors.add(audioRecorder);
} }
Streamer videoStreamer = new Streamer(connection.getVideoFd(), options.getVideoCodec(), options.getSendCodecMeta(), if (video) {
options.getSendFrameMeta()); Streamer videoStreamer = new Streamer(connection.getVideoFd(), options.getVideoCodec(), options.getSendCodecMeta(),
ScreenEncoder screenEncoder = new ScreenEncoder(device, videoStreamer, options.getVideoBitRate(), options.getMaxFps(), options.getSendFrameMeta());
options.getVideoCodecOptions(), options.getVideoEncoder(), options.getDownsizeOnError()); ScreenEncoder screenEncoder = new ScreenEncoder(device, videoStreamer, options.getVideoBitRate(), options.getMaxFps(),
options.getVideoCodecOptions(), options.getVideoEncoder(), options.getDownsizeOnError());
asyncProcessors.add(screenEncoder);
}
Completion completion = new Completion(asyncProcessors.size());
for (AsyncProcessor asyncProcessor : asyncProcessors) { for (AsyncProcessor asyncProcessor : asyncProcessors) {
asyncProcessor.start(); asyncProcessor.start((fatalError) -> {
completion.addCompleted(fatalError);
});
} }
try { completion.await();
// synchronous
screenEncoder.streamScreen();
} catch (IOException e) {
// Broken pipe is expected on close, because the socket is closed by the client
if (!IO.isBrokenPipe(e)) {
Ln.e("Video encoding error", e);
}
}
} finally { } finally {
Ln.d("Screen streaming stopped");
initThread.interrupt(); initThread.interrupt();
for (AsyncProcessor asyncProcessor : asyncProcessors) { for (AsyncProcessor asyncProcessor : asyncProcessors) {
asyncProcessor.stop(); asyncProcessor.stop();
@ -152,212 +159,23 @@ public final class Server {
} catch (InterruptedException e) { } catch (InterruptedException e) {
// ignore // ignore
} }
connection.close();
} }
} }
private static Thread startInitThread(final Options options) { private static Thread startInitThread(final Options options) {
Thread thread = new Thread(() -> initAndCleanUp(options)); Thread thread = new Thread(() -> initAndCleanUp(options), "init-cleanup");
thread.start(); thread.start();
return thread; return thread;
} }
@SuppressWarnings("MethodLength")
private static Options createOptions(String... args) {
if (args.length < 1) {
throw new IllegalArgumentException("Missing client version");
}
String clientVersion = args[0];
if (!clientVersion.equals(BuildConfig.VERSION_NAME)) {
throw new IllegalArgumentException(
"The server version (" + BuildConfig.VERSION_NAME + ") does not match the client " + "(" + clientVersion + ")");
}
Options options = new Options();
for (int i = 1; i < args.length; ++i) {
String arg = args[i];
int equalIndex = arg.indexOf('=');
if (equalIndex == -1) {
throw new IllegalArgumentException("Invalid key=value pair: \"" + arg + "\"");
}
String key = arg.substring(0, equalIndex);
String value = arg.substring(equalIndex + 1);
switch (key) {
case "scid":
int scid = Integer.parseInt(value, 0x10);
if (scid < -1) {
throw new IllegalArgumentException("scid may not be negative (except -1 for 'none'): " + scid);
}
options.setScid(scid);
break;
case "log_level":
Ln.Level level = Ln.Level.valueOf(value.toUpperCase(Locale.ENGLISH));
options.setLogLevel(level);
break;
case "audio":
boolean audio = Boolean.parseBoolean(value);
options.setAudio(audio);
break;
case "video_codec":
VideoCodec videoCodec = VideoCodec.findByName(value);
if (videoCodec == null) {
throw new IllegalArgumentException("Video codec " + value + " not supported");
}
options.setVideoCodec(videoCodec);
break;
case "audio_codec":
AudioCodec audioCodec = AudioCodec.findByName(value);
if (audioCodec == null) {
throw new IllegalArgumentException("Audio codec " + value + " not supported");
}
options.setAudioCodec(audioCodec);
break;
case "max_size":
int maxSize = Integer.parseInt(value) & ~7; // multiple of 8
options.setMaxSize(maxSize);
break;
case "video_bit_rate":
int videoBitRate = Integer.parseInt(value);
options.setVideoBitRate(videoBitRate);
break;
case "audio_bit_rate":
int audioBitRate = Integer.parseInt(value);
options.setAudioBitRate(audioBitRate);
break;
case "max_fps":
int maxFps = Integer.parseInt(value);
options.setMaxFps(maxFps);
break;
case "lock_video_orientation":
int lockVideoOrientation = Integer.parseInt(value);
options.setLockVideoOrientation(lockVideoOrientation);
break;
case "tunnel_forward":
boolean tunnelForward = Boolean.parseBoolean(value);
options.setTunnelForward(tunnelForward);
break;
case "crop":
Rect crop = parseCrop(value);
options.setCrop(crop);
break;
case "control":
boolean control = Boolean.parseBoolean(value);
options.setControl(control);
break;
case "display_id":
int displayId = Integer.parseInt(value);
options.setDisplayId(displayId);
break;
case "show_touches":
boolean showTouches = Boolean.parseBoolean(value);
options.setShowTouches(showTouches);
break;
case "stay_awake":
boolean stayAwake = Boolean.parseBoolean(value);
options.setStayAwake(stayAwake);
break;
case "video_codec_options":
List<CodecOption> videoCodecOptions = CodecOption.parse(value);
options.setVideoCodecOptions(videoCodecOptions);
break;
case "audio_codec_options":
List<CodecOption> audioCodecOptions = CodecOption.parse(value);
options.setAudioCodecOptions(audioCodecOptions);
break;
case "video_encoder":
if (!value.isEmpty()) {
options.setVideoEncoder(value);
}
break;
case "audio_encoder":
if (!value.isEmpty()) {
options.setAudioEncoder(value);
}
case "power_off_on_close":
boolean powerOffScreenOnClose = Boolean.parseBoolean(value);
options.setPowerOffScreenOnClose(powerOffScreenOnClose);
break;
case "clipboard_autosync":
boolean clipboardAutosync = Boolean.parseBoolean(value);
options.setClipboardAutosync(clipboardAutosync);
break;
case "downsize_on_error":
boolean downsizeOnError = Boolean.parseBoolean(value);
options.setDownsizeOnError(downsizeOnError);
break;
case "cleanup":
boolean cleanup = Boolean.parseBoolean(value);
options.setCleanup(cleanup);
break;
case "power_on":
boolean powerOn = Boolean.parseBoolean(value);
options.setPowerOn(powerOn);
break;
case "list_encoders":
boolean listEncoders = Boolean.parseBoolean(value);
options.setListEncoders(listEncoders);
break;
case "list_displays":
boolean listDisplays = Boolean.parseBoolean(value);
options.setListDisplays(listDisplays);
break;
case "send_device_meta":
boolean sendDeviceMeta = Boolean.parseBoolean(value);
options.setSendDeviceMeta(sendDeviceMeta);
break;
case "send_frame_meta":
boolean sendFrameMeta = Boolean.parseBoolean(value);
options.setSendFrameMeta(sendFrameMeta);
break;
case "send_dummy_byte":
boolean sendDummyByte = Boolean.parseBoolean(value);
options.setSendDummyByte(sendDummyByte);
break;
case "send_codec_meta":
boolean sendCodecMeta = Boolean.parseBoolean(value);
options.setSendCodecMeta(sendCodecMeta);
break;
case "raw_video_stream":
boolean rawVideoStream = Boolean.parseBoolean(value);
if (rawVideoStream) {
options.setSendDeviceMeta(false);
options.setSendFrameMeta(false);
options.setSendDummyByte(false);
options.setSendCodecMeta(false);
}
break;
default:
Ln.w("Unknown server option: " + key);
break;
}
}
return options;
}
private static Rect parseCrop(String crop) {
if (crop.isEmpty()) {
return null;
}
// input format: "width:height:x:y"
String[] tokens = crop.split(":");
if (tokens.length != 4) {
throw new IllegalArgumentException("Crop must contains 4 values separated by colons: \"" + crop + "\"");
}
int width = Integer.parseInt(tokens[0]);
int height = Integer.parseInt(tokens[1]);
int x = Integer.parseInt(tokens[2]);
int y = Integer.parseInt(tokens[3]);
return new Rect(x, y, x + width, y + height);
}
public static void main(String... args) throws Exception { public static void main(String... args) throws Exception {
Thread.setDefaultUncaughtExceptionHandler((t, e) -> { Thread.setDefaultUncaughtExceptionHandler((t, e) -> {
Ln.e("Exception on thread " + t, e); Ln.e("Exception on thread " + t, e);
}); });
Options options = createOptions(args); Options options = Options.parse(args);
Ln.initLogLevel(options.getLogLevel()); Ln.initLogLevel(options.getLogLevel());

View File

@ -1,13 +1,23 @@
package com.genymobile.scrcpy; package com.genymobile.scrcpy;
import android.annotation.SuppressLint; import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.app.Application; import android.app.Application;
import android.content.AttributionSource;
import android.content.Context;
import android.content.ContextWrapper; import android.content.ContextWrapper;
import android.content.pm.ApplicationInfo; import android.content.pm.ApplicationInfo;
import android.media.AudioAttributes;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.os.Build;
import android.os.Looper; import android.os.Looper;
import android.os.Parcel;
import java.lang.ref.WeakReference;
import java.lang.reflect.Constructor; import java.lang.reflect.Constructor;
import java.lang.reflect.Field; import java.lang.reflect.Field;
import java.lang.reflect.Method;
public final class Workarounds { public final class Workarounds {
@ -18,8 +28,56 @@ public final class Workarounds {
// not instantiable // not instantiable
} }
public static void apply(boolean audio) {
Workarounds.prepareMainLooper();
boolean mustFillAppInfo = false;
boolean mustFillBaseContext = false;
boolean mustFillAppContext = false;
if (Build.BRAND.equalsIgnoreCase("meizu")) {
// Workarounds must be applied for Meizu phones:
// - <https://github.com/Genymobile/scrcpy/issues/240>
// - <https://github.com/Genymobile/scrcpy/issues/365>
// - <https://github.com/Genymobile/scrcpy/issues/2656>
//
// But only apply when strictly necessary, since workarounds can cause other issues:
// - <https://github.com/Genymobile/scrcpy/issues/940>
// - <https://github.com/Genymobile/scrcpy/issues/994>
mustFillAppInfo = true;
} else if (Build.BRAND.equalsIgnoreCase("honor")) {
// More workarounds must be applied for Honor devices:
// - <https://github.com/Genymobile/scrcpy/issues/4015>
//
// The system context must not be set for all devices, because it would cause other problems:
// - <https://github.com/Genymobile/scrcpy/issues/4015#issuecomment-1595382142>
// - <https://github.com/Genymobile/scrcpy/issues/3805#issuecomment-1596148031>
mustFillAppInfo = true;
mustFillBaseContext = true;
mustFillAppContext = true;
}
if (audio && Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
// Before Android 11, audio is not supported.
// Since Android 12, we can properly set a context on the AudioRecord.
// Only on Android 11 we must fill the application context for the AudioRecord to work.
mustFillAppContext = true;
}
if (mustFillAppInfo) {
Workarounds.fillAppInfo();
}
if (mustFillBaseContext) {
Workarounds.fillBaseContext();
}
if (mustFillAppContext) {
Workarounds.fillAppContext();
}
}
@SuppressWarnings("deprecation") @SuppressWarnings("deprecation")
public static void prepareMainLooper() { private static void prepareMainLooper() {
// Some devices internally create a Handler when creating an input Surface, causing an exception: // Some devices internally create a Handler when creating an input Surface, causing an exception:
// "Can't create handler inside thread that has not called Looper.prepare()" // "Can't create handler inside thread that has not called Looper.prepare()"
// <https://github.com/Genymobile/scrcpy/issues/240> // <https://github.com/Genymobile/scrcpy/issues/240>
@ -48,7 +106,7 @@ public final class Workarounds {
} }
@SuppressLint("PrivateApi,DiscouragedPrivateApi") @SuppressLint("PrivateApi,DiscouragedPrivateApi")
public static void fillAppInfo() { private static void fillAppInfo() {
try { try {
fillActivityThread(); fillActivityThread();
@ -77,7 +135,7 @@ public final class Workarounds {
} }
@SuppressLint("PrivateApi,DiscouragedPrivateApi") @SuppressLint("PrivateApi,DiscouragedPrivateApi")
public static void fillAppContext() { private static void fillAppContext() {
try { try {
fillActivityThread(); fillActivityThread();
@ -95,4 +153,153 @@ public final class Workarounds {
Ln.d("Could not fill app context: " + throwable.getMessage()); Ln.d("Could not fill app context: " + throwable.getMessage());
} }
} }
private static void fillBaseContext() {
try {
fillActivityThread();
Method getSystemContextMethod = activityThreadClass.getDeclaredMethod("getSystemContext");
Context context = (Context) getSystemContextMethod.invoke(activityThread);
FakeContext.get().setBaseContext(context);
} catch (Throwable throwable) {
// this is a workaround, so failing is not an error
Ln.d("Could not fill base context: " + throwable.getMessage());
}
}
@TargetApi(Build.VERSION_CODES.R)
@SuppressLint("WrongConstant,MissingPermission,BlockedPrivateApi,SoonBlockedPrivateApi,DiscouragedPrivateApi")
public static AudioRecord createAudioRecord(int source, int sampleRate, int channelConfig, int channels, int channelMask, int encoding) {
// Vivo (and maybe other third-party ROMs) modified the `AudioRecord` constructor so that it requires a `Context` from a real app environment.
//
// This method invokes the `AudioRecord(long nativeRecordInJavaObj)` constructor to create an empty `AudioRecord` instance, then uses
// reflection to initialize it the way the normal constructor (or `AudioRecord.Builder.build()`) does.
// As a result, the vendor-modified code is never executed.
try {
// AudioRecord audioRecord = new AudioRecord(0L);
Constructor<AudioRecord> audioRecordConstructor = AudioRecord.class.getDeclaredConstructor(long.class);
audioRecordConstructor.setAccessible(true);
AudioRecord audioRecord = audioRecordConstructor.newInstance(0L);
// audioRecord.mRecordingState = RECORDSTATE_STOPPED;
Field mRecordingStateField = AudioRecord.class.getDeclaredField("mRecordingState");
mRecordingStateField.setAccessible(true);
mRecordingStateField.set(audioRecord, AudioRecord.RECORDSTATE_STOPPED);
Looper looper = Looper.myLooper();
if (looper == null) {
looper = Looper.getMainLooper();
}
// audioRecord.mInitializationLooper = looper;
Field mInitializationLooperField = AudioRecord.class.getDeclaredField("mInitializationLooper");
mInitializationLooperField.setAccessible(true);
mInitializationLooperField.set(audioRecord, looper);
// Create `AudioAttributes` with fixed capture preset
int capturePreset = source;
AudioAttributes.Builder audioAttributesBuilder = new AudioAttributes.Builder();
Method setInternalCapturePresetMethod = AudioAttributes.Builder.class.getMethod("setInternalCapturePreset", int.class);
setInternalCapturePresetMethod.invoke(audioAttributesBuilder, capturePreset);
AudioAttributes attributes = audioAttributesBuilder.build();
            // audioRecord.mAudioAttributes = attributes;
            Field mAudioAttributesField = AudioRecord.class.getDeclaredField("mAudioAttributes");
            mAudioAttributesField.setAccessible(true);
            mAudioAttributesField.set(audioRecord, attributes);

            // audioRecord.audioParamCheck(capturePreset, sampleRate, encoding);
            Method audioParamCheckMethod = AudioRecord.class.getDeclaredMethod("audioParamCheck", int.class, int.class, int.class);
            audioParamCheckMethod.setAccessible(true);
            audioParamCheckMethod.invoke(audioRecord, capturePreset, sampleRate, encoding);

            // audioRecord.mChannelCount = channels
            Field mChannelCountField = AudioRecord.class.getDeclaredField("mChannelCount");
            mChannelCountField.setAccessible(true);
            mChannelCountField.set(audioRecord, channels);

            // audioRecord.mChannelMask = channelMask
            Field mChannelMaskField = AudioRecord.class.getDeclaredField("mChannelMask");
            mChannelMaskField.setAccessible(true);
            mChannelMaskField.set(audioRecord, channelMask);

            int minBufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, encoding);
            int bufferSizeInBytes = minBufferSize * 8;

            // audioRecord.audioBuffSizeCheck(bufferSizeInBytes)
            Method audioBuffSizeCheckMethod = AudioRecord.class.getDeclaredMethod("audioBuffSizeCheck", int.class);
            audioBuffSizeCheckMethod.setAccessible(true);
            audioBuffSizeCheckMethod.invoke(audioRecord, bufferSizeInBytes);

            final int channelIndexMask = 0;

            int[] sampleRateArray = new int[]{sampleRate};
            int[] session = new int[]{AudioManager.AUDIO_SESSION_ID_GENERATE};

            int initResult;
            if (Build.VERSION.SDK_INT < Build.VERSION_CODES.S) {
                // private native final int native_setup(Object audiorecord_this,
                //         Object /*AudioAttributes*/ attributes,
                //         int[] sampleRate, int channelMask, int channelIndexMask, int audioFormat,
                //         int buffSizeInBytes, int[] sessionId, String opPackageName,
                //         long nativeRecordInJavaObj);
                Method nativeSetupMethod = AudioRecord.class.getDeclaredMethod("native_setup", Object.class, Object.class, int[].class, int.class,
                        int.class, int.class, int.class, int[].class, String.class, long.class);
                nativeSetupMethod.setAccessible(true);
                initResult = (int) nativeSetupMethod.invoke(audioRecord, new WeakReference<AudioRecord>(audioRecord), attributes, sampleRateArray,
                        channelMask, channelIndexMask, audioRecord.getAudioFormat(), bufferSizeInBytes, session, FakeContext.get().getOpPackageName(),
                        0L);
            } else {
                // Assume `context` is never `null`
                AttributionSource attributionSource = FakeContext.get().getAttributionSource();

                // Assume `attributionSource.getPackageName()` is never null
                // ScopedParcelState attributionSourceState = attributionSource.asScopedParcelState()
                Method asScopedParcelStateMethod = AttributionSource.class.getDeclaredMethod("asScopedParcelState");
                asScopedParcelStateMethod.setAccessible(true);

                try (AutoCloseable attributionSourceState = (AutoCloseable) asScopedParcelStateMethod.invoke(attributionSource)) {
                    Method getParcelMethod = attributionSourceState.getClass().getDeclaredMethod("getParcel");
                    Parcel attributionSourceParcel = (Parcel) getParcelMethod.invoke(attributionSourceState);

                    // private native int native_setup(Object audiorecordThis,
                    //         Object /*AudioAttributes*/ attributes,
                    //         int[] sampleRate, int channelMask, int channelIndexMask, int audioFormat,
                    //         int buffSizeInBytes, int[] sessionId, @NonNull Parcel attributionSource,
                    //         long nativeRecordInJavaObj, int maxSharedAudioHistoryMs);
                    Method nativeSetupMethod = AudioRecord.class.getDeclaredMethod("native_setup", Object.class, Object.class, int[].class, int.class,
                            int.class, int.class, int.class, int[].class, Parcel.class, long.class, int.class);
                    nativeSetupMethod.setAccessible(true);
                    initResult = (int) nativeSetupMethod.invoke(audioRecord, new WeakReference<AudioRecord>(audioRecord), attributes, sampleRateArray,
                            channelMask, channelIndexMask, audioRecord.getAudioFormat(), bufferSizeInBytes, session, attributionSourceParcel, 0L, 0);
                }
            }

            if (initResult != AudioRecord.SUCCESS) {
                Ln.e("Error code " + initResult + " when initializing native AudioRecord object.");
                throw new RuntimeException("Cannot create AudioRecord");
            }

            // audioRecord.mSampleRate = sampleRate[0]
            Field mSampleRateField = AudioRecord.class.getDeclaredField("mSampleRate");
            mSampleRateField.setAccessible(true);
            mSampleRateField.set(audioRecord, sampleRateArray[0]);

            // audioRecord.mSessionId = session[0]
            Field mSessionIdField = AudioRecord.class.getDeclaredField("mSessionId");
            mSessionIdField.setAccessible(true);
            mSessionIdField.set(audioRecord, session[0]);

            // audioRecord.mState = AudioRecord.STATE_INITIALIZED
            Field mStateField = AudioRecord.class.getDeclaredField("mState");
            mStateField.setAccessible(true);
            mStateField.set(audioRecord, AudioRecord.STATE_INITIALIZED);

            return audioRecord;
        } catch (Exception e) {
            Ln.e("Failed to invoke AudioRecord.<init>.", e);
            throw new RuntimeException("Cannot create AudioRecord");
        }
    }
}
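
For orientation, the reflective construction above hand-assembles what the public API would normally do. The sketch below is not scrcpy code; it assumes a regular app Context with RECORD_AUDIO granted, which the shell-spawned server process does not have (hence the reflection), and the 48 kHz stereo 16-bit PCM values are only illustrative of what the surrounding code appears to configure.

import android.annotation.SuppressLint;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

// Public-API equivalent of the reflective construction (sketch only).
final class PublicApiAudioRecordSketch {
    @SuppressLint("MissingPermission")
    static AudioRecord build() {
        AudioFormat format = new AudioFormat.Builder()
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .setSampleRate(48000)
                .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
                .build();
        int minBufferSize = AudioRecord.getMinBufferSize(48000, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        // REMOTE_SUBMIX additionally requires the system-level CAPTURE_AUDIO_OUTPUT permission.
        return new AudioRecord.Builder()
                .setAudioSource(MediaRecorder.AudioSource.REMOTE_SUBMIX)
                .setAudioFormat(format)
                .setBufferSizeInBytes(minBufferSize * 8)
                .build();
    }
}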

View File

@@ -17,7 +17,7 @@ import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;

 @SuppressLint("PrivateApi,DiscouragedPrivateApi")
-public class ActivityManager {
+public final class ActivityManager {

     private final IInterface manager;
     private Method getContentProviderExternalMethod;

View File

@@ -11,7 +11,7 @@ import android.os.IInterface;
 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;

-public class ClipboardManager {
+public final class ClipboardManager {
     private final IInterface manager;
     private Method getPrimaryClipMethod;
     private Method setPrimaryClipMethod;

View File

@@ -14,7 +14,7 @@ import java.io.Closeable;
 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;

-public class ContentProvider implements Closeable {
+public final class ContentProvider implements Closeable {

     public static final String TABLE_SYSTEM = "system";
     public static final String TABLE_SECURE = "secure";

View File

@@ -14,13 +14,13 @@ public final class InputManager {
     public static final int INJECT_INPUT_EVENT_MODE_WAIT_FOR_RESULT = 1;
     public static final int INJECT_INPUT_EVENT_MODE_WAIT_FOR_FINISH = 2;

-    private final android.hardware.input.InputManager manager;
+    private final Object manager;
     private Method injectInputEventMethod;

     private static Method setDisplayIdMethod;
     private static Method setActionButtonMethod;

-    public InputManager(android.hardware.input.InputManager manager) {
+    public InputManager(Object manager) {
         this.manager = manager;
     }
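
Widening the field to Object lets the wrapper hold either an android.hardware.input.InputManager or an InputManagerGlobal instance; methods are then resolved from the object's runtime class. A minimal sketch of that pattern follows (it assumes injectInputEvent(InputEvent, int) exists on both classes; this is not the exact scrcpy implementation):

import android.view.InputEvent;

import java.lang.reflect.Method;

// Sketch: resolve the method from whatever class `manager` actually is, so the
// same wrapper works for both InputManager and InputManagerGlobal.
final class InputInjectorSketch {
    private final Object manager;

    InputInjectorSketch(Object manager) {
        this.manager = manager;
    }

    boolean injectInputEvent(InputEvent event, int mode) {
        try {
            Method method = manager.getClass().getMethod("injectInputEvent", InputEvent.class, int.class);
            return (boolean) method.invoke(manager, event, mode);
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }
}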

View File

@@ -62,11 +62,21 @@ public final class ServiceManager {
         return displayManager;
     }

+    public static Class<?> getInputManagerClass() {
+        try {
+            // Parts of the InputManager class have been moved to a new InputManagerGlobal class in Android 14 preview
+            return Class.forName("android.hardware.input.InputManagerGlobal");
+        } catch (ClassNotFoundException e) {
+            return android.hardware.input.InputManager.class;
+        }
+    }
+
     public static InputManager getInputManager() {
         if (inputManager == null) {
             try {
-                Method getInstanceMethod = android.hardware.input.InputManager.class.getDeclaredMethod("getInstance");
-                android.hardware.input.InputManager im = (android.hardware.input.InputManager) getInstanceMethod.invoke(null);
+                Class<?> inputManagerClass = getInputManagerClass();
+                Method getInstanceMethod = inputManagerClass.getDeclaredMethod("getInstance");
+                Object im = getInstanceMethod.invoke(null);
                 inputManager = new InputManager(im);
             } catch (NoSuchMethodException | IllegalAccessException | InvocationTargetException e) {
                 throw new AssertionError(e);
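
The same "prefer the new class, fall back to the old one" lookup can be generalized into a small helper. A sketch (not scrcpy code; it uses only the class names referenced above):

import java.util.Arrays;

final class ClassFallbackSketch {
    // Return the first class name that resolves on this Android version.
    static Class<?> firstAvailableClass(String... names) {
        for (String name : names) {
            try {
                return Class.forName(name);
            } catch (ClassNotFoundException ignored) {
                // try the next candidate
            }
        }
        throw new AssertionError("None of the classes exist: " + Arrays.toString(names));
    }

    // Equivalent to getInputManagerClass() above:
    static Class<?> inputManagerClass() {
        return firstAvailableClass("android.hardware.input.InputManagerGlobal", "android.hardware.input.InputManager");
    }
}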

View File

@@ -7,7 +7,7 @@ import android.os.IInterface;
 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;

-public class StatusBarManager {
+public final class StatusBarManager {

     private final IInterface manager;
     private Method expandNotificationsPanelMethod;

View File

@@ -2,8 +2,10 @@ package com.genymobile.scrcpy.wrappers;

 import com.genymobile.scrcpy.Ln;

+import android.annotation.TargetApi;
 import android.os.IInterface;
 import android.view.IRotationWatcher;
+import android.view.IDisplayFoldListener;

 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;

@@ -105,7 +107,17 @@ public final class WindowManager {
                 cls.getMethod("watchRotation", IRotationWatcher.class).invoke(manager, rotationWatcher);
             }
         } catch (Exception e) {
-            throw new AssertionError(e);
+            Ln.e("Could not register rotation watcher", e);
         }
     }

+    @TargetApi(29)
+    public void registerDisplayFoldListener(IDisplayFoldListener foldListener) {
+        try {
+            Class<?> cls = manager.getClass();
+            cls.getMethod("registerDisplayFoldListener", IDisplayFoldListener.class).invoke(manager, foldListener);
+        } catch (Exception e) {
+            Ln.e("Could not register display fold listener", e);
+        }
+    }
 }
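
Callers pass a Binder stub implementing the hidden android.view.IDisplayFoldListener AIDL interface, whose callback is assumed to be onDisplayFoldChanged(int displayId, boolean folded). A hypothetical call site, not the exact scrcpy one, assuming ServiceManager.getWindowManager() returns the wrapper extended above:

import android.view.IDisplayFoldListener;

import com.genymobile.scrcpy.Ln;
import com.genymobile.scrcpy.wrappers.ServiceManager;

final class FoldListenerSketch {
    static void register() {
        // React to fold/unfold events on foldable devices.
        ServiceManager.getWindowManager().registerDisplayFoldListener(new IDisplayFoldListener.Stub() {
            @Override
            public void onDisplayFoldChanged(int displayId, boolean folded) {
                Ln.i("Display " + displayId + " is now " + (folded ? "folded" : "unfolded"));
            }
        });
    }
}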

View File

@@ -12,7 +12,6 @@ import java.io.IOException;
 import java.nio.charset.StandardCharsets;
 import java.util.Arrays;

 public class ControlMessageReaderTest {

     @Test