Compare commits


3 Commits

Author SHA1 Message Date
9bff8ccadb Apply workaround on "honor" devices
This makes audio work on those devices.

Fixes #4015 <https://github.com/Genymobile/scrcpy/issues/4015>
2023-06-17 00:25:10 +02:00
0d4157357a Use system context as base context
DONOTMERGE: it causes #994 on Xiaomi devices

This allows Context.getPackageManager() to work.

Fixes #4015 <https://github.com/Genymobile/scrcpy/issues/4015>
Refs <https://github.com/Genymobile/scrcpy/issues/4015#issuecomment-1594262721>

Co-authored-by: Simon Chan <1330321+yume-chan@users.noreply.github.com>
2023-06-17 00:25:10 +02:00
95e61e2a0b Move ActivityThread instance
An ActivityThread instance will be needed from several classes.
2023-06-17 00:25:10 +02:00
77 changed files with 689 additions and 2460 deletions

.gitignore vendored
View File

@ -7,4 +7,3 @@ build/
.gradle/
/x/
local.properties
/scrcpy-server

FAQ.md
View File

@ -159,8 +159,6 @@ In developer options, enable:
> **USB debugging (Security settings)**
> _Allow granting permissions and simulating input via USB debugging_
Rebooting the device is necessary once this option is set.
[simulating input]: https://github.com/Genymobile/scrcpy/issues/70#issuecomment-373286323
@ -170,12 +168,12 @@ The default text injection method is [limited to ASCII characters][text-input].
A trick allows to also inject some [accented characters][accented-characters],
but that's all. See [#37].
It is also possible to simulate a [physical keyboard][hid] (HID).
Since scrcpy v1.20, it is possible to simulate a [physical keyboard][hid] (HID).
[text-input]: https://github.com/Genymobile/scrcpy/issues?q=is%3Aopen+is%3Aissue+label%3Aunicode
[accented-characters]: https://blog.rom1v.com/2018/03/introducing-scrcpy/#handle-accented-characters
[#37]: https://github.com/Genymobile/scrcpy/issues/37
[hid]: doc/hid-otg.md
[hid]: README.md#physical-keyboard-simulation-hid
## Client issues
@ -231,4 +229,4 @@ Translations of this FAQ in other languages are available in the [wiki].
[wiki]: https://github.com/Genymobile/scrcpy/wiki
Only this FAQ file is guaranteed to be up-to-date.
Only this README file is guaranteed to be up-to-date.

View File

@ -1,11 +1,11 @@
# scrcpy (v2.1.1)
# scrcpy (v2.0)
<img src="app/data/icon.svg" width="128" height="128" alt="scrcpy" align="right" />
_pronounced "**scr**een **c**o**py**"_
This application mirrors Android devices (video and audio) connected via
USB or [over TCP/IP](doc/connection.md#tcpip-wireless), and allows to control the
USB or [over TCP/IP](doc/device.md#tcpip-wireless), and allows to control the
device with the keyboard and the mouse of the computer. It does not require any
_root_ access. It works on _Linux_, _Windows_ and _macOS_.
@ -25,13 +25,12 @@ It focuses on:
[lowlatency]: https://github.com/Genymobile/scrcpy/pull/646
Its features include:
- [audio forwarding](doc/audio.md) (Android 11+)
- [audio forwarding](doc/audio.md) (Android >= 11)
- [recording](doc/recording.md)
- mirroring with [Android device screen off](doc/device.md#turn-screen-off)
- [copy-paste](doc/control.md#copy-paste) in both directions
- [configurable quality](doc/video.md)
- [camera mirroring](doc/camera.md) (Android 12+)
- [mirroring as a webcam (V4L2)](doc/v4l2.md) (Linux-only)
- Android device [as a webcam (V4L2)](doc/v4l2.md) (Linux-only)
- [physical keyboard/mouse simulation (HID)](doc/hid-otg.md)
- [OTG mode](doc/hid-otg.md#otg)
- and more…
@ -40,7 +39,7 @@ Its features include:
The Android device requires at least API 21 (Android 5.0).
[Audio forwarding](doc/audio.md) is supported for API >= 30 (Android 11+).
[Audio forwarding](doc/audio.md) is supported from API 30 (Android 11).
Make sure you [enabled USB debugging][enable-adb] on your device(s).
@ -48,14 +47,10 @@ Make sure you [enabled USB debugging][enable-adb] on your device(s).
On some devices, you also need to enable [an additional option][control] `USB
debugging (Security Settings)` (this is an item different from `USB debugging`)
to control it using a keyboard and mouse. Rebooting the device is necessary once
this option is set.
to control it using a keyboard and mouse.
[control]: https://github.com/Genymobile/scrcpy/issues/70#issuecomment-373286323
Note that USB debugging is not required to run scrcpy in [OTG
mode](doc/hid-otg.md#otg).
## Get the app
@ -69,16 +64,14 @@ mode](doc/hid-otg.md#otg).
The application provides a lot of features and configuration options. They are
documented in the following pages:
- [Connection](doc/connection.md)
- [Device](doc/device.md)
- [Video](doc/video.md)
- [Audio](doc/audio.md)
- [Control](doc/control.md)
- [Device](doc/device.md)
- [Window](doc/window.md)
- [Recording](doc/recording.md)
- [Tunnels](doc/tunnels.md)
- [HID/OTG](doc/hid-otg.md)
- [Camera](doc/camera.md)
- [Video4Linux](doc/v4l2.md)
- [Shortcuts](doc/shortcuts.md)
@ -120,10 +113,7 @@ For general questions or discussions, you can also use:
I'm [@rom1v](https://github.com/rom1v), the author and maintainer of _scrcpy_.
If you appreciate this application, you can [support my open source
work][donate]:
- [GitHub Sponsors](https://github.com/sponsors/rom1v)
- [Liberapay](https://liberapay.com/rom1v/)
- [PayPal](https://paypal.me/rom2v)
work][donate].
[donate]: https://blog.rom1v.com/about/#support-my-open-source-work

View File

@ -10,16 +10,10 @@ _scrcpy() {
--audio-source=
--audio-output-buffer=
-b --video-bit-rate=
--camera-ar=
--camera-id=
--camera-facing=
--camera-fps=
--camera-high-speed
--camera-size=
--crop=
-d --select-usb
--disable-screensaver
--display-id=
--display=
--display-buffer=
-e --select-tcpip
-f --fullscreen
@ -29,8 +23,6 @@ _scrcpy() {
--kill-adb-on-close
-K --hid-keyboard
--legacy-paste
--list-camera-sizes
--list-cameras
--list-displays
--list-encoders
--lock-video-orientation
@ -52,8 +44,6 @@ _scrcpy() {
--no-video-playback
--otg
-p --port=
--pause-on-exit
--pause-on-exit=
--power-off-on-close
--prefer-text
--print-fps
@ -80,7 +70,6 @@ _scrcpy() {
--video-codec=
--video-codec-options=
--video-encoder=
--video-source=
-w --stay-awake
--window-borderless
--window-title=
@ -100,26 +89,14 @@ _scrcpy() {
COMPREPLY=($(compgen -W 'opus aac raw' -- "$cur"))
return
;;
--video-source)
COMPREPLY=($(compgen -W 'display camera' -- "$cur"))
return
;;
--audio-source)
COMPREPLY=($(compgen -W 'output mic' -- "$cur"))
return
;;
--camera-facing)
COMPREPLY=($(compgen -W 'front back external' -- "$cur"))
return
;;
--lock-video-orientation)
COMPREPLY=($(compgen -W 'unlocked initial 0 1 2 3' -- "$cur"))
return
;;
--pause-on-exit)
COMPREPLY=($(compgen -W 'true false if-error' -- "$cur"))
return
;;
-r|--record)
COMPREPLY=($(compgen -f -- "$cur"))
return
@ -156,12 +133,8 @@ _scrcpy() {
|--audio-codec-options \
|--audio-encoder \
|--audio-output-buffer \
|--camera-ar \
|--camera-id \
|--camera-fps \
|--camera-size \
|--crop \
|--display-id \
|--display \
|--display-buffer \
|--max-fps \
|-m|--max-size \

View File

@ -1,2 +1,4 @@
@echo off
scrcpy.exe --pause-on-exit=if-error %*
scrcpy.exe %*
:: if the exit code is >= 1, then pause
if errorlevel 1 pause

View File

@ -5,7 +5,7 @@ Comment=Display and control your Android device
# For some users, the PATH or ADB environment variables are set from the shell
# startup file, like .bashrc or .zshrc… Run an interactive shell to get
# environment correctly initialized.
Exec=/bin/sh -c "\"\\$SHELL\" -i -c scrcpy --pause-on-exit=if-error"
Exec=/bin/bash --norc --noprofile -i -c "\"\\$SHELL\" -i -c scrcpy || read -p 'Press Enter to quit...'"
Icon=scrcpy
Terminal=true
Type=Application

View File

@ -17,16 +17,10 @@ arguments=(
'--audio-source=[Select the audio source]:source:(output mic)'
'--audio-output-buffer=[Configure the size of the SDL audio output buffer (in milliseconds)]'
{-b,--video-bit-rate=}'[Encode the video at the given bit-rate]'
'--camera-ar=[Select the camera size by its aspect ratio]'
'--camera-high-speed=[Enable high-speed camera capture mode]'
'--camera-id=[Specify the camera id to mirror]'
'--camera-facing=[Select the device camera by its facing direction]:facing:(front back external)'
'--camera-fps=[Specify the camera capture frame rate]'
'--camera-size=[Specify an explicit camera capture size]'
'--crop=[\[width\:height\:x\:y\] Crop the device screen on the server]'
{-d,--select-usb}'[Use USB device]'
'--disable-screensaver[Disable screensaver while scrcpy is running]'
'--display-id=[Specify the display id to mirror]'
'--display=[Specify the display id to mirror]'
'--display-buffer=[Add a buffering delay \(in milliseconds\) before displaying]'
{-e,--select-tcpip}'[Use TCP/IP device]'
{-f,--fullscreen}'[Start in fullscreen]'
@ -36,8 +30,6 @@ arguments=(
'--kill-adb-on-close[Kill adb when scrcpy terminates]'
{-K,--hid-keyboard}'[Simulate a physical keyboard by using HID over AOAv2]'
'--legacy-paste[Inject computer clipboard text as a sequence of key events on Ctrl+v]'
'--list-camera-sizes[List the valid camera capture sizes]'
'--list-cameras[List cameras available on the device]'
'--list-displays[List displays available on the device]'
'--list-encoders[List video and audio encoders available on the device]'
'--lock-video-orientation=[Lock video orientation]:orientation:(unlocked initial 0 1 2 3)'
@ -58,7 +50,6 @@ arguments=(
'--no-video-playback[Disable video playback]'
'--otg[Run in OTG mode \(simulating physical keyboard and mouse\)]'
{-p,--port=}'[\[port\[\:port\]\] Set the TCP port \(range\) used by the client to listen]'
'--pause-on-exit=[Make scrcpy pause before exiting]:mode:(true false if-error)'
'--power-off-on-close[Turn the device screen off when closing scrcpy]'
'--prefer-text[Inject alpha characters and space as text events instead of key events]'
'--print-fps[Start FPS counter, to print frame logs to the console]'
@ -84,7 +75,6 @@ arguments=(
'--video-codec=[Select the video codec]:codec:(h264 h265 av1)'
'--video-codec-options=[Set a list of comma-separated key\:type=value options for the device video encoder]'
'--video-encoder=[Use a specific MediaCodec video encoder]'
'--video-source=[Select the video source]:source:(display camera)'
{-w,--stay-awake}'[Keep the device on while scrcpy is running, when the device is plugged in]'
'--window-borderless[Disable window decorations \(display borderless window\)]'
'--window-title=[Set a custom window title]'

View File

@ -6,10 +6,10 @@ cd "$DIR"
mkdir -p "$PREBUILT_DATA_DIR"
cd "$PREBUILT_DATA_DIR"
DEP_DIR=platform-tools-34.0.5
DEP_DIR=platform-tools-34.0.1
FILENAME=platform-tools_r34.0.5-windows.zip
SHA256SUM=3f8320152704377de150418a3c4c9d07d16d80a6c0d0d8f7289c22c499e33571
FILENAME=platform-tools_r34.0.1-windows.zip
SHA256SUM=5dd9c2be744c224fa3a7cbe30ba02d2cb378c763bd0f797a7e47e9f3156a5daa
if [[ -d "$DEP_DIR" ]]
then

View File

@ -6,10 +6,10 @@ cd "$DIR"
mkdir -p "$PREBUILT_DATA_DIR"
cd "$PREBUILT_DATA_DIR"
DEP_DIR=SDL2-2.28.4
DEP_DIR=SDL2-2.26.4
FILENAME=SDL2-devel-2.28.4-mingw.tar.gz
SHA256SUM=779d091072cf97291f80030f5232d97aa3d48ab0f2c14fe0b9d9a33c593cdc35
FILENAME=SDL2-devel-2.26.4-mingw.tar.gz
SHA256SUM=fe899c8642caac2f180b1ee6f786857ddcaa0adc1fa82474312b09dd47d74712
if [[ -d "$DEP_DIR" ]]
then

View File

@ -13,7 +13,7 @@ BEGIN
VALUE "LegalCopyright", "Romain Vimont, Genymobile"
VALUE "OriginalFilename", "scrcpy.exe"
VALUE "ProductName", "scrcpy"
VALUE "ProductVersion", "v2.2"
VALUE "ProductVersion", "2.0"
END
END
BLOCK "VarFileInfo"

View File

@ -21,7 +21,7 @@ Make scrcpy window always on top (above other windows).
.TP
.BI "\-\-audio\-bit\-rate " value
Encode the audio at the given bit rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Encode the audio at the given bit\-rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Default is 128K (128000).
@ -45,15 +45,15 @@ Set a list of comma-separated key:type=value options for the device audio encode
The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'.
The list of possible codec options is available in the Android documentation:
<https://d.android.com/reference/android/media/MediaFormat>
The list of possible codec options is available in the Android documentation
.UR https://d.android.com/reference/android/media/MediaFormat
.UE .
.TP
.BI "\-\-audio\-encoder " name
Use a specific MediaCodec audio encoder (depending on the codec provided by \fB\-\-audio\-codec\fR).
The available encoders can be listed by \fB\-\-list\-encoders\fR.
The available encoders can be listed by \-\-list\-encoders.
.TP
.BI "\-\-audio\-source " source
@ -71,44 +71,10 @@ Default is 5.
.TP
.BI "\-b, \-\-video\-bit\-rate " value
Encode the video at the given bit rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Encode the video at the given bit\-rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Default is 8M (8000000).
.TP
.BI "\-\-camera\-ar " ar
Select the camera size by its aspect ratio (+/- 10%).
Possible values are "sensor" (use the camera sensor aspect ratio), "\fInum\fR:\fIden\fR" (e.g. "4:3") and "\fIvalue\fR" (e.g. "1.6").
.TP
.B \-\-camera\-high\-speed
Enable high-speed camera capture mode.
This mode is restricted to specific resolutions and frame rates, listed by \fB\-\-list\-camera\-sizes\fR.
.TP
.BI "\-\-camera\-id " id
Specify the device camera id to mirror.
The available camera ids can be listed by \fB\-\-list\-cameras\fR.
.TP
.BI "\-\-camera\-facing " facing
Select the device camera by its facing direction.
Possible values are "front", "back" and "external".
.TP
.BI "\-\-camera\-fps " fps
Specify the camera capture frame rate.
If not specified, Android's default frame rate (30 fps) is used.
.TP
.BI "\-\-camera\-size " width\fRx\fIheight
Specify an explicit camera capture size.
.TP
.BI "\-\-crop " width\fR:\fIheight\fR:\fIx\fR:\fIy
Crop the device screen on the server.
@ -128,10 +94,10 @@ Also see \fB\-e\fR (\fB\-\-select\-tcpip\fR).
Disable screensaver while scrcpy is running.
.TP
.BI "\-\-display\-id " id
.BI "\-\-display " id
Specify the device display id to mirror.
The available display ids can be listed by \fB\-\-list\-displays\fR.
The available display ids can be listed by \-\-list\-displays.
Default is 0.
@ -189,14 +155,6 @@ Inject computer clipboard text as a sequence of key events on Ctrl+v (like MOD+S
This is a workaround for some devices not behaving as expected when setting the device clipboard programmatically.
.TP
.B \-\-list\-camera\-sizes
List the valid camera capture sizes.
.TP
.B \-\-list\-cameras
List cameras available on the device.
.TP
.B \-\-list\-encoders
List video and audio encoders available on the device.
@ -241,7 +199,7 @@ Disable device control (mirror the device in read\-only).
.TP
.B \-N, \-\-no\-playback
Disable video and audio playback on the computer (equivalent to \fB\-\-no\-video\-playback \-\-no\-audio\-playback\fR).
Disable video and audio playback on the computer (equivalent to --no-video-playback --no-audio-playback).
.TP
.B \-\-no\-audio
@ -309,16 +267,6 @@ Set the TCP port (range) used by the client to listen.
Default is 27183:27199.
.TP
\fB\-\-pause\-on\-exit\fR[=\fImode\fR]
Configure pause on exit. Possible values are "true" (always pause on exit), "false" (never pause on exit) and "if-error" (pause only if an error occurred).
This is useful to prevent the terminal window from automatically closing, so that error messages can be read.
Default is "false".
Passing the option without argument is equivalent to passing "true".
.TP
.B \-\-power\-off\-on\-close
Turn the device screen off when closing scrcpy.
@ -363,7 +311,8 @@ Request SDL to use the given render driver (this is just a hint).
Supported names are currently "direct3d", "opengl", "opengles2", "opengles", "metal" and "software".
<https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER>
.UR https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER
.UE
.TP
.B \-\-require\-audio
@ -411,13 +360,13 @@ Set the maximum mirroring time, in seconds.
.TP
.BI "\-\-tunnel\-host " ip
Set the IP address of the adb tunnel to reach the scrcpy server. This option automatically enables \fB\-\-force\-adb\-forward\fR.
Set the IP address of the adb tunnel to reach the scrcpy server. This option automatically enables --force-adb-forward.
Default is localhost.
.TP
.BI "\-\-tunnel\-port " port
Set the TCP port of the adb tunnel to reach the scrcpy server. This option automatically enables \fB\-\-force\-adb\-forward\fR.
Set the TCP port of the adb tunnel to reach the scrcpy server. This option automatically enables --force-adb-forward.
Default is 0 (not forced): the local port used for establishing the tunnel will be used.
@ -457,23 +406,15 @@ Set a list of comma-separated key:type=value options for the device video encode
The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'.
The list of possible codec options is available in the Android documentation:
<https://d.android.com/reference/android/media/MediaFormat>
The list of possible codec options is available in the Android documentation
.UR https://d.android.com/reference/android/media/MediaFormat
.UE .
.TP
.BI "\-\-video\-encoder " name
Use a specific MediaCodec video encoder (depending on the codec provided by \fB\-\-video\-codec\fR).
The available encoders can be listed by \fB\-\-list\-encoders\fR.
.TP
.BI "\-\-video\-source " source
Select the video source (display or camera).
Camera mirroring requires Android 12+.
Default is display.
The available encoders can be listed by \-\-list\-encoders.
.TP
.B \-w, \-\-stay-awake
@ -635,7 +576,7 @@ Path to adb.
.TP
.B ANDROID_SERIAL
Device serial to use if no selector (\fB-s\fR, \fB-d\fR, \fB-e\fR or \fB\-\-tcpip=\fIaddr\fR) is specified.
Device serial to use if no selector (-s, -d, -e or --tcpip=<addr>) is specified.
.TP
.B SCRCPY_ICON_PATH
@ -658,14 +599,23 @@ for the Debian Project (and may be used by others).
.SH "REPORTING BUGS"
Report bugs to <https://github.com/Genymobile/scrcpy/issues>.
Report bugs to
.UR https://github.com/Genymobile/scrcpy/issues
.UE .
.SH COPYRIGHT
Copyright \(co 2018 Genymobile <https://www.genymobile.com>
Copyright \(co 2018 Genymobile
.UR https://www.genymobile.com
Genymobile
.UE
Copyright \(co 2018\-2023 Romain Vimont <rom@rom1v.com>
Copyright \(co 2018\-2023
.MT rom@rom1v.com
Romain Vimont
.ME
Licensed under the Apache License, Version 2.0.
.SH WWW
<https://github.com/Genymobile/scrcpy>
.UR https://github.com/Genymobile/scrcpy
.UE

View File

@ -70,7 +70,7 @@ argv_to_string(const char *const *argv, char *buf, size_t bufsize) {
}
static void
show_adb_installation_msg(void) {
show_adb_installation_msg() {
#ifndef __WINDOWS__
static const struct {
const char *binary;
@ -218,16 +218,8 @@ sc_adb_forward(struct sc_intr *intr, const char *serial, uint16_t local_port,
const char *device_socket_name, unsigned flags) {
char local[4 + 5 + 1]; // tcp:PORT
char remote[108 + 14 + 1]; // localabstract:NAME
int r = snprintf(local, sizeof(local), "tcp:%" PRIu16, local_port);
assert(r >= 0 && (size_t) r < sizeof(local));
r = snprintf(remote, sizeof(remote), "localabstract:%s",
device_socket_name);
if (r < 0 || (size_t) r >= sizeof(remote)) {
LOGE("Could not write socket name");
return false;
}
sprintf(local, "tcp:%" PRIu16, local_port);
snprintf(remote, sizeof(remote), "localabstract:%s", device_socket_name);
assert(serial);
const char *const argv[] =
@ -241,9 +233,7 @@ bool
sc_adb_forward_remove(struct sc_intr *intr, const char *serial,
uint16_t local_port, unsigned flags) {
char local[4 + 5 + 1]; // tcp:PORT
int r = snprintf(local, sizeof(local), "tcp:%" PRIu16, local_port);
assert(r >= 0 && (size_t) r < sizeof(local));
(void) r;
sprintf(local, "tcp:%" PRIu16, local_port);
assert(serial);
const char *const argv[] =
@ -259,16 +249,8 @@ sc_adb_reverse(struct sc_intr *intr, const char *serial,
unsigned flags) {
char local[4 + 5 + 1]; // tcp:PORT
char remote[108 + 14 + 1]; // localabstract:NAME
int r = snprintf(local, sizeof(local), "tcp:%" PRIu16, local_port);
assert(r >= 0 && (size_t) r < sizeof(local));
r = snprintf(remote, sizeof(remote), "localabstract:%s",
device_socket_name);
if (r < 0 || (size_t) r >= sizeof(remote)) {
LOGE("Could not write socket name");
return false;
}
sprintf(local, "tcp:%" PRIu16, local_port);
snprintf(remote, sizeof(remote), "localabstract:%s", device_socket_name);
assert(serial);
const char *const argv[] =
SC_ADB_COMMAND("-s", serial, "reverse", remote, local);
@ -281,12 +263,7 @@ bool
sc_adb_reverse_remove(struct sc_intr *intr, const char *serial,
const char *device_socket_name, unsigned flags) {
char remote[108 + 14 + 1]; // localabstract:NAME
int r = snprintf(remote, sizeof(remote), "localabstract:%s",
device_socket_name);
if (r < 0 || (size_t) r >= sizeof(remote)) {
LOGE("Device socket name too long");
return false;
}
snprintf(remote, sizeof(remote), "localabstract:%s", device_socket_name);
assert(serial);
const char *const argv[] =
@ -356,9 +333,7 @@ bool
sc_adb_tcpip(struct sc_intr *intr, const char *serial, uint16_t port,
unsigned flags) {
char port_string[5 + 1];
int r = snprintf(port_string, sizeof(port_string), "%" PRIu16, port);
assert(r >= 0 && (size_t) r < sizeof(port_string));
(void) r;
sprintf(port_string, "%" PRIu16, port);
assert(serial);
const char *const argv[] =
@ -653,8 +628,8 @@ sc_adb_select_device(struct sc_intr *intr,
return false;
}
LOGI("ADB device found:");
sc_adb_devices_log(SC_LOG_LEVEL_INFO, vec.data, vec.size);
LOGD("ADB device found:");
sc_adb_devices_log(SC_LOG_LEVEL_DEBUG, vec.data, vec.size);
    // Move device into out_device (do not destroy device)
sc_adb_device_move(out_device, device);
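
The hunks above (sc_adb_forward and friends) differ in how the `tcp:PORT` and `localabstract:NAME` strings are formatted: one side uses plain `sprintf`, the other uses `snprintf` and checks the result. A minimal standalone sketch of the checked pattern (buffer sizes follow the same worst-case comments as above; the socket name is only an example):

```c
// Sketch of the checked-snprintf pattern, with illustrative inputs.
// Buffer sizes follow the same "worst case + terminator" comments as above.
#include <assert.h>
#include <inttypes.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

static bool
format_forward(uint16_t local_port, const char *device_socket_name) {
    char local[4 + 5 + 1];     // "tcp:" + 5-digit port + '\0'
    char remote[108 + 14 + 1]; // "localabstract:" + socket name + '\0'

    // A 16-bit port always fits, so truncation here would be a programming
    // error: assert instead of handling it at runtime.
    int r = snprintf(local, sizeof(local), "tcp:%" PRIu16, local_port);
    assert(r >= 0 && (size_t) r < sizeof(local));

    // The socket name comes from the caller, so truncation is possible and
    // must be reported instead of silently producing a wrong socket name.
    r = snprintf(remote, sizeof(remote), "localabstract:%s",
                 device_socket_name);
    if (r < 0 || (size_t) r >= sizeof(remote)) {
        fprintf(stderr, "Could not write socket name\n");
        return false;
    }

    printf("%s -> %s\n", local, remote);
    return true;
}

int main(void) {
    return format_forward(27183, "scrcpy_12345678") ? 0 : 1;
}
```

The port case is asserted rather than handled because a 16-bit value can never overflow its buffer, while the caller-supplied socket name can.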

View File

@ -7,7 +7,7 @@
#include "util/log.h"
#include "util/str.h"
static bool
bool
sc_adb_parse_device(char *line, struct sc_adb_device *device) {
// One device line looks like:
// "0123456789abcdef device usb:2-1 product:MyProduct model:MyModel "

View File

@ -32,7 +32,6 @@ enum {
OPT_WINDOW_BORDERLESS,
OPT_MAX_FPS,
OPT_LOCK_VIDEO_ORIENTATION,
OPT_DISPLAY,
OPT_DISPLAY_ID,
OPT_ROTATION,
OPT_RENDER_DRIVER,
@ -77,19 +76,9 @@ enum {
OPT_NO_VIDEO,
OPT_NO_AUDIO_PLAYBACK,
OPT_NO_VIDEO_PLAYBACK,
OPT_VIDEO_SOURCE,
OPT_AUDIO_SOURCE,
OPT_KILL_ADB_ON_CLOSE,
OPT_TIME_LIMIT,
OPT_PAUSE_ON_EXIT,
OPT_LIST_CAMERAS,
OPT_LIST_CAMERA_SIZES,
OPT_CAMERA_ID,
OPT_CAMERA_SIZE,
OPT_CAMERA_FACING,
OPT_CAMERA_AR,
OPT_CAMERA_FPS,
OPT_CAMERA_HIGH_SPEED,
};
struct sc_option {
@ -135,7 +124,7 @@ static const struct sc_option options[] = {
.longopt_id = OPT_AUDIO_BIT_RATE,
.longopt = "audio-bit-rate",
.argdesc = "value",
.text = "Encode the audio at the given bit rate, expressed in bits/s. "
.text = "Encode the audio at the given bit-rate, expressed in bits/s. "
"Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n"
"Default is 128K (128000).",
},
@ -196,7 +185,7 @@ static const struct sc_option options[] = {
.shortopt = 'b',
.longopt = "video-bit-rate",
.argdesc = "value",
.text = "Encode the video at the given bit rate, expressed in bits/s. "
.text = "Encode the video at the given bit-rate, expressed in bits/s. "
"Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n"
"Default is 8M (8000000).",
},
@ -206,51 +195,6 @@ static const struct sc_option options[] = {
.longopt = "bit-rate",
.argdesc = "value",
},
{
.longopt_id = OPT_CAMERA_AR,
.longopt = "camera-ar",
.argdesc = "ar",
.text = "Select the camera size by its aspect ratio (+/- 10%).\n"
"Possible values are \"sensor\" (use the camera sensor aspect "
"ratio), \"<num>:<den>\" (e.g. \"4:3\") or \"<value>\" (e.g. "
"\"1.6\")."
},
{
.longopt_id = OPT_CAMERA_ID,
.longopt = "camera-id",
.argdesc = "id",
.text = "Specify the device camera id to mirror.\n"
"The available camera ids can be listed by:\n"
" scrcpy --list-cameras",
},
{
.longopt_id = OPT_CAMERA_FACING,
.longopt = "camera-facing",
.argdesc = "facing",
.text = "Select the device camera by its facing direction.\n"
"Possible values are \"front\", \"back\" and \"external\".",
},
{
.longopt_id = OPT_CAMERA_HIGH_SPEED,
.longopt = "camera-high-speed",
.text = "Enable high-speed camera capture mode.\n"
"This mode is restricted to specific resolutions and frame "
"rates, listed by --list-camera-sizes.",
},
{
.longopt_id = OPT_CAMERA_SIZE,
.longopt = "camera-size",
.argdesc = "<width>x<height>",
.text = "Specify an explicit camera capture size.",
},
{
.longopt_id = OPT_CAMERA_FPS,
.longopt = "camera-fps",
.argdesc = "value",
.text = "Specify the camera capture frame rate.\n"
"If not specified, Android's default frame rate (30 fps) is "
"used.",
},
{
// Not really deprecated (--codec has never been released), but without
// declaring an explicit --codec option, getopt_long() partial matching
@ -286,15 +230,9 @@ static const struct sc_option options[] = {
.longopt = "disable-screensaver",
.text = "Disable screensaver while scrcpy is running.",
},
{
// deprecated
.longopt_id = OPT_DISPLAY,
.longopt = "display",
.argdesc = "id",
},
{
.longopt_id = OPT_DISPLAY_ID,
.longopt = "display-id",
.longopt = "display",
.argdesc = "id",
.text = "Specify the device display id to mirror.\n"
"The available display ids can be listed by:\n"
@ -374,16 +312,6 @@ static const struct sc_option options[] = {
"This is a workaround for some devices not behaving as "
"expected when setting the device clipboard programmatically.",
},
{
.longopt_id = OPT_LIST_CAMERAS,
.longopt = "list-cameras",
.text = "List device cameras.",
},
{
.longopt_id = OPT_LIST_CAMERA_SIZES,
.longopt = "list-camera-sizes",
.text = "List the valid camera capture sizes.",
},
{
.longopt_id = OPT_LIST_DISPLAYS,
.longopt = "list-displays",
@ -535,20 +463,6 @@ static const struct sc_option options[] = {
"Default is " STR(DEFAULT_LOCAL_PORT_RANGE_FIRST) ":"
STR(DEFAULT_LOCAL_PORT_RANGE_LAST) ".",
},
{
.longopt_id = OPT_PAUSE_ON_EXIT,
.longopt = "pause-on-exit",
.argdesc = "mode",
.optional_arg = true,
.text = "Configure pause on exit. Possible values are \"true\" (always "
"pause on exit), \"false\" (never pause on exit) and "
"\"if-error\" (pause only if an error occured).\n"
"This is useful to prevent the terminal window from "
"automatically closing, so that error messages can be read.\n"
"Default is \"false\".\n"
"Passing the option without argument is equivalent to passing "
"\"true\".",
},
{
.longopt_id = OPT_POWER_OFF_ON_CLOSE,
.longopt = "power-off-on-close",
@ -755,14 +669,6 @@ static const struct sc_option options[] = {
"codec provided by --video-codec).\n"
"The available encoders can be listed by --list-encoders.",
},
{
.longopt_id = OPT_VIDEO_SOURCE,
.longopt = "video-source",
.argdesc = "source",
.text = "Select the video source (display or camera).\n"
"Camera mirroring requires Android 12+.\n"
"Default is display.",
},
{
.shortopt = 'w',
.longopt = "stay-awake",
@ -1164,7 +1070,7 @@ print_shortcut(const struct sc_shortcut *shortcut, unsigned cols) {
while (shortcut->shortcuts[i]) {
printf(" %s\n", shortcut->shortcuts[i]);
++i;
}
};
char *text = sc_str_wrap_lines(shortcut->text, cols, 8);
if (!text) {
@ -1283,9 +1189,9 @@ parse_integer_arg(const char *s, long *out, bool accept_suffix, long min,
}
static size_t
parse_integers_arg(const char *s, const char sep, size_t max_items, long *out,
long min, long max, const char *name) {
size_t count = sc_str_parse_integers(s, sep, max_items, out);
parse_integers_arg(const char *s, size_t max_items, long *out, long min,
long max, const char *name) {
size_t count = sc_str_parse_integers(s, ':', max_items, out);
if (!count) {
LOGE("Could not parse %s: %s", name, s);
return 0;
@ -1332,7 +1238,7 @@ parse_max_size(const char *s, uint16_t *max_size) {
static bool
parse_max_fps(const char *s, uint16_t *max_fps) {
long value;
bool ok = parse_integer_arg(s, &value, false, 0, 0xFFFF, "max fps");
bool ok = parse_integer_arg(s, &value, false, 0, 1000, "max fps");
if (!ok) {
return false;
}
@ -1441,7 +1347,7 @@ parse_window_dimension(const char *s, uint16_t *dimension) {
static bool
parse_port_range(const char *s, struct sc_port_range *port_range) {
long values[2];
size_t count = parse_integers_arg(s, ':', 2, values, 0, 0xFFFF, "port");
size_t count = parse_integers_arg(s, 2, values, 0, 0xFFFF, "port");
if (!count) {
return false;
}
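
The two hunks above adjust how `parse_integers_arg()` splits a value like `27183:27199` on `:` and range-checks each part against 0..0xFFFF. The following self-contained sketch shows the same idea with plain `strtol`, without scrcpy's `sc_str_parse_integers()` helper:

```c
// Standalone sketch of parsing a "first[:last]" port range, as accepted by
// -p/--port. Bounds match the hunks above (0..0xFFFF); plain strtol is used
// instead of scrcpy's string helpers.
#include <errno.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

struct port_range {
    uint16_t first;
    uint16_t last;
};

static bool
parse_port_range(const char *s, struct port_range *out) {
    char *end;
    errno = 0;
    long first = strtol(s, &end, 10);
    if (end == s || errno || first < 0 || first > 0xFFFF) {
        return false;
    }

    long last = first; // a single value means first == last
    if (*end == ':') {
        const char *p = end + 1;
        errno = 0;
        last = strtol(p, &end, 10);
        if (end == p || errno || last < 0 || last > 0xFFFF) {
            return false;
        }
    }
    if (*end != '\0') {
        return false; // trailing garbage
    }

    out->first = (uint16_t) first;
    out->last = (uint16_t) last;
    return true;
}

int main(void) {
    struct port_range r;
    if (parse_port_range("27183:27199", &r)) {
        printf("first=%u last=%u\n", r.first, r.last);
    }
    return 0;
}
```
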
@ -1703,22 +1609,6 @@ parse_audio_codec(const char *optarg, enum sc_codec *codec) {
return false;
}
static bool
parse_video_source(const char *optarg, enum sc_video_source *source) {
if (!strcmp(optarg, "display")) {
*source = SC_VIDEO_SOURCE_DISPLAY;
return true;
}
if (!strcmp(optarg, "camera")) {
*source = SC_VIDEO_SOURCE_CAMERA;
return true;
}
LOGE("Unsupported video source: %s (expected display or camera)", optarg);
return false;
}
static bool
parse_audio_source(const char *optarg, enum sc_audio_source *source) {
if (!strcmp(optarg, "mic")) {
@ -1735,46 +1625,6 @@ parse_audio_source(const char *optarg, enum sc_audio_source *source) {
return false;
}
static bool
parse_camera_facing(const char *optarg, enum sc_camera_facing *facing) {
if (!strcmp(optarg, "front")) {
*facing = SC_CAMERA_FACING_FRONT;
return true;
}
if (!strcmp(optarg, "back")) {
*facing = SC_CAMERA_FACING_BACK;
return true;
}
if (!strcmp(optarg, "external")) {
*facing = SC_CAMERA_FACING_EXTERNAL;
return true;
}
if (*optarg == '\0') {
// Empty string is a valid value (equivalent to not passing the option)
*facing = SC_CAMERA_FACING_ANY;
return true;
}
LOGE("Unsupported camera facing: %s (expected front, back or external)",
optarg);
return false;
}
static bool
parse_camera_fps(const char *s, uint16_t *camera_fps) {
long value;
bool ok = parse_integer_arg(s, &value, false, 0, 0xFFFF, "camera fps");
if (!ok) {
return false;
}
*camera_fps = (uint16_t) value;
return true;
}
static bool
parse_time_limit(const char *s, sc_tick *tick) {
long value;
@ -1787,29 +1637,6 @@ parse_time_limit(const char *s, sc_tick *tick) {
return true;
}
static bool
parse_pause_on_exit(const char *s, enum sc_pause_on_exit *pause_on_exit) {
if (!s || !strcmp(s, "true")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_TRUE;
return true;
}
if (!strcmp(s, "false")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_FALSE;
return true;
}
if (!strcmp(s, "if-error")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_IF_ERROR;
return true;
}
LOGE("Unsupported pause on exit mode: %s "
"(expected true, false or if-error)", optarg);
return false;
}
static bool
parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
const char *optstring, const struct option *longopts) {
@ -1837,9 +1664,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
case OPT_CROP:
opts->crop = optarg;
break;
case OPT_DISPLAY:
LOGW("--display is deprecated, use --display-id instead.");
// fall through
case OPT_DISPLAY_ID:
if (!parse_display_id(optarg, &opts->display_id)) {
return false;
@ -2121,16 +1945,10 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false;
#endif
case OPT_LIST_ENCODERS:
opts->list |= SC_OPTION_LIST_ENCODERS;
opts->list_encoders = true;
break;
case OPT_LIST_DISPLAYS:
opts->list |= SC_OPTION_LIST_DISPLAYS;
break;
case OPT_LIST_CAMERAS:
opts->list |= SC_OPTION_LIST_CAMERAS;
break;
case OPT_LIST_CAMERA_SIZES:
opts->list |= SC_OPTION_LIST_CAMERA_SIZES;
opts->list_displays = true;
break;
case OPT_REQUIRE_AUDIO:
opts->require_audio = true;
@ -2146,11 +1964,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false;
}
break;
case OPT_VIDEO_SOURCE:
if (!parse_video_source(optarg, &opts->video_source)) {
return false;
}
break;
case OPT_AUDIO_SOURCE:
if (!parse_audio_source(optarg, &opts->audio_source)) {
return false;
@ -2164,33 +1977,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false;
}
break;
case OPT_PAUSE_ON_EXIT:
if (!parse_pause_on_exit(optarg, &args->pause_on_exit)) {
return false;
}
break;
case OPT_CAMERA_AR:
opts->camera_ar = optarg;
break;
case OPT_CAMERA_ID:
opts->camera_id = optarg;
break;
case OPT_CAMERA_SIZE:
opts->camera_size = optarg;
break;
case OPT_CAMERA_FACING:
if (!parse_camera_facing(optarg, &opts->camera_facing)) {
return false;
}
break;
case OPT_CAMERA_FPS:
if (!parse_camera_fps(optarg, &opts->camera_fps)) {
return false;
}
break;
case OPT_CAMERA_HIGH_SPEED:
opts->camera_high_speed = true;
break;
default:
// getopt prints the error message on stderr
return false;
@ -2236,6 +2022,12 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->audio_playback = false;
}
if (!opts->video_playback && !otg) {
        // If video playback and OTG are both disabled, then there is
        // no way to control the device.
opts->control = false;
}
if (opts->video && !opts->video_playback && !opts->record_filename
&& !v4l2) {
LOGI("No video playback, no recording, no V4L2 sink: video disabled");
@ -2284,58 +2076,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->force_adb_forward = true;
}
if (opts->video_source == SC_VIDEO_SOURCE_CAMERA) {
if (opts->display_id) {
LOGE("--display-id is only available with --video-source=display");
return false;
}
if (opts->camera_id && opts->camera_facing != SC_CAMERA_FACING_ANY) {
LOGE("Could not specify both --camera-id and --camera-facing");
return false;
}
if (opts->camera_size) {
if (opts->max_size) {
LOGE("Could not specify both --camera-size and -m/--max-size");
return false;
}
if (opts->camera_ar) {
LOGE("Could not specify both --camera-size and --camera-ar");
return false;
}
}
if (opts->camera_high_speed && !opts->camera_fps) {
LOGE("--camera-high-speed requires an explicit --camera-fps value");
return false;
}
if (opts->control) {
LOGI("Camera video source: control disabled");
opts->control = false;
}
} else if (opts->camera_id
|| opts->camera_ar
|| opts->camera_facing != SC_CAMERA_FACING_ANY
|| opts->camera_fps
|| opts->camera_high_speed
|| opts->camera_size) {
LOGE("Camera options are only available with --video-source=camera");
return false;
}
if (opts->audio && opts->audio_source == SC_AUDIO_SOURCE_AUTO) {
// Select the audio source according to the video source
if (opts->video_source == SC_VIDEO_SOURCE_DISPLAY) {
opts->audio_source = SC_AUDIO_SOURCE_OUTPUT;
} else {
opts->audio_source = SC_AUDIO_SOURCE_MIC;
LOGI("Camera video source: microphone audio source selected");
}
}
if (opts->record_format && !opts->record_filename) {
LOGE("Record format specified without recording");
return false;
@ -2456,37 +2196,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return true;
}
static enum sc_pause_on_exit
sc_get_pause_on_exit(int argc, char *argv[]) {
// Read arguments backwards so that the last --pause-on-exit is considered
// (same behavior as getopt())
for (int i = argc - 1; i >= 1; --i) {
const char *arg = argv[i];
// Starts with "--pause-on-exit"
if (!strncmp("--pause-on-exit", arg, 15)) {
if (arg[15] == '\0') {
// No argument
return SC_PAUSE_ON_EXIT_TRUE;
}
if (arg[15] != '=') {
// Invalid parameter, ignore
return SC_PAUSE_ON_EXIT_FALSE;
}
const char *value = &arg[16];
if (!strcmp(value, "true")) {
return SC_PAUSE_ON_EXIT_TRUE;
}
if (!strcmp(value, "if-error")) {
return SC_PAUSE_ON_EXIT_IF_ERROR;
}
            // Set to false, including when the value is invalid
return SC_PAUSE_ON_EXIT_FALSE;
}
}
return SC_PAUSE_ON_EXIT_FALSE;
}
bool
scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
struct sc_getopt_adapter adapter;
@ -2500,11 +2209,5 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
sc_getopt_adapter_destroy(&adapter);
if (!ret && args->pause_on_exit == SC_PAUSE_ON_EXIT_FALSE) {
// Check if "--pause-on-exit" is present in the arguments list, because
// it must be taken into account even if command line parsing failed
args->pause_on_exit = sc_get_pause_on_exit(argc, argv);
}
return ret;
}

View File

@ -7,17 +7,10 @@
#include "options.h"
enum sc_pause_on_exit {
SC_PAUSE_ON_EXIT_TRUE,
SC_PAUSE_ON_EXIT_FALSE,
SC_PAUSE_ON_EXIT_IF_ERROR,
};
struct scrcpy_cli_args {
struct scrcpy_options opts;
bool help;
bool version;
enum sc_pause_on_exit pause_on_exit;
};
void

View File

@ -53,7 +53,7 @@ sc_display_init(struct sc_display *display, SDL_Window *window, bool mipmaps) {
display->mipmaps = true;
} else {
LOGW("Trilinear filtering disabled "
"(OpenGL 3.0+ or ES 2.0+ required)");
"(OpenGL 3.0+ or ES 2.0+ required");
}
} else {
LOGI("Trilinear filtering disabled");

View File

@ -271,7 +271,7 @@ error:
}
SDL_Surface *
scrcpy_icon_load(void) {
scrcpy_icon_load() {
char *icon_path = get_icon_path();
if (!icon_path) {
return NULL;

View File

@ -23,7 +23,7 @@
#include "util/str.h"
#endif
static int
int
main_scrcpy(int argc, char *argv[]) {
#ifdef _WIN32
// disable buffering, we want logs immediately
@ -39,32 +39,26 @@ main_scrcpy(int argc, char *argv[]) {
.opts = scrcpy_options_default,
.help = false,
.version = false,
.pause_on_exit = SC_PAUSE_ON_EXIT_FALSE,
};
#ifndef NDEBUG
args.opts.log_level = SC_LOG_LEVEL_DEBUG;
#endif
enum scrcpy_exit_code ret;
if (!scrcpy_parse_args(&args, argc, argv)) {
ret = SCRCPY_EXIT_FAILURE;
goto end;
return SCRCPY_EXIT_FAILURE;
}
sc_set_log_level(args.opts.log_level);
if (args.help) {
scrcpy_print_usage(argv[0]);
ret = SCRCPY_EXIT_SUCCESS;
goto end;
return SCRCPY_EXIT_SUCCESS;
}
if (args.version) {
scrcpy_print_version();
ret = SCRCPY_EXIT_SUCCESS;
goto end;
return SCRCPY_EXIT_SUCCESS;
}
#ifdef SCRCPY_LAVF_REQUIRES_REGISTER_ALL
@ -78,26 +72,18 @@ main_scrcpy(int argc, char *argv[]) {
#endif
if (!net_init()) {
ret = SCRCPY_EXIT_FAILURE;
goto end;
return SCRCPY_EXIT_FAILURE;
}
sc_log_configure();
#ifdef HAVE_USB
ret = args.opts.otg ? scrcpy_otg(&args.opts) : scrcpy(&args.opts);
enum scrcpy_exit_code ret = args.opts.otg ? scrcpy_otg(&args.opts)
: scrcpy(&args.opts);
#else
ret = scrcpy(&args.opts);
enum scrcpy_exit_code ret = scrcpy(&args.opts);
#endif
end:
if (args.pause_on_exit == SC_PAUSE_ON_EXIT_TRUE ||
(args.pause_on_exit == SC_PAUSE_ON_EXIT_IF_ERROR &&
ret != SCRCPY_EXIT_SUCCESS)) {
printf("Press Enter to continue...\n");
getchar();
}
return ret;
}

View File

@ -11,19 +11,13 @@ const struct scrcpy_options scrcpy_options_default = {
.audio_codec_options = NULL,
.video_encoder = NULL,
.audio_encoder = NULL,
.camera_id = NULL,
.camera_size = NULL,
.camera_ar = NULL,
.camera_fps = 0,
.log_level = SC_LOG_LEVEL_INFO,
.video_codec = SC_CODEC_H264,
.audio_codec = SC_CODEC_OPUS,
.video_source = SC_VIDEO_SOURCE_DISPLAY,
.audio_source = SC_AUDIO_SOURCE_AUTO,
.audio_source = SC_AUDIO_SOURCE_OUTPUT,
.record_format = SC_RECORD_FORMAT_AUTO,
.keyboard_input_mode = SC_KEYBOARD_INPUT_MODE_INJECT,
.mouse_input_mode = SC_MOUSE_INPUT_MODE_INJECT,
.camera_facing = SC_CAMERA_FACING_ANY,
.port_range = {
.first = DEFAULT_LOCAL_PORT_RANGE_FIRST,
.last = DEFAULT_LOCAL_PORT_RANGE_LAST,
@ -85,7 +79,7 @@ const struct scrcpy_options scrcpy_options_default = {
.video = true,
.audio = true,
.require_audio = false,
.list_encoders = false,
.list_displays = false,
.kill_adb_on_close = false,
.camera_high_speed = false,
.list = 0,
};
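
The hunk above is from the file defining `scrcpy_options_default`: every default lives in this single const instance, which the `main_scrcpy()` setup shown earlier copies via `.opts = scrcpy_options_default` before applying CLI overrides. A minimal sketch of that pattern with a made-up struct:

```c
// The "const default instance" pattern: all defaults live in one struct,
// and each run starts from a copy of it. The struct here is made up.
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

struct options {
    uint32_t video_bit_rate;
    uint16_t max_fps;
    bool audio;
};

static const struct options options_default = {
    .video_bit_rate = 8000000, // 8M
    .max_fps = 0,              // 0 means "not limited"
    .audio = true,
};

int main(void) {
    struct options opts = options_default; // start from the defaults
    opts.max_fps = 60;                     // then apply CLI overrides
    printf("bit_rate=%u max_fps=%u audio=%d\n",
           (unsigned) opts.video_bit_rate, (unsigned) opts.max_fps,
           opts.audio);
    return 0;
}
```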

View File

@ -44,24 +44,11 @@ enum sc_codec {
SC_CODEC_RAW,
};
enum sc_video_source {
SC_VIDEO_SOURCE_DISPLAY,
SC_VIDEO_SOURCE_CAMERA,
};
enum sc_audio_source {
SC_AUDIO_SOURCE_AUTO, // OUTPUT for video DISPLAY, MIC for video CAMERA
SC_AUDIO_SOURCE_OUTPUT,
SC_AUDIO_SOURCE_MIC,
};
enum sc_camera_facing {
SC_CAMERA_FACING_ANY,
SC_CAMERA_FACING_FRONT,
SC_CAMERA_FACING_BACK,
SC_CAMERA_FACING_EXTERNAL,
};
enum sc_lock_video_orientation {
SC_LOCK_VIDEO_ORIENTATION_UNLOCKED = -1,
// lock the current orientation when scrcpy starts
@ -130,19 +117,13 @@ struct scrcpy_options {
const char *audio_codec_options;
const char *video_encoder;
const char *audio_encoder;
const char *camera_id;
const char *camera_size;
const char *camera_ar;
uint16_t camera_fps;
enum sc_log_level log_level;
enum sc_codec video_codec;
enum sc_codec audio_codec;
enum sc_video_source video_source;
enum sc_audio_source audio_source;
enum sc_record_format record_format;
enum sc_keyboard_input_mode keyboard_input_mode;
enum sc_mouse_input_mode mouse_input_mode;
enum sc_camera_facing camera_facing;
struct sc_port_range port_range;
uint32_t tunnel_host;
uint16_t tunnel_port;
@ -198,13 +179,9 @@ struct scrcpy_options {
bool video;
bool audio;
bool require_audio;
bool list_encoders;
bool list_displays;
bool kill_adb_on_close;
bool camera_high_speed;
#define SC_OPTION_LIST_ENCODERS 0x1
#define SC_OPTION_LIST_DISPLAYS 0x2
#define SC_OPTION_LIST_CAMERAS 0x4
#define SC_OPTION_LIST_CAMERA_SIZES 0x8
uint8_t list;
};
extern const struct scrcpy_options scrcpy_options_default;
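
One side of the hunk above replaces the separate `list_encoders`/`list_displays` booleans with a single `uint8_t list` bit field plus the `SC_OPTION_LIST_*` masks. A small self-contained sketch of that bitflag pattern, reusing the flag values shown above:

```c
// Sketch of the bitflag approach used for the --list-* options: each request
// is one bit in a single integer, so "is any list requested?" is a plain
// non-zero test. Flag values match the defines shown above.
#include <stdint.h>
#include <stdio.h>

#define SC_OPTION_LIST_ENCODERS     0x1
#define SC_OPTION_LIST_DISPLAYS     0x2
#define SC_OPTION_LIST_CAMERAS      0x4
#define SC_OPTION_LIST_CAMERA_SIZES 0x8

int main(void) {
    uint8_t list = 0;

    // e.g. the CLI parser saw --list-encoders and --list-cameras
    list |= SC_OPTION_LIST_ENCODERS;
    list |= SC_OPTION_LIST_CAMERAS;

    if (list) {
        // at least one --list-* option: the server only prints and exits
        printf("list mode requested (0x%x)\n", (unsigned) list);
    }
    if (list & SC_OPTION_LIST_CAMERAS) {
        printf("would pass list_cameras=true to the server\n");
    }
    return 0;
}
```

A single integer also leaves room for new list types without adding a struct field per option.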

View File

@ -90,7 +90,7 @@ push_event(uint32_t type, const char *name) {
#define PUSH_EVENT(TYPE) push_event(TYPE, # TYPE)
#ifdef _WIN32
static BOOL WINAPI windows_ctrl_handler(DWORD ctrl_type) {
BOOL WINAPI windows_ctrl_handler(DWORD ctrl_type) {
if (ctrl_type == CTRL_C_EVENT) {
PUSH_EVENT(SDL_QUIT);
return TRUE;
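
For context, a handler like `windows_ctrl_handler` is registered with `SetConsoleCtrlHandler()` so that Ctrl+C becomes an orderly quit rather than an abrupt process kill. A hedged, Windows-only sketch (printing a message instead of pushing an SDL_QUIT event as scrcpy does):

```c
// Windows-only sketch: register a console control handler so Ctrl+C can be
// turned into an orderly shutdown request instead of an abrupt exit.
#include <stdio.h>
#include <windows.h>

static BOOL WINAPI ctrl_handler(DWORD ctrl_type) {
    if (ctrl_type == CTRL_C_EVENT) {
        printf("Ctrl+C received, requesting shutdown\n");
        return TRUE; // handled: the default handler (process kill) is skipped
    }
    return FALSE; // let the next handler deal with other events
}

int main(void) {
    if (!SetConsoleCtrlHandler(ctrl_handler, TRUE)) {
        fprintf(stderr, "Could not set control handler\n");
        return 1;
    }
    printf("Press Ctrl+C (waiting 30 seconds)...\n");
    Sleep(30000);
    return 0;
}
```
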
@ -252,9 +252,7 @@ sc_audio_demuxer_on_ended(struct sc_demuxer *demuxer,
// Contrary to the video demuxer, keep mirroring if only the audio fails
// (unless --require-audio is set).
if (status == SC_DEMUXER_STATUS_EOS) {
PUSH_EVENT(SC_EVENT_DEVICE_DISCONNECTED);
} else if (status == SC_DEMUXER_STATUS_ERROR
if (status == SC_DEMUXER_STATUS_ERROR
|| (status == SC_DEMUXER_STATUS_DISABLED
&& options->require_audio)) {
PUSH_EVENT(SC_EVENT_DEMUXER_ERROR);
@ -297,7 +295,7 @@ sc_timeout_on_timeout(struct sc_timeout *timeout, void *userdata) {
// Generate a scrcpy id to differentiate multiple running scrcpy instances
static uint32_t
scrcpy_generate_scid(void) {
scrcpy_generate_scid() {
struct sc_rand rand;
sc_rand_init(&rand);
// Only use 31 bits to avoid issues with signed values on the Java-side
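
The comment above notes that only 31 bits are kept for the scrcpy instance id (scid) so the value never turns negative when read as a signed 32-bit int on the Java side. A hedged sketch of that masking, using the standard `rand()` instead of scrcpy's `sc_rand` helpers:

```c
// Sketch: generate a 31-bit instance id so it never becomes negative when
// parsed into a signed 32-bit int (Java has no unsigned int type).
// Uses rand()/srand() for illustration instead of scrcpy's sc_rand.
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static uint32_t
generate_scid(void) {
    uint32_t value = ((uint32_t) rand() << 16) ^ (uint32_t) rand();
    return value & 0x7FFFFFFF; // keep 31 bits: always >= 0 as a Java int
}

int main(void) {
    srand((unsigned) time(NULL));
    uint32_t scid = generate_scid();
    printf("scid=%08" PRIx32 "\n", scid); // same format as the scid=%08x param
    return 0;
}
```
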
@ -351,9 +349,7 @@ scrcpy(struct scrcpy_options *options) {
.log_level = options->log_level,
.video_codec = options->video_codec,
.audio_codec = options->audio_codec,
.video_source = options->video_source,
.audio_source = options->audio_source,
.camera_facing = options->camera_facing,
.crop = options->crop,
.port_range = options->port_range,
.tunnel_host = options->tunnel_host,
@ -373,10 +369,6 @@ scrcpy(struct scrcpy_options *options) {
.audio_codec_options = options->audio_codec_options,
.video_encoder = options->video_encoder,
.audio_encoder = options->audio_encoder,
.camera_id = options->camera_id,
.camera_size = options->camera_size,
.camera_ar = options->camera_ar,
.camera_fps = options->camera_fps,
.force_adb_forward = options->force_adb_forward,
.power_off_on_close = options->power_off_on_close,
.clipboard_autosync = options->clipboard_autosync,
@ -385,9 +377,9 @@ scrcpy(struct scrcpy_options *options) {
.tcpip_dst = options->tcpip_dst,
.cleanup = options->cleanup,
.power_on = options->power_on,
.list_encoders = options->list_encoders,
.list_displays = options->list_displays,
.kill_adb_on_close = options->kill_adb_on_close,
.camera_high_speed = options->camera_high_speed,
.list = options->list,
};
static const struct sc_server_callbacks cbs = {
@ -405,7 +397,7 @@ scrcpy(struct scrcpy_options *options) {
server_started = true;
if (options->list) {
if (options->list_encoders || options->list_displays) {
bool ok = await_for_server(NULL);
ret = ok ? SCRCPY_EXIT_SUCCESS : SCRCPY_EXIT_FAILURE;
goto end;
@ -456,7 +448,9 @@ scrcpy(struct scrcpy_options *options) {
struct sc_file_pusher *fp = NULL;
if (options->video_playback && options->control) {
// control implies video playback
assert(!options->control || options->video_playback);
if (options->control) {
if (!sc_file_pusher_init(&s->file_pusher, serial,
options->push_target)) {
goto end;
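
The assertion above encodes "control implies video playback" as `!control || video_playback`, the usual way to assert a logical implication in C. A minimal illustration:

```c
// Asserting a logical implication: "A implies B" is written as (!A || B).
// Here: if control is enabled, video playback must be enabled too.
#include <assert.h>
#include <stdbool.h>
#include <stdio.h>

static void check(bool control, bool video_playback) {
    // control implies video playback
    assert(!control || video_playback);
    printf("control=%d video_playback=%d: ok\n", control, video_playback);
}

int main(void) {
    check(false, false); // no control: nothing is implied
    check(false, true);
    check(true, true);   // control with playback: fine
    // check(true, false);  // would abort: control without playback
    return 0;
}
```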

View File

@ -488,7 +488,6 @@ sc_screen_show_initial_window(struct sc_screen *screen) {
}
SDL_ShowWindow(screen->window);
sc_screen_update_content_rect(screen);
}
void
@ -849,8 +848,6 @@ sc_screen_convert_drawable_to_frame_coords(struct sc_screen *screen,
int32_t w = screen->content_size.width;
int32_t h = screen->content_size.height;
// screen->rect must be initialized to avoid a division by zero
assert(screen->rect.w && screen->rect.h);
x = (int64_t) (x - screen->rect.x) * w / screen->rect.w;
y = (int64_t) (y - screen->rect.y) * h / screen->rect.h;

View File

@ -76,8 +76,6 @@ sc_server_params_destroy(struct sc_server_params *params) {
free((char *) params->video_encoder);
free((char *) params->audio_encoder);
free((char *) params->tcpip_dst);
free((char *) params->camera_id);
free((char *) params->camera_ar);
}
static bool
@ -88,15 +86,14 @@ sc_server_params_copy(struct sc_server_params *dst,
// The params reference user-allocated memory, so we must copy them to
// handle them from another thread
#define COPY(FIELD) do { \
#define COPY(FIELD) \
dst->FIELD = NULL; \
if (src->FIELD) { \
dst->FIELD = strdup(src->FIELD); \
if (!dst->FIELD) { \
goto error; \
} \
} \
} while(0)
}
COPY(req_serial);
COPY(crop);
@ -105,8 +102,6 @@ sc_server_params_copy(struct sc_server_params *dst,
COPY(video_encoder);
COPY(audio_encoder);
COPY(tcpip_dst);
COPY(camera_id);
COPY(camera_ar);
#undef COPY
return true;
@ -185,20 +180,6 @@ sc_server_get_codec_name(enum sc_codec codec) {
}
}
static const char *
sc_server_get_camera_facing_name(enum sc_camera_facing camera_facing) {
switch (camera_facing) {
case SC_CAMERA_FACING_FRONT:
return "front";
case SC_CAMERA_FACING_BACK:
return "back";
case SC_CAMERA_FACING_EXTERNAL:
return "external";
default:
return NULL;
}
}
static sc_pid
execute_server(struct sc_server *server,
const struct sc_server_params *params) {
@ -234,13 +215,13 @@ execute_server(struct sc_server *server,
cmd[count++] = SCRCPY_VERSION;
unsigned dyn_idx = count; // from there, the strings are allocated
#define ADD_PARAM(fmt, ...) do { \
#define ADD_PARAM(fmt, ...) { \
char *p; \
if (asprintf(&p, fmt, ## __VA_ARGS__) == -1) { \
goto end; \
} \
cmd[count++] = p; \
} while(0)
}
ADD_PARAM("scid=%08x", params->scid);
ADD_PARAM("log_level=%s", log_level_to_server_string(params->log_level));
@ -265,11 +246,8 @@ execute_server(struct sc_server *server,
ADD_PARAM("audio_codec=%s",
sc_server_get_codec_name(params->audio_codec));
}
if (params->video_source != SC_VIDEO_SOURCE_DISPLAY) {
assert(params->video_source == SC_VIDEO_SOURCE_CAMERA);
ADD_PARAM("video_source=camera");
}
if (params->audio_source == SC_AUDIO_SOURCE_MIC) {
if (params->audio_source != SC_AUDIO_SOURCE_OUTPUT) {
assert(params->audio_source == SC_AUDIO_SOURCE_MIC);
ADD_PARAM("audio_source=mic");
}
if (params->max_size) {
@ -295,25 +273,6 @@ execute_server(struct sc_server *server,
if (params->display_id) {
ADD_PARAM("display_id=%" PRIu32, params->display_id);
}
if (params->camera_id) {
ADD_PARAM("camera_id=%s", params->camera_id);
}
if (params->camera_size) {
ADD_PARAM("camera_size=%s", params->camera_size);
}
if (params->camera_facing != SC_CAMERA_FACING_ANY) {
ADD_PARAM("camera_facing=%s",
sc_server_get_camera_facing_name(params->camera_facing));
}
if (params->camera_ar) {
ADD_PARAM("camera_ar=%s", params->camera_ar);
}
if (params->camera_fps) {
ADD_PARAM("camera_fps=%" PRIu16, params->camera_fps);
}
if (params->camera_high_speed) {
ADD_PARAM("camera_high_speed=true");
}
if (params->show_touches) {
ADD_PARAM("show_touches=true");
}
@ -351,18 +310,12 @@ execute_server(struct sc_server *server,
// By default, power_on is true
ADD_PARAM("power_on=false");
}
if (params->list & SC_OPTION_LIST_ENCODERS) {
if (params->list_encoders) {
ADD_PARAM("list_encoders=true");
}
if (params->list & SC_OPTION_LIST_DISPLAYS) {
if (params->list_displays) {
ADD_PARAM("list_displays=true");
}
if (params->list & SC_OPTION_LIST_CAMERAS) {
ADD_PARAM("list_cameras=true");
}
if (params->list & SC_OPTION_LIST_CAMERA_SIZES) {
ADD_PARAM("list_camera_sizes=true");
}
#undef ADD_PARAM
@ -580,8 +533,8 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
if (audio_socket == SC_SOCKET_NONE) {
goto fail;
}
bool ok = net_connect_intr(&server->intr, audio_socket,
tunnel_host, tunnel_port);
bool ok = net_connect_intr(&server->intr, audio_socket, tunnel_host,
tunnel_port);
if (!ok) {
goto fail;
}
@ -942,7 +895,7 @@ run_server(void *data) {
// If --list-* is passed, then the server just prints the requested data
// then exits.
if (params->list) {
if (params->list_encoders || params->list_displays) {
sc_pid pid = execute_server(server, params);
if (pid == SC_PROCESS_NONE) {
goto error_connection_failed;

View File

@ -26,18 +26,12 @@ struct sc_server_params {
enum sc_log_level log_level;
enum sc_codec video_codec;
enum sc_codec audio_codec;
enum sc_video_source video_source;
enum sc_audio_source audio_source;
enum sc_camera_facing camera_facing;
const char *crop;
const char *video_codec_options;
const char *audio_codec_options;
const char *video_encoder;
const char *audio_encoder;
const char *camera_id;
const char *camera_size;
const char *camera_ar;
uint16_t camera_fps;
struct sc_port_range port_range;
uint32_t tunnel_host;
uint16_t tunnel_port;
@ -62,9 +56,9 @@ struct sc_server_params {
bool select_tcpip;
bool cleanup;
bool power_on;
bool list_encoders;
bool list_displays;
bool kill_adb_on_close;
bool camera_high_speed;
uint8_t list;
};
struct sc_server {

View File

@ -27,8 +27,7 @@
// keyboard support, though OS could support more keys via modifying the report
// desc. 6 should be enough for scrcpy.
#define HID_KEYBOARD_MAX_KEYS 6
#define HID_KEYBOARD_EVENT_SIZE \
(HID_KEYBOARD_INDEX_KEYS + HID_KEYBOARD_MAX_KEYS)
#define HID_KEYBOARD_EVENT_SIZE (2 + HID_KEYBOARD_MAX_KEYS)
#define HID_RESERVED 0x00
#define HID_ERROR_ROLL_OVER 0x01
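
Both sides of this hunk size the same boot-protocol keyboard input report: one modifier byte, one reserved byte, then `HID_KEYBOARD_MAX_KEYS` (6) key codes, i.e. 8 bytes, so `HID_KEYBOARD_INDEX_KEYS` is presumably 2. A sketch of that layout (field names are illustrative):

```c
// Layout of a boot-protocol HID keyboard input report, which is what the
// constants above size: 1 modifier byte + 1 reserved byte + 6 key codes.
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

#define HID_KEYBOARD_MAX_KEYS 6

struct hid_keyboard_event {
    uint8_t modifiers;                   // bitmask: ctrl/shift/alt/gui, L/R
    uint8_t reserved;                    // always 0
    uint8_t keys[HID_KEYBOARD_MAX_KEYS]; // up to 6 concurrently pressed keys
};

int main(void) {
    // The event size is the offset of the key array (2) plus the key count,
    // i.e. 8 bytes, regardless of which way the macro is written.
    assert(sizeof(struct hid_keyboard_event) == 2 + HID_KEYBOARD_MAX_KEYS);
    printf("event size: %zu bytes\n", sizeof(struct hid_keyboard_event));
    return 0;
}
```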

View File

@ -105,6 +105,10 @@ scrcpy_otg(struct scrcpy_options *options) {
usb_device_initialized = true;
LOGI("USB device: %s (%04x:%04x) %s %s", usb_device.serial,
(unsigned) usb_device.vid, (unsigned) usb_device.pid,
usb_device.manufacturer, usb_device.product);
ok = sc_usb_connect(&s->usb, usb_device.device, &cbs, NULL);
if (!ok) {
goto end;

View File

@ -93,7 +93,7 @@ sc_usb_device_move(struct sc_usb_device *dst, struct sc_usb_device *src) {
src->product = NULL;
}
static void
void
sc_usb_devices_destroy(struct sc_vec_usb_devices *usb_devices) {
for (size_t i = 0; i < usb_devices->size; ++i) {
sc_usb_device_destroy(&usb_devices->data[i]);
@ -213,8 +213,8 @@ sc_usb_select_device(struct sc_usb *usb, const char *serial,
assert(sel_count == 1); // sel_idx is valid only if sel_count == 1
struct sc_usb_device *device = &vec.data[sel_idx];
LOGI("USB device found:");
sc_usb_devices_log(SC_LOG_LEVEL_INFO, vec.data, vec.size);
LOGD("USB device found:");
sc_usb_devices_log(SC_LOG_LEVEL_DEBUG, vec.data, vec.size);
// Move device into out_device (do not destroy device)
sc_usb_device_move(out_device, device);

View File

@ -147,7 +147,7 @@ sc_sdl_log_print(void *userdata, int category, SDL_LogPriority priority,
}
void
sc_log_configure(void) {
sc_log_configure() {
SDL_LogSetOutputFunction(sc_sdl_log_print, NULL);
// Redirect FFmpeg logs to SDL logs
av_log_set_callback(sc_av_log_callback);

View File

@ -36,6 +36,6 @@ sc_log_windows_error(const char *prefix, int error);
#endif
void
sc_log_configure(void);
sc_log_configure();
#endif
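
Several hunks in this comparison differ only by a `(void)` in a prototype or a `static` on an internal function, as with `sc_log_configure()` here. In C (before C23), `f()` declares a function with unspecified parameters while `f(void)` declares one taking none, so only the latter lets the compiler reject stray arguments; `static` additionally keeps a helper out of the global symbol namespace. A short illustration with made-up function names:

```c
// Difference between f() and f(void) prototypes in C (pre-C23), and why
// internal helpers are also marked static. Function names are made up.
#include <stdio.h>

// Unspecified parameters: the compiler accepts configure_old(42) silently.
void configure_old();

// No parameters: configure_new(42) is a compile-time error.
void configure_new(void);

// static: internal linkage, the symbol is not visible to other .c files,
// so it cannot clash with a same-named function elsewhere.
static void helper(void) {
    puts("helper");
}

void configure_old() { puts("configure_old"); }
void configure_new(void) { puts("configure_new"); }

int main(void) {
    configure_old();
    // configure_old(42);  // accepted by the old-style (), which is the problem
    configure_new();
    // configure_new(42);  // error: too many arguments
    helper();
    return 0;
}
```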

View File

@ -190,10 +190,10 @@ sc_vecdeque_reallocdata_(void *ptr, size_t newcap, size_t item_size,
size_t right_len = MIN(size, oldcap - oldorigin);
assert(right_len);
memcpy(newptr, (char *) ptr + (oldorigin * item_size), right_len * item_size);
memcpy(newptr, ptr + (oldorigin * item_size), right_len * item_size);
if (size > right_len) {
memcpy((char *) newptr + (right_len * item_size), ptr,
memcpy(newptr + (right_len * item_size), ptr,
(size - right_len) * item_size);
}
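
The only difference in this hunk is the `(char *)` casts: `ptr` and `newptr` are `void *`, and arithmetic on `void *` is a GCC/Clang extension rather than standard C, so the casts express the byte offsets portably. A tiny standalone example of the same byte-offset pattern:

```c
// Byte-offset arithmetic on a void * buffer: cast to char * first so the
// code is standard C (void * arithmetic is only a compiler extension).
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    size_t item_size = sizeof(int);
    size_t count = 4;
    void *buf = malloc(count * item_size);
    if (!buf) {
        return 1;
    }

    int values[] = {10, 20, 30, 40};
    // Copy values[2..3] to the start of buf: the offset is 2 items in bytes.
    memcpy(buf, (char *) values + 2 * item_size, 2 * item_size);

    int first;
    memcpy(&first, buf, sizeof(first));
    printf("%d\n", first); // 30

    free(buf);
    return 0;
}
```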

View File

@ -5,7 +5,7 @@
#include "util/bytebuf.h"
static void test_bytebuf_simple(void) {
void test_bytebuf_simple(void) {
struct sc_bytebuf buf;
uint8_t data[20];
@ -34,7 +34,7 @@ static void test_bytebuf_simple(void) {
sc_bytebuf_destroy(&buf);
}
static void test_bytebuf_boundaries(void) {
void test_bytebuf_boundaries(void) {
struct sc_bytebuf buf;
uint8_t data[20];
@ -71,7 +71,7 @@ static void test_bytebuf_boundaries(void) {
sc_bytebuf_destroy(&buf);
}
static void test_bytebuf_two_steps_write(void) {
void test_bytebuf_two_steps_write(void) {
struct sc_bytebuf buf;
uint8_t data[20];

View File

@ -269,25 +269,21 @@ static void test_parse_integer_with_suffix(void) {
char buf[32];
int r = snprintf(buf, sizeof(buf), "%ldk", LONG_MAX / 2000);
assert(r >= 0 && (size_t) r < sizeof(buf));
sprintf(buf, "%ldk", LONG_MAX / 2000);
ok = sc_str_parse_integer_with_suffix(buf, &value);
assert(ok);
assert(value == LONG_MAX / 2000 * 1000);
r = snprintf(buf, sizeof(buf), "%ldm", LONG_MAX / 2000);
assert(r >= 0 && (size_t) r < sizeof(buf));
sprintf(buf, "%ldm", LONG_MAX / 2000);
ok = sc_str_parse_integer_with_suffix(buf, &value);
assert(!ok);
r = snprintf(buf, sizeof(buf), "%ldk", LONG_MIN / 2000);
assert(r >= 0 && (size_t) r < sizeof(buf));
sprintf(buf, "%ldk", LONG_MIN / 2000);
ok = sc_str_parse_integer_with_suffix(buf, &value);
assert(ok);
assert(value == LONG_MIN / 2000 * 1000);
r = snprintf(buf, sizeof(buf), "%ldm", LONG_MIN / 2000);
assert(r >= 0 && (size_t) r < sizeof(buf));
sprintf(buf, "%ldm", LONG_MIN / 2000);
ok = sc_str_parse_integer_with_suffix(buf, &value);
assert(!ok);
}
@ -362,7 +358,7 @@ static void test_index_of_column(void) {
assert(sc_str_index_of_column(" a bc d", 1, " ") == 2);
}
static void test_remove_trailing_cr(void) {
static void test_remove_trailing_cr() {
char s[] = "abc\r";
sc_str_remove_trailing_cr(s, sizeof(s) - 1);
assert(!strcmp(s, "abc"));

View File

@ -102,7 +102,7 @@ static void test_vecdeque_reserve(void) {
sc_vecdeque_destroy(&vdq);
}
static void test_vecdeque_grow(void) {
static void test_vecdeque_grow() {
struct SC_VECDEQUE(int) vdq = SC_VECDEQUE_INITIALIZER;
bool ok = sc_vecdeque_reserve(&vdq, 20);
@ -142,7 +142,7 @@ static void test_vecdeque_grow(void) {
sc_vecdeque_destroy(&vdq);
}
static void test_vecdeque_push_hole(void) {
static void test_vecdeque_push_hole() {
struct SC_VECDEQUE(int) vdq = SC_VECDEQUE_INITIALIZER;
bool ok = sc_vecdeque_reserve(&vdq, 20);

View File

@ -187,7 +187,7 @@ static void test_vector_index_of(void) {
sc_vector_destroy(&vec);
}
static void test_vector_grow(void) {
static void test_vector_grow() {
struct SC_VECTOR(int) vec = SC_VECTOR_INITIALIZER;
bool ok;

View File

@ -17,5 +17,5 @@ endian = 'little'
[properties]
prebuilt_ffmpeg = 'ffmpeg-6.0-scrcpy-4/win32'
prebuilt_sdl2 = 'SDL2-2.28.4/i686-w64-mingw32'
prebuilt_sdl2 = 'SDL2-2.26.4/i686-w64-mingw32'
prebuilt_libusb = 'libusb-1.0.26/libusb-MinGW-Win32'

View File

@ -17,5 +17,5 @@ endian = 'little'
[properties]
prebuilt_ffmpeg = 'ffmpeg-6.0-scrcpy-4/win64'
prebuilt_sdl2 = 'SDL2-2.28.4/x86_64-w64-mingw32'
prebuilt_sdl2 = 'SDL2-2.26.4/x86_64-w64-mingw32'
prebuilt_libusb = 'libusb-1.0.26/libusb-MinGW-x64'

View File

@ -71,20 +71,6 @@ scrcpy --audio-codec=aac
scrcpy --audio-codec=raw
```
In particular, if you get the following error:
> Failed to initialize audio/opus, error 0xfffffffe
then your device has no Opus encoder: try `scrcpy --audio-codec=aac`.
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--audio-codec-options` in the manpage or in `scrcpy --help`.
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat
## Encoder
Several encoders may be available on the device. They can be listed by:
```bash
@ -93,14 +79,19 @@ scrcpy --list-encoders
To select a specific encoder:
```bash
```
scrcpy --audio-codec=opus --audio-encoder='c2.android.opus.encoder'
```
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--audio-codec-options` in the manpage or in `scrcpy --help`.
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat
## Bit rate
The default audio bit rate is 128Kbps. To change it:
The default video bit-rate is 128Kbps. To change it:
```bash
scrcpy --audio-bit-rate=64K

View File

@ -77,7 +77,7 @@ pip3 install meson
sudo dnf install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm
# client build dependencies
sudo dnf install SDL2-devel ffms2-devel libusb1-devel meson gcc make
sudo dnf install SDL2-devel ffms2-devel libusb-devel meson gcc make
# server build dependencies
sudo dnf install java-devel
@ -233,10 +233,10 @@ install` must be run as root)._
#### Option 2: Use prebuilt server
- [`scrcpy-server-v2.1.1`][direct-scrcpy-server]
<sub>SHA-256: `9558db6c56743a1dc03b38f59801fb40e91cc891f8fc0c89e5b0b067761f148e`</sub>
- [`scrcpy-server-v2.0`][direct-scrcpy-server]
<sub>SHA-256: `9e241615f578cd690bb43311000debdecf6a9c50a7082b001952f18f6f21ddc2`</sub>
[direct-scrcpy-server]: https://github.com/Genymobile/scrcpy/releases/download/v2.1.1/scrcpy-server-v2.1.1
[direct-scrcpy-server]: https://github.com/Genymobile/scrcpy/releases/download/v2.0/scrcpy-server-v2.0
Download the prebuilt server somewhere, and specify its path during the Meson
configuration:

View File

@ -1,150 +0,0 @@
# Camera
Camera mirroring is supported for devices with Android 12 or higher.
To capture the camera instead of the device screen:
```
scrcpy --video-source=camera
```
By default, it automatically switches [audio source](audio.md#source) to
microphone (as if `--audio-source=mic` were also passed).
```bash
scrcpy --video-source=display # default is --audio-source=output
scrcpy --video-source=camera # default is --audio-source=mic
scrcpy --video-source=display --audio-source=mic # force display AND microphone
scrcpy --video-source=camera --audio-source=output # force camera AND device audio output
```
## List
To list the cameras available (with their declared valid sizes and frame rates):
```
scrcpy --list-cameras
scrcpy --list-camera-sizes
```
_Note that the sizes and frame rates are only declarations from the device. They
are not accurate on all devices: some of them are declared but not supported,
while some others are supported but not declared._
## Selection
It is possible to pass an explicit camera id (as listed by `--list-cameras`):
```
scrcpy --video-source=camera --camera-id=0
```
Alternatively, the camera may be selected automatically:
```bash
scrcpy --video-source=camera # use the first camera
scrcpy --video-source=camera --camera-facing=front # use the first front camera
scrcpy --video-source=camera --camera-facing=back # use the first back camera
scrcpy --video-source=camera --camera-facing=external # use the first external camera
```
If `--camera-id` is specified, then `--camera-facing` is forbidden (the id
already determines the camera):
```bash
scrcpy --video-source=camera --camera-id=0 --camera-facing=front # error
```
### Size selection
It is possible to pass an explicit camera size:
```
scrcpy --video-source=camera --camera-size=1920x1080
```
The given size may be listed among the declared valid sizes
(`--list-camera-sizes`), but may also be anything else (some devices support
arbitrary sizes):
```
scrcpy --video-source=camera --camera-size=1840x444
```
Alternatively, a declared valid size (among the ones listed by
`--list-camera-sizes`) may be selected automatically.
Two constraints are supported:
- `-m`/`--max-size` (already used for display mirroring), for example `-m1920`;
- `--camera-ar` to specify an aspect ratio (`<num>:<den>`, `<value>` or
`sensor`).
Some examples:
```bash
scrcpy --video-source=camera # use the greatest width and the greatest associated height
scrcpy --video-source=camera -m1920 # use the greatest width not above 1920 and the greatest associated height
scrcpy --video-source=camera --camera-ar=4:3 # use the greatest size with an aspect ratio of 4:3 (+/- 10%)
scrcpy --video-source=camera --camera-ar=1.6 # use the greatest size with an aspect ratio of 1.6 (+/- 10%)
scrcpy --video-source=camera --camera-ar=sensor # use the greatest size with the aspect ratio of the camera sensor (+/- 10%)
scrcpy --video-source=camera -m1920 --camera-ar=16:9 # use the greatest width not above 1920 and the closest to 16:9 aspect ratio
```
If `--camera-size` is specified, then `-m`/`--max-size` and `--camera-ar` are
forbidden (the size is determined by the value given explicitly):
```bash
scrcpy --video-source=camera --camera-size=1920x1080 -m3000 # error
```
## Frame rate
By default, camera is captured at Android's default frame rate (30 fps).
To configure a different frame rate:
```
scrcpy --video-source=camera --camera-fps=60
```
## High speed capture
The Android camera API also supports a [high speed capture mode][high speed].
This mode is restricted to specific resolutions and frame rates, listed by
`--list-camera-sizes`.
```
scrcpy --video-source=camera --camera-size=1920x1080 --camera-fps=240
```
[high speed]: https://developer.android.com/reference/android/hardware/camera2/CameraConstrainedHighSpeedCaptureSession
## Brace expansion tip
All camera options start with `--camera-`, so if your shell supports it, you can
benefit from [brace expansion] (for example, it is supported by _bash_ and _zsh_):
```bash
scrcpy --video-source=camera --camera-{facing=back,ar=16:9,high-speed,fps=120}
```
This will be expanded as:
```bash
scrcpy --video-source=camera --camera-facing=back --camera-ar=16:9 --camera-high-speed --camera-fps=120
```
[brace expansion]: https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html
## Webcam
Combined with the [V4L2](v4l2.md) feature on Linux, the Android device camera
may be used as a webcam on the computer.

View File

@ -1,125 +0,0 @@
# Connection
## Selection
If exactly one device is connected (i.e. listed by `adb devices`), then it is
automatically selected.
However, if there are multiple devices connected, you must specify the one to
use in one of 4 ways:
- by its serial:
```bash
scrcpy --serial=0123456789abcdef
scrcpy -s 0123456789abcdef # short version
# the serial is the ip:port if connected over TCP/IP (same behavior as adb)
scrcpy --serial=192.168.1.1:5555
```
- the one connected over USB (if there is exactly one):
```bash
scrcpy --select-usb
scrcpy -d # short version
```
- the one connected over TCP/IP (if there is exactly one):
```bash
scrcpy --select-tcpip
scrcpy -e # short version
```
- a device already listening on TCP/IP (see [below](#tcpip-wireless)):
```bash
scrcpy --tcpip=192.168.1.1:5555
scrcpy --tcpip=192.168.1.1 # default port is 5555
```
The serial may also be provided via the environment variable `ANDROID_SERIAL`
(also used by `adb`):
```bash
# in bash
export ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```cmd
:: in cmd
set ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```powershell
# in PowerShell
$env:ANDROID_SERIAL = '0123456789abcdef'
scrcpy
```
## TCP/IP (wireless)
_Scrcpy_ uses `adb` to communicate with the device, and `adb` can [connect] to a
device over TCP/IP. The device must be connected on the same network as the
computer.
[connect]: https://developer.android.com/studio/command-line/adb.html#wireless
### Automatic
An option `--tcpip` allows to configure the connection automatically. There are
two variants.
If the device (accessible at 192.168.1.1 in this example) already listens on a
port (typically 5555) for incoming _adb_ connections, then run:
```bash
scrcpy --tcpip=192.168.1.1 # default port is 5555
scrcpy --tcpip=192.168.1.1:5555
```
If _adb_ TCP/IP mode is disabled on the device (or if you don't know the IP
address), connect the device over USB, then run:
```bash
scrcpy --tcpip # without arguments
```
It will automatically find the device IP address and adb port, enable TCP/IP
mode if necessary, then connect to the device before starting.
### Manual
Alternatively, it is possible to enable the TCP/IP connection manually using
`adb`:
1. Plug the device into a USB port on your computer.
2. Connect the device to the same Wi-Fi network as your computer.
3. Get your device IP address, in Settings → About phone → Status, or by
executing this command:
```bash
adb shell ip route | awk '{print $9}'
```
4. Enable `adb` over TCP/IP on your device: `adb tcpip 5555`.
5. Unplug your device.
6. Connect to your device: `adb connect DEVICE_IP:5555` _(replace `DEVICE_IP`
with the device IP address you found)_.
7. Run `scrcpy` as usual.
8. Run `adb disconnect` once you're done.
Since Android 11, a [wireless debugging option][adb-wireless] allows to bypass
having to physically connect your device directly to your computer.
[adb-wireless]: https://developer.android.com/studio/command-line/adb#wireless-android11-command-line
## Autostart
A small tool (by the scrcpy author) allows to run arbitrary commands whenever a
new Android device is connected: [AutoAdb]. It can be used to start scrcpy:
```bash
autoadb scrcpy -s '{}'
```
[AutoAdb]: https://github.com/rom1v/autoadb

View File

@ -9,52 +9,16 @@ This application is composed of two parts:
The client is responsible to push the server to the device and start its
execution.
The client and the server establish communication using separate sockets for
video, audio and controls. Any of them may be disabled (but not all), so
there are 1, 2 or 3 socket(s).
Once the client and the server are connected to each other, the server initially
sends device information (name and initial screen dimensions), then starts to
send a raw H.264 video stream of the device screen. The client decodes the video
frames, and displays them as soon as possible, without buffering, to minimize
latency. The client is not aware of the device rotation (which is handled by the
server); it just knows the dimensions of the video frames.
The server initially sends the device name on the first socket (it is used for
the scrcpy window title), then each socket is used for its own purpose. All
reads and writes are performed from a dedicated thread for each socket, both on
the client and on the server.
The client captures relevant keyboard and mouse events, that it transmits to the
server, which injects them to the device.
If video is enabled, then the server sends a raw video stream (H.264 by default)
of the device screen, with some additional headers for each packet. The client
decodes the video frames, and displays them as soon as possible, without
buffering (unless `--display-buffer=delay` is specified) to minimize latency.
The client is not aware of the device rotation (which is handled by the server),
it just knows the dimensions of the video frames it receives.
Similarly, if audio is enabled, then the server sends a raw audio stream (OPUS
by default) of the device audio output (or the microphone if
`--audio-source=mic` is specified), with some additional headers for each
packet. The client decodes the stream and attempts to keep latency minimal by
maintaining an average buffering level. The [blog post][scrcpy2] of the scrcpy v2.0
release gives more details about the audio feature.
If control is enabled, then the client captures relevant keyboard and mouse
events, which it transmits to the server; the server injects them into the device.
This is the only socket used in both directions: input events are sent from
the client to the device, and when the device clipboard changes, the new content
is sent from the device to the client to support seamless copy-paste.
[scrcpy2]: https://blog.rom1v.com/2023/03/scrcpy-2-0-with-audio/
Note that the client-server roles are expressed at the application level:
- the server _serves_ video and audio streams, and handles requests from the
client,
- the client _controls_ the device through the server.
However, by default (when `--force-adb-forward` is not set), the roles are
reversed at the network level:
- the client opens a server socket and listens on a port before starting the
server,
- the server connects to the client.
This role inversion guarantees that the connection will not fail due to race
conditions, without polling.
## Server
@ -68,14 +32,15 @@ The server is a Java application (with a [`public static void main(String...
args)`][main] method), compiled against the Android framework, and executed as
`shell` on the Android device.
[main]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Server.java#L193
[main]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/Server.java#L123
To run such a Java application, the classes must be [_dexed_][dex] (typically,
to `classes.dex`). If `my.package.MainClass` is the main class, compiled to
`classes.dex`, pushed to the device in `/data/local/tmp`, then it can be run
with:
adb shell CLASSPATH=/data/local/tmp/classes.dex app_process / my.package.MainClass
adb shell CLASSPATH=/data/local/tmp/classes.dex \
app_process / my.package.MainClass
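
As an illustration, such an entry point is nothing more than a plain class with a
static `main()` method (a hypothetical placeholder class, not part of scrcpy; the
real one is `Server.java`, linked above):

```java
// Hypothetical placeholder class (not scrcpy code), dexed to classes.dex and
// run by app_process as shown above. Android framework classes are available
// because app_process starts a regular Android runtime.
public final class MainClass {
    public static void main(String... args) {
        android.util.Log.i("demo", "running as shell user, " + args.length + " argument(s)");
    }
}
```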
_The path `/data/local/tmp` is a good candidate to push the server, since it's
readable and writable by `shell`, but not world-writable, so a malicious
@ -84,7 +49,7 @@ application may not replace the server just before the client executes it._
Instead of a raw _dex_ file, `app_process` accepts a _jar_ containing
`classes.dex` (e.g. an [APK]). For simplicity, and to benefit from the gradle
build system, the server is built to an (unsigned) APK (renamed to
`scrcpy-server.jar`).
`scrcpy-server`).
[dex]: https://en.wikipedia.org/wiki/Dalvik_(software)
[apk]: https://en.wikipedia.org/wiki/Android_application_package
@ -100,77 +65,42 @@ They can be called using reflection though. The communication with hidden
components is provided by [_wrappers_ classes][wrappers] and [aidl].
[hidden]: https://stackoverflow.com/a/31908373/1987178
[wrappers]: https://github.com/Genymobile/scrcpy/tree/master/server/src/main/java/com/genymobile/scrcpy/wrappers
[aidl]: https://github.com/Genymobile/scrcpy/tree/master/server/src/main/aidl
[wrappers]: https://github.com/Genymobile/scrcpy/tree/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/wrappers
[aidl]: https://github.com/Genymobile/scrcpy/tree/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/aidl/android/view
### Threading
### Execution
The server uses 3 threads:
The server is started by the client basically by executing the following
commands:
- the **main** thread, encoding and streaming the video to the client;
- the **controller** thread, listening for _control messages_ (typically,
keyboard and mouse events) from the client;
- the **receiver** thread (managed by the controller), sending _device messages_
to the client (currently, it is only used to send the device clipboard
content).
```bash
adb push scrcpy-server /data/local/tmp/scrcpy-server.jar
adb forward tcp:27183 localabstract:scrcpy
adb shell CLASSPATH=/data/local/tmp/scrcpy-server.jar app_process / com.genymobile.scrcpy.Server 2.1
```
The first argument (`2.1` in the example) is the client scrcpy version. The
server fails if the client and the server do not have the exact same version.
The protocol between the client and the server may change from version to
version (see [protocol](#protocol) below), and there is no backward or forward
compatibility (there is no point in using different client and server versions).
This check makes it possible to detect misconfiguration (running an older or newer
server by mistake).
It is followed by any number of arguments, in the form of `key=value` pairs.
Their order is irrelevant. The possible keys and associated value types can be
found in the [server][server-options] and [client][client-options] code.
[server-options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Options.java#L181
[client-options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/server.c#L226
For example, if we execute `scrcpy -m1920 --no-audio`, then the server
execution will look like this:
```bash
# scid is a random number to identify different clients running on the same device
adb shell CLASSPATH=/data/local/tmp/scrcpy-server.jar app_process / com.genymobile.scrcpy.Server 2.1 scid=12345678 log_level=info audio=false max_size=1920
```
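
For illustration, parsing such arguments could look like the following minimal
sketch (names are illustrative; the real parsing lives in `Options.java`, linked
above):

```java
// Minimal sketch (illustrative, not the actual Options.java): the first
// argument is the client version, the rest are "key=value" pairs in any order.
import java.util.HashMap;
import java.util.Map;

final class ArgsSketch {
    static Map<String, String> parse(String... args) {
        Map<String, String> options = new HashMap<>();
        for (int i = 1; i < args.length; i++) {
            int eq = args[i].indexOf('=');
            if (eq == -1) {
                throw new IllegalArgumentException("Invalid argument: " + args[i]);
            }
            options.put(args[i].substring(0, eq), args[i].substring(eq + 1));
        }
        return options;
    }
}
```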
### Components
When executed, its [`main()`][main] method is executed (on the "main" thread).
It parses the arguments, establishes the connection with the client and starts
the other "components":
- the **video** streamer: it captures the video screen and sends encoded video
packets on the _video_ socket (from the _video_ thread).
- the **audio** streamer: it uses several threads to capture raw packets,
submit them for encoding and retrieve the encoded packets, which it sends on the
_audio_ socket.
- the **controller**: it receives _control messages_ (typically input events)
on the _control_ socket from one thread, and sends _device messages_ (e.g. to
transmit the device clipboard content to the client) on the same _control
socket_ from another thread. Thus, the _control_ socket is used in both
directions (contrary to the _video_ and _audio_ sockets).
Since the video encoding is typically hardware, there would be no benefit in
encoding and streaming in two different threads.
### Screen video encoding
The encoding is managed by [`ScreenEncoder`].
The video is encoded using the [`MediaCodec`] API. The codec encodes the content
of a `Surface` associated to the display, and writes the encoding packets to the
client (on the _video_ socket).
The video is encoded using the [`MediaCodec`] API. The codec takes its input
from a [surface] associated to the display, and writes the resulting H.264
stream to the provided output stream (the socket connected to the client).
[`ScreenEncoder`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java
[`ScreenEncoder`]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java
[`MediaCodec`]: https://developer.android.com/reference/android/media/MediaCodec.html
[surface]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L68-L69
On device rotation (or folding), the encoding session is [reset] and restarted.
On device [rotation], the codec, surface and display are reinitialized, and a
new video stream is produced.
New frames are produced only when changes occur on the surface. This avoids
sending unnecessary frames, but by default there might be drawbacks:
New frames are produced only when changes occur on the surface. This is good
because it avoids sending unnecessary frames, but there are drawbacks:
- it does not send any frame on start if the device screen does not change,
- after fast motion changes, the last frame may have poor quality.
@ -178,24 +108,11 @@ send unnecessary frames, but by default there might be drawbacks:
Both problems are [solved][repeat] by the flag
[`KEY_REPEAT_PREVIOUS_FRAME_AFTER`][repeat-flag].
[reset]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L179
[rotation]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L90
[repeat]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L246-L247
[repeat]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L147-L148
[repeat-flag]: https://developer.android.com/reference/android/media/MediaFormat.html#KEY_REPEAT_PREVIOUS_FRAME_AFTER
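
As a simplified sketch (values and structure are illustrative, not the actual
`ScreenEncoder` code), configuring such a surface-input encoder with the repeat
flag looks like this:

```java
// Simplified sketch of a surface-input video encoder (illustrative values;
// see ScreenEncoder.java for the real configuration).
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

final class EncoderSketch {
    static MediaCodec createVideoEncoder(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
        // Repeat the previous frame if nothing new is drawn within 100 ms, to
        // work around the "no frame on static content" drawbacks described above.
        format.setLong(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 100_000); // microseconds

        MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = codec.createInputSurface(); // content drawn here gets encoded
        // The display would then be mirrored onto inputSurface, and codec.start() called.
        return codec;
    }
}
```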
### Audio encoding
Similarly, the audio is [captured] using an [`AudioRecord`], and [encoded] using
the [`MediaCodec`] asynchronous API.
More details are available on the [blog post][scrcpy2] introducing the audio feature.
[captured]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/AudioCapture.java
[encoded]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/AudioEncoder.java
[`AudioRecord`]: https://developer.android.com/reference/android/media/AudioRecord
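
A simplified sketch of this capture/encoding pair (illustrative only; the
microphone source is used here for brevity, whereas scrcpy captures the device
audio output, which requires additional setup):

```java
// Simplified sketch of audio capture + asynchronous encoding (the real
// implementations are AudioCapture.java and AudioEncoder.java, linked above).
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaRecorder;
import java.io.IOException;

final class AudioSketch {
    static AudioRecord createRecorder() {
        int sampleRate = 48000;
        int channelConfig = AudioFormat.CHANNEL_IN_STEREO;
        int encoding = AudioFormat.ENCODING_PCM_16BIT;
        int minBuf = AudioRecord.getMinBufferSize(sampleRate, channelConfig, encoding);
        // MIC source for the sketch only; scrcpy records the device output.
        return new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, encoding, 4 * minBuf);
    }

    static MediaCodec createOpusEncoder() throws IOException {
        MediaFormat format = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_OPUS, 48000, 2);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 128_000);
        MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_OPUS);
        // Asynchronous API: callbacks provide input buffers to fill with PCM read
        // from the AudioRecord, and output buffers containing encoded packets.
        codec.setCallback(new MediaCodec.Callback() {
            @Override public void onInputBufferAvailable(MediaCodec c, int index) { /* read PCM, queueInputBuffer() */ }
            @Override public void onOutputBufferAvailable(MediaCodec c, int index, MediaCodec.BufferInfo info) { /* send packet */ }
            @Override public void onError(MediaCodec c, MediaCodec.CodecException e) { }
            @Override public void onOutputFormatChanged(MediaCodec c, MediaFormat f) { }
        });
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        return codec;
    }
}
```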
### Input events injection
_Control messages_ are received from the client by the [`Controller`] (run in a
@ -207,13 +124,13 @@ separate thread). There are several types of input events:
- other commands (e.g. to switch the screen on or to copy the clipboard).
Some of them need to inject input events to the system. To do so, they use the
_hidden_ method [`InputManager.injectInputEvent()`] (exposed by the
_hidden_ method [`InputManager.injectInputEvent`] (exposed by our
[`InputManager` wrapper][inject-wrapper]).
[`Controller`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Controller.java
[`Controller`]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/Controller.java#L81
[`KeyEvent`]: https://developer.android.com/reference/android/view/KeyEvent.html
[`MotionEvent`]: https://developer.android.com/reference/android/view/MotionEvent.html
[`InputManager.injectInputEvent()`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L34
[`InputManager.injectInputEvent`]: https://android.googlesource.com/platform/frameworks/base/+/oreo-release/core/java/android/hardware/input/InputManager.java#857
[inject-wrapper]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L27
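
Roughly, the wrapper boils down to a reflective call like the following sketch
(simplified; the `InputManager` instance is obtained from `ServiceManager`, and
the mode constant mirrors the hidden `INJECT_INPUT_EVENT_MODE_ASYNC`):

```java
// Rough sketch of injecting an input event through the hidden API via
// reflection (simplified; see the InputManager wrapper linked above).
import android.view.InputEvent;
import java.lang.reflect.Method;

final class InjectSketch {
    static final int INJECT_INPUT_EVENT_MODE_ASYNC = 0; // hidden InputManager constant

    static boolean inject(Object inputManager, InputEvent event) {
        try {
            Method m = inputManager.getClass()
                    .getMethod("injectInputEvent", InputEvent.class, int.class);
            return (boolean) m.invoke(inputManager, event, INJECT_INPUT_EVENT_MODE_ASYNC);
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }
}
```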
@ -223,222 +140,126 @@ _hidden_ method [`InputManager.injectInputEvent()`] (exposed by the
The client relies on [SDL], which provides a cross-platform API for UI, input
events, threading, etc.
The video and audio streams are decoded by [FFmpeg].
The video stream is decoded by [libav] (FFmpeg).
[SDL]: https://www.libsdl.org
[ffmpeg]: https://ffmpeg.org/
[libav]: https://www.libav.org/
### Initialization
The client parses the command line arguments, then [runs one of two code
paths][run]:
- scrcpy in "normal" mode ([`scrcpy.c`])
- scrcpy in [OTG mode](hid-otg.md) ([`scrcpy_otg.c`])
On startup, in addition to _libav_ and _SDL_ initialization, the client must
push and start the server on the device, and open two sockets (one for the video
stream, one for control) so that they may communicate.
[run]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/main.c#L81-L82
[`scrcpy.c`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/scrcpy.c#L292-L293
[`scrcpy_otg.c`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/usb/scrcpy_otg.c#L51-L52
Note that the client-server roles are expressed at the application level:
In the remainder of this document, we assume that the "normal" mode is used
(read the code for the OTG mode).
- the server _serves_ the video stream and handles requests from the client,
- the client _controls_ the device through the server.
On startup, the client:
- opens the _video_, _audio_ and _control_ sockets;
- pushes and starts the server on the device;
- initializes its components (demuxers, decoders, recorder…).
However, the roles are reversed at the network level:
- the client opens a server socket and listens on a port before starting the
server,
- the server connects to the client.
This role inversion guarantees that the connection will not fail due to race
conditions, and avoids polling.
_(Note that over TCP/IP, the roles are not reversed, due to a bug in `adb
reverse`. See commit [1038bad] and [issue #5].)_
Once the server is connected, it sends the device information (name and initial
screen dimensions). Thus, the client may init the window and renderer, before
the first frame is available.
To minimize startup time, SDL initialization is performed while listening for
the connection from the server (see commit [90a46b4]).
[1038bad]: https://github.com/Genymobile/scrcpy/commit/1038bad3850f18717a048a4d5c0f8110e54ee172
[issue #5]: https://github.com/Genymobile/scrcpy/issues/5
[90a46b4]: https://github.com/Genymobile/scrcpy/commit/90a46b4c45637d083e877020d85ade52a9a5fa8e
### Video and audio streams
### Threading
Depending on the arguments passed to `scrcpy`, several components may be used.
Here is an overview of the video and audio components:
The client uses 4 threads:
- the **main** thread, executing the SDL event loop,
- the **stream** thread, receiving the video and used for decoding and
recording,
- the **controller** thread, sending _control messages_ to the server,
- the **receiver** thread (managed by the controller), receiving _device
messages_ from the server.
In addition, another thread can be started if necessary to handle APK
installation or file push requests (via drag&drop on the main window) or to
print the framerate regularly in the console.
### Stream
The video [stream] is received from the socket (connected to the server on the
device) in a separate thread.
If a [decoder] is present (i.e. `--no-display` is not set), then it uses _libav_
to decode the H.264 stream from the socket, and notifies the main thread when a
new frame is available.
There are two [frames][video_buffer] simultaneously in memory:
- the **decoding** frame, written by the decoder from the decoder thread,
- the **rendering** frame, rendered in a texture from the main thread.
When a new decoded frame is available, the decoder _swaps_ the decoding and
rendering frame (with proper synchronization). Thus, it immediately starts
to decode a new frame while the main thread renders the last one.
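
The client is written in C (see [video_buffer] above); the following Java-style
sketch only illustrates the swap idea under a lock, not the actual
implementation:

```java
// Illustrative sketch of the two-frame swap: the decoder always writes into one
// frame while the renderer reads the other, and the two are exchanged under a lock.
final class FramePairSketch<F> {
    private F decodingFrame;
    private F renderingFrame;
    private boolean pendingFrame; // a new decoded frame is ready

    FramePairSketch(F a, F b) {
        this.decodingFrame = a;
        this.renderingFrame = b;
    }

    // Called from the decoder thread once a frame is fully decoded.
    synchronized void publishDecodedFrame() {
        F tmp = decodingFrame;
        decodingFrame = renderingFrame;
        renderingFrame = tmp;
        pendingFrame = true;
    }

    // Called from the rendering thread; returns the last published frame.
    synchronized F consumeRenderingFrame() {
        pendingFrame = false;
        return renderingFrame;
    }
}
```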
If a [recorder] is present (i.e. `--record` is enabled), then it muxes the raw
H.264 packet to the output video file.
[stream]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/stream.h
[decoder]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/decoder.h
[video_buffer]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/video_buffer.h
[recorder]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/recorder.h
```
                                                 V4L2 sink
                                                /
                                        decoder
                                       /        \
        VIDEO -------------> demuxer            display
                                       \
                                        recorder
                                       /
        AUDIO -------------> demuxer
                                       \
                                        decoder --- audio player

                              +----------+      +----------+
                         ---> | decoder  | ---> |  screen  |
            +---------+ /     +----------+      +----------+
socket ---> | stream  | ----
            +---------+ \     +----------+
                         ---> | recorder |
                              +----------+
```
The _demuxer_ is responsible to extract video and audio packets (read some
header, split the video stream into packets at correct boundaries, etc.).
The demuxed packets may be sent to a _decoder_ (one per stream, to produce
frames) and to a recorder (receiving both the video and audio streams to record a
single file). The packets are encoded on the device (by `MediaCodec`), but when
recording, they are _muxed_ (asynchronously) into a container (MKV or MP4) on
the client side.
Video frames are sent to the screen/display to be rendered in the scrcpy window.
They may also be sent to a [V4L2 sink](v4l2.md).
Audio "frames" (an array of decoded samples) are sent to the audio player.
### Controller
The _controller_ is responsible to send _control messages_ to the device. It
The [controller] is responsible to send _control messages_ to the device. It
runs in a separate thread, to avoid I/O on the main thread.
On SDL event, received on the main thread, the _input manager_ creates
appropriate _control messages_. It is responsible for converting SDL events to
Android events. It then pushes the _control messages_ to a queue held by the
controller. On its own thread, the controller takes messages from the queue,
serializes them and sends them to the device.
On SDL event, received on the main thread, the [input manager][inputmanager]
creates appropriate [_control messages_][controlmsg]. It is responsible for
converting SDL events to Android events (using [convert]). It pushes the _control
messages_ to a queue held by the controller. On its own thread, the controller
takes messages from the queue, serializes them and sends them to the device.
[controller]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/controller.h
[controlmsg]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/control_msg.h
[inputmanager]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/input_manager.h
[convert]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/convert.h
## Protocol
### UI and event loop
The protocol between the client and the server must be considered _internal_: it
may (and will) change at any time for any reason. Everything may change (the
number of sockets, the order in which the sockets must be opened, the data
format on the wire…) from version to version. A client must always be run with a
matching server version.
Initialization, input events and rendering are all [managed][scrcpy] in the main
thread.
This section documents the current protocol in scrcpy v2.1.
Events are handled in the [event loop], which either updates the [screen] or
delegates to the [input manager][inputmanager].
### Connection
Firstly, the client sets up an adb tunnel:
```bash
# By default, a reverse redirection: the computer listens, the device connects
adb reverse localabstract:scrcpy_<SCID> tcp:27183
# As a fallback (or if --force-adb-forward is set), a forward redirection:
# the device listens, the computer connects
adb forward tcp:27183 localabstract:scrcpy_<SCID>
```
(`<SCID>` is a 31-bit random number, so that it does not fail when several
scrcpy instances start "at the same time" for the same device.)
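
For illustration, generating such an identifier could look like this (the exact
formatting of the socket name is illustrative):

```java
// Sketch: a 31-bit non-negative random identifier, so that concurrent scrcpy
// instances use distinct socket names (scrcpy_<SCID>).
final class ScidSketch {
    static String newSocketName() {
        int scid = new java.util.Random().nextInt() & 0x7FFF_FFFF; // keep 31 bits
        return "scrcpy_" + String.format("%08x", scid);
    }
}
```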
Then, up to 3 sockets are opened, in that order:
- a _video_ socket
- an _audio_ socket
- a _control_ socket
Each one may be disabled (respectively by `--no-video`, `--no-audio` and
`--no-control`, directly or indirectly). For example, if `--no-audio` is set,
then the _video_ socket is opened first, then the _control_ socket.
On the _first_ socket opened (whichever it is), if the tunnel is _forward_, then
a [dummy byte] is sent from the device to the client. This allows to detect a
connection error (the client connection does not fail as long as there is an adb
forward redirection, even if nothing is listening on the device side).
Still on this _first_ socket, the device sends some [metadata][device meta] to
the client (currently only the device name, used as the window title, but there
might be other fields in the future).
[dummy byte]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L93
[device meta]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L151
You can read the [client][client-connection] and [server][server-connection]
code for more details.
[client-connection]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/server.c#L465-L466
[server-connection]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L63
Then each socket is used for its intended purpose.
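
As a sketch of what the server writes on that first socket (illustrative; the
real logic and the exact length of the device name field are in
`DesktopConnection.java`, linked above):

```java
// Illustrative sketch of the initial handshake on the first socket opened.
// DEVICE_NAME_LENGTH is an assumption for the sketch, not the actual constant.
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

final class HandshakeSketch {
    static final int DEVICE_NAME_LENGTH = 64; // assumption

    static void sendHandshake(OutputStream firstSocket, boolean tunnelForward, String deviceName)
            throws IOException {
        if (tunnelForward) {
            // In forward mode the device listens, so a connection error cannot be
            // detected at connect time: a dummy byte lets the client validate it.
            firstSocket.write(0);
        }
        // Device metadata: currently only the device name, in a fixed-length field.
        byte[] raw = deviceName.getBytes(StandardCharsets.UTF_8);
        byte[] field = new byte[DEVICE_NAME_LENGTH];
        System.arraycopy(raw, 0, field, 0, Math.min(raw.length, DEVICE_NAME_LENGTH - 1));
        firstSocket.write(field);
        firstSocket.flush();
    }
}
```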
### Video and audio
On the _video_ and _audio_ sockets, the device first sends some [codec
metadata]:
- On the _video_ socket, 12 bytes:
- the codec id (`u32`) (H264, H265 or AV1)
- the initial video width (`u32`)
- the initial video height (`u32`)
- On the _audio_ socket, 4 bytes:
- the codec id (`u32`) (OPUS, AAC or RAW)
[codec metadata]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Streamer.java#L33-L51
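
A sketch of writing this codec metadata (illustrative; the real code is in
`Streamer.java`, linked above — `ByteBuffer` is big-endian by default, which
matches the `u32` values read by the client):

```java
// Illustrative sketch of the codec metadata described above (not the actual
// Streamer.java code; the codec id constants are not shown here).
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

final class CodecMetaSketch {
    static void writeVideoMeta(OutputStream out, int codecId, int width, int height) throws IOException {
        ByteBuffer buf = ByteBuffer.allocate(12); // big-endian by default
        buf.putInt(codecId).putInt(width).putInt(height);
        out.write(buf.array());
    }

    static void writeAudioMeta(OutputStream out, int codecId) throws IOException {
        ByteBuffer buf = ByteBuffer.allocate(4);
        buf.putInt(codecId);
        out.write(buf.array());
    }
}
```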
Then each packet produced by `MediaCodec` is sent, prefixed by a 12-byte [frame
header]:
- config packet flag (`u1`)
- key frame flag (`u1`)
- PTS (`u62`)
- packet size (`u32`)
Here is a schema describing the frame header:
```
    [. . . . . . . .|. . . .]. . . . . . . . . . . . . . . ...
     <-------------> <-----> <-----------------------------...
           PTS        packet        raw packet
                       size
     <--------------------->
          frame header
```

The most significant bits of the PTS are used for packet flags:

```
     byte 7   byte 6   byte 5   byte 4   byte 3   byte 2   byte 1   byte 0
    CK...... ........ ........ ........ ........ ........ ........ ........
    ^^<------------------------------------------------------------------->
    ||                                PTS
    | `- key frame
     `-- config packet
```
[frame header]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Streamer.java#L83
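
A sketch of writing this 12-byte header (illustrative; see `Streamer.java` in the
link above). The two flags occupy the two most significant bits of the 64-bit PTS
field, and a config packet carries no meaningful PTS:

```java
// Illustrative sketch of the frame header layout described above (not the
// actual Streamer.java code).
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

final class FrameHeaderSketch {
    private static final long PACKET_FLAG_CONFIG = 1L << 63;    // 'C' bit
    private static final long PACKET_FLAG_KEY_FRAME = 1L << 62; // 'K' bit

    static void writeFrameHeader(OutputStream out, long pts, boolean config,
                                 boolean keyFrame, int packetSize) throws IOException {
        long ptsAndFlags = config ? PACKET_FLAG_CONFIG : pts; // config packets have no PTS
        if (keyFrame) {
            ptsAndFlags |= PACKET_FLAG_KEY_FRAME;
        }
        ByteBuffer header = ByteBuffer.allocate(12); // big-endian by default
        header.putLong(ptsAndFlags);
        header.putInt(packetSize);
        out.write(header.array());
    }
}
```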
### Controls
Control messages are sent via a custom binary protocol.
The only documentation for this protocol is the set of unit tests on both sides:
- `ControlMessage` (from client to device): [serialization](https://github.com/Genymobile/scrcpy/blob/master/app/tests/test_control_msg_serialize.c) | [deserialization](https://github.com/Genymobile/scrcpy/blob/master/server/src/test/java/com/genymobile/scrcpy/ControlMessageReaderTest.java)
- `DeviceMessage` (from device to client): [serialization](https://github.com/Genymobile/scrcpy/blob/master/server/src/test/java/com/genymobile/scrcpy/DeviceMessageWriterTest.java) | [deserialization](https://github.com/Genymobile/scrcpy/blob/master/app/tests/test_device_msg_deserialize.c)
## Standalone server
Although the server is designed to work for the scrcpy client, it can be used
with any client which uses the same protocol.
For simplicity, some [server-specific options] have been added to produce raw
streams easily:
- `send_device_meta=false`: disable the device metadata (in practice, the device
name) sent on the _first_ socket
- `send_frame_meta=false`: disable the 12-byte header for each packet
- `send_dummy_byte=false`: disable the dummy byte sent on forward connections
- `send_codec_meta=false`: disable the codec information (and initial device size
for video)
- `raw_stream=true`: disable all the above
[server-specific options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Options.java#L309-L329
Concretely, here is how to expose a raw H.264 stream on a TCP socket:
```bash
adb push scrcpy-server-v2.1 /data/local/tmp/scrcpy-server-manual.jar
adb forward tcp:1234 localabstract:scrcpy
adb shell CLASSPATH=/data/local/tmp/scrcpy-server-manual.jar \
app_process / com.genymobile.scrcpy.Server 2.1 \
tunnel_forward=true audio=false control=false cleanup=false \
raw_stream=true max_size=1920
```
As soon as a client connects over TCP on port 1234, the device will start
streaming the video. For example, VLC can play the video (although you will
experience a very high latency, more details [here][vlc-0latency]):
```
vlc -Idummy --demux=h264 --network-caching=0 tcp://localhost:1234
```
[vlc-0latency]: https://code.videolan.org/rom1v/vlc/-/merge_requests/20
[scrcpy]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/scrcpy.c
[event loop]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/scrcpy.c#L201
[screen]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/screen.h
## Hack

View File

@ -1,9 +1,156 @@
# Device
## Selection
If exactly one device is connected (i.e. listed by `adb devices`), then it is
automatically selected.
However, if there are multiple devices connected, you must specify the one to
use in one of 4 ways:
- by its serial:
```bash
scrcpy --serial=0123456789abcdef
scrcpy -s 0123456789abcdef # short version
# the serial is the ip:port if connected over TCP/IP (same behavior as adb)
scrcpy --serial=192.168.1.1:5555
```
- the one connected over USB (if there is exactly one):
```bash
scrcpy --select-usb
scrcpy -d # short version
```
- the one connected over TCP/IP (if there is exactly one):
```bash
scrcpy --select-tcpip
scrcpy -e # short version
```
- a device already listening on TCP/IP (see [below](#tcpip-wireless)):
```bash
scrcpy --tcpip=192.168.1.1:5555
scrcpy --tcpip=192.168.1.1 # default port is 5555
```
The serial may also be provided via the environment variable `ANDROID_SERIAL`
(also used by `adb`):
```bash
# in bash
export ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```cmd
:: in cmd
set ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```powershell
# in PowerShell
$env:ANDROID_SERIAL = '0123456789abcdef'
scrcpy
```
## TCP/IP (wireless)
_Scrcpy_ uses `adb` to communicate with the device, and `adb` can [connect] to a
device over TCP/IP. The device must be connected on the same network as the
computer.
[connect]: https://developer.android.com/studio/command-line/adb.html#wireless
### Automatic
An option `--tcpip` allows to configure the connection automatically. There are
two variants.
If the device (accessible at 192.168.1.1 in this example) already listens on a
port (typically 5555) for incoming _adb_ connections, then run:
```bash
scrcpy --tcpip=192.168.1.1 # default port is 5555
scrcpy --tcpip=192.168.1.1:5555
```
If _adb_ TCP/IP mode is disabled on the device (or if you don't know the IP
address), connect the device over USB, then run:
```bash
scrcpy --tcpip # without arguments
```
It will automatically find the device IP address and adb port, enable TCP/IP
mode if necessary, then connect to the device before starting.
### Manual
Alternatively, it is possible to enable the TCP/IP connection manually using
`adb`:
1. Plug the device into a USB port on your computer.
2. Connect the device to the same Wi-Fi network as your computer.
3. Get your device IP address, in Settings → About phone → Status, or by
executing this command:
```bash
adb shell ip route | awk '{print $9}'
```
4. Enable `adb` over TCP/IP on your device: `adb tcpip 5555`.
5. Unplug your device.
6. Connect to your device: `adb connect DEVICE_IP:5555` _(replace `DEVICE_IP`
with the device IP address you found)_.
7. Run `scrcpy` as usual.
8. Run `adb disconnect` once you're done.
Since Android 11, a [wireless debugging option][adb-wireless] allows to bypass
having to physically connect your device directly to your computer.
[adb-wireless]: https://developer.android.com/studio/command-line/adb#wireless-android11-command-line
## Autostart
A small tool (by the scrcpy author) allows to run arbitrary commands whenever a
new Android device is connected: [AutoAdb]. It can be used to start scrcpy:
```bash
autoadb scrcpy -s '{}'
```
[AutoAdb]: https://github.com/rom1v/autoadb
## Display
If several displays are available on the Android device, it is possible to
select the display to mirror:
```bash
scrcpy --display=1
```
The list of display ids can be retrieved by:
```bash
scrcpy --list-displays
```
A secondary display may only be controlled if the device runs at least Android
10 (otherwise it is mirrored as read-only).
## Actions
Some command line arguments perform actions on the device itself while scrcpy is
running.
## Stay awake
### Stay awake
To prevent the device from sleeping after a delay **when the device is plugged
in**:
@ -19,7 +166,7 @@ If the device is not plugged in (i.e. only connected over TCP/IP),
`--stay-awake` has no effect (this is the Android behavior).
## Turn screen off
### Turn screen off
It is possible to turn the device screen off while mirroring on start with a
command-line option:
@ -47,7 +194,7 @@ scrcpy -Sw # short version
```
## Show touches
### Show touches
For presentations, it may be useful to show physical touches (on the physical
device). Android exposes this feature in _Developer options_.
@ -63,7 +210,7 @@ scrcpy -t # short version
Note that it only shows _physical_ touches (by a finger on the device).
## Power off on close
### Power off on close
To turn the device screen off when closing _scrcpy_:
@ -71,10 +218,11 @@ To turn the device screen off when closing _scrcpy_:
scrcpy --power-off-on-close
```
## Power on on start
### Power on on start
By default, on start, the device is powered on. To prevent this behavior:
```bash
scrcpy --no-power-on
```

View File

@ -106,7 +106,3 @@ scrcpy --otg # keyboard and mouse
Like `--hid-keyboard` and `--hid-mouse`, it only works if the device is
connected over USB.
## HID/OTG issues on Windows
See [FAQ](/FAQ.md#hidotg-issues-on-windows).

View File

@ -9,10 +9,12 @@ Scrcpy is packaged in several distributions and package managers:
- Debian/Ubuntu: `apt install scrcpy`
- Arch Linux: `pacman -S scrcpy`
- Fedora: `dnf copr enable zeno/scrcpy && dnf install scrcpy`
- Gentoo: `emerge scrcpy`
- Gentoo: [ebuild][ebuild-link] file
- Snap: `snap install scrcpy`
- … (see [repology](https://repology.org/project/scrcpy/versions))
[ebuild-link]: https://github.com/maggu2810/maggu2810-overlay/tree/master/app-mobilephone/scrcpy
### Latest version
However, the packaged version is not always the latest release. To install the

View File

@ -29,7 +29,7 @@ _<kbd>[Super]</kbd> is typically the <kbd>Windows</kbd> or <kbd>Cmd</kbd> key._
| Resize window to 1:1 (pixel-perfect) | <kbd>MOD</kbd>+<kbd>g</kbd>
| Resize window to remove black borders | <kbd>MOD</kbd>+<kbd>w</kbd> \| _Double-left-click¹_
| Click on `HOME` | <kbd>MOD</kbd>+<kbd>h</kbd> \| _Middle-click_
| Click on `BACK` | <kbd>MOD</kbd>+<kbd>b</kbd> \| <kbd>MOD</kbd>+<kbd>Backspace</kbd> \| _Right-click²_
| Click on `BACK` | <kbd>MOD</kbd>+<kbd>b</kbd> \| _Right-click²_
| Click on `APP_SWITCH` | <kbd>MOD</kbd>+<kbd>s</kbd> \| _4th-click³_
| Click on `MENU` (unlock screen)⁴ | <kbd>MOD</kbd>+<kbd>m</kbd>
| Click on `VOLUME_UP` | <kbd>MOD</kbd>+<kbd></kbd> _(up)_

View File

@ -1,14 +1,5 @@
# Video
## Source
By default, scrcpy mirrors the device screen.
It is possible to capture the device camera instead.
See the dedicated [camera](camera.md) page.
## Size
By default, scrcpy attempts to mirror at the Android device resolution.
@ -30,7 +21,7 @@ If encoding fails, scrcpy automatically tries again with a lower definition
## Bit rate
The default video bit rate is 8 Mbps. To change it:
The default video bit-rate is 8 Mbps. To change it:
```bash
scrcpy --video-bit-rate=2M
@ -75,14 +66,6 @@ scrcpy --video-codec=av1
H265 may provide better quality, but H264 should provide lower latency.
AV1 encoders are not common on current Android devices.
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--video-codec-options` in the manpage or in `scrcpy --help`.
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat
## Encoder
Several encoders may be available on the device. They can be listed by:
```bash
@ -96,6 +79,11 @@ try another one:
scrcpy --video-codec=h264 --video-encoder='OMX.qcom.video.encoder.avc'
```
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--video-codec-options` in the manpage or in `scrcpy --help`.
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat
## Rotation
@ -146,25 +134,6 @@ phone, landscape for a tablet).
If `--max-size` is also specified, resizing is applied after cropping.
## Display
If several displays are available on the Android device, it is possible to
select the display to mirror:
```bash
scrcpy --display-id=1
```
The list of display ids can be retrieved by:
```bash
scrcpy --list-displays
```
A secondary display may only be controlled if the device runs at least Android
10 (otherwise it is mirrored as read-only).
## Buffering
By default, there is no video buffering, to get the lowest possible latency.

View File

@ -4,24 +4,18 @@
Download the [latest release]:
- [`scrcpy-win64-v2.1.1.zip`][direct-win64] (64-bit)
<sub>SHA-256: `f77281e1bce2f9934617699c581f063d5b327f012eff602ee98fb2ef550c25c2`</sub>
- [`scrcpy-win32-v2.1.1.zip`][direct-win32] (32-bit)
<sub>SHA-256: `ef7ae7fbe9449f2643febdc2244fb186d1a746a3c736394150cfd14f06d3c943`</sub>
- [`scrcpy-win64-v2.0.zip`][direct-win64] (64-bit)
<sub>SHA-256: `ae4c8d37a496b43f8974ba8f07f708e22a9570ba0cddc3dc3a36edbccd4d2a20`</sub>
- [`scrcpy-win32-v2.0.zip`][direct-win32] (32-bit)
<sub>SHA-256: `15d98c02cb0e0bbd84f8b5d54991e0f6925569b1286a86a40743944fcb1c2d8c`</sub>
[latest release]: https://github.com/Genymobile/scrcpy/releases/latest
[direct-win64]: https://github.com/Genymobile/scrcpy/releases/download/v2.1.1/scrcpy-win64-v2.1.1.zip
[direct-win32]: https://github.com/Genymobile/scrcpy/releases/download/v2.1.1/scrcpy-win32-v2.1.1.zip
[direct-win64]: https://github.com/Genymobile/scrcpy/releases/download/v2.0/scrcpy-win64-v2.0.zip
[direct-win32]: https://github.com/Genymobile/scrcpy/releases/download/v2.0/scrcpy-win32-v2.0.zip
and extract it.
Alternatively, you could install it from a package manager, like [Winget]:
```bash
winget install scrcpy
```
or [Chocolatey]:
Alternatively, you could install it from a package manager, like [Chocolatey]:
```bash
choco install scrcpy
@ -36,7 +30,6 @@ scoop install scrcpy
scoop install adb # if you don't have it yet
```
[Winget]: https://github.com/microsoft/winget-cli
[Chocolatey]: https://chocolatey.org/
[Scoop]: https://scoop.sh

View File

@ -2,8 +2,8 @@
set -e
BUILDDIR=build-auto
PREBUILT_SERVER_URL=https://github.com/Genymobile/scrcpy/releases/download/v2.1.1/scrcpy-server-v2.1.1
PREBUILT_SERVER_SHA256=9558db6c56743a1dc03b38f59801fb40e91cc891f8fc0c89e5b0b067761f148e
PREBUILT_SERVER_URL=https://github.com/Genymobile/scrcpy/releases/download/v2.0/scrcpy-server-v2.0
PREBUILT_SERVER_SHA256=9e241615f578cd690bb43311000debdecf6a9c50a7082b001952f18f6f21ddc2
echo "[scrcpy] Downloading prebuilt server..."
wget "$PREBUILT_SERVER_URL" -O scrcpy-server

View File

@ -1,5 +1,5 @@
project('scrcpy', 'c',
version: 'v2.2',
version: '2.0',
meson_version: '>= 0.48',
default_options: [
'c_std=c11',
@ -7,8 +7,6 @@ project('scrcpy', 'c',
'b_ndebug=if-release',
])
add_project_arguments('-Wmissing-prototypes', language: 'c')
if get_option('compile_app')
subdir('app')
endif

View File

@ -98,10 +98,10 @@ dist-win32: build-server build-win32
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/avcodec-60.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/avformat-60.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/swresample-4.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/adb.exe "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/AdbWinApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/AdbWinUsbApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/SDL2-2.28.4/i686-w64-mingw32/bin/SDL2.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/adb.exe "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/AdbWinApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/AdbWinUsbApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/SDL2-2.26.4/i686-w64-mingw32/bin/SDL2.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/libusb-1.0.26/libusb-MinGW-Win32/bin/msys-usb-1.0.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
dist-win64: build-server build-win64
@ -116,10 +116,10 @@ dist-win64: build-server build-win64
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/avcodec-60.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/avformat-60.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/swresample-4.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/adb.exe "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/AdbWinApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/AdbWinUsbApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/SDL2-2.28.4/x86_64-w64-mingw32/bin/SDL2.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/adb.exe "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/AdbWinApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/AdbWinUsbApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/SDL2-2.26.4/x86_64-w64-mingw32/bin/SDL2.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/libusb-1.0.26/libusb-MinGW-x64/bin/msys-usb-1.0.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
zip-win32: dist-win32

View File

@ -7,8 +7,8 @@ android {
applicationId "com.genymobile.scrcpy"
minSdkVersion 21
targetSdkVersion 33
versionCode 200
versionName "v2.2"
versionCode 20000
versionName "2.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {

View File

@ -12,7 +12,7 @@
set -e
SCRCPY_DEBUG=false
SCRCPY_VERSION_NAME=v2.2
SCRCPY_VERSION_NAME=2.0
PLATFORM=${ANDROID_PLATFORM:-33}
BUILD_TOOLS=${ANDROID_BUILD_TOOLS:-33.0.0}
@ -48,7 +48,6 @@ cd "$SERVER_DIR/src/main/aidl"
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" android/view/IRotationWatcher.aidl
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" \
android/content/IOnPrimaryClipChangedListener.aidl
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" android/view/IDisplayFoldListener.aidl
echo "Compiling java sources..."
cd ../java

View File

@ -118,7 +118,7 @@ public final class AudioCapture {
if (Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
startWorkaroundAndroid11();
try {
tryStartRecording(5, 100);
tryStartRecording(3, 100);
} finally {
stopWorkaroundAndroid11();
}

View File

@ -1,37 +0,0 @@
package com.genymobile.scrcpy;
public final class CameraAspectRatio {
private static final float SENSOR = -1;
private float ar;
private CameraAspectRatio(float ar) {
this.ar = ar;
}
public static CameraAspectRatio fromFloat(float ar) {
if (ar < 0) {
throw new IllegalArgumentException("Invalid aspect ratio: " + ar);
}
return new CameraAspectRatio(ar);
}
public static CameraAspectRatio fromFraction(int w, int h) {
if (w <= 0 || h <= 0) {
throw new IllegalArgumentException("Invalid aspect ratio: " + w + ":" + h);
}
return new CameraAspectRatio((float) w / h);
}
public static CameraAspectRatio sensorAspectRatio() {
return new CameraAspectRatio(SENSOR);
}
public boolean isSensor() {
return ar == SENSOR;
}
public float getAspectRatio() {
return ar;
}
}

View File

@ -1,352 +0,0 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ServiceManager;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.graphics.Rect;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraConstrainedHighSpeedCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureFailure;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.OutputConfiguration;
import android.hardware.camera2.params.SessionConfiguration;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.MediaCodec;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Range;
import android.view.Surface;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;
import java.util.Optional;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.stream.Stream;
public class CameraCapture extends SurfaceCapture {
private final String explicitCameraId;
private final CameraFacing cameraFacing;
private final Size explicitSize;
private int maxSize;
private final CameraAspectRatio aspectRatio;
private final int fps;
private final boolean highSpeed;
private String cameraId;
private Size size;
private HandlerThread cameraThread;
private Handler cameraHandler;
private CameraDevice cameraDevice;
private Executor cameraExecutor;
private final AtomicBoolean disconnected = new AtomicBoolean();
public CameraCapture(String explicitCameraId, CameraFacing cameraFacing, Size explicitSize, int maxSize, CameraAspectRatio aspectRatio, int fps,
boolean highSpeed) {
this.explicitCameraId = explicitCameraId;
this.cameraFacing = cameraFacing;
this.explicitSize = explicitSize;
this.maxSize = maxSize;
this.aspectRatio = aspectRatio;
this.fps = fps;
this.highSpeed = highSpeed;
}
@Override
public void init() throws IOException {
cameraThread = new HandlerThread("camera");
cameraThread.start();
cameraHandler = new Handler(cameraThread.getLooper());
cameraExecutor = new HandlerExecutor(cameraHandler);
try {
cameraId = selectCamera(explicitCameraId, cameraFacing);
if (cameraId == null) {
throw new IOException("No matching camera found");
}
size = selectSize(cameraId, explicitSize, maxSize, aspectRatio, highSpeed);
if (size == null) {
throw new IOException("Could not select camera size");
}
Ln.i("Using camera '" + cameraId + "'");
cameraDevice = openCamera(cameraId);
} catch (CameraAccessException | InterruptedException e) {
throw new IOException(e);
}
}
private static String selectCamera(String explicitCameraId, CameraFacing cameraFacing) throws CameraAccessException {
if (explicitCameraId != null) {
return explicitCameraId;
}
CameraManager cameraManager = ServiceManager.getCameraManager();
String[] cameraIds = cameraManager.getCameraIdList();
if (cameraFacing == null) {
// Use the first one
return cameraIds.length > 0 ? cameraIds[0] : null;
}
for (String cameraId : cameraIds) {
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
int facing = characteristics.get(CameraCharacteristics.LENS_FACING);
if (cameraFacing.value() == facing) {
return cameraId;
}
}
// Not found
return null;
}
@TargetApi(Build.VERSION_CODES.N)
private static Size selectSize(String cameraId, Size explicitSize, int maxSize, CameraAspectRatio aspectRatio, boolean highSpeed)
throws CameraAccessException {
if (explicitSize != null) {
return explicitSize;
}
CameraManager cameraManager = ServiceManager.getCameraManager();
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
android.util.Size[] sizes = highSpeed ? configs.getHighSpeedVideoSizes() : configs.getOutputSizes(MediaCodec.class);
Stream<android.util.Size> stream = Arrays.stream(sizes);
if (maxSize > 0) {
stream = stream.filter(it -> it.getWidth() <= maxSize && it.getHeight() <= maxSize);
}
Float targetAspectRatio = resolveAspectRatio(aspectRatio, characteristics);
if (targetAspectRatio != null) {
stream = stream.filter(it -> {
float ar = ((float) it.getWidth() / it.getHeight());
float arRatio = ar / targetAspectRatio;
// Accept if the aspect ratio is the target aspect ratio + or - 10%
return arRatio >= 0.9f && arRatio <= 1.1f;
});
}
Optional<android.util.Size> selected = stream.max((s1, s2) -> {
// Greater width is better
int cmp = Integer.compare(s1.getWidth(), s2.getWidth());
if (cmp != 0) {
return cmp;
}
if (targetAspectRatio != null) {
// Closer to the target aspect ratio is better
float ar1 = ((float) s1.getWidth() / s1.getHeight());
float arRatio1 = ar1 / targetAspectRatio;
float distance1 = Math.abs(1 - arRatio1);
float ar2 = ((float) s2.getWidth() / s2.getHeight());
float arRatio2 = ar2 / targetAspectRatio;
float distance2 = Math.abs(1 - arRatio2);
// Reverse the order because lower distance is better
cmp = Float.compare(distance2, distance1);
if (cmp != 0) {
return cmp;
}
}
// Greater height is better
return Integer.compare(s1.getHeight(), s2.getHeight());
});
if (selected.isPresent()) {
android.util.Size size = selected.get();
return new Size(size.getWidth(), size.getHeight());
}
// Not found
return null;
}
private static Float resolveAspectRatio(CameraAspectRatio ratio, CameraCharacteristics characteristics) {
if (ratio == null) {
return null;
}
if (ratio.isSensor()) {
Rect activeSize = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
return (float) activeSize.width() / activeSize.height();
}
return ratio.getAspectRatio();
}
@Override
public void start(Surface surface) throws IOException {
try {
CameraCaptureSession session = createCaptureSession(cameraDevice, surface);
CaptureRequest request = createCaptureRequest(surface);
setRepeatingRequest(session, request);
} catch (CameraAccessException | InterruptedException e) {
throw new IOException(e);
}
}
@Override
public void release() {
if (cameraDevice != null) {
cameraDevice.close();
}
if (cameraThread != null) {
cameraThread.quitSafely();
}
}
@Override
public Size getSize() {
return size;
}
@Override
public boolean setMaxSize(int maxSize) {
if (explicitSize != null) {
return false;
}
this.maxSize = maxSize;
try {
size = selectSize(cameraId, null, maxSize, aspectRatio, highSpeed);
return size != null;
} catch (CameraAccessException e) {
Ln.w("Could not select camera size", e);
return false;
}
}
@SuppressLint("MissingPermission")
@TargetApi(Build.VERSION_CODES.S)
private CameraDevice openCamera(String id) throws CameraAccessException, InterruptedException {
CompletableFuture<CameraDevice> future = new CompletableFuture<>();
ServiceManager.getCameraManager().openCamera(id, new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
Ln.d("Camera opened successfully");
future.complete(camera);
}
@Override
public void onDisconnected(CameraDevice camera) {
Ln.w("Camera disconnected");
disconnected.set(true);
requestReset();
}
@Override
public void onError(CameraDevice camera, int error) {
int cameraAccessExceptionErrorCode;
switch (error) {
case CameraDevice.StateCallback.ERROR_CAMERA_IN_USE:
cameraAccessExceptionErrorCode = CameraAccessException.CAMERA_IN_USE;
break;
case CameraDevice.StateCallback.ERROR_MAX_CAMERAS_IN_USE:
cameraAccessExceptionErrorCode = CameraAccessException.MAX_CAMERAS_IN_USE;
break;
case CameraDevice.StateCallback.ERROR_CAMERA_DISABLED:
cameraAccessExceptionErrorCode = CameraAccessException.CAMERA_DISABLED;
break;
case CameraDevice.StateCallback.ERROR_CAMERA_DEVICE:
case CameraDevice.StateCallback.ERROR_CAMERA_SERVICE:
default:
cameraAccessExceptionErrorCode = CameraAccessException.CAMERA_ERROR;
break;
}
future.completeExceptionally(new CameraAccessException(cameraAccessExceptionErrorCode));
}
}, cameraHandler);
try {
return future.get();
} catch (ExecutionException e) {
throw (CameraAccessException) e.getCause();
}
}
@TargetApi(Build.VERSION_CODES.S)
private CameraCaptureSession createCaptureSession(CameraDevice camera, Surface surface) throws CameraAccessException, InterruptedException {
CompletableFuture<CameraCaptureSession> future = new CompletableFuture<>();
OutputConfiguration outputConfig = new OutputConfiguration(surface);
List<OutputConfiguration> outputs = Arrays.asList(outputConfig);
int sessionType = highSpeed ? SessionConfiguration.SESSION_HIGH_SPEED : SessionConfiguration.SESSION_REGULAR;
SessionConfiguration sessionConfig = new SessionConfiguration(sessionType, outputs, cameraExecutor,
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
future.complete(session);
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
future.completeExceptionally(new CameraAccessException(CameraAccessException.CAMERA_ERROR));
}
});
camera.createCaptureSession(sessionConfig);
try {
return future.get();
} catch (ExecutionException e) {
throw (CameraAccessException) e.getCause();
}
}
private CaptureRequest createCaptureRequest(Surface surface) throws CameraAccessException {
CaptureRequest.Builder requestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
requestBuilder.addTarget(surface);
if (fps > 0) {
requestBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, new Range<>(fps, fps));
}
return requestBuilder.build();
}
@TargetApi(Build.VERSION_CODES.S)
private void setRepeatingRequest(CameraCaptureSession session, CaptureRequest request) throws CameraAccessException, InterruptedException {
CameraCaptureSession.CaptureCallback callback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(CameraCaptureSession session, CaptureRequest request, long timestamp, long frameNumber) {
// Called for each frame captured, do nothing
}
@Override
public void onCaptureFailed(CameraCaptureSession session, CaptureRequest request, CaptureFailure failure) {
Ln.w("Camera capture failed: frame " + failure.getFrameNumber());
}
};
if (highSpeed) {
CameraConstrainedHighSpeedCaptureSession highSpeedSession = (CameraConstrainedHighSpeedCaptureSession) session;
List<CaptureRequest> requests = highSpeedSession.createHighSpeedRequestList(request);
highSpeedSession.setRepeatingBurst(requests, callback, cameraHandler);
} else {
session.setRepeatingRequest(request, callback, cameraHandler);
}
}
@Override
public boolean isClosed() {
return disconnected.get();
}
}
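For reference, here is a minimal standalone sketch (not part of the diff) of the size-selection rule implemented in selectSize() above: keep candidates within the maximum dimension and within ±10% of the target aspect ratio, then take the largest width, breaking ties by aspect-ratio distance and finally by height. The class name and sample values are hypothetical.

```java
import java.util.Arrays;
import java.util.Optional;

// Hypothetical sketch of the selection rule, using plain {width, height} pairs.
public class SizeSelectionSketch {

    public static void main(String[] args) {
        int[][] sizes = {{1920, 1080}, {1440, 1080}, {1280, 720}, {640, 480}};
        int maxSize = 1920;                 // assumed --max-size value
        float targetAspectRatio = 16f / 9f; // assumed --camera-ar value

        Optional<int[]> selected = Arrays.stream(sizes)
                .filter(s -> s[0] <= maxSize && s[1] <= maxSize)
                // accept only sizes within +/- 10% of the target aspect ratio
                .filter(s -> {
                    float ratio = ((float) s[0] / s[1]) / targetAspectRatio;
                    return ratio >= 0.9f && ratio <= 1.1f;
                })
                .max((s1, s2) -> {
                    // greater width is better
                    int cmp = Integer.compare(s1[0], s2[0]);
                    if (cmp != 0) {
                        return cmp;
                    }
                    // then closer to the target aspect ratio is better
                    float d1 = Math.abs(1 - ((float) s1[0] / s1[1]) / targetAspectRatio);
                    float d2 = Math.abs(1 - ((float) s2[0] / s2[1]) / targetAspectRatio);
                    cmp = Float.compare(d2, d1); // reversed: lower distance wins
                    if (cmp != 0) {
                        return cmp;
                    }
                    // finally, greater height is better
                    return Integer.compare(s1[1], s2[1]);
                });

        selected.ifPresent(s -> System.out.println(s[0] + "x" + s[1])); // prints 1920x1080
    }
}
```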

View File

@ -1,33 +0,0 @@
package com.genymobile.scrcpy;
import android.annotation.SuppressLint;
import android.hardware.camera2.CameraCharacteristics;
public enum CameraFacing {
FRONT("front", CameraCharacteristics.LENS_FACING_FRONT),
BACK("back", CameraCharacteristics.LENS_FACING_BACK),
@SuppressLint("InlinedApi") // introduced in API 23
EXTERNAL("external", CameraCharacteristics.LENS_FACING_EXTERNAL);
private final String name;
private final int value;
CameraFacing(String name, int value) {
this.name = name;
this.value = value;
}
int value() {
return value;
}
static CameraFacing findByName(String name) {
for (CameraFacing facing : CameraFacing.values()) {
if (name.equals(facing.name)) {
return facing;
}
}
return null;
}
}

View File

@ -64,6 +64,8 @@ public final class DesktopConnection implements Closeable {
throws IOException {
String socketName = getSocketName(scid);
LocalSocket firstSocket = null;
LocalSocket videoSocket = null;
LocalSocket audioSocket = null;
LocalSocket controlSocket = null;
@ -72,28 +74,24 @@ public final class DesktopConnection implements Closeable {
try (LocalServerSocket localServerSocket = new LocalServerSocket(socketName)) {
if (video) {
videoSocket = localServerSocket.accept();
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
videoSocket.getOutputStream().write(0);
sendDummyByte = false;
}
firstSocket = videoSocket;
}
if (audio) {
audioSocket = localServerSocket.accept();
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
audioSocket.getOutputStream().write(0);
sendDummyByte = false;
if (firstSocket == null) {
firstSocket = audioSocket;
}
}
if (control) {
controlSocket = localServerSocket.accept();
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
controlSocket.getOutputStream().write(0);
sendDummyByte = false;
if (firstSocket == null) {
firstSocket = controlSocket;
}
}
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
firstSocket.getOutputStream().write(0);
}
}
} else {
if (video) {
@ -132,29 +130,20 @@ public final class DesktopConnection implements Closeable {
return controlSocket;
}
public void shutdown() throws IOException {
public void close() throws IOException {
if (videoSocket != null) {
videoSocket.shutdownInput();
videoSocket.shutdownOutput();
videoSocket.close();
}
if (audioSocket != null) {
audioSocket.shutdownInput();
audioSocket.shutdownOutput();
audioSocket.close();
}
if (controlSocket != null) {
controlSocket.shutdownInput();
controlSocket.shutdownOutput();
}
}
public void close() throws IOException {
if (videoSocket != null) {
videoSocket.close();
}
if (audioSocket != null) {
audioSocket.close();
}
if (controlSocket != null) {
controlSocket.close();
}
}
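As an aside, the "dummy byte" mentioned in the comments above lets the client detect a connection error with a single blocking read on the first accepted socket, before any protocol data is exchanged. A hedged client-side sketch follows (host, port and class name are assumptions for illustration; the real client is not written in Java):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.Socket;

// Hypothetical sketch: detect a connection error by reading the dummy byte.
public class DummyByteCheck {

    public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket()) {
            // assumed adb-forwarded local port
            socket.connect(new InetSocketAddress("127.0.0.1", 27183), 5000);
            InputStream in = socket.getInputStream();
            int b = in.read(); // blocks until the server writes the dummy byte
            if (b == -1) {
                throw new IOException("Connection error: socket closed before the dummy byte");
            }
            System.out.println("Connection established, proceeding with the protocol...");
        }
    }
}
```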

View File

@ -99,32 +99,25 @@ public final class Device {
}
}, displayId);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
ServiceManager.getWindowManager().registerDisplayFoldListener(new IDisplayFoldListener.Stub() {
@Override
public void onDisplayFoldChanged(int displayId, boolean folded) {
if (Device.this.displayId != displayId) {
// Ignore events related to other display ids
ServiceManager.getWindowManager().registerDisplayFoldListener(new IDisplayFoldListener.Stub() {
@Override
public void onDisplayFoldChanged(int displayId, boolean folded) {
synchronized (Device.this) {
DisplayInfo displayInfo = ServiceManager.getDisplayManager().getDisplayInfo(displayId);
if (displayInfo == null) {
Ln.e("Display " + displayId + " not found\n" + LogUtils.buildDisplayListMessage());
return;
}
synchronized (Device.this) {
DisplayInfo displayInfo = ServiceManager.getDisplayManager().getDisplayInfo(displayId);
if (displayInfo == null) {
Ln.e("Display " + displayId + " not found\n" + LogUtils.buildDisplayListMessage());
return;
}
screenInfo = ScreenInfo.computeScreenInfo(displayInfo.getRotation(), displayInfo.getSize(), options.getCrop(),
options.getMaxSize(), options.getLockVideoOrientation());
// notify
if (foldListener != null) {
foldListener.onFoldChanged(displayId, folded);
}
screenInfo = ScreenInfo.computeScreenInfo(displayInfo.getRotation(), displayInfo.getSize(), options.getCrop(),
options.getMaxSize(), options.getLockVideoOrientation());
// notify
if (foldListener != null) {
foldListener.onFoldChanged(displayId, folded);
}
}
});
}
}
});
if (options.getControl() && options.getClipboardAutosync()) {
// If control and autosync are enabled, synchronize Android clipboard to the computer automatically

View File

@ -1,24 +1,42 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ActivityThread;
import android.annotation.TargetApi;
import android.content.AttributionSource;
import android.content.MutableContextWrapper;
import android.content.Context;
import android.content.ContextWrapper;
import android.os.Build;
import android.os.Process;
public final class FakeContext extends MutableContextWrapper {
import java.lang.reflect.Method;
public final class FakeContext extends ContextWrapper {
public static final String PACKAGE_NAME = "com.android.shell";
public static final int ROOT_UID = 0; // Like android.os.Process.ROOT_UID, but before API 29
private static final FakeContext INSTANCE = new FakeContext();
private static Context retrieveSystemContext() {
try {
Class<?> activityThreadClass = ActivityThread.getActivityThreadClass();
Object activityThread = ActivityThread.getActivityThread();
Method getSystemContextMethod = activityThreadClass.getDeclaredMethod("getSystemContext");
return (Context) getSystemContextMethod.invoke(activityThread);
} catch (Exception e) {
Ln.e("Cannot retrieve system context", e);
return null;
}
}
public static FakeContext get() {
return INSTANCE;
}
private FakeContext() {
super(null);
super(retrieveSystemContext());
}
@Override

View File

@ -1,23 +0,0 @@
package com.genymobile.scrcpy;
import android.os.Handler;
import java.util.concurrent.Executor;
import java.util.concurrent.RejectedExecutionException;
// Inspired by the hidden android.os.HandlerExecutor
public class HandlerExecutor implements Executor {
private final Handler handler;
public HandlerExecutor(Handler handler) {
this.handler = handler;
}
@Override
public void execute(Runnable command) {
if (!handler.post(command)) {
throw new RejectedExecutionException(handler + " is shutting down");
}
}
}

View File

@ -2,11 +2,6 @@ package com.genymobile.scrcpy;
import android.util.Log;
import java.io.FileDescriptor;
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.io.PrintStream;
/**
* Log both to Android logger (so that logs are visible in "adb logcat") and standard output/error (so that they are visible in the terminal
* directly).
@ -16,9 +11,6 @@ public final class Ln {
private static final String TAG = "scrcpy";
private static final String PREFIX = "[server] ";
private static final PrintStream CONSOLE_OUT = new PrintStream(new FileOutputStream(FileDescriptor.out));
private static final PrintStream CONSOLE_ERR = new PrintStream(new FileOutputStream(FileDescriptor.err));
enum Level {
VERBOSE, DEBUG, INFO, WARN, ERROR
}
@ -29,12 +21,6 @@ public final class Ln {
// not instantiable
}
public static void disableSystemStreams() {
PrintStream nullStream = new PrintStream(new NullOutputStream());
System.setOut(nullStream);
System.setErr(nullStream);
}
/**
* Initialize the log level.
* <p>
@ -53,30 +39,30 @@ public final class Ln {
public static void v(String message) {
if (isEnabled(Level.VERBOSE)) {
Log.v(TAG, message);
CONSOLE_OUT.print(PREFIX + "VERBOSE: " + message + '\n');
System.out.print(PREFIX + "VERBOSE: " + message + '\n');
}
}
public static void d(String message) {
if (isEnabled(Level.DEBUG)) {
Log.d(TAG, message);
CONSOLE_OUT.print(PREFIX + "DEBUG: " + message + '\n');
System.out.print(PREFIX + "DEBUG: " + message + '\n');
}
}
public static void i(String message) {
if (isEnabled(Level.INFO)) {
Log.i(TAG, message);
CONSOLE_OUT.print(PREFIX + "INFO: " + message + '\n');
System.out.print(PREFIX + "INFO: " + message + '\n');
}
}
public static void w(String message, Throwable throwable) {
if (isEnabled(Level.WARN)) {
Log.w(TAG, message, throwable);
CONSOLE_ERR.print(PREFIX + "WARN: " + message + '\n');
System.err.print(PREFIX + "WARN: " + message + '\n');
if (throwable != null) {
throwable.printStackTrace(CONSOLE_ERR);
throwable.printStackTrace();
}
}
}
@ -88,9 +74,9 @@ public final class Ln {
public static void e(String message, Throwable throwable) {
if (isEnabled(Level.ERROR)) {
Log.e(TAG, message, throwable);
CONSOLE_ERR.print(PREFIX + "ERROR: " + message + '\n');
System.err.print(PREFIX + "ERROR: " + message + "\n");
if (throwable != null) {
throwable.printStackTrace(CONSOLE_ERR);
throwable.printStackTrace();
}
}
}
@ -98,21 +84,4 @@ public final class Ln {
public static void e(String message) {
e(message, null);
}
static class NullOutputStream extends OutputStream {
@Override
public void write(byte[] b) {
// ignore
}
@Override
public void write(byte[] b, int off, int len) {
// ignore
}
@Override
public void write(int b) {
// ignore
}
}
}

View File

@ -3,17 +3,7 @@ package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.DisplayManager;
import com.genymobile.scrcpy.wrappers.ServiceManager;
import android.graphics.Rect;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.MediaCodec;
import android.util.Range;
import java.util.List;
import java.util.SortedSet;
import java.util.TreeSet;
public final class LogUtils {
@ -57,7 +47,7 @@ public final class LogUtils {
builder.append("\n (none)");
} else {
for (int id : displayIds) {
builder.append("\n --display-id=").append(id).append(" (");
builder.append("\n --display=").append(id).append(" (");
DisplayInfo displayInfo = displayManager.getDisplayInfo(id);
if (displayInfo != null) {
Size size = displayInfo.getSize();
@ -70,75 +60,4 @@ public final class LogUtils {
}
return builder.toString();
}
private static String getCameraFacingName(int facing) {
switch (facing) {
case CameraCharacteristics.LENS_FACING_FRONT:
return "front";
case CameraCharacteristics.LENS_FACING_BACK:
return "back";
case CameraCharacteristics.LENS_FACING_EXTERNAL:
return "external";
default:
return "unknown";
}
}
public static String buildCameraListMessage(boolean includeSizes) {
StringBuilder builder = new StringBuilder("List of cameras:");
CameraManager cameraManager = ServiceManager.getCameraManager();
try {
String[] cameraIds = cameraManager.getCameraIdList();
if (cameraIds == null || cameraIds.length == 0) {
builder.append("\n (none)");
} else {
for (String id : cameraIds) {
builder.append("\n --video-source=camera --camera-id=").append(id);
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(id);
int facing = characteristics.get(CameraCharacteristics.LENS_FACING);
builder.append(" (").append(getCameraFacingName(facing)).append(", ");
Rect activeSize = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
builder.append(activeSize.width()).append("x").append(activeSize.height()).append(", ");
// Capture frame rates for low-FPS mode are the same for every resolution
Range<Integer>[] lowFpsRanges = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
SortedSet<Integer> uniqueLowFps = getUniqueSet(lowFpsRanges);
builder.append("fps=").append(uniqueLowFps).append(')');
if (includeSizes) {
StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
android.util.Size[] sizes = configs.getOutputSizes(MediaCodec.class);
for (android.util.Size size : sizes) {
builder.append("\n - ").append(size.getWidth()).append('x').append(size.getHeight());
}
android.util.Size[] highSpeedSizes = configs.getHighSpeedVideoSizes();
if (highSpeedSizes.length > 0) {
builder.append("\n High speed capture (--camera-high-speed):");
for (android.util.Size size : highSpeedSizes) {
Range<Integer>[] highFpsRanges = configs.getHighSpeedVideoFpsRanges();
SortedSet<Integer> uniqueHighFps = getUniqueSet(highFpsRanges);
builder.append("\n - ").append(size.getWidth()).append("x").append(size.getHeight());
builder.append(" (fps=").append(uniqueHighFps).append(')');
}
}
}
}
}
} catch (CameraAccessException e) {
builder.append("\n (access denied)");
}
return builder.toString();
}
private static SortedSet<Integer> getUniqueSet(Range<Integer>[] ranges) {
SortedSet<Integer> set = new TreeSet<>();
for (Range<Integer> range : ranges) {
set.add(range.getUpper());
}
return set;
}
}

View File

@ -14,7 +14,6 @@ public class Options {
private int maxSize;
private VideoCodec videoCodec = VideoCodec.H264;
private AudioCodec audioCodec = AudioCodec.OPUS;
private VideoSource videoSource = VideoSource.DISPLAY;
private AudioSource audioSource = AudioSource.OUTPUT;
private int videoBitRate = 8000000;
private int audioBitRate = 128000;
@ -24,12 +23,6 @@ public class Options {
private Rect crop;
private boolean control = true;
private int displayId;
private String cameraId;
private Size cameraSize;
private CameraFacing cameraFacing;
private CameraAspectRatio cameraAspectRatio;
private int cameraFps;
private boolean cameraHighSpeed;
private boolean showTouches;
private boolean stayAwake;
private List<CodecOption> videoCodecOptions;
@ -45,8 +38,6 @@ public class Options {
private boolean listEncoders;
private boolean listDisplays;
private boolean listCameras;
private boolean listCameraSizes;
// Options not used by the scrcpy client, but useful to use scrcpy-server directly
private boolean sendDeviceMeta = true; // send device name and size
@ -82,10 +73,6 @@ public class Options {
return audioCodec;
}
public VideoSource getVideoSource() {
return videoSource;
}
public AudioSource getAudioSource() {
return audioSource;
}
@ -122,30 +109,6 @@ public class Options {
return displayId;
}
public String getCameraId() {
return cameraId;
}
public Size getCameraSize() {
return cameraSize;
}
public CameraFacing getCameraFacing() {
return cameraFacing;
}
public CameraAspectRatio getCameraAspectRatio() {
return cameraAspectRatio;
}
public int getCameraFps() {
return cameraFps;
}
public boolean getCameraHighSpeed() {
return cameraHighSpeed;
}
public boolean getShowTouches() {
return showTouches;
}
@ -190,10 +153,6 @@ public class Options {
return powerOn;
}
public boolean getList() {
return listEncoders || listDisplays || listCameras || listCameraSizes;
}
public boolean getListEncoders() {
return listEncoders;
}
@ -202,14 +161,6 @@ public class Options {
return listDisplays;
}
public boolean getListCameras() {
return listCameras;
}
public boolean getListCameraSizes() {
return listCameraSizes;
}
public boolean getSendDeviceMeta() {
return sendDeviceMeta;
}
@ -279,13 +230,6 @@ public class Options {
}
options.audioCodec = audioCodec;
break;
case "video_source":
VideoSource videoSource = VideoSource.findByName(value);
if (videoSource == null) {
throw new IllegalArgumentException("Video source " + value + " not supported");
}
options.videoSource = videoSource;
break;
case "audio_source":
AudioSource audioSource = AudioSource.findByName(value);
if (audioSource == null) {
@ -312,9 +256,7 @@ public class Options {
options.tunnelForward = Boolean.parseBoolean(value);
break;
case "crop":
if (!value.isEmpty()) {
options.crop = parseCrop(value);
}
options.crop = parseCrop(value);
break;
case "control":
options.control = Boolean.parseBoolean(value);
@ -364,42 +306,6 @@ public class Options {
case "list_displays":
options.listDisplays = Boolean.parseBoolean(value);
break;
case "list_cameras":
options.listCameras = Boolean.parseBoolean(value);
break;
case "list_camera_sizes":
options.listCameraSizes = Boolean.parseBoolean(value);
break;
case "camera_id":
if (!value.isEmpty()) {
options.cameraId = value;
}
break;
case "camera_size":
if (!value.isEmpty()) {
options.cameraSize = parseSize(value);
}
break;
case "camera_facing":
if (!value.isEmpty()) {
CameraFacing facing = CameraFacing.findByName(value);
if (facing == null) {
throw new IllegalArgumentException("Camera facing " + value + " not supported");
}
options.cameraFacing = facing;
}
break;
case "camera_ar":
if (!value.isEmpty()) {
options.cameraAspectRatio = parseCameraAspectRatio(value);
}
break;
case "camera_fps":
options.cameraFps = Integer.parseInt(value);
break;
case "camera_high_speed":
options.cameraHighSpeed = Boolean.parseBoolean(value);
break;
case "send_device_meta":
options.sendDeviceMeta = Boolean.parseBoolean(value);
break;
@ -431,6 +337,9 @@ public class Options {
}
private static Rect parseCrop(String crop) {
if (crop.isEmpty()) {
return null;
}
// input format: "width:height:x:y"
String[] tokens = crop.split(":");
if (tokens.length != 4) {
@ -442,31 +351,4 @@ public class Options {
int y = Integer.parseInt(tokens[3]);
return new Rect(x, y, x + width, y + height);
}
private static Size parseSize(String size) {
// input format: "<width>x<height>"
String[] tokens = size.split("x");
if (tokens.length != 2) {
throw new IllegalArgumentException("Invalid size format (expected <width>x<height>): \"" + size + "\"");
}
int width = Integer.parseInt(tokens[0]);
int height = Integer.parseInt(tokens[1]);
return new Size(width, height);
}
private static CameraAspectRatio parseCameraAspectRatio(String ar) {
if ("sensor".equals(ar)) {
return CameraAspectRatio.sensorAspectRatio();
}
String[] tokens = ar.split(":");
if (tokens.length == 2) {
int w = Integer.parseInt(tokens[0]);
int h = Integer.parseInt(tokens[1]);
return CameraAspectRatio.fromFraction(w, h);
}
float floatAr = Float.parseFloat(tokens[0]);
return CameraAspectRatio.fromFloat(floatAr);
}
}

View File

@ -1,84 +0,0 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.SurfaceControl;
import android.graphics.Rect;
import android.os.Build;
import android.os.IBinder;
import android.view.Surface;
public class ScreenCapture extends SurfaceCapture implements Device.RotationListener, Device.FoldListener {
private final Device device;
private IBinder display;
public ScreenCapture(Device device) {
this.device = device;
}
@Override
public void init() {
display = createDisplay();
device.setRotationListener(this);
device.setFoldListener(this);
}
@Override
public void start(Surface surface) {
ScreenInfo screenInfo = device.getScreenInfo();
Rect contentRect = screenInfo.getContentRect();
// does not include the locked video orientation
Rect unlockedVideoRect = screenInfo.getUnlockedVideoSize().toRect();
int videoRotation = screenInfo.getVideoRotation();
int layerStack = device.getLayerStack();
setDisplaySurface(display, surface, videoRotation, contentRect, unlockedVideoRect, layerStack);
}
@Override
public void release() {
device.setRotationListener(null);
device.setFoldListener(null);
SurfaceControl.destroyDisplay(display);
}
@Override
public Size getSize() {
return device.getScreenInfo().getVideoSize();
}
@Override
public boolean setMaxSize(int maxSize) {
device.setMaxSize(maxSize);
return true;
}
@Override
public void onFoldChanged(int displayId, boolean folded) {
requestReset();
}
@Override
public void onRotationChanged(int rotation) {
requestReset();
}
private static IBinder createDisplay() {
// Since Android 12 (preview), secure displays can no longer be created with shell permissions.
// On Android 12 preview, SDK_INT is still R (not S), but CODENAME is "S".
boolean secure = Build.VERSION.SDK_INT < Build.VERSION_CODES.R || (Build.VERSION.SDK_INT == Build.VERSION_CODES.R && !"S".equals(
Build.VERSION.CODENAME));
return SurfaceControl.createDisplay("scrcpy", secure);
}
private static void setDisplaySurface(IBinder display, Surface surface, int orientation, Rect deviceRect, Rect displayRect, int layerStack) {
SurfaceControl.openTransaction();
try {
SurfaceControl.setDisplaySurface(display, surface);
SurfaceControl.setDisplayProjection(display, orientation, deviceRect, displayRect);
SurfaceControl.setDisplayLayerStack(display, layerStack);
} finally {
SurfaceControl.closeTransaction();
}
}
}

View File

@ -1,9 +1,13 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.SurfaceControl;
import android.graphics.Rect;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.os.Looper;
import android.os.Build;
import android.os.IBinder;
import android.os.SystemClock;
import android.view.Surface;
@ -12,7 +16,7 @@ import java.nio.ByteBuffer;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;
public class SurfaceEncoder implements AsyncProcessor {
public class ScreenEncoder implements Device.RotationListener, Device.FoldListener, AsyncProcessor {
private static final int DEFAULT_I_FRAME_INTERVAL = 10; // seconds
private static final int REPEAT_FRAME_DELAY_US = 100_000; // repeat after 100ms
@ -22,7 +26,9 @@ public class SurfaceEncoder implements AsyncProcessor {
private static final int[] MAX_SIZE_FALLBACK = {2560, 1920, 1600, 1280, 1024, 800};
private static final int MAX_CONSECUTIVE_ERRORS = 3;
private final SurfaceCapture capture;
private final AtomicBoolean resetCapture = new AtomicBoolean();
private final Device device;
private final Streamer streamer;
private final String encoderName;
private final List<CodecOption> codecOptions;
@ -36,9 +42,9 @@ public class SurfaceEncoder implements AsyncProcessor {
private Thread thread;
private final AtomicBoolean stopped = new AtomicBoolean();
public SurfaceEncoder(SurfaceCapture capture, Streamer streamer, int videoBitRate, int maxFps, List<CodecOption> codecOptions, String encoderName,
public ScreenEncoder(Device device, Streamer streamer, int videoBitRate, int maxFps, List<CodecOption> codecOptions, String encoderName,
boolean downsizeOnError) {
this.capture = capture;
this.device = device;
this.streamer = streamer;
this.videoBitRate = videoBitRate;
this.maxFps = maxFps;
@ -47,29 +53,51 @@ public class SurfaceEncoder implements AsyncProcessor {
this.downsizeOnError = downsizeOnError;
}
@Override
public void onFoldChanged(int displayId, boolean folded) {
resetCapture.set(true);
}
@Override
public void onRotationChanged(int rotation) {
resetCapture.set(true);
}
private boolean consumeResetCapture() {
return resetCapture.getAndSet(false);
}
private void streamScreen() throws IOException, ConfigurationException {
Codec codec = streamer.getCodec();
MediaCodec mediaCodec = createMediaCodec(codec, encoderName);
MediaFormat format = createFormat(codec.getMimeType(), videoBitRate, maxFps, codecOptions);
IBinder display = createDisplay();
device.setRotationListener(this);
device.setFoldListener(this);
capture.init();
streamer.writeVideoHeader(device.getScreenInfo().getVideoSize());
boolean alive;
try {
streamer.writeVideoHeader(capture.getSize());
boolean alive;
do {
Size size = capture.getSize();
format.setInteger(MediaFormat.KEY_WIDTH, size.getWidth());
format.setInteger(MediaFormat.KEY_HEIGHT, size.getHeight());
ScreenInfo screenInfo = device.getScreenInfo();
Rect contentRect = screenInfo.getContentRect();
// include the locked video orientation
Rect videoRect = screenInfo.getVideoSize().toRect();
format.setInteger(MediaFormat.KEY_WIDTH, videoRect.width());
format.setInteger(MediaFormat.KEY_HEIGHT, videoRect.height());
Surface surface = null;
try {
mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
surface = mediaCodec.createInputSurface();
capture.start(surface);
// does not include the locked video orientation
Rect unlockedVideoRect = screenInfo.getUnlockedVideoSize().toRect();
int videoRotation = screenInfo.getVideoRotation();
int layerStack = device.getLayerStack();
setDisplaySurface(display, surface, videoRotation, contentRect, unlockedVideoRect, layerStack);
mediaCodec.start();
@ -78,7 +106,7 @@ public class SurfaceEncoder implements AsyncProcessor {
mediaCodec.stop();
} catch (IllegalStateException | IllegalArgumentException e) {
Ln.e("Encoding error: " + e.getClass().getName() + ": " + e.getMessage());
if (!prepareRetry(size)) {
if (!prepareRetry(device, screenInfo)) {
throw e;
}
Ln.i("Retrying...");
@ -92,11 +120,13 @@ public class SurfaceEncoder implements AsyncProcessor {
} while (alive);
} finally {
mediaCodec.release();
capture.release();
device.setRotationListener(null);
device.setFoldListener(null);
SurfaceControl.destroyDisplay(display);
}
}
private boolean prepareRetry(Size currentSize) {
private boolean prepareRetry(Device device, ScreenInfo screenInfo) {
if (firstFrameSent) {
++consecutiveErrors;
if (consecutiveErrors >= MAX_CONSECUTIVE_ERRORS) {
@ -116,19 +146,16 @@ public class SurfaceEncoder implements AsyncProcessor {
// Downsizing on error is only enabled if an encoding failure occurs before the first frame (downsizing later could be surprising)
int newMaxSize = chooseMaxSizeFallback(currentSize);
int newMaxSize = chooseMaxSizeFallback(screenInfo.getVideoSize());
Ln.i("newMaxSize = " + newMaxSize);
if (newMaxSize == 0) {
// Must definitively fail
return false;
}
boolean accepted = capture.setMaxSize(newMaxSize);
if (!accepted) {
return false;
}
// Retry with a smaller size
// Retry with a smaller device size
Ln.i("Retrying with -m" + newMaxSize + "...");
device.setMaxSize(newMaxSize);
return true;
}
@ -149,14 +176,14 @@ public class SurfaceEncoder implements AsyncProcessor {
boolean alive = true;
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
while (!capture.consumeReset() && !eof) {
while (!consumeResetCapture() && !eof) {
if (stopped.get()) {
alive = false;
break;
}
int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1);
try {
if (capture.consumeReset()) {
if (consumeResetCapture()) {
// must restart encoding with new size
break;
}
@ -181,11 +208,6 @@ public class SurfaceEncoder implements AsyncProcessor {
}
}
if (capture.isClosed()) {
// The capture might have been closed internally (for example if the camera is disconnected)
alive = false;
}
return !eof && alive;
}
@ -242,13 +264,28 @@ public class SurfaceEncoder implements AsyncProcessor {
return format;
}
private static IBinder createDisplay() {
// Since Android 12 (preview), secure displays can no longer be created with shell permissions.
// On Android 12 preview, SDK_INT is still R (not S), but CODENAME is "S".
boolean secure = Build.VERSION.SDK_INT < Build.VERSION_CODES.R || (Build.VERSION.SDK_INT == Build.VERSION_CODES.R && !"S"
.equals(Build.VERSION.CODENAME));
return SurfaceControl.createDisplay("scrcpy", secure);
}
private static void setDisplaySurface(IBinder display, Surface surface, int orientation, Rect deviceRect, Rect displayRect, int layerStack) {
SurfaceControl.openTransaction();
try {
SurfaceControl.setDisplaySurface(display, surface);
SurfaceControl.setDisplayProjection(display, orientation, deviceRect, displayRect);
SurfaceControl.setDisplayLayerStack(display, layerStack);
} finally {
SurfaceControl.closeTransaction();
}
}
@Override
public void start(TerminationListener listener) {
thread = new Thread(() -> {
// Some devices (Meizu) deadlock if the video encoding thread has no Looper
// <https://github.com/Genymobile/scrcpy/issues/4143>
Looper.prepare();
try {
streamScreen();
} catch (ConfigurationException e) {

View File

@ -87,11 +87,7 @@ public final class Server {
}
private static void scrcpy(Options options) throws IOException, ConfigurationException {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.S && options.getVideoSource() == VideoSource.CAMERA) {
Ln.e("Camera mirroring is not supported before Android 12");
throw new ConfigurationException("Camera mirroring is not supported");
}
Ln.i("Device: [" + Build.MANUFACTURER + "] " + Build.BRAND + " " + Build.MODEL + " (Android " + Build.VERSION.RELEASE + ")");
final Device device = new Device(options);
Thread initThread = startInitThread(options);
@ -102,9 +98,27 @@ public final class Server {
boolean video = options.getVideo();
boolean audio = options.getAudio();
boolean sendDummyByte = options.getSendDummyByte();
boolean camera = options.getVideoSource() == VideoSource.CAMERA;
Workarounds.apply(audio, camera);
Workarounds.prepareMainLooper();
// Workarounds must be applied for Meizu phones:
// - <https://github.com/Genymobile/scrcpy/issues/240>
// - <https://github.com/Genymobile/scrcpy/issues/365>
// - <https://github.com/Genymobile/scrcpy/issues/2656>
//
// But only apply when strictly necessary, since workarounds can cause other issues:
// - <https://github.com/Genymobile/scrcpy/issues/940>
// - <https://github.com/Genymobile/scrcpy/issues/994>
if (Build.BRAND.equalsIgnoreCase("meizu") || Build.BRAND.equalsIgnoreCase("honor")) {
Workarounds.fillAppInfo();
}
// Before Android 11, audio is not supported.
// Since Android 12, we can properly set a context on the AudioRecord.
// Only on Android 11 must we fill the application context for the AudioRecord to work.
if (audio && Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
Workarounds.fillAppContext();
}
List<AsyncProcessor> asyncProcessors = new ArrayList<>();
@ -137,16 +151,9 @@ public final class Server {
if (video) {
Streamer videoStreamer = new Streamer(connection.getVideoFd(), options.getVideoCodec(), options.getSendCodecMeta(),
options.getSendFrameMeta());
SurfaceCapture surfaceCapture;
if (options.getVideoSource() == VideoSource.DISPLAY) {
surfaceCapture = new ScreenCapture(device);
} else {
surfaceCapture = new CameraCapture(options.getCameraId(), options.getCameraFacing(), options.getCameraSize(),
options.getMaxSize(), options.getCameraAspectRatio(), options.getCameraFps(), options.getCameraHighSpeed());
}
SurfaceEncoder surfaceEncoder = new SurfaceEncoder(surfaceCapture, videoStreamer, options.getVideoBitRate(), options.getMaxFps(),
ScreenEncoder screenEncoder = new ScreenEncoder(device, videoStreamer, options.getVideoBitRate(), options.getMaxFps(),
options.getVideoCodecOptions(), options.getVideoEncoder(), options.getDownsizeOnError());
asyncProcessors.add(surfaceEncoder);
asyncProcessors.add(screenEncoder);
}
Completion completion = new Completion(asyncProcessors.size());
@ -163,8 +170,6 @@ public final class Server {
asyncProcessor.stop();
}
connection.shutdown();
try {
initThread.join();
for (AsyncProcessor asyncProcessor : asyncProcessors) {
@ -184,34 +189,16 @@ public final class Server {
return thread;
}
public static void main(String... args) {
int status = 0;
try {
internalMain(args);
} catch (Throwable t) {
Ln.e(t.getMessage(), t);
status = 1;
} finally {
// By default, the Java process exits when all non-daemon threads are terminated.
// The Android SDK might start some non-daemon threads internally, preventing the scrcpy server from exiting.
// So force the process to exit explicitly.
System.exit(status);
}
}
private static void internalMain(String... args) throws Exception {
public static void main(String... args) throws Exception {
Thread.setDefaultUncaughtExceptionHandler((t, e) -> {
Ln.e("Exception on thread " + t, e);
});
Options options = Options.parse(args);
Ln.disableSystemStreams();
Ln.initLogLevel(options.getLogLevel());
Ln.i("Device: [" + Build.MANUFACTURER + "] " + Build.BRAND + " " + Build.MODEL + " (Android " + Build.VERSION.RELEASE + ")");
if (options.getList()) {
if (options.getListEncoders() || options.getListDisplays()) {
if (options.getCleanup()) {
CleanUp.unlinkSelf();
}
@ -223,10 +210,6 @@ public final class Server {
if (options.getListDisplays()) {
Ln.i(LogUtils.buildDisplayListMessage());
}
if (options.getListCameras() || options.getListCameraSizes()) {
Workarounds.apply(false, true);
Ln.i(LogUtils.buildCameraListMessage(options.getListCameraSizes()));
}
// Just print the requested data, do not mirror
return;
}

View File

@ -1,71 +0,0 @@
package com.genymobile.scrcpy;
import android.view.Surface;
import java.io.IOException;
import java.util.concurrent.atomic.AtomicBoolean;
/**
* A video source which can be rendered on a Surface for encoding.
*/
public abstract class SurfaceCapture {
private final AtomicBoolean resetCapture = new AtomicBoolean();
/**
* Request the encoding session to be restarted, for example if the capture implementation detects that the video source size has changed (on
* device rotation).
*/
protected void requestReset() {
resetCapture.set(true);
}
/**
* Consume the reset request (intended to be called by the encoder).
*
* @return {@code true} if a reset request was pending, {@code false} otherwise.
*/
public boolean consumeReset() {
return resetCapture.getAndSet(false);
}
/**
* Called once before the capture starts.
*/
public abstract void init() throws IOException;
/**
* Called after the capture ends (if and only if {@link #init()} has been called).
*/
public abstract void release();
/**
* Start the capture to the target surface.
*
* @param surface the surface which will be encoded
*/
public abstract void start(Surface surface) throws IOException;
/**
* Return the video size
*
* @return the video size
*/
public abstract Size getSize();
/**
* Set the maximum capture size (set by the encoder if it does not support the current size).
*
* @param maxSize Maximum size
*/
public abstract boolean setMaxSize(int maxSize);
/**
* Indicate if the capture has been closed internally.
*
* @return {@code true} is the capture is closed, {@code false} otherwise.
*/
public boolean isClosed() {
return false;
}
}

View File

@ -6,7 +6,7 @@ import android.media.MediaFormat;
public enum VideoCodec implements Codec {
H264(0x68_32_36_34, "h264", MediaFormat.MIMETYPE_VIDEO_AVC),
H265(0x68_32_36_35, "h265", MediaFormat.MIMETYPE_VIDEO_HEVC),
@SuppressLint("InlinedApi") // introduced in API 29
@SuppressLint("InlinedApi") // introduced in API 21
AV1(0x00_61_76_31, "av1", MediaFormat.MIMETYPE_VIDEO_AV1);
private final int id; // 4-byte ASCII representation of the name

View File

@ -1,22 +0,0 @@
package com.genymobile.scrcpy;
public enum VideoSource {
DISPLAY("display"),
CAMERA("camera");
private final String name;
VideoSource(String name) {
this.name = name;
}
static VideoSource findByName(String name) {
for (VideoSource videoSource : VideoSource.values()) {
if (name.equals(videoSource.name)) {
return videoSource;
}
}
return null;
}
}

View File

@ -1,10 +1,11 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ActivityThread;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.app.Application;
import android.content.AttributionSource;
import android.content.Context;
import android.content.ContextWrapper;
import android.content.pm.ApplicationInfo;
import android.media.AudioAttributes;
@ -21,67 +22,14 @@ import java.lang.reflect.Method;
public final class Workarounds {
private static Class<?> activityThreadClass;
private static Object activityThread;
private static boolean activityThreadFilled;
private Workarounds() {
// not instantiable
}
public static void apply(boolean audio, boolean camera) {
Workarounds.prepareMainLooper();
boolean mustFillAppInfo = false;
boolean mustFillBaseContext = false;
boolean mustFillAppContext = false;
if (Build.BRAND.equalsIgnoreCase("meizu")) {
// Workarounds must be applied for Meizu phones:
// - <https://github.com/Genymobile/scrcpy/issues/240>
// - <https://github.com/Genymobile/scrcpy/issues/365>
// - <https://github.com/Genymobile/scrcpy/issues/2656>
//
// But only apply when strictly necessary, since workarounds can cause other issues:
// - <https://github.com/Genymobile/scrcpy/issues/940>
// - <https://github.com/Genymobile/scrcpy/issues/994>
mustFillAppInfo = true;
} else if (Build.BRAND.equalsIgnoreCase("honor")) {
// More workarounds must be applied for Honor devices:
// - <https://github.com/Genymobile/scrcpy/issues/4015>
//
// The system context must not be set for all devices, because it would cause other problems:
// - <https://github.com/Genymobile/scrcpy/issues/4015#issuecomment-1595382142>
// - <https://github.com/Genymobile/scrcpy/issues/3805#issuecomment-1596148031>
mustFillAppInfo = true;
mustFillBaseContext = true;
mustFillAppContext = true;
}
if (audio && Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
// Before Android 11, audio is not supported.
// Since Android 12, we can properly set a context on the AudioRecord.
// Only on Android 11 must we fill the application context for the AudioRecord to work.
mustFillAppContext = true;
}
if (camera) {
mustFillAppInfo = true;
mustFillBaseContext = true;
}
if (mustFillAppInfo) {
Workarounds.fillAppInfo();
}
if (mustFillBaseContext) {
Workarounds.fillBaseContext();
}
if (mustFillAppContext) {
Workarounds.fillAppContext();
}
}
@SuppressWarnings("deprecation")
private static void prepareMainLooper() {
public static void prepareMainLooper() {
// Some devices internally create a Handler when creating an input Surface, causing an exception:
// "Can't create handler inside thread that has not called Looper.prepare()"
// <https://github.com/Genymobile/scrcpy/issues/240>
@ -95,22 +43,21 @@ public final class Workarounds {
@SuppressLint("PrivateApi,DiscouragedPrivateApi")
private static void fillActivityThread() throws Exception {
if (activityThread == null) {
// ActivityThread activityThread = new ActivityThread();
activityThreadClass = Class.forName("android.app.ActivityThread");
Constructor<?> activityThreadConstructor = activityThreadClass.getDeclaredConstructor();
activityThreadConstructor.setAccessible(true);
activityThread = activityThreadConstructor.newInstance();
if (!activityThreadFilled) {
Class<?> activityThreadClass = ActivityThread.getActivityThreadClass();
Object activityThread = ActivityThread.getActivityThread();
// ActivityThread.sCurrentActivityThread = activityThread;
Field sCurrentActivityThreadField = activityThreadClass.getDeclaredField("sCurrentActivityThread");
sCurrentActivityThreadField.setAccessible(true);
sCurrentActivityThreadField.set(null, activityThread);
activityThreadFilled = true;
}
}
@SuppressLint("PrivateApi,DiscouragedPrivateApi")
private static void fillAppInfo() {
public static void fillAppInfo() {
try {
fillActivityThread();
@ -128,6 +75,9 @@ public final class Workarounds {
appInfoField.setAccessible(true);
appInfoField.set(appBindData, applicationInfo);
Class<?> activityThreadClass = ActivityThread.getActivityThreadClass();
Object activityThread = ActivityThread.getActivityThread();
// activityThread.mBoundApplication = appBindData;
Field mBoundApplicationField = activityThreadClass.getDeclaredField("mBoundApplication");
mBoundApplicationField.setAccessible(true);
@ -139,7 +89,7 @@ public final class Workarounds {
}
@SuppressLint("PrivateApi,DiscouragedPrivateApi")
private static void fillAppContext() {
public static void fillAppContext() {
try {
fillActivityThread();
@ -148,6 +98,9 @@ public final class Workarounds {
baseField.setAccessible(true);
baseField.set(app, FakeContext.get());
Class<?> activityThreadClass = ActivityThread.getActivityThreadClass();
Object activityThread = ActivityThread.getActivityThread();
// activityThread.mInitialApplication = app;
Field mInitialApplicationField = activityThreadClass.getDeclaredField("mInitialApplication");
mInitialApplicationField.setAccessible(true);
@ -158,19 +111,6 @@ public final class Workarounds {
}
}
private static void fillBaseContext() {
try {
fillActivityThread();
Method getSystemContextMethod = activityThreadClass.getDeclaredMethod("getSystemContext");
Context context = (Context) getSystemContextMethod.invoke(activityThread);
FakeContext.get().setBaseContext(context);
} catch (Throwable throwable) {
// this is a workaround, so failing is not an error
Ln.d("Could not fill base context: " + throwable.getMessage());
}
}
@TargetApi(Build.VERSION_CODES.R)
@SuppressLint("WrongConstant,MissingPermission,BlockedPrivateApi,SoonBlockedPrivateApi,DiscouragedPrivateApi")
public static AudioRecord createAudioRecord(int source, int sampleRate, int channelConfig, int channels, int channelMask, int encoding) {

View File

@ -17,7 +17,7 @@ import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
@SuppressLint("PrivateApi,DiscouragedPrivateApi")
public final class ActivityManager {
public class ActivityManager {
private final IInterface manager;
private Method getContentProviderExternalMethod;

View File

@ -0,0 +1,32 @@
package com.genymobile.scrcpy.wrappers;
import java.lang.reflect.Constructor;
public class ActivityThread {
private static final Class<?> activityThreadClass;
private static final Object activityThread;
static {
try {
activityThreadClass = Class.forName("android.app.ActivityThread");
Constructor<?> activityThreadConstructor = activityThreadClass.getDeclaredConstructor();
activityThreadConstructor.setAccessible(true);
activityThread = activityThreadConstructor.newInstance();
} catch (Exception e) {
throw new AssertionError(e);
}
}
private ActivityThread() {
// only static methods
}
public static Object getActivityThread() {
return activityThread;
}
public static Class<?> getActivityThreadClass() {
return activityThreadClass;
}
}
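For illustration, a hedged sketch of how this cached ActivityThread can be combined with reflection to obtain the system context, as FakeContext.retrieveSystemContext() does earlier in this diff. It is only meaningful on an Android runtime; on a desktop JVM the class lookup fails and the catch branch runs.

```java
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;

// Hypothetical sketch: retrieve the system context from an ActivityThread instance.
public class SystemContextSketch {

    public static void main(String[] args) {
        try {
            Class<?> activityThreadClass = Class.forName("android.app.ActivityThread");
            Constructor<?> constructor = activityThreadClass.getDeclaredConstructor();
            constructor.setAccessible(true);
            Object activityThread = constructor.newInstance();

            Method getSystemContext = activityThreadClass.getDeclaredMethod("getSystemContext");
            Object systemContext = getSystemContext.invoke(activityThread);
            System.out.println("system context: " + systemContext);
        } catch (ReflectiveOperationException e) {
            // Expected outside of an Android runtime (android.app.ActivityThread is unavailable)
            System.err.println("ActivityThread not available: " + e);
        }
    }
}
```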

View File

@ -11,7 +11,7 @@ import android.os.IInterface;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
public final class ClipboardManager {
public class ClipboardManager {
private final IInterface manager;
private Method getPrimaryClipMethod;
private Method setPrimaryClipMethod;

View File

@ -14,7 +14,7 @@ import java.io.Closeable;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
public final class ContentProvider implements Closeable {
public class ContentProvider implements Closeable {
public static final String TABLE_SYSTEM = "system";
public static final String TABLE_SECURE = "secure";

View File

@ -1,14 +1,9 @@
package com.genymobile.scrcpy.wrappers;
import com.genymobile.scrcpy.FakeContext;
import android.annotation.SuppressLint;
import android.content.Context;
import android.hardware.camera2.CameraManager;
import android.os.IBinder;
import android.os.IInterface;
import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
@ -31,7 +26,6 @@ public final class ServiceManager {
private static StatusBarManager statusBarManager;
private static ClipboardManager clipboardManager;
private static ActivityManager activityManager;
private static CameraManager cameraManager;
private ServiceManager() {
/* not instantiable */
@ -135,16 +129,4 @@ public final class ServiceManager {
return activityManager;
}
public static CameraManager getCameraManager() {
if (cameraManager == null) {
try {
Constructor<CameraManager> ctor = CameraManager.class.getDeclaredConstructor(Context.class);
cameraManager = ctor.newInstance(FakeContext.get());
} catch (Exception e) {
throw new AssertionError(e);
}
}
return cameraManager;
}
}

View File

@ -7,7 +7,7 @@ import android.os.IInterface;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
public final class StatusBarManager {
public class StatusBarManager {
private final IInterface manager;
private Method expandNotificationsPanelMethod;

View File

@ -2,7 +2,6 @@ package com.genymobile.scrcpy.wrappers;
import com.genymobile.scrcpy.Ln;
import android.annotation.TargetApi;
import android.os.IInterface;
import android.view.IRotationWatcher;
import android.view.IDisplayFoldListener;
@ -107,17 +106,16 @@ public final class WindowManager {
cls.getMethod("watchRotation", IRotationWatcher.class).invoke(manager, rotationWatcher);
}
} catch (Exception e) {
Ln.e("Could not register rotation watcher", e);
throw new AssertionError(e);
}
}
@TargetApi(29)
public void registerDisplayFoldListener(IDisplayFoldListener foldListener) {
try {
Class<?> cls = manager.getClass();
cls.getMethod("registerDisplayFoldListener", IDisplayFoldListener.class).invoke(manager, foldListener);
} catch (Exception e) {
Ln.e("Could not register display fold listener", e);
throw new AssertionError(e);
}
}
}