Compare commits

3 Commits

9bff8ccadb Apply workaround on "honor" devices
This makes audio work on those devices.

Fixes #4015 <https://github.com/Genymobile/scrcpy/issues/4015>
2023-06-17 00:25:10 +02:00
0d4157357a Use system context as base context
DONOTMERGE: it causes #994 on Xiaomi devices

This allows Context.getPackageManager() to work.

Fixes #4015 <https://github.com/Genymobile/scrcpy/issues/4015>
Refs <https://github.com/Genymobile/scrcpy/issues/4015#issuecomment-1594262721>

Co-authored-by: Simon Chan <1330321+yume-chan@users.noreply.github.com>
2023-06-17 00:25:10 +02:00
95e61e2a0b Move ActivityThread instance
An ActivityThread instance will be needed from several classes.
2023-06-17 00:25:10 +02:00
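
The last two commits above change how the server obtains an Android Context. As an illustration only, here is a minimal sketch of getting a system Context through the hidden ActivityThread API, assuming the AOSP methods ActivityThread.systemMain() and getSystemContext() accessed via reflection; this is not necessarily how scrcpy implements it.

```java
import android.content.Context;

import java.lang.reflect.Method;

// Hypothetical helper, not part of scrcpy: obtain a system Context so that
// Context.getPackageManager() works from a shell-spawned process.
public final class SystemContextSketch {

    private SystemContextSketch() {
        // no instances
    }

    public static Context getSystemContext() throws ReflectiveOperationException {
        // android.app.ActivityThread is a hidden (non-SDK) class, so access it
        // by reflection; hidden-API restrictions may apply on recent Android.
        Class<?> activityThreadClass = Class.forName("android.app.ActivityThread");

        // ActivityThread.systemMain() creates an ActivityThread for this process
        Method systemMain = activityThreadClass.getDeclaredMethod("systemMain");
        Object activityThread = systemMain.invoke(null);

        // getSystemContext() returns a Context usable as a base context
        Method getSystemContext =
                activityThreadClass.getDeclaredMethod("getSystemContext");
        return (Context) getSystemContext.invoke(activityThread);
    }
}
```

Keeping a single ActivityThread instance, as the "Move ActivityThread instance" commit suggests, would simply mean storing the object returned by systemMain() in a field shared by the classes that need it.
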
107 changed files with 1056 additions and 3655 deletions

.gitignore

@ -7,4 +7,3 @@ build/
.gradle/ .gradle/
/x/ /x/
local.properties local.properties
/scrcpy-server

FAQ.md

@ -159,8 +159,6 @@ In developer options, enable:
> **USB debugging (Security settings)** > **USB debugging (Security settings)**
> _Allow granting permissions and simulating input via USB debugging_ > _Allow granting permissions and simulating input via USB debugging_
Rebooting the device is necessary once this option is set.
[simulating input]: https://github.com/Genymobile/scrcpy/issues/70#issuecomment-373286323 [simulating input]: https://github.com/Genymobile/scrcpy/issues/70#issuecomment-373286323
@ -170,12 +168,12 @@ The default text injection method is [limited to ASCII characters][text-input].
A trick allows to also inject some [accented characters][accented-characters], A trick allows to also inject some [accented characters][accented-characters],
but that's all. See [#37]. but that's all. See [#37].
It is also possible to simulate a [physical keyboard][hid] (HID). Since scrcpy v1.20, it is possible to simulate a [physical keyboard][hid] (HID).
[text-input]: https://github.com/Genymobile/scrcpy/issues?q=is%3Aopen+is%3Aissue+label%3Aunicode [text-input]: https://github.com/Genymobile/scrcpy/issues?q=is%3Aopen+is%3Aissue+label%3Aunicode
[accented-characters]: https://blog.rom1v.com/2018/03/introducing-scrcpy/#handle-accented-characters [accented-characters]: https://blog.rom1v.com/2018/03/introducing-scrcpy/#handle-accented-characters
[#37]: https://github.com/Genymobile/scrcpy/issues/37 [#37]: https://github.com/Genymobile/scrcpy/issues/37
[hid]: doc/hid-otg.md [hid]: README.md#physical-keyboard-simulation-hid
## Client issues ## Client issues
@ -231,4 +229,4 @@ Translations of this FAQ in other languages are available in the [wiki].
[wiki]: https://github.com/Genymobile/scrcpy/wiki [wiki]: https://github.com/Genymobile/scrcpy/wiki
Only this FAQ file is guaranteed to be up-to-date. Only this README file is guaranteed to be up-to-date.

@ -1,15 +1,11 @@
**This GitHub repo (<https://github.com/Genymobile/scrcpy>) is the only official # scrcpy (v2.0)
source for the project. Do not download releases from random websites, even if
their name contains `scrcpy`.**
# scrcpy (v2.3.1)
<img src="app/data/icon.svg" width="128" height="128" alt="scrcpy" align="right" /> <img src="app/data/icon.svg" width="128" height="128" alt="scrcpy" align="right" />
_pronounced "**scr**een **c**o**py**"_ _pronounced "**scr**een **c**o**py**"_
This application mirrors Android devices (video and audio) connected via This application mirrors Android devices (video and audio) connected via
USB or [over TCP/IP](doc/connection.md#tcpip-wireless), and allows to control the USB or [over TCP/IP](doc/device.md#tcpip-wireless), and allows to control the
device with the keyboard and the mouse of the computer. It does not require any device with the keyboard and the mouse of the computer. It does not require any
_root_ access. It works on _Linux_, _Windows_ and _macOS_. _root_ access. It works on _Linux_, _Windows_ and _macOS_.
@ -29,13 +25,12 @@ It focuses on:
[lowlatency]: https://github.com/Genymobile/scrcpy/pull/646 [lowlatency]: https://github.com/Genymobile/scrcpy/pull/646
Its features include: Its features include:
- [audio forwarding](doc/audio.md) (Android 11+) - [audio forwarding](doc/audio.md) (Android >= 11)
- [recording](doc/recording.md) - [recording](doc/recording.md)
- mirroring with [Android device screen off](doc/device.md#turn-screen-off) - mirroring with [Android device screen off](doc/device.md#turn-screen-off)
- [copy-paste](doc/control.md#copy-paste) in both directions - [copy-paste](doc/control.md#copy-paste) in both directions
- [configurable quality](doc/video.md) - [configurable quality](doc/video.md)
- [camera mirroring](doc/camera.md) (Android 12+) - Android device [as a webcam (V4L2)](doc/v4l2.md) (Linux-only)
- [mirroring as a webcam (V4L2)](doc/v4l2.md) (Linux-only)
- [physical keyboard/mouse simulation (HID)](doc/hid-otg.md) - [physical keyboard/mouse simulation (HID)](doc/hid-otg.md)
- [OTG mode](doc/hid-otg.md#otg) - [OTG mode](doc/hid-otg.md#otg)
- and more… - and more…
@ -44,7 +39,7 @@ Its features include:
The Android device requires at least API 21 (Android 5.0). The Android device requires at least API 21 (Android 5.0).
[Audio forwarding](doc/audio.md) is supported for API >= 30 (Android 11+). [Audio forwarding](doc/audio.md) is supported from API 30 (Android 11).
Make sure you [enabled USB debugging][enable-adb] on your device(s). Make sure you [enabled USB debugging][enable-adb] on your device(s).
@ -52,14 +47,10 @@ Make sure you [enabled USB debugging][enable-adb] on your device(s).
On some devices, you also need to enable [an additional option][control] `USB On some devices, you also need to enable [an additional option][control] `USB
debugging (Security Settings)` (this is an item different from `USB debugging`) debugging (Security Settings)` (this is an item different from `USB debugging`)
to control it using a keyboard and mouse. Rebooting the device is necessary once to control it using a keyboard and mouse.
this option is set.
[control]: https://github.com/Genymobile/scrcpy/issues/70#issuecomment-373286323 [control]: https://github.com/Genymobile/scrcpy/issues/70#issuecomment-373286323
Note that USB debugging is not required to run scrcpy in [OTG
mode](doc/hid-otg.md#otg).
## Get the app ## Get the app
@ -73,16 +64,14 @@ mode](doc/hid-otg.md#otg).
The application provides a lot of features and configuration options. They are The application provides a lot of features and configuration options. They are
documented in the following pages: documented in the following pages:
- [Connection](doc/connection.md) - [Device](doc/device.md)
- [Video](doc/video.md) - [Video](doc/video.md)
- [Audio](doc/audio.md) - [Audio](doc/audio.md)
- [Control](doc/control.md) - [Control](doc/control.md)
- [Device](doc/device.md)
- [Window](doc/window.md) - [Window](doc/window.md)
- [Recording](doc/recording.md) - [Recording](doc/recording.md)
- [Tunnels](doc/tunnels.md) - [Tunnels](doc/tunnels.md)
- [HID/OTG](doc/hid-otg.md) - [HID/OTG](doc/hid-otg.md)
- [Camera](doc/camera.md)
- [Video4Linux](doc/v4l2.md) - [Video4Linux](doc/v4l2.md)
- [Shortcuts](doc/shortcuts.md) - [Shortcuts](doc/shortcuts.md)
@ -124,10 +113,7 @@ For general questions or discussions, you can also use:
I'm [@rom1v](https://github.com/rom1v), the author and maintainer of _scrcpy_. I'm [@rom1v](https://github.com/rom1v), the author and maintainer of _scrcpy_.
If you appreciate this application, you can [support my open source If you appreciate this application, you can [support my open source
work][donate]: work][donate].
- [GitHub Sponsors](https://github.com/sponsors/rom1v)
- [Liberapay](https://liberapay.com/rom1v/)
- [PayPal](https://paypal.me/rom2v)
[donate]: https://blog.rom1v.com/about/#support-my-open-source-work [donate]: https://blog.rom1v.com/about/#support-my-open-source-work

@ -10,18 +10,11 @@ _scrcpy() {
--audio-source= --audio-source=
--audio-output-buffer= --audio-output-buffer=
-b --video-bit-rate= -b --video-bit-rate=
--camera-ar=
--camera-id=
--camera-facing=
--camera-fps=
--camera-high-speed
--camera-size=
--crop= --crop=
-d --select-usb -d --select-usb
--disable-screensaver --disable-screensaver
--display=
--display-buffer= --display-buffer=
--display-id=
--display-orientation=
-e --select-tcpip -e --select-tcpip
-f --fullscreen -f --fullscreen
--force-adb-forward --force-adb-forward
@ -30,8 +23,6 @@ _scrcpy() {
--kill-adb-on-close --kill-adb-on-close
-K --hid-keyboard -K --hid-keyboard
--legacy-paste --legacy-paste
--list-camera-sizes
--list-cameras
--list-displays --list-displays
--list-encoders --list-encoders
--lock-video-orientation --lock-video-orientation
@ -51,11 +42,8 @@ _scrcpy() {
--no-power-on --no-power-on
--no-video --no-video
--no-video-playback --no-video-playback
--orientation=
--otg --otg
-p --port= -p --port=
--pause-on-exit
--pause-on-exit=
--power-off-on-close --power-off-on-close
--prefer-text --prefer-text
--print-fps --print-fps
@ -63,7 +51,6 @@ _scrcpy() {
-r --record= -r --record=
--raw-key-events --raw-key-events
--record-format= --record-format=
--record-orientation=
--render-driver= --render-driver=
--require-audio --require-audio
--rotation= --rotation=
@ -83,7 +70,6 @@ _scrcpy() {
--video-codec= --video-codec=
--video-codec-options= --video-codec-options=
--video-encoder= --video-encoder=
--video-source=
-w --stay-awake -w --stay-awake
--window-borderless --window-borderless
--window-title= --window-title=
@ -100,36 +86,15 @@ _scrcpy() {
return return
;; ;;
--audio-codec) --audio-codec)
COMPREPLY=($(compgen -W 'opus aac flac raw' -- "$cur")) COMPREPLY=($(compgen -W 'opus aac raw' -- "$cur"))
return
;;
--video-source)
COMPREPLY=($(compgen -W 'display camera' -- "$cur"))
return return
;; ;;
--audio-source) --audio-source)
COMPREPLY=($(compgen -W 'output mic' -- "$cur")) COMPREPLY=($(compgen -W 'output mic' -- "$cur"))
return return
;; ;;
--camera-facing)
COMPREPLY=($(compgen -W 'front back external' -- "$cur"))
return
;;
--orientation|--display-orientation)
COMPREPLY=($(compgen -W '0 90 180 270 flip0 flip90 flip180 flip270' -- "$cur"))
return
;;
--record-orientation)
COMPREPLY=($(compgen -W '0 90 180 270' -- "$cur"))
return
;;
--lock-video-orientation) --lock-video-orientation)
COMPREPLY=($(compgen -W 'unlocked initial 0 90 180 270' -- "$cur")) COMPREPLY=($(compgen -W 'unlocked initial 0 1 2 3' -- "$cur"))
return
;;
--pause-on-exit)
COMPREPLY=($(compgen -W 'true false if-error' -- "$cur"))
return return
;; ;;
-r|--record) -r|--record)
@ -137,13 +102,17 @@ _scrcpy() {
return return
;; ;;
--record-format) --record-format)
COMPREPLY=($(compgen -W 'mp4 mkv m4a mka opus aac flac wav' -- "$cur")) COMPREPLY=($(compgen -W 'mkv mp4' -- "$cur"))
return return
;; ;;
--render-driver) --render-driver)
COMPREPLY=($(compgen -W 'direct3d opengl opengles2 opengles metal software' -- "$cur")) COMPREPLY=($(compgen -W 'direct3d opengl opengles2 opengles metal software' -- "$cur"))
return return
;; ;;
--rotation)
COMPREPLY=($(compgen -W '0 1 2 3' -- "$cur"))
return
;;
--shortcut-mod) --shortcut-mod)
# Only auto-complete a single key # Only auto-complete a single key
COMPREPLY=($(compgen -W 'lctrl rctrl lalt ralt lsuper rsuper' -- "$cur")) COMPREPLY=($(compgen -W 'lctrl rctrl lalt ralt lsuper rsuper' -- "$cur"))
@ -164,12 +133,8 @@ _scrcpy() {
|--audio-codec-options \ |--audio-codec-options \
|--audio-encoder \ |--audio-encoder \
|--audio-output-buffer \ |--audio-output-buffer \
|--camera-ar \
|--camera-id \
|--camera-fps \
|--camera-size \
|--crop \ |--crop \
|--display-id \ |--display \
|--display-buffer \ |--display-buffer \
|--max-fps \ |--max-fps \
|-m|--max-size \ |-m|--max-size \

@ -1,2 +1,4 @@
@echo off @echo off
scrcpy.exe --pause-on-exit=if-error %* scrcpy.exe %*
:: if the exit code is >= 1, then pause
if errorlevel 1 pause

@ -5,7 +5,7 @@ Comment=Display and control your Android device
# For some users, the PATH or ADB environment variables are set from the shell # For some users, the PATH or ADB environment variables are set from the shell
# startup file, like .bashrc or .zshrc… Run an interactive shell to get # startup file, like .bashrc or .zshrc… Run an interactive shell to get
# environment correctly initialized. # environment correctly initialized.
Exec=/bin/sh -c "\\$SHELL -i -c 'scrcpy --pause-on-exit=if-error'" Exec=/bin/bash --norc --noprofile -i -c "\"\\$SHELL\" -i -c scrcpy || read -p 'Press Enter to quit...'"
Icon=scrcpy Icon=scrcpy
Terminal=true Terminal=true
Type=Application Type=Application

@ -5,7 +5,7 @@ Comment=Display and control your Android device
# For some users, the PATH or ADB environment variables are set from the shell # For some users, the PATH or ADB environment variables are set from the shell
# startup file, like .bashrc or .zshrc… Run an interactive shell to get # startup file, like .bashrc or .zshrc… Run an interactive shell to get
# environment correctly initialized. # environment correctly initialized.
Exec=/bin/sh -c "\\$SHELL -i -c scrcpy" Exec=/bin/sh -c "\"\\$SHELL\" -i -c scrcpy"
Icon=scrcpy Icon=scrcpy
Terminal=false Terminal=false
Type=Application Type=Application

@ -11,24 +11,17 @@ arguments=(
'--always-on-top[Make scrcpy window always on top \(above other windows\)]' '--always-on-top[Make scrcpy window always on top \(above other windows\)]'
'--audio-bit-rate=[Encode the audio at the given bit-rate]' '--audio-bit-rate=[Encode the audio at the given bit-rate]'
'--audio-buffer=[Configure the audio buffering delay (in milliseconds)]' '--audio-buffer=[Configure the audio buffering delay (in milliseconds)]'
'--audio-codec=[Select the audio codec]:codec:(opus aac flac raw)' '--audio-codec=[Select the audio codec]:codec:(opus aac raw)'
'--audio-codec-options=[Set a list of comma-separated key\:type=value options for the device audio encoder]' '--audio-codec-options=[Set a list of comma-separated key\:type=value options for the device audio encoder]'
'--audio-encoder=[Use a specific MediaCodec audio encoder]' '--audio-encoder=[Use a specific MediaCodec audio encoder]'
'--audio-source=[Select the audio source]:source:(output mic)' '--audio-source=[Select the audio source]:source:(output mic)'
'--audio-output-buffer=[Configure the size of the SDL audio output buffer (in milliseconds)]' '--audio-output-buffer=[Configure the size of the SDL audio output buffer (in milliseconds)]'
{-b,--video-bit-rate=}'[Encode the video at the given bit-rate]' {-b,--video-bit-rate=}'[Encode the video at the given bit-rate]'
'--camera-ar=[Select the camera size by its aspect ratio]'
'--camera-high-speed=[Enable high-speed camera capture mode]'
'--camera-id=[Specify the camera id to mirror]'
'--camera-facing=[Select the device camera by its facing direction]:facing:(front back external)'
'--camera-fps=[Specify the camera capture frame rate]'
'--camera-size=[Specify an explicit camera capture size]'
'--crop=[\[width\:height\:x\:y\] Crop the device screen on the server]' '--crop=[\[width\:height\:x\:y\] Crop the device screen on the server]'
{-d,--select-usb}'[Use USB device]' {-d,--select-usb}'[Use USB device]'
'--disable-screensaver[Disable screensaver while scrcpy is running]' '--disable-screensaver[Disable screensaver while scrcpy is running]'
'--display=[Specify the display id to mirror]'
'--display-buffer=[Add a buffering delay \(in milliseconds\) before displaying]' '--display-buffer=[Add a buffering delay \(in milliseconds\) before displaying]'
'--display-id=[Specify the display id to mirror]'
'--display-orientation=[Set the initial display orientation]:orientation values:(0 90 180 270 flip0 flip90 flip180 flip270)'
{-e,--select-tcpip}'[Use TCP/IP device]' {-e,--select-tcpip}'[Use TCP/IP device]'
{-f,--fullscreen}'[Start in fullscreen]' {-f,--fullscreen}'[Start in fullscreen]'
'--force-adb-forward[Do not attempt to use \"adb reverse\" to connect to the device]' '--force-adb-forward[Do not attempt to use \"adb reverse\" to connect to the device]'
@ -37,11 +30,9 @@ arguments=(
'--kill-adb-on-close[Kill adb when scrcpy terminates]' '--kill-adb-on-close[Kill adb when scrcpy terminates]'
{-K,--hid-keyboard}'[Simulate a physical keyboard by using HID over AOAv2]' {-K,--hid-keyboard}'[Simulate a physical keyboard by using HID over AOAv2]'
'--legacy-paste[Inject computer clipboard text as a sequence of key events on Ctrl+v]' '--legacy-paste[Inject computer clipboard text as a sequence of key events on Ctrl+v]'
'--list-camera-sizes[List the valid camera capture sizes]'
'--list-cameras[List cameras available on the device]'
'--list-displays[List displays available on the device]' '--list-displays[List displays available on the device]'
'--list-encoders[List video and audio encoders available on the device]' '--list-encoders[List video and audio encoders available on the device]'
'--lock-video-orientation=[Lock video orientation]:orientation:(unlocked initial 0 90 180 270)' '--lock-video-orientation=[Lock video orientation]:orientation:(unlocked initial 0 1 2 3)'
{-m,--max-size=}'[Limit both the width and height of the video to value]' {-m,--max-size=}'[Limit both the width and height of the video to value]'
{-M,--hid-mouse}'[Simulate a physical mouse by using HID over AOAv2]' {-M,--hid-mouse}'[Simulate a physical mouse by using HID over AOAv2]'
'--max-fps=[Limit the frame rate of screen capture]' '--max-fps=[Limit the frame rate of screen capture]'
@ -57,20 +48,18 @@ arguments=(
'--no-power-on[Do not power on the device on start]' '--no-power-on[Do not power on the device on start]'
'--no-video[Disable video forwarding]' '--no-video[Disable video forwarding]'
'--no-video-playback[Disable video playback]' '--no-video-playback[Disable video playback]'
'--orientation=[Set the video orientation]:orientation values:(0 90 180 270 flip0 flip90 flip180 flip270)'
'--otg[Run in OTG mode \(simulating physical keyboard and mouse\)]' '--otg[Run in OTG mode \(simulating physical keyboard and mouse\)]'
{-p,--port=}'[\[port\[\:port\]\] Set the TCP port \(range\) used by the client to listen]' {-p,--port=}'[\[port\[\:port\]\] Set the TCP port \(range\) used by the client to listen]'
'--pause-on-exit=[Make scrcpy pause before exiting]:mode:(true false if-error)'
'--power-off-on-close[Turn the device screen off when closing scrcpy]' '--power-off-on-close[Turn the device screen off when closing scrcpy]'
'--prefer-text[Inject alpha characters and space as text events instead of key events]' '--prefer-text[Inject alpha characters and space as text events instead of key events]'
'--print-fps[Start FPS counter, to print frame logs to the console]' '--print-fps[Start FPS counter, to print frame logs to the console]'
'--push-target=[Set the target directory for pushing files to the device by drag and drop]' '--push-target=[Set the target directory for pushing files to the device by drag and drop]'
{-r,--record=}'[Record screen to file]:record file:_files' {-r,--record=}'[Record screen to file]:record file:_files'
'--raw-key-events[Inject key events for all input keys, and ignore text events]' '--raw-key-events[Inject key events for all input keys, and ignore text events]'
'--record-format=[Force recording format]:format:(mp4 mkv m4a mka opus aac flac wav)' '--record-format=[Force recording format]:format:(mp4 mkv)'
'--record-orientation=[Set the record orientation]:orientation values:(0 90 180 270)'
'--render-driver=[Request SDL to use the given render driver]:driver name:(direct3d opengl opengles2 opengles metal software)' '--render-driver=[Request SDL to use the given render driver]:driver name:(direct3d opengl opengles2 opengles metal software)'
'--require-audio=[Make scrcpy fail if audio is enabled but does not work]' '--require-audio=[Make scrcpy fail if audio is enabled but does not work]'
'--rotation=[Set the initial display rotation]:rotation values:(0 1 2 3)'
{-s,--serial=}'[The device serial number \(mandatory for multiple devices only\)]:serial:($("${ADB-adb}" devices | awk '\''$2 == "device" {print $1}'\''))' {-s,--serial=}'[The device serial number \(mandatory for multiple devices only\)]:serial:($("${ADB-adb}" devices | awk '\''$2 == "device" {print $1}'\''))'
{-S,--turn-screen-off}'[Turn the device screen off immediately]' {-S,--turn-screen-off}'[Turn the device screen off immediately]'
'--shortcut-mod=[\[key1,key2+key3,...\] Specify the modifiers to use for scrcpy shortcuts]:shortcut mod:(lctrl rctrl lalt ralt lsuper rsuper)' '--shortcut-mod=[\[key1,key2+key3,...\] Specify the modifiers to use for scrcpy shortcuts]:shortcut mod:(lctrl rctrl lalt ralt lsuper rsuper)'
@ -86,7 +75,6 @@ arguments=(
'--video-codec=[Select the video codec]:codec:(h264 h265 av1)' '--video-codec=[Select the video codec]:codec:(h264 h265 av1)'
'--video-codec-options=[Set a list of comma-separated key\:type=value options for the device video encoder]' '--video-codec-options=[Set a list of comma-separated key\:type=value options for the device video encoder]'
'--video-encoder=[Use a specific MediaCodec video encoder]' '--video-encoder=[Use a specific MediaCodec video encoder]'
'--video-source=[Select the video source]:source:(display camera)'
{-w,--stay-awake}'[Keep the device on while scrcpy is running, when the device is plugged in]' {-w,--stay-awake}'[Keep the device on while scrcpy is running, when the device is plugged in]'
'--window-borderless[Disable window decorations \(display borderless window\)]' '--window-borderless[Disable window decorations \(display borderless window\)]'
'--window-title=[Set a custom window title]' '--window-title=[Set a custom window title]'

@ -98,24 +98,77 @@ endif
cc = meson.get_compiler('c') cc = meson.get_compiler('c')
dependencies = [ crossbuild_windows = meson.is_cross_build() and host_machine.system() == 'windows'
dependency('libavformat', version: '>= 57.33'),
dependency('libavcodec', version: '>= 57.37'),
dependency('libavutil'),
dependency('libswresample'),
dependency('sdl2', version: '>= 2.0.5'),
]
if v4l2_support if not crossbuild_windows
dependencies += dependency('libavdevice')
endif # native build
dependencies = [
dependency('libavformat', version: '>= 57.33'),
dependency('libavcodec', version: '>= 57.37'),
dependency('libavutil'),
dependency('libswresample'),
dependency('sdl2', version: '>= 2.0.5'),
]
if v4l2_support
dependencies += dependency('libavdevice')
endif
if usb_support
dependencies += dependency('libusb-1.0')
endif
else
# cross-compile mingw32 build (from Linux to Windows)
prebuilt_sdl2 = meson.get_cross_property('prebuilt_sdl2')
sdl2_bin_dir = meson.current_source_dir() + '/prebuilt-deps/data/' + prebuilt_sdl2 + '/bin'
sdl2_lib_dir = meson.current_source_dir() + '/prebuilt-deps/data/' + prebuilt_sdl2 + '/lib'
sdl2_include_dir = 'prebuilt-deps/data/' + prebuilt_sdl2 + '/include'
sdl2 = declare_dependency(
dependencies: [
cc.find_library('SDL2', dirs: sdl2_bin_dir),
cc.find_library('SDL2main', dirs: sdl2_lib_dir),
],
include_directories: include_directories(sdl2_include_dir)
)
prebuilt_ffmpeg = meson.get_cross_property('prebuilt_ffmpeg')
ffmpeg_bin_dir = meson.current_source_dir() + '/prebuilt-deps/data/' + prebuilt_ffmpeg + '/bin'
ffmpeg_include_dir = 'prebuilt-deps/data/' + prebuilt_ffmpeg + '/include'
ffmpeg = declare_dependency(
dependencies: [
cc.find_library('avcodec-60', dirs: ffmpeg_bin_dir),
cc.find_library('avformat-60', dirs: ffmpeg_bin_dir),
cc.find_library('avutil-58', dirs: ffmpeg_bin_dir),
cc.find_library('swresample-4', dirs: ffmpeg_bin_dir),
],
include_directories: include_directories(ffmpeg_include_dir)
)
prebuilt_libusb = meson.get_cross_property('prebuilt_libusb')
libusb_bin_dir = meson.current_source_dir() + '/prebuilt-deps/data/' + prebuilt_libusb + '/bin'
libusb_include_dir = 'prebuilt-deps/data/' + prebuilt_libusb + '/include'
libusb = declare_dependency(
dependencies: [
cc.find_library('msys-usb-1.0', dirs: libusb_bin_dir),
],
include_directories: include_directories(libusb_include_dir)
)
dependencies = [
ffmpeg,
sdl2,
libusb,
cc.find_library('mingw32')
]
if usb_support
dependencies += dependency('libusb-1.0')
endif endif
if host_machine.system() == 'windows' if host_machine.system() == 'windows'
dependencies += cc.find_library('mingw32')
dependencies += cc.find_library('ws2_32') dependencies += cc.find_library('ws2_32')
endif endif
@ -236,10 +289,6 @@ if get_option('buildtype') == 'debug'
'tests/test_device_msg_deserialize.c', 'tests/test_device_msg_deserialize.c',
'src/device_msg.c', 'src/device_msg.c',
]], ]],
['test_orientation', [
'tests/test_orientation.c',
'src/options.c',
]],
['test_strbuf', [ ['test_strbuf', [
'tests/test_strbuf.c', 'tests/test_strbuf.c',
'src/util/strbuf.c', 'src/util/strbuf.c',

@ -6,10 +6,10 @@ cd "$DIR"
mkdir -p "$PREBUILT_DATA_DIR" mkdir -p "$PREBUILT_DATA_DIR"
cd "$PREBUILT_DATA_DIR" cd "$PREBUILT_DATA_DIR"
DEP_DIR=platform-tools-34.0.5 DEP_DIR=platform-tools-34.0.1
FILENAME=platform-tools_r34.0.5-windows.zip FILENAME=platform-tools_r34.0.1-windows.zip
SHA256SUM=3f8320152704377de150418a3c4c9d07d16d80a6c0d0d8f7289c22c499e33571 SHA256SUM=5dd9c2be744c224fa3a7cbe30ba02d2cb378c763bd0f797a7e47e9f3156a5daa
if [[ -d "$DEP_DIR" ]] if [[ -d "$DEP_DIR" ]]
then then

@ -6,11 +6,11 @@ cd "$DIR"
mkdir -p "$PREBUILT_DATA_DIR" mkdir -p "$PREBUILT_DATA_DIR"
cd "$PREBUILT_DATA_DIR" cd "$PREBUILT_DATA_DIR"
VERSION=6.1-scrcpy-3 VERSION=6.0-scrcpy-4
DEP_DIR="ffmpeg-$VERSION" DEP_DIR="ffmpeg-$VERSION"
FILENAME="$DEP_DIR".7z FILENAME="$DEP_DIR".7z
SHA256SUM=b646d18a3d543a4e4c46881568213499f22e4454a464e1552f03f2ac9cc3a05a SHA256SUM=39274b321491ce83e76cab5d24e7cbe3f402d3ccf382f739b13be5651c146b60
if [[ -d "$DEP_DIR" ]] if [[ -d "$DEP_DIR" ]]
then then

@ -6,10 +6,9 @@ cd "$DIR"
mkdir -p "$PREBUILT_DATA_DIR" mkdir -p "$PREBUILT_DATA_DIR"
cd "$PREBUILT_DATA_DIR" cd "$PREBUILT_DATA_DIR"
VERSION=1.0.26 DEP_DIR=libusb-1.0.26
DEP_DIR="libusb-$VERSION"
FILENAME="libusb-$VERSION-binaries.7z" FILENAME=libusb-1.0.26-binaries.7z
SHA256SUM=9c242696342dbde9cdc47239391f71833939bf9f7aa2bbb28cdaabe890465ec5 SHA256SUM=9c242696342dbde9cdc47239391f71833939bf9f7aa2bbb28cdaabe890465ec5
if [[ -d "$DEP_DIR" ]] if [[ -d "$DEP_DIR" ]]
@ -18,22 +17,17 @@ then
exit 0 exit 0
fi fi
get_file "https://github.com/libusb/libusb/releases/download/v$VERSION/$FILENAME" \ get_file "https://github.com/libusb/libusb/releases/download/v1.0.26/$FILENAME" "$FILENAME" "$SHA256SUM"
"$FILENAME" "$SHA256SUM"
mkdir "$DEP_DIR" mkdir "$DEP_DIR"
cd "$DEP_DIR" cd "$DEP_DIR"
7z x "../$FILENAME" \ 7z x "../$FILENAME" \
"libusb-$VERSION-binaries/libusb-MinGW-Win32/" \ libusb-1.0.26-binaries/libusb-MinGW-Win32/bin/msys-usb-1.0.dll \
"libusb-$VERSION-binaries/libusb-MinGW-Win32/" \ libusb-1.0.26-binaries/libusb-MinGW-Win32/include/ \
"libusb-$VERSION-binaries/libusb-MinGW-x64/" \ libusb-1.0.26-binaries/libusb-MinGW-x64/bin/msys-usb-1.0.dll \
"libusb-$VERSION-binaries/libusb-MinGW-x64/" libusb-1.0.26-binaries/libusb-MinGW-x64/include/
mv "libusb-$VERSION-binaries/libusb-MinGW-Win32" . mv libusb-1.0.26-binaries/libusb-MinGW-Win32 .
mv "libusb-$VERSION-binaries/libusb-MinGW-x64" . mv libusb-1.0.26-binaries/libusb-MinGW-x64 .
rm -rf "libusb-$VERSION-binaries" rm -rf libusb-1.0.26-binaries
# Rename the dll to get the same library name on all platforms
mv libusb-MinGW-Win32/bin/msys-usb-1.0.dll libusb-MinGW-Win32/bin/libusb-1.0.dll
mv libusb-MinGW-x64/bin/msys-usb-1.0.dll libusb-MinGW-x64/bin/libusb-1.0.dll

@ -6,11 +6,10 @@ cd "$DIR"
mkdir -p "$PREBUILT_DATA_DIR" mkdir -p "$PREBUILT_DATA_DIR"
cd "$PREBUILT_DATA_DIR" cd "$PREBUILT_DATA_DIR"
VERSION=2.28.5 DEP_DIR=SDL2-2.26.4
DEP_DIR="SDL2-$VERSION"
FILENAME="SDL2-devel-$VERSION-mingw.tar.gz" FILENAME=SDL2-devel-2.26.4-mingw.tar.gz
SHA256SUM=3c0c655c2ebf67cad48fead72761d1601740ded30808952c3274ba223d226c21 SHA256SUM=fe899c8642caac2f180b1ee6f786857ddcaa0adc1fa82474312b09dd47d74712
if [[ -d "$DEP_DIR" ]] if [[ -d "$DEP_DIR" ]]
then then
@ -18,8 +17,7 @@ then
exit 0 exit 0
fi fi
get_file "https://github.com/libsdl-org/SDL/releases/download/release-$VERSION/$FILENAME" \ get_file "https://libsdl.org/release/$FILENAME" "$FILENAME" "$SHA256SUM"
"$FILENAME" "$SHA256SUM"
mkdir "$DEP_DIR" mkdir "$DEP_DIR"
cd "$DEP_DIR" cd "$DEP_DIR"

@ -13,7 +13,7 @@ BEGIN
VALUE "LegalCopyright", "Romain Vimont, Genymobile" VALUE "LegalCopyright", "Romain Vimont, Genymobile"
VALUE "OriginalFilename", "scrcpy.exe" VALUE "OriginalFilename", "scrcpy.exe"
VALUE "ProductName", "scrcpy" VALUE "ProductName", "scrcpy"
VALUE "ProductVersion", "2.3.1" VALUE "ProductVersion", "2.0"
END END
END END
BLOCK "VarFileInfo" BLOCK "VarFileInfo"

@ -21,12 +21,12 @@ Make scrcpy window always on top (above other windows).
.TP .TP
.BI "\-\-audio\-bit\-rate " value .BI "\-\-audio\-bit\-rate " value
Encode the audio at the given bit rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000). Encode the audio at the given bit\-rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Default is 128K (128000). Default is 128K (128000).
.TP .TP
.BI "\-\-audio\-buffer " ms .BI "\-\-audio\-buffer ms
Configure the audio buffering delay (in milliseconds). Configure the audio buffering delay (in milliseconds).
Lower values decrease the latency, but increase the likelyhood of buffer underrun (causing audio glitches). Lower values decrease the latency, but increase the likelyhood of buffer underrun (causing audio glitches).
@ -35,7 +35,7 @@ Default is 50.
.TP .TP
.BI "\-\-audio\-codec " name .BI "\-\-audio\-codec " name
Select an audio codec (opus, aac, flac or raw). Select an audio codec (opus, aac or raw).
Default is opus. Default is opus.
@ -45,15 +45,15 @@ Set a list of comma-separated key:type=value options for the device audio encode
The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'. The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'.
The list of possible codec options is available in the Android documentation: The list of possible codec options is available in the Android documentation
.UR https://d.android.com/reference/android/media/MediaFormat
<https://d.android.com/reference/android/media/MediaFormat> .UE .
.TP .TP
.BI "\-\-audio\-encoder " name .BI "\-\-audio\-encoder " name
Use a specific MediaCodec audio encoder (depending on the codec provided by \fB\-\-audio\-codec\fR). Use a specific MediaCodec audio encoder (depending on the codec provided by \fB\-\-audio\-codec\fR).
The available encoders can be listed by \fB\-\-list\-encoders\fR. The available encoders can be listed by \-\-list\-encoders.
.TP .TP
.BI "\-\-audio\-source " source .BI "\-\-audio\-source " source
@ -62,7 +62,7 @@ Select the audio source (output or mic).
Default is output. Default is output.
.TP .TP
.BI "\-\-audio\-output\-buffer " ms .BI "\-\-audio\-output\-buffer ms
Configure the size of the SDL audio output buffer (in milliseconds). Configure the size of the SDL audio output buffer (in milliseconds).
If you get "robotic" audio playback, you should test with a higher value (10). Do not change this setting otherwise. If you get "robotic" audio playback, you should test with a higher value (10). Do not change this setting otherwise.
@ -71,44 +71,10 @@ Default is 5.
.TP .TP
.BI "\-b, \-\-video\-bit\-rate " value .BI "\-b, \-\-video\-bit\-rate " value
Encode the video at the given bit rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000). Encode the video at the given bit\-rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Default is 8M (8000000). Default is 8M (8000000).
.TP
.BI "\-\-camera\-ar " ar
Select the camera size by its aspect ratio (+/- 10%).
Possible values are "sensor" (use the camera sensor aspect ratio), "\fInum\fR:\fIden\fR" (e.g. "4:3") and "\fIvalue\fR" (e.g. "1.6").
.TP
.B \-\-camera\-high\-speed
Enable high-speed camera capture mode.
This mode is restricted to specific resolutions and frame rates, listed by \fB\-\-list\-camera\-sizes\fR.
.TP
.BI "\-\-camera\-id " id
Specify the device camera id to mirror.
The available camera ids can be listed by \fB\-\-list\-cameras\fR.
.TP
.BI "\-\-camera\-facing " facing
Select the device camera by its facing direction.
Possible values are "front", "back" and "external".
.TP
.BI "\-\-camera\-fps " fps
Specify the camera capture frame rate.
If not specified, Android's default frame rate (30 fps) is used.
.TP
.BI "\-\-camera\-size " width\fRx\fIheight
Specify an explicit camera capture size.
.TP .TP
.BI "\-\-crop " width\fR:\fIheight\fR:\fIx\fR:\fIy .BI "\-\-crop " width\fR:\fIheight\fR:\fIx\fR:\fIy
Crop the device screen on the server. Crop the device screen on the server.
@ -128,27 +94,19 @@ Also see \fB\-e\fR (\fB\-\-select\-tcpip\fR).
Disable screensaver while scrcpy is running. Disable screensaver while scrcpy is running.
.TP .TP
.BI "\-\-display\-buffer " ms .BI "\-\-display " id
Specify the device display id to mirror.
The available display ids can be listed by \-\-list\-displays.
Default is 0.
.TP
.BI "\-\-display\-buffer ms
Add a buffering delay (in milliseconds) before displaying. This increases latency to compensate for jitter. Add a buffering delay (in milliseconds) before displaying. This increases latency to compensate for jitter.
Default is 0 (no buffering). Default is 0 (no buffering).
.TP
.BI "\-\-display\-id " id
Specify the device display id to mirror.
The available display ids can be listed by \fB\-\-list\-displays\fR.
Default is 0.
.TP
.BI "\-\-display\-orientation " value
Set the initial display orientation.
Possible values are 0, 90, 180, 270, flip0, flip90, flip180 and flip270. The number represents the clockwise rotation in degrees; the "flip" keyword applies a horizontal flip before the rotation.
Default is 0.
.TP .TP
.B \-e, \-\-select\-tcpip .B \-e, \-\-select\-tcpip
Use TCP/IP device (if there is exactly one, like adb -e). Use TCP/IP device (if there is exactly one, like adb -e).
@ -197,14 +155,6 @@ Inject computer clipboard text as a sequence of key events on Ctrl+v (like MOD+S
This is a workaround for some devices not behaving as expected when setting the device clipboard programmatically. This is a workaround for some devices not behaving as expected when setting the device clipboard programmatically.
.TP
.B \-\-list\-camera\-sizes
List the valid camera capture sizes.
.TP
.B \-\-list\-cameras
List cameras available on the device.
.TP .TP
.B \-\-list\-encoders .B \-\-list\-encoders
List video and audio encoders available on the device. List video and audio encoders available on the device.
@ -215,9 +165,7 @@ List displays available on the device.
.TP .TP
\fB\-\-lock\-video\-orientation\fR[=\fIvalue\fR] \fB\-\-lock\-video\-orientation\fR[=\fIvalue\fR]
Lock capture video orientation to \fIvalue\fR. Lock video orientation to \fIvalue\fR. Possible values are "unlocked", "initial" (locked to the initial orientation), 0, 1, 2 and 3. Natural device orientation is 0, and each increment adds a 90 degrees rotation counterclockwise.
Possible values are "unlocked", "initial" (locked to the initial orientation), 0, 90, 180, and 270. The values represent the clockwise rotation from the natural device orientation, in degrees.
Default is "unlocked". Default is "unlocked".
@ -251,7 +199,7 @@ Disable device control (mirror the device in read\-only).
.TP .TP
.B \-N, \-\-no\-playback .B \-N, \-\-no\-playback
Disable video and audio playback on the computer (equivalent to \fB\-\-no\-video\-playback \-\-no\-audio\-playback\fR). Disable video and audio playback on the computer (equivalent to --no-video-playback --no-audio-playback).
.TP .TP
.B \-\-no\-audio .B \-\-no\-audio
@ -299,10 +247,6 @@ Disable video forwarding.
.B \-\-no\-video\-playback .B \-\-no\-video\-playback
Disable video playback on the computer. Disable video playback on the computer.
.TP
.BI "\-\-orientation " value
Same as --display-orientation=value --record-orientation=value.
.TP .TP
.B \-\-otg .B \-\-otg
Run in OTG mode: simulate physical keyboard and mouse, as if the computer keyboard and mouse were plugged directly to the device via an OTG cable. Run in OTG mode: simulate physical keyboard and mouse, as if the computer keyboard and mouse were plugged directly to the device via an OTG cable.
@ -323,16 +267,6 @@ Set the TCP port (range) used by the client to listen.
Default is 27183:27199. Default is 27183:27199.
.TP
\fB\-\-pause\-on\-exit\fR[=\fImode\fR]
Configure pause on exit. Possible values are "true" (always pause on exit), "false" (never pause on exit) and "if-error" (pause only if an error occured).
This is useful to prevent the terminal window from automatically closing, so that error messages can be read.
Default is "false".
Passing the option without argument is equivalent to passing "true".
.TP .TP
.B \-\-power\-off\-on\-close .B \-\-power\-off\-on\-close
Turn the device screen off when closing scrcpy. Turn the device screen off when closing scrcpy.
@ -361,7 +295,7 @@ Record screen to
The format is determined by the The format is determined by the
.B \-\-record\-format .B \-\-record\-format
option if set, or by the file extension. option if set, or by the file extension (.mp4 or .mkv).
.TP .TP
.B \-\-raw\-key\-events .B \-\-raw\-key\-events
@ -369,15 +303,7 @@ Inject key events for all input keys, and ignore text events.
.TP .TP
.BI "\-\-record\-format " format .BI "\-\-record\-format " format
Force recording format (mp4, mkv, m4a, mka, opus, aac, flac or wav). Force recording format (either mp4 or mkv).
.TP
.BI "\-\-record\-orientation " value
Set the record orientation.
Possible values are 0, 90, 180 and 270. The number represents the clockwise rotation in degrees.
Default is 0.
.TP .TP
.BI "\-\-render\-driver " name .BI "\-\-render\-driver " name
@ -385,12 +311,17 @@ Request SDL to use the given render driver (this is just a hint).
Supported names are currently "direct3d", "opengl", "opengles2", "opengles", "metal" and "software". Supported names are currently "direct3d", "opengl", "opengles2", "opengles", "metal" and "software".
<https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER> .UR https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER
.UE
.TP .TP
.B \-\-require\-audio .B \-\-require\-audio
By default, scrcpy mirrors only the video if audio capture fails on the device. This option makes scrcpy fail if audio is enabled but does not work. By default, scrcpy mirrors only the video if audio capture fails on the device. This option makes scrcpy fail if audio is enabled but does not work.
.TP
.BI "\-\-rotation " value
Set the initial display rotation. Possibles values are 0, 1, 2 and 3. Each increment adds a 90 degrees rotation counterclockwise.
.TP .TP
.BI "\-s, \-\-serial " number .BI "\-s, \-\-serial " number
The device serial number. Mandatory only if several devices are connected to adb. The device serial number. Mandatory only if several devices are connected to adb.
@ -429,13 +360,13 @@ Set the maximum mirroring time, in seconds.
.TP .TP
.BI "\-\-tunnel\-host " ip .BI "\-\-tunnel\-host " ip
Set the IP address of the adb tunnel to reach the scrcpy server. This option automatically enables \fB\-\-force\-adb\-forward\fR. Set the IP address of the adb tunnel to reach the scrcpy server. This option automatically enables --force-adb-forward.
Default is localhost. Default is localhost.
.TP .TP
.BI "\-\-tunnel\-port " port .BI "\-\-tunnel\-port " port
Set the TCP port of the adb tunnel to reach the scrcpy server. This option automatically enables \fB\-\-force\-adb\-forward\fR. Set the TCP port of the adb tunnel to reach the scrcpy server. This option automatically enables --force-adb-forward.
Default is 0 (not forced): the local port used for establishing the tunnel will be used. Default is 0 (not forced): the local port used for establishing the tunnel will be used.
@ -475,23 +406,15 @@ Set a list of comma-separated key:type=value options for the device video encode
The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'. The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'.
The list of possible codec options is available in the Android documentation: The list of possible codec options is available in the Android documentation
.UR https://d.android.com/reference/android/media/MediaFormat
<https://d.android.com/reference/android/media/MediaFormat> .UE .
.TP .TP
.BI "\-\-video\-encoder " name .BI "\-\-video\-encoder " name
Use a specific MediaCodec video encoder (depending on the codec provided by \fB\-\-video\-codec\fR). Use a specific MediaCodec video encoder (depending on the codec provided by \fB\-\-video\-codec\fR).
The available encoders can be listed by \fB\-\-list\-encoders\fR. The available encoders can be listed by \-\-list\-encoders.
.TP
.BI "\-\-video\-source " source
Select the video source (display or camera).
Camera mirroring requires Android 12+.
Default is display.
.TP .TP
.B \-w, \-\-stay-awake .B \-w, \-\-stay-awake
@ -552,14 +475,6 @@ Rotate display left
.B MOD+Right .B MOD+Right
Rotate display right Rotate display right
.TP
.B MOD+Shift+Left, MOD+Shift+Right
Flip display horizontally
.TP
.B MOD+Shift+Up, MOD+Shift+Down
Flip display vertically
.TP .TP
.B MOD+g .B MOD+g
Resize window to 1:1 (pixel\-perfect) Resize window to 1:1 (pixel\-perfect)
@ -661,7 +576,7 @@ Path to adb.
.TP .TP
.B ANDROID_SERIAL .B ANDROID_SERIAL
Device serial to use if no selector (\fB-s\fR, \fB-d\fR, \fB-e\fR or \fB\-\-tcpip=\fIaddr\fR) is specified. Device serial to use if no selector (-s, -d, -e or --tcpip=<addr>) is specified.
.TP .TP
.B SCRCPY_ICON_PATH .B SCRCPY_ICON_PATH
@ -684,14 +599,23 @@ for the Debian Project (and may be used by others).
.SH "REPORTING BUGS" .SH "REPORTING BUGS"
Report bugs to <https://github.com/Genymobile/scrcpy/issues>. Report bugs to
.UR https://github.com/Genymobile/scrcpy/issues
.UE .
.SH COPYRIGHT .SH COPYRIGHT
Copyright \(co 2018 Genymobile <https://www.genymobile.com> Copyright \(co 2018 Genymobile
.UR https://www.genymobile.com
Genymobile
.UE
Copyright \(co 2018\-2023 Romain Vimont <rom@rom1v.com> Copyright \(co 2018\-2023
.MT rom@rom1v.com
Romain Vimont
.ME
Licensed under the Apache License, Version 2.0. Licensed under the Apache License, Version 2.0.
.SH WWW .SH WWW
<https://github.com/Genymobile/scrcpy> .UR https://github.com/Genymobile/scrcpy
.UE

@ -70,7 +70,7 @@ argv_to_string(const char *const *argv, char *buf, size_t bufsize) {
} }
static void static void
show_adb_installation_msg(void) { show_adb_installation_msg() {
#ifndef __WINDOWS__ #ifndef __WINDOWS__
static const struct { static const struct {
const char *binary; const char *binary;
@ -218,16 +218,8 @@ sc_adb_forward(struct sc_intr *intr, const char *serial, uint16_t local_port,
const char *device_socket_name, unsigned flags) { const char *device_socket_name, unsigned flags) {
char local[4 + 5 + 1]; // tcp:PORT char local[4 + 5 + 1]; // tcp:PORT
char remote[108 + 14 + 1]; // localabstract:NAME char remote[108 + 14 + 1]; // localabstract:NAME
sprintf(local, "tcp:%" PRIu16, local_port);
int r = snprintf(local, sizeof(local), "tcp:%" PRIu16, local_port); snprintf(remote, sizeof(remote), "localabstract:%s", device_socket_name);
assert(r >= 0 && (size_t) r < sizeof(local));
r = snprintf(remote, sizeof(remote), "localabstract:%s",
device_socket_name);
if (r < 0 || (size_t) r >= sizeof(remote)) {
LOGE("Could not write socket name");
return false;
}
assert(serial); assert(serial);
const char *const argv[] = const char *const argv[] =
@ -241,9 +233,7 @@ bool
sc_adb_forward_remove(struct sc_intr *intr, const char *serial, sc_adb_forward_remove(struct sc_intr *intr, const char *serial,
uint16_t local_port, unsigned flags) { uint16_t local_port, unsigned flags) {
char local[4 + 5 + 1]; // tcp:PORT char local[4 + 5 + 1]; // tcp:PORT
int r = snprintf(local, sizeof(local), "tcp:%" PRIu16, local_port); sprintf(local, "tcp:%" PRIu16, local_port);
assert(r >= 0 && (size_t) r < sizeof(local));
(void) r;
assert(serial); assert(serial);
const char *const argv[] = const char *const argv[] =
@ -259,16 +249,8 @@ sc_adb_reverse(struct sc_intr *intr, const char *serial,
unsigned flags) { unsigned flags) {
char local[4 + 5 + 1]; // tcp:PORT char local[4 + 5 + 1]; // tcp:PORT
char remote[108 + 14 + 1]; // localabstract:NAME char remote[108 + 14 + 1]; // localabstract:NAME
int r = snprintf(local, sizeof(local), "tcp:%" PRIu16, local_port); sprintf(local, "tcp:%" PRIu16, local_port);
assert(r >= 0 && (size_t) r < sizeof(local)); snprintf(remote, sizeof(remote), "localabstract:%s", device_socket_name);
r = snprintf(remote, sizeof(remote), "localabstract:%s",
device_socket_name);
if (r < 0 || (size_t) r >= sizeof(remote)) {
LOGE("Could not write socket name");
return false;
}
assert(serial); assert(serial);
const char *const argv[] = const char *const argv[] =
SC_ADB_COMMAND("-s", serial, "reverse", remote, local); SC_ADB_COMMAND("-s", serial, "reverse", remote, local);
@ -281,12 +263,7 @@ bool
sc_adb_reverse_remove(struct sc_intr *intr, const char *serial, sc_adb_reverse_remove(struct sc_intr *intr, const char *serial,
const char *device_socket_name, unsigned flags) { const char *device_socket_name, unsigned flags) {
char remote[108 + 14 + 1]; // localabstract:NAME char remote[108 + 14 + 1]; // localabstract:NAME
int r = snprintf(remote, sizeof(remote), "localabstract:%s", snprintf(remote, sizeof(remote), "localabstract:%s", device_socket_name);
device_socket_name);
if (r < 0 || (size_t) r >= sizeof(remote)) {
LOGE("Device socket name too long");
return false;
}
assert(serial); assert(serial);
const char *const argv[] = const char *const argv[] =
@ -356,9 +333,7 @@ bool
sc_adb_tcpip(struct sc_intr *intr, const char *serial, uint16_t port, sc_adb_tcpip(struct sc_intr *intr, const char *serial, uint16_t port,
unsigned flags) { unsigned flags) {
char port_string[5 + 1]; char port_string[5 + 1];
int r = snprintf(port_string, sizeof(port_string), "%" PRIu16, port); sprintf(port_string, "%" PRIu16, port);
assert(r >= 0 && (size_t) r < sizeof(port_string));
(void) r;
assert(serial); assert(serial);
const char *const argv[] = const char *const argv[] =
@ -653,8 +628,8 @@ sc_adb_select_device(struct sc_intr *intr,
return false; return false;
} }
LOGI("ADB device found:"); LOGD("ADB device found:");
sc_adb_devices_log(SC_LOG_LEVEL_INFO, vec.data, vec.size); sc_adb_devices_log(SC_LOG_LEVEL_DEBUG, vec.data, vec.size);
// Move devics into out_device (do not destroy device) // Move devics into out_device (do not destroy device)
sc_adb_device_move(out_device, device); sc_adb_device_move(out_device, device);

@ -7,7 +7,7 @@
#include "util/log.h" #include "util/log.h"
#include "util/str.h" #include "util/str.h"
static bool bool
sc_adb_parse_device(char *line, struct sc_adb_device *device) { sc_adb_parse_device(char *line, struct sc_adb_device *device) {
// One device line looks like: // One device line looks like:
// "0123456789abcdef device usb:2-1 product:MyProduct model:MyModel " // "0123456789abcdef device usb:2-1 product:MyProduct model:MyModel "

@ -32,7 +32,6 @@ enum {
OPT_WINDOW_BORDERLESS, OPT_WINDOW_BORDERLESS,
OPT_MAX_FPS, OPT_MAX_FPS,
OPT_LOCK_VIDEO_ORIENTATION, OPT_LOCK_VIDEO_ORIENTATION,
OPT_DISPLAY,
OPT_DISPLAY_ID, OPT_DISPLAY_ID,
OPT_ROTATION, OPT_ROTATION,
OPT_RENDER_DRIVER, OPT_RENDER_DRIVER,
@ -77,22 +76,9 @@ enum {
OPT_NO_VIDEO, OPT_NO_VIDEO,
OPT_NO_AUDIO_PLAYBACK, OPT_NO_AUDIO_PLAYBACK,
OPT_NO_VIDEO_PLAYBACK, OPT_NO_VIDEO_PLAYBACK,
OPT_VIDEO_SOURCE,
OPT_AUDIO_SOURCE, OPT_AUDIO_SOURCE,
OPT_KILL_ADB_ON_CLOSE, OPT_KILL_ADB_ON_CLOSE,
OPT_TIME_LIMIT, OPT_TIME_LIMIT,
OPT_PAUSE_ON_EXIT,
OPT_LIST_CAMERAS,
OPT_LIST_CAMERA_SIZES,
OPT_CAMERA_ID,
OPT_CAMERA_SIZE,
OPT_CAMERA_FACING,
OPT_CAMERA_AR,
OPT_CAMERA_FPS,
OPT_CAMERA_HIGH_SPEED,
OPT_DISPLAY_ORIENTATION,
OPT_RECORD_ORIENTATION,
OPT_ORIENTATION,
}; };
struct sc_option { struct sc_option {
@ -138,7 +124,7 @@ static const struct sc_option options[] = {
.longopt_id = OPT_AUDIO_BIT_RATE, .longopt_id = OPT_AUDIO_BIT_RATE,
.longopt = "audio-bit-rate", .longopt = "audio-bit-rate",
.argdesc = "value", .argdesc = "value",
.text = "Encode the audio at the given bit rate, expressed in bits/s. " .text = "Encode the audio at the given bit-rate, expressed in bits/s. "
"Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n" "Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n"
"Default is 128K (128000).", "Default is 128K (128000).",
}, },
@ -155,7 +141,7 @@ static const struct sc_option options[] = {
.longopt_id = OPT_AUDIO_CODEC, .longopt_id = OPT_AUDIO_CODEC,
.longopt = "audio-codec", .longopt = "audio-codec",
.argdesc = "name", .argdesc = "name",
.text = "Select an audio codec (opus, aac, flac or raw).\n" .text = "Select an audio codec (opus, aac or raw).\n"
"Default is opus.", "Default is opus.",
}, },
{ {
@ -199,7 +185,7 @@ static const struct sc_option options[] = {
.shortopt = 'b', .shortopt = 'b',
.longopt = "video-bit-rate", .longopt = "video-bit-rate",
.argdesc = "value", .argdesc = "value",
.text = "Encode the video at the given bit rate, expressed in bits/s. " .text = "Encode the video at the given bit-rate, expressed in bits/s. "
"Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n" "Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n"
"Default is 8M (8000000).", "Default is 8M (8000000).",
}, },
@ -209,51 +195,6 @@ static const struct sc_option options[] = {
.longopt = "bit-rate", .longopt = "bit-rate",
.argdesc = "value", .argdesc = "value",
}, },
{
.longopt_id = OPT_CAMERA_AR,
.longopt = "camera-ar",
.argdesc = "ar",
.text = "Select the camera size by its aspect ratio (+/- 10%).\n"
"Possible values are \"sensor\" (use the camera sensor aspect "
"ratio), \"<num>:<den>\" (e.g. \"4:3\") or \"<value>\" (e.g. "
"\"1.6\")."
},
{
.longopt_id = OPT_CAMERA_ID,
.longopt = "camera-id",
.argdesc = "id",
.text = "Specify the device camera id to mirror.\n"
"The available camera ids can be listed by:\n"
" scrcpy --list-cameras",
},
{
.longopt_id = OPT_CAMERA_FACING,
.longopt = "camera-facing",
.argdesc = "facing",
.text = "Select the device camera by its facing direction.\n"
"Possible values are \"front\", \"back\" and \"external\".",
},
{
.longopt_id = OPT_CAMERA_HIGH_SPEED,
.longopt = "camera-high-speed",
.text = "Enable high-speed camera capture mode.\n"
"This mode is restricted to specific resolutions and frame "
"rates, listed by --list-camera-sizes.",
},
{
.longopt_id = OPT_CAMERA_SIZE,
.longopt = "camera-size",
.argdesc = "<width>x<height>",
.text = "Specify an explicit camera capture size.",
},
{
.longopt_id = OPT_CAMERA_FPS,
.longopt = "camera-fps",
.argdesc = "value",
.text = "Specify the camera capture frame rate.\n"
"If not specified, Android's default frame rate (30 fps) is "
"used.",
},
{ {
// Not really deprecated (--codec has never been released), but without // Not really deprecated (--codec has never been released), but without
// declaring an explicit --codec option, getopt_long() partial matching // declaring an explicit --codec option, getopt_long() partial matching
@ -290,10 +231,13 @@ static const struct sc_option options[] = {
.text = "Disable screensaver while scrcpy is running.", .text = "Disable screensaver while scrcpy is running.",
}, },
{ {
// deprecated .longopt_id = OPT_DISPLAY_ID,
.longopt_id = OPT_DISPLAY,
.longopt = "display", .longopt = "display",
.argdesc = "id", .argdesc = "id",
.text = "Specify the device display id to mirror.\n"
"The available display ids can be listed by:\n"
" scrcpy --list-displays\n"
"Default is 0.",
}, },
{ {
.longopt_id = OPT_DISPLAY_BUFFER, .longopt_id = OPT_DISPLAY_BUFFER,
@ -303,26 +247,6 @@ static const struct sc_option options[] = {
"This increases latency to compensate for jitter.\n" "This increases latency to compensate for jitter.\n"
"Default is 0 (no buffering).", "Default is 0 (no buffering).",
}, },
{
.longopt_id = OPT_DISPLAY_ID,
.longopt = "display-id",
.argdesc = "id",
.text = "Specify the device display id to mirror.\n"
"The available display ids can be listed by:\n"
" scrcpy --list-displays\n"
"Default is 0.",
},
{
.longopt_id = OPT_DISPLAY_ORIENTATION,
.longopt = "display-orientation",
.argdesc = "value",
.text = "Set the initial display orientation.\n"
"Possible values are 0, 90, 180, 270, flip0, flip90, flip180 "
"and flip270. The number represents the clockwise rotation "
"in degrees; the \"flip\" keyword applies a horizontal flip "
"before the rotation.\n"
"Default is 0.",
},
{ {
.shortopt = 'e', .shortopt = 'e',
.longopt = "select-tcpip", .longopt = "select-tcpip",
@ -388,16 +312,6 @@ static const struct sc_option options[] = {
"This is a workaround for some devices not behaving as " "This is a workaround for some devices not behaving as "
"expected when setting the device clipboard programmatically.", "expected when setting the device clipboard programmatically.",
}, },
{
.longopt_id = OPT_LIST_CAMERAS,
.longopt = "list-cameras",
.text = "List device cameras.",
},
{
.longopt_id = OPT_LIST_CAMERA_SIZES,
.longopt = "list-camera-sizes",
.text = "List the valid camera capture sizes.",
},
{ {
.longopt_id = OPT_LIST_DISPLAYS, .longopt_id = OPT_LIST_DISPLAYS,
.longopt = "list-displays", .longopt = "list-displays",
@ -413,11 +327,11 @@ static const struct sc_option options[] = {
.longopt = "lock-video-orientation", .longopt = "lock-video-orientation",
.argdesc = "value", .argdesc = "value",
.optional_arg = true, .optional_arg = true,
.text = "Lock capture video orientation to value.\n" .text = "Lock video orientation to value.\n"
"Possible values are \"unlocked\", \"initial\" (locked to the " "Possible values are \"unlocked\", \"initial\" (locked to the "
"initial orientation), 0, 90, 180 and 270. The values " "initial orientation), 0, 1, 2 and 3. Natural device "
"represent the clockwise rotation from the natural device " "orientation is 0, and each increment adds a 90 degrees "
"orientation, in degrees.\n" "rotation counterclockwise.\n"
"Default is \"unlocked\".\n" "Default is \"unlocked\".\n"
"Passing the option without argument is equivalent to passing " "Passing the option without argument is equivalent to passing "
"\"initial\".", "\"initial\".",
@ -526,13 +440,6 @@ static const struct sc_option options[] = {
.longopt = "no-video-playback", .longopt = "no-video-playback",
.text = "Disable video playback on the computer.", .text = "Disable video playback on the computer.",
}, },
{
.longopt_id = OPT_ORIENTATION,
.longopt = "orientation",
.argdesc = "value",
.text = "Same as --display-orientation=value "
"--record-orientation=value.",
},
{ {
.longopt_id = OPT_OTG, .longopt_id = OPT_OTG,
.longopt = "otg", .longopt = "otg",
@ -556,20 +463,6 @@ static const struct sc_option options[] = {
"Default is " STR(DEFAULT_LOCAL_PORT_RANGE_FIRST) ":" "Default is " STR(DEFAULT_LOCAL_PORT_RANGE_FIRST) ":"
STR(DEFAULT_LOCAL_PORT_RANGE_LAST) ".", STR(DEFAULT_LOCAL_PORT_RANGE_LAST) ".",
}, },
{
.longopt_id = OPT_PAUSE_ON_EXIT,
.longopt = "pause-on-exit",
.argdesc = "mode",
.optional_arg = true,
.text = "Configure pause on exit. Possible values are \"true\" (always "
"pause on exit), \"false\" (never pause on exit) and "
"\"if-error\" (pause only if an error occured).\n"
"This is useful to prevent the terminal window from "
"automatically closing, so that error messages can be read.\n"
"Default is \"false\".\n"
"Passing the option without argument is equivalent to passing "
"\"true\".",
},
{ {
.longopt_id = OPT_POWER_OFF_ON_CLOSE, .longopt_id = OPT_POWER_OFF_ON_CLOSE,
.longopt = "power-off-on-close", .longopt = "power-off-on-close",
@ -604,7 +497,7 @@ static const struct sc_option options[] = {
.argdesc = "file.mp4", .argdesc = "file.mp4",
.text = "Record screen to file.\n" .text = "Record screen to file.\n"
"The format is determined by the --record-format option if " "The format is determined by the --record-format option if "
"set, or by the file extension.", "set, or by the file extension (.mp4 or .mkv).",
}, },
{ {
.longopt_id = OPT_RAW_KEY_EVENTS, .longopt_id = OPT_RAW_KEY_EVENTS,
@ -615,17 +508,7 @@ static const struct sc_option options[] = {
.longopt_id = OPT_RECORD_FORMAT, .longopt_id = OPT_RECORD_FORMAT,
.longopt = "record-format", .longopt = "record-format",
.argdesc = "format", .argdesc = "format",
.text = "Force recording format (mp4, mkv, m4a, mka, opus, aac, flac " .text = "Force recording format (either mp4 or mkv).",
"or wav).",
},
{
.longopt_id = OPT_RECORD_ORIENTATION,
.longopt = "record-orientation",
.argdesc = "value",
.text = "Set the record orientation.\n"
"Possible values are 0, 90, 180 and 270. The number represents "
"the clockwise rotation in degrees.\n"
"Default is 0.",
}, },
{ {
.longopt_id = OPT_RENDER_DRIVER, .longopt_id = OPT_RENDER_DRIVER,
@ -645,10 +528,12 @@ static const struct sc_option options[] = {
"is enabled but does not work." "is enabled but does not work."
}, },
{ {
// deprecated
.longopt_id = OPT_ROTATION, .longopt_id = OPT_ROTATION,
.longopt = "rotation", .longopt = "rotation",
.argdesc = "value", .argdesc = "value",
.text = "Set the initial display rotation.\n"
"Possible values are 0, 1, 2 and 3. Each increment adds a 90 "
"degrees rotation counterclockwise.",
}, },
{ {
.shortopt = 's', .shortopt = 's',
@ -784,14 +669,6 @@ static const struct sc_option options[] = {
"codec provided by --video-codec).\n" "codec provided by --video-codec).\n"
"The available encoders can be listed by --list-encoders.", "The available encoders can be listed by --list-encoders.",
}, },
{
.longopt_id = OPT_VIDEO_SOURCE,
.longopt = "video-source",
.argdesc = "source",
.text = "Select the video source (display or camera).\n"
"Camera mirroring requires Android 12+.\n"
"Default is display.",
},
{ {
.shortopt = 'w', .shortopt = 'w',
.longopt = "stay-awake", .longopt = "stay-awake",
@ -852,14 +729,6 @@ static const struct sc_shortcut shortcuts[] = {
.shortcuts = { "MOD+Right" }, .shortcuts = { "MOD+Right" },
.text = "Rotate display right", .text = "Rotate display right",
}, },
{
.shortcuts = { "MOD+Shift+Left", "MOD+Shift+Right" },
.text = "Flip display horizontally",
},
{
.shortcuts = { "MOD+Shift+Up", "MOD+Shift+Down" },
.text = "Flip display vertically",
},
{ {
.shortcuts = { "MOD+g" }, .shortcuts = { "MOD+g" },
.text = "Resize window to 1:1 (pixel-perfect)", .text = "Resize window to 1:1 (pixel-perfect)",
@ -1201,7 +1070,7 @@ print_shortcut(const struct sc_shortcut *shortcut, unsigned cols) {
while (shortcut->shortcuts[i]) { while (shortcut->shortcuts[i]) {
printf(" %s\n", shortcut->shortcuts[i]); printf(" %s\n", shortcut->shortcuts[i]);
++i; ++i;
} };
char *text = sc_str_wrap_lines(shortcut->text, cols, 8); char *text = sc_str_wrap_lines(shortcut->text, cols, 8);
if (!text) { if (!text) {
@ -1320,9 +1189,9 @@ parse_integer_arg(const char *s, long *out, bool accept_suffix, long min,
} }
static size_t static size_t
parse_integers_arg(const char *s, const char sep, size_t max_items, long *out, parse_integers_arg(const char *s, size_t max_items, long *out, long min,
long min, long max, const char *name) { long max, const char *name) {
size_t count = sc_str_parse_integers(s, sep, max_items, out); size_t count = sc_str_parse_integers(s, ':', max_items, out);
if (!count) { if (!count) {
LOGE("Could not parse %s: %s", name, s); LOGE("Could not parse %s: %s", name, s);
return 0; return 0;
@ -1369,7 +1238,7 @@ parse_max_size(const char *s, uint16_t *max_size) {
static bool static bool
parse_max_fps(const char *s, uint16_t *max_fps) { parse_max_fps(const char *s, uint16_t *max_fps) {
long value; long value;
bool ok = parse_integer_arg(s, &value, false, 0, 0xFFFF, "max fps"); bool ok = parse_integer_arg(s, &value, false, 0, 1000, "max fps");
if (!ok) { if (!ok) {
return false; return false;
} }
@ -1418,50 +1287,15 @@ parse_lock_video_orientation(const char *s,
return true; return true;
} }
if (!strcmp(s, "0")) { long value;
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_0; bool ok = parse_integer_arg(s, &value, false, 0, 3,
return true; "lock video orientation");
if (!ok) {
return false;
} }
if (!strcmp(s, "90")) { *lock_mode = (enum sc_lock_video_orientation) value;
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_90; return true;
return true;
}
if (!strcmp(s, "180")) {
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_180;
return true;
}
if (!strcmp(s, "270")) {
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_270;
return true;
}
if (!strcmp(s, "1")) {
LOGW("--lock-video-orientation=1 is deprecated, use "
"--lock-video-orientation=270 instead.");
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_270;
return true;
}
if (!strcmp(s, "2")) {
LOGW("--lock-video-orientation=2 is deprecated, use "
"--lock-video-orientation=180 instead.");
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_180;
return true;
}
if (!strcmp(s, "3")) {
LOGW("--lock-video-orientation=3 is deprecated, use "
"--lock-video-orientation=90 instead.");
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_90;
return true;
}
LOGE("Unsupported --lock-video-orientation value: %s (expected initial, "
"unlocked, 0, 90, 180 or 270).", s);
return false;
} }
static bool static bool
@ -1476,45 +1310,6 @@ parse_rotation(const char *s, uint8_t *rotation) {
return true; return true;
} }
static bool
parse_orientation(const char *s, enum sc_orientation *orientation) {
if (!strcmp(s, "0")) {
*orientation = SC_ORIENTATION_0;
return true;
}
if (!strcmp(s, "90")) {
*orientation = SC_ORIENTATION_90;
return true;
}
if (!strcmp(s, "180")) {
*orientation = SC_ORIENTATION_180;
return true;
}
if (!strcmp(s, "270")) {
*orientation = SC_ORIENTATION_270;
return true;
}
if (!strcmp(s, "flip0")) {
*orientation = SC_ORIENTATION_FLIP_0;
return true;
}
if (!strcmp(s, "flip90")) {
*orientation = SC_ORIENTATION_FLIP_90;
return true;
}
if (!strcmp(s, "flip180")) {
*orientation = SC_ORIENTATION_FLIP_180;
return true;
}
if (!strcmp(s, "flip270")) {
*orientation = SC_ORIENTATION_FLIP_270;
return true;
}
LOGE("Unsupported orientation: %s (expected 0, 90, 180, 270, flip0, "
"flip90, flip180 or flip270)", optarg);
return false;
}
static bool static bool
parse_window_position(const char *s, int16_t *position) { parse_window_position(const char *s, int16_t *position) {
// special value for "auto" // special value for "auto"
@ -1552,7 +1347,7 @@ parse_window_dimension(const char *s, uint16_t *dimension) {
static bool static bool
parse_port_range(const char *s, struct sc_port_range *port_range) { parse_port_range(const char *s, struct sc_port_range *port_range) {
long values[2]; long values[2];
size_t count = parse_integers_arg(s, ':', 2, values, 0, 0xFFFF, "port"); size_t count = parse_integers_arg(s, 2, values, 0, 0xFFFF, "port");
if (!count) { if (!count) {
return false; return false;
} }
@ -1737,12 +1532,6 @@ get_record_format(const char *name) {
if (!strcmp(name, "aac")) { if (!strcmp(name, "aac")) {
return SC_RECORD_FORMAT_AAC; return SC_RECORD_FORMAT_AAC;
} }
if (!strcmp(name, "flac")) {
return SC_RECORD_FORMAT_FLAC;
}
if (!strcmp(name, "wav")) {
return SC_RECORD_FORMAT_WAV;
}
return 0; return 0;
} }
@ -1750,8 +1539,7 @@ static bool
parse_record_format(const char *optarg, enum sc_record_format *format) { parse_record_format(const char *optarg, enum sc_record_format *format) {
enum sc_record_format fmt = get_record_format(optarg); enum sc_record_format fmt = get_record_format(optarg);
if (!fmt) { if (!fmt) {
LOGE("Unsupported record format: %s (expected mp4, mkv, m4a, mka, " LOGE("Unsupported format: %s (expected mp4 or mkv)", optarg);
"opus, aac, flac or wav)", optarg);
return false; return false;
} }
@ -1813,32 +1601,11 @@ parse_audio_codec(const char *optarg, enum sc_codec *codec) {
*codec = SC_CODEC_AAC; *codec = SC_CODEC_AAC;
return true; return true;
} }
if (!strcmp(optarg, "flac")) {
*codec = SC_CODEC_FLAC;
return true;
}
if (!strcmp(optarg, "raw")) { if (!strcmp(optarg, "raw")) {
*codec = SC_CODEC_RAW; *codec = SC_CODEC_RAW;
return true; return true;
} }
LOGE("Unsupported audio codec: %s (expected opus, aac, flac or raw)", LOGE("Unsupported audio codec: %s (expected opus, aac or raw)", optarg);
optarg);
return false;
}
static bool
parse_video_source(const char *optarg, enum sc_video_source *source) {
if (!strcmp(optarg, "display")) {
*source = SC_VIDEO_SOURCE_DISPLAY;
return true;
}
if (!strcmp(optarg, "camera")) {
*source = SC_VIDEO_SOURCE_CAMERA;
return true;
}
LOGE("Unsupported video source: %s (expected display or camera)", optarg);
return false; return false;
} }
@ -1858,46 +1625,6 @@ parse_audio_source(const char *optarg, enum sc_audio_source *source) {
return false; return false;
} }
static bool
parse_camera_facing(const char *optarg, enum sc_camera_facing *facing) {
if (!strcmp(optarg, "front")) {
*facing = SC_CAMERA_FACING_FRONT;
return true;
}
if (!strcmp(optarg, "back")) {
*facing = SC_CAMERA_FACING_BACK;
return true;
}
if (!strcmp(optarg, "external")) {
*facing = SC_CAMERA_FACING_EXTERNAL;
return true;
}
if (*optarg == '\0') {
// Empty string is a valid value (equivalent to not passing the option)
*facing = SC_CAMERA_FACING_ANY;
return true;
}
LOGE("Unsupported camera facing: %s (expected front, back or external)",
optarg);
return false;
}
static bool
parse_camera_fps(const char *s, uint16_t *camera_fps) {
long value;
bool ok = parse_integer_arg(s, &value, false, 0, 0xFFFF, "camera fps");
if (!ok) {
return false;
}
*camera_fps = (uint16_t) value;
return true;
}
static bool static bool
parse_time_limit(const char *s, sc_tick *tick) { parse_time_limit(const char *s, sc_tick *tick) {
long value; long value;
@ -1910,29 +1637,6 @@ parse_time_limit(const char *s, sc_tick *tick) {
return true; return true;
} }
static bool
parse_pause_on_exit(const char *s, enum sc_pause_on_exit *pause_on_exit) {
if (!s || !strcmp(s, "true")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_TRUE;
return true;
}
if (!strcmp(s, "false")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_FALSE;
return true;
}
if (!strcmp(s, "if-error")) {
*pause_on_exit = SC_PAUSE_ON_EXIT_IF_ERROR;
return true;
}
LOGE("Unsupported pause on exit mode: %s "
"(expected true, false or if-error)", optarg);
return false;
}
static bool static bool
parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[], parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
const char *optstring, const struct option *longopts) { const char *optstring, const struct option *longopts) {
@ -1960,9 +1664,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
case OPT_CROP: case OPT_CROP:
opts->crop = optarg; opts->crop = optarg;
break; break;
case OPT_DISPLAY:
LOGW("--display is deprecated, use --display-id instead.");
// fall through
case OPT_DISPLAY_ID: case OPT_DISPLAY_ID:
if (!parse_display_id(optarg, &opts->display_id)) { if (!parse_display_id(optarg, &opts->display_id)) {
return false; return false;
@ -2118,51 +1819,10 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->key_inject_mode = SC_KEY_INJECT_MODE_RAW; opts->key_inject_mode = SC_KEY_INJECT_MODE_RAW;
break; break;
case OPT_ROTATION: case OPT_ROTATION:
LOGW("--rotation is deprecated, use --display-orientation " if (!parse_rotation(optarg, &opts->rotation)) {
"instead.");
uint8_t rotation;
if (!parse_rotation(optarg, &rotation)) {
return false;
}
assert(rotation <= 3);
switch (rotation) {
case 0:
opts->display_orientation = SC_ORIENTATION_0;
break;
case 1:
// rotation 1 was 90° counterclockwise, but orientation
// is expressed clockwise
opts->display_orientation = SC_ORIENTATION_270;
break;
case 2:
opts->display_orientation = SC_ORIENTATION_180;
break;
case 3:
// rotation 3 was 270° counterclockwise, but orientation
// is expressed clockwise
opts->display_orientation = SC_ORIENTATION_90;
break;
}
break;
case OPT_DISPLAY_ORIENTATION:
if (!parse_orientation(optarg, &opts->display_orientation)) {
return false; return false;
} }
break; break;
case OPT_RECORD_ORIENTATION:
if (!parse_orientation(optarg, &opts->record_orientation)) {
return false;
}
break;
case OPT_ORIENTATION: {
enum sc_orientation orientation;
if (!parse_orientation(optarg, &orientation)) {
return false;
}
opts->display_orientation = orientation;
opts->record_orientation = orientation;
break;
}
case OPT_RENDER_DRIVER: case OPT_RENDER_DRIVER:
opts->render_driver = optarg; opts->render_driver = optarg;
break; break;
@ -2285,16 +1945,10 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false; return false;
#endif #endif
case OPT_LIST_ENCODERS: case OPT_LIST_ENCODERS:
opts->list |= SC_OPTION_LIST_ENCODERS; opts->list_encoders = true;
break; break;
case OPT_LIST_DISPLAYS: case OPT_LIST_DISPLAYS:
opts->list |= SC_OPTION_LIST_DISPLAYS; opts->list_displays = true;
break;
case OPT_LIST_CAMERAS:
opts->list |= SC_OPTION_LIST_CAMERAS;
break;
case OPT_LIST_CAMERA_SIZES:
opts->list |= SC_OPTION_LIST_CAMERA_SIZES;
break; break;
case OPT_REQUIRE_AUDIO: case OPT_REQUIRE_AUDIO:
opts->require_audio = true; opts->require_audio = true;
@ -2310,11 +1964,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false; return false;
} }
break; break;
case OPT_VIDEO_SOURCE:
if (!parse_video_source(optarg, &opts->video_source)) {
return false;
}
break;
case OPT_AUDIO_SOURCE: case OPT_AUDIO_SOURCE:
if (!parse_audio_source(optarg, &opts->audio_source)) { if (!parse_audio_source(optarg, &opts->audio_source)) {
return false; return false;
@ -2328,33 +1977,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false; return false;
} }
break; break;
case OPT_PAUSE_ON_EXIT:
if (!parse_pause_on_exit(optarg, &args->pause_on_exit)) {
return false;
}
break;
case OPT_CAMERA_AR:
opts->camera_ar = optarg;
break;
case OPT_CAMERA_ID:
opts->camera_id = optarg;
break;
case OPT_CAMERA_SIZE:
opts->camera_size = optarg;
break;
case OPT_CAMERA_FACING:
if (!parse_camera_facing(optarg, &opts->camera_facing)) {
return false;
}
break;
case OPT_CAMERA_FPS:
if (!parse_camera_fps(optarg, &opts->camera_fps)) {
return false;
}
break;
case OPT_CAMERA_HIGH_SPEED:
opts->camera_high_speed = true;
break;
default: default:
// getopt prints the error message on stderr // getopt prints the error message on stderr
return false; return false;
@ -2400,6 +2022,12 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->audio_playback = false; opts->audio_playback = false;
} }
if (!opts->video_playback && !otg) {
// If video playback is disabled and OTG is disabled, then there is
// no way to control the device.
opts->control = false;
}
if (opts->video && !opts->video_playback && !opts->record_filename if (opts->video && !opts->video_playback && !opts->record_filename
&& !v4l2) { && !v4l2) {
LOGI("No video playback, no recording, no V4L2 sink: video disabled"); LOGI("No video playback, no recording, no V4L2 sink: video disabled");
@ -2421,19 +2049,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->require_audio = true; opts->require_audio = true;
} }
if (opts->audio_playback && opts->audio_buffer == -1) {
if (opts->audio_codec == SC_CODEC_FLAC) {
// Use 50 ms audio buffer by default, but use a higher value for FLAC,
// which is not low latency (the default encoder produces blocks of
// 4096 samples, which represent ~85.333ms).
LOGI("FLAC audio: audio buffer increased to 120 ms (use "
"--audio-buffer to set a custom value)");
opts->audio_buffer = SC_TICK_FROM_MS(120);
} else {
opts->audio_buffer = SC_TICK_FROM_MS(50);
}
}
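
The 85.333 ms mentioned above is straightforward arithmetic; a standalone check, using the 4096-sample block size and the 48 kHz sample rate that appear elsewhere in this diff:

#include <stdio.h>

int main(void) {
    // One default FLAC block: 4096 samples at 48 kHz
    double block_ms = 4096.0 * 1000.0 / 48000.0;
    printf("%.3f ms\n", block_ms); // prints "85.333 ms", hence the 120 ms buffer
    return 0;
}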
#ifdef HAVE_V4L2 #ifdef HAVE_V4L2
if (v4l2) { if (v4l2) {
if (opts->lock_video_orientation == if (opts->lock_video_orientation ==
@ -2461,58 +2076,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->force_adb_forward = true; opts->force_adb_forward = true;
} }
if (opts->video_source == SC_VIDEO_SOURCE_CAMERA) {
if (opts->display_id) {
LOGE("--display-id is only available with --video-source=display");
return false;
}
if (opts->camera_id && opts->camera_facing != SC_CAMERA_FACING_ANY) {
LOGE("Could not specify both --camera-id and --camera-facing");
return false;
}
if (opts->camera_size) {
if (opts->max_size) {
LOGE("Could not specify both --camera-size and -m/--max-size");
return false;
}
if (opts->camera_ar) {
LOGE("Could not specify both --camera-size and --camera-ar");
return false;
}
}
if (opts->camera_high_speed && !opts->camera_fps) {
LOGE("--camera-high-speed requires an explicit --camera-fps value");
return false;
}
if (opts->control) {
LOGI("Camera video source: control disabled");
opts->control = false;
}
} else if (opts->camera_id
|| opts->camera_ar
|| opts->camera_facing != SC_CAMERA_FACING_ANY
|| opts->camera_fps
|| opts->camera_high_speed
|| opts->camera_size) {
LOGE("Camera options are only available with --video-source=camera");
return false;
}
if (opts->audio && opts->audio_source == SC_AUDIO_SOURCE_AUTO) {
// Select the audio source according to the video source
if (opts->video_source == SC_VIDEO_SOURCE_DISPLAY) {
opts->audio_source = SC_AUDIO_SOURCE_OUTPUT;
} else {
opts->audio_source = SC_AUDIO_SOURCE_MIC;
LOGI("Camera video source: microphone audio source selected");
}
}
if (opts->record_format && !opts->record_filename) { if (opts->record_format && !opts->record_filename) {
LOGE("Record format specified without recording"); LOGE("Record format specified without recording");
return false; return false;
@ -2529,13 +2092,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
} }
} }
if (opts->record_orientation != SC_ORIENTATION_0) { if (opts->audio_codec == SC_CODEC_RAW) {
if (sc_orientation_is_mirror(opts->record_orientation)) { LOGW("Recording does not support RAW audio codec");
LOGE("Record orientation only supports rotation, not " return false;
"flipping: %s",
sc_orientation_get_name(opts->record_orientation));
return false;
}
} }
if (opts->video if (opts->video
@ -2557,30 +2116,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
"(try with --audio-codec=aac)"); "(try with --audio-codec=aac)");
return false; return false;
} }
if (opts->record_format == SC_RECORD_FORMAT_FLAC
&& opts->audio_codec != SC_CODEC_FLAC) {
LOGE("Recording to FLAC file requires a FLAC audio stream "
"(try with --audio-codec=flac)");
return false;
}
if (opts->record_format == SC_RECORD_FORMAT_WAV
&& opts->audio_codec != SC_CODEC_RAW) {
LOGE("Recording to WAV file requires a RAW audio stream "
"(try with --audio-codec=raw)");
return false;
}
if ((opts->record_format == SC_RECORD_FORMAT_MP4 ||
opts->record_format == SC_RECORD_FORMAT_M4A)
&& opts->audio_codec == SC_CODEC_RAW) {
LOGE("Recording to MP4 container does not support RAW audio");
return false;
}
}
if (opts->audio_codec == SC_CODEC_FLAC && opts->audio_bit_rate) {
LOGW("--audio-bit-rate is ignored for FLAC audio codec");
} }
if (opts->audio_codec == SC_CODEC_RAW) { if (opts->audio_codec == SC_CODEC_RAW) {
@ -2661,37 +2196,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return true; return true;
} }
static enum sc_pause_on_exit
sc_get_pause_on_exit(int argc, char *argv[]) {
// Read arguments backwards so that the last --pause-on-exit is considered
// (same behavior as getopt())
for (int i = argc - 1; i >= 1; --i) {
const char *arg = argv[i];
// Starts with "--pause-on-exit"
if (!strncmp("--pause-on-exit", arg, 15)) {
if (arg[15] == '\0') {
// No argument
return SC_PAUSE_ON_EXIT_TRUE;
}
if (arg[15] != '=') {
// Invalid parameter, ignore
return SC_PAUSE_ON_EXIT_FALSE;
}
const char *value = &arg[16];
if (!strcmp(value, "true")) {
return SC_PAUSE_ON_EXIT_TRUE;
}
if (!strcmp(value, "if-error")) {
return SC_PAUSE_ON_EXIT_IF_ERROR;
}
// Set to false, including when the value is invalid
return SC_PAUSE_ON_EXIT_FALSE;
}
}
return SC_PAUSE_ON_EXIT_FALSE;
}
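
A minimal standalone sketch of the same pre-scan idea (the helper name and the test values are mine, not scrcpy's): scanning argv backwards makes the last occurrence win, as with getopt(), and it needs no getopt state, so it still works when regular parsing fails:

#include <stdio.h>
#include <string.h>

// Return the value of the last "--pause-on-exit[=value]" argument,
// "true" for a bare "--pause-on-exit", or NULL if absent or malformed.
static const char *
last_pause_on_exit(int argc, char *argv[]) {
    for (int i = argc - 1; i >= 1; --i) {
        const char *arg = argv[i];
        if (!strncmp(arg, "--pause-on-exit", 15)) {
            if (arg[15] == '\0') {
                return "true";
            }
            if (arg[15] == '=') {
                return &arg[16];
            }
            return NULL; // e.g. "--pause-on-exitfoo"
        }
    }
    return NULL;
}

int main(void) {
    char *argv[] = {"scrcpy", "--pause-on-exit=false", "--pause-on-exit=if-error"};
    const char *value = last_pause_on_exit(3, argv);
    printf("%s\n", value ? value : "(absent)"); // prints "if-error"
    return 0;
}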
bool bool
scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) { scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
struct sc_getopt_adapter adapter; struct sc_getopt_adapter adapter;
@ -2705,11 +2209,5 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
sc_getopt_adapter_destroy(&adapter); sc_getopt_adapter_destroy(&adapter);
if (!ret && args->pause_on_exit == SC_PAUSE_ON_EXIT_FALSE) {
// Check if "--pause-on-exit" is present in the arguments list, because
// it must be taken into account even if command line parsing failed
args->pause_on_exit = sc_get_pause_on_exit(argc, argv);
}
return ret; return ret;
} }

View File

@ -7,17 +7,10 @@
#include "options.h" #include "options.h"
enum sc_pause_on_exit {
SC_PAUSE_ON_EXIT_TRUE,
SC_PAUSE_ON_EXIT_FALSE,
SC_PAUSE_ON_EXIT_IF_ERROR,
};
struct scrcpy_cli_args { struct scrcpy_cli_args {
struct scrcpy_options opts; struct scrcpy_options opts;
bool help; bool help;
bool version; bool version;
enum sc_pause_on_exit pause_on_exit;
}; };
void void

View File

@ -3,9 +3,7 @@
#include "config.h" #include "config.h"
#include <libavcodec/version.h>
#include <libavformat/version.h> #include <libavformat/version.h>
#include <libavutil/version.h>
#include <SDL2/SDL_version.h> #include <SDL2/SDL_version.h>
#ifndef __WIN32 #ifndef __WIN32
@ -52,15 +50,6 @@
# define SCRCPY_LAVU_HAS_CHLAYOUT # define SCRCPY_LAVU_HAS_CHLAYOUT
#endif #endif
// In ffmpeg/doc/APIchanges:
// 2023-10-06 - 5432d2aacad - lavc 60.15.100 - avformat.h
// Deprecate AVFormatContext.{nb_,}side_data, av_stream_add_side_data(),
// av_stream_new_side_data(), and av_stream_get_side_data(). Side data fields
// from AVFormatContext.codecpar should be used from now on.
#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(60, 15, 100)
# define SCRCPY_LAVC_HAS_CODECPAR_CODEC_SIDEDATA
#endif
#if SDL_VERSION_ATLEAST(2, 0, 6) #if SDL_VERSION_ATLEAST(2, 0, 6)
// <https://github.com/libsdl-org/SDL/commit/d7a318de563125e5bb465b1000d6bc9576fbc6fc> // <https://github.com/libsdl-org/SDL/commit/d7a318de563125e5bb465b1000d6bc9576fbc6fc>
# define SCRCPY_SDL_HAS_HINT_TOUCH_MOUSE_EVENTS # define SCRCPY_SDL_HAS_HINT_TOUCH_MOUSE_EVENTS

View File

@ -25,8 +25,7 @@ sc_demuxer_to_avcodec_id(uint32_t codec_id) {
#define SC_CODEC_ID_H265 UINT32_C(0x68323635) // "h265" in ASCII #define SC_CODEC_ID_H265 UINT32_C(0x68323635) // "h265" in ASCII
#define SC_CODEC_ID_AV1 UINT32_C(0x00617631) // "av1" in ASCII #define SC_CODEC_ID_AV1 UINT32_C(0x00617631) // "av1" in ASCII
#define SC_CODEC_ID_OPUS UINT32_C(0x6f707573) // "opus" in ASCII #define SC_CODEC_ID_OPUS UINT32_C(0x6f707573) // "opus" in ASCII
#define SC_CODEC_ID_AAC UINT32_C(0x00616163) // "aac" in ASCII #define SC_CODEC_ID_AAC UINT32_C(0x00616163) // "aac in ASCII"
#define SC_CODEC_ID_FLAC UINT32_C(0x666c6163) // "flac" in ASCII
#define SC_CODEC_ID_RAW UINT32_C(0x00726177) // "raw" in ASCII #define SC_CODEC_ID_RAW UINT32_C(0x00726177) // "raw" in ASCII
switch (codec_id) { switch (codec_id) {
case SC_CODEC_ID_H264: case SC_CODEC_ID_H264:
@ -44,8 +43,6 @@ sc_demuxer_to_avcodec_id(uint32_t codec_id) {
return AV_CODEC_ID_OPUS; return AV_CODEC_ID_OPUS;
case SC_CODEC_ID_AAC: case SC_CODEC_ID_AAC:
return AV_CODEC_ID_AAC; return AV_CODEC_ID_AAC;
case SC_CODEC_ID_FLAC:
return AV_CODEC_ID_FLAC;
case SC_CODEC_ID_RAW: case SC_CODEC_ID_RAW:
return AV_CODEC_ID_PCM_S16LE; return AV_CODEC_ID_PCM_S16LE;
default: default:
@ -210,11 +207,6 @@ run_demuxer(void *data) {
codec_ctx->channels = 2; codec_ctx->channels = 2;
#endif #endif
codec_ctx->sample_rate = 48000; codec_ctx->sample_rate = 48000;
if (raw_codec_id == SC_CODEC_ID_FLAC) {
// The sample_fmt is not set by the FLAC decoder
codec_ctx->sample_fmt = AV_SAMPLE_FMT_S16;
}
} }
if (avcodec_open2(codec_ctx, codec, NULL) < 0) { if (avcodec_open2(codec_ctx, codec, NULL) < 0) {
@ -227,9 +219,8 @@ run_demuxer(void *data) {
} }
// Config packets must be merged with the next non-config packet only for // Config packets must be merged with the next non-config packet only for
// H.26x // video streams
bool must_merge_config_packet = raw_codec_id == SC_CODEC_ID_H264 bool must_merge_config_packet = codec->type == AVMEDIA_TYPE_VIDEO;
|| raw_codec_id == SC_CODEC_ID_H265;
struct sc_packet_merger merger; struct sc_packet_merger merger;

View File

@ -53,7 +53,7 @@ sc_display_init(struct sc_display *display, SDL_Window *window, bool mipmaps) {
display->mipmaps = true; display->mipmaps = true;
} else { } else {
LOGW("Trilinear filtering disabled " LOGW("Trilinear filtering disabled "
"(OpenGL 3.0+ or ES 2.0+ required)"); "(OpenGL 3.0+ or ES 2.0+ required");
} }
} else { } else {
LOGI("Trilinear filtering disabled"); LOGI("Trilinear filtering disabled");
@ -234,7 +234,7 @@ sc_display_update_texture(struct sc_display *display, const AVFrame *frame) {
enum sc_display_result enum sc_display_result
sc_display_render(struct sc_display *display, const SDL_Rect *geometry, sc_display_render(struct sc_display *display, const SDL_Rect *geometry,
enum sc_orientation orientation) { unsigned rotation) {
SDL_RenderClear(display->renderer); SDL_RenderClear(display->renderer);
if (display->pending.flags) { if (display->pending.flags) {
@ -247,33 +247,33 @@ sc_display_render(struct sc_display *display, const SDL_Rect *geometry,
SDL_Renderer *renderer = display->renderer; SDL_Renderer *renderer = display->renderer;
SDL_Texture *texture = display->texture; SDL_Texture *texture = display->texture;
if (orientation == SC_ORIENTATION_0) { if (rotation == 0) {
int ret = SDL_RenderCopy(renderer, texture, NULL, geometry); int ret = SDL_RenderCopy(renderer, texture, NULL, geometry);
if (ret) { if (ret) {
LOGE("Could not render texture: %s", SDL_GetError()); LOGE("Could not render texture: %s", SDL_GetError());
return SC_DISPLAY_RESULT_ERROR; return SC_DISPLAY_RESULT_ERROR;
} }
} else { } else {
unsigned cw_rotation = sc_orientation_get_rotation(orientation); // rotation in RenderCopyEx() is clockwise, while screen->rotation is
// counterclockwise (to be consistent with --lock-video-orientation)
int cw_rotation = (4 - rotation) % 4;
double angle = 90 * cw_rotation; double angle = 90 * cw_rotation;
const SDL_Rect *dstrect = NULL; const SDL_Rect *dstrect = NULL;
SDL_Rect rect; SDL_Rect rect;
if (sc_orientation_is_swap(orientation)) { if (rotation & 1) {
rect.x = geometry->x + (geometry->w - geometry->h) / 2; rect.x = geometry->x + (geometry->w - geometry->h) / 2;
rect.y = geometry->y + (geometry->h - geometry->w) / 2; rect.y = geometry->y + (geometry->h - geometry->w) / 2;
rect.w = geometry->h; rect.w = geometry->h;
rect.h = geometry->w; rect.h = geometry->w;
dstrect = &rect; dstrect = &rect;
} else { } else {
assert(rotation == 2);
dstrect = geometry; dstrect = geometry;
} }
SDL_RendererFlip flip = sc_orientation_is_mirror(orientation)
? SDL_FLIP_HORIZONTAL : 0;
int ret = SDL_RenderCopyEx(renderer, texture, NULL, dstrect, angle, int ret = SDL_RenderCopyEx(renderer, texture, NULL, dstrect, angle,
NULL, flip); NULL, 0);
if (ret) { if (ret) {
LOGE("Could not render texture: %s", SDL_GetError()); LOGE("Could not render texture: %s", SDL_GetError());
return SC_DISPLAY_RESULT_ERROR; return SC_DISPLAY_RESULT_ERROR;

View File

@ -9,7 +9,6 @@
#include "coords.h" #include "coords.h"
#include "opengl.h" #include "opengl.h"
#include "options.h"
#ifdef __APPLE__ #ifdef __APPLE__
# define SC_DISPLAY_FORCE_OPENGL_CORE_PROFILE # define SC_DISPLAY_FORCE_OPENGL_CORE_PROFILE
@ -55,6 +54,6 @@ sc_display_update_texture(struct sc_display *display, const AVFrame *frame);
enum sc_display_result enum sc_display_result
sc_display_render(struct sc_display *display, const SDL_Rect *geometry, sc_display_render(struct sc_display *display, const SDL_Rect *geometry,
enum sc_orientation orientation); unsigned rotation);
#endif #endif

View File

@ -271,7 +271,7 @@ error:
} }
SDL_Surface * SDL_Surface *
scrcpy_icon_load(void) { scrcpy_icon_load() {
char *icon_path = get_icon_path(); char *icon_path = get_icon_path();
if (!icon_path) { if (!icon_path) {
return NULL; return NULL;

View File

@ -293,11 +293,15 @@ rotate_device(struct sc_controller *controller) {
} }
static void static void
apply_orientation_transform(struct sc_screen *screen, rotate_client_left(struct sc_screen *screen) {
enum sc_orientation transform) { unsigned new_rotation = (screen->rotation + 1) % 4;
enum sc_orientation new_orientation = sc_screen_set_rotation(screen, new_rotation);
sc_orientation_apply(screen->orientation, transform); }
sc_screen_set_orientation(screen, new_orientation);
static void
rotate_client_right(struct sc_screen *screen) {
unsigned new_rotation = (screen->rotation + 3) % 4;
sc_screen_set_rotation(screen, new_rotation);
} }
static void static void
@ -417,47 +421,25 @@ sc_input_manager_process_key(struct sc_input_manager *im,
} }
return; return;
case SDLK_DOWN: case SDLK_DOWN:
if (shift) { if (controller && !shift) {
if (!repeat & down) {
apply_orientation_transform(im->screen,
SC_ORIENTATION_FLIP_180);
}
} else if (controller) {
// forward repeated events // forward repeated events
action_volume_down(controller, action); action_volume_down(controller, action);
} }
return; return;
case SDLK_UP: case SDLK_UP:
if (shift) { if (controller && !shift) {
if (!repeat & down) {
apply_orientation_transform(im->screen,
SC_ORIENTATION_FLIP_180);
}
} else if (controller) {
// forward repeated events // forward repeated events
action_volume_up(controller, action); action_volume_up(controller, action);
} }
return; return;
case SDLK_LEFT: case SDLK_LEFT:
if (!repeat && down) { if (!shift && !repeat && down) {
if (shift) { rotate_client_left(im->screen);
apply_orientation_transform(im->screen,
SC_ORIENTATION_FLIP_0);
} else {
apply_orientation_transform(im->screen,
SC_ORIENTATION_270);
}
} }
return; return;
case SDLK_RIGHT: case SDLK_RIGHT:
if (!repeat && down) { if (!shift && !repeat && down) {
if (shift) { rotate_client_right(im->screen);
apply_orientation_transform(im->screen,
SC_ORIENTATION_FLIP_0);
} else {
apply_orientation_transform(im->screen,
SC_ORIENTATION_90);
}
} }
return; return;
case SDLK_c: case SDLK_c:

View File

@ -23,7 +23,7 @@
#include "util/str.h" #include "util/str.h"
#endif #endif
static int int
main_scrcpy(int argc, char *argv[]) { main_scrcpy(int argc, char *argv[]) {
#ifdef _WIN32 #ifdef _WIN32
// disable buffering, we want logs immediately // disable buffering, we want logs immediately
@ -39,32 +39,26 @@ main_scrcpy(int argc, char *argv[]) {
.opts = scrcpy_options_default, .opts = scrcpy_options_default,
.help = false, .help = false,
.version = false, .version = false,
.pause_on_exit = SC_PAUSE_ON_EXIT_FALSE,
}; };
#ifndef NDEBUG #ifndef NDEBUG
args.opts.log_level = SC_LOG_LEVEL_DEBUG; args.opts.log_level = SC_LOG_LEVEL_DEBUG;
#endif #endif
enum scrcpy_exit_code ret;
if (!scrcpy_parse_args(&args, argc, argv)) { if (!scrcpy_parse_args(&args, argc, argv)) {
ret = SCRCPY_EXIT_FAILURE; return SCRCPY_EXIT_FAILURE;
goto end;
} }
sc_set_log_level(args.opts.log_level); sc_set_log_level(args.opts.log_level);
if (args.help) { if (args.help) {
scrcpy_print_usage(argv[0]); scrcpy_print_usage(argv[0]);
ret = SCRCPY_EXIT_SUCCESS; return SCRCPY_EXIT_SUCCESS;
goto end;
} }
if (args.version) { if (args.version) {
scrcpy_print_version(); scrcpy_print_version();
ret = SCRCPY_EXIT_SUCCESS; return SCRCPY_EXIT_SUCCESS;
goto end;
} }
#ifdef SCRCPY_LAVF_REQUIRES_REGISTER_ALL #ifdef SCRCPY_LAVF_REQUIRES_REGISTER_ALL
@ -78,26 +72,18 @@ main_scrcpy(int argc, char *argv[]) {
#endif #endif
if (!net_init()) { if (!net_init()) {
ret = SCRCPY_EXIT_FAILURE; return SCRCPY_EXIT_FAILURE;
goto end;
} }
sc_log_configure(); sc_log_configure();
#ifdef HAVE_USB #ifdef HAVE_USB
ret = args.opts.otg ? scrcpy_otg(&args.opts) : scrcpy(&args.opts); enum scrcpy_exit_code ret = args.opts.otg ? scrcpy_otg(&args.opts)
: scrcpy(&args.opts);
#else #else
ret = scrcpy(&args.opts); enum scrcpy_exit_code ret = scrcpy(&args.opts);
#endif #endif
end:
if (args.pause_on_exit == SC_PAUSE_ON_EXIT_TRUE ||
(args.pause_on_exit == SC_PAUSE_ON_EXIT_IF_ERROR &&
ret != SCRCPY_EXIT_SUCCESS)) {
printf("Press Enter to continue...\n");
getchar();
}
return ret; return ret;
} }

View File

@ -11,19 +11,13 @@ const struct scrcpy_options scrcpy_options_default = {
.audio_codec_options = NULL, .audio_codec_options = NULL,
.video_encoder = NULL, .video_encoder = NULL,
.audio_encoder = NULL, .audio_encoder = NULL,
.camera_id = NULL,
.camera_size = NULL,
.camera_ar = NULL,
.camera_fps = 0,
.log_level = SC_LOG_LEVEL_INFO, .log_level = SC_LOG_LEVEL_INFO,
.video_codec = SC_CODEC_H264, .video_codec = SC_CODEC_H264,
.audio_codec = SC_CODEC_OPUS, .audio_codec = SC_CODEC_OPUS,
.video_source = SC_VIDEO_SOURCE_DISPLAY, .audio_source = SC_AUDIO_SOURCE_OUTPUT,
.audio_source = SC_AUDIO_SOURCE_AUTO,
.record_format = SC_RECORD_FORMAT_AUTO, .record_format = SC_RECORD_FORMAT_AUTO,
.keyboard_input_mode = SC_KEYBOARD_INPUT_MODE_INJECT, .keyboard_input_mode = SC_KEYBOARD_INPUT_MODE_INJECT,
.mouse_input_mode = SC_MOUSE_INPUT_MODE_INJECT, .mouse_input_mode = SC_MOUSE_INPUT_MODE_INJECT,
.camera_facing = SC_CAMERA_FACING_ANY,
.port_range = { .port_range = {
.first = DEFAULT_LOCAL_PORT_RANGE_FIRST, .first = DEFAULT_LOCAL_PORT_RANGE_FIRST,
.last = DEFAULT_LOCAL_PORT_RANGE_LAST, .last = DEFAULT_LOCAL_PORT_RANGE_LAST,
@ -39,15 +33,14 @@ const struct scrcpy_options scrcpy_options_default = {
.audio_bit_rate = 0, .audio_bit_rate = 0,
.max_fps = 0, .max_fps = 0,
.lock_video_orientation = SC_LOCK_VIDEO_ORIENTATION_UNLOCKED, .lock_video_orientation = SC_LOCK_VIDEO_ORIENTATION_UNLOCKED,
.display_orientation = SC_ORIENTATION_0, .rotation = 0,
.record_orientation = SC_ORIENTATION_0,
.window_x = SC_WINDOW_POSITION_UNDEFINED, .window_x = SC_WINDOW_POSITION_UNDEFINED,
.window_y = SC_WINDOW_POSITION_UNDEFINED, .window_y = SC_WINDOW_POSITION_UNDEFINED,
.window_width = 0, .window_width = 0,
.window_height = 0, .window_height = 0,
.display_id = 0, .display_id = 0,
.display_buffer = 0, .display_buffer = 0,
.audio_buffer = -1, // depends on the audio format, .audio_buffer = SC_TICK_FROM_MS(50),
.audio_output_buffer = SC_TICK_FROM_MS(5), .audio_output_buffer = SC_TICK_FROM_MS(5),
.time_limit = 0, .time_limit = 0,
#ifdef HAVE_V4L2 #ifdef HAVE_V4L2
@ -86,43 +79,7 @@ const struct scrcpy_options scrcpy_options_default = {
.video = true, .video = true,
.audio = true, .audio = true,
.require_audio = false, .require_audio = false,
.list_encoders = false,
.list_displays = false,
.kill_adb_on_close = false, .kill_adb_on_close = false,
.camera_high_speed = false,
.list = 0,
}; };
enum sc_orientation
sc_orientation_apply(enum sc_orientation src, enum sc_orientation transform) {
assert(!(src & ~7));
assert(!(transform & ~7));
unsigned transform_hflip = transform & 4;
unsigned transform_rotation = transform & 3;
unsigned src_hflip = src & 4;
unsigned src_rotation = src & 3;
unsigned src_swap = src & 1;
if (src_swap && transform_hflip) {
// If the src is rotated by 90 or 270 degrees, applying a flipped
// transformation requires an additional 180 degrees rotation to
// compensate for the inversion of the order of multiplication:
//
// hflip1 × rotate1 × hflip2 × rotate2
// `--------------' `--------------'
// src transform
//
// In the final result, we want all the hflips then all the rotations,
// so we must move hflip2 to the left:
//
// hflip1 × hflip2 × rotate1' × rotate2
//
// with rotate1' = | rotate1 if src is 0° or 180°
// | rotate1 + 180° if src is 90° or 270°
src_rotation += 2;
}
unsigned result_hflip = src_hflip ^ transform_hflip;
unsigned result_rotation = (transform_rotation + src_rotation) % 4;
enum sc_orientation result = result_hflip | result_rotation;
return result;
}
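
A worked example of the composition implemented above, following its own bit arithmetic (the enum values come from options.h): applying a horizontal flip to a source already rotated by 90 degrees.

// src       = SC_ORIENTATION_90     -> src_hflip = 0, src_rotation = 1, src_swap = 1
// transform = SC_ORIENTATION_FLIP_0 -> transform_hflip = 4, transform_rotation = 0
//
// src is swapped and the transform flips, so src_rotation becomes 1 + 2 = 3;
// result_hflip = 0 ^ 4 = 4 and result_rotation = (0 + 3) % 4 = 3, hence
// result = 4 | 3 = 7:
//
//   sc_orientation_apply(SC_ORIENTATION_90, SC_ORIENTATION_FLIP_0)
//       == SC_ORIENTATION_FLIP_270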

View File

@ -3,7 +3,6 @@
#include "common.h" #include "common.h"
#include <assert.h>
#include <stdbool.h> #include <stdbool.h>
#include <stddef.h> #include <stddef.h>
#include <stdint.h> #include <stdint.h>
@ -26,8 +25,6 @@ enum sc_record_format {
SC_RECORD_FORMAT_MKA, SC_RECORD_FORMAT_MKA,
SC_RECORD_FORMAT_OPUS, SC_RECORD_FORMAT_OPUS,
SC_RECORD_FORMAT_AAC, SC_RECORD_FORMAT_AAC,
SC_RECORD_FORMAT_FLAC,
SC_RECORD_FORMAT_WAV,
}; };
static inline bool static inline bool
@ -35,9 +32,7 @@ sc_record_format_is_audio_only(enum sc_record_format fmt) {
return fmt == SC_RECORD_FORMAT_M4A return fmt == SC_RECORD_FORMAT_M4A
|| fmt == SC_RECORD_FORMAT_MKA || fmt == SC_RECORD_FORMAT_MKA
|| fmt == SC_RECORD_FORMAT_OPUS || fmt == SC_RECORD_FORMAT_OPUS
|| fmt == SC_RECORD_FORMAT_AAC || fmt == SC_RECORD_FORMAT_AAC;
|| fmt == SC_RECORD_FORMAT_FLAC
|| fmt == SC_RECORD_FORMAT_WAV;
} }
enum sc_codec { enum sc_codec {
@ -46,97 +41,22 @@ enum sc_codec {
SC_CODEC_AV1, SC_CODEC_AV1,
SC_CODEC_OPUS, SC_CODEC_OPUS,
SC_CODEC_AAC, SC_CODEC_AAC,
SC_CODEC_FLAC,
SC_CODEC_RAW, SC_CODEC_RAW,
}; };
enum sc_video_source {
SC_VIDEO_SOURCE_DISPLAY,
SC_VIDEO_SOURCE_CAMERA,
};
enum sc_audio_source { enum sc_audio_source {
SC_AUDIO_SOURCE_AUTO, // OUTPUT for video DISPLAY, MIC for video CAMERA
SC_AUDIO_SOURCE_OUTPUT, SC_AUDIO_SOURCE_OUTPUT,
SC_AUDIO_SOURCE_MIC, SC_AUDIO_SOURCE_MIC,
}; };
enum sc_camera_facing {
SC_CAMERA_FACING_ANY,
SC_CAMERA_FACING_FRONT,
SC_CAMERA_FACING_BACK,
SC_CAMERA_FACING_EXTERNAL,
};
// ,----- hflip (applied before the rotation)
// | ,--- 180°
// | | ,- 90° clockwise
// | | |
enum sc_orientation { // v v v
SC_ORIENTATION_0, // 0 0 0
SC_ORIENTATION_90, // 0 0 1
SC_ORIENTATION_180, // 0 1 0
SC_ORIENTATION_270, // 0 1 1
SC_ORIENTATION_FLIP_0, // 1 0 0
SC_ORIENTATION_FLIP_90, // 1 0 1
SC_ORIENTATION_FLIP_180, // 1 1 0
SC_ORIENTATION_FLIP_270, // 1 1 1
};
static inline bool
sc_orientation_is_mirror(enum sc_orientation orientation) {
assert(!(orientation & ~7));
return orientation & 4;
}
// Does the orientation swap width and height?
static inline bool
sc_orientation_is_swap(enum sc_orientation orientation) {
assert(!(orientation & ~7));
return orientation & 1;
}
static inline enum sc_orientation
sc_orientation_get_rotation(enum sc_orientation orientation) {
assert(!(orientation & ~7));
return orientation & 3;
}
enum sc_orientation
sc_orientation_apply(enum sc_orientation src, enum sc_orientation transform);
static inline const char *
sc_orientation_get_name(enum sc_orientation orientation) {
switch (orientation) {
case SC_ORIENTATION_0:
return "0";
case SC_ORIENTATION_90:
return "90";
case SC_ORIENTATION_180:
return "180";
case SC_ORIENTATION_270:
return "270";
case SC_ORIENTATION_FLIP_0:
return "flip0";
case SC_ORIENTATION_FLIP_90:
return "flip90";
case SC_ORIENTATION_FLIP_180:
return "flip180";
case SC_ORIENTATION_FLIP_270:
return "flip270";
default:
return "(unknown)";
}
}
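
A decoding example for the bit layout documented above (a walkthrough of the helpers, not new API): SC_ORIENTATION_FLIP_90 has value 5, i.e. binary 101.

//   sc_orientation_is_mirror(SC_ORIENTATION_FLIP_90)    -> true  (bit 2 set: hflip)
//   sc_orientation_is_swap(SC_ORIENTATION_FLIP_90)      -> true  (bit 0 set: width
//                                                                 and height swap)
//   sc_orientation_get_rotation(SC_ORIENTATION_FLIP_90) -> SC_ORIENTATION_90 (5 & 3)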
enum sc_lock_video_orientation { enum sc_lock_video_orientation {
SC_LOCK_VIDEO_ORIENTATION_UNLOCKED = -1, SC_LOCK_VIDEO_ORIENTATION_UNLOCKED = -1,
// lock the current orientation when scrcpy starts // lock the current orientation when scrcpy starts
SC_LOCK_VIDEO_ORIENTATION_INITIAL = -2, SC_LOCK_VIDEO_ORIENTATION_INITIAL = -2,
SC_LOCK_VIDEO_ORIENTATION_0 = 0, SC_LOCK_VIDEO_ORIENTATION_0 = 0,
SC_LOCK_VIDEO_ORIENTATION_90 = 3, SC_LOCK_VIDEO_ORIENTATION_1,
SC_LOCK_VIDEO_ORIENTATION_180 = 2, SC_LOCK_VIDEO_ORIENTATION_2,
SC_LOCK_VIDEO_ORIENTATION_270 = 1, SC_LOCK_VIDEO_ORIENTATION_3,
}; };
enum sc_keyboard_input_mode { enum sc_keyboard_input_mode {
@ -197,19 +117,13 @@ struct scrcpy_options {
const char *audio_codec_options; const char *audio_codec_options;
const char *video_encoder; const char *video_encoder;
const char *audio_encoder; const char *audio_encoder;
const char *camera_id;
const char *camera_size;
const char *camera_ar;
uint16_t camera_fps;
enum sc_log_level log_level; enum sc_log_level log_level;
enum sc_codec video_codec; enum sc_codec video_codec;
enum sc_codec audio_codec; enum sc_codec audio_codec;
enum sc_video_source video_source;
enum sc_audio_source audio_source; enum sc_audio_source audio_source;
enum sc_record_format record_format; enum sc_record_format record_format;
enum sc_keyboard_input_mode keyboard_input_mode; enum sc_keyboard_input_mode keyboard_input_mode;
enum sc_mouse_input_mode mouse_input_mode; enum sc_mouse_input_mode mouse_input_mode;
enum sc_camera_facing camera_facing;
struct sc_port_range port_range; struct sc_port_range port_range;
uint32_t tunnel_host; uint32_t tunnel_host;
uint16_t tunnel_port; uint16_t tunnel_port;
@ -219,8 +133,7 @@ struct scrcpy_options {
uint32_t audio_bit_rate; uint32_t audio_bit_rate;
uint16_t max_fps; uint16_t max_fps;
enum sc_lock_video_orientation lock_video_orientation; enum sc_lock_video_orientation lock_video_orientation;
enum sc_orientation display_orientation; uint8_t rotation;
enum sc_orientation record_orientation;
int16_t window_x; // SC_WINDOW_POSITION_UNDEFINED for "auto" int16_t window_x; // SC_WINDOW_POSITION_UNDEFINED for "auto"
int16_t window_y; // SC_WINDOW_POSITION_UNDEFINED for "auto" int16_t window_y; // SC_WINDOW_POSITION_UNDEFINED for "auto"
uint16_t window_width; uint16_t window_width;
@ -266,13 +179,9 @@ struct scrcpy_options {
bool video; bool video;
bool audio; bool audio;
bool require_audio; bool require_audio;
bool list_encoders;
bool list_displays;
bool kill_adb_on_close; bool kill_adb_on_close;
bool camera_high_speed;
#define SC_OPTION_LIST_ENCODERS 0x1
#define SC_OPTION_LIST_DISPLAYS 0x2
#define SC_OPTION_LIST_CAMERAS 0x4
#define SC_OPTION_LIST_CAMERA_SIZES 0x8
uint8_t list;
}; };
extern const struct scrcpy_options scrcpy_options_default; extern const struct scrcpy_options scrcpy_options_default;
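
A short usage sketch of the list bit-field introduced above (the accumulate pattern appears in the cli.c hunks; the test with & is the implied counterpart and is written here only as an illustration):

//   opts->list |= SC_OPTION_LIST_ENCODERS;      // accumulate one or more requests
//
//   if (opts->list & SC_OPTION_LIST_DISPLAYS) { // check a specific request
//       ...
//   }
//   if (opts->list) {                           // any listing requested at all?
//       ...
//   }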

View File

@ -4,7 +4,6 @@
#include <libavcodec/avcodec.h> #include <libavcodec/avcodec.h>
#include <libavformat/avformat.h> #include <libavformat/avformat.h>
#include <libavutil/time.h> #include <libavutil/time.h>
#include <libavutil/display.h>
#include "util/log.h" #include "util/log.h"
#include "util/str.h" #include "util/str.h"
@ -70,10 +69,6 @@ sc_recorder_get_format_name(enum sc_record_format format) {
return "matroska"; return "matroska";
case SC_RECORD_FORMAT_OPUS: case SC_RECORD_FORMAT_OPUS:
return "opus"; return "opus";
case SC_RECORD_FORMAT_FLAC:
return "flac";
case SC_RECORD_FORMAT_WAV:
return "wav";
default: default:
return NULL; return NULL;
} }
@ -106,7 +101,7 @@ sc_recorder_write_stream(struct sc_recorder *recorder,
AVStream *stream = recorder->ctx->streams[st->index]; AVStream *stream = recorder->ctx->streams[st->index];
sc_recorder_rescale_packet(stream, packet); sc_recorder_rescale_packet(stream, packet);
if (st->last_pts != AV_NOPTS_VALUE && packet->pts <= st->last_pts) { if (st->last_pts != AV_NOPTS_VALUE && packet->pts <= st->last_pts) {
LOGD("Fixing PTS non monotonically increasing in stream %d " LOGW("Fixing PTS non monotonically increasing in stream %d "
"(%" PRIi64 " >= %" PRIi64 ")", "(%" PRIi64 " >= %" PRIi64 ")",
st->index, st->last_pts, packet->pts); st->index, st->last_pts, packet->pts);
packet->pts = ++st->last_pts; packet->pts = ++st->last_pts;
@ -171,14 +166,13 @@ sc_recorder_close_output_file(struct sc_recorder *recorder) {
} }
static inline bool static inline bool
sc_recorder_must_wait_for_config_packets(struct sc_recorder *recorder) { sc_recorder_has_empty_queues(struct sc_recorder *recorder) {
if (recorder->video && sc_vecdeque_is_empty(&recorder->video_queue)) { if (recorder->video && sc_vecdeque_is_empty(&recorder->video_queue)) {
// The video queue is empty // The video queue is empty
return true; return true;
} }
if (recorder->audio && recorder->audio_expects_config_packet if (recorder->audio && sc_vecdeque_is_empty(&recorder->audio_queue)) {
&& sc_vecdeque_is_empty(&recorder->audio_queue)) {
// The audio queue is empty (when audio is enabled) // The audio queue is empty (when audio is enabled)
return true; return true;
} }
@ -194,7 +188,7 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
while (!recorder->stopped && while (!recorder->stopped &&
((recorder->video && !recorder->video_init) ((recorder->video && !recorder->video_init)
|| (recorder->audio && !recorder->audio_init) || (recorder->audio && !recorder->audio_init)
|| sc_recorder_must_wait_for_config_packets(recorder))) { || sc_recorder_has_empty_queues(recorder))) {
sc_cond_wait(&recorder->cond, &recorder->mutex); sc_cond_wait(&recorder->cond, &recorder->mutex);
} }
@ -213,8 +207,7 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
} }
AVPacket *audio_pkt = NULL; AVPacket *audio_pkt = NULL;
if (recorder->audio_expects_config_packet && if (!sc_vecdeque_is_empty(&recorder->audio_queue)) {
!sc_vecdeque_is_empty(&recorder->audio_queue)) {
assert(recorder->audio); assert(recorder->audio);
audio_pkt = sc_vecdeque_pop(&recorder->audio_queue); audio_pkt = sc_vecdeque_pop(&recorder->audio_queue);
} }
@ -494,42 +487,6 @@ run_recorder(void *data) {
return 0; return 0;
} }
static bool
sc_recorder_set_orientation(AVStream *stream, enum sc_orientation orientation) {
assert(!sc_orientation_is_mirror(orientation));
uint8_t *raw_data;
#ifdef SCRCPY_LAVC_HAS_CODECPAR_CODEC_SIDEDATA
AVPacketSideData *sd =
av_packet_side_data_new(&stream->codecpar->coded_side_data,
&stream->codecpar->nb_coded_side_data,
AV_PKT_DATA_DISPLAYMATRIX,
sizeof(int32_t) * 9, 0);
if (!sd) {
LOG_OOM();
return false;
}
raw_data = sd->data;
#else
raw_data = av_stream_new_side_data(stream, AV_PKT_DATA_DISPLAYMATRIX,
sizeof(int32_t) * 9);
if (!raw_data) {
LOG_OOM();
return false;
}
#endif
int32_t *matrix = (int32_t *) raw_data;
unsigned rotation = orientation;
unsigned angle = rotation * 90;
av_display_rotation_set(matrix, angle);
return true;
}
static bool static bool
sc_recorder_video_packet_sink_open(struct sc_packet_sink *sink, sc_recorder_video_packet_sink_open(struct sc_packet_sink *sink,
AVCodecContext *ctx) { AVCodecContext *ctx) {
@ -557,16 +514,6 @@ sc_recorder_video_packet_sink_open(struct sc_packet_sink *sink,
recorder->video_stream.index = stream->index; recorder->video_stream.index = stream->index;
if (recorder->orientation != SC_ORIENTATION_0) {
if (!sc_recorder_set_orientation(stream, recorder->orientation)) {
sc_mutex_unlock(&recorder->mutex);
return false;
}
LOGI("Record orientation set to %s",
sc_orientation_get_name(recorder->orientation));
}
recorder->video_init = true; recorder->video_init = true;
sc_cond_signal(&recorder->cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
@ -648,10 +595,6 @@ sc_recorder_audio_packet_sink_open(struct sc_packet_sink *sink,
recorder->audio_stream.index = stream->index; recorder->audio_stream.index = stream->index;
// A config packet is provided for all supported formats except raw audio
recorder->audio_expects_config_packet =
ctx->codec_id != AV_CODEC_ID_PCM_S16LE;
recorder->audio_init = true; recorder->audio_init = true;
sc_cond_signal(&recorder->cond); sc_cond_signal(&recorder->cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
@ -736,10 +679,7 @@ sc_recorder_stream_init(struct sc_recorder_stream *stream) {
bool bool
sc_recorder_init(struct sc_recorder *recorder, const char *filename, sc_recorder_init(struct sc_recorder *recorder, const char *filename,
enum sc_record_format format, bool video, bool audio, enum sc_record_format format, bool video, bool audio,
enum sc_orientation orientation,
const struct sc_recorder_callbacks *cbs, void *cbs_userdata) { const struct sc_recorder_callbacks *cbs, void *cbs_userdata) {
assert(!sc_orientation_is_mirror(orientation));
recorder->filename = strdup(filename); recorder->filename = strdup(filename);
if (!recorder->filename) { if (!recorder->filename) {
LOG_OOM(); LOG_OOM();
@ -760,8 +700,6 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
recorder->video = video; recorder->video = video;
recorder->audio = audio; recorder->audio = audio;
recorder->orientation = orientation;
sc_vecdeque_init(&recorder->video_queue); sc_vecdeque_init(&recorder->video_queue);
sc_vecdeque_init(&recorder->audio_queue); sc_vecdeque_init(&recorder->audio_queue);
recorder->stopped = false; recorder->stopped = false;
@ -769,8 +707,6 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
recorder->video_init = false; recorder->video_init = false;
recorder->audio_init = false; recorder->audio_init = false;
recorder->audio_expects_config_packet = false;
sc_recorder_stream_init(&recorder->video_stream); sc_recorder_stream_init(&recorder->video_stream);
sc_recorder_stream_init(&recorder->audio_stream); sc_recorder_stream_init(&recorder->audio_stream);

View File

@ -34,8 +34,6 @@ struct sc_recorder {
bool audio; bool audio;
bool video; bool video;
enum sc_orientation orientation;
char *filename; char *filename;
enum sc_record_format format; enum sc_record_format format;
AVFormatContext *ctx; AVFormatContext *ctx;
@ -52,8 +50,6 @@ struct sc_recorder {
bool video_init; bool video_init;
bool audio_init; bool audio_init;
bool audio_expects_config_packet;
struct sc_recorder_stream video_stream; struct sc_recorder_stream video_stream;
struct sc_recorder_stream audio_stream; struct sc_recorder_stream audio_stream;
@ -69,7 +65,6 @@ struct sc_recorder_callbacks {
bool bool
sc_recorder_init(struct sc_recorder *recorder, const char *filename, sc_recorder_init(struct sc_recorder *recorder, const char *filename,
enum sc_record_format format, bool video, bool audio, enum sc_record_format format, bool video, bool audio,
enum sc_orientation orientation,
const struct sc_recorder_callbacks *cbs, void *cbs_userdata); const struct sc_recorder_callbacks *cbs, void *cbs_userdata);
bool bool

View File

@ -90,7 +90,7 @@ push_event(uint32_t type, const char *name) {
#define PUSH_EVENT(TYPE) push_event(TYPE, # TYPE) #define PUSH_EVENT(TYPE) push_event(TYPE, # TYPE)
#ifdef _WIN32 #ifdef _WIN32
static BOOL WINAPI windows_ctrl_handler(DWORD ctrl_type) { BOOL WINAPI windows_ctrl_handler(DWORD ctrl_type) {
if (ctrl_type == CTRL_C_EVENT) { if (ctrl_type == CTRL_C_EVENT) {
PUSH_EVENT(SDL_QUIT); PUSH_EVENT(SDL_QUIT);
return TRUE; return TRUE;
@ -252,9 +252,7 @@ sc_audio_demuxer_on_ended(struct sc_demuxer *demuxer,
// Contrary to the video demuxer, keep mirroring if only the audio fails // Contrary to the video demuxer, keep mirroring if only the audio fails
// (unless --require-audio is set). // (unless --require-audio is set).
if (status == SC_DEMUXER_STATUS_EOS) { if (status == SC_DEMUXER_STATUS_ERROR
PUSH_EVENT(SC_EVENT_DEVICE_DISCONNECTED);
} else if (status == SC_DEMUXER_STATUS_ERROR
|| (status == SC_DEMUXER_STATUS_DISABLED || (status == SC_DEMUXER_STATUS_DISABLED
&& options->require_audio)) { && options->require_audio)) {
PUSH_EVENT(SC_EVENT_DEMUXER_ERROR); PUSH_EVENT(SC_EVENT_DEMUXER_ERROR);
@ -297,7 +295,7 @@ sc_timeout_on_timeout(struct sc_timeout *timeout, void *userdata) {
// Generate a scrcpy id to differentiate multiple running scrcpy instances // Generate a scrcpy id to differentiate multiple running scrcpy instances
static uint32_t static uint32_t
scrcpy_generate_scid(void) { scrcpy_generate_scid() {
struct sc_rand rand; struct sc_rand rand;
sc_rand_init(&rand); sc_rand_init(&rand);
// Only use 31 bits to avoid issues with signed values on the Java-side // Only use 31 bits to avoid issues with signed values on the Java-side
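
Why 31 bits: Java's int is a signed 32-bit type, so keeping the top bit clear guarantees the scid stays non-negative on the device side. A hedged sketch of one way to do the masking (not necessarily scrcpy's exact code):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t r = 0xDEADBEEF;             // any 32-bit random value
    uint32_t scid = r & 0x7FFFFFFF;      // clear bit 31 so it fits a signed Java int
    printf("0x%08x\n", (unsigned) scid); // prints "0x5eadbeef"
    return 0;
}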
@ -351,9 +349,7 @@ scrcpy(struct scrcpy_options *options) {
.log_level = options->log_level, .log_level = options->log_level,
.video_codec = options->video_codec, .video_codec = options->video_codec,
.audio_codec = options->audio_codec, .audio_codec = options->audio_codec,
.video_source = options->video_source,
.audio_source = options->audio_source, .audio_source = options->audio_source,
.camera_facing = options->camera_facing,
.crop = options->crop, .crop = options->crop,
.port_range = options->port_range, .port_range = options->port_range,
.tunnel_host = options->tunnel_host, .tunnel_host = options->tunnel_host,
@ -373,10 +369,6 @@ scrcpy(struct scrcpy_options *options) {
.audio_codec_options = options->audio_codec_options, .audio_codec_options = options->audio_codec_options,
.video_encoder = options->video_encoder, .video_encoder = options->video_encoder,
.audio_encoder = options->audio_encoder, .audio_encoder = options->audio_encoder,
.camera_id = options->camera_id,
.camera_size = options->camera_size,
.camera_ar = options->camera_ar,
.camera_fps = options->camera_fps,
.force_adb_forward = options->force_adb_forward, .force_adb_forward = options->force_adb_forward,
.power_off_on_close = options->power_off_on_close, .power_off_on_close = options->power_off_on_close,
.clipboard_autosync = options->clipboard_autosync, .clipboard_autosync = options->clipboard_autosync,
@ -385,9 +377,9 @@ scrcpy(struct scrcpy_options *options) {
.tcpip_dst = options->tcpip_dst, .tcpip_dst = options->tcpip_dst,
.cleanup = options->cleanup, .cleanup = options->cleanup,
.power_on = options->power_on, .power_on = options->power_on,
.list_encoders = options->list_encoders,
.list_displays = options->list_displays,
.kill_adb_on_close = options->kill_adb_on_close, .kill_adb_on_close = options->kill_adb_on_close,
.camera_high_speed = options->camera_high_speed,
.list = options->list,
}; };
static const struct sc_server_callbacks cbs = { static const struct sc_server_callbacks cbs = {
@ -405,7 +397,7 @@ scrcpy(struct scrcpy_options *options) {
server_started = true; server_started = true;
if (options->list) { if (options->list_encoders || options->list_displays) {
bool ok = await_for_server(NULL); bool ok = await_for_server(NULL);
ret = ok ? SCRCPY_EXIT_SUCCESS : SCRCPY_EXIT_FAILURE; ret = ok ? SCRCPY_EXIT_SUCCESS : SCRCPY_EXIT_FAILURE;
goto end; goto end;
@ -417,22 +409,9 @@ scrcpy(struct scrcpy_options *options) {
if (options->video_playback) { if (options->video_playback) {
sdl_set_hints(options->render_driver); sdl_set_hints(options->render_driver);
}
if (options->video_playback ||
(options->control && options->clipboard_autosync)) {
// Initialize the video subsystem even if --no-video or
// --no-video-playback is passed so that clipboard synchronization
// still works.
// <https://github.com/Genymobile/scrcpy/issues/4418>
if (SDL_Init(SDL_INIT_VIDEO)) { if (SDL_Init(SDL_INIT_VIDEO)) {
// If it fails, it is an error only if video playback is enabled LOGE("Could not initialize SDL video: %s", SDL_GetError());
if (options->video_playback) { goto end;
LOGE("Could not initialize SDL video: %s", SDL_GetError());
goto end;
} else {
LOGW("Could not initialize SDL video: %s", SDL_GetError());
}
} }
} }
@ -469,7 +448,9 @@ scrcpy(struct scrcpy_options *options) {
struct sc_file_pusher *fp = NULL; struct sc_file_pusher *fp = NULL;
if (options->video_playback && options->control) { // control implies video playback
assert(!options->control || options->video_playback);
if (options->control) {
if (!sc_file_pusher_init(&s->file_pusher, serial, if (!sc_file_pusher_init(&s->file_pusher, serial,
options->push_target)) { options->push_target)) {
goto end; goto end;
@ -516,8 +497,7 @@ scrcpy(struct scrcpy_options *options) {
}; };
if (!sc_recorder_init(&s->recorder, options->record_filename, if (!sc_recorder_init(&s->recorder, options->record_filename,
options->record_format, options->video, options->record_format, options->video,
options->audio, options->record_orientation, options->audio, &recorder_cbs, NULL)) {
&recorder_cbs, NULL)) {
goto end; goto end;
} }
recorder_initialized = true; recorder_initialized = true;
@ -701,7 +681,7 @@ aoa_hid_end:
.window_width = options->window_width, .window_width = options->window_width,
.window_height = options->window_height, .window_height = options->window_height,
.window_borderless = options->window_borderless, .window_borderless = options->window_borderless,
.orientation = options->display_orientation, .rotation = options->rotation,
.mipmaps = options->mipmaps, .mipmaps = options->mipmaps,
.fullscreen = options->fullscreen, .fullscreen = options->fullscreen,
.start_fps_counter = options->start_fps_counter, .start_fps_counter = options->start_fps_counter,

View File

@ -14,16 +14,16 @@
#define DOWNCAST(SINK) container_of(SINK, struct sc_screen, frame_sink) #define DOWNCAST(SINK) container_of(SINK, struct sc_screen, frame_sink)
static inline struct sc_size static inline struct sc_size
get_oriented_size(struct sc_size size, enum sc_orientation orientation) { get_rotated_size(struct sc_size size, int rotation) {
struct sc_size oriented_size; struct sc_size rotated_size;
if (sc_orientation_is_swap(orientation)) { if (rotation & 1) {
oriented_size.width = size.height; rotated_size.width = size.height;
oriented_size.height = size.width; rotated_size.height = size.width;
} else { } else {
oriented_size.width = size.width; rotated_size.width = size.width;
oriented_size.height = size.height; rotated_size.height = size.height;
} }
return oriented_size; return rotated_size;
} }
// get the window size in a struct sc_size // get the window size in a struct sc_size
@ -251,7 +251,7 @@ sc_screen_render(struct sc_screen *screen, bool update_content_rect) {
} }
enum sc_display_result res = enum sc_display_result res =
sc_display_render(&screen->display, &screen->rect, screen->orientation); sc_display_render(&screen->display, &screen->rect, screen->rotation);
(void) res; // any error already logged (void) res; // any error already logged
} }
@ -379,10 +379,9 @@ sc_screen_init(struct sc_screen *screen,
goto error_destroy_frame_buffer; goto error_destroy_frame_buffer;
} }
screen->orientation = params->orientation; screen->rotation = params->rotation;
if (screen->orientation != SC_ORIENTATION_0) { if (screen->rotation) {
LOGI("Initial display orientation set to %s", LOGI("Initial display rotation set to %u", screen->rotation);
sc_orientation_get_name(screen->orientation));
} }
uint32_t window_flags = SDL_WINDOW_HIDDEN uint32_t window_flags = SDL_WINDOW_HIDDEN
@ -489,7 +488,6 @@ sc_screen_show_initial_window(struct sc_screen *screen) {
} }
SDL_ShowWindow(screen->window); SDL_ShowWindow(screen->window);
sc_screen_update_content_rect(screen);
} }
void void
@ -560,19 +558,19 @@ apply_pending_resize(struct sc_screen *screen) {
} }
void void
sc_screen_set_orientation(struct sc_screen *screen, sc_screen_set_rotation(struct sc_screen *screen, unsigned rotation) {
enum sc_orientation orientation) { assert(rotation < 4);
if (orientation == screen->orientation) { if (rotation == screen->rotation) {
return; return;
} }
struct sc_size new_content_size = struct sc_size new_content_size =
get_oriented_size(screen->frame_size, orientation); get_rotated_size(screen->frame_size, rotation);
set_content_size(screen, new_content_size); set_content_size(screen, new_content_size);
screen->orientation = orientation; screen->rotation = rotation;
LOGI("Display orientation set to %s", sc_orientation_get_name(orientation)); LOGI("Display rotation set to %u", rotation);
sc_screen_render(screen, true); sc_screen_render(screen, true);
} }
@ -585,7 +583,7 @@ sc_screen_init_size(struct sc_screen *screen) {
// The requested size is passed via screen->frame_size // The requested size is passed via screen->frame_size
struct sc_size content_size = struct sc_size content_size =
get_oriented_size(screen->frame_size, screen->orientation); get_rotated_size(screen->frame_size, screen->rotation);
screen->content_size = content_size; screen->content_size = content_size;
enum sc_display_result res = enum sc_display_result res =
@ -605,7 +603,7 @@ prepare_for_frame(struct sc_screen *screen, struct sc_size new_frame_size) {
screen->frame_size = new_frame_size; screen->frame_size = new_frame_size;
struct sc_size new_content_size = struct sc_size new_content_size =
get_oriented_size(new_frame_size, screen->orientation); get_rotated_size(new_frame_size, screen->rotation);
set_content_size(screen, new_content_size); set_content_size(screen, new_content_size);
sc_screen_update_content_rect(screen); sc_screen_update_content_rect(screen);
@ -844,54 +842,37 @@ sc_screen_handle_event(struct sc_screen *screen, const SDL_Event *event) {
struct sc_point struct sc_point
sc_screen_convert_drawable_to_frame_coords(struct sc_screen *screen, sc_screen_convert_drawable_to_frame_coords(struct sc_screen *screen,
int32_t x, int32_t y) { int32_t x, int32_t y) {
enum sc_orientation orientation = screen->orientation; unsigned rotation = screen->rotation;
assert(rotation < 4);
int32_t w = screen->content_size.width; int32_t w = screen->content_size.width;
int32_t h = screen->content_size.height; int32_t h = screen->content_size.height;
// screen->rect must be initialized to avoid a division by zero
assert(screen->rect.w && screen->rect.h);
x = (int64_t) (x - screen->rect.x) * w / screen->rect.w; x = (int64_t) (x - screen->rect.x) * w / screen->rect.w;
y = (int64_t) (y - screen->rect.y) * h / screen->rect.h; y = (int64_t) (y - screen->rect.y) * h / screen->rect.h;
// rotate
struct sc_point result; struct sc_point result;
switch (orientation) { switch (rotation) {
case SC_ORIENTATION_0: case 0:
result.x = x; result.x = x;
result.y = y; result.y = y;
break; break;
case SC_ORIENTATION_90: case 1:
result.x = y;
result.y = w - x;
break;
case SC_ORIENTATION_180:
result.x = w - x;
result.y = h - y;
break;
case SC_ORIENTATION_270:
result.x = h - y; result.x = h - y;
result.y = x; result.y = x;
break; break;
case SC_ORIENTATION_FLIP_0: case 2:
result.x = w - x; result.x = w - x;
result.y = y;
break;
case SC_ORIENTATION_FLIP_90:
result.x = h - y;
result.y = w - x;
break;
case SC_ORIENTATION_FLIP_180:
result.x = x;
result.y = h - y; result.y = h - y;
break; break;
default: default:
assert(orientation == SC_ORIENTATION_FLIP_270); assert(rotation == 3);
result.x = y; result.x = y;
result.y = x; result.y = w - x;
break; break;
} }
return result; return result;
} }

View File

@ -14,7 +14,6 @@
#include "frame_buffer.h" #include "frame_buffer.h"
#include "input_manager.h" #include "input_manager.h"
#include "opengl.h" #include "opengl.h"
#include "options.h"
#include "trait/key_processor.h" #include "trait/key_processor.h"
#include "trait/frame_sink.h" #include "trait/frame_sink.h"
#include "trait/mouse_processor.h" #include "trait/mouse_processor.h"
@ -50,8 +49,8 @@ struct sc_screen {
// fullscreen (meaningful only when resize_pending is true) // fullscreen (meaningful only when resize_pending is true)
struct sc_size windowed_content_size; struct sc_size windowed_content_size;
// client orientation // client rotation: 0, 1, 2 or 3 (x90 degrees counterclockwise)
enum sc_orientation orientation; unsigned rotation;
// rectangle of the content (excluding black borders) // rectangle of the content (excluding black borders)
struct SDL_Rect rect; struct SDL_Rect rect;
bool has_frame; bool has_frame;
@ -87,7 +86,7 @@ struct sc_screen_params {
bool window_borderless; bool window_borderless;
enum sc_orientation orientation; uint8_t rotation;
bool mipmaps; bool mipmaps;
bool fullscreen; bool fullscreen;
@ -130,10 +129,9 @@ sc_screen_resize_to_fit(struct sc_screen *screen);
void void
sc_screen_resize_to_pixel_perfect(struct sc_screen *screen); sc_screen_resize_to_pixel_perfect(struct sc_screen *screen);
// set the display orientation // set the display rotation (0, 1, 2 or 3, x90 degrees counterclockwise)
void void
sc_screen_set_orientation(struct sc_screen *screen, sc_screen_set_rotation(struct sc_screen *screen, unsigned rotation);
enum sc_orientation orientation);
// react to SDL events // react to SDL events
// If this function returns false, scrcpy must exit with an error. // If this function returns false, scrcpy must exit with an error.

View File

@ -76,8 +76,6 @@ sc_server_params_destroy(struct sc_server_params *params) {
free((char *) params->video_encoder); free((char *) params->video_encoder);
free((char *) params->audio_encoder); free((char *) params->audio_encoder);
free((char *) params->tcpip_dst); free((char *) params->tcpip_dst);
free((char *) params->camera_id);
free((char *) params->camera_ar);
} }
static bool static bool
@ -88,15 +86,14 @@ sc_server_params_copy(struct sc_server_params *dst,
// The params reference user-allocated memory, so we must copy them to // The params reference user-allocated memory, so we must copy them to
// handle them from another thread // handle them from another thread
#define COPY(FIELD) do { \ #define COPY(FIELD) \
dst->FIELD = NULL; \ dst->FIELD = NULL; \
if (src->FIELD) { \ if (src->FIELD) { \
dst->FIELD = strdup(src->FIELD); \ dst->FIELD = strdup(src->FIELD); \
if (!dst->FIELD) { \ if (!dst->FIELD) { \
goto error; \ goto error; \
} \ } \
} \ }
} while(0)
COPY(req_serial); COPY(req_serial);
COPY(crop); COPY(crop);
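The `do { ... } while (0)` wrapper added around `COPY()` (and around `ADD_PARAM()` further down) is the usual C idiom for making a multi-statement macro behave like a single statement, so that it composes correctly with `if`/`else` and requires a trailing semicolon. A standalone illustration (generic C, not scrcpy code):

```c
#include <stdio.h>

static int count;

// Non-wrapped variant (shown for comparison only): used as the body of an
// if without braces, the second statement would escape the condition, and a
// following "else" would not even compile.
#define LOG_AND_COUNT_BAD(msg) \
    puts(msg); \
    ++count

// do/while(0) turns the expansion into a single statement that still
// requires a trailing semicolon, so it nests correctly in if/else.
#define LOG_AND_COUNT(msg) do { \
    puts(msg); \
    ++count; \
} while (0)

int main(void) {
    int verbose = 0;

    if (verbose)
        LOG_AND_COUNT("verbose");
    else
        LOG_AND_COUNT("quiet");

    printf("count = %d\n", count); // 1
    return 0;
}
```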
@ -105,8 +102,6 @@ sc_server_params_copy(struct sc_server_params *dst,
COPY(video_encoder); COPY(video_encoder);
COPY(audio_encoder); COPY(audio_encoder);
COPY(tcpip_dst); COPY(tcpip_dst);
COPY(camera_id);
COPY(camera_ar);
#undef COPY #undef COPY
return true; return true;
@ -178,8 +173,6 @@ sc_server_get_codec_name(enum sc_codec codec) {
return "opus"; return "opus";
case SC_CODEC_AAC: case SC_CODEC_AAC:
return "aac"; return "aac";
case SC_CODEC_FLAC:
return "flac";
case SC_CODEC_RAW: case SC_CODEC_RAW:
return "raw"; return "raw";
default: default:
@ -187,20 +180,6 @@ sc_server_get_codec_name(enum sc_codec codec) {
} }
} }
static const char *
sc_server_get_camera_facing_name(enum sc_camera_facing camera_facing) {
switch (camera_facing) {
case SC_CAMERA_FACING_FRONT:
return "front";
case SC_CAMERA_FACING_BACK:
return "back";
case SC_CAMERA_FACING_EXTERNAL:
return "external";
default:
return NULL;
}
}
static sc_pid static sc_pid
execute_server(struct sc_server *server, execute_server(struct sc_server *server,
const struct sc_server_params *params) { const struct sc_server_params *params) {
@ -236,13 +215,13 @@ execute_server(struct sc_server *server,
cmd[count++] = SCRCPY_VERSION; cmd[count++] = SCRCPY_VERSION;
unsigned dyn_idx = count; // from there, the strings are allocated unsigned dyn_idx = count; // from there, the strings are allocated
#define ADD_PARAM(fmt, ...) do { \ #define ADD_PARAM(fmt, ...) { \
char *p; \ char *p; \
if (asprintf(&p, fmt, ## __VA_ARGS__) == -1) { \ if (asprintf(&p, fmt, ## __VA_ARGS__) == -1) { \
goto end; \ goto end; \
} \ } \
cmd[count++] = p; \ cmd[count++] = p; \
} while(0) }
ADD_PARAM("scid=%08x", params->scid); ADD_PARAM("scid=%08x", params->scid);
ADD_PARAM("log_level=%s", log_level_to_server_string(params->log_level)); ADD_PARAM("log_level=%s", log_level_to_server_string(params->log_level));
@ -267,11 +246,8 @@ execute_server(struct sc_server *server,
ADD_PARAM("audio_codec=%s", ADD_PARAM("audio_codec=%s",
sc_server_get_codec_name(params->audio_codec)); sc_server_get_codec_name(params->audio_codec));
} }
if (params->video_source != SC_VIDEO_SOURCE_DISPLAY) { if (params->audio_source != SC_AUDIO_SOURCE_OUTPUT) {
assert(params->video_source == SC_VIDEO_SOURCE_CAMERA); assert(params->audio_source == SC_AUDIO_SOURCE_MIC);
ADD_PARAM("video_source=camera");
}
if (params->audio_source == SC_AUDIO_SOURCE_MIC) {
ADD_PARAM("audio_source=mic"); ADD_PARAM("audio_source=mic");
} }
if (params->max_size) { if (params->max_size) {
@ -297,25 +273,6 @@ execute_server(struct sc_server *server,
if (params->display_id) { if (params->display_id) {
ADD_PARAM("display_id=%" PRIu32, params->display_id); ADD_PARAM("display_id=%" PRIu32, params->display_id);
} }
if (params->camera_id) {
ADD_PARAM("camera_id=%s", params->camera_id);
}
if (params->camera_size) {
ADD_PARAM("camera_size=%s", params->camera_size);
}
if (params->camera_facing != SC_CAMERA_FACING_ANY) {
ADD_PARAM("camera_facing=%s",
sc_server_get_camera_facing_name(params->camera_facing));
}
if (params->camera_ar) {
ADD_PARAM("camera_ar=%s", params->camera_ar);
}
if (params->camera_fps) {
ADD_PARAM("camera_fps=%" PRIu16, params->camera_fps);
}
if (params->camera_high_speed) {
ADD_PARAM("camera_high_speed=true");
}
if (params->show_touches) { if (params->show_touches) {
ADD_PARAM("show_touches=true"); ADD_PARAM("show_touches=true");
} }
@ -353,18 +310,12 @@ execute_server(struct sc_server *server,
// By default, power_on is true // By default, power_on is true
ADD_PARAM("power_on=false"); ADD_PARAM("power_on=false");
} }
if (params->list & SC_OPTION_LIST_ENCODERS) { if (params->list_encoders) {
ADD_PARAM("list_encoders=true"); ADD_PARAM("list_encoders=true");
} }
if (params->list & SC_OPTION_LIST_DISPLAYS) { if (params->list_displays) {
ADD_PARAM("list_displays=true"); ADD_PARAM("list_displays=true");
} }
if (params->list & SC_OPTION_LIST_CAMERAS) {
ADD_PARAM("list_cameras=true");
}
if (params->list & SC_OPTION_LIST_CAMERA_SIZES) {
ADD_PARAM("list_camera_sizes=true");
}
#undef ADD_PARAM #undef ADD_PARAM
@ -582,8 +533,8 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
if (audio_socket == SC_SOCKET_NONE) { if (audio_socket == SC_SOCKET_NONE) {
goto fail; goto fail;
} }
bool ok = net_connect_intr(&server->intr, audio_socket, bool ok = net_connect_intr(&server->intr, audio_socket, tunnel_host,
tunnel_host, tunnel_port); tunnel_port);
if (!ok) { if (!ok) {
goto fail; goto fail;
} }
@ -944,7 +895,7 @@ run_server(void *data) {
// If --list-* is passed, then the server just prints the requested data // If --list-* is passed, then the server just prints the requested data
// then exits. // then exits.
if (params->list) { if (params->list_encoders || params->list_displays) {
sc_pid pid = execute_server(server, params); sc_pid pid = execute_server(server, params);
if (pid == SC_PROCESS_NONE) { if (pid == SC_PROCESS_NONE) {
goto error_connection_failed; goto error_connection_failed;

View File

@ -26,18 +26,12 @@ struct sc_server_params {
enum sc_log_level log_level; enum sc_log_level log_level;
enum sc_codec video_codec; enum sc_codec video_codec;
enum sc_codec audio_codec; enum sc_codec audio_codec;
enum sc_video_source video_source;
enum sc_audio_source audio_source; enum sc_audio_source audio_source;
enum sc_camera_facing camera_facing;
const char *crop; const char *crop;
const char *video_codec_options; const char *video_codec_options;
const char *audio_codec_options; const char *audio_codec_options;
const char *video_encoder; const char *video_encoder;
const char *audio_encoder; const char *audio_encoder;
const char *camera_id;
const char *camera_size;
const char *camera_ar;
uint16_t camera_fps;
struct sc_port_range port_range; struct sc_port_range port_range;
uint32_t tunnel_host; uint32_t tunnel_host;
uint16_t tunnel_port; uint16_t tunnel_port;
@ -62,9 +56,9 @@ struct sc_server_params {
bool select_tcpip; bool select_tcpip;
bool cleanup; bool cleanup;
bool power_on; bool power_on;
bool list_encoders;
bool list_displays;
bool kill_adb_on_close; bool kill_adb_on_close;
bool camera_high_speed;
uint8_t list;
}; };
struct sc_server { struct sc_server {

View File

@ -27,8 +27,7 @@
// keyboard support, though OS could support more keys via modifying the report // keyboard support, though OS could support more keys via modifying the report
// desc. 6 should be enough for scrcpy. // desc. 6 should be enough for scrcpy.
#define HID_KEYBOARD_MAX_KEYS 6 #define HID_KEYBOARD_MAX_KEYS 6
#define HID_KEYBOARD_EVENT_SIZE \ #define HID_KEYBOARD_EVENT_SIZE (2 + HID_KEYBOARD_MAX_KEYS)
(HID_KEYBOARD_INDEX_KEYS + HID_KEYBOARD_MAX_KEYS)
#define HID_RESERVED 0x00 #define HID_RESERVED 0x00
#define HID_ERROR_ROLL_OVER 0x01 #define HID_ERROR_ROLL_OVER 0x01
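For context, this size matches the standard boot-protocol keyboard input report: one modifier byte, one reserved byte, then up to `HID_KEYBOARD_MAX_KEYS` (6) key codes, i.e. 8 bytes in total. A standalone sketch of that layout (illustrative names and values, not the project's code):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

// Boot-protocol keyboard input report: 1 modifier byte, 1 reserved byte,
// then up to 6 simultaneously pressed key codes.
#define KEYBOARD_MAX_KEYS 6
#define KEYBOARD_EVENT_SIZE (2 + KEYBOARD_MAX_KEYS)

// HID usage IDs from the Keyboard/Keypad usage page
#define HID_MOD_LEFT_SHIFT 0x02
#define HID_KEY_A          0x04

static void
fill_report(uint8_t report[KEYBOARD_EVENT_SIZE], uint8_t modifiers,
            const uint8_t *keys, size_t count) {
    memset(report, 0, KEYBOARD_EVENT_SIZE);
    report[0] = modifiers; // bitmask: ctrl/shift/alt/gui, left and right
    // report[1] is reserved, left to 0
    for (size_t i = 0; i < count && i < KEYBOARD_MAX_KEYS; ++i) {
        report[2 + i] = keys[i];
    }
}

int main(void) {
    uint8_t report[KEYBOARD_EVENT_SIZE];
    uint8_t keys[] = { HID_KEY_A };
    fill_report(report, HID_MOD_LEFT_SHIFT, keys, 1); // Shift+A

    for (size_t i = 0; i < KEYBOARD_EVENT_SIZE; ++i) {
        printf("%02x ", (unsigned) report[i]);
    }
    printf("\n"); // 02 00 04 00 00 00 00 00
    return 0;
}
```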

View File

@ -105,6 +105,10 @@ scrcpy_otg(struct scrcpy_options *options) {
usb_device_initialized = true; usb_device_initialized = true;
LOGI("USB device: %s (%04x:%04x) %s %s", usb_device.serial,
(unsigned) usb_device.vid, (unsigned) usb_device.pid,
usb_device.manufacturer, usb_device.product);
ok = sc_usb_connect(&s->usb, usb_device.device, &cbs, NULL); ok = sc_usb_connect(&s->usb, usb_device.device, &cbs, NULL);
if (!ok) { if (!ok) {
goto end; goto end;

View File

@ -93,7 +93,7 @@ sc_usb_device_move(struct sc_usb_device *dst, struct sc_usb_device *src) {
src->product = NULL; src->product = NULL;
} }
static void void
sc_usb_devices_destroy(struct sc_vec_usb_devices *usb_devices) { sc_usb_devices_destroy(struct sc_vec_usb_devices *usb_devices) {
for (size_t i = 0; i < usb_devices->size; ++i) { for (size_t i = 0; i < usb_devices->size; ++i) {
sc_usb_device_destroy(&usb_devices->data[i]); sc_usb_device_destroy(&usb_devices->data[i]);
@ -213,8 +213,8 @@ sc_usb_select_device(struct sc_usb *usb, const char *serial,
assert(sel_count == 1); // sel_idx is valid only if sel_count == 1 assert(sel_count == 1); // sel_idx is valid only if sel_count == 1
struct sc_usb_device *device = &vec.data[sel_idx]; struct sc_usb_device *device = &vec.data[sel_idx];
LOGI("USB device found:"); LOGD("USB device found:");
sc_usb_devices_log(SC_LOG_LEVEL_INFO, vec.data, vec.size); sc_usb_devices_log(SC_LOG_LEVEL_DEBUG, vec.data, vec.size);
// Move device into out_device (do not destroy device) // Move device into out_device (do not destroy device)
sc_usb_device_move(out_device, device); sc_usb_device_move(out_device, device);

View File

@ -147,7 +147,7 @@ sc_sdl_log_print(void *userdata, int category, SDL_LogPriority priority,
} }
void void
sc_log_configure(void) { sc_log_configure() {
SDL_LogSetOutputFunction(sc_sdl_log_print, NULL); SDL_LogSetOutputFunction(sc_sdl_log_print, NULL);
// Redirect FFmpeg logs to SDL logs // Redirect FFmpeg logs to SDL logs
av_log_set_callback(sc_av_log_callback); av_log_set_callback(sc_av_log_callback);

View File

@ -36,6 +36,6 @@ sc_log_windows_error(const char *prefix, int error);
#endif #endif
void void
sc_log_configure(void); sc_log_configure();
#endif #endif

View File

@ -190,10 +190,10 @@ sc_vecdeque_reallocdata_(void *ptr, size_t newcap, size_t item_size,
size_t right_len = MIN(size, oldcap - oldorigin); size_t right_len = MIN(size, oldcap - oldorigin);
assert(right_len); assert(right_len);
memcpy(newptr, (char *) ptr + (oldorigin * item_size), right_len * item_size); memcpy(newptr, ptr + (oldorigin * item_size), right_len * item_size);
if (size > right_len) { if (size > right_len) {
memcpy((char *) newptr + (right_len * item_size), ptr, memcpy(newptr + (right_len * item_size), ptr,
(size - right_len) * item_size); (size - right_len) * item_size);
} }

View File

@ -5,7 +5,7 @@
#include "util/bytebuf.h" #include "util/bytebuf.h"
static void test_bytebuf_simple(void) { void test_bytebuf_simple(void) {
struct sc_bytebuf buf; struct sc_bytebuf buf;
uint8_t data[20]; uint8_t data[20];
@ -34,7 +34,7 @@ static void test_bytebuf_simple(void) {
sc_bytebuf_destroy(&buf); sc_bytebuf_destroy(&buf);
} }
static void test_bytebuf_boundaries(void) { void test_bytebuf_boundaries(void) {
struct sc_bytebuf buf; struct sc_bytebuf buf;
uint8_t data[20]; uint8_t data[20];
@ -71,7 +71,7 @@ static void test_bytebuf_boundaries(void) {
sc_bytebuf_destroy(&buf); sc_bytebuf_destroy(&buf);
} }
static void test_bytebuf_two_steps_write(void) { void test_bytebuf_two_steps_write(void) {
struct sc_bytebuf buf; struct sc_bytebuf buf;
uint8_t data[20]; uint8_t data[20];

View File

@ -1,91 +0,0 @@
#include "common.h"
#include <assert.h>
#include "options.h"
static void test_transforms(void) {
#define O(X) SC_ORIENTATION_ ## X
#define ASSERT_TRANSFORM(SRC, TR, RES) \
assert(sc_orientation_apply(O(SRC), O(TR)) == O(RES));
ASSERT_TRANSFORM(0, 0, 0);
ASSERT_TRANSFORM(0, 90, 90);
ASSERT_TRANSFORM(0, 180, 180);
ASSERT_TRANSFORM(0, 270, 270);
ASSERT_TRANSFORM(0, FLIP_0, FLIP_0);
ASSERT_TRANSFORM(0, FLIP_90, FLIP_90);
ASSERT_TRANSFORM(0, FLIP_180, FLIP_180);
ASSERT_TRANSFORM(0, FLIP_270, FLIP_270);
ASSERT_TRANSFORM(90, 0, 90);
ASSERT_TRANSFORM(90, 90, 180);
ASSERT_TRANSFORM(90, 180, 270);
ASSERT_TRANSFORM(90, 270, 0);
ASSERT_TRANSFORM(90, FLIP_0, FLIP_270);
ASSERT_TRANSFORM(90, FLIP_90, FLIP_0);
ASSERT_TRANSFORM(90, FLIP_180, FLIP_90);
ASSERT_TRANSFORM(90, FLIP_270, FLIP_180);
ASSERT_TRANSFORM(180, 0, 180);
ASSERT_TRANSFORM(180, 90, 270);
ASSERT_TRANSFORM(180, 180, 0);
ASSERT_TRANSFORM(180, 270, 90);
ASSERT_TRANSFORM(180, FLIP_0, FLIP_180);
ASSERT_TRANSFORM(180, FLIP_90, FLIP_270);
ASSERT_TRANSFORM(180, FLIP_180, FLIP_0);
ASSERT_TRANSFORM(180, FLIP_270, FLIP_90);
ASSERT_TRANSFORM(270, 0, 270);
ASSERT_TRANSFORM(270, 90, 0);
ASSERT_TRANSFORM(270, 180, 90);
ASSERT_TRANSFORM(270, 270, 180);
ASSERT_TRANSFORM(270, FLIP_0, FLIP_90);
ASSERT_TRANSFORM(270, FLIP_90, FLIP_180);
ASSERT_TRANSFORM(270, FLIP_180, FLIP_270);
ASSERT_TRANSFORM(270, FLIP_270, FLIP_0);
ASSERT_TRANSFORM(FLIP_0, 0, FLIP_0);
ASSERT_TRANSFORM(FLIP_0, 90, FLIP_90);
ASSERT_TRANSFORM(FLIP_0, 180, FLIP_180);
ASSERT_TRANSFORM(FLIP_0, 270, FLIP_270);
ASSERT_TRANSFORM(FLIP_0, FLIP_0, 0);
ASSERT_TRANSFORM(FLIP_0, FLIP_90, 90);
ASSERT_TRANSFORM(FLIP_0, FLIP_180, 180);
ASSERT_TRANSFORM(FLIP_0, FLIP_270, 270);
ASSERT_TRANSFORM(FLIP_90, 0, FLIP_90);
ASSERT_TRANSFORM(FLIP_90, 90, FLIP_180);
ASSERT_TRANSFORM(FLIP_90, 180, FLIP_270);
ASSERT_TRANSFORM(FLIP_90, 270, FLIP_0);
ASSERT_TRANSFORM(FLIP_90, FLIP_0, 270);
ASSERT_TRANSFORM(FLIP_90, FLIP_90, 0);
ASSERT_TRANSFORM(FLIP_90, FLIP_180, 90);
ASSERT_TRANSFORM(FLIP_90, FLIP_270, 180);
ASSERT_TRANSFORM(FLIP_180, 0, FLIP_180);
ASSERT_TRANSFORM(FLIP_180, 90, FLIP_270);
ASSERT_TRANSFORM(FLIP_180, 180, FLIP_0);
ASSERT_TRANSFORM(FLIP_180, 270, FLIP_90);
ASSERT_TRANSFORM(FLIP_180, FLIP_0, 180);
ASSERT_TRANSFORM(FLIP_180, FLIP_90, 270);
ASSERT_TRANSFORM(FLIP_180, FLIP_180, 0);
ASSERT_TRANSFORM(FLIP_180, FLIP_270, 90);
ASSERT_TRANSFORM(FLIP_270, 0, FLIP_270);
ASSERT_TRANSFORM(FLIP_270, 90, FLIP_0);
ASSERT_TRANSFORM(FLIP_270, 180, FLIP_90);
ASSERT_TRANSFORM(FLIP_270, 270, FLIP_180);
ASSERT_TRANSFORM(FLIP_270, FLIP_0, 90);
ASSERT_TRANSFORM(FLIP_270, FLIP_90, 180);
ASSERT_TRANSFORM(FLIP_270, FLIP_180, 270);
ASSERT_TRANSFORM(FLIP_270, FLIP_270, 0);
}
int main(int argc, char *argv[]) {
(void) argc;
(void) argv;
test_transforms();
return 0;
}
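Conceptually, each of these orientations is a horizontal flip combined with a quarter-turn count, and `sc_orientation_apply()` composes them following the rules of the dihedral group. Below is a standalone sketch of one way such a composition can be implemented, checked against a few entries of the removed table; the encoding (a flip flag plus a rotation count) is illustrative and not necessarily the project's actual enum layout:

```c
#include <assert.h>
#include <stdbool.h>

// Illustrative encoding: an orientation is a horizontal flip (applied first)
// followed by `rot` quarter turns.
struct orientation {
    bool flip;     // horizontal flip
    unsigned rot;  // number of 90-degree rotations (0..3), applied after the flip
};

// Apply `tr` to content already transformed by `src` (i.e. tr composed with src).
// A flip conjugates a rotation into its inverse (f r = r^-1 f), hence the
// sign change on src.rot when the transform contains a flip.
static struct orientation
orientation_apply(struct orientation src, struct orientation tr) {
    struct orientation result;
    result.flip = src.flip ^ tr.flip;
    unsigned src_rot = tr.flip ? (4 - src.rot) % 4 : src.rot;
    result.rot = (tr.rot + src_rot) % 4;
    return result;
}

#define O(FLIP, ROT) ((struct orientation) { .flip = (FLIP), .rot = (ROT) })

int main(void) {
    // A few cases from the removed test table:
    // apply(90, 90) == 180
    struct orientation r = orientation_apply(O(false, 1), O(false, 1));
    assert(!r.flip && r.rot == 2);
    // apply(90, FLIP_0) == FLIP_270
    r = orientation_apply(O(false, 1), O(true, 0));
    assert(r.flip && r.rot == 3);
    // apply(FLIP_90, FLIP_90) == 0
    r = orientation_apply(O(true, 1), O(true, 1));
    assert(!r.flip && r.rot == 0);
    // apply(FLIP_180, 270) == FLIP_90
    r = orientation_apply(O(true, 2), O(false, 3));
    assert(r.flip && r.rot == 1);
    return 0;
}
```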

View File

@ -269,25 +269,21 @@ static void test_parse_integer_with_suffix(void) {
char buf[32]; char buf[32];
int r = snprintf(buf, sizeof(buf), "%ldk", LONG_MAX / 2000); sprintf(buf, "%ldk", LONG_MAX / 2000);
assert(r >= 0 && (size_t) r < sizeof(buf));
ok = sc_str_parse_integer_with_suffix(buf, &value); ok = sc_str_parse_integer_with_suffix(buf, &value);
assert(ok); assert(ok);
assert(value == LONG_MAX / 2000 * 1000); assert(value == LONG_MAX / 2000 * 1000);
r = snprintf(buf, sizeof(buf), "%ldm", LONG_MAX / 2000); sprintf(buf, "%ldm", LONG_MAX / 2000);
assert(r >= 0 && (size_t) r < sizeof(buf));
ok = sc_str_parse_integer_with_suffix(buf, &value); ok = sc_str_parse_integer_with_suffix(buf, &value);
assert(!ok); assert(!ok);
r = snprintf(buf, sizeof(buf), "%ldk", LONG_MIN / 2000); sprintf(buf, "%ldk", LONG_MIN / 2000);
assert(r >= 0 && (size_t) r < sizeof(buf));
ok = sc_str_parse_integer_with_suffix(buf, &value); ok = sc_str_parse_integer_with_suffix(buf, &value);
assert(ok); assert(ok);
assert(value == LONG_MIN / 2000 * 1000); assert(value == LONG_MIN / 2000 * 1000);
r = snprintf(buf, sizeof(buf), "%ldm", LONG_MIN / 2000); sprintf(buf, "%ldm", LONG_MIN / 2000);
assert(r >= 0 && (size_t) r < sizeof(buf));
ok = sc_str_parse_integer_with_suffix(buf, &value); ok = sc_str_parse_integer_with_suffix(buf, &value);
assert(!ok); assert(!ok);
} }
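The tests now use `snprintf` and check its return value: `snprintf` reports the number of characters that would have been written, so a result of `sizeof(buf)` or more signals truncation. A tiny standalone illustration:

```c
#include <assert.h>
#include <stdio.h>

int main(void) {
    char buf[8];

    // snprintf() returns the length that *would* have been written, so a
    // return value >= sizeof(buf) means the output was truncated.
    int r = snprintf(buf, sizeof(buf), "%d", 42);
    assert(r >= 0 && (size_t) r < sizeof(buf)); // "42" fits

    r = snprintf(buf, sizeof(buf), "%d", 123456789);
    assert(r == 9);                    // 9 characters needed...
    assert((size_t) r >= sizeof(buf)); // ...so the result was truncated
    return 0;
}
```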
@ -362,7 +358,7 @@ static void test_index_of_column(void) {
assert(sc_str_index_of_column(" a bc d", 1, " ") == 2); assert(sc_str_index_of_column(" a bc d", 1, " ") == 2);
} }
static void test_remove_trailing_cr(void) { static void test_remove_trailing_cr() {
char s[] = "abc\r"; char s[] = "abc\r";
sc_str_remove_trailing_cr(s, sizeof(s) - 1); sc_str_remove_trailing_cr(s, sizeof(s) - 1);
assert(!strcmp(s, "abc")); assert(!strcmp(s, "abc"));

View File

@ -102,7 +102,7 @@ static void test_vecdeque_reserve(void) {
sc_vecdeque_destroy(&vdq); sc_vecdeque_destroy(&vdq);
} }
static void test_vecdeque_grow(void) { static void test_vecdeque_grow() {
struct SC_VECDEQUE(int) vdq = SC_VECDEQUE_INITIALIZER; struct SC_VECDEQUE(int) vdq = SC_VECDEQUE_INITIALIZER;
bool ok = sc_vecdeque_reserve(&vdq, 20); bool ok = sc_vecdeque_reserve(&vdq, 20);
@ -142,7 +142,7 @@ static void test_vecdeque_grow(void) {
sc_vecdeque_destroy(&vdq); sc_vecdeque_destroy(&vdq);
} }
static void test_vecdeque_push_hole(void) { static void test_vecdeque_push_hole() {
struct SC_VECDEQUE(int) vdq = SC_VECDEQUE_INITIALIZER; struct SC_VECDEQUE(int) vdq = SC_VECDEQUE_INITIALIZER;
bool ok = sc_vecdeque_reserve(&vdq, 20); bool ok = sc_vecdeque_reserve(&vdq, 20);

View File

@ -187,7 +187,7 @@ static void test_vector_index_of(void) {
sc_vector_destroy(&vec); sc_vector_destroy(&vec);
} }
static void test_vector_grow(void) { static void test_vector_grow() {
struct SC_VECTOR(int) vec = SC_VECTOR_INITIALIZER; struct SC_VECTOR(int) vec = SC_VECTOR_INITIALIZER;
bool ok; bool ok;

View File

@ -7,7 +7,7 @@ buildscript {
mavenCentral() mavenCentral()
} }
dependencies { dependencies {
classpath 'com.android.tools.build:gradle:8.1.3' classpath 'com.android.tools.build:gradle:7.4.0'
// NOTE: Do not place your application dependencies here; they belong // NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files // in the individual module build.gradle files
@ -23,3 +23,7 @@ allprojects {
options.compilerArgs << "-Xlint:deprecation" options.compilerArgs << "-Xlint:deprecation"
} }
} }
task clean(type: Delete) {
delete rootProject.buildDir
}

View File

@ -2,7 +2,7 @@ apply plugin: 'checkstyle'
check.dependsOn 'checkstyle' check.dependsOn 'checkstyle'
checkstyle { checkstyle {
toolVersion = '10.12.5' toolVersion = '9.0.1'
} }
task checkstyle(type: Checkstyle) { task checkstyle(type: Checkstyle) {

View File

@ -6,7 +6,7 @@ c = 'i686-w64-mingw32-gcc'
cpp = 'i686-w64-mingw32-g++' cpp = 'i686-w64-mingw32-g++'
ar = 'i686-w64-mingw32-ar' ar = 'i686-w64-mingw32-ar'
strip = 'i686-w64-mingw32-strip' strip = 'i686-w64-mingw32-strip'
pkg-config = 'i686-w64-mingw32-pkg-config' pkgconfig = 'i686-w64-mingw32-pkg-config'
windres = 'i686-w64-mingw32-windres' windres = 'i686-w64-mingw32-windres'
[host_machine] [host_machine]
@ -14,3 +14,8 @@ system = 'windows'
cpu_family = 'x86' cpu_family = 'x86'
cpu = 'i686' cpu = 'i686'
endian = 'little' endian = 'little'
[properties]
prebuilt_ffmpeg = 'ffmpeg-6.0-scrcpy-4/win32'
prebuilt_sdl2 = 'SDL2-2.26.4/i686-w64-mingw32'
prebuilt_libusb = 'libusb-1.0.26/libusb-MinGW-Win32'

View File

@ -6,7 +6,7 @@ c = 'x86_64-w64-mingw32-gcc'
cpp = 'x86_64-w64-mingw32-g++' cpp = 'x86_64-w64-mingw32-g++'
ar = 'x86_64-w64-mingw32-ar' ar = 'x86_64-w64-mingw32-ar'
strip = 'x86_64-w64-mingw32-strip' strip = 'x86_64-w64-mingw32-strip'
pkg-config = 'x86_64-w64-mingw32-pkg-config' pkgconfig = 'x86_64-w64-mingw32-pkg-config'
windres = 'x86_64-w64-mingw32-windres' windres = 'x86_64-w64-mingw32-windres'
[host_machine] [host_machine]
@ -14,3 +14,8 @@ system = 'windows'
cpu_family = 'x86' cpu_family = 'x86'
cpu = 'x86_64' cpu = 'x86_64'
endian = 'little' endian = 'little'
[properties]
prebuilt_ffmpeg = 'ffmpeg-6.0-scrcpy-4/win64'
prebuilt_sdl2 = 'SDL2-2.26.4/x86_64-w64-mingw32'
prebuilt_libusb = 'libusb-1.0.26/libusb-MinGW-x64'

View File

@ -62,37 +62,15 @@ scrcpy --audio-source=mic --no-video --no-playback --record=file.opus
## Codec ## Codec
The audio codec can be selected. The possible values are `opus` (default), The audio codec can be selected. The possible values are `opus` (default), `aac`
`aac`, `flac` and `raw` (uncompressed PCM 16-bit LE): and `raw` (uncompressed PCM 16-bit LE):
```bash ```bash
scrcpy --audio-codec=opus # default scrcpy --audio-codec=opus # default
scrcpy --audio-codec=aac scrcpy --audio-codec=aac
scrcpy --audio-codec=flac
scrcpy --audio-codec=raw scrcpy --audio-codec=raw
``` ```
In particular, if you get the following error:
> Failed to initialize audio/opus, error 0xfffffffe
then your device has no Opus encoder: try `scrcpy --audio-codec=aac`.
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--audio-codec-options` in the manpage or in `scrcpy --help`.
For example, to change the [FLAC compression level]:
```bash
scrcpy --audio-codec=flac --audio-codec-options=flac-compression-level=8
```
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat
[FLAC compression level]: https://developer.android.com/reference/android/media/MediaFormat#KEY_FLAC_COMPRESSION_LEVEL
## Encoder
Several encoders may be available on the device. They can be listed by: Several encoders may be available on the device. They can be listed by:
```bash ```bash
@ -101,14 +79,19 @@ scrcpy --list-encoders
To select a specific encoder: To select a specific encoder:
```bash ```
scrcpy --audio-codec=opus --audio-encoder='c2.android.opus.encoder' scrcpy --audio-codec=opus --audio-encoder='c2.android.opus.encoder'
``` ```
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--audio-codec-options` in the manpage or in `scrcpy --help`.
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat
## Bit rate ## Bit rate
The default audio bit rate is 128Kbps. To change it: The default video bit-rate is 128Kbps. To change it:
```bash ```bash
scrcpy --audio-bit-rate=64K scrcpy --audio-bit-rate=64K

View File

@ -58,7 +58,7 @@ sudo apt install gcc git pkg-config meson ninja-build libsdl2-dev \
libswresample-dev libusb-1.0-0-dev libswresample-dev libusb-1.0-0-dev
# server build dependencies # server build dependencies
sudo apt install openjdk-17-jdk sudo apt install openjdk-11-jdk
``` ```
On old versions (like Ubuntu 16.04), `meson` is too old. In that case, install On old versions (like Ubuntu 16.04), `meson` is too old. In that case, install
@ -77,7 +77,7 @@ pip3 install meson
sudo dnf install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm sudo dnf install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm
# client build dependencies # client build dependencies
sudo dnf install SDL2-devel ffms2-devel libusb1-devel meson gcc make sudo dnf install SDL2-devel ffms2-devel libusb-devel meson gcc make
# server build dependencies # server build dependencies
sudo dnf install java-devel sudo dnf install java-devel
@ -100,7 +100,7 @@ sudo apt install mingw-w64 mingw-w64-tools
You also need the JDK to build the server: You also need the JDK to build the server:
```bash ```bash
sudo apt install openjdk-17-jdk sudo apt install openjdk-11-jdk
``` ```
Then generate the releases: Then generate the releases:
@ -168,13 +168,13 @@ brew install sdl2 ffmpeg libusb
brew install pkg-config meson brew install pkg-config meson
``` ```
Additionally, if you want to build the server, install Java 17 from Caskroom, and Additionally, if you want to build the server, install Java 8 from Caskroom, and
make it available from the `PATH`: make it available from the `PATH`:
```bash ```bash
brew tap homebrew/cask-versions brew tap homebrew/cask-versions
brew install adoptopenjdk/openjdk/adoptopenjdk17 brew install adoptopenjdk/openjdk/adoptopenjdk11
export JAVA_HOME="$(/usr/libexec/java_home --version 1.17)" export JAVA_HOME="$(/usr/libexec/java_home --version 1.11)"
export PATH="$JAVA_HOME/bin:$PATH" export PATH="$JAVA_HOME/bin:$PATH"
``` ```
@ -233,10 +233,10 @@ install` must be run as root)._
#### Option 2: Use prebuilt server #### Option 2: Use prebuilt server
- [`scrcpy-server-v2.3.1`][direct-scrcpy-server] - [`scrcpy-server-v2.0`][direct-scrcpy-server]
<sub>SHA-256: `f6814822fc308a7a532f253485c9038183c6296a6c5df470a9e383b4f8e7605b`</sub> <sub>SHA-256: `9e241615f578cd690bb43311000debdecf6a9c50a7082b001952f18f6f21ddc2`</sub>
[direct-scrcpy-server]: https://github.com/Genymobile/scrcpy/releases/download/v2.3.1/scrcpy-server-v2.3.1 [direct-scrcpy-server]: https://github.com/Genymobile/scrcpy/releases/download/v2.0/scrcpy-server-v2.0
Download the prebuilt server somewhere, and specify its path during the Meson Download the prebuilt server somewhere, and specify its path during the Meson
configuration: configuration:

View File

@ -1,171 +0,0 @@
# Camera
Camera mirroring is supported for devices with Android 12 or higher.
To capture the camera instead of the device screen:
```
scrcpy --video-source=camera
```
By default, it automatically switches [audio source](audio.md#source) to
microphone (as if `--audio-source=mic` were also passed).
```bash
scrcpy --video-source=display # default is --audio-source=output
scrcpy --video-source=camera # default is --audio-source=mic
scrcpy --video-source=display --audio-source=mic # force display AND microphone
scrcpy --video-source=camera --audio-source=output # force camera AND device audio output
```
Audio can be disabled:
```bash
# audio not captured at all
scrcpy --video-source=camera --no-audio
scrcpy --video-source=camera --no-audio --record=file.mp4
# audio captured and recorded, but not played
scrcpy --video-source=camera --no-audio-playback --record=file.mp4
```
## List
To list the cameras available (with their declared valid sizes and frame rates):
```
scrcpy --list-cameras
scrcpy --list-camera-sizes
```
_Note that the sizes and frame rates are declarative. They are not accurate on
all devices: some of them are declared but not supported, while some others are
not declared but supported._
## Selection
It is possible to pass an explicit camera id (as listed by `--list-cameras`):
```
scrcpy --video-source=camera --camera-id=0
```
Alternatively, the camera may be selected automatically:
```bash
scrcpy --video-source=camera # use the first camera
scrcpy --video-source=camera --camera-facing=front # use the first front camera
scrcpy --video-source=camera --camera-facing=back # use the first back camera
scrcpy --video-source=camera --camera-facing=external # use the first external camera
```
If `--camera-id` is specified, then `--camera-facing` is forbidden (the id
already determines the camera):
```bash
scrcpy --video-source=camera --camera-id=0 --camera-facing=front # error
```
### Size selection
It is possible to pass an explicit camera size:
```
scrcpy --video-source=camera --camera-size=1920x1080
```
The given size may be listed among the declared valid sizes
(`--list-camera-sizes`), but may also be anything else (some devices support
arbitrary sizes):
```
scrcpy --video-source=camera --camera-size=1840x444
```
Alternatively, a declared valid size (among the ones listed by
`--list-camera-sizes`) may be selected automatically.
Two constraints are supported:
- `-m`/`--max-size` (already used for display mirroring), for example `-m1920`;
- `--camera-ar` to specify an aspect ratio (`<num>:<den>`, `<value>` or
`sensor`).
Some examples:
```bash
scrcpy --video-source=camera # use the greatest width and the greatest associated height
scrcpy --video-source=camera -m1920 # use the greatest width not above 1920 and the greatest associated height
scrcpy --video-source=camera --camera-ar=4:3 # use the greatest size with an aspect ratio of 4:3 (+/- 10%)
scrcpy --video-source=camera --camera-ar=1.6 # use the greatest size with an aspect ratio of 1.6 (+/- 10%)
scrcpy --video-source=camera --camera-ar=sensor # use the greatest size with the aspect ratio of the camera sensor (+/- 10%)
scrcpy --video-source=camera -m1920 --camera-ar=16:9 # use the greatest width not above 1920 and the closest to 16:9 aspect ratio
```
If `--camera-size` is specified, then `-m`/`--max-size` and `--camera-ar` are
forbidden (the size is determined by the value given explicitly):
```bash
scrcpy --video-source=camera --camera-size=1920x1080 -m3000 # error
```
## Rotation
To rotate the captured video, use the [video orientation](video.md#orientation)
option:
```
scrcpy --video-source=camera --camera-size=1920x1080 --orientation=90
```
## Frame rate
By default, the camera is captured at Android's default frame rate (30 fps).
To configure a different frame rate:
```
scrcpy --video-source=camera --camera-fps=60
```
## High speed capture
The Android camera API also supports a [high speed capture mode][high speed].
This mode is restricted to specific resolutions and frame rates, listed by
`--list-camera-sizes`.
```
scrcpy --video-source=camera --camera-size=1920x1080 --camera-fps=240
```
[high speed]: https://developer.android.com/reference/android/hardware/camera2/CameraConstrainedHighSpeedCaptureSession
## Brace expansion tip
All camera options start with `--camera-`, so if your shell supports it, you can
benefit from [brace expansion] (for example, it is supported by _bash_ and _zsh_):
```bash
scrcpy --video-source=camera --camera-{facing=back,ar=16:9,high-speed,fps=120}
```
This will be expanded as:
```bash
scrcpy --video-source=camera --camera-facing=back --camera-ar=16:9 --camera-high-speed --camera-fps=120
```
[brace expansion]: https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html
## Webcam
Combined with the [V4L2](v4l2.md) feature on Linux, the Android device camera
may be used as a webcam on the computer.

View File

@ -1,125 +0,0 @@
# Connection
## Selection
If exactly one device is connected (i.e. listed by `adb devices`), then it is
automatically selected.
However, if there are multiple devices connected, you must specify the one to
use in one of 4 ways:
- by its serial:
```bash
scrcpy --serial=0123456789abcdef
scrcpy -s 0123456789abcdef # short version
# the serial is the ip:port if connected over TCP/IP (same behavior as adb)
scrcpy --serial=192.168.1.1:5555
```
- the one connected over USB (if there is exactly one):
```bash
scrcpy --select-usb
scrcpy -d # short version
```
- the one connected over TCP/IP (if there is exactly one):
```bash
scrcpy --select-tcpip
scrcpy -e # short version
```
- a device already listening on TCP/IP (see [below](#tcpip-wireless)):
```bash
scrcpy --tcpip=192.168.1.1:5555
scrcpy --tcpip=192.168.1.1 # default port is 5555
```
The serial may also be provided via the environment variable `ANDROID_SERIAL`
(also used by `adb`):
```bash
# in bash
export ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```cmd
:: in cmd
set ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```powershell
# in PowerShell
$env:ANDROID_SERIAL = '0123456789abcdef'
scrcpy
```
## TCP/IP (wireless)
_Scrcpy_ uses `adb` to communicate with the device, and `adb` can [connect] to a
device over TCP/IP. The device must be connected on the same network as the
computer.
[connect]: https://developer.android.com/studio/command-line/adb.html#wireless
### Automatic
The `--tcpip` option allows the connection to be configured automatically. There are
two variants.
If the device (accessible at 192.168.1.1 in this example) already listens on a
port (typically 5555) for incoming _adb_ connections, then run:
```bash
scrcpy --tcpip=192.168.1.1 # default port is 5555
scrcpy --tcpip=192.168.1.1:5555
```
If _adb_ TCP/IP mode is disabled on the device (or if you don't know the IP
address), connect the device over USB, then run:
```bash
scrcpy --tcpip # without arguments
```
It will automatically find the device IP address and adb port, enable TCP/IP
mode if necessary, then connect to the device before starting.
### Manual
Alternatively, it is possible to enable the TCP/IP connection manually using
`adb`:
1. Plug the device into a USB port on your computer.
2. Connect the device to the same Wi-Fi network as your computer.
3. Get your device IP address, in Settings → About phone → Status, or by
executing this command:
```bash
adb shell ip route | awk '{print $9}'
```
4. Enable `adb` over TCP/IP on your device: `adb tcpip 5555`.
5. Unplug your device.
6. Connect to your device: `adb connect DEVICE_IP:5555` _(replace `DEVICE_IP`
with the device IP address you found)_.
7. Run `scrcpy` as usual.
8. Run `adb disconnect` once you're done.
Since Android 11, a [wireless debugging option][adb-wireless] removes the need to
physically connect your device directly to your computer.
[adb-wireless]: https://developer.android.com/studio/command-line/adb#wireless-android11-command-line
## Autostart
A small tool (by the scrcpy author) allows running arbitrary commands whenever a
new Android device is connected: [AutoAdb]. It can be used to start scrcpy:
```bash
autoadb scrcpy -s '{}'
```
[AutoAdb]: https://github.com/rom1v/autoadb

View File

@ -9,52 +9,16 @@ This application is composed of two parts:
The client is responsible to push the server to the device and start its The client is responsible to push the server to the device and start its
execution. execution.
The client and the server establish communication using separate sockets for Once the client and the server are connected to each other, the server initially
video, audio and controls. Any of them may be disabled (but not all), so sends device information (name and initial screen dimensions), then starts to
there are 1, 2 or 3 socket(s). send a raw H.264 video stream of the device screen. The client decodes the video
frames, and display them as soon as possible, without buffering, to minimize
latency. The client is not aware of the device rotation (which is handled by the
server), it just knows the dimensions of the video frames.
The server initially sends the device name on the first socket (it is used for The client captures relevant keyboard and mouse events, that it transmits to the
the scrcpy window title), then each socket is used for its own purpose. All server, which injects them to the device.
reads and writes are performed from a dedicated thread for each socket, both on
the client and on the server.
If video is enabled, then the server sends a raw video stream (H.264 by default)
of the device screen, with some additional headers for each packet. The client
decodes the video frames, and displays them as soon as possible, without
buffering (unless `--display-buffer=delay` is specified) to minimize latency.
The client is not aware of the device rotation (which is handled by the server),
it just knows the dimensions of the video frames it receives.
Similarly, if audio is enabled, then the server sends a raw audio stream (OPUS
by default) of the device audio output (or the microphone if
`--audio-source=mic` is specified), with some additional headers for each
packet. The client decodes the stream and attempts to keep latency minimal by
maintaining an average buffering level. The [blog post][scrcpy2] of the scrcpy v2.0
release gives more details about the audio feature.
If control is enabled, then the client captures relevant keyboard and mouse
events, that it transmits to the server, which injects them to the device. This
is the only socket which is used in both directions: input events are sent from
the client to the device, and when the device clipboard changes, the new content
is sent from the device to the client to support seamless copy-paste.
[scrcpy2]: https://blog.rom1v.com/2023/03/scrcpy-2-0-with-audio/
Note that the client-server roles are expressed at the application level:
- the server _serves_ video and audio streams, and handles requests from the
client,
- the client _controls_ the device through the server.
However, by default (when `--force-adb-forward` is not set), the roles are
reversed at the network level:
- the client opens a server socket and listens on a port before starting the
server,
- the server connects to the client.
This role inversion guarantees that the connection will not fail due to race
conditions, without the need for polling.
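A minimal sketch of this role inversion with POSIX sockets is shown below. The port, the socket order and the number of accepted connections are illustrative (they depend on the enabled streams and on the `adb` tunnel actually set up); error handling is omitted:

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

// Sketch only: the client listens locally, the device reaches this socket
// through an "adb reverse" tunnel, and the server then connects back.
int main(void) {
    int server = socket(AF_INET, SOCK_STREAM, 0);

    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(27183); // example port

    bind(server, (struct sockaddr *) &addr, sizeof(addr));
    listen(server, 3);

    // The client listens *before* executing the server on the device, so the
    // device-side connection cannot race against the listener.
    // (push the server and run it via adb here)

    int video_socket = accept(server, NULL, NULL);   // blocks until connected
    int audio_socket = accept(server, NULL, NULL);
    int control_socket = accept(server, NULL, NULL);

    // ... read the device name from the first socket, then start demuxing
    // and controlling ...

    close(control_socket);
    close(audio_socket);
    close(video_socket);
    close(server);
    return 0;
}
```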
## Server ## Server
@ -68,14 +32,15 @@ The server is a Java application (with a [`public static void main(String...
args)`][main] method), compiled against the Android framework, and executed as args)`][main] method), compiled against the Android framework, and executed as
`shell` on the Android device. `shell` on the Android device.
[main]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Server.java#L193 [main]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/Server.java#L123
To run such a Java application, the classes must be [_dexed_][dex] (typically, To run such a Java application, the classes must be [_dexed_][dex] (typically,
to `classes.dex`). If `my.package.MainClass` is the main class, compiled to to `classes.dex`). If `my.package.MainClass` is the main class, compiled to
`classes.dex`, pushed to the device in `/data/local/tmp`, then it can be run `classes.dex`, pushed to the device in `/data/local/tmp`, then it can be run
with: with:
adb shell CLASSPATH=/data/local/tmp/classes.dex app_process / my.package.MainClass adb shell CLASSPATH=/data/local/tmp/classes.dex \
app_process / my.package.MainClass
_The path `/data/local/tmp` is a good candidate to push the server, since it's _The path `/data/local/tmp` is a good candidate to push the server, since it's
readable and writable by `shell`, but not world-writable, so a malicious readable and writable by `shell`, but not world-writable, so a malicious
@ -84,7 +49,7 @@ application may not replace the server just before the client executes it._
Instead of a raw _dex_ file, `app_process` accepts a _jar_ containing Instead of a raw _dex_ file, `app_process` accepts a _jar_ containing
`classes.dex` (e.g. an [APK]). For simplicity, and to benefit from the gradle `classes.dex` (e.g. an [APK]). For simplicity, and to benefit from the gradle
build system, the server is built to an (unsigned) APK (renamed to build system, the server is built to an (unsigned) APK (renamed to
`scrcpy-server.jar`). `scrcpy-server`).
[dex]: https://en.wikipedia.org/wiki/Dalvik_(software) [dex]: https://en.wikipedia.org/wiki/Dalvik_(software)
[apk]: https://en.wikipedia.org/wiki/Android_application_package [apk]: https://en.wikipedia.org/wiki/Android_application_package
@ -100,77 +65,42 @@ They can be called using reflection though. The communication with hidden
components is provided by [_wrappers_ classes][wrappers] and [aidl]. components is provided by [_wrappers_ classes][wrappers] and [aidl].
[hidden]: https://stackoverflow.com/a/31908373/1987178 [hidden]: https://stackoverflow.com/a/31908373/1987178
[wrappers]: https://github.com/Genymobile/scrcpy/tree/master/server/src/main/java/com/genymobile/scrcpy/wrappers [wrappers]: https://github.com/Genymobile/scrcpy/tree/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/wrappers
[aidl]: https://github.com/Genymobile/scrcpy/tree/master/server/src/main/aidl [aidl]: https://github.com/Genymobile/scrcpy/tree/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/aidl/android/view
### Threading
### Execution The server uses 3 threads:
The server is started by the client basically by executing the following - the **main** thread, encoding and streaming the video to the client;
commands: - the **controller** thread, listening for _control messages_ (typically,
keyboard and mouse events) from the client;
- the **receiver** thread (managed by the controller), sending _device messages_
to the clients (currently, it is only used to send the device clipboard
content).
```bash Since the video encoding is typically hardware, there would be no benefit in
adb push scrcpy-server /data/local/tmp/scrcpy-server.jar encoding and streaming in two different threads.
adb forward tcp:27183 localabstract:scrcpy
adb shell CLASSPATH=/data/local/tmp/scrcpy-server.jar app_process / com.genymobile.scrcpy.Server 2.1
```
The first argument (`2.1` in the example) is the client scrcpy version. The
server fails if the client and the server do not have the exact same version.
The protocol between the client and the server may change from version to
version (see [protocol](#protocol) below), and there is no backward or forward
compatibility (there is no point in using different client and server versions).
This check allows detecting a misconfiguration (running an older or newer server
by mistake).
It is followed by any number of arguments, in the form of `key=value` pairs.
Their order is irrelevant. The possible keys and associated value types can be
found in the [server][server-options] and [client][client-options] code.
[server-options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Options.java#L181
[client-options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/server.c#L226
For example, if we execute `scrcpy -m1920 --no-audio`, then the server
execution will look like this:
```bash
# scid is a random number to identify different clients running on the same device
adb shell CLASSPATH=/data/local/tmp/scrcpy-server.jar app_process / com.genymobile.scrcpy.Server 2.1 scid=12345678 log_level=info audio=false max_size=1920
```
### Components
When executed, its [`main()`][main] method is executed (on the "main" thread).
It parses the arguments, establishes the connection with the client and starts
the other "components":
- the **video** streamer: it captures the video screen and sends encoded video
packets on the _video_ socket (from the _video_ thread).
- the **audio** streamer: it uses several threads to capture raw packets,
submit them to encoding and retrieve encoded packets, which it sends on the
_audio_ socket.
- the **controller**: it receives _control messages_ (typically input events)
on the _control_ socket from one thread, and sends _device messages_ (e.g. to
transmit the device clipboard content to the client) on the same _control
socket_ from another thread. Thus, the _control_ socket is used in both
directions (contrary to the _video_ and _audio_ sockets).
### Screen video encoding ### Screen video encoding
The encoding is managed by [`ScreenEncoder`]. The encoding is managed by [`ScreenEncoder`].
The video is encoded using the [`MediaCodec`] API. The codec encodes the content The video is encoded using the [`MediaCodec`] API. The codec takes its input
of a `Surface` associated to the display, and writes the encoding packets to the from a [surface] associated to the display, and writes the resulting H.264
client (on the _video_ socket). stream to the provided output stream (the socket connected to the client).
[`ScreenEncoder`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java [`ScreenEncoder`]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java
[`MediaCodec`]: https://developer.android.com/reference/android/media/MediaCodec.html [`MediaCodec`]: https://developer.android.com/reference/android/media/MediaCodec.html
[surface]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L68-L69
On device rotation (or folding), the encoding session is [reset] and restarted. On device [rotation], the codec, surface and display are reinitialized, and a
new video stream is produced.
New frames are produced only when changes occur on the surface. This avoids to New frames are produced only when changes occur on the surface. This is good
send unnecessary frames, but by default there might be drawbacks: because it avoids to send unnecessary frames, but there are drawbacks:
- it does not send any frame on start if the device screen does not change, - it does not send any frame on start if the device screen does not change,
- after fast motion changes, the last frame may have poor quality. - after fast motion changes, the last frame may have poor quality.
@ -178,24 +108,11 @@ send unnecessary frames, but by default there might be drawbacks:
Both problems are [solved][repeat] by the flag Both problems are [solved][repeat] by the flag
[`KEY_REPEAT_PREVIOUS_FRAME_AFTER`][repeat-flag]. [`KEY_REPEAT_PREVIOUS_FRAME_AFTER`][repeat-flag].
[reset]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L179
[rotation]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L90 [rotation]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L90
[repeat]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L246-L247 [repeat]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L147-L148
[repeat-flag]: https://developer.android.com/reference/android/media/MediaFormat.html#KEY_REPEAT_PREVIOUS_FRAME_AFTER [repeat-flag]: https://developer.android.com/reference/android/media/MediaFormat.html#KEY_REPEAT_PREVIOUS_FRAME_AFTER
### Audio encoding
Similarly, the audio is [captured] using an [`AudioRecord`], and [encoded] using
the [`MediaCodec`] asynchronous API.
More details are available on the [blog post][scrcpy2] introducing the audio feature.
[captured]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/AudioCapture.java
[encoded]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/AudioEncoder.java
[`AudioRecord`]: https://developer.android.com/reference/android/media/AudioRecord
### Input events injection ### Input events injection
_Control messages_ are received from the client by the [`Controller`] (run in a _Control messages_ are received from the client by the [`Controller`] (run in a
@ -207,13 +124,13 @@ separate thread). There are several types of input events:
- other commands (e.g. to switch the screen on or to copy the clipboard). - other commands (e.g. to switch the screen on or to copy the clipboard).
Some of them need to inject input events to the system. To do so, they use the Some of them need to inject input events to the system. To do so, they use the
_hidden_ method [`InputManager.injectInputEvent()`] (exposed by the _hidden_ method [`InputManager.injectInputEvent`] (exposed by our
[`InputManager` wrapper][inject-wrapper]). [`InputManager` wrapper][inject-wrapper]).
[`Controller`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Controller.java [`Controller`]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/Controller.java#L81
[`KeyEvent`]: https://developer.android.com/reference/android/view/KeyEvent.html [`KeyEvent`]: https://developer.android.com/reference/android/view/KeyEvent.html
[`MotionEvent`]: https://developer.android.com/reference/android/view/MotionEvent.html [`MotionEvent`]: https://developer.android.com/reference/android/view/MotionEvent.html
[`InputManager.injectInputEvent()`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L34 [`InputManager.injectInputEvent`]: https://android.googlesource.com/platform/frameworks/base/+/oreo-release/core/java/android/hardware/input/InputManager.java#857
[inject-wrapper]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L27 [inject-wrapper]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L27
@ -223,222 +140,126 @@ _hidden_ method [`InputManager.injectInputEvent()`] (exposed by the
The client relies on [SDL], which provides cross-platform API for UI, input The client relies on [SDL], which provides cross-platform API for UI, input
events, threading, etc. events, threading, etc.
The video and audio streams are decoded by [FFmpeg]. The video stream is decoded by [libav] (FFmpeg).
[SDL]: https://www.libsdl.org [SDL]: https://www.libsdl.org
[ffmpeg]: https://ffmpeg.org/ [libav]: https://www.libav.org/
### Initialization ### Initialization
The client parses the command line arguments, then [runs one of two code On startup, in addition to _libav_ and _SDL_ initialization, the client must
paths][run]: push and start the server on the device, and open two sockets (one for the video
- scrcpy in "normal" mode ([`scrcpy.c`]) stream, one for control) so that they may communicate.
- scrcpy in [OTG mode](hid-otg.md) ([`scrcpy_otg.c`])
[run]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/main.c#L81-L82 Note that the client-server roles are expressed at the application level:
[`scrcpy.c`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/scrcpy.c#L292-L293
[`scrcpy_otg.c`]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/usb/scrcpy_otg.c#L51-L52
In the remainder of this document, we assume that the "normal" mode is used - the server _serves_ video stream and handle requests from the client,
(read the code for the OTG mode). - the client _controls_ the device through the server.
On startup, the client: However, the roles are reversed at the network level:
- opens the _video_, _audio_ and _control_ sockets;
- pushes and starts the server on the device; - the client opens a server socket and listen on a port before starting the
- initializes its components (demuxers, decoders, recorder…). server,
- the server connects to the client.
This role inversion guarantees that the connection will not fail due to race
conditions, and avoids polling.
_(Note that over TCP/IP, the roles are not reversed, due to a bug in `adb
reverse`. See commit [1038bad] and [issue #5].)_
Once the server is connected, it sends the device information (name and initial
screen dimensions). Thus, the client may init the window and renderer, before
the first frame is available.
To minimize startup time, SDL initialization is performed while listening for
the connection from the server (see commit [90a46b4]).
[1038bad]: https://github.com/Genymobile/scrcpy/commit/1038bad3850f18717a048a4d5c0f8110e54ee172
[issue #5]: https://github.com/Genymobile/scrcpy/issues/5
[90a46b4]: https://github.com/Genymobile/scrcpy/commit/90a46b4c45637d083e877020d85ade52a9a5fa8e
### Video and audio streams ### Threading
Depending on the arguments passed to `scrcpy`, several components may be used. The client uses 4 threads:
Here is an overview of the video and audio components:
- the **main** thread, executing the SDL event loop,
- the **stream** thread, receiving the video and used for decoding and
recording,
- the **controller** thread, sending _control messages_ to the server,
- the **receiver** thread (managed by the controller), receiving _device
messages_ from the server.
In addition, another thread can be started if necessary to handle APK
installation or file push requests (via drag&drop on the main window) or to
print the framerate regularly in the console.
### Stream
The video [stream] is received from the socket (connected to the server on the
device) in a separate thread.
If a [decoder] is present (i.e. `--no-display` is not set), then it uses _libav_
to decode the H.264 stream from the socket, and notifies the main thread when a
new frame is available.
There are two [frames][video_buffer] simultaneously in memory:
- the **decoding** frame, written by the decoder from the decoder thread,
- the **rendering** frame, rendered in a texture from the main thread.
When a new decoded frame is available, the decoder _swaps_ the decoding and
rendering frame (with proper synchronization). Thus, it immediately starts
to decode a new frame while the main thread renders the last one.
If a [recorder] is present (i.e. `--record` is enabled), then it muxes the raw
H.264 packet to the output video file.
[stream]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/stream.h
[decoder]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/decoder.h
[video_buffer]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/video_buffer.h
[recorder]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/recorder.h
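The decoding/rendering frame swap described above can be sketched as follows (a
simplified Python model, not the actual `video_buffer` implementation; the
notification mechanism is reduced to a boolean flag):

```python
import threading

class VideoBuffer:
    # Two frames live side by side: the decoder writes one while the main
    # thread renders the other; a swap under a lock hands over each new frame.
    def __init__(self):
        self.decoding_frame = None
        self.rendering_frame = None
        self.lock = threading.Lock()
        self.frame_available = False

    def decoder_push(self, frame):
        # Decoder thread: publish the freshly decoded frame, then immediately
        # start decoding the next one into the previous rendering frame.
        with self.lock:
            self.decoding_frame, self.rendering_frame = self.rendering_frame, frame
            self.frame_available = True

    def renderer_take(self):
        # Main thread: grab the latest rendering frame when notified.
        with self.lock:
            self.frame_available = False
            return self.rendering_frame
```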
```
                               +----------+      +----------+
                          ---> |  decoder | ---> |  screen  |
             +---------+    /  +----------+      +----------+
 socket ---> | stream  | ----
             +---------+    \  +----------+
                          ---> | recorder |
                               +----------+
```

### Video and audio streams

Depending on the arguments passed to `scrcpy`, several components may be used.
Here is an overview of the video and audio components:

```
                                        V4L2 sink
                                       /
                               decoder
                              /        \
 VIDEO -------------> demuxer           display
                              \
                               recorder
                              /
 AUDIO -------------> demuxer
                              \
                               decoder --- audio player
```
The _demuxer_ is responsible to extract video and audio packets (read some
header, split the video stream into packets at correct boundaries, etc.).
The demuxed packets may be sent to a _decoder_ (one per stream, to produce
frames) and to a recorder (receiving both video and audio stream to record a
single file). The packets are encoded on the device (by `MediaCodec`), but when
recording, they are _muxed_ (asynchronously) into a container (MKV or MP4) on
the client side.
Video frames are sent to the screen/display to be rendered in the scrcpy window.
They may also be sent to a [V4L2 sink](v4l2.md).
Audio "frames" (an array of decoded samples) are sent to the audio player.
### Controller

The [controller] is responsible to send _control messages_ to the device. It
runs in a separate thread, to avoid I/O on the main thread.

On SDL event, received on the main thread, the [input manager][inputmanager]
creates appropriate [_control messages_][controlmsg]. It is responsible to
convert SDL events to Android events (using [convert]). It then pushes the
_control messages_ to a queue held by the controller. On its own thread, the
controller takes messages from the queue, which it serializes and sends to the
device.
[controller]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/controller.h
[controlmsg]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/control_msg.h
[inputmanager]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/input_manager.h
[convert]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/convert.h
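As an illustration of this producer/consumer design, here is a minimal Python
sketch (the one-byte message type and the payload format are made up for
illustration; the real serialization is defined in the client's control message
code):

```python
import queue
import socket
import struct
import threading

class Controller:
    """Sketch of the controller pattern described above."""

    def __init__(self, sock: socket.socket):
        self.sock = sock
        self.queue = queue.Queue()
        self.thread = threading.Thread(target=self._run, daemon=True)

    def start(self):
        self.thread.start()

    def push(self, msg_type: int, payload: bytes = b""):
        # Called from the main thread (e.g. by the input manager) for each
        # SDL event converted to an Android event.
        self.queue.put((msg_type, payload))

    def _run(self):
        # Controller thread: serialize and send, keeping I/O off the main thread.
        while True:
            msg_type, payload = self.queue.get()
            self.sock.sendall(struct.pack(">B", msg_type) + payload)
```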
### UI and event loop

Initialization, input events and rendering are all [managed][scrcpy] in the
main thread.

Events are handled in the [event loop], which either updates the [screen] or
delegates to the [input manager][inputmanager].

[scrcpy]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/scrcpy.c
[event loop]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/scrcpy.c#L201
[screen]: https://github.com/Genymobile/scrcpy/blob/ffe0417228fb78ab45b7ee4e202fc06fc8875bf3/app/src/screen.h

## Protocol

The protocol between the client and the server must be considered _internal_:
it may (and will) change at any time for any reason. Everything may change (the
number of sockets, the order in which the sockets must be opened, the data
format on the wire…) from version to version. A client must always be run with
a matching server version.

This section documents the current protocol in scrcpy v2.1.

### Connection

Firstly, the client sets up an adb tunnel:
```bash
# By default, a reverse redirection: the computer listens, the device connects
adb reverse localabstract:scrcpy_<SCID> tcp:27183
# As a fallback (or if --force-adb-forward is set), a forward redirection:
# the device listens, the computer connects
adb forward tcp:27183 localabstract:scrcpy_<SCID>
```
(`<SCID>` is a 31-bit random number, so that it does not fail when several
scrcpy instances start "at the same time" for the same device.)
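For illustration, the socket name could be generated like this (the 8-digit
hexadecimal formatting is an assumption about the current implementation, not a
documented guarantee):

```python
import random

scid = random.randrange(1 << 31)      # 31-bit random number
socket_name = f"scrcpy_{scid:08x}"    # e.g. "scrcpy_0d4157c3" (assumed hex format)
print(socket_name)
```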
Then, up to 3 sockets are opened, in that order:
- a _video_ socket
- an _audio_ socket
- a _control_ socket
Each one may be disabled (respectively by `--no-video`, `--no-audio` and
`--no-control`, directly or indirectly). For example, if `--no-audio` is set,
then the _video_ socket is opened first, then the _control_ socket.
On the _first_ socket opened (whichever it is), if the tunnel is _forward_, then
a [dummy byte] is sent from the device to the client. This allows to detect a
connection error (the client connection does not fail as long as there is an adb
forward redirection, even if nothing is listening on the device side).
Still on this _first_ socket, the device sends some [metadata][device meta] to
the client (currently only the device name, used as the window title, but there
might be other fields in the future).
[dummy byte]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L93
[device meta]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L151
You can read the [client][client-connection] and [server][server-connection]
code for more details.
[client-connection]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/app/src/server.c#L465-L466
[server-connection]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/DesktopConnection.java#L63
Then each socket is used for its intended purpose.
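For example, a third-party client connecting through a _forward_ tunnel could
start like this (a sketch only; the 1-byte dummy value and the 64-byte
device-name field match current scrcpy 2.x servers but are internal details,
not a stable API):

```python
import socket

def read_exact(sock, n):
    # Helper: loop until exactly n bytes have been received.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise EOFError("socket closed")
        buf += chunk
    return buf

# Beforehand: adb forward tcp:27183 localabstract:scrcpy_<SCID>
first = socket.create_connection(("127.0.0.1", 27183))
read_exact(first, 1)                 # dummy byte (forward tunnel only)
raw_name = read_exact(first, 64)     # device metadata: fixed-size device name (assumed 64 bytes)
print("device name:", raw_name.split(b"\x00", 1)[0].decode())
```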
### Video and audio
On the _video_ and _audio_ sockets, the device first sends some [codec
metadata]:
- On the _video_ socket, 12 bytes:
- the codec id (`u32`) (H264, H265 or AV1)
- the initial video width (`u32`)
- the initial video height (`u32`)
- On the _audio_ socket, 4 bytes:
- the codec id (`u32`) (OPUS, AAC or RAW)
[codec metadata]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Streamer.java#L33-L51
Then each packet produced by `MediaCodec` is sent, prefixed by a 12-byte [frame
header]:
- config packet flag (`u1`)
- key frame flag (`u1`)
- PTS (`u62`)
- packet size (`u32`)
Here is a schema describing the frame header:
```
[. . . . . . . .|. . . .]. . . . . . . . . . . . . . . ...
<-------------> <-----> <-----------------------------...
PTS packet raw packet
size
<--------------------->
frame header
The most significant bits of the PTS are used for packet flags:
byte 7 byte 6 byte 5 byte 4 byte 3 byte 2 byte 1 byte 0
CK...... ........ ........ ........ ........ ........ ........ ........
^^<------------------------------------------------------------------->
|| PTS
| `- key frame
`-- config packet
```
[frame header]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Streamer.java#L83
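Here is a small Python sketch parsing the codec metadata and the frame header
described above (big-endian byte order is assumed here, matching the Java
`ByteBuffer` defaults used by the server; treat the exact layout as
version-specific):

```python
import struct

def parse_video_codec_metadata(data: bytes):
    # Video socket: codec id, initial width, initial height (3 x u32).
    codec_id, width, height = struct.unpack(">III", data[:12])
    return codec_id, width, height

def parse_frame_header(header: bytes):
    # 12-byte frame header: u64 holding the flags and the 62-bit PTS,
    # followed by the u32 packet size.
    pts_and_flags, packet_size = struct.unpack(">QI", header)
    config = bool(pts_and_flags & (1 << 63))      # C: config packet flag
    key_frame = bool(pts_and_flags & (1 << 62))   # K: key frame flag
    pts = pts_and_flags & ((1 << 62) - 1)         # remaining 62 bits
    return config, key_frame, pts, packet_size
```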
### Controls
Control messages are sent via a custom binary protocol.

The only documentation for this protocol is the set of unit tests on both sides:

- `ControlMessage` (from client to device): [serialization](https://github.com/Genymobile/scrcpy/blob/master/app/tests/test_control_msg_serialize.c) | [deserialization](https://github.com/Genymobile/scrcpy/blob/master/server/src/test/java/com/genymobile/scrcpy/ControlMessageReaderTest.java)
- `DeviceMessage` (from device to client): [serialization](https://github.com/Genymobile/scrcpy/blob/master/server/src/test/java/com/genymobile/scrcpy/DeviceMessageWriterTest.java) | [deserialization](https://github.com/Genymobile/scrcpy/blob/master/app/tests/test_device_msg_deserialize.c)
## Standalone server
Although the server is designed to work for the scrcpy client, it can be used
with any client which uses the same protocol.
For simplicity, some [server-specific options] have been added to produce raw
streams easily:
- `send_device_meta=false`: disable the device metadata (in practice, the device
  name) sent on the _first_ socket
- `send_frame_meta=false`: disable the 12-byte header for each packet
- `send_dummy_byte`: disable the dummy byte sent on forward connections
- `send_codec_meta`: disable the codec information (and initial device size for
video)
- `raw_stream`: disable all the above
[server-specific options]: https://github.com/Genymobile/scrcpy/blob/a3cdf1a6b86ea22786e1f7d09b9c202feabc6949/server/src/main/java/com/genymobile/scrcpy/Options.java#L309-L329
Concretely, here is how to expose a raw H.264 stream on a TCP socket:
```bash
adb push scrcpy-server-v2.1 /data/local/tmp/scrcpy-server-manual.jar
adb forward tcp:1234 localabstract:scrcpy
adb shell CLASSPATH=/data/local/tmp/scrcpy-server-manual.jar \
app_process / com.genymobile.scrcpy.Server 2.1 \
tunnel_forward=true audio=false control=false cleanup=false \
raw_stream=true max_size=1920
```
As soon as a client connects over TCP on port 1234, the device will start
streaming the video. For example, VLC can play the video (although you will
experience a very high latency, more details [here][vlc-0latency]):
```
vlc -Idummy --demux=h264 --network-caching=0 tcp://localhost:1234
```
[vlc-0latency]: https://code.videolan.org/rom1v/vlc/-/merge_requests/20
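Alternatively, a few lines of Python are enough to dump the raw stream to a file
(a sketch assuming the `adb forward` and server invocation above; the resulting
`out.h264` is a raw H.264 elementary stream):

```python
import socket

# Connect to the forwarded port and write everything the device sends.
with socket.create_connection(("127.0.0.1", 1234)) as sock, open("out.h264", "wb") as out:
    while True:
        chunk = sock.recv(65536)
        if not chunk:
            break
        out.write(chunk)
```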
## Hack
@ -1,9 +1,156 @@
# Device
## Selection
If exactly one device is connected (i.e. listed by `adb devices`), then it is
automatically selected.
However, if there are multiple devices connected, you must specify the one to
use in one of 4 ways:
- by its serial:
```bash
scrcpy --serial=0123456789abcdef
scrcpy -s 0123456789abcdef # short version
# the serial is the ip:port if connected over TCP/IP (same behavior as adb)
scrcpy --serial=192.168.1.1:5555
```
- the one connected over USB (if there is exactly one):
```bash
scrcpy --select-usb
scrcpy -d # short version
```
- the one connected over TCP/IP (if there is exactly one):
```bash
scrcpy --select-tcpip
scrcpy -e # short version
```
- a device already listening on TCP/IP (see [below](#tcpip-wireless)):
```bash
scrcpy --tcpip=192.168.1.1:5555
scrcpy --tcpip=192.168.1.1 # default port is 5555
```
The serial may also be provided via the environment variable `ANDROID_SERIAL`
(also used by `adb`):
```bash
# in bash
export ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```cmd
:: in cmd
set ANDROID_SERIAL=0123456789abcdef
scrcpy
```
```powershell
# in PowerShell
$env:ANDROID_SERIAL = '0123456789abcdef'
scrcpy
```
## TCP/IP (wireless)
_Scrcpy_ uses `adb` to communicate with the device, and `adb` can [connect] to a
device over TCP/IP. The device must be connected on the same network as the
computer.
[connect]: https://developer.android.com/studio/command-line/adb.html#wireless
### Automatic
An option `--tcpip` allows to configure the connection automatically. There are
two variants.
If the device (accessible at 192.168.1.1 in this example) already listens on a
port (typically 5555) for incoming _adb_ connections, then run:
```bash
scrcpy --tcpip=192.168.1.1 # default port is 5555
scrcpy --tcpip=192.168.1.1:5555
```
If _adb_ TCP/IP mode is disabled on the device (or if you don't know the IP
address), connect the device over USB, then run:
```bash
scrcpy --tcpip # without arguments
```
It will automatically find the device IP address and adb port, enable TCP/IP
mode if necessary, then connect to the device before starting.
### Manual
Alternatively, it is possible to enable the TCP/IP connection manually using
`adb`:
1. Plug the device into a USB port on your computer.
2. Connect the device to the same Wi-Fi network as your computer.
3. Get your device IP address, in Settings → About phone → Status, or by
executing this command:
```bash
adb shell ip route | awk '{print $9}'
```
4. Enable `adb` over TCP/IP on your device: `adb tcpip 5555`.
5. Unplug your device.
6. Connect to your device: `adb connect DEVICE_IP:5555` _(replace `DEVICE_IP`
with the device IP address you found)_.
7. Run `scrcpy` as usual.
8. Run `adb disconnect` once you're done.
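The manual steps above can also be scripted; here is a rough sketch using plain
`adb` calls (error handling and the reconnection delay after `adb tcpip` are
glossed over):

```python
import re
import subprocess

def sh(*args):
    return subprocess.run(args, capture_output=True, text=True, check=True).stdout

# Step 3: find the device IP address (same information as Settings > About phone).
route = sh("adb", "shell", "ip", "route")
match = re.search(r"src (\d+\.\d+\.\d+\.\d+)", route)
if match is None:
    raise RuntimeError("could not find the device IP address")
ip = match.group(1)

# Steps 4 and 6: enable adb over TCP/IP, then connect to the device.
sh("adb", "tcpip", "5555")
sh("adb", "connect", f"{ip}:5555")

# Step 7: run scrcpy against that device.
subprocess.run(["scrcpy", "-s", f"{ip}:5555"], check=True)
```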
Since Android 11, a [wireless debugging option][adb-wireless] allows to bypass
having to physically connect your device directly to your computer.
[adb-wireless]: https://developer.android.com/studio/command-line/adb#wireless-android11-command-line
## Autostart
A small tool (by the scrcpy author) allows to run arbitrary commands whenever a
new Android device is connected: [AutoAdb]. It can be used to start scrcpy:
```bash
autoadb scrcpy -s '{}'
```
[AutoAdb]: https://github.com/rom1v/autoadb
## Display
If several displays are available on the Android device, it is possible to
select the display to mirror:
```bash
scrcpy --display=1
```
The list of display ids can be retrieved by:
```bash
scrcpy --list-displays
```
A secondary display may only be controlled if the device runs at least Android
10 (otherwise it is mirrored as read-only).
## Actions
Some command line arguments perform actions on the device itself while scrcpy is
running.

## Stay awake

To prevent the device from sleeping after a delay **when the device is plugged
in**:
@ -19,7 +166,7 @@ If the device is not plugged in (i.e. only connected over TCP/IP),
`--stay-awake` has no effect (this is the Android behavior).

## Turn screen off

It is possible to turn the device screen off while mirroring on start with a
command-line option:
@ -47,7 +194,7 @@ scrcpy -Sw # short version
```

## Show touches

For presentations, it may be useful to show physical touches (on the physical
device). Android exposes this feature in _Developers options_.
@ -63,7 +210,7 @@ scrcpy -t # short version
Note that it only shows _physical_ touches (by a finger on the device).

## Power off on close

To turn the device screen off when closing _scrcpy_:
@ -71,10 +218,11 @@ To turn the device screen off when closing _scrcpy_:
scrcpy --power-off-on-close
```

## Power on on start

By default, on start, the device is powered on. To prevent this behavior:

```bash
scrcpy --no-power-on
```
@ -106,7 +106,3 @@ scrcpy --otg # keyboard and mouse
Like `--hid-keyboard` and `--hid-mouse`, it only works if the device is
connected over USB.
## HID/OTG issues on Windows
See [FAQ](/FAQ.md#hidotg-issues-on-windows).
@ -9,10 +9,12 @@ Scrcpy is packaged in several distributions and package managers:
- Debian/Ubuntu: `apt install scrcpy`
- Arch Linux: `pacman -S scrcpy`
- Fedora: `dnf copr enable zeno/scrcpy && dnf install scrcpy`
- Gentoo: `emerge scrcpy` (also available as an [ebuild][ebuild-link] file)
- Snap: `snap install scrcpy`
- … (see [repology](https://repology.org/project/scrcpy/versions))
[ebuild-link]: https://github.com/maggu2810/maggu2810-overlay/tree/master/app-mobilephone/scrcpy
### Latest version

However, the packaged version is not always the latest release. To install the
@ -18,9 +18,7 @@ To record only the audio:
```bash
scrcpy --no-video --record=file.opus
scrcpy --no-video --audio-codec=aac --record=file.aac
scrcpy --no-video --audio-codec=flac --record=file.flac
scrcpy --no-video --audio-codec=raw --record=file.wav
# .m4a/.mp4 and .mka/.mkv are also supported for opus, aac and flac
```
Timestamps are captured on the device, so [packet delay variation] does not
@ -33,29 +31,20 @@ course, not if you capture your scrcpy window and audio output on the computer).
## Format

The video and audio streams are encoded on the device, but are muxed on the
client side. Several formats (containers) are supported:

- MP4 (`.mp4`, `.m4a`, `.aac`)
- Matroska (`.mkv`, `.mka`)
- OPUS (`.opus`)
- FLAC (`.flac`)
- WAV (`.wav`)

The container is automatically selected based on the filename.

It is also possible to explicitly select a container (in that case the filename
need not end with a known extension):

```
scrcpy --record=file --record-format=mkv
```
## Rotation
The video can be recorded rotated. See [video
orientation](video.md#orientation).
## No playback

To disable playback while recording:
@ -26,12 +26,10 @@ _<kbd>[Super]</kbd> is typically the <kbd>Windows</kbd> or <kbd>Cmd</kbd> key._
| Switch fullscreen mode | <kbd>MOD</kbd>+<kbd>f</kbd>
| Rotate display left | <kbd>MOD</kbd>+<kbd>←</kbd> _(left)_
| Rotate display right | <kbd>MOD</kbd>+<kbd>→</kbd> _(right)_
| Flip display horizontally | <kbd>MOD</kbd>+<kbd>Shift</kbd>+<kbd>←</kbd> _(left)_ \| <kbd>MOD</kbd>+<kbd>Shift</kbd>+<kbd>→</kbd> _(right)_
| Flip display vertically | <kbd>MOD</kbd>+<kbd>Shift</kbd>+<kbd>↑</kbd> _(up)_ \| <kbd>MOD</kbd>+<kbd>Shift</kbd>+<kbd>↓</kbd> _(down)_
| Resize window to 1:1 (pixel-perfect) | <kbd>MOD</kbd>+<kbd>g</kbd>
| Resize window to remove black borders | <kbd>MOD</kbd>+<kbd>w</kbd> \| _Double-left-click¹_
| Click on `HOME` | <kbd>MOD</kbd>+<kbd>h</kbd> \| _Middle-click_
| Click on `BACK` | <kbd>MOD</kbd>+<kbd>b</kbd> \| <kbd>MOD</kbd>+<kbd>Backspace</kbd> \| _Right-click²_
| Click on `APP_SWITCH` | <kbd>MOD</kbd>+<kbd>s</kbd> \| _4th-click³_
| Click on `MENU` (unlock screen)⁴ | <kbd>MOD</kbd>+<kbd>m</kbd>
| Click on `VOLUME_UP` | <kbd>MOD</kbd>+<kbd>↑</kbd> _(up)_
@ -21,13 +21,6 @@ This will create a new video device in `/dev/videoN`, where `N` is an integer
(more [options](https://github.com/umlaeute/v4l2loopback#options) are available
to create several devices or devices with specific IDs).
If you encounter problems detecting your device with Chrome/WebRTC, you can try
`exclusive_caps` mode:
```
sudo modprobe v4l2loopback exclusive_caps=1
```
To list the enabled devices:

```bash
@ -1,14 +1,5 @@
# Video
## Source
By default, scrcpy mirrors the device screen.
It is possible to capture the device camera instead.
See the dedicated [camera](camera.md) page.
## Size

By default, scrcpy attempts to mirror at the Android device resolution.
@ -30,7 +21,7 @@ If encoding fails, scrcpy automatically tries again with a lower definition
## Bit rate

The default video bit rate is 8 Mbps. To change it:

```bash
scrcpy --video-bit-rate=2M
@ -75,14 +66,6 @@ scrcpy --video-codec=av1
H265 may provide better quality, but H264 should provide lower latency.
AV1 encoders are not common on current Android devices.
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--video-codec-options` in the manpage or in `scrcpy --help`.
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat
## Encoder
Several encoders may be available on the device. They can be listed by:

```bash
@ -96,51 +79,45 @@ try another one:
scrcpy --video-codec=h264 --video-encoder='OMX.qcom.video.encoder.avc'
```
For advanced usage, to pass arbitrary parameters to the [`MediaFormat`],
check `--video-codec-options` in the manpage or in `scrcpy --help`.
[`MediaFormat`]: https://developer.android.com/reference/android/media/MediaFormat

## Orientation

The orientation may be applied at 3 different levels:

- The [shortcut](shortcuts.md) <kbd>MOD</kbd>+<kbd>r</kbd> requests the
  device to switch between portrait and landscape (the current running app may
  refuse, if it does not support the requested orientation).
- `--lock-video-orientation` changes the mirroring orientation (the orientation
  of the video sent from the device to the computer). This affects the
  recording.
- `--orientation` is applied on the client side, and affects display and
  recording. For the display, it can be changed dynamically using
  [shortcuts](shortcuts.md).

To lock the mirroring orientation (on the capture side):

```bash
scrcpy --lock-video-orientation     # initial (current) orientation
scrcpy --lock-video-orientation=0   # natural orientation
scrcpy --lock-video-orientation=90  # 90° clockwise
scrcpy --lock-video-orientation=180 # 180°
scrcpy --lock-video-orientation=270 # 270° clockwise
```

To orient the video (on the rendering side):

```bash
scrcpy --orientation=0
scrcpy --orientation=90       # 90° clockwise
scrcpy --orientation=180      # 180°
scrcpy --orientation=270      # 270° clockwise
scrcpy --orientation=flip0    # hflip
scrcpy --orientation=flip90   # hflip + 90° clockwise
scrcpy --orientation=flip180  # vflip (hflip + 180°)
scrcpy --orientation=flip270  # hflip + 270° clockwise
```
The orientation can be set separately for display and record if necessary, via
`--display-orientation` and `--record-orientation`.
The rotation is applied to a recorded file by writing a display transformation
to the MP4 or MKV target file. Flipping is not supported, so only the first 4
values are allowed when recording.
## Crop

The device screen may be cropped to mirror only part of the screen.
@ -157,25 +134,6 @@ phone, landscape for a tablet).
If `--max-size` is also specified, resizing is applied after cropping.
## Display
If several displays are available on the Android device, it is possible to
select the display to mirror:
```bash
scrcpy --display-id=1
```
The list of display ids can be retrieved by:
```bash
scrcpy --list-displays
```
A secondary display may only be controlled if the device runs at least Android
10 (otherwise it is mirrored as read-only).
## Buffering

By default, there is no video buffering, to get the lowest possible latency.
@ -4,24 +4,18 @@
Download the [latest release]:

- [`scrcpy-win64-v2.3.1.zip`][direct-win64] (64-bit)
  <sub>SHA-256: `f1f78ac98214078425804e524a1bed515b9d4b8a05b78d210a4ced2b910b262d`</sub>
- [`scrcpy-win32-v2.3.1.zip`][direct-win32] (32-bit)
  <sub>SHA-256: `5dffc2d432e9b8b5b0e16f12e71428c37c70d9124cfbe7620df0b41b7efe91ff`</sub>

[latest release]: https://github.com/Genymobile/scrcpy/releases/latest
[direct-win64]: https://github.com/Genymobile/scrcpy/releases/download/v2.3.1/scrcpy-win64-v2.3.1.zip
[direct-win32]: https://github.com/Genymobile/scrcpy/releases/download/v2.3.1/scrcpy-win32-v2.3.1.zip

and extract it.

Alternatively, you could install it from a package manager, like [Winget]:
```bash
winget install scrcpy
```
or [Chocolatey]:
```bash
choco install scrcpy
@ -36,7 +30,6 @@ scoop install scrcpy
scoop install adb # if you don't have it yet
```
[Winget]: https://github.com/microsoft/winget-cli
[Chocolatey]: https://chocolatey.org/
[Scoop]: https://scoop.sh
@ -1,5 +1,5 @@
distributionBase=GRADLE_USER_HOME distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.4-bin.zip distributionUrl=https\://services.gradle.org/distributions/gradle-7.5-all.zip
zipStoreBase=GRADLE_USER_HOME zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists zipStorePath=wrapper/dists
@ -2,8 +2,8 @@
set -e set -e
BUILDDIR=build-auto BUILDDIR=build-auto
PREBUILT_SERVER_URL=https://github.com/Genymobile/scrcpy/releases/download/v2.3.1/scrcpy-server-v2.3.1 PREBUILT_SERVER_URL=https://github.com/Genymobile/scrcpy/releases/download/v2.0/scrcpy-server-v2.0
PREBUILT_SERVER_SHA256=f6814822fc308a7a532f253485c9038183c6296a6c5df470a9e383b4f8e7605b PREBUILT_SERVER_SHA256=9e241615f578cd690bb43311000debdecf6a9c50a7082b001952f18f6f21ddc2
echo "[scrcpy] Downloading prebuilt server..." echo "[scrcpy] Downloading prebuilt server..."
wget "$PREBUILT_SERVER_URL" -O scrcpy-server wget "$PREBUILT_SERVER_URL" -O scrcpy-server
@ -1,5 +1,5 @@
project('scrcpy', 'c', project('scrcpy', 'c',
version: '2.3.1', version: '2.0',
meson_version: '>= 0.48', meson_version: '>= 0.48',
default_options: [ default_options: [
'c_std=c11', 'c_std=c11',
@ -7,8 +7,6 @@ project('scrcpy', 'c',
'b_ndebug=if-release', 'b_ndebug=if-release',
]) ])
add_project_arguments('-Wmissing-prototypes', language: 'c')
if get_option('compile_app') if get_option('compile_app')
subdir('app') subdir('app')
endif endif
@ -69,62 +69,58 @@ prepare-deps:
@app/prebuilt-deps/prepare-libusb.sh @app/prebuilt-deps/prepare-libusb.sh
build-win32: prepare-deps build-win32: prepare-deps
rm -rf "$(WIN32_BUILD_DIR)" [ -d "$(WIN32_BUILD_DIR)" ] || ( mkdir "$(WIN32_BUILD_DIR)" && \
mkdir -p "$(WIN32_BUILD_DIR)/local" meson setup "$(WIN32_BUILD_DIR)" \
cp -r app/prebuilt-deps/data/ffmpeg-6.1-scrcpy-3/win32/. "$(WIN32_BUILD_DIR)/local/" --cross-file cross_win32.txt \
cp -r app/prebuilt-deps/data/SDL2-2.28.5/i686-w64-mingw32/. "$(WIN32_BUILD_DIR)/local/" --buildtype release --strip -Db_lto=true \
cp -r app/prebuilt-deps/data/libusb-1.0.26/libusb-MinGW-Win32/. "$(WIN32_BUILD_DIR)/local/" -Dcompile_server=false \
meson setup "$(WIN32_BUILD_DIR)" \ -Dportable=true )
--pkg-config-path="$(WIN32_BUILD_DIR)/local/lib/pkgconfig" \
-Dc_args="-I$(PWD)/$(WIN32_BUILD_DIR)/local/include" \
-Dc_link_args="-L$(PWD)/$(WIN32_BUILD_DIR)/local/lib" \
--cross-file=cross_win32.txt \
--buildtype=release --strip -Db_lto=true \
-Dcompile_server=false \
-Dportable=true
ninja -C "$(WIN32_BUILD_DIR)" ninja -C "$(WIN32_BUILD_DIR)"
build-win64: prepare-deps build-win64: prepare-deps
rm -rf "$(WIN64_BUILD_DIR)" [ -d "$(WIN64_BUILD_DIR)" ] || ( mkdir "$(WIN64_BUILD_DIR)" && \
mkdir -p "$(WIN64_BUILD_DIR)/local" meson setup "$(WIN64_BUILD_DIR)" \
cp -r app/prebuilt-deps/data/ffmpeg-6.1-scrcpy-3/win64/. "$(WIN64_BUILD_DIR)/local/" --cross-file cross_win64.txt \
cp -r app/prebuilt-deps/data/SDL2-2.28.5/x86_64-w64-mingw32/. "$(WIN64_BUILD_DIR)/local/" --buildtype release --strip -Db_lto=true \
cp -r app/prebuilt-deps/data/libusb-1.0.26/libusb-MinGW-x64/. "$(WIN64_BUILD_DIR)/local/" -Dcompile_server=false \
meson setup "$(WIN64_BUILD_DIR)" \ -Dportable=true )
--pkg-config-path="$(WIN64_BUILD_DIR)/local/lib/pkgconfig" \
-Dc_args="-I$(PWD)/$(WIN64_BUILD_DIR)/local/include" \
-Dc_link_args="-L$(PWD)/$(WIN64_BUILD_DIR)/local/lib" \
--cross-file=cross_win64.txt \
--buildtype=release --strip -Db_lto=true \
-Dcompile_server=false \
-Dportable=true
ninja -C "$(WIN64_BUILD_DIR)" ninja -C "$(WIN64_BUILD_DIR)"
dist-win32: build-server build-win32 dist-win32: build-server build-win32
mkdir -p "$(DIST)/$(WIN32_TARGET_DIR)" mkdir -p "$(DIST)/$(WIN32_TARGET_DIR)"
cp "$(SERVER_BUILD_DIR)"/server/scrcpy-server "$(DIST)/$(WIN32_TARGET_DIR)/" cp "$(SERVER_BUILD_DIR)"/server/scrcpy-server "$(DIST)/$(WIN32_TARGET_DIR)/"
cp "$(WIN32_BUILD_DIR)"/app/scrcpy.exe "$(DIST)/$(WIN32_TARGET_DIR)/" cp "$(WIN32_BUILD_DIR)"/app/scrcpy.exe "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/data/scrcpy-console.bat "$(DIST)/$(WIN32_TARGET_DIR)/" cp app/data/scrcpy-console.bat "$(DIST)/$(WIN32_TARGET_DIR)"
cp app/data/scrcpy-noconsole.vbs "$(DIST)/$(WIN32_TARGET_DIR)/" cp app/data/scrcpy-noconsole.vbs "$(DIST)/$(WIN32_TARGET_DIR)"
cp app/data/icon.png "$(DIST)/$(WIN32_TARGET_DIR)/" cp app/data/icon.png "$(DIST)/$(WIN32_TARGET_DIR)"
cp app/data/open_a_terminal_here.bat "$(DIST)/$(WIN32_TARGET_DIR)/" cp app/data/open_a_terminal_here.bat "$(DIST)/$(WIN32_TARGET_DIR)"
cp app/prebuilt-deps/data/platform-tools-34.0.5/adb.exe "$(DIST)/$(WIN32_TARGET_DIR)/" cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/avutil-58.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/AdbWinApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/" cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/avcodec-60.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/AdbWinUsbApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/" cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/avformat-60.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp "$(WIN32_BUILD_DIR)"/local/bin/*.dll "$(DIST)/$(WIN32_TARGET_DIR)/" cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win32/bin/swresample-4.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/adb.exe "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/AdbWinApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/AdbWinUsbApi.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/SDL2-2.26.4/i686-w64-mingw32/bin/SDL2.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
cp app/prebuilt-deps/data/libusb-1.0.26/libusb-MinGW-Win32/bin/msys-usb-1.0.dll "$(DIST)/$(WIN32_TARGET_DIR)/"
dist-win64: build-server build-win64 dist-win64: build-server build-win64
mkdir -p "$(DIST)/$(WIN64_TARGET_DIR)" mkdir -p "$(DIST)/$(WIN64_TARGET_DIR)"
cp "$(SERVER_BUILD_DIR)"/server/scrcpy-server "$(DIST)/$(WIN64_TARGET_DIR)/" cp "$(SERVER_BUILD_DIR)"/server/scrcpy-server "$(DIST)/$(WIN64_TARGET_DIR)/"
cp "$(WIN64_BUILD_DIR)"/app/scrcpy.exe "$(DIST)/$(WIN64_TARGET_DIR)/" cp "$(WIN64_BUILD_DIR)"/app/scrcpy.exe "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/data/scrcpy-console.bat "$(DIST)/$(WIN64_TARGET_DIR)/" cp app/data/scrcpy-console.bat "$(DIST)/$(WIN64_TARGET_DIR)"
cp app/data/scrcpy-noconsole.vbs "$(DIST)/$(WIN64_TARGET_DIR)/" cp app/data/scrcpy-noconsole.vbs "$(DIST)/$(WIN64_TARGET_DIR)"
cp app/data/icon.png "$(DIST)/$(WIN64_TARGET_DIR)/" cp app/data/icon.png "$(DIST)/$(WIN64_TARGET_DIR)"
cp app/data/open_a_terminal_here.bat "$(DIST)/$(WIN64_TARGET_DIR)/" cp app/data/open_a_terminal_here.bat "$(DIST)/$(WIN64_TARGET_DIR)"
cp app/prebuilt-deps/data/platform-tools-34.0.5/adb.exe "$(DIST)/$(WIN64_TARGET_DIR)/" cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/avutil-58.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/AdbWinApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/" cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/avcodec-60.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.5/AdbWinUsbApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/" cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/avformat-60.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp "$(WIN64_BUILD_DIR)"/local/bin/*.dll "$(DIST)/$(WIN64_TARGET_DIR)/" cp app/prebuilt-deps/data/ffmpeg-6.0-scrcpy-4/win64/bin/swresample-4.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/adb.exe "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/AdbWinApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/platform-tools-34.0.1/AdbWinUsbApi.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/SDL2-2.26.4/x86_64-w64-mingw32/bin/SDL2.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
cp app/prebuilt-deps/data/libusb-1.0.26/libusb-MinGW-x64/bin/msys-usb-1.0.dll "$(DIST)/$(WIN64_TARGET_DIR)/"
zip-win32: dist-win32 zip-win32: dist-win32
cd "$(DIST)"; \ cd "$(DIST)"; \
@ -2,13 +2,13 @@ apply plugin: 'com.android.application'
android { android {
namespace 'com.genymobile.scrcpy' namespace 'com.genymobile.scrcpy'
compileSdk 34 compileSdkVersion 33
defaultConfig { defaultConfig {
applicationId "com.genymobile.scrcpy" applicationId "com.genymobile.scrcpy"
minSdkVersion 21 minSdkVersion 21
targetSdkVersion 34 targetSdkVersion 33
versionCode 20301 versionCode 20000
versionName "2.3.1" versionName "2.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner" testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
} }
buildTypes { buildTypes {
@ -17,10 +17,6 @@ android {
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro' proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
} }
} }
buildFeatures {
buildConfig true
aidl true
}
} }
dependencies { dependencies {
@ -12,10 +12,10 @@
set -e set -e
SCRCPY_DEBUG=false SCRCPY_DEBUG=false
SCRCPY_VERSION_NAME=2.3.1 SCRCPY_VERSION_NAME=2.0
PLATFORM=${ANDROID_PLATFORM:-34} PLATFORM=${ANDROID_PLATFORM:-33}
BUILD_TOOLS=${ANDROID_BUILD_TOOLS:-34.0.0} BUILD_TOOLS=${ANDROID_BUILD_TOOLS:-33.0.0}
BUILD_TOOLS_DIR="$ANDROID_HOME/build-tools/$BUILD_TOOLS" BUILD_TOOLS_DIR="$ANDROID_HOME/build-tools/$BUILD_TOOLS"
BUILD_DIR="$(realpath ${BUILD_DIR:-build_manual})" BUILD_DIR="$(realpath ${BUILD_DIR:-build_manual})"
@ -48,7 +48,6 @@ cd "$SERVER_DIR/src/main/aidl"
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" android/view/IRotationWatcher.aidl "$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" android/view/IRotationWatcher.aidl
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" \ "$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" \
android/content/IOnPrimaryClipChangedListener.aidl android/content/IOnPrimaryClipChangedListener.aidl
"$BUILD_TOOLS_DIR/aidl" -o"$GEN_DIR" android/view/IDisplayFoldListener.aidl
echo "Compiling java sources..." echo "Compiling java sources..."
cd ../java cd ../java
@ -11,8 +11,6 @@ public interface AsyncProcessor {
} }
void start(TerminationListener listener); void start(TerminationListener listener);
void stop(); void stop();
void join() throws InterruptedException; void join() throws InterruptedException;

@ -24,19 +24,11 @@ public final class AudioCapture {
public static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT; public static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT;
public static final int BYTES_PER_SAMPLE = 2; public static final int BYTES_PER_SAMPLE = 2;
// Never read more than 1024 samples, even if the buffer is bigger (that would increase latency).
// A lower value is useless, since the system captures audio samples by blocks of 1024 (so for example if we read by blocks of 256 samples, we
// receive 4 successive blocks without waiting, then we wait for the 4 next ones).
public static final int MAX_READ_SIZE = 1024 * CHANNELS * BYTES_PER_SAMPLE;
private static final long ONE_SAMPLE_US = (1000000 + SAMPLE_RATE - 1) / SAMPLE_RATE; // 1 sample in microseconds (used for fixing PTS)
private final int audioSource; private final int audioSource;
private AudioRecord recorder; private AudioRecord recorder;
private final AudioTimestamp timestamp = new AudioTimestamp(); private final AudioTimestamp timestamp = new AudioTimestamp();
private long previousRecorderTimestamp = -1;
private long previousPts = 0; private long previousPts = 0;
private long nextPts = 0; private long nextPts = 0;
@ -44,6 +36,10 @@ public final class AudioCapture {
this.audioSource = audioSource.value(); this.audioSource = audioSource.value();
} }
public static int millisToBytes(int millis) {
return SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE * millis / 1000;
}
private static AudioFormat createAudioFormat() { private static AudioFormat createAudioFormat() {
AudioFormat.Builder builder = new AudioFormat.Builder(); AudioFormat.Builder builder = new AudioFormat.Builder();
builder.setEncoding(ENCODING); builder.setEncoding(ENCODING);
@ -122,7 +118,7 @@ public final class AudioCapture {
if (Build.VERSION.SDK_INT == Build.VERSION_CODES.R) { if (Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
startWorkaroundAndroid11(); startWorkaroundAndroid11();
try { try {
tryStartRecording(5, 100); tryStartRecording(3, 100);
} finally { } finally {
stopWorkaroundAndroid11(); stopWorkaroundAndroid11();
} }
@ -139,8 +135,8 @@ public final class AudioCapture {
} }
@TargetApi(Build.VERSION_CODES.N) @TargetApi(Build.VERSION_CODES.N)
public int read(ByteBuffer directBuffer, MediaCodec.BufferInfo outBufferInfo) { public int read(ByteBuffer directBuffer, int size, MediaCodec.BufferInfo outBufferInfo) {
int r = recorder.read(directBuffer, MAX_READ_SIZE); int r = recorder.read(directBuffer, size);
if (r <= 0) { if (r <= 0) {
return r; return r;
} }
@ -148,9 +144,8 @@ public final class AudioCapture {
long pts; long pts;
int ret = recorder.getTimestamp(timestamp, AudioTimestamp.TIMEBASE_MONOTONIC); int ret = recorder.getTimestamp(timestamp, AudioTimestamp.TIMEBASE_MONOTONIC);
if (ret == AudioRecord.SUCCESS && timestamp.nanoTime != previousRecorderTimestamp) { if (ret == AudioRecord.SUCCESS) {
pts = timestamp.nanoTime / 1000; pts = timestamp.nanoTime / 1000;
previousRecorderTimestamp = timestamp.nanoTime;
} else { } else {
if (nextPts == 0) { if (nextPts == 0) {
Ln.w("Could not get any audio timestamp"); Ln.w("Could not get any audio timestamp");
@ -162,13 +157,13 @@ public final class AudioCapture {
long durationUs = r * 1000000 / (CHANNELS * BYTES_PER_SAMPLE * SAMPLE_RATE); long durationUs = r * 1000000 / (CHANNELS * BYTES_PER_SAMPLE * SAMPLE_RATE);
nextPts = pts + durationUs; nextPts = pts + durationUs;
if (previousPts != 0 && pts < previousPts + ONE_SAMPLE_US) { if (previousPts != 0 && pts < previousPts) {
// Audio PTS may come from two sources: // Audio PTS may come from two sources:
// - recorder.getTimestamp() if the call works; // - recorder.getTimestamp() if the call works;
// - an estimation from the previous PTS and the packet size as a fallback. // - an estimation from the previous PTS and the packet size as a fallback.
// //
// Therefore, the property that PTS are monotonically increasing is no guaranteed in corner cases, so enforce it. // Therefore, the property that PTS are monotonically increasing is no guaranteed in corner cases, so enforce it.
pts = previousPts + ONE_SAMPLE_US; pts = previousPts + 1;
} }
previousPts = pts; previousPts = pts;
@ -5,7 +5,6 @@ import android.media.MediaFormat;
public enum AudioCodec implements Codec { public enum AudioCodec implements Codec {
OPUS(0x6f_70_75_73, "opus", MediaFormat.MIMETYPE_AUDIO_OPUS), OPUS(0x6f_70_75_73, "opus", MediaFormat.MIMETYPE_AUDIO_OPUS),
AAC(0x00_61_61_63, "aac", MediaFormat.MIMETYPE_AUDIO_AAC), AAC(0x00_61_61_63, "aac", MediaFormat.MIMETYPE_AUDIO_AAC),
FLAC(0x66_6c_61_63, "flac", MediaFormat.MIMETYPE_AUDIO_FLAC),
RAW(0x00_72_61_77, "raw", MediaFormat.MIMETYPE_AUDIO_RAW); RAW(0x00_72_61_77, "raw", MediaFormat.MIMETYPE_AUDIO_RAW);
private final int id; // 4-byte ASCII representation of the name private final int id; // 4-byte ASCII representation of the name
@ -37,6 +37,9 @@ public final class AudioEncoder implements AsyncProcessor {
private static final int SAMPLE_RATE = AudioCapture.SAMPLE_RATE; private static final int SAMPLE_RATE = AudioCapture.SAMPLE_RATE;
private static final int CHANNELS = AudioCapture.CHANNELS; private static final int CHANNELS = AudioCapture.CHANNELS;
private static final int READ_MS = 5; // milliseconds
private static final int READ_SIZE = AudioCapture.millisToBytes(READ_MS);
private final AudioCapture capture; private final AudioCapture capture;
private final Streamer streamer; private final Streamer streamer;
private final int bitRate; private final int bitRate;
@ -90,7 +93,7 @@ public final class AudioEncoder implements AsyncProcessor {
while (!Thread.currentThread().isInterrupted()) { while (!Thread.currentThread().isInterrupted()) {
InputTask task = inputTasks.take(); InputTask task = inputTasks.take();
ByteBuffer buffer = mediaCodec.getInputBuffer(task.index); ByteBuffer buffer = mediaCodec.getInputBuffer(task.index);
int r = capture.read(buffer, bufferInfo); int r = capture.read(buffer, READ_SIZE, bufferInfo);
if (r <= 0) { if (r <= 0) {
throw new IOException("Could not read audio: " + r); throw new IOException("Could not read audio: " + r);
} }
@ -295,7 +298,7 @@ public final class AudioEncoder implements AsyncProcessor {
} }
} }
private final class EncoderCallback extends MediaCodec.Callback { private class EncoderCallback extends MediaCodec.Callback {
@TargetApi(Build.VERSION_CODES.N) @TargetApi(Build.VERSION_CODES.N)
@Override @Override

@ -13,6 +13,9 @@ public final class AudioRawRecorder implements AsyncProcessor {
private Thread thread; private Thread thread;
private static final int READ_MS = 5; // milliseconds
private static final int READ_SIZE = AudioCapture.millisToBytes(READ_MS);
public AudioRawRecorder(AudioCapture capture, Streamer streamer) { public AudioRawRecorder(AudioCapture capture, Streamer streamer) {
this.capture = capture; this.capture = capture;
this.streamer = streamer; this.streamer = streamer;
@ -25,22 +28,16 @@ public final class AudioRawRecorder implements AsyncProcessor {
return; return;
} }
final ByteBuffer buffer = ByteBuffer.allocateDirect(AudioCapture.MAX_READ_SIZE); final ByteBuffer buffer = ByteBuffer.allocateDirect(READ_SIZE);
final MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo(); final MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
try { try {
try { capture.start();
capture.start();
} catch (Throwable t) {
// Notify the client that the audio could not be captured
streamer.writeDisableStream(false);
throw t;
}
streamer.writeAudioHeader(); streamer.writeAudioHeader();
while (!Thread.currentThread().isInterrupted()) { while (!Thread.currentThread().isInterrupted()) {
buffer.position(0); buffer.position(0);
int r = capture.read(buffer, bufferInfo); int r = capture.read(buffer, READ_SIZE, bufferInfo);
if (r < 0) { if (r < 0) {
throw new IOException("Could not read audio: " + r); throw new IOException("Could not read audio: " + r);
} }
@ -48,11 +45,10 @@ public final class AudioRawRecorder implements AsyncProcessor {
streamer.writePacket(buffer, bufferInfo); streamer.writePacket(buffer, bufferInfo);
} }
} catch (IOException e) { } catch (Throwable e) {
// Broken pipe is expected on close, because the socket is closed by the client // Notify the client that the audio could not be captured
if (!IO.isBrokenPipe(e)) { streamer.writeDisableStream(false);
Ln.e("Audio capture error", e); throw e;
}
} finally { } finally {
capture.stop(); capture.stop();
} }
@ -66,8 +62,8 @@ public final class AudioRawRecorder implements AsyncProcessor {
record(); record();
} catch (AudioCaptureForegroundException e) { } catch (AudioCaptureForegroundException e) {
// Do not print stack trace, a user-friendly error-message has already been logged // Do not print stack trace, a user-friendly error-message has already been logged
} catch (Throwable t) { } catch (IOException e) {
Ln.e("Audio recording error", t); Ln.e("Audio recording error", e);
fatalError = true; fatalError = true;
} finally { } finally {

@ -1,37 +0,0 @@
package com.genymobile.scrcpy;
public final class CameraAspectRatio {
private static final float SENSOR = -1;
private float ar;
private CameraAspectRatio(float ar) {
this.ar = ar;
}
public static CameraAspectRatio fromFloat(float ar) {
if (ar < 0) {
throw new IllegalArgumentException("Invalid aspect ratio: " + ar);
}
return new CameraAspectRatio(ar);
}
public static CameraAspectRatio fromFraction(int w, int h) {
if (w <= 0 || h <= 0) {
throw new IllegalArgumentException("Invalid aspect ratio: " + w + ":" + h);
}
return new CameraAspectRatio((float) w / h);
}
public static CameraAspectRatio sensorAspectRatio() {
return new CameraAspectRatio(SENSOR);
}
public boolean isSensor() {
return ar == SENSOR;
}
public float getAspectRatio() {
return ar;
}
}
@ -1,351 +0,0 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ServiceManager;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.graphics.Rect;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraConstrainedHighSpeedCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureFailure;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.OutputConfiguration;
import android.hardware.camera2.params.SessionConfiguration;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.MediaCodec;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Range;
import android.view.Surface;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;
import java.util.Optional;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.stream.Stream;
public class CameraCapture extends SurfaceCapture {
private final String explicitCameraId;
private final CameraFacing cameraFacing;
private final Size explicitSize;
private int maxSize;
private final CameraAspectRatio aspectRatio;
private final int fps;
private final boolean highSpeed;
private String cameraId;
private Size size;
private HandlerThread cameraThread;
private Handler cameraHandler;
private CameraDevice cameraDevice;
private Executor cameraExecutor;
private final AtomicBoolean disconnected = new AtomicBoolean();
public CameraCapture(String explicitCameraId, CameraFacing cameraFacing, Size explicitSize, int maxSize, CameraAspectRatio aspectRatio, int fps,
boolean highSpeed) {
this.explicitCameraId = explicitCameraId;
this.cameraFacing = cameraFacing;
this.explicitSize = explicitSize;
this.maxSize = maxSize;
this.aspectRatio = aspectRatio;
this.fps = fps;
this.highSpeed = highSpeed;
}
@Override
public void init() throws IOException {
cameraThread = new HandlerThread("camera");
cameraThread.start();
cameraHandler = new Handler(cameraThread.getLooper());
cameraExecutor = new HandlerExecutor(cameraHandler);
try {
cameraId = selectCamera(explicitCameraId, cameraFacing);
if (cameraId == null) {
throw new IOException("No matching camera found");
}
size = selectSize(cameraId, explicitSize, maxSize, aspectRatio, highSpeed);
if (size == null) {
throw new IOException("Could not select camera size");
}
Ln.i("Using camera '" + cameraId + "'");
cameraDevice = openCamera(cameraId);
} catch (CameraAccessException | InterruptedException e) {
throw new IOException(e);
}
}
private static String selectCamera(String explicitCameraId, CameraFacing cameraFacing) throws CameraAccessException {
if (explicitCameraId != null) {
return explicitCameraId;
}
CameraManager cameraManager = ServiceManager.getCameraManager();
String[] cameraIds = cameraManager.getCameraIdList();
if (cameraFacing == null) {
// Use the first one
return cameraIds.length > 0 ? cameraIds[0] : null;
}
for (String cameraId : cameraIds) {
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
int facing = characteristics.get(CameraCharacteristics.LENS_FACING);
if (cameraFacing.value() == facing) {
return cameraId;
}
}
// Not found
return null;
}
@TargetApi(Build.VERSION_CODES.N)
private static Size selectSize(String cameraId, Size explicitSize, int maxSize, CameraAspectRatio aspectRatio, boolean highSpeed)
throws CameraAccessException {
if (explicitSize != null) {
return explicitSize;
}
CameraManager cameraManager = ServiceManager.getCameraManager();
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
android.util.Size[] sizes = highSpeed ? configs.getHighSpeedVideoSizes() : configs.getOutputSizes(MediaCodec.class);
Stream<android.util.Size> stream = Arrays.stream(sizes);
if (maxSize > 0) {
stream = stream.filter(it -> it.getWidth() <= maxSize && it.getHeight() <= maxSize);
}
Float targetAspectRatio = resolveAspectRatio(aspectRatio, characteristics);
if (targetAspectRatio != null) {
stream = stream.filter(it -> {
float ar = ((float) it.getWidth() / it.getHeight());
float arRatio = ar / targetAspectRatio;
// Accept if the aspect ratio is the target aspect ratio + or - 10%
return arRatio >= 0.9f && arRatio <= 1.1f;
});
}
Optional<android.util.Size> selected = stream.max((s1, s2) -> {
// Greater width is better
int cmp = Integer.compare(s1.getWidth(), s2.getWidth());
if (cmp != 0) {
return cmp;
}
if (targetAspectRatio != null) {
// Closer to the target aspect ratio is better
float ar1 = ((float) s1.getWidth() / s1.getHeight());
float arRatio1 = ar1 / targetAspectRatio;
float distance1 = Math.abs(1 - arRatio1);
float ar2 = ((float) s2.getWidth() / s2.getHeight());
float arRatio2 = ar2 / targetAspectRatio;
float distance2 = Math.abs(1 - arRatio2);
// Reverse the order because lower distance is better
cmp = Float.compare(distance2, distance1);
if (cmp != 0) {
return cmp;
}
}
// Greater height is better
return Integer.compare(s1.getHeight(), s2.getHeight());
});
if (selected.isPresent()) {
android.util.Size size = selected.get();
return new Size(size.getWidth(), size.getHeight());
}
// Not found
return null;
}
private static Float resolveAspectRatio(CameraAspectRatio ratio, CameraCharacteristics characteristics) {
if (ratio == null) {
return null;
}
if (ratio.isSensor()) {
Rect activeSize = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
return (float) activeSize.width() / activeSize.height();
}
return ratio.getAspectRatio();
}
@Override
public void start(Surface surface) throws IOException {
try {
CameraCaptureSession session = createCaptureSession(cameraDevice, surface);
CaptureRequest request = createCaptureRequest(surface);
setRepeatingRequest(session, request);
} catch (CameraAccessException | InterruptedException e) {
throw new IOException(e);
}
}
@Override
public void release() {
if (cameraDevice != null) {
cameraDevice.close();
}
if (cameraThread != null) {
cameraThread.quitSafely();
}
}
@Override
public Size getSize() {
return size;
}
@Override
public boolean setMaxSize(int maxSize) {
if (explicitSize != null) {
return false;
}
this.maxSize = maxSize;
try {
size = selectSize(cameraId, null, maxSize, aspectRatio, highSpeed);
return size != null;
} catch (CameraAccessException e) {
Ln.w("Could not select camera size", e);
return false;
}
}
@SuppressLint("MissingPermission")
@TargetApi(Build.VERSION_CODES.S)
private CameraDevice openCamera(String id) throws CameraAccessException, InterruptedException {
CompletableFuture<CameraDevice> future = new CompletableFuture<>();
ServiceManager.getCameraManager().openCamera(id, new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
Ln.d("Camera opened successfully");
future.complete(camera);
}
@Override
public void onDisconnected(CameraDevice camera) {
Ln.w("Camera disconnected");
disconnected.set(true);
requestReset();
}
@Override
public void onError(CameraDevice camera, int error) {
int cameraAccessExceptionErrorCode;
switch (error) {
case CameraDevice.StateCallback.ERROR_CAMERA_IN_USE:
cameraAccessExceptionErrorCode = CameraAccessException.CAMERA_IN_USE;
break;
case CameraDevice.StateCallback.ERROR_MAX_CAMERAS_IN_USE:
cameraAccessExceptionErrorCode = CameraAccessException.MAX_CAMERAS_IN_USE;
break;
case CameraDevice.StateCallback.ERROR_CAMERA_DISABLED:
cameraAccessExceptionErrorCode = CameraAccessException.CAMERA_DISABLED;
break;
case CameraDevice.StateCallback.ERROR_CAMERA_DEVICE:
case CameraDevice.StateCallback.ERROR_CAMERA_SERVICE:
default:
cameraAccessExceptionErrorCode = CameraAccessException.CAMERA_ERROR;
break;
}
future.completeExceptionally(new CameraAccessException(cameraAccessExceptionErrorCode));
}
}, cameraHandler);
try {
return future.get();
} catch (ExecutionException e) {
throw (CameraAccessException) e.getCause();
}
}
@TargetApi(Build.VERSION_CODES.S)
private CameraCaptureSession createCaptureSession(CameraDevice camera, Surface surface) throws CameraAccessException, InterruptedException {
CompletableFuture<CameraCaptureSession> future = new CompletableFuture<>();
OutputConfiguration outputConfig = new OutputConfiguration(surface);
List<OutputConfiguration> outputs = Arrays.asList(outputConfig);
int sessionType = highSpeed ? SessionConfiguration.SESSION_HIGH_SPEED : SessionConfiguration.SESSION_REGULAR;
SessionConfiguration sessionConfig = new SessionConfiguration(sessionType, outputs, cameraExecutor, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
future.complete(session);
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
future.completeExceptionally(new CameraAccessException(CameraAccessException.CAMERA_ERROR));
}
});
camera.createCaptureSession(sessionConfig);
try {
return future.get();
} catch (ExecutionException e) {
throw (CameraAccessException) e.getCause();
}
}
private CaptureRequest createCaptureRequest(Surface surface) throws CameraAccessException {
CaptureRequest.Builder requestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
requestBuilder.addTarget(surface);
if (fps > 0) {
requestBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, new Range<>(fps, fps));
}
return requestBuilder.build();
}
@TargetApi(Build.VERSION_CODES.S)
private void setRepeatingRequest(CameraCaptureSession session, CaptureRequest request) throws CameraAccessException, InterruptedException {
CameraCaptureSession.CaptureCallback callback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(CameraCaptureSession session, CaptureRequest request, long timestamp, long frameNumber) {
// Called for each frame captured, do nothing
}
@Override
public void onCaptureFailed(CameraCaptureSession session, CaptureRequest request, CaptureFailure failure) {
Ln.w("Camera capture failed: frame " + failure.getFrameNumber());
}
};
if (highSpeed) {
CameraConstrainedHighSpeedCaptureSession highSpeedSession = (CameraConstrainedHighSpeedCaptureSession) session;
List<CaptureRequest> requests = highSpeedSession.createHighSpeedRequestList(request);
highSpeedSession.setRepeatingBurst(requests, callback, cameraHandler);
} else {
session.setRepeatingRequest(request, callback, cameraHandler);
}
}
@Override
public boolean isClosed() {
return disconnected.get();
}
}
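The openCamera() and createCaptureSession() methods above follow the same pattern: Camera2 only exposes callback-based APIs, so each call completes a CompletableFuture from the callback and blocks on it, rethrowing the original error unwrapped from the ExecutionException. A framework-free sketch of that bridge (the AsyncApi interface and all names are hypothetical):

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;

public class CallbackToBlockingSketch {

    // Hypothetical callback-based API, standing in for CameraManager.openCamera()
    interface AsyncApi {
        void open(Callback callback);

        interface Callback {
            void onOpened(String resource);
            void onError(Exception error);
        }
    }

    // Bridge: complete a future from the callback, then block until it resolves
    static String openBlocking(AsyncApi api) throws Exception {
        CompletableFuture<String> future = new CompletableFuture<>();
        api.open(new AsyncApi.Callback() {
            @Override
            public void onOpened(String resource) {
                future.complete(resource);
            }

            @Override
            public void onError(Exception error) {
                // Propagate the failure to whoever is blocked on get()
                future.completeExceptionally(error);
            }
        });
        try {
            return future.get();
        } catch (ExecutionException e) {
            // Unwrap so the caller sees the original error, as the server does for CameraAccessException
            throw (Exception) e.getCause();
        }
    }

    public static void main(String[] args) throws Exception {
        // Fake async API that "opens" immediately on another thread
        AsyncApi api = callback -> new Thread(() -> callback.onOpened("camera-0")).start();
        System.out.println(openBlocking(api)); // camera-0
    }
}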

View File

@ -1,33 +0,0 @@
package com.genymobile.scrcpy;
import android.annotation.SuppressLint;
import android.hardware.camera2.CameraCharacteristics;
public enum CameraFacing {
FRONT("front", CameraCharacteristics.LENS_FACING_FRONT),
BACK("back", CameraCharacteristics.LENS_FACING_BACK),
@SuppressLint("InlinedApi") // introduced in API 23
EXTERNAL("external", CameraCharacteristics.LENS_FACING_EXTERNAL);
private final String name;
private final int value;
CameraFacing(String name, int value) {
this.name = name;
this.value = value;
}
int value() {
return value;
}
static CameraFacing findByName(String name) {
for (CameraFacing facing : CameraFacing.values()) {
if (name.equals(facing.name)) {
return facing;
}
}
return null;
}
}
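findByName() maps the camera_facing option string back to an enum constant, returning null so the caller can report unsupported values itself. A short usage sketch (the error message is illustrative):

public class CameraFacingLookupSketch {

    enum CameraFacing {
        FRONT("front"), BACK("back"), EXTERNAL("external");

        private final String name;

        CameraFacing(String name) {
            this.name = name;
        }

        static CameraFacing findByName(String name) {
            for (CameraFacing facing : values()) {
                if (facing.name.equals(name)) {
                    return facing;
                }
            }
            return null;
        }
    }

    public static void main(String[] args) {
        // Mirrors how the option parser reacts to an unknown value
        CameraFacing facing = CameraFacing.findByName("back");
        if (facing == null) {
            throw new IllegalArgumentException("Camera facing not supported");
        }
        System.out.println(facing); // BACK
    }
}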

View File

@ -14,6 +14,8 @@ import java.io.IOException;
*/ */
public final class CleanUp { public final class CleanUp {
public static final String SERVER_PATH = "/data/local/tmp/scrcpy-server.jar";
// A simple struct to be passed from the main process to the cleanup process // A simple struct to be passed from the main process to the cleanup process
public static class Config implements Parcelable { public static class Config implements Parcelable {
@ -133,13 +135,13 @@ public final class CleanUp {
String[] cmd = {"app_process", "/", CleanUp.class.getName(), config.toBase64()}; String[] cmd = {"app_process", "/", CleanUp.class.getName(), config.toBase64()};
ProcessBuilder builder = new ProcessBuilder(cmd); ProcessBuilder builder = new ProcessBuilder(cmd);
builder.environment().put("CLASSPATH", Server.SERVER_PATH); builder.environment().put("CLASSPATH", SERVER_PATH);
builder.start(); builder.start();
} }
public static void unlinkSelf() { public static void unlinkSelf() {
try { try {
new File(Server.SERVER_PATH).delete(); new File(SERVER_PATH).delete();
} catch (Exception e) { } catch (Exception e) {
Ln.e("Could not unlink server", e); Ln.e("Could not unlink server", e);
} }

View File

@ -318,8 +318,9 @@ public class Controller implements AsyncProcessor {
} }
} }
MotionEvent event = MotionEvent.obtain(lastTouchDown, now, action, pointerCount, pointerProperties, pointerCoords, 0, buttons, 1f, 1f, MotionEvent event = MotionEvent
DEFAULT_DEVICE_ID, 0, source, 0); .obtain(lastTouchDown, now, action, pointerCount, pointerProperties, pointerCoords, 0, buttons, 1f, 1f, DEFAULT_DEVICE_ID, 0, source,
0);
return device.injectEvent(event, Device.INJECT_MODE_ASYNC); return device.injectEvent(event, Device.INJECT_MODE_ASYNC);
} }
@ -340,8 +341,9 @@ public class Controller implements AsyncProcessor {
coords.setAxisValue(MotionEvent.AXIS_HSCROLL, hScroll); coords.setAxisValue(MotionEvent.AXIS_HSCROLL, hScroll);
coords.setAxisValue(MotionEvent.AXIS_VSCROLL, vScroll); coords.setAxisValue(MotionEvent.AXIS_VSCROLL, vScroll);
MotionEvent event = MotionEvent.obtain(lastTouchDown, now, MotionEvent.ACTION_SCROLL, 1, pointerProperties, pointerCoords, 0, buttons, 1f, 1f, MotionEvent event = MotionEvent
DEFAULT_DEVICE_ID, 0, InputDevice.SOURCE_MOUSE, 0); .obtain(lastTouchDown, now, MotionEvent.ACTION_SCROLL, 1, pointerProperties, pointerCoords, 0, buttons, 1f, 1f, DEFAULT_DEVICE_ID, 0,
InputDevice.SOURCE_MOUSE, 0);
return device.injectEvent(event, Device.INJECT_MODE_ASYNC); return device.injectEvent(event, Device.INJECT_MODE_ASYNC);
} }

View File

@ -64,6 +64,8 @@ public final class DesktopConnection implements Closeable {
throws IOException { throws IOException {
String socketName = getSocketName(scid); String socketName = getSocketName(scid);
LocalSocket firstSocket = null;
LocalSocket videoSocket = null; LocalSocket videoSocket = null;
LocalSocket audioSocket = null; LocalSocket audioSocket = null;
LocalSocket controlSocket = null; LocalSocket controlSocket = null;
@ -72,28 +74,24 @@ public final class DesktopConnection implements Closeable {
try (LocalServerSocket localServerSocket = new LocalServerSocket(socketName)) { try (LocalServerSocket localServerSocket = new LocalServerSocket(socketName)) {
if (video) { if (video) {
videoSocket = localServerSocket.accept(); videoSocket = localServerSocket.accept();
if (sendDummyByte) { firstSocket = videoSocket;
// send one byte so the client may read() to detect a connection error
videoSocket.getOutputStream().write(0);
sendDummyByte = false;
}
} }
if (audio) { if (audio) {
audioSocket = localServerSocket.accept(); audioSocket = localServerSocket.accept();
if (sendDummyByte) { if (firstSocket == null) {
// send one byte so the client may read() to detect a connection error firstSocket = audioSocket;
audioSocket.getOutputStream().write(0);
sendDummyByte = false;
} }
} }
if (control) { if (control) {
controlSocket = localServerSocket.accept(); controlSocket = localServerSocket.accept();
if (sendDummyByte) { if (firstSocket == null) {
// send one byte so the client may read() to detect a connection error firstSocket = controlSocket;
controlSocket.getOutputStream().write(0);
sendDummyByte = false;
} }
} }
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
firstSocket.getOutputStream().write(0);
}
} }
} else { } else {
if (video) { if (video) {
@ -132,29 +130,20 @@ public final class DesktopConnection implements Closeable {
return controlSocket; return controlSocket;
} }
public void shutdown() throws IOException { public void close() throws IOException {
if (videoSocket != null) { if (videoSocket != null) {
videoSocket.shutdownInput(); videoSocket.shutdownInput();
videoSocket.shutdownOutput(); videoSocket.shutdownOutput();
videoSocket.close();
} }
if (audioSocket != null) { if (audioSocket != null) {
audioSocket.shutdownInput(); audioSocket.shutdownInput();
audioSocket.shutdownOutput(); audioSocket.shutdownOutput();
audioSocket.close();
} }
if (controlSocket != null) { if (controlSocket != null) {
controlSocket.shutdownInput(); controlSocket.shutdownInput();
controlSocket.shutdownOutput(); controlSocket.shutdownOutput();
}
}
public void close() throws IOException {
if (videoSocket != null) {
videoSocket.close();
}
if (audioSocket != null) {
audioSocket.close();
}
if (controlSocket != null) {
controlSocket.close(); controlSocket.close();
} }
} }

View File

@ -1,7 +1,6 @@
package com.genymobile.scrcpy; package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ClipboardManager; import com.genymobile.scrcpy.wrappers.ClipboardManager;
import com.genymobile.scrcpy.wrappers.DisplayControl;
import com.genymobile.scrcpy.wrappers.InputManager; import com.genymobile.scrcpy.wrappers.InputManager;
import com.genymobile.scrcpy.wrappers.ServiceManager; import com.genymobile.scrcpy.wrappers.ServiceManager;
import com.genymobile.scrcpy.wrappers.SurfaceControl; import com.genymobile.scrcpy.wrappers.SurfaceControl;
@ -12,8 +11,8 @@ import android.graphics.Rect;
import android.os.Build; import android.os.Build;
import android.os.IBinder; import android.os.IBinder;
import android.os.SystemClock; import android.os.SystemClock;
import android.view.IDisplayFoldListener;
import android.view.IRotationWatcher; import android.view.IRotationWatcher;
import android.view.IDisplayFoldListener;
import android.view.InputDevice; import android.view.InputDevice;
import android.view.InputEvent; import android.view.InputEvent;
import android.view.KeyCharacterMap; import android.view.KeyCharacterMap;
@ -100,32 +99,25 @@ public final class Device {
} }
}, displayId); }, displayId);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) { ServiceManager.getWindowManager().registerDisplayFoldListener(new IDisplayFoldListener.Stub() {
ServiceManager.getWindowManager().registerDisplayFoldListener(new IDisplayFoldListener.Stub() { @Override
@Override public void onDisplayFoldChanged(int displayId, boolean folded) {
public void onDisplayFoldChanged(int displayId, boolean folded) { synchronized (Device.this) {
if (Device.this.displayId != displayId) { DisplayInfo displayInfo = ServiceManager.getDisplayManager().getDisplayInfo(displayId);
// Ignore events related to other display ids if (displayInfo == null) {
Ln.e("Display " + displayId + " not found\n" + LogUtils.buildDisplayListMessage());
return; return;
} }
synchronized (Device.this) { screenInfo = ScreenInfo.computeScreenInfo(displayInfo.getRotation(), displayInfo.getSize(), options.getCrop(),
DisplayInfo displayInfo = ServiceManager.getDisplayManager().getDisplayInfo(displayId); options.getMaxSize(), options.getLockVideoOrientation());
if (displayInfo == null) { // notify
Ln.e("Display " + displayId + " not found\n" + LogUtils.buildDisplayListMessage()); if (foldListener != null) {
return; foldListener.onFoldChanged(displayId, folded);
}
screenInfo = ScreenInfo.computeScreenInfo(displayInfo.getRotation(), displayInfo.getSize(), options.getCrop(),
options.getMaxSize(), options.getLockVideoOrientation());
// notify
if (foldListener != null) {
foldListener.onFoldChanged(displayId, folded);
}
} }
} }
}); }
} });
if (options.getControl() && options.getClipboardAutosync()) { if (options.getControl() && options.getClipboardAutosync()) {
// If control and autosync are enabled, synchronize Android clipboard to the computer automatically // If control and autosync are enabled, synchronize Android clipboard to the computer automatically
@ -164,10 +156,6 @@ public final class Device {
} }
} }
public int getDisplayId() {
return displayId;
}
public synchronized void setMaxSize(int newMaxSize) { public synchronized void setMaxSize(int newMaxSize) {
maxSize = newMaxSize; maxSize = newMaxSize;
screenInfo = ScreenInfo.computeScreenInfo(screenInfo.getReverseVideoRotation(), deviceSize, crop, newMaxSize, lockVideoOrientation); screenInfo = ScreenInfo.computeScreenInfo(screenInfo.getReverseVideoRotation(), deviceSize, crop, newMaxSize, lockVideoOrientation);
@ -320,12 +308,8 @@ public final class Device {
*/ */
public static boolean setScreenPowerMode(int mode) { public static boolean setScreenPowerMode(int mode) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) { if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
// On Android 14, these internal methods have been moved to DisplayControl
boolean useDisplayControl =
Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE && !SurfaceControl.hasPhysicalDisplayIdsMethod();
// Change the power mode for all physical displays // Change the power mode for all physical displays
long[] physicalDisplayIds = useDisplayControl ? DisplayControl.getPhysicalDisplayIds() : SurfaceControl.getPhysicalDisplayIds(); long[] physicalDisplayIds = SurfaceControl.getPhysicalDisplayIds();
if (physicalDisplayIds == null) { if (physicalDisplayIds == null) {
Ln.e("Could not get physical display ids"); Ln.e("Could not get physical display ids");
return false; return false;
@ -333,8 +317,7 @@ public final class Device {
boolean allOk = true; boolean allOk = true;
for (long physicalDisplayId : physicalDisplayIds) { for (long physicalDisplayId : physicalDisplayIds) {
IBinder binder = useDisplayControl ? DisplayControl.getPhysicalDisplayToken( IBinder binder = SurfaceControl.getPhysicalDisplayToken(physicalDisplayId);
physicalDisplayId) : SurfaceControl.getPhysicalDisplayToken(physicalDisplayId);
allOk &= SurfaceControl.setDisplayPowerMode(binder, mode); allOk &= SurfaceControl.setDisplayPowerMode(binder, mode);
} }
return allOk; return allOk;

View File

@ -51,7 +51,6 @@ public final class DeviceMessageSender {
} }
} }
} }
public void start() { public void start() {
thread = new Thread(() -> { thread = new Thread(() -> {
try { try {

View File

@ -1,5 +1,7 @@
package com.genymobile.scrcpy; package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ActivityThread;
import android.annotation.TargetApi; import android.annotation.TargetApi;
import android.content.AttributionSource; import android.content.AttributionSource;
import android.content.Context; import android.content.Context;
@ -7,6 +9,8 @@ import android.content.ContextWrapper;
import android.os.Build; import android.os.Build;
import android.os.Process; import android.os.Process;
import java.lang.reflect.Method;
public final class FakeContext extends ContextWrapper { public final class FakeContext extends ContextWrapper {
public static final String PACKAGE_NAME = "com.android.shell"; public static final String PACKAGE_NAME = "com.android.shell";
@ -14,12 +18,25 @@ public final class FakeContext extends ContextWrapper {
private static final FakeContext INSTANCE = new FakeContext(); private static final FakeContext INSTANCE = new FakeContext();
private static Context retrieveSystemContext() {
try {
Class<?> activityThreadClass = ActivityThread.getActivityThreadClass();
Object activityThread = ActivityThread.getActivityThread();
Method getSystemContextMethod = activityThreadClass.getDeclaredMethod("getSystemContext");
return (Context) getSystemContextMethod.invoke(activityThread);
} catch (Exception e) {
Ln.e("Cannot retrieve system context", e);
return null;
}
}
public static FakeContext get() { public static FakeContext get() {
return INSTANCE; return INSTANCE;
} }
private FakeContext() { private FakeContext() {
super(Workarounds.getSystemContext()); super(retrieveSystemContext());
} }
@Override @Override
@ -45,9 +62,4 @@ public final class FakeContext extends ContextWrapper {
public int getDeviceId() { public int getDeviceId() {
return 0; return 0;
} }
@Override
public Context getApplicationContext() {
return this;
}
} }

View File

@ -1,23 +0,0 @@
package com.genymobile.scrcpy;
import android.os.Handler;
import java.util.concurrent.Executor;
import java.util.concurrent.RejectedExecutionException;
// Inspired from hidden android.os.HandlerExecutor
public class HandlerExecutor implements Executor {
private final Handler handler;
public HandlerExecutor(Handler handler) {
this.handler = handler;
}
@Override
public void execute(Runnable command) {
if (!handler.post(command)) {
throw new RejectedExecutionException(handler + " is shutting down");
}
}
}
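HandlerExecutor adapts an android.os.Handler to java.util.concurrent.Executor, so APIs that accept an Executor (such as SessionConfiguration above) can have their callbacks dispatched on a dedicated HandlerThread. An Android-side usage sketch, assuming an environment where a HandlerThread can be started:

import android.os.Handler;
import android.os.HandlerThread;

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executor;
import java.util.concurrent.RejectedExecutionException;

public class HandlerExecutorUsageSketch {

    // Same idea as the HandlerExecutor above: post() to the handler, reject if its looper is quitting
    static final class HandlerExecutor implements Executor {
        private final Handler handler;

        HandlerExecutor(Handler handler) {
            this.handler = handler;
        }

        @Override
        public void execute(Runnable command) {
            if (!handler.post(command)) {
                throw new RejectedExecutionException(handler + " is shutting down");
            }
        }
    }

    public static void main(String[] args) {
        HandlerThread thread = new HandlerThread("camera");
        thread.start();
        Executor executor = new HandlerExecutor(new Handler(thread.getLooper()));

        // Any Executor-based API now runs its work on the camera thread
        CompletableFuture.runAsync(() -> System.out.println("running on " + Thread.currentThread().getName()), executor)
                .join();

        thread.quitSafely();
    }
}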

View File

@ -2,11 +2,6 @@ package com.genymobile.scrcpy;
import android.util.Log; import android.util.Log;
import java.io.FileDescriptor;
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.io.PrintStream;
/** /**
* Log both to Android logger (so that logs are visible in "adb logcat") and standard output/error (so that they are visible in the terminal * Log both to Android logger (so that logs are visible in "adb logcat") and standard output/error (so that they are visible in the terminal
* directly). * directly).
@ -16,9 +11,6 @@ public final class Ln {
private static final String TAG = "scrcpy"; private static final String TAG = "scrcpy";
private static final String PREFIX = "[server] "; private static final String PREFIX = "[server] ";
private static final PrintStream CONSOLE_OUT = new PrintStream(new FileOutputStream(FileDescriptor.out));
private static final PrintStream CONSOLE_ERR = new PrintStream(new FileOutputStream(FileDescriptor.err));
enum Level { enum Level {
VERBOSE, DEBUG, INFO, WARN, ERROR VERBOSE, DEBUG, INFO, WARN, ERROR
} }
@ -29,12 +21,6 @@ public final class Ln {
// not instantiable // not instantiable
} }
public static void disableSystemStreams() {
PrintStream nullStream = new PrintStream(new NullOutputStream());
System.setOut(nullStream);
System.setErr(nullStream);
}
/** /**
* Initialize the log level. * Initialize the log level.
* <p> * <p>
@ -53,30 +39,30 @@ public final class Ln {
public static void v(String message) { public static void v(String message) {
if (isEnabled(Level.VERBOSE)) { if (isEnabled(Level.VERBOSE)) {
Log.v(TAG, message); Log.v(TAG, message);
CONSOLE_OUT.print(PREFIX + "VERBOSE: " + message + '\n'); System.out.print(PREFIX + "VERBOSE: " + message + '\n');
} }
} }
public static void d(String message) { public static void d(String message) {
if (isEnabled(Level.DEBUG)) { if (isEnabled(Level.DEBUG)) {
Log.d(TAG, message); Log.d(TAG, message);
CONSOLE_OUT.print(PREFIX + "DEBUG: " + message + '\n'); System.out.print(PREFIX + "DEBUG: " + message + '\n');
} }
} }
public static void i(String message) { public static void i(String message) {
if (isEnabled(Level.INFO)) { if (isEnabled(Level.INFO)) {
Log.i(TAG, message); Log.i(TAG, message);
CONSOLE_OUT.print(PREFIX + "INFO: " + message + '\n'); System.out.print(PREFIX + "INFO: " + message + '\n');
} }
} }
public static void w(String message, Throwable throwable) { public static void w(String message, Throwable throwable) {
if (isEnabled(Level.WARN)) { if (isEnabled(Level.WARN)) {
Log.w(TAG, message, throwable); Log.w(TAG, message, throwable);
CONSOLE_ERR.print(PREFIX + "WARN: " + message + '\n'); System.err.print(PREFIX + "WARN: " + message + '\n');
if (throwable != null) { if (throwable != null) {
throwable.printStackTrace(CONSOLE_ERR); throwable.printStackTrace();
} }
} }
} }
@ -88,9 +74,9 @@ public final class Ln {
public static void e(String message, Throwable throwable) { public static void e(String message, Throwable throwable) {
if (isEnabled(Level.ERROR)) { if (isEnabled(Level.ERROR)) {
Log.e(TAG, message, throwable); Log.e(TAG, message, throwable);
CONSOLE_ERR.print(PREFIX + "ERROR: " + message + '\n'); System.err.print(PREFIX + "ERROR: " + message + "\n");
if (throwable != null) { if (throwable != null) {
throwable.printStackTrace(CONSOLE_ERR); throwable.printStackTrace();
} }
} }
} }
@ -98,21 +84,4 @@ public final class Ln {
public static void e(String message) { public static void e(String message) {
e(message, null); e(message, null);
} }
static class NullOutputStream extends OutputStream {
@Override
public void write(byte[] b) {
// ignore
}
@Override
public void write(byte[] b, int off, int len) {
// ignore
}
@Override
public void write(int b) {
// ignore
}
}
} }
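The Ln changes in this hunk bind the log output to PrintStreams built directly on FileDescriptor.out/err, so that disableSystemStreams() can replace System.out and System.err with a no-op stream without silencing scrcpy's own messages. A standalone sketch of that arrangement (messages are illustrative):

import java.io.FileDescriptor;
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.io.PrintStream;

public class ConsoleStreamSketch {

    // Bound to the real file descriptors, so they keep working after System.setOut/setErr are replaced
    private static final PrintStream CONSOLE_OUT = new PrintStream(new FileOutputStream(FileDescriptor.out));
    private static final PrintStream CONSOLE_ERR = new PrintStream(new FileOutputStream(FileDescriptor.err));

    // Swallows everything written through System.out / System.err
    static class NullOutputStream extends OutputStream {
        @Override
        public void write(int b) {
            // ignore
        }

        @Override
        public void write(byte[] b, int off, int len) {
            // ignore
        }
    }

    static void disableSystemStreams() {
        PrintStream nullStream = new PrintStream(new NullOutputStream());
        System.setOut(nullStream);
        System.setErr(nullStream);
    }

    public static void main(String[] args) {
        disableSystemStreams();
        System.out.println("this line is dropped");
        CONSOLE_OUT.println("[server] INFO: this line still reaches the terminal");
    }
}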

View File

@ -3,17 +3,7 @@ package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.DisplayManager; import com.genymobile.scrcpy.wrappers.DisplayManager;
import com.genymobile.scrcpy.wrappers.ServiceManager; import com.genymobile.scrcpy.wrappers.ServiceManager;
import android.graphics.Rect;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.MediaCodec;
import android.util.Range;
import java.util.List; import java.util.List;
import java.util.SortedSet;
import java.util.TreeSet;
public final class LogUtils { public final class LogUtils {
@ -57,7 +47,7 @@ public final class LogUtils {
builder.append("\n (none)"); builder.append("\n (none)");
} else { } else {
for (int id : displayIds) { for (int id : displayIds) {
builder.append("\n --display-id=").append(id).append(" ("); builder.append("\n --display=").append(id).append(" (");
DisplayInfo displayInfo = displayManager.getDisplayInfo(id); DisplayInfo displayInfo = displayManager.getDisplayInfo(id);
if (displayInfo != null) { if (displayInfo != null) {
Size size = displayInfo.getSize(); Size size = displayInfo.getSize();
@ -70,82 +60,4 @@ public final class LogUtils {
} }
return builder.toString(); return builder.toString();
} }
private static String getCameraFacingName(int facing) {
switch (facing) {
case CameraCharacteristics.LENS_FACING_FRONT:
return "front";
case CameraCharacteristics.LENS_FACING_BACK:
return "back";
case CameraCharacteristics.LENS_FACING_EXTERNAL:
return "external";
default:
return "unknown";
}
}
public static String buildCameraListMessage(boolean includeSizes) {
StringBuilder builder = new StringBuilder("List of cameras:");
CameraManager cameraManager = ServiceManager.getCameraManager();
try {
String[] cameraIds = cameraManager.getCameraIdList();
if (cameraIds == null || cameraIds.length == 0) {
builder.append("\n (none)");
} else {
for (String id : cameraIds) {
builder.append("\n --camera-id=").append(id);
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(id);
int facing = characteristics.get(CameraCharacteristics.LENS_FACING);
builder.append(" (").append(getCameraFacingName(facing)).append(", ");
Rect activeSize = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
builder.append(activeSize.width()).append("x").append(activeSize.height());
try {
// Capture frame rates for low-FPS mode are the same for every resolution
Range<Integer>[] lowFpsRanges = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
SortedSet<Integer> uniqueLowFps = getUniqueSet(lowFpsRanges);
builder.append(", fps=").append(uniqueLowFps);
} catch (Exception e) {
// Some devices may provide invalid ranges, causing an IllegalArgumentException "lower must be less than or equal to upper"
Ln.w("Could not get available frame rates for camera " + id, e);
}
builder.append(')');
if (includeSizes) {
StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
android.util.Size[] sizes = configs.getOutputSizes(MediaCodec.class);
for (android.util.Size size : sizes) {
builder.append("\n - ").append(size.getWidth()).append('x').append(size.getHeight());
}
android.util.Size[] highSpeedSizes = configs.getHighSpeedVideoSizes();
if (highSpeedSizes.length > 0) {
builder.append("\n High speed capture (--camera-high-speed):");
for (android.util.Size size : highSpeedSizes) {
Range<Integer>[] highFpsRanges = configs.getHighSpeedVideoFpsRanges();
SortedSet<Integer> uniqueHighFps = getUniqueSet(highFpsRanges);
builder.append("\n - ").append(size.getWidth()).append("x").append(size.getHeight());
builder.append(" (fps=").append(uniqueHighFps).append(')');
}
}
}
}
}
} catch (CameraAccessException e) {
builder.append("\n (access denied)");
}
return builder.toString();
}
private static SortedSet<Integer> getUniqueSet(Range<Integer>[] ranges) {
SortedSet<Integer> set = new TreeSet<>();
for (Range<Integer> range : ranges) {
set.add(range.getUpper());
}
return set;
}
} }
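getUniqueSet() above collapses the per-range FPS values reported by Camera2 into a sorted set of upper bounds, which is what appears after fps= in the camera list output. A plain-Java equivalent with made-up ranges:

import java.util.List;
import java.util.SortedSet;
import java.util.TreeSet;

public class UniqueFpsSketch {

    // Each entry stands in for an android.util.Range<Integer>: {lower, upper}
    static SortedSet<Integer> uniqueUpperBounds(List<int[]> ranges) {
        SortedSet<Integer> set = new TreeSet<>();
        for (int[] range : ranges) {
            set.add(range[1]); // keep only the upper bound; duplicates collapse in the set
        }
        return set;
    }

    public static void main(String[] args) {
        List<int[]> ranges = List.of(
                new int[]{15, 15}, new int[]{15, 30}, new int[]{30, 30}, new int[]{15, 60});
        // Prints [15, 30, 60], the kind of value shown after "fps=" in the camera list
        System.out.println(uniqueUpperBounds(ranges));
    }
}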

View File

@ -14,7 +14,6 @@ public class Options {
private int maxSize; private int maxSize;
private VideoCodec videoCodec = VideoCodec.H264; private VideoCodec videoCodec = VideoCodec.H264;
private AudioCodec audioCodec = AudioCodec.OPUS; private AudioCodec audioCodec = AudioCodec.OPUS;
private VideoSource videoSource = VideoSource.DISPLAY;
private AudioSource audioSource = AudioSource.OUTPUT; private AudioSource audioSource = AudioSource.OUTPUT;
private int videoBitRate = 8000000; private int videoBitRate = 8000000;
private int audioBitRate = 128000; private int audioBitRate = 128000;
@ -24,12 +23,6 @@ public class Options {
private Rect crop; private Rect crop;
private boolean control = true; private boolean control = true;
private int displayId; private int displayId;
private String cameraId;
private Size cameraSize;
private CameraFacing cameraFacing;
private CameraAspectRatio cameraAspectRatio;
private int cameraFps;
private boolean cameraHighSpeed;
private boolean showTouches; private boolean showTouches;
private boolean stayAwake; private boolean stayAwake;
private List<CodecOption> videoCodecOptions; private List<CodecOption> videoCodecOptions;
@ -45,8 +38,6 @@ public class Options {
private boolean listEncoders; private boolean listEncoders;
private boolean listDisplays; private boolean listDisplays;
private boolean listCameras;
private boolean listCameraSizes;
// Options not used by the scrcpy client, but useful to use scrcpy-server directly // Options not used by the scrcpy client, but useful to use scrcpy-server directly
private boolean sendDeviceMeta = true; // send device name and size private boolean sendDeviceMeta = true; // send device name and size
@ -82,10 +73,6 @@ public class Options {
return audioCodec; return audioCodec;
} }
public VideoSource getVideoSource() {
return videoSource;
}
public AudioSource getAudioSource() { public AudioSource getAudioSource() {
return audioSource; return audioSource;
} }
@ -122,30 +109,6 @@ public class Options {
return displayId; return displayId;
} }
public String getCameraId() {
return cameraId;
}
public Size getCameraSize() {
return cameraSize;
}
public CameraFacing getCameraFacing() {
return cameraFacing;
}
public CameraAspectRatio getCameraAspectRatio() {
return cameraAspectRatio;
}
public int getCameraFps() {
return cameraFps;
}
public boolean getCameraHighSpeed() {
return cameraHighSpeed;
}
public boolean getShowTouches() { public boolean getShowTouches() {
return showTouches; return showTouches;
} }
@ -190,10 +153,6 @@ public class Options {
return powerOn; return powerOn;
} }
public boolean getList() {
return listEncoders || listDisplays || listCameras || listCameraSizes;
}
public boolean getListEncoders() { public boolean getListEncoders() {
return listEncoders; return listEncoders;
} }
@ -202,14 +161,6 @@ public class Options {
return listDisplays; return listDisplays;
} }
public boolean getListCameras() {
return listCameras;
}
public boolean getListCameraSizes() {
return listCameraSizes;
}
public boolean getSendDeviceMeta() { public boolean getSendDeviceMeta() {
return sendDeviceMeta; return sendDeviceMeta;
} }
@ -279,13 +230,6 @@ public class Options {
} }
options.audioCodec = audioCodec; options.audioCodec = audioCodec;
break; break;
case "video_source":
VideoSource videoSource = VideoSource.findByName(value);
if (videoSource == null) {
throw new IllegalArgumentException("Video source " + value + " not supported");
}
options.videoSource = videoSource;
break;
case "audio_source": case "audio_source":
AudioSource audioSource = AudioSource.findByName(value); AudioSource audioSource = AudioSource.findByName(value);
if (audioSource == null) { if (audioSource == null) {
@ -312,9 +256,7 @@ public class Options {
options.tunnelForward = Boolean.parseBoolean(value); options.tunnelForward = Boolean.parseBoolean(value);
break; break;
case "crop": case "crop":
if (!value.isEmpty()) { options.crop = parseCrop(value);
options.crop = parseCrop(value);
}
break; break;
case "control": case "control":
options.control = Boolean.parseBoolean(value); options.control = Boolean.parseBoolean(value);
@ -364,42 +306,6 @@ public class Options {
case "list_displays": case "list_displays":
options.listDisplays = Boolean.parseBoolean(value); options.listDisplays = Boolean.parseBoolean(value);
break; break;
case "list_cameras":
options.listCameras = Boolean.parseBoolean(value);
break;
case "list_camera_sizes":
options.listCameraSizes = Boolean.parseBoolean(value);
break;
case "camera_id":
if (!value.isEmpty()) {
options.cameraId = value;
}
break;
case "camera_size":
if (!value.isEmpty()) {
options.cameraSize = parseSize(value);
}
break;
case "camera_facing":
if (!value.isEmpty()) {
CameraFacing facing = CameraFacing.findByName(value);
if (facing == null) {
throw new IllegalArgumentException("Camera facing " + value + " not supported");
}
options.cameraFacing = facing;
}
break;
case "camera_ar":
if (!value.isEmpty()) {
options.cameraAspectRatio = parseCameraAspectRatio(value);
}
break;
case "camera_fps":
options.cameraFps = Integer.parseInt(value);
break;
case "camera_high_speed":
options.cameraHighSpeed = Boolean.parseBoolean(value);
break;
case "send_device_meta": case "send_device_meta":
options.sendDeviceMeta = Boolean.parseBoolean(value); options.sendDeviceMeta = Boolean.parseBoolean(value);
break; break;
@ -431,6 +337,9 @@ public class Options {
} }
private static Rect parseCrop(String crop) { private static Rect parseCrop(String crop) {
if (crop.isEmpty()) {
return null;
}
// input format: "width:height:x:y" // input format: "width:height:x:y"
String[] tokens = crop.split(":"); String[] tokens = crop.split(":");
if (tokens.length != 4) { if (tokens.length != 4) {
@ -442,31 +351,4 @@ public class Options {
int y = Integer.parseInt(tokens[3]); int y = Integer.parseInt(tokens[3]);
return new Rect(x, y, x + width, y + height); return new Rect(x, y, x + width, y + height);
} }
private static Size parseSize(String size) {
// input format: "<width>x<height>"
String[] tokens = size.split("x");
if (tokens.length != 2) {
throw new IllegalArgumentException("Invalid size format (expected <width>x<height>): \"" + size + "\"");
}
int width = Integer.parseInt(tokens[0]);
int height = Integer.parseInt(tokens[1]);
return new Size(width, height);
}
private static CameraAspectRatio parseCameraAspectRatio(String ar) {
if ("sensor".equals(ar)) {
return CameraAspectRatio.sensorAspectRatio();
}
String[] tokens = ar.split(":");
if (tokens.length == 2) {
int w = Integer.parseInt(tokens[0]);
int h = Integer.parseInt(tokens[1]);
return CameraAspectRatio.fromFraction(w, h);
}
float floatAr = Float.parseFloat(tokens[0]);
return CameraAspectRatio.fromFloat(floatAr);
}
} }
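parseSize() and parseCameraAspectRatio() above accept the camera_size and camera_ar option formats: "<width>x<height>" for the size, and either "sensor", "W:H" or a bare float for the aspect ratio. A simplified standalone sketch that returns a plain Float (null meaning the sensor ratio) instead of the CameraAspectRatio wrapper:

public class CameraOptionParsingSketch {

    record Size(int width, int height) { }

    static Size parseSize(String value) {
        String[] tokens = value.split("x");
        if (tokens.length != 2) {
            throw new IllegalArgumentException("Invalid size format (expected <width>x<height>): \"" + value + "\"");
        }
        return new Size(Integer.parseInt(tokens[0]), Integer.parseInt(tokens[1]));
    }

    // Returns the aspect ratio as a float, or null to mean "use the sensor aspect ratio"
    static Float parseAspectRatio(String value) {
        if ("sensor".equals(value)) {
            return null;
        }
        String[] tokens = value.split(":");
        if (tokens.length == 2) {
            return (float) Integer.parseInt(tokens[0]) / Integer.parseInt(tokens[1]);
        }
        return Float.parseFloat(tokens[0]);
    }

    public static void main(String[] args) {
        System.out.println(parseSize("1920x1080"));      // Size[width=1920, height=1080]
        System.out.println(parseAspectRatio("16:9"));    // 1.7777778
        System.out.println(parseAspectRatio("1.6"));     // 1.6
        System.out.println(parseAspectRatio("sensor"));  // null
    }
}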

View File

@ -1,113 +0,0 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ServiceManager;
import com.genymobile.scrcpy.wrappers.SurfaceControl;
import android.graphics.Rect;
import android.hardware.display.VirtualDisplay;
import android.os.Build;
import android.os.IBinder;
import android.view.Surface;
public class ScreenCapture extends SurfaceCapture implements Device.RotationListener, Device.FoldListener {
private final Device device;
private IBinder display;
private VirtualDisplay virtualDisplay;
public ScreenCapture(Device device) {
this.device = device;
}
@Override
public void init() {
device.setRotationListener(this);
device.setFoldListener(this);
}
@Override
public void start(Surface surface) {
ScreenInfo screenInfo = device.getScreenInfo();
Rect contentRect = screenInfo.getContentRect();
// does not include the locked video orientation
Rect unlockedVideoRect = screenInfo.getUnlockedVideoSize().toRect();
int videoRotation = screenInfo.getVideoRotation();
int layerStack = device.getLayerStack();
if (display != null) {
SurfaceControl.destroyDisplay(display);
display = null;
}
if (virtualDisplay != null) {
virtualDisplay.release();
virtualDisplay = null;
}
try {
display = createDisplay();
setDisplaySurface(display, surface, videoRotation, contentRect, unlockedVideoRect, layerStack);
Ln.d("Display: using SurfaceControl API");
} catch (Exception surfaceControlException) {
Rect videoRect = screenInfo.getVideoSize().toRect();
try {
virtualDisplay = ServiceManager.getDisplayManager()
.createVirtualDisplay("scrcpy", videoRect.width(), videoRect.height(), device.getDisplayId(), surface);
Ln.d("Display: using DisplayManager API");
} catch (Exception displayManagerException) {
Ln.e("Could not create display using SurfaceControl", surfaceControlException);
Ln.e("Could not create display using DisplayManager", displayManagerException);
throw new AssertionError("Could not create display");
}
}
}
@Override
public void release() {
device.setRotationListener(null);
device.setFoldListener(null);
if (display != null) {
SurfaceControl.destroyDisplay(display);
}
}
@Override
public Size getSize() {
return device.getScreenInfo().getVideoSize();
}
@Override
public boolean setMaxSize(int maxSize) {
device.setMaxSize(maxSize);
return true;
}
@Override
public void onFoldChanged(int displayId, boolean folded) {
requestReset();
}
@Override
public void onRotationChanged(int rotation) {
requestReset();
}
private static IBinder createDisplay() throws Exception {
// Since Android 12 (preview), secure displays can no longer be created with shell permissions.
// On Android 12 preview, SDK_INT is still R (not S), but CODENAME is "S".
boolean secure = Build.VERSION.SDK_INT < Build.VERSION_CODES.R || (Build.VERSION.SDK_INT == Build.VERSION_CODES.R && !"S".equals(
Build.VERSION.CODENAME));
return SurfaceControl.createDisplay("scrcpy", secure);
}
private static void setDisplaySurface(IBinder display, Surface surface, int orientation, Rect deviceRect, Rect displayRect, int layerStack) {
SurfaceControl.openTransaction();
try {
SurfaceControl.setDisplaySurface(display, surface);
SurfaceControl.setDisplayProjection(display, orientation, deviceRect, displayRect);
SurfaceControl.setDisplayLayerStack(display, layerStack);
} finally {
SurfaceControl.closeTransaction();
}
}
}
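start() above prefers the hidden SurfaceControl path and only falls back to DisplayManager.createVirtualDisplay() when it throws, reporting both errors if neither works. The control flow can be sketched generically with two suppliers (names and messages are illustrative):

import java.util.function.Supplier;

public class PrimaryWithFallbackSketch {

    // Try the preferred implementation, fall back to the second one, and only report
    // both errors if both fail (mirrors the SurfaceControl -> DisplayManager fallback above)
    static <T> T createWithFallback(Supplier<T> primary, Supplier<T> fallback) {
        try {
            T result = primary.get();
            System.out.println("Display: using SurfaceControl API");
            return result;
        } catch (RuntimeException primaryError) {
            try {
                T result = fallback.get();
                System.out.println("Display: using DisplayManager API");
                return result;
            } catch (RuntimeException fallbackError) {
                System.err.println("Could not create display using SurfaceControl: " + primaryError);
                System.err.println("Could not create display using DisplayManager: " + fallbackError);
                throw new AssertionError("Could not create display");
            }
        }
    }

    public static void main(String[] args) {
        // Simulate the primary API being unavailable
        String display = createWithFallback(
                () -> { throw new UnsupportedOperationException("no SurfaceControl"); },
                () -> "virtual-display");
        System.out.println(display); // virtual-display
    }
}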

View File

@ -1,9 +1,13 @@
package com.genymobile.scrcpy; package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.SurfaceControl;
import android.graphics.Rect;
import android.media.MediaCodec; import android.media.MediaCodec;
import android.media.MediaCodecInfo; import android.media.MediaCodecInfo;
import android.media.MediaFormat; import android.media.MediaFormat;
import android.os.Looper; import android.os.Build;
import android.os.IBinder;
import android.os.SystemClock; import android.os.SystemClock;
import android.view.Surface; import android.view.Surface;
@ -12,7 +16,7 @@ import java.nio.ByteBuffer;
import java.util.List; import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean; import java.util.concurrent.atomic.AtomicBoolean;
public class SurfaceEncoder implements AsyncProcessor { public class ScreenEncoder implements Device.RotationListener, Device.FoldListener, AsyncProcessor {
private static final int DEFAULT_I_FRAME_INTERVAL = 10; // seconds private static final int DEFAULT_I_FRAME_INTERVAL = 10; // seconds
private static final int REPEAT_FRAME_DELAY_US = 100_000; // repeat after 100ms private static final int REPEAT_FRAME_DELAY_US = 100_000; // repeat after 100ms
@ -22,7 +26,9 @@ public class SurfaceEncoder implements AsyncProcessor {
private static final int[] MAX_SIZE_FALLBACK = {2560, 1920, 1600, 1280, 1024, 800}; private static final int[] MAX_SIZE_FALLBACK = {2560, 1920, 1600, 1280, 1024, 800};
private static final int MAX_CONSECUTIVE_ERRORS = 3; private static final int MAX_CONSECUTIVE_ERRORS = 3;
private final SurfaceCapture capture; private final AtomicBoolean resetCapture = new AtomicBoolean();
private final Device device;
private final Streamer streamer; private final Streamer streamer;
private final String encoderName; private final String encoderName;
private final List<CodecOption> codecOptions; private final List<CodecOption> codecOptions;
@ -36,9 +42,9 @@ public class SurfaceEncoder implements AsyncProcessor {
private Thread thread; private Thread thread;
private final AtomicBoolean stopped = new AtomicBoolean(); private final AtomicBoolean stopped = new AtomicBoolean();
public SurfaceEncoder(SurfaceCapture capture, Streamer streamer, int videoBitRate, int maxFps, List<CodecOption> codecOptions, String encoderName, public ScreenEncoder(Device device, Streamer streamer, int videoBitRate, int maxFps, List<CodecOption> codecOptions, String encoderName,
boolean downsizeOnError) { boolean downsizeOnError) {
this.capture = capture; this.device = device;
this.streamer = streamer; this.streamer = streamer;
this.videoBitRate = videoBitRate; this.videoBitRate = videoBitRate;
this.maxFps = maxFps; this.maxFps = maxFps;
@ -47,29 +53,51 @@ public class SurfaceEncoder implements AsyncProcessor {
this.downsizeOnError = downsizeOnError; this.downsizeOnError = downsizeOnError;
} }
@Override
public void onFoldChanged(int displayId, boolean folded) {
resetCapture.set(true);
}
@Override
public void onRotationChanged(int rotation) {
resetCapture.set(true);
}
private boolean consumeResetCapture() {
return resetCapture.getAndSet(false);
}
private void streamScreen() throws IOException, ConfigurationException { private void streamScreen() throws IOException, ConfigurationException {
Codec codec = streamer.getCodec(); Codec codec = streamer.getCodec();
MediaCodec mediaCodec = createMediaCodec(codec, encoderName); MediaCodec mediaCodec = createMediaCodec(codec, encoderName);
MediaFormat format = createFormat(codec.getMimeType(), videoBitRate, maxFps, codecOptions); MediaFormat format = createFormat(codec.getMimeType(), videoBitRate, maxFps, codecOptions);
IBinder display = createDisplay();
device.setRotationListener(this);
device.setFoldListener(this);
capture.init(); streamer.writeVideoHeader(device.getScreenInfo().getVideoSize());
boolean alive;
try { try {
streamer.writeVideoHeader(capture.getSize());
boolean alive;
do { do {
Size size = capture.getSize(); ScreenInfo screenInfo = device.getScreenInfo();
format.setInteger(MediaFormat.KEY_WIDTH, size.getWidth()); Rect contentRect = screenInfo.getContentRect();
format.setInteger(MediaFormat.KEY_HEIGHT, size.getHeight());
// include the locked video orientation
Rect videoRect = screenInfo.getVideoSize().toRect();
format.setInteger(MediaFormat.KEY_WIDTH, videoRect.width());
format.setInteger(MediaFormat.KEY_HEIGHT, videoRect.height());
Surface surface = null; Surface surface = null;
try { try {
mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE); mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
surface = mediaCodec.createInputSurface(); surface = mediaCodec.createInputSurface();
capture.start(surface); // does not include the locked video orientation
Rect unlockedVideoRect = screenInfo.getUnlockedVideoSize().toRect();
int videoRotation = screenInfo.getVideoRotation();
int layerStack = device.getLayerStack();
setDisplaySurface(display, surface, videoRotation, contentRect, unlockedVideoRect, layerStack);
mediaCodec.start(); mediaCodec.start();
@ -78,7 +106,7 @@ public class SurfaceEncoder implements AsyncProcessor {
mediaCodec.stop(); mediaCodec.stop();
} catch (IllegalStateException | IllegalArgumentException e) { } catch (IllegalStateException | IllegalArgumentException e) {
Ln.e("Encoding error: " + e.getClass().getName() + ": " + e.getMessage()); Ln.e("Encoding error: " + e.getClass().getName() + ": " + e.getMessage());
if (!prepareRetry(size)) { if (!prepareRetry(device, screenInfo)) {
throw e; throw e;
} }
Ln.i("Retrying..."); Ln.i("Retrying...");
@ -92,11 +120,13 @@ public class SurfaceEncoder implements AsyncProcessor {
} while (alive); } while (alive);
} finally { } finally {
mediaCodec.release(); mediaCodec.release();
capture.release(); device.setRotationListener(null);
device.setFoldListener(null);
SurfaceControl.destroyDisplay(display);
} }
} }
private boolean prepareRetry(Size currentSize) { private boolean prepareRetry(Device device, ScreenInfo screenInfo) {
if (firstFrameSent) { if (firstFrameSent) {
++consecutiveErrors; ++consecutiveErrors;
if (consecutiveErrors >= MAX_CONSECUTIVE_ERRORS) { if (consecutiveErrors >= MAX_CONSECUTIVE_ERRORS) {
@ -116,19 +146,16 @@ public class SurfaceEncoder implements AsyncProcessor {
// Downsizing on error is only enabled if an encoding failure occurs before the first frame (downsizing later could be surprising) // Downsizing on error is only enabled if an encoding failure occurs before the first frame (downsizing later could be surprising)
int newMaxSize = chooseMaxSizeFallback(currentSize); int newMaxSize = chooseMaxSizeFallback(screenInfo.getVideoSize());
Ln.i("newMaxSize = " + newMaxSize);
if (newMaxSize == 0) { if (newMaxSize == 0) {
// Must definitively fail // Must definitively fail
return false; return false;
} }
boolean accepted = capture.setMaxSize(newMaxSize); // Retry with a smaller device size
if (!accepted) {
return false;
}
// Retry with a smaller size
Ln.i("Retrying with -m" + newMaxSize + "..."); Ln.i("Retrying with -m" + newMaxSize + "...");
device.setMaxSize(newMaxSize);
return true; return true;
} }
@ -149,14 +176,14 @@ public class SurfaceEncoder implements AsyncProcessor {
boolean alive = true; boolean alive = true;
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo(); MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
while (!capture.consumeReset() && !eof) { while (!consumeResetCapture() && !eof) {
if (stopped.get()) { if (stopped.get()) {
alive = false; alive = false;
break; break;
} }
int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1); int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1);
try { try {
if (capture.consumeReset()) { if (consumeResetCapture()) {
// must restart encoding with new size // must restart encoding with new size
break; break;
} }
@ -181,11 +208,6 @@ public class SurfaceEncoder implements AsyncProcessor {
} }
} }
if (capture.isClosed()) {
// The capture might have been closed internally (for example if the camera is disconnected)
alive = false;
}
return !eof && alive; return !eof && alive;
} }
@ -242,13 +264,28 @@ public class SurfaceEncoder implements AsyncProcessor {
return format; return format;
} }
private static IBinder createDisplay() {
// Since Android 12 (preview), secure displays can no longer be created with shell permissions.
// On Android 12 preview, SDK_INT is still R (not S), but CODENAME is "S".
boolean secure = Build.VERSION.SDK_INT < Build.VERSION_CODES.R || (Build.VERSION.SDK_INT == Build.VERSION_CODES.R && !"S"
.equals(Build.VERSION.CODENAME));
return SurfaceControl.createDisplay("scrcpy", secure);
}
private static void setDisplaySurface(IBinder display, Surface surface, int orientation, Rect deviceRect, Rect displayRect, int layerStack) {
SurfaceControl.openTransaction();
try {
SurfaceControl.setDisplaySurface(display, surface);
SurfaceControl.setDisplayProjection(display, orientation, deviceRect, displayRect);
SurfaceControl.setDisplayLayerStack(display, layerStack);
} finally {
SurfaceControl.closeTransaction();
}
}
@Override @Override
public void start(TerminationListener listener) { public void start(TerminationListener listener) {
thread = new Thread(() -> { thread = new Thread(() -> {
// Some devices (Meizu) deadlock if the video encoding thread has no Looper
// <https://github.com/Genymobile/scrcpy/issues/4143>
Looper.prepare();
try { try {
streamScreen(); streamScreen();
} catch (ConfigurationException e) { } catch (ConfigurationException e) {

View File

@ -3,21 +3,12 @@ package com.genymobile.scrcpy;
import android.os.BatteryManager; import android.os.BatteryManager;
import android.os.Build; import android.os.Build;
import java.io.File;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.List; import java.util.List;
public final class Server { public final class Server {
public static final String SERVER_PATH;
static {
String[] classPaths = System.getProperty("java.class.path").split(File.pathSeparator);
// By convention, scrcpy is always executed with the absolute path of scrcpy-server.jar as the first item in the classpath
SERVER_PATH = classPaths[0];
}
private static class Completion { private static class Completion {
private int running; private int running;
private boolean fatalError; private boolean fatalError;
@ -96,10 +87,8 @@ public final class Server {
} }
private static void scrcpy(Options options) throws IOException, ConfigurationException { private static void scrcpy(Options options) throws IOException, ConfigurationException {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.S && options.getVideoSource() == VideoSource.CAMERA) { Ln.i("Device: [" + Build.MANUFACTURER + "] " + Build.BRAND + " " + Build.MODEL + " (Android " + Build.VERSION.RELEASE + ")");
Ln.e("Camera mirroring is not supported before Android 12"); final Device device = new Device(options);
throw new ConfigurationException("Camera mirroring is not supported");
}
Thread initThread = startInitThread(options); Thread initThread = startInitThread(options);
@ -109,11 +98,27 @@ public final class Server {
boolean video = options.getVideo(); boolean video = options.getVideo();
boolean audio = options.getAudio(); boolean audio = options.getAudio();
boolean sendDummyByte = options.getSendDummyByte(); boolean sendDummyByte = options.getSendDummyByte();
boolean camera = video && options.getVideoSource() == VideoSource.CAMERA;
final Device device = camera ? null : new Device(options); Workarounds.prepareMainLooper();
Workarounds.apply(audio, camera); // Workarounds must be applied for Meizu phones:
// - <https://github.com/Genymobile/scrcpy/issues/240>
// - <https://github.com/Genymobile/scrcpy/issues/365>
// - <https://github.com/Genymobile/scrcpy/issues/2656>
//
// But only apply when strictly necessary, since workarounds can cause other issues:
// - <https://github.com/Genymobile/scrcpy/issues/940>
// - <https://github.com/Genymobile/scrcpy/issues/994>
if (Build.BRAND.equalsIgnoreCase("meizu") || Build.BRAND.equalsIgnoreCase("honor")) {
Workarounds.fillAppInfo();
}
// Before Android 11, audio is not supported.
// Since Android 12, we can properly set a context on the AudioRecord.
// Only on Android 11 do we need to fill the application context for the AudioRecord to work.
if (audio && Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
Workarounds.fillAppContext();
}
List<AsyncProcessor> asyncProcessors = new ArrayList<>(); List<AsyncProcessor> asyncProcessors = new ArrayList<>();
@ -146,16 +151,9 @@ public final class Server {
if (video) { if (video) {
Streamer videoStreamer = new Streamer(connection.getVideoFd(), options.getVideoCodec(), options.getSendCodecMeta(), Streamer videoStreamer = new Streamer(connection.getVideoFd(), options.getVideoCodec(), options.getSendCodecMeta(),
options.getSendFrameMeta()); options.getSendFrameMeta());
SurfaceCapture surfaceCapture; ScreenEncoder screenEncoder = new ScreenEncoder(device, videoStreamer, options.getVideoBitRate(), options.getMaxFps(),
if (options.getVideoSource() == VideoSource.DISPLAY) {
surfaceCapture = new ScreenCapture(device);
} else {
surfaceCapture = new CameraCapture(options.getCameraId(), options.getCameraFacing(), options.getCameraSize(),
options.getMaxSize(), options.getCameraAspectRatio(), options.getCameraFps(), options.getCameraHighSpeed());
}
SurfaceEncoder surfaceEncoder = new SurfaceEncoder(surfaceCapture, videoStreamer, options.getVideoBitRate(), options.getMaxFps(),
options.getVideoCodecOptions(), options.getVideoEncoder(), options.getDownsizeOnError()); options.getVideoCodecOptions(), options.getVideoEncoder(), options.getDownsizeOnError());
asyncProcessors.add(surfaceEncoder); asyncProcessors.add(screenEncoder);
} }
Completion completion = new Completion(asyncProcessors.size()); Completion completion = new Completion(asyncProcessors.size());
@ -172,8 +170,6 @@ public final class Server {
asyncProcessor.stop(); asyncProcessor.stop();
} }
connection.shutdown();
try { try {
initThread.join(); initThread.join();
for (AsyncProcessor asyncProcessor : asyncProcessors) { for (AsyncProcessor asyncProcessor : asyncProcessors) {
@ -193,34 +189,16 @@ public final class Server {
return thread; return thread;
} }
public static void main(String... args) { public static void main(String... args) throws Exception {
int status = 0;
try {
internalMain(args);
} catch (Throwable t) {
Ln.e(t.getMessage(), t);
status = 1;
} finally {
// By default, the Java process exits when all non-daemon threads are terminated.
// The Android SDK might start some non-daemon threads internally, preventing the scrcpy server from exiting.
// So force the process to exit explicitly.
System.exit(status);
}
}
private static void internalMain(String... args) throws Exception {
Thread.setDefaultUncaughtExceptionHandler((t, e) -> { Thread.setDefaultUncaughtExceptionHandler((t, e) -> {
Ln.e("Exception on thread " + t, e); Ln.e("Exception on thread " + t, e);
}); });
Options options = Options.parse(args); Options options = Options.parse(args);
Ln.disableSystemStreams();
Ln.initLogLevel(options.getLogLevel()); Ln.initLogLevel(options.getLogLevel());
Ln.i("Device: [" + Build.MANUFACTURER + "] " + Build.BRAND + " " + Build.MODEL + " (Android " + Build.VERSION.RELEASE + ")"); if (options.getListEncoders() || options.getListDisplays()) {
if (options.getList()) {
if (options.getCleanup()) { if (options.getCleanup()) {
CleanUp.unlinkSelf(); CleanUp.unlinkSelf();
} }
@ -232,10 +210,6 @@ public final class Server {
if (options.getListDisplays()) { if (options.getListDisplays()) {
Ln.i(LogUtils.buildDisplayListMessage()); Ln.i(LogUtils.buildDisplayListMessage());
} }
if (options.getListCameras() || options.getListCameraSizes()) {
Workarounds.apply(false, true);
Ln.i(LogUtils.buildCameraListMessage(options.getListCameraSizes()));
}
// Just print the requested data, do not mirror // Just print the requested data, do not mirror
return; return;
} }

View File

@ -75,7 +75,7 @@ public final class Settings {
String oldValue = getValue(table, key);
if (!value.equals(oldValue)) {
putValue(table, key, value);
}
return oldValue;
}

View File

@ -5,14 +5,14 @@ import android.media.MediaCodec;
import java.io.FileDescriptor; import java.io.FileDescriptor;
import java.io.IOException; import java.io.IOException;
import java.nio.ByteBuffer; import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Arrays;
public final class Streamer { public final class Streamer {
private static final long PACKET_FLAG_CONFIG = 1L << 63; private static final long PACKET_FLAG_CONFIG = 1L << 63;
private static final long PACKET_FLAG_KEY_FRAME = 1L << 62; private static final long PACKET_FLAG_KEY_FRAME = 1L << 62;
private static final long AOPUSHDR = 0x5244485355504F41L; // "AOPUSHDR" in ASCII (little-endian)
private final FileDescriptor fd; private final FileDescriptor fd;
private final Codec codec; private final Codec codec;
private final boolean sendCodecMeta; private final boolean sendCodecMeta;
@ -30,7 +30,6 @@ public final class Streamer {
public Codec getCodec() { public Codec getCodec() {
return codec; return codec;
} }
public void writeAudioHeader() throws IOException { public void writeAudioHeader() throws IOException {
if (sendCodecMeta) { if (sendCodecMeta) {
ByteBuffer buffer = ByteBuffer.allocate(4); ByteBuffer buffer = ByteBuffer.allocate(4);
@ -63,12 +62,8 @@ public final class Streamer {
} }
public void writePacket(ByteBuffer buffer, long pts, boolean config, boolean keyFrame) throws IOException { public void writePacket(ByteBuffer buffer, long pts, boolean config, boolean keyFrame) throws IOException {
if (config) { if (config && codec == AudioCodec.OPUS) {
if (codec == AudioCodec.OPUS) { fixOpusConfigPacket(buffer);
fixOpusConfigPacket(buffer);
} else if (codec == AudioCodec.FLAC) {
fixFlacConfigPacket(buffer);
}
} }
if (sendFrameMeta) { if (sendFrameMeta) {
@ -125,14 +120,11 @@ public final class Streamer {
throw new IOException("Not enough data in OPUS config packet"); throw new IOException("Not enough data in OPUS config packet");
} }
final byte[] opusHeaderId = {'A', 'O', 'P', 'U', 'S', 'H', 'D', 'R'}; long id = buffer.getLong();
byte[] idBuffer = new byte[8]; if (id != AOPUSHDR) {
buffer.get(idBuffer);
if (!Arrays.equals(idBuffer, opusHeaderId)) {
throw new IOException("OPUS header not found"); throw new IOException("OPUS header not found");
} }
// The size is in native byte-order
long sizeLong = buffer.getLong(); long sizeLong = buffer.getLong();
if (sizeLong < 0 || sizeLong >= 0x7FFFFFFF) { if (sizeLong < 0 || sizeLong >= 0x7FFFFFFF) {
throw new IOException("Invalid block size in OPUS header: " + sizeLong); throw new IOException("Invalid block size in OPUS header: " + sizeLong);
@ -146,41 +138,4 @@ public final class Streamer {
// Set the buffer to point to the OPUS header slice // Set the buffer to point to the OPUS header slice
buffer.limit(buffer.position() + size); buffer.limit(buffer.position() + size);
} }
private static void fixFlacConfigPacket(ByteBuffer buffer) throws IOException {
// 00000000 66 4c 61 43 00 00 00 22 |fLaC..." |
// -------------- BELOW IS THE PART WE MUST PUT AS EXTRADATA -------------------
// 00000000 10 00 10 00 00 00 00 00 | ........|
// 00000010 00 00 0b b8 02 f0 00 00 00 00 00 00 00 00 00 00 |................|
// 00000020 00 00 00 00 00 00 00 00 00 00 |.......... |
// ------------------------------------------------------------------------------
// 00000020 84 00 00 28 20 00 | ...( .|
// 00000030 00 00 72 65 66 65 72 65 6e 63 65 20 6c 69 62 46 |..reference libF|
// 00000040 4c 41 43 20 31 2e 33 2e 32 20 32 30 32 32 31 30 |LAC 1.3.2 202210|
// 00000050 32 32 00 00 00 00 |22....|
//
// <https://developer.android.com/reference/android/media/MediaCodec#CSD>
if (buffer.remaining() < 8) {
throw new IOException("Not enough data in FLAC config packet");
}
final byte[] flacHeaderId = {'f', 'L', 'a', 'C'};
byte[] idBuffer = new byte[4];
buffer.get(idBuffer);
if (!Arrays.equals(idBuffer, flacHeaderId)) {
throw new IOException("FLAC header not found");
}
// The size is in big-endian
buffer.order(ByteOrder.BIG_ENDIAN);
int size = buffer.getInt();
if (buffer.remaining() < size) {
throw new IOException("Not enough data in FLAC header (invalid size: " + size + ")");
}
// Set the buffer to point to the FLAC header slice
buffer.limit(buffer.position() + size);
}
} }
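fixFlacConfigPacket(), like the OPUS variant above it, trims MediaCodec's codec config packet down to the codec-specific-data slice expected by the decoder: verify the magic bytes, read the block size (big-endian for FLAC), then limit the ByteBuffer to that slice. A self-contained sketch of the FLAC case on a hand-built buffer:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Arrays;

public class FlacConfigSliceSketch {

    static void sliceFlacHeader(ByteBuffer buffer) throws IOException {
        if (buffer.remaining() < 8) {
            throw new IOException("Not enough data in FLAC config packet");
        }
        byte[] id = new byte[4];
        buffer.get(id);
        if (!Arrays.equals(id, new byte[]{'f', 'L', 'a', 'C'})) {
            throw new IOException("FLAC header not found");
        }
        // The block size is stored in big-endian
        buffer.order(ByteOrder.BIG_ENDIAN);
        int size = buffer.getInt();
        if (buffer.remaining() < size) {
            throw new IOException("Not enough data in FLAC header (invalid size: " + size + ")");
        }
        // Restrict the buffer to the header slice; everything after it is dropped
        buffer.limit(buffer.position() + size);
    }

    public static void main(String[] args) throws IOException {
        // "fLaC" + 4-byte size (2) + 2 payload bytes + trailing bytes that must be cut off
        ByteBuffer buffer = ByteBuffer.wrap(new byte[]{'f', 'L', 'a', 'C', 0, 0, 0, 2, 0x10, 0x00, 0x7F, 0x7F});
        sliceFlacHeader(buffer);
        System.out.println("CSD bytes remaining: " + buffer.remaining()); // 2
    }
}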

View File

@ -1,71 +0,0 @@
package com.genymobile.scrcpy;
import android.view.Surface;
import java.io.IOException;
import java.util.concurrent.atomic.AtomicBoolean;
/**
* A video source which can be rendered on a Surface for encoding.
*/
public abstract class SurfaceCapture {
private final AtomicBoolean resetCapture = new AtomicBoolean();
/**
* Request the encoding session to be restarted, for example if the capture implementation detects that the video source size has changed
* (typically on device rotation).
*/
protected void requestReset() {
resetCapture.set(true);
}
/**
* Consume the reset request (intended to be called by the encoder).
*
* @return {@code true} if a reset request was pending, {@code false} otherwise.
*/
public boolean consumeReset() {
return resetCapture.getAndSet(false);
}
/**
* Called once before the capture starts.
*/
public abstract void init() throws IOException;
/**
* Called after the capture ends (if and only if {@link #init()} has been called).
*/
public abstract void release();
/**
* Start the capture to the target surface.
*
* @param surface the surface which will be encoded
*/
public abstract void start(Surface surface) throws IOException;
/**
* Return the video size
*
* @return the video size
*/
public abstract Size getSize();
/**
* Set the maximum capture size (set by the encoder if it does not support the current size).
*
* @param maxSize Maximum size
*/
public abstract boolean setMaxSize(int maxSize);
/**
* Indicate if the capture has been closed internally.
*
* @return {@code true} is the capture is closed, {@code false} otherwise.
*/
public boolean isClosed() {
return false;
}
}
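requestReset()/consumeReset() above form a one-shot handshake on an AtomicBoolean: a rotation or fold listener sets the flag, and the encoder loop consumes it with getAndSet(false) so each request restarts encoding exactly once. A minimal two-thread sketch of the handshake (sleep durations are arbitrary):

import java.util.concurrent.atomic.AtomicBoolean;

public class ResetHandshakeSketch {

    private static final AtomicBoolean RESET_REQUESTED = new AtomicBoolean();

    // Producer side: called from a rotation/fold listener
    static void requestReset() {
        RESET_REQUESTED.set(true);
    }

    // Consumer side: returns true at most once per request, atomically clearing the flag
    static boolean consumeReset() {
        return RESET_REQUESTED.getAndSet(false);
    }

    public static void main(String[] args) throws InterruptedException {
        Thread encoderLoop = new Thread(() -> {
            for (int session = 1; session <= 2; session++) {
                System.out.println("encoding session " + session + " started");
                while (!consumeReset()) {
                    // stand-in for draining encoder output
                    Thread.onSpinWait();
                }
                System.out.println("reset consumed, restarting with the new size");
            }
        });
        encoderLoop.start();

        Thread.sleep(100);
        requestReset(); // e.g. device rotated
        Thread.sleep(100);
        requestReset(); // e.g. device folded
        encoderLoop.join();
    }
}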

View File

@ -6,7 +6,7 @@ import android.media.MediaFormat;
public enum VideoCodec implements Codec { public enum VideoCodec implements Codec {
H264(0x68_32_36_34, "h264", MediaFormat.MIMETYPE_VIDEO_AVC), H264(0x68_32_36_34, "h264", MediaFormat.MIMETYPE_VIDEO_AVC),
H265(0x68_32_36_35, "h265", MediaFormat.MIMETYPE_VIDEO_HEVC), H265(0x68_32_36_35, "h265", MediaFormat.MIMETYPE_VIDEO_HEVC),
@SuppressLint("InlinedApi") // introduced in API 29 @SuppressLint("InlinedApi") // introduced in API 21
AV1(0x00_61_76_31, "av1", MediaFormat.MIMETYPE_VIDEO_AV1); AV1(0x00_61_76_31, "av1", MediaFormat.MIMETYPE_VIDEO_AV1);
private final int id; // 4-byte ASCII representation of the name private final int id; // 4-byte ASCII representation of the name

View File

@ -1,22 +0,0 @@
package com.genymobile.scrcpy;
public enum VideoSource {
DISPLAY("display"),
CAMERA("camera");
private final String name;
VideoSource(String name) {
this.name = name;
}
static VideoSource findByName(String name) {
for (VideoSource videoSource : VideoSource.values()) {
if (name.equals(videoSource.name)) {
return videoSource;
}
}
return null;
}
}
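
As a usage illustration (not part of the diff), an option parser could map a command-line value onto this enum and reject unknown names; the variable names are assumptions:

    // Hypothetical parsing of a video source option value.
    String value = "camera";
    VideoSource videoSource = VideoSource.findByName(value);
    if (videoSource == null) {
        throw new IllegalArgumentException("Unknown video source: " + value);
    }
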

View File

@ -1,10 +1,11 @@
package com.genymobile.scrcpy; package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ActivityThread;
import android.annotation.SuppressLint; import android.annotation.SuppressLint;
import android.annotation.TargetApi; import android.annotation.TargetApi;
import android.app.Application; import android.app.Application;
import android.content.AttributionSource; import android.content.AttributionSource;
import android.content.Context;
import android.content.ContextWrapper; import android.content.ContextWrapper;
import android.content.pm.ApplicationInfo; import android.content.pm.ApplicationInfo;
import android.media.AudioAttributes; import android.media.AudioAttributes;
@ -19,95 +20,16 @@ import java.lang.reflect.Constructor;
import java.lang.reflect.Field; import java.lang.reflect.Field;
import java.lang.reflect.Method; import java.lang.reflect.Method;
@SuppressLint("PrivateApi,BlockedPrivateApi,SoonBlockedPrivateApi,DiscouragedPrivateApi")
public final class Workarounds { public final class Workarounds {
private static final Class<?> ACTIVITY_THREAD_CLASS; private static boolean activityThreadFilled;
private static final Object ACTIVITY_THREAD;
static {
prepareMainLooper();
try {
// ActivityThread activityThread = new ActivityThread();
ACTIVITY_THREAD_CLASS = Class.forName("android.app.ActivityThread");
Constructor<?> activityThreadConstructor = ACTIVITY_THREAD_CLASS.getDeclaredConstructor();
activityThreadConstructor.setAccessible(true);
ACTIVITY_THREAD = activityThreadConstructor.newInstance();
// ActivityThread.sCurrentActivityThread = activityThread;
Field sCurrentActivityThreadField = ACTIVITY_THREAD_CLASS.getDeclaredField("sCurrentActivityThread");
sCurrentActivityThreadField.setAccessible(true);
sCurrentActivityThreadField.set(null, ACTIVITY_THREAD);
} catch (Exception e) {
throw new AssertionError(e);
}
}
private Workarounds() { private Workarounds() {
// not instantiable // not instantiable
} }
public static void apply(boolean audio, boolean camera) {
boolean mustFillConfigurationController = false;
boolean mustFillAppInfo = false;
boolean mustFillAppContext = false;
if (Build.BRAND.equalsIgnoreCase("meizu")) {
// Workarounds must be applied for Meizu phones:
// - <https://github.com/Genymobile/scrcpy/issues/240>
// - <https://github.com/Genymobile/scrcpy/issues/365>
// - <https://github.com/Genymobile/scrcpy/issues/2656>
//
// But only apply when strictly necessary, since workarounds can cause other issues:
// - <https://github.com/Genymobile/scrcpy/issues/940>
// - <https://github.com/Genymobile/scrcpy/issues/994>
mustFillAppInfo = true;
} else if (Build.BRAND.equalsIgnoreCase("honor")) {
// More workarounds must be applied for Honor devices:
// - <https://github.com/Genymobile/scrcpy/issues/4015>
//
// The system context must not be set for all devices, because it would cause other problems:
// - <https://github.com/Genymobile/scrcpy/issues/4015#issuecomment-1595382142>
// - <https://github.com/Genymobile/scrcpy/issues/3805#issuecomment-1596148031>
mustFillAppInfo = true;
mustFillAppContext = true;
}
if (audio && Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
// Before Android 11, audio is not supported.
// Since Android 12, we can properly set a context on the AudioRecord.
// Only on Android 11 we must fill the application context for the AudioRecord to work.
mustFillAppContext = true;
}
if (camera) {
mustFillAppInfo = true;
mustFillAppContext = true;
}
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
// On some Samsung devices, DisplayManagerGlobal.getDisplayInfoLocked() calls ActivityThread.currentActivityThread().getConfiguration(),
// which requires a non-null ConfigurationController.
// ConfigurationController was introduced in Android 12, so do not attempt to set it on lower versions.
// <https://github.com/Genymobile/scrcpy/issues/4467>
mustFillConfigurationController = true;
}
if (mustFillConfigurationController) {
// Must be called before fillAppContext() because it is necessary to get a valid system context
fillConfigurationController();
}
if (mustFillAppInfo) {
fillAppInfo();
}
if (mustFillAppContext) {
fillAppContext();
}
}
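
A hypothetical call site, shown only to make the meaning of the two booleans explicit (the variable names are assumptions, not taken from the server code):

    // audio: audio forwarding is enabled; camera: the video source is the camera.
    boolean audio = true;
    boolean camera = false;
    Workarounds.apply(audio, camera);
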
@SuppressWarnings("deprecation") @SuppressWarnings("deprecation")
private static void prepareMainLooper() { public static void prepareMainLooper() {
// Some devices internally create a Handler when creating an input Surface, causing an exception: // Some devices internally create a Handler when creating an input Surface, causing an exception:
// "Can't create handler inside thread that has not called Looper.prepare()" // "Can't create handler inside thread that has not called Looper.prepare()"
// <https://github.com/Genymobile/scrcpy/issues/240> // <https://github.com/Genymobile/scrcpy/issues/240>
@ -119,8 +41,26 @@ public final class Workarounds {
Looper.prepareMainLooper(); Looper.prepareMainLooper();
} }
private static void fillAppInfo() { @SuppressLint("PrivateApi,DiscouragedPrivateApi")
private static void fillActivityThread() throws Exception {
if (!activityThreadFilled) {
Class<?> activityThreadClass = ActivityThread.getActivityThreadClass();
Object activityThread = ActivityThread.getActivityThread();
// ActivityThread.sCurrentActivityThread = activityThread;
Field sCurrentActivityThreadField = activityThreadClass.getDeclaredField("sCurrentActivityThread");
sCurrentActivityThreadField.setAccessible(true);
sCurrentActivityThreadField.set(null, activityThread);
activityThreadFilled = true;
}
}
@SuppressLint("PrivateApi,DiscouragedPrivateApi")
public static void fillAppInfo() {
try { try {
fillActivityThread();
// ActivityThread.AppBindData appBindData = new ActivityThread.AppBindData(); // ActivityThread.AppBindData appBindData = new ActivityThread.AppBindData();
Class<?> appBindDataClass = Class.forName("android.app.ActivityThread$AppBindData"); Class<?> appBindDataClass = Class.forName("android.app.ActivityThread$AppBindData");
Constructor<?> appBindDataConstructor = appBindDataClass.getDeclaredConstructor(); Constructor<?> appBindDataConstructor = appBindDataClass.getDeclaredConstructor();
@ -135,62 +75,44 @@ public final class Workarounds {
appInfoField.setAccessible(true); appInfoField.setAccessible(true);
appInfoField.set(appBindData, applicationInfo); appInfoField.set(appBindData, applicationInfo);
Class<?> activityThreadClass = ActivityThread.getActivityThreadClass();
Object activityThread = ActivityThread.getActivityThread();
// activityThread.mBoundApplication = appBindData; // activityThread.mBoundApplication = appBindData;
Field mBoundApplicationField = ACTIVITY_THREAD_CLASS.getDeclaredField("mBoundApplication"); Field mBoundApplicationField = activityThreadClass.getDeclaredField("mBoundApplication");
mBoundApplicationField.setAccessible(true); mBoundApplicationField.setAccessible(true);
mBoundApplicationField.set(ACTIVITY_THREAD, appBindData); mBoundApplicationField.set(activityThread, appBindData);
} catch (Throwable throwable) { } catch (Throwable throwable) {
// this is a workaround, so failing is not an error // this is a workaround, so failing is not an error
Ln.d("Could not fill app info: " + throwable.getMessage()); Ln.d("Could not fill app info: " + throwable.getMessage());
} }
} }
private static void fillAppContext() { @SuppressLint("PrivateApi,DiscouragedPrivateApi")
public static void fillAppContext() {
try { try {
Application app = new Application(); fillActivityThread();
Application app = Application.class.newInstance();
Field baseField = ContextWrapper.class.getDeclaredField("mBase"); Field baseField = ContextWrapper.class.getDeclaredField("mBase");
baseField.setAccessible(true); baseField.setAccessible(true);
baseField.set(app, FakeContext.get()); baseField.set(app, FakeContext.get());
Class<?> activityThreadClass = ActivityThread.getActivityThreadClass();
Object activityThread = ActivityThread.getActivityThread();
// activityThread.mInitialApplication = app; // activityThread.mInitialApplication = app;
Field mInitialApplicationField = ACTIVITY_THREAD_CLASS.getDeclaredField("mInitialApplication"); Field mInitialApplicationField = activityThreadClass.getDeclaredField("mInitialApplication");
mInitialApplicationField.setAccessible(true); mInitialApplicationField.setAccessible(true);
mInitialApplicationField.set(ACTIVITY_THREAD, app); mInitialApplicationField.set(activityThread, app);
} catch (Throwable throwable) { } catch (Throwable throwable) {
// this is a workaround, so failing is not an error // this is a workaround, so failing is not an error
Ln.d("Could not fill app context: " + throwable.getMessage()); Ln.d("Could not fill app context: " + throwable.getMessage());
} }
} }
private static void fillConfigurationController() {
try {
Class<?> configurationControllerClass = Class.forName("android.app.ConfigurationController");
Class<?> activityThreadInternalClass = Class.forName("android.app.ActivityThreadInternal");
Constructor<?> configurationControllerConstructor = configurationControllerClass.getDeclaredConstructor(activityThreadInternalClass);
configurationControllerConstructor.setAccessible(true);
Object configurationController = configurationControllerConstructor.newInstance(ACTIVITY_THREAD);
Field configurationControllerField = ACTIVITY_THREAD_CLASS.getDeclaredField("mConfigurationController");
configurationControllerField.setAccessible(true);
configurationControllerField.set(ACTIVITY_THREAD, configurationController);
} catch (Throwable throwable) {
Ln.d("Could not fill configuration: " + throwable.getMessage());
}
}
static Context getSystemContext() {
try {
Method getSystemContextMethod = ACTIVITY_THREAD_CLASS.getDeclaredMethod("getSystemContext");
return (Context) getSystemContextMethod.invoke(ACTIVITY_THREAD);
} catch (Throwable throwable) {
// this is a workaround, so failing is not an error
Ln.d("Could not get system context: " + throwable.getMessage());
return null;
}
}
@TargetApi(Build.VERSION_CODES.R) @TargetApi(Build.VERSION_CODES.R)
@SuppressLint("WrongConstant,MissingPermission") @SuppressLint("WrongConstant,MissingPermission,BlockedPrivateApi,SoonBlockedPrivateApi,DiscouragedPrivateApi")
public static AudioRecord createAudioRecord(int source, int sampleRate, int channelConfig, int channels, int channelMask, int encoding) { public static AudioRecord createAudioRecord(int source, int sampleRate, int channelConfig, int channels, int channelMask, int encoding) {
// Vivo (and maybe some other third-party ROMs) modified `AudioRecord`'s constructor, requiring `Context`s from real App environment. // Vivo (and maybe some other third-party ROMs) modified `AudioRecord`'s constructor, requiring `Context`s from real App environment.
// //
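
The rest of the method body is not shown here. For orientation only, the straightforward construction path that such vendor ROMs break looks roughly like the sketch below; the 48 kHz stereo values are illustrative, and this is not the reflection-based fallback that the workaround actually performs.

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    // Plain AudioRecord.Builder path; on some vendor ROMs this fails without a real app Context.
    static AudioRecord createAudioRecordSimple() {
        AudioFormat format = new AudioFormat.Builder()
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .setSampleRate(48000)
                .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
                .build();
        return new AudioRecord.Builder()
                .setAudioSource(MediaRecorder.AudioSource.REMOTE_SUBMIX)
                .setAudioFormat(format)
                .build();
    }
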

View File

@ -17,7 +17,7 @@ import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method; import java.lang.reflect.Method;
@SuppressLint("PrivateApi,DiscouragedPrivateApi") @SuppressLint("PrivateApi,DiscouragedPrivateApi")
public final class ActivityManager { public class ActivityManager {
private final IInterface manager; private final IInterface manager;
private Method getContentProviderExternalMethod; private Method getContentProviderExternalMethod;

View File

@ -0,0 +1,32 @@
package com.genymobile.scrcpy.wrappers;
import java.lang.reflect.Constructor;
public class ActivityThread {
private static final Class<?> activityThreadClass;
private static final Object activityThread;
static {
try {
activityThreadClass = Class.forName("android.app.ActivityThread");
Constructor<?> activityThreadConstructor = activityThreadClass.getDeclaredConstructor();
activityThreadConstructor.setAccessible(true);
activityThread = activityThreadConstructor.newInstance();
} catch (Exception e) {
throw new AssertionError(e);
}
}
private ActivityThread() {
// only static methods
}
public static Object getActivityThread() {
return activityThread;
}
public static Class<?> getActivityThreadClass() {
return activityThreadClass;
}
}
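
To show how this wrapper pairs with Workarounds.fillActivityThread(), here is a best-effort sketch of verifying that the framework reports the instance created here; currentActivityThread() is a hidden static method of android.app.ActivityThread, so the lookup is done by reflection and is purely illustrative.

    import java.lang.reflect.Method;

    // Best-effort check that ActivityThread.currentActivityThread() returns the fake instance.
    try {
        Class<?> cls = ActivityThread.getActivityThreadClass();
        Method current = cls.getDeclaredMethod("currentActivityThread");
        current.setAccessible(true);
        boolean installed = current.invoke(null) == ActivityThread.getActivityThread();
        Ln.d("Fake ActivityThread installed: " + installed);
    } catch (ReflectiveOperationException e) {
        // Ignore: this check is only a diagnostic.
    }
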

View File

@ -11,7 +11,7 @@ import android.os.IInterface;
import java.lang.reflect.InvocationTargetException; import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method; import java.lang.reflect.Method;
public final class ClipboardManager { public class ClipboardManager {
private final IInterface manager; private final IInterface manager;
private Method getPrimaryClipMethod; private Method getPrimaryClipMethod;
private Method setPrimaryClipMethod; private Method setPrimaryClipMethod;
@ -138,8 +138,8 @@ public final class ClipboardManager {
} }
} }
private static void addPrimaryClipChangedListener(Method method, int methodVersion, IInterface manager, IOnPrimaryClipChangedListener listener) private static void addPrimaryClipChangedListener(Method method, int methodVersion, IInterface manager,
throws InvocationTargetException, IllegalAccessException { IOnPrimaryClipChangedListener listener) throws InvocationTargetException, IllegalAccessException {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) { if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) {
method.invoke(manager, listener, FakeContext.PACKAGE_NAME); method.invoke(manager, listener, FakeContext.PACKAGE_NAME);
return; return;

View File

@ -14,7 +14,7 @@ import java.io.Closeable;
import java.lang.reflect.InvocationTargetException; import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method; import java.lang.reflect.Method;
public final class ContentProvider implements Closeable { public class ContentProvider implements Closeable {
public static final String TABLE_SYSTEM = "system"; public static final String TABLE_SYSTEM = "system";
public static final String TABLE_SECURE = "secure"; public static final String TABLE_SECURE = "secure";

Some files were not shown because too many files have changed in this diff.