Compare commits: master...hidpiscale (10 commits)

Commits: ae87885ffa, 65021416b3, 305e76ee6e, ef0029d10e, 36c68c4802, a82f665114, f88cc91a51, 38e4b3e39a, 64efe2c07d, 93e9fb496c

DEVELOP.md (new file)
@@ -0,0 +1,238 @@
# scrcpy for developers


## Overview

This application is composed of two parts:
 - the server (`scrcpy-server.jar`), to be executed on the device;
 - the client (the `scrcpy` binary), executed on the host computer.

The client is responsible for pushing the server to the device and starting its
execution.

Once the client and the server are connected to each other, the server initially
sends device information (name and initial screen dimensions), then starts to
send a raw H.264 video stream of the device screen. The client decodes the video
frames and displays them as soon as possible, without buffering, to minimize
latency. The client is not aware of the device rotation (which is handled by the
server); it just knows the dimensions of the video frames.

The client captures relevant keyboard and mouse events, which it transmits to
the server, which injects them into the device.
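
The device-information message that opens the exchange can be parsed with plain byte arithmetic. A minimal sketch in C; the field layout used here (a fixed-size, NUL-padded name followed by two 16-bit big-endian dimensions) is an assumption for illustration, not the guaranteed wire format:

```c
#include <stdint.h>
#include <string.h>

// assumed field sizes, for illustration only
#define DEVICE_NAME_LEN 64

struct device_info {
    char name[DEVICE_NAME_LEN];
    uint16_t width;
    uint16_t height;
};

// read a 16-bit big-endian (network order) value
static uint16_t read_u16_be(const uint8_t *buf) {
    return (uint16_t) ((buf[0] << 8) | buf[1]);
}

// parse a device-information packet: name, then width and height
static void parse_device_info(const uint8_t *buf, struct device_info *info) {
    memcpy(info->name, buf, DEVICE_NAME_LEN);
    info->name[DEVICE_NAME_LEN - 1] = '\0'; // defensive termination
    info->width = read_u16_be(&buf[DEVICE_NAME_LEN]);
    info->height = read_u16_be(&buf[DEVICE_NAME_LEN + 2]);
}
```

For example, dimension bytes `0x04 0x38` and `0x07 0x80` parse to 1080×1920.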


## Server


### Privileges

Capturing the screen requires some privileges, which are granted to `shell`.

The server is a Java application (with a [`public static void main(String...
args)`][main] method), compiled against the Android framework, and executed as
`shell` on the Android device.

[main]: server/src/main/java/com/genymobile/scrcpy/Server.java#L61

To run such a Java application, the classes must be [_dexed_][dex] (typically,
to `classes.dex`). If `my.package.MainClass` is the main class, compiled to
`classes.dex`, pushed to the device in `/data/local/tmp`, then it can be run
with:

    adb shell CLASSPATH=/data/local/tmp/classes.dex \
        app_process / my.package.MainClass

_The path `/data/local/tmp` is a good candidate for pushing the server, since
it's readable and writable by `shell`, but not world-writable, so a malicious
application cannot replace the server just before the client executes it._

Instead of a raw _dex_ file, `app_process` accepts a _jar_ containing
`classes.dex` (e.g. an [APK]). For simplicity, and to benefit from the Gradle
build system, the server is built to an (unsigned) APK (renamed to
`scrcpy-server.jar`).

[dex]: https://en.wikipedia.org/wiki/Dalvik_(software)
[apk]: https://en.wikipedia.org/wiki/Android_application_package


### Hidden methods

Although the server is compiled against the Android framework, [hidden] methods
and classes are not directly accessible (and they may differ from one Android
version to another).

They can be called using reflection though. The communication with hidden
components is provided by the [_wrapper_ classes][wrappers] and [aidl].

[hidden]: https://stackoverflow.com/a/31908373/1987178
[wrappers]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/wrappers
[aidl]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/aidl/android/view


### Threading

The server uses 2 threads:

 - the **main** thread, encoding and streaming the video to the client;
 - the **controller** thread, listening for _control events_ (typically,
   keyboard and mouse events) from the client.

Since the video encoding is typically performed in hardware, there would be no
benefit in encoding and streaming in two different threads.


### Screen video encoding

The encoding is managed by [`ScreenEncoder`].

The video is encoded using the [`MediaCodec`] API. The codec takes its input
from a [surface] associated with the display, and writes the resulting H.264
stream to the provided output stream (the socket connected to the client).

[`ScreenEncoder`]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java
[`MediaCodec`]: https://developer.android.com/reference/android/media/MediaCodec.html
[surface]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L63-L64

On device [rotation], the codec, surface and display are reinitialized, and a
new video stream is produced.

New frames are produced only when changes occur on the surface. This is good,
because it avoids sending unnecessary frames, but it has drawbacks:

 - it does not send any frame on start if the device screen does not change;
 - after fast motion changes, the last frame may have poor quality.

Both problems are [solved][repeat] by the flag
[`KEY_REPEAT_PREVIOUS_FRAME_AFTER`][repeat-flag].

[rotation]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L89-L92
[repeat]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L125-L126
[repeat-flag]: https://developer.android.com/reference/android/media/MediaFormat.html#KEY_REPEAT_PREVIOUS_FRAME_AFTER
### Input events injection

_Control events_ are received from the client by the [`EventController`] (run in
a separate thread). There are 5 types of input events:

 - keycode (cf. [`KeyEvent`]);
 - text (special characters may not be handled by keycodes directly);
 - mouse motion/click (cf. [`MotionEvent`]);
 - mouse scroll;
 - custom command (e.g. to switch the screen on).

All of them may need to inject input events into the system. To do so, they use
the _hidden_ method [`InputManager.injectInputEvent`] (exposed by our
[`InputManager` wrapper][inject-wrapper]).
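
On the client side, these event kinds map naturally to a tagged union. A simplified C sketch of what such a `control_event` may look like (the real definitions live in `app/src/controlevent.h`; the payload fields shown here are illustrative, not the actual struct layout):

```c
#include <stdint.h>

// the five event types listed above (simplified, illustrative)
enum control_event_type {
    CONTROL_EVENT_TYPE_KEYCODE,
    CONTROL_EVENT_TYPE_TEXT,
    CONTROL_EVENT_TYPE_MOUSE,
    CONTROL_EVENT_TYPE_SCROLL,
    CONTROL_EVENT_TYPE_COMMAND,
};

// a tagged union: the type field selects which payload member is valid
struct control_event {
    enum control_event_type type;
    union {
        struct { uint32_t action; uint32_t keycode; uint32_t metastate; } keycode_event;
        struct { const char *text; } text_event;
        struct { uint32_t action; uint32_t buttons; uint16_t x, y; } mouse_event;
        struct { int32_t hscroll; int32_t vscroll; uint16_t x, y; } scroll_event;
        struct { uint32_t action; } command_event;
    };
};
```

Serializing such an event amounts to writing the type byte followed by the payload fields in network byte order.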

[`EventController`]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/EventController.java#L70
[`KeyEvent`]: https://developer.android.com/reference/android/view/KeyEvent.html
[`MotionEvent`]: https://developer.android.com/reference/android/view/MotionEvent.html
[`InputManager.injectInputEvent`]: https://android.googlesource.com/platform/frameworks/base/+/oreo-release/core/java/android/hardware/input/InputManager.java#857
[inject-wrapper]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L27


## Client

The client relies on [SDL], which provides a cross-platform API for UI, input
events, threading, etc.

The video stream is decoded by [libav] (FFmpeg).

[SDL]: https://www.libsdl.org
[libav]: https://www.libav.org/


### Initialization

On startup, in addition to _libav_ and _SDL_ initialization, the client must
push and start the server on the device, and open a socket so that they may
communicate.

Note that the client-server roles are expressed at the application level:

 - the server _serves_ the video stream and handles requests from the client;
 - the client _controls_ the device through the server.

However, the roles are inverted at the network level:

 - the client opens a server socket and listens on a port before starting the
   server;
 - the server connects to the client.

This role inversion guarantees that the connection will not fail due to race
conditions, and avoids polling.
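
The same inversion can be reproduced on a loopback socket. A self-contained POSIX sketch (plain BSD sockets standing in for scrcpy's `net_*` helpers, and an ephemeral port instead of scrcpy's fixed local port, both assumptions for illustration):

```c
#include <arpa/inet.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

// reproduce the role inversion on the loopback interface: the "client" side
// listens first, then the "device" side connects; on success, fds[0] is the
// listener, fds[1] the connecting end, fds[2] the accepted end
static int establish_reversed_connection(int fds[3]) {
    int listener = socket(AF_INET, SOCK_STREAM, 0);
    if (listener < 0) {
        return -1;
    }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = 0; // any free port, for the sake of the example
    if (bind(listener, (struct sockaddr *) &addr, sizeof(addr)) ||
            listen(listener, 1)) {
        close(listener);
        return -1;
    }

    // the listener exists before anyone connects, so the connection cannot
    // fail because of a race, and no connect-retry polling is needed
    socklen_t len = sizeof(addr);
    getsockname(listener, (struct sockaddr *) &addr, &len);
    int device = socket(AF_INET, SOCK_STREAM, 0);
    if (connect(device, (struct sockaddr *) &addr, sizeof(addr))) {
        close(listener);
        close(device);
        return -1;
    }

    int conn = accept(listener, NULL, NULL);
    fds[0] = listener;
    fds[1] = device;
    fds[2] = conn;
    return conn >= 0 ? 0 : -1;
}
```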

Once the server is connected, it sends the device information (name and initial
screen dimensions). Thus, the client may initialize the window and renderer
before the first frame is available.

To minimize startup time, SDL initialization is performed while listening for
the connection from the server (see commit [90a46b4]).

[90a46b4]: https://github.com/Genymobile/scrcpy/commit/90a46b4c45637d083e877020d85ade52a9a5fa8e


### Threading

The client uses 3 threads:

 - the **main** thread, executing the SDL event loop;
 - the **decoder** thread, decoding video frames;
 - the **controller** thread, sending _control events_ to the server.


### Decoder

The [decoder] runs in a separate thread. It uses _libav_ to decode the H.264
stream from the socket, and notifies the main thread when a new frame is
available.

There are two [frames] simultaneously in memory:

 - the **decoding** frame, written by the decoder from the decoder thread;
 - the **rendering** frame, rendered in a texture from the main thread.

When a new decoded frame is available, the decoder _swaps_ the decoding and
rendering frames (with proper synchronization). Thus, it immediately starts
to decode a new frame while the main thread renders the last one.
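
The swap itself is just a pointer exchange under a lock, which is what makes it cheap. A minimal sketch with pthreads (plain `int` buffers standing in for libav frames, an assumption for illustration):

```c
#include <pthread.h>

// double buffering: the decoder writes into one frame while the main thread
// renders the other; a pointer swap publishes a finished frame without
// copying any frame data
struct frames {
    int *decoding_frame;
    int *rendering_frame;
    pthread_mutex_t mutex;
};

static void frames_swap(struct frames *frames) {
    pthread_mutex_lock(&frames->mutex);
    int *tmp = frames->decoding_frame;
    frames->decoding_frame = frames->rendering_frame;
    frames->rendering_frame = tmp;
    pthread_mutex_unlock(&frames->mutex);
}
```

The renderer must take the same mutex while reading the rendering frame, so a swap never happens mid-render.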

[decoder]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/decoder.c
[frames]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/frames.h


### Controller

The [controller] is responsible for sending _control events_ to the device. It
runs in a separate thread, to avoid I/O on the main thread.

On an SDL event, received on the main thread, the [input manager][inputmanager]
creates appropriate [_control events_][controlevent]. It is responsible for
converting SDL events to Android events (using [convert]). It pushes the
_control events_ to a blocking queue held by the controller. On its own thread,
the controller takes events from the queue, serializes them, and sends them to
the device.
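
The blocking-queue handoff can be sketched with a mutex and a condition variable. A simplified fixed-capacity version in C (integers stand in for control events; the real implementation is in the controller sources):

```c
#include <pthread.h>

#define QUEUE_CAPACITY 64

// fixed-capacity blocking queue (a ring buffer of ints standing in for
// control events, an assumption for illustration)
struct event_queue {
    int items[QUEUE_CAPACITY];
    int head;
    int count;
    pthread_mutex_t mutex;
    pthread_cond_t not_empty;
};

static void queue_init(struct event_queue *q) {
    q->head = 0;
    q->count = 0;
    pthread_mutex_init(&q->mutex, NULL);
    pthread_cond_init(&q->not_empty, NULL);
}

// called from the main thread: push an event and wake up the controller;
// never blocks, so the UI thread is never stalled (a full queue drops)
static int queue_push(struct event_queue *q, int event) {
    pthread_mutex_lock(&q->mutex);
    if (q->count == QUEUE_CAPACITY) {
        pthread_mutex_unlock(&q->mutex);
        return 0;
    }
    q->items[(q->head + q->count) % QUEUE_CAPACITY] = event;
    q->count++;
    pthread_cond_signal(&q->not_empty);
    pthread_mutex_unlock(&q->mutex);
    return 1;
}

// called from the controller thread: block until an event is available
static int queue_take(struct event_queue *q) {
    pthread_mutex_lock(&q->mutex);
    while (q->count == 0) {
        pthread_cond_wait(&q->not_empty, &q->mutex);
    }
    int event = q->items[q->head];
    q->head = (q->head + 1) % QUEUE_CAPACITY;
    q->count--;
    pthread_mutex_unlock(&q->mutex);
    return event;
}
```

This split keeps all socket writes on the controller thread, so a slow connection delays event delivery but never the SDL event loop.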

[controller]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/controller.h
[controlevent]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/controlevent.h
[inputmanager]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/inputmanager.h
[convert]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/convert.h


### UI and event loop

Initialization, input events and rendering are all [managed][scrcpy] in the main
thread.

Events are handled in the [event loop], which either updates the [screen] or
delegates to the [input manager][inputmanager].

[scrcpy]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/scrcpy.c
[event loop]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/scrcpy.c#L38
[screen]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/screen.h


## Hack

For more details, go read the code!

If you find a bug, or have an awesome idea to implement, please discuss and
contribute ;-)

README.md
@@ -1,141 +1,219 @@

# ScrCpy
# scrcpy

This project displays screens of Android devices plugged on USB in live.
This application provides display and control of Android devices connected on
USB. It does not require any _root_ access. It works on _GNU/Linux_, _Windows_
and _Mac OS_.

![screenshot](assets/screenshot-debian-600.jpg)


## Run
## Requirements

### Runtime requirements
The Android part requires at least API 21 (Android 5.0).

This project requires _FFmpeg_, _LibSDL2_ and _LibSDL2-net_.
You need [adb] (recent enough so that `adb reverse` is implemented; it works
with 1.0.36). It is available in the [Android SDK platform
tools][platform-tools], or packaged in your distribution (`android-adb-tools`).

On Windows, just download the [platform-tools][platform-tools-windows] and
extract the following files to a directory accessible from your `PATH`:
 - `adb.exe`
 - `AdbWinApi.dll`
 - `AdbWinUsbApi.dll`

Make sure you [enabled adb debugging][enable-adb] on your device(s).

[adb]: https://developer.android.com/studio/command-line/adb.html
[enable-adb]: https://developer.android.com/studio/command-line/adb.html#Enabling
[platform-tools]: https://developer.android.com/studio/releases/platform-tools.html
[platform-tools-windows]: https://dl.google.com/android/repository/platform-tools-latest-windows.zip

The client requires _FFmpeg_ and _LibSDL2_.


## Build and install

### System-specific steps

#### Linux

Install the packages from your package manager. For example, on Debian:
Install the required packages from your package manager (here, for Debian):

    sudo apt install ffmpeg libsdl2-2.0.0 libsdl2-net-2.0.0
    # runtime dependencies
    sudo apt install ffmpeg libsdl2-2.0.0

    # build dependencies
    sudo apt install make gcc openjdk-8-jdk pkg-config meson zip \
                     libavcodec-dev libavformat-dev libavutil-dev \
                     libsdl2-dev


#### Windows

From [MSYS2]:
For Windows, for simplicity, a prebuilt package with all the dependencies
(including `adb`) is available: TODO.

    pacman -S mingw-w64-x86_64-SDL2
    pacman -S mingw-w64-x86_64-SDL2_net
    pacman -S mingw-w64-x86_64-ffmpeg
Instead, you may want to build it manually. You need [MSYS2] to build the
project. From an MSYS2 terminal, install the required packages:

[MSYS2]: http://www.msys2.org/

    # runtime dependencies
    pacman -S mingw-w64-x86_64-SDL2 \
              mingw-w64-x86_64-ffmpeg

#### MacOS
    # build dependencies
    pacman -S mingw-w64-x86_64-make \
              mingw-w64-x86_64-gcc \
              mingw-w64-x86_64-pkg-config \
              mingw-w64-x86_64-meson \
              zip

TODO


## Build

The project is divided into two parts:
 - the server, running on the device (in `server/`);
 - the client, running on the computer (in `app/`).

The server is a raw Java project requiring the Android SDK. It is not an
Android project: the target file is a `.jar`, and a `main()` method is executed
with _shell_ rights.

The client is a C project using [SDL] and [FFmpeg], built with [Meson]/[Ninja].

The root directory contains a `Makefile` to build both parts.

[sdl]: https://www.libsdl.org/
[ffmpeg]: https://www.ffmpeg.org/
[meson]: http://mesonbuild.com/
[ninja]: https://ninja-build.org/


### Build requirements

Install the [Android SDK], the JDK 8 (`openjdk-8-jdk`), and the packages
described below.

[Android SDK]: https://developer.android.com/studio/index.html


#### Linux

    sudo apt install make gcc openjdk-8-jdk pkg-config meson zip \
                     libavcodec-dev libavformat-dev libavutil-dev \
                     libsdl2-dev libsdl2-net-dev


#### Windows

Install these packages:

    pacman -S mingw-w64-x86_64-make
    pacman -S mingw-w64-x86_64-gcc
    pacman -S mingw-w64-x86_64-pkg-config
    pacman -S mingw-w64-x86_64-meson
    pacman -S zip

Java 8 is not available in MSYS2, so install it manually and make it available
from the `PATH`:
Java (>= 7) is not available in MSYS2, so if you plan to build the server,
install it manually and make it available from the `PATH`:

    export PATH="$JAVA_HOME/bin:$PATH"


### Build
#### Mac OS

Make sure your `ANDROID_HOME` variable is set to your Android SDK directory:
Use [Homebrew] to install the packages:

[Homebrew]: https://brew.sh/

    # runtime dependencies
    brew install sdl2 ffmpeg

    # build dependencies
    brew install gcc pkg-config meson zip

Java (>= 7) is not available in Homebrew, so if you plan to build the server,
install it manually and make it available from the `PATH`:

    export PATH="$JAVA_HOME/bin:$PATH"


### Common steps

Install the [Android SDK] (_Android Studio_), and set `ANDROID_HOME` to
its directory. For example:

[Android SDK]: https://developer.android.com/studio/index.html

    export ANDROID_HOME=~/android/sdk

From the project root directory, execute:
Then, build `scrcpy`:

    make build
    meson x --buildtype release --strip -Db_lto=true
    cd x
    ninja

To run the build:
You can test it from here:

    make run
    ninja run

It is also possible to pass arguments to `scrcpy` via `make`:
Or you can install it on the system:

    make run ARGS="-p 1234"
    sudo ninja install    # without sudo on Windows

The purpose of this command is to execute `scrcpy` during the development.
This installs two files:

 - `/usr/local/bin/scrcpy`
 - `/usr/local/share/scrcpy/scrcpy-server.jar`

Just remove them to "uninstall" the application.


### Test
#### Prebuilt server

To execute unit tests:
Since the server binary, which will be pushed to the Android device, does not
depend on your system and architecture, you may want to use the prebuilt binary
instead: [`scrcpy-server.jar`](TODO).

    make test
In that case, the build does not require Java or the Android SDK.

The server-side tests require JUnit 4:
Download the prebuilt server somewhere, and specify its path during the Meson
configuration:

    sudo apt install junit4

### Generate a release

From the project root directory, execute:

    make release

This will generate the application in `dist/scrcpy/`.
    meson x --buildtype release --strip -Db_lto=true \
        -Dprebuilt_server=/path/to/scrcpy-server.jar
    cd x
    ninja
    sudo ninja install


## Run

Plug a device, and from `dist/scrcpy/`, execute:
_At runtime, `adb` must be accessible from your `PATH`._

    ./scrcpy
If everything is ok, just plug an Android device, and execute:

    scrcpy

It accepts command-line arguments, listed by:

    scrcpy --help

For example, to decrease the video bitrate to 2Mbps (default is 8Mbps):

    scrcpy -b 2M

To limit the video dimensions (e.g. if the device is 2540×1440, but the host
screen is smaller, or cannot decode such a high definition):

    scrcpy -m 1024

If several devices are listed in `adb devices`, you must specify the _serial_:

    ./scrcpy 0123456789abcdef
    scrcpy -s 0123456789abcdef

To change the default port (useful to launch several `scrcpy` simultaneously):

    ./scrcpy -p 1234

## Shortcuts

Other options are available, check `scrcpy --help`.

| Action                                 | Shortcut      |
| -------------------------------------- | -------------:|
| switch fullscreen mode                 | `Ctrl`+`f`    |
| resize window to 1:1 (pixel-perfect)   | `Ctrl`+`g`    |
| resize window to remove black borders  | `Ctrl`+`x`    |
| click on `HOME`                        | `Ctrl`+`h`    |
| click on `BACK`                        | `Ctrl`+`b`    |
| click on `APP_SWITCH`                  | `Ctrl`+`m`    |
| click on `VOLUME_UP`                   | `Ctrl`+`+`    |
| click on `VOLUME_DOWN`                 | `Ctrl`+`-`    |
| click on `POWER`                       | `Ctrl`+`p`    |
| turn screen on                         | _Right-click_ |
| enable/disable FPS counter (on stdout) | `Ctrl`+`i`    |


## Why _scrcpy_?

A colleague challenged me to find a name as unpronounceable as [gnirehtet].

[`strcpy`] copies a **str**ing; `scrcpy` copies a **scr**een.

[gnirehtet]: https://github.com/Genymobile/gnirehtet
[`strcpy`]: http://man7.org/linux/man-pages/man3/strcpy.3.html


## Developers

Read the [developers page].

[developers page]: DEVELOP.md


## Licence

Copyright (C) 2018 Genymobile

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

@@ -8,6 +8,7 @@ src = [
    'src/device.c',
    'src/fpscounter.c',
    'src/frames.c',
    'src/hidpi.c',
    'src/inputmanager.c',
    'src/lockutil.c',
    'src/net.c',

@@ -73,7 +74,7 @@ conf.set('DEFAULT_MAX_SIZE', '0')  # 0: unlimited

# the default video bitrate, in bits/second
# overridden by option --bit-rate
conf.set('DEFAULT_BIT_RATE', '4000000')  # 4Mbps
conf.set('DEFAULT_BIT_RATE', '8000000')  # 8Mbps

# whether the app should always display the most recent available frame, even
# if the previous one has not been displayed

@@ -136,6 +136,7 @@ SDL_bool input_key_from_sdl_to_android(const SDL_KeyboardEvent *from,

SDL_bool mouse_button_from_sdl_to_android(const SDL_MouseButtonEvent *from,
                                          struct size screen_size,
                                          struct hidpi_scale *hidpi_scale,
                                          struct control_event *to) {
    to->type = CONTROL_EVENT_TYPE_MOUSE;

@@ -145,21 +146,30 @@ SDL_bool mouse_button_from_sdl_to_android(const SDL_MouseButtonEvent *from,

    to->mouse_event.buttons = convert_mouse_buttons(SDL_BUTTON(from->button));
    to->mouse_event.position.screen_size = screen_size;
    to->mouse_event.position.point.x = (Uint16) from->x;
    to->mouse_event.position.point.y = (Uint16) from->y;

    Sint32 x = from->x;
    Sint32 y = from->y;
    hidpi_unscale_coordinates(hidpi_scale, &x, &y);
    to->mouse_event.position.point.x = (Uint16) x;
    to->mouse_event.position.point.y = (Uint16) y;

    return SDL_TRUE;
}

SDL_bool mouse_motion_from_sdl_to_android(const SDL_MouseMotionEvent *from,
                                          struct size screen_size,
                                          struct hidpi_scale *hidpi_scale,
                                          struct control_event *to) {
    to->type = CONTROL_EVENT_TYPE_MOUSE;
    to->mouse_event.action = AMOTION_EVENT_ACTION_MOVE;
    to->mouse_event.buttons = convert_mouse_buttons(from->state);
    to->mouse_event.position.screen_size = screen_size;
    to->mouse_event.position.point.x = from->x;
    to->mouse_event.position.point.y = from->y;

    Sint32 x = from->x;
    Sint32 y = from->y;
    hidpi_unscale_coordinates(hidpi_scale, &x, &y);
    to->mouse_event.position.point.x = (Uint16) x;
    to->mouse_event.position.point.y = (Uint16) y;

    return SDL_TRUE;
}

@@ -3,7 +3,10 @@

#include <SDL2/SDL_stdinc.h>
#include <SDL2/SDL_events.h>

#include "common.h"
#include "controlevent.h"
#include "hidpi.h"

struct complete_mouse_motion_event {
    SDL_MouseMotionEvent *mouse_motion_event;

@@ -19,12 +22,14 @@ SDL_bool input_key_from_sdl_to_android(const SDL_KeyboardEvent *from,
                                       struct control_event *to);
SDL_bool mouse_button_from_sdl_to_android(const SDL_MouseButtonEvent *from,
                                          struct size screen_size,
                                          struct hidpi_scale *hidpi_scale,
                                          struct control_event *to);

// the video size may be different from the real device size, so we need the
// size to which the absolute position applies, to scale it accordingly
SDL_bool mouse_motion_from_sdl_to_android(const SDL_MouseMotionEvent *from,
                                          struct size screen_size,
                                          struct hidpi_scale *hidpi_scale,
                                          struct control_event *to);

// on Android, a scroll event requires the current mouse position

app/src/hidpi.c (new file)
@@ -0,0 +1,16 @@
#include "hidpi.h"

void hidpi_get_scale(struct screen *screen, struct hidpi_scale *scale) {
    SDL_GL_GetDrawableSize(screen->window, &scale->horizontal.num, &scale->vertical.num);
    SDL_GetWindowSize(screen->window, &scale->horizontal.div, &scale->vertical.div);
}

void hidpi_unscale_coordinates(struct hidpi_scale *scale, Sint32 *x, Sint32 *y) {
    // to unscale, we divide by the ratio (so num and div are reversed)
    if (scale->horizontal.num) {
        *x = ((Sint64) *x) * scale->horizontal.div / scale->horizontal.num;
    }
    if (scale->vertical.num) {
        *y = ((Sint64) *y) * scale->vertical.div / scale->vertical.num;
    }
}
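
The unscaling above is plain rational arithmetic; it can be exercised standalone, with the SDL types replaced by plain integers (an assumption for illustration):

```c
#include <stdint.h>

// rational scale factor: drawable size / window size
struct ratio {
    int num;
    int div;
};

// unscaling divides by the ratio, i.e. multiplies by div/num; the 64-bit
// intermediate avoids overflow for large coordinates
static void unscale(const struct ratio *r, int32_t *v) {
    if (r->num) {
        *v = (int32_t) (((int64_t) *v) * r->div / r->num);
    }
}
```

On a 2x HiDPI display (drawable 2560 wide, window 1280 wide), a drawable x of 500 unscales to a window x of 250.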

app/src/hidpi.h (new file)
@@ -0,0 +1,24 @@
#ifndef HIDPI_H
#define HIDPI_H

#include "common.h"
#include "screen.h"

// rational number num/div
struct rational {
    int num;
    int div;
};

struct hidpi_scale {
    struct rational horizontal; // drawable.width / window.width
    struct rational vertical;   // drawable.height / window.height
};

void hidpi_get_scale(struct screen *screen, struct hidpi_scale *hidpi_scale);

// mouse locations need to be "unscaled" if hidpi is enabled
// <https://nlguillemot.wordpress.com/2016/12/11/high-dpi-rendering/>
void hidpi_unscale_coordinates(struct hidpi_scale *hidpi_scale, Sint32 *x, Sint32 *y);

#endif

@@ -1,13 +1,37 @@
#include "inputmanager.h"

#include "convert.h"
#include "hidpi.h"
#include "lockutil.h"
#include "log.h"

static struct point get_mouse_point(void) {
    int x;
    int y;
    SDL_GetMouseState(&x, &y);
// Convert window coordinates (as provided by SDL_GetMouseState()) to renderer
// coordinates (as provided in SDL mouse events).
//
// See my question:
// <https://stackoverflow.com/questions/49111054/how-to-get-mouse-position-on-mouse-wheel-event>
static void convert_to_renderer_coordinates(SDL_Renderer *renderer, int *x, int *y) {
    SDL_Rect viewport;
    float scale_x, scale_y;
    SDL_RenderGetViewport(renderer, &viewport);
    SDL_RenderGetScale(renderer, &scale_x, &scale_y);
    *x = (int) (*x / scale_x) - viewport.x;
    *y = (int) (*y / scale_y) - viewport.y;
}

static struct point get_mouse_point(struct screen *screen) {
    int mx;
    int my;
    SDL_GetMouseState(&mx, &my);
    convert_to_renderer_coordinates(screen->renderer, &mx, &my);

    struct hidpi_scale hidpi_scale;
    hidpi_get_scale(screen, &hidpi_scale);

    // SDL sometimes uses "int", sometimes "Sint32"
    Sint32 x = mx;
    Sint32 y = my;
    hidpi_unscale_coordinates(&hidpi_scale, &x, &y);

    SDL_assert_release(x >= 0 && x < 0x10000 && y >= 0 && y < 0x10000);
    return (struct point) {
        .x = (Uint16) x,

@@ -178,8 +202,12 @@ void input_manager_process_mouse_motion(struct input_manager *input_manager,
        // do not send motion events when no button is pressed
        return;
    }

    struct hidpi_scale hidpi_scale;
    hidpi_get_scale(input_manager->screen, &hidpi_scale);

    struct control_event control_event;
    if (mouse_motion_from_sdl_to_android(event, input_manager->screen->frame_size, &control_event)) {
    if (mouse_motion_from_sdl_to_android(event, input_manager->screen->frame_size, &hidpi_scale, &control_event)) {
        if (!controller_push_event(input_manager->controller, &control_event)) {
            LOGW("Cannot send mouse motion event");
        }

@@ -192,8 +220,12 @@ void input_manager_process_mouse_button(struct input_manager *input_manager,
        turn_screen_on(input_manager->controller);
        return;
    };

    struct hidpi_scale hidpi_scale;
    hidpi_get_scale(input_manager->screen, &hidpi_scale);

    struct control_event control_event;
    if (mouse_button_from_sdl_to_android(event, input_manager->screen->frame_size, &control_event)) {
    if (mouse_button_from_sdl_to_android(event, input_manager->screen->frame_size, &hidpi_scale, &control_event)) {
        if (!controller_push_event(input_manager->controller, &control_event)) {
            LOGW("Cannot send mouse button event");
        }

@@ -204,7 +236,7 @@ void input_manager_process_mouse_wheel(struct input_manager *input_manager,
                                       const SDL_MouseWheelEvent *event) {
    struct position position = {
        .screen_size = input_manager->screen->frame_size,
        .point = get_mouse_point(),
        .point = get_mouse_point(input_manager->screen),
    };
    struct control_event control_event;
    if (mouse_wheel_from_sdl_to_android(event, position, &control_event)) {

@@ -141,7 +141,8 @@ SDL_bool screen_init_rendering(struct screen *screen, const char *device_name, s

    struct size window_size = get_initial_optimal_size(frame_size);
    screen->window = SDL_CreateWindow(device_name, SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
                                      window_size.width, window_size.height, SDL_WINDOW_HIDDEN | SDL_WINDOW_RESIZABLE);
                                      window_size.width, window_size.height,
                                      SDL_WINDOW_HIDDEN | SDL_WINDOW_RESIZABLE | SDL_WINDOW_ALLOW_HIGHDPI);
    if (!screen->window) {
        LOGC("Could not create window: %s", SDL_GetError());
        return SDL_FALSE;

@@ -71,6 +71,7 @@ static socket_t listen_on_port(Uint16 port) {

static void close_socket(socket_t *socket) {
    SDL_assert(*socket != INVALID_SOCKET);
    net_shutdown(*socket, SHUT_RDWR);
    if (!net_close(*socket)) {
        LOGW("Cannot close socket");
        return;

@@ -95,10 +96,10 @@ SDL_bool server_start(struct server *server, const char *serial, Uint16 local_po
    }

    // At the application level, the device part is "the server" because it
    // serves video stream and control. However, at network level, the client
    // listens and the server connects to the client. That way, the client can
    // listen before starting the server app, so there is no need to try to
    // connect until the server socket is listening on the device.
    // serves video stream and control. However, at the network level, the
    // client listens and the server connects to the client. That way, the
    // client can listen before starting the server app, so there is no need to
    // try to connect until the server socket is listening on the device.

    server->server_socket = listen_on_port(local_port);
    if (server->server_socket == INVALID_SOCKET) {

assets/screenshot-debian-600.jpg (new binary file, 44 KiB)
@@ -16,7 +16,6 @@ import java.util.concurrent.atomic.AtomicBoolean;

public class ScreenEncoder implements Device.RotationListener {

    private static final int DEFAULT_BIT_RATE = 4_000_000; // bits per second
    private static final int DEFAULT_FRAME_RATE = 60; // fps
    private static final int DEFAULT_I_FRAME_INTERVAL = 10; // seconds

@@ -40,16 +39,12 @@ public class ScreenEncoder implements Device.RotationListener {
        this(bitRate, DEFAULT_FRAME_RATE, DEFAULT_I_FRAME_INTERVAL);
    }

    public ScreenEncoder() {
        this(DEFAULT_BIT_RATE, DEFAULT_FRAME_RATE, DEFAULT_I_FRAME_INTERVAL);
    }

    @Override
    public void onRotationChanged(int rotation) {
        rotationChanged.set(true);
    }

    public boolean checkRotationChanged() {
    public boolean consumeRotationChange() {
        return rotationChanged.getAndSet(false);
    }

@@ -87,11 +82,11 @@ public class ScreenEncoder implements Device.RotationListener {
        byte[] buf = new byte[bitRate / 8]; // may contain up to 1 second of video
        boolean eof = false;
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        while (!checkRotationChanged() && !eof) {
        while (!consumeRotationChange() && !eof) {
            int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1);
            eof = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            try {
                if (checkRotationChanged()) {
                if (consumeRotationChange()) {
                    // must restart encoding with new size
                    break;
                }