Compare commits

...

8 Commits

Author SHA1 Message Date
Romain Vimont
277d40701c hidpi test 2018-03-02 19:31:07 +01:00
Romain Vimont
f5530f5195 Add developer documentation
And update README.
2018-03-02 17:12:44 +01:00
Romain Vimont
6b4c9b5a9f Revert "Enable high dpi support"
Just enabling this flag breaks mouse location values.

This reverts commit 64efe2c07dd996857f57622239c7cdeadc319f7a.
2018-03-02 17:12:44 +01:00
Romain Vimont
a82f665114 Fix comment typo
Replace "at network level" by "at the network level".
2018-03-02 15:43:01 +01:00
Romain Vimont
f88cc91a51 Double the default bitrate
Set the default video bitrate to 8Mbps. This greatly increases quality on
fast motion, without negative side effects.
2018-03-02 15:43:01 +01:00
Romain Vimont
38e4b3e39a Rename rotation detection method
The old name checkRotationChanged() did not suggest that the flag was
reset.
2018-03-01 17:16:02 +01:00
Romain Vimont
64efe2c07d Enable high dpi support
Use high DPI if available.

Note that on Mac OS X, setting this flag is not sufficient:

> On Apple's OS X you must set the NSHighResolutionCapable Info.plist
> property to YES, otherwise you will not receive a High DPI OpenGL
> display.

<https://wiki.libsdl.org/SDL_CreateWindow#flags>
2018-03-01 13:25:15 +01:00
Romain Vimont
93e9fb496c Update README
Explain how to build, install and run the application.
2018-03-01 12:17:56 +01:00
9 changed files with 467 additions and 105 deletions

238
DEVELOP.md Normal file
View File

@ -0,0 +1,238 @@
# scrcpy for developers
## Overview
This application is composed of two parts:
- the server (`scrcpy-server.jar`), to be executed on the device,
- the client (the `scrcpy` binary), executed on the host computer.
The client is responsible for pushing the server to the device and starting its
execution.
Once the client and the server are connected to each other, the server initially
sends device information (name and initial screen dimensions), then starts to
send a raw H.264 video stream of the device screen. The client decodes the video
frames and displays them as soon as possible, without buffering, to minimize
latency. The client is not aware of the device rotation (which is handled by the
server); it only knows the dimensions of the video frames.
The client captures relevant keyboard and mouse events and transmits them to the
server, which injects them into the device.
## Server
### Privileges
Capturing the screen requires some privileges, which are granted to `shell`.
The server is a Java application (with a [`public static void main(String...
args)`][main] method), compiled against the Android framework, and executed as
`shell` on the Android device.
[main]: server/src/main/java/com/genymobile/scrcpy/Server.java#L61
To run such a Java application, the classes must be [_dexed_][dex] (typically,
to `classes.dex`). If `my.package.MainClass` is the main class, compiled to
`classes.dex`, pushed to the device in `/data/local/tmp`, then it can be run
with:
adb shell CLASSPATH=/data/local/tmp/classes.dex \
app_process / my.package.MainClass
_The path `/data/local/tmp` is a good candidate for pushing the server, since it's
readable and writable by `shell`, but not world-writable, so a malicious
application cannot replace the server just before the client executes it._
Instead of a raw _dex_ file, `app_process` accepts a _jar_ containing
`classes.dex` (e.g. an [APK]). For simplicity, and to benefit from the gradle
build system, the server is built to an (unsigned) APK (renamed to
`scrcpy-server.jar`).
[dex]: https://en.wikipedia.org/wiki/Dalvik_(software)
[apk]: https://en.wikipedia.org/wiki/Android_application_package
### Hidden methods
Although the server is compiled against the Android framework, [hidden] methods
and classes are not directly accessible (and they may differ from one Android
version to another).
They can still be called using reflection. The communication with hidden
components is handled by [_wrapper_ classes][wrappers] and [aidl].
[hidden]: https://stackoverflow.com/a/31908373/1987178
[wrappers]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/wrappers
[aidl]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/aidl/android/view
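For illustration, here is a minimal sketch of this reflection pattern, resolving the hidden `android.os.ServiceManager` class to retrieve a system service binder (the project's actual wrapper code differs in its details):

```java
import android.os.IBinder;

import java.lang.reflect.Method;

// Sketch only: the hidden class is resolved by name at runtime, since it is
// not visible when compiling against the public SDK.
public final class ServiceManagerExample {
    public static IBinder getService(String name) {
        try {
            Class<?> serviceManager = Class.forName("android.os.ServiceManager");
            Method getService = serviceManager.getDeclaredMethod("getService", String.class);
            return (IBinder) getService.invoke(null, name);
        } catch (ReflectiveOperationException e) {
            throw new AssertionError(e);
        }
    }
}
```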
### Threading
The server uses 2 threads:
- the **main** thread, encoding and streaming the video to the client;
- the **controller** thread, listening for _control events_ (typically,
keyboard and mouse events) from the client.
Since the video encoding is typically performed by dedicated hardware, there
would be no benefit in encoding and streaming in two different threads.
### Screen video encoding
The encoding is managed by [`ScreenEncoder`].
The video is encoded using the [`MediaCodec`] API. The codec takes its input
from a [surface] associated with the display, and writes the resulting H.264
stream to the provided output stream (the socket connected to the client).
[`ScreenEncoder`]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java
[`MediaCodec`]: https://developer.android.com/reference/android/media/MediaCodec.html
[surface]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L63-L64
On device [rotation], the codec, surface and display are reinitialized, and a
new video stream is produced.
New frames are produced only when changes occur on the surface. This is good
because it avoids sending unnecessary frames, but it has drawbacks:
- no frame is sent on start if the device screen does not change,
- after fast motion, the last frame may have poor quality.
Both problems are [solved][repeat] by the flag
[`KEY_REPEAT_PREVIOUS_FRAME_AFTER`][repeat-flag].
[rotation]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L89-L92
[repeat]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L125-L126
[repeat-flag]: https://developer.android.com/reference/android/media/MediaFormat.html#KEY_REPEAT_PREVIOUS_FRAME_AFTER
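For illustration, a simplified sketch of such a configuration follows; the frame rate, I-frame interval and 100ms repeat delay are example values and do not necessarily match `ScreenEncoder`, and attaching the input surface to the display is left out:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;

// Sketch only: configure an H.264 encoder taking its input from a surface.
public final class EncoderSketch {
    public static MediaCodec createEncoder(int width, int height, int bitRate) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        // repeat the previous frame if no new frame is produced for 100ms
        // (value in microseconds), so a static screen still yields output
        format.setLong(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 100_000);

        MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // the codec reads its frames from this surface; the caller must attach
        // it to the display (e.g. through a virtual display) and call start()
        Surface inputSurface = codec.createInputSurface();
        return codec;
    }
}
```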
### Input events injection
_Control events_ are received from the client by the [`EventController`] (run in
a separate thread). There are 5 types of input events:
- keycode (cf [`KeyEvent`]),
- text (special characters may not be handled by keycodes directly),
- mouse motion/click,
- mouse scroll,
- custom command (e.g. to switch the screen on).
All of them may need to inject input events into the system. To do so, they use
the _hidden_ method [`InputManager.injectInputEvent`] (exposed by our
[`InputManager` wrapper][inject-wrapper]).
[`EventController`]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/EventController.java#L70
[`KeyEvent`]: https://developer.android.com/reference/android/view/KeyEvent.html
[`MotionEvent`]: https://developer.android.com/reference/android/view/MotionEvent.html
[`InputManager.injectInputEvent`]: https://android.googlesource.com/platform/frameworks/base/+/oreo-release/core/java/android/hardware/input/InputManager.java#857
[inject-wrapper]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L27
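For illustration, the wrapper essentially resolves the hidden method once and then invokes it by reflection. The sketch below is simplified (in particular, how the `InputManager` instance and the injection mode constant are obtained differs in the real wrapper):

```java
import android.hardware.input.InputManager;
import android.view.InputEvent;

import java.lang.reflect.Method;

// Sketch only: expose the hidden InputManager.injectInputEvent() by reflection.
public final class InputManagerWrapperSketch {
    // mirrors the hidden InputManager.INJECT_INPUT_EVENT_MODE_ASYNC constant
    public static final int INJECT_INPUT_EVENT_MODE_ASYNC = 0;

    private final InputManager manager;
    private final Method injectInputEventMethod;

    public InputManagerWrapperSketch(InputManager manager) throws NoSuchMethodException {
        this.manager = manager;
        // resolved once, then reused for every injected event
        this.injectInputEventMethod =
                InputManager.class.getMethod("injectInputEvent", InputEvent.class, int.class);
    }

    public boolean injectInputEvent(InputEvent event, int mode) {
        try {
            return (Boolean) injectInputEventMethod.invoke(manager, event, mode);
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }
}
```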
## Client
The client relies on [SDL], which provides a cross-platform API for UI, input
events, threading, etc.
The video stream is decoded by [libav] (FFmpeg).
[SDL]: https://www.libsdl.org
[libav]: https://www.libav.org/
### Initialization
On startup, in addition to _libav_ and _SDL_ initialization, the client must
push and start the server on the device, and open a socket so that they may
communicate.
Note that the client-server roles are expressed at the application level:
- the server _serves_ the video stream and handles requests from the client,
- the client _controls_ the device through the server.
However, the roles are inverted at the network level:
- the client opens a server socket and listens on a port before starting the
server,
- the server connects to the client.
This role inversion guarantees that the connection will not fail due to race
conditions, and avoids polling.
Once the server is connected, it sends the device information (name and initial
screen dimensions). Thus, the client can initialize the window and renderer
before the first frame is available.
To minimize startup time, SDL initialization is performed while listening for
the connection from the server (see commit [90a46b4]).
[90a46b4]: https://github.com/Genymobile/scrcpy/commit/90a46b4c45637d083e877020d85ade52a9a5fa8e
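The listen-before-start ordering can be sketched as follows (conceptual only: the real client is written in C, and the way the server is started is reduced to a callback):

```java
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

// Sketch only: because the socket is already listening before the server is
// started, the server's connect() cannot arrive too early, and accept() simply
// blocks until the connection is established (no polling).
public final class ConnectionOrdering {
    public static Socket waitForServer(int localPort, Runnable startServer) throws IOException {
        try (ServerSocket listener = new ServerSocket(localPort)) {
            startServer.run();        // e.g. push and launch the server via adb
            return listener.accept(); // blocks until the server connects
        }
    }
}
```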
### Threading
The client uses 3 threads:
- the **main** thread, executing the SDL event loop,
- the **decoder** thread, decoding video frames,
- the **controller** thread, sending _control events_ to the server.
### Decoder
The [decoder] runs in a separate thread. It uses _libav_ to decode the H.264
stream from the socket, and notifies the main thread when a new frame is
available.
There are two [frames] simultaneously in memory:
- the **decoding** frame, written by the decoder from the decoder thread,
- the **rendering** frame, rendered in a texture from the main thread.
When a new decoded frame is available, the decoder _swaps_ the decoding and
rendering frames (with proper synchronization). Thus, it immediately starts
to decode a new frame while the main thread renders the last one.
[decoder]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/decoder.c
[frames]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/frames.h
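Conceptually, the swap works like the sketch below (the real code is written in C with SDL mutexes and libav frames; the extra bookkeeping, such as tracking whether a new frame is pending, is omitted):

```java
// Sketch only: two frames swapped between a decoder thread and the renderer.
public final class FrameBuffers<T> {
    private T decodingFrame;
    private T renderingFrame;

    public FrameBuffers(T first, T second) {
        this.decodingFrame = first;
        this.renderingFrame = second;
    }

    // decoder thread: always write into the decoding frame
    public synchronized T getDecodingFrame() {
        return decodingFrame;
    }

    // main thread: always render from the rendering frame
    public synchronized T getRenderingFrame() {
        return renderingFrame;
    }

    // decoder thread: publish the freshly decoded frame
    public synchronized void swap() {
        T tmp = decodingFrame;
        decodingFrame = renderingFrame;
        renderingFrame = tmp;
    }
}
```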
### Controller
The [controller] is responsible for sending _control events_ to the device. It
runs in a separate thread, to avoid I/O on the main thread.
On an SDL event received on the main thread, the [input manager][inputmanager]
creates the appropriate [_control events_][controlevent]. It is responsible for
converting SDL events to Android events (using [convert]). It pushes the _control
events_ to a blocking queue held by the controller. On its own thread, the
controller takes events from the queue, serializes them, and sends them to the
device.
[controller]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/controller.h
[controlevent]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/controlevent.h
[inputmanager]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/inputmanager.h
[convert]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/convert.h
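This is the classic producer/consumer pattern; here is a conceptual sketch (the real controller is written in C, and `ControlEvent` and its serialization are placeholders):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch only: events are queued from the main thread and written to the
// socket from the controller thread.
public final class ControllerSketch implements Runnable {
    public interface ControlEvent {
        byte[] serialize();
    }

    private final BlockingQueue<ControlEvent> queue = new LinkedBlockingQueue<>();
    private final OutputStream output;

    public ControllerSketch(OutputStream output) {
        this.output = output;
    }

    // called from the main thread (input manager); never blocks on I/O
    public void push(ControlEvent event) {
        queue.offer(event);
    }

    // runs on the controller thread
    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                ControlEvent event = queue.take(); // blocks until an event is available
                output.write(event.serialize());   // network I/O off the main thread
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } catch (IOException e) {
            // connection closed: stop the loop
        }
    }
}
```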
### UI and event loop
Initialization, input events and rendering are all [managed][scrcpy] in the main
thread.
Events are handled in the [event loop], which either updates the [screen] or
delegates to the [input manager][inputmanager].
[scrcpy]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/scrcpy.c
[event loop]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/scrcpy.c#L38
[screen]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/screen.h
## Hack
For more details, go read the code!
If you find a bug, or have an awesome idea to implement, please discuss and
contribute ;-)

260
README.md
View File

@ -1,141 +1,219 @@
# scrcpy

This application provides display and control of Android devices connected on
USB. It does not require any _root_ access. It works on _GNU/Linux_, _Windows_
and _Mac OS_.

![screenshot](assets/screenshot-debian-600.jpg)

## Requirements

The Android part requires at least API 21 (Android 5.0).

You need [adb] (recent enough so that `adb reverse` is implemented, it works
with 1.0.36). It is available in the [Android SDK platform
tools][platform-tools], or packaged in your distribution (`android-adb-tools`).

On Windows, just download the [platform-tools][platform-tools-windows] and
extract the following files to a directory accessible from your `PATH`:
 - `adb.exe`
 - `AdbWinApi.dll`
 - `AdbWinUsbApi.dll`

Make sure you [enabled adb debugging][enable-adb] on your device(s).

[adb]: https://developer.android.com/studio/command-line/adb.html
[enable-adb]: https://developer.android.com/studio/command-line/adb.html#Enabling
[platform-tools]: https://developer.android.com/studio/releases/platform-tools.html
[platform-tools-windows]: https://dl.google.com/android/repository/platform-tools-latest-windows.zip

The client requires _FFmpeg_ and _LibSDL2_.

## Build and install

### System-specific steps

#### Linux

Install the required packages from your package manager (here, for Debian):

    # runtime dependencies
    sudo apt install ffmpeg libsdl2-2.0.0

    # build dependencies
    sudo apt install make gcc openjdk-8-jdk pkg-config meson zip \
                     libavcodec-dev libavformat-dev libavutil-dev \
                     libsdl2-dev

#### Windows

For Windows, for simplicity, a prebuilt package with all the dependencies
(including `adb`) is available: TODO.

Instead, you may want to build it manually. You need [MSYS2] to build the
project. From an MSYS2 terminal, install the required packages:

[MSYS2]: http://www.msys2.org/

    # runtime dependencies
    pacman -S mingw-w64-x86_64-SDL2 \
              mingw-w64-x86_64-ffmpeg

    # build dependencies
    pacman -S mingw-w64-x86_64-make \
              mingw-w64-x86_64-gcc \
              mingw-w64-x86_64-pkg-config \
              mingw-w64-x86_64-meson \
              zip

Java (>= 7) is not available in MSYS2, so if you plan to build the server,
install it manually and make it available from the `PATH`:

    export PATH="$JAVA_HOME/bin:$PATH"

#### Mac OS

Use [Homebrew] to install the packages:

[Homebrew]: https://brew.sh/

    # runtime dependencies
    brew install sdl2 ffmpeg

    # build dependencies
    brew install gcc pkg-config meson zip

Java (>= 7) is not available in Homebrew, so if you plan to build the server,
install it manually and make it available from the `PATH`:

    export PATH="$JAVA_HOME/bin:$PATH"

### Common steps

Install the [Android SDK] (_Android Studio_), and set `ANDROID_HOME` to
its directory. For example:

[Android SDK]: https://developer.android.com/studio/index.html

    export ANDROID_HOME=~/android/sdk

Then, build `scrcpy`:

    meson x --buildtype release --strip -Db_lto=true
    cd x
    ninja

You can test it from here:

    ninja run

Or you can install it on the system:

    sudo ninja install    # without sudo on Windows

This installs two files:

 - `/usr/local/bin/scrcpy`
 - `/usr/local/share/scrcpy/scrcpy-server.jar`

Just remove them to "uninstall" the application.

#### Prebuilt server

Since the server binary, which will be pushed to the Android device, does not
depend on your system and architecture, you may want to use the prebuilt binary
instead: [`scrcpy-server.jar`](TODO).

In that case, the build does not require Java or the Android SDK.

Download the prebuilt server somewhere, and specify its path during the Meson
configuration:

    meson x --buildtype release --strip -Db_lto=true \
        -Dprebuilt_server=/path/to/scrcpy-server.jar
    cd x
    ninja
    sudo ninja install

## Run

_At runtime, `adb` must be accessible from your `PATH`._

If everything is ok, just plug an Android device, and execute:

    scrcpy

It accepts command-line arguments, listed by:

    scrcpy --help

For example, to decrease video bitrate to 2Mbps (default is 8Mbps):

    scrcpy -b 2M

To limit the video dimensions (e.g. if the device is 2540×1440, but the host
screen is smaller, or cannot decode such a high definition):

    scrcpy -m 1024

If several devices are listed in `adb devices`, you must specify the _serial_:

    scrcpy -s 0123456789abcdef

## Shortcuts

| Action                                 |   Shortcut    |
| ------------------------------------- | -------------:|
| switch fullscreen mode | `Ctrl`+`f` |
| resize window to 1:1 (pixel-perfect) | `Ctrl`+`g` |
| resize window to remove black borders | `Ctrl`+`x` |
| click on `HOME` | `Ctrl`+`h` |
| click on `BACK` | `Ctrl`+`b` |
| click on `APP_SWITCH` | `Ctrl`+`m` |
| click on `VOLUME_UP` | `Ctrl`+`+` |
| click on `VOLUME_DOWN` | `Ctrl`+`-` |
| click on `POWER` | `Ctrl`+`p` |
| turn screen on | _Right-click_ |
| enable/disable FPS counter (on stdout) | `Ctrl`+`i` |
## Why _scrcpy_?
A colleague challenged me to find a name as unpronounceable as [gnirehtet].
[`strcpy`] copies a **str**ing; `scrcpy` copies a **scr**een.
[gnirehtet]: https://github.com/Genymobile/gnirehtet
[`strcpy`]: http://man7.org/linux/man-pages/man3/strcpy.3.html
## Developers
Read the [developers page].
[developers page]: DEVELOP.md
## Licence
Copyright (C) 2018 Genymobile
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@ -73,7 +73,7 @@ conf.set('DEFAULT_MAX_SIZE', '0') # 0: unlimited
# the default video bitrate, in bits/second
# overridden by option --bit-rate
conf.set('DEFAULT_BIT_RATE', '8000000')  # 8Mbps
# whether the app should always display the most recent available frame, even
# if the previous one has not been displayed

View File

@ -35,6 +35,21 @@ static struct input_manager input_manager = {
.screen = &screen,
};
static void hidpi_fix_coordinates(Sint32 *x, Sint32 *y) {
struct screen_sizes sizes = screen_get_sizes(&screen);
Uint16 ww = sizes.window_size.width;
Uint16 wh = sizes.window_size.height;
Uint16 dw = sizes.drawable_size.width;
Uint16 dh = sizes.drawable_size.height;
printf("window=%dx%d; drawable=%dx%d\n", (int) ww, (int) wh, (int) dw, (int) dh);
if (dw && dw != ww) {
*x = ((Sint64) *x) * ww / dw;
}
if (dh && dh != wh) {
*y = ((Sint64) *y) * wh / dh;
}
}
static void event_loop(void) {
SDL_Event event;
while (SDL_WaitEvent(&event)) {
@ -72,14 +87,17 @@ static void event_loop(void) {
input_manager_process_key(&input_manager, &event.key);
break;
case SDL_MOUSEMOTION:
hidpi_fix_coordinates(&event.motion.x, &event.motion.y);
input_manager_process_mouse_motion(&input_manager, &event.motion);
break;
case SDL_MOUSEWHEEL: {
hidpi_fix_coordinates(&event.wheel.x, &event.wheel.y);
input_manager_process_mouse_wheel(&input_manager, &event.wheel);
break;
}
case SDL_MOUSEBUTTONDOWN:
case SDL_MOUSEBUTTONUP: {
hidpi_fix_coordinates(&event.button.x, &event.button.y);
input_manager_process_mouse_button(&input_manager, &event.button);
break;
}

View File

@ -45,6 +45,27 @@ static struct size get_native_window_size(SDL_Window *window) {
return size;
}
// get the size of the window underlying drawable in pixels
// may differ from get_native_window_size() if hi-dpi is enabled
static struct size get_native_drawable_size(SDL_Window *window) {
int width;
int height;
SDL_GL_GetDrawableSize(window, &width, &height);
struct size size;
size.width = width;
size.height = height;
return size;
}
// return both the native window size and native drawable size
struct screen_sizes screen_get_sizes(const struct screen *screen) {
struct screen_sizes sizes;
sizes.window_size = get_native_window_size(screen->window);
sizes.drawable_size = get_native_drawable_size(screen->window);
return sizes;
}
// get the windowed window size
static struct size get_window_size(const struct screen *screen) {
if (screen->fullscreen) {
@ -141,7 +162,8 @@ SDL_bool screen_init_rendering(struct screen *screen, const char *device_name, s
struct size window_size = get_initial_optimal_size(frame_size);
screen->window = SDL_CreateWindow(device_name, SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
window_size.width, window_size.height,
SDL_WINDOW_HIDDEN | SDL_WINDOW_RESIZABLE | SDL_WINDOW_ALLOW_HIGHDPI);
if (!screen->window) {
LOGC("Could not create window: %s", SDL_GetError());
return SDL_FALSE;

View File

@ -34,6 +34,15 @@ struct screen {
.fullscreen = SDL_FALSE, \
}
// the window and drawable size may differ if hi-dpi is enabled
struct screen_sizes {
// the size of the window client area, as reported by SDL_GetWindowSize()
struct size window_size;
// the size of the window underlying drawable, as reported by
// SDL_GL_GetDrawableSize()
struct size drawable_size;
};
// init SDL and set appropriate hints
SDL_bool sdl_init_and_configure(void);
@ -66,4 +75,6 @@ void screen_resize_to_fit(struct screen *screen);
// resize window to 1:1 (pixel-perfect)
void screen_resize_to_pixel_perfect(struct screen *screen);
struct screen_sizes screen_get_sizes(const struct screen *screen);
#endif

View File

@ -95,10 +95,10 @@ SDL_bool server_start(struct server *server, const char *serial, Uint16 local_po
}
// At the application level, the device part is "the server" because it
// serves video stream and control. However, at the network level, the
// client listens and the server connects to the client. That way, the
// client can listen before starting the server app, so there is no need to
// try to connect until the server socket is listening on the device.
server->server_socket = listen_on_port(local_port);
if (server->server_socket == INVALID_SOCKET) {

Binary file not shown (new image, 44 KiB).

View File

@ -16,7 +16,6 @@ import java.util.concurrent.atomic.AtomicBoolean;
public class ScreenEncoder implements Device.RotationListener {
private static final int DEFAULT_BIT_RATE = 4_000_000; // bits per second
private static final int DEFAULT_FRAME_RATE = 60; // fps
private static final int DEFAULT_I_FRAME_INTERVAL = 10; // seconds
@ -40,16 +39,12 @@ public class ScreenEncoder implements Device.RotationListener {
this(bitRate, DEFAULT_FRAME_RATE, DEFAULT_I_FRAME_INTERVAL);
}
public ScreenEncoder() {
this(DEFAULT_BIT_RATE, DEFAULT_FRAME_RATE, DEFAULT_I_FRAME_INTERVAL);
}
@Override
public void onRotationChanged(int rotation) {
rotationChanged.set(true);
}
public boolean consumeRotationChange() {
return rotationChanged.getAndSet(false);
}
@ -87,11 +82,11 @@ public class ScreenEncoder implements Device.RotationListener {
byte[] buf = new byte[bitRate / 8]; // may contain up to 1 second of video
boolean eof = false;
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
while (!consumeRotationChange() && !eof) {
int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1);
eof = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
try {
if (consumeRotationChange()) {
// must restart encoding with new size
break;
}