Compare commits


19 Commits

Author SHA1 Message Date
599514ba10 Add --angle
Add an option to rotate the video content by a custom angle.
2024-11-15 21:54:33 +01:00
239553f741 Do not recreate display on every rotation
This semantically reverts 7e3b935932.

The issue seems to be fixed if setSurface() is called before resize() on
the virtual display.

Refs #4840 <https://github.com/Genymobile/scrcpy/pull/4840>
2024-11-15 21:54:33 +01:00
9a1f9f6923 Remove deprecated options 2024-11-15 21:54:33 +01:00
a2938921b5 Use natural device orientation for --new-display
If no size is provided with --new-display, the main display size is
used. But the actual size depended on the current device orientation.

To make it deterministic, use the size of the natural device orientation
(portrait for phones, landscape for tablets).
2024-11-15 21:54:33 +01:00
2e69464ed0 Improve mismatching event size warning
Include both the event size and the current size in the warning message.
2024-11-15 21:54:33 +01:00
19d7d2a492 Apply filters to virtual display capture 2024-11-15 21:54:33 +01:00
1fd6c1c82a Apply filters to camera capture 2024-11-15 21:54:33 +01:00
a270e99147 Add --capture-orientation
Deprecate --lock-video-orientation in favor of a more general option
--capture-orientation, which supports all possible orientations
(0, 90, 180, 270, flip0, flip90, flip180, flip270), and a "locked" flag
via a '@' prefix.

All the old "locked video orientations" are supported:
 - --lock-video-orientation      ->  --capture-orientation=@
 - --lock-video-orientation=0    ->  --capture-orientation=@0
 - --lock-video-orientation=90   ->  --capture-orientation=@90
 - --lock-video-orientation=180  ->  --capture-orientation=@180
 - --lock-video-orientation=270  ->  --capture-orientation=@270

In addition, --capture-orientation can rotate/flip the display without
locking, so that it follows the physical device rotation.

For example:

    scrcpy --capture-orientation=flip90

always flips and rotates the capture by 90° clockwise.

The arguments are consistent with --orientation (which provides a
separate client-side orientation).
2024-11-15 21:54:33 +01:00
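The option migration in this commit message is a direct table lookup. As an illustration only (`LockOrientationMigration` is a hypothetical helper, not part of scrcpy), the mapping can be sketched as:

```java
import java.util.Map;

// Hypothetical helper (not scrcpy code) illustrating the migration table
// above: each legacy --lock-video-orientation value maps to a locked
// ('@'-prefixed) --capture-orientation value.
public class LockOrientationMigration {
    static final Map<String, String> MAPPING = Map.of(
            "", "@",      // --lock-video-orientation without a value
            "0", "@0",
            "90", "@90",
            "180", "@180",
            "270", "@270");

    // Return the equivalent new command-line option for a legacy value
    public static String toCaptureOrientation(String legacyValue) {
        String mapped = MAPPING.get(legacyValue);
        if (mapped == null) {
            throw new IllegalArgumentException("Unsupported value: " + legacyValue);
        }
        return "--capture-orientation=" + mapped;
    }

    public static void main(String[] args) {
        System.out.println(toCaptureOrientation("90"));
    }
}
```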
a809e2f78d Handle virtual display rotation 2024-11-15 21:54:33 +01:00
e18ba38e89 Extract DisplayCapture
This will allow sharing common code between ScreenCapture and
NewDisplayCapture.
2024-11-15 21:54:33 +01:00
7d92f43cfc Reimplement lock orientation using transforms
Reimplement the --lock-video-orientation feature using affine
transforms.
2024-11-15 21:54:33 +01:00
e092f65a8d Reimplement crop using transforms
Reimplement the --crop feature using affine transforms.
2024-11-15 21:54:33 +01:00
71386c375d Ignore signalEndOfStream() error
This may be called at any time to interrupt the current encoding,
including when MediaCodec is in an unexpected state.
2024-11-15 21:54:33 +01:00
97f96cde81 Move mediaCodec.stop() to finally block
This will allow stopping MediaCodec only after the clean-up of other
components, which must be performed before MediaCodec is stopped.
2024-11-15 21:54:33 +01:00
10ab1ca8e0 Make PositionMapper use affine transforms
This will allow applying the transformations performed by video filters.
2024-11-15 21:54:33 +01:00
34379fe2d3 Temporarily ignore lock video orientation and crop
Get rid of old code implementing --lock-video-orientation and --crop
features on the device side.

They will be reimplemented differently.
2024-11-15 21:54:33 +01:00
eb137770e9 Split computeVideoSize() into limit() and round8()
Expose two methods on Size directly:
 - limit() to downscale a size;
 - round8() to round both dimensions to multiples of 8.

This will allow removing ScreenInfo completely.
2024-11-15 21:54:33 +01:00
4338e3dd08 Revert "Disable broken options on Android 14"
This reverts commit d62fa8880e.

These options will be reimplemented differently.
2024-11-15 21:54:33 +01:00
48b1bef65d Add on-device OpenGL video filter architecture 2024-11-15 21:54:33 +01:00
35 changed files with 1678 additions and 485 deletions

View File

@ -2,6 +2,7 @@ _scrcpy() {
local cur prev words cword
local opts="
--always-on-top
--angle
--audio-bit-rate=
--audio-buffer=
--audio-codec=
@ -17,6 +18,7 @@ _scrcpy() {
--camera-fps=
--camera-high-speed
--camera-size=
--capture-orientation=
--crop=
-d --select-usb
--disable-screensaver
@ -37,8 +39,6 @@ _scrcpy() {
--list-cameras
--list-displays
--list-encoders
--lock-video-orientation
--lock-video-orientation=
-m --max-size=
-M
--max-fps=

View File

@ -9,6 +9,7 @@ local arguments
arguments=(
'--always-on-top[Make scrcpy window always on top \(above other windows\)]'
'--angle=[Rotate the video content by a custom angle, in degrees]'
'--audio-bit-rate=[Encode the audio at the given bit-rate]'
'--audio-buffer=[Configure the audio buffering delay (in milliseconds)]'
'--audio-codec=[Select the audio codec]:codec:(opus aac flac raw)'
@ -24,6 +25,7 @@ arguments=(
'--camera-facing=[Select the device camera by its facing direction]:facing:(front back external)'
'--camera-fps=[Specify the camera capture frame rate]'
'--camera-size=[Specify an explicit camera capture size]'
'--capture-orientation=[Set the capture video orientation]:orientation:(0 90 180 270 flip0 flip90 flip180 flip270 @0 @90 @180 @270 @flip0 @flip90 @flip180 @flip270)'
'--crop=[\[width\:height\:x\:y\] Crop the device screen on the server]'
{-d,--select-usb}'[Use USB device]'
'--disable-screensaver[Disable screensaver while scrcpy is running]'
@ -44,7 +46,6 @@ arguments=(
'--list-cameras[List cameras available on the device]'
'--list-displays[List displays available on the device]'
'--list-encoders[List video and audio encoders available on the device]'
'--lock-video-orientation=[Lock video orientation]:orientation:(unlocked initial 0 90 180 270)'
{-m,--max-size=}'[Limit both the width and height of the video to value]'
'-M[Use UHID/AOA mouse (same as --mouse=uhid or --mouse=aoa, depending on OTG mode)]'
'--max-fps=[Limit the frame rate of screen capture]'

View File

@ -19,6 +19,10 @@ provides display and control of Android devices connected on USB (or over TCP/IP
.B \-\-always\-on\-top
Make scrcpy window always on top (above other windows).
.TP
.BI "\-\-angle " degrees
Rotate the video content by a custom angle, in degrees counter-clockwise.
.TP
.BI "\-\-audio\-bit\-rate " value
Encode the audio at the given bit rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
@ -121,6 +125,18 @@ If not specified, Android's default frame rate (30 fps) is used.
.BI "\-\-camera\-size " width\fRx\fIheight
Specify an explicit camera capture size.
.TP
.BI "\-\-capture\-orientation " value
Set the capture video orientation.
Possible values are 0, 90, 180, 270, flip0, flip90, flip180 and flip270, possibly prefixed by '@'.
The number represents the clockwise rotation in degrees; the "flip" keyword applies a horizontal flip before the rotation.
If a leading '@' is passed (@90) for display capture, then the rotation is locked, and is relative to the natural device orientation.
If '@' is passed alone, then the rotation is locked to the initial device orientation.
Default is 0.
.TP
.BI "\-\-crop " width\fR:\fIheight\fR:\fIx\fR:\fIy
Crop the device screen on the server.
@ -241,16 +257,6 @@ List video and audio encoders available on the device.
.B \-\-list\-displays
List displays available on the device.
.TP
\fB\-\-lock\-video\-orientation\fR[=\fIvalue\fR]
Lock capture video orientation to \fIvalue\fR.
Possible values are "unlocked", "initial" (locked to the initial orientation), 0, 90, 180, and 270. The values represent the clockwise rotation from the natural device orientation, in degrees.
Default is "unlocked".
Passing the option without argument is equivalent to passing "initial".
.TP
.BI "\-m, \-\-max\-size " value
Limit both the width and height of the video to \fIvalue\fR. The other dimension is computed so that the device aspect\-ratio is preserved.

View File

@ -107,6 +107,8 @@ enum {
OPT_LIST_APPS,
OPT_START_APP,
OPT_SCREEN_OFF_TIMEOUT,
OPT_CAPTURE_ORIENTATION,
OPT_ANGLE,
};
struct sc_option {
@ -148,6 +150,13 @@ static const struct sc_option options[] = {
.longopt = "always-on-top",
.text = "Make scrcpy window always on top (above other windows).",
},
{
.longopt_id = OPT_ANGLE,
.longopt = "angle",
.argdesc = "degrees",
.text = "Rotate the video content by a custom angle, in degrees "
"counter-clockwise.",
},
{
.longopt_id = OPT_AUDIO_BIT_RATE,
.longopt = "audio-bit-rate",
@ -471,18 +480,27 @@ static const struct sc_option options[] = {
.text = "List video and audio encoders available on the device.",
},
{
.longopt_id = OPT_CAPTURE_ORIENTATION,
.longopt = "capture-orientation",
.argdesc = "value",
.text = "Set the capture video orientation.\n"
"Possible values are 0, 90, 180, 270, flip0, flip90, flip180 "
"and flip270, possibly prefixed by '@'.\n"
"The number represents the clockwise rotation in degrees; the "
"\"flip\" keyword applies a horizontal flip before the "
"rotation.\n"
"If a leading '@' is passed (@90) for display capture, then "
"the rotation is locked, and is relative to the natural device "
"orientation.\n"
"If '@' is passed alone, then the rotation is locked to the "
"initial device orientation.\n"
"Default is 0.",
},
{
// deprecated
.longopt_id = OPT_LOCK_VIDEO_ORIENTATION,
.longopt = "lock-video-orientation",
.argdesc = "value",
.optional_arg = true,
.text = "Lock capture video orientation to value.\n"
"Possible values are \"unlocked\", \"initial\" (locked to the "
"initial orientation), 0, 90, 180 and 270. The values "
"represent the clockwise rotation from the natural device "
"orientation, in degrees.\n"
"Default is \"unlocked\".\n"
"Passing the option without argument is equivalent to passing "
"\"initial\".",
},
{
.shortopt = 'm',
@ -1582,78 +1600,6 @@ parse_audio_output_buffer(const char *s, sc_tick *tick) {
return true;
}
static bool
parse_lock_video_orientation(const char *s,
enum sc_lock_video_orientation *lock_mode) {
if (!s || !strcmp(s, "initial")) {
// Without argument, lock the initial orientation
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_INITIAL;
return true;
}
if (!strcmp(s, "unlocked")) {
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_UNLOCKED;
return true;
}
if (!strcmp(s, "0")) {
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_0;
return true;
}
if (!strcmp(s, "90")) {
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_90;
return true;
}
if (!strcmp(s, "180")) {
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_180;
return true;
}
if (!strcmp(s, "270")) {
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_270;
return true;
}
if (!strcmp(s, "1")) {
LOGW("--lock-video-orientation=1 is deprecated, use "
"--lock-video-orientation=270 instead.");
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_270;
return true;
}
if (!strcmp(s, "2")) {
LOGW("--lock-video-orientation=2 is deprecated, use "
"--lock-video-orientation=180 instead.");
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_180;
return true;
}
if (!strcmp(s, "3")) {
LOGW("--lock-video-orientation=3 is deprecated, use "
"--lock-video-orientation=90 instead.");
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_90;
return true;
}
LOGE("Unsupported --lock-video-orientation value: %s (expected initial, "
"unlocked, 0, 90, 180 or 270).", s);
return false;
}
static bool
parse_rotation(const char *s, uint8_t *rotation) {
long value;
bool ok = parse_integer_arg(s, &value, false, 0, 3, "rotation");
if (!ok) {
return false;
}
*rotation = (uint8_t) value;
return true;
}
static bool
parse_orientation(const char *s, enum sc_orientation *orientation) {
if (!strcmp(s, "0")) {
@ -1693,6 +1639,32 @@ parse_orientation(const char *s, enum sc_orientation *orientation) {
return false;
}
static bool
parse_capture_orientation(const char *s, enum sc_orientation *orientation,
enum sc_orientation_lock *lock) {
if (*s == '\0') {
LOGE("Capture orientation may not be empty (expected 0, 90, 180, 270, "
"flip0, flip90, flip180 or flip270, possibly prefixed by '@')");
return false;
}
// Lock the orientation by a leading '@'
if (s[0] == '@') {
// Consume '@'
++s;
if (*s == '\0') {
// Only '@': lock to the initial orientation (orientation is unused)
*lock = SC_ORIENTATION_LOCKED_INITIAL;
return true;
}
*lock = SC_ORIENTATION_LOCKED_VALUE;
} else {
*lock = SC_ORIENTATION_UNLOCKED;
}
return parse_orientation(s, orientation);
}
static bool
parse_window_position(const char *s, int16_t *position) {
// special value for "auto"
@ -2302,8 +2274,8 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->crop = optarg;
break;
case OPT_DISPLAY:
LOGW("--display is deprecated, use --display-id instead.");
// fall through
LOGE("--display has been removed, use --display-id instead.");
return false;
case OPT_DISPLAY_ID:
if (!parse_display_id(optarg, &opts->display_id)) {
return false;
@ -2367,8 +2339,13 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
"--mouse=uhid instead.");
return false;
case OPT_LOCK_VIDEO_ORIENTATION:
if (!parse_lock_video_orientation(optarg,
&opts->lock_video_orientation)) {
LOGE("--lock-video-orientation has been removed, use "
"--capture-orientation instead.");
return false;
case OPT_CAPTURE_ORIENTATION:
if (!parse_capture_orientation(optarg,
&opts->capture_orientation,
&opts->capture_orientation_lock)) {
return false;
}
break;
@ -2386,8 +2363,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->control = false;
break;
case OPT_NO_DISPLAY:
LOGW("--no-display is deprecated, use --no-playback instead.");
// fall through
LOGE("--no-display has been removed, use --no-playback "
"instead.");
return false;
case 'N':
opts->video_playback = false;
opts->audio_playback = false;
@ -2473,32 +2451,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->key_inject_mode = SC_KEY_INJECT_MODE_RAW;
break;
case OPT_ROTATION:
LOGW("--rotation is deprecated, use --display-orientation "
"instead.");
uint8_t rotation;
if (!parse_rotation(optarg, &rotation)) {
return false;
}
assert(rotation <= 3);
switch (rotation) {
case 0:
opts->display_orientation = SC_ORIENTATION_0;
break;
case 1:
// rotation 1 was 90° counterclockwise, but orientation
// is expressed clockwise
opts->display_orientation = SC_ORIENTATION_270;
break;
case 2:
opts->display_orientation = SC_ORIENTATION_180;
break;
case 3:
// rotation 3 was 270° counterclockwise, but orientation
// is expressed clockwise
opts->display_orientation = SC_ORIENTATION_90;
break;
}
break;
LOGE("--rotation has been removed, use --orientation or "
"--capture-orientation instead.");
return false;
case OPT_DISPLAY_ORIENTATION:
if (!parse_orientation(optarg, &opts->display_orientation)) {
return false;
@ -2559,23 +2514,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
}
break;
case OPT_FORWARD_ALL_CLICKS:
LOGW("--forward-all-clicks is deprecated, "
LOGE("--forward-all-clicks has been removed, "
"use --mouse-bind=++++ instead.");
opts->mouse_bindings = (struct sc_mouse_bindings) {
.pri = {
.right_click = SC_MOUSE_BINDING_CLICK,
.middle_click = SC_MOUSE_BINDING_CLICK,
.click4 = SC_MOUSE_BINDING_CLICK,
.click5 = SC_MOUSE_BINDING_CLICK,
},
.sec = {
.right_click = SC_MOUSE_BINDING_CLICK,
.middle_click = SC_MOUSE_BINDING_CLICK,
.click4 = SC_MOUSE_BINDING_CLICK,
.click5 = SC_MOUSE_BINDING_CLICK,
},
};
break;
return false;
case OPT_LEGACY_PASTE:
opts->legacy_paste = true;
break;
@ -2583,9 +2524,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->power_off_on_close = true;
break;
case OPT_DISPLAY_BUFFER:
LOGW("--display-buffer is deprecated, use --video-buffer "
LOGE("--display-buffer has been removed, use --video-buffer "
"instead.");
// fall through
return false;
case OPT_VIDEO_BUFFER:
if (!parse_buffering_time(optarg, &opts->video_buffer)) {
return false;
@ -2758,6 +2699,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false;
}
break;
case OPT_ANGLE:
opts->angle = optarg;
break;
default:
// getopt prints the error message on stderr
return false;
@ -2852,14 +2796,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
return false;
}
if (opts->lock_video_orientation ==
SC_LOCK_VIDEO_ORIENTATION_UNLOCKED) {
LOGI("Video orientation is locked for v4l2 sink. "
"See --lock-video-orientation.");
opts->lock_video_orientation =
SC_LOCK_VIDEO_ORIENTATION_INITIAL_AUTO;
}
// V4L2 could not handle size change.
// Do not log because downsizing on error is the default behavior,
// not an explicit request from the user.

View File

@ -50,7 +50,8 @@ const struct scrcpy_options scrcpy_options_default = {
.video_bit_rate = 0,
.audio_bit_rate = 0,
.max_fps = NULL,
.lock_video_orientation = SC_LOCK_VIDEO_ORIENTATION_UNLOCKED,
.capture_orientation = SC_ORIENTATION_0,
.capture_orientation_lock = SC_ORIENTATION_UNLOCKED,
.display_orientation = SC_ORIENTATION_0,
.record_orientation = SC_ORIENTATION_0,
.window_x = SC_WINDOW_POSITION_UNDEFINED,
@ -106,6 +107,7 @@ const struct scrcpy_options scrcpy_options_default = {
.audio_dup = false,
.new_display = NULL,
.start_app = NULL,
.angle = NULL,
};
enum sc_orientation

View File

@ -84,6 +84,12 @@ enum sc_orientation { // v v v
SC_ORIENTATION_FLIP_270, // 1 1 1
};
enum sc_orientation_lock {
SC_ORIENTATION_UNLOCKED,
SC_ORIENTATION_LOCKED_VALUE, // lock to specified orientation
SC_ORIENTATION_LOCKED_INITIAL, // lock to initial device orientation
};
static inline bool
sc_orientation_is_mirror(enum sc_orientation orientation) {
assert(!(orientation & ~7));
@ -130,18 +136,6 @@ sc_orientation_get_name(enum sc_orientation orientation) {
}
}
enum sc_lock_video_orientation {
SC_LOCK_VIDEO_ORIENTATION_UNLOCKED = -1,
// lock the current orientation when scrcpy starts
SC_LOCK_VIDEO_ORIENTATION_INITIAL = -2,
// like SC_LOCK_VIDEO_ORIENTATION_INITIAL, but set automatically
SC_LOCK_VIDEO_ORIENTATION_INITIAL_AUTO = -3,
SC_LOCK_VIDEO_ORIENTATION_0 = 0,
SC_LOCK_VIDEO_ORIENTATION_90 = 3,
SC_LOCK_VIDEO_ORIENTATION_180 = 2,
SC_LOCK_VIDEO_ORIENTATION_270 = 1,
};
enum sc_keyboard_input_mode {
SC_KEYBOARD_INPUT_MODE_AUTO,
SC_KEYBOARD_INPUT_MODE_UHID_OR_AOA, // normal vs otg mode
@ -253,7 +247,9 @@ struct scrcpy_options {
uint32_t video_bit_rate;
uint32_t audio_bit_rate;
const char *max_fps; // float to be parsed by the server
enum sc_lock_video_orientation lock_video_orientation;
const char *angle; // float to be parsed by the server
enum sc_orientation capture_orientation;
enum sc_orientation_lock capture_orientation_lock;
enum sc_orientation display_orientation;
enum sc_orientation record_orientation;
int16_t window_x; // SC_WINDOW_POSITION_UNDEFINED for "auto"

View File

@ -428,8 +428,10 @@ scrcpy(struct scrcpy_options *options) {
.video_bit_rate = options->video_bit_rate,
.audio_bit_rate = options->audio_bit_rate,
.max_fps = options->max_fps,
.angle = options->angle,
.screen_off_timeout = options->screen_off_timeout,
.lock_video_orientation = options->lock_video_orientation,
.capture_orientation = options->capture_orientation,
.capture_orientation_lock = options->capture_orientation_lock,
.control = options->control,
.display_id = options->display_id,
.new_display = options->new_display,

View File

@ -274,9 +274,21 @@ execute_server(struct sc_server *server,
VALIDATE_STRING(params->max_fps);
ADD_PARAM("max_fps=%s", params->max_fps);
}
if (params->lock_video_orientation != SC_LOCK_VIDEO_ORIENTATION_UNLOCKED) {
ADD_PARAM("lock_video_orientation=%" PRIi8,
params->lock_video_orientation);
if (params->angle) {
VALIDATE_STRING(params->angle);
ADD_PARAM("angle=%s", params->angle);
}
if (params->capture_orientation_lock != SC_ORIENTATION_UNLOCKED
|| params->capture_orientation != SC_ORIENTATION_0) {
if (params->capture_orientation_lock == SC_ORIENTATION_LOCKED_INITIAL) {
ADD_PARAM("capture_orientation=@");
} else {
const char *orient =
sc_orientation_get_name(params->capture_orientation);
bool locked =
params->capture_orientation_lock != SC_ORIENTATION_UNLOCKED;
ADD_PARAM("capture_orientation=%s%s", locked ? "@" : "", orient);
}
}
if (server->tunnel.forward) {
ADD_PARAM("tunnel_forward=true");

View File

@ -45,8 +45,10 @@ struct sc_server_params {
uint32_t video_bit_rate;
uint32_t audio_bit_rate;
const char *max_fps; // float to be parsed by the server
const char *angle; // float to be parsed by the server
sc_tick screen_off_timeout;
int8_t lock_video_orientation;
enum sc_orientation capture_orientation;
enum sc_orientation_lock capture_orientation_lock;
bool control;
uint32_t display_id;
const char *new_display;

View File

@ -51,7 +51,6 @@ static void test_options(void) {
"--fullscreen",
"--max-fps", "30",
"--max-size", "1024",
"--lock-video-orientation=2", // optional arguments require '='
// "--no-control" is not compatible with "--turn-screen-off"
// "--no-playback" is not compatible with "--fullscreen"
"--port", "1234:1236",
@ -80,7 +79,6 @@ static void test_options(void) {
assert(opts->fullscreen);
assert(!strcmp(opts->max_fps, "30"));
assert(opts->max_size == 1024);
assert(opts->lock_video_orientation == 2);
assert(opts->port_range.first == 1234);
assert(opts->port_range.last == 1236);
assert(!strcmp(opts->push_target, "/sdcard/Movies"));

View File

@ -103,23 +103,40 @@ The orientation may be applied at 3 different levels:
- The [shortcut](shortcuts.md) <kbd>MOD</kbd>+<kbd>r</kbd> requests the
device to switch between portrait and landscape (the current running app may
refuse, if it does not support the requested orientation).
- `--lock-video-orientation` changes the mirroring orientation (the orientation
- `--capture-orientation` changes the mirroring orientation (the orientation
of the video sent from the device to the computer). This affects the
recording.
- `--orientation` is applied on the client side, and affects display and
recording. For the display, it can be changed dynamically using
[shortcuts](shortcuts.md).
To lock the mirroring orientation (on the capture side):
To capture the video with a specific orientation:
```bash
scrcpy --lock-video-orientation # initial (current) orientation
scrcpy --lock-video-orientation=0 # natural orientation
scrcpy --lock-video-orientation=90 # 90° clockwise
scrcpy --lock-video-orientation=180 # 180°
scrcpy --lock-video-orientation=270 # 270° clockwise
scrcpy --capture-orientation=0
scrcpy --capture-orientation=90 # 90° clockwise
scrcpy --capture-orientation=180 # 180°
scrcpy --capture-orientation=270 # 270° clockwise
scrcpy --capture-orientation=flip0 # hflip
scrcpy --capture-orientation=flip90 # hflip + 90° clockwise
scrcpy --capture-orientation=flip180 # hflip + 180°
scrcpy --capture-orientation=flip270 # hflip + 270° clockwise
```
The capture orientation can be locked by using `@`, so that a physical device
rotation does not change the captured video orientation:
```bash
scrcpy --capture-orientation=@ # locked to the initial orientation
scrcpy --capture-orientation=@0 # locked to 0°
scrcpy --capture-orientation=@90 # locked to 90° clockwise
scrcpy --capture-orientation=@180 # locked to 180°
scrcpy --capture-orientation=@270 # locked to 270° clockwise
scrcpy --capture-orientation=@flip0 # locked to hflip
scrcpy --capture-orientation=@flip90 # locked to hflip + 90° clockwise
scrcpy --capture-orientation=@flip180 # locked to hflip + 180°
scrcpy --capture-orientation=@flip270 # locked to hflip + 270° clockwise
```
To orient the video (on the rendering side):
```bash
@ -141,6 +158,17 @@ to the MP4 or MKV target file. Flipping is not supported, so only the 4 first
values are allowed when recording.
## Angle
To rotate the video content by a custom angle (in degrees, counter-clockwise):
```bash
scrcpy --angle=23
```
The center of rotation is the center of the visible area (after cropping).
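As a rough sketch of the geometry (plain 2D rotation in mathematical coordinates; this is not scrcpy's actual server-side implementation), rotating a point by a custom angle around the center of the visible area looks like:

```java
// Plain 2D rotation sketch, not scrcpy code. Rotates (x, y) by `degrees`
// counter-clockwise around the center (cx, cy), in mathematical (y-up)
// coordinates; in screen (y-down) coordinates the same formula appears
// clockwise to the viewer.
public class AngleRotation {
    public static double[] rotate(double x, double y,
                                  double cx, double cy, double degrees) {
        double rad = Math.toRadians(degrees);
        double cos = Math.cos(rad);
        double sin = Math.sin(rad);
        double dx = x - cx; // translate so the center becomes the origin
        double dy = y - cy;
        // Apply the standard rotation matrix, then translate back
        return new double[] {cx + dx * cos - dy * sin,
                             cy + dx * sin + dy * cos};
    }
}
```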
## Crop
The device screen may be cropped to mirror only part of the screen.

View File

@ -60,6 +60,7 @@ SRC=( \
com/genymobile/scrcpy/audio/*.java \
com/genymobile/scrcpy/control/*.java \
com/genymobile/scrcpy/device/*.java \
com/genymobile/scrcpy/opengl/*.java \
com/genymobile/scrcpy/util/*.java \
com/genymobile/scrcpy/video/*.java \
com/genymobile/scrcpy/wrappers/*.java \

View File

@ -4,6 +4,7 @@ import com.genymobile.scrcpy.audio.AudioCodec;
import com.genymobile.scrcpy.audio.AudioSource;
import com.genymobile.scrcpy.device.Device;
import com.genymobile.scrcpy.device.NewDisplay;
import com.genymobile.scrcpy.device.Orientation;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.util.CodecOption;
import com.genymobile.scrcpy.util.Ln;
@ -13,6 +14,7 @@ import com.genymobile.scrcpy.video.VideoCodec;
import com.genymobile.scrcpy.video.VideoSource;
import android.graphics.Rect;
import android.util.Pair;
import java.util.List;
import java.util.Locale;
@ -32,7 +34,7 @@ public class Options {
private int videoBitRate = 8000000;
private int audioBitRate = 128000;
private float maxFps;
private int lockVideoOrientation = Device.LOCK_VIDEO_ORIENTATION_UNLOCKED;
private float angle;
private boolean tunnelForward;
private Rect crop;
private boolean control = true;
@ -59,6 +61,9 @@ public class Options {
private NewDisplay newDisplay;
private Orientation.Lock captureOrientationLock = Orientation.Lock.Unlocked;
private Orientation captureOrientation = Orientation.Orient0;
private boolean listEncoders;
private boolean listDisplays;
private boolean listCameras;
@ -123,8 +128,8 @@ public class Options {
return maxFps;
}
public int getLockVideoOrientation() {
return lockVideoOrientation;
public float getAngle() {
return angle;
}
public boolean isTunnelForward() {
@ -219,6 +224,14 @@ public class Options {
return newDisplay;
}
public Orientation getCaptureOrientation() {
return captureOrientation;
}
public Orientation.Lock getCaptureOrientationLock() {
return captureOrientationLock;
}
public boolean getList() {
return listEncoders || listDisplays || listCameras || listCameraSizes || listApps;
}
@ -259,10 +272,6 @@ public class Options {
return sendCodecMeta;
}
public void resetLockVideoOrientation() {
this.lockVideoOrientation = Device.LOCK_VIDEO_ORIENTATION_UNLOCKED;
}
@SuppressWarnings("MethodLength")
public static Options parse(String... args) {
if (args.length < 1) {
@ -345,8 +354,8 @@ public class Options {
case "max_fps":
options.maxFps = parseFloat("max_fps", value);
break;
case "lock_video_orientation":
options.lockVideoOrientation = Integer.parseInt(value);
case "angle":
options.angle = parseFloat("angle", value);
break;
case "tunnel_forward":
options.tunnelForward = Boolean.parseBoolean(value);
@ -452,6 +461,11 @@ public class Options {
case "new_display":
options.newDisplay = parseNewDisplay(value);
break;
case "capture_orientation":
Pair<Orientation.Lock, Orientation> pair = parseCaptureOrientation(value);
options.captureOrientationLock = pair.first;
options.captureOrientation = pair.second;
break;
case "send_device_meta":
options.sendDeviceMeta = Boolean.parseBoolean(value);
break;
@ -575,4 +589,25 @@ public class Options {
return new NewDisplay(size, dpi);
}
private static Pair<Orientation.Lock, Orientation> parseCaptureOrientation(String value) {
if (value.isEmpty()) {
throw new IllegalArgumentException("Empty capture orientation string");
}
Orientation.Lock lock;
if (value.charAt(0) == '@') {
// Consume '@'
value = value.substring(1);
if (value.isEmpty()) {
// Only '@': lock to the initial orientation (orientation is unused)
return Pair.create(Orientation.Lock.LockedInitial, Orientation.Orient0);
}
lock = Orientation.Lock.LockedValue;
} else {
lock = Orientation.Lock.Unlocked;
}
return Pair.create(lock, Orientation.getByName(value));
}
}

View File

@ -14,6 +14,7 @@ import com.genymobile.scrcpy.device.DesktopConnection;
import com.genymobile.scrcpy.device.Device;
import com.genymobile.scrcpy.device.NewDisplay;
import com.genymobile.scrcpy.device.Streamer;
import com.genymobile.scrcpy.opengl.OpenGLRunner;
import com.genymobile.scrcpy.util.Ln;
import com.genymobile.scrcpy.util.LogUtils;
import com.genymobile.scrcpy.video.CameraCapture;
@ -84,23 +85,6 @@ public final class Server {
throw new ConfigurationException("New virtual display is not supported");
}
if (Build.VERSION.SDK_INT >= AndroidVersions.API_34_ANDROID_14) {
int lockVideoOrientation = options.getLockVideoOrientation();
if (lockVideoOrientation != Device.LOCK_VIDEO_ORIENTATION_UNLOCKED) {
if (lockVideoOrientation != Device.LOCK_VIDEO_ORIENTATION_INITIAL_AUTO) {
Ln.e("--lock-video-orientation is broken on Android >= 14: <https://github.com/Genymobile/scrcpy/issues/4011>");
throw new ConfigurationException("--lock-video-orientation is broken on Android >= 14");
} else {
// If the flag has been set automatically (because v4l2 sink is enabled), do not fail
Ln.w("--lock-video-orientation is ignored on Android >= 14: <https://github.com/Genymobile/scrcpy/issues/4011>");
}
}
if (options.getCrop() != null) {
Ln.e("--crop is broken on Android >= 14: <https://github.com/Genymobile/scrcpy/issues/4162>");
throw new ConfigurationException("Crop is broken on Android >= 14");
}
}
CleanUp cleanUp = null;
if (options.getCleanup()) {
@ -191,6 +175,8 @@ public final class Server {
asyncProcessor.stop();
}
OpenGLRunner.quit(); // quit the OpenGL thread, if any
connection.shutdown();
try {
@ -200,6 +186,7 @@ public final class Server {
for (AsyncProcessor asyncProcessor : asyncProcessors) {
asyncProcessor.join();
}
OpenGLRunner.join();
} catch (InterruptedException e) {
// ignore
}

View File

@ -8,6 +8,7 @@ import com.genymobile.scrcpy.device.Device;
import com.genymobile.scrcpy.device.DeviceApp;
import com.genymobile.scrcpy.device.Point;
import com.genymobile.scrcpy.device.Position;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.util.Ln;
import com.genymobile.scrcpy.util.LogUtils;
import com.genymobile.scrcpy.video.SurfaceCapture;
@ -359,7 +360,9 @@ public class Controller implements AsyncProcessor, VirtualDisplayListener {
Point point = displayData.positionMapper.map(position);
if (point == null) {
Ln.w("Ignore touch event, it was generated for a different device size");
Size eventSize = position.getScreenSize();
Size currentSize = displayData.positionMapper.getVideoSize();
Ln.w("Ignore touch event generated for size " + eventSize + " (current size is " + currentSize + ")");
return false;
}
@ -473,7 +476,9 @@ public class Controller implements AsyncProcessor, VirtualDisplayListener {
Point point = displayData.positionMapper.map(position);
if (point == null) {
Ln.w("Ignore scroll event, it was generated for a different device size");
Size eventSize = position.getScreenSize();
Size currentSize = displayData.positionMapper.getVideoSize();
Ln.w("Ignore scroll event generated for size " + eventSize + " (current size is " + currentSize + ")");
return false;
}

View File

@ -3,46 +3,46 @@ package com.genymobile.scrcpy.control;
import com.genymobile.scrcpy.device.Point;
import com.genymobile.scrcpy.device.Position;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.video.ScreenInfo;
import android.graphics.Rect;
import com.genymobile.scrcpy.util.AffineMatrix;
public final class PositionMapper {
private final Size videoSize;
private final Rect contentRect;
private final int coordsRotation;
private final AffineMatrix videoToDeviceMatrix;
public PositionMapper(Size videoSize, Rect contentRect, int videoRotation) {
public PositionMapper(Size videoSize, AffineMatrix videoToDeviceMatrix) {
this.videoSize = videoSize;
this.contentRect = contentRect;
this.coordsRotation = reverseRotation(videoRotation);
this.videoToDeviceMatrix = videoToDeviceMatrix;
}
public static PositionMapper from(ScreenInfo screenInfo) {
// ignore the locked video orientation, the events will apply in coordinates considered in the physical device orientation
Size videoSize = screenInfo.getUnlockedVideoSize();
return new PositionMapper(videoSize, screenInfo.getContentRect(), screenInfo.getVideoRotation());
public static PositionMapper create(Size videoSize, AffineMatrix filterTransform, Size targetSize) {
boolean convertToPixels = !videoSize.equals(targetSize) || filterTransform != null;
AffineMatrix transform = filterTransform;
if (convertToPixels) {
AffineMatrix inputTransform = AffineMatrix.ndcFromPixels(videoSize);
AffineMatrix outputTransform = AffineMatrix.ndcToPixels(targetSize);
transform = outputTransform.multiply(transform).multiply(inputTransform);
}
return new PositionMapper(videoSize, transform);
}
private static int reverseRotation(int rotation) {
return (4 - rotation) % 4;
public Size getVideoSize() {
return videoSize;
}
public Point map(Position position) {
// reverse the video rotation to apply the events
Position devicePosition = position.rotate(coordsRotation);
Size clientVideoSize = devicePosition.getScreenSize();
Size clientVideoSize = position.getScreenSize();
if (!videoSize.equals(clientVideoSize)) {
// The client sends a click relative to a video with wrong dimensions,
// the device may have been rotated since the event was generated, so ignore the event
return null;
}
Point point = devicePosition.getPoint();
int convertedX = contentRect.left + point.getX() * contentRect.width() / videoSize.getWidth();
int convertedY = contentRect.top + point.getY() * contentRect.height() / videoSize.getHeight();
return new Point(convertedX, convertedY);
Point point = position.getPoint();
if (videoToDeviceMatrix != null) {
point = videoToDeviceMatrix.apply(point);
}
return point;
}
}
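
The NDC wrapping in `create()` above can be checked with a standalone sketch (a simplified re-implementation for illustration, not the scrcpy classes): composing `ndcToPixels(targetSize)` with `ndcFromPixels(videoSize)` maps client video pixels directly to device pixels, with the two y-axis flips cancelling out.

```java
// Standalone sketch of the NDC round-trip performed by PositionMapper.create():
// wrap the (optional) filter transform with ndcFromPixels(videoSize) on the input
// side and ndcToPixels(targetSize) on the output side.
final class NdcAffine {
    final double a, b, c, d, e, f; // | a c e ; b d f ; 0 0 1 |

    NdcAffine(double a, double b, double c, double d, double e, double f) {
        this.a = a; this.b = b; this.c = c; this.d = d; this.e = e; this.f = f;
    }

    // pixels -> NDC: (x, y) -> (x/w, 1 - y/h), flipping the y axis
    static NdcAffine ndcFromPixels(double w, double h) {
        return new NdcAffine(1 / w, 0, 0, -1 / h, 0, 1);
    }

    // NDC -> pixels: (u, v) -> (w*u, h*(1 - v))
    static NdcAffine ndcToPixels(double w, double h) {
        return new NdcAffine(w, 0, 0, -h, 0, h);
    }

    NdcAffine multiply(NdcAffine r) {
        return new NdcAffine(a * r.a + c * r.b, b * r.a + d * r.b,
                a * r.c + c * r.d, b * r.c + d * r.d,
                a * r.e + c * r.f + e, b * r.e + d * r.f + f);
    }

    double[] apply(double x, double y) {
        return new double[]{a * x + c * y + e, b * x + d * y + f};
    }
}
```

With no filter transform in between, the composition reduces to a plain pixel scale: mapping a 1000x500 video onto a 2000x1000 target sends (100, 200) to (200, 400).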

View File

@@ -40,11 +40,6 @@ public final class Device {
public static final int INJECT_MODE_WAIT_FOR_RESULT = InputManager.INJECT_INPUT_EVENT_MODE_WAIT_FOR_RESULT;
public static final int INJECT_MODE_WAIT_FOR_FINISH = InputManager.INJECT_INPUT_EVENT_MODE_WAIT_FOR_FINISH;
public static final int LOCK_VIDEO_ORIENTATION_UNLOCKED = -1;
public static final int LOCK_VIDEO_ORIENTATION_INITIAL = -2;
// like SC_LOCK_VIDEO_ORIENTATION_INITIAL, but set automatically
public static final int LOCK_VIDEO_ORIENTATION_INITIAL_AUTO = -3;
private Device() {
// not instantiable
}

View File

@@ -0,0 +1,47 @@
package com.genymobile.scrcpy.device;
public enum Orientation {
// @formatter:off
Orient0("0"),
Orient90("90"),
Orient180("180"),
Orient270("270"),
Flip0("flip0"),
Flip90("flip90"),
Flip180("flip180"),
Flip270("flip270");
public enum Lock {
Unlocked, LockedInitial, LockedValue,
}
private final String name;
Orientation(String name) {
this.name = name;
}
public static Orientation getByName(String name) {
for (Orientation orientation : values()) {
if (orientation.name.equals(name)) {
return orientation;
}
}
throw new IllegalArgumentException("Unknown orientation: " + name);
}
public static Orientation fromRotation(int rotation) {
assert rotation >= 0 && rotation < 4;
return values()[rotation];
}
public boolean isFlipped() {
return (ordinal() & 4) != 0;
}
public int getRotation() {
return this.ordinal() & 3;
}
}
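
The bit tricks in `isFlipped()` and `getRotation()` rely on the declaration order of the enum constants. A standalone sketch (mirroring the enum above, for illustration only): since the constants are declared as 0, 90, 180, 270, flip0..flip270, `ordinal()` packs the flip flag into bit 2 and the number of 90° rotation steps into bits 0-1.

```java
// Standalone sketch of the Orientation ordinal encoding.
enum OrientSketch {
    ORIENT_0, ORIENT_90, ORIENT_180, ORIENT_270,
    FLIP_0, FLIP_90, FLIP_180, FLIP_270;

    boolean isFlipped() {
        return (ordinal() & 4) != 0; // bit 2: flip flag
    }

    int getRotation() {
        return ordinal() & 3; // bits 0-1: number of 90° steps
    }
}
```

For example, FLIP_90 (ordinal 5 = 0b101) is flipped with rotation 1, while ORIENT_270 (ordinal 3 = 0b011) is unflipped with rotation 3.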

View File

@@ -29,6 +29,57 @@ public final class Size {
return new Size(height, width);
}
public Size limit(int maxSize) {
assert maxSize >= 0 : "Max size may not be negative";
assert maxSize % 8 == 0 : "Max size must be a multiple of 8";
if (maxSize == 0) {
// No limit
return this;
}
boolean portrait = height > width;
int major = portrait ? height : width;
if (major <= maxSize) {
return this;
}
int minor = portrait ? width : height;
int newMajor = maxSize;
int newMinor = maxSize * minor / major;
int w = portrait ? newMinor : newMajor;
int h = portrait ? newMajor : newMinor;
return new Size(w, h);
}
/**
* Round both dimensions of this size to be a multiple of 8 (as required by many encoders).
*
* @return The current size rounded.
*/
public Size round8() {
if ((width & 7) == 0 && (height & 7) == 0) {
// Already a multiple of 8
return this;
}
boolean portrait = height > width;
int major = portrait ? height : width;
int minor = portrait ? width : height;
major &= ~7; // round down to not exceed the initial size
minor = (minor + 4) & ~7; // round to the nearest to minimize aspect ratio distortion
if (minor > major) {
minor = major;
}
int w = portrait ? minor : major;
int h = portrait ? major : minor;
return new Size(w, h);
}
public Rect toRect() {
return new Rect(0, 0, width, height);
}
@@ -52,6 +103,6 @@ public final class Size {
@Override
public String toString() {
return "Size{" + width + 'x' + height + '}';
return width + "x" + height;
}
}
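
A worked example of the integer arithmetic in `limit()` and `round8()` above (`SizeMath` is a hypothetical standalone copy of the two methods, written as free functions for illustration): limiting a 1080x2400 portrait frame to 1024 gives 460x1024 (1024 * 1080 / 2400 = 460 with integer division), and rounding to multiples of 8 then nudges the minor dimension to 464 while the major stays at 1024.

```java
// Standalone copy of the Size.limit() / Size.round8() arithmetic (illustration only).
final class SizeMath {
    static int[] limit(int width, int height, int maxSize) {
        boolean portrait = height > width;
        int major = portrait ? height : width;
        if (maxSize == 0 || major <= maxSize) {
            return new int[]{width, height}; // no limit, or already small enough
        }
        int minor = portrait ? width : height;
        int newMinor = maxSize * minor / major; // integer division
        return portrait ? new int[]{newMinor, maxSize} : new int[]{maxSize, newMinor};
    }

    static int[] round8(int width, int height) {
        boolean portrait = height > width;
        int major = portrait ? height : width;
        int minor = portrait ? width : height;
        major &= ~7;              // round down: never exceed the initial size
        minor = (minor + 4) & ~7; // round to nearest: minimize aspect ratio distortion
        if (minor > major) {
            minor = major;
        }
        return portrait ? new int[]{minor, major} : new int[]{major, minor};
    }
}
```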

View File

@@ -0,0 +1,138 @@
package com.genymobile.scrcpy.opengl;
import com.genymobile.scrcpy.util.AffineMatrix;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import java.nio.FloatBuffer;
public class AffineOpenGLFilter implements OpenGLFilter {
private int program;
private FloatBuffer vertexBuffer;
private FloatBuffer texCoordsBuffer;
private final float[] userMatrix;
private int vertexPosLoc;
private int texCoordsInLoc;
private int texLoc;
private int texMatrixLoc;
private int userMatrixLoc;
public AffineOpenGLFilter(AffineMatrix transform) {
this.userMatrix = transform.to4x4();
}
@Override
public void init() throws OpenGLFilterException {
// inputSize and outputSize are not used for an affine filter, but in theory they would be necessary for filters which operate
// on specific pixels
// @formatter:off
String vertexShaderCode = "#version 100\n"
+ "attribute vec4 vertex_pos;\n"
+ "attribute vec4 tex_coords_in;\n"
+ "varying vec2 tex_coords;\n"
+ "uniform mat4 tex_matrix;\n"
+ "uniform mat4 user_matrix;\n"
+ "void main() {\n"
+ " gl_Position = vertex_pos;\n"
+ " tex_coords = (tex_matrix * user_matrix * tex_coords_in).xy;\n"
+ "}";
// @formatter:off
String fragmentShaderCode = "#version 100\n"
+ "#extension GL_OES_EGL_image_external : require\n"
+ "precision highp float;\n"
+ "uniform samplerExternalOES tex;\n"
+ "varying vec2 tex_coords;\n"
+ "void main() {\n"
+ " if (tex_coords.x >= 0.0 && tex_coords.x <= 1.0\n"
+ " && tex_coords.y >= 0.0 && tex_coords.y <= 1.0) {\n"
+ " gl_FragColor = texture2D(tex, tex_coords);\n"
+ " } else {\n"
+ " gl_FragColor = vec4(0.0);\n"
+ " }\n"
+ "}";
program = GLUtils.createProgram(vertexShaderCode, fragmentShaderCode);
if (program == 0) {
throw new OpenGLFilterException("Cannot create OpenGL program");
}
float[] vertices = {
-1, -1, // Bottom-left
1, -1, // Bottom-right
-1, 1, // Top-left
1, 1, // Top-right
};
float[] texCoords = {
0, 0, // Bottom-left
1, 0, // Bottom-right
0, 1, // Top-left
1, 1, // Top-right
};
// OpenGL will fill the 3rd and 4th coordinates of the vec4 automatically to 0.0 and 1.0 respectively
vertexBuffer = GLUtils.createFloatBuffer(vertices);
texCoordsBuffer = GLUtils.createFloatBuffer(texCoords);
vertexPosLoc = GLES20.glGetAttribLocation(program, "vertex_pos");
assert vertexPosLoc != -1;
texCoordsInLoc = GLES20.glGetAttribLocation(program, "tex_coords_in");
assert texCoordsInLoc != -1;
texLoc = GLES20.glGetUniformLocation(program, "tex");
assert texLoc != -1;
texMatrixLoc = GLES20.glGetUniformLocation(program, "tex_matrix");
assert texMatrixLoc != -1;
userMatrixLoc = GLES20.glGetUniformLocation(program, "user_matrix");
assert userMatrixLoc != -1;
}
@Override
public void draw(int textureId, float[] texMatrix) {
GLES20.glUseProgram(program);
GLUtils.checkGlError();
GLES20.glEnableVertexAttribArray(vertexPosLoc);
GLUtils.checkGlError();
GLES20.glEnableVertexAttribArray(texCoordsInLoc);
GLUtils.checkGlError();
GLES20.glVertexAttribPointer(vertexPosLoc, 2, GLES20.GL_FLOAT, false, 0, vertexBuffer);
GLUtils.checkGlError();
GLES20.glVertexAttribPointer(texCoordsInLoc, 2, GLES20.GL_FLOAT, false, 0, texCoordsBuffer);
GLUtils.checkGlError();
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLUtils.checkGlError();
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
GLUtils.checkGlError();
GLES20.glUniform1i(texLoc, 0);
GLUtils.checkGlError();
GLES20.glUniformMatrix4fv(texMatrixLoc, 1, false, texMatrix, 0);
GLUtils.checkGlError();
GLES20.glUniformMatrix4fv(userMatrixLoc, 1, false, userMatrix, 0);
GLUtils.checkGlError();
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
GLUtils.checkGlError();
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
GLUtils.checkGlError();
}
@Override
public void release() {
GLES20.glDeleteProgram(program);
GLUtils.checkGlError();
}
}

View File

@@ -0,0 +1,124 @@
package com.genymobile.scrcpy.opengl;
import com.genymobile.scrcpy.BuildConfig;
import com.genymobile.scrcpy.util.Ln;
import android.opengl.GLES20;
import android.opengl.GLU;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
public final class GLUtils {
private static final boolean DEBUG = BuildConfig.DEBUG;
private GLUtils() {
// not instantiable
}
public static int createProgram(String vertexSource, String fragmentSource) {
int vertexShader = createShader(GLES20.GL_VERTEX_SHADER, vertexSource);
if (vertexShader == 0) {
return 0;
}
int fragmentShader = createShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
if (fragmentShader == 0) {
GLES20.glDeleteShader(vertexShader);
return 0;
}
int program = GLES20.glCreateProgram();
if (program == 0) {
GLES20.glDeleteShader(fragmentShader);
GLES20.glDeleteShader(vertexShader);
return 0;
}
GLES20.glAttachShader(program, vertexShader);
checkGlError();
GLES20.glAttachShader(program, fragmentShader);
checkGlError();
GLES20.glLinkProgram(program);
checkGlError();
int[] linkStatus = new int[1];
GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linkStatus, 0);
if (linkStatus[0] == 0) {
Ln.e("Could not link program: " + GLES20.glGetProgramInfoLog(program));
GLES20.glDeleteProgram(program);
GLES20.glDeleteShader(fragmentShader);
GLES20.glDeleteShader(vertexShader);
return 0;
}
return program;
}
public static int createShader(int type, String source) {
int shader = GLES20.glCreateShader(type);
if (shader == 0) {
Ln.e(getGlErrorMessage("Could not create shader"));
return 0;
}
GLES20.glShaderSource(shader, source);
GLES20.glCompileShader(shader);
int[] compileStatus = new int[1];
GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
if (compileStatus[0] == 0) {
Ln.e("Could not compile " + getShaderTypeString(type) + ": " + GLES20.glGetShaderInfoLog(shader));
GLES20.glDeleteShader(shader);
return 0;
}
return shader;
}
private static String getShaderTypeString(int type) {
switch (type) {
case GLES20.GL_VERTEX_SHADER:
return "vertex shader";
case GLES20.GL_FRAGMENT_SHADER:
return "fragment shader";
default:
return "shader";
}
}
/**
* Throws a runtime exception if {@link GLES20#glGetError()} returns an error (useful for debugging).
*/
public static void checkGlError() {
if (DEBUG) {
int error = GLES20.glGetError();
if (error != GLES20.GL_NO_ERROR) {
throw new RuntimeException(toErrorString(error));
}
}
}
public static String getGlErrorMessage(String userError) {
int glError = GLES20.glGetError();
if (glError == GLES20.GL_NO_ERROR) {
return userError;
}
return userError + " (" + toErrorString(glError) + ")";
}
private static String toErrorString(int glError) {
String errorString = GLU.gluErrorString(glError);
return "glError 0x" + Integer.toHexString(glError) + " " + errorString;
}
public static FloatBuffer createFloatBuffer(float[] values) {
FloatBuffer fb = ByteBuffer.allocateDirect(values.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
fb.put(values);
fb.position(0);
return fb;
}
}

View File

@@ -0,0 +1,13 @@
package com.genymobile.scrcpy.opengl;
import java.io.IOException;
public class OpenGLException extends IOException {
public OpenGLException(String message) {
super(message);
}
public OpenGLException(String message, Throwable cause) {
super(message, cause);
}
}

View File

@@ -0,0 +1,21 @@
package com.genymobile.scrcpy.opengl;
public interface OpenGLFilter {
/**
* Initialize the OpenGL filter (typically compile the shaders and create the program).
*
* @throws OpenGLFilterException if an initialization error occurs
*/
void init() throws OpenGLFilterException;
/**
* Render a frame (called once per frame)
*/
void draw(int textureId, float[] texMatrix);
/**
* Release resources
*/
void release();
}

View File

@@ -0,0 +1,7 @@
package com.genymobile.scrcpy.opengl;
public class OpenGLFilterException extends OpenGLException {
public OpenGLFilterException(String message) {
super(message);
}
}

View File

@@ -0,0 +1,249 @@
package com.genymobile.scrcpy.opengl;
import com.genymobile.scrcpy.device.Size;
import android.graphics.SurfaceTexture;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLExt;
import android.opengl.EGLSurface;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.Matrix;
import android.os.Handler;
import android.os.HandlerThread;
import android.view.Surface;
import java.util.concurrent.Semaphore;
public final class OpenGLRunner {
private static HandlerThread handlerThread;
private static Handler handler;
private static boolean quit;
private EGLDisplay eglDisplay;
private EGLContext eglContext;
private EGLSurface eglSurface;
private final OpenGLFilter filter;
private final boolean ignoreTransformMatrix;
private SurfaceTexture surfaceTexture;
private Surface inputSurface;
private int textureId;
private boolean stopped;
public OpenGLRunner(OpenGLFilter filter, boolean ignoreTransformMatrix) {
this.filter = filter;
this.ignoreTransformMatrix = ignoreTransformMatrix;
}
public OpenGLRunner(OpenGLFilter filter) {
this(filter, false);
}
public static synchronized void initOnce() {
if (handlerThread == null) {
if (quit) {
throw new IllegalStateException("Cannot init OpenGLRunner after it has quit");
}
handlerThread = new HandlerThread("OpenGLRunner");
handlerThread.start();
handler = new Handler(handlerThread.getLooper());
}
}
public static void quit() {
HandlerThread thread;
synchronized (OpenGLRunner.class) {
thread = handlerThread;
quit = true;
}
if (thread != null) {
thread.quitSafely();
}
}
public static void join() throws InterruptedException {
HandlerThread thread;
synchronized (OpenGLRunner.class) {
thread = handlerThread;
}
if (thread != null) {
thread.join();
}
}
public Surface start(Size inputSize, Size outputSize, Surface outputSurface) throws OpenGLException {
initOnce();
// Simulate CompletableFuture, but working for all Android versions
final Semaphore sem = new Semaphore(0);
Throwable[] throwableRef = new Throwable[1];
// The whole OpenGL execution must be performed on a Handler, so that SurfaceTexture.setOnFrameAvailableListener() works correctly.
// See <https://github.com/Genymobile/scrcpy/issues/5444>
handler.post(() -> {
try {
run(inputSize, outputSize, outputSurface);
} catch (Throwable throwable) {
throwableRef[0] = throwable;
} finally {
sem.release();
}
});
try {
sem.acquire();
} catch (InterruptedException e) {
// Behave as if this method call was synchronous
Thread.currentThread().interrupt();
}
Throwable throwable = throwableRef[0];
if (throwable != null) {
if (throwable instanceof OpenGLException) {
throw (OpenGLException) throwable;
}
throw new OpenGLException("Asynchronous OpenGL runner init failed", throwable);
}
// No synchronization needed: this is called after start() (synchronized by the semaphore) and before release()
return inputSurface;
}
private void run(Size inputSize, Size outputSize, Surface outputSurface) throws OpenGLException {
eglDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
if (eglDisplay == EGL14.EGL_NO_DISPLAY) {
throw new OpenGLException("Unable to get EGL14 display");
}
int[] version = new int[2];
if (!EGL14.eglInitialize(eglDisplay, version, 0, version, 1)) {
throw new OpenGLException("Unable to initialize EGL14");
}
// @formatter:off
int[] attribList = {
EGL14.EGL_RED_SIZE, 8,
EGL14.EGL_GREEN_SIZE, 8,
EGL14.EGL_BLUE_SIZE, 8,
EGL14.EGL_ALPHA_SIZE, 8,
EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
EGL14.EGL_NONE
};
EGLConfig[] configs = new EGLConfig[1];
int[] numConfigs = new int[1];
EGL14.eglChooseConfig(eglDisplay, attribList, 0, configs, 0, configs.length, numConfigs, 0);
if (numConfigs[0] <= 0) {
throw new OpenGLException("Unable to find ES2 EGL config");
}
EGLConfig eglConfig = configs[0];
// @formatter:off
int[] contextAttribList = {
EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
EGL14.EGL_NONE
};
eglContext = EGL14.eglCreateContext(eglDisplay, eglConfig, EGL14.EGL_NO_CONTEXT, contextAttribList, 0);
if (eglContext == null) {
throw new OpenGLException("Failed to create EGL context");
}
int[] surfaceAttribList = {
EGL14.EGL_NONE
};
eglSurface = EGL14.eglCreateWindowSurface(eglDisplay, eglConfig, outputSurface, surfaceAttribList, 0);
if (eglSurface == null) {
throw new OpenGLException("Failed to create EGL window surface");
}
if (!EGL14.eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext)) {
throw new OpenGLException("Failed to make EGL context current");
}
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
GLUtils.checkGlError();
textureId = textures[0];
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLUtils.checkGlError();
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLUtils.checkGlError();
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLUtils.checkGlError();
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLUtils.checkGlError();
surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setDefaultBufferSize(inputSize.getWidth(), inputSize.getHeight());
inputSurface = new Surface(surfaceTexture);
filter.init();
surfaceTexture.setOnFrameAvailableListener(surfaceTexture -> {
if (stopped) {
// Make sure to never render after resources have been released
return;
}
render(outputSize);
}, handler);
}
private void render(Size outputSize) {
GLES20.glViewport(0, 0, outputSize.getWidth(), outputSize.getHeight());
GLUtils.checkGlError();
surfaceTexture.updateTexImage();
float[] matrix = new float[16];
if (ignoreTransformMatrix) {
Matrix.setIdentityM(matrix, 0);
} else {
surfaceTexture.getTransformMatrix(matrix);
}
filter.draw(textureId, matrix);
EGLExt.eglPresentationTimeANDROID(eglDisplay, eglSurface, surfaceTexture.getTimestamp());
EGL14.eglSwapBuffers(eglDisplay, eglSurface);
}
public void stopAndRelease() {
final Semaphore sem = new Semaphore(0);
handler.post(() -> {
stopped = true;
surfaceTexture.setOnFrameAvailableListener(null, handler);
filter.release();
int[] textures = {textureId};
GLES20.glDeleteTextures(1, textures, 0);
GLUtils.checkGlError();
EGL14.eglDestroySurface(eglDisplay, eglSurface);
EGL14.eglDestroyContext(eglDisplay, eglContext);
EGL14.eglTerminate(eglDisplay);
eglDisplay = EGL14.EGL_NO_DISPLAY;
eglContext = EGL14.EGL_NO_CONTEXT;
eglSurface = EGL14.EGL_NO_SURFACE;
surfaceTexture.release();
inputSurface.release();
sem.release();
});
try {
sem.acquire();
} catch (InterruptedException e) {
// Behave as if this method call was synchronous
Thread.currentThread().interrupt();
}
}
}
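
The "simulate CompletableFuture" pattern used by `start()` and `stopAndRelease()` above can be sketched without Android dependencies (a plain `Thread` stands in for the `Handler`; `SyncRunner` is a hypothetical name): post the task, block on a semaphore, and re-throw any `Throwable` captured on the worker thread.

```java
import java.util.concurrent.Semaphore;

// Sketch of the semaphore-as-future pattern: run a task on another thread,
// block until it completes, and rethrow any captured failure on the caller.
final class SyncRunner {
    static void runSync(Runnable task) {
        final Semaphore sem = new Semaphore(0);
        final Throwable[] throwableRef = new Throwable[1];
        new Thread(() -> {
            try {
                task.run();
            } catch (Throwable t) {
                throwableRef[0] = t; // capture for the caller
            } finally {
                sem.release(); // always unblock the caller
            }
        }).start();
        try {
            sem.acquire();
        } catch (InterruptedException e) {
            // Behave as if this method call was synchronous
            Thread.currentThread().interrupt();
        }
        if (throwableRef[0] != null) {
            throw new RuntimeException("Asynchronous task failed", throwableRef[0]);
        }
    }
}
```

The release/acquire pair also establishes a happens-before edge, so results written by the task are visible to the caller without extra locking.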

View File

@@ -0,0 +1,365 @@
package com.genymobile.scrcpy.util;
import com.genymobile.scrcpy.device.Point;
import com.genymobile.scrcpy.device.Size;
/**
* Represents a 2D affine transform (a 3x3 matrix):
*
* <pre>
* / a c e \
* | b d f |
* \ 0 0 1 /
* </pre>
* <p>
* Or, a 4x4 matrix if we add a z axis:
*
* <pre>
* / a c 0 e \
* | b d 0 f |
* | 0 0 1 0 |
* \ 0 0 0 1 /
* </pre>
*/
public class AffineMatrix {
private final double a, b, c, d, e, f;
/**
* The identity matrix.
*/
public static final AffineMatrix IDENTITY = new AffineMatrix(1, 0, 0, 1, 0, 0);
/**
* Create a new matrix:
*
* <pre>
* / a c e \
* | b d f |
* \ 0 0 1 /
* </pre>
*/
public AffineMatrix(double a, double b, double c, double d, double e, double f) {
this.a = a;
this.b = b;
this.c = c;
this.d = d;
this.e = e;
this.f = f;
}
@Override
public String toString() {
return "[" + a + ", " + c + ", " + e + "; " + b + ", " + d + ", " + f + "]";
}
/**
* Return a matrix which converts from Normalized Device Coordinates to pixels.
*
* @param size the target size
* @return the transform matrix
*/
public static AffineMatrix ndcFromPixels(Size size) {
double w = size.getWidth();
double h = size.getHeight();
return new AffineMatrix(1 / w, 0, 0, -1 / h, 0, 1);
}
/**
* Return a matrix which converts from pixels to Normalized Device Coordinates.
*
* @param size the source size
* @return the transform matrix
*/
public static AffineMatrix ndcToPixels(Size size) {
double w = size.getWidth();
double h = size.getHeight();
return new AffineMatrix(w, 0, 0, -h, 0, h);
}
/**
* Apply the transform to a point ({@code this} should be a matrix converted to pixel coordinates via {@link #ndcToPixels(Size)}).
*
* @param point the source point
* @return the converted point
*/
public Point apply(Point point) {
int x = point.getX();
int y = point.getY();
int xx = (int) (a * x + c * y + e);
int yy = (int) (b * x + d * y + f);
return new Point(xx, yy);
}
/**
* Compute <code>this * rhs</code>.
*
* @param rhs the matrix to multiply
* @return the product
*/
public AffineMatrix multiply(AffineMatrix rhs) {
if (rhs == null) {
// For convenience
return this;
}
double aa = this.a * rhs.a + this.c * rhs.b;
double bb = this.b * rhs.a + this.d * rhs.b;
double cc = this.a * rhs.c + this.c * rhs.d;
double dd = this.b * rhs.c + this.d * rhs.d;
double ee = this.a * rhs.e + this.c * rhs.f + this.e;
double ff = this.b * rhs.e + this.d * rhs.f + this.f;
return new AffineMatrix(aa, bb, cc, dd, ee, ff);
}
/**
* Multiply all matrices from left to right, ignoring any {@code null} matrix (for convenience).
*
* @param matrices the matrices
* @return the product
*/
public static AffineMatrix multiplyAll(AffineMatrix... matrices) {
AffineMatrix result = null;
for (AffineMatrix matrix : matrices) {
if (matrix != null) {
result = matrix.multiply(result);
}
}
return result;
}
/**
* Invert the matrix.
*
* @return the inverse matrix (or {@code null} if not invertible).
*/
public AffineMatrix invert() {
// The 3x3 matrix M can be decomposed into M = M1 * M2:
// M1 M2
// / 1 0 e \ / a c 0 \
// | 0 1 f | * | b d 0 |
// \ 0 0 1 / \ 0 0 1 /
//
// The inverse of an invertible 2x2 matrix is given by this formula:
//
// / A B \⁻¹ 1 / D -B \
// \ C D / = ----- \ -C A /
// AD-BC
//
// Let B=c and C=b (to apply the general formula with the same letters).
//
// M⁻¹ = (M1 * M2)⁻¹ = M2⁻¹ * M1⁻¹
//
// M2⁻¹ M1⁻¹
// /----------------\
// 1 / d -B 0 \ / 1 0 -e \
// = ----- | -C a 0 | * | 0 1 -f |
// ad-BC \ 0 0 1 / \ 0 0 1 /
//
// With the original letters:
//
// 1 / d -c 0 \ / 1 0 -e \
// M⁻¹ = ----- | -b a 0 | * | 0 1 -f |
// ad-cb \ 0 0 1 / \ 0 0 1 /
//
// 1 / d -c cf-de \
// = ----- | -b a be-af |
// ad-cb \ 0 0 1 /
double det = a * d - c * b;
if (det == 0) {
// Not invertible
return null;
}
double aa = d / det;
double bb = -b / det;
double cc = -c / det;
double dd = a / det;
double ee = (c * f - d * e) / det;
double ff = (b * e - a * f) / det;
return new AffineMatrix(aa, bb, cc, dd, ee, ff);
}
/**
* Return this transform applied from the center (0.5, 0.5).
*
* @return the resulting matrix
*/
public AffineMatrix fromCenter() {
return translate(0.5, 0.5).multiply(this).multiply(translate(-0.5, -0.5));
}
/**
* Return this transform with the specified aspect ratio.
*
* @param ar the aspect ratio
* @return the resulting matrix
*/
public AffineMatrix withAspectRatio(double ar) {
return scale(1 / ar, 1).multiply(this).multiply(scale(ar, 1));
}
/**
* Return this transform with the specified aspect ratio.
*
* @param size the size describing the aspect ratio
* @return the transform
*/
public AffineMatrix withAspectRatio(Size size) {
double ar = (double) size.getWidth() / size.getHeight();
return withAspectRatio(ar);
}
/**
* Return a translation matrix.
*
* @param x the horizontal translation
* @param y the vertical translation
* @return the matrix
*/
public static AffineMatrix translate(double x, double y) {
return new AffineMatrix(1, 0, 0, 1, x, y);
}
/**
* Return a scaling matrix.
*
* @param x the horizontal scaling
* @param y the vertical scaling
* @return the matrix
*/
public static AffineMatrix scale(double x, double y) {
return new AffineMatrix(x, 0, 0, y, 0, 0);
}
/**
* Return a scaling matrix.
*
* @param from the source size
* @param to the destination size
* @return the matrix
*/
public static AffineMatrix scale(Size from, Size to) {
double scaleX = (double) to.getWidth() / from.getWidth();
double scaleY = (double) to.getHeight() / from.getHeight();
return scale(scaleX, scaleY);
}
/**
* Return a matrix applying a "reframing" ("cropping" a rectangle).
* <p/>
* <code>(x, y)</code> is the bottom-left corner, <code>(w, h)</code> is the size of the rectangle.
*
* @param x horizontal coordinate (increasing to the right)
* @param y vertical coordinate (increasing upwards)
* @param w width
* @param h height
* @return the matrix
*/
public static AffineMatrix reframe(double x, double y, double w, double h) {
if (w == 0 || h == 0) {
throw new IllegalArgumentException("Cannot reframe to an empty area: " + w + "x" + h);
}
return scale(1 / w, 1 / h).multiply(translate(-x, -y));
}
/**
* Return an orthogonal rotation matrix.
*
* @param ccwRotation the counter-clockwise rotation
* @return the matrix
*/
public static AffineMatrix rotateOrtho(int ccwRotation) {
switch (ccwRotation) {
case 0:
return IDENTITY;
case 1:
// 90° counter-clockwise
return new AffineMatrix(0, 1, -1, 0, 1, 0);
case 2:
// 180°
return new AffineMatrix(-1, 0, 0, -1, 1, 1);
case 3:
// 90° clockwise
return new AffineMatrix(0, -1, 1, 0, 0, 1);
default:
throw new IllegalArgumentException("Invalid rotation: " + ccwRotation);
}
}
/**
* Return a horizontal flip matrix.
*
* @return the matrix
*/
public static AffineMatrix hflip() {
return new AffineMatrix(-1, 0, 0, 1, 1, 0);
}
/**
* Return a vertical flip matrix.
*
* @return the matrix
*/
public static AffineMatrix vflip() {
return new AffineMatrix(1, 0, 0, -1, 0, 1);
}
/**
* Return a rotation matrix.
*
* @param ccwDegrees the angle, in degrees (counter-clockwise)
* @return the matrix
*/
public static AffineMatrix rotate(double ccwDegrees) {
double radians = Math.toRadians(ccwDegrees);
double cos = Math.cos(radians);
double sin = Math.sin(radians);
return new AffineMatrix(cos, sin, -sin, cos, 0, 0);
}
/**
* Export this affine transform to a 4x4 column-major order matrix.
*
* @param matrix output 4x4 matrix
*/
public void to4x4(float[] matrix) {
// matrix is a 4x4 matrix in column-major order
// Column 0
matrix[0] = (float) a;
matrix[1] = (float) b;
matrix[2] = 0;
matrix[3] = 0;
// Column 1
matrix[4] = (float) c;
matrix[5] = (float) d;
matrix[6] = 0;
matrix[7] = 0;
// Column 2
matrix[8] = 0;
matrix[9] = 0;
matrix[10] = 1;
matrix[11] = 0;
// Column 3
matrix[12] = (float) e;
matrix[13] = (float) f;
matrix[14] = 0;
matrix[15] = 1;
}
/**
* Export this affine transform to a 4x4 column-major order matrix.
*
* @return 4x4 matrix
*/
public float[] to4x4() {
float[] matrix = new float[16];
to4x4(matrix);
return matrix;
}
}
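
The decomposition worked out in the `invert()` comment can be verified numerically with a standalone copy of the matrix arithmetic (`Mat` is a hypothetical minimal re-implementation, not the scrcpy class): for any invertible transform, `m.multiply(m.invert())` should give back the identity.

```java
// Standalone copy of the AffineMatrix multiply/invert arithmetic (illustration only).
final class Mat {
    final double a, b, c, d, e, f; // | a c e ; b d f ; 0 0 1 |

    Mat(double a, double b, double c, double d, double e, double f) {
        this.a = a; this.b = b; this.c = c; this.d = d; this.e = e; this.f = f;
    }

    Mat multiply(Mat r) {
        return new Mat(a * r.a + c * r.b, b * r.a + d * r.b,
                a * r.c + c * r.d, b * r.c + d * r.d,
                a * r.e + c * r.f + e, b * r.e + d * r.f + f);
    }

    Mat invert() {
        double det = a * d - c * b;
        if (det == 0) {
            return null; // not invertible
        }
        // Same closed form as derived in the invert() comment above
        return new Mat(d / det, -b / det, -c / det, a / det,
                (c * f - d * e) / det, (b * e - a * f) / det);
    }
}
```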

View File

@@ -3,7 +3,12 @@ package com.genymobile.scrcpy.video;
import com.genymobile.scrcpy.AndroidVersions;
import com.genymobile.scrcpy.Options;
import com.genymobile.scrcpy.device.ConfigurationException;
import com.genymobile.scrcpy.device.Orientation;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.opengl.AffineOpenGLFilter;
import com.genymobile.scrcpy.opengl.OpenGLFilter;
import com.genymobile.scrcpy.opengl.OpenGLRunner;
import com.genymobile.scrcpy.util.AffineMatrix;
import com.genymobile.scrcpy.util.HandlerExecutor;
import com.genymobile.scrcpy.util.Ln;
import com.genymobile.scrcpy.util.LogUtils;
@@ -48,9 +53,15 @@ public class CameraCapture extends SurfaceCapture {
private final CameraAspectRatio aspectRatio;
private final int fps;
private final boolean highSpeed;
private final Rect crop;
private Orientation captureOrientation;
private String cameraId;
private Size size;
private Size captureSize;
private Size videoSize; // after OpenGL transforms
private AffineMatrix transform;
private OpenGLRunner glRunner;
private HandlerThread cameraThread;
private Handler cameraHandler;
@@ -67,6 +78,9 @@ public class CameraCapture extends SurfaceCapture {
this.aspectRatio = options.getCameraAspectRatio();
this.fps = options.getCameraFps();
this.highSpeed = options.getCameraHighSpeed();
this.crop = options.getCrop();
this.captureOrientation = options.getCaptureOrientation();
assert captureOrientation != null;
}
@Override
@@ -92,13 +106,32 @@ public class CameraCapture extends SurfaceCapture {
@Override
public void prepare() throws IOException {
try {
size = selectSize(cameraId, explicitSize, maxSize, aspectRatio, highSpeed);
if (size == null) {
captureSize = selectSize(cameraId, explicitSize, maxSize, aspectRatio, highSpeed);
if (captureSize == null) {
throw new IOException("Could not select camera size");
}
} catch (CameraAccessException e) {
throw new IOException(e);
}
VideoFilter filter = new VideoFilter(captureSize);
if (crop != null) {
filter.addCrop(crop, false);
}
if (captureOrientation != Orientation.Orient0) {
filter.addOrientation(captureOrientation);
}
transform = filter.getInverseTransform();
videoSize = filter.getOutputSize().limit(maxSize).round8();
if (transform != null) {
// The transform matrix returned by SurfaceTexture is incorrect for camera capture (it often contains an additional unexpected 90°
// rotation), so it is disabled (see start()). A vertical flip must be applied, though.
transform = AffineMatrix.vflip().multiply(transform);
}
}
private static String selectCamera(String explicitCameraId, CameraFacing cameraFacing) throws CameraAccessException, ConfigurationException {
@@ -214,6 +247,15 @@ public class CameraCapture extends SurfaceCapture {
@Override
public void start(Surface surface) throws IOException {
if (transform != null) {
assert glRunner == null;
OpenGLFilter glFilter = new AffineOpenGLFilter(transform);
// The transform matrix returned by SurfaceTexture is incorrect for camera capture (it often contains an additional unexpected 90°
// rotation), so disable it. A vertical flip is necessary though (see prepare()).
glRunner = new OpenGLRunner(glFilter, true);
surface = glRunner.start(captureSize, videoSize, surface);
}
try {
CameraCaptureSession session = createCaptureSession(cameraDevice, surface);
CaptureRequest request = createCaptureRequest(surface);
@@ -235,7 +277,7 @@ public class CameraCapture extends SurfaceCapture {
@Override
public Size getSize() {
return size;
return videoSize;
}
@Override

View File

@@ -18,7 +18,11 @@ public class CaptureReset implements SurfaceCapture.CaptureListener {
public synchronized void reset() {
reset.set(true);
if (runningMediaCodec != null) {
runningMediaCodec.signalEndOfInputStream();
try {
runningMediaCodec.signalEndOfInputStream();
} catch (IllegalStateException e) {
// ignore
}
}
}

View File

@@ -0,0 +1,53 @@
package com.genymobile.scrcpy.video;
import com.genymobile.scrcpy.device.DisplayInfo;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.util.Ln;
import com.genymobile.scrcpy.wrappers.ServiceManager;
public abstract class DisplayCapture extends SurfaceCapture {
// Source display size (before resizing/crop) for the current session
private Size sessionDisplaySize;
private synchronized Size getSessionDisplaySize() {
return sessionDisplaySize;
}
protected synchronized void setSessionDisplaySize(Size sessionDisplaySize) {
this.sessionDisplaySize = sessionDisplaySize;
}
protected void handleDisplayChanged(int displayId) {
DisplayInfo di = ServiceManager.getDisplayManager().getDisplayInfo(displayId);
if (di == null) {
Ln.w("DisplayInfo for " + displayId + " cannot be retrieved");
// We can't compare with the current size, so reset unconditionally
if (Ln.isEnabled(Ln.Level.VERBOSE)) {
Ln.v(getClass().getCanonicalName() + ": requestReset(): " + getSessionDisplaySize() + " -> (unknown)");
}
setSessionDisplaySize(null);
invalidate();
} else {
Size size = di.getSize();
// The field is hidden on purpose, to read it with synchronization
@SuppressWarnings("checkstyle:HiddenField")
Size sessionDisplaySize = getSessionDisplaySize(); // synchronized
// .equals() also works if sessionDisplaySize == null
if (!size.equals(sessionDisplaySize)) {
// Reset only if the size is different
if (Ln.isEnabled(Ln.Level.VERBOSE)) {
Ln.v(getClass().getCanonicalName() + ": requestReset(): " + sessionDisplaySize + " -> " + size);
}
// Set the new size immediately, so that a future onDisplayChanged() event called before the asynchronous prepare()
// considers that the current size is the requested size (to avoid a duplicate requestReset())
setSessionDisplaySize(size);
invalidate();
} else if (Ln.isEnabled(Ln.Level.VERBOSE)) {
Ln.v(getClass().getCanonicalName() + ": Size not changed (" + size + "): do not requestReset()");
}
}
}
}

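The new DisplayCapture.handleDisplayChanged() above resets the capture only when the reported size actually differs, and records the new size immediately so that a second change event arriving before the asynchronous prepare() cannot trigger a duplicate reset. A minimal standalone sketch of that pattern (SizeChangeTracker and the int[] sizes are illustrative, not scrcpy types):

```java
import java.util.Arrays;

// Sketch of the "reset only on size change" logic: record the new size
// immediately so a duplicate change event does not invalidate twice.
public class SizeChangeTracker {
    private int[] sessionSize; // {width, height}, null if unknown
    private int invalidations;

    public synchronized void onDisplayChanged(int[] newSize) {
        if (newSize == null) {
            // Size could not be retrieved: reset unconditionally
            sessionSize = null;
            invalidations++;
        } else if (!Arrays.equals(newSize, sessionSize)) {
            // Different size: record it now, then invalidate
            sessionSize = newSize;
            invalidations++;
        }
        // Same size: nothing to do
    }

    public synchronized int getInvalidations() {
        return invalidations;
    }
}
```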
View File

@@ -5,23 +5,30 @@ import com.genymobile.scrcpy.Options;
import com.genymobile.scrcpy.control.PositionMapper;
import com.genymobile.scrcpy.device.DisplayInfo;
import com.genymobile.scrcpy.device.NewDisplay;
import com.genymobile.scrcpy.device.Orientation;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.opengl.AffineOpenGLFilter;
import com.genymobile.scrcpy.opengl.OpenGLFilter;
import com.genymobile.scrcpy.opengl.OpenGLRunner;
import com.genymobile.scrcpy.util.AffineMatrix;
import com.genymobile.scrcpy.util.Ln;
import com.genymobile.scrcpy.wrappers.DisplayManager;
import com.genymobile.scrcpy.wrappers.ServiceManager;
import android.graphics.Rect;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.view.Surface;
import java.io.IOException;
public class NewDisplayCapture extends SurfaceCapture {
public class NewDisplayCapture extends DisplayCapture {
// Internal fields copied from android.hardware.display.DisplayManager
private static final int VIRTUAL_DISPLAY_FLAG_PUBLIC = DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC;
private static final int VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY = DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY;
private static final int VIRTUAL_DISPLAY_FLAG_PUBLIC = android.hardware.display.DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC;
private static final int VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY = android.hardware.display.DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY;
private static final int VIRTUAL_DISPLAY_FLAG_SUPPORTS_TOUCH = 1 << 6;
private static final int VIRTUAL_DISPLAY_FLAG_ROTATES_WITH_CONTENT = 1 << 7;
private static final int VIRTUAL_DISPLAY_FLAG_DESTROY_CONTENT_ON_REMOVAL = 1 << 8;
@@ -36,12 +43,25 @@ public class NewDisplayCapture extends SurfaceCapture {
private final VirtualDisplayListener vdListener;
private final NewDisplay newDisplay;
private DisplayManager.DisplayListenerHandle displayListenerHandle;
private HandlerThread handlerThread;
private AffineMatrix displayTransform;
private AffineMatrix eventTransform;
private OpenGLRunner glRunner;
private Size mainDisplaySize;
private int mainDisplayDpi;
private int maxSize; // only used if newDisplay.getSize() != null
private final Rect crop;
private boolean captureOrientationLocked;
private Orientation captureOrientation;
private VirtualDisplay virtualDisplay;
private Size size;
private Size videoSize;
private Size displaySize; // the logical size of the display (including rotation)
private Size physicalSize; // the physical size of the display (without rotation)
private int dpi;
public NewDisplayCapture(VirtualDisplayListener vdListener, Options options) {
@@ -49,16 +69,24 @@ public class NewDisplayCapture extends SurfaceCapture {
this.newDisplay = options.getNewDisplay();
assert newDisplay != null;
this.maxSize = options.getMaxSize();
this.crop = options.getCrop();
assert options.getCaptureOrientationLock() != null;
this.captureOrientationLocked = options.getCaptureOrientationLock() != Orientation.Lock.Unlocked;
this.captureOrientation = options.getCaptureOrientation();
assert captureOrientation != null;
}
@Override
protected void init() {
size = newDisplay.getSize();
displaySize = newDisplay.getSize();
dpi = newDisplay.getDpi();
if (size == null || dpi == 0) {
if (displaySize == null || dpi == 0) {
DisplayInfo displayInfo = ServiceManager.getDisplayManager().getDisplayInfo(0);
if (displayInfo != null) {
mainDisplaySize = displayInfo.getSize();
if ((displayInfo.getRotation() % 2) != 0) {
mainDisplaySize = mainDisplaySize.rotate(); // Use the natural device orientation (at rotation 0), not the current one
}
mainDisplayDpi = displayInfo.getDpi();
} else {
Ln.w("Main display not found, fallback to 1920x1080 240dpi");
@@ -70,12 +98,50 @@ public class NewDisplayCapture extends SurfaceCapture {
@Override
public void prepare() {
if (!newDisplay.hasExplicitSize()) {
size = ScreenInfo.computeVideoSize(mainDisplaySize.getWidth(), mainDisplaySize.getHeight(), maxSize);
int displayRotation;
if (virtualDisplay == null) {
if (!newDisplay.hasExplicitSize()) {
displaySize = mainDisplaySize.limit(maxSize).round8();
}
if (!newDisplay.hasExplicitDpi()) {
dpi = scaleDpi(mainDisplaySize, mainDisplayDpi, displaySize);
}
videoSize = displaySize;
displayRotation = 0;
// Set the current display size to avoid an unnecessary call to invalidate()
setSessionDisplaySize(displaySize);
} else {
DisplayInfo displayInfo = ServiceManager.getDisplayManager().getDisplayInfo(virtualDisplay.getDisplay().getDisplayId());
displaySize = displayInfo.getSize();
dpi = displayInfo.getDpi();
displayRotation = displayInfo.getRotation();
}
if (!newDisplay.hasExplicitDpi()) {
dpi = scaleDpi(mainDisplaySize, mainDisplayDpi, size);
VideoFilter filter = new VideoFilter(displaySize);
if (crop != null) {
boolean transposed = (displayRotation % 2) != 0;
filter.addCrop(crop, transposed);
}
filter.addOrientation(displayRotation, captureOrientationLocked, captureOrientation);
eventTransform = filter.getInverseTransform();
// The display info gives the oriented size (so videoSize includes the display rotation)
videoSize = filter.getOutputSize().limit(maxSize).round8();
// But the virtual display video always remains in the origin orientation (the video itself is not rotated, so it must be rotated manually).
// This additional display rotation must not be included in the input events transform (the expected coordinates are already in the
// physical display size)
VideoFilter displayFilter = new VideoFilter(displaySize);
displayFilter.addRotation(displayRotation);
// The display info gives the oriented size, but the virtual display video always remains in the origin orientation
AffineMatrix displayRotationMatrix = displayFilter.getInverseTransform();
physicalSize = displayFilter.getOutputSize();
displayTransform = AffineMatrix.multiplyAll(displayRotationMatrix, eventTransform);
}
public void startNew(Surface surface) {
@@ -98,32 +164,68 @@ public class NewDisplayCapture extends SurfaceCapture {
}
}
virtualDisplay = ServiceManager.getDisplayManager()
.createNewVirtualDisplay("scrcpy", size.getWidth(), size.getHeight(), dpi, surface, flags);
.createNewVirtualDisplay("scrcpy", displaySize.getWidth(), displaySize.getHeight(), dpi, surface, flags);
virtualDisplayId = virtualDisplay.getDisplay().getDisplayId();
Ln.i("New display: " + size.getWidth() + "x" + size.getHeight() + "/" + dpi + " (id=" + virtualDisplayId + ")");
Ln.i("New display: " + displaySize.getWidth() + "x" + displaySize.getHeight() + "/" + dpi + " (id=" + virtualDisplayId + ")");
handlerThread = new HandlerThread("DisplayListener");
handlerThread.start();
Handler handler = new Handler(handlerThread.getLooper());
displayListenerHandle = ServiceManager.getDisplayManager().registerDisplayListener(displayId -> {
if (Ln.isEnabled(Ln.Level.VERBOSE)) {
Ln.v("NewDisplayCapture: onDisplayChanged(" + displayId + ")");
}
if (displayId == virtualDisplayId) {
handleDisplayChanged(displayId);
}
}, handler);
} catch (Exception e) {
Ln.e("Could not create display", e);
throw new AssertionError("Could not create display");
}
if (vdListener != null) {
Rect contentRect = new Rect(0, 0, size.getWidth(), size.getHeight());
PositionMapper positionMapper = new PositionMapper(size, contentRect, 0);
vdListener.onNewVirtualDisplay(virtualDisplayId, positionMapper);
}
}
@Override
public void start(Surface surface) throws IOException {
if (displayTransform != null) {
assert glRunner == null;
OpenGLFilter glFilter = new AffineOpenGLFilter(displayTransform);
glRunner = new OpenGLRunner(glFilter);
surface = glRunner.start(physicalSize, videoSize, surface);
}
if (virtualDisplay == null) {
startNew(surface);
} else {
virtualDisplay.setSurface(surface);
}
if (vdListener != null) {
PositionMapper positionMapper = PositionMapper.create(videoSize, eventTransform, displaySize);
vdListener.onNewVirtualDisplay(virtualDisplay.getDisplay().getDisplayId(), positionMapper);
}
}
@Override
public void stop() {
if (glRunner != null) {
glRunner.stopAndRelease();
glRunner = null;
}
}
@Override
public void release() {
handlerThread.quitSafely();
handlerThread = null;
// displayListenerHandle may be null if registration failed
if (displayListenerHandle != null) {
ServiceManager.getDisplayManager().unregisterDisplayListener(displayListenerHandle);
displayListenerHandle = null;
}
if (virtualDisplay != null) {
virtualDisplay.release();
virtualDisplay = null;
@@ -132,7 +234,7 @@ public class NewDisplayCapture extends SurfaceCapture {
@Override
public synchronized Size getSize() {
return size;
return videoSize;
}
@Override

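The init() hunk above swaps the main display size whenever the current rotation is odd, so that the deterministic natural (rotation-0) orientation is used instead of whatever orientation the device happens to be in. As a standalone sketch of that rule (toNatural is a hypothetical helper, not scrcpy's API):

```java
// Sketch of the natural-orientation rule: a rotation of 90° or 270°
// (odd values 1 and 3) transposes the reported size, so undo it.
public class NaturalOrientation {
    // Return {width, height} in the natural (rotation-0) orientation
    public static int[] toNatural(int width, int height, int rotation) {
        if ((rotation % 2) != 0) {
            return new int[] {height, width}; // undo the 90°/270° transpose
        }
        return new int[] {width, height};
    }
}
```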
View File

@@ -6,7 +6,12 @@ import com.genymobile.scrcpy.control.PositionMapper;
import com.genymobile.scrcpy.device.ConfigurationException;
import com.genymobile.scrcpy.device.Device;
import com.genymobile.scrcpy.device.DisplayInfo;
import com.genymobile.scrcpy.device.Orientation;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.opengl.AffineOpenGLFilter;
import com.genymobile.scrcpy.opengl.OpenGLFilter;
import com.genymobile.scrcpy.opengl.OpenGLRunner;
import com.genymobile.scrcpy.util.AffineMatrix;
import com.genymobile.scrcpy.util.Ln;
import com.genymobile.scrcpy.util.LogUtils;
import com.genymobile.scrcpy.wrappers.DisplayManager;
@@ -23,23 +28,27 @@ import android.view.IDisplayFoldListener;
import android.view.IRotationWatcher;
import android.view.Surface;
public class ScreenCapture extends SurfaceCapture {
import java.io.IOException;
public class ScreenCapture extends DisplayCapture {
private final VirtualDisplayListener vdListener;
private final int displayId;
private int maxSize;
private final Rect crop;
private final int lockVideoOrientation;
private Orientation.Lock captureOrientationLock;
private Orientation captureOrientation;
private float angle;
private DisplayInfo displayInfo;
private ScreenInfo screenInfo;
// Source display size (before resizing/crop) for the current session
private Size sessionDisplaySize;
private Size videoSize;
private IBinder display;
private VirtualDisplay virtualDisplay;
private AffineMatrix transform;
private OpenGLRunner glRunner;
private DisplayManager.DisplayListenerHandle displayListenerHandle;
private HandlerThread handlerThread;
@@ -56,7 +65,11 @@ public class ScreenCapture extends SurfaceCapture {
assert displayId != Device.DISPLAY_ID_NONE;
this.maxSize = options.getMaxSize();
this.crop = options.getCrop();
this.lockVideoOrientation = options.getLockVideoOrientation();
this.captureOrientationLock = options.getCaptureOrientationLock();
this.captureOrientation = options.getCaptureOrientation();
assert captureOrientationLock != null;
assert captureOrientation != null;
this.angle = options.getAngle();
}
@Override
@@ -80,36 +93,7 @@ public class ScreenCapture extends SurfaceCapture {
}
}
if (this.displayId == displayId) {
DisplayInfo di = ServiceManager.getDisplayManager().getDisplayInfo(displayId);
if (di == null) {
Ln.w("DisplayInfo for " + displayId + " cannot be retrieved");
// We can't compare with the current size, so reset unconditionally
if (Ln.isEnabled(Ln.Level.VERBOSE)) {
Ln.v("ScreenCapture: requestReset(): " + getSessionDisplaySize() + " -> (unknown)");
}
setSessionDisplaySize(null);
invalidate();
} else {
Size size = di.getSize();
// The field is hidden on purpose, to read it with synchronization
@SuppressWarnings("checkstyle:HiddenField")
Size sessionDisplaySize = getSessionDisplaySize(); // synchronized
// .equals() also works if sessionDisplaySize == null
if (!size.equals(sessionDisplaySize)) {
// Reset only if the size is different
if (Ln.isEnabled(Ln.Level.VERBOSE)) {
Ln.v("ScreenCapture: requestReset(): " + sessionDisplaySize + " -> " + size);
}
// Set the new size immediately, so that a future onDisplayChanged() event called before the asynchronous prepare()
// considers that the current size is the requested size (to avoid a duplicate requestReset())
setSessionDisplaySize(size);
invalidate();
} else if (Ln.isEnabled(Ln.Level.VERBOSE)) {
Ln.v("ScreenCapture: Size not changed (" + size + "): do not requestReset()");
}
}
handleDisplayChanged(displayId);
}
}, handler);
}
@@ -126,46 +110,73 @@ public class ScreenCapture extends SurfaceCapture {
Ln.w("Display doesn't have FLAG_SUPPORTS_PROTECTED_BUFFERS flag, mirroring can be restricted");
}
setSessionDisplaySize(displayInfo.getSize());
screenInfo = ScreenInfo.computeScreenInfo(displayInfo.getRotation(), displayInfo.getSize(), crop, maxSize, lockVideoOrientation);
Size displaySize = displayInfo.getSize();
setSessionDisplaySize(displaySize);
if (captureOrientationLock == Orientation.Lock.LockedInitial) {
// The user requested to lock the video orientation to the current orientation
captureOrientationLock = Orientation.Lock.LockedValue;
captureOrientation = Orientation.fromRotation(displayInfo.getRotation());
}
VideoFilter filter = new VideoFilter(displaySize);
if (crop != null) {
boolean transposed = (displayInfo.getRotation() % 2) != 0;
filter.addCrop(crop, transposed);
}
boolean locked = captureOrientationLock != Orientation.Lock.Unlocked;
filter.addOrientation(displayInfo.getRotation(), locked, captureOrientation);
filter.addAngle(angle);
transform = filter.getInverseTransform();
videoSize = filter.getOutputSize().limit(maxSize).round8();
}
@Override
public void start(Surface surface) {
if (display != null) {
SurfaceControl.destroyDisplay(display);
display = null;
}
if (virtualDisplay != null) {
virtualDisplay.release();
virtualDisplay = null;
public void start(Surface surface) throws IOException {
Size inputSize;
if (transform != null) {
// If there is a filter, it must receive the full display content
inputSize = displayInfo.getSize();
assert glRunner == null;
OpenGLFilter glFilter = new AffineOpenGLFilter(transform);
glRunner = new OpenGLRunner(glFilter);
surface = glRunner.start(inputSize, videoSize, surface);
} else {
// If there is no filter, the display must be rendered at target video size directly
inputSize = videoSize;
}
int virtualDisplayId;
PositionMapper positionMapper;
try {
Size videoSize = screenInfo.getVideoSize();
virtualDisplay = ServiceManager.getDisplayManager()
.createVirtualDisplay("scrcpy", videoSize.getWidth(), videoSize.getHeight(), displayId, surface);
if (virtualDisplay == null) {
virtualDisplay = ServiceManager.getDisplayManager()
.createVirtualDisplay("scrcpy", inputSize.getWidth(), inputSize.getHeight(), displayId, surface);
} else {
virtualDisplay.setSurface(surface);
virtualDisplay.resize(videoSize.getWidth(), videoSize.getHeight(), displayInfo.getDpi());
}
virtualDisplayId = virtualDisplay.getDisplay().getDisplayId();
Rect contentRect = new Rect(0, 0, videoSize.getWidth(), videoSize.getHeight());
// The position are relative to the virtual display, not the original display
positionMapper = new PositionMapper(videoSize, contentRect, 0);
// The positions are relative to the virtual display, not the original display (so use inputSize, not deviceSize!)
positionMapper = PositionMapper.create(videoSize, transform, inputSize);
Ln.d("Display: using DisplayManager API");
} catch (Exception displayManagerException) {
try {
display = createDisplay();
if (display == null) {
display = createDisplay();
}
Rect contentRect = screenInfo.getContentRect();
// does not include the locked video orientation
Rect unlockedVideoRect = screenInfo.getUnlockedVideoSize().toRect();
int videoRotation = screenInfo.getVideoRotation();
Size deviceSize = displayInfo.getSize();
int layerStack = displayInfo.getLayerStack();
setDisplaySurface(display, surface, videoRotation, contentRect, unlockedVideoRect, layerStack);
setDisplaySurface(display, surface, deviceSize.toRect(), inputSize.toRect(), layerStack);
virtualDisplayId = displayId;
positionMapper = PositionMapper.from(screenInfo);
positionMapper = PositionMapper.create(videoSize, transform, deviceSize);
Ln.d("Display: using SurfaceControl API");
} catch (Exception surfaceControlException) {
Ln.e("Could not create display using DisplayManager", displayManagerException);
@@ -179,6 +190,14 @@ public class ScreenCapture extends SurfaceCapture {
}
}
@Override
public void stop() {
if (glRunner != null) {
glRunner.stopAndRelease();
glRunner = null;
}
}
@Override
public void release() {
if (Build.VERSION.SDK_INT == AndroidVersions.API_34_ANDROID_14) {
@@ -206,7 +225,7 @@ public class ScreenCapture extends SurfaceCapture {
@Override
public Size getSize() {
return screenInfo.getVideoSize();
return videoSize;
}
@Override
@@ -223,25 +242,17 @@ public class ScreenCapture extends SurfaceCapture {
return SurfaceControl.createDisplay("scrcpy", secure);
}
private static void setDisplaySurface(IBinder display, Surface surface, int orientation, Rect deviceRect, Rect displayRect, int layerStack) {
private static void setDisplaySurface(IBinder display, Surface surface, Rect deviceRect, Rect displayRect, int layerStack) {
SurfaceControl.openTransaction();
try {
SurfaceControl.setDisplaySurface(display, surface);
SurfaceControl.setDisplayProjection(display, orientation, deviceRect, displayRect);
SurfaceControl.setDisplayProjection(display, 0, deviceRect, displayRect);
SurfaceControl.setDisplayLayerStack(display, layerStack);
} finally {
SurfaceControl.closeTransaction();
}
}
private synchronized Size getSessionDisplaySize() {
return sessionDisplaySize;
}
private synchronized void setSessionDisplaySize(Size sessionDisplaySize) {
this.sessionDisplaySize = sessionDisplaySize;
}
private void registerDisplayListenerFallbacks() {
rotationWatcher = new IRotationWatcher.Stub() {
@Override

View File

@@ -1,149 +0,0 @@
package com.genymobile.scrcpy.video;
import com.genymobile.scrcpy.BuildConfig;
import com.genymobile.scrcpy.device.Device;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.util.Ln;
import android.graphics.Rect;
public final class ScreenInfo {
/**
* Device (physical) size, possibly cropped
*/
private final Rect contentRect; // device size, possibly cropped
/**
* Video size, possibly smaller than the device size, already taking the device rotation and crop into account.
* <p>
* However, it does not include the locked video orientation.
*/
private final Size unlockedVideoSize;
/**
* Device rotation, related to the natural device orientation (0, 1, 2 or 3)
*/
private final int deviceRotation;
/**
* The locked video orientation (-1: disabled, 0: normal, 1: 90° CCW, 2: 180°, 3: 90° CW)
*/
private final int lockedVideoOrientation;
public ScreenInfo(Rect contentRect, Size unlockedVideoSize, int deviceRotation, int lockedVideoOrientation) {
this.contentRect = contentRect;
this.unlockedVideoSize = unlockedVideoSize;
this.deviceRotation = deviceRotation;
this.lockedVideoOrientation = lockedVideoOrientation;
}
public Rect getContentRect() {
return contentRect;
}
/**
* Return the video size as if locked video orientation was not set.
*
* @return the unlocked video size
*/
public Size getUnlockedVideoSize() {
return unlockedVideoSize;
}
/**
* Return the actual video size if locked video orientation is set.
*
* @return the actual video size
*/
public Size getVideoSize() {
if (getVideoRotation() % 2 == 0) {
return unlockedVideoSize;
}
return unlockedVideoSize.rotate();
}
public static ScreenInfo computeScreenInfo(int rotation, Size deviceSize, Rect crop, int maxSize, int lockedVideoOrientation) {
if (lockedVideoOrientation == Device.LOCK_VIDEO_ORIENTATION_INITIAL || lockedVideoOrientation == Device.LOCK_VIDEO_ORIENTATION_INITIAL_AUTO) {
// The user requested to lock the video orientation to the current orientation
lockedVideoOrientation = rotation;
}
Rect contentRect = new Rect(0, 0, deviceSize.getWidth(), deviceSize.getHeight());
if (crop != null) {
if (rotation % 2 != 0) { // 180s preserve dimensions
// the crop (provided by the user) is expressed in the natural orientation
crop = flipRect(crop);
}
if (!contentRect.intersect(crop)) {
// intersect() changes contentRect so that it is intersected with crop
Ln.w("Crop rectangle (" + formatCrop(crop) + ") does not intersect device screen (" + formatCrop(deviceSize.toRect()) + ")");
contentRect = new Rect(); // empty
}
}
Size videoSize = computeVideoSize(contentRect.width(), contentRect.height(), maxSize);
return new ScreenInfo(contentRect, videoSize, rotation, lockedVideoOrientation);
}
private static String formatCrop(Rect rect) {
return rect.width() + ":" + rect.height() + ":" + rect.left + ":" + rect.top;
}
public static Size computeVideoSize(int w, int h, int maxSize) {
// Compute the video size and the padding of the content inside this video.
// Principle:
// - scale down the great side of the screen to maxSize (if necessary);
// - scale down the other side so that the aspect ratio is preserved;
// - round this value to the nearest multiple of 8 (H.264 only accepts multiples of 8)
w &= ~7; // in case it's not a multiple of 8
h &= ~7;
if (maxSize > 0) {
if (BuildConfig.DEBUG && maxSize % 8 != 0) {
throw new AssertionError("Max size must be a multiple of 8");
}
boolean portrait = h > w;
int major = portrait ? h : w;
int minor = portrait ? w : h;
if (major > maxSize) {
int minorExact = minor * maxSize / major;
// +4 to round the value to the nearest multiple of 8
minor = (minorExact + 4) & ~7;
major = maxSize;
}
w = portrait ? minor : major;
h = portrait ? major : minor;
}
return new Size(w, h);
}
private static Rect flipRect(Rect crop) {
return new Rect(crop.top, crop.left, crop.bottom, crop.right);
}
/**
* Return the rotation to apply to the device rotation to get the requested locked video orientation
*
* @return the rotation offset
*/
public int getVideoRotation() {
if (lockedVideoOrientation == -1) {
// no offset
return 0;
}
return (deviceRotation + 4 - lockedVideoOrientation) % 4;
}
/**
* Return the rotation to apply to the requested locked video orientation to get the device rotation
*
* @return the (reverse) rotation offset
*/
public int getReverseVideoRotation() {
if (lockedVideoOrientation == -1) {
// no offset
return 0;
}
return (lockedVideoOrientation + 4 - deviceRotation) % 4;
}
}

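The deleted ScreenInfo class computed the video rotation as an offset between the device rotation and the locked orientation, with a matching reverse offset; the two always compose to 0 modulo 4. A standalone sketch of that arithmetic, kept for reference:

```java
// Sketch of the rotation-offset arithmetic from the removed ScreenInfo:
// both values are quarter-turns in 0..3, and the two offsets are inverses.
public class RotationOffsets {
    // Rotation to apply to the device rotation to reach the locked orientation
    public static int videoRotation(int deviceRotation, int lockedOrientation) {
        return (deviceRotation + 4 - lockedOrientation) % 4;
    }

    // Rotation to apply to the locked orientation to reach the device rotation
    public static int reverseVideoRotation(int deviceRotation, int lockedOrientation) {
        return (lockedOrientation + 4 - deviceRotation) % 4;
    }
}
```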
View File

@@ -57,6 +57,13 @@ public abstract class SurfaceCapture {
*/
public abstract void start(Surface surface) throws IOException;
/**
* Stop the capture.
*/
public void stop() {
// Do nothing by default
}
/**
* Return the video size
*

View File

@@ -108,9 +108,6 @@ public class SurfaceEncoder implements AsyncProcessor {
// The capture might have been closed internally (for example if the camera is disconnected)
alive = !stopped.get() && !capture.isClosed();
}
// do not call stop() on exception, it would trigger an IllegalStateException
mediaCodec.stop();
} catch (IllegalStateException | IllegalArgumentException e) {
Ln.e("Encoding error: " + e.getClass().getName() + ": " + e.getMessage());
if (!prepareRetry(size)) {
@@ -120,6 +117,12 @@ public class SurfaceEncoder implements AsyncProcessor {
alive = true;
} finally {
reset.setRunningMediaCodec(null);
capture.stop();
try {
mediaCodec.stop();
} catch (IllegalStateException e) {
// ignore
}
mediaCodec.reset();
if (surface != null) {
surface.release();

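The finally block above now tears down in a fixed order: stop the capture, then stop the codec while swallowing the IllegalStateException that MediaCodec.stop() throws when the codec is already in an error state. The guarded-stop pattern in isolation (GuardedStop is a hypothetical helper; a Runnable stands in for the codec call):

```java
// Sketch of the guarded teardown used in SurfaceEncoder's finally block:
// the stop call must not abort cleanup if the codec already failed.
public class GuardedStop {
    // Returns true if the stop action completed, false if it was
    // already in an illegal state (which is deliberately ignored)
    public static boolean stopQuietly(Runnable stopAction) {
        try {
            stopAction.run();
            return true;
        } catch (IllegalStateException e) {
            return false; // codec already failed/released: ignore
        }
    }
}
```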
View File

@@ -0,0 +1,99 @@
package com.genymobile.scrcpy.video;
import com.genymobile.scrcpy.device.Orientation;
import com.genymobile.scrcpy.device.Size;
import com.genymobile.scrcpy.util.AffineMatrix;
import android.graphics.Rect;
public class VideoFilter {
private Size size;
private AffineMatrix transform;
public VideoFilter(Size inputSize) {
this.size = inputSize;
}
public Size getOutputSize() {
return size;
}
public AffineMatrix getTransform() {
return transform;
}
/**
* Return the inverse transform.
* <p/>
* The direct affine transform describes how the input image is transformed.
* <p/>
* It is often useful to retrieve the inverse transform instead:
* <ul>
* <li>The OpenGL filter expects the matrix to transform the image <em>coordinates</em>, which is the inverse transform;</li>
* <li>The click positions must be transformed back to the device positions, using the inverse transform too.</li>
* </ul>
*
* @return the inverse transform
*/
public AffineMatrix getInverseTransform() {
if (transform == null) {
return null;
}
return transform.invert();
}
private static Rect transposeRect(Rect rect) {
return new Rect(rect.top, rect.left, rect.bottom, rect.right);
}
public void addCrop(Rect crop, boolean transposed) {
if (transposed) {
crop = transposeRect(crop);
}
double inputWidth = size.getWidth();
double inputHeight = size.getHeight();
double x = crop.left / inputWidth;
double y = 1 - (crop.bottom / inputHeight); // OpenGL origin is bottom-left
double w = crop.width() / inputWidth;
double h = crop.height() / inputHeight;
transform = AffineMatrix.reframe(x, y, w, h).multiply(transform);
size = new Size(crop.width(), crop.height());
}
public void addRotation(int ccwRotation) {
if (ccwRotation == 0) {
return;
}
transform = AffineMatrix.rotateOrtho(ccwRotation).multiply(transform);
if (ccwRotation % 2 != 0) {
size = size.rotate();
}
}
public void addOrientation(Orientation captureOrientation) {
if (captureOrientation.isFlipped()) {
transform = AffineMatrix.hflip().multiply(transform);
}
int ccwRotation = (4 - captureOrientation.getRotation()) % 4;
addRotation(ccwRotation);
}
public void addOrientation(int displayRotation, boolean locked, Orientation captureOrientation) {
if (locked) {
// flip/rotate the current display from the natural device orientation (i.e. where display rotation is 0)
int reverseDisplayRotation = (4 - displayRotation) % 4;
addRotation(reverseDisplayRotation);
}
addOrientation(captureOrientation);
}
public void addAngle(double ccwAngle) {
if (ccwAngle == 0) {
return;
}
transform = AffineMatrix.rotate(ccwAngle).withAspectRatio(size).fromCenter().multiply(transform);
}
}
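addCrop() converts the pixel crop rectangle into normalized texture coordinates with the OpenGL bottom-left origin. A worked example of that normalization (plain doubles stand in for scrcpy's AffineMatrix.reframe() inputs; CropNormalization is illustrative only):

```java
// Sketch of addCrop()'s coordinate normalization: pixel rectangle
// (left, top, right, bottom) -> {x, y, w, h} in [0,1] texture space.
public class CropNormalization {
    public static double[] normalize(int inW, int inH,
                                     int left, int top, int right, int bottom) {
        double x = left / (double) inW;
        double y = 1 - (bottom / (double) inH); // OpenGL origin is bottom-left
        double w = (right - left) / (double) inW;
        double h = (bottom - top) / (double) inH;
        return new double[] {x, y, w, h};
    }
}
```

For a 1920x1080 input with crop (480, 270)..(1440, 810), this yields x=0.25, y=0.25, w=0.5, h=0.5: the centered half-size region, measured from the bottom-left corner.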