Compare commits

..

5 Commits

Author SHA1 Message Date
d2ae5be3ea Add mouse shortcut to expand settings panel
Double-click on extra mouse button to open the settings panel (a
single-click opens the notification panel).

This is consistent with the keyboard shortcut MOD+n+n.

PR #2264 <https://github.com/Genymobile/scrcpy/pull/2264>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2021-04-20 19:30:17 +02:00
21b857959b Add keyboard shortcut to expand settings panel
PR #2260 <https://github.com/Genymobile/scrcpy/pull/2260>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2021-04-20 19:30:17 +02:00
66b6170c43 Add control message to expand settings panel
PR #2260 <https://github.com/Genymobile/scrcpy/pull/2260>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2021-04-20 19:30:17 +02:00
b8f3064510 Count repeated identical key events
This will allow shortcuts such as MOD+n+n to open the settings panel.

PR #2260 <https://github.com/Genymobile/scrcpy/pull/2260>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2021-04-20 19:30:06 +02:00
f06dbd7927 Rename control message type to COLLAPSE_PANELS
The collapsing action collapses any panels.

By the way, the Android method is named collapsePanels().

PR #2260 <https://github.com/Genymobile/scrcpy/pull/2260>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2021-04-20 19:23:24 +02:00
35 changed files with 602 additions and 1369 deletions

View File

@ -198,7 +198,6 @@ If `--max-size` is also specified, resizing is applied after cropping.
To lock the orientation of the mirroring:
```bash
scrcpy --lock-video-orientation # initial (current) orientation
scrcpy --lock-video-orientation 0 # natural orientation
scrcpy --lock-video-orientation 1 # 90° counterclockwise
scrcpy --lock-video-orientation 2 # 180°
@ -226,9 +225,7 @@ error will give the available encoders:
scrcpy --encoder _
```
### Capture
#### Recording
### Recording
It is possible to record the screen while mirroring:
@ -252,58 +249,6 @@ variation] does not impact the recorded file.
[packet delay variation]: https://en.wikipedia.org/wiki/Packet_delay_variation
#### v4l2loopback
On Linux, it is possible to send the video stream to a v4l2 loopback device, so
that the Android device can be opened like a webcam by any v4l2-capable tool.
The module `v4l2loopback` must be installed:
```bash
sudo apt install v4l2loopback-dkms
```
To create a v4l2 device:
```bash
sudo modprobe v4l2loopback
```
This will create a new video device in `/dev/videoN`, where `N` is an integer
(more [options](https://github.com/umlaeute/v4l2loopback#options) are available
to create several devices or devices with specific IDs).
To list the enabled devices:
```bash
# requires v4l-utils package
v4l2-ctl --list-devices
# simple but might be sufficient
ls /dev/video*
```
To start scrcpy using a v4l2 sink:
```bash
scrcpy --v4l2-sink=/dev/videoN
scrcpy --v4l2-sink=/dev/videoN -N # --no-display to disable mirroring window
```
(replace `N` by the device ID, check with `ls /dev/video*`)
Once enabled, you can open your video stream with a v4l2-capable tool:
```bash
ffplay -i /dev/videoN
vlc v4l2:///dev/videoN # VLC might add some buffering delay
```
For example, you could capture the video within [OBS].
[OBS]: https://obsproject.com/fr
### Connection
#### Wireless
@ -546,6 +491,18 @@ scrcpy -Sw
```
#### Render expired frames
By default, to minimize latency, _scrcpy_ always renders the last decoded frame
available, and drops any previous one.
To force the rendering of all frames (at a cost of a possible increased
latency), use:
```bash
scrcpy --render-expired-frames
```
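The behaviour described above (always render the most recent decoded frame, dropping anything older) amounts to a single-slot mailbox between the decoding thread and the rendering thread; `--render-expired-frames` disables the dropping. Below is a minimal, illustrative sketch of that drop-to-latest idea in C. The `latest_frame_*` names are hypothetical and not scrcpy's actual `video_buffer` API.

```c
#include <pthread.h>
#include <stdbool.h>
#include <stdio.h>

/* Single-slot mailbox: the producer overwrites the slot, the consumer always
 * takes the most recent frame, and an unconsumed frame is silently dropped. */
struct latest_frame {
    pthread_mutex_t mutex;
    int frame_id; /* stand-in for a decoded frame */
    bool pending; /* a frame is waiting to be rendered */
};

static void
latest_frame_push(struct latest_frame *lf, int frame_id, bool *skipped) {
    pthread_mutex_lock(&lf->mutex);
    *skipped = lf->pending; /* a pending frame gets replaced (i.e. dropped) */
    lf->frame_id = frame_id;
    lf->pending = true;
    pthread_mutex_unlock(&lf->mutex);
}

static bool
latest_frame_take(struct latest_frame *lf, int *frame_id) {
    pthread_mutex_lock(&lf->mutex);
    bool ok = lf->pending;
    if (ok) {
        *frame_id = lf->frame_id;
        lf->pending = false;
    }
    pthread_mutex_unlock(&lf->mutex);
    return ok;
}

int main(void) {
    struct latest_frame lf = {PTHREAD_MUTEX_INITIALIZER, 0, false};
    bool skipped;
    latest_frame_push(&lf, 1, &skipped); /* skipped == false */
    latest_frame_push(&lf, 2, &skipped); /* skipped == true: frame 1 dropped */
    int id;
    if (latest_frame_take(&lf, &id)) {
        printf("rendering frame %d\n", id); /* prints 2 */
    }
    return 0;
}
```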
#### Show touches
For presentations, it may be useful to show physical touches (on the physical
@ -741,10 +698,10 @@ _<kbd>[Super]</kbd> is typically the <kbd>Windows</kbd> or <kbd>Cmd</kbd> key._
| Rotate display left | <kbd>MOD</kbd>+<kbd>←</kbd> _(left)_
| Rotate display right | <kbd>MOD</kbd>+<kbd>→</kbd> _(right)_
| Resize window to 1:1 (pixel-perfect) | <kbd>MOD</kbd>+<kbd>g</kbd>
| Resize window to remove black borders | <kbd>MOD</kbd>+<kbd>w</kbd> \| _Double-left-click¹_
| Resize window to remove black borders | <kbd>MOD</kbd>+<kbd>w</kbd> \| _Double-click¹_
| Click on `HOME` | <kbd>MOD</kbd>+<kbd>h</kbd> \| _Middle-click_
| Click on `BACK` | <kbd>MOD</kbd>+<kbd>b</kbd> \| _Right-click²_
| Click on `APP_SWITCH` | <kbd>MOD</kbd>+<kbd>s</kbd> \| _4th-click³_
| Click on `APP_SWITCH` | <kbd>MOD</kbd>+<kbd>s</kbd>
| Click on `MENU` (unlock screen) | <kbd>MOD</kbd>+<kbd>m</kbd>
| Click on `VOLUME_UP` | <kbd>MOD</kbd>+<kbd>↑</kbd> _(up)_
| Click on `VOLUME_DOWN` | <kbd>MOD</kbd>+<kbd>↓</kbd> _(down)_
@ -753,27 +710,18 @@ _<kbd>[Super]</kbd> is typically the <kbd>Windows</kbd> or <kbd>Cmd</kbd> key._
| Turn device screen off (keep mirroring) | <kbd>MOD</kbd>+<kbd>o</kbd>
| Turn device screen on | <kbd>MOD</kbd>+<kbd>Shift</kbd>+<kbd>o</kbd>
| Rotate device screen | <kbd>MOD</kbd>+<kbd>r</kbd>
| Expand notification panel | <kbd>MOD</kbd>+<kbd>n</kbd> \| _5th-click³_
| Expand settings panel | <kbd>MOD</kbd>+<kbd>n</kbd>+<kbd>n</kbd> \| _Double-5th-click³_
| Collapse panels | <kbd>MOD</kbd>+<kbd>Shift</kbd>+<kbd>n</kbd>
| Copy to clipboard | <kbd>MOD</kbd>+<kbd>c</kbd>
| Cut to clipboard⁴ | <kbd>MOD</kbd>+<kbd>x</kbd>
| Synchronize clipboards and paste⁴ | <kbd>MOD</kbd>+<kbd>v</kbd>
| Expand notification panel | <kbd>MOD</kbd>+<kbd>n</kbd>
| Collapse notification panel | <kbd>MOD</kbd>+<kbd>Shift</kbd>+<kbd>n</kbd>
| Copy to clipboard³ | <kbd>MOD</kbd>+<kbd>c</kbd>
| Cut to clipboard³ | <kbd>MOD</kbd>+<kbd>x</kbd>
| Synchronize clipboards and paste³ | <kbd>MOD</kbd>+<kbd>v</kbd>
| Inject computer clipboard text | <kbd>MOD</kbd>+<kbd>Shift</kbd>+<kbd>v</kbd>
| Enable/disable FPS counter (on stdout) | <kbd>MOD</kbd>+<kbd>i</kbd>
| Pinch-to-zoom | <kbd>Ctrl</kbd>+_click-and-move_
_¹Double-click on black borders to remove them._
_²Right-click turns the screen on if it was off, presses BACK otherwise._
_³4th and 5th mouse buttons, if your mouse has them._
_⁴Only on Android >= 7._
Shortcuts with repeated keys are executed by releasing and pressing the key a
second time. For example, to execute "Expand settings panel":
1. Press and keep pressing <kbd>MOD</kbd>.
2. Then double-press <kbd>n</kbd>.
3. Finally, release <kbd>MOD</kbd>.
_³Only on Android >= 7._
All <kbd>Ctrl</kbd>+_key_ shortcuts are forwarded to the device, so they are
handled by the active application.

View File

@ -33,11 +33,6 @@ else
src += [ 'src/sys/unix/process.c' ]
endif
v4l2_support = host_machine.system() == 'linux'
if v4l2_support
src += [ 'src/v4l2_sink.c' ]
endif
check_functions = [
'strdup'
]
@ -54,10 +49,6 @@ if not get_option('crossbuild_windows')
dependency('sdl2'),
]
if v4l2_support
dependencies += dependency('libavdevice')
endif
else
# cross-compile mingw32 build (from Linux to Windows)
@ -133,9 +124,6 @@ conf.set('SERVER_DEBUGGER', get_option('server_debugger'))
# select the debugger method ('old' for Android < 9, 'new' for Android >= 9)
conf.set('SERVER_DEBUGGER_METHOD_NEW', get_option('server_debugger_method') == 'new')
# enable V4L2 support (linux only)
conf.set('HAVE_V4L2', v4l2_support)
configure_file(configuration: conf, output: 'config.h')
src_dir = include_directories('src')

View File

@ -83,12 +83,10 @@ Inject computer clipboard text as a sequence of key events on Ctrl+v (like MOD+S
This is a workaround for some devices not behaving as expected when setting the device clipboard programmatically.
.TP
.BI "\-\-lock\-video\-orientation " [value]
Lock video orientation to \fIvalue\fR. Possible values are "unlocked", "initial" (locked to the initial orientation), 0, 1, 2 and 3. Natural device orientation is 0, and each increment adds a 90 degrees otation counterclockwise.
.BI "\-\-lock\-video\-orientation " value
Lock video orientation to \fIvalue\fR. Possible values are -1 (unlocked), 0, 1, 2 and 3. Natural device orientation is 0, and each increment adds a 90 degrees rotation counterclockwise.
Default is "unlocked".
Passing the option without argument is equivalent to passing "initial".
Default is -1 (unlocked).
.TP
.BI "\-\-max\-fps " value
@ -157,6 +155,10 @@ Supported names are currently "direct3d", "opengl", "opengles2", "opengles", "me
.UR https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER
.UE
.TP
.B \-\-render\-expired\-frames
By default, to minimize latency, scrcpy always renders the last available decoded frame, and drops any previous ones. This flag forces to render all frames, at a cost of a possible increased latency.
.TP
.BI "\-\-rotation " value
Set the initial display rotation. Possibles values are 0, 1, 2 and 3. Each increment adds a 90 degrees rotation counterclockwise.
@ -185,12 +187,6 @@ Enable "show touches" on start, restore the initial value on exit.
It only shows physical touches (not clicks from scrcpy).
.TP
.BI "\-\-v4l2-sink " /dev/videoN
Output to v4l2loopback device.
It requires to lock the video orientation (see --lock-video-orientation).
.TP
.BI "\-V, \-\-verbosity " value
Set the log level ("debug", "info", "warn" or "error").

View File

@ -79,15 +79,12 @@ scrcpy_print_usage(const char *arg0) {
" This is a workaround for some devices not behaving as\n"
" expected when setting the device clipboard programmatically.\n"
"\n"
" --lock-video-orientation [value]\n"
" --lock-video-orientation value\n"
" Lock video orientation to value.\n"
" Possible values are \"unlocked\", \"initial\" (locked to the\n"
" initial orientation), 0, 1, 2 and 3.\n"
" Possible values are -1 (unlocked), 0, 1, 2 and 3.\n"
" Natural device orientation is 0, and each increment adds a\n"
" 90 degrees rotation counterclockwise.\n"
" Default is \"unlocked\".\n"
" Passing the option without argument is equivalent to passing\n"
" \"initial\".\n"
" Default is -1 (unlocked).\n"
"\n"
" --max-fps value\n"
" Limit the frame rate of screen capture (officially supported\n"
@ -146,6 +143,12 @@ scrcpy_print_usage(const char *arg0) {
" \"opengles2\", \"opengles\", \"metal\" and \"software\".\n"
" <https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER>\n"
"\n"
" --render-expired-frames\n"
" By default, to minimize latency, scrcpy always renders the\n"
" last available decoded frame, and drops any previous ones.\n"
" This flag forces to render all frames, at a cost of a\n"
" possible increased latency.\n"
"\n"
" --rotation value\n"
" Set the initial display rotation.\n"
" Possibles values are 0, 1, 2 and 3. Each increment adds a 90\n"
@ -176,13 +179,6 @@ scrcpy_print_usage(const char *arg0) {
" on exit.\n"
" It only shows physical touches (not clicks from scrcpy).\n"
"\n"
#ifdef HAVE_V4L2
" --v4l2-sink /dev/videoN\n"
" Output to v4l2loopback device.\n"
" It requires to lock the video orientation (see\n"
" --lock-video-orientation).\n"
"\n"
#endif
" -V, --verbosity value\n"
" Set the log level (debug, info, warn or error).\n"
#ifndef NDEBUG
@ -393,27 +389,15 @@ parse_max_fps(const char *s, uint16_t *max_fps) {
}
static bool
parse_lock_video_orientation(const char *s,
enum sc_lock_video_orientation *lock_mode) {
if (!s || !strcmp(s, "initial")) {
// Without argument, lock the initial orientation
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_INITIAL;
return true;
}
if (!strcmp(s, "unlocked")) {
*lock_mode = SC_LOCK_VIDEO_ORIENTATION_UNLOCKED;
return true;
}
parse_lock_video_orientation(const char *s, int8_t *lock_video_orientation) {
long value;
bool ok = parse_integer_arg(s, &value, false, 0, 3,
bool ok = parse_integer_arg(s, &value, false, -1, 3,
"lock video orientation");
if (!ok) {
return false;
}
*lock_mode = (enum sc_lock_video_orientation) value;
*lock_video_orientation = (int8_t) value;
return true;
}
@ -683,7 +667,6 @@ guess_record_format(const char *filename) {
#define OPT_LEGACY_PASTE 1024
#define OPT_ENCODER_NAME 1025
#define OPT_POWER_OFF_ON_CLOSE 1026
#define OPT_V4L2_SINK 1027
bool
scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
@ -703,7 +686,7 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
{"fullscreen", no_argument, NULL, 'f'},
{"help", no_argument, NULL, 'h'},
{"legacy-paste", no_argument, NULL, OPT_LEGACY_PASTE},
{"lock-video-orientation", optional_argument, NULL,
{"lock-video-orientation", required_argument, NULL,
OPT_LOCK_VIDEO_ORIENTATION},
{"max-fps", required_argument, NULL, OPT_MAX_FPS},
{"max-size", required_argument, NULL, 'm'},
@ -725,9 +708,6 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
{"show-touches", no_argument, NULL, 't'},
{"stay-awake", no_argument, NULL, 'w'},
{"turn-screen-off", no_argument, NULL, 'S'},
#ifdef HAVE_V4L2
{"v4l2-sink", required_argument, NULL, OPT_V4L2_SINK},
#endif
{"verbosity", required_argument, NULL, 'V'},
{"version", no_argument, NULL, 'v'},
{"window-title", required_argument, NULL, OPT_WINDOW_TITLE},
@ -791,8 +771,7 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
}
break;
case OPT_LOCK_VIDEO_ORIENTATION:
if (!parse_lock_video_orientation(optarg,
&opts->lock_video_orientation)) {
if (!parse_lock_video_orientation(optarg, &opts->lock_video_orientation)) {
return false;
}
break;
@ -837,8 +816,7 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
opts->stay_awake = true;
break;
case OPT_RENDER_EXPIRED_FRAMES:
LOGW("Option --render-expired-frames has been removed. This "
"flag has been ignored.");
opts->render_expired_frames = true;
break;
case OPT_WINDOW_TITLE:
opts->window_title = optarg;
@ -912,36 +890,16 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
case OPT_POWER_OFF_ON_CLOSE:
opts->power_off_on_close = true;
break;
#ifdef HAVE_V4L2
case OPT_V4L2_SINK:
opts->v4l2_device = optarg;
break;
#endif
default:
// getopt prints the error message on stderr
return false;
}
}
#ifdef HAVE_V4L2
if (!opts->display && !opts->record_filename && !opts->v4l2_device) {
LOGE("-N/--no-display requires either screen recording (-r/--record)"
" or sink to v4l2loopback device (--v4l2-sink)");
return false;
}
if (opts->v4l2_device && opts->lock_video_orientation
== SC_LOCK_VIDEO_ORIENTATION_UNLOCKED) {
LOGI("Video orientation is locked for v4l2 sink. "
"See --lock-video-orientation.");
opts->lock_video_orientation = SC_LOCK_VIDEO_ORIENTATION_INITIAL;
}
#else
if (!opts->display && !opts->record_filename) {
LOGE("-N/--no-display requires screen recording (-r/--record)");
return false;
}
#endif
int index = optind;
if (index < argc) {

View File

@ -8,7 +8,4 @@
#define MIN(X,Y) (X) < (Y) ? (X) : (Y)
#define MAX(X,Y) (X) > (Y) ? (X) : (Y)
#define container_of(ptr, type, member) \
((type *) (((char *) (ptr)) - offsetof(type, member)))
#endif
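The `container_of` macro in this hunk is what the `DOWNCAST` helpers elsewhere in this diff (decoder, recorder, screen, v4l2 sink) rely on: a trait struct is embedded as a member of the component, and `container_of` recovers the enclosing component from a pointer to that member. A minimal, self-contained sketch of the pattern; the `demo_*` names are illustrative, not scrcpy identifiers.

```c
#include <assert.h>
#include <stddef.h>

#define container_of(ptr, type, member) \
    ((type *) (((char *) (ptr)) - offsetof(type, member)))

/* The "trait": callers only know about this struct and its ops table. */
struct demo_sink {
    const struct demo_sink_ops *ops;
};

struct demo_sink_ops {
    void (*push)(struct demo_sink *sink, int value);
};

/* A component embedding the trait as a member. */
struct demo_component {
    struct demo_sink sink; /* trait member */
    int last_value;
};

static void
demo_component_push(struct demo_sink *sink, int value) {
    /* Downcast: recover the enclosing component from the trait pointer. */
    struct demo_component *c = container_of(sink, struct demo_component, sink);
    c->last_value = value;
}

int main(void) {
    static const struct demo_sink_ops ops = {.push = demo_component_push};
    struct demo_component c = {.sink = {.ops = &ops}, .last_value = 0};

    /* A producer sees only the trait, never the concrete component. */
    struct demo_sink *sink = &c.sink;
    sink->ops->push(sink, 42);

    assert(c.last_value == 42);
    return 0;
}
```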

View File

@ -8,9 +8,20 @@
# define _DARWIN_C_SOURCE
#endif
#include <libavcodec/version.h>
#include <libavformat/version.h>
#include <SDL2/SDL_version.h>
// In ffmpeg/doc/APIchanges:
// 2016-04-11 - 6f69f7a / 9200514 - lavf 57.33.100 / 57.5.0 - avformat.h
// Add AVStream.codecpar, deprecate AVStream.codec.
#if (LIBAVFORMAT_VERSION_MICRO >= 100 /* FFmpeg */ && \
LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 33, 100)) \
|| (LIBAVFORMAT_VERSION_MICRO < 100 && /* Libav */ \
LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 5, 0))
# define SCRCPY_LAVF_HAS_NEW_CODEC_PARAMS_API
#endif
// In ffmpeg/doc/APIchanges:
// 2018-02-06 - 0694d87024 - lavf 58.9.100 - avformat.h
// Deprecate use of av_register_input_format(), av_register_output_format(),
@ -22,6 +33,15 @@
# define SCRCPY_LAVF_REQUIRES_REGISTER_ALL
#endif
// In ffmpeg/doc/APIchanges:
// 2016-04-21 - 7fc329e - lavc 57.37.100 - avcodec.h
// Add a new audio/video encoding and decoding API with decoupled input
// and output -- avcodec_send_packet(), avcodec_receive_frame(),
// avcodec_send_frame() and avcodec_receive_packet().
#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(57, 37, 100)
# define SCRCPY_LAVF_HAS_NEW_ENCODING_DECODING_API
#endif
#if SDL_VERSION_ATLEAST(2, 0, 5)
// <https://wiki.libsdl.org/SDL_HINT_MOUSE_FOCUS_CLICKTHROUGH>
# define SCRCPY_SDL_HAS_HINT_MOUSE_FOCUS_CLICKTHROUGH

View File

@ -4,40 +4,14 @@
#include "events.h"
#include "video_buffer.h"
#include "trait/frame_sink.h"
#include "util/log.h"
/** Downcast packet_sink to decoder */
#define DOWNCAST(SINK) container_of(SINK, struct decoder, packet_sink)
static void
decoder_close_first_sinks(struct decoder *decoder, unsigned count) {
while (count) {
struct sc_frame_sink *sink = decoder->sinks[--count];
sink->ops->close(sink);
}
void
decoder_init(struct decoder *decoder, struct video_buffer *vb) {
decoder->video_buffer = vb;
}
static inline void
decoder_close_sinks(struct decoder *decoder) {
decoder_close_first_sinks(decoder, decoder->sink_count);
}
static bool
decoder_open_sinks(struct decoder *decoder) {
for (unsigned i = 0; i < decoder->sink_count; ++i) {
struct sc_frame_sink *sink = decoder->sinks[i];
if (!sink->ops->open(sink)) {
LOGE("Could not open frame sink %d", i);
decoder_close_first_sinks(decoder, i);
return false;
}
}
return true;
}
static bool
bool
decoder_open(struct decoder *decoder, const AVCodec *codec) {
decoder->codec_ctx = avcodec_alloc_context3(codec);
if (!decoder->codec_ctx) {
@ -51,108 +25,52 @@ decoder_open(struct decoder *decoder, const AVCodec *codec) {
return false;
}
decoder->frame = av_frame_alloc();
if (!decoder->frame) {
LOGE("Could not create decoder frame");
avcodec_close(decoder->codec_ctx);
avcodec_free_context(&decoder->codec_ctx);
return false;
}
if (!decoder_open_sinks(decoder)) {
LOGE("Could not open decoder sinks");
av_frame_free(&decoder->frame);
avcodec_close(decoder->codec_ctx);
avcodec_free_context(&decoder->codec_ctx);
return false;
}
return true;
}
static void
void
decoder_close(struct decoder *decoder) {
decoder_close_sinks(decoder);
av_frame_free(&decoder->frame);
avcodec_close(decoder->codec_ctx);
avcodec_free_context(&decoder->codec_ctx);
}
static bool
push_frame_to_sinks(struct decoder *decoder, const AVFrame *frame) {
for (unsigned i = 0; i < decoder->sink_count; ++i) {
struct sc_frame_sink *sink = decoder->sinks[i];
if (!sink->ops->push(sink, frame)) {
LOGE("Could not send frame to sink %d", i);
return false;
}
}
return true;
}
static bool
bool
decoder_push(struct decoder *decoder, const AVPacket *packet) {
bool is_config = packet->pts == AV_NOPTS_VALUE;
if (is_config) {
// nothing to do
return true;
}
int ret = avcodec_send_packet(decoder->codec_ctx, packet);
if (ret < 0 && ret != AVERROR(EAGAIN)) {
// the new decoding/encoding API has been introduced by:
// <http://git.videolan.org/?p=ffmpeg.git;a=commitdiff;h=7fc329e2dd6226dfecaa4a1d7adf353bf2773726>
#ifdef SCRCPY_LAVF_HAS_NEW_ENCODING_DECODING_API
int ret;
if ((ret = avcodec_send_packet(decoder->codec_ctx, packet)) < 0) {
LOGE("Could not send video packet: %d", ret);
return false;
}
ret = avcodec_receive_frame(decoder->codec_ctx, decoder->frame);
ret = avcodec_receive_frame(decoder->codec_ctx,
decoder->video_buffer->producer_frame);
if (!ret) {
// a frame was received
bool ok = push_frame_to_sinks(decoder, decoder->frame);
// A frame lost should not make the whole pipeline fail. The error, if
// any, is already logged.
(void) ok;
av_frame_unref(decoder->frame);
video_buffer_producer_offer_frame(decoder->video_buffer);
} else if (ret != AVERROR(EAGAIN)) {
LOGE("Could not receive video frame: %d", ret);
return false;
}
#else
int got_picture;
int len = avcodec_decode_video2(decoder->codec_ctx,
decoder->video_buffer->producer_frame,
&got_picture,
packet);
if (len < 0) {
LOGE("Could not decode video packet: %d", len);
return false;
}
if (got_picture) {
video_buffer_producer_offer_frame(decoder->video_buffer);
}
#endif
return true;
}
static bool
decoder_packet_sink_open(struct sc_packet_sink *sink, const AVCodec *codec) {
struct decoder *decoder = DOWNCAST(sink);
return decoder_open(decoder, codec);
}
static void
decoder_packet_sink_close(struct sc_packet_sink *sink) {
struct decoder *decoder = DOWNCAST(sink);
decoder_close(decoder);
}
static bool
decoder_packet_sink_push(struct sc_packet_sink *sink, const AVPacket *packet) {
struct decoder *decoder = DOWNCAST(sink);
return decoder_push(decoder, packet);
}
void
decoder_init(struct decoder *decoder) {
static const struct sc_packet_sink_ops ops = {
.open = decoder_packet_sink_open,
.close = decoder_packet_sink_close,
.push = decoder_packet_sink_push,
};
decoder->packet_sink.ops = &ops;
}
void
decoder_add_sink(struct decoder *decoder, struct sc_frame_sink *sink) {
assert(decoder->sink_count < DECODER_MAX_SINKS);
assert(sink);
assert(sink->ops);
decoder->sinks[decoder->sink_count++] = sink;
decoder_interrupt(struct decoder *decoder) {
video_buffer_interrupt(decoder->video_buffer);
}

View File

@ -3,27 +3,30 @@
#include "common.h"
#include "trait/packet_sink.h"
#include <stdbool.h>
#include <libavformat/avformat.h>
#define DECODER_MAX_SINKS 2
struct video_buffer;
struct decoder {
struct sc_packet_sink packet_sink; // packet sink trait
struct sc_frame_sink *sinks[DECODER_MAX_SINKS];
unsigned sink_count;
struct video_buffer *video_buffer;
AVCodecContext *codec_ctx;
AVFrame *frame;
};
void
decoder_init(struct decoder *decoder);
decoder_init(struct decoder *decoder, struct video_buffer *vb);
bool
decoder_open(struct decoder *decoder, const AVCodec *codec);
void
decoder_add_sink(struct decoder *decoder, struct sc_frame_sink *sink);
decoder_close(struct decoder *decoder);
bool
decoder_push(struct decoder *decoder, const AVPacket *packet);
void
decoder_interrupt(struct decoder *decoder);
#endif

View File

@ -6,9 +6,6 @@
#include <stdbool.h>
#include <unistd.h>
#include <libavformat/avformat.h>
#ifdef HAVE_V4L2
# include <libavdevice/avdevice.h>
#endif
#define SDL_MAIN_HANDLED // avoid link error on Linux Windows Subsystem
#include <SDL2/SDL.h>
@ -31,11 +28,6 @@ print_version(void) {
fprintf(stderr, " - libavutil %d.%d.%d\n", LIBAVUTIL_VERSION_MAJOR,
LIBAVUTIL_VERSION_MINOR,
LIBAVUTIL_VERSION_MICRO);
#ifdef HAVE_V4L2
fprintf(stderr, " - libavdevice %d.%d.%d\n", LIBAVDEVICE_VERSION_MAJOR,
LIBAVDEVICE_VERSION_MINOR,
LIBAVDEVICE_VERSION_MICRO);
#endif
}
static SDL_LogPriority
@ -98,12 +90,6 @@ main(int argc, char *argv[]) {
av_register_all();
#endif
#ifdef HAVE_V4L2
if (args.opts.v4l2_device) {
avdevice_register_all();
}
#endif
if (avformat_network_init()) {
return 1;
}

View File

@ -4,10 +4,6 @@
#include <libavutil/time.h>
#include "util/log.h"
#include "util/str_util.h"
/** Downcast packet_sink to recorder */
#define DOWNCAST(SINK) container_of(SINK, struct recorder, packet_sink)
static const AVRational SCRCPY_TIME_BASE = {1, 1000000}; // timestamps in us
@ -23,8 +19,8 @@ find_muxer(const char *name) {
#else
oformat = av_oformat_next(oformat);
#endif
// until null or containing the requested name
} while (oformat && !strlist_contains(oformat->name, ',', name));
// until null or with name "mp4"
} while (oformat && strcmp(oformat->name, name));
return oformat;
}
@ -61,6 +57,50 @@ recorder_queue_clear(struct recorder_queue *queue) {
}
}
bool
recorder_init(struct recorder *recorder,
const char *filename,
enum sc_record_format format,
struct size declared_frame_size) {
recorder->filename = strdup(filename);
if (!recorder->filename) {
LOGE("Could not strdup filename");
return false;
}
bool ok = sc_mutex_init(&recorder->mutex);
if (!ok) {
LOGC("Could not create mutex");
free(recorder->filename);
return false;
}
ok = sc_cond_init(&recorder->queue_cond);
if (!ok) {
LOGC("Could not create cond");
sc_mutex_destroy(&recorder->mutex);
free(recorder->filename);
return false;
}
queue_init(&recorder->queue);
recorder->stopped = false;
recorder->failed = false;
recorder->format = format;
recorder->declared_frame_size = declared_frame_size;
recorder->header_written = false;
recorder->previous = NULL;
return true;
}
void
recorder_destroy(struct recorder *recorder) {
sc_cond_destroy(&recorder->queue_cond);
sc_mutex_destroy(&recorder->mutex);
free(recorder->filename);
}
static const char *
recorder_get_format_name(enum sc_record_format format) {
switch (format) {
@ -70,6 +110,88 @@ recorder_get_format_name(enum sc_record_format format) {
}
}
bool
recorder_open(struct recorder *recorder, const AVCodec *input_codec) {
const char *format_name = recorder_get_format_name(recorder->format);
assert(format_name);
const AVOutputFormat *format = find_muxer(format_name);
if (!format) {
LOGE("Could not find muxer");
return false;
}
recorder->ctx = avformat_alloc_context();
if (!recorder->ctx) {
LOGE("Could not allocate output context");
return false;
}
// contrary to the deprecated API (av_oformat_next()), av_muxer_iterate()
// returns (on purpose) a pointer-to-const, but AVFormatContext.oformat
// still expects a pointer-to-non-const (it has not be updated accordingly)
// <https://github.com/FFmpeg/FFmpeg/commit/0694d8702421e7aff1340038559c438b61bb30dd>
recorder->ctx->oformat = (AVOutputFormat *) format;
av_dict_set(&recorder->ctx->metadata, "comment",
"Recorded by scrcpy " SCRCPY_VERSION, 0);
AVStream *ostream = avformat_new_stream(recorder->ctx, input_codec);
if (!ostream) {
avformat_free_context(recorder->ctx);
return false;
}
#ifdef SCRCPY_LAVF_HAS_NEW_CODEC_PARAMS_API
ostream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
ostream->codecpar->codec_id = input_codec->id;
ostream->codecpar->format = AV_PIX_FMT_YUV420P;
ostream->codecpar->width = recorder->declared_frame_size.width;
ostream->codecpar->height = recorder->declared_frame_size.height;
#else
ostream->codec->codec_type = AVMEDIA_TYPE_VIDEO;
ostream->codec->codec_id = input_codec->id;
ostream->codec->pix_fmt = AV_PIX_FMT_YUV420P;
ostream->codec->width = recorder->declared_frame_size.width;
ostream->codec->height = recorder->declared_frame_size.height;
#endif
int ret = avio_open(&recorder->ctx->pb, recorder->filename,
AVIO_FLAG_WRITE);
if (ret < 0) {
LOGE("Failed to open output file: %s", recorder->filename);
// ostream will be cleaned up during context cleaning
avformat_free_context(recorder->ctx);
return false;
}
LOGI("Recording started to %s file: %s", format_name, recorder->filename);
return true;
}
void
recorder_close(struct recorder *recorder) {
if (recorder->header_written) {
int ret = av_write_trailer(recorder->ctx);
if (ret < 0) {
LOGE("Failed to write trailer to %s", recorder->filename);
recorder->failed = true;
}
} else {
// the recorded file is empty
recorder->failed = true;
}
avio_close(recorder->ctx->pb);
avformat_free_context(recorder->ctx);
if (recorder->failed) {
LOGE("Recording failed to %s", recorder->filename);
} else {
const char *format_name = recorder_get_format_name(recorder->format);
LOGI("Recording complete to %s file: %s", format_name, recorder->filename);
}
}
static bool
recorder_write_header(struct recorder *recorder, const AVPacket *packet) {
AVStream *ostream = recorder->ctx->streams[0];
@ -83,8 +205,13 @@ recorder_write_header(struct recorder *recorder, const AVPacket *packet) {
// copy the first packet to the extra data
memcpy(extradata, packet->data, packet->size);
#ifdef SCRCPY_LAVF_HAS_NEW_CODEC_PARAMS_API
ostream->codecpar->extradata = extradata;
ostream->codecpar->extradata_size = packet->size;
#else
ostream->codec->extradata = extradata;
ostream->codec->extradata_size = packet->size;
#endif
int ret = avformat_write_header(recorder->ctx, NULL);
if (ret < 0) {
@ -190,26 +317,7 @@ run_recorder(void *data) {
sc_mutex_unlock(&recorder->mutex);
break;
}
}
if (!recorder->failed) {
if (recorder->header_written) {
int ret = av_write_trailer(recorder->ctx);
if (ret < 0) {
LOGE("Failed to write trailer to %s", recorder->filename);
recorder->failed = true;
}
} else {
// the recorded file is empty
recorder->failed = true;
}
}
if (recorder->failed) {
LOGE("Recording failed to %s", recorder->filename);
} else {
const char *format_name = recorder_get_format_name(recorder->format);
LOGI("Recording complete to %s file: %s", format_name, recorder->filename);
}
LOGD("Recorder thread ended");
@ -217,108 +325,34 @@ run_recorder(void *data) {
return 0;
}
static bool
recorder_open(struct recorder *recorder, const AVCodec *input_codec) {
bool ok = sc_mutex_init(&recorder->mutex);
if (!ok) {
LOGC("Could not create mutex");
return false;
}
ok = sc_cond_init(&recorder->queue_cond);
if (!ok) {
LOGC("Could not create cond");
goto error_mutex_destroy;
}
queue_init(&recorder->queue);
recorder->stopped = false;
recorder->failed = false;
recorder->header_written = false;
recorder->previous = NULL;
const char *format_name = recorder_get_format_name(recorder->format);
assert(format_name);
const AVOutputFormat *format = find_muxer(format_name);
if (!format) {
LOGE("Could not find muxer");
goto error_cond_destroy;
}
recorder->ctx = avformat_alloc_context();
if (!recorder->ctx) {
LOGE("Could not allocate output context");
goto error_cond_destroy;
}
// contrary to the deprecated API (av_oformat_next()), av_muxer_iterate()
// returns (on purpose) a pointer-to-const, but AVFormatContext.oformat
// still expects a pointer-to-non-const (it has not be updated accordingly)
// <https://github.com/FFmpeg/FFmpeg/commit/0694d8702421e7aff1340038559c438b61bb30dd>
recorder->ctx->oformat = (AVOutputFormat *) format;
av_dict_set(&recorder->ctx->metadata, "comment",
"Recorded by scrcpy " SCRCPY_VERSION, 0);
AVStream *ostream = avformat_new_stream(recorder->ctx, input_codec);
if (!ostream) {
goto error_avformat_free_context;
}
ostream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
ostream->codecpar->codec_id = input_codec->id;
ostream->codecpar->format = AV_PIX_FMT_YUV420P;
ostream->codecpar->width = recorder->declared_frame_size.width;
ostream->codecpar->height = recorder->declared_frame_size.height;
int ret = avio_open(&recorder->ctx->pb, recorder->filename,
AVIO_FLAG_WRITE);
if (ret < 0) {
LOGE("Failed to open output file: %s", recorder->filename);
// ostream will be cleaned up during context cleaning
goto error_avformat_free_context;
}
bool
recorder_start(struct recorder *recorder) {
LOGD("Starting recorder thread");
ok = sc_thread_create(&recorder->thread, run_recorder, "recorder",
bool ok = sc_thread_create(&recorder->thread, run_recorder, "recorder",
recorder);
if (!ok) {
LOGC("Could not start recorder thread");
goto error_avio_close;
return false;
}
LOGI("Recording started to %s file: %s", format_name, recorder->filename);
return true;
error_avio_close:
avio_close(recorder->ctx->pb);
error_avformat_free_context:
avformat_free_context(recorder->ctx);
error_cond_destroy:
sc_cond_destroy(&recorder->queue_cond);
error_mutex_destroy:
sc_mutex_destroy(&recorder->mutex);
return false;
}
static void
recorder_close(struct recorder *recorder) {
void
recorder_stop(struct recorder *recorder) {
sc_mutex_lock(&recorder->mutex);
recorder->stopped = true;
sc_cond_signal(&recorder->queue_cond);
sc_mutex_unlock(&recorder->mutex);
sc_thread_join(&recorder->thread, NULL);
avio_close(recorder->ctx->pb);
avformat_free_context(recorder->ctx);
sc_cond_destroy(&recorder->queue_cond);
sc_mutex_destroy(&recorder->mutex);
}
static bool
void
recorder_join(struct recorder *recorder) {
sc_thread_join(&recorder->thread, NULL);
}
bool
recorder_push(struct recorder *recorder, const AVPacket *packet) {
sc_mutex_lock(&recorder->mutex);
assert(!recorder->stopped);
@ -342,51 +376,3 @@ recorder_push(struct recorder *recorder, const AVPacket *packet) {
sc_mutex_unlock(&recorder->mutex);
return true;
}
static bool
recorder_packet_sink_open(struct sc_packet_sink *sink, const AVCodec *codec) {
struct recorder *recorder = DOWNCAST(sink);
return recorder_open(recorder, codec);
}
static void
recorder_packet_sink_close(struct sc_packet_sink *sink) {
struct recorder *recorder = DOWNCAST(sink);
recorder_close(recorder);
}
static bool
recorder_packet_sink_push(struct sc_packet_sink *sink, const AVPacket *packet) {
struct recorder *recorder = DOWNCAST(sink);
return recorder_push(recorder, packet);
}
bool
recorder_init(struct recorder *recorder,
const char *filename,
enum sc_record_format format,
struct size declared_frame_size) {
recorder->filename = strdup(filename);
if (!recorder->filename) {
LOGE("Could not strdup filename");
return false;
}
recorder->format = format;
recorder->declared_frame_size = declared_frame_size;
static const struct sc_packet_sink_ops ops = {
.open = recorder_packet_sink_open,
.close = recorder_packet_sink_close,
.push = recorder_packet_sink_push,
};
recorder->packet_sink.ops = &ops;
return true;
}
void
recorder_destroy(struct recorder *recorder) {
free(recorder->filename);
}

View File

@ -8,7 +8,6 @@
#include "coords.h"
#include "scrcpy.h"
#include "trait/packet_sink.h"
#include "util/queue.h"
#include "util/thread.h"
@ -20,8 +19,6 @@ struct record_packet {
struct recorder_queue QUEUE(struct record_packet);
struct recorder {
struct sc_packet_sink packet_sink; // packet sink trait
char *filename;
enum sc_record_format format;
AVFormatContext *ctx;
@ -31,7 +28,7 @@ struct recorder {
sc_thread thread;
sc_mutex mutex;
sc_cond queue_cond;
bool stopped; // set on recorder_close()
bool stopped; // set on recorder_stop() by the stream reader
bool failed; // set on packet write failure
struct recorder_queue queue;
@ -49,4 +46,22 @@ recorder_init(struct recorder *recorder, const char *filename,
void
recorder_destroy(struct recorder *recorder);
bool
recorder_open(struct recorder *recorder, const AVCodec *input_codec);
void
recorder_close(struct recorder *recorder);
bool
recorder_start(struct recorder *recorder);
void
recorder_stop(struct recorder *recorder);
void
recorder_join(struct recorder *recorder);
bool
recorder_push(struct recorder *recorder, const AVPacket *packet);
#endif

View File

@ -25,21 +25,17 @@
#include "server.h"
#include "stream.h"
#include "tiny_xpm.h"
#include "video_buffer.h"
#include "util/log.h"
#include "util/net.h"
#ifdef HAVE_V4L2
# include "v4l2_sink.h"
#endif
static struct server server;
static struct screen screen;
static struct fps_counter fps_counter;
static struct video_buffer video_buffer;
static struct stream stream;
static struct decoder decoder;
static struct recorder recorder;
#ifdef HAVE_V4L2
static struct sc_v4l2_sink v4l2_sink;
#endif
static struct controller controller;
static struct file_handler file_handler;
@ -251,11 +247,9 @@ scrcpy(const struct scrcpy_options *options) {
bool server_started = false;
bool fps_counter_initialized = false;
bool video_buffer_initialized = false;
bool file_handler_initialized = false;
bool recorder_initialized = false;
#ifdef HAVE_V4L2
bool v4l2_sink_initialized = false;
#endif
bool stream_started = false;
bool controller_initialized = false;
bool controller_started = false;
@ -304,12 +298,18 @@ scrcpy(const struct scrcpy_options *options) {
goto end;
}
struct decoder *dec = NULL;
if (options->display) {
if (!fps_counter_init(&fps_counter)) {
goto end;
}
fps_counter_initialized = true;
if (!video_buffer_init(&video_buffer, options->render_expired_frames)) {
goto end;
}
video_buffer_initialized = true;
if (options->control) {
if (!file_handler_init(&file_handler, server.serial,
options->push_target)) {
@ -317,15 +317,8 @@ scrcpy(const struct scrcpy_options *options) {
}
file_handler_initialized = true;
}
}
struct decoder *dec = NULL;
bool needs_decoder = options->display;
#ifdef HAVE_V4L2
needs_decoder |= !!options->v4l2_device;
#endif
if (needs_decoder) {
decoder_init(&decoder);
decoder_init(&decoder, &video_buffer);
dec = &decoder;
}
@ -343,15 +336,7 @@ scrcpy(const struct scrcpy_options *options) {
av_log_set_callback(av_log_callback);
stream_init(&stream, server.video_socket);
if (dec) {
stream_add_sink(&stream, &dec->packet_sink);
}
if (rec) {
stream_add_sink(&stream, &rec->packet_sink);
}
stream_init(&stream, server.video_socket, dec, rec);
if (options->display) {
if (options->control) {
@ -383,13 +368,12 @@ scrcpy(const struct scrcpy_options *options) {
.fullscreen = options->fullscreen,
};
if (!screen_init(&screen, &fps_counter, &screen_params)) {
if (!screen_init(&screen, &video_buffer, &fps_counter,
&screen_params)) {
goto end;
}
screen_initialized = true;
decoder_add_sink(&decoder, &screen.frame_sink);
if (options->turn_screen_off) {
struct control_msg msg;
msg.type = CONTROL_MSG_TYPE_SET_SCREEN_POWER_MODE;
@ -401,18 +385,6 @@ scrcpy(const struct scrcpy_options *options) {
}
}
#ifdef HAVE_V4L2
if (options->v4l2_device) {
if (!sc_v4l2_sink_init(&v4l2_sink, options->v4l2_device, frame_size)) {
goto end;
}
decoder_add_sink(&decoder, &v4l2_sink.frame_sink);
v4l2_sink_initialized = true;
}
#endif
// now we consumed the header values, the socket receives the video stream
// start the stream
if (!stream_start(&stream)) {
@ -425,13 +397,12 @@ scrcpy(const struct scrcpy_options *options) {
ret = event_loop(options);
LOGD("quit...");
// Close the window immediately on closing, because screen_destroy() may
// only be called once the stream thread is joined (it may take time)
screen_hide_window(&screen);
end:
// The stream is not stopped explicitly, because it will stop by itself on
// end-of-stream
// stop stream and controller so that they don't continue once their socket
// is shutdown
if (stream_started) {
stream_stop(&stream);
}
if (controller_started) {
controller_stop(&controller);
}
@ -453,12 +424,6 @@ end:
stream_join(&stream);
}
#ifdef HAVE_V4L2
if (v4l2_sink_initialized) {
sc_v4l2_sink_destroy(&v4l2_sink);
}
#endif
// Destroy the screen only after the stream is guaranteed to be finished,
// because otherwise the screen could receive new frames after destruction
if (screen_initialized) {
@ -481,6 +446,10 @@ end:
file_handler_destroy(&file_handler);
}
if (video_buffer_initialized) {
video_buffer_destroy(&video_buffer);
}
if (fps_counter_initialized) {
fps_counter_join(&fps_counter);
fps_counter_destroy(&fps_counter);

View File

@ -20,16 +20,6 @@ enum sc_record_format {
SC_RECORD_FORMAT_MKV,
};
enum sc_lock_video_orientation {
SC_LOCK_VIDEO_ORIENTATION_UNLOCKED = -1,
// lock the current orientation when scrcpy starts
SC_LOCK_VIDEO_ORIENTATION_INITIAL = -2,
SC_LOCK_VIDEO_ORIENTATION_0 = 0,
SC_LOCK_VIDEO_ORIENTATION_1,
SC_LOCK_VIDEO_ORIENTATION_2,
SC_LOCK_VIDEO_ORIENTATION_3,
};
#define SC_MAX_SHORTCUT_MODS 8
enum sc_shortcut_mod {
@ -62,7 +52,6 @@ struct scrcpy_options {
const char *render_driver;
const char *codec_options;
const char *encoder_name;
const char *v4l2_device;
enum sc_log_level log_level;
enum sc_record_format record_format;
struct sc_port_range port_range;
@ -70,7 +59,7 @@ struct scrcpy_options {
uint16_t max_size;
uint32_t bit_rate;
uint16_t max_fps;
enum sc_lock_video_orientation lock_video_orientation;
int8_t lock_video_orientation;
uint8_t rotation;
int16_t window_x; // SC_WINDOW_POSITION_UNDEFINED for "auto"
int16_t window_y; // SC_WINDOW_POSITION_UNDEFINED for "auto"
@ -83,6 +72,7 @@ struct scrcpy_options {
bool control;
bool display;
bool turn_screen_off;
bool render_expired_frames;
bool prefer_text;
bool window_borderless;
bool mipmaps;
@ -104,7 +94,6 @@ struct scrcpy_options {
.render_driver = NULL, \
.codec_options = NULL, \
.encoder_name = NULL, \
.v4l2_device = NULL, \
.log_level = SC_LOG_LEVEL_INFO, \
.record_format = SC_RECORD_FORMAT_AUTO, \
.port_range = { \
@ -118,7 +107,7 @@ struct scrcpy_options {
.max_size = 0, \
.bit_rate = DEFAULT_BIT_RATE, \
.max_fps = 0, \
.lock_video_orientation = SC_LOCK_VIDEO_ORIENTATION_UNLOCKED, \
.lock_video_orientation = -1, \
.rotation = 0, \
.window_x = SC_WINDOW_POSITION_UNDEFINED, \
.window_y = SC_WINDOW_POSITION_UNDEFINED, \
@ -131,6 +120,7 @@ struct scrcpy_options {
.control = true, \
.display = true, \
.turn_screen_off = false, \
.render_expired_frames = false, \
.prefer_text = false, \
.window_borderless = false, \
.mipmaps = true, \

View File

@ -13,8 +13,6 @@
#define DISPLAY_MARGINS 96
#define DOWNCAST(SINK) container_of(SINK, struct screen, frame_sink)
static inline struct size
get_rotated_size(struct size size, int rotation) {
struct size rotated_size;
@ -193,6 +191,27 @@ screen_update_content_rect(struct screen *screen) {
}
}
static void
on_frame_available(struct video_buffer *vb, void *userdata) {
(void) vb;
(void) userdata;
static SDL_Event new_frame_event = {
.type = EVENT_NEW_FRAME,
};
// Post the event on the UI thread
SDL_PushEvent(&new_frame_event);
}
static void
on_frame_skipped(struct video_buffer *vb, void *userdata) {
(void) vb;
struct screen *screen = userdata;
fps_counter_add_skipped_frame(screen->fps_counter);
}
static inline SDL_Texture *
create_texture(struct screen *screen) {
SDL_Renderer *renderer = screen->renderer;
@ -243,58 +262,11 @@ event_watcher(void *data, SDL_Event *event) {
}
#endif
static bool
screen_frame_sink_open(struct sc_frame_sink *sink) {
struct screen *screen = DOWNCAST(sink);
(void) screen;
#ifndef NDEBUG
screen->open = true;
#endif
// nothing to do, the screen is already open on the main thread
return true;
}
static void
screen_frame_sink_close(struct sc_frame_sink *sink) {
struct screen *screen = DOWNCAST(sink);
(void) screen;
#ifndef NDEBUG
screen->open = false;
#endif
// nothing to do, the screen lifecycle is not managed by the frame producer
}
static bool
screen_frame_sink_push(struct sc_frame_sink *sink, const AVFrame *frame) {
struct screen *screen = DOWNCAST(sink);
bool previous_frame_skipped;
bool ok = video_buffer_push(&screen->vb, frame, &previous_frame_skipped);
if (!ok) {
return false;
}
if (previous_frame_skipped) {
fps_counter_add_skipped_frame(screen->fps_counter);
// The EVENT_NEW_FRAME triggered for the previous frame will consume
// this new frame instead
} else {
static SDL_Event new_frame_event = {
.type = EVENT_NEW_FRAME,
};
// Post the event on the UI thread
SDL_PushEvent(&new_frame_event);
}
return true;
}
bool
screen_init(struct screen *screen, struct fps_counter *fps_counter,
screen_init(struct screen *screen, struct video_buffer *vb,
struct fps_counter *fps_counter,
const struct screen_params *params) {
screen->vb = vb;
screen->fps_counter = fps_counter;
screen->resize_pending = false;
@ -302,11 +274,11 @@ screen_init(struct screen *screen, struct fps_counter *fps_counter,
screen->fullscreen = false;
screen->maximized = false;
bool ok = video_buffer_init(&screen->vb);
if (!ok) {
LOGE("Could not initialize video buffer");
return false;
}
static const struct video_buffer_callbacks cbs = {
.on_frame_available = on_frame_available,
.on_frame_skipped = on_frame_skipped,
};
video_buffer_set_consumer_callbacks(vb, &cbs, screen);
screen->frame_size = params->frame_size;
screen->rotation = params->rotation;
@ -352,7 +324,6 @@ screen_init(struct screen *screen, struct fps_counter *fps_counter,
if (!screen->renderer) {
LOGC("Could not create renderer: %s", SDL_GetError());
SDL_DestroyWindow(screen->window);
video_buffer_destroy(&screen->vb);
return false;
}
@ -404,17 +375,6 @@ screen_init(struct screen *screen, struct fps_counter *fps_counter,
LOGC("Could not create texture: %s", SDL_GetError());
SDL_DestroyRenderer(screen->renderer);
SDL_DestroyWindow(screen->window);
video_buffer_destroy(&screen->vb);
return false;
}
screen->frame = av_frame_alloc();
if (!screen->frame) {
LOGC("Could not create screen frame");
SDL_DestroyTexture(screen->texture);
SDL_DestroyRenderer(screen->renderer);
SDL_DestroyWindow(screen->window);
video_buffer_destroy(&screen->vb);
return false;
}
@ -433,18 +393,6 @@ screen_init(struct screen *screen, struct fps_counter *fps_counter,
SDL_AddEventWatch(event_watcher, screen);
#endif
static const struct sc_frame_sink_ops ops = {
.open = screen_frame_sink_open,
.close = screen_frame_sink_close,
.push = screen_frame_sink_push,
};
screen->frame_sink.ops = &ops;
#ifndef NDEBUG
screen->open = false;
#endif
return true;
}
@ -453,21 +401,11 @@ screen_show_window(struct screen *screen) {
SDL_ShowWindow(screen->window);
}
void
screen_hide_window(struct screen *screen) {
SDL_HideWindow(screen->window);
}
void
screen_destroy(struct screen *screen) {
#ifndef NDEBUG
assert(!screen->open);
#endif
av_frame_free(&screen->frame);
SDL_DestroyTexture(screen->texture);
SDL_DestroyRenderer(screen->renderer);
SDL_DestroyWindow(screen->window);
video_buffer_destroy(&screen->vb);
}
static void
@ -572,9 +510,7 @@ update_texture(struct screen *screen, const AVFrame *frame) {
static bool
screen_update_frame(struct screen *screen) {
av_frame_unref(screen->frame);
video_buffer_consume(&screen->vb, screen->frame);
AVFrame *frame = screen->frame;
const AVFrame *frame = video_buffer_consumer_take_frame(screen->vb);
fps_counter_add_rendered_frame(screen->fps_counter);

View File

@ -9,17 +9,11 @@
#include "coords.h"
#include "opengl.h"
#include "trait/frame_sink.h"
#include "video_buffer.h"
struct video_buffer;
struct screen {
struct sc_frame_sink frame_sink; // frame sink trait
#ifndef NDEBUG
bool open; // track the open/close state to assert correct behavior
#endif
struct video_buffer vb;
struct video_buffer *vb;
struct fps_counter *fps_counter;
SDL_Window *window;
@ -42,8 +36,6 @@ struct screen {
bool fullscreen;
bool maximized;
bool mipmaps;
AVFrame *frame;
};
struct screen_params {
@ -66,20 +58,14 @@ struct screen_params {
// initialize screen, create window, renderer and texture (window is hidden)
bool
screen_init(struct screen *screen, struct fps_counter *fps_counter,
screen_init(struct screen *screen, struct video_buffer *vb,
struct fps_counter *fps_counter,
const struct screen_params *params);
// destroy window, renderer and texture (if any)
void
screen_destroy(struct screen *screen);
// hide the window
//
// It is used to hide the window immediately on closing without waiting for
// screen_destroy()
void
screen_hide_window(struct screen *screen);
// render the texture to the renderer
//
// Set the update_content_rect flag if the window or content size may have

View File

@ -58,7 +58,7 @@ get_server_path(void) {
LOGE("Could not get executable path, "
"using " SERVER_FILENAME " from current directory");
// not found, use current directory
return strdup(SERVER_FILENAME);
return SERVER_FILENAME;
}
char *dir = dirname(executable_path);
size_t dirlen = strlen(dir);
@ -70,7 +70,7 @@ get_server_path(void) {
LOGE("Could not alloc server path string, "
"using " SERVER_FILENAME " from current directory");
free(executable_path);
return strdup(SERVER_FILENAME);
return SERVER_FILENAME;
}
memcpy(server_path, dir, dirlen);

View File

@ -66,11 +66,25 @@ notify_stopped(void) {
}
static bool
push_packet_to_sinks(struct stream *stream, const AVPacket *packet) {
for (unsigned i = 0; i < stream->sink_count; ++i) {
struct sc_packet_sink *sink = stream->sinks[i];
if (!sink->ops->push(sink, packet)) {
LOGE("Could not send config packet to sink %d", i);
process_config_packet(struct stream *stream, AVPacket *packet) {
if (stream->recorder && !recorder_push(stream->recorder, packet)) {
LOGE("Could not send config packet to recorder");
return false;
}
return true;
}
static bool
process_frame(struct stream *stream, AVPacket *packet) {
if (stream->decoder && !decoder_push(stream->decoder, packet)) {
return false;
}
if (stream->recorder) {
packet->dts = packet->pts;
if (!recorder_push(stream->recorder, packet)) {
LOGE("Could not send packet to recorder");
return false;
}
}
@ -97,11 +111,9 @@ stream_parse(struct stream *stream, AVPacket *packet) {
packet->flags |= AV_PKT_FLAG_KEY;
}
packet->dts = packet->pts;
bool ok = push_packet_to_sinks(stream, packet);
bool ok = process_frame(stream, packet);
if (!ok) {
LOGE("Could not process packet");
LOGE("Could not process frame");
return false;
}
@ -144,7 +156,7 @@ stream_push_packet(struct stream *stream, AVPacket *packet) {
if (is_config) {
// config packet
bool ok = push_packet_to_sinks(stream, packet);
bool ok = process_config_packet(stream, packet);
if (!ok) {
return false;
}
@ -165,33 +177,6 @@ stream_push_packet(struct stream *stream, AVPacket *packet) {
return true;
}
static void
stream_close_first_sinks(struct stream *stream, unsigned count) {
while (count) {
struct sc_packet_sink *sink = stream->sinks[--count];
sink->ops->close(sink);
}
}
static inline void
stream_close_sinks(struct stream *stream) {
stream_close_first_sinks(stream, stream->sink_count);
}
static bool
stream_open_sinks(struct stream *stream, const AVCodec *codec) {
for (unsigned i = 0; i < stream->sink_count; ++i) {
struct sc_packet_sink *sink = stream->sinks[i];
if (!sink->ops->open(sink, codec)) {
LOGE("Could not open packet sink %d", i);
stream_close_first_sinks(stream, i);
return false;
}
}
return true;
}
static int
run_stream(void *data) {
struct stream *stream = data;
@ -208,15 +193,27 @@ run_stream(void *data) {
goto end;
}
if (!stream_open_sinks(stream, codec)) {
LOGE("Could not open stream sinks");
if (stream->decoder && !decoder_open(stream->decoder, codec)) {
LOGE("Could not open decoder");
goto finally_free_codec_ctx;
}
if (stream->recorder) {
if (!recorder_open(stream->recorder, codec)) {
LOGE("Could not open recorder");
goto finally_close_decoder;
}
if (!recorder_start(stream->recorder)) {
LOGE("Could not start recorder");
goto finally_close_recorder;
}
}
stream->parser = av_parser_init(AV_CODEC_ID_H264);
if (!stream->parser) {
LOGE("Could not initialize parser");
goto finally_close_sinks;
goto finally_stop_and_join_recorder;
}
// We must only pass complete frames to av_parser_parse2()!
@ -246,8 +243,20 @@ run_stream(void *data) {
}
av_parser_close(stream->parser);
finally_close_sinks:
stream_close_sinks(stream);
finally_stop_and_join_recorder:
if (stream->recorder) {
recorder_stop(stream->recorder);
LOGI("Finishing recording...");
recorder_join(stream->recorder);
}
finally_close_recorder:
if (stream->recorder) {
recorder_close(stream->recorder);
}
finally_close_decoder:
if (stream->decoder) {
decoder_close(stream->decoder);
}
finally_free_codec_ctx:
avcodec_free_context(&stream->codec_ctx);
end:
@ -256,18 +265,12 @@ end:
}
void
stream_init(struct stream *stream, socket_t socket) {
stream_init(struct stream *stream, socket_t socket,
struct decoder *decoder, struct recorder *recorder) {
stream->socket = socket;
stream->decoder = decoder,
stream->recorder = recorder;
stream->has_pending = false;
stream->sink_count = 0;
}
void
stream_add_sink(struct stream *stream, struct sc_packet_sink *sink) {
assert(stream->sink_count < STREAM_MAX_SINKS);
assert(sink);
assert(sink->ops);
stream->sinks[stream->sink_count++] = sink;
}
bool
@ -282,6 +285,13 @@ stream_start(struct stream *stream) {
return true;
}
void
stream_stop(struct stream *stream) {
if (stream->decoder) {
decoder_interrupt(stream->decoder);
}
}
void
stream_join(struct stream *stream) {
sc_thread_join(&stream->thread, NULL);

View File

@ -8,19 +8,14 @@
#include <libavformat/avformat.h>
#include <SDL2/SDL_atomic.h>
#include "trait/packet_sink.h"
#include "util/net.h"
#include "util/thread.h"
#define STREAM_MAX_SINKS 2
struct stream {
socket_t socket;
sc_thread thread;
struct sc_packet_sink *sinks[STREAM_MAX_SINKS];
unsigned sink_count;
struct decoder *decoder;
struct recorder *recorder;
AVCodecContext *codec_ctx;
AVCodecParserContext *parser;
// successive packets may need to be concatenated, until a non-config
@ -30,14 +25,15 @@ struct stream {
};
void
stream_init(struct stream *stream, socket_t socket);
void
stream_add_sink(struct stream *stream, struct sc_packet_sink *sink);
stream_init(struct stream *stream, socket_t socket,
struct decoder *decoder, struct recorder *recorder);
bool
stream_start(struct stream *stream);
void
stream_stop(struct stream *stream);
void
stream_join(struct stream *stream);

View File

@ -1,26 +0,0 @@
#ifndef SC_FRAME_SINK
#define SC_FRAME_SINK
#include "common.h"
#include <assert.h>
#include <stdbool.h>
typedef struct AVFrame AVFrame;
/**
* Frame sink trait.
*
* Component able to receive AVFrames should implement this trait.
*/
struct sc_frame_sink {
const struct sc_frame_sink_ops *ops;
};
struct sc_frame_sink_ops {
bool (*open)(struct sc_frame_sink *sink);
void (*close)(struct sc_frame_sink *sink);
bool (*push)(struct sc_frame_sink *sink, const AVFrame *frame);
};
#endif
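The frame sink trait above is driven purely through its `ops` table: a producer holds an array of `struct sc_frame_sink *`, opens them all, pushes each decoded frame to every sink, and closes the already-opened ones in reverse order if an open fails. This is the pattern the decoder in this diff follows; here it is condensed into a hedged, self-contained sketch (error handling reduced to booleans, `AVFrame` only forward-declared, the `producer_*` names are illustrative).

```c
#include <stdbool.h>

typedef struct AVFrame AVFrame;

struct sc_frame_sink_ops;

struct sc_frame_sink {
    const struct sc_frame_sink_ops *ops;
};

struct sc_frame_sink_ops {
    bool (*open)(struct sc_frame_sink *sink);
    void (*close)(struct sc_frame_sink *sink);
    bool (*push)(struct sc_frame_sink *sink, const AVFrame *frame);
};

#define PRODUCER_MAX_SINKS 2

struct producer {
    struct sc_frame_sink *sinks[PRODUCER_MAX_SINKS];
    unsigned sink_count;
};

static bool
producer_open_sinks(struct producer *p) {
    for (unsigned i = 0; i < p->sink_count; ++i) {
        if (!p->sinks[i]->ops->open(p->sinks[i])) {
            /* Close the sinks already opened, in reverse order. */
            while (i) {
                --i;
                p->sinks[i]->ops->close(p->sinks[i]);
            }
            return false;
        }
    }
    return true;
}

static bool
producer_push(struct producer *p, const AVFrame *frame) {
    /* Fan the same decoded frame out to every registered sink. */
    for (unsigned i = 0; i < p->sink_count; ++i) {
        if (!p->sinks[i]->ops->push(p->sinks[i], frame)) {
            return false;
        }
    }
    return true;
}
```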

View File

@ -1,27 +0,0 @@
#ifndef SC_PACKET_SINK
#define SC_PACKET_SINK
#include "common.h"
#include <assert.h>
#include <stdbool.h>
typedef struct AVCodec AVCodec;
typedef struct AVPacket AVPacket;
/**
* Packet sink trait.
*
* Component able to receive AVPackets should implement this trait.
*/
struct sc_packet_sink {
const struct sc_packet_sink_ops *ops;
};
struct sc_packet_sink_ops {
bool (*open)(struct sc_packet_sink *sink, const AVCodec *codec);
void (*close)(struct sc_packet_sink *sink);
bool (*push)(struct sc_packet_sink *sink, const AVPacket *packet);
};
#endif

View File

@ -140,24 +140,6 @@ parse_integer_with_suffix(const char *s, long *out) {
return true;
}
bool
strlist_contains(const char *list, char sep, const char *s) {
char *p;
do {
p = strchr(list, sep);
size_t token_len = p ? (size_t) (p - list) : strlen(list);
if (!strncmp(list, s, token_len)) {
return true;
}
if (p) {
list = p + 1;
}
} while (p);
return false;
}
size_t
utf8_truncation_index(const char *utf8, size_t max_len) {
size_t len = strlen(utf8);

View File

@ -43,11 +43,6 @@ parse_integers(const char *s, const char sep, size_t max_items, long *out);
bool
parse_integer_with_suffix(const char *s, long *out);
// search s in the list separated by sep
// for example, strlist_contains("a,bc,def", ',', "bc") returns true
bool
strlist_contains(const char *list, char sep, const char *s);
// return the index to truncate a UTF-8 string at a valid position
size_t
utf8_truncation_index(const char *utf8, size_t max_len);

View File

@ -1,342 +0,0 @@
#include "v4l2_sink.h"
#include "util/log.h"
#include "util/str_util.h"
/** Downcast frame_sink to sc_v4l2_sink */
#define DOWNCAST(SINK) container_of(SINK, struct sc_v4l2_sink, frame_sink)
static const AVRational SCRCPY_TIME_BASE = {1, 1000000}; // timestamps in us
static const AVOutputFormat *
find_muxer(const char *name) {
#ifdef SCRCPY_LAVF_HAS_NEW_MUXER_ITERATOR_API
void *opaque = NULL;
#endif
const AVOutputFormat *oformat = NULL;
do {
#ifdef SCRCPY_LAVF_HAS_NEW_MUXER_ITERATOR_API
oformat = av_muxer_iterate(&opaque);
#else
oformat = av_oformat_next(oformat);
#endif
// until null or containing the requested name
} while (oformat && !strlist_contains(oformat->name, ',', name));
return oformat;
}
static bool
write_header(struct sc_v4l2_sink *vs, const AVPacket *packet) {
AVStream *ostream = vs->format_ctx->streams[0];
uint8_t *extradata = av_malloc(packet->size * sizeof(uint8_t));
if (!extradata) {
LOGC("Could not allocate extradata");
return false;
}
// copy the first packet to the extra data
memcpy(extradata, packet->data, packet->size);
ostream->codecpar->extradata = extradata;
ostream->codecpar->extradata_size = packet->size;
int ret = avformat_write_header(vs->format_ctx, NULL);
if (ret < 0) {
LOGE("Failed to write header to %s", vs->device_name);
return false;
}
return true;
}
static void
rescale_packet(struct sc_v4l2_sink *vs, AVPacket *packet) {
AVStream *ostream = vs->format_ctx->streams[0];
av_packet_rescale_ts(packet, SCRCPY_TIME_BASE, ostream->time_base);
}
static bool
write_packet(struct sc_v4l2_sink *vs, AVPacket *packet) {
if (!vs->header_written) {
bool ok = write_header(vs, packet);
if (!ok) {
return false;
}
vs->header_written = true;
return true;
}
rescale_packet(vs, packet);
bool ok = av_write_frame(vs->format_ctx, packet) >= 0;
// Failing to write the last frame is not very serious, no future frame may
// depend on it, so the resulting file will still be valid
(void) ok;
return true;
}
static bool
encode_and_write_frame(struct sc_v4l2_sink *vs, const AVFrame *frame) {
int ret = avcodec_send_frame(vs->encoder_ctx, frame);
if (ret < 0 && ret != AVERROR(EAGAIN)) {
LOGE("Could not send v4l2 video frame: %d", ret);
return false;
}
AVPacket *packet = &vs->packet;
ret = avcodec_receive_packet(vs->encoder_ctx, packet);
if (ret == 0) {
// A packet was received
bool ok = write_packet(vs, packet);
av_packet_unref(packet);
if (!ok) {
LOGW("Could not send packet to v4l2 sink");
return false;
}
} else if (ret != AVERROR(EAGAIN)) {
LOGE("Could not receive v4l2 video packet: %d", ret);
return false;
}
return true;
}
static int
run_v4l2_sink(void *data) {
struct sc_v4l2_sink *vs = data;
for (;;) {
sc_mutex_lock(&vs->mutex);
while (!vs->stopped && vs->vb.pending_frame_consumed) {
sc_cond_wait(&vs->cond, &vs->mutex);
}
if (vs->stopped) {
sc_mutex_unlock(&vs->mutex);
break;
}
sc_mutex_unlock(&vs->mutex);
video_buffer_consume(&vs->vb, vs->frame);
bool ok = encode_and_write_frame(vs, vs->frame);
av_frame_unref(vs->frame);
if (!ok) {
LOGE("Could not send frame to v4l2 sink");
break;
}
}
LOGD("V4l2 thread ended");
return 0;
}
static bool
sc_v4l2_sink_open(struct sc_v4l2_sink *vs) {
bool ok = video_buffer_init(&vs->vb);
if (!ok) {
return false;
}
ok = sc_mutex_init(&vs->mutex);
if (!ok) {
LOGC("Could not create mutex");
goto error_video_buffer_destroy;
}
ok = sc_cond_init(&vs->cond);
if (!ok) {
LOGC("Could not create cond");
goto error_mutex_destroy;
}
// FIXME
const AVOutputFormat *format = find_muxer("video4linux2,v4l2");
if (!format) {
LOGE("Could not find v4l2 muxer");
goto error_cond_destroy;
}
const AVCodec *encoder = avcodec_find_encoder(AV_CODEC_ID_RAWVIDEO);
if (!encoder) {
LOGE("Raw video encoder not found");
goto error_cond_destroy;
}
vs->format_ctx = avformat_alloc_context();
if (!vs->format_ctx) {
LOGE("Could not allocate v4l2 output context");
goto error_cond_destroy;
}
// contrary to the deprecated API (av_oformat_next()), av_muxer_iterate()
// returns (on purpose) a pointer-to-const, but AVFormatContext.oformat
// still expects a pointer-to-non-const (it has not been updated accordingly)
// <https://github.com/FFmpeg/FFmpeg/commit/0694d8702421e7aff1340038559c438b61bb30dd>
vs->format_ctx->oformat = (AVOutputFormat *) format;
vs->format_ctx->url = strdup(vs->device_name);
if (!vs->format_ctx->url) {
LOGE("Could not strdup v4l2 device name");
goto error_avformat_free_context;
}
AVStream *ostream = avformat_new_stream(vs->format_ctx, encoder);
if (!ostream) {
LOGE("Could not allocate new v4l2 stream");
goto error_avformat_free_context;
}
ostream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
ostream->codecpar->codec_id = encoder->id;
ostream->codecpar->format = AV_PIX_FMT_YUV420P;
ostream->codecpar->width = vs->frame_size.width;
ostream->codecpar->height = vs->frame_size.height;
int ret = avio_open(&vs->format_ctx->pb, vs->device_name, AVIO_FLAG_WRITE);
if (ret < 0) {
LOGE("Failed to open output device: %s", vs->device_name);
// ostream will be cleaned up when the context is freed
goto error_avformat_free_context;
}
vs->encoder_ctx = avcodec_alloc_context3(encoder);
if (!vs->encoder_ctx) {
LOGC("Could not allocate codec context for v4l2");
goto error_avio_close;
}
vs->encoder_ctx->width = vs->frame_size.width;
vs->encoder_ctx->height = vs->frame_size.height;
vs->encoder_ctx->pix_fmt = AV_PIX_FMT_YUV420P;
vs->encoder_ctx->time_base.num = 1;
vs->encoder_ctx->time_base.den = 1;
if (avcodec_open2(vs->encoder_ctx, encoder, NULL) < 0) {
LOGE("Could not open codec for v4l2");
goto error_avcodec_free_context;
}
vs->frame = av_frame_alloc();
if (!vs->frame) {
LOGE("Could not create v4l2 frame");
goto error_avcodec_close;
}
LOGD("Starting v4l2 thread");
ok = sc_thread_create(&vs->thread, run_v4l2_sink, "v4l2", vs);
if (!ok) {
LOGC("Could not start v4l2 thread");
goto error_av_frame_free;
}
vs->header_written = false;
vs->stopped = false;
LOGI("v4l2 sink started to device: %s", vs->device_name);
return true;
error_av_frame_free:
av_frame_free(&vs->frame);
error_avcodec_close:
avcodec_close(vs->encoder_ctx);
error_avcodec_free_context:
avcodec_free_context(&vs->encoder_ctx);
error_avio_close:
avio_close(vs->format_ctx->pb);
error_avformat_free_context:
avformat_free_context(vs->format_ctx);
error_cond_destroy:
sc_cond_destroy(&vs->cond);
error_mutex_destroy:
sc_mutex_destroy(&vs->mutex);
error_video_buffer_destroy:
video_buffer_destroy(&vs->vb);
return false;
}
static void
sc_v4l2_sink_close(struct sc_v4l2_sink *vs) {
sc_mutex_lock(&vs->mutex);
vs->stopped = true;
sc_cond_signal(&vs->cond);
sc_mutex_unlock(&vs->mutex);
sc_thread_join(&vs->thread, NULL);
av_frame_free(&vs->frame);
avcodec_close(vs->encoder_ctx);
avcodec_free_context(&vs->encoder_ctx);
avio_close(vs->format_ctx->pb);
avformat_free_context(vs->format_ctx);
sc_cond_destroy(&vs->cond);
sc_mutex_destroy(&vs->mutex);
video_buffer_destroy(&vs->vb);
}
static bool
sc_v4l2_sink_push(struct sc_v4l2_sink *vs, const AVFrame *frame) {
bool ok = video_buffer_push(&vs->vb, frame, NULL);
if (!ok) {
return false;
}
// signal possible change of vs->vb.pending_frame_consumed
sc_cond_signal(&vs->cond);
return true;
}
static bool
sc_v4l2_frame_sink_open(struct sc_frame_sink *sink) {
struct sc_v4l2_sink *vs = DOWNCAST(sink);
return sc_v4l2_sink_open(vs);
}
static void
sc_v4l2_frame_sink_close(struct sc_frame_sink *sink) {
struct sc_v4l2_sink *vs = DOWNCAST(sink);
sc_v4l2_sink_close(vs);
}
static bool
sc_v4l2_frame_sink_push(struct sc_frame_sink *sink, const AVFrame *frame) {
struct sc_v4l2_sink *vs = DOWNCAST(sink);
return sc_v4l2_sink_push(vs, frame);
}
bool
sc_v4l2_sink_init(struct sc_v4l2_sink *vs, const char *device_name,
struct size frame_size) {
vs->device_name = strdup(device_name);
if (!vs->device_name) {
LOGE("Could not strdup v4l2 device name");
return false;
}
vs->frame_size = frame_size;
static const struct sc_frame_sink_ops ops = {
.open = sc_v4l2_frame_sink_open,
.close = sc_v4l2_frame_sink_close,
.push = sc_v4l2_frame_sink_push,
};
vs->frame_sink.ops = &ops;
return true;
}
void
sc_v4l2_sink_destroy(struct sc_v4l2_sink *vs) {
free(vs->device_name);
}
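Putting the pieces together, the sink is driven through its generic frame sink trait: open spawns the v4l2 thread, push hands over decoded frames, close joins the thread. A hypothetical, simplified wiring sketch under those assumptions (in scrcpy the decoder drives the trait; the device path and the single frame below are placeholders):

```c
#include "v4l2_sink.h"

// Hypothetical driver for the sink; error handling trimmed for brevity.
static bool
send_one_frame_to_v4l2(const char *device, struct size frame_size,
                       const AVFrame *frame) {
    struct sc_v4l2_sink sink;
    if (!sc_v4l2_sink_init(&sink, device, frame_size)) {
        return false;
    }

    struct sc_frame_sink *fs = &sink.frame_sink;
    if (!fs->ops->open(fs)) {            // starts the v4l2 thread
        sc_v4l2_sink_destroy(&sink);
        return false;
    }

    bool ok = fs->ops->push(fs, frame);  // a reference to the frame is stored
                                         // in the video buffer

    fs->ops->close(fs);                  // stops and joins the v4l2 thread
    sc_v4l2_sink_destroy(&sink);
    return ok;
}
```

In practice the decoder keeps calling `fs->ops->push(fs, decoded_frame)` for every decoded frame between open and close.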

View File

@ -1,39 +0,0 @@
#ifndef SC_V4L2_SINK_H
#define SC_V4L2_SINK_H
#include "common.h"
#include "coords.h"
#include "trait/frame_sink.h"
#include "video_buffer.h"
#include <libavformat/avformat.h>
struct sc_v4l2_sink {
struct sc_frame_sink frame_sink; // frame sink trait
struct video_buffer vb;
AVFormatContext *format_ctx;
AVCodecContext *encoder_ctx;
char *device_name;
struct size frame_size;
sc_thread thread;
sc_mutex mutex;
sc_cond cond;
bool stopped;
bool header_written;
AVFrame *frame;
AVPacket packet;
};
bool
sc_v4l2_sink_init(struct sc_v4l2_sink *vs, const char *device_name,
struct size frame_size);
void
sc_v4l2_sink_destroy(struct sc_v4l2_sink *vs);
#endif

View File

@ -7,36 +7,67 @@
#include "util/log.h"
bool
video_buffer_init(struct video_buffer *vb) {
vb->pending_frame = av_frame_alloc();
if (!vb->pending_frame) {
return false;
video_buffer_init(struct video_buffer *vb, bool wait_consumer) {
vb->producer_frame = av_frame_alloc();
if (!vb->producer_frame) {
goto error_0;
}
vb->tmp_frame = av_frame_alloc();
if (!vb->tmp_frame) {
av_frame_free(&vb->pending_frame);
return false;
vb->pending_frame = av_frame_alloc();
if (!vb->pending_frame) {
goto error_1;
}
vb->consumer_frame = av_frame_alloc();
if (!vb->consumer_frame) {
goto error_2;
}
bool ok = sc_mutex_init(&vb->mutex);
if (!ok) {
av_frame_free(&vb->pending_frame);
av_frame_free(&vb->tmp_frame);
return false;
goto error_3;
}
vb->wait_consumer = wait_consumer;
if (wait_consumer) {
ok = sc_cond_init(&vb->pending_frame_consumed_cond);
if (!ok) {
sc_mutex_destroy(&vb->mutex);
goto error_2;
}
// interrupted is not used if wait_consumer is disabled since offering
// a frame will never block
vb->interrupted = false;
}
// there is initially no frame, so consider it has already been consumed
vb->pending_frame_consumed = true;
// The callbacks must be set by the consumer via
// video_buffer_set_consumer_callbacks()
vb->cbs = NULL;
return true;
error_3:
av_frame_free(&vb->consumer_frame);
error_2:
av_frame_free(&vb->pending_frame);
error_1:
av_frame_free(&vb->producer_frame);
error_0:
return false;
}
void
video_buffer_destroy(struct video_buffer *vb) {
if (vb->wait_consumer) {
sc_cond_destroy(&vb->pending_frame_consumed_cond);
}
sc_mutex_destroy(&vb->mutex);
av_frame_free(&vb->consumer_frame);
av_frame_free(&vb->pending_frame);
av_frame_free(&vb->tmp_frame);
av_frame_free(&vb->producer_frame);
}
static inline void
@ -46,43 +77,71 @@ swap_frames(AVFrame **lhs, AVFrame **rhs) {
*rhs = tmp;
}
bool
video_buffer_push(struct video_buffer *vb, const AVFrame *frame,
bool *previous_frame_skipped) {
void
video_buffer_set_consumer_callbacks(struct video_buffer *vb,
const struct video_buffer_callbacks *cbs,
void *cbs_userdata) {
assert(!vb->cbs); // must be set only once
assert(cbs);
assert(cbs->on_frame_available);
vb->cbs = cbs;
vb->cbs_userdata = cbs_userdata;
}
void
video_buffer_producer_offer_frame(struct video_buffer *vb) {
assert(vb->cbs);
sc_mutex_lock(&vb->mutex);
// Use a temporary frame to preserve pending_frame in case of error.
// tmp_frame is an empty frame, no need to call av_frame_unref() beforehand.
int r = av_frame_ref(vb->tmp_frame, frame);
if (r) {
LOGE("Could not ref frame: %d", r);
return false;
if (vb->wait_consumer) {
// wait for the current (expired) frame to be consumed
while (!vb->pending_frame_consumed && !vb->interrupted) {
sc_cond_wait(&vb->pending_frame_consumed_cond, &vb->mutex);
}
}
// Now that av_frame_ref() succeeded, we can replace the previous
// pending_frame
swap_frames(&vb->pending_frame, &vb->tmp_frame);
av_frame_unref(vb->tmp_frame);
av_frame_unref(vb->pending_frame);
swap_frames(&vb->producer_frame, &vb->pending_frame);
if (previous_frame_skipped) {
*previous_frame_skipped = !vb->pending_frame_consumed;
}
bool skipped = !vb->pending_frame_consumed;
vb->pending_frame_consumed = false;
sc_mutex_unlock(&vb->mutex);
return true;
if (skipped) {
if (vb->cbs->on_frame_skipped)
vb->cbs->on_frame_skipped(vb, vb->cbs_userdata);
} else {
vb->cbs->on_frame_available(vb, vb->cbs_userdata);
}
}
void
video_buffer_consume(struct video_buffer *vb, AVFrame *dst) {
const AVFrame *
video_buffer_consumer_take_frame(struct video_buffer *vb) {
sc_mutex_lock(&vb->mutex);
assert(!vb->pending_frame_consumed);
vb->pending_frame_consumed = true;
av_frame_move_ref(dst, vb->pending_frame);
// av_frame_move_ref() resets its source frame, so no need to call
// av_frame_unref()
swap_frames(&vb->consumer_frame, &vb->pending_frame);
av_frame_unref(vb->pending_frame);
if (vb->wait_consumer) {
// unblock video_buffer_producer_offer_frame()
sc_cond_signal(&vb->pending_frame_consumed_cond);
}
sc_mutex_unlock(&vb->mutex);
// consumer_frame is only written from this thread, no need to lock
return vb->consumer_frame;
}
void
video_buffer_interrupt(struct video_buffer *vb) {
if (vb->wait_consumer) {
sc_mutex_lock(&vb->mutex);
vb->interrupted = true;
sc_mutex_unlock(&vb->mutex);
// wake up blocking wait
sc_cond_signal(&vb->pending_frame_consumed_cond);
}
}
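On the producer side, the expected call sequence is: fill producer_frame, then offer it, which swaps it with the pending frame and notifies the consumer. A hedged sketch of a producer loop under that assumption (in scrcpy the producer is the decoder; the codec context here is only illustrative):

```c
#include <libavcodec/avcodec.h>

#include "video_buffer.h"

// Hypothetical producer loop: decode directly into producer_frame, then
// offer it; the offer may block until the previous frame is consumed when
// wait_consumer is enabled.
static void
produce_frames(struct video_buffer *vb, AVCodecContext *codec_ctx) {
    while (avcodec_receive_frame(codec_ctx, vb->producer_frame) == 0) {
        // swap producer_frame <-> pending_frame and notify the consumer
        video_buffer_producer_offer_frame(vb);
    }
}

// From another thread, to unblock a producer waiting on a slow consumer:
//     video_buffer_interrupt(vb);
```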

View File

@ -12,39 +12,71 @@
typedef struct AVFrame AVFrame;
/**
* A video buffer holds 1 pending frame, which is the last frame received from
* the producer (typically, the decoder).
* There are 3 frames in memory:
* - one frame is held by the producer (producer_frame)
* - one frame is held by the consumer (consumer_frame)
* - one frame is shared between the producer and the consumer (pending_frame)
*
* If a pending frame has not been consumed when the producer pushes a new
* frame, then it is lost. The intent is to always provide access to the very
* last frame to minimize latency.
* The producer generates a frame into the producer_frame (it may take time).
*
* The producer and the consumer typically do not live in the same thread.
* That's the reason why the callback on_frame_available() does not provide the
* frame as a parameter: the consumer might post an event to its own thread to
* retrieve the pending frame from there, and that frame may have changed since
* the callback if the producer pushed a new one in between.
* Once the frame is produced, the producer calls video_buffer_producer_offer_frame(),
* which swaps the producer and pending frames.
*
* When the consumer is notified that a new frame is available, it calls
* video_buffer_consumer_take_frame() to retrieve it, which swaps the pending
* and consumer frames. The frame is valid until the next call, without
* blocking the producer.
*/
struct video_buffer {
AVFrame *producer_frame;
AVFrame *pending_frame;
AVFrame *tmp_frame; // To preserve the pending frame on error
AVFrame *consumer_frame;
sc_mutex mutex;
bool wait_consumer; // never overwrite a pending frame if it is not consumed
bool interrupted;
sc_cond pending_frame_consumed_cond;
bool pending_frame_consumed;
const struct video_buffer_callbacks *cbs;
void *cbs_userdata;
};
struct video_buffer_callbacks {
// Called when a new frame can be consumed by
// video_buffer_consumer_take_frame(vb)
// This callback is mandatory (it must not be NULL).
void (*on_frame_available)(struct video_buffer *vb, void *userdata);
// Called when a pending frame has been overwritten by the producer
// This callback is optional (it may be NULL).
void (*on_frame_skipped)(struct video_buffer *vb, void *userdata);
};
bool
video_buffer_init(struct video_buffer *vb);
video_buffer_init(struct video_buffer *vb, bool wait_consumer);
void
video_buffer_destroy(struct video_buffer *vb);
bool
video_buffer_push(struct video_buffer *vb, const AVFrame *frame, bool *skipped);
void
video_buffer_consume(struct video_buffer *vb, AVFrame *dst);
video_buffer_set_consumer_callbacks(struct video_buffer *vb,
const struct video_buffer_callbacks *cbs,
void *cbs_userdata);
// set the producer frame as ready for consuming
void
video_buffer_producer_offer_frame(struct video_buffer *vb);
// mark the pending frame as consumed and return it
// the frame is valid until the next call to this function
const AVFrame *
video_buffer_consumer_take_frame(struct video_buffer *vb);
// wake up and avoid any blocking call
void
video_buffer_interrupt(struct video_buffer *vb);
#endif
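To make the callback flow concrete, here is a hedged consumer-side sketch (the names sc_my_consumer, consumer_setup and consumer_handle_new_frame are hypothetical; in scrcpy the actual consumers are the screen and the v4l2 sink). The callback only signals availability; the frame itself is taken later, typically from the consumer's own thread:

```c
#include "video_buffer.h"

struct sc_my_consumer {
    struct video_buffer *vb;
    // ... event queue, renderer, etc.
};

// Called from the producer thread; must not block, so it only signals that
// a frame is pending (a real consumer would post an event to its own loop).
static void
on_frame_available(struct video_buffer *vb, void *userdata) {
    (void) vb;
    struct sc_my_consumer *consumer = userdata;
    (void) consumer; // e.g. push a "new frame" event here
}

// Optional: count frames overwritten before they could be consumed.
static void
on_frame_skipped(struct video_buffer *vb, void *userdata) {
    (void) vb;
    (void) userdata;
}

static const struct video_buffer_callbacks cbs = {
    .on_frame_available = on_frame_available,
    .on_frame_skipped = on_frame_skipped,
};

static void
consumer_setup(struct sc_my_consumer *consumer, struct video_buffer *vb) {
    consumer->vb = vb;
    // must be called once, before the producer offers any frame
    video_buffer_set_consumer_callbacks(vb, &cbs, consumer);
}

// Later, from the consumer's own thread:
static void
consumer_handle_new_frame(struct sc_my_consumer *consumer) {
    const AVFrame *frame = video_buffer_consumer_take_frame(consumer->vb);
    (void) frame; // valid until the next call to take_frame()
}
```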

View File

@ -58,6 +58,7 @@ static void test_options(void) {
"--push-target", "/sdcard/Movies",
"--record", "file",
"--record-format", "mkv",
"--render-expired-frames",
"--serial", "0123456789abcdef",
"--show-touches",
"--turn-screen-off",
@ -86,6 +87,7 @@ static void test_options(void) {
assert(!strcmp(opts->push_target, "/sdcard/Movies"));
assert(!strcmp(opts->record_filename, "file"));
assert(opts->record_format == SC_RECORD_FORMAT_MKV);
assert(opts->render_expired_frames);
assert(!strcmp(opts->serial, "0123456789abcdef"));
assert(opts->show_touches);
assert(opts->turn_screen_off);

View File

@ -287,18 +287,6 @@ static void test_parse_integer_with_suffix(void) {
assert(!ok);
}
static void test_strlist_contains(void) {
assert(strlist_contains("a,bc,def", ',', "bc"));
assert(!strlist_contains("a,bc,def", ',', "b"));
assert(strlist_contains("", ',', ""));
assert(strlist_contains("abc,", ',', ""));
assert(strlist_contains(",abc", ',', ""));
assert(strlist_contains("abc,,def", ',', ""));
assert(!strlist_contains("abc", ',', ""));
assert(strlist_contains(",,|x", '|', ",,"));
assert(strlist_contains("xyz", '\0', "xyz"));
}
int main(int argc, char *argv[]) {
(void) argc;
(void) argv;
@ -316,6 +304,5 @@ int main(int argc, char *argv[]) {
test_parse_integer();
test_parse_integers();
test_parse_integer_with_suffix();
test_strlist_contains();
return 0;
}

View File

@ -3,10 +3,6 @@ package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ContentProvider;
import com.genymobile.scrcpy.wrappers.ServiceManager;
import android.os.Parcel;
import android.os.Parcelable;
import android.util.Base64;
import java.io.File;
import java.io.IOException;
@ -19,123 +15,25 @@ public final class CleanUp {
public static final String SERVER_PATH = "/data/local/tmp/scrcpy-server.jar";
// A simple struct to be passed from the main process to the cleanup process
public static class Config implements Parcelable {
public static final Creator<Config> CREATOR = new Creator<Config>() {
@Override
public Config createFromParcel(Parcel in) {
return new Config(in);
}
@Override
public Config[] newArray(int size) {
return new Config[size];
}
};
private static final int FLAG_DISABLE_SHOW_TOUCHES = 1;
private static final int FLAG_RESTORE_NORMAL_POWER_MODE = 2;
private static final int FLAG_POWER_OFF_SCREEN = 4;
private int displayId;
// Restore the value (between 0 and 7), -1 to not restore
// <https://developer.android.com/reference/android/provider/Settings.Global#STAY_ON_WHILE_PLUGGED_IN>
private int restoreStayOn = -1;
private boolean disableShowTouches;
private boolean restoreNormalPowerMode;
private boolean powerOffScreen;
public Config() {
// Default constructor, the fields are initialized by CleanUp.configure()
}
protected Config(Parcel in) {
displayId = in.readInt();
restoreStayOn = in.readInt();
byte options = in.readByte();
disableShowTouches = (options & FLAG_DISABLE_SHOW_TOUCHES) != 0;
restoreNormalPowerMode = (options & FLAG_RESTORE_NORMAL_POWER_MODE) != 0;
powerOffScreen = (options & FLAG_POWER_OFF_SCREEN) != 0;
}
@Override
public void writeToParcel(Parcel dest, int flags) {
dest.writeInt(displayId);
dest.writeInt(restoreStayOn);
byte options = 0;
if (disableShowTouches) {
options |= FLAG_DISABLE_SHOW_TOUCHES;
}
if (restoreNormalPowerMode) {
options |= FLAG_RESTORE_NORMAL_POWER_MODE;
}
if (powerOffScreen) {
options |= FLAG_POWER_OFF_SCREEN;
}
dest.writeByte(options);
}
private boolean hasWork() {
return disableShowTouches || restoreStayOn != -1 || restoreNormalPowerMode || powerOffScreen;
}
@Override
public int describeContents() {
return 0;
}
byte[] serialize() {
Parcel parcel = Parcel.obtain();
writeToParcel(parcel, 0);
byte[] bytes = parcel.marshall();
parcel.recycle();
return bytes;
}
static Config deserialize(byte[] bytes) {
Parcel parcel = Parcel.obtain();
parcel.unmarshall(bytes, 0, bytes.length);
parcel.setDataPosition(0);
return CREATOR.createFromParcel(parcel);
}
static Config fromBase64(String base64) {
byte[] bytes = Base64.decode(base64, Base64.NO_WRAP);
return deserialize(bytes);
}
String toBase64() {
byte[] bytes = serialize();
return Base64.encodeToString(bytes, Base64.NO_WRAP);
}
}
private CleanUp() {
// not instantiable
}
public static void configure(int displayId, int restoreStayOn, boolean disableShowTouches, boolean restoreNormalPowerMode, boolean powerOffScreen)
public static void configure(boolean disableShowTouches, int restoreStayOn, boolean restoreNormalPowerMode, boolean powerOffScreen, int displayId)
throws IOException {
Config config = new Config();
config.displayId = displayId;
config.disableShowTouches = disableShowTouches;
config.restoreStayOn = restoreStayOn;
config.restoreNormalPowerMode = restoreNormalPowerMode;
config.powerOffScreen = powerOffScreen;
if (config.hasWork()) {
startProcess(config);
boolean needProcess = disableShowTouches || restoreStayOn != -1 || restoreNormalPowerMode || powerOffScreen;
if (needProcess) {
startProcess(disableShowTouches, restoreStayOn, restoreNormalPowerMode, powerOffScreen, displayId);
} else {
// There is no additional clean up to do when scrcpy dies
unlinkSelf();
}
}
private static void startProcess(Config config) throws IOException {
String[] cmd = {"app_process", "/", CleanUp.class.getName(), config.toBase64()};
private static void startProcess(boolean disableShowTouches, int restoreStayOn, boolean restoreNormalPowerMode, boolean powerOffScreen,
int displayId) throws IOException {
String[] cmd = {"app_process", "/", CleanUp.class.getName(), String.valueOf(disableShowTouches), String.valueOf(
restoreStayOn), String.valueOf(restoreNormalPowerMode), String.valueOf(powerOffScreen), String.valueOf(displayId)};
ProcessBuilder builder = new ProcessBuilder(cmd);
builder.environment().put("CLASSPATH", SERVER_PATH);
@ -162,27 +60,31 @@ public final class CleanUp {
Ln.i("Cleaning up");
Config config = Config.fromBase64(args[0]);
boolean disableShowTouches = Boolean.parseBoolean(args[0]);
int restoreStayOn = Integer.parseInt(args[1]);
boolean restoreNormalPowerMode = Boolean.parseBoolean(args[2]);
boolean powerOffScreen = Boolean.parseBoolean(args[3]);
int displayId = Integer.parseInt(args[4]);
if (config.disableShowTouches || config.restoreStayOn != -1) {
if (disableShowTouches || restoreStayOn != -1) {
ServiceManager serviceManager = new ServiceManager();
try (ContentProvider settings = serviceManager.getActivityManager().createSettingsProvider()) {
if (config.disableShowTouches) {
if (disableShowTouches) {
Ln.i("Disabling \"show touches\"");
settings.putValue(ContentProvider.TABLE_SYSTEM, "show_touches", "0");
}
if (config.restoreStayOn != -1) {
if (restoreStayOn != -1) {
Ln.i("Restoring \"stay awake\"");
settings.putValue(ContentProvider.TABLE_GLOBAL, "stay_on_while_plugged_in", String.valueOf(config.restoreStayOn));
settings.putValue(ContentProvider.TABLE_GLOBAL, "stay_on_while_plugged_in", String.valueOf(restoreStayOn));
}
}
}
if (Device.isScreenOn()) {
if (config.powerOffScreen) {
if (powerOffScreen) {
Ln.i("Power off screen");
Device.powerOffScreen(config.displayId);
} else if (config.restoreNormalPowerMode) {
Device.powerOffScreen(displayId);
} else if (restoreNormalPowerMode) {
Ln.i("Restoring normal power mode");
Device.setScreenPowerMode(Device.POWER_MODE_NORMAL);
}

View File

@ -55,7 +55,7 @@ public class Controller {
public void control() throws IOException {
// on start, power on the device
if (!Device.isScreenOn()) {
device.pressReleaseKeycode(KeyEvent.KEYCODE_POWER);
device.injectKeycode(KeyEvent.KEYCODE_POWER);
// dirty hack
// After POWER is injected, the device is powered on asynchronously.
@ -273,7 +273,7 @@ public class Controller {
if (keepPowerModeOff) {
schedulePowerModeOff();
}
return device.pressReleaseKeycode(KeyEvent.KEYCODE_POWER);
return device.injectKeycode(KeyEvent.KEYCODE_POWER);
}
private boolean setClipboard(String text, boolean paste) {
@ -284,7 +284,7 @@ public class Controller {
// On Android >= 7, also press the PASTE key if requested
if (paste && Build.VERSION.SDK_INT >= Build.VERSION_CODES.N && device.supportsInputEvents()) {
device.pressReleaseKeycode(KeyEvent.KEYCODE_PASTE);
device.injectKeycode(KeyEvent.KEYCODE_PASTE);
}
return ok;

View File

@ -25,9 +25,6 @@ public final class Device {
public static final int POWER_MODE_OFF = SurfaceControl.POWER_MODE_OFF;
public static final int POWER_MODE_NORMAL = SurfaceControl.POWER_MODE_NORMAL;
public static final int LOCK_VIDEO_ORIENTATION_UNLOCKED = -1;
public static final int LOCK_VIDEO_ORIENTATION_INITIAL = -2;
private static final ServiceManager SERVICE_MANAGER = new ServiceManager();
public interface RotationListener {
@ -164,39 +161,54 @@ public final class Device {
return supportsInputEvents;
}
public static boolean injectEvent(InputEvent inputEvent, int displayId) {
public static boolean injectEvent(InputEvent inputEvent, int mode, int displayId) {
if (!supportsInputEvents(displayId)) {
throw new AssertionError("Could not inject input event if !supportsInputEvents()");
return false;
}
if (displayId != 0 && !InputManager.setDisplayId(inputEvent, displayId)) {
return false;
}
return SERVICE_MANAGER.getInputManager().injectInputEvent(inputEvent, InputManager.INJECT_INPUT_EVENT_MODE_ASYNC);
return SERVICE_MANAGER.getInputManager().injectInputEvent(inputEvent, mode);
}
public boolean injectEvent(InputEvent inputEvent, int mode) {
if (!supportsInputEvents()) {
throw new AssertionError("Could not inject input event if !supportsInputEvents()");
}
return injectEvent(inputEvent, mode, displayId);
}
public static boolean injectEventOnDisplay(InputEvent event, int displayId) {
return injectEvent(event, InputManager.INJECT_INPUT_EVENT_MODE_ASYNC, displayId);
}
public boolean injectEvent(InputEvent event) {
return injectEvent(event, displayId);
return injectEvent(event, InputManager.INJECT_INPUT_EVENT_MODE_ASYNC);
}
public static boolean injectKeyEvent(int action, int keyCode, int repeat, int metaState, int displayId) {
long now = SystemClock.uptimeMillis();
KeyEvent event = new KeyEvent(now, now, action, keyCode, repeat, metaState, KeyCharacterMap.VIRTUAL_KEYBOARD, 0, 0,
InputDevice.SOURCE_KEYBOARD);
return injectEvent(event, displayId);
return injectEventOnDisplay(event, displayId);
}
public boolean injectKeyEvent(int action, int keyCode, int repeat, int metaState) {
return injectKeyEvent(action, keyCode, repeat, metaState, displayId);
long now = SystemClock.uptimeMillis();
KeyEvent event = new KeyEvent(now, now, action, keyCode, repeat, metaState, KeyCharacterMap.VIRTUAL_KEYBOARD, 0, 0,
InputDevice.SOURCE_KEYBOARD);
return injectEvent(event);
}
public static boolean pressReleaseKeycode(int keyCode, int displayId) {
public static boolean injectKeycode(int keyCode, int displayId) {
return injectKeyEvent(KeyEvent.ACTION_DOWN, keyCode, 0, 0, displayId) && injectKeyEvent(KeyEvent.ACTION_UP, keyCode, 0, 0, displayId);
}
public boolean pressReleaseKeycode(int keyCode) {
return pressReleaseKeycode(keyCode, displayId);
public boolean injectKeycode(int keyCode) {
return injectKeyEvent(KeyEvent.ACTION_DOWN, keyCode, 0, 0) && injectKeyEvent(KeyEvent.ACTION_UP, keyCode, 0, 0);
}
public static boolean isScreenOn() {
@ -272,7 +284,7 @@ public final class Device {
if (!isScreenOn()) {
return true;
}
return pressReleaseKeycode(KeyEvent.KEYCODE_POWER, displayId);
return injectKeycode(KeyEvent.KEYCODE_POWER, displayId);
}
/**

View File

@ -82,12 +82,6 @@ public final class ScreenInfo {
public static ScreenInfo computeScreenInfo(DisplayInfo displayInfo, Rect crop, int maxSize, int lockedVideoOrientation) {
int rotation = displayInfo.getRotation();
if (lockedVideoOrientation == Device.LOCK_VIDEO_ORIENTATION_INITIAL) {
// The user requested to lock the video orientation to the current orientation
lockedVideoOrientation = rotation;
}
Size deviceSize = displayInfo.getSize();
Rect contentRect = new Rect(0, 0, deviceSize.getWidth(), deviceSize.getHeight());
if (crop != null) {

View File

@ -50,7 +50,7 @@ public final class Server {
}
}
CleanUp.configure(options.getDisplayId(), restoreStayOn, mustDisableShowTouchesOnCleanUp, true, options.getPowerOffScreenOnClose());
CleanUp.configure(mustDisableShowTouchesOnCleanUp, restoreStayOn, true, options.getPowerOffScreenOnClose(), options.getDisplayId());
boolean tunnelForward = options.isTunnelForward();

View File

@ -14,7 +14,7 @@ public class ActivityManager {
private final IInterface manager;
private Method getContentProviderExternalMethod;
private boolean getContentProviderExternalMethodNewVersion = true;
private boolean getContentProviderExternalMethodLegacy;
private Method removeContentProviderExternalMethod;
public ActivityManager(IInterface manager) {
@ -29,7 +29,7 @@ public class ActivityManager {
} catch (NoSuchMethodException e) {
// old version
getContentProviderExternalMethod = manager.getClass().getMethod("getContentProviderExternal", String.class, int.class, IBinder.class);
getContentProviderExternalMethodNewVersion = false;
getContentProviderExternalMethodLegacy = true;
}
}
return getContentProviderExternalMethod;
@ -46,7 +46,7 @@ public class ActivityManager {
try {
Method method = getGetContentProviderExternalMethod();
Object[] args;
if (getContentProviderExternalMethodNewVersion) {
if (!getContentProviderExternalMethodLegacy) {
// new version
args = new Object[]{name, ServiceManager.USER_ID, token, null};
} else {

View File

@ -12,7 +12,7 @@ public class StatusBarManager {
private final IInterface manager;
private Method expandNotificationsPanelMethod;
private Method expandSettingsPanelMethod;
private boolean expandSettingsPanelMethodNewVersion = true;
private boolean expandSettingsPanelMethodLegacy;
private Method collapsePanelsMethod;
public StatusBarManager(IInterface manager) {
@ -34,7 +34,7 @@ public class StatusBarManager {
} catch (NoSuchMethodException e) {
// old version
expandSettingsPanelMethod = manager.getClass().getMethod("expandSettingsPanel");
expandSettingsPanelMethodNewVersion = false;
expandSettingsPanelMethodLegacy = true;
}
}
return expandSettingsPanelMethod;
@ -59,7 +59,7 @@ public class StatusBarManager {
public void expandSettingsPanel() {
try {
Method method = getExpandSettingsPanel();
if (expandSettingsPanelMethodNewVersion) {
if (!expandSettingsPanelMethodLegacy) {
// new version
method.invoke(manager, (Object) null);
} else {