Compare commits

...

29 Commits

Author SHA1 Message Date
5521a92959 v4l2wip 2021-04-19 09:35:16 +02:00
31583d9808 Use strlist_contains() to find a muxer
The AVOutputFormat name is a string list: it contains names (possibly
only one) separated by ','.
2021-04-19 09:34:07 +02:00
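
A sketch of how such a lookup can be written with the modern iteration API (av_muxer_iterate(), also used in the recorder.c diff further below); strlist_contains() is the helper introduced in the next entry:

```c
#include <stdbool.h>
#include <libavformat/avformat.h>

// Helper added in the next commit (see the sketch there).
bool
strlist_contains(const char *list, char sep, const char *s);

// Return the first muxer whose comma-separated name list contains `name`.
static const AVOutputFormat *
find_muxer(const char *name) {
    void *opaque = NULL;
    const AVOutputFormat *oformat;
    while ((oformat = av_muxer_iterate(&opaque))) {
        if (strlist_contains(oformat->name, ',', name)) {
            return oformat;
        }
    }
    return NULL;
}
```
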
0f1ca970d1 Add strlist_contains()
Add a function to determine whether a string list, using a given
separator, contains a specific string.
2021-04-19 09:22:53 +02:00
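
A minimal sketch of such a helper, assuming the signature used in the recorder.c diff below (strlist_contains(list, sep, name)):

```c
#include <stdbool.h>
#include <string.h>

// Return true if the `sep`-separated list contains exactly the token `s`.
bool
strlist_contains(const char *list, char sep, const char *s) {
    char *p;
    do {
        p = strchr(list, sep);

        size_t token_len = p ? (size_t) (p - list) : strlen(list);
        if (token_len == strlen(s) && !memcmp(list, s, token_len)) {
            return true;
        }

        if (p) {
            list = p + 1;
        }
    } while (p);
    return false;
}
```
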
2453d86fdf Fix recorder comment 2021-04-18 17:23:23 +02:00
89c2ef2c8d Handle EAGAIN on send_packet in decoder
EAGAIN was only handled on receive_frame.

In practice, it should not be necessary, since one packet always
contains one frame. But just in case.
2021-04-18 17:21:26 +02:00
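
The resulting pattern, as a minimal sketch (the real code is in the decoder.c diff further below): AVERROR(EAGAIN) from avcodec_send_packet() only means the decoder output must be drained first, so it is not treated as a fatal error.

```c
#include <stdbool.h>
#include <libavcodec/avcodec.h>

static bool
send_packet_tolerant(AVCodecContext *ctx, const AVPacket *packet) {
    int ret = avcodec_send_packet(ctx, packet);
    if (ret < 0 && ret != AVERROR(EAGAIN)) {
        // a real decoding error
        return false;
    }
    // sent, or EAGAIN: the output must be drained with avcodec_receive_frame()
    return true;
}
```
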
e3e7924bff Handle errors using gotos in recorder_open()
There are many initializations in recorder_open(). Handle RAII-like
deinitialization using gotos.
2021-04-18 17:21:26 +02:00
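
A schematic sketch of the goto-based cleanup pattern, with hypothetical init/destroy helpers standing in for the mutex, condition variable, output context and file actually handled in the recorder.c diff below:

```c
#include <stdbool.h>

// Hypothetical stand-ins for the real resources.
static bool init_a(void) { return true; }
static bool init_b(void) { return true; }
static bool init_c(void) { return true; }
static void destroy_a(void) {}
static void destroy_b(void) {}

static bool
open_all(void) {
    if (!init_a()) {
        return false;
    }
    if (!init_b()) {
        goto error_destroy_a;
    }
    if (!init_c()) {
        goto error_destroy_b;
    }
    return true;

error_destroy_b:
    destroy_b();
error_destroy_a:
    destroy_a();
    return false;
}
```
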
bacac84a96 Initialize recorder fields on open
Only initialize the ops and the copy of the parameters in
recorder_init(). It was inconsistent to initialize some fields in _init()
and others in _open().
2021-04-18 17:21:26 +02:00
42c7e97e9b Hide the window immediately on close
The screen may not be destroyed immediately on close to avoid undefined
behavior, because it may still receive events from the decoder.

But the visible window must still be hidden immediately.
2021-04-18 17:21:26 +02:00
b16f2eaae1 Assert screen closed on destroy
The destruction order is important, but tricky, because the screen is
opened/closed by the decoder, but destroyed by scrcpy.c on the main
thread.

Add assertions to guarantee that the screen is not destroyed before
being closed.
2021-04-18 17:21:26 +02:00
3423e6d766 Remove video_buffer callbacks
Now that the screen is both the owner and the listener of the video
buffer, execute the code directly without callbacks.
2021-04-18 17:21:26 +02:00
2981b11a49 Move video_buffer to screen
The video buffer is now an internal detail of the screen component.

Since the screen is plugged into the decoder via the frame sink trait,
the decoder no longer accesses the video buffer.
2021-04-18 17:21:26 +02:00
0da276ccba Make decoder push frames to sinks
Now that the screen implements the frame sink trait, make the decoder
push frames to the sinks without depending on the concrete sink types.
2021-04-18 17:21:26 +02:00
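
A sketch of that fan-out, assuming a sink array and ops table like the ones in the decoder.c diff below: open every sink up front (rolling back the sinks already opened on failure), then push each decoded frame to all of them.

```c
#include <stdbool.h>
#include <libavutil/frame.h>

// Assumed trait shape (see "Add frame sink trait" below).
struct sc_frame_sink;
struct sc_frame_sink_ops {
    bool (*open)(struct sc_frame_sink *sink);
    void (*close)(struct sc_frame_sink *sink);
    bool (*push)(struct sc_frame_sink *sink, const AVFrame *frame);
};
struct sc_frame_sink {
    const struct sc_frame_sink_ops *ops;
};

static bool
open_sinks(struct sc_frame_sink **sinks, unsigned count) {
    for (unsigned i = 0; i < count; ++i) {
        if (!sinks[i]->ops->open(sinks[i])) {
            // roll back the sinks already opened, in reverse order
            while (i) {
                struct sc_frame_sink *sink = sinks[--i];
                sink->ops->close(sink);
            }
            return false;
        }
    }
    return true;
}

static bool
push_to_sinks(struct sc_frame_sink **sinks, unsigned count,
              const AVFrame *frame) {
    for (unsigned i = 0; i < count; ++i) {
        if (!sinks[i]->ops->push(sinks[i], frame)) {
            return false;
        }
    }
    return true;
}
```
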
e6ecf09079 Expose screen as frame sink
Make screen implement the frame sink trait.

This will allow the decoder to push frames without depending on the
concrete sink type.
2021-04-18 14:40:00 +02:00
ed86b8d446 Add frame sink trait
This trait will allow abstracting the concrete sink types from the frame
producer (decoder.c).
2021-04-18 14:40:00 +02:00
e1da07e9a8 Make stream push packets to sinks
Now that the decoder and the recorder implement the packet sink trait,
make the stream push packets to the sinks without depending on the
concrete sink types.
2021-04-18 14:40:00 +02:00
8f10558ea9 Expose decoder as packet sink
Make decoder implement the packet sink trait.

This will allow the stream to push packets without depending on the
concrete sink type.
2021-04-18 14:40:00 +02:00
16f6f5bcc1 Reorder decoder functions
This will make further commits more readable.
2021-04-18 14:40:00 +02:00
4634a59631 Expose recorder as packet sink
Make recorder implement the packet sink trait.

This will allow the stream to push packets without depending on the
concrete sink type.
2021-04-18 14:40:00 +02:00
3b58afb0ac Privatize recorder threading
The fact that the recorder uses a separate thread is an internal detail,
so the functions _start(), _stop() and _join() should not be exposed.

Instead, start the thread in _open(), and stop and join it in _close().

This paves the way to expose the recorder as a packet sink trait.
2021-04-18 14:40:00 +02:00
29c2935c45 Reorder recorder functions
This will make further commits more readable.
2021-04-18 14:40:00 +02:00
12c90196fa Add packet sink trait
This trait will allow abstracting the concrete sink types from the
packet producer (stream.c).
2021-04-18 14:40:00 +02:00
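
The shape of the trait, as suggested by its usage in the diffs below (sink->ops->open(sink, codec), ->push(sink, packet), ->close(sink)); the actual declarations live in trait/packet_sink.h, which is not shown in this compare view:

```c
#include <stdbool.h>
#include <libavcodec/avcodec.h>

struct sc_packet_sink;

struct sc_packet_sink_ops {
    bool (*open)(struct sc_packet_sink *sink, const AVCodec *codec);
    void (*close)(struct sc_packet_sink *sink);
    bool (*push)(struct sc_packet_sink *sink, const AVPacket *packet);
};

// Implementors (decoder, recorder) embed this struct and recover themselves
// with container_of() in the callbacks.
struct sc_packet_sink {
    const struct sc_packet_sink_ops *ops;
};
```
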
cc714060a0 Add container_of() macro
This will allow retrieving the parent of an embedded struct.
2021-04-18 14:40:00 +02:00
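
The macro itself appears in the common.h diff below; here is a usage example with purely illustrative struct names:

```c
#include <stddef.h>

#define container_of(ptr, type, member) \
    ((type *) (((char *) (ptr)) - offsetof(type, member)))

// Illustrative: a trait embedded inside its implementor.
struct sink {
    const void *ops;
};

struct owner {
    struct sink packet_sink; // embedded trait
    int other_field;
};

// Given a pointer to the embedded member, recover the enclosing struct.
static struct owner *
owner_from_sink(struct sink *sink) {
    return container_of(sink, struct owner, packet_sink);
}
```
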
e50f2d3954 Make video_buffer more generic
The video buffer took ownership of the producer frame (so that it could
swap frames quickly).

In order to support multiple sinks plugged into the decoder, the decoded
frame must not be consumed by the display video buffer.

Therefore, move the producer and consumer frames out of the video
buffer, and use FFmpeg AVFrame refcounting to share ownership while
avoiding copies.
2021-04-18 14:40:00 +02:00
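
A minimal sketch of the refcounting idea: av_frame_ref() makes the destination share the source's buffers (incrementing their reference count) instead of copying pixel data, and av_frame_unref() drops the previous reference, which is what lets several sinks hold the same decoded frame cheaply.

```c
#include <libavutil/frame.h>

// Give `dst` a reference to the decoded frame without copying pixel data.
// Returns 0 on success, a negative AVERROR code otherwise.
static int
take_frame_reference(AVFrame *dst, const AVFrame *decoded) {
    av_frame_unref(dst);               // release whatever dst referenced before
    return av_frame_ref(dst, decoded); // share the same underlying buffers
}
```
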
ae04ad41be Remove compat with old FFmpeg codec params API
The new API was introduced in 2016 in libavformat 57.xx; it is very old
by now.

This avoids maintaining two code paths for codec parameters.
2021-04-18 14:40:00 +02:00
dde52d679d Remove compat with old FFmpeg decoding API
The new API was introduced in 2016 in libavcodec 57.xx; it is very old
by now.

This avoids maintaining two code paths for decoding.
2021-04-18 14:40:00 +02:00
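
For reference, this is the kind of compile-time gate being dropped (the removed checks appear verbatim in the compat.h diff below); with the minimum supported versions now above these thresholds, only the "new" branch remains:

```c
#include <libavcodec/version.h>

// The send/receive decoding API exists since lavc 57.37.100 (FFmpeg, 2016),
// so on any supported system this check is always true.
#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(57, 37, 100)
# define HAS_NEW_DECODING_API 1
#else
# define HAS_NEW_DECODING_API 0
#endif
```
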
00a002e139 Remove option --render-expired-frames
This flag forced the decoder to wait for the previous frame to be
consumed by the display.

It was initially implemented as a compilation flag for testing, not
intended to be exposed at runtime. But to remove ifdefs and to allow
users to test this flag easily, it had finally been exposed by commit
ebccb9f6cc.

In practice, it turned out to be useless: it had no practical impact,
and it did not solve or mitigate any performance issues causing frame
skipping.

But it added some complexity to the codebase: it required an additional
condition variable, and made video buffer calls possibly blocking, which
in turn required code to interrupt them on exit.

To prepare support for multiple sinks plugged into the decoder (display
and v4l2, for example), the blocking call used for pacing the decoder
output becomes unacceptable, so just remove this useless "feature".
2021-04-18 14:40:00 +02:00
d593927b2c Write trailer from recorder thread
The recorder thread wrote the whole content except the trailer, which
was odd.
2021-04-18 14:40:00 +02:00
c4f3a768d5 Provide actions for the extra mouse buttons
Bind APP_SWITCH to mouse button 4 and expand the notification panel on button 5.

PR #2258 <https://github.com/Genymobile/scrcpy/pull/2258>

Signed-off-by: Romain Vimont <rom@rom1v.com>
2021-04-17 18:25:07 +02:00
b029b1de9d Forward DOWN and UP separately for right-click
The shortcut "back on screen on" is a bit special: the control is
requested by the client, but the actual event injection (POWER or BACK)
is determined on the device.

To properly inject DOWN and UP events for BACK, transmit the action as
a control parameter.

If the screen is off:
 - on DOWN, inject POWER (DOWN and UP) (wake up the device immediately)
 - on UP, do nothing
If the screen is on:
 - on DOWN, inject BACK DOWN
 - on UP, inject BACK UP

A corner case is when the screen turns off between the DOWN and UP
event. In that case, a BACK UP event will be injected, so it's harmless.

As a consequence of this change, the BACK button is now handled by
Android on mouse release. This is consistent with the keyboard shortcut
(Mod+b) behavior.

PR #2259 <https://github.com/Genymobile/scrcpy/pull/2259>
Refs #2258 <https://github.com/Genymobile/scrcpy/pull/2258>
2021-04-17 18:04:32 +02:00
35 changed files with 1078 additions and 536 deletions


@ -491,18 +491,6 @@ scrcpy -Sw
``` ```
#### Render expired frames
By default, to minimize latency, _scrcpy_ always renders the last decoded frame
available, and drops any previous one.
To force the rendering of all frames (at a cost of a possible increased
latency), use:
```bash
scrcpy --render-expired-frames
```
#### Show touches #### Show touches
For presentations, it may be useful to show physical touches (on the physical For presentations, it may be useful to show physical touches (on the physical


@ -33,6 +33,11 @@ else
src += [ 'src/sys/unix/process.c' ] src += [ 'src/sys/unix/process.c' ]
endif endif
v4l2_support = host_machine.system() == 'linux'
if v4l2_support
src += [ 'src/v4l2_sink.c' ]
endif
check_functions = [ check_functions = [
'strdup' 'strdup'
] ]
@ -49,6 +54,10 @@ if not get_option('crossbuild_windows')
dependency('sdl2'), dependency('sdl2'),
] ]
if v4l2_support
dependencies += dependency('libavdevice')
endif
else else
# cross-compile mingw32 build (from Linux to Windows) # cross-compile mingw32 build (from Linux to Windows)
@ -124,6 +133,9 @@ conf.set('SERVER_DEBUGGER', get_option('server_debugger'))
# select the debugger method ('old' for Android < 9, 'new' for Android >= 9) # select the debugger method ('old' for Android < 9, 'new' for Android >= 9)
conf.set('SERVER_DEBUGGER_METHOD_NEW', get_option('server_debugger_method') == 'new') conf.set('SERVER_DEBUGGER_METHOD_NEW', get_option('server_debugger_method') == 'new')
# enable V4L2 support (linux only)
conf.set('HAVE_V4L2', v4l2_support)
configure_file(configuration: conf, output: 'config.h') configure_file(configuration: conf, output: 'config.h')
src_dir = include_directories('src') src_dir = include_directories('src')


@ -155,10 +155,6 @@ Supported names are currently "direct3d", "opengl", "opengles2", "opengles", "me
.UR https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER .UR https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER
.UE .UE
.TP
.B \-\-render\-expired\-frames
By default, to minimize latency, scrcpy always renders the last available decoded frame, and drops any previous ones. This flag forces to render all frames, at a cost of a possible increased latency.
.TP .TP
.BI "\-\-rotation " value .BI "\-\-rotation " value
Set the initial display rotation. Possibles values are 0, 1, 2 and 3. Each increment adds a 90 degrees rotation counterclockwise. Set the initial display rotation. Possibles values are 0, 1, 2 and 3. Each increment adds a 90 degrees rotation counterclockwise.


@ -143,12 +143,6 @@ scrcpy_print_usage(const char *arg0) {
" \"opengles2\", \"opengles\", \"metal\" and \"software\".\n" " \"opengles2\", \"opengles\", \"metal\" and \"software\".\n"
" <https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER>\n" " <https://wiki.libsdl.org/SDL_HINT_RENDER_DRIVER>\n"
"\n" "\n"
" --render-expired-frames\n"
" By default, to minimize latency, scrcpy always renders the\n"
" last available decoded frame, and drops any previous ones.\n"
" This flag forces to render all frames, at a cost of a\n"
" possible increased latency.\n"
"\n"
" --rotation value\n" " --rotation value\n"
" Set the initial display rotation.\n" " Set the initial display rotation.\n"
" Possibles values are 0, 1, 2 and 3. Each increment adds a 90\n" " Possibles values are 0, 1, 2 and 3. Each increment adds a 90\n"
@ -179,6 +173,11 @@ scrcpy_print_usage(const char *arg0) {
" on exit.\n" " on exit.\n"
" It only shows physical touches (not clicks from scrcpy).\n" " It only shows physical touches (not clicks from scrcpy).\n"
"\n" "\n"
" --v4l2_sink /dev/videoN\n"
" Output to v4l2loopback device.\n"
" It requires to lock the video orientation (see\n"
" --lock-video-orientation).\n"
"\n"
" -V, --verbosity value\n" " -V, --verbosity value\n"
" Set the log level (debug, info, warn or error).\n" " Set the log level (debug, info, warn or error).\n"
#ifndef NDEBUG #ifndef NDEBUG
@ -667,6 +666,7 @@ guess_record_format(const char *filename) {
#define OPT_LEGACY_PASTE 1024 #define OPT_LEGACY_PASTE 1024
#define OPT_ENCODER_NAME 1025 #define OPT_ENCODER_NAME 1025
#define OPT_POWER_OFF_ON_CLOSE 1026 #define OPT_POWER_OFF_ON_CLOSE 1026
#define OPT_V4L2_SINK 1027
bool bool
scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) { scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
@ -708,6 +708,9 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
{"show-touches", no_argument, NULL, 't'}, {"show-touches", no_argument, NULL, 't'},
{"stay-awake", no_argument, NULL, 'w'}, {"stay-awake", no_argument, NULL, 'w'},
{"turn-screen-off", no_argument, NULL, 'S'}, {"turn-screen-off", no_argument, NULL, 'S'},
#ifdef HAVE_V4L2
{"v4l2_sink", required_argument, NULL, OPT_V4L2_SINK},
#endif
{"verbosity", required_argument, NULL, 'V'}, {"verbosity", required_argument, NULL, 'V'},
{"version", no_argument, NULL, 'v'}, {"version", no_argument, NULL, 'v'},
{"window-title", required_argument, NULL, OPT_WINDOW_TITLE}, {"window-title", required_argument, NULL, OPT_WINDOW_TITLE},
@ -816,7 +819,8 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
opts->stay_awake = true; opts->stay_awake = true;
break; break;
case OPT_RENDER_EXPIRED_FRAMES: case OPT_RENDER_EXPIRED_FRAMES:
opts->render_expired_frames = true; LOGW("Option --render-expired-frames has been removed. This "
"flag has been ignored.");
break; break;
case OPT_WINDOW_TITLE: case OPT_WINDOW_TITLE:
opts->window_title = optarg; opts->window_title = optarg;
@ -890,17 +894,32 @@ scrcpy_parse_args(struct scrcpy_cli_args *args, int argc, char *argv[]) {
case OPT_POWER_OFF_ON_CLOSE: case OPT_POWER_OFF_ON_CLOSE:
opts->power_off_on_close = true; opts->power_off_on_close = true;
break; break;
case OPT_V4L2_SINK:
opts->v4l2_device = optarg;
break;
default: default:
// getopt prints the error message on stderr // getopt prints the error message on stderr
return false; return false;
} }
} }
if (!opts->display && !opts->record_filename) { if (!opts->display && !opts->record_filename && !opts->v4l2_device) {
LOGE("-N/--no-display requires screen recording (-r/--record)"); LOGE("-N/--no-display requires either screen recording (-r/--record)"
#ifdef HAVE_V4L2
" or sink to v4l2loopback device (--v4l2_sink)"
#endif
);
return false; return false;
} }
#ifdef HAVE_V4L2
if (opts->v4l2_device && opts->lock_video_orientation == -1) {
LOGI("Video orientation is locked for v4l2 sink. "
"See --lock-video-orientation.");
opts->lock_video_orientation = 0;
}
#endif
int index = optind; int index = optind;
if (index < argc) { if (index < argc) {
LOGE("Unexpected additional argument: %s", argv[index]); LOGE("Unexpected additional argument: %s", argv[index]);


@ -8,4 +8,7 @@
#define MIN(X,Y) (X) < (Y) ? (X) : (Y) #define MIN(X,Y) (X) < (Y) ? (X) : (Y)
#define MAX(X,Y) (X) > (Y) ? (X) : (Y) #define MAX(X,Y) (X) > (Y) ? (X) : (Y)
#define container_of(ptr, type, member) \
((type *) (((char *) (ptr)) - offsetof(type, member)))
#endif #endif


@ -8,20 +8,9 @@
# define _DARWIN_C_SOURCE # define _DARWIN_C_SOURCE
#endif #endif
#include <libavcodec/version.h>
#include <libavformat/version.h> #include <libavformat/version.h>
#include <SDL2/SDL_version.h> #include <SDL2/SDL_version.h>
// In ffmpeg/doc/APIchanges:
// 2016-04-11 - 6f69f7a / 9200514 - lavf 57.33.100 / 57.5.0 - avformat.h
// Add AVStream.codecpar, deprecate AVStream.codec.
#if (LIBAVFORMAT_VERSION_MICRO >= 100 /* FFmpeg */ && \
LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 33, 100)) \
|| (LIBAVFORMAT_VERSION_MICRO < 100 && /* Libav */ \
LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(57, 5, 0))
# define SCRCPY_LAVF_HAS_NEW_CODEC_PARAMS_API
#endif
// In ffmpeg/doc/APIchanges: // In ffmpeg/doc/APIchanges:
// 2018-02-06 - 0694d87024 - lavf 58.9.100 - avformat.h // 2018-02-06 - 0694d87024 - lavf 58.9.100 - avformat.h
// Deprecate use of av_register_input_format(), av_register_output_format(), // Deprecate use of av_register_input_format(), av_register_output_format(),
@ -33,15 +22,6 @@
# define SCRCPY_LAVF_REQUIRES_REGISTER_ALL # define SCRCPY_LAVF_REQUIRES_REGISTER_ALL
#endif #endif
// In ffmpeg/doc/APIchanges:
// 2016-04-21 - 7fc329e - lavc 57.37.100 - avcodec.h
// Add a new audio/video encoding and decoding API with decoupled input
// and output -- avcodec_send_packet(), avcodec_receive_frame(),
// avcodec_send_frame() and avcodec_receive_packet().
#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(57, 37, 100)
# define SCRCPY_LAVF_HAS_NEW_ENCODING_DECODING_API
#endif
#if SDL_VERSION_ATLEAST(2, 0, 5) #if SDL_VERSION_ATLEAST(2, 0, 5)
// <https://wiki.libsdl.org/SDL_HINT_MOUSE_FOCUS_CLICKTHROUGH> // <https://wiki.libsdl.org/SDL_HINT_MOUSE_FOCUS_CLICKTHROUGH>
# define SCRCPY_SDL_HAS_HINT_MOUSE_FOCUS_CLICKTHROUGH # define SCRCPY_SDL_HAS_HINT_MOUSE_FOCUS_CLICKTHROUGH


@ -67,6 +67,9 @@ control_msg_serialize(const struct control_msg *msg, unsigned char *buf) {
buffer_write32be(&buf[17], buffer_write32be(&buf[17],
(uint32_t) msg->inject_scroll_event.vscroll); (uint32_t) msg->inject_scroll_event.vscroll);
return 21; return 21;
case CONTROL_MSG_TYPE_BACK_OR_SCREEN_ON:
buf[1] = msg->inject_keycode.action;
return 2;
case CONTROL_MSG_TYPE_SET_CLIPBOARD: { case CONTROL_MSG_TYPE_SET_CLIPBOARD: {
buf[1] = !!msg->set_clipboard.paste; buf[1] = !!msg->set_clipboard.paste;
size_t len = write_string(msg->set_clipboard.text, size_t len = write_string(msg->set_clipboard.text,
@ -77,7 +80,6 @@ control_msg_serialize(const struct control_msg *msg, unsigned char *buf) {
case CONTROL_MSG_TYPE_SET_SCREEN_POWER_MODE: case CONTROL_MSG_TYPE_SET_SCREEN_POWER_MODE:
buf[1] = msg->set_screen_power_mode.mode; buf[1] = msg->set_screen_power_mode.mode;
return 2; return 2;
case CONTROL_MSG_TYPE_BACK_OR_SCREEN_ON:
case CONTROL_MSG_TYPE_EXPAND_NOTIFICATION_PANEL: case CONTROL_MSG_TYPE_EXPAND_NOTIFICATION_PANEL:
case CONTROL_MSG_TYPE_COLLAPSE_NOTIFICATION_PANEL: case CONTROL_MSG_TYPE_COLLAPSE_NOTIFICATION_PANEL:
case CONTROL_MSG_TYPE_GET_CLIPBOARD: case CONTROL_MSG_TYPE_GET_CLIPBOARD:


@ -64,6 +64,10 @@ struct control_msg {
int32_t hscroll; int32_t hscroll;
int32_t vscroll; int32_t vscroll;
} inject_scroll_event; } inject_scroll_event;
struct {
enum android_keyevent_action action; // action for the BACK key
// screen may only be turned on on ACTION_DOWN
} back_or_screen_on;
struct { struct {
char *text; // owned, to be freed by free() char *text; // owned, to be freed by free()
bool paste; bool paste;


@ -4,14 +4,40 @@
#include "events.h" #include "events.h"
#include "video_buffer.h" #include "video_buffer.h"
#include "trait/frame_sink.h"
#include "util/log.h" #include "util/log.h"
void /** Downcast packet_sink to decoder */
decoder_init(struct decoder *decoder, struct video_buffer *vb) { #define DOWNCAST(SINK) container_of(SINK, struct decoder, packet_sink)
decoder->video_buffer = vb;
static void
decoder_close_first_sinks(struct decoder *decoder, unsigned count) {
while (count) {
struct sc_frame_sink *sink = decoder->sinks[--count];
sink->ops->close(sink);
}
} }
bool static inline void
decoder_close_sinks(struct decoder *decoder) {
decoder_close_first_sinks(decoder, decoder->sink_count);
}
static bool
decoder_open_sinks(struct decoder *decoder) {
for (unsigned i = 0; i < decoder->sink_count; ++i) {
struct sc_frame_sink *sink = decoder->sinks[i];
if (!sink->ops->open(sink)) {
LOGE("Could not open frame sink %d", i);
decoder_close_first_sinks(decoder, i);
return false;
}
}
return true;
}
static bool
decoder_open(struct decoder *decoder, const AVCodec *codec) { decoder_open(struct decoder *decoder, const AVCodec *codec) {
decoder->codec_ctx = avcodec_alloc_context3(codec); decoder->codec_ctx = avcodec_alloc_context3(codec);
if (!decoder->codec_ctx) { if (!decoder->codec_ctx) {
@ -25,52 +51,106 @@ decoder_open(struct decoder *decoder, const AVCodec *codec) {
return false; return false;
} }
decoder->frame = av_frame_alloc();
if (!decoder->frame) {
LOGE("Could not create decoder frame");
avcodec_close(decoder->codec_ctx);
avcodec_free_context(&decoder->codec_ctx);
return false;
}
if (!decoder_open_sinks(decoder)) {
LOGE("Could not open decoder sinks");
av_frame_free(&decoder->frame);
avcodec_close(decoder->codec_ctx);
avcodec_free_context(&decoder->codec_ctx);
return false;
}
return true; return true;
} }
void static void
decoder_close(struct decoder *decoder) { decoder_close(struct decoder *decoder) {
decoder_close_sinks(decoder);
av_frame_free(&decoder->frame);
avcodec_close(decoder->codec_ctx); avcodec_close(decoder->codec_ctx);
avcodec_free_context(&decoder->codec_ctx); avcodec_free_context(&decoder->codec_ctx);
} }
bool static bool
push_frame_to_sinks(struct decoder *decoder, const AVFrame *frame) {
for (unsigned i = 0; i < decoder->sink_count; ++i) {
struct sc_frame_sink *sink = decoder->sinks[i];
if (!sink->ops->push(sink, frame)) {
LOGE("Could not send frame to sink %d", i);
return false;
}
}
return true;
}
static bool
decoder_push(struct decoder *decoder, const AVPacket *packet) { decoder_push(struct decoder *decoder, const AVPacket *packet) {
// the new decoding/encoding API has been introduced by: bool is_config = packet->pts == AV_NOPTS_VALUE;
// <http://git.videolan.org/?p=ffmpeg.git;a=commitdiff;h=7fc329e2dd6226dfecaa4a1d7adf353bf2773726> if (is_config) {
#ifdef SCRCPY_LAVF_HAS_NEW_ENCODING_DECODING_API // nothing to do
int ret; return true;
if ((ret = avcodec_send_packet(decoder->codec_ctx, packet)) < 0) { }
int ret = avcodec_send_packet(decoder->codec_ctx, packet);
if (ret < 0 && ret != AVERROR(EAGAIN)) {
LOGE("Could not send video packet: %d", ret); LOGE("Could not send video packet: %d", ret);
return false; return false;
} }
ret = avcodec_receive_frame(decoder->codec_ctx, ret = avcodec_receive_frame(decoder->codec_ctx, decoder->frame);
decoder->video_buffer->producer_frame);
if (!ret) { if (!ret) {
// a frame was received // a frame was received
video_buffer_producer_offer_frame(decoder->video_buffer); bool ok = push_frame_to_sinks(decoder, decoder->frame);
// A frame lost should not make the whole pipeline fail. The error, if
// any, is already logged.
(void) ok;
} else if (ret != AVERROR(EAGAIN)) { } else if (ret != AVERROR(EAGAIN)) {
LOGE("Could not receive video frame: %d", ret); LOGE("Could not receive video frame: %d", ret);
return false; return false;
} }
#else
int got_picture;
int len = avcodec_decode_video2(decoder->codec_ctx,
decoder->video_buffer->producer_frame,
&got_picture,
packet);
if (len < 0) {
LOGE("Could not decode video packet: %d", len);
return false;
}
if (got_picture) {
video_buffer_producer_offer_frame(decoder->video_buffer);
}
#endif
return true; return true;
} }
void static bool
decoder_interrupt(struct decoder *decoder) { decoder_packet_sink_open(struct sc_packet_sink *sink, const AVCodec *codec) {
video_buffer_interrupt(decoder->video_buffer); struct decoder *decoder = DOWNCAST(sink);
return decoder_open(decoder, codec);
}
static void
decoder_packet_sink_close(struct sc_packet_sink *sink) {
struct decoder *decoder = DOWNCAST(sink);
decoder_close(decoder);
}
static bool
decoder_packet_sink_push(struct sc_packet_sink *sink, const AVPacket *packet) {
struct decoder *decoder = DOWNCAST(sink);
return decoder_push(decoder, packet);
}
void
decoder_init(struct decoder *decoder) {
static const struct sc_packet_sink_ops ops = {
.open = decoder_packet_sink_open,
.close = decoder_packet_sink_close,
.push = decoder_packet_sink_push,
};
decoder->packet_sink.ops = &ops;
}
void
decoder_add_sink(struct decoder *decoder, struct sc_frame_sink *sink) {
assert(decoder->sink_count < DECODER_MAX_SINKS);
assert(sink);
assert(sink->ops);
decoder->sinks[decoder->sink_count++] = sink;
} }


@ -3,30 +3,27 @@
#include "common.h" #include "common.h"
#include "trait/packet_sink.h"
#include <stdbool.h> #include <stdbool.h>
#include <libavformat/avformat.h> #include <libavformat/avformat.h>
struct video_buffer; #define DECODER_MAX_SINKS 2
struct decoder { struct decoder {
struct video_buffer *video_buffer; struct sc_packet_sink packet_sink; // packet sink trait
struct sc_frame_sink *sinks[DECODER_MAX_SINKS];
unsigned sink_count;
AVCodecContext *codec_ctx; AVCodecContext *codec_ctx;
AVFrame *frame;
}; };
void void
decoder_init(struct decoder *decoder, struct video_buffer *vb); decoder_init(struct decoder *decoder);
bool
decoder_open(struct decoder *decoder, const AVCodec *codec);
void void
decoder_close(struct decoder *decoder); decoder_add_sink(struct decoder *decoder, struct sc_frame_sink *sink);
bool
decoder_push(struct decoder *decoder, const AVPacket *packet);
void
decoder_interrupt(struct decoder *decoder);
#endif #endif


@ -146,13 +146,25 @@ action_cut(struct controller *controller, int actions) {
} }
// turn the screen on if it was off, press BACK otherwise // turn the screen on if it was off, press BACK otherwise
// If the screen is off, it is turned on only on ACTION_DOWN
static void static void
press_back_or_turn_screen_on(struct controller *controller) { press_back_or_turn_screen_on(struct controller *controller, int actions) {
struct control_msg msg; struct control_msg msg;
msg.type = CONTROL_MSG_TYPE_BACK_OR_SCREEN_ON; msg.type = CONTROL_MSG_TYPE_BACK_OR_SCREEN_ON;
if (!controller_push_msg(controller, &msg)) { if (actions & ACTION_DOWN) {
LOGW("Could not request 'press back or turn screen on'"); msg.back_or_screen_on.action = AKEY_EVENT_ACTION_DOWN;
if (!controller_push_msg(controller, &msg)) {
LOGW("Could not request 'press back or turn screen on'");
return;
}
}
if (actions & ACTION_UP) {
msg.back_or_screen_on.action = AKEY_EVENT_ACTION_UP;
if (!controller_push_msg(controller, &msg)) {
LOGW("Could not request 'press back or turn screen on'");
}
} }
} }
@ -649,10 +661,16 @@ input_manager_process_mouse_button(struct input_manager *im,
if (!im->forward_all_clicks) { if (!im->forward_all_clicks) {
int action = down ? ACTION_DOWN : ACTION_UP; int action = down ? ACTION_DOWN : ACTION_UP;
if (control && event->button == SDL_BUTTON_X1) {
action_app_switch(im->controller, action);
return;
}
if (control && event->button == SDL_BUTTON_X2 && down) {
expand_notification_panel(im->controller);
return;
}
if (control && event->button == SDL_BUTTON_RIGHT) { if (control && event->button == SDL_BUTTON_RIGHT) {
if (down) { press_back_or_turn_screen_on(im->controller, action);
press_back_or_turn_screen_on(im->controller);
}
return; return;
} }
if (control && event->button == SDL_BUTTON_MIDDLE) { if (control && event->button == SDL_BUTTON_MIDDLE) {


@ -6,6 +6,9 @@
#include <stdbool.h> #include <stdbool.h>
#include <unistd.h> #include <unistd.h>
#include <libavformat/avformat.h> #include <libavformat/avformat.h>
#ifdef HAVE_V4L2
# include <libavdevice/avdevice.h>
#endif
#define SDL_MAIN_HANDLED // avoid link error on Linux Windows Subsystem #define SDL_MAIN_HANDLED // avoid link error on Linux Windows Subsystem
#include <SDL2/SDL.h> #include <SDL2/SDL.h>
@ -28,6 +31,11 @@ print_version(void) {
fprintf(stderr, " - libavutil %d.%d.%d\n", LIBAVUTIL_VERSION_MAJOR, fprintf(stderr, " - libavutil %d.%d.%d\n", LIBAVUTIL_VERSION_MAJOR,
LIBAVUTIL_VERSION_MINOR, LIBAVUTIL_VERSION_MINOR,
LIBAVUTIL_VERSION_MICRO); LIBAVUTIL_VERSION_MICRO);
#ifdef HAVE_V4L2
fprintf(stderr, " - libavdevice %d.%d.%d\n", LIBAVDEVICE_VERSION_MAJOR,
LIBAVDEVICE_VERSION_MINOR,
LIBAVDEVICE_VERSION_MICRO);
#endif
} }
static SDL_LogPriority static SDL_LogPriority
@ -90,6 +98,12 @@ main(int argc, char *argv[]) {
av_register_all(); av_register_all();
#endif #endif
#ifdef HAVE_V4L2
if (args.opts.v4l2_device) {
avdevice_register_all();
}
#endif
if (avformat_network_init()) { if (avformat_network_init()) {
return 1; return 1;
} }


@ -4,6 +4,10 @@
#include <libavutil/time.h> #include <libavutil/time.h>
#include "util/log.h" #include "util/log.h"
#include "util/str_util.h"
/** Downcast packet_sink to recorder */
#define DOWNCAST(SINK) container_of(SINK, struct recorder, packet_sink)
static const AVRational SCRCPY_TIME_BASE = {1, 1000000}; // timestamps in us static const AVRational SCRCPY_TIME_BASE = {1, 1000000}; // timestamps in us
@ -19,8 +23,8 @@ find_muxer(const char *name) {
#else #else
oformat = av_oformat_next(oformat); oformat = av_oformat_next(oformat);
#endif #endif
// until null or with name "mp4" // until null or containing the requested name
} while (oformat && strcmp(oformat->name, name)); } while (oformat && !strlist_contains(oformat->name, ',', name));
return oformat; return oformat;
} }
@ -57,50 +61,6 @@ recorder_queue_clear(struct recorder_queue *queue) {
} }
} }
bool
recorder_init(struct recorder *recorder,
const char *filename,
enum sc_record_format format,
struct size declared_frame_size) {
recorder->filename = strdup(filename);
if (!recorder->filename) {
LOGE("Could not strdup filename");
return false;
}
bool ok = sc_mutex_init(&recorder->mutex);
if (!ok) {
LOGC("Could not create mutex");
free(recorder->filename);
return false;
}
ok = sc_cond_init(&recorder->queue_cond);
if (!ok) {
LOGC("Could not create cond");
sc_mutex_destroy(&recorder->mutex);
free(recorder->filename);
return false;
}
queue_init(&recorder->queue);
recorder->stopped = false;
recorder->failed = false;
recorder->format = format;
recorder->declared_frame_size = declared_frame_size;
recorder->header_written = false;
recorder->previous = NULL;
return true;
}
void
recorder_destroy(struct recorder *recorder) {
sc_cond_destroy(&recorder->queue_cond);
sc_mutex_destroy(&recorder->mutex);
free(recorder->filename);
}
static const char * static const char *
recorder_get_format_name(enum sc_record_format format) { recorder_get_format_name(enum sc_record_format format) {
switch (format) { switch (format) {
@ -110,88 +70,6 @@ recorder_get_format_name(enum sc_record_format format) {
} }
} }
bool
recorder_open(struct recorder *recorder, const AVCodec *input_codec) {
const char *format_name = recorder_get_format_name(recorder->format);
assert(format_name);
const AVOutputFormat *format = find_muxer(format_name);
if (!format) {
LOGE("Could not find muxer");
return false;
}
recorder->ctx = avformat_alloc_context();
if (!recorder->ctx) {
LOGE("Could not allocate output context");
return false;
}
// contrary to the deprecated API (av_oformat_next()), av_muxer_iterate()
// returns (on purpose) a pointer-to-const, but AVFormatContext.oformat
// still expects a pointer-to-non-const (it has not be updated accordingly)
// <https://github.com/FFmpeg/FFmpeg/commit/0694d8702421e7aff1340038559c438b61bb30dd>
recorder->ctx->oformat = (AVOutputFormat *) format;
av_dict_set(&recorder->ctx->metadata, "comment",
"Recorded by scrcpy " SCRCPY_VERSION, 0);
AVStream *ostream = avformat_new_stream(recorder->ctx, input_codec);
if (!ostream) {
avformat_free_context(recorder->ctx);
return false;
}
#ifdef SCRCPY_LAVF_HAS_NEW_CODEC_PARAMS_API
ostream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
ostream->codecpar->codec_id = input_codec->id;
ostream->codecpar->format = AV_PIX_FMT_YUV420P;
ostream->codecpar->width = recorder->declared_frame_size.width;
ostream->codecpar->height = recorder->declared_frame_size.height;
#else
ostream->codec->codec_type = AVMEDIA_TYPE_VIDEO;
ostream->codec->codec_id = input_codec->id;
ostream->codec->pix_fmt = AV_PIX_FMT_YUV420P;
ostream->codec->width = recorder->declared_frame_size.width;
ostream->codec->height = recorder->declared_frame_size.height;
#endif
int ret = avio_open(&recorder->ctx->pb, recorder->filename,
AVIO_FLAG_WRITE);
if (ret < 0) {
LOGE("Failed to open output file: %s", recorder->filename);
// ostream will be cleaned up during context cleaning
avformat_free_context(recorder->ctx);
return false;
}
LOGI("Recording started to %s file: %s", format_name, recorder->filename);
return true;
}
void
recorder_close(struct recorder *recorder) {
if (recorder->header_written) {
int ret = av_write_trailer(recorder->ctx);
if (ret < 0) {
LOGE("Failed to write trailer to %s", recorder->filename);
recorder->failed = true;
}
} else {
// the recorded file is empty
recorder->failed = true;
}
avio_close(recorder->ctx->pb);
avformat_free_context(recorder->ctx);
if (recorder->failed) {
LOGE("Recording failed to %s", recorder->filename);
} else {
const char *format_name = recorder_get_format_name(recorder->format);
LOGI("Recording complete to %s file: %s", format_name, recorder->filename);
}
}
static bool static bool
recorder_write_header(struct recorder *recorder, const AVPacket *packet) { recorder_write_header(struct recorder *recorder, const AVPacket *packet) {
AVStream *ostream = recorder->ctx->streams[0]; AVStream *ostream = recorder->ctx->streams[0];
@ -205,13 +83,8 @@ recorder_write_header(struct recorder *recorder, const AVPacket *packet) {
// copy the first packet to the extra data // copy the first packet to the extra data
memcpy(extradata, packet->data, packet->size); memcpy(extradata, packet->data, packet->size);
#ifdef SCRCPY_LAVF_HAS_NEW_CODEC_PARAMS_API
ostream->codecpar->extradata = extradata; ostream->codecpar->extradata = extradata;
ostream->codecpar->extradata_size = packet->size; ostream->codecpar->extradata_size = packet->size;
#else
ostream->codec->extradata = extradata;
ostream->codec->extradata_size = packet->size;
#endif
int ret = avformat_write_header(recorder->ctx, NULL); int ret = avformat_write_header(recorder->ctx, NULL);
if (ret < 0) { if (ret < 0) {
@ -317,7 +190,26 @@ run_recorder(void *data) {
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
break; break;
} }
}
if (!recorder->failed) {
if (recorder->header_written) {
int ret = av_write_trailer(recorder->ctx);
if (ret < 0) {
LOGE("Failed to write trailer to %s", recorder->filename);
recorder->failed = true;
}
} else {
// the recorded file is empty
recorder->failed = true;
}
}
if (recorder->failed) {
LOGE("Recording failed to %s", recorder->filename);
} else {
const char *format_name = recorder_get_format_name(recorder->format);
LOGI("Recording complete to %s file: %s", format_name, recorder->filename);
} }
LOGD("Recorder thread ended"); LOGD("Recorder thread ended");
@ -325,34 +217,108 @@ run_recorder(void *data) {
return 0; return 0;
} }
bool static bool
recorder_start(struct recorder *recorder) { recorder_open(struct recorder *recorder, const AVCodec *input_codec) {
LOGD("Starting recorder thread"); bool ok = sc_mutex_init(&recorder->mutex);
bool ok = sc_thread_create(&recorder->thread, run_recorder, "recorder",
recorder);
if (!ok) { if (!ok) {
LOGC("Could not start recorder thread"); LOGC("Could not create mutex");
return false; return false;
} }
ok = sc_cond_init(&recorder->queue_cond);
if (!ok) {
LOGC("Could not create cond");
goto error_mutex_destroy;
}
queue_init(&recorder->queue);
recorder->stopped = false;
recorder->failed = false;
recorder->header_written = false;
recorder->previous = NULL;
const char *format_name = recorder_get_format_name(recorder->format);
assert(format_name);
const AVOutputFormat *format = find_muxer(format_name);
if (!format) {
LOGE("Could not find muxer");
goto error_cond_destroy;
}
recorder->ctx = avformat_alloc_context();
if (!recorder->ctx) {
LOGE("Could not allocate output context");
goto error_cond_destroy;
}
// contrary to the deprecated API (av_oformat_next()), av_muxer_iterate()
// returns (on purpose) a pointer-to-const, but AVFormatContext.oformat
// still expects a pointer-to-non-const (it has not be updated accordingly)
// <https://github.com/FFmpeg/FFmpeg/commit/0694d8702421e7aff1340038559c438b61bb30dd>
recorder->ctx->oformat = (AVOutputFormat *) format;
av_dict_set(&recorder->ctx->metadata, "comment",
"Recorded by scrcpy " SCRCPY_VERSION, 0);
AVStream *ostream = avformat_new_stream(recorder->ctx, input_codec);
if (!ostream) {
goto error_avformat_free_context;
}
ostream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
ostream->codecpar->codec_id = input_codec->id;
ostream->codecpar->format = AV_PIX_FMT_YUV420P;
ostream->codecpar->width = recorder->declared_frame_size.width;
ostream->codecpar->height = recorder->declared_frame_size.height;
int ret = avio_open(&recorder->ctx->pb, recorder->filename,
AVIO_FLAG_WRITE);
if (ret < 0) {
LOGE("Failed to open output file: %s", recorder->filename);
// ostream will be cleaned up during context cleaning
goto error_avformat_free_context;
}
LOGD("Starting recorder thread");
ok = sc_thread_create(&recorder->thread, run_recorder, "recorder",
recorder);
if (!ok) {
LOGC("Could not start recorder thread");
goto error_avio_close;
}
LOGI("Recording started to %s file: %s", format_name, recorder->filename);
return true; return true;
error_avio_close:
avio_close(recorder->ctx->pb);
error_avformat_free_context:
avformat_free_context(recorder->ctx);
error_cond_destroy:
sc_cond_destroy(&recorder->queue_cond);
error_mutex_destroy:
sc_mutex_destroy(&recorder->mutex);
return false;
} }
void static void
recorder_stop(struct recorder *recorder) { recorder_close(struct recorder *recorder) {
sc_mutex_lock(&recorder->mutex); sc_mutex_lock(&recorder->mutex);
recorder->stopped = true; recorder->stopped = true;
sc_cond_signal(&recorder->queue_cond); sc_cond_signal(&recorder->queue_cond);
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
}
void
recorder_join(struct recorder *recorder) {
sc_thread_join(&recorder->thread, NULL); sc_thread_join(&recorder->thread, NULL);
avio_close(recorder->ctx->pb);
avformat_free_context(recorder->ctx);
sc_cond_destroy(&recorder->queue_cond);
sc_mutex_destroy(&recorder->mutex);
} }
bool static bool
recorder_push(struct recorder *recorder, const AVPacket *packet) { recorder_push(struct recorder *recorder, const AVPacket *packet) {
sc_mutex_lock(&recorder->mutex); sc_mutex_lock(&recorder->mutex);
assert(!recorder->stopped); assert(!recorder->stopped);
@ -376,3 +342,51 @@ recorder_push(struct recorder *recorder, const AVPacket *packet) {
sc_mutex_unlock(&recorder->mutex); sc_mutex_unlock(&recorder->mutex);
return true; return true;
} }
static bool
recorder_packet_sink_open(struct sc_packet_sink *sink, const AVCodec *codec) {
struct recorder *recorder = DOWNCAST(sink);
return recorder_open(recorder, codec);
}
static void
recorder_packet_sink_close(struct sc_packet_sink *sink) {
struct recorder *recorder = DOWNCAST(sink);
recorder_close(recorder);
}
static bool
recorder_packet_sink_push(struct sc_packet_sink *sink, const AVPacket *packet) {
struct recorder *recorder = DOWNCAST(sink);
return recorder_push(recorder, packet);
}
bool
recorder_init(struct recorder *recorder,
const char *filename,
enum sc_record_format format,
struct size declared_frame_size) {
recorder->filename = strdup(filename);
if (!recorder->filename) {
LOGE("Could not strdup filename");
return false;
}
recorder->format = format;
recorder->declared_frame_size = declared_frame_size;
static const struct sc_packet_sink_ops ops = {
.open = recorder_packet_sink_open,
.close = recorder_packet_sink_close,
.push = recorder_packet_sink_push,
};
recorder->packet_sink.ops = &ops;
return true;
}
void
recorder_destroy(struct recorder *recorder) {
free(recorder->filename);
}


@ -8,6 +8,7 @@
#include "coords.h" #include "coords.h"
#include "scrcpy.h" #include "scrcpy.h"
#include "trait/packet_sink.h"
#include "util/queue.h" #include "util/queue.h"
#include "util/thread.h" #include "util/thread.h"
@ -19,6 +20,8 @@ struct record_packet {
struct recorder_queue QUEUE(struct record_packet); struct recorder_queue QUEUE(struct record_packet);
struct recorder { struct recorder {
struct sc_packet_sink packet_sink; // packet sink trait
char *filename; char *filename;
enum sc_record_format format; enum sc_record_format format;
AVFormatContext *ctx; AVFormatContext *ctx;
@ -28,7 +31,7 @@ struct recorder {
sc_thread thread; sc_thread thread;
sc_mutex mutex; sc_mutex mutex;
sc_cond queue_cond; sc_cond queue_cond;
bool stopped; // set on recorder_stop() by the stream reader bool stopped; // set on recorder_close()
bool failed; // set on packet write failure bool failed; // set on packet write failure
struct recorder_queue queue; struct recorder_queue queue;
@ -46,22 +49,4 @@ recorder_init(struct recorder *recorder, const char *filename,
void void
recorder_destroy(struct recorder *recorder); recorder_destroy(struct recorder *recorder);
bool
recorder_open(struct recorder *recorder, const AVCodec *input_codec);
void
recorder_close(struct recorder *recorder);
bool
recorder_start(struct recorder *recorder);
void
recorder_stop(struct recorder *recorder);
void
recorder_join(struct recorder *recorder);
bool
recorder_push(struct recorder *recorder, const AVPacket *packet);
#endif #endif


@ -25,17 +25,21 @@
#include "server.h" #include "server.h"
#include "stream.h" #include "stream.h"
#include "tiny_xpm.h" #include "tiny_xpm.h"
#include "video_buffer.h"
#include "util/log.h" #include "util/log.h"
#include "util/net.h" #include "util/net.h"
#ifdef HAVE_V4L2
# include "v4l2_sink.h"
#endif
static struct server server; static struct server server;
static struct screen screen; static struct screen screen;
static struct fps_counter fps_counter; static struct fps_counter fps_counter;
static struct video_buffer video_buffer;
static struct stream stream; static struct stream stream;
static struct decoder decoder; static struct decoder decoder;
static struct recorder recorder; static struct recorder recorder;
#ifdef HAVE_V4L2
static struct sc_v4l2_sink v4l2_sink;
#endif
static struct controller controller; static struct controller controller;
static struct file_handler file_handler; static struct file_handler file_handler;
@ -247,9 +251,11 @@ scrcpy(const struct scrcpy_options *options) {
bool server_started = false; bool server_started = false;
bool fps_counter_initialized = false; bool fps_counter_initialized = false;
bool video_buffer_initialized = false;
bool file_handler_initialized = false; bool file_handler_initialized = false;
bool recorder_initialized = false; bool recorder_initialized = false;
#ifdef HAVE_V4L2
bool v4l2_sink_initialized = false;
#endif
bool stream_started = false; bool stream_started = false;
bool controller_initialized = false; bool controller_initialized = false;
bool controller_started = false; bool controller_started = false;
@ -305,11 +311,6 @@ scrcpy(const struct scrcpy_options *options) {
} }
fps_counter_initialized = true; fps_counter_initialized = true;
if (!video_buffer_init(&video_buffer, options->render_expired_frames)) {
goto end;
}
video_buffer_initialized = true;
if (options->control) { if (options->control) {
if (!file_handler_init(&file_handler, server.serial, if (!file_handler_init(&file_handler, server.serial,
options->push_target)) { options->push_target)) {
@ -318,7 +319,7 @@ scrcpy(const struct scrcpy_options *options) {
file_handler_initialized = true; file_handler_initialized = true;
} }
decoder_init(&decoder, &video_buffer); decoder_init(&decoder);
dec = &decoder; dec = &decoder;
} }
@ -336,7 +337,15 @@ scrcpy(const struct scrcpy_options *options) {
av_log_set_callback(av_log_callback); av_log_set_callback(av_log_callback);
stream_init(&stream, server.video_socket, dec, rec); stream_init(&stream, server.video_socket);
if (dec) {
stream_add_sink(&stream, &dec->packet_sink);
}
if (rec) {
stream_add_sink(&stream, &rec->packet_sink);
}
if (options->display) { if (options->display) {
if (options->control) { if (options->control) {
@ -368,12 +377,13 @@ scrcpy(const struct scrcpy_options *options) {
.fullscreen = options->fullscreen, .fullscreen = options->fullscreen,
}; };
if (!screen_init(&screen, &video_buffer, &fps_counter, if (!screen_init(&screen, &fps_counter, &screen_params)) {
&screen_params)) {
goto end; goto end;
} }
screen_initialized = true; screen_initialized = true;
decoder_add_sink(&decoder, &screen.frame_sink);
if (options->turn_screen_off) { if (options->turn_screen_off) {
struct control_msg msg; struct control_msg msg;
msg.type = CONTROL_MSG_TYPE_SET_SCREEN_POWER_MODE; msg.type = CONTROL_MSG_TYPE_SET_SCREEN_POWER_MODE;
@ -385,6 +395,18 @@ scrcpy(const struct scrcpy_options *options) {
} }
} }
#ifdef HAVE_V4L2
if (options->v4l2_device) {
if (!sc_v4l2_sink_init(&v4l2_sink, options->v4l2_device, frame_size)) {
goto end;
}
decoder_add_sink(&decoder, &v4l2_sink.frame_sink);
v4l2_sink_initialized = true;
}
#endif
// now we consumed the header values, the socket receives the video stream // now we consumed the header values, the socket receives the video stream
// start the stream // start the stream
if (!stream_start(&stream)) { if (!stream_start(&stream)) {
@ -397,12 +419,13 @@ scrcpy(const struct scrcpy_options *options) {
ret = event_loop(options); ret = event_loop(options);
LOGD("quit..."); LOGD("quit...");
// Close the window immediately on closing, because screen_destroy() may
// only be called once the stream thread is joined (it may take time)
screen_hide_window(&screen);
end: end:
// stop stream and controller so that they don't continue once their socket // The stream is not stopped explicitly, because it will stop by itself on
// is shutdown // end-of-stream
if (stream_started) {
stream_stop(&stream);
}
if (controller_started) { if (controller_started) {
controller_stop(&controller); controller_stop(&controller);
} }
@ -424,6 +447,12 @@ end:
stream_join(&stream); stream_join(&stream);
} }
#ifdef HAVE_V4L2
if (v4l2_sink_initialized) {
sc_v4l2_sink_destroy(&v4l2_sink);
}
#endif
// Destroy the screen only after the stream is guaranteed to be finished, // Destroy the screen only after the stream is guaranteed to be finished,
// because otherwise the screen could receive new frames after destruction // because otherwise the screen could receive new frames after destruction
if (screen_initialized) { if (screen_initialized) {
@ -446,10 +475,6 @@ end:
file_handler_destroy(&file_handler); file_handler_destroy(&file_handler);
} }
if (video_buffer_initialized) {
video_buffer_destroy(&video_buffer);
}
if (fps_counter_initialized) { if (fps_counter_initialized) {
fps_counter_join(&fps_counter); fps_counter_join(&fps_counter);
fps_counter_destroy(&fps_counter); fps_counter_destroy(&fps_counter);


@ -52,6 +52,7 @@ struct scrcpy_options {
const char *render_driver; const char *render_driver;
const char *codec_options; const char *codec_options;
const char *encoder_name; const char *encoder_name;
const char *v4l2_device;
enum sc_log_level log_level; enum sc_log_level log_level;
enum sc_record_format record_format; enum sc_record_format record_format;
struct sc_port_range port_range; struct sc_port_range port_range;
@ -72,7 +73,6 @@ struct scrcpy_options {
bool control; bool control;
bool display; bool display;
bool turn_screen_off; bool turn_screen_off;
bool render_expired_frames;
bool prefer_text; bool prefer_text;
bool window_borderless; bool window_borderless;
bool mipmaps; bool mipmaps;
@ -94,6 +94,7 @@ struct scrcpy_options {
.render_driver = NULL, \ .render_driver = NULL, \
.codec_options = NULL, \ .codec_options = NULL, \
.encoder_name = NULL, \ .encoder_name = NULL, \
.v4l2_device = NULL, \
.log_level = SC_LOG_LEVEL_INFO, \ .log_level = SC_LOG_LEVEL_INFO, \
.record_format = SC_RECORD_FORMAT_AUTO, \ .record_format = SC_RECORD_FORMAT_AUTO, \
.port_range = { \ .port_range = { \
@ -120,7 +121,6 @@ struct scrcpy_options {
.control = true, \ .control = true, \
.display = true, \ .display = true, \
.turn_screen_off = false, \ .turn_screen_off = false, \
.render_expired_frames = false, \
.prefer_text = false, \ .prefer_text = false, \
.window_borderless = false, \ .window_borderless = false, \
.mipmaps = true, \ .mipmaps = true, \


@ -13,6 +13,8 @@
#define DISPLAY_MARGINS 96 #define DISPLAY_MARGINS 96
#define DOWNCAST(SINK) container_of(SINK, struct screen, frame_sink)
static inline struct size static inline struct size
get_rotated_size(struct size size, int rotation) { get_rotated_size(struct size size, int rotation) {
struct size rotated_size; struct size rotated_size;
@ -191,27 +193,6 @@ screen_update_content_rect(struct screen *screen) {
} }
} }
static void
on_frame_available(struct video_buffer *vb, void *userdata) {
(void) vb;
(void) userdata;
static SDL_Event new_frame_event = {
.type = EVENT_NEW_FRAME,
};
// Post the event on the UI thread
SDL_PushEvent(&new_frame_event);
}
static void
on_frame_skipped(struct video_buffer *vb, void *userdata) {
(void) vb;
struct screen *screen = userdata;
fps_counter_add_skipped_frame(screen->fps_counter);
}
static inline SDL_Texture * static inline SDL_Texture *
create_texture(struct screen *screen) { create_texture(struct screen *screen) {
SDL_Renderer *renderer = screen->renderer; SDL_Renderer *renderer = screen->renderer;
@ -262,11 +243,58 @@ event_watcher(void *data, SDL_Event *event) {
} }
#endif #endif
static bool
screen_frame_sink_open(struct sc_frame_sink *sink) {
struct screen *screen = DOWNCAST(sink);
(void) screen;
#ifndef NDEBUG
screen->open = true;
#endif
// nothing to do, the screen is already open on the main thread
return true;
}
static void
screen_frame_sink_close(struct sc_frame_sink *sink) {
struct screen *screen = DOWNCAST(sink);
(void) screen;
#ifndef NDEBUG
screen->open = false;
#endif
// nothing to do, the screen lifecycle is not managed by the frame producer
}
static bool
screen_frame_sink_push(struct sc_frame_sink *sink, const AVFrame *frame) {
struct screen *screen = DOWNCAST(sink);
bool previous_frame_skipped;
bool ok = video_buffer_push(&screen->vb, frame, &previous_frame_skipped);
if (!ok) {
return false;
}
if (previous_frame_skipped) {
fps_counter_add_skipped_frame(screen->fps_counter);
// The EVENT_NEW_FRAME triggered for the previous frame will consume
// this new frame instead
} else {
static SDL_Event new_frame_event = {
.type = EVENT_NEW_FRAME,
};
// Post the event on the UI thread
SDL_PushEvent(&new_frame_event);
}
return true;
}
bool bool
screen_init(struct screen *screen, struct video_buffer *vb, screen_init(struct screen *screen, struct fps_counter *fps_counter,
struct fps_counter *fps_counter,
const struct screen_params *params) { const struct screen_params *params) {
screen->vb = vb;
screen->fps_counter = fps_counter; screen->fps_counter = fps_counter;
screen->resize_pending = false; screen->resize_pending = false;
@ -274,11 +302,11 @@ screen_init(struct screen *screen, struct video_buffer *vb,
screen->fullscreen = false; screen->fullscreen = false;
screen->maximized = false; screen->maximized = false;
static const struct video_buffer_callbacks cbs = { bool ok = video_buffer_init(&screen->vb);
.on_frame_available = on_frame_available, if (!ok) {
.on_frame_skipped = on_frame_skipped, LOGE("Could not initialize video buffer");
}; return false;
video_buffer_set_consumer_callbacks(vb, &cbs, screen); }
screen->frame_size = params->frame_size; screen->frame_size = params->frame_size;
screen->rotation = params->rotation; screen->rotation = params->rotation;
@ -324,6 +352,7 @@ screen_init(struct screen *screen, struct video_buffer *vb,
if (!screen->renderer) { if (!screen->renderer) {
LOGC("Could not create renderer: %s", SDL_GetError()); LOGC("Could not create renderer: %s", SDL_GetError());
SDL_DestroyWindow(screen->window); SDL_DestroyWindow(screen->window);
video_buffer_destroy(&screen->vb);
return false; return false;
} }
@ -375,6 +404,17 @@ screen_init(struct screen *screen, struct video_buffer *vb,
LOGC("Could not create texture: %s", SDL_GetError()); LOGC("Could not create texture: %s", SDL_GetError());
SDL_DestroyRenderer(screen->renderer); SDL_DestroyRenderer(screen->renderer);
SDL_DestroyWindow(screen->window); SDL_DestroyWindow(screen->window);
video_buffer_destroy(&screen->vb);
return false;
}
screen->frame = av_frame_alloc();
if (!screen->frame) {
LOGC("Could not create screen frame");
SDL_DestroyTexture(screen->texture);
SDL_DestroyRenderer(screen->renderer);
SDL_DestroyWindow(screen->window);
video_buffer_destroy(&screen->vb);
return false; return false;
} }
@ -393,6 +433,18 @@ screen_init(struct screen *screen, struct video_buffer *vb,
SDL_AddEventWatch(event_watcher, screen); SDL_AddEventWatch(event_watcher, screen);
#endif #endif
static const struct sc_frame_sink_ops ops = {
.open = screen_frame_sink_open,
.close = screen_frame_sink_close,
.push = screen_frame_sink_push,
};
screen->frame_sink.ops = &ops;
#ifndef NDEBUG
screen->open = false;
#endif
return true; return true;
} }
@ -401,11 +453,21 @@ screen_show_window(struct screen *screen) {
SDL_ShowWindow(screen->window); SDL_ShowWindow(screen->window);
} }
void
screen_hide_window(struct screen *screen) {
SDL_HideWindow(screen->window);
}
void void
screen_destroy(struct screen *screen) { screen_destroy(struct screen *screen) {
#ifndef NDEBUG
assert(!screen->open);
#endif
av_frame_free(&screen->frame);
SDL_DestroyTexture(screen->texture); SDL_DestroyTexture(screen->texture);
SDL_DestroyRenderer(screen->renderer); SDL_DestroyRenderer(screen->renderer);
SDL_DestroyWindow(screen->window); SDL_DestroyWindow(screen->window);
video_buffer_destroy(&screen->vb);
} }
static void static void
@ -510,7 +572,9 @@ update_texture(struct screen *screen, const AVFrame *frame) {
static bool static bool
screen_update_frame(struct screen *screen) { screen_update_frame(struct screen *screen) {
const AVFrame *frame = video_buffer_consumer_take_frame(screen->vb); av_frame_unref(screen->frame);
video_buffer_consume(&screen->vb, screen->frame);
AVFrame *frame = screen->frame;
fps_counter_add_rendered_frame(screen->fps_counter); fps_counter_add_rendered_frame(screen->fps_counter);


@ -9,11 +9,17 @@
#include "coords.h" #include "coords.h"
#include "opengl.h" #include "opengl.h"
#include "trait/frame_sink.h"
struct video_buffer; #include "video_buffer.h"
struct screen { struct screen {
struct video_buffer *vb; struct sc_frame_sink frame_sink; // frame sink trait
#ifndef NDEBUG
bool open; // track the open/close state to assert correct behavior
#endif
struct video_buffer vb;
struct fps_counter *fps_counter; struct fps_counter *fps_counter;
SDL_Window *window; SDL_Window *window;
@ -36,6 +42,8 @@ struct screen {
bool fullscreen; bool fullscreen;
bool maximized; bool maximized;
bool mipmaps; bool mipmaps;
AVFrame *frame;
}; };
struct screen_params { struct screen_params {
@ -58,14 +66,20 @@ struct screen_params {
// initialize screen, create window, renderer and texture (window is hidden) // initialize screen, create window, renderer and texture (window is hidden)
bool bool
screen_init(struct screen *screen, struct video_buffer *vb, screen_init(struct screen *screen, struct fps_counter *fps_counter,
struct fps_counter *fps_counter,
const struct screen_params *params); const struct screen_params *params);
// destroy window, renderer and texture (if any) // destroy window, renderer and texture (if any)
void void
screen_destroy(struct screen *screen); screen_destroy(struct screen *screen);
// hide the window
//
// It is used to hide the window immediately on closing without waiting for
// screen_destroy()
void
screen_hide_window(struct screen *screen);
// render the texture to the renderer // render the texture to the renderer
// //
// Set the update_content_rect flag if the window or content size may have // Set the update_content_rect flag if the window or content size may have


@@ -66,25 +66,11 @@ notify_stopped(void) {
 }
 
 static bool
-process_config_packet(struct stream *stream, AVPacket *packet) {
-    if (stream->recorder && !recorder_push(stream->recorder, packet)) {
-        LOGE("Could not send config packet to recorder");
-        return false;
-    }
-    return true;
-}
-
-static bool
-process_frame(struct stream *stream, AVPacket *packet) {
-    if (stream->decoder && !decoder_push(stream->decoder, packet)) {
-        return false;
-    }
-
-    if (stream->recorder) {
-        packet->dts = packet->pts;
-
-        if (!recorder_push(stream->recorder, packet)) {
-            LOGE("Could not send packet to recorder");
+push_packet_to_sinks(struct stream *stream, const AVPacket *packet) {
+    for (unsigned i = 0; i < stream->sink_count; ++i) {
+        struct sc_packet_sink *sink = stream->sinks[i];
+        if (!sink->ops->push(sink, packet)) {
+            LOGE("Could not send config packet to sink %d", i);
             return false;
         }
     }
@@ -111,9 +97,11 @@ stream_parse(struct stream *stream, AVPacket *packet) {
         packet->flags |= AV_PKT_FLAG_KEY;
     }
 
-    bool ok = process_frame(stream, packet);
+    packet->dts = packet->pts;
+
+    bool ok = push_packet_to_sinks(stream, packet);
     if (!ok) {
-        LOGE("Could not process frame");
+        LOGE("Could not process packet");
         return false;
     }
@@ -156,7 +144,7 @@ stream_push_packet(struct stream *stream, AVPacket *packet) {
     if (is_config) {
         // config packet
-        bool ok = process_config_packet(stream, packet);
+        bool ok = push_packet_to_sinks(stream, packet);
         if (!ok) {
             return false;
         }
@@ -177,6 +165,33 @@ stream_push_packet(struct stream *stream, AVPacket *packet) {
     return true;
 }
 
+static void
+stream_close_first_sinks(struct stream *stream, unsigned count) {
+    while (count) {
+        struct sc_packet_sink *sink = stream->sinks[--count];
+        sink->ops->close(sink);
+    }
+}
+
+static inline void
+stream_close_sinks(struct stream *stream) {
+    stream_close_first_sinks(stream, stream->sink_count);
+}
+
+static bool
+stream_open_sinks(struct stream *stream, const AVCodec *codec) {
+    for (unsigned i = 0; i < stream->sink_count; ++i) {
+        struct sc_packet_sink *sink = stream->sinks[i];
+        if (!sink->ops->open(sink, codec)) {
+            LOGE("Could not open packet sink %d", i);
+            stream_close_first_sinks(stream, i);
+            return false;
+        }
+    }
+
+    return true;
+}
+
 static int
 run_stream(void *data) {
     struct stream *stream = data;
@@ -193,27 +208,15 @@ run_stream(void *data) {
         goto end;
     }
 
-    if (stream->decoder && !decoder_open(stream->decoder, codec)) {
-        LOGE("Could not open decoder");
+    if (!stream_open_sinks(stream, codec)) {
+        LOGE("Could not open stream sinks");
         goto finally_free_codec_ctx;
     }
 
-    if (stream->recorder) {
-        if (!recorder_open(stream->recorder, codec)) {
-            LOGE("Could not open recorder");
-            goto finally_close_decoder;
-        }
-
-        if (!recorder_start(stream->recorder)) {
-            LOGE("Could not start recorder");
-            goto finally_close_recorder;
-        }
-    }
-
     stream->parser = av_parser_init(AV_CODEC_ID_H264);
     if (!stream->parser) {
         LOGE("Could not initialize parser");
-        goto finally_stop_and_join_recorder;
+        goto finally_close_sinks;
     }
 
     // We must only pass complete frames to av_parser_parse2()!
@@ -243,20 +246,8 @@ run_stream(void *data) {
     }
 
     av_parser_close(stream->parser);
-finally_stop_and_join_recorder:
-    if (stream->recorder) {
-        recorder_stop(stream->recorder);
-        LOGI("Finishing recording...");
-        recorder_join(stream->recorder);
-    }
-finally_close_recorder:
-    if (stream->recorder) {
-        recorder_close(stream->recorder);
-    }
-finally_close_decoder:
-    if (stream->decoder) {
-        decoder_close(stream->decoder);
-    }
+finally_close_sinks:
+    stream_close_sinks(stream);
 finally_free_codec_ctx:
     avcodec_free_context(&stream->codec_ctx);
 end:
@@ -265,12 +256,18 @@ end:
 }
 
 void
-stream_init(struct stream *stream, socket_t socket,
-            struct decoder *decoder, struct recorder *recorder) {
+stream_init(struct stream *stream, socket_t socket) {
     stream->socket = socket;
-    stream->decoder = decoder,
-    stream->recorder = recorder;
     stream->has_pending = false;
+    stream->sink_count = 0;
+}
+
+void
+stream_add_sink(struct stream *stream, struct sc_packet_sink *sink) {
+    assert(stream->sink_count < STREAM_MAX_SINKS);
+    assert(sink);
+    assert(sink->ops);
+    stream->sinks[stream->sink_count++] = sink;
 }
 
 bool
@@ -285,13 +282,6 @@ stream_start(struct stream *stream) {
     return true;
 }
 
-void
-stream_stop(struct stream *stream) {
-    if (stream->decoder) {
-        decoder_interrupt(stream->decoder);
-    }
-}
-
 void
 stream_join(struct stream *stream) {
     sc_thread_join(&stream->thread, NULL);


@@ -8,14 +8,19 @@
 #include <libavformat/avformat.h>
 #include <SDL2/SDL_atomic.h>
 
+#include "trait/packet_sink.h"
 #include "util/net.h"
 #include "util/thread.h"
 
+#define STREAM_MAX_SINKS 2
+
 struct stream {
     socket_t socket;
     sc_thread thread;
-    struct decoder *decoder;
-    struct recorder *recorder;
+
+    struct sc_packet_sink *sinks[STREAM_MAX_SINKS];
+    unsigned sink_count;
+
     AVCodecContext *codec_ctx;
     AVCodecParserContext *parser;
     // successive packets may need to be concatenated, until a non-config
@@ -25,15 +30,14 @@ struct stream {
 };
 
 void
-stream_init(struct stream *stream, socket_t socket,
-            struct decoder *decoder, struct recorder *recorder);
+stream_init(struct stream *stream, socket_t socket);
+
+void
+stream_add_sink(struct stream *stream, struct sc_packet_sink *sink);
 
 bool
 stream_start(struct stream *stream);
 
-void
-stream_stop(struct stream *stream);
-
 void
 stream_join(struct stream *stream);
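
To make the new API concrete, here is a minimal wiring sketch (not part of the diff): a caller such as scrcpy.c registers the packet sinks before starting the stream. The function and parameter names below are invented for illustration; the members through which the decoder and recorder expose their packet sink trait are not shown in this comparison.

#include "stream.h"

// Sketch only: decoder_sink and recorder_sink stand for the sc_packet_sink
// traits exposed by the decoder and the recorder.
static bool
setup_stream(struct stream *stream, socket_t video_socket,
             struct sc_packet_sink *decoder_sink,
             struct sc_packet_sink *recorder_sink /* may be NULL */) {
    stream_init(stream, video_socket);

    // Every registered sink receives every packet pushed by the stream
    stream_add_sink(stream, decoder_sink);
    if (recorder_sink) {
        stream_add_sink(stream, recorder_sink);
    }

    // STREAM_MAX_SINKS is 2, so both slots are used when recording; adding a
    // third sink would trip the assertion in stream_add_sink()
    return stream_start(stream);
}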


@@ -0,0 +1,26 @@
+#ifndef SC_FRAME_SINK
+#define SC_FRAME_SINK
+
+#include "common.h"
+
+#include <assert.h>
+#include <stdbool.h>
+
+typedef struct AVFrame AVFrame;
+
+/**
+ * Frame sink trait.
+ *
+ * Component able to receive AVFrames should implement this trait.
+ */
+struct sc_frame_sink {
+    const struct sc_frame_sink_ops *ops;
+};
+
+struct sc_frame_sink_ops {
+    bool (*open)(struct sc_frame_sink *sink);
+    void (*close)(struct sc_frame_sink *sink);
+    bool (*push)(struct sc_frame_sink *sink, const AVFrame *frame);
+};
+
+#endif


@@ -0,0 +1,27 @@
+#ifndef SC_PACKET_SINK
+#define SC_PACKET_SINK
+
+#include "common.h"
+
+#include <assert.h>
+#include <stdbool.h>
+
+typedef struct AVCodec AVCodec;
+typedef struct AVPacket AVPacket;
+
+/**
+ * Packet sink trait.
+ *
+ * Component able to receive AVPackets should implement this trait.
+ */
+struct sc_packet_sink {
+    const struct sc_packet_sink_ops *ops;
+};
+
+struct sc_packet_sink_ops {
+    bool (*open)(struct sc_packet_sink *sink, const AVCodec *codec);
+    void (*close)(struct sc_packet_sink *sink);
+    bool (*push)(struct sc_packet_sink *sink, const AVPacket *packet);
+};
+
+#endif
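
For reference, a minimal sketch (not part of the diff) of a component implementing the packet sink trait: it only counts the packets it receives. All names below are invented for illustration; the downcast macro mirrors the container_of()-based DOWNCAST() used by the v4l2 sink further down, written here with offsetof() so the sketch is self-contained.

#include <stddef.h>

#include "trait/packet_sink.h"

struct count_sink {
    struct sc_packet_sink packet_sink; // packet sink trait
    unsigned packets;
};

// Downcast from the trait to the concrete type
#define COUNT_SINK(SINK) \
    ((struct count_sink *) (((char *) (SINK)) \
        - offsetof(struct count_sink, packet_sink)))

static bool
count_sink_open(struct sc_packet_sink *sink, const AVCodec *codec) {
    (void) codec; // this sink does not need the codec
    COUNT_SINK(sink)->packets = 0;
    return true;
}

static void
count_sink_close(struct sc_packet_sink *sink) {
    (void) sink; // nothing to release
}

static bool
count_sink_push(struct sc_packet_sink *sink, const AVPacket *packet) {
    (void) packet;
    ++COUNT_SINK(sink)->packets;
    return true;
}

static const struct sc_packet_sink_ops count_sink_ops = {
    .open = count_sink_open,
    .close = count_sink_close,
    .push = count_sink_push,
};

void
count_sink_init(struct count_sink *cs) {
    cs->packet_sink.ops = &count_sink_ops;
    cs->packets = 0;
}

The frame sink trait above works the same way, except that open() takes no codec and push() receives an AVFrame instead of an AVPacket.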


@@ -140,6 +140,24 @@ parse_integer_with_suffix(const char *s, long *out) {
     return true;
 }
 
+bool
+strlist_contains(const char *list, char sep, const char *s) {
+    char *p;
+    do {
+        p = strchr(list, sep);
+
+        size_t token_len = p ? (size_t) (p - list) : strlen(list);
+        if (!strncmp(list, s, token_len)) {
+            return true;
+        }
+        if (p) {
+            list = p + 1;
+        }
+    } while (p);
+
+    return false;
+}
+
 size_t
 utf8_truncation_index(const char *utf8, size_t max_len) {
     size_t len = strlen(utf8);
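
A quick usage sketch of the new helper (not part of the diff). The muxer-name example mirrors how find_muxer() in the v4l2 sink below uses it: an AVOutputFormat name may list several comma-separated aliases, and strlist_contains() matches one token at a time.

#include <stdio.h>

#include "util/str_util.h"

int main(void) {
    const char *names = "video4linux2,v4l2";
    printf("%d\n", strlist_contains(names, ',', "v4l2"));        // 1
    printf("%d\n", strlist_contains(names, ',', "video4linux")); // 0
    return 0;
}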


@@ -43,6 +43,11 @@ parse_integers(const char *s, const char sep, size_t max_items, long *out);
 bool
 parse_integer_with_suffix(const char *s, long *out);
 
+// search s in the list separated by sep
+// for example, strlist_contains("a,bc,def", ',', "bc") returns true
+bool
+strlist_contains(const char *list, char sep, const char *s);
+
 // return the index to truncate a UTF-8 string at a valid position
 size_t
 utf8_truncation_index(const char *utf8, size_t max_len);

app/src/v4l2_sink.c (new file, 265 lines)

@ -0,0 +1,265 @@
#include "v4l2_sink.h"
#include "util/log.h"
#include "util/str_util.h"
/** Downcast frame_sink to sc_v4l2_sink */
#define DOWNCAST(SINK) container_of(SINK, struct sc_v4l2_sink, frame_sink)
static const AVOutputFormat *
find_muxer(const char *name) {
#ifdef SCRCPY_LAVF_HAS_NEW_MUXER_ITERATOR_API
void *opaque = NULL;
#endif
const AVOutputFormat *oformat = NULL;
do {
#ifdef SCRCPY_LAVF_HAS_NEW_MUXER_ITERATOR_API
oformat = av_muxer_iterate(&opaque);
#else
oformat = av_oformat_next(oformat);
#endif
// until null or containing the requested name
} while (oformat && !strlist_contains(oformat->name, ',', name));
return oformat;
}
static bool
encode_and_send_frame(struct sc_v4l2_sink *vs, const AVFrame *frame) {
int ret = avcodec_send_frame(vs->encoder_ctx, frame);
if (ret < 0 && ret != AVERROR(EAGAIN)) {
LOGE("Could not send v4l2 video frame: %d", ret);
return false;
}
AVPacket *packet = &vs->packet;
ret = avcodec_receive_packet(vs->encoder_ctx, packet);
if (ret == 0) {
// A packet was received
av_write_frame(vs->format_ctx, packet);
av_packet_unref(packet);
} else if (ret != AVERROR(EAGAIN)) {
LOGE("Could not receive v4l2 video packet: %d", ret);
return false;
}
return true;
}
static int
run_v4l2_sink(void *data) {
struct sc_v4l2_sink *vs = data;
for (;;) {
sc_mutex_lock(&vs->mutex);
while (!vs->stopped && vs->vb.pending_frame_consumed) {
sc_cond_wait(&vs->cond, &vs->mutex);
}
if (vs->stopped) {
sc_mutex_unlock(&vs->mutex);
break;
}
sc_mutex_unlock(&vs->mutex);
video_buffer_consume(&vs->vb, vs->frame);
bool ok = encode_and_send_frame(vs, vs->frame);
if (!ok) {
LOGE("Could not send frame to v4l2 sink");
break;
}
}
LOGD("V4l2 thread ended");
return 0;
}
static bool
sc_v4l2_sink_open(struct sc_v4l2_sink *vs) {
bool ok = video_buffer_init(&vs->vb);
if (!ok) {
return false;
}
ok = sc_mutex_init(&vs->mutex);
if (!ok) {
LOGC("Could not create mutex");
goto error_video_buffer_destroy;
}
ok = sc_cond_init(&vs->cond);
if (!ok) {
LOGC("Could not create cond");
goto error_mutex_destroy;
}
// FIXME
const AVOutputFormat *format = find_muxer("video4linux2,v4l2");
if (!format) {
LOGE("Could not find v4l2 muxer");
goto error_cond_destroy;
}
const AVCodec *encoder = avcodec_find_encoder(AV_CODEC_ID_RAWVIDEO);
if (!encoder) {
LOGE("Raw video encoder not found");
return false;
}
vs->format_ctx = avformat_alloc_context();
if (!vs->format_ctx) {
LOGE("Could not allocate v4l2 output context");
return false;
}
// contrary to the deprecated API (av_oformat_next()), av_muxer_iterate()
// returns (on purpose) a pointer-to-const, but AVFormatContext.oformat
// still expects a pointer-to-non-const (it has not be updated accordingly)
// <https://github.com/FFmpeg/FFmpeg/commit/0694d8702421e7aff1340038559c438b61bb30dd>
vs->format_ctx->oformat = (AVOutputFormat *) format;
AVStream *ostream = avformat_new_stream(vs->format_ctx, encoder);
if (!ostream) {
LOGE("Could not allocate new v4l2 stream");
goto error_avformat_free_context;
return false;
}
ostream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
ostream->codecpar->codec_id = encoder->id;
ostream->codecpar->format = AV_PIX_FMT_YUV420P;
ostream->codecpar->width = vs->frame_size.width;
ostream->codecpar->height = vs->frame_size.height;
int ret = avio_open(&vs->format_ctx->pb, vs->device_name, AVIO_FLAG_WRITE);
if (ret < 0) {
LOGE("Failed to open output device: %s", vs->device_name);
// ostream will be cleaned up during context cleaning
goto error_avformat_free_context;
}
vs->encoder_ctx = avcodec_alloc_context3(encoder);
if (!vs->encoder_ctx) {
LOGC("Could not allocate codec context for v4l2");
goto error_avio_close;
}
vs->encoder_ctx->width = vs->frame_size.width;
vs->encoder_ctx->height = vs->frame_size.height;
vs->encoder_ctx->pix_fmt = AV_PIX_FMT_YUV420P;
vs->encoder_ctx->time_base.num = 1;
vs->encoder_ctx->time_base.den = 1;
if (avcodec_open2(vs->encoder_ctx, encoder, NULL) < 0) {
LOGE("Could not open codec for v4l2");
goto error_avcodec_free_context;
}
vs->frame = av_frame_alloc();
if (!vs->frame) {
LOGE("Could not create v4l2 frame");
goto error_avcodec_close;
}
LOGD("Starting v4l2 thread");
ok = sc_thread_create(&vs->thread, run_v4l2_sink, "v4l2", vs);
if (!ok) {
LOGC("Could not start v4l2 thread");
goto error_av_frame_free;
}
LOGI("v4l2 sink started to device: %s", vs->device_name);
return true;
error_av_frame_free:
av_frame_free(&vs->frame);
error_avcodec_close:
avcodec_close(vs->encoder_ctx);
error_avcodec_free_context:
avcodec_free_context(&vs->encoder_ctx);
error_avio_close:
avio_close(vs->format_ctx->pb);
error_avformat_free_context:
avformat_free_context(vs->format_ctx);
error_cond_destroy:
sc_cond_destroy(&vs->cond);
error_mutex_destroy:
sc_mutex_destroy(&vs->mutex);
error_video_buffer_destroy:
video_buffer_destroy(&vs->vb);
return false;
}
static void
sc_v4l2_sink_close(struct sc_v4l2_sink *vs) {
av_frame_free(&vs->frame);
avcodec_close(vs->encoder_ctx);
avcodec_free_context(&vs->encoder_ctx);
avio_close(vs->format_ctx->pb);
avformat_free_context(vs->format_ctx);
sc_cond_destroy(&vs->cond);
sc_mutex_destroy(&vs->mutex);
video_buffer_destroy(&vs->vb);
}
static bool
sc_v4l2_sink_push(struct sc_v4l2_sink *vs, const AVFrame *frame) {
bool ok = video_buffer_push(&vs->vb, frame, NULL);
if (!ok) {
return false;
}
// signal possible change of vs->vb.pending_frame_consumed
sc_cond_signal(&vs->cond);
return true;
}
static bool
sc_v4l2_frame_sink_open(struct sc_frame_sink *sink) {
struct sc_v4l2_sink *vs = DOWNCAST(sink);
return sc_v4l2_sink_open(vs);
}
static void
sc_v4l2_frame_sink_close(struct sc_frame_sink *sink) {
struct sc_v4l2_sink *vs = DOWNCAST(sink);
sc_v4l2_sink_close(vs);
}
static bool
sc_v4l2_frame_sink_push(struct sc_frame_sink *sink, const AVFrame *frame) {
struct sc_v4l2_sink *vs = DOWNCAST(sink);
return sc_v4l2_sink_push(vs, frame);
}
bool
sc_v4l2_sink_init(struct sc_v4l2_sink *vs, const char *device_name,
struct size frame_size) {
vs->device_name = strdup(device_name);
if (!vs->device_name) {
LOGE("Could not strdup v4l2 device name");
return false;
}
vs->frame_size = frame_size;
static const struct sc_frame_sink_ops ops = {
.open = sc_v4l2_frame_sink_open,
.close = sc_v4l2_frame_sink_close,
.push = sc_v4l2_frame_sink_push,
};
vs->frame_sink.ops = &ops;
return true;
}
void
sc_v4l2_sink_destroy(struct sc_v4l2_sink *vs) {
free(vs->device_name);
}

app/src/v4l2_sink.h (new file, 38 lines)

@@ -0,0 +1,38 @@
+#ifndef SC_V4L2_SINK_H
+#define SC_V4L2_SINK_H
+
+#include "common.h"
+
+#include "coords.h"
+#include "trait/frame_sink.h"
+#include "video_buffer.h"
+
+#include <libavformat/avformat.h>
+
+struct sc_v4l2_sink {
+    struct sc_frame_sink frame_sink; // frame sink trait
+
+    struct video_buffer vb;
+    AVFormatContext *format_ctx;
+    AVCodecContext *encoder_ctx;
+
+    char *device_name;
+    struct size frame_size;
+
+    sc_thread thread;
+    sc_mutex mutex;
+    sc_cond cond;
+    bool stopped;
+
+    AVFrame *frame;
+    AVPacket packet;
+};
+
+bool
+sc_v4l2_sink_init(struct sc_v4l2_sink *vs, const char *device_name,
+                  struct size frame_size);
+
+void
+sc_v4l2_sink_destroy(struct sc_v4l2_sink *vs);
+
+#endif
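
A sketch (not part of the diff) of how the v4l2 sink could be plugged into the frame pipeline. decoder_add_sink() is a placeholder name: the function that registers a frame sink on the decoder is not shown in this comparison. At runtime the device would typically be a v4l2loopback node such as /dev/video2, but nothing in the sink itself depends on how that device was created.

#include "v4l2_sink.h"

// Sketch only: decoder_add_sink() is hypothetical
static struct sc_v4l2_sink v4l2_sink;

static bool
enable_v4l2(struct decoder *decoder, const char *device, struct size size) {
    if (!sc_v4l2_sink_init(&v4l2_sink, device, size)) {
        return false;
    }

    // sc_v4l2_sink_init() only copies parameters and installs the trait ops;
    // the encoder, muxer and thread are created when the open() trait
    // function is called by the frame producer
    decoder_add_sink(decoder, &v4l2_sink.frame_sink); // hypothetical helper
    return true;
}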


@@ -7,67 +7,36 @@
 #include "util/log.h"
 
 bool
-video_buffer_init(struct video_buffer *vb, bool wait_consumer) {
-    vb->producer_frame = av_frame_alloc();
-    if (!vb->producer_frame) {
-        goto error_0;
-    }
-
+video_buffer_init(struct video_buffer *vb) {
     vb->pending_frame = av_frame_alloc();
     if (!vb->pending_frame) {
-        goto error_1;
+        return false;
     }
 
-    vb->consumer_frame = av_frame_alloc();
-    if (!vb->consumer_frame) {
-        goto error_2;
+    vb->tmp_frame = av_frame_alloc();
+    if (!vb->tmp_frame) {
+        av_frame_free(&vb->pending_frame);
+        return false;
     }
 
     bool ok = sc_mutex_init(&vb->mutex);
     if (!ok) {
-        goto error_3;
-    }
-
-    vb->wait_consumer = wait_consumer;
-    if (wait_consumer) {
-        ok = sc_cond_init(&vb->pending_frame_consumed_cond);
-        if (!ok) {
-            sc_mutex_destroy(&vb->mutex);
-            goto error_2;
-        }
-        // interrupted is not used if wait_consumer is disabled since offering
-        // a frame will never block
-        vb->interrupted = false;
+        av_frame_free(&vb->pending_frame);
+        av_frame_free(&vb->tmp_frame);
+        return false;
     }
 
     // there is initially no frame, so consider it has already been consumed
     vb->pending_frame_consumed = true;
 
-    // The callbacks must be set by the consumer via
-    // video_buffer_set_consumer_callbacks()
-    vb->cbs = NULL;
-
     return true;
-
-error_3:
-    av_frame_free(&vb->consumer_frame);
-error_2:
-    av_frame_free(&vb->pending_frame);
-error_1:
-    av_frame_free(&vb->producer_frame);
-error_0:
-    return false;
 }
 
 void
 video_buffer_destroy(struct video_buffer *vb) {
-    if (vb->wait_consumer) {
-        sc_cond_destroy(&vb->pending_frame_consumed_cond);
-    }
     sc_mutex_destroy(&vb->mutex);
-    av_frame_free(&vb->consumer_frame);
     av_frame_free(&vb->pending_frame);
-    av_frame_free(&vb->producer_frame);
+    av_frame_free(&vb->tmp_frame);
 }
 
 static inline void
@@ -77,71 +46,43 @@ swap_frames(AVFrame **lhs, AVFrame **rhs) {
     *rhs = tmp;
 }
 
-void
-video_buffer_set_consumer_callbacks(struct video_buffer *vb,
-                                    const struct video_buffer_callbacks *cbs,
-                                    void *cbs_userdata) {
-    assert(!vb->cbs); // must be set only once
-    assert(cbs);
-    assert(cbs->on_frame_available);
-
-    vb->cbs = cbs;
-    vb->cbs_userdata = cbs_userdata;
-}
-
-void
-video_buffer_producer_offer_frame(struct video_buffer *vb) {
-    assert(vb->cbs);
-
+bool
+video_buffer_push(struct video_buffer *vb, const AVFrame *frame,
+                  bool *previous_frame_skipped) {
     sc_mutex_lock(&vb->mutex);
-    if (vb->wait_consumer) {
-        // wait for the current (expired) frame to be consumed
-        while (!vb->pending_frame_consumed && !vb->interrupted) {
-            sc_cond_wait(&vb->pending_frame_consumed_cond, &vb->mutex);
-        }
+
+    // Use a temporary frame to preserve pending_frame in case of error.
+    // tmp_frame is an empty frame, no need to call av_frame_unref() beforehand.
+    int r = av_frame_ref(vb->tmp_frame, frame);
+    if (r) {
+        LOGE("Could not ref frame: %d", r);
+        return false;
     }
 
-    av_frame_unref(vb->pending_frame);
-    swap_frames(&vb->producer_frame, &vb->pending_frame);
+    // Now that av_frame_ref() succeeded, we can replace the previous
+    // pending_frame
+    swap_frames(&vb->pending_frame, &vb->tmp_frame);
+    av_frame_unref(vb->tmp_frame);
 
-    bool skipped = !vb->pending_frame_consumed;
+    if (previous_frame_skipped) {
+        *previous_frame_skipped = !vb->pending_frame_consumed;
+    }
     vb->pending_frame_consumed = false;
 
     sc_mutex_unlock(&vb->mutex);
 
-    if (skipped) {
-        if (vb->cbs->on_frame_skipped)
-            vb->cbs->on_frame_skipped(vb, vb->cbs_userdata);
-    } else {
-        vb->cbs->on_frame_available(vb, vb->cbs_userdata);
-    }
+    return true;
 }
 
-const AVFrame *
-video_buffer_consumer_take_frame(struct video_buffer *vb) {
+void
+video_buffer_consume(struct video_buffer *vb, AVFrame *dst) {
     sc_mutex_lock(&vb->mutex);
     assert(!vb->pending_frame_consumed);
     vb->pending_frame_consumed = true;
 
-    swap_frames(&vb->consumer_frame, &vb->pending_frame);
-    av_frame_unref(vb->pending_frame);
+    av_frame_move_ref(dst, vb->pending_frame);
+    // av_frame_move_ref() resets its source frame, so no need to call
+    // av_frame_unref()
 
-    if (vb->wait_consumer) {
-        // unblock video_buffer_offer_decoded_frame()
-        sc_cond_signal(&vb->pending_frame_consumed_cond);
-    }
     sc_mutex_unlock(&vb->mutex);
-
-    // consumer_frame is only written from this thread, no need to lock
-    return vb->consumer_frame;
-}
-
-void
-video_buffer_interrupt(struct video_buffer *vb) {
-    if (vb->wait_consumer) {
-        sc_mutex_lock(&vb->mutex);
-        vb->interrupted = true;
-        sc_mutex_unlock(&vb->mutex);
-        // wake up blocking wait
-        sc_cond_signal(&vb->pending_frame_consumed_cond);
-    }
 }
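
To make the new push/consume contract concrete, a small sketch (not part of the diff); the function names are illustrative. The producer keeps only the latest frame, and the optional skipped flag reports when a pending frame was replaced before being consumed.

#include "video_buffer.h"

// Producer side (e.g. the decoder thread): push the latest decoded frame
static void
producer_push(struct video_buffer *vb, const AVFrame *frame,
              unsigned *skipped_count) {
    bool skipped;
    if (!video_buffer_push(vb, frame, &skipped)) {
        return; // av_frame_ref() failed; the previous pending frame is kept
    }
    if (skipped) {
        ++*skipped_count; // the previous frame was dropped unconsumed
    }
    // then wake up the consumer, as the v4l2 sink above does with its cond
}

// Consumer side (e.g. the rendering thread), once woken up; only call when a
// frame is pending, since video_buffer_consume() asserts it
static void
consumer_take(struct video_buffer *vb, AVFrame *dst) {
    video_buffer_consume(vb, dst); // moves the pending frame into dst
    // use dst, then av_frame_unref(dst) before the next call
}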


@@ -12,71 +12,39 @@
 typedef struct AVFrame AVFrame;
 
 /**
- * There are 3 frames in memory:
- * - one frame is held by the producer (producer_frame)
- * - one frame is held by the consumer (consumer_frame)
- * - one frame is shared between the producer and the consumer (pending_frame)
+ * A video buffer holds 1 pending frame, which is the last frame received from
+ * the producer (typically, the decoder).
  *
- * The producer generates a frame into the producer_frame (it may takes time).
+ * If a pending frame has not been consumed when the producer pushes a new
+ * frame, then it is lost. The intent is to always provide access to the very
+ * last frame to minimize latency.
  *
- * Once the frame is produced, it calls video_buffer_producer_offer_frame(),
- * which swaps the producer and pending frames.
- *
- * When the consumer is notified that a new frame is available, it calls
- * video_buffer_consumer_take_frame() to retrieve it, which swaps the pending
- * and consumer frames. The frame is valid until the next call, without
- * blocking the producer.
+ * The producer and the consumer typically do not live in the same thread.
+ * That's the reason why the callback on_frame_available() does not provide the
+ * frame as parameter: the consumer might post an event to its own thread to
+ * retrieve the pending frame from there, and that frame may have changed since
+ * the callback if producer pushed a new one in between.
 */
 struct video_buffer {
-    AVFrame *producer_frame;
     AVFrame *pending_frame;
-    AVFrame *consumer_frame;
+    AVFrame *tmp_frame; // To preserve the pending frame on error
 
     sc_mutex mutex;
-    bool wait_consumer; // never overwrite a pending frame if it is not consumed
-    bool interrupted;
-    sc_cond pending_frame_consumed_cond;
     bool pending_frame_consumed;
-
-    const struct video_buffer_callbacks *cbs;
-    void *cbs_userdata;
-};
-
-struct video_buffer_callbacks {
-    // Called when a new frame can be consumed by
-    // video_buffer_consumer_take_frame(vb)
-    // This callback is mandatory (it must not be NULL).
-    void (*on_frame_available)(struct video_buffer *vb, void *userdata);
-
-    // Called when a pending frame has been overwritten by the producer
-    // This callback is optional (it may be NULL).
-    void (*on_frame_skipped)(struct video_buffer *vb, void *userdata);
 };
 
 bool
-video_buffer_init(struct video_buffer *vb, bool wait_consumer);
+video_buffer_init(struct video_buffer *vb);
 
 void
 video_buffer_destroy(struct video_buffer *vb);
 
-void
-video_buffer_set_consumer_callbacks(struct video_buffer *vb,
-                                    const struct video_buffer_callbacks *cbs,
-                                    void *cbs_userdata);
-
-// set the producer frame as ready for consuming
+bool
+video_buffer_push(struct video_buffer *vb, const AVFrame *frame, bool *skipped);
+
 void
-video_buffer_producer_offer_frame(struct video_buffer *vb);
+video_buffer_consume(struct video_buffer *vb, AVFrame *dst);
 
-// mark the consumer frame as consumed and return it
-// the frame is valid until the next call to this function
-const AVFrame *
-video_buffer_consumer_take_frame(struct video_buffer *vb);
-
-// wake up and avoid any blocking call
-void
-video_buffer_interrupt(struct video_buffer *vb);
-
 #endif


@@ -58,7 +58,6 @@ static void test_options(void) {
         "--push-target", "/sdcard/Movies",
         "--record", "file",
         "--record-format", "mkv",
-        "--render-expired-frames",
         "--serial", "0123456789abcdef",
         "--show-touches",
         "--turn-screen-off",
@@ -87,7 +86,6 @@ static void test_options(void) {
     assert(!strcmp(opts->push_target, "/sdcard/Movies"));
     assert(!strcmp(opts->record_filename, "file"));
    assert(opts->record_format == SC_RECORD_FORMAT_MKV);
-    assert(opts->render_expired_frames);
     assert(!strcmp(opts->serial, "0123456789abcdef"));
     assert(opts->show_touches);
     assert(opts->turn_screen_off);


@@ -146,14 +146,18 @@ static void test_serialize_inject_scroll_event(void) {
 static void test_serialize_back_or_screen_on(void) {
     struct control_msg msg = {
         .type = CONTROL_MSG_TYPE_BACK_OR_SCREEN_ON,
+        .back_or_screen_on = {
+            .action = AKEY_EVENT_ACTION_UP,
+        },
     };
 
     unsigned char buf[CONTROL_MSG_MAX_SIZE];
     size_t size = control_msg_serialize(&msg, buf);
-    assert(size == 1);
+    assert(size == 2);
 
     const unsigned char expected[] = {
         CONTROL_MSG_TYPE_BACK_OR_SCREEN_ON,
+        0x01, // AKEY_EVENT_ACTION_UP
     };
     assert(!memcmp(buf, expected, sizeof(expected)));
 }


@@ -287,6 +287,18 @@ static void test_parse_integer_with_suffix(void) {
     assert(!ok);
 }
 
+static void test_strlist_contains(void) {
+    assert(strlist_contains("a,bc,def", ',', "bc"));
+    assert(!strlist_contains("a,bc,def", ',', "b"));
+    assert(strlist_contains("", ',', ""));
+    assert(strlist_contains("abc,", ',', ""));
+    assert(strlist_contains(",abc", ',', ""));
+    assert(strlist_contains("abc,,def", ',', ""));
+    assert(!strlist_contains("abc", ',', ""));
+    assert(strlist_contains(",,|x", '|', ",,"));
+    assert(strlist_contains("xyz", '\0', "xyz"));
+}
+
 int main(int argc, char *argv[]) {
     (void) argc;
     (void) argv;
@@ -304,5 +316,6 @@ int main(int argc, char *argv[]) {
     test_parse_integer();
     test_parse_integers();
     test_parse_integer_with_suffix();
+    test_strlist_contains();
     return 0;
 }


@@ -71,6 +71,13 @@ public final class ControlMessage {
         return msg;
     }
 
+    public static ControlMessage createBackOrScreenOn(int action) {
+        ControlMessage msg = new ControlMessage();
+        msg.type = TYPE_BACK_OR_SCREEN_ON;
+        msg.action = action;
+        return msg;
+    }
+
     public static ControlMessage createSetClipboard(String text, boolean paste) {
         ControlMessage msg = new ControlMessage();
         msg.type = TYPE_SET_CLIPBOARD;


@@ -11,6 +11,7 @@ public class ControlMessageReader {
     static final int INJECT_KEYCODE_PAYLOAD_LENGTH = 13;
     static final int INJECT_TOUCH_EVENT_PAYLOAD_LENGTH = 27;
    static final int INJECT_SCROLL_EVENT_PAYLOAD_LENGTH = 20;
+    static final int BACK_OR_SCREEN_ON_LENGTH = 1;
     static final int SET_SCREEN_POWER_MODE_PAYLOAD_LENGTH = 1;
     static final int SET_CLIPBOARD_FIXED_PAYLOAD_LENGTH = 1;
@@ -66,13 +67,15 @@
             case ControlMessage.TYPE_INJECT_SCROLL_EVENT:
                 msg = parseInjectScrollEvent();
                 break;
+            case ControlMessage.TYPE_BACK_OR_SCREEN_ON:
+                msg = parseBackOrScreenOnEvent();
+                break;
             case ControlMessage.TYPE_SET_CLIPBOARD:
                 msg = parseSetClipboard();
                 break;
             case ControlMessage.TYPE_SET_SCREEN_POWER_MODE:
                 msg = parseSetScreenPowerMode();
                 break;
-            case ControlMessage.TYPE_BACK_OR_SCREEN_ON:
             case ControlMessage.TYPE_EXPAND_NOTIFICATION_PANEL:
             case ControlMessage.TYPE_COLLAPSE_NOTIFICATION_PANEL:
             case ControlMessage.TYPE_GET_CLIPBOARD:
@@ -150,6 +153,14 @@
         return ControlMessage.createInjectScrollEvent(position, hScroll, vScroll);
     }
 
+    private ControlMessage parseBackOrScreenOnEvent() {
+        if (buffer.remaining() < BACK_OR_SCREEN_ON_LENGTH) {
+            return null;
+        }
+        int action = toUnsigned(buffer.get());
+        return ControlMessage.createBackOrScreenOn(action);
+    }
+
     private ControlMessage parseSetClipboard() {
         if (buffer.remaining() < SET_CLIPBOARD_FIXED_PAYLOAD_LENGTH) {
             return null;


@@ -101,7 +101,7 @@ public class Controller {
                 break;
             case ControlMessage.TYPE_BACK_OR_SCREEN_ON:
                 if (device.supportsInputEvents()) {
-                    pressBackOrTurnScreenOn();
+                    pressBackOrTurnScreenOn(msg.getAction());
                 }
                 break;
             case ControlMessage.TYPE_EXPAND_NOTIFICATION_PANEL:
@@ -255,12 +255,22 @@
         }, 200, TimeUnit.MILLISECONDS);
     }
 
-    private boolean pressBackOrTurnScreenOn() {
-        int keycode = Device.isScreenOn() ? KeyEvent.KEYCODE_BACK : KeyEvent.KEYCODE_POWER;
-        if (keepPowerModeOff && keycode == KeyEvent.KEYCODE_POWER) {
+    private boolean pressBackOrTurnScreenOn(int action) {
+        if (Device.isScreenOn()) {
+            return device.injectKeyEvent(action, KeyEvent.KEYCODE_BACK, 0, 0);
+        }
+
+        // Screen is off
+        // Only press POWER on ACTION_DOWN
+        if (action != KeyEvent.ACTION_DOWN) {
+            // do nothing,
+            return true;
+        }
+
+        if (keepPowerModeOff) {
             schedulePowerModeOff();
         }
-        return device.injectKeycode(keycode);
+        return device.injectKeycode(KeyEvent.KEYCODE_POWER);
     }
 
     private boolean setClipboard(String text, boolean paste) {


@@ -154,6 +154,7 @@ public class ControlMessageReaderTest {
         ByteArrayOutputStream bos = new ByteArrayOutputStream();
         DataOutputStream dos = new DataOutputStream(bos);
         dos.writeByte(ControlMessage.TYPE_BACK_OR_SCREEN_ON);
+        dos.writeByte(KeyEvent.ACTION_UP);
         byte[] packet = bos.toByteArray();
@@ -161,6 +162,7 @@
         ControlMessage event = reader.next();
         Assert.assertEquals(ControlMessage.TYPE_BACK_OR_SCREEN_ON, event.getType());
+        Assert.assertEquals(KeyEvent.ACTION_DOWN, event.getAction());
     }
 
     @Test