How to Implement a Video Widget in Qt That Builds Upon GStreamer

How to implement a video widget in Qt that builds upon GStreamer?

To connect GStreamer to your QWidget, get the native window handle with QWidget::winId() and pass it to gst_x_overlay_set_xwindow_id(). (This is the GStreamer 0.10 API; in GStreamer 1.0 the equivalent call is gst_video_overlay_set_window_handle(), used in the complete example further below.)

Rough sample code:

    sink = gst_element_factory_make("xvimagesink", "sink");
    gst_element_set_state(sink, GST_STATE_READY);

    /* Get the window id first, then flush the X connection,
       then hand the id to GStreamer (see the ordering note below). */
    gulong win_id = widget->winId();
    QApplication::syncX();
    gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(sink), win_id);

You will also want the widget to be backed by a native window, which is achieved by setting the Qt::AA_NativeWindows attribute at the application level or the Qt::WA_NativeWindow attribute at the widget level.
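For example, a minimal sketch (widget stands for whatever QWidget hosts the video):

    // Force native windows for every widget in the application...
    QApplication::setAttribute(Qt::AA_NativeWindows);
    // ...or only for the single widget that hosts the video:
    widget->setAttribute(Qt::WA_NativeWindow);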

Targeting a Qt child widget with GStreamer

It does in fact work with a QWidget. However, the call to QApplication::syncX() must come AFTER the call to winId():

    /* Wrong order: winId() is evaluated after syncX(), so the native
       window may not exist on the X server yet. */
    QApplication::syncX();
    gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(xvsink), someWidget->winId());

    /* Right order */
    unsigned long win_id = someWidget->winId();
    QApplication::syncX();
    gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(xvsink), win_id);

How to properly use GStreamer in a QWidget?

I could play a test video inside the QWidget with the method below. Be aware that picking a compatible sink is essential. For more information about sinks (and other elements), run gst-inspect-1.0 | findstr sink (Windows) or gst-inspect-1.0 | grep sink (Linux) in a terminal.
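Before the full example, a short sketch of probing for a usable sink at runtime; the element names are only examples, and which ones exist depends on your platform and installed plugins:

    /* gst_element_factory_make() returns NULL when an element is not
       available, so sinks can be tried in order of preference. */
    GstElement *sink = gst_element_factory_make("xvimagesink", NULL);
    if (!sink)
        sink = gst_element_factory_make("glimagesink", NULL);
    if (!sink)
        g_printerr("No suitable video sink found\n");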

    #include <gst/gst.h>
    #include <gst/video/videooverlay.h>
    #include <QApplication>
    #include <QTimer>
    #include <QWidget>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        app.connect(&app, SIGNAL(lastWindowClosed()), &app, SLOT(quit()));

        // g_thread_init() is no longer needed with GStreamer 1.0 (GLib >= 2.32)
        gst_init(&argc, &argv);

        // Prepare the pipeline
        GstElement *pipeline = gst_pipeline_new("xvoverlay");
        GstElement *src = gst_element_factory_make("videotestsrc", NULL);
        GstElement *sink = gst_element_factory_make("glimagesink", NULL);
        gst_bin_add_many(GST_BIN(pipeline), src, sink, NULL);
        gst_element_link(src, sink);

        // Get more diagnostic output
        gst_debug_set_active(TRUE);
        gst_debug_set_default_threshold(GST_LEVEL_WARNING);

        QWidget window;
        window.resize(320, 240);
        window.show();

        // Hand the widget's native window handle to the overlay sink
        WId xwinid = window.winId();
        gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink),
                                            (guintptr)xwinid);

        // Run the pipeline
        GstStateChangeReturn sret = gst_element_set_state(pipeline,
                                                          GST_STATE_PLAYING);
        if (sret == GST_STATE_CHANGE_FAILURE) {
            gst_element_set_state(pipeline, GST_STATE_NULL);
            gst_object_unref(pipeline);
            pipeline = NULL; // avoid a double unref below
            // Exit the application as soon as the event loop starts
            QTimer::singleShot(0, &app, SLOT(quit()));
        }

        int ret = app.exec();

        window.hide();
        if (pipeline) {
            gst_element_set_state(pipeline, GST_STATE_NULL);
            gst_object_unref(pipeline);
        }

        return ret;
    }
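For reference, an example like this typically builds with g++ main.cpp `pkg-config --cflags --libs gstreamer-1.0 gstreamer-video-1.0 Qt5Widgets` -fPIC -o videowidget, though package and module names may differ on your system.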

The output inside the widget: [screenshot of the videotestsrc test pattern playing inside the QWidget]

Embed separate video streams in Qt widgets on embedded Linux using GStreamer

I wanted to share the solution we went for in this particular problem, in case it can help anyone who has a similar issue.

After failing with all solutions that required missing libraries (QtMultimedia, qmlglsink, etc.), or that just didn't work for unknown reasons, I learned about framebuffers - which are basically just layers for the GPU as far as I'm concerned - and how to use them in this case.

It turns out the embedded Linux device I've been working with has 3 framebuffers, which allowed us to split the application into a "background" framebuffer for video stream playback and a "foreground" framebuffer for the overlay display. The overlay (the Qt MainWindow) needed to be transparent wherever we wanted the background video to show through. For this we used alpha blending and a color key.

After testing the individual parts of this solution, we ended up with an app that launches two pipelines (because I want 2 cameras displayed on screen at once, each of which can be switched to another stream using an input-selector). The pipeline structure looked like this, for example:

    input-selector name=selector ! decodebin ! textoverlay name=text0 ! queue !
        imxg2dvideosink framebuffer=/dev/fb0 name=end0 force-aspect-ratio=false
        window-x-coord=0 window-y-coord=0 window-width=512 window-height=473
    rtspsrc location=rtsp://10.0.1.1:8554/stream name=src0 ! queue name=qs_0 ! selector.sink_0
    rtspsrc location=rtsp://10.0.1.1:8556/stream name=src2 ! queue name=qs_2 ! selector.sink_1
    rtspsrc location=rtsp://10.0.1.1:8558/stream name=src4 ! queue name=qs_4 ! selector.sink_2

We pass the framebuffer property to the sink so that it sends the video to framebuffer 0, while the application itself is displayed on framebuffer 1, which appears on top of fb0. To achieve this, we simply set the QT_QPA_EGLFS_FB environment variable to /dev/fb1 before launching the app executable, since our device runs with the EGLFS platform plugin.
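A minimal sketch of how such a pipeline can be launched and switched from the application, assuming GStreamer 1.0 (gst_parse_launch() and the input-selector's active-pad property are standard API; the pipeline string below is a shortened version of the description above, and the function names are illustrative):

    #include <gst/gst.h>

    // Build the pipeline from its textual description (shortened here;
    // use the full description shown above).
    static GstElement *launchPipeline()
    {
        GError *error = NULL;
        GstElement *pipeline = gst_parse_launch(
            "input-selector name=selector ! decodebin ! "
            "imxg2dvideosink framebuffer=/dev/fb0 "
            "rtspsrc location=rtsp://10.0.1.1:8554/stream ! queue ! selector.sink_0 "
            "rtspsrc location=rtsp://10.0.1.1:8556/stream ! queue ! selector.sink_1",
            &error);
        if (!pipeline) {
            g_printerr("Failed to create pipeline: %s\n", error->message);
            g_clear_error(&error);
            return NULL;
        }
        gst_element_set_state(pipeline, GST_STATE_PLAYING);
        return pipeline;
    }

    // Switch the input-selector to another of its sink pads, e.g. "sink_1".
    static void switchStream(GstElement *pipeline, const char *padName)
    {
        GstElement *selector = gst_bin_get_by_name(GST_BIN(pipeline), "selector");
        GstPad *pad = gst_element_get_static_pad(selector, padName);
        g_object_set(selector, "active-pad", pad, NULL);
        gst_object_unref(pad);
        gst_object_unref(selector);
    }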

For the alpha blending and color keying part, we had to do this in the app :

    #include <fcntl.h>
    #include <linux/mxcfb.h>
    #include <sys/ioctl.h>

    ...

    // Open the overlay framebuffer fb1
    int fb = open("/dev/fb1", O_RDWR);
    if (fb < 0)
        qWarning() << "Error, framebuffer cannot be opened";

    // Enable alpha
    struct mxcfb_gbl_alpha alphaStruct;
    alphaStruct.enable = 1;
    alphaStruct.alpha = 255;
    if (ioctl(fb, MXCFB_SET_GBL_ALPHA, &alphaStruct) < 0)
        qWarning() << "Error, framebuffer alpha cannot be set";

    // Set color key to pure blue
    struct mxcfb_color_key colorKeyStruct;
    guint32 colorKeyValue = g_ascii_strtoull("0x0000FF", NULL, 16);
    colorKeyStruct.color_key = colorKeyValue;
    colorKeyStruct.enable = 1;
    if (ioctl(fb, MXCFB_SET_CLR_KEY, &colorKeyStruct) < 0)
        qWarning() << "Error, framebuffer color key cannot be set";

    ...

Basically this opens the framebuffer that the overlay app runs on, enables alpha on it, and then sets one color (pure blue) as the transparent color key. Every pixel drawn with this exact color value will instead show the video running in the background.
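On the Qt side, the area that should show video then simply has to be painted in the key color. A minimal sketch, assuming a hypothetical videoArea widget:

    // Fill the video region with the color key (pure blue, matching
    // the MXCFB_SET_CLR_KEY value set above).
    videoArea->setAutoFillBackground(true);
    QPalette pal = videoArea->palette();
    pal.setColor(QPalette::Window, QColor(0x00, 0x00, 0xFF));
    videoArea->setPalette(pal);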

So now we have an app that plays video streams through a custom GStreamer pipeline with a hardware-accelerated video sink.


