The official GStreamer source tree contains an open-source demo that implements audio/video calls on top of the webrtcbin plugin. This article shows how to set up the environment and use it on Ubuntu.
1. Environment Setup
1.1 Installing dependencies
GStreamer itself is assumed to be installed already; here we only install the libraries that the webrtcbin plugin needs (see the official documentation). Ubuntu 18.04 or newer is recommended.
First, install the following dependencies:
sudo apt-get install -y gstreamer1.0-tools gstreamer1.0-nice gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-plugins-good libgstreamer1.0-dev git libglib2.0-dev libgstreamer-plugins-bad1.0-dev libsoup2.4-dev libjson-glib-dev
The GStreamer project officially recommends building with meson and ninja (see the official docs):
sudo apt-get install meson ninja-build
If you want to build and run inside a sandboxed environment, the project recommends hotdoc; see the official web pages for installation and usage.
The system used here is Ubuntu 22.04, with GStreamer 1.20.
1.2 Building and running
The gst-webrtc demo lives in the gst-examples directory of the official GStreamer sources. Download the release tarball for the matching version, or clone it with git.
cd /path/gst-examples
meson build
If the environment is set up correctly, meson prints output like the following:
The Meson build system
Version: 1.2.1
Source dir: /home/cht/gst/gst-examples
Build dir: /home/cht/gst/gst-examples/reconfigure
Build type: native build
Project name: gst-examples
Project version: 1.19.2
C compiler for the host machine: cc (gcc 11.4.0 "cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0")
C linker for the host machine: cc ld.bfd 2.38
Host machine cpu family: x86_64
Host machine cpu: x86_64
Library m found: YES
Found pkg-config: /usr/bin/pkg-config (0.29.2)
Run-time dependency glib-2.0 found: YES 2.72.4
Run-time dependency gio-2.0 found: YES 2.72.4
Run-time dependency gobject-2.0 found: YES 2.72.4
Run-time dependency gmodule-2.0 found: YES 2.72.4
Run-time dependency gstreamer-1.0 found: YES 1.20.3
Run-time dependency gstreamer-play-1.0 found: YES 1.20.3
Run-time dependency gstreamer-tag-1.0 found: YES 1.20.1
Run-time dependency gstreamer-webrtc-1.0 found: YES 1.20.3
Run-time dependency gstreamer-sdp-1.0 found: YES 1.20.1
Run-time dependency gstreamer-rtp-1.0 found: YES 1.20.1
Dependency gstreamer-play-1.0 found: YES 1.20.3 (cached)
Found CMake: /usr/bin/cmake (3.22.1)
Run-time dependency gtk+-3.0 found: NO (tried pkgconfig and cmake)
Run-time dependency x11 found: YES 1.7.5
Dependency gstreamer-sdp-1.0 found: YES 1.20.1 (cached)
Run-time dependency libsoup-2.4 found: YES 2.74.2
Run-time dependency json-glib-1.0 found: YES 1.6.6
Program openssl found: YES (/usr/bin/openssl)
Program generate_cert.sh found: YES (/home/cht/gst/gst-examples/webrtc/signalling/generate_cert.sh)
Program configure_test_check.py found: YES (/usr/bin/python3 /home/cht/gst/gst-examples/webrtc/check/configure_test_check.py)
WARNING: You should add the boolean check kwarg to the run_command call.
It currently defaults to false,
but it will default to true in future releases of meson.
See also: https://github.com/mesonbuild/meson/issues/9300
Build targets in project: 7
Found ninja-1.11.1.git.kitware.jobserver-1 at /home/cht/.local/bin/ninja
WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
Meson checks whether all build-time dependencies are in place before compiling. Note that the webrtc example project has only been updated to 1.19. If you use an older GStreamer such as 1.18, the binary built with meson/ninja may fail at run time complaining about missing libnice plugins. In that case, clone https://gitlab.freedesktop.org/libnice/libnice and build it manually, again with meson and ninja; after the build, install the libgstnice.so plugin into the default GStreamer plugin path, /lib/x86_64-linux-gnu/gstreamer-1.0.
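For reference, a manual libnice build roughly follows the steps below; the prefix and plugin path are illustrative and may differ on your distribution:

```shell
# Build libnice (and its GStreamer plugin) with meson/ninja.
# Prefix and paths are illustrative; adjust for your system.
git clone https://gitlab.freedesktop.org/libnice/libnice.git
cd libnice
meson setup build --prefix=/usr
ninja -C build
sudo ninja -C build install
# If needed, copy the plugin into GStreamer's default plugin directory:
# sudo cp build/gst/libgstnice.so /lib/x86_64-linux-gnu/gstreamer-1.0/
# Verify that GStreamer can now see the nice elements:
gst-inspect-1.0 nice
```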
If meson succeeds, build and run:
ninja -C build
cd gst-examples/build/webrtc/sendrecv/gst
./webrtc-sendrecv --peer-id=xxxx
The peer-id is a random number generated by the page at https://webrtc.nirbheek.in, shown in the screenshot below.
When webrtc-sendrecv is started with --peer-id, the id must be provided by the remote side; with --our-id, the remote side can initiate the connection instead.
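The two modes look like this on the command line (the id values here are placeholders, not real session ids):

```shell
# Caller mode: connect to the id shown by the web page (placeholder value)
./webrtc-sendrecv --peer-id=1234
# Callee mode: register under our own id and wait for the remote to dial in
./webrtc-sendrecv --our-id=5678
```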
After a successful run, the web page displays the remote stream: the video pushed from the GStreamer side is produced by videotestsrc, and for easier demonstration a stopwatch timer (timeoverlay) is overlaid on it. For audio, autoaudiosrc can capture from the sound card even when its exact type is unknown.
By default, the web page side streams the computer's built-in camera and microphone. The Ubuntu side looks like this:
This gives a simple peer-to-peer audio/video intercom using the webrtcbin plugin in GStreamer.
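Before involving WebRTC at all, the test sources mentioned above can be previewed locally with gst-launch-1.0; this is only a sanity-check sketch, not the demo's actual pipeline:

```shell
# Local preview of the demo's sources: videotestsrc with a stopwatch
# overlay, plus autoaudiosrc capture (two independent branches).
gst-launch-1.0 videotestsrc ! timeoverlay ! videoconvert ! autovideosink \
  autoaudiosrc ! audioconvert ! audioresample ! autoaudiosink
```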
2. Source Code Analysis: Capture and Playback
2.1 Capture
The WebRTC negotiation flow itself is not analyzed here.
The capture part is set up in start_pipeline().
Compared with the upstream project, the pipeline here has been modified: upstream uses VP8 software encoding and an audiotestsrc that only produces test noise. Since many platforms support H.264 hardware encoding, the encoder was switched, e.g. to mpph264enc on Rockchip platforms. This requires adding a macro:
#define RTP_CAPS_H264 "application/x-rtp,media=video,encoding-name=H264,payload="
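As a sketch, the modified video branch corresponds to the following pipeline; x264enc stands in here for a platform encoder such as mpph264enc, and payload=96 is an assumed dynamic payload type:

```shell
# H.264 video branch sketch: encode, RTP-payload, and apply caps that
# match RTP_CAPS_H264. fakesink stands in for webrtcbin here.
gst-launch-1.0 videotestsrc ! timeoverlay ! videoconvert \
  ! x264enc tune=zerolatency ! rtph264pay \
  ! "application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! fakesink
```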
2.2 Playback
static void
handle_media_stream (GstPad * pad, GstElement * pipe, const char *convert_name,
    const char *sink_name)
{
  GstPad *qpad;
  GstElement *q, *conv, *resample, *sink;
  GstPadLinkReturn ret;

  gst_println ("Trying to handle stream with %s ! %s", convert_name, sink_name);

  q = gst_element_factory_make ("queue", NULL);
  g_assert_nonnull (q);
  conv = gst_element_factory_make (convert_name, NULL);
  g_assert_nonnull (conv);
  sink = gst_element_factory_make (sink_name, NULL);
  g_assert_nonnull (sink);

  if (g_strcmp0 (convert_name, "audioconvert") == 0) {
    /* Might also need to resample, so add it just in case.
     * Will be a no-op if it's not required. */
    resample = gst_element_factory_make ("audioresample", NULL);
    g_assert_nonnull (resample);
    gst_bin_add_many (GST_BIN (pipe), q, conv, resample, sink, NULL);
    gst_element_sync_state_with_parent (q);
    gst_element_sync_state_with_parent (conv);
    gst_element_sync_state_with_parent (resample);
    gst_element_sync_state_with_parent (sink);
    gst_element_link_many (q, conv, resample, sink, NULL);
  } else {
    gst_bin_add_many (GST_BIN (pipe), q, conv, sink, NULL);
    gst_element_sync_state_with_parent (q);
    gst_element_sync_state_with_parent (conv);
    gst_element_sync_state_with_parent (sink);
    gst_element_link_many (q, conv, sink, NULL);
  }

  qpad = gst_element_get_static_pad (q, "sink");
  ret = gst_pad_link (pad, qpad);
  g_assert_cmphex (ret, ==, GST_PAD_LINK_OK);
}

static void
on_incoming_decodebin_stream (GstElement * decodebin, GstPad * pad,
    GstElement * pipe)
{
  GstCaps *caps;
  const gchar *name;

  if (!gst_pad_has_current_caps (pad)) {
    gst_printerr ("Pad '%s' has no caps, can't do anything, ignoring\n",
        GST_PAD_NAME (pad));
    return;
  }

  caps = gst_pad_get_current_caps (pad);
  name = gst_structure_get_name (gst_caps_get_structure (caps, 0));

  if (g_str_has_prefix (name, "video")) {
    handle_media_stream (pad, pipe, "videoconvert", "waylandsink");
    // handle_media_stream (pad, pipe, "videoconvert", "autovideosink");
  } else if (g_str_has_prefix (name, "audio")) {
    handle_media_stream (pad, pipe, "audioconvert", "autoaudiosink");
  } else {
    gst_printerr ("Unknown pad %s, ignoring", GST_PAD_NAME (pad));
  }
}

static void
on_incoming_stream (GstElement * webrtc, GstPad * pad, GstElement * pipe)
{
  GstElement *decodebin;
  GstPad *sinkpad;

  if (GST_PAD_DIRECTION (pad) != GST_PAD_SRC)
    return;

  decodebin = gst_element_factory_make ("decodebin", NULL);
  g_signal_connect (decodebin, "pad-added",
      G_CALLBACK (on_incoming_decodebin_stream), pipe);
  gst_bin_add (GST_BIN (pipe), decodebin);
  gst_element_sync_state_with_parent (decodebin);

  sinkpad = gst_element_get_static_pad (decodebin, "sink");
  gst_pad_link (pad, sinkpad);
  gst_object_unref (sinkpad);
}
As the code shows, video decoding uses decodebin. For the audio branch, the check if (g_strcmp0 (convert_name, "audioconvert") == 0) inserts an audioresample element, so that whatever audio format arrives from the remote side can be handled generically (the resampler is a no-op when it is not needed).
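The audio branch that handle_media_stream assembles is equivalent to the pipeline below; audiotestsrc stands in for the decoded incoming pad:

```shell
# Equivalent of the audio playback branch; audiotestsrc replaces the
# decodebin pad, and audioresample passes through when rates already match.
gst-launch-1.0 audiotestsrc num-buffers=100 ! queue ! audioconvert \
  ! audioresample ! autoaudiosink
```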
Summary
This completes the walkthrough of implementing audio/video intercom with the webrtcbin plugin in GStreamer.