mirror of
https://github.com/aler9/rtsp-simple-server
synced 2026-04-22 23:17:11 +08:00
docs: update (#5471)
@@ -48,14 +48,14 @@ There are four image variants:

The `1` tag corresponds to the latest `1.x.x` release, that should guarantee backward compatibility when upgrading. It is also possible to bind the image to a specific release, by using the release name as tag (`bluenviron/mediamtx:{docker_version_tag}`).

-The base image does not contain any utility, in order to minimize size and frequency of updates. If you need additional software (like curl, wget, GStreamer), you can build a custom image by using the _MediaMTX_ image as a base stage, by creating a file name `Dockerfile` with this content:
+The base image does not contain any utility, in order to minimize size and frequency of updates. If you need additional software (like curl, wget, GStreamer), you can build a custom image by using the _MediaMTX_ image as a base stage, by creating a file named `Dockerfile` with this content:

```
FROM bluenviron/mediamtx:1 AS mediamtx
FROM ubuntu:24.04

COPY --from=mediamtx /mediamtx /
-COPY --from=mediamtx.yml /
+COPY --from=mediamtx /mediamtx.yml /

RUN apt update && apt install -y \
(insert additional utilities here)
@@ -16,7 +16,7 @@ Live streams can be published to the server with the following protocols and codecs:

| [RTMP cameras and servers](#rtmp-cameras-and-servers) | RTMP, RTMPS, Enhanced RTMP | **Video**: AV1, VP9, H265, H264<br/>**Audio**: Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G711 (PCMA, PCMU), LPCM |
| [HLS cameras and servers](#hls-cameras-and-servers) | Low-Latency HLS, MP4-based HLS, legacy HLS | **Video**: AV1, VP9, H265, H264<br/>**Audio**: Opus, MPEG-4 Audio (AAC) |
| [MPEG-TS](#mpeg-ts) | MPEG-TS over UDP, MPEG-TS over Unix socket | **Video**: H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video<br/>**Audio**: Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3<br/>**Other**: KLV |
-| [RTP](#rtp) | RTP over UDP, RTP over Unix socket | **Video**: AV1, VP9, VP8, H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video, M-JPEG<br/>**Audio**: Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G726, G722, G711 (PCMA, PCMU), LPCM<br/>**Other**: KLV, MPEG-TS, any RTP-compatible codec |
+| [RTP](#rtp) | RTP over UDP | **Video**: AV1, VP9, VP8, H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video, M-JPEG<br/>**Audio**: Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G726, G722, G711 (PCMA, PCMU), LPCM<br/>**Other**: KLV, MPEG-TS, any RTP-compatible codec |

We provide instructions for publishing with the following devices:
@@ -28,7 +28,8 @@ We provide instructions for publishing with the following software:

- [FFmpeg](#ffmpeg)
- [GStreamer](#gstreamer)
- [OBS Studio](#obs-studio)
-- [OpenCV](#opencv)
+- [Python and OpenCV](#python-and-opencv)
+- [Golang](#golang)
- [Unity](#unity)
- [Web browsers](#web-browsers)
@@ -112,7 +113,7 @@ rtsp://localhost:8554/mystream

The resulting stream will be available on path `/mystream`.

-Some clients that can publish with RTSP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio), [OpenCV](#opencv).
+Some clients that can publish with RTSP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio), [Python and OpenCV](#python-and-opencv).

### RTSP cameras and servers
@@ -447,6 +448,8 @@ ffmpeg -re -stream_loop -1 -i file.mp4 -c copy -f mpegts 'udp://238.0.0.1:1234?p

#### FFmpeg and MPEG-TS over Unix socket

In _MediaMTX_ configuration, add a path with `source: unix+mpegts:///tmp/socket.sock`. Then:

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
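The framing used on such a socket can be illustrated with a short Python sketch. This is a toy stand-in for FFmpeg and MediaMTX, under assumptions: it uses a datagram Unix socket and a temporary path instead of `/tmp/socket.sock`, and the payload is not a valid MPEG-TS mux, it only shows the 188-byte packet and 1316-byte datagram framing.

```python
import os
import socket
import tempfile

SOCK_PATH = os.path.join(tempfile.mkdtemp(), "socket.sock")

# Receiver stands in for MediaMTX, assumed here to bind the Unix
# datagram socket declared in its path configuration.
rx = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
rx.bind(SOCK_PATH)

# Sender stands in for FFmpeg: MPEG-TS packets are 188 bytes each and
# start with the sync byte 0x47; 7 of them fit a 1316-byte datagram.
ts_packet = bytes([0x47]) + bytes(187)
datagram = ts_packet * 7

tx = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
tx.sendto(datagram, SOCK_PATH)

received = rx.recv(2048)
print(len(received), hex(received[0]))
tx.close()
rx.close()
```

A real publisher would of course send a proper MPEG-TS mux, as the FFmpeg command above does.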
@@ -463,14 +466,6 @@ ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \

-f rtp udp://238.0.0.1:1234?pkt_size=1316
```

-#### FFmpeg and RTP over Unix socket
-
-```sh
-ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
--c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
--f rtp unix:/tmp/socket.sock
-```

#### FFmpeg and SRT

```sh
@@ -557,12 +552,14 @@ In `Settings -> Stream` (or in the Auto-configuration Wizard), use the following parameters:

Save the configuration and click `Start streaming`.

The resulting stream will be available on path `/mystream`.

If you want to generate a stream that can be read with WebRTC, open `Settings -> Output -> Recording` and use the following parameters:

- FFmpeg output type: `Output to URL`
- File path or URL: `rtsp://localhost:8554/mystream`
- Container format: `rtsp`
-- Check `show all codecs (even if potentically incompatible)`
+- Check `show all codecs (even if potentially incompatible)`
- Video encoder: `h264_nvenc (libx264)`
- Video encoder settings (if any): `bf=0`
- Audio track: `1`
@@ -570,6 +567,84 @@ If you want to generate a stream that can be read with WebRTC, open `Settings -> Output -> Recording` and use the following parameters:

Then use the button `Start Recording` (instead of `Start Streaming`) to start streaming.

#### OBS Studio and RTMP, multitrack video

OBS Studio can publish multiple video tracks or renditions at once. Make sure that the OBS Studio version is ≥ 31.0.0. Open `Settings -> Stream` and use the following parameters:

- Service: `Custom...`
- Server: `rtmp://localhost/mystream`
- Stream key: (empty)
- Turn on `Enable Multitrack Video`
- Leave `Maximum Streaming Bandwidth` and `Maximum Video Tracks` to `Auto`
- Turn on `Enable Config Override`
- Fill `Config Override (JSON)` with the following text:

```json
{
  "encoder_configurations": [
    {
      "type": "obs_x264",
      "width": 1920,
      "height": 1080,
      "framerate": {
        "numerator": 30,
        "denominator": 1
      },
      "settings": {
        "rate_control": "CBR",
        "bitrate": 6000,
        "keyint_sec": 2,
        "preset": "veryfast",
        "profile": "high",
        "tune": "zerolatency"
      },
      "canvas_index": 0
    },
    {
      "type": "obs_x264",
      "width": 640,
      "height": 480,
      "framerate": {
        "numerator": 30,
        "denominator": 1
      },
      "settings": {
        "rate_control": "CBR",
        "bitrate": 3000,
        "keyint_sec": 2,
        "preset": "veryfast",
        "profile": "main",
        "tune": "zerolatency"
      },
      "canvas_index": 0
    }
  ],
  "audio_configurations": {
    "live": [
      {
        "codec": "ffmpeg_aac",
        "track_id": 1,
        "channels": 2,
        "settings": {
          "bitrate": 160
        }
      }
    ]
  }
}
```

This can be adjusted according to specific needs. In particular, the `type` field is used to set the video encoder, and these are the available values:

- `obs_nvenc_av1_tex`: NVIDIA NVENC AV1
- `obs_nvenc_hevc_tex`: NVIDIA NVENC H265
- `obs_nvenc_h264_tex`: NVIDIA NVENC H264
- `av1_texture_amf`: AMD AV1
- `h265_texture_amf`: AMD H265
- `h264_texture_amf`: AMD H264
- `obs_qsv11_av1`: QuickSync AV1
- `obs_qsv11_v2`: QuickSync H264
- `obs_x264`: software H264

#### OBS Studio and WebRTC

Recent versions of OBS Studio can also publish to the server with the [WebRTC / WHIP protocol](#webrtc-clients). Use the following parameters:
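As a sanity check before pasting the override into OBS, the JSON can be validated programmatically. This is a sketch that only checks the fields used in the example above; the accepted schema is defined by OBS itself, and the helper name is ours:

```python
import json

def encoder_types(text: str) -> list:
    """Parse the override JSON, check the fields used in the example
    above, and return the video encoder type of each rendition."""
    cfg = json.loads(text)
    types = []
    for enc in cfg["encoder_configurations"]:
        # every rendition needs a codec type, a resolution and a framerate
        assert {"type", "width", "height", "framerate", "settings"} <= enc.keys()
        assert enc["framerate"]["denominator"] != 0
        types.append(enc["type"])
    for audio in cfg["audio_configurations"]["live"]:
        assert audio["codec"] and audio["track_id"] >= 1
    return types

sample = json.dumps({
    "encoder_configurations": [{
        "type": "obs_x264", "width": 1920, "height": 1080,
        "framerate": {"numerator": 30, "denominator": 1},
        "settings": {"rate_control": "CBR", "bitrate": 6000},
        "canvas_index": 0,
    }],
    "audio_configurations": {"live": [{
        "codec": "ffmpeg_aac", "track_id": 1, "channels": 2,
        "settings": {"bitrate": 160},
    }]},
})
print(encoder_types(sample))  # ['obs_x264']
```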
@@ -581,9 +656,9 @@ Save the configuration and click `Start streaming`.

The resulting stream will be available on path `/mystream`.

-### OpenCV
+### Python and OpenCV

-Software which uses the OpenCV library can publish to the server through its GStreamer plugin, as a [RTSP client](#rtsp-clients). It must be compiled with support for GStreamer, by following this procedure:
+Python-based software can publish to the server with the OpenCV library and its GStreamer plugin, acting as a [RTSP client](#rtsp-clients). OpenCV must be compiled with support for GStreamer, by following this procedure:

```sh
sudo apt install -y libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev gstreamer1.0-plugins-ugly gstreamer1.0-rtsp python3-dev python3-numpy
@@ -603,7 +678,7 @@ python3 -c 'import cv2; print(cv2.getBuildInformation())'

Check that the output contains `GStreamer: YES`.

-Videos can be published with `cv2.VideoWriter`:
+Videos can then be published with `cv2.VideoWriter`:

```python
from datetime import datetime
@@ -656,6 +731,10 @@ while True:

The resulting stream will be available on path `/mystream`.

### Golang

You can publish to the server from the Go programming language by using [gortsplib](https://github.com/bluenviron/gortsplib), a RTSP client/server library, and [gortmplib](https://github.com/bluenviron/gortmplib), a RTMP client/server library. Both power _MediaMTX_ itself. In the repositories of these projects there are several examples on how to connect to a server and push data.

### Unity

Software written with the Unity Engine can publish a stream to the server by using the [WebRTC protocol](#webrtc-clients).
@@ -115,7 +115,7 @@ The server can produce HLS streams with a variety of video and audio codecs (tha

You can check what codecs your browser can read with HLS by [using this tool](https://jsfiddle.net/tjcyv5aw/).

-If you want to support most browsers, you can to re-encode the stream by using H264 and AAC codecs, for instance by using FFmpeg:
+If you want to support most browsers, you can re-encode the stream by using H264 and AAC codecs, for instance by using FFmpeg:

```sh
ffmpeg -i rtsp://original-source \
@@ -199,7 +199,7 @@ ffmpeg -i 'srt://localhost:8890?streamid=read:test' -c copy output.mp4

### GStreamer

-GStreamer can read a stream from the server in several way. The recommended one consists in reading with RTSP.
+GStreamer can read a stream from the server in several ways. The recommended one consists in reading with RTSP.

#### GStreamer and RTSP
@@ -239,7 +239,7 @@ audio-caps="application/x-rtp,media=audio,encoding-name=OPUS,payload=111,clock-r

### VLC

-VLC can read a stream from the server in several way. The recommended one consists in reading with RTSP:
+VLC can read a stream from the server in several ways. The recommended one consists in reading with RTSP:

```sh
vlc --network-caching=50 rtsp://localhost:8554/mystream
@@ -34,7 +34,7 @@ There are several ways to change the configuration:

MTX_PATHS_TEST_SOURCE=rtsp://myurl ./mediamtx
```

-Parameters in lists can be overridden in the same way as parameters in maps, using their position like an additional key. This is particularly useful if you want to use internal users but define credentials through enviroment variables:
+Parameters in lists can be overridden in the same way as parameters in maps, using their position like an additional key. This is particularly useful if you want to use internal users but define credentials through environment variables:

```
MTX_AUTHINTERNALUSERS_0_USER=username
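The naming convention can be illustrated with a small helper that derives a variable name from a configuration path. This is only an illustration of the pattern shown in the examples above, not code from MediaMTX, and the helper name is ours:

```python
def env_var_name(*path) -> str:
    """Derive an override variable name from a configuration path.

    Keys are uppercased and joined with underscores after the MTX_
    prefix; list positions are used like an additional key.
    """
    return "MTX_" + "_".join(str(p).upper() for p in path)

# Map parameter: paths -> test -> source
print(env_var_name("paths", "test", "source"))       # MTX_PATHS_TEST_SOURCE
# List parameter: authInternalUsers[0].user
print(env_var_name("authInternalUsers", 0, "user"))  # MTX_AUTHINTERNALUSERS_0_USER
```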
@@ -2,7 +2,7 @@

Live streams can be embedded into an external website by using the WebRTC or HLS protocol. Before embedding, check that the stream is ready and can be accessed with the intended protocol by using the URLs mentioned in [Read a stream](read).

-## WebRTC
+## WebRTC in iframe

The simplest way to embed a live stream in a web page, using the WebRTC protocol, consists in adding an `<iframe>` tag to the body section of the HTML:
@@ -23,7 +23,9 @@ The iframe method is fit for most use cases, but it has some limitations:

- it doesn't allow to pass credentials (username, password or token) from the website to _MediaMTX_; credentials are asked directly to users.
- it doesn't allow to directly access the video tag, to extract data from it, or to perform dynamic actions.

-In order to overcome these limitations, it is possible to load the stream directly inside a `<video>` tag in the web page, through a JavaScript library.
+## WebRTC with JavaScript
+
+In order to overcome the limitations of the iframe-based method, it is possible to load the stream directly inside a `<video>` tag in the web page, through a JavaScript library.

Download [reader.js](https://github.com/bluenviron/mediamtx/blob/{version_tag}/internal/servers/webrtc/reader.js) from the repository and serve it together with the other assets of the website.
@@ -80,7 +82,15 @@ After the video tag, add a script that initializes the stream when the page is fully loaded:

</script>
```

-## HLS
+If _MediaMTX_ is hosted on a different domain with respect to the website (in the sample code this is implied), you need to set the `webrtcAllowOrigins` parameter in the configuration file. For example, to allow requests from `https://example.com`:
+
+```yaml
+webrtcAllowOrigins: ["https://example.com"]
+```
+
+The parameter also supports wildcards, for instance `['http://*.example.com']`.
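The wildcard form can be reasoned about with ordinary glob matching. This is an illustration only, assuming glob-style semantics for the pattern; MediaMTX's actual matching rules are defined by its implementation and may differ:

```python
from fnmatch import fnmatch

def origin_allowed(origin: str, allowed: list) -> bool:
    """Glob-style check of a request Origin against allowed patterns."""
    return any(fnmatch(origin, pattern) for pattern in allowed)

allowed = ["https://example.com", "http://*.example.com"]
print(origin_allowed("http://app.example.com", allowed))  # True
print(origin_allowed("https://evil.com", allowed))        # False
```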
+## HLS in iframe

Reading a stream with the HLS protocol introduces some latency, but is usually easier to set up since it doesn't involve managing additional ports that in WebRTC are used to transmit the stream.
@@ -103,7 +113,9 @@ The iframe method is fit for most use cases, but it has some limitations:

- it doesn't allow to pass credentials (username, password or token) from the website to _MediaMTX_; credentials are asked directly to users.
- it doesn't allow to directly access the video tag, to extract data from it, or to perform dynamic actions.

-In order to overcome these limitations, it is possible to load the stream directly inside a `<video>` tag in the web page, through the _hls.js_ library.
+## HLS with JavaScript
+
+In order to overcome the limitations of the iframe-based method, it is possible to load the stream directly inside a `<video>` tag in the web page, through the _hls.js_ library.

If you are using a JavaScript bundler, you can import _hls.js_ by adding [its npm package](https://www.npmjs.com/package/hls.js) as a dependency and then importing it:
@@ -136,7 +148,7 @@ After the video tag, add a script that initializes the stream when the page is fully loaded:

xhrSetup: function (xhr, url) {
  let user = ""; // fill if needed
  let pass = ""; // fill if needed
-  let token = ""; // fil if needed
+  let token = ""; // fill if needed

  if (user !== "") {
    const credentials = btoa(`${user}:${pass}`);
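The `btoa` call above builds the value of an HTTP Basic `Authorization` header. For illustration, the same construction in Python:

```python
import base64

def basic_auth_header(user: str, password: str) -> str:
    """Build the header value that btoa(`${user}:${pass}`) produces in the browser."""
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {credentials}"

print(basic_auth_header("username", "password"))
# Basic dXNlcm5hbWU6cGFzc3dvcmQ=
```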
@@ -156,3 +168,11 @@ After the video tag, add a script that initializes the stream when the page is fully loaded:

});
</script>
```

+If _MediaMTX_ is hosted on a different domain with respect to the website (in the sample code this is implied), you need to set the `hlsAllowOrigins` parameter in the configuration file. For example:
+
+```yaml
+hlsAllowOrigins: ["https://example.com"]
+```
+
+The parameter also supports wildcards, for instance `['http://*.example.com']`.
@@ -6,7 +6,7 @@ The server can be queried and controlled with an API, that can be enabled by toggling

api: yes
```

-To obtain a list of of active paths, run:
+To obtain a list of active paths, run:

```
curl http://127.0.0.1:9997/v3/paths/list
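The same query can be made from Python. This is a sketch that assumes the API is enabled and listening on its default address, as in the curl example above:

```python
import json
import urllib.request

API_BASE = "http://127.0.0.1:9997"  # default API address, as in the curl example

def paths_list_url(base: str = API_BASE) -> str:
    """Endpoint that returns the list of active paths."""
    return f"{base}/v3/paths/list"

def list_paths(base: str = API_BASE) -> dict:
    """Fetch the active paths (requires a running MediaMTX with api: yes)."""
    with urllib.request.urlopen(paths_list_url(base)) as resp:
        return json.load(resp)

print(paths_list_url())
# list_paths() would return the decoded JSON response when the server is up.
```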
@@ -1,10 +1,10 @@

# SRT-specific features

-SRT is a protocol that can be used for publishing and reading streams. Regarding specific tasks, see [Publish](publish#srt-clients) and [Read](read#srt). Features in these page are shared among both tasks.
+SRT is a protocol that can be used for publishing and reading streams. Regarding specific tasks, see [Publish](publish#srt-clients) and [Read](read#srt). Features in this page are shared among both tasks.

## Standard stream ID syntax

-In SRT, the stream ID is a string that is sent to the remote part in order to advertise what action the caller is gonna do (publish or read), the path and the credentials. All these informations have to be encoded into a single string. This server supports two stream ID syntaxes, a custom one (that is the one reported in rest of the README) and also a [standard one](https://github.com/Haivision/srt/blob/master/docs/features/access-control.md) proposed by the authors of the protocol and enforced by some hardware. The standard syntax can be used in this way:
+In SRT, the stream ID is a string that is sent to the remote party in order to advertise what action the caller is going to perform (publish or read), the path and the credentials. All this information has to be encoded into a single string. This server supports two stream ID syntaxes, a custom one (the one reported in the rest of the README) and also a [standard one](https://github.com/Haivision/srt/blob/master/docs/features/access-control.md) proposed by the authors of the protocol and enforced by some hardware. The standard syntax can be used in this way:

```
srt://localhost:8890?streamid=#!::m=publish,r=mypath,u=myuser,s=mypass&pkt_size=1316
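The standard syntax can also be generated programmatically. A small Python helper, for illustration only (the helper name is ours; key meanings follow the access-control document linked above):

```python
def standard_stream_id(mode: str, path: str, user: str = "", password: str = "") -> str:
    """Build a stream ID in the standard "#!::" key=value syntax."""
    pairs = [("m", mode), ("r", path)]
    if user:
        pairs.append(("u", user))
    if password:
        pairs.append(("s", password))
    return "#!::" + ",".join(f"{k}={v}" for k, v in pairs)

sid = standard_stream_id("publish", "mypath", "myuser", "mypass")
print(f"srt://localhost:8890?streamid={sid}&pkt_size=1316")
```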
@@ -1,6 +1,6 @@

# WebRTC-specific features

-WebRTC is a protocol that can be used for publishing and reading streams. Regarding specific tasks, see [Publish](publish#webrtc-clients) and [Read](read#webrtc). Features in these page are shared among both tasks.
+WebRTC is a protocol that can be used for publishing and reading streams. Regarding specific tasks, see [Publish](publish#webrtc-clients) and [Read](read#webrtc). Features in this page are shared among both tasks.

## Codec support in browsers
@@ -19,7 +19,7 @@ In particular, reading and publishing H265 tracks with WebRTC was not possible until

You can check what codecs your browser can publish or read with WebRTC by [using this tool](https://jsfiddle.net/v24s8q1f/).

-If you want to support most browsers, you can to re-encode the stream by using H264 and Opus codecs, for instance by using FFmpeg:
+If you want to support most browsers, you can re-encode the stream by using H264 and Opus codecs, for instance by using FFmpeg:

```sh
ffmpeg -i rtsp://original-source \
@@ -1,10 +1,10 @@

# RTSP-specific features

-RTSP is a protocol that can be used for publishing and reading streams. Regarding specific tasks, see [Publish](publish#rtsp-clients) and [Read](read#rtsp). Features in these page are shared among both tasks.
+RTSP is a protocol that can be used for publishing and reading streams. Regarding specific tasks, see [Publish](publish#rtsp-clients) and [Read](read#rtsp). Features in this page are shared among both tasks.

## Transport protocols

-A RTSP session is splitted in two parts: the handshake, which is always performed with the TCP protocol, and data streaming, which can be performed with an arbitrary underlying transport protocol, which is chosen by the client during the handshake:
+A RTSP session is split into two parts: the handshake, which is always performed with the TCP protocol, and data streaming, which can be performed with an arbitrary underlying transport protocol, chosen by the client during the handshake:

- UDP: the most performant, but requires clients to access two additional UDP ports on the server, which is often impossible due to blocking or remapping by NATs/firewalls in between.
- UDP-multicast: allows to save bandwidth when clients are all in the same LAN, by sending packets once to a fixed multicast IP.
@@ -74,7 +74,7 @@ rtsps://localhost:8322/mystream

Some clients require additional flags for encryption to work properly.

-When reading with GStreamer, set set `tls-validation-flags` to `0`:
+When reading with GStreamer, set `tls-validation-flags` to `0`:

```sh
gst-launch-1.0 rtspsrc tls-validation-flags=0 location=rtsps://ip:8322/...
@@ -1,6 +1,6 @@

# RTMP-specific features

-RTMP is a protocol that can be used for publishing and reading streams. Regarding specific tasks, see [Publish](publish#rtmp-clients) and [Read](read#rtmp). Features in these page are shared among both tasks.
+RTMP is a protocol that can be used for publishing and reading streams. Regarding specific tasks, see [Publish](publish#rtmp-clients) and [Read](read#rtmp). Features in this page are shared among both tasks.

## Encryption
@@ -8,7 +8,7 @@

| [WebRTC: Real-Time Communication in Browsers](https://www.w3.org/TR/webrtc/) | WebRTC |
| [RFC8835, Transports for WebRTC](https://datatracker.ietf.org/doc/html/rfc8835) | WebRTC |
| [RFC7742, WebRTC Video Processing and Codec Requirements](https://datatracker.ietf.org/doc/html/rfc7742) | WebRTC |
-| [RFC7847, WebRTC Audio Codec and Processing Requirements](https://datatracker.ietf.org/doc/html/rfc7874) | WebRTC |
+| [RFC7874, WebRTC Audio Codec and Processing Requirements](https://datatracker.ietf.org/doc/html/rfc7874) | WebRTC |
| [RFC7875, Additional WebRTC Audio Codecs for Interoperability](https://datatracker.ietf.org/doc/html/rfc7875) | WebRTC |
| [H.265 Profile for WebRTC](https://datatracker.ietf.org/doc/draft-ietf-avtcore-hevc-webrtc/) | WebRTC |
| [WebRTC HTTP Ingestion Protocol (WHIP)](https://datatracker.ietf.org/doc/draft-ietf-wish-whip/) | WebRTC |
@@ -377,6 +377,8 @@ webrtc: true

# Address of the WebRTC HTTP listener.
webrtcAddress: :8889
# Enable HTTPS on the WebRTC server.
# This covers only the WebRTC handshake and does not influence the encryption of WebRTC streams,
# which are always encrypted, with a key that is exchanged during the WebRTC handshake.
webrtcEncryption: false
# Path to the server key.
# This can be generated with: