Merge pull request #2044 from skrashevich/documentation-site

(website): new documentation site
This commit is contained in:
Alex X
2026-02-06 21:49:35 +03:00
committed by GitHub
65 changed files with 2131 additions and 1715 deletions
+24 -8
View File
@@ -17,21 +17,37 @@ concurrency:
cancel-in-progress: true
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v5
- name: Setup Node
uses: actions/setup-node@v6
with:
node-version: 24
package-manager-cache: false
- name: Install dependencies
run: npm install --no-package-lock
- name: Build docs
run: npm run docs:build
- name: Copy docs into website
run: rsync -a --exclude '.vitepress/' --exclude 'README.md' website/ website/.vitepress/dist/
- name: Setup Pages
uses: actions/configure-pages@v5
- name: Upload artifact
uses: actions/upload-pages-artifact@v3
with:
path: website/.vitepress/dist
# Single deploy job since we're just deploying
deploy:
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
runs-on: ubuntu-latest
needs: build
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Pages
uses: actions/configure-pages@v4
- name: Upload artifact
uses: actions/upload-pages-artifact@v3
with:
path: './website'
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v4
+6
View File
@@ -15,3 +15,9 @@ go2rtc_win*
0_test.go
.DS_Store
website/.vitepress/cache
website/.vitepress/dist
node_modules
package-lock.json
+357 -1306
View File
File diff suppressed because it is too large
-117
View File
@@ -1,117 +0,0 @@
# API
Feel free to make any API design proposals.
## HTTP API
Interactive [OpenAPI](https://go2rtc.org/api/).
`www/stream.html` - universal viewer with supported URL params:
- multiple streams on one page: `src=camera1&src=camera2...`
- stream technology autoselection: `mode=webrtc,webrtc/tcp,mse,hls,mp4,mjpeg`
- stream technology comparison: `src=camera1&mode=webrtc&mode=mse&mode=mp4`
- player width in pixels (`width=320px`) or percent (`width=50%`)
`www/webrtc.html` - WebRTC viewer with support for two-way audio and URL params:
- `media=video+audio` - simple viewer
- `media=video+audio+microphone` - two way audio from camera
- `media=camera+microphone` - stream from browser
- `media=display+speaker` - stream from desktop
## JavaScript API
- You can write your own viewer from scratch
- You can extend the built-in viewer - `www/video-rtc.js`
- Check example - `www/video-stream.js`
- Check example - https://github.com/AlexxIT/WebRTC
`video-rtc.js` features:
- supported technologies:
  - WebRTC over UDP or TCP
  - MSE, HLS, MP4 or MJPEG over WebSocket
- automatic selection of the best technology according to:
  - the codecs inside your stream
  - current browser capabilities
  - current network configuration
- automatically stops the stream when the browser or page is not active
- automatically stops the stream when the player is not inside the page viewport
- automatic reconnection
Technology selection is based on priorities:
1. Video and Audio is better than Video only
2. H265 is better than H264
3. WebRTC is better than MSE, which is better than HLS, which is better than MJPEG
## WebSocket API
Endpoint: `/api/ws`
Query parameters:
- `src` (required) - Stream name
### WebRTC
Request SDP:
```json
{"type":"webrtc/offer","value":"v=0\r\n..."}
```
Response SDP:
```json
{"type":"webrtc/answer","value":"v=0\r\n..."}
```
Request/response candidate:
- an empty value is also allowed, and the candidate message itself is optional
```json
{"type":"webrtc/candidate","value":"candidate:3277516026 1 udp 2130706431 192.168.1.123 54321 typ host"}
```
### MSE
Request:
- the codecs list is optional
```json
{"type":"mse","value":"avc1.640029,avc1.64002A,avc1.640033,hvc1.1.6.L153.B0,mp4a.40.2,mp4a.40.5,flac,opus"}
```
Response:
```json
{"type":"mse","value":"video/mp4; codecs=\"avc1.64001F,mp4a.40.2\""}
```
### HLS
Request:
```json
{"type":"hls","value":"avc1.640029,avc1.64002A,avc1.640033,hvc1.1.6.L153.B0,mp4a.40.2,mp4a.40.5,flac"}
```
Response:
- you MUST rewrite the relative playlist path to a full HTTP path, e.g. `http://192.168.1.123:1984/api/hls/playlist.m3u8`
```json
{"type":"hls","value":"#EXTM3U\n#EXT-X-STREAM-INF:BANDWIDTH=1000000,CODECS=\"avc1.64001F,mp4a.40.2\"\nhls/playlist.m3u8?id=DvmHdd9w"}
```
### MJPEG
Request/response:
```json
{"type":"mjpeg"}
```
File diff suppressed because one or more lines are too long

Binary image changed (27 KiB)
+4
View File
@@ -1,3 +1,7 @@
# Docker
Images are built automatically via [GitHub actions](https://github.com/AlexxIT/go2rtc/actions) and published on [Docker Hub](https://hub.docker.com/r/alexxit/go2rtc) and [GitHub](https://github.com/AlexxIT/go2rtc/pkgs/container/go2rtc).
## Versions
- `alexxit/go2rtc:latest` - latest release based on `alpine` (`amd64`, `386`, `arm/v6`, `arm/v7`, `arm64`) with support for hardware transcoding on Intel iGPU and Raspberry Pi
+1 -1
View File
@@ -1,4 +1,4 @@
## Example
## ONVIF Client
```shell
go run examples/onvif_client/main.go http://admin:password@192.168.10.90 GetAudioEncoderConfigurations
+67
View File
@@ -0,0 +1,67 @@
# Modules
go2rtc tries to name formats, protocols and codecs the same way they are named in FFmpeg.
go2rtc supports some formats and protocols exclusively; they have no equivalent in FFmpeg.
- The `echo`, `expr`, `hass` and `onvif` modules receive a link to a stream. They don't know the protocol in advance.
- The `exec` and `ffmpeg` modules support many formats. They are identical to the `http` module.
- The `api`, `app`, `debug`, `ngrok`, `pinggy`, `srtp` and `streams` modules are supporting modules.
**Modules** implement communication APIs: authorization, encryption, command set, structure of media packets.
**Formats** describe the structure of the data being transmitted.
**Protocols** implement transport for data transmission.
| module | formats | protocols | input | output | ingest | two-way |
|--------------|-----------------|------------------|-------|--------|--------|---------|
| `alsa` | `pcm` | `ioctl` | yes | | | |
| `bubble` | - | `http` | yes | | | |
| `doorbird` | `mulaw` | `http` | yes | | | yes |
| `dvrip` | - | `tcp` | yes | | | yes |
| `echo` | * | * | yes | | | |
| `eseecloud` | `rtp` | `http` | yes | | | |
| `exec` | * | `pipe`, `rtsp` | yes | | | yes |
| `expr` | * | * | yes | | | |
| `ffmpeg` | * | `pipe`, `rtsp` | yes | | | |
| `flussonic` | `mp4` | `ws` | yes | | | |
| `gopro` | `mpegts` | `udp` | yes | | | |
| `hass` | * | * | yes | | | |
| `hls` | `mpegts`, `mp4` | `http` | | yes | | |
| `homekit` | `rtp` | `hap` | yes | yes | | no |
| `http` | `adts` | `http`, `tcp` | yes | | | |
| `http` | `flv` | `http`, `tcp` | yes | | | |
| `http` | `h264` | `http`, `tcp` | yes | | | |
| `http` | `hevc` | `http`, `tcp` | yes | | | |
| `http` | `hls` | `http`, `tcp` | yes | | | |
| `http` | `mjpeg` | `http`, `tcp` | yes | | | |
| `http` | `mpjpeg` | `http` | yes | | | |
| `http` | `mpegts` | `http`, `tcp` | yes | | | |
| `http` | `wav` | `http`, `tcp` | yes | | | |
| `http` | `yuv4mpegpipe` | `http`, `tcp` | yes | | | |
| `isapi` | `alaw`, `mulaw` | `http` | | | | yes |
| `ivideon` | `mp4` | `ws` | yes | | | |
| `mjpeg` | `ascii` | `http` | | yes | | |
| `mjpeg` | `jpeg` | `http` | | yes | | |
| `mjpeg` | `mpjpeg` | `http` | | yes | yes | |
| `mjpeg` | `yuv4mpegpipe` | `http` | | yes | | |
| `mp4` | `mp4` | `http`, `ws` | | yes | | |
| `mpegts` | `adts` | `http` | | yes | | |
| `mpegts` | `mpegts` | `http` | | yes | yes | |
| `multitrans` | `rtp` | `tcp` | | | | yes |
| `nest` | `srtp` | `rtsp`, `webrtc` | yes | | | no |
| `onvif` | `rtp` | * | yes | yes | | |
| `ring` | `srtp` | `webrtc` | yes | | | yes |
| `roborock` | `srtp` | `webrtc` | yes | | | yes |
| `rtmp` | `rtmp` | `rtmp` | yes | yes | yes | |
| `rtmp` | `flv` | `http` | | yes | yes | |
| `rtsp` | `rtsp` | `rtsp` | yes | yes | yes | yes |
| `tapo` | `mpegts` | `http` | yes | | | yes |
| `tuya` | `srtp` | `webrtc` | yes | | | yes |
| `v4l2` | `rawvideo` | `ioctl` | yes | | | |
| `webrtc` | `srtp` | `webrtc` | yes | yes | yes | yes |
| `webtorrent` | `srtp` | `webrtc` | yes | yes | | |
| `wyoming` | `pcm` | `tcp` | | yes | | |
| `wyze` | - | `tutk` | yes | | | yes |
| `xiaomi` | - | `cs2`, `tutk` | yes | | | yes |
| `yandex` | `srtp` | `webrtc` | yes | | | |
+12
View File
@@ -0,0 +1,12 @@
# ALSA
[`new in v1.9.10`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.10)
> [!WARNING]
> This source is under development and does not always work well.
[Advanced Linux Sound Architecture](https://en.wikipedia.org/wiki/Advanced_Linux_Sound_Architecture) - a framework for receiving audio from devices on Linux OS.
Easy to add via **WebUI > add > ALSA**.
Alternatively, you can use FFmpeg source.
+44 -3
View File
@@ -1,4 +1,45 @@
## Exit codes
- https://tldp.org/LDP/abs/html/exitcodes.html
- https://komodor.com/learn/exit-codes-in-containers-and-kubernetes-the-complete-guide/
# HTTP API
The HTTP API is the main way of interacting with the application. Default address: `http://localhost:1984/`.
The HTTP API is described in [OpenAPI](../../website/api/openapi.yaml) format and can be explored in an [interactive viewer](https://go2rtc.org/api/). The WebSocket API is described [here](ws/README.md).
The project's static HTML and JS files are located in the [www](../../www) folder. An external developer can use them as a basis for integrating go2rtc into their project or for developing a custom web interface for go2rtc.
The contents of the `www` folder are built into the go2rtc binary at build time, but you can use the configuration to specify an external folder as the source of static files.
## Configuration
**Important!** go2rtc passes requests from localhost and Unix sockets without HTTP authorization, even if you have it configured. It is your responsibility to set up secure external access to the API. If not properly configured, an attacker can gain access to your cameras and even your server.
- you can disable the HTTP API with `listen: ""` and use, for example, only the RTSP client/server protocol
- you can enable the HTTP API only on localhost with the `listen: "127.0.0.1:1984"` setting
- you can change the API `base_path` and host go2rtc on a suburl of your main app's webserver
- all files from `static_dir` are hosted at the root path: `/`
- you can use raw TLS cert/key content or a path to the files
```yaml
api:
listen: ":1984" # default ":1984", HTTP API port ("" - disabled)
username: "admin" # default "", Basic auth for WebUI
password: "pass" # default "", Basic auth for WebUI
local_auth: true # default false, Enable auth check for localhost requests
base_path: "/rtc" # default "", API prefix for serving on suburl (/api => /rtc/api)
static_dir: "www" # default "", folder for static files (custom web interface)
origin: "*" # default "", allow CORS requests (only * supported)
tls_listen: ":443" # default "", enable HTTPS server
tls_cert: | # default "", PEM-encoded fullchain certificate for HTTPS
-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----
tls_key: | # default "", PEM-encoded private key for HTTPS
-----BEGIN PRIVATE KEY-----
...
-----END PRIVATE KEY-----
unix_listen: "/tmp/go2rtc.sock" # default "", unix socket listener for API
```
**PS:**
- MJPEG over WebSocket plays better than native MJPEG because of a Chrome [bug](https://bugs.chromium.org/p/chromium/issues/detail?id=527446)
- MP4 over WebSocket was created only for Apple iOS, because it doesn't support file streaming
+69
View File
@@ -0,0 +1,69 @@
# WebSocket
Endpoint: `/api/ws`
Query parameters:
- `src` (required) - Stream name
### WebRTC
Request SDP:
```json
{"type":"webrtc/offer","value":"v=0\r\n..."}
```
Response SDP:
```json
{"type":"webrtc/answer","value":"v=0\r\n..."}
```
Request/response candidate:
- an empty value is also allowed, and the candidate message itself is optional
```json
{"type":"webrtc/candidate","value":"candidate:3277516026 1 udp 2130706431 192.168.1.123 54321 typ host"}
```
### MSE
Request:
- the codecs list is optional
```json
{"type":"mse","value":"avc1.640029,avc1.64002A,avc1.640033,hvc1.1.6.L153.B0,mp4a.40.2,mp4a.40.5,flac,opus"}
```
Response:
```json
{"type":"mse","value":"video/mp4; codecs=\"avc1.64001F,mp4a.40.2\""}
```
### HLS
Request:
```json
{"type":"hls","value":"avc1.640029,avc1.64002A,avc1.640033,hvc1.1.6.L153.B0,mp4a.40.2,mp4a.40.5,flac"}
```
Response:
- you MUST rewrite the relative playlist path to a full HTTP path, e.g. `http://192.168.1.123:1984/api/hls/playlist.m3u8`
```json
{"type":"hls","value":"#EXTM3U\n#EXT-X-STREAM-INF:BANDWIDTH=1000000,CODECS=\"avc1.64001F,mp4a.40.2\"\nhls/playlist.m3u8?id=DvmHdd9w"}
```
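The rewrite step above can be sketched with a small helper (hypothetical function, assuming the client knows the go2rtc API base URL):

```python
from urllib.parse import urljoin

def rewrite_hls_playlist(playlist: str, api_base: str) -> str:
    # Prefix every non-comment line (a relative playlist path such as
    # "hls/playlist.m3u8?id=...") with the go2rtc API base URL.
    out = []
    for line in playlist.split("\n"):
        if line and not line.startswith("#"):
            line = urljoin(api_base, line)
        out.append(line)
    return "\n".join(out)

value = "#EXTM3U\n#EXT-X-STREAM-INF:BANDWIDTH=1000000\nhls/playlist.m3u8?id=DvmHdd9w"
playlist = rewrite_hls_playlist(value, "http://192.168.1.123:1984/api/")
```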
### MJPEG
Request/response:
```json
{"type":"mjpeg"}
```
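All messages above share the same `{"type": ..., "value": ...}` envelope. A client-side sketch (hypothetical helper names, not part of go2rtc) can build and parse them like this:

```python
import json

def ws_url(host: str, src: str) -> str:
    # The endpoint is /api/ws; "src" is the required stream name.
    return f"ws://{host}/api/ws?src={src}"

def mse_request(codecs: list[str]) -> str:
    # MSE request with an optional comma-separated codecs list.
    return json.dumps({"type": "mse", "value": ",".join(codecs)})

def parse_message(raw: str) -> tuple[str, str]:
    # Every message is a JSON object with "type" and an optional "value".
    msg = json.loads(raw)
    return msg["type"], msg.get("value", "")

url = ws_url("192.168.1.123:1984", "camera1")
req = mse_request(["avc1.640029", "mp4a.40.2"])
kind, value = parse_message('{"type":"webrtc/answer","value":"v=0\\r\\n..."}')
```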
+35 -8
View File
@@ -1,3 +1,9 @@
# App
The application module is responsible for reading configuration files and running other modules.
The configuration can be edited through the application's WebUI with code highlighting, syntax and specification checking.
- By default, go2rtc will search for the `go2rtc.yaml` config file in the current working directory
- go2rtc supports multiple config files:
- `go2rtc -c config1.yaml -c config2.yaml -c config3.yaml`
@@ -38,6 +44,12 @@ Editors like [GoLand](https://www.jetbrains.com/go/) and [VS Code](https://code.
# yaml-language-server: $schema=https://raw.githubusercontent.com/AlexxIT/go2rtc/master/www/schema.json
```
or from a running go2rtc:
```yaml
# yaml-language-server: $schema=http://localhost:1984/schema.json
```
## Defaults
- Default values may change in updates
@@ -45,26 +57,41 @@ Editors like [GoLand](https://www.jetbrains.com/go/) and [VS Code](https://code.
```yaml
api:
listen: ":1984"
listen: ":1984" # default public port for WebUI and HTTP API
ffmpeg:
bin: "ffmpeg"
bin: "ffmpeg" # default binary path for FFmpeg
log:
format: "color"
level: "info"
level: "info" # default log level
output: "stdout"
time: "UNIXMS"
rtsp:
listen: ":8554"
listen: ":8554" # default public port for RTSP server
default_query: "video&audio"
srtp:
listen: ":8443"
listen: ":8443" # default public port for SRTP server (used for HomeKit)
webrtc:
listen: ":8555/tcp"
listen: ":8555" # default public port for WebRTC server (TCP and UDP)
ice_servers:
- urls: [ "stun:stun.l.google.com:19302" ]
- urls: [ "stun:stun.cloudflare.com:3478", "stun:stun.l.google.com:19302" ]
```
## Log
You can set different log levels for different modules.
```yaml
log:
format: "" # empty (default, autodetect color support), color, json, text
level: "info" # disabled, trace, debug, info (default), warn, error
output: "stdout" # empty (only to memory), stderr, stdout (default)
time: "UNIXMS" # empty (disable timestamp), UNIXMS (default), UNIXMICRO, UNIXNANO
api: trace # module name: log level
```
Modules: `api`, `streams`, `rtsp`, `webrtc`, `mp4`, `hls`, `mjpeg`, `hass`, `homekit`, `onvif`, `rtmp`, `webtorrent`, `wyoming`, `echo`, `exec`, `expr`, `ffmpeg`, `wyze`, `xiaomi`.
+15
View File
@@ -0,0 +1,15 @@
# Bubble
[`new in v1.6.1`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.6.1)
Private format in some cameras from [dvr163.com](http://help.dvr163.com/) and [eseecloud.com](http://www.eseecloud.com/).
## Configuration
- you can skip `username`, `password`, `port`, `ch` and `stream` if they are default
- set up separate streams for different channels and streams
```yaml
streams:
camera1: bubble://username:password@192.168.1.123:34567/bubble/live?ch=0&stream=0
```
+3
View File
@@ -0,0 +1,3 @@
# Debug
This module provides `GET /api/stack`, which you can use to find hanging goroutines.
+1 -1
View File
@@ -1,6 +1,6 @@
# Doorbird
*[added in v1.9.8](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.11)*
[`new in v1.9.8`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.8)
This source type supports [Doorbird](https://www.doorbird.com/) devices including MJPEG stream, audio stream as well as two-way audio.
+21
View File
@@ -0,0 +1,21 @@
# DVR-IP
[`new in v1.2.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.2.0)
Private format from DVR-IP NVR, NetSurveillance, Sofia protocol (NETsurveillance ActiveX plugin XMeye SDK).
## Configuration
- you can skip `username`, `password`, `port`, `channel` and `subtype` if they are default
- set up separate streams for different channels
- use `subtype=0` for Main stream, and `subtype=1` for Extra1 stream
- only the TCP protocol is supported
```yaml
streams:
only_stream: dvrip://username:password@192.168.1.123:34567?channel=0&subtype=0
only_tts: dvrip://username:password@192.168.1.123:34567?backchannel=1
two_way_audio:
- dvrip://username:password@192.168.1.123:34567?channel=0&subtype=0
- dvrip://username:password@192.168.1.123:34567?backchannel=1
```
+48
View File
@@ -0,0 +1,48 @@
# Echo
Some sources may have a dynamic link, and you will need to get it using a Bash or Python script. Your script should echo a link to the source: RTSP, FFmpeg or any of the supported sources.
**Docker** and **Home Assistant add-on** users have `python3`, `curl` and `jq` preinstalled.
## Configuration
```yaml
streams:
apple_hls: echo:python3 hls.py https://developer.apple.com/streaming/examples/basic-stream-osx-ios5.html
```
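A minimal script only has to print the link. A hypothetical sketch (the URL is a hard-coded placeholder; a real script would obtain it dynamically, e.g. from a camera's HTTP API):

```python
# go2rtc runs this script and uses whatever it prints as the stream source.
# Placeholder link for illustration; replace with your camera's dynamic URL.
link = "rtsp://admin:password@192.168.1.123/stream1"
print(link)
```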
## Install python libraries
**Docker** and **Hass Add-on** users have `python3` preinstalled without any additional libraries, like [requests](https://requests.readthedocs.io/) or others. If you need additional libraries, install them into the folder with your script:
1. Install [SSH & Web Terminal](https://github.com/hassio-addons/addon-ssh)
2. Go to the Add-on Web UI
3. Install library: `pip install requests -t /config/echo`
4. Add your script to `/config/echo/myscript.py`
5. Use your script as source `echo:python3 /config/echo/myscript.py`
## Example: Apple HLS
```yaml
streams:
apple_hls: echo:python3 hls.py https://developer.apple.com/streaming/examples/basic-stream-osx-ios5.html
```
**hls.py**
```python
import re
import sys
from urllib.parse import urljoin
from urllib.request import urlopen
# download the page and extract the master playlist URL
html = urlopen(sys.argv[1]).read().decode("utf-8")
url = re.search(r"https.+?m3u8", html)[0]
# download the master playlist and pick the first media playlist
html = urlopen(url).read().decode("utf-8")
m = re.search(r"^[a-z0-9/_]+\.m3u8$", html, flags=re.MULTILINE)
url = urljoin(url, m[0])
# ffmpeg:https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/gear1/prog_index.m3u8#video=copy
print("ffmpeg:" + url + "#video=copy")
```
+12
View File
@@ -0,0 +1,12 @@
# EseeCloud
[`new in v1.9.10`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.10)
This source is for cameras with a link like this `http://admin:@192.168.1.123:80/livestream/12`. Related [issue](https://github.com/AlexxIT/go2rtc/issues/1690).
## Configuration
```yaml
streams:
camera1: eseecloud://user:pass@192.168.1.123:80/livestream/12
```
+36
View File
@@ -1,3 +1,39 @@
# Exec
The exec source can run any external application and receive data from it. Two transports are supported: **pipe** ([`new in v1.5.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.5.0)) and **RTSP**.
If you want to use the **RTSP** transport, the command must contain the `{output}` argument anywhere. On launch, it will be replaced by the local address of the RTSP server.
**pipe** reads data from the app's stdout in different formats: **MJPEG**, **H.264/H.265 bitstream**, **MPEG-TS**. The pipe can also write data to the app's stdin in two formats: **PCMA** and **PCM/48000**.
The source can be used with:
- [FFmpeg](https://ffmpeg.org/) - the go2rtc ffmpeg source is just a shortcut to the exec source
- [FFplay](https://ffmpeg.org/ffplay.html) - play audio on your server
- [GStreamer](https://gstreamer.freedesktop.org/)
- [Raspberry Pi Cameras](https://www.raspberrypi.com/documentation/computers/camera_software.html)
- any of your own software
## Configuration
Pipe commands support parameters (format: `exec:{command}#{param1}#{param2}`):
- `killsignal` - signal that will be sent to stop the process (numeric form)
- `killtimeout` - time in seconds before forced termination with SIGKILL
- `backchannel` - enable backchannel for two-way audio
- `starttimeout` - time in seconds to wait for the first byte from RTSP
```yaml
streams:
stream: exec:ffmpeg -re -i /media/BigBuckBunny.mp4 -c copy -rtsp_transport tcp -f rtsp {output}
picam_h264: exec:libcamera-vid -t 0 --inline -o -
picam_mjpeg: exec:libcamera-vid -t 0 --codec mjpeg -o -
pi5cam_h264: exec:libcamera-vid -t 0 --libav-format h264 -o -
canon: exec:gphoto2 --capture-movie --stdout#killsignal=2#killtimeout=5
play_pcma: exec:ffplay -fflags nobuffer -f alaw -ar 8000 -i -#backchannel=1
play_pcm48k: exec:ffplay -fflags nobuffer -f s16be -ar 48000 -i -#backchannel=1
```
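The `exec:{command}#{param1}#{param2}` format above can be sketched as follows (hypothetical parser, not go2rtc's actual implementation; assumes the command itself contains no `#`):

```python
def parse_exec_source(source: str) -> tuple[str, dict[str, str]]:
    # Split "exec:command#key=value#key=value" into the command and its params.
    command, *raw = source.removeprefix("exec:").split("#")
    params = {}
    for item in raw:
        key, _, value = item.partition("=")
        params[key] = value
    return command, params

cmd, params = parse_exec_source(
    "exec:gphoto2 --capture-movie --stdout#killsignal=2#killtimeout=5"
)
```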
## Backchannel
- You can check audio card names in the **Go2rtc > WebUI > Add**
+2
View File
@@ -1,5 +1,7 @@
# Expr
[`new in v1.8.2`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.8.2)
[Expr](https://github.com/antonmedv/expr) - expression language and expression evaluation for Go.
- [language definition](https://expr.medv.io/docs/Language-Definition) - takes best from JS, Python, Jinja2 syntax
+52 -58
View File
@@ -1,68 +1,62 @@
## FFplay output
[FFplay](https://stackoverflow.com/questions/27778678/what-are-mv-fd-aq-vq-sq-and-f-in-a-video-stream) `7.11 A-V: 0.003 fd= 1 aq= 21KB vq= 321KB sq= 0B f=0/0`:
- `7.11` - master clock, the time from the start of the stream/video
- `A-V` - av_diff, the difference between audio and video timestamps
- `fd` - frames dropped
- `aq` - audio queue (0 - no delay)
- `vq` - video queue (0 - no delay)
- `sq` - subtitle queue
- `f` - timestamp error correction rate (not 100% sure)
`M-V` and `M-A` mean video stream only and audio stream only, respectively.
# FFmpeg
You can get any stream, file or device via FFmpeg and push it to go2rtc. The app will automatically start FFmpeg with the proper arguments when someone starts watching the stream.
- FFmpeg comes preinstalled for **Docker** and **Home Assistant add-on** users
- **Home Assistant add-on** users can target files from the [/media](https://www.home-assistant.io/more-info/local-media/setup-media/) folder
## Configuration
## Devices Windows
```
>ffmpeg -hide_banner -f dshow -list_options true -i video="VMware Virtual USB Video Device"
[dshow @ 0000025695e52900] DirectShow video device options (from video devices)
[dshow @ 0000025695e52900] Pin "Record" (alternative pin name "0")
[dshow @ 0000025695e52900] pixel_format=yuyv422 min s=1280x720 fps=1 max s=1280x720 fps=10
[dshow @ 0000025695e52900] pixel_format=yuyv422 min s=1280x720 fps=1 max s=1280x720 fps=10 (tv, bt470bg/bt709/unknown, topleft)
[dshow @ 0000025695e52900] pixel_format=nv12 min s=1280x720 fps=1 max s=1280x720 fps=23
[dshow @ 0000025695e52900] pixel_format=nv12 min s=1280x720 fps=1 max s=1280x720 fps=23 (tv, bt470bg/bt709/unknown, topleft)
```
## Devices Mac
```
% ./ffmpeg -hide_banner -f avfoundation -list_devices true -i ""
[AVFoundation indev @ 0x7f8b1f504d80] AVFoundation video devices:
[AVFoundation indev @ 0x7f8b1f504d80] [0] FaceTime HD Camera
[AVFoundation indev @ 0x7f8b1f504d80] [1] Capture screen 0
[AVFoundation indev @ 0x7f8b1f504d80] AVFoundation audio devices:
[AVFoundation indev @ 0x7f8b1f504d80] [0] Soundflower (2ch)
[AVFoundation indev @ 0x7f8b1f504d80] [1] Built-in Microphone
[AVFoundation indev @ 0x7f8b1f504d80] [2] Soundflower (64ch)
```
## Devices Linux
```
# ffmpeg -hide_banner -f v4l2 -list_formats all -i /dev/video0
[video4linux2,v4l2 @ 0x7f7de7c58bc0] Raw : yuyv422 : YUYV 4:2:2 : 640x480 160x120 176x144 320x176 320x240 352x288 432x240 544x288 640x360 752x416 800x448 800x600 864x480 960x544 960x720 1024x576 1184x656 1280x720 1280x960
[video4linux2,v4l2 @ 0x7f7de7c58bc0] Compressed: mjpeg : Motion-JPEG : 640x480 160x120 176x144 320x176 320x240 352x288 432x240 544x288 640x360 752x416 800x448 800x600 864x480 960x544 960x720 1024x576 1184x656 1280x720 1280x960
```
## TTS
Format: `ffmpeg:{input}#{param1}#{param2}#{param3}`. Examples:
```yaml
streams:
tts: ffmpeg:#input=-readrate 1 -readrate_initial_burst 0.001 -f lavfi -i "flite=text='1 2 3 4 5 6 7 8 9 0'"#audio=pcma
# [FILE] all tracks will be copied without transcoding codecs
file1: ffmpeg:/media/BigBuckBunny.mp4
# [FILE] video will be transcoded to H264, audio will be skipped
file2: ffmpeg:/media/BigBuckBunny.mp4#video=h264
# [FILE] video will be copied, audio will be transcoded to PCMU
file3: ffmpeg:/media/BigBuckBunny.mp4#video=copy#audio=pcmu
# [HLS] video will be copied, audio will be skipped
hls: ffmpeg:https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/gear5/prog_index.m3u8#video=copy
# [MJPEG] video will be transcoded to H264
mjpeg: ffmpeg:http://185.97.122.128/cgi-bin/faststream.jpg#video=h264
# [RTSP] video with rotation, should be transcoded, so select H264
rotate: ffmpeg:rtsp://12345678@192.168.1.123/av_stream/ch0#video=h264#rotate=90
```
## Useful links
- https://superuser.com/questions/564402/explanation-of-x264-tune
- https://stackoverflow.com/questions/33624016/why-sliced-thread-affect-so-much-on-realtime-encoding-using-ffmpeg-x264
- https://codec.fandom.com/ru/wiki/X264_-_описание_ключей_кодирования
- https://html5test.com/
- https://trac.ffmpeg.org/wiki/Capture/Webcam
- https://trac.ffmpeg.org/wiki/DirectShow
- https://stackoverflow.com/questions/53207692/libav-mjpeg-encoding-and-huffman-table
- https://github.com/tuupola/esp_video/blob/master/README.md
- https://github.com/leandromoreira/ffmpeg-libav-tutorial
- https://www.reddit.com/user/VeritablePornocopium/comments/okw130/ffmpeg_with_libfdk_aac_for_windows_x64/
- https://slhck.info/video/2017/02/24/vbr-settings.html
- [HomeKit audio samples problem](https://superuser.com/questions/1290996/non-monotonous-dts-with-igndts-flag)
All transcoding formats have [built-in templates](ffmpeg.go): `h264`, `h265`, `opus`, `pcmu`, `pcmu/16000`, `pcmu/48000`, `pcma`, `pcma/16000`, `pcma/48000`, `aac`, `aac/16000`.
But you can override them via the YAML config. You can also add your own formats to the config and use them with source params.
```yaml
ffmpeg:
bin: ffmpeg # path to ffmpeg binary
global: "-hide_banner"
timeout: 5 # default timeout in seconds for rtsp inputs
h264: "-codec:v libx264 -g:v 30 -preset:v superfast -tune:v zerolatency -profile:v main -level:v 4.1"
mycodec: "-any args that supported by ffmpeg..."
myinput: "-fflags nobuffer -flags low_delay -timeout {timeout} -i {input}"
myraw: "-ss 00:00:20"
```
- You can use go2rtc stream name as ffmpeg input (ex. `ffmpeg:camera1#video=h264`)
- You can use `video` and `audio` params multiple times (ex. `#video=copy#audio=copy#audio=pcmu`)
- You can use `rotate` param with `90`, `180`, `270` or `-90` values, important with transcoding (ex. `#video=h264#rotate=90`)
- You can use `width` and/or `height` params, important with transcoding (ex. `#video=h264#width=1280`)
- You can use `drawtext` to add a timestamp (ex. `drawtext=x=2:y=2:fontsize=12:fontcolor=white:box=1:boxcolor=black`)
  - This will greatly increase the CPU load on the server, even with hardware acceleration
- You can use `timeout` param to set RTSP input timeout in seconds (ex. `#timeout=10`)
- You can use `raw` param for any additional FFmpeg arguments (ex. `#raw=-vf transpose=1`)
- You can use `input` param to override default input template (ex. `#input=rtsp/udp` will change RTSP transport from TCP to UDP+TCP)
- You can use raw input value (ex. `#input=-timeout {timeout} -i {input}`)
- You can add your own input templates
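The `#param=value` chaining described above can be sketched as a small builder (hypothetical helper for composing source strings, not part of go2rtc):

```python
def ffmpeg_source(input_url: str, **params: str) -> str:
    # Compose an "ffmpeg:{input}#{param1}#{param2}" source string,
    # keeping params in the order they were given.
    suffix = "".join(f"#{key}={value}" for key, value in params.items())
    return f"ffmpeg:{input_url}{suffix}"

src = ffmpeg_source("rtsp://12345678@192.168.1.123/av_stream/ch0",
                    video="h264", rotate="90")
```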
Read more about [hardware acceleration](hardware/README.md).
**PS.** It is recommended to check the available hardware in the WebUI add page.
+22
View File
@@ -0,0 +1,22 @@
# FFmpeg Device
You can get video from any USB camera or Webcam as RTSP or WebRTC stream. This is part of FFmpeg integration.
- check available devices in the web interface
- `video_size` and `framerate` must be supported by your camera!
- on Linux, only video is supported for now
- on macOS, you can stream the FaceTime camera or the whole desktop!
- on macOS, it is important to set the right framerate
## Configuration
Format: `ffmpeg:device?{input-params}#{param1}#{param2}#{param3}`
```yaml
streams:
linux_usbcam: ffmpeg:device?video=0&video_size=1280x720#video=h264
windows_webcam: ffmpeg:device?video=0#video=h264
macos_facetime: ffmpeg:device?video=0&audio=1&video_size=1280x720&framerate=30#video=h264#audio=pcma
```
**PS.** It is recommended to check the available devices in the WebUI add page.
+5
View File
@@ -0,0 +1,5 @@
# Flussonic
[`new in v1.9.10`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.10)
Supports streams from a [Flussonic](https://flussonic.com/) server. Related [issue](https://github.com/AlexxIT/go2rtc/issues/1678).
+5 -1
View File
@@ -1,11 +1,15 @@
# GoPro
[`new in v1.8.3`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.8.3)
Supports streaming from [GoPro](https://gopro.com/) cameras connected via USB or Wi-Fi to Linux, Mac, Windows.
Supported models: HERO9, HERO10, HERO11, HERO12.
Supported OS: Linux, Mac, Windows, [HassOS](https://www.home-assistant.io/installation/)
Other camera models have different APIs. I will try to add them in future versions.
## Config
## Configuration
- USB-connected cameras create a new network interface in the system
- Linux users do not need to install anything
+41
View File
@@ -0,0 +1,41 @@
# Hass
Supports importing camera links from [Home Assistant](https://www.home-assistant.io/) config files:
- [Generic Camera](https://www.home-assistant.io/integrations/generic/), setup via GUI
- [HomeKit Camera](https://www.home-assistant.io/integrations/homekit_controller/)
- [ONVIF](https://www.home-assistant.io/integrations/onvif/)
- [Roborock](https://github.com/humbertogontijo/homeassistant-roborock) vacuums with camera
## Configuration
```yaml
hass:
config: "/config" # skip this setting if you are a Home Assistant add-on user
streams:
generic_camera: hass:Camera1 # Settings > Integrations > Integration Name
aqara_g3: hass:Camera-Hub-G3-AB12
```
### WebRTC Cameras
[`new in v1.6.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.6.0)
Any camera in WebRTC format is supported, but at the moment Home Assistant only supports some [Nest](https://www.home-assistant.io/integrations/nest/) cameras in this format.
**Important.** The Nest API only allows you to get a link to a stream for 5 minutes.
Do not use this with Frigate! If the stream expires, Frigate will consume all available RAM on your machine within seconds.
It's recommended to use [Nest source](../nest/README.md) - it supports extending the stream.
```yaml
streams:
# link to Home Assistant Supervised
hass-webrtc1: hass://supervisor?entity_id=camera.nest_doorbell
# link to external Home Assistant with Long-Lived Access Tokens
hass-webrtc2: hass://192.168.1.123:8123?entity_id=camera.nest_doorbell&token=eyXYZ...
```
### RTSP Cameras
By default, the Home Assistant API does not allow you to get a dynamic RTSP link to a camera stream. [This method](https://github.com/felipecrs/hass-expose-camera-stream-source#importing-cameras-from-home-assistant-to-go2rtc-or-frigate) can work around it.
+16
View File
@@ -1,3 +1,19 @@
# HLS
[`new in v1.1.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.1.0)
[HLS](https://en.wikipedia.org/wiki/HTTP_Live_Streaming) is the worst technology for real-time streaming.
It can only be useful on devices that do not support more modern technology, like [WebRTC](../webrtc/README.md), [MP4](../mp4/README.md).
The go2rtc implementation differs from the standards and may not work with all players.
API examples:
- HLS/TS stream: `http://192.168.1.123:1984/api/stream.m3u8?src=camera1` (H264)
- HLS/fMP4 stream: `http://192.168.1.123:1984/api/stream.m3u8?src=camera1&mp4` (H264, H265, AAC)
Read more about [codecs filters](../../README.md#codecs-filters).
## Useful links
- https://walterebert.com/playground/video/hls/
+97
View File
@@ -0,0 +1,97 @@
# Apple HomeKit
This module supports both client and server for the [Apple HomeKit](https://www.apple.com/home-app/accessories/) protocol.
## HomeKit Client
**Important:**
- You can use HomeKit cameras **without Apple devices** (iPhone, iPad, etc.); it's just yet another protocol
- A HomeKit device can be paired with only one ecosystem. So, if you have paired it with an iPhone (Apple Home), you can't pair it with Home Assistant or go2rtc. And if you have paired it with go2rtc, you can't pair it with an iPhone
- The HomeKit device should be on the same network as go2rtc, with working [mDNS](https://en.wikipedia.org/wiki/Multicast_DNS) between the device and go2rtc
go2rtc supports importing paired HomeKit devices from [Home Assistant](../hass/README.md).
So you can use HomeKit camera with Home Assistant and go2rtc simultaneously.
If you are using Home Assistant, I recommend pairing devices with it; it will give you more options.
You can pair a device with go2rtc on the HomeKit page. If you can't see your devices, reload the page.
Also, try rebooting your HomeKit device (power off). If you still can't see it, you have a problem with mDNS.
If you see a device but it does not have a pairing button, it is paired to some ecosystem (Apple Home, Home Assistant, HomeBridge, etc.). You need to delete the device from that ecosystem, and it will be available for pairing. If you cannot unpair the device, you will have to reset it.
**Important:**
- HomeKit audio uses very non-standard **AAC-ELD** codec with very non-standard params and specification violations
- Audio can't be played in `VLC` and probably any other player
- Audio should be transcoded for use with MSE, WebRTC, etc.
### Client Configuration
Recommended settings for using HomeKit Camera with WebRTC, MSE, MP4, RTSP:
```yaml
streams:
aqara_g3:
- hass:Camera-Hub-G3-AB12
- ffmpeg:aqara_g3#audio=aac#audio=opus
```
RTSP link with "normal" audio for any player: `rtsp://192.168.1.123:8554/aqara_g3?video&audio=aac`
**This source is in active development!** Tested only with [Aqara Camera Hub G3](https://www.aqara.com/eu/product/camera-hub-g3) (both EU and CN versions).
## HomeKit Server
[`new in v1.7.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.7.0)
HomeKit module can work in two modes:
- export any H264 camera to Apple HomeKit
- transparent proxy any Apple HomeKit camera (Aqara, Eve, Eufy, etc.) back to Apple HomeKit, so you will have all camera features in Apple Home and also will have RTSP/WebRTC/MP4/etc. from your HomeKit camera
**Important:**
- HomeKit cameras support only H264 video and OPUS audio
### Server Configuration
**Minimal config**
```yaml
streams:
dahua1: rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0
homekit:
dahua1: # same stream ID from streams list, default PIN - 19550224
```
**Full config**
```yaml
streams:
dahua1:
- rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0
- ffmpeg:dahua1#video=h264#hardware # if your camera doesn't support H264, important for HomeKit
- ffmpeg:dahua1#audio=opus # only OPUS audio supported by HomeKit
homekit:
dahua1: # same stream ID from streams list
pin: 12345678 # custom PIN, default: 19550224
name: Dahua camera # custom camera name, default: generated from stream ID
device_id: dahua1 # custom ID, default: generated from stream ID
device_private: dahua1 # custom key, default: generated from stream ID
```
**Proxy HomeKit camera**
- Video stream from HomeKit camera to Apple device (iPhone, AppleTV) will be transmitted directly
- Video stream from HomeKit camera to RTSP/WebRTC/MP4/etc. will be transmitted via go2rtc
```yaml
streams:
aqara1:
- homekit://...
- ffmpeg:aqara1#audio=aac#audio=opus # optional audio transcoding
homekit:
aqara1: # same stream ID from streams list
```
# HTTP
This source supports receiving a stream via an HTTP link.
It can determine the source format from the `Content-Type` HTTP header:
- **HTTP-JPEG** (`image/jpeg`) - A camera snapshot link, which go2rtc can convert to an MJPEG stream.
- **HTTP-MJPEG** (`multipart/x-mixed-replace`) - A continuous sequence of JPEG frames (with HTTP headers).
- **HLS** (`application/vnd.apple.mpegurl`) - A popular [HTTP Live Streaming](https://en.wikipedia.org/wiki/HTTP_Live_Streaming) (HLS) format, which is not designed for real-time media transmission.
> [!WARNING]
> The HLS format is not designed for real time and is supported quite poorly. It is recommended to use it via ffmpeg source with buffering enabled (disabled by default).
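Following the warning above, a direct HLS link can be wrapped in the ffmpeg source instead of being consumed directly. A minimal sketch — the stream name and playlist URL below are placeholders, not from the original text:

```yaml
streams:
  # hypothetical HLS playlist consumed via the ffmpeg source
  hls_via_ffmpeg: ffmpeg:https://example.com/live/playlist.m3u8
```

This keeps the HLS parsing on the ffmpeg side, which handles the format's segment-based delivery better than direct ingestion.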
## TCP
Source also supports HTTP and TCP streams with autodetection for different formats:
- `adts` - Audio stream in [AAC](https://en.wikipedia.org/wiki/Advanced_Audio_Coding) codec with Audio Data Transport Stream (ADTS) headers.
- `flv` - The legacy but still used [Flash Video](https://en.wikipedia.org/wiki/Flash_Video) format.
- `h264` - AVC/H.264 bitstream.
- `hevc` - HEVC/H.265 bitstream.
- `mjpeg` - A continuous sequence of JPEG frames (without HTTP headers).
- `mpegts` - The legacy [MPEG transport stream](https://en.wikipedia.org/wiki/MPEG_transport_stream) format.
- `wav` - Audio stream in [WAV](https://en.wikipedia.org/wiki/WAV) format.
- `yuv4mpegpipe` - Raw YUV frame stream with YUV4MPEG header.
## Configuration
```yaml
streams:
# [HTTP-FLV] stream in video/x-flv format
http_flv: http://192.168.1.123:20880/api/camera/stream/780900131155/657617
# [JPEG] snapshots from Dahua camera, will be converted to MJPEG stream
dahua_snap: http://admin:password@192.168.1.123/cgi-bin/snapshot.cgi?channel=1
# [MJPEG] stream will be proxied without modification
http_mjpeg: https://mjpeg.sanford.io/count.mjpeg
# [MJPEG or H.264/H.265 bitstream or MPEG-TS]
tcp_magic: tcp://192.168.1.123:12345
# Add custom header
custom_header: "https://mjpeg.sanford.io/count.mjpeg#header=Authorization: Bearer XXX"
```
**PS.** Dahua cameras have a bug: if you select the MJPEG codec for the second RTSP stream, snapshots won't work.
# Hikvision ISAPI
[`new in v1.3.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.3.0)
This source type supports only backchannel audio for the [Hikvision ISAPI](https://tpp.hikvision.com/download/ISAPI_OTAP) protocol. So it should be used as a second source in addition to the RTSP protocol.
## Configuration
```yaml
streams:
hikvision1:
- rtsp://admin:password@192.168.1.123:554/Streaming/Channels/101
- isapi://admin:password@192.168.1.123:80/
```
# Ivideon
Support public cameras from the service [Ivideon](https://tv.ivideon.com/).
## Configuration
```yaml
streams:
quailcam: ivideon:100-tu5dkUPct39cTp9oNEN2B6/0
```
# Motion JPEG
- This module can provide and receive streams in MJPEG format.
- This module is also responsible for receiving snapshots in JPEG format.
- This module also supports streaming to the server console (terminal) in the **animated ASCII art** format.
## MJPEG Client
**Important.** For a stream in MJPEG format, your source MUST contain the MJPEG codec. If your stream has the MJPEG codec, you can receive an **MJPEG stream** or **JPEG snapshots** via the API.
You can receive an MJPEG stream in several ways:
- some cameras support MJPEG codec inside [RTSP stream](../rtsp/README.md) (ex. second stream for Dahua cameras)
- some cameras have an HTTP link with [MJPEG stream](../http/README.md)
- some cameras have an HTTP link with snapshots - go2rtc can convert them to [MJPEG stream](../http/README.md)
- you can convert an H264/H265 stream from your camera via [FFmpeg integration](../ffmpeg/README.md)
With this example, your stream will have both H264 and MJPEG codecs:
```yaml
streams:
  camera1:
    - rtsp://admin:password@192.168.1.123/stream1 # H264 source (address is a placeholder)
    - ffmpeg:camera1#video=mjpeg
```
## MJPEG Server
### mpjpeg
Output a stream in [MJPEG](https://en.wikipedia.org/wiki/Motion_JPEG) format. In [FFmpeg](https://ffmpeg.org/), this format is called `mpjpeg` because it contains HTTP headers.
```
ffplay http://192.168.1.123:1984/api/stream.mjpeg?src=camera1
```
### jpeg
Receiving a JPEG snapshot.
```
curl http://192.168.1.123:1984/api/frame.jpeg?src=camera1
```
- You can use `width`/`w` and/or `height`/`h` parameters.
- A cached snapshot will be used if its time is not older than the time specified in the `cache` parameter.
- The `cache` parameter does not check the image dimensions from the cache and those specified in the query.
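Combining the parameters above, a resized snapshot request might look like this (the `w` and `cache` values are illustrative, not from the original text):

```
http://192.168.1.123:1984/api/frame.jpeg?src=camera1&w=640&cache=5
```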
### ascii
Stream as ASCII to Terminal. This format is just for fun. You can boast to your friends that you can stream cameras even to the server console without a GUI.
[![](https://img.youtube.com/vi/sHj_3h_sX7M/mqdefault.jpg)](https://www.youtube.com/watch?v=sHj_3h_sX7M)
> The demo video features a combination of several settings for this format with added audio. Of course, the format doesn't support audio out of the box.
**Tips**
- this feature works only with MJPEG codec (use transcoding)
```
% curl "http://192.168.1.123:1984/api/stream.ascii?src=gamazda&back=8&text=%20%20"
% curl "http://192.168.1.123:1984/api/stream.ascii?src=gamazda&text=helloworld"
```
### yuv4mpegpipe
Raw [YUV](https://en.wikipedia.org/wiki/Y%E2%80%B2UV) frame stream with [YUV4MPEG](https://manned.org/yuv4mpeg) header.
```
ffplay http://192.168.1.123:1984/api/stream.y4m?src=camera1
```
## Streaming ingest
```shell
ffmpeg -re -i BigBuckBunny.mp4 -c mjpeg -f mpjpeg http://localhost:1984/api/stream.mjpeg?dst=camera1
```
# MP4
This module provides several features:
1. MSE stream (fMP4 over WebSocket)
2. Camera snapshots in MP4 format (single frame), can be sent to [Telegram](#snapshot-to-telegram)
3. HTTP progressive streaming (MP4 file stream) - a bad format for streaming because of its high start delay. This format doesn't work in Safari browsers, but go2rtc will automatically redirect it to HLS/fMP4 in this case.
## API examples
- MP4 snapshot: `http://192.168.1.123:1984/api/frame.mp4?src=camera1` (H264, H265)
- MP4 stream: `http://192.168.1.123:1984/api/stream.mp4?src=camera1` (H264, H265, AAC)
- MP4 file: `http://192.168.1.123:1984/api/stream.mp4?src=camera1` (H264, H265*, AAC, OPUS, MP3, PCMA, PCMU, PCM)
- You can use `mp4`, `mp4=flac` and `mp4=all` param for codec filters
- You can use `duration` param in seconds (ex. `duration=15`)
- You can use `filename` param (ex. `filename=record.mp4`)
- You can use `rotate` param with `90`, `180` or `270` values
- You can use `scale` param with positive integer values (ex. `scale=4:3`)
Read more about [codecs filters](../../README.md#codecs-filters).
**PS.** Rotate and scale params don't use transcoding and change video using metadata.
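As a sketch, several of the parameters above can be combined in one request (the specific values are arbitrary examples):

```
http://192.168.1.123:1984/api/stream.mp4?src=camera1&mp4=flac&duration=10&filename=clip.mp4&rotate=90
```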
## Snapshot to Telegram
These examples are for the Home Assistant [Telegram Bot](https://www.home-assistant.io/integrations/telegram_bot/) integration.
- change `url` to your go2rtc web API (`http://localhost:1984/` for most users)
- change `target` to your Telegram chat ID (support list)
- change `src=camera1` to your stream name from go2rtc config
**Important.** Snapshots will be near instant for most cameras and many sources, except the `ffmpeg` source, because it takes a long time for ffmpeg to start streaming video, even when you use `#video=copy`. There can also be a delay with cameras that do not start the stream with a keyframe.
### Snapshot from H264 or H265 camera
```yaml
service: telegram_bot.send_video
data:
url: http://localhost:1984/api/frame.mp4?src=camera1
target: 123456789
```
### Record from H264 or H265 camera
Recording starts at the moment of the service call and continues into the future. Loopback is not supported.
- `mp4=flac` - adds support PCM audio family
- `filename=record.mp4` - set name for downloaded file
```yaml
service: telegram_bot.send_video
data:
url: http://localhost:1984/api/stream.mp4?src=camera1&mp4=flac&duration=5&filename=record.mp4 # duration in seconds
target: 123456789
```
### Snapshot from JPEG or MJPEG camera
This example works via the [mjpeg](../mjpeg/README.md) module.
```yaml
service: telegram_bot.send_photo
data:
url: http://localhost:1984/api/frame.jpeg?src=camera1
target: 123456789
```
# MPEG-TS
This module provides an [HTTP API](../api/README.md) for:
- Streaming output in `mpegts` format.
- Streaming output in `adts` format.
- Streaming ingest in `mpegts` format.
> [!NOTE]
> This module would probably be better called mpeg, because AAC is part of MPEG-2 and MPEG-4, and MPEG-TS is part of MPEG-2.
## MPEG-TS Server
```shell
ffplay http://localhost:1984/api/stream.ts?src=camera1
```
## ADTS Server
```shell
ffplay http://localhost:1984/api/stream.aac?src=camera1
```
## Streaming ingest
```shell
ffmpeg -re -i BigBuckBunny.mp4 -c copy -f mpegts http://localhost:1984/api/stream.ts?dst=camera1
```
# TP-Link MULTITRANS
[`new in v1.9.14`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.14) by [@forrestsocool](https://github.com/forrestsocool)
Two-way audio support for Chinese version of [TP-Link](https://www.tp-link.com.cn/) cameras.
## Configuration
```yaml
streams:
  tplink1:
    # video and audio via RTSP (address is a placeholder)
    - rtsp://admin:admin@192.168.1.202:554/stream1
    # two-way audio use MULTITRANS schema
    - multitrans://admin:admin@192.168.1.202:554
```
## Useful links
- https://www.tp-link.com.cn/list_2549.html
- https://github.com/AlexxIT/go2rtc/issues/1724
- https://github.com/bingooo/hass-tplink-ipc/
# Google Nest
[`new in v1.6.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.6.0)
For simplicity, it is recommended to connect the Nest/WebRTC camera to the [Home Assistant](../hass/README.md).
But if you can somehow get the below parameters, Nest/WebRTC source will work without Home Assistant.
```yaml
streams:
nest-doorbell: nest:?client_id=***&client_secret=***&refresh_token=***&project_id=***&device_id=***
```
# ngrok
With the ngrok integration, you can get external access to your streams when your Internet connection is behind a private IP address.
- you may need external access for two different things:
# ONVIF
## ONVIF Client
[`new in v1.5.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.5.0)
The source is not very useful if you already know RTSP and snapshot links for your camera. But it can be useful if you don't.
**WebUI > Add** webpage supports ONVIF autodiscovery. Your server must be on the same subnet as the camera. If you use Docker, you must use host network mode.
```yaml
streams:
dahua1: onvif://admin:password@192.168.1.123
reolink1: onvif://admin:password@192.168.1.123:8000
tapo1: onvif://admin:password@192.168.1.123:2020
```
## ONVIF Server
A regular camera has a single video source (`GetVideoSources`) and two profiles (`GetProfiles`).
go2rtc has one video source and one profile per stream.
# Ring
[`new in v1.9.13`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.13) by [@seydx](https://github.com/seydx)
This source type supports Ring cameras with [two-way audio](../../README.md#two-way-audio) support.
## Configuration
If you have a `refresh_token` and `device_id`, you can use them in the `go2rtc.yaml` config file.
Otherwise, you can use the go2rtc web interface and add your Ring account (WebUI > Add > Ring). Once added, it will list all your Ring cameras.
```yaml
streams:
ring: ring:?device_id=XXX&refresh_token=XXX
ring_snapshot: ring:?device_id=XXX&refresh_token=XXX&snapshot
```
# Roborock
[`new in v1.3.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.3.0)
This source type supports Roborock vacuums with cameras. Known working models:
- **Roborock S6 MaxV** - only video (the vacuum has no microphone)
- **Roborock S7 MaxV** - video and two-way audio
- **Roborock Qrevo MaxV** - video and two-way audio
## Configuration
This source supports loading Roborock credentials from the Home Assistant [custom integration](https://github.com/humbertogontijo/homeassistant-roborock) or the [core integration](https://www.home-assistant.io/integrations/roborock). Otherwise, you need to log in to your Roborock account (MiHome account is not supported). Go to go2rtc WebUI > Add webpage. Copy the `roborock://...` source for your vacuum and paste it into your `go2rtc.yaml` config.
If you have a pattern PIN for your vacuum, add it as a numeric PIN (lines: 123, 456, 789) to the end of the `roborock` link.
# Real-Time Messaging Protocol
This module provides the following features for the RTMP protocol:
- Streaming input - [RTMP client](#rtmp-client)
- Streaming output and ingest in `rtmp` format - [RTMP server](#rtmp-server)
- Streaming output and ingest in `flv` format - [FLV server](#flv-server)
## RTMP Client
You can get a stream from an RTMP server, for example [Nginx with nginx-rtmp-module](https://github.com/arut/nginx-rtmp-module).
### Client Configuration
```yaml
streams:
rtmp_stream: rtmp://192.168.1.123/live/camera1
```
## RTMP Server
[`new in v1.8.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.8.0)
Streaming output stream in `rtmp` format:
```shell
ffplay rtmp://localhost:1935/camera1
```
Streaming ingest stream in `rtmp` format:
```shell
ffmpeg -re -i BigBuckBunny.mp4 -c copy -f flv rtmp://localhost:1935/camera1
```
### Server Configuration
By default, the RTMP server is disabled.
```yaml
rtmp:
listen: ":1935" # by default - disabled!
```
## FLV Server
Streaming output in `flv` format.
```shell
ffplay http://localhost:1984/stream.flv?src=camera1
```
Streaming ingest in `flv` format.
```shell
ffmpeg -re -i BigBuckBunny.mp4 -c copy -f flv http://localhost:1984/api/stream.flv?dst=camera1
```
## Tested client
| From | To | Comment |
|------|----|---------|
**OBS**
Settings > Stream:
- Service: Custom
- Server: rtmp://192.168.10.101/tmp
- Stream Key: `<empty>`
- Use auth: `<disabled>`
**OpenIPC**
# Real Time Streaming Protocol
This module provides the following features for the RTSP protocol:
- Streaming input - [RTSP client](#rtsp-client)
- Streaming output - [RTSP server](#rtsp-server)
- [Streaming ingest](#streaming-ingest)
- [Two-way audio](#two-way-audio)
## RTSP Client
### Configuration
```yaml
streams:
sonoff_camera: rtsp://rtsp:12345678@192.168.1.123/av_stream/ch0
dahua_camera:
- rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=0&unicast=true&proto=Onvif
- rtsp://admin:password@192.168.1.123/cam/realmonitor?channel=1&subtype=1#backchannel=0
amcrest_doorbell:
- rtsp://username:password@192.168.1.123:554/cam/realmonitor?channel=1&subtype=0#backchannel=0
unifi_camera: rtspx://192.168.1.123:7441/fD6ouM72bWoFijxK
glichy_camera: ffmpeg:rtsp://username:password@192.168.1.123/live/ch00_1
```
### Recommendations
- **Amcrest Doorbell** users may want to disable two-way audio, because with an active stream, you won't have a working call button. You need to add `#backchannel=0` to the end of your RTSP link in YAML config file
- **Dahua Doorbell** users may want to change [audio codec](https://github.com/AlexxIT/go2rtc/issues/49#issuecomment-2127107379) for proper two-way audio. Make sure not to request backchannel multiple times by adding `#backchannel=0` to other stream sources of the same doorbell. The `unicast=true&proto=Onvif` is preferred for two-way audio as this makes the doorbell accept multiple codecs for the incoming audio
- **Reolink** users may want NOT to use RTSP protocol at all, some camera models have a very awful, unusable stream implementation
- **Ubiquiti UniFi** users may want to disable HTTPS verification. Use `rtspx://` prefix instead of `rtsps://`. And don't use `?enableSrtp` [suffix](https://github.com/AlexxIT/go2rtc/issues/81)
- **TP-Link Tapo** users may skip login and password, because go2rtc supports login [without them](https://drmnsamoliu.github.io/video.html)
- If your camera has two RTSP links, you can add both as sources. This is useful when streams have different codecs, for example AAC audio with main stream and PCMU/PCMA audio with second stream
- If the stream from your camera is glitchy, try using [ffmpeg source](../ffmpeg/README.md). It will not add CPU load if you don't use transcoding
- If the stream from your camera is very glitchy, try to use transcoding with [ffmpeg source](../ffmpeg/README.md)
### Other options
Format: `rtsp...#{param1}#{param2}#{param3}`
- Add custom timeout `#timeout=30` (in seconds)
- Ignore audio - `#media=video` or ignore video - `#media=audio`
- Ignore two-way audio API `#backchannel=0` - important for some glitchy cameras
- Use WebSocket transport `#transport=ws...`
### RTSP over WebSocket
```yaml
streams:
# WebSocket with authorization, RTSP - without
axis-rtsp-ws: rtsp://192.168.1.123:4567/axis-media/media.amp?overview=0&camera=1&resolution=1280x720&videoframeskipmode=empty&Axis-Orig-Sw=true#transport=ws://user:pass@192.168.1.123:4567/rtsp-over-websocket
# WebSocket without authorization, RTSP - with
dahua-rtsp-ws: rtsp://user:pass@192.168.1.123/cam/realmonitor?channel=1&subtype=1&proto=Private3#transport=ws://192.168.1.123/rtspoverwebsocket
```
## RTSP Server
You can get any stream as RTSP-stream: `rtsp://192.168.1.123:8554/{stream_name}`
You can enable external password protection for your RTSP streams. Password protection is always disabled for localhost calls (ex. FFmpeg or Home Assistant on the same server).
### Configuration
```yaml
rtsp:
listen: ":8554" # RTSP Server TCP port, default - 8554
username: "admin" # optional, default - disabled
password: "pass" # optional, default - disabled
default_query: "video&audio" # optional, default codecs filters
```
By default, go2rtc provides an RTSP stream with only the first video and the first audio track. You can change this with the `default_query` setting:
- `default_query: "mp4"` - MP4 compatible codecs (H264, H265, AAC)
- `default_query: "video=all&audio=all"` - all tracks from all sources (not all players can handle this)
- `default_query: "video=h264,h265"` - only one video track (H264 or H265)
- `default_query: "video&audio=all"` - the first video track of any codec and all audio tracks as separate tracks
Read more about [codecs filters](../../README.md#codecs-filters).
## Streaming ingest
```shell
ffmpeg -re -i BigBuckBunny.mp4 -c copy -rtsp_transport tcp -f rtsp rtsp://localhost:8554/camera1
```
## Two-way audio
Before purchasing, it is difficult to understand whether the camera supports two-way audio via the RTSP protocol or not. This isn't usually mentioned in a camera's description. You can only find out by reading reviews from real buyers.
A camera is considered to support two-way audio if it supports the ONVIF Profile T protocol. But in reality, this isn't always the case. And the ONVIF protocol has no connection with the camera's RTSP implementation.
In go2rtc you can find out if the camera supports two-way audio via WebUI > stream probe.
# SRTP
This is a support module for the [HomeKit](../homekit/README.md) module.
> [!NOTE]
> This module can be removed and its functionality transferred to the homekit module.
## Configuration
```yaml
srtp:
listen: :8443 # enabled by default
```
# Streams
This core module is responsible for managing the stream list.
## Stream to camera
[`new in v1.3.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.3.0)
go2rtc supports playing audio files (ex. music or [TTS](https://www.home-assistant.io/integrations/#text-to-speech)) and live streams (ex. radio) on cameras with [two-way audio](../../README.md#two-way-audio) support.
API example:
```text
POST http://localhost:1984/api/streams?dst=camera1&src=ffmpeg:http://example.com/song.mp3#audio=pcma#input=file
```
- you can stream local files, web files, and live streams in any format supported by FFmpeg
- you should use the [ffmpeg source](../ffmpeg/README.md) to transcode audio to a codec that your camera supports
- you can check your camera's codecs on the go2rtc WebUI info page when the stream is active
- some cameras support only the low-quality `PCMA/8000` codec (ex. [Tapo](../tapo/README.md))
- it is recommended to choose higher quality formats if your camera supports them (ex. `PCMA/48000` for some Dahua cameras)
- if you play files over an `http` link, you need to add the `#input=file` param, so the file will be transcoded and played in real time
- if you play live streams, you should skip the `#input` param, because they are already real time
- you can stop active playback by calling the API with an empty `src` parameter
- you will see one active producer and one active consumer on the go2rtc WebUI info page during streaming
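For example, stopping active playback (the empty-`src` call described above) is just the same endpoint without a source:

```text
POST http://localhost:1984/api/streams?dst=camera1&src=
```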
## Publish stream
[`new in v1.8.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.8.0)
You can publish any stream to streaming services (YouTube, Telegram, etc.) via RTMP/RTMPS. Important:
- Supported codecs: H264 for video and AAC for audio
- AAC audio is required for YouTube; videos without audio will not work
- You don't need to enable [RTMP module](../rtmp/README.md) listening for this task
You can use the API:
```text
POST http://localhost:1984/api/streams?src=camera1&dst=rtmps://...
```
Or config file:
```yaml
publish:
# publish stream "video_audio_transcode" to Telegram
video_audio_transcode:
- rtmps://xxx-x.rtmp.t.me/s/xxxxxxxxxx:xxxxxxxxxxxxxxxxxxxxxx
# publish stream "audio_transcode" to Telegram and YouTube
audio_transcode:
- rtmps://xxx-x.rtmp.t.me/s/xxxxxxxxxx:xxxxxxxxxxxxxxxxxxxxxx
- rtmp://xxx.rtmp.youtube.com/live2/xxxx-xxxx-xxxx-xxxx-xxxx
streams:
video_audio_transcode:
- ffmpeg:rtsp://user:pass@192.168.1.123/stream1#video=h264#hardware#audio=aac
audio_transcode:
- ffmpeg:rtsp://user:pass@192.168.1.123/stream1#video=copy#audio=aac
```
- **Telegram Desktop App** > Any public or private channel or group (where you admin) > Live stream > Start with... > Start streaming.
- **YouTube** > Create > Go live > Stream latency: Ultra low-latency > Copy: Stream URL + Stream key.
## Preload stream
[`new in v1.9.11`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.11)
You can preload any stream on go2rtc start. This is useful for cameras that take a long time to start up.
```yaml
preload:
camera1: # default: video&audio = ANY
camera2: "video" # preload only video track
camera3: "video=h264&audio=opus" # preload H264 video and OPUS audio
streams:
camera1:
- rtsp://192.168.1.100/stream
camera2:
- rtsp://192.168.1.101/stream
camera3:
- rtsp://192.168.1.102/h265stream
- ffmpeg:camera3#video=h264#audio=opus#hardware
```
# TP-Link Tapo
[`new in v1.2.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.2.0)
[TP-Link Tapo](https://www.tapo.com/) proprietary camera protocol with **two-way audio** support.
- stream quality is the same as [RTSP protocol](https://www.tapo.com/en/faq/34/)
- use the **cloud password**, this is not the RTSP password! you do not need to add a login!
- you can also use **UPPERCASE** MD5 hash from your cloud password with `admin` username
- some new camera firmwares require SHA256 instead of MD5
## Configuration
```yaml
streams:
# cloud password without username
camera1: tapo://cloud-password@192.168.1.123
# admin username and UPPERCASE MD5 cloud-password hash
camera2: tapo://admin:UPPERCASE-MD5@192.168.1.123
# admin username and UPPERCASE SHA256 cloud-password hash
camera3: tapo://admin:UPPERCASE-SHA256@192.168.1.123
# VGA stream (the so called substream, the lower resolution one)
camera4: tapo://cloud-password@192.168.1.123?subtype=1
# HD stream (default)
camera5: tapo://cloud-password@192.168.1.123?subtype=0
```
```bash
echo -n "cloud password" | md5 | awk '{print toupper($0)}'
echo -n "cloud password" | shasum -a 256 | awk '{print toupper($0)}'
```
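The `md5` and `shasum` commands above are the macOS variants. On most Linux systems, the coreutils equivalents produce the same uppercase hashes (same placeholder password):

```shell
# Linux (coreutils) equivalents of the macOS commands above
echo -n "cloud password" | md5sum | awk '{print toupper($1)}'
echo -n "cloud password" | sha256sum | awk '{print toupper($1)}'
```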
## TP-Link Kasa
[`new in v1.7.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.7.0)
> [!NOTE]
> This source should be moved to a separate module, because its source code is not related to Tapo.
[TP-Link Kasa](https://www.kasasmart.com/) cameras use a non-standard protocol ([more info](https://medium.com/@hu3vjeen/reverse-engineering-tp-link-kc100-bac4641bf1cd)).
- `username` - urlsafe email, `alex@gmail.com` -> `alex%40gmail.com`
- `password` - base64password, `secret1` -> `c2VjcmV0MQ==`
```yaml
streams:
kc401: kasa://username:password@192.168.1.123:19443/https/stream/mixed
```
Tested: KD110, KC200, KC401, KC420WS, EC71.
## TP-Link Vigi
[`new in v1.9.8`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.8)
[TP-Link VIGI](https://www.vigi.com/) cameras. These are cameras from a different sub-brand, but the format is very similar to Tapo. Only the authorization is different. Read more [here](https://github.com/AlexxIT/go2rtc/issues/1470).
```yaml
streams:
camera1: vigi://admin:{password}@192.168.1.123
```
# Tuya
[`new in v1.9.13`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.13) by [@seydx](https://github.com/seydx)
[Tuya](https://www.tuya.com/) is a proprietary camera protocol with **two-way audio** support. go2rtc supports `Tuya Smart API` and `Tuya Cloud API`.
**Tuya Smart API (recommended)**:
- **Smart Life accounts are NOT supported**, you need to create a Tuya Smart account. If the cameras are already added to the Smart Life app, you need to remove them and add them again to the [Tuya Smart](https://play.google.com/store/apps/details?id=com.tuya.smart) app.
- Cameras can be discovered through the go2rtc web interface via Tuya Smart account (Add > Tuya > Select region and fill in email and password > Login).
**Tuya Cloud API**:
- Requires setting up a cloud project in the Tuya Developer Platform.
# Video4Linux
[`new in v1.9.9`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.9)
What you should know about [V4L2](https://en.wikipedia.org/wiki/Video4Linux):
# WebRTC
## WebRTC Client
[`new in v1.3.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.3.0)
This source type supports several connection formats.
### Creality
[`new in v1.9.10`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.10)
[Creality](https://www.creality.com/) 3D printer camera. Read more [here](https://github.com/AlexxIT/go2rtc/issues/1600).
```yaml
streams:
creality_k2p: webrtc:http://192.168.1.123:8000/call/webrtc_local#format=creality
```
### go2rtc
This format is only supported in go2rtc. Unlike WHEP, it supports asynchronous WebRTC connections and two-way audio.
```yaml
streams:
webrtc-go2rtc: webrtc:ws://192.168.1.123:1984/api/ws?src=camera1
```
### Kinesis
[`new in v1.6.1`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.6.1)
Supports [Amazon Kinesis Video Streams](https://aws.amazon.com/kinesis/video-streams/), using WebRTC protocol. You need to specify the signaling WebSocket URL with all credentials in query params, `client_id` and `ice_servers` list in [JSON format](https://developer.mozilla.org/en-US/docs/Web/API/RTCIceServer).
```yaml
streams:
webrtc-kinesis: webrtc:wss://...amazonaws.com/?...#format=kinesis#client_id=...#ice_servers=[{...},{...}]
```
**PS.** For `kinesis` sources, you can use [echo](../echo/README.md) to get connection params using `bash`, `python` or any other script language.
### OpenIPC
[`new in v1.7.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.7.0)
Cameras on open-source [OpenIPC](https://openipc.org/) firmware.
```yaml
streams:
webrtc-openipc: webrtc:ws://192.168.1.123/webrtc_ws#format=openipc#ice_servers=[{"urls":"stun:stun.kinesisvideo.eu-north-1.amazonaws.com:443"}]
```
### SwitchBot
Support connection to [SwitchBot](https://us.switch-bot.com/) cameras that are based on Kinesis Video Streams. Specifically, this includes the [Pan/Tilt Cam Plus 2K](https://us.switch-bot.com/pages/switchbot-pan-tilt-cam-plus-2k), the [Pan/Tilt Cam Plus 3K](https://us.switch-bot.com/pages/switchbot-pan-tilt-cam-plus-3k), and the [Smart Video Doorbell](https://www.switchbot.jp/products/switchbot-smart-video-doorbell). The `Outdoor Spotlight Cam 1080P`, `Outdoor Spotlight Cam 2K`, `Pan/Tilt Cam`, `Pan/Tilt Cam 2K`, and `Indoor Cam` are based on Tuya, so this feature is not available for them.
```yaml
streams:
webrtc-switchbot: webrtc:wss://...amazonaws.com/?...#format=switchbot#resolution=hd#play_type=0#client_id=...#ice_servers=[{...},{...}]
```
### WHEP
The [WebRTC/WHEP](https://datatracker.ietf.org/doc/draft-murillo-whep/) draft is being replaced by the [WebRTC/WISH](https://datatracker.ietf.org/doc/charter-ietf-wish/02/) standard for WebRTC video/audio viewers, but it may already be supported in some third-party software. It is supported in go2rtc.
```yaml
streams:
webrtc-whep: webrtc:http://192.168.1.123:1984/api/webrtc?src=camera1
```
### Wyze
[`new in v1.6.1`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.6.1)
Legacy method to connect to [Wyze](https://www.wyze.com/) cameras using WebRTC protocol via [docker-wyze-bridge](https://github.com/mrlt8/docker-wyze-bridge). For native P2P support without docker-wyze-bridge, see [Source: Wyze](../wyze/README.md).
```yaml
streams:
webrtc-wyze: webrtc:http://192.168.1.123:5000/signaling/camera1?kvs#format=wyze
```
## WebRTC Server
What you should know about WebRTC:
- It's almost always a **direct [peer-to-peer](https://en.wikipedia.org/wiki/Peer-to-peer) connection** from your browser to the go2rtc app
@@ -15,16 +98,58 @@ If an external connection via STUN is used:
- https://habr.com/ru/companies/flashphoner/articles/480006/
- https://www.youtube.com/watch?v=FXVg2ckuKfs
## Default config
### Configuration suggestions
- by default, WebRTC uses both TCP and UDP on port 8555 for connections
- you can use this port for external access
- you can change the port in YAML config:
```yaml
webrtc:
  listen: ":8555" # address of your local server and port (TCP/UDP)
  ice_servers:
    - urls: [ "stun:stun.l.google.com:19302" ]
```
## Config
#### Static public IP
- forward the port 8555 on your router (you can use the same 8555 port or any other as external port)
- add your external IP address and external port to the YAML config
```yaml
webrtc:
candidates:
- 216.58.210.174:8555 # if you have a static public IP address
```
#### Dynamic public IP
- forward the port 8555 on your router (you can use the same 8555 port or any other as the external port)
- add the `stun` keyword and the external port to the YAML config
- go2rtc automatically detects your external address via a STUN server
```yaml
webrtc:
candidates:
- stun:8555 # if you have a dynamic public IP address
```
#### Hard tech way 1. Own TCP-tunnel
If you have a personal [VPS](https://en.wikipedia.org/wiki/Virtual_private_server), you can create a TCP tunnel and set it up the same way as "Static public IP". But use your VPS IP address in the YAML config.
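In that case the candidate points at the tunnel endpoint (the IP below is a placeholder):

```yaml
webrtc:
  candidates:
    - 203.0.113.10:8555  # VPS public IP; the tunnel forwards this port to go2rtc's 8555
```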
#### Hard tech way 2. Using TURN-server
If you have a personal [VPS](https://en.wikipedia.org/wiki/Virtual_private_server), you can install a TURN server (e.g. [coturn](https://github.com/coturn/coturn), config [example](https://github.com/AlexxIT/WebRTC/wiki/Coturn-Example)).
```yaml
webrtc:
ice_servers:
- urls: [stun:stun.l.google.com:19302]
- urls: [turn:123.123.123.123:3478]
username: your_user
credential: your_pass
```
### Full configuration
**Important!** This example is not for copy/pasting!
@@ -82,7 +207,7 @@ You can set a **fixed TCP** port and a **random UDP** port for all connections:
You can also disable the TCP port and leave only random UDP ports: `listen: ""`.
## Config filters
### Configuration filters
**Important!** By default, go2rtc excludes all Docker-like candidates (`172.16.0.0/12`). This cannot be disabled.
@@ -106,6 +231,28 @@ webrtc:
candidates: [ 192.168.1.2:8555 ] # add manual host candidate (use docker port forwarding)
```
## Streaming ingest
### Ingest: Browser
[`new in v1.3.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.3.0)
You can turn the browser of any PC or mobile device into an IP camera with support for video and two-way audio. You can even broadcast your PC screen:
1. Create an empty stream in `go2rtc.yaml`
2. Go to the go2rtc WebUI
3. Open the `links` page for your stream
4. Select the `camera+microphone` or `display+speaker` option
5. Open the `webrtc` local page (your go2rtc **should work over HTTPS!**) or the `share link` via [WebTorrent](../webtorrent/README.md) technology (works over HTTPS by default)
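Step 1 is just a stream entry with an empty source (the stream name here is an example):

```yaml
streams:
  browser_cam:  # empty source, filled when the browser starts publishing
```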
### Ingest: WHIP
[`new in v1.3.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.3.0)
You can use **OBS Studio** or any other broadcast software with [WHIP](https://www.ietf.org/archive/id/draft-ietf-wish-whip-01.html) protocol support. This standard has not yet been approved, but you can download an OBS Studio [dev version](https://github.com/obsproject/obs-studio/actions/runs/3969201209):
- Settings > Stream > Service: WHIP > `http://192.168.1.123:1984/api/webrtc?dst=camera1`
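As with browser ingest, the `dst` stream only needs to exist in the config; an empty entry is enough:

```yaml
streams:
  camera1:  # OBS publishes into this stream via WHIP (?dst=camera1)
```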
## Useful links
- https://www.ietf.org/archive/id/draft-ietf-wish-whip-01.html
+45
View File
@@ -0,0 +1,45 @@
# WebTorrent
> [!NOTE]
> This section needs some improvement.
## WebTorrent Client
[`new in v1.3.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.3.0)
This source can get a stream from another go2rtc instance via the [WebTorrent](https://en.wikipedia.org/wiki/WebTorrent) protocol.
### Client Configuration
```yaml
streams:
webtorrent1: webtorrent:?share=huofssuxaty00izc&pwd=k3l2j9djeg8v8r7e
```
## WebTorrent Server
[`new in v1.3.0`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.3.0)
This module supports:
- Share any local stream via [WebTorrent](https://webtorrent.io/) technology
- Get any [incoming stream](../webrtc/README.md#ingest-browser) from PC or mobile via [WebTorrent](https://webtorrent.io/) technology
- Get any remote go2rtc source via [WebTorrent](https://webtorrent.io/) technology
Securely and freely: you do not need to open public access to the go2rtc server. But in some cases (symmetric NAT), you may need to set up external access for the [WebRTC module](../webrtc/README.md).
To generate a sharing link or incoming link, go to the go2rtc WebUI (stream links page). This link is **temporary** and will stop working after go2rtc is restarted!
### Server Configuration
You can create permanent external links in the go2rtc config:
```yaml
webtorrent:
shares:
super-secret-share: # share name, should be unique among all go2rtc users!
pwd: super-secret-password
src: rtsp-dahua1 # stream name from streams section
```
Link example: `https://go2rtc.org/webtorrent/#share=02SNtgjKXY&pwd=wznEQqznxW&media=video+audio`
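Another go2rtc instance can consume such a permanent share using the client syntax shown above:

```yaml
streams:
  remote_cam: webtorrent:?share=super-secret-share&pwd=super-secret-password
```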
+3
View File
@@ -1,5 +1,8 @@
# Wyoming
> [!NOTE]
> The format is under development and does not yet work stably.
This module provides [Wyoming Protocol](https://www.home-assistant.io/integrations/wyoming/) support to create local voice assistants using [Home Assistant](https://www.home-assistant.io/).
- go2rtc can act as [Wyoming Satellite](https://github.com/rhasspy/wyoming-satellite)
+40 -38
View File
@@ -1,5 +1,7 @@
# Wyze
[`new in v1.9.14`](https://github.com/AlexxIT/go2rtc/releases/tag/v1.9.14) by [@seydx](https://github.com/seydx)
This source allows you to stream from [Wyze](https://wyze.com/) cameras using the native P2P protocol without the Wyze app or SDK.
**Important:**
@@ -44,14 +46,14 @@ The stream URL is automatically generated when you add cameras via the WebUI:
wyze://[IP]?uid=[P2P_ID]&enr=[ENR]&mac=[MAC]&model=[MODEL]&subtype=[hd|sd]&dtls=true
```
| Parameter | Description |
|-----------|-------------------------------------------------|
| `IP` | Camera's local IP address |
| `uid` | P2P identifier (20 chars) |
| `enr` | Encryption key for DTLS |
| `mac` | Device MAC address |
| `model` | Camera model (e.g., HL_CAM4) |
| `dtls` | Enable DTLS encryption (default: true) |
| `subtype` | Camera resolution: `hd` or `sd` (default: `hd`) |
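Put together, a manually written stream definition would look like this (all parameter values below are placeholders):

```yaml
streams:
  wyze_cam: wyze://192.168.1.123?uid=ABCDEFGHIJKLMNOPQRST&enr=0123456789abcdef&mac=AABBCCDDEEFF&model=HL_CAM4&subtype=hd&dtls=true
```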
## Configuration
@@ -72,35 +74,35 @@ Two-way audio (intercom) is supported automatically. When a consumer sends audio
## Camera Compatibility
| Name | Model | Firmware | Protocol | Encryption | Codecs |
|-----------------------------|----------------|--------------|----------|------------|------------|
| Wyze Cam v4 | HL_CAM4 | 4.52.9.4188 | TUTK | TransCode | h264, aac |
| | | 4.52.9.5332 | TUTK | HMAC-SHA1 | h264, aac |
| Wyze Cam v3 Pro | | | TUTK | | |
| Wyze Cam v3 | WYZE_CAKP2JFUS | 4.36.14.3497 | TUTK | TransCode | h264, pcm |
| Wyze Cam v2 | WYZEC1-JZ | 4.9.9.3006 | TUTK | TransCode | h264, pcmu |
| Wyze Cam v1 | | | TUTK | | |
| Wyze Cam Pan v4 | | | Gwell* | | |
| Wyze Cam Pan v3 | | | TUTK | | |
| Wyze Cam Pan v2 | | | TUTK | | |
| Wyze Cam Pan v1 | | | TUTK | | |
| Wyze Cam OG | | | Gwell* | | |
| Wyze Cam OG Telephoto | | | Gwell* | | |
| Wyze Cam OG (2025) | | | Gwell* | | |
| Wyze Cam Outdoor v2 | | | TUTK | | |
| Wyze Cam Outdoor v1 | | | TUTK | | |
| Wyze Cam Floodlight Pro | | | ? | | |
| Wyze Cam Floodlight v2 | | | TUTK | | |
| Wyze Cam Floodlight | | | TUTK | | |
| Wyze Video Doorbell v2 | HL_DB2 | 4.51.3.4992 | TUTK | TransCode | h264, pcm |
| Wyze Video Doorbell v1 | | | TUTK | | |
| Wyze Video Doorbell Pro | | | ? | | |
| Wyze Battery Video Doorbell | | | ? | | |
| Wyze Duo Cam Doorbell | | | ? | | |
| Wyze Battery Cam Pro | | | ? | | |
| Wyze Solar Cam Pan | | | ? | | |
| Wyze Duo Cam Pan | | | ? | | |
| Wyze Window Cam | | | ? | | |
| Wyze Bulb Cam | | | ? | | |
_* Gwell-based protocols are not yet supported._
+2 -2
View File
@@ -1,6 +1,6 @@
# Xiaomi Mi Home
**Added in v1.9.13. Improved in v1.9.14.**
This source allows you to view cameras from the [Xiaomi Mi Home](https://home.mi.com/) ecosystem.
+1 -1
View File
@@ -12,7 +12,7 @@ Source for receiving stream from new [Yandex IP camera](https://alice.yandex.ru/
1. Open this link in any browser: https://iot.quasar.yandex.ru/m/v3/user/devices
2. Copy ID of your camera, key: `"id"`.
## Configuration
```yaml
streams:
+10 -3
View File
@@ -1,7 +1,14 @@
{
"devDependencies": {
"eslint": "^8.44.0",
"eslint-plugin-html": "^7.1.0"
"@types/node": "^25.2.0",
"eslint": "^9.39.2",
"eslint-plugin-html": "^8.1.4",
"vitepress": "^2.0.0-alpha.16"
},
"scripts": {
"docs:dev": "vitepress dev website --host",
"docs:build": "vitepress build website",
"docs:preview": "vitepress preview website"
},
"eslintConfig": {
"env": {
@@ -37,4 +44,4 @@
}
]
}
}
+68
View File
@@ -0,0 +1,68 @@
## FFplay output
[FFplay](https://stackoverflow.com/questions/27778678/what-are-mv-fd-aq-vq-sq-and-f-in-a-video-stream) `7.11 A-V: 0.003 fd= 1 aq= 21KB vq= 321KB sq= 0B f=0/0`:
- `7.11` - master clock, the time from the start of the stream/video
- `A-V` - av_diff, difference between audio and video timestamps
- `fd` - frames dropped
- `aq` - audio queue (0 - no delay)
- `vq` - video queue (0 - no delay)
- `sq` - subtitle queue
- `f` - timestamp error correction rate (Not 100% sure)
`M-V` and `M-A` mean video-only and audio-only streams, respectively.
## Devices Windows
```
>ffmpeg -hide_banner -f dshow -list_options true -i video="VMware Virtual USB Video Device"
[dshow @ 0000025695e52900] DirectShow video device options (from video devices)
[dshow @ 0000025695e52900] Pin "Record" (alternative pin name "0")
[dshow @ 0000025695e52900] pixel_format=yuyv422 min s=1280x720 fps=1 max s=1280x720 fps=10
[dshow @ 0000025695e52900] pixel_format=yuyv422 min s=1280x720 fps=1 max s=1280x720 fps=10 (tv, bt470bg/bt709/unknown, topleft)
[dshow @ 0000025695e52900] pixel_format=nv12 min s=1280x720 fps=1 max s=1280x720 fps=23
[dshow @ 0000025695e52900] pixel_format=nv12 min s=1280x720 fps=1 max s=1280x720 fps=23 (tv, bt470bg/bt709/unknown, topleft)
```
## Devices Mac
```
% ./ffmpeg -hide_banner -f avfoundation -list_devices true -i ""
[AVFoundation indev @ 0x7f8b1f504d80] AVFoundation video devices:
[AVFoundation indev @ 0x7f8b1f504d80] [0] FaceTime HD Camera
[AVFoundation indev @ 0x7f8b1f504d80] [1] Capture screen 0
[AVFoundation indev @ 0x7f8b1f504d80] AVFoundation audio devices:
[AVFoundation indev @ 0x7f8b1f504d80] [0] Soundflower (2ch)
[AVFoundation indev @ 0x7f8b1f504d80] [1] Built-in Microphone
[AVFoundation indev @ 0x7f8b1f504d80] [2] Soundflower (64ch)
```
## Devices Linux
```
# ffmpeg -hide_banner -f v4l2 -list_formats all -i /dev/video0
[video4linux2,v4l2 @ 0x7f7de7c58bc0] Raw : yuyv422 : YUYV 4:2:2 : 640x480 160x120 176x144 320x176 320x240 352x288 432x240 544x288 640x360 752x416 800x448 800x600 864x480 960x544 960x720 1024x576 1184x656 1280x720 1280x960
[video4linux2,v4l2 @ 0x7f7de7c58bc0] Compressed: mjpeg : Motion-JPEG : 640x480 160x120 176x144 320x176 320x240 352x288 432x240 544x288 640x360 752x416 800x448 800x600 864x480 960x544 960x720 1024x576 1184x656 1280x720 1280x960
```
## TTS
```yaml
streams:
tts: ffmpeg:#input=-readrate 1 -readrate_initial_burst 0.001 -f lavfi -i "flite=text='1 2 3 4 5 6 7 8 9 0'"#audio=pcma
```
## Useful links
- https://superuser.com/questions/564402/explanation-of-x264-tune
- https://stackoverflow.com/questions/33624016/why-sliced-thread-affect-so-much-on-realtime-encoding-using-ffmpeg-x264
- https://codec.fandom.com/ru/wiki/X264_-_описание_ключей_кодирования
- https://html5test.com/
- https://trac.ffmpeg.org/wiki/Capture/Webcam
- https://trac.ffmpeg.org/wiki/DirectShow
- https://stackoverflow.com/questions/53207692/libav-mjpeg-encoding-and-huffman-table
- https://github.com/tuupola/esp_video/blob/master/README.md
- https://github.com/leandromoreira/ffmpeg-libav-tutorial
- https://www.reddit.com/user/VeritablePornocopium/comments/okw130/ffmpeg_with_libfdk_aac_for_windows_x64/
- https://slhck.info/video/2017/02/24/vbr-settings.html
- [HomeKit audio samples problem](https://superuser.com/questions/1290996/non-monotonous-dts-with-igndts-flag)
+17 -29
View File
@@ -1,33 +1,14 @@
# Scripts
This folder contains a script for building binaries for all platforms.
The project has no `CGO` dependencies, so building is as simple as possible using the `go build` command.
The project has to use the latest versions of Go due to dependencies on third-party Go libraries, such as `pion/webrtc` or `golang.org/x`. Unfortunately, this breaks compatibility with older versions of operating systems.
The project uses [UPX](https://github.com/upx/upx) to compress binaries for Linux. The project does not use compression for Windows due to false antivirus alarms. The project does not use compression for macOS due to broken result.
## Useful commands
```
go get -u
@@ -69,18 +50,20 @@ go list -deps .\cmd\go2rtc_rtsp\
## Licenses
- github.com/asticode/go-astits - MIT
- github.com/eclipse/paho.mqtt.golang - EPL-2.0
- github.com/expr-lang/expr - MIT
- github.com/gorilla/websocket - BSD-2
- github.com/mattn/go-isatty - MIT
- github.com/miekg/dns - BSD-3
- github.com/pion/dtls - MIT
- github.com/pion/ice - MIT
- github.com/pion/interceptor - MIT
- github.com/pion/rtcp - MIT
- github.com/pion/rtp - MIT
- github.com/pion/sdp - MIT
- github.com/pion/srtp - MIT
- github.com/pion/stun - MIT
- github.com/pion/webrtc - MIT
- github.com/rs/zerolog - MIT
- github.com/sigurn/crc16 - MIT
- github.com/sigurn/crc8 - MIT
@@ -93,6 +76,11 @@ go list -deps .\cmd\go2rtc_rtsp\
- github.com/google/uuid - BSD-3
- github.com/kr/pretty - MIT
- github.com/mattn/go-colorable - MIT
- github.com/pion/datachannel - MIT
- github.com/pion/logging - MIT
- github.com/pion/mdns - MIT
- github.com/pion/randutil - MIT
- github.com/pion/sctp - MIT
- github.com/pmezard/go-difflib - ???
- github.com/wlynxg/anet - BSD-3
- golang.org/x/mod - BSD-3
+147
View File
@@ -0,0 +1,147 @@
import {defineConfig} from 'vitepress';
export default defineConfig({
title: 'go2rtc',
themeConfig: {
nav: [
{text: 'Home', link: '/'},
],
sidebar: [
{
items: [
{text: 'Installation', link: '/#installation'},
{text: 'Configuration', link: '/#configuration'},
{text: 'Security', link: '/#security'},
],
},
{
text: 'Features',
items: [
{text: 'Streaming input', link: '/#streaming-input'},
{text: 'Streaming output', link: '/#streaming-output'},
{text: 'Streaming ingest', link: '/#streaming-ingest'},
{text: 'Two-way audio', link: '/#two-way-audio'},
{text: 'Stream to camera', link: '/#stream-to-camera'},
{text: 'Publish stream', link: '/#publish-stream'},
{text: 'Preload stream', link: '/#preload-stream'},
],
collapsed: false,
},
{
text: 'Codecs',
items: [
{text: 'Codecs filters', link: '/#codecs-filters'},
{text: 'Codecs madness', link: '/#codecs-madness'},
{text: 'Built-in transcoding', link: '/#built-in-transcoding'},
{text: 'Codecs negotiation', link: '/#codecs-negotiation'},
],
collapsed: true,
},
{
text: 'Other',
items: [
{text: 'Projects using go2rtc', link: '/#projects-using-go2rtc'},
{text: 'Camera experience', link: '/#camera-experience'},
{text: 'Tips', link: '/#tips'},
],
collapsed: true,
},
{
text: 'Core modules',
items: [
{text: 'app', link: '/internal/app/'},
{text: 'api', link: '/internal/api/'},
{text: 'streams', link: '/internal/streams/'},
],
collapsed: false,
},
{
text: 'Main modules',
items: [
{text: 'http', link: '/internal/http/'},
{text: 'mjpeg', link: '/internal/mjpeg/'},
{text: 'mp4', link: '/internal/mp4/'},
{text: 'rtsp', link: '/internal/rtsp/'},
{text: 'webrtc', link: '/internal/webrtc/'},
],
collapsed: false,
},
{
text: 'Other modules',
items: [
{text: 'hls', link: '/internal/hls/'},
{text: 'homekit', link: '/internal/homekit/'},
{text: 'onvif', link: '/internal/onvif/'},
{text: 'rtmp', link: '/internal/rtmp/'},
{text: 'webtorrent', link: '/internal/webtorrent/'},
{text: 'wyoming', link: '/internal/wyoming/'},
],
collapsed: false,
},
{
text: 'Script sources',
items: [
{text: 'echo', link: '/internal/echo/'},
{text: 'exec', link: '/internal/exec/'},
{text: 'expr', link: '/internal/expr/'},
{text: 'ffmpeg', link: '/internal/ffmpeg/'},
],
collapsed: false,
},
{
text: 'Other sources',
items: [
{text: 'alsa', link: '/internal/alsa/'},
{text: 'bubble', link: '/internal/bubble/'},
{text: 'doorbird', link: '/internal/doorbird/'},
{text: 'dvrip', link: '/internal/dvrip/'},
{text: 'eseecloud', link: '/internal/eseecloud/'},
{text: 'flussonic', link: '/internal/flussonic/'},
{text: 'gopro', link: '/internal/gopro/'},
{text: 'isapi', link: '/internal/isapi/'},
{text: 'ivideon', link: '/internal/ivideon/'},
{text: 'hass', link: '/internal/hass/'},
{text: 'mpegts', link: '/internal/mpegts/'},
{text: 'multitrans', link: '/internal/multitrans/'},
{text: 'nest', link: '/internal/nest/'},
{text: 'ring', link: '/internal/ring/'},
{text: 'roborock', link: '/internal/roborock/'},
{text: 'tapo', link: '/internal/tapo/'},
{text: 'tuya', link: '/internal/tuya/'},
{text: 'v4l2', link: '/internal/v4l2/'},
{text: 'wyze', link: '/internal/wyze/'},
{text: 'xiaomi', link: '/internal/xiaomi/'},
{text: 'yandex', link: '/internal/yandex/'},
],
collapsed: false,
},
{
text: 'Helper modules',
items: [
{text: 'debug', link: '/internal/debug/'},
{text: 'ngrok', link: '/internal/ngrok/'},
{text: 'pinggy', link: '/internal/pinggy/'},
{text: 'srtp', link: '/internal/srtp/'},
],
collapsed: false,
},
],
socialLinks: [
{icon: "github", link: "https://github.com/AlexxIT/go2rtc"}
],
outline: [2, 3],
search: {provider: 'local'},
},
rewrites: {
'README.md': 'index.md',
'(.*)/README.md': '(.*)/index.md',
},
srcDir: '..',
srcExclude: ['examples/', 'pkg/'],
// cleanUrls: true,
ignoreDeadLinks: true,
});
+9
View File
@@ -0,0 +1,9 @@
# WebSite
These are the sources of the [go2rtc.org](https://go2rtc.org/) website. Its content is published on GitHub Pages and is a mirror of [alexxit.github.io/go2rtc/](http://alexxit.github.io/go2rtc/).
The site contains:
- Project's documentation, which is compiled via [vitepress](https://github.com/vuejs/vitepress) from `README.md` files located in the root of the repository, as well as in the `internal` folder.
- Project's API in OpenAPI format, and the [Redoc](https://github.com/Redocly/redoc) viewer
- Project's assets (logo).
+2 -2
View File
@@ -1,7 +1,7 @@
<!DOCTYPE html>
<html lang="en">
<head>
<title>go2rtc - API</title>
<title>API | go2rtc</title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link href="https://fonts.googleapis.com/css?family=Montserrat:300,400,700|Roboto:300,400,700" rel="stylesheet">
@@ -13,7 +13,7 @@
</style>
</head>
<body>
<redoc spec-url="https://raw.githubusercontent.com/AlexxIT/go2rtc/master/api/openapi.yaml"></redoc>
<redoc spec-url="openapi.yaml"></redoc>
<script src="https://cdn.redoc.ly/redoc/latest/bundles/redoc.standalone.js"></script>
</body>
</html>
Binary file not shown.


-28
View File
@@ -1,28 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>go2rtc</title>
<meta http-equiv="refresh" content="2; URL='https://github.com/AlexxIT/go2rtc'"/>
<style>
body, html {
height: 100%;
margin: 0;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
background-color: white;
}
img {
max-width: 100%;
height: auto;
}
</style>
</head>
<body>
<img src="https://raw.githubusercontent.com/AlexxIT/go2rtc/master/assets/logo.gif" alt="go2rtc">
<a href="https://github.com/AlexxIT/go2rtc">github.com/AlexxIT/go2rtc</a>
</body>
</html>
+46 -80
View File
@@ -1,3 +1,49 @@
# www
This folder contains static HTTP and JS content that is embedded into the application during build. An external developer can use it as a basis for integrating go2rtc into their project or for developing a custom web interface for go2rtc.
## HTTP API
`www/stream.html` - universal viewer with support for params in the URL:
- multiple streams on page `src=camera1&src=camera2...`
- stream technology autoselection `mode=webrtc,webrtc/tcp,mse,hls,mp4,mjpeg`
- stream technology comparison `src=camera1&mode=webrtc&mode=mse&mode=mp4`
- player width setting in pixels `width=320px` or percents `width=50%`
`www/webrtc.html` - WebRTC viewer with support for two-way audio and params in the URL:
- `media=video+audio` - simple viewer
- `media=video+audio+microphone` - two way audio from camera
- `media=camera+microphone` - stream from browser
- `media=display+speaker` - stream from desktop
## JavaScript API
- You can write your own viewer from scratch
- You can extend the built-in viewer - `www/video-rtc.js`
- Check example - `www/video-stream.js`
- Check example - https://github.com/AlexxIT/WebRTC
`video-rtc.js` features:
- supported technologies:
  - WebRTC over UDP or TCP
  - MSE, HLS, MP4 or MJPEG over WebSocket
- automatic selection of the best technology based on:
  - codecs inside your stream
  - current browser capabilities
  - current network configuration
- automatically stops the stream when the browser or page is not active
- automatically stops the stream when the player is outside the page viewport
- automatic reconnection
Technology selection based on priorities:
1. Video and Audio is better than Video only
2. H265 is better than H264
3. WebRTC is better than MSE, which is better than HLS, which is better than MJPEG
## Browser support
[ECMAScript 2019 (ES10)](https://caniuse.com/?search=es10) supported by [iOS 12](https://en.wikipedia.org/wiki/IOS_12) (iPhone 5S, iPad Air, iPad Mini 2, etc.).
@@ -8,86 +54,6 @@ But [ECMAScript 2017 (ES8)](https://caniuse.com/?search=es8) almost fine (`es6 +
- Autoplay doesn't work for WebRTC in Safari [read more](https://developer.apple.com/documentation/webkit/delivering_video_content_for_safari/).
## HTML5
**1. Autoplay video tag**
[Video auto play is not working](https://stackoverflow.com/questions/17994666/video-auto-play-is-not-working-in-safari-and-chrome-desktop-browser)
> Recently many browsers can only autoplay the videos with sound off, so you'll need to add muted attribute to the video tag too
```html
<video id="video" autoplay controls playsinline muted></video>
```
- https://developer.apple.com/documentation/webkit/delivering_video_content_for_safari/
**2. [Safari] pc.createOffer**
Don't work in Desktop Safari:
```js
pc.createOffer({offerToReceiveAudio: true, offerToReceiveVideo: true})
```
Should be replaced with:
```js
pc.addTransceiver('video', {direction: 'recvonly'});
pc.addTransceiver('audio', {direction: 'recvonly'});
pc.createOffer();
```
**3. pc.ontrack**
TODO
```js
pc.ontrack = ev => {
const video = document.getElementById('video');
// when audio track not exist in Chrome
if (ev.streams.length === 0) return;
// when audio track not exist in Firefox
if (ev.streams[0].id[0] === '{') return;
// when stream already init
if (video.srcObject !== null) return;
video.srcObject = ev.streams[0];
}
```
## Chromecast 1
2023-02-02. Error:
```
InvalidStateError: Failed to execute 'addTransceiver' on 'RTCPeerConnection': This operation is only supported in 'unified-plan'. 'unified-plan' will become the default behavior in the future, but it is currently experimental. To try it out, construct the RTCPeerConnection with sdpSemantics:'unified-plan' present in the RTCConfiguration argument.
```
User-Agent: `Mozilla/5.0 (X11; Linux armv7l) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.47 Safari/537.36 CrKey/1.36.159268`
https://webrtc.org/getting-started/unified-plan-transition-guide?hl=en
## Web Icons
[Favicon checker](https://realfavicongenerator.net/), skip:
- Windows 8 and 10 (`browserconfig.xml`)
- Mac OS X El Capitan Safari
```html
<!-- iOS Safari -->
<link rel="apple-touch-icon" href="https://go2rtc.org/icons/apple-touch-icon-180x180.png" sizes="180x180">
<!-- Classic, desktop browsers -->
<link rel="icon" href="https://go2rtc.org/icons/favicon.ico">
<!-- Android Chrome -->
<link rel="manifest" href="https://go2rtc.org/manifest.json">
```
## Useful links
- https://www.webrtc-experiment.com/DetectRTC/