est | 3 days ago |
For anyone curious, this comment in the code shows how it's done: > startCameraAndStream opens the webcam with GoCV, sends raw frames to FFmpeg (via stdin), reads IVF from FFmpeg (via stdout), and writes the frames into the WebRTC video track.
Sean-Der | 3 days ago | parent |
If people want to build it another way, you could use any of these; there are examples for all of them in the repo: * Use the FFmpeg library directly * GStreamer * Use libvpx (example in the mediadevices repo)