I am currently building a simple home monitoring system, in which I can enter the application either as a streamer or as a consumer.
Technologies

- Socket.IO - for signalling between the streamer and the connected peers
- A basic Vite app with React
- WebRTC to create the streaming channel between the 2 peers

Only localhost is used, for development purposes. I run both the client and the server in a WSL terminal on Windows, and the app is accessed from the Windows machine.
Current behaviour

After visiting http://localhost/?stream, I start streaming and wait for incoming messages from the WebSocket to begin signalling with new peers.
When a consumer visits http://localhost, the WebRTC connection is carried out and the tracks are received from the streamer side.
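For clarity, the signalling exchange boils down to three Socket.IO events (the event names match the server code posted below); this is just an illustrative summary, not part of the app:

```javascript
// Illustrative summary of the signalling flow (event names from the server code below).
const signalFlow = [
  { from: "streamer", event: "begin-stream" },         // streamer registers its socket id
  { from: "consumer", event: "request-start-stream" }, // consumer sends its SDP offer
  { from: "streamer", event: "response-start-stream" } // streamer replies with its SDP answer
];

console.log(signalFlow.map((step) => step.event).join(" -> "));
// begin-stream -> request-start-stream -> response-start-stream
```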
The consumer-side peerConnection.ontrack event does actually fire, but the media stream appears to contain only 1 muted video track.
On the streamer side, when I call peerConnection.addTrack, I actually add 2 tracks: one for video and one for audio.
Problem

When the consumer receives the tracks/stream from the streamer, it sets it as the video element's srcObject property, but nothing happens: neither video nor audio plays.
Expected behaviour

Video and audio from the streamer should play on the consumer side after the tracks/stream are received and set as the consumer's video.srcObject property.
What I've tried

I have gone through various related questions on StackOverflow and tried applying the fixes they suggested, but none of them worked:

- Setting the video element's autoPlay, playsInline and muted attributes to true
- Playing the video element programmatically after setting the remote video.srcObject
- Using a STUN server - though I don't think this is required, since I'm using localhost rather than connecting two peers over a WAN/the internet
- Connecting the peers asynchronously after a button click on the consumer side, so that video playback is not attempted without user interaction
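In code, those attempted fixes looked roughly like the following sketch (the STUN URL is just a common public example, and `videoRef` refers to the ref from the hook posted below):

```javascript
// Rough sketch of the fixes attempted above (not the actual app code).

// 1) Video element flags that normally satisfy browser autoplay policies:
const videoProps = { autoPlay: true, playsInline: true, muted: true };

// 2) Programmatic playback after assigning the remote stream, surfacing rejections:
//    videoRef.current.srcObject = stream;
//    videoRef.current.play().catch((err) => console.error("play() failed:", err));

// 3) Passing a public STUN server (likely unnecessary on localhost):
const rtcConfig = { iceServers: [{ urls: "stun:stun.l.google.com:19302" }] };
```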
Code

Below are the application files involved in the main app logic.
Don't worry, the repo really is a very basic, minimal example of this problem (it isn't that big).
The server:

```js
import { Server } from "socket.io";

const io = new Server();

let streamer = null;
let users = new Set();

io.on("connect", (socket) => {
  if (users.has(socket.id)) {
    return;
  }
  console.log(
    `Socket ${socket.id} connected - Client IP Address: ${socket.handshake.address}`
  );
  socket.on("disconnect", (reason) => {
    console.log(`Socket: ${socket.id} disconnected - Reason: ${reason}`);
  });
  socket.on("begin-stream", () => {
    console.log("begin stream", socket.id);
    streamer = socket.id;
  });
  socket.on("request-start-stream", (data) => {
    console.log("request-start-stream");
    socket.to(streamer).emit("handle-request-start-stream", {
      to: socket.id,
      offer: data.offer,
    });
  });
  socket.on("response-start-stream", (data) => {
    console.log("response-start-stream");
    socket.to(data.to).emit("handle-response-start-stream", data.answer);
  });
});

io.listen(5432, { cors: true });
```
The Streamer component:

```jsx
import { useEffect } from "react";
import socket from "./socket";
import useVideo from "./useVideo";

export default function Streamer() {
  const { videoRef, element: videoElement } = useVideo();

  useEffect(() => {
    init();
    async function init() {
      if (!videoRef.current) return;
      const stream = await navigator.mediaDevices.getUserMedia({
        video: true,
        audio: true,
      });
      videoRef.current.srcObject = stream;
      socket.emit("begin-stream");
      socket.on("handle-request-start-stream", async ({ to, offer }) => {
        const peerConnection = new RTCPeerConnection();
        stream.getTracks().forEach((track) => {
          console.log("track", track);
          peerConnection.addTrack(track, stream);
        });
        await peerConnection.setRemoteDescription(
          new RTCSessionDescription(offer)
        );
        const answer = await peerConnection.createAnswer();
        await peerConnection.setLocalDescription(
          new RTCSessionDescription(answer)
        );
        socket.emit("response-start-stream", { to, answer });
      });
    }
  }, []);

  return videoElement;
}
```
The Consumer component:

```jsx
import { useEffect } from "react";
import socket from "./socket";
import useVideo from "./useVideo";

export default function Consumer() {
  const { videoRef, element: videoElement } = useVideo();

  useEffect(() => {
    init();
    async function init() {
      const peerConnection = new RTCPeerConnection();
      peerConnection.addEventListener("track", ({ streams: [stream] }) => {
        if (!videoRef.current) return;
        console.log("stream", stream);
        videoRef.current.srcObject = stream;
      });
      const offer = await peerConnection.createOffer({
        // workaround to receive the tracks
        // if this is not specified, I will never receive the tracks
        offerToReceiveAudio: true,
        offerToReceiveVideo: true,
      });
      await peerConnection.setLocalDescription(
        new RTCSessionDescription(offer)
      );
      socket.emit("request-start-stream", { offer });
      socket.on("handle-response-start-stream", async (answer) => {
        await peerConnection.setRemoteDescription(
          new RTCSessionDescription(answer)
        );
      });
    }
  }, []);

  return videoElement;
}
```
The video tag hook:

```tsx
import { useRef } from "react";

export default function useVideo() {
  const videoRef = useRef<HTMLVideoElement>(null);
  const element = <video ref={videoRef} autoPlay />;
  return { videoRef, element };
}
```