Sending h264 frames #1286
-
I'm new to WebRTC and video encoding. I'm trying to encode images to H264 and send the frames over a video track set up like this:

```cpp
rtc::Description::Video media("video", rtc::Description::Direction::SendOnly);
media.addH264Codec(102);

constexpr rtc::SSRC ssrc = 42;
media.addSSRC(ssrc, "video");
video_track = peer->addTrack(media);

auto rtpConfig = std::make_shared<rtc::RtpPacketizationConfig>(
    ssrc, "video", 102, rtc::H264RtpPacketizer::defaultClockRate);
auto packetizer = std::make_shared<rtc::H264RtpPacketizer>(
    rtc::NalUnit::Separator::LongStartSequence, rtpConfig);

auto srReporter = std::make_shared<rtc::RtcpSrReporter>(rtpConfig);
packetizer->addToChain(srReporter);

auto nackResponder = std::make_shared<rtc::RtcpNackResponder>();
packetizer->addToChain(nackResponder);

video_track->setMediaHandler(packetizer);
```

Why might my RTP packets be ignored? I found a related problem in discussion #699.
-
Do you encode to H264 properly (you can write the raw H264 frames to a file for testing)? Do you see any suspicious records in libdatachannel's log? If a browser is used on the receiver's side, do you see anything wrong in its WebRTC internals (Chrome: chrome://webrtc-internals; Edge: edge://webrtc-internals; Firefox: about:webrtc)?
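As an aside, here is a minimal sketch of the file-dump test suggested above. The `onEncodedFrame` callback name and the assumption that the encoder emits Annex-B (start-code prefixed) buffers are hypothetical; the dumped file can then be checked with a player such as ffplay or VLC:

```cpp
#include <cstddef>
#include <fstream>

// Hypothetical sink: append every encoded access unit to a raw .h264 file.
// If the encoder produces valid Annex-B H264, the file should play back
// directly, e.g. with `ffplay dump.h264`.
class H264FileDump {
public:
    explicit H264FileDump(const char *path) : mFile(path, std::ios::binary) {}

    void onEncodedFrame(const std::byte *data, std::size_t size) {
        mFile.write(reinterpret_cast<const char *>(data),
                    static_cast<std::streamsize>(size));
    }

private:
    std::ofstream mFile;
};
```

If the dumped file does not play, the problem is on the encoder side rather than in the packetization chain.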
The timestamps look fine: the difference between two audio timestamps is 960 (20 ms at 48 kHz), and the difference between two video timestamps is 3000 (around 33.3 ms at the 90 kHz H264 clock rate, i.e. roughly 30 fps).
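For reference, this timestamp progression normally comes from advancing `rtpConfig->timestamp` before each frame is sent. Below is a sketch following the pattern of libdatachannel's streamer example, with a hypothetical `frameIndex` counter and an assumed 30 fps rate:

```cpp
// Advance the RTP timestamp for the current frame: at 30 fps and a 90 kHz
// clock this adds 90000 / 30 = 3000 ticks per frame.
double elapsedSeconds = frameIndex / 30.0; // hypothetical frame counter
rtpConfig->timestamp =
    rtpConfig->startTimestamp + rtpConfig->secondsToTimestamp(elapsedSeconds);

// Then send the encoded frame on the track as a single message.
video_track->send(reinterpret_cast<const std::byte *>(frameData), frameSize);
```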
As you can see on the Chrome graph, it receives packets but no valid frames. This looks like an encoding issue. How do you bundle NAL units together into frames? It has to be consistent with the packetizer separator setting.