LIVE555 Revisited -- testH264VideoStreamer Source Code Analysis
The previous article already covered part of this:

testH264VideoStreamer repeatedly reads from an H.264 elementary-stream video file (named "test.264") and streams it out using RTP multicast. The program also has a built-in RTSP server.

Apple's "QuickTime Player" can be used to receive and play this stream. To do so, have the player open the session's "rtsp://" URL (which the program prints out when it starts streaming).

The open-source "VLC" and "MPlayer" media players can also be used.

Since the code is fully commented in English, analyzing the source is straightforward.
1. Source Code Analysis
First, a caveat: when I ran testH264VideoStreamer, no video ever appeared. I tried several versions of live555, none of which worked, which was quite frustrating. Then I came across an article.

See: live555 build and playback example

I tried the source code from that article, and it does work. Why? Let's look at its code first; its comments explain everything clearly.
```cpp
/*
 * This program provides both unicast and multicast streaming. It is based on
 * testH264VideoStreamer, with reference to testOnDemandRTSPServer.
 * Notes:
 *   Unicast: reconnecting with VLC restarts reading the file from the
 *     beginning; no artifacts.
 *   Multicast: reconnecting with VLC continues from the previous position in
 *     the file; artifacts appear on each connection, and VLC reports:
 *     "main error: pictures leaked, trying to workaround".
 */
#include <liveMedia.hh>
#include <BasicUsageEnvironment.hh>
#include <GroupsockHelper.hh>

UsageEnvironment* env;
char inputFileName[128] = {0}; // input video file
H264VideoStreamFramer* videoSource;
RTPSink* videoSink;
Boolean reuseFirstSource = False;

void play(); // forward
void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                    char const* streamName, char const* inputFileName);

int main(int argc, char** argv) {
  strcpy(inputFileName, "test.264"); // default
  if (argc == 2) {
    strcpy(inputFileName, argv[1]);
  }
  printf("Using file: %s\n", inputFileName);

  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Description string
  char const* descriptionString = "Session streamed by \"testH264VideoStreamer\"";

  // RTSP server, on port 8554
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    exit(1);
  }

  // Multicast
  // Create 'groupsocks' for RTP and RTCP:
  struct in_addr destinationAddress;
  destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);

  const unsigned short rtpPortNum = 18888;
  const unsigned short rtcpPortNum = rtpPortNum+1;
  const unsigned char ttl = 255;

  const Port rtpPort(rtpPortNum);
  const Port rtcpPort(rtcpPortNum);

  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
  rtpGroupsock.multicastSendOnly(); // we're a SSM source
  Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
  rtcpGroupsock.multicastSendOnly(); // we're a SSM source

  // Create a 'H264 Video RTP' sink from the RTP 'groupsock':
  OutPacketBuffer::maxSize = 200000;
  videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);

  // Create (and start) a 'RTCP instance' for this RTP sink:
  const unsigned estimatedSessionBandwidth = 500; // in kbps; for RTCP b/w share
  const unsigned maxCNAMElen = 100;
  unsigned char CNAME[maxCNAMElen+1];
  gethostname((char*)CNAME, maxCNAMElen);
  CNAME[maxCNAMElen] = '\0'; // just in case
  RTCPInstance* rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock,
                                               estimatedSessionBandwidth, CNAME,
                                               videoSink, NULL /* we're a server */,
                                               True /* we're a SSM source */);
  // Note: This starts RTCP running automatically

  char const* streamName = "h264ESVideoMulticast";
  ServerMediaSession* sms =
      ServerMediaSession::createNew(*env, streamName, inputFileName,
                                    descriptionString, True /*SSM*/);
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
  rtspServer->addServerMediaSession(sms);
  announceStream(rtspServer, sms, streamName, inputFileName);

  // Start the streaming:
  *env << "Beginning streaming...\n";
  play();

  // Unicast
  {
    char const* streamName = "h264ESVideo";
    ServerMediaSession* sms =
        ServerMediaSession::createNew(*env, streamName, streamName,
                                      descriptionString);
    sms->addSubsession(H264VideoFileServerMediaSubsession
                       ::createNew(*env, inputFileName, reuseFirstSource));
    rtspServer->addServerMediaSession(sms);
    announceStream(rtspServer, sms, streamName, inputFileName);
  }

  env->taskScheduler().doEventLoop(); // does not return

  return 0; // only to prevent compiler warning
}

// Called at end of file; loop back and keep reading
void afterPlaying(void* /*clientData*/) {
  *env << "...done reading from file\n";
  videoSink->stopPlaying();
  Medium::close(videoSource);
  // Note that this also closes the input file that this source read from.

  // Start playing once again:
  play();
}

void play() {
  // Open the input file as a 'byte-stream file source':
  ByteStreamFileSource* fileSource =
      ByteStreamFileSource::createNew(*env, inputFileName);
  if (fileSource == NULL) {
    *env << "Unable to open file \"" << inputFileName
         << "\" as a byte-stream file source\n";
    exit(1);
  }
  FramedSource* videoES = fileSource;

  // Create a framer for the Video Elementary Stream:
  videoSource = H264VideoStreamFramer::createNew(*env, videoES);

  // Finally, start playing:
  *env << "Beginning to read from file...\n";
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}

void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                    char const* streamName, char const* inputFileName) {
  char* url = rtspServer->rtspURL(sms);
  UsageEnvironment& env = rtspServer->envir();
  env << "\n\"" << streamName << "\" stream, from the file \""
      << inputFileName << "\"\n";
  env << "Play this stream using the URL \"" << url << "\"\n";
  delete[] url;
}
```
2. Concepts
The original testH264VideoStreamer source also contains this note:
// Note: This is a multicast address.  If you wish instead to stream
// using unicast, then you should use the "testOnDemandRTSPServer"
// test program - not this test program - as a model.
That is: the destination here is a multicast address, and if you want to stream using unicast instead, the "testOnDemandRTSPServer" test program, not this one, is the one to use as a model.
This brings up the concepts of unicast and multicast. I had also previously looked into live streaming versus video on demand. How do all of these differ?

See: LIVE555 Revisited -- Unicast, Multicast, Broadcast, Live Streaming, Video on Demand: What Are They?
3. Example
See: A network live-streaming system implemented with the Live555 library

In that code, the multicast IP address and port number are specified explicitly.