live555 Compilation and Playback Example

I was recently assigned to work on ONVIF. ONVIF video streaming uses live555 as the media server and hands out the URLs it provides, so live555 is something I need to learn as well. This article briefly describes how to build live555 and then presents an example built on top of one of the bundled demo programs.

1. Compilation

The live555 homepage is http://www.live555.com/, and the source code can be downloaded from http://www.live555.com/liveMedia/public/. live555 supports many platforms, such as Mac OS X, Linux, and MinGW, and ships a configuration for each of them, so building is straightforward. In a MinGW environment the build steps are:

$ ./genMakefiles mingw 
$ export CC=gcc
$ make

Similarly, on Linux:

$ ./genMakefiles linux
$ make

After the build finishes, the testProgs directory contains a number of test programs that can be run for testing without modifying any code. Take testH264VideoStreamer as an example: copy an H.264 video file into that directory, rename it to test.264, and run testH264VideoStreamer (the MinGW build produces testH264VideoStreamer.exe). Then open a network stream in VLC using the address rtsp://ip:8554/testStream, e.g. rtsp://192.168.1.100:8554/testStream.
By default the demo plays the file test.264. To stream a different file, edit the source file testH264VideoStreamer.cpp. To rebuild, simply run make in testProgs.
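
For reference, the default file name is hard-coded near the top of testH264VideoStreamer.cpp; in the stock demo source the relevant line looks like the following, and changing the string is all that is needed to stream a different file:

char const* inputFileName = "test.264"; // the H.264 elementary-stream file to serve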

2. Example

The example below adds unicast support on top of testH264VideoStreamer.cpp; the unicast part is modeled on the testOnDemandRTSPServer demo. The code is as follows:

/**
This program serves an H.264 file over both unicast and multicast. It is based on the
testH264VideoStreamer demo, with the unicast part modeled on testOnDemandRTSPServer.
Notes:
Unicast: reconnecting from VLC restarts reading the file from the beginning; no artifacts.
Multicast: reconnecting from VLC continues from wherever the shared stream currently is;
on each new connection there are visual artifacts (mosaic), and VLC logs:
main error: pictures leaked, trying to workaround
*/
#include <liveMedia.hh>
#include <BasicUsageEnvironment.hh>
#include <GroupsockHelper.hh>
#include <stdio.h>
#include <string.h>

UsageEnvironment* env;
char inputFileName[128] = {0}; // input video file
H264VideoStreamFramer* videoSource;
RTPSink* videoSink;

Boolean reuseFirstSource = False;

void play(); // forward

void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
char const* streamName, char const* inputFileName);

int main(int argc, char** argv) {
strcpy(inputFileName, "test.264"); // default file name
if (argc == 2) {
strcpy(inputFileName, argv[1]);
}
printf("Using file: %s\n", inputFileName);

// Begin by setting up our usage environment:
TaskScheduler* scheduler = BasicTaskScheduler::createNew();
env = BasicUsageEnvironment::createNew(*scheduler);

// Description string used for the sessions
char const* descriptionString
= "Session streamed by \"testH264VideoStreamer\"";

// Create the RTSP server, listening on port 8554
RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
if (rtspServer == NULL) {
*env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
exit(1);
}

// Multicast stream
// Create 'groupsocks' for RTP and RTCP:
struct in_addr destinationAddress;
destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);

const unsigned short rtpPortNum = 18888;
const unsigned short rtcpPortNum = rtpPortNum+1;
const unsigned char ttl = 255;

const Port rtpPort(rtpPortNum);
const Port rtcpPort(rtcpPortNum);

Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
rtpGroupsock.multicastSendOnly(); // we're a SSM source
Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
rtcpGroupsock.multicastSendOnly(); // we're a SSM source

// Create a 'H264 Video RTP' sink from the RTP 'groupsock':
OutPacketBuffer::maxSize = 200000;
videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);

// Create (and start) a 'RTCP instance' for this RTP sink:
const unsigned estimatedSessionBandwidth = 500; // in kbps; for RTCP b/w share
const unsigned maxCNAMElen = 100;
unsigned char CNAME[maxCNAMElen+1];
gethostname((char*)CNAME, maxCNAMElen);
CNAME[maxCNAMElen] = '\0'; // just in case
RTCPInstance* rtcp
= RTCPInstance::createNew(*env, &rtcpGroupsock,
estimatedSessionBandwidth, CNAME,
videoSink, NULL /* we're a server */,
True /* we're a SSM source */);
// Note: This starts RTCP running automatically

char const* streamName = "h264ESVideoMulticast";
ServerMediaSession* sms
= ServerMediaSession::createNew(*env, streamName, inputFileName,
descriptionString, True /*SSM*/);
sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
rtspServer->addServerMediaSession(sms);

announceStream(rtspServer, sms, streamName, inputFileName);

// Start the streaming:
*env << "Beginning streaming...\n";
play(); // start streaming the multicast session

////////////////////////////////////////////////////////////////////////

// Unicast stream (on demand)
{
char const* streamName = "h264ESVideo";
ServerMediaSession* sms
= ServerMediaSession::createNew(*env, streamName, streamName,
descriptionString);
sms->addSubsession(H264VideoFileServerMediaSubsession
::createNew(*env, inputFileName, reuseFirstSource));
rtspServer->addServerMediaSession(sms);

announceStream(rtspServer, sms, streamName, inputFileName);
}

env->taskScheduler().doEventLoop(); // does not return

return 0; // only to prevent compiler warning
}

// Called after the whole file has been read: close the source and start over
void afterPlaying(void* /*clientData*/) {
*env << "...done reading from file\n";
videoSink->stopPlaying();
Medium::close(videoSource);
// Note that this also closes the input file that this source read from.

// Start playing once again:
play();
}

void play() {
// Open the input file as a 'byte-stream file source':
ByteStreamFileSource* fileSource
= ByteStreamFileSource::createNew(*env, inputFileName);
if (fileSource == NULL) {
*env << "Unable to open file \"" << inputFileName
<< "\" as a byte-stream file source\n";
exit(1);
}

FramedSource* videoES = fileSource;

// Create a framer for the Video Elementary Stream:
videoSource = H264VideoStreamFramer::createNew(*env, videoES);

// Finally, start playing:
*env << "Beginning to read from file...\n";
videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}

void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
char const* streamName, char const* inputFileName) {
char* url = rtspServer->rtspURL(sms);
UsageEnvironment& env = rtspServer->envir();
env << "\n\"" << streamName << "\" stream, from the file \""
<< inputFileName << "\"\n";
env << "Play this stream using the URL \"" << url << "\"\n";
delete[] url;
}
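
After rebuilding in testProgs, the program can be tested from VLC. A usage sketch, assuming the server's IP is 192.168.1.100 as in the earlier example:

$ ./testH264VideoStreamer test.264

Unicast stream:   rtsp://192.168.1.100:8554/h264ESVideo
Multicast stream: rtsp://192.168.1.100:8554/h264ESVideoMulticast

The behavior noted in the header comment follows from the two subsession types: each unicast client gets its own H264VideoFileServerMediaSubsession, which opens the file from the beginning, while multicast clients join the single ongoing stream served through PassiveServerMediaSubsession, so a client that connects mid-stream typically shows artifacts until the next IDR frame arrives.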

To make it easier to follow along, the live555 source with these changes is available on GitHub at https://github.com/latelee/my_live555.git

李迟, Saturday evening, 2015.12.20