
[Ccrtp-devel] using ccrtp without internal threading or timers


From: Rob Prowel
Subject: [Ccrtp-devel] using ccrtp without internal threading or timers
Date: Wed, 24 Jan 2007 17:02:31 -0500
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.9) Gecko/20070104 Red Hat/1.0.7-0.6.fc5 SeaMonkey/1.0.7

Hi.

I've got a hardware MPEG4 encoder that generates MPEG4 video and u-law PCM audio. The kernel driver returns the frames as a multiplexed stream of data blocks/frames, where the interval between frames is governed by the driver and the read() syscall blocks until a frame arrives. I need to separate the audio and video into two distinct RTP sessions and transmit them as near to real time as possible. Any delay of more than 200 ms between when an event happens and when it is perceived by the operator on the other end of the wifi link is unacceptable; in that case the frame should be dropped rather than presented.

I believe it would be overly complex to rely on the threading and timers of ccRTP to set up and manage message delivery when the frame producer already regulates the interval of the frames to be sent. If I have to buffer a few audio packets initially and put up with 50 ms of A/V sync drift, I don't have a problem with that.

I simply don't want any RTP-managed threading or timing going on, since the frame producer already governs that for me. When I send a timestamped frame to putData() I want it to go out immediately. The available documentation isn't explicit enough to answer my questions. Will the following work for my task? Please CC me on responses.


// --------------------------------
CTR1472_AVStream stream; // create MPEG frame source or producer

// 15 fps video, but the pacing is not controlled by RTP
RTPSession video(InetAddress("localhost"), 4500, 1000.0/15.0);
// 20 ms packets of audio
RTPSession audio(InetAddress("localhost"), 4510, 20.0);

InetAddress ad("localhost");
video.setPayloadFormat(StaticPayloadFormat(sptH263)); // cheating on MPEG4 type
video.addDestination(ad);
video.setMaxSendSegmentSize(500); // small size for the benefit of wifi
audio.setPayloadFormat(StaticPayloadFormat(sptPCMU));
audio.addDestination(ad);
audio.setMaxSendSegmentSize(500); // small size for the benefit of wifi

video.startRunning(); // is this necessary??? I don't want to use a run() method
audio.startRunning(); // is this necessary??? I don't want to use a run() method

do {
    char* p = 0;            // pointer to frame data
    int length = 0;         // octets returned in current frame
    AVStreamType stype;     // type of frame returned
    float frametime = 0.0f; // timestamp of generated frame

    stream.readFrame(&p, length, stype, frametime); // blocking read of next frame
    if (length) {
        switch (stype) {
        case MPEG4_Iframe ... MPEG4_Bframe: // gcc case-range extension
            video.putData((int)(frametime * 1000.0f) % 1000000000,
                          (const unsigned char*)p, length);
            break;
        case mulaw_PCM:
            audio.putData((int)(frametime * 1000.0f) % 1000000000,
                          (const unsigned char*)p, length);
            break;
        default:
            break;
        }
    }
} while (!done);





