Re: [Openh323-devel] Video size in OPAL


Hannes Friederich wrote:
> On 7. Aug 2007, at 03:52, Craig Southeren wrote:
>> Hannes Friederich wrote:
>>> Hi,
>>> One problem is that currently, the media options aren't propagated
>>> to the other endpoint, which could inspect them to signal the
>>> correct media format description to the remote endpoint.
>> I'm not sure what this means.
> It may have changed in the meantime, but I've seen the following
> problem: XMeeting is organized like a gateway. Instead of specifying
> YUV420P as a possible media format, it specifies e.g. H.261 directly.
> So the OPAL transcoding system isn't used (for video at least). If I
> use non-default media format parameters (e.g. a lower bitrate than
> the one specified in the media format constructor), this information
> isn't propagated and the other connection always sees the default
> parameters.
> There are good reasons for this behaviour (I don't recall all the
> details, since it's been a while since I worked on that), so another
> way is needed to correctly propagate media format options. Pretty
> much the same applies here, as I think the signalling code (be it
> H.323 or SDP) needs to know which frame sizes, bitrates etc. to use.
> Using H.263+ or H.264, it is possible to send non-standard frame
> sizes, and in the case of H.263+ the frame size possibilities should
> be signalled as well.

I've not looked at XMeeting, so I don't know how you have implemented 
your endpoint.

The easiest way to implement what you describe would be to ensure that 
your endpoint class (the one that represents the computer, not the 
H.323 or SIP ones) is populated with H.263 and H.261 media formats, 
rather than just YUV420P as the PCSS endpoint does. If you do this, 
OPAL will connect the incoming H.263 stream directly to your endpoint 
without any transcoder.
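To illustrate the idea, here is a minimal sketch using simplified stand-in types (the `MediaFormat`/`MediaFormatList` classes and function names below are illustrative only, not the real OPAL API): an endpoint that advertises the wire codecs directly, as described above, never needs a transcoder inserted for those codecs, while a PCSS-style endpoint that only advertises YUV420P always does.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Simplified stand-ins for OPAL types. The real classes are
// OpalMediaFormat / OpalMediaFormatList and the endpoint's virtual
// GetMediaFormats(); everything here is illustrative only.
struct MediaFormat {
    std::string name;
};
using MediaFormatList = std::vector<MediaFormat>;

// A PCSS-style endpoint advertises only the raw intermediate format,
// so OPAL must insert a transcoder between it and the network codec.
MediaFormatList PcssStyleFormats() {
    return { { "YUV420P" } };
}

// A gateway-style endpoint (like XMeeting, as described above)
// advertises the wire codecs directly, so OPAL connects the incoming
// stream to the endpoint without a transcoder.
MediaFormatList GatewayStyleFormats() {
    return { { "H.261" }, { "H.263" } };
}

// A transcoder is needed only when the endpoint does not support the
// incoming wire format natively.
bool NeedsTranscoder(const MediaFormatList & endpointFormats,
                     const std::string & wireFormat) {
    for (const MediaFormat & fmt : endpointFormats) {
        if (fmt.name == wireFormat)
            return false;
    }
    return true;
}
```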

Regardless, the function OpalMediaStream::UpdateMediaFormat is always 
called just before the media patch is started. This will allow the media 
stream (which is connected to the destination device) to access the 
final negotiated media format parameters for the connection.

I'm using exactly this technique in a current OPAL application.
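A sketch of that hook, again with simplified stand-in types (the option names and the `DeviceVideoStream` class are hypothetical; only the `UpdateMediaFormat` hook itself mirrors OPAL's): the device-facing stream is handed the final negotiated format just before the patch starts, and reconfigures its device from it.

```cpp
#include <cassert>
#include <map>
#include <string>

// Simplified stand-ins. The real hook is the virtual
// OpalMediaStream::UpdateMediaFormat, called just before the media
// patch is started; the option names below are illustrative.
struct MediaFormat {
    std::map<std::string, int> options;
    int GetOption(const std::string & key, int dflt) const {
        auto it = options.find(key);
        return it != options.end() ? it->second : dflt;
    }
};

struct MediaStream {
    virtual ~MediaStream() = default;
    virtual bool UpdateMediaFormat(const MediaFormat & fmt) = 0;
};

// A device-facing stream that picks up the final negotiated frame
// size when the patch is set up.
struct DeviceVideoStream : MediaStream {
    int deviceWidth = 0, deviceHeight = 0;
    bool UpdateMediaFormat(const MediaFormat & fmt) override {
        deviceWidth  = fmt.GetOption("Frame Width", 352);   // CIF defaults
        deviceHeight = fmt.GetOption("Frame Height", 288);
        return true;   // a real stream would reopen its device here
    }
};
```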

>> When a connection is made (say) to a H.323 endpoint, the H.323
>> connection negotiates the parameters that are used on that side of the
>> call. The OpalMediaFormat instances on the media streams for that
>> connection will contain the negotiated parameters.
> Yes, but which are the parameters to start the negotiation with? If
> you want a video codec plugin to be easy to use, it should be
> possible to specify beforehand which frame size / bitrate you'd
> like, and the plugin should update its media formats accordingly.
> Otherwise everyone who wants to send custom picture sizes needs to
> either use the plugin manager API directly or write their own plugin.

You can achieve the same thing by changing the options in the global 
media format list. This will work OK for an application that only makes 
one call at a time, but will cause problems if you have multiple 
simultaneous calls. For that kind of application, you need to make sure 
your OpalConnection::GetMediaFormats descendant returns a media format 
list with the appropriate options set.
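The difference can be sketched as follows, once more with stand-in types (the `MediaFormat` struct, option name and function names are hypothetical; the pattern of copying the master list per connection is what an OpalConnection::GetMediaFormats override would do): the per-call function returns a private copy with that call's options applied, leaving the global list untouched for other calls.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Simplified stand-in for OpalMediaFormat; option names illustrative.
struct MediaFormat {
    std::string name;
    std::map<std::string, int> options;
};
using MediaFormatList = std::vector<MediaFormat>;

// Shared "master" list: mutating this affects every call, which is
// only safe for a single-call application.
MediaFormatList g_masterFormats =
    { { "H.263", { { "Max Bit Rate", 384000 } } } };

// Per-connection hook (modelled on a GetMediaFormats override):
// return a private copy with this call's options applied, leaving the
// master list untouched.
MediaFormatList GetMediaFormatsForCall(int maxBitRate) {
    MediaFormatList formats = g_masterFormats;   // copy, not a reference
    for (MediaFormat & fmt : formats)
        fmt.options["Max Bit Rate"] = maxBitRate;
    return formats;
}
```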

>> The "other" side of the OPAL call should not need to see these
>> parameters, as it is isolated by the conversion to/from YUV420P (or
>> whatever intermediate format is used).
>> If there is a mismatch between the video formats on the two sides  
>> of the
>> call, OPAL will insert code to resize or recode the video frames as
>> necessary. It's not the responsibility of the codecs to "peek" at the
>> other side and do strange things on that basis.
>> One good reason for this is that an OpalCall was always designed as a
>> "one to many" relationship, with one "source" stream feeding multiple
>> "sink" streams. How would a source stream know what sink stream to
>> optimise itself for?
> I completely agree that it is not always possible to know which
> picture format to use. But it is unnecessary overhead to let the
> video be captured at, say, 640x480 just to be rescaled to 352x288
> (CIF). If you can get CIF directly from the camera driver, chances
> are high that the code is faster, even if the camera driver has to
> do some internal rescaling itself. But I have to admit that I never
> really used the OPAL transcoding system / video input, as XMeeting
> uses a completely different approach in this area.

Of course it is more efficient if the media stream can support the 
native parameters :) That's why OPAL allows it to be done.


  Craig Southeren          Post Increment – VoIP Consulting and Software

  Phone:  +61 243654666      ICQ: #86852844
  Fax:    +61 243656905      MSN: craig_southeren@xxxxxxxxxxx
  Mobile: +61 417231046      Jabber: craigs@xxxxxxxxxx

  "It takes a man to suffer ignorance and smile.
   Be yourself, no matter what they say."   Sting
