Opened 17 years ago

Last modified 16 years ago

#706 new defect

Compiled mplayer on OS X 10.3.9, no sound, ao_macosx.c has an error.

Reported by: newsbought@… Owned by: nicolas.plourde@…
Priority: important Component: ao
Version: 1.0rc1 Severity: normal
Keywords: Cc: ulion2002@…
Blocked By: Blocking:
Reproduced by developer: no Analyzed by developer: no

Description

AudioUnitGetProperty was failing when it tried to get kAudioDevicePropertyBufferSize. The errors displayed looked like this:


AO: [macosx] AudioUnitGetProperty returned 561214578 when getting kAudioDevicePropertyBufferSize
AO: [null] 48000Hz 2ch s16be (2 bytes per sample)


This is because kAudioDevicePropertyBufferSize is documented as a text string (according to the reference here):
http://developer.apple.com/documentation/MusicAudio/Reference/CACoreAudioReference/AudioHardware/CompositePage.html

while AudioUnitGetProperty takes an AudioUnitPropertyID (a UInt32), according to here:
http://developer.apple.com/documentation/MusicAudio/Reference/CoreAudio/audio_units/chapter_5_section_3.html
and here:
http://developer.apple.com/documentation/MusicAudio/Reference/CoreAudio/audio_units/chapter_5_section_3.html#//apple_ref/doc/uid/TP30001108-CH207-92320

Since the CompositePage link says that kAudioDevicePropertyBufferSize is slated for deprecation in favor of kAudioDevicePropertyBufferFrameSize, I tried the latter, and it works.

Here's the changed code from ao_macosx.c, starting at line 304:


size = sizeof(UInt32);
err = AudioUnitGetProperty(ao->theOutputUnit, kAudioDevicePropertyBufferFrameSize, kAudioUnitScope_Input, 0, &maxFrames, &size);
if (err) {
    ao_msg(MSGT_AO, MSGL_WARN, "AudioUnitGetProperty returned %d when getting kAudioDevicePropertyBufferFrameSize\n", (int)err);
    return CONTROL_FALSE;
}

/* To check the value I got back */
ao_msg(MSGT_AO, MSGL_WARN, "No Error: AudioUnitGetProperty returned %d %d %d when getting kAudioDevicePropertyBufferSize\n", (int)err, (int)maxFrames, (int)size);


And this is the output, with sound:


AO: [macosx] No Error: AudioUnitGetProperty returned 0 512 4 when getting kAudioDevicePropertyBufferSize
AO: [macosx] 48000Hz 2ch s16be (2 bytes per sample)
Starting playback...


So maxFrames is set to 512. Given my limited knowledge of sound channels and such, I do not know whether that is the intended value, but it seems to have fixed the bug.

Hope it helps. Cheers

Change History (3)

comment:1 by ulion2002@…, 16 years ago

Cc: ulion2002@… added
Owner: changed from Reimar.Doeffinger@… to nicolas.plourde@…

kAudioDevicePropertyBufferSize shouldn't be a string. Which Xcode version did you install?
The value of maxFrames is used as chunk_size and outburst. The best value for it is the requested byte size when theRenderProc is called. Here's a test patch for it; please try it and report back:
http://lists.mplayerhq.hu/pipermail/mplayer-dev-eng/2007-November/054666.html

comment:2 by newsbought@…, 16 years ago

(In reply to comment #1)

> kAudioDevicePropertyBufferSize shouldn't be a string, What XCode version did
> you installed?
> The value of maxFrames is used as chunk_size and outburst. Best value for it is
> the request bytes value when theRenderProc is called. Here's a test patch for
> it, please try it and give response:
> http://lists.mplayerhq.hu/pipermail/mplayer-dev-eng/2007-November/054666.html

Wow, it's been a while since I touched this. When I had 10.3.9 running, I had no valid input to check against, so what I put in Bugzilla was a fix based on my best knowledge at the time.

Since then I've upgraded to 10.4, seen what valid input looks like, and come up with a way that doesn't use a deprecated API. Unfortunately I can no longer test my fix on 10.3.9 because of the upgrade, but it has worked solidly on 10.4 for months now.

I did basically the same thing, except that I queried the audio device's output stream description for the mBytesPerFrame value, and I didn't use kAudioDevicePropertyNominalSampleRate.

Here's my bit of code:
/* Initialize the device and set the characteristics for the input stream,
 * then examine the output stream to get bytes per frame. */
size = sizeof(AudioStreamBasicDescription);
err = AudioUnitGetProperty(ao->theOutputUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &outDesc, &size);
if (err) {
    ao_msg(MSGT_AO, MSGL_WARN, "Unable to get the output format (err=%d)\n", (int)err);
    return CONTROL_FALSE;
}

/* Now query the number of frames the output device has on hold. */
size = sizeof(UInt32);
maxFrames = 0;
err = AudioUnitGetProperty(ao->theOutputUnit, kAudioDevicePropertyBufferFrameSize, kAudioUnitScope_Output, 0, &maxFrames, &size);
if (err) {
    ao_msg(MSGT_AO, MSGL_WARN, "AudioUnitGetProperty returned %d when getting kAudioDevicePropertyBufferFrameSize\n", (int)err);
    return CONTROL_FALSE;
}

/* Combine the two to produce the chunk size. */
ao->chunk_size = maxFrames * outDesc.mBytesPerFrame;

Was this wrong?

comment:3 by ulion2002@…, 16 years ago

You can output the requested byte size in the render function, as my patch does, to see whether you calculated the best chunk_size.

Also, no matter which chunk_size is used, the output will have no problem; it is just a performance parameter, and a wrong value won't make it sound bad.

Note: See TracTickets for help on using tickets.