On Tue, Nov 22, 2011 at 06:31:03AM +0800, Raymond Yau wrote:
2011/11/17 Andrew Eikum <aeikum@codeweavers.com>:
On Wed, Nov 16, 2011 at 11:08:37PM +0100, Clemens Ladisch wrote:
On 11/16/2011 10:31 PM, Andrew Eikum wrote:
On Wed, Nov 16, 2011 at 10:20:07PM +0100, Clemens Ladisch wrote:
Andrew Eikum wrote:
+++ b/include/pcm.h
@@ -44,8 +44,20 @@ extern "C" {
 /** PCM generic info container */
 typedef struct _snd_pcm_info snd_pcm_info_t;
-/** PCM hardware configuration space container */
+/** PCM hardware configuration space container
+ *
+ * snd_pcm_hw_params_t is an opaque structure which contains a set of possible
+ * PCM hardware configurations. For example, a given instance might include a
+ * range of buffer sizes, a range of period sizes, and a set of several sample
+ * formats. Some subset of all possible combinations of these sets may be
+ * valid, but not every combination will necessarily be valid.
+ * No validation is done by the various snd_pcm_hw_params_set* functions.
These functions do validate the value that the application is trying to set and adjust all other dependent limits.
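(A minimal sketch of the refinement behavior described here; "default" and 44100 are arbitrary choices, and the exact limits printed will vary by driver:)

#include <stdio.h>
#include <alsa/asoundlib.h>

int main(void)
{
    snd_pcm_t *pcm;
    snd_pcm_hw_params_t *hw;
    snd_pcm_uframes_t min, max;
    unsigned int rate = 44100;
    int err;

    if ((err = snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0)) < 0)
        return 1;
    snd_pcm_hw_params_alloca(&hw);
    snd_pcm_hw_params_any(pcm, hw);

    /* buffer-size limits over the full configuration space */
    snd_pcm_hw_params_get_buffer_size_min(hw, &min);
    snd_pcm_hw_params_get_buffer_size_max(hw, &max);
    printf("before set_rate_near: buffer %lu..%lu\n", min, max);

    /* restricting one parameter should narrow the dependent limits */
    err = snd_pcm_hw_params_set_rate_near(pcm, hw, &rate, NULL);
    printf("set_rate_near: %d (rate now %u)\n", err, rate);

    snd_pcm_hw_params_get_buffer_size_min(hw, &min);
    snd_pcm_hw_params_get_buffer_size_max(hw, &max);
    printf("after set_rate_near: buffer %lu..%lu\n", min, max);

    snd_pcm_close(pcm);
    return 0;
}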
I didn't find that to be the case in my testing, at least between periods, period_size, and buffer_size. I've attached a test program here.
I can reproduce this.
Try running the program with LIBASOUND_DEBUG=1; it appears that there is a bug in the rate plugin. (Normal programs actually set their rate ...)
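(Concretely, "setting the rate" would look something like the following sketch, reusing the pcm and hw_params handles from the attached test program; 44100 is just an example value:)

unsigned int rate = 44100;
int dir = 0;

/* pin the rate before constraining periods and buffer size,
 * as most applications do */
err = snd_pcm_hw_params_set_rate_near(pcm, hw_params, &rate, &dir);
if (err < 0) {
    printf("snd_pcm_hw_params_set_rate_near: %d\n", err);
    return 1;
}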
This doesn't change the output in any way. I checked the Arch Linux alsa-lib build script[1] and it doesn't seem to disable debugging in any obvious way. Do I have to explicitly enable debugging output and rebuild?
$ LIBASOUND_DEBUG=1 ./alsa_period_count
min_buffer_frames: 80
set max buffer size: 80
snd_pcm_hw_params: -12
$
[1] http://projects.archlinux.org/svntogit/packages.git/tree/trunk/PKGBUILD?h=pa...
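(As an aside, -12 is -ENOMEM; having the test program print snd_strerror(err) alongside the raw code would make such failures more readable:)

printf("snd_pcm_hw_params: %d (%s)\n", err, snd_strerror(err));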
There is a bug in your program: it sets buffer_size_max to buffer_size_min.
err = snd_pcm_hw_params_set_periods(pcm, hw_params, 10, 0);
if (err < 0) {
    printf("snd_pcm_hw_params_set_periods: %d\n", err);
    return 1;
}
err = snd_pcm_hw_params_set_period_size(pcm, hw_params, 1024, 0);
if (err < 0) {
    printf("snd_pcm_hw_params_set_period_size: %d\n", err);
    return 1;
}
The above already sets the buffer size to 10240 frames (10 periods × 1024 frames), but you then set buffer_size_max to buffer_size_min (80):
err = snd_pcm_hw_params_get_buffer_size_min(hw_params, &buffer_frames);
if (err < 0) {
    printf("snd_pcm_hw_params_get_buffer_size_min: %d\n", err);
    return 1;
}
printf("min_buffer_frames: %lu\n", buffer_frames);
err = snd_pcm_hw_params_set_buffer_size_max(pcm, hw_params, &buffer_frames);
if (err < 0) {
    printf("snd_pcm_hw_params_set_buffer_size_max: %d\n", err);
    return 1;
}
printf("set max buffer size: %lu\n", buffer_frames);
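(For comparison, a version of that last step that stays consistent with the earlier constraints might request the matching size instead of clamping the maximum down to the minimum; a sketch reusing the program's variables, though, as the reply below notes, the mismatch in the test was deliberate:)

/* 10 periods of 1024 frames, matching the earlier set_periods
 * and set_period_size calls */
buffer_frames = 10 * 1024;
err = snd_pcm_hw_params_set_buffer_size_near(pcm, hw_params, &buffer_frames);
if (err < 0) {
    printf("snd_pcm_hw_params_set_buffer_size_near: %d\n", err);
    return 1;
}
printf("buffer size: %lu\n", buffer_frames);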
Yes, that's exactly my point. I would expect set_buffer_size_max() to fail, since it's incompatible with the parameters set earlier. Instead, it succeeds.
Andrew