When I look at the HDAudio spec, I see that BDL entries need to be aligned on a 128-byte boundary, but the buffer length itself can be any value. When I look at the HDAudio driver (azx_pcm_open in hda_codec.c), I see that we enforce a 128-byte granularity on the period/buffer sizes; this was added in a patch that fixed alignment issues (5f1545bc).
Is there any technical reason why the driver is more demanding than the spec, or is this a confusion between alignment and buffer size? I can understand that rounding to a multiple of 128 bytes is more efficient in terms of data access, but it also prevents applications from getting an interrupt when they want it (e.g. it's impossible to use 20 ms periods with a 44.1 kHz sampling rate; you'll get 19.59 or 20.32 ms instead).
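To make the arithmetic concrete, here is a small sketch of the rounding effect. It assumes 16-bit stereo (4 bytes per frame), which is my assumption for illustration and not something the spec or driver mandates; the 128-byte figure is the granularity enforced in azx_pcm_open.

```python
# Illustrative arithmetic (not driver code): why a 128-byte granularity
# on the period size rules out an exact 20 ms period at 44.1 kHz.
# Assumption: S16_LE stereo, i.e. 4 bytes per frame.

RATE = 44100          # frames per second
BYTES_PER_FRAME = 4   # 16-bit stereo (assumed format)
GRANULARITY = 128     # byte granularity enforced by the driver

# A 20 ms period at 44.1 kHz is 882 frames = 3528 bytes,
# which is not a multiple of 128.
wanted_bytes = 20 * RATE * BYTES_PER_FRAME // 1000

# Nearest period sizes the driver will actually allow:
lower = (wanted_bytes // GRANULARITY) * GRANULARITY   # round down
upper = lower + GRANULARITY                           # round up

def to_ms(nbytes):
    """Convert a byte count back to a period duration in milliseconds."""
    return nbytes / BYTES_PER_FRAME / RATE * 1000

print(wanted_bytes, lower, upper)
print(round(to_ms(lower), 2), round(to_ms(upper), 2))
```

Running this shows the two neighboring legal period sizes around the 3528-byte target, roughly 19.59 ms and 20.32 ms, so an exact 20 ms wakeup interval is unreachable at that rate.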
Thanks, -Pierre