[alsa-devel] Digital distortion when converting Mono to Stereo?
While testing with our ASoC enabled embedded platform, we observed a strange phenomenon with mono audio files. We used external audio test equipment for SNR/THD analysis and a mono WAV file (16-bit, 44.1 kHz) which plays a simple sweep from ~10 Hz to ~20 kHz.
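For reference, a small generator for such a test file. This is a sketch only; the file name, duration and exact sweep bounds are assumptions, not our exact test file:

    /* sweep.c - 16-bit mono 44.1 kHz WAV with a linear 10 Hz..20 kHz
     * sweep. Compile with: cc -o sweep sweep.c -lm */
    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    #define RATE    44100
    #define SECONDS 20
    #define F_START 10.0
    #define F_END   20000.0

    static void put_le16(FILE *f, uint16_t v)
    {
        fputc(v & 0xff, f);
        fputc((v >> 8) & 0xff, f);
    }

    static void put_le32(FILE *f, uint32_t v)
    {
        put_le16(f, v & 0xffff);
        put_le16(f, v >> 16);
    }

    int main(void)
    {
        const uint32_t nsamples = RATE * SECONDS;
        const uint32_t datalen  = nsamples * 2;   /* 16-bit mono */
        FILE *f = fopen("sweep-mono.wav", "wb");
        double phase = 0.0;
        uint32_t i;

        if (!f)
            return 1;

        /* canonical 44-byte PCM WAV header */
        fwrite("RIFF", 1, 4, f); put_le32(f, 36 + datalen);
        fwrite("WAVE", 1, 4, f);
        fwrite("fmt ", 1, 4, f); put_le32(f, 16);
        put_le16(f, 1);          /* PCM */
        put_le16(f, 1);          /* channels: mono */
        put_le32(f, RATE);
        put_le32(f, RATE * 2);   /* byte rate */
        put_le16(f, 2);          /* block align */
        put_le16(f, 16);         /* bits per sample */
        fwrite("data", 1, 4, f); put_le32(f, datalen);

        /* linear chirp via phase accumulation: the instantaneous
         * frequency runs from F_START to F_END over the whole file */
        for (i = 0; i < nsamples; i++) {
            double t    = (double)i / RATE;
            double freq = F_START + (F_END - F_START) * t / SECONDS;

            phase += 2.0 * M_PI * freq / RATE;
            put_le16(f, (uint16_t)(int16_t)(32000.0 * sin(phase)));
        }

        fclose(f);
        return 0;
    }

Playing the result with aplay through the same path should reproduce the measurement.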
As the hardware output is stereo, an implicit mono-to-stereo conversion takes place (I don't know which part of ALSA actually does it, though).
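For what it's worth, a clean expansion would simply duplicate each mono sample into both slots of an interleaved stereo frame. A minimal sketch of the expected behaviour (presumably alsa-lib's plug layer does this when aplay opens a plug device, but that's an assumption on my part; this is not alsa-lib's actual code):

    #include <stddef.h>
    #include <stdint.h>

    /* Expected mono -> stereo expansion: per-frame duplication of
     * each mono sample into both channels of an interleaved buffer. */
    static void mono_to_stereo(const int16_t *mono, int16_t *stereo,
                               size_t frames)
    {
        size_t i;

        for (i = 0; i < frames; i++) {
            stereo[2 * i]     = mono[i];   /* left  */
            stereo[2 * i + 1] = mono[i];   /* right */
        }
    }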
Strangely, the measurement showed a phantom frequency which looked as if the sweeping test signal was somehow 'mirrored' at 11.025 kHz (fs/4). The phantom became stronger the lower its frequency got. So during the sweep, the two peaks approached each other, met at ~11 kHz, and by the time the 'real' peak approached 20 kHz, the phantom sat at something like 1 kHz at a level comparable to the 'real' signal. All that was much stronger on one channel and barely there, but still measurable, on the other.
We first thought about an analog filter issue, then suspected the digital input stream, but eventually it turned out that with the same test file in stereo, all these effects went away. So there must be something in the chain from aplay to the actual PXA SSP ports which adds considerable digital distortion to one channel when converting from mono to stereo.
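One mechanism that would produce exactly such a mirror at fs/4 (an assumption on my part; I haven't read the conversion code): if the expansion walks the mono buffer with a stereo stride instead of duplicating, each output channel receives the mono stream decimated by 2 with no low-pass filter. The Nyquist frequency of the halved rate is 44100/4 = 11025 Hz, so any sweep component at f > 11025 Hz folds down to 22050 Hz - f; with the real peak near 20 kHz, the phantom would sit near 2 kHz, in the same ballpark as what we measured. A purely hypothetical sketch:

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical bug pattern, for illustration only: striding
     * through the mono data instead of duplicating each sample.
     * Both channels then carry the mono signal decimated by 2
     * without an anti-aliasing filter, which folds everything
     * above fs/4 back down. */
    static void mono_to_stereo_buggy(const int16_t *mono,
                                     int16_t *stereo, size_t frames)
    {
        size_t i;

        for (i = 0; i < frames; i++) {
            stereo[2 * i]     = mono[2 * i];     /* left: even samples only */
            stereo[2 * i + 1] = mono[2 * i + 1]; /* right: odd samples only */
        }
    }

This sketch doesn't explain why one channel was hit much harder than the other, though; perhaps the real path only mangles one of them.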
I didn't dig any deeper, but still wanted to let you know. Can anyone think of a possible reason for that? It doesn't harm us in our application, as we will never play mono audio anyway, but I guess it could affect others.
Thanks, Daniel