[PATCH 0/4] ASoC: fsl_asrc: allow selecting arbitrary clocks

Nicolin Chen <nicoleotsuka at gmail.com>
Tue Jul 14 22:15:45 CEST 2020


On Tue, Jul 14, 2020 at 06:20:32PM +0200, Arnaud Ferraris wrote:
> >>> The current ASRC driver hardcodes the input and output clocks used for
> >>> sample rate conversions. In order to allow greater flexibility and to
> >>> cover more use cases, it would be preferable to select the clocks using
> >>> device-tree properties.
> >>
> >> We recently merged a change that auto-selects internal clocks
> >> based on sample rates as the first option -- ideal ratio mode
> >> is now the fallback. Please refer to:
> >> https://git.kernel.org/pub/scm/linux/kernel/git/next/linux-next.git/commit/?h=next-20200702&id=d0250cf4f2abfbea64ed247230f08f5ae23979f0
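
For reference, here is a rough sketch of the kind of selection loop that
change describes: pick the first internal clock whose rate is an integer
multiple of the sample rate with a divider the hardware can program. The
names and the divider limit below are illustrative assumptions, not the
actual fsl_asrc code:

#define MAX_ASRC_DIVIDER	1024	/* assumed hardware limit, not the real value */

/* Return the index of the first usable clock, or -1 to fall back to
 * ideal ratio mode.
 */
static int pick_asrc_clock(const unsigned long *clk_rates, int num_clks,
			   unsigned int sample_rate)
{
	int i;

	for (i = 0; i < num_clks; i++) {
		unsigned long div = clk_rates[i] / sample_rate;

		if (clk_rates[i] % sample_rate == 0 &&
		    div && div <= MAX_ASRC_DIVIDER)
			return i;	/* first matching clock wins */
	}

	return -1;
}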

> I finally got some time to test and debug clock auto-selection on my
> system, and unfortunately couldn't get it to work.
> 
> Here's some background about my use case: the i.MX6 board acts as a
> Bluetooth proxy between a phone and a headset. It has 2 Bluetooth
> modules (one for each connected device), with audio connected to SSI1 &
> SSI2. The audio sample rate can be either 8 or 16 kHz, and bclk can be either
> 512 or 1024 kHz, all depending on the capabilities of the headset and phone.
> In our case we want SSI2 to be the input clock to the ASRC and SSI1 the
> output clock, but there is no way to force that with auto-selection:
> both clocks are multiples of both 8k and 16k, so the algorithm will
> always select the SSI1 clock.
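
To spell out the ambiguity: every combination of bclk (512 or 1024 kHz)
and sample rate (8 or 16 kHz) above divides evenly, so a pure
divisibility check cannot tell SSI1 from SSI2. A quick stand-alone check,
using only the values from your mail:

#include <stdio.h>

int main(void)
{
	const unsigned long bclks[] = { 512000, 1024000 };	/* Hz */
	const unsigned int rates[] = { 8000, 16000 };		/* Hz */
	int i, j;

	for (i = 0; i < 2; i++)
		for (j = 0; j < 2; j++)
			printf("%lu %% %u = %lu, div = %lu\n",
			       bclks[i], rates[j],
			       bclks[i] % rates[j], bclks[i] / rates[j]);
	return 0;
}

All four remainders are zero, with dividers 64, 32, 128 and 64.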

Is anything wrong with the ASRC selecting the SSI1 clock in both cases? The
driver calculates the divisors based on the given clock rate, so
the final internal rate should be the same. If there's a problem,
I feel that's a separate bug.
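
In numbers, taking the rates above purely as an illustration: for an
8 kHz stream, a 512 kHz clock gets a divisor of 64 and a 1024 kHz clock
a divisor of 128, so either choice lands on the same internal rate. A
minimal sketch (internal_rate() is a hypothetical helper, not a driver
function):

#include <assert.h>

/* Rate the ASRC ends up running at once the divisor is applied */
static unsigned long internal_rate(unsigned long clk_rate,
				   unsigned long sample_rate)
{
	unsigned long div = clk_rate / sample_rate;

	return clk_rate / div;
}

int main(void)
{
	/* 512000 / 64 == 1024000 / 128 == 8000 */
	assert(internal_rate(512000, 8000) == internal_rate(1024000, 8000));
	return 0;
}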

