[alsa-devel] Rationale for BYTCR defaults in the kernel's bytcr_rt5640 machine driver ?

Hans de Goede hdegoede at redhat.com
Tue May 1 21:21:25 CEST 2018


Hi Pierre-Louis,

I'm finally wrapping up my rt5640 jack-detect work; as
such I'm currently testing speaker + headphones + internal mic +
headset-mic + jack-detect functionality on the 10 different
x86 devices with an rt5640 codec which I've gathered.

Two of them stand out: they use the BYTCR SoC, but lack
the ACPI table entry indicating whether SSP0 should be
connected to AIF1 or AIF2. Currently the driver defaults
to AIF2 in this case.
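
For reference, the fallback in snd_byt_rt5640_mc_probe() has
roughly the following shape (simplified and from memory, not a
verbatim quote of the driver):

        /* Look for the ACPI "CHAN" package describing the SSP0 routing */
        pkg_found = snd_soc_acpi_find_package_from_hid(mach->id, &pkg_ctx);
        if (pkg_found) {
                if (chan_package.aif_value == 1)
                        byt_rt5640_quirk |= BYT_RT5640_SSP0_AIF1;
                else if (chan_package.aif_value == 2)
                        byt_rt5640_quirk |= BYT_RT5640_SSP0_AIF2;
                else
                        pkg_found = false; /* unexpected value, ignore it */
        }
        if (!pkg_found) {
                /* no BIOS indication, currently assume SSP0 <-> AIF2 */
                byt_rt5640_quirk |= BYT_RT5640_SSP0_AIF2;
        }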

As the somewhat wildcard-ish DMI quirk for boards whose
sys_vendor is "Insyde" (which matches a lot of generic
designs) already suggests, AIF1 seems to be the better
default:

         {
                 .callback = byt_rt5640_quirk_cb,
                 .matches = {
                         DMI_MATCH(DMI_SYS_VENDOR, "Insyde"),
                 },
                 .driver_data = (void *)(BYT_RT5640_IN3_MAP |
                                         BYT_RT5640_MCLK_EN |
                                         BYT_RT5640_SSP0_AIF1),
         },

Both models without the ACPI table which I have here, an
HP Pavilion X2 and a Toshiba Click Mini L9W-B, need a
quirk to use AIF1.
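
A quirk for these would look something along these lines (the
DMI match strings and flags below are placeholders; the real
entries would of course use the values read from each device):

         {       /* HP Pavilion X2, hypothetical match strings */
                 .callback = byt_rt5640_quirk_cb,
                 .matches = {
                         DMI_MATCH(DMI_SYS_VENDOR, "Hewlett-Packard"),
                         DMI_MATCH(DMI_PRODUCT_NAME, "HP Pavilion x2"),
                 },
                 .driver_data = (void *)(BYT_RT5640_MCLK_EN |
                                         BYT_RT5640_SSP0_AIF1),
         },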

So I was wondering if there is a specific rationale for the
AIF2 default, and whether it would not be better to change the
default to AIF1 (which would unfortunately bring a chance of
regressions)?
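
Concretely, the change I have in mind is just flipping the
fallback in the sketch above:

         if (!pkg_found) {
-                /* no BIOS indication, assume SSP0 <-> AIF2 */
-                byt_rt5640_quirk |= BYT_RT5640_SSP0_AIF2;
+                /* no BIOS indication, assume SSP0 <-> AIF1 */
+                byt_rt5640_quirk |= BYT_RT5640_SSP0_AIF1;
         }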

Regards,

Hans



