[PATCH v2] [RFC] selftests: alsa - add PCM test

Mark Brown broonie at kernel.org
Tue Nov 8 15:46:19 CET 2022


On Tue, Nov 08, 2022 at 12:59:14PM +0100, Jaroslav Kysela wrote:
> This initial code does simple sample transfer tests. By default,
> all PCM devices are detected and tested with short and long
> buffering parameters for 4 seconds. If the sample transfer timing
> is not within a +-100ms boundary, the test fails. Only the
> interleaved buffering scheme is supported in this version.

Oh, thanks for picking this up - something like this has been on my mind
for ages!  This should probably be copied to Shuah and the kselftest
list as well; I've added them.  This looks basically good to me.  I've
got a bunch of comments below, but I'm not sure any of them (except
possibly the one about not putting values in the configuration file by
default) should block getting this merged, so:

Reviewed-by: Mark Brown <broonie at kernel.org>

> The configuration may be modified with configuration files.
> A specific hardware configuration is detected and activated
> using sysfs regex matching. This allows the DMI strings
> (the /sys/class/dmi/id/* tree) or any other system parameters
> exposed in sysfs to be used for matching in CI automation.

> The configuration file may also specify the PCM device list to detect
> the missing PCM devices.

>  create mode 100644 tools/testing/selftests/alsa/alsa-local.h
>  create mode 100644 tools/testing/selftests/alsa/conf.c
>  create mode 100644 tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf
>  create mode 100644 tools/testing/selftests/alsa/pcm-test.c

This is a bit unusual for kselftest and might create a bit of churn,
but it does seem sensible and reasonable to me; it's on the edge of what
kselftest usually covers but seems close enough in scope.  I worry
a bit about ending up needing to add a config fragment as a result, but
perhaps we can get away without one.

> index 000000000000..0a83f35d43eb
> --- /dev/null
> +++ b/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf

> +       pcm.0.0 {
> +               PLAYBACK {
> +                       test.time1 {
> +                               access RW_INTERLEAVED   # can be omitted - default
> +                               format S16_LE           # can be omitted - default
> +                               rate 48000              # can be omitted - default
> +                               channels 2              # can be omitted - default
> +                               period_size 512
> +                               buffer_size 4096

I think it'd be better to leave these commented by default, especially
if/once we improve the enumeration.  That way the coverage will default
to whatever the tool does by default on the system (including any
checking of constraints for example).  I guess we might want to add a
way of saying "here's what I expect the constraints to be" but that's
very much future work.
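To make the suggestion concrete, a per-machine config with the defaults left commented might look like the fragment below (purely illustrative, same alsa-lib config syntax as the quoted file):

```
pcm.0.0 {
	PLAYBACK {
		test.time1 {
			#access RW_INTERLEAVED		# default
			#format S16_LE			# default
			#rate 48000			# default
			#channels 2			# default
			period_size 512
			buffer_size 4096
		}
	}
}
```

That way only the values a machine actually needs to pin down are active, and everything else falls through to whatever the tool negotiates with the hardware.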

> +#ifdef SND_LIB_VER
> +#if SND_LIB_VERSION >= SND_LIB_VER(1, 2, 6)
> +#define LIB_HAS_LOAD_STRING
> +#endif
> +#endif
> +
> +#ifndef LIB_HAS_LOAD_STRING
> +static int snd_config_load_string(snd_config_t **config, const char *s,
> +				  size_t size)
> +{

This is also in mixer-test; we should pull it into a helper library too.
That's something that could be done separately/incrementally.

> +	for (i = 0; i < 4; i++) {

> +
> +	snd_pcm_drain(handle);
> +	ms = timestamp_diff_ms(&tstamp);
> +	if (ms < 3900 || ms > 4100) {

It feels like the runtime might be usefully parameterised here - there's
a tradeoff between detecting inaccurate clocks and total runtime that
people might want to make.
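The parameterisation might look something like this sketch - the function name and signature are my invention, not from the patch, but it replaces the hard-coded 3900/4100 window with a runtime and tolerance the config could set:

```c
#include <assert.h>
#include <stdbool.h>

/*
 * Hypothetical helper: check an elapsed transfer time against a
 * configurable expected runtime and tolerance (both in milliseconds),
 * instead of the hard-coded 4s +-100ms window.
 */
static bool timing_ok(long elapsed_ms, long runtime_ms, long tolerance_ms)
{
	return elapsed_ms >= runtime_ms - tolerance_ms &&
	       elapsed_ms <= runtime_ms + tolerance_ms;
}
```

A machine with a known-jittery clock could then widen `tolerance_ms` in its config file rather than being stuck with a spurious failure.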

> +	ksft_set_plan(num_missing + num_pcms * TESTS_PER_PCM);

> +	for (pcm = pcm_missing; pcm != NULL; pcm = pcm->next) {
> +		ksft_test_result(false, "test.missing.%d.%d.%d.%s\n",
> +				 pcm->card, pcm->device, pcm->subdevice,
> +				 snd_pcm_stream_name(pcm->stream));
> +	}

We don't seem to report a successful test.missing anywhere like
find_pcms(), so if we ever hit a test.missing it'll look like a new
test - old test runs won't have logged the failure.  That can change how
people look at any failures that crop up: "it's new and never worked" is
different from "this used to work", and people are likely to just be
running kselftest rather than specifically knowing this test.  It'd be
better if we counted the cards in the config and used that for our
expected number of test.missings, logging the cards that we do find here
as well.
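In other words, the plan would be sized from the config rather than from what was found.  A rough sketch of the idea, where `struct pcm_cfg` is a stand-in for the patch's real config structures and the names are illustrative:

```c
#include <assert.h>
#include <stddef.h>

/* Stand-in for the config-derived PCM list; not the patch's real type. */
struct pcm_cfg {
	int card, device, subdevice;
	struct pcm_cfg *next;
};

/*
 * Count every PCM declared in the configuration, so ksft_set_plan() can
 * reserve one test.missing slot per declared device - present devices
 * then report test.missing as a pass, absent ones as a fail.
 */
static int count_cfg_pcms(const struct pcm_cfg *list)
{
	int n = 0;

	for (; list; list = list->next)
		n++;
	return n;
}
```

With that, the test count is stable across runs, and a device disappearing flips an existing test from pass to fail instead of growing the plan.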

> +	for (pcm = pcm_list; pcm != NULL; pcm = pcm->next) {
> +		test_pcm_time1(pcm, "test.time1", "S16_LE", 48000, 2, 512, 4096);
> +		test_pcm_time1(pcm, "test.time2", "S16_LE", 48000, 2, 24000, 192000);
> +	}

It does feel like, especially in the case where no configuration is
specified, we should be enumerating what the card can do - both
potentially doing more tests (though there's obviously an execution time
tradeoff with going overboard there) and skipping configurations that
the card never claimed to support in the first place.  In particular I'm
expecting we'll see some cards that only do either 44.1kHz or 48kHz and
will get spurious fails by default, and I'd like to see coverage of mono
playback on cards that claim to support it because I suspect there's a
bunch of them that don't actually do the right thing.
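The skip half of that could be as simple as checking each candidate configuration against the card's claimed constraints before running it.  In the real code that query would be snd_pcm_hw_params_test_rate()/snd_pcm_hw_params_test_channels(); in this sketch a plain table of claimed rates stands in for the hardware query, and the function name is mine:

```c
#include <assert.h>
#include <stdbool.h>

/*
 * Illustrative only: decide whether a candidate rate is among the rates
 * the card claims to support.  A real implementation would ask the
 * driver via snd_pcm_hw_params_test_rate() rather than a static table.
 */
static bool rate_supported(const unsigned int *rates, int n,
			   unsigned int rate)
{
	for (int i = 0; i < n; i++)
		if (rates[i] == rate)
			return true;
	return false;
}
```

A 44.1kHz-only card would then get ksft_test_result_skip() for the 48kHz configurations instead of a spurious fail, and conversely a card advertising mono would get a mono test it can't quietly dodge.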

Like I say most of this could be done incrementally if we decide it
needs to get done at all though, we shouldn't let perfect be the enemy
of good.

