On 8/14/23 05:51, Cezary Rojewski wrote:
On 2023-08-11 8:21 PM, Pierre-Louis Bossart wrote:
On 8/11/23 11:48, Cezary Rojewski wrote:
To provide an option for selecting a different bits-per-sample than just the maximum one, use the new format calculation mechanism.
Can you remind me what the issue is in selecting the maximum resolution?
Isn't it a good thing to select the maximum resolution when possible so as to provide more headroom and prevent clipping? Usually we try to make sure the resolution becomes limited when we reach the analog parts. I am not sure I see a compelling reason to reduce the resolution on the host side.
Maximum resolution is still the default; nothing changes there. Moreover, the new subformat options are not added to any driver apart from the avs-driver.
What's so special about this driver that it needs more capabilities for a standard interface?
The patchset provides _an option_ to change bits-per-sample. Right now there's no option to do that, so the hardware - here, SDxFMT and PPLCxFMT - is not tested. We have enough recommended flows already; frankly, there is one for SDxFMT for the APL-based platforms (or BXT-based, if one prefers that naming) present within snd_hda_intel and the DSP drivers alike.
I am also not sure what this patch actually changes, given that the firmware actually converts everything to the full 32-bit resolution.
The issue does not concern firmware, so we leave firmware out of the discussion - this is purely about hardware capabilities.
I don't see how you can leave firmware aside: if the hardware configuration is selected to be based on 24 bits and the firmware generates 32, there's clearly a mismatch.
If this is saying that we are adding an option that will never be used, then why bother?
Lost in space on this one.
I must be missing something on the problem statement.
Signed-off-by: Cezary Rojewski <cezary.rojewski@intel.com>
 sound/soc/sof/intel/hda-dai-ops.c | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)
diff --git a/sound/soc/sof/intel/hda-dai-ops.c b/sound/soc/sof/intel/hda-dai-ops.c
index f3513796c189..00703999e91b 100644
--- a/sound/soc/sof/intel/hda-dai-ops.c
+++ b/sound/soc/sof/intel/hda-dai-ops.c
@@ -194,14 +194,15 @@ static unsigned int hda_calc_stream_format(struct snd_sof_dev *sdev,
 	struct snd_soc_dai *codec_dai = asoc_rtd_to_codec(rtd, 0);
 	unsigned int link_bps;
 	unsigned int format_val;
+	unsigned int bps;
 
 	if (substream->stream == SNDRV_PCM_STREAM_PLAYBACK)
 		link_bps = codec_dai->driver->playback.sig_bits;
 	else
 		link_bps = codec_dai->driver->capture.sig_bits;
 
+	bps = snd_hdac_stream_format_bps(params_format(params), SNDRV_PCM_SUBFORMAT_STD, link_bps);
I can't say I fully understand the difference between 'bps' and 'link_bps'. The naming is far from self-explanatory just by looking at the code.
There's none. I just didn't reuse the 'link_bps' variable. I can reuse it if you like.
-	format_val = snd_hdac_calc_stream_format(params_rate(params), params_channels(params),
-						 params_format(params), link_bps, 0);
+	format_val = snd_hdac_stream_format(params_channels(params), bps, params_rate(params));
 
 	dev_dbg(sdev->dev, "format_val=%#x, rate=%d, ch=%d, format=%d\n", format_val,
 		params_rate(params), params_channels(params), params_format(params));
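[Editor's note] For readers skimming the thread, a minimal sketch of the call mapping this hunk introduces, using only the helpers and signatures visible in the quoted patch; the helper names (e.g. snd_hdac_stream_format_bps) are as they appear in this revision and may differ in later ones:

	/* Old: a single helper computes the stream format value directly. */
	format_val = snd_hdac_calc_stream_format(params_rate(params), params_channels(params),
						 params_format(params), link_bps, 0);

	/*
	 * New, step 1: resolve the effective bits-per-sample from the PCM
	 * format and subformat, limited by the codec DAI's sig_bits
	 * (link_bps). Per the discussion above, maximum resolution remains
	 * the default with SNDRV_PCM_SUBFORMAT_STD.
	 */
	bps = snd_hdac_stream_format_bps(params_format(params), SNDRV_PCM_SUBFORMAT_STD, link_bps);

	/* New, step 2: build the stream format from channels, bps and rate. */
	format_val = snd_hdac_stream_format(params_channels(params), bps, params_rate(params));

The split separates the bits-per-sample decision (where any subformat selection would apply) from the channel/rate packing of the stream format value, which is the point of contention in the review above.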