On 04/30/2012 12:04 PM, Mark Brown wrote:
On Mon, Apr 30, 2012 at 10:26:30AM +0200, Ola Lilja wrote:
On 04/27/2012 01:15 PM, Mark Brown wrote:
No, I really don't see any value at all in it. The machine drivers aren't actually sharing anything visible, and the effect of what you're doing is to make the selection of machine a compile-time one instead of a runtime one.
No, that is a misunderstanding. We are just dividing the machine-driver file into one main file that calls functions from the other ones. It does not affect the framework in any way; we just want to split the code in a way we find useful. One file calling functions from another one; I don't see how that can be a problem.
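[Editorial sketch, for illustration only: roughly the kind of split being described, with hypothetical file and function names; this is not the actual patch contents.]

/* ux500_ab8500.c: codec-specific machine code for the AB8500 */
#include <sound/soc.h>

int ux500_ab8500_machine_init(struct snd_soc_pcm_runtime *rtd)
{
	/* AB8500-specific setup: DAPM routes, controls, clocking, ... */
	return 0;
}

/* u8500.c: the "main" machine file, calling into the file above */
#include "ux500_ab8500.h"

static struct snd_soc_dai_link u8500_dai_links[] = {
	{
		.name		= "ab8500_0",
		.stream_name	= "ab8500_0",
		.init		= ux500_ab8500_machine_init,
		/* cpu_dai_name, codec_name, etc. omitted for brevity */
	},
};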
The code I'm referring to is this:
| +#ifdef CONFIG_SND_SOC_UX500_AB8500
| +#include <ux500_ab8500.h>
| +#endif
| +
| +/* Define the whole U8500 soundcard, linking platform to the codec-drivers */
| +struct snd_soc_dai_link u8500_dai_links[] = {
| +	#ifdef CONFIG_SND_SOC_UX500_AB8500
| +	{
| +		.name = "ab8500_0",
| +		.stream_name = "ab8500_0",
which is definitely compile-time. It's not the factoring out of the code that's the issue, it's the way it's been done. Library code like Tegra uses isn't a problem, but this sort of arrangement does cause problems.
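[Editorial sketch, again with made-up names, of the library-style arrangement referred to above: shared code lives in a plain helper library with no config conditionals, and each machine driver stays a self-contained module registering its own card, so the choice of card remains a runtime one, decided by which platform device actually exists.]

/* ux500_asoc_utils.c: shared helpers, no #ifdefs */
#include <linux/export.h>
#include <sound/soc.h>

int ux500_asoc_enable_mclk(struct device *dev)
{
	/* clock handling common to all Ux500 machine drivers */
	return 0;
}
EXPORT_SYMBOL_GPL(ux500_asoc_enable_mclk);

/* ux500_ab8500.c: one complete machine driver among several */
static struct snd_soc_dai_link ux500_ab8500_dai_links[] = {
	{
		.name		= "ab8500_0",
		.stream_name	= "ab8500_0",
		/* DAI/codec/platform names omitted */
	},
};

static struct snd_soc_card ux500_ab8500_card = {
	.name		= "ux500-ab8500",
	.dai_link	= ux500_ab8500_dai_links,
	.num_links	= ARRAY_SIZE(ux500_ab8500_dai_links),
};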
OK, the idea behind this was to be able to activate/deactivate the individual codec drivers, since we have several separate codecs on our Ux500 platform (note that this patch set does not contain patches for the other two codec drivers). Since we already know at compile time which of these three codecs are present, we did it this way, so that they can be added separately in menuconfig. How could we solve this? All three codec drivers have dependencies on other options that must be activated in menuconfig.
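[Editorial sketch of one common kernel idiom that might fit here, with hypothetical names, offered only as a suggestion: confine each CONFIG_... test to the corresponding header by providing no-op stubs, and assemble the dai_link table at init time. menuconfig still controls which codec drivers get built, but the main file stays free of preprocessor splicing.]

/* ux500_ab8500.h: the only place CONFIG_SND_SOC_UX500_AB8500 appears */
#ifdef CONFIG_SND_SOC_UX500_AB8500
int ux500_ab8500_fill_dai_links(struct snd_soc_dai_link *links);
#else
static inline int ux500_ab8500_fill_dai_links(struct snd_soc_dai_link *links)
{
	return 0;	/* codec not configured: contributes no links */
}
#endif

/* u8500.c: build the table at init time instead of with #ifdefs */
#include <linux/errno.h>
#include <sound/soc.h>

#define U8500_MAX_DAI_LINKS	8	/* hypothetical upper bound */
static struct snd_soc_dai_link u8500_dai_links[U8500_MAX_DAI_LINKS];

static int u8500_assemble_dai_links(struct snd_soc_card *card)
{
	int n = 0;

	n += ux500_ab8500_fill_dai_links(&u8500_dai_links[n]);
	/* same pattern for the other two codec drivers */
	card->dai_link = u8500_dai_links;
	card->num_links = n;
	return n ? 0 : -ENODEV;
}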