compilers: use module-based compiler finding on Linux as well as Cray#11989
Conversation
I'm not sure this allows us to find compilers for which there are no modules, so this is probably incomplete.
tgamblin left a comment:
This looks good and is particularly impressive given that it was done in 5 minutes.
Requests:
- Consolidate the module load/parse logic.
- Do the module loading/parsing first, THEN do path-based compiler finding, and prefer the compilers from modules.
If we can get that done I think it is a reasonable way to get people running on TACC machines where we really need to load the modules to run the compiler.
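The requested ordering can be sketched as follows. This is an illustrative outline, not Spack's actual implementation; the function name and entry format are hypothetical.

```python
# Hypothetical sketch of the requested ordering: discover compilers from
# modules first, then from PATH, and prefer the module-based entries.
# All names and the entry format here are illustrative, not Spack's real API.

def merge_compilers(module_compilers, path_compilers):
    """Merge compiler entries, preferring those found via modules.

    Each argument maps a compiler spec string (e.g. 'gcc@8.3.0') to an
    entry describing where the compiler was found.
    """
    merged = dict(path_compilers)      # path-based entries are the fallback
    merged.update(module_compilers)    # module-based entries win on conflict
    return merged

# Example: the module-provided gcc@8.3.0 shadows the one found on PATH,
# while the path-only gcc@4.8.5 is still kept.
from_modules = {'gcc@8.3.0': {'paths': '/opt/apps/gcc/8.3.0/bin',
                              'modules': ['GCC/8.3.0']}}
from_path = {'gcc@8.3.0': {'paths': '/usr/bin', 'modules': []},
             'gcc@4.8.5': {'paths': '/usr/bin', 'modules': []}}
result = merge_compilers(from_modules, from_path)
```

A plain `dict.update` is enough here because "prefer modules" is just "module entries overwrite path entries on key conflict".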
Force-pushed: 8105269 to 031cad8, ccba7f1 to 77497ea, 77497ea to 87dbae3, 7ac611b to b5477aa, b5477aa to ba00585.
```python
    module_hints (list or None): list of modules in which to look for
        compilers. A sensible default based on ``module avail`` will be
        used if the value is None.
```
This docstring describes module_hints, but module_hints isn't an argument.
Yeah, I removed the argument and forgot to change the docstring. I'll pull it out of the docs.
```python
    # associated with a module, it will include that module
    module_association = {}
    for mod in modules_to_check_paths:
        path = spack.util.module_cmd.get_bin_path_from_module(mod)
```
I like this approach but see my comment on get_bin_path_from_module -- what if there's more than one thing added to PATH?
Also, one more general question about the approach: will this result in the icc modules being associated with gcc compiler entries? If so maybe we should think about this a bit. You want the right gcc to be loaded when using icc, but you likely don't want icc loaded at all when you're using gcc.
I don't think it will end up with icc modules being associated with the gcc compilers. icc modules should be adding icc to the PATH and both icc and gcc to the LD_LIBRARY_PATH. That's precisely why I'm not using the existing functionality to get a path from a module, because it would check LD_LIBRARY_PATH as well.
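The reasoning here, that diffing only PATH avoids mis-associating an icc module with gcc, can be illustrated with a small sketch. The environment snapshots and function name are hypothetical stand-ins for what is captured around a `module load`; this is not Spack's code.

```python
# Illustrative sketch of why only PATH (not LD_LIBRARY_PATH) is inspected:
# an icc module may add gcc's lib directory to LD_LIBRARY_PATH for runtime
# support, but only icc's bin directory to PATH. Diffing just PATH keeps
# the gcc directories out of the module association. The env dicts below
# stand in for the environment before/after a `module load`.

def new_path_entries(env_before, env_after):
    """Directories added to PATH by the module load."""
    before = env_before.get('PATH', '').split(':')
    after = env_after.get('PATH', '').split(':')
    return [d for d in after if d and d not in before]

env_before = {'PATH': '/usr/bin',
              'LD_LIBRARY_PATH': '/usr/lib64'}
env_after = {'PATH': '/opt/intel/bin:/usr/bin',
             'LD_LIBRARY_PATH': '/opt/intel/lib:/opt/gcc/lib64:/usr/lib64'}

added = new_path_entries(env_before, env_after)
# Only the icc bin directory is reported; the gcc lib directory that the
# module added to LD_LIBRARY_PATH never enters the association.
```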
```python
        if path:
            if path in module_association:
                msg = 'Multiple modules associated with path %s. ' % path
                msg += 'Compilers may require manual configuration'
```
We should probably save this message for after we've tried everything, then check module_association for elements with multiple modules and print both the path and the modules. That would make the message much more useful. I realize this information will be in compilers.yaml but I think putting it in the message reduces the digging needed.
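The suggested deferred warning could look like the sketch below: record every module seen for each path, then report only the genuinely ambiguous paths at the end, naming both the path and all modules involved. Function names and the warning text layout are illustrative.

```python
# Hedged sketch of the deferred-warning suggestion: accumulate all modules
# per bin path first, then emit one message per ambiguous path that lists
# the path together with every module that mapped to it.

from collections import defaultdict

def associate(pairs):
    """pairs: iterable of (module_name, bin_path) tuples."""
    module_association = defaultdict(set)
    for mod, path in pairs:
        module_association[path].add(mod)
    return module_association

def ambiguity_warnings(module_association):
    msgs = []
    for path, mods in sorted(module_association.items()):
        if len(mods) > 1:
            msgs.append('Multiple modules associated with path %s: %s. '
                        'Compilers may require manual configuration'
                        % (path, ', '.join(sorted(mods))))
    return msgs

assoc = associate([('gcc/8.3.0', '/opt/gcc/bin'),
                   ('GCCcore/8.3.0', '/opt/gcc/bin'),
                   ('intel/19.0', '/opt/intel/bin')])
warnings = ambiguity_warnings(assoc)
```

Listing the conflicting modules in the message itself is what reduces the digging: the user sees which modules collide without opening compilers.yaml.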
tgamblin left a comment:
Looking even better. One (hopefully last) request:
```python
        modules_for_paths |= module_association.get(
            os.path.dirname(p), set())
```

```python
        positive_attributes = [
```
@becker33: we talked about making a module ineligible to be a gcc module if it loaded gcc and any other compiler -- basically make it so that if the intel module puts icc and gcc in the path, we don't make it eligible to be included in compilers.yaml for gcc.
Did that make it in here? I'm not seeing which condition that corresponds to.
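The eligibility rule discussed here can be sketched as a standalone check. Everything below is hypothetical illustration, not code from this PR; note that this symmetric version also rejects a mixed module for intel, and whether that is desired is exactly the open design question in the thread.

```python
# Sketch of the discussed rule: a module is eligible to be recorded for a
# given compiler only if it adds that compiler's executables to PATH and
# does NOT also add a different compiler's executables. So an intel module
# that exposes both icc and gcc is not eligible to be a gcc module.

COMPILER_EXES = {'gcc': {'gcc', 'g++', 'gfortran'},
                 'intel': {'icc', 'icpc', 'ifort'}}

def eligible_for(compiler_name, exes_added_by_module):
    exes = set(exes_added_by_module)
    if not (exes & COMPILER_EXES[compiler_name]):
        return False  # module doesn't expose this compiler at all
    for other, other_exes in COMPILER_EXES.items():
        if other != compiler_name and exes & other_exes:
            return False  # module also exposes a different compiler
    return True

# A pure gcc module qualifies; a module exposing both icc and gcc does not
# qualify as a gcc module.
gcc_only = eligible_for('gcc', {'gcc', 'g++'})
mixed = eligible_for('gcc', {'icc', 'icpc', 'gcc'})
```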
@tgamblin should be ready for review again, comments addressed.
Thanks for working on this; IMO a useful feature. Combined with recent work on automated compiler configuration, this will be awesome! I have a few comments and questions about it:
(EDIT: a bit off topic)
I started looking at this and I'm finding some cases that don't fully …
This is on a small cluster at Rice (nots), x86_64, RedHat 7.5, …
(1) The GCC modules are split into two modules, GCC and GCCcore. So, …
Starting with an empty …
The problem is that spack only identifies the GCCcore module, not the …
This does matter and some packages fail. For example, diffutils fails …
The error message is kinda misleading. I'm guessing there's something …
This may be a difficult case. The path to gcc@8.3.0 is actually from …
@mwkrentel, hello! It seems that GCCcore modules are visible to the users; are they meant to be loadable directly by users, or only indirectly by other modules? In the latter case, could they be hidden? I'm not sure it would solve the issue though... I can see that only GCC modules define the …
@adrienbernede I think these are mostly questions about Easy Build, of …
The default (system) MODULEPATH is …
I'm guessing one of the main reasons for splitting the module is that …
But if you want to use the compiler, then you load the top-level …
Anyway, it seems that the spack user should always check the compiler …
```python
    # shenanigans to ensure python will run.
```

```python
    # LD_LIBRARY_PATH under which Spack ran
    os.environ['SPACK_LD_LIBRARY_PATH'] = spack.main.spack_ld_library_path
```
I think this still has a problem.
If we load multiple modules, we will only get the changes to LD_LIBRARY_PATH from the last one loaded.
We need to somehow diff the LD_LIBRARY_PATHS and apply the diff, not overwrite.
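The diff-and-apply idea can be sketched as below. The helper names are hypothetical; the point is the merging logic: compute each module's additions relative to the same base value, then apply the accumulated additions instead of overwriting with the last module's value.

```python
# Sketch of diffing LD_LIBRARY_PATH per module load and applying the
# accumulated diff, rather than keeping only the last loaded module's
# value. The per-module values below stand in for the environment
# captured after each `module load`.

def diff_entries(before, after):
    """Entries present in `after` but not in `before` (colon-separated)."""
    b = set(before.split(':')) if before else set()
    return [d for d in after.split(':') if d and d not in b]

def accumulate_ld_library_path(base, per_module_values):
    """Apply every module's additions on top of the base value."""
    added = []
    for value in per_module_values:
        for d in diff_entries(base, value):
            if d not in added:
                added.append(d)
    return ':'.join(added + ([base] if base else []))

base = '/usr/lib64'
# Each module was loaded against the same base environment; naively keeping
# only the last value (mod_b) would lose /opt/gcc/lib64 from mod_a.
mod_a = '/opt/gcc/lib64:/usr/lib64'
mod_b = '/opt/intel/lib:/usr/lib64'
combined = accumulate_ld_library_path(base, [mod_a, mod_b])
```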
Is this something we need to revisit at a later time? If so, I think we need to figure out a way to find packages based on module file inspection (thus targeting compilers as dependencies rather than the current state of things).
I previously put some comments on this in #9376. In short, …
Also, I was hoping that when spack finds a compiler from a module that …
The existing way to tell spack about a compiler from a module is to …
As a compromise, I suggested adding a sub-command to …
This would tell spack that …
But note: some compiler modules depend on other modules, eg, …
Superseded by #29392
Many sites deploy compilers that do not work without their modules. We should find compilers from modules for those sites.