Re: Addressing sources dynamically generated by autoconf
From: Thomas Jahns
Subject: Re: Addressing sources dynamically generated by autoconf
Date: Mon, 21 Nov 2022 21:48:22 +0100
> On Nov 21, 2022, at 17:10 , Jan Engelhardt <jengelh@inai.de> wrote:
> On Monday 2022-11-21 16:22, Thomas Jahns wrote:
>
>> The question consequently is: how would I create a Makefile.am that accounts
>> for a list of C sources, when the sources are not yet present/known from the
>> perspective of automake?
>
> I don't see that working even without automake. Once make has loaded a
> Makefile, the internal DAG is immutable for all practical considerations. If
> you do not have the source names, what would you give the compiler? Which
> compiler would you even invoke if you do not know whether you are going to
> have
> a C or C++ file?
This is probably a misunderstanding: I meant that I would prefer not to list the
source files in Makefile.am, to avoid duplicating the list that configure
already needs. Also, I wrote that all source files are expected to be C (MPI
internals are, to my knowledge, currently always written in C; that might
change as Rust becomes more common).
> That is why source file names ought to be known, even if the file itself is
> empty, absent-until-later, or remains completely unused due to conditionals
> along the way.
It's a bummer if I cannot forgo naming the files and still inform automake
that I need the C compiler/linker for this library, but that may well be the case.
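[For what it's worth, one workaround that sometimes fits this situation — a
sketch under assumptions, not something from this thread — is to keep a single
statically named C source in Makefile.am, which is enough for automake to
select the C compiler and linker, and have configure emit a header that
#includes the dynamically generated .c files. All names below (libfix.la,
mpi_fix_wrapper.c, generated_sources.h) are hypothetical:

    # Makefile.am (sketch)
    lib_LTLIBRARIES = libfix.la

    # One fixed-name C file: its presence tells automake that the C
    # compiler/linker are required, even though the real sources are
    # only known at configure time.
    libfix_la_SOURCES = mpi_fix_wrapper.c

    # generated_sources.h is produced by configure (e.g. via
    # AC_CONFIG_FILES or a config command) and contains one
    # '#include "some_generated.c"' line per generated source.
    BUILT_SOURCES = generated_sources.h
    CLEANFILES = generated_sources.h

and mpi_fix_wrapper.c is then nothing but:

    /* mpi_fix_wrapper.c -- pulls in whatever configure generated */
    #include "generated_sources.h"

The downside is that all generated sources are compiled as one translation
unit (a "unity build"), so static symbols must not clash across the generated
files; but the source list then lives only in configure.]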
>> While installing a fixed MPI library might seem to be the correct way to
>> handle the issue, our users are not typically in a position to demand this
>
> root is not needed. They can install the fixed MPI library to
> /home/self, use LD_LIBRARY_PATH at runtime and
>
> ./configure CPPFLAGS=-I/home/self/include LDFLAGS="-L/home/self/lib
> -Wl,-rpath,/home/self/lib"
>
> for build-time. The upside is that neither libyaxt nor libzzzzzz need to
> bother
> producing a fixed MPI on their own, individually, which slims down all these
> projects.
You would find, if you spoke to typical users of HPC software, that building
MPI and associated libraries on their own is a very tedious prospect for them.
These are typically researchers interested in some simulation outcome and, as
stated, they usually have little time to spare. Also, without in-depth
knowledge it can be very difficult to recreate an MPI installation that
behaves exactly like the one the HPC site or vendor installed.
Regards, Thomas