[Info-vax] Native compilers

Arne Vajhøj arne at vajhoej.dk
Thu Feb 24 14:25:59 EST 2022


On 2/24/2022 2:08 PM, Johnny Billquist wrote:
> On 2022-02-24 19:19, Arne Vajhøj wrote:
>> Native scripts (DCL on VMS, some shell on *nix, CMD on Windows)
>> have some advantages over dedicated build software:
>> - it is always available - there are no VMS system without DCL,
>>    but plenty of VMS systems without MMS, MMK, Make clones etc.
> 
> Sortof true. But such tools are usually available and free. No reason to 
> not install them.

Many are free and available, but if they are not already there,
that is a problem, and getting such tools installed is not
always easy.

>> - everybody knows it - everybody working on VMS has to know
>>    DCL while many would not know MMS/MMK or make
> 
> That is also true. But learning them is not a bad thing.

It is a good thing to learn, but many still haven't.

>> The ability to only rebuild what needs to be rebuilt is
>> not an advantage but a huge risk in a build system. It is
>> better to rebuild everything than to hope the build system
>> has understood all dependencies correctly.
> 
> That I very much disagree with. It can be a risk, yes. But it should not 
> be. And calling it a "huge risk" I really disagree with.
> I suspect there are very few larger pieces of software that aren't 
> using some make-like system these days.
> 
> You simply don't want to just blindly rebuild everything every time you 
> change something.

I believe that software should be split into separate
pieces that can each be rebuilt completely fairly quickly.

For native code on VMS, those pieces would be shareable images.

A lot of time can be spent debugging problems caused by incremental
builds that did not rebuild everything they should have.
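As a quick sketch of that failure mode: make-style tools compare timestamps
against a *declared* dependency list, so a dependency that was forgotten in
the rule never triggers a rebuild. (Python standing in for make's timestamp
check; the file names are made up.)

```python
import os
import tempfile
import time

# Rebuild only if the target is missing or any *declared* dependency is
# newer -- the timestamp check that make-like tools perform.
def needs_rebuild(target, deps):
    if not os.path.exists(target):
        return True
    target_time = os.path.getmtime(target)
    return any(os.path.getmtime(dep) > target_time for dep in deps)

workdir = tempfile.mkdtemp()

def touch(name, mtime):
    path = os.path.join(workdir, name)
    open(path, "w").close()
    os.utime(path, (mtime, mtime))
    return path

now = time.time()
main_c = touch("main.c", now - 100)     # source, unchanged
util_h = touch("util.h", now)           # header, edited just now
main_obj = touch("main.obj", now - 50)  # object, built before the header edit

# The build rule only declared main.c -- the header dependency was forgotten.
print(needs_rebuild(main_obj, [main_c]))          # → False: stale object kept
print(needs_rebuild(main_obj, [main_c, util_h]))  # → True once it is declared
```

A full rebuild never has this problem, because nothing depends on the
dependency list being complete.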

>> I would only go for build systems if one of:
>> - builds need to be done on multiple platforms (maintaining
>>    native scripts becomes expensive)
> 
> For such situations, something like make is scary as well. When you have 
> different platforms, you fall back to meta-tools like cmake, which I 
> have much less positive experiences with. Or (*cough*) autoconf...

There are some challenges here. :-)

>> - it is a really huge system (where build rules are significantly
>>    simpler than just build commands)
> 
> Build rules only have to be explained once, and then you just need to 
> know what you want to build. Good systems will even figure out all the 
> dependencies on their own.
> 
> Way easier than writing scripts that cover everything.

At some size, yes.

>> - dependency download is needed
> 
> Not even sure what that means.

It means that if the build config specifies that it needs library A
version X, library B version Y and library C version Z, then the
build system downloads them automatically if they are not present.
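For example, a Maven-style dependency declaration (the coordinates here
are made up for illustration), where the build tool fetches anything
missing from a repository before compiling:

```xml
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>library-a</artifactId>
    <version>1.2.0</version>
  </dependency>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>library-b</artifactId>
    <version>2.0.1</version>
  </dependency>
</dependencies>
```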

>> - tools used are cumbersome to use (so build system encapsulation
>>    really saves something)
> 
> That is equally a win if you just write the script to do it. So in which 
> way would a build system be better than just a script here?

I am thinking about situations where the basic tool set is very
primitive compared to the complexity of the required artifact,
while the build system knows the artifacts to be produced and
can encapsulate all of that.

Classic example: Java EE EAR files. An EAR is basically a ZIP file
containing both plain files and other ZIP files, where everything
needs to be in specific paths. Building one by driving ZIP commands
from a script would be a nightmare. Tools that know about EAR files
can just create the right structure from far simpler input.
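To make the nested-ZIP structure concrete, here is a minimal sketch in
Python (the names and descriptor contents are illustrative, not a valid
deployment descriptor):

```python
import io
import zipfile

# Inner ZIP: a .war module with its own fixed internal layout.
war_buf = io.BytesIO()
with zipfile.ZipFile(war_buf, "w") as war:
    war.writestr("WEB-INF/web.xml", "<web-app/>")

# Outer ZIP: the .ear, containing the descriptor and the war at fixed paths.
ear_buf = io.BytesIO()
with zipfile.ZipFile(ear_buf, "w") as ear:
    ear.writestr("META-INF/application.xml", "<application/>")
    ear.writestr("mywebapp.war", war_buf.getvalue())

with zipfile.ZipFile(ear_buf) as ear:
    print(sorted(ear.namelist()))  # → ['META-INF/application.xml', 'mywebapp.war']
```

A tool that understands EAR files generates all of this, descriptor
included, from a short declaration; a hand-written script has to get
every path right itself.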

Arne



